Thursday, April 30, 2015

Mobile Apps - Personalizing While Respecting Personal Privacy

All the data I have been reading this week suggests mobile users want and value a personalized experience in their mobile apps and on mobile websites, but on the other hand they don't like giving up their personal data.  That means it is imperative to find the right balance so a mutually satisfying relationship can be fostered.  As we all know, the more data you have on an individual, the easier it is to configure a personalized experience.

In a fresh Cognizant survey this week involving 5,000 participants, 68% reported they are willing to provide information on their gender, 55% their age, and 65% their brand and product preferences, but the majority are not in favor of volunteering much else.  That is an interesting answer, since 80% of the same survey participants belong to one or more loyalty and rewards programs, and the biggest reason, according to 74%, is the points or rewards earned for each dollar spent.  The second biggest motivation was automatic discounts for loyalty program members.  That tells us there is a willingness to give up some level of data privacy if the rewards and discounts are valuable enough.

Mobile retailers need to find out how much personal data is worth to their customer base.  They need to give up enough in points, rewards and discounts to motivate the sharing of more in-depth personal data.  Collecting data on social media is not the answer.  In my research, consumers don't like the idea of any kind of data collection for marketing purposes from their social media activities.  It makes them mad.  Mad is not a feeling a consumer products company wants to elicit from its customers.

It seems to me that a bold, transparent process would be best.  The online and mobile retailer should place a value on data.  For example:
  • Answer 10 specific questions about yourself and your preferences, and I will give you an automatic 10% off your purchases.
  • Answer 20 specific questions about yourself and your preferences, and I will give you an automatic 20% off your purchases.
  • Answer 30 specific questions about yourself and your preferences, and I will give you an automatic 30% off your purchases.
Whatever the real value, we all agree that there is a value to data.  Finding the real value, and transparently using that information to provide a personalized user experience, benefits all parties.
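To make the idea concrete, here is a minimal sketch (with hypothetical tier values mirroring the bullets above) of how a retailer might map questions answered to an automatic discount:

```python
# Hypothetical tier table: questions answered -> automatic discount.
# These values mirror the example tiers above; a real retailer would
# tune them based on what the data is actually worth.
DISCOUNT_TIERS = [(30, 0.30), (20, 0.20), (10, 0.10)]

def discount_for(questions_answered: int) -> float:
    """Return the automatic discount earned by answering profile questions."""
    for threshold, discount in DISCOUNT_TIERS:
        if questions_answered >= threshold:
            return discount
    return 0.0
```

The point is the transparency: the customer can see exactly what each increment of shared data buys them.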

I think IoT (Internet of Things) sensors may also play a role in data collection and personalization.  Rather than make people uncomfortable by tracking more personal data, sensors can track product data, which can then be used to provide a personalized experience for the owner of the product.  Here is an example: a man buys a bass fishing boat and a service agreement.  Sensors (as defined in the service agreement) collect data on the boat engine, information such as:
  • Locations
  • Activities
  • Usage profiles
  • Hours of operations
  • Date and time
  • etc.
The boat engine information is added to the customer's profile to provide a "boater's profile" that can be used to personalize online and mobile experiences.
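A hedged sketch of how that merge might look, using illustrative field names for the engine readings (no real telematics API is implied):

```python
def build_boaters_profile(customer: dict, engine_readings: list) -> dict:
    """Merge IoT engine-sensor readings into an existing customer profile.

    engine_readings: list of dicts like
        {"location": ..., "hours": ..., "timestamp": ...}
    (field names are illustrative, not from any real telematics system).
    """
    profile = dict(customer)  # don't mutate the original record
    profile["boater_profile"] = {
        "total_hours": sum(r.get("hours", 0) for r in engine_readings),
        "locations": sorted({r["location"] for r in engine_readings
                             if "location" in r}),
        "trips": len(engine_readings),
    }
    return profile
```

The resulting "boater's profile" can then drive personalized offers without collecting any new data about the person directly.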

Follow me on Twitter @krbenedict.
************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
Digital Transformation, EBA, Center for the Future of Work Cognizant
View my profile on LinkedIn
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin's YouTube Channel
Join the LinkedIn Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

Wednesday, April 29, 2015

Speed, Personalization, Analytics and Enterprise Information Systems

Ninety percent of 18-34 year olds strongly value a personalized mobile and web experience, and eighty-two percent of those over 45 years old value personalization.  What kind of personalization?  Forty-seven percent of shoppers prefer location- or time-based personalization in mobile applications or websites.  In other words, don't show me things that are not available for me to purchase in Boise at this time, or that I don't like!  Given this survey data, we all know what needs to be done and are taking the necessary steps to be able to offer personalization, right?  Wrong, it seems.  Many companies are not using available data to understand their customers better so they can provide them with contextually relevant and personalized mobile application experiences.

My colleague, Benjamin Pring, at Cognizant's Center for the Future of Work recently published a research paper titled Putting the Experience in Digital Customer Experience.  In his research he found fewer than 20% of respondents use analytics generated by application programming interface (API) traffic to understand their customers’ online and offline purchase journeys.  Just 41% of respondents in the retail industry say they will be effective at analyzing customer metadata by 2017.  A mere 42% of respondents say they have adequate tools and skills to analyze digitally generated data.  Only one-third of respondents have made adjustments to their business model to pursue strategies driven by digital information about their customers.

An additional challenge is that personalizing mobile user experiences takes speed, speed many do not have available in their current IT environments.  As organizations begin developing mobile strategies and implementing mobile apps, they quickly realize that simply developing and deploying basic mobile apps, infrastructure and frameworks is not enough.  They must push further and implement a real-time enterprise to remain competitive.  This real-time requirement is at the root of many problems.  Eighty-four percent of survey participants reported their IT systems are too slow or incapable of supporting real-time mobility, which negatively impacts mobile app performance and the user’s experience.

We have some more work to do.



Wednesday, April 22, 2015

Monitoring an Ear of Corn with an IoT Sensor?

Once upon a time, farmers would walk through a field or ride a horse around it to determine the amount of fertilizer and water their crops required.  I have done this myself.  Today agricultural drones with sensors and analysis software can fly over large fields and analyze the crops and their needs precisely in seconds.  If we wanted to get even more ambitious, we could place an IoT (Internet of Things) sensor next to every stalk of corn to monitor and optimize its growth.  Although these steps are all feasible today, some are not yet economically advantageous.  That might, however, soon change.  In the past, we treated crops in aggregate. Today, we can customize how we treat each section of a crop due to the benefits of sensors.

Globally, we will need to feed 8 billion people by 2030 and 9 billion by 2050.  The UN Food and Agriculture Organization (FAO) projects that, under current production and consumption trends, global food production must increase 60 percent by 2050 in order to meet the demands of the growing world population.  That's only 35 years away!!!

Another fact: 25-40% of our food spoils or is lost before it can be consumed (source: http://www.foodwastealliance.org/about-our-work/assessment/).  This is a massive amount of waste and inefficiency that no one wants, and IoT sensors can help us reduce it.

Do you see, as I do, the need for a digital transformation in agriculture, food processing and delivery? The Internet of Things is not just the newest gadget for us to play with, it can mean the difference between life and death for many people.  Data collected through sensors and analyzed to help optimize growth, harvesting, processing, delivery and consumption may just be the solution we need.




Wednesday, April 15, 2015

Top 11 Articles on IoT, Mobility, Code Halos and Digital Transformation Strategies

Most of the stuff I write is rubbish, but these 11 articles beat the odds and are actually worth reading. You can find my complete Top 40 list here. Enjoy!

  1. Mobile Apps, Blind Spots, Tomatoes and IoT Sensors
  2. IoT Sensors, Nerves for Robots and the Industrial Internet
  3. Sensors - Sensing and Sharing the Physical World
  4. IoT Sensors, Tactile Feedback, iPhones and Digital Transformation
  5. IoT, Software Robots, Mobile Apps and Network Centric Operations
  6. Networked Field Services and Real-Time Decision Making
  7. Thinking About Enterprise Mobility, Digital Transformation and Doctrine
  8. GEOINT, GIS, Google Field Trip and Digital Transformation
  9. Connecting the Dots Between Enterprise Mobility and IoT
  10. Merging the Physical with the Digital for Optimized Productivity
  11. IoT Sensors Extend Our Physical Senses Beyond Our Physical Reach
You can find my Top 75 articles on Mobile Strategies here.


IoT Sensors, Nerves for Robots and the Industrial Internet

Sensors are Nerves for Robots
Yesterday I interviewed two robotics experts on the growing demand for IPA (intelligent process automation) robots.  These robots are made of software code.  They are assigned pre-defined actions based on steps in a process, the analysis of data, and the decision trees they are provided.  For example, an IPA can review a car loan application and approve or deny it instantly, based on the data.  In fact, they can analyze the data from tens of thousands of car loans in seconds based on the parameters and decision trees they have been given.
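Here is a toy sketch of that kind of decision tree, with invented thresholds for illustration only, not any real lender's rules:

```python
def review_loan(application: dict) -> str:
    """Toy decision tree for auto-loan review.  The thresholds are
    invented for illustration and are not any real lender's policy."""
    if application["credit_score"] < 600:
        return "deny"
    if application["loan_amount"] > 0.5 * application["annual_income"]:
        return "refer"  # too large relative to income: route to a human
    return "approve"

def review_batch(applications: list) -> dict:
    """Score a batch of applications in one pass, as an IPA robot would
    across tens of thousands of loans."""
    results = {"approve": 0, "deny": 0, "refer": 0}
    for app in applications:
        results[review_loan(app)] += 1
    return results
```

The value of the software robot is not the individual rule, which is trivial, but the speed and volume at which it can apply the whole decision tree.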

There are literally hundreds of thousands of different use cases for IPA robots.  IPA robots can also interact with IoT sensors and take actions based on sensor data, not just by completing digital business processes, but also by controlling physical equipment and machines.  Sensors serve robots in much the same way as nerves serve us humans.

Earlier this week I was briefed by AMS AG, a developer of IoT sensors.  They just released a new sensor that smells odors in homes and offices.  Yes, indeed!  The sensor is embedded in a home monitoring system from Withings.  In Withings’ Home product, the AS-MLV-P2 sensor is combined with a 5-megapixel video camera, dual microphones, temperature and humidity sensors, and Wi-Fi® and Bluetooth® Smart radios.  This means that users of the Home monitoring system can see, hear, feel and smell the inside of their home or office remotely via a smartphone or tablet app supplied by Withings.

AMS’s sensor detects VOCs (volatile organic compounds), including both human-made and naturally occurring chemical compounds.  These include ambient concentrations of a broad range of reducing gases associated with bad air quality, such as alcohols, aldehydes, ketones, organic acids, amines, and aliphatic and aromatic hydrocarbons, all of which can be harmful to human and animal health at high levels.  These are most of the scents humans smell.  In the Home app, the sensor’s measurements of these chemicals are converted to an air-quality rating as well as to a measurement of VOC concentrations.

If you combine IPA robots, AMS’s sensors and Withings Home monitoring system with your HVAC system, the IPA robot can ensure you have healthy air quality in your home or office continuously. In fact, an IPA robot could manage the air quality and security of tens of thousands of homes and offices at the same time.  The results of these findings and actions can be displayed and controlled on smartphones and tablets as well.
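A minimal sketch of that control logic, with an invented VOC threshold and made-up action names (real thresholds and HVAC commands would come from the sensor and HVAC vendors):

```python
# Illustrative "unhealthy" VOC threshold; not a real air-quality standard.
VOC_LIMIT_PPB = 400

def hvac_action(voc_ppb: float) -> str:
    """Decide an HVAC action from one VOC reading, as an IPA robot might."""
    if voc_ppb > VOC_LIMIT_PPB:
        return "increase_ventilation"
    return "normal_operation"

def manage_buildings(readings: dict) -> dict:
    """One software robot managing air quality across many buildings:
    map each building's latest VOC reading to an HVAC action."""
    return {building: hvac_action(ppb) for building, ppb in readings.items()}
```

Because the robot is software, scaling from one home to tens of thousands is just a bigger dictionary of readings.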

Not only do you have robots sensing the physical world, but also automatically reacting to it on your behalf.  In my opinion, how sensors detect and communicate the physical and natural world to humans and robots is one of the most interesting areas of innovation today.

An additional value of using IPA robots is the massive clouds of data they spin off as a result of their decisions and actions.  This data can be further analyzed to find new areas for optimization and potential business opportunities.  Herein lies an emerging area where big data analysis can give us even deeper insights.




Tuesday, April 14, 2015

Robots and I - Intelligent Process Automation, Siri and More

Today I had the privilege of interviewing two robotics and process automation experts.  I learned there are many different kinds of robots including the humanoid types we see in movies, and robots made entirely out of software.  In this interview we discuss Rob Brown's recent white paper titled Robots and I, the different forms of robots, and then dig deep into how software robots are transforming many industries today with expert Matt Smith.  Enjoy!

Video Link: https://youtu.be/qOPFD3vshec



Thursday, April 09, 2015

Mobile Apps, Blind Spots, Tomatoes and IoT Sensors

Master Tomato Gardener
A lot is written on mobile technologies, the Internet of Things, social media and analytics, but little is written on how all these might work together in a retail environment.  I think best by writing, so let's think this through together.

Blind spots are defined as, “Areas where a person's view is obstructed.” Many business decisions today are still made based on conjecture (unsubstantiated assumptions), because the data needed to make a data-driven decision lies in an operational “blind spot.”

Smart companies, when designing mobile applications, consider how they can personalize the user experience.  They ask themselves how they can utilize all the accumulated data they have collected on their customers or prospects, plus third-party data sources, to make the experience as beautiful and pleasurable as possible.  To start, they can often access the following kinds of data from their own and/or purchased databases to personalize the experience:
  • Name
  • Age
  • Gender
  • Address
  • Demographic data
  • Income estimate
  • Credit history
  • Education level
  • Marital status
  • Children
  • Lifestyle
  • Social media profile and sentiment
  • Job title
  • Purchase history
  • Locations of purchases
  • Preferences, tastes and style
  • Browsing/Shopping history
This data, however, is basic.  It is merely a digital profile. It has many blind spots.  It is often not based on real-time data.  As competition stiffens, the above profile data will not be enough to deliver a competitive advantage.  Companies will need to find ways to reduce blind spots in their data so they can increase the degree of personalization.
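One way to think about blind spots in code: treat any missing or empty profile field as a blind spot to be filled with rewards, sensors or third-party data.  A small sketch, using a few illustrative field names from the list above:

```python
# Illustrative subset of the profile fields listed above.
PROFILE_FIELDS = ["name", "age", "gender", "purchase_history", "preferences"]

def find_blind_spots(profile: dict) -> list:
    """Return profile fields that are missing or empty, i.e. the
    'blind spots' a retailer would try to fill next."""
    return [field for field in PROFILE_FIELDS if not profile.get(field)]
```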

Sensors connected to the IoT (Internet of Things) will play an important role in reducing blind spots.  Sensors often cost only a few dollars and can be set up to detect or measure physical properties and then wirelessly communicate the results to a designated server.  Also, as smartphones (aka sensor platforms) increase the number of sensors they include, and make these sensors available to mobile application developers through APIs, the competitive playing field will shift to how these sensors can be used to increase the level of personalization.

Let’s imagine a garden supply company, GardenHelpers, developing a mobile application.  The goal of the application is to provide competitive differentiation in the market by offering personalized garden advice and solutions.  GardenHelpers uses the following smartphone sensors in its design to provide more personalized gardening advice:
  • GPS sensor (location data)
  • Cell Tower signal strength (location data)
  • Magnetometer sensor (location of sun)
  • Ambient light sensor (available sunlight)
  • Barometer sensor (altitude)
GardenHelpers combines the sensor data with date and time, plus third-party information such as:
  • GIS (geospatial information system on terrain, slopes, angles, watershed, etc.) data
  • Historic weather information
  • Government soil quality information
  • Government crop data, recommendations and advice
GardenHelpers also encourages the user to capture the GPS coordinates of each corner of their garden via their smartphone to estimate the garden's size, and to capture the amount of sunlight at various times of the day through the ambient light sensor.  This information is compared with area weather data to estimate the amount of shade and sunlight on the garden.
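Assuming the GPS corner points have been projected to local (x, y) metres, the garden size can be estimated with the shoelace formula.  A sketch of that step (illustrative, not GardenHelpers' actual method, since GardenHelpers is itself imaginary):

```python
def garden_area_m2(corners: list) -> float:
    """Estimate garden area from GPS-derived corner points.

    Assumes the corners are already projected to local (x, y) metres and
    listed in order around the boundary; applies the shoelace formula."""
    n = len(corners)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]  # wrap around to close the polygon
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0
```

Four corner captures are enough for a rectangular plot, and the same formula handles irregular gardens with more corners.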

GardenHelpers now understands a great deal about the gardener (mobile app user), the garden location, size, lay of the land and sunlight at various times.  However, there remain “blind spots.”  GardenHelpers doesn't know the exact temperature, wind speeds, humidity levels, or the amount of water in the soil of the garden.  How do they remedy these blind spots?  They offer to sell the gardeners a kit of wireless IoT sensors to measure these.

With all of this information the blind spots are now greatly reduced, but some remain.  What about local pests, soil issues and advice?  GardenHelpers adds a social and analytics element to its solution.  This enables gardeners to share advice with other local gardeners with similar garden sizes and crops.

GardenHelpers can now deliver a mobile app that is hyper-personalized for its customers and prospects.  The products it offers and recommends are not selected randomly, but are based on precise smartphone and sensor data.  The mobile app combined with the IoT sensors becomes an indispensable tool for customers, which leads to increased brand loyalty and sales.


Wednesday, April 08, 2015

Sensors - Sensing and Sharing the Physical World

Global Sensor Data
We spend a lot of time talking and writing about the IoT (Internet of Things) in the macro, as a giant worldwide network of objects and things communicating with themselves and others.  That is indeed interesting, but the most interesting components of the IoT, in my opinion, are the sensors.  Sensors are defined as, "Devices that detect or measure a physical property and record, indicate, or otherwise respond to it."  In the context of the IoT, sensors detect or measure a physical property and then communicate the findings wirelessly to a server for analysis.  Sensors are our digital fingers that touch and feel the earth and environment!

Just last week I read this about a new iPhone patent, "The patent is titled “Digital camera with light splitter.” The camera described in the patent has three sensors for splitting color. The camera would split colors into three different rays. These would be red, green and blue. The splitting of colors is designed to allow the camera to maximize pixel array resolution." This patent potentially could help Apple improve the image quality of its mobile cameras, especially in video.  In other words, it will help iPhones better capture, display and share the scenes on our planet for viewing.

At the Mobile World Congress in Barcelona this year I saw a demonstration of an iPhone add-on from the company Flir: a personal thermal imaging camera.  You connect it to your iPhone and then you can find leaky pipes in your walls, overloaded electrical breakers, or even live rodents hiding in your walls.  You can use it on your boat to spot floating debris in the water in the dark, or while hiking at night to spot hidden predators preparing to devour you.  I WANT ONE NOW!

Sensors measure and collect data and can be connected to just about any piece of equipment.  Satellite cameras are sensors.  There are audio and visual sensors.  There are pressure and heat sensors.  There are all kinds of sensors.  One of the most interesting sensor technologies I have been researching of late is hyperspectral remote sensing.

Hyperspectral sensors combined with GIS (geospatial information systems) information and Big Data analytics are a powerful mix.  These sensors can be integrated into very powerful cameras.  Hyperspectral remote sensing is an emerging technology being studied for its ability to detect and identify minerals, terrestrial vegetation, and man-made materials and backgrounds.  I want one!

Hyperspectral remote sensing combines imaging and spectroscopy (the measurement of radiation intensity as a function of wavelength) in a single system, which often produces large data sets that require Big Data analytics.  Hyperspectral imagery is typically collected (and represented) as a data cube, with spatial information collected in the X-Y plane and spectral information represented in the Z-direction.
Hyperspectral imaging
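A tiny sketch of that data-cube layout, using nested Python lists (`cube[x][y][band]`) rather than any particular imaging library:

```python
def pixel_spectrum(cube, x, y):
    """Extract the spectral signature at spatial location (x, y).

    The cube is stored as nested lists, cube[x][y][band]: spatial
    information in the X-Y plane, spectral information along Z, as
    described above."""
    return cube[x][y]

def band_image(cube, band):
    """Slice out a single spectral band as a 2-D image."""
    return [[pixel[band] for pixel in row] for row in cube]
```

Real systems hold cubes with hundreds of bands per pixel, which is exactly why Big Data analytics enters the picture.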

What can be done with hyperspectral remote sensing?  Using powerful hyperspectral cameras, one can detect unique noble gases (each gas emits a unique color on the spectrum) and distinguish different inks, dyes and paints (each has characteristics that can be uniquely identified).  You can detect, identify and quantify chemicals.  You can detect chemical composition and physical properties, including temperature and velocity, all with a camera!

Taking a hyperspectral image of an object, connected to real-time Big Data analytics, can tell you an amazing amount of information about it.  Theoretically, a hyperspectral image of a person combined with facial recognition can identify the person, their shampoo, make-up, hand lotion, deodorant, perfume, the food they ate, chemicals they have been in contact with, and the materials and chemicals used in their clothes.  OK, the implications of this technology for personal privacy are really creepy, but the technology itself is fascinating.

Theoretically, hyperspectral remote sensing systems can be used for healthcare, food monitoring, airport security, public safety and intelligence systems, and can be integrated with drone and satellite surveillance systems.

Today, luckily, these cameras are far too expensive for me.

Related Articles: http://mobileenterprisestrategies.blogspot.com/2015/04/iot-sensors-tactile-feedback-iphones.html

Related Video: http://mobileenterprisestrategies.blogspot.com/2015/03/iot-and-sensors-from-ams-at-mwc15.html

Monday, April 06, 2015

IoT Sensors, Tactile Feedback, iPhones and Digital Transformation

IoT sensors extend our physical senses beyond our physical reach and communicate the results from afar. They also allow us to share experiences remotely, not just mentally, but also tactilely. That is the first time I have ever used the word “tactilely.” It means to tangibly or physically experience something. For example, AMS’s MEMS gas sensor allows people to hear, see and smell inside their home remotely from an iPhone app. The Withings Home camera sends alerts to an iPhone if it detects movement or noise in the house. Its night-vision sensor mode enables the remote viewer to even see in the dark. The viewer can also talk through the camera to ask questions like, “Who are you, and why are you carrying my big screen TV away?”

Today you can combine 3D modeling apps for smartphones and tablets with sounds, vibrations and colors so you can augment your reality with tactile experiences.  Wireless sensors and 3D modeling and visualization tools enable you to see and monitor conditions at a distance, in real-time.  A combination of sensors, analytics, visualization and tactile feedback tools can alert and inform you of changing conditions, patterns or variations in activity or data patterns.  This experience can truly augment your reality.

The new Apple Watch enables you to signal somebody on the other side of the world with tactile vibrations that you customize. For example, while on the road I can signal my wife that I miss her by sending five quick “pulses” that vibrate on her wrist.

Digitally modeled realities enable experts, from anywhere in the world, to work and manage factories, farms and other kinds of operations from distant locations. The obstacles of the past, lack of information and monitoring capabilities, that resulted in operational blind spots are quickly disappearing as more sensors are put in place. Most of us either own or have seen noise canceling headsets. Sensors in the headset capture the incoming noise and then instantly counter with anti-sound that matches the sensor data. This same kind of sensor technology can capture noise and transmit it to distant locations where it can be recreated and listened to by others.
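The anti-sound idea can be sketched in a few lines: the counter-waveform is the sample-wise negation of the sensed noise, so the two sum to silence.  This is a deliberate simplification of what real noise-canceling headsets do, which also involves latency compensation and filtering:

```python
def anti_noise(samples: list) -> list:
    """Generate the inverted waveform a noise-canceling headset plays:
    each anti-sound sample is the negation of the sensed noise sample."""
    return [-s for s in samples]

def residual(noise: list, counter: list) -> list:
    """What the ear hears after cancellation: the sample-wise sum of the
    noise and the counter-waveform."""
    return [n + c for n, c in zip(noise, counter)]
```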

I can imagine near-term scenarios where entire factory floors are digitally replicated and real-time operations can be viewed and managed from great distances.  Every component of the operation can be monitored via sensor data.  Aberrations, out-of-compliance data, and other faults would instantly cause alerts, notifications and remedies to be implemented.

In the military arena, acoustical sensors can now pinpoint the location of incoming bullets, rockets, missiles, etc., in real-time and activate various instantaneous countermeasure technologies.  Data means power.

Today's competitive marketplace requires companies to collect more data, analyze more data and utilize more data to improve customer interactions and engagements. Mobile devices are exceptionally designed to assist in this effort. Apple's iPhone and Apple Watch come with an array of sensors for collecting data about your surroundings:
  • Touch/Multi-Touch screen sensor
  • Force Touch sensor (Apple Watch) – measures different levels of touch; determines the difference between a tap and a press
  • Taptic Engine (Apple Watch) – tactile feedback via gentle vibration
  • Audio/Voice sensor
  • GPS sensor
  • Bluetooth sensor (supports iBeacon)
  • WiFi sensor
  • WiFi strength sensor – helps track indoor activities
  • Proximity sensor – deactivates the display and touchscreen when the device is brought near the face during a call
  • Ambient light sensor – brightens the display in sunlight and dims it in darker places
  • Magnetometer sensor – measures the strength and/or direction of the magnetic field in the vicinity of the device; runs the digital compass
  • Accelerometer sensor – measures the force of acceleration, i.e. the speed of movement (uses movement and gravity sensing); counts steps, distance and speed of movement; detects the angle at which an iPhone is being held
  • Apple Watch sensors – measure steps taken, calories burned, and pulse rate
  • Gyroscope – 3-axis gyro (combined with the accelerometer, provides 6-axis motion sensing): pitch, roll and yaw
  • Barometer sensor – altitude, elevation gain during workouts, weather conditions
  • Camera sensor with a plethora of sensors and digital features: face detection, noise reduction, optical image stabilization, auto-focus, color sensors, backside illumination sensor, True Tone sensor and flash
  • Fingerprint identity sensor
  • Heart rate sensor (Apple Watch) – uses infrared and visible-light LEDs and photodiodes to detect heart rate
Other sensor add-ons: personal thermal imaging camera (Flir)
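As one small example of turning raw sensor data into something useful, here is a naive step counter over accelerometer magnitude samples.  The threshold is invented for illustration; production pedometers like the Apple Watch's use far more sophisticated filtering:

```python
def count_steps(magnitudes, threshold=1.2):
    """Naive step counter over accelerometer magnitude samples (in g):
    count upward crossings of a threshold.  The 1.2 g threshold is
    illustrative, not calibrated against any real device."""
    steps = 0
    above = False
    for m in magnitudes:
        if not above and m > threshold:
            steps += 1       # a new upward crossing counts as one step
            above = True
        elif m <= threshold:
            above = False    # re-arm once the signal drops back down
    return steps
```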

I attended a defense related conference and listened to an IT expert in the CIA present on how they can use sensors on smartphones to uniquely identify the walking style and pace of individuals. For example, the intelligence agency may suspect a person carrying a phone is a bad guy. They can remotely switch on the smartphone's sensors and record the walking style and pace of the person carrying the phone and match it with their database records.

Sensors help bridge the gap between the physical and digital worlds. They convert the physical world into data. Tactile feedback tools convert the data back into physical experiences – like a Star Trek Transporter.

Mobile apps can also be considered the API (application programming interface) between humans and smartphones.  Sensors are the API between the phone and the physical world.  For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer.  The human queries their stomach for pain and preferences, and then inputs the results into the mobile app by touching the keypad or using their voice.  Suddenly a server in an Amazon data center knows your stomach's inputs!  That is one powerful sensor and API!  Given the vast array of sensors in the human body, incredible things can be done once those sensors convert their signals to data.

Until recently, the data from natural sensors in the human body was mostly communicated to analytics engines via a human's touch, typing, drawings or voice inputs.  The emergence of wearable sensors and smart devices, however, changes that.  Wearable sensors can bypass the human in the middle and wirelessly communicate directly with your applications or healthcare provider.

Sensors and computers are also connected to the non-physical.  Applications can react differently based on recognized time inputs.  Once the clock reaches a specified time, an alarm can be activated, sending sound waves to your physical ear.  That is converting the non-physical (time) into sound waves that vibrate our eardrums.

The challenge for businesses today is to envision how all of these sensors and available real-time data can be used to improve sales, customer service, product design, marketplace interactions and engagements so there are more profits at the end of the day.

In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks.  However, the disruptors of tomorrow are likely coming from digital disruptions: sensors, code halos, big data, mobile devices and wearables.

The task and challenge of every IT department is to understand and design a strategy that recognizes the competitive playing fields of tomorrow are among the digits.

