IoT Sensors, Nerves for Robots and the Industrial Internet

Sensors are Nerves for Robots
Yesterday I interviewed two robotics experts on the growing demand for IPA (intelligent process automation) robots.  These robots are made entirely of software.  They are assigned pre-defined actions based on the steps in a process, the analysis of data, and the decision trees they are provided.  For example, an IPA robot can review a car loan application and approve or deny it instantly, based on the data.  In fact, IPA robots can analyze tens of thousands of car loan applications in seconds, based on the parameters and decision trees they have been given.
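
The decision-tree logic such a robot applies can be sketched in a few lines of code.  The field names and thresholds below are illustrative assumptions, not any vendor's actual underwriting rules:

```python
# A minimal, hypothetical sketch of the rules an IPA robot might apply to
# a car-loan application. Thresholds and field names are illustrative.

def decide_loan(application):
    """Return 'approved' or 'denied' based on simple decision-tree rules."""
    if application["credit_score"] < 620:
        return "denied"
    # Debt-to-income ratio: monthly debt payments divided by monthly income.
    dti = application["monthly_debt"] / application["monthly_income"]
    if dti > 0.43:
        return "denied"
    if application["loan_amount"] > 0.5 * application["annual_income"]:
        return "denied"
    return "approved"

applications = [
    {"credit_score": 710, "monthly_debt": 900, "monthly_income": 5000,
     "loan_amount": 18000, "annual_income": 60000},
    {"credit_score": 580, "monthly_debt": 400, "monthly_income": 3000,
     "loan_amount": 9000, "annual_income": 36000},
]

# The same rules scale mechanically to tens of thousands of applications.
decisions = [decide_loan(a) for a in applications]
print(decisions)  # ['approved', 'denied']
```

Because the rules are pure functions of the application data, running them over a batch of any size is just a loop, which is why throughput scales so easily.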

There are hundreds of thousands of different use cases for IPA robots.  IPA robots can also interact with IoT sensors and take actions based on sensor data, not just by completing a digital business process but also by controlling physical equipment and machines.  Sensors serve robots in much the same way as nerves serve us humans.

Earlier this week I was briefed by AMS AG, a developer of IoT sensors.  They have just released a new sensor that smells odors in homes and offices.  Yes, indeed!  The sensor is embedded in a home monitoring system from Withings.  In Withings’ Home product, the AS-MLV-P2 sensor is combined with a 5-megapixel video camera, dual microphones, temperature and humidity sensors, and Wi-Fi® and Bluetooth® Smart radios.  This means that users of the Home monitoring system can see, hear, feel and smell the inside of their home or office remotely via a smartphone or tablet app supplied by Withings.

AMS’s sensor detects VOCs (volatile organic compounds), including both human-made and naturally occurring chemical compounds.  These include ambient concentrations of a broad range of reducing gases associated with bad air quality, such as alcohols, aldehydes, ketones, organic acids, amines, and aliphatic and aromatic hydrocarbons, all of which can be harmful to human and animal health at high concentrations.  These account for most of the scents humans smell.  In the Home app, the sensor’s measurements of these chemicals are converted into an air-quality rating as well as a measurement of VOC concentrations.
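
As a rough sketch of that last step, a conversion from a raw VOC reading to an air-quality rating might look like the following.  The band thresholds are illustrative assumptions, not AMS’s or Withings’ actual calibration:

```python
# Hypothetical mapping from a total-VOC concentration to a simple
# air-quality rating, in the spirit of the Home app's display.

def air_quality_rating(voc_ppb):
    """Map a total-VOC reading (parts per billion) to a rating band."""
    bands = [
        (220, "good"),       # low VOC load
        (660, "moderate"),   # noticeable but tolerable
        (2200, "poor"),      # ventilation recommended
    ]
    for threshold, rating in bands:
        if voc_ppb <= threshold:
            return rating
    return "unhealthy"       # anything above the highest band

print(air_quality_rating(150))   # good
print(air_quality_rating(1500))  # poor
```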

If you connect IPA robots, AMS’s sensors and Withings’ Home monitoring system to your HVAC system, an IPA robot can continuously ensure healthy air quality in your home or office.  In fact, a single IPA robot could manage the air quality and security of tens of thousands of homes and offices at the same time.  The results of these findings and actions can be displayed and controlled on smartphones and tablets as well.
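
A minimal sketch of the per-building control loop such a robot might run follows.  The sensor values, threshold and HVAC commands are hypothetical placeholders:

```python
# One pass of a hypothetical IPA robot's air-quality loop: read the VOC
# level for each building, compare against a limit, and emit an HVAC
# command. The threshold and command names are illustrative assumptions.

VOC_LIMIT_PPB = 660  # illustrative "ventilate now" threshold

def manage_building(voc_ppb, ventilating):
    """Return the ventilation command for one building."""
    if voc_ppb > VOC_LIMIT_PPB and not ventilating:
        return "start_ventilation"
    if voc_ppb <= VOC_LIMIT_PPB and ventilating:
        return "stop_ventilation"
    return "no_change"

# One robot can sweep thousands of buildings in a single pass.
buildings = {"home-001": (900, False), "office-042": (300, True)}
commands = {bid: manage_building(ppb, vent)
            for bid, (ppb, vent) in buildings.items()}
print(commands)  # {'home-001': 'start_ventilation', 'office-042': 'stop_ventilation'}
```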

Not only can robots sense the physical world, they can automatically react to it on your behalf.  In my opinion, how sensors detect and communicate the physical and natural world to humans and robots is one of the most interesting areas of innovation today.

An additional value of IPA robots is the massive cloud of data they spin off as a result of their decisions and actions.  This data can be analyzed further to find new areas for optimization and potential business opportunities.  Herein lies an emerging area where big data analysis can give us even deeper insights.



************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
Digital Transformation, EBA, Center for the Future of Work Cognizant
View my profile on LinkedIn
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin'sYouTube Channel
Join the Linkedin Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

Robots and I - Intelligent Process Automation, Siri and More

Today I had the privilege of interviewing two robotics and process automation experts.  I learned there are many different kinds of robots, including the humanoid types we see in movies and robots made entirely of software.  In this interview we discuss Rob Brown's recent white paper titled Robots and I and the different forms of robots, and then dig deep with expert Matt Smith into how software robots are transforming many industries today.  Enjoy!

Video Link: https://youtu.be/qOPFD3vshec



Mobile Apps, Blind Spots, Tomatoes and IoT Sensors

Master Tomato Gardener
A lot is written on mobile technologies, the Internet of Things, social media and analytics, but little is written on how all these might work together in a retail environment.  I think best by writing, so let's think this through together.

Blind spots are defined as, “Areas where a person's view is obstructed.” Many business decisions today are still made based on conjecture (unsubstantiated assumptions), because the data needed to make a data-driven decision lies in an operational “blind spot.”

Smart companies, when designing mobile applications, consider how they can personalize the user experience.  They ask themselves how they can utilize all the data they have accumulated on their customers or prospects, plus third-party data sources, to make the experience as beautiful and pleasurable as possible.  To start, they can often access the following kinds of data from their own and/or purchased databases to personalize the experience:
  • Name
  • Age
  • Gender
  • Address
  • Demographic data
  • Income estimate
  • Credit history
  • Education level
  • Marital status
  • Children
  • Lifestyle
  • Social media profile and sentiment
  • Job title
  • Purchase history
  • Locations of purchases
  • Preferences, tastes and style
  • Browsing/Shopping history
This data, however, is basic.  It is merely a digital profile, it has many blind spots, and it is often not based on real-time data.  As competition stiffens, the profile data above will not be enough to deliver a competitive advantage.  Companies will need to find ways to reduce the blind spots in their data so they can increase the degree of personalization.
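
The static profile above can be pictured as a simple record, and the blind-spot argument is about the real-time fields it lacks.  A small sketch, with illustrative field names:

```python
# A static customer profile as a plain record. The field names are
# illustrative; the point is that none of them is a real-time signal.

profile = {
    "name": "Jane Doe",
    "age": 34,
    "gender": "F",
    "purchase_history": ["trowel", "tomato seeds"],
    "browsing_history": ["raised beds", "drip irrigation"],
}

# Hypothetical real-time signals a sensor-aware app could supply.
REALTIME_FIELDS = {"current_location", "current_activity", "ambient_conditions"}

# Which real-time signals does the static profile lack?
blind_spots = REALTIME_FIELDS - profile.keys()
print(sorted(blind_spots))  # ['ambient_conditions', 'current_activity', 'current_location']
```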

Sensors connected to the IoT (Internet of Things) will play an important role in reducing blind spots.  Sensors often cost only a few dollars and can be set up to detect or measure physical properties and then wirelessly communicate the results to a designated server.  Also, as smartphones (sensor platforms, really) include more sensors and make them available to mobile application developers through APIs, the competitive playing field will shift to how these sensors can be used to increase the level of personalization.
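
The sensor-to-server hop can be sketched as follows: one measurement is packaged as JSON and wrapped in an HTTP POST for a collection endpoint.  The endpoint URL and payload fields are hypothetical placeholders, not any vendor's actual protocol:

```python
import json
from urllib import request

INGEST_URL = "https://example.com/ingest"  # hypothetical server endpoint

def build_reading(sensor_id, value, unit):
    """Serialize one measurement for transmission."""
    return json.dumps({"sensor": sensor_id, "value": value, "unit": unit})

def post_reading(payload, url=INGEST_URL):
    """Prepare the HTTP POST a sensor's radio would send (not sent here)."""
    return request.Request(url, data=payload.encode("utf-8"),
                          headers={"Content-Type": "application/json"})

payload = build_reading("soil-probe-7", 21.4, "percent_vwc")
req = post_reading(payload)
print(req.get_method())  # POST
```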

Let’s imagine a garden supply company, GardenHelpers, developing a mobile application.  The goal of the application is to provide competitive differentiation in the market by offering personalized gardening advice and solutions.  GardenHelpers uses the following smartphone sensors in their design to provide more personalized gardening advice:
  • GPS sensor (location data)
  • Cell Tower signal strength (location data)
  • Magnetometer sensor (location of sun)
  • Ambient light sensor (available sunlight)
  • Barometer sensor (altitude)
GardenHelpers combines the sensor data with date and time, plus third-party information such as:
  • GIS (geospatial information system on terrain, slopes, angles, watershed, etc.) data
  • Historic weather information
  • Government soil quality information
  • Government crop data, recommendations and advice
GardenHelpers also encourages the user to capture the GPS coordinates of each corner of their garden via their smartphone, to input the estimated garden size, and to capture the amount of sunlight at various times of the day through the ambient light sensor.  This information is compared with area weather data to estimate the amount of shade and sunlight on the garden.
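
A sketch of how the app might turn those four GPS corner fixes into a garden-size estimate: project latitude/longitude into local meters with an equirectangular approximation, then apply the shoelace formula.  The code is an illustrative assumption, not GardenHelpers' actual method; the accuracy is fine for garden-sized plots, where a few meters of GPS error dominates anyway.

```python
import math

EARTH_RADIUS_M = 6_371_000

def garden_area_m2(corners):
    """corners: list of (lat, lon) pairs in degrees, in walking order."""
    lat0, lon0 = corners[0]
    scale = math.cos(math.radians(lat0))  # meters per degree of longitude shrink with latitude
    # Project each corner to local x/y meters relative to the first corner.
    pts = [(math.radians(lon - lon0) * EARTH_RADIUS_M * scale,
            math.radians(lat - lat0) * EARTH_RADIUS_M)
           for lat, lon in corners]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1  # shoelace term for one edge
    return abs(area) / 2.0

# A roughly 10 m x 10 m plot near 45° N:
corners = [(45.00000, -120.00000), (45.00000, -119.99987),
           (44.99991, -119.99987), (44.99991, -120.00000)]
print(round(garden_area_m2(corners)))  # roughly 100 square meters
```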

GardenHelpers now understands a great deal about the gardener (the mobile app user) and the garden's location, size, lay of the land and sunlight at various times.  However, there remain “blind spots.”  GardenHelpers doesn't know the exact temperature, wind speeds, humidity levels, or the amount of water in the soil of the garden.  How do they remedy these blind spots?  They offer to sell the gardeners a kit of wireless IoT sensors to measure these.

With all of this information the blind spots are greatly reduced, but some remain.  What about local pests, soil issues and advice?  GardenHelpers adds a social and analytics element to their solution, which enables gardeners to share advice with other local gardeners with similar garden sizes and crops.

GardenHelpers can now deliver a mobile app that is hyper-personalized for their customers and prospects.  The products they offer and recommend are not selected randomly, but are based on precise smartphone and sensor data.  The mobile app combined with the IoT sensors becomes an indispensable tool for their customers, which leads to increased brand loyalty and sales.


Sensors - Sensing and Sharing the Physical World

Global Sensor Data
We spend a lot of time talking and writing about the IoT (Internet of Things) in the macro, as a giant worldwide network of objects and things communicating with one another.  That is indeed interesting, but the most interesting components of the IoT, in my opinion, are the sensors.  Sensors are defined as, "Devices that detect or measure a physical property and record, indicate, or otherwise respond to it."  In the context of the IoT, sensors detect or measure a physical property and then communicate the findings wirelessly to a server for analysis.  Sensors are our digital fingers that touch and feel the earth and environment!

Just last week I read this about a new iPhone patent: "The patent is titled “Digital camera with light splitter.” The camera described in the patent has three sensors for splitting color. The camera would split colors into three different rays. These would be red, green and blue. The splitting of colors is designed to allow the camera to maximize pixel array resolution."  This patent could potentially help Apple improve the image quality of its mobile cameras, especially in video.  In other words, it will help iPhones better capture, display and share scenes from our planet.

At the Mobile World Congress in Barcelona this year I saw a demonstration of an iPhone add-on from the company Flir: a personal thermal imaging camera.  You connect it to your iPhone, and then you can find leaky pipes in your walls, overloaded electrical breakers, or even live rodents hiding in your walls.  You can use it on your boat to spot floating debris in the water at night, or while hiking in the dark to spot hidden predators preparing to devour you.  I WANT ONE NOW!

Sensors measure and collect data and can be connected to just about any piece of equipment.  Satellite cameras are sensors.  There are audio and visual sensors.  There are pressure and heat sensors.  There are all kinds of sensors.  One of the most interesting sensor technologies I have been researching of late is hyperspectral remote sensing.

Hyperspectral sensors combined with GIS (geospatial information systems) data and Big Data analytics are a powerful mix.  These sensors can be integrated into very powerful cameras.  Hyperspectral remote sensing is an emerging technology that is being studied for its ability to detect and identify minerals, terrestrial vegetation, and man-made materials and backgrounds.  I want one!

Hyperspectral remote sensing combines imaging and spectroscopy (the measurement of radiation intensity as a function of wavelength) in a single system, which often produces large data sets that require Big Data analytics.  Hyperspectral imagery is typically collected (and represented) as a data cube, with spatial information collected in the X-Y plane and spectral information represented in the Z direction.
[Figure: hyperspectral imaging data cube]
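
That data-cube layout can be sketched with a synthetic array: two spatial axes (X-Y) and one spectral axis (Z).  The shapes here are arbitrary illustrations (224 bands is typical of some airborne instruments):

```python
import numpy as np

# A synthetic hyperspectral data cube: spatial rows x spatial cols x bands.
rows, cols, bands = 100, 120, 224
cube = np.random.rand(rows, cols, bands)

# One pixel's full spectrum: a 1-D slice along the spectral (Z) axis.
spectrum = cube[40, 60, :]
print(spectrum.shape)  # (224,)

# One spectral band as a 2-D image: a slice across the X-Y plane.
band_image = cube[:, :, 100]
print(band_image.shape)  # (100, 120)
```

Material identification then amounts to comparing each pixel's spectrum against a library of known spectral signatures, which is where the Big Data analytics come in.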

What can be done with hyperspectral remote sensing?  Using powerful hyperspectral cameras, one can distinguish noble gases (each gas emits a unique color on the spectrum) and different inks, dyes and paints (each has characteristics that can be uniquely identified).  You can detect, identify and quantify chemicals.  You can detect chemical composition and physical properties, including temperature and velocity, all with a camera!

Taking a hyperspectral image of an object, connected to real-time Big Data analytics, can tell you an amazing amount about it.  Theoretically, a hyperspectral image of a person combined with facial recognition could identify the person, their shampoo, make-up, hand lotion, deodorant, perfume, the food they ate, chemicals they have been in contact with, and the materials and chemicals used in their clothes.  OK, the implications of this technology for personal privacy are really creepy, but the technology itself is fascinating.

Theoretically, hyperspectral remote sensing systems can be used in healthcare, food monitoring, airport security, public safety and intelligence systems, and can be integrated with drone and satellite surveillance systems.

Today, luckily, these cameras are far too expensive for me.

Related Articles: http://mobileenterprisestrategies.blogspot.com/2015/04/iot-sensors-tactile-feedback-iphones.html

Related Video: http://mobileenterprisestrategies.blogspot.com/2015/03/iot-and-sensors-from-ams-at-mwc15.html

IoT Sensors, Tactile Feedback, iPhones and Digital Transformation

IoT sensors extend our physical senses beyond our physical reach and communicate the results from afar. They also allow us to share experiences remotely, not just mentally, but also tactilely. That is the first time I have ever used the word “tactilely.” It means to experience something tangibly or physically. For example, AMS’s MEMS gas sensor allows people to hear, see and smell inside their home remotely from an iPhone app. The Withings Home camera sends alerts to an iPhone if it detects movement or noise in the house. Its night-vision sensor mode even enables the remote viewer to see in the dark. The viewer can also talk through the camera to ask questions like, “Who are you, and why are you carrying my big screen TV away?”

Today you can combine 3D modeling apps for smartphones and tablets with sounds, vibrations and colors to augment your reality with tactile experiences. Wireless sensors and 3D modeling and visualization tools enable you to see and monitor conditions at a distance, in real time. A combination of sensors, analytics, visualization and tactile feedback tools can alert and inform you of changing conditions, patterns or variations in activity or data. This experience can truly augment your reality.

The new Apple Watch enables you to signal somebody on the other side of the world with tactile vibrations that you customize. For example, while on the road I can signal my wife that I miss her by sending five quick “pulses” that vibrate on her wrist.

Digitally modeled realities enable experts anywhere in the world to work with and manage factories, farms and other kinds of operations from distant locations. The obstacles of the past (a lack of information and monitoring capabilities that resulted in operational blind spots) are quickly disappearing as more sensors are put in place. Most of us own or have seen noise-canceling headsets. Sensors in the headset capture the incoming noise and instantly counter it with anti-sound that matches the sensor data. The same kind of sensor technology can capture noise and transmit it to distant locations, where it can be recreated and listened to by others.
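
The anti-sound trick is simple arithmetic: the counter-waveform is the sensed waveform inverted, so the two sum to silence.  A toy illustration with a sampled tone (real headsets must do this continuously, within microseconds, against unpredictable noise):

```python
import math

SAMPLE_RATE = 8000  # samples per second (illustrative)

# "Sensed" noise: 10 ms of a 440 Hz tone.
noise = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)
         for n in range(80)]

anti_sound = [-s for s in noise]                     # phase-inverted copy
residual = [a + b for a, b in zip(noise, anti_sound)]  # what the ear hears

print(max(abs(r) for r in residual))  # 0.0 -- perfect cancellation, in theory
```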

I can imagine near-term scenarios where entire factory floors are digitally replicated and real-time operations are viewed and managed from great distances. Every component of the operation can be monitored via sensor data. Aberrations, out-of-compliance data and other faults would instantly trigger alerts, notifications and remedies.

In the military arena, acoustic sensors can now pinpoint the location of incoming bullets, rockets, missiles, etc., in real time and activate various instantaneous countermeasure technologies. Data means power.

Today's competitive marketplace requires companies to collect, analyze and utilize more data to improve customer interactions and engagements. Mobile devices are exceptionally well designed to assist in this effort. Apple's iPhone and Apple Watch come with an array of sensors for collecting data about your surroundings:
  • Touch/Multi-Touch screen sensor
  • Force Touch sensor – measures different levels of touch (Apple Watch); determines the difference between a tap and a press
  • Taptic Engine – tactile feedback via gentle vibration (Apple Watch)
  • Audio/Voice sensor
  • GPS sensor
  • Bluetooth sensor (supports iBeacon)
  • WiFi sensor
  • WiFi strength sensor – helps track indoor activities
  • Proximity sensor – deactivates the display and touchscreen when the device is brought near the face during a call
  • Ambient Light sensor – brightens the display in sunlight and dims it in darker places
  • Magnetometer sensor – measures the strength and/or direction of the magnetic field in the vicinity of the device; runs the digital compass
  • Accelerometer sensor – measures the force of acceleration, i.e. the speed of movement (using movement and gravity sensing); counts steps, distance and speed of movement; detects the angle at which an iPhone is held
  • Apple Watch sensors measure steps taken, calories burned, and pulse rate
  • Gyroscope – 3-axis gyro (combined with the accelerometer, provides 6-axis motion sensing): pitch, roll and yaw
  • Barometer sensor – altitude, elevation gain during workouts, weather conditions
  • Camera with a plethora of sensors and digital features: face detection, noise reduction, optical image stabilization, auto-focus, color sensors, backside illumination sensor, True Tone flash
  • Fingerprint identity sensor
  • Heart rate sensor (Apple Watch) – uses infrared and visible-light LEDs and photodiodes to detect heart rate
Other sensor add-ons: personal thermal imaging camera (Flir)
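
As a sketch of how raw accelerometer samples become one of the measurements above (steps taken): compute the acceleration magnitude per sample and count upward threshold crossings.  Real pedometers filter the signal and adapt the threshold; the values here are illustrative assumptions:

```python
import math

def count_steps(samples, threshold=11.0):
    """samples: list of (ax, ay, az) in m/s^2. Count rising threshold crossings."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and not above:
            steps += 1       # one rising edge ~ one footfall
        above = magnitude > threshold
    return steps

# Synthetic trace: resting gravity (~9.8 m/s^2) with three impact spikes.
trace = ([(0, 0, 9.8)] * 5 + [(0, 0, 13.0)]
         + [(0, 0, 9.8)] * 5 + [(0, 0, 12.5)]
         + [(0, 0, 9.8)] * 5 + [(0, 0, 13.2)]
         + [(0, 0, 9.8)] * 3)
print(count_steps(trace))  # 3
```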

I attended a defense-related conference and listened to an IT expert from the CIA present on how they can use the sensors on smartphones to uniquely identify the walking style and pace of individuals. For example, an intelligence agency may suspect the person carrying a phone is a bad guy. They can remotely switch on the smartphone's sensors, record the walking style and pace of the person carrying the phone, and match it against their database records.

Sensors help bridge the gap between the physical and digital worlds. They convert the physical world into data. Tactile feedback tools convert the data back into physical experiences – like a Star Trek Transporter.

Mobile apps can also be considered the API (application programming interface) between humans and smartphones. Sensors are the API between the phone and the physical world. For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer. The human queries their stomach for pangs and preferences, and then inputs the results into the mobile app by touching the keypad or using their voice. Suddenly a server in an Amazon data center knows your stomach's inputs! That is one powerful sensor and API! Given the vast array of natural sensors in the human body, incredible things can be done once their signals are converted to data.

Until recently, the data from the natural sensors in the human body was mostly communicated to analytics engines via a human's touch, typing, drawing or voice inputs. The emergence of wearable sensors and smart devices, however, changes that. Wearable sensors can bypass the human in the middle and wirelessly communicate directly with your applications or healthcare provider.

Sensors and computers are also connected to the non-physical. Applications can react differently based on recognized time inputs. Once the clock reaches a specified time, an alarm can be activated, sending sound waves to your physical ear. That is converting the non-physical (time) into sound waves that vibrate our eardrums.

The challenge for businesses today is to envision how all of these sensors and available real-time data can be used to improve sales, customer service, product design, marketplace interactions and engagements so there are more profits at the end of the day.

In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks. The disruptors of tomorrow, however, are likely to come from digital sources – sensors, code halos, big data, mobile devices and wearables.

The task and challenge of every IT department is to understand and design a strategy that recognizes the competitive playing fields of tomorrow are among the digits.



Interviews with Kevin Benedict