Showing posts with label sensors. Show all posts

Top 11 Articles on IoT, Mobility, Code Halos and Digital Transformation Strategies

Most of the stuff I write is rubbish, but these 11 articles beat the odds and are actually worth reading. You can find my complete Top 40 list here. Enjoy!

  1. Mobile Apps, Blind Spots, Tomatoes and IoT Sensors
  2. IoT Sensors, Nerves for Robots and the Industrial Internet
  3. Sensors - Sensing and Sharing the Physical World
  4. IoT Sensors, Tactile Feedback, iPhones and Digital Transformation
  5. IoT, Software Robots, Mobile Apps and Network Centric Operations
  6. Networked Field Services and Real-Time Decision Making
  7. Thinking About Enterprise Mobility, Digital Transformation and Doctrine
  8. GEOINT, GIS, Google Field Trip and Digital Transformation
  9. Connecting the Dots Between Enterprise Mobility and IoT
  10. Merging the Physical with the Digital for Optimized Productivity
  11. IoT Sensors Extend Our Physical Senses Beyond Our Physical Reach
You can find my Top 75 articles on Mobile Strategies here.

************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
Digital Transformation, EBA, Center for the Future of Work Cognizant
View my profile on LinkedIn
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin'sYouTube Channel
Join the Linkedin Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

IoT Sensors, Nerves for Robots and the Industrial Internet

Sensors are Nerves for Robots
Yesterday I interviewed two robotics experts on the growing demand for IPA (intelligent process automation) robots.  These robots are made entirely of software.  They execute pre-defined actions based on the steps in a process, the analysis of data, and the decision trees they are given.  For example, an IPA robot can review a car loan application and approve or deny it instantly, based on the data.  In fact, these robots can analyze the data from tens of thousands of car loans in seconds, based on the parameters and decision trees they have been given.
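As a rough illustration, the decision-tree approval described above might look like the sketch below. The field names and thresholds are invented for illustration, not taken from any real lender's policy.

```python
# Hedged sketch: the kind of decision tree an IPA robot might walk to
# approve or deny a car-loan application. Field names and thresholds
# are invented for illustration.

def decide_loan(application: dict) -> str:
    """Return 'approved' or 'denied' based on simple pre-defined rules."""
    if application["credit_score"] < 620:
        return "denied"
    if application["loan_amount"] > 0.5 * application["annual_income"]:
        return "denied"
    if application["debt_to_income"] > 0.43:
        return "denied"
    return "approved"

applications = [
    {"credit_score": 710, "loan_amount": 18000,
     "annual_income": 60000, "debt_to_income": 0.25},
    {"credit_score": 580, "loan_amount": 9000,
     "annual_income": 40000, "debt_to_income": 0.20},
]

# Scoring tens of thousands of applications is just a bigger list.
decisions = [decide_loan(a) for a in applications]
print(decisions)  # ['approved', 'denied']
```

The speed claim follows directly from this structure: each decision is a handful of comparisons, so a single process can score enormous batches in seconds.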

There are literally hundreds of thousands of different use cases for IPA robots.  IPA robots can also interact with IoT sensors and take actions based on sensor data, not just by completing digital business processes but also by controlling physical equipment and machines.  Sensors serve robots in much the same way nerves serve us humans.

Earlier this week I was briefed by AMS AG, a developer of IoT sensors.  They just released a new sensor that smells odors in homes and offices.  Yes, indeed!  The sensor is embedded in a home monitoring system from Withings.  In Withings’ Home product, the AS-MLV-P2 sensor is combined with a 5-megapixel video camera, dual microphones, temperature and humidity sensors, and Wi-Fi® and Bluetooth® Smart radios. This means that users of the Home monitoring system can see, hear, feel and smell the inside of their home or office remotely via a smartphone or tablet app supplied by Withings.

AMS’s sensor detects VOCs (volatile organic compounds), including both human-made and naturally occurring chemical compounds. These include ambient concentrations of a broad range of reducing gases associated with bad air quality, such as alcohols, aldehydes, ketones, organic acids, amines, and aliphatic and aromatic hydrocarbons, all of which can be harmful to human and animal health at high concentrations. These account for most of the scents humans can smell.  In the Home app, the sensor’s measurements of these chemicals are converted to an air-quality rating as well as to a measurement of VOC concentrations.

If you combine IPA robots, AMS’s sensors and Withings’ Home monitoring system with your HVAC system, an IPA robot can continuously ensure healthy air quality in your home or office. In fact, an IPA robot could manage the air quality and security of tens of thousands of homes and offices at the same time.  The results of these findings and actions can be displayed and controlled on smartphones and tablets as well.
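The closed loop just described can be sketched in a few lines: a VOC reading is mapped to an air-quality rating, and the rating drives an HVAC action. The thresholds below (in ppb) are assumptions for illustration, not values from the AS-MLV-P2 datasheet.

```python
# Hedged sketch of sensor-driven HVAC control by a software robot.
# Thresholds are invented for illustration.

def air_quality_rating(voc_ppb: float) -> str:
    """Convert a VOC concentration into a coarse air-quality rating."""
    if voc_ppb < 300:
        return "good"
    if voc_ppb < 1000:
        return "moderate"
    return "poor"

def hvac_action(voc_ppb: float) -> str:
    """Increase ventilation when the measured VOC concentration is high."""
    rating = air_quality_rating(voc_ppb)
    return "increase ventilation" if rating == "poor" else "maintain"

print(air_quality_rating(150), hvac_action(150))    # good maintain
print(air_quality_rating(1500), hvac_action(1500))  # poor increase ventilation
```

Managing thousands of buildings is then a matter of running this loop over every building's latest reading.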

Not only do robots sense the physical world, they automatically react to it on your behalf.  In my opinion, how sensors detect and communicate the physical and natural world to humans and robots is one of the most interesting areas of innovation today.

An additional value of using IPA robots is the massive clouds of data they spin off as a result of their decisions and actions.  This data can be further analyzed to find new areas for optimization and potential business opportunities.  Herein lies an emerging area where big data analysis can give us even deeper insights.




Robots and I - Intelligent Process Automation, Siri and More

Today I had the privilege of interviewing two robotics and process automation experts.  I learned there are many different kinds of robots, including the humanoid types we see in movies and robots made entirely of software.  In this interview we discuss Rob Brown's recent white paper titled Robots and I and the different forms of robots, and then dig deep with expert Matt Smith into how software robots are transforming many industries today.  Enjoy!

Video Link: https://youtu.be/qOPFD3vshec



Mobile Apps, Blind Spots, Tomatoes and IoT Sensors

Master Tomato Gardener
A lot is written on mobile technologies, the Internet of Things, social media and analytics, but little is written on how all these might work together in a retail environment.  I think best by writing, so let's think this through together.

Blind spots are defined as, “Areas where a person's view is obstructed.” Many business decisions today are still made based on conjecture (unsubstantiated assumptions), because the data needed to make a data-driven decision lies in an operational “blind spot.”

Smart companies, when designing mobile applications, consider how they can personalize the user experience.  They ask themselves how they can utilize all the accumulated data they have collected on their customers or prospects, plus third-party data sources, to make the experience as beautiful and pleasurable as possible.  To start, they can often access the following kinds of data from their own and/or purchased databases to personalize the experience:
  • Name
  • Age
  • Gender
  • Address
  • Demographic data
  • Income estimate
  • Credit history
  • Education level
  • Marital status
  • Children
  • Lifestyle
  • Social media profile and sentiment
  • Job title
  • Purchase history
  • Locations of purchases
  • Preferences, tastes and style
  • Browsing/Shopping history
This data, however, is basic.  It is merely a digital profile. It has many blind spots.  It is often not based on real-time data.  As competition stiffens, the above profile data will not be enough to deliver a competitive advantage.  Companies will need to find ways to reduce blind spots in their data so they can increase the degree of personalization.

Sensors connected to the IoT (Internet of Things) will play an important role in reducing blind spots. Sensors often cost only a few dollars and can be set up to detect or measure physical properties and then wirelessly communicate the results to a designated server.  Also, as smartphones (aka sensor platforms) include more sensors and make them available to mobile application developers through APIs, the competitive playing field will shift to how these sensors can be used to increase the level of personalization.
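The detect-then-report pattern described above can be sketched in a few lines: a measurement is packaged as JSON for transmission to a designated server. The sensor ID, field names and endpoint are hypothetical.

```python
# Minimal sketch of a sensor reading being packaged for wireless
# transmission to a server. All identifiers here are hypothetical.
import json
import time

def package_reading(sensor_id: str, value: float, unit: str) -> str:
    """Serialize one sensor measurement as a JSON message."""
    payload = {
        "sensor_id": sensor_id,
        "value": value,
        "unit": unit,
        "timestamp": int(time.time()),
    }
    return json.dumps(payload)

msg = package_reading("soil-moisture-01", 0.37, "m3/m3")
# urllib.request.urlopen("https://example.com/ingest", msg.encode())  # hypothetical endpoint
print(msg)
```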

Let’s imagine a garden supply company, GardenHelpers, developing a mobile application.  The goal of the application is to provide competitive differentiation in the market by offering personalized garden advice and solutions.  GardenHelpers uses the following smartphone sensors in the design to provide more personalized gardening advice:
  • GPS sensor (location data)
  • Cell Tower signal strength (location data)
  • Magnetometer sensor (location of sun)
  • Ambient light sensor (available sunlight)
  • Barometer sensor (altitude)
GardenHelpers combines the sensor data with date and time, plus third-party information such as:
  • GIS (geospatial information system on terrain, slopes, angles, watershed, etc.) data
  • Historic weather information
  • Government soil quality information
  • Government crop data, recommendations and advice
GardenHelpers also encourages users to capture GPS coordinates at each corner of their garden, via their smartphone, to estimate the garden's size, and to capture the amount of sunlight at various times of day through the ambient light sensor.  This information is compared with area weather data to estimate the amount of shade and sunlight on the garden.
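The corner-capture step implies a small geometry computation. Here is a sketch, assuming four GPS fixes captured in walk order: an equirectangular projection (adequate at garden scale) converts degrees to meters, then the shoelace formula gives the enclosed area. The coordinates are invented.

```python
# Sketch of estimating garden area from corner GPS fixes.
import math

def polygon_area_m2(corners):
    """corners: list of (lat, lon) pairs in degrees, in walk order."""
    lat_ref, lon_ref = corners[0]
    R = 6371000.0  # mean Earth radius in meters
    cos_lat = math.cos(math.radians(lat_ref))
    # Project each fix to local x-y meters relative to the first corner.
    pts = [(R * math.radians(lon - lon_ref) * cos_lat,
            R * math.radians(lat - lat_ref)) for lat, lon in corners]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1  # shoelace formula
    return abs(area) / 2.0

# Four corners of a roughly 10 m x 10 m plot
corners = [(45.0, -120.0), (45.00009, -120.0),
           (45.00009, -119.99987), (45.0, -119.99987)]
print(round(polygon_area_m2(corners)))  # 102
```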

GardenHelpers now understands a great deal about the gardener (the mobile app user) and the garden's location, size, lay of the land and sunlight at various times.  However, “blind spots” remain.  GardenHelpers doesn't know the exact temperature, wind speed, humidity, or the amount of water in the garden's soil.  How do they remedy these blind spots?  They offer to sell gardeners a kit of wireless IoT sensors to measure them.

With all of this information the blind spots are greatly reduced, but some remain.  What about local pests, soil issues and advice?  GardenHelpers adds a social and analytics element to the solution, enabling gardeners to share advice with other local gardeners who have similar garden sizes and crops.

GardenHelpers can now deliver a mobile app that is hyper-personalized for its customers and prospects.  The products it offers and recommends are not selected randomly, but are based on precise smartphone and sensor data. The mobile app, combined with the IoT sensors, becomes an indispensable tool for customers, which leads to increased brand loyalty and sales.


Sensors - Sensing and Sharing the Physical World

Global Sensor Data
We spend a lot of time talking and writing about the IoT (Internet of Things) in the macro, as a giant worldwide network of objects and things communicating with themselves and others.  That is indeed interesting, but the most interesting components of the IoT, in my opinion, are the sensors.  Sensors are defined as "devices that detect or measure a physical property and record, indicate, or otherwise respond to it."  In the context of IoT, sensors detect or measure a physical property and then communicate the findings wirelessly to a server for analysis. Sensors are our digital fingers that touch and feel the earth and environment!

Just last week I read this about a new iPhone patent: "The patent is titled 'Digital camera with light splitter.' The camera described in the patent has three sensors for splitting color. The camera would split colors into three different rays. These would be red, green and blue. The splitting of colors is designed to allow the camera to maximize pixel array resolution." This patent could potentially help Apple improve the image quality of its mobile cameras, especially in video.  In other words, it will help iPhones better capture, display and share the scenes on our planet for viewing.

At the Mobile World Congress in Barcelona this year I saw a demonstration of an iPhone add-on from the company Flir: a personal thermal imaging camera.   You connect it to your iPhone and then you can find leaky pipes in your walls, overloaded electrical breakers, or even live rodents hiding in your walls. You can use it in your boat to spot floating debris in the water in the dark, or while hiking at night to spot hidden predators preparing to devour you.  I WANT ONE NOW!

Sensors measure and collect data and can be connected to just about any piece of equipment. Satellite cameras are sensors.  There are audio and visual sensors.  There are pressure and heat sensors.  There are all kinds of sensors.  One of the most interesting sensor technologies I have been researching of late is hyperspectral remote sensors.

Hyperspectral sensors combined with GIS (geospatial information systems) data and Big Data analytics are a powerful mix. These sensors can be integrated into very powerful cameras. Hyperspectral remote sensing is an emerging technology being studied for its ability to detect and identify minerals, terrestrial vegetation, and man-made materials and backgrounds.  I want one!

Hyperspectral remote sensing combines imaging and spectroscopy (the measurement of radiation intensity as a function of wavelength) in a single system, which often produces large data sets that require Big Data analytics.  Hyperspectral imagery is typically collected (and represented) as a data cube, with spatial information in the X-Y plane and spectral information along the Z direction.
[Image: a hyperspectral imaging data cube]
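The data-cube layout is easy to illustrate with a toy array. The dimensions and values below are synthetic; 224 bands is only an example of a typical band count for airborne instruments.

```python
# Toy illustration of a hyperspectral data cube: spatial information in
# the X-Y plane, spectral information along Z. Values are synthetic.
import numpy as np

rows, cols, bands = 4, 5, 224
cube = np.random.rand(rows, cols, bands)

pixel_spectrum = cube[2, 3, :]  # the full spectrum at one spatial location
band_image = cube[:, :, 100]    # one spectral band viewed as a 2-D image

print(pixel_spectrum.shape, band_image.shape)  # (224,) (4, 5)
```

Identifying a material then amounts to comparing a pixel's spectrum against a library of known spectral signatures, which is where the Big Data analytics come in.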

What can be done with hyperspectral remote sensing?  Using powerful hyperspectral cameras, one can detect noble gases (each gas emits a unique pattern of colors on the spectrum) and distinguish different inks, dyes and paints (each has characteristics that can be uniquely identified).  You can detect, identify and quantify chemicals.  You can detect chemical composition and physical properties, including temperature and velocity, all with a camera!

Taking a hyperspectral image of an object, connected to real-time Big Data analytics, can tell you an amazing amount about it.  Theoretically, a hyperspectral image of a person, combined with facial recognition, could identify the person, their shampoo, make-up, hand lotion, deodorant and perfume, the food they ate, the chemicals they have been in contact with, and the materials and chemicals used in their clothes.  OK, the implications of this technology for personal privacy are really creepy, but the technology itself is fascinating.

Theoretically, hyperspectral remote sensing systems could be used in healthcare, food monitoring, airport security, public safety and intelligence systems, and could be integrated with drone and satellite surveillance systems.

Today, luckily, these cameras are far too expensive for me.

Related Articles: http://mobileenterprisestrategies.blogspot.com/2015/04/iot-sensors-tactile-feedback-iphones.html

Related Video: http://mobileenterprisestrategies.blogspot.com/2015/03/iot-and-sensors-from-ams-at-mwc15.html

IoT Sensors, Tactile Feedback, iPhones and Digital Transformation

IoT sensors extend our physical senses beyond our physical reach and communicate the results from afar. They also allow us to share experiences remotely, not just mentally, but also tactilely. That is the first time I have ever used the word “tactilely.” It means to tangibly or physically experience something. For example, AMS’s MEMS gas sensor allows people to hear, see and smell inside their home remotely from an iPhone app. The Withings Home camera sends alerts to an iPhone if it detects movement or noise in the house. Its night-vision sensor mode enables the remote viewer to even see in the dark. The viewer can also talk through the camera to ask questions like, “Who are you, and why are you carrying my big screen TV away?”

Today you can combine 3D modeling apps for smartphones and tablets with sounds, vibrations and colors so you can augment your reality with tactile experiences. Wireless sensors and 3D modeling and visualization tools enable you to see and monitor conditions at distance - in real-time. A combination of sensors, analytics, visualization and tactile feedback tools can alert and inform you of changing conditions, patterns or variations in activity or data patterns. This experience can truly augment your reality.

The new Apple Watch enables you to signal somebody on the other side of the world with tactile vibrations that you customize. For example, while on the road I can signal my wife that I miss her by sending five quick “pulses” that vibrate on her wrist.

Digitally modeled realities enable experts anywhere in the world to work in and manage factories, farms and other kinds of operations from distant locations. The obstacles of the past, a lack of information and monitoring capabilities that resulted in operational blind spots, are quickly disappearing as more sensors are put in place. Most of us either own or have seen noise-canceling headsets. Sensors in the headset capture the incoming noise, and the headset instantly counters with anti-sound matched to the sensor data. The same kind of sensor technology can capture noise and transmit it to distant locations, where it can be recreated and listened to by others.
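The anti-sound idea can be sketched in a few lines: in the ideal case, the cancelling waveform is simply the inverted copy of the noise the microphone sensor captured, so the two sum to silence.

```python
# Minimal sketch of noise cancellation: invert the captured signal so
# that noise plus anti-sound sums to (ideally) zero at the ear.
import math

samples = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8)]  # captured noise
anti = [-s for s in samples]                                          # anti-sound
residual = [s + a for s, a in zip(samples, anti)]                     # what the ear hears
print(all(abs(r) < 1e-12 for r in residual))  # True
```

Real headsets, of course, must do this continuously and fast enough that the anti-sound arrives in phase with the noise.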

I can imagine near-term scenarios where entire factory floors are digitally replicated and real-time operations are viewed and managed from great distances. Every component of the operation can be monitored via sensor data. Aberrations, out-of-compliance data and other faults would instantly trigger alerts, notifications and remedies.

In the military arena, acoustic sensors can now pinpoint the location of incoming bullets, rockets and missiles in real-time and activate instantaneous countermeasures. Data means power.

Today's competitive marketplace requires companies to collect more data, analyze more data and utilize more data to improve customer interactions and engagements. Mobile devices are exceptionally designed to assist in this effort. Apple's iPhone and Apple Watch come with an array of sensors for collecting data about your surroundings:
  • Touch/Multi-Touch screen sensor
  • Force Touch sensor – measures different levels of touch (Apple Watch); distinguishes a tap from a press
  • Taptic Engine – tactile feedback via gentle vibration (Apple Watch)
  • Audio/Voice sensor
  • GPS sensor
  • Bluetooth sensor (supports iBeacon)
  • WiFi sensor
  • WiFi strength sensor – helps track indoor activities
  • Proximity sensor – deactivates the display and touch sensitivity when the device is brought near the face during a call
  • Ambient light sensor – brightens the display in sunlight and dims it in darker places
  • Magnetometer sensor – measures the strength and/or direction of the magnetic field in the vicinity of the device; runs the digital compass
  • Accelerometer sensor – measures the force of acceleration, i.e. the speed of movement (using movement and gravity sensing); supports step counting, distance, speed of movement and detecting the angle at which an iPhone is held
  • Apple Watch sensors measure steps taken, calories burned and pulse rate
  • Gyroscope – 3-axis gyro (combined with the accelerometer, provides 6-axis motion sensing): pitch, roll and yaw
  • Barometer sensor – altitude, elevation gain during workouts, weather conditions
  • Camera with a plethora of sensors and digital features: face detection, noise reduction, optical image stabilization, auto-focus, color sensors, backside illumination sensor, True Tone flash
  • Fingerprint identity sensor
  • Heart rate sensor (Apple Watch) – uses infrared and visible-light LEDs and photodiodes to detect heart rate
Other sensor add-ons: personal thermal imaging camera (Flir)

I attended a defense-related conference and listened to an IT expert from the CIA present on how sensors on smartphones can be used to uniquely identify the walking style and pace of individuals. For example, an intelligence agency may suspect the person carrying a phone is a bad guy. They can remotely switch on the smartphone's sensors, record the walking style and pace of the person carrying the phone, and match it against their database records.

Sensors help bridge the gap between the physical and digital worlds. They convert the physical world into data. Tactile feedback tools convert the data back into physical experiences – like a Star Trek Transporter.

Mobile apps can also be considered the API (application programming interface) between humans and smartphones. Sensors are the API between the phone and the physical world. For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer. The human queries their stomach for pains and preferences, and then inputs the results into the mobile app by touching the keypad or using their voice. Suddenly a server in an Amazon data center knows your stomach's inputs! That is one powerful sensor and API! Given the vast array of sensors in the human body, incredible things can be done once those sensors' signals are converted to data.

Until recently, the data from the natural sensors in the human body was mostly communicated to analytics engines via a human's touch, typing, drawing or voice input. The emergence of wearable sensors and smart devices, however, changes that. Wearable sensors can bypass the human in the middle and wirelessly communicate directly with your applications or healthcare provider.

Sensors and computers are also connected to the non-physical. Applications can react differently based on recognized time inputs. Once the clock reaches a specified time, an alarm can be activated, sending sound waves to your physical ear. That is converting the non-physical (time) into sound waves that vibrate our eardrums.

The challenge for businesses today is to envision how all of these sensors and available real-time data can be used to improve sales, customer service, product design, marketplace interactions and engagements so there are more profits at the end of the day.

In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks. The disruptors of tomorrow, however, are likely to come from digital disruptions: sensors, Code Halos, big data, mobile devices and wearables.

The task and challenge of every IT department is to understand and design a strategy that recognizes the competitive playing fields of tomorrow are among the digits.



Code Halos - Tracking the Mobile Workforce, Equipment and Other Variables for Optimal Performance

I write and speak often on the need for a thoughtful Code Halo strategy in addition to your mobile and digital strategies.  Code Halos is the term for the information that surrounds people, organizations and devices.  Many companies consider Code Halo strategies only for marketing, sales and customer service, but a well-thought-out Code Halo strategy for work done in the field, like maintenance, repairs, asset management, construction and engineering, is also important.  Let me try to make the case here.

There are many different objects and variables that can impact the performance of a mobile workforce, especially in the services industry.  In my enterprise mobility workshops I call these things PIOs (performance impact objects), and PIVs (performance impact variables).

Examples of PIOs:
  • People
  • Parts/Supplies/Materials
  • Tools
  • Job locations
  • Equipment (and availability)
  • Transportation (and availability)
  • Vendor (and availability)
  • Subcontractor (and availability)
  • Jobsite access
  • Permits/Approvals
Examples of PIVs:
  • Schedules (dependencies)
  • Qualifications
  • Skills
  • Experience
  • Weather
  • Traffic
  • Condition of equipment repair/maintenance
  • Sickness/Health
  • Funding
Each of these items must come together at the right time and place to optimize the performance of a field service technician.  I think of PIOs and PIVs in the context of the building of the first transcontinental railroad, completed in 1869.  For the railroad to be completed and functioning, all the PIOs and PIVs had to come together at the right physical place and time.  If pieces were missing or misaligned, the entire system was delayed or failed.

In an ideal world, we would have full situational awareness.  All of the data from each PIO and PIV would be instantly available to our management system so predictive analytics and artificial intelligence could align all the variables for optimized service delivery.  Full situational awareness does not happen by accident.  It requires a great deal of strategy, planning and execution.

All of the PIOs and PIVs need to be tracked and monitored.  Sensors (IoT), GPS vehicle tracking and smartphones all play important roles here.  The data needed to make the right decisions, whether by a human decision maker or an artificial intelligence system, needs to be collected, and because data has a shelf-life, it needs to be timely.  Those on the Titanic knew they were in trouble, but only when it was too late to prevent it.  They would have appreciated good information a few minutes earlier.

Let me provide a scenario for consideration.  A customer calls in and requires repairs to a specialized, expensive piece of equipment.  The repair requires specialized training and skills, certifications, special parts, special tools and experience.  Knowing just the schedules and locations of your field service technicians is not good enough; you need information about each PIO and PIV.  To provide optimal service to your customer, you need to know and monitor all relevant information.  Since most field service teams are mobile, that means mobile technology and wireless sensors must be integrated with as many PIO and PIV systems as possible to provide the data and visibility needed to maximize productivity.
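The matching problem in this scenario can be sketched as set containment between a request's requirements and each technician's tracked PIOs. All the names, skills and parts below are invented for illustration.

```python
# Hypothetical sketch of matching a service request against technician
# Code Halos (skills, certifications, parts on hand). All values invented.

def can_serve(request: dict, technician: dict) -> bool:
    """True only if the technician covers every requirement."""
    return (request["skills"].issubset(technician["skills"])
            and request["certifications"].issubset(technician["certifications"])
            and request["parts"].issubset(technician["parts_on_van"]))

request = {"skills": {"hydraulics"},
           "certifications": {"OEM-Level2"},
           "parts": {"valve-kit-7"}}
technicians = [
    {"name": "A", "skills": {"hydraulics", "electrical"},
     "certifications": {"OEM-Level2"}, "parts_on_van": {"valve-kit-7"}},
    {"name": "B", "skills": {"hydraulics"},  # available, but not certified
     "certifications": set(), "parts_on_van": {"valve-kit-7"}},
]
eligible = [t["name"] for t in technicians if can_serve(request, t)]
print(eligible)  # ['A']
```

A real dispatch system would layer schedules, travel time and PIVs like weather on top of this basic eligibility check.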

When PIOs and PIVs are all connected via a shared network that provides visibility to network members, it is called a network-centric operation.  A full network-centric operational environment may not be economically feasible for 25 service technicians, but for 2,500 it is.

An available field service technician without the right experience or qualifications doesn't help.  A qualified, experienced and available field service technician without the right tools, equipment or parts, or whose location is too distant to be of service, doesn't help either.

PIOs and PIVs are most often not in one location for easy management.  They are spread across many different locations and accessed via many different systems.  Enterprise mobility, sensors, connectivity, integration, dashboards, dynamic scheduling, HCM (human capital management), GPS tracking, event/project management, predictive analytics and artificial intelligence are all required to bring these pieces, data and variables together for optimal productivity.  Ideally they would be brought together under a considered Code Halo strategy for collecting, analyzing and using data to optimize productivity.




IoT Expert Interview: Microsoft's Nick Landry

This week I am both learning and speaking at the Internet of Things Expo in New York City.  I will be teaching a session on IoT, Code Halos and digital transformation strategies. Today I had the privilege of interviewing Microsoft's mobile and IoT guru Nick Landry (Twitter: @ActiveNick).  In this interview he shares Microsoft's solutions and strategies around the Internet of Things.  Enjoy!

Video Link: http://youtu.be/smE9rjfLiWI




The Internet of Things Comes to the Smartphone

Infrared Sensors for iPhones
For some time now I have been pondering how additional sensors on smartphones might be useful.  Today many smartphones contain the following sensors:
  • Proximity sensor
  • Motion sensor/accelerometer
  • Ambient light sensor
  • Moisture sensor
  • Gyroscopic sensor
  • Magnetometer
These sensors are already incredibly valuable, and more are coming.  I read today about 3D photo sensors that Google is testing for smartphones, along with barometric sensors to help determine which floor of a building you are on, and biometric sensors that recognize fingerprints for security purposes.  The 3D sensor would enable intelligent applications to understand the shape and layout of a room, the barometric sensor would help identify the floor of a building, and the magnetometer would help an application understand the direction the room is facing.  All of these sensors could contribute to interesting indoor retailing apps. They convert the physical into the digital.  They represent the tip of the spear for digital transformation.  Once that conversion is complete, software algorithms can be programmed and intelligence added to support revolutionary new business processes.
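The floor-detection idea is a nice example of how raw sensor data becomes useful digital information. A rough sketch, assuming the standard-atmosphere barometric formula and a hypothetical 3-meter floor height (real apps calibrate against a ground-level reference reading, as below):

```python
def altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude from barometric pressure
    (international barometric formula, standard atmosphere)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

def estimate_floor(pressure_hpa, ground_pressure_hpa, floor_height_m=3.0):
    """Estimate floors above a ground-level reference reading.
    Using a relative reading cancels out weather-driven pressure drift."""
    delta = altitude_m(pressure_hpa) - altitude_m(ground_pressure_hpa)
    return round(delta / floor_height_m)
```

Air pressure drops roughly 1.2 hPa per 10 meters near sea level, so the sensor only needs to resolve a fraction of a hectopascal to distinguish adjacent floors.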

Evolution in Sensor Sizes
At Mobile World Congress 2014 in Barcelona last week I discovered more interesting sensors.  I saw a sensor the size of a pin head for monitoring and reporting humidity and temperature (Sensirion). It connected to smartphones via Bluetooth.  My favorite, though, was the thermal imager from Flir that connects to an iPhone via a sled (with extra batteries).  It lets you, among other things, point at a person walking by to see their body temperature (a must for distinguishing the living from the dead and the human from an android), and warm-blooded animals and warm objects light up your iPhone screen even in complete darkness.

Imagine you hear a loud noise in your backyard at night.  Reach for your thermal-imager-enabled iPhone and scan your backyard.  The mountain lion hiding behind your shrubbery is instantly exposed.  This can be useful in my neighborhood (see http://magicvalley.com/news/local/mountain-lion-killed-in-idaho-neighborhood/article_7b8e91ce-902a-11e3-a505-001a4bcf887a.html).  You can also scan your hardwood floor to see if a giraffe or anyone else has walked across it in the past few minutes.

These sensors are just the tip of the iceberg.  There are hundreds of sensors that are being miniaturized today (see the Evolution in Sensor Size photo above I took at MWC14).  They capture information about your physical world and can wirelessly transmit this information to your smartphone or enterprise server.  How fun!

The Internet of Things is already massively increasing the amount of data flowing into servers from the physical world.  Our big task in 2014 is imagining all the ways this data can be used to make our world a better place.


The Race for Sensors to Supply Big Data and Enterprise Mobility

Today's competitive marketplace requires companies to collect more data, analyze more data and utilize more data to improve customer interactions and engagements.  Mobile devices are exceptionally designed to assist in this effort.  Apple's iPhone comes with an inventory of sensors:
  • Touch
  • Voice
  • GPS
  • Proximity
  • Ambient Light
  • Accelerometer
  • Magnetometer
  • Gyroscopic
I listened to an IT expert in the CIA give a presentation on how they could use the sensors on a typical smartphone to uniquely identify the walking style and pace of individuals.  For example, the intelligence agency may suspect that the person carrying a phone is a bad guy.  It can remotely switch on the smartphone's sensors, record the walking style and pace of the person carrying the phone, and match it against its database records.  SCARY ISN'T IT!?
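The basic ingredient of that kind of gait analysis is nothing exotic: the accelerometer stream already on the phone. As a minimal, purely illustrative sketch (real gait identification uses far richer features than this hypothetical threshold-crossing step counter), walking pace can be estimated from acceleration magnitudes alone:

```python
import math

def step_cadence(samples, rate_hz, threshold=1.5):
    """Estimate steps per second from raw accelerometer samples.

    samples   -- list of (x, y, z) accelerations in g
    rate_hz   -- sampling rate of the accelerometer
    threshold -- magnitude (in g) whose upward crossings we count as steps
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    # Count upward crossings of the threshold: each one is treated as a step.
    steps = sum(1 for prev, cur in zip(mags, mags[1:])
                if prev < threshold <= cur)
    duration_s = len(samples) / rate_hz
    return steps / duration_s
```

Cadence plus stride timing, arm swing and phone orientation together form a surprisingly distinctive signature, which is exactly what makes the scenario above plausible.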

Those are just a few of the sensors available that integrate the physical world with the digital.  Read this article I wrote to learn more about the incredible capabilities of sensors.

Mobile apps can also be considered the API (application programming interface) between humans and smartphones.  For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer.  The human queries their stomach, and then inputs the results into their mobile app by touching the keypad or using their voice.  Suddenly a server in an Amazon data center knows your stomach's inputs!  That is one powerful sensor and API!  Given the vast array of sensors in the human body incredible things can be done once those sensor inputs are digitized.

Although there are many powerful sensors in the human body, the API is still the human's touch, typing or voice.  The emergence of wearable sensors and smart devices is a way to automate data collection so humans are not required to spend time and effort inputting the data.

Sensors can also connect to the non-physical.  A sensor can connect with time: once the clock reaches a specified point, a digital alarm goes off, striking your physical ear with sound waves.  That makes non-physical inputs physical.

The challenge for businesses today is to envision how all of these sensors and available real-time data can be used to improve customer service, product design, marketplace interactions and engagements so there are more profits at the end of the day.  

In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks.  The disruptions of tomorrow, however, will likely be digital: sensors, code halos, big data, mobile devices and wearables.

The task and challenge of every IT department is to understand and design a strategy that recognizes that the competitive playing fields of tomorrow are among the digits.

***Have you seen the new Mobile Solution Directory here http://mobilesolutiondirectory.blogspot.com/?


Interviews with Kevin Benedict