Today you can combine 3D modeling apps for smartphones and tablets with sounds, vibrations and colors to augment your reality with tactile experiences. Wireless sensors plus 3D modeling and visualization tools let you see and monitor conditions at a distance, in real time. Together, sensors, analytics, visualization and tactile feedback tools can alert and inform you of changing conditions, patterns or variations in activity and data. This experience can truly augment your reality.
The new Apple Watch enables you to signal somebody on the other side of the world with tactile vibrations that you customize. For example, while on the road I can signal my wife that I miss her by sending five quick “pulses” that vibrate on her wrist.
Digitally modeled realities enable experts anywhere in the world to operate and manage factories, farms and other kinds of operations from distant locations. The obstacles of the past, namely the lack of information and monitoring capabilities that created operational blind spots, are quickly disappearing as more sensors are put in place. Most of us either own or have seen noise-canceling headsets. Sensors in the headset capture the incoming noise and instantly counter it with "anti-sound" that matches the sensor data. This same kind of sensor technology can capture sound and transmit it to distant locations, where it can be recreated and listened to by others.
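To make the noise-canceling idea concrete, here is a minimal sketch in Swift of the core trick: the counter-signal is simply the measured noise inverted in phase, so the two waveforms sum to near silence. Real headsets do this continuously with adaptive filters on dedicated DSP hardware; the function name and sample values below are illustrative only.

```swift
// Toy illustration of active noise cancellation: invert the phase of the
// sampled noise so that noise + anti-noise sums to (approximately) zero.
func antiNoise(for samples: [Float]) -> [Float] {
    return samples.map { -$0 }
}

let measuredNoise: [Float] = [0.12, -0.40, 0.33, 0.05]   // microphone samples
let counterSignal = antiNoise(for: measuredNoise)

// Each pair cancels out: measuredNoise[i] + counterSignal[i] == 0
let residual = zip(measuredNoise, counterSignal).map { $0.0 + $0.1 }
print(residual)   // [0.0, 0.0, 0.0, 0.0]
```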
I can imagine near-term scenarios where entire factory floors are digitally replicated and real-time operations are viewed and managed from great distances. Every component of the operation can be monitored via sensor data. Aberrations, out-of-compliance data and other faults would instantly trigger alerts, notifications and remedies.
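As a hedged sketch of that alerting pattern (the type names, compliance range and raiseAlert helper below are all hypothetical), each incoming sensor reading is checked against an allowed range, and anything outside it raises an alert:

```swift
// Hypothetical sketch: check each sensor reading against a compliance
// range and raise an alert when a value falls outside it.
struct SensorReading {
    let sensorID: String
    let value: Double
}

let complianceRange: ClosedRange<Double> = 10.0...80.0   // illustrative limits

func raiseAlert(for reading: SensorReading) {
    // In a real system this might notify an operator or open a ticket.
    print("ALERT: sensor \(reading.sensorID) out of range at \(reading.value)")
}

func check(_ reading: SensorReading) {
    if !complianceRange.contains(reading.value) {
        raiseAlert(for: reading)
    }
}

check(SensorReading(sensorID: "press-07", value: 92.4))   // triggers an alert
```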
In the military arena, acoustic sensors can now pinpoint the location of incoming bullets, rockets, missiles and the like in real time and activate various instantaneous countermeasure technologies. Data means power.
Today's competitive marketplace requires companies to collect more data, analyze more data and utilize more data to improve customer interactions and engagements. Mobile devices are exceptionally well suited to assist in this effort. Apple's iPhone and Apple Watch come with an array of sensors for collecting data about your surroundings (a sketch of reading a few of them follows the list):
- Touch/Multi-Touch screen sensor
- Force Touch sensor – measures different levels of touch (Apple Watch), distinguishing a tap from a press
- Taptic Engine sensor – tactile feedback via gentle vibration (Apple Watch)
- Audio/Voice sensor
- GPS sensor
- Bluetooth sensor (supports iBeacon)
- WiFi sensor
- WiFi strength sensor – helps track indoor activities
- Proximity sensor – deactivates the display and touch sensitivity when the device is brought near the face during a call
- Ambient Light sensor – brightens the display when you’re in sunlight and dims it in darker places
- Magnetometer sensor – measures the strength and direction of the magnetic field around the device; powers the digital compass
- Accelerometer sensor – measures acceleration forces (movement and gravity), enabling step counting, distance and speed-of-movement estimates, and detection of the angle at which an iPhone is held
- Apple Watch activity sensors – measure steps taken, calories burned and pulse rate
- Gyroscope – 3-axis gyro (combined with the accelerometer, provides 6-axis motion sensing); measures pitch, roll and yaw
- Barometer sensor – altitude, elevation gain during workouts, weather conditions
- Camera with a plethora of sensors and digital features: face detection, noise reduction, optical image stabilization, auto-focus, color sensors, backside illumination sensor and True Tone flash
- Fingerprint identity sensor
- Heart rate sensor (Apple Watch) – uses infrared and visible-light LEDs and photodiodes to detect heart rate
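As a minimal sketch of tapping a few of these sensors, the snippet below uses Apple's Core Motion framework to stream accelerometer and gyroscope readings; the 50 Hz sample rate is an arbitrary choice, and the code assumes it runs inside an iOS app:

```swift
import CoreMotion

// Sample the combined accelerometer + gyroscope ("device motion") at 50 Hz
// and print the readings. Assumes an iOS app context with motion available.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion else { return }
        let a = motion.userAcceleration   // acceleration with gravity removed
        let r = motion.rotationRate       // pitch/roll/yaw rotation rates
        print("accel: \(a.x), \(a.y), \(a.z)  gyro: \(r.x), \(r.y), \(r.z)")
    }
}
```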
I attended a defense-related conference and listened to an IT expert from the CIA present on how they can use the sensors on smartphones to uniquely identify the walking style and pace of individuals. For example, the intelligence agency may suspect that a person carrying a phone is a bad guy. They can remotely switch on the smartphone's sensors, record the walking style and pace of the person carrying it, and match them against their database records.
Sensors help bridge the gap between the physical and digital worlds. They convert the physical world into data. Tactile feedback tools convert the data back into physical experiences – like a Star Trek Transporter.
Mobile apps can also be considered the API (application programming interface) between humans and smartphones. Sensors are the API between the phone and the physical world. For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer. The human queries their stomach for pain and preferences, then inputs the results into the mobile app by touching the keypad or using their voice. Suddenly a server in an Amazon data center knows your stomach's inputs! That is one powerful sensor and API! Given the vast array of sensors in the human body, incredible things can be done once their signals are converted into data.
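For illustration, here is a hedged sketch of that last hop, the app forwarding a human's input to a cloud server; the endpoint URL and JSON payload are invented for the example:

```swift
import Foundation

// Hypothetical example: send the user's typed or spoken food preference
// to a cloud endpoint. The URL and payload format are made up.
var request = URLRequest(url: URL(string: "https://api.example.com/preferences")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONSerialization.data(withJSONObject: ["cuisine": "thai"])

URLSession.shared.dataTask(with: request) { _, response, _ in
    print("Server response: \(String(describing: response))")
}.resume()
```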
Until recently, the data from the natural sensors in the human body was mostly communicated to analytics engines via a human's touch, typing, drawing or voice input. The emergence of wearable sensors and smart devices, however, changes that. Wearable sensors can bypass the human in the middle and wirelessly communicate directly with your applications or healthcare provider.
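As one hedged example of cutting out the human in the middle, an app could read the latest heart-rate sample a wearable has recorded via Apple's HealthKit, assuming the HealthKit entitlement is configured and the user has granted read access (error handling is omitted for brevity):

```swift
import HealthKit

// Read the most recent heart-rate sample recorded by a wearable (e.g., Apple
// Watch). Assumes HealthKit is available and read authorization was granted.
let store = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
let query = HKSampleQuery(sampleType: heartRateType,
                          predicate: nil,
                          limit: 1,
                          sortDescriptors: [newestFirst]) { _, samples, _ in
    if let sample = samples?.first as? HKQuantitySample {
        let bpm = sample.quantity.doubleValue(for: HKUnit(from: "count/min"))
        print("Latest heart rate: \(bpm) bpm")
    }
}
store.execute(query)
```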
Sensors and computers are also connected to the non-physical. Applications can react differently based on recognized time inputs. Once time reaches a specified point, an alarm can be activated, sending sound waves to your physical ear. That converts the non-physical (time) into sound waves that vibrate our eardrums.
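A minimal sketch of that time-to-sound conversion, using iOS local notifications; it assumes notification permission has already been requested and granted, and the 60-second interval is arbitrary:

```swift
import UserNotifications

// Convert a point in time into a physical signal (sound) by scheduling a
// local notification. Assumes the user has already granted permission.
let content = UNMutableNotificationContent()
content.title = "Time reached"
content.sound = .default

// Fire 60 seconds from now (illustrative interval).
let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: false)
let request = UNNotificationRequest(identifier: "time-to-sound",
                                    content: content,
                                    trigger: trigger)
UNUserNotificationCenter.current().add(request)
```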
The challenge for businesses today is to envision how all of these sensors and the real-time data they provide can be used to improve sales, customer service, product design, and marketplace interactions and engagements so there is more profit at the end of the day.
In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks. The disruptors of tomorrow, however, are likely to come from digital disruptions: sensors, code halos, big data, mobile devices and wearables.
The task and challenge of every IT department is to understand and design a strategy that recognizes that the competitive playing fields of tomorrow are among the digits.
************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
Digital Transformation, EBA, Center for the Future of Work
Cognizant
View my profile on LinkedIn
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Browse the Mobile Solution Directory
Subscribe to Kevin's YouTube Channel
Join the Linkedin Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies
***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.