Robots and I - Intelligent Process Automation, Siri and More

Today I had the privilege of interviewing two robotics and process automation experts. I learned there are many different kinds of robots, including the humanoid types we see in movies and robots made entirely of software. In this interview we discuss Rob Brown's recent white paper titled The Robot and I, the different forms of robots, and then dig deep into how software robots are transforming many industries today with expert Matt Smith. Enjoy!

Video Link: https://youtu.be/qOPFD3vshec


************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
Digital Transformation, EBA, Center for the Future of Work Cognizant
View my profile on LinkedIn
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin's YouTube Channel
Join the LinkedIn Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

Mobile Apps, Blind Spots, Tomatoes and IoT Sensors

Master Tomato Gardener
A lot is written on mobile technologies, the Internet of Things, social media and analytics, but little is written on how all these might work together in a retail environment.  I think best by writing, so let's think this through together.

Blind spots are defined as, “Areas where a person's view is obstructed.” Many business decisions today are still made based on conjecture (unsubstantiated assumptions), because the data needed to make a data-driven decision lies in an operational “blind spot.”

Smart companies, when designing mobile applications, consider how they can personalize the user experience. They ask themselves how they can utilize all the data they have accumulated on their customers or prospects, plus third-party data sources, to make the experience as beautiful and pleasurable as possible. To start, they can often access the following kinds of data from their own and/or purchased databases to personalize the experience:
  • Name
  • Age
  • Gender
  • Address
  • Demographic data
  • Income estimate
  • Credit history
  • Education level
  • Marital status
  • Children
  • Lifestyle
  • Social media profile and sentiment
  • Job title
  • Purchase history
  • Locations of purchases
  • Preferences, tastes and style
  • Browsing/Shopping history
This data, however, is basic: it is merely a digital profile, it has many blind spots, and it is often not based on real-time data. As competition stiffens, profile data alone will not be enough to deliver a competitive advantage. Companies will need to find ways to reduce the blind spots in their data so they can increase the degree of personalization.
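As a sketch of how such a profile comes together, and where its blind spots show, consider the toy example below. It merges first-party and purchased records into one profile and lists the attributes still missing; all field names and records are hypothetical illustrations.

```python
# Hypothetical profile fields a retailer might track (illustrative only).
PROFILE_FIELDS = ["name", "age", "gender", "address", "income_estimate",
                  "purchase_history", "browsing_history"]

def build_profile(first_party: dict, third_party: dict) -> dict:
    """Merge the two sources; first-party data wins on conflicts."""
    return {**third_party, **first_party}

def blind_spots(profile: dict) -> list:
    """Fields we have no data for -- decisions here rest on conjecture."""
    return [f for f in PROFILE_FIELDS if f not in profile]

first = {"name": "Pat", "purchase_history": ["trowel", "seeds"]}
third = {"age": 42, "income_estimate": 65000}

profile = build_profile(first, third)
gaps = blind_spots(profile)  # → ['gender', 'address', 'browsing_history']
```

The point of the sketch is the `blind_spots` list: a static profile always leaves gaps that only fresher, often sensor-derived, data can fill.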

Sensors connected to the IoT (Internet of Things) will play an important role in reducing blind spots. Sensors often cost only a few dollars and can be set up to detect or measure physical properties and then wirelessly communicate the results to a designated server. As smartphones (aka sensor platforms) add more sensors, and make those sensors available to mobile application developers through APIs, the competitive playing field will shift to how these sensors can be used to increase the level of personalization.
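The "measure, then report to a designated server" pattern is simple at its core. Here is a minimal sketch of the device side: one reading packaged as JSON for wireless delivery. The payload schema and sensor ID are invented for illustration; a real device would then push this over Wi-Fi, cellular, or an LPWAN link.

```python
import json
import time

def package_reading(sensor_id: str, metric: str, value: float) -> str:
    """Serialize one measurement as the JSON a server-side collector expects."""
    return json.dumps({
        "sensor_id": sensor_id,
        "metric": metric,
        "value": value,
        "timestamp": int(time.time()),  # when the reading was taken
    })

payload = package_reading("soil-7", "soil_moisture_pct", 31.5)
# In a real deployment this payload would be sent with, e.g.,
# an HTTP POST or an MQTT publish to the designated server.
```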

Let’s imagine a garden supply company, GardenHelpers, developing a mobile application. The goal of the application is to provide competitive differentiation in the market by offering personalized garden advice and solutions. GardenHelpers uses the following smartphone sensors in its design to provide more personalized gardening advice:
  • GPS sensor (location data)
  • Cell Tower signal strength (location data)
  • Magnetometer sensor (location of sun)
  • Ambient light sensor (available sunlight)
  • Barometer sensor (altitude)
GardenHelpers combines the sensor data with date and time, plus third-party information such as:
  • GIS (geospatial information system on terrain, slopes, angles, watershed, etc.) data
  • Historic weather information
  • Government soil quality information
  • Government crop data, recommendations and advice
GardenHelpers also encourages the user to capture GPS coordinates, via their smartphone, at each corner of their garden so the app can estimate the garden's size, and to capture the amount of sunlight at various times of the day through the ambient light sensor. This information is compared with area weather data to estimate the amount of shade and sunlight on the garden.
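Estimating garden size from corner coordinates is a nice concrete example of turning raw sensor data into useful information. The sketch below projects latitude/longitude onto a local flat plane and applies the shoelace formula; the projection is a rough approximation, which is fine for garden-sized plots where GPS error of a few meters dominates anyway. The coordinates are invented.

```python
import math

def garden_area_m2(corners):
    """corners: list of (lat, lon) in degrees, in walking order around the plot."""
    R = 6371000.0                       # mean Earth radius in meters
    lat0 = math.radians(corners[0][0])  # reference latitude for the projection
    # project each corner to local x/y in meters (equirectangular approximation)
    pts = [(R * math.radians(lon) * math.cos(lat0), R * math.radians(lat))
           for lat, lon in corners]
    # shoelace formula over the closed polygon
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# A roughly 10 m x 10 m plot captured at four corners:
square = [(45.0, -120.0), (45.00009, -120.0),
          (45.00009, -119.99987), (45.0, -119.99987)]
area = garden_area_m2(square)  # ≈ 100 m²
```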

GardenHelpers now understands a great deal about the gardener (mobile app user), the garden location, size, lay of the land and sunlight at various times.  However, there remain “blind spots.”  GardenHelpers doesn't know the exact temperature, wind speeds, humidity levels, or the amount of water in the soil of the garden.  How do they remedy these blind spots?  They offer to sell the gardeners a kit of wireless IoT sensors to measure these.

With all of this information the blind spots are greatly reduced, but some remain. What about local pests, soil issues and advice? GardenHelpers adds a social and analytics element to its solution, enabling gardeners to share advice with other local gardeners who have similar garden sizes and crops.

GardenHelpers can now deliver a mobile app that is hyper-personalized for their customers and prospects. The products they offer and recommend are not selected randomly, but are based on precise smartphone and sensor data. The mobile app, combined with the IoT sensors, becomes an indispensable tool for their customers, which leads to increased brand loyalty and sales.

************************************************************************

Sensors - Sensing and Sharing the Physical World

Global Sensor Data
We spend a lot of time talking and writing about the IoT (Internet of Things) in the macro, as a giant worldwide network of objects and things communicating with themselves and others. That is indeed interesting, but the most interesting components of the IoT, in my opinion, are the sensors. Sensors are defined as, "Devices that detect or measure a physical property and record, indicate, or otherwise respond to it." In the context of the IoT, sensors detect or measure a physical property and then communicate the findings wirelessly to a server for analysis. Sensors are our digital fingers that touch and feel the earth and environment!

Just last week I read this about a new iPhone patent: "The patent is titled 'Digital camera with light splitter.' The camera described in the patent has three sensors for splitting color. The camera would split colors into three different rays: red, green and blue. The splitting of colors is designed to allow the camera to maximize pixel array resolution." This patent could potentially help Apple improve the image quality of its mobile cameras, especially in video. In other words, it will help iPhones better capture, display and share the scenes on our planet.

At the Mobile World Congress in Barcelona this year I saw a demonstration of an iPhone add-on from the company Flir: a personal thermal-imaging camera. You connect it to your iPhone, and then you can find leaky pipes in your walls, overloaded electrical breakers, or even live rodents hiding in your walls. You can use it on your boat to spot floating debris in the water at night, or while hiking in the dark to spot hidden predators preparing to devour you. I WANT ONE NOW!

Sensors measure and collect data and can be connected to just about any piece of equipment. Satellite cameras are sensors. There are audio and visual sensors. There are pressure and heat sensors. There are all kinds of sensors. One of the most interesting sensor technologies I have been researching of late is hyperspectral remote sensing.

Hyperspectral sensors combined with GIS (geospatial information systems) data and Big Data analytics are a powerful mix. These sensors can be integrated into very powerful cameras. Hyperspectral remote sensing is an emerging technology that is being studied for its ability to detect and identify minerals, terrestrial vegetation, and man-made materials and backgrounds. I want one!

Hyperspectral remote sensing combines imaging and spectroscopy (the measurement of radiation intensity as a function of wavelength) in a single system, which often produces large data sets that require Big Data analytics. Hyperspectral imagery is typically collected (and represented) as a data cube, with spatial information in the X-Y plane and spectral information along the Z-axis.
[Figure: hyperspectral imaging data cube]
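The data-cube idea is easy to picture in code. Below is a tiny cube built from nested lists: each X-Y pixel holds a full spectrum along Z, and a toy band-ratio test flags pixels whose spectrum matches a target signature. The cube dimensions, band indices and threshold are all illustrative, not a real material library.

```python
WIDTH, HEIGHT, BANDS = 4, 3, 5  # a deliberately tiny cube for illustration

# cube[y][x] is the full spectrum (BANDS intensity values) for one pixel
cube = [[[0.1] * BANDS for _ in range(WIDTH)] for _ in range(HEIGHT)]
cube[1][2] = [0.1, 0.2, 0.9, 0.8, 0.1]  # one pixel with a distinctive signature

def spectrum_at(cube, x, y):
    """Read one pixel's spectrum out of the Z direction of the cube."""
    return cube[y][x]

def matches_signature(spectrum, band_a=2, band_b=0, ratio=3.0):
    """Toy detector: flag pixels where band_a is much brighter than band_b."""
    return spectrum[band_a] / spectrum[band_b] >= ratio

hits = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)
        if matches_signature(spectrum_at(cube, x, y))]
# hits → [(2, 1)]
```

Real hyperspectral cubes have hundreds of bands and millions of pixels, which is exactly why the Big Data analytics mentioned above become necessary.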

What can be done with hyperspectral remote sensing? Using powerful hyperspectral cameras, one can detect noble gases (each gas emits a unique spectral signature) and distinguish different inks, dyes and paints (each has characteristics that can be uniquely identified). You can detect, identify and quantify chemicals. You can detect chemical composition and physical properties, including temperature and velocity, all with a camera!

Taking a hyperspectral image of an object, connected to real-time Big Data analytics, can tell you an amazing amount about it. Theoretically, a hyperspectral image of a person, combined with facial recognition, can identify the person, their shampoo, make-up, hand lotion, deodorant, perfume, the food they ate, chemicals they have been in contact with, and the materials and chemicals used in their clothes. OK, the implications of this technology for personal privacy are really creepy, but the technology itself is fascinating.

Theoretically, hyperspectral remote sensing systems can be used in healthcare, food monitoring, airport security, public safety and intelligence systems, and integrated with drone and satellite surveillance systems.

Today, luckily, these cameras are far too expensive for me.

Related Articles: http://mobileenterprisestrategies.blogspot.com/2015/04/iot-sensors-tactile-feedback-iphones.html

Related Video: http://mobileenterprisestrategies.blogspot.com/2015/03/iot-and-sensors-from-ams-at-mwc15.html
************************************************************************

IoT Sensors, Tactile Feedback, iPhones and Digital Transformation

IoT sensors extend our physical senses beyond our physical reach and communicate the results from afar. They also allow us to share experiences remotely, not just mentally, but also tactilely. That is the first time I have ever used the word “tactilely.” It means to tangibly or physically experience something. For example, AMS’s MEMS gas sensor allows people to hear, see and smell inside their home remotely from an iPhone app. The Withings Home camera sends alerts to an iPhone if it detects movement or noise in the house. Its night-vision sensor mode enables the remote viewer to even see in the dark. The viewer can also talk through the camera to ask questions like, “Who are you, and why are you carrying my big screen TV away?”

Today you can combine 3D modeling apps for smartphones and tablets with sounds, vibrations and colors to augment your reality with tactile experiences. Wireless sensors and 3D modeling and visualization tools enable you to see and monitor conditions at a distance, in real time. A combination of sensors, analytics, visualization and tactile feedback tools can alert and inform you of changing conditions, patterns or variations in activity or data. This experience can truly augment your reality.

The new Apple Watch enables you to signal somebody on the other side of the world with tactile vibrations that you customize. For example, while on the road I can signal my wife that I miss her by sending five quick “pulses” that vibrate on her wrist.

Digitally modeled realities enable experts, from anywhere in the world, to work in and manage factories, farms and other kinds of operations from distant locations. The obstacles of the past, the lack of information and monitoring capabilities that resulted in operational blind spots, are quickly disappearing as more sensors are put in place. Most of us either own or have seen noise-canceling headsets. Sensors in the headset capture the incoming noise and then instantly counter with anti-sound that matches the sensor data. This same kind of sensor technology can capture noise and transmit it to distant locations, where it can be recreated and listened to by others.
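The anti-sound idea can be sketched in a few lines: sample the incoming noise and emit a phase-inverted copy, so the two waveforms sum to silence. The sine-wave "noise" and sample rate below are invented stand-ins for a real microphone feed; real headsets do this continuously in hardware with latency compensation.

```python
import math

def sample_noise(n=8, freq=440.0, rate=8000.0):
    """Pretend microphone: n samples of a sine-wave noise source."""
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

def anti_sound(samples):
    """Phase-inverted copy of the sensed noise."""
    return [-s for s in samples]

noise = sample_noise()
# The listener hears the sum of the noise and the emitted anti-sound:
residual = [a + b for a, b in zip(noise, anti_sound(noise))]
# residual is (numerically) all zeros: the noise cancels itself out
```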

I can imagine near-term scenarios where entire factory floors are digitally replicated and real-time operations are viewed and managed from great distances. Every component of the operation can be monitored via sensor data. Aberrations, out-of-compliance data and other faults would instantly trigger alerts, notifications and remedies.

In the military arena, acoustic sensors can now pinpoint the location of incoming bullets, rockets and missiles in real time and activate various instantaneous countermeasure technologies. Data means power.

Today's competitive marketplace requires companies to collect more data, analyze more data and utilize more data to improve customer interactions and engagements. Mobile devices are exceptionally well suited to assist in this effort. Apple's iPhone and Apple Watch come with an array of sensors for collecting data about your surroundings:
  • Touch/Multi-Touch screen sensor
  • Force Touch sensor - measures different levels of touch (Apple Watch); determines the difference between a tap and a press
  • Taptic Engine - tactile feedback via gentle vibration (Apple Watch)
  • Audio/Voice sensor
  • GPS sensor
  • Bluetooth sensor (supports iBeacon)
  • WiFi sensor
  • WiFi strength sensor - helps track indoor activities
  • Proximity sensor - deactivates the display and touch sensitivity when the device is brought near the face during a call
  • Ambient light sensor - brightens the display in sunlight and dims it in darker places
  • Magnetometer sensor - measures the strength and/or direction of the magnetic field in the vicinity of the device; runs the digital compass
  • Accelerometer sensor - measures the force of acceleration (using movement and gravity sensing) for step counting, distance, speed of movement, and detecting the angle at which an iPhone is held
  • Apple Watch sensors - measure steps taken, calories burned and pulse rate
  • Gyroscope - 3-axis gyro (combined with the accelerometer, provides 6-axis motion sensing): pitch, roll and yaw
  • Barometer sensor - altitude, elevation gain during workouts, weather conditions
  • Camera with a plethora of sensors and digital features: face detection, noise reduction, optical image stabilization, auto-focus, color sensors, backside illumination sensor, and True Tone flash
  • Fingerprint identity sensor
  • Heart rate sensor (Apple Watch) - uses infrared and visible-light LEDs and photodiodes to detect heart rate
Other sensor add-ons: personal thermal-imaging camera (Flir)
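To make the accelerometer entry above concrete, here is a miniature pedometer sketch of the kind a phone or watch might run: compute the overall acceleration magnitude (so orientation doesn't matter) and count upward crossings of a threshold, one per stride impact. The threshold and sample data are illustrative, not Apple's actual algorithm.

```python
import math

def magnitude(ax, ay, az):
    """Overall acceleration in m/s^2, independent of how the phone is held."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def count_steps(samples, threshold=11.0):
    """Count upward crossings of the threshold -- roughly one per stride impact."""
    steps, above = 0, False
    for ax, ay, az in samples:
        m = magnitude(ax, ay, az)
        if m > threshold and not above:
            steps += 1       # a new impact spike started
            above = True
        elif m <= threshold:
            above = False    # back to the ~1 g baseline
    return steps

# ~1 g baseline (9.8 m/s^2) with three stride impacts:
walk = [(0, 0, 9.8), (0, 0, 12.5), (0, 0, 9.6), (0, 0, 13.1),
        (0, 0, 9.7), (0, 0, 12.2), (0, 0, 9.8)]
count_steps(walk)  # → 3
```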

I attended a defense-related conference and listened to an IT expert from the CIA present on how sensors on smartphones can uniquely identify the walking style and pace of individuals. For example, the intelligence agency may suspect that a person carrying a phone is a bad guy. They can remotely switch on the smartphone's sensors, record the walking style and pace of the person carrying the phone, and match it against their database records.

Sensors help bridge the gap between the physical and digital worlds. They convert the physical world into data. Tactile feedback tools convert the data back into physical experiences – like a Star Trek Transporter.

Mobile apps can also be considered the API (application programming interface) between humans and smartphones. Sensors are the API between the phone and the physical world. For example, a mobile application for recommending local restaurants may start by asking the user what kind of food they prefer. The human queries their stomach for pains and preferences, and then inputs the results into the mobile app by touching the keypad or using their voice. Suddenly a server in an Amazon data center knows your stomach's inputs! That is one powerful sensor and API! Given the vast array of sensors in the human body, incredible things can be done once those signals are converted to data.

Until recently, the data from the natural sensors in the human body was mostly communicated to analytics engines via a human's touch, typing, drawing or voice input. The emergence of wearable sensors and smart devices, however, changes that. Wearable sensors can bypass the human in the middle and wirelessly communicate directly with your applications or healthcare provider.

Sensors and computers are also connected to the non-physical. Applications can react differently based on recognized time inputs. Once time reaches a specified point, an alarm can be activated, sending sound waves to your physical ear. That is converting the non-physical (time) into sound waves that vibrate our eardrums.

The challenge for businesses today is to envision how all of these sensors and available real-time data can be used to improve sales, customer service, product design, marketplace interactions and engagements so there are more profits at the end of the day.

In the book Digital Disruption, James McQuivey writes that for most of history, disruptions (business and marketplace transformations) occurred in a physical world of factories and well-trod distribution networks. The disruptors of tomorrow, however, are likely to come from digital disruptions: sensors, code halos, big data, mobile devices and wearables.

The task and challenge of every IT department is to understand and design a strategy that recognizes the competitive playing fields of tomorrow are among the digits.


************************************************************************

Mobile Apps, Analytics, Code Halos and Mass Personalization

Kevin Benedict moderates this panel of digital experience and mobility experts, including Benjamin Pring, Ted Shelton and Jack C. Crawford, as they review and discuss the findings of Ben Pring's recent study, Putting the Experience in Digital Customer Experience.

Video Link: https://youtu.be/xsPDWReccF4?list=UUGizQCw2Zbs3eTLwp7icoqw

************************************************************************

IoT, Software Robots, Mobile Apps and Network Centric Operations

Articles about the IoT (Internet of Things) have moved from technical journals to our daily newspapers. In this article we will go beyond the simplistic applications covered in the local paper and discuss how the IoT and complementary technologies, including software robots, can add real business value to the rugged outdoor work found in many industries.

In rugged blue-collar environments, vehicles, high-value equipment and other assets can be connected to the IoT to wirelessly report on their status, hours of operation, location, environment, and maintenance and repair needs. This data can alert management when there is a problem or event, automatically create service tickets, or send alerts when an action or decision is required. The IoT can provide "situational awareness" across large geographic areas and thousands of assets at the same time. This capability helps both decision-makers and automated systems (software robots) better understand how to optimize the use of experts, equipment and schedules across different geographic areas.

Today, sensors can be connected to many different pieces of equipment and are capable of bidirectional data exchanges. That means they can both send and receive data. Data sent to them can include commands to perform a task: unlock a door, open a gate, increase or decrease the temperature, reposition a video camera, or even remotely operate equipment (think drones!). This capability is powerful, and we are just scratching the surface of possibilities.
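The receiving side of such a bidirectional link can be sketched as a small command dispatcher: the device reports data upstream and executes commands sent down to it. The command names, message format and device here are all hypothetical illustrations.

```python
class GateController:
    """Toy IoT device that accepts commands from a server."""

    def __init__(self):
        self.gate_open = False
        self.temperature_setpoint = 20.0

    def handle_command(self, command: dict) -> str:
        """Dispatch a command message received over the wireless link."""
        action = command.get("action")
        if action == "open_gate":
            self.gate_open = True
            return "gate opened"
        if action == "set_temperature":
            self.temperature_setpoint = float(command["value"])
            return f"setpoint {self.temperature_setpoint}"
        return "unknown command"  # ignore anything unrecognized

device = GateController()
device.handle_command({"action": "open_gate"})
device.handle_command({"action": "set_temperature", "value": 18.5})
```

In practice the command messages would arrive over a protocol such as MQTT, with authentication, since a device that opens gates on command obviously needs to verify who is commanding it.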

The IoT delivers on a vision of connecting physical and digital items to each other wirelessly through a network. These connections, and the data exchanged, can provide real-time access to information about the physical world in distant and remote locations. This information can be analyzed by humans or software robots and turned into actionable intelligence that can be utilized by automated systems or human decision-makers. Connected IoT devices integrated into business systems can lead to many innovations and gains in efficiency and productivity that were never before possible.

A few of the key markets for IoT are:
  • Utilities/Smart grids
  • Defense
  • Fleet management/Automotive systems
  • Field services management
  • Rental equipment
  • Heavy equipment monitoring (think tractors, bulldozers, cranes, etc.)
  • Plant maintenance
  • Facility management
  • Connected homes/Home Energy Management Systems (HEMS)
  • Healthcare - fitness, remote patient and health monitoring
  • Medical equipment monitoring
  • Vending machines
  • ATMs
  • POS systems
  • Remote asset management monitoring
  • Security systems
  • Consumer electronics (eReaders, Wireless Printers, Appliances, etc.) 
  • etc.
In a world filled with data from mobile users, databases, websites and the IoT, the big question is: what can be done with all of this data? This is where real-time analytics are required - analytic solutions with the capacity and capability to analyze large amounts of incoming data in real time. The results of their analysis can be used by humans and/or software robots to optimize productivity and efficiency. Many of today's most advanced workforce optimization and scheduling solutions use software robots that can instantly react to real-time data and optimize thousands of schedules and assignments in seconds (see ClickSoftware).
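A minimal sketch of real-time analytics is a rolling average over incoming readings with an alert when a new value deviates sharply. Production systems do this at vastly larger scale and sophistication, but the shape of the problem is the same; the window size, tolerance and readings below are invented.

```python
from collections import deque

class StreamAnalyzer:
    """Rolling-average anomaly check over a stream of readings."""

    def __init__(self, window=5, tolerance=2.0):
        self.recent = deque(maxlen=window)
        self.tolerance = tolerance

    def ingest(self, value):
        """Return an alert string if the value deviates sharply, else None."""
        if len(self.recent) == self.recent.maxlen:
            avg = sum(self.recent) / len(self.recent)
            if abs(value - avg) > self.tolerance:
                # anomalous readings are reported but kept out of the baseline
                return f"ALERT: {value:g} deviates from rolling average {avg:g}"
        self.recent.append(value)
        return None

analyzer = StreamAnalyzer()
alerts = []
for reading in [20, 20, 21, 20, 20, 20, 35, 20]:
    alert = analyzer.ingest(reading)
    if alert:
        alerts.append(alert)
# alerts → ['ALERT: 35 deviates from rolling average 20.2']
```

The alert here is exactly the kind of real-time signal a scheduling robot or a human dispatcher would react to.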

What are software robots?  According to a new study by my colleague, Rob Brown, at the Center for the Future of Work, titled The Robot and I, humans are working smarter with sophisticated software (robots) to automate business tasks that help humans attain new levels of process efficiency, such as improved operational costs, speed, accuracy and throughput volume.  In short, software robots are digital assistants and force-multipliers for humans.

Data and Real-Time Decision Making

Enterprise mobility apps offer significant value on their own, but when integrated into a network with many other applications, sensor-equipped objects, software robots and other data collection technologies, the value of this "network of applications" is multiplied. The challenge, as identified earlier, is understanding how to use this plethora of real-time data for real-time decision-making and operational improvements. Innovations within many modern military organizations offer lessons for us in the commercial space.

USAF Colonel John Boyd
USAF Colonel John Boyd is credited with the concept of the OODA loop. The OODA loop (Observe, Orient, Decide and Act) is a concept originally applied to combat operations: a process of analyzing real-time data and rapidly making decisions that enable you to out-maneuver an opponent.

According to Boyd, decision-making occurs in a recurring cycle of observe, orient, decide and act.  An entity (whether an individual, organization or software robot) that can complete a decision-making cycle quicker - observing and reacting to unfolding events more rapidly than an opponent, can thereby "get inside" the opponent's decision cycle and gain the advantage.

In the business world, the OODA loop is an emerging concept for making better decisions faster and managing more effectively. The ability to observe and react to unfolding events more rapidly than the competition requires data collection, communication, analytics and solutions that can use the data to optimize operations. Some of the enterprise solutions that can exploit IoT data are:
  • Field services solutions
  • Fleet management systems
  • Supply chain management systems
  • Optimized workforce scheduling solutions
  • Solutions using predictive analytics and machine learning
  • Enterprise asset management solutions
  • Plant maintenance systems
  • Facility management solutions
  • CRM solutions
  • Healthcare management systems
  • etc.
Many of these solutions are already utilizing software robots to quickly accomplish complex tasks and to analyze and act on incoming data.

Let us walk through a field service scenario together. Mobile apps and sensors (human and machine) supply the data that enables a field services manager or software robot to observe. Business analytics systems can help managers or software robots orient: understand what the data means and how it impacts the mission, project or task. Next, the manager or software robot must decide what actions to take, and then act.

OODA Loop
The “loop” in OODA loop refers to the fact that this is a continual process. Each time you complete a cycle, you again observe, orient, decide and act based on the results of the prior cycle. The speed at which you cycle through the loop can be greatly enhanced by supporting software robots.

Those involved in agile development projects will recognize these cycles. If the results are positive, you continue down that path and improve it. If the results are negative, you quickly adjust. It is a fast-moving process of trial, error and adjustment until you get the results you want.
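The OODA cycle maps naturally onto a control loop in code. The sketch below is a toy illustration: each pass observes fresh data, orients it against a goal, decides on a response, and acts, so the next pass observes the results of that action. Everything here (the target, the proportional correction) is a placeholder for real sensors, analytics and actuators.

```python
def run_ooda(observe, orient, decide, act, cycles=3):
    """Run a fixed number of OODA cycles; each one feeds the next."""
    history = []
    for _ in range(cycles):
        data = observe()             # Observe: collect real-time data
        situation = orient(data)     # Orient: what does the data mean?
        action = decide(situation)   # Decide: choose a response
        act(action)                  # Act: change the world, then re-observe
        history.append((data, situation, action))
    return history

# Toy example: nudge a reading toward a target of 15 on each cycle.
state = {"reading": 10.0}
log = run_ooda(
    observe=lambda: state["reading"],
    orient=lambda r: r - 15.0,                           # error vs. the target
    decide=lambda err: -0.5 * err,                       # proportional correction
    act=lambda step: state.update(reading=state["reading"] + step),
)
# state["reading"] moves toward 15: 10 → 12.5 → 13.75 → 14.375
```

A faster loop (smaller cycle time) converges on the goal sooner, which is precisely Boyd's point about getting inside an opponent's decision cycle.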

The OODA loop is particularly useful in unpredictable environments. In these working environments, decision-making is often very difficult, and without the appropriate training or automated systems (software robots), indecision, inaction, inefficiency or even chaos may occur. The OODA loop is a decision-making process well suited to helping people or software robots make decisions and act in situations where there is no identified plan or obvious right answer.

The military has effectively implemented the OODA loop decision-making process in many different areas, including air combat, tank warfare, maneuver warfare strategies, and daily Special Forces operations. Today, predictive analytics and software robots use OODA loops with machine learning to cycle through analysis, decision-making and action even faster. In fact, many of today's most advanced jet fighters require ultra-fast software robots in order to maneuver and stay airborne.

In a world where nearly 40 percent of the workforce is mobile, companies must learn and implement these concepts in order to successfully manage mobile and remote operations and services. Successfully implementing and integrating the OODA loop, software robots and network-centric operational concepts into field services operations requires the following:
  1. Data collection systems, sensors (IoT)
  2. Mobile apps 
  3. Real-time mobile communications
  4. GPS tracking - real time knowledge of the location of your mobile workforce, assets and inventories
  5. Real time knowledge of the capabilities and expertise of your mobile workforce
  6. Real time status and progress updates of the tasks, work assignments, projects and the schedules of the mobile workforce
  7. Real time knowledge of the location of all materials, equipment, tools and other assets required to complete specific tasks
  8. Field service management system that assigns, schedules and dispatches specific assignments to specific members of your mobile workforce (often utilizing software robots)
  9. Real-time business analytics 
  10. OODA Loop or similar rapid decision-making processes
All of the items listed above help provide the real-time visibility into your field operations that is required in a networked field services organization practicing OODA loop strategies and processes.

One remaining challenge with the systems listed above, however, is that humans quickly become overwhelmed by large volumes of data. Complexity can become an inhibitor to the practice of OODA. It is not enough to have real-time visibility into massive volumes of data; one must be able to orient, or understand what the data means and how it will impact the mission. That is where automated systems and software robots solve a real problem. Let's consider the following scenario in a networked field service environment:
  1. A high-value bulldozer with an engine sensor wirelessly notifies a service provider that maintenance is needed.
  2. The information is instantly integrated into the service provider's work order management system.
  3. The business intelligence feature analyzes the scheduling requirements related to the maintenance code that was received.
  4. Automated processes (software robots) quickly search for maintenance updates or alerts from the bulldozer's manufacturer that might be related to the received code.
  5. Software robots search for the location of the nearest available and qualified diesel mechanic.
  6. Software robots review all qualified mechanics' schedules and compare them for the purpose of optimizing all schedules.
  7. Software robots search for the nearest location with an inventory of parts for that particular make and model.
  8. Software robots look for the nearest inventory of tools and repair equipment that may be necessary to complete the job.
  9. Software robots search for and report on the customer's current account status and any relevant warranty or service contract details.
  10. All of this data is unified and wirelessly sent to the service technician's smartphone.
All of the above steps can be performed in seconds, with the right data, analytics, processes, solutions, software robots and strategies, but only when accurate and real-time data is available.
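As an illustration, the core matching logic in steps 4 through 9 could be sketched in a few lines of Python.  Everything below is invented for the example - the names, alert codes and distances - and a production field service management system would also query parts, tool and warranty systems before dispatching.

```python
from dataclasses import dataclass

# Hypothetical sketch of the software-robot dispatch steps above.
# All names, codes and distances are invented for illustration.

@dataclass
class Mechanic:
    name: str
    qualified: bool      # certified for this equipment type
    available: bool      # free on their schedule right now
    distance_km: float   # distance from the jobsite

def dispatch(alert_code, mechanics):
    """Match a sensor alert to the nearest available, qualified
    mechanic and assemble a minimal work order."""
    candidates = [m for m in mechanics if m.qualified and m.available]
    if not candidates:
        return {"status": "no qualified mechanic available"}
    nearest = min(candidates, key=lambda m: m.distance_km)
    # A real system would bundle parts inventory, tool locations and
    # warranty status into this same work order before sending it
    # to the technician's smartphone.
    return {"status": "dispatched", "alert": alert_code,
            "mechanic": nearest.name}

mechanics = [
    Mechanic("Ana", qualified=True,  available=True,  distance_km=12.5),
    Mechanic("Ben", qualified=True,  available=False, distance_km=4.2),
    Mechanic("Cho", qualified=False, available=True,  distance_km=1.1),
]
print(dispatch("ENG-0042", mechanics))  # Ana: nearest qualified AND available
```

The point of the sketch is the filtering order: availability and qualification are hard constraints checked first, and distance is only used to rank the remaining candidates.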

In summary, the Network-Centric Operations concept seeks to translate an information advantage, enabled in part by mobile, IoT, analytics, management solutions and software robotics, into a competitive advantage through the robust networking of well-informed, geographically dispersed people and assets.  This networked organization, using the OODA Loop decision-making cycle, has the tools necessary to make good and quick decisions in chaotic and unpredictable environments.




Enterprise Mobility - Adventures and Lessons from the Mobile World Congress with Jon Reed

Diginomica's Jon Reed interviews Cognizant's Senior Analyst for Digital Transformation and Mobility, Kevin Benedict, on what he learned this year at the annual Mobile World Congress in Barcelona, Spain.  This year, 93,000 people came together at the event to learn about and review the newest mobile, wireless and connected smart technologies.  Much has changed in the past 12 months, and this interview covers many of these trends.

Video Link: https://youtu.be/8jwkhZgck1U


Networked Field Services and Real-Time Decision Making

Investment in space travel has provided many direct and indirect benefits to society.  For example, weather forecasting technology, solar energy, scratch-resistant lenses, water purification systems, enriched baby food and air quality monitoring have all advanced because of investments in space travel research.  Likewise, the military has made huge investments in Network-Centric Warfare technologies and mobile data collection strategies, and these are now benefiting and revolutionizing the way commercial field services organizations operate.

One of the most important capabilities that mobile solutions offer organizations today is better visibility, in near real time, into the activities and events taking place in the field – let’s call it situational awareness.  Historically, it has been difficult to ensure that quality and service standards and processes are followed on remote jobsites and in mobile environments.  The lack of real-time visibility often means critical operational decisions and optimized scheduling choices are delayed, which results in the inefficient utilization of resources and assets.

Better communication about, and visibility into, the work completed or not completed on remote jobsites can ensure that proper policies and operational and safety processes are followed, and that assistance is provided when needed.  Receiving, processing and sharing sensor (M2M) data from equipment, digital images, streaming video and real-time mobile app updates with management and other process experts can often resolve challenging issues quickly and efficiently.

Today, mobile applications support mobile data collection, real-time database queries, alerts, mobile business processes, work order dispatch, location tracking, optimized scheduling, and customer updates and alerts in most areas of the world.  Situational awareness is a new capability for most organizations.  It enables managers and experts from anywhere in the world to be virtually “digitally present” on remote jobsites.  Being “digitally present” is accomplished today using a variety of tools available on most smartphones. These tools include:
  • Phone
  • Photos
  • Video
  • Voice/Audio
  • SMS
  • Email
  • Augmented reality
  • Bluetooth add-on equipment
  • GPS/Maps/Tracking
  • Custom mobile apps
Most organizations have yet to understand and exploit these capabilities to maximize efficiency and optimize returns. Each of these tools can and does play an important role in a networked field services operation.

The New Networked Organization

The most advanced militaries are developing and implementing strategies based on the concept of Network-Centric Warfare.  These strategies, methodologies and concepts have direct relevance to commercial enterprises and field services organizations today under the name Network-Centric Operations or Networked Field Services.

The military uses rugged handhelds, radios, laptop computers, satellites, radio scanners, drones (UAVs), human resources, video surveillance, aerial surveillance, infrared cameras, remote sensors of all kinds and many other embedded mobile devices to create a web or grid of data collection points that are all wirelessly networked together.

Collected data is securely and wirelessly sent to a central server, where it forms a real-time, unified view of operations that can be used for analysis, forecasting, resource allocation, planning and real-time decision making.  This networked approach enables users to see where their assets are located, where they are needed and how best to manage them at all times to accomplish the mission successfully and efficiently.

Network-Centric Warfare, known in commercial environments as Network-Centric Operations, is a relatively new military doctrine.  It seeks to translate an information advantage (real-time data collected in the field) into a competitive advantage by using that data for real-time decision-making.   This networking, combined with real-time data, analytics, artificial intelligence (AI) and machine learning, enables organizations to behave and respond in ways never before possible.  This strategy is based on the following four beliefs:
  1. A robustly networked workforce improves data sharing.
  2. Data sharing enhances the quality of information and supports situational awareness.
  3. Shared situational awareness enables collaboration, and management and resource agility.
  4. Points 1-3 support an optimized and efficient workforce.
To optimize the performance of a military operation or a field services organization, it is critical to know, in real time, the location of all resources, the status of each job, the assets and equipment needed, and the time each job will require. When effectively coordinated and managed, human resources, equipment, assets and mobile inventories can be shared across multiple projects, and the right experts with the right levels of experience can be assigned to the right projects at the right time.  The bottom line is that a leaner, more efficient organization can be put in the field, one that accomplishes more work with fewer resources and generates a higher return on investment.
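A toy sketch of this kind of coordination, using invented job and crew data, might look like the following.  A real scheduler would optimize over travel time, skills, schedules and inventories simultaneously; this greedy version only matches each job to the nearest free crew with the required skill.

```python
# Hypothetical sketch: greedily assign field jobs to the nearest
# free, qualified crew.  All job and crew data is invented.

jobs = [  # (job_id, required_skill, location along a route in km)
    ("J1", "electrical", 10.0),
    ("J2", "plumbing",   40.0),
]
crews = [  # [crew_id, skill, location_km, free?]
    ["C1", "electrical", 12.0, True],
    ["C2", "electrical",  2.0, True],
    ["C3", "plumbing",   35.0, True],
]

assignments = {}
for job_id, skill, loc in jobs:
    # Hard constraints first: right skill and currently free.
    options = [c for c in crews if c[1] == skill and c[3]]
    if not options:
        continue  # job stays unassigned until a crew frees up
    best = min(options, key=lambda c: abs(c[2] - loc))
    best[3] = False                 # crew is now committed
    assignments[job_id] = best[0]

print(assignments)  # {'J1': 'C1', 'J2': 'C3'}
```

Even this toy version shows why real-time location and status data matter: without the `location_km` and `free?` fields being current, the assignment is a guess rather than an optimization.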


Real-Time Mobile Infrastructure and Digital Transformation Discussion with Expert Ved Sen

In this Google+ Hangout On Air, I have the privilege of discussing the findings of my recent report, Real-Time Mobile Infrastructure, with UK-based mobile and digital transformation expert Ved Sen. We discuss the challenges identified and possible solutions.  Enjoy!

Real-Time Mobile Infrastructure Report, Introduction
Real-Time Mobile Infrastructure Report, Part 1
Real-Time Mobile Infrastructure Report, Part 2
Real-Time Mobile Infrastructure Report, Part 3

Video Link: https://youtu.be/IMYHORGxMYY?list=UUGizQCw2Zbs3eTLwp7icoqw



Real-Time Mobile Infrastructure Report, Part 3

For this report, titled Real-Time Mobile Infrastructure, I asked 80 IT and business professionals involved in enterprise and consumer mobility to answer a series of questions.  The results are shared here in the following article series.  This is Part 3 in the series.

Real-Time Mobile Infrastructure Report, An Introduction
Real-Time Mobile Infrastructure Report, Part 1
Real-Time Mobile Infrastructure Report, Part 2

Question: Do you (or your clients) have IT systems that are too slow or incapable of supporting real-time mobile application requirements?

Eighty-four percent report IT systems in their inventory that are completely incapable of supporting real-time mobility.   It is important for enterprises to take inventory of their IT systems, to thoroughly understand which systems can support real-time mobility and which cannot, and then to analyze the cost of non-support.  This inventory must be reviewed alongside forecasted technology and market trends and the actions of competitors.  The pace of change, in many cases, is faster than current planning and budget cycles, and without bold action, the ability to compete successfully in the future diminishes.

Question: Will your (or your clients’) IT environment and back office systems prevent you from delivering an optimized mobile application experience?

Optimized mobile applications are viewed as key to the future success of businesses, yet 43% report IT environments and systems that will prevent them from delivering an optimized user experience. This data should be viewed with the seriousness it deserves and should serve as a call to action.

Recent studies find that mobile application users are impatient and willing to wait only 3-5 seconds for a mobile application to load before abandoning it.  Many never return.  This is significant as a growing percentage of commerce moves to mobile applications.

Question: How important is the speed of a mobile application to the overall user experience?

All survey participants identified mobile app speed as “Important,” and 80% said it was “Very Important.”  Mobile applications, by their very nature, are often in the hands of a moving user.  Location and time are key data points used to establish context in many mobile apps (e.g., this morning’s weather in Boston).  No matter how great a mobile application’s design, delays in retrieving data from or interacting with back-office business and IT systems equate to a negative user experience and must be resolved.

Benchmarks for acceptable mobile application response times should be established and used to detect troublesome systems early.  Some speed issues may be related to app design or Internet connectivity, but often the problems lie in the back-office IT environment and require extensive effort to alleviate.
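A simple benchmark check of this kind might flag any back-office endpoint whose 95th-percentile response time exceeds a mobile-friendly budget.  The endpoint names, timings and 3-second budget below are invented for illustration; the survey itself does not prescribe specific thresholds.

```python
import statistics

# Hypothetical benchmark sketch: flag endpoints whose 95th-percentile
# response time exceeds a mobile-friendly budget.  All numbers invented.

BUDGET_MS = 3000  # users abandon mobile apps after roughly 3-5 seconds

samples = {  # endpoint -> observed response times in milliseconds
    "inventory_lookup": [220, 310, 280, 450, 390],
    "legacy_erp_query": [2800, 4100, 3600, 5200, 2900],
}

def p95(values):
    """95th percentile via the 'inclusive' quantile method."""
    return statistics.quantiles(values, n=20, method="inclusive")[-1]

slow = [name for name, times in samples.items() if p95(times) > BUDGET_MS]
print(slow)  # ['legacy_erp_query']
```

Using a high percentile rather than the average matters here: an endpoint can have an acceptable mean while still delivering abandonment-inducing delays to a meaningful share of users.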



Interviews with Kevin Benedict