We hear a lot about the Internet of Things, but the million-dollar question is: how does anybody actually make any money?
- The cloud-based vendors will add IoT support in order to retain or grow their customer base within their MBaaS (Mobile Backend as a Service), MADP (Mobile Application Development Platform) or API Gateway solutions.
- The developers will try to cash in on wearables as a new platform.
- Random new wearable devices will appear from disparate vendors.
I created a hardware demo of a consumer product (a squeezable mayonnaise bottle) that could detect when and where it was shaken, and then send that information to a marketing micro-website. You can watch the video here (https://www.youtube.com/watch?v=2yaE6-KuHgs), add your thoughts on how I accomplished this feat, and let me know if you want to help me fund one on Kickstarter. I was also considering that the bottle could detect when it was nearly empty and automatically order a replacement.
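The bottle demo boils down to two steps: spot a shake in the accelerometer stream, then report it with a location. Here is a minimal Python sketch of that idea; the threshold value, device ID and payload fields are all hypothetical, not the actual firmware:

```python
import json

SHAKE_THRESHOLD_G = 2.5  # hypothetical cut-off in g; would be tuned per device


def detect_shake(samples, threshold=SHAKE_THRESHOLD_G):
    """Return True if any accelerometer sample exceeds the threshold.

    samples: iterable of (x, y, z) readings in g. A real detector would
    also debounce and ignore one-off spikes; this is the bare idea.
    """
    for x, y, z in samples:
        magnitude = (x * x + y * y + z * z) ** 0.5
        if magnitude > threshold:
            return True
    return False


def build_event(device_id, lat, lon):
    """Build the JSON payload the bottle might POST to the micro-website."""
    return json.dumps({"device": device_id, "event": "shake",
                       "lat": lat, "lon": lon})
```

Resting gravity reads as roughly 1 g, so only a genuinely vigorous shake clears the 2.5 g bar and triggers an event.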
A few days later the Amazon Dash Button was announced (https://www.youtube.com/watch?v=NMacTuHPWFI), letting people press a remote button to order something directly. It did not stop there: a few weeks later Google announced Project Soli (https://www.youtube.com/watch?v=0QNiZfSsPc0), effectively a small radar sensor that can detect tiny finger movements and map them onto user interactions. I was so excited that I ordered a Flic (https://flic.io/), a remote button that you can program to do just about anything. The possibilities seem endless, and the sensors are only going to get smaller. Indeed, while the current trend is for phones and even watches to get bigger, it is left to the sensors to shrink and integrate seamlessly.
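"Program it to do just about anything" usually means mapping the handful of press types a button reports to arbitrary actions. This is a hypothetical dispatcher in that spirit, not the real Flic SDK; the press-type names and the `order_item` handler are my own invention:

```python
# Registry mapping press types ("click", "double_click", "hold") to handlers.
ACTIONS = {}


def on(press_type):
    """Decorator registering a handler for one press type."""
    def register(fn):
        ACTIONS[press_type] = fn
        return fn
    return register


def handle(press_type):
    """Dispatch an incoming button event; unknown types are ignored."""
    fn = ACTIONS.get(press_type)
    return fn() if fn else None


@on("click")
def order_item():
    # Stand-in for the real action, e.g. calling a retailer's order API.
    return "order placed"
```

The appeal of buttons like Flic is exactly this indirection: the hardware stays dumb, and all the meaning lives in a table you can rewrite at will.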
I would therefore predict that the real money is in these small integrated sensors, which can offer us digital experiences without us touching a PC, phone, tablet or watch. Interestingly, this fits the Post App World vision that Apple and Google are allegedly eyeing up (http://www.wired.com/2015/06/apple-google-ecosystem/). In the Post App World it is the API that rules supreme, offering us frictionless services integrated into our consumer products. This vision of hardware sensors offering user interaction without a traditional screen is intriguing, and fits neatly with the multiple touch points that article describes.
After the sensors, the real money surely lies with what the sensors produce: data. Suddenly Big Data just got a whole lot more interesting. There will be reams and reams of data from hardware sensors everywhere, all crying out for Big Data processing solutions. But what do we do with the data? This is where the algorithms come in: highly intelligent algorithms that can analyse consumer data and use predictive analytics to offer us services before we even know that we need them. And then what? The algorithms start to use artificial intelligence, and we end up with automated agents that operate on data models using M2M (machine-to-machine) communication, freely trading data with each other, in order to analyse us and then directly offer us new targeted services. Cold calling has already replaced humans with static voice recordings, but how long before that becomes dynamic? Imagine an autonomous agent somewhere processing enough of your data to work out that you need double glazing, then dynamically recording a sales pitch and sending it to you.
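To make "services before we even know that we need them" concrete, here is a toy sketch of the simplest possible predictive model: estimate when a consumer will next re-order a product from their past ordering rhythm. Real pipelines would be far richer; the function name and the day-number inputs are purely illustrative:

```python
from statistics import mean


def predict_next_order(purchase_days):
    """Naive predictive analytics over sensor-reported purchase events.

    purchase_days: sorted day numbers on which the product was re-ordered.
    Returns the estimated day of the next order, using the mean
    inter-purchase interval, or None if there is too little history.
    """
    if len(purchase_days) < 2:
        return None
    intervals = [later - earlier
                 for earlier, later in zip(purchase_days, purchase_days[1:])]
    return purchase_days[-1] + mean(intervals)
```

A bottle that reports each "nearly empty" event feeds exactly this kind of series, and the service can then make its offer a day or two before the predicted date.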
My predictions in a nutshell: sensors integrated into everyday consumer products, with the resulting data driving the Big Data market some 12 months later. And what of Virtual Reality? The more I think about it, the more I see an augmented digital reality powered by sensors where the ‘screen’ is our lives. I think the mistake that AR vendors made in the past was to think that we actually needed a screen…