Thursday, August 27, 2015

How Do Mobile Experts Use Mobility and What Does it Mean for Retailers?


All 108 mobile experts in our recent survey purchase products online.  Ninety percent have purchased products and services using mobile devices, but only 13% use mobile devices exclusively for purchasing. Forty-five percent typically use only desktops/laptops, and 40% use both equally.  These are some of the findings from the survey we conducted in May 2015.

How often do mobile experts purchase products and services using their mobile devices?  Only 1% purchase products using mobile devices daily, 30% weekly, 43% monthly and 20% once every three months.

Wow!  I am a one-percenter!  I use my Starbucks app and Apple Pay, often multiple times a day.

In another recent survey I was involved in, Cognizant's 2015 Shopper's Survey of 5,000 people in North America, we found 73% still prefer using desktops/laptops for online purchases. This does not mean mobile devices were not used in the path-to-purchase journey; rather, desktops/laptops are often preferred for payments.

Our findings also reveal a typical path-to-purchase journey involves multiple platforms and devices. Often smartphones are used for quick searches and discovery, tablets for in-depth, immersive product research, and desktops/laptops for purchases.  People even change their device preferences depending on the time of day.  Mobile devices are popular in the morning, at lunch and in the late afternoon.  Desktops and laptops are popular during business hours, while tablets are popular in the early to late evenings.  This points to the popularity of living-room and in-bed shopping.  When asked where they are located when making online purchases, they answered:
  • 46% in the living room
  • 36% at work
  • 29% in the bedroom
  • 24% in the TV room
  • 20% in coffee shops or restaurants
The use of multiple devices and platforms at different times of the day makes it challenging for online retailers and marketers to track consumer interests.  When asked the time of day when they make most of their online purchases, mobile experts listed the times in the following order by popularity:
  1. Early morning
  2. Mid-morning/Early afternoon
  3. Noon
  4. Late night
Our findings reveal that the retail strategies of yesteryear are insufficient for future success.  Today those involved in mobile commerce have many new challenges.  Mobile users follow different path-to-purchase journeys across multiple devices, times and locations.  These journeys also differ across demographics, product categories and price points. Context is mandatory today to understand how to personalize a digital experience.  Recommending places to eat in San Francisco based on my past preferences, when I am in Boston, isn't useful.

Collecting greater quantities of data, with users' permission, in order to provide a contextually relevant and personalized experience is a hurdle retailers must overcome.  I have some thoughts.  Stay tuned for my new report, "Cutting Through Chaos in the Age of Mobile Me."

************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
The Center for the Future of Work, Cognizant
View my profile on LinkedIn
Read more at Future of Work
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin's YouTube Channel
Join the Linkedin Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

Monday, August 24, 2015

Mobile Commerce Strategies - Contextually Relevant Opportunities, Moments and Environments

In the early 1990s major retailers began investing in data analytics to better manage their stores and warehouses by analyzing individual store sales.  This insight gave them a perspective on the needs of the local market.

Retailers soon advanced in their use of analytics and added external factors for consideration and planning, like demographics, weather, geography, local events and competitors' promotions and campaigns.

When customer loyalty programs tied to POS (point of sale) systems were implemented, retailers were able to start understanding individual customers through their transaction histories - at least what individuals bought from their stores.  The limitation, however, was that this data was known and analyzed only after the sale. There were no mechanisms in place to alert retailers to help customers during their path-to-purchase journeys.

Mobile computing technologies and wireless internet access introduced the age of mobile commerce. Mobile commerce gives retailers unprecedented capabilities to collect and analyze data from a wide array of sensors embedded in mobile devices.  The challenge then shifted from how to collect data to how to get the user's permission and approval to collect and use data.  This is not always easy.

When asked in surveys, customers voice opposition to retailers collecting data on them.  This, however, does not align with other survey results that show customers value a personalized digital experience.  You cannot personalize a digital experience without data.  This dichotomy must be recognized by retailers and incorporated into their customer education plans and strategies.

Personalized digital experiences show respect and professionalism to customers.  Treating individuals as if they belong to one homogeneous market is a recipe for failure.  It reflects an attitude that getting to know you is not worth the time or investment.  As more commerce moves from face-to-face interactions to mobile commerce, service and support can easily be lost in the bits and bytes. Retailers that try to offer mobile commerce without relevant personalization are short-sighted and will ultimately fail.

Winners in mobile commerce will implement Code Halos (the data available about every person, object and organization) business strategies to find business meaning in data and to provide beautiful customer experiences.  They will also seek to triangulate three sources of data:
  1. Digital data from online and mobile activities
  2. Physical data from sensors and the IoT (internet of things, wearables, telematics, etc.)
  3. Customer loyalty and rewards programs data
Mobile commerce winners will seek contextually relevant opportunities, moments and environments (CROME) that can trigger personalized content at exactly the right time.  Alerting me to available food options in a city I left yesterday is not useful.  I need food options in the city I am in now. Context is time- and location-sensitive.
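The CROME idea can be sketched as a simple relevance check. Everything below - the function name, the offer fields, the thresholds - is an illustrative assumption, not taken from any real retail platform:

```javascript
// Hypothetical CROME-style check: an offer only fires when the shopper's
// current city and local time fall inside the offer's context window.
function isContextuallyRelevant(offer, context) {
  const sameCity = offer.city === context.city;                  // location-sensitive
  const inWindow = context.hour >= offer.startHour &&
                   context.hour < offer.endHour;                 // time-sensitive
  return sameCity && inWindow;
}

const lunchOffer = { city: "Boston", startHour: 11, endHour: 14, text: "Lunch nearby" };

// A shopper who flew from San Francisco to Boston this morning:
isContextuallyRelevant(lunchOffer, { city: "Boston", hour: 12 });        // true - relevant now
isContextuallyRelevant(lunchOffer, { city: "San Francisco", hour: 12 }); // false - stale location
```

A real system would of course combine many more signals, but the point stands: the same offer flips from useful to useless the moment either the time or the place changes.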

Tomorrow's competitive field in mobile commerce will be personalization, context and real-time operational tempos.  Can your legacy IT environment be upgraded to compete in the world of tomorrow?

Stay tuned for a major report I am writing on this subject to be published soon.


Wednesday, August 19, 2015

What Has Happened to Enterprise Mobility - An Update

My colleague and mobile expert Peter Rogers shares his insights into the fast-changing world of enterprise mobility - and his fondness for acronyms - with us today.  Please add your thoughts and comments!  Enjoy!

*****
Peter Rogers
Cognizant Mobile Expert
Four years ago, if you asked me for recommendations on the best solution for a mobile application on multiple platforms, I would doubtless have told you to use a MEAP (Mobile Enterprise Application Platform) or an MCAP (Mobile Consumer Application Platform), depending on the business case. Three years ago, it would have been a MADP (Mobile Application Development Platform) solution because the two terms merged. Two years ago, I probably would have told you that the merging of the terms was a bad idea and to use an MCAP solution with a BaaS (Backend as a Service). Last year, I would have told you to use an MCAP solution, pure native or an HTML5-based solution combined with a Mobile App Platform as a Service / Mobile Platform as a Service that combined API Gateway functionality. You can see that things are starting to get a bit nebulous in the terminology stakes, and the definitions, boundaries and business cases are looking a bit wobbly. So the burning question is: what would I tell you this year?

In the last six months, around five major events have taken place that define the new enterprise mobile space.
  1. The first is that the introduction of consumer-grade wearable devices has driven the IoT market and given developers opportunities to look into new non-mobile propositions such as watch apps, augmented reality glasses apps, fitness bands and virtual reality apps. This has not proved to be a massive disruption in the marketplace yet, but it has provided a non-Mobile-First mindset. It shows us that the whole term "Mobile-First" is too rooted in a single technology. We should really be saying "Client-First" or "Device-First". This means that we design a solution around device-specific client-rendering capabilities as opposed to just rendering everything on the server. I would argue that the current MVW (Model View Whatever) frameworks in the web space (Angular, Meteor, React, Polymer, etc.) are probably closer to the money - Angular in particular, due to its focus on testing, and Angular 2 on web components. We are talking about client-side templates that offer device-specific rendering and pull the data we need from secured, disparate APIs.
  2. The second major thing that has happened is a move away from the term "Mobile-First" to the more general term "Digital". Traditionally reserved for marketing-focused activities, the term Digital now seems to encompass Mobile, Web and IoT. Companies are now grouping their teams together to form Digital Centres, and the results are certainly impressive.
  3. The third major thing is the instability of mobile platform vendors. I was quite shocked this week to learn that one of my favourite MBaaS (Mobile Backend as a Service) vendors had closed for business. I look around at recent acquisitions and the struggling fortunes of some of the MADP/MBaaS vendors, and I have to pose the question of what happens if they go under or their new owner completely changes the technology direction. Combined with the continued rise of HTML5 and the imminent arrival of ECMAScript 6 / 2015, you seriously have to weigh up your development options these days. Most modern web apps can accomplish the same goal as a mobile app without a download - a prime example being the Google Maps website.
  4. The fourth is the rise of Test Driven Development. The days of "Build an App in 24 hours" are definitely gone. Most companies with a few years' experience of mobile development are now asking for: test driven development (or behaviour driven development); unit tests with 80-90% code coverage; some form of automation for testing and deployment; and at least consideration of moving towards continuous integration / continuous delivery / continuous deployment. Mobile, Web and IoT applications need to be supported by comprehensive testing, diagnostics, debugging and other similar tool sets.
  5. The fifth is Apple continuing to move the goal posts to make cross-platform development effectively more difficult. WKWebView broke Cordova last year, and it will be interesting to see how the new iOS 9 Safari View Controller works in practice. iOS 9 introduces App Slicing, Bitcode Submissions and on-demand resource loading. It also adds a great deal of demand for on-device system testing with its new multi-tasking capabilities on tablets.
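The "Client-First" / "Device-First" point in the first item can be illustrated with a minimal sketch: the same product data, pulled once from an API, is rendered by a device-specific client-side template. The device types and product fields are made up for illustration:

```javascript
// Hypothetical "Client-First" rendering: one data source, several
// device-specific client-side templates, instead of one server-rendered
// page for every device.
const templates = {
  watch:   (p) => `${p.name} ${p.price}`,                                // glanceable
  phone:   (p) => `${p.name} - ${p.price}\n${p.summary}`,                // compact
  desktop: (p) => `${p.name} - ${p.price}\n${p.summary}\n${p.details}`,  // full detail
};

function render(deviceType, product) {
  const template = templates[deviceType] || templates.phone; // sensible fallback
  return template(product);
}

const product = { name: "Headphones", price: "$99", summary: "Wireless", details: "30h battery" };
render("watch", product); // "Headphones $99"
```

The template choice lives on the client, next to the device's actual rendering capabilities, which is exactly the inversion the "Device-First" framing argues for.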
Taking all this into account, I am going to make some bold predictions for 2016 and get ahead of the curve:
  1. Mobile-First is dead. The term Digital shall refer to Mobile, Web and IoT henceforth.
  2. Mobile-First investments will dry up. Instead we will see more generic Digital Integration Platforms that offer Client-First / Device-First solutions across the proliferation of devices from watches to virtual reality. These shall include more generic PaaS (Platform as a Service) solutions (Amazon, Heroku, etc.) and API Gateways with device specific adaptors. Flexible integration with diverse devices and virtualisation of portable server-side environments will be the key, as opposed to targeted device specific solutions.
  3. Developers will increasingly turn to pure native or pure HTML5 solutions due to increased demands of UX and the diversity of mobile operating systems. iOS 9 will prove a major turning point in this direction.
  4. Testing and traditional software practices will finally become mandatory in the mobile, web and IoT spaces. 'Write an App in 24 hours' and 'Cowboy Coding' will not fly anymore in this day and age, and digital work will move to more professional outfits as a result.
  5. Re-usable components shall become exceptionally powerful with the arrival of Web Components and many companies juggling multiple mobile projects at once (as opposed to just one, two or three projects last year). The traditional arguments around expiration and failed maintenance of re-usable components shall be expunged using Flow Based Programming architectures (more on that one in a bit).
So what was that "Flow Based Programming" architecture comment that I just threw out there? A man by the name of Paul Morrison decided it would be a good idea to invent a new way to construct computer programs. It's called FBP (flow-based programming).

FBP defines applications as networks of 'black box' processes, which exchange data across predefined connections by message passing, where the connections are specified externally to the processes. These black box processes can be reconnected endlessly to form different applications without having to be changed internally. FBP is thus naturally component-oriented.

http://rawkes.com/articles/an-introduction-to-noflo-and-flow-based-programming

This means that we can construct software programs entirely out of reusable components. We can also perform real-time swapping of expired components from an online component repository that is actually kept up to date. The main problems with reusable components are that swapping them in and out proves too difficult, that maintenance lapses, and that nobody communicates their progress, feature set or release date in time for them to be used in projects. If we look at the movement towards component-based models, FBP provides us with visual tools (great for business analysts, architects and developers) for constructing pipelines of components that can be dynamically altered. Throw in developer collaboration tools based around program management - to get the component developers and the project/program managers talking - and the components can actually be ready in time for upcoming projects.
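As a minimal illustration of the FBP idea (the concept only - this is not NoFlo's actual API), the processes below are black boxes and the application is nothing more than an externally specified list of connections between them:

```javascript
// Each process is a 'black box': it sees only its input message.
const toUpper = (msg) => msg.toUpperCase();
const exclaim = (msg) => msg + "!";
const reverse = (msg) => msg.split("").reverse().join("");

// The network is specified externally to the processes themselves:
// messages flow through the connections in order.
function runNetwork(connections, input) {
  return connections.reduce((msg, process) => process(msg), input);
}

runNetwork([toUpper, exclaim], "hello"); // "HELLO!"
runNetwork([exclaim, reverse], "hello"); // "!olleh" - same components, different application
```

Rewiring the connection list yields a different application without touching any component's internals, which is the property that makes FBP naturally component-oriented.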

One company that is probably the closest to this in the mobile space is Mendix. They offer a visual application designer tool that can be used by business analysts and architects. Please note this is slightly different from a typical WYSIWYG tool, because it models business processes as opposed to visual artifacts.

https://www.mendix.com/application-platform-as-a-service/

The current de facto implementation actually lives in the web space, though, and is called NoFlo (http://noflojs.org/). This comes with visual application design tools and a rich set of components that can be swapped in and out. Just recently NoFlo became popular again after a Kickstarter campaign to upgrade the solution produced Flowhub (https://flowhub.io/). My only stipulation is that the output from a visual FBP tool must offer access to the source code. One of the major limitations of MADPs is the lock-in to certain customizable widgets or UI components. To offer truly world-class UX, a developer needs full access to the source code to implement the best native UI the design team can imagine.

Why do I think visual FBP tools are the future? Look at the rise of pure native and pure web solutions, their movement towards reusable components, and the increasing diversity of Apple and Android platforms: the cost of MADP maintenance is going to be too high, and the lack of reusable component integration too limiting.

You could of course argue that the old operating systems are on their way out, so the cost of maintenance is stabilising. The problem is that the new operating systems offer very diverse new capabilities, and features like multi-tasking place an increased demand on on-device system testing and diagnostics. Likewise you could argue that native component libraries can be wrapped or MADP-specific component libraries used, but this is never as simple as it sounds in practice, and it clearly lacks the ease of use that FBP offers.

I am currently leaning towards a pure native / pure web solution with a highly configurable set of reusable components using FBP, including software collaboration tools that allow component developers and project/program managers to work together across a whole enterprise. I would also include APIs in the solution so that subscription models for various APIs can be included in the visual design along with traditional components. Notably, MuleSoft (which acquired ProgrammableWeb) is probably closest in the market to the successful monetization of APIs.

I think, given around $5 million, I could probably build the perfect Digital Integration Platform using: visual FBP; reusable components; API monetization; device-first rendering techniques; diverse device integration; environment virtualisation; and a native or ECMAScript 6/2015 core. It would, however, be completely different to anything that came before it, because it would not be mobile-first and it would not be cross-platform. To answer my original question: that is not something I would have said six months ago.


Thursday, August 13, 2015

What is DevOps and Why is it Relevant to Enterprise Mobile?

Our resident expert on enterprise mobility and mobile application development, Peter Rogers, shares his insights on the subject of Enterprise Mobility and DevOps in this guest post.  Enjoy!

*****
Peter Rogers,
Mobile and Digital Expert
Cognizant

A lot of people say the word "DevOps" but there is very little material that defines what it actually means. It transpires that DevOps and Enterprise Mobility are a match made in heaven, because the current trend is for enterprise mobile services to go live in 3-6 months, which requires a whole new mindset.

“DevOps is a software development method that emphasizes communication, collaboration (information sharing and web service usage), integration, automation, and measurement of cooperation between software developers and other IT professionals. The method acknowledges the interdependence of software development, quality assurance (QA), and IT operations, and aims to help an organization rapidly produce software products and services and to improve operations performance”

https://en.wikipedia.org/wiki/DevOps

“Gartner believes that rather than being a market per se, DevOps is a philosophy, a cultural shift that merges operations with development and demands a linked toolchain of technologies to facilitate collaborative change. Gartner views DevOps as a virtual (and likely temporal) market and has focused the scope of the definition on tools that support DevOps and practices associated with it in the context of continuous delivery, continuous improvement, infrastructure and configuration as code, and so on. Gartner categorizes these tools as DevOps-ready, -enabled and -capable tools.”

http://www.gartner.com/newsroom/id/2999017

DevOps is a suitably vague term and I think over the years it has become an umbrella for more definitive terms that have started to appear in the Mobile and Web spheres:

  • Test Driven Development
  • Behaviour Driven Development
  • Automation
  • Continuous Deployment
  • Client/Server Application Monitoring
  • Agile Project Life Cycle

The idea around DevOps is to give developers more operational powers in order to deploy software more autonomously. Typically, in an old-school Waterfall approach:

  1. We write our code
  2. We run Unit Tests to an unknown level of Code Coverage
  3. We run Integration Testing
  4. We run System Integration Testing (SIT)
  5. We run System Testing
  6. We run User Acceptance Testing (UAT)

The problem with this approach is that taking a feature live is incredibly expensive in terms of testing effort. Most mobile projects do not last more than 6 months, so the budget just isn't there for this drawn-out, linear approach. Wouldn't it be smarter to put the power in the hands of the mobile developer to be able to safely deploy a new feature without all this hassle?  Well, it all depends on whether you trust your mobile developers. To achieve this goal, we must pick the right methodologies and the right tools to empower the developers.

The new methodologies of Test Driven Development (TDD) and Behaviour Driven Development (BDD) provide us with the strategy and are supported by toolsets. In Test Driven Development a developer must define a test for the software unit first, then implement the code unit, and finally verify the implementation through the unit tests. The responsibility lies with the developer to make sure the software unit is delivered as a working entity in an atomic transaction. TDD not only hands unit testing to the developer, it also makes them think about code design upfront. Mobile projects are particularly susceptible to future support issues because the focus is often on getting the App out there yesterday, without any consideration of what happens 12 - 18 months down the line. TDD ensures that the code has adequate test coverage, and the tests also act as a form of documentation - which is particularly handy when you find there isn't any.

Behaviour Driven Development is a superset of TDD which enforces a higher level of collaboration between the core groups within an Agile scrum - developers, testers, user experience specialists, business analysts and senior managers - making sure a software contract is fulfilled. Mobile projects require a very strong user experience, but this is difficult when your developers, business analysts and designers all speak different languages. Using BDD we can actually communicate a shared vision that enforces a business-coherent and user-friendly experience.

“Behaviour Driven development combines the general techniques and principles of TDD with ideas from domain-driven design and object-oriented analysis and design to provide software development and management teams with shared tools and a shared process to collaborate on software development. BDD is a second-generation, outside-in, pull-based, multiple-stakeholder, multiple-scale, high-automation, agile methodology. It describes a cycle of interactions with well-defined outputs, resulting in the delivery of working, tested software that matters”

https://en.wikipedia.org/wiki/Behavior-driven_development

Test Driven Development (TDD) gives us the following:

  • The developer writes the unit tests
  • The unit tests are written before the code
  • There are defined levels of code coverage, generally 80% or higher
  • Mocks are used for Black Box testing and to help with asynchronous testing
  • Unit tests are typically automated and fast to execute - at least on the developer's machines

Behaviour Driven Development (BDD) gives us the following:

  • A common language that can be understood by business analysts, user experience specialists, testers and developers
  • A feature driven approach to development that includes business level top-down design
  • Complete traceability of business requirements against software features
  • Automated acceptance tests

OK, so through TDD and BDD we have empowered developers and allowed them to talk in a language that the rest of the world understands - most importantly, the business. We have definitely put more work on the developers, but we have elevated their status. If a senior manager actually understands the business value of what a developer is building, that developer is more likely to receive higher praise. But how do we get rid of the traditional System Integration Testing and User Acceptance Testing phases, which slow down the deployment of new features? This is where Continuous Integration, Continuous Delivery and Continuous Deployment come in. This is typically a challenge in mobile projects because everybody wants to see the App regularly, and getting the App to the end user is not always trivial. This normally results in out-of-date versions of the application being circulated and vast quantities of defects being raised that were already resolved days ago.

“Continuous Integration is the practice of merging development work with a Master/Trunk/Mainline branch constantly so that you can test changes, and test that changes work with other changes.  The idea here is to test your code as often as possible to catch issues early.  Most of the work is done by automated tests, and this technique requires a unit test framework.  Typically there is a build server performing these tests, so developers can continue working while tests are being performed.”

“Continuous Delivery is the continual delivery of code to an environment once the developer feels the code is ready to ship.  This could be UAT or Staging or could be Production.  But the idea is you are delivering code to a user base, whether it be QA or customers for continual review and inspection.  This is similar to Continuous Integration, but it can feed business logic tests.  Unit tests cannot catch all business logic, particularly design issues, so this stage or process can be used for these needs.   You may also be delivering code for Code Review.   Code may be batched for release or not after the UAT or QA is done.  The basis of Continuous Delivery is small batches of work continually fed to the next step will be consumed more easily and find more issues early on.  This system is easier for the developer because issues are presented to the developer before the task has left their memory.”

“Continuous Deployment is the deployment or release of code to Production as soon as it is ready.  There is no large batching in Staging nor long UAT process that is directly before Production.  Any testing is done prior to merging to the Mainline branch and is performed on Production-like environments, see Integration blog article for more information.  The Production branch is always stable and ready to be deployed by an automated process.  The automated process is key because it should be able to be performed by anyone in a matter of minutes (preferably by the press of a button).  After a deploy, logs must be inspected to determine if your key metrics are affected, positively or negatively.  Some of these metrics may include revenue, user sign-up, response time or traffic, preferably these metrics are graphed for easy consumption.  Continuous Deployment requires Continuous Integration and Continuous Delivery - otherwise, you are just cowboy coding and you will get errors in the release.”

http://blog.assembla.com/assemblablog/tabid/12618/bid/92411/Continuous-Delivery-vs-Continuous-Deployment-vs-Continuous-Integration-Wait-huh.aspx

With Continuous Integration we have eliminated the need for a dedicated System Integration Testing phase, as long as our automated unit tests, integration tests and system tests are strong enough. Typically, ad hoc testing would be performed at the same time by quality assurance testers, and most system tests are manual, as they are difficult to automate in practice. With Continuous Deployment we have eliminated the need for a dedicated User Acceptance Testing phase, as long as we have used BDD automated acceptance testing and we can truly deploy the application to any target user at the press of a button.

There is a piece of the puzzle missing, and this is Integration Testing - the Achilles' heel of every mobile project. If something breaks in an enterprise mobile project, you know that 99.9% of the time the finger is pointing at your mobile application. What is more annoying is that 99.9% of the time the problem is located somewhere else and you just can't prove that fact. Well, the way out of that particular Catch-22 is to provide an effective integration testing solution.

Typically, integration testing is performed in a rather convoluted way, through permutations of components talking to each other. This gives rise to a vast number of tests which are often slow to run, and when they fail nobody knows where the fault lies. Luckily there is a much better approach to Integration Testing, described in a beautiful one-hour video which anyone who is serious about testing should really watch. In a nutshell, instead of Integration Testing you should treat each component as a black box and perform Collaboration Testing and Contract Testing.  Collaboration Testing makes sure your component is talking correctly to another component. This is achieved by defining an interface for each component for the services it offers. Contract Testing makes sure that the component you are talking to actually honours the contract. It is probably easiest to think about this as a client-server model.

https://vimeo.com/80533536

Let us imagine the simplest mobile application in the world. We have two boxes: a mobile library which handles networking to a server, and a remote server that responds with JSON data. This is a pure client-server mobile application scenario to make it easier to understand, but in practice this will involve many client components talking to each other before an outside server is even reached. Let us say the remote server offers a REST-based API. We can now perform Collaboration Testing on the mobile library to make sure it calls the REST API correctly. Most people normally forget the other half of the equation, which is making sure the server is well behaved. This is why the mobile client ALWAYS gets the blame, and it can be prevented very easily. We can perform Contract Testing on the remote server in this instance by making sure that it responds correctly to the API calls. You should set up some kind of service monitoring here. You should also check that the JSON data returned meets the agreed contract, and this is where JSON Schema comes into play. You can define precisely what acceptable responses from the remote server look like, as opposed to just expecting the client to be 100% defensively coded.
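The contract-testing half of that example can be sketched as follows: the agreed response shape is checked field by field, so a misbehaving server is caught before the mobile client gets the blame. The hand-rolled checker below stands in for a real JSON Schema validator, and the contract fields are invented for illustration:

```javascript
// The agreed contract: each field of the server's JSON response and the
// JavaScript type it must have.
const contract = { id: "number", name: "string", inStock: "boolean" };

// Minimal contract check: every field in the contract must be present in
// the response with the agreed type. (A real project would express this
// as a JSON Schema and run a proper validator.)
function honoursContract(response, schema) {
  return Object.entries(schema).every(
    ([field, type]) => typeof response[field] === type
  );
}

// A well-behaved server response passes:
honoursContract({ id: 7, name: "Widget", inStock: true }, contract);   // true

// A server regression - ids now returned as strings - is caught here,
// on the server's side of the contract, not blamed on the client:
honoursContract({ id: "7", name: "Widget", inStock: true }, contract); // false
```

Run against the live service as a monitoring check, a test like this gives you the proof that "the problem is located somewhere else" which the paragraph above laments is usually missing.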

If we imagine that we start with the first component in an enterprise solution and work our way down through a tree-like structure of component interactions, then as long as we test each individual software unit, perform a collaboration test with each unit it talks to, and perform a contract test for each unit that talks to it…then we should have a perfectly tested system. Why is this important? Well, it removes the whole entropy of integration testing, and since we are testing against interfaces we can treat the components as black boxes most of the time and use Mocks. This means we can now perform automated integration testing as opposed to winging it. It also means that the code is designed better, because we can spot the limitations in the component contracts at design time rather than build time. Finally, it means we have a way out of the Blame-Mobile-First approach.

At the end of the day we have just empowered a team of developers to create a fully tested unit of software that implements a well-defined feature with traceability back to a well-defined business requirement, is fully tested against all the other software components, and is merged into the main software product safely. Furthermore, we can move that software release through various stages of deployment automatically, including a full production release (knowing it has been tested against acceptance criteria that all parties have bought into), and have the resulting application ready to deploy at the press of a button. I think THIS is what people mean by DevOps, and THIS is why it is critical to modern-day mobile development.

 ************************************************************************
Kevin Benedict
Writer, Speaker, Senior Analyst
The Center for the Future of Work, Cognizant
View my profile on LinkedIn
Read more at Future of Work
Learn about mobile strategies at MobileEnterpriseStrategies.com
Follow me on Twitter @krbenedict
Subscribe to Kevin's YouTube Channel
Join the LinkedIn Group Strategic Enterprise Mobility
Join the Google+ Community Mobile Enterprise Strategies

***Full Disclosure: These are my personal opinions. No company is silly enough to claim them. I am a mobility and digital transformation analyst, consultant and writer. I work with and have worked with many of the companies mentioned in my articles.

Monday, August 10, 2015

Digital Transformation Behind the Currents!

My good friend and colleague at Cognizant, Ved Sen, wrote this article about the definitions of terms around the topic of digital transformation.  It is an important piece that gives us the language to discuss the huge impact technologies are having today on both business and IT.  Enjoy!

******
Ved Sen, 
Head of Digital, 
Cognizant Business Consulting 
@vedsen
Despite working in the digital space for years now, I was quite stumped a few weeks ago when I was asked to define it. Sometimes you can get away with circumlocution or, to use the technically correct term, waffling. But given all the hype around digital transformation, I felt that it was a good time to try and get a working definition going. For one, it helps to cut the hype. And two, it clarifies what is NOT digital at a time when the label is being slapped around with abandon.

So I read descriptions of digital in the media and on our competitors' sites. I listened to analysts and read books and white papers. I asked our clients what they were doing. And I spoke to the experts in Cognizant, and spent time just thinking about this. And I’m happy to say I’m willing to stick my neck out and try to define digital in less than 25 words.

Of course the problem with definitions is the tradeoff between pithiness, abstraction and comprehensiveness. You can be very pithy but too abstract, e.g. ‘Digital is the future of business’. Or you can take a whole page to define digital, but that’s a description, not a definition. So here’s my definition, and you’re welcome to challenge it, differ with it, or adapt it as you wish.

Digital means: exploiting emerging technologies to create user / customer centric interfaces and data driven business models, leading to more agile, responsive and competitive business models.

Let’s break this up.

Emerging technologies are certainly a driving force of digital. They are the reason we’re having this conversation. But to be clear, there are many discrete elements that make up the emerging-technology theme. Arguably the big bang for ‘digital’ was the launch of the iPhone – because it put powerful computers into people’s pockets. It democratised access and provided a platform for almost all the other innovations. Samsung’s (and others’) lower-cost, Android-driven imitations of the iPhone ensured a mass market for smartphones. Alongside the smartphone, though, you have to consider the continuously evolving web 2.0 (are we still allowed to say that?) and the emergence and maturing of HTML5, JavaScript and more frameworks for delivering slick web front ends than you can shake a smartphone at. HTML5 and the ever-improving web have had a see-saw battle with native mobile platforms and frameworks, and entire generations of technology have come and gone in the past 5 years. Remember MCAP and MEAP platforms, and the allure of cross-platform development for mobile apps? All of this has also greatly helped social platforms – which include Facebook, Twitter, WhatsApp and hundreds of messaging and collaboration platforms.

Behind the scenes: But this is not just about front-end technologies. Moore’s Law continues to drive the cost of computing down, leading to significant capabilities to process data – be it the in-memory database capability of SAP HANA or the emergence of Big Data, and our ability to analyse and make meaning of ever-larger data sets in continuously decreasing cycle times. Newer and more efficient graph (Neo4j) and clustered (Hadoop) database models are supplanting the once-ubiquitous RDBMS providers. And the en masse shift to cloud infrastructure and smarter automation has created a whole universe of services – starting with PaaS and now a generic ‘as a service’ nomenclature.

The Internet of Everything: And to top it all, the next wave of internet-connected sensors and devices is just beginning. Another whole wave of connected and smart objects has the potential to change everything, again, in the way we buy and consume goods and services. The Internet of Things does not have a single killer app yet, but its growth and spread are nonetheless accelerating.

It's not what you did, it's how you did it: the shift in the underlying methodology has played its role. The maturing and widespread adoption of agile frameworks, and the toolkits to deliver them, is a key construct of digital. The rapid evolution of technologies both necessitates and enables a much more adaptive and cyclical approach to technology delivery.

Design thinking: Almost absurdly, all this fantastic technology is still not what truly drives the digital change we see in businesses. That honour belongs to the emergence of design thinking and service design methodologies. Some of this is commonsensical, and you would think it should have been the norm rather than an innovation. But the mind-shift is fundamental. Industry-leading businesses are now recognising the need to be customer-journey driven. I use the word interface in a broad sense here, not restricted to screens. The question to ask is: how do your customers, partners and even employees interface with your business? Historically, this was driven by inside-out thinking. In other words, businesses decided how they wanted to run their processes and designed systems and interfaces to match those desired processes. So if a bank’s preference was for the customer to be in the branch while opening an account, that’s how the processes and systems were defined. In digital, those interfaces are conceptualised outside-in. This means the starting point is the user. What does she or he want to do? How does the prospective customer want to open the account? What are her constraints? What would make her choice easier, and her experience better? Once you start thinking outside-in, you reach a very different point in the way systems and processes are defined. And when you combine this user-centric interface thinking with the technology opportunities that are emerging, you begin to understand why transformation is the buzzword du jour.

Data Driven Decisions: Implicitly or explicitly, every decision we make (what to wear to work, for example) is made on the basis of data that we process (what meetings do I have, what is the dress code, what is the weather?). Complex decisions require more sophisticated data. Historically this data has not been available to us for many large and small decisions. How much to spend on the marketing campaign? Where to open the next store? Who to hire as a program leader for a new business area? How to implement a hot-desking policy? As a consequence, most businesses have relied on ‘experts’ for these decisions, whether from within the business or consultants brought in for the purpose. Experts use their wisdom, which is often an implicit accumulation of data from deep experience in that area. What we are witnessing, thanks to the combination of lean thinking and instrumentation, is a seismic shift to more explicit data-driven decision making. For example, if everybody used a smartphone to access the office for a month or two, the data might suggest that Wednesdays are the busiest day of the week while Friday is the lightest. The latter may be visually obvious but the former may not. Or the data may show that on Mondays the average time spent in the office is actually just 4 hours – because people are in meetings or on projects outside. Suddenly there is explicit data to influence your hot-desking policy, depending on what your objectives were. This is a tiny example but very representative of how digital is reshaping our decision making. Now imagine this at scale, and for the hundreds of decisions made every day, and you get a sense of what I mean.
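The office-access example above can be sketched in a few lines: derive the busiest day from door-access events rather than expert intuition. The event data here is invented purely for illustration:

```python
# Toy version of the hot-desking example: find the busiest office day
# from smartphone door-access timestamps.
from collections import Counter
from datetime import datetime

access_events = [
    "2015-08-03 09:01",  # Monday
    "2015-08-05 08:45",  # Wednesday
    "2015-08-05 09:10",  # Wednesday
    "2015-08-05 13:02",  # Wednesday
    "2015-08-07 10:30",  # Friday
]


def busiest_day(events):
    """Count access events per weekday and return the most common one."""
    days = Counter(
        datetime.strptime(e, "%Y-%m-%d %H:%M").strftime("%A") for e in events
    )
    return days.most_common(1)[0][0]


print(busiest_day(access_events))  # prints "Wednesday"
```

The same counting idea scales to badge systems logging thousands of events a day; the point is that the answer comes from explicit data, not recollection.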

Responsive business models: we are used to stability, and to treating change as a temporary disruption between periods of stability – not dissimilar to moving house. Increasingly, though, we find ourselves in a state of continuous change. The disruption is not a passing inclemency; it is the new normal. Think of moving from a house to a caravan, for example. What the combination of technologies, design thinking and data surfeit allows us is to build a responsive or adaptive business model that is able to keep pace with a fast-changing environment. Think evolving operating model instead of target operating model. Think of the cost of change as a part of the cost of doing business, not as a capital expenditure. Obviously, industry context is vital – retail banks and media businesses are much further down the path of transformation than, say, infrastructure providers. But while the impact may vary, the change is universal. Digital is therefore not about B2C vs B2B, and it's not about marketing or your social media. I believe this is fundamentally about your business model being impacted by better data, delivered at the point of decision making.

Agile Strategy: seen in this way, it would be logical to look at your strategy as an agile and evolving artefact. Many companies still work to sequential 3- or 5-year plans. Instead, we should be looking at rolling 12-quarter roadmaps which reflect our strategy but can be modified on a quarterly basis, keeping a vision or end goal in mind. But more about that some other time.

The point of all this is to be competitive. And digital business models which use technology, design thinking and data optimally are far more competitive in the world we live in. I heard John Chambers, the CEO of Cisco, say ‘Change will never be this slow again’. And 52% of the companies on the Fortune 500 list of 2000 no longer exist. Collectively that sums up the challenges and dangers of being change-resistant. So whether you agree with my definition of digital or not, a response to the change around us is not optional. Enjoy the ride!


Tuesday, August 04, 2015

Retail Evolution and Mobility

Farmers once sold their harvest bounty directly to customers from beneath the branches of their fruit trees.  Customers had a direct, face-to-face relationship with the farmer and could express their preferences and demonstrate their buying patterns.  Over time farmers developed the means to preserve and package their products, and to sell them through retail stores with large customer bases.  Sales expanded, but the personal relationship between farmers and their customers, and an intimate understanding of each customer’s preferences, was lost behind the retail shelves of big-box stores.

Over time retail stores seeking market expansion and competitive differentiators developed mobile commerce apps that enabled them to sell products across a much wider geographic area, and to larger markets at any time of the day or night. This expanded sales potential, but in the process disconnected customers from the retailer’s physical store and location.

Mass marketing to mass audiences depersonalized the shopping experience.  It reduced the farmer’s products to mere commodities, and retail stores to logistics, warehouse and delivery centers.  It shifted competitive differentiators from customer service, retail locations, store layouts, local product selections and building designs to the designs of mobile commerce apps and websites, their performance and ease of navigation.  In addition, shipping costs and post-sales return policies moved from afterthoughts in fine print to major competitive differentiators. Few were satisfied with these developments.  Customer service, brand loyalty and the consumer’s retail experience suffered.

Today, however, technologies and business strategies are converging again to offer hope that these relationships can be restored, and the quality of the consumer’s mobile commerce experience improved.  The development of MyX (My Experience) personalization strategies and technologies promises highly personalized digital experiences for consumers, and competitive advantages for the businesses that can support them.

Creating highly personalized MyX mobile commerce apps for thousands and even millions of consumers requires business process re-engineering, new IT strategies and technologies, intelligent process automation, upgraded legacy systems and real-time personalized experiences. The competitive battlefields of retail are moving fast and demand urgent action today.

As consumers shift more of their work and personal time to mobile devices, we see rapid growth in both mobile marketing investments and the number of mobile commerce transactions.  Today 34 percent of global e-commerce transactions are mobile, even though 73 percent of survey participants continue to use desktops/laptops for most of their online shopping activities.  Mobile shoppers (those who shop online regularly using a smartphone or tablet) shop online more frequently than computer shoppers (those mostly using computers for online shopping), and as shoppers continue to migrate to mobile commerce these transaction numbers will keep growing.  The bottom line: mobile commerce is growing fast across all demographics and represents the future of retailing. Developing a strategy for personalizing users' experiences is the key component.


Tuesday, July 21, 2015

iOS 9 - The Challenges and Facts Developers Must Know

Our resident mobile expert and guru Peter Rogers shares his insights on the challenges presented by iOS 9 in this guest article.  Enjoy!
*****
Peter Rogers, Cognizant Mobile Guru
Web and App developers often live in fear of the latest iOS release because of the challenges it will bring to porting their software.  This time, however, iOS 9 really has outdone itself. This article will focus on the core changes that developers are going to need to know about and address.

Apple has spent a lot of time focused on the interoperability between Apps and Safari. A key deliverable of this is the new Safari View Controller. The idea is that all HTML content rendering requests are literally handed over to Safari, as opposed to using UIWebView or WKWebView (which never reached its true potential and remains largely unloved). How this impacts PhoneGap, which struggled to implement WKWebView, will be very interesting to see.

http://www.macstories.net/stories/ios-9-and-safari-view-controller-the-future-of-web-views/

This web interoperability theme continues with universal links, which securely lead users directly to a specific part of your App from a website link. This is achieved by means of a signed shared web credentials file in JSON format that has to be stored on your server. The net result is a seamless interchange between web and App environments that bypasses Safari, of which developers will want to take full advantage. There is also an extensive Search API that exposes mobile and web data seamlessly.

https://developer.apple.com/videos/wwdc/2015/?id=509

The controversy arrives in the form of downloadable Safari extension Apps that can be used, amongst other things, to offer mobile ad blockers. This is something Purify aims to take full advantage of, and the end result is amazing for web users but terrible for advertising agencies. The panic has already started to spread, and I predict we will see the typical strategies of “beg to whitelist” and “in-content advertising” which offer solutions in the desktop space.

Swift 2.0 has been released, but it is interesting to see that a lot of developers have not adopted Swift 1.0 due to lack of tooling support. There are also standard updates to HomeKit, HealthKit, CloudKit and MapKit. CloudKit offers a new web interface using CloudKit JS that can be used to share your cloud data between mobile Apps, desktop Apps and web Apps. The games support has been massively overhauled, and Metal continues to offer the lowest-overhead access to the GPU, spurning OpenGL portability for raw performance. The News App is also worthy of note because it may mean similar Apps get blocked from publishing on the App Store if they are too similar. We also have App Transport Security, which forces HTTPS and declarations of network connections.

http://stackoverflow.com/questions/30751053/ios9-ats-what-about-html5-based-apps

The second piece of controversy arrives in the form of multitasking on tablets. Suddenly every tablet App can run in a multitasking context, which could mean a secondary App taking up half the screen and a third video App running Picture in Picture. The problem for developers is that the onus is on them to deliver a constrained application that has been fully tested in a multitasking environment. This means that suddenly System Testing – in particular Performance Testing – and Non-Functional Requirements become more important, and that of course adds to the cost.

The options available are as follows:
  1. Slide Over provides a user-invoked overlay view on the right side of the screen (or on the left side in a right-to-left language version of iOS) that lets a user pick a secondary app to view and interact with.
  2. Split View displays two side-by-side apps, letting the user view, resize, and interact with both of them.
  3. Picture in Picture lets a user play video in a moveable, resizable window that floats over the apps onscreen.
“From a development perspective, the biggest change is about resource management. Every iOS app—even one that opts out of using multitasking features—needs to operate as a good citizen in iOS 9. Now, even full-screen apps don’t have exclusive use of screen real estate, the CPU, memory, or other resources. To participate effectively in this environment, an iOS 9 developer must carefully tune their app’s resource usage. If an app consumes too much time per frame, screen updates can fall below 60 frames per second. Under memory pressure, the system terminates the app consuming the most memory.”

An application can opt out of appearing in the Slide Over selector bar or being available for PiP as long as there is a good reason. If you are submitting Apps to the Apple App Store then you are most probably going to have to include this feature if you do not want to be rejected.  Here is the major rub though…you cannot stop another application from being run at the same time as your application in Slide Over, Split View or PiP. That means you have to make sure your App is a well-behaved multitasking citizen, and that means a lot of System Testing on actual devices. You need to watch your framerate and memory consumption, handle window-based resizing as opposed to screen-based resizing (Auto Layout helps here), and handle system state events (such as temporarily being put in the background and releasing memory-intensive resources).

The following can happen even if you opt out of multitasking in your application:
  • A user adds a Picture in Picture window that plays over whatever else is onscreen (including a full-screen app), while the app that owns the video continues to run in the background.
  • A user employs Slide Over to use a secondary app. While it is visible, the secondary app runs in the foreground.
  • A user invokes a keyboard from a secondary app in Slide Over, thereby obscuring a portion of the primary app.
One key challenge here is automated testing. Often the devices are remote and accessed through some cloud-based testing setup like CloudBees, DeviceAnywhere or Xamarin Test Cloud. This means that remote device testing vendors now have to offer the ability to run three applications on a remote iPad and provide performance and memory logs. If such vendors do not offer these capabilities then you have to acquire the devices yourself and run the tests manually, and that adds to the hardware costs of the project.

If that wasn’t enough we also have the third and final controversy: App Thinning. In order to help developers streamline their Apps for the new multitasking environment and to handle Apple Watch, Apple have introduced three new concepts that come under the banner of App Thinning:
  1. Slicing
  2. Bitcode
  3. On Demand Resources
Slicing is the process of creating optimal variants of the application for specific target devices. For example, rather than have one Universal App that covers all the tablet and mobile code, you could deliver two Apps. In fact, rather than carry legacy code for iOS 7 and iOS 8, why not deliver three dedicated Apps, one each for iOS 7, iOS 8 and iOS 9? Why stop there? Why not look at screen sizes and offer precisely the resources that a certain screen size needs? Welcome to Slicing.

The creation side of things is handled by Xcode 7 and its new ability to specify target devices and provide multiple resolutions of an image in the asset catalog. Xcode also allows you to test the local variants of the application on a simulator. You then create an archive of the App and send it off to the App Store. The App Store handles the deployment side of things, offering up the precise variant of your App to iOS 9 App Store clients. The question remains how this works for Enterprise App Stores, but that one is for a later article.

Bitcode is an intermediate representation of a compiled program. Supplying bitcode in your application allows Apple to re-optimise your App binary automatically in the future. I can only assume that it is impossible to introduce errors in this process, or there would be a large outcry.

On-demand resources are those that can be fetched when required, as opposed to being bundled with the application. (A similar approach works less well on Android because of the Java heap, which often means memory gets lost in the process.) The App Store hosts the resources for you on Apple servers and manages the download. The operating system purges on-demand resources when they are no longer needed and disk space is low. If you use an Enterprise App Store instead of the Apple App Store then you must host the on-demand resources yourself, and that is again something worth exploring in future articles here.

Hopefully, when your customer asks you for a quick iOS 9 update, you will at least be prepared.


Monday, July 20, 2015

Mobility, Sensors, Robotic Process Automation and the Principle of Acceleration

If you have spent any time working on IT projects, you will have heard the comment, "The system is only as good as the data." It's an accurate and necessary statement, as it describes a prerequisite for many technological innovations. Many system designs fail in the face of reality. Reality is often a cloaked term for implementing a digital solution in a physical world without a sufficient understanding of how the physical world operates. This is one problem where sensors can really help.

Sensors fill in the blind spots in our systems and operations by measuring the physical world and providing us with the data. Where previously we operated on conjecture or false assumptions, sensors provide real data on how the real world functions. Operating on real data allows for new and different approaches and IT strategies, such as strategies that utilize artificial intelligence or, in more complex environments, robotic process automation solutions. These automated solutions know exactly what to do in a complex process given specific data. Robotic process automation offers operational speeds and levels of accuracy never before possible with humans alone.

In a world of ubiquitous mobility, businesses must learn to operate in real time. Marketing, sales and commerce must all evolve to operate in real time. Think about an LBS (location-based service) where retailers want to inform their customers, via SMS, of nearby discounts or special offers. If the SMS is delayed, the customer will likely have moved on and the SMS will be irrelevant. Payments must operate in real time. Real time is a speed deemed impossible just a few years ago, and it remains a future goal for most companies. Today, however, with mobile devices and real-time wireless sensors updating complex systems, it is often the humans in a process who are the sources and causes of bottlenecks. Think about how slow a credit or debit card transaction would be if every transaction ended up in a human's inbox for review and approval before it could be completed. Global and mobile commerce would stop. The credit and debit card processes were automated long ago. Enterprises are now feeling the pressure to automate more processes to enable an operational tempo that runs at the speed of mobility.
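The LBS scenario above comes down to a freshness check: an offer is only worth sending while the location fix is still current. The five-minute window in this sketch is an assumption for illustration, not a retail standard:

```python
# Sketch of the location-based offer scenario: only send the SMS while
# the customer's location fix is still fresh enough to be relevant.
from datetime import datetime, timedelta

FRESHNESS_WINDOW = timedelta(minutes=5)  # assumed shelf-life of a location fix


def offer_is_relevant(location_fix_time, now):
    """A nearby-discount SMS only makes sense while the fix is fresh."""
    return now - location_fix_time <= FRESHNESS_WINDOW


now = datetime(2015, 7, 20, 12, 0)
assert offer_is_relevant(datetime(2015, 7, 20, 11, 58), now)      # 2 minutes old: send it
assert not offer_is_relevant(datetime(2015, 7, 20, 11, 40), now)  # 20 minutes old: stale
```

A human approval step anywhere in this loop would blow the five-minute window every time, which is exactly the bottleneck argument above.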

What does it take to automate and run at real-time operational tempos? First, it takes accurate data that has not expired on the shelf. Data that has expired on the shelf no longer holds the value it once had.  For example, the weather forecast for last weekend is not useful for this weekend; the value of the data has expired. Second, it takes IT infrastructures capable of supporting real-time transactions and processing speeds. Third, it takes defining decision trees, business rules and processes to the level where they can be coded and automated. This then enables artificial intelligence to be added and utilized. Once enough artificial intelligence is supported, it can be connected together into a complete process for RPA (robotic process automation). Now you have a chance at real-time speeds.
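The third step, defining rules precisely enough to be coded, can be sketched in the spirit of the card-transaction example. The rules and limits here are invented purely for illustration:

```python
# Minimal sketch of a coded decision tree: every branch is explicit,
# so the decision can run in real time with no human inbox in the loop.
def approve_transaction(amount, daily_total, card_blocked):
    """Return the automated decision for a card transaction."""
    if card_blocked:
        return "decline"
    if amount > 5000:
        return "review"          # only edge cases fall back to a human
    if daily_total + amount > 10000:
        return "decline"
    return "approve"


assert approve_transaction(120, 500, card_blocked=False) == "approve"
assert approve_transaction(7500, 0, card_blocked=False) == "review"
assert approve_transaction(120, 9950, card_blocked=False) == "decline"
```

Once each branch is explicit like this, the rules can be chained into larger automated processes, which is where RPA picks up.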

In summary, accurate and real-time data, especially in a physical environment, will require sensors to fill data blind spots and replace data that has expired on the shelf. This is just one of the many ways enterprises can take advantage of the IoT (Internet of Things).

Mobile apps are driving the demand for real-time interactions and information.  Real-time demand drives a need to change business processes and IT (digital transformation). Digital transformation increases the demand for real-time IT infrastructures and processes, which in turn increases the demand for IoT and robotic process automation. In economic circles this is known as the principle of acceleration: if demand for a product or solution increases, then investment in the capacity to supply that demand increases by an even greater amount. What does that mean for us?  Mobile is going to drive all kinds of increasing change in business and IT. Mobile technologies are having an acceleration effect across enterprises and IT today, driving digital transformation initiatives toward the "real-time" benchmark, which in turn will require more enterprise IoT and robotic process automation to achieve.


Monday, July 13, 2015

Laws for Mobility, IoT, Artificial Intelligence and Intelligent Process Automation

If you are the VP of Sales, it is quite likely you want and need up-to-date sales numbers, pipeline status and forecasts.  If you are meeting with a prospect to close a deal, up-to-date business intelligence and CRM information would be useful.  Likewise, traveling to a remote job site to check on the progress of an engineering project is an obvious signal that you will need the latest project information.  A large share of future productivity gains will come from solutions integrated with mobile applications that can anticipate your needs based upon your Code Halo data (the information that surrounds people, organizations, projects, activities and devices) and act upon it automatically.

There needs to be a law, like Moore's famous law, that states, "The more data that is collected and analyzed, the greater the economic value it has in aggregate," or, as Aristotle is credited with saying, "the whole is greater than the sum of its parts." I believe this law is accurate, and my colleagues at the Center for the Future of Work wrote a book titled Code Halos that documents evidence of its truthfulness as well.  I would also like to submit an additional law: "Data has a shelf-life, and the economic value of data diminishes over time."  In other words, if I am negotiating a deal today but can't get the critical business data I need for another week, the data will be far less valuable when it arrives.  The same is true if I am trying to optimize, in real time, the schedules of 5,000 service techs but don't have up-to-date job status information. Receiving job status information tomorrow does not help me optimize schedules today.
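The shelf-life law can be sketched as a simple decay function. This is only an illustration: the half-life parameter is a hypothetical number, not a measured figure from the survey or the book.

```javascript
// Sketch: the economic value of a piece of data decaying over time.
// The 24-hour half-life is a hypothetical parameter for illustration.
function dataValue(initialValue, ageInHours, halfLifeHours) {
  return initialValue * Math.pow(0.5, ageInHours / halfLifeHours);
}

// A job-status report worth 100 units when fresh loses half its
// value every day under this assumption.
const fresh = dataValue(100, 0, 24);    // 100
const dayOld = dataValue(100, 24, 24);  // 50
const weekOld = dataValue(100, 168, 24); // under 1
```

The exact decay curve would differ by use case; the point is simply that "tomorrow's" job status data is worth far less than today's for today's scheduling decision.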

Mobile devices are powerful sensor platforms.  They capture, through their many integrated sensors, information useful for establishing context.  Capturing GPS coordinates, for example, enables managers to see the location of their workforce.  Using GPS coordinates and geo-fencing techniques enables a software solution to identify the job site where a team is located.  The job site is associated with a project, budget, P&L, schedule and customer.  Merging this captured sensor data with an understanding of the needs of each supervisor, based upon their title and role on the project, enables context to be established.  If supervisor A is responsible for electrical, then the software systems can be configured to recognize his/her physical approach to a jobsite and automatically send the latest information on the relevant component of the project.
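The geo-fencing step above can be sketched in a few lines: a GPS fix is matched against circular fences around known job sites. The site names, coordinates and radii below are hypothetical, and the distance check uses the standard haversine formula.

```javascript
// Sketch: identifying which job site a worker is near via circular geo-fences.
// Site data is hypothetical; haversineMeters is great-circle distance.
const EARTH_RADIUS_M = 6371000;

function haversineMeters(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

const jobSites = [
  { name: "Site A", lat: 40.7128, lon: -74.0060, radiusM: 200 },
  { name: "Site B", lat: 40.7300, lon: -73.9900, radiusM: 200 },
];

// Returns the first job site whose geo-fence contains the GPS fix, or null.
function siteForLocation(lat, lon) {
  return (
    jobSites.find((s) => haversineMeters(lat, lon, s.lat, s.lon) <= s.radiusM) ||
    null
  );
}
```

Once the fix resolves to a site, the site's project, budget, schedule and customer records supply the rest of the context.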

I submit for your consideration yet another law, "The economic value of information multiplies when combined with context, meaning and right time delivery."  As we have seen, mobile technologies are critical for all of the laws discussed so far in this article.

Once sensors are deployed, sensor measurements captured, data wirelessly uploaded, and context understood, business rules can be developed whereby intelligent processes are automated. Here is an example: workers arrive at a jobsite; their arrival is captured via the GPS sensors in their smartphones, automatically registered in the timesheet app, and their supervisor is notified.  As they near the jobsite in the morning, geo-fencing rules ensure each worker is wirelessly sent their work assignments, instructions and project schedules for the day.  The right data is sent to the right person on the right device at the right time.
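The arrival rule above can be expressed as one event handler chaining the automated actions. This is a minimal sketch; the action names (clockIn, notifySupervisor, sendAssignments) are hypothetical stand-ins for real timesheet, messaging and scheduling systems.

```javascript
// Sketch: a geo-fence "enter" event driving automated actions.
// All function names here are hypothetical placeholders.
const actionsTaken = [];

const clockIn = (worker) => actionsTaken.push(`timesheet: ${worker} clocked in`);
const notifySupervisor = (worker) => actionsTaken.push(`notify: ${worker} on site`);
const sendAssignments = (worker) => actionsTaken.push(`assignments sent to ${worker}`);

// One business rule: when a worker's GPS fix enters the job-site geo-fence,
// register the timesheet entry, notify the supervisor, send the day's work.
function onGeofenceEnter(worker) {
  clockIn(worker);
  notifySupervisor(worker);
  sendAssignments(worker);
}

onGeofenceEnter("worker-42");
```

The value is in the chain: no human had to open an app, fill in a timesheet, or email a schedule.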

The IoT (Internet of Things) is a world of connected sensors.  These sensors feed more sources of captured data into the analytics engine that is used to find meaning and to provide situational awareness.  If smartphones are mobile sensor platforms, then smartphones and the IoT are two peas in the same pod.

Intelligent automated processes, like the ones mentioned above, are called "software robots" by some. These are "aware" processes acting upon real-time data in a manner that supports human activities and increases productivity.  Here is what we all need to recognize: mobile applications and solutions are just the beginning in this value chain.  Rule: Mobile apps provide only as much value as the systems and intelligent processes behind them.  The giant leaps in productivity will come from recognizing that mobile devices are sensor and reporting platforms fronting systems that use artificial intelligence and automated processes to optimize human productivity.

If you agree with my premises, then you will understand the urgency to move beyond the simple testing and deployment of basic mobile apps and jump into building the real value in the intelligent systems behind them.

Summary of Laws:
  • The more data that is collected and analyzed, the greater the economic value it has in aggregate
  • Data has a shelf-life and the economic value of data diminishes over time
  • The economic value of information multiplies when combined with context, meaning and right time delivery
  • Mobile apps provide only as much value as the systems and intelligent processes behind them
************************************************************************

Tuesday, July 07, 2015

Making the Web Run Faster via WebAssembly

Digital and Mobile Guru, Peter Rogers
Like most of us, my colleague at Cognizant and technical mobile and web expert, Peter Rogers, spends his warm summer evenings pondering how to make the Internet run faster.  In this guest blog, Peter shares the latest developments in "WebAssembly."  Enjoy!

*****

You probably saw the blogosphere explode the other day when WebAssembly was announced. [Sorry Peter, I missed that one]. For the uninitiated, WebAssembly is ‘a new low-level binary compile format that will do a better job at being a compiler target than JavaScript’. It is basically a binary form of an AST (https://en.wikipedia.org/wiki/Abstract_syntax_tree), which means it is much faster to load, process and potentially run. Half of the problem with JavaScript is that we need to load a text file and then wait for it to be interpreted. Web browsers try to use JIT (Just In Time) and AOT (Ahead Of Time) compilation strategies to speed things up, but the language itself makes that hard. Now Brendan Eich has come up with a project to deliver an actual binary AST (https://medium.com/javascript-scene/why-we-need-webassembly-an-interview-with-brendan-eich-7fb2a60b0723), and what really excites people is that Google, Mozilla and Microsoft have all agreed to work on the format. Browsers will understand the binary format, which means we'll be able to compile binary bundles that compress smaller and therefore deliver faster. Depending on compile-time optimization opportunities, the WebAssembly bundles may run faster too. A quick overview can be found here (https://medium.com/javascript-scene/what-is-webassembly-the-dawn-of-a-new-era-61256ec5a8f6) and the community page is now open (https://www.w3.org/community/webassembly/).

What followed was a flood of confused articles claiming JavaScript was dead, proposing to write everything in WebAssembly, and asking why JavaScript itself cannot target WebAssembly instead of C/C++.

Well here are some interesting bullet points for you:

  • You can actually run WebAssembly today and it does actually use JavaScript
  • WebAssembly is going to be a slow evolution not an overnight sensation
  • This solution is really useful for game developers and advanced web applications but it probably won’t be applicable in most cases 

The whole WebAssembly idea has actually evolved from Emscripten (https://github.com/kripken/emscripten), ASM.js (https://hacks.mozilla.org/2015/03/asm-speedups-everywhere/) and NaCl/PNaCl (https://developer.chrome.com/native-client/overview). ASM.js is a subset of JavaScript that has been designed to be highly optimised by compilers, but it is a textual format. If a web browser supports ASM.js and you have managed to load this textual format, you can see execution within roughly 1.5x of native speed, depending on the browser and the code itself. To put that in perspective, that is pretty much the same speed as Java and C#. Sounds great, but how do I get my code into ASM.js? This is where it starts to get nasty…you have to code in C/C++. You use Clang to compile your C/C++ into LLVM bytecode (https://en.wikipedia.org/wiki/LLVM) and then use Emscripten to convert the LLVM bytecode into a subset of JavaScript (http://ejohn.org/blog/asmjs-javascript-compile-target/) called ASM. Unreal Engine 3 was ported to ASM and it ran surprisingly well in ASM-capable web browsers.
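To make the "subset of JavaScript" idea concrete, here is a tiny hand-written module in the ASM.js style. The module name and function are invented for illustration; the `"use asm"` directive and the `|0` integer annotations are the real ASM.js conventions, and an engine that does not validate ASM.js simply runs this as ordinary JavaScript.

```javascript
// A tiny hand-written asm.js-style module. Validating engines can compile it
// ahead of time; any other engine just runs it as ordinary JavaScript.
function FastMath(stdlib) {
  "use asm";
  var imul = stdlib.Math.imul; // stdlib import, per the asm.js pattern

  // The |0 annotations declare 32-bit integer types to the asm.js validator.
  function square(x) {
    x = x | 0;
    return imul(x, x) | 0;
  }
  return { square: square };
}

const math = FastMath(globalThis);
```

Everything is typed integers and flat memory, which is exactly why a C/C++ compiler can target it while hand-writing anything non-trivial in it is painful.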

In a nutshell, you write in C/C++ and then use a few tools to output a highly optimised subset of JavaScript (albeit still a textual one) that can be accelerated on web browsers that support it. The rendering side needs to be considered too, and normally we use WebGL because it is hardware accelerated and perfect for dealing with the ASM data structures. Unity has been quick to support ASM, NaCl and now WebAssembly, and any WYSIWYG tool or dedicated programming environment can always spit out ASM. Of course there will always be an overhead if you are switching ASM contexts on and off, so you have to consider your application structure. Ideally one huge lump of heavy processing gets handed over to ASM and the rest of the application carries on in normal JavaScript. You could write the whole application in ASM, but the whole thing becomes totally unreadable unless you use a tool like Unity, and that is probably more suited to a canvas-type approach such as a widget or a game rather than a full web application.

Quite a few browsers already support ASM, according to this excellent article (https://hacks.mozilla.org/2015/03/asm-speedups-everywhere/).


Exciting…but what if you do not wish to program in C/C++? Here is where it gets very interesting. As it stands there is no WebAssembly implementation yet, but there is a polyfill (https://github.com/WebAssembly/polyfill-prototype-1) that uses ASM. That means you can run WebAssembly today via ASM, but you are still actually using JavaScript, only a highly optimisable version in textual form. The roadmap basically starts here, with the logical progression being a binary form of the existing ASM before we move to a whole new language. At the moment you have to use C/C++ in order to generate ASM, but there is nothing to stop you hand-coding it, other than patience and sanity. Anyone familiar with writing video game pokes in machine code back in the 80s will probably smile at the challenge of hand-coding ASM.

So why C/C++? Well, the problem is that scripting languages are very high level. This excellent article walks you through the argument tremendously well (http://mrale.ph/blog/2011/11/05/the-trap-of-the-performance-sweet-spot.html). Probably the best scripting language I know is Ruby, and even that does not have the low-level capabilities of C/C++ required for performance optimisation. Indeed, the whole web was founded on a mix of declarative languages (CSS, HTML) and simple scripting languages (ECMAScript, JavaScript, JScript and even VBScript). There is a reason game developers use C/C++, and that is performance; if you really want to get close to the metal, a traditional scripting language is probably not going to cut it. Most of this comes down to the data structures and the overhead of storing objects.
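The data-structure point can be illustrated in plain JavaScript. The contrast below (invented example data) is between an array of heap-allocated objects, where every value is boxed, and a flat typed array, the contiguous memory layout that C/C++ (and therefore ASM) relies on for performance.

```javascript
// Sketch: high-level vs low-level data layout for the same 2D points.
const N = 1000;

// High-level: an array of heap-allocated point objects (boxed values).
const points = Array.from({ length: N }, (_, i) => ({ x: i, y: i * 2 }));

// Low-level: one flat buffer of raw 64-bit floats, x and y interleaved.
const flat = new Float64Array(N * 2);
for (let i = 0; i < N; i++) {
  flat[i * 2] = i;
  flat[i * 2 + 1] = i * 2;
}

// Iterating the flat buffer touches one contiguous block of memory,
// which is what engines and compilers optimise best.
function sumFlat(buf) {
  let total = 0;
  for (let i = 0; i < buf.length; i++) total += buf[i];
  return total;
}
```

Both representations hold the same numbers, but the typed-array version is the one that maps cleanly onto the heap model ASM and Emscripten use.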

However…what about a next-generation scripting language, he said mischievously. I was amazed to find a new scripting language that fitted the bill called LLJS (http://lljs.org/), which I presume stands for Low Level JavaScript. They have just been able to compile LLJS into ASM (http://jlongster.com/Compiling-LLJS-to-asm.js,-Now-Available-). This is a very exciting glimpse into the future. I can see tool vendors like Unity and next-generation scripting languages like LLJS all being able to spit out ASM and deliver a much improved web experience. Soon you will be able to write a 3D application in Unity, export it to WebAssembly and use the WebAssembly polyfill to run it in most modern web browsers. LLJS will probably not be the only next-generation scripting language, and ECMAScript 7/ECMAScript 2016 along with new APIs are already adding features that make JavaScript much easier to accelerate, such as Typed Objects (http://wiki.ecmascript.org/doku.php?id=harmony:typed_objects) and SIMD (https://hacks.mozilla.org/2014/10/introducing-simd-js/).

My guess is that ECMAScript will start to evolve into a much lower-level language, and this will accelerate rapidly as soon as a few next-generation scripting languages start to challenge it. It will be very interesting to see just how low level a scripting language can become. Swift is arguably an initial attempt at exactly this, embracing the best practices of scripting along with much deeper control.
*****
Thanks for sharing this article with us, Peter!

************************************************************************

Upcoming Mobility Events