It’s All About the User Experience - Part II: Responsiveness and Zero-Latency Data Access
As I mentioned in part one (It’s All About the User Experience: Offline Support), the Cloud has been an amazing enabler, allowing the rapid launch of new and innovative solutions and making those solutions and services accessible and available worldwide from multiple devices. However, while the Cloud enabled and largely solved the any-device, anytime part of the equation, the anywhere part assumes good connectivity and service responsiveness, which often isn’t the case. The Cloud, therefore, can unfortunately degrade the user experience, especially relative to today’s users’ expectations of instantaneous experiences that are always available at one’s fingertips, anywhere and everywhere.
In part two of this series on the user experience, we’ll focus on the responsiveness of the user interface and the benefits of zero-latency data access.
As an example, in my home I have television and content services provided by a company with a Cloud DVR. In general, it works well, and I love the ability to access content from any device over any connection. During periods of high latency, however, the solution becomes downright horrible: even seemingly simple tasks such as browsing the content guide or changing the channel get queued up with no feedback for the user. The entire UI grinds to a halt, making the system unusable; then, at some point, it unfreezes and processes every queued key press in rapid succession, with unanticipated and frustrating results. While the content itself benefits from being in the Cloud, the metadata describing that content should be available on the device and shouldn’t depend on instantaneous Cloud access, because the user experience drops to “unusable” the instant latency exceeds a given threshold.
Most mobile applications have the same problem – they rely on fetching data from the Cloud, and are largely unusable or otherwise stalled while doing so. Low latency cannot be guaranteed on most networks, especially mobile and WiFi, yet it is critical to providing a good experience, driving usage, and even revenue. Amazon found every 100ms of latency cost them 1% in sales; Google found an extra half-second in search page generation time dropped traffic by 20%. When consumers have choices, they will use the solution that provides the best experience, and churn off the applications that don’t. So why don’t more applications focus on providing a good experience? The key to avoiding this latency-induced lag is storing data locally on the device and enabling zero-latency access to it.
Solutions like Realm allow developers to store data locally on the mobile/IoT device, keep it updated in real-time, and track changes both on the device and in the Cloud to automatically resolve conflicts caused by updates. In the Cloud DVR example above, the metadata around content could be stored locally on my device(s) and updated as it changes (and in reality, it changes slowly). In the mobile application case, the important data could be stored locally on the mobile or IoT device, ensuring that the UI is highly responsive and reactive, independent of the current connectivity characteristics. In this way the system can deliver unparalleled user experiences, ensuring continued usage, customer acquisition, revenue, and reputation.
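The local-first pattern described above can be sketched in a few lines of Python. This is purely an illustrative sketch, not Realm’s actual API: the class name, methods, and the last-write-wins merge rule are all assumptions standing in for a real sync engine’s conflict resolution. The point it demonstrates is that reads never touch the network, so the UI stays responsive regardless of connectivity.

```python
import time

class LocalFirstStore:
    """Illustrative sketch of a local-first store (not Realm's API).

    Reads are always served from the on-device cache, so the UI never
    blocks on the network. Remote changes arrive asynchronously and are
    merged with a simple last-write-wins policy standing in for real
    conflict resolution.
    """

    def __init__(self):
        self._local = {}  # key -> (value, timestamp)

    def read(self, key):
        # Zero-latency read: never touches the network.
        entry = self._local.get(key)
        return entry[0] if entry else None

    def write(self, key, value):
        # Local write is immediate; pushing it to the Cloud would
        # happen in the background.
        self._local[key] = (value, time.time())

    def merge_remote(self, key, value, remote_timestamp):
        # Called when a background sync delivers a remote change.
        # Last-write-wins: keep whichever update is newer.
        entry = self._local.get(key)
        if entry is None or remote_timestamp > entry[1]:
            self._local[key] = (value, remote_timestamp)


# The UI reads metadata instantly, regardless of connectivity.
store = LocalFirstStore()
store.write("guide:channel-7", "Evening News")
print(store.read("guide:channel-7"))  # instant local read

# An older remote update loses to the newer local write...
store.merge_remote("guide:channel-7", "Stale Title", remote_timestamp=0)
print(store.read("guide:channel-7"))  # still "Evening News"

# ...while a newer remote update wins.
store.merge_remote("guide:channel-7", "Late Show",
                   remote_timestamp=time.time() + 1)
print(store.read("guide:channel-7"))  # now "Late Show"
```

A real sync engine does far more than this (fine-grained change notifications, operation-level merging, offline write queues), but even this toy version shows why the UI in the DVR example never needs to freeze: the guide data is always one in-memory lookup away.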
In a world where most services face multiple competitors, and where today’s users expect instantaneous access to data, delivering the best possible user experience is critical, and the current default of relying on on-demand data fetching from the Cloud is not adequate to meet that bar. REST is not actually best – solutions like Realm can help. Come check out why companies across industries are coming to Realm to fundamentally drive the next generation of user experience.