Originally posted on The Drum.
As we prepare for “the new normal,” one thing is certain: the world will look different from the one we were accustomed to pre-coronavirus, particularly in the way we experience almost everything. Some behaviors adopted during lockdown will stay with us far longer than we expect, and will drive larger behavioral changes.
Social distancing and “touchless-touch” – or low-touch – interactions are here to stay, at least for a while. And while these behaviors may restrict how we interact, they are also an opportunity for innovation.
In thinking about retail, events and public spaces, how will we enable people to interact, browse content, learn, buy things, and experience storytelling activities without needing to touch anything? And how can we start to design solutions in the short-term that fuel this growing need?
Lessons taught by need
The way we interact with computers has evolved dramatically: from punch cards and the trackball mouse to gestures and voice interfaces. All of these technological shifts were driven by need, the biggest driver of change.
Similarly, this era of a low-touch economy is driven by need – a need for human safety and business stability.
To that end, we are going to see a surge of innovation in retail, events, training and learning, and public spaces, changing the way we consume and interact with content. Adapting existing technologies and borrowing from other industries or mediums is a fast, low-risk way to innovate, since the technology is mature and has already been tested.
Here are three interesting approaches:
User studies have long used eye-tracking devices to understand what people focus on while looking at, for instance, a webpage. The result is a heat map of where people’s eyes dwell on screen. This technology could also be used for navigating content: people could move a cursor with their eyes, and dwelling on a piece of content could serve as confirmation. With the recent evolution of deep learning libraries, it is now possible to track a user’s eyes with a regular webcam. While this requires a brief calibration step for each user, it is an interesting, low-tech tactic.
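The dwell-as-confirmation idea above can be sketched in a few lines. This is a hypothetical example, not any vendor’s API: gaze samples are assumed to arrive as (x, y) screen coordinates from whatever tracker is in use, and the dwell threshold and target names are invented for illustration.

```python
import time

class DwellSelector:
    """Fires a selection when the gaze stays inside one target long enough."""

    def __init__(self, targets, dwell_seconds=1.0):
        # targets: dict of name -> (x_min, y_min, x_max, y_max) in pixels
        self.targets = targets
        self.dwell_seconds = dwell_seconds
        self.current = None      # target the gaze is currently inside
        self.entered_at = None   # timestamp when the gaze entered it

    def update(self, x, y, now=None):
        """Feed one gaze sample; returns a target name when the dwell completes."""
        now = time.monotonic() if now is None else now
        hit = next((name for name, (x0, y0, x1, y1) in self.targets.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != self.current:          # gaze moved to a new region: restart timer
            self.current, self.entered_at = hit, now
            return None
        if hit and now - self.entered_at >= self.dwell_seconds:
            self.entered_at = now        # fire once, then restart the dwell clock
            return hit
        return None
```

Passing explicit timestamps to `update` makes the logic easy to test offline; in a live kiosk the default monotonic clock would be used instead.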
When Leap Motion first came out and developers started to build experiences with it, they were clunky and unintuitive; the device felt like it did not belong in the experience. The only time it really made sense was when it was hooked up to a VR headset, letting users see their own hands and interact with the world around them. Now, however, people are going to be more reluctant to touch shared surfaces, so the device could find renewed purpose. It also gives us the opportunity to rethink user interaction and how it integrates with content to enhance the user experience.
Combining technologies takes innovation even further. Take Ultrahaptics’ acquisition of Leap Motion, for example: “Separately, Leap Motion’s hand-tracking technology and Ultrahaptics’ mid-air haptic feedback technology already offer some novel implications for VR, AR, and device interaction. But combined they could create something right out of science fiction – the ability to feel and manipulate virtual objects without the need for gloves or other wearables.”
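To make the hand-tracking idea concrete: trackers such as Leap Motion report per-finger tip positions, and a common touch-free substitute for a click is to treat thumb–index proximity as a “pinch.” The sketch below assumes tip positions in millimetres; the threshold is an invented value, not one from any vendor SDK.

```python
import math

# Assumed threshold for illustration only; real systems tune this per device.
PINCH_THRESHOLD_MM = 25.0

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_MM):
    """Return True when the thumb and index fingertips are close enough
    together (Euclidean distance in mm) to count as a pinch gesture."""
    return math.dist(thumb_tip, index_tip) <= threshold
```

A real interface would also debounce this signal over several frames so a momentary flicker in tracking does not register as a click.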
Thanks to deep learning and large pools of data, computers can now understand what humans are saying. If a computer can successfully extract an intent from a sentence, that intent can be turned into a command that triggers an action. Amazon Lex, Google Dialogflow, and IBM Watson are all services that can help with this. The challenge is not making them work, but operating them effectively in public spaces, which are notoriously difficult for voice-activated applications: background noise is high, and a variety of speaking styles must be processed at once. Thoughtful layouts and material choices can help minimize background noise, and small interactive pods with a narrow-angle microphone could be a valuable solution.
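The intent-to-command pattern described above can be illustrated with a toy example. The real services (Amazon Lex, Google Dialogflow, IBM Watson) use trained models rather than keyword matching; this sketch only shows the control flow – extract an intent from a sentence, then map it to an action – and every intent name and keyword here is invented.

```python
# Hypothetical intents and their trigger keywords (illustration only).
INTENT_KEYWORDS = {
    "show_product": {"show", "see", "view"},
    "add_to_cart": {"buy", "add", "purchase"},
    "get_help": {"help", "assist"},
}

# Each recognized intent maps to a computer command (stubbed as strings here).
ACTIONS = {
    "show_product": lambda: "displaying product",
    "add_to_cart": lambda: "item added to cart",
    "get_help": lambda: "calling an assistant",
}

def extract_intent(sentence):
    """Pick the intent whose keywords overlap the sentence the most."""
    words = set(sentence.lower().split())
    scores = {intent: len(words & kws) for intent, kws in INTENT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def handle(sentence):
    """Turn an utterance into an action, with a fallback for unknown input."""
    intent = extract_intent(sentence)
    return ACTIONS[intent]() if intent else "sorry, I did not understand"
```

In a deployed kiosk, `extract_intent` would be replaced by a call to one of the hosted NLU services; the surrounding intent-to-action dispatch stays the same.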
Continuous drive for innovative change
Touchless consumer behaviors were already on the rise and have exploded during the Covid-19 pandemic: online grocery shopping, Zoom and Skype for virtual communication and business development, online chats for customer service, and contactless payment. Consumer needs are driving what brands do in their marketing.
Sustainability was forced upon brands because consumers wanted to see companies being responsible and doing the right thing for future generations. We are going to see a similar dynamic in the wake of this pandemic: brands will need to show consumers they are doing their bit to preserve both their own safety and the safety of others. These new needs and behaviors will shape our “new normal” – a low-touch, high-tech world.
At large events, we may see venues tracking their attendees for everyone’s safety. Restricting attendance to set days and reducing the number of exhibitors are other options for better managing large-scale events as we move into a post-coronavirus world. And shaking hands as a greeting is sure to give way to a nod, a smile, or a wave – a touchless interaction intended to evoke the same sentiment.
Sebastien Jouhans, creative technology director, Genuine X, Jack Morton