As we move past touch-based interactions, the next generation of UI is all about touch-free control, which enables communication with a device through speech, gestures and even facial expressions. Touch-free controls are especially relevant in today’s virus-conscious world and promise a whole new level of engagement.
Gesture recognition is a complex area that uses sensors and cameras to capture hand movements, which then serve as input to the computer. Current advances make this recognition more context-sensitive, so devices can more accurately anticipate what a user wants. While gestures are hugely popular in the gaming industry through virtual reality (VR), augmented reality (AR) and mixed reality (MR) experiences, they are set to grow in the business world, too. Enterprise user interface experts must study the different types of gestures employed, including generational differences in the way devices are used, to deliver an easy-to-use and intuitive design.
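As a rough illustration of how tracked hand movements can serve as input, the sketch below classifies a sequence of hand positions as a swipe gesture. It assumes a sensor or camera pipeline already produces normalized (x, y) coordinates of the hand per frame; the gesture names, coordinate convention (y grows downward, as in screen coordinates) and threshold are illustrative, not any specific product's API.

```python
# Minimal gesture-classification sketch: turn a tracked hand path into a
# discrete command. Assumes normalized (x, y) points in screen coordinates
# (y increases downward); the 0.3 threshold is an illustrative choice.

def classify_swipe(track, min_distance=0.3):
    """Classify a sequence of (x, y) points as a swipe gesture, or 'none'."""
    if len(track) < 2:
        return "none"
    dx = track[-1][0] - track[0][0]  # net horizontal movement
    dy = track[-1][1] - track[0][1]  # net vertical movement
    if abs(dx) >= abs(dy):
        if dx > min_distance:
            return "swipe_right"
        if dx < -min_distance:
            return "swipe_left"
    else:
        if dy > min_distance:
            return "swipe_down"
        if dy < -min_distance:
            return "swipe_up"
    return "none"  # movement too small to count as a gesture
```

A real system would add temporal smoothing and many more gesture classes, but the core idea is the same: continuous sensor data is reduced to discrete, intent-bearing commands.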
Through the use of multiple applications, Infosys' Tennis Platform offers an MR HoloLens experience that provides a view of a futuristic tennis retail store and a VR-based tennis experience, giving fans the feel of a live environment. Users can also interact with the application through gesture and gaze commands such as air tap, gaze and head rotation, as well as through voice commands.
Natural user interfaces represent simplified human-machine interactions. These smarter interfaces arose with the advent of social channels and progressed as social media became the primary source of engagement for both business and leisure. As users moved from simple phones to smartphones and from desktops to mobiles and tablets, these interfaces kept pace to make using these devices as seamless as possible.
Touch user interfaces were early breakthroughs leading up to conversational AI, which has created a significant shift in the way we communicate with machines. With conversational AI, users give the device voice or text commands in natural language, eliminating the need for special commands or buttons. Chatbots and smart speakers are commonly used today to provide frictionless experiences with a device.
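To make the idea concrete, the sketch below maps a natural-language command to an intent using simple keyword matching, a deliberately minimal stand-in for the NLU models behind chatbots and smart speakers. The intent names and keyword lists are illustrative assumptions, not any vendor's schema.

```python
# Minimal intent-detection sketch: a keyword-overlap stand-in for the
# natural-language understanding step in a conversational interface.
# Intent names and keyword sets below are illustrative assumptions.
import re

INTENT_KEYWORDS = {
    "check_balance": {"balance", "funds"},
    "report_expense": {"expense", "receipt", "spent"},
    "get_weather": {"weather", "forecast"},
}

def detect_intent(utterance):
    """Return the intent whose keywords best overlap the utterance."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))  # strip punctuation
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent
```

For example, "What is my account balance?" resolves to `check_balance` without any special command syntax, which is the point of conversational interfaces; production systems replace the keyword match with trained language models.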
For an American bank, Infosys developed an app to enable on-the-go expense reporting with a conversational interface, replete with real-time updates and insights. The app fully supported natural language-based interactions.
Infosys developed a wearable-based stock trading app that combined haptic feedback and force touch for interaction. It also supported natural language-based conversations, so users could interact with the app without depending on visual cues.