Image source: Gfycat
In domains ranging from gaming to ecommerce, augmented-reality, virtual-reality, and mixed-reality applications have grown notably in popularity over the last few years.
UX designers are no longer limited to designing screen-only user interfaces and are now experimenting with more physical interactions.
Voice as the Primary User Interface
Voice user interfaces (VUIs) let users control applications and devices through voice commands, as shown in Figure 3. They are revolutionizing the way users interact with products. For example, people have been using voice recognition for several years now, through assistants such as Alexa, Cortana, and Siri and devices such as Google Home. VUIs are gradually taking a greater part in facilitating users’ daily, mundane tasks, whether setting an alarm or scheduling an appointment.
Image source: Jifo Vj’s “Voice Enabled Future of AI Watch,” on Dribbble
As VUIs demonstrate, the future of user interfaces isn’t restricted to physical screens. According to a survey by Adobe Analytics, digital assistants and smart speakers are becoming ever more popular with users and will continue to gain ground in the future.
Nevertheless, current voice assistants have some limitations. Because they don’t take human emotions into consideration, users feel they’re communicating with a device, not a real human being. But we’ve seen powerful, voice-based interactions in films such as Iron Man, with J.A.R.V.I.S., and Her, with Samantha, and it’s clear that, in the not-too-distant future, voice assistants will serve as companions to users.
Artificial Intelligence
Artificial intelligence (AI) is a field of computer science that facilitates the development of smart computers that could ultimately behave like humans. An AI system has the ability to think, learn, and perform tasks autonomously, as shown in Figure 4. It learns from human behavior within the context of its surroundings, just as UX researchers learn by observing users perform their tasks.
Image source: Gleb Kuznetsov’s “Facial Biometrics Recognition,” on Dribbble
With the rise of chatbots and voice assistants such as Siri, Cortana, Google Assistant, and Alexa, AI is revolutionizing our interactions with digital devices. In addition to enhancing human-device interactions, AI has also contributed to predictive product design by analyzing historical and current data to predict users’ future behavior and reduce the effort necessary for them to complete their tasks.
AI is also helping UX designers analyze the abundant data that researchers glean from UX research, letting them invest more time and effort in design than in data analysis. AI-powered tools are also helping to increase sales and user engagement.
Tech giants such as Google, Microsoft, and Amazon have already begun exploring the power of AI to enhance the user experience. In the future, most industries will implement AI to enhance the user experience in some way.
Zero UI
Have you ever imagined how a user interface (UI) would look if there were no device screens, as Figure 5 depicts? The movement away from screens toward Zero UI is not entirely new. In recent years, we’ve seen Zero-UI experiences such as Amazon Echo, Microsoft Kinect, and Nest. All of these products have features that support the Zero-UI concept, including haptic and gestural control, voice control, and artificial intelligence.
Image source: Sebastian Scholz, Google Home and Nuki Smart Lock, on Unsplash
With Zero UI, the user’s actions, speech, glances, and even thoughts and feelings could cause the system to respond. Rather than clicking buttons, typing long strings of text, and tapping objects on a screen, users input information by means of speech, gestures, and touch, shifting their interactions away from the manipulation of physical devices.
Gestural Interactions
Do you remember when Apple launched the iPhone? It revolutionized the way we interact with mobile devices, with not just taps, but swipes, pinches, and zooms, as Figure 6 shows. With ever-increasing amounts of content on digital devices, it has become very difficult for UX designers and developers to display all of it on small screens. That’s where gestures come to the rescue. Designers can use progressive disclosure to temporarily hide unnecessary buttons and content, making a user interface much cleaner and more interactive.
Image source: Gfycat
Gestures are not unique to digital devices. Humans have always used gestures in their daily lives, and now we’re using them to interact with physical products such as faucets and soap and towel dispensers. Whenever companies release new gestural interfaces, there’s a learning curve for users, but these interfaces are very effective and efficient once people learn how to use them.
Gestures are key to Zero-UI digital products. In the future, we’ll see more new gestural interactions.
3D Graphics
We’ve long seen 3D user interfaces in video games and movies, but the rise of augmented reality, virtual reality, and mixed reality in digital products is pushing UX designers to experiment with and implement 3D interfaces for real human interactions.
3D graphics add another dimension to the look and feel of a user-interface design. Although creating 3D graphics takes a lot of expertise and time, UX designers can customize them to their needs.
Image source: Sebastian Stapelfeldt’s “Floating UI,” on Dribbble
Device-Agnostic User Interfaces
User experiences should be seamless and functional, regardless of the device a user is currently using. With the rise of mobile and wearable devices in the past few years and the pace at which augmented-reality, virtual-reality, and mixed-reality technologies are growing, device-agnostic design will be in high demand in the future. Therefore, it will be necessary for UX designers to rethink their approach to designing for diverse devices.
Image source: Paperpillar, on Dribbble
Design for Accessibility
There are millions of people with disabilities in the world, including people with vision, speech, hearing, cognitive, physical, and mental-health disabilities. As UX designers, we must address their needs when designing products.
Designers should put themselves in users’ shoes, interview people with disabilities, and ask them about their pain points when using technology products and what features could help them use digital products more effectively.
Many companies have become more aware of the importance of accessibility in the past few years and are designing products with accessibility for all in mind. There are now many features and products that help people with disabilities accomplish their day-to-day tasks. In the future, better user experiences will make such products even easier for people with disabilities to use.
Conclusion
The evolution of technology is unending. Over time, it has brought many significant changes to human lives, and the ways in which users interact with products have evolved continually along with technology. While design trends come and go, what really matters are the principles behind UX design and our ability to redefine the user experience with the help of technology.
While I’m not sure that all of these predictions will come true, technology and user experience are evolving so quickly today that we won’t have to wait long to see whether they become reality.
References
Durrani, Khalid. “UX Design Trends to Watch for in 2019.” DZone, March 13, 2019. Retrieved August 21, 2019.
Sparklin. “Yet Another Design Trends Article—on UX & Digital Possibilities of 2019.” Sparklin, on Medium, January 31, 2019. Retrieved August 21, 2019.
Kuznetsov, Gleb. “Designing Emotional Interfaces of the Future.” Smashing Magazine, January 23, 2019. Retrieved August 21, 2019.
Batchu, Vamsi. “How Gestures Are Shaping the Future of UX.” UX Collective, on Medium, April 30, 2019. Retrieved August 21, 2019.