More artificial intelligence, fewer screens: the future of computing unfolds

in #technology · 7 years ago

[Photo: rainbow hand (cropped), by Joe McKendrick]

We are approaching the day when user interfaces and user experience (UI and UX) will mean much more than working through screens on devices. They may not involve screens at all.
That's the word from Accenture, which spells out, in a recent report, the rise of AI as the new purveyor of UI and UX. Developments such as autonomous vehicles and voice-activated home assistants are just early examples suggesting that more screenless computing is on the horizon.

This has implications for the way enterprise users work, as well as customers. Already, there's plenty of talk -- and pilots -- involving the use of connected tools and wearables in the workplace that serve to augment employee tasks.

The report's authors make three predictions:
"In five years, more than half of your customers will select your services based on your AI instead of your traditional brand."
"In seven years, most interfaces will not have a screen and will be integrated into daily tasks."
"In 10 years, digital assistants will be so pervasive they'll keep employees productive 24/7/365, operating in the background for workplace interactions, like creating video summaries right after an important meeting."
Accenture's findings are based on a survey of 5,400 executives across the globe. "Moving beyond a back-end tool for the enterprise, AI is taking on more sophisticated roles within technology interfaces," the report's authors state. "From autonomous driving vehicles that use computer vision, to live translations made possible by artificial neural networks, AI is making every interface both simple and smart -- and setting a high bar for how future interactions will work."

In the survey, 79% of executives agree that AI will help accelerate technology adoption throughout their organizations. In addition, 85% indicate they will invest extensively in AI-related technologies over the next three years.

The Accenture authors cite a prime example of where AI is making its first inroads into enterprise UI and UX: voice-activated systems. "Advances in natural language processing and machine learning make technology more intuitive to use, like telling virtual assistants to schedule a meeting instead of accessing scheduling software to find a time, create an event, and type the details," they state. "AI already plays a variety of roles throughout the user experience. At the simplest level, it curates content for people, like the mobile app Spotify suggesting new music based on previous listening choices. In a more significant role, AI applies machine learning to guide actions toward the best outcome."
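To make the "tell the assistant instead of opening the scheduling software" idea concrete, here is a minimal, illustrative sketch (not from the Accenture report) of how an assistant might turn a spoken or typed request into a structured calendar event. The names parse_meeting_request and MeetingRequest are hypothetical; a real assistant would rely on a full natural language processing service rather than a single regular expression.

import re
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class MeetingRequest:
    attendee: str
    start: datetime

def parse_meeting_request(utterance: str, now: datetime) -> Optional[MeetingRequest]:
    """Tiny rule-based intent parser for requests like
    'schedule a meeting with Dana tomorrow at 3pm'."""
    match = re.search(
        r"schedule a meeting with (\w+) (today|tomorrow) at (\d{1,2})\s*(am|pm)",
        utterance.lower(),
    )
    if not match:
        return None  # intent not recognized; a real assistant would ask a follow-up question
    attendee, day, hour, meridiem = match.groups()
    hour_24 = int(hour) % 12 + (12 if meridiem == "pm" else 0)
    date = now.date() + timedelta(days=1 if day == "tomorrow" else 0)
    start = datetime.combine(date, datetime.min.time()).replace(hour=hour_24)
    return MeetingRequest(attendee=attendee.capitalize(), start=start)

if __name__ == "__main__":
    request = parse_meeting_request(
        "Schedule a meeting with Dana tomorrow at 3pm", datetime.now()
    )
    print(request)  # MeetingRequest(attendee='Dana', start=...)

The point of the sketch is the interface shift, not the parsing technique: the user states an intent in plain language, and the system produces the structured data a calendar API would need, with no screen-based form in between.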

The leading enterprise technology vendors are also looking to AI as the future of computer interfaces -- "from Salesforce Einstein, to Microsoft Azure Cognitive Services, to the Google Cloud Platform." There are also open source AI platforms available -- "from Google's TensorFlow to Intel's Trusted Analytics Platform. Caffe, a deep learning framework developed at the University of California, Berkeley, was the basis of the Deep Dream project Google released in 2016 to show how their artificial neural networks viewed images."
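As a rough illustration of how accessible these open source frameworks have become, the following is a generic TensorFlow/Keras example (assumed here for illustration; it is not code from the Accenture report or from the Deep Dream project) that defines and briefly trains a small neural network on the bundled MNIST digits dataset.

import tensorflow as tf

# Load the MNIST handwritten-digit dataset that ships with Keras
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixel values to [0, 1]

# A small feed-forward classifier: 784 inputs -> 128 hidden units -> 10 digit classes
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=1)  # one brief training pass
model.evaluate(x_test, y_test)         # report accuracy on held-out digits

A couple of dozen lines like these, running on freely available tooling, are what the report means by the "ready availability of open source tools."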

The combination of "intuitive, natural interactions and the ready availability of open source tools paves the way for big changes across the interface," the Accenture team adds.

How to get started on this artificially intelligent, screenless journey? In an accompanying article at Tech Emergence, Paul Daugherty, chief technology and innovation officer at Accenture, describes the actions companies that want to explore AI-driven UX applications need to take:

"Take existing communication channels and determine how these can be made smarter -- using inspiration from other successful conversational interface or voice interface applications."
"Look at every customer and employee interaction and ask yourself how they can be improved through AI."
"Look at new interfaces beyond the screen and consider how new channels can enable multidimensional conversations."


Some time from now, the word "interface" will have a completely different meaning. We will remember touchscreens and keyboards with nostalgia as we use technology that responds to our words, movements, and even thoughts. The integration of such devices into our everyday lives will be so deep that we will no longer be able to distinguish one computing system from another: our home devices will be part of the same grid, which will be part of a larger network, which will be part of the global web, and so on. That raises a host of questions about our human nature, but to me the most important one is this: is human-computer integration via brain-computer interfaces and AI actually the next step in human evolution? Or is it a bad direction that will lead to even greater problems for mankind?

