The End of Navigation, Part Two

Jim Scully | Jan 11, 2018

In Part One of this two-part blog I shared some reasons why voice recognition in combination with artificial intelligence (AI) is likely to go mainstream in the workplace and eventually replace much of the on-screen computer navigation we perform today. In this part, I will shift focus to the application of voice recognition in the digital employee experience.

I’ll share my thoughts in the form of a futuristic but plausible story of a day in the life of Autodash Inc., a hypothetical manufacturer of voice-activated components for self-driving cars: sound systems, climate controls, monitoring gauges, and navigation systems.

“Self Driving Car” — Photo by Aral Tasher on Unsplash

A Day in the Life of Autodash, Inc.

At 6 a.m. Monday morning Shirley Rich, Autodash’s CEO and founder, approaches the front entrance of the company’s shiny new headquarters building. “Hi Door,” says Shirley, “Please open for employee zero zero zero one.” Matching Shirley’s unique voice to her employee number, the voice-activated door promptly slides open. A pleasant, cheerful voice replies, “Good Morning Ms. Rich. Would you like me to activate your office?”

“Yes, please,” replies Shirley, smiling because the voice resembles that of her daughter, who is away at college halfway around the world. Shirley felt a little selfish asking the Autodash programmers to model the door’s voice after her own daughter’s, but she’s looking forward to the upcoming enhancement that will allow all employees to customize the voice to their own preferences. She winces at the thought of workers configuring their preferences to imitate, well, whomever, imagining the voice of Homer Simpson welcoming employees to the building. But she realizes the important thing is that the computer accurately recognizes employees’ unique voices, and if being greeted by Homer Simpson lightens an employee’s morning, who is she to deny that pleasure?

 
 

The lights in Shirley’s office are already on when she enters. The computer monitor is glowing, indicating it’s ready to receive instructions. “Computer,” says Shirley, as she approaches her desk, “display my calendar for today.” Immediately, her daily calendar appears on the screen. Seeing that she has a meeting in ten minutes with her Production Chief, Macon Quigley, and realizing it will take her ten minutes to reach his office in the production area, she curses herself briefly for not having prepared for the meeting over the weekend.

“Computer, print out the daily sales and production report,” says Shirley. The printer in her office jumps into action. Grabbing the sheet off the printer, Shirley instructs her smart phone to read Macon Quigley’s email from Friday afternoon. Walking briskly and silently greeting passersby, Shirley listens to the email through her earbuds while glancing at the report in her hands. A practiced multi-tasker, she quickly absorbs the information in preparation for her meeting with Macon.

Shirley enters Macon’s office to find him smiling broadly while listening to a voice flowing from his surround-sound audio system, “Mr. Quigley, the current value of your 401(k) account is two million eight hundred fifty thousand dollars and twenty-nine cents.” This is a morning ritual for Macon. He’s heavily invested in Autodash stock and loves hearing his increasing wealth recited to him. He calls it his morning motivational.

“You better not be thinking of retiring, Macon,” teases Shirley on cue.

“Sorta depends on how our meeting goes, Shirley,” quips Macon. “Did you get a chance to read my email? Sorry about sending it so late on Friday.”

“I read it,” replies Shirley as she settles into a chair and eyes him knowingly. Macon instructs his office assistant to display Autodash’s monthly sales and production figures. “No need,” says Shirley, “I’ve already read the report.” From Macon’s email Shirley has already surmised that he’s going to make a case for building a new factory, which will cost billions and require Board approval. This is a big and risky decision. Shirley knows the risk of long-term investment because of the lightning speed of technological disruption. Leaders can become laggards in the blink of an eye. She braces for what she knows will be an intense discussion. With Macon they almost always were.

As Macon and Shirley begin their meeting Macon’s office assistant interrupts, “Third shift is now ending.” Macon has configured his calendar to make this announcement in case he needs to catch up with one of the supervisors on their way out the door. But today there’s no need, so Macon ignores the prompt.

Photo by chuttersnap on Unsplash

Meanwhile, third shift workers are pouring into the parking lot. For Norm Bossey, the third shift production manager, it was a particularly long night because he started feeling ill at around 3 a.m. A flu bug, he guesses. Norm prides himself on being healthy and not missing a day of work since joining Autodash three years ago. For that reason, Norm has never had to use Autodash’s medical benefit plan. He realizes he doesn’t even know where to start. So, while walking to his car Norm speaks into his smart phone, “This is Norm. I’m feeling ill.”

“Sorry to hear that, Norm,” replies a voice. “Do you need to see a doctor?”

“How much will it cost me?” replies Norm.

After a few seconds the voice replies, “Your medical plan requires a $30 copay per doctor visit. I see you chose a primary care physician at enrollment. Do you want me to call for an appointment?”

“Yes, please,” says Norm, a little surprised that he actually has a doctor. Within seconds, Norm is speaking to the receptionist at the doctor’s office. After confirming his appointment for later that day, Norm again speaks into his smart phone, “Send a message to Macon Quigley.”

After sending a message to Macon that he’s ill and may not be in the office the next day, Norm straps on his seat belt and tells his self-driving car to drive him home.

“Sure thing, Norm,” replies a friendly voice through the car’s speaker, which was made by Autodash. “We should arrive in twenty minutes,” continues the voice. With this, Norm leans back and closes his eyes as the car begins to move toward the exit.

Robot — Photo by Dominik Scythe on Unsplash

This story may be fictitious, but it’s not science fiction. The technologies described already exist but have yet to go mainstream. Not only does the story describe how voice recognition and basic artificial intelligence might exist in the workplace, it illustrates some of the tangible benefits. Voice recognition can enhance and simplify security procedures by replacing passwords and security badges with voiceprints. Productivity can be increased by offering new ways to multi-task, liberating workers from computer keyboards and monitors. The story also shows how employee self-service can be increased and enhanced by turning transactions into interactions. And, finally, it shows how the employee experience can be enhanced by making these transactions positive.

Most importantly, the Autodash story illustrates how voice recognition allows users to interact with computers without on-screen navigation, offering potential benefits beyond the user experience. Application developers put great energy and resources into making user navigation intuitive, since applications that are difficult to use are doomed to failure. But adding features tends to increase navigation complexity. Thus, developers are in a constant struggle between functionality and usability. Voice recognition and AI have the potential to remove this constraint by removing navigation from the user experience. The best way to simplify navigation is to eliminate it entirely. This, in my opinion, is the greatest potential for AI-powered voice recognition. Remember HAL in the movie 2001: A Space Odyssey?
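For readers curious what “eliminating navigation” looks like under the hood, the core idea is often called intent routing: the assistant matches a recognized utterance directly to an action, with no menus or screens in between. Here is a deliberately minimal sketch of that pattern; every name and handler in it is a hypothetical illustration, not a real Autodash or vendor API.

```python
# Minimal sketch of intent routing: map a recognized utterance straight to
# an action instead of making the user navigate menus. All handler names
# and responses below are hypothetical stand-ins for real back-end calls.

def route(utterance, handlers):
    """Return the response of the first handler whose keyword appears
    in the utterance, or a fallback message if nothing matches."""
    text = utterance.lower()
    for keyword, action in handlers.items():
        if keyword in text:
            return action(text)
    return "Sorry, I didn't catch that."

# Hypothetical intents drawn from the Autodash story.
handlers = {
    "calendar": lambda t: "Displaying today's calendar.",
    "print": lambda t: "Printing the daily sales and production report.",
    "doctor": lambda t: "Calling your primary care physician.",
}

print(route("Computer, display my calendar for today", handlers))
```

Real systems replace the keyword match with a trained natural-language model, but the shape is the same: speech in, action out, no navigation in between.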

Of course, despite the title of this blog, it is highly unlikely that voice recognition will completely replace navigation, just as navigation will never replace programming code as the permanent “raw material” of computing. Practically speaking, I can’t imagine a graphic designer or interior designer using voice instead of screen navigation. Clearly, there are proper and improper use cases for voice. Voice will replace navigation for many, but not all, use cases.

With that, I’ll close this blog with other ideas for enhancing the employee experience using voice recognition technology:

  • In-the-moment time and expense reporting

  • Assorted voice-initiated self-service HR transactions

  • Real-time performance recognition and feedback

  • Interactive digital new hire onboarding

  • Email/memo dictation and read-back (prior to hitting send to detect unintended tone or ambiguity)

If you’ve got other ideas, I’d love to hear them by emailing me at jim.scully@leapgen.com.

______________________________________________________________

ABOUT THE AUTHOR

Jim Scully is Leapgen’s HR Delivery Transformation practice leader. He has spent the last 22 years in the field of HR service delivery, both as a consultant and corporate practitioner. Jim’s primary focus is designing and implementing HR service delivery models that achieve business results through technology-enabled process excellence. Jim founded the HR Shared Services Institute (HRSSI), whose services are now part of Leapgen.

ABOUT LEAPGEN
Leapgen is a global digital transformation company shaping the future of work. Highly respected as a visionary partner to organizations looking to design and deliver a digital workforce experience that will produce valued outcomes to the business, Leapgen helps enterprise leaders rethink how to better design and deliver workforce services and architect HR technology solutions that meet the expectations of workers and the needs of the business. Contact us to get started.