Making AI human-centred

Published on 12/12/2024 | Written by Heather Wright



And getting the agents in on the act…

It’s time to think about the digital workplace in a different way.

That’s the message from Forrester’s Sam Higgins, who says the concept of the digital workplace is long overdue a revamp – and AI is driving home that requirement.


In a recent Forrester and Slack webinar on forging experiences that matter with humans and AI, Higgins, a principal analyst with Forrester, called for companies to take a human-centred approach to AI adoption.

He outlined Forrester’s belief that there are six core components of a digital workplace: employee-facing apps and HR management tools; core productivity platforms spanning collaboration, end user computing services and enterprise applications; an intelligent orchestration layer; experience touchpoints; analytics; and application integration.

“We need to bring more of the employee services up into these core productivity platforms, and we need to bring core productivity platforms together and orchestrate them much more intelligently,” Higgins says.

He contrasted the modern digital experience platform with the environments seen in typical CRM platforms, where tabs provide easy access to the richness of the platform’s core content.

“If we think about that in the context of a digital employee environment, that’s what we need to do.”

Context-sensitive AI assistants, rather than generic GenAI assistants, will provide a much better experience than just providing ‘some generic technology’. These assistants can pop up when they notice, based on a user’s activity, that the user is attempting, or is about to attempt, a particular task.

“Just throwing this technology [AI] onto the employee is not a great way to deliver an effective employee experience,” he says.

Research, however, suggests that many companies are struggling to bring employees along on the AI journey.

Christina Janzer, senior VP of research and analytics and head of Slack’s Workforce Lab, says that while there is a sense of urgency among executives to incorporate AI – it features as the number one priority in Slack data and has increased seven-fold over the last six months – actual usage data shows only a third of desk workers are using AI.

“Some of this is just we need more time for people to adopt it, understand it and start to integrate it into their working lives. But we also see some real barriers getting in the way of people adopting this technology,” Janzer says.

Higgins says while there are plenty of barriers, including some technical ones, the human barriers are key: skills, understanding, changing the way work works and whether people are ready for those changes.

Forrester’s AI quotient (AIQ) model identifies four core competencies companies need to manage when thinking about adopting AI into their systems.

Making sure people understand AI – not just know about it, but actually understand it – leads the list.

“Yes it might look like doing magical stuff, but it’s really just complicated maths. Giving people that understanding, including of the difference between deterministic [with a single, predictable outcome] and probabilistic computing.”

Then there are the soft skills and inclinations people might need, such as knowing when to question the outputs.

Hard skills and training, including how to create prompts and use AI, are also required, along with ethics, risk and privacy awareness.

Forrester research shows 45 percent of those surveyed say they know what prompt engineering is and how to use it.

Higgins and Janzer say that’s ‘probably a little high’.

“You don’t know what you don’t know,” Janzer notes.

When Forrester segmented respondents into AIQ groups – high, medium or low – it found just 47 percent ranked as high AIQ.

“Now that might be the one-third in your organisation who are actively using it, so your exposure to mistakes may not be high. But for a lot of organisations… you are not going to know whether that one-third using AI are someone with low AIQ or someone more sophisticated.”

When that segmentation was overlaid on a question about whether respondents knew when to question the output of AI, only 16 percent of those in the low AIQ group (which accounted for 21 percent of all respondents) were ready to question the output.

“And we know, with GenAI particularly, that it is not deterministic and we need to check what comes out,” Higgins notes.

Higgins noted another caveat: research also showed leaders were far more likely to believe they had trained their teams in the technology than employees were to say they had been trained.

“There’s a big disconnect there – a 15 percentage point gap.”

Janzer says Slack figures confirm that lack of training and enablement, with only 15 percent of workers surveyed feeling adequately trained to use AI effectively.

She recommends microtraining and, even more importantly, getting teams to just try AI out, experiment and get their hands dirty. She suggests companies follow the path taken by Slack itself and set an expectation that employees spend 15 minutes a day picking three things from the bottom of their to-do list and experimenting with how AI can help.

“It gets people more comfortable trying it and experiencing it, recognising what it’s good for and what it’s not good for, and starting to trust the technology a bit more.”

The move to agentic, or agent-based, AI will potentially address many of the challenges currently being experienced, Janzer says.

“What I’m seeing in our data is we are putting so much onus on individuals to figure out how to use AI. We are not training them, giving them guidance. We’re just saying ‘here is this amazing technology, figure it out’.

“That is holding people back and the thing I love about agents is they are very well defined.”

By definition, an agent must be given structure: it is told exactly what data it has access to, what role it is playing and what it can do, defining the part it plays in a team.

“You are taking away so much ambiguity and providing a very clear role to the team. They now know exactly how to interact with it, what to go to it for,” Janzer says.
