
Can AI help advisers practise difficult conversations?

By Amanda Newman-Smith

Difficult conversations go with the territory for financial advisers. Helping clients live the lives they want is immensely rewarding, but addressing emotive issues that run alongside this, such as death, poor health and inheritance, can be tricky.

Some advisers are also business owner-managers who employ other people, so they may need to have difficult conversations with employees while wearing the boss’s hat.

The problem with difficult conversations is that we cannot predict how they will pan out. We might have an inkling based on what we know about the other person and how previous conversations with them have gone.

We might even run through the ways the conversation is likely to play out and think about what we will say. But if the other person is coming at it from a completely different angle, the conversation can easily take a different turn from the one we expect.

Is this an area where AI training tools can make a difference?

Virtual humans

Soft skills are often taught and practised through face-to-face role-playing, but there are drawbacks. These sessions can be time-consuming, taking people away from other tasks they need to be doing.

People can feel anxious and intimidated about role-playing because they do not want to make mistakes in front of colleagues and senior managers. Getting people to act out specific roles, such as the classic ‘difficult customer’, will not necessarily replicate realistic situations because people might ‘overdo’ elements of their behaviour while in character to make a point.

Game-based simulation training company Attensi has developed an AI-based role-playing solution, RealTalk, to get around those issues.

Using agentic AI – a form of AI that makes autonomous action and decision-making possible – RealTalk enables difficult conversations to take place with AI-powered virtual humans in a wide range of workplace scenarios, from the management of employees to sales training.

The company sees demand for this after its research found that 75% of employees had experienced anxiety over difficult workplace conversations and 72% would welcome training on how to handle them.

Many employees – particularly younger ones – said they would prefer role-play training from an AI virtual trainer rather than their manager. But the company is clear that RealTalk is a time-saving addition to face-to-face training rather than a replacement for it.

A safe space

Attensi creative director Justin Blanchard says there are roles where making a mistake can have big consequences – hence the need for managers of those individuals to have difficult conversations with them. Having the confidence to do this does not automatically appear when someone becomes a manager, so training is often needed.

“But role-playing in front of other people can be humiliating if you get it wrong,” says Blanchard. “So, we have built a system, a safe space for people to learn how to have difficult conversations.”

RealTalk’s AI characters are designed to say or do things that make for a challenging conversation, just like real people. They may, for example, be extremely busy – looking over their shoulder at colleagues or glancing at their watch – then saying they are a bit pushed for time.

In that case, it can be tempting to respond by saying you will catch up with them another time – which will not earn you great feedback at the end of the module.

“You have to resist the urge to leave it. The worst thing you can do is chit-chat and not get to what you need to say,” says Blanchard.

Commentators in the financial advice and fintech markets believe there are great benefits to using AI in the context of training.

“It certainly has its place for building confidence and awareness of issues advisers may come across, and giving them a ‘no pressure’ environment to see how they would handle it,” says Twenty7tec chief executive James Tucker.

However, Tucker adds that we must not lose sight of the nuance of human behaviour.

“AI cannot account for every scenario or customer type, which is why shop-floor shadowing is important for both senior and junior advisers, and should still be prioritised, even if they are doing AI training too,” he says.

Drawing the line

FTRC founder and Money Marketing’s long-standing technology columnist Ian McKenna says that, to be effective, he would expect AI in this field to be capable of at least passing the Turing test – a test of whether a machine’s conversational responses are indistinguishable from a human’s.

“It’s an interesting conversation – could you train a bot to be good enough to deal with a whole load of sales objections from somebody about buying a pension or a life-insurance product?” says McKenna.

However, McKenna wonders whether a client who has so many objections to a financial product would be a ‘lapse waiting to happen’.

McKenna says advisers will want to sell to people who genuinely understand the need.

“There is a very clear line I would not want to cross in terms of being excessive in the sales process,” he says. “So perhaps I’d pivot it and ask whether you use an AI to examine if you are presenting your arguments in a sufficiently compelling way?”
