Insights
- Businesses face a rising tide of mistrust in AI and must respond.
- Communities outside the tech industry are more inclined to trust AI when they understand what it is and what role it plays.
- Children and young people are growing up with AI and approach it very differently from adults.
- Good product design can help users accept AI.
- Challenges remain but there is cause for optimism.
Artificial intelligence (AI) has a trust problem. Despite work across businesses, including at Infosys, to approach this transformative technology in an ethical way, many are skeptical or even downright suspicious about AI.
In many ways this is not surprising, and businesses using AI must take notice of it. Concerns range from the impact on jobs to fears that creative professionals — writers, artists, musicians and more — will have their livelihoods taken away. Creative professionals are also up in arms about their work potentially having been scraped by big AI companies to train their models, and as a result, many in the creative sector are actively hostile to AI.
Other concerns include fears about bias creeping into automated decisions as a result of both opaque algorithms — the “black box” problem — and bias in the underlying training data feeding into those automated decisions.
Dr Jeni Tennison OBE is the founder of Connected By Data, a UK-based not-for-profit that advocates for a wide range of stakeholders in AI, and she has been working with communities and organizations across the UK. Speaking to Infosys’s AI Interrogator podcast, she described the positive impact of running a “people’s panel” as an event linked to the UK’s AI Safety Summit in 2023.
“We had 11 people from all walks of life coming in and having conversations with experts, and talking about their hopes and fears around [AI]. Some were very familiar with it, some people not familiar, lots of people with concerns and fears. As we dug into the details with them, they became more confident while retaining their critical thinking around AI. It was amazing to see … it means they felt in control. They’re more open to it than if they remain in this place of ignorance and fear.”
More broadly, Tennison stressed the need for transparency, and what she described as “a more public AI.” She distinguished between openness, “which means that stuff is out there and anyone can use it, and transparency, which means that some people who need to see the detail are able to see the detail.”
Phaedra Boinodiris, IBM Consulting’s global lead for trustworthy AI, also stressed the need for a wide range of stakeholders to be heard when building AI systems. She told the AI Interrogator podcast that at present, “there is this teeny-tiny homogeneous group of people determining what datasets are used to train our models.”
Boinodiris said she has focused on trust since early in her career, “since the get-go”, and realized that earning trust “wasn’t strictly a technical problem at all, but something that was truly sociotechnical.” This, she added, means “having those teams truly be diverse and inclusive, not just in terms of gender, race, ethnicity, neurodiversity, etc., but also in truth, different lived world experiences. And then making sure those teams are multidisciplinary in nature.”
Key to building trust in AI is including young people. Boinodiris works with middle school and high school children in the US and wants them to be taught about AI not just in computer science classes, but also “that it’s actually taught in civics classes.” This, she said, means conveying “the importance of AI literacy in a holistic fashion so that people understand.”
Courtney Batiste of Cisco is also working with young people through her Texas-based initiative, The Batiste Project. She pointed out that while for adults, AI is a novel technology, children are growing up with it. She told the AI Interrogator that children are “a little bit more open-minded to it. They’re not exposed to some of the history that we have seen or been a part of.” She added: “What I’m getting from the kids it’s more about ‘how can I use it? How can I speed up my work?’.”
For older children, she said, “I have more students now trying to find degree plans around AI because they have such a vested interest and they want to build, they want to add to the platform as well as take, and build their own.”
In the UK, Maggie Philbin, the founder and CEO of TeenTech, a charity that works to connect young people with STEM careers, is very clear that to build trust, “we have to be very, very, very inclusive. I think it can be very dangerous if you only discuss the ethics underpinning AI with a small group of vested interest stakeholders.”
Speaking to the AI Interrogator, she added: “If you are not careful, you produce a system that works well for an elite group of people — and it hasn’t been designed for someone who’s 84, or a young person in Oldham [a deprived town in the north of England].”
TeenTech gathers data on young people’s views at its events, and, said Philbin, those schoolchildren are positive about AI. “We use voting buttons; we get hundreds of responses at once around how they feel about what they’re learning about AI, and does it leave them feeling positive towards it, neutral, or negative. And the vast majority are either very positive or neutral, and there are some who feel negative, but it’s moving in the direction of feeling positive.”
Businesses can then perhaps look to a future where AI is not greeted with fear and mistrust, but with informed optimism as today’s young people become adults. In the shorter term, however, organizations are rolling out AI functions in products, services, and workplaces, and it is imperative that their users, customers and employees are able to trust AI.
The AI Interrogator spoke to Valentina Proietti, head of design at Wongdoody, where she leads a team that is dealing with the here-and-now concerns about how to build products that use AI. Proietti described working on a customer service and support project for a telecom client that was looking at how to help the company’s users get the Wi-Fi in their homes right. This threw up a question of transparency, said Proietti. “We were debating if we had to tell users that they were using an AI: We were exploring the ethics around that. What do we tell users? Do we need to tell them they’re using an AI, or do we simply offer them a solution that works for them?”
The customer support product “was clearly not a human interaction,” explained Proietti — it was a voice-activated product. “Because there was quite a lot of transparency about the fact that it wasn’t a human, we decided not to tell them that it was an AI.”
But doesn’t that lack of transparency erode trust? Proietti said not: “It came out that the interaction and how the AI was solving their problems created a lot of value. And that’s when we understood that actually, consumers didn’t care if there was an AI or there wasn’t an AI — they only cared about the fact that it solved their problems.”
She added that trust came not so much from disclosing that the support agent was an AI, but from the experience as a whole: “trust is something that is a case-by-case basis based on the brand, based on the experience. The trust is there when the value is there.”
However, she added, businesses need to “really understand if it is the right solution, if it is the right technology.”
Businesses must play an active role in building trust not just in their products and services, but more broadly, by having the right leadership in place and implementing the right ethical structures. “Implementing it really needs a framework,” added Proietti. “It needs governance, it needs understanding the data, it needs testing, a lot of testing.” Leadership is a challenge, too, she said: “The leadership in AI is definitely lacking. No one is really in charge in many companies.”
Trust will remain a challenge for businesses as the AI era progresses. However, there is room for optimism: Ensuring a wide range of voices is listened to, and a thoughtful, ethical approach that is led from the top of organizations, should help dispel some of the fears and suspicions.
