AI UX Design

Designing an inclusive AI chatbot

We took a financial institution's existing AI chatbot and designed solutions to detect accents and adapt to assistive technologies. We also examined AI's limitations and advantages to start conversations across several departments about AI's future within the organization.
Three screens of an AI chatbot where the user is asking where the nearest ATM is and accessing language settings
My role
Lead UI and UX Designer
Timeline
1 month
Disclaimer
Some project details were omitted to protect client confidentiality.

Project outcomes

  • Usability testing revealed that the company's chatbot has an average task completion rate of only 60%, leaving clear room for improvement.
  • The chatbot's average error rate is 80%, and users rate the experience 3 out of 5.
  • Three of our solutions are currently being implemented: voice chat recognition of different English language accents, voice responses in the accent the user selects, and fixes for the screen reader problems surfaced during the usability test.
  • We estimate that once these accessibility and language fixes are implemented, the error rate will drop by more than 40%.

Project scope

Chatbots can be one of the most frustrating and inaccessible parts of a website, yet they have the potential to create a good user experience. The company's current chatbot has accessibility issues that must be addressed, and the purpose of this project is to identify those issues and provide solutions to remediate them.

Why do chatbots tend to be inaccessible?

It’s no secret that AI chatbots are gaining popularity, but they still pose challenges for people who use assistive technologies.

According to the Bureau of Internet Accessibility, some chatbots don't have a design that makes it clear how different elements are related. For example, a user should easily understand which button to click in response to a message.

Chatbot buttons often sit at the lower right of the screen, which is a challenge for screen reader users: they may not be able to reach the button easily because they can be forced to tab through the whole page each time.

Adding landmarks and skip links can solve some of these issues.

Screen readers also need to notify users of new chatbot replies and other conversation updates. An ARIA live region can announce these updates as they happen, and an aria-label can provide the context each control needs.
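As a sketch of those fixes, hypothetical markup for a chatbot widget could combine a skip link, a landmark, and a live region. All element names and classes here are illustrative, not the client's actual code:

```html
<!-- Skip link lets keyboard and screen reader users jump straight
     to the chatbot instead of tabbing through the whole page -->
<a class="skip-link" href="#chatbot">Skip to chatbot</a>

<!-- A landmark region with an accessible name makes the chatbot
     discoverable via screen reader landmark navigation -->
<section id="chatbot" role="complementary" aria-label="Chat with our assistant">
  <!-- role="log" implies aria-live="polite", so new replies are
       announced without interrupting what is currently being read -->
  <div role="log" aria-label="Conversation">
    <p>Assistant: How can I help you today?</p>
  </div>
  <label for="chat-input">Your message</label>
  <input id="chat-input" type="text" />
  <button aria-label="Send message">Send</button>
</section>
```

Because role="log" carries an implicit aria-live="polite", dynamically appended replies are announced to screen reader users automatically, without any extra scripting for notifications.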

Our process

Our objectives are:

  • Understand the challenges faced by users with different accents and disabilities when interacting with AI chatbots.
  • Identify how the chatbot's voice chat handles different English language accents.
  • Gather requirements from stakeholders, including accessibility goals and technical constraints.

Methodology

  • Usability testing: I planned and conducted testing on the current chatbot to see how its voice feature handles different English language accents and how users with assistive technologies navigate it.
  • Literature review: Reviewed articles and existing research on speech recognition, accent detection, and accessibility.
  • Competitive analysis: Analyzed existing AI chatbots and assistive technologies to identify gaps.

Usability test

Research questions:

  • How does the chatbot recognize and respond to various English accents?
  • What improvements can be made to enhance the chatbot’s accessibility for users with diverse accents?
  • What are the common accessibility issues faced by users with disabilities?

Methodology

I recruited seven participants with diverse backgrounds, accents, and accessibility needs, all of whom had experience using chatbots. Participants interacted with the chatbot by completing a series of predefined tasks, such as asking for information about home equity loans.

The key metrics for this test were task completion rate, error rate, and user satisfaction ratings.

After completing the tasks, the participants answered these questions:

  • On a scale of 1 to 5, with 1 being the lowest, how would you rate your experience using the chatbot? Explain your answer.
  • What would you change about the chatbot?
  • How would the AI chatbot help you with everyday tasks?

Usability test insights

  • Non-native English speakers with accents often faced significant barriers when using the AI chatbot during the test.
  • The average user experience rating for the chatbot is 3.
  • The average task completion rate is 60% with an 80% error rate.
  • One participant born and raised in Latin America recalled speaking English to a chatbot's voice chat, only for the chatbot to assume he was speaking Russian.
  • There was a need for the chatbot to detect accents, provide constructive feedback, and seamlessly integrate with assistive technologies such as screen readers and speech-to-text.
  • The chatbot updates dynamically, and sometimes during the testing, these updates were not properly announced for screen reader users.
  • Personalization is key to improving user satisfaction with chatbots; participants felt the current chatbot does not adapt to their needs.
"Until I started using several AI chatbots with voice mode, I didn't actually realize that having a familiar accent had any impact on such interactions with AI. Although I'm used to listening to American English voices, I'm not at all used to interacting with American English voices, which makes it odd. I feel weird using it."

Literature review

  • According to a report by Juniper Research, banks that adopted AI chatbots experienced a reduction in customer service costs by up to 30% due to the efficiency and speed of automated responses compared to traditional customer service methods. (Source: Juniper Research)
  • Voice bots interact with customers by analyzing their vocal data and providing the most accurate response by encoding and decoding spoken content. They use machine learning to train themselves and improve their accuracy. (Source: Gnani.ai)
  • There are over 7,000 different languages and dialects. English is spoken across all 195 countries in the world, which means there are over 100 different accents for the language. (Source: Gnani.ai)
  • With the help of machine learning models, text-to-speech bots can be trained with billions of customer conversations to help create a strong foundation for the Natural Language Processing component. (Source: Gnani.ai)
  • Gnani.ai trains a single model on 20+ international languages rather than building separate models for each language.
  • Banks use AI chatbots to streamline various banking processes, such as account management, loan applications, and transaction monitoring.
  • The World Economic Forum highlighted that AI chatbots could cut processing times for simple banking operations by up to 80%. (Source: World Economic Forum)

Competitive analysis

The purpose of this competitive analysis is to see how other AI chatbots are handling accessibility and language.

Replika

Pros:

  • It's mainly text-based, which makes it ideal for users with speech impairments.
  • User-friendly design that is easy to navigate.
  • Allows users to personalize interactions.

Cons:

  • Has limited voice interaction capabilities.
  • Users have reported online that it has problems adapting to screen readers.

ChatGPT by OpenAI

Pros:

  • Has robust text-based interactions that are helpful for people with speech impairments.
  • Whenever the user types something in a different language, ChatGPT responds in the same language. Its multilingual capability makes it an effective tool for communicating with users from diverse linguistic backgrounds.

Cons:

  • Does not natively support voice interaction. The users would have to set up TTS (text-to-speech), which can be complex.
  • May struggle with understanding different accents and dialects, impacting the experience for non-native speakers.

Google Assistant

Pros:

  • Understands various accents and gives users the ability to change the accent of Google Assistant.
  • Supports text, voice, and visual interactions.
  • Can control smart home devices, aiding users with physical disabilities.

Cons:

  • Despite its accent recognition, users in countries such as Australia have reported that Google Assistant fails to understand some of their words.

Analyzing the pain points

Our research taught us that the company's current chatbot needs improvements so that users can navigate it with screen readers, and so that barriers preventing people of diverse linguistic backgrounds from using the voice chat are removed.

Sticky notes with customer behavior, device/features preference, cognitive/physical limitations, and demographic factors

After empathizing with the users, we started to brainstorm the "how might we" question.

Sticky notes with how might we ideas categorized under trust, safety, and security, performance, empowerment, inclusion, and accessibility

How might we

How might we improve the company's current AI chatbot so that it adapts to linguistic differences in the English language and assistive technologies?

Problem statement

Our AI chatbot isn't user-friendly because it's difficult to navigate with a keyboard and its voice chat cannot understand different English language accents.

User personas

User persona of Lee, a writer wanting to use the AI chatbot despite her physical limitations
User persona of David, a musician with visual impairments who wants to use the chatbot

User journey

David's user journey diagram

Sketching the ideas

I strongly advocated for including a way to set the voice chat's accent so that it recognizes the user's speech and also talks back using the same accent they chose for it.

We also discussed how AI chatbots could adapt to other accessibility needs, such as responding to sign language, which would require the camera to be turned on.

This triggered a lengthy discussion about how AI could handle this ethically, since users may have concerns about their privacy. Similarly, the financial institution would need to consider how to prevent someone else from impersonating the account owner.

This part of the process was important to me because it was thought-provoking. It's part of the many discussions we ended up having with other departments for the purpose of increasing AI maturity in the company.

Idea sketches for assistive technologies and languages
Idea sketches for assistive technologies with solutions for users with cognitive and physical disabilities

Storyboard

This is the storyboard I created to present and gather buy-in from stakeholders. It depicts Lee, an Australian woman trying to use voice chat to check her account balance, but her words are misinterpreted by the chatbot due to her accent.

Storyboard of Lee trying to get her bank account balance using voice chat, but the AI isn’t able to interpret what she is saying

High-fidelity prototypes

I designed the high-fidelity prototypes. They include settings not only to change the chatbot's language, but also to pick a dialect and an accent. The voice chat feature responds in the accent the user picked and understands what the user is saying.

For example, a user will be able to pick Spanish, select Argentina as the dialect, and choose from any of the Spanish-language accents found in Latin America.

Prototype of the chatbot screen with the user asking where the nearest ATM is
Prototype of the chatbot screen with the chatbot's response, showing ATMs near the user
Prototype of the chatbot screen with the kebab menu opened
Prototype of the language settings screen with a language field, a dialect field, and an accent field

Next steps

These features are being built and will be tested with users again. I prepared a testing plan to verify that the screen reader issue, where chatbot replies and updates were not announced, is fixed.

We are now liaising with the engineering team to make these ideas technically feasible.

The takeaways

This project taught me how important it is to stay up to date on the latest AI advancements and to keep raising conversations about their implications.

Companies don't have to wait until major changes happen. They can, however, prepare for them.

I also learned the importance of testing existing products with users even after launch, since you never know what could be going wrong. The process does not end after launch.

Something I would have done better during this project was involve the engineers more. I felt that their technical input was needed.

All in all, testing this product with users who have disabilities has opened the door to increasing the company's AI and accessibility maturity.