Conversational AI is complex and requires the expertise of developers, data scientists, and when a truly global customer experience is required, linguists. A multilingual chatbot or voice assistant adds a layer of complexity, given the volume of data that must be created and ingested into the machine learning engine to drive the linguistic requirements and ensure all business and cultural realities are addressed.
From languages, dialects, and accents to sarcasm, emojis, and slang, there are a lot of factors that can influence—and enhance—the communication between a human and a machine. Conversational AI systems need to keep up with what’s normal and what’s the ‘new normal’ with human communication.
Multilingual Customer Support
Chatbots have become almost synonymous with customer support, and they can be difficult enough to develop and tune in just one language.
Creating a multilingual chatbot brings a whole host of additional barriers to overcome!
In the past, multilingual customer support workers have had to rely on a company manual or knowledge base to provide the answers they need. These manuals can consist of thousands of pages, and as you can imagine, scrolling through this to find one specific answer is incredibly time-consuming and labor-intensive. Even searching a knowledge base can slow down the customer support agent.
Those who sell products or services on multilingual sites know that this content quickly becomes outdated. It constantly needs updating, retranslating, and editing, and this cycle can be costly, an investment many companies choose not to make.
Up until now, there has been a heavy reliance on machine translation (MT) to handle the demand for customer support translation, and this has worked reasonably well.
However, businesses are now leaning towards smart chatbots because they provide customers with a seamless user experience.
How Can Providers and Developers Keep Up with This Demand?
It’s only natural for humans to reject change; it is an unknown. We ask ourselves, will these chatbots work? What if it is not worth the cost? What will happen to the humans who provide our customer support? What if our business collapses?
It is easier said than done, but we need to learn how to adapt to the growing demand for smart chatbots and find our place within this new automated world—a place where humans will still play an integral role.
The Four Main Types of Chatbots
Whether you opt for a more simplistic chatbot or go for one that is more complex, LSPs, translators, and language professionals will always have a role to play. Humans are an important part of the multilingual chatbot evolution.
If you are unsure of the differences between chatbot types, here are the four main ones for those of you looking to expand your global communication.
Simple Rule-Based Chatbots
The most common rule-based chatbots you will come across are FAQs and form-based interactions.
Interactive FAQs are programmed with a list of pre-defined keywords. If a customer uses one of these keywords in their inquiry, it will direct the chatbot to a predetermined set of information related to that specific keyword.
Form-based interactions follow a similar programming pattern. Users are asked to fill out a form, but what is different is that this format feels more conversational because the chatbot is talking directly to you and guiding you to the answer you want.
For both chatbots, there are a few ways the language services will be of help.
First, there is the translation of the FAQ content, and LSPs perform this type of work daily. Then there is the process of transcreation for the keywords and phrases used, and this can be a little more technical.
This process involves predicting what keywords humans will use for a certain topic. For example, if someone wanted to find out why their package has not been delivered, a translator would create a list of words and phrases based on this particular scenario.
The chatbot needs to have enough knowledge surrounding this topic to provide the user with the best possible answer. The challenge with this approach, though, is that if you didn't accurately predict what words your clients would use, or if you mistakenly mapped a keyword to the wrong result, the system will never adapt or learn. Your client will simply reach a dead end.
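The keyword lookup and the dead-end problem described above can be sketched in a few lines. This is a minimal illustration, not a production design; the keyword maps, answers, and locales are all invented for the example.

```python
# Minimal sketch of a simple rule-based FAQ chatbot with per-locale
# keyword maps. All keywords and answers here are illustrative.
FAQ_KEYWORDS = {
    "en": {
        "delivery": "You can track your package under Orders > Tracking.",
        "refund": "Refunds are processed within 5-10 business days.",
    },
    "de": {
        "lieferung": "Den Paketstatus finden Sie unter Bestellungen > Sendungsverfolgung.",
        "erstattung": "Erstattungen werden innerhalb von 5-10 Werktagen bearbeitet.",
    },
}

def answer(query, locale="en"):
    """Return the first answer whose keyword appears in the query.

    If no keyword matches, the bot hits a dead end: it cannot adapt
    or learn, which is exactly the limitation described above.
    """
    for keyword, response in FAQ_KEYWORDS.get(locale, {}).items():
        if keyword in query.lower():
            return response
    return "Sorry, I don't understand. Please contact a human agent."

print(answer("Why has my delivery not arrived?"))      # matched keyword
print(answer("Wo ist meine Lieferung?", locale="de"))  # matched keyword
print(answer("My parcel is missing"))                  # dead end
```

Note that the last query clearly concerns a delivery, but because the exact keyword is absent, the user reaches the dead end; only a human editing the keyword map can fix this.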
Complex Rule-Based Chatbots
This is where chatbots begin to feel a little more human. These chatbots use decision trees to map out where humans would input certain information, and then use this to create dialogues.
A great example of a complex rule-based chatbot is Kuki (formerly known as Mitsuku), which was developed by Steve Worswick and is a multiple Loebner Prize winner.
The chatbot is programmed using AIML (Artificial Intelligence Markup Language), which allows the chatbot to appear human, even though it is still essentially a rule-based system.
People from different countries and cultures will conduct conversations in different ways. So, it is important to adapt your rule-based system to adhere to the complexities of each language.
Again, though, as with simple rule-based chatbots, if you don't predict the behavior, the bot will not respond correctly.
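A decision tree like the one described can be sketched as a table of nodes, each carrying a prompt and the keyword branches a user may take. Node names and wording below are illustrative, not from Kuki or any real product.

```python
# Sketch of a complex rule-based chatbot: a hand-mapped decision tree.
# Each node has a prompt and keyword branches leading to other nodes.
DECISION_TREE = {
    "root": {
        "prompt": "Do you need help with an order or with your account?",
        "branches": {"order": "order", "account": "account"},
    },
    "order": {
        "prompt": "Is your order late or damaged?",
        "branches": {"late": "late", "damaged": "damaged"},
    },
    "account": {
        "prompt": "You can reset your password under Settings > Security.",
        "branches": {},
    },
    "late": {
        "prompt": "Late orders can be tracked under Orders > Tracking.",
        "branches": {},
    },
    "damaged": {
        "prompt": "Please upload a photo and we will send a replacement.",
        "branches": {},
    },
}

def step(node, user_input):
    """Advance one turn: follow the first branch keyword found in the
    input, or repeat the prompt. An unpredicted answer still dead-ends."""
    for keyword, target in DECISION_TREE[node]["branches"].items():
        if keyword in user_input.lower():
            return target, DECISION_TREE[target]["prompt"]
    return node, "Sorry, I didn't catch that. " + DECISION_TREE[node]["prompt"]

node = "root"
node, reply = step(node, "It's about my order")  # follows the "order" branch
node, reply = step(node, "It arrived damaged")   # follows the "damaged" branch
print(reply)
```

Adapting such a tree for another language means more than translating the prompts: the branch keywords, and often the shape of the tree itself, must follow how speakers of that language actually conduct the conversation.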
Basic Machine Learning Chatbots
Question answering is a good example of a basic machine learning chatbot. These bots are programmed to understand certain query patterns and then generate answers to match each query.
They are very similar to intent-based systems, so they need humans to produce the types of questions a user may ask and the intention behind this query.
You need to think about what result the user is looking for. Once you seed the engine with a representative array of possible user inputs and associations, the machine can extrapolate and handle user inputs you never imagined. This is the inherent strength of machine-learning chatbots.
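That ability to extrapolate can be illustrated with a toy intent classifier. The sketch below scores word overlap against each intent's seed questions, a crude stand-in for a real NLU model; the intents and seed phrases are invented for the example.

```python
# Toy machine-learning-style intent classifier: seed questions are turned
# into bags of words, and a new query is assigned to the intent whose
# vocabulary it overlaps most. Intents and seeds are illustrative.
from collections import Counter

SEED_QUESTIONS = {
    "track_order": [
        "where is my package",
        "why has my order not been delivered",
        "track my delivery",
    ],
    "request_refund": [
        "i want my money back",
        "how do i get a refund",
        "refund my order please",
    ],
}

def train(seeds):
    """Build one bag-of-words per intent from its seed questions."""
    return {intent: Counter(w for q in qs for w in q.split())
            for intent, qs in seeds.items()}

def classify(model, query):
    """Pick the intent whose seed vocabulary best overlaps the query."""
    words = query.lower().split()
    scores = {intent: sum(bag[w] for w in words)
              for intent, bag in model.items()}
    return max(scores, key=scores.get)

model = train(SEED_QUESTIONS)
print(classify(model, "my package still has not been delivered"))  # → track_order
```

The query above was never in the seed list, yet it still maps to the right intent, which is the extrapolation the paragraph describes; the quality of the seed questions the language professionals produce directly determines how far that generalization reaches.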
Complex Hybrid Chatbots
For this chatbot, the clue is in the name: it is a "hybrid" of rule-based and machine-learning approaches.
For the rule part of the chatbot, you need to think about building a strong grammar foundation, which is something most language services providers (LSPs) should be well accustomed to.
For the machine learning element, LSPs also have a role to play, as they can build language models that perform at an NLU (natural language understanding) level. LSPs can also supply audio data, transcriptions, and annotations to teach the engine how to process human speech (both as input from a customer and as output through the chatbot's "voice"), building acoustic models that function in tandem with language models.
This means the chatbot will be able to interpret what the customer is saying and then match this to a specific outcome. These chatbots hold conversations with you in real time, so for this to succeed, language-specific features need to be implemented properly.
For the chatbot to understand how a conversation would flow, modeling of multiple conversation patterns should be applied.
For example, a customer might ask for a refund for an item of clothing that does not fit properly; however, they might then change their mind and ask for their order to be amended and swapped for a different size.
The bot needs to be able to switch between multiple conversations, so humans need to program them with the means to carry this out successfully.
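The context switch described above, abandoning a refund mid-flow to ask for an exchange, can be sketched as a small dialogue state machine. Here a keyword stub stands in for the NLU layer, and every intent name and reply is invented for the example.

```python
# Sketch of a hybrid chatbot's context switching: an intent detector
# (a keyword stub standing in for a real NLU model) plus a state
# machine that keeps one active conversation at a time.

def detect_intent(text):
    """Stand-in for the NLU layer: simple keyword rules over the input."""
    text = text.lower()
    if "refund" in text:
        return "refund"
    if "swap" in text or "exchange" in text or "different size" in text:
        return "exchange"
    return None

class Dialogue:
    def __init__(self):
        self.active = None  # the conversation currently in progress

    def handle(self, text):
        intent = detect_intent(text)
        if intent and intent != self.active:
            # A new intent mid-flow: switch conversations, as when a
            # customer abandons a refund and asks for a size swap.
            self.active = intent
            return "Okay, let's start your {} request.".format(intent)
        if self.active:
            return "Continuing with your {} request.".format(self.active)
        return "How can I help you today?"

bot = Dialogue()
print(bot.handle("I'd like a refund, this shirt doesn't fit"))
print(bot.handle("Actually, can I swap it for a different size?"))
```

The second turn switches the active conversation from refund to exchange without losing the session, which is the behavior humans must deliberately design in; a bot without this logic would keep pushing the refund flow.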
In Conclusion: Multilingual Conversational AI
Multilingual data is essential for enabling global conversational AI, with a broad range of use cases in operations, production, and supply chain management. The global reach of business requires AI models to be trained on datasets covering text and speech in multiple languages.