Using Generative AI in Localization
In November 2022, OpenAI released a wildly popular conversational app, ChatGPT, which sparked great excitement in the business and technology world.
Using generative AI, ChatGPT can rapidly create cogent, insightful content based on user prompts ranging from “I want you to act as an English pronunciation assistant for Turkish-speaking people” to “Write code for an eCommerce website using HTML and CSS.”
ChatGPT has given us a glimpse of the remarkable opportunities presented by large language models (LLMs). Its ease of use and remarkably human-sounding responses helped make it the fastest-growing consumer application in history within a matter of weeks. Significantly, tech giants have accelerated development of, and access to, their own LLM tools. For example, in February 2023, Microsoft announced it was incorporating the underlying technology that powers ChatGPT into the Bing search engine (and Edge browser).
Generative AI is not a new technology, and ChatGPT is by no means the only example of an LLM. (Indeed, we at Centific have been working with generative AI and LLMs for some time.) However, generative AI has now reached a level of maturity and visibility such that the public is getting a taste of its power. The potential of LLMs to shape the future of language-based interactions is becoming clearer to businesses everywhere.
LLMs and Localization
The emergence of LLM technology is a game changer for the localization industry: it can improve the speed, accuracy, and scalability of translation processes while reducing costs, improving customer experience, and increasing the availability of language resources. It stands to revolutionize the localization industry and redefine workflow and quality management.
The technology can potentially automate quality assurance, automate the localization of digital assets, and provide more accurate natural language processing. These capabilities make LLMs an attractive option for companies looking to reduce costs and improve the quality of their localization services.
It is easy to see how LLM tools will supplement traditional translation and QA tools to offer sophisticated content analysis and quality assurance. For instance:
- Source content will be analyzed pre-translation to optimize for the localization process.
- Geopolitical considerations will be dynamically managed and integrated as a standard part of content validation.
- Term mining, glossary management, and language quality will be fully integrated into “human-in-the-loop” workflows, including error checking, bug identification, and reporting.
- Marketing teams will achieve new levels of sophistication in personalization and rapid “time to live” response.
- Code and other markup will be handled more comprehensively to allow for more fluid management of complex file types, enhancing agile development programs.
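To make one of these ideas concrete, the glossary-management and error-checking step can be sketched in code. The following is a minimal, illustrative example of a deterministic glossary-compliance check that flags translated segments for human review; the `check_glossary` function, the glossary entries, and the sample segment are hypothetical and do not represent any Centific tool or real project data.

```python
# Minimal sketch of an automated glossary-compliance check, one narrow piece
# of a "human-in-the-loop" localization QA workflow. All names and data here
# are illustrative assumptions, not a real tool or real project content.

def check_glossary(source: str, target: str, glossary: dict[str, str]) -> list[str]:
    """Flag glossary source terms whose approved target-language term is
    missing from the translated segment. Returns human-readable issues."""
    issues = []
    for src_term, tgt_term in glossary.items():
        # Case-insensitive containment check; a production tool would use
        # proper tokenization and morphology-aware matching per language.
        if src_term.lower() in source.lower() and tgt_term.lower() not in target.lower():
            issues.append(f"'{src_term}' should be rendered as '{tgt_term}'")
    return issues

# Illustrative English -> German glossary
glossary = {"checkout": "Kasse", "shopping cart": "Warenkorb"}

segment = {
    "source": "Add items to your shopping cart and proceed to checkout.",
    "target": "Legen Sie Artikel in den Einkaufswagen und gehen Sie zur Kasse.",
}

for issue in check_glossary(segment["source"], segment["target"], glossary):
    print("Flag for reviewer:", issue)
```

In this sketch, the translation uses “Einkaufswagen” where the glossary mandates “Warenkorb,” so that segment is flagged for a human reviewer rather than corrected automatically; an LLM could then be prompted to propose a compliant rewrite for the reviewer to accept or reject.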
LLMs can be of tremendous benefit to localization. But there are considerations, such as bias and inaccuracy, that are common challenges in AI. At Centific, we process and evaluate massive amounts of data to help industry leaders mitigate such risks when training and deploying their global tools and models.
Data curation and algorithmic development processes must be mindful of potential bias and intentional about the intended outcomes. It takes a diverse team of people to ensure that accuracy, inclusiveness, and cultural understanding are respected in such tools’ inputs and outputs. Centific’s approach is to rely on a globally crowdsourced network of contributors who possess in-market subject matter expertise, mastery of 200+ languages, and insight into local forms of expression. This experience informs our understanding of how these tools are used in the language space.
This is a rapidly evolving sector with seemingly endless possibilities, along with some unknowns about the ongoing security and integrity of data shared with these technologies. There are positive signs that open-source and commercial offerings will provide opportunities to deploy private models, which will greatly expand the range of viable use cases and enable custom training for specific tasks or outcomes.
At Centific, we continue to invest in R&D on LLMs and other evolving AI technologies to identify opportunities to evolve our delivery programs. We are excited to embrace this new future for the benefit of our clients.