  • November 12, 2022
  • by Ascentspark Software

"If you talk to a man in a language he understands, that goes to his head. If you talk to him in his own language, that goes to his heart." - Nelson Mandela

Language, as we say, can build bridges and mend hearts; the words just need to be spoken. Until the rise of technology, it could be hard for people to understand or learn a new language. But as technology advanced, the language barriers between people across oceans began to grow thinner.

What significance does language have in the tech world?

However, even technology can have its limitations when it comes to language. Many programs and software products exist only in the most universally acknowledged language: English. That makes them difficult to use for the many people who do not know English, especially when it comes to operating AI-integrated technology.

Well, good news is on its way for people across the world who want to operate AI technology in their regional or national language but have been unable to because of language barriers.

Google Taking the AI Leap to Break Language Barriers

  • Google has recently announced a new project that is aimed at developing a single AI language model that would support the world’s “1,000 most spoken languages.” 
  • Taking the first step towards this ambitious goal, Google is showcasing an AI model trained on almost 400 languages from across the globe, which it describes as “the largest language coverage seen in a speech model today.”

AI and language have always been at the heart of Google’s products. However, recent advances in machine learning, particularly the development of multi-functional “large language models” (LLMs), have placed a newfound emphasis on these domains.

What are Google’s early plans?

  • Google has already started integrating these language models into its products, such as Google Search
  • Language models also tend to have several flaws, such as a tendency to reproduce societal biases like racism and xenophobia, along with an inability to interpret language with human sensitivity
  • These AI models are capable of many tasks, from language generation (such as OpenAI’s GPT-3) to translation (such as Meta’s No Language Left Behind), though they still need to advance
  • Thus, Google’s “1,000 Languages Initiative” has been planned not to focus on any particular functionality, but to develop a single system with deep knowledge of languages from across the world

As per Interviews and Reports

“By having a single model that is exposed to and trained on many different languages, we get much better performance on our low-resource languages,” Zoubin Ghahramani (VP of Research at Google) told The Verge in an interview.

He also said, “The way we get to 1,000 languages is not by building 1,000 different models. Languages are like organisms, they’ve evolved from one another and they have certain similarities. And we can find some pretty spectacular advances in what we call zero-shot learning when we incorporate data from a new language into our 1,000 language model and get the ability to translate [what it’s learned] from a high-resource language to a low-resource language.”
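To make the “one model, many languages” idea a little more concrete, here is a minimal, hypothetical sketch of how a single multilingual model can serve many language pairs, including lower-resource ones, from the same set of weights. Since Google’s 1,000-language model is not publicly available, the sketch assumes Meta’s openly released No Language Left Behind checkpoint (facebook/nllb-200-distilled-600M) and the Hugging Face transformers library; the language codes and example sentence are purely illustrative.

```python
# Illustrative sketch only: Google's 1,000-language model is not public, so this
# uses Meta's NLLB checkpoint as a stand-in for a "one model, many languages" setup.
from transformers import pipeline

# A single multilingual model; the same weights handle every language pair below.
translator = pipeline("translation", model="facebook/nllb-200-distilled-600M")

# FLORES-200 language codes: English, Hindi, and Yoruba (a lower-resource language).
examples = [
    ("eng_Latn", "hin_Deva", "Language can build bridges."),
    ("eng_Latn", "yor_Latn", "Language can build bridges."),
]

for src, tgt, text in examples:
    result = translator(text, src_lang=src, tgt_lang=tgt)
    print(f"{src} -> {tgt}: {result[0]['translation_text']}")
```

The point is not the specific library calls but the design: one shared model rather than a separate model per language pair, which is what allows knowledge to transfer from high-resource to low-resource languages.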

What’s more to it?

Recent research has shown that this approach can scale, and the sheer size of Google’s model could offer considerable gains over past work.

Evidently, such large-scale, competitive projects have become ubiquitous among tech companies in their ambition to dominate AI research. These projects also draw on the firms’ unique advantages, such as access to vast computing power and training data.

A comparable project is the ongoing research by Facebook’s parent company Meta to develop a “universal speech translator.”

Some Issues to Deal with

Some problems may arise as Google attempts to put together this large-scale project, such as:

  • Access to enough training data across so many languages
  • Google has reportedly said that, to support work on the 1,000-language model, it will fund the collection of data for low-resource languages, including audio recordings and written texts

The Plan Ahead

Google has implied that it has no definitive plans yet for how this model’s functionality will be applied. For now, it expects the AI model to improve a range of Google products, from Google Translate to YouTube captions and more.

“One of the really interesting things about large language models and language research in general is that they can do lots and lots of different tasks,” said Ghahramani.

As per Ghahramani, “The same language model can turn commands for a robot into code; it can solve maths problems; it can do translation. The really interesting thing about language models is that they’re becoming repositories of a lot of knowledge, and by probing them in different ways you can get to different bits of useful functionality.” 

All in all, tech and language experts have yet to test Google’s ambitious project and give it the nod.

Will it really make AI language-friendly enough to bring the tech world closer together?
