osanseviero 
posted an update Jan 29

For some reason, being exposed to two very different languages during training seems to help models (just like humans) with all sorts of tasks


At the same time, when BLOOM was trained there was discussion that the multilingualism was hurting its English performance

I hear there is an incredible amount of competition among LLM makers within China, with hundreds of models; I guess labs publish and promote only their best. Competition is good for performance.


I am glad that right now there is a lot of open research within China 😌

Do multilingual LLMs have the capability to transfer understanding between languages?
And whether they do or not, does multilingualism improve LLM performance across the board?

Something similar has crossed my mind many times when contemplating AI/LLMs in general.

We humans formulate our thoughts and interpret things in our main spoken language.

When multilingual people are asked what language they use in their head when thinking, they typically answer with their primary or first-learned language.

China & the US happen to be two of the wealthiest countries in the world,

which relates to them having some of the best researchers, training data & models.

I wonder if language plays a role in China and the US being two of the world's leading nations in tech & science.

Which language is best?
Maybe different languages perform differently across different domains.

Finishing sentiment:
Thoughts and words (intelligence) are vibration waves; probably somewhere out there in the universe is another, equally complicated wave.

Worth trying out!

Learning a language from another language family might improve a model's capabilities in seemingly unrelated areas.

Studying French improved my grammar, and I wish I could master another language like Arabic or Hindi to see the world from a different angle.