The biggest names in language technology are fighting hard to break the language barrier. Their mission is to match the standards of human translation by building AI machines that can learn for themselves – and Google just took a big step in the right direction.
The biggest name in search says its new automatic translation app is almost as good as professional human translators. It’s a huge claim, for sure – but Google has test results to back it up. So has the day finally come when machines render human translators redundant?
Google Translate gets a serious upgrade
Google Translate has been given a fat injection of artificial intelligence, allowing it to tackle languages in a three-pronged approach:
- Google breaks down each sentence into smaller chunks so it can first establish the meaning of the material in its original language (a challenge in itself for machines).
- Next, Google takes the source material and converts it into the other language – the idea being that greater understanding of the original material results in more accurate translations.
- Finally, Google’s AI platform is constantly learning by itself, theoretically improving the accuracy of its translations over time.
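To make the flow concrete, here is a toy Python sketch of those three steps. This is purely illustrative and hypothetical – real neural translation uses encoder-decoder networks trained on huge datasets, not a word lookup – but it shows the shape of the pipeline: chunk the source, convert it, then fold feedback back in so later output improves.

```python
def chunk(sentence):
    """Step 1: break the sentence into smaller units (here, just words)."""
    return sentence.lower().split()

# Hypothetical starter lexicon (English -> Spanish) for illustration only;
# this is not Google's data or method.
lexicon = {"the": "el", "cat": "gato", "sleeps": "duerme"}

def translate(sentence):
    """Step 2: convert each chunk into the target language."""
    return " ".join(lexicon.get(word, f"<{word}>") for word in chunk(sentence))

def learn(corrections):
    """Step 3: fold corrections back into the model, improving later output."""
    lexicon.update(corrections)

print(translate("The cat sleeps"))   # el gato duerme
learn({"dog": "perro"})              # system "learns" a new word
print(translate("The dog sleeps"))   # el perro duerme
```

The interesting part is step 3: in the real system the feedback loop runs continuously and at scale, which is what separates the new Google Translate from older phrase-based approaches.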
Those first two steps are already the industry standard for machine translation, so there’s nothing new there. It’s the third step that really sets the new Google Translate apart.
This thing can learn for itself now and it makes instant decisions based on an ever-growing hoard of data. Google says all of this adds up to an 80% improvement in translation accuracy (or 80% fewer mistakes, to be more precise).
Is Google Translate really as good as humans?
Google Translate may be a much smarter machine now, but can it really match human translators?
This claim comes from tests of Google’s new translator, where the app had to translate from English into Spanish, French and Mandarin Chinese. Participants fluent in English and one of the other languages were asked to spot the difference between Google’s translations and those from professionals.
In some cases, they couldn’t tell the two apart, the study says. Rated on an accuracy scale from zero to six, the old Google Translate scored 3.6 for Spanish. The new Google Translate scored 5.0 for Spanish in the latest round of tests, while human translators averaged 5.1.
It also achieved similar scores with French in the most recent tests, so it looks like Google really has caught up with humans! Except these results don’t tell the full story.
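One rough way to read those Spanish scores is to ask how much of the gap between the old system and human translators has been closed. This back-of-envelope calculation uses only the figures above and is not how Google arrived at its own 80% figure:

```python
# Scores from Google's tests, on a 0-6 accuracy scale.
old_translate = 3.6
new_translate = 5.0
human = 5.1

# Fraction of the old human-vs-machine gap that the new system closed.
gap_closed = (new_translate - old_translate) / (human - old_translate)
print(round(gap_closed, 2))  # 0.93
```

By that reading, the new system closed roughly 93% of the gap to human quality for Spanish – which is exactly why the caveats below matter.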
Still a long way to go for Google Translate
The results from Google are certainly impressive and it’s clear that Google Translate is a different kind of tool now. However, you have to take the tech firm’s salesmanship with a pinch of salt.
It would be interesting to see samples of the sentences that Google used in its tests. There are plenty of scenarios where even the old Google Translate could closely match human translators for very basic expressions, so it’s difficult to know what has actually been achieved here.
We already use machine translation and automatic translation to speed up our own translation services in certain cases, but this doesn’t mean the technology matches our language experts.
There’s also the fact that Spanish remains one of the easiest languages for machines to translate from English – closely followed by French. Translating into Chinese, on the other hand, is a real challenge for technology, and human translators still drastically outperform software in these situations.
In fact, most Asian languages pose serious challenges when translating from English – situations that demand creative decisions, the kind of thing only human translators can make.
Something else we can’t know from Google’s announcement is the kind of material the tests were using. Simple conversational language is one thing, but complex legal documents, medical journals or safety instructions are something else entirely.
Unfortunately, it seems that the new Google Translate isn’t live yet, despite reports. Or, if it is, we’re still seeing a lot of the classic mistakes in English-Spanish translation that it has been guilty of for years – pretty basic ones at that.
So there’s no way of knowing how much Google Translate has really improved. The studies boast some impressive figures, but we don’t have enough details about the tests to know what they actually mean and we can’t seem to test the app out for ourselves. We’ll definitely be interested to see what it’s capable of when the time comes, though!