
CHALLENGE:

One of our long-standing partners is the legal department of a textile company. Our collaboration has been long and successful because they appreciate our high-quality legal translation and expert review by a qualified lawyer or expert in legal issues.

Recently, we came up against a challenge concerning legal and technical translation. Technical, because there was a lot of terminology from the world of textiles. Legal, because it contained many references to and citations from European regulations. The text in question was of considerable length, almost 9,000 words, and our partner needed it in 12 languages.

As is often the case, we faced two major constraints: time and money.

SOLUTION:

A few months earlier we had run some initial tests with machine translation (MT). At first, the results we got were decent, but not good enough. If the quality of the MT output is too low, cleaning up and post-editing it can take even longer than traditional translation. After further testing and tweaking, we tried again on a technical manual. This time, most of our linguists reacted very positively and even enjoyed the support they got from MT.

That’s why we offered our partner from the textile industry the option of applying machine translation and post-edition (1) by an expert linguist, for six of the twelve languages. Convinced of the benefits of saving both time and money, our partner accepted.

Costs would drop considerably. The post-editors' work would need to be very detailed, but all our editors have a keen eye and are highly trained. At the same time, because the MT yielded good results, editing would be faster than translating from scratch. This also meant that our partner would receive the translations on time.

The source language was English, and the six target languages for which we used MT were German, French, Italian, Dutch, Polish and Portuguese. These six languages had performed well in our preliminary tests.

Once we had the six machine-translated outputs, we asked our post-editors to rate the MT results on a scale from 1 to 10 (10 being the highest). All but the Polish post-editor gave scores ranging from 7 to 8. Not bad at all!

Most of the errors they found were:

  • confusion between (compound) nouns and adjectives;
  • errors in specific terminology;
  • style errors.

Some of those errors came from small inconsistencies in the original text. For example, sentences that included the word “part” were translated inconsistently by the MT engine: sometimes as “depending on the part covered by the PPE”, and on other occasions as “depending on the body part covered by the PPE”.

Part of the original formatting (text marked in red, for example) was lost or displayed incorrectly in the automatic version.

In Portuguese specifically, the MT output rendered many terms in the old, pre-reform Portuguese spelling.

The Polish post-editor gave a score of only 3 out of 10, probably because the machine translation engine we used wasn’t sufficiently trained for this type of text.

In conclusion, for certain language combinations and topics, machine translation makes a lot of sense. This is especially true of well-structured texts with lots of terminology and little ambiguity. However, human post-editing, done by qualified linguists such as ours, is still crucial to ensure good results!

Our partner was happy with the six versions we translated using this method. They were also able to receive the translations earlier and at a lower price.

If you think you have texts that are suitable for MT, don’t hesitate to get in touch, and we’ll evaluate whether it’s a good idea.

(1) In these cases, we use the term post-edition, not revision, because the work is more complicated and requires more time and attention.

Written by LocalizationLab