At CSOFT, we are well acquainted with two of the biggest pain points in the translation process: local, regional translation review and the final approval of translated documents. In a perfect world, third-party reviewers would be dedicated senior staff with strong technical skills who sit around waiting to judiciously proofread and validate translated material. Ideally, these localization experts would have the appropriate domain knowledge and linguistic skills to suggest the right terminology and a style that best suits their local market.
The reality, though, is that the third-party review and validation process in our industry is broken. As facilitators, we regularly come across two scenarios.
Review Pitfalls: Scenario #1
Oftentimes, the person tasked with reviewing translated content (ideally a sales or marketing manager immersed in the target market) does not have the time to perform a full linguistic review. Because of this, it's not uncommon for reviewing tasks to trickle down to the lowest common denominator within the company: somebody who can't just pass the work off to someone else.
While this is not a huge problem with more straightforward translations, like software Help menus, it does become a problem with transcreated MarCom content, which should accurately reflect the brand name, the product line and the target audience's reaction to the source content. (There's an excellent post on the niceties of transcreation in Matthew Stibbe's Bad Language blog, which is definitely worth a look. I also suggest checking out Common Sense Advisory's research study, "Reaching New Markets through Transcreation", to learn more about how companies are recreating their multilingual content beyond simple translation.)
This is content that shouldn't be subjected to the stylistic whims of anyone other than a subject-matter expert, ideally one with creative writing experience in the target language. But owing to client-side time or budget constraints, this scenario is more common than you might think.
Review Pitfalls: Scenario #2
Many organizations lack the internal linguistic resources to perform reviews, so they outsource the work to third-party agencies that are paid to find translation issues. Now, imagine for a moment that you're being paid to find mistakes in this blog. Under normal circumstances, you'd probably think the content here is pretty decent (if I do say so myself). But once money changes hands and you're specifically tasked with finding problems, you could easily come up with 50 or more suggestions for why my writing needs serious attention, in this one paragraph alone.
It happens. And although it’s not something that’s openly discussed, it is something that needs to be addressed.
So what’s the fix?
Most LSPs in the localization industry have heaps of processes, whitepapers and best practices for conducting in-country review. They'll stress the importance of adhering to glossaries, style guides and standards. They'll promote the importance of good communication and will offer "strategic" suggestions about using the Track Changes feature in Microsoft Word or PDF annotation in Acrobat, or will even propose elaborate Excel-based challenge/response forms that only the world's foremost terminologists could possibly wrap their minds around.
On paper, it all seems to make sense. These “solutions” make for compelling webinars with which you can ISO people till the cows come home, but when it comes down to actual execution and resolution, the problem still exists.
All clients want is a quick and powerful way to view, annotate, collaboratively agree on and sign off on localized documents so that they can get on with their real jobs.
Enter ReviewIT: the localization industry's first web-based, collaborative translation review and annotation platform. Here's how it works:
- Client reviewers are sent an e-mail with a link to the document du jour that has been uploaded to a centralized database.
- The multilingual document appears in the user’s web browser of choice.
- Reviewers can then highlight sections and add changes using color-coded, time-stamped sticky notes.
- All stakeholders are notified when review comments are added.
- Comments are automatically collated and sent to the LSP for implementation.
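To make the workflow above concrete, here's a minimal Python sketch of the kind of data a platform like this tracks. This is purely illustrative; the class names, fields and notification step are my own assumptions, not ReviewIT's actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical model of the review workflow described above:
# reviewers leave color-coded, time-stamped notes on a central document,
# stakeholders are notified, and comments are collated for the LSP.

@dataclass
class StickyNote:
    reviewer: str                 # name stamped on the note
    text: str                     # suggested change or comment
    color: str = "yellow"         # color-coded per reviewer or status
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))  # time stamp

@dataclass
class ReviewDocument:
    title: str
    notes: list[StickyNote] = field(default_factory=list)
    stakeholders: list[str] = field(default_factory=list)  # e-mail addresses

    def add_note(self, note: StickyNote) -> list[str]:
        """Record a note and return the stakeholders to notify."""
        self.notes.append(note)
        return self.stakeholders   # in practice: trigger e-mail alerts here

    def collate(self) -> list[str]:
        """Collate all comments, oldest first, for hand-off to the LSP."""
        ordered = sorted(self.notes, key=lambda n: n.created_at)
        return [f"[{n.created_at:%Y-%m-%d %H:%M}] {n.reviewer}: {n.text}"
                for n in ordered]

doc = ReviewDocument(title="TermWiki datasheet (DE)",
                     stakeholders=["pm@example.com", "lsp@example.com"])
notified = doc.add_note(StickyNote("A. Reviewer", "Prefer 'Begriff' here."))
print(notified)        # who gets the notification e-mail
print(doc.collate())   # collated comments, ready for the LSP
```

Because every note carries the reviewer's name and a timestamp, the collated output is fully traceable, which is the point of centralizing review rather than e-mailing files around.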
No learning curve. No more e-mailing files. No software installation. All comments are stamped with the time and date.
ReviewIT is 100% visual and 100% traceable, web-based collaborative bliss.
It will change your life.
Our marketing department will kill me, but here's a quick screenshot to show you what it's like.
We uploaded a document related to TermWiki, our cloud-based terminology management system, into ReviewIT. As you can see, users can leave review notes directly in a central PDF. These sticky notes are stamped with the name of the reviewer, along with the time and date the comments were left. And because it's a centralized, web-based document, it's completely traceable. You never have to worry about version control again.
Drooling yet? It’s okay—I had the same reaction.
Don’t worry; it’s coming out soon.