Translation Introduction - LQA

Before starting a review, the Reviewer must check that all the necessary material has been correctly delivered by the Vendor.

1. Review Files: verify that the files that must be reviewed have been delivered.                                

2. English Source: verify that the English source files have been delivered.                                

3. Reference Files: verify that the reference files declared in the project delivery email have been delivered.

"4. Review Languages: check that the review languages are correct. For example, the Vendor may sometimes send Simplified Chinese Software to Traditional Chinese Reviewers."                                

"5. Software License: verify that the appropriate license has been supplied, if needed (only for the review of live software)."                                

"6. Linguistic Test Cases: verify that linguistic test cases have been supplied (only for the review of live software and/or screenshots, or for UPD review)." 

7. Connectivity Information: verify connectivity to the remote server (only for the review of live software).

"Note: If any item is missing, the Reviewer must contact the Vendor for immediate delivery." 


How to fill in the RFF

"1. The Reviewer renames the Review Feedback Form file as follows: RFF_<Job Name>_<Language Code>.xls     Example: RFF_CX Mira GUI_IT.xls

Note: once defined, this name is never to be changed."                                
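
For illustration only, here is a minimal sketch of the naming pattern from step 1. The helper function and its inputs are hypothetical; only the RFF_<Job Name>_<Language Code>.xls pattern and the example come from the instructions above.

```python
# Illustrative sketch of the RFF naming convention described in step 1.
# The function name and parameters are hypothetical; only the pattern
# RFF_<Job Name>_<Language Code>.xls comes from the process above.

def rff_filename(job_name: str, language_code: str) -> str:
    """Build the Review Feedback Form file name for a job and language."""
    return f"RFF_{job_name}_{language_code}.xls"

print(rff_filename("CX Mira GUI", "IT"))  # -> RFF_CX Mira GUI_IT.xls
```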

"2. The Reviewer opens the ""Review Result Summary"" sheet and inserts all the general job information: Job Name, Job Type, Language, Translation Vendor, Total Word Count, and Reviewed Word Count. Volumes are expressed in number of words, without commas or periods.

The review volume to be inserted in the ""Total Word Count"" field is indicated in TMS for TMS jobs; for non TMS jobs, consider the value communicated to the Reviewer by the Vendor in the review notification e-mail, not the value stated in the Lockit if this value is different.

The review volume to be inserted in the ""Reviewed Word Count"" field is the real volume reviewed by the Reviewer. The default value in this field will equal the value of the ""Total Word Count"" field and will be filled in automatically. If the values are different because of partial review, this value is to be inserted manually.

Also, the Reviewer fills in the RFF History section by supplying: Date (ex. July 14, 2010; this format is mandatory), Edits, Author, and Role. Upon first release of the RFF, ""Review"" is inserted in the Edits field. All fields are mandatory. The Quarter field is filled in automatically.

Note: The information inserted automatically in the Quarter field of the RFF History section is the quarter in which the job was delivered. If the delivery date is not inserted correctly, the Quarter field will show an error. Information regarding the Quarter is inserted only for initial review."   
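
As an illustration only, the sketch below applies standard calendar-quarter arithmetic to a date written in the mandatory "July 14, 2010" format. The actual spreadsheet formula used by the RFF is not documented here, so this is an assumption about how the Quarter value is derived.

```python
# Hypothetical sketch of deriving a calendar quarter from the delivery date.
# The RFF fills this in automatically; its exact formula is not documented
# here, so this is only an assumption for illustration.
from datetime import datetime

def quarter_from_delivery_date(date_text: str) -> str:
    """Parse a date in the mandatory 'July 14, 2010' format and return its quarter."""
    delivered = datetime.strptime(date_text, "%B %d, %Y")
    return f"Q{(delivered.month - 1) // 3 + 1} {delivered.year}"

print(quarter_from_delivery_date("July 14, 2010"))  # -> Q3 2010
```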

"3. The ""Total errors allowed for..."" value for each color rating will be completed automatically after the ""Reviewed Word Count"" field is filled in."

"4. After or during the review, the Reviewer enters the most significant errors found in the translation into the ""Review Feedback"" sheet, supplying as much information as possible regarding each error. Error Type and Error Severity are to be selected from the related drop-down lists. All errors will be counted automatically on the ""Review Result Summary"" sheet. To report on an issue that is not to be counted as an error, the Reviewer must select ""Not Rated"" from the Error Severity field. Multiple instances of the same error are to be logged only once. In the ""Global fix"" field the Reviewer indicates if that error must be fixed in all the files of the job. The number of global fixes is automatically indicated on the Review Result Summary sheet.

Note: for more information on Error Types and Error Severity, please see the related worksheets."   

"5. The RFF will automatically indicate localization quality with colored ratings (Green, Yellow, Red) in the ""Review Result"" field. The rating calculation is done considering the weighted errors value (see the Error Details section on the ""Review Result Summary"" sheet). In case of Green rating, the reviewer may also decide to assign Over-achievement if the translation is exceptionally well done and free of errors. This is done by selecting ""* * * * *"" in the Over-achievement field. See section 6 below for more details.

Note: Even though the rating is automatically calculated by the RFF, the Reviewer may manually change this value based on his / her subjective judgment. Please remember that this is an irreversible operation, since it will delete the formula contained inside the cell. The Reviewer is only allowed to improve the rating supplied by the automatic calculation. "                                

"6. The Reviewer enters comments in the ""Reviewer Comments"" field on the ""Review Result Summary"" sheet, and explains the reason for the assigned rating (comments are mandatory in case of Yellow or Red ratings).  

Note: comments must be inserted in English."                                

"7. The Reviewer delivers the RFF to the Vendor on time along with the reviewed files by posting it to the appropriate location on eRoom, and by notifying the Vendor via email.

Note: when posting the RFF to eRoom, the Reviewer must check that appropriate eRoom access rights are granted to all those involved."                                

"8. After receiving the RFF, the Vendor may optionally comment on the Reviewer's rating by filling in the appropriate section in the ""Review Feedback"" sheet. When providing feedback on a single error, the Vendor must also indicate whether the Reviewer's correction was implemented or not by using the ""Implemented?"" field. The Vendor is also required to update the RFF History section in the ""Review Result Summary"" sheet (example: ""Added translator comments"" in the Edits field). This is done by editing the file online using the eRoom edit function. Comments from Vendors are to be supplied no later than five (5) working days after the RFF was made available to the Vendor. The Vendor must notify the Reviewer via email that comments have been added to the RFF on eRoom, and provide the link.

Note: All fields of the RFF History section are mandatory except for the Quarter, which is required only for the initial review."                                

"9. The Reviewer must reply to Vendor comments as soon as possible, and re-rate the job if needed. In this case, the RFF History section is updated by the Reviewer by indicating (for example): ""Re-rated from Yellow to Green"". This is done by editing the file online using the eRoom edit function.

Note: The Vendor must proceed with the localization workflow as soon as the reviewed files and the initial RFF are received from the Reviewer, and must not wait for the RFF to be finalized with comments."   


Review Ratings

Green: Translation quality fully meets EMC standards (below the threshold of 0.15% of the review volume).

Yellow: Translation quality partially meets EMC standards; Vendor improvement is needed in upcoming translations (between the thresholds of 0.15% and 0.30%). Reviewer comments are mandatory. Logging significant errors in the "Review Feedback" sheet is mandatory.

Red: Translation quality fails to meet EMC standards; significant improvement is required from the Vendor in upcoming translations (over the threshold of 0.30%). Reviewer comments are mandatory. Logging significant errors in the "Review Feedback" sheet is mandatory. Logging an entry in the Language Vendor Issue Tracker (LVIT) in eRoom and notifying the Vendor (GQM Team and PM Team on cc) is mandatory.


Dealing with Red review ratings

"A Red rating indicates that job translation quality is unsatisfactory. However, sometimes poor quality can be caused by a variety of factors outside the translation Vendor's control. For this reason, before assigning a Red rating, the Reviewer should always take into consideration the following possibilities:"                                

"* errors in previously translated, re-cycled files (if out of TEP scope)"                                

* errors in the source language                                

* unclear source language                                

* errors that result from review performed on wrong or old files/builds                                

* insufficient GQM support/management                                

* insufficient EMC PM support/management                                

* review performed when specific terminology/language issues were not discussed in advance       
