KR-102962729-B1 - AI-based game content translation verification method and system
Abstract
The following embodiments relate to a technology for automatically verifying translation quality issues that occur during the game localization process. More specifically, they relate to an integrated quality verification system that utilizes a neural network-based vision model to detect and classify, in real time, linguistic and visual display errors in translated text extracted from game screens. According to one embodiment, an electronic device that communicates with a user terminal to provide a game translation check service comprises a memory storing at least one instruction and a processor executing the at least one instruction, wherein the processor receives game screen data, which is data regarding a game screen translated from a first language to a second language; extracts, from the game screen data, text data including a plurality of texts displayed in the second language; determines, by analyzing the text data through a vision model pre-trained based on a neural network, whether a translation error occurred during the process of translating from the first language to the second language; and generates an error report based on the result of that determination.
Inventors
- 윤강원
Assignees
- (주) 라티스글로벌커뮤니케이션스
Dates
- Publication Date
	- 2026-05-08
- Application Date
	- 2025-09-15
Claims (5)
- An electronic device that communicates with a user terminal to provide a game translation check service, comprising: a memory storing at least one instruction; and a processor executing the at least one instruction, wherein the processor: receives, from the user terminal, game screen data, which is data for a game screen translated from a first language to a second language; extracts, from the game screen data, text data including a plurality of texts displayed in the second language; determines, by analyzing the text data through a vision model pre-trained based on a neural network, whether a translation error occurred during the process of translating from the first language to the second language; and generates an error report based on the result of the translation error determination, wherein the processor further: extracts, from the game screen data, a container area representing the boundary coordinates and size of UI (User Interface) elements, including buttons, dialogue boxes, and menu boxes containing text; extracts a text display area representing the coordinates and size of the pixel area actually occupied by the extracted text data; compares the two areas to determine whether the text display area extends beyond the boundary of the container area; when the text display area is determined to extend beyond the boundary of the container area, calculates, using the vision model, the ratio of the text extending beyond the boundary of the container area to the total text; and when the calculated ratio exceeds a predefined threshold, further determines, by comparison with the original text in the first language, whether the core meaning of the original text has been lost, and records a text truncation error in the error report,
wherein, when determining whether the core meaning of the original text has been lost, the part of speech of the portion extending beyond the boundary of the container area is analyzed through dependency parsing to determine whether that portion is a main component or a subordinate component of the sentence, and if the portion extending beyond the boundary of the container area contains essential components including a subject and an object, or irreplaceable information including numbers and proper nouns, it is determined that the core meaning of the original text has been lost. Electronic device.
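The overflow test and core-meaning check in claim 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Box` class, the `OVERFLOW_THRESHOLD` value, and the Universal Dependencies-style labels in `ESSENTIAL_DEPS` are all hypothetical stand-ins for the claim's "predefined threshold" and "essential components including a subject and an object".

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box: top-left corner plus width/height, in pixels."""
    x: float
    y: float
    w: float
    h: float

def overflow_ratio(text: Box, container: Box) -> float:
    """Fraction of the text display area lying outside the container area."""
    # Width and height of the intersection of the two boxes (0 if disjoint).
    ix = max(0.0, min(text.x + text.w, container.x + container.w) - max(text.x, container.x))
    iy = max(0.0, min(text.y + text.h, container.y + container.h) - max(text.y, container.y))
    area = text.w * text.h
    return 1.0 - (ix * iy) / area if area else 0.0

# Hypothetical threshold; the claim only says "predefined".
OVERFLOW_THRESHOLD = 0.05

# Hypothetical dependency labels standing in for "subject and object".
ESSENTIAL_DEPS = {"nsubj", "obj"}

def core_meaning_lost(truncated_tokens) -> bool:
    """truncated_tokens: (surface, dep_label, is_number_or_proper_noun) tuples
    for the tokens that fall outside the container boundary."""
    return any(dep in ESSENTIAL_DEPS or flagged
               for _, dep, flagged in truncated_tokens)
```

A text box half outside its container yields `overflow_ratio == 0.5`, which exceeds the threshold and triggers the dependency-based check on the clipped tokens.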
- (Deleted)
- In claim 1, wherein the processor further: when the second language is a right-to-left (RTL) language, extracts the start-point and end-point coordinates of each of the plurality of texts from the game screen data to determine the text progression direction; identifies the on-screen placement order and alignment reference points of the UI elements; detects as a layout error any case where the text progression direction runs from left to right, or where the determined placement order of the UI elements is arranged contrary to RTL language rules; quantifies and evaluates, through the vision model, the degree to which the detected layout error affects the user's readability and game progress; and automatically generates, in the error report, specific correction guides for the correct text direction and UI layout, wherein, when quantifying and evaluating the impact of the layout error on the user's readability and game progress, an impact evaluation is performed through eye-tracking heatmap simulation, and a cognitive load score is calculated by comparing the mirrored F-pattern, the typical eye-tracking pattern of RTL-language users, against the actual UI layout. Electronic device.
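The direction check in claim 3 can be illustrated with a small sketch. The `text_runs` schema (`id`, `start_x`, `end_x`) and the `RTL_LANGS` set are assumptions made for the example; the claim itself does not specify a data format, and the heatmap simulation and cognitive-load scoring are omitted.

```python
RTL_LANGS = {"ar", "he", "fa", "ur"}  # illustrative set of RTL language codes

def progression_direction(start_x: float, end_x: float) -> str:
    """Infer reading direction from a text run's start/end x-coordinates."""
    return "ltr" if end_x > start_x else "rtl"

def detect_rtl_layout_errors(text_runs, target_lang: str):
    """text_runs: list of dicts with 'id', 'start_x', 'end_x' (hypothetical
    schema for the coordinates extracted from the game screen data)."""
    if target_lang not in RTL_LANGS:
        return []
    errors = []
    for run in text_runs:
        if progression_direction(run["start_x"], run["end_x"]) == "ltr":
            errors.append({
                "id": run["id"],
                "kind": "layout_error",
                "detail": "text progresses left-to-right in an RTL build",
            })
    return errors
```

A run whose end x-coordinate is greater than its start is flagged in an RTL build; the same input for an LTR target language produces no errors.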
- (Deleted)
- (Deleted)
Description
AI-based game content translation verification method and system

The following embodiments relate to a technology for automatically verifying translation quality issues that occur during the game localization process. More specifically, they relate to an integrated quality verification system that utilizes a neural network-based vision model to detect and classify linguistic and visual display errors in translated text extracted from game screens in real time.

With the rapid growth of the global game market, multilingual support has become an essential element. In particular, as games from Asian markets such as Korea, China, and Japan expand into North American and European markets, the importance of accurate and natural localization is becoming increasingly prominent. Game localization goes beyond simply translating text and requires a comprehensive adaptation process that considers the cultural context and linguistic characteristics of each region. In this process, Linguistic Quality Assurance (LQA) is recognized as an essential step for verifying in advance various quality issues that may arise when translated content is applied to game builds.

The traditional LQA process has primarily relied on human testers manually playing games to identify translation errors. Testers inspect each screen individually to identify issues such as text truncation, broken characters, missing translations, contextually inappropriate translations, and errors in the use of honorifics between characters, recording these findings in spreadsheets or dedicated reporting tools. This manual approach has several fundamental limitations. First, for large-scale games, the sheer volume of text to be verified makes it practically impossible to thoroughly review every piece of content. For instance, modern RPGs contain millions of words of dialogue and thousands of UI elements, requiring an enormous amount of time and manpower to verify everything manually.
Second, there is the issue of inconsistency caused by the subjectivity and fatigue of human testers. The same error may be judged differently by each tester, and accumulated fatigue from long working hours significantly reduces the error detection rate. In particular, the detection of errors related to subtle linguistic nuances or cultural contexts can vary greatly depending on the tester's language proficiency and cultural background. Third, it is difficult to respond to repetitive updates and patches. Modern games evolve through continuous content updates and patches, and manually re-verifying every change is inefficient and incurs high costs.

From a technical perspective, the text displayed on the game screen is not merely a simple string of characters but is implemented through a complex UI system and rendering engine. UI elements developed in the original language can cause various visual issues due to differences in text length when translated. For example, while text length typically decreases by 20–30% when translating from English into Korean, it can increase by 30–40% when translating into German. These length differences lead to text truncation or overflow issues in UI elements such as buttons, dialogue boxes, and menu boxes. Furthermore, in the case of RTL (Right-to-Left) languages like Arabic or Hebrew, the orientation of the entire UI layout must be changed; this process frequently results in misalignment of UI elements or distortion of the visual hierarchy.

From a linguistic perspective, even more complex issues exist. Dialogue between in-game characters must maintain an appropriate tone and level of honorifics based on each character's personality, relationship, and situation. Particularly when translating into languages with complex honorific systems, such as Korean and Japanese, consistent use of honorifics that consider the characters' age, position, and level of intimacy is essential.
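The length-expansion problem described above can be pre-screened before any rendering takes place. The sketch below is a rough heuristic, not part of the patented method: the `EXPANSION_FACTOR` values are illustrative numbers drawn from the ranges cited in the description (German roughly +30–40%, Korean roughly −20–30% relative to English), and real fonts and scripts would require actual text measurement.

```python
# Illustrative per-language width factors relative to the English source text.
EXPANSION_FACTOR = {"de": 1.4, "ko": 0.75}

def overflow_risk(source_width_px: float, container_width_px: float,
                  target_lang: str) -> bool:
    """Rough pre-render check: does the estimated translated text width
    exceed the width of its fixed-size UI container?"""
    factor = EXPANSION_FACTOR.get(target_lang, 1.0)
    return source_width_px * factor > container_width_px
```

Such a heuristic can only flag candidates for review; the vision-model pipeline described in the claims operates on the actually rendered screen, where font substitution and line wrapping determine the true pixel footprint.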
However, maintaining such consistency is extremely difficult when thousands of dialogue files are processed by multiple translators; consequently, unnatural phenomena occur where the same character uses different speech patterns depending on the situation.

Although some automation tools have been developed recently, they are primarily limited to simple text comparisons or basic string length checks. For example, while translation memory systems can verify consistency with previously translated text, they cannot verify how the text is displayed on the actual game screen. Furthermore, because existing automation tools do not understand context, they fail to detect translations that are grammatically correct but inappropriate for the game situation. For instance, the word "Save" should be translated in its data-saving sense in the game menu but in its rescue sense in combat situations, yet existing tools cannot distinguish these contextual differences.

FIG. 1 is a diagram illustrating the relationship between a user terminal and an electronic device according to one embodiment.
FIG. 2 is a diagram illustrating t