
KR-20260067967-A - Method and apparatus for assisting in generating evaluation results using generative AI models


Abstract

According to one feature of the present disclosure, a method for assisting in the creation of an evaluation plan using a generative AI model executed on an evaluation assistance server is provided. The method comprises the steps of: receiving an evaluation plan creation command and creating an evaluation plan; receiving scoring elements as input; and generating detailed scoring elements using a generative AI model based on the scoring elements.
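The steps recited in the abstract (receive a plan-creation command, create a plan, receive scoring elements, expand them into detailed scoring elements with a generative model) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the data structure, function names, and the placeholder `generate_detailed_elements` (which stands in for an actual LLM call) are all assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """An evaluation plan holding scoring elements and their generated details."""
    title: str
    scoring_elements: list[str] = field(default_factory=list)
    detailed_elements: dict[str, list[str]] = field(default_factory=dict)

def generate_detailed_elements(element: str, achievement_standard: str) -> list[str]:
    """Stand-in for the generative AI call: expand one scoring element into
    detailed sub-criteria. A real system would send `prompt` to an LLM."""
    prompt = (
        f"Achievement standard: {achievement_standard}\n"
        f"Scoring element: {element}\n"
        "List detailed scoring sub-elements for grading an essay answer."
    )
    # Placeholder output; an LLM response to `prompt` would replace this.
    return [
        f"{element}: relevance to the achievement standard",
        f"{element}: logical organization of the answer",
        f"{element}: use of supporting evidence",
    ]

def create_evaluation_plan(title: str, elements: list[str],
                           achievement_standard: str) -> EvaluationPlan:
    """Create a plan, then expand each scoring element into detailed elements."""
    plan = EvaluationPlan(title=title, scoring_elements=list(elements))
    for element in elements:
        plan.detailed_elements[element] = generate_detailed_elements(
            element, achievement_standard)
    return plan
```

For example, `create_evaluation_plan("Unit 3 essay", ["Argument clarity"], "Explains causes and effects")` yields a plan whose `detailed_elements` maps each scoring element to its generated sub-criteria.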

Inventors

  • 김기범
  • 고동완
  • 김동근
  • 김동진

Assignees

  • 주식회사 데이터드리븐 (DataDriven Co., Ltd.)

Dates

Publication Date
2026-05-13
Application Date
2025-07-21

Claims (1)

  1. A method for assisting in the generation of evaluation results using a generative AI model executed on an evaluation assistance server.

Description

Method and apparatus for assisting in generating evaluation results using generative AI models

The present disclosure relates to a method for assisting in the evaluation of essay (descriptive) questions and, more specifically, to a method and apparatus for assisting in the generation of evaluation results using a generative AI model.

Essay questions are highly effective for evaluating a student's or examinee's thinking skills, creativity, and logical organization. Unlike multiple-choice questions, they give the examinee the opportunity to express their thoughts freely and require comprehensive thinking rather than simple rote memorization. Because of these characteristics, essay questions play a crucial role in deeply assessing academic achievement or understanding of a specific topic. They are also useful for providing educational feedback, since they reveal the examinee's thought process and problem-solving methods.

Despite these advantages, grading essay questions is time-consuming and labor-intensive. Because evaluators must review each response individually, scores may vary with subjective judgment, and it is difficult to grade a large number of responses consistently. Defining the range of correct answers is also complex, since different valid responses may exist for the same question. Standardized criteria are needed to reduce variance among graders, but strictly applying such standards in practice is difficult. Consequently, grading essay questions can be inefficient and has limitations in guaranteeing fairness.

FIG. 1 is a schematic diagram illustrating an evaluation assistance system (100) according to one embodiment of the present disclosure.
FIG. 2 is a functional block diagram schematically illustrating the functional configuration of the user terminal (110) shown in FIG. 1 according to one embodiment of the present disclosure.
FIG. 3 is a functional block diagram schematically illustrating the functional configuration of the evaluation assistance device (130) shown in FIG. 1 according to one embodiment of the present disclosure.
FIG. 4 is a functional block diagram schematically illustrating the functional configuration of the evaluation plan module shown in FIG. 3 according to one embodiment of the present disclosure.
FIG. 5 is a drawing exemplarily illustrating evaluation plan data stored in an evaluation plan database according to one embodiment of the present disclosure, and FIG. 6 is a drawing exemplarily illustrating a screen on which evaluation plan data stored in the evaluation plan database is displayed to a user.
FIG. 7 is an exemplary screen in which, according to one embodiment of the present disclosure, a scoring-criteria generation AI module generates detailed scoring elements for a scoring element based on a scoring criterion (e.g., a scoring element) and/or an achievement standard.
FIG. 8 is an exemplary screen in which, according to one embodiment of the present disclosure, the scoring-criteria generation AI module generates detailed scoring elements based on scoring criteria (scoring elements) or achievement standards according to a predetermined evaluation mode.
FIG. 9 is an operational flowchart conceptually illustrating the process by which the evaluation plan module generates detailed scoring elements for a scoring element according to one embodiment of the present disclosure.
FIG. 10 is a functional block diagram schematically illustrating the functional configuration of the evaluation-result generating AI module shown in FIG. 3 according to one embodiment of the present disclosure.
FIG. 11 is a drawing illustrating an exemplary performance evaluation plan according to one embodiment of the present disclosure, and FIG. 12 is a drawing illustrating an exemplary scoring result generated when the evaluation-result generating AI module evaluates a task result according to that evaluation plan's scoring criteria.
FIG. 13 is an operational flowchart conceptually showing the process by which the evaluation-result generating AI module generates an evaluation result according to one embodiment of the present disclosure.
FIG. 14 is a drawing exemplarily illustrating a comprehensive record generated by a comprehensive-record generation AI module for a given student according to one embodiment of the present disclosure.
FIG. 15 is an operational flowchart conceptually showing the process by which the evaluation-result generating AI module generates an evaluation result according to one embodiment of the present disclosure.

Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the following, detailed descriptions of already known functions and configurations are omitted where they would unnecessarily obscure the essence of the present disclosure. Furthermore
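The downstream workflow sketched in FIGS. 10 through 14 (scoring a task result against the rubric, then summarizing a student's results into a comprehensive record) might be outlined as below. This is a toy sketch under stated assumptions: the keyword-matching scorer is a trivial stand-in for the evaluation-result generating AI module, and `comprehensive_record` is a hypothetical stand-in for the comprehensive-record generation AI module; none of the names or logic come from the patent itself.

```python
def score_response(response: str, rubric: dict[str, int]) -> dict:
    """Toy stand-in for the evaluation-result AI module: award a rubric
    item's points when its keyword appears in the response text."""
    awarded = {item: points if item.lower() in response.lower() else 0
               for item, points in rubric.items()}
    return {"per_item": awarded,
            "total": sum(awarded.values()),
            "max": sum(rubric.values())}

def comprehensive_record(student: str, results: list[dict]) -> str:
    """Stand-in for the comprehensive-record AI module: condense all of a
    student's scoring results into a one-line summary record."""
    total = sum(r["total"] for r in results)
    maximum = sum(r["max"] for r in results)
    return f"{student}: {total}/{maximum} points across {len(results)} task(s)."
```

In a real system, both functions would delegate to a generative model prompted with the detailed scoring elements and the examinee's response; the keyword check here only makes the data flow between the two modules concrete.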