KR-102963584-B1 - A METHOD OF ACQUIRING URINATION INFORMATION WITH HIGH ACCURACY
Abstract
According to one embodiment of the present specification, a method for obtaining urination information with high accuracy may be provided, comprising: dividing acoustic data into a plurality of windows; obtaining fragment target data corresponding to each window from the acoustic data; obtaining, using the obtained fragment target data, fragment classification data that distinguish between a urination section and a non-urination section, and fragment flow rate data; and obtaining urination data using the obtained fragment classification data and fragment flow rate data.
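The pipeline summarized in the abstract can be sketched as follows. This is an illustrative sketch only: the window length, hop size, and the `classify`/`predict_flow` callables are hypothetical stand-ins for the claimed classification and prediction models, not the patented implementation.

```python
import numpy as np

def split_into_windows(samples, win_len, hop):
    """Divide the acoustic data into fixed-length windows determined
    consecutively between its start point and end point."""
    return [samples[i:i + win_len]
            for i in range(0, len(samples) - win_len + 1, hop)]

def acquire_urination_data(samples, classify, predict_flow,
                           win_len=4096, hop=2048):
    """classify(window)     -> 1.0 for a urination section, 0.0 otherwise
                               (fragment classification data);
    predict_flow(window)    -> estimated urine flow rate for that window
                               (fragment flow rate data)."""
    windows = split_into_windows(samples, win_len, hop)
    fragment_class = np.array([classify(w) for w in windows])
    fragment_flow = np.array([predict_flow(w) for w in windows])
    # Combine the two: keep predicted flow only in windows
    # classified as urination sections.
    return fragment_class * fragment_flow
```

With a 50 % hop (`hop = win_len // 2`), consecutive windows partially overlap, as in the dependent claims on overlapping windows.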
Inventors
- 송지영
- 두경연
- 정지영
- 김대연
Assignees
- 사운더블헬스코리아 주식회사
Dates
- Publication Date: 2026-05-12
- Application Date: 2022-02-22
Claims (20)
- A method for obtaining urination information, comprising: acquiring acoustic data using an acoustic sensor; obtaining, using a processor, a plurality of first fragment target data corresponding to a plurality of first windows, wherein each of the first windows has a first length and is determined consecutively between a start point and an end point of the acoustic data; obtaining, using the processor, a plurality of second fragment target data corresponding to a plurality of second windows, wherein each of the second windows has a second length and is determined consecutively between the start point and the end point of the acoustic data; obtaining a plurality of fragment classification data by inputting, using the processor, the plurality of first fragment target data into a classification model, wherein the classification model is configured to output, when data related to urination sounds is input, data including at least a value distinguishing a urination section from a non-urination section; obtaining a plurality of fragment flow rate data by inputting, using the processor, the plurality of second fragment target data into a prediction model, wherein the prediction model is configured to output, when data related to urination sounds is input, data including at least a value for a urine flow rate; and acquiring, using the processor, urination data using at least the plurality of fragment classification data and the plurality of fragment flow rate data.
- The method of claim 1, wherein the plurality of first fragment target data includes m first fragment target data and the plurality of first windows includes m windows, the m first fragment target data corresponding to the m windows, wherein m is a natural number greater than or equal to 2; and wherein the plurality of second fragment target data includes n second fragment target data and the plurality of second windows includes n windows, the n second fragment target data corresponding to the n windows, wherein n is a natural number greater than or equal to 2.
- The method of claim 2, wherein, among the n windows, consecutive windows partially overlap each other.
- The method of claim 3, wherein, among the m windows, consecutive windows partially overlap each other, and a degree of overlap between consecutive windows among the m windows is different from a degree of overlap between consecutive windows among the n windows.
- The method of claim 3, wherein, among the m windows, consecutive windows partially overlap each other, and a degree of overlap between consecutive windows among the m windows is the same as a degree of overlap between consecutive windows among the n windows.
- The method of claim 2, wherein, among the m windows, consecutive windows do not overlap each other.
- The method of claim 2, wherein each of the m windows is identical to each of the n windows.
- The method of claim 2, wherein obtaining the plurality of first fragment target data comprises: converting the acoustic data into spectrogram data; and obtaining the m first fragment target data corresponding to the m windows from the spectrogram data.
- The method of claim 2, wherein obtaining the plurality of first fragment target data comprises: obtaining m first fragment acoustic data corresponding to the m windows; and obtaining the m first fragment target data by converting each of the m first fragment acoustic data into spectrogram data.
- The method of claim 1, wherein acquiring the urination data comprises: obtaining urination classification data using the plurality of fragment classification data; obtaining candidate urine flow rate data using the plurality of fragment flow rate data; and processing the candidate urine flow rate data using the urination classification data.
- The method of claim 10, wherein the urination data is obtained by performing a convolution operation on the urination classification data and the candidate urine flow rate data.
- The method of claim 10, wherein the urination data is obtained by multiplying at least a portion of the urination classification data and the candidate urine flow rate data.
- The method of claim 1, wherein the first length is the same as the second length.
- The method of claim 1, wherein the first length is different from the second length.
- A non-transitory computer-readable recording medium storing a computer program for executing the method according to claim 1.
- (deleted)
- (deleted)
- (deleted)
- (deleted)
- (deleted)
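As an illustration of claims 8 and 12 above, the sketch below slices a spectrogram of the whole acoustic data into partially overlapping windows to form fragment target data, and combines urination classification data with candidate urine flow rate data by elementwise multiplication. The FFT parameters, the 50 % overlap, and all function names are assumptions made for illustration, not limitations of the claims.

```python
import numpy as np

def spectrogram(samples, n_fft=256, hop=128):
    """Magnitude spectrogram of the acoustic data (frames x frequency bins),
    computed with a plain Hann-windowed FFT."""
    frames = [samples[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(samples) - n_fft + 1, hop)]
    return np.abs(np.fft.rfft(np.stack(frames), axis=1))

def fragment_targets(spec, frames_per_window, overlap=0.5):
    """Claim 8: obtain fragment target data as windows taken from the
    spectrogram; with overlap > 0, consecutive windows partially
    overlap each other (claim 3)."""
    step = max(1, int(frames_per_window * (1.0 - overlap)))
    return [spec[i:i + frames_per_window]
            for i in range(0, spec.shape[0] - frames_per_window + 1, step)]

def combine(classification, candidate_flow):
    """Claim 12: obtain urination data by multiplying the urination
    classification data with the candidate urine flow rate data."""
    return np.asarray(classification) * np.asarray(candidate_flow)
```

Zeroing the flow estimate outside urination sections is what suppresses environmental sounds that fall in the non-urination section, the failure mode discussed in the description.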
Description
A Method of Acquiring Highly Accurate Urination Information

This specification relates to a method for obtaining highly accurate voiding information, and more specifically to extracting voiding information from acoustic data of recorded voiding using a voiding/non-voiding section classification model and a urine flow rate prediction model.

Sounds emanating from the human body have long served as important information for assessing a person's health status or the presence of disease. In particular, sounds produced during urination can contain key information for diagnosing urinary function, and research on methods to obtain urination information by analyzing urination sounds is therefore ongoing. Specifically, by analyzing acoustic data recorded during the urination process, it is possible to predict a person's urine flow rate or voided volume.

Meanwhile, the sounds generated during the urination process can include various sounds from the surrounding environment in addition to the sounds of urination itself. Environmental sounds can be present during both the urination and non-urination sections; in particular, analysis results became inaccurate when environmental sounds occurring in the non-urination section were reflected in the acoustic data. Therefore, to obtain more accurate urination information, a method is required for analyzing acoustic data of the urination process that accounts for environmental sounds included in the non-urination section.

The technology forming the background of the present invention is disclosed in Korean Published Patent Application No. 10-2020-0002093 (published on January 8, 2020).

FIG. 1 is a diagram showing an environment for analyzing urination information according to one embodiment of the present specification.
FIG. 2 is a diagram showing the configuration of an acoustic analysis system according to one embodiment of the present specification.
FIG. 3 is a diagram showing the operation of the components of an acoustic analysis system according to one embodiment of the present specification.
FIGS. 4 and 5 are drawings illustrating a method of separating acoustic data according to windows according to an embodiment of the present specification.
FIG. 6 is a diagram illustrating a process of extracting feature values from acoustic data according to one embodiment of the present specification.
FIGS. 7 and 8 are drawings illustrating the process of acquiring target data to be analyzed according to one embodiment of the present specification.
FIG. 9 is a diagram illustrating the process of obtaining urine flow rate data using a urine flow rate prediction model according to one embodiment of the present specification.
FIG. 10 is a diagram illustrating a method for obtaining candidate urine flow rate data according to one embodiment of the present specification.
FIG. 11 is a diagram illustrating the process of obtaining classification data using a voiding/non-voiding classification model according to one embodiment of the present specification.
FIG. 12 is a diagram illustrating a method for obtaining urination classification data according to one embodiment of the present specification.
FIG. 13 is a diagram illustrating a method for obtaining urination classification data according to another embodiment of the present specification.
FIGS. 14 and 15 are drawings illustrating a method for obtaining voiding data using candidate urine flow rate data and voiding classification data according to one embodiment of the present specification.
FIGS. 16 and 17 are flowcharts illustrating a method for analyzing urination information according to one embodiment of the present specification.
FIG. 18 is a graph comparing results obtained with and without a voiding/non-voiding classification model according to one embodiment of the present specification.
FIG. 19 is a flowchart illustrating the process of training a urine flow rate prediction model according to one embodiment of the present specification.
FIG. 20 is a flowchart illustrating the process of training a voiding/non-voiding classification model according to one embodiment of the present specification.

The aforementioned purposes, features, and advantages of this specification will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. However, as this specification is subject to various modifications and may have various embodiments, specific embodiments are illustrated in the drawings and described in detail below. Throughout the specification, identical reference numbers generally denote identical components. In addition, components having identical functions within the same scope of concept appearing in the drawings of each embodiment are described using the same reference numeral, a