CN-122023146-A - Infrared polarized image fusion enhancement method, system, medium and product
Abstract
An infrared polarized image fusion enhancement method, system, medium and product relate to the technical field of image fusion. The method first obtains a basic image dataset comprising infrared radiation intensity, linear polarization degree and polarization angle images, and divides it into local image blocks whose positions correspond one to one. It then extracts global and local statistical feature values and determines the contrast state of each local image block from a numerical comparison between the two. According to that contrast state, a corresponding strategy adaptively stretches or numerically adjusts the infrared radiation intensity, linear polarization degree and polarization angle images within each local block to obtain the modulation components, which are finally mapped to a target interval and fused by weighting. By jointly analyzing global and local features, the method achieves localized differential modulation, effectively resolving the compression and loss of weak local texture detail that occurs in high dynamic scenes when related techniques rely on global statistical characteristics, and improving the detail-preserving capability of the fused image.
Inventors
- LIAO XINWU
- MIAO YIWEN
- JI XIAOQIAN
- FENG LUPING
- QI MENG
Assignees
- 北京波谱华光科技有限公司
Dates
- Publication Date: 2026-05-12
- Application Date: 2026-01-29
Claims (10)
- 1. An infrared polarized image fusion enhancement method, applied to an image processing device, comprising the following steps: acquiring raw polarization data collected by an infrared polarization detector, and performing physical-quantity analysis and preprocessing on the raw data to obtain a basic image dataset, wherein the basic image dataset comprises an infrared radiation intensity image, a linear polarization degree image and a polarization angle image; dividing the infrared radiation intensity image, the linear polarization degree image and the polarization angle image in the basic image dataset, according to uniform spatial positions, into a plurality of local image blocks whose positions correspond one to one; extracting global statistical feature values of the basic image dataset and local statistical feature values of the local image blocks; determining a contrast state of each local image block based on a numerical comparison between the global statistical feature value of the infrared radiation intensity image and the local statistical feature value of that block, wherein the contrast state comprises at least a basic steady state and a high dynamic response state; according to the contrast state of each local image block, applying a corresponding strategy to perform stretching calculation on the infrared radiation intensity image within the block to obtain an infrared intensity modulation component, and performing numerical adjustment or stretching calculation on the linear polarization degree image to obtain a linear polarization degree modulation component; determining a polarization angle modulation coefficient based on the local statistical features of each local image block, and modulating the polarization angle image with that coefficient to obtain a polarization angle modulation component; and mapping the infrared intensity modulation component, the polarization angle modulation component and the linear polarization degree modulation component to a unified target value interval, performing weighted fusion, and outputting a fused image.
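As a concrete reading of the block-division step in claim 1, the sketch below (not part of the patent; the block size is an illustrative choice) partitions one component image into non-overlapping blocks indexed by grid position, so that blocks cut from the intensity, linear polarization degree and polarization angle images at the same index cover the same spatial region.

```python
import numpy as np

def split_into_blocks(image, block_size=32):
    """Split a 2-D image into non-overlapping local blocks.

    Blocks are keyed by (row, col) grid index, so blocks taken from
    different component images at the same key correspond to the same
    spatial position, as claim 1 requires.
    """
    h, w = image.shape
    blocks = {}
    for i in range(0, h, block_size):
        for j in range(0, w, block_size):
            blocks[(i // block_size, j // block_size)] = \
                image[i:i + block_size, j:j + block_size]
    return blocks

# Example: a 64x64 image yields a 2x2 grid of 32x32 blocks.
grid = split_into_blocks(np.zeros((64, 64)), block_size=32)
```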
- 2. The method according to claim 1, wherein determining the contrast state of each local image block based on a numerical comparison between the global statistical feature value of the infrared radiation intensity image and the local statistical feature value of each local image block specifically comprises: acquiring a global maximum gray value and a global minimum gray value of the whole infrared radiation intensity image; calculating the difference between the global maximum gray value and the global minimum gray value, and determining that difference as the global contrast; acquiring a local maximum gray value and a local minimum gray value of the infrared radiation intensity within the current local image block; calculating the difference between the local maximum gray value and the local minimum gray value, and determining that difference as the local contrast; when the global contrast is smaller than a preset global contrast threshold and the local contrast is simultaneously smaller than a preset local contrast threshold, determining that the current local image block is in the basic steady state; and when the global contrast is greater than or equal to the global contrast threshold or the local contrast is greater than or equal to the local contrast threshold, determining that the current local image block is in the high dynamic response state.
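The contrast-state decision of claim 2 can be sketched as follows; the threshold values are illustrative placeholders, since the claim leaves them as preset parameters.

```python
import numpy as np

BASIC_STEADY = "basic_steady"
HIGH_DYNAMIC = "high_dynamic"

def contrast_state(s0_global, s0_block, g_thresh=180.0, l_thresh=60.0):
    """Classify one local block of the intensity image (claim 2).

    Both contrasts are max-min gray ranges. The block is 'basic steady'
    only when the global AND local ranges fall below their thresholds;
    otherwise it is in the 'high dynamic response' state.  The threshold
    values here are illustrative, not taken from the patent.
    """
    global_contrast = float(s0_global.max() - s0_global.min())
    local_contrast = float(s0_block.max() - s0_block.min())
    if global_contrast < g_thresh and local_contrast < l_thresh:
        return BASIC_STEADY
    return HIGH_DYNAMIC
```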
- 3. The method according to claim 2, wherein performing numerical adjustment or stretching calculation on the linear polarization degree image to obtain a linear polarization degree modulation component specifically comprises: when the current local image block is in the basic steady state, taking the mean linear polarization degree of all pixels of the current local image block as its linear polarization degree modulation component; when the current local image block is in the high dynamic response state, judging whether the global contrast is greater than the preset global contrast threshold and, at the same time, whether the local contrast is greater than the preset local contrast threshold; if so, performing stretching calculation on the linear polarization degree data in the current local image block according to a preset first polarization degree modulation function to obtain the linear polarization degree modulation component, wherein the first polarization degree modulation function is a linear normalization function that linearly maps the linear polarization degree data based on the local contrast of the current local image block; and if not, performing stretching calculation on the linear polarization degree data in the current local image block according to a preset second polarization degree modulation function to obtain the linear polarization degree modulation component, wherein the second polarization degree modulation function is a nonlinear enhancement function that stretches the numerical differences of the linear polarization degree data through a nonlinear transformation.
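A possible reading of claim 3's three branches, with a min-max stretch standing in for the first (linear normalization) modulation function and a gamma curve standing in for the second (nonlinear enhancement) function; the patent does not fix either function's exact form, so both choices are assumptions.

```python
import numpy as np

def dolp_modulation(dolp_block, state, global_contrast, local_contrast,
                    g_thresh=180.0, l_thresh=60.0, gamma=0.5):
    """Per-block linear-polarization-degree modulation (claim 3, sketch).

    - basic steady state: collapse the block to its mean DoLP;
    - high dynamic state, both contrasts above threshold: linear
      min-max stretch (stand-in for the first modulation function);
    - otherwise: gamma stretch (stand-in for the second, nonlinear
      modulation function).
    Thresholds and gamma are illustrative placeholders.
    """
    if state == "basic_steady":
        return np.full_like(dolp_block, dolp_block.mean())
    lo, hi = float(dolp_block.min()), float(dolp_block.max())
    span = hi - lo if hi > lo else 1.0   # guard against a flat block
    normalized = (dolp_block - lo) / span
    if global_contrast > g_thresh and local_contrast > l_thresh:
        return normalized                 # linear normalization branch
    return normalized ** gamma            # nonlinear enhancement branch
```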
- 4. The method according to claim 2, wherein determining the polarization angle modulation coefficient based on the local statistical features of each local image block comprises: calculating a local polarization angle modulation coefficient of the current local image block from the mean polarization angle of all its pixels; when the local polarization angle modulation coefficient is greater than or equal to a preset polarization angle modulation threshold, determining the local polarization angle modulation coefficient as the polarization angle modulation coefficient of the current local image block; and when the local polarization angle modulation coefficient is smaller than the preset polarization angle modulation threshold, calculating a global polarization angle modulation coefficient of the basic image dataset and determining it as the polarization angle modulation coefficient of the current local image block.
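Claim 4's coefficient selection might look like the following; the scaling of the mean angle into a coefficient and the threshold value are assumptions, as the claim does not specify them.

```python
import numpy as np

def aop_coefficient(aop_block, aop_full, threshold=0.1):
    """Pick the polarization-angle modulation coefficient for a block
    (claim 4, sketch).

    Coefficients are derived here from mean angles scaled into [0, 1]
    by dividing by pi (an assumed scaling): the local mean is used if
    it clears the threshold, otherwise the global mean is the fallback.
    """
    local_coeff = float(np.mean(aop_block)) / np.pi
    if local_coeff >= threshold:
        return local_coeff
    return float(np.mean(aop_full)) / np.pi   # global fallback
```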
- 5. The method according to claim 3, wherein mapping the infrared intensity modulation component, the polarization angle modulation component and the linear polarization degree modulation component to a unified target value interval respectively comprises: traversing the infrared intensity modulation component, the polarization angle modulation component and the linear polarization degree modulation component to obtain the global maximum value and global minimum value of each modulation component; and inputting the global maximum value, the global minimum value and a preset target-interval bit-depth value into a linear normalization model for calculation to obtain a normalized infrared component, a normalized polarization angle component and a normalized linear polarization degree component, wherein the numerical distribution ranges of the three normalized components are aligned with the target value interval.
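The linear normalization model of claim 5 is, on this reading, a plain min-max mapping onto the interval implied by a target bit depth; a minimal sketch:

```python
import numpy as np

def normalize_to_bit_depth(component, bit_depth=8):
    """Linearly map a modulation component onto [0, 2**bit_depth - 1].

    The component's global extrema and the target bit depth fully
    determine the mapping, as described in claim 5.  Flat components
    (max == min) are mapped to zero to avoid division by zero.
    """
    lo, hi = float(component.min()), float(component.max())
    target_max = 2 ** bit_depth - 1
    if hi == lo:
        return np.zeros_like(component, dtype=float)
    return (component - lo) / (hi - lo) * target_max
```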
- 6. The method according to claim 5, wherein performing weighted fusion and outputting a fused image specifically comprises: acquiring a first fusion weight, a second fusion weight and a third fusion weight corresponding respectively to the normalized infrared component, the normalized polarization angle component and the normalized linear polarization degree component; multiplying the pixel values of the normalized infrared component, the normalized polarization angle component and the normalized linear polarization degree component in each local image block by the first, second and third fusion weights respectively to obtain a first, second and third weighted component value; and accumulating the first, second and third weighted component values to obtain a single-channel gray-scale fused image.
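The weighted fusion of claim 6 is a per-pixel linear combination. The default weights below are illustrative choices, but they respect the ordering that claim 7 imposes (second > first > third):

```python
import numpy as np

def weighted_fusion(s0_n, aop_n, dolp_n, w1=0.35, w2=0.5, w3=0.15):
    """Single-channel weighted fusion of the three normalized components
    (claim 6, sketch).

    w2 > w1 favors the edge contours carried by the polarization angle
    component, and w1 > w3 keeps the infrared thermal distribution
    dominant over the linear polarization degree, per claim 7.
    The specific values are a free design choice.
    """
    return w1 * s0_n + w2 * aop_n + w3 * dolp_n
```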
- 7. The method according to claim 6, wherein the second fusion weight is greater than the first fusion weight, so that the gray-scale fused image preferentially preserves the edge contour features of the polarization angle modulation component; and the first fusion weight is greater than the third fusion weight, so that the gray-scale fused image retains the thermal radiation distribution characteristics of the infrared intensity modulation component.
- 8. An image processing device comprising one or more processors and a memory, the memory being coupled to the one or more processors and storing computer program code comprising computer instructions which, when invoked by the one or more processors, cause the image processing device to perform the method of any one of claims 1-7.
- 9. A computer-readable storage medium storing computer instructions which, when run on an image processing device, cause the image processing device to perform the method of any one of claims 1-7.
- 10. A computer program product comprising computer programs/instructions which, when run on an image processing device, cause the image processing device to perform the method of any one of claims 1-7.
Description
Technical Field
The application relates to the technical field of image fusion, and in particular to an infrared polarized image fusion enhancement method, system, medium and product.
Background
Infrared polarization imaging combines the thermal radiation intensity information of a target with the polarization vector information of its surface, and can markedly improve target identification in complex environments such as thermal crossover and rain or fog. To facilitate observation by the human eye or recognition by a machine, an image processing algorithm is generally required to fuse the infrared radiation intensity image with the resolved linear polarization degree and polarization angle images, so that the infrared thermal characteristics of the scene and the contour and texture details of the target are presented simultaneously in a single image. In the related art, such multi-source fusion is usually handled by first computing each component image from the Stokes vector, then enhancing each component with global histogram equalization or a linear stretch based on the full-image maximum and minimum, and finally completing fusion through simple weighted superposition or a color-space transform. The core of this processing path is that the mapping relation and gain coefficients of the image are determined uniformly from global statistical distribution characteristics, such as the gray mean, variance or extrema of the whole image.
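The Stokes-vector decomposition mentioned above is standard polarimetry. Assuming four analyzer-angle intensity images (0°, 45°, 90°, 135°) from, e.g., a division-of-focal-plane polarimeter, and omitting the patent's preprocessing steps (non-uniformity correction and the like), it can be sketched as:

```python
import numpy as np

def stokes_decompose(i0, i45, i90, i135):
    """Compute S0, DoLP and AoP from four analyzer-angle intensity images.

    Standard Stokes relations: S0 is the total radiation intensity,
    DoLP the degree of linear polarization, AoP the polarization angle.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)          # infrared intensity image
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aop = 0.5 * np.arctan2(s2, s1)              # radians, in (-pi/2, pi/2]
    return s0, dolp, aop
```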
However, in complex high-dynamic-range scenes such as those with extreme temperature differences, the stretching range and gain factor of the image are dominated by the brightest or darkest pixels of the whole image, so the limited gray scale available to local areas with low overall brightness or little temperature variation is compressed further. Weak texture details and low-contrast targets in such areas cannot obtain sufficient gray contrast, so key detail information is blurred or even completely lost in the fused image, reducing the effectiveness of infrared polarized image fusion.
Disclosure of Invention
The application provides an infrared polarized image fusion enhancement method, system, medium and product for improving the effectiveness of infrared polarized image fusion in complex scenes. The method, applied to image processing equipment, comprises: acquiring raw polarization data collected by an infrared polarization detector, and performing physical-quantity analysis and preprocessing on the raw data to obtain a basic image dataset comprising an infrared radiation intensity image, a linear polarization degree image and a polarization angle image; dividing those images, according to uniform spatial positions, into a plurality of local image blocks whose positions correspond one to one; extracting global statistical feature values of the basic image dataset and local statistical feature values of the local image blocks; determining a contrast state of each local image block, comprising at least a basic steady state and a high dynamic response state, based on a numerical comparison between the global statistical feature values of the infrared radiation intensity image and the local statistical feature values of that block; performing stretching calculation on the infrared radiation intensity image within each local block with a strategy corresponding to its contrast state to obtain an infrared intensity modulation component, and performing numerical adjustment or stretching calculation on the linear polarization degree image to obtain a linear polarization degree modulation component; determining a polarization angle modulation coefficient from the local statistical features of each block and modulating the polarization angle image with it to obtain a polarization angle modulation component; and mapping the infrared intensity modulation component, the polarization angle modulation component and the linear polarization degree modulation component to a unified target value interval, performing weighted fusion, and outputting a fused image. By a