CN-116485993-B - Real-time panoramic three-dimensional reconstruction method based on virtual stereoscopic unwrapping method
Abstract
The invention discloses a real-time panoramic three-dimensional reconstruction method based on a virtual stereo phase unwrapping (VSPU) method. The system comprises only a single camera, a projector and a turntable. At each rotation angle the system projects and captures three-step phase-shift sinusoidal fringes; after the wrapped phase is obtained, the virtual camera-projector systems formed at adjacent rotation angles assist the unwrapping, and a 360-degree three-dimensional model is finally synthesized. Because only a single camera is adopted, the system cost is lower than that of a multi-camera SPU system, and the virtual camera-projector systems provide more viewing angles for the unwrapping, so the unwrapping stability is greatly improved.
Inventors
- LIN BIN
- WANG HENGYU
Assignees
- Zhejiang University (浙江大学)
Dates
- Publication Date: 2026-05-05
- Application Date: 2023-03-03
Claims (5)
- 1. A real-time panoramic three-dimensional reconstruction method based on the virtual stereo phase unwrapping (VSPU) method, characterized by comprising the following steps: constructing a three-dimensional reconstruction system, wherein the system comprises a computer, a projector, a camera, a turntable controller and a turntable, the projector, the camera and the turntable controller are respectively connected with the computer, the turntable is connected with the turntable controller, and the projector is connected with the camera; calibrating the projector, the camera and the turntable to obtain the intrinsic and extrinsic parameters of the camera and the projector and the position of the rotation axis of the turntable; sending a rotation instruction to the turntable controller so that the turntable rotates at a constant speed; performing a single virtual stereo phase unwrapping (VSPU); after the single VSPU is finished, delaying for a set interval and performing the next VSPU process until the 360-degree scan is completed. The specific process of the single VSPU comprises the following steps: the projector projects $2m+1$ sets of multi-step phase-shift patterns with a fixed delay between successive sets; at each projection the camera is hardware-triggered to capture the fringe pattern distorted by the object; the wrapped phase maps at the $2m+1$ positions are calculated respectively from the multi-step phase-shift images at the different angles; according to the wrapped phase map at position 0, a limited number of possible fringe-order values $k$ are obtained through the constant depth constraint (CDC), and the point clouds corresponding to the different $k$ values are obtained by triangulation; the obtained point clouds are respectively projected onto the virtual camera target surface and the virtual projector target surface at each adjacent position, the wrapped phases of the points on the virtual camera and the virtual projector are obtained, and the absolute value of their difference gives the wrapped phase error map at that position; the wrapped phase error maps are summed to obtain the error under each $k$ value, and the $k$ value with the smallest error is taken to obtain the final unwrapped $k$-value map; according to the unwrapped $k$-value map, the point cloud at the position-0 angle is obtained by triangulation and registered into the 360-degree panoramic model. Calculating the wrapped phase maps at the different positions from the multi-step phase-shift images specifically comprises: the projector projects the phase-shifted fringes onto the object, and the camera captures a distorted fringe pattern, which can be expressed as
$$I_n(u^c,v^c) = A(u^c,v^c) + B(u^c,v^c)\cos\big(\phi(u^c,v^c) - \delta_n\big),\qquad \delta_n = \frac{2\pi n}{N},\quad n = 0,1,\dots,N-1,$$
wherein $I_n$ represents the image of the $n$-th projected fringe pattern, $N$ represents the number of phase steps, $(u^c,v^c)$ represents the pixel coordinates of the camera image (omitted hereinafter for simplicity), $A$ represents the average intensity, $B$ represents the modulation amplitude, $\phi$ represents the phase, and $\delta_n$ represents the phase shift. The phase can be obtained by the phase-shift algorithm:
$$\phi = \arctan\frac{\sum_{n=0}^{N-1} I_n\sin\delta_n}{\sum_{n=0}^{N-1} I_n\cos\delta_n},$$
where the signed arctangent operation uses the signs of the numerator and denominator to expand the wrapped phase to $(-\pi,\pi]$.
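The wrapped-phase computation of claim 1 (N-step phase shift recovered with a signed arctangent) can be sketched as follows; the function name, array layout, and the shift convention $I_n = A + B\cos(\phi - 2\pi n/N)$ are illustrative assumptions, not taken verbatim from the patent:

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N-step phase-shifted fringe images.

    images: array of shape (N, H, W); image n is assumed to follow
    I_n = A + B*cos(phi - 2*pi*n/N).
    """
    n = np.arange(len(images)).reshape(-1, 1, 1)
    delta = 2 * np.pi * n / len(images)
    num = np.sum(images * np.sin(delta), axis=0)
    den = np.sum(images * np.cos(delta), axis=0)
    # atan2 uses the signs of numerator and denominator, so the phase
    # is recovered over the full signed range rather than a half period.
    return np.arctan2(num, den)
```

With three steps this reduces to the three-step algorithm the abstract mentions; more steps trade capture time for noise robustness.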
- 2. The real-time panoramic three-dimensional reconstruction method based on a virtual stereo phase unwrapping method according to claim 1, characterized in that obtaining a limited number of possible $k$ values through the constant depth constraint (CDC) according to the wrapped phase map at position 0 specifically comprises: in CDC, the system sets a rough depth range $[z_{\min}, z_{\max}]$ according to the size of the object, and the admissible fringe orders are the integers $k$ for which the triangulated point falls inside this range, expressed as $k \in [k_{\min}, k_{\max}]$. For a 360-degree measurement system, CDC only excludes some out-of-range $k$ values; because the depth range of the measured object can change greatly at different angles, CDC alone cannot eliminate the phase ambiguity.
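The depth-constraint step of claim 2 can be illustrated as follows. This sketch assumes the depth bounds $[z_{\min}, z_{\max}]$ have already been converted, via calibration, into an interval of projector columns $[u_{\min}, u_{\max}]$ that the pixel may correspond to; all names are hypothetical:

```python
import numpy as np

def candidate_orders(phi, u_min, u_max, period):
    """Candidate fringe orders for one camera pixel under a depth constraint.

    phi: wrapped phase in (-pi, pi] at the pixel.
    [u_min, u_max]: projector-column interval implied by the rough depth
    range (assumed precomputed from calibration).
    period: fringe period in projector pixels.
    Returns the integers k with u_min <= (phi/(2*pi) + k) * period <= u_max.
    """
    k_lo = int(np.ceil(u_min / period - phi / (2 * np.pi)))
    k_hi = int(np.floor(u_max / period - phi / (2 * np.pi)))
    return list(range(k_lo, k_hi + 1))
```

The wider the depth range, the more candidate orders survive, which is why CDC alone cannot resolve the ambiguity and the virtual systems are needed.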
- 3. The real-time panoramic three-dimensional reconstruction method based on the virtual stereo phase unwrapping method according to claim 2, wherein projecting the obtained point clouds onto the virtual camera target surface and the virtual projector target surface at each adjacent position, obtaining the wrapped phases of the points on the virtual camera and the virtual projector, and taking the absolute value of their difference as the wrapped phase error map at that position specifically comprises: taking the object to be measured as the reference frame, the camera-projector system is regarded as rotating around the turntable axis, forming virtual camera-projector systems; adjacent systems differ by an angle $\theta = \omega\,\Delta t$, wherein $\omega$ denotes the rotational speed of the turntable and $\Delta t$ the time interval between capturing successive sets of phase-shift images. Taking the turntable axis as the world coordinate system, the relationship of the camera and the projector at the different angles can be expressed as
$$\mathbf{E}_i^{c} = \mathbf{E}^{c}\,\mathbf{R}(i\theta),\qquad \mathbf{E}_i^{p} = \mathbf{E}^{p}\,\mathbf{R}(i\theta),$$
where $\mathbf{R}(i\theta)$ is the rotation about the turntable axis by $i\theta$. Unlike a spatial stereo unwrapping system, the projector in the virtual system also rotates; in order to use the adjacent-angle virtual camera-projector for auxiliary unwrapping, the possible points obtained by the main camera are projected onto the adjacent virtual camera and virtual projector respectively, expressed as
$$s\begin{bmatrix}u_i^{c}\\ v_i^{c}\\ 1\end{bmatrix} = \mathbf{K}^{c}\,\mathbf{E}_i^{c}\,\mathbf{P}_k^{w},\qquad s\begin{bmatrix}u_i^{p}\\ v_i^{p}\\ 1\end{bmatrix} = \mathbf{K}^{p}\,\mathbf{E}_i^{p}\,\mathbf{P}_k^{w},$$
wherein $(u_i^{c},v_i^{c})$ and $(u_i^{p},v_i^{p})$ respectively represent the coordinates of the $k$-th possible point projected onto the target surfaces of the $i$-th camera and projector, $\mathbf{K}^{c}$ and $\mathbf{K}^{p}$ represent the intrinsic matrices of the camera and the projector, $\mathbf{E}_i^{c}$ and $\mathbf{E}_i^{p}$ respectively represent the extrinsic matrices of the $i$-th virtual camera and projector in the world coordinate system, and $\mathbf{P}_k^{w}$ represents the $k$-th possible point in the world coordinate system. The phase projected onto the camera target surface, $\phi_i^{c}(k)$, is obtained by two-dimensional interpolation of the wrapped phase map captured at the corresponding rotation angle, with phase compensation applied at the phase jumps. Since the epipolar constraint axis of the camera-projector pair is the $u$-axis, the phase value projected onto the projector target surface can be expressed as
$$\phi_i^{p}(k) = W\!\left(\frac{2\pi u_i^{p}}{T}\right),$$
where $T$ is the fringe period in projector pixels and $W(\cdot)$ wraps the phase into $(-\pi,\pi]$. The wrapped phase errors of the possible points projected onto the virtual projector and camera are adopted as the discrimination standard for screening the possible points, the errors being expressed as
$$\Delta\phi_i(k) = W\!\big(\phi_i^{c}(k)-\phi_i^{p}(k)\big),\qquad e_i(k) = \big|\Delta\phi_i(k)\big|,$$
wherein $\Delta\phi_i(k)$ is the wrapped phase difference of the $k$-th possible point under the $i$-th virtual camera-projector system and $e_i(k)$ represents the corresponding discrimination error; a jump is actively applied so that the phase deviation stays within $(-\pi,\pi]$.
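A compact sketch of the consistency check in claim 3: a candidate point is projected into one virtual camera-projector pair, the camera-side wrapped phase is read from that angle's phase map, the projector-side phase is derived from the projected column, and the absolute wrapped difference is the error. Nearest-neighbour lookup stands in for the two-dimensional interpolation the claim describes, and all names are illustrative:

```python
import numpy as np

def wrap(x):
    """Wrap a phase value into [-pi, pi)."""
    return (x + np.pi) % (2 * np.pi) - np.pi

def project(K, E, P):
    """Pinhole projection of a world point P with 3x3 intrinsics K and 3x4 extrinsics E."""
    p = K @ (E @ np.append(P, 1.0))
    return p[0] / p[2], p[1] / p[2]

def phase_error(K_c, E_c, phase_map, K_p, E_p, period, P):
    """Wrapped-phase consistency error of one candidate point in one
    virtual camera-projector pair (a sketch, not the patent's exact
    implementation)."""
    u_c, v_c = project(K_c, E_c, P)
    phi_c = phase_map[int(round(v_c)), int(round(u_c))]  # camera-side wrapped phase
    u_p, _ = project(K_p, E_p, P)
    phi_p = wrap(2 * np.pi * u_p / period)               # projector-side phase from column
    return abs(wrap(phi_c - phi_p))
```

A correct candidate yields a near-zero error in every virtual pair; a wrong fringe order lands on an inconsistent projector column and produces a large wrapped difference.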
- 4. The real-time panoramic three-dimensional reconstruction method based on a virtual stereo phase unwrapping method according to claim 1 or 3, wherein summing the wrapped phase error maps to obtain the wrapped phase error under the different $k$ values and taking the $k$ value with the smallest error to obtain the final unwrapped $k$-value map specifically comprises: the finally selected $k$ value minimizes the sum of the deviations over all virtual systems,
$$E(k)=\sum_{\substack{i=-m\\ i\neq 0}}^{m} e_i(k),\qquad k^{*}=\arg\min_{k} E(k),$$
wherein $k^{*}$ is the $k$ value obtained by the VSPU method, $E(k)$ is the phase error under the different $k$ values, $\arg\min$ is the index function of the smallest value, and $m$ is the number of adjacent virtual systems selected on each side, so that the number of systems assisting the unwrapping is $2m$.
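The order selection of claim 4 (sum the error maps over the assisting virtual systems, then take the per-pixel minimizing order) can be sketched as follows, with an assumed array layout:

```python
import numpy as np

def select_orders(error_maps):
    """Per-pixel fringe order minimising the summed wrapped-phase error.

    error_maps: array of shape (2m, n_k, H, W) -- the error of each
    candidate order in each of the 2m assisting virtual systems
    (layout assumed for illustration).
    Returns the index map k*(u, v) = argmin_k sum_i error_i(k).
    """
    total = error_maps.sum(axis=0)   # sum over the 2m virtual systems
    return np.argmin(total, axis=0)  # per-pixel minimising order index
```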
- 5. The real-time panoramic three-dimensional reconstruction method based on the virtual stereo phase unwrapping method according to claim 4, wherein the single virtual stereo phase unwrapping process comprises five projections of the multi-step phase-shift patterns ($m=2$), each with a delay of $\Delta t$, and after the VSPU process of a single angle is finished a delay of $\Delta t'$ is applied before the next VSPU process is carried out, specifically: first, the turntable starts and accelerates at a constant angular acceleration to the uniform speed $\omega$, with a set stabilization time; the system then projects and captures a set of fringe patterns for the VSPU each time the set time interval elapses, and the set registration angle interval is obtained from the rotational speed and the capture intervals.
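Under the constant-speed timing of claim 5, the rotation angle of each of the $2m+1$ captures in one VSPU run follows directly from the turntable speed and the inter-set delay. A sketch with assumed symbols (`omega` and `dt` are illustrative, not the patent's notation):

```python
def capture_angles(omega, dt, m):
    """Rotation angle of each of the 2m+1 captures in one VSPU run,
    relative to the middle (position-0) capture.

    omega: turntable speed (degrees per second), assumed constant.
    dt: delay between successive pattern sets (seconds).
    """
    return [i * omega * dt for i in range(-m, m + 1)]
```

For claim 5's five-capture case (m = 2) this gives the five equally spaced angles whose wrapped phase maps feed the four assisting virtual systems.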
Description
Real-time panoramic three-dimensional reconstruction method based on virtual stereoscopic unwrapping method

Technical Field

The invention relates to the field of optical measurement, in particular to a real-time panoramic three-dimensional reconstruction method based on a virtual stereoscopic unwrapping method.

Background

Optical three-dimensional (3D) morphology reconstruction technology is widely applied in fields such as intelligent manufacturing and medical treatment [1-3]. Among its techniques, Fringe Projection Profilometry (FPP) [4-5] plays an important role because of its high accuracy and full-field reconstruction. However, single-view surface reconstruction cannot obtain all the information of the target due to shadow occlusion and similar problems, and gradually fails to meet the requirements in fields such as reverse modeling and industrial inspection [6]; rapid, high-precision, 360-degree three-dimensional reconstruction has therefore attracted increasing attention from academia and industry [7]. To achieve 360-degree three-dimensional reconstruction, point clouds of the object need to be acquired at different viewing angles and spliced. The first implementation method is the multi-view stereo digital image correlation (DIC) system: J.-J. Orteu [8] et al. first measured the panoramic strain of a cylindrical sample with a four-camera multi-view DIC system, whose field of view was still limited, and Dana Solav [9] et al. measured the full-field deformation and strain of human lower limbs with a 12-camera system. However, multi-camera systems increase the equipment cost, and their calibration is difficult.
Another method registers the point clouds acquired by a single measurement system at different viewing angles; point cloud registration can be divided into algorithm-based registration and instrument-assisted registration. Algorithm-based registration relies on the overlapping parts of the point clouds at different viewing angles for splicing: Szymon [10] et al. proposed a real-time 360-degree three-dimensional model acquisition technique based on the iterative closest point (ICP) algorithm, but the method skips coarse matching, resulting in lower registration accuracy [7]. In 2019, Jiaming Qian [7] et al. proposed an improved coarse-to-fine registration strategy that greatly improves the accuracy while maintaining the registration speed. However, techniques based on point cloud registration are still seriously time-consuming and difficult to apply to real-time matching scenes, and they generally require a person to move the measured object manually, so they are hard to apply to automated three-dimensional measurement scenes. Instrument assistance includes mirror assistance, turntable assistance and the like: mirror assistance captures the target from three angles through mirrors to realize panoramic three-dimensional measurement [11], but the number of viewing angles provided by this method is limited and cannot meet the 360-degree reconstruction requirements of complex objects.
Turntable assistance obtains point clouds at arbitrary angles by rotating the turntable; the point cloud acquired at each angle only needs to be rotated by the corresponding angle in the rotation-axis coordinate system to complete registration, and a precise panoramic model of a complex object can be obtained by selecting a smaller rotation step. Meiling Dai [12] et al. proposed a method for automatic calibration of the rotation axis using a calibration plate, and Xiaoqi Cai [13] improved the calibration accuracy of the rotation axis using an auxiliary camera. Xiaoli Liu [6] et al. realized panoramic reconstruction of a model using multiple 3D cameras and an automatically controlled turntable, but the panoramic reconstruction time is long because each sensor must reconstruct in a time-shared manner and the turntable must stop at each angle to wait for the reconstruction to finish. Because FPP requires multi-frame projection, rotation of the turntable corrupts the reconstruction, which severely limits the efficiency of panoramic reconstruction; with the development of high-speed projectors in recent years, real-time three-dimensional reconstruction based on FPP [2,14,15] has begun to develop. Jiaming Qian [7] et al. realized real-time three-dimensional panoramic reconstruction based on a turntable using a four-camera projector system: only single-period phase-shift fringes need to be projected to obtain the wrapped phase, and the four cameras cooperate with an adaptive depth constraint (ADC) strategy to assist the unwrapping and realize the real-time three-dimensional panoramic reconstruction.