
US-12626465-B2 - Generating virtual representations

US12626465B2

Abstract

Improved methods for generating a virtual representation of an interior space such as a room are provided. Techniques for capturing and normalizing sets of points for polygon mesh generation that utilize augmented reality toolkits are also described. The improved methods comprise: techniques for generating a polygon mesh that incorporates a projecting extrusion; techniques for generating a polygon mesh that incorporates curved surfaces; and techniques for generating a polygon mesh in real-time.

Inventors

  • James Nicholl
  • Robert Lewis
  • Owen McConnell
  • Jonathan Achenbach

Assignees

  • SIGNATURIZE HOLDINGS LTD

Dates

Publication Date
2026-05-12
Application Date
2023-08-02
Priority Date
2018-05-04

Claims (12)

  1. A method for generating a virtual representation of an interior space such as a room, comprising: a. obtaining a first set of three-dimensional coordinates that represent three-dimensional positions of points located on edges of walls of the interior space, and a second set of three-dimensional coordinates that represent positions of points located on edges of an extrusion in one of the walls of the interior space, wherein the first or the second set of three-dimensional coordinates is associated with a set of curve-defining values; and b. generating a polygon mesh representing the three-dimensional shape of the interior space, wherein generating the polygon mesh comprises: i. using the first set of three-dimensional coordinates to determine planes representing the walls of the interior space without considering any extrusions in the walls; ii. for each wall with one or more extrusions, using the respective determined plane and the second set of three-dimensional coordinates to determine a plurality of sub-meshes that in combination represent the respective wall excluding the respective one or more extrusions; and iii. combining the plurality of sub-meshes into a mesh representing the wall with the one or more extrusions; and c. wherein the curve-defining values include information associated with a curve defined by the arc of a shape such as an ellipse, the information including the x and y radii of the ellipse.
  2. The method of claim 1, wherein the curve-defining values include one or more of the definition of the end point of the curve, an indication of the type of curve being defined, and one or more values that define the shape of the curve.
  3. The method of claim 1, wherein the curve-defining values include information associated with one or more of the end points of the curve, the type of curve, and the shape of the curve.
  4. The method of claim 1, wherein the curve-defining values represent a two-dimensional curve that is projected as a plane curve.
  5. The method of claim 1, wherein the curve-defining values include an end point of the curve that is a three-dimensional coordinate that either forms part of the first set of three-dimensional coordinates or forms an intermediate point within a spline.
  6. The method of claim 1, wherein the curve-defining values include information associated with a curve defined by a mathematical equation, such as a parametric curve, a cubic Bézier curve, or a quadratic Bézier curve.
  7. The method of claim 1, wherein the curve-defining values include information associated with a parametric curve, the information including one or more three-dimensional coordinate control points.
  8. The method of claim 2, wherein the curve-defining values include a plurality of two-dimensional curves that are combined into a single parametric surface during the mesh-generation process.
  9. A method for generating a virtual representation of an interior space such as a room, comprising: a. obtaining a first set of three-dimensional coordinates that represent three-dimensional positions of points located on edges of walls of the interior space, and a second set of three-dimensional coordinates that represent positions of points located on edges of an extrusion in one of the walls of the interior space; b. storing the three-dimensional coordinates; and c. on accessing the stored three-dimensional coordinates, generating a polygon mesh representing the three-dimensional shape of the interior space, wherein generating the polygon mesh comprises: i. using the first set of three-dimensional coordinates to determine planes representing the walls of the interior space without considering any extrusions in the walls; ii. for each wall with one or more extrusions, using the respective determined plane and the second set of three-dimensional coordinates to determine a plurality of sub-meshes that in combination represent the respective wall excluding the respective one or more extrusions; and iii. combining the plurality of sub-meshes into a mesh representing the wall with the one or more extrusions; d. wherein generating the polygon mesh further comprises obtaining a set of device configurations, and generating the polygon mesh in dependence upon the device configurations.
  10. The method of claim 9, wherein generating the polygon mesh further comprises determining a display means for displaying the virtual representation, and generating the polygon mesh in dependence upon the display means.
  11. The method of claim 9, wherein generating the polygon mesh further comprises determining the intended use, and generating the polygon mesh in dependence upon the intended use.
  12. The method of claim 9, wherein at least a portion of the mesh is generated with a lower fidelity than the remainder of the mesh.
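As an illustration of the sub-mesh decomposition recited in the independent claims, the sketch below splits a rectangular wall containing one rectangular extrusion (e.g. a window) into up to four rectangular sub-meshes (left of, right of, above and below the opening) and triangulates each. This is a hedged, simplified example: the function names, the 2D in-plane coordinates, and the axis-aligned rectangles are illustrative assumptions, not the patented method itself.

```python
# Hypothetical sketch: decompose a wall with one rectangular extrusion
# (e.g. a window) into four sub-rectangles, then triangulate each.
# Coordinates are 2D within the wall's plane, which the claimed method
# would first determine from the first set of 3D coordinates.

def triangulate_rect(x0, y0, x1, y1):
    """Return two triangles covering the axis-aligned rectangle."""
    a, b, c, d = (x0, y0), (x1, y0), (x1, y1), (x0, y1)
    return [(a, b, c), (a, c, d)]

def wall_submeshes(wall, hole):
    """Split wall-minus-hole into up to four sub-rectangles and return
    their triangles, i.e. the combined mesh representing the wall."""
    wx0, wy0, wx1, wy1 = wall
    hx0, hy0, hx1, hy1 = hole
    tris = []
    if hx0 > wx0:  # strip left of the opening (full wall height)
        tris += triangulate_rect(wx0, wy0, hx0, wy1)
    if hx1 < wx1:  # strip right of the opening (full wall height)
        tris += triangulate_rect(hx1, wy0, wx1, wy1)
    if hy1 < wy1:  # strip above the opening
        tris += triangulate_rect(hx0, hy1, hx1, wy1)
    if hy0 > wy0:  # strip below the opening
        tris += triangulate_rect(hx0, wy0, hx1, hy0)
    return tris

# A 4 m x 2.5 m wall with a 1 m x 1 m window.
mesh = wall_submeshes((0.0, 0.0, 4.0, 2.5), (1.5, 1.0, 2.5, 2.0))
print(len(mesh))  # 8 triangles: four sub-rectangles, two triangles each
```

The covered area equals the wall area minus the opening, so combining the sub-meshes reproduces the wall excluding the extrusion, as in step iii of the claims.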

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 17/213,945, filed on Mar. 26, 2021, which is a continuation of and claims priority to U.S. patent application Ser. No. 16/400,638, filed on May 1, 2019, now granted as U.S. Pat. No. 10,991,161, which claims priority under 35 U.S.C. § 119 to UK Application No. GB1807361.9, filed on May 4, 2018, the contents of which are hereby incorporated herein by reference in their entireties.

TECHNICAL FIELD

The invention relates to methods, computer programs and computer systems for generating virtual representations, in particular virtual representations of three-dimensional interior spaces such as rooms.

BACKGROUND

Virtual representations of three-dimensional objects and spaces may be generated for various reasons. For example, virtual representations of environments, buildings, objects and people may be generated for films, animation and gaming; virtual representations of anatomical objects may be generated for medical imaging; and virtual representations of buildings, rooms and objects within buildings and rooms may be generated for architectural and interior design purposes. Some techniques for generating virtual representations of objects and spaces involve the generation of a polygon mesh (sometimes called a wireframe model), typically made up of triangles, that approximates the 3D shape of the object or space for which the virtual representation is to be generated. The mesh is then input to a rendering engine, which uses techniques such as shading and texture mapping to convert the mesh into a virtual representation of the 3D object or environment for display on a screen. Rendering techniques and engines for converting a mesh into an image are well known and will not be described in further detail.
Generating a polygon mesh for input to a rendering engine typically involves applying a mesh-generation technique to an array of predefined vertices (three-dimensional coordinates of surface points of the object or space). According to some known polygonal modelling techniques:

  • an array of edges which connect pairs of the vertices is generated (or may itself be predefined, in an edge table for example);
  • using the array of edges, all polygons (typically triangles), which are closed sets of the edges, are generated;
  • all polygons on the same face plane are combined;
  • all polygon faces which are in the same horizontal or vertical plane are grouped; and
  • all groups of polygon faces are combined to form the 3D polygonal mesh of the object or space being modelled.

The predefined vertices that are used as an input to the mesh-generation algorithm may be sourced from anywhere, but typically must be highly accurate if the mesh-generation algorithm is to produce a mesh that accurately represents the shape of the 3D object. For generating meshes of buildings and interior spaces such as rooms of buildings, vertices are often captured using specialized equipment, such as a laser rangefinder, operated by trained individuals. The complexity of the vertex capture process may therefore mean that mesh generation, particularly for interior spaces, is not accessible to untrained users and is not amenable to real-time or near-real-time applications.

SUMMARY OF THE INVENTION

The scope of protection is defined in the independent claims, to which reference is now directed. Optional features are set out in the dependent claims. Embodiments described herein address problems with known techniques for generating meshes that are used as inputs to a rendering engine, and provide for the real-time generation of virtual representations of interior spaces such as rooms. The inventors have appreciated that some known mesh generation techniques, while effective, may be computationally demanding.
This is especially problematic for mobile devices such as smartphones and tablets, which have limited processing capabilities and battery life. Further, the inventors have appreciated that existing vertex capture techniques limit the accessibility of generating virtual representations of interior spaces, and limit real-time or near-real-time generation of virtual representations of spaces. Embodiments described herein provide mesh generation techniques which can make use of vertices captured without specialized equipment and skills, and so permit all kinds of users to generate virtual representations in real time or near-real time. Techniques for capturing vertices are also provided. Generating a mesh that represents a very simple space which does not have any extrusions such as doors, windows and fireplaces in its walls may be relatively straightforward. However, extrusions, which are present in most rooms, may vastly increase the complexity of some known mesh generation techniques. This is because extrusions quickly increase the number of three-dimensional coordinates/vertices required to represent the space, such that the number of edges conne
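The generic polygonal-modelling pipeline summarised in the background (predefined vertices, an edge table, enumeration of triangles as closed sets of edges, then grouping of coplanar faces) might be sketched as follows. This is a hedged illustration of the known prior-art technique only, not the claimed invention; the function names and the axis-aligned plane-grouping key are assumptions made for brevity.

```python
# Hypothetical sketch of the known pipeline described in the background:
# from predefined vertices and an edge table, enumerate all triangles
# (closed sets of three edges), then group them by shared face plane.
from itertools import combinations

def triangles_from_edges(edges):
    """Find every triangle: three vertices that are pairwise connected."""
    edge_set = {frozenset(e) for e in edges}
    verts = {v for e in edges for v in e}
    return [tri for tri in combinations(sorted(verts), 3)
            if all(frozenset(p) in edge_set for p in combinations(tri, 2))]

def group_by_plane(tris, coords):
    """Group triangles sharing the same axis-aligned face plane -- a
    simplification of grouping by horizontal/vertical plane."""
    groups = {}
    for tri in tris:
        pts = [coords[v] for v in tri]
        # Key: the axis along which all three points agree, if any.
        key = next(((ax, pts[0][ax]) for ax in range(3)
                    if pts[0][ax] == pts[1][ax] == pts[2][ax]), None)
        groups.setdefault(key, []).append(tri)
    return groups

# A single square face of a unit cube, split by one diagonal edge.
coords = {0: (0, 0, 0), 1: (1, 0, 0), 2: (1, 1, 0), 3: (0, 1, 0)}
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # edge table
tris = triangles_from_edges(edges)
print(tris)  # [(0, 1, 2), (0, 2, 3)] -- two coplanar triangles
print(group_by_plane(tris, coords))
```

The pairwise-connectivity test over all vertex triples hints at why this approach becomes expensive as extrusions multiply the vertex and edge counts: its cost grows rapidly with the number of vertices.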