US-20260127659-A1 - TRANSACTION SESSION FOR WEARABLE PROCESSING DEVICE

Abstract

A transaction session is established directly or indirectly between a wearable processing device and a cloud-based server of a store. During the session, items are recognized by placing the items in a field-of-view of a front-facing camera of the device. Item recognition does not require item barcode identification. A depth sensor associated with the camera creates a three-dimensional mapping of a given item. The mapping and image features are processed to uniquely identify the item even when the item is associated with a same category of items. Customer input during the session can be achieved through gestures (hand, eyes, head, fingers, etc.) and/or voice commands. The customer input is translated and mapped into transaction interface commands/options and processed during the session to select items, delete items, view a transaction receipt, identify a quantity of items, obtain item details for a given item, etc.
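The abstract describes translating customer gestures and voice commands into transaction interface commands during the session. A minimal sketch of such a translation layer is shown below; the mapping tables, gesture names, and command names are illustrative assumptions, not taken from the patent, and real input would come from camera, accelerometer, and microphone processing rather than plain strings.

```python
# Hypothetical sketch: map recognized gestures and spoken phrases to
# transaction interface commands, as the abstract describes.
# All names below are illustrative assumptions.

GESTURE_COMMANDS = {
    "thumbs_up": "CONFIRM_ITEM",
    "swipe_left": "DELETE_ITEM",
    "two_fingers": "SET_QUANTITY_2",
    "palm_open": "SHOW_RECEIPT",
}

VOICE_COMMANDS = {
    "add item": "CONFIRM_ITEM",
    "remove item": "DELETE_ITEM",
    "show receipt": "SHOW_RECEIPT",
    "item details": "SHOW_ITEM_DETAILS",
}

def translate_input(modality, value):
    """Map a detected gesture or spoken phrase to a session command.

    Returns None when the input is not recognized, so the caller can
    ask the customer to repeat the gesture or phrase.
    """
    table = GESTURE_COMMANDS if modality == "gesture" else VOICE_COMMANDS
    return table.get(value.lower())
```

Because both modalities resolve to the same command vocabulary, the server-side session logic can stay agnostic about whether the customer gestured or spoke.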

Inventors

  • Kip Oliver Morgan
  • Gina Torcivia Bennett
  • Aleah Jean Kadry
  • Kelli Lee

Assignees

  • NCR VOYIX CORPORATION

Dates

Publication Date
2026-05-07
Application Date
2025-12-30

Claims (8)

  1. A method, comprising: establishing a shopping session with a wearable processing device worn by a customer in a store during a shopping trip; maintaining a virtual shopping cart for the shopping session; receiving item features for item images of items captured by a front-facing camera of the wearable processing device during the shopping session; resolving item identifiers for the items based at least on the item features during the shopping session; adding the item identifiers to the virtual shopping cart during the shopping session; modifying the item identifiers or a quantity total for a given item identifier within the virtual shopping cart based on gesture-based or audio-based input received from the customer through the wearable processing device during the shopping session; and processing a payment to pay for the items of the virtual shopping cart and to conclude the shopping session based on a payment option communicated by the customer through the wearable processing device.
  2. The method of claim 1, wherein receiving further includes receiving three-dimensional measurements for each item with the corresponding item features, wherein the three-dimensional measurements are captured by a depth sensor associated with the front-facing camera or the wearable processing device.
  3. The method of claim 1, wherein resolving further includes scoring the three-dimensional measurements and the item features and matching scores produced against candidate scores for candidate item identifiers to determine the item identifiers.
  4. The method of claim 1, wherein resolving further includes providing the item features to a trained machine-learning module and receiving the item identifiers as output from the trained machine-learning module.
  5. The method of claim 1, wherein resolving further includes identifying a candidate list of item identifiers for at least one item image received based on the corresponding item features, providing candidate item images and candidate item information for the candidate list to the wearable processing device, and receiving a gesture-select option made by the customer to resolve a particular item identifier for each of the at least one item images.
  6. The method of claim 1, wherein modifying further includes providing a summary, a running price total, and a running item quantity total for the virtual shopping cart to the wearable processing device for presentation to the customer on an Augmented Reality (AR)-enabled display or lenses of the wearable processing device during the shopping session.
  7. A system, comprising: at least one server comprising a processor and a non-transitory computer-readable storage medium; and a wearable processing device comprising an Augmented Reality (AR)-enabled display or AR-enabled lenses, a front-facing camera, a rear-facing camera, a wireless transceiver, a microphone, a depth sensor, and an accelerometer; the system configured to establish a wireless transaction session between the server and the wearable processing device during a shopping trip of a customer to a store using the wireless transceiver of the wearable processing device; the wearable processing device, during the transaction session, is configured to: capture item images of items placed within the field-of-view of the front-facing camera; extract features for the item images; obtain three-dimensional measurements for each item image using the depth sensor; provide the features and three-dimensional measurements to the server; confirm item identifiers for the items added to a virtual shopping cart by the server; display cart and item information on the AR-enabled display or the AR-enabled lenses; display feedback information on the AR-enabled display or the AR-enabled lenses received from the server; translate gestures made by the customer and detected by the front-facing camera, the rear-facing camera, and the accelerometer into transaction interface commands, transaction interface selections, and transaction interface options recognized by the server; and translate audio spoken by the customer captured by the microphone into the transaction interface commands, the transaction interface selections, and the transaction interface options; and the server, during the transaction session, is configured to: identify the item identifiers from the features and the three-dimensional measurements associated with the item images; obtain candidate item images when a given item image corresponds to multiple candidate item identifiers; confirm each item identifier added to, modified, or removed from the virtual shopping cart through the transaction interface commands, the transaction interface selections, and the transaction interface options received from the wearable processing device; maintain the virtual shopping cart; provide the feedback information to the wearable processing device for the virtual shopping cart and for results associated with processing the transaction interface commands, the transaction interface selections, and the transaction interface options; and process a payment for the virtual shopping cart based on select ones of the transaction interface commands to end the transaction session.
  8. The system of claim 7, wherein the wearable processing device is glasses or a headset.
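Claim 3 recites scoring the item features and three-dimensional measurements and matching the resulting scores against candidate scores for candidate item identifiers. A minimal sketch of one way such scoring could work is shown below; the cosine-similarity measure, the weights, and the candidate profiles are illustrative assumptions, since the claims do not specify a particular scoring function.

```python
import math

# Hypothetical sketch of claim 3's scoring-and-matching step:
# combine a feature-similarity score and a 3D-measurement score,
# then pick the best-matching candidate item identifier.

def similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def resolve_item(features, dims, candidates, w_feat=0.7, w_dims=0.3):
    """Return (item_id, score) for the highest-scoring candidate.

    `candidates` maps item identifiers to (feature_vector, dims_vector)
    profiles; the weights are illustrative, not from the patent.
    """
    best_id, best_score = None, -1.0
    for item_id, (cand_feat, cand_dims) in candidates.items():
        score = (w_feat * similarity(features, cand_feat)
                 + w_dims * similarity(dims, cand_dims))
        if score > best_score:
            best_id, best_score = item_id, score
    return best_id, best_score
```

Weighting the 3D measurements separately is what lets the system distinguish items that share a category (and hence similar image features) but differ in size, as the abstract describes.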

Description

CROSS-REFERENCE TO RELATED APPLICATION

This application is a division of U.S. patent application Ser. No. 17/537,585, filed Nov. 30, 2021, which application and publication are incorporated herein by reference in their entirety.

BACKGROUND

As technology advances and consumers embrace it in all aspects of their lives, many retailers have taken advantage of this trend with technology offerings that make it easier for their customers to interact and transact with the retailers. For example, most retail stores now have Self-Service Checkouts (SCOs) where customers of the stores can self-checkout. Customers utilizing SCOs typically have to pick items from the store shelves, carry them to the SCOs, scan the item barcodes at the SCOs, and pay for the goods. The problem with this approach is that the customers have to handle the items multiple times before checking out (pick from shelves, place in cart, remove from cart, scan at the SCOs, bag the items, etc.).

Consequently, many retailers now offer mobile applications, accessible from their customers' phones, that permit customers to scan item barcodes as they shop in the stores and place the scanned items in bags of a cart or a basket. Scan as you shop applications have streamlined the customer experience within the stores. However, these applications still have a number of problems, which have limited customer adoption of this technology. The scan as you shop applications require the user to actively operate their mobile phones as they shop. This creates a usability issue because one customer hand has to hold a picked item while the other hand has to operate the phone and interact with the mobile application during shopping. Customers struggle to carry their personal belongings, deal with small children, and/or push a cart (or carry a basket) while operating scan as you shop applications on their phones. Many customers find this experience too cumbersome and difficult.
Additionally, most scan as you shop applications require the customers to properly orient a held item so that its barcode is placed in the field of view of the phone's camera for properly identifying and recording an item identifier for the item. As a result, there is a need for improved scan as you shop applications, workflows, and interfaces.

SUMMARY

In various embodiments, a system and methods for transaction sessions with wearable processing devices are presented. According to an embodiment, a method for managing a transaction session with a wearable processing device is provided. A connection to a cloud-based store server is requested during a shopping trip of a customer to a store. A wireless transaction session is established with the cloud-based store server based on the requested connection. Images of items placed within a field of view of a front-facing camera of a wearable processing device worn by the customer are captured during the transaction session. Item identifiers and item information are obtained for the items associated with the item images based at least on the item images. Gestures of the customer are translated during the transaction session into customer selections, customer options, and customer-initiated commands associated with a virtual shopping cart maintained by the cloud-based store server during the transaction session; the virtual shopping cart comprises the item identifiers and the item information.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram of a system for managing transaction sessions with wearable processing devices, according to an example embodiment.

FIG. 2 is a diagram of a method for managing a transaction session with a wearable processing device, according to an example embodiment.

FIG. 3 is a diagram of another method for managing a transaction session with a wearable processing device, according to an example embodiment.

DETAILED DESCRIPTION

FIG. 1 is a diagram of a system/platform 100 for managing transaction sessions with wearable processing devices, according to an example embodiment. It is to be noted that the components are shown schematically in greatly simplified form, with only those components relevant to understanding of the embodiments being illustrated. Furthermore, the various components (that are identified in system/platform 100) are illustrated, and the arrangement of the components is presented, for purposes of illustration only. It is to be noted that other arrangements with more or fewer components are possible without departing from the teachings of conducting, operating, and managing transaction sessions via a wearable processing device, presented herein and below.

System/platform 100 (hereinafter just "system 100") provides a processing environment by which a customer engages in a transaction session with a retail store's server via an improved and seamless interface associated with a wearable processing device (such as glasses/goggles/headsets) during a shopping trip at the store. The wearable processin
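The session flow outlined in the Summary (establish a session, add recognized items to a server-maintained virtual cart, modify the cart through customer commands, then process payment to conclude the session) can be sketched as a minimal server-side cart model. The class and method names below are assumptions for illustration, not the patent's interface.

```python
# Illustrative sketch of a server-side virtual shopping cart for one
# transaction session. Names are hypothetical, not from the patent.

class ShoppingSession:
    def __init__(self, customer_id):
        self.customer_id = customer_id
        self.cart = {}        # item_id -> quantity
        self.active = True    # False once payment concludes the session

    def add_item(self, item_id, qty=1):
        """Add a resolved item identifier to the virtual cart."""
        self.cart[item_id] = self.cart.get(item_id, 0) + qty

    def remove_item(self, item_id):
        """Delete an item, e.g. in response to a delete gesture."""
        self.cart.pop(item_id, None)

    def set_quantity(self, item_id, qty):
        """Set an item's quantity; a non-positive quantity removes it."""
        if qty <= 0:
            self.remove_item(item_id)
        else:
            self.cart[item_id] = qty

    def checkout(self, prices):
        """Total the cart from a price lookup and conclude the session."""
        total = sum(prices[i] * q for i, q in self.cart.items())
        self.active = False
        return total
```

Keeping the cart state on the server side, as the claims describe, lets the wearable device stay a thin client that only streams recognized items and translated commands.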