US-20260127827-A1 - Method and Apparatus for Capturing Video and Providing Information
Abstract
An apparatus may comprise a glasses frame configured to be worn on a user's head; a button on the frame; a camera on the frame configured to capture an image when a user touches the button; and a transceiver configured to transmit the image wirelessly to a separate device.
Inventors
- Alex C Chen
Assignees
- Alex C Chen
Dates
- Publication Date
- 2026-05-07
- Application Date
- 2025-07-03
Claims (20)
- 1 . An apparatus comprising: a glasses frame configured to be worn on a user's head; a camera on the glasses frame configured to capture a video of an object in front of the user and the apparatus; a microphone on the glasses frame configured to receive a command from the user to request information about the object; a processor on the glasses frame configured to receive the user command and the captured video; a transceiver on the glasses frame configured to receive the user command and captured video from the processor, wirelessly transmit the user command and captured video to a mobile communication device, and receive information about the object from the mobile communication device; and an output component on the glasses frame configured to receive the information about the object and output the information to the user's ear.
- 2 . The apparatus of claim 1 , wherein the processor is configured to recognize a phrase of words from the user to trigger the processor to process the user command, wherein the phrase is not commonly said in conversations between people.
- 3 . The apparatus of claim 1 , wherein the user command is a question on how the user can interact with the object in the captured video, wherein the information received by the transceiver comprises a suggestion on how the user can interact with the object.
- 4 . The apparatus of claim 1 , wherein the user command is a question on where to find a second object based on the location of the first object, wherein the information received by the transceiver comprises directions on where the user can go to find the second object.
- 5 . The apparatus of claim 1 , wherein the information received by the transceiver comprises music.
- 6 . The apparatus of claim 1 , wherein the information received by the transceiver comprises a warning to the user about the object in front of the user and the apparatus.
- 7 . The apparatus of claim 1 , wherein the processor, transceiver, and mobile communication device are configured to send the captured video across a network to a web site.
- 8 . The apparatus of claim 1 , wherein the microphone, processor, transceiver, output component, and mobile communication device allow the user to communicate with another person and send the captured video across a network to a device of the other person.
- 9 . The apparatus of claim 1 , wherein the processor is configured to analyze the captured video and determine a position of the object.
- 10 . The apparatus of claim 1 , wherein the camera is configured to start and stop capturing video based on user commands.
- 11 . The apparatus of claim 1 , further comprising a component on the glasses frame configured to generate light.
- 12 . The apparatus of claim 1 , further comprising a display on the glasses frame configured to display information about the object.
- 13 . The apparatus of claim 1 , further comprising a touch pad configured to 1) receive a command from the user by touch, and 2) send the command to the processor.
- 14 . An apparatus comprising: a glasses frame configured to be worn on a user's head; a microphone on the glasses frame configured to receive a command from the user requesting information; a processor on the glasses frame configured to receive the command from the microphone; a transceiver on the glasses frame configured to receive the command from the processor, wirelessly transmit the command to a mobile communication device, and wirelessly receive information from the mobile communication device; and an output component on the glasses frame configured to receive the information and output the information to the user's ear.
- 15 . The apparatus of claim 14 , wherein the information is related to at least one of 1) an object in front of the apparatus, and 2) a location of the user.
- 16 . The apparatus of claim 14 , wherein the information comprises music.
- 17 . An apparatus comprising: a first wireless transceiver configured to receive a user command and a video of an object from a glasses frame configured to be worn on a user's head, the video being captured by a camera on the glasses frame; a processor configured to receive the user command and the video from the first wireless transceiver; a second wireless transceiver configured to 1) receive the user command and the video from the processor, 2) transmit the user command and the captured video across a network to a server, 3) receive a description of the object from the server, and 4) transmit the description to the glasses frame.
- 18 . The apparatus of claim 17 , wherein the first wireless transceiver comprises a Bluetooth transceiver, and the second wireless transceiver comprises a cellular transceiver.
- 19 . The apparatus of claim 17 , wherein the second wireless transceiver is configured to encode the user command before transmitting the user command across the network to the server.
- 20 . The apparatus of claim 17 , further comprising a user interface configured to receive input from the user to select one of a plurality of types of information about the object that the user would like to receive from the server.
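Claims 17 and 18 recite a relay architecture: a first, short-range transceiver (e.g., Bluetooth) receives the user command and video from the glasses frame, and a second, wide-area transceiver (e.g., cellular) forwards them to a server and returns the server's description. The minimal sketch below illustrates only that data flow; every class, function, and variable name is a hypothetical stand-in, not part of the claimed apparatus or any real API.

```python
from dataclasses import dataclass


@dataclass
class Request:
    """Hypothetical container pairing a user command with captured video."""
    command: str
    video: bytes


class RelayDevice:
    """Illustrative intermediary device (e.g., a mobile phone) per claims 17-18.

    `short_range_rx` stands in for the first wireless transceiver (glasses
    side) and `wide_area_link` for the second wireless transceiver (server
    side). Both are injected callables so the sketch stays self-contained.
    """

    def __init__(self, short_range_rx, wide_area_link):
        self.short_range_rx = short_range_rx  # receives from the glasses frame
        self.wide_area_link = wide_area_link  # forwards to / receives from a server

    def relay(self) -> str:
        req = self.short_range_rx()        # 1) receive user command + video
        description = self.wide_area_link(req)  # 2)-3) transmit to server, get description
        return description                 # 4) description returned toward the glasses


# Usage with stubbed transceivers standing in for Bluetooth and cellular links:
glasses_link = lambda: Request("what is this?", b"\x00\x01")
server_link = lambda req: f"description for: {req.command}"

phone = RelayDevice(glasses_link, server_link)
print(phone.relay())  # -> description for: what is this?
```

The dependency-injected transceivers make the four numbered functions of claim 17 visible as four distinct steps without assuming any particular radio stack.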
Description
CLAIM OF PRIORITY

This is a continuation patent application of U.S. patent application Ser. No. 17/521,845, entitled "Method and Apparatus for Capturing Video and Providing Information," filed on Nov. 8, 2021, which claims priority to U.S. patent application Ser. No. 16/136,261, entitled "Method and Apparatus for Recognizing Behavior and Providing Information," filed on Sept. 20, 2018, which claims priority to U.S. patent application Ser. No. 15/331,834, entitled "Method and Apparatus for Recognizing Behavior and Providing Information," filed on Oct. 22, 2016 and granted as U.S. Pat. No. 10,115,238, which claims priority to U.S. patent application Ser. No. 14/182,297, entitled "Method and Apparatus for Recognizing Behavior and Providing Information," filed on Feb. 18, 2014 and granted as U.S. Pat. No. 9,500,865, which claimed priority to U.S. Provisional Application No. 61/771,943, entitled "Method and Apparatus for Sensing and Displaying Information," filed on Mar. 4, 2013, all of which are hereby incorporated herein by reference in their entireties.

FIELD

This application relates to devices that sense and display information.

BACKGROUND

Cell phones, tablet computers, and laptop computers receive and display information.

BRIEF DESCRIPTION OF THE FIGURES

FIG. 1A shows a device with a projection component to project an image.
FIG. 1B shows first and second devices similar to the device of FIG. 1A.
FIG. 1C shows a device with light-generating components.
FIG. 1D shows an example of three light-generating components.
FIGS. 2A and 2B show back and top views of a pair of glasses.
FIG. 2C shows a side view of a pair of glasses.
FIG. 3A shows sensors embedded in a player's clothes or gear.
FIG. 3B shows sensors placed on lines or markers of a field or court.
FIG. 4 shows a car with a system that has one or more sensors.
FIG. 5 shows a device with a transceiver communicating with another device.
FIG. 6 shows an example of a show that starts with two guys and two girls.
FIG. 7 shows a device that prevents people from using their cell phones.
FIG. 8A shows a device that can sense when a user's item is moved.
FIG. 8B shows a device with a transceiver that detects a user key.
FIG. 9 shows a device that represents the devices in FIGS. 1A-8B and 10.
FIG. 10 shows the pair of glasses of FIG. 2A, a device, a network, and a server.
FIG. 11 shows information displayed on one or both lenses of a pair of glasses.
FIG. 12 shows functions that may be performed by the glasses of FIGS. 2A and 10.

DESCRIPTION

1. Mobile Device that Projects an Image or Video
2. Glasses that Generate Viewable Information on Lenses
3. Sensors for Sports
4. Sensors in Car to Detect Drunk Driving
5. Location-Specific Communication Device
6. Multi-Media Interactive Dating Experience
7. Mobile Phone Jamming Device
8. Motion Detection and Alert System

FIG. 9 shows a device 900, which may represent any of the devices in FIGS. 1A-8B described below. The devices in FIGS. 1A-8B may comprise one or more of the elements shown in FIG. 9 and/or additional elements, depending on the desired cost, size, and functions of each device. The device 900 may represent a mobile phone, a tablet computer, a laptop, a display, a wristwatch, a game console, a car key, a key chain, or a remote control to control another device, such as a TV, stereo, display, or car. The device 900 may comprise a processor 902, a memory 904, a wireless transceiver 908 (e.g., 2G or 3G cellular, Long Term Evolution (LTE), WiMAX, WiFi, Bluetooth, RFID, Near Field Communication (NFC), etc.), a display 910, a user interface 912, one or more sensors 914, and a global positioning system (GPS) chip or other position tracking system 916. The user interface 912 may include one or more physical keys or buttons, and/or a menu of options shown on the display 910, such as a touchscreen.
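As an illustration only, the elements enumerated for device 900 can be summarized in a small data model. Every class name, field name, and default value below is a hypothetical sketch that mirrors the reference numerals in FIG. 9; it is not part of the disclosure or any real API.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Device900:
    """Illustrative summary of device 900's components (FIG. 9).

    Field names are hypothetical and track the reference numerals only.
    """
    processor: str = "processor 902"
    memory: str = "memory 904 (stores software modules 906)"
    transceivers: List[str] = field(
        default_factory=lambda: ["cellular", "LTE", "WiMAX", "WiFi",
                                 "Bluetooth", "RFID", "NFC"])  # transceiver 908
    display: str = "display 910"
    user_interface: str = "keys/buttons or touchscreen menu"   # interface 912
    sensors: List[str] = field(default_factory=list)           # sensors 914
    position_tracking: str = "GPS chip 916"

    def supports(self, radio: str) -> bool:
        """Return True if the device carries the named transceiver type."""
        return radio in self.transceivers


# A concrete instance with a few of the environmental sensors named below:
phone = Device900(sensors=["light", "temperature", "motion"])
print(phone.supports("Bluetooth"))  # -> True
```

Modeling the transceiver as a list reflects the specification's point that each concrete device may include any subset of these elements depending on cost, size, and function.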
The sensors 914 may sense one or more conditions of the environment (such as the amount of visible or invisible light, temperature, humidity, odors, sounds, touch, or the amount of particles in the air, such as plant pollen) and/or objects (or characteristics of the objects) around the device 900, such as motion, color, shape, size, surface type, and distance from the device 900. The memory 904 may store one or more software modules 906 that can be executed by the processor 902 to perform some or all of the functions described below.

The device 900 may communicate directly with a server or computer, or indirectly via a wireless network, to download software apps and to transmit and receive information. Data transmitted from one device to another device or network, as described below, may be encoded (such as CDMA) or encrypted for security. An image as described below may refer to a single image or a series of images, such as a video.

1. Mobile Device That Projects an Image or Video

A. Projecting on a Surface

FIG. 1A shows a device 10 (such as a mobile phone or tablet) with a projection component 16 to project an image or video 12 on a wall or other surface.

Projection Component

The proj