Magic Leap, Inc.

United States of America


1-100 of 3,131 results for Magic Leap, Inc.
Aggregations
IP Type
        Patent 2,981
        Trademark 150
Jurisdiction
        United States 2,112
        World 696
        Canada 309
        Europe 14
Date
New (last 4 weeks) 29
2024 September (MTD) 9
2024 August 31
2024 July 19
2024 June 27
IPC Class
G02B 27/01 - Head-up displays 1,367
G06T 19/00 - Manipulating 3D models or images for computer graphics 782
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer 741
G02B 27/00 - Optical systems or apparatus not provided for by any of the groups 453
F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems 260
NICE Class
09 - Scientific and electric apparatus and instruments 135
42 - Scientific, technological and industrial services, research and design 59
41 - Education, entertainment, sporting and cultural services 42
38 - Telecommunications services 36
35 - Advertising and business services 31
Status
Pending 567
Registered / In Force 2,564

1.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18664223
Status Pending
Filing Date 2024-05-14
First Publication Date 2024-09-12
Owner Magic Leap, Inc. (USA)
Inventor
  • Messer, Kevin
  • Klug, Michael Anthony

Abstract

An augmented reality display system can include a first eyepiece waveguide with a first input coupling grating (ICG) region. The first ICG region can receive a set of input beams of light corresponding to an input image having a corresponding field of view (FOV), and can in-couple a first subset of the input beams. The first subset of input beams can correspond to a first sub-portion of the FOV. The system can also include a second eyepiece waveguide with a second ICG region. The second ICG region can receive and in-couple at least a second subset of the input beams. The second subset of the input beams can correspond to a second sub-portion of the FOV. The first and second sub-portions of the FOV can be at least partially different but together include the complete FOV of the input image.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/18 - Diffracting gratings
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

2.

METHODS, DEVICES, AND SYSTEMS FOR ILLUMINATING SPATIAL LIGHT MODULATORS

      
Application Number 18661321
Status Pending
Filing Date 2024-05-10
First Publication Date 2024-09-12
Owner Magic Leap, Inc. (USA)
Inventor
  • Cheng, Hui-Chuan
  • Chung, Hyunsun
  • Trisnadi, Jahja I.
  • Carlisle, Clinton
  • Oh, Chulwoo
  • Curtis, Kevin Richard

Abstract

An optical device may include a light turning element. The optical device can include a first surface that is parallel to a horizontal axis and a second surface opposite to the first surface. The optical device may include a light module that includes a plurality of light emitters. The light module can be configured to combine light for the plurality of emitters. The optical device can further include a light input surface that is between the first and the second surfaces and is disposed with respect to the light module to receive light emitted from the plurality of emitters. The optical device may include an end reflector that is disposed on a side opposite the light input surface. The light coupled into the light turning element may be reflected by the end reflector and/or reflected from the second surface towards the first surface.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/30 - Polarising elements
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 30/26 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the autostereoscopic type
  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G03B 21/00 - Projectors or projection-type viewers; Accessories therefor
  • G03B 21/20 - Lamp housings
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
  • G09G 3/24 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using incandescent filaments
  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

3.

METHODS AND APPARATUSES FOR REDUCING STRAY LIGHT EMISSION FROM AN EYEPIECE OF AN OPTICAL IMAGING SYSTEM

      
Application Number 18666474
Status Pending
Filing Date 2024-05-16
First Publication Date 2024-09-12
Owner Magic Leap, Inc. (USA)
Inventor
  • Yaras, Fahri
  • Browy, Eric C.
  • Liu, Victor Kai
  • Bhargava, Samarth
  • Singh, Vikramjit
  • Vaughn, Michal Beau Dennison
  • Sawicki, Joseph Christopher

Abstract

An eyepiece for a head-mounted display includes one or more first waveguides arranged to receive light from a spatial light modulator at a first edge, guide at least some of the received light to a second edge opposite the first edge, and extract at least some of the light through a face of the one or more first waveguides between the first and second edges. The eyepiece also includes a second waveguide positioned to receive light exiting the one or more first waveguides at the second edge and guide the received light to one or more light absorbers.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 6/35 - Optical coupling means having switching means
  • G02B 27/42 - Diffraction optics
  • G02B 30/50 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels

4.

COLOR-SELECTIVE WAVEGUIDES FOR AUGMENTED REALITY/MIXED REALITY APPLICATIONS

      
Application Number 18654811
Status Pending
Filing Date 2024-05-03
First Publication Date 2024-09-12
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Bhagat, Sharad D.
  • Jurbergs, David Carl
  • Ong, Ryan Jason
  • Peroz, Christophe
  • Chang, Chieh
  • Li, Ling

Abstract

Color-selective waveguides, methods for fabricating color-selective waveguides, and augmented reality (AR)/mixed reality (MR) applications including color-selective waveguides are described. The color-selective waveguides can advantageously reduce or block stray light entering a waveguide (e.g., red, green, or blue waveguide), thereby reducing or eliminating back-reflection or back-scattering into the eyepiece.

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms
  • G02B 5/22 - Absorbing filters

5.

WAVEGUIDE-BASED ILLUMINATION FOR HEAD MOUNTED DISPLAY SYSTEM

      
Application Number 18647479
Status Pending
Filing Date 2024-04-26
First Publication Date 2024-09-05
Owner Magic Leap, Inc. (USA)
Inventor
  • Edwin, Lionel Ernest
  • Yeoh, Ivan Li Chuen
  • Jia, Zhiheng
  • Carlson, Adam C.

Abstract

A head-mounted display system is configured to project light to an eye of a user wearing the head-mounted display system to display content in a vision field of said user. The head-mounted display system comprises at least one diffusive optical element, at least one out-coupling optical element, at least one mask comprising at least one mask opening, at least one illumination in-coupling optical element configured to in-couple light from at least one illumination source into a light-guiding component, an image projector configured to in-couple an image, at least one illumination source configured to in-couple light into the at least one illumination in-coupling optical element, an eyepiece, a curved light-guiding component, a light-guiding component comprising a portion of a frame, and/or two light-guiding components disposed on opposite sides of at least one out-coupling optical element.

IPC Classes

  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

6.

TIME-MULTIPLEXED DISPLAY OF VIRTUAL CONTENT AT VARIOUS DEPTHS

      
Application Number 18661157
Status Pending
Filing Date 2024-05-10
First Publication Date 2024-09-05
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Komanduri, Ravi Kumar
  • Edwin, Lionel Ernest
  • Oh, Chulwoo

Abstract

Techniques for operating an optical system are disclosed. World light may be linearly polarized along a first axis. When the optical system is operating in accordance with a first state, a polarization of the world light may be rotated by 90 degrees, the world light may be linearly polarized along a second axis perpendicular to the first axis, and zero net optical power may be applied to the world light. When the optical system is operating in accordance with a second state, virtual image light may be projected onto an eyepiece of the optical system, the world light and the virtual image light may be linearly polarized along the second axis, a polarization of the virtual image light may be rotated by 90 degrees, and non-zero net optical power may be applied to the virtual image light.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 5/30 - Polarising elements
  • G02B 27/42 - Diffraction optics
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

7.

PATIENT VIEWING SYSTEM

      
Application Number 18663682
Status Pending
Filing Date 2024-05-14
First Publication Date 2024-09-05
Owner Magic Leap, Inc. (USA)
Inventor
  • Robaina, Nastasja U
  • Babu Jd, Praveen
  • Lundmark, David Charles
  • Ilic, Alexander

Abstract

A method of viewing a patient, including inserting a catheter, is described for health procedure navigation. A CT scan is carried out on a body part of the patient. Raw data from the CT scan is processed to create three-dimensional image data, which is stored in a data store. Projectors receive generated light in a pattern representative of the image data, and waveguides guide the light to a retina of an eye of a viewer while light from an external surface of the body is transmitted to the retina, so that the viewer sees the external surface of the body augmented with the processed data rendering of the body part.

IPC Classes

  • A61B 6/00 - Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
  • A61B 5/06 - Devices, other than using radiation, for detecting or locating foreign bodies
  • A61B 6/02 - Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
  • A61B 6/03 - Computerised tomographs
  • A61B 6/46 - Arrangements for interfacing with the operator or the patient
  • G06T 7/00 - Image analysis
  • G06T 15/00 - 3D [Three Dimensional] image rendering

8.

FAN ASSEMBLY FOR DISPLAYING AN IMAGE

      
Application Number 18663953
Status Pending
Filing Date 2024-05-14
First Publication Date 2024-09-05
Owner Magic Leap, Inc. (USA)
Inventor
  • Rohena, Guillermo Padin
  • Remsburg, Ralph
  • Kaehler, Adrian
  • Rynk, Evan Francis

Abstract

Apparatus and methods for displaying an image by a rotating structure are provided. The rotating structure can comprise blades of a fan. The fan can be a cooling fan for an electronics device such as an augmented reality display. In some embodiments, the rotating structure comprises light sources that emit light to generate the image. The light sources can comprise light-field emitters. In other embodiments, the rotating structure is illuminated by an external (e.g., non-rotating) light source.

IPC Classes

  • G09G 3/02 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes by tracing or scanning a light beam on a screen
  • F04D 17/16 - Centrifugal pumps for displacing without appreciable compression
  • F04D 25/08 - Units comprising pumps and their driving means the working fluid being air, e.g. for ventilation
  • F04D 29/00 - NON-POSITIVE-DISPLACEMENT PUMPS - Details, component parts, or accessories
  • F04D 29/42 - Casings; Connections for working fluid for radial or helico-centrifugal pumps
  • G02B 27/01 - Head-up displays
  • G02B 30/56 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
  • G06F 3/16 - Sound input; Sound output
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G08B 21/18 - Status alarms
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 5/10 - Intensity circuits
  • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects

9.

SPATIAL INSTRUCTIONS AND GUIDES IN MIXED REALITY

      
Application Number 18664036
Status Pending
Filing Date 2024-05-14
First Publication Date 2024-09-05
Owner Magic Leap, Inc. (USA)
Inventor
  • Arora, Tushar
  • Kramarich, Scott

Abstract

Exemplary systems and methods for creating spatial contents in a mixed reality environment are disclosed. In an example, a location associated with a first user in a coordinate space is determined. A persistent virtual content is generated. The persistent virtual content is associated with the first user's associated location. The first user's associated location is determined and is associated with the persistent virtual content. A location of a second user at a second time in the coordinate space is determined. The persistent virtual content is presented to the second user via a display at a location in the coordinate space corresponding to the first user's associated location.
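
As a rough illustration of the bookkeeping this abstract describes, the following Python sketch anchors content at a first user's location in a shared coordinate space and later resolves it for a second user near the same spot. The class names, the distance-based lookup, and the 5 m radius are hypothetical simplifications, not the patented method.

    from dataclasses import dataclass, field

    @dataclass
    class PersistentContent:
        """Virtual content anchored at a fixed location in a shared coordinate space."""
        content_id: str
        location: tuple          # (x, y, z) in the shared coordinate space
        payload: str             # e.g. an instruction or guide to display

    @dataclass
    class SharedSpace:
        """Minimal store mapping content ids to anchored virtual content."""
        contents: dict = field(default_factory=dict)

        def place(self, content_id, user_location, payload):
            # Associate the content with the first user's location at placement time.
            self.contents[content_id] = PersistentContent(content_id, user_location, payload)

        def visible_to(self, user_location, radius=5.0):
            # Return content whose anchor lies within `radius` of the second user's location.
            def dist2(a, b):
                return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
            return [c for c in self.contents.values()
                    if dist2(c.location, user_location) <= radius ** 2]

    space = SharedSpace()
    space.place("step-1", user_location=(1.0, 0.0, 2.0), payload="Attach bracket here")
    # Later, a second user near the same spot sees the persistent content.
    print([c.payload for c in space.visible_to((1.2, 0.1, 2.1))])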

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • H04W 4/024 - Guidance services

10.

LIGHTWEIGHT AND LOW POWER CROSS REALITY DEVICE WITH HIGH TEMPORAL RESOLUTION

      
Application Number 18655518
Status Pending
Filing Date 2024-05-06
First Publication Date 2024-08-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander

Abstract

A wearable display system for a cross reality (XR) system may have a dynamic vision sensor (DVS) camera and a color camera. At least one of the cameras may be a plenoptic camera. The wearable display system may dynamically restrict processing of image data from either or both cameras based on detected conditions and the XR function being performed. For tracking an object, image information may be processed for patches of a field of view of either or both cameras corresponding to the object. The object may be tracked based on asynchronously acquired events indicating changes within the patches. Stereoscopic or other types of image information may be used when event-based object tracking yields an inadequate quality metric. The tracked object may be a user's hand or a stationary object in the physical world, enabling calculation of the pose of the wearable display system and of the wearer's head.
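
The fallback logic described here can be sketched in a few lines of Python. Everything below is illustrative: the event format, the centroid-based estimate, the toy quality metric, and the `locate_in_frames` stand-in are assumptions, not the actual tracker.

    def track_object(events_in_patch, stereo_frames, min_quality=0.5):
        """Illustrative tracker: prefer cheap event-based updates within a patch of the
        field of view; fall back to frame (e.g. stereoscopic) tracking when the event
        track quality is inadequate. All thresholds and helpers are hypothetical."""
        # Event-based update: each event is (x, y, timestamp, polarity).
        if events_in_patch:
            xs = [e[0] for e in events_in_patch]
            ys = [e[1] for e in events_in_patch]
            estimate = (sum(xs) / len(xs), sum(ys) / len(ys))   # crude centroid of changes
            quality = min(1.0, len(events_in_patch) / 100.0)    # toy quality metric
            if quality >= min_quality:
                return estimate, "event-based"
        # Fallback: use full image information (here just a stand-in function).
        return locate_in_frames(stereo_frames), "frame-based"

    def locate_in_frames(frames):
        # Stand-in for stereoscopic matching; returns a fixed position for illustration.
        return (0.0, 0.0)

    pos, mode = track_object(events_in_patch=[(10, 12, 0.001, 1)] * 120, stereo_frames=None)
    print(pos, mode)   # (10.0, 12.0) event-based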

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups, for polarising
  • G06T 7/292 - Multi-camera tracking
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

11.

AUGMENTED REALITY MAP CURATION

      
Application Number 18655140
Status Pending
Filing Date 2024-05-03
First Publication Date 2024-08-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Hazen, Griffith Buckley
  • Dedonato, Amy
  • Shahrokni, Ali
  • Weisbih, Ben
  • Balakumar, Vinayram

Abstract

An augmented reality device may communicate with a map server via an API interface to provide mapping data that may be implemented into a canonical map, and may also receive map data from the map server. A visualization of map quality, including quality indicators for multiple cells of the environment, may be provided to the user as an overlay to the current real-world environment seen through the AR device. These visualizations may include, for example, a map quality minimap and/or a map quality overlay. The visualizations provide guidance to the user that allows more efficient updates to the map, thereby improving map quality and localization of users into the map.

IPC Classes

  • G06T 11/00 - 2D [Two Dimensional] image generation
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 3/60 - Rotation of a whole image or part thereof
  • G09B 29/10 - Map spot or co-ordinate position indicators; Map-reading aids

12.

DEPTH PLANE SELECTION FOR MULTI-DEPTH PLANE DISPLAY SYSTEMS BY USER CATEGORIZATION

      
Application Number 18658715
Status Pending
Filing Date 2024-05-08
First Publication Date 2024-08-29
Owner Magic Leap, Inc. (USA)
Inventor
  • Miller, Samuel A.
  • Agarwal, Lomesh
  • Edwin, Lionel Ernest
  • Yeoh, Ivan Li Chuen
  • Farmer, Daniel
  • Prokushkin, Sergey Fyodorovich
  • Munk, Yonatan
  • Selker, Edwin Joseph
  • Fonseka, Erik
  • Greco, Paul M.
  • Sommers, Jeffrey Scott
  • Stuart, Bradley Vincent
  • Das, Shiuli
  • Shanbhag, Suraj Manjunath

Abstract

A display system includes a head-mounted display configured to project light, having different amounts of wavefront divergence, to an eye of a user to display virtual image content appearing to be disposed at different depth planes. The wavefront divergence may be changed in discrete steps, with the change in steps being triggered based upon whether the user is fixating on a particular depth plane. The display system may be calibrated for switching depth planes for a main user. Upon determining that a guest user is utilizing the system, rather than undergoing a full calibration, the display system may be configured to switch depth planes based on a rough determination of the virtual content that the user is looking at. The virtual content may have an associated depth plane and the display system may be configured to switch to the depth plane of that virtual content.
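
A minimal Python sketch of the guest-user behavior described above: with no per-user calibration, adopt the depth plane associated with whatever virtual content the gaze direction points at most closely. The content table, the angular-offset proxy, and the plane indices are all hypothetical.

    import math

    # Hypothetical content registry: each item has a position and an associated depth plane.
    VIRTUAL_CONTENT = {
        "menu":   {"position": (0.0, 0.0, 0.7), "depth_plane": 0},   # near plane
        "poster": {"position": (0.2, 0.1, 3.0), "depth_plane": 1},   # far plane
    }

    def angular_offset(gaze_dir, position):
        # Rough proxy: angle between a unit gaze ray and the direction to the content.
        dot = sum(g * p for g, p in zip(gaze_dir, position))
        norm = math.sqrt(sum(p * p for p in position)) or 1.0
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    def select_depth_plane(gaze_dir):
        # Pick the content closest to the gaze direction and adopt its depth plane.
        name = min(VIRTUAL_CONTENT,
                   key=lambda n: angular_offset(gaze_dir, VIRTUAL_CONTENT[n]["position"]))
        return VIRTUAL_CONTENT[name]["depth_plane"]

    print(select_depth_plane(gaze_dir=(0.0, 0.0, 1.0)))   # 0: looking at the near menu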

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

13.

VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS WITH EMISSIVE MICRO-DISPLAYS

      
Application Number 18590722
Status Pending
Filing Date 2024-02-28
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Poliakov, Evgeni
  • Trisnadi, Jahja I.
  • Chung, Hyunsun
  • Edwin, Lionel Ernest
  • Cohen, Howard Russell
  • Taylor, Robert Blake
  • Russell, Andrew Ian
  • Curtis, Kevin Richard
  • Carlisle, Clinton

Abstract

A wearable display system includes one or more emissive micro-displays, e.g., micro-LED displays. The micro-displays may be monochrome micro-displays or full-color micro-displays. The micro-displays may include arrays of light emitters. Light collimators may be utilized to narrow the angular emission profile of light emitted by the light emitters. Where a plurality of emissive micro-displays is utilized, the micro-displays may be positioned at different sides of an optical combiner, e.g., an X-cube prism which receives light rays from different micro-displays and outputs the light rays from the same face of the cube. The optical combiner directs the light to projection optics, which outputs the light to an eyepiece that relays the light to a user's eye. The eyepiece may output the light to the user's eye with different amounts of wavefront divergence, to place virtual content on different depth planes.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 6/26 - Optical coupling means
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/14 - Beam splitting or combining systems operating by reflection only
  • G02B 27/18 - Optical systems or apparatus not provided for by any of the groups, for optical projection, e.g. combination of mirror and condenser and objective
  • G02B 27/30 - Collimators
  • G02B 27/40 - Optical focusing aids
  • G02B 27/62 - Optical apparatus specially adapted for adjusting optical elements during the assembly of optical systems
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 3/32 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
  • H02N 2/02 - Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners

14.

DYNAMIC FIELD OF VIEW VARIABLE FOCUS DISPLAY SYSTEM

      
Application Number 18650003
Status Pending
Filing Date 2024-04-29
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Yeoh, Ivan Li Chuen
  • Edwin, Lionel Ernest
  • Schowengerdt, Brian T.
  • Klug, Michael Anthony
  • Trisnadi, Jahja I.

Abstract

An augmented reality (AR) device is described with a display system configured to adjust an apparent distance between a user of the AR device and virtual content presented by the AR device. The AR device includes a first tunable lens that changes shape in order to affect the position of the virtual content. Distortion of real-world content on account of the changes made to the first tunable lens is prevented by a second tunable lens that changes shape to stay substantially complementary to the optical configuration of the first tunable lens. In this way, the virtual content can be positioned at almost any distance relative to the user without degrading the view of the outside world or adding extensive bulk to the AR device. The augmented reality device can also include tunable lenses for expanding a field of view of the augmented reality device.

IPC Classes

  • G02B 27/01 - Head-up displays
  • G02B 3/14 - Fluid-filled or evacuated lenses of variable focal length
  • G02B 26/00 - Optical devices or arrangements for the control of light using movable or deformable optical elements
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 5/80 - Geometric correction
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

15.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS HAVING UNEQUAL NUMBERS OF COMPONENT COLOR IMAGES DISTRIBUTED ACROSS DEPTH PLANES

      
Application Number 18648347
Status Pending
Filing Date 2024-04-27
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Hua, Hong
  • Cheng, Hui-Chuan
  • Peroz, Christophe

Abstract

Images perceived to be substantially full color or multi-colored may be formed using component color images that are distributed in unequal numbers across a plurality of depth planes. The distribution of component color images across depth planes may vary based on color. In some embodiments, a display system includes a stack of waveguides that each output light of a particular color, with some colors having fewer numbers of associated waveguides than other colors. The waveguide stack may include multiple pluralities (e.g., first and second pluralities) of waveguides, each configured to produce an image by outputting light corresponding to a particular color. The total number of waveguides in the second plurality of waveguides may be less than the total number of waveguides in the first plurality of waveguides, and may be more than the total number of waveguides in a third plurality of waveguides, in embodiments that utilize three component colors.

IPC Classes

16.

WIDE FIELD-OF-VIEW POLARIZATION SWITCHES AND METHODS OF FABRICATING LIQUID CRYSTAL OPTICAL ELEMENTS WITH PRETILT

      
Application Number 18648448
Status Pending
Filing Date 2024-04-28
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar

Abstract

A switchable optical assembly comprises a switchable waveplate configured to be electrically activated and deactivated to selectively alter the polarization state of light incident thereon. The switchable waveplate comprises first and second surfaces and a first liquid crystal layer disposed between the first and second surfaces. The first liquid crystal layer comprises a plurality of liquid crystal molecules that are rotated about respective axes parallel to a central axis, where the rotation varies with an azimuthal angle about the central axis. The switchable waveplate additionally comprises a plurality of electrodes to apply an electrical signal across the first liquid crystal layer.

IPC Classes

  • G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
  • G02B 27/01 - Head-up displays
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G02F 1/1347 - Arrangement of liquid crystal layers or cells in which the final condition of one light beam is achieved by the addition of the effects of two or more layers or cells

17.

PHYSICS-BASED AUDIO AND HAPTIC SYNTHESIS

      
Application Number 18649778
Status Pending
Filing Date 2024-04-29
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Leider, Colby Nelson
  • Mathew, Justin Dan
  • Land, Michael Z.
  • Wood, Blaine Ivin
  • Lee, Jung-Suk
  • Tajik, Anastasia Andreyevna
  • Jot, Jean-Marc

Abstract

Disclosed herein are systems and methods for generating and presenting virtual audio for mixed reality systems. A method may include determining a collision between a first object and a second object, wherein the first object comprises a first virtual object. A memory storing one or more audio models can be accessed. It can be determined if the one or more audio models stored in the memory comprises an audio model corresponding to the first object. In accordance with a determination that the one or more audio models comprises an audio model corresponding to the first object, an audio signal can be synthesized, wherein the audio signal is based on the collision and the audio model corresponding to the first object, and the audio signal can be presented to a user via a speaker of a head-wearable device. In accordance with a determination that the one or more audio models does not comprise an audio model corresponding to the first object, an acoustic property of the first object can be determined, a custom audio model based on the acoustic property of the first object can be generated, an audio signal can be synthesized, wherein the audio signal is based on the collision and the custom audio model, and the audio signal can be presented, via a speaker of a head-wearable device, to a user.
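
To make the decision flow concrete, here is a minimal Python sketch: reuse a stored audio model for the colliding object if one exists, otherwise derive a custom model from an acoustic property and synthesize from that. The model store, the `acoustic_property` helper, and the damped-sine synthesis are hypothetical simplifications, not the patented synthesis.

    import math

    AUDIO_MODELS = {"virtual_mug": {"base_freq": 880.0, "decay": 6.0}}   # stored models

    def acoustic_property(obj_name):
        # Stand-in: derive a resonant frequency and decay for an object with no stored model.
        return {"base_freq": 440.0, "decay": 3.0}

    def synthesize(model, impact_speed, sample_rate=16000, duration=0.25):
        # Simple damped sinusoid scaled by impact speed -- illustrative only.
        n = int(sample_rate * duration)
        return [impact_speed * math.exp(-model["decay"] * t / sample_rate)
                * math.sin(2 * math.pi * model["base_freq"] * t / sample_rate)
                for t in range(n)]

    def audio_for_collision(obj_name, impact_speed):
        model = AUDIO_MODELS.get(obj_name)
        if model is None:                       # no stored model: build a custom one
            model = acoustic_property(obj_name)
            AUDIO_MODELS[obj_name] = model
        return synthesize(model, impact_speed)

    signal = audio_for_collision("virtual_block", impact_speed=0.8)
    print(len(signal), round(signal[10], 4))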

IPC Classes

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • A63F 13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
  • A63F 13/577 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
  • G06F 3/16 - Sound input; Sound output
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

18.

SPATIALLY VARIABLE LIQUID CRYSTAL DIFFRACTION GRATINGS

      
Application Number 18651052
Status Pending
Filing Date 2024-04-30
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor Oh, Chulwoo

Abstract

The present disclosure relates to display systems and, more particularly, to augmented reality display systems including diffraction grating(s), and methods of fabricating same. A diffraction grating includes a plurality of different diffracting zones having a periodically repeating lateral dimension corresponding to a grating period adapted for light diffraction. The diffraction grating additionally includes a plurality of different liquid crystal layers corresponding to the different diffracting zones. The different liquid crystal layers include liquid crystal molecules that are aligned differently, such that the different diffracting zones have different optical properties associated with light diffraction.

IPC Classes

19.

CONTEXTUAL AWARENESS OF USER INTERFACE MENUS

      
Application Number 18651200
Status Pending
Filing Date 2024-04-30
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Powderly, James M.
  • Naples, Alysha
  • Hoover, Paul Armistead
  • Spofford, Tucker

Abstract

Examples of systems and methods for a wearable system to automatically select or filter available user interface interactions or virtual objects are disclosed. The wearable system can select a group of virtual objects for user interaction based on contextual information associated with the user, the user's environment, physical or virtual objects in the user's environment, or the user's physiological or psychological state.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06K 7/14 - Methods or arrangements for sensing record carriers by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
  • G06K 19/06 - Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

20.

AUDIOVISUAL PRESENCE TRANSITIONS IN A COLLABORATIVE REALITY ENVIRONMENT

      
Application Number 18651314
Status Pending
Filing Date 2024-04-30
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Pejsa, Tomislav
  • Mori, Koichi
  • Bailey, Richard St. Clair

Abstract

Examples of systems and methods to facilitate audiovisual presence transitions of virtual objects such as virtual avatars in a mixed reality collaborative environment are disclosed. The systems and methods may be configured to produce different audiovisual presence transitions such as appearance, disappearance and reappearance of the virtual avatars. The virtual avatar audiovisual transitions may be further indicated by various visual and sound effects of the virtual avatars. The transitions may occur based on various colocation or decolocation scenarios.

IPC Classes

  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

21.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18651473
Status Pending
Filing Date 2024-04-30
First Publication Date 2024-08-22
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Rodriguez, Jose Felix
  • Perez, Ricardo Martinez
  • Nourai, Reza
  • Taylor, Robert Blake

Abstract

A method in a virtual, augmented, or mixed reality system includes a GPU determining/detecting an absence of image data. The method also includes shutting down a portion/component/function of the GPU. The method further includes shutting down a communication link between the GPU and a DB. Moreover, the method includes shutting down a portion/component/function of the DB. In addition, the method includes shutting down a communication link between the DB and a display panel. The method further includes shutting down a portion/component/function of the display panel.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
  • H04L 45/48 - Routing tree calculation
  • H04L 67/131 - Protocols for games, networked simulations or virtual reality

22.

REVERBERATION GAIN NORMALIZATION

      
Application Number 18653795
Status Pending
Filing Date 2024-05-02
First Publication Date 2024-08-22
Owner Magic Leap, Inc. (USA)
Inventor
  • Audfray, Remi Samuel
  • Jot, Jean-Marc
  • Dicker, Samuel Charles

Abstract

Systems and methods for providing accurate and independent control of reverberation properties are disclosed. In some embodiments, a system may include a reverberation processing system, a direct processing system, and a combiner. The reverberation processing system can include a reverb initial power (RIP) control system and a reverberator. The RIP control system can include a reverb initial gain (RIG) and a RIP corrector. The RIG can be configured to apply a RIG value to the input signal, and the RIP corrector can be configured to apply a RIP correction factor to the signal from the RIG. The reverberator can be configured to apply reverberation effects to the signal from the RIP control system. In some embodiments, one or more values and/or correction factors can be calculated and applied such that the signal output from a component in the reverberation processing system is normalized to a predetermined value (e.g., unity (1.0)).
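
A minimal Python sketch of the normalization idea, assuming a simple RMS-based power estimate: apply a reverb initial gain (RIG) value, then a correction factor so the reverberator input power is normalized to a target of unity. The factor computation here is illustrative, not the RIP corrector defined in the patent.

    def rms(signal):
        return (sum(s * s for s in signal) / len(signal)) ** 0.5

    def normalize_reverb_input(signal, rig_value, target_power=1.0):
        after_rig = [rig_value * s for s in signal]          # apply the RIG value
        current = rms(after_rig) or 1.0
        correction = (target_power ** 0.5) / current         # illustrative correction factor
        return [correction * s for s in after_rig]

    dry = [0.1, -0.2, 0.05, 0.3, -0.15]
    wet_input = normalize_reverb_input(dry, rig_value=0.7)
    print(round(rms(wet_input), 6))   # ~1.0: normalized before the reverberator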

IPC Classes

  • G10K 15/08 - Arrangements for producing a reverberation or echo sound

23.

OPTICAL ELEMENTS BASED ON POLYMERIC STRUCTURES INCORPORATING INORGANIC MATERIALS

      
Application Number 18616981
Status Pending
Filing Date 2024-03-26
First Publication Date 2024-08-15
Owner Magic Leap, Inc. (USA)
Inventor
  • West, Melanie Maputol
  • Peroz, Christophe
  • Melli, Mauro

Abstract

The present disclosure relates to display systems and, more particularly, to augmented reality display systems. In one aspect, a method of fabricating an optical element includes providing a substrate having a first refractive index and transparent in the visible spectrum. The method additionally includes forming on the substrate periodically repeating polymer structures. The method further includes exposing the substrate to a metal precursor followed by an oxidizing precursor. Exposing the substrate is performed under a pressure and at a temperature such that an inorganic material comprising the metal of the metal precursor is incorporated into the periodically repeating polymer structures, thereby forming a pattern of periodically repeating optical structures configured to diffract visible light. The optical structures have a second refractive index greater than the first refractive index.

IPC Classes

24.

HAND GESTURE INPUT FOR WEARABLE SYSTEM

      
Application Number 18633947
Status Pending
Filing Date 2024-04-12
First Publication Date 2024-08-15
Owner Magic Leap, Inc. (USA)
Inventor Lacey, Paul

Abstract

Techniques are disclosed for allowing a user's hands to interact with virtual objects. An image of at least one hand may be received from an image capture device. A plurality of keypoints associated with the at least one hand may be detected. In response to determining that a hand is making or is transitioning into making a particular gesture, a subset of the plurality of keypoints may be selected. An interaction point may be registered to a particular location relative to the subset of the plurality of keypoints based on the particular gesture. A proximal point may be registered to a location along the user's body. A ray may be cast from the proximal point through the interaction point. A multi-DOF controller for interacting with the virtual object may be formed based on the ray.
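
The keypoint-to-ray construction can be sketched as follows in Python. The gesture-to-keypoint mapping, the centroid placement of the interaction point, and the shoulder proximal point are assumptions for illustration only.

    import math

    def centroid(points):
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(3))

    def make_ray(keypoints, gesture, shoulder):
        # Select keypoints relevant to the detected gesture (choices are illustrative).
        if gesture == "pinch":
            subset = [keypoints["thumb_tip"], keypoints["index_tip"]]
        else:
            subset = list(keypoints.values())
        interaction_point = centroid(subset)          # registered relative to the subset
        direction = tuple(i - s for i, s in zip(interaction_point, shoulder))
        norm = math.sqrt(sum(d * d for d in direction)) or 1.0
        return shoulder, tuple(d / norm for d in direction)   # ray origin + unit direction

    keypoints = {"thumb_tip": (0.28, 1.30, 0.42), "index_tip": (0.30, 1.32, 0.44),
                 "wrist": (0.25, 1.20, 0.30)}
    origin, direction = make_ray(keypoints, "pinch", shoulder=(0.20, 1.40, 0.00))
    print(origin, tuple(round(d, 3) for d in direction))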

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition

25.

CROSS REALITY SYSTEM WITH LOCALIZATION SERVICE

      
Application Number 18638995
Status Pending
Filing Date 2024-04-18
First Publication Date 2024-08-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Shahrokni, Ali
  • Olshansky, Daniel
  • Zhao, Xuan
  • Torres, Rafael Domingos
  • Holder, Joel David
  • Lin, Keng-Sheng
  • Swaminathan, Ashwin
  • Mohan, Anush

Abstract

A cross reality system enables any of multiple devices to efficiently and accurately access previously stored maps and render virtual content specified in relation to those maps. The cross reality system may include a cloud-based localization service that responds to requests from devices to localize with respect to a stored map. The request may include one or more sets of feature descriptors extracted from an image of the physical world around the device. Those features may be posed relative to a coordinate frame used by the local device. The localization service may identify one or more stored maps with a matching set of features. Based on a transformation required to align the features from the device with the matching set of features, the localization service may compute and return to the device a transformation to relate its local coordinate frame to a coordinate frame of the stored map.
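
A toy Python sketch of the localization flow, assuming a translation-only transform for brevity: match the device's feature descriptors against each stored map and return a transform relating the device's local frame to the best-matching map's frame. The descriptor format, matching threshold, and map contents are all made up.

    def descriptor_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def localize(request_features, stored_maps, max_desc_dist=0.1):
        """request_features / map features: lists of (descriptor, position_in_own_frame)."""
        best = None
        for map_id, map_features in stored_maps.items():
            offsets = []
            for desc, pos in request_features:
                # Nearest stored feature by descriptor distance.
                m_desc, m_pos = min(map_features, key=lambda f: descriptor_distance(desc, f[0]))
                if descriptor_distance(desc, m_desc) <= max_desc_dist:
                    offsets.append(tuple(mp - p for mp, p in zip(m_pos, pos)))
            if offsets and (best is None or len(offsets) > best[2]):
                # Average offset ~ translation from the device frame to the map frame.
                t = tuple(sum(o[i] for o in offsets) / len(offsets) for i in range(3))
                best = (map_id, t, len(offsets))
        return best    # (map_id, local->map translation, match count) or None

    stored = {"office": [((0.1, 0.9), (2.0, 0.0, 1.0)), ((0.8, 0.2), (3.0, 1.0, 1.0))]}
    request = [((0.1, 0.9), (0.0, 0.0, 0.0)), ((0.8, 0.2), (1.0, 1.0, 0.0))]
    print(localize(request, stored))   # ('office', (2.0, 0.0, 1.0), 2)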

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • H04L 67/131 - Protocols for games, networked simulations or virtual reality

26.

PIXEL INTENSITY MODULATION USING MODIFYING GAIN VALUES

      
Application Number 18643757
Status Pending
Filing Date 2024-04-23
First Publication Date 2024-08-15
Owner Magic Leap, Inc. (USA)
Inventor
  • Johnston, Richard Stephen
  • Schowengerdt, Brian T.

Abstract

A visual perception device has a look-up table stored in a laser driver chip. The look-up table includes relational gain data to compensate for brighter areas of a laser pattern, where pixels are located more closely together than in areas where the pixels are farther apart, and to compensate for differences in intensity of individual pixels when pixel intensities are altered due to design characteristics of an eyepiece.
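
A minimal Python sketch of the gain look-up, assuming a hypothetical table keyed by pixel spacing: closely spaced scan pixels receive a lower gain so the resulting region is not over-bright.

    GAIN_LUT = {          # pixel spacing (arbitrary units) -> gain; values are illustrative
        1: 0.60,          # tightly packed pixels get a lower gain
        2: 0.80,
        3: 1.00,          # nominal spacing, unity gain
    }

    def apply_gain(pixel_values, pixel_spacings):
        out = []
        for value, spacing in zip(pixel_values, pixel_spacings):
            gain = GAIN_LUT.get(spacing, 1.0)   # default to unity if spacing not tabulated
            out.append(value * gain)
        return out

    print(apply_gain([100, 100, 100], [1, 2, 3]))   # [60.0, 80.0, 100.0]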

IPC Classes

  • G02B 26/10 - Scanning systems
  • G02B 27/01 - Head-up displays
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes

27.

METHODS AND SYSTEMS FOR CREATING VIRTUAL AND AUGMENTED REALITY

      
Application Number 18635985
Status Pending
Filing Date 2024-04-15
First Publication Date 2024-08-15
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Bradski, Gary R.
  • Miller, Samuel A.
  • Abovitz, Rony

Abstract

Configurations are disclosed for presenting virtual reality and augmented reality experiences to users. The system may comprise an image capturing device to capture one or more images, the one or more images corresponding to a field of view of a user of a head-mounted augmented reality device, and a processor communicatively coupled to the image capturing device to extract a set of map points from the set of images, to identify a set of sparse points and a set of dense points from the extracted set of map points, and to perform a normalization on the set of map points.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G02B 27/01 - Head-up displays
  • G02B 30/52 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06V 10/50 - Extraction of image or video features by summing image-intensity values; Projection analysis
  • G06V 10/56 - Extraction of image or video features relating to colour
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G10L 19/00 - Speech or audio signal analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • H04N 13/00 - PICTORIAL COMMUNICATION, e.g. TELEVISION - Details thereof
  • H04N 13/117 - Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
  • H04N 13/139 - Format conversion, e.g. of frame-rate or size
  • H04N 13/189 - Recording image signals; Reproducing recorded image signals
  • H04N 13/204 - Image signal generators using stereoscopic image cameras
  • H04N 13/239 - Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
  • H04N 13/271 - Image signal generators wherein the generated image signals comprise depth maps or disparity maps
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/366 - Image reproducers using viewer tracking
  • H04N 13/371 - Image reproducers using viewer tracking for tracking rotational head movements around the vertical axis
  • H04N 13/383 - Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
  • H04N 21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance

28.

CONTROL OF DYNAMIC BRIGHTNESS OF LIGHT-EMITTING DIODE ARRAY

      
Application Number 18645700
Status Pending
Filing Date 2024-04-25
First Publication Date 2024-08-15
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Davis, Christopher F.
  • Cohen, Howard Russell
  • Zivkovic, Mihailo Slobodan
  • Capps, Marshall Charles

Abstract

An apparatus includes a light-emitting diode (LED) driver circuit, one or more LEDs of an LED array, and an electronic switching circuit. The LED driver circuit is configured to generate an electric current. The one or more LEDs are electrically connected to the LED driver circuit. The electronic switching circuit is electrically connected to the one or more LEDs and configured to be placed in one of multiple switching configurations. The electronic switching circuit is further configured to direct a portion of the electric current away from the one or more LEDs, such that a remaining portion of the electric current drives the one or more LEDs. The portion of the electric current corresponds to the one of the multiple switching configurations.
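
The current-steering idea can be sketched in Python as follows, with made-up diversion fractions: the driver supplies a fixed current, the selected switching configuration diverts a fraction of it, and the remainder drives the LEDs.

    SWITCH_CONFIGS = {0: 0.00, 1: 0.25, 2: 0.50, 3: 0.75}   # fraction of current diverted

    def led_current(driver_current_ma, config):
        diverted = driver_current_ma * SWITCH_CONFIGS[config]
        return driver_current_ma - diverted      # the remaining portion drives the LEDs

    for cfg in sorted(SWITCH_CONFIGS):
        print(cfg, led_current(20.0, cfg), "mA")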

IPC Classes

  • H05B 45/46 - Circuit arrangements for operating light-emitting diodes [LED] - Details of LED load circuits with an active control inside an LED matrix having LEDs disposed in parallel lines
  • H05B 45/10 - Controlling the intensity of the light

29.

Charger for head mounted audio-visual display system

      
Application Number 29731453
Grant Number D1038875
Status In Force
Filing Date 2020-04-15
First Publication Date 2024-08-13
Grant Date 2024-08-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Fraser, Bradley
  • Awad, Haney
  • Günther, Sebastian Gonzalo Arrieta
  • Kaji, Masamune
  • Wheeler, William

30.

UNFUSED POSE-BASED DRIFT CORRECTION OF A FUSED POSE OF A TOTEM IN A USER INTERACTION SYSTEM

      
Application Number 18597716
Status Pending
Filing Date 2024-03-06
First Publication Date 2024-08-08
Owner Magic Leap, Inc. (USA)
Inventor Wan, Sheng

Abstract

The invention relates generally to a user interaction system having a head unit for a user to wear and a totem that the user holds in their hand and that determines the location of a virtual object seen by the user. A fusion routine generates a fused location of the totem in a world frame based on a combination of EM wave data and totem IMU data. The fused pose may drift over time due to sensor model mismatch. An unfused pose determination modeler routinely establishes an unfused pose of the totem relative to the world frame. A drift is declared when the difference between the fused pose and the unfused pose is more than a predetermined maximum distance.
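
A minimal Python sketch of the drift check, with a made-up 5 cm threshold and example poses: compare the fused totem position with the independently derived unfused position and declare drift when the gap exceeds the maximum distance.

    import math

    def distance(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    def drift_detected(fused_position, unfused_position, max_distance=0.05):
        return distance(fused_position, unfused_position) > max_distance

    fused = (0.50, 1.20, 0.30)      # position from the EM/IMU fusion routine (world frame)
    unfused = (0.58, 1.20, 0.30)    # position from the unfused pose determination
    print(drift_detected(fused, unfused))   # True: 8 cm gap exceeds the 5 cm threshold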

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G09G 5/38 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of individual graphic patterns using a bit-mapped memory with means for controlling the display position

31.

VIRTUAL AND AUGMENTED REALITY SYSTEMS AND METHODS

      
Application Number 18611050
Status Pending
Filing Date 2024-03-20
First Publication Date 2024-08-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Welch, William Hudson
  • Greco, Paul M.
  • Abovitz, Rony
  • Munk, Yonatan
  • Miller, Samuel A.

Abstract

Methods and systems are disclosed for presenting virtual objects on a limited number of depth planes using, e.g., an augmented reality display system. A farthest one of the depth planes is within a mismatch tolerance of optical infinity. The display system may switch the depth plane on which content is actively displayed so that the content is displayed on the depth plane on which a user is fixating. The impact of errors in fixation tracking is addressed using partially overlapping depth planes. A fixation depth is determined and the display system determines whether to adjust selection of a selected depth plane at which a virtual object is presented. The determination may be based on whether the fixation depth falls within a depth overlap region of adjacent depth planes. The display system may switch the active depth plane depending upon whether the fixation depth falls outside the overlap region.
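
The overlap-based switching can be sketched as follows in Python; the plane ranges and overlap are hypothetical. The display stays on its current depth plane while the fixation depth remains inside that plane's range and switches only when it falls outside, which tolerates small fixation-tracking errors.

    DEPTH_PLANES = [
        {"id": 0, "near": 0.3, "far": 1.2},   # metres; overlaps the next plane from 0.8-1.2 m
        {"id": 1, "near": 0.8, "far": 10.0},
    ]

    def select_plane(fixation_depth, current_id):
        current = DEPTH_PLANES[current_id]
        if current["near"] <= fixation_depth <= current["far"]:
            return current_id                            # inside the overlap: do not switch
        for plane in DEPTH_PLANES:
            if plane["near"] <= fixation_depth <= plane["far"]:
                return plane["id"]                       # fixation left the range: switch
        return current_id                                # out of all ranges: keep current

    print(select_plane(1.0, current_id=0))   # 0 -- within the overlap, no switch
    print(select_plane(2.5, current_id=0))   # 1 -- beyond plane 0, switch to the far plane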

IPC Classes

32.

DEPTH BASED DYNAMIC VISION SENSOR

      
Application Number 18635670
Status Pending
Filing Date 2024-04-15
First Publication Date 2024-08-08
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Ilic, Alexander
  • Fonseka, Erik

Abstract

An image sensor suitable for use in an augmented reality system to provide low latency image analysis with low power consumption. The image sensor may have multiple pixel cells, each of the pixel cells comprising a photodetector to generate an electric signal based on an intensity of light incident upon the photodetector. The pixel cells may include multiple subsets of pixel cells, each subset of pixel cells including at least one angle-of-arrival to-intensity converter to modulate incident light reaching one or more of the pixel cells in the subset based on an angle of arrival of the incident light. Each pixel cell within the plurality of pixel cells may include differential readout circuitry configured to output a readout signal only when an amplitude of a current electric signal from the photodetector is different from an amplitude of a previous electric signal from the photodetector.
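
A minimal Python sketch of the differential readout behavior, with made-up signal values and tolerance: a pixel produces a readout only when its current signal differs from the previously read value.

    class DifferentialPixel:
        def __init__(self, tolerance=0.02):
            self.tolerance = tolerance
            self.previous = None

        def read(self, amplitude):
            # Return the amplitude only when it changed; otherwise return None (no event).
            if self.previous is None or abs(amplitude - self.previous) > self.tolerance:
                self.previous = amplitude
                return amplitude
            return None

    pixel = DifferentialPixel()
    for sample in [0.10, 0.10, 0.11, 0.25, 0.25]:
        print(pixel.read(sample))   # 0.10, None, None, 0.25, None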

IPC Classes

  • H04N 25/705 - Pixels for depth measurement, e.g. RGBZ
  • G02B 27/42 - Diffraction optics
  • H04N 13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 25/75 - Circuitry for providing, modifying or processing image signals from the pixel array

33.

MISSION DRIVEN VIRTUAL CHARACTER FOR USER INTERACTION

      
Application Number 18434394
Status Pending
Filing Date 2024-02-06
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Whitney, Kristofer Ryan
  • Moran, Andrew
  • Price, Danielle Marie
  • Mangagil, Jonathan Wells
  • Kalkute, Minal Luxman

Abstract

An augmented reality (AR) display device can display a virtual assistant character that interacts with the user of the AR device. The virtual assistant may be represented by a robot (or other) avatar that assists the user with contextual objects and suggestions depending on what virtual content the user is interacting with. Animated images may be displayed above the robot's head to display its intents to the user. For example, the robot can run up to a menu and suggest an action and show the animated images. The robot can materialize virtual objects that appear on its hands. The user can remove such an object from the robot's hands and place it in the environment. If the user does not interact with the object, the robot can dematerialize it. The robot can rotate its head to keep looking at the user and/or an object that the user has picked up.

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

34.

CROSS REALITY SYSTEM

      
Application Number 18612614
Status Pending
Filing Date 2024-03-21
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Mohan, Anush
  • Torres, Rafael Domingos
  • Olshansky, Daniel
  • Miller, Samuel A.
  • Tajik, Jehangir
  • Holder, Joel David
  • Miranda, Jeremy Dwayne
  • Taylor, Robert Blake
  • Swaminathan, Ashwin
  • Agarwal, Lomesh
  • Barot, Hiral Honar
  • Suzuki, Helder Toshiro
  • Shahrokni, Ali
  • Guendelman, Eran
  • Singhal, Prateek
  • Zhao, Xuan
  • Choudhary, Siddharth
  • Kramer, Nicholas Atkinson
  • Tossell, Kenneth William
  • Moore, Christian Ivan Robert

Abstract

A cross reality system that provides an immersive user experience by storing persistent spatial information about the physical world that one or multiple user devices can access to determine position within the physical world and that applications can access to specify the position of virtual objects within the physical world. Persistent spatial information enables users to have a shared virtual, as well as physical, experience when interacting with the cross reality system. Further, persistent spatial information may be used in maps of the physical world, enabling one or multiple devices to access and localize into previously stored maps, reducing the need to map a physical space before using the cross reality system in it. Persistent spatial information may be stored as persistent coordinate frames, which may include a transformation relative to a reference orientation and information derived from images in a location corresponding to the persistent coordinate frame.

IPC Classes  ?

  • G06T 17/05 - Geographic models
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 16/29 - Geographical information databases
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04L 67/01 - Protocols

35.

WAVEGUIDES WITH HIGH INDEX MATERIALS AND METHODS OF FABRICATION THEREOF

      
Application Number 18622925
Status Pending
Filing Date 2024-03-30
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Luo, Kang
  • Vaughn, Michal Beau Dennison
  • Bhargava, Samarth
  • Yang, Shuqiang
  • Miller, Michael Nevin
  • Xu, Frank Y.
  • Klug, Michael Anthony
  • Messer, Kevin
  • Tekolste, Robert D.
  • Deng, Xiaopei
  • Li, Xiao

Abstract

Waveguides comprising materials with refractive index greater than or equal to 1.8 and methods of patterning waveguides are disclosed. Patterned waveguides comprising materials with refractive index greater than or equal to 1.8 can be incorporated in display devices, such as, for example, wearable display devices, to project virtual images to a viewer. A waveguide may be transparent and may comprise a substrate comprising a first material having a first refractive index greater than about 2.0. Diffractive features may be formed, on the substrate, of a second material having a second refractive index that is lower than the first refractive index. A third material may be disposed over the diffractive features and may have a third refractive index that is higher than the second refractive index.

IPC Classes  ?

36.

METHOD AND SYSTEM FOR PROJECTION DISPLAY WITH POLARIZATION SELECTIVE REFLECTORS

      
Application Number 18631607
Status Pending
Filing Date 2024-04-10
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Freedman, Barak
  • Pellman, Asaf
  • Weinstein, Ori

Abstract

A method for displaying an image using an eyepiece waveguide includes emitting light from a first laser and emitting light from a second laser. The method also includes receiving light from the first laser at a first polarization selective reflector, reflecting, using the first polarization selective reflector, light from the first laser at a first angle to a scanning mirror, and receiving light from the second laser at a second polarization selective reflector. The method further includes reflecting, using the second polarization selective reflector, light from the second laser at a second angle to the scanning mirror, receiving light reflected by the scanning mirror at an input coupling port of the eyepiece waveguide, and projecting the image from an output port of the eyepiece waveguide.

IPC Classes  ?

  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 27/01 - Head-up displays
  • G02B 30/25 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type using polarisation techniques
  • G02B 30/60 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images involving reflecting prisms and mirrors only

37.

BROADBAND ADAPTIVE LENS ASSEMBLY FOR AUGMENTED REALITY DISPLAY

      
Application Number 18601808
Status Pending
Filing Date 2024-03-11
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Patterson, Roy Matthew
  • Carden, Charles Scott
  • Miller, Michael Nevin
  • Singh, Vikramjit

Abstract

A display device comprises a waveguide configured to guide light in a lateral direction parallel to an output surface of the waveguide. The waveguide is further configured to outcouple the guided light through the output surface. The display device additionally comprises a broadband adaptive lens assembly configured to incouple and to diffract therethrough the outcoupled light from the waveguide. The broadband adaptive lens assembly comprises a first waveplate lens comprising a liquid crystal (LC) layer arranged such that the waveplate lens has birefringence (Δn) that varies in a radially outward direction from a central region of the first waveplate lens and configured to diffract the outcoupled light at a diffraction efficiency greater than 90% within a wavelength range including at least 450 nm to 630 nm. The broadband adaptive lens assembly is configured to be selectively switched between a plurality of states having different optical powers.

IPC Classes  ?

  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02B 1/00 - Optical elements characterised by the material of which they are made; Optical coatings for optical elements
  • G02B 5/18 - Diffracting gratings
  • G02B 5/30 - Polarising elements
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 6/122 - Basic optical elements, e.g. light-guiding paths
  • G02B 27/01 - Head-up displays
  • G02F 1/1337 - Surface-induced orientation of the liquid crystal molecules, e.g. by alignment layers
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
  • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
  • G09G 5/14 - Display of multiple viewports

38.

TRANSMODAL INPUT FUSION FOR A WEARABLE SYSTEM

      
Application Number 18629740
Status Pending
Filing Date 2024-04-08
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Lacey, Paul
  • Miller, Samuel A.
  • Kramer, Nicholas Atkinson
  • Lundmark, David Charles

Abstract

Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, totem, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The wearable system can detect when different inputs converge together, such as when a user seeks to select a virtual object using multiple inputs such as eye gaze, head pose, hand gesture, and totem input. Upon detecting an input convergence, the wearable system can perform a transmodal filtering scheme that leverages the converged inputs to assist in properly interpreting what command the user is providing or what object the user is targeting.
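
A minimal sketch of the convergence idea, assuming a simple cone test around each input's pointing ray: when eye gaze and head pose independently select the same virtual object, that agreement can be used to disambiguate the user's intent. The scene contents, thresholds, and input directions below are invented for illustration and are not the patent's filtering scheme.

    import numpy as np

    def pick_target(origin, direction, objects, max_angle_deg=10.0):
        """Return the object whose center lies closest to the ray, within a cone."""
        direction = direction / np.linalg.norm(direction)
        best, best_angle = None, np.deg2rad(max_angle_deg)
        for name, center in objects.items():
            to_obj = np.asarray(center, float) - origin
            to_obj /= np.linalg.norm(to_obj)
            angle = np.arccos(np.clip(direction @ to_obj, -1.0, 1.0))
            if angle < best_angle:
                best, best_angle = name, angle
        return best

    # Hypothetical scene and inputs (positions and directions are made up).
    objects = {"menu": [0.0, 1.5, -2.0], "lamp": [1.5, 1.2, -2.5]}
    head_origin = np.zeros(3)
    eye_gaze_dir = np.array([0.05, 0.70, -1.0])
    head_pose_dir = np.array([0.00, 0.65, -1.0])

    eye_target = pick_target(head_origin, eye_gaze_dir, objects)
    head_target = pick_target(head_origin, head_pose_dir, objects)

    # "Convergence" here is simply both modalities agreeing on the same object;
    # the agreed-upon target can then disambiguate a voice or totem command.
    if eye_target is not None and eye_target == head_target:
        print("converged on:", eye_target)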

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/16 - Sound input; Sound output
  • G06T 5/20 - Image enhancement or restoration by the use of local operators
  • G06T 7/70 - Determining position or orientation of objects or cameras

39.

VISUAL TRACKING OF PERIPHERAL DEVICES

      
Application Number 18632182
Status Pending
Filing Date 2024-04-10
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Nienstedt, Zachary C.
  • Miller, Samuel A.
  • Freedman, Barak
  • Edwin, Lionel Ernest
  • Browy, Eric C.
  • Welch, William Hudson
  • Lidji, Ron Liraz

Abstract

An augmented reality (AR) system includes a wearable device comprising a display inside the wearable device and operable to display virtual content and an imaging device mounted to the wearable device. The AR system also includes a handheld device comprising handheld fiducials affixed to the handheld device and a computing apparatus configured to perform localization of the handheld device with respect to the wearable device.
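
Localizing a handheld device from fiducials imaged by a headset-mounted camera is commonly framed as a perspective-n-point (PnP) problem; the sketch below uses OpenCV's solvePnP for that standard step. The fiducial layout, camera intrinsics, and pixel measurements are made up for illustration, and the patent is not asserted to use this particular solver.

    import numpy as np
    import cv2

    # Hypothetical fiducial layout on the handheld device, in its own frame (metres).
    fiducials_3d = np.array([
        [-0.03, -0.02, 0.0],
        [ 0.03, -0.02, 0.0],
        [ 0.03,  0.02, 0.0],
        [-0.03,  0.02, 0.0],
    ], dtype=np.float32)

    # Assumed pinhole intrinsics for the wearable-mounted imaging device.
    K = np.array([[600.0, 0.0, 320.0],
                  [0.0, 600.0, 240.0],
                  [0.0,   0.0,   1.0]])

    # Pixel locations of the detected fiducials (made-up measurements).
    fiducials_2d = np.array([
        [300.0, 230.0],
        [345.0, 232.0],
        [343.0, 262.0],
        [302.0, 260.0],
    ], dtype=np.float32)

    ok, rvec, tvec = cv2.solvePnP(fiducials_3d, fiducials_2d, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)      # orientation of the handheld in the camera frame
        print("handheld position in camera frame:", tvec.ravel())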

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G06F 1/16 - Constructional details or arrangements

40.

OBJECT RECOGNITION NEURAL NETWORK FOR AMODAL CENTER PREDICTION

      
Application Number 18633275
Status Pending
Filing Date 2024-04-11
First Publication Date 2024-08-01
Owner Magic Leap, Inc. (USA)
Inventor
  • Mahendran, Siddharth
  • Bansal, Nitin
  • Sekhar, Nitesh
  • Gangwar, Manushree
  • Gupta, Khushi
  • Singhal, Prateek

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for object recognition neural network for amodal center prediction. One of the methods includes receiving an image of an object captured by a camera. The image of the object is processed using an object recognition neural network that is configured to generate an object recognition output. The object recognition output includes data defining a predicted two-dimensional amodal center of the object, wherein the predicted two-dimensional amodal center of the object is a projection of a predicted three-dimensional center of the object under a camera pose of the camera that captured the image.
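
The "projection of a predicted three-dimensional center under a camera pose" described above is, in the simplest reading, a pinhole projection. A minimal numpy sketch follows; the intrinsics and camera pose are placeholder values, not data from the patent.

    import numpy as np

    def project_amodal_center(center_world, T_cam_world, K):
        """Project a 3D object center into pixel coordinates under a camera pose.

        T_cam_world is the 4x4 transform taking world coordinates to camera
        coordinates; K is the 3x3 pinhole intrinsic matrix. Both are illustrative
        assumptions, not values from the patent.
        """
        p_world = np.append(np.asarray(center_world, float), 1.0)
        p_cam = T_cam_world @ p_world                 # world -> camera frame
        uvw = K @ p_cam[:3]                           # perspective projection
        return uvw[:2] / uvw[2]                       # pixel (u, v)

    K = np.array([[500.0, 0.0, 320.0],
                  [0.0, 500.0, 240.0],
                  [0.0,   0.0,   1.0]])
    T_cam_world = np.eye(4)
    T_cam_world[:3, 3] = [0.0, 0.0, 2.0]              # camera 2 m behind the world origin

    # Even if the object is partially occluded or cropped in the image, this
    # projected point (the "amodal center") remains well defined.
    print(project_amodal_center([0.1, -0.2, 0.5], T_cam_world, K))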

IPC Classes  ?

  • G06T 7/579 - Depth or shape recovery from multiple images from motion
  • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
  • G06F 18/24 - Classification techniques
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06T 7/60 - Analysis of geometric attributes
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 20/64 - Three-dimensional objects

41.

NON-UNIFORM STEREO RENDERING

      
Application Number 18601816
Status Pending
Filing Date 2024-03-11
First Publication Date 2024-07-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Paulus, Jr., John Carl
  • Liebenow, Michael Harold

Abstract

Examples of the disclosure describe systems and methods for recording augmented reality and mixed reality experiences. In an example method, an image of a real environment is received via a camera of a wearable head device. A pose of the wearable head device is estimated, and a first image of a virtual environment is generated based on the pose. A second image of the virtual environment is generated based on the pose, wherein the second image of the virtual environment comprises a larger field of view than a field of view of the first image of the virtual environment. A combined image is generated based on the second image of the virtual environment and the image of the real environment.
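
One way to picture the wider-field-of-view second render is that it provides margin around the camera's narrower view before compositing. The numpy sketch below crops that margin with a tan(FOV/2) ratio and alpha-composites the result over the camera frame; the resolutions, fields of view, and nearest-neighbour resampling are assumptions chosen to keep the example short.

    import numpy as np

    def crop_to_fov(wide_img, wide_fov_deg, target_fov_deg):
        """Central crop of a wide-FOV render covering a narrower field of view."""
        h, w = wide_img.shape[:2]
        # Under a pinhole model the covered half-extent scales with tan(FOV / 2).
        scale = np.tan(np.deg2rad(target_fov_deg) / 2) / np.tan(np.deg2rad(wide_fov_deg) / 2)
        ch, cw = int(round(h * scale)), int(round(w * scale))
        y0, x0 = (h - ch) // 2, (w - cw) // 2
        return wide_img[y0:y0 + ch, x0:x0 + cw]

    # Made-up buffers: a 100x100 camera frame (RGB) and a 160x160 wider-FOV
    # virtual render with an alpha channel (RGBA).
    camera = np.zeros((100, 100, 3), dtype=np.float32)
    virtual = np.zeros((160, 160, 4), dtype=np.float32)
    virtual[60:100, 60:100] = [1.0, 0.5, 0.0, 0.8]     # some virtual content

    # Crop the virtual render down to the camera's field of view, then resample
    # it (nearest neighbour, for brevity) to the camera resolution.
    crop = crop_to_fov(virtual, wide_fov_deg=90.0, target_fov_deg=60.0)
    ys = np.linspace(0, crop.shape[0] - 1, camera.shape[0]).astype(int)
    xs = np.linspace(0, crop.shape[1] - 1, camera.shape[1]).astype(int)
    resampled = crop[ys][:, xs]

    # Alpha-composite the virtual content over the real-environment image.
    alpha = resampled[..., 3:4]
    combined = alpha * resampled[..., :3] + (1 - alpha) * camera
    print(combined.shape)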

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/01 - Head-up displays
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

42.

VARIABLE-FOCUS VIRTUAL IMAGE DEVICES BASED ON POLARIZATION CONVERSION

      
Application Number 18626181
Status Pending
Filing Date 2024-04-03
First Publication Date 2024-07-25
Owner Magic Leap, Inc. (USA)
Inventor Oh, Chulwoo

Abstract

Example display devices include a waveguide configured to propagate visible light under total internal reflection in a direction parallel to a major surface of the waveguide. The waveguide has formed thereon an outcoupling element configured to outcouple a portion of the visible light in a direction normal to the major surface of the waveguide. The example display devices additionally include a cholesteric liquid crystal (CLC) reflector disposed on a forward side of said waveguide, said CLC reflector configured to have an optical power or a depth of focus that is adjustable upon application of an electrical signal. The outcoupling element is disposed to extract light from the waveguide and direct at least a portion of said light propagating within said waveguide to the CLC reflector, said light being directed from said CLC reflector back through said waveguide and into the eye of the wearer to present an image from the display device to that eye.

IPC Classes  ?

  • G02F 1/137 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells characterised by the electro-optical or magneto-optical effect, e.g. field-induced phase transition, orientation effect, guest-host interaction or dynamic scattering
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G02B 27/01 - Head-up displays
  • G02F 1/133 - Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection

43.

CONTENT INTERACTION DRIVEN BY EYE METRICS

      
Application Number 18627219
Status Pending
Filing Date 2024-04-04
First Publication Date 2024-07-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Martin, Heather Michelle
  • O'Brien, Kevin John
  • Arroyo, Pedro Luis
  • Deoliveira, Flavio

Abstract

A head mounted display system for displaying image content to a user comprises at least one display configured to be worn by the user to present virtual content to the user's first and second eyes, one or more inwardly facing sensors or cameras configured to monitor one or both of the user's eyes, and processing electronics. The head mounted display system is configured such that virtual content activity can be initiated and/or driven by eye inputs such as gaze direction, eyelid motions (e.g., blinking), and/or other eye gestures.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,

44.

ADAPTIVE LENS ASSEMBLIES INCLUDING POLARIZATION-SELECTIVE LENS STACKS FOR AUGMENTED REALITY DISPLAY

      
Application Number 18624795
Status Pending
Filing Date 2024-04-02
First Publication Date 2024-07-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Komanduri, Ravi Kumar
  • Oh, Chulwoo

Abstract

The present disclosure relates to display systems and, more particularly, to augmented reality display systems and related methods. In one aspect, an adaptive lens assembly includes a lens stack configured to exert polarization-dependent optical power on linearly polarized light. The lens stack includes a birefringent lens and an isotropic lens contacting each other to form a conformal interface therebetween. The adaptive lens assembly is configured to be selectively switched between a plurality of states having different optical powers.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • C12Q 1/6844 - Nucleic acid amplification reactions
  • G02F 1/1335 - Structural association of cells with optical devices, e.g. polarisers or reflectors
  • G02F 1/13363 - Birefringent elements, e.g. for optical compensation
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

45.

METHOD OF MAKING HOLOGRAMS USING LIQUID CRYSTAL MASTERS

      
Application Number 18561978
Status Pending
Filing Date 2022-06-01
First Publication Date 2024-07-18
Owner Magic Leap, Inc. (USA)
Inventor
  • Ziegler, David Thomas
  • Eckert, Rolf
  • Montfort, Frédéric

Abstract

An optical device includes one or more volume phase holographic gratings, each of which includes a photosensitive layer whose optical properties are spatially modulated. The spatial modulation of optical properties is recorded in the photosensitive layer by generating an optical interference pattern using a beam of light and one or more liquid crystal master gratings. The volume phase holograms may be configured to redirect light of visible or infrared wavelengths propagating in free space or through a waveguide. Advantageously, fabricating the volume phase holographic gratings using liquid crystal master gratings allows independent control of the optical function and the selectivity of the volume phase holographic grating during the fabrication process.

IPC Classes  ?

  • G03H 1/04 - Processes or apparatus for producing holograms
  • C09K 19/36 - Steroidal liquid crystal compounds
  • G02B 5/32 - Holograms used as optical elements
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 27/01 - Head-up displays

46.

SYSTEMS AND METHODS FOR MANIPULATING LIGHT FROM AMBIENT LIGHT SOURCES

      
Application Number 18622116
Status Pending
Filing Date 2024-03-29
First Publication Date 2024-07-18
Owner Magic Leap, Inc. (USA)
Inventor
  • Baerenrodt, Eric
  • Robaina, Nastasja U.
  • Samec, Nicole Elizabeth
  • Harrises, Christopher M.
  • Baerenrodt, Mark

Abstract

An optical device includes variable optical material that alters at least one of: incident ambient light, spectral content of incident ambient light or direction of incident ambient light through the optical device in response to a stimulus provided by the device. The device can sense intensity and/or spectral characteristics of ambient light and provide appropriate stimulus to various portions of the optical device to activate the variable optical material and alter at least one of: incident ambient light, spectral content of incident ambient light or direction of incident ambient light.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 3/14 - Fluid-filled or evacuated lenses of variable focal length
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02F 1/00 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
  • G02F 1/01 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems

47.

Enhanced eye tracking for augmented or virtual reality display systems

      
Application Number 18309787
Grant Number 12039099
Status In Force
Filing Date 2023-04-29
First Publication Date 2024-07-16
Grant Date 2024-07-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Abele, Nicolas
  • Chevallaz, Eric
  • De Gol, Philippe
  • Gamet, Julien
  • Cosendey, Gatien
  • Gamper, Stephan Arthur

Abstract

Techniques are described for enhanced eye tracking for display systems, such as augmented or virtual reality display systems. The display systems may include a light source configured to output light, a moveable diffractive grating configured to reflect light from the light source, the reflected light forming a scan pattern on the eye of the user, and light detectors to detect light reflected from the eye. The orientation of the diffractive grating can be moved such that the light reflected from the diffractive grating is scanned across the eye according to the scan pattern. Light intensity pattern(s) are obtained via the light detectors, with a light intensity pattern representing a light detector signal obtained by detecting light reflected by the eye as the light is scanned across the eye. One or more physiological characteristics of the eye are determined based on detected light intensity pattern(s).
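
A toy simulation of the scan-and-detect loop: light is swept across the eye along a scan pattern, the detector signal forms a light intensity pattern over scan angle, and a characteristic of the eye (here, the angle of a corneal glint) is read off that pattern. All signal shapes and numbers are invented for illustration.

    import numpy as np

    # Scan angles for one pass of the (toy) scan pattern, in degrees.
    scan_angles = np.linspace(-20.0, 20.0, 201)

    # Invented detector model: a broad diffuse return from the sclera/iris plus a
    # narrow specular "glint" when the beam crosses the cornea near +5 degrees.
    true_glint = 5.0
    diffuse = 0.2 * np.exp(-(scan_angles / 15.0) ** 2)
    glint = 1.0 * np.exp(-((scan_angles - true_glint) / 0.8) ** 2)
    rng = np.random.default_rng(0)
    intensity_pattern = diffuse + glint + 0.02 * rng.standard_normal(scan_angles.size)

    # Estimate the glint location from the light intensity pattern, as a stand-in
    # for "determining a physiological characteristic of the eye".
    estimated_glint = scan_angles[np.argmax(intensity_pattern)]
    print(f"estimated glint angle: {estimated_glint:.1f} deg")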

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 26/10 - Scanning systems
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for

48.

AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH SHARED DISPLAY FOR LEFT AND RIGHT EYES

      
Application Number 18414251
Status Pending
Filing Date 2024-01-16
First Publication Date 2024-07-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Trisnadi, Jahja I.
  • Chung, Hyunsun
  • Edwin, Lionel Ernest
  • Cohen, Howard Russell
  • Taylor, Robert Blake
  • Russell, Andrew Ian
  • Curtis, Kevin Richard
  • Carlisle, Clinton

Abstract

A wearable display system includes a light projection system having one or more emissive micro-displays, e.g., micro-LED displays. The light projection system projects time-multiplexed left-eye and right-eye images, which pass through an optical router having a polarizer and a switchable polarization rotator. The optical router is synchronized with the generation of images by the light projection system to impart a first polarization to left-eye images and a second different polarization to right-eye images. Light of the first polarization is incoupled into an eyepiece having one or more waveguides for outputting light to one of the left and right eyes, while light of the second polarization may be incoupled into another eyepiece having one or more waveguides for outputting light to the other of the left and right eyes. Each eyepiece may output incoupled light with variable amounts of wavefront divergence, to elicit different accommodation responses from the user's eyes.
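
The routing described above amounts to keeping the switchable polarization rotator in lockstep with whichever eye's image the projector is currently emitting. The schematic sketch below assumes an even/odd frame convention and a 120 Hz projector; both are illustrative assumptions rather than details from the patent.

    from dataclasses import dataclass

    @dataclass
    class RouterState:
        rotate_polarization: bool   # True -> second polarization -> right-eye eyepiece

    def route_frame(frame_index: int) -> RouterState:
        """Even frames carry the left-eye image, odd frames the right-eye image
        (an assumed convention); the rotator is switched so each image reaches
        only its eyepiece's in-coupling element."""
        is_right_eye = (frame_index % 2) == 1
        return RouterState(rotate_polarization=is_right_eye)

    # One time-multiplexed cycle at an assumed 120 Hz projector (60 Hz per eye).
    for frame in range(4):
        state = route_frame(frame)
        eye = "right" if state.rotate_polarization else "left"
        print(f"frame {frame}: rotator {'on' if state.rotate_polarization else 'off'} -> {eye} eyepiece")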

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/28 - Optical systems or apparatus not provided for by any of the groups , for polarising
  • G02B 30/24 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
  • G02B 30/25 - Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer’s left and right eyes of the stereoscopic type using polarisation techniques
  • H04N 13/341 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
  • H04N 13/398 - Synchronisation thereof; Control thereof

49.

SYSTEMS AND METHODS FOR PERFORMING SELF-IMPROVING VISUAL ODOMETRY

      
Application Number 18417523
Status Pending
Filing Date 2024-01-19
First Publication Date 2024-07-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Detone, Daniel
  • Malisiewicz, Tomasz Jan
  • Rabinovich, Andrew

Abstract

In an example method of training a neural network for performing visual odometry, the neural network receives a plurality of images of an environment, determines, for each image, a respective set of interest points and a respective descriptor, and determines a correspondence between the plurality of images. Determining the correspondence includes determining one or more point correspondences between the sets of interest points, and determining a set of candidate interest points based on the one or more point correspondences, each candidate interest point indicating a respective feature in the environment in three-dimensional space. The neural network determines, for each candidate interest point, a respective stability metric. The neural network is modified based on the one or more candidate interest points.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G06N 3/08 - Learning methods
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06V 10/40 - Extraction of image or video features

50.

SINGULATION AND EDGE-SEALING OF MULTILAYER POLYMER EYEPIECE

      
Application Number 18560049
Status Pending
Filing Date 2022-05-24
First Publication Date 2024-07-11
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Martinez, Jr., Arturo Manuel
  • Sawicki, Joseph Christopher
  • Schmulen, Jeffrey Dean
  • Jurbergs, David Carl

Abstract

Fabricating a multilayer optical component includes obtaining a substrate having a multiplicity of polymer layers and cutting the substrate with a CO2 laser to yield a multilayer optical component having a first surface, a second surface opposite the first surface, and a blackened edge along a perimeter of the multilayer optical component. The multiplicity of polymer layers is sealed along the blackened edge. The resulting multilayer optical component includes a multiplicity of polymer layers and a blackened edge seal around the multiplicity of polymer layers. The blackened edge seal includes polymer melt from the multiplicity of polymer layers.

IPC Classes  ?

  • B29C 65/74 - Joining of preformed parts; Apparatus therefor by welding and severing
  • B23K 26/38 - Removing material by boring or cutting
  • B23K 26/402 - Removing material taking account of the properties of the material involved involving non-metallic material, e.g. isolators
  • B23K 103/00 - Materials to be soldered, welded or cut
  • B29C 65/16 - Laser beam
  • B32B 7/12 - Interconnection of layers using interposed adhesives or interposed materials with bonding properties
  • B32B 27/08 - Layered products essentially comprising synthetic resin as the main or only constituent of a layer next to another layer of a specific substance of synthetic resin of a different kind
  • G02B 25/00 - Eyepieces; Magnifying glasses

51.

ENHANCED POSE DETERMINATION FOR DISPLAY DEVICE

      
Application Number 18378086
Status Pending
Filing Date 2023-10-09
First Publication Date 2024-07-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Zahnert, Martin Georg
  • Faro, Joao Antonio Pereira
  • Velasquez, Miguel Andres Granados
  • Kasper, Dominik Michael
  • Swaminathan, Ashwin
  • Mohan, Anush
  • Singhal, Prateek

Abstract

To determine the head pose of a user, a head-mounted display system having an imaging device can obtain a current image of a real-world environment, with points in the image corresponding to salient points that will be used to determine the head pose. The salient points are patch-based and include a first salient point projected onto the current image from a previous image and a second salient point extracted from the current image. Each salient point is subsequently matched with real-world points based on descriptor-based map information indicating locations of salient points in the real-world environment. The orientation of the imaging device is determined based on the matching and based on the relative positions of the salient points in the view captured in the current image. The orientation may be used to extrapolate the head pose of the wearer of the head-mounted display system.
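
The matching of image salient points against descriptor-based map information can be pictured as nearest-neighbour search over descriptors; a small numpy sketch is below. The descriptor dimensionality, distance threshold, and data are arbitrary assumptions, and the closing comment only gestures at the pose step that would follow.

    import numpy as np

    def match_descriptors(frame_desc, map_desc, max_dist=0.7):
        """Nearest-neighbour matching of salient-point descriptors.

        frame_desc: (N, D) descriptors of salient points in the current image.
        map_desc:   (M, D) descriptors stored in the map, each tied to a 3D point.
        Returns a list of (frame_index, map_index) pairs.
        """
        matches = []
        for i, d in enumerate(frame_desc):
            dists = np.linalg.norm(map_desc - d, axis=1)
            j = int(np.argmin(dists))
            if dists[j] < max_dist:
                matches.append((i, j))
        return matches

    # Made-up 8-dimensional descriptors; real systems use larger learned or
    # hand-crafted descriptors, and the threshold is an arbitrary assumption.
    rng = np.random.default_rng(1)
    map_desc = rng.normal(size=(50, 8))
    frame_desc = map_desc[[3, 17, 42]] + 0.05 * rng.normal(size=(3, 8))

    print(match_descriptors(frame_desc, map_desc))
    # Each matched pair links an image salient point to a known 3D map point; the
    # head pose would then be estimated from those 2D-3D correspondences.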

IPC Classes  ?

  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G06V 40/18 - Eye characteristics, e.g. of the iris

52.

ANNULAR AXIAL FLUX MOTORS

      
Application Number 18616809
Status Pending
Filing Date 2024-03-26
First Publication Date 2024-07-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Konings, Arno Leon
  • Schabacker, Charles Robert
  • Mareno, Jason Donald

Abstract

An annular axial flux motor includes a rotor mounted on an annular subsection of a rotatable cam ring and a stator mounted on an annular subsection of a carrier frame. The rotor includes two Halbach arrays of permanent magnets spaced from each other on the cam ring along an axial direction. The stator includes multiple phase electrical windings printed on multiple layers of a printed circuit board (PCB) that are stacked along the axial direction. The multiple layers are positioned between the Halbach arrays, with the active sides of the Halbach arrays facing opposite sides of the multiple layers. The Halbach arrays are configured to generate a symmetrical magnetic field and the multiple phase electrical windings are configured to have the same rotor-dependent torque constant, such that the stator can generate a constant torque to rotate the rotor and the cam ring within a finite travel range.

IPC Classes  ?

  • H02K 1/2795 - Rotors axially facing stators the rotor consisting of two or more circumferentially positioned magnets
  • H02K 1/2792 - Surface mounted magnets; Inset magnets with magnets arranged in Halbach arrays
  • H02K 3/26 - Windings characterised by the conductor shape, form or construction, e.g. with bar conductors consisting of printed conductors

53.

EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM

      
Application Number 18618091
Status Pending
Filing Date 2024-03-27
First Publication Date 2024-07-11
Owner Magic Leap, Inc. (USA)
Inventor
  • Bhargava, Samarth
  • Liu, Victor Kai
  • Messer, Kevin

Abstract

An eyepiece waveguide for an augmented reality display system may include an optically transmissive substrate, an input coupling grating (ICG) region, a multi-directional pupil expander (MPE) region, and an exit pupil expander (EPE) region. The ICG region may receive an input beam of light and couple the input beam into the substrate as a guided beam. The MPE region may include a plurality of diffractive features which exhibit periodicity along at least a first axis of periodicity and a second axis of periodicity. The MPE region may be positioned to receive the guided beam from the ICG region and to diffract it in a plurality of directions to create a plurality of diffracted beams. The EPE region may overlap the MPE region and may out couple one or more of the diffracted beams from the optically transmissive substrate as output beams.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

54.

Display panel or portion thereof with a transitional mixed reality graphical user interface

      
Application Number 29860128
Grant Number D1034656
Status In Force
Filing Date 2022-11-16
First Publication Date 2024-07-09
Grant Date 2024-07-09
Owner Magic Leap, Inc. (USA)
Inventor
  • Heiner, Cole Parker
  • Pazmino, Lorena
  • Tran, Gregory Minh
  • Hoover, Paul Armistead

55.

TECHNIQUES FOR DETERMINING SETTINGS FOR A CONTENT CAPTURE DEVICE

      
Application Number 18601406
Status Pending
Filing Date 2024-03-11
First Publication Date 2024-07-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Smith, Brian Keith
  • Tsunaev, Ilya

Abstract

A method includes receiving a first image captured by a content capture device included in a mixed reality or augmented reality device, identifying a location corresponding to a pixel group of a plurality of pixel groups in the first image, and determining, for each location, one or more updates for one or more settings of the content capture device. The method also includes iteratively: updating the content capture device using an update of the one or more updates, capturing an image of a predetermined number of images using the content capture device and the update of the one or more updates, and repeating updating the content capture device and capturing the image the predetermined number of times. The method also includes stitching the predetermined number of images together to form a composite image.
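
A toy version of the iterate-and-stitch loop: propose one exposure update per pixel group, capture an image with each update, and stitch the captures into a composite. The camera model, pixel-group layout, and mid-grey target below are all assumptions for illustration.

    import numpy as np

    def capture(scene_radiance, exposure):
        """Toy camera model: clip scaled radiance to a [0, 1] output range."""
        return np.clip(scene_radiance * exposure, 0.0, 1.0)

    def propose_updates(image, n_groups=4):
        """Split the frame into horizontal pixel groups and propose one exposure
        gain per group, pushing each group's mean toward mid-grey (0.5)."""
        groups = np.array_split(image, n_groups, axis=1)
        return [0.5 / max(g.mean(), 1e-3) for g in groups]

    # Made-up high-dynamic-range scene: dark on the left, bright on the right.
    x = np.linspace(0.05, 3.0, 64)
    scene = np.tile(x, (48, 1))

    first = capture(scene, exposure=1.0)
    updates = propose_updates(first)

    # Iteratively apply each update and capture an image with it.
    stack = [capture(scene, exposure=u) for u in updates]

    # "Stitch" the captures into one composite frame by taking each pixel group
    # from the capture that was exposed for it.
    columns = np.array_split(np.arange(scene.shape[1]), len(stack))
    composite = np.zeros_like(scene)
    for img, cols in zip(stack, columns):
        composite[:, cols] = img[:, cols]

    print(composite.min(), composite.max())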

IPC Classes  ?

  • H04N 23/72 - Combination of two or more compensation controls
  • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
  • H04N 5/265 - Mixing
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
  • H04N 23/71 - Circuitry for evaluating the brightness variation
  • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
  • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
  • H04N 23/743 - Bracketing, i.e. taking a series of images with varying exposure conditions
  • H04N 23/76 - Circuitry for compensating brightness variation in the scene by influencing the image signals

56.

SYSTEM AND METHOD FOR TRACKING A WEARABLE DEVICE

      
Application Number 18605653
Status Pending
Filing Date 2024-03-14
First Publication Date 2024-07-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Shee, Koon Keong
  • Rodriguez, Jose Felix
  • Arencibia, Ricardo
  • Aly, Aly H.M.

Abstract

A method of using a first device to locate a second device is disclosed. The first device, while in a first mode, transmits a first signal and receives a second signal comprising a reflection of the first signal by the second device. The first device determines, based on the received second signal, a position of the second device relative to the first device. The first device transitions to a second mode, and while in the second mode, receives a third signal from the second device. The first device determines, based on the third signal, an orientation of the second device relative to the first device. The first device comprises one or more receiving antennas, and the second device comprises one or more transmitting antennas. The third signal corresponds to a transmitting antenna of the second device.

IPC Classes  ?

  • G01S 13/46 - Indirect determination of position data
  • G01S 7/41 - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES - Details of systems according to groups , , of systems according to group using analysis of echo signal for target characterisation; Target signature; Target cross-section
  • G08C 17/02 - Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
  • H01Q 1/27 - Adaptation for use in or on movable bodies

57.

MIXED REALITY MUSICAL INSTRUMENT

      
Application Number 18607222
Status Pending
Filing Date 2024-03-15
First Publication Date 2024-07-04
Owner Magic Leap, Inc. (USA)
Inventor Tajik, Anastasia Andreyevna

Abstract

A method is disclosed, the method comprising the steps of identifying a first real object in a mixed reality environment, the mixed reality environment having a user; identifying a second real object in the mixed reality environment; generating, in the mixed reality environment, a first virtual object corresponding to the second real object; identifying, in the mixed reality environment, a collision between the first real object and the first virtual object; determining a first attribute associated with the collision; determining, based on the first attribute, a first audio signal corresponding to the collision; and presenting to the user, via one or more speakers, the first audio signal.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G10H 1/00 - ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE - Details of electrophonic musical instruments
  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control

58.

CROSS REALITY SYSTEM WITH MAP PROCESSING USING MULTI-RESOLUTION FRAME DESCRIPTORS

      
Application Number 18609101
Status Pending
Filing Date 2024-03-19
First Publication Date 2024-07-04
Owner Magic Leap, Inc. (USA)
Inventor
  • Joseph, Elad
  • Braun, Gal
  • Shahrokni, Ali

Abstract

A distributed cross reality system efficiently and accurately compares location information that includes image frames. Each of the frames may be represented as a numeric descriptor that enables identification of frames with similar content. The resolution of the descriptors may vary for different computing devices in the distributed system based on the degree of ambiguity in image comparisons and/or the computing resources of the device. A cloud-based component operating on maps of large areas, where comparisons can result in ambiguous identification of multiple image frames, may use high resolution descriptors. High resolution descriptors reduce computationally intensive disambiguation processing. A portable device, which is more likely to operate on smaller maps and less likely to have the computational resources to compute a high resolution descriptor, may use a lower resolution descriptor.
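
The resolution trade-off can be illustrated with a toy whole-frame descriptor whose dimensionality is varied per device: a high-dimensional version separates similar from dissimilar frames more sharply (less ambiguity, more compute), while a truncated version is cheaper but noisier. The descriptor construction and dimensions below are invented and are not the patent's method.

    import numpy as np

    def frame_descriptor(features, dim):
        """Toy whole-frame descriptor: average the frame's local feature vectors,
        keep the first `dim` components, and renormalise. Real systems use learned
        descriptors; the dimensionalities here are arbitrary assumptions."""
        d = np.asarray(features, float).mean(axis=0)[:dim]
        return d / np.linalg.norm(d)

    def similarity(a, b):
        return float(a @ b)   # cosine similarity of unit-norm descriptors

    rng = np.random.default_rng(2)
    frame_a = rng.normal(size=(200, 256))                      # local features of one frame
    frame_b = frame_a + 0.1 * rng.normal(size=(200, 256))      # similar content
    frame_c = rng.normal(size=(200, 256))                      # unrelated content

    for dim, who in [(256, "cloud (high-resolution)"), (32, "device (low-resolution)")]:
        sim_ab = similarity(frame_descriptor(frame_a, dim), frame_descriptor(frame_b, dim))
        sim_ac = similarity(frame_descriptor(frame_a, dim), frame_descriptor(frame_c, dim))
        # The low-dimensional descriptor is cheaper but its "dissimilar" score is
        # noisier, i.e. more prone to ambiguous identification.
        print(f"{who}: similar pair {sim_ab:.2f}, dissimilar pair {sim_ac:.2f}")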

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics
  • G06T 15/00 - 3D [Three Dimensional] image rendering
  • G06V 10/40 - Extraction of image or video features
  • G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • H04N 23/57 - Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
  • H04N 23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

59.

Device controller

      
Application Number 29914879
Grant Number D1033387
Status In Force
Filing Date 2023-10-23
First Publication Date 2024-07-02
Grant Date 2024-07-02
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Swinton, Matthew David
  • Urban, Hayes

60.

MULTIMODAL TASK EXECUTION AND TEXT EDITING FOR A WEARABLE SYSTEM

      
Application Number 18596054
Status Pending
Filing Date 2024-03-05
First Publication Date 2024-06-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Powderly, James M.
  • Niles, Savannah
  • Devine, Jennifer M.R.
  • Carlson, Adam C.
  • Sommers, Jeffrey Scott
  • Babu J D, Praveen
  • Fernandes, Ajoy Savio
  • Sheeder, Anthony Robert

Abstract

Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G06F 1/16 - Constructional details or arrangements
  • G06F 3/16 - Sound input; Sound output
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

61.

MAPPING AND LOCALIZATION OF A PASSABLE WORLD

      
Application Number 18598740
Status Pending
Filing Date 2024-03-07
First Publication Date 2024-06-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Dedonato, Amy
  • Petty, James Cameron
  • Hazen, Griffith Buckley
  • Cazamias, Jordan Alexander
  • Stolzenberg, Karen

Abstract

An augmented reality (AR) device can be configured to generate a virtual representation of a user's physical environment. The AR device can capture images of the user's physical environment to generate or identify a user's location. The AR device can project graphics at designated locations within the user's environment to guide the user to capture images of the user's physical environment. The AR device can provide visual, audible, or haptic guidance to direct the user of the AR device to explore the user's environment.

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 13/20 - 3D [Three Dimensional] animation
  • G06T 15/06 - Ray-tracing
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/18 - Eye characteristics, e.g. of the iris

62.

BROWSER FOR MIXED REALITY SYSTEMS

      
Application Number 18600439
Status Pending
Filing Date 2024-03-08
First Publication Date 2024-06-27
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Zurmoehle, Tim
  • Montoya, Andrea Isabel
  • Macdonald, Robert John Cummings
  • Groth, Sakina
  • Mak, Genevieve

Abstract

Disclosed are improved systems and methods for navigation and manipulation of browser windows in a 3D mixed reality environment. An improved approach is provided to view a user's windows, regardless of the user's current location relative to one or more previously-opened windows. A method for displaying windows in a computing environment includes receiving an instruction to select multiple open windows. The method also includes retrieving information for the multiple open windows, where the multiple open windows are associated with different physical locations. The method further includes displaying a representation of the multiple open windows in a single user interface. Moreover, upon receiving a selection of a selected window of the multiple open windows, the method includes loading the selected window into a foreground of a field of view for a user.

IPC Classes  ?

  • G06F 3/0483 - Interaction with page-structured environments, e.g. book metaphor
  • G02B 27/01 - Head-up displays
  • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
  • G06F 3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

63.

CROSS REALITY SYSTEM FOR LARGE SCALE ENVIRONMENT RECONSTRUCTION

      
Application Number 18599083
Status Pending
Filing Date 2024-03-07
First Publication Date 2024-06-27
Owner Magic Leap, Inc. (USA)
Inventor
  • Cao, Yilun
  • Kandra, Mohan Babu
  • Molyneaux, David Geoffrey
  • Olshansky, Daniel
  • Paul Pena, David
  • Steinbrücker, Frank Thomas
  • Torres, Rafael Domingos

Abstract

Various techniques pertain to methods, systems, and computer program products for a spatial persistence process that places a virtual object relative to a physical object for an extended-reality display device based at least in part upon a persistent coordinate frame (PCF). A determination is made as to whether a drift is detected for the virtual object relative to the physical object. Upon or after detection of the drift or deviation, the drift or deviation is corrected at least by updating a tracking map into an updated tracking map and further at least by updating the persistent coordinate frame (PCF) based at least in part upon the updated tracking map, wherein the persistent coordinate frame (PCF) comprises six degrees of freedom relative to the map coordinate system.

IPC Classes  ?

  • G06T 17/00 - 3D modelling for computer graphics
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

64.

METHODS AND SYSTEM FOR GENERATING AND DISPLAYING 3D VIDEOS IN A VIRTUAL, AUGMENTED, OR MIXED REALITY ENVIRONMENT

      
Application Number 18601749
Status Pending
Filing Date 2024-03-11
First Publication Date 2024-06-27
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Babu J D, Praveen
  • Riley, Sean Christopher

Abstract

Disclosed is an approach for displaying 3D videos in a VR and/or AR system. The 3D videos may include 3D animated objects that escape from the display screen. The 3D videos may interact with objects within the VR and/or AR environment. The 3D video may be interactive with a user such that, based on user input corresponding to decisions elected by the user at certain portions of the 3D video, a different storyline and possibly a different conclusion may result for the 3D video. The 3D video may be a 3D icon displayed within a portal of a final 3D render world.

IPC Classes  ?

  • H04N 13/122 - Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
  • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
  • G06T 15/08 - Volume rendering
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/128 - Adjusting depth or disparity
  • H04N 13/279 - Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking

65.

Mobile device accessory with cameras

      
Application Number 29717237
Grant Number D1032682
Status In Force
Filing Date 2019-12-16
First Publication Date 2024-06-25
Grant Date 2024-06-25
Owner Magic Leap, Inc. (USA)
Inventor
  • Natsume, Shigeru
  • Gunther, Sebastian Gonzalo Arrieta
  • Awad, Haney
  • Green Mercer, Bryson John

66.

SYSTEMS AND METHODS FOR EFFICIENT FLOORPLAN GENERATION FROM 3D SCANS OF INDOOR SCENES

      
Application Number 18414163
Status Pending
Filing Date 2024-01-16
First Publication Date 2024-06-20
Owner MAGIC LEAP, INC. (USA)
Inventor Phalak, Ameya Pramod

Abstract

Methods, systems, and wearable extended reality devices for generating a floorplan of an indoor scene are provided. A room classification of a room and a wall classification of a wall for the room may be determined from an input image of the indoor scene. A floorplan may be determined based at least in part upon the room classification and the wall classification without constraining a total number of rooms in the indoor scene or a size of the room.

IPC Classes  ?

  • G06V 20/64 - Three-dimensional objects
  • G06F 18/23 - Clustering techniques
  • G06F 18/2431 - Multiple classes
  • G06T 7/55 - Depth or shape recovery from multiple images
  • G06T 17/00 - 3D modelling for computer graphics
  • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes

67.

LIGHT-EMITTING USER INPUT DEVICE FOR CALIBRATION OR PAIRING

      
Application Number 18591983
Status Pending
Filing Date 2024-02-29
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Powderly, James M.
  • Niles, Savannah
  • Nesladek, Christopher David
  • Azu, Isioma Osagbemwenorue
  • Fontaine, Marshal Ainsworth
  • Awad, Haney
  • Wheeler, William
  • Schwab, Brian David
  • Bucknor, Brian Edward Oliver

Abstract

A light-emitting user input device can include a touch sensitive portion configured to accept user input (e.g., from a user's thumb) and a light-emitting portion configured to output a light pattern. The light pattern can be used to assist the user in interacting with the user input device. Examples include emulating a multi-degree-of-freedom controller, indicating scrolling or swiping actions, indicating presence of objects nearby the device, indicating receipt of notifications, assisting pairing the user input device with another device, or assisting calibrating the user input device. The light-emitting user input device can be used to provide user input to a wearable device, such as, e.g., a head mounted display device.

IPC Classes  ?

  • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/0354 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
  • G06F 3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
  • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
  • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
  • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
  • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06V 10/145 - Illumination specially adapted for pattern recognition, e.g. using gratings
  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G09G 3/32 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

68.

THIN ILLUMINATION LAYER WAVEGUIDE AND METHODS OF FABRICATION

      
Application Number 18556856
Status Pending
Filing Date 2022-04-28
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Shultz, Jason Allen
  • Xu, Frank Y.
  • Tekolste, Robert D.

Abstract

Disclosed herein are systems and methods for displays, such as for a head wearable device. An example display can include an infrared illumination layer, the infrared illumination layer including a waveguide having a first face and a second face, the first face disposed opposite the second face. The illumination layer may also include an in-coupling grating disposed on the first face, the in-coupling grating configured to couple light into the waveguide to generate internally reflected light propagating in a first direction. The illumination layer may also include a plurality of out-coupling gratings disposed on at least one of the first face and the second face, the plurality of out-coupling gratings configured to receive the internally reflected light and couple the internally reflected light out of the waveguide.

IPC Classes  ?

  • G02B 6/42 - Coupling light guides with opto-electronic elements
  • G02B 27/01 - Head-up displays

69.

IMPRINT LITHOGRAPHY PROCESS AND METHODS ON CURVED SURFACES

      
Application Number 18556865
Status Pending
Filing Date 2022-04-28
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Singh, Vikramjit
  • Xu, Frank Y.

Abstract

Methods for creating a pattern on a curved surface and an optical structure (e.g., curved waveguide, a lens having an antireflective feature, an optical structure of a wearable head device) are disclosed. In some embodiments, the method comprises: depositing a patterning material on a curved surface; positioning a superstrate over the patterning material, the superstrate comprising a template for creating the pattern; applying, using the patterning material, a force between the curved surface and the superstrate; curing the patterning material, wherein the cured patterning material comprises the pattern; and removing the superstrate. In some embodiments, the method comprises forming the optical structure using the pattern.

IPC Classes  ?

  • B29C 59/02 - Surface shaping, e.g. embossing; Apparatus therefor by mechanical means, e.g. pressing
  • B29C 33/58 - Applying the releasing agents
  • B29L 11/00 - Optical elements, e.g. lenses, prisms
  • G02B 6/26 - Optical coupling means
  • G02B 27/01 - Head-up displays

70.

ENCODING STEREO SPLASH SCREEN IN STATIC IMAGE

      
Application Number 18586326
Status Pending
Filing Date 2024-02-23
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Capps, Marshall Charles
  • Jesu, Anuroop Suresh

Abstract

During the boot-up process of a computing device, such as an augmented reality wearable device, a static image and a boot-up progress bar may be encoded in a single image file, such as a bitmap image, and displayed in conjunction with updates that are applied to a hardware gamma table at various stages of the boot-up process to create the effect of an animated progress bar.
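
A toy model of the single-image trick: the progress bar is drawn once into the static image using a few reserved grey levels, and animation comes from rewriting a small lookup (gamma) table rather than re-encoding the image. The segment layout and reserved values below are assumptions for illustration.

    import numpy as np

    # A single static "splash" image (toy 1-D strip): the progress bar is drawn
    # once, with each of its 8 segments painted in a reserved grey level 1..8.
    splash = np.zeros(80, dtype=np.uint8)
    for seg in range(8):
        splash[10 * seg: 10 * (seg + 1)] = seg + 1

    def gamma_table(progress_segments, on=255, off=16):
        """Build a 256-entry lookup table that lights the first N segments."""
        lut = np.arange(256, dtype=np.uint8)          # identity for ordinary pixels
        for seg in range(8):
            lut[seg + 1] = on if seg < progress_segments else off
        return lut

    # At each boot-up stage only the tiny lookup table changes; the image does not.
    for stage in range(0, 9, 4):
        displayed = gamma_table(stage)[splash]        # hardware LUT applied per pixel
        print(f"stage {stage}: lit pixels = {(displayed == 255).sum()}")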

IPC Classes  ?

  • G06F 9/4401 - Bootstrapping
  • G02B 27/01 - Head-up displays
  • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
  • G06T 3/4007 - based on interpolation, e.g. bilinear interpolation (image demosaicing G06T 3/4015; edge-driven or edge-based scaling G06T 3/403)
  • G06T 9/00 - Image coding
  • G06T 13/80 - 2D animation, e.g. using sprites
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

71.

DUAL LISTENER POSITIONS FOR MIXED REALITY

      
Application Number 18587728
Status Pending
Filing Date 2024-02-26
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor Tajik, Anastasia Andreyevna

Abstract

A method of presenting audio comprises: identifying a first ear listener position and a second ear listener position in a mixed reality environment; identifying a first virtual sound source in the mixed reality environment; identifying a first object in the mixed reality environment; determining a first audio signal in the mixed reality environment, wherein the first audio signal originates at the first virtual sound source and intersects the first ear listener position; determining a second audio signal in the mixed reality environment, wherein the second audio signal originates at the first virtual sound source, intersects the first object, and intersects the second ear listener position; determining a third audio signal based on the second audio signal and the first object; presenting, to a first ear of a user, the first audio signal; and presenting, to a second ear of the user, the third audio signal.
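
As a rough illustration (not the claimed method), the sketch below renders one ear from the direct path and the other ear from a path that intersects an object, deriving the occluded ear's signal by attenuating and low-pass filtering it. The `propagate`, `occlude`, and `render_binaural` helpers and their parameter values are assumptions made for this example.

```python
import numpy as np

def propagate(source_signal, distance, speed_of_sound=343.0, sample_rate=48000):
    """Delay and attenuate a mono signal for a straight-line path (1/r falloff)."""
    delay = int(round(distance / speed_of_sound * sample_rate))
    out = np.zeros(len(source_signal) + delay)
    out[delay:] = source_signal / max(distance, 1e-3)
    return out

def occlude(signal, transmission=0.3, smoothing=8):
    """Crude stand-in for an object's acoustic effect: attenuate and low-pass."""
    kernel = np.ones(smoothing) / smoothing
    return transmission * np.convolve(signal, kernel, mode="same")

def render_binaural(source_signal, src, left_ear, right_ear, occluder_hits_right):
    """Per-ear rendering: the direct path feeds one ear; the path that intersects
    an object is modified by that object's properties before reaching the other."""
    d_left = np.linalg.norm(np.asarray(src) - np.asarray(left_ear))
    d_right = np.linalg.norm(np.asarray(src) - np.asarray(right_ear))
    left = propagate(source_signal, d_left)
    right = propagate(source_signal, d_right)
    if occluder_hits_right:
        right = occlude(right)
    n = max(len(left), len(right))
    return np.pad(left, (0, n - len(left))), np.pad(right, (0, n - len(right)))

if __name__ == "__main__":
    tone = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000)
    L, R = render_binaural(tone, src=(2.0, 0.0, 0.0),
                           left_ear=(-0.1, 0.0, 0.0), right_ear=(0.1, 0.0, 0.0),
                           occluder_hits_right=True)
    print(f"peak left={L.max():.3f}, peak right={R.max():.3f}")
```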

IPC Classes  ?

  • H04S 7/00 - Indicating arrangements; Control arrangements, e.g. balance control
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04R 1/40 - Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
  • H04R 3/12 - Circuits for transducers for distributing signals to two or more loudspeakers

72.

WAVEGUIDES WITH INTEGRATED OPTICAL ELEMENTS AND METHODS OF MAKING THE SAME

      
Application Number 18589212
Status Pending
Filing Date 2024-02-27
First Publication Date 2024-06-20
Owner Magic Leap, Inc. (USA)
Inventor
  • Peroz, Christophe
  • Liu, Victor Kai
  • Bhargava, Samarth

Abstract

An example waveguide can include a polymer layer having substantially optically transparent material with first and second major surfaces configured such that light containing image information can propagate through the polymer layer being guided therein by reflecting from the first and second major surfaces via total internal reflection. The first surface can include first smaller and second larger surface portions monolithically integrated with the polymer layer and with each other. The first smaller surface portion can include at least a part of an in-coupling optical element configured to couple light incident on the in-coupling optical element into the polymer layer for propagation therethrough by reflection from the second major surface and the second larger surface portion of the first major surface. The waveguide can include a tilted surface portion forming at least part of an in-coupling optical element, the tilted surface portion having curvature to provide optical power.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 6/28 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals
  • G02B 27/18 - Optical systems or apparatus not provided for by any of the groups , for optical projection, e.g. combination of mirror and condenser and objective
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

73.

CRYSTALLINE WAVEGUIDES AND WEARABLE DEVICES CONTAINING THE SAME

      
Application Number US2023084633
Publication Number 2024/130250
Status In Force
Filing Date 2023-12-18
Publication Date 2024-06-20
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Singh, Vikramjit
  • Khandekar, Chinmay
  • Tekolste, Robert D.
  • Ong, Ryan Jason
  • Martinez Jr., Arturo Manuel
  • Faraji-Dana, Mohammadsadegh
  • Xue, Qizhen
  • Xu, Frank Y.
  • Melli, Mauro
  • Mueller, Brennen

Abstract

A head-mounted display system includes: a head mounted display frame; a first eyepiece supported by the frame, the first eyepiece including a first substrate composed of a crystalline, transparent material having crystallographic axes in a first orientation with respect to the frame, the substrate having a first surface and a second surface opposite the first surface, the first eyepiece further including a first in-coupling element including a grating on the first surface, and a first out-coupling element including a grating on the first surface and/or a grating on the second surface; and a second eyepiece including a second substrate composed of the crystalline, transparent material having crystallographic axes in a second orientation with respect to the frame different from the first orientation, a second in-coupling element on either surface of the second substrate, and a second out-coupling element on either surface of the second substrate.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02F 1/313 - Digital deflection devices in an optical waveguide structure
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02B 5/32 - Holograms used as optical elements

74.

COVER ARCHITECTURES IN CURVED EYEPIECE STACKS FOR MIXED REALITY APPLICATIONS

      
Application Number 18554623
Status Pending
Filing Date 2022-04-15
First Publication Date 2024-06-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Ong, Ryan Jason
  • Li, Ling
  • Chang, Chieh
  • Bhagat, Sharad D.
  • Peroz, Christophe
  • Liu, Victor Kai
  • Bharagava, Samarth
  • Melli, Mauro
  • West, Melanie Maputol

Abstract

Eyepieces and methods of fabricating the eyepieces are disclosed. In some embodiments, the eyepiece comprises a curved cover layer and a waveguide layer for propagating light. In some embodiments, the curved cover layer comprises an antireflective feature.

IPC Classes  ?

75.

METHOD AND SYSTEM FOR IMPROVING PHASE CONTINUITY IN EYEPIECE WAVEGUIDE DISPLAYS

      
Application Number US2023082783
Publication Number 2024/123946
Status In Force
Filing Date 2023-12-06
Publication Date 2024-06-13
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Tekolste, Robert D.
  • Hsu, Liyi
  • Khandekar, Chinmay
  • Singh, Vikramjit
  • Xue, Qizhen
  • Yang, Shuqiang

Abstract

An eyepiece for an augmented reality headset includes an eyepiece waveguide and an incoupling diffractive optical element coupled to the eyepiece waveguide. The incoupling diffractive optical element is disposed at a first lateral location. The eyepiece also includes an outcoupling diffractive optical element coupled to the eyepiece waveguide. The outcoupling diffractive optical element is disposed at a second lateral location different than the first lateral location. The eyepiece further includes an optical structure coupled to the eyepiece waveguide. The optical structure is disposed at a third lateral location between the first lateral location and the second lateral location.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G02F 1/295 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection in an optical waveguide structure
  • H04N 13/344 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

76.

VERY HIGH INDEX EYEPIECE SUBSTRATE-BASED VIEWING OPTICS ASSEMBLY ARCHITECTURES

      
Application Number 18407878
Status Pending
Filing Date 2024-01-09
First Publication Date 2024-06-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Klug, Michael Anthony
  • Curtis, Kevin Richard
  • Singh, Vikramjit
  • Luo, Kang
  • Vaughn, Michal Beau Dennison
  • Bhargava, Samarth
  • Yang, Shuqiang
  • Miller, Michael Nevin
  • Xu, Frank Y.
  • Messer, Kevin
  • Tekolste, Robert D.

Abstract

Very high refractive index (n>2.2) lightguide substrates enable the production of 70° field of view eyepieces with all three color primaries in a single eyepiece layer. Disclosed herein are viewing optics assembly architectures that make use of such eyepieces to reduce size and cost, simplify manufacturing and assembly, and better accommodate novel microdisplay designs.
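
The abstract does not include the underlying calculation, but a back-of-envelope check (with the assumptions noted in the comments) shows why a higher substrate index helps: the total-internal-reflection critical angle drops as the index rises, widening the range of internal angles a single layer can guide, which is the usual lever for a wider field of view.

```python
import math

def tir_critical_angle_deg(n_substrate, n_outside=1.0):
    """Critical angle for total internal reflection at a substrate/air interface."""
    return math.degrees(math.asin(n_outside / n_substrate))

def guided_angle_range_deg(n_substrate, max_internal_angle=75.0):
    """Rough usable span of internal propagation angles; the steep-angle cap is
    an arbitrary practical limit assumed here, not a value from the filing."""
    return max_internal_angle - tir_critical_angle_deg(n_substrate)

if __name__ == "__main__":
    for n in (1.5, 1.8, 2.2, 2.4):
        print(f"n={n}: critical angle {tir_critical_angle_deg(n):5.1f} deg, "
              f"usable internal range ~{guided_angle_range_deg(n):5.1f} deg")
```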

IPC Classes  ?

77.

METHOD AND SYSTEM FOR PUPIL SEPARATION IN A DIFFRACTIVE EYEPIECE WAVEGUIDE DISPLAY

      
Application Number 18582405
Status Pending
Filing Date 2024-02-20
First Publication Date 2024-06-13
Owner Magic Leap, Inc. (USA)
Inventor
  • Komanduri, Ravi Kumar
  • Oh, Chulwoo
  • Messer, Kevin
  • Papadopoulos, Ioannis

Abstract

A pupil separation system includes an input surface and a central portion including one or more dichroic mirrors. The pupil separation system also includes a reflective surface disposed laterally with respect to the central portion and an output surface including a central surface operable to transmit light in a first wavelength range and a peripheral surface operable to transmit light in a second wavelength range different from the first wavelength range. The light in the second wavelength range is reflected by the one or more dichroic mirrors before being reflected by the reflective surface.

IPC Classes  ?

  • G06V 20/20 - Scenes; Scene-specific elements in augmented reality scenes
  • G02B 27/01 - Head-up displays
  • G06F 18/10 - Pre-processing; Data cleansing
  • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
  • G06F 18/22 - Matching criteria, e.g. proximity measures
  • G06N 20/00 - Machine learning
  • G06T 3/18 - Image warping, e.g. rearranging pixels individually
  • G06V 20/62 - Text, e.g. of license plates, overlay texts or captions on TV images

78.

OPTICALLY FUNCTIONAL STRUCTURES FOR AUGMENTED REALITY DEVICES

      
Application Number US2022051853
Publication Number 2024/123308
Status In Force
Filing Date 2022-12-05
Publication Date 2024-06-13
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Faraji-Dana, Mohammadsadegh
  • Tekolste, Robert Dale
  • Khandekar, Chinmay
  • Hsu, Liyi
  • Singh, Vikramjit
  • Melli, Mauro
  • Liu, Victor Kai
  • West, Melanie Maputol
  • Xue, Qizhen

Abstract

Disclosed herein is an article including: a waveguide formed from a polymer material and including: an optical coupling structure on a surface of the waveguide, the optical coupling structure configured to couple light incident on the optical coupling structure into the waveguide; an out-coupling surface grating extending over a first region of the surface of the waveguide; and an anti-reflective surface grating extending over a second region of the surface of the waveguide, the second region encircling the optical coupling structure, the anti-reflective surface grating including: a layer of material layered to a thickness on the surface of the waveguide; and a pattern of nanostructures having a width, extending from the layer of material by a height and spaced apart by a pitch.

IPC Classes  ?

  • G02B 5/18 - Diffracting gratings
  • G02B 6/00 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
  • G02B 27/01 - Head-up displays

79.

EYE TRACKING IN NEAR-EYE DISPLAYS

      
Application Number 18545172
Status Pending
Filing Date 2023-12-19
First Publication Date 2024-06-06
Owner
  • UNIVERSITY OF WASHINGTON (USA)
  • MAGIC LEAP, INC. (USA)
Inventor
  • Seibel, Eric J.
  • Brunton, Steven L.
  • Gong, Chen
  • Schowengerdt, Brian T.

Abstract

Techniques for tracking eye movement in an augmented reality system identify a plurality of base images of an object or a portion thereof. A search image may be generated based at least in part upon at least some of the plurality of base images. A deep learning result may be generated at least by performing a deep learning process on a base image using a neural network in a deep learning mode. A captured image may be localized at least by performing an image registration process on the captured image and the search image using a Kalman filter model and the deep learning result.
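
As a minimal sketch of the filtering half of such a pipeline, the code below runs a constant-velocity Kalman filter over 2-D gaze positions, treating the output of the image-registration step as the noisy measurement. The state layout, noise levels, and frame rate are illustrative assumptions, and no deep-learning component is modeled here.

```python
import numpy as np

class GazeKalman:
    """Constant-velocity Kalman filter over a 2-D gaze state (x, y, vx, vy)."""

    def __init__(self, dt=1 / 60, process_var=1e-3, meas_var=1e-2):
        self.x = np.zeros(4)                           # state estimate
        self.P = np.eye(4)                             # state covariance
        self.F = np.eye(4)                             # transition model
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = process_var * np.eye(4)               # process noise
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.R = meas_var * np.eye(2)                  # measurement noise

    def step(self, measured_xy):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the position reported by image registration
        z = np.asarray(measured_xy, dtype=float)
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

if __name__ == "__main__":
    kf = GazeKalman()
    noisy_track = [(0.1 * t + np.random.normal(0, 0.05), 0.0) for t in range(30)]
    for z in noisy_track:
        smoothed = kf.step(z)
    print("last smoothed gaze:", np.round(smoothed, 3))
```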

IPC Classes  ?

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G02B 27/01 - Head-up displays
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods
  • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
  • G06T 7/277 - Analysis of motion involving stochastic approaches, e.g. using Kalman filters
  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods

80.

AUGMENTED REALITY DISPLAY HAVING LIQUID CRYSTAL VARIABLE FOCUS ELEMENT AND ROLL-TO-ROLL METHOD AND APPARATUS FOR FORMING THE SAME

      
Application Number 18414100
Status Pending
Filing Date 2024-01-16
First Publication Date 2024-06-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Patterson, Roy Matthew
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Carden, Charles Scott
  • Miller, Michael Nevin
  • Singh, Vikramjit
  • Yang, Shuqiang

Abstract

A display device includes a waveguide assembly comprising a waveguide configured to outcouple light out of a major surface of the waveguide to form an image in the eyes of a user. An adaptive lens assembly comprises a switchable waveplate assembly. The switchable waveplate assembly includes quarter-wave plates on opposing sides of a switchable liquid crystal layer, and electrodes on the quarter-wave plates in the volume between the quarter-wave plates. The electrodes can selectively establish an electric field and may serve as an alignment structure for molecules of the liquid crystal layer. Portions of the adaptive lens assembly may be manufactured by roll-to-roll processing in which a substrate roll is unwound, and alignment layers and liquid crystal layers are formed on the substrate as it moves towards a second roller, to be wound on that second roller.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection

81.

METHODS TO IMPROVE THE PERCEPTUAL QUALITY OF FOVEATED RENDERED IMAGES

      
Application Number 18439296
Status Pending
Filing Date 2024-02-12
First Publication Date 2024-06-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Wu, Bing
  • Vlaskamp, Bjorn Nicolaas Servatius
  • Cohen, Howard Russell
  • Rodriguez, Jose Felix

Abstract

A method of generating foveated rendering using temporal multiplexing includes generating a first spatial profile for an FOV by dividing the FOV into a first foveated zone and a first peripheral zone. The first foveated zone will be rendered at a first pixel resolution, and the first peripheral zone will be rendered at a second pixel resolution lower than the first pixel resolution. The method further includes generating a second spatial profile for the FOV by dividing the FOV into a second foveated zone and a second peripheral zone, the second foveated zone being spatially offset from the first foveated zone. The second foveated zone and the second peripheral zone will be rendered at the first pixel resolution and the second pixel resolution, respectively. The method further includes multiplexing the first spatial profile and the second spatial profile temporally in a sequence of frames.
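
A minimal sketch of the idea, under invented zone sizes and resolutions: two spatial profiles with offset foveated zones are alternated across frames, so a given view direction can fall inside the high-resolution zone in at least some frames of the sequence.

```python
from dataclasses import dataclass

@dataclass
class SpatialProfile:
    """A field of view split into a high-resolution foveated zone and a
    lower-resolution peripheral zone (resolutions in pixels per degree)."""
    fovea_center_deg: tuple
    fovea_radius_deg: float
    fovea_ppd: int
    periphery_ppd: int

    def resolution_at(self, direction_deg):
        dx = direction_deg[0] - self.fovea_center_deg[0]
        dy = direction_deg[1] - self.fovea_center_deg[1]
        inside = (dx * dx + dy * dy) ** 0.5 <= self.fovea_radius_deg
        return self.fovea_ppd if inside else self.periphery_ppd

def temporally_multiplexed(profiles, num_frames):
    """Yield (frame_index, profile) pairs, alternating the spatial profiles."""
    for frame in range(num_frames):
        yield frame, profiles[frame % len(profiles)]

if __name__ == "__main__":
    # The second foveated zone is spatially offset from the first.
    p1 = SpatialProfile((0.0, 0.0), 10.0, 40, 10)
    p2 = SpatialProfile((5.0, 0.0), 10.0, 40, 10)
    gaze_direction = (7.0, 0.0)
    for frame, prof in temporally_multiplexed([p1, p2], 4):
        print(f"frame {frame}: {prof.resolution_at(gaze_direction)} ppd at {gaze_direction}")
```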

IPC Classes  ?

  • G06T 3/40 - Scaling of a whole image or part thereof
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 15/10 - Geometric effects

82.

DUAL IMU SLAM

      
Application Number 18439653
Status Pending
Filing Date 2024-02-12
First Publication Date 2024-06-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Huang, Yu-Hsiang
  • Levine, Evan Gregory
  • Napolskikh, Igor
  • Kasper, Dominik Michael
  • Sanchez Nicuesa, Manel Quim
  • Sima, Sergiu
  • Langmann, Benjamin
  • Swaminathan, Ashwin
  • Zahnert, Martin Georg
  • Czuprynski, Blazej Marek
  • Faro, Joao Antonio Pereira
  • Tobler, Christoph
  • Ghasemalizadeh, Omid

Abstract

Examples of the disclosure describe systems and methods for presenting virtual content on a wearable head device. In some embodiments, a state of a wearable head device is determined by minimizing a total error based on a reduced weight associated with a reprojection error. A view reflecting the determined state of the wearable head device is presented via a display of the wearable head device. In some embodiments, a wearable head device calculates a preintegration term based on the image data received via a sensor of the wearable head device and the inertial data received via a first IMU and a second IMU of the wearable head device. The wearable head device estimates a position of the device based on the preintegration term, and the wearable head device presents the virtual content based on the position of the device.
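
The sketch below is a toy version of the first idea only: a total error that combines a down-weighted reprojection residual with residuals against motion estimates attributed to two IMUs, minimized over a 3-D position. The residual forms, the weight, and the sensor values are placeholders, not the filing's actual cost function.

```python
import numpy as np
from scipy.optimize import minimize

def total_error(state, reproj_meas, imu1_meas, imu2_meas, w_reproj=0.2):
    """Toy cost: down-weighted reprojection term plus residuals against a
    preintegrated motion estimate from each of the two IMUs."""
    position = state[:3]
    reproj_residual = np.linalg.norm(position - reproj_meas) ** 2
    imu1_residual = np.linalg.norm(position - imu1_meas) ** 2
    imu2_residual = np.linalg.norm(position - imu2_meas) ** 2
    return w_reproj * reproj_residual + imu1_residual + imu2_residual

if __name__ == "__main__":
    # Hypothetical position estimates implied by each sensor stream.
    reproj = np.array([1.0, 0.0, 0.0])     # vision term, least trusted here
    imu1 = np.array([1.1, 0.05, 0.0])
    imu2 = np.array([1.05, 0.02, 0.0])
    result = minimize(total_error, x0=np.zeros(3), args=(reproj, imu1, imu2))
    print("estimated head position:", np.round(result.x, 3))
```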

IPC Classes  ?

  • G06T 15/20 - Perspective computation
  • G01C 19/00 - Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
  • G01P 7/00 - Measuring speed by integrating acceleration
  • G01P 13/00 - Indicating or recording presence or absence of movement; Indicating or recording of direction of movement
  • G01P 15/08 - Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration by making use of inertia forces with conversion into electric or magnetic values
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06T 19/00 - Manipulating 3D models or images for computer graphics

83.

OPTICAL DEVICE WITH ONE-WAY MIRROR

      
Application Number 18440180
Status Pending
Filing Date 2024-02-13
First Publication Date 2024-06-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Oh, Chulwoo
  • Komanduri, Ravi Kumar
  • Kleinman, David
  • Mathur, Vaibhav
  • Manly, David

Abstract

In some implementations, an optical device includes a one-way mirror formed by a polarization selective mirror and an absorptive polarizer. The absorptive polarizer has a transmission axis aligned with the transmission axis of the reflective polarizer. The one-way mirror may be provided on the world side of a head-mounted display system. Advantageously, the one-way mirror may reflect light from the world, which provides privacy and may improve the cosmetics of the display. In some implementations, the one-way mirror may include one or more of a depolarizer and a pair of opposing waveplates to improve alignment tolerances and reduce reflections to a viewer. In some implementations, the one-way mirror may form a compact integrated structure with a dimmer for reducing light transmitted to the viewer from the world.

IPC Classes  ?

84.

APPLICATION SHARING

      
Application Number 18444315
Status Pending
Filing Date 2024-02-16
First Publication Date 2024-06-06
Owner Magic Leap, Inc. (USA)
Inventor
  • Babu J D, Praveen
  • Stolzenberg, Karen
  • Tajik, Jehangir
  • Talwalkar, Rohit Anil
  • Bryant, Colman Thomas
  • Zolotarev, Leonid

Abstract

A host device having a first processor executes an application via the first processor. The host device determines a state of the application. A scenegraph is generated corresponding to the state of the application, and the scenegraph is presented to a remote device having a display and a second processor. The remote device is configured to, in response to receiving the scenegraph, render to the display a view corresponding to the scenegraph, without executing the application via the second processor.
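
A minimal sketch of the described split, with an invented node format: the host runs the application and serializes its state into a scenegraph, while the remote device only parses and renders whatever scenegraph it receives, never executing the application logic itself.

```python
import json

class HostDevice:
    """Runs the application and publishes its state as a scenegraph."""

    def __init__(self):
        self.cube_angle = 0.0          # stand-in for real application state

    def run_application_step(self):
        self.cube_angle = (self.cube_angle + 5.0) % 360.0

    def build_scenegraph(self):
        # Pure description of what to draw; no application logic inside.
        return json.dumps({
            "nodes": [
                {"type": "cube", "rotation_deg": self.cube_angle, "size": 0.2},
                {"type": "label", "text": f"angle={self.cube_angle:.0f}"},
            ]
        })

class RemoteDevice:
    """Renders whatever scenegraph it receives; never executes the app."""

    def render(self, scenegraph_json):
        scene = json.loads(scenegraph_json)
        for node in scene["nodes"]:
            print(f"draw {node['type']}: "
                  + ", ".join(f"{k}={v}" for k, v in node.items() if k != "type"))

if __name__ == "__main__":
    host, remote = HostDevice(), RemoteDevice()
    for _ in range(3):
        host.run_application_step()        # only the host advances app state
        remote.render(host.build_scenegraph())
```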

IPC Classes  ?

  • H04L 67/131 - Protocols for games, networked simulations or virtual reality
  • G02B 27/01 - Head-up displays
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04L 9/40 - Network security protocols
  • H04L 67/51 - Discovery or management thereof, e.g. service location protocol [SLP] or web services
  • H04L 67/75 - Indicating network or usage conditions on the user display
  • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences

85.

METHOD AND SYSTEM FOR REDUCING OPTICAL ARTIFACTS IN AUGMENTED REALITY DEVICES

      
Application Number US2022051296
Publication Number 2024/118062
Status In Force
Filing Date 2022-11-29
Publication Date 2024-06-06
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Llaneras, Zachary Michael
  • Schaefer, Jason
  • Singh, Vikramjit
  • Arend, Erik Heath
  • Tekolste, Robert D.

Abstract

An augmented reality headset includes a frame and a plurality of eyepiece waveguide displays supported in the frame. Each of the plurality of eyepiece waveguide displays includes a projector and an eyepiece having a world side and a user side. The eyepiece includes one or more eyepiece waveguide layers, each of the one or more eyepiece waveguide layers including an in-coupling diffractive optical element and an out-coupling diffractive optical element. Each of the plurality of eyepiece waveguide displays also includes a first extended depth of field (EDOF) refractive element disposed adjacent the world side, a dimmer assembly disposed adjacent the world side, a second EDOF refractive element disposed adjacent the user side, and an optical absorber disposed adjacent the eyepiece and overlapping in plan view with a portion of the eyepiece.

IPC Classes  ?

86.

DUMMY IMPRINTED REGIONS

      
Application Number US2023082041
Publication Number 2024/119053
Status In Force
Filing Date 2023-12-01
Publication Date 2024-06-06
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Xue, Qizhen
  • Singh, Vikramjit
  • Xu, Frank Y.

Abstract

In some implementations, a method includes imprinting an optically-diffractive structure, and imprinting an optically-sub-diffractive structure adjacent to the optically-diffractive structure. For example, the structures can be gratings with different characteristics. The imprinted structures can be included in optical devices and display systems.

IPC Classes  ?

  • G02B 27/44 - Grating systems; Zone plate systems
  • G02B 6/34 - Optical coupling means utilising prism or grating
  • G06F 1/16 - Constructional details or arrangements
  • G02B 6/293 - Optical coupling means having data bus means, i.e. plural waveguides interconnected and providing an inherently bidirectional system by mixing and splitting signals with wavelength selective means
  • H01L 33/58 - Optical field-shaping elements

87.

WEARABLE DISPLAY SYSTEMS WITH NANOWIRE LED MICRO-DISPLAYS

      
Application Number 18385264
Status Pending
Filing Date 2023-10-30
First Publication Date 2024-05-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Trisnadi, Jahja I.
  • Carlisle, Clinton

Abstract

A wearable display system includes one or more nanowire LED micro-displays. The nanowire micro-LED displays may be monochrome or full-color. The nanowire LEDs forming the arrays may have an advantageously narrow angular emission profile and high light output. Where a plurality of nanowire LED micro-displays is utilized, the micro-displays may be positioned at different sides of an optical combiner, for example, an X-cube prism which receives light rays from different micro-displays and outputs the light rays from the same face of the cube. The optical combiner directs the light to projection optics, which outputs the light to an eyepiece that relays the light to a user's eye. The eyepiece may output the light to the user's eye with different amounts of wavefront divergence, to place virtual content on different depth planes.

IPC Classes  ?

  • G02B 27/01 - Head-up displays
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • B82Y 20/00 - Nanooptics, e.g. quantum optics or photonic crystals

88.

METHOD OF FABRICATING MOLDS FOR FORMING EYEPIECES WITH INTEGRATED SPACERS

      
Application Number 18388660
Status Pending
Filing Date 2023-11-10
First Publication Date 2024-05-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Melli, Mauro
  • Chang, Chieh
  • Li, Ling
  • West, Melanie Maputol
  • Peroz, Christophe
  • Karbasi, Ali
  • Bhagat, Sharad D.
  • Hill, Brian George

Abstract

Methods are disclosed for fabricating molds for forming eyepieces having waveguides with integrated spacers. The molds are formed by etching deep holes (e.g., 5 μm to 1000 μm deep) into a substrate using a wet etch or dry etch. The etch masks for defining the holes may be formed with a thick metal layer and/or multiple layers of different metals. A resist layer may be disposed over the etch mask. The resist layer may be patterned to form a pattern of holes, the pattern may be transferred to the etch mask, and the etch mask may be used to transfer the pattern into the underlying substrate. The patterned substrate may be utilized as a mold onto which a flowable polymer may be introduced and allowed to harden. Hardened polymer in the holes may form integrated spacers. The hardened polymer may be removed from the mold to form a waveguide with integrated spacers.

IPC Classes  ?

  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms
  • B29C 33/42 - Moulds or cores; Details thereof or accessories therefor characterised by the shape of the moulding surface, e.g. ribs or grooves
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind
  • G02B 6/136 - Integrated optical circuits characterised by the manufacturing method by etching

89.

SYSTEMS AND METHODS FOR PRESENTING PERSPECTIVE VIEWS OF AUGMENTED REALITY VIRTUAL OBJECT

      
Application Number 18431729
Status Pending
Filing Date 2024-02-02
First Publication Date 2024-05-30
Owner Magic Leap, Inc. (USA)
Inventor Mccall, Marc Alan

Abstract

Examples of the disclosure describe systems and methods for sharing perspective views of virtual content. In an example method, a virtual object is presented, via a display, to a first user. A first perspective view of the virtual object is determined, wherein the first perspective view is based on a position of the virtual object and a position of the first user. The virtual object is presented, via a display, to a second user, wherein the virtual object is presented to the second user according to the first perspective view. A second perspective view of the virtual object is determined, wherein the second perspective view is based on an input from the first user. The virtual object is presented, via a display, to the second user, wherein presenting the virtual object to the second user comprises presenting a transition from the first perspective view to the second perspective view.
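
As an illustration only, the sketch below reduces a "perspective view" to a view direction plus a distance computed from the object and viewer positions, then interpolates a transition from the first perspective to a second one; the representation and the blending are simplifications, not the claimed rendering method.

```python
import numpy as np

def perspective_view(object_pos, viewer_pos):
    """A 'perspective view' reduced to a unit view direction and a distance."""
    offset = np.asarray(object_pos, float) - np.asarray(viewer_pos, float)
    distance = np.linalg.norm(offset)
    return offset / distance, distance

def transition(view_a, view_b, steps=5):
    """Linearly blend between two perspective views (directions re-normalized)."""
    (dir_a, dist_a), (dir_b, dist_b) = view_a, view_b
    for t in np.linspace(0.0, 1.0, steps):
        d = (1 - t) * dir_a + t * dir_b
        yield d / np.linalg.norm(d), (1 - t) * dist_a + t * dist_b

if __name__ == "__main__":
    obj = (0.0, 1.5, -2.0)
    first_user_view = perspective_view(obj, viewer_pos=(0.0, 1.6, 0.0))
    second_view = perspective_view(obj, viewer_pos=(1.0, 1.6, 0.5))  # after first user's input
    for direction, dist in transition(first_user_view, second_view):
        print(np.round(direction, 2), round(dist, 2))
```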

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 20/64 - Three-dimensional objects

90.

EXIT PUPIL EXPANDER

      
Application Number 18432547
Status Pending
Filing Date 2024-02-05
First Publication Date 2024-05-30
Owner Magic Leap, Inc. (USA)
Inventor Kimmel, Jyrki Sakari

Abstract

An exit pupil expander (EPE) has entrance and exit pupils, a back surface adjacent to the entrance pupil, and an opposed front surface. In one embodiment the EPE is geometrically configured such that light defining a center wavelength that enters at the entrance pupil perpendicular to the back surface experiences angularly varying total internal reflection between the front and back surfaces such that the light exiting the optical channel perpendicular to the exit pupil is at a wavelength shifted from the center wavelength. In another embodiment a first distance at the entrance pupil between the front and back surfaces is different from a second distance at the exit pupil between the front and back surfaces. The EPE may be deployed in a head-wearable imaging device (e.g., virtual or augmented reality) where the entrance pupil in-couples light from a micro display and the exit pupil out-couples light from the EPE.

IPC Classes  ?

91.

SYSTEMS AND METHODS FOR MIXED REALITY

      
Application Number 18431758
Status Pending
Filing Date 2024-02-02
First Publication Date 2024-05-30
Owner Magic Leap, Inc. (USA)
Inventor
  • Schowengerdt, Brian T.
  • Watson, Mathew D.
  • Tinch, David
  • Yeoh, Ivan Li Chuen
  • Macnamara, John Graham
  • Edwin, Lionel Ernest
  • Klug, Michael Anthony
  • Welch, William Hudson

Abstract

A virtual image generation system comprises a planar optical waveguide having opposing first and second faces, an in-coupling (IC) element configured for optically coupling a collimated light beam from an image projection assembly into the planar optical waveguide as an in-coupled light beam, a first orthogonal pupil expansion (OPE) element associated with the first face of the planar optical waveguide for splitting the in-coupled light beam into a first set of orthogonal light beamlets, a second orthogonal pupil expansion (OPE) element associated with the second face of the planar optical waveguide for splitting the in-coupled light beam into a second set of orthogonal light beamlets, and an exit pupil expansion (EPE) element associated with the planar optical waveguide for splitting the first and second sets of orthogonal light beamlets into an array of out-coupled light beamlets that exit the planar optical waveguide.

IPC Classes  ?

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G02B 5/18 - Diffracting gratings
  • G02B 26/08 - Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
  • G02B 27/01 - Head-up displays
  • G02B 27/09 - Beam shaping, e.g. changing the cross-sectioned area, not otherwise provided for
  • G02B 27/10 - Beam splitting or combining systems
  • G02B 27/30 - Collimators
  • G02B 27/42 - Diffraction optics
  • G02B 6/12 - Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type of the integrated circuit kind

92.

SYSTEMS AND METHODS FOR VIRTUAL AND AUGMENTED REALITY

      
Application Number 18430521
Status Pending
Filing Date 2024-02-01
First Publication Date 2024-05-23
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Cardenuto, Rodolpho C.
  • Lundmark, David Charles

Abstract

A computer implemented method of facilitating communication between first and second users includes displaying, by a first head-worn device, a first virtual object to the first user wearing the first head-worn device. The method also includes displaying, by a second head-worn device, a second virtual object to the second user wearing the second head-worn device. The method further includes facilitating, by the first and second head-worn devices, communications between the first and second users using the first and second virtual objects to simulate the first and second users being present in a common environment.

IPC Classes  ?

  • G06F 3/14 - Digital output to display device
  • G02B 27/01 - Head-up displays
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

93.

OPTICAL DEVICES AND HEAD-MOUNTED DISPLAYS EMPLOYING TUNABLE CYLINDRICAL LENSES

      
Application Number 18281896
Status Pending
Filing Date 2022-03-14
First Publication Date 2024-05-23
Owner Magic Leap, Inc. (USA)
Inventor Russell, Andrew Ian

Abstract

This disclosure describes in-plane switching mode liquid crystal geometric phase tunable lenses that can be integrated into an eyepiece of an optical device for the correction of non-emmetropic vision, such as in an augmented reality display system. The eyepiece can include an integrated, field-configurable optic arranged with respect to a waveguide used to project digital imagery to the user, the optic being capable of providing a tunable Rx for the user including variable spherical refractive power (SPH), cylinder refractive power, and cylinder axis values. In certain configurations, each tunable eyepiece includes two variable compound lenses: one on the user side of the waveguide with variable SPH, cylinder power, and axis values; and a second on the world side of the waveguide with variable SPH.
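
For context on why variable SPH, cylinder power, and axis are all needed, the sketch below evaluates the standard sphero-cylindrical relation P(θ) = SPH + CYL·sin²(θ − axis), which gives the refractive power a corrective optic must supply along each meridian. The example prescription values are invented, and this is background optics rather than content of the filing.

```python
import math

def meridian_power(sph, cyl, axis_deg, meridian_deg):
    """Refractive power of a sphero-cylindrical prescription along a meridian:
    P(theta) = SPH + CYL * sin^2(theta - axis)."""
    theta = math.radians(meridian_deg - axis_deg)
    return sph + cyl * math.sin(theta) ** 2

if __name__ == "__main__":
    # Hypothetical prescription: -2.00 D sphere, -1.50 D cylinder, axis 90.
    for meridian in (0, 45, 90, 135):
        p = meridian_power(sph=-2.0, cyl=-1.5, axis_deg=90, meridian_deg=meridian)
        print(f"meridian {meridian:3d} deg: {p:+.2f} D")
```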

IPC Classes  ?

  • G02F 1/1343 - Electrodes
  • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups ,
  • G02B 27/01 - Head-up displays
  • G02F 1/1347 - Arrangement of liquid crystal layers or cells in which the final condition of one light beam is achieved by the addition of the effects of two or more layers or cells
  • G02F 1/29 - Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the position or the direction of light beams, i.e. deflection

94.

VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

      
Application Number 18423854
Status Pending
Filing Date 2024-01-26
First Publication Date 2024-05-23
Owner MAGIC LEAP, INC. (USA)
Inventor Taylor, Robert Blake

Abstract

A method for displaying a three dimensional (“3D”) image includes rendering a frame of 3D image data. The method also includes analyzing the frame of 3D image data to generate best known depth data. The method further includes using the best known depth data to segment the 3D image data into near and far frames of two dimensional (“2D”) image data corresponding to near and far depths respectively. Moreover, the method includes displaying near and far 2D image frames corresponding to the near and far frames of 2D image data at near and far depths to a user respectively.
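
A minimal sketch of the segmentation step, assuming a per-pixel depth buffer is available alongside the rendered RGB frame: pixels at or nearer than a split depth go to the near 2-D frame and the rest to the far 2-D frame, with untouched positions left transparent. The threshold and array shapes are illustrative.

```python
import numpy as np

def segment_by_depth(rgb, depth, split_depth):
    """Split a rendered frame into near and far 2-D frames using per-pixel depth.

    Pixels at or nearer than split_depth go to the near frame; the rest go to
    the far frame. Empty positions are left transparent (alpha = 0).
    """
    h, w, _ = rgb.shape
    near = np.zeros((h, w, 4), dtype=rgb.dtype)
    far = np.zeros((h, w, 4), dtype=rgb.dtype)
    near_mask = depth <= split_depth
    near[near_mask, :3] = rgb[near_mask]
    near[near_mask, 3] = 255
    far[~near_mask, :3] = rgb[~near_mask]
    far[~near_mask, 3] = 255
    return near, far

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rgb = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)
    depth = rng.uniform(0.3, 3.0, size=(4, 4))       # metres, illustrative
    near, far = segment_by_depth(rgb, depth, split_depth=1.0)
    print("near-plane pixels:", int((near[..., 3] == 255).sum()))
    print("far-plane pixels:", int((far[..., 3] == 255).sum()))
```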

IPC Classes  ?

95.

ATHERMALIZATION CONCEPTS FOR POLYMER EYEPIECES USED IN AUGMENTED REALITY OR MIXED REALITY DEVICES

      
Application Number 18549835
Status Pending
Filing Date 2022-03-11
First Publication Date 2024-05-23
Owner Magic Leap, Inc. (USA)
Inventor
  • Rugg, Stephen Richard
  • Karvasi, Ali
  • Mareno, Jason Donald
  • Nguyen, Bach
  • Brune, Philip F.
  • Tinch, David
  • Bhargava, Smarth

Abstract

Embodiments of this disclosure provide systems and methods for displays. In embodiments, a display system includes a frame, an eyepiece coupled to the frame, and a first adhesive bond disposed between the frame and the eyepiece. The eyepiece can include a light input region and a light output region. The first adhesive bond can be disposed along a first portion of a perimeter of the eyepiece, where the first portion of the perimeter of the eyepiece borders the light input region such that the first adhesive bond is configured to maintain a position of the light input region relative to the frame.

IPC Classes  ?

96.

GLOBAL DIMMING

      
Application Number 1789376
Status Registered
Filing Date 2024-04-12
Registration Date 2024-04-12
Owner Magic Leap, Inc. (USA)
NICE Classes  ? 09 - Scientific and electric apparatus and instruments

Goods & Services

Computer software and hardware; headsets.

97.

ANTI-REFLECTIVE COATINGS ON OPTICAL WAVEGUIDES

      
Application Number 18419183
Status Pending
Filing Date 2024-01-22
First Publication Date 2024-05-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Peroz, Christophe
  • Messer, Kevin

Abstract

An anti-reflective waveguide assembly comprising a waveguide substrate having a first index of refraction, a plurality of diffractive optical elements disposed upon a first surface of the waveguide, and an anti-reflective coating disposed upon a second surface of the waveguide. The anti-reflective coating preferably increases the transmission of light into the waveguide through the surface to which it is applied, so that at least 97 percent of the light is transmitted. The anti-reflective coating is composed of four layers of material having indices of refraction different from the first index of refraction and an imaginary refractive index less than 1×10⁻³, but preferably less than 5×10⁻⁴.

IPC Classes  ?

  • G02B 1/115 - Multilayers
  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • G02B 27/01 - Head-up displays

98.

METHODS, SYSTEMS, AND PRODUCTS FOR AN EXTENDED REALITY DEVICE HAVING A LAMINATED EYEPIECE

      
Application Number US2023078965
Publication Number 2024/102747
Status In Force
Filing Date 2023-11-07
Publication Date 2024-05-16
Owner MAGIC LEAP, INC. (USA)
Inventor
  • Singh, Vikramjit
  • Xu, Frank Y.
  • Ong, Ryan Jason
  • Frish, Julie
  • Bhagat, Sharad D.
  • Tekolste, Robert D.
  • Mareno, Jason Donald

Abstract

An eyepiece of an extended reality system comprises a polymeric laminate; a monolithic glass-like optical element having a first side, or a portion thereof, laminated to the polymeric laminate, or double glass-like optical elements that sandwich the polymeric laminate; a set of surface relief grating structures implemented on a second side, or a portion of the second side, of the monolithic glass-like optical element; and a projector that projects light beams of one or more images at multiple different depths through the eyepiece to an eye of a user. Also described are techniques for creating and presenting virtual content to a user using at least the aforementioned eyepiece.

IPC Classes  ?

  • G02B 5/18 - Diffracting gratings
  • G02B 1/04 - Optical elements characterised by the material of which they are made; Optical coatings for optical elements made of organic materials, e.g. plastics
  • G02B 27/01 - Head-up displays
  • G03B 21/14 - Projectors or projection-type viewers; Accessories therefor - Details
  • B32B 17/08 - Layered products essentially comprising sheet glass, or fibres of glass, slag or the like comprising glass as the main or only constituent of a layer, next to another layer of a specific substance of cellulosic plastic substance
  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • H04N 13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]

99.

WAVEGUIDES HAVING INTEGRATED SPACERS, WAVEGUIDES HAVING EDGE ABSORBERS, AND METHODS FOR MAKING THE SAME

      
Application Number 18394211
Status Pending
Filing Date 2023-12-22
First Publication Date 2024-05-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Peroz, Christophe
  • Chang, Chieh
  • Bhagat, Sharad D.
  • Liu, Victor Kai
  • Patterson, Roy Matthew
  • Jurbergs, David Carl
  • Khorasaninejad, Mohammadreza
  • Li, Ling
  • Miller, Michael Nevin
  • Carden, Charles Scott

Abstract

In some embodiments, a head-mounted, near-eye display system comprises a stack of waveguides having integral spacers separating the waveguides. The waveguides may each include diffractive optical elements that are formed simultaneously with the spacers by imprinting. The spacers are disposed on one major surface of each of the waveguides and indentations are provided on an opposite major surface of each of the waveguides. The indentations are sized and positioned to align with the spacers, thereby forming a self-aligned stack of waveguides. Tops of the spacers may be provided with light scattering features, anti-reflective coatings, and/or light absorbing adhesive to prevent light leakage between the waveguides. As seen in a top-down view, the spacers may be elongated along the same axis as the diffractive optical elements. The waveguides may include structures (e.g., layers of light absorbing materials, rough surfaces, light out-coupling optical elements, and/or light trapping microstructures) along their edges to mitigate reflections and improve the display contrast.

IPC Classes  ?

  • F21V 8/00 - Use of light guides, e.g. fibre optic devices, in lighting devices or systems
  • B29D 11/00 - Producing optical elements, e.g. lenses or prisms
  • G02B 27/42 - Diffraction optics

100.

METHOD OF WAKING A DEVICE USING SPOKEN VOICE COMMANDS

      
Application Number 18418131
Status Pending
Filing Date 2024-01-19
First Publication Date 2024-05-16
Owner Magic Leap, Inc. (USA)
Inventor
  • Roach, David Thomas
  • Jot, Jean-Marc
  • Lee, Jung-Suk

Abstract

Disclosed herein are systems and methods for processing speech signals in mixed reality applications. A method may include receiving an audio signal; determining, via first processors, whether the audio signal comprises a voice onset event; in accordance with a determination that the audio signal comprises the voice onset event: waking a second one or more processors; determining, via the second processors, that the audio signal comprises a predetermined trigger signal; in accordance with a determination that the audio signal comprises the predetermined trigger signal: waking third processors; performing, via the third processors, automatic speech recognition based on the audio signal; and in accordance with a determination that the audio signal does not comprise the predetermined trigger signal: forgoing waking the third processors; and in accordance with a determination that the audio signal does not comprise the voice onset event: forgoing waking the second processors.
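
The cascade is straightforward to sketch: a cheap, always-on onset detector gates a wake-word check, which in turn gates full speech recognition, so the more expensive stages stay asleep unless the earlier ones fire. The detector functions below are crude placeholders standing in for the system's actual models, and the thresholds are invented.

```python
import numpy as np

def detect_voice_onset(frame, energy_threshold=0.01):
    """Stage 1 (always on, lowest power): crude energy-based onset detector."""
    return float(np.mean(frame ** 2)) > energy_threshold

def detect_trigger_phrase(frames):
    """Stage 2 (woken on demand): placeholder for a wake-word model."""
    return len(frames) >= 3          # stand-in for a real keyword check

def run_full_asr(frames):
    """Stage 3 (highest power): placeholder for full speech recognition."""
    return "<transcription of %d frames>" % len(frames)

def process_audio(stream):
    """Cascade: later stages are woken only if earlier stages fire."""
    buffered = []
    for frame in stream:
        if not detect_voice_onset(frame):
            continue                                  # stages 2 and 3 stay asleep
        buffered.append(frame)
        if detect_trigger_phrase(buffered):           # stage 2 woken here
            return run_full_asr(buffered)             # stage 3 woken here
    return None

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    silence = [rng.normal(0, 0.01, 160) for _ in range(5)]
    speech = [rng.normal(0, 0.5, 160) for _ in range(4)]
    print(process_audio(silence + speech))
```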

IPC Classes  ?
