Verb Surgical Inc.

United States of America


1-100 of 127 for Verb Surgical Inc.
Query: Patent, World - WIPO
Date
  • New (last 4 weeks): 1
  • 2024 April (MTD): 1
  • 2024 January: 2
  • 2024 (YTD): 3
  • 2023: 17
IPC Class
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery: 61
  • A61B 34/30 - Surgical robots: 59
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets: 44
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups, e.g. for luxation treatment or for protecting wound edges: 34
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis: 24

1.

METHOD AND SYSTEM FOR CONTROLLING AND DISPLAYING VIDEO STREAMS

      
Application Number IB2023060348
Publication Number 2024/079710
Status In Force
Filing Date 2023-10-13
Publication Date 2024-04-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Xu, Yiming
  • Bork, Felix Jonas
  • Hanuschek, Axel
  • Kojcev, Risto
  • Fuerst, Bernhard Adolf

Abstract

A method performed by a video controller. The method receives a first video stream captured by an endoscope of a surgical system, and receives a second video stream that comprises surgical data. The method displays the second video stream superimposed above an area of the first video stream, and determines that the second video stream is to cease being superimposed. Responsive to determining that the second video stream is to cease being superimposed, the method continues to display the first video stream.

IPC Classes

  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor

2.

METHOD AND SYSTEM FOR DETECTING LIQUID IMMERSION OF AN END EFFECTOR OF AN ULTRASONIC INSTRUMENT

      
Application Number IB2023057072
Publication Number 2024/013643
Status In Force
Filing Date 2023-07-10
Publication Date 2024-01-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Farvardin, Amirhossein
  • Gonenc, Berk
  • Lucas, Guion Y.

Abstract

A method performed by a surgical system. The method determines one or more characteristics of an end effector of an ultrasonic instrument and determines that the end effector is at least partially submerged within a liquid based on the determined one or more characteristics. In response, the method displays a notification on a display of the surgical system indicating that the end effector is at least partially submerged within the liquid.

IPC Classes

  • A61B 17/32 - Surgical cutting instruments
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

3.

METHOD AND SYSTEM FOR DETECTING THAT AN ULTRASONIC INSTRUMENT IS IN CONTACT WITH AN OBJECT

      
Application Number IB2023057077
Publication Number 2024/013647
Status In Force
Filing Date 2023-07-10
Publication Date 2024-01-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Farvardin, Amirhossein
  • Gonenc, Berk

Abstract

A method performed by a surgical system. The method determines a temperature of an end effector of an ultrasonic instrument based on one or more characteristics of the end effector. The method determines that the end effector is in contact with an object, and, in response to determining that the end effector is in contact with the object and that the temperature is greater than a threshold temperature, presents a notification indicating that the end effector is too hot to be in contact with the object.

IPC Classes

  • A61B 17/32 - Surgical cutting instruments
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

4.

ACTUATION COMBINER MECHANISM FOR A SURGICAL TOOL

      
Application Number IB2023054793
Publication Number 2023/218349
Status In Force
Filing Date 2023-05-09
Publication Date 2023-11-16
Owner VERB SURGICAL INC. (USA)
Inventor Matilla, Jose Luis De Cordoba

Abstract

A surgical tool for a surgical robotic system, the surgical tool comprising: a surgical tool grasper having a jaw operable to perform a surgical procedure; a handle coupled to the surgical tool grasper and having a lever operable to actuate the jaw; and an actuation combiner mechanism coupled to the lever and operable to combine a first actuation force input of a lever input link from the lever with a second actuation force input of a mechanical input link from a mechanical actuator into an output link to control the operation of the jaw or the lever. Other aspects are also described and claimed.

IPC Classes

  • A61B 17/29 - Forceps for use in minimally invasive surgery
  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

5.

METHOD AND SYSTEM FOR ESTIMATING TEMPERATURE OF AN ULTRASONIC INSTRUMENT

      
Application Number IB2023052083
Publication Number 2023/203396
Status In Force
Filing Date 2023-03-06
Publication Date 2023-10-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Farvardin, Amirhossein
  • Gonenc, Berk
  • Conlon, Sean
  • Fuerst, Bernhard
  • Boronyak, Steven

Abstract

A method performed by a surgical system. The method determines that an ultrasonic instrument is in a low-power state. The method determines a resonance frequency of an end effector of the ultrasonic instrument and determines a temperature of the end effector based on the resonance frequency. A notification is displayed on a display of the surgical system based on the temperature.

IPC Classes

  • A61B 17/32 - Surgical cutting instruments
  • A61B 18/04 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

6.

METHOD AND SYSTEM FOR MODEL-BASED TEMPERATURE ESTIMATION OF AN ULTRASONIC INSTRUMENT

      
Application Number IB2023053101
Publication Number 2023/203405
Status In Force
Filing Date 2023-03-29
Publication Date 2023-10-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Gonenc, Berk
  • Farvardin, Amirhossein
  • Conlon, Sean
  • Boronyak, Steven

Abstract

A method performed by a surgical system that includes an ultrasonic instrument with an end effector. The method determines a change in resonance frequency of the end effector while the ultrasonic instrument is in either 1) a high-power state in which the ultrasonic instrument draws a first current that causes the end effector to produce heat or 2) a low-power state in which the ultrasonic instrument draws a second current, less than the first, that does not cause the end effector to produce heat. The method determines a temperature of the end effector by applying the change in resonance frequency to a hysteresis model that includes a hysteretic relationship between changes in resonance frequency of the end effector and corresponding temperatures of the end effector, and outputs a notification based on the temperature.
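The mapping the abstract describes, from a resonance-frequency shift to a temperature via a direction-dependent (hysteretic) calibration, can be sketched as follows. The branch data, function name, and numeric values are illustrative assumptions, not figures from the patent.

```python
import numpy as np

# Assumed calibration data: resonance-frequency shift (Hz) versus end-effector
# temperature (deg C), with one branch per direction of the hysteresis loop
# (heating while the instrument is active, cooling while it idles).
HEATING_BRANCH = (np.array([0.0, -50.0, -120.0, -200.0]),   # delta-f, decreasing
                  np.array([25.0, 80.0, 150.0, 220.0]))     # temperature
COOLING_BRANCH = (np.array([0.0, -40.0, -100.0, -200.0]),
                  np.array([25.0, 95.0, 170.0, 220.0]))

def estimate_temperature(delta_f: float, heating: bool) -> float:
    """Map a resonance-frequency shift to a temperature using the hysteresis
    branch that matches the instrument's current power state."""
    freq_pts, temp_pts = HEATING_BRANCH if heating else COOLING_BRANCH
    # np.interp needs increasing x, so interpolate over the negated shift.
    return float(np.interp(-delta_f, -freq_pts, temp_pts))

if __name__ == "__main__":
    print(estimate_temperature(delta_f=-110.0, heating=True))   # heating branch
    print(estimate_temperature(delta_f=-110.0, heating=False))  # same shift, hotter estimate
```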

IPC Classes

  • A61B 17/32 - Surgical cutting instruments
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

7.

ANNOTATION SYSTEM FOR SURGICAL CONTENT

      
Application Number IB2023054039
Publication Number 2023/203513
Status In Force
Filing Date 2023-04-20
Publication Date 2023-10-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Sahni, Nishant Shailesh
  • Kasravi, Peijmon
  • Sturgeon, Darrick Tyler
  • Barker, Jocelyn Elaine

Abstract

An annotation system facilitates collection of labels for images, video, or other content items relevant to training machine learning models associated with surgical applications or other medical applications. The annotation system enables an administrator to configure annotation jobs associated with training a machine learning model. The job configuration controls presentation of content items to various participating annotators via an annotation application and collection of the labels via a user interface of the annotation application. The annotation application enables the participating annotators to provide inputs in a simple and efficient manner, such as by providing gesture-based inputs or selecting graphical elements associated with different possible labels.

IPC Classes

  • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
  • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
  • G16H 10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • G06N 20/00 - Machine learning
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer

8.

VIDEO-BASED SURGICAL SKILL ASSESSMENT USING TOOL TRACKING

      
Application Number IB2023052782
Publication Number 2023/180939
Status In Force
Filing Date 2023-03-21
Publication Date 2023-09-28
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fathollahi Ghezelghieh, Mona
  • Sarhan, Mohammad Hasan
  • Barker, Jocelyn
  • Dimonte, Lela

Abstract

Disclosed are various systems and techniques for performing video-based surgeon technical-skill assessments and classifications. In one aspect, a process for classifying a surgeon's technical skill in performing a surgery is disclosed. During operation, the process receives a tool-motion track comprising a sequence of detected tool motions of a surgeon performing a surgery with a surgical tool. The process then generates a sequence of multi-channel feature matrices to mathematically represent the tool-motion track. Next, the process performs a one-dimensional (1D) convolution operation on the sequence of multi-channel feature matrices to generate a sequence of context-aware multi-channel feature representations of the tool-motion track. The sequence of context-aware multi-channel feature representations is subsequently processed by a transformer model to generate the skill classification, wherein the transformer model is trained to identify and focus on a subset of tool motions in the sequence of detected tool motions that are most relevant to the skill classification.
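A minimal sketch of the pipeline the abstract outlines, a 1D convolution over the tool-motion track followed by a transformer encoder and a classification head, written in PyTorch. Channel counts, layer sizes, and the number of skill classes are assumptions for illustration.

```python
import torch
import torch.nn as nn

class SkillClassifier(nn.Module):
    """Sketch: a 1D convolution builds context-aware features per time step,
    a transformer encoder attends over the whole track, and a linear head
    produces skill-class logits."""
    def __init__(self, in_channels=8, d_model=64, num_classes=3):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, d_model, kernel_size=5, padding=2)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, track):                    # track: (batch, time, channels)
        x = self.conv(track.transpose(1, 2))     # -> (batch, d_model, time)
        x = self.encoder(x.transpose(1, 2))      # -> (batch, time, d_model)
        return self.head(x.mean(dim=1))          # pool over time -> logits

if __name__ == "__main__":
    # Toy batch: 2 tracks, 100 time steps, 8 motion features per step (assumed).
    print(SkillClassifier()(torch.randn(2, 100, 8)).shape)  # torch.Size([2, 3])
```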

IPC Classes

  • G16H 50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G06T 7/10 - Segmentation; Edge detection
  • G06N 20/00 - Machine learning
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

9.

VIDEO-BASED ANALYSIS OF STAPLING EVENTS DURING A SURGICAL PROCEDURE USING MACHINE LEARNING

      
Application Number IB2023052826
Publication Number 2023/180963
Status In Force
Filing Date 2023-03-22
Publication Date 2023-09-28
Owner VERB SURGICAL INC. (USA)
Inventor
  • Sturgeon, Darrick Tyler
  • Barker, Jocelyn
  • Goel, Varun Kejriwal
  • Aronhalt, Taylor W.

Abstract

An analysis system trains a machine learning model to detect stapling events from a video of a surgical procedure. The machine learning model detects times when stapling events occur as well as one or more characteristics of each stapling event, such as the length of staples, clamping time, or other characteristics. The machine learning model is trained, through a learning process, on videos of surgical procedures identifying when stapling events occurred. The machine learning model may be applied to an input video to detect a sequence of stapler events. Stapler event sequences may furthermore be analyzed and/or aggregated to generate various analytical data relating to the surgical procedures for applications such as inventory management, performance evaluation, or predicting patient outcomes.

IPC Classes

  • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
  • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
  • G16H 40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
  • G06N 20/00 - Machine learning
  • A61B 17/068 - Surgical staplers
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations

10.

APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE INSTRUMENT TRACKING AND INFORMATION VISUALIZATION

      
Application Number IB2023051857
Publication Number 2023/166417
Status In Force
Filing Date 2023-02-28
Publication Date 2023-09-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Bork, Felix Jonas
  • Kojcev, Risto
  • Fuerst, Bernhard Adolf
  • Gonenc, Berk

Abstract

Systems and methods for intraoperative tracking and visualization are disclosed. A current minimally invasive surgical (MIS) instrument pose may be determined based on a live intraoperative input video stream comprising a current image frame captured by a MIS camera. In addition, an instrument activation state and at least one parameter value associated with the instrument may also be determined. Intraoperative graphic visualization enhancements may be determined based on the activation state of the instrument, and/or a comparison of parameter values with corresponding parametric thresholds. The visualization enhancements may be applied to a current graphics frame. The current graphics frame may also include visualization enhancements related to proximate anatomical structures with proximity determined from the instrument pose and an anatomical model. The current graphics frame may be blended with the current input image frame to obtain an output blended image frame, which may form part of an output video stream.

IPC Classes

  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

11.

ROBUST OPERATING ROOM VIDEO ANONYMIZATION BASED ON ENSEMBLE DEEP LEARNING

      
Application Number IB2022061655
Publication Number 2023/126716
Status In Force
Filing Date 2022-12-01
Publication Date 2023-07-06
Owner VERB SURGICAL INC. (USA)
Inventor Torabi, Meysam

Abstract

Disclosed are various face-detection and human de-identification systems and techniques based on deep learning. In one aspect, a process for de-identifying people captured in an operating room (OR) video is disclosed. This process can begin by receiving a sequence of video frames from an OR video. Next, the process applies a first machine-learning face detector based on a first deep-learning model to each video frame in the sequence of video frames to generate a first set of detected faces. The process further applies a second machine-learning face detector to the sequence of video frames to generate a second set of detected faces, wherein the second machine-learning face detector is constructed based on a second deep-learning model different from the first deep-learning model. The process subsequently de-identifies the received sequence of video frames by blurring out both the first set of detected faces and the second set of detected faces.
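The ensemble idea, running two independent face detectors and blurring the union of their detections, can be sketched as below. The detector callables are stand-ins (assumptions); only the blurring and union logic is shown, not the two deep-learning models themselves.

```python
import cv2
import numpy as np

def deidentify_frame(frame, detectors):
    """Blur every face box returned by any detector in the ensemble.
    Each detector is a callable frame -> list of (x, y, w, h) boxes."""
    out = frame.copy()
    for detect in detectors:
        for (x, y, w, h) in detect(frame):
            roi = out[y:y + h, x:x + w]
            if roi.size:                         # skip degenerate boxes
                out[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (21, 21), 0)
    return out

if __name__ == "__main__":
    # Stand-in detectors returning fixed boxes, just to exercise the function.
    fake_a = lambda img: [(10, 10, 60, 60)]
    fake_b = lambda img: [(80, 30, 40, 40)]
    blurred = deidentify_frame(np.zeros((120, 160, 3), np.uint8), [fake_a, fake_b])
    print(blurred.shape)
```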

IPC Classes

  • G06T 5/20 - Image enhancement or restoration by the use of local operators
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
  • G06N 3/04 - Architecture, e.g. interconnection topology
  • G06N 3/08 - Learning methods
  • G06N 20/20 - Ensemble learning

12.

REAL-TIME SURGICAL TOOL PRESENCE/ABSENCE DETECTION IN SURGICAL VIDEOS

      
Application Number IB2022061837
Publication Number 2023/126721
Status In Force
Filing Date 2022-12-06
Publication Date 2023-07-06
Owner VERB SURGICAL INC. (USA)
Inventor
  • Torabi, Meysam
  • Goel, Varun Kejriwal
  • Fer, Danyal
  • Barker, Jocelyn
  • Ghanem, Amer
  • Timm, Richard W.

Abstract

Embodiments described herein provide various techniques and systems for building machine-learning surgical tool presence/absence detection models for processing surgical videos and predicting whether a surgical tool is present or absent in each video frame of a surgical video. In one aspect, a process for ensuring patient safety during a laparoscopic or robotic surgery involving an energy tool is disclosed. The process can begin by receiving a real-time control signal indicating an operating state of an energy tool during the surgery. Next, the process receives real-time endoscope video images of the surgery. The process simultaneously applies a machine-learning surgical tool presence/absence detection model to the real-time endoscope video images to generate real-time decisions on a location of the energy tool in the real-time endoscope video images. The process then checks the real-time control signal against the real-time decisions to identify an unsafe event and takes an appropriate action when an unsafe event is identified.
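The safety cross-check described in the abstract reduces to comparing two real-time signals for each frame; a minimal sketch of that check, with the message text as an assumed placeholder:

```python
def check_energy_tool_safety(tool_energized: bool, tool_visible: bool) -> str:
    """Cross-check the real-time control signal against the per-frame
    presence/absence decision: energizing a tool the model cannot see in the
    endoscope view is flagged as an unsafe event."""
    if tool_energized and not tool_visible:
        return "unsafe: energy tool active outside the camera view"
    return "ok"

# Example: control signal says the tool is firing, model says it is off-screen.
print(check_energy_tool_safety(tool_energized=True, tool_visible=False))
```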

IPC Classes

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 18/12 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
  • A61B 18/00 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body

13.

TRACKING MULTIPLE SURGICAL TOOLS IN A SURGICAL VIDEO

      
Application Number IB2022061944
Publication Number 2023/105467
Status In Force
Filing Date 2022-12-08
Publication Date 2023-06-15
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fathollahi Ghezelghieh, Mona
  • Barker, Jocelyn

Abstract

Disclosed are various systems and techniques for tracking surgical tools in a surgical video. In one aspect, the system begins by receiving one or more established tracks for one or more previously-detected surgical tools in the surgical video. The system then processes a current frame of the surgical video to detect one or more objects using a first deep-learning model. Next, for each detected object in the one or more detected objects, the system performs the following steps to assign the detected object to the right track: (1) computing a semantic similarity between the detected object and each of the one or more established tracks; (2) computing a spatial similarity between the detected object and the latest predicted location for each of the one or more established tracks; and (3) attempting to assign the detected object to one of the one or more established tracks based on the computed semantic and spatial similarity metrics.
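A hedged sketch of the assignment step, combining a semantic cost (appearance features) with a spatial cost (distance to each track's latest predicted center) and solving the matching with SciPy's linear-sum-assignment solver. The weights, gating threshold, and feature dimensions are assumptions, not values from the patent.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_detections(det_feats, det_centers, trk_feats, trk_centers,
                      w_sem=0.5, w_spa=0.5, max_cost=0.8):
    """Assign detections to tracks by minimizing a weighted sum of
    (1 - cosine similarity) and a normalized center distance."""
    sem = 1.0 - (det_feats @ trk_feats.T) / (
        np.linalg.norm(det_feats, axis=1, keepdims=True)
        * np.linalg.norm(trk_feats, axis=1, keepdims=True).T)
    spa = np.linalg.norm(det_centers[:, None, :] - trk_centers[None, :, :], axis=2)
    cost = w_sem * sem + w_spa * spa / (spa.max() + 1e-9)
    rows, cols = linear_sum_assignment(cost)
    # Reject pairings that cost too much; such detections would start new tracks.
    return [(d, t) for d, t in zip(rows, cols) if cost[d, t] <= max_cost]

if __name__ == "__main__":
    dets, trks = np.random.rand(3, 16), np.random.rand(2, 16)
    print(assign_detections(dets, np.random.rand(3, 2), trks, np.random.rand(2, 2)))
```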

IPC Classes

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
  • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

14.

INTEGRATED SENSORS FOR SURGICAL STAPLERS

      
Application Number IB2022058731
Publication Number 2023/067412
Status In Force
Filing Date 2022-09-15
Publication Date 2023-04-27
Owner VERB SURGICAL INC. (USA)
Inventor
  • Gonenc, Berk
  • Sanker, Benjamin Alan
  • Cordoba, Jose Luis
  • Garcia Kilroy, Pablo

Abstract

A surgical stapler for a surgical robotic system, the surgical stapler including a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and a force sensor operable to detect a force applied to the jaw.

IPC Classes

  • A61B 34/30 - Surgical robots
  • A61B 17/068 - Surgical staplers
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

15.

INTEGRATED SENSORS FOR ENERGY TOOLS

      
Application Number IB2022058650
Publication Number 2023/067410
Status In Force
Filing Date 2022-09-14
Publication Date 2023-04-27
Owner VERB SURGICAL INC. (USA)
Inventor
  • Sanker, Benjamin Alan
  • Gonenc, Berk
  • Cordoba, Jose Luis
  • Garcia Kilroy, Pablo

Abstract

An energy tool for a surgical robotic system, the energy tool comprising: a jaw coupled to a base, the jaw having a first anvil that moves relative to a second anvil between an open position and a closed position; and at least one of a force sensor, a temperature sensor and an acoustic sensor coupled to the jaw.

IPC Classes

  • A61B 34/30 - Surgical robots
  • A61B 18/14 - Probes or electrodes therefor
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 5/00 - Measuring for diagnostic purposes ; Identification of persons

16.

HARDSTOP DETECTION AND HANDLING FOR SURGICAL TOOL

      
Application Number IB2022058649
Publication Number 2023/062457
Status In Force
Filing Date 2022-09-14
Publication Date 2023-04-20
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhang, Xiaobin
  • Zhou, Renbin
  • Hariri, Alireza

Abstract

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example system for detecting a hardstop for a surgical tool includes a wrist connected to and driven by a plurality of cables of a tool driver, a plurality of sensors configured to detect forces associated with the plurality of cables, and one or more processors configured to perform a comparison of the forces associated with the plurality of cables, select a highest-tension cable from the plurality of cables based on that comparison, set a force assigned to the highest-tension cable to a predetermined value, calculate a variable torque threshold for the wrist based on a sum of the predetermined value for the highest-tension cable and the detected forces for the remaining cables in the plurality of cables, receive a joint torque value for the wrist, compare the received joint torque value for the wrist to the variable wrist torque threshold, and identify a hardstop based on that comparison.
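A minimal sketch of the variable-threshold logic described in the abstract; the cap force and safety margin below are assumed values, not figures from the patent.

```python
import numpy as np

def detect_hardstop(cable_forces, wrist_torque, cap_force=12.0, margin=1.2):
    """The highest-tension cable is clamped to a fixed cap, the remaining
    measured tensions are summed with it to form a variable wrist-torque
    threshold, and a hardstop is flagged when the reported joint torque
    exceeds that threshold."""
    forces = np.asarray(cable_forces, dtype=float)
    adjusted = forces.copy()
    adjusted[int(np.argmax(forces))] = cap_force   # assign the predetermined value
    threshold = margin * adjusted.sum()            # variable wrist-torque threshold
    return abs(wrist_torque) > threshold

print(detect_hardstop([3.0, 4.5, 9.8, 2.1], wrist_torque=30.0))
```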

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • B25J 3/00 - Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
  • B25J 11/00 - Manipulators not otherwise provided for
  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
  • B25J 9/16 - Programme controls
  • B25J 15/02 - Gripping heads servo-actuated

17.

APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE VISUALIZATION

      
Application Number IB2022059862
Publication Number 2023/062594
Status In Force
Filing Date 2022-10-14
Publication Date 2023-04-20
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard Adolf
  • Gonenc, Berk
  • Kojcev, Risto
  • Kilroy, Pablo Garcia
  • Sanker, Benjamin
  • Bork, Felix Jonas

Abstract

Errors in a blended stream that would result in non-display or obscuring of a live video stream from a medical device may be automatically detected, and a failover stream corresponding to the first live video stream may be displayed to medical personnel. For example, one or more second input streams that are being blended may contain no data or invalid data, which may result in the blended stream not displaying (or obscuring) the live video stream if the blended stream were displayed. Switching from blending to a failover buffer may occur within the time to process a single video image frame. Upon detection (prior to display) that the blended stream would not display the live video stream, display of the live video stream from the failover buffer may be initiated. Other aspects are also described and claimed.

IPC Classes

  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

18.

PROJECTION OPERATOR FOR INVERSE KINEMATICS OF A SURGICAL ROBOT FOR LOW DEGREE OF FREEDOM TOOLS

      
Application Number IB2022055441
Publication Number 2023/281332
Status In Force
Filing Date 2022-06-13
Publication Date 2023-01-12
Owner VERB SURGICAL INC. (USA)
Inventor
  • Shrivastava, Apoorv
  • Yu, Haoran
  • Zhou, Renbin
  • Yun, Seungkook
  • Klingbeil, Ellen

Abstract

For teleoperation of a surgical robotic system, the control of the surgical robotic system accounts for a limited degree of freedom of a tool in a surgical robotic system. A projection from the greater DOF of the user input commands to the lesser DOF of the tool is included within or as part of the inverse kinematics. The projection identifies feasible motion in the end-effector domain. This projection allows for a general solution that works for tools having different degrees of freedom and will converge on a solution.

IPC Classes

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots

19.

PROJECTION OF USER INTERFACE POSE COMMAND TO REDUCED DEGREE OF FREEDOM SPACE FOR A SURGICAL ROBOT

      
Application Number IB2022055446
Publication Number 2023/281333
Status In Force
Filing Date 2022-06-13
Publication Date 2023-01-12
Owner VERB SURGICAL INC. (USA)
Inventor
  • Klingbeil, Ellen
  • Zhou, Renbin
  • Yu, Haoran

Abstract

For teleoperation of a surgical robotic system, the user command for the pose of the end effector is projected into a subspace reachable by the end effector. For example, a user command with six DOF is projected to a five DOF subspace. The six DOF user interface device may be used to more intuitively control, based on the projection, the end effector with the limited DOF relative to the user interface device.

IPC Classes

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots

20.

SCALABLE FILTERING INFRASTRUCTURE FOR VARIABLE CONTROL RATES IN A DISTRIBUTED SYSTEM SUCH AS A SURGICAL ROBOTIC SYSTEM

      
Application Number IB2022055438
Publication Number 2023/275645
Status In Force
Filing Date 2022-06-13
Publication Date 2023-01-05
Owner VERB SURGICAL INC. (USA)
Inventor
  • Shrivastava, Apoorv
  • Yu, Haoran
  • Zhou, Renbin
  • Sen, H. Tutkun
  • Hariri, Alireza

Abstract

For a scalable filtering infrastructure, a library of filters each usable at different control rates is provided by defining filters in a continuous time mode despite eventual use for digital filtering. For implementation, a filter is selected and discretized for the desired control rate. The discretized filter is then deployed as a discrete time realization for convolution. In a distributed system with multiple control rates, the library may be used to more rapidly and conveniently generate the desired filters.
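The discretize-per-control-rate idea can be illustrated with SciPy: a filter defined once in continuous time is converted to a discrete-time realization for whichever control rate a given node runs at. The first-order low-pass and the rates shown are illustrative choices, not the patent's filter library.

```python
import numpy as np
from scipy.signal import cont2discrete, lfilter

def make_lowpass(cutoff_hz, control_rate_hz):
    """Discretize a continuous-time first-order low-pass H(s) = wc / (s + wc)
    for a given control rate, returning digital filter coefficients."""
    wc = 2.0 * np.pi * cutoff_hz
    num_d, den_d, _ = cont2discrete(([wc], [1.0, wc]),
                                    dt=1.0 / control_rate_hz, method='bilinear')
    return num_d.flatten(), den_d

if __name__ == "__main__":
    x = np.random.randn(1000)
    for rate in (1000.0, 4000.0):            # two nodes with different control rates
        b, a = make_lowpass(cutoff_hz=20.0, control_rate_hz=rate)
        y = lfilter(b, a, x)                 # discrete-time realization (convolution)
        print(rate, b, a, y[:2])
```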

IPC Classes

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • G16H 40/40 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

21.

DEEP-LEARNING-BASED REAL-TIME REMAINING SURGERY DURATION (RSD) ESTIMATION

      
Application Number IB2022050386
Publication Number 2022/200864
Status In Force
Filing Date 2022-01-18
Publication Date 2022-09-29
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fathollahi Ghezelghieh, Mona
  • Barker, Jocelyn Elaine
  • Garcia Kilroy, Pablo Eduardo

Abstract

NNNNN frames.

IPC Classes

  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
  • A61B 1/313 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

22.

ROBOT-ASSISTED SETUP FOR A SURGICAL ROBOTIC SYSTEM

      
Application Number IB2022050385
Publication Number 2022/185127
Status In Force
Filing Date 2022-01-18
Publication Date 2022-09-09
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard Adolf
  • Johnson, Eric Mark

Abstract

A method performed by a surgical robotic system. The method determines a surgical procedure that is to be performed using a robotic arm. The method determines, for the robotic arm, a planned trajectory based on the surgical procedure, where the planned trajectory is from a current pose of the robotic arm to a predefined procedure pose that is within a threshold distance from a trocar that is coupled to a patient. The method drives the robotic arm along the planned trajectory from the current pose to the predefined procedure pose.

IPC Classes

  • A61B 34/30 - Surgical robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

23.

ARTICULATION JOINT HARDSTOP HANDLING FOR SURGICAL TOOL

      
Application Number IB2021059041
Publication Number 2022/074525
Status In Force
Filing Date 2021-10-01
Publication Date 2022-04-14
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhang, Xiaobin
  • Chatzigeorgiou, Dimitri

Abstract

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example system for handling hardstops includes one or more processors configured to calculate an articulation joint position for the articulation drive disk or the one or more corresponding rotary motors, calculate an articulation joint torque for the articulation drive disk or the one or more corresponding rotary motors, determine a torque ratio based on the articulation joint position and the articulation joint torque, and adjust a commanded articulation joint position received from the user based on the torque ratio to compensate for a collision involving the end effector.

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/30 - Surgical robots
  • A61B 17/29 - Forceps for use in minimally invasive surgery
  • B25J 9/16 - Programme controls

24.

NULL SPACE CONTROL FOR END EFFECTOR JOINTS OF A ROBOTIC INSTRUMENT

      
Application Number IB2021059042
Publication Number 2022/074526
Status In Force
Filing Date 2021-10-01
Publication Date 2022-04-14
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhang, Xiaobin
  • Chatzigeorgiou, Dimitri

Abstract

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example method includes providing a redundant degree of freedom (DoF) for an end effector joint of one DoF by driving the joint with two actuators, calculating a position displacement of the joint to effect a desired end effector movement in response to an input command, calculating a first movement of the two actuators based on the position displacement of the joint and a second movement of the two actuators based on a second control objective in a null space corresponding to the redundant DoF, and driving the joint according to the first movement and the second movement to effect the desired end effector movement while accomplishing the second control objective in the null space.
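A small numeric sketch of the null-space idea above: with two actuators mapped to one joint degree of freedom, the pseudo-inverse realizes the commanded joint motion while the null-space projector absorbs a secondary objective (here a pretension bias) without moving the joint. The mapping matrix and gains are assumptions for illustration.

```python
import numpy as np

# Two actuators drive one joint: the joint angle is a 1x2 linear map J of the
# actuator displacements, so J has a one-dimensional null space that can carry
# a secondary objective without affecting the joint.
J = np.array([[0.5, -0.5]])                    # joint angle from two actuator angles

def actuator_command(delta_joint, pretension_step=0.02):
    J_pinv = np.linalg.pinv(J)                 # primary task: realize the joint motion
    null_proj = np.eye(2) - J_pinv @ J         # projector onto the null space of J
    secondary = pretension_step * np.ones(2)   # co-contract both actuators equally
    return (J_pinv @ np.atleast_1d(delta_joint)) + null_proj @ secondary

move = actuator_command(delta_joint=0.1)
print(move, float(J @ move))   # the joint still sees 0.1; the pretension stays in the null space
```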

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 17/29 - Forceps for use in minimally invasive surgery
  • B25J 9/16 - Programme controls
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

25.

LIMITING GRIP FORCE AND MAINTAINING MINIMUM OPENING FORCE OF JAWS IN POSITION CONTROL MODE, AND CONTROLLING GRIP FORCE WHEN TRANSITIONING BETWEEN POSITION CONTROL MODE AND FORCE MODE

      
Application Number IB2021057757
Publication Number 2022/069963
Status In Force
Filing Date 2021-08-24
Publication Date 2022-04-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Ergueta Tejerina, Edgar Ignacio
  • Hariri, Alireza

Abstract

Disclosed are systems and methods for limiting the grip force generated by closing robotic wrist jaws while operating in position mode in which the jaws are commanded to a desired jaw angle prior to being commanded to generate a grip force. Also disclosed are systems and methods for maintaining the opening force generated by robotic wrist jaws operating in position mode in which the jaws are commanded to a desired jaw angle prior to being commanded to generate a grip force. Also disclosed are systems and methods for achieving a smooth transition in the grip force when the wrist jaws transition between the position and force mode.

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots

26.

SYSTEMS AND METHODS FOR DOCKING SURGICAL ROBOTIC ARMS

      
Application Number IB2021058696
Publication Number 2022/070006
Status In Force
Filing Date 2021-09-23
Publication Date 2022-04-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Vargas, Matthew
  • Bajo, Andrea
  • Wang, Nathan
  • Gordon, Alec
  • Sahin, Koray

Abstract

Methods and apparatuses for attaching a cannula to a surgical robotic arm are described and claimed.

IPC Classes

27.

AUGMENTED REALITY HEADSET FOR A SURGICAL ROBOT

      
Application Number IB2021058785
Publication Number 2022/070015
Status In Force
Filing Date 2021-09-27
Publication Date 2022-04-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Song, Tianyu
  • Olson, Blade
  • Fuerst, Bernhard A.
  • Fer, Danyal

Abstract

Disclosed is an augmented reality (AR) headset that provides a wearer with spatial, system, and temporal contextual information of a surgical robotic system to guide the wearer in configuring, operating, or troubleshooting the surgical robotic system prior to, during, or after surgery. The spatial context information may be rendered to display spatially-fixed 3D-generated virtual models of the robotic arms, instruments, bed, and other components of the surgical robotic system that match the actual position or orientation of the surgical robotic system in the AR headset's coordinate frame. The AR headset may communicate with the surgical robotic system to receive real-time state information of the components of the surgical robotic system. The AR headset may use the real-time state information to display context-sensitive user interface information, such as tips, suggestions, and visual or audio cues, on maneuvering the robotic arms and table to their target positions and orientations or for troubleshooting purposes.

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

28.

DEEP DISENGAGEMENT DETECTION DURING TELESURGERY

      
Application Number IB2021058540
Publication Number 2022/064346
Status In Force
Filing Date 2021-09-20
Publication Date 2022-03-31
Owner VERB SURGICAL INC. (USA)
Inventor Torabi, Meysam

Abstract

Disclosed are various user input device (UID) disengagement-detection techniques based on real-time time-series data processing and deep learning. More specifically, various disclosed UID disengagement-detection techniques include training a long short-term memory (LSTM) network-based classifier based on acquired time-series data of UID motions including both surgical motions and docking motions. The trained deep-learning classifier can then be used during teleoperation sessions to monitor the movements of UIDs and continuously classify the real-time UID motions as either teleoperation motions or docking motions. The disclosed disengagement-detection techniques can immediately disengage the UIDs from the surgical tools as soon as the monitored UID motions are classified as docking motions by the trained classifier, thereby preventing unintended surgical tool motions. The disclosed disengagement-detection techniques allow the UIDs and the surgical tools to become disengaged naturally by simply having the user put the UIDs back in their docking positions, without having to take any additional actions.
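A minimal PyTorch sketch of an LSTM classifier over a sliding window of UID motion samples, labeling each window as teleoperation or docking motion; the feature count, window length, and layer sizes are assumptions for illustration.

```python
import torch
import torch.nn as nn

class DisengagementClassifier(nn.Module):
    """Classify a window of UID motion samples as teleoperation or docking."""
    def __init__(self, n_features=6, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)         # classes: [teleoperation, docking]

    def forward(self, window):                   # window: (batch, time, features)
        _, (h_n, _) = self.lstm(window)
        return self.head(h_n[-1])                # logits from the last hidden state

if __name__ == "__main__":
    logits = DisengagementClassifier()(torch.randn(1, 50, 6))
    if logits.argmax(dim=1).item() == 1:         # docking motion detected
        print("disengage UIDs from surgical tools")
```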

IPC Classes

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
  • G16H 40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

29.

3D VISUALIZATION ENHANCEMENT FOR DEPTH PERCEPTION AND COLLISION AVOIDANCE IN AN ENDOSCOPE SYSTEM

      
Application Number US2020050915
Publication Number 2022/055515
Status In Force
Filing Date 2020-09-15
Publication Date 2022-03-17
Owner VERB SURGICAL INC. (USA)
Inventor
  • Xu, Yiming
  • Gonenc, Berk

Abstract

A series of images is obtained from an endoscope. Three-dimensional reconstruction is performed on the series of images to reconstruct anatomy shown in the series of images. A graphic, such as a grid, is rendered based on the three-dimensional reconstruction, over the series of images resulting in an enhanced endoscopic video feed to be shown on a display.

IPC Classes

  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor

30.

USER PRESENCE/ABSENCE RECOGNITION DURING ROBOTIC SURGERIES USING DEEP LEARNING

      
Application Number US2020051159
Publication Number 2022/055516
Status In Force
Filing Date 2020-09-17
Publication Date 2022-03-17
Owner VERB SURGICAL INC. (USA)
Inventor Torabi, Meysam

Abstract

Disclosed are various user-presence/absence recognition techniques based on deep learning. More specifically, various user-presence/absence recognition techniques include building/training a CNN-based image recognition model including a user-presence/absence classifier based on training images collected from the user-seating area of a surgeon console under various clinically-relevant conditions/cases. The trained user-presence/absence classifier can then be used during teleoperation/surgical procedures to monitor/track users in the user-seating area of the surgeon console, and continuously classify the real-time video images of the user-seating area as either a user-presence state or a user-absence state. In some embodiments, the disclosed techniques can be used to detect a user-switching event at the surgeon console when a second user is detected to have entered the user-seating area after a first user is detected to have exited the user-seating area. If the second user is identified as a new user, the disclosed techniques can trigger a recalibration procedure for the new user.

IPC Classes

  • G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints

31.

CONTROL OF A SURGICAL INSTRUMENT HAVING BACKLASH, FRICTION, AND COMPLIANCE UNDER EXTERNAL LOAD IN A SURGICAL ROBOTIC SYSTEM

      
Application Number US2020048224
Publication Number 2022/046058
Status In Force
Filing Date 2020-08-27
Publication Date 2022-03-03
Owner VERB SURGICAL INC. (USA)
Inventor
  • Sen, H. Tutkun
  • Hariri, Alireza

Abstract

For control of a surgical instrument in a surgical robotic system, multiple actuators establish a static pretension by actuating in opposition to each other in torque control. The static pretension reduces or removes the compliance and elasticity, reducing the backlash width. To drive the tool, the actuators are then moved in cooperation with each other in position mode control so that the movement maintains the static pretension while providing precise control.

IPC Classes

32.

CONTROL OF AN ENDOSCOPE BY A SURGICAL ROBOT

      
Application Number US2020048138
Publication Number 2022/046053
Status In Force
Filing Date 2020-08-27
Publication Date 2022-03-03
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhou, Renbin
  • Yun, Seungkook
  • Yu, Haoran
  • Klingbeil, Ellen
  • Shrivastava, Apoorv

Abstract

An endoscope is controlled by a surgical robotic system. A user input with six degrees of freedom maps to control of an endoscope by a robotic arm having a fewer number of degrees of freedom. For example, untethered user interface devices control motion of an endoscope through a series of projections from user command, to endoscope motion, and to joint motion of the robotic arm. The projection from user command to endoscope motion may project a single angular motion from the three angular motions of the user interface devices. The projection may account for the remote center of motion and/or an angular orientation of the view of the endoscope relative to a shaft of the endoscope.

IPC Classes

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/30 - Surgical robots
  • B25J 9/16 - Programme controls
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G16H 40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
  • G16H 40/63 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation

33.

DETECTION OF DISENGAGEMENT IN CABLE DRIVEN TOOL

      
Application Number IB2021057405
Publication Number 2022/034518
Status In Force
Filing Date 2021-08-11
Publication Date 2022-02-17
Owner VERB SURGICAL INC. (USA)
Inventor
  • Maughan, Spencer
  • Hariri, Alireza

Abstract

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. One example system for detecting disengagement of a surgical tool includes an end effector connected to and driven by cables of a tool driver, sensors configured to detect forces associated with the cables, and one or more processors. The one or more processors identify cable tensions derived from the forces detected by the sensors, compare the tensions to a threshold tension value, calculate a velocity norm value based on a vector including the velocity value for each of the cables, compare the velocity norm value to a statistical velocity threshold, and identify a disengagement of at least one of the cables based on the two comparisons.
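A hedged sketch of the two comparisons described above; the direction of the combined test and the threshold values are one plausible reading, assumed for illustration rather than taken from the patent.

```python
import numpy as np

def detect_cable_disengagement(tensions, velocities,
                               min_tension=1.0, min_velocity_norm=0.05):
    """One plausible reading (assumed): flag a disengagement when some cable
    tension drops below a threshold even though the cable velocity norm shows
    the drive is actively moving."""
    tensions = np.asarray(tensions, dtype=float)
    low_tension = tensions < min_tension                       # first comparison
    moving = np.linalg.norm(velocities) > min_velocity_norm    # second comparison
    return bool(low_tension.any() and moving), np.flatnonzero(low_tension)

flag, cables = detect_cable_disengagement([4.2, 0.3, 3.9, 4.0],
                                          [0.20, 0.18, 0.21, 0.20])
print(flag, cables)   # True, cable index 1
```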

IPC Classes

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

34.

SURGICAL ROBOTIC SYSTEM AND METHOD FOR TRANSITIONING CONTROL TO A SECONDARY ROBOT CONTROLLER

      
Application Number US2020044885
Publication Number 2022/031271
Status In Force
Filing Date 2020-08-04
Publication Date 2022-02-10
Owner VERB SURGICAL INC. (USA)
Inventor Desai, Jignesh

Abstract

A robotic surgical system and method are disclosed for transitioning control to a secondary robotic arm controller. In one embodiment, a robotic surgical system comprises a user console comprising a display device and a user input device; a robotic arm configured to be coupled to an operating table; a primary robotic arm controller configured to move the robotic arm in response to a signal received from the user input device at the user console; and a secondary robotic arm controller configured to move the robotic arm in response to a signal received from a user input device remote from the user console. Control over movement of the robotic arm is transitioned from the primary robotic arm controller to the secondary robotic arm controller in response to a failure in the primary robotic arm controller. Other embodiments are provided.

IPC Classes

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • B25J 9/16 - Programme controls

35.

INVERSE KINEMATICS OF A SURGICAL ROBOT FOR TELEOPERATION WITH HARDWARE CONSTRAINTS

      
Application Number US2020038098
Publication Number 2021/251989
Status In Force
Filing Date 2020-06-17
Publication Date 2021-12-16
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhou, Renbin
  • Yu, Haoran
  • Yun, Seungkook Nia
  • Klingbeil, Ellen
  • Shrivastava, Apoorv

Abstract

Various approaches to solve for inverse kinematics may be used for teleoperation of a surgical robotic system. In one approach, an iterative solver solves for the linear component of motion independently from solving for the angular component of motion. One solver may be used to solve for both together. In another approach, all limits (e.g., position, velocity, and acceleration) are handled in one solution. Where a limit is reached, the limit is used as a bound in the intermediate solution, allowing a solution even where a bound is reached. In another approach, a ratio of position limits is used to create a slow-down region near the bounds to more naturally control motion. In yet another approach, the medical-based teleoperation uses a bounded Gauss-Seidel solver, such as with successive over-relaxation.

IPC Classes

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • B25J 9/16 - Programme controls
  • A61B 34/30 - Surgical robots

36.

REMOTE SURGICAL MENTORING USING AUGMENTED REALITY

      
Application Number US2020038088
Publication Number 2021/247052
Status In Force
Filing Date 2020-06-17
Publication Date 2021-12-09
Owner VERB SURGICAL INC. (USA)
Inventor
  • Olson, Blade
  • Fuerst, Bernhard A.

Abstract

A virtual representation of an operating room (OR) is generated based on robot information and sensing of the OR with depth cameras. One of the depth cameras is integrated with a portable electronic device operated by a local user in the operating room. The virtual representation of the OR, with three-dimensional point cloud data, is communicated to a virtual reality headset operated by a remote user. A virtual reality environment is rendered to a display of the virtual reality headset. A virtual representation of the remote user is rendered in augmented reality to a display of the portable electronic device.

IPC Classes

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

37.

DIGITIZATION OF AN OPERATING ROOM

      
Application Number US2020037923
Publication Number 2021/247050
Status In Force
Filing Date 2020-06-16
Publication Date 2021-12-09
Owner VERB SURGICAL INC. (USA)
Inventor
  • Olson, Blade A.
  • Fuerst, Bernhard A.
  • Barthel, Alexander
  • Xu, Yiming
  • Taylor, Giacomo
  • Song, Tianyu

Abstract

A surgical exercise performed with a surgical robotic system is sensed by depth cameras, generating 3D point cloud data. Robot system data associated with the surgical robotic system is logged. Object recognition is performed on image data produced by the one or more depth cameras to recognize objects, including surgical equipment and people, in the operating room (OR). The surgical exercise is digitized by storing a) the 3D point cloud data of unrecognized objects, b) a position and orientation associated with the recognized objects, and c) the robot system data.

IPC Classes

  • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices

38.

PORT PLACEMENT GUIDE BASED ON INSUFFLATED PATIENT TORSO MODEL AND NORMALIZED SURGICAL TARGETS

      
Application Number US2021033517
Publication Number 2021/247248
Status In Force
Filing Date 2021-05-20
Publication Date 2021-12-09
Owner VERB SURGICAL INC. (USA)
Inventor
  • Monteverde, David R.
  • Fer, Danyal
  • Bzostek, Andrew

Abstract

A method for determining surgical port placement for minimally invasive surgery. Based on received measurements, an instance of a parametric torso model is determined that defines an external surface and a visceral surface, each having a dome shape that takes into account an insufflation effect. Normalized surgical target locations in the parametric torso model are determined in response to an identification of a surgical procedure, and are mapped to un-normalized surgical target locations. Permissible port locations on the instance of the parametric torso model are computed based on the characteristics of a surgical tool and on the un-normalized surgical target locations. Other aspects are also described and claimed.

IPC Classes

  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations

39.

REMOTE CENTER OF MOTION CONTROL FOR A SURGICAL ROBOT

      
Application Number US2020029889
Publication Number 2021/216091
Status In Force
Filing Date 2020-04-24
Publication Date 2021-10-28
Owner VERB SURGICAL INC. (USA)
Inventor
  • Gonenc, Berk
  • Xu, Yiming
  • Nicholson, Margaret
  • Kilroy, Pablo Garcia

Abstract

For control about a remote center of motion (RCM) of a surgical robotic system, possible configurations of a robotic manipulator are searched to find the configuration providing the greatest overlap of the workspace of the surgical instrument with the target anatomy. The force at the RCM may be measured, such as with one or more sensors on the cannula or in an adaptor connecting the robotic manipulator to the cannula. The measured force is used to determine a change in the RCM to minimize the force exerted on the patient at the RCM. Given this change, the configuration of the robotic manipulator may be dynamically updated. Various aspects of this RCM control may be used alone or in combination, such as to optimize the alignment of the workspace to the target anatomy, to minimize force at the RCM, and/or to dynamically control the robotic manipulator configuration based on workspace alignment and force measurement.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • B25J 9/16 - Programme controls

40.

MOBILE VIRTUAL REALITY SYSTEM FOR SURGICAL ROBOTIC SYSTEMS

      
Application Number US2020031367
Publication Number 2021/201890
Status In Force
Filing Date 2020-05-04
Publication Date 2021-10-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard
  • Johnson, Eric
  • Garcia Kilroy, Pablo

Abstract

A mobile virtual reality system for simulation, training, or demonstration of a surgical robotic system can include a virtual reality processor. The processor can generate a virtual surgical robot and render the virtual surgical robot on a display. The virtual surgical robot can include one or more virtual surgical tools. A handheld user input device (UID) can sense a hand input from a hand. A foot input device can sense a foot input from a foot. The virtual reality processor can be configured to control a movement or action of the virtual surgical robot based on the hand input, and to change which of the virtual surgical tools is controlled by the one or more handheld UIDs based on the foot input. Other embodiments and aspects are disclosed and claimed.

IPC Classes  ?

  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

41.

TROCAR POSE ESTIMATION USING MACHINE LEARNING FOR DOCKING SURGICAL ROBOTIC ARM TO TROCAR

      
Application Number US2020049764
Publication Number 2021/188146
Status In Force
Filing Date 2020-09-08
Publication Date 2021-09-23
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard A.
  • Moses, Dennis
  • Garcia Kilroy, Pablo

Abstract

A surgical robotic system senses position or orientation of an object, which may be a trocar that has a magnetic field. Magnetic field sensors are coupled to a surgical robotic arm. A machine learning model coupled to the magnetic field sensors is trained to output three-dimensional position and/or three-dimensional orientation of the trocar or other object. Other aspects are also described.
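
As a hedged sketch of the idea of mapping magnetic-field sensor readings to a trocar pose, the snippet below fits a plain linear least-squares model on synthetic data; the sensor count, pose parameterization, and model choice are illustrative assumptions, since the abstract does not specify the machine learning model.

    import numpy as np

    rng = np.random.default_rng(0)
    readings = rng.normal(size=(200, 9))   # e.g. three triaxial field sensors
    poses = rng.normal(size=(200, 6))      # x, y, z plus roll, pitch, yaw

    # Linear least-squares stand-in for the trained model.
    weights, *_ = np.linalg.lstsq(readings, poses, rcond=None)
    print(readings[:1] @ weights)          # predicted 6-DoF pose for one sample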

IPC Classes  ?

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/30 - Surgical robots
  • A61B 17/34 - Trocars; Puncturing needles

42.

DROP DETECTION OF UNGROUNDED MASTER CONTROLLER FOR A SURGICAL ROBOT

      
Application Number US2020030871
Publication Number 2021/188127
Status In Force
Filing Date 2020-04-30
Publication Date 2021-09-23
Owner VERB SURGICAL INC. (USA)
Inventor
  • Miller, Denise Ann
  • Savall, Joan
  • Hellman, Randall Blake

Abstract

Disclosed herein are methods to detect free-falling or other non-surgical motions of the user interface device (UID) of a surgical robotic system, so that the surgical robotic system may pause the robotic arm controlled by the UID to prevent the robotic arm from mimicking the unintentional movement of the UID. Contact sensors embedded in the UID may be used to detect conditions indicating that a user does not possess full control of the UID. After determining that the user does not have full control of the UID, the system may detect whether the UID is experiencing non-surgical motions using motion sensors such as inertial sensors. By conditioning analysis of the data from the motion sensors on the initial determination that the UID is not being held based on the contact sensors, the method increases the robustness of the detection of non-surgical motions and reduces the probability of false positives.
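
A minimal sketch of that conditioning, assuming a simple near-zero-acceleration test for free fall and boolean contact-sensor readings (both are assumptions for illustration):

    FREE_FALL_ACCEL_G = 0.2   # assumed: acceleration magnitude near zero g

    def uid_drop_suspected(contact_sensor_readings, accel_magnitude_g):
        """Analyze motion data only after the contact sensors indicate the
        UID is not fully held; then flag near-zero acceleration as free fall."""
        if all(contact_sensor_readings):
            return False      # user appears to be in control; skip motion check
        return accel_magnitude_g < FREE_FALL_ACCEL_G

    if uid_drop_suspected(contact_sensor_readings=[False, False, True],
                          accel_magnitude_g=0.05):
        print("Non-surgical UID motion suspected: pausing the robotic arm")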

IPC Classes  ?

  • A61B 34/37 - Master-slave robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • B25J 13/02 - Hand grip control means
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B25J 19/06 - Safety devices

43.

INTEGRATED ROBOTIC INSUFFLATION AND SMOKE EVACUATION

      
Application Number US2020032062
Publication Number 2021/188129
Status In Force
Filing Date 2020-05-08
Publication Date 2021-09-23
Owner VERB SURGICAL INC. (USA)
Inventor
  • Russell, Geoffrey Robert
  • Vakharia, Omar J.
  • Magnasco, John H.

Abstract

A surgical robotic system comprising: a robotic arm; a tool drive coupled to the robotic arm; a cannula interface configured to couple a cannula to the tool drive, the cannula interface having a fluid pathway in communication with an interior lumen of the cannula; and an insufflation pathway coupled to the robotic arm, the insufflation pathway having a distal end coupled to the fluid pathway and a proximal end coupled to a surgical insufflator.

IPC Classes  ?

44.

DETECTING CABLE BREAKAGE ON CABLE DRIVEN TOOLS

      
Application Number US2020023595
Publication Number 2021/183152
Status In Force
Filing Date 2020-03-19
Publication Date 2021-09-16
Owner VERB SURGICAL INC. (USA)
Inventor
  • Ergueta Tejerina, Edgar Ignacio
  • Hariri, Alireza

Abstract

A surgical robotic tool used with a surgical robotic system can include cables that effect movement in the surgical robotic tool. A break in any of these cables can be detected by checking a plurality of conditions. A process can compare a) a tension error against a first threshold, b) a rate of change of the sensed tension of the cable against a second threshold, and c) a rate of change of a cable extension error against a third threshold. If all thresholds are exceeded, the process can disable the respective actuator.
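
A minimal sketch of the three-condition check, with threshold values and units that are illustrative assumptions rather than values from the disclosure:

    def cable_break_detected(tension_error, tension_rate, extension_error_rate,
                             tension_error_thresh=5.0,     # N, assumed
                             tension_rate_thresh=50.0,     # N/s, assumed
                             extension_rate_thresh=2.0):   # mm/s, assumed
        """Return True only if all three break indicators exceed their thresholds."""
        return (abs(tension_error) > tension_error_thresh and
                abs(tension_rate) > tension_rate_thresh and
                abs(extension_error_rate) > extension_rate_thresh)

    # Example: a sudden tension drop with a fast-growing extension error.
    if cable_break_detected(tension_error=8.2, tension_rate=-120.0,
                            extension_error_rate=3.5):
        print("Cable break suspected: disabling the respective actuator")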

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

45.

SURGEON DISENGAGEMENT DETECTION DURING TERMINATION OF TELEOPERATION

      
Application Number US2020023208
Publication Number 2021/183150
Status In Force
Filing Date 2020-03-17
Publication Date 2021-09-16
Owner VERB SURGICAL INC. (USA)
Inventor Torabi, Meysam

Abstract

A method for disengagement detection of a surgical instrument of a surgical robotic system, the method comprising: determining whether a user's head is unstable prior to disengagement of a teleoperation mode; determining whether a pressure release has occurred relative to at least one of a first user input device or a second user input device for controlling a surgical instrument of the surgical robotic system during the teleoperation mode; and in response to determining the user's head is unstable or determining the pressure release has occurred, determining whether a distance change between the first user input device and the second user input device indicates the user is performing an unintended action prior to disengagement of the teleoperation mode.
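
A hedged sketch of the decision flow: the inter-UID distance change is examined only after head instability or a pressure release is detected, with the distance threshold being an invented example value:

    def unintended_action_detected(head_unstable, pressure_released,
                                   uid_distance_change_mm,
                                   distance_thresh_mm=30.0):   # assumed
        """Check the distance change between the two UIDs only when head
        instability or a pressure release has already been observed."""
        if not (head_unstable or pressure_released):
            return False
        return abs(uid_distance_change_mm) > distance_thresh_mm

    print(unintended_action_detected(head_unstable=True, pressure_released=False,
                                     uid_distance_change_mm=45.0))   # -> True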

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/37 - Master-slave robots
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

46.

REDUNDANT ROBOT POWER AND COMMUNICATION ARCHITECTURE

      
Application Number US2021021599
Publication Number 2021/183584
Status In Force
Filing Date 2021-03-10
Publication Date 2021-09-16
Owner VERB SURGICAL INC. (USA)
Inventor
  • Rea, Rochelle
  • Zietlow, Klaus
  • Hoffman, Adam
  • Zeng, Yiqi

Abstract

An electronic circuit for a surgical robotic system includes a central power node, a first voltage bus that electrically couples a first power source to the node, a second voltage bus that electrically couples a second power source to the node, and several robotic arms, each of which is electrically coupled to the node via an output circuit breaker and is arranged to draw power from the node. Each bus is arranged to provide power from a respective power source to the node, and each bus has an input circuit breaker that is arranged to limit a first output current flow from the node and into the bus. Each output circuit breaker is arranged to limit a second output current flow from the node and into a respective arm, and is arranged to open in response to a fault occurring within the respective arm while the other breakers remain closed.

IPC Classes  ?

  • A61B 34/37 - Master-slave robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • B25J 9/00 - Programme-controlled manipulators
  • B25J 9/16 - Programme controls
  • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
  • B25J 19/06 - Safety devices
  • A61B 90/57 - Accessory clamps

47.

GRAPHICAL USER GUIDANCE FOR A ROBOTIC SURGICAL SYSTEM

      
Application Number US2020021090
Publication Number 2021/177959
Status In Force
Filing Date 2020-03-05
Publication Date 2021-09-10
Owner VERB SURGICAL INC. (USA)
Inventor
  • Johnson, Eric, Mark
  • Levin, Michal
  • Freiin Von Kapri, Anette, Lia

Abstract

Graphical user guidance for a robotic surgical system is provided. In one embodiment, a graphical user interface for a robotic surgical system comprises a first region and a second region. The first region is used to display an endoscopic view of a surgical site inside a patient taken by an endoscopic camera of the robotic surgical system, and the second region is used to display user feedback information. The graphical user interface overlays a guidance message on top of the endoscopic view of the surgical site in the first region to provide user instructions for interacting with a user input device to engage a robotic arm of the robotic surgical system. Other embodiments are provided.

IPC Classes  ?

  • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]

48.

ROBOTIC SURGICAL SYSTEM AND METHOD FOR PROVIDING A STADIUM VIEW WITH ARM SET-UP GUIDANCE

      
Application Number US2020019214
Publication Number 2021/167619
Status In Force
Filing Date 2020-02-21
Publication Date 2021-08-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Johnson, Eric
  • Brahic, Francois

Abstract

A robotic surgical system and method for providing a stadium view with arm set-up guidance is provided. In one embodiment, a robotic surgical system comprises a plurality of robotic arms, a display device, and a processor. The processor is configured to render, on the display device, a graphical representation of the plurality of robotic arms in their current positions, as well as user-guidance information on how to move the plurality of robotic arms to a different position. Other embodiments are provided.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

49.

METHOD AND SYSTEM FOR DATA EXCHANGE WITH ROBOTIC SURGICAL TOOLS USING NEAR FIELD COMMUNICATION (NFC)

      
Application Number US2020020822
Publication Number 2021/167626
Status In Force
Filing Date 2020-03-03
Publication Date 2021-08-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Jhaveri, Mufaddal
  • Abad, Robert
  • Bosteels, Jan

Abstract

In this patent disclosure, various embodiments of using near-field communication (NFC) to facilitate the transfer of data and power between a robotic surgical system and a surgical tool attached to the robotic surgical system are disclosed. In one aspect, a process for automatically managing surgical tool attachment in a robotic surgical system can begin by detecting an attachment of a surgical tool. The process next establishes a secure near-field communication (NFC) link between a first NFC module embedded in the robotic surgical system and a second NFC module embedded in the surgical tool. Next, the process requests tool calibration data from the surgical tool via the secure NFC link. The process subsequently uses the received tool calibration data to initialize the surgical tool so that the surgical tool is ready for use.
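
A sketch of that attachment sequence, with placeholder callables standing in for the system's actual NFC stack and tool initializer (none of these names come from the disclosure):

    def on_tool_attached(open_secure_nfc_link, request_calibration, initialize_tool):
        link = open_secure_nfc_link()             # establish the secure NFC link
        calibration = request_calibration(link)   # fetch tool calibration data
        initialize_tool(calibration)              # tool is now ready for use
        return calibration

    # Dummy stand-ins so the sketch runs end to end.
    on_tool_attached(lambda: "secure-link",
                     lambda link: {"jaw_offset_deg": 1.2},
                     lambda cal: print("initialized with", cal))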

IPC Classes  ?

  • A61B 34/32 - Surgical robots operating autonomously
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 90/90 - Identification means for patients or instruments, e.g. tags
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

50.

MULTI-CAMERA USER INTERFACE DEVICE CALIBRATION AND TRACKING

      
Application Number US2020022442
Publication Number 2021/167628
Status In Force
Filing Date 2020-03-12
Publication Date 2021-08-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard A.
  • Barthel, Alexander

Abstract

A surgical robotic system includes a surgical robotic arm and a handheld user interface device (UID) having a camera. Images from the camera are processed to detect a marker in the user environment. A pose of the handheld UID is determined based on the detected marker. A movement of the surgical robotic arm is effected based on the pose of the handheld UID.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

51.

THREE DIMENSIONAL MEDICAL IMAGING AND INTERACTIONS

      
Application Number US2021015905
Publication Number 2021/155290
Status In Force
Filing Date 2021-01-29
Publication Date 2021-08-05
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard
  • Barthel, Alexander
  • Johnson, Eric, M.
  • Kojcev, Risto

Abstract

A medical image viewer can render to a display, a three-dimensional view of patient anatomy, a multi planar reconstruction (MPR) view of the patient anatomy, and an intra-operational view. Some of the views can be synchronized to show a common focal point of the anatomy. Other embodiments are described and claimed.

IPC Classes  ?

  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

52.

VIRTUAL REALITY SYSTEMS FOR SIMULATING SURGICAL WORKFLOW WITH PATIENT MODEL AND CUSTOMIZABLE OPERATION ROOM

      
Application Number US2019064713
Publication Number 2021/086417
Status In Force
Filing Date 2019-12-05
Publication Date 2021-05-06
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard
  • Johnson, Eric
  • Garcia Kilroy, Pablo

Abstract

Planning surgical robotic workflow with a surgical robotic system can include generating a virtual surgical environment, the virtual surgical environment including a virtual surgical robotic arm and a virtual patient. A workspace can be determined in the virtual patient. A position of a virtual tool, attached to the virtual surgical robotic arm, can be determined. A position of the virtual surgical robotic arm can be determined to maintain a reach of the virtual tool in the workspace.

IPC Classes  ?

  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

53.

SYSTEMS AND METHODS FOR VISUAL SENSING OF AND DOCKING WITH A TROCAR

      
Application Number US2019059896
Publication Number 2021/086409
Status In Force
Filing Date 2019-11-05
Publication Date 2021-05-06
Owner VERB SURGICAL INC. (USA)
Inventor
  • Gonenc, Berk
  • Liu, Xin
  • Cordoba, Jose Luis
  • Fuerst, Bernhard A.
  • Moses, Dennis
  • Garcia Kilroy, Pablo

Abstract

A surgical robotic system has a tool drive coupled to a distal end of a robotic arm that has a plurality of actuators. The tool drive has a docking interface to receive a trocar. The system also includes one or more sensors that are operable to visually sense a surface feature of the trocar. One or more processors determine a position and orientation of the trocar, based on the visually sensed surface feature. In response, the processor controls the actuators to orient the docking interface to the determined orientation of the trocar and to guide the robotic arm toward the determined position of the trocar. Other aspects are also described and claimed.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 17/34 - Trocars; Puncturing needles
  • A61B 90/90 - Identification means for patients or instruments, e.g. tags
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

54.

ENGAGEMENT AND/OR HOMING OF A SURGICAL TOOL IN A SURGICAL ROBOTIC SYSTEM

      
Application Number US2019058477
Publication Number 2021/080623
Status In Force
Filing Date 2019-10-29
Publication Date 2021-04-29
Owner VERB SURGICAL, INC. (USA)
Inventor
  • Zhou, Renbin
  • Yu, Haoran
  • Kosari, Sina, Nia

Abstract

Engaging and/or homing is provided for a motor control of a surgical tool in a surgical robotic system. Where two or more motors are to control the same motion, the motors may be used to detect engagement even where no physical stop is provided. The motors operate in opposition to each other or in a way that does not attempt the same motion, resulting in one of the motors acting as a stop for the other motor in engagement. A change in motor operation then indicates the engagement. The known angles of engaged motors and the transmission linking the motor drives to the surgical tool indicate the home or current position of the surgical tool.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

55.

REGULATING JOINT SPACE VELOCITY OF A SURGICAL ROBOTIC ARM

      
Application Number US2019062059
Publication Number 2021/080626
Status In Force
Filing Date 2019-11-18
Publication Date 2021-04-29
Owner VERB SURGICAL INC. (USA)
Inventor Yu, Haoran

Abstract

A surgical robotic system has a surgical robotic arm, and a programmed processor that determines a longest principal axis of a velocity ellipsoid for a first configuration of the arm, applies a maximum task space velocity (that is in the direction of the longest principal axis) to an inverse kinematics equation which computes a potential joint space velocity, computes a ratio of i) the potential joint space velocity and ii) a joint space velocity limit of the arm, and applies the ratio to an initial joint space velocity, to produce a regulated joint space velocity. Other aspects are also described and claimed.
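
A hedged numerical sketch of the regulation step (the Jacobian, joint limits, and the exact way the ratio is applied to the initial joint velocity are assumptions for illustration):

    import numpy as np

    def regulated_joint_velocity(jacobian, q_dot_initial, q_dot_limit, v_max):
        # Longest principal axis of the velocity ellipsoid: direction of the
        # largest singular value of the Jacobian.
        U, _, _ = np.linalg.svd(jacobian)
        longest_axis = U[:, 0]
        # Maximum task-space velocity along that axis, mapped through the
        # inverse kinematics (pseudo-inverse) to a potential joint velocity.
        q_dot_potential = np.linalg.pinv(jacobian) @ (v_max * longest_axis)
        # Ratio of the potential joint velocity to the per-joint limit.
        ratio = np.max(np.abs(q_dot_potential) / q_dot_limit)
        # Apply the ratio to the initial joint velocity; scaling down only when
        # the ratio exceeds 1 is an assumed interpretation.
        return q_dot_initial / max(ratio, 1.0)

    J = np.array([[0.8, 0.1, 0.0], [0.0, 0.6, 0.2], [0.1, 0.0, 0.5]])
    print(regulated_joint_velocity(J, np.array([0.4, 0.3, 0.2]),
                                   np.array([1.0, 1.0, 1.0]), v_max=0.5))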

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • B25J 9/16 - Programme controls

56.

HANDHELD USER INTERFACE DEVICE FOR A SURGICAL ROBOT

      
Application Number US2019051260
Publication Number 2021/050087
Status In Force
Filing Date 2019-09-16
Publication Date 2021-03-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard Adolf
  • Garcia Kilroy, Pablo E.

Abstract

Disclosed herein is a mobile interface device to control a robotically-assisted surgical system. The mobile interface device provides a surgeon with a direct view of the surgical robotic system and allows a surgeon to easily and intuitively select, control, or manipulate various target components of the surgical robotic system. The mobile interface device may capture a live image of the surgical robotic system to automatically identify a robotic arm that appears in the center of the captured live image as the target component selected by the surgeon. Based on the current pose or position of the target component, the mobile interface device may generate a list of target poses and control options. The surgeon may select a control option to select a target pose and to manipulate the target component to command a robotically-assisted movement of the selected robotic arm from the current pose to the target pose.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/37 - Master-slave robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

57.

SURGICAL TOOL

      
Application Number US2020045598
Publication Number 2021/030255
Status In Force
Filing Date 2020-08-10
Publication Date 2021-02-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Asadian, Ali
  • Hariri, Alireza
  • Dahdouh, Andrew

Abstract

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. A tool driver is coupled to a distal end of a robotic arm and includes a roll drive disk driven by a rotary motor. One or more processors are configured to detect an attachment of a surgical tool to the tool driver, the surgical tool including a roll tool disk to be engaged with the roll drive disk of the tool driver; actuate the roll drive disk through the rotary motor; determine that a measured torque of the rotary motor exceeds a preset torque threshold for a preset period of time since the actuation; and report a successful engagement between the roll drive disk and the roll tool disk.
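
A minimal sketch of that engagement test, assuming a sampled torque stream and illustrative threshold and dwell-time values:

    def engagement_detected(torque_samples, sample_period_s,
                            torque_thresh=0.15,    # N*m, assumed
                            hold_time_s=0.5):      # assumed dwell time
        """Report engagement once the measured torque stays above the preset
        threshold for the preset period after actuation begins."""
        consecutive = 0
        for torque in torque_samples:
            consecutive = consecutive + 1 if torque > torque_thresh else 0
            if consecutive * sample_period_s >= hold_time_s:
                return True
        return False

    samples = [0.05, 0.08, 0.16, 0.17, 0.18, 0.19, 0.20, 0.21]
    print(engagement_detected(samples, sample_period_s=0.1))   # -> True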

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/29 - Forceps for use in minimally invasive surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

58.

MITIGATING ELECTROMAGNETIC FIELD DISTORTION FOR A SURGICAL ROBOTIC SYSTEM

      
Application Number US2019044523
Publication Number 2021/021199
Status In Force
Filing Date 2019-07-31
Publication Date 2021-02-04
Owner VERB SURGICAL INC. (USA)
Inventor
  • Savall, Joan
  • Miller, Denise Ann
  • Sani, Hamid Reza

Abstract

Surgical systems including a user console for controlling a surgical robotic tool are described. A witness sensor and a reference sensor can be mounted on the user console to measure an electromagnetic field distortion near a location, and to measure deformation of the location, respectively. Distortion in the electromagnetic field can be detected based on the measurements from the witness sensor and the reference sensor. An alert can be generated, or teleoperation of the surgical tool can be adjusted or paused, when a user interface device used to control the surgical tool is within a range of the distortion. The distortion can be from a known source, such as from actuation of a haptic motor of the user interface device, and the user console can adjust the actuation to reduce the likelihood that the distortion will disrupt surgical tool control. Other embodiments are described and claimed.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/37 - Master-slave robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

59.

ROBOTIC ARM HAVING AN EXTENDABLE PRISMATIC LINK

      
Application Number US2019044526
Publication Number 2021/021200
Status In Force
Filing Date 2019-07-31
Publication Date 2021-02-04
Owner VERB SURGICAL INC. (USA)
Inventor
  • Devengenzo, Roman
  • Garcia Kilroy, Pablo

Abstract

Robotic arms and surgical robotic systems incorporating such arms are described. A robotic arm includes a roll joint connected to a prismatic link by a pitch joint and a tool drive connected to the prismatic link by another pitch joint. The prismatic link includes several prismatic sublinks that are connected by a prismatic joint. A surgical tool supported by the tool drive can insert into a patient along an insertion axis through a remote center of motion of the robotic arm. Movement of the robotic arm can be controlled to telescopically move the prismatic sublinks relative to each other by the prismatic joint while maintaining the remote center of motion fixed. Other embodiments are also described and claimed.

IPC Classes  ?

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

60.

STRAIN WAVE GEARING WITH INPUT TO OUTPUT BRAKING

      
Application Number US2019043286
Publication Number 2021/015764
Status In Force
Filing Date 2019-07-24
Publication Date 2021-01-28
Owner VERB SURGICAL INC. (USA)
Inventor Thakkar, Bharat

Abstract

A braking assembly for a strain wave gearing of a surgical robotic manipulator, the braking assembly including a first braking member fixedly coupled to an input portion of a strain wave gearing of a surgical robotic manipulator; and a second braking member fixedly coupled to an output portion of the strain wave gearing, and wherein during a braking operation the first braking member contacts the second braking member to mechanically brake the input portion to the output portion.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/37 - Master-slave robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
  • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
  • F16H 49/00 - Other gearing
  • F16H 55/08 - Profiling
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms

61.

EYE TRACKING CALIBRATION FOR A SURGICAL ROBOTIC SYSTEM

      
Application Number US2019039044
Publication Number 2020/256748
Status In Force
Filing Date 2019-06-25
Publication Date 2020-12-24
Owner VERB SURGICAL INC. (USA)
Inventor
  • Freiin Von Kapri, Anette Lia
  • Miller, Denise
  • Savall, Joan

Abstract

A method for passively calibrating and verifying eye tracking in a surgical robotic system. The gaze of a user facing a user display of a user console of a surgical robotic system is tracked while the user is using a user interface device (UID). When a user interaction with the UID is detected, an expected gaze position on the display of the user console is determined without instructing the user to look at the expected gaze position. The latter becomes a reference gaze point of the user at the time of the detected user action. The measured gaze point of the user is compared with the reference gaze point against an acceptable threshold, and in response to a determination of a mismatch, calibration parameters used by the tracking are adjusted according to the reference gaze point. Other aspects are also described and claimed.
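
A minimal sketch of the passive recalibration step, assuming a simple offset-only calibration model and an invented tolerance radius:

    def update_gaze_calibration(measured_xy, reference_xy, offset_xy,
                                acceptable_radius_px=40.0,   # assumed tolerance
                                gain=0.5):                   # assumed smoothing
        """Nudge the calibration offset toward the reference gaze point when
        the measured gaze falls outside the acceptable radius."""
        dx = reference_xy[0] - measured_xy[0]
        dy = reference_xy[1] - measured_xy[1]
        if (dx * dx + dy * dy) ** 0.5 <= acceptable_radius_px:
            return offset_xy                     # within tolerance: no change
        return (offset_xy[0] + gain * dx, offset_xy[1] + gain * dy)

    print(update_gaze_calibration(measured_xy=(880, 480), reference_xy=(960, 540),
                                  offset_xy=(0.0, 0.0)))     # -> (40.0, 30.0)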

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

62.

METHOD AND SYSTEM FOR SYNCHRONIZING PROCEDURE VIDEOS FOR COMPARATIVE LEARNING

      
Application Number US2019038603
Publication Number 2020/251595
Status In Force
Filing Date 2019-06-21
Publication Date 2020-12-17
Owner VERB SURGICAL INC. (USA)
Inventor
  • Garcia Kilroy, Pablo
  • Venkataraman, Jagadish

Abstract

Embodiments described herein provide various examples of preparing two procedure videos, in particular two surgical procedure videos, for comparative learning. In some embodiments, to allow comparative learning of two recorded surgical videos, each of the two recorded surgical videos is segmented into a sequence of predefined phases/steps. Next, corresponding phases/steps of the two segmented videos are individually time-synchronized in a pair-wise manner so that a given phase/step of one recorded video and the corresponding phase/step of the other segmented video have the same or substantially the same starting and ending times during comparative playback of the two recorded videos. The disclosed comparative-learning techniques can generally be applied to any type of procedure video that can be broken down into a sequence of predefined phases/steps, and to synchronize/slave one such procedure video to another procedure video of the same type at each segmented phase/step in the sequence of predefined phases/steps.
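
A small sketch of per-phase synchronization under these assumptions: both videos are already segmented into the same ordered phases, and the slaved video's playback rate is adjusted per phase so that corresponding phases start and end together (the timestamps below are invented):

    def per_phase_rate_factors(master_phases, slave_phases):
        """For each phase, the playback rate for the slave video so that its
        phase duration matches the master's (boundaries in seconds)."""
        factors = []
        for (m_start, m_end), (s_start, s_end) in zip(master_phases, slave_phases):
            factors.append((s_end - s_start) / (m_end - m_start))
        return factors

    master = [(0, 120), (120, 300), (300, 420)]   # e.g. access, dissection, closure
    slave = [(0, 90), (90, 330), (330, 400)]
    print(per_phase_rate_factors(master, slave))  # e.g. 0.75 plays that phase slower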

IPC Classes  ?

  • H04N 21/845 - Structuring of content, e.g. decomposing content into time segments
  • H04N 21/8547 - Content authoring involving timestamps for synchronizing content
  • G06Q 50/20 - Education

63.

METHOD AND SYSTEM FOR AUTOMATICALLY TURNING ON/OFF A LIGHT SOURCE FOR AN ENDOSCOPE DURING A SURGERY

      
Application Number US2019038606
Publication Number 2020/251596
Status In Force
Filing Date 2019-06-21
Publication Date 2020-12-17
Owner VERB SURGICAL INC. (USA)
Inventor Venkataraman, Jagadish

Abstract

In this patent disclosure, a machine-learning-based system for automatically turning on/off a light source of an endoscope camera during a surgical procedure is disclosed. The disclosed system can receive a sequence of video images captured by the endoscope camera when the light source is turned on. The system next analyzes the sequence of video images using a machine-learning classifier to classify each video image as a first class of image captured inside the patient's body or a second class of image captured outside of the patient's body. The system next determines whether the endoscope camera is inside or outside of the patient's body based on the classified video images. When the endoscope camera is determined to be outside of the patient's body, the system generates a control signal that is used to immediately turn off the light source.
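
A small sketch of the decision logic around the classifier, with a majority-vote window added as an assumed smoothing step against single-frame misclassifications:

    from collections import deque

    def light_control_signal(recent_frame_labels, window=5):
        """Turn the light source off when most recent frames are classified as
        outside the patient's body, on otherwise."""
        recent = deque(recent_frame_labels, maxlen=window)
        outside_votes = sum(1 for label in recent if label == "outside")
        return "LIGHT_OFF" if outside_votes > len(recent) // 2 else "LIGHT_ON"

    print(light_control_signal(["inside", "outside", "outside", "outside", "inside"]))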

IPC Classes  ?

  • A61B 1/06 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
  • A61B 1/313 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
  • A61B 1/045 - Control thereof
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

64.

ESTIMATING JOINT FRICTION AND TRACKING ERROR OF A ROBOTICS END EFFECTOR

      
Application Number US2020036487
Publication Number 2020/247865
Status In Force
Filing Date 2020-06-05
Publication Date 2020-12-10
Owner VERB SURGICAL INC. (USA)
Inventor Hariri, Alireza

Abstract

A computerized method for estimating joint friction in a joint of a robotic wrist of an end effector. Sensor measurements of force or torque in a transmission that mechanically couples a robotic wrist to an actuator, are produced. Joint friction in a joint of the robotic wrist that is driven by the actuator is computed by applying the sensor measurements of force or torque to a closed form mathematical expression that relates transmission force or torque variables to a joint friction variable. A tracking error of the end effector is also computed, using a closed form mathematical expression that relates the joint friction variable to the tracking error. Other aspects are also described and claimed.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

65.

SUPERVISED ROBOT-HUMAN COLLABORATION IN SURGICAL ROBOTICS

      
Application Number US2019036247
Publication Number 2020/246994
Status In Force
Filing Date 2019-06-10
Publication Date 2020-12-10
Owner VERB SURGICAL INC. (USA)
Inventor Dahdouh, Andrew

Abstract

A surgical robotic system offers automation templates, such as surgical task templates, for collaborative control of the robot arms operating in an automated manner. This automated operation, through integration of template selection and programming, may reduce fatigue while maintaining accuracy and dexterity. For more routine parts of the surgery, the surgeon may select a template and use the template interface to set various parameters for a given surgery, such as the force to be applied, order of tasks, trajectory of movement, stop points, and/or distance of any given movement in automatic operation, subject to surgeon verification. By automating parts of the surgery, the surgeon may use direct control for more sensitive aspects of the surgery while having a respite or assistance for more routine aspects of the surgery.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

66.

METHOD AND SYSTEM FOR COMBINING VIDEO, IMAGE, AND AUDIO DATA WITH TEXT DATA FOR COMPREHENSIVE DATA ANALYTICS

      
Application Number US2019034063
Publication Number 2020/236190
Status In Force
Filing Date 2019-05-24
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Garcia Kilroy, Pablo

Abstract

This patent disclosure provides various embodiments of combining multiple modalities of non-text surgical data of different formats, in particular videos, images, and audio, in a meaningful manner so that the combined data from the multiple modalities are compatible with text data. In some embodiments, prior to combining the multiple modalities of surgical data, multiple segmentation engines are used to segment and convert each corresponding modality of surgical data into a corresponding set of metrics and parameters. The multiple sets of metrics and parameters corresponding to the multiple modalities are then combined to generate a combined feature set. The combined feature set can be provided to a data analytics tool for performing comprehensive data analyses on the combined feature set to generate one or more predictions for the surgical procedure.

IPC Classes  ?

  • G16H 70/00 - ICT specially adapted for the handling or processing of medical references
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
  • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for individual health risk assessment
  • G16H 50/70 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients

67.

METHOD AND SYSTEM FOR ANONYMIZING RAW SURGICAL PROCEDURE VIDEOS

      
Application Number US2019034064
Publication Number 2020/236191
Status In Force
Filing Date 2019-05-24
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Garcia Kilroy, Pablo

Abstract

This patent disclosure provides various embodiments for anonymizing raw surgical procedure videos recorded by a recording device, such as an endoscope camera, during a surgical procedure performed on a patient inside an operating room (OR). In one aspect, a process for anonymizing raw surgical procedure videos recorded by a recording device within an OR is disclosed. This process can begin by receiving a set of raw surgical videos corresponding to a surgical procedure performed within the OR. The process next merges the set of raw surgical videos to generate a surgical procedure video corresponding to the surgical procedure. Next, the process detects image-based personally-identifiable information embedded in the set of raw video images of the surgical procedure video. When image-based personally-identifiable information is detected, the process automatically de-identifies the detected image-based personally-identifiable information in the surgical procedure video.

IPC Classes  ?

  • G16H 70/00 - ICT specially adapted for the handling or processing of medical references
  • G16H 30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
  • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
  • G16H 50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for individual health risk assessment
  • G16H 50/70 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients

68.

PROXIMITY SENSORS FOR SURGICAL ROBOTIC ARM MANIPULATION

      
Application Number US2019034719
Publication Number 2020/236194
Status In Force
Filing Date 2019-05-30
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Liu, Xin
  • Gonenc, Berk
  • Fuerst, Bernhard A.
  • Cordoba, Jose Luis
  • Garcia Kilroy, Pablo E.

Abstract

A surgical robotic system including a surgical table and a surgical robotic manipulator coupled to the surgical table. The manipulator comprises a plurality of links coupled together by a plurality of joints that are operable to move with respect to one another to move the surgical robotic manipulator, at least one of the plurality of links or the plurality of joints having a portion that faces another of the plurality of links or the plurality of joints. A proximity sensing assembly is coupled to that portion and is operable to detect an object prior to the surgical robotic manipulator colliding with the object and to output a corresponding detection signal. A processor is operable to receive the corresponding detection signal and cause the manipulator or the object to engage in a collision avoidance operation.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

69.

METHODS FOR DETERMINING IF TELEOPERATION SHOULD BE DISENGAGED BASED ON THE USER'S GAZE

      
Application Number US2019034721
Publication Number 2020/236195
Status In Force
Filing Date 2019-05-30
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Freiin Von Kapri, Anette Lia
  • Miller, Denise Ann
  • Invernizzi, Paolo
  • Savall, Joan
  • Magnasco, John H.

Abstract

A method for disengaging a surgical instrument of a surgical robotic system comprising receiving a gaze input from an eye tracker; determining, by one or more processors, whether the gaze input indicates the gaze of the user is outside or inside of the display; in response to determining the gaze input indicates the gaze of the user is outside of the display, determining an amount of time the gaze of the user is outside of the display; in response to determining the gaze of the user is outside of the display for less than a maximum amount of time, pause the surgical robotic system from a teleoperation mode; and in response to determining the gaze of the user is outside of the display for more than the maximum amount of time, disengage the surgical robotic system from the teleoperation mode.
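
A minimal sketch of the timing rule, with the maximum off-display time being an invented example value:

    MAX_OFF_DISPLAY_S = 3.0   # assumed maximum amount of time

    def teleop_action(gaze_on_display, seconds_off_display):
        """Pause for brief glances away; disengage for longer absences."""
        if gaze_on_display:
            return "continue"
        if seconds_off_display < MAX_OFF_DISPLAY_S:
            return "pause"
        return "disengage"

    print(teleop_action(gaze_on_display=False, seconds_off_display=1.2))  # pause
    print(teleop_action(gaze_on_display=False, seconds_off_display=5.0))  # disengage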

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

70.

SENSORS FOR TOUCH-FREE CONTROL OF SURGICAL ROBOTIC SYSTEMS

      
Application Number US2019034718
Publication Number 2020/236193
Status In Force
Filing Date 2019-05-30
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Gonenc, Berk
  • Liu, Xin
  • Fuerst, Bernhard A.
  • Cordoba, Jose Luis
  • Garcia Kilroy, Pablo E.

Abstract

A control system for surgical robots based on proximity sensing, the control system including a proximity sensor coupled to a component of a surgical robot, the surgical robot component including a table, robotic arms coupled to the table, and surgical tools mounted on the robotic arms, the proximity sensor configured to sense a movement of a nearby controlling object in one or more degrees of freedom; and a processor configured to drive the component of the surgical robot to follow the movement of the controlling object.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61G 13/02 - Adjustable operating tables; Controls therefor
  • H03K 17/955 - Proximity switches using a capacitive detector
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

71.

INTERLOCK MECHANISMS TO DISENGAGE AND ENGAGE A TELEOPERATION MODE

      
Application Number US2019034722
Publication Number 2020/236196
Status In Force
Filing Date 2019-05-30
Publication Date 2020-11-26
Owner VERB SURGICAL INC. (USA)
Inventor
  • Savall, Joan
  • Miller, Denise Ann
  • Freiin Von Kapri, Anette Lia
  • Invernizzi, Paolo
  • Magnasco, John H.

Abstract

A method for engaging and disengaging a surgical instrument of a surgical robotic system comprising: receiving a plurality of interlock inputs from one or more interlock detection components of the surgical robotic system; determining, by one or more processors communicatively coupled to the interlock detection components, whether the plurality of interlock inputs indicate each of the following interlock requirements are satisfied: (1) a user is looking toward a display, (2) at least one or more user interface devices of the surgical robotic system are configured in a usable manner, and (3) a surgical workspace of the surgical robotic system is configured in a usable manner; in response to determining each of the interlock requirements are satisfied, transition the surgical robotic system into a teleoperation mode; and in response to determining less than all of the interlock requirements are satisfied, transition the surgical robotic system out of a teleoperation mode.
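
A minimal sketch of the all-or-nothing interlock check, with three boolean inputs standing in for the detection components:

    def teleoperation_allowed(user_looking_at_display, uids_usable, workspace_usable):
        """Enter the teleoperation mode only when every interlock requirement holds."""
        return all((user_looking_at_display, uids_usable, workspace_usable))

    state = ("teleoperation" if teleoperation_allowed(True, True, False)
             else "non-teleoperation")
    print(state)   # any unmet requirement keeps the system out of teleoperation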

IPC Classes  ?

72.

UNMATCHING/MATCHING UID TO ROBOT GRASPER FOR ENGAGING TELEOPERATION

      
Application Number US2019032049
Publication Number 2020/231402
Status In Force
Filing Date 2019-05-13
Publication Date 2020-11-19
Owner VERB SURGICAL INC. (USA)
Inventor
  • Klingbeil, Ellen
  • Yu, Haoran

Abstract

A surgical robotic system has a robotic grasper, a user interface device (UID), and one or more processors communicatively coupled to the UID and the robotic grasper. The system detects a directive to engage or re-engage a teleoperation mode, determines that the system is in a non-teleoperation mode, receives a sequence of user actions through the UID, determines that the UID matches a jaw angle or a grip force of the robotic grasper, and transitions into the teleoperation mode. Other embodiments are also described and claimed.

IPC Classes  ?

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/29 - Forceps for use in minimally invasive surgery

73.

METHOD AND SYSTEM FOR PREDICTING CURRENT PATHS AND EVALUATING ELECTRICAL BURN RISKS OF A MONOPOLAR ELECTROSURGERY TOOL

      
Application Number US2019060635
Publication Number 2020/204999
Status In Force
Filing Date 2019-11-08
Publication Date 2020-10-08
Owner VERB SURGICAL INC. (USA)
Inventor Venkataraman, Jagadish

Abstract

Embodiments described herein provide various examples of predicting potential current paths from an active electrode of a monopolar electrosurgery tool to a return electrode of the monopolar electrosurgery tool based on analyzing electrical properties of tissues inside a patient's body, and evaluating and eliminating tissue burn risks associated with the predicted current paths. In some embodiments, a current-path-prediction technique is used to predict a set of potential current paths from the active electrode to the return electrode for any given geometrical configuration of the two electrodes on the patient's body. These predicted current paths can then be pictorially displayed on a 3D scan of the patient's body or an endoscopic view of the patient's body and in relation to the display of any existing metal implant inside the patient's body, which allows for visualizing points of tissue burn risks inside the patient's body.

IPC Classes  ?

  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 18/12 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating by passing a current through the tissue to be heated, e.g. high-frequency current
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

74.

ROBOTIC SURGICAL SYSTEM AND METHOD FOR HANDLING REAL-TIME AND NON-REAL-TIME TRAFFIC

      
Application Number US2019026582
Publication Number 2020/204962
Status In Force
Filing Date 2019-04-09
Publication Date 2020-10-08
Owner VERB SURGICAL, INC. (USA)
Inventor Desai, Jignesh

Abstract

A robotic surgical system and method are disclosed for handling real-time and non-real-time traffic. In one embodiment, a surgical robotic system is provided comprising at least one robotic arm coupled to an operating table; and a control computer comprising a processor and a hardware interface, wherein the processor is configured to: receive a notification about real-time data from the operating table at the hardware interface; process the real-time data immediately upon receiving the notification; and poll the hardware interface for non-real time data from the operating table only when not processing the real-time data. Other embodiments are provided.
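
A simplified event-loop sketch of that scheduling rule (the queue-based interface and the example payloads are assumptions for illustration):

    from collections import deque

    real_time_queue = deque(["arm joint update", "table tilt command"])
    non_real_time_queue = deque(["firmware log upload"])

    def service_interfaces():
        while real_time_queue or non_real_time_queue:
            if real_time_queue:
                # Real-time data is processed immediately upon notification.
                print("processing real-time:", real_time_queue.popleft())
            else:
                # Poll for non-real-time data only when no real-time work is pending.
                print("polling non-real-time:", non_real_time_queue.popleft())

    service_interfaces()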

IPC Classes  ?

  • G09B 23/30 - Anatomical models
  • G09B 23/28 - Models for scientific, medical, or mathematical purposes, e.g. full-sized device for demonstration purposes for medicine

75.

METHOD AND SYSTEM FOR AUTOMATICALLY REPOSITIONING A VIEWABLE AREA WITHIN AN ENDOSCOPE VIDEO VIEW

      
Application Number US2019025673
Publication Number 2020/197569
Status In Force
Filing Date 2019-04-03
Publication Date 2020-10-01
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Scott, David D.
  • Johnson, Eric

Abstract

Embodiments described herein provide various examples of displaying video images of a surgical video captured at a first resolution on a screen of a surgical system having a second resolution lower than the first resolution. In one aspect, a process begins by receiving the surgical video and selecting a first portion of the video images having the same or substantially the same resolution as the second resolution. The process subsequently displays the first portion of the video images on the screen. While displaying the first portion of the video images, the process monitors a second portion of the video images not being displayed on the screen for a set of predetermined events, wherein the second portion is not visible to the user. When a predetermined event in the set of predetermined events is detected in the second portion, the process generates an alert to notify the user.

IPC Classes  ?

  • G06T 7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
  • G06T 7/00 - Image analysis
  • A61B 1/04 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
  • H04N 5/225 - Television cameras

76.

SYSTEMS AND METHODS FOR MAGNETIC SENSING AND DOCKING WITH A TROCAR

      
Application Number US2019021465
Publication Number 2020/176113
Status In Force
Filing Date 2019-03-08
Publication Date 2020-09-03
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard A.
  • Moses, Dennis
  • Piedrahita, Miguel
  • Wong, Michael
  • Garcia Kilroy, Pablo
  • Cordoba, Jose Luis

Abstract

A surgical robotic system has a tool drive coupled to a distal end of a robotic arm that has a plurality of actuators. The tool drive has a docking interface to receive a trocar. One or more sensors in the docking interface sense a magnetic field generated by the trocar. One or more processors are configured to determine a position and orientation of the trocar based on the sensed magnetic field, and then drive the actuators to orient the docking interface to the determined orientation of the trocar, or otherwise guide the robotic arm toward the determined position of the trocar. Other aspects are also described and claimed.

IPC Classes  ?

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/30 - Surgical robots
  • A61B 17/34 - Trocars; Puncturing needles
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

77.

WEARABLE USER INTERFACE DEVICE

      
Application Number US2019060966
Publication Number 2020/154012
Status In Force
Filing Date 2019-11-12
Publication Date 2020-07-30
Owner VERB SURGICAL INC. (USA)
Inventor
  • Savall, Joan
  • Demartini, Richard Edward
  • Hellman, Randall Blake
  • Miller, Denise Ann
  • Freiin Von Kapri, Anette Lia
  • Garcia Kilroy, Pablo E.

Abstract

Wearable user interface devices are described. A wearable user interface device can include a wearable base connected to a trackable device component by a linkage. The linkage can connect to a pivoted support that the trackable device is mounted on, and which maintains poses when the user interface device is not manipulated by a user's hand. The pivoted support has several orthogonal axes intersecting at a center of rotation located inside a device body of the trackable device. Other embodiments are also described and claimed.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

78.

METHODS FOR ACTIVELY ENGAGING AND DISENGAGING TELEOPERATION OF A SURGICAL ROBOTIC SYSTEM

      
Application Number US2018068221
Publication Number 2020/139405
Status In Force
Filing Date 2018-12-31
Publication Date 2020-07-02
Owner VERB SURGICAL INC. (USA)
Inventor
  • Cone, Taylor Joseph
  • Savall, Joan
  • Freiin Von Kapri, Anette Lia
  • Johnson, Eric Mark

Abstract

A method for engaging and disengaging a surgical instrument of a surgical robotic system including receiving a sequence of user inputs from one or more user interface devices of the surgical robotic system; determining, by one or more processors communicatively coupled to the user interface devices and the surgical instrument, whether the sequence of user inputs indicates an intentional engagement or disengagement of a teleoperation mode in which the surgical instrument is controlled by user inputs received from the user interface devices; in response to determining engagement, transitioning the surgical robotic system into the teleoperation mode; and in response to determining disengagement, transitioning the surgical robotic system out of the teleoperation mode such that the user interface devices are prevented from controlling the surgical instrument.

IPC Classes  ?

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

79.

METHOD AND SYSTEM FOR EXTRACTING AN ACTUAL SURGICAL DURATION FROM A TOTAL OPERATING ROOM (OR) TIME OF A SURGICAL PROCEDURE

      
Application Number US2018067553
Publication Number 2020/122962
Status In Force
Filing Date 2018-12-26
Publication Date 2020-06-18
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Garcia Kilroy, Pablo

Abstract

Embodiments described herein provide various examples of a surgical procedure analysis system for extracting an actual procedure duration that involves actual surgical tool-tissue interactions from a total procedure duration of a surgical procedure. In one aspect, the process for extracting the actual procedure duration includes the steps of: obtaining the total procedure duration of the surgical procedure; receiving a set of operating room (OR) data from a set of OR data sources collected during the surgical procedure; analyzing the set of OR data to detect a set of non-surgical events during the surgical procedure that do not involve surgical tool-tissue interactions; extracting a set of durations corresponding to the set of non-surgical events; and determining the actual procedure duration by subtracting the combined set of durations from the total procedure duration.
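
The underlying arithmetic is a subtraction of the combined non-surgical durations from the total OR time. A minimal sketch follows, with invented event intervals and an interval-merge step added so that overlapping events are not double-counted.

    # Minimal sketch: actual procedure duration = total OR time minus the combined
    # duration of detected non-surgical events. Intervals are merged first so
    # overlapping events are only subtracted once. Event values are illustrative.
    def actual_procedure_duration(total_minutes, non_surgical_events):
        """non_surgical_events: list of (start_min, end_min) intervals within the OR time."""
        merged = []
        for start, end in sorted(non_surgical_events):
            if merged and start <= merged[-1][1]:
                merged[-1] = (merged[-1][0], max(merged[-1][1], end))  # overlap: extend
            else:
                merged.append((start, end))
        non_surgical = sum(end - start for start, end in merged)
        return total_minutes - non_surgical

    events = [(0, 18),      # e.g. patient preparation before first tool-tissue contact
              (95, 110),    # e.g. instrument exchange with no tool-tissue interaction
              (100, 125)]   # overlapping cleanup event
    print(actual_procedure_duration(150, events))  # 150 - (18 + 30) = 102 minutes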

IPC Classes  ?

  • G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

80.

SURGICAL ROBOTIC SYSTEM

      
Application Number US2019058160
Publication Number 2020/092170
Status In Force
Filing Date 2019-10-25
Publication Date 2020-05-07
Owner VERB SURGICAL INC. (USA)
Inventor
  • Garcia Kilroy, Pablo
  • Koenig, Karen
  • Bajo, Andrea
  • Wiggers, Robert
  • Savall, Joan
  • Johnson, Eric

Abstract

A surgical robotic system is disclosed to include an operating table, a plurality of robotic arms and surgical instruments, a user console, and a control tower. The plurality of robotic arms are mounted on the operating table and can be folded and stowed under the table for storage. The user console has one or more user interface devices, which function as master devices to control the plurality of surgical instruments.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 19/00 - Instruments, implements or accessories for surgery or diagnosis not covered by any of the groups A61B 1/00-A61B 18/00, e.g. for stereotaxis, sterile operation, luxation treatment, wound edge protectors(protective face masks A41D 13/11; surgeons' or patients' gowns or dresses A41D 13/12; devices for carrying-off, for treatment of, or for carrying-over, body liquids A61M 1/00)
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/37 - Master-slave robots

81.

METHOD AND SYSTEM FOR AUTOMATICALLY TRACKING AND MANAGING INVENTORY OF SURGICAL TOOLS IN OPERATING ROOMS

      
Application Number US2018051562
Publication Number 2020/055434
Status In Force
Filing Date 2018-09-18
Publication Date 2020-03-19
Owner VERB SURGICAL INC. (USA)
Inventor Venkataraman, Jagadish

Abstract

Embodiments described herein provide various examples of automatically processing surgical videos to detect surgical tools and tool-related events, and extract surgical-tool usage information. In one aspect, a process for automatically detecting a new surgical tool engagement during a recorded surgical procedure is disclosed. This process can begin by receiving a surgical procedure video and then segmenting the surgical video into sequences of video frames. Next, for each sequence of video frames, the video frames are processed to detect one or more surgical tools and one or more surgical tool engagements associated with the detected surgical tools. If a surgical tool engagement is detected in the sequence of video frames, the process then determines if a detected surgical tool associated with the detected surgical tool engagement is associated with a previously identified surgical tool engagement. If not, the process identifies the detected surgical tool engagement as a new surgical tool engagement.
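
The bookkeeping described in the last two sentences can be illustrated with a short sketch that tracks which tools were engaged in the previous sequence of frames; the detector itself is not shown, and the per-segment detection lists below are invented for illustration.

    # Hedged sketch: given per-segment tool-engagement detections (from some
    # detector not shown here), decide whether each detected engagement is new
    # or a continuation of one already identified.
    def find_new_engagements(segments):
        """segments: list of lists of tool names detected as engaged in each
        consecutive sequence of video frames. Returns (segment_index, tool) pairs
        for engagements not associated with a previously identified engagement."""
        new_events = []
        previously_engaged = set()
        for idx, engaged_tools in enumerate(segments):
            current = set(engaged_tools)
            for tool in current - previously_engaged:
                new_events.append((idx, tool))     # first appearance -> new engagement
            previously_engaged = current           # continuation only if still engaged
        return new_events

    segments = [["grasper"], ["grasper", "scissors"], ["scissors"], ["grasper"]]
    print(find_new_engagements(segments))
    # [(0, 'grasper'), (1, 'scissors'), (3, 'grasper')]  -- the grasper re-engages at segment 3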

IPC Classes  ?

  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture

82.

MACHINE-LEARNING-BASED VISUAL-HAPTIC FEEDBACK SYSTEM FOR ROBOTIC SURGICAL PLATFORMS

      
Application Number US2018051567
Publication Number 2020/055435
Status In Force
Filing Date 2018-09-18
Publication Date 2020-03-19
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Miller, Denise Ann

Abstract

Embodiments described herein provide various examples of a visual-haptic feedback system for generating a haptic feedback signal based on captured endoscopy images. In one aspect, the process for generating the haptic feedback signal includes the steps of: receiving an endoscopic video captured for a surgical procedure performed on a robotic surgical system; detecting a surgical task in the endoscopic video involving a given type of surgical tool-tissue interaction; selecting a machine learning model constructed for analyzing the given type of surgical tool-tissue interaction; for a video image associated with the detected surgical task depicting the given type of surgical tool-tissue interaction, applying the selected machine learning model to the video image to predict a strength level of the depicted surgical tool-tissue interaction; and then providing the predicted strength level to a surgeon performing the surgical task as a haptic feedback signal for the given type of surgical tool-tissue interaction.
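
A rough sketch of that pipeline is shown below with trivial stand-in functions in place of trained models; the model names, the stand-in logic, and the amplitude mapping are assumptions made only to show the selection-prediction-feedback flow.

    # Illustrative pipeline sketch with stand-in "models"; a real system would
    # use trained machine learning models per tool-tissue interaction type.
    def grasp_strength_model(frame):
        # Stand-in: pretend pixel intensity correlates with grasp strength.
        return min(1.0, sum(frame) / (255.0 * len(frame)))

    def retraction_strength_model(frame):
        return min(1.0, max(frame) / 255.0)

    MODELS = {"grasp": grasp_strength_model, "retraction": retraction_strength_model}

    def haptic_feedback(task_type, frame, max_amplitude=1.0):
        """Select the model for the detected interaction type, predict a strength
        level in [0, 1], and scale it into a haptic feedback amplitude."""
        model = MODELS[task_type]                 # model selection per interaction type
        strength = model(frame)                   # predicted interaction strength
        return strength * max_amplitude           # signal sent back to the surgeon

    frame = [40, 180, 220, 90]                    # toy "image" for the example
    print(round(haptic_feedback("grasp", frame), 3))   # 0.52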

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges
  • A61B 34/35 - Surgical robots for telesurgery
  • G16H 20/40 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
  • G06N 99/00 - Subject matter not provided for in other groups of this subclass
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 18/00 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body

83.

SURGICAL TOOL HAVING INTEGRATED MICROPHONES

      
Application Number US2018048801
Publication Number 2020/040795
Status In Force
Filing Date 2018-08-30
Publication Date 2020-02-27
Owner VERB SURGICAL INC. (USA)
Inventor
  • Miller, Denise Ann
  • Savall, Joan
  • Russell, Geoffrey Robert

Abstract

Communication apparatus and devices for surgical robotic systems are described. The communication apparatus can include a user console in communication with a communication device having a surgical tool. The communication device can include a microphone to convert a sound input into an acoustic input signal. The communication device can transmit the acoustic input signal to the user console for reproduction as a sound output for a remote operator. The surgical tool can include an endoscope having several microphones mounted on a housing. The surgical tool can be a sterile barrier having a microphone and a drape. The microphone(s) of the surgical tools can face a surrounding environment such that a tableside staff is a source of the sound input that causes the sound output, and a surgeon and the tableside staff can communicate in a noisy environment. Other embodiments are also described and claimed.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

84.

METHOD AND SYSTEM FOR ENGAGEMENT OF A SURGICAL TOOL WITH ACTUATORS OF A TOOL DRIVE IN A SURGICAL ROBOTIC SYSTEM

      
Application Number US2018048812
Publication Number 2020/040796
Status In Force
Filing Date 2018-08-30
Publication Date 2020-02-27
Owner VERB SURGICAL INC. (USA)
Inventor
  • Yu, Haoran
  • Hariri, Alireza
  • Nia Kosari, Sina
  • Zhou, Renbin
  • Sen, Tutkun Hasan
  • Asadian, Ali

Abstract

A system and computerized method for detection of engagement of a surgical tool to a tool drive of a robotic arm of a surgical robotic system. The method may include activating an actuator of the tool drive to rotate a drive disk to be mechanically engaged with a tool disk in the surgical tool. One or more motor operating parameters of the actuator that is causing the rotation of the drive disk are monitored while activating the actuator. The method detects when the drive disk becomes mechanically engaged with the tool disk, based on the one or more monitored motor operating parameters. Other embodiments are also described and claimed.
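
One simple way to picture the detection step is a rise in motor current above the free-spinning baseline once the drive disk meshes with the tool disk. The sketch below assumes current is the monitored operating parameter and uses an invented threshold; the actual parameters and criteria may differ.

    # Minimal sketch: spin the drive disk while watching a motor operating
    # parameter (here, current) and declare mechanical engagement when the load
    # visibly rises above the free-spinning baseline. Values are illustrative.
    def detect_engagement(current_samples, baseline_window=5, rise_factor=1.5):
        """Return the sample index at which engagement is detected, or None."""
        baseline = sum(current_samples[:baseline_window]) / baseline_window
        for i, amps in enumerate(current_samples[baseline_window:], start=baseline_window):
            if amps > rise_factor * baseline:     # load increase -> disks have meshed
                return i
        return None

    currents = [0.20, 0.21, 0.19, 0.20, 0.22,     # free rotation before engagement
                0.21, 0.23, 0.45, 0.52, 0.55]     # load rises once the disks engage
    print(detect_engagement(currents))            # 7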

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/37 - Master-slave robots
  • A61B 90/98 - Identification means for patients or instruments, e.g. tags using electromagnetic means, e.g. transponders
  • A61B 90/90 - Identification means for patients or instruments, e.g. tags
  • A61B 90/96 - Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text using barcodes
  • A61B 34/35 - Surgical robots for telesurgery

85.

SETUP OF SURGICAL ROBOTS USING AN AUGMENTED MIRROR DISPLAY

      
Application Number US2018048861
Publication Number 2020/036611
Status In Force
Filing Date 2018-08-30
Publication Date 2020-02-20
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard Adolf
  • Garcia Kilroy, Pablo E.

Abstract

Assisting robotic arm setup in a surgical robotic system using augmented reality can include capturing a live video of a user setting up a robotic arm in a surgical robotic system. A visual guide representing a target pose of the robotic arm can be rendered onto the live video, resulting in an augmented live video for guiding the arm setup. The augmented live video can be displayed to the user while the user is following the visual guide to set up the robotic arm. The captured live video can be continuously processed to determine whether the robotic arm has reached the target pose.

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • A61B 90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups , e.g. for luxation treatment or for protecting wound edges

86.

PEDAL WITH SLIDING AND LOCKING MECHANISMS FOR SURGICAL ROBOTS

      
Application Number US2018043557
Publication Number 2020/018124
Status In Force
Filing Date 2018-07-24
Publication Date 2020-01-23
Owner VERB SURGICAL INC. (USA)
Inventor
  • Cone, Taylor
  • Savall, Joan

Abstract

A foot pedal system (900) for controlling a surgical robotic system, the foot pedal system (900) comprising a foot pedal assembly (200, 904, 906) movably coupled to a foot pedal assembly platform (902). The foot pedal assembly (200, 904, 906) has a foot pedal base (206), a foot pedal (208) pivotally coupled to the foot pedal base (206), and a foot pedal platform (202), the foot pedal base (206) being operable to slide across the foot pedal platform (202) along an x-axis and a y-axis to an arrangement of activation positions. The foot pedal platform (202) is operable to translate and rotate with respect to the foot pedal assembly platform (902) to any position along the foot pedal assembly platform (902), and is operable to engage or disengage with the foot pedal assembly platform (902) at any such position.

IPC Classes  ?

  • G05G 1/445 - Controlling members actuated by foot pivoting about a central fulcrum
  • H01H 3/14 - Operating parts, i.e. for operating driving mechanism by a mechanical force external to the switch adapted for operation by a part of the human body other than the hand, e.g. by foot
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets
  • G05G 1/44 - Controlling members actuated by foot pivoting
  • B25J 13/04 - Foot-operated control means

87.

ROBOTIC SURGICAL PEDAL WITH INTEGRATED FOOT SENSOR

      
Application Number US2018043554
Publication Number 2020/018123
Status In Force
Filing Date 2018-07-24
Publication Date 2020-01-23
Owner VERB SURGICAL INC. (USA)
Inventor
  • Cone, Taylor
  • Savall, Joan

Abstract

A foot pedal assembly for controlling a robotic surgical system. The foot pedal assembly including a foot pedal base, a foot pedal and a sensor. The foot pedal moves relative to the foot pedal base and has a contact surface extending from a distal end to a proximal end of the foot pedal. The contact surface is to come into contact with a foot of a user during use of the foot pedal assembly for controlling the robotic surgical system and the distal end is farther away from a heel of the foot than the proximal end during use of the assembly for controlling the robotic surgical system. The sensor is coupled to the contact surface of the foot pedal at a position closer to the proximal end than the distal end, and the sensor is operable to sense a target object positioned a distance over the contact surface.

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

88.

SURGICAL ROBOTIC SYSTEM HAVING ANTHROPOMETRY-BASED USER CONSOLE

      
Application Number US2018043173
Publication Number 2020/013871
Status In Force
Filing Date 2018-07-20
Publication Date 2020-01-16
Owner VERB SURGICAL INC. (USA)
Inventor
  • Hondori, Hossein Mousavi
  • Savall, Joan
  • Sani, Hamid Reza
  • Nobles, Brent Michael

Abstract

Surgical robotic systems including a user console for controlling a robotic arm or a surgical robotic tool are described. The user console includes components designed to automatically adapt to anthropometric characteristics of a user. A processor of the surgical robotic system is configured to receive anthropometric inputs corresponding to the anthropometric characteristics and to generate an initial console configuration of the user console based on the inputs using a machine learning model. Actuators automatically adjust a seat, a display, or one or more pedals of the user console to the initial console configuration. The initial console configuration establishes a comfortable relative position between the user and the console components. Other embodiments are described and claimed.
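
As a toy illustration of the input-to-configuration mapping, the sketch below replaces the machine learning model with invented linear coefficients and prints the actuator moves instead of driving hardware; none of the numbers are from the patent.

    # Toy stand-in for the learned mapping: anthropometric inputs in, an initial
    # console configuration out, and a stand-in actuator layer that reports the
    # moves it would command. Coefficients are invented for illustration.
    def initial_console_configuration(height_cm, arm_length_cm, eye_height_seated_cm):
        return {
            "seat_height_cm":    0.25 * height_cm - 4.0,
            "display_height_cm": 0.95 * eye_height_seated_cm,
            "pedal_distance_cm": 0.55 * arm_length_cm + 20.0,   # rough reach-based placement
        }

    def drive_actuators(config, step_cm=0.5):
        # Stand-in for the actuator layer: report the moves that would be commanded.
        for axis, target in config.items():
            print(f"moving {axis} toward {target:.1f} cm in {step_cm} cm steps")

    config = initial_console_configuration(height_cm=172, arm_length_cm=74,
                                           eye_height_seated_cm=118)
    drive_actuators(config)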

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 90/60 - Supports for surgeons, e.g. chairs or hand supports

89.

AUTOSTEREOSCOPIC DISPLAY WITH MEANS FOR TRACKING THE HEAD POSITION AND THE GAZE OF A USER IN ROBOTIC SURGERY AND AN ASSOCIATED METHOD

      
Application Number US2019021475
Publication Number 2020/009731
Status In Force
Filing Date 2019-03-08
Publication Date 2020-01-09
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard A.
  • Garcia Kilroy, Pablo
  • Savall, Joan
  • Freiin Von Kapri, Anette Lia

Abstract

An autostereoscopic three-dimensional display system (140) for surgical robotics has an autostereoscopic three-dimensional display (142) configured to receive and display video from a surgical robotics camera, a first sensor assembly (144), and a second sensor assembly (146). A processor is configured to detect and track an eye position or a head position of a user relative to the display based on processing output data of the first sensor assembly, and to detect and track a gaze of the user based on processing output data of the second sensor assembly. The processor is further configured to modify or control an operation of the display system based on the detected gaze of the user. A spatial relationship of the display can also be automatically adjusted in relation to the user based on the detected eye or head position of the user to optimize the user's visualization of three-dimensional images on the display.
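
A minimal sketch of the two tracking roles follows, assuming the detected head offset steers the display and the gaze estimate gates playback; the gain, clamp, and gating rule are illustrative assumptions.

    # Illustrative sketch: head/eye position steers the display's spatial
    # relationship to the viewer, while the gaze estimate gates an operation of
    # the display system (here, simply whether the 3-D video keeps playing).
    def update_display(display_tilt_deg, head_offset_deg, gaze_on_display,
                       gain=0.4, max_step_deg=2.0):
        """Return (new_tilt_deg, video_active)."""
        step = max(-max_step_deg, min(max_step_deg, gain * head_offset_deg))
        new_tilt = display_tilt_deg + step        # follow the viewer's head position
        video_active = gaze_on_display            # modify operation when gaze leaves
        return new_tilt, video_active

    tilt, active = update_display(display_tilt_deg=0.0, head_offset_deg=6.0,
                                  gaze_on_display=True)
    print(round(tilt, 2), active)   # 2.0 True -- step clamped to the 2-degree limit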

IPC Classes  ?

  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • H04N 13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays

90.

USER INTERFACE DEVICE HAVING GRIP LINKAGES

      
Application Number US2018037941
Publication Number 2019/240824
Status In Force
Filing Date 2018-06-15
Publication Date 2019-12-19
Owner VERB SURGICAL INC. (USA)
Inventor
  • Savall, Joan
  • Lenta Shum, Allegra Anna

Abstract

User interface devices for manipulating a robotic surgical tool in a surgical robotic system are described. A user interface device can include a device body containing a tracking sensor to generate a spatial state signal in response to movement of the device body. The spatial state signal can be used to control a spatial motion of a surgical robotic system actuator. Several grip linkages can be pivotally coupled to the device body. A grip linkage displacement sensor may monitor movement of the grip linkages relative to the device body, and generate a grip signal in response to the movement. The grip signal can be used to control a grip motion of a robotic surgical tool mounted on the surgical robotic system actuator. Other embodiments are also described and claimed.

IPC Classes  ?

  • B25J 13/02 - Hand grip control means
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

91.

USER INTERFACE DEVICE HAVING FINGER CLUTCH

      
Application Number US2018037943
Publication Number 2019/240825
Status In Force
Filing Date 2018-06-15
Publication Date 2019-12-19
Owner VERB SURGICAL INC. (USA)
Inventor
  • Savall, Joan
  • Lenta Shum, Allegra Anna

Abstract

User interface devices for manipulating a robotic surgical tool in a surgical robotic system are described. A user interface device can include a device housing having a gripping surface symmetrically disposed about a central axis. The gripping surface can include a surface of revolution about the central axis. A tracking sensor can be mounted within the device housing to generate spatial state signals in response to movement of the device housing. The spatial state signals can be used to control motion of robotic system actuators. A finger clutch can be disposed at an end of the device housing, and can generate a clutch signal in response to a touch by a user. The clutch signal can be used to pause the motion of the robotic system actuators. Other embodiments are also described and claimed.

IPC Classes  ?

  • B25J 13/02 - Hand grip control means
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery

92.

USER INPUT DEVICE FOR USE IN ROBOTIC SURGERY

      
Application Number US2019037226
Publication Number 2019/241655
Status In Force
Filing Date 2019-06-14
Publication Date 2019-12-19
Owner VERB SURGICAL INC. (USA)
Inventor
  • Fuerst, Bernhard Adolf
  • Garcia Kilroy, Pablo E.
  • Gonenc, Berk
  • Cordoba, Jose Luis
  • Savall, Joan
  • Barthel, Alexander

Abstract

User input devices (UIDs) for controlling a surgical robotic system are described. A UID can include one or more tracking sensors to generate respective spatial state signals in accordance with a pose of the UID. At least one of the tracking sensors can be a camera. In the case of multiple tracking sensors, the spatial state signals are processed by a sensor fusion algorithm to generate a more robust, single tracking signal and a quality measure. The tracking signal and the quality measure are then used by a digital control system to control motion of a surgical robotic system actuator that is associated with the UID. Other embodiments are also described and claimed.
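
A generic inverse-variance fusion rule can stand in for the sensor fusion step to show how a single tracking signal and a quality measure might be derived; this is an illustrative rule chosen for the sketch, not the specific algorithm claimed.

    # Simple inverse-variance fusion sketch: combine two tracking estimates of
    # the UID position into one signal and derive a quality measure from how
    # much the sensors disagree relative to their combined uncertainty.
    import numpy as np

    def fuse_tracking(pos_camera, var_camera, pos_em, var_em):
        """Return (fused_position, quality) where quality is in (0, 1] and drops
        as the two sensors disagree."""
        pos_camera, pos_em = np.asarray(pos_camera, float), np.asarray(pos_em, float)
        w_cam, w_em = 1.0 / var_camera, 1.0 / var_em
        fused = (w_cam * pos_camera + w_em * pos_em) / (w_cam + w_em)
        disagreement = np.linalg.norm(pos_camera - pos_em)
        quality = 1.0 / (1.0 + disagreement**2 / (var_camera + var_em))
        return fused, quality

    fused, q = fuse_tracking([0.100, 0.200, 0.050], 1e-4,
                             [0.102, 0.198, 0.052], 4e-4)
    print(fused, round(q, 3))   # fused position is pulled toward the lower-variance sensor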

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

93.

CALIBRATION OF 3-PHASE MOTOR CURRENT SENSING FOR SURGICAL ROBOTIC ACTUATORS

      
Application Number US2018038045
Publication Number 2019/240826
Status In Force
Filing Date 2018-06-18
Publication Date 2019-12-19
Owner VERB SURGICAL INC. (USA)
Inventor Bosteels, Jan

Abstract

A 3-phase motor driver circuit has a first input to be coupled to an output of a first phase current sensor, and a second input that represents a zero reference. A controller adjusts one or more of a first phase voltage, a second phase voltage, and a third phase voltage, until a comparison between the first input and the second input indicates that the first input has reached the zero reference, and in response the controller captures an output of a second phase current sensor and an output of a third phase current sensor. The controller then stores, in memory, calibration data that is based on the captured outputs of the second and third phase current sensors. Other aspects are also described.
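
The calibration sequence can be pictured with a small simulation: adjust a phase voltage until the first sensor reads the zero reference, then capture the other two sensor outputs as calibration data. The linear motor model, step size, and tolerance below are assumptions made only so the loop has something to converge on.

    # Hedged simulation sketch of the calibration sequence described above.
    class SimulatedPhaseSensors:
        def __init__(self, offset_b, offset_c):
            self.offset_b, self.offset_c = offset_b, offset_c  # unknown sensor offsets
        def read(self, v_a, v_b, v_c):
            # Toy model: phase currents proportional to voltages, currents sum to zero.
            i_a = 2.0 * (v_a - (v_b + v_c) / 2.0)
            i_b = 2.0 * (v_b - (v_a + v_c) / 2.0)
            i_c = -(i_a + i_b)
            return i_a, i_b + self.offset_b, i_c + self.offset_c  # sensor A assumed ideal

    def calibrate(sensors, zero_ref=0.0, step=0.01, tol=1e-3):
        v_a, v_b, v_c = 0.30, 0.10, 0.10
        for _ in range(10_000):
            meas_a, meas_b, meas_c = sensors.read(v_a, v_b, v_c)
            if abs(meas_a - zero_ref) < tol:            # first input reached zero reference
                return {"offset_b": meas_b, "offset_c": meas_c}   # stored calibration data
            v_a -= step if meas_a > zero_ref else -step # adjust phase voltage toward zero
        raise RuntimeError("calibration did not converge")

    # Recovers offsets close to the simulated 0.07 and -0.04.
    print(calibrate(SimulatedPhaseSensors(offset_b=0.07, offset_c=-0.04)))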

IPC Classes  ?

  • G01R 35/00 - Testing or calibrating of apparatus covered by the other groups of this subclass
  • H02P 6/10 - Arrangements for controlling torque ripple, e.g. providing reduced torque ripple
  • H02P 6/28 - Arrangements for controlling current
  • H02P 21/22 - Current control, e.g. using a current control loop

94.

MACHINE-LEARNING-ORIENTED SURGICAL VIDEO ANALYSIS SYSTEM

      
Application Number US2018036452
Publication Number 2019/226182
Status In Force
Filing Date 2018-06-07
Publication Date 2019-11-28
Owner VERB SURGICAL INC. (USA)
Inventor
  • Venkataraman, Jagadish
  • Garcia Kilroy, Pablo E.

Abstract

Embodiments described herein provide various examples of a surgical video analysis system for segmenting surgical videos of a given surgical procedure into shorter video segments and labeling/tagging these video segments with multiple categories of machine learning descriptors. In one aspect, a process for processing surgical videos recorded during performed surgeries of a surgical procedure includes the steps of: receiving a diverse set of surgical videos associated with the surgical procedure; receiving a set of predefined phases for the surgical procedure and a set of machine learning descriptors identified for each predefined phase in the set of predefined phases; for each received surgical video, segmenting the surgical video into a set of video segments based on the set of predefined phases; and, for each segment of the surgical video of a given predefined phase, annotating the video segment with a corresponding set of machine learning descriptors for the given predefined phase.
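
A short sketch of the segmentation-and-annotation flow follows, assuming the phase boundaries have already been located in each video; the phase names, timestamps, and descriptors are invented for illustration.

    # Small sketch of segmenting a video by predefined phases and attaching the
    # corresponding machine learning descriptors to each segment.
    def segment_and_annotate(video_id, phase_boundaries, phase_descriptors):
        """phase_boundaries: list of (phase_name, start_s, end_s) in playback order.
        phase_descriptors: dict mapping phase_name -> list of ML descriptors.
        Returns one annotated record per video segment."""
        segments = []
        for phase, start_s, end_s in phase_boundaries:
            segments.append({
                "video_id": video_id,
                "phase": phase,
                "start_s": start_s,
                "end_s": end_s,
                "descriptors": list(phase_descriptors.get(phase, [])),  # labels for ML training
            })
        return segments

    boundaries = [("access", 0, 240), ("dissection", 240, 1980), ("closure", 1980, 2400)]
    descriptors = {"dissection": ["tool:grasper", "tool:energy_device", "event:bleeding"],
                   "closure": ["tool:needle_driver", "event:suturing"]}
    for seg in segment_and_annotate("case_0421", boundaries, descriptors):
        print(seg["phase"], seg["start_s"], seg["end_s"], seg["descriptors"])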

IPC Classes  ?

  • A61B 1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor

95.

SYSTEM AND METHOD FOR CONTROLLING A ROBOTIC WRIST

      
Application Number US2018033478
Publication Number 2019/221754
Status In Force
Filing Date 2018-05-18
Publication Date 2019-11-21
Owner VERB SURGICAL INC. (USA)
Inventor
  • Hariri, Alireza
  • Nia Kosari, Sina

Abstract

A system for controlling a surgical robotic tool having an end effector driven by actuators through antagonistic cables is disclosed. The control system may include a position controller and a grip force controller. The position controller may be configured to receive an input signal to control the position of the end effector and generate a first command to drive the actuators to move the end effector. The grip force controller may be configured to receive another input to control the force exerted by jaws of the end effector and generate a second command. The first command and the second command may be combined to generate a composite command that is provided to the actuators to drive motion of the end effector. A third current or position command may be generated by a slack controller to prevent cable slack.
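
The combination of the two commands can be illustrated for a single pair of antagonistic cables, with a minimum-tension term standing in for the slack controller; the gains and the differential mapping below are assumptions for illustration.

    # Illustrative sketch: a position command and a grip-force command combined
    # into one composite command for a pair of antagonistic cables, with a
    # simple minimum-tension term standing in for the slack controller.
    def composite_cable_commands(position_error, grip_force_cmd,
                                 kp=5.0, min_tension=0.5):
        """Return (cable_a_cmd, cable_b_cmd). The position term drives the cables
        differentially (one pulls, the other releases); the grip term and the
        minimum tension are applied to both so neither cable goes slack."""
        position_cmd = kp * position_error            # position controller output
        cable_a = position_cmd + grip_force_cmd + min_tension
        cable_b = -position_cmd + grip_force_cmd + min_tension
        # Slack prevention: never command less than the minimum holding tension.
        return max(cable_a, min_tension), max(cable_b, min_tension)

    a_cmd, b_cmd = composite_cable_commands(position_error=0.02, grip_force_cmd=1.0)
    print(round(a_cmd, 3), round(b_cmd, 3))   # 1.6 1.4: both cables stay taut while the differential term steers the jaws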

IPC Classes  ?

96.

ROBOTIC PORT PLACEMENT GUIDE AND METHOD OF USE

      
Application Number US2018030862
Publication Number 2019/203860
Status In Force
Filing Date 2018-05-03
Publication Date 2019-10-24
Owner VERB SURGICAL INC. (USA)
Inventor
  • Anderson, Kent
  • Anderson, Katherine

Abstract

A port placement guide may include a base, a member, and at least one tracking element. The base can be configured to couple to a surface of a patient at a preliminary port location for a robotic arm. The member can be coupled to the base and can comprise a first end and a second end opposite the first end. The at least one tracking element can be coupled to the member and can be configured to allow tracking of the member relative to the preliminary port location.

IPC Classes  ?

  • A61B 17/34 - Trocars; Puncturing needles
  • A61B 34/30 - Surgical robots
  • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

97.

SURGICAL ROBOTIC ARM WITH WIRELESS POWER SUPPLY INTERFACE

      
Application Number US2018030867
Publication Number 2019/203861
Status In Force
Filing Date 2018-05-03
Publication Date 2019-10-24
Owner VERB SURGICAL INC. (USA)
Inventor
  • Wu, Qiong
  • Sahin, Koray
  • Bernard, Jonathan

Abstract

A proximal end portion of a robotic surgical arm is to be coupled to an adapter of a surgical robotic platform, for use during a surgical session at the platform, and then decoupled from the adapter for storage until being re-coupled for use during another surgical session at the platform. A resonant-mode transformer-coupled power converter is provided that has a secondary side and a primary side. The secondary side is in the arm and has a transformer secondary coil in the proximal end portion of the arm. The primary side has a transformer primary coil in the adapter. The primary and secondary coils are held at positions and orientations that enable mutual inductive coupling between them for operation of the power converter when the arm is coupled to the adapter. Other embodiments are also described and claimed.

IPC Classes  ?

  • A61B 34/35 - Surgical robots for telesurgery
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

98.

SURGICAL ROBOTIC TOOL MULTI-MOTOR ACTUATOR AND CONTROLLER

      
Application Number US2018031284
Publication Number 2019/203862
Status In Force
Filing Date 2018-05-06
Publication Date 2019-10-24
Owner VERB SURGICAL INC. (USA)
Inventor
  • Yu, Haoran
  • Zhou, Renbin
  • Nia Kosari, Sina
  • Bajo, Andrea

Abstract

A first input coupling and a second input coupling are coupled to rotatably drive an output coupling at the same time. In one embodiment, the output coupling rotates a robotic surgery endoscope about a longitudinal axis of the output coupling. A first motor drives the first input coupling while being assisted by a second motor that is driving the second input coupling. A first compensator produces a first motor input based on a position error and in accordance with a position control law, and a second compensator produces a second motor input based on the position error and in accordance with an impedance control law. In another embodiment, the second compensator receives a measured torque of the first motor. Other embodiments are also described and claimed.
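
A minimal sketch of the two compensators acting on the same position error follows, with a stiff position law for the first motor and a softer impedance-style law for the assisting motor; the gains are illustrative only.

    # Hedged sketch: the first motor is driven by a stiff position control law
    # and the second by a softer impedance-style law on the same position error,
    # so it assists rather than fights the first.
    def motor_inputs(position_error, velocity,
                     kp_pos=40.0, kd_pos=2.0,        # position control law gains (motor 1)
                     k_imp=8.0, b_imp=1.5):          # impedance law stiffness/damping (motor 2)
        u_primary = kp_pos * position_error - kd_pos * velocity   # position compensator
        u_assist  = k_imp * position_error - b_imp * velocity     # impedance compensator
        return u_primary, u_assist

    # Example: endoscope roll axis lagging its target by 0.05 rad while moving slowly.
    u1, u2 = motor_inputs(position_error=0.05, velocity=0.1)
    print(u1, u2)   # the assist motor contributes a smaller, compliant share of the torque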

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 17/00 - Surgical instruments, devices or methods, e.g. tourniquets

99.

CORRECTING A ROBOTIC SURGERY USER INTERFACE DEVICE TRACKING INPUT

      
Application Number US2018019774
Publication Number 2019/164533
Status In Force
Filing Date 2018-02-26
Publication Date 2019-08-29
Owner VERB SURGICAL INC. (USA)
Inventor
  • Sen, Hasan Tutkun
  • Nia Kosari, Sina

Abstract

A sequence of tracking input samples that are measures of position or orientation of a user interface device, UID, being held by a user, is received. In a prediction phase, a current output sample of a state of a linear quadratic estimator, LQE, is computed that is an estimate of the position or orientation of the UID. The current output sample is computed based on i) a previously computed output sample, and ii) a velocity term. In an update phase, an updated output sample of the state of the LQE is computed, based on i) a previously computed output sample from the prediction phase, and ii) a most recent tracking input sample. Other embodiments are also described and claimed.
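
The prediction/update structure can be sketched in one dimension with a fixed gain, which is a simplification of a full estimator that would also track covariance; the gain, time step, and velocity correction below are assumptions.

    # Minimal 1-D sketch of the predict/update structure applied to one
    # coordinate of the UID position.
    class SimpleLQE:
        def __init__(self, initial_pos, velocity=0.0, gain=0.3, dt=0.01):
            self.x = initial_pos       # current state estimate (position)
            self.v = velocity          # velocity term used in the prediction phase
            self.gain = gain           # how strongly a new tracking sample corrects the state
            self.dt = dt

        def predict(self):
            # Prediction phase: propagate the previous output sample by the velocity term.
            self.x = self.x + self.v * self.dt
            return self.x

        def update(self, measurement):
            # Update phase: blend the prediction with the most recent tracking input sample.
            residual = measurement - self.x
            self.x = self.x + self.gain * residual
            self.v = self.v + (self.gain / self.dt) * residual * 0.1  # mild velocity correction
            return self.x

    lqe = SimpleLQE(initial_pos=0.0, velocity=1.0)
    for z in [0.012, 0.021, 0.034, 0.041]:        # noisy tracking input samples
        lqe.predict()
        print(round(lqe.update(z), 4))            # smoothed position estimates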

IPC Classes  ?

  • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  • A61B 34/30 - Surgical robots

100.

ACTIVE BACKDRIVING FOR A ROBOTIC ARM

      
Application Number US2017065636
Publication Number 2019/117855
Status In Force
Filing Date 2017-12-11
Publication Date 2019-06-20
Owner VERB SURGICAL INC. (USA)
Inventor
  • Zhou, Renbin
  • Nia Kosari, Sina
  • Moses, Dennis

Abstract

A robotic surgical system includes at least one robotic arm comprising at least one movable joint and an actuator configured to drive the at least one movable joint, and a controller configured to generate a first signal, the first signal comprising a first oscillating waveform having a first frequency and being modulated by a second oscillating waveform having a second frequency, wherein the second frequency is higher than the first frequency. The actuator is configured to drive the at least one movable joint based on the first signal to at least partially compensate for friction in the at least one movable joint.
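
The drive signal can be sketched as a low-frequency waveform modulated by a higher-frequency waveform, added to the joint command as a small torque offset; the frequencies and amplitude below are illustrative values, not the patent's.

    # Hedged sketch of the drive signal: a low-frequency waveform modulated by a
    # higher-frequency waveform, added to the joint torque command so static
    # friction is continually broken during manual backdriving.
    import math

    def backdrive_assist_signal(t, amplitude=0.2, f_low=2.0, f_high=40.0):
        """First oscillating waveform at f_low, modulated by a second waveform at
        the higher frequency f_high; returns the torque offset at time t (seconds)."""
        first = math.sin(2.0 * math.pi * f_low * t)
        second = math.sin(2.0 * math.pi * f_high * t)
        return amplitude * first * second

    # Sample a few milliseconds of the signal, as a joint controller might each tick.
    samples = [round(backdrive_assist_signal(k * 0.001), 4) for k in range(5)]
    print(samples)   # small oscillating torque riding on top of the normal command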

IPC Classes  ?

  • A61B 34/30 - Surgical robots
  • A61B 34/37 - Master-slave robots
  • A61B 90/50 - Supports for surgical instruments, e.g. articulated arms
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices