Rendezvous and Capture Systems: Two complementary hardware-in-the-loop simulation facilities for automatic rendezvous and capture systems at the Marshall Space Flight Center are described. One, the Flight Robotics Laboratory, uses an 8 DOF overhead manipulator with a work volume of 160 by 40 by 23 feet to evaluate automatic rendezvous algorithms and range/rate sensing systems. The other, the Space Station/Space Operations Mechanism Test Bed, uses a 6 DOF hydraulic table to perform docking and berthing dynamics simulations.
Statement of technical details of the capability being described: Automated spacecraft docking operations are being performed using a full-scale motion-based simulator and an optical sensor. This presentation will discuss the work in progress at TRW and MSFC facilities to study the problem of automated proximity and docking operations. The docking sensor used is the MSFC Optical Sensor, and simulation runs are performed using the MSFC Flat Floor Facility. The control algorithms and six degree of freedom (6DOF) simulation software were developed at TRW and integrated into the MSFC facility.
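A feel for what such a run exercises can be had from a stripped-down simulation: the sketch below propagates only the translational relative motion with the Clohessy-Wiltshire equations and closes the loop with an acceleration-limited PD controller. The gains, initial state, and thrust limit are assumed placeholders; this is not the TRW control algorithms or the 6DOF facility software.

```python
import numpy as np

# Simplified proximity-approach sketch: Clohessy-Wiltshire relative dynamics with an
# acceleration-limited PD controller driving the chaser to the docking point at the
# origin. All gains and states are assumed values, not the TRW/MSFC implementation.

n = 0.00113            # target orbital rate, rad/s (roughly a 400 km circular orbit)
dt = 0.1               # integration step, s
kp, kd = 0.004, 0.12   # PD gains (assumed)
a_max = 0.05           # thrust acceleration limit, m/s^2 (assumed)

r = np.array([0.0, -50.0, 5.0])   # relative position, m (x radial, y along-track, z cross-track)
v = np.array([0.0, 0.1, 0.0])     # relative velocity, m/s

def cw_accel(r, v):
    """Clohessy-Wiltshire (Hill) relative acceleration about the target."""
    x, _, z = r
    vx, vy, _ = v
    return np.array([3.0 * n**2 * x + 2.0 * n * vy,
                     -2.0 * n * vx,
                     -n**2 * z])

t = 0.0
while np.linalg.norm(r) > 0.5 and t < 3600.0:
    a_cmd = np.clip(-kp * r - kd * v, -a_max, a_max)   # PD command, clipped to thruster authority
    v += (cw_accel(r, v) + a_cmd) * dt
    r += v * dt
    t += dt

print(f"t = {t:6.1f} s   range = {np.linalg.norm(r):5.2f} m   rate = {np.linalg.norm(v):5.3f} m/s")
```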
This study develops a vision-based detection and classification algorithm to address the challenges of in-situ small orbital debris environment classification, including debris observability and instrument requirements for small debris observation. The algorithm operates in near real-time and is robust under challenging moving-object classification conditions, such as multiple moving objects, objects with various movement trajectories and speeds, very small or faint objects, and substantial background motion. The performance of the algorithm is optimized and validated using space image data from simulated environments generated with the NASA Marshall Space Flight Center's Dynamic Star Field Simulator of on-board optical sensors and cameras.
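One simple way to separate a differently-moving object from a drifting star field is to estimate and cancel the global background shift before differencing frames. The sketch below does this with an FFT cross-correlation and scipy blob labeling; it is an illustrative stand-in under those assumptions, not the algorithm validated against the Dynamic Star Field Simulator.

```python
import numpy as np
from scipy import ndimage

# Illustrative moving-object detector for a drifting star field: estimate the global
# frame-to-frame shift by FFT cross-correlation, cancel it, and label what is left in
# the difference image. A stand-in under these assumptions, not the study's algorithm.

def global_shift(ref, moved):
    """Integer-pixel (dy, dx) such that moved ~= np.roll(ref, (dy, dx), axis=(0, 1))."""
    corr = np.fft.ifft2(np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy -= ref.shape[0] if dy > ref.shape[0] // 2 else 0
    dx -= ref.shape[1] if dx > ref.shape[1] // 2 else 0
    return dy, dx

def detect_movers(frame0, frame1, thresh=50.0):
    """Centroids of pixels that do not follow the global star-field motion."""
    f0, f1 = frame0.astype(float), frame1.astype(float)
    dy, dx = global_shift(f0, f1)
    residual = np.abs(f1 - np.roll(f0, (dy, dx), axis=(0, 1)))   # background cancelled
    labels, count = ndimage.label(residual > thresh)
    return ndimage.center_of_mass(residual, labels, range(1, count + 1))

# Synthetic check: the star field shifts by (2, 3) pixels; one faint object moves differently.
rng = np.random.default_rng(0)
stars = np.zeros((256, 256))
stars[rng.integers(0, 256, 150), rng.integers(0, 256, 150)] = 180.0
frame0 = stars.copy()
frame0[100, 100] = 255.0
frame1 = np.roll(stars, (2, 3), axis=(0, 1))
frame1[110, 95] = 255.0
# Expect two reports: the object's old (shifted) position and its new position.
print(detect_movers(frame0, frame1))
```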
Fundamental to many of NASA's in-space transportation missions is the capture and handling of various objects and vehicles in various orbits for servicing, debris disposal, sample retrieval, and assembly without the benefit of sufficient grapple fixtures and docking ports. To perform similar material handling tasks on Earth, pincher grippers, suction grippers, or magnetic chucks are used, but are unable to reliably grip aluminum and composite spacecraft, insulation, radiators, solar arrays, or extra-terrestrial objects in the vacuum of outer space without dedicated handles in the right places.
An Electric Sail (E-Sail) propulsion system consists of long, thin tethers: positively charged wires extending radially and symmetrically outward from a spacecraft. The tethers must be biased using a high-voltage power supply to ensure that the solar wind produces thrust. While the E-Sail concept shows great promise for flying heliopause missions with higher characteristic acceleration than solar sails, there are significant technical challenges related to deploying and controlling multiple tethers. A typical full-scale design involves a hub-and-spoke arrangement of 10 to 100 tethers, each 20 km long. In the last 20 years, there have been multiple space mission failures due to tether deployment and control issues, even though most of those configurations involved only a single tether. This paper describes an effort to develop and test a simple yet robust single-tether deployment system for a two-6U CubeSat configuration. The project included the following: a) tether dynamic modeling/simulation; b) E-Sail single-tether prototype development and testing; c) space environmental effects testing to identify the best materials for further development. These three areas of investigation were needed to provide technical rationale for an E-Sail flight demonstration mission that is expected to be proposed for the 2022 timeframe.
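As a rough illustration of the kind of tether dynamic modeling listed under item a), the sketch below lumps a short deployed tether into point masses joined by tension-only springs and integrates the spin-induced loads. Every parameter is an assumed placeholder, and the model is far simpler than what a flight-scale E-Sail analysis would require.

```python
import numpy as np

# Minimal planar bead-spring sketch of spin-induced tension in a short deployed tether:
# lumped point masses joined by tension-only springs, anchored at a hub at the origin,
# integrated with semi-implicit Euler. Every parameter is an assumed placeholder.

N = 20                      # lumped masses along the tether
L = 20.0                    # deployed length for this test case, m
seg = L / N                 # natural length of each segment, m
m = 0.001                   # mass per bead, kg
k = 50.0                    # segment stiffness, N/m
omega = 0.5                 # spin rate about the hub, rad/s
dt, steps = 1e-3, 60000

pos = np.zeros((N, 2))
pos[:, 0] = seg * np.arange(1, N + 1)   # beads laid out radially from the hub
vel = np.zeros((N, 2))
vel[:, 1] = omega * pos[:, 0]           # beads start co-rotating with the hub

def accelerations(pos):
    inner = np.vstack([np.zeros(2), pos[:-1]])            # each bead's inner attachment point
    d = pos - inner
    dist = np.linalg.norm(d, axis=1, keepdims=True)
    tension = np.maximum(k * (dist - seg), 0.0)            # a tether pulls but cannot push
    f_in = -tension * d / np.maximum(dist, 1e-12)          # force pulling each bead inward
    a = f_in / m
    a[:-1] -= f_in[1:] / m                                 # reaction from each bead's outer segment
    return a, float(tension[0])

avg_root_tension = 0.0
for i in range(steps):
    a, root = accelerations(pos)
    vel += a * dt                        # semi-implicit Euler: update velocity, then position
    pos += vel * dt
    if i >= steps // 2:
        avg_root_tension += root / (steps - steps // 2)

# Sanity check: holding a rigidly rotating tether requires a root tension of sum(m * omega^2 * r_i).
rigid = float(np.sum(m * omega**2 * seg * np.arange(1, N + 1)))
print(f"time-averaged root tension {avg_root_tension:.4f} N vs rigid-rotation value {rigid:.4f} N")
```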
An unusually wide-beam laser range finder would be combined with a quadrant detector so that the distance and direction to a target could be measured simultaneously.
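The measurement idea can be written down compactly: the quadrant detector gives two normalized difference-over-sum signals that indicate direction, while the laser pulse round-trip time gives range. The quadrant labeling and scale factors in the sketch below are assumptions for illustration, not the design described in the brief.

```python
# Minimal sketch of how a quadrant detector plus a pulsed range measurement could report
# direction and distance together. The quadrant labeling, scaling, and timing source are
# assumptions for illustration, not the design described in the brief.

C = 299_792_458.0  # speed of light, m/s

def direction_from_quadrants(q_a, q_b, q_c, q_d):
    """Normalized horizontal/vertical offsets of the return spot on the detector.

    Quadrants are assumed to be laid out as: A upper-left, B upper-right,
    C lower-left, D lower-right. Outputs lie in [-1, 1]; a calibration factor
    would convert them to angles for a given spot size and optics.
    """
    total = q_a + q_b + q_c + q_d
    if total <= 0.0:
        raise ValueError("no return signal")
    horiz = ((q_b + q_d) - (q_a + q_c)) / total   # positive = spot right of center
    vert = ((q_a + q_b) - (q_c + q_d)) / total    # positive = spot above center
    return horiz, vert

def range_from_round_trip(t_round_trip_s):
    """Distance from the measured laser pulse round-trip time."""
    return 0.5 * C * t_round_trip_s

# Example: a slightly up-and-right return spot, 667 ns round trip (about 100 m).
print(direction_from_quadrants(0.9, 1.2, 0.8, 1.1), range_from_round_trip(667e-9))
```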
Photonics makes connections in space. Richard T Howard, Thomas C Bryan, Michael L Book, Lesley Rogers. Photonics Spectra 33:55, 191-194, 5/1999. NASA-Marshall is developing technology for automation of spacecraft ...
The Advanced Video Guidance Sensor (AVGS) was designed to be the proximity operations sensor for the Demonstration of Autonomous Rendezvous Technologies (DART). The DART mission flew in April of 2005 and was a partial success. The AVGS did not get the opportunity to operate in every mode in orbit, but those modes in which it did operate were completely successful. This paper will detail the development, testing, and on-orbit performance of the AVGS.
Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends image data processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to...
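The split between the two subprograms can be illustrated with a toy dispatcher: one side validates commands, handles non-image control functions, and forwards image work; the other consumes it. The command names, fields, and queue mechanism below are hypothetical stand-ins, not the embedded flight software.

```python
import queue
import threading

# Illustrative sketch of a two-subprogram command split: processor one validates commands
# and handles control functions, processor two consumes the image-processing portions.
# Command names and fields are hypothetical; this is not the AVGS flight software.

VALID_COMMANDS = {"SET_MODE", "FIRE_LASERS", "PROCESS_FRAME", "REPORT_STATUS"}
IMAGE_COMMANDS = {"PROCESS_FRAME"}

image_queue = queue.Queue()

def processor_one(command):
    """Check a command for correctness, act on control commands, forward image work."""
    name = command.get("name")
    if name not in VALID_COMMANDS:
        return f"REJECT {name}"
    if name in IMAGE_COMMANDS:
        image_queue.put(command)          # hand the image-data portion to processor two
        return f"FORWARDED {name}"
    return f"EXECUTED {name}"             # non-image control function handled locally

def processor_two():
    """Process image-data commands as they arrive (stand-in for centroiding, etc.)."""
    while True:
        command = image_queue.get()
        if command is None:               # sentinel used here to stop the demo worker
            break
        frame = command.get("frame", [])
        print("processed frame with", len(frame), "pixels")

worker = threading.Thread(target=processor_two, daemon=True)
worker.start()
print(processor_one({"name": "SET_MODE", "mode": "ACQUIRE"}))
print(processor_one({"name": "PROCESS_FRAME", "frame": [0] * 1024}))
image_queue.put(None)
worker.join()
```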
Robotic systems and surface mobility will play an increased role in future exploration missions. Unlike the LRV of the Apollo era, which was an astronaut-piloted vehicle, future systems will include teleoperated and semi-autonomous operations. The tasks given to these vehicles will range from infrastructure maintenance and ISRU to construction, to name a few. A common task that may be performed would be the retrieval and deployment of trailer-mounted equipment. Operational scenarios may require these operations to be performed remotely via a teleoperated mode, or semi-autonomously. This presentation describes the on-going project to adapt the Automated Rendezvous and Capture (AR&C) sensor developed at the Marshall Space Flight Center for use in an automated trailer pick-up and deployment operation. The sensor, which has been successfully demonstrated on-orbit, has been mounted on an iRobot/John Deere RGATOR autonomous vehicle for this demonstration, which will be completed in the Ma...
In the development of the technology for autonomous rendezvous and docking, key infrastructure capabilities must be used for effective and economical development. This involves facility capabilities, both equipment and personnel, to devise, develop, qualify, and integrate ARD elements and subsystems into flight programs. One effective way of reducing technical risks in developing ARD technology is the use of the ultimate
Four updated video guidance sensor (VGS) systems have been proposed. As described in a previous NASA Tech Briefs article, a VGS system is an optoelectronic system that provides guidance for automated docking of two vehicles. The VGS provides relative position and attitude (6-DOF) information between the VGS and its target. In the original intended application, the two vehicles would be spacecraft, but the basic principles of design and operation of the system are applicable to aircraft, robots, objects maneuvered by cranes, or other objects that may be required to be aligned and brought together automatically or under remote control. In the first two of the four VGS systems as now proposed, the tracked vehicle would include active targets that would light up on command from the tracking vehicle, and a video camera on the tracking vehicle would be synchronized with, and would acquire images of, the active targets. The video camera would also acquire background images during the perio...
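The synchronization described above enables a simple foreground/background scheme: subtract a frame taken with the targets dark from one taken with them lit, so common clutter cancels and only the target spots remain. The sketch below shows that step with assumed thresholds and a basic intensity-weighted centroid; it is not the proposed systems' actual processing.

```python
import numpy as np
from scipy import ndimage

# Minimal sketch of the synchronized foreground/background idea: subtract a frame taken
# with the targets dark from one taken with them lit, so common background clutter
# cancels and only the target spots remain. Thresholds and centroiding here are
# illustrative stand-ins, not the proposed systems' processing.

def extract_spots(lit, dark, threshold=40):
    """Intensity-weighted (row, col) centroids of spots present only in the lit frame."""
    diff = lit.astype(np.int32) - dark.astype(np.int32)
    labels, count = ndimage.label(diff > threshold)
    spots = []
    for idx in range(1, count + 1):
        rows, cols = np.nonzero(labels == idx)
        w = diff[rows, cols].astype(float)
        spots.append((float(np.average(rows, weights=w)),
                      float(np.average(cols, weights=w))))
    return spots

# Synthetic example: three 3x3 'target' spots added on top of a shared background frame.
rng = np.random.default_rng(1)
dark = rng.integers(0, 30, (120, 160)).astype(np.uint8)
lit = dark.copy()
for r, c in [(30, 40), (30, 120), (90, 80)]:
    lit[r - 1:r + 2, c - 1:c + 2] = 255
print(extract_spots(lit, dark))
```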
The Advanced Video Guidance Sensor (AVGS) was the primary docking sensor for the Orbital Express mission. The sensor performed extremely well during the mission, and the technology has been proven on orbit in other flights as well. Parts obsolescence issues prevented the construction of more AVGS units, so the next generation of sensor was designed with current parts and updated to support future programs. The Next Generation Advanced Video Guidance Sensor (NGAVGS) has been tested as a breadboard, two different brassboard units, and a prototype. The testing revealed further improvements that could be made and demonstrated capability beyond any previously demonstrated by the sensor on orbit. This paper presents some of the sensor history, parts obsolescence issues, radiation concerns, and software improvements to the NGAVGS. In addition, some of the testing and test results are presented. The NGAVGS has shown that it will meet the general requirements for any space proximity operations or doc...
The Video Guidance Sensor (VGS) system is an optoelectronic sensor that provides automated guidance between two vehicles. In the original intended application, the two vehicles would be spacecraft docking together, but the basic principles of design and operation of the sensor are applicable to aircraft, robots, vehicles, or other objects that may be required to be aligned for docking, assembly, resupply, or precise separation. The system includes a sensor head containing a monochrome charge-coupled-device video camera and pulsed laser diodes mounted on the tracking vehicle, and passive reflective targets on the tracked vehicle. The lasers illuminate the targets, and the resulting video images of the targets are digitized. Then, from the positions of the digitized target images and known geometric relationships among the targets, the relative position and orientation of the vehicles are computed. As described thus far, the VGS system is based on the same principles as those of the ...
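The last step, recovering relative position and orientation from the digitized spot positions and the known target geometry, is a perspective-pose problem. The sketch below solves it with OpenCV's solvePnP for a hypothetical square target layout and assumed camera intrinsics; the real VGS target pattern and flight algorithm are not reproduced here.

```python
import numpy as np
import cv2

# Illustrative pose recovery from target spot centroids using OpenCV's solvePnP.
# The square target layout, camera intrinsics, and measured pixel coordinates are
# all assumed values; they are not the VGS target pattern or its flight algorithm.

# Hypothetical planar target: four reflectors at the corners of a 20 cm square (meters).
target_points = np.array([[-0.10, -0.10, 0.0],
                          [ 0.10, -0.10, 0.0],
                          [ 0.10,  0.10, 0.0],
                          [-0.10,  0.10, 0.0]], dtype=np.float64)

# Assumed pinhole camera: 800-pixel focal length, principal point at the image center.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Spot centroids in pixels (e.g. from a spot extractor); these simulated measurements
# correspond to the target sitting roughly 2 m straight ahead of the camera.
image_points = np.array([[280.3, 199.7],
                         [359.6, 200.2],
                         [360.1, 280.4],
                         [279.8, 279.9]])

ok, rvec, tvec = cv2.solvePnP(target_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)     # rotation from the target frame to the camera frame
    print("relative position (m):", tvec.ravel())   # expect roughly [0, 0, 2]
    print("relative attitude:\n", R)
```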