Talk-15: YellowFin UUV

Michael Novitzky, Paul Varnell, Evan Seguin, Andrew Melim, Mick E. West, Georgia Tech Research Institute

The Georgia Tech Research Institute (GTRI), the applications branch of the Georgia Institute of Technology, has developed the Yellowfin, a small man-portable unmanned underwater vehicle (UUV). The mission of the Yellowfin is to conduct autonomous collaborative operations. Operating multiple UUVs allows a much wider swath of the ocean to be observed and monitored, and both oceanographic and military missions benefit greatly from a UUV network. In this presentation, we introduce the design of the Yellowfin system.

The software architecture of each vehicle is provided by MOOS-IvP, integrated with several other components: communication with the WHOI acoustic modem using the JAUS message standard, mission planning using MissionLab, mission execution using FalconView, front-seat control with an XMOS microcontroller, and visualization with Blender.

pAcommsHandler provides robust message handling for the WHOI acoustic modem, but has been modified to use the OpenJAUS (Joint Architecture for Unmanned Systems) standard instead of CCL/DCCL for inter-vehicle and base-station messages. Mission planning is performed with MissionLab, an open-source software tool developed by Georgia Tech. MissionLab allows a state machine to be built from the available behaviors, which is then converted into IvP Helm files for mission execution.

FalconView is a mission-control interface and mapping system developed by GTRI and widely used by the US Air Force. When tethered to the Yellowfin UUV, it can connect directly to the MOOSDB as a client, receiving near-real-time pose updates that are displayed on an overhead global map for ease of mission collaboration.

The XMOS multi-core, multi-threaded processor serves as the front-seat processor on the Yellowfin. Its event-driven I/O architecture places it between an FPGA and a conventional microcontroller in capability. Currently, XMOS relays information to and from the low-level sensors and actuators over Ethernet to the MOOSDB through a Python server running on the back-seat processor, a pico-ITX single-board computer.

The Blender simulator, available under the GNU General Public License, provides Python interfaces that, together with MOOSDB bindings, turn a Blender simulation into a MOOSDB client subscribing to vehicle pose information. Updating the simulated vehicle's pose from the MOOSDB keeps it in sync with other visualization tools such as pMarineSim.
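To make the inter-vehicle messaging concrete, the sketch below shows the kind of quantized fixed-point encoding that size-conscious acoustic message standards such as CCL/DCCL use for low-bandwidth links. This is an illustrative toy only: the Yellowfin actually uses OpenJAUS messages, whose wire format differs, and the field layout, scaling factors, and function names here are all hypothetical.

```python
import struct

# Hypothetical 17-byte pose message: the layout and scale factors below
# are invented for illustration; they are NOT the OpenJAUS wire format.
# <  : little-endian, no padding
# B  : vehicle id            (1 byte)
# I  : timestamp, seconds    (4 bytes)
# i  : latitude  * 1e5       (4 bytes, ~1 m resolution)
# i  : longitude * 1e5       (4 bytes)
# H  : heading   * 10, deg   (2 bytes, 0.1 deg resolution)
# h  : depth     * 10, m     (2 bytes, 0.1 m resolution)
POSE_FMT = "<BIiiHh"

def pack_pose(vehicle_id, t, lat, lon, heading_deg, depth_m):
    """Quantize a vehicle pose into a compact fixed-point message."""
    return struct.pack(POSE_FMT, vehicle_id, t,
                       int(round(lat * 1e5)), int(round(lon * 1e5)),
                       int(round(heading_deg * 10)) % 3600,
                       int(round(depth_m * 10)))

def unpack_pose(buf):
    """Invert pack_pose, recovering engineering units."""
    vid, t, lat, lon, hdg, depth = struct.unpack(POSE_FMT, buf)
    return vid, t, lat / 1e5, lon / 1e5, hdg / 10.0, depth / 10.0

# Example round trip: the pose survives within quantization error.
msg = pack_pose(7, 1234, 33.12345, -117.54321, 271.5, 12.3)
```

The design point this illustrates is the one that motivates CCL/DCCL in the first place: at acoustic-modem data rates, every field is scaled into the smallest integer that still covers its range and resolution.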
The presentation will show initial test and simulation results.
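The MissionLab workflow described above, building a state machine out of available behaviors and converting it to IvP Helm files, might be sketched as follows. This is a minimal, hypothetical rendition: the state representation and emitted parameters are invented, and MissionLab's real exporter and the full IvP Helm .bhv grammar are considerably richer.

```python
# Hypothetical sketch: each state in an ordered state machine becomes one
# BHV_Waypoint block, chained by MODE conditions and endflags so that
# completing a state activates the next one. Real .bhv files support many
# more behaviors and parameters than shown here.
BHV_TEMPLATE = """Behavior = BHV_Waypoint
{{
  name      = {name}
  condition = MODE == {mode}
  endflag   = MODE = {next_mode}
  points    = {points}
  speed     = {speed}
}}
"""

def states_to_bhv(states):
    """states: ordered list of dicts with keys name, points, speed.
    Returns the text of an IvP Helm-style behavior file."""
    blocks = []
    for i, s in enumerate(states):
        nxt = states[i + 1]["name"] if i + 1 < len(states) else "COMPLETE"
        blocks.append(BHV_TEMPLATE.format(
            name=s["name"].lower(),
            mode=s["name"],
            next_mode=nxt,
            # IvP waypoint lists are colon-separated x,y pairs.
            points=":".join(f"{x},{y}" for x, y in s["points"]),
            speed=s["speed"]))
    return "\n".join(blocks)

# Example: a two-state survey-then-return mission.
bhv_text = states_to_bhv([
    {"name": "SURVEY", "points": [(60, -40), (60, -160)], "speed": 2.0},
    {"name": "RETURN", "points": [(0, 0)], "speed": 1.5},
])
```

The essential idea is the same as MissionLab's: the graphical state machine is purely a planning artifact, and what the vehicle actually consumes is the flat behavior file with mode conditions encoding the transitions.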

Categories:

  • Autonomy
  • Multi-Vehicle Autonomy
  • MOOS-IvP
  • Mission Planning
  • Mission Monitoring
  • UUVs
  • Acoustic Communications
  • Academia