Mobility Virtual Environment - MoVE

Table of Contents

  1. Introduction
  2. What need does MoVE address?
  3. Is MoVE a vehicle simulator? Can MoVE capture motion from real and virtual vehicles?
  4. How is motion determined for virtual vehicles? Do you need a virtual driver?
  5. MoVE Core
  6. Controlling the experiment with runState
  7. Config File
  8. Is MoVE right for you?
  9. MoVE as open-source software
  10. Inspiration and similarities

Introduction

The Mobility Virtual Environment, or MoVE, is a multi-vehicle testing framework that captures motion from real and virtual vehicles and pedestrians in a common coordinate system with a common timestamp.

MoVE allows the vehicle autonomy and mobility research community to configure, study, and share multi-vehicle tests that include a mix of real and virtual elements. The goal of testing is increased realism and maturity, but along the way it is helpful to reduce risk by representing some elements virtually. MoVE provides a spectrum of realism in which to test, develop, and advance algorithms, sensors, requirements, and autonomous system designs.

MoVE spatial layout schematic

What need does MoVE address?

Self-driving vehicles and Unmanned Autonomous Systems (UAS) carry inherent risk during testing on public roadways, public waterways, or in public airspace. What is needed is a way to test multi-vehicle scenarios safely, in an open, transparent, and repeatable way.

The concept is not entirely new, but models, algorithms, and data need to be shared more among the different technical and user communities. Proprietary counterparts exist in industry, but the research community needs a common environment for rehearsing, sharing, verifying, and collaborating on multi-vehicle scenarios.

For example, the sense-and-avoid maneuver for ground and aerial vehicles has inherent risk during testing. Using a virtual avoidance vehicle (or a virtual target) while developing the right sensors and avoidance algorithms can improve the testing process, improve the final result, and improve public confidence in the final design as fidelity and realism increase.

Research communities targeted include those studying dynamics, mobility, autonomy, robotics, mechatronics, aerial vehicles, surface vehicles, ground vehicles, sensor fusion, Artificial Intelligence (AI), embedded systems, GPS and GPS-denied operation, localization, by-wire conversion, tele-op, instrumentation, and wireless communications.


Is MoVE a vehicle simulator? Can MoVE capture motion from real and virtual vehicles?

Yes. MoVE is a multi-vehicle simulator that can be configured with N virtual vehicles, where N is limited only by the available computing and network resources. This is simulation-only mode. The figure below shows three virtual vehicle models exercising the wander(), periodicTurn(), and stayInBounds() behaviors.
MoVE architecture

Alternatively, MoVE can be configured to accept live GPS inputs from real vehicles and pedestrians. This second configuration is essentially a data-collection method for outdoor experiments with real people and real vehicles streaming GPS positions over the cellular network.

The figure below is an example outdoor experiment role-playing a medical evacuation scenario. The scenario contains:
MoVE architecture
A photo taken from the real UAV showing the role-playing injured person and medic:
MoVE architecture

The interesting combination occurs when N simulated vehicles are combined with M live GPS inputs from real people and real vehicles in the same coordinate frame and at the same time.

This is mixed-mode simulation with real and virtual vehicles operating and possibly interacting in the same coordinate system in real time.

In this mode, virtual vehicles can 'see' and avoid real vehicles and real people. Or, real people with awareness of the virtual environment can 'see' and avoid the virtual challenge vehicles. This situation provides repeatable, virtual traffic for developing autonomy behaviors using real vehicles and real people but with reduced collision risk.

These mixed real-and-virtual experiments are the very situations needed for safely developing autonomy algorithms that contain significant risk.


How is motion determined for virtual vehicles? Do you need a virtual driver?

Yes. MoVE tests focus on vehicle mobility, which means operators must provide inputs to command each vehicle. For real vehicles and pedestrians, we assume human operators. For virtual vehicles, MoVE includes a behavior scheduler with each built-in vehicle model to provide interesting operator inputs.
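As an illustrative sketch only, a priority-based behavior scheduler could look like the following. The function names echo the built-in behaviors mentioned above, but the signatures, priorities, and command format here are assumptions, not MoVE's actual API:

```python
import random

def wander(t):
    """Small random steering perturbations at a constant speed."""
    return {"steer": random.uniform(-0.1, 0.1), "speed": 2.0}

def periodic_turn(t, period=10.0):
    """Command a hard turn for one second out of every 'period' seconds."""
    if (t % period) < 1.0:
        return {"steer": 0.5, "speed": 2.0}
    return None  # decline control this tick

def schedule(behaviors, t):
    """Return the command from the highest-priority behavior that wants control.

    'behaviors' is a list of (priority, behavior_function) pairs.
    """
    for _prio, behavior in sorted(behaviors, key=lambda b: b[0], reverse=True):
        cmd = behavior(t)
        if cmd is not None:
            return cmd
    return {"steer": 0.0, "speed": 0.0}  # default: stop
```

A lower-priority behavior like wander() supplies inputs only when no higher-priority behavior (such as the turn) is active, which is one common way to arbitrate among competing behaviors.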


MoVE Core

MoVE Core aggregates all real and virtual vehicle positions into State, which contains a snapshot of all real and virtual vehicles in the scenario.
MoVE architecture

The scenario State is time-stamped and logged at a frequency specified by the config file for post-test replay and analysis. Because every clock on every computer has error, the timestamp from Core is chosen as the common timestamp for all vehicles in the multi-vehicle scenario.
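A minimal sketch of this time-stamping idea, assuming State is a simple mapping from vehicle IDs to position records (MoVE's actual State object carries more fields):

```python
import time

def snapshot(vehicles):
    """Stamp a copy of all vehicle states with Core's clock.

    Core's own clock supplies the common timestamp, so per-vehicle
    clock errors do not enter the logged scenario State.
    """
    return {"t": time.time(),
            "vehicles": {vid: dict(v) for vid, v in vehicles.items()}}
```

Logging such snapshots at the frequency given in the config file yields a replayable time history of the whole scenario.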

For real vehicles with live GPS inputs, coordinate transformations are performed between the GPS geodetic frame, with latitude/longitude in decimal degrees, and a local Earth-fixed, orthogonal (Cartesian) XYZ coordinate frame.

For virtual vehicles, the opposite conversion is performed, so even simulated vehicles in the XYZ frame have a corresponding lat/lon location in geodetic coordinates.

Because live GPS vehicles and virtual simulated vehicles each have both lat/lon and XYZ coordinates, all vehicles can be plotted together on an XYZ plot or on Google Maps with lat/lon coordinates.
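One simple way to perform such a conversion is a flat-Earth (equirectangular) approximation about the scenario's lat/lon origin. This is a sketch of the idea, not MoVE's actual transformation code, which may use a different method or datum:

```python
import math

R_EARTH = 6371000.0  # mean Earth radius, meters

def lla_to_xy(lat, lon, lat0, lon0):
    """Convert lat/lon (decimal degrees) to local X (east) / Y (north)
    meters about the origin (lat0, lon0), flat-Earth approximation."""
    x = math.radians(lon - lon0) * R_EARTH * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * R_EARTH
    return x, y

def xy_to_lla(x, y, lat0, lon0):
    """Inverse conversion: local XY meters back to lat/lon degrees."""
    lat = lat0 + math.degrees(y / R_EARTH)
    lon = lon0 + math.degrees(x / (R_EARTH * math.cos(math.radians(lat0))))
    return lat, lon
```

The approximation is accurate for scenario areas of a few kilometers; over larger areas a proper map projection or geodesic library would be appropriate.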

MoVE Core is also where vehicle separation distances are monitored for the avoid() behavior. If Core detects an impending collision between two vehicles, it sends each an avoid message containing the other vehicle's location so both can turn away and avoid the collision.
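The separation-monitoring step can be sketched as a pairwise distance check over all vehicles in State. This is an illustration only; the threshold, data layout, and messaging are assumptions rather than MoVE's actual implementation:

```python
import itertools
import math

def separation_alerts(positions, threshold=10.0):
    """Return (id_a, id_b, distance) for every vehicle pair closer
    than 'threshold' meters.

    'positions' maps vehicle IDs to (x, y) in the local XYZ frame.
    In MoVE, Core would follow up by sending each vehicle of an
    alerted pair an avoid message with the other vehicle's location.
    """
    alerts = []
    for (a, pa), (b, pb) in itertools.combinations(sorted(positions.items()), 2):
        d = math.hypot(pa[0] - pb[0], pa[1] - pb[1])
        if d < threshold:
            alerts.append((a, b, d))
    return alerts
```

The brute-force pairwise check is O(N^2), which is acceptable for the modest vehicle counts typical of a single scenario.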


Controlling the experiment with runState

Steps for completing an example experiment are presented under the Hello World tab at the top.

Once vehicle processes are launched and MoVE Core is running, the experiment is controlled using the main_changeRunState.py command-line program.
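As a hedged sketch of how such a command-line tool might deliver a runState change to Core over UDP (the port number and plain-text message format here are assumptions; main_changeRunState.py defines the real interface and protocol):

```python
import socket

def send_run_state(state_cmd, core_addr=("127.0.0.1", 5555)):
    """Send a runState command string to Core over UDP.

    Hypothetical example: both the default port and the bare-string
    message format are invented for illustration.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(state_cmd.encode("utf-8"), core_addr)
    finally:
        sock.close()
```

UDP fits this use well: the command is a small, self-contained datagram and requires no persistent connection to Core.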

The choices for runState are:

Config File

The configuration file is intended to improve collaboration and data-sharing among vehicle mobility researchers. The config file captures scenario details such as:
  • number of virtual vehicles
  • number of real vehicles
  • lat/lon origin
  • behaviors that are enabled or disabled
  • State logging frequency
  • integrator stepsize
  • udp network ports for vehicle-to-core and core-to-vehicle communication
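To make the idea concrete, here is a hypothetical scenario config and a reader for it. Every key name and value below is invented for illustration; MoVE's actual config schema will differ:

```python
import configparser

# Hypothetical scenario config; key names are illustrative, not MoVE's schema.
EXAMPLE_CFG = """
[scenario]
n_virtual_vehicles = 3
n_real_vehicles = 2
lat_origin = 32.6099
lon_origin = -85.4808
behaviors = wander, stayInBounds
log_freq_hz = 10
dt = 0.01
udp_port_base = 5600
"""

def load_scenario(text):
    """Parse the [scenario] section into typed Python values."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    s = cfg["scenario"]
    return {
        "n_virtual": s.getint("n_virtual_vehicles"),
        "n_real": s.getint("n_real_vehicles"),
        "origin": (s.getfloat("lat_origin"), s.getfloat("lon_origin")),
        "behaviors": [b.strip() for b in s["behaviors"].split(",")],
        "log_freq_hz": s.getfloat("log_freq_hz"),
        "dt": s.getfloat("dt"),
        "udp_port_base": s.getint("udp_port_base"),
    }
```

Because a file like this fully specifies the scenario, emailing it to a colleague is enough for them to recreate and extend the same experiment.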

The intent is for collaborators to develop a scenario, send it to colleagues, and allow others to recreate the same scenario and continue experimentation.


Is MoVE right for you?

If you have an idea for a multi-vehicle scenario or an algorithm you'd like to explore, MoVE may be the right framework for you.

MoVE is almost entirely command-line software, written in Python 3, so the right person will be comfortable at a Linux command prompt or be willing to learn at least the basics.


MoVE as open-source software

MoVE is distributed under the open-source terms of the GNU General Public License, version 3 (GPLv3).

This means it is free for you to download, study the source code, use, and modify as you like.

The GPLv3 is a copyleft license, which means any version released separately from this distribution must also be released under the GPLv3.

The intent is to create a strong user community and ensure this software remains free and open-source.

Read more about the GPLv3 here.


Inspiration and similarities

MoVE was inspired by OneSAF and other networked, distributed virtual environments that incorporate real vehicles and pedestrians alongside their virtual counterparts.

MoVE's design is similar to OneSAF's technical framework as a network-centric, multi-site, multi-vehicle simulation and training capability.

However, OneSAF is focused on military simulation and training and is not open source. MoVE is open-source and serves as a publicly available virtual proving ground for self-driving vehicles and Unmanned Autonomous Systems.