Set Up Your Dockerfile for SITL Simulation

This guide walks you through how to set up your Dockerfile to execute Software-in-the-Loop (SITL) simulation tests. At the end of the page, you'll find an example Dockerfile for a PX4 Gazebo (px4-gazebo) simulation.

The Docker image built from this Dockerfile is what Chassy runs, with the mounts described below, to execute your simulation tests. This gives you maximum flexibility to define your simulation environment as you wish, while still running in a structured environment and producing reports.

Basic Mountpoints

When Chassy runs your Docker image, the following mounts are added, each with its corresponding function:

/opt/chassy/src

Your GitHub source code will be mounted here. It is best practice to include all artifacts required for simulation, such as robot, world, and launch files, in your Git repository so they are versioned in lock-step with your simulation.

/opt/chassy/simulate

This executable is the entrypoint Chassy uses to run your simulation. Provide all necessary arguments and commands to run your simulation here.

/opt/chassy/reports

Your tests are responsible for populating the reports directory so that results can be reported and displayed by Chassy. The following report types are currently supported, and more are being added.

JUnit Reports

JUnit reports serve as the basic status reports for test suites. Reports should be saved as JUnit XML files in the root of /opt/chassy/reports. Each test suite can have its own separate XML file.

  /opt/chassy/reports/
  ├── junit_setup_validation.xml       
  ├── test-results-minimal.xml         
  ├── test-results-gazebo.xml          
  ├── test-results-flight.xml          
  └── test-results-integration.xml      # Combined test results
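
If your tests run under pytest (installed later in the example Dockerfile), the JUnit XML can be produced directly; the test path and output file name below are illustrative:

  # Emit a JUnit XML report for this suite into the Chassy reports mount.
  pytest tests/ --junitxml=/opt/chassy/reports/test-results-minimal.xml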

Screenshots and GIFs

Screenshots and GIFs should be stored under a screenshots directory. Subdirectories should denote test suites and can contain screenshots, GIFs, or both.

Universal Scene Description (USD) Files

Some simulation engines, such as NVIDIA Isaac, offer the ability to save scene state. Place these USD files in the OVD subdirectory, grouped by test, with a directory for each test. In the example below, directories are named TESTNAME_TIMESTAMP; this naming is good practice but not required. Accompanying JSON files can contain per-test metadata that will be displayed alongside each test.
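
A sketch of the layout described above (the test names and timestamps are illustrative):

  /opt/chassy/reports/OVD/
  ├── hover_test_20250101T120000/
  │   ├── scene.usd
  │   └── metadata.json
  └── landing_test_20250101T121500/
      ├── scene.usd
      └── metadata.json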

MCAP

MCAP logs can be grouped by process as shown below:
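
For example (the mcap directory and per-process names are illustrative assumptions):

  /opt/chassy/reports/mcap/
  ├── px4_sitl/
  │   └── px4_sitl.mcap
  └── gz_sim/
      └── gz_sim.mcap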

/opt/chassy/logs

All logs should be stored in /opt/chassy/logs to be ingestible by Chassy. Chassy currently supports the following log types, and more are being added every day.

ROS and ROS 2: rosconsole / rosout / rcl_logging

To facilitate this, it is recommended to configure ROS to change the default log directory from ~/.ros/log to /opt/chassy/logs/.
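
One way to do this is from your Dockerfile, using the standard ROS_LOG_DIR environment variable (honored by both ROS 1 and ROS 2 logging):

  # Redirect ROS / ROS 2 logs from ~/.ros/log to the Chassy logs mount.
  ENV ROS_LOG_DIR=/opt/chassy/logs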

Google Logs

For Google logs (glog), please store them using the standard glog naming (i.e., log.LOGLEVEL extensions).
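
If your glog build reads GLOG_-prefixed environment variables (the common case when built with gflags support), the log directory can be pointed at the Chassy mount from the Dockerfile; this is a sketch, not the only option:

  # Equivalent to passing --log_dir to a glog-enabled binary.
  ENV GLOG_log_dir=/opt/chassy/logs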

C++ spdlog Logs

If using spdlog, store logs using the .log extension.

It is good practice to set up a rotating logger so unexpected failures do not result in lost logs, such as in this example:
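
A minimal sketch using spdlog's rotating file sink (the file name, size limit, and file count are illustrative):

  #include <spdlog/spdlog.h>
  #include <spdlog/sinks/rotating_file_sink.h>

  int main() {
      // Rotate at 5 MB per file and keep 3 rotated files under the Chassy logs mount.
      auto logger = spdlog::rotating_logger_mt(
          "simulation", "/opt/chassy/logs/simulation.log", 5 * 1024 * 1024, 3);
      // Flush on warnings and above so a crash loses as little as possible.
      logger->flush_on(spdlog::level::warn);
      logger->info("Simulation started");
      return 0;
  }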

Python Logs

For Python logging, it is good practice to set up a rotating logger using the logging module.

Example Python code:
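
A minimal sketch using logging.handlers.RotatingFileHandler (the logger name, file name, and rotation settings are illustrative):

  import logging
  from logging.handlers import RotatingFileHandler

  # Rotate at 5 MB per file and keep 3 backups under the Chassy logs mount.
  handler = RotatingFileHandler(
      "/opt/chassy/logs/simulation.log", maxBytes=5 * 1024 * 1024, backupCount=3
  )
  handler.setFormatter(
      logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
  )

  logger = logging.getLogger("simulation")
  logger.setLevel(logging.INFO)
  logger.addHandler(handler)

  logger.info("Simulation started")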

/opt/chassy/artifacts (Under Development)

In the future, Chassy will provide another mount option to allow large immutable infrastructure artifacts such as world files, mapping data or other models to be mounted separately from Chassy Index.

Example Dockerfile

The primary purpose of the Dockerfile is to set up your simulation environment. Below is an example Dockerfile that sets up a PX4 Gazebo simulation environment, using either CPU-only simulation or CPU and GPU.

Note that this is intended to act as a guideline; your Dockerfile does not need to follow these conventions.

We will walk through each part of the Dockerfile separately, and then view the resulting file at the end.

Base Image

Starts from nvidia/opengl:1.2-glvnd-runtime-ubuntu22.04 → provides Ubuntu 22.04 with NVIDIA OpenGL runtime support for GPU acceleration.
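
The corresponding instruction:

  FROM nvidia/opengl:1.2-glvnd-runtime-ubuntu22.04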

Environment Setup

Sets DEBIAN_FRONTEND=noninteractive to suppress interactive prompts during package installation.
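
In the Dockerfile:

  # Suppress interactive prompts from apt during the image build.
  ENV DEBIAN_FRONTEND=noninteractive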

Install Dependencies

Updates package lists and installs a large set of required tools and libraries:

  • Build tools: curl, wget, git, cmake, build-essential, pkg-config

  • Python: python3, python3-pip, python3-dev

  • Libraries: libxml2-dev, libxslt-dev, libeigen3-dev, libopencv-dev, libgoogle-glog-dev, protobuf-compiler

  • GStreamer stack: gstreamer1.0-plugins-bad, gstreamer1.0-libav, gstreamer1.0-gl, libgstreamer-plugins-base1.0-dev

  • Other tools: xvfb (virtual framebuffer for headless rendering), mesa-utils, geographiclib-tools, software-properties-common, lsb-release, gnupg2, sudo, libimage-exiftool-perl

  • NVIDIA runtime libraries: nvidia-utils-470, libnvidia-gl-470

Cleans up apt cache to reduce image size.
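
A sketch of the corresponding RUN instruction, using the packages listed above (exact package selection may vary for your environment):

  RUN apt-get update && apt-get install -y --no-install-recommends \
          curl wget git cmake build-essential pkg-config \
          python3 python3-pip python3-dev \
          libxml2-dev libxslt-dev libeigen3-dev libopencv-dev \
          libgoogle-glog-dev protobuf-compiler \
          gstreamer1.0-plugins-bad gstreamer1.0-libav gstreamer1.0-gl \
          libgstreamer-plugins-base1.0-dev \
          xvfb mesa-utils geographiclib-tools software-properties-common \
          lsb-release gnupg2 sudo libimage-exiftool-perl \
          nvidia-utils-470 libnvidia-gl-470 \
      && rm -rf /var/lib/apt/lists/*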

Gazebo Installation

  • Adds OSRF Gazebo repository and GPG key.

  • Installs gz-harmonic (Gazebo Harmonic simulator).

  • Cleans apt cache.
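
A sketch of these steps, following the OSRF installation instructions for Gazebo Harmonic:

  # Add the OSRF package repository and signing key, then install Gazebo Harmonic.
  RUN curl -fsSL https://packages.osrfoundation.org/gazebo.gpg \
          -o /usr/share/keyrings/pkgs-osrf-archive-keyring.gpg \
      && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/pkgs-osrf-archive-keyring.gpg] http://packages.osrfoundation.org/gazebo/ubuntu-stable $(lsb_release -cs) main" \
          > /etc/apt/sources.list.d/gazebo-stable.list \
      && apt-get update && apt-get install -y gz-harmonic \
      && rm -rf /var/lib/apt/lists/*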

Simulation Environment Variables

Configures headless rendering and GPU usage:

  • HEADLESS=1

  • DISPLAY=:99 for X virtual framebuffer

  • NVIDIA-specific env vars for GPU usage (NVIDIA_VISIBLE_DEVICES, NVIDIA_DRIVER_CAPABILITIES, __GLX_VENDOR_LIBRARY_NAME)

  • Forces Gazebo to use Ogre2 rendering engine.
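
A sketch of the corresponding ENV instructions (the mechanism for forcing Ogre2 varies by Gazebo version, so it is noted only as a comment):

  # Headless rendering through an X virtual framebuffer on display :99.
  ENV HEADLESS=1 \
      DISPLAY=:99
  # Expose the GPU and route GLX through the NVIDIA driver.
  ENV NVIDIA_VISIBLE_DEVICES=all \
      NVIDIA_DRIVER_CAPABILITIES=all \
      __GLX_VENDOR_LIBRARY_NAME=nvidia
  # Ogre2 can also be forced at launch time, e.g. `gz sim --render-engine ogre2`.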

PX4 Workspace

  • Creates working directory /opt/px4_ws.

  • Clones PX4-Autopilot repository (recursive clone to include submodules).

  • Runs PX4 setup script (./Tools/setup/ubuntu.sh --no-nuttx) to install PX4 dependencies, skipping NuttX (firmware build system).

Cleans apt cache again.
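
A sketch of these steps (the clone location mirrors the working directory above):

  WORKDIR /opt/px4_ws
  # Recursive clone pulls in PX4 submodules; the setup script installs
  # simulation dependencies while skipping the NuttX toolchain.
  RUN git clone --recursive https://github.com/PX4/PX4-Autopilot.git \
      && cd PX4-Autopilot \
      && ./Tools/setup/ubuntu.sh --no-nuttx \
      && rm -rf /var/lib/apt/lists/*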

Python Dependencies for Testing & Simulation

Installs Python packages:

  • pytest, pytest-xvfb (testing)

  • psutil, pexpect (process utilities)

  • mavsdk, pymavlink, pytest-asyncio (MAVLink-based drone communication)

  • mss (screenshot utility)
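
A sketch of the corresponding pip install step:

  RUN pip3 install --no-cache-dir \
          pytest pytest-xvfb pytest-asyncio \
          psutil pexpect \
          mavsdk pymavlink \
          mss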

PX4 Build

Builds PX4 in SITL mode (make px4_sitl_default).
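
A sketch of the build step (the path assumes the workspace layout above):

  RUN cd /opt/px4_ws/PX4-Autopilot && make px4_sitl_default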

Set PX4 Environment Variables

  • PX4_HOME=/opt/px4_autopilot

  • Adds PX4 tools to PATH
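
A sketch of these ENV instructions (the exact tools directory appended to PATH is an assumption about the repository layout):

  ENV PX4_HOME=/opt/px4_autopilot
  ENV PATH="${PX4_HOME}/Tools:${PATH}"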

Simulation Entrypoint Setup

  • Copies a custom simulate script into /opt/chassy/simulate.

  • Makes it executable.

  • Sets this script as the Docker entrypoint, so containers run the simulation directly.
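
A sketch of the entrypoint setup (the name of the simulate script in the build context is assumed):

  # Install the simulation entrypoint where Chassy expects it.
  COPY simulate /opt/chassy/simulate
  RUN chmod +x /opt/chassy/simulate
  ENTRYPOINT ["/opt/chassy/simulate"]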

Full example

The full Dockerfile can be found embedded below:
