Set Up Dockerfile for SITL Simulation

This guide walks you through setting up your Dockerfile to execute Software-in-the-Loop (SITL) simulation tests. At the end of the page, you'll find an example Dockerfile for PX4 with Gazebo.

Your Docker image is what Chassy runs to execute your simulation tests. This gives you maximum flexibility to define your simulation environment as you wish, while still running in a structured environment and producing reports.

Basic Mountpoints

When Chassy runs your docker image, the following mounts will be added with their corresponding functions:

/opt/chassy/src

Your GitHub source code will be mounted here. It is best practice to include all artifacts required for simulation, such as robot, world, and launch files, in your Git repository so they stay in lock-step with your simulation code.

/opt/chassy/simulate

This executable is the entrypoint Chassy uses to run your simulation. Provide all necessary arguments and commands to run your simulation.

/opt/chassy/reports

Your tests are responsible for populating the reports directory so that results can be reported and displayed by Chassy. The following report types are currently supported, and more are being added.

JUnit Reports

JUnit reports serve as the basic status reports for test suites. Save them as JUnit XML files in the root of /opt/chassy/reports; each test suite can have its own separate XML file.

/opt/chassy/reports/
├── junit_setup_validation.xml
├── test-results-minimal.xml
├── test-results-gazebo.xml
├── test-results-flight.xml
└── test-results-integration.xml      # Combined test results
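If your suites run under pytest, its built-in --junitxml flag produces these files directly (e.g. pytest --junitxml=/opt/chassy/reports/test-results-minimal.xml). For a custom harness, here is a minimal sketch using only the standard library; the function name and result format are illustrative assumptions, not a Chassy API:

```python
import xml.etree.ElementTree as ET

def write_junit_report(path, suite_name, results):
    """Write a minimal JUnit XML file.

    `results` maps test name -> None (pass) or an error message (failure).
    """
    failures = sum(1 for err in results.values() if err)
    suite = ET.Element("testsuite", name=suite_name,
                       tests=str(len(results)), failures=str(failures))
    for name, err in results.items():
        case = ET.SubElement(suite, "testcase", classname=suite_name, name=name)
        if err:
            ET.SubElement(case, "failure", message=err)
    ET.ElementTree(suite).write(path, encoding="utf-8", xml_declaration=True)

# e.g. write_junit_report("/opt/chassy/reports/test-results-minimal.xml",
#                         "minimal", {"test_boot": None, "test_arm": "timeout"})
```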

Screenshots and GIFs

Screenshots and GIFs are to be stored under a screenshots directory. Subdirectories should denote test suites and can contain screenshots, GIFs, or both.

/opt/chassy/reports/
├── gazebo-test-results-integration.xml    # Test Suite 03-04 JUnit results
├── gazebo-test-results-manipulation.xml   # Test Suite 05 JUnit results
├── gazebo-test-results-simulation.xml     # Test Suite 02 JUnit results
├── gazebo-test-results-unit.xml           # Test Suite 01 JUnit results
└── screenshots/
    └── manipulation/                      # Test Suite 05 visual docs
        ├── observer_camera::link::scene_camera_0.png   # Frame 0
        ├── observer_camera::link::scene_camera_1.png   # Frame 1
        ├── ...                                         # Frames 2-21 (22 total)
        ├── observer_camera::link::scene_camera_21.png  # Frame 21
        └── unknown_test_animation.gif                  # 303KB animated GIF
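A small helper can keep frame output consistent with this layout. The frame-naming scheme below is an illustrative assumption (the example tree uses camera-topic names instead), and how you capture the pixels, e.g. with mss or a simulator camera plugin, is up to you:

```python
import os

def screenshot_path(suite, frame, reports_dir="/opt/chassy/reports"):
    """Return (and create the directory for) frame N of a test suite's screenshots."""
    suite_dir = os.path.join(reports_dir, "screenshots", suite)
    os.makedirs(suite_dir, exist_ok=True)
    # Illustrative frame naming; any .png/.gif filenames are fine
    return os.path.join(suite_dir, f"frame_{frame:03d}.png")
```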

Universal Scene Description (USD) Files

Some simulation engines, such as NVIDIA Isaac, offer the ability to save scene state. Place these USD files in the ovd subdirectory, grouped by test with one directory per test. In the example below, directories are named TESTNAME_TIMESTAMP; this is good practice but not required. Accompanying JSON files can contain metadata for each test that will be displayed alongside it.

/opt/chassy
└── reports
    ├── junit_isaac_environment.xml
    ├── ovd
    │   ├── articulated_robot_20250907_171459_856
    │   │   ├── state_001.json
    │   │   ├── state_001.usd
    │   │   ├── state_002.json
    │   │   ├── state_002.usd
    │   │   ├── state_003.json
    │   │   ├── state_003.usd
    │   │   ├── state_004.json
    │   │   ├── state_004.usd
    │   │   └── summary.json
    │   ├── collision_detection_20250907_171333_867
    │   │   ├── state_001.json
    │   │   ├── state_001.usd
    │   │   ├── state_002.json
    │   │   ├── state_002.usd
    │   │   ├── state_003.json
    │   │   ├── state_003.usd
    │   │   ├── state_004.json
    │   │   ├── state_004.usd
    │   │   ├── state_005.json
    │   │   ├── state_005.usd
    │   │   ├── state_006.json
    │   │   ├── state_006.usd
    │   │   └── summary.json
    │   ├── rigid_body_physics_20250907_171315_612
    │   │   ├── state_001.json
    │   │   ├── state_001.usd
    │   │   ├── state_002.json
    │   │   ├── state_002.usd
    │   │   ├── state_003.json
    │   │   ├── state_003.usd
    │   │   ├── state_004.json
    │   │   ├── state_004.usd
    │   │   ├── state_005.json
    │   │   ├── state_005.usd
    │   │   └── summary.json
    │   └── robot_loading_20250907_171440_848
    │       ├── state_001.json
    │       ├── state_001.usd
    │       ├── state_002.json
    │       ├── state_002.usd
    │       └── summary.json
    ├── test-results-isaac-core.xml
    ├── test-results-isaac-init.xml
    ├── test-results-isaac-physics.xml
    └── test-results-isaac-robot.xml
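The summary.json files can carry whatever metadata you want displayed alongside a test; no fixed schema is stated here, so the field names below are illustrative assumptions:

```python
import json
import os

def write_usd_summary(test_dir, test_name, states_captured, passed):
    """Write an illustrative summary.json next to the captured USD states."""
    summary = {
        "test": test_name,                  # e.g. "collision_detection"
        "states_captured": states_captured, # number of state_NNN.usd files
        "passed": passed,
    }
    path = os.path.join(test_dir, "summary.json")
    with open(path, "w") as f:
        json.dump(summary, f, indent=2)
    return path
```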

MCAP

MCAP logs can be grouped by process as shown below:

/opt/chassy/reports/
├── camera/
│   ├── 2025-09-11T10-15-00Z_camera_run001.mcap
│   ├── 2025-09-11T10-30-00Z_camera_run002.mcap.zst
│   └── latest.mcap -> 2025-09-11T10-30-00Z_camera_run002.mcap.zst
├── perception/
│   ├── 2025-09-11T10-15-05Z_perception_run001.mcap
│   └── latest.mcap -> 2025-09-11T10-15-05Z_perception_run001.mcap
├── planning/
│   ├── 2025-09-11T10-16-10Z_planning_run001.mcap.lz4
│   └── latest.mcap -> 2025-09-11T10-16-10Z_planning_run001.mcap.lz4
├── controls/
│   ├── 2025-09-11T10-16-12Z_controls_run001.mcap
│   └── latest.mcap -> 2025-09-11T10-16-12Z_controls_run001.mcap
└── prediction/
    ├── 2025-09-11T10-16-15Z_prediction_run001.mcap
    └── latest.mcap -> 2025-09-11T10-16-15Z_prediction_run001.mcap
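Writing the MCAP data itself is up to your recorder (for example the mcap Python package, or a ROS 2 bag recorded in MCAP format). A sketch of maintaining the timestamped filenames and latest.mcap symlinks shown above, with illustrative helper names:

```python
import os
from datetime import datetime, timezone

def mcap_log_path(reports_dir, process, run):
    """Build a timestamped path like camera/2025-09-11T10-15-00Z_camera_run001.mcap."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ")
    proc_dir = os.path.join(reports_dir, process)
    os.makedirs(proc_dir, exist_ok=True)
    return os.path.join(proc_dir, f"{stamp}_{process}_run{run:03d}.mcap")

def update_latest(mcap_path):
    """Repoint <process>/latest.mcap at the newest recording."""
    link = os.path.join(os.path.dirname(mcap_path), "latest.mcap")
    if os.path.islink(link) or os.path.exists(link):
        os.remove(link)
    os.symlink(os.path.basename(mcap_path), link)
```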

/opt/chassy/logs

All logs should be stored in /opt/chassy/logs to be ingestible by Chassy. Chassy currently supports the following log types, and more are being added every day.

ROS and ROS 2: rosconsole / rosout / rcl logging

To facilitate this, it is recommended to configure ROS to change its default log directory from ~/.ros/log to /opt/chassy/logs/, for example via the ROS_LOG_DIR environment variable.

/opt/chassy/logs/
├── 2025-09-11-09-42-05/                  # timestamped run directory
│   ├── rosout.log                        # aggregated text log
│   ├── events.log                        # structured event log (YAML/JSON)
│   ├── launch.log                        # launch system messages
│   ├── my_robot_driver-1-stdout.log      # logs from a specific node
│   ├── my_robot_driver-1-stderr.log
│   ├── lidar_node-2-stdout.log
│   ├── lidar_node-2-stderr.log
│   └── <other-node-logs>...
└── 2025-09-11-10-15-32/                  # another run (new timestamp dir)
    ├── rosout.log
    ├── events.log
    ├── launch.log
    ├── navigation_node-3-stdout.log
    ├── navigation_node-3-stderr.log
    └── camera_node-4-stdout.log
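Both ROS 1 and ROS 2 honor the ROS_LOG_DIR environment variable, so the redirect usually just means setting it before launch. A minimal sketch (the launch command in the comment is a placeholder):

```python
import os

def chassy_ros_env(log_dir="/opt/chassy/logs"):
    """Copy of the current environment with ROS logs redirected to Chassy's mount."""
    return dict(os.environ, ROS_LOG_DIR=log_dir)

# e.g. subprocess.Popen(["ros2", "launch", "my_pkg", "sim.launch.py"],
#                       env=chassy_ros_env())
```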

Google Logs

For Google logs (glog), store them using the standard glog extensions (i.e. ...log.LOGLEVEL.<timestamp>.<pid>, as in the tree below), for example by pointing glog at this directory via the GLOG_log_dir environment variable or the log_dir flag.

/opt/chassy/logs/glogs
├── camera/
│   ├── camera.INFO -> camera.robotA.ops.log.INFO.20250911-101532.27451
│   ├── camera.WARNING -> camera.robotA.ops.log.WARNING.20250911-101534.27451
│   ├── camera.ERROR -> camera.robotA.ops.log.ERROR.20250911-102115.27451
│   ├── camera.robotA.ops.log.INFO.20250911-094205.27110
│   ├── camera.robotA.ops.log.INFO.20250911-101532.27451
│   ├── camera.robotA.ops.log.WARNING.20250911-095010.27110
│   ├── camera.robotA.ops.log.WARNING.20250911-101534.27451
│   └── camera.robotA.ops.log.ERROR.20250911-102115.27451
├── perception/
│   ├── perception.INFO -> perception.robotA.ops.log.INFO.20250911-101533.27702
│   ├── perception.WARNING -> perception.robotA.ops.log.WARNING.20250911-101540.27702
│   ├── perception.robotA.ops.log.INFO.20250911-094207.27333
│   ├── perception.robotA.ops.log.INFO.20250911-101533.27702
│   └── perception.robotA.ops.log.WARNING.20250911-101540.27702
├── planning/
│   ├── planning.INFO -> planning.robotA.ops.log.INFO.20250911-101536.27901
│   ├── planning.ERROR -> planning.robotA.ops.log.ERROR.20250911-101842.27901
│   ├── planning.robotA.ops.log.INFO.20250911-094210.27400
│   ├── planning.robotA.ops.log.INFO.20250911-101536.27901
│   └── planning.robotA.ops.log.ERROR.20250911-101842.27901
├── controls/
│   ├── controls.INFO -> controls.robotA.ops.log.INFO.20250911-101537.28045
│   ├── controls.WARNING -> controls.robotA.ops.log.WARNING.20250911-102002.28045
│   ├── controls.robotA.ops.log.INFO.20250911-094211.27450
│   ├── controls.robotA.ops.log.INFO.20250911-101537.28045
│   └── controls.robotA.ops.log.WARNING.20250911-102002.28045
└── prediction/
    ├── prediction.INFO -> prediction.robotA.ops.log.INFO.20250911-101538.28190
    ├── prediction.robotA.ops.log.INFO.20250911-094212.27501
    └── prediction.robotA.ops.log.INFO.20250911-101538.28190

C++ spdlog Logs

If using spdlog, store logs using the .log extension.

/opt/chassy/logs/
├── camera/
│   ├── camera.log                    # main rolling log
│   ├── camera.1.log                  # rotated log (older)
│   ├── camera.2.log
│   └── ...
├── perception/
│   ├── perception.log
│   ├── perception.1.log
│   ├── perception.2.log
│   └── ...
├── planning/
│   ├── planning.log
│   ├── planning.1.log
│   ├── planning.2.log
│   └── ...
├── controls/
│   ├── controls.log
│   ├── controls.1.log
│   ├── controls.2.log
│   └── ...
└── prediction/
    ├── prediction.log
    ├── prediction.1.log
    ├── prediction.2.log
    └── ...

It is good practice to set up a rotating logger so unexpected failures will not result in lost logs, as in this example:

#include "spdlog/spdlog.h"
#include "spdlog/sinks/rotating_file_sink.h"

int main() {
    // Example for the "camera" module
    auto logger = spdlog::rotating_logger_mt(
        "camera_logger",              // logger name
        "/opt/chassy/logs/camera/camera.log", // log file path
        10 * 1024 * 1024,             // max file size (10 MB)
        5                             // keep up to 5 rotated files
    );

    spdlog::set_level(spdlog::level::debug);  // global log level
    logger->info("Camera module started");
    logger->warn("Low light detected");
    logger->error("Camera stream lost!");
}

Python Logs

For Python logging, it is good practice to set up a rotating logger using the logging module.

/opt/chassy/logs/
├── camera/
│   ├── camera.log
│   ├── camera.log.1
│   ├── camera.log.2
│   └── ...
├── perception/
│   ├── perception.log
│   ├── perception.log.1
│   └── ...
├── planning/
│   ├── planning.log
│   ├── planning.log.1
│   └── ...
├── controls/
│   ├── controls.log
│   ├── controls.log.1
│   └── ...
└── prediction/
    ├── prediction.log
    ├── prediction.log.1
    └── ...

Example Python code:

import logging
import logging.handlers
import os

def setup_logger(module_name, log_dir="/opt/chassy/logs", max_bytes=10*1024*1024, backup_count=5):
    """
    Creates a rotating file logger for a specific robotics module.
    
    :param module_name: e.g., "camera", "perception", "planning"
    :param log_dir: base directory for logs
    :param max_bytes: max log size before rotation (default 10 MB)
    :param backup_count: number of rotated logs to keep
    """
    module_dir = os.path.join(log_dir, module_name)
    os.makedirs(module_dir, exist_ok=True)

    log_file = os.path.join(module_dir, f"{module_name}.log")

    logger = logging.getLogger(module_name)
    logger.setLevel(logging.DEBUG)  # or INFO/WARNING depending on module

    # File handler with rotation
    handler = logging.handlers.RotatingFileHandler(
        log_file, maxBytes=max_bytes, backupCount=backup_count
    )

    formatter = logging.Formatter(
        fmt="%(asctime)s [%(levelname)s] %(name)s: %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S"
    )
    handler.setFormatter(formatter)

    # Avoid duplicate handlers on repeated setup
    if not logger.handlers:
        logger.addHandler(handler)

    return logger

# Example usage for the camera process
if __name__ == "__main__":
    camera_logger = setup_logger("camera")
    camera_logger.info("Camera module started")
    camera_logger.warning("Low light detected")
    camera_logger.error("Camera stream lost!")

/opt/chassy/artifacts (Under Development)

In the future, Chassy will provide another mount option that allows large, immutable infrastructure artifacts, such as world files, mapping data, or other models, to be mounted separately from Chassy Index.

Example Dockerfile

The primary purpose of the Dockerfile is to set up your simulation environment. Below is an example Dockerfile that sets up a simulation environment for PX4 with Gazebo, using either CPU-only simulation or CPU and GPU.

Note that this is intended as a guideline; your Dockerfile does not need to follow these conventions.

We will walk through each part of the Dockerfile separately, and then view the resulting file at the end.

Base Image

The image starts from nvidia/opengl:1.2-glvnd-runtime-ubuntu22.04, which provides Ubuntu 22.04 with NVIDIA OpenGL runtime support for GPU acceleration.

FROM nvidia/opengl:1.2-glvnd-runtime-ubuntu22.04

Environment Setup

Sets DEBIAN_FRONTEND=noninteractive to suppress interactive prompts during package installation.

ENV DEBIAN_FRONTEND=noninteractive

Install Dependencies

Updates package lists and installs a large set of required tools and libraries:

  • Build tools: curl, wget, git, cmake, build-essential, pkg-config

  • Python: python3, python3-pip, python3-dev

  • Libraries: libxml2-dev, libxslt-dev, libeigen3-dev, libopencv-dev, libgoogle-glog-dev, protobuf-compiler

  • GStreamer stack: gstreamer1.0-plugins-bad, gstreamer1.0-libav, gstreamer1.0-gl, libgstreamer-plugins-base1.0-dev

  • Other tools: xvfb (virtual framebuffer for headless rendering), mesa-utils, geographiclib-tools, software-properties-common, lsb-release, gnupg2, sudo, libimage-exiftool-perl

  • NVIDIA runtime libraries: nvidia-utils-470, libnvidia-gl-470

Cleans up apt cache to reduce image size.

RUN apt-get update && apt-get install -y \
    curl \
    wget \
    git \
    cmake \
    build-essential \
    python3 \
    python3-pip \
    python3-dev \
    pkg-config \
    libxml2-dev \
    libxslt-dev \
    libeigen3-dev \
    gstreamer1.0-plugins-bad \
    gstreamer1.0-libav \
    gstreamer1.0-gl \
    libgstreamer-plugins-base1.0-dev \
    libimage-exiftool-perl \
    geographiclib-tools \
    libopencv-dev \
    libgoogle-glog-dev \
    protobuf-compiler \
    xvfb \
    mesa-utils \
    software-properties-common \
    lsb-release \
    gnupg2 \
    sudo \
    nvidia-utils-470 \
    libnvidia-gl-470 \
    && rm -rf /var/lib/apt/lists/*

Gazebo Installation

  • Adds OSRF Gazebo repository and GPG key.

  • Installs gz-harmonic (Gazebo Harmonic simulator).

  • Cleans apt cache.

RUN wget https://packages.osrfoundation.org/gazebo.gpg -O /usr/share/keyrings/pkgs-osrf-archive-keyring.gpg \
    && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/pkgs-osrf-archive-keyring.gpg] http://packages.osrfoundation.org/gazebo/ubuntu-stable $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/gazebo-stable.list > /dev/null \
    && apt-get update \
    && apt-get install -y gz-harmonic \
    && rm -rf /var/lib/apt/lists/*

Simulation Environment Variables

Configures headless mode:

  • HEADLESS=1

  • DISPLAY=:99 for X virtual framebuffer

  • NVIDIA-specific env vars for GPU usage (NVIDIA_VISIBLE_DEVICES, NVIDIA_DRIVER_CAPABILITIES, __GLX_VENDOR_LIBRARY_NAME)

  • Forces Gazebo to use Ogre2 rendering engine.

ENV HEADLESS=1
ENV DISPLAY=${DISPLAY:-:99}
ENV NVIDIA_VISIBLE_DEVICES=all
ENV NVIDIA_DRIVER_CAPABILITIES=graphics,utility,compute
ENV __GLX_VENDOR_LIBRARY_NAME=nvidia
ENV GZ_SIM_RENDER_ENGINE=ogre2

PX4 Workspace

  • Creates working directory /opt/px4_ws.

  • Clones PX4-Autopilot repository (recursive clone to include submodules).

  • Runs PX4 setup script (./Tools/setup/ubuntu.sh --no-nuttx) to install PX4 dependencies, skipping NuttX (firmware build system).

Cleans apt cache again.

WORKDIR /opt/px4_ws

RUN git clone --recursive https://github.com/PX4/PX4-Autopilot.git /opt/px4_autopilot

RUN cd /opt/px4_autopilot \
    && bash ./Tools/setup/ubuntu.sh --no-nuttx \
    && rm -rf /var/lib/apt/lists/*

Python Dependencies for Testing & Simulation

Installs Python packages:

  • pytest, pytest-xvfb (testing)

  • psutil, pexpect (process utilities)

  • mavsdk, pymavlink, pytest-asyncio (MAVLink-based drone communication)

  • mss (screenshot utility)

RUN pip3 install --no-cache-dir \
    pytest \
    pytest-xvfb \
    psutil \
    pexpect \
    mavsdk \
    pytest-asyncio \
    pymavlink \
    mss

PX4 Build

Builds PX4 in SITL mode (make px4_sitl_default).

RUN cd /opt/px4_autopilot \
    && make px4_sitl_default

Set PX4 Environment Variables

  • PX4_HOME=/opt/px4_autopilot

  • Adds PX4 tools to PATH

ENV PX4_HOME=/opt/px4_autopilot
ENV PATH="${PX4_HOME}/Tools:${PATH}"

Simulation Entrypoint Setup

  • Copies a custom simulate script into /opt/chassy/simulate.

  • Makes it executable.

  • Sets this script as the Docker entrypoint, so containers run the simulation directly.

COPY simulate /opt/chassy/simulate
RUN chmod +x /opt/chassy/simulate

ENTRYPOINT ["/opt/chassy/simulate"]
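What simulate does is entirely up to you. A hypothetical minimal version for this image might start Xvfb on DISPLAY=:99, run the test suite, and let pytest write its JUnit XML into the reports mount; the test path, report name, and screen geometry below are illustrative assumptions:

```python
#!/usr/bin/env python3
"""Hypothetical minimal /opt/chassy/simulate entrypoint (paths are illustrative)."""
import subprocess
import sys

def build_test_cmd():
    # pytest writes its JUnit XML straight into the reports mount
    return ["pytest", "/opt/chassy/src/tests",
            "--junitxml=/opt/chassy/reports/test-results-sitl.xml"]

def main():
    # Virtual framebuffer backing ENV DISPLAY=:99; geometry is arbitrary
    xvfb = subprocess.Popen(["Xvfb", ":99", "-screen", "0", "1280x720x24"])
    try:
        return subprocess.run(build_test_cmd()).returncode
    finally:
        xvfb.terminate()

# As the container entrypoint, this script would end with: sys.exit(main())
```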

Full example

The full Dockerfile is included below:

# Base image: Ubuntu 22.04 with NVIDIA OpenGL runtime (for GPU-accelerated rendering)
FROM nvidia/opengl:1.2-glvnd-runtime-ubuntu22.04

# Prevent interactive prompts during apt installs
ENV DEBIAN_FRONTEND=noninteractive


RUN apt-get update && apt-get install -y \
    curl \
    wget \
    git \
    cmake \
    build-essential \
    python3 \
    python3-pip \
    python3-dev \
    pkg-config \
    libxml2-dev \
    libxslt-dev \
    libeigen3-dev \
    gstreamer1.0-plugins-bad \
    gstreamer1.0-libav \
    gstreamer1.0-gl \
    libgstreamer-plugins-base1.0-dev \
    libimage-exiftool-perl \
    geographiclib-tools \
    libopencv-dev \
    libgoogle-glog-dev \
    protobuf-compiler \
    xvfb \
    mesa-utils \
    software-properties-common \
    lsb-release \
    gnupg2 \
    sudo \
    nvidia-utils-470 \
    libnvidia-gl-470 \
    && rm -rf /var/lib/apt/lists/*

# Install Gazebo Harmonic simulator from OSRF packages
RUN wget https://packages.osrfoundation.org/gazebo.gpg -O /usr/share/keyrings/pkgs-osrf-archive-keyring.gpg \
    && echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/pkgs-osrf-archive-keyring.gpg] http://packages.osrfoundation.org/gazebo/ubuntu-stable $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/gazebo-stable.list > /dev/null \
    && apt-get update \
    && apt-get install -y gz-harmonic \
    && rm -rf /var/lib/apt/lists/*


ENV HEADLESS=1
ENV DISPLAY=${DISPLAY:-:99}
ENV NVIDIA_VISIBLE_DEVICES=all
ENV NVIDIA_DRIVER_CAPABILITIES=graphics,utility,compute
ENV __GLX_VENDOR_LIBRARY_NAME=nvidia
ENV GZ_SIM_RENDER_ENGINE=ogre2

# Workspace directory
WORKDIR /opt/px4_ws

# Clone PX4-Autopilot (with submodules)
RUN git clone --recursive https://github.com/PX4/PX4-Autopilot.git /opt/px4_autopilot

# Run PX4 setup script (skip NuttX since we only need SITL)
RUN cd /opt/px4_autopilot \
    && bash ./Tools/setup/ubuntu.sh --no-nuttx \
    && rm -rf /var/lib/apt/lists/*


RUN pip3 install --no-cache-dir \
    pytest \
    pytest-xvfb \
    psutil \
    pexpect \
    mavsdk \
    pytest-asyncio \
    pymavlink \
    mss

# Build PX4 in Software-In-The-Loop (SITL) mode
RUN cd /opt/px4_autopilot \
    && make px4_sitl_default

# Environment variables for PX4
ENV PX4_HOME=/opt/px4_autopilot
ENV PATH="${PX4_HOME}/Tools:${PATH}"

# Copy custom simulation entrypoint script
COPY simulate /opt/chassy/simulate
RUN chmod +x /opt/chassy/simulate

# Entrypoint: runs the simulation inside Chassy
ENTRYPOINT ["/opt/chassy/simulate"]
