# Building SUMO on the HPC Cluster
This guide captures the steps I followed to get SUMO v1_24_0+1027 running on the Universidad de Sevilla HPC environment after the stock AppImage failed because of missing system libraries. It mirrors the repo documentation style so the process can be reproduced or adapted for future sessions.
## 🧰 Prerequisites
- Access to the HPC cluster: connect to the VPN first, then `ssh` into one of the front nodes (`ssh <uvus>@172.16.46.4` or `ssh <uvus>@172.16.46.6`).
- Optional: a prepared SUMO scenario archive, obtained either by cloning the CICA_test repo or by copying the files over with `scp` (see the example below).
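If the scenario archive lives on your local machine, a plain `scp` transfer to a front node is enough (a sketch; the archive name `map_v0.zip` and the home-directory target are placeholders borrowed from the validation step later in this guide):

```bash
# Copy the scenario archive from the local machine to the cluster home directory
scp map_v0.zip <uvus>@172.16.46.4:~/
```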
## 1. Request an Interactive Allocation
```bash
salloc -c 100 -t 30:00    # initial run
salloc -c 100 -t 300:00   # re-run with a longer time slice
```
The allocation lands on a compute node (c06, c07, …). All subsequent build commands must run inside that allocation.
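To confirm where the allocation landed before building, the standard SLURM and shell commands are enough (a sketch; `squeue` and `hostname` are assumed to be available, as on any SLURM cluster):

```bash
# List your current allocations and the node(s) they were granted on
squeue -u $USER
# Inside the allocation, verify the shell is on a compute node (c06, c07, ...)
hostname
```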
## 2. Load the Required Modules
The environment is empty by default. Load the toolchain before invoking CMake:
```bash
ml CMake                            # 3.27.6-GCCcore-13.2.0 on the cluster
ml Python/3.10.8-GCCcore-12.2.0
ml Xerces-C++/3.2.4-GCCcore-12.3.0
```
✅ Expect CMake to warn about optional components (FOX GUI, FreeType, GDAL, SWIG). These are not required for a headless build.
If the interactive shell is restarted (e.g. after a reconnection), repeat the `ml …` commands before building or running SUMO; a small sourceable helper like the one below keeps this repeatable.
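To avoid retyping the module loads after every reconnection, the three commands can be kept in a small script in the home directory (a sketch; the file name `~/sumo_modules.sh` is an arbitrary choice, not part of the cluster setup):

```bash
# ~/sumo_modules.sh -- reload the SUMO build/runtime toolchain in a fresh allocation
ml CMake                            # only needed when rebuilding
ml Python/3.10.8-GCCcore-12.2.0
ml Xerces-C++/3.2.4-GCCcore-12.3.0  # runtime dependency of the sumo binary
```

Source it with `source ~/sumo_modules.sh` whenever a new shell is opened inside an allocation.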
## 3. Compile SUMO from Source
```bash
# From the home directory inside the allocation
git clone https://github.com/eclipse/sumo.git
cd sumo
mkdir build && cd build
cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/sumo
make -j4
make install
```
The install step populates $HOME/sumo with binaries, libraries, tools, and Python packages (sumolib, traci, simpla). The CMake run emits several warnings about optional dependencies, but the build still succeeds for command-line usage.
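A quick smoke test right after the install confirms the binary runs with the loaded modules (a sketch; it only assumes the binaries landed in `$HOME/sumo/bin` as configured by the install prefix):

```bash
# Print the version banner of the freshly built binary
$HOME/sumo/bin/sumo --version
```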
## 4. Persist the Environment
Either run the `set_sumo_home.py` script from the CICA_test repo to set the SUMO home, or append the following exports to `~/.bashrc` and source the file inside each allocation:
```bash
echo 'export PATH=$HOME/sumo/bin:$PATH' >> ~/.bashrc
echo 'export SUMO_HOME=$HOME/sumo' >> ~/.bashrc
source ~/.bashrc
```
Because each SLURM job starts with a clean environment, re-run ml … and source ~/.bashrc after reconnecting or when a job times out.
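For non-interactive runs, the module loads, the environment exports, and the simulation itself can be bundled into a SLURM batch script (a sketch under the assumptions of this guide; the job name, time limit, and scenario path are placeholders):

```bash
#!/bin/bash
#SBATCH --job-name=sumo-run          # placeholder name
#SBATCH --cpus-per-task=100
#SBATCH --time=300:00

# Batch jobs start with a clean environment: reload the runtime modules
ml Python/3.10.8-GCCcore-12.2.0
ml Xerces-C++/3.2.4-GCCcore-12.3.0   # runtime dependency of the sumo binary
source ~/.bashrc                     # restores PATH and SUMO_HOME

# Run the scenario headlessly (directory and config file are placeholders)
cd $HOME/CICA_TEST/map_v0
sumo -c osm.sumocfg
```

Submit it with `sbatch <script>.sbatch`; the same reload-then-run pattern applies to any future scenario.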
## 5. Validate the Installation with a Sample Scenario
- Copy or unzip the scenario:

  ```bash
  unzip map_v0.zip -d CICA_TEST
  cd CICA_TEST/map_v0
  ```

- Run SUMO headlessly:

  ```bash
  sumo osm.sumocfg
  ```
During the long validation run, the simulator produced numerous warnings (unused traffic light states, emergency braking, teleports), but the binaries executed correctly. Launching the GUI (`sumo-gui`) is not possible because the FOX/X11 modules are not installed on the cluster.
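For longer runs it helps to write results to files and quiet the console (a sketch; `--tripinfo-output`, `--duration-log.statistics`, and `--no-warnings` are standard SUMO options, and the output file name is a placeholder):

```bash
# Write per-vehicle trip statistics to a file, log aggregate statistics,
# and suppress the per-event warnings on the console
sumo -c osm.sumocfg \
     --tripinfo-output tripinfo.xml \
     --duration-log.statistics \
     --no-warnings
```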
## 6. Troubleshooting Notes
- `GLIBCXX_3.4.32` not found – indicates the AppImage or binaries were built against a newer `libstdc++`; rebuild from source as shown above.
- `sumo` not found after reconnecting – ensure you sourced `~/.bashrc` and reloaded the modules in the current allocation.
- `SUMO_HOME` warnings / XML validation disabled – export `SUMO_HOME=$HOME/sumo` before running simulations.
- `make install` cleanup warnings – safe to ignore (`build/lib` missing) as long as the install step finishes.
- Job cancelled due to time limit – request a longer allocation (`salloc -t 300:00`) and resume where you left off.
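When the `GLIBCXX` error appears, it is worth checking which symbol versions the system `libstdc++` actually provides before rebuilding (a sketch; `/usr/lib64/libstdc++.so.6` is the usual location on 64-bit Linux and may differ on the cluster):

```bash
# List the GLIBCXX symbol versions exported by the system libstdc++
strings /usr/lib64/libstdc++.so.6 | grep '^GLIBCXX'
```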