1. Introduction – Why Automate a Benchmark?
Imagine you are a performance‑engineer, a game‑modder, or a data‑scientist who wants to know exactly how “Shadow of the Tomb Raider” behaves on a Linux workstation.
You could launch the game, open the in‑game benchmark, click “Run”, scribble numbers on a piece of paper, and pray you didn’t forget the settings you used.
That approach works, but it is slow, error‑prone, and impossible to repeat at scale.
What you really want is a repeatable, scriptable pipeline that:
- Selects a preset (resolution, quality, ray‑tracing, DLSS/FSR, frame generation…)
- Copies the appropriate profile file (native or Proton) into the game’s configuration directory.
- Launches the game (either the native Linux executable or the Proton‑wrapped Windows executable).
- Waits for the benchmark to finish and extracts the JSON result that the game writes.
- Renames / archives the JSON with a unique, machine‑readable name.
- Repeats for every resolution/quality you care about.
All of that is already baked into a handful of small Bash/INI/JSON files that ship alongside this manual in the `benchmark/` repository.
This manual will walk you through every piece, explain the philosophy behind it, and show you exactly how to run the automation – both natively (Linux binary) and through Proton (Wine/DXVK wrapper).
2. High‑Level Architecture
+------------------------+       +--------------------------+       +--------------------------+
| run_sottr_benchmark.sh | --->  | groups.{native,proton}.  | --->  | tests.{native,proton}.   |
| (master script)        |       | conf.sh                  |       | conf.sh                  |
+------------------------+       +--------------------------+       +--------------------------+
             |                                |                                  |
             | reads TEST_GROUPS & TESTS      | loads group definitions          | loads test definitions
             v                                v                                  v
+------------------------+       +--------------------------+       +--------------------------+
| profiles/              |       | results/                 |       | logs/                    |
| native-*.xml           |       | *.json                   |       | *.log                    |
| proton-*.reg           |       | benchmark output         |       | execution traces         |
+------------------------+       +--------------------------+       +--------------------------+
- `run_sottr_benchmark.sh` – the driver script that orchestrates everything.
- `groups.*.conf.sh` – maps group names (e.g., “native‑quick”) to a space‑separated list of individual test identifiers.
- `tests.*.conf.sh` – maps a test identifier (e.g., `native-1080p-high-rt-off`) to a line of parameters that the benchmark UI expects.
- Profile files – two families:
  - Native profiles: `native-<resolution>-<quality>-rt-off.preferences.xml`
  - Proton profiles: `proton-<...>.preferences.user.reg` (a Windows registry‑style file)
The driver script decides which set of files to use based on the launch mode (native vs proton) and then copies the correct profile into the game’s configuration directory before launching the benchmark.
3. Prerequisites – Getting the Environment Ready
3.1. Ubuntu (or any Debian‑based distro) + Steam
# Install Ubuntu (or a flavour) and make sure you have a working internet connection.
# Update everything first:
sudo apt update && sudo apt full-upgrade -y
# Install the Steam client (the official Ubuntu package pulls the latest stable version):
sudo apt install steam -y
# Launch Steam once, log in, enable “Steam Play” for all titles:
# Steam → Settings → Steam Play → Enable Steam Play for supported titles
# Tick “Enable Steam Play for all other titles” and select a Proton version (e.g., Proton 8.0).
3.2. Install Shadow of the Tomb Raider
- In Steam, search for “Shadow of the Tomb Raider”.
- Click Install. Steam will automatically download the native Linux build if your hardware is supported; otherwise it will download the Windows build and use Proton.
- Wait for the download to finish.
Note – The benchmark automation works for both the native Linux build and the Windows build run under Proton.
The script will automatically fall back to Proton if the native binary cannot be found, but you can also force a mode (see Section 5).
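That fallback is essentially a file‑existence check. The sketch below is hypothetical (the real script’s detection may differ, and the binary path `bin/ShadowOfTheTombRaider` is an assumption), but it captures the idea:

```shell
# Hypothetical sketch of the fallback: prefer the native binary when it
# exists and is executable, otherwise fall back to Proton.
detect_mode() {
    local game_root="$1"
    if [ -x "$game_root/bin/ShadowOfTheTombRaider" ]; then
        echo native
    else
        echo proton
    fi
}
```

The driver would then use the detected mode unless you override it with `--mode=…`.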
3.3. Optional – Install Helper Tools
- `jq` – for parsing JSON result files.
- `timeout` (part of coreutils) – used by the script to kill a hung benchmark.
- `curl` – used by the script to fetch GPU/CPU info if you want to augment the result file.
sudo apt install jq curl -y
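Before a long batch run it is worth verifying that these helpers are actually present. A small hypothetical helper (not part of the repo) that fails early instead of mid‑benchmark:

```shell
# Report every missing tool; return non-zero if any is absent.
check_tools() {
    local tool rc=0
    for tool in "$@"; do
        if ! command -v "$tool" >/dev/null 2>&1; then
            echo "missing: $tool"
            rc=1
        fi
    done
    return "$rc"
}

# Example: check_tools jq timeout curl || echo "install the tools above first"
```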
4. Understanding the Building Blocks
4.1. Test Definition Files
| File | Purpose | Key Variables |
|---|---|---|
| `tests.native.conf.sh` | Defines native‑only test identifiers and their parameters. | `declare -A TESTS` – e.g. `TESTS["native-1080p-high-rt-off"]=` |
| `tests.proton.conf.sh` | Defines Proton test identifiers (DirectX 12 on/off, DLSS Ultra‑Performance, etc.). | `declare -A TESTS` – e.g. `TESTS["proton-dx12-on-dlss-ultra-performance-4k-high"]=` |
| `groups.native.conf.sh` | Bundles native test identifiers into groups for easier batch runs. | `declare -A TEST_GROUPS` – e.g. `TEST_GROUPS["native-quick"]=` |
| `groups.proton.conf.sh` | Same idea, but for Proton tests; dynamically builds groups for each resolution/quality combo. | Populated by the `build_proton_ultra_perf_groups` function. |
How the driver uses them
When run_sottr_benchmark.sh starts, it sources the appropriate files:
source "$SCRIPT_DIR/tests.native.conf.sh"
source "$SCRIPT_DIR/groups.native.conf.sh"
If you launch in Proton mode (--mode=proton), the driver sources tests.proton.conf.sh and groups.proton.conf.sh instead.
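To make the shape of these files concrete, here is an illustrative excerpt in the same style (the identifiers match the examples used throughout this manual; the exact values in the real files may differ):

```shell
# Illustrative excerpt of tests.native.conf.sh / groups.native.conf.sh.
# Each TESTS value is "<mode> <resolution> <quality> <rt> <fg>" – exactly
# the word order that run_single_test later splits with `read -r`.
declare -A TESTS
TESTS["native-1080p-high-rt-off"]="native 1920x1080 high off off"
TESTS["native-1440p-high-rt-off"]="native 2560x1440 high off off"
TESTS["native-4k-high-rt-off"]="native 3840x2160 high off off"

declare -A TEST_GROUPS
TEST_GROUPS["native-quick"]="native-1080p-high-rt-off native-1440p-high-rt-off native-4k-high-rt-off"
```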
4.2. Profile Files – The “What‑to‑run” Blueprint
The game reads a preferences file at startup that tells it:
- Resolution, fullscreen/windowed mode
- Graphic quality preset (low/medium/high/ultra)
- Whether ray‑tracing, DLSS, FSR, etc. are enabled
- Whether frame‑generation is on
The script detects which profile to copy based on the test identifier prefix:
if [[ $test_name == native-* ]]; then
    PROFILE_SRC="$PROFILES_DIR/$test_name.preferences.xml"
    cp "$PROFILE_SRC" "$NATIVE_PREF_PATH"
else
    PROFILE_SRC="$PROFILES_DIR/$test_name.preferences.user.reg"
    cp "$PROFILE_SRC" "$PROTON_PREF_PATH"
fi
Tip – If you ever add a new resolution or a new upscaling mode, just drop a new profile file that follows the naming convention and update the `tests.*.conf.sh` accordingly. The driver will pick it up automatically.
4.3. Result JSON Files
When the benchmark finishes, Shadow of the Tomb Raider writes a JSON file in the user’s “Saves” directory:
<game_save_path>/Benchmark/BenchmarkResult_<timestamp>.json
The driver script moves this file to the central results/ directory and renames it to include:
- Game name (always “Shadow of the Tomb Raider”)
- GPU model
- Resolution & quality
- Whether ray‑tracing was on
- Whether we used Proton or native
- A timestamp and a unique hash to avoid collisions
Example final name (generated in run_sottr_benchmark.sh):
ShadowOfTheTombRaider_4k_high_ultra_perf_proton_2024-09-13T15-32-07_8151MB_RTX5060.json
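The renaming boils down to string assembly. A reduced, hypothetical sketch of the idea (`build_result_name` is not the real function name, and the field order here is illustrative):

```shell
# Build a collision-resistant result name from the run's metadata.
build_result_name() {
    local game="$1" resolution="$2" quality="$3" mode="$4" gpu="$5"
    local ts
    ts=$(date +"%Y-%m-%dT%H-%M-%S")
    # Spaces are replaced so the name stays shell- and URL-safe.
    echo "${game// /_}_${resolution}_${quality}_${mode}_${ts}_${gpu// /_}.json"
}
```

Because the timestamp is part of the name, two runs of the same test never collide.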
5. Preparing the Scripts – The “Wizard’s Toolkit”
All the automation lives inside the benchmark/ directory (or wherever you cloned the repository).
The entry point is run_sottr_benchmark.sh. Let’s dissect it.
5.1. Core Variables (top of the script)
# Game and Steam IDs
GAME_ID=750920 # Steam AppID for Shadow of the Tomb Raider
GAME_NAME="Shadow of the Tomb Raider"
# Paths – adjust if you installed Steam in a custom location
STEAM_ROOT="${HOME}/.local/share/Steam"
GAME_ROOT="${STEAM_ROOT}/steamapps/common/Shadow of the Tomb Raider"
# Where the driver looks for definition files
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
Most of these variables are hard‑coded to match a typical Ubuntu + Steam install. If you installed Steam somewhere else (e.g., `/opt/steam`), just change `STEAM_ROOT`.
5.2. Loading Configurations
# Load the appropriate test & group definitions
if [[ "$LAUNCH_MODE" == "native" ]]; then
    source "${SCRIPT_DIR}/tests.native.conf.sh"
    source "${SCRIPT_DIR}/groups.native.conf.sh"
else
    source "${SCRIPT_DIR}/tests.proton.conf.sh"
    source "${SCRIPT_DIR}/groups.proton.conf.sh"
fi
The script branches early: if we are in native mode we load only the native definitions; otherwise we load the Proton‑specific ones. This keeps the TESTS associative array small and avoids confusion.
5.3. The Main Loop – “Run every test in a group”
run_group() {
    local group_name="$1"
    local test_list="${TEST_GROUPS[$group_name]}"
    local test

    echo "=== Running group: $group_name ==="
    for test in $test_list; do
        run_single_test "$test"
    done
}
`run_single_test` is the heart of the driver; for each test it:

1. Looks up the parameters (`mode resolution quality rt fg`) from `TESTS`.
2. Selects the right profile file (`native-*.preferences.xml` or `proton-*.preferences.user.reg`).
3. Copies the profile into the game’s config location.
4. Launches the game (either the native binary or the `proton` wrapper).
5. Waits for the benchmark to finish (uses `timeout` to avoid infinite hangs).
6. Collects the JSON result and renames it.
Below is a condensed version of that function, heavily annotated:
run_single_test() {
    local test_id="$1"
    local params="${TESTS[$test_id]}"   # e.g. "native 1920x1080 high off off"
    read -r mode resolution quality rt fg <<<"$params"

    echo ">>> Running $test_id: mode=$mode, res=$resolution, quality=$quality, RT=$rt, FG=$fg"

    # -------------------------------------------------
    # 1️⃣ Choose the profile file
    # -------------------------------------------------
    if [[ $test_id == native-* ]]; then
        PROFILE_SRC="${SCRIPT_DIR}/profiles/${test_id}.preferences.xml"
        PROFILE_DST="${HOME}/.local/share/ShadowOfTheTombRaider/preferences.xml"
    else
        PROFILE_SRC="${SCRIPT_DIR}/profiles/${test_id}.preferences.user.reg"
        PROFILE_DST="${HOME}/.local/share/Steam/steamapps/compatdata/${GAME_ID}/pfx/user.reg"
    fi

    if [[ ! -f $PROFILE_SRC ]]; then
        echo "❌ Profile $PROFILE_SRC missing! Skipping test."
        return 1
    fi

    # -------------------------------------------------
    # 2️⃣ Install the profile (overwrite previous)
    # -------------------------------------------------
    cp "$PROFILE_SRC" "$PROFILE_DST"
    echo "🔧 Copied profile to $PROFILE_DST"

    # -------------------------------------------------
    # 3️⃣ Build the launch command
    # -------------------------------------------------
    if [[ $mode == "native" ]]; then
        # Native binary – lives inside the Steam game folder
        local exe_path="${GAME_ROOT}/bin/ShadowOfTheTombRaider"
        LAUNCH_CMD=("$exe_path" "-benchmark")
    else
        # Proton – run the Windows executable through the Proton wrapper
        local proton_path="${STEAM_ROOT}/steamapps/common/Proton ${PROTON_VERSION}/proton"
        LAUNCH_CMD=("$proton_path" run "${GAME_ROOT}/bin/ShadowOfTheTombRaider" "-benchmark")
    fi

    # -------------------------------------------------
    # 4️⃣ Run the benchmark with a timeout
    # -------------------------------------------------
    local timeout_secs=1800   # 30 min – enough for the longest 4K run
    echo "🚀 Launching: ${LAUNCH_CMD[*]}"
    timeout "$timeout_secs" "${LAUNCH_CMD[@]}" &
    local bench_pid=$!

    # Show a friendly progress message while we wait
    while kill -0 "$bench_pid" 2>/dev/null; do
        printf '\r⏳ Benchmark running... PID %s' "$bench_pid"
        sleep 5
    done
    echo -e "\n✅ Benchmark finished (PID $bench_pid)."

    # -------------------------------------------------
    # 5️⃣ Grab the JSON result and rename it
    # -------------------------------------------------
    local result_dir="${HOME}/.local/share/ShadowOfTheTombRaider/Benchmark"
    local json_file
    json_file=$(ls -t "$result_dir"/*.json 2>/dev/null | head -n1)   # newest file, empty if none
    if [[ -z $json_file ]]; then
        echo "❗ No result JSON found – something went wrong."
        return 1
    fi

    # Build a safe, descriptive filename
    local gpu_name
    gpu_name=$(jq -r '.GPU' "$json_file" | tr ' ' '_' | tr -cd '[:alnum:]_-')
    local timestamp
    timestamp=$(date +"%Y-%m-%dT%H-%M-%S")
    local out_name="${GAME_NAME// /_}_${resolution}_${quality}_${mode}_${timestamp}_${gpu_name}.json"
    mv "$json_file" "${SCRIPT_DIR}/results/$out_name"
    echo "📁 Result saved as results/$out_name"
}
The script is written as a wizard’s spellbook: each test is a spell, each profile is a scroll, and the driver is the archmage that copies the scroll into the right place, chants the incantation (`timeout …`) and then collects the magical artifact (the JSON).
6. Running the Automation
6.1. Quick “Hello‑World” – Run the native quick group
cd ~/benchmark # or wherever you cloned the repo
chmod +x run_sottr_benchmark.sh
# Native mode, quick group (1080p, 1440p, 4K – high preset, no ray‑tracing)
./run_sottr_benchmark.sh --mode=native --group=native-quick
What happens?
1. The script loads `tests.native.conf.sh` and `groups.native.conf.sh`.
2. It resolves `native-quick` into three test IDs: `native-1080p-high-rt-off`, `native-1440p-high-rt-off`, `native-4k-high-rt-off`.
3. For each, it copies the corresponding `native-*-high-rt-off.preferences.xml` into the user’s config folder.
4. It launches the native Linux binary (`ShadowOfTheTombRaider`) with the `-benchmark` flag.
5. After each benchmark finishes, the resulting JSON lands in `results/` with a name like:
Shadow_of_the_Tomb_Raider_3840x2160_high_native_2024-09-13T15-32-07_GeForce_RTX_5060_8151MB.json
6.2. Proton Mode – Run the ultra‑performance group
# Example: run the “proton‑quick” group (DX12 on/off, DLSS Ultra‑Performance, high quality)
./run_sottr_benchmark.sh --mode=proton --group=proton-quick
Explanation of the flow:
- Loading Proton definitions: `tests.proton.conf.sh` creates entries like `proton-dx12-on-dlss-ultra-performance-4k-high`.
- Loading Proton groups: `groups.proton.conf.sh` builds dynamic groups; `proton-quick` contains the on/off pairs for each resolution.
- Profile selection: the script now uses the registry file (`*.preferences.user.reg`). These files contain lines like:
[Software\\ShadowOfTheTombRaider]
Resolution=3840x2160
Quality=high
DX12=1 ; 1 = on, 0 = off
DLSS=1 ; DLSS Ultra‑Performance enabled
- Proton wrapper: The script finds the Proton version you have installed (by default it uses the version set in Steam → Settings → Steam Play).
The command becomes something like:
~/.local/share/Steam/steamapps/common/Proton\ 8.0/proton run \
    ~/.local/share/Steam/steamapps/common/Shadow\ of\ the\ Tomb\ Raider/ShadowOfTheTombRaider -benchmark
- DXVK & Vulkan: Under Proton the game’s Direct3D calls are translated to Vulkan – D3D11 via DXVK and D3D12 via vkd3d-proton, both shipped with Proton – so rendering stays hardware‑accelerated.
- Result file: The JSON is produced by the Windows version of the game; the script still moves it to `results/` and adds the suffix `proton` to the filename so you can later filter by mode.
6.3. Full‑Feature Command‑Line Interface
The driver script understands the following options (all parsed with getopts in the real file):
| Option | Description | Example |
|---|---|---|
| `--mode=native\|proton` | Force the launch mode. If omitted, the script auto‑detects the native binary. | `--mode=proton` |
| `--group=GROUP_NAME` | Name of the group you want to run (must exist in `TEST_GROUPS`). | `--group=native-4k` |
| `--list-groups` | Print all available groups (both native and proton). | `./run_sottr_benchmark.sh --list-groups` |
| `--list-tests` | Print the raw test identifiers for a given group. | `./run_sottr_benchmark.sh --mode=native --group=native-4k --list-tests` |
| `--dry-run` | Show what would be done (profile copy, command line) without actually launching the game. Useful for debugging. | `./run_sottr_benchmark.sh --mode=proton --group=proton-4k --dry-run` |
| `--timeout=SECONDS` | Override the default 1800 s timeout for each benchmark. | `--timeout=3600` |
| `-h` / `--help` | Print the help screen. | `./run_sottr_benchmark.sh -h` |
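The real script parses these with `getopts` plus some manual handling. A minimal, hypothetical sketch of equivalent `--key=value` parsing (details of the real parser may differ):

```shell
# Minimal long-option parser matching the flags in the table above.
# An empty MODE means "auto-detect" (handled elsewhere in the driver).
parse_args() {
    MODE="" GROUP="" TIMEOUT=1800 DRY_RUN=0
    local arg
    for arg in "$@"; do
        case "$arg" in
            --mode=*)    MODE="${arg#--mode=}" ;;
            --group=*)   GROUP="${arg#--group=}" ;;
            --timeout=*) TIMEOUT="${arg#--timeout=}" ;;
            --dry-run)   DRY_RUN=1 ;;
            *) echo "unknown option: $arg" >&2; return 1 ;;
        esac
    done
}
```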
Example: List everything
./run_sottr_benchmark.sh --list-groups
Sample output (truncated):
Available groups (native):
native-quick → native-1080p-high-rt-off native-1440p-high-rt-off native-4k-high-rt-off
native-4k-ultra → native-4k-ultra-rt-on native-4k-ultra-rt-off
...
Available groups (proton):
proton-quick → proton-dx12-on-dlss-ultra-performance-1080p-high proton-dx12-off-dlss-ultra-performance-1080p-high …
proton-4k-ultra → proton-dx12-on-dlss-ultra-performance-4k-high proton-dx12-off-dlss-ultra-performance-4k-high
...
7. Digging Into the Results – Turning Numbers Into Insight
The JSON files contain a wealth of data:
{
  "GPU": "GeForce RTX 5060 (8151 MB)",
  "CPU": "AMD Ryzen 7 5800X",
  "Resolution": "3840x2160",
  "Quality": "High",
  "Mode": "native",
  "RT": false,
  "DLSS": "Ultra Performance",
  "FPS": 75.4,
  "FrameTimes": [...],
  "BenchmarkVersion": "1.3"
}
7.1. Quick Summary with jq
# Show a table with resolution, quality, mode and FPS
for f in results/*.json; do
    echo "$(basename "$f") → $(jq -r '"\(.Resolution) \(.Quality) \(.Mode) FPS:\(.FPS)"' "$f")"
done
7.2. Plotting Over Time (optional)
If you have Python or gnuplot you can feed the JSONs into a simple script:
# plot_fps.py
import json, glob, matplotlib.pyplot as plt

data = []
for path in glob.glob('results/*.json'):
    with open(path) as f:
        j = json.load(f)
    label = f"{j['Resolution']}_{j['Quality']}_{j['Mode']}"
    data.append((label, j['FPS']))

data.sort()
labels, fps = zip(*data)
plt.figure(figsize=(10, 6))
plt.bar(labels, fps, color='steelblue')
plt.xticks(rotation=45, ha='right')
plt.ylabel('Average FPS')
plt.title('Shadow of the Tomb Raider Benchmark Results')
plt.tight_layout()
plt.show()
Run it:
python3 plot_fps.py
You will get a nice bar chart that instantly tells you how the game scales from 1080p to 4K, with and without DLSS, and with native vs Proton.
7.3. Comparing Native vs Proton
Because the filenames embed the launch mode, you can grep them:
# All 4K runs, native
ls results/*4k*native* | wc -l
# All 4K runs, Proton
ls results/*4k*proton* | wc -l
Or use jq to extract the GPU and see whether Proton introduced any driver‑level differences (DXVK may expose a slightly different Vulkan version).
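To go beyond counting files, you can aggregate FPS per launch mode directly in the shell. A small hypothetical helper, assuming each result JSON carries `Mode` and `FPS` fields as in the sample shown earlier:

```shell
# Average FPS per launch mode across a set of result files.
# Assumes each JSON has top-level .Mode and .FPS fields.
avg_fps_by_mode() {
    jq -r 'select(.FPS != null) | "\(.Mode) \(.FPS)"' "$@" |
        awk '{ sum[$1] += $2; n[$1]++ }
             END { for (m in sum) printf "%s %.1f\n", m, sum[m] / n[m] }'
}

# Example: avg_fps_by_mode results/*.json
```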
8. Extending the Toolkit – Adding New Resolutions or Upscaling Modes
The automation is deliberately data‑driven. To add a new test, you only need to:

1. Create a profile file that matches the naming scheme.
   - For native: `native-2160p-ultra-rt-on.preferences.xml`
   - For Proton: `proton-dx12-on-dlss-ultra-performance-2160p-ultra.preferences.user.reg`
2. Add an entry to the appropriate `tests.*.conf.sh` (or let the existing function generate it automatically):

# Example – add a 5K Ultra‑Performance entry for Proton
TESTS["proton-dx12-on-dlss-ultra-performance-5k-ultra"]="dlss-ultra-performance 5120x2880 ultra off off on"

3. Optionally add a new group in the group file:

# In groups.proton.conf.sh
TEST_GROUPS["proton-5k-ultra"]="proton-dx12-on-dlss-ultra-performance-5k-ultra"

4. Run the new group as before.
Pro tip – Keep your profile files under a subdirectory like `profiles/` so they don’t clutter the root of the repo. The driver script already assumes `$SCRIPT_DIR/profiles/` when looking for a profile.
9. Troubleshooting – The Dark Arts
| Symptom | Likely Cause | Fix |
|---|---|---|
| `❌ Profile … missing!` | The test identifier does not have a matching profile file on disk. | Verify the filename pattern, double‑check the `tests.*.conf.sh` entry, and make sure the profile exists in `profiles/`. |
| `timeout: the monitored command dumped core` | The benchmark ran longer than the timeout (default 30 min). | Increase `--timeout` on the command line. |
| `No result JSON found` | The benchmark crashed before writing a file, or the path to the benchmark folder is wrong. | Run the game manually once with the `-benchmark` flag to see where the game stores the JSON (`~/.local/share/ShadowOfTheTombRaider/Benchmark`). Update `RESULT_DIR` accordingly. |
| GPU name in final filename looks garbled | `jq` could not parse the GPU field, or the field contains spaces/special characters. | Edit the rename logic in `run_single_test` to use a sanitized version (`tr -cd '[:alnum:]_-'`), or add a fallback that derives the GPU name from `lspci`. |
| Game launches but benchmark UI never appears | Wrong command‑line flag; both the native binary and the Windows build under Proton expect `-benchmark`. | Verify the `LAUNCH_CMD` array. For Proton you may need to add `-dx12` or `-dxvk` flags depending on the Proton version you are using. |
| Benchmark runs but JSON never appears (empty file) | The profile file was not correctly copied (permissions problem). | Ensure the destination folder is writable (`chmod -R u+rw ~/.local/share/ShadowOfTheTombRaider`). |
10. Full Example – End‑to‑End Run from Scratch
Below is a complete session you could copy‑paste into a terminal to see everything happen.
# 1️⃣ Clone the repo (or copy the files into a folder)
git clone https://github.com/yourname/sottr-benchmark.git ~/sottr-bench
cd ~/sottr-bench
# 2️⃣ Make the driver executable
chmod +x run_sottr_benchmark.sh
# 3️⃣ Clean the results folder (optional)
rm -rf results/*
mkdir -p results
# 4️⃣ Run *both* native and proton quick groups in one go
./run_sottr_benchmark.sh --mode=native --group=native-quick
./run_sottr_benchmark.sh --mode=proton --group=proton-quick
# 5️⃣ Summarise everything
echo "=== Summary of all runs ==="
for f in results/*.json; do
    echo "$(basename "$f")"
    jq -r '"Resolution: \(.Resolution) | Quality: \(.Quality) | FPS: \(.FPS) | GPU: \(.GPU)"' "$f"
done
You will see the driver printing friendly emojis (the script already contains them) and, after the run finishes, you’ll have six JSON files in results/, three native and three Proton.
You can now feed those JSON files into any analytics pipeline you like – Excel, Python Pandas, R, etc.
11. Advanced Topics
11.1. Running Benchmarks Headless (No UI)
The driver script uses the -benchmark argument, which automatically opens the benchmark UI, runs the test, and writes the JSON without requiring user interaction.
If you want zero‑window execution (e.g., on a remote server), you can add the -noscreen flag (the game accepts it) and pipe the output to a virtual X server (xvfb-run).
# Example for native mode on a headless server
xvfb-run -a ./run_sottr_benchmark.sh --mode=native --group=native-4k
11.2. Parallelising Different Resolutions
Because each benchmark writes a single JSON file in the same directory, you cannot run two benchmarks at the same time on the same user profile – they would overwrite each other’s results.
If you have multiple user accounts on the same machine, you can launch a separate instance of the driver under each account, each with its own $HOME path, and then merge the results later.
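If you instead script multiple invocations on a single account (e.g. from cron), you can at least guard against accidental overlap with a lock. A hypothetical sketch using `flock` from util-linux (the lock path and helper name are illustrative, not part of the repo):

```shell
# Serialize benchmark runs: only one instance may hold the lock at a time.
run_locked() {
    local lock="/tmp/sottr-bench.lock"
    (
        flock -n 9 || { echo "another benchmark run is active" >&2; exit 1; }
        "$@"
    ) 9>"$lock"
}

# Example: run_locked ./run_sottr_benchmark.sh --mode=native --group=native-quick
```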
11.3. Automating GPU/CPU Detection
The script already pulls the GPU name from the JSON, but you may also want to embed the CPU model, driver version, or OS version. Add a small function near the top of `run_sottr_benchmark.sh` (note that `glxinfo` comes from the `mesa-utils` package):
collect_system_info() {
    local out_json="$1"
    local cpu driver
    cpu=$(lscpu | grep 'Model name' | cut -d: -f2 | xargs)
    driver=$(glxinfo | grep 'OpenGL version string' | awk '{print $4}')
    jq --arg cpu "$cpu" --arg driver "$driver" \
       '. + {CPU: $cpu, GL_Driver: $driver}' "$out_json" > "$out_json.tmp" && mv "$out_json.tmp" "$out_json"
}
Call it right after moving the JSON:
collect_system_info "${SCRIPT_DIR}/results/$out_name"
Now every result file contains CPU and OpenGL driver fields, making cross‑hardware comparisons easier.
12. Frequently Asked Questions (FAQ)
| Question | Answer |
|---|---|
| Do I need to close the game manually after each benchmark? | No. The script launches the game with the -benchmark flag; the game automatically exits when the benchmark ends. The driver uses timeout as a safety net. |
| Can I benchmark the VR mode? | The current automation only supports the built‑in non‑VR benchmark. Adding VR would require a different launch flag and a separate profile type. |
| My GPU isn’t recognized in the result JSON. | Some older GPUs may not expose the `GPU` field. You can manually add it by editing the `collect_system_info` function (see the Automating GPU/CPU Detection section). |
| Why does the Proton group contain “dx12‑on” and “dx12‑off”? | Those identifiers select the graphics API inside the Proton prefix. The `dx12‑on` profile enables the DirectX 12 path (translated to Vulkan by vkd3d-proton), while `dx12‑off` forces the game back to its DirectX 11 path (translated by DXVK), letting you compare performance between the two APIs. |
| Can I use a different version of Proton? | Yes. Set the environment variable `PROTON_VERSION` before invoking the script, e.g. `export PROTON_VERSION="8.0"` and then `./run_sottr_benchmark.sh --mode=proton --group=proton-4k-ultra`. The driver builds the path `${STEAM_ROOT}/steamapps/common/Proton ${PROTON_VERSION}/proton`. |
| Will the script interfere with my normal game saves? | No. The script only touches the preferences file (which stores graphics settings) and the benchmark result file. Your regular campaign saves remain untouched. |
13. Recap – What You’ve Gained
With this toolkit you can now:

- ✅ Run a full suite of resolutions and quality presets with a single command.
- ✅ Switch instantly between native Linux and Proton‑wrapped Windows builds.
- ✅ Collect reproducible JSON artefacts that contain every metric the game reports.
- ✅ Rename & archive those artefacts with self‑describing filenames, ready for batch analysis.
- ✅ Add new tests by simply dropping a profile file and updating the `tests.*.conf.sh`.
- ✅ Automate headless runs on CI servers or performance‑testing rigs.
All of this is achieved with ≈200 lines of Bash, a handful of definition files, and a tidy folder of profile files – exactly the sort of lightweight, version‑controlled toolset that fits nicely into any DevOps or data‑science workflow.
14. Final Words – The Wizard’s Parting Advice
“A spell is only as good as the scroll you feed it.”
In our case, the profile scroll (XML or REG) determines every pixel the engine will draw. By keeping those scrolls version‑controlled and naming them consistently, you guarantee that every future run will be identical to the one you performed yesterday, last month, or next year.
Take a moment to commit the entire benchmark/ folder to a Git repository:
git init
git add .
git commit -m "Initial benchmark automation for SOTTR"
Now you have a single source of truth for performance testing that you can clone on any machine, modify, and share with teammates.
Happy benchmarking! 🎮🚀📊