This project provides tools to collect, analyze, and visualize data from iPerf3 network performance tests.
- config.R: Configuration settings
- data/: Raw and processed iPerf data
- R/: Functions for parsing, analyzing, and visualizing iPerf data
- scripts/: Main processing scripts
- output/: Generated results and visualizations
- data_collection/: Scripts for standardized data collection
Use the provided data collection script:
cd data_collection
chmod +x run_iperf_test.sh
./run_iperf_test.sh
Options:
- -s SERVER: Server to test (can specify multiple times)
- -d SECONDS: Test duration in seconds (default: 120)
- -u USER: Username to record (default: current user)
Test Setup:
- Use geographically diverse servers for comprehensive testing
- Ensure test duration is at least 60 seconds (120s recommended)
- Minimize other network activity during tests
- Run tests at different times of day for better data variety
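To cover different times of day automatically, the collection script can be scheduled with cron. A minimal crontab sketch; the install path and server name are assumptions for illustration, and the schedule (every six hours) is just one reasonable choice:

```shell
# Hypothetical crontab entry: run a 120 s test at 03:00, 09:00, 15:00, 21:00.
# Replace $HOME/iperf-project with wherever you cloned this repository.
0 3,9,15,21 * * * $HOME/iperf-project/data_collection/run_iperf_test.sh -s speedtest.serverius.net -d 120
```

Edit your crontab with `crontab -e`; cron runs with a minimal environment, so prefer absolute paths inside the script as well.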
Recommended Command Options:
# Basic test
./run_iperf_test.sh -s speedtest.serverius.net -d 120
# Multiple servers
./run_iperf_test.sh -s speedtest.serverius.net -s speedtest.london.linode.com -d 120

See data_collection/BEST_PRACTICES.md for more detailed guidance.
Copy the generated files to your input directory (set in config.R), then run:
source("scripts/process_all_data.R")

This will:
- Parse raw iPerf3 output files
- Clean and process the data
- Generate performance summaries
- Create visualizations
For customized analysis, you can also use individual scripts:
source("scripts/parse_raw_data.R") # Convert raw data to processed format
source("scripts/generate_reports.R") # Create summary reports
source("scripts/create_plots.R")     # Generate visualizations

Results are saved to:
- Summary reports: output/reports/
- Visualizations: output/figures/
- Processed data: data/processed/
The analysis can process:
- Raw iPerf3 text output
- Pre-processed CSV files
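Raw text input of the first kind can be produced with iperf3's standard client mode. A sketch, where the server name and the timestamped filename convention are illustrative choices, not requirements of the pipeline (`--connect-timeout` needs iperf3 ≥ 3.4):

```shell
# Capture raw iPerf3 text output for later parsing.
# Server name and filename pattern are illustrative, not pipeline requirements.
server=speedtest.serverius.net
outfile="iperf_$(date +%Y%m%d_%H%M%S).txt"
if command -v iperf3 >/dev/null 2>&1; then
    # -c: client mode, -t: duration in seconds; a 3 s connect timeout
    # keeps a dead server from stalling a batch of tests.
    iperf3 -c "$server" -t 120 --connect-timeout 3000 > "$outfile" || true
fi
```

Drop the resulting files into the input directory configured in config.R before running the processing scripts.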
R packages:
- tidyverse
- fs
- patchwork
- scales
- zoo
- lubridate
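The packages above can be installed in one step from the shell; a one-time setup sketch, assuming `Rscript` is on your PATH (the CRAN mirror is your choice):

```shell
# Install the R packages required by the analysis scripts.
Rscript -e 'install.packages(c("tidyverse","fs","patchwork","scales","zoo","lubridate"), repos="https://cloud.r-project.org")'
```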
- Connectivity failures: The script attempts to ping servers first to avoid hanging.
- Disk space: Long-term testing may accumulate large data files. Monitor available space.
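A disk-space check is easy to script alongside collection. A minimal sketch; the `data` directory and the 30-day age threshold are assumptions, so adjust them to your layout and retention policy:

```shell
# Report total size of the data directory and list files older than 30 days
# as candidates for archiving or deletion.
DATA_DIR="data"          # assumption: collected results live here
mkdir -p "$DATA_DIR"     # ensure the path exists so the checks below succeed
du -sh "$DATA_DIR"
find "$DATA_DIR" -type f -mtime +30 -print
```

Piping the `find` output to `xargs gzip` (or an archive step) is a simple way to reclaim space without losing raw data.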