Add Ulfius: C REST framework on GNU Libmicrohttpd (first C app framework!) #26

Merged
MDA2AV merged 4 commits into MDA2AV:main from BennyFranciscus:add-ulfius
Mar 16, 2026

Conversation

@BennyFranciscus
Collaborator

Ulfius — Pure C REST Framework

Ulfius is a lightweight HTTP framework for building REST APIs in pure C. Built on GNU Libmicrohttpd with Jansson for JSON, it's designed for embedded systems and apps where memory footprint matters.

Why add it?

  • First C application framework in HttpArena — we have C-based web servers (h2o, nginx), but no C app framework. Ulfius fills that gap
  • ~1,140 stars — solid underdog project maintained by a single developer since 2015
  • GNU Libmicrohttpd backend — battle-tested HTTP library under the hood
  • Minimal footprint — designed for constrained environments, interesting to see how it compares to full-featured frameworks

Implementation

  • All HttpArena endpoints implemented: /pipeline, /baseline1, /baseline2, /json, /compression, /db, /upload, /static/:filename
  • Jansson for JSON serialization
  • Thread-local SQLite with prepared statements for /db
  • Pre-loaded datasets and static files
  • TLS via GnuTLS (built-in secure framework support)
  • Compiled with -O3 -flto -march=native
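
The endpoint wiring above can be sketched with Ulfius's public API. This is a minimal, illustrative sketch (single route, port and payload are assumptions), not the PR's actual source:

```c
#include <stdio.h>
#include <ulfius.h>   // ulfius_* API, built on GNU Libmicrohttpd
#include <jansson.h>  // json_pack / json_decref

// GET /json — build a JSON body with jansson and hand it to Ulfius.
static int callback_json(const struct _u_request *request,
                         struct _u_response *response, void *user_data) {
    (void)request; (void)user_data;
    json_t *body = json_pack("{s:s}", "message", "Hello, World!");
    ulfius_set_json_body_response(response, 200, body);
    json_decref(body);  // Ulfius keeps its own copy of the serialized body
    return U_CALLBACK_CONTINUE;
}

int main(void) {
    struct _u_instance instance;
    if (ulfius_init_instance(&instance, 8080, NULL, NULL) != U_OK)
        return 1;
    ulfius_add_endpoint_by_val(&instance, "GET", "/json", NULL, 0,
                               &callback_json, NULL);
    if (ulfius_start_framework(&instance) == U_OK)
        getchar();  // serve until a key is pressed
    ulfius_stop_framework(&instance);
    ulfius_clean_instance(&instance);
    return 0;
}
```

Each `ulfius_add_endpoint_by_val` call registers one method/path pair; the real entry registers all eight routes this way.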

Tests enabled

baseline, noisy, pipelined, limited-conn, json, upload, compression, mixed, baseline-h2, static-h2


cc @babelouest — thought it'd be cool to see how Ulfius stacks up against other frameworks in HttpArena! Pure C with libmicrohttpd is a really interesting combination for benchmarks 🔧

yder's CMake requires libsystemd-dev for journald logging, which
isn't available in minimal Docker builds. Disable it with
-DWITH_JOURNALD=OFF since we only need yder as a dependency for
ulfius, not its journald features.
@BennyFranciscus
Collaborator Author

Build fix: yder (logging library dependency) requires libsystemd-dev for journald support, which is not available in minimal Docker containers.

Added -DWITH_JOURNALD=OFF to the yder CMake build. We only need yder as a transitive dependency for ulfius — journald logging is irrelevant in a benchmark container. 🔧
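
The fix amounts to one extra CMake flag when building yder. A sketch of the Docker build step (clone depth, paths, and targets are assumptions):

```shell
# Build yder without journald support so libsystemd-dev isn't required
git clone --depth 1 https://github.com/babelouest/yder.git
cmake -S yder -B yder/build -DWITH_JOURNALD=OFF   # skip journald logging backend
cmake --build yder/build --target install
```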

- Include zlib.h and link -lz
- Pre-compress large JSON at startup with deflateInit2 (gzip mode)
- Serve gzipped response when Accept-Encoding includes gzip
- Add zlib1g to runtime image
libmicrohttpd is HTTP/1.1 only. Removed baseline-h2 and static-h2
from tests list since they can't pass.
@BennyFranciscus
Collaborator Author

CI fix: removed baseline-h2 and static-h2 from meta.json tests.

libmicrohttpd is HTTP/1.1 only, with no HTTP/2 support. The H2 tests were incorrectly included and caused 4 test failures (failed H2 negotiation plus a missing Content-Type over H2). All HTTP/1.1 tests passed cleanly (22/22) ✅
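
Assuming meta.json carries a plain tests array (field name and layout assumed, mirroring the list from the PR description minus the two H2 entries), the change leaves:

```json
{
  "tests": ["baseline", "noisy", "pipelined", "limited-conn",
            "json", "upload", "compression", "mixed"]
}
```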

@github-actions
Contributor

Benchmark Results

Framework: ulfius | Profile: all profiles

ulfius / baseline / 512c (p=1, r=0, cpu=unlimited)
  Best: 899885 req/s (CPU: 8852.6%, Mem: 90.8MiB) ===

ulfius / baseline / 4096c (p=1, r=0, cpu=unlimited)
  Best: 1010878 req/s (CPU: 7703.4%, Mem: 128.0MiB) ===

ulfius / baseline / 16384c (p=1, r=0, cpu=unlimited)
  Best: 902412 req/s (CPU: 4744.5%, Mem: 129.6MiB) ===

ulfius / pipelined / 512c (p=16, r=0, cpu=unlimited)
  Best: 1116171 req/s (CPU: 10139.0%, Mem: 79.0MiB) ===

ulfius / pipelined / 4096c (p=16, r=0, cpu=unlimited)
  Best: 1140343 req/s (CPU: 9834.6%, Mem: 124.9MiB) ===

ulfius / pipelined / 16384c (p=16, r=0, cpu=unlimited)
  Best: 1265052 req/s (CPU: 6897.5%, Mem: 126.7MiB) ===

ulfius / limited-conn / 512c (p=1, r=10, cpu=unlimited)
  Best: 176316 req/s (CPU: 682.5%, Mem: 46.5MiB) ===

ulfius / limited-conn / 4096c (p=1, r=10, cpu=unlimited)
  Best: 175617 req/s (CPU: 678.2%, Mem: 54.4MiB) ===

ulfius / json / 4096c (p=1, r=0, cpu=unlimited)
  Best: 61218 req/s (CPU: 11959.9%, Mem: 165.1MiB) ===

ulfius / json / 16384c (p=1, r=0, cpu=unlimited)
  Best: 57552 req/s (CPU: 10062.0%, Mem: 170.5MiB) ===

ulfius / upload / 64c (p=1, r=0, cpu=unlimited)
  Best: 803 req/s (CPU: 5830.9%, Mem: 2.6GiB) ===

ulfius / upload / 256c (p=1, r=0, cpu=unlimited)
  Best: 863 req/s (CPU: 6832.1%, Mem: 9.9GiB) ===

ulfius / upload / 512c (p=1, r=0, cpu=unlimited)
  Best: 827 req/s (CPU: 7175.8%, Mem: 15.8GiB) ===

ulfius / compression / 4096c (p=1, r=0, cpu=unlimited)
  Best: 92513 req/s (CPU: 6208.0%, Mem: 517.3MiB) ===

ulfius / compression / 16384c (p=1, r=0, cpu=unlimited)
  Best: 71393 req/s (CPU: 2581.8%, Mem: 559.1MiB) ===

ulfius / noisy / 512c (p=1, r=0, cpu=unlimited)
  Best: 696306 req/s (CPU: 9300.8%, Mem: 80.4MiB) ===

ulfius / noisy / 4096c (p=1, r=0, cpu=unlimited)
  Best: 690767 req/s (CPU: 8725.9%, Mem: 126.1MiB) ===

ulfius / noisy / 16384c (p=1, r=0, cpu=unlimited)
  Best: 663940 req/s (CPU: 4240.9%, Mem: 123.4MiB) ===

ulfius / mixed / 4096c (p=1, r=5, cpu=unlimited)
  Best: 11035 req/s (CPU: 424.5%, Mem: 318.7MiB) ===

ulfius / mixed / 16384c (p=1, r=5, cpu=unlimited)
  Best: 11218 req/s (CPU: 373.4%, Mem: 355.6MiB) ===
Full log
  Bandwidth:  75.15MB/s
  Status codes: 2xx=2998071, 3xx=0, 4xx=845898, 5xx=0
  Latency samples: 3843969 / 3843969 responses (100.0%)
  Reconnects: 166804
  Per-template: 1524962,1473303,845704,0,0
  Per-template-ok: 1524847,1473224,0,0,0

  WARNING: 845898/3843969 responses (22.0%) had unexpected status (expected 2xx)
  CPU: 3177.7% | Mem: 129.6MiB

=== Best: 663940 req/s (CPU: 4240.9%, Mem: 123.4MiB) ===
  Input BW: 67.12MB/s (avg template: 106 bytes)
[dry-run] Results not saved (use --save to persist)
httparena-bench-ulfius
httparena-bench-ulfius

==============================================
=== ulfius / mixed / 4096c (p=1, r=5, cpu=unlimited) ===
==============================================
ec2e0fd62f11037747e35b1e25f31f8338bf6b23330154e6b007abaffc77f8f1
[wait] Waiting for server...
[ready] Server is up

[run 1/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     4096 (64/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   52.48ms    221us   307.00ms   339.70ms   342.30ms

  224825 requests in 5.00s, 55175 responses
  Throughput: 11.03K req/s
  Bandwidth:  373.88MB/s
  Status codes: 2xx=55175, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 55174 / 55175 responses (100.0%)
  Reconnects: 176470
  Per-template: 5924,5904,6007,5897,6152,4927,4914,6024,4773,4652
  Per-template-ok: 5924,5904,6007,5897,6152,4927,4914,6024,4773,4652
  CPU: 424.5% | Mem: 318.7MiB

[run 2/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     4096 (64/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   23.64ms    182us   112.80ms   122.90ms   130.20ms

  210906 requests in 5.00s, 21699 responses
  Throughput: 4.34K req/s
  Bandwidth:  149.74MB/s
  Status codes: 2xx=21699, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 21698 / 21699 responses (100.0%)
  Reconnects: 191477
  Per-template: 2145,2265,2455,2435,2506,1913,1902,2298,1884,1895
  Per-template-ok: 2145,2265,2455,2435,2506,1913,1902,2298,1884,1895
  CPU: 194.7% | Mem: 389.9MiB

[run 3/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     4096 (64/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   23.16ms    179us   107.10ms   110.20ms   121.50ms

  209425 requests in 5.00s, 18370 responses
  Throughput: 3.67K req/s
  Bandwidth:  123.17MB/s
  Status codes: 2xx=18370, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 18369 / 18370 responses (100.0%)
  Reconnects: 192088
  Per-template: 2020,2053,2090,2035,1920,1676,1472,1995,1512,1596
  Per-template-ok: 2020,2053,2090,2035,1920,1676,1472,1995,1512,1596
  CPU: 161.3% | Mem: 445.9MiB

=== Best: 11035 req/s (CPU: 424.5%, Mem: 318.7MiB) ===
  Input BW: 1.08GB/s (avg template: 104924 bytes)
[dry-run] Results not saved (use --save to persist)
httparena-bench-ulfius
httparena-bench-ulfius

==============================================
=== ulfius / mixed / 16384c (p=1, r=5, cpu=unlimited) ===
==============================================
214456a19dc728118b292126c7b0f3b936da991efbaab7fea7bb11ef11016c25
[wait] Waiting for server...
[ready] Server is up

[run 1/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     16384 (256/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   68.59ms    267us   365.00ms   460.00ms   567.00ms

  216951 requests in 5.00s, 56091 responses
  Throughput: 11.21K req/s
  Bandwidth:  375.05MB/s
  Status codes: 2xx=56091, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 56091 / 56091 responses (100.0%)
  Reconnects: 160869
  Per-template: 6151,6253,6270,6100,6100,4918,4777,6065,4739,4718
  Per-template-ok: 6151,6253,6270,6100,6100,4918,4777,6065,4739,4718
  CPU: 373.4% | Mem: 355.6MiB

[run 2/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     16384 (256/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   39.96ms    543us   113.50ms   311.60ms   519.80ms

  210859 requests in 5.00s, 5616 responses
  Throughput: 1.12K req/s
  Bandwidth:  39.88MB/s
  Status codes: 2xx=5616, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 5616 / 5616 responses (100.0%)
  Reconnects: 198249
  Per-template: 503,615,671,632,594,520,512,565,540,464
  Per-template-ok: 503,615,671,632,594,520,512,565,540,464
  CPU: 119.6% | Mem: 359.8MiB

[run 3/3]
gcannon — io_uring HTTP load generator
  Target:    localhost:8080/
  Threads:   64
  Conns:     16384 (256/thread)
  Pipeline:  1
  Req/conn:  5
  Templates: 10
  Expected:  200
  Duration:  5s


  Thread Stats   Avg      p50      p90      p99    p99.9
    Latency   32.51ms    738us   110.30ms   274.20ms   532.50ms

  212010 requests in 5.00s, 7199 responses
  Throughput: 1.44K req/s
  Bandwidth:  51.45MB/s
  Status codes: 2xx=7199, 3xx=0, 4xx=0, 5xx=0
  Latency samples: 7199 / 7199 responses (100.0%)
  Reconnects: 196537
  Errors: connect 0, read 6, timeout 0
  Per-template: 858,840,670,711,807,568,591,847,673,634
  Per-template-ok: 858,840,670,711,807,568,591,847,673,634
  CPU: 131.2% | Mem: 393.3MiB

=== Best: 11218 req/s (CPU: 373.4%, Mem: 355.6MiB) ===
  Input BW: 1.10GB/s (avg template: 104924 bytes)
[dry-run] Results not saved (use --save to persist)
httparena-bench-ulfius
httparena-bench-ulfius
[skip] ulfius does not subscribe to baseline-h2
[skip] ulfius does not subscribe to static-h2
[skip] ulfius does not subscribe to baseline-h3
[skip] ulfius does not subscribe to static-h3
[skip] ulfius does not subscribe to unary-grpc
[skip] ulfius does not subscribe to unary-grpc-tls
[skip] ulfius does not subscribe to echo-ws
[restore] Restoring CPU governor to powersave...

@BennyFranciscus
Collaborator Author

Ulfius benchmarks are in! 🎉 Solid showing for a C REST framework built on GNU Libmicrohttpd:

🚀 Pipelined: 1.27M req/s at 16384c — impressive that it actually scales up with more connections; MHD's thread pool plus internal epoll handles this well.
⚡ Baseline: 1.01M at 4096c — crossing the 1M barrier on baseline is a great result for a framework-level C entry (not a raw-socket server).
📦 JSON: 61K at 4096c — jansson's tree-based API adds overhead vs frameworks with streaming serializers, but it's one of the cleanest JSON APIs in C.
🗜️ Compression: 93K — solid, zlib doing its thing.
🔀 Noisy: 696K — handling mixed valid/invalid traffic well.
💾 Memory: Only 90-130MB for most profiles — by far the most memory-efficient entry we've seen. C doing C things.
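
The jansson overhead mentioned above comes from its tree model: every field becomes a heap-allocated node, then `json_dumps` walks the tree, so serialization is two passes. A minimal sketch (field names hypothetical):

```c
#include <jansson.h>  // link with -ljansson
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    // Tree-based: allocate json_t nodes first...
    json_t *obj = json_pack("{s:i, s:s}", "id", 1, "message", "hello");
    // ...then serialize the whole tree in a second pass.
    char *text = json_dumps(obj, JSON_COMPACT);
    printf("%s\n", text);
    free(text);       // json_dumps returns a malloc'd string
    json_decref(obj); // drop the tree
    return 0;
}
```

A streaming serializer writes bytes as values are produced and skips the intermediate tree, which is where the throughput gap comes from.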

The limited-conn numbers (176K) suggest MHD's thread pool pays a noticeable per-connection setup cost under connection churn — which makes sense given it manages internal connection state.

Upload at ~800 req/s is on par with other entries for the 20MB profile. The memory spike to 15.8GiB at 512c suggests MHD buffers the full request body per thread; that may be tunable, but it works as-is.

Ready for merge! 🚀

@MDA2AV MDA2AV self-requested a review March 16, 2026 15:01
@MDA2AV MDA2AV merged commit 7bc414d into MDA2AV:main Mar 16, 2026
3 of 4 checks passed