Everything moves, nothing correlates, including the operation itself.
A true random number generator in which nothing is fixed: not the inputs, not the combination function, not the rule that selects the function.
The SMC combines uncorrelated physical signals through a randomly selected discontinuous combiner that changes on every call. If generation exceeds a time threshold, it stops mid-computation and returns the partial result. The interruption point itself is random, adding yet another layer of non-determinism.
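As an illustration of the deadline-interruption idea, here is a minimal sketch. The function name, the combiner family, and the inputs are invented for illustration only; the SMC's actual combiners and entropy sources are described in the whitepaper.

```rust
use std::time::{Duration, Instant};

/// Hypothetical sketch of a deadline-interrupted combiner. Each round
/// folds one input into the accumulator using a combiner chosen per
/// round; if the time budget is exhausted, the partial result is
/// returned mid-computation.
fn combine_with_deadline(inputs: &[u64], selector: u64, budget: Duration) -> u64 {
    let start = Instant::now();
    let mut acc = 0u64;
    for (i, &x) in inputs.iter().enumerate() {
        // Deadline check: stop early and return whatever has accumulated.
        if start.elapsed() > budget {
            return acc;
        }
        // Combiner selected per round from a small discontinuous family.
        acc = match (selector >> (i % 64)) & 0b11 {
            0 => acc.wrapping_add(x.rotate_left(13)),
            1 => acc ^ x.wrapping_mul(0x9E37_79B9_7F4A_7C15),
            2 => acc.wrapping_sub(x).rotate_right(7),
            _ => (acc ^ x).wrapping_mul(0xD6E8_FEB8_6659_FD93),
        };
    }
    acc
}

fn main() {
    let inputs = [0x1234_5678_9ABC_DEF0u64, 42, 0xDEAD_BEEF, 7];
    let out = combine_with_deadline(&inputs, 0b1001_0110, Duration::from_millis(1));
    println!("{out:#018x}");
}
```

In the real system the inputs would come from physical sources rather than a fixed array, so the partial result at an interrupted round is itself unpredictable.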
Unlike pseudo-random generators, which have a frozen transition function and inevitably repeat, the SMC provably has zero autocorrelation at every lag.
Neural networks: stochastic gradient descent relies on randomness for mini-batch sampling, dropout, initialization, and data augmentation.
PRNGs like the Mersenne Twister use a fixed deterministic transition function, which guarantees periodicity. That periodicity means gradient estimates inherit hidden bias: the optimizer revisits the same regions of the loss landscape, convergence slows, and training can get trapped in local minima.
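The periodicity of a fixed transition function is easy to demonstrate on a toy generator. The sketch below uses a deliberately tiny 8-bit LCG (not the Mersenne Twister) so the cycle becomes visible immediately; the principle is the same, only the period differs.

```rust
/// A deliberately tiny LCG: with a fixed transition function and k bits
/// of state, the sequence must revisit a state within 2^k steps and
/// then repeat forever. The Mersenne Twister has the same property,
/// just with a period of 2^19937 - 1.
fn lcg_step(state: u8) -> u8 {
    state.wrapping_mul(5).wrapping_add(3) // frozen transition function
}

/// Count steps until the start state recurs, i.e. the period.
fn period(start: u8) -> u32 {
    let mut s = lcg_step(start);
    let mut n = 1;
    while s != start {
        s = lcg_step(s);
        n += 1;
    }
    n
}

fn main() {
    // 8-bit state => the period can never exceed 256.
    println!("period from seed 1: {}", period(1));
}
```

This particular multiplier/increment pair satisfies the Hull-Dobell conditions, so the toy generator cycles through all 256 states before repeating.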
The SMC eliminates this. Because every component mutates independently and nothing correlates across time, gradient estimates under the SMC are truly unbiased.
The sample mean converges at the optimal O(1/sqrt(T)) rate given by the Central Limit Theorem, with no periodic bias term.
Exploration is genuinely free of revisitation patterns that would trap the optimizer in basins it has already visited.
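The O(1/sqrt(T)) convergence of a sample mean can be checked empirically. The sketch below uses a SplitMix64 mixer as a stand-in sample source (not the SMC) and measures how the error of the mean of uniform [0, 1) draws shrinks as T grows: roughly 100x more samples for 10x less error.

```rust
/// SplitMix64 step: a standard mixing function used here only as a
/// convenient deterministic source of uniform samples.
fn splitmix64(state: &mut u64) -> u64 {
    *state = state.wrapping_add(0x9E37_79B9_7F4A_7C15);
    let mut z = *state;
    z = (z ^ (z >> 30)).wrapping_mul(0xBF58_476D_1CE4_E5B9);
    z = (z ^ (z >> 27)).wrapping_mul(0x94D0_49BB_1331_11EB);
    z ^ (z >> 31)
}

/// Absolute error of the sample mean of T uniform [0, 1) draws vs. 0.5.
fn mean_error(t: u64) -> f64 {
    let mut state = 0x1234_5678u64;
    let mut sum = 0.0;
    for _ in 0..t {
        // Top 53 bits scaled into [0, 1).
        sum += (splitmix64(&mut state) >> 11) as f64 / (1u64 << 53) as f64;
    }
    (sum / t as f64 - 0.5).abs()
}

fn main() {
    // Error shrinks roughly like 1/sqrt(T).
    for t in [100, 10_000, 1_000_000] {
        println!("T = {t:>9}: |mean - 0.5| = {:.6}", mean_error(t));
    }
}
```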
For mathematical foundations, proofs, and architecture details, see the whitepaper at pawit.co/whitepapers/smc.pdf.
For library usage, see usage.md.
use smc::Smc;
fn main() {
let mut smc = Smc::default_fast();
// Generate random values
let value = smc.next_u64();
let float = smc.next_f64(); // [0, 1)
let small = smc.next_u32();
// Fill a buffer
let mut buf = [0u8; 64];
smc.fill_bytes(&mut buf);
println!("{value:#018x}");
println!("{float:.16}");
}

15/15 statistical disorder tests pass, confirming zero detectable structure across 1,000,000 samples:
[PASS] Monobit (bit frequency)
ones: 32006995, expected: 32000000, z-score: 1.7488
[PASS] Byte chi-squared uniformity
chi-sq: 230.99, df: 255, expected range: [150, 350]
[PASS] Runs test (bit-level)
runs: 3198086, expected: 3199999, z-score: -1.5126
[PASS] Autocorrelation lag 1
r(1): 0.000010, threshold: +/-0.003000
[PASS] Autocorrelation lag 7
r(7): 0.000220, threshold: +/-0.003000
[PASS] Autocorrelation lag 13
r(13): 0.000908, threshold: +/-0.003000
[PASS] Autocorrelation lag 100
r(100): 0.000034, threshold: +/-0.003000
[PASS] Autocorrelation lag 1000
r(1000): -0.000542, threshold: +/-0.003002
[PASS] Serial correlation (consecutive pairs)
chi-sq: 272.27, df: 255, expected range: [165, 345]
[PASS] Shannon entropy per byte
entropy: 7.999979 bits/byte (> 7.99 required, ideal = 8.0)
[PASS] Mean value test
mean: 9231602385776214016, expected: 9223372036854775808, z-score: 1.5456
[PASS] Longest run of ones
longest run: 27, expected ~23, range: [11, 45]
[PASS] Nibble frequency (4-bit chi-sq)
chi-sq: 19.05, df: 15 (< 45.0 required)
[PASS] Collision test (birthday)
collisions: 1161, expected: 1192, z-score: -0.8999
[PASS] Gap test (bit gaps)
mean gap: 259.6, expected: 255, z-score: 1.4007
15/15 tests passed
Verdict: output exhibits no detectable structure — consistent with true randomness.

Run it yourself:
cargo run --release --bin bench

SMC whitepaper: pawit.co/whitepapers/smc.pdf
GitHub repo: this repository
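For readers who want to sanity-check the reported figures, here is a minimal sketch of the monobit z-score computation. This is an assumption about the test's formula (one-bit count against the fair-coin expectation n/2 with standard deviation sqrt(n)/2), not the bench binary's actual code.

```rust
/// Monobit test sketch: count one-bits in a byte stream and compute the
/// z-score of the count against the fair-coin expectation n/2, where
/// n is the total number of bits and the standard deviation is sqrt(n)/2.
fn monobit_z(bytes: &[u8]) -> f64 {
    let n = (bytes.len() * 8) as f64;
    let ones: f64 = bytes.iter().map(|b| b.count_ones() as f64).sum();
    (ones - n / 2.0) / (n / 4.0).sqrt()
}

fn main() {
    // Alternating bits: exactly half ones => z-score of 0.
    let balanced = [0b1010_1010u8; 1000];
    println!("balanced z = {:.4}", monobit_z(&balanced));
    // All ones: maximally biased => huge positive z-score.
    let biased = [0xFFu8; 1000];
    println!("biased   z = {:.4}", monobit_z(&biased));
}
```

Applying the same formula to the reported run (6,995 excess ones over 64,000,000 bits, standard deviation 4,000) reproduces the listed z-score of about 1.75.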
Time of Creation: 31 March 2026, 3:33, by Pawit Sahare (also known as Amon).