Welcome to the gpunoise wiki!
For those of you who don't know, there is an increasing number of people who "mine" various cryptocurrencies using GPUs from the likes of Nvidia, AMD, etc. If you have heard of this, you're probably upset because you haven't been able to purchase a gaming GPU in a while. I don't know what to say about that...
TL;DR - Large numbers of GPUs in open-air cryptocurrency mining "rigs" emit substantial spurious emissions/noise on portions of the LTE spectrum used in the United States.
I happen to have a GPU mining farm as documented here:
You know, because "I mine HARD". Get it?
I noticed fairly early on that cellular devices had connectivity/signal issues in close proximity to this install, as mentioned in various reddit threads/comments:
...etc. I was curious why I didn't seem to get cell service when standing next to more than 100 Nvidia GPUs in an open-air configuration with no chassis, "grounding", etc. If you think my setup is bad, you should see what some other "miners" are up to... Anyway, luckily we live in an amazing age where $20 and some internet knowledge gets you an RTL-SDR so you can figure these things out. That's what this project is about.
Also inspired by this article about ASIC mining.
- Laptop (wifi and bt disabled), running Ubuntu 18.04
- E4000-based RTL-SDR tuner (we need to get to 2.2 GHz, yo)
- El-cheapo omnidirectional telescoping antenna in a fixed configuration/location (unless otherwise noted). We're trying to measure noise, so I wasn't too concerned about this. I'm trying to count coins and gains, not SWR! That said, I did do the insane LTE base station PPM calibrations for my hardware, so these results and frequencies are at least somewhat valid...
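Why bother with PPM calibration? The cheap crystal in an RTL-SDR is off by some parts-per-million, and that offset scales with the tuned frequency. A quick sketch of the arithmetic (the 25 PPM figure is just an illustrative value, not my dongle's actual offset):

```python
# Sketch: how a crystal offset in parts-per-million (PPM) translates to an
# absolute frequency error at a given tuned frequency.
# The 25 PPM value below is illustrative, not a measurement from this project.

def freq_error_hz(tuned_hz: float, ppm: float) -> float:
    """Absolute frequency error (Hz) for a given PPM offset."""
    return tuned_hz * ppm / 1e6

# At 2110 MHz (bottom of LTE "range 3"), even a modest offset matters:
error = freq_error_hz(2_110e6, 25)
print(error)  # 52750.0 Hz, i.e. roughly 53 kHz of error
```

At ~53 kHz off, reported spike frequencies would be meaningfully wrong, which is why the calibration was worth the pain.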
I did some research on LTE bands used by various providers in the United States and came up with three ranges of RF spectrum to scan:
- LTE "range 1": 617 - 746 MHz
- LTE "range 2": 1710 - 1790 MHz
- LTE "range 3": 2110 - 2190 MHz
Note that these aren't actual bands as described anywhere, just ranges I defined for the purposes of grouping and scanning spectrum.
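If you want to reproduce this, one plausible workflow is to sweep each range with `rtl_power` and post-process the CSV it emits. In that format, each row is date, time, freq_low, freq_high, bin_size, sample count, then one dB value per bin. A minimal parser mapping each dB reading to its bin-center frequency (the sample row below is made up, not one of my scans):

```python
import csv
import io

def parse_rtl_power(text):
    """Yield (freq_hz, power_db) pairs from rtl_power-style CSV output."""
    for row in csv.reader(io.StringIO(text)):
        low, step = float(row[2]), float(row[4])
        for i, db in enumerate(float(v) for v in row[6:]):
            yield (low + step * (i + 0.5), db)  # bin-center frequency

# Hypothetical two-bin row at the bottom of LTE "range 1":
sample = "2018-06-01, 12:00:00, 617000000, 618000000, 500000, 64, -42.1, -40.3"
for freq, db in parse_rtl_power(sample):
    print(f"{freq / 1e6:.2f} MHz: {db} dB")
```

With frequency/power pairs in hand, comparing idle vs. mining scans is just arithmetic.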
The first set of results comes from a different, undisclosed location with a single EVGA GTX 1060 installed in a SuperMicro server case. The first run is a baseline with overclock and fan settings enabled but no mining. The second run is with bminer mining Equihash, resulting in 100% GPU utilization as reported by nvidia-smi.
LTE "range 1":
LTE "range 2":
LTE "range 3":
So, basically no difference.
Now let's look at my > 100 GPU farm in much higher image resolution.
There was a distinct spike in the US LTE "range 1" as seen in this diff:
There definitely seem to be some spurious emissions/noise between roughly 696 and 700 MHz.
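The diff itself is simple: subtract the baseline scan from the active scan bin-by-bin and flag anything above a threshold. A sketch (the power values and 6 dB threshold are illustrative, not my actual measurements):

```python
def scan_diff(baseline, active, threshold_db=6.0):
    """Given two dicts of {freq_hz: power_db} from frequency-aligned scans,
    return frequencies where the active scan exceeds the baseline by
    at least threshold_db."""
    return sorted(
        f for f, db in active.items()
        if f in baseline and db - baseline[f] >= threshold_db
    )

# Made-up values around the 696-700 MHz region, for illustration only:
baseline = {696e6: -55.0, 698e6: -54.0, 700e6: -55.5, 702e6: -55.0}
active   = {696e6: -47.0, 698e6: -43.0, 700e6: -48.0, 702e6: -54.5}
print([f / 1e6 for f in scan_diff(baseline, active)])  # [696.0, 698.0, 700.0]
```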
Now the original source scans:
For completeness here are diffs for the other LTE ranges as described elsewhere in this project:
So, basically just the kind of amplitude changes that one would expect from moving outdoors to indoors.
(Mostly) Full VHF+UHF Scan
This is possibly the most interesting result: there is a substantial amount of spurious emission between 100 and 400 MHz, with especially high amplitude between 100 and 300 MHz. These spurious emissions could definitely interfere with a variety of licensed transmitters in the United States and elsewhere.
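For a wideband sweep like this, one crude way to pick out emissions automatically is to treat the scan's median power as the noise floor and flag anything well above it. A sketch with made-up values (not my actual scan data) that mimic the 100-300 MHz hump:

```python
from statistics import median

def flag_emissions(bins, margin_db=10.0):
    """Flag bins whose power exceeds the scan's median power (a crude
    noise-floor estimate) by at least margin_db.
    bins: list of (freq_hz, power_db)."""
    floor = median(db for _, db in bins)
    return [(f, db) for f, db in bins if db - floor >= margin_db]

# Illustrative values only: elevated power below 300 MHz, quiet above.
bins = [(100e6, -38.0), (150e6, -36.0), (250e6, -40.0),
        (350e6, -55.0), (450e6, -56.0), (500e6, -54.0), (600e6, -55.0)]
print(flag_emissions(bins))  # flags the 100, 150, and 250 MHz bins
```

The median is a reasonable floor estimate only when most bins are quiet; a scan dominated by noise would need a smarter baseline.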
- LTE spectrum in use in other countries
- Investigate potential relationship between GPU clock rates and frequency of noise emitted
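One way to start on the clock-rate question: check which integer harmonics of a given clock land inside a band of interest. The 174.5 MHz clock below is purely hypothetical, chosen to illustrate the arithmetic, and is not a claim about the actual source of the observed noise:

```python
import math

def harmonics_in_band(clock_hz, band):
    """Integer harmonics of clock_hz that land inside band = (low_hz, high_hz)."""
    low, high = band
    return [n * clock_hz
            for n in range(math.ceil(low / clock_hz),
                           math.floor(high / clock_hz) + 1)]

# Hypothetical 174.5 MHz clock checked against LTE "range 1" (617-746 MHz):
print([f / 1e6 for f in harmonics_in_band(174.5e6, (617e6, 746e6))])  # [698.0]
```

Sweeping this over real GPU core/memory clock readouts (e.g. from nvidia-smi) and comparing against where spikes actually appear would be the obvious next experiment.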