I first explored low-cost MEMS sensors and came across the MMC5603, which is advertised with an RMS noise of 400 nT. I even wrote an ESP32-compatible library for it: available here.

My idea was simple: use a statistical approach and take a massive number of measurements to average the noise away. For example, averaging 10,000 measurements lowers the variance of the mean by a factor of 10,000, and thus its standard deviation by a factor of 100 (thanks, law of large numbers!). It’s a technique I had already used to measure Earth’s rotation with BMI160 sensors: explained here (FR).
So, on paper, by running 40,000 measurements (at 100 Hz with 4 sensors in parallel → 100 seconds of acquisition), I should be able to reduce the noise from 400 nT to 400 / sqrt(40,000) = 2 nT. Bingo, right? Well… no.
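Here is that naive reasoning as a quick simulation, a minimal sketch assuming purely white Gaussian noise at the datasheet's 400 nT RMS (the 50,000 nT "true field" is just a placeholder value, not a measurement):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: 40,000 readings of a fixed field, each corrupted
# by white Gaussian noise with a 400 nT RMS, as on the datasheet.
true_field_nT = 50_000.0
sensor_rms_nT = 400.0
n_readings = 40_000

# Repeat the whole 100 s campaign many times to measure how much
# the averaged result itself fluctuates from run to run.
trials = 1_000
averages = [
    (true_field_nT + sensor_rms_nT * rng.standard_normal(n_readings)).mean()
    for _ in range(trials)
]

print(f"std of a single reading : {sensor_rms_nT:.1f} nT")
print(f"std of the average      : {np.std(averages):.2f} nT")  # ~ 400/sqrt(40000) = 2 nT
```

With ideal white noise the trick works exactly as advertised. The catch, as we'll see, is that real sensor noise isn't white.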
While digging through the literature, I found Marcel Ochsendorf’s thesis: Development of a Permanent Magnet Characterisation Framework for Use in Low-Field MRI Systems.
He reports that the MMC5603NJ, in the context of ultra-low-field nuclear magnetic resonance (using the Earth’s magnetic field), exhibits a background noise of 15 nT, mostly due to 1/f noise.
And here’s the big problem: the statistical approach works well with white (Gaussian) noise, but not with 1/f noise. With 1/f noise, averaging does not reduce the standard deviation by the usual factor of 1 / sqrt(N): the slow drift that dominates the spectrum at low frequency is correlated from sample to sample, so piling up more samples barely helps, as the simulation below shows.
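To see the difference, here is a minimal sketch with purely synthetic noise (no MMC5603 specifics): it generates white noise and spectrally shaped 1/f ("pink") noise, both normalized to unit RMS, then compares how the standard deviation of an N-sample average shrinks in each case:

```python
import numpy as np

rng = np.random.default_rng(0)

def pink_noise(n):
    """1/f ("pink") noise via spectral shaping of white noise."""
    white = np.fft.rfft(rng.standard_normal(n))
    f = np.fft.rfftfreq(n)
    f[0] = 1.0                   # dummy value; the DC bin is zeroed below
    shaped = white / np.sqrt(f)  # amplitude ~ 1/sqrt(f)  =>  PSD ~ 1/f
    shaped[0] = 0.0              # zero-mean record overall
    x = np.fft.irfft(shaped, n)
    return x / x.std()           # normalize to unit RMS

def block_mean_std(x, n):
    """Std of the averages of consecutive n-sample blocks."""
    m = len(x) // n
    return x[: m * n].reshape(m, n).mean(axis=1).std()

N_total = 2**21
white = rng.standard_normal(N_total)
pink = pink_noise(N_total)

print(f"{'N':>7} {'white':>10} {'1/sqrt(N)':>10} {'1/f':>10}")
for n in (1, 10, 100, 1_000, 10_000):
    print(f"{n:>7} {block_mean_std(white, n):>10.4f} "
          f"{1/np.sqrt(n):>10.4f} {block_mean_std(pink, n):>10.4f}")
```

The white-noise column tracks 1 / sqrt(N) nicely, while the 1/f column flattens out after a few decades: past a certain point, longer averaging buys you almost nothing. That is exactly the wall I ran into.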