My first job in electronic design circa 1981 was making analog autopilots and control devices for RPVs – the early form of what today we call UAVs. A couple of really delicate boxes with gyroscopes, accelerometers, and magnetometers, and several boards full of LM148 quad op-amps surrounded by a lot of resistors and capacitors made the 17ft wingspan composite bird head in the direction we wanted.
One thing I learned quickly was close only counts for horseshoes, hand grenades, and high-gain transistors. The care and feeding of op-amps requires a fair amount of trimming, in those days with the relatively new Bourns trimming potentiometer. By using a nominal resistor value in series with a potentiometer, we could finely trim the output of a control loop at a critical point – and then glue the knob with Loctite so it would stay there. (I still have a tray full of “pot tweaker” sticks somewhere in my electronics box in the garage.)
Ah, if only we could make things digital, all this trimming nonsense would seem archaic. Digital circuits don’t drift with temperature. They compensate for design variation in software. It’s a good theory, except for one thing: it doesn’t magically work that way. At some point, the world is analog, and there is a sensor, a reference voltage, or an oscillator that varies from its nominal specification, requiring some form of calibration.
Apple discovered this the hard way on the iPhone 5S, really screwing up the bias on their MEMS accelerometer and failing to correct for it, resulting in big errors in apps. One “fix” was a software calibration routine that figures out the correct bias; note the caveat that the “surface is closer to flat than the existing bias of the device”, otherwise things can get worse.
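That sort of software calibration routine is simple in principle: average raw readings while the device sits still, and call anything left over after subtracting gravity the bias. Here is a minimal hypothetical sketch (the structure and function names are mine, not Apple’s); note how it bakes in the same assumption behind the caveat above, since a tilted surface gets folded into the "bias":

```c
#include <stddef.h>

/* Hypothetical illustration: estimate a fixed accelerometer bias by
 * averaging samples taken while the device rests on a flat surface.
 * Readings are in units of g. */

typedef struct { double x, y, z; } vec3;

static vec3 estimate_bias(const vec3 *samples, size_t n)
{
    vec3 mean = {0.0, 0.0, 0.0};
    for (size_t i = 0; i < n; i++) {
        mean.x += samples[i].x;
        mean.y += samples[i].y;
        mean.z += samples[i].z;
    }
    mean.x /= (double)n;
    mean.y /= (double)n;
    mean.z /= (double)n;
    /* At rest on a truly level surface the only acceleration is
     * gravity, so the expected reading is (0, 0, 1). Whatever remains
     * after subtracting it is treated as bias -- including any tilt
     * of the surface, which is exactly why the caveat matters. */
    mean.z -= 1.0;
    return mean; /* bias to subtract from future readings */
}
```

If the table is tilted by more than the sensor's actual bias, this routine dutifully "corrects" in the wrong direction, and things get worse, just as the caveat warns.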
You may remember Apple has an M7 motion coprocessor (aka the NXP LPC18A1, an ARM Cortex-M3 microcontroller) between the MEMS sensor and the A7 processor itself, which probably saved them from the mother of all product recalls. iOS 7.0.3 fixes the problem, presumably (although Apple is pretty mum) by updating the motion coprocessor firmware to grab a stored calibration factor and report a computationally-adjusted sensor reading.
Without affixing blame between Apple and their MEMS sensor supplier, this is a hack that should never have escaped into the wild. Even a modicum of analog experience would have suggested factory calibration of the MEMS sensor, storing a value in non-volatile memory.
Many MEMS sensors are being paired with a microcontroller, especially where the sensor is on a network readable by any number of nodes. This points to a best practice of local sensor calibration using one-time programmable memory, such as the OTP IP offered by Sidense. It is a much lower-cost approach than using the primary MCU’s flash to store fixed calibration values, and far smaller and more reliable than the ancient art of electromechanical trimming.
The actual trimming operation varies, but usually takes one of two forms. Analog circuits can be altered directly by electronic switches controlled from bits stored in OTP memory, typically adding or subtracting resistance to change a voltage or gain stage. Alternatively, if an MCU is present on the sensor, a computational factor can be permanently stored for firmware use. As MEMS processes become smaller, analog variation increases and trimming becomes more important.
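The second form, a computational factor stored for firmware use, can be sketched in a few lines of C. This is a hypothetical layout of my own invention, not any vendor’s actual OTP map: an offset and a fixed-point gain are burned in at factory test, and the firmware applies them to every raw reading it reports onto the network.

```c
#include <stdint.h>

/* Hypothetical sketch of firmware-side trimming. The field names and
 * the fixed-point format (gain in Q10, so 1024 == 1.000x) are
 * illustrative only, not a real device's OTP layout. */

typedef struct {
    int16_t  offset;    /* ADC counts to subtract from the raw reading */
    uint16_t gain_q10;  /* gain correction in Q10 fixed point          */
} otp_trim_t;

/* Apply the factory-stored trim to a raw sensor reading. */
static int32_t apply_trim(int32_t raw, const otp_trim_t *trim)
{
    int32_t corrected = raw - trim->offset;
    return (corrected * (int32_t)trim->gain_q10) >> 10;
}
```

Because the trim travels with the sensor in OTP, every node that reads the device gets the corrected value, with no per-system calibration step.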
System-wide sensor calibration is an antiquated, time-consuming, and expensive approach. As more MEMS sensors show up – in automotive, fitness and medical, mobile, and home automation applications – connected via networks, especially wireless sensor networks for the Internet of Things, the risk of an uncalibrated sensor reading propagating to multiple points grows. With on-sensor trimming using OTP memory, every sensor delivers calibrated values every time, no matter what is reading them.