Author here. Lots of questions about precision --- in the article I calculated 0.6mm as the standard deviation of 200ms worth of phase measurements (n=124) while the slide wasn't moving.
I'd love to hear any advice/ideas re:
- approaches for figuring out what's currently limiting the accuracy (i.e., noise sources)
- the relative merits of averaging in time vs phase domain
- how to improve the analog frontend (See my collaborator Mitko's github repo for the hardware schematics: https://github.com/MitkoDyakov/Calipatron/blob/main/Hardware...)
I only skimmed your article, but one thing that stood out is that you preferred to lower the ADC sample rate. That doesn't make sense.
Usually the way you want to set something like this up is with a low pass filter before you sample it and you want to sample at a high enough rate (some multiple of that filter frequency). This is due to the sampling theorem. Maybe I'm missing something.
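For concreteness, the constraint being invoked there is just the standard Nyquist criterion (nothing specific to this project):

    f_s \ge 2 f_c

where f_c is the cutoff of the anti-alias filter in front of the ADC; in practice you leave margin (several times f_c) because real filters roll off gradually rather than cutting off sharply.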
I would aim for the highest possible sample rate and then post-process the signal (assuming that's computationally reasonable, i.e. you can keep up with real time).
There are general practices/principles for minimizing noise. Having a ground plane. Separating your digital electronics from your analog electronics. Clean power supply. Shielding.
Another thing I'd be concerned about is that this whole thing is a big antenna. Yet another concern is that motion might influence the results (any time you have conductors moving in the presence of fields you get some induced current).
Commercial positioning solutions are usually based on optics and gratings. I wonder if that's a better approach even for a hobbyist. Something like an optical mouse just linearized...
Super cool project! I don't have any answers but really appreciate learning how these things work.
Too late to edit, and somewhat off topic, but I wonder if you could package an interferometer in such a way to work for calipers?
https://caltechexperimentalgravity.github.io/IFO.html
Again, too late to edit, but there's some interesting related content here:
https://hackaday.io/project/5283-potpourri/log/21475-laser-i...
https://hackaday.com/2015/07/24/self-built-interferometer-me...
Be sure to watch Big Clive's video that's linked in the article. It's amazing how cheap digital calipers have gotten, and how accurate they are at that price. They probably use a 4-bit MCU or an ASIC to do the calculations and output the data --- both to the screen, and to an often-hidden serial port. The drift from non-absolute encoding, and the high standby power consumption that was distinctive of the infamous battery-eating Mitutoyo clones, seem to have been largely solved within the past few years. There are even solar models available for not much more.
I would love some solar powered calipers. "I'll wing it" measurements for me are usually powered by "god damn it, do I even have any more CR2032's? Is the battery cover pressing down enough? Screw it where's a ruler..."
Get a Mitutoyo, it'll be accurate AND the battery will last 5 years. It only hurts once. Cheap calipers hurt every time you use them :/
Recommend you take the CR2032 out when not using the calipers. Cheap/no-name ones (at least those sold 5-10 years ago) have a relatively high standby current, even when the LCD screen is off.
I just flip the coin cell around so it stays in the device but does not deplete (reversed polarity, but in fact the contacts won't touch the battery poles, so no harm done).
Would you happen to know the current draws for the older and newer designs? I would like to check if mine are worth replacing (since I dutifully remove the battery and store it in the designated holder in the case, I can't rely on durability comparisons, because those depend on usage patterns).
I don't know the current draw of either. I don't know if newer designs are better; I only have one I bought around 10-15 years ago. For mine, a battery left in the calipers depletes within a few months, even without use.
Even my cheap 100% plastic calipers have been running for 2 years. And yes, they cost $100, but my good calipers have been running for 5 years or so.
It would be super cool to do some sort of energy harvesting when you move the calipers open and closed.
Solar calipers are around US$10 at the usual far-East online shops.
This blog shows a better capture of the pulse-train waveforms:
https://www.grant-trebbin.com/2014/04/digital-calliper-teard...
Which I guess is from here: (2006)
https://web.archive.org/web/20060923040306/http://www.yadro....
Yes, I remember seeing this video when it came out, and he explains it quite well. I don't know who was the first to use a linear encoder in a digital caliper, but it's ingeniously simple. And omg have I gone through some LR44s with my battery-eating clone as well, good to know I'm not alone.
I just pull the battery out before putting it away.
I don't actually see an accuracy number, only the claim of "millimeter precision", which is actually pretty bad for calipers. Looks like a fun project though. Basically a linear resolver sensor I guess. From how much effort the author has put into the project I'd estimate the accuracy is much better than +/- 0.5mm.
Which is quite low. My manual caliper is precise to 1/10 of a mm, my electronic one to 1/100 (but I would say 0.02 is more realistic). The manual one is not good, Harbor Freight quality; the electronic one is a Mitutoyo (not an entry-level model). Still, good ROI on both; I got the electronic one because my eyesight is not that good anymore.
With a decent set of vernier calipers (I have Brown and Sharpe ones) they're accurate to 0.001" (0.02mm) every time. But what's nice about analog measuring tools is you can actually reliably achieve better accuracy--like 0.0005" +/- 0.00025"--by "reading between the lines". I can reliably take finishing cuts accurately to a few ten thousandths of an inch using vernier calipers (confirmed by checking with a micrometer accurate to 0.0001").
The only application I've encountered where digital tools work better for me is a DRO on a mill, which is extremely convenient.
You really want to use a mike at this kind of precision. Calipers can be repeatable in a certain range, but even then a vernier readout gives too much error. Measuring to a tenth of a mm is acceptable (tho I'd never trust a vernier caliper measurement beyond 0.2mm). A hundredth IMO is wishful thinking.
Looking at my calipers now I noticed that the imperial side is twice as precise as the metric side. Graduations of 0.05mm vs 0.001". I wonder why that is.
Precision, maybe, but is it accurate to that degree?
In my experience, yes. I've checked with micrometers that are an order of magnitude more precise. You have to be fairly experienced reading it, and it often helps to use a magnifying glass. Also you have to be very careful not to drag the jaws open when removing from the object you're measuring. Taking multiple measurements and averaging helps.
To be clear, for a measurement where accuracy to less than 0.001" actually matters use a micrometer! Otherwise you're likely to screw up the part. But the advertised precision of 0.001" is totally repeatable within 0.0005".
tenths and hundredths of an inch "don't mean anything" because we don't divide inches that way in common use, but in subtractive manufacturing and the like they do use "thou" - and 0.001" is a thou.
Personally, I use microns instead of 0.001mm, too, when measuring that small. I forget the accuracy of my good calipers, but I could detect errors of around 2 microns if memory serves. It's been a long time since I cared about anything that accurate, so I have two pairs of cheap plastic ones - scale and digital.
A typical metric micrometer is accurate to 0.01mm (tho you can find more precise ones at a premium). It's really unlikely you'll get micron precision from any calipers. Even an angry glance warms up the instrument enough to make this meaningless.
Microns are the domain of grinding and lapping, you rarely ever need to go there with cutting.
> tenths and hundredths of an inch "don't mean anything" because we don't divide inches that way in common use
Architectural rulers tend to divide inches into tenths for some reason. I have no idea why, because lumber comes in multiples of 12 (e.g. 8', 10', 12', 16'), so if you design in multiples of 10 you're likely to waste a lot. If anyone knows, I'd be curious to hear about it; mostly it makes drawing things to scale a pain in my ass.
Somewhat ironically, it's to make architects' jobs easier when they're drawing things to <architectural> scale.
If you're making a floor plan drawing at 1:100, a 240 inch wall becomes 2.4 inches on the drawing. The scales [of the drawings] and the scales [the tools] evolved together. (Similar to "why do computer people work in base 2 or base 16 so often?")
Makes sense, I'll try to work in whole number scaled inches more (although tbh I'm just being picky, eyeballing the fractional part is usually good enough).
> i use microns instead of 0.001mm
What "micron" are you referring to here? The "micron" I am familiar with is exactly that one (i.e. 1 micron = 1µm = 0.001mm).
Yeah, I only use calipers and micrometers for machining--I haven't found any use for them in additive manufacturing--and never in metric units, because all my tools are imperial. Just strange that the calipers punish metric users by giving them only half the precision.
If you buy a tool in a country that mainly uses imperial, the markings on it might be finer for the imperial measurement. Might be the opposite in a country with metric. Just guessing, though; that is often how these things work out.
A Google image search for "caliper" shows several have 0.001in and 0.02mm accuracy, which is similar.
For 0.01mm there are only electronic or dial calipers.
I really cannot understand how you can talk about micrometer (µm) accuracy on a caliper. If you apply a tiny bit more pressure, or hold it at a slight angle, the error is at least tens of µm.
Try it! You might surprise yourself.
EDIT: ah, I see your confusion. A "micrometer" is an aliased term. It means both "a millionth of a meter" and "a tool for measuring very precisely". I used the latter meaning above. Although in terms of order of magnitude precision they're identical--a micrometer accurate to 0.0001" is accurate to 2.54e-6m. It's possible to get within a handful of µm with decent calipers. Easier with a micrometer.
It depends what you are measuring. Some plastic piece? Of course this has a ton of flex. A solid steel shaft? No way you can flex it tens of um.
You don't need to flex the material you are measuring to get tens of microns of difference. The two jaws of the caliper simply won't stay parallel, since there is always some dust or fluid between the sliding pieces. The dust will simply flex.
Of course. To get a good reading you have to clean both surfaces. I mean if you want to wring blocks together it won't work without clean surfaces either.
Interesting that you say that. My current backburner project is a display (TFT or PC) for Mitutoyo Digimatic. I can read the bright VFD display, but it struck me that others might find it difficult to read from across a workbench.
The proper term for calipers, for me, is Mitutoyo. I really want one of the solar models.
Why tho? The coin cell battery lasts years.
I loathe disposable batteries.
Article states +/- 0.6mm
My dad was a machinist so he always had calipers. I never thought I needed a set until I bought a cheap pair for under $10. I use it all the time.
The only problem? I'm left handed so I either end up using it upside down or trying it with my left hand and then switching to my right. They make left handed calipers but the cheapest I've found are over $40.
I think I'll keep fumbling with the cheap ones.
Same (except the left-handed challenges) - I have a pair that were probably <€10 on AliExpress, are good enough, and get used all the time.
I wonder if this technique could be used in typical stepper-motor-controlled, "step counting" 3D printers and CNC mills, to provide closed-loop, servo-like control? Have real-time micrometer-precision measurements of your x, y, and z axes, and feed that back into your control loops?
This is basically how Digital ReadOuts (DROs) work on manual milling machines, and I'd assume they use the same thing on CNC machines as well.
From reading the headline, I thought this'd be about brakes and was really confused.
TLDR: <0.02mm should be possible w/open source using cheap fabs.
Interesting project. The hardware guy earlier built a rotary encoder and a vape pen. I am no metrologist (though by chance I once worked for the UK guy who brought Hexagon to China and made bank), but this looks overall like quite a complex scheme that was probably referenced from an existing implementation. These days you can get 0.10mm track/space ("4 mil") or 0.09mm ("3.5 mil") from JLC on 2-layer/4+-layer boards. With flex PCBs you can get still smaller pitch ("3 mil"). Combining a few rows of these with basic multi-track rotary encoder theory should get you down to roughly 0.01-0.02mm.
This back-of-the-envelope calculation aligns well with my Mitutoyo's test report, which states the maximum permissible error is 0.04mm @ 5mm diameter, 0.02mm @ 0-200mm, and 0.03mm @ 300mm. Indicated errors on the test report are all in the range of 0mm-0.02mm except the inside radius, which is 0.03mm. This is a standard high-grade caliper level of precision.
In practice, achieving these levels is going to require machining high-grade steels and mounting them with a high degree of parallelism, not simply working out the electronics.
See also: https://www.eevblog.com/forum/projects/absolute-capacitive-r... (see animation, GC7626C datasheet) https://github.com/littleboot/ACRE
> TLDR: <0.02mm should be possible w/open source using cheap fabs.
What does that have to do with a Fab? Isn’t a fab the thing that creates ICs? PCBs aren’t made in a fab.
Fab is simply short for fabricate or fabrication which is a term commonly used in all manner of manufacturing including circuitboards. An external party that provides manufacturing is often referred to as a 'fabrication partner' or 'fabricator', shortened to 'fab'.
https://en.wiktionary.org/wiki/fab#Noun https://en.wiktionary.org/wiki/fab_lab#English https://en.wiktionary.org/wiki/fabricator#English and if you really want to understand the English culture don't miss https://en.wikipedia.org/wiki/Absolutely_Fabulous ;)
Chips are made in a semiconductor foundry / chip fab.
Printed circuit boards are made at a printed circuit board manufacturer / "fab house."
Interesting, I always naively assumed that those cheap calipers measure distance mechanically, with something like a wheel and an encoder. The actual method is much cleverer.
Also, nitpick: „ I’m stuck in the local optima of …“ should be „optimum“.
> Rust’s formal compile-time machinery doesn’t support trigonometry
C++20 can do this in a consteval. It is a godsend for embedded table generation.
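For anyone curious, a minimal sketch of what that looks like (my own illustration, not the parent's code or the article's; the Taylor-series helper and the table size of 256 are made up for the example). std::sin isn't guaranteed constexpr before C++26, so you supply a small series yourself and let consteval force the whole table to be computed at build time:

    #include <array>
    #include <cstddef>

    // Compile-time sine via a short Taylor series, after reducing x to
    // [-pi, pi] so the series converges quickly.
    consteval double taylor_sin(double x) {
        constexpr double pi = 3.141592653589793;
        while (x >  pi) x -= 2.0 * pi;
        while (x < -pi) x += 2.0 * pi;
        double term = x, sum = x;
        for (int n = 1; n <= 10; ++n) {
            term *= -x * x / ((2.0 * n) * (2.0 * n + 1.0));
            sum  += term;
        }
        return sum;
    }

    // Build a whole lookup table at compile time; it ends up in flash/rodata
    // with no startup-time initialization code.
    template <std::size_t N>
    consteval std::array<double, N> make_sine_table() {
        std::array<double, N> t{};
        for (std::size_t i = 0; i < N; ++i) {
            t[i] = taylor_sin(6.283185307179586 * static_cast<double>(i) / N);
        }
        return t;
    }

    constexpr auto kSineTable = make_sine_table<256>();

On a small MCU that's the appeal: the table costs nothing at runtime and nothing in startup code.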
Long ago, as a member of Pumping Station One in Chicago, I had a fellow member cut out a series of 0.1" fingers spaced on 0.2" centers, and stuck them on a long piece of aluminum stock, and was able to use a Harbor Freight caliper to read the aluminum as a scale without further modification. It really doesn't take much more than an accurate set of interdigital electrodes to make a working scale.
> Have you ever wished for a 500 Hz, millimeter-precise linear position sensing system
Kind of, but I'd like 0.01mm precision, please. It can be just a few Hz; I don't need 500 Hz.
Great project though!
I'd be super curious to see how the accuracy changes with averaging/low-pass filtering the measurements. Precision usually improves roughly with sqrt(N) when you average N samples (assuming the noise is uncorrelated), so your higher-precision wish might just be a bit of code to write.
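For what it's worth, here's a rough sketch of two ways to do that (generic code, not the project's firmware; the names and the N=100 figure are just for illustration). If the noise really is white, averaging N readings shrinks its standard deviation by about sqrt(N), so 0.6mm of phase noise would become roughly 0.06mm at N=100, at the cost of dividing the effective update rate by N. Drift and 1/f noise won't average away the same way, though.

    #include <cstddef>

    // Block average: N raw readings -> one output. With white noise of
    // standard deviation sigma, the output has roughly sigma / sqrt(N).
    template <std::size_t N>
    double block_average(const double (&samples)[N]) {
        double sum = 0.0;
        for (double s : samples) sum += s;
        return sum / static_cast<double>(N);
    }

    // Streaming alternative: single-pole IIR (exponential moving average).
    // Smaller alpha = more smoothing, but slower response to real motion.
    struct Ema {
        double alpha = 0.05;
        double state = 0.0;
        double update(double x) {
            state += alpha * (x - state);
            return state;
        }
    };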
The other side of it, though, is that you're starting to get down into the "everything needs to be temperature controlled" region as you squeeze that precision number. FR-4 and copper have thermal expansion coefficients around 15-20 ppm/°C. If I'm doing this mental math correctly, a 5°C temperature rise would make a 1m long piece of FR-4 expand by about 0.1mm, or a 10cm piece expand by about 0.01mm.
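Spelling that mental math out (just standard linear thermal expansion, nothing project-specific):

    \Delta L = \alpha \, L \, \Delta T \approx 20\,\mathrm{ppm/^\circ C} \times 0.1\,\mathrm{m} \times 5\,\mathrm{^\circ C} = 10\,\mu\mathrm{m} = 0.01\,\mathrm{mm}

and the same 5°C rise over a 1m length gives about 0.1mm, matching the figures above.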
One time I wanted to demonstrate thermal expansion to my kid (1st grade or so) so I made some marks with a steel ruler and put it in the freezer. Imagine my surprise when we took it out and there was no perceptible difference :-D
did you measure the freezer after? It expanded :)
With some microcontrollers you can do this "averaging out" just by changing ADC parameters, you don't even have to write the low-pass-filter code.
That is specifically sigma-delta type ADCs: https://www.analog.com/en/resources/technical-articles/sigma...
No, several MCUs, like a lot of STM32s, support hardware oversampling with optional division. Here's[1] a document describing it for the L0 and L4 series, which like most of the STM32 series have SAR[2] ADCs.
[1]: https://www.st.com/resource/en/application_note/an4629-adc-h...
[2]: https://en.wikipedia.org/wiki/Successive-approximation_ADC
That at 500Hz is called "optical mouse on absolute positioning patterned surface".
On that note: I'm looking for a mouse style camera sensor unit that can export full frame rate raw to a system where I can actually decode such an absolute positioning code.
Anyone got something in the sub-$100 range?
The DIY Drones and ArduPilot communities were using mouse sensors to do "optical flow sensing" for accurate low-altitude position hold around 10 or so years back.
https://ardupilot.org/copter/docs/common-mouse-based-optical...
https://github.com/RCmags/ADNS3080_frame_capture
https://www.pixelelectric.com/sensors/distance-vision/adns-3...
I once looked into mouse sensors for a similar purpose, but the raw frame export capabilities were disappointing. The datasheet of the PMW3360 claims up to 12000 FPS, but the timing sequence for a raw frame dump suggests ~20ms, if not 40ms, to transfer a whole frame, which would give only 25-50 FPS. That doesn't necessarily mean the main spec is lying; the sensor just isn't capable of dumping frames over SPI as fast as it produces and processes them internally. Frame dumping seems to be more of a debug/test feature, so it's no big surprise it can't fully utilize the sensor, and there is little motivation for the sensor manufacturer to improve it. But there is also a chance the manufacturer put very conservative timing for it in the datasheet, since it's not the main functionality of the sensor.
If anyone has explored further how much the timing can be optimized, or knows of other mouse sensors with better frame-dump capabilities, I am curious to hear about it as well.
With that said, even if you remove all the unnecessary delays, data transfer over SPI will likely be the limiting factor for very high framerates. Assuming 4MHz SPI and a 32x32 8bpp image, that would be at most ~500 FPS. A few hundred FPS would still be nice, and you might push it a bit higher by increasing the clock, but it's probably not possible to go much beyond 1000 FPS without MIPI or some other parallel data transfer protocol.
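For reference, the arithmetic behind that ceiling (raw payload only; register addressing and inter-byte gaps only make it worse):

    \frac{4\ \mathrm{Mbit/s}}{32 \times 32 \times 8\ \mathrm{bit/frame}} = \frac{4\,000\,000}{8192} \approx 488\ \mathrm{frames/s}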
The ArduPilot project in the sibling comment seems to be using the mouse sensor's movement output directly, instead of capturing frames and doing the optical flow analysis on the main CPU. The image dump is only used to verify camera focus.
> It can be just a few Hz, I don't need 500 Hz.
TLDR: understood that you think you don't need 500 Hz, but there is a technical reason these systems need a sample rate that high to work.
The reason these measurements need to be so fast is that the measurement is not absolute but periodic. That means that when you measure something, it can't tell you how large it is in absolute terms, only how much larger it is than the nearest whole number of periods. In mathematical notation, you measure x where the whole distance is k*period + x, k is an unknown integer, and the period is a design parameter. The sensor can't natively tell you the value of k.
So to recover the whole measurement you need to know k, and you can do that by keeping track of it. You know it is zero when you zero out the caliper, and you know it just increased by one every time there is a falling discontinuity in x. (Meaning that every time x approaches the period and then suddenly drops to a low number, it just crossed one of these period boundaries.) Similarly, you decrease k every time x has a positive discontinuity.
And you can only do that trick if the sampling rate is much higher than the speed at which the caliper is moved. If you sample too slowly, the caliper might move multiple whole periods between two samples and the code would lose track of the value of k.
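A minimal sketch of that bookkeeping (my own illustration of the general phase-unwrapping idea, not the project's actual code; the half-period threshold is the usual heuristic and it only works if the slide moves less than half a period between samples, which is exactly why the sample rate has to be high):

    // x is the wrapped measurement in [0, period); k counts whole periods.
    struct Unwrapper {
        double period;
        double last_x = 0.0;
        long   k      = 0;   // reset to 0 when the user zeroes the caliper

        double update(double x) {
            double dx = x - last_x;
            if (dx < -period / 2.0) {
                k += 1;          // falling discontinuity: crossed a boundary upward
            } else if (dx > period / 2.0) {
                k -= 1;          // positive discontinuity: crossed downward
            }
            last_x = x;
            return k * period + x;   // absolute position
        }
    };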
Thanks, appreciated! Yes, I took the Hz as meaning the rate at which I see an accurate measurement on my screen.