I have a question about scanner speed. As I understand it, scanner speed is commonly rated as the maximum number of points per second at which a properly tuned scanner can track the standard ILDA test pattern, at a specified scan angle, without significant distortion. Feel free to correct me!
As an electronics guy, I'm used to more generic measures like rise time: the time it takes a device to slew from 10% to 90% of its full range.
What is the typical rise time for a middle-of-the-road 30K scanner? Does 30K (or any other speed) translate to a specific number of degrees (or percentage of total span) per millisecond?
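For scale, here's the only back-of-envelope arithmetic I can do so far, assuming "30K" simply means 30,000 points per second (the angular part is exactly what I don't know how to pin down):

```python
# Rough arithmetic only -- assumes "30K" means 30,000 points per second.
# The angle swept per point depends on scan angle and pattern, which is
# the part of the question I can't answer myself.
POINTS_PER_SECOND = 30_000

point_period_us = 1e6 / POINTS_PER_SECOND   # time budget per point, microseconds
points_per_ms = POINTS_PER_SECOND / 1000    # points drawn each millisecond

print(f"{point_period_us:.1f} us per point")   # ~33.3 us
print(f"{points_per_ms:.0f} points per ms")    # 30 points
```

So a 30K scanner has about 33 microseconds per point, but how that maps to degrees swept is what I'm hoping someone can explain.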
Dean