Transport / Sprocket sensors - Optical Flow Sensors (AKA Mouse Sensors)

Buried in another discussion was the possibility of using optical flow sensors to sense film transport, and alternatively to detect sprockets.

The initial concern was that an optical flow sensor (OFS) would have to contact the film in order to detect sprockets. But I later found other OFS variants that use optics to track the surface without contact.

The ADNS3080 specifications indicate it is capable of 6400 frames per second (fps) and of tracking surface motion of up to 40 inches per second.
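To get a feel for what those numbers mean for film transport, here is a quick back-of-envelope sketch. It assumes the datasheet figures quoted above plus the ADNS-3080's 400 counts-per-inch default resolution (it is also configurable to 1600 cpi); the poll rate is an arbitrary assumption, not anything Kinograph-specific.

```python
# Rough conversion of ADNS-3080 motion counts to film speed/position.
# Assumptions (hedged): 400 cpi default resolution, 8-bit signed delta
# registers (max +/-127 counts per read), 100 Hz polling chosen arbitrarily.

CPI = 400          # counts per inch (assumed default resolution)
POLL_HZ = 100      # how often we read the accumulated delta registers

def film_speed_ips(delta_counts: int, poll_hz: float = POLL_HZ, cpi: int = CPI) -> float:
    """Convert a per-poll motion delta (in counts) to film speed in inches/second."""
    return delta_counts * poll_hz / cpi

def displacement_inches(total_counts: int, cpi: int = CPI) -> float:
    """Convert accumulated counts to linear film travel in inches."""
    return total_counts / cpi

def min_poll_hz(max_ips: float = 40.0, cpi: int = CPI, max_delta: int = 127) -> float:
    """Minimum poll rate needed so the signed 8-bit delta register never overflows
    at the sensor's rated maximum tracking speed."""
    return max_ips * cpi / max_delta

# Example: 40 counts in a 10 ms poll window at 400 cpi is 10 in/s of film motion.
speed = film_speed_ips(40)   # -> 10.0
```

One takeaway from the sketch: at 40 in/s and 400 cpi the sensor produces 16,000 counts/s, so with ±127-count delta registers the host would need to poll at roughly 126 Hz or faster to avoid losing motion, which is well within reach of any microcontroller.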

Here is an interesting video combining an OFS with a laser for robotics… I think these are promising ingredients for an alternative approach to sprocket sensing.

Short term I do not have the bandwidth to prototype one, so the purpose of this topic is to explore alternatives and the feasibility of using these. I'm seeking feedback in case anyone has experimented with them.


Here’s a hack of the optical scanner in a mouse:
http://spritesmods.com/?art=mouseeye

And some early information from a decade ago:
https://web.archive.org/web/20090817163528/http://www.martijnthe.nl/2009/07/interfacing-an-optical-mouse-sensor-to-your-arduino/

Thanks for the info. I think this could be an interesting direction for Kinograph if we can get it to work. Would either you or @PM490 be interested in building a test? With a small loop of film and one motor you could find out if this is a viable solution, I’d guess.

I think the problem with using an optical perf sensor is how you handle different gauges. If the scanner has to do both 16mm and 35mm, the perfs are in totally different positions. Our old Imagica 3000V had perf sensors and they worked great. They were Keyence red laser proximity sensors, but that scanner was 35mm only, so it was easy to mount them in a fixed position and tune them to look for the perfs.

The other issue with using sensors is what happens when you hit a section of broken or missing perfs? A better method is an encoder wheel that measures how far the film has moved across a capstan or PTR roller. That gives you an approximate idea of where you are (not accounting for shrinkage). Then the image can be taken and aligned in software. Shrinkage can be calculated to adjust the distance the film moves between exposures based on what you learn from the registration algorithm with each frame. Since shrinkage can vary in a reel, this needs to be an ongoing process. But at least if you have broken perfs, you have a backup positioning system from the encoder.
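The encoder-plus-registration loop described above can be sketched roughly as follows. All the names and numbers here are illustrative assumptions (encoder resolution, roller circumference, smoothing factor), not actual Kinograph or Imagica code; the only fixed constant is the nominal 35mm 4-perf frame pitch of 19.00 mm (0.7480 in).

```python
# Sketch of encoder-based frame advance with an ongoing shrinkage estimate.
# Idea: advance by a nominal frame pitch measured at a capstan/PTR roller
# encoder, then let the software registration step report the pitch it
# actually observed, and blend that into a running shrinkage factor.

NOMINAL_PITCH_IN = 0.7480     # 35mm 4-perf frame pitch (19.00 mm)
TICKS_PER_REV = 2048          # assumed encoder resolution
ROLLER_CIRC_IN = 3.0          # assumed roller circumference

IN_PER_TICK = ROLLER_CIRC_IN / TICKS_PER_REV

class ShrinkageTracker:
    """Maintains a running shrinkage factor from per-frame registration feedback."""

    def __init__(self, alpha: float = 0.1):
        self.factor = 1.0      # 1.0 = no shrinkage
        self.alpha = alpha     # smoothing weight for new observations

    def ticks_for_next_frame(self) -> int:
        """Encoder ticks to advance for one shrinkage-adjusted frame."""
        return round(NOMINAL_PITCH_IN * self.factor / IN_PER_TICK)

    def update(self, measured_pitch_in: float) -> None:
        """Blend in the frame pitch observed by the registration algorithm.

        Because shrinkage can vary within a reel, this runs every frame
        rather than being calibrated once at the start.
        """
        observed = measured_pitch_in / NOMINAL_PITCH_IN
        self.factor = (1 - self.alpha) * self.factor + self.alpha * observed
```

The advance distance comes from the encoder regardless of perf condition, so broken or missing perfs only degrade the registration feedback for that frame, not the positioning itself, which matches the "backup positioning system" point above.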

As long as the gate is big enough to allow for some slop, this should work fine.

Totally agree @friolator. I think in this case the sensor would be placed before/after the gate and positioned just above the middle of the film, so that it sees a constant surface, similar to how a mouse does.

We decided not to go this route for now since we didn’t have enough evidence that it would work at higher speeds and did not have time to test it.

I would love it if someone did a test with a small loop of film, though!