Genuinely curious: is this even still relevant today? I've got the impression that there were a lot of these elaborate techniques and algorithms up until around 2016, some of which I even learned, and which were subsequently replaced by some single NN model trained somewhere at Facebook that you maybe fine-tune for your specific task. So it has all become boring, and learning these techniques today is akin to learning the abacus, or at best finding antiderivatives by hand.
That’s a great question. While NNs are revolutionary, they’re just one tool. In industrial Machine Vision, tasks like measurement, counting, code reading, and pattern matching often don’t need NNs.
In fact, illumination and hardware setup are often more important than complex algorithms. Classical techniques remain highly relevant, especially when speed and accuracy are critical.
And usually you need determinism within tight bounds. The only way to get that with an NN is to have a more classical algorithm verify the NN's solution, using boring things like least-squares fits and statistics on the residuals. Once that is in place, you can often skip the NN entirely, and you're done. That's my experience.
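To make that concrete, here's a rough sketch of the kind of check I mean: fit a line to the points a detector reports and gate on the residual scatter. The point data, the function names, and the 0.05 tolerance are all made up for illustration; in practice the tolerance comes from your process spec.

    /* Sketch: verify a detected edge by fitting a line y = a + b*x to the
     * candidate points and checking the residual statistics.
     * Names and the acceptance threshold are invented for illustration. */
    #include <math.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Ordinary least-squares fit of y = a + b*x; returns 0 on success. */
    static int fit_line(const double *x, const double *y, size_t n,
                        double *a, double *b)
    {
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (size_t i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        double denom = n * sxx - sx * sx;
        if (denom == 0.0)
            return -1;                 /* all x identical: no unique fit */
        *b = (n * sxy - sx * sy) / denom;
        *a = (sy - *b * sx) / n;
        return 0;
    }

    /* RMS of residuals: how far the points scatter around the fitted line. */
    static double rms_residual(const double *x, const double *y, size_t n,
                               double a, double b)
    {
        double ss = 0;
        for (size_t i = 0; i < n; i++) {
            double r = y[i] - (a + b * x[i]);
            ss += r * r;
        }
        return sqrt(ss / n);
    }

    int main(void)
    {
        /* Hypothetical edge points reported by some detector (NN or classical). */
        double x[] = {0, 1, 2, 3, 4, 5};
        double y[] = {0.02, 1.01, 1.98, 3.03, 3.99, 5.01};
        size_t n = sizeof x / sizeof x[0];
        double a, b;
        if (fit_line(x, y, n, &a, &b) != 0)
            return 1;
        double rms = rms_residual(x, y, n, a, b);
        /* Deterministic accept/reject: the tolerance is application-specific. */
        printf("fit y = %.3f + %.3f*x, rms residual = %.4f -> %s\n",
               a, b, rms, rms < 0.05 ? "ACCEPT" : "REJECT");
        return 0;
    }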
Those NN models are monstrosities that eat cycles (and watts). If your task fits neatly into one of the algorithms presented (as may be the case in industrial design automation settings), then yes, you are most definitely better off using them instead of a neural-net-based solution.
If your problem is well suited to “computer vision” without neural nets, these methods are a godsend. Some of them can even be implemented with ultra-low latency on RTOS MCUs, which is great for real-time control of physical actuators.
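For a feel of how small these can get, here's a sketch of one such classical operation that fits comfortably on an MCU: integer-only threshold plus centroid over a small grayscale frame, no floats, no heap, fixed worst-case runtime. The frame size, the threshold, and the names are invented for the example.

    /* Sketch: integer-only bright-blob centroid, the kind of bounded,
     * deterministic loop that runs fine under an RTOS on a small MCU.
     * W, H, and the threshold value are made-up example parameters. */
    #include <stdint.h>
    #include <stdio.h>

    #define W 64
    #define H 48

    /* Returns 1 and writes the blob centroid (in pixels) if any pixel passes
     * the threshold, 0 otherwise. O(W*H) integer adds, no allocation. */
    static int centroid_u8(const uint8_t *img, uint8_t thresh,
                           uint16_t *cx, uint16_t *cy)
    {
        uint32_t sum_x = 0, sum_y = 0, count = 0;
        for (uint16_t y = 0; y < H; y++) {
            for (uint16_t x = 0; x < W; x++) {
                if (img[(uint32_t)y * W + x] >= thresh) {
                    sum_x += x;
                    sum_y += y;
                    count++;
                }
            }
        }
        if (count == 0)
            return 0;
        *cx = (uint16_t)(sum_x / count);
        *cy = (uint16_t)(sum_y / count);
        return 1;
    }

    int main(void)
    {
        static uint8_t frame[W * H];       /* pretend this came from a sensor */
        frame[10 * W + 20] = 200;
        frame[10 * W + 21] = 210;
        frame[11 * W + 20] = 220;

        uint16_t cx, cy;
        if (centroid_u8(frame, 128, &cx, &cy))
            printf("bright blob at (%u, %u)\n",
                   (unsigned)cx, (unsigned)cy);   /* -> (20, 10) */
        return 0;
    }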