I don't see it. Any non-trivial analog computation involves a very large circuit, which has the problems of conventional programming (bugs) and of graphical programming (write-only designs), plus the extra pitfalls of electronics: resistance, propagation delay, and the oscillations that result. And then you have to read out all the outputs. That's going to be slow and expensive to build.
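To make the delay-then-oscillation point concrete, here's a minimal sketch (mine, not from the thread): a toy discrete-time model of a negative-feedback stage that integrates toward a setpoint, but only sees its own output after a propagation delay. The parameter names (`gain`, `delay_steps`) are illustrative assumptions; the point is just that the same gain that settles cleanly with a short delay rings once the delay grows.

```python
import numpy as np

# Toy model of an analog feedback stage: output integrates a correction
# toward a setpoint, but the feedback signal arrives delay_steps late.
# (gain/delay_steps are illustrative, not measured from any real circuit.)
def step_response(gain, delay_steps, n_steps=2000, dt=1e-3):
    y = np.zeros(n_steps)
    setpoint = 1.0
    for n in range(1, n_steps):
        # The stage only "sees" its own output from delay_steps ago.
        delayed = y[max(n - delay_steps, 0)]
        y[n] = y[n - 1] + dt * gain * (setpoint - delayed)
    return y

# Short delay: settles smoothly near 1.0. Long delay at the same gain:
# the loop keeps correcting before the feedback arrives, so it overshoots
# and oscillates (here K*tau = 50 * 0.04s = 2 > pi/2, the stability bound
# for a pure integrator with delay).
stable  = step_response(gain=50, delay_steps=1)
ringing = step_response(gain=50, delay_steps=40)
print("peak output, short delay:", stable.max())   # ~1.0
print("peak output, long delay: ", ringing.max())  # well above 1.0
```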
On what concrete problems do you (or Veritasium) think analog computing could beat a GPU?