Call order and count assertions are important for non-idempotent calls (message delivery, for example) and in complex integration tests.
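For example, something along these lines with Mockito (the MessageSender interface and names are purely illustrative):

```java
import static org.mockito.Mockito.*;

import org.junit.jupiter.api.Test;
import org.mockito.InOrder;

class DeliveryTest {
    // Illustrative interface; stands in for whatever actually sends messages.
    interface MessageSender {
        void send(String recipient, String body);
    }

    @Test
    void deliversWelcomeThenReceiptExactlyOnceEach() {
        MessageSender sender = mock(MessageSender.class);

        // ... exercise the code under test here; simulated directly for brevity ...
        sender.send("alice", "welcome");
        sender.send("alice", "receipt");

        // Count assertions: a non-idempotent send must happen exactly once.
        verify(sender, times(1)).send("alice", "welcome");
        verify(sender, times(1)).send("alice", "receipt");

        // Order assertion: the welcome must go out before the receipt.
        InOrder inOrder = inOrder(sender);
        inOrder.verify(sender).send("alice", "welcome");
        inOrder.verify(sender).send("alice", "receipt");

        // And nothing else was delivered.
        verifyNoMoreInteractions(sender);
    }
}
```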
While I respect your opinion, it misses three realities. Firstly, you can't generalise most complex systems enough for this to be a viable solution universally. Secondly, the skill needed to implement this for every case just isn't on the market, even if you could specify it. Thirdly, solving the first two problems costs more.
Data flow moves all the test responsibilities into integration testing, which is much harder!
It's a careful thing to balance, but I'm currently on the side of mocks and control-flow-based systems. They are cost-effective and scale well in practice (I think we're at about 4 million lines of code and tens of thousands of classes now).
Complex systems only become tractable when modularized, and simple module boundaries are preferable to chatty APIs; the nature of the system may make it more or less worthwhile, I'll agree.
Building crappier software because that's all you can build with the people you can find seems to me to be putting the cart before the horse. And I think you easily end up with crappier, less flexible software if you blindly follow Java norms. The tendency toward over-specified tests increases the cost of writing test code and the cost of refactoring (the tests usually need to change in some proportion to the implementation, even if the behaviour is no different), and, even worse, your modules are, by default, not recomposable. When things are built out of data flow instead of control flow, you get a bunch of power almost for free.
Parallel and distributed processing, logging and system introspection, up to and including new product development using known good components. The business agility is especially important.
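Roughly what I mean, in Java terms, is a toy sketch like the one below (the stage names and domain are made up):

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

// Toy data-flow stages: each is a pure function over values, so it can be
// unit-tested with plain equality assertions, recombined into new pipelines,
// run in parallel, or wrapped with logging, without any mocks.
final class Pipeline {
    record Order(String sku, int quantity, int unitCents) {}
    record PricedOrder(String sku, int totalCents) {}

    static PricedOrder price(Order o) {
        return new PricedOrder(o.sku(), o.quantity() * o.unitCents());
    }

    static PricedOrder discount(PricedOrder p, double factor) {
        return new PricedOrder(p.sku(), (int) Math.round(p.totalCents() * factor));
    }

    // Recomposition is just function composition; parallelism and
    // introspection come along almost for free via the stream machinery.
    static List<PricedOrder> run(List<Order> orders) {
        Function<Order, PricedOrder> stage = o -> discount(price(o), 0.9);
        return orders.parallelStream()
                .map(stage)
                .peek(p -> System.out.println("priced " + p)) // introspection point
                .collect(Collectors.toList());
    }
}
```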
I've seen codebases calcified by "long land borders" between modules, with value locked in by the architecture, requiring risky and costly refactoring, rewrites, or monotonically increasing complexity to try to repurpose that value. A system that looks more like Lego than a series of custom-crafted blocks is more valuable, even if it's harder to build.
The opportunity cost obviously depends heavily on the domain. For an internal corporate IT project, I can certainly see that the upside isn't as clear.