"1) your certainty of failure if you go the wrong way, 2) the harm to the business/team given a failure, 3) your role in execution of the task."
Relatedly, I use a very similar analysis when giving tasks to junior engineers. They almost always want to do something that isn't exactly what I would do, for a whole bunch of obvious reasons. Sometimes I need to override them, but it's a complicated analysis that sums up to what you wrote above. I will even sometimes let them do something that I think will lead to negative but recoverable consequences, because the best learning comes from your own mistakes, and to learn from those mistakes you must first be allowed to make them. I just have to make sure the upside of the learning outweighs the possible fallout from the mistakes. And sometimes, I'm wrong about whether it was a mistake or not, of course. :)
An example of something I'll generally override regardless is a security issue, because the consequences of that can get pretty negative. But merely laying out the code with the IO & business logic more tightly coupled than I'd like is something I may let slide, especially if they're going to end up having to write automated testing for it anyhow.
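To make that coupling point concrete, here's a minimal sketch (hypothetical names and logic, not from any real codebase): the coupled version can't be tested without a real file on disk, while the decoupled version keeps the business rule pure and trivially testable, with IO at the edge.

```python
# Tightly coupled: the discount rule can't be tested without a real file.
def total_with_discount_coupled(path):
    with open(path) as f:
        prices = [float(line) for line in f]
    total = sum(prices)
    return total * 0.9 if total > 100 else total

# Decoupled: pure business logic, easy to unit-test in isolation.
def apply_discount(prices):
    total = sum(prices)
    return total * 0.9 if total > 100 else total

# Thin IO wrapper at the edge; nothing here is worth much testing.
def total_with_discount(path):
    with open(path) as f:
        return apply_discount(float(line) for line in f)
```

The tests the junior engineer has to write anyway tend to push them toward the second shape, which is partly why I'm comfortable letting it slide.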
When I do let something go, I generally give a heads up about what negative consequences I expect may transpire, and then let them do their thing. I think that in general, "wisdom" can't be taught in the conventional sense of the term, but I do think you can sensitize people to the existence of certain patterns and accelerate the process of acquiring such wisdom as a result.