This is a useful bit of knowledge, but to guard against people making irrational judgments about Accenture, it's important to note that (1) it was primarily Andersen's consulting divisions that became Accenture, and (2) these large auditing firms have offices all over the US and the world, and in cases like this, only a branch or two were actually complicit in the fraud.
In the same way that we shouldn't condemn the employees of <INSERT TECH COMPANY> because senior leaders decided to <Censor/Abuse/Manipulate users>, we shouldn't condemn otherwise ethical accountants for the misdeeds of their colleagues, especially when accountants must pass more stringent ethical requirements than developers do.
Ironically, people couldn't differentiate the isolated incident from the firm as a whole, and AA was liquidated/sold off because no one wanted to do business with them. [0]
"Ironically, people couldn't differentiate the isolated incident, and AA liquidated/sold because no one wanted to do business with them."
"Ironically" is not the word you should be looking for, there. "Fittingly", or "Unsurprisingly", perhaps. Even if you know not all the apples in the barrel were bad, you know that it was a barrel with more than one bad apple, so you throw that barrel out.
Ironically, people often dismiss the scale of a problem by saying it was limited to a few "bad apples", when the point of the apple/barrel metaphor is that one bad apple spoils the whole bunch; a barrel with one bad apple in it is just as bad as a barrel where every apple is rotten.
The tech examples will often involve employees directly responsible for building, improving and maintaining the systems of abuse that are central to the ill deeds of the senior leaders you're referring to. The senior leaders can't do what they do without those systems.
This concept is fundamentally why 4,000 Google employees staged a protest against Google working on AI systems for the military. They have an inkling of what such systems will be used for.
So it raises the obvious question of how complicit you are if you build software systems that you know are going to be used for immoral things, when you know ahead of time how they're going to be used by said senior leadership. To say nothing of the fact that such systems are often built for the sole purpose of enabling abuse, so there's very little question about the line of moral responsibility (whether in invading privacy or in aiding censorship in authoritarian nations). This obviously isn't a new debate within tech, though; it goes all the way back in the industry (e.g., IBM's tabulating machines).
> how complicit you are if you build software systems that you know are going to be used for immoral things
There's another path besides simply not building and not participating. It might be possible to be subversive and design these systems to best fit your values:
"My bias was always to build decentralization into the net. That way it would be hard for one group to gain control. I didn’t trust large central organizations. It was just in my nature to distrust them." -- Robert Taylor
[0] en.wikipedia.org/wiki/Arthur_Andersen#Demise