People tend to bloviate a bit about how the first person or group to build and train a "seed AGI" would have "godlike" power, but they forget that, given godlike power, there is no reason to be selfish or psychopathic. Most human selfishness and greed come from the incentive gradients and competition traps of the systems we have to survive in. Once you have nonhuman but human-equivalent-or-greater intelligence directed towards human goals, you're beyond competing for survival, and have no incentive not to direct the AGI cooperatively and altruistically.
This is a choice we can make, as a profession, as a community, and as a species. There is no point in letting short-sighted competitive anxieties destroy such potential for good.
I wish I could be as optimistic. But AGIs are physical beings with computational limitations, and they run on energy. The Sun can only produce so much of it in a given period of time. It's not hard to imagine a competition trap involving AGIs, each trying to grab as many resources as possible to increase its own capacity and crush the competition.
I would hope artificial agents are intelligent enough to realize that erasing their own utility functions in a scramble for resources has very low utility! Cooperation is a high expected-utility strategy for the average agent, which is why it evolved in the first place (a toy illustration is sketched below).
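To make that expected-utility claim concrete, here is a minimal, purely illustrative sketch (not part of the original exchange): an iterated prisoner's dilemma with standard textbook payoffs, in which a reciprocating strategy ends up with a higher total score than unconditional defection once interactions repeat. The strategy names, round count, and payoff values are assumptions chosen only for illustration.

```python
# Toy iterated prisoner's dilemma: illustrative only, with standard textbook payoffs.

PAYOFFS = {  # (my move, their move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate on the first move, then mirror the opponent's last move."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """Defect unconditionally."""
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    """Play a repeated game and return (score_a, score_b)."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        # Each history entry is (own move, opponent's move).
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

if __name__ == "__main__":
    print("TFT vs TFT:      ", play(tit_for_tat, tit_for_tat))      # (600, 600)
    print("Defect vs Defect:", play(always_defect, always_defect))  # (200, 200)
    print("TFT vs Defect:   ", play(tit_for_tat, always_defect))    # (199, 204)
```

Under these assumptions, mutual reciprocators earn far more than mutual defectors, which is the sense in which cooperation is the higher expected-utility strategy when agents interact repeatedly rather than in one-shot grabs.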