I believe he talked about starting work on AGI at the time he went part-time at Meta, long before GPT-3.
I encourage anyone to try out some AI-related stuff; genius IQ not required. It's still a young field, so there aren't yet huge towers of knowledge to climb before you can do anything. The core ideas are actually super simple, requiring nothing more than high school math.
The hardware is fairly accessible. You can start for free with Colab, try a subscription for $9.99/mo, or use the gaming PC you might already have. The hardest thing is data, but again there are lots of free datasets available, as well as pretrained models you can fine-tune on a smaller custom dataset you make yourself.
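For a rough idea of what that fine-tuning route looks like, here's a sketch using torchvision (the data/train folder layout and the numbers are placeholders I picked for illustration, not a recipe):

    # Sketch: fine-tune a pretrained ResNet on your own small image dataset.
    # Assumes images live in data/train/<class_name>/*.jpg (hypothetical layout).
    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    tfms = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406],   # ImageNet stats,
                             [0.229, 0.224, 0.225]),  # since the backbone was trained on it
    ])
    train_ds = datasets.ImageFolder("data/train", transform=tfms)
    train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # pretrained backbone
    for p in model.parameters():
        p.requires_grad = False                                       # freeze it
    model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes)) # new head for your classes

    opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(3):               # a few epochs often suffices on a small dataset
        for x, y in train_dl:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

Because only the new final layer is trained, this runs fine on a single consumer GPU or even CPU for small datasets.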
Andrew Ng's online courses are great to get your feet wet. You use Octave/MATLAB to implement the basics of many machine learning models from scratch, then build up to using Python to design several popular deep learning architectures, including convolutional networks and transformers. It's not required, but it's a good idea to understand at least the basics of linear algebra and calculus.
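The from-scratch exercises really are at the "high school math" level. For example, fitting a line by gradient descent, the kind of thing those courses start with (my own quick sketch in Python/NumPy, not course code):

    # Linear regression by gradient descent: nudge w and b downhill on the squared error.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=100)
    y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)   # noisy line we want to recover

    w, b, lr = 0.0, 0.0, 0.01
    for _ in range(1000):
        err = (w * X + b) - y
        # gradients of mean squared error with respect to w and b
        w -= lr * 2 * np.mean(err * X)
        b -= lr * 2 * np.mean(err)

    print(w, b)   # lands near the true 3 and 2

Everything fancier is layers of the same idea: compute an error, follow its slope downhill.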
Interesting. To be honest I really appreciated how they started with Matlab; it gave a very math-centric focus to the fundamentals, although of course you can do all of that with Python too. And I say this as a professional developer.
FastAI gets recommended a lot, I think, if you can already code; it focuses on hacking with frameworks instead of starting with the boring linear algebra stuff.
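If I remember the fastai docs right, their opening example is only a handful of lines; a cat-vs-dog classifier on the Pets dataset looks roughly like this (sketch from memory, treat details as approximate):

    from fastai.vision.all import *

    path = untar_data(URLs.PETS) / "images"

    def is_cat(x):            # in this dataset, cat breeds have capitalized filenames
        return x[0].isupper()

    dls = ImageDataLoaders.from_name_func(
        path, get_image_files(path), valid_pct=0.2, seed=42,
        label_func=is_cat, item_tfms=Resize(224))

    learn = vision_learner(dls, resnet34, metrics=error_rate)
    learn.fine_tune(1)        # train the new head, then briefly fine-tune the whole network

You get a working image classifier first, and dig into the underlying math later if you want to.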