I'll add another thought here - what I often really want is a custom LLM like GPT, but trained on a particular language or framework or topic. I would love to go to a website for a new language and be able to talk about its documentation and ask questions of an LLM to help me understand it. Huge bonus points if it was trained on real-world code examples of that language or framework and I could have it help me write a new program or function right there. More bonus points if it's tied in with an online REPL where it can help me right inline.
Ease of retraining/refinement is something I'm really hoping for.
There are an endless number of projects to make a "cleaner, revised X", where the coding itself is rote and has already been done at some point; it's just shoved into slightly different semantics that will be a bit more optimal or secure or configurable. It feels like something an LLM is "tip of the tongue" capable of, and in the more trivial cases you really can tell GPT to "rewrite this from JS to Python" and it works. But it's limited to interpolating what's in the training set, when what you really want is "port all these standard libraries to my experimental language, and also make a build system for them".
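For concreteness, here's a minimal sketch of what that "rewrite this from JS to Python" request looks like as a plain API call - assuming the OpenAI Python client and treating the model name as a placeholder, not a recommendation:

    from openai import OpenAI

    # Assumes OPENAI_API_KEY is set in the environment.
    client = OpenAI()

    # A small JS function to translate; any snippet works here.
    js_snippet = """
    function median(xs) {
      const s = [...xs].sort((a, b) => a - b);
      const m = Math.floor(s.length / 2);
      return s.length % 2 ? s[m] : (s[m - 1] + s[m]) / 2;
    }
    """

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "user", "content": "Rewrite this from JS to Python:\n" + js_snippet},
        ],
    )

    # The translated Python comes back as plain text in the reply.
    print(response.choices[0].message.content)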