I've wasted so much time trying to get LLMs to help me code. One issue I have is that I can never seem to get the AI to say the word "no". No matter what I ask, it will say "Absolutely! You can solve [impossible problem] like so...". At this point I basically use them as documentation search engines, asking things like "does this library have a function to do X?". Gemini and DeepSeek seem to be good enough at that.
I've entirely given up on using LLMs for exploratory exercises.
Or the old loop: ask a question Q1, get a wrong answer A1, explain why it's wrong in Q2, get an answer A2 that hyper-focuses on Q2 and misses important parts of Q1, restate Q1 only to get A1 again, repeat ad nauseam.