"How surprising!", said nobody.
The number of Copilots is getting out of hand; can people even keep track of all of them?
The Copilot autocomplete and Copilot chat are great QoL tools, but trying to have it generate a complete PR for a given issue seems... optimistic.
I'm surprised Matthew concludes that it might actually be useful for some people, honestly.
yeah, who knows - some people would rather waste time fighting a language model than actually learning things...
it's too early for something like this imo, but maybe in a few years it'll serve a purpose
Hmm, interesting idea: asking the user to provide the overall intent by having them edit/write the plan, since the LLM can't do intent.
But man, we've recently had a number of programming beginners join us in our relatively large codebase, and I've basically had to write such a plan, i.e. step-by-step instructions, for them many times.
It just means that I go through the whole codebase and have to think everything through, without doing it myself.
It often took similarly long to do that and formulate the instructions as it would have taken me to write the code myself. Since we're obviously using a high-level programming language, there aren't many detail problems that are easier to describe in natural language than in code.
It's also incredibly difficult to provide correct instructions that way, since I'm not reading the existing code as I go, the way I would if I were writing the code myself.
And reviewing their code afterwards to figure out what came out of it ties up even more time.
So, yeah, it really doesn't sound like this LLM thing would save me time either...