You Should Know
YSK - for all the things that can make your life easier!
The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:
Rules (interactive)
Rule 1- All posts must begin with YSK.
All posts must begin with YSK. If you're a Mastodon user, then include YSK after @youshouldknow. This is a community to share tips and tricks that will help you improve your life.
Rule 2- Your post body text must include the reason "Why" YSK:
In your post's text body, you must include the reason "Why" YSK: it's helpful for readability, and informs readers about the importance of the content.
Rule 3- Do not seek mental, medical, or professional help here.
Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.
Rule 4- No self promotion or upvote-farming of any kind.
That's it.
Rule 5- No baiting or sealioning or promoting an agenda.
Posts and comments which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.
Rule 6- Regarding non-YSK posts.
Provided it is about the community itself, you may post non-YSK posts using the [META] tag on your post title.
Rule 7- You can't harass or disturb other members.
If you harass or discriminate against any individual member, you will be removed.
If you are a member, sympathizer, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you were provably vocal about your hate, then you will be banned on sight.
For further explanation, clarification and feedback about this rule, you may follow this link.
Rule 8- All comments should try to stay relevant to their parent content.
Rule 9- Reposts from other platforms are not allowed.
Let everyone have their own content.
Rule 10- The majority of bots aren't allowed to participate here.
Unless included in our Whitelist for Bots, your bot will not be allowed to participate in this community. To have your bot whitelisted, please contact the moderators for a short review.
Partnered Communities:
You can view our partnered communities list by following this link. To partner with our community and be included, you are free to message the moderators or comment on a pinned post.
Community Moderation
To inquire about becoming a moderator of this community, you may comment on the current pinned post, or simply send a message to the current moderators.
Credits
Our icon (a masterpiece) was made by @clen15!
Unpopular opinion: for a beginner, ChatGPT gives way better answers than Stack Overflow users. The advantage of ChatGPT is that I can tell it to dumb things down. Stack Overflow users are used to answering in a language that resembles documentation: dry, abstract, and lacking good examples, to the point that the "foobar" stuff triggers an immediate defensive reaction in my brain, and phrased for people who already understood a concept but need to refresh their knowledge. Their core problem, as is tradition in any IT field, is that they lack the empathy to understand the viewpoint of someone who understands less of something than they do. It's like asking someone to teach you to read and getting, as an answer, a poem with the advice to just read it.
I can circumvent that via ChatGPT by asking it to ELI5. Also, I get an answer instantly, am not discouraged to ask further questions and not advised to read a link where a solution is offered in an equally difficult language.
People are saying that using ChatGPT doesn't give accurate information and fails to convey important concepts, but I feel it's actually the other way around. Since there is ChatGPT, I'm making way more progress than before.
I understand that users don't want AI answers, but I also don't get why anyone would want that on this platform. You can just, you know, use AI directly.
I think that one issue with using AI to help you solve programming problems is that sometimes it will wholesale make things up. Of course, people can do that too, which is why communities of coders can vote on the best answer. I say, more power to you, using the tools that work for you. Just be cautious.
The key with ChatGPT for me has been to use it as an augmentation, not a gap fill. There's some prerequisite knowledge required on my part. It's a much more useful tool when it's helping flesh out something I know but have forgotten, or am familiar with but not proficient in. That means I find mistakes faster, and am less prone to having it loop or hallucinate. If I need to ask a question about something where I know very little or nothing at all, I'll peek at a Wikipedia page or something first if I can.
I’ve asked the Bing one to look through some documentation for me before, and generally that seems to work out alright.
“Using [some package], how do I do X?”
Recently there was a Typescript thing I didn’t know how to do, and it was faster to ask the robot than dig through tons and tons of documentation. And I can still always double check (which I do).
That's not unpopular. But there is a problem. ChatGPT can answer your questions mostly because it was trained on the posts and answers of sites like StackOverflow.
If people abandon SO and similar forums, then the quality of ChatGPT's answers will go down too.
Especially with something like programming. It's always changing. Next year there will be new versions of C++ and python. There will be new JS frameworks as always. It doesn't stand still.
And without new discussions about new problems, there's nowhere for ChatGPT to learn about them.
Haven't thought about that, you're right.
ChatGPT is incredible for middle ground developers like myself. I understand the goal I'm trying to achieve, and I understand the general process of how to do it. I can ask very granular, specific questions to ChatGPT and it will spit out some code that will get me close to what I need.
If I was a complete novice, I think ChatGPT would make me too dependent on using it for answers.
That seems like a totally valid use case. I occasionally will outline some very specific requirements and have AI generate the code, which just saves a lot of time typing, versus it generating it entirely on its own. And I still go through all the code and verify that it's good. It's just a tool that can be used to make your job easier.
Totally. The other day I had to test a csv/xls upload tool. I wanted to make sure that no matter what configuration an asshole user had for phone numbers, it would strip everything out so it would be a valid integer for my database.
I told chatgpt to make me a csv with 20 rows, 6 columns with xyz headers, and to give me an assortment of different phone number formats. Took 10 seconds.
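The normalization side of that test can be sketched in a few lines of Python (a minimal, hedged example: the digits-only rule and the column names are assumptions, not the commenter's actual code):

```python
import csv
import io

def normalize_phone(raw):
    """Strip everything but digits, so '(555) 123-4567' -> '5551234567'."""
    return "".join(ch for ch in raw if ch.isdigit())

# A few of the formats a hostile user might put in an upload:
sample = io.StringIO(
    "name,phone\n"
    "alice,(555) 123-4567\n"
    "bob,555.123.4567\n"
    "carol,+1 555-123-4567\n"
)

for row in csv.DictReader(sample):
    print(row["name"], normalize_phone(row["phone"]))
```

Generating twenty rows of adversarial formats by hand is tedious; having the model produce the test data while you write (and verify) the normalization logic yourself is exactly the augmentation-not-gap-fill split described upthread.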
OOooooooohhh that's a great use case. Get it to generate data!
You're storing phone numbers as integers?
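The skepticism is warranted: casting a phone number to an integer silently drops leading zeros (significant in many countries' dialing plans) and any `+` prefix. A quick illustration:

```python
# Leading zeros vanish when a phone number is cast to an integer,
# which corrupts numbers in regions where they are significant.
raw = "0123456789"
as_int = int(raw)
print(as_int)                         # 123456789 -- leading zero is gone
print(len(str(as_int)) == len(raw))   # False: the round trip is lossy
```

This is why phone numbers are generally stored as strings: they are identifiers, not quantities.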
I wish my students would care about the concepts and try to understand the answers instead of just blindly copying and pasting ill-fitting code (and then wondering why it only kinda works...).
As a former student now practicing engineer this habit never gets broken. All of us accept cargo cult computing to one extent or another. It sucks.
Usually the engineers with the least tolerance for it do better but only in the long run. In the short run they are yelled at for holding back projects.
I've yet to get a useful answer out of chatgpt for a technical question. It's good for fluffing up emails, but I haven't been super impressed with any use case I've tried for it.
When I've used it for decently complex programming questions, I've found it often likes to make up functions and libraries. It'll say "just use this reasonable-sounding function from this library," and I look it up and the library does not have that functionality at all. Over and over!
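One cheap defense against made-up names: before building on an AI-suggested function, check that it actually exists. A minimal Python sketch (the second attribute name here is a deliberately invented, plausible-sounding one of the kind LLMs produce):

```python
import datetime

# A real method that has existed since Python 3.7:
print(hasattr(datetime.date, "fromisoformat"))    # True

# A hypothetical, made-up name of the kind an LLM might suggest:
print(hasattr(datetime.date, "from_unix_epoch"))  # False
```

For third-party libraries, the same idea applies: a quick `help()` or a search of the official docs catches most hallucinated APIs before they cost you an afternoon.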
Well, it's a large language model that generates text probabilistically. It's trained on vast amounts of data, so it's expert at sounding like a skilled programmer, but there's absolutely no reason at all for the results to be useful code. It will sound like useful code and look like useful code, and it will be on the right topic, and that might well be enough, but it might not.
Here to echo the same. I thought using AI to assist me in coding would just make me lazy and learn nothing, but turns out I actually learn more than ever since it's much faster, more polite and patient, and the semantics are usually more catered to my needs and self explanatory than the average answers I find elsewhere.
It's great for writing snippets and creating basic frameworks. However, it definitely makes a lot of mistakes which I doubt a total beginner can spot, especially if the error lies in logic and not syntax.
Works great only as a tool for now, but chances are AI will probably surpass human coders sooner than we think.
I played around with ChatGPT for programming for a few hours a while back.
It is far better at explaining code in plain language than pretty much any human I've seen, at least online. It's absolute dogshit at writing anything but the most basic of code, but it does do a good job explaining.
Programmers are shit at communicating.
I've found that it gives me a decent skeleton of something that I can then apply to my actual problem, but not much more, and it usually comes with some pretty big mistakes. I was trying to learn Z80 assembly and it gave me a good idea of how my code should generally look, but I did end up having to rewrite a whole bunch of it before I could actually execute anything.
I think there's a sweet spot depending on how many other resources are out there. For JavaScript, GPT answers are pretty good. But when you get to a less popular language like Elixir, not so much.
While I have no issues with the use of ChatGPT as an assist, from my understanding of the system as well as from some examples of using it, I would be very hesitant to say a beginner should use ChatGPT. While ChatGPT has given me some great starting points in a couple of projects, I have also encountered a few instances where the code output was... not totally optimal at best, or wrong at worst. It also varies in quality from language to language (usually based on that language's overall popularity at the time the model was trained).
I suppose all of the above could hold true when it comes to getting your info from SO or any other social media source, but I would still highly suggest not relying on ChatGPT or AI systems until one has an intermediate or better grasp of the language.
Totally my opinion, though.