this post was submitted on 20 Sep 2023
22 points (75.0% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to the creation and use of technology, and to facilitate civil, meaningful discussion around it.


Ask via DM before posting product reviews or ads; otherwise, such posts are subject to removal.


Rules:

1: All Lemmy rules apply

2: No low-effort posts

3: NEVER post naziped*gore stuff

4: Always post article URLs or their archived version URLs as sources, NOT screenshots. Help the blind users.

5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)

6: No advertisement posts unless verified as legitimate and non-exploitative/non-consumerist

7: Crypto-related posts, unless essential, are disallowed

top 15 comments
[–] [email protected] 10 points 1 year ago (1 children)

tailored to their corporation's needs

[–] [email protected] -2 points 1 year ago (1 children)

No, these models are out there now. The more specialized the model, the less likely you are to need huge computing or graphics power. The corporations have realized bigger isn't better.

[–] [email protected] 4 points 1 year ago (1 children)

I think you did not get the point.

[–] [email protected] -2 points 1 year ago

Do I not? People are framing this as a bad thing, and it is, but not in the way people think. The smart corporations won't hoard the tech away to only do what they want with it. The smart companies will use the users as the product and harvest incredible amounts of information. Not only do you get user information, marketing data, and human data, you also get people to create models for free. Sure, the users get some use out of it for whatever project, but they might have just cooked up a million-dollar idea that the company can leverage and steal as well. I think "AI" is going to be a lot less scary when it's integrated everywhere and user dependence is the biggest problem.

[–] [email protected] 9 points 1 year ago (1 children)

Yep, makes sense. You don’t want a researcher using the same tool as a lawyer or fiction writer. The researcher needs AI to summarize existing literature in a factual way, the lawyer needs to source actual cases, while the writer needs novel combinations of existing literary ideas. A single tool isn’t going to meet all those needs.

[–] [email protected] 0 points 1 year ago (2 children)

A single tool isn't going to meet all those needs yet.

[–] [email protected] 4 points 1 year ago (1 children)

Why would we want one? We don’t have a single social media tool: forums, link aggregators, microblogging, networking, etc. are all separate tools. We wouldn’t want to do all of those on Facebook.

ChatGPT is just a demo of a technology that can be used for all sorts of cool things. Trying to make ChatGPT do it all isn’t really needed or desirable.

[–] [email protected] 0 points 1 year ago (1 children)

I do think it's desirable. It's unnecessary for users to keep track of which tool is best for which purpose if one tool can do it all. There's no reason why one tool wouldn't be able to; even in the worst case it could just automatically choose the best tool to answer your prompt, saving you the trouble of doing so.
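
That "automatically choose the best tool" idea is basically a router in front of several specialized models. A minimal sketch of the concept follows; the model names and the `classify_task` heuristic are hypothetical placeholders, not any real product's API:

```python
# Sketch of a "router" front-end that picks a specialized model per prompt.
# Model names and the keyword heuristic are purely illustrative.

SPECIALIZED_MODELS = {
    "legal": "hypothetical-legal-llm",       # cites actual cases
    "research": "hypothetical-science-llm",  # factual literature summaries
    "fiction": "hypothetical-writing-llm",   # novel prose generation
    "general": "hypothetical-general-llm",   # fallback chatbot
}

def classify_task(prompt: str) -> str:
    """Crude stand-in for a real intent classifier (itself likely a small model)."""
    text = prompt.lower()
    if any(word in text for word in ("plaintiff", "case law", "contract")):
        return "legal"
    if any(word in text for word in ("summarize the literature", "papers", "study")):
        return "research"
    if any(word in text for word in ("story", "character", "plot")):
        return "fiction"
    return "general"

def route(prompt: str) -> str:
    """Return the name of the specialized model that should answer this prompt."""
    return SPECIALIZED_MODELS[classify_task(prompt)]

if __name__ == "__main__":
    print(route("Summarize the literature on LLM routing"))  # hypothetical-science-llm
```

The user only ever talks to the front-end, so from their perspective it looks like one tool that "does it all."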

[–] [email protected] 3 points 1 year ago

The tools would be integrated into things we already use.

I’m a doctor, and our EMR is planning to start piloting generative text for replies to patient messages later this year. These would be fairly informal and don’t need to be super medically rigorous, needing just a quick review to make sure the AI doesn’t give dangerous advice.

However, at some point AI may be used in clinical decision support, where it may offer suggestions on diagnoses, tests, and/or medications. And here, we would need a much higher standard of evidence and reproducibility of results, as relying on a bad medical decision could lead to serious harm.

These are already in two different sections of the medical chart (inbox vs. encounter, respectively) and these would likely be two separate tools with two separate contexts. I would not need to memorize two tools to use the software: in my inbox, I’ll have my inbox tools, and in my encounter, I’ll have my encounter tools, without worrying about exactly what AI implementation is being used in the background.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

That's literally their point. You have a specialised tool for each.

Being general makes them much harder to train and worse at each individual task.

[–] [email protected] 1 points 1 year ago

And their point was that further in the future, even that might stop being a problem.

[–] [email protected] 4 points 1 year ago

My main issue with using the general chatbot is that it's an incredibly inefficient way to convey information. For writing tasks, I essentially need to type most of the answer myself before I get reasonable output that satisfies my actual constraints.

More specialized tooling will have these constraints built-in, which will increase productivity.

Even if we have the perfect general chatbot, it's still a lot of work to concisely describe your requirements to it.
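
One way to read "constraints built in": the specialized tool wraps the same underlying model in a fixed prompt template, so you only supply the part that changes. A minimal sketch of that idea, where `call_llm` is a hypothetical stand-in for whatever model API the tool actually uses:

```python
# Sketch of a specialized writing tool that bakes its constraints into a
# prompt template, so the user only types the variable parts.
# `call_llm` is a hypothetical placeholder for a real model API call.

PRESS_RELEASE_TEMPLATE = """You are drafting a press release.
Constraints (fixed by the tool, never retyped by the user):
- At most {max_words} words
- Formal tone, no first person
- End with a standard contact line

Facts to announce:
{facts}
"""

def call_llm(prompt: str) -> str:
    # Placeholder: a real tool would send the prompt to the underlying model.
    return f"[model output for a prompt of {len(prompt)} characters]"

def draft_press_release(facts: str, max_words: int = 300) -> str:
    """The user supplies only the facts; the constraints travel with the tool."""
    prompt = PRESS_RELEASE_TEMPLATE.format(max_words=max_words, facts=facts)
    return call_llm(prompt)

if __name__ == "__main__":
    print(draft_press_release("Version 2.0 ships next week with offline mode."))
```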

[–] Eggyhead 2 points 1 year ago

I imagine there will eventually be businesses that aggregate data specifically to sell to LLM businesses. Like stock photo agencies, but for LLM conversational data.

[–] [email protected] 1 points 1 year ago

This is where I think AI has the power to really increase productivity. General-purpose AIs are nice, but having one that knows the current status of projects, learns the people you talk to frequently and your tone, and has access to all your company's internal documentation would really boost productivity.

Instead of digging around in Confluence for a document, you just ask the AI. You ask the AI what meetings you have and what they're about. You ask the AI to write an email to Joe about the contract renewal and it spits one out for you to proofread and send.

It would be like everyone having their own personal secretary, but one that works seamlessly with everyone else's and never takes a sick day.
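
The "ask the AI instead of digging through Confluence" workflow is typically built as retrieval plus generation. A minimal sketch under that assumption; `search_internal_docs` and `call_llm` are hypothetical placeholders, not any real company's API:

```python
# Minimal retrieval-augmented sketch of an internal-docs assistant.
# `search_internal_docs` and `call_llm` are hypothetical placeholders.

from typing import List

def search_internal_docs(query: str, top_k: int = 3) -> List[str]:
    # Placeholder: a real assistant would query a search index or vector store
    # covering the company's wiki pages, tickets, and calendars.
    return [f"[doc snippet {i} matching '{query}']" for i in range(1, top_k + 1)]

def call_llm(prompt: str) -> str:
    # Placeholder for the underlying language-model call.
    return f"[drafted answer based on {prompt.count('[doc snippet')} snippets]"

def ask_assistant(question: str) -> str:
    """Retrieve relevant internal context, then ask the model to answer from it."""
    snippets = search_internal_docs(question)
    context = "\n".join(snippets)
    prompt = (
        "Answer using only the internal documents below, and say which snippet "
        "you relied on.\n\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(ask_assistant("What meetings do I have today and what are they about?"))
```

The key design point is that the assistant's value comes from the retrieval layer over your own data, not from the general-purpose model itself.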

[–] [email protected] 0 points 1 year ago

... no shit Sherlock