this post was submitted on 10 Jan 2024
1240 points (96.5% liked)
Technology
you are viewing a single comment's thread
Yes, you can download an executable of a chatbot lol.
That's different from running something remotely, the way even OpenAI does.
The more the model has to reference, the more the whole system has to scale up: not just storage, but everything else.
To use your video game example, it would be more like stripping a PS5 game of all its assets and then playing it on an NES at one frame every five minutes.
You're not only wildly overestimating chatbots' abilities, you're also drastically underestimating the resources they need.
Edit:
I think you literally don't know what people are talking about.
Do you think people are talking about AI image generators?
No one else is...
I think you're confusing training it with running it. After it's trained, you can run it on much weaker hardware.
The issue is that it can reproduce copyrighted works verbatim...
It can't do that unless it contains the entire text to begin with...
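For anyone curious what "running it" actually looks like once training is done, here is a minimal sketch of local inference, assuming the Hugging Face transformers library (plus PyTorch and accelerate) and the phi-2 checkpoint linked further down in this thread; it is an illustration, not anyone's exact setup:

```python
# Minimal sketch: run an already-trained model locally (inference only, no training).
# Assumes: pip install torch transformers accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # ~2.7B-parameter model, small enough for a consumer GPU

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to roughly halve memory use
    device_map="auto",          # uses a GPU if present, otherwise falls back to CPU
)

prompt = "Explain the difference between training a model and running it, in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The heavy, data-centre-scale part is producing those weights in the first place; loading and sampling from them afterwards is a much smaller job.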
I am talking about generative AI; be it text or image, both face challenges with copyrighted material.
Are you referring to my joke?
I am far from overestimating capacity. Starfield runs mediocre on a modern gaming system compared to other games, and the Vicuna 13B LLM runs mediocre on that same system compared with GPT-3.5. To date there is no local model I would trust for professional use, and ChatGPT 3.5 doesn't hit that level either.
But it remains a very interesting, rapidly evolving technology that I hope receives as much open source support as possible going forward.
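To give a sense of the footprint being argued about: a 13B model like Vicuna is roughly 26 GB of weights at half precision, which is why people running it on a gaming PC usually quantize it. A rough sketch, assuming transformers plus bitsandbytes on a CUDA GPU, and assuming lmsys/vicuna-13b-v1.5 is the checkpoint meant here:

```python
# Rough sketch: load a 13B chat model in 4-bit so it fits in consumer VRAM.
# Assumes: pip install torch transformers accelerate bitsandbytes (CUDA GPU required)
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "lmsys/vicuna-13b-v1.5"             # assumed checkpoint; substitute your own
quant = BitsAndBytesConfig(load_in_4bit=True)  # ~4x smaller in memory than fp16

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant,
    device_map="auto",  # places layers on the GPU, spilling to CPU RAM if needed
)

prompt = "USER: Why does quantization make local inference practical? ASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Even quantized, generation on a single consumer GPU is noticeably slower than a hosted service, which is roughly the quality and speed gap described above.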
I presume you must believe the following Lemmy community and resources were typed up by a group of children; either that, or you're just naive.
https://lemmy.world/c/fosai
https://www.fosai.xyz/
https://github.com/huggingface/transformers
https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
https://huggingface.co/microsoft/phi-2 & https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/
https://www.theguardian.com/technology/2023/may/05/google-engineer-open-source-technology-ai-openai-chatgpt
Or...
I could just block some of the people who are really, really into chatbots but don't understand them in the slightest.
I think that might be more productive than reading a bunch of stuff from other people who don't understand it.
HOT TAKE: Hugging Face is run by people who are really into chatbots but don't understand them in the slightest.
I have been patient and friendly so far, but your tone has been nothing but dismissive.
You cannot have a nuanced conversation about AI while excluding the entire open source field within it. That's simply unreasonable, and I implore you to ask others, because I know you won't take my word for it.
Farewell