this post was submitted on 09 Jan 2025
478 points (99.2% liked)

Opensource

1839 readers

A community for discussion about open source software! Ask questions, share knowledge, share news, or post interesting stuff related to it!

Credits: Icon base by Lorc under CC BY 3.0, with modifications to add a gradient

founded 1 year ago
[–] kautau 116 points 1 month ago* (last edited 1 month ago) (16 children)

The most important part is that it’s a local ~~LLM~~ model running on your machine. The problem with AI is less about LLMs themselves and more about their control and application by unethical companies and governments in a world driven by profit and power. And this is none of those things; it’s just some open source code running on your device. So that’s cool and good.

[–] [email protected] 40 points 1 month ago (15 children)

Also the enormous amounts of power/energy that they consume.

[–] [email protected] 1 points 4 weeks ago (1 children)

Curious how resource intensive AI subtitle generation will be. Probably fine on some setups.

Trying to use madVR (an enthusiast video post-processing tool) in the summer in my small office with an RTX 3090 turned the room into a sauna. Next time I buy a video card it'll deliberately be a lower tier to avoid the higher power draw lol.

[–] kautau 2 points 4 weeks ago

I think it really depends on how accurate you want it to be and what language you're interpreting. https://github.com/openai/whisper has multiple variations of their model, but they all pretty much require VRAM/graphics capability (or likely NPUs as they become more commonplace).
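If you want to gauge that tradeoff up front, one approach is to pick the largest Whisper variant that fits your GPU's VRAM. A minimal sketch, using the approximate VRAM figures listed in the whisper README; `choose_model` is a hypothetical helper, not part of the whisper package:

```python
# (model name, approximate required VRAM in GB), smallest to largest,
# per the openai/whisper README's model table.
WHISPER_MODELS = [
    ("tiny", 1.0),
    ("base", 1.0),
    ("small", 2.0),
    ("medium", 5.0),
    ("large", 10.0),
]

def choose_model(vram_gb: float) -> str:
    """Return the largest Whisper model that fits the given VRAM, else 'tiny'."""
    best = "tiny"
    for name, needed in WHISPER_MODELS:
        if needed <= vram_gb:
            best = name
    return best

# With a size chosen, transcription itself is just (requires `pip install openai-whisper`):
#   import whisper
#   model = whisper.load_model(choose_model(8.0))
#   print(model.transcribe("episode.mkv")["text"])
```

So a 24 GB card like the 3090 can run `large`, while an 8 GB card would top out around `medium` — hence the accuracy/power tradeoff.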
