dtlnx

joined 1 year ago
[–] [email protected] 7 points 1 year ago

Yes, very nice addition. All around, the app feels much better to use.

 

I found and bookmarked this resource a while back. Lots of tools to try out!

 

I just wanted to mention that I've noticed seriously improved performance this evening on Beehaw. Whatever you did seems to have done the trick for now!

Thanks for running this instance and putting in all the effort!

[–] [email protected] 1 point 1 year ago

This was the example I immediately thought of when I saw this post. It blew me away the first time I saw it.

[–] [email protected] 2 points 1 year ago

I wonder what Steam Deck support will look like.

[–] [email protected] 3 points 1 year ago (1 children)

You could try something like this.

https://github.com/xNul/chat-llama-discord-bot

Looks like it works on Linux, Windows, and macOS.
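
If you just want to see the general shape of that kind of setup, here's a rough sketch of a Discord bot that forwards mentions to a locally running model. It's not code from that repo; the bot token, endpoint URL, and response format below are placeholders.

```python
# Rough sketch only -- not code from the linked repo. Assumes the discord.py
# library and a local LLM server exposing an HTTP completion endpoint
# (the URL and JSON shape below are placeholders, not a real API).
import discord
import requests

TOKEN = "YOUR_DISCORD_BOT_TOKEN"            # placeholder
LLM_URL = "http://localhost:5000/generate"  # hypothetical local endpoint

intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)

@client.event
async def on_message(message):
    # Ignore the bot's own messages and only respond when mentioned.
    if message.author == client.user or client.user not in message.mentions:
        return
    prompt = message.clean_content
    # Forward the message to the local model and post the reply back.
    # (Blocking call kept simple for the sketch.)
    resp = requests.post(LLM_URL, json={"prompt": prompt}, timeout=120)
    reply = resp.json().get("text", "(no response)")
    await message.channel.send(reply[:2000])  # Discord's message length limit

client.run(TOKEN)
```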

[–] [email protected] 3 points 1 year ago (1 children)

I'd have to say I'm very impressed with WizardLM 30B (the newer one). I run it in GPT4All, and even though it's slow, the results are quite impressive.

Looking forward to Orca 13B if it's ever released!
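
If you ever want to script it instead of using the GUI, the gpt4all Python bindings can load the same kind of model file. Rough sketch below; the model filename is just a placeholder for whichever WizardLM build you downloaded.

```python
# Minimal sketch using the gpt4all Python bindings; the model filename is a
# placeholder -- point it at whatever WizardLM model file you downloaded.
from gpt4all import GPT4All

model = GPT4All("wizardlm-30b.q4_0.bin")  # placeholder filename

prompt = "Explain federation in the Fediverse in two sentences."
# Generation runs on CPU by default, so expect it to be slow on big models.
output = model.generate(prompt, max_tokens=200, temp=0.7)
print(output)
```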

 

Let's talk about our experiences working with different models, whether well-known or lesser-known.

Which locally run language models have you tried out? Share your insights, challenges, or anything you found interesting during your encounters with those models.

[–] [email protected] 1 point 1 year ago* (last edited 1 year ago)

Very good so far. I understand that server owners need to make changes to handle the large number of users migrating, so any slowdowns or service issues are completely understandable.

I really like the idea of a federated "reddit style" forum. Gives power back to the users.

[–] [email protected] 8 points 1 year ago

Swipe to vote, community grouping, an enhanced media viewer, and customizable themes and layouts!

That's it! :)

[–] [email protected] 2 points 1 year ago

Currently playing through Halo: Reach. I've never played the series before, and so far I'm really liking it. Even though it's only marked as Playable, the only slight slowdown was signing into Microsoft, and even that wasn't too bad.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

I'm not sure it's quite there yet, but it does have a LocalDocs plugin. I haven't played with it much, so I can't really comment on it.

 

I figured I'd post this. It's a great way to get an LLM set up on your computer, and it's extremely easy for folks who don't have much technical knowledge!

[–] [email protected] 8 points 1 year ago

The Lemmy community isn't big enough to fill such niche communities yet. That will come in time as more users show up.

[–] [email protected] 3 points 1 year ago

That's exactly what I have set up. Extremely easy to get going!
