this post was submitted on 23 Apr 2024
44 points (78.2% liked)

Ask Lemmy

I've been using Perplexity AI quite a bit lately for random queries, like travel suggestions.

So I started wondering what random things people use it for to help with daily tasks. Do you use it more than Google, etc.?

Also, is anyone paying for a Pro version? I'm trying to decide whether Perplexity AI Pro is worth paying for.

[–] [email protected] 2 points 7 months ago (2 children)

I've got a local LLM set up for code suggestions, and I run GitHub Copilot for spots where the local model isn't good enough. I can start writing out a thought and a pseudo-implementation and instantly have a mostly viable real implementation, which I can then modify to suit my needs. It also takes a lot of the busywork out of things that need boilerplate. The local model is trained on the style of my repos, so it helps me keep up with style standards too. It's also great for explaining legacy code and coming up with more semantic variable names in old code.
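A setup like the one described above could be wired up roughly like this (a minimal sketch, not the commenter's actual setup: it assumes an OpenAI-compatible local server such as Ollama or llama.cpp listening on localhost; the port, endpoint path, and model name are all assumptions):

```python
import json
import urllib.request

# Assumed local endpoint: Ollama and llama.cpp both expose an
# OpenAI-compatible /v1/chat/completions route, but the port and the
# model name below are illustrative, not from the original comment.
LOCAL_URL = "http://localhost:11434/v1/chat/completions"


def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat-completion payload for a code suggestion."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "You are a coding assistant. Turn the user's "
                           "pseudo-implementation into working code.",
            },
            {"role": "user", "content": prompt},
        ],
        # Low temperature keeps code suggestions close to deterministic.
        "temperature": 0.2,
    }


def suggest(prompt: str) -> str:
    """Send the prompt to the local model and return the reply text."""
    req = urllib.request.Request(
        LOCAL_URL,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

An editor plugin would then call `suggest()` with the surrounding file as context and fall back to a hosted service (Copilot in the comment above) when the local reply isn't good enough.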

[–] stangel 2 points 7 months ago (1 children)

What are you using for your local installation?

[–] [email protected] 3 points 7 months ago

I was using Mixtral, but I've recently been testing out the new Llama 3 models. They're a decent improvement, and hopefully we'll see some good fine-tuned versions of them soon.