this post was submitted on 23 Apr 2024
44 points (78.2% liked)

Ask Lemmy


Been using Perplexity AI quite a bit lately for random queries, like travel suggestions.

So I started wondering what random things people are using it for to help with daily tasks. Do you use it more than Google/etc.?

Also, is anyone paying for the Pro version? I'm trying to decide whether Perplexity AI Pro is worth paying for.

[–] Audalin 5 points 7 months ago (1 children)

I'm using local models. Why pay somebody else or hand them my data?

  • Sometimes you need to search for something and SEO spam makes it impossible, however you word the query. An LLM won't necessarily give you a useful answer, but it will at least take your query at face value, and it will usually give you enough context around the question to make a web search easier, should you decide to look further.
  • Sometimes you need to troubleshoot something non-obvious, and asking a local LLM is the most straightforward option.
  • Using an LLM in scripts adds a semantic layer to whatever you're trying to automate: you can process a large number of small files in a way that's hard to script conventionally, because the right action depends on what's inside each file.
  • Some people combine an LLM, a speech-to-text model, a text-to-speech model, and function calling into an assistant that can act on spoken commands without you touching your computer. It sounds like plenty of work to wire together, but I may try it later.
  • Some use RAG to query large amounts of information. I think that's a hopeless struggle, and the real solution is an architecture other than a Transformer/SSM variant: one that properly addresses real-time learning, long-term memory, and agency.
  • Some use LLMs as editor-integrated coding assistants. I haven't tried anything like that yet (though I do ask coding questions sometimes), but I'm going to at some point. The 8B version of LLaMA 3 should be good and quick enough.
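The "semantic layer in scripts" idea above can be sketched roughly like this: a Python script that asks a locally served model to sort small text files into buckets. It assumes an OpenAI-compatible chat endpoint on localhost:8080 (as llama.cpp's server exposes); the endpoint, labels, and model name are illustrative, not prescribed by the comment.

```python
# Sketch: using a local LLM as a semantic layer in a script -- classify
# small text files by what's inside them. Endpoint/labels are assumptions.
import json
import urllib.request
from pathlib import Path

LABELS = ["invoice", "notes", "code", "other"]

def build_request(text: str) -> dict:
    """Build an OpenAI-style chat-completion payload asking for one label."""
    return {
        "model": "local",  # llama.cpp's server accepts any model name here
        "messages": [
            {"role": "system",
             "content": "Classify the file. Reply with exactly one of: "
                        + ", ".join(LABELS) + "."},
            {"role": "user", "content": text[:2000]},  # truncate long files
        ],
        "temperature": 0,  # deterministic output suits scripting
    }

def classify(path: Path,
             endpoint: str = "http://localhost:8080/v1/chat/completions") -> str:
    """Send one file's contents to the local model and return its label."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(build_request(path.read_text())).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.load(resp)["choices"][0]["message"]["content"]
    answer = answer.strip().lower()
    return answer if answer in LABELS else "other"
```

From there it's a short loop over `Path(".").glob("*.txt")` to move each file into a folder named after its label.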
[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

Got any links teaching how to run a self hosted RAG LLM?

[–] Audalin 1 points 7 months ago (1 children)

I've never run RAG myself, so unfortunately no. But there are quite a few projects that already do the necessary plumbing - I'd expect them to have manuals.

[–] [email protected] -1 points 7 months ago* (last edited 7 months ago) (2 children)

Got any links to those please?

[–] Audalin 4 points 7 months ago (1 children)
[–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

Thank you. The "jonfairbanks" github repo is exactly what I was looking for, because FUCK sending any of my data to an AI company using their APIs for them to ingest my information to sell off to others.

You are the best!

[–] Audalin 1 points 7 months ago

You're welcome!

As far as I understand, all of them can be made to work locally (especially if your local model is served via an OpenAI-compatible API - see, e.g., llama.cpp's server binary), with varying degrees of effort required.
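For context, a minimal sketch of what "served via an OpenAI-compatible API" looks like with llama.cpp. The model file and port below are placeholders; check llama.cpp's own documentation for the exact flags in your version.

```shell
# Start llama.cpp's server with a local GGUF model (path is a placeholder).
# It exposes OpenAI-style endpoints such as /v1/chat/completions.
./llama-server -m ./models/Meta-Llama-3-8B-Instruct.Q4_K_M.gguf --port 8080

# From another terminal: any OpenAI-compatible client can now talk to it.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
```

Any tool that lets you override the API base URL (most RAG frontends do) can then be pointed at `http://localhost:8080/v1` instead of OpenAI's servers.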

[–] Rolando 1 points 7 months ago

Not the comment OP, but you could start here: [email protected]. The latest post links to a RAG tutorial, and there are various other resources in the sidebar.