Hey everyone, I've been searching for a bit on getting local LLM inference to process legal paperwork (I am not a lawyer, I just have trouble getting through large documents to figure out my rights). This would help me have conversations with my landlord and various other people who withhold crucial information, such as your rights during a unit inspection, or who accuse you of things you did not do.

Given that there are thousands of pre-trained models, would it be better to train a small model myself on an RTX 4090 or a daisy chain of other GPUs? Is there a legal archive somewhere that I'm just not seeing, or where should I direct my energy? I think lots of us could benefit from a pocket law reference that can serve as an aid to see what to do next.
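For reference, this is roughly the kind of thing I mean by "local inference" (a sketch using the Hugging Face transformers library as just one option; the model name and file path are placeholders, not recommendations):

```python
# Rough sketch: local inference with an off-the-shelf pre-trained instruct model.
# "some-org/some-7b-instruct" and "lease.txt" are placeholders, not real names.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="some-org/some-7b-instruct",  # any local instruct model that fits in 24 GB VRAM
    device_map="auto",
)

lease = open("lease.txt").read()

prompt = (
    "Using only the lease text below, answer the question and quote the clause "
    "you relied on.\n\n"
    f"LEASE:\n{lease}\n\n"
    "QUESTION: How much notice must the landlord give before a unit inspection?"
)

print(generator(prompt, max_new_tokens=300)[0]["generated_text"])
```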

[–] inspxtr 2 points 9 months ago (3 children)

I know nothing about “in-context learning” or legal stuff, but intuitively, don’t legal documents tend to reference each other, especially the more complicated ones? If so, how would you apply in-context learning if you’re not aware which ones may be relevant?

[–] [email protected] 5 points 9 months ago* (last edited 9 months ago) (2 children)

Yes, you can craft your prompt in such a way that if the LLM doesn’t know about a referenced legal document it will ask for it; you can then paste the relevant section of that document into the prompt to provide it with that information.

I’d encourage you to look up some info on prompting LLMs and LLM context.

They’re powerful tools, so it’s good to really learn how to use them, especially for important applications like legalese translators and rent negotiators.
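For example, here's a bare-bones sketch of that pattern (the `chat` callable is a stand-in for whatever local chat API you run, such as a llama.cpp or Ollama server, not a real library call):

```python
SYSTEM = (
    "You are helping a tenant understand a lease. Answer only from the documents "
    "provided. If the lease references another document you have not been given, "
    "reply exactly: NEED: <document name>."
)

def answer(chat, question: str, documents: dict[str, str]) -> str:
    """Re-prompt until the model stops asking for missing referenced documents."""
    while True:
        # All documents gathered so far become the context portion of the prompt.
        context = "\n\n".join(f"--- {name} ---\n{text}" for name, text in documents.items())
        reply = chat(system=SYSTEM, user=f"{context}\n\nQUESTION: {question}")
        if not reply.startswith("NEED:"):
            return reply
        # The model asked for a referenced document; paste the relevant section in.
        missing = reply.removeprefix("NEED:").strip()
        documents[missing] = input(f"Paste the relevant section of '{missing}': ")
```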

[–] inspxtr 1 points 9 months ago (1 children)

Thanks for your answer! Is this the same as or different from indexing to provide context? I saw some people ingesting large corpora of documents/structured data, like with LlamaIndex. Is that an alternative way to provide context, or something similar?

[–] [email protected] 2 points 9 months ago

Indexing tools like LlamaIndex use LLM-generated embeddings to “intelligently” search for documents similar to a search query.

Those documents are then fed into the LLM as part of the prompt (i.e., as context).
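If it helps, here's a minimal sketch of that pipeline, showing the mechanism that LlamaIndex and similar tools wrap (it uses the sentence-transformers library directly rather than the LlamaIndex API, and the lease snippets are made-up examples):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Embed the document chunks once; this is the "index".
embedder = SentenceTransformer("all-MiniLM-L6-v2")
chunks = [
    "Section 4: The landlord must give 24 hours written notice before entry.",
    "Section 9: Rent increases require 60 days written notice.",
    "Section 12: The security deposit is returned within 21 days of move-out.",
]
chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

# Embed the query and retrieve the most similar chunks (a dot product equals
# cosine similarity here because the embeddings are normalized).
query = "How much notice before an inspection?"
query_vec = embedder.encode([query], normalize_embeddings=True)[0]
scores = chunk_vecs @ query_vec
top = [chunks[i] for i in np.argsort(scores)[::-1][:2]]

# The retrieved chunks get pasted into the prompt as context for the LLM.
prompt = "Answer using only this context:\n" + "\n".join(top) + f"\n\nQuestion: {query}"
print(prompt)
```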