this post was submitted on 30 Jul 2023
6 points (80.0% liked)

Free Open-Source Artificial Intelligence

I just tried a few and nothing in the open-source space seems complete, with an easy checkpoint setup freely available and good documentation. Do they all require proprietary weights or worse?

[–] j4k3 1 points 2 years ago (1 children)

Well, I tried Tortoise TTS today and got a bit farther than with the others, but it still doesn't work for me. I almost have it working, but figuring out the API, and playing the audio from a conda environment inside a Distrobox container (just to shield my system from the outdated dependencies the project uses), may prove to be too much for my skills. The documentation for offline execution is crap.
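One way to sidestep audio playback inside nested containers is to not play it there at all: write the generated samples to a WAV file and play it on the host. A minimal stdlib-only sketch, assuming you have Tortoise's output as a plain list of floats at its default 24 kHz rate (the `samples` and `rate` names here are illustrative, not part of any library API):

```python
import wave
import struct

def save_wav(samples, path, rate=24000):
    """Write mono float samples in [-1, 1] to a 16-bit PCM WAV file.

    Assumes the generated clip is already a flat list/array of floats;
    24 kHz is Tortoise's usual output rate, but pass your own if it differs.
    """
    with wave.open(path, "wb") as w:
        w.setnchannels(1)       # mono
        w.setsampwidth(2)       # 16-bit PCM
        w.setframerate(rate)
        pcm = b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples
        )
        w.writeframes(pcm)

# hypothetical usage, once you have a generated tensor:
# save_wav(generated.squeeze().tolist(), "out.wav")
```

The written file can then be copied out of the container (or written to a shared volume) and played with whatever the host already has, which avoids fighting PulseAudio/PipeWire passthrough entirely.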

I'm actually getting farther into these configurations by keeping a WizardLM 30B GGML model running in instruct mode the whole time and asking it questions. It is quite capable of taking in most error output from a terminal and giving almost-useful advice in many cases. That 30B GGML model, with 10 CPU threads and 20 layers offloaded to a 3080 Ti 16GB, is very close in speed to a Llama 2 7B running entirely on the GPU. It only crashes if I feed it something larger than what might fit on a single page of a PDF. My machine has 32GB of system memory; I think I need to get the maximum 64GB. From what I've seen, a 7B model lies about half the time, a 13B around 20% of the time, and my 30B around 10% of the time at 4-bit. With a ton of extra RAM I want to see how much better the 30B is at 8-bit, or whether a 70B is feasible and maybe closes the gap.
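A back-of-the-envelope sketch of why the RAM upgrade matters: weight storage alone scales with parameter count times bits per weight. This ignores quantization block overhead and the KV cache/context memory, which add several more GB on top, so treat these as optimistic lower bounds rather than real requirements:

```python
def model_ram_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough weight-storage estimate in decimal GB.

    Ignores quantization block overhead (GGML q4 formats store scale
    factors too) and KV-cache/context memory, so real usage is higher.
    """
    return n_params * bits_per_weight / 8 / 1e9

for label, params in [("7B", 7e9), ("13B", 13e9), ("30B", 30e9), ("70B", 70e9)]:
    print(f"{label}: 4-bit ~ {model_ram_gb(params, 4):.1f} GB, "
          f"8-bit ~ {model_ram_gb(params, 8):.1f} GB")
```

By this estimate a 30B model at 8-bit needs roughly 30 GB for weights alone, and a 70B at 4-bit roughly 35 GB, so neither fits comfortably in 32 GB of system RAM once overhead is counted, while 64 GB leaves headroom for both plus partial GPU offload.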

[–] Blaed 2 points 2 years ago

Really appreciate the info and insights. Helps me adjust and test my benchmarks a ton. It’s remarkable what we’re able to do with consumer hardware now. It’s exciting to imagine where we’ll be at even a year from now!

Let us know if you find a better setup and workflow in the future. Sounds pretty effective though. Curious to see how it powers up for you throughout the rest of the year.

Thanks again. All this info is very helpful for others looking to get something similar running.