Hey everyone!
I think it's time we had a fosai model on HuggingFace. I'd like to start collecting ideas, strategies, and approaches for fine-tuning our first community model.
I'm open to hearing what you think we should do. We will release more in time. This is just the beginning.
For now, I say let's pick a current open-source foundation model and fine-tune it on datasets we all curate together, built around the loose concept of using a fine-tuned LLM to teach ourselves bleeding-edge technologies (and how to build them with modern tools and concepts).
FOSAI is a non-profit movement. You own everything fosai as much as I do. It is synonymous with the concept of FOSS. It is for everyone to champion as they see fit. Anyone is welcome to join me in training or tuning using the workflows I share along the way.
You are encouraged to leverage fosai tools to create and express ideas of your own. All fosai models will be licensed under Apache 2.0 (more on licensing below).
We're Building FOSAI Models!
Our goal is to fine-tune a foundation model and open-source it. We're going to start with one foundation family at smaller parameter counts (7B/13B), then work our way up to 40B (or other sizes), moving to the next size as we vote on which foundation we should fine-tune as a community.
Fine-Tuned Use Case
Technical
FOSAI Model Idea #1 - Research & Development Assistant
FOSAI Model Idea #2 - Technical Project Manager
FOSAI Model Idea #3 - Personal Software Developer
FOSAI Model Idea #4 - Life Coach / Teacher / Mentor
FOSAI Model Idea #5 - FOSAI OS / System Assistant
Non-Technical
FOSAI Model Idea #6 - Dungeon Master / Lore Master
FOSAI Model Idea #7 - Sentient Robot Character
FOSAI Model Idea #8 - Friendly Companion Character
FOSAI Model Idea #9 - General RPG or Sci-Fi Character
FOSAI Model Idea #10 - Philosophical Character
OR
FOSAI Foundation Model
Foundation Model
(Pick one)
Mistral
Llama 2
Falcon
..(Your Submission Here)
Model Name & Convention
snake_case_example
CamelCaseExample
kebab-case-example
0.) FOSAI
fosai-7B
fosai-13B
1.) FOSAI Assistant
fosai-assistant-7B
fosai-assistant-13B
2.) FOSAI Atlas
fosai-atlas-7B
fosai-atlas-13B
3.) FOSAI Navigator
fosai-navigator-7B
fosai-navigator-13B
4.) ?
Datasets
TBD!
What datasets do you think we should fine-tune on?
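To make the ask a little more concrete, here's a rough sketch of what a single curated training record could look like. The instruction-style fields and the file name are assumptions for illustration, not a settled format:

```python
# Hypothetical example of one curated training record, stored as one JSON object
# per line in a .jsonl file. The field names are illustrative, not a settled format.
import json

record = {
    "instruction": "Explain what LoRA fine-tuning is and when you would use it.",
    "input": "",
    "response": (
        "LoRA (Low-Rank Adaptation) freezes the base model's weights and trains small "
        "low-rank adapter matrices instead, which makes fine-tuning far cheaper than "
        "updating every parameter."
    ),
}

# Append the record to a shared (hypothetical) community dataset file.
with open("fosai_community.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")
```

Records like this would typically be rendered through a prompt template into a single text string before training.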
Alignment
To embody open-source mentalities, I think it's worth releasing both censored and uncensored versions of our models. This is something I will consider as we train and fine-tune over time. Like any tool, you are responsible for your usage and how you choose to incorporate it into your business and/or personal life.
License
All fosai models will be licensed under Apache 2.0. I'm open to hearing thoughts on whether other licenses should be considered.
This will be a fine-tuned model, so it may inherit some of the permissions and license terms of its foundation model, and there may be other implications depending on your country or local law.
Generally speaking, you can expect all fosai models to be commercially viable, both through the choice of foundation family and through the fine-tuning applied on top of it.
Costs
I will be personally covering all training and deployment costs. This may change if I choose to put together some sort of patronage, but for now, don't worry about it. I will be using something like RunPod or another custom-deployed solution for training.
Cast Your Votes!
Share Your Ideas & Vote in the Comments Below!
What do you want to see out of this first community model? What are some of the fine-tuning ideas you've wanted to try, but never had the time or chance to test? Let me know in the comments and we'll brainstorm together.
I am in no rush to get this out, so I will leave this up for everyone to see and interact with until I feel we have a solid direction we can all agree upon. There will be plenty more opportunities to create, curate, and customize the fosai models I plan to release in the future.
Update [10/25/23]: I may have found a fine-tuning workflow for both Llama 2 and Mistral, but I haven't had time to validate the first test run. Once I do, and can test some inference, I'll update this post with the workflow, the models, and some sample output with example datasets. Unfortunately, I have run out of personal funds to allocate to training, so if this first attempt doesn't pan out, it's unclear when I'll be able to make another. Will keep everyone posted as we approach the end of 2023.
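In the meantime, for anyone curious what a workflow like this could look like, below is a minimal sketch of a QLoRA-style fine-tune on the Hugging Face stack. Treat it as an assumption-heavy outline rather than the final workflow: the base model, dataset file, hyperparameters, and exact argument names (which change between transformers/peft/trl releases) are all placeholders.

```python
# Rough sketch of a QLoRA fine-tune of a 7B foundation model using the
# Hugging Face stack (transformers + peft + trl + bitsandbytes).
# Argument names reflect the 2023-era APIs and may differ in newer releases.
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    TrainingArguments,
)
from trl import SFTTrainer

base_model = "mistralai/Mistral-7B-v0.1"  # candidate foundation model

# Load the base model in 4-bit so it fits on a single rented GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)

# Train small low-rank adapters instead of the full weights.
peft_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05, bias="none", task_type="CAUSAL_LM"
)

# Placeholder community dataset: one JSON object per line with a "text" field
# (e.g., the instruction/response records above rendered through a prompt template).
dataset = load_dataset("json", data_files="fosai_community.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    peft_config=peft_config,
    args=TrainingArguments(
        output_dir="fosai-7B",
        num_train_epochs=1,
        per_device_train_batch_size=2,
        gradient_accumulation_steps=8,
        learning_rate=2e-4,
        logging_steps=10,
    ),
)
trainer.train()
trainer.save_model("fosai-7B")  # saves LoRA adapters that can later be merged and pushed
```

The appeal of QLoRA here is cost: a 4-bit base model plus small LoRA adapters keeps a 7B fine-tune within reach of a single rented GPU.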
If you're going to finetune a foundation model, it'd make sense to choose Mistral - once they release a 13B.
Also consider adding function calling to the home assistant use case.
Mistral seems to be the popular choice. I think it's the most open-source friendly out of the bunch. I will keep function calling in mind as I design some of our models! Thanks for bringing that up.
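For anyone unfamiliar, function calling just means training (or prompting) the model to emit a structured tool call instead of prose. A training example for a system-assistant model could look something like the sketch below; the message layout follows the common OpenAI-style tool-call shape, and the tool name and arguments are purely hypothetical.

```python
# Illustrative sketch of a function-calling training example.
# The tool name, arguments, and message layout are hypothetical placeholders.
example = {
    "messages": [
        {"role": "user", "content": "Turn off the living room lights."},
        {
            "role": "assistant",
            "content": None,
            "tool_calls": [
                {
                    "type": "function",
                    "function": {
                        "name": "set_lights",  # hypothetical system-assistant tool
                        "arguments": '{"room": "living_room", "state": "off"}',
                    },
                }
            ],
        },
    ]
}
```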
Mistral is an excellent choice. I am thoroughly impressed by the capability of these 7b models.