This post was submitted on 18 Jul 2023
1 point (100.0% liked)

Ars Technica (RSS)


A family of pretrained and fine-tuned language models in sizes from 7 to 70 billion parameters.
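Given the July 2023 date and the 7B-to-70B size range, this is presumably Meta's Llama 2 release. As a minimal sketch (not from the article itself), loading one of the smaller checkpoints with the Hugging Face transformers library might look like the following; the meta-llama/Llama-2-7b-hf model ID is an assumption, and the gated weights require accepting Meta's license on the Hub first:

```python
# Minimal sketch: load a 7B checkpoint of the model family with
# Hugging Face transformers. The model ID is an assumption
# (meta-llama/Llama-2-7b-hf); access to the weights is gated behind
# accepting Meta's license on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on consumer GPUs
    device_map="auto",          # requires the accelerate package
)

# Simple greedy generation from a short prompt.
inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```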

top 1 comment
[email protected] 1 point 1 year ago

How does this compare to Falcon 40B? Do we know yet?