Depends what hardware you have. You want a big model, at least 30B parameters, but you can make it easier to run with quantization. A 5-bit quantization is like a JPEG: a bit grainy but still very recognizable. Fewer bits than that and it goes to crap.
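The JPEG analogy is easy to see numerically. This is just a toy uniform quantizer (real schemes like the GGUF K-quants use per-block scales and are smarter than this), but it shows reconstruction error climbing as the bit width drops:

```python
import random

def quantize(values, bits):
    # Snap each value onto a 2**bits-level grid between min and max,
    # then map back. A crude stand-in for weight quantization.
    levels = 2 ** bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / levels
    return [lo + round((v - lo) / scale) * scale for v in values]

def mean_abs_err(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

random.seed(0)
weights = [random.gauss(0, 1) for _ in range(1000)]  # fake model weights

for bits in (8, 5, 3):
    err = mean_abs_err(weights, quantize(weights, bits))
    print(f"{bits}-bit: mean abs error {err:.4f}")
```

8-bit is nearly lossless, 5-bit is "grainy but recognizable", and by 3 bits the error has blown up, which matches the advice above.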
What is your goal? Find a model that fits it. Head over to Hugging Face and you can try a bunch. I think it's less about the model and more about the training set.
[ - ] TheYiddler 1 point 10 months ago (Jul 31, 2024 06:34:30) (+1/-0)
Mixtral 8x7b is nice and versatile.
[ - ] Love240 1 point 10 months ago (Jul 31, 2024 02:59:38) (+1/-0)
[ - ] MaryXmas 0 points 10 months ago (Jul 31, 2024 08:25:45) (+0/-0)