China open-source AI models surpass 10 billion downloads
China's domestically developed open-source large language models have recorded more than 10 billion cumulative downloads worldwide, and the country now holds…
— admin (Daily Ittehad)

neon_nova
in reply to ☆ Yσɠƚԋσʂ ☆ • • •
I have a 16 GB MacBook Air M4. I like the idea of having a model I can run locally in the event of a possible long-term internet outage. Can you recommend a model that would be suitable for my computer?
☆ Yσɠƚԋσʂ ☆
in reply to neon_nova • • •
16 GB is a bit low, unfortunately. You could run a 2-bit quant of the latest Qwen, but performance will be severely degraded. huggingface.co/unsloth/Qwen3.6…
Might be worth trying, though, to see if it does what you need.
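Whether a given quant fits in memory can be estimated with some back-of-the-envelope arithmetic. A minimal sketch, assuming roughly 2.5 effective bits per weight for a "2-bit" GGUF quant and a couple of gigabytes of runtime/KV-cache overhead (both figures are illustrative assumptions, not from this thread):

```python
def quant_size_gb(params_billion: float, bits_per_weight: float,
                  overhead_gb: float = 2.0) -> float:
    """Rough memory estimate for running a quantized model:
    weight bytes plus an assumed fixed overhead for KV cache and runtime."""
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# 35B model at ~2.5 effective bits/weight (assumed for a 2-bit GGUF quant)
print(round(quant_size_gb(35, 2.5), 1))  # ~12.9 GB: tight but plausible on 16 GB
print(round(quant_size_gb(35, 4.5), 1))  # ~21.7 GB: a 4-bit quant would not fit
```

This is why the 2-bit quant is the only realistic option here: at typical 4-bit rates the weights alone exceed 16 GB before any overhead.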
☆ Yσɠƚԋσʂ ☆
in reply to neon_nova • • •
It's entirely possible we'll see fairly capable models that can run in 16 GB of RAM in the near future. Qwen 3.5 came out in February, and you needed a server with hundreds of gigabytes of memory to run the 397-billion-parameter model. Fast forward to a couple of weeks ago: 3.6 comes out with a 27-billion-parameter version that beats the old 397-billion-parameter one in every way. Just stop and think about how phenomenal that is. qwen.ai/blog?id=qwen3.6-27b
So it's entirely possible people will find ways to optimize this stuff even further this year or next, and we'll get an even smaller model that's more capable.