floofloof@lemmy.ca to Technology@lemmy.world · English · 5 months ago

1-bit LLMs Could Solve AI’s Energy Demands (spectrum.ieee.org)

cross-posted to: [email protected]
Naz@sh.itjust.works · 5 months ago:
Try using a 1-bit LLM to test the article’s claim.
The perplexity loss is staggering: something like 75% of accuracy lost, or more. In practice it turns a 30-billion-parameter model into a 7-billion-parameter one.
Highly recommended that you try to replicate their results.
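To see where that quality loss comes from, here is a minimal sketch of BitNet-style 1-bit weight quantization: each weight collapses to its sign, with a single per-tensor scale (the mean absolute value) preserving overall magnitude. The function names and the toy weights are illustrative assumptions, not code from the article.

```python
import numpy as np

def quantize_1bit(w):
    # Sketch of 1-bit quantization (assumption: sign + mean-abs scale,
    # in the spirit of BitNet): every weight becomes +1 or -1, and one
    # scalar scale per tensor preserves the average magnitude.
    scale = np.mean(np.abs(w))
    return np.sign(w), scale

def dequantize(q, scale):
    # Reconstruct approximate weights from signs and the shared scale.
    return q * scale

w = np.array([0.4, -1.2, 0.05, -0.3])
q, s = quantize_1bit(w)
w_hat = dequantize(q, s)

# The reconstruction error per weight is what shows up downstream as
# the perplexity loss described above.
err = np.mean((w - w_hat) ** 2)
```

Running this on a real checkpoint (quantize, dequantize, then measure perplexity on a held-out corpus) is one way to replicate or refute the article's claim yourself.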
https://xkcd.com/2934/