jeffw@lemmy.world to Technology@lemmy.world · English · 4 months ago
Everything Apple iOS 18 Will Do, Android Already Does (gizmodo.com) · 58 comments
Ace! _SL/S@ani.social · 4 months ago

> A private local LLM

Running on a phone? No way, not without being absolutely horrible, slow, or making your phone churn through its battery anyway. Good LLMs are already slow on a GTX 1080, which is miles faster than any phone out there.
habanhero@lemmy.ca · 4 months ago

It's not an LLM; it's a much smaller model (~3B parameters), closer to what Microsoft labels an SLM (Small Language Model, e.g. MS Phi-3 Mini).

https://machinelearning.apple.com/research/introducing-apple-foundation-models
Womble@lemmy.world · edited 4 months ago

Microsoft's penchant for making up names for things that already have names is neither here nor there. It is an LLM; in fact it's already twice as large as GPT-2 (1.5B params).
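For context on the feasibility debate above, here's a rough back-of-envelope sketch (not from the thread) of the weight memory a ~3B-parameter model needs at common precisions. The 3B figure comes from the comments; the bytes-per-parameter values are standard for each format, and the 4-bit row is roughly why such models can fit in phone RAM:

```python
# Rough weight-memory footprint for a ~3B-parameter model.
# Ignores activation memory and KV cache, so real usage is higher.
params = 3e9

bytes_per_param = {
    "fp32": 4.0,   # full precision
    "fp16": 2.0,   # half precision
    "int8": 1.0,   # 8-bit quantized
    "int4": 0.5,   # 4-bit quantized
}

for fmt, b in bytes_per_param.items():
    gib = params * b / 2**30  # bytes -> GiB
    print(f"{fmt}: ~{gib:.1f} GiB")
```

At 4-bit quantization the weights alone come to roughly 1.4 GiB, which is within reach of a modern phone, though sustained inference speed and battery drain are a separate question.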