Live Science

Technology

Large language models can be squeezed onto your phone — rather than needing thousands of servers to run — after breakthrough

A new algorithm could compress AI models so they fit onto a smartphone or laptop.

The new algorithm, Calibration Aware Low precision Decomposition with Low-Rank Adaptation (CALDERA), compresses the massive amounts of data needed to run a large language model (LLM). The results could pave the way for LLMs to be stored and run on smartphones or laptops in the future.
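As the name suggests, the approach combines low-precision storage with a low-rank correction: a weight matrix is replaced by a coarsely quantized copy plus a small low-rank term that recovers some of the lost detail. The sketch below is a simplified illustration of that general idea, not CALDERA itself — the uniform rounding, the rank budget `k`, and the matrix sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))  # stand-in for one LLM weight matrix

# Step 1: coarse low-precision approximation. CALDERA uses
# calibration-aware quantization; plain uniform rounding stands in here.
scale = np.abs(W).max() / 7  # roughly a 4-bit grid (15 levels)
Q = np.round(W / scale) * scale

# Step 2: low-rank correction of the quantization residual via truncated SVD.
k = 32  # rank budget (illustrative)
U, S, Vt = np.linalg.svd(W - Q, full_matrices=False)
L = U[:, :k] * S[:k]
R = Vt[:k, :]

# Compressed representation: Q (few bits per entry) plus two thin factors.
W_hat = Q + L @ R
err_q = np.linalg.norm(W - Q) / np.linalg.norm(W)
err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"quantization-only error: {err_q:.3f}, with low-rank fix: {err:.3f}")
```

The low-rank term costs only two thin matrices, yet it shrinks the reconstruction error below what quantization alone achieves — the trade-off that lets a compressed model stay usable.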
