A small team of AI researchers at Adobe Inc., working with a colleague from Auburn University and another from Georgia Tech, has developed a small language model (SLM) that they claim can run locally on a smartphone with no access to the cloud. The group has written a paper describing the new app, which they call SlimLM, and posted it to the arXiv preprint server.
As large language model (LLM) technology matures, researchers across the globe keep finding new ways to improve it. In this new effort, the team has found a way to cut the cord for one specific type of AI application: processing documents locally.
As LLMs such as ChatGPT become more popular, users have become more worried about privacy. And it is not just individuals—companies large and small have adopted AI applications that assist with a variety of business processes, some of which require a high degree of privacy.
The reason LLMs are not private right now is that much of their processing, and most of their storage, happens on cloud servers, which can be hacked. The obvious solution, as people in the field have noted, is to cut the cord and run SLMs locally, with no need for the cloud, removing those privacy concerns.
Some of the biggest players in the field have been working toward that end—Google, Apple and Meta have all developed apps that can be run without accessing the cloud. But none so far are being used in the real world. That is where SlimLM differs, at least according to the team. They plan to make the app available to users “soon.”
The researchers acknowledge that their product can run locally because of its specificity: it is not a chatbot or a general-purpose tool. Instead, it handles specific document tasks, such as creating a summary or answering questions about a document's content. Because the model was trained only for document processing, it needs far fewer parameters; the smallest version currently runs with just 125 million. It also has far less work to do on the smartphone.
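For readers who want a feel for what task-specific, cloud-free inference looks like, the sketch below runs a small summarization model entirely on the local CPU using the Hugging Face transformers library. SlimLM itself has not been released, so the publicly available t5-small checkpoint (roughly 60 million parameters) stands in as an assumed model of comparable scale; none of this reflects Adobe's actual implementation.

```python
# A minimal sketch of local, cloud-free document summarization.
# SlimLM is not publicly available, so the "t5-small" checkpoint
# (~60M parameters) is used here purely as a stand-in for a small,
# task-specific model of roughly the same size class.
from transformers import pipeline

# device=-1 forces CPU-only inference; after the one-time checkpoint
# download, summarization runs entirely on the local machine.
summarizer = pipeline("summarization", model="t5-small", device=-1)

document = (
    "Adobe researchers describe SlimLM, a small language model designed "
    "to run entirely on a smartphone for document tasks such as "
    "summarization and question answering, with no cloud access."
)

result = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```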
The researchers suggest their app also represents a move toward more localized AI and a much higher degree of privacy across many kinds of applications.

https://techxplore.com/news/2024-11-adobe-slm-locally-cloud.html