Curious about DeepSeek but worried about privacy? These apps let you use an LLM without the internet

Most of us are used to using internet chatbots like ChatGPT and DeepSeek in one of two ways: via a web browser or via their dedicated smartphone apps. There are two drawbacks to this. First, their use requires an internet connection. Second, everything you type into the chatbot is sent to the companies’ servers, where it is analyzed and retained. In other words: the more you use the chatbot, the more the company knows about you. American lawmakers have voiced this worry about DeepSeek in particular.

But thanks to two innovative and easy-to-use desktop apps, LM Studio and GPT4All, you can bypass both of these drawbacks. With either app, you can run various LLMs directly on your computer. I’ve spent the past week playing around with both, and thanks to each, I can now use DeepSeek without the privacy concerns. Here’s how you can, too.

Run DeepSeek locally on your computer without an internet connection

To get started, simply download LM Studio or GPT4All on your Mac, Windows PC, or Linux machine. Once the app is installed, you’ll download the LLM of your choice into it from an in-app menu. I chose to run DeepSeek’s R1 model, but the apps support myriad open-source LLMs.

LM Studio can run DeepSeek’s reasoning model privately on your computer.

Once you’ve done the above you’ve essentially turned your personal computer into an AI server capable of running numerous open-source LLMs, including ones from DeepSeek and Meta. Next, simply open a new chat window and type away just as you would when using an AI chatbot on the web.
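The “AI server” framing is fairly literal: LM Studio, for example, can also expose the downloaded model through a local OpenAI-style chat-completions HTTP API (by default at http://localhost:1234/v1, though the port is configurable in the app). As a minimal sketch, here is what a request payload for that kind of endpoint looks like; the model identifier below is a hypothetical placeholder, since the real name depends on which model you downloaded:

```python
import json

# Build an OpenAI-style chat-completions payload for a locally hosted model.
# The model name is an assumption for illustration -- substitute whatever
# identifier the app shows for the model you actually installed.
def build_chat_request(prompt: str, model: str = "deepseek-r1-distill-qwen-7b") -> str:
    payload = {
        "model": model,
        "messages": [
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.7,
    }
    return json.dumps(payload)

body = build_chat_request("Can you teach me how to make a birthday cake?")

# To actually send it, LM Studio's local server must be running, e.g.:
#   curl http://localhost:1234/v1/chat/completions \
#        -H "Content-Type: application/json" -d @request.json
```

Because the endpoint lives on your own machine, the request never crosses the internet; everything happens over localhost.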

The best thing about both apps is that they are free for general consumer use, they can run several open-source LLMs (you choose which, and you can swap between them at will), and if you already know how to use an AI chatbot in a web browser, you already know how to use the chatbot in these apps.

But there are additional benefits to running LLMs locally on your computer, too.

The benefits of using an LLM locally

I’ve been running DeepSeek’s reasoning model on my MacBook for the past week without so much as a hiccup in both LM Studio and GPT4All. One of the coolest things about interacting with DeepSeek in this way is that no internet is required. Since the LLM is hosted directly on your computer, you don’t need any kind of data connection to the outside world to use it.

Running LLMs like DeepSeek in apps like GPT4All can help keep your data secure.

Or as GPT4All’s lead developer, Adam Treat, puts it, “You can use it on an airplane or at the top of Mount Everest.” This is a major boon to business travelers stuck on long flights and those working in remote, rural areas. 

But if Treat had to sum up the biggest benefit of running DeepSeek locally on your computer, he would do it in one word: “Privacy.”

“Every online LLM is hosted by a company that has access to whatever you input into the LLM. For personal, legal, and regulatory reasons this can be less than optimal or simply not possible,” Treat explains. 

For individuals, this can present privacy risks; for those who upload business or legal documents into an LLM to summarize, it could put their company and its data in jeopardy.

“Uploading that [kind of data] to an online server risks your data in a way that using it with an offline LLM will not,” Treat notes. An offline LLM running locally on your own computer doesn’t put your data at risk because, as Treat puts it, “Your data simply never leaves your machine.”

This means, for example, that if you want DeepSeek to help you summarize a report you wrote, you can upload it into the DeepSeek model stored locally on your computer via GPT4All or LM Studio and rest assured that the information in that report isn’t being sent to the LLM maker’s servers.

The drawbacks of using an LLM locally

However, there are drawbacks to running an LLM locally. The first is that you’re limited to the open-source models available, which may lag behind the models offered through a chatbot’s official website. And because only open-source models can be installed, you can’t use apps like GPT4All or LM Studio to run OpenAI’s ChatGPT locally on your computer.

Another disadvantage is speed. 

“Because you are using your own hardware (your laptop or desktop) to power the AI, the speed of responses will be generally slower than an online server,” Treat says. And since AI models rely heavily on RAM to perform their computations, the amount of RAM you have in your computer can limit which models you can install in apps like GPT4All and LM Studio.
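As a rough rule of thumb (my own approximation, not either app’s official sizing guidance), a model’s memory footprint is roughly its parameter count times the bytes stored per weight, plus some working overhead; the 20% overhead figure below is an assumption. This is why the quantized (e.g., 4-bit) versions offered in these apps fit on ordinary laptops while full-precision versions often don’t:

```python
def approx_ram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough RAM estimate: parameters x bytes per weight, with ~20%
    added for activations and runtime buffers (an assumed overhead)."""
    bytes_per_weight = bits_per_weight / 8
    return round(params_billions * bytes_per_weight * overhead, 1)

# An 8B-parameter model quantized to 4 bits needs on the order of 5 GB of RAM,
# while the same model at 16-bit precision needs roughly four times as much.
print(approx_ram_gb(8, 4))    # prints 4.8
print(approx_ram_gb(8, 16))   # prints 19.2
```

Comparing these estimates against your machine’s installed RAM is a quick way to guess whether a given model in the in-app catalog will run comfortably.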

“As online servers are usually powered by very high-end hardware they are generally going to be faster and have more memory allowing for very fast responses by very large models,” explains Treat.

Still, in my testing of both LM Studio and GPT4All over the past week, I don’t think the slower responses are a dealbreaker. When I used DeepSeek’s R1 reasoning model on the web, the version hosted on servers in China took 32 seconds to answer the prompt “Can you teach me how to make a birthday cake?” The local DeepSeek R1 model took 84 seconds in LM Studio and 82 seconds in GPT4All.

I’ve found that the benefits of running DeepSeek locally on my device using LM Studio and GPT4All far outweigh the extra waiting time required to get a response. Without a doubt, being able to access a powerful AI model like DeepSeek’s R1 locally on my computer anywhere at any time without an internet connection—and knowing the data I enter into it remains private—is a trade-off worth making.

https://www.fastcompany.com/91285738/curious-about-deepseek-but-worried-about-privacy-these-apps-let-you-use-an-llm-without-the-internet

Created 23d | Mar 1, 2025, 11:30:05

