How to Run AI with No Internet

Using LM Studio with Meta & DeepSeek models on your laptop

In partnership with The Rundown

👋 Alex here!

Thanks for tuning in.

Our mission is to help Account Executives use AI in practical ways to save time and get their sanity back. Hundreds of sellers have signed up from the likes of Databricks, HubSpot, LinkedIn, Clay, and even Salesforce.

This week’s post will show you how to run AI locally on your desktop, for free. Not because we want to save money (though that’s a nice perk), but because you’ll be able to use AI on planes, trains, or even out in the wilderness.

By using LMStudio.ai, you can download ‘distilled’ open-source LLMs, like Meta’s Llama or DeepSeek, and run them locally without an internet connection.

Now for the guide to set it up.

P.S. Check out my other newsletter on GTM Engineering: Claymation

Objective

Learn how to download open source LLMs and run them locally on your laptop.

Trust me, this is MUCH easier than you’re imagining right now…

Why run locally?

You can use AI when you don’t have internet access, or when your connection is slow or spotty.

Or when you do have internet access but want to keep your sensitive data safe, so you don’t expose customer information or company IP.

What’s cool is that you get more control to customize the AI, and you can even chat with your own documents.

Plus, it’s free. Kinda crazy, right?

FREE AI

What is LM Studio?

LM Studio is a desktop app for discovering, downloading, and running LLMs locally. You can also download compatible models straight from Hugging Face repositories.

The UI is your typical chat interface, like ChatGPT.

The best part: it’s free, and so are the open-source LLMs you’ll run in it.

That’s crazy…. FREE AI.
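Bonus for the tinkerers: LM Studio can also serve whatever model you’ve loaded through a local, OpenAI-compatible API (you flip the server on inside the app; the default address is http://localhost:1234). Here’s a minimal sketch of calling it from Python. The model name is just whatever you loaded, and the API key can be any placeholder string:

```python
# Requires: pip install openai
# Assumes LM Studio's local server is running on its default port (1234).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",  # LM Studio doesn't check the key, but the client needs one
)

response = client.chat.completions.create(
    model="llama-3.2-1b-instruct",  # whichever model you loaded in LM Studio
    messages=[
        {"role": "user", "content": "Draft a two-sentence follow-up email to a prospect."}
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint mimics OpenAI’s API, most tools that let you point at a custom base URL will work with it unchanged.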

How to Set Up LM Studio & Run LLMs Locally

Step 1: Download LM Studio

Head to lmstudio.ai

Download the version that corresponds to your laptop’s operating system.

LM Studio

Step 2: Drag Into Your Applications Folder

Step 3: Search for Model & Download

There are two types of LLMs:

  • The ‘Teacher’ LLM - these are the large models we’re familiar with and typically interact with, such as GPT-4o.

  • The ‘Student’ LLM - these are more compact, distilled LLMs derived from the teacher through knowledge distillation.

We need to use a distilled version because the parent LLMs are too big to fit on a laptop.

The ‘1.5B’ through ‘70B’ labels refer to the size of each model variant - the number of parameters, in billions.
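As a rough rule of thumb (an estimate, not an exact figure), a model’s memory footprint is its parameter count times the bytes each parameter takes at a given quantization level. A quick back-of-the-envelope check of what fits on a laptop:

```python
def approx_model_size_gb(params_billions: float, bits_per_param: int) -> float:
    """Very rough size estimate: parameters x bytes per parameter.

    Real model files add overhead, and you need extra RAM for the
    context window, so treat these numbers as a floor, not a ceiling.
    """
    total_bytes = params_billions * 1e9 * (bits_per_param / 8)
    return total_bytes / 1e9

# A 1.5B model at 4-bit quantization: ~0.75 GB - trivial for any laptop.
print(approx_model_size_gb(1.5, 4))

# A 7B model at 4-bit: ~3.5 GB - comfortable on most modern laptops.
print(approx_model_size_gb(7, 4))

# A 70B model at 4-bit: ~35 GB - out of reach for typical laptop RAM.
print(approx_model_size_gb(70, 4))
```

That’s why the small distilled variants are the ones you run locally, while the biggest variants stay in the data center.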

What is knowledge distillation?

Knowledge distillation is a machine learning technique that aims to transfer the learnings of a large pre-trained model, the “teacher model,” to a smaller “student model.” It’s used in deep learning as a form of model compression and knowledge transfer, particularly for massive deep neural networks.

The goal of knowledge distillation is to train a more compact model to mimic a larger, more complex model. Whereas the objective in conventional deep learning is to train an artificial neural network to bring its predictions closer to the output examples provided in a training data set, the primary objective in distilling knowledge is to train the student network to match the predictions made by the teacher network.
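To make that concrete, here’s a minimal sketch of the core training idea in PyTorch. This is a toy illustration, not Meta’s or DeepSeek’s actual distillation code: the student is nudged toward the teacher’s softened output distribution (via KL divergence), blended with the ordinary loss against ground-truth labels:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Toy knowledge-distillation loss, not production code.

    The student is pushed toward the teacher's softened predictions,
    blended with the usual cross-entropy against the true labels.
    """
    # Soften both distributions with a temperature, then measure how far
    # the student's distribution is from the teacher's (KL divergence).
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd_loss = F.kl_div(log_soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2

    # Standard hard-label loss so the student still learns the task itself.
    ce_loss = F.cross_entropy(student_logits, labels)

    return alpha * kd_loss + (1 - alpha) * ce_loss
```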

Searching & downloading an LLM locally via LM Studio

Step 4: Load & Run Model

This is Llama 3.2 1B from Meta:

Testing out Llama 3.2 locally

Now for DeepSeek. The transparency of its reasoning is pretty cool:

DeepSeek Reasoning Example
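If you ever want to separate that visible reasoning from the final answer programmatically: the DeepSeek R1 distills wrap their chain of thought in <think> tags before answering. A small sketch, assuming you have the model’s raw output as a string (other models won’t emit these tags):

```python
import re

def split_reasoning(raw_output: str) -> tuple[str, str]:
    """Split a DeepSeek R1-style response into (reasoning, answer).

    Assumes the model wraps its chain of thought in <think>...</think>,
    as the R1 distills do.
    """
    match = re.search(r"<think>(.*?)</think>", raw_output, re.DOTALL)
    if not match:
        return "", raw_output.strip()
    return match.group(1).strip(), raw_output[match.end():].strip()

reasoning, answer = split_reasoning(
    "<think>They asked for a short reply, keep it brief...</think>Sure! Here's a draft."
)
print("REASONING:", reasoning)
print("ANSWER:", answer)
```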

Stay up-to-date with AI

The Rundown is the most trusted AI newsletter in the world, with 1,000,000+ readers and exclusive interviews with AI leaders like Mark Zuckerberg, Demis Hassabis, Mustafa Suleyman, and more.

Their expert research team spends all day learning what’s new in AI and talking with industry experts, then distills the most important developments into one free email every morning.

Plus, complete the quiz after signing up and they’ll recommend the best AI tools, guides, and courses – tailored to your needs.

