Run DeepSeek-R1 Locally for Free in Just 3 Minutes!

#deepseek #ai #developer #coding

DeepSeek-R1 has been creating quite a buzz in the AI community. Developed by the Chinese AI company DeepSeek, the model is being compared to OpenAI's top models. The excitement around DeepSeek-R1 comes not just from its capabilities but also from the fact that it is openly released, allowing anyone to download and run it locally. In this blog, I'll guide you through setting up DeepSeek-R1 on your machine using Ollama.

Why DeepSeek-R1?


DeepSeek-R1 stands out for several reasons:

- Its reasoning performance is being compared to OpenAI's top models.
- It is openly released, so anyone can download the weights and run the model locally.
- It comes in a range of sizes, from 1.5b up to 671b parameters, so you can pick one that fits your hardware.

Getting Started with Ollama


Before we begin, let's discuss Ollama. Ollama is a free, open-source tool that lets users run large language models locally. With Ollama, you can easily download and run the DeepSeek-R1 model.

Step 1: Install Ollama

First, you'll need to download and install Ollama. Visit the Ollama website and download the version that matches your operating system. Follow the installation instructions provided on the site.


Step 2: Download DeepSeek-R1

On the Ollama website's page for DeepSeek-R1, you can see the different parameter sizes available for the model, along with the hardware requirements for each.

You can run the 1.5b, 7b, 8b, 14b, 32b, 70b, or 671b versions; the hardware requirements grow as you choose larger models. I used the 7b version in this tutorial. To request a specific size, append its tag to the model name, for example ollama run deepseek-r1:1.5b.

Once Ollama is installed, open your terminal and type the following command to download and start the DeepSeek-R1 model:

ollama run deepseek-r1

If the model is not already on your machine, this command first tells Ollama to download it. Depending on your internet speed, this might take some time. Grab a coffee while it completes!


Step 3: Verify Installation

After downloading, verify the installation by running:

ollama list

You should see deepseek-r1 in the list of available models. If you do, great job! You're ready to run the model.
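If you prefer to check programmatically, Ollama also exposes a local HTTP API (by default on port 11434), and a GET request to /api/tags returns the installed models as JSON. Here is a minimal Python sketch; the helper name installed_model_names is my own, not part of Ollama:

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def installed_model_names(tags_response: dict) -> list:
    """Extract model names from the JSON returned by GET /api/tags."""
    return [m["name"] for m in tags_response.get("models", [])]


if __name__ == "__main__":
    # Requires a running Ollama instance on the default port.
    try:
        with urlopen(f"{OLLAMA_URL}/api/tags") as resp:
            tags = json.load(resp)
        print(installed_model_names(tags))
    except OSError:
        print("Ollama does not appear to be running on", OLLAMA_URL)
```

If deepseek-r1 shows up in the printed list, the download succeeded.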


Step 4: Run DeepSeek-R1

Now, let's start the model using the command:

ollama run deepseek-r1

And just like that, you're interacting with DeepSeek-R1 locally. It's that simple!
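Beyond the interactive terminal session, you can also query the model from your own scripts through Ollama's local HTTP API: a POST to /api/generate with a model name and prompt returns the completion as JSON. Below is a small Python sketch under that assumption; the helper names build_generate_request and ask are mine, not part of Ollama:

```python
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def build_generate_request(model: str, prompt: str) -> dict:
    """Payload for Ollama's POST /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the full reply."""
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = Request(f"{OLLAMA_URL}/api/generate", data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)["response"]


if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled.
    try:
        print(ask("deepseek-r1", "Why is the sky blue? Answer in one sentence."))
    except OSError:
        print("Start Ollama first (ollama run deepseek-r1), then rerun this script.")
```

With stream set to False, the server returns one JSON object whose response field holds the entire answer, which keeps the client code simple.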

Step 5: Ask a Query

The model also performs well on coding tasks, so let's try one of those next.

Below is a complete step-by-step video of using DeepSeek-R1 for different use cases.


Additional Tips
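One tip for longer answers: if you set stream to true in the API request, Ollama sends the reply incrementally as newline-delimited JSON chunks, each carrying a fragment of text in its response field, so you can display output as it is generated rather than waiting for the full answer. Here is a Python sketch under that assumption; the helper name join_stream_chunks is my own:

```python
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def join_stream_chunks(lines) -> str:
    """Concatenate the 'response' fragments from a stream of JSON lines,
    stopping at the chunk marked done."""
    text = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)


if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled.
    payload = json.dumps({"model": "deepseek-r1",
                          "prompt": "Write a haiku about local LLMs.",
                          "stream": True}).encode()
    req = Request(f"{OLLAMA_URL}/api/generate", data=payload,
                  headers={"Content-Type": "application/json"})
    try:
        with urlopen(req) as resp:
            # Iterating the response yields one JSON chunk per line.
            print(join_stream_chunks(resp))
    except OSError:
        print("Ollama is not reachable on", OLLAMA_URL)
```

In a real chat-style UI you would print each fragment as it arrives instead of joining them at the end.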

My First Impression About DeepSeek-R1

My first impression of DeepSeek-R1 is simply mind-blowing: it's fast, efficient, and incredibly versatile for coding and problem-solving tasks.
