Install and Run Cherry Studio Using Ollama on Windows

Cherry Studio is a powerful, open-source desktop application designed as a unified front-end for large language models (LLMs). It integrates smoothly with both local LLM engines like Ollama and popular cloud-based services, providing Windows users with a flexible, privacy-focused AI experience.

This guide walks you through installing and running Cherry Studio with Ollama on Windows, including setup, configuration, troubleshooting, and advanced usage tips.


Overview of Cherry Studio and Ollama

Cherry Studio

  • Cross-platform desktop client for LLMs (Windows, macOS, Linux)
  • Supports OpenAI, Gemini, Anthropic, and local backends like Ollama
  • Features include chat, RAG (Retrieval-Augmented Generation), agents, and productivity tools
  • Designed for privacy: run local models without any cloud connection

Ollama

  • Open-source tool for running LLMs locally
  • Supports models such as Llama 2, Mistral, DeepSeek, and Gemma
  • Offers a CLI and OpenAI-compatible API
  • Keeps all interactions private by processing data locally

System Requirements

Cherry Studio

  • OS: Windows 10 or 11
  • CPU: Modern x64 processor
  • RAM: Minimum 8 GB (16 GB+ recommended)
  • Disk Space: 2 GB+ for the app, more for models
  • GPU: Strongly recommended (NVIDIA GPU, 8 GB+ VRAM)

Ollama

  • OS: Best on Windows 11 (limited support on Windows 10)
  • CPU: x64 architecture
  • RAM: 8 GB minimum
  • GPU: Recommended for large models; CPU-only works for smaller ones
  • Disk Space: Depends on model size (hundreds of MBs to several GBs)

Step 1: Download and Install Cherry Studio

  1. Visit the Official Website
    Go to the Cherry Studio download page.
  2. Download the Windows Installer
    Choose the .exe installer for a typical setup or the portable version if preferred.
  3. Handle Browser Warnings
    If your browser flags the file, choose “Keep” or allow the download.
  4. Run the Installer
    Double-click the installer and follow the setup wizard. Once done, launch Cherry Studio.

Step 2: Download and Install Ollama

  1. Go to Ollama’s Website
    Visit https://ollama.com.
  2. Download the Windows Installer
    Select the Windows version and start the download.
  3. Install Ollama
    Run the installer and complete the setup.

Verify Installation
Open PowerShell or Command Prompt and run:

ollama --version

If the command is not recognized, restart your terminal, or confirm that the Ollama install directory is on your system’s PATH.
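If you prefer to check from a script, Python’s standard library can confirm whether the binary is reachable on PATH (a small sketch; assumes Python 3 is installed):

```python
import shutil

# shutil.which returns the full path to the ollama executable if it is
# on PATH, or None if it cannot be found.
path = shutil.which("ollama")
if path:
    print(f"ollama found at: {path}")
else:
    print("ollama is not on PATH; restart the terminal or re-run the installer")
```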


Step 3: Download and Run a Model with Ollama

  1. Open PowerShell or Command Prompt
  2. Download a Model
    Pull the model you want:

ollama pull gemma3:1b

  3. Run the Model
    Start an interactive session (this also downloads the model automatically if it isn’t already present):

ollama run gemma3:1b

  4. Leave Ollama Running
    Keep the terminal open so Cherry Studio can connect to Ollama’s local API.

Replace gemma3:1b with your desired model name.
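Ollama also exposes a local HTTP API on port 11434, which is what Cherry Studio connects to. The sketch below sends a one-off prompt to the /api/generate endpoint using only Python’s standard library; it assumes the model named in MODEL has already been pulled, and fails gracefully if the server is not running:

```python
import json
import urllib.request
import urllib.error

MODEL = "gemma3:1b"  # assumes this model was pulled earlier

# Non-streaming request body for Ollama's /api/generate endpoint.
payload = {"model": MODEL, "prompt": "Say hello in one sentence.", "stream": False}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=60) as resp:
        body = json.load(resp)
        print(body.get("response", ""))
except (urllib.error.URLError, OSError) as err:
    print(f"Could not reach Ollama at localhost:11434: {err}")
```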


Step 4: Configure Cherry Studio to Use Ollama

  1. Open Cherry Studio
  2. Go to Settings
    Click the gear icon in the left navigation panel.
  3. Open Model Providers
    Navigate to the “Model Providers” or “Model Services” tab.
  4. Add Ollama as a Provider
    Click on “Ollama” and enable it.
  5. Configure API Details
    • API Address: http://localhost:11434
    • API Key: Leave blank (not required)
    • Session Timeout: Set preferred duration in minutes
  6. Add Downloaded Models
    Click “+ Add” and input model names like gemma3:1b. Use “Manage” to edit or remove.

Step 5: Use Cherry Studio with Ollama

  1. Start a New Chat
  2. Select Ollama as the Provider
  3. Choose Your Model
    Pick from the list of added models (e.g., gemma3:1b).
  4. Begin Chatting
    Enter your prompt and receive responses from the locally running model. All processing is handled on your machine.
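Cherry Studio talks to Ollama through Ollama’s OpenAI-compatible API, and you can exercise the same endpoint yourself to confirm that responses are generated locally. A sketch using only the standard library (assumes gemma3:1b is installed; no API key is needed because Ollama ignores it):

```python
import json
import urllib.request
import urllib.error

# The same style of endpoint Cherry Studio uses: OpenAI-compatible chat completions.
payload = {
    "model": "gemma3:1b",  # assumes this model is pulled locally
    "messages": [{"role": "user", "content": "What is 2 + 2?"}],
}
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=60) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
except (urllib.error.URLError, OSError) as err:
    print(f"Ollama not reachable: {err}")
```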

Troubleshooting Common Issues

Ollama Not Running

  • Confirm Ollama is open and running
  • Verify http://localhost:11434 is accessible
  • Run Ollama manually and check for terminal errors

Model Not Found in Cherry Studio

  • Ensure the model is downloaded via ollama pull
  • Double-check the model name is correct in Cherry Studio

Performance Problems

  • Use a smaller model if system resources are limited
  • Ensure GPU drivers are updated for optimal acceleration

Language or UI Confusion

  • Some UI text may appear in Chinese; check the app’s language setting, or translate the text if needed

Advanced Configuration and Features

Multiple LLM Providers

Integrate local and cloud models for maximum flexibility.

RAG Support

Enhance context via document search or web connections.

Agent Workflows

Use Cherry Studio's agents to automate tasks and connect to external APIs.

Model Management

Switch between multiple local models via the Ollama integration.

UI Customization

Enable dark mode and adjust keyboard shortcuts for a personalized setup.


Security and Privacy Benefits

  • Local Processing: No data leaves your device
  • Offline Capability: Works without an internet connection
  • Transparency: Both tools are fully open-source

Cherry Studio + Ollama vs. Cloud-Based LLMs

Feature              Cherry Studio + Ollama    Cloud-Based LLMs (e.g., ChatGPT)
Data Privacy         100% local                Cloud-processed
Offline Use          Yes                       No
Customization        Full                      Limited
Cost                 Free (open source)        May require subscription
Hardware Needs       Higher (local models)     Lower
Integration Options  Flexible                  Restricted

Tips for the Best Experience

  • Use a GPU: NVIDIA GPUs (8 GB+ VRAM) improve performance significantly
  • Monitor Resources: Close unused apps to free up RAM/VRAM
  • Stay Updated: Regularly check for new releases of Cherry Studio and Ollama
  • Explore Model Options: Try various models to see what fits your needs

Conclusion

By installing Cherry Studio and Ollama on your Windows machine, you unlock a private, customizable, and powerful AI setup. Whether you’re experimenting with LLMs, building workflows, or protecting sensitive data, this local-first solution offers full control with zero reliance on the cloud.

FAQ

  1. Is Cherry Studio free?
    Yes, Cherry Studio is free to use. Some LLMs may require a paid API key.
  2. Can I use local models?
    Yes, with support for Ollama and compatible local models.
  3. Which file types are supported?
    Cherry Studio handles text, images, Office files, and PDFs.
  4. Are plugins supported?
    Yes, via mini-programs that extend functionality.
