Install and Run Cherry Studio on Windows: A Complete Guide

Cherry Studio is a powerful, open-source desktop client designed to help you interact with large language models (LLMs) from various providers—including OpenAI, Gemini, local models, and more.

With cross-platform compatibility, a modern UI, and robust productivity tools, Cherry Studio is ideal for developers, researchers, writers, and anyone seeking to harness AI in their workflow.


Overview of Cherry Studio

What is Cherry Studio?

  • A desktop application supporting multiple LLM providers: OpenAI, Gemini, d.run, Ollama, and more.
  • Available on Windows, macOS, and Linux.
  • Designed for tasks like multi-model chat, document analysis, translation, assistant creation, and more.

Key Features

  • Multi-Provider Support: Connect to OpenAI, Gemini, and local models.
  • AI Assistant Management: Use or customize 300+ built-in assistants.
  • Document Processing: Upload PDFs, images, or text to generate searchable knowledge bases.
  • Modern UI: User-friendly interface with theme switching and workspace customization.
  • Productivity Tools: Translation, global search, chat organization, and multi-agent workflows.

Step 1: Download Cherry Studio for Windows

Where to Download

Cherry Studio's Windows installer is published on the project's GitHub Releases page; the official site links to the same builds.

Download Instructions

  1. Navigate to the GitHub Releases page.
  2. Download the latest CherryStudio-Setup-x64.exe.
  3. Optionally, verify the download using the provided checksum.
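If you want to verify the download, a small script can compare the installer's SHA-256 digest against the checksum published on the Releases page. This is a minimal sketch (the file and checksum values are whatever you downloaded, not fixed names):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, expected_hex: str) -> bool:
    """Compare a file's digest with the published checksum (case-insensitive)."""
    return sha256_of(path).lower() == expected_hex.strip().lower()
```

For example, `verify("CherryStudio-Setup-x64.exe", "<checksum from the Releases page>")`. On Windows you can get the same digest without Python via `certutil -hashfile CherryStudio-Setup-x64.exe SHA256`.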

Step 2: Install Cherry Studio

Installation Steps

  1. Run Installer: Double-click the .exe file.
  2. Bypass Security Prompt: If Windows SmartScreen warns that the app is unrecognized, click “More info” > “Run anyway”.
  3. Follow Setup Wizard:
    • Accept the license.
    • Choose install location.
    • Install for all users or just yourself.
  4. Complete Installation: Click “Finish” to close the setup.

Cherry Studio will now be accessible via Start Menu or Desktop shortcut.


Step 3: First Launch & Interface Overview

First Launch

  • Open Cherry Studio from Start Menu or desktop.
  • Allow through Windows Firewall when prompted.

UI Overview

  • Sidebar: Navigate between chat, assistants, knowledge base, and settings.
  • Main Panel: View chat history, settings, or documents.
  • Footer Bar: Theme toggle, language settings, quick preferences.

Step 4: Connect to LLM Providers

Adding a Model Provider

  1. Click the Settings (gear icon).
  2. Go to the Model Providers section.
  3. Click Add and choose a provider:
    • OpenAI
    • Gemini
    • d.run
    • Ollama
    • Any OpenAI-compatible API
  4. Fill in:
    • API Key
    • API Endpoint (e.g., https://api.openai.com)
  5. Click Save.
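Cherry Studio sends these requests for you, but it can help to sanity-check a key and endpoint outside the app. The hypothetical snippet below assembles the request shape an OpenAI-compatible chat endpoint expects; the model name is whatever your provider offers:

```python
import json

def build_chat_request(endpoint: str, api_key: str, model: str, prompt: str):
    """Assemble the URL, headers, and body of an OpenAI-compatible chat call."""
    url = endpoint.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # the API Key field in Cherry Studio
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body
```

If a request built this way fails with an HTTP client (401, 404), the same key/endpoint pair will fail inside Cherry Studio too.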

Supported Providers

  • OpenAI: GPT-3.5, GPT-4, and newer models.
  • Gemini: Google’s family of LLMs.
  • d.run: Lightweight hosting solution.
  • Ollama: For running local LLMs.
  • Custom APIs: As long as they support OpenAI-like interfaces.

Step 5: Explore Cherry Studio Features

A. Multi-Model Conversations

  • Start conversations with any connected provider.
  • Easily switch models during a session.
  • Use drag-and-drop and smart naming for chat organization.

B. AI Assistant Creation

  • Choose from 300+ prebuilt assistants.
  • Customize prompts, instructions, and model preferences.
  • Save assistants for fast access.

C. Document & File Processing

  • Upload PDFs, DOCX, PPTX, images, or text files.
  • Generate searchable knowledge bases.
  • Supported formats: .pdf, .docx, .pptx, .xlsx, .txt, .md, .mdx.
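Before a bulk upload, you can pre-filter a folder against the supported extensions listed above. A small sketch (the function name is illustrative, not part of Cherry Studio):

```python
from pathlib import Path

# Extensions Cherry Studio can index, per the list above.
SUPPORTED = {".pdf", ".docx", ".pptx", ".xlsx", ".txt", ".md", ".mdx"}

def importable(paths):
    """Split a batch of files into indexable ones and the rest."""
    ok, skipped = [], []
    for p in map(Path, paths):
        (ok if p.suffix.lower() in SUPPORTED else skipped).append(p.name)
    return ok, skipped
```

For example, `importable(["report.pdf", "setup.exe", "notes.MD"])` keeps the PDF and Markdown files and skips the executable.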

D. Knowledge Base & Vectorization

  1. Open the Knowledge Base section.
  2. Click Add to create a new knowledge base.
  3. Upload files or connect data sources:
    • Local folders
    • URLs or sitemaps
    • Custom notes or pasted text

Cherry Studio automatically embeds and vectorizes content for semantic retrieval.
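Conceptually, "embed and vectorize for semantic retrieval" means mapping each chunk of text to a vector and ranking chunks by similarity to the query. The toy sketch below uses bag-of-words counts instead of a real neural embedding model, purely to illustrate the retrieval step:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; real systems use neural embedding models."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

Swap `embed` for an actual embedding model and store the vectors in an index, and you have the pipeline Cherry Studio runs behind the Knowledge Base feature.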

E. Translation & Multilingual Support

  • Built-in tools to translate chats and documents.
  • Supports multiple languages with inline translation.

F. UI Customization

  • Switch between light and dark modes.
  • Adjust layout, colors, and font sizes under settings.

Step 6: Advanced Usage

A. Use Local Models (e.g., Ollama)

  • Install and run Ollama separately.
  • In Cherry Studio, add it as a provider via its local API endpoint.
  • Enables private, offline LLM interaction using your hardware.
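Ollama serves its API on `http://localhost:11434` by default, which is the endpoint you paste into Cherry Studio's provider settings. A quick sketch to check the server is up before adding it:

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def ollama_reachable(base_url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Probe the local Ollama server before adding it as a provider."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200  # Ollama replies with a simple status page
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start Ollama (e.g., run `ollama serve` or launch the desktop app) before configuring the provider.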

B. Multi-Agent & RAG (Retrieval-Augmented Generation)

  • Create agent workflows for advanced use cases.
  • Use RAG to combine live LLM answers with vectorized knowledge base content.
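At its core, RAG just means stuffing the retrieved knowledge-base chunks into the prompt before the model answers. This is a conceptual sketch of that assembly step, not Cherry Studio's actual prompt template:

```python
def build_rag_prompt(question: str, retrieved_chunks: list[str]) -> str:
    """Combine retrieved knowledge-base chunks with the user's question."""
    context = "\n\n".join(f"- {c}" for c in retrieved_chunks)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
```

The resulting string is what gets sent to the LLM, so its answers are grounded in your documents rather than the model's training data alone.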

C. Bonus Productivity Features

  • Global Search: Find content across all chats, docs, and bases.
  • Topic Organization: Group assistants and chats by category or project.
  • File Access via WebDAV: Sync and manage files in the cloud.

Step 7: Troubleshooting & Pro Tips

Common Issues

  • Installer Won’t Launch: Make sure you’re on Windows 10 or 11; older Windows versions aren’t supported.
  • Non-English Docs: Use Cherry Studio’s built-in translator or browser tools.
  • API Errors: Double-check keys and endpoint URLs.
  • Lag with Local Models: Ensure proper hardware (RAM, GPU).

Usage Tips

  • Keep Cherry Studio updated from GitHub or the official site.
  • Explore YouTube tutorials and GitHub discussions for help.
  • Contribute via pull requests or issue reports if you're a developer.

Summary Table: Quick Start Reference

  • Download: Get the .exe from GitHub or the official site.
  • Install: Run setup and complete installation.
  • Launch: Open from Start Menu or desktop shortcut.
  • Connect Providers: Add LLM APIs (OpenAI, Gemini, etc.).
  • Use Features: Chat, upload files, create assistants, manage knowledge bases.
  • Advanced Features: Local model integration, multi-agent workflows, RAG, global search.

Conclusion

Cherry Studio brings powerful LLM interaction to your desktop with an intuitive UI, flexible provider support, and a wealth of productivity tools. Whether you're building AI agents, managing documents, translating content, or experimenting with local models, Cherry Studio offers a seamless experience on Windows.

FAQs

  1. Is Cherry Studio free to use? Yes, it’s open-source and free for personal and professional use.
  2. Can I use Cherry Studio offline? Yes, pair it with local model providers like Ollama for offline AI capabilities.
  3. Which LLMs are supported? OpenAI, Gemini, d.run, Ollama, and any provider using OpenAI-compatible APIs.
  4. How do I update the app? Check the GitHub Releases page for the latest .exe installer.
