
HyperChat Dadigua

Use this command to install HyperChat:
winget install --id=BigSweetPotatoStudio.HyperChat -e

HyperChat is an open Chat client that can use various LLM APIs to provide the best Chat experience and implement productivity tools through the MCP protocol.

  • Supports OpenAI-style LLMs: OpenAI, Claude (OpenRouter), Qwen, Deepseek, GLM, Ollama.
  • Built-in MCP plugin market with user-friendly MCP installation and configuration, including one-click installation; submissions of MCPs for HyperChat are welcome.
  • Also supports manual installation of third-party MCPs; simply fill in command, args, and env.

HyperChat is a chat client designed to provide an open and flexible experience for interacting with large language models (LLMs). It supports various LLM APIs, including OpenAI, Claude, Qwen, Deepseek, GLM, Ollama, and more, enabling users to choose the best model for their needs. HyperChat also implements productivity tools through the MCP protocol, allowing for seamless integration of custom plugins.

Key Features:

  • Multi-LLM Support: Works with OpenAI-style LLMs, including Claude (via OpenRouter), Qwen, Deepseek, GLM, and Ollama.
  • MCP Plugin Market: Built-in plugin market with one-click installation for user-friendly setup of MCP plugins.
  • Custom Integration: Supports manual installation of third-party MCPs by specifying commands, arguments, and environments.
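
As a concrete illustration of those three fields, a manually added third-party MCP entry might look like the sketch below (TypeScript-style object notation); the server package, workspace path, and exact field layout are placeholders chosen for this example, not HyperChat defaults.

// Hypothetical manual entry for a third-party MCP server (sketch only).
const thirdPartyMcp = {
  // executable used to launch the MCP server
  command: "npx",
  // arguments passed to it; this example uses the public filesystem MCP server
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"],
  // environment variables the server needs (API keys, etc.), if any
  env: {},
};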

Audience & Benefit:
Ideal for developers, researchers, and tech enthusiasts seeking a customizable chat client to experiment with different LLMs and productivity tools. HyperChat empowers users to enhance their workflows through diverse AI-driven features.

HyperChat can be installed via winget, making it easy to integrate into your workflow.

README


Introduction

HyperChat is an open-source Chat client that supports MCP and can use APIs from various LLMs to achieve the best Chat experience and productivity tools.


  • Supports OpenAI-style LLMs: OpenAI, Claude (OpenRouter), Qwen, Deepseek, GLM, Ollama.
  • Fully supports MCP.

DEMO

Features:

  • 🪟Windows + 🍏MacOS + Linux
  • Run from the command line: npx -y @dadigua/hyper-chat (default port 16100, default password 123456, web access at http://localhost:16100/123456/)
  • Docker
    • Command-line version: docker pull dadigua/hyperchat-mini:latest (a run example follows this list)
    • Ubuntu desktop + Chrome + BrowserUse version (coming soon)
  • WebDAV with incremental sync support
  • Added HyperPrompt prompt syntax, supporting variables (text and JS code variables), basic syntax checking, and real-time hover preview.
  • MCP extension
  • Supports dark mode🌙
  • Supports MCP Resources, Prompts, and Tools
  • Supports English and Chinese
  • Supports Artifacts, SVG, HTML, Mermaid rendering
  • Supports defining Agents, with preset prompts and options for allowed MCPs
  • Supports scheduled tasks: assign Agents to complete tasks on a schedule and view their completion status.
  • Supports KaTeX, displaying mathematical formulas, code rendering with syntax highlighting and quick copy
  • Added RAG, based on MCP knowledge base
  • Introduced ChatSpace concept, supporting simultaneous chats in multiple conversations
  • Supports model comparison in chat
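
For the Docker command-line version mentioned in the list above, a minimal pull-and-run sketch follows; the -p 16100:16100 port mapping is an assumption based on the default web port noted earlier, so adjust it if the image serves on a different port.

# pull the command-line image
docker pull dadigua/hyperchat-mini:latest
# run it in the background, exposing the (assumed) default web port
docker run -d -p 16100:16100 dadigua/hyperchat-mini:latest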

TODO:

  • Support official Claude protocol

LLM

LLM                 Usability    Remarks
claude              ⭐⭐⭐⭐⭐        Needs no explanation
openai              ⭐⭐⭐⭐         Also supports multi-step function calls perfectly (gpt-4o-mini also works)
gemini flash 2.0    ⭐⭐⭐⭐         Very usable
qwen                ⭐⭐⭐⭐         Very usable
doubao              ⭐⭐⭐          Feels okay to use
deepseek            ⭐⭐⭐          Multi-step function calls may have issues

Usage

    1. Configure your API KEY and ensure that your LLM service is compatible with the OpenAI style (a request sketch follows this list).
    2. Ensure that uv, Node.js, and related tools are installed on your system.
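
"OpenAI style" here means the provider exposes the standard chat-completions route. The sketch below is one hypothetical way to check that from TypeScript (Node 18+); the base URL, API key variable, and model name are placeholders for your own provider's values, not anything HyperChat ships.

// Minimal compatibility check against an OpenAI-style endpoint (placeholders throughout).
const BASE_URL = "https://api.example.com/v1";   // your provider's OpenAI-compatible base URL
const API_KEY = process.env.LLM_API_KEY ?? "";   // your API key

async function checkChatCompletions(): Promise<void> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",                      // any model your provider serves
      messages: [{ role: "user", content: "Hello" }],
    }),
  });
  const data = await res.json();
  // OpenAI-style responses carry the reply under choices[].message.content
  console.log(data.choices?.[0]?.message?.content);
}

checkChatCompletions();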

uvx & uv

Install via the command line, or see the official GitHub tutorial for uv.

# macOS
brew install uv
# Windows
winget install --id=astral-sh.uv -e

npx & nodejs

Install via the command line, or download and install it from the official Node.js website.

# macOS
brew install node
# Windows
winget install OpenJS.NodeJS.LTS

Development

# install dependencies for the Electron app
cd electron && npm install
# install dependencies for the web UI
cd web && npm install
# install root dependencies
npm install
# start the development environment
npm run dev

Telegram

HyperChat User Communication

Super input: supports variables (text and JS code variables), with basic syntax checking and real-time hover preview


Chat supports model comparison


Click a tool's name to invoke it directly for debugging


Prompt when an MCP Tool is called + dynamically modify the parameters the LLM passes to the Tool


Supports quick input with @ and invoking Agents


Supports Artifacts, SVG, HTML, Mermaid rendering


Supports selecting MCPs + selecting a subset of their Tools


You can access it via the Web from anywhere, on any device, and set a password


Calling the terminal MCP to automatically analyze ASAR files + help unpack them


Calling the terminal, viewing the interface


Gaode Map MCP


One-click webpage creation and publishing (to Cloudflare)


Calling Google Search, asking what the TGA Game of the Year is


Asking "What are some limited-time free games?" and having it visit the website by calling a tool


Helps you open web pages, analyze results, and write to files


Using web tools + command-line tools to open a GitHub README for learning, git clone the repo, and set up the development environment


Multi-chat Workspace + Night mode


Scheduled task list + scheduling messages to an Agent to complete tasks


Install MCPs from third parties (any MCP is supported)


H5 (mobile web) interface


Testing model capabilities


Knowledge base


Disclaimer

  • This project is for learning and communication purposes only. If you use this project for any activity, such as web crawling, it has nothing to do with the developers of this project.