HyperChat Dadigua
```shell
winget install --id=BigSweetPotatoStudio.HyperChat -e
```
HyperChat is an open chat client that can use various LLM APIs to provide the best chat experience and implement productivity tools through the MCP protocol.
- Supports OpenAI-style LLMs: OpenAI, Claude (via OpenRouter), Qwen, Deepseek, GLM, Ollama.
- Built-in MCP plugin market with user-friendly MCP installation configuration and one-click install; submissions of MCPs for HyperChat are welcome.
- Also supports manual installation of third-party MCPs; simply fill in command, args, and env.
HyperChat is a chat client designed to provide an open and flexible experience for interacting with large language models (LLMs). It supports various LLM APIs, including OpenAI, Claude, Qwen, Deepseek, GLM, Ollama, and more, enabling users to choose the best model for their needs. HyperChat also implements productivity tools through the MCP protocol, allowing for seamless integration of custom plugins.
Key Features:
- Multi-LLM Support: Works with OpenAI-style LLMs, including Claude (via OpenRouter), Qwen, Deepseek, GLM, and Ollama.
- MCP Plugin Market: Built-in plugin market with one-click installation for user-friendly setup of MCP plugins.
- Custom Integration: Supports manual installation of third-party MCPs by specifying commands, arguments, and environments.
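Per the custom-integration feature, a third-party MCP needs only a command, its arguments, and environment variables. A purely illustrative sketch of what such an entry could look like (the surrounding file layout and the `mcpServers` key are assumptions borrowed from common MCP client configs, and `@example/some-mcp-server` is a hypothetical package name):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "npx",
      "args": ["-y", "@example/some-mcp-server"],
      "env": { "EXAMPLE_API_KEY": "your-key-here" }
    }
  }
}
```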
Audience & Benefit:
Ideal for developers, researchers, and tech enthusiasts seeking a customizable chat client to experiment with different LLMs and productivity tools. HyperChat empowers users to enhance their workflows through diverse AI-driven features.
HyperChat can be installed via winget, making it easy to integrate into your workflow.
README
Introduction
HyperChat is an open-source chat client that supports MCP and can use APIs from various LLMs to deliver the best chat experience along with productivity tools.
- Supports OpenAI-style LLMs: OpenAI, Claude (via OpenRouter), Qwen, Deepseek, GLM, Ollama.
- Fully supports MCP.
DEMO
- HyperChat on Docker
Features:
- 🪟 Windows + 🍏 macOS + Linux
- Command-line run: `npx -y @dadigua/hyper-chat`, default port 16100, password 123456, web access at http://localhost:16100/123456/
- Docker
  - Command-line version: `docker pull dadigua/hyperchat-mini:latest`
  - Ubuntu desktop + Chrome + BrowserUse version (coming soon)
- WebDAV supports incremental sync
- HyperPrompt prompt syntax: supports variables (text + JS code variables), basic syntax checking, and hover real-time preview
- MCP extension
- Dark mode 🌙
- Resources, Prompts, and Tools support
- English and Chinese
- Artifacts, SVG, HTML, and Mermaid rendering
- Agent definitions, with preset prompts and a configurable list of allowed MCPs
- Scheduled tasks: assign Agents to complete tasks on a schedule and view completion status
- KaTeX for mathematical formulas; code rendering with syntax highlighting and quick copy
- RAG based on an MCP knowledge base
- ChatSpace concept: simultaneous chats in multiple conversations
- Model comparison in chat
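The Docker command-line version listed above can be run in the usual way. A minimal sketch; mapping the container port to 16100 is an assumption, since the README states the default web port but not the image's exposed port:

```shell
# Pull the command-line image (from the feature list above)
docker pull dadigua/hyperchat-mini:latest

# Run it detached; the 16100 port mapping is an assumption based on the default web port
docker run -d --name hyperchat -p 16100:16100 dadigua/hyperchat-mini:latest

# Then open http://localhost:16100/123456/ (default password 123456)
```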
TODO:
- Support official Claude protocol
LLM
| LLM | Usability | Remarks |
| --- | --- | --- |
| claude | ⭐⭐⭐⭐⭐ | Needs no explanation |
| openai | ⭐⭐⭐⭐ | Also supports multi-step function calls perfectly (gpt-4o-mini also works) |
| gemini flash 2.0 | ⭐⭐⭐⭐ | Very usable |
| qwen | ⭐⭐⭐⭐ | Very usable |
| doubao | ⭐⭐⭐ | Feels okay to use |
| deepseek | ⭐⭐⭐ | Multi-step function calls may have issues |
Usage

1. Configure your API KEY, and ensure your LLM service is compatible with the OpenAI style.
2. Ensure that `uv` and `nodejs` are installed on your system.

uvx & uv

Install via the command line, or follow the official uv tutorial on GitHub:

```shell
# MacOS
brew install uv
# Windows
winget install --id=astral-sh.uv -e
```
npx & nodejs

Install via the command line, or download the installer from the official nodejs website:

```shell
# MacOS
brew install node
# Windows
winget install OpenJS.NodeJS.LTS
```
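After installing both toolchains, a quick sanity check (not from the original README) confirms they are on your PATH before launching HyperChat:

```shell
# Print versions to confirm uv and Node.js are installed and reachable
uv --version
node --version
npx --version
```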
Development
```shell
cd electron && npm install
cd web && npm install
npm install
npm run dev
```
Telegram
Super input, supports variables (text + js code variables), basic syntax checking + hover real-time preview
Chat supports model comparison
Click tool names to directly invoke debugging
MCP call Tool prompt + dynamically modify LLM call Tool parameters
Supports quick input with @ + invoking Agent
Supports Artifacts, SVG, HTML, and Mermaid rendering
Supports selecting MCP + selecting part of Tool
You can access via the Web anywhere + any device, and set a password
Calling terminal MCP automatically analyzes ASAR files + helps decompress them
Calling terminal view interface
Gaode Map MCP
One-click webpage writing and publishing (to Cloudflare)
Calling Google Search, asking what the TGA Game of the Year is
Asking "what limited-time free games are there" and having it visit the website via a tool
Helps you open web pages, analyze results, and write to files
Using web tools + command line tools to open GitHub README for learning + GIT clone + setting up development environment
Multi-chat Workspace + Night mode
Scheduled task list + schedule sending messages to Agent to complete tasks
Install MCP from third-party (supports any MCP)
H5 interface
Testing model capabilities
Knowledge base
Disclaimer
- This project is for learning and communication purposes only. Any operations you perform with it, such as crawling, are unrelated to the developers of this project.