Use this command to install HuggingFace Model Downloader:
winget install --id=bodaay.hfdownloader -e
The HuggingFace Model Downloader is a utility tool for downloading models and datasets from the HuggingFace website. It offers multithreaded downloading for LFS files and ensures the integrity of downloaded models with SHA256 checksum verification.
Key Features:
Multithreaded LFS File Downloads: Enhances download speeds for large files, making it easier to handle extensive model and dataset files.
SHA256 Checksum Verification: Ensures the integrity of downloaded models, confirming they are unchanged and valid.
Cross-Platform Compatibility: Operates seamlessly across various operating systems, including Linux, macOS, and Windows WSL2.
Configuration File Support: Allows users to set default values for command flags, enhancing customization and ease of use.
Download Resumption: Continues interrupted downloads without starting over, saving time and bandwidth.
HuggingFace Access Token Integration: Supports secure access to restricted models and datasets using tokens.
Audience & Benefit:
Ideal for researchers, data scientists, and developers working with machine learning models. This tool enables efficient retrieval of resources, ensuring reliability and integrity while handling large-scale downloads. Its flexibility and robust features make it a valuable asset in managing machine learning projects effectively.
The HuggingFace Model Downloader can be installed via winget, offering a straightforward setup process to integrate into your workflow seamlessly.
HuggingFaceModelDownloader · v2.0.0
Fast, resilient, resumable CLI (and Go library) for downloading models and datasets from the Hugging Face Hub, now with a clean flag set, a colorful TUI, structured JSON events, retry/backoff, and filesystem‑only resume (no progress files).
> v2.0.0 is a breaking redesign of the CLI and library. It simplifies the flag set and adds dry‑run plans, verification options, multipart thresholding, and JSON output. This document is the definitive, up‑to‑date guide for v2.0.0.
Highlights
Resumable by default
LFS files: verified by SHA‑256 (when provided by the repo).
Non‑LFS files: verified by size.
Large files use multipart range downloads with per‑part resume.
Beautiful live TUI
Auto‑adapts to terminal width/height; smart truncation; per‑file bars, speeds, ETA.
Colorful when supported; graceful plain‑text fallback (TERM=dumb or NO_COLOR=1).
Robust + fast cancellation
Ctrl‑C (SIGINT) or SIGTERM aborts immediately across goroutines; a second Ctrl‑C forces an immediate exit.
Structured progress events (--json) for CI/logging.
Practical controls
Overall concurrency and per‑file connection limits.
Skip decisions are made only from what’s on disk (checksums/sizes).
Note: the previous README mentioned saving an .hfdownloader.meta.json. v2.0.0 no longer persists such files; resume is purely filesystem‑based.
Installation
From source (Go 1.21+)
```sh
git clone https://github.com/bodaay/HuggingFaceModelDownloader
cd HuggingFaceModelDownloader
go build -o hfdownloader .
# optional:
# go install . # installs into your $GOBIN
```
Key flags:
-t, --token — Hugging Face token (or HF_TOKEN env)
--config — path to JSON config (defaults to ~/.config/hfdownloader.json if present)
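The config file's exact schema isn't shown here; as a sketch, assuming keys simply mirror the long flag names (an assumption, not the documented schema), a ~/.config/hfdownloader.json might look like:

```json
{
  "token": "hf_xxxxxxxx",
  "connections": 8
}
```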
> The old README showed a larger flag surface and described resume/overwrite toggles and metadata persistence; v2.0.0 deliberately simplifies this—resume is always on, and skip decisions are based on your files only (no metadata saved).
> Skip lines: each file prints at most one “skip (…)” per run. Deduplication is enforced internally.
Resume & verification (how it decides to skip)
LFS files (SHA available) → compute local SHA‑256 and compare to the repo’s SHA.
If equal → skip (file_done with skip (sha256 match)).
If different (even if size matches) → re‑download.
Non‑LFS / SHA unknown → compare file size.
If equal → skip (skip (size match)).
If different → re‑download.
Multipart parts → each range part downloads to path.part-00, path.part-01, …; matching‑length parts are not re‑fetched and the file is assembled when all parts are present.
Cancellation & signals
SIGINT/SIGTERM: fast, cooperative cancellation—no new work is scheduled, all goroutines exit promptly, and ongoing HTTP calls are context‑bound.
SIGKILL (9): cannot be intercepted by any program; the OS terminates the process immediately.
Events you can handle: scan_start, plan_item, file_start, file_progress, retry, file_done, error, done.
Troubleshooting
401 Unauthorized
Provide a token: -t TOKEN or HF_TOKEN=.... Some repos require auth/acceptance.
403 Forbidden (terms)
Visit the repo page and accept terms, then retry.
Range requests disabled
Multipart falls back to a single GET automatically; downloads still work.
Slow throughput
Increase --connections and --max-active gradually; ensure disk/FS and network can keep up.
Repeated “skip” lines
v2.0.0 emits at most one “skip (…)” per file per run. If you still see duplicates, check for duplicate paths in the upstream tree or path collisions on Windows.
Why v2?
Cleaner mental model (one download command, sensible defaults).
Filesystem‑only resume—reliable and transparent; no “state” files to corrupt.
JSON events and a TUI that looks great everywhere.
Strong cancellation story for real‑world, long‑running downloads.
> The prior README’s CLI outline and examples helped guide this cleanup; this version documents the finalized v2 surface and behavior.
License
Apache‑2.0 (see LICENSE).
Acknowledgements
Thanks to the HF community and tooling ecosystem—this project tries to be a pragmatic drop‑in for everyday model & dataset fetching.