Overview
The following docs are aimed at end users who want to troubleshoot issues or learn how to get more out of the Jan Desktop application.
If you are interested in building extensions, please refer to the developer docs instead (work in progress).
If you are interested in contributing to the underlying framework, please refer to the framework docs instead.
Jan Desktop
The desktop client is a ChatGPT alternative that runs on your own computer, with a local API server.
Features
- Compatible with open-source models (GGUF via llama.cpp, TensorRT via TensorRT-LLM) as well as remote APIs
- Compatible with most OSes: Windows, macOS, and Linux, with GPU acceleration through llama.cpp
- Stores data in open file formats
- Local API server mode (see the example after this list)
- Customizable via extensions
- And more on the roadmap. Join us on Discord and tell us what you want to see!
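For example, once the local API server is enabled, you can send it OpenAI-style chat requests from your own scripts. The sketch below is illustrative only: it assumes the server is listening on http://localhost:1337 with an OpenAI-compatible /v1/chat/completions endpoint and that the model id shown is one you have already downloaded in Jan; check your server settings for the actual host, port, and model name.

```typescript
// Minimal sketch: query the local API server with an OpenAI-style chat request.
// Assumptions (adjust to your setup): server at http://localhost:1337, an
// OpenAI-compatible /v1/chat/completions route, and a downloaded model id.
async function askJan(prompt: string): Promise<string> {
  const response = await fetch("http://localhost:1337/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral-ins-7b-q4", // replace with a model installed in your Jan app
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!response.ok) {
    throw new Error(`Local API server returned ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}

askJan("Hello from the local API server!").then(console.log).catch(console.error);
```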
Why Jan?
We believe in the need for an open-source AI ecosystem.
We're focused on building the infrastructure, tooling, and custom models that allow open-source AI to compete on a level playing field with proprietary offerings.
Read more about our mission and culture here.
Own your AI
Jan runs 100% on your own machine, predictably, privately and offline. No one else can see your conversations, not even us.
Extensions
Jan ships with a local-first, AI-native, cross-platform extension framework. Developers can extend and customize everything from functionality to UI to branding. In fact, Jan's main features are themselves built as extensions on top of this framework.
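The actual extension API is documented in the developer docs (WIP). Purely as an illustration of the idea, here is a hypothetical extension skeleton: the interface and method names below are invented for this sketch, not Jan's real API.

```typescript
// Hypothetical sketch of the extension idea: a small module with lifecycle
// hooks that the host app calls. The ExtensionLike interface and method names
// are illustrative only; see the developer docs for the actual extension API.
interface ExtensionLike {
  name: string;
  onLoad(): void;    // called when the app activates the extension
  onUnload(): void;  // called when the app shuts the extension down
}

class GreetingExtension implements ExtensionLike {
  name = "greeting-extension";

  onLoad(): void {
    console.log(`${this.name} loaded: register custom functionality or UI here`);
  }

  onUnload(): void {
    console.log(`${this.name} unloaded: clean up resources here`);
  }
}

export default GreetingExtension;
```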
Open File Formats
Jan stores data in your local filesystem. Your data never leaves your computer. You are free to delete, export, or migrate your data, even to a different platform.
Open Source
Both Jan and Nitro, our lightweight inference engine, are licensed under the open-source AGPLv3 license.