Version 4.0 for Windows 11, 10
Release Notes
Note: Fusion Quill does not send any information about your AI model usage (inputs and outputs of the AI models) to us. Please use it responsibly and follow local and international laws. See our Terms for more details.
Download Instructions:
- Click the Get Fusion Quill link on the Microsoft site. It will open the Microsoft Store app.
- Click Install on the Fusion Quill app page in the Microsoft Store app.
- Click Open after it is installed. You can also open it from the Windows Start menu.
- Start Fusion Quill.
- On first run, a configuration wizard helps you connect to different AI APIs or download a local model.
- See the Help page for more information.
- Note that local AI models are large and will take time to download.
- Contact us with any issues.
Fusion Quill connects to local models and API models:
- Local Models – GGUF models from Hugging Face.
- API Models – OpenAI, Azure AI, Google Gemini, Amazon Bedrock, Groq, Ollama, Hugging Face, vLLM, llama.cpp, etc.
Key Features:
- AI Workflows
- AI Word Processor
- Chat with AI
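Several of the API backends listed above (OpenAI, Groq, Ollama, vLLM, llama.cpp's server) expose an OpenAI-compatible chat-completions endpoint, which is how clients like Fusion Quill can talk to all of them with one request shape. As a rough sketch only (the base URL, port, and model name below are placeholder assumptions, not Fusion Quill defaults):

```python
import json
import urllib.request

# Hypothetical local endpoint; an Ollama server, for example, serves an
# OpenAI-compatible API at /v1 on port 11434 by default.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat-completions request."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request with urllib.request.urlopen(req) would return the
# completion; here we only build it, since no server may be running.
req = build_chat_request("llama3", "Hello!")
print(req.full_url)
```

Because the request shape is the same across these backends, only BASE_URL (and an API key header, for hosted services) changes between providers.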
Basic Requirements
- Windows 11 or 10 PC with an Internet connection
- 16GB RAM
Requirements for Local Inference
- Windows 11 PC (preferably manufactured in the last 2 years)
- Nvidia or AMD GPU preferred, with an updated GPU driver.
- CUDA 11 or 12 installed for faster local LLM inference on Nvidia GPUs.
- 32 GB memory (text generation works with 16 GB but is slow)
- 12 GB minimum disk space; 24+ GB recommended.
- First-time AI generation is slow due to model loading time.
- A fast Internet connection (> 50 Mbps) is recommended for downloading large AI models.
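As a quick way to check the disk-space requirement above before downloading a local model, a small script can classify free space against the stated 12 GB minimum and 24 GB recommendation (a sketch using the Python standard library; GiB is used as an approximation of GB):

```python
import shutil

# Thresholds taken from the requirements list above.
MIN_DISK_GB = 12
RECOMMENDED_DISK_GB = 24

def classify(free_gb: float) -> str:
    """Classify an amount of free space against the requirements."""
    if free_gb >= RECOMMENDED_DISK_GB:
        return "ok"
    if free_gb >= MIN_DISK_GB:
        return "minimum"
    return "insufficient"

def disk_status(path: str = ".") -> str:
    """Check free space on the drive containing `path`."""
    free_gb = shutil.disk_usage(path).free / (1024 ** 3)
    return classify(free_gb)

print(disk_status())
```

On Windows, passing a drive root such as "C:\\" checks the drive where the models will be stored.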