MaxedOut Flux Installer Guide 🚀

This guide walks you through the manual setup for the Flux Kontext workflow. For the best experience, right-click each file link, choose "Save Link As...", and save it into the specified folder.

⚡ Prefer to skip all this?

Just use the one-click installer. It sets everything up automatically (models, nodes, dependencies). This manual guide is only here in case you're on unsupported hardware or want full control.

✅ If you're using ComfyUI Desktop, you're fully supported.

📦 Custom Nodes

Open a terminal inside the custom_nodes/ folder and run:

git clone https://github.com/Maxed-Out-99/ComfyUI-MaxedOut.git
git clone https://github.com/Maxed-Out-99/ComfyUI-SmartModelLoaders-MXD.git
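
If you want to confirm both repos cloned correctly, list the contents of custom_nodes/ — you should see a ComfyUI-MaxedOut folder and a ComfyUI-SmartModelLoaders-MXD folder:

ls

(In Windows Command Prompt, use dir instead.)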

🔧 Install Node Requirements

Next, change back to your ComfyUI root folder (the one that contains custom_nodes/ and .venv/) and activate your virtual environment:

Windows:
.venv\Scripts\activate

Mac:
source .venv/bin/activate
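
Not sure the activation worked? This optional check prints which Python interpreter is active; the path should point inside your .venv folder:

python -c "import sys; print(sys.executable)"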

Then install the node requirements:

python -m pip install -r custom_nodes/ComfyUI-SmartModelLoaders-MXD/requirements.txt
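
If you want to double-check that the install went through cleanly, pip can verify that nothing is broken or missing (optional):

python -m pip check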

That's it. ✅

💾 Core Files

Right-click each file link → choose "Save Link As..." → save it into the listed subfolder of ComfyUI/models/

  • ae.safetensors ➡️ vae/
  • clip_l.safetensors ➡️ clip/

⚠️ Optional: for faster performance on NVIDIA GPUs only, you can download a quantized Nunchaku model (not sure which GPU you have? see the quick check after this list):

  • RTX 50 series only: svdq-fp4_r32-flux.1-kontext-dev.safetensors ➡️ diffusion_models/
  • All other NVIDIA GPUs: svdq-int4_r32-flux.1-kontext-dev.safetensors ➡️ diffusion_models/
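
If the NVIDIA driver is installed, nvidia-smi reports your GPU model and total VRAM — handy here and for the UNet tier choice further down:

nvidia-smi --query-gpu=name,memory.total --format=csv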

📎 CLIP & T5 Models

Select ONE T5 model based on your system's RAM (not VRAM) and place it into models/clip/. A quick way to check your RAM is shown after the list.

  • Tier A (32GB or more): t5xxl_fp16.safetensors
  • Tier B (16–31GB): t5xxl_fp8_scaled.safetensors
  • Tier C (Less than 16GB): t5xxl_Q5_K_M.gguf
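
A quick way to check your system RAM, assuming the psutil package is available in your ComfyUI environment (it usually is; otherwise check your OS's system information panel):

python -c "import psutil; print(round(psutil.virtual_memory().total / 1024**3), 'GB RAM')"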

🧠 UNet Models

Choose ONE UNet model based on your GPU's VRAM and save it into models/diffusion_models/.

  • 🔶 Tier S (32GB+ VRAM): flux1-kontext-dev.safetensors
  • 🔶 Tier A (16–31GB VRAM): flux1-dev-kontext_fp8_scaled.safetensors
  • 🔶 Tier B (12–15GB VRAM): flux1-kontext-dev-Q5_K_M.gguf
  • 🔶 Tier C (Under 12GB / Apple Silicon / CPU): flux1-kontext-dev-Q3_K_S.gguf
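
For reference, here's roughly what the relevant parts of models/ should look like afterwards (shown with example Tier A/B picks; your exact filenames depend on the tiers you chose):

ComfyUI/models/
├── vae/
│   └── ae.safetensors
├── clip/
│   ├── clip_l.safetensors
│   └── t5xxl_fp8_scaled.safetensors
└── diffusion_models/
    └── flux1-dev-kontext_fp8_scaled.safetensors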

✅ Done!

Once you've:

  • Downloaded your models
  • Selected one UNet + one T5 tier
  • Placed everything into the correct folders
  • Installed my nodes and their requirements

You're ready to run the Flux Kontext workflow manually 🎉

Enjoy! And if you're stuck, you can always fall back to the one-click installer.
