
How to Install ACE-Step Locally

Complete setup guide for Windows, macOS, and Linux β€” including troubleshooting the most common errors.

Last updated: February 2026

System Requirements

Component    | Minimum                            | Recommended
-------------|------------------------------------|-------------------------------------
GPU          | NVIDIA RTX 2080 (8GB VRAM)         | NVIDIA RTX 3090/4090 (24GB VRAM)
CPU          | Intel i7-8700 / AMD Ryzen 7 3700X  | Intel i9-12900K / AMD Ryzen 9 5900X
RAM          | 16GB DDR4                          | 32GB+ DDR4
Storage      | 20GB SSD                           | 50GB+ NVMe SSD
CUDA Version | 11.8                               | 12.1
Python       | 3.9                                | 3.10 or 3.11
OS           | Windows 10, macOS 12, Ubuntu 20.04 | Ubuntu 22.04 LTS (best performance)
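Before starting, you can confirm your interpreter meets the Python requirement from the table. A minimal sketch (the version bound comes from the table above; the helper name is illustrative, not part of ACE-Step):

```python
import sys

MIN_PYTHON = (3, 9)  # minimum from the requirements table

def python_ok(version: tuple = sys.version_info[:2]) -> bool:
    """True if the given (major, minor) pair meets the guide's minimum."""
    return version >= MIN_PYTHON

print(f"Python {sys.version.split()[0]} meets the minimum: {python_ok()}")
```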

Installation Steps

  1. Install Conda or Miniconda

     Download and install Miniconda from conda.io. This manages your Python environment and prevents dependency conflicts.

  2. Clone the Repository

     git clone https://github.com/ace-step/ACE-Step.git
     cd ACE-Step

  3. Create Conda Environment

     conda env create -f environment.yml
     conda activate ace-step

  4. Install PyTorch with CUDA

     For CUDA 11.8:

     pip install torch==2.0.1+cu118 torchvision==0.15.2+cu118 torchaudio==2.0.2+cu118 \
       -f https://download.pytorch.org/whl/cu118/torch_stable.html

  5. Download Model Weights

     Fetch the ~15GB model checkpoint from HuggingFace:

     python download_models.py

  6. Launch the Web UI

     Start the Gradio web interface at http://localhost:7860:

     python app.py
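After the steps above, a quick sanity check confirms that PyTorch is installed and can see your GPU from the active conda environment. A minimal sketch (the helper is hypothetical, not part of the repo):

```python
def torch_status() -> str:
    """Report the installed PyTorch version and whether CUDA is usable."""
    try:
        import torch  # only present inside the ace-step environment
    except ImportError:
        return "torch is not installed in this environment"
    return f"torch {torch.__version__}, CUDA available: {torch.cuda.is_available()}"

print(torch_status())
```

If CUDA shows as unavailable despite an NVIDIA GPU, recheck step 4's wheel against your driver's CUDA version.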

Performance Optimization

For faster generation, use FP16 precision (the --fp16 flag) on supported GPUs. Installing xformers enables xFormers attention, which yields a 30-40% speed improvement. For GPUs with less than 12GB of VRAM, always use 8-bit quantization.

pip install xformers
python app.py --fp16

Common Issues & Fixes

Out of Memory (OOM) Error

If you encounter CUDA OOM errors, try enabling 8-bit quantization by adding --quantize int8 to your launch command. Alternatively, enable CPU offload with --cpu-offload to move some layers to system RAM.

python app.py --quantize int8
# or
python app.py --cpu-offload --fp16

Dependency Conflicts

ACE-Step requires specific versions of PyTorch and CUDA. Always use the provided conda environment file or requirements.txt. Mixing pip and conda installs frequently causes conflicts.
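When diagnosing a conflict, it helps to see exactly which versions ended up installed in the environment. A small sketch using only the standard library (the package names are the ones this guide installs):

```python
from importlib import metadata

def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return "not installed"

for pkg in ("torch", "torchvision", "torchaudio", "xformers"):
    print(f"{pkg}: {installed_version(pkg)}")
```

Compare the printed versions against environment.yml before reinstalling anything.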

Port Already in Use

If port 7860 is busy, add --port 7861 (or any available port) to the Gradio launch command.

python app.py --port 7861
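Rather than guessing which ports are free, you can ask the OS for one. A minimal helper (hypothetical, not part of ACE-Step):

```python
import socket

def find_free_port() -> int:
    """Bind to port 0 so the OS assigns an unused TCP port, then return it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

print(find_free_port())  # pass this value to --port
```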

Model Path Not Found

Ensure you have enough disk space (model weights are ~15GB). Run the download script from the project root directory, not a subdirectory.
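You can check free space programmatically before running the download script. A minimal sketch (the ~15GB figure is the checkpoint size quoted in this guide):

```python
import shutil
from pathlib import Path

REQUIRED_GB = 15  # approximate checkpoint size from the guide

def free_gb(path: str = ".") -> float:
    """Free disk space at `path`, in gibibytes."""
    return shutil.disk_usage(Path(path)).free / 1024**3

if free_gb() < REQUIRED_GB:
    print(f"Warning: only {free_gb():.1f} GB free; need ~{REQUIRED_GB} GB")
```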


Skip the Setup Entirely

Not a fan of terminal commands and dependency hell? FM9 gives you cloud-powered, ACE-Step-compatible music generation in your browser. No GPU, no setup, no waiting.
