23.1.2 Installation guides

2025.10.06.
AI Security Blog

A successful red teaming engagement begins long before the first test is run. It starts with a clean, reproducible, and secure setup of your toolkit. This section provides direct, actionable installation guidance for the key open-source tools discussed previously. Getting this step right prevents dependency conflicts, ensures tool integrity, and establishes a stable foundation for your assessments.

Foundation: Your Environment

Never install security tools directly into your system’s global Python environment. This practice leads to version conflicts and unpredictable behavior. Always use an isolated environment for each project or major toolset. This discipline is non-negotiable for professional work.

Kapcsolati űrlap - EN

Do you have a question about AI Security? Reach out to us here:

Isolate Everything

Virtual environments are the cornerstone of reproducible research and secure testing. They create self-contained sandboxes for your Python packages, preventing interference between projects. The two most common tools for this are venv (built into Python) and Conda.

Using Python’s venv

For most use cases, the standard venv module is sufficient.


# 1. Create a new virtual environment in a directory named 'venv'
#    Replace 'python3' with 'python' if needed on your system.
python3 -m venv venv

# 2. Activate the environment
#    On macOS/Linux:
source venv/bin/activate

#    On Windows (Command Prompt):
#    venvScriptsactivate.bat
#
#    On Windows (PowerShell):
#    venvScriptsActivate.ps1

# Your command prompt should now be prefixed with (venv).
# To exit the environment, simply type 'deactivate'.
            

Virtual Environment Isolation System Environment (Global Python) Project A: Venv 1 Python 3.10 ART v1.14 TensorFlow v2.10 Project B: Venv 2 Python 3.11 Garak v0.9 Transformers v4.35

Core Tool Installation

The following guides assume you have activated a fresh virtual environment. The primary installation method for most Python-based tools is pip, the Python Package Installer.

Adversarial Robustness Toolbox (ART)

ART is a comprehensive library for evaluating model robustness. Its dependencies are tied to specific machine learning frameworks.


# For a minimal installation (no ML framework)
pip install adversarial-robustness-toolbox

# Recommended: Install with support for your framework of choice
# For PyTorch:
pip install adversarial-robustness-toolbox[pytorch]

# For TensorFlow (ensure you have a compatible TensorFlow version first):
pip install adversarial-robustness-toolbox[tensorflow]

# To install with all supported frameworks and features:
pip install adversarial-robustness-toolbox[all]
            

Garak (LLM Vulnerability Scanner)

Garak is a dedicated tool for probing and scanning Large Language Models for common failure modes and vulnerabilities.


# Standard installation via pip
pip install garak
            

API Keys for Garak

Garak interacts with various model APIs. After installation, you will need to configure your API keys. You can do this by setting environment variables (e.g., OPENAI_API_KEY) or by using Garak’s key management features.

Counterfit (CLI Attack Framework)

Counterfit provides a command-line interface for managing and executing attacks on AI models.


# 1. Clone the repository from GitHub
git clone https://github.com/Azure/counterfit.git
cd counterfit

# 2. Install the package in editable mode
# This allows you to easily pull updates from the repository.
pip install -e .
            

Note on Counterfit

As of late 2023, Counterfit’s development has slowed. While still a valuable tool for learning, always check the repository for the latest status and compatibility with modern libraries before deploying it in a critical engagement.

LLM Guard (Security Library)

LLM Guard is not a standalone tool but a library you integrate into your applications to sanitize, detect, and protect LLM interactions.


# Basic installation
pip install llm-guard

# To include all supported scanners and features (recommended for red teaming)
pip install "llm-guard[all]"
            

Verification and Maintenance

After installing a tool, you must verify it works as expected. A simple check can save hours of debugging later.

Quick Verification Steps

  1. Check the version: Most command-line tools support a --version or similar flag. For libraries, you can check programmatically.
  2. Run a basic command: Execute a simple, non-destructive command, like listing available attacks or plugins.

For example, after installing Garak:


# Check the installed version
garak --version

# List all available probes to ensure it's functional
garak --list_probes
            

Managing Your Toolkit

Keep your tools and their dependencies up to date to benefit from the latest features and security patches. Within your activated virtual environment:


# To upgrade a specific package
pip install --upgrade garak

# To see a list of all installed packages and their versions
pip list

# To save your environment's state for reproducibility
pip freeze > requirements.txt

# You can then recreate this exact environment elsewhere with:
# pip install -r requirements.txt
            

Treat your requirements.txt file as a critical artifact of your engagement. It ensures that your findings can be reproduced by you or your client later.