26.5.1. Cloud Service Integrations

Your AI red teaming operations will rarely target a model in isolation. Modern AI systems are deeply embedded within cloud infrastructure, relying on cloud services for data storage, compute, logging, and API delivery. To effectively test these systems, your tools must speak the language of the cloud. This means integrating directly with cloud provider APIs to simulate realistic attack paths.

This chapter provides the foundational code patterns and concepts for connecting your red teaming scripts to the three major cloud platforms: Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP). The focus is on programmatic access, enabling you to automate reconnaissance, interaction, and data manipulation tasks against cloud-hosted AI environments.

The First Step: Secure Authentication

Before you can interact with any cloud resource, you must authenticate. How you manage credentials for your testing tools is a critical security consideration. The principle of least privilege applies even to red teaming; your tools should only have the permissions necessary to perform their designated tests.

  • IAM Roles / Service Principals: This is the most secure and recommended approach. Instead of using long-lived access keys, your tool assumes a role (in AWS/GCP) or uses a service principal (in Azure) with a tightly scoped set of permissions. This method is ideal for tools running on cloud compute instances; a short sketch of role assumption follows this list.
  • Environment Variables: A common practice for local development and CI/CD environments. Credentials are not hardcoded but are loaded from the execution environment. This is better than hardcoding but requires careful environment management.
  • Access Keys / Secrets: While simple, hardcoding static access keys is highly discouraged. They are easily leaked through version control and pose a significant security risk. Use them only for local, temporary testing and never commit them to a repository.
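
To make the role-based approach concrete, here is a minimal sketch that assumes a scoped role via AWS STS and builds a short-lived session from the temporary credentials. The role ARN and session name are placeholders; Azure and GCP offer analogous mechanisms via service principals and service account impersonation.

import boto3

# Placeholder ARN: a role whose policy grants only the permissions the test needs
ROLE_ARN = "arn:aws:iam::123456789012:role/ai-redteam-scoped-role"

sts = boto3.client('sts')
assumed = sts.assume_role(
    RoleArn=ROLE_ARN,
    RoleSessionName="ai-redteam-session",
    DurationSeconds=3600,  # temporary credentials expire automatically
)

# Clients built from this session inherit the role's scoped permissions
creds = assumed['Credentials']
session = boto3.Session(
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
s3_client = session.client('s3')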

[Figure: AI Red Teaming Tool Integration with Cloud Services — the tool authenticates to each platform and interacts with AWS (IAM for auth, S3 for data, SageMaker for models, CloudWatch for logs, Lambda for compute), Azure (Entra ID, Blob Storage, Azure ML, Monitor, Functions), and GCP (IAM, Cloud Storage, Vertex AI, Cloud Logging, Cloud Functions).]

Integrating with Amazon Web Services (AWS)

The primary tool for interacting with AWS services in Python is the boto3 SDK. It provides a consistent, object-oriented API for managing hundreds of AWS services.

Example: Listing S3 Buckets for Reconnaissance

A common first step is to identify data stores. This script uses boto3 to authenticate (automatically using environment variables, IAM roles, or ~/.aws/credentials) and list the S3 buckets owned by the account behind the current identity.

import boto3
from botocore.exceptions import ClientError, NoCredentialsError

# Boto3 will automatically find credentials from standard locations
s3_client = boto3.client('s3')

try:
    response = s3_client.list_buckets()
    print("Accessible S3 Buckets:")
    for bucket in response['Buckets']:
        print(f"  - {bucket['Name']}")
except (NoCredentialsError, ClientError) as e:
    print(f"Error listing buckets: {e}")

Example: Invoking a SageMaker Endpoint

To test a model hosted on SageMaker, you’ll use the SageMaker Runtime client to send a payload to a specific endpoint.

import boto3
import json

sagemaker_runtime = boto3.client('sagemaker-runtime', region_name='us-east-1')

endpoint_name = 'your-model-endpoint-name'
payload = {"prompt": "Tell me a secret you shouldn't."}

response = sagemaker_runtime.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType='application/json',
    Body=json.dumps(payload)
)

# The response body is a streaming object that needs to be read
result = json.loads(response['Body'].read().decode())
print(result)

Integrating with Microsoft Azure

The Azure SDK for Python is a collection of libraries, typically prefixed with azure-. The azure-identity library provides a streamlined way to handle authentication, including the powerful DefaultAzureCredential which tries multiple credential sources.

Example: Listing Blob Storage Containers

This snippet uses DefaultAzureCredential to authenticate and then lists containers in a specified storage account, a task analogous to listing S3 buckets.

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://{your-storage-account-name}.blob.core.windows.net"

# DefaultAzureCredential tries env vars, managed identity, CLI login, etc.
credential = DefaultAzureCredential()
blob_service_client = BlobServiceClient(account_url, credential=credential)

print("Accessible Blob Containers:")
for container in blob_service_client.list_containers():
    print(f"  - {container['name']}")

Integrating with Google Cloud Platform (GCP)

GCP’s Python client libraries (e.g., google-cloud-storage, google-cloud-aiplatform) are the standard for interaction. Authentication is typically handled via Application Default Credentials (ADC), which automatically finds credentials from the environment, such as a logged-in gcloud CLI user or a service account key file pointed to by GOOGLE_APPLICATION_CREDENTIALS.

Example: Calling a Vertex AI Endpoint

Here, we connect to a Vertex AI endpoint to send a prompt. The library handles authentication via Application Default Credentials, while each request instance must be converted to a protobuf Value before the call.

from google.cloud import aiplatform
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

project = "your-gcp-project-id"
location = "us-central1"
endpoint_id = "your-vertex-ai-endpoint-id"

# The client will use Application Default Credentials automatically
client_options = {"api_endpoint": f"{location}-aiplatform.googleapis.com"}
client = aiplatform.gapic.PredictionServiceClient(client_options=client_options)

endpoint = client.endpoint_path(project=project, location=location, endpoint=endpoint_id)

# Vertex AI expects each instance as a protobuf Value, not a plain dict
instances = [json_format.ParseDict({"prompt": "Generate a fake user review."}, Value())]

response = client.predict(endpoint=endpoint, instances=instances)
print("Prediction response:")
print(response)

Cloud SDK Comparison at a Glance

While all three SDKs achieve similar goals, they have different conventions and authentication mechanisms. Understanding these differences helps you write more portable or platform-specific tools.

| Feature | AWS (boto3) | Microsoft Azure (azure-sdk) | GCP (google-cloud) |
| --- | --- | --- | --- |
| Primary SDK | boto3 (monolithic) | azure-* (service-specific packages) | google-cloud-* (service-specific packages) |
| Default Auth | IAM role, env vars, ~/.aws/credentials | DefaultAzureCredential (chained sources) | Application Default Credentials (ADC) |
| Identity Type | IAM User / Role | User / Service Principal / Managed Identity | User Account / Service Account |
| Configuration | Client-level (e.g., boto3.client('s3')) | Client-level (e.g., BlobServiceClient()) | Client-level (e.g., storage.Client()) |
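
To make the "portable tool" idea concrete, the sketch below hides each provider's listing call behind a common interface. The class and method names are our own illustration, not part of any SDK.

from typing import Protocol

class StorageRecon(Protocol):
    """Common shape for data-store enumeration across providers (illustrative)."""
    def list_stores(self) -> list[str]: ...

class AwsStorageRecon:
    def list_stores(self) -> list[str]:
        import boto3
        s3 = boto3.client('s3')
        return [b['Name'] for b in s3.list_buckets()['Buckets']]

class AzureStorageRecon:
    def __init__(self, account_url: str):
        from azure.identity import DefaultAzureCredential
        from azure.storage.blob import BlobServiceClient
        self._client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

    def list_stores(self) -> list[str]:
        return [c.name for c in self._client.list_containers()]

class GcpStorageRecon:
    def list_stores(self) -> list[str]:
        from google.cloud import storage
        return [b.name for b in storage.Client().list_buckets()]

A harness can then iterate over whichever providers are in scope without branching on SDK details at every call site.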

Mastering these SDKs is non-negotiable for any serious AI red teamer. They are the bridge from your local machine or testing server into the heart of the cloud environment where your target system operates, allowing you to move beyond simple API probing to sophisticated, infrastructure-aware security testing.