
AWS Deployment Options

Spice.ai provides multiple deployment options on Amazon Web Services (AWS), allowing you to leverage AWS's robust infrastructure for your data and AI applications. Whether you prefer virtual machines, container orchestration, or managed services, you can deploy Spice.ai to meet your specific requirements for performance, scalability, and cost efficiency.

Benefits of Deploying on AWS

Deployment Options

Amazon EKS (Elastic Kubernetes Service)

Leverage Kubernetes orchestration with Amazon EKS for containerized Spice.ai deployments.

  1. Create an EKS Cluster (see the example after this list).

  2. Deploy Spice.ai on EKS (see the example after this list).
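
A minimal sketch of both steps using eksctl and Helm. The cluster name, region, node sizing, and the Helm repository URL and chart name are assumptions; confirm the published chart details in the Spice.ai Kubernetes Deployment Guide.

```bash
# Step 1: provision an EKS cluster (name, region, and node sizing are illustrative).
eksctl create cluster \
  --name spiceai-demo \
  --region us-east-1 \
  --nodes 2 \
  --node-type m5.large

# Confirm kubectl is pointed at the new cluster.
kubectl get nodes

# Step 2: install the Spice.ai runtime with Helm.
# The repository URL and chart name are assumptions; verify them against
# the Spice.ai Kubernetes Deployment Guide.
helm repo add spiceai https://helm.spiceai.org
helm repo update
helm install spiceai spiceai/spiceai

# Check that the runtime pods are running.
kubectl get pods
```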

For comprehensive instructions and advanced configuration options, refer to the Amazon EKS User Guide, EKS Best Practices Guide, and Spice.ai Kubernetes Deployment Guide.

EC2 / AWS CloudFormation

Deploy Spice.ai directly on Amazon EC2 instances for maximum control over the environment.

  1. Manual EC2 Deployment (see the example after this list).

  2. Automated EC2 Deployment with CloudFormation (see the example after this list).
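
A hedged sketch of both paths. The installer URL and CLI commands below (spice init, spice run) are assumptions based on the standard Spice.ai installation flow; spiceai-ec2.yaml is a hypothetical name for a CloudFormation template you author that defines the instance, security group, and user data.

```bash
# Manual deployment: connect to a running EC2 instance (SSH or Session Manager),
# then install and start the Spice.ai runtime.
curl https://install.spiceai.org | /bin/bash   # installer URL is an assumption
spice version

spice init my-app    # scaffold a new Spice.ai app (name is illustrative)
cd my-app
spice run            # start the runtime on the instance

# Automated deployment: create or update the stack from your CloudFormation template.
# spiceai-ec2.yaml is a hypothetical template you author.
aws cloudformation deploy \
  --stack-name spiceai-ec2 \
  --template-file spiceai-ec2.yaml \
  --capabilities CAPABILITY_NAMED_IAM
```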

For detailed guidance and best practices, refer to the AWS CloudFormation User Guide, EC2 User Guide for Linux Instances, and AWS Systems Manager Parameter Store Documentation.

Amazon ECS (Elastic Container Service)

Deploy Spice.ai as containerized tasks on Amazon ECS for easy container management and flexible scaling.

  1. Create an ECS Cluster (see the example after this list).

  2. Define a Task Definition (see the example after this list).

  3. Deploy Spice.ai on ECS:

    • Create an ECS Service to run and manage Spice.ai tasks.
    • Set up load balancing with a Network Load Balancer (NLB).
    • (Optional) Configure auto-scaling based on resource usage or CloudWatch metrics.
    • (Optional) Use CI/CD pipelines for automated updates, and manage infrastructure as code with CloudFormation, Terraform, or the AWS CLI.
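
A condensed sketch of the three steps using the AWS CLI with Fargate. The cluster name, container image, port, CPU/memory, subnet, and security group values are illustrative assumptions; adapt them to your VPC and confirm the image and port against the Spice.ai Docker Deployment Guide.

```bash
# Step 1: create an ECS cluster.
aws ecs create-cluster --cluster-name spiceai-cluster

# Step 2: register a task definition for the Spice.ai runtime container.
# spiceai-task.json is a file you author; a minimal (assumed) shape:
# {
#   "family": "spiceai",
#   "networkMode": "awsvpc",
#   "requiresCompatibilities": ["FARGATE"],
#   "cpu": "1024",
#   "memory": "2048",
#   "containerDefinitions": [{
#     "name": "spiceai",
#     "image": "spiceai/spiceai:latest",
#     "portMappings": [{ "containerPort": 8090, "protocol": "tcp" }]
#   }]
# }
aws ecs register-task-definition --cli-input-json file://spiceai-task.json

# Step 3: create a service to run and manage the task.
# Add --load-balancers to attach an NLB target group; the subnet and security
# group IDs below are placeholders.
aws ecs create-service \
  --cluster spiceai-cluster \
  --service-name spiceai \
  --task-definition spiceai \
  --desired-count 1 \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-0123456789abcdef0],securityGroups=[sg-0123456789abcdef0],assignPublicIp=ENABLED}"
```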

For more details, see the Amazon ECS Developer Guide and Spice.ai Docker Deployment Guide.

Learn More