AWS Deployment Options
Spice.ai provides multiple deployment options on Amazon Web Services (AWS), allowing you to leverage AWS's robust infrastructure for your data and AI applications. Whether you prefer virtual machines, container orchestration, or managed services, you can deploy Spice.ai to meet your specific requirements for performance, scalability, and cost efficiency.
Benefits of Deploying on AWS
- Scalability: Easily scale your Spice.ai applications with AWS's elastic infrastructure.
- Global Reach: Deploy across AWS's worldwide regions for low-latency access.
- Integration: Connect with other AWS services like Amazon S3, Amazon RDS, and AWS Secrets Manager.
- Cost Control: Optimize expenses with various instance types and pricing models.
- Security and Compliance: Deploy Spice.ai within your AWS security perimeter using features like VPC isolation, security groups, and IAM roles to meet organizational compliance requirements.
Deployment Options
Amazon EKS (Elastic Kubernetes Service)
Leverage Kubernetes orchestration with Amazon EKS for containerized Spice.ai deployments.
- Create an EKS Cluster:
  - Use the AWS Management Console, AWS CLI, or eksctl to create your cluster
  - Configure node groups according to your workload requirements
  - (Optional) Use EKS Fargate profiles for serverless container deployment
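As an illustration, the cluster-creation step can be sketched with eksctl; the cluster name, region, node type, and node count below are placeholders to adapt to your workload:

```shell
# Create an EKS cluster with a managed node group
# (all names and sizes are illustrative placeholders)
eksctl create cluster \
  --name spiceai-cluster \
  --region us-east-1 \
  --nodegroup-name spiceai-nodes \
  --node-type m5.large \
  --nodes 3
```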
- Deploy Spice.ai on EKS:
  - Apply the Spice.ai Kubernetes manifests via the Helm chart
  - Configure persistent storage using Amazon EBS or Amazon EFS
  - Set up ingress with the AWS Network Load Balancer (NLB)
  - (Optional) Automate cluster and resource provisioning with Infrastructure as Code (IaC) tools such as AWS CloudFormation or Terraform for consistent, repeatable deployments
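A minimal Helm-based install might look like the following; the repository URL, chart name, and release name are taken from the public Spice.ai docs as an assumption, so verify them against the current Spice.ai Kubernetes Deployment Guide:

```shell
# Add the Spice.ai Helm repository and install the runtime chart
helm repo add spiceai https://helm.spiceai.org
helm repo update
# "spiceai" is a placeholder release name and namespace
helm install spiceai spiceai/spiceai --namespace spiceai --create-namespace
```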
For comprehensive instructions and advanced configuration options, refer to the Amazon EKS User Guide, EKS Best Practices Guide, and Spice.ai Kubernetes Deployment Guide.
EC2 / AWS CloudFormation
Deploy Spice.ai directly on Amazon EC2 instances for maximum control over the environment.
- Manual EC2 Deployment:
  - Launch an EC2 instance with your preferred Linux distribution
  - Install Docker
  - Run Spice.ai as a Docker container on your EC2 instance
  - (Optional) Use Infrastructure as Code (IaC) tools like AWS CloudFormation or Terraform to automate the provisioning, configuration, and management of EC2 resources for repeatable, consistent deployments
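On a freshly launched instance, the manual steps above can be sketched as follows (shown for Amazon Linux 2023; the exposed ports assume the default Spice.ai HTTP and Arrow Flight endpoints, so confirm them against the current Spice.ai Docker documentation):

```shell
# Install and start Docker on Amazon Linux 2023
sudo dnf install -y docker
sudo systemctl enable --now docker

# Run the Spice.ai runtime container
# (8090 = HTTP API, 50051 = Arrow Flight -- assumed defaults)
sudo docker run -d --name spiceai \
  -p 8090:8090 -p 50051:50051 \
  spiceai/spiceai:latest
```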
- Automated EC2 Deployment with CloudFormation:
  - Define your infrastructure in a CloudFormation template, including EC2 instances (using a Linux AMI), security groups, IAM roles, VPC, and subnets
  - Use EC2 UserData to automate Docker installation, pull the Spice.ai Docker image, retrieve configuration or secrets from AWS Systems Manager Parameter Store or Secrets Manager, and run the container with the required environment variables
  - (Optional) Add parameters to your template for VPC ID, Subnet ID, KeyPair, instance type, and secret names to enable flexible deployments
  - (Optional) Store sensitive data such as API keys in Parameter Store or Secrets Manager and reference them securely in UserData
  - (Optional) Deploy and manage your CloudFormation stack using the AWS Console, CLI, or CI/CD pipelines for repeatable, version-controlled infrastructure
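Sketched as a CloudFormation fragment, one possible shape is below. The resource names, template parameters, Parameter Store path, and the `SPICE_API_KEY` environment variable are illustrative assumptions, not a definitive template:

```yaml
Resources:
  SpiceAiInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: !Ref LinuxAmiId                          # template parameter (placeholder)
      InstanceType: !Ref InstanceType                   # template parameter (placeholder)
      IamInstanceProfile: !Ref SpiceAiInstanceProfile   # must allow ssm:GetParameter
      UserData:
        Fn::Base64: |
          #!/bin/bash
          dnf install -y docker
          systemctl enable --now docker
          # Read the API key from SSM Parameter Store (path is a placeholder)
          API_KEY=$(aws ssm get-parameter --name /spiceai/api-key \
            --with-decryption --query Parameter.Value --output text)
          # SPICE_API_KEY is an assumed variable name; check the Spice.ai docs
          docker run -d --name spiceai -e SPICE_API_KEY="$API_KEY" \
            -p 8090:8090 -p 50051:50051 spiceai/spiceai:latest
```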
For detailed guidance and best practices, refer to the AWS CloudFormation User Guide, EC2 User Guide for Linux Instances, and AWS Systems Manager Parameter Store Documentation.
Amazon ECS (Elastic Container Service)
Deploy Spice.ai as containerized tasks on Amazon ECS for easy container management and flexible scaling.
- Create an ECS Cluster:
  - Choose a launch type: EC2 (manage your own EC2 instances) or Fargate (serverless).
  - Create the ECS cluster using the AWS Console, CLI, or Infrastructure as Code (CloudFormation, Terraform).
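With the AWS CLI, creating a Fargate-capable cluster can be sketched in one command (the cluster name is a placeholder):

```shell
# Create an ECS cluster that can run Fargate tasks
aws ecs create-cluster \
  --cluster-name spiceai-cluster \
  --capacity-providers FARGATE FARGATE_SPOT
```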
- Define a Task Definition:
  - Specify the Spice.ai Docker image, resource needs, networking, environment variables, and storage in a Task Definition.
  - (Optional) Use AWS Secrets Manager or Parameter Store to inject secrets securely.
  - Enable logging with Amazon CloudWatch.
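A minimal Fargate task definition covering these points might look like the fragment below; the account ID, region, role and secret ARNs, log group, and the `SPICE_API_KEY` variable name are placeholders, and the ports assume the default Spice.ai endpoints:

```json
{
  "family": "spiceai",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "1024",
  "memory": "2048",
  "executionRoleArn": "arn:aws:iam::123456789012:role/spiceaiExecutionRole",
  "containerDefinitions": [
    {
      "name": "spiceai",
      "image": "spiceai/spiceai:latest",
      "portMappings": [
        { "containerPort": 8090, "protocol": "tcp" },
        { "containerPort": 50051, "protocol": "tcp" }
      ],
      "secrets": [
        {
          "name": "SPICE_API_KEY",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789012:secret:spiceai/api-key"
        }
      ],
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "/ecs/spiceai",
          "awslogs-region": "us-east-1",
          "awslogs-stream-prefix": "spiceai"
        }
      }
    }
  ]
}
```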
- Deploy Spice.ai on ECS:
  - Create an ECS Service to run and manage Spice.ai tasks.
  - Set up load balancing with an NLB.
  - (Optional) Configure auto-scaling based on resource usage or CloudWatch metrics.
  - (Optional) Use CI/CD pipelines for automated updates, and manage infrastructure with CloudFormation, Terraform, or the AWS CLI.
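Registering the task definition and creating a load-balanced service can be sketched as below; the file name, cluster, subnet, security group, and target group identifiers are placeholders to replace with your own resources:

```shell
# Register the task definition written earlier
aws ecs register-task-definition --cli-input-json file://spiceai-task.json

# Create a service running two Fargate tasks behind an NLB target group
aws ecs create-service \
  --cluster spiceai-cluster \
  --service-name spiceai \
  --task-definition spiceai \
  --desired-count 2 \
  --launch-type FARGATE \
  --network-configuration 'awsvpcConfiguration={subnets=[subnet-0abc],securityGroups=[sg-0abc],assignPublicIp=DISABLED}' \
  --load-balancers targetGroupArn=<target-group-arn>,containerName=spiceai,containerPort=8090
```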
For more details, see the Amazon ECS Developer Guide and Spice.ai Docker Deployment Guide.