
3 posts tagged with "helm"

Helm Chart related topics and usage


Spice v1.2.2 (May 13, 2025)

· 4 min read
Jack Eadie
Token Plumber at Spice AI

Announcing the release of Spice v1.2.2! 🌟

Spice v1.2.2 introduces support for Databricks Mosaic AI model serving and embeddings, alongside the existing Databricks catalog and dataset integrations. It adds configurable service ports in the Helm chart and resolves several bugs to improve stability and performance.

Highlights in v1.2.2

  • Databricks Model & Embedding Provider: Spice integrates with Databricks Model Serving for models and embeddings, enabling secure access via machine-to-machine (M2M) OAuth authentication with service principal credentials. The runtime automatically refreshes tokens using databricks_client_id and databricks_client_secret, ensuring uninterrupted operation. This feature supports Databricks-hosted large language models and embedding models.

    models:
      - from: databricks:databricks-llama-4-maverick
        name: llama-4-maverick
        params:
          databricks_endpoint: dbc-46470731-42e5.cloud.databricks.com
          databricks_client_id: ${secrets:DATABRICKS_CLIENT_ID}
          databricks_client_secret: ${secrets:DATABRICKS_CLIENT_SECRET}

    embeddings:
      - from: databricks:databricks-gte-large-en
        name: gte-large-en
        params:
          databricks_endpoint: dbc-42424242-4242.cloud.databricks.com
          databricks_client_id: ${secrets:DATABRICKS_CLIENT_ID}
          databricks_client_secret: ${secrets:DATABRICKS_CLIENT_SECRET}

    For detailed setup instructions, refer to the Databricks Model Provider documentation.

  • Configurable Helm Chart Service Ports: The Helm chart now supports custom service ports, enabling flexible network configurations for deployments. Specify non-default ports in your Helm values file, as sketched below.
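
    A minimal values override sketch (the key names and port numbers below are assumptions for illustration; the chart's values.yaml defines the actual schema and defaults):

    service:
      ports:
        http: 8090
        flight: 50051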

  • Resolved Issues:

    • MCP Nested Tool Calling: Fixed a bug preventing nested tool invocation when Spice operates as the MCP server federating to MCP clients.

    • Dataset Load Concurrency: Corrected a failure to respect the dataset_load_parallelism setting during dataset loading.

    • Acceleration Hot-Reload: Addressed an issue where changes to acceleration enable/disable settings were not detected during hot reload of Spicepod.yaml.

Contributors

Breaking Changes

No breaking changes.

Cookbook Updates

Updated cookbooks:

The Spice Cookbook now includes 68 recipes to help you get started with Spice quickly and easily.

Upgrading

To upgrade to v1.2.2, use one of the following methods:

CLI:

spice upgrade

Homebrew:

brew upgrade spiceai/spiceai/spice

Docker:

Pull the spiceai/spiceai:1.2.2 image:

docker pull spiceai/spiceai:1.2.2

For available tags, see DockerHub.

Helm:

helm repo update
helm upgrade spiceai spiceai/spiceai

What's Changed

Dependencies

  • No major dependency changes.

Changelog

- Update spark-connect-rs to override user agent string by @ewgenius in https://github.com/spiceai/spice/pull/5798
- Merge pull request by @ewgenius in https://github.com/spiceai/spice/pull/5796
- Pass the default user agent string to the Databricks Spark, Delta, and Unity clients by @ewgenius in https://github.com/spiceai/spice/pull/5717
- bump to 1.2.2 by @Jeadie in https://github.com/spiceai/spice/pull/none
- Helm chart: support for service ports overrides by @sgrebnov in https://github.com/spiceai/spice/pull/5774
- Update spice cli login command with client-id and client-secret flags for Databricks by @ewgenius in https://github.com/spiceai/spice/pull/5788
- Fix bug where setting Cache-Control: no-cache doesn't compute the cache key by @phillipleblanc in https://github.com/spiceai/spice/pull/5779
- Update to datafusion-contrib/datafusion-table-providers#336 by @phillipleblanc in https://github.com/spiceai/spice/pull/5778
- Lru cache: limit single cached record size to u32::MAX (4GB) by @sgrebnov in https://github.com/spiceai/spice/pull/5772
- Fix LLMs calling nested MCP tools by @Jeadie in https://github.com/spiceai/spice/pull/5771
- MySQL: Set the character_set_results/character_set_client/character_set_connection session variables on connection setup by @Sevenannn in https://github.com/spiceai/spice/pull/5770
- Control the parallelism of acceleration refresh datasets with runtime.dataset_load_parallelism by @phillipleblanc in https://github.com/spiceai/spice/pull/5763
- Fix Iceberg predicates not matching the Arrow type of columns read from parquet files by @phillipleblanc in https://github.com/spiceai/spice/pull/5761
- fix: Use decimal_cmp for numerical BETWEEN in SQLite by @peasee in https://github.com/spiceai/spice/pull/5760
- Support product name override in databricks user agent string by @ewgenius in https://github.com/spiceai/spice/pull/5749
- Databricks U2M Token Provider support by @ewgenius in https://github.com/spiceai/spice/pull/5747
- Remove HTTP auth from LLM config and simplify Databricks models logic by using static headers by @Jeadie in https://github.com/spiceai/spice/pull/5742
- clear plan cache when dataset updates by @kczimm in https://github.com/spiceai/spice/pull/5741
- Support Databricks M2M auth in LLMs + Embeddings by @Jeadie in https://github.com/spiceai/spice/pull/5720
- Retrieve Github App tokens in background; make TokenProvider not async by @Jeadie in https://github.com/spiceai/spice/pull/5718
- Make 'token_providers' crate by @Jeadie in https://github.com/spiceai/spice/pull/5716
- Databricks AI: Embedding models & LLM streaming by @Jeadie in https://github.com/spiceai/spice/pull/5715

See the full list of changes at: v1.2.1...v1.2.2

Spice v0.12.2-alpha (May 13, 2024)

· 4 min read
Sergei Grebnov
Senior Software Engineer at Spice AI

The v0.12.2-alpha release introduces data streaming and key-pair authentication for the Snowflake data connector, enables general append mode data refreshes for time-series data, improves connectivity error messages, adds nested folders support for the S3 data connector, and exposes nodeSelector and affinity keys in the Helm chart for better Kubernetes management.

Highlights

  • Improved Connectivity Error Messages: Error messages provide clearer, actionable guidance for misconfigured settings or unreachable data connectors.

  • Snowflake Data Connector Improvements: Enables data streaming by default and adds support for key-pair authentication in addition to passwords.

  • API for Refresh SQL Updates: Update dataset Refresh SQL via API.

  • Append Data Refresh: Append mode data refreshes for time-series data are now supported for all data connectors. Specify a dataset time_column with refresh_mode: append to fetch only data more recent than the latest local data (see the Spicepod sketch after this list).

  • Docker Image Update: The spiceai/spiceai:latest Docker image now includes the ODBC data connector. For a smaller footprint, use spiceai/spiceai:latest-slim.

  • Helm Chart Improvements: nodeSelector and affinity keys are now supported in the Helm chart for improved Kubernetes deployment management.
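
A minimal values sketch for node scheduling, using the standard Kubernetes pod spec fields now exposed by the chart (the label keys and values are illustrative):

nodeSelector:
  kubernetes.io/os: linux
affinity:
  nodeAffinity:
    requiredDuringSchedulingIgnoredDuringExecution:
      nodeSelectorTerms:
        - matchExpressions:
            - key: kubernetes.io/arch
              operator: In
              values: ["amd64"]

And for the append data refresh described above, a minimal Spicepod.yaml sketch (the dataset source, name, and time column are placeholders, and the field placement is an assumption; consult the dataset reference for the exact schema):

datasets:
  - from: s3://my-bucket/events/
    name: events
    time_column: created_at
    acceleration:
      enabled: true
      refresh_mode: append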

Breaking Changes

  • The API to trigger accelerated dataset refreshes has changed from POST /v1/datasets/:name/refresh to POST /v1/datasets/:name/acceleration/refresh for consistency with the Spicepod.yaml structure.
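
    For example, assuming the runtime's default HTTP endpoint on port 8090 and a dataset named taxi_trips (both illustrative), a refresh can be triggered with:

    curl -X POST http://localhost:8090/v1/datasets/taxi_trips/acceleration/refresh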

Contributors

  • @mach-kernel
  • @y-f-u
  • @sgrebnov
  • @ewgenius
  • @Jeadie
  • @Sevenannn
  • @digadeesh
  • @phillipleblanc
  • @lukekim

What's Changed

Full Changelog: https://github.com/spiceai/spiceai/compare/v0.12.1-alpha...v0.12.2-alpha

Resources

Community

Spice.ai started with the vision to make AI easy for developers. We are building Spice.ai in the open and with the community. Reach out on Discord or by email to get involved.

Spice.ai v0.10.1-alpha

· 2 min read
Luke Kim
Founder and CEO of Spice AI

Announcing the release of Spice v0.10.1-alpha! 🔥

The v0.10.1-alpha release focuses on stability, bug fixes, and usability by improving error messages when using SQLite data accelerators, improving PostgreSQL support, and adding a basic Helm chart.

Highlights in v0.10.1-alpha

Improved PostgreSQL support for Data Connectors: TLS is now supported for PostgreSQL Data Connectors, and VARCHAR and BPCHAR conversions through Spice have been improved.

Improved Error Messages: Error messages from Spice are simplified when propagating errors from Data Connectors and Accelerator Engines.

Spice Pods Command: The spice pods command gives quick statistics about the models, dependencies, and datasets loaded by the Spice runtime.
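
With the runtime running (for example, via spice run), inspect what is loaded from another terminal:

spice pods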

Kubernetes Helm Deployment

Spice.ai can be deployed to Kubernetes using Helm. Here's a quick guide to get started:

Step 1. (Optional) Start a local kind cluster:

go install sigs.k8s.io/[email protected]
kind create cluster

Step 2. Install Spice in your Kubernetes cluster using Helm:

helm repo add spiceai https://helm.spiceai.org
helm install spiceai spiceai/spiceai

Step 3. Verify that the Spice pods are running:

kubectl get pods
kubectl logs deploy/spiceai

Step 4. Run the Spice SQL REPL inside the running pod:

kubectl exec -it deploy/spiceai -- spiced --repl

Learn more about deploying Spice.ai to Kubernetes

Contributors

  • @phillipleblanc
  • @mitchdevenport
  • @ewgenius
  • @sgrebnov
  • @lukekim
  • @digadeesh

New in this release

Resources

Community

Spice.ai started with the vision to make AI easy for developers. We are building Spice.ai in the open and with the community. Reach out on Discord or by email to get involved.