
Spice v0.16-alpha (July 22, 2024)

· 7 min read
Luke Kim
Founder and CEO of Spice AI

The v0.16-alpha release is the first candidate release for the beta milestone on the path to finalizing the v1.0 developer and user experience. Upgraders should be aware of several breaking changes designed to improve the Secrets configuration experience and to make authoring spicepod.yml files more consistent. See the [Breaking Changes](#breaking-changes) section below for details. Additionally, the Spice Java SDK was released, providing Java developers a simple but powerful native experience for querying Spice.

Highlights in v0.16-alpha

  • Secrets: Secret Stores are now configured in the spicepod.yml secrets section, and more than one store can be specified. For example:

```yaml
secrets:
  - from: env
    name: env
  - from: aws_secrets_manager:my_secret_name
    name: aws_secret
```

Secrets managed by configured Secret Stores can be referenced in component params using the syntax ${<store_name>:<key>}. E.g.

```yaml
datasets:
  - from: postgres:my_table
    name: my_table
    params:
      pg_host: localhost
      pg_port: 5432
      pg_pass: ${ env:MY_PG_PASS }
```
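Secrets from any configured store are referenced the same way, with the store's name as the prefix. As a minimal sketch (the key name below is hypothetical), the aws_secret store configured above could supply the same password:

```yaml
datasets:
  - from: postgres:my_table
    name: my_table
    params:
      pg_host: localhost
      pg_port: 5432
      # The prefix matches the store's name field (aws_secret); the key name is hypothetical.
      pg_pass: ${aws_secret:my_pg_pass}
```
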
  • Java Client SDK: The Spice Java SDK has been released for JDK 17 or greater.

  • Federated SQL Query: Significant stability and reliability improvements have been made to federated SQL query support in most data connectors.

  • ODBC Data Connector: Providing a specific SQL dialect to query ODBC data sources is now supported using the sql_dialect param. For example, when querying Databricks using ODBC, the databricks dialect can be specified to ensure compatibility. Read the ODBC Data Connector documentation for more details.
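
For example, a dataset querying Databricks over ODBC might specify the dialect as in the sketch below; the connection string parameter name and value are assumptions for illustration, not taken from the release notes:

```yaml
datasets:
  - from: odbc:my_databricks_table
    name: my_databricks_table
    params:
      # Illustrative assumption: supply your own driver/DSN connection string.
      odbc_connection_string: ${secrets:databricks_odbc_connection_string}
      # Ensure compatibility by using the Databricks SQL dialect.
      sql_dialect: databricks
```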

Breaking Changes

  • Secret Stores: Secret Stores support has been overhauled, including required changes to the spicepod.yml schema. File-based secrets stored in the ~/.spice/auth file are no longer supported. See the Secret Stores Documentation for a full reference.

To upgrade Secret Stores, rename any parameters ending in _key to remove the _key suffix and specify a secret inline via the secret replacement syntax (${<secret_store>:<key>}):

```yaml
datasets:
  - from: postgres:my_table
    name: my_table
    params:
      pg_host: localhost
      pg_port: 5432
      pg_pass_key: my_pg_pass
```

to:

```yaml
datasets:
  - from: postgres:my_table
    name: my_table
    params:
      pg_host: localhost
      pg_port: 5432
      pg_pass: ${secrets:my_pg_pass}
```

And ensure the MY_PG_PASS environment variable is set.

  • Datasets: The default value of time_format has changed from unix_seconds to timestamp.

To upgrade:

```yaml
datasets:
  - from:
    name: my_dataset
    # Explicitly define the format when not specified.
    time_format: unix_seconds
```

  • HTTP Port: The default HTTP port has changed from port 3000 to port 8090 to avoid conflicts with frontend apps, which typically use ports in the 3000 range. If an SDK is used, upgrade it at the same time as the runtime.

To upgrade and continue using port 3000, run spiced with the --http command line argument:

```bash
# Using Dockerfile or spiced directly
spiced --http 127.0.0.1:3000
```

  • HTTP Metrics Port: The default HTTP metrics port has changed from port 9000 to 9090 to avoid conflicts with other metrics protocols, which typically use port 9000.

To upgrade and continue using port 9000, run spiced with the --metrics command line argument:

```bash
# Using Dockerfile or spiced directly
spiced --metrics 127.0.0.1:9000
```

  • JSON Pointer: The json_path parameter has been replaced with json_pointer, which uses JSON Pointer syntax.

To upgrade, change:

```yaml
json_path: my.json.path
```

To:

```yaml
json_pointer: /my/json/pointer
```

  • Data Connector Configuration: Consistent connector-name prefixing has been applied to connector-specific params. Prefixing parameter names helps ensure parameters do not collide.

For example, the Databricks data connector's specific params are now prefixed with databricks_:

```yaml
datasets:
  - from: databricks:spiceai.datasets.my_awesome_table # A reference to a table in the Databricks unity catalog
    name: my_delta_lake_table
    params:
      mode: spark_connect
      endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com
      token: MY_TOKEN
```

To upgrade:

```yaml
datasets:
  # Example for Spark Connect
  - from: databricks:spiceai.datasets.my_awesome_table # A reference to a table in the Databricks unity catalog
    name: my_delta_lake_table
    params:
      mode: spark_connect
      databricks_endpoint: dbc-a1b2345c-d6e7.cloud.databricks.com # Now prefixed with databricks
      databricks_token: ${secrets:my_token} # Now prefixed with databricks
```

Refer to the Data Connector documentation for parameter naming changes in this release.

  • Clickhouse Data Connector: The clickhouse_connection_timeout parameter has been renamed to connection_timeout, as it applies to the client and is not Clickhouse configuration itself.

To upgrade, change:

```yaml
clickhouse_connection_timeout: time
```

To:

```yaml
connection_timeout: time
```
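
In context, the renamed parameter sits in a dataset's params alongside the prefixed Clickhouse connection params. A minimal sketch; the connection parameter names and values below are assumptions for illustration, not taken from the release notes:

```yaml
datasets:
  - from: clickhouse:my_table
    name: my_table
    params:
      # Connection details here are illustrative assumptions.
      clickhouse_host: localhost
      clickhouse_username: spice
      clickhouse_pass: ${secrets:clickhouse_pass}
      # Client-side setting, so no clickhouse_ prefix.
      connection_timeout: 30s
```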

Contributors

  • @y-f-u
  • @phillipleblanc
  • @ewgenius
  • @github-actions
  • @sgrebnov
  • @lukekim
  • @digadeesh
  • @peasee
  • @Sevenannn

What's Changed

Dependencies

No major dependency updates.

Commits

Full Changelog: https://github.com/spiceai/spiceai/compare/v0.15.2-alpha...v0.16.0-alpha

Resources

Community

Spice.ai started with the vision of making AI easy for developers. We are building Spice.ai in the open and with the community. Reach out on Discord or by email to get involved.