
3 posts tagged with "mcp"

Model-Context-Protocol (MCP) related topics and usage


Spice v2.0-rc.4 (Apr 30, 2026)

· 22 min read
William Croxson
Senior Software Engineer at Spice AI

Announcing the release of Spice v2.0.0-rc.4! 🚀

v2.0.0-rc.4 is the fourth release candidate for advanced testing of v2.0, building on v2.0.0-rc.3.

Highlights in this release candidate include:

  • Elasticsearch Data Connector (Alpha) with native hybrid search (BM25 full-text + kNN vector + RRF)
  • PostgreSQL Native CDC via WAL logical replication, eliminating the need for Debezium or Kafka
  • Multi-vector Embeddings with MaxSim for ColBERT-style late-interaction retrieval
  • Rerank UDTF for hybrid search pipelines with automatic query propagation
  • HashiCorp Vault and Azure Key Vault Secret Stores for enterprise secret management
  • DuckDB Vector Engine with HNSW index support
  • Azure Cosmos DB Connector (RC), Git Connector promoted to RC
  • MCP Streamable HTTP transport
  • Read-only API Key Enforcement on Flight DoGet and async query paths

What's New in v2.0.0-rc.4

Elasticsearch Data Connector (Alpha, Spice.ai Enterprise)

The new Elasticsearch data connector enables querying Elasticsearch indexes as SQL tables with full hybrid search support. Currently available in Spice.ai Enterprise.

Key capabilities:

  • SQL Table Access: Query any Elasticsearch index with standard SQL via a native DataFusion TableProvider.
  • kNN Vector Search: Use the vector_search() UDTF against Elasticsearch-backed vector fields.
  • BM25 Full-Text Search: Use the text_search() UDTF for native Elasticsearch full-text queries.
  • Hybrid Search: Combine kNN and BM25 results with the rrf() UDTF for reciprocal rank fusion.
  • Elasticsearch as a Vector Engine: Accelerated datasets can use Elasticsearch as the backing vector engine for embedding storage and retrieval.

Example configuration:

datasets:
  - from: elasticsearch:my_index
    name: my_data
    params:
      elasticsearch_endpoint: https://my-cluster.es.io:9200
      elasticsearch_username: ${secrets:es_user}
      elasticsearch_password: ${secrets:es_password}
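
Once configured, the search UDTFs can be combined for hybrid retrieval. A sketch, assuming the dataset above and an illustrative query string, following the rrf() usage shown elsewhere in these notes:

```sql
-- Fuse BM25 full-text and kNN vector results with reciprocal rank fusion
SELECT *
FROM rrf(
  vector_search('my_data', 'how do I rotate credentials?'),
  text_search('my_data', 'how do I rotate credentials?')
)
LIMIT 10;
```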

PostgreSQL Native Replication via WAL

Postgres datasets configured with refresh_mode: changes can now stream changes directly from PostgreSQL logical replication (WAL) into any local accelerator, with no Debezium or Kafka required.

Key capabilities:

  • Native Logical Replication: Uses pgoutput decoding to stream INSERT/UPDATE/DELETE events.
  • Automatic Slot Management: Each Spice replica creates a distinct replication slot (spice_<dataset>_<hash>), so multi-replica deployments work automatically. Publications are shared.
  • Bootstrap Snapshot: An initial REPEATABLE READ snapshot seeds the accelerator before replication begins.
  • LSN Acknowledgement: The LsnCommitter sends durable LSN back to Postgres so WAL segments are reclaimed.
  • All Accelerators Supported: Works with DuckDB, SQLite, Postgres, Cayenne, and Arrow accelerators.

Example configuration:

datasets:
  - from: postgres:my_table
    name: my_table
    params:
      pg_host: localhost
      pg_port: 5432
      pg_db: mydb
      pg_publication: my_publication # optional; auto-created if omitted
    acceleration:
      enabled: true
      engine: duckdb
      refresh_mode: changes
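
On the PostgreSQL side, logical replication must be enabled before Spice can stream changes. A minimal server-side setup, assuming superuser access (pre-creating the publication is optional, since Spice auto-creates one when pg_publication is omitted):

```sql
-- Logical decoding requires wal_level = logical (server restart needed)
ALTER SYSTEM SET wal_level = 'logical';

-- Optionally pre-create the publication referenced by pg_publication
CREATE PUBLICATION my_publication FOR TABLE my_table;
```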

Multi-vector Embeddings with MaxSim (Late Interaction)

Column-level embeddings now support list-of-string columns, producing one embedding vector per list element and enabling ColBERT-style late-interaction retrieval.

Key capabilities:

  • Multi-vector per Row: Columns of type List<String> produce List<FixedSizeList<F32, D>>: one embedding per list element.
  • MaxSim / Mean / Sum Scoring: The per-row score is the maximum, mean, or sum of cosine similarities over the list elements. The default is MaxSim (ColBERT-style).
  • _match Column: Returns the specific list element that produced the highest cosine similarity.
  • No Schema Changes Required: Works with existing embedding configurations; activates automatically for list-type columns.
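
Because activation is automatic for list-type columns, a configuration sketch needs nothing beyond a standard column embedding. Table, column, and model names below are hypothetical:

```yaml
datasets:
  - from: postgres:public.papers
    name: papers
    columns:
      - name: passages          # a List<String> column: one embedding per element
        embeddings:
          - from: hf_model      # any registered embedding model
```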

Rerank UDTF for Hybrid Search Pipelines

A new rerank() table-valued function reorders scored results from vector_search, text_search, or rrf by a reranker model's relevance judgements. See Search Functionality for an overview of search UDTFs.

Key capabilities:

  • Auto Query Propagation: The query string is automatically inherited from a nested search UDTF; no repetition is required.
  • Any Chat Model as Reranker: Any registered chat/completion model can serve as a reranker via the built-in LlmRerank adapter (listwise prompt by default; pointwise available).
  • Filter and Projection Pushdown: The RerankExec physical node supports pushdown, reducing data movement.
  • Extensible: A new RerankerModelStore sits alongside ChatModelStore and EmbeddingModelStore; native providers (Cohere, Voyage, BGE) can be added without runtime plumbing changes.

SELECT * FROM rerank(
  rrf(vector_search('my_table', 'query text'), text_search('my_table', 'query text')),
  document => content
) LIMIT 10;

New Secret Stores: HashiCorp Vault and Azure Key Vault

Two new enterprise-grade Secret Stores are now available.

HashiCorp Vault (hashicorp_vault):

  • KV v2 (default) and KV v1 mount support.
  • Auth methods: token, approle, kubernetes, jwt.
  • Token leases are cached and automatically re-acquired on expiry.

secrets:
  - from: hashicorp_vault:secret/my-app
    name: my_secrets
    params:
      hashicorp_vault_addr: https://vault.example.com
      hashicorp_vault_auth_method: approle
      hashicorp_vault_role_id: ${env:VAULT_ROLE_ID}
      hashicorp_vault_secret_id: ${secrets:vault_secret_id}

Azure Key Vault (azure_keyvault):

  • Per-key caching with single-flight fetch coalescing.
  • Auth methods: service principal, managed identity, workload identity, Azure CLI, or auto-detect.
  • Supports sovereign clouds via endpoint parameter.

secrets:
  - from: azure_keyvault:my-vault
    name: my_secrets
    params:
      azure_keyvault_auth_method: managed_identity

DuckDB Vector Engine

DuckDB-accelerated tables can now use DuckDB's HNSW index for vector search via the vector_engine: duckdb option, enabling fast approximate nearest-neighbor search without an external vector store.

Example configuration:

datasets:
  - from: postgres:public.documents
    name: documents
    columns:
      - name: content
        embeddings:
          - from: hf_minilm
            row_id: id
    acceleration:
      enabled: true
      engine: duckdb
      mode: file
    vectors:
      enabled: true
      engine: duckdb
      params:
        duckdb_distance_metric: cosine
        duckdb_hnsw_m: 16
        duckdb_hnsw_ef_construction: 64
        duckdb_hnsw_ef_search: 32

embeddings:
  - from: huggingface:huggingface.co/minishlab/potion-base-2M
    name: hf_minilm
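
With the HNSW index in place, queries go through the same vector_search() UDTF used elsewhere in these notes; the query text below is illustrative:

```sql
-- Approximate nearest-neighbor search served by the DuckDB HNSW index
SELECT content
FROM vector_search('documents', 'approximate nearest neighbors')
LIMIT 5;
```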

New and Promoted Connectors

Azure Cosmos DB (Alpha):

A new read-only Azure Cosmos DB NoSQL / Core SQL API connector built on the azure_data_cosmos 0.30 SDK. Supports cross-partition scans, schema inference from document samples, and key-based auth (connection string or account endpoint + key).

Git Connector (RC):

The Git data connector is promoted to RC status with HTTPS/SSH auth (git_token, git_username/git_password, git_ssh_key), Git LFS support (enable_lfs), and per-repo connection resilience (semaphore, bounded retries with exponential backoff, permanent-error circuit breaking).
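
A configuration sketch using the parameters named above; the repository URL and the shape of the from: scheme are illustrative assumptions, not confirmed by these notes:

```yaml
datasets:
  - from: git:https://github.com/example/repo   # URL and scheme shape are assumptions
    name: repo_files
    params:
      git_token: ${secrets:GITHUB_TOKEN}        # HTTPS token auth
      enable_lfs: true                          # fetch Git LFS objects
```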

DynamoDB Write Support (DML)

DynamoDB datasets now support write-back via INSERT, UPDATE, and DELETE operations, complementing the existing read and CDC streaming capabilities.
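
Write-back uses standard SQL DML against the dataset; the table and column names below are hypothetical:

```sql
INSERT INTO my_ddb_table (id, status) VALUES ('order-1', 'shipped');
UPDATE my_ddb_table SET status = 'delivered' WHERE id = 'order-1';
DELETE FROM my_ddb_table WHERE id = 'order-1';
```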

MCP Streamable HTTP Transport

The MCP server has been upgraded to rmcp 1.5.0 and switched to the Streamable HTTP transport (/v1/mcp), replacing the previous SSE-based endpoint. The client-side transport is updated to StreamableHttpClientTransport.

Security Improvements

Read-only API Key Enforcement: API keys with read-only scope are now strictly enforced on the Flight DoGet path and on async query endpoints, preventing write operations from being issued under a read-only key.

GitHub Workflow Hardening: CI workflows have been hardened with improved security posture to reduce supply-chain risk.

Developer Experience Improvements

  • Actionable Config Errors: Parameter typos, missing secret references, and unknown engine names now produce specific, actionable error messages with Levenshtein-based suggestions, rather than silent drops or generic "missing required parameter" messages.
  • spice init Improvements: Written spicepods now include a yaml-language-server: $schema=... directive for IDE completions. Creation messages print regardless of log level.
  • REPL Improvements: Log filter honors RUST_LOG when -v is not passed; version banner moves to stderr and prints only on an interactive TTY.
  • 403 / 401 Routing: HTTP 403 responses route to a new PermissionDenied variant; 401 messages point at spice login / SPICE_API_KEY.

OpenTelemetry Improvements

See Observability & Monitoring and the runtime.telemetry reference for full configuration details.

  • Metric Name Prefix: Configure a prefix for all exported OTLP metric names via runtime.telemetry.metric_prefix.
  • Delta Temporality Default: The OTLP push exporter now defaults to delta temporality, matching Prometheus and most backends.
  • Resource Attributes: runtime.telemetry.properties are applied as OTLP resource attributes on exported metrics.
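
Pulling the settings above together, a runtime.telemetry sketch; the prefix and property values are illustrative:

```yaml
runtime:
  telemetry:
    metric_prefix: spice_prod_        # prepended to all exported OTLP metric names
    properties:                       # applied as OTLP resource attributes
      deployment.environment: production
      service.namespace: data-platform
```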

Full-text Search Performance

Tantivy full-text search ingestion performance is significantly improved with better batch handling and a rollback-on-error path.

SQL and Query Engine

  • DataFusion Upgrade: Updated to a newer DataFusion revision with additional bug fixes and performance improvements.
  • Views on DDL Catalogs: DDL-defined catalogs (e.g., Unity Catalog) can now expose and query views.
  • flatten_json / json_tree / expand_maps UDTFs: New table-valued functions for JSON transformation, map expansion, and schema decomposition in query pipelines. See JSON Functions and Operators.
  • cosine_distance Pushdown to DuckDB: cosine_distance is now pushed down to DuckDB accelerators via array_cosine_distance.
  • Snowflake Type Support: Added support for OBJECT, MAP, GEOGRAPHY, GEOMETRY, VECTOR, and TIMESTAMP_LTZ types in the Snowflake connector.
  • MySQL Zero-Date Behavior: The MySQL connector adds a new mysql_zero_date_behavior parameter (null or error) controlling how MySQL zero-date values (0000-00-00) are handled.
  • Databricks Timeouts: The Databricks connector adds new connect_timeout and client_timeout parameters for sql_warehouse mode.

Dependency Updates

Dependency / Component | Version / Update
DataFusion             | Updated
rmcp                   | v1.5.0 (from fork pin)
mistral.rs             | Updated
openssl                | 0.10.78

Contributors

Breaking Changes

No breaking changes.

Cookbook Updates

No new cookbook recipes.

The Spice Cookbook includes 86 recipes to help you get started with Spice quickly and easily.

Upgrading

To upgrade to v2.0.0-rc.4, use one of the following methods:

CLI:

spice upgrade v2.0.0-rc.4

Homebrew:

brew upgrade spiceai/spiceai/spice

Docker:

Pull the spiceai/spiceai:2.0.0-rc.4 image:

docker pull spiceai/spiceai:2.0.0-rc.4

For available tags, see DockerHub.

Helm:

helm repo update
helm upgrade spiceai spiceai/spiceai --version 2.0.0-rc.4

AWS Marketplace:

Spice is available in the AWS Marketplace.

What's Changed

Changelog

Full Changelog: https://github.com/spiceai/spiceai/compare/v2.0.0-rc.3...v2.0.0-rc.4

Spice v1.2.2 (May 13, 2025)

· 5 min read
Jack Eadie
Token Plumber at Spice AI

Announcing the release of Spice v1.2.2! 🌟

Spice v1.2.2 introduces support for Databricks Mosaic AI model serving and embeddings, alongside the existing Databricks catalog and dataset integrations. It adds configurable service ports in the Helm chart and resolves several bugs to improve stability and performance.

Highlights in v1.2.2

  • Databricks Model & Embedding Provider: Spice integrates with Databricks Model Serving for models and embeddings, enabling secure access via machine-to-machine (M2M) OAuth authentication with service principal credentials. The runtime automatically refreshes tokens using databricks_client_id and databricks_client_secret, ensuring uninterrupted operation. This feature supports Databricks-hosted large language models and embedding models.

    models:
      - from: databricks:databricks-llama-4-maverick
        name: llama-4-maverick
        params:
          databricks_endpoint: dbc-46470731-42e5.cloud.databricks.com
          databricks_client_id: ${secrets:DATABRICKS_CLIENT_ID}
          databricks_client_secret: ${secrets:DATABRICKS_CLIENT_SECRET}

    embeddings:
      - from: databricks:databricks-gte-large-en
        name: gte-large-en
        params:
          databricks_endpoint: dbc-42424242-4242.cloud.databricks.com
          databricks_client_id: ${secrets:DATABRICKS_CLIENT_ID}
          databricks_client_secret: ${secrets:DATABRICKS_CLIENT_SECRET}

    For detailed setup instructions, refer to the Databricks Model Provider documentation.

  • Configurable Helm Chart Service Ports: The Helm chart now supports custom service ports, enabling flexible network configurations for deployments. Specify non-default ports in your Helm values file.

  • Resolved Issues:

    • MCP Nested Tool Calling: Fixed a bug preventing nested tool invocation when Spice operates as the MCP server federating to MCP clients.

    • Dataset Load Concurrency: Corrected a failure to respect the dataset_load_parallelism setting during dataset loading.

    • Acceleration Hot-Reload: Addressed an issue where changes to acceleration enable/disable settings were not detected during hot reload of Spicepod.yaml.

Contributors

Breaking Changes

No breaking changes.

Cookbook Updates

The Spice Cookbook now includes 68 recipes to help you get started with Spice quickly and easily.

Upgrading

To upgrade to v1.2.2, use one of the following methods:

CLI:

spice upgrade

Homebrew:

brew upgrade spiceai/spiceai/spice

Docker:

Pull the spiceai/spiceai:1.2.2 image:

docker pull spiceai/spiceai:1.2.2

For available tags, see DockerHub.

Helm:

helm repo update
helm upgrade spiceai spiceai/spiceai

What's Changed

Dependencies

  • No major dependency changes.

Changelog

- Update spark-connect-rs to override user agent string by @ewgenius in https://github.com/spiceai/spice/pull/5798
- Merge pull request by @ewgenius in https://github.com/spiceai/spice/pull/5796
- Pass the default user agent string to the Databricks Spark, Delta, and Unity clients by @ewgenius in https://github.com/spiceai/spice/pull/5717
- bump to 1.2.2 by @Jeadie in https://github.com/spiceai/spice/pull/none
- Helm chart: support for service ports overrides by @sgrebnov in https://github.com/spiceai/spice/pull/5774
- Update spice cli login command with client-id and client-secret flags for Databricks by @ewgenius in https://github.com/spiceai/spice/pull/5788
- Fix bug where setting Cache-Control: no-cache doesn't compute the cache key by @phillipleblanc in https://github.com/spiceai/spice/pull/5779
- Update to datafusion-contrib/datafusion-table-providers#336 by @phillipleblanc in https://github.com/spiceai/spice/pull/5778
- Lru cache: limit single cached record size to u32::MAX (4GB) by @sgrebnov in https://github.com/spiceai/spice/pull/5772
- Fix LLMs calling nested MCP tools by @Jeadie in https://github.com/spiceai/spice/pull/5771
- MySQL: Set the character_set_results/character_set_client/character_set_connection session variables on connection setup by @Sevenannn in https://github.com/spiceai/spice/pull/5770
- Control the parallelism of acceleration refresh datasets with runtime.dataset_load_parallelism by @phillipleblanc in https://github.com/spiceai/spice/pull/5763
- Fix Iceberg predicates not matching the Arrow type of columns read from parquet files by @phillipleblanc in https://github.com/spiceai/spice/pull/5761
- fix: Use decimal_cmp for numerical BETWEEN in SQLite by @peasee in https://github.com/spiceai/spice/pull/5760
- Support product name override in databricks user agent string by @ewgenius in https://github.com/spiceai/spice/pull/5749
- Databricks U2M Token Provider support by @ewgenius in https://github.com/spiceai/spice/pull/5747
- Remove HTTP auth from LLM config and simplify Databricks models logic by using static headers by @Jeadie in https://github.com/spiceai/spice/pull/5742
- clear plan cache when dataset updates by @kczimm in https://github.com/spiceai/spice/pull/5741
- Support Databricks M2M auth in LLMs + Embeddings by @Jeadie in https://github.com/spiceai/spice/pull/5720
- Retrieve Github App tokens in background; make TokenProvider not async by @Jeadie in https://github.com/spiceai/spice/pull/5718
- Make 'token_providers' crate by @Jeadie in https://github.com/spiceai/spice/pull/5716
- Databricks AI: Embedding models & LLM streaming by @Jeadie in https://github.com/spiceai/spice/pull/5715

See the full list of changes at: v1.2.1...v1.2.2

Spice v1.1.1 (Apr 7, 2025)

· 6 min read
Phillip LeBlanc
Co-Founder and CTO of Spice AI

Announcing the release of Spice v1.1.1! 📊

Spice v1.1.1 introduces several key updates, including a new Component Metrics System, improved Delta Data Connector performance, improved MCP tool descriptions, and expanded runtime results caching options. This release also adds detailed MySQL connection pool metrics for better observability. Component Metrics are Prometheus-compatible and accessible via the metrics endpoint.

Highlights in v1.1.1

  • Component Metrics System: A new system for monitoring components, starting with MySQL connection pool metrics. These metrics provide insights into MySQL connection performance and can be selectively enabled in the dataset configuration. Metrics are exposed in Prometheus format via the metrics endpoint.

For more details, see the Component Metrics documentation.

  • Results Caching Enhancements: Added a cache_key_type option for runtime results caching. Options include:
    • plan (Default): Uses the query's logical plan as the cache key. Matches semantically equivalent queries but requires query parsing.
    • sql: Uses the raw SQL string as the cache key. Provides faster lookups but requires exact string matches. Use sql for predictable queries without dynamic functions like NOW().

Example spicepod.yaml configuration:

runtime:
  results_cache:
    enabled: true
    cache_max_size: 128MiB
    cache_key_type: sql # Use SQL for the results cache key
    item_ttl: 1s

For more details, see the runtime configuration documentation.

  • Delta Data Connector: Improved scan performance for faster queries.

  • MCP Tools: Built-in MCP tools now have clearer descriptions for better usability.

  • MySQL Component Metrics: Added detailed metrics for monitoring MySQL connections, such as connection count and pool activity.

Example spicepod.yaml configuration:

datasets:
  - from: mysql:my_table
    name: my_dataset
    metrics:
      - name: connection_count
        enabled: true
      - name: connections_in_pool
        enabled: true
      - name: active_wait_requests
        enabled: true
    params:
      mysql_host: localhost
      mysql_tcp_port: 3306
      mysql_user: root
      mysql_pass: ${secrets:MYSQL_PASS}

For more details, see the MySQL Data Connector documentation.

  • spice.js SDK: The spice.js SDK has been updated to v2.0.1 and includes several important security updates.

New Contributors 🎉

Contributors

Breaking Changes

No breaking changes in this release.

Cookbook Updates

The Spice Cookbook now includes 65 recipes to help you get started with Spice quickly and easily.

Upgrading

To upgrade to v1.1.1, use one of the following methods:

CLI:

spice upgrade

Homebrew:

brew upgrade spiceai/spiceai/spice

Docker:

Pull the spiceai/spiceai:1.1.1 image:

docker pull spiceai/spiceai:1.1.1

For available tags, see DockerHub.

Helm:

helm repo update
helm upgrade spiceai spiceai/spiceai

What's Changed

Dependencies

  • No major dependency changes.

Changelog

- fix: Testoperator DuckDB, SQLite, Postgres, Spicecloud by [@peasee](https://github.com/peasee) in [#5190](https://github.com/spiceai/spiceai/pull/5190)
- Update Helm Chart and SECURITY.md to v1.1.0 by [@lukekim](https://github.com/lukekim) in [#5223](https://github.com/spiceai/spiceai/pull/5223)
- Update version.txt to v1.1.1-unstable by [@lukekim](https://github.com/lukekim) in [#5224](https://github.com/spiceai/spiceai/pull/5224)
- Update Cargo.lock to v1.1.1-unstable by [@lukekim](https://github.com/lukekim) in [#5225](https://github.com/spiceai/spiceai/pull/5225)
- Add tests for `verify_schema_source_path` in `ListingTableConnector` by [@phillipleblanc](https://github.com/phillipleblanc) in [#5221](https://github.com/spiceai/spiceai/pull/5221)
- Reduce noise from debug logging by [@phillipleblanc](https://github.com/phillipleblanc) in [#5227](https://github.com/spiceai/spiceai/pull/5227)
- Improve `openai_test_chat_messages` integration test reliability by [@Sevenannn](https://github.com/Sevenannn) in [#5222](https://github.com/spiceai/spiceai/pull/5222)
- Verify the checkpoints existence before shutting down runtime in integration tests directly querying checkpoint by [@Sevenannn](https://github.com/Sevenannn) in [#5232](https://github.com/spiceai/spiceai/pull/5232)
- Fix CORS support for json content-type api by [@sgrebnov](https://github.com/sgrebnov) in [#5241](https://github.com/spiceai/spiceai/pull/5241)
- Fix ModelGradedScorer error: The 'metadata' parameter is only allowed when 'store' is enabled. by [@sgrebnov](https://github.com/sgrebnov) in [#5231](https://github.com/spiceai/spiceai/pull/5231)
- fix: Use `pulls-with-spice-action` and switch to `spiceai-macos` runners by [@peasee](https://github.com/peasee) in [#5238](https://github.com/spiceai/spiceai/pull/5238)
- Use v1.0.3 pulls with spice action by [@lukekim](https://github.com/lukekim) in [#5244](https://github.com/spiceai/spiceai/pull/5244)
- feat: Build ODBC binaries, run testoperator on ODBC by [@peasee](https://github.com/peasee) in [#5237](https://github.com/spiceai/spiceai/pull/5237)
- Bump timeout for several integration test runtime load_components & readiness check by [@Sevenannn](https://github.com/Sevenannn) in [#5229](https://github.com/spiceai/spiceai/pull/5229)
- Validate port is available before binding port for docker container in integration tests by [@Sevenannn](https://github.com/Sevenannn) in [#5248](https://github.com/spiceai/spiceai/pull/5248)
- Update datafusion-table-providers to fix the schema for PostgreSQL materialized views by [@ewgenius](https://github.com/ewgenius) in [#5259](https://github.com/spiceai/spiceai/pull/5259)
- Verify flight server is ready for flight integration tests by [@Sevenannn](https://github.com/Sevenannn) in [#5240](https://github.com/spiceai/spiceai/pull/5240)
- fix: Publish to MinIO inside of matrix on build_and_release by [@peasee](https://github.com/peasee) in [#5258](https://github.com/spiceai/spiceai/pull/5258)
- fix: TPCDS on zero results benchmarks by [@peasee](https://github.com/peasee) in [#5263](https://github.com/spiceai/spiceai/pull/5263)
- Use model as a judge scorer for Financebench by [@sgrebnov](https://github.com/sgrebnov) in [#5264](https://github.com/spiceai/spiceai/pull/5264)
- Fix FinanceBench llm scorer secret name by [@sgrebnov](https://github.com/sgrebnov) in [#5276](https://github.com/spiceai/spiceai/pull/5276)
- Implements support for `runtime.results_cache.cache_key_type` by [@phillipleblanc](https://github.com/phillipleblanc) in [#5265](https://github.com/spiceai/spiceai/pull/5265)
- fix: Testoperator MS SQL, query overrides, dispatcher by [@peasee](https://github.com/peasee) in [#5279](https://github.com/spiceai/spiceai/pull/5279)
- refactor: Delete old benchmarks by [@peasee](https://github.com/peasee) in [#5283](https://github.com/spiceai/spiceai/pull/5283)
- Imporve embedding column parsing performance test by [@Sevenannn](https://github.com/Sevenannn) in [#5268](https://github.com/spiceai/spiceai/pull/5268)
- Add Support for AWS Session Token in S3 Data Connector by [@kczimm](https://github.com/kczimm) in [#5243](https://github.com/spiceai/spiceai/pull/5243)
- Implement Component Metrics system + MySQL connection pool metrics by [@phillipleblanc](https://github.com/phillipleblanc) in [#5290](https://github.com/spiceai/spiceai/pull/5290)
- Add default descriptions to built-in MCP tools by [@lukekim](https://github.com/lukekim) in [#5293](https://github.com/spiceai/spiceai/pull/5293)
- fix: Vector search with cased columns by [@peasee](https://github.com/peasee) in [#5295](https://github.com/spiceai/spiceai/pull/5295)
- Run delta kernel scan in a blocking Tokio thread. by [@phillipleblanc](https://github.com/phillipleblanc) in [#5296](https://github.com/spiceai/spiceai/pull/5296)
- Expose the `mysql_pool_min` and `mysql_pool_max` connection pool parameters by [@phillipleblanc](https://github.com/phillipleblanc) in [#5297](https://github.com/spiceai/spiceai/pull/5297)
- use patched pdf-extract by [@kczimm](https://github.com/kczimm) in [#5270](https://github.com/spiceai/spiceai/pull/5270)

Full Changelog: v1.1.0...v1.1.1