Spark Configurations with Serverless Compute
01-29-2025 06:09 AM
I have a few problems converting my notebooks to run with serverless compute.
Right now I can't set the Delta userMetadata at session/scope level using Spark or SQL.
Setting userMetadata per DataFrame write operation works fine using the option: option("userMetadata","xxxxx"),
but setting spark.databricks.delta.commitInfo.userMetadata at session scope via spark.conf.set,
or SET spark.databricks.delta.commitInfo.userMetadata in SQL, is not supported. A sketch of both approaches is below.
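For illustration, here is a minimal sketch of the two approaches; the DataFrame, target table name, and metadata value are hypothetical placeholders:

```python
# Works on serverless compute: userMetadata set per write operation.
# "df" and "main.default.events" are placeholders for this example.
(df.write
    .format("delta")
    .mode("append")
    .option("userMetadata", "batch-2025-01-29")  # recorded in the Delta commit info
    .saveAsTable("main.default.events"))

# Not supported on serverless compute: the session-scoped config...
spark.conf.set("spark.databricks.delta.commitInfo.userMetadata", "batch-2025-01-29")

# ...and its SQL equivalent:
spark.sql("SET spark.databricks.delta.commitInfo.userMetadata = batch-2025-01-29")
```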
Are there any plans to support this, or to remove the limitation?
Best
01-29-2025 07:30 AM
Hi @RobsonNLPT,
There is an internal feature request for this use case: https://databricks.aha.io/ideas/ideas/DB-I-12401. It is currently in the idea stage, with no ETA on implementation yet.