Spark Configurations with Serverless Compute
3 weeks ago
I'm having a few problems converting my notebooks to run with serverless compute. Right now I can't set the Delta userMetadata at the session/scope level using Spark or SQL.
Setting userMetadata on a DataFrame write operation works fine using option("userMetadata", "xxxxx"), but setting spark.databricks.delta.commitInfo.userMetadata at the session scope (via spark.conf.set in Spark, or SET in SQL) is not supported.
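For reference, a minimal sketch of both patterns, assuming a Databricks notebook where `spark` and a DataFrame `df` already exist; the table name and metadata string are placeholders:

```python
# Works on serverless compute: per-commit userMetadata via the DataFrameWriter option.
# "main.default.my_table" and the metadata string are placeholder values.
(df.write
    .format("delta")
    .mode("append")
    .option("userMetadata", "nightly-ingest-run")  # recorded in commitInfo.userMetadata for this commit only
    .saveAsTable("main.default.my_table"))

# Not supported on serverless compute: session-scoped userMetadata,
# which would apply to every subsequent Delta commit in the session.
spark.conf.set("spark.databricks.delta.commitInfo.userMetadata", "nightly-ingest-run")

# SQL equivalent, also not supported on serverless:
# SET spark.databricks.delta.commitInfo.userMetadata = nightly-ingest-run;
```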
Are there any plans to support this or eliminate the limitation?
Best
3 weeks ago
Hi @RobsonNLPT,
There is an internal feature request for this use case: https://databricks.aha.io/ideas/ideas/DB-I-12401. It's currently in the idea stage, with no ETA for implementation yet.