Hi @seefoods
Please find below my findings for your case.
You don’t need (and can’t meaningfully add) any Spark conf to enable availableNow on Databricks Serverless.
Let me explain clearly, and then show what is safe to do in your decorator.
availableNow is not a Spark conf
For Auto Loader and Kafka, availableNow is a Structured Streaming trigger, not a Spark configuration. Correct usage (this is the only place it belongs):
(spark.readStream.format("cloudFiles")   # or "kafka"
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", schema_path)
    .load(input_path)
    .writeStream
    .trigger(availableNow=True)
    .option("checkpointLocation", checkpoint)
    .start(output_path)
)
There is no spark.conf.set(...) required (or available) for availableNow.
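For completeness, the same pattern applies to a Kafka source. Here is a minimal sketch; the broker, topic, checkpoint and output path names are placeholders, not values from your environment:

# Hedged sketch of the Kafka equivalent: availableNow is still set on the
# writer's trigger, never via spark.conf. All values below are placeholders.
(spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", kafka_bootstrap)   # placeholder
    .option("subscribe", topic)                           # placeholder
    .option("startingOffsets", "earliest")
    .load()
    .writeStream
    .trigger(availableNow=True)    # drain what is available, then stop
    .option("checkpointLocation", checkpoint)
    .start(output_path)
)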
Why Spark conf won’t work on Serverless
On Databricks Serverless:
- SparkSession is already created
- executor / driver / streaming engine configs are locked
- SparkSession.builder.getOrCreate():
  - does not create a new session
  - silently ignores most configs (see the quick check after this list)
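A quick way to see this for yourself, as a hedged sketch: on Serverless, getOrCreate() simply hands back the pre-created session. The config key below is made up purely for illustration.

from pyspark.sql import SparkSession

# Hedged sketch: the builder does not build a new session on Serverless.
# "some.custom.conf" is a made-up key used only for illustration.
existing = SparkSession.getActiveSession()   # the session Serverless already created for you
returned = (SparkSession.builder
    .config("some.custom.conf", "value")     # ignored or restricted on Serverless
    .getOrCreate())

print(returned is existing)                  # True -- the same session comes back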
So adding something like:
(SparkSession.builder
    .config("spark.sql.streaming.availableNow", "true")  # this conf does not exist
    .getOrCreate())