Hi @Hritik_Moon ,
I don't think you can. To disable the disk cache you need to be able to run the following command:
```
spark.conf.set("spark.databricks.io.cache.enabled", "[true | false]")
```
But serverless compute does not support setting most Spark properties for notebooks or jobs; only a small allow-list of properties can be configured, and the disk cache setting is not among them.

So, if you want a proper environment for learning Apache Spark optimization, use an OSS Apache Spark Docker container as an alternative.
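A minimal sketch of spinning up such a container, assuming the official `apache/spark` image on Docker Hub (where Spark lives under `/opt/spark`), might look like:

```shell
# Pull the official OSS Apache Spark image (assumption: apache/spark on Docker Hub)
docker pull apache/spark:latest

# Start an interactive PySpark shell inside the container;
# in these images the Spark installation is under /opt/spark
docker run -it apache/spark:latest /opt/spark/bin/pyspark
```

Note that the Databricks disk cache does not exist in OSS Spark at all, so there you would experiment with Spark's own caching (`df.cache()` / `df.persist()`) instead of `spark.databricks.io.cache.enabled`.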