@M Baig​ Yes, you just need to create a service account for Databricks and then assign it the Storage Admin role on the bucket. After that you can mount GCS the standard way: set `bucket_name = "<bucket-name>"` and `mount_name = "<mount-name>"`, then call `dbutils.fs.mount("gs://%s" % bucket_name, ...)`.
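In case it helps, here is that mount pattern as one runnable sketch. The bucket and mount names are placeholders you must fill in, and `dbutils`/`display` only exist inside a Databricks notebook, so the call is guarded for local runs:

```python
# Standard GCS mount pattern for Databricks (names below are placeholders).
bucket_name = "<bucket-name>"
mount_name = "<mount-name>"

source = "gs://%s" % bucket_name      # GCS URI of the bucket
mount_point = "/mnt/%s" % mount_name  # DBFS path where the bucket will appear

# dbutils and display are injected by the Databricks runtime; outside a
# notebook they are undefined, so guard the call for local execution.
try:
    dbutils.fs.mount(source, mount_point)   # mount the bucket into DBFS
    display(dbutils.fs.ls(mount_point))     # sanity-check: list its contents
except NameError:
    pass  # not running on a Databricks cluster
```

Note that the cluster's service account (the one you granted Storage Admin) is what actually authorizes the mount.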
@Jensen Ackles​ Here is the answer to your concern; I found it in the Databricks docs at the link below: https://docs.databricks.com/sql/user/dashboards/index.html#refresh-a-dashboard. Hope this helps. From the "Refresh a dashboard" section: "Dashboards should load quick..."
Refer to the link above, which may match your concern: https://sparkbyexamples.com/pyspark/pyspark-distinct-to-drop-duplicates/. It covers removing duplicate rows with PySpark's `distinct()` and `dropDuplicates()`. Hope this helps.