- 1036 Views
- 1 replies
- 0 kudos
I used to download SQL query output from the notebook UI, but now I am unable to download files.
Latest Reply
This is a workspace-level configuration. Your workspace admin has probably disabled it. If you have admin privileges on your workspace, you can enable it from the Admin Console -> Workspace Settings.
- 742 Views
- 1 replies
- 0 kudos
I have a Delta table in ADLS, and for the same table I have defined an external table in Hive. After creating the Hive table and generating manifests, I am loading the partitions using MSCK REPAIR TABLE. All the partition columns are the same, but s...
Latest Reply
Can you please check the partition column order? Is it in the same sequence as before, or has it changed?
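As a rough sketch of the workflow described above — regenerating the manifest and re-syncing the Hive partitions so the partition order can be checked — something like the following could be run from a notebook. The table names and ADLS path are placeholders, not values from the thread.

```python
# Hypothetical sketch: table names and path are placeholders.
# Regenerate the symlink manifest for the Delta table in ADLS:
spark.sql("""
  GENERATE symlink_format_manifest
  FOR TABLE delta.`abfss://container@account.dfs.core.windows.net/tables/events`
""")

# Re-sync the Hive external table's partitions from the manifest directory:
spark.sql("MSCK REPAIR TABLE hive_db.events")

# Inspect the partition spec to confirm the column order matches the
# Delta table's partitioning:
spark.sql("SHOW PARTITIONS hive_db.events").show(truncate=False)
```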
- 1648 Views
- 1 replies
- 0 kudos
I had a cluster that I used in the past. I no longer see the cluster. I checked with the admin and my team, and everyone confirmed that there was no user deletion.
Latest Reply
If a cluster is unused for 30 days, Databricks removes it. This is a general clean-up policy. It's possible to exempt a cluster from this clean-up by pinning it. https://docs.databricks.com/clusters/clusters-manage.html#pin-a-c...
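Besides pinning a cluster in the UI, the Clusters REST API exposes a pin endpoint. A minimal sketch of building such a request is below; the workspace host, token, and cluster ID are placeholder values, and the helper function name is my own.

```python
import json
import urllib.request


def pin_cluster_request(host: str, token: str, cluster_id: str) -> urllib.request.Request:
    """Build (but do not send) a POST to the Clusters API pin endpoint."""
    url = f"{host}/api/2.0/clusters/pin"
    body = json.dumps({"cluster_id": cluster_id}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


# To actually pin the cluster, send the request:
#   urllib.request.urlopen(pin_cluster_request(host, token, cluster_id))
```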
- 1018 Views
- 1 replies
- 0 kudos
In a notebook, my code reads and writes data to Delta. My Delta table is partitioned by calendar_date. After the initial load I am able to read the Delta files and the data looks just fine. But after the second load, for 6 months of data, the previous part...
Latest Reply
I think you are writing the data in overwrite mode. What happens in Delta is that it doesn't delete the data for a certain number of days, even when it is written in overwrite mode, because of versioning, and by default you will be able to query only the most recent data. But in the Parquet forma...
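The behavior described in the reply can be sketched as follows: after an overwrite, a default read returns only the latest version, while earlier versions remain queryable via time travel. The path and DataFrame name are placeholders.

```python
# Hypothetical sketch: path and df_new are placeholders.
path = "/mnt/delta/events"

# Overwrite the table; Delta keeps the replaced files for versioning
# rather than deleting them immediately:
df_new.write.format("delta").mode("overwrite").save(path)

# A default read returns only the most recent version:
latest = spark.read.format("delta").load(path)

# Earlier data is still there and can be read back with time travel:
first_load = spark.read.format("delta").option("versionAsOf", 0).load(path)
```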
- 441 Views
- 1 replies
- 0 kudos
I am trying to create a Delta table, and it seems the Delta table requires additional permissions on the parent folder of the table. The command failed with permission errors. I tried to create a Parquet table and it works fine.
Latest Reply
Delta is a non-Hive-compatible table format, so the client must also have permissions on the path to the database's location so that it can create a new temporary "directory" there. This comes from Spark SQL's handling of external tabl...
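For context, a statement like the one below is the kind of external Delta table creation that can hit this: besides write access to the table path itself, the caller also needs access on the database's location path. The database name and storage path here are placeholders.

```python
# Hypothetical sketch: database name and ADLS path are placeholders.
# Creating an external Delta table requires permissions both on the
# LOCATION path and on the parent database's location, where Spark SQL
# creates a temporary staging directory.
spark.sql("""
  CREATE TABLE my_db.events
  USING DELTA
  LOCATION 'abfss://container@account.dfs.core.windows.net/tables/events'
""")
```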
- 912 Views
- 1 replies
- 0 kudos
We are using the internal metastore implementation, i.e. the metastore is hosted on the Databricks side. However, we believe the metastore instance made available for my workspace is not adequate to handle the load. How can I monitor the number of...
Latest Reply
Use the code snippet below from a notebook:
%scala
import java.sql.Connection
import java.sql.DriverManager
import java.sql.ResultSet
import java.sql.SQLException
/**
* For details on what this query means, checkout https://dev.mysql.com/doc/refma...
- 721 Views
- 1 replies
- 1 kudos
My company uses Immuta for data governance. Will Databricks be able to fit into our existing security patterns?
Latest Reply
Yes, check out the Immuta web page on the Databricks integration: https://www.immuta.com/integrations/databricks
- 2048 Views
- 1 replies
- 0 kudos
Is there a way to compute the cost associated with every SQL Analytics query?
Latest Reply
Right now, we do not have an option to measure the compute cost at a query level.
- 222 Views
- 0 replies
- 0 kudos
Best practices: Cluster configuration | Databricks on AWS
Learn best practices when creating and configuring Databricks clusters.
https://docs.databricks.com/clusters/cluster-config-best-practices.html
- 239 Views
- 0 replies
- 0 kudos
Best practices | Databricks on Google Cloud
Learn best practices when using or administering Databricks.
https://docs.gcp.databricks.com/best-practices-index.html
- 216 Views
- 0 replies
- 0 kudos
Best practices - Azure Databricks
Learn best practices when using or administering Azure Databricks.
https://docs.microsoft.com/en-us/azure/databricks/best-practices-index
- 268 Views
- 0 replies
- 0 kudos
Best practices | Databricks on AWS
Learn best practices when using or administering Databricks.
https://docs.databricks.com/best-practices-index.html
- 1874 Views
- 1 replies
- 0 kudos
I am seeing that with new commits the old checkpoints are getting removed, and I can time travel only the last 10 versions. Is there any way I can prevent this so that Delta checkpoints are not removed? I'm using Azure Databricks 7.3 LTS ML.
Latest Reply
If you want to keep your checkpoints X days, you can set delta.checkpointRetentionDuration to X days this way:
spark.sql(f"""
ALTER TABLE delta.`path`
SET TBLPROPERTIES (
  delta.checkpointRetentionDuration = 'X days'...
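A complete, runnable version of a statement along those lines might look like the following; the table path and the 30-day retention value are placeholders, not values from the thread.

```python
# Hypothetical sketch: table path and retention period are placeholders.
# Keep Delta checkpoints around for 30 days so older versions remain
# reachable for time travel.
days = 30
spark.sql(f"""
  ALTER TABLE delta.`/mnt/delta/my_table`
  SET TBLPROPERTIES (
    delta.checkpointRetentionDuration = '{days} days'
  )
""")
```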
- 709 Views
- 1 replies
- 0 kudos
My VACUUM command is stuck. I am not sure if it's deleting any files.
Latest Reply
There is no direct way to track the progress of the VACUUM command. One easy workaround is to run a DRY RUN from another notebook, which will give a rough estimate of the files to b...
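The dry-run workaround mentioned above could look like this; the table path and retention window are placeholders.

```python
# Hypothetical sketch: path and retention window are placeholders.
# DRY RUN lists the files that VACUUM would delete without removing
# anything, giving a rough sense of how much work the real VACUUM has.
spark.sql("VACUUM delta.`/mnt/delta/my_table` RETAIN 168 HOURS DRY RUN").show(truncate=False)
```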
- 1471 Views
- 1 replies
- 0 kudos
I have a directory where I get files with the same name multiple times. Will Auto Loader process all the files, or will it process the first and ignore the rest?
Latest Reply
Auto Loader has an option, "cloudFiles.allowOverwrites", which determines whether input directory file changes are allowed to overwrite existing data. This option is available in Databricks Runtime 7.6 and above.
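A minimal sketch of an Auto Loader stream with that option enabled is below; the file format, input path, checkpoint location, and output path are placeholders, and it assumes Databricks Runtime 7.6+.

```python
# Hypothetical sketch: paths and file format are placeholders.
# With cloudFiles.allowOverwrites enabled, a file rewritten in the input
# directory is processed again instead of being ignored.
stream = (
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.allowOverwrites", "true")
        .load("/mnt/raw/input")
)

(stream.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/raw/_checkpoint")
    .start("/mnt/delta/output"))
```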