In the first cell of my notebooks, I record a changelog for all changes made in the notebook in Markdown. However, as this list grows longer and longer, I want to implement a dropdown list. Is there any way to do this in Markdown in Databricks? For t...
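Standard Markdown has no dropdown element, but the HTML `<details>`/`<summary>` pair produces a collapsible section and is honored by many Markdown renderers. Whether the Databricks notebook `%md` renderer supports it may depend on the runtime version, so treat this as a sketch to try (the dates and entries are placeholders):

```html
<details>
<summary>Changelog (click to expand)</summary>

- 2022-05-01: example entry
- 2022-05-10: example entry

</details>
```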
Hi @Jesse vd S​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
I want to know how DataFrame transformations work. Suppose I have a DataFrame instance df1. I apply some operation to it, say a filter. As every operation gives a new DataFrame, let's say we now have df2. So we have two DataFrame instances now, df1 ...
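Spark DataFrames are immutable and transformations are lazy: `df1.filter(...)` returns a new DataFrame that merely records the filter in its logical plan, `df1` itself is untouched, and no work happens until an action like `collect()`. The idea can be sketched in plain Python (this `Frame` class is a hypothetical stand-in for illustration, not Spark itself):

```python
class Frame:
    """Toy immutable, lazily-evaluated 'DataFrame' (illustration only)."""

    def __init__(self, data, ops=()):
        self._data = data   # source rows
        self._ops = ops     # recorded transformations (the "plan")

    def filter(self, pred):
        # Returns a NEW Frame; self is unchanged (immutability).
        return Frame(self._data, self._ops + (("filter", pred),))

    def collect(self):
        # Action: only now is the recorded plan actually executed.
        rows = self._data
        for kind, fn in self._ops:
            if kind == "filter":
                rows = [r for r in rows if fn(r)]
        return rows

df1 = Frame([1, 2, 3, 4])
df2 = df1.filter(lambda x: x > 2)   # new instance; plan grows, no work yet
print(df1.collect())  # [1, 2, 3, 4] -- df1 is unaffected by df2's filter
print(df2.collect())  # [3, 4]
```

So yes, two instances exist, but `df2` holds only a plan referencing the same source until an action forces evaluation.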
Hi @mghildiy​ Does @Kaniz Fatma​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
Two related questions: 1: There have been several mentions in this forum of "Databricks Tunnel", which should allow us to connect from our local IDE to a remote Databricks cluster and develop locally. The rumors said early 2022; is there some...
Hi there @Erik Parmann​ Does @Youssef Mrini​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks
Hi, Databricks! You are my favorite big data tool, but I've recently run into an issue I didn't expect to have. For our agriculture customers, we're trying to use the Databricks SQL platform to keep our data accurate all day. We use Alerts to validate our d...
Hi @Dmytro Imshenetskyi​ Does @Hubert Dudek​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
Hi there, I used my company email to register an account on customer-academy.databricks.com a while back. Now I need to create an account on partner-academy.databricks.com using my company email too. However, when I register at partner-...
I'm trying to delete a table that was created from a CSV, and because the underlying file was deleted, I can't drop the table; I get the following error: I'm new to Databricks and don't know how to fix this. Can someone help?
When deleting the table, Spark looks for the underlying Delta log file, and because that file doesn't exist, it throws that error. Just drop the table:
DROP TABLE <table_name>
Hello Team, I have written this Spark SQL query in Databricks:
DROP TABLE IF EXISTS Salesforce.Location;
CREATE EXTERNAL TABLE Salesforce.Location (
  Id STRING,
  OwnerId STRING,
  IsDeleted BIGINT,
  Name STRING,
  CurrencyIsoCode STRING,
  CreatedDate BIGINT,
  CreatedById ...
You need to provide one of the following values for 'data_source': TEXT, AVRO, CSV, JSON, PARQUET, ORC, or DELTA. For example: USING PARQUET. If you skip the USING clause, the default data source is DELTA. https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-create-t...
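Applied to the statement from the question, the DDL with the USING clause added would look like the sketch below (column list abbreviated to the part quoted above; the LOCATION path is a hypothetical placeholder, as an external table needs one):

```sql
DROP TABLE IF EXISTS Salesforce.Location;

CREATE EXTERNAL TABLE Salesforce.Location (
  Id STRING,
  OwnerId STRING,
  IsDeleted BIGINT,
  Name STRING
  -- ... remaining columns from the original statement
)
USING PARQUET
LOCATION '/mnt/salesforce/location';  -- hypothetical path
```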
I have a Delta Live Tables pipeline that loads and transforms data. Currently I am having a problem where the schema inferred by DLT does not match the actual schema of the table. The table is generated via a groupBy.pivot operation as follows:...
I need to process a number of files where I manipulate file text using an external executable that operates on stdin/stdout. I am quite new to Spark. What I am attempting is to use rdd.pipe, as in the following: exe_path = "/usr/local/bin/external...
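For context, `rdd.pipe(cmd)` feeds each partition's elements to the command's stdin, one per line, and yields the command's stdout lines back as a new RDD. What happens per partition can be sketched in plain Python with `subprocess` (here `tr` stands in for the external executable; `pipe_partition` is a hypothetical helper, not a Spark API):

```python
import subprocess

def pipe_partition(lines, cmd):
    """Approximate what rdd.pipe does for one partition: write each
    element to the command's stdin (newline-terminated) and return
    the command's stdout split into lines."""
    proc = subprocess.run(
        cmd,
        input="\n".join(lines) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.splitlines()

# 'tr a-z A-Z' stands in for the external stdin/stdout executable.
out = pipe_partition(["hello", "world"], ["tr", "a-z", "A-Z"])
print(out)  # ['HELLO', 'WORLD']
```

In Spark itself the equivalent would be `rdd.pipe("/usr/local/bin/your_exe")`, with the executable available on every worker node.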
Let's assume that we have 3 streaming Delta tables: Bronze, Silver, and Gold. My aim is to add partitioning to the Silver table (for example by date). As a result, the Gold table will throw an error that the source table has been updated, and I would need to set 'ignoreC...
Hi @Leszek​, We haven't heard from you since the last response from @Werner Stinckens​, and I was checking back to see if his suggestions helped you. Otherwise, if you have found a solution, please do share it with the community, as it can be helpful to o...
Until recently, all notebooks containing 'import anomalydetection' worked just fine. There was no change in any configuration of the cluster, the notebook, or our imported library. However, the notebooks now just crash with the error below. The same happens also...
Solution: This is due to the latest version of the protobuf library; please downgrade the library, which should solve the issue:
pip install protobuf==3.20.*
protobuf versions known to work: 3.20.1; if that does not work, try 3.18.1.
Hi, I'm very new to Terraform. Currently, I'm trying to automate the service principal setup process using Terraform. Following this example, I successfully created a service principal and an access token. However, when I tried adding databricks_git_cr...
I have some Python code that takes Parquet files from an ADLS Gen2 location and merges them into Delta tables (run as a scheduled workflow job). I have a try/catch wrapper around this so that any files that fail get moved into a failed folder using dbu...
That's the problem: the file is not being locked (or fs.mv() isn't checking/honoring the lock). The upload process/tool is a 3rd-party external tool. I can see via the upload tool that the file upload is 'in progress'. I can also see the 0-byte destination file...
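When the writer doesn't lock the file, a common workaround is to treat a file as complete only once its size has been stable (and non-zero) across two polls before moving it. A local-filesystem sketch of that idea (`wait_until_stable` and the poll interval are hypothetical; in a Databricks job you would apply the same check using file info from dbutils):

```python
import os
import time

def wait_until_stable(path, interval=1.0, attempts=10):
    """Return True once the file's size stops changing between polls
    and is non-zero, i.e. the (non-locking) uploader has probably
    finished writing. Return False if it never settles."""
    last = -1
    for _ in range(attempts):
        size = os.path.getsize(path)
        if size == last and size > 0:   # unchanged and non-empty
            return True
        last = size
        time.sleep(interval)
    return False

# Example: a file written in one shot is stable after two polls.
with open("/tmp/demo_upload.parquet", "wb") as f:
    f.write(b"payload")
print(wait_until_stable("/tmp/demo_upload.parquet", interval=0.1))  # True
```

A 0-byte in-progress destination file, like the one described above, would keep returning False until the upload actually finishes.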