Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
spark.sql("CREATE TABLE integrated.TrailingWeeks(ID bigint GENERATED BY DEFAULT AS IDENTITY (START WITH 0 increment by 1) ,Week_ID int NOT NULL) USING delta OPTIONS (path 'dbfs:/<Path in Azure datalake>/delta')")
Hi, when you define an identity column in Databricks with GENERATED BY DEFAULT AS IDENTITY (START WITH 0 INCREMENT BY 1), it is expected to start at 0 and increment by 1. However, due to Databricks' distributed architecture, the values may not be strictly consecutive: each task reserves its own block of identity values, so the generated IDs are unique and increasing but can contain gaps.
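A minimal sketch of how to observe this (hypothetical table name; run in a Databricks notebook, assuming a recent runtime that lets DataFrame appends omit the identity column): inserting from several partitions produces unique, ascending IDs, but gaps between them are normal.

```python
# Hypothetical table name; each writing task reserves its own block of IDs.
spark.sql("""
CREATE TABLE IF NOT EXISTS integrated.identity_demo (
  ID bigint GENERATED BY DEFAULT AS IDENTITY (START WITH 0 INCREMENT BY 1),
  Week_ID int NOT NULL
) USING delta
""")

# Append from 4 partitions; the identity column is omitted and auto-filled.
(spark.range(0, 10)
    .selectExpr("CAST(id AS int) AS Week_ID")
    .repartition(4)
    .write.format("delta").mode("append")
    .saveAsTable("integrated.identity_demo"))

# IDs are unique and follow the increment direction, but gaps are expected.
spark.sql("SELECT ID, Week_ID FROM integrated.identity_demo ORDER BY ID").show()
```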
Hi, I'm trying to run my notebooks but nothing happens. I had to create a cluster in order to start my code; pressing the play button inside the notebook does nothing at all, and on the Compute page, pressing play on the cluster gives the error...
This is a very common issue I see with Community Edition. I suppose the only workaround is to create a new cluster each time. More info on Stack Overflow: https://stackoverflow.com/questions/69072694/databricks-community-edition-cluster-wont-start
Hi, I'm trying to create a calendar dimension that includes a fiscal year with a fiscal start of April 1. I'm using the fiscalyear library and am setting the start to month 4, but it insists on treating April as month 7. Runtime: 12.1. My code snippet is: start_...
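For what it's worth, April showing up as month 7 is exactly what fiscalyear's default calendar (US federal, October 1 start) produces, which suggests the setup call isn't taking effect before the fiscal objects are built. A minimal sketch, assuming the goal is an April 1 start:

```python
import fiscalyear

# Default is the US federal calendar (fiscal year starts October 1), under
# which April is fiscal month 7 -- the symptom described above.
# Configure the calendar BEFORE constructing any Fiscal* objects.
fiscalyear.setup_fiscal_calendar(start_year='same', start_month=4, start_day=1)

d = fiscalyear.FiscalDate(2023, 4, 1)
print(d.fiscal_year)     # 2023 (year labeled by its starting calendar year)
print(d.fiscal_quarter)  # 1   (April opens the fiscal year)
```

start_year='same' labels the fiscal year by the calendar year it starts in; use 'previous' if your convention labels it by the year it ends in.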
Hi, I have a table as below:

create table default.test_user (
  ID bigint NOT NULL GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1),
  usr1 varchar(255) NOT NULL,
  ts1 timestamp NOT NULL,
  usr2 varchar(255) NOT NULL,
  ts2 timestamp NOT NULL
) USING Delta ...
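For reference, a minimal sketch of writing to a table defined like this (hypothetical values): with GENERATED BY DEFAULT the ID column can simply be omitted and Delta generates it, or an explicit value can be supplied.

```python
# Hypothetical values; ID is omitted from the column list, so Delta fills it.
spark.sql("""
INSERT INTO default.test_user (usr1, ts1, usr2, ts2)
VALUES ('alice', current_timestamp(), 'bob', current_timestamp())
""")

spark.sql("SELECT * FROM default.test_user").show()
```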
Hi @Himanshu Agrawal, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us...
I am trying to add a button in a notebook to trigger execution of another notebook, but nothing happens when I click it. Any idea why? The run command works if I run it in a separate cell.
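A minimal sketch of one way to wire this up (assumes a runtime with ipywidgets support, DBR 11+, and a hypothetical child-notebook path; dbutils is the Databricks notebook global):

```python
import ipywidgets as widgets
from IPython.display import display

def on_click(_):
    # dbutils.notebook.run(path, timeout_seconds, arguments)
    result = dbutils.notebook.run("/Shared/child_notebook", 600, {})
    print(result)

button = widgets.Button(description="Run child notebook")
button.on_click(on_click)
display(button)
```

Note that widget callbacks run outside the normal cell execution flow, so printed output can land somewhere unexpected, and older runtimes without ipywidgets support will show an unresponsive button.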
Databricks now supports running selected text within a cell, which helps a lot when debugging code. On Windows, just select the line(s) of code you want to execute and press Ctrl+Shift+Enter.
I'm trying to stop my_cluster from the Compute page using an admin role. Also, using the same account, I could not restart my_cluster. The error information is as follows. What should I do?
Hi, I am practicing with Databricks. In the sample notebooks I have seen writeStream used both with and without the .start() method. Samples are below. Without .start():

spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", ...
Hi @Mohammad Saber, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first; otherwise the Bricksters will get back to you soon. Thanks!
Hello, I've spent some time trying to initialize a couple of clusters in an Azure Databricks private environment. For no apparent reason they've been failing with these two errors, along with advice to retry in a few minutes. Looking for information...
Hello both! I attach the two error details here.

Bootstrap timeout:
Bootstrap Timeout. Please try again later.
Instance bootstrap failed command: BootstrapTimeout
Failure message: Bootstrap script took too long and timeout. Please try again later.
VM ex...
I found an issue: for a table with an identity column defined, when a column is renamed using this method, the identity definition is removed. That means using an identity column in a table requires extra attention to check whether the identity definition is still...
To avoid reloading the table, I found we can upgrade the table protocol version and then use the RENAME COLUMN command:

ALTER TABLE test_id2 SET TBLPROPERTIES (
  'delta.columnMapping.mode' = 'name',
  'delta.minReaderVersion' = '2',
  'delta.minWriterVersion' = '6'
)

ALTER TABLE test_id2 RENAME COLUMN <old_name> TO <new_name>
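A quick way to check whether the identity definition survived the rename (a sketch; table name from the snippet above):

```python
# The IDENTITY clause should still appear on the column in the generated DDL.
spark.sql("SHOW CREATE TABLE test_id2").show(truncate=False)
```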
Hello guys, I am new to Databricks. I have tried to read as much of the documentation as I can, and now I want to jump in. What I want: I have stored my parquet file in the Databricks storage system. I want to load this file into a Data Lake table, and then want to do ...
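One common way to do this (a sketch with a hypothetical path and table name) is to read the parquet file and save it as a Delta table that can then be queried with SQL:

```python
# Hypothetical path and table name.
df = spark.read.parquet("dbfs:/FileStore/my_data.parquet")

# Save as a managed Delta table.
df.write.format("delta").mode("overwrite").saveAsTable("default.my_table")

spark.sql("SELECT * FROM default.my_table LIMIT 10").show()
```

If the parquet files already live in a directory you want to keep in place, CONVERT TO DELTA parquet.`/path/to/dir` is another option.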
Hi @Learner bricks, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
I am using the Community Edition of Databricks for learning and hands-on projects. However, when I try to create a cluster today, I am getting an error popup: "Backend service unavailable". I would like to know if it is a problem with my account or a backend...
Hey there @Venkat K, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
After accumulating many updates to a Delta table with a column like keyExample bigint GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1), my identity column values are in the hundreds of millions. Is there any way I can reset this value through vacuuming...
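Vacuuming won't reset the identity high-water mark. One hedged approach (a sketch with hypothetical names) is to recreate the table with a fresh identity definition and reinsert the non-identity columns, which restarts generation at the START WITH value:

```python
# Hypothetical names; new identity values restart from START WITH 1.
spark.sql("""
CREATE TABLE my_schema.my_table_reset (
  keyExample bigint GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
  payload string
) USING delta
""")

# Reinsert everything except the old identity column; Delta regenerates keys.
spark.sql("""
INSERT INTO my_schema.my_table_reset (payload)
SELECT payload FROM my_schema.my_table
""")
```

Note this reassigns the keys, so anything that references the old values will break.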
Hey there @Andrew Fogarty, does @Werner Stinckens's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Otherwise, please let us know if you need more help. Thanks!
Hello, I'm getting this message today every time I try to run a notebook since I logged in. When I click 'OK' nothing happens. I've already tried refreshing and switching browsers, and tried 'run this cell', 'run below cells', 'run all cells', etc. I also tried to cr...
I've tried this, but it doesn't appear to be working: https://community.databricks.com/s/question/0D53f00001GHVX1CAP/unable-to-install-sf-and-rgeos-r-packages-on-the-cluster. When I run the following after that init script, I receive an error: library(r...
Hey there @Christopher Flach, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you.