Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by Garvita1 (New Contributor II)
  • 2942 Views
  • 5 replies
  • 2 kudos

Databricks Certified Data Engineer Associate Certificate and Badge not received

I attempted the exam and passed, but I have not received the badge and certificate. I have also raised a request but have not received any response yet. It is urgently required. I request the Databricks team to provide me with the same as ...

Latest Reply
Anonymous (Not applicable)
  • 2 kudos

Hi @Garvita Kumari, just a friendly follow-up: were you able to get your certification? If yes, please mark the answer as best; if you need further assistance, kindly let me know. Thanks and regards

4 More Replies
by Manimkm08 (New Contributor III)
  • 3096 Views
  • 3 replies
  • 0 kudos

Jobs fail with AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE

We have assigned 3 dedicated subnets (one per AZ) to the Databricks workspace, each with a /24 CIDR, but we noticed that all the jobs run in a single subnet, which causes AWS_INSUFFICIENT_FREE_ADDRESSES_IN_SUBNET_FAILURE. Is there a way to segregat...

Latest Reply
Manimkm08 (New Contributor III)
  • 0 kudos

@karthik p I have configured one subnet per AZ (three in total) and followed the same steps as mentioned in the document. Is there a way to check whether Databricks uses all the subnets or not? @Debayan Mukherjee I am not getting how to use an LB in this set...

2 More Replies
by berserkersap (Contributor)
  • 13115 Views
  • 3 replies
  • 5 kudos

What is the timeout for dbutils.notebook.run with timeout = 0?

Hello everyone, I have several notebooks (around 10) and I want to run them in sequential order. At first I thought of using %run, but I have a variable that is used repeatedly in every notebook. So now I am thinking of passing that variable from one ma...

Latest Reply
UmaMahesh1 (Honored Contributor III)
  • 5 kudos

Hi @pavan venkata, yes, as the documentation says, 0 means no timeout. The notebook will take its time to complete execution without throwing an error due to a time limit, whether it takes 1 minute, 1 hour, 1 day, or more. H...

2 More Replies
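
For reference, a minimal sketch of the pattern discussed in this thread: running notebooks sequentially with dbutils.notebook.run and a timeout of 0 (no timeout), passing a shared variable as an argument. The notebook paths and the "run_date" parameter name are hypothetical.

```python
# Assumes a Databricks notebook, where `dbutils` is predefined.
# timeout_seconds=0 means no timeout: each call blocks until the child
# notebook finishes, however long that takes.
notebooks = ["/Repos/project/nb_01", "/Repos/project/nb_02", "/Repos/project/nb_03"]
shared_value = "2023-01-01"  # the variable reused by every notebook

for path in notebooks:
    result = dbutils.notebook.run(path, 0, {"run_date": shared_value})
    print(path, "->", result)

# Inside each child notebook, read the value with:
#   run_date = dbutils.widgets.get("run_date")
```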
by Databrickguy (New Contributor II)
  • 6426 Views
  • 6 replies
  • 3 kudos

Resolved! How to parse/extract/format a string based on a pattern?

How do I parse, extract, or format a string based on a pattern? SQL Server has a function that formats a string based on a pattern. For example, if the string is "abcdefgh" and the pattern is XX-XX-XXXX, the result will be "ab-cd-efgh". How to achieve this wit...

Latest Reply
Aviral-Bhardwaj (Esteemed Contributor III)
  • 3 kudos

@Tim zhang, thanks for your code, and here is your answer. I asked this question on Stack Overflow and got an answer. Here is the Stack Overflow link: https://stackoverflow.com/questions/74845760/how-to-parse-a-pattern-and-use-it-to-format-a-string-u...

5 More Replies
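
For context, here is a minimal sketch of one way to express the pattern logic in PySpark (an illustration, not necessarily the approach from the linked Stack Overflow answer): each X in the pattern consumes one input character, and literals such as - pass through.

```python
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def apply_pattern(s, pattern="XX-XX-XXXX"):
    # Each 'X' consumes the next input character; any other pattern
    # character (such as '-') is copied through as a literal.
    out, i = [], 0
    for ch in pattern:
        if ch == "X":
            if i < len(s):
                out.append(s[i])
                i += 1
        else:
            out.append(ch)
    return "".join(out)

# Assumes a Databricks notebook, where `spark` is predefined.
format_udf = udf(apply_pattern, StringType())
df = spark.createDataFrame([("abcdefgh",)], ["raw"])
df.withColumn("formatted", format_udf("raw")).show()
# abcdefgh -> ab-cd-efgh
```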
by Pat (Honored Contributor III)
  • 7162 Views
  • 5 replies
  • 9 kudos

Reading data from "dbfs:/mnt/"

Hi community, I don't know what is happening, to be honest. I have a use case where data is written to the location "dbfs:/mnt/..." (don't ask me why it's mounted; it's just a side project). I do believe the data is stored in ADLS Gen2. I've been trying to read the ...

Latest Reply
Aviral-Bhardwaj (Esteemed Contributor III)
  • 9 kudos

This is really interesting; I have never faced this type of situation. @Pat Sienkiewicz, can you please share the whole code so that we can test and debug this on our system? Thanks, Aviral

4 More Replies
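
As general context for threads like this, a minimal sketch of inspecting a mount and reading from it in a Databricks notebook; the container path and the Delta format are assumptions, not details from the post.

```python
# Assumes a Databricks notebook, where `spark` and `dbutils` are predefined.
# List mount points to confirm where dbfs:/mnt/... actually resolves
# (for example, an ADLS Gen2 container behind the mount).
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)

# Read from the mounted location; the path and the Delta format are
# hypothetical, so adjust both to the actual layout.
df = spark.read.format("delta").load("dbfs:/mnt/my-container/some/path")
df.show(5)
```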
by bakselrud (New Contributor III)
  • 11271 Views
  • 12 replies
  • 2 kudos

Resolved! DLT pipeline failure - Detected a data update... This is currently not supported

We are using a DLT pipeline in a Databricks workspace hosted on the Microsoft Azure platform, which is failing intermittently and for an unclear reason. The pipeline is as follows: spark.readStream.format("delta").option("mergeSchema", "true").option("ignoreChange...

Latest Reply
bakselrud (New Contributor III)
  • 2 kudos

OK, so after doing some investigation on the way to resolving my original question, I think we're getting some clarity after all. Consider the following data frame that is ingested by the DLT streaming pipeline: dfMock = spark.sparkContext.parallelize([[1,...

11 More Replies
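
For context, the option the truncated pipeline above is setting is Delta's ignoreChanges. A minimal sketch of the streaming read under discussion (the source path is hypothetical):

```python
# Streaming from a Delta table whose existing files are rewritten by updates
# or deletes normally fails with "Detected a data update ... This is currently
# not supported". ignoreChanges lets the stream pass rewritten files through
# instead of failing; downstream may then see duplicates and should deduplicate.
df = (
    spark.readStream.format("delta")
    .option("mergeSchema", "true")
    .option("ignoreChanges", "true")
    .load("dbfs:/mnt/source-table")  # hypothetical source path
)
```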
by SM14 (New Contributor)
  • 1546 Views
  • 1 reply
  • 0 kudos

Row Level Validation

I have two arrays, one for devl and the other for prod. Inside these there are many tables. How do I compare and check the count difference? I want to create an automated script to check the count difference and perform row-level validation. A PySpark script...

Latest Reply
Debayan (Databricks Employee)
  • 0 kudos

Hi, you can use the EXCEPT command for this. Please refer to https://stackoverflow.com/questions/70366209/databricks-comparing-two-tables-to-see-which-records-are-missing. Please let us know if this helps.

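
A minimal sketch of the suggested EXCEPT-based approach in PySpark, assuming two parallel lists of table names that pair up positionally (the table names here are hypothetical):

```python
# exceptAll is the DataFrame equivalent of SQL EXCEPT ALL and keeps
# duplicate rows in the comparison, so counts stay honest.
dev_tables = ["devl.sales", "devl.customers"]   # hypothetical names
prod_tables = ["prod.sales", "prod.customers"]  # hypothetical names

for dev_name, prod_name in zip(dev_tables, prod_tables):
    df_dev, df_prod = spark.table(dev_name), spark.table(prod_name)
    print(f"{dev_name}: {df_dev.count()} rows, {prod_name}: {df_prod.count()} rows")
    only_in_dev = df_dev.exceptAll(df_prod)   # rows missing from prod
    only_in_prod = df_prod.exceptAll(df_dev)  # rows missing from dev
    print(f"  missing in prod: {only_in_dev.count()}, missing in dev: {only_in_prod.count()}")
```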
by g96g (New Contributor III)
  • 1491 Views
  • 2 replies
  • 1 kudos

Databricks SQL permission problems

We are using a catalog, and normally I have ALL PRIVILEGES, but I'm not able to modify SQL scripts created by some of my colleagues. They have to give me access, and after that I'm able to modify them. How can I solve this proble...

Latest Reply
Debayan (Databricks Employee)
  • 1 kudos

Hi, when you are not able to modify, could you please confirm the error you are receiving? Also, you can refer to https://docs.databricks.com/_static/notebooks/set-owners-notebook.html and https://docs.databricks.com/sql/admin/transfer-ownership.html

1 More Replies
by Viren123 (Contributor)
  • 5683 Views
  • 5 replies
  • 6 kudos

API to write into Databricks tables

Hello experts, is there any API in Databricks that allows writing data into Databricks tables? I would like to send small-size log information to Databricks tables from another service. What are my options? Thank you very much.

Latest Reply
jneira (New Contributor III)
  • 6 kudos

And what about using the JDBC/ODBC driver, either programmatically or with a tool like DBeaver?

4 More Replies
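
One concrete option along the lines suggested above is the Databricks SQL Connector for Python (pip install databricks-sql-connector). A minimal sketch, with placeholder connection details and a hypothetical logs table:

```python
from databricks import sql

# Connection details come from the SQL warehouse's connection settings;
# the values below are placeholders, and logs.app_events is a hypothetical table.
with sql.connect(
    server_hostname="<workspace-hostname>",
    http_path="<warehouse-http-path>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute(
            "INSERT INTO logs.app_events (ts, level, message) "
            "VALUES ('2023-01-01T00:00:00Z', 'INFO', 'log line from another service')"
        )
```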
by CHANDAN_NANDY (New Contributor III)
  • 4846 Views
  • 2 replies
  • 4 kudos

Resolved! GitHub Copilot Support

Any idea why GitHub Copilot is not available in Azure Databricks, though it supports GitHub?

Latest Reply
nightcoder (New Contributor II)
  • 4 kudos

That is true (this is not an answer but a comment): VS Code is supported. But VS Code does not integrate with notebooks on AWS. When will this feature be available?

1 More Replies
by brickster_2018 (Databricks Employee)
  • 2386 Views
  • 3 replies
  • 0 kudos

Resolved! For the Autoloader, cloudFiles.includeExistingFiles option, is ordering respected?

If yes, how is order ensured? For example, let's say a number of CDC change files are uploaded to a directory over time. If a table were created using the cloudFiles source, in what order would those files be processed?

Latest Reply
Hanish_Goel (New Contributor II)
  • 0 kudos

Hi, is there any new development in terms of ensuring the ordering of files in Auto Loader?

2 More Replies
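
For reference, a minimal Auto Loader sketch showing the option under discussion; the paths, file format, and target table are hypothetical. Note that includeExistingFiles only controls whether files already in the directory are processed at all; the order they are processed in is the thread's open question.

```python
# Incrementally ingest files with Auto Loader. includeExistingFiles controls
# whether files already present when the stream first starts are processed.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.includeExistingFiles", "true")
    .load("s3://my-bucket/cdc-files/")  # hypothetical source directory
)
(
    df.writeStream
    .option("checkpointLocation", "s3://my-bucket/checkpoints/cdc/")  # hypothetical
    .toTable("bronze.cdc_events")  # hypothetical target table
)
```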
by vk217 (Contributor)
  • 15202 Views
  • 5 replies
  • 17 kudos

Resolved! python wheel cannot be installed as library.

When I try to install the Python .whl library, I get the error below. However, I can install it as a jar and it works fine. One difference is that I am creating my own cluster by cloning an existing cluster and copying the .whl to a folder called testin...

Latest Reply
vk217 (Contributor)
  • 17 kudos

The issue was that the package was renamed after it was installed to the cluster and hence it was not recognized.

4 More Replies
by 140015 (New Contributor III)
  • 782 Views
  • 0 replies
  • 0 kudos

DLT using the result of one view in another table with collect()

Hey, do you guys know if there is an option to implement something like this in DLT: @dlt.view() def view_1(): # some calculations that return a small dataframe with around 80 rows max @dlt.table() def table_1(): result_df = dlt.read("view_1") resu...

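
This post has no replies, but for context, a minimal sketch of the pattern it describes; the source table and the aggregation logic are hypothetical, and whether collect() inside a @dlt.table function behaves as hoped is exactly the open question.

```python
import dlt
from pyspark.sql import functions as F

@dlt.view()
def view_1():
    # Hypothetical small aggregation (around 80 rows at most).
    return spark.table("source_table").groupBy("key").agg(F.count("*").alias("n"))

@dlt.table()
def table_1():
    result_df = dlt.read("view_1")
    # collect() pulls the small result to the driver so its values can be
    # reused, for example as a filter list; only sensible for tiny DataFrames.
    keys = [row["key"] for row in result_df.collect()]
    return spark.table("source_table").where(F.col("key").isin(keys))
```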
by ACP (New Contributor III)
  • 2557 Views
  • 4 replies
  • 2 kudos

Accreditation, Badges, Points not received

Hi there, I have completed a few courses but didn't receive any badges or points. I also did an accreditation but didn't receive anything for that either. It's been 3 or 4 days already and still nothing. I would really appreciate it if Databricks could fix this. Ma...

Latest Reply
Anonymous (Not applicable)
  • 2 kudos

Hi @Andre Paiva, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.

3 More Replies
by KVNARK (Honored Contributor II)
  • 7501 Views
  • 3 replies
  • 8 kudos

Resolved! Advantages of Databricks Lakehouse over Azure synapse.

What are the advantages of Databricks over Azure Synapse Analytics? It looks like most of the features, such as computation, storage, etc., are almost similar in both.

Latest Reply
Geeta1 (Valued Contributor)
  • 8 kudos

The link below has a good comparison of both: https://hevodata.com/learn/azure-synapse-vs-databricks/

2 More Replies
