Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

dnchankov
by New Contributor II
  • 3860 Views
  • 3 replies
  • 2 kudos

Why can't the notebook I created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with the magic %run command. But I get an error that the file can't be opened safely. Here is my code: notebook_a: name = "John" print(f"Hello ...

Latest Reply
petermeissner
New Contributor II

It could be that you need to put the %run in a cell all by itself. Suggested here: https://stackoverflow.com/a/72833400/1144966
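A minimal sketch of that suggestion; the notebook name and variable are illustrative, assuming notebook_a defines name:

# Cell 1: the %run magic on its own, with nothing else in the cell
%run ./notebook_a

# Cell 2: names defined in notebook_a are now available here
print(f"Hello {name}")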

2 More Replies
espenol
by New Contributor III
  • 12787 Views
  • 9 replies
  • 10 kudos

input_file_name() not supported in Unity Catalog

Hey, so our notebooks reading a bunch of JSON files from storage typically use input_file_name() when moving from raw to bronze, but after upgrading to Unity Catalog we get an error message: AnalysisException: [UC_COMMAND_NOT_SUPPORTED] input_file_n...

Latest Reply
JasonThomas
New Contributor III

.withColumn("RECORD_FILE_NAME", col("_metadata.file_name")) will work with spark.read to get the file name, or .withColumn("RECORD_FILE_NAME", col("_metadata.file_path")) to get the whole file path.
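A minimal sketch of that approach as the Unity Catalog-friendly replacement for input_file_name(), assuming a Databricks notebook where spark is predefined; the source path is illustrative:

from pyspark.sql.functions import col

# The hidden _metadata column is available on file-based sources such as JSON
df = (spark.read.format("json")
      .load("/mnt/raw/events/")  # hypothetical storage path
      .withColumn("RECORD_FILE_NAME", col("_metadata.file_name"))
      .withColumn("RECORD_FILE_PATH", col("_metadata.file_path")))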

8 More Replies
yunna_wei
by New Contributor II
  • 611 Views
  • 0 replies
  • 3 kudos

In any Spark application, the Spark driver plays a critical role and performs the following functions: 1. Initiating a Spark Session 2. Communicating with...

In any Spark application, the Spark driver plays a critical role and performs the following functions: 1. Initiating a Spark Session. 2. Communicating with the cluster manager to request resources (CPU, memory, etc.) for Spark's exec...
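A minimal sketch of function 1, initiating a Spark Session (in a Databricks notebook the spark object already exists; the app name here is illustrative):

from pyspark.sql import SparkSession

# Build or reuse a session; the driver then negotiates executor resources
spark = (SparkSession.builder
         .appName("driver-demo")
         .getOrCreate())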

AT
by New Contributor III
  • 9973 Views
  • 5 replies
  • 4 kudos

Resolved! Databricks Certification Voucher Code not received

I have an exam to take for the Databricks Associate ML certification early this week. I have raised multiple tickets about this previously but didn't receive any reply. I have attended the webinar for 3 days on Data Engineering as m...

Latest Reply
Anonymous
Not applicable

Hi @Avinash Tiwari​, I hope you are doing great! I have forwarded your query to our Academic team. The problem will be resolved soon. Please bear with us. Thanks and Regards

4 More Replies
Dilorom
by New Contributor
  • 5620 Views
  • 5 replies
  • 4 kudos

What is a recommended directory for creating a database with a specified path?

I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
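For reference, a minimal sketch of the pattern the lab is discussing, creating a database under an explicitly specified directory; the database name and path are illustrative:

spark.sql("""
    CREATE DATABASE IF NOT EXISTS team_sales
    LOCATION '/mnt/team_sales/db'
""")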

Latest Reply
Anonymous
Not applicable

Hi @Dilorom A​, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...

4 More Replies
Shanthala
by New Contributor III
  • 1042 Views
  • 1 reply
  • 3 kudos

Workspace usage for the partners

We have 11 people working on the Data Engineering Associate certification using Data Engineering with Databricks V3. We just got done with the Foundation one and are starting the Engineering journey. We are Registered partners and Data Engineering with Dat...

Latest Reply
youssefmrini
Honored Contributor III

Hello Shanthala, you can send an email to partnerops@databricks.com, who will then provide information on how to set this up.

mebinjoy
by New Contributor II
  • 3062 Views
  • 6 replies
  • 8 kudos

Resolved! Certificate not received.

I completed the Data Engineering Associate V3 certification exam this morning and have yet to receive my certificate. I received a mail stating that I had passed and that the certificate would be mailed.

Latest Reply
Anonymous
Not applicable

Hi @Mebin Joy​, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly. Regards

5 More Replies
johnb1
by Contributor
  • 21931 Views
  • 13 replies
  • 13 kudos

Certified Data Engineer Associate - v2 vs. v3 (Databricks Academy)

Which version of the Data Engineering with Databricks learning plan should I do, v2 or v3? Is there a Certified Data Engineer Associate V3 exam already? Where can I find practice exams for Certified Data Engineer Associate V3?

Latest Reply
Frank_Tao
New Contributor II

I would suggest choosing v3 - it is the latest version and covers more topics.

12 More Replies
Chris_Shehu
by Valued Contributor III
  • 2409 Views
  • 3 replies
  • 3 kudos

Resolved! Is there a way to specify a header, set the delimiter, etc. in DLT?

I was looking forward to using the Data Quality features that are provided with DLT, but as far as I can tell the ingestion process is more restrictive than other methods. It doesn't seem like you can do much as far as setting delimiter type, headers or an...

Latest Reply
Anonymous
Not applicable

DLT uses Auto Loader to ingest data. With Auto Loader, you can provide read options for the table. https://docs.databricks.com/ingestion/auto-loader/options.html#csv-options has the docs on CSV options. I attached a picture of an example.
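A minimal sketch of a DLT table that passes CSV read options through Auto Loader; the table name, path, and option values are illustrative:

import dlt

@dlt.table
def orders_bronze():
    # cloudFiles is Auto Loader; header and delimiter are standard CSV options
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .option("header", "true")
            .option("delimiter", "|")
            .load("/mnt/raw/orders/"))  # hypothetical landing path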

2 More Replies
VVill_T
by Contributor
  • 16829 Views
  • 13 replies
  • 59 kudos

Resolved! Data Engineering with Databricks V2 or V3 for qualification in a few months' time

If I am new to Databricks and aiming to get qualified at some point in Dec 2022 or Jan 2023, should I be studying the material for Data Engineering with Databricks V2 or V3?

Latest Reply
Devarsh
Contributor

I would suggest going for V3 because the course Data Engineering with Databricks (V3) is the latest version as of now and was released on 14th October 2022. So, this version covers more topics in comparison to V2.

12 More Replies
AL1
by Contributor
  • 18846 Views
  • 19 replies
  • 42 kudos

Resolved! Data Engineering Professional Practice exam

I'd like to ask if there is a tentative date to release Databricks Data Engineering practice exam. Thank you!

Latest Reply
Devarsh
Contributor

No, as of now there is no practice exam available for this certification, but a good way to get an idea of the exam would be appearing for it once. There are multiple trainings going on from Databricks; by attending them you can get the voucher code ...

18 More Replies
Kopal
by New Contributor II
  • 4211 Views
  • 3 replies
  • 3 kudos

Resolved! Data Engineering - CTAS - External Tables - Limitations of CTAS for external tables - can or cannot use options and location

Data Engineering - CTAS - External Tables. Can someone help me understand why, in chapter 3.3, we cannot directly use CTAS with OPTIONS and LOCATION to specify the delimiter and location of a CSV? Or did I misunderstand? Details: In Data Engineering with Databri...

Latest Reply
Anonymous
Not applicable

The 2nd statement, the CTAS, will not be able to parse the CSV in any manner because it's just the FROM clause that points to a file. It's more of a traditional SQL statement with SELECT and FROM. It will create a Delta table. This just happens to b...
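A minimal sketch of the two-step pattern the course uses instead, run via spark.sql; the table names, schema, and path are illustrative:

# Step 1: an external table where OPTIONS and LOCATION are allowed
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_csv (order_id INT, amount DOUBLE)
    USING CSV
    OPTIONS (header = 'true', delimiter = '|')
    LOCATION '/mnt/raw/sales/'
""")

# Step 2: CTAS from that table; the result is a managed Delta table
spark.sql("CREATE TABLE IF NOT EXISTS sales AS SELECT * FROM sales_csv")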

2 More Replies
FerArribas
by Contributor
  • 2417 Views
  • 4 replies
  • 3 kudos

How to import a custom CA certificate into the Databricks SQL module?

We need to be able to import a custom certificate (https://learn.microsoft.com/en-us/azure/databricks/kb/python/import-custom-ca-cert) in the same way as in the "data engineering" module, but in the Databricks SQL module.

Latest Reply
VaibB
Contributor

You can try downloading it to DBFS and accessing it from there, if your use case really needs that.
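A minimal sketch of that workaround from Python, assuming the CA bundle was uploaded to an illustrative DBFS path (note this covers Python clients; it does not by itself change what Databricks SQL trusts):

import requests

# Point TLS verification at the custom CA bundle stored on DBFS
resp = requests.get(
    "https://internal.example.com/api",        # hypothetical internal endpoint
    verify="/dbfs/FileStore/certs/custom_ca.pem",
)
print(resp.status_code)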

3 More Replies
User16835756816
by Valued Contributor
  • 3337 Views
  • 4 replies
  • 11 kudos

How can I extract data from different sources and transform it into a fresh, reliable data pipeline?

Tip: These steps are built out for AWS accounts and workspaces that are using Delta Lake. If you would like to learn more, watch this video and reach out to your Databricks sales representative for more information. Step 1: Create your own notebook or ...

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Thanks @Nithya Thangaraj​ 

3 More Replies
yang
by New Contributor II
  • 1219 Views
  • 1 reply
  • 2 kudos

Resolved! Error in DE 4.1 - DLT UI Walkthrough (from Data Engineering with Databricks v3 course)

I am working on the Data Engineering with Databricks v3 course. In notebook DE 4.1 - DLT UI Walkthrough, I encountered an error in cmd 11: DA.validate_pipeline_config(pipeline_language). The error message is: AssertionError: Expected the parameter "suite" to...

Latest Reply
Anonymous
Not applicable

The DA validate function is just there to check that you named the pipeline correctly, set up the correct number of workers (0), and other configurations. The name and directory aren't crucial to the learning process. The goal is to get familiar with the ...
