- 6638 Views
- 3 replies
- 2 kudos
I've cloned a repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with the %run magic command, but I get an error that the file can't be opened safely. Here is my code: notebook_aname = "John"
print(f"Hello ...
Latest Reply
It could be that you need to put the %run in a cell all by itself. Suggested here: https://stackoverflow.com/a/72833400/1144966
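A minimal sketch of that fix (the notebook path and variable are taken from the question; the path is otherwise hypothetical): the %run magic must be the only content in its cell, and anything the target notebook defines becomes visible in later cells.

```
# Cell 1 — nothing else in this cell besides the magic
%run ./notebook_a

# Cell 2 — notebook_a defined notebook_aname, so it is usable here
print(f"Hello {notebook_aname}")
```

If any other code shares the cell with %run, Databricks refuses to execute the magic, which matches the error described above.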
- 18545 Views
- 9 replies
- 11 kudos
Hey, so our notebooks that read a bunch of JSON files from storage typically use input_file_name() when moving from raw to bronze, but after upgrading to Unity Catalog we get an error message: AnalysisException: [UC_COMMAND_NOT_SUPPORTED] input_file_n...
Latest Reply
.withColumn("RECORD_FILE_NAME", col("_metadata.file_name")) will work with spark.read to get the file name, or .withColumn("RECORD_FILE_NAME", col("_metadata.file_path")) to get the whole file path.
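Put together, the Unity Catalog-friendly replacement for input_file_name() might look like this sketch (the source path and format are hypothetical; spark is the session a Databricks notebook predefines):

```python
from pyspark.sql.functions import col

df = (spark.read
      .format("json")
      .load("/path/to/raw")  # placeholder path
      .withColumn("RECORD_FILE_NAME", col("_metadata.file_name"))   # file name only
      .withColumn("RECORD_FILE_PATH", col("_metadata.file_path")))  # full path
```

The _metadata column is a hidden struct that Spark attaches to file-based reads, so it is only selectable when referenced explicitly like this.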
- 898 Views
- 0 replies
- 3 kudos
In any Spark application, the Spark driver plays a critical role and performs the following functions:
1. Initiating a Spark session
2. Communicating with the cluster manager to request resources (CPU, memory, etc.) for Spark's exec...
by AT • New Contributor III
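The first of those driver functions can be sketched in a few lines. This is only needed in a standalone application (in a Databricks notebook the session already exists as spark); the app name and local master are illustrative choices, not anything prescribed by the post:

```python
from pyspark.sql import SparkSession

# Initiating a Spark session: the builder is the driver's entry point and,
# on a real cluster, is what negotiates executor resources with the
# cluster manager (here we just run locally on all cores).
spark = (SparkSession.builder
         .master("local[*]")
         .appName("driver-demo")
         .getOrCreate())
```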
- 10810 Views
- 5 replies
- 4 kudos
I have an exam to take for the Databricks Associate ML certification early this week. I have raised multiple tickets for this previously but didn't receive any reply. I have attended the webinar for 3 days on Data Engineering as m...
Latest Reply
Hi @Avinash Tiwari, I hope you are doing great! I have forwarded your query to our Academic team. The problem will be resolved soon. Please bear with us. Thanks and Regards
- 6747 Views
- 3 replies
- 4 kudos
I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
- 1335 Views
- 1 replies
- 3 kudos
We have 11 people working on the Data Engineering Associate certification using Data Engineering with Databricks V3. We just finished the Foundation one and are starting the Engineering journey. We are registered partners, and Data Engineering with Dat...
Latest Reply
Hello Shanthala, you can send an email to partnerops@databricks.com, who can then provide information on how to set this up.
- 4141 Views
- 6 replies
- 8 kudos
I completed the Data Engineering Associate V3 certification exam this morning and I'm yet to receive my certificate. I received a mail stating that I had passed and that the certificate would be mailed.
Latest Reply
Hi @Mebin Joy, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly. Regards
- 28803 Views
- 13 replies
- 13 kudos
Which version of the Data Engineering with Databricks learning plan should I do: v2 or v3? Is there a Certified Data Engineer Associate V3 exam already? Where can I find practice exams for Certified Data Engineer Associate V3?
Latest Reply
I would suggest choosing v3 - it is the latest version and covers more topics.
- 3298 Views
- 3 replies
- 3 kudos
I was looking forward to using the Data Quality features that are provided with DLT, but as far as I can tell, the ingestion process is more restrictive than other methods. It doesn't seem like you can do much as far as setting the delimiter type, headers, or an...
Latest Reply
DLT uses Autoloader to ingest data. With autoloader, you can provide read options for the table. https://docs.databricks.com/ingestion/auto-loader/options.html#csv-options has the docs on CSV. I attached a picture of an example.
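As a sketch of that suggestion (the table name, path, and option values are hypothetical), a DLT table using Auto Loader with CSV read options might look like:

```python
import dlt

@dlt.table
def raw_orders():
    return (spark.readStream
            .format("cloudFiles")                    # Auto Loader
            .option("cloudFiles.format", "csv")
            .option("header", "true")                # CSV read options pass through
            .option("delimiter", "|")
            .load("/path/to/orders"))                # placeholder location
```

This assumes the code runs inside a DLT pipeline, where the dlt module and the spark session are provided by the runtime.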
- 18862 Views
- 11 replies
- 52 kudos
I am new to Databricks and am aiming to get the qualification at some point in Dec 2022 or Jan 2023. Should I be studying the material Data Engineering with Databricks V2 or V3?
Latest Reply
I would suggest going for V3 because the course Data Engineering with Databricks (V3) is the latest version as of now and was released on 14th October 2022. So this version covers more topics than V2.
- 28673 Views
- 19 replies
- 42 kudos
I'd like to ask if there is a tentative date to release Databricks Data Engineering practice exam. Thank you!
Latest Reply
No, as of now there is no practice exam available for this certification, but a good way to get an idea about the exam would be appearing for it once. There are multiple trainings going on from Databricks, attending which you can get the voucher code ...
by Kopal • New Contributor II
- 6054 Views
- 3 replies
- 3 kudos
Data Engineering - CTAS - External Tables. Can someone help me understand why, in chapter 3.3, we cannot directly use CTAS with OPTIONS and LOCATION to specify the delimiter and location of a CSV? Or have I misunderstood? Details: In Data Engineering with Databri...
Latest Reply
The second CTAS statement will not be able to parse the CSV in any manner, because it's just the FROM clause that points to a file. It's more of a traditional SQL statement with SELECT and FROM. It will create a Delta table. This just happens to b...
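The usual pattern from that chapter, sketched here with a hypothetical path, schema, and table name, is to declare the CSV parsing options on a temporary view first and then CTAS from the view, which does produce a Delta table:

```python
# Step 1: a temp view carries the CSV options — CTAS itself cannot take them
spark.sql("""
  CREATE OR REPLACE TEMPORARY VIEW sales_tmp_vw
    (order_id INT, amount DOUBLE)    -- hypothetical schema
  USING CSV
  OPTIONS (path = '/path/to/sales.csv', header = 'true', delimiter = '|')
""")

# Step 2: CTAS from the parsed view creates a Delta table
spark.sql("CREATE TABLE sales AS SELECT * FROM sales_tmp_vw")
```

This assumes a notebook environment where spark is predefined; the view handles parsing, and the CTAS only sees already-structured rows.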
- 3211 Views
- 4 replies
- 3 kudos
We need to be able to import a custom certificate (https://learn.microsoft.com/en-us/azure/databricks/kb/python/import-custom-ca-cert) in the same way as in the "data engineering" module, but in the Databricks SQL module.
Latest Reply
You can try downloading it to DBFS and then accessing it from there, if your use case really needs that.
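For example (the endpoint and file names here are hypothetical), once the certificate file is on DBFS it can be referenced from Python through the /dbfs mount:

```python
import requests

# A cert stored at dbfs:/certs/my_ca.pem is visible to Python at /dbfs/certs/my_ca.pem
response = requests.get(
    "https://internal.example.com/api",  # placeholder endpoint
    verify="/dbfs/certs/my_ca.pem",      # hypothetical path to the custom CA bundle
)
```

Note this covers the classic-compute/Python side; whether Databricks SQL warehouses can pick up such a certificate is exactly what the question above is asking.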
- 4194 Views
- 4 replies
- 11 kudos
Tip: These steps are built out for AWS accounts and workspaces that are using Delta Lake. If you would like to learn more, watch this video and reach out to your Databricks sales representative for more information.
Step 1: Create your own notebook or ...
by yang • New Contributor II
- 1600 Views
- 1 replies
- 2 kudos
I am working on the Data Engineering with Databricks v3 course. In notebook DE 4.1 - DLT UI Walkthrough, I encountered an error in cmd 11: DA.validate_pipeline_config(pipeline_language). The error message is: AssertionError: Expected the parameter "suite" to...
Latest Reply
The DA validate function just checks that you named the pipeline correctly, set up the correct number of workers (0), and other configurations. The name and directory aren't crucial to the learning process. The goal is to get familiar with the ...