Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

mebinjoy
by New Contributor II
  • 6035 Views
  • 7 replies
  • 8 kudos

Resolved! Certificate not received.

I completed the Data Engineering Associate V3 certification this morning and have yet to receive my certificate. I received an email stating that I had passed and that the certificate would be mailed.

Latest Reply
varsha2
New Contributor II
  • 8 kudos

I completed my exam last week and still have not received my certificate. Please help as soon as possible; it's really urgent.

6 More Replies
dnchankov
by New Contributor II
  • 10364 Views
  • 5 replies
  • 9 kudos

Resolved! Why can't the notebook I created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with the magic %run command, but I get an error that the file can't be opened safely. Here is my code (notebook_a): name = "John" print(f"Hello ...

Latest Reply
iyashk-DB
Databricks Employee
  • 9 kudos

+1 to all the above comments. Having the %run command along with other commands in the same cell will confuse the REPL execution. So put the %run notebook_b3 command alone in a new cell, maybe as the first cell in notebook_a; that will resolve the issue, and your...

4 More Replies
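
The fix described in the reply above can be sketched as two notebook cells (a minimal sketch; the notebook names are illustrative and assume both notebooks sit in the same Repo folder):

```
# Cell 1 of notebook_a — the %run magic must be alone in its cell
%run ./notebook_b

# Cell 2 of notebook_a — regular Python, which can now use
# anything defined in notebook_b
name = "John"
print(f"Hello {name}")
```

Because %run is a notebook magic handled before Python execution, mixing it with ordinary statements in one cell is what triggers the error.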
espenol
by New Contributor III
  • 28772 Views
  • 11 replies
  • 13 kudos

input_file_name() not supported in Unity Catalog

Hey, so our notebooks that read a bunch of JSON files from storage typically use input_file_name() when moving from raw to bronze, but after upgrading to Unity Catalog we get an error message: AnalysisException: [UC_COMMAND_NOT_SUPPORTED] input_file_n...

Latest Reply
ramanpreet
New Contributor II
  • 13 kudos

The reason input_file_name is not supported is that this function was only available in older versions of the Databricks Runtime. It was deprecated from Databricks Runtime 13.3 LTS onwards.

10 More Replies
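
On Unity Catalog, a commonly suggested replacement for input_file_name() is the hidden _metadata column available on file-based sources. A minimal sketch (the path is illustrative; assumes a Databricks/Spark session named spark):

```
from pyspark.sql.functions import col

# Read raw JSON and record each row's source file via _metadata,
# which Unity Catalog supports, unlike input_file_name().
df = (spark.read.format("json")
      .load("/path/to/raw/")
      .withColumn("source_file", col("_metadata.file_path")))
```

The _metadata column is only materialized when explicitly selected, so existing queries are unaffected.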
Eyespoop
by New Contributor II
  • 31058 Views
  • 4 replies
  • 4 kudos

Resolved! PySpark: Writing Parquet Files to the Azure Blob Storage Container

Currently I am having some issues with writing Parquet files to the storage container. I do have the code running, but whenever the DataFrame writer puts the Parquet into blob storage, instead of the Parquet file type it is created as a f...

Latest Reply
amarv
New Contributor II
  • 4 kudos

This is my approach:

from databricks.sdk.runtime import dbutils
from pyspark.sql import DataFrame

output_base_url = "abfss://..."

def write_single_parquet_file(df: DataFrame, filename: str):
    print(f"Writing '{filename}.parquet' to ABFS")
    ...

3 More Replies
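
The approach above works because Spark always writes a directory of part files, not a single file. A minimal sketch of the usual workaround (function and path names are illustrative; assumes a Databricks environment where dbutils is available): coalesce to one partition, then move the single part file to the desired name.

```
from pyspark.sql import DataFrame

def write_single_parquet_file(df: DataFrame, target_dir: str, filename: str):
    tmp_dir = f"{target_dir}/_tmp_{filename}"
    # coalesce(1) forces a single part file inside the output directory
    df.coalesce(1).write.mode("overwrite").parquet(tmp_dir)
    # Locate the lone part file and rename it to the desired .parquet name
    part = [f.path for f in dbutils.fs.ls(tmp_dir) if f.name.startswith("part-")][0]
    dbutils.fs.mv(part, f"{target_dir}/{filename}.parquet")
    dbutils.fs.rm(tmp_dir, recurse=True)
```

Note that coalesce(1) funnels all data through one task, so this is only advisable for small outputs.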
yunna_wei
by Databricks Employee
  • 1907 Views
  • 0 replies
  • 3 kudos

In any Spark application, the Spark driver plays a critical role and performs the following functions: 1. Initiating a Spark Session 2. Communicating with...

In any Spark application, the Spark driver plays a critical role and performs the following functions: 1. Initiating a Spark Session; 2. Communicating with the cluster manager to request resources (CPU, memory, etc.) for Spark's exec...

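
The first driver responsibility listed above, initiating a Spark Session, can be sketched as follows (a minimal sketch; the app name and config value are illustrative):

```
from pyspark.sql import SparkSession

# The driver process starts here: building the SparkSession is what
# triggers communication with the cluster manager for resources.
spark = (SparkSession.builder
         .appName("example-app")
         .config("spark.executor.memory", "4g")
         .getOrCreate())
```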
AT
by New Contributor III
  • 12581 Views
  • 5 replies
  • 4 kudos

Resolved! Databricks Certification Voucher Code not received

I have an exam to take for the Databricks Associate ML certification early this week. I have raised multiple tickets for this previously but didn't receive any reply. I have attended the webinar for 3 days on Data Engineering as m...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Avinash Tiwari, I hope you are doing great! I have forwarded your query to our Academic team; the problem will be resolved soon. Please bear with us. Thanks and regards.

4 More Replies
Dilorom
by New Contributor
  • 7948 Views
  • 3 replies
  • 4 kudos

What is a recommended directory for creating a database with a specified path?

I was going through the Data Engineering with Databricks training, and in the DE 3.3L - Databases, Tables & Views Lab section it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Dilorom A, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...

2 More Replies
Shanthala
by New Contributor III
  • 1992 Views
  • 1 replies
  • 3 kudos

Workspace usage for the partners

We have 11 people working on the Data Engineering Associate certification using Data Engineering with Databricks V3. We just finished the Foundation course and are starting the Engineering journey. We are registered partners, and Data Engineering with Dat...

Latest Reply
youssefmrini
Databricks Employee
  • 3 kudos

Hello Shanthala, you can send an email to partnerops@databricks.com, who will then provide information on how to set this up.

johnb1
by Contributor
  • 35439 Views
  • 13 replies
  • 13 kudos

Certified Data Engineer Associate - v2 vs. v3 (Databricks Academy)

Which version of the Data Engineering with Databricks learning plan should I do: v2 or v3? Is there a Certified Data Engineer Associate V3 exam already? Where can I find practice exams for Certified Data Engineer Associate V3?

Latest Reply
Frank_Tao
New Contributor II
  • 13 kudos

I would suggest choosing v3 — it is the latest version and covers more topics.

12 More Replies
Chris_Shehu
by Valued Contributor III
  • 4717 Views
  • 3 replies
  • 3 kudos

Resolved! Is there a way to specify a header, set the delimiter, etc...in DLT?

I was looking forward to using the Data Quality features that come with DLT, but as far as I can tell the ingestion process is more restrictive than other methods. It doesn't seem like you can do much in terms of setting the delimiter type, headers, or an...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

DLT uses Autoloader to ingest data. With autoloader, you can provide read options for the table. https://docs.databricks.com/ingestion/auto-loader/options.html#csv-options has the docs on CSV. I attached a picture of an example.

2 More Replies
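
Since DLT ingests through Auto Loader, CSV reader options such as header and delimiter can be passed on the stream reader. A minimal sketch (table name, path, and option values are illustrative; assumes a DLT pipeline environment):

```
import dlt

# A DLT table ingesting pipe-delimited CSV with a header row via Auto Loader
@dlt.table
def bronze_orders():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .option("header", "true")
            .option("delimiter", "|")
            .load("/path/to/landing/"))
```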
VVill_T
by Contributor
  • 21779 Views
  • 11 replies
  • 52 kudos

Resolved! Data Engineering with Databricks V2 or V3 for qualification in a few months time

If I am new to Databricks and aiming to get the qualification sometime in Dec 2022 or Jan 2023, should I be studying the material for Data Engineering with Databricks V2 or V3?

Latest Reply
Devarsh
Contributor
  • 52 kudos

I would suggest going for V3, because the course Data Engineering with Databricks (V3) is the latest version as of now and was released on 14th October 2022. This version covers more topics than V2.

10 More Replies
AL1
by Contributor
  • 34923 Views
  • 19 replies
  • 44 kudos

Resolved! Data Engineering Professional Practice exam

I'd like to ask if there is a tentative date to release Databricks Data Engineering practice exam. Thank you!

Latest Reply
Devarsh
Contributor
  • 44 kudos

No, as of now there is no practice exam available for this certification, but a good way to get an idea about the exam would be appearing for it once. There are multiple trainings going on from Databricks; by attending them you can get the voucher code ...

18 More Replies
Kopal
by New Contributor II
  • 7716 Views
  • 3 replies
  • 3 kudos

Resolved! Data Engineering - CTAS - External Tables - Limitations of CTAS for external tables - can or cannot use options and location

Data Engineering - CTAS - External Tables. Can someone help me understand why, in chapter 3.3, we cannot directly use CTAS with OPTIONS and LOCATION to specify the delimiter and location of a CSV? Or have I misunderstood? Details: In Data Engineering with Databri...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

The second statement, the CTAS, will not be able to parse the CSV in any manner, because it is just the FROM clause that points to a file. It is more of a traditional SQL statement with SELECT and FROM, and it will create a Delta table. This just happens to b...

2 More Replies
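
The distinction discussed above can be sketched in two steps (a sketch only; table names, options, and paths are illustrative): first register an external table with USING CSV and OPTIONS, where the delimiter and header are honored, then use CTAS against that table, which produces a managed Delta table.

```sql
-- External table: OPTIONS are honored because USING CSV defines the reader
CREATE TABLE sales_csv
  USING CSV
  OPTIONS (header = "true", delimiter = "|")
  LOCATION '/path/to/sales';

-- CTAS from the parsed table; schema is inferred and a Delta table is created
CREATE TABLE sales_delta AS
  SELECT * FROM sales_csv;
```

A bare CTAS over a raw file path skips the first step, which is why it cannot apply CSV parsing options.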
FerArribas
by Contributor
  • 4727 Views
  • 4 replies
  • 3 kudos

How to import a custom CA certificate into the Databricks SQL module?

We need to be able to import a custom certificate (https://learn.microsoft.com/en-us/azure/databricks/kb/python/import-custom-ca-cert) in the same way as in the Data Engineering module, but in the Databricks SQL module.

Latest Reply
VaibB
Contributor
  • 3 kudos

You can try downloading it to DBFS and accessing it from there, if your use case really needs that.

3 More Replies
User16835756816
by Databricks Employee
  • 8439 Views
  • 4 replies
  • 11 kudos

How can I extract data from different sources and transform it into a fresh, reliable data pipeline?

Tip: These steps are built out for AWS accounts and workspaces that are using Delta Lake. If you would like to learn more, watch this video and reach out to your Databricks sales representative for more information. Step 1: Create your own notebook or ...

Latest Reply
Ajay-Pandey
Databricks MVP
  • 11 kudos

Thanks @Nithya Thangaraj​ 

3 More Replies