Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

KVNARK
by Honored Contributor II
  • 2046 Views
  • 2 replies
  • 6 kudos

Resolved! Microsoft ADB file from BLOB into Azure Databricks

Can anyone let me know how we can load a database file into Azure Databricks from Azure Blob Storage?
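
A minimal sketch of one common approach, assuming the file is a delimited data file (e.g. CSV) and that the storage account, container, secret scope, and paths below are placeholders; `spark` and `dbutils` are the ambient Databricks notebook objects:

```python
# Placeholder names - replace with your own storage account, container, and secret scope.
storage_account = "mystorageaccount"
container = "mycontainer"
access_key = dbutils.secrets.get(scope="my-scope", key="blob-access-key")

# Authenticate the Spark session against the Blob storage account.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.blob.core.windows.net",
    access_key,
)

# Read the file over wasbs:// and register it as a Delta table for later use.
df = (
    spark.read
    .option("header", "true")
    .csv(f"wasbs://{container}@{storage_account}.blob.core.windows.net/path/to/file.csv")
)
df.write.format("delta").saveAsTable("my_database.my_table")
```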

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @KVNARK. Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking "Select As Best" if it does. Your feedback wil...

1 More Replies
Michael_Papadop
by New Contributor II
  • 11596 Views
  • 3 replies
  • 0 kudos

How can I set the status of a Databricks job as skipped via Python?

I have a basic two-task job. The first notebook (task) checks whether the source file has changes and, if so, refreshes a corresponding materialized view. If there are no changes, I use dbutils.jobs.taskValues.set(key = "skip_job", value = 1) &...
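
A minimal sketch of the pattern described above, with hypothetical task keys and view names; the downstream task reads the flag and exits early, so its work is effectively skipped (the Jobs UI will still show the task as succeeded unless a condition/"Run if" task is used):

```python
# --- Task 1, task key "check_source" (hypothetical) ---
source_changed = False  # result of the real change-detection logic
dbutils.jobs.taskValues.set(key="skip_job", value=0 if source_changed else 1)

# --- Task 2, task key "refresh_view" (hypothetical) ---
skip_job = dbutils.jobs.taskValues.get(
    taskKey="check_source", key="skip_job", default=0, debugValue=0
)
if skip_job == 1:
    # Exit before doing any work; the task itself still reports success.
    dbutils.notebook.exit("No source changes - skipping refresh")

# Placeholder for the real refresh step of the materialized view.
spark.sql("REFRESH MATERIALIZED VIEW my_schema.my_mv")
```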

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@Michael Papadopoulos Usually that should not be the case, I think. At the task level we have three notification types (success, failure, start), whereas at the whole-job level a skip option is available to discard notifications. Will see if someone from the commu...

2 More Replies
thushar
by Contributor
  • 9200 Views
  • 5 replies
  • 0 kudos

Optimize & Compaction

Hi, from which Databricks Runtime are OPTIMIZE and compaction supported?

Latest Reply
Joe_Suarez
New Contributor III
  • 0 kudos

OPTIMIZE and compaction are operations commonly used with Delta Lake on Apache Spark to improve the performance of data storage and processing. Databricks, a cloud-based platform for Apache Spark, provides support for these operations on v...
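
For reference, a minimal sketch of running compaction on a Delta table (table and column names are placeholders); OPTIMIZE and ZORDER apply to Delta tables on a supported Databricks Runtime:

```python
# Bin-packing compaction of many small files into fewer, larger ones.
spark.sql("OPTIMIZE my_schema.my_delta_table")

# Optionally co-locate data on a frequently filtered column while compacting.
spark.sql("OPTIMIZE my_schema.my_delta_table ZORDER BY (event_date)")

# Optimized writes / auto compaction can also be enabled per table.
spark.sql("""
  ALTER TABLE my_schema.my_delta_table SET TBLPROPERTIES (
    'delta.autoOptimize.optimizeWrite' = 'true',
    'delta.autoOptimize.autoCompact'   = 'true'
  )
""")
```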

4 More Replies
Direo
by Contributor II
  • 28198 Views
  • 5 replies
  • 1 kudos

Resolved! Importing CA certificate into a Databricks cluster

Hi! I was following the guide outlined here: https://kb.databricks.com/en_US/python/import-custom-ca-cert (also tried this: https://stackoverflow.com/questions/73043589/configuring-tls-ca-on-databricks) to add a CA root certificate to a Databricks cluster, but...

Latest Reply
Direo
Contributor II
  • 1 kudos

In the end it turned out that I had tried to add the wrong certificate. To check a certificate's Distinguished Name (DN), which helps identify the organization the certificate was issued to, run %sh openssl s_client -connect <hostname>:<port> -showcerts -CAf...
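
A small sketch of that same check run from a Python cell instead of %sh, assuming the hostname and port placeholders are filled in; it prints the served certificate chain so the subject/issuer DNs can be compared against the CA you intend to import:

```python
import subprocess

host, port = "my-internal-service.example.com", 443  # placeholders

# Equivalent of: openssl s_client -connect <host>:<port> -showcerts
# Empty stdin makes s_client return instead of waiting for interactive input.
result = subprocess.run(
    ["openssl", "s_client", "-connect", f"{host}:{port}", "-showcerts"],
    input="",
    capture_output=True,
    text=True,
)
print(result.stdout)  # inspect the subject= / issuer= lines for each certificate
```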

4 More Replies
youssefmrini
by Databricks Employee
  • 1410 Views
  • 1 reply
  • 0 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

Support for Jupyter notebooks (.ipynb files) is available in Repos. You can clone repositories with .ipynb notebooks, work in the Databricks UI, and then commit and push them as .ipynb notebooks. Metadata such as the notebook dashboard is preserved. Admins can ...

youssefmrini
by Databricks Employee
  • 2111 Views
  • 1 reply
  • 0 kudos
Latest Reply
youssefmrini
Databricks Employee
  • 0 kudos

Feel free to read my Medium blog, where I summarized all the features: https://medium.com/@youssefmrini/databricks-workflows-features-oriented-3f9ec025301a

beer
by New Contributor II
  • 1730 Views
  • 3 replies
  • 0 kudos

Didn't receive my Databricks Spark 3.0 certification

I took the exam yesterday and passed the test. I haven't received any email from Databricks Academy. How long would it take to receive the certification?

Latest Reply
beer
New Contributor II
  • 0 kudos

This is resolved.

2 More Replies
Rik
by New Contributor III
  • 2585 Views
  • 2 replies
  • 0 kudos

Incorrect error when adding an IP access list

I have disabled the IP Access List on my workspace and am trying to add an IP list through the IP Access List API. However, when adding a list, I get the INVALID_STATE response. The docs mention this is because: "If the new list would block the calling...

Latest Reply
Rik
New Contributor III
  • 0 kudos

"One possible workaround could be to (1) temporarily enable the IP Access List feature, (2) add the necessary IP addresses to the list, and then (3) disable the feature again. This way, you can add the IP addresses you need without blocking the curre...

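A rough sketch of that workaround scripted against the REST API (workspace URL, token, label, and CIDR ranges are placeholders; endpoints per the documented IP Access Lists and workspace-conf APIs):

```python
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": "Bearer <personal-access-token>"}  # placeholder token

# (1) Temporarily enable the IP access list feature.
requests.patch(
    f"{host}/api/2.0/workspace-conf",
    headers=headers,
    json={"enableIpAccessLists": "true"},
).raise_for_status()

# (2) Add the allow list while the feature is enabled.
requests.post(
    f"{host}/api/2.0/ip-access-lists",
    headers=headers,
    json={
        "label": "office",
        "list_type": "ALLOW",
        "ip_addresses": ["203.0.113.0/24"],  # must include the caller's own IP
    },
).raise_for_status()

# (3) Disable the feature again.
requests.patch(
    f"{host}/api/2.0/workspace-conf",
    headers=headers,
    json={"enableIpAccessLists": "false"},
).raise_for_status()
```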
1 More Replies
Indika_debnath
by New Contributor II
  • 4972 Views
  • 9 replies
  • 0 kudos

Databricks Certification voucher not received

Hello team, I attended the webinar Databricks Certification Overview Series - Data Engineer on Jan 17, completed the Databricks Lakehouse Fundamentals accreditation, and completed the survey. As per the communication, it is expected that I will receive Dat...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Indika Debnath Hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...

8 More Replies
SaraCorralLou
by New Contributor III
  • 1876 Views
  • 1 reply
  • 0 kudos

Resolved! Delta tables background

Hi, looking at Delta tables and how they are stored, I have a question. If Delta tables are stored as Parquet files in ADLS, why, when I copy/paste/rename a folder that corresponds to an existing table in the same location/database, does this not ge...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Sara Corral: When you copy/paste/rename a folder that corresponds to an existing Delta table in the same location/database, it does not generate a copy of the previous table because Delta tables are not just plain Parquet files. They have additiona...
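
If the goal is to duplicate an existing Delta table, the copy is normally made through the engine rather than the file system, so the transaction log is copied and the new table is registered; a minimal sketch with placeholder table names:

```python
# DEEP CLONE copies both the data files and the transaction log and
# registers the result as a new table in the metastore.
spark.sql("""
  CREATE TABLE my_database.my_table_copy
  DEEP CLONE my_database.my_table
""")

# Alternatively, materialize a plain copy of the current snapshot.
spark.sql("""
  CREATE TABLE my_database.my_table_snapshot AS
  SELECT * FROM my_database.my_table
""")
```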

Mr__D
by New Contributor II
  • 7165 Views
  • 1 reply
  • 0 kudos

Databricks Cluster Autoscaling

Hello all, could anyone please explain the impact of autoscaling on cluster cost? Suppose I have a cluster where the minimum worker count is 2 and the maximum is 10, but most of the time only 3 workers are active. Will the cluster be billed for only 3 workers or for 10 worker(...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Deepak Bhatt: Autoscaling in Databricks can have a significant impact on cluster cost, as it allows the cluster to dynamically add or remove workers based on the workload. In the scenario you described, if the active worker count is consistently at ...
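
For reference, the min/max in question are the autoscale bounds in the cluster spec; a minimal sketch of the relevant fields as they might appear in a Clusters API payload (node type and runtime version are placeholders):

```python
cluster_spec = {
    "cluster_name": "autoscaling-demo",
    "spark_version": "13.3.x-scala2.12",   # placeholder runtime
    "node_type_id": "Standard_DS3_v2",     # placeholder node type
    # Billing follows the workers actually provisioned at any moment,
    # which autoscaling keeps between these two bounds.
    "autoscale": {
        "min_workers": 2,
        "max_workers": 10,
    },
}
```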

Mr__D
by New Contributor II
  • 16734 Views
  • 1 reply
  • 0 kudos

Populating data from Databricks to SQL Server tables

Hello all, could anyone please suggest the best way to populate (upsert) data from a Delta table into a SQL Server table? We are transforming our data in Databricks and storing it in a Delta table, but for reporting purposes we need to pop...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Deepak Bhatt: Yes, using the Spark Synapse connector could be a good option for upserting data from a Delta table into a SQL Server table. The Spark Synapse connector allows you to read and write data from Azure Synapse Analytics, formerly known as...
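
A common alternative sketch (not from this thread) using the plain Spark JDBC writer: load the Delta table, land it in a SQL Server staging table, then run the MERGE on the SQL Server side. Connection details, secret scope, and table names are placeholders:

```python
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=reporting;encrypt=true"
)  # placeholder connection string

df = spark.table("my_schema.my_delta_table")

# Land the current snapshot in a staging table on SQL Server.
(
    df.write.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table_staging")
    .option("user", dbutils.secrets.get("my-scope", "sql-user"))
    .option("password", dbutils.secrets.get("my-scope", "sql-password"))
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .mode("overwrite")
    .save()
)

# The upsert itself (MERGE dbo.my_table AS t USING dbo.my_table_staging AS s ...)
# then runs on SQL Server, e.g. from a stored procedure or a follow-up task.
```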

priyak
by New Contributor III
  • 5708 Views
  • 7 replies
  • 3 kudos

Resolved! Multiple versions of custom libraries on the cluster

Using the install_libraries API, I installed a custom Python whl file on a running cluster. For certain types of requests, we have a requirement to install a different version of the same custom whl file in the running cluster. My problem is that uni...
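
One workaround often suggested for running versions side by side (not confirmed in this thread) is notebook-scoped installation, so each notebook session gets its own copy of the wheel without touching the cluster-wide library; a sketch with hypothetical wheel paths:

```python
# Notebook A: pin version 1.0.0 for this notebook's Python environment only.
%pip install /dbfs/FileStore/wheels/mylib-1.0.0-py3-none-any.whl

# Notebook B (a separate notebook/session) would instead run:
# %pip install /dbfs/FileStore/wheels/mylib-2.0.0-py3-none-any.whl
```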

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Priya K Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...

6 More Replies
Gopal269673
by Contributor
  • 5351 Views
  • 11 replies
  • 8 kudos

Resolved! Facing issues running converted code in the Spark SQL framework with 5 to 10 percent of prod data volume. Need help and suggestions.

Hi all, I need your help with an issue I am facing. Currently we are using Databricks as a platform to build pipelines and execute our Talend ETL SQL converted into the Spark SQL framework, as we were facing issues loading the history data int...

Latest Reply
Gopal269673
Contributor
  • 8 kudos

@All Users Group Metrics stats are also attached here. Thanks.

10 More Replies
