Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

ACP
by New Contributor III
  • 1827 Views
  • 5 replies
  • 0 kudos

[Attachment: Screenshot 2023-01-09 094039]

Hey guys, the Databricks Academy login is not working. I have been trying for the past hour and it still doesn't work. It seems to be related to the Databricks HTTPS certificate being expired, but I'm not sure. I'm attaching an image with the error. Any help with thi...

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 0 kudos

Hi @Andre Paiva​, can you please try now? I am able to load both the Customer and Partner Academy websites, so I think the Academy team has fixed the issue. Happy learning!!

4 More Replies
databicky
by Contributor II
  • 2126 Views
  • 5 replies
  • 3 kudos

Resolved! How to add a current date as suffix while using copy?

How do I add the current date as a suffix to the filename while copying with dbutils, like report20221223.xlsx? dbutils.fs.cp('dbfs://temp/balancing/report.xlsx','abfss://con@adls/provsn/result/report.xlsx',True) I need to add the current date in the file like ...
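A minimal sketch of one way to do this, reusing the source and destination paths from the post; the strftime pattern below produces a suffix such as 20221223:

from datetime import datetime

# Build a yyyyMMdd suffix, e.g. 20221223.
date_suffix = datetime.today().strftime("%Y%m%d")

src = "dbfs:/temp/balancing/report.xlsx"
dst = f"abfss://con@adls/provsn/result/report{date_suffix}.xlsx"

# dbutils.fs.cp(source, destination) copies the single file; dbutils is only
# available inside a Databricks notebook or job.
dbutils.fs.cp(src, dst)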

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Mohammed sadamusean​, we haven't heard from you since the last response from @Aviral Bhardwaj​ and @Ratna Chaitanya Raju Bandaru​, and I was checking back to see if their suggestions helped you. Otherwise, if you have any solution, please do share ...

4 More Replies
thibault
by Contributor II
  • 3125 Views
  • 6 replies
  • 0 kudos

Resolved! Monaco editor - Toggle line comment not working

I recently tried the new editor, and the usual shortcuts like CTRL + / to toggle a comment are not working. Is this a known issue? It's working fine with the classic editor, so I am switching back to it in the meantime, but it would be great to use this new additi...

Latest Reply
thibault
Contributor II
  • 0 kudos

It has been fixed now, thanks!

5 More Replies
Aviral-Bhardwaj
by Esteemed Contributor III
  • 809 Views
  • 2 replies
  • 19 kudos

Deltalake Vs Datalake in Databricks - Delta Lake is an open-source storage layer that sits on top of existing d...

Deltalake Vs Datalake in Databricks: Delta Lake is an open-source storage layer that sits on top of existing data lake storage, such as Azure Data Lake Store or Amazon S3. It provides a more robust and scalable alternative to tra...
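As a quick illustration of the storage-layer point above (a sketch with a placeholder path, not anything from the post), a Delta table is written and read with the ordinary Spark DataFrame API:

# Placeholder location; any DBFS / ADLS / S3 path works.
path = "dbfs:/tmp/demo/events_delta"

df = spark.range(100).withColumnRenamed("id", "event_id")

# Writing in the "delta" format produces Parquet data files plus a _delta_log
# transaction log, which is what adds ACID guarantees on top of the data lake.
df.write.format("delta").mode("overwrite").save(path)

# Reading it back works like any other source; time travel, MERGE, etc. become available.
spark.read.format("delta").load(path).show(5)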

Latest Reply
Kaniz_Fatma
Community Manager
  • 19 kudos

Awesome!

1 More Replies
Nilave
by New Contributor III
  • 5402 Views
  • 6 replies
  • 5 kudos

Resolved! Azure Databricks unable to connect to private DNS KeyVault in createScope, showing "DNS invalid"

I have an Azure KeyVault with a private endpoint created in the same VNet as Azure Databricks. While trying to add it as a scope using the private DNS zone, i.e. <KVname>.privatelink.vaultcore.azure.net, I am getting the error "DNS is invalid and cannot be reached....

Latest Reply
mark_362882
New Contributor III
  • 5 kudos

I got it working by creating the KeyVault-backed scope via the UI. I used the DNS name without the private part: <KVName>.vault.azure.net. The private DNS will resolve it to the right IP. You do have to check the "Allow trusted Microsoft services to bypass this fi...

5 More Replies
rubenteixeira
by New Contributor III
  • 1911 Views
  • 2 replies
  • 0 kudos

Can't parallelize model training with sc.parallelize, even though I can run the same code without parallelizing

I'm training a NeuralProphet for a time series forecasting problem. I'm trying to parallelize my training, but this error is appearing. The folder lightning_logs has an hparams.yaml, but it's empty. Is this related to permissions on the cluster? Thanks i...
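For reference, a generic sketch of the per-series training pattern the post seems to be attempting (not the original code); it assumes the neuralprophet package is installed on every worker node and that each series is a small pandas DataFrame with the ds/y columns NeuralProphet expects:

import pandas as pd
from neuralprophet import NeuralProphet

# Dummy data: one small daily series per model, purely for illustration.
series_list = [
    pd.DataFrame({"ds": pd.date_range("2022-01-01", periods=90, freq="D"),
                  "y": range(90)})
    for _ in range(4)
]

def train_one(series_pdf):
    # Runs on an executor, so everything used here must be importable/picklable there.
    model = NeuralProphet()
    metrics = model.fit(series_pdf, freq="D")
    return metrics.iloc[-1].to_dict()

# One partition per series so each model trains independently.
results = (sc.parallelize(series_list, numSlices=len(series_list))
             .map(train_one)
             .collect())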

[Attachment: image.png]
Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, please let us know if this was checked already:

1 More Replies
Aviral-Bhardwaj
by Esteemed Contributor III
  • 1458 Views
  • 2 replies
  • 20 kudos

⏩ Understanding Unity Catalog in Databricks ⏮ In Databricks, the Unity Catalog is a data catalog that allows you to store, access, and manage data wit...

Understanding Unity Catalog in Databricks In Databricks, the Unity Catalog is a data catalog that allows you to store, access, and manage data within your Databricks workspace. It provides a unified interface for working with data across different s...
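As a small, hedged illustration of the unified interface described above (the catalog, schema, and table names here are made up, and the commands require a Unity Catalog-enabled workspace with the right privileges):

# Three-level namespace: catalog.schema.table
spark.sql("CREATE CATALOG IF NOT EXISTS demo_catalog")
spark.sql("CREATE SCHEMA IF NOT EXISTS demo_catalog.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo_catalog.sales.orders (
        order_id BIGINT,
        amount   DOUBLE
    )
""")

# Any workspace attached to the same metastore can now address the table
# with its fully qualified name.
spark.table("demo_catalog.sales.orders").printSchema()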

Latest Reply
Kaniz_Fatma
Community Manager
  • 20 kudos

Nice one! Keep sharing such informative posts.

1 More Replies
tanjil
by New Contributor III
  • 1990 Views
  • 2 replies
  • 2 kudos

print(flush = True) not working

Hello, I have the following minimum working example using multiprocessing: from multiprocessing import Pool   files_list = [('bla', 1, 3, 7), ('spam', 12, 4, 8), ('eggs', 17, 1, 3)]     def f(t): print('Hello from child process', flush = Tr...

Latest Reply
tanjil
New Contributor III
  • 2 kudos

No errors are generated. The code executes successfully, but the print statement for "Hello from child process" does not produce any output.
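A minimal sketch of a workaround, on the assumption that stdout from the child processes simply isn't attached to the notebook cell: return the messages from the workers and print them from the driver process instead.

from multiprocessing import Pool

files_list = [('bla', 1, 3, 7), ('spam', 12, 4, 8), ('eggs', 17, 1, 3)]

def f(t):
    # Anything printed here goes to the child process's own stdout, which the
    # notebook cell does not capture, so build the message and return it.
    return f"Hello from child process: {t}"

if __name__ == "__main__":
    with Pool(3) as pool:
        for msg in pool.map(f, files_list):
            # Printing in the driver process does show up in the cell output.
            print(msg, flush=True)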

1 More Replies
Optum
by New Contributor III
  • 6737 Views
  • 10 replies
  • 4 kudos

Resolved! Databricks JDBC & Remote Write

Hello, I'm trying to write to a Delta Table in my Databricks instance from a remote Spark session on a different cluster with the Simba Spark driver. I can do reads, but when I attempt to do a write, I get the following error: {  df.write.format("jdbc...

Latest Reply
Atanu
Esteemed Contributor
  • 4 kudos

Could you try setting the flag to ignore transactions? I'm not sure what the exact flag is, but there should be more details in the JDBC manual on how to do this.
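For what it's worth, a hedged sketch of a remote write via the Simba Spark JDBC driver; the host, HTTP path, token, and table name are placeholders, and the exact connection property for ignoring transactions is not confirmed here, so check the driver manual before relying on it:

# Placeholder DataFrame for the sketch.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])

# Placeholder connection details; AuthMech=3 means token-based auth with the
# legacy Simba Spark JDBC driver. If the driver rejects writes because it does
# not support transactions, the manual documents a connection property to
# ignore them -- append it to this URL (property name not verified here).
jdbc_url = (
    "jdbc:spark://<workspace-host>:443/default;"
    "transportMode=http;ssl=1;httpPath=<http-path>;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>"
)

(df.write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("driver", "com.simba.spark.jdbc.Driver")
   .option("dbtable", "target_schema.target_table")
   .mode("append")
   .save())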

9 More Replies
brickster_2018
by Esteemed Contributor
  • 4867 Views
  • 2 replies
  • 1 kudos
Latest Reply
User16752240150
New Contributor II
  • 1 kudos

Every 10 transactions, the JSON files in the _delta_log are compacted into a Parquet checkpoint file. The .crc file is a checksum added to detect corruption if a Parquet file is damaged in flight.
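To see this on a concrete table, you can list the transaction log directory (the table path below is a placeholder):

# Point this at one of your own Delta tables.
log_path = "dbfs:/tmp/demo/events_delta/_delta_log"

# Shows the JSON commit files, the .checkpoint.parquet files written roughly
# every 10 commits, and the accompanying .crc checksum files.
for entry in dbutils.fs.ls(log_path):
    print(entry.name, entry.size)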

1 More Replies
cybersam
by New Contributor II
  • 922 Views
  • 2 replies
  • 0 kudos

How do I find the documentation for a Databricks "platform release"?

My DB portal says the platform version is v3.86, and provides a link to all the releases. But none of those releases state the "platform version", and I can't find "v3.86" by searching in the Databricks docs. So, how does one find the documentation fo...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

Hey @Samuel Yang​, here you will find all the details: https://docs.databricks.com/release-notes/index.html Thanks, Aviral Bhardwaj

1 More Replies
cmilligan
by Contributor II
  • 1950 Views
  • 3 replies
  • 4 kudos

Link a visio diagram in a markdown cell

Is there a way to have Databricks pull a diagram directly from Visio? I've tried to use the embed links from Visio, but the image won't render. I'm trying to get around loading the image to DBFS, as there may be updates to the image that I want it to g...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 4 kudos

thanks for this

2 More Replies
Sharath
by New Contributor II
  • 1526 Views
  • 7 replies
  • 0 kudos

Hi Databricks Team, I passed the associate data engineer exam day before but still haven't received on accredible or on db academy. My registere...

Hi Databricks Team, I passed the Associate Data Engineer exam the day before yesterday but still haven't received it on Accredible or on Databricks Academy. My registered email id for the exam is sharath.koushik@gmail.com. Could you please help?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Sharath K​, hope all is well! Just wanted to check in to see if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

6 More Replies
shamly
by New Contributor III
  • 2554 Views
  • 3 replies
  • 2 kudos

How to remove extra ENTER line in csv UTF-16 while reading

Dear Friends, I have a CSV and it looks like this:
‡‡Id‡‡,‡‡Version‡‡,‡‡Questionnaire‡‡,‡‡Date‡‡
‡‡123456‡‡,‡‡Version2‡‡,‡‡All questions have been answered accurately and the guidance in the questionnaire was understood and followed‡‡,‡‡2010-12-16 00:01:...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

This is working fine, from pyspark.sql.functions import regexp_replace   path="dbfs:/FileStore/df/test.csv" dff = spark.read.option("header", "true").option("inferSchema", "true").option('multiline', 'true').option('encoding', 'UTF-8').option("delimi...
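A fuller, hedged sketch of the same approach (the snippet above is cut off); the path and delimiter are assumptions, and the encoding is set to UTF-16 to match the question even though the truncated snippet used UTF-8:

from pyspark.sql.functions import regexp_replace

path = "dbfs:/FileStore/df/test.csv"  # assumed location

df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .option("multiline", "true")
      .option("encoding", "UTF-16")
      .option("delimiter", ",")
      .csv(path))

# Strip the '‡' quoting characters and any stray carriage returns / line feeds
# that show up as extra blank lines inside the fields.
for col_name in df.columns:
    df = df.withColumn(col_name, regexp_replace(col_name, "[‡\r\n]", ""))

df.show(truncate=False)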

2 More Replies
Paradox_Parijat
by New Contributor III
  • 1669 Views
  • 6 replies
  • 8 kudos

Hello World! This is my first Databricks community post. Looking forward to contributing from my end. Peace out! @Dinesh Mergu​

Hello World! This is my first Databricks community post. Looking forward to contributing from my end. Peace out! @Dinesh Mergu​

Latest Reply
Kaniz_Fatma
Community Manager
  • 8 kudos

Welcome to the community @Parijat Dhar​ !!

5 More Replies