Data Engineering

Forum Posts

dotan
by New Contributor II
  • 891 Views
  • 2 replies
  • 1 kudos

Resolved! How do I reduce the size of a Hive table's S3 bucket?

I have a Hive table in Delta format with over 1B rows. When I check the Data Explorer in the SQL section of Databricks, it notes that the table size is 139.3 GiB with 401 files, but when I check the S3 bucket where the files are located (dbfs:/user/hive...

Latest Reply
apingle
Contributor
  • 1 kudos

When you run updates, deletes, etc. on a Delta table, new files are created. However, the old files are not automatically deleted; this is what enables features like time travel on Delta tables. In order to delete older files for a Delta table, you...

  • 1 kudos
1 More Replies
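
As a hedged illustration of the cleanup the reply above describes (not the thread's exact commands), the sketch below assumes a Delta table named my_db.events; the retention interval is a placeholder you would tune to your time-travel requirements.

```python
# Minimal sketch: reclaim S3 storage for a Delta table by removing data files
# that are no longer referenced by the current table version.
# Table name and retention window are assumptions, not taken from the thread.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Optional first step: compact many small files into fewer, larger ones.
spark.sql("OPTIMIZE my_db.events")

# Permanently delete unreferenced files older than the retention window
# (7 days / 168 hours is the default retention).
spark.sql("VACUUM my_db.events RETAIN 168 HOURS")
```

Files removed by VACUUM are no longer available for time travel beyond the retained window, which is why Delta does not delete them automatically.
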
Hubert-Dudek
by Esteemed Contributor III
  • 428 Views
  • 1 replies
  • 5 kudos

Exciting news for Databricks users! #databricks launched a new feature that allows users to run job workflows continuously. Setting up a continuous job workflow is straightforward: create a job and select the continuous trigger option in the scheduli...

Latest Reply
jose_gonzalez
Moderator
  • 5 kudos

Thank you for sharing!!!

  • 5 kudos
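
As a hedged sketch of what the post describes (placeholders throughout: workspace host, token, notebook path, and cluster ID; field names should be verified against the Jobs 2.1 API reference), a continuous trigger in a job definition looks roughly like this:

```python
# Minimal sketch (not the exact API contract): creating a job with a continuous
# trigger via the Jobs 2.1 REST API instead of a cron schedule.
import requests

job_spec = {
    "name": "continuous-demo-job",               # placeholder name
    "continuous": {"pause_status": "UNPAUSED"},  # continuous trigger option
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Repos/demo/stream_notebook"},  # placeholder
            "existing_cluster_id": "1234-567890-abcde123",                       # placeholder
        }
    ],
}

resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # returns the new job_id on success
```
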
Sujitha
by Community Manager
  • 388 Views
  • 1 replies
  • 1 kudos

Weekly Release Notes Recap

Here's a quick recap of the latest release notes updates from the past week. Databricks platform release notes, February 21 - 28, 2023: Ray on Databricks (Public Preview). With Databricks Runtime 12.0 and above, you can create ...

  • 388 Views
  • 1 replies
  • 1 kudos
Latest Reply
jose_gonzalez
Moderator
  • 1 kudos

Thank you for sharing!!!

  • 1 kudos
SteveGPT
by New Contributor III
  • 2207 Views
  • 6 replies
  • 3 kudos

How to bypass SSL cert verification when using Repos with Azure DevOps

Hi all, after some time working with DevOps and Repos and getting used to the convenience, our SSL cert situation got jacked up somehow. While not ideal, I'd like to be able to temporarily bypass cert verification. There are ways to do this in the she...

Latest Reply
SteveGPT
New Contributor III
  • 3 kudos

Guess I'm out of luck on this one...

  • 3 kudos
5 More Replies
youssefmrini
by Honored Contributor III
  • 884 Views
  • 2 replies
  • 4 kudos

Resolved! Does DLT Support watermarking and Windowing ?

Yes, it does. Here is the syntax for watermarking: https://docs.databricks.com/sql/language-manual/sql-ref-syntax-qry-select-watermark.html and here is the syntax for windowing: https://docs.databricks.com/sql/language-manual/sql-ref-window-functions.html

Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @Youssef Mrini​ , Thank you for sharing the valuable information. Your insights are beneficial, and I appreciate the time and effort you put into gathering and presenting that information. I'm sure our peers will find it as valuable as us. Thanks ...

  • 4 kudos
1 More Replies
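
As a hedged, illustrative sketch rather than anything from the thread, this is roughly how watermarking and a tumbling-window aggregation look in a Python Delta Live Tables pipeline; the upstream events table, the event_time and user_id columns, and the intervals are all assumptions.

```python
# Minimal sketch: a DLT table that applies a watermark and a tumbling window.
# The "events" source, "event_time"/"user_id" columns, and intervals are assumptions.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Per-user event counts in 10-minute windows (illustrative only)")
def windowed_event_counts():
    return (
        dlt.read_stream("events")                   # streaming read of an upstream DLT table
        .withWatermark("event_time", "30 minutes")  # tolerate up to 30 minutes of late data
        .groupBy(
            F.window("event_time", "10 minutes"),   # tumbling 10-minute windows
            "user_id",
        )
        .count()
    )
```
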
youssefmrini
by Honored Contributor III
  • 977 Views
  • 2 replies
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Youssef Mrini​, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedbac...

  • 0 kudos
1 More Replies
youssefmrini
by Honored Contributor III
  • 835 Views
  • 2 replies
  • 2 kudos

Resolved! Can I run Ray applications on Databricks ?

With Databricks Runtime 12.0 and above, you can create a Ray cluster and run Ray applications in Databricks with the Ray on Spark API. Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a ...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Youssef Mrini​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you.Than...

  • 2 kudos
1 More Replies
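
As a hedged sketch of the Ray on Spark workflow the answer refers to (the worker count and the toy task are placeholders; argument names are worth checking against the current Ray on Spark documentation):

```python
# Minimal sketch: start a Ray cluster on top of the Spark cluster, run a trivial
# Ray task, then shut the Ray cluster down. Worker count is a placeholder.
import ray
from ray.util.spark import setup_ray_cluster, shutdown_ray_cluster

# Launch Ray worker processes on the existing Databricks/Spark cluster.
setup_ray_cluster(num_worker_nodes=2)

ray.init()  # connect to the Ray cluster that was just started


@ray.remote
def square(x):
    return x * x


print(ray.get([square.remote(i) for i in range(10)]))

shutdown_ray_cluster()  # release the Ray resources back to Spark
```
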
Jfoxyyc
by Valued Contributor
  • 2056 Views
  • 8 replies
  • 7 kudos

Databricks IDE - Notebooks

With the announcement of official IDE support for VS Code, does anyone know if there's a way to run notebooks in VS Code on Databricks clusters? https://www.databricks.com/blog/2023/02/14/announcing-a-native-visual-studio-code-experience-for-dat...

Latest Reply
Jfoxyyc
Valued Contributor
  • 7 kudos

There has been no response regarding running notebook cells in VS Code, so no best response.

  • 7 kudos
7 More Replies
Tico23
by Contributor
  • 1477 Views
  • 3 replies
  • 0 kudos

Resolved! Amazon S3 with Auto Loader consumes "too many" requests, or maybe not!

After successfully loading 3 small files (2 KB each) from AWS S3 using Auto Loader for learning purposes, I got, a few hours later, an "AWS Free Tier limit alert", although I hadn't used the AWS account for a while. Does this streaming service on Da...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, ​​Auto Loader incrementally and efficiently processes new data files as they arrive in cloud storage. Auto Loader can load data files from AWS S3 (s3://), Azure Data Lake Storage Gen2 (ADLS Gen2, abfss://), Google Cloud Storage (GCS, gs://), Azur...

  • 0 kudos
2 More Replies
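
As a hedged sketch of the Auto Loader pattern being discussed (bucket paths, file format, and table name are placeholders), note that the default directory-listing mode repeatedly lists the S3 prefix, which is one plausible source of the request volume the poster noticed:

```python
# Minimal sketch: incrementally ingest new JSON files from S3 with Auto Loader.
# Paths, format, and target table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/events")  # schema tracking
    .load("s3://my-bucket/landing/events/")
)

(
    stream.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/events")
    .trigger(availableNow=True)   # process available files, then stop (no always-on polling)
    .toTable("bronze.events")
)
```
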
Ajay-Pandey
by Esteemed Contributor III
  • 906 Views
  • 1 replies
  • 5 kudos

Notebook cell output results limit increased: 10,000 rows or 2 MB

Hi all, Databricks now shows the first 10,000 rows instead of 1,000 rows. That will reduce the time of re-execution while working on smaller amounts of data that have rows between 100...

Latest Reply
Kaniz
Community Manager
  • 5 kudos

Thank you @Ajay Pandey​ for sharing the good news with your peers.

  • 5 kudos
Hubert-Dudek
by Esteemed Contributor III
  • 968 Views
  • 3 replies
  • 7 kudos

Starting from #databricks Runtime 12.2 LTS, implicit lateral column aliasing is now supported. This feature enables you to reuse an expression defined earlier in the same SELECT list, thus avoiding repetition of the same calculation. For instance, in ...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 7 kudos

Informative. Thanks for sharing.

  • 7 kudos
2 More Replies
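
A hedged illustration of the feature (table and column names are invented): the alias bonus defined earlier in the SELECT list is reused in the next expression, which on runtimes below 12.2 LTS would require repeating the calculation or nesting a subquery.

```python
# Minimal sketch: reuse an alias defined earlier in the same SELECT list.
# Table and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql(
    """
    SELECT
        employee_id,
        salary * 0.10   AS bonus,      -- expression defined once
        salary + bonus  AS total_pay   -- lateral alias reused (DBR 12.2 LTS and above)
    FROM employees
    """
).show()
```
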
Hubert-Dudek
by Esteemed Contributor III
  • 3126 Views
  • 4 replies
  • 23 kudos

Encrypt and decrypt personal data with Spark Databricks

We create a table that will include personal information. However, we want to hide personal identifiers so no one can see them. We set a key. A key needs to have 16, 24, or 32 bytes. 1 byte = 1 ch...

Latest Reply
MaheshDBR
New Contributor II
  • 23 kudos

@Hubert Dudek​ How can we decrypt data outside of Databricks with Python, when it was encrypted with aes_encrypt?

  • 23 kudos
3 More Replies
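
As a hedged sketch of the built-in functions the post demonstrates (the hard-coded key and the people table are placeholders; real keys belong in a secret scope):

```python
# Minimal sketch: encrypt a column with aes_encrypt, then decrypt it again.
# The 16-character key and the people/personal_id names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

key = "1234567890abcdef"  # 16 bytes -> AES-128; demo value only

encrypted = spark.sql(f"""
    SELECT name,
           base64(aes_encrypt(personal_id, '{key}')) AS personal_id_enc
    FROM people
""")
encrypted.createOrReplaceTempView("people_enc")

decrypted = spark.sql(f"""
    SELECT name,
           cast(aes_decrypt(unbase64(personal_id_enc), '{key}') AS STRING) AS personal_id
    FROM people_enc
""")
decrypted.show()
```

On decrypting outside Databricks, as the latest reply asks: that requires the same key plus knowledge of the cipher mode and IV layout aes_encrypt used (GCM by default on recent runtimes), so it is worth confirming those details in the runtime documentation before depending on an external decryptor.
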
STummala
by New Contributor
  • 1090 Views
  • 2 replies
  • 0 kudos
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi ​@sandeep tummala​, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your fe...

  • 0 kudos
1 More Replies
raghub1
by New Contributor II
  • 4375 Views
  • 5 replies
  • 3 kudos

Resolved! Writing PySpark DataFrame onto AWS Glue throwing error

I have followed the steps as mentioned in this blog: https://www.linkedin.com/pulse/aws-glue-data-catalog-metastore-databricks-deepak-rajak/ but when trying to saveAsTable(table_name), it gives an error: IllegalArgumentException: Path must be ...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hey @Raghu Bharadwaj Tallapragada​, just wanted to check in if you were able to resolve your issue or do you need more help? We'd love to hear from you. Thanks!

  • 3 kudos
4 More Replies
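
One hedged guess at the kind of fix that resolves this family of IllegalArgumentException errors when writing through the Glue catalog (not confirmed as this thread's accepted answer) is to pin an explicit storage path, since a Glue database created without a location leaves saveAsTable with no base path to resolve. The bucket, database, and table names below are placeholders.

```python
# Minimal sketch: register a table in the Glue catalog while supplying an
# explicit storage location. Bucket, database, and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])  # toy data

(
    df.write.format("parquet")
    .mode("overwrite")
    .option("path", "s3://my-bucket/warehouse/my_db/my_table")  # explicit location
    .saveAsTable("my_db.my_table")                              # registers in the catalog
)
```
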