Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Marco9898
by New Contributor II
  • 7275 Views
  • 2 replies
  • 3 kudos

Running SQL Data File through Notebook Python

I am attempting to run larger SQL scripts through a Databricks Notebook and export the data to a file. For the most part the notebook works when the SQL script is a single SELECT statement. However, if the SQL file is more complicated, such as involving the ...
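One common workaround is to split the script into individual statements and run them one at a time with spark.sql(), since spark.sql() accepts only a single statement per call. A minimal sketch, assuming a hypothetical DBFS script path, semicolon-separated statements, and a hypothetical export location:

```python
# Split a multi-statement SQL file and run each statement with spark.sql().
# The script path and export location are assumptions for illustration.
sql_text = open("/dbfs/tmp/report.sql").read()
statements = [s.strip() for s in sql_text.split(";") if s.strip()]

result = None
for stmt in statements:
    result = spark.sql(stmt)   # keep the result of the last statement

# Export the final result set to CSV (single file via coalesce).
if result is not None:
    (result.coalesce(1)
           .write.mode("overwrite")
           .option("header", "true")
           .csv("/tmp/export"))
```

Note that naive splitting on ";" breaks if semicolons appear inside string literals, so a real script may need a proper splitter.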

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Marco Perez​ Does @Jose Gonzalez​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
Bevin
by New Contributor III
  • 2687 Views
  • 5 replies
  • 9 kudos

Resolved! What is the best way to manage the growth of the default storage location?

What is the best way to manage the growth of the default storage location in a shared workspace? I have a number of users logging in and working on different clusters in a single workspace. I understand that all the results and notebooks are saved t...

Latest Reply
Anonymous
Not applicable
  • 9 kudos

Hi @Bevin Maragh​ Glad to hear that! Could you please mark an answer as best? It would be highly appreciated. Thanks and regards.

4 More Replies
Mado
by Valued Contributor II
  • 2309 Views
  • 2 replies
  • 3 kudos

When should I use ".start()" with writeStream?

Hi, I am practicing with Databricks. In the sample notebooks, I have seen writeStream used both with and without the ".start()" method. Samples are below. Without .start(): spark.readStream   .format("cloudFiles")   .option("cloudFiles.f...
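For reference, writeStream only builds a DataStreamWriter; the streaming query does not run until a terminal call such as .start() (or .table(), which starts it implicitly). A minimal sketch, assuming a hypothetical Auto Loader source path and checkpoint/target locations:

```python
# Nothing is executed until .start() is called on the writeStream chain.
df = (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/tmp/schema")   # hypothetical schema location
        .load("/tmp/source"))                                 # hypothetical source path

query = (df.writeStream
           .format("delta")
           .option("checkpointLocation", "/tmp/checkpoints/demo")
           .start("/tmp/target"))                             # launches the stream

# query.stop()  # stop the query when done experimenting
```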

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Mohammad Saber​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, Bricksters will get back to you soon. Thanks!

1 More Replies
Mado
by Valued Contributor II
  • 2960 Views
  • 2 replies
  • 3 kudos

Question about "foreachBatch" to remove duplicate records when streaming data

Hi, I am practicing with the Databricks sample notebooks published here: https://github.com/databricks-academy/advanced-data-engineering-with-databricks. In one of the notebooks (ADE 3.1 - Streaming Deduplication) (URL), there is sample code to remove dupli...
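The pattern in that lesson combines dropDuplicates with a foreachBatch MERGE, so records already written by earlier micro-batches are not inserted again. A rough sketch loosely following that pattern; the table and column names (bronze, silver, user_id, event_timestamp) are assumptions:

```python
from pyspark.sql import DataFrame

def upsert_to_silver(microbatch_df: DataFrame, batch_id: int):
    # Deduplicate within the micro-batch, then MERGE so rows already present
    # in the target (from earlier batches) are skipped.
    microbatch_df.dropDuplicates(["user_id", "event_timestamp"]) \
                 .createOrReplaceTempView("updates")
    microbatch_df.sparkSession.sql("""
        MERGE INTO silver AS t
        USING updates AS s
        ON t.user_id = s.user_id AND t.event_timestamp = s.event_timestamp
        WHEN NOT MATCHED THEN INSERT *
    """)

(spark.readStream.table("bronze")
      .withWatermark("event_timestamp", "30 seconds")
      .dropDuplicates(["user_id", "event_timestamp"])
      .writeStream
      .foreachBatch(upsert_to_silver)
      .option("checkpointLocation", "/tmp/checkpoints/silver")
      .outputMode("update")
      .start())
```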

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Mohammad Saber​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, Bricksters will get back to you soon. Thanks!

1 More Replies
Lalli
by New Contributor II
  • 1071 Views
  • 1 replies
  • 2 kudos

Databricks notebook is not printing an image in Python

When I run a Python command that should display an image, the image does not show, and I do not get any error message.
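Without the code it is hard to say, but a common cause is calling something like PIL's img.show(), which tries to open a viewer on the cluster node rather than rendering in the notebook. A minimal sketch that does render inline, assuming matplotlib and a hypothetical image path:

```python
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

img = mpimg.imread("/dbfs/tmp/example.png")   # hypothetical image path
plt.imshow(img)
plt.axis("off")
plt.show()                                    # or: display(plt.gcf()) in a Databricks notebook
```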

Latest Reply
User16753725469
Contributor II
  • 2 kudos

Could you please share the code snippet you are using to print the image?

data_explorer
by New Contributor II
  • 1441 Views
  • 1 replies
  • 2 kudos
Latest Reply
User16753725469
Contributor II
  • 2 kudos

Please refer: https://www.databricks.com/blog/2021/05/26/introducing-databricks-unity-catalog-fine-grained-governance-for-data-and-ai-on-the-lakehouse.html

Nisarg_Khamar
by New Contributor
  • 1176 Views
  • 1 replies
  • 1 kudos

Did not get badge and 100 points for Databricks Lakehouse Platform Accreditation exam

I have completed the Databricks Fundamentals of the Databricks Lakehouse Platform Accreditation exam, but I did not get the badge and 100 points in my account. Could someone please help?

Latest Reply
pkgltn
New Contributor III
  • 1 kudos

Same question here. I have completed the accreditation, but I do not know how to redeem those 100 points.

stupendousenzio
by New Contributor III
  • 2812 Views
  • 4 replies
  • 7 kudos

Unable to access workspace after the trial period in Databricks on Google Cloud.

I was using the Databricks trial for 14 days and had some important notebooks where I had made all the changes. Now I have extended the service and subscribed to Databricks on GCP. When I enter the workspace section I cannot see the w...

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @Aditya Aranya​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you.Than...

3 More Replies
JamesKuo
by New Contributor III
  • 6566 Views
  • 2 replies
  • 7 kudos

Where can I find API documentation for dbutils.notebook.entry_point?

dbutils.notebook.help only lists the "run" and "exit" methods. I could only find references to dbutils.notebook.entry_point spread across the web, but there does not seem to be official Databricks documentation for its complete API anywhere. Can so...
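As far as I know, entry_point is an internal, undocumented hook into the JVM-side DBUtils object rather than a supported API, which is why dbutils.notebook.help() does not list it. A commonly shared (unofficial) use, which may change between runtime versions, is reading the notebook context:

```python
# Unofficial/undocumented: the attribute chain below reflects internal objects
# and is not guaranteed to be stable across Databricks Runtime versions.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

print(ctx.notebookPath().get())   # path of the current notebook
print(ctx.apiUrl().get())         # workspace URL
print(ctx.apiToken().get())       # short-lived API token for this session
```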

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @James kuo​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
Reda
by New Contributor II
  • 2156 Views
  • 1 replies
  • 6 kudos

Creating a DLT pipeline that reads from a JDBC source

Hey, I'm trying to create a DLT pipeline that reads from a JDBC source, and the code I'm using looks something like this in Python: import dlt @dlt.table def table_name(): driver = 'oracle.jdbc.driver.OracleDriver' url = '...' query = 'SELECT ......
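For context, one way this can be wired up is to return a plain spark.read JDBC DataFrame from the @dlt.table function; the source is then re-read as a batch snapshot on every pipeline update. A rough sketch where the connection URL, query, and secret scope/key names are all assumptions:

```python
import dlt

@dlt.table(name="oracle_snapshot")
def oracle_snapshot():
    # Batch JDBC read wrapped as a DLT table; credentials come from a
    # hypothetical secret scope rather than being hard-coded.
    return (spark.read
              .format("jdbc")
              .option("driver", "oracle.jdbc.driver.OracleDriver")
              .option("url", "jdbc:oracle:thin:@//db-host:1521/service")        # hypothetical
              .option("query", "SELECT * FROM source_table")                    # hypothetical
              .option("user", dbutils.secrets.get("jdbc_scope", "user"))        # hypothetical scope/keys
              .option("password", dbutils.secrets.get("jdbc_scope", "password"))
              .load())
```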

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Reda Bitar​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, Bricksters will get back to you soon. Thanks!

impulsleistung
by New Contributor III
  • 3699 Views
  • 4 replies
  • 6 kudos

Mount S3 bucket with a specific endpoint

Environment: Azure Databricks. Language: Python. I can access my S3 bucket via boto3.client('s3', endpoint_url='https://gateway.storjshare.io', ... ) and it also works via boto3.resource('s3', endpoint_url='https://gateway.storjshare.io', ... ). As a next st...
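One way to attempt the mount is dbutils.fs.mount() with the S3A connector, passing the gateway endpoint through the standard fs.s3a.* options in extra_configs. A minimal sketch, where the bucket name, mount point, and secret scope/key names are assumptions:

```python
# Credentials pulled from a hypothetical secret scope.
access_key = dbutils.secrets.get("storj", "access_key")
secret_key = dbutils.secrets.get("storj", "secret_key")

dbutils.fs.mount(
    source="s3a://my-bucket",                              # hypothetical bucket
    mount_point="/mnt/storj",
    extra_configs={
        "fs.s3a.endpoint": "https://gateway.storjshare.io",
        "fs.s3a.access.key": access_key,
        "fs.s3a.secret.key": secret_key,
        "fs.s3a.path.style.access": "true",
    },
)
```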

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Kevin Ostheimer​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you.Th...

3 More Replies
khoa
by New Contributor II
  • 2083 Views
  • 1 replies
  • 4 kudos

Delta sharing in Databricks doesn't work

The Databricks Delta Sharing server seems to be broken. We have a table of ~10M rows and there is no way for us to query the shared data via any method (e.g. Python/Spark, or even another Databricks account that the data was shared with). Any ideas on why thi...
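If the provider-side share itself is healthy, the recipient side can be sanity-checked with the open delta-sharing connector. A minimal sketch, assuming the connector is installed, a profile file downloaded from the provider, and hypothetical share/schema/table names; for a ~10M-row table, load_as_spark is usually preferable to load_as_pandas, which pulls everything onto the driver:

```python
import delta_sharing

profile = "/dbfs/tmp/config.share"                     # hypothetical profile path
table_url = profile + "#my_share.my_schema.my_table"   # hypothetical coordinates

df = delta_sharing.load_as_spark(table_url)
df.limit(10).show()
```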

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Khoa Ho​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

bitsplease
by New Contributor II
  • 1714 Views
  • 3 replies
  • 4 kudos

Haven't received Databricks Certificate or any form of correspondence

I passed the Databricks Certified Associate Developer for Apache Spark 3.0 - Python on 10/22/2022 with a score of 85%. My Kryterion Webassessor account shows a pass. However, I've not yet received any correspondence/badge from Databricks.

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Kartikeya Shukla​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you.T...

2 More Replies
dimandfacts
by New Contributor III
  • 2283 Views
  • 2 replies
  • 6 kudos

Community Edition SQL Warehouse is not starting up, is it not free to even trial?

When I start the SQL warehouse, I get this error. Is there a way around it to start up? I just want to try some features. Clusters are failing to launch. Cluster launch will be retried. Details for the latest failure: Error: Error code: PublicIPCountLim...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Anbarasan Dhanushkodi​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from ...

1 More Replies
James_209101
by New Contributor II
  • 7605 Views
  • 2 replies
  • 5 kudos

Using large dataframe in-memory (data not allowed to be "at rest") results in driver crash and/or out of memory

I'm having trouble working on Databricks with data that we are not allowed to save off or persist in any way. The data comes from an API (which returns a JSON response). We have a Scala package on our cluster that makes the queries (almost 6k queries...
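Without seeing the package it is hard to be specific, but a common way to keep thousands of API responses off the driver is to distribute the request parameters and let the executors fetch and parse them, so the data only ever exists as a distributed DataFrame. A rough Python sketch under assumed endpoint, parameters, and a flat JSON schema (the original package is Scala, so this is only an illustration of the approach):

```python
import requests
from pyspark.sql import Row

params = [{"id": i} for i in range(6000)]          # hypothetical request parameters

def fetch(p):
    # Runs on the executors; each call fetches and parses one response.
    resp = requests.get("https://api.example.com/data", params=p, timeout=30)
    resp.raise_for_status()
    return Row(**resp.json())                      # assumes a flat JSON object

# Fan the calls out across executors; results never sit in one driver-side list.
df = spark.sparkContext.parallelize(params, numSlices=200).map(fetch).toDF()
df.count()   # materializes across executors; nothing is persisted to storage
```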

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @James Held​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
