Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

data_explorer
by New Contributor II
  • 924 Views
  • 1 reply
  • 2 kudos
Latest Reply
User16753725469
Contributor II
  • 2 kudos

Please refer to: https://www.databricks.com/blog/2021/05/26/introducing-databricks-unity-catalog-fine-grained-governance-for-data-and-ai-on-the-lakehouse.html

Nisarg_Khamar
by New Contributor
  • 750 Views
  • 1 reply
  • 1 kudos

Did not get badge and 100 points for Databricks Lakehouse Platform Accreditation exam

I have completed the Fundamentals of the Databricks Lakehouse Platform Accreditation exam, but I did not get the badge and 100 points in my account. Could someone please help?

Latest Reply
pkgltn
New Contributor III
  • 1 kudos

Same question here. I completed the accreditation, but I do not know how to redeem those 100 points.

stupendousenzio
by New Contributor III
  • 1677 Views
  • 4 replies
  • 7 kudos

Unable to access workspace after the trial period in Databricks on Google Cloud.

I was using the 14-day Databricks trial and had some important notebooks where I had made all my changes. Now I have extended the service and subscribed to Databricks on GCP. When I enter the workspace section I cannot see the w...

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @Aditya Aranya, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

3 More Replies
JamesKuo
by New Contributor III
  • 4162 Views
  • 3 replies
  • 6 kudos

Resolved! Where can I find API documentation for dbutils.notebook.entry_point?

dbutils.notebook.help only lists the "run" and "exit" methods. I could only find scattered references to dbutils.notebook.entry_point across the web, but there does not seem to be any official Databricks documentation covering its complete API. Can so...
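Since entry_point is not covered in the public docs, one option is to introspect the object from a notebook. The snippet below is a hedged sketch, not an official API: dir() shows what Py4J exposes, and the getDbutils().notebook().getContext() chain is an undocumented internal that is widely shared in the community but may change between runtime versions.

# Hedged sketch: introspect the undocumented dbutils.notebook.entry_point object.
# Runs inside a Databricks notebook, where dbutils is predefined. These internals
# are not a public API and may change between runtime versions.

# List the methods Py4J exposes on the entry point object.
print([m for m in dir(dbutils.notebook.entry_point) if not m.startswith("_")])

# A widely shared (undocumented) pattern to inspect the notebook context
# (notebook path, cluster id, API URL, ...) as a JSON string.
context_json = (
    dbutils.notebook.entry_point
    .getDbutils()
    .notebook()
    .getContext()
    .toJson()
)
print(context_json)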

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @James Kuo, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
Reda
by New Contributor II
  • 1491 Views
  • 1 reply
  • 6 kudos

Creating a DLT pipeline that reads from a JDBC source

Hey, I'm trying to create a DLT pipeline that reads from a JDBC source, and the code I'm using looks something like this in Python: import dlt @dlt.table def table_name(): driver = 'oracle.jdbc.driver.OracleDriver' url = '...' query = 'SELECT ...
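For reference, a DLT table can wrap a plain JDBC batch read, since the @dlt.table function only needs to return a DataFrame. The sketch below is a hedged example of that pattern; the JDBC URL, query, and secret scope/key names are placeholders, and spark/dbutils are the objects the DLT runtime provides. Note that JDBC is a batch source, so each pipeline update re-reads it.

import dlt

# Hedged sketch of a DLT table backed by a JDBC (Oracle) source.
# The url, query, and secret scope/key names below are placeholders.
@dlt.table(
    name="oracle_orders_bronze",
    comment="Raw orders pulled over JDBC on each pipeline update",
)
def oracle_orders_bronze():
    return (
        spark.read.format("jdbc")
        .option("driver", "oracle.jdbc.driver.OracleDriver")
        .option("url", "jdbc:oracle:thin:@//dbhost:1521/service")           # placeholder
        .option("query", "SELECT * FROM orders")                             # placeholder
        .option("user", dbutils.secrets.get("my_scope", "oracle_user"))      # assumed secret
        .option("password", dbutils.secrets.get("my_scope", "oracle_pass"))  # assumed secret
        .load()
    )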

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Reda Bitar, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, Bricksters will get back to you soon. Thanks!

impulsleistung
by New Contributor III
  • 2347 Views
  • 5 replies
  • 7 kudos

Mount S3 bucket with specific endpoint

Environment: Azure Databricks. Language: Python. I can access my S3 bucket via: boto3.client('s3', endpoint_url='https://gateway.storjshare.io', ...) and it also works via: boto3.resource('s3', endpoint_url='https://gateway.storjshare.io', ...). As a next st...
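One route that should work for an S3-compatible gateway like Storj, independent of mounting, is to point the Hadoop S3A connector at the custom endpoint and read with s3a:// paths. The sketch below is a hedged example only; the secret scope/key names and bucket path are placeholders, and sc/spark/dbutils are the objects a Databricks notebook provides.

# Hedged sketch: read an S3-compatible bucket (Storj gateway) from Azure Databricks
# by configuring the Hadoop S3A connector with a custom endpoint.
# Secret scope/key names and the bucket path are placeholders.
hconf = sc._jsc.hadoopConfiguration()
hconf.set("fs.s3a.endpoint", "https://gateway.storjshare.io")
hconf.set("fs.s3a.access.key", dbutils.secrets.get("my_scope", "storj_access_key"))
hconf.set("fs.s3a.secret.key", dbutils.secrets.get("my_scope", "storj_secret_key"))
hconf.set("fs.s3a.path.style.access", "true")  # gateways often require path-style URLs

df = spark.read.json("s3a://my-bucket/raw/")   # placeholder bucket/path
display(df)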

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @Kevin Ostheimer, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

4 More Replies
khoa
by New Contributor II
  • 1387 Views
  • 2 replies
  • 5 kudos

Delta sharing in Databricks doesn't work

The Databricks Delta Sharing server seems to be broken. We have a table of ~10M rows and there is no way for us to query the shared data via any method (e.g. Python/Spark, or even another Databricks account that the data was shared with). Any ideas on why thi...
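For troubleshooting on the recipient side, it can help to confirm the share is reachable at all with the open-source delta-sharing client before involving Spark. The sketch below is a hedged example; the profile file path and the share/schema/table names are placeholders, and the Spark path assumes the delta-sharing Spark connector is installed on the cluster.

# Hedged sketch: query a Delta Sharing table with the open-source client
# (pip install delta-sharing). Profile path and share/schema/table names
# are placeholders.
import delta_sharing

profile = "/dbfs/FileStore/config.share"                 # placeholder profile file
table_url = f"{profile}#my_share.my_schema.my_table"     # placeholder coordinates

# List what the provider actually exposed to this recipient.
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())

# Small tables can be pulled straight into pandas on the driver.
pdf = delta_sharing.load_as_pandas(table_url)

# For a ~10M-row table, load as a Spark DataFrame so the read is distributed
# (requires the delta-sharing Spark connector on the cluster).
sdf = delta_sharing.load_as_spark(table_url)
print(sdf.count())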

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Khoa Ho, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
bitsplease
by New Contributor II
  • 1068 Views
  • 3 replies
  • 4 kudos

Haven't received Databricks Certificate or any form of correspondence

I passed the Databricks Certified Associate Developer for Apache Spark 3.0 - Python exam on 10/22/2022 with a score of 85%. My Kryterion Webassessor account shows a pass. However, I have not yet received any correspondence or badge from Databricks.

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Kartikeya Shukla, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
dimandfacts
by New Contributor III
  • 1473 Views
  • 2 replies
  • 6 kudos

Community Edition SQL Warehouse is not starting up, is it not free to even trial?

When I start the SQL warehouse, I get this error. Is there a way to work around it and start up? I just want to try some features. Clusters are failing to launch. Cluster launch will be retried. Details for the latest failure: Error: Error code: PublicIPCountLim...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Anbarasan Dhanushkodi, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
James_209101
by New Contributor II
  • 4284 Views
  • 2 replies
  • 4 kudos

Using large dataframe in-memory (data not allowed to be "at rest") results in driver crash and/or out of memory

I'm having trouble working on Databricks with data that we are not allowed to save off or persist in any way. The data comes from an API (which returns a JSON response). We have a Scala package on our cluster that makes the queries (almost 6k queries...
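One pattern that can help when nothing may be persisted is to distribute the API calls themselves across executors, so the JSON responses are parsed into a Spark DataFrame in executor memory instead of accumulating on the driver. The sketch below is a hedged illustration only; the endpoint, query parameters, and response shape are invented, and it assumes the API is reachable from the worker nodes.

# Hedged sketch: run the ~6k API queries on executors via mapPartitions so the
# JSON responses never collect on the driver. Endpoint, params, and response
# shape are placeholders.
import requests
from pyspark.sql import Row

query_params = [{"id": i} for i in range(6000)]          # placeholder query list

def fetch_partition(params_iter):
    # One HTTP session per partition; yield one Row per returned record.
    with requests.Session() as session:
        for params in params_iter:
            resp = session.get("https://api.example.com/data", params=params, timeout=30)
            resp.raise_for_status()
            for record in resp.json().get("items", []):  # assumed response shape
                yield Row(**record)

rdd = spark.sparkContext.parallelize(query_params, numSlices=200)
df = spark.createDataFrame(rdd.mapPartitions(fetch_partition))

df.createOrReplaceTempView("api_data")  # work with it in-session; nothing written to storage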

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @James Held, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
Mado
by Valued Contributor II
  • 961 Views
  • 2 replies
  • 3 kudos

What is the default location when using "writeStream"?

Hi, assume that I want to write a table with "writeStream". Where is the default location on DBFS where the table is saved? Sample code: spark.table("TEMP_SILVER").writeStream .option("checkpointLocation", "dbfs:/user/AAA@gmail.com") ...
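For reference, a hedged sketch of the two common cases is below. With a table sink (.toTable) and no explicit path, the table is created as a managed table, which for the default Hive metastore typically lands under dbfs:/user/hive/warehouse/; writing to an explicit path puts the Delta files exactly where you point them. Table, checkpoint, and path names below are placeholders.

# Hedged sketch: where the streamed data lands. Names/paths are placeholders.

# 1) Table sink with no path: a managed table, typically stored under the
#    metastore root (e.g. dbfs:/user/hive/warehouse/temp_silver_out for the
#    default Hive metastore).
q1 = (
    spark.table("TEMP_SILVER").writeStream
    .format("delta")
    .option("checkpointLocation", "dbfs:/checkpoints/temp_silver_table")
    .toTable("temp_silver_out")
)

# 2) Path sink: the Delta files go exactly to the path you supply.
q2 = (
    spark.table("TEMP_SILVER").writeStream
    .format("delta")
    .option("checkpointLocation", "dbfs:/checkpoints/temp_silver_path")
    .start("dbfs:/mnt/silver/temp_silver_out")
)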

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Mohammad Saber, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
AkilK
by New Contributor II
  • 677 Views
  • 2 replies
  • 3 kudos

Community Edition workspace password reset issue

I am not able to reset my Community Edition workspace password. It keeps processing and the password does not get reset.

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Akil Kapasi, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
logan0015
by Contributor
  • 753 Views
  • 1 reply
  • 3 kudos

How to move the "__apply_changes_storage_mytablename" table when creating a streaming live table?

As the title suggests, whenever I create a streaming live table, it creates an __apply_changes_storage_mytablename table in the database on Databricks. Is there a way to specify a different cloud location for these files?

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Logan Nicol, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, Bricksters will get back to you soon. Thanks!

farefin
by New Contributor II
  • 2063 Views
  • 2 replies
  • 5 kudos

Need help with PySpark code in Databricks to calculate a new measure column.

Details of the requirement are as below: I have a table with the structure shown below. I have to write PySpark code to calculate a new column. The logic for the new column is the sum of Magnitude for each Category divided by the total Magnitude, and it should b...

Sample Data
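A hedged sketch of that logic with window functions is below; the column names Category and Magnitude come from the question, while the sample rows are made up rather than taken from the attached sample data.

# Hedged sketch: sum of Magnitude per Category divided by the total Magnitude,
# computed with window functions. Sample rows are made up.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [("A", 10.0), ("A", 20.0), ("B", 30.0), ("C", 40.0)],
    ["Category", "Magnitude"],
)

category_total = F.sum("Magnitude").over(Window.partitionBy("Category"))
grand_total = F.sum("Magnitude").over(Window.partitionBy())  # whole table as one partition

result = df.withColumn("measure", category_total / grand_total)
result.show()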
Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Faizan Arefin, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
tum
by New Contributor II
  • 2753 Views
  • 3 replies
  • 4 kudos

Create new job API error "MALFORMED_REQUEST"

Hi, I'm trying to test the create-a-new-job API (v2.1) with Python, but I got this error: { 'error_code': 'MALFORMED_REQUEST', 'message': 'Invalid JSON given in the body of the request - expected a map' }. How do I validate the JSON body before posting? This is my js...
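A hedged sketch of one way to check the body and submit it is below: build the payload as a Python dict, round-trip it through json to confirm it is a valid JSON map, and let requests serialize it via json= rather than hand-building the string. The workspace URL, token, cluster id, and notebook path are placeholders.

# Hedged sketch: validate a Jobs 2.1 create payload and post it.
# Workspace URL, token, cluster id, and notebook path are placeholders.
import json
import requests

payload = {
    "name": "my-test-job",
    "tasks": [
        {
            "task_key": "main",
            "existing_cluster_id": "0123-456789-abcde123",
            "notebook_task": {"notebook_path": "/Users/me@example.com/my_notebook"},
        }
    ],
}

# Round-trip through json: the body must be valid JSON and a map (dict),
# which is exactly what the "expected a map" error is complaining about.
assert isinstance(json.loads(json.dumps(payload)), dict)

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json=payload,   # requests serializes the dict and sets Content-Type
    timeout=30,
)
print(resp.status_code, resp.json())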

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @tum m, hope all is well! Just wanted to check in: were you able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies