Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

thiagoawstest
by Contributor
  • 3551 Views
  • 1 reply
  • 0 kudos

Resolved! Mount S3 bucket

Hi, I have Databricks configured on AWS and need to mount some S3 buckets on Databricks under /mnt, but I have some questions: How can a bucket be mounted so that all clusters and users have access to it, without needing to mount it every time the cluster...

Latest Reply
Yeshwanth
Databricks Employee
  • 0 kudos

@thiagoawstest To mount an S3 bucket in Databricks on AWS so that all clusters and users have access to it without needing to remount each time, and without creating an access key in AWS, follow these steps: Mounting an S3 Bucket Using an AWS Instan...
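A minimal sketch of the mount step the reply describes, assuming the cluster already runs with an IAM instance profile that can read the bucket; the bucket and mount-point names below are hypothetical placeholders:

bucket_name = "my-example-bucket"   # hypothetical bucket
mount_point = f"/mnt/{bucket_name}"

# Mounts are stored at the workspace level, so every cluster and user
# sees them without re-running this cell.
if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.mount(f"s3a://{bucket_name}", mount_point)

display(dbutils.fs.ls(mount_point))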

saikumar_ganji
by New Contributor III
  • 3441 Views
  • 7 replies
  • 0 kudos

DATABRICKS DATA ENGINEER ASSOCIATE EXAM GOT SUSPENDED

I had a pathetic experience while attempting my Databricks certification. The proctor abruptly asked me to show my desk; after I showed it, he/she asked multiple times and then suspended my exam, saying I had exceeded eye movement, and I had almost comple...

Latest Reply
saikumar_ganji
New Contributor III
  • 0 kudos

@Cert-Team @Cert-TeamOPS @Retired_mod Can you please look into this issue? I have to complete my exam ASAP.

6 More Replies
Karlo_Kotarac
by New Contributor III
  • 4626 Views
  • 4 replies
  • 0 kudos

Run failed with error message ContextNotFound

Hi all! Recently we've been getting lots of these errors when running Databricks notebooks. At that time we observed a DRIVER_NOT_RESPONDING (Driver is up but is not responsive, likely due to GC.) log on the single-user cluster we use. Previously when thi...

Latest Reply
Karlo_Kotarac
New Contributor III
  • 0 kudos

In case somebody else runs into the same issue: after an investigation, Databricks support concluded that the driver's memory was overloaded (the 'Driver Not Responding' error message in the event log), but it can happen that we don't get the co...

3 More Replies
Lily99
by New Contributor
  • 1645 Views
  • 1 reply
  • 0 kudos

SQL function does not work in 'Create Function'

This SQL statement works fine by itself:

SELECT COUNT(1) FROM tablea f INNER JOIN tableb t ON lower(f.col1) = t.col1

but if I want to use it inside a function:

CREATE OR REPLACE FUNCTION fn_abc(var1 ...

Latest Reply
lucasrocha
Databricks Employee
  • 0 kudos

Hello @Lily99 , I hope this message finds you well. Could you please try the code below and let me know the results?

CREATE OR REPLACE FUNCTION fn_abc(var1 STRING, var2 STRING)
RETURNS DOUBLE
COMMENT 'test function'
RETURN SELECT
    CASE
    WHEN EXISTS...
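For reference, a hedged completion of the truncated function above, run from Python via spark.sql; the table tableb and the CASE logic are hypothetical stand-ins for whatever the original body contained:

# Hypothetical completion of the truncated SQL UDF above.
spark.sql("""
    CREATE OR REPLACE FUNCTION fn_abc(var1 STRING, var2 STRING)
    RETURNS DOUBLE
    COMMENT 'test function'
    RETURN SELECT
        CASE
            WHEN EXISTS (SELECT 1 FROM tableb t WHERE t.col1 = lower(var1))
            THEN 1.0
            ELSE 0.0
        END
""")

# Once created, the function is called like any scalar expression:
spark.sql("SELECT fn_abc('ABC', 'xyz')").show()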

Yash_542965
by New Contributor II
  • 1854 Views
  • 1 reply
  • 0 kudos

DLT aggregation problem

I'm utilizing SQL to perform aggregation operations within the gold layer of a DLT pipeline. However, I'm encountering an error when running the pipeline while attempting to return a data frame using spark.sql. Could anyone please assist me with the SQL...

Latest Reply
lucasrocha
Databricks Employee
  • 0 kudos

Hello @Yash_542965 , I hope this message finds you well. Could you please share a sample of the code you are using so that we can check it further? Best regards, Lucas Rocha
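For illustration, a minimal sketch of the kind of gold-layer aggregation the question describes, assuming a hypothetical upstream table silver_orders with order_date and amount columns; inside a pipeline, upstream tables are read with dlt.read and the function returns the DataFrame rather than running spark.sql over unresolved names:

import dlt
from pyspark.sql import functions as F

@dlt.table(name="gold_orders_daily", comment="Daily order totals (hypothetical example)")
def gold_orders_daily():
    return (
        dlt.read("silver_orders")            # hypothetical silver table
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"))
    )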

vijaykumarbotla
by New Contributor III
  • 2021 Views
  • 1 reply
  • 0 kudos

Databricks Notebook error : Analysis Exception with multiple datasets

Hi All, I am getting the below error when trying to execute the code: AnalysisException: Column Is There a PO#17748 are ambiguous. It's probably because you joined several Datasets together, and some of these Datasets are the same. This column points to ...

Latest Reply
lucasrocha
Databricks Employee
  • 0 kudos

Hello @vijaykumarbotla , I hope you're doing well. This is probably because both DataFrames contain a column with the same name, and Spark is unable to determine which one you are referring to in the select statement. To resolve this issue, you can u...
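A minimal sketch of that alias-based disambiguation, using hypothetical DataFrames that share the column name from the error message:

from pyspark.sql import functions as F

df1 = spark.range(3).withColumn("Is There a PO", F.lit("yes"))   # hypothetical
df2 = spark.range(3).withColumn("Is There a PO", F.lit("no"))    # hypothetical

# Alias each side of the join, then qualify the column through its alias
# (backticks because the name contains spaces):
joined = df1.alias("a").join(df2.alias("b"), on="id")
joined.select(F.col("a.`Is There a PO`"), F.col("b.`Is There a PO`")).show()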

User16752244127
by Contributor
  • 1379 Views
  • 1 reply
  • 0 kudos
Latest Reply
lucasrocha
Databricks Employee
  • 0 kudos

Hello @User16752244127 , I hope this message finds you well. Delta Live Tables supports loading data from any data source supported by Databricks. You can find the supported data sources here: Connect to data sources; JDBC is one of them. You can a...
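A minimal sketch of a DLT table fed from a JDBC source along the lines of the reply; the URL, table name, and secret scope/keys are hypothetical placeholders:

import dlt

@dlt.table(name="bronze_customers")
def bronze_customers():
    return (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://db.example.com:5432/shop")    # hypothetical
        .option("dbtable", "public.customers")                          # hypothetical
        .option("user", dbutils.secrets.get("my_scope", "db_user"))
        .option("password", dbutils.secrets.get("my_scope", "db_pass"))
        .load()
    )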

Sambit_S
by New Contributor III
  • 1090 Views
  • 1 reply
  • 0 kudos

Exceptions are Not Getting Handled In Autoloader Write Stream

I have the below logic implemented using Databricks Autoloader.

## Autoloader write stream: it calls the forEachBatch function to write into the respective datatype catalog table
## and uses a checkpoint to keep track of processed files.
try:
    ## Observe raw ...

Latest Reply
raphaelblg
Databricks Employee
  • 0 kudos

Hello @Sambit_S , in your scenario there is a merge failure. Your query won't be able to progress because the problematic batch can't be committed to the sink. Even if you handle the exception in a try/catch block, it's impossible for the autoloader to update...
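A hedged sketch of the consequence described: since a failing micro-batch can never be committed, error handling has to live inside the foreachBatch function itself (for example by diverting the bad batch) rather than in an outer try/except around the stream. Paths and table names are hypothetical:

def write_batch(batch_df, batch_id):
    try:
        batch_df.write.format("delta").mode("append").saveAsTable("main.bronze.events")
    except Exception as err:
        # Quarantine the failing batch so the stream can keep progressing.
        print(f"Batch {batch_id} failed: {err}")
        batch_df.write.format("delta").mode("append").saveAsTable("main.bronze.events_quarantine")

(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .load("s3://my-bucket/raw/")                                   # hypothetical
    .writeStream
    .foreachBatch(write_batch)
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/")  # hypothetical
    .start()
)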

tgen
by New Contributor II
  • 2404 Views
  • 1 reply
  • 0 kudos

Increase stack size in Databricks

Hi everyone, I'm currently running a shell script in a notebook and encountering a segmentation fault due to the stack size limitation. I'd like to increase the stack size using ulimit -s unlimited, but I'm facing issues with setting this...
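A hedged sketch, not a verified fix: a notebook %sh cell runs in its own subshell, so a ulimit set there does not reach the driver process. One commonly suggested approach is a cluster-scoped init script that raises the limit system-wide before Spark starts; the script path is hypothetical, and whether the new limit reaches the Spark daemons depends on how the cluster launches them:

init_script = """#!/bin/bash
# Raise the stack limit for subsequently started processes (hypothetical approach).
echo "* soft stack unlimited" >> /etc/security/limits.conf
echo "* hard stack unlimited" >> /etc/security/limits.conf
"""
dbutils.fs.put("dbfs:/init-scripts/raise-stack-limit.sh", init_script, overwrite=True)
# Then attach the script under the cluster's Advanced options > Init Scripts and restart.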

satishnavik
by New Contributor II
  • 12366 Views
  • 5 replies
  • 0 kudos

How to connect a Databricks database with a Spring Boot application using JPA

We are facing an issue integrating our Spring Boot JPA application with Databricks. Below are the steps and settings we used for the integration. When we start the Spring Boot application, we get a warning: HikariPool-1 - Driver doe...

Latest Reply
172036
New Contributor II
  • 0 kudos

Was there any resolution to this? Is the Spring datasource supported now?

4 More Replies
djburnham
by New Contributor III
  • 4688 Views
  • 2 replies
  • 1 kudos

Resolved! How to get a list of workspace users who have the "unrestricted cluster create" entitlement?

Hello - I'm hoping somebody can help me with this ... I have a lot of users configured with access to a workspace (hundreds), and I want to write a report to see whether any of them have the "unrestricted cluster create" entitlement in the workspace. This i...

Latest Reply
djburnham
New Contributor III
  • 1 kudos

Many thanks for your help @Yeshwanth , it put me on the right track. The API does have a filter option that looks like it complies with RFC 7644, but my attempts to use it were rather hit and miss - I suspect that, as the API is in preview, it is not fully imp...
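A minimal sketch of the report this thread discusses, filtering users client-side instead of relying on the preview filter parameter; the host and token are placeholders, and pagination (startIndex/count) is omitted for brevity:

import requests

HOST = "https://<workspace-url>"     # hypothetical
TOKEN = "<personal-access-token>"    # hypothetical

resp = requests.get(
    f"{HOST}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"attributes": "userName,entitlements"},
)
resp.raise_for_status()

for user in resp.json().get("Resources", []):
    entitlements = {e.get("value") for e in (user.get("entitlements") or [])}
    if "allow-cluster-create" in entitlements:   # the "unrestricted cluster create" entitlement
        print(user.get("userName"))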

1 More Replies
Anonymous
by Not applicable
  • 9141 Views
  • 11 replies
  • 2 kudos

SQL Serverless option is missing when using Azure Databricks Workspace with No Public IP and VNET Injection

Hello, after creating a Databricks workspace in Azure with No Public IP and VNet injection, I'm unable to use DBSQL Serverless because the option to enable it in the SQL warehouse settings is missing. Is it by design? Is it a limitation when using Privat...

Latest Reply
RomanLegion
New Contributor III
  • 2 kudos

Fixed: go to Profile -> Compute -> SQL Server Serverless -> On -> Save. For some reason this had been disabled for us.

10 More Replies
jenshumrich
by Contributor
  • 4394 Views
  • 1 reply
  • 0 kudos

Resolved! R install - cannot open URL

Neither the standard nor a non-standard repo seems to be available. Any idea how to debug/fix this?

%r
install.packages("gghighlight", lib="/databricks/spark/R/lib", repos = "http://cran.us.r-project.org")

Warning: unable to access index for repository http://cra...

Latest Reply
jenshumrich
Contributor
  • 0 kudos

%sh
nc -zv cran.us.r-project.org 80

It was a network issue. The line above proved it, and the network administrators had to open the IPs.

BobBubble2000
by New Contributor II
  • 4822 Views
  • 4 replies
  • 0 kudos

Delta Live Tables with Common Data Model as source

Hi, I'm investigating whether it's possible to use the Common Data Model (CDM) (in particular the Dynamics 365 exported csv and cdm files) as a Delta Live Tables data source. Can someone point me in the right direction? Thanks!

Latest Reply
Suryanarayan
New Contributor II
  • 0 kudos

Using Delta Live Tables with Common Data Model (CDM) as a Source in Databricks: I'm investigating the use of Delta Live Tables (DLT) to process Common Data Model (CDM) files exported from Dynamics 365, and I found a solution that works well. Here's a q...
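A hedged sketch of one way to land such exports with Auto Loader inside DLT, assuming a hypothetical ADLS path; CDM CSV exports carry no header row, so in practice the schema comes from the accompanying model.json/manifest rather than being inferred as below:

import dlt

@dlt.table(name="bronze_accounts")
def bronze_accounts():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "false")   # CDM exports ship the schema separately
        .load("abfss://cdm@mystorage.dfs.core.windows.net/accounts/")   # hypothetical
    )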

3 More Replies
Jackson1111
by New Contributor III
  • 1065 Views
  • 3 replies
  • 1 kudos

Get job detail API

Hello, is there an API interface for passing in batches of run_id values to obtain job run details?

Latest Reply
mhiltner
Databricks Employee
  • 1 kudos

Maybe this could help. It's not batch, but you can get the details for a run_id: https://docs.databricks.com/en/workflows/jobs/jobs-2.0-api.html#runs-get-output
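A minimal sketch of looping that lookup over a list of run IDs with the Jobs 2.1 runs/get endpoint (one call per run_id, since no bulk variant is documented); the host, token, and IDs are placeholders:

import requests

HOST = "https://<workspace-url>"    # hypothetical
TOKEN = "<personal-access-token>"   # hypothetical
run_ids = [1001, 1002, 1003]        # hypothetical

for run_id in run_ids:
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"run_id": run_id},
    )
    resp.raise_for_status()
    state = resp.json().get("state", {})
    print(run_id, state.get("result_state"))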

2 More Replies
