Data Engineering

Forum Posts

HelloDatabricks
by Visitor
  • 269 Views
  • 5 replies
  • 7 kudos

Connect Timeout - Error when trying to run a cell

Hello everybody. Whenever I try to run a simple cell I now receive the following error message: Notebook detached. Exception when creating execution context: java.net.SocketTimeoutException: Connect Timeout. After that error message the cluster ...

Latest Reply
MarijaS
Visitor
  • 7 kudos

today is ok

4 More Replies
Geoff123
by New Contributor
  • 90 Views
  • 7 replies
  • 0 kudos

Trouble on Accessing Azure Storage from Databricks (Python)

I used the same access method shown in https://community.databricks.com/t5/data-engineering/to-read-data-from-azure-storage/td-p/32230 but kept getting the error below: org.apache.spark.SparkSecurityException: [INSUFFICIENT_PERMISSIONS] Insufficient pr...

Latest Reply
Wojciech_BUK
Contributor III
  • 0 kudos

Hi, you can find the storage account firewall information by opening the resource in the Azure portal. Please note that if you are using Unity Catalog you should NOT mount the storage account; you should instead use the abstraction of Storage Credentials and External Lo...
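
For context, a minimal sketch of the pattern described above, assuming a Databricks notebook where spark is defined and a storage credential named my_azure_cred already exists; the location, storage account, container and group names are all placeholders:

# Create a governed external location on top of the Azure container
# (requires the pre-existing storage credential my_azure_cred).
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS landing_zone_loc
    URL 'abfss://landingzone@mystorageaccount.dfs.core.windows.net/Inbound'
    WITH (STORAGE CREDENTIAL my_azure_cred)
""")

# Grant read access to the group that runs the notebooks (placeholder group name).
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION landing_zone_loc TO `data_engineers`")

# Read directly from the governed abfss:// path -- no dbfs:/mnt mount involved.
df = spark.read.format("csv").option("header", "true").load(
    "abfss://landingzone@mystorageaccount.dfs.core.windows.net/Inbound/"
)
df.show(5)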

6 More Replies
Meshynix
by New Contributor II
  • 95 Views
  • 1 reply
  • 0 kudos

Not able to create external table in a schema under a Catalog.

Problem Statement: Cluster 1 (Shared Cluster) is not able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/" and hence we are not able to create an external table in a schema inside the Enterprise Catalog. Cluster 2 (No Isola...

Latest Reply
Meshynix
New Contributor II
  • 0 kudos

Hi @Kaniz, wondering if you could please advise a solution. Thanks heaps in advance.

AxelBrsn
by New Contributor II
  • 40 Views
  • 1 reply
  • 0 kudos

Use DLT from another pipeline

Hello, I have a question. Context: I have a Unity Catalog organized with three schemas (bronze, silver and gold). Logically, I would like to create tables in each schema. I tried to organize my pipelines around the layers, which means that I would like to ...

Latest Reply
standup1
Visitor
  • 0 kudos

To my knowledge, and as of today, DLT does not support multiple schemas, and you can't cross from one pipeline to another using "live.table". However, a live table is just a materialized view. You can change your script to create a materialized view instead o...
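
To illustrate the workaround hinted at above, a minimal sketch assuming Unity Catalog-enabled DLT pipelines; the three-level table name main.bronze.orders_raw and the column names are placeholders. The downstream pipeline reads the upstream pipeline's output by its fully qualified name instead of via live.*:

import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="orders_silver",
    comment="Silver table built from a bronze table produced by another pipeline",
)
def orders_silver():
    # Reference the upstream table by its full catalog.schema.table name
    # (placeholder) rather than dlt.read()/LIVE, since it is materialized
    # by a different pipeline.
    return (
        spark.read.table("main.bronze.orders_raw")
        .where(F.col("order_id").isNotNull())
    )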

satishnavik
by New Contributor II
  • 530 Views
  • 3 replies
  • 0 kudos

How to connect Databricks Database with Springboot application using JPA

We are facing an issue integrating our Spring Boot JPA-based application with Databricks. Below are the steps and settings we used for the integration. When we start the Spring Boot application we get the warning: HikariPool-1 - Driver doe...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @satishnavik, It seems you’re encountering issues while integrating your Spring Boot JPA application with Databricks. Let’s address the warnings and exceptions you’re facing. Warning: Driver Does Not Support Network Timeout for Connections The...
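
Not the Spring Boot fix itself, but a sanity check that is often worth running before debugging the JPA layer: a minimal sketch with the databricks-sql-connector package (pip install databricks-sql-connector) to confirm that the hostname, HTTP path and token work outside HikariCP. All three values below are placeholders.

from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace host
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder warehouse HTTP path
    access_token="dapi-your-token-here",                           # placeholder personal access token
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())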

2 More Replies
RajNath
by New Contributor II
  • 56 Views
  • 2 replies
  • 0 kudos

Traversing to previous rows and getting the data based on condition

Sample input data set:

ClusterId         | Event       | EventTime
1212-18-r9u1kzn1  | RUNNING     | 2024-02-02T11:38:30.168+00:00
1212-18-r9u1kzn1  | TERMINATING | 2024-02-02T13:43:33.933+00:00
1212-18-r9u1kzn1  | STARTING    | 2024-02-02T15:50:05.174+00:00
1212-18-r9u1kzn1  | RUNNING     | 2024-02-02T15:54:21.51...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @RajNath , Handling event times and aggregations in large datasets can be challenging, but Structured Streaming in Databricks provides powerful tools to address this. Let’s break down your requirements and explore how you can achieve them: Ru...
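
As a concrete illustration of the look-back part of the question (a sketch, not taken from this thread), the following PySpark snippet carries forward, for each event, the timestamp of the most recent preceding RUNNING event per cluster; the rows mirror the sample data above:

from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.getOrCreate()

events = spark.createDataFrame(
    [
        ("1212-18-r9u1kzn1", "RUNNING",     "2024-02-02T11:38:30.168+00:00"),
        ("1212-18-r9u1kzn1", "TERMINATING", "2024-02-02T13:43:33.933+00:00"),
        ("1212-18-r9u1kzn1", "STARTING",    "2024-02-02T15:50:05.174+00:00"),
    ],
    ["ClusterId", "Event", "EventTime"],
).withColumn("EventTime", F.to_timestamp("EventTime"))

# Running window per cluster, ordered by event time, looking back to the start.
w = (
    Window.partitionBy("ClusterId")
    .orderBy("EventTime")
    .rowsBetween(Window.unboundedPreceding, Window.currentRow)
)

# For every row, pick up the last non-null RUNNING timestamp seen so far,
# e.g. to compute how long a cluster ran before a TERMINATING event.
result = events.withColumn(
    "last_running_time",
    F.last(F.when(F.col("Event") == "RUNNING", F.col("EventTime")), ignorenulls=True).over(w),
)
result.show(truncate=False)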

1 More Replies
RajNath
by New Contributor II
  • 341 Views
  • 2 replies
  • 0 kudos

Cost of using delta sharing with unity catalog

I am new to Databricks Delta Sharing. In the case of Delta Sharing, I don't see any cluster running. I tried looking for documentation, but the only hint I got is that it uses a Delta Sharing server. What is its cost, and how do I configure and optimize it for la...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @RajNath, Let’s dive into the world of Delta Sharing and explore how it works, its cost implications, and optimization strategies. What is Delta Sharing? Delta Sharing is a secure data-sharing platform developed by Databricks. It allows you to ...
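
As a small recipient-side illustration (a sketch, not from this thread), using the open-source delta-sharing Python client (pip install delta-sharing); the profile file path and the share/schema/table names are placeholders supplied by the data provider:

import delta_sharing

# Credentials/profile file downloaded from the data provider (placeholder path).
profile = "/dbfs/tmp/my_provider.share"

# Discover which tables the share exposes.
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())

# Load one shared table into pandas: "<profile>#<share>.<schema>.<table>".
table_url = f"{profile}#my_share.my_schema.my_table"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())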

1 More Replies
Cheryl
by Visitor
  • 68 Views
  • 2 replies
  • 0 kudos

Query example for databricks Query History API

Hi, I am trying to get query history data from my SQL warehouse. Following previous examples is not working.

databricks_workspace_url = "xxx"
token = "xxx"
start_time = 1707091200
end_time = 1707174000
api_endpoint = f"{databricks_workspace_url}/api/2.0/s...

Latest Reply
shan_chandra
Honored Contributor III
  • 0 kudos

@Cheryl - you can use query_start_time=2023-01-01T00:00:00Z as a parameter to filter for the time frame. The available filter criteria are listed at https://docs.databricks.com/api/workspace/queryhistory/list#filter_by-query_start_time_range
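
Putting the filter together, a rough sketch of the call with the Python requests library; the workspace URL and token are placeholders, the filter_by payload shape matches the example later in this page, and the field names follow the linked API reference:

import requests

workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "dapi-your-token-here"                                         # placeholder

resp = requests.get(
    f"{workspace_url}/api/2.0/sql/history/queries",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "filter_by": {
            "query_start_time_range": {
                "start_time_ms": 1707091200000,  # 2024-02-05 00:00:00 UTC
                "end_time_ms": 1707174000000,    # 2024-02-05 23:00:00 UTC
            }
        },
        "max_results": 100,
    },
    timeout=30,
)
resp.raise_for_status()
for q in resp.json().get("res", []):
    print(q.get("query_id"), q.get("status"))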

1 More Replies
Anonymous
by Not applicable
  • 924 Views
  • 3 replies
  • 3 kudos

Resolved! 6.4 Extended Support (includes Apache Spark 2.4.5, Scala 2.11) - Connect Timeout

"Notebook detached Exception when creating execution context: java.net.SocketTimeout Exception: Connect Timeout" when trying to connect my cluster to a notebook. Then "Error trying to handle that request We failed to handle that request, please try a...

Latest Reply
Wolverine
Visitor
  • 3 kudos

Hello @Kaniz, I am facing the same issue. I tried changing the DBR but it is still giving me the error and the cluster is not starting. Regards, MS

2 More Replies
dg
by New Contributor II
  • 6498 Views
  • 7 replies
  • 0 kudos

Trying to use pdf2image on databricks

Trying to use pdf2image on Databricks, but it's failing with "PDFInfoNotInstalledError: Unable to get page count. Is poppler installed and in PATH?" I've installed pdf2image & poppler-utils by running the following in a cell:
%pip install pdf2image
%pip ...

Latest Reply
Slalom_Tobias
New Contributor III
  • 0 kudos

Seems like this thread has died, but for posterity, Databricks provides the following code for installing poppler on a cluster. The code is sourced from the dbdemos accelerators, specifically the "LLM Chatbot With Retrieval Augmented Generation (RAG)...
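
For posterity as well, a rough sketch of the usual pattern (not the exact dbdemos snippet): install the OS-level poppler binaries from a shell cell, install the Python wrapper with pip, restart Python, then convert; the PDF path is a placeholder.

# Cell 1 -- install the system dependency that provides pdftoppm/pdfinfo:
# %sh sudo apt-get update -y && sudo apt-get install -y poppler-utils

# Cell 2 -- install the Python wrapper and restart the Python process:
# %pip install pdf2image
# dbutils.library.restartPython()

# Cell 3 -- render a PDF into page images:
from pdf2image import convert_from_path

pages = convert_from_path("/dbfs/tmp/sample.pdf", dpi=150)  # placeholder path
print(f"Rendered {len(pages)} page image(s)")

Note that a %sh install only affects the driver node; if workers also need poppler (for example when converting inside a UDF), a cluster init script is the more reliable place for the apt-get step.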

6 More Replies
Ravikumashi
by New Contributor III
  • 2803 Views
  • 8 replies
  • 0 kudos

failed to initialise azure-event-hub with azure AAD(service principal)

We have been trying to authenticate azure-event-hub with Azure AD (service principal) instead of a shared access key (connection string) and read events from azure-event-hub, but it fails to initialise azure-event-hubs and throws a no such method ex...

Latest Reply
Ravikumashi
New Contributor III
  • 0 kudos

@swathi-dataops I have added ServicePrincipalCredentialsAuth and ServicePrincipalAuthBase as normal classes instead of creating a separate jar for these two classes, and packaged them as part of my project jar. I then used the below code for configuring...

7 More Replies
Constantine
by Contributor III
  • 2241 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
MorpheusGoGo
New Contributor II
  • 1 kudos

Are you sure this works?

payload = { "filter_by": {}, "max_results": 1 }

Returns 1 result.

payload = {
  "filter_by": {
    "query_start_time_range": {
      "start_time_ms": 1640995200000,
      "end_time_ms": 1641081599000
    }
  },
  "max_results": 1...

4 More Replies