Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

vamsivarun007
by New Contributor II
  • 47373 Views
  • 5 replies
  • 3 kudos

Driver is up but is not responsive, likely due to GC.

Hi all, "Driver is up but is not responsive, likely due to GC." This is the message in cluster event logs. Can anyone help me with this. What does GC means? Garbage collection? Can we control it externally?

Latest Reply
jacovangelder
Databricks MVP
  • 3 kudos

9/10 times GC is due to out-of-memory exceptions. @Jaron spark.catalog.clearCache() is not a configurable option, but rather a command to submit.
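As a sketch of that reply, here is how one might submit the cache-clearing command from a notebook to ease the memory pressure behind long GC pauses; `spark` is the session Databricks notebooks provide automatically, and the commented `unpersist` line assumes a DataFrame you cached yourself.

```python
# Minimal sketch: release cached tables/DataFrames to ease the memory pressure
# that typically causes long driver GC pauses. `spark` is the SparkSession that
# Databricks notebooks provide automatically.
spark.catalog.clearCache()   # drops every cached table/view in this session

# To release a single DataFrame instead (assumes you cached `df` earlier):
# df.unpersist()
```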

4 More Replies
mysteryuser000
by New Contributor
  • 884 Views
  • 0 replies
  • 0 kudos

DLT pipeline will not create live tables

I have created a DLT pipeline based on four SQL notebooks, each containing between 1 and 3 queries. Each query begins with "create or refresh live table ..." yet each one outputs a materialized view. I have tried deleting the materialized views and ru...
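For context: in recent Delta Live Tables releases, CREATE OR REFRESH LIVE TABLE is documented as a legacy alias whose result is surfaced as a materialized view, so this output is expected behavior rather than a failure. Below is a hypothetical sketch of the equivalent Python declaration; the table and source names are placeholders.

```python
# Hypothetical Python equivalent of "CREATE OR REFRESH LIVE TABLE ...".
# In current DLT releases the LIVE TABLE syntax is a legacy alias, and the
# resulting object shows up as a materialized view by design.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="orders_clean", comment="Declared like a live table in SQL")
def orders_clean():
    # "orders_raw" is a placeholder source table for illustration
    return spark.read.table("orders_raw").where(F.col("status").isNotNull())
```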

arthurandraderj
by New Contributor
  • 569 Views
  • 0 replies
  • 0 kudos

Error truncating #REF with spark.read

Hello guys, I am trying to read an Excel file and even using PERMISSIVE mode, it's truncating the records that contain #REF in any column. Can anyone please help me with that? schema = StructType([\        StructField('Col1', DateType(), True), \ <-------...
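A hedged sketch of one way to read the file, assuming the third-party spark-excel connector (com.crealytics:spark-excel) is installed on the cluster; the path and column names are placeholders. Reading the affected columns as StringType first and casting later is one way to keep cells like #REF from being dropped by a stricter type.

```python
# Sketch, assuming the third-party spark-excel connector is installed on the
# cluster; file path and column names are placeholders.
from pyspark.sql.types import StructType, StructField, StringType

# Read the problem columns as strings first so cells containing "#REF" survive;
# cast to stricter types (dates, numbers) in a later step.
schema = StructType([
    StructField("Col1", StringType(), True),
    StructField("Col2", StringType(), True),
])

df = (spark.read.format("com.crealytics.spark.excel")
      .option("header", "true")
      .schema(schema)
      .load("/Volumes/main/default/raw/report.xlsx"))
```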

dpc
by Contributor
  • 1667 Views
  • 2 replies
  • 3 kudos

Resolved! Data prefixed by ' > '

Hi, I have a Databricks table that has a column with a string datatype. When loading data, certain rows are prefixed by ' > '. Does anybody know what would cause this? It seems to happen when the string is above a certain number of characters (around 200). Thanks

Latest Reply
Jag
New Contributor III
  • 3 kudos

It looks like the default string data length was exceeded, and because of that it's showing this kind of symbol and an extra new line in the column.
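A small sketch of how one might confirm the pattern and strip the prefix; "my_table" and "col_text" are placeholder names.

```python
# Sketch: find long strings carrying the ' > ' prefix, then strip it with a regex;
# "my_table" and "col_text" are placeholder names.
from pyspark.sql import functions as F

df = spark.table("my_table")
df.filter(F.col("col_text").startswith(" > ") & (F.length("col_text") > 200)).show(5)

cleaned = df.withColumn("col_text", F.regexp_replace("col_text", r"^\s*>\s*", ""))
```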

1 More Replies
WTW-DBrat
by New Contributor II
  • 1669 Views
  • 2 replies
  • 0 kudos

Service principal’s Microsoft Entra ID access token returns 400 when calling Databricks REST API

I'm using the following to call a Databricks REST API. When I use a PAT for access_token, everything works fine. When I use a Microsoft Entra ID access token, the response returns 400. The service principal has access to the workspace and is part of ...

Latest Reply
Jag
New Contributor III
  • 0 kudos

Hello, try printing the response and see whether the access_token appears in the payload; otherwise it looks like an access issue. Try going to the workspace settings and granting token access permission to the service principal. Workspace > Settings
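A hedged sketch of acquiring the token with the msal package; tenant, client, and workspace values are placeholders. The scope below uses the documented Azure Databricks resource ID with the /.default suffix.

```python
# Sketch: acquire a Microsoft Entra ID token for a service principal and call a
# Databricks REST API with it. Tenant/client IDs and the workspace URL are
# placeholders; the scope is the documented Azure Databricks resource ID.
import msal
import requests

app = msal.ConfidentialClientApplication(
    client_id="<sp-client-id>",
    client_credential="<sp-client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
result = app.acquire_token_for_client(
    scopes=["2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"]
)
token = result["access_token"]  # inspect `result` if this key is missing

resp = requests.get(
    "https://<workspace-url>/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code, resp.text)
```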

1 More Replies
isaac_gritz
by Databricks Employee
  • 6205 Views
  • 5 replies
  • 5 kudos

SQL IDE Support

How to use a SQL IDE with Databricks SQL: Databricks provides SQL IDE support using DataGrip and DBeaver with Databricks SQL. Let us know in the comments if you've used DataGrip or DBeaver with Databricks! Let us know if there are any other SQL IDEs you...
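Not an IDE, but as a quick sanity check of the same SQL endpoint an IDE would connect to, here is a sketch using the databricks-sql-connector package; the hostname, HTTP path, and token are placeholders, and they are the same details DataGrip or DBeaver ask for.

```python
# Sketch: connect to a Databricks SQL warehouse with databricks-sql-connector
# (pip install databricks-sql-connector); connection details are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())
```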

Latest Reply
Jag
New Contributor III
  • 5 kudos

DBeaver is working perfectly fine, but I found one issue: it won't show the correct error for a query.

4 More Replies
dataengutility
by New Contributor III
  • 5224 Views
  • 4 replies
  • 1 kudos

Resolved! Yml file replacing job cluster with all-purpose cluster when running a workflow

Hi all, I have been having some trouble running a workflow that consists of 3 tasks that run sequentially. Task1 runs on an all-purpose cluster and kicks off Task2, which needs to run on a job cluster. Task2 kicks off Task3, which also uses a job cluster...

Latest Reply
jacovangelder
Databricks MVP
  • 1 kudos

I don't know if you've cut off your YAML snippet, but your snippet doesn't show your job cluster with key job-cluster. Just to validate, is your job cluster also defined in your workflow YAML? Edit: Looking at it again and knowing the defaults, it loo...

3 More Replies
tramtran
by Contributor
  • 13923 Views
  • 7 replies
  • 0 kudos

Resolved! How to import a function to another notebook?

Could you please provide guidance on the correct way to dynamically import a Python module from a user-specific path in Databricks Repos? Any advice on resolving the ModuleNotFoundError would be greatly appreciated. udf_check_table_exists notebook: fro...
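A common fix is appending the repo root to sys.path so a plain import resolves. A minimal sketch follows; the repo path, module, and function names are hypothetical placeholders.

```python
# Sketch for a Databricks Repos layout; the repo path and names are placeholders.
# Appending the folder that contains the module to sys.path lets a plain
# `import` succeed instead of raising ModuleNotFoundError.
import sys

sys.path.append("/Workspace/Repos/<user>/<repo>")

from udf_check_table_exists import check_table_exists  # hypothetical function
print(check_table_exists("main.default.my_table"))
```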

Latest Reply
tramtran
Contributor
  • 0 kudos

Thank you all again

6 More Replies
Volker
by Contributor
  • 2547 Views
  • 1 reply
  • 0 kudos

Terraform Error: Cannot create sql table context deadline

I am currently trying to deploy external parquet tables to the Databricks UC using Terraform. However, for some tables I get the following error: Error: cannot create sql table: Post "https://[MASKED]/api/2.0/sql/statements/": context deadline exceede...

Latest Reply
Volker
Contributor
  • 0 kudos

Hey @Retired_mod, thanks for your reply and sorry for the late reply from my side. Unfortunately, I couldn't fix the problem with the Databricks Terraform provider. I have now switched to using Liquibase to deploy tables to Databricks.

JeremyH
by New Contributor II
  • 4125 Views
  • 4 replies
  • 0 kudos

CREATE WIDGETS in SQL Notebook attached to SQL Warehouse Doesn't Work.

I'm able to create and use widgets using the UI in my SQL notebooks, but they get lost quite frequently when the notebook is reset. There is documentation suggesting we can create widgets in code in SQL: https://learn.microsoft.com/en-us/azure/databri...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

Hi @JeremyH - can you please try adding the below to your query and see if the widgets get populated? {{parameter_name}}
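For reference, a sketch of defining the same parameter as a widget from Python with dbutils (available on notebooks attached to all-purpose compute); the widget name mirrors the {{parameter_name}} placeholder that Databricks SQL substitutes inside the query text.

```python
# Sketch: define and read a text widget from Python; the widget name matches
# the {{parameter_name}} placeholder referenced in the reply above.
dbutils.widgets.text("parameter_name", "default_value")
value = dbutils.widgets.get("parameter_name")
print(value)
```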

3 More Replies
Akash_Wadhankar
by New Contributor III
  • 1129 Views
  • 0 replies
  • 0 kudos

Databricks UniForm

Hi Community members, I tried creating a Delta UniForm table using a Databricks notebook. I created a database without providing a location, so it took the DBFS default storage location. On top of that I was able to create a Delta UniForm table. Then I trie...
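For readers trying the same thing, a sketch of creating a Delta table with UniForm (Iceberg) enabled from a notebook; the catalog, schema, and table names are placeholders, and the two table properties are the ones documented for UniForm.

```python
# Sketch: create a Delta table with UniForm (Iceberg metadata) enabled;
# catalog/schema/table names are placeholders.
spark.sql("""
    CREATE TABLE main.default.uniform_demo (id INT, name STRING)
    TBLPROPERTIES (
      'delta.enableIcebergCompatV2' = 'true',
      'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")
```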

Nathant93
by New Contributor III
  • 2089 Views
  • 2 replies
  • 0 kudos

Resolved! Unzipping with Serverless Compute

Hi, I have started using serverless compute but have come across the limitation that I cannot use the local filesystem for temporarily storing the files and directories before moving them to where they need to be in ADLS. Does anyone have a way of unzip...

Data Engineering
serverless
unzip
Latest Reply
delonb2
New Contributor III
  • 0 kudos

Do you have the ability to make a Unity Catalog volume? You could use it as temporary storage before migrating the files to ADLS.
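A sketch of that approach, assuming a Unity Catalog volume already exists; the catalog, schema, and volume names are placeholders. Volumes are exposed under /Volumes and work with plain Python file I/O, so zipfile can extract into them directly on serverless compute.

```python
# Sketch: unzip an archive straight into a Unity Catalog volume; the
# catalog/schema/volume names in these paths are placeholders.
import zipfile

src = "/Volumes/main/default/landing/archive.zip"
dst = "/Volumes/main/default/staging/unzipped"

with zipfile.ZipFile(src) as zf:
    zf.extractall(dst)
```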

1 More Replies
QuikPl4y
by New Contributor III
  • 3996 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks to Oracle to Delete Rows

Hi Community!  I'm working in a Databricks notebook and using the Oracle JDBC Thin Client connector to query an Oracle table, merge together and select specific rows from my DataFrame, and write those rows back to a table in Oracle. All of this works...

Latest Reply
delonb2
New Contributor III
  • 0 kudos

This Stack Overflow post ran into the same issue and would be worth trying: How to delete a record from Oracle Table in Python SQLAlchemy - Stack Overflow
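In the same spirit, a hedged sketch using the python-oracledb package (installable with %pip install oracledb); connection details and table/column names are placeholders. The DELETE has to go through a direct DB-API connection because spark.write over JDBC only appends or overwrites rows, it cannot delete them.

```python
# Sketch: delete specific rows in Oracle from a Databricks notebook via the
# python-oracledb package; credentials, DSN, and table/column names are
# placeholders. spark.write over JDBC cannot issue DELETEs, hence DB-API.
import oracledb

ids_to_delete = [(101,), (102,)]  # e.g. collected from the Spark DataFrame

with oracledb.connect(user="<user>", password="<password>",
                      dsn="<host>:1521/<service>") as conn:
    with conn.cursor() as cur:
        cur.executemany("DELETE FROM target_table WHERE id = :1", ids_to_delete)
    conn.commit()
```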

1 More Replies
