Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

arthurandraderj
by New Contributor
  • 401 Views
  • 0 replies
  • 0 kudos

Error truncating #REF with spark.read

Hello guys, I am trying to read an Excel file, and even using PERMISSIVE mode it's truncating the records that contain #REF in any column. Can anyone please help me with that? schema = StructType([ \        StructField('Col1', DateType(), True), \ <-------...

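For reference, one way to keep rows that contain Excel error markers such as #REF! is to read the affected columns as strings and cast afterwards. A minimal PySpark sketch, assuming the spark-excel connector (com.crealytics.spark.excel) is installed on the cluster; the file path and column names are hypothetical:

```python
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType

# Read every column as a string first so cells holding Excel error markers
# like "#REF!" survive the load instead of breaking date/number parsing.
raw_schema = StructType([
    StructField("Col1", StringType(), True),
    StructField("Col2", StringType(), True),
])

df = (
    spark.read.format("com.crealytics.spark.excel")
    .option("header", "true")
    .schema(raw_schema)
    .load("dbfs:/FileStore/tables/input.xlsx")  # hypothetical path
)

# Null out the error markers, then cast to the intended type.
clean = df.withColumn(
    "Col1",
    F.to_date(
        F.when(F.col("Col1").startswith("#REF"), F.lit(None)).otherwise(F.col("Col1"))
    ),
)
```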
dpc
by New Contributor III
  • 917 Views
  • 2 replies
  • 3 kudos

Resolved! Data prefixed by ' > '

Hi, I have a Databricks table that has a column with a string datatype. When loading data, certain rows are prefixed by ' > '. Does anybody know what would cause this? It seems to be when the string is above a certain number of characters (around 200). Thanks

Latest Reply
Jag
New Contributor III
  • 3 kudos

It looks like the default string data length was exceeded, and because of that it's showing this kind of symbol and an extra new line in the column.

1 More Replies
WTW-DBrat
by New Contributor II
  • 986 Views
  • 2 replies
  • 0 kudos

Service principal’s Microsoft Entra ID access token returns 400 when calling Databricks REST API

I'm using the following to call a Databricks REST API. When I use a PAT for access_token, everything works fine. When I use a Microsoft Entra ID access token, the response returns 400. The service principal has access to the workspace and is part of ...

Latest Reply
Jag
New Contributor III
  • 0 kudos

Hello, try to print the response and check whether you can see the access_token in the payload; otherwise it looks like an access issue. Try going to the workspace settings and granting token access permission to the service principal: Workspace > Settings.

1 More Replies
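For reference, a minimal sketch of the Microsoft Entra ID client-credentials flow for a service principal calling the Databricks REST API; the tenant, client, secret, and workspace URL values are placeholders:

```python
import requests

tenant_id = "<tenant-id>"
client_id = "<sp-application-id>"
client_secret = "<sp-client-secret>"
workspace = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical

# 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the fixed Entra ID resource for Azure Databricks.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# Any workspace-level call works as a smoke test; a 400/403 here usually means
# the service principal isn't added to the workspace or lacks token permissions.
resp = requests.get(
    f"{workspace}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(resp.status_code, resp.text[:200])
```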
isaac_gritz
by Databricks Employee
  • 4025 Views
  • 5 replies
  • 5 kudos

SQL IDE Support

How to use a SQL IDE with Databricks SQL: Databricks provides SQL IDE support using DataGrip and DBeaver with Databricks SQL. Let us know in the comments if you've used DataGrip or DBeaver with Databricks! Let us know if there are any other SQL IDEs you...

Latest Reply
Jag
New Contributor III
  • 5 kudos

DBeaver is working perfectly fine, but I found one issue: it won't show the correct error for a query.

4 More Replies
dataengutility
by New Contributor III
  • 2994 Views
  • 4 replies
  • 1 kudos

Resolved! Yml file replacing job cluster with all-purpose cluster when running a workflow

Hi all, I have been having some trouble running a workflow that consists of 3 tasks that run sequentially. Task1 runs on an all-purpose cluster and kicks off Task2, which needs to run on a job cluster. Task2 kicks off Task3, which also uses a job cluster...

Latest Reply
jacovangelder
Honored Contributor
  • 1 kudos

I don't know if you've cut off your YAML snippet, but your snippet doesn't show your job cluster with the key job-cluster. Just to validate: is your job cluster also defined in your workflow YAML? Edit: Looking at it again and knowing the defaults, it loo...

3 More Replies
tramtran
by Contributor
  • 8121 Views
  • 7 replies
  • 0 kudos

Resolved! How to import a function to another notebook?

Could you please provide guidance on the correct way to dynamically import a Python module from a user-specific path in Databricks Repos? Any advice on resolving the ModuleNotFoundError would be greatly appreciated. udf_check_table_exists notebook: fro...

Latest Reply
tramtran
Contributor
  • 0 kudos

Thank you all again

6 More Replies
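For reference, the pattern that usually resolves the ModuleNotFoundError is keeping the helper in a plain .py file inside the repo and putting its folder on sys.path; the repo path, module name, and function name below are hypothetical:

```python
import sys

# Files (not notebooks) inside a repo become importable once their folder is on sys.path.
sys.path.append("/Workspace/Repos/<user>/<repo>/utils")  # hypothetical repo path

from udf_check_table_exists import check_table_exists  # hypothetical module/function

check_table_exists("catalog.schema.my_table")
```

If the helper lives in a notebook rather than a .py file, %run ./udf_check_table_exists is the usual alternative, since notebooks cannot be pulled in with a regular import statement.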
Volker
by New Contributor III
  • 1949 Views
  • 1 replies
  • 0 kudos

Terraform Error: Cannot create sql table context deadline

I am currently trying to deploy external parquet tables to the Databricks UC using Terraform. However, for some tables I get the following error: Error: cannot create sql table: Post "https://[MASKED]/api/2.0/sql/statements/": context deadline exceede...

Latest Reply
Volker
New Contributor III
  • 0 kudos

Hey @Retired_mod, thanks for your reply and sorry for the late reply from my side. Unfortunately, I couldn't fix the problem with the Databricks Terraform provider. I have now switched to using Liquibase to deploy tables to Databricks.

Arby
by New Contributor II
  • 13433 Views
  • 4 replies
  • 0 kudos

Help With OSError: [Errno 95] Operation not supported: '/Workspace/Repos/Connectors....

Hello, I am experiencing issues with importing the schema file I created from the utils repo. This is the logic we use for all ingestion, and all other schemas live in this repo, utills/schemas. I am unable to access the file I created for a new ingestion pipe...

Latest Reply
Arby
New Contributor II
  • 0 kudos

@Debayan Mukherjee Hello, thank you for your response. Please let me know if these are the correct commands to access the file from the notebook. I can see the files in the repo folder, but I just noticed this: the file I am trying to access has a size of 0 b...

3 More Replies
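For reference, a quick way to confirm from the notebook that the file is actually present and non-empty; the path below is hypothetical and only extends the truncated one in the post:

```python
import os

# Hypothetical path based on the repo layout described in the thread.
path = "/Workspace/Repos/Connectors/utills/schemas/my_schema.py"

exists = os.path.exists(path)
print("exists:", exists)
print("size (bytes):", os.path.getsize(path) if exists else "n/a")
```

A 0-byte file in a repo usually means the commit or repo sync did not bring the file content over, rather than a permissions problem on the notebook side.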
JeremyH
by New Contributor II
  • 2239 Views
  • 4 replies
  • 0 kudos

CREATE WIDGETS in SQL Notebook attached to SQL Warehouse Doesn't Work.

I'm able to create and use widgets using the UI in my SQL notebooks, but they get lost quite frequently when the notebook is reset. There is documentation suggesting we can create widgets in code in SQL: https://learn.microsoft.com/en-us/azure/databri...

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

Hi @JeremyH, can you please try adding something like the below in your query and see if the widgets are getting populated? {{parameter_name}}

3 More Replies
Akash_Wadhankar
by New Contributor III
  • 749 Views
  • 0 replies
  • 0 kudos

Databricks UniForm

Hi Community members, I tried creating a Delta UniForm table using a Databricks notebook. I created a database without providing a location, so it took the DBFS default storage location. On top of that I was able to create a Delta UniForm table. Then I trie...

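For reference, a minimal sketch of creating a UniForm (Iceberg-readable) Delta table from a notebook; the table name is a placeholder, and the properties shown are the ones Databricks documents for UniForm on recent runtimes:

```python
# Hypothetical catalog/schema/table name. UniForm is documented as requiring
# Unity Catalog and a recent runtime that supports Iceberg compatibility.
spark.sql("""
    CREATE TABLE main.default.uniform_demo (id BIGINT, name STRING)
    TBLPROPERTIES (
        'delta.enableIcebergCompatV2' = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")
```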
Nathant93
by New Contributor III
  • 1204 Views
  • 2 replies
  • 0 kudos

Resolved! Unzipping with Serverless Compute

Hi, I have started using serverless compute but have come across the limitation that I cannot use the local filesystem for temporarily storing files and directories before moving them to where they need to be in ADLS. Does anyone have a way of unzip...

Data Engineering
serverless
unzip
Latest Reply
delonb2
New Contributor III
  • 0 kudos

Do you have the ability to make a Unity Catalog volume? You could use it as temporary storage before migrating the files to adls.

1 More Replies
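For reference, a minimal sketch of the volume-based approach delonb2 suggests; the volume names and paths below are hypothetical. Unity Catalog volumes are mounted under /Volumes, so the standard zipfile module can extract straight into one, including on serverless compute:

```python
import zipfile

src_zip = "/Volumes/main/default/landing/archive.zip"   # hypothetical
dest_dir = "/Volumes/main/default/landing/unzipped"     # hypothetical

# Extract directly into the Unity Catalog volume, then copy onwards to ADLS
# (for example with dbutils.fs.cp) or read the files in place.
with zipfile.ZipFile(src_zip) as zf:
    zf.extractall(dest_dir)
```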
wschoi
by New Contributor III
  • 8934 Views
  • 5 replies
  • 3 kudos

How to fix plots and image color rendering on Notebooks?

I am currently running dark mode for my Databricks Notebooks, and am using the "new UI" released a few days ago (May 2023) and the "New notebook editor." Currently all plots (like matplotlib) are showing wrong colors. For example, denoting: ```... p...

Latest Reply
aleph_null
New Contributor II
  • 3 kudos

Any update on this issue? This is a huge drawback to using the dark theme.

4 More Replies
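For reference, two common matplotlib-side workarounds; neither is Databricks-specific, they simply pin the figure colors so the notebook theme does not bleed through:

```python
import matplotlib.pyplot as plt

# Option 1: force a solid white canvas regardless of the notebook theme.
fig, ax = plt.subplots(facecolor="white")
ax.set_facecolor("white")
ax.plot([1, 2, 3], [1, 4, 9], color="tab:blue")

# Option 2: use matplotlib's dark style so default colors suit a dark background.
plt.style.use("dark_background")
```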
QuikPl4y
by New Contributor III
  • 2423 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks to Oracle to Delete Rows

Hi Community! I'm working in a Databricks notebook and using the Oracle JDBC Thin Client connector to query an Oracle table, merge together and select specific rows from my dataframe, and write those rows back to a table in Oracle. All of this works...

Latest Reply
delonb2
New Contributor III
  • 0 kudos

This Stack Overflow post ran into the same issue and would be worth trying: How to delete a record from Oracle Table in Python SQLAlchemy - Stack Overflow

1 More Replies
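For reference, a minimal sketch along the lines of that Stack Overflow suggestion, assuming SQLAlchemy and the python-oracledb driver are installed on the cluster; the connection string, table, and column names are placeholders:

```python
from sqlalchemy import create_engine, text

# Hypothetical connection string; oracle+oracledb is the SQLAlchemy 2.x dialect
# for the python-oracledb driver.
engine = create_engine(
    "oracle+oracledb://my_user:my_password@db-host:1521/?service_name=MYPDB"
)

# IDs collected from the Databricks dataframe that should be removed in Oracle.
ids_to_delete = [101, 102, 103]

# engine.begin() commits on success and rolls back on error.
with engine.begin() as conn:
    conn.execute(
        text("DELETE FROM target_table WHERE id = :id"),
        [{"id": i} for i in ids_to_delete],
    )
```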
_TJ
by New Contributor III
  • 9806 Views
  • 3 replies
  • 2 kudos

Incremental load from source; how to handle deletes

Small introduction: I'm a BI/data developer, mostly working in MSSQL and Data Factory, coming from SSIS. Now I'm trying Databricks to see if it works for me and my customers. I got enthusiastic about the video: https://www.youtube.com/watch?v=PIFL7W3DmaY&...

Latest Reply
abajpai
New Contributor II
  • 2 kudos

@_TJ, did you find a solution for the sliding window?

2 More Replies
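For reference, a common pattern when the source extract is a full or windowed snapshot is a Delta MERGE that also deletes target rows missing from the source. A minimal sketch with placeholder table and column names; whenNotMatchedBySourceDelete needs a reasonably recent Delta/DBR version and accepts an optional condition, which is the usual way to constrain a sliding window so rows outside it are left alone:

```python
from delta.tables import DeltaTable

source_df = spark.table("staging.customer_snapshot")    # hypothetical snapshot extract
target = DeltaTable.forName(spark, "silver.customer")   # hypothetical target table

(
    target.alias("t")
    .merge(source_df.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    # Delete target rows that no longer appear in the source; pass a condition
    # here to restrict deletes to the window covered by the extract.
    .whenNotMatchedBySourceDelete()
    .execute()
)
```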
