Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

parimalpatil28
by New Contributor III
  • 12280 Views
  • 2 replies
  • 2 kudos

Resolved! Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"

Hello, I am facing an issue with an INSERT query and with .saveAsTable. The query throws: Caused by: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3" org.apache.spark.SparkException: [TASK_WRITE_FAILED] Task...

Latest Reply
parimalpatil28
New Contributor III
  • 2 kudos

Hello @Retired_mod, thanks for the help. We also investigated internally and found the root cause: our product's configuration was overwriting the Databricks default spark.executor.extraClassPath confs. Because of this, our clusters were not a...

1 More Replies
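The root cause above was a product config replacing, rather than extending, the default `spark.executor.extraClassPath`. A minimal sketch of the safer pattern, appending custom jars to whatever classpath is already set (all paths below are hypothetical placeholders, not the actual Databricks defaults):

```python
def merged_extra_classpath(default_cp: str, product_jars: list) -> str:
    """Append product jars to an existing colon-separated classpath
    instead of replacing it, skipping entries already present."""
    parts = [p for p in default_cp.split(":") if p]
    for jar in product_jars:
        if jar not in parts:
            parts.append(jar)
    return ":".join(parts)

# Hypothetical values for illustration only.
default_cp = "/databricks/jars/*"
merged = merged_extra_classpath(default_cp, ["/opt/myproduct/lib/myproduct.jar"])
# A cluster init script or Spark config would then set:
#   spark.executor.extraClassPath = <merged>
```

Setting the merged value keeps the runtime's bundled Hadoop/S3 filesystem classes on the executor classpath, which is what the "No FileSystem for scheme s3" error suggests went missing.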
gopeshr
by New Contributor
  • 2389 Views
  • 0 replies
  • 0 kudos

Databricks <> Snowflake connectivity

We are trying to establish a connection between Databricks and Snowflake through the Databricks workspaces running on a cluster. Initially we assumed it would be the firewall/network blocking the traffic and tried to add a firewall rule, but even after ...

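For reference, a hedged sketch of how a Snowflake read from Databricks is usually wired up with the Spark connector that ships in the runtime. All connection values are placeholders; a connectivity failure like the one described typically also requires network egress from the cluster subnet to the Snowflake account endpoint:

```python
def snowflake_options(account_url, user, password, database, schema, warehouse):
    """Build the standard option map for the Spark Snowflake connector."""
    return {
        "sfUrl": account_url,       # e.g. <account>.snowflakecomputing.com
        "sfUser": user,
        "sfPassword": password,
        "sfDatabase": database,
        "sfSchema": schema,
        "sfWarehouse": warehouse,
    }

opts = snowflake_options("myaccount.snowflakecomputing.com", "svc_user", "***",
                         "MY_DB", "PUBLIC", "MY_WH")

def read_table(spark, opts, table):
    # Requires a running Databricks cluster; the connector is bundled
    # with the Databricks Runtime, no extra library install needed.
    return (spark.read.format("snowflake")
            .options(**opts)
            .option("dbtable", table)
            .load())
```

If this still times out after a firewall rule is added, the rule may need to cover Snowflake's internal stage storage endpoints as well as the account URL.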
boriste
by New Contributor II
  • 13437 Views
  • 11 replies
  • 10 kudos

Resolved! Upload to Volume inside unity catalog not possible?

I want to upload a simple CSV file to a volume that was created in our Unity Catalog. We are using secure cluster connectivity and our storage account (metastore) is not publicly accessible. We injected the storage into our VNet. I am getting the fol...

Latest Reply
jeroenvs
New Contributor III
  • 10 kudos

@AdrianaIspas We are running into the same issue. It took a while to figure out that the error message is related to this limitation. Any updates on when we can expect the limitation to be taken away? We want to secure access to our storage accounts ...

10 More Replies
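When the networking limitation is resolved, uploading from a notebook into a volume is a path-addressing exercise. A small sketch, with catalog/schema/volume names as placeholders:

```python
def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Unity Catalog volumes are addressed under /Volumes/<catalog>/<schema>/<volume>/."""
    return "/Volumes/{}/{}/{}/{}".format(catalog, schema, volume, filename)

target = volume_path("main", "default", "landing", "data.csv")

def upload_csv(dbutils, local_file: str, target: str) -> None:
    # From a notebook: copy a driver-local file into the volume.
    # The cluster (not just the UI) must be able to reach the storage account.
    dbutils.fs.cp("file:" + local_file, target)
```

The UI upload goes through a different network path than cluster access, which is why a cluster-side copy like this can work even when the browser upload is blocked by the private-storage limitation discussed in the thread.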
hujohnso
by New Contributor II
  • 2107 Views
  • 2 replies
  • 0 kudos

Databricks Connect V2 Never Returning Anything

I am trying to use Databricks Connect V2 with Azure Databricks from PyCharm. I have created a cluster with runtime 13.2 in Shared Access Mode, enabled Unity Catalog for the workspace (I am the account admin), and created a .databrickscfg fil...

Latest Reply
jackson-nline
New Contributor III
  • 0 kudos

We are also seeing related issues, see https://community.databricks.com/t5/get-started-discussions/databricks-connect-13-1-0-limitations/td-p/37096. However, this issue also highlights that the heartbeat that lets you know a job is running on Databricks Con...

1 More Replies
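A quick way to rule out misconfiguration in a setup like this is to verify what the `.databrickscfg` profile actually resolves to before blaming the session. A sketch (the host value below is a placeholder):

```python
import configparser

def read_profile(cfg_path: str, profile: str = "DEFAULT") -> dict:
    """Parse the INI-style .databrickscfg and return one profile's settings."""
    cp = configparser.ConfigParser()
    cp.read(cfg_path)
    return dict(cp["DEFAULT"]) if profile == "DEFAULT" else dict(cp[profile])

# With databricks-connect >= 13.x installed, a session is then created as:
#   from databricks.connect import DatabricksSession
#   spark = DatabricksSession.builder.profile("DEFAULT").getOrCreate()
# The profile needs host, token, and cluster_id; the cluster must be
# UC-enabled, which matches the setup described in the post.
```

If `read_profile` shows the expected host and cluster_id but queries still hang, the thread's suggestion to check the linked Databricks Connect 13.x limitations is the next step.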
Priyag1
by Honored Contributor II
  • 5041 Views
  • 2 replies
  • 7 kudos

Migration of all Databricks SQL content to the workspace browser

Databricks will force-migrate all Databricks SQL content (dashboards, queries, alerts) to the workspace browser. Visit My Queries, My Alerts, and My Dashboards and look for any un-migra...

Data Engineering
Dashboards
Databricks SQL
Visualization
Latest Reply
joseheliomuller
New Contributor III
  • 7 kudos

The ability to easily migrate queries and dashboards across Databricks workspaces is extremely important. In my company we have dev, stg, and production workspaces, with the same pipeline creating the data. We create our dashboards in DEV and then we have to...

1 More Replies
clapton79
by New Contributor II
  • 17116 Views
  • 5 replies
  • 7 kudos

Resolved! on-behalf-of token creation (for SPN)

I am trying to create an on-behalf-of token for an SPN on my Azure Databricks Premium instance. The response is a FEATURE_DISABLED error message ("On-behalf-of token creation for service principals is not enabled for this workspace"). How do I turn on ...

Latest Reply
alexott
Databricks Employee
  • 7 kudos

There is no on-behalf-of token on Azure. Just generate an AAD token for the service principal and use it to create a PAT (make sure that the SP has permission to use PATs). The easiest way of doing it is to use the new Databricks CLI, which supports unified...

4 More Replies
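The accepted approach (authenticate *as* the service principal, then mint a PAT) can also be done from Python with the Databricks SDK, which handles the AAD token exchange itself. A sketch under the assumption that the SPN has workspace access and PAT permissions; all IDs are placeholders:

```python
def pat_request(comment: str, days: int) -> dict:
    """Build the PAT creation parameters; lifetime is expressed in seconds."""
    return {"comment": comment, "lifetime_seconds": days * 24 * 3600}

req = pat_request("spn-automation", 90)

def create_pat(host, tenant_id, client_id, client_secret, req):
    # pip install databricks-sdk; authenticates as the SPN via AAD,
    # then creates a PAT owned by that SPN.
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient(host=host,
                        azure_tenant_id=tenant_id,
                        azure_client_id=client_id,
                        azure_client_secret=client_secret)
    return w.tokens.create(comment=req["comment"],
                           lifetime_seconds=req["lifetime_seconds"])
```

This sidesteps the FEATURE_DISABLED on-behalf-of endpoint entirely, matching the reply above: the token is created by the SPN for itself rather than by an admin on its behalf.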
harish446
by New Contributor
  • 2277 Views
  • 1 reply
  • 0 kudos

Can a NOT NULL constraint be applied to an identity column?

I had a table creation script such as the following: CREATE TABLE default.test2 (id BIGINT GENERATED BY DEFAULT AS IDENTITY(), name STRING) USING delta LOCATION "/mnt/datalake/xxxx". What are the possible ways to apply not n...

Data Engineering
data engineering
Databricks
Delta Lake
Delta tables
spark
Latest Reply
Krishnamatta
Contributor
  • 0 kudos

Hi Harish, here is the documentation for this issue: https://docs.databricks.com/en/tables/constraints.html

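Following the linked constraints documentation, Delta allows NOT NULL either inline in the column definition or added afterwards with ALTER TABLE. A sketch that builds both statements (the table name and location are placeholders, not the poster's real path):

```python
def create_ddl(table: str, location: str) -> str:
    """DDL with NOT NULL declared directly on the identity column."""
    return ("CREATE TABLE {} ("
            "id BIGINT GENERATED BY DEFAULT AS IDENTITY NOT NULL, "
            "name STRING) USING DELTA LOCATION '{}'").format(table, location)

def add_not_null(table: str, column: str) -> str:
    """Alternative: add the constraint to an existing Delta table."""
    return "ALTER TABLE {} ALTER COLUMN {} SET NOT NULL".format(table, column)

ddl = create_ddl("default.test2", "/mnt/datalake/example")
# In a notebook: spark.sql(ddl); spark.sql(add_not_null("default.test2", "id"))
```

Note the ALTER TABLE form requires that the column contain no existing NULLs at the time the constraint is added.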
Kristin
by New Contributor
  • 1581 Views
  • 0 replies
  • 0 kudos

Structured streaming - missing records in Gold layer, the foreach batch doesn't write some data

Good afternoon, I'm facing an issue with the foreachBatch function in my streaming pipeline (Spark, Streaming, Delta, Gold). The pipeline fetches data from data lake storage using Auto Loader. This data is first written to a bronze layer. Following ...

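For context on the pattern in question: a common cause of "missing" gold records with foreachBatch is batch logic that is not idempotent, so replayed micro-batches silently drop or duplicate rows. A hedged sketch of an idempotent bronze-to-gold writer; the table name `gold.target` and the key column `id` are assumptions for illustration:

```python
def checkpoint_path(base: str, table: str) -> str:
    """Build a per-table checkpoint location; each stream needs its own."""
    return "{}/_checkpoints/{}".format(base.rstrip("/"), table)

def upsert_batch(batch_df, epoch_id):
    # Runs once per micro-batch on the driver. A MERGE keyed on a stable id
    # makes re-processing after failures safe (same batch -> same result).
    batch_df.createOrReplaceTempView("updates")
    batch_df.sparkSession.sql("""
        MERGE INTO gold.target t
        USING updates s ON t.id = s.id
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)

def start(stream_df, base: str):
    return (stream_df.writeStream
            .foreachBatch(upsert_batch)
            .option("checkpointLocation", checkpoint_path(base, "gold_target"))
            .start())
```

If records are missing only intermittently, comparing the checkpoint's committed batch IDs against the bronze table's ingestion times is a good first diagnostic.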
pankaj_kaushal
by New Contributor
  • 1895 Views
  • 0 replies
  • 0 kudos

Tuple2 UDF not working

From a UDF I am trying to return a tuple, but it looks like the tuple is not serializing, and hence I am getting an empty tuple. Can someone help me with this? Code and output attached.

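The attached code isn't visible here, but in PySpark the usual cause is that Spark cannot infer a schema for a bare Python tuple, so the column comes back empty. Declaring a struct return type makes the tuple serialize as a struct column. A sketch with a hypothetical name-splitting function:

```python
def split_name(full: str):
    """Plain Python function returning a tuple; testable without Spark."""
    first, _, last = full.partition(" ")
    return (first, last)

def make_udf():
    # Wrap it with an explicit StructType so Spark knows how to serialize
    # the tuple: each tuple element maps to one struct field, in order.
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StructType, StructField, StringType
    schema = StructType([StructField("first", StringType()),
                         StructField("last", StringType())])
    return udf(split_name, schema)

# usage on a cluster: df.withColumn("parts", make_udf()(df["full_name"]))
# then df.select("parts.first", "parts.last") to unpack the fields.
```

In Scala the analogous fix is returning a case class (or declaring the Tuple2 schema) rather than relying on inference.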
dhruval
by New Contributor
  • 2991 Views
  • 1 reply
  • 1 kudos

Read Json file

I want to read a JSON file. The code is shown below: # credential_path = "/dbfs/Workspace/Users/dhruval/Return-label/GCP_Credential.json" credential_path = "/Workspace/Users/dhruval/Return-label/GCP_Credential.json" credential = spark.read.format("json").loa...

Latest Reply
Krishnamatta
Contributor
  • 1 kudos

Hi Dhruval, did you try adding the file: prefix to the path? credential = spark.read.format("json").option("multiline","true").load("file:/Workspace/Users/dhruval/Return-label/GCP_Credential.json") Note: tested on a 13.3 LTS cluster.

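The reply's fix generalizes: Spark interprets bare paths against its default filesystem, while /Workspace files live on the driver's local disk and /dbfs/ is the local mount of DBFS. A small helper capturing that mapping (a sketch of the convention, not an official API):

```python
def as_spark_uri(path: str) -> str:
    """Map driver-local path conventions to the scheme Spark expects."""
    if path.startswith("/dbfs/"):
        # /dbfs/... is the FUSE mount; Spark addresses it as dbfs:/...
        return "dbfs:" + path[len("/dbfs"):]
    if path.startswith("/Workspace/"):
        # Workspace files are driver-local, so they need the file: scheme.
        return "file:" + path
    return path  # already a URI or a default-filesystem path

# usage on a cluster:
# spark.read.format("json").option("multiline", "true").load(as_spark_uri(credential_path))
```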
Akash2
by Contributor
  • 1358 Views
  • 0 replies
  • 0 kudos

Data Engineer Professional Exam Suspended

Hi team, I was taking my exam today and 40 minutes in I was interrupted by the proctor and asked to show the test area. The table had a guitar E string and an almost-eaten apple; nothing else was on the table. Then the proctor asked me to show the ro...

dfoard
by New Contributor
  • 3440 Views
  • 0 replies
  • 0 kudos

ERROR: No matching distribution found for databricks-smolder

I'm trying to follow along with the blog post Gaining Insights Into Your HL7 Data With Smolder and Databricks (#1 of 3). I was finally able to build a jar file from the repo using Java 17, and it imports into the cluster successfully. However, when ...

berserkersap
by Contributor
  • 14858 Views
  • 4 replies
  • 1 kudos

Resolved! How to update a SQL Server Table using JDBC or something else in Python/Pyspark ?

I need to update a SQL Server table from a Databricks notebook. Right now I am trying to do this using JDBC; however, it seems we can only append to or overwrite the table using the JDBC connection. Query databases using JDBC - Azure Databricks | Microsof...

Data Engineering
Databricks
SQL Server
Update
Latest Reply
diego_poggioli
Contributor
  • 1 kudos

Hi @berserkersap, thanks for your answer. I was able to solve the problem in two ways: 1) downgrading the runtime version to 12.2, after which the msodbcsql17 installer no longer failed (with the error Can't open lib 'ODBC Driver 17 for SQL Server': file...

3 More Replies
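The thread's ODBC route can be sketched as follows: the Spark JDBC writer only supports append/overwrite, so row-level UPDATEs go through a direct driver-side connection instead. All connection values are placeholders, and the ODBC driver itself must be installed on the cluster (e.g. via init script), which is what the msodbcsql discussion above is about:

```python
def conn_str(server: str, database: str, user: str, password: str) -> str:
    """Build an ODBC connection string; driver name must match what's installed."""
    return ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=" + server + ";DATABASE=" + database + ";"
            "UID=" + user + ";PWD=" + password)

def run_update(cs: str, sql: str, params: tuple) -> int:
    # %pip install pyodbc on the cluster; runs on the driver, not executors.
    import pyodbc
    with pyodbc.connect(cs) as conn:
        cur = conn.cursor()
        cur.execute(sql, params)   # parameterized to avoid SQL injection
        conn.commit()
        return cur.rowcount

# run_update(conn_str("srv.database.windows.net", "mydb", "u", "***"),
#            "UPDATE dbo.t SET col = ? WHERE id = ?", ("value", 1))
```

For bulk updates, a common alternative is writing the changes to a staging table over JDBC and issuing a single server-side MERGE through this same connection.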
lstk
by New Contributor
  • 5422 Views
  • 2 replies
  • 1 kudos

Resolved! Job ID value out of range - Azure Logic App Connector

Hello everybody, I tried to build a Logic App custom connector following this explanation (https://medium.com/@poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27). Now I run into the following problem and wante...

Latest Reply
stefnhuy
New Contributor III
  • 1 kudos

Hey Lukas,I can totally relate to the frustration of encountering those confounding errors when building custom connectors in Azure Logic Apps. The "Job ID value out of range" issue can be quite perplexing, but fear not, for there's a solution on the...

1 More Replies
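One plausible reading of a "Job ID value out of range" error, offered here as an assumption rather than a confirmed diagnosis: Databricks job IDs are 64-bit values, so a connector schema that declares the field as a 32-bit integer will reject modern IDs. A quick check:

```python
INT32_MAX = 2**31 - 1

def fits_int32(job_id: int) -> bool:
    """True if the value survives a 32-bit signed integer field."""
    return -2**31 <= job_id <= INT32_MAX

sample_job_id = 598130536546661  # hypothetical modern job ID, far above int32
overflows = not fits_int32(sample_job_id)
# If this matches, the fix is in the custom connector's OpenAPI definition:
# declare job_id as type integer with format int64 (or as a string),
# not the default int32.
```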
MC8D
by New Contributor II
  • 2248 Views
  • 1 reply
  • 1 kudos

Foreign Catalog with Case Sensitive PostgreSQL

I am trying to query my PostgreSQL read replica as a foreign catalog. I can successfully test the connection, I can see the database names, and the table names are auto-populated correctly. However, when I try to view or query a table, I get the following error...

Latest Reply
MC8D
New Contributor II
  • 1 kudos

Hi @Retired_mod, I am able to query the pg_catalog database, which has all lower-case table names, so the connection is working. I am unable to query the tables in my "public" schema, as they have capitalization in the table names. If I query with no bac...

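The reply hints at the usual workaround: PostgreSQL preserves case only for quoted identifiers, while unquoted names get folded, so mixed-case foreign tables need explicit backtick quoting in Databricks SQL. A helper that builds a safely quoted three-part name (catalog/schema/table names below are placeholders):

```python
def quoted(catalog: str, schema: str, table: str) -> str:
    """Backtick-quote each part of a three-part name, escaping embedded backticks,
    so the exact mixed-case identifier reaches the foreign catalog."""
    def q(ident: str) -> str:
        return "`" + ident.replace("`", "``") + "`"
    return ".".join(q(p) for p in (catalog, schema, table))

stmt = "SELECT * FROM " + quoted("pg_replica", "public", "MyTable") + " LIMIT 10"
# In a notebook: spark.sql(stmt)
```

If quoting the exact case still fails, renaming the replica's tables to lower case on the PostgreSQL side is the blunt but reliable alternative.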
