Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

hujohnso
by New Contributor II
  • 1031 Views
  • 2 replies
  • 0 kudos

Databricks Connect V2 Never Returning Anything

I am trying to use Databricks Connect V2 with Azure Databricks from PyCharm. I have created a cluster with runtime 13.2 in Shared Access Mode, I have enabled Unity Catalog for the workspace and I am the account admin, and I have created a .databrickscfg fil...

Latest Reply
jackson-nline
New Contributor III
  • 0 kudos

We are also seeing related issues, see https://community.databricks.com/t5/get-started-discussions/databricks-connect-13-1-0-limitations/td-p/37096. However, that issue also highlights that the heartbeat that lets you know a job was running on Databricks Con...

1 More Replies
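For anyone else debugging the same hang, a quick connectivity check along these lines can help separate PyCharm setup problems from cluster or session problems. This is a minimal sketch, not the thread's exact configuration; the host, token, and cluster ID values are illustrative placeholders, and it assumes databricks-connect 13.x is installed.

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    token="dapi-XXXXXXXXXXXX",                                   # placeholder personal access token
    cluster_id="0123-456789-abcdefgh",                           # placeholder cluster ID (runtime 13.2+)
).getOrCreate()

# A tiny action that should return within seconds if the Spark Connect session is healthy.
print(spark.range(5).count())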
Priyag1
by Honored Contributor II
  • 3032 Views
  • 3 replies
  • 7 kudos

Migration of all Databricks SQL content to the workspace browser

Databricks will force-migrate all Databricks SQL content (dashboards, queries, alerts) to the workspace browser. Visit My Queries, My Alerts, and My Dashboards and look for any un-migra...

Data Engineering
Dashboards
Databricks SQL
Visualization
Latest Reply
joseheliomuller
New Contributor III
  • 7 kudos

The ability to easily migrate queries and dashboards across Databricks workspaces is extremely important. In my company we have dev, stg, and production workspaces, with the same pipeline creating the data. We create our dashboards in DEV and then we have to...

2 More Replies
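For readers facing the same cross-workspace need, one commonly used approach is to pull the query definitions over the REST API and re-create them in the target workspace. The sketch below is an illustration only: the host and token are placeholders, and it relies on the legacy /api/2.0/preview/sql/queries endpoint, which may be deprecated or disabled in your workspace.

import requests

HOST = "https://adb-dev-1234567890123456.7.azuredatabricks.net"  # placeholder dev workspace
TOKEN = "dapi-XXXXXXXXXXXX"                                       # placeholder PAT

resp = requests.get(
    f"{HOST}/api/2.0/preview/sql/queries",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"page_size": 50, "page": 1},
)
resp.raise_for_status()

# Each result carries the query name and SQL text, which can then be re-created in another workspace.
for query in resp.json().get("results", []):
    print(query.get("name"))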
clapton79
by New Contributor II
  • 8174 Views
  • 5 replies
  • 6 kudos

Resolved! on-behalf-of token creation (for SPN)

I am trying to create an on-behalf-of token for an SPN on my Azure Databricks Premium instance. The response is a FEATURE_DISABLED error message ("On-behalf-of token creation for service principals is not enabled for this workspace"). How do I turn on ...

Latest Reply
alexott
Valued Contributor II
  • 6 kudos

There is no on-behalf-of token on Azure - just generate an AAD token for the service principal and use it to create a PAT (make sure the SP has permission to use PATs). The easiest way to do this is to use the new Databricks CLI, which supports unified...

4 More Replies
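A sketch of the flow alexott describes: authenticate as the service principal against AAD, then call the workspace Token API to mint the PAT. Tenant, client, secret, and workspace values below are placeholders; 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure Databricks AAD resource ID.

import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",             # placeholder
    client_id="<sp-application-id>",     # placeholder
    client_secret="<sp-client-secret>",  # placeholder
)
# Acquire an AAD token scoped to the Azure Databricks resource.
aad_token = credential.get_token("2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default").token

resp = requests.post(
    "https://adb-1234567890123456.7.azuredatabricks.net/api/2.0/token/create",  # placeholder workspace
    headers={"Authorization": f"Bearer {aad_token}"},
    json={"comment": "PAT for service principal", "lifetime_seconds": 3600},
)
resp.raise_for_status()
pat = resp.json()["token_value"]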
harish446
by New Contributor
  • 1072 Views
  • 1 reply
  • 0 kudos

Can a NOT NULL constraint be applied to an identity column?

I had a table creation script as follows, for example: CREATE TABLE default.test2 (id BIGINT GENERATED BY DEFAULT AS IDENTITY(), name STRING) USING DELTA LOCATION "/mnt/datalake/xxxx". What are the possible ways to apply not n...

Data Engineering
data engineering
Databricks
Delta Lake
Delta tables
spark
Latest Reply
Krishnamatta
New Contributor III
  • 0 kudos

Hi Harish, here is the documentation for this issue: https://docs.databricks.com/en/tables/constraints.html
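Based on that constraints documentation, there appear to be two ways to get a NOT NULL constraint onto the identity column: declare it in the column definition, or add it afterwards with ALTER TABLE. The sketch below reuses the table and path from the question and is illustrative, not tested against the poster's environment.

spark.sql("""
    CREATE TABLE default.test2 (
        id   BIGINT GENERATED BY DEFAULT AS IDENTITY NOT NULL,
        name STRING
    )
    USING DELTA
    LOCATION '/mnt/datalake/xxxx'
""")

# Or, for an existing table, add the constraint after the fact:
spark.sql("ALTER TABLE default.test2 ALTER COLUMN id SET NOT NULL")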

Kristin
by New Contributor
  • 750 Views
  • 0 replies
  • 0 kudos

Structured Streaming - missing records in the Gold layer; the foreachBatch doesn't write some data

Good afternoon (Spark, Streaming, Delta, Gold). I'm facing an issue with the foreachBatch function in my streaming pipeline. The pipeline is fetching data from the data lake storage using Autoloader. This data is first written to a bronze layer. Following ...
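Since the thread has no replies yet, here is a generic foreachBatch pattern that often helps surface this kind of missing-data issue: log the batch ID and row count on every invocation, and make the write idempotent with a MERGE keyed on the business key. The table names, join key, and checkpoint path are illustrative assumptions, not the poster's actual pipeline.

from delta.tables import DeltaTable

def upsert_to_gold(batch_df, batch_id):
    # Logging each micro-batch makes it obvious if a batch arrives empty or is skipped.
    print(f"batch {batch_id}: {batch_df.count()} rows")
    gold = DeltaTable.forName(batch_df.sparkSession, "main.gold.orders")  # placeholder table
    (gold.alias("t")
         .merge(batch_df.alias("s"), "t.order_id = s.order_id")          # placeholder key
         .whenMatchedUpdateAll()
         .whenNotMatchedInsertAll()
         .execute())

(spark.readStream.table("main.silver.orders")                             # placeholder silver table
      .writeStream
      .foreachBatch(upsert_to_gold)
      .option("checkpointLocation", "abfss://lake@storage.dfs.core.windows.net/_chk/gold_orders")
      .start())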

dhruval
by New Contributor
  • 2266 Views
  • 1 reply
  • 1 kudos

Read JSON file

I want to read a JSON file. The code is shown below:
# credential_path = "/dbfs/Workspace/Users/dhruval/Return-label/GCP_Credential.json"
credential_path = "/Workspace/Users/dhruval/Return-label/GCP_Credential.json"
credential = spark.read.format("json").loa...

Latest Reply
Krishnamatta
New Contributor III
  • 1 kudos

Hi Dhruval, did you try using the file: prefix to the path?
credential = spark.read.format("json").option("multiline", "true").load("file:/Workspace/Users/dhruval/Return-label/GCP_Credential.json")
Note: tested on a 13.3 LTS cluster.
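If the file is a single GCP credential rather than a JSON dataset, a plain Python read may be simpler than going through Spark. This sketch reuses the path from the thread and assumes a runtime where workspace files are visible on the driver filesystem.

import json

with open("/Workspace/Users/dhruval/Return-label/GCP_Credential.json") as f:
    credential = json.load(f)  # a Python dict with the service-account fields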

viniaperes
by New Contributor II
  • 1676 Views
  • 1 reply
  • 1 kudos

Resolved! Pass Databricks' Spark session to a user-defined module

Hello everyone, I have a .py file (not a notebook) where I have the following class with the following constructor:
class DataQualityChecker:
    def __init__(self, spark_session: SparkSession, df: DataFrame, quality_config_filepath: str) -> None:
        ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @viniaperes, It looks like you're passing all the required arguments to your DataQualityChecker constructor, but the SparkSession parameter is being interpreted as a positional argument instead of a keyword argument. To fix this, you can explicitl...
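For anyone adapting the same pattern, the two usual options are to pass the notebook's spark object into the constructor explicitly (keyword arguments keep the parameter binding unambiguous) or to resolve the active session inside the module. The table name and config path below are illustrative placeholders.

from pyspark.sql import SparkSession

# Option 1: pass the notebook-provided session into the user-defined class.
checker = DataQualityChecker(
    spark_session=spark,
    df=spark.table("samples.nyctaxi.trips"),                         # placeholder DataFrame
    quality_config_filepath="/Workspace/Shared/quality_config.yml",  # placeholder config path
)

# Option 2: inside the module, fall back to the session that is already active.
session = SparkSession.getActiveSession()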

kaleighspitz
by New Contributor
  • 1145 Views
  • 1 reply
  • 0 kudos

Delta Live Tables saving as corrupt files

Hello, I am using Delta Live Tables to store data and then trying to save them to ADLS. I've specified the storage location of the Delta Live Tables in my Delta Live Tables pipeline. However, when I check the files that are saved in ADLS, they are cor...

Data Engineering
Delta Live Tables
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @kaleighspitz, if your Delta Lake files are saving as corrupt files in ADLS, it's possible that there is an issue with the data being written to the file system. Here are a few suggestions to help troubleshoot the issue: Verify the integrity of...
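One quick check worth adding: the files DLT writes to the storage location are Delta tables (parquet data files plus a _delta_log directory), so they will look unreadable if opened as plain files. Reading the location back with the Delta reader, as sketched below with a placeholder ADLS path, confirms whether the data itself is intact.

df = (spark.read
           .format("delta")
           .load("abfss://container@account.dfs.core.windows.net/dlt-storage/tables/my_table"))  # placeholder path
df.show(5)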

Paval
by New Contributor
  • 1151 Views
  • 1 reply
  • 0 kudos

Failed to run the job on Databricks versions 9.x LTS and 10.x (AWS)

Hi Team, when we tried to change the Databricks version from 7.3 to 9.x or 10.x we got the error below: Caused by: java.lang.RuntimeException: MetaException(message:Unable to verify existence of default database: com.amazonaws.services.glue.model....

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Paval, Could you please verify that the cluster uses the IAM policy attached to the IAM role to access the AWS Glue Catalog? Ensure that it includes the glue:GetDatabase action. If the policy is correct, you must ensure the cluster has the necess...

berserkersap
by Contributor
  • 7344 Views
  • 4 replies
  • 1 kudos

Resolved! How to update a SQL Server Table using JDBC or something else in Python/Pyspark ?

I need to update a SQL Server table from a Databricks notebook. Right now, I am trying to do this using JDBC. However, it seems we can only append or overwrite the table using the JDBC connection. Query databases using JDBC - Azure Databricks | Microsof...

Data Engineering
Databricks
SQL Server
Update
Latest Reply
diego_poggioli
Contributor
  • 1 kudos

Hi @berserkersap, thanks for your answer. I was able to solve the problem in 2 ways: 1) downgrading the runtime version to 12.2, after which the installer of msodbcsql17 no longer failed (with the error Can't open lib 'ODBC Driver 17 for SQL Server' : file...

3 More Replies
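For readers landing here, a common workaround (related to the msodbcsql17 discussion above, but not the accepted answer verbatim) is to keep Spark's JDBC writer for bulk append/overwrite and run the row-level UPDATE through an ODBC connection instead. Server, database, credentials, and table below are placeholders, and the sketch assumes pyodbc plus the ODBC driver are installed on the cluster.

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"    # placeholder server
    "DATABASE=mydb;UID=myuser;PWD=mypassword"  # placeholder credentials
)
cursor = conn.cursor()
cursor.execute(
    "UPDATE dbo.orders SET status = ? WHERE order_id = ?",  # placeholder table and columns
    ("shipped", 42),
)
conn.commit()
cursor.close()
conn.close()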
lstk
by New Contributor
  • 2131 Views
  • 2 replies
  • 1 kudos

Resolved! Job ID value out of range - Azure Logic App Connector

Hello everybody, I tried to build a Logic App custom connector following this explanation: https://medium.com/@poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27. Now I run into the following problem and wante...

Latest Reply
stefnhuy
New Contributor III
  • 1 kudos

Hey Lukas, I can totally relate to the frustration of encountering those confounding errors when building custom connectors in Azure Logic Apps. The "Job ID value out of range" issue can be quite perplexing, but fear not, for there's a solution on the...

1 More Replies
MC8D
by New Contributor II
  • 1264 Views
  • 2 replies
  • 1 kudos

Foreign Catalog with Case Sensitive PostgreSQL

I am trying to query my PostgreSQL read replica as a foreign catalog. I can successfully test the connection. I can see the database names. The table names are auto-populated correctly. However, when I try to view or query a table, I get the following error...

Latest Reply
MC8D
New Contributor II
  • 1 kudos

Hi @Kaniz_Fatma, I am able to query the pg_catalog database, which has all lowercase table names, so the connection is working. I am unable to query the tables in my "public" schema, as they have capitalization in the table names. If I query with no bac...

1 More Replies
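A small illustration of the quoting issue discussed in this thread: wrapping the exact mixed-case PostgreSQL identifiers in backticks keeps Databricks from mangling them before the query is pushed down. The catalog, schema, and table names below are placeholders, not the poster's objects.

df = spark.sql("SELECT * FROM pg_replica.public.`CustomerOrders` LIMIT 10")  # placeholder names
df.show()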
jgen17
by New Contributor II
  • 6589 Views
  • 4 replies
  • 0 kudos

Cluster library installation fails

Hello everyone, I get a weird error when installing additional libraries on my cluster. I have a predefined Databricks cluster (Standard_L8s_v2) as a compute instance. I run pipelines on that cluster in Azure ADF. The pipeline consists of several tasks. T...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @jgen17, could you please share your cluster details?

3 More Replies
successhawk
by New Contributor II
  • 885 Views
  • 1 reply
  • 1 kudos

How can I provide read-only access to the admin console?

As a DevSecOps engineer, I want to provide Ops support personnel READ ONLY access to the admin console in my production workspaces, so that they can easily view non-secret configurations, such as user/group memberships/entitlements and workspace sett...

Latest Reply
418971
New Contributor II
  • 1 kudos

Have you found a solution for this?


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group
Labels