Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

ChristianRRL
by Valued Contributor
  • 3780 Views
  • 2 replies
  • 1 kudos

DLT Primary Key Deduplication: Expectations vs. Constraints vs. Other?

I'm trying to figure out what's the best way to "de-duplicate" data via DLT. Currently, my only leads are: Manage data quality with Delta Live Tables | Databricks on AWS (via "Drop invalid records"), and Constraints on Databricks | Databricks on AWS (via "pre-de...

Get Started Discussions
Auto Loader
autoloader
Delta Live Table
Delta Live Table Pipeline
dlt
Latest Reply
Palash01
Valued Contributor
  • 1 kudos

Hey @ChristianRRL, based on my understanding you want to de-duplicate your data during DLT pipeline processing. Unfortunately, I was not able to find a solution when I ran into this problem, due to the native feature limitations. Limitations...

  • 1 kudos
1 More Replies
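For readers landing on this thread, here is a minimal sketch of de-duplicating inside a DLT streaming table. The table, key, and timestamp names (bronze_raw, id, ingest_ts) are placeholders, not names from the thread; an expectation alone only drops rows violating a predicate, so the dropDuplicates call is what actually removes repeated keys.

```python
import dlt

# Placeholders: bronze_raw, id, ingest_ts - substitute your own names.
@dlt.table(comment="De-duplicated records from the bronze layer")
@dlt.expect_or_drop("id_not_null", "id IS NOT NULL")
def silver_deduped():
    return (
        dlt.read_stream("bronze_raw")
        # Bound how long duplicate-tracking state is kept.
        .withWatermark("ingest_ts", "1 hour")
        # Keep one row per (id, ingest_ts) pair within the watermark window.
        .dropDuplicates(["id", "ingest_ts"])
    )
```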
Phani1
by Valued Contributor II
  • 15648 Views
  • 2 replies
  • 1 kudos

ADF vs Databricks

Hi Team, I would appreciate your suggestion on which scenarios call for ADF (Azure Data Factory) versus Databricks for orchestration, as well as any significant differences between them. Regards, Phanindra

Latest Reply
Michael_Galli
Contributor III
  • 1 kudos

Hi, I work with both, so it depends on the use case. ADF is easy to set up and good for data integration, e.g. a "Copy data" job to transfer files from storage 1 to storage 2. ADF data flows (data transformations) can be used to some extent, but when the tr...

  • 1 kudos
1 More Replies
harvey-c
by New Contributor III
  • 2251 Views
  • 4 replies
  • 0 kudos

DLT Performance question with Unity Catalog

Dear Community Members, this question is about debugging a performance issue in a DLT pipeline with Unity Catalog. I had a DLT pipeline in Azure Databricks running against the local store, i.e. hive_metastore, and the process took about 2 hours with the auto scal...

Latest Reply
Mystagon
New Contributor II
  • 0 kudos

Hey Harvey, I'm hitting around the same performance problems as you: from around 25 minutes in a normal workspace to 1 hour and 20 minutes in a UC workspace, which is roughly 3x slower. Did you manage to solve this? I've also noticed dbutils.fs.ls() is much ...

  • 0 kudos
3 More Replies
ChristianRRL
by Valued Contributor
  • 3622 Views
  • 0 replies
  • 0 kudos

Auto-Update API Data

Not sure if this has come up before, but I'm wondering if Databricks has any kind of functionality to "watch" an API call for changes? E.g. currently I have a frequently running job that pulls data via an API call and overwrites the old data. This see...

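Since REST APIs generally don't push change notifications, one common workaround is to keep polling but skip the rewrite when the payload hasn't changed. The sketch below is illustrative only: the endpoint URL and hash-file location are hypothetical, and a Delta table or UC Volume would be a more durable place to keep the last hash than local disk.

```python
import hashlib
import requests

API_URL = "https://example.com/api/data"      # hypothetical endpoint
STATE_PATH = "/tmp/last_payload_hash.txt"     # hypothetical state location

def payload_changed() -> bool:
    """Fetch the payload and report whether it differs from the previous run."""
    body = requests.get(API_URL, timeout=30).content
    digest = hashlib.sha256(body).hexdigest()
    try:
        with open(STATE_PATH) as f:
            previous = f.read().strip()
    except FileNotFoundError:
        previous = ""
    if digest == previous:
        return False
    with open(STATE_PATH, "w") as f:
        f.write(digest)
    return True

if payload_changed():
    print("Payload changed - run the overwrite/refresh step.")
else:
    print("No change - skip the expensive rewrite.")
```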
SamyA
by New Contributor II
  • 6783 Views
  • 7 replies
  • 3 kudos

System table with state UNAVAILABLE

Hello, when I check the system tables' status, it seems they are in the UNAVAILABLE state. I would like to know if anyone has faced this issue? Because of that, I can't enable the system tables. {"schemas":[{"schema":"storage","state":"UNAVAILABLE"},...

Latest Reply
D365
New Contributor II
  • 3 kudos

It may be an internal issue.

  • 3 kudos
6 More Replies
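For anyone hitting the same state, a rough sketch of checking and enabling system schemas through the REST API is shown below; the host, token, and metastore ID are placeholders. Schemas reported as UNAVAILABLE cannot be enabled until they become available for the metastore's region, so the PUT call is expected to fail for those.

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
TOKEN = "<personal-access-token>"                  # placeholder
METASTORE_ID = "<metastore-id>"                    # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

# List system schemas and their current states.
listing = requests.get(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas",
    headers=headers,
)
print(listing.json())

# Try to enable a schema that is reported as AVAILABLE (e.g. "access").
enable = requests.put(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas/access",
    headers=headers,
)
print(enable.status_code, enable.text)
```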
abhijit007
by New Contributor
  • 3401 Views
  • 1 reply
  • 0 kudos

Unable to connect Azure kafka server with public IP from databricks notebook

Hi Team, I am unable to connect (SSH connection) from an Azure Databricks notebook to an Azure Kafka server. The Kafka server and Databricks are both under the same resource group and region. Also, the port is added to the Kafka server's inbound rule. Please help me to r...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, this looks like an issue with the networking config. Could you please check the routing configs, firewall rules, etc., to make sure the destination IP and port 9092 are allowed in the Azure console?

  • 0 kudos
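As a quick way to confirm whether this is a networking problem at all, a plain TCP check from a notebook cell (the broker address below is a placeholder) shows whether port 9092 is reachable before involving Spark or any Kafka client:

```python
import socket

KAFKA_HOST = "10.0.0.4"   # placeholder - your Kafka server's IP or hostname
KAFKA_PORT = 9092

try:
    with socket.create_connection((KAFKA_HOST, KAFKA_PORT), timeout=5):
        print(f"TCP connection to {KAFKA_HOST}:{KAFKA_PORT} succeeded.")
except OSError as exc:
    # Timeouts or 'connection refused' here usually point to NSG rules,
    # routing, or the broker's listener config rather than the notebook code.
    print(f"Could not reach {KAFKA_HOST}:{KAFKA_PORT}: {exc}")
```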
311102
by New Contributor
  • 1291 Views
  • 1 reply
  • 0 kudos

user email invitation to workspace not received

Hello, since December 2023 I can no longer invite users to my workspace as I used to. For no apparent reason, the users I add through my admin dashboard do not receive the invitation email, and thus no link to connect to the workspace. I tried my...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you also please try to add users through the account console if Identity Federation is enabled? Refer: https://docs.databricks.com/en/administration-guide/users-groups/users.html#assign-a-user-to-a-workspace-using-the-account-console

  • 0 kudos
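As a possible workaround while the invitation emails are not arriving, the workspace SCIM API can add a user directly; the user can then sign in via SSO or a password reset instead of the email link. The sketch below assumes a non-federated workspace, and the host, token, and email address are placeholders.

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
TOKEN = "<admin-personal-access-token>"            # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": "new.user@example.com",        # placeholder
    },
)
print(resp.status_code, resp.json())
```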
thibault
by Contributor II
  • 11631 Views
  • 11 replies
  • 6 kudos

databricks-connect 13.1.0 limitations

Hi, quite excited to see the new release of databricks-connect, I started writing unit tests running pyspark on a Databricks cluster using databricks-connect. After some successful basic unit tests, I tested more chained transformations on a dataf...

Latest Reply
jackson-nline
New Contributor III
  • 6 kudos

I doubled the `spark.connect.grpc.maxInboundMessageSize` parameter to 256mb but that didn't appear to resolve anything.

  • 6 kudos
10 More Replies
udi_azulay
by New Contributor II
  • 866 Views
  • 1 reply
  • 0 kudos

Running sql command on Single User cluster vs Shared.

Hi, when I am running the simple code below over my Unity Catalog on a Shared cluster, it works very well. But on a Single User cluster, I am getting: Failed to acquire a SAS token for list on /__unitystorage/schemas/1bb5b053-ac96-471b-8077-8288c56c8a20/tab...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, Could you please refer to the limitations here: https://docs.databricks.com/en/compute/access-mode-limitations.html . Please let us know if this helps. 

  • 0 kudos
Databricks_Work
by New Contributor II
  • 2314 Views
  • 1 reply
  • 0 kudos

How to access data in one Databricks workspace from another

I want to access data from another Databricks workspace in my Databricks workspace. How can I do that?

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Hello, many thanks for your question. To provide a more precise response, we require some additional information: 1. When you refer to "databricks in my databricks", are you referring to accessing data that is in one workspace from another wor...

  • 0 kudos
hbs59
by New Contributor III
  • 5648 Views
  • 3 replies
  • 2 kudos

Resolved! Move multiple notebooks at the same time (programmatically)

If I want to move multiple (hundreds of) notebooks at the same time from one folder to another, what is the best way to do that, other than going to each individual notebook and clicking "Move"? Is there a way to programmatically move notebooks? Like ...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

You should be redirected to the KB page, but this is the information contained: Problem: How to migrate Shared folders and the notebooks. Cause: Shared notebooks are not migrated into a new workspace by default. Solution: Please find the script to migrate t...

  • 2 kudos
2 More Replies
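Since the Workspace API has no single "move" call, a bulk move is usually export + re-import + delete. The sketch below is one possible approach, with the host, token, and folder paths as placeholders; it copies every notebook in one folder to another and then removes the originals, so test it on a small folder first.

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
TOKEN = "<personal-access-token>"                  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

SRC_DIR = "/Users/me@example.com/old_folder"       # placeholder paths
DST_DIR = "/Users/me@example.com/new_folder"

def api(method, endpoint, **kwargs):
    """Thin wrapper around the Workspace REST API."""
    r = requests.request(method, f"{HOST}/api/2.0/workspace/{endpoint}",
                         headers=HEADERS, **kwargs)
    r.raise_for_status()
    return r.json() if r.text else {}

# Make sure the destination folder exists.
api("POST", "mkdirs", json={"path": DST_DIR})

# Re-create each notebook under the destination, then delete the original.
for obj in api("GET", "list", params={"path": SRC_DIR}).get("objects", []):
    if obj["object_type"] != "NOTEBOOK":
        continue
    name = obj["path"].rsplit("/", 1)[-1]
    exported = api("GET", "export", params={"path": obj["path"], "format": "SOURCE"})
    api("POST", "import", json={
        "path": f"{DST_DIR}/{name}",
        "format": "SOURCE",
        "language": obj.get("language", "PYTHON"),
        "content": exported["content"],   # already base64-encoded by the API
        "overwrite": True,
    })
    api("POST", "delete", json={"path": obj["path"], "recursive": False})
```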
Phani1
by Valued Contributor II
  • 2550 Views
  • 1 reply
  • 2 kudos

Databricks API using the personal access token

We can access the Azure Databricks API using a personal access token that we create manually. The objective is that the client doesn't want to store the personal access token, which may not be secure. Do we have an option to generate the token during ...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 2 kudos

Hi @Phani1, yes, you can now use the Databricks "Create a user token" API to create an access token via an automated API call. Please refer to the doc below: Create a user token | Token API | REST API reference | Azure Databricks

  • 2 kudos
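For reference, a minimal sketch of that call is below. Note the request itself still has to be authenticated somehow (for example with a Microsoft Entra ID token or an existing PAT); the host and credential values are placeholders.

```python
import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
AUTH_TOKEN = "<existing-credential>"               # placeholder (e.g. AAD token)

resp = requests.post(
    f"{HOST}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    json={"lifetime_seconds": 3600, "comment": "short-lived automation token"},
)
resp.raise_for_status()
new_token = resp.json()["token_value"]
# Use new_token for follow-up API calls; it expires after lifetime_seconds.
print("Created a token that expires in 1 hour.")
```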
Eldar_Dragomir
by New Contributor II
  • 4593 Views
  • 3 replies
  • 0 kudos

Databricks Volume. Not able to read a file from Scala.

I used to use DBFS with mounted directories, and now I want to switch to Volumes for storing my jars and application.conf for pipelines. I see my application.conf file in Data Explorer > Catalog > Volumes, and I also see the file with dbutils.fs.ls("/...

Get Started Discussions
Databricks
Unity Catalog
Latest Reply
argus7057
New Contributor II
  • 0 kudos

Volume mounts are accessible from Scala code only on a shared cluster. In single user mode this feature is not supported yet. We use init scripts to move contents from Volumes to the cluster's local drive when we need to access files from native Scala ...

  • 0 kudos
2 More Replies
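To illustrate that copy-to-local-disk approach, the Python notebook cell below (catalog, schema, volume, and file names are placeholders) copies a config file from a UC Volume onto the driver's local disk, where plain JVM/Scala file APIs can read it:

```python
# Placeholders - adjust catalog/schema/volume and file names.
volume_path = "/Volumes/main/default/configs/application.conf"
local_path = "file:/tmp/application.conf"

# Copy from the UC Volume to the driver's local filesystem.
dbutils.fs.cp(volume_path, local_path)

# Scala/JVM code can now read it as an ordinary local file, e.g.
# ConfigFactory.parseFile(new java.io.File("/tmp/application.conf")).
```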
ChristianRRL
by Valued Contributor
  • 3699 Views
  • 2 replies
  • 1 kudos

Resolved! DLT Notebook and Pipeline Separation vs Consolidation

Super basic question. For DLT pipelines I see there's an option to add multiple "Paths". Is it generally best practice to completely separate `bronze` from `silver` notebooks? Or is it more recommended to bundle both raw `bronze` and clean `silver` d...

Latest Reply
ChristianRRL
Valued Contributor
  • 1 kudos

This is great! I completely missed the list view before.

  • 1 kudos
1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group