Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the conversation to deepen your understanding and maximize your usage of the Databricks platform.

Browse the Community

Data Engineering

Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community.

8399 Posts

Data Governance

Join discussions on data governance practices, compliance, and security within the Databricks Community.

355 Posts

Generative AI

Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community.

54 Posts

Machine Learning

Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms and more.

774 Posts

Warehousing & Analytics

Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community.

468 Posts

Activity in Databricks Platform Discussions

Phani1
by Valued Contributor
  • 4765 Views
  • 8 replies
  • 8 kudos

Delta Live Table name dynamically

Hi Team, can we pass the Delta Live Table name dynamically (from a configuration file, instead of hardcoding the table name)? We would like to build a metadata-driven pipeline.

Latest Reply
Vic01
New Contributor

Hello, I wonder if there is any update on this feature? Thanks.

7 More Replies
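One common pattern for the metadata-driven approach asked about above is to loop over a configuration object and register one table per entry. The sketch below is a minimal illustration under assumptions: the config keys, table names, and the `register` callback are all hypothetical, and inside a real pipeline the callback would wrap `dlt.table`.

```python
import json

# Hypothetical pipeline configuration; in practice this could be loaded
# from a JSON file or from the DLT pipeline's configuration settings.
CONFIG = json.loads("""
{"tables": [
    {"name": "sales_bronze",     "source": "samples.tpch.orders"},
    {"name": "customers_bronze", "source": "samples.tpch.customer"}
]}
""")

def register_tables(config, register):
    """Call register(name, source) once per configured table.

    Inside a real DLT pipeline, `register` would wrap dlt.table, e.g.:

        def register(name, source):
            @dlt.table(name=name)
            def _build():
                return spark.read.table(source)
    """
    created = []
    for entry in config["tables"]:
        register(entry["name"], entry["source"])
        created.append(entry["name"])
    return created
```

The key point is that `dlt.table` accepts a `name` argument, so the table name need not be the function name; whether this fits your pipeline should be checked against the current DLT docs.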
John_Rotenstein
by New Contributor II
  • 6180 Views
  • 7 replies
  • 3 kudos

Retrieve job-level parameters in Python

Parameters can be passed to Tasks and the values can be retrieved with: dbutils.widgets.get("parameter_name"). More recently, we have been given the ability to add parameters to Jobs. However, the parameters cannot be retrieved like Task parameters. Question: how can job-level parameters be retrieved in Python?

Latest Reply
xiangzhu
Contributor II

Ah sorry, the thread asked about notebooks too. Nevertheless, I'm searching for a way to get job parameters in pure Python jobs.

6 More Replies
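For pure Python (script or wheel) tasks, parameter values arrive as ordinary command-line arguments, so they can be read with standard `argparse`; in notebook tasks the same values remain available via `dbutils.widgets.get`. A minimal sketch under assumptions: the parameter name `env` is a hypothetical example, and it presumes the job-level parameter is forwarded to the task's arguments (e.g. via a `{{job.parameters.env}}` reference in the task configuration).

```python
import argparse

def parse_job_params(argv):
    """Parse task arguments the way a Python script/wheel task receives them.

    In a Databricks Python task, job parameters forwarded to the task show
    up in sys.argv, so argparse works unchanged.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument("--env", default="dev")  # hypothetical parameter
    # parse_known_args tolerates extra arguments the platform may append
    args, _ = parser.parse_known_args(argv)
    return args

# In a notebook task, the equivalent would be dbutils.widgets.get("env").
```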
marcuskw
by Contributor
  • 66 Views
  • 4 replies
  • 1 kudos

IDENTIFIER not working in UPDATE

The following code works perfectly fine: df = spark.createDataFrame([('A', 1), ('B', 2)]); df.createOrReplaceTempView('temp'); spark.sql("SELECT IDENTIFIER(:col) FROM temp", args={"col": "_1"}).display() — but the same IDENTIFIER(:col) pattern fails in an UPDATE statement.

Latest Reply
Witold
New Contributor III

Hey @marcuskw, the docs state that this is actually not supported. In an UPDATE statement, IDENTIFIER can only be used for table names: "Table name of a MERGE, UPDATE, DELETE, INSERT, COPY INTO".

3 More Replies
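Given the reply above (IDENTIFIER only resolves table names in UPDATE), a common workaround is to validate the dynamic column name against the table's actual schema and then interpolate it into the SQL string. A minimal sketch; the table and column names below are placeholders:

```python
def build_update_sql(table, column, value_expr, known_columns):
    """Build an UPDATE statement with a dynamically chosen column.

    Interpolating identifiers into SQL risks injection, so the column is
    only accepted if it appears in known_columns (e.g. obtained from
    spark.table(table).columns).
    """
    if column not in known_columns:
        raise ValueError(f"Unknown column: {column}")
    return f"UPDATE {table} SET `{column}` = {value_expr}"
```

Usage in a notebook might look like `spark.sql(build_update_sql("my_table", "_1", "'X'", spark.table("my_table").columns))`; the allow-list check is what makes the string interpolation acceptable.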
rk1994
by New Contributor
  • 106 Views
  • 2 replies
  • 0 kudos

Incrementally ingesting from a static db into a Delta Table

Hello everyone, I'm very new to Delta Live Tables (and Delta Tables too), so please forgive me if this question has been asked here before. Some context: I have over 100M records stored in a Postgres table. I can connect to this table using the convent...

Latest Reply
TPSteve
New Contributor II

First, you need to understand why your current solution is failing. Materialized views and views in DLT don't differ conceptually from materialized views and views in PostgreSQL. Every time the pipeline is run, both the materialized view and the view will be recalculated.

1 More Reply
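For incremental pulls from a static Postgres table, one widely used approach (not specific to DLT) is a watermark column pushed down through the JDBC `query` option, so each run transfers only rows newer than the last ingested value and appends them to the Delta table. A sketch under assumptions; the table, column, and connection names are hypothetical:

```python
def incremental_query(table, watermark_col, last_value):
    """Build a query that Postgres evaluates server-side, returning only
    rows newer than the last ingested watermark."""
    return (f"SELECT * FROM {table} "
            f"WHERE {watermark_col} > '{last_value}'")

# Hypothetical usage inside a Databricks job (not run here):
# df = (spark.read.format("jdbc")
#       .option("url", jdbc_url)  # e.g. jdbc:postgresql://host/db
#       .option("query", incremental_query("public.events", "updated_at", last_wm))
#       .load())
# df.write.format("delta").mode("append").saveAsTable("bronze.events")
```

The watermark (e.g. the max `updated_at` already ingested) would be persisted between runs, for example in a small Delta tracking table.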
migq2
by New Contributor II
  • 25 Views
  • 0 replies
  • 0 kudos

Cannot log SparkML model to Unity Catalog due to missing output signature

I am training a Spark ML model (concretely a SynapseML LightGBM) in Databricks using MLflow and autolog. When I try to register my model in Unity Catalog I get the following error: MlflowException: Model passed for registration contained a signature th...

thiagoawstest
by Contributor
  • 30 Views
  • 0 replies
  • 0 kudos

change network/vpc workspace

Hello, I have two workspaces, each pointing to a VPC in AWS. In one of the accounts we need to remove a subnet; after removing it, I get the InvalidSubnetID.NotFound AWS error when starting the cluster. I checked in the account console, and the network is poin...

[Attachment: thiagoawstest_0-1720808852626.png]
Shivam_Pawar
by New Contributor III
  • 9555 Views
  • 12 replies
  • 4 kudos

Databricks Lakehouse Fundamentals Badge

I have successfully passed the test after completing the course with 95%. But I haven't received any badge from your side as promised. I have been provided with a certificate which looks fake by itself. I need to post my credentials on LinkedIn wi...

Latest Reply
Elham
Visitor

Hello, I'm trying to log in to the following URL: https://v2.accounts.accredible.com/login?app=recipient-portal&origin=https:%2F%2Fcredentials.databricks.com%2Fissuer%2F45847%2Fcredentials&language=en but I received an error, and the message is: Sorry, we c...

11 More Replies
spicysheep
by Visitor
  • 25 Views
  • 0 replies
  • 0 kudos

Where to find comprehensive docs on databricks.yaml / DAB settings options

Where can I find documentation on how to set cluster settings (e.g., AWS instance type, spot vs. on-demand, number of machines) in Databricks Asset Bundle databricks.yaml files? The only documentation I've come across mentions these things indirectly, ...

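For reference, cluster settings in a bundle live under a job's `job_clusters` block, whose `new_cluster` fields follow the Jobs API cluster specification (so the Jobs API reference doubles as the field-by-field documentation). A sketch of a `databricks.yaml` fragment; every concrete value below (version, instance type, counts) is an example, not a recommendation:

```yaml
# Hypothetical databricks.yaml sketch. new_cluster fields mirror the
# Jobs API "new_cluster" schema; check the current docs for your cloud.
resources:
  jobs:
    my_job:
      name: my_job
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: m5.xlarge            # AWS instance type
            num_workers: 2                      # number of machines
            aws_attributes:
              availability: SPOT_WITH_FALLBACK  # spot vs. on-demand
```

The practical tip is to look up fields in the Jobs API `new_cluster` schema and transcribe them into YAML under `new_cluster`.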
AWS1567
by New Contributor III
  • 14118 Views
  • 10 replies
  • 5 kudos

We've encountered an error logging you in.

I've been trying to log in for the past two days and I'm still facing this error: "We've encountered an error logging you in." I've tried to reset the password multiple times and nothing happened. My friend is also not able to log in. I request you to resolve t...

[Attachment: Databricks_login_issue]
Latest Reply
AlxMares
Visitor

I had to sign up again using the same email. In my case, the error seemed to be related to my Azure account. So, when Databricks asks you to choose your cloud provider to sync with, you should select the community version instead.

9 More Replies
theanhdo
by New Contributor
  • 56 Views
  • 1 reply
  • 0 kudos

Databricks Asset Bundles library dependencies - JAR file

Hi there, I have used Databricks Asset Bundles (DAB) to deploy workflows. For each job, I create a job cluster and install external libraries by specifying libraries in each task, for example: - task_key: my-task  job_cluster_key: my-cluster  note...

Latest Reply
Kaniz_Fatma
Community Manager

Hi @theanhdo, installing JAR files directly from the Databricks workspace is not currently supported. Regarding the limitations of installing JAR files from the workspace, the key points are: Workspace Libraries Deprecation: the documentation states that "Workspace libr...

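A sketch of attaching a JAR to a task in a bundle: since workspace libraries are deprecated, the JAR is referenced from a Unity Catalog volume (cloud storage paths are another option). All paths, keys, and the class name below are placeholders:

```yaml
# Hypothetical databricks.yaml task fragment; the JAR lives in a UC
# volume rather than the workspace file tree.
tasks:
  - task_key: my-task
    job_cluster_key: my-cluster
    spark_jar_task:
      main_class_name: com.example.Main
    libraries:
      - jar: /Volumes/main/default/libs/my-lib.jar
```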
Avinash_Narala
by Contributor
  • 26 Views
  • 0 replies
  • 0 kudos

Tracking Serverless cluster cost

Hi, I just explored the serverless feature in Databricks and I'm wondering how I can track the cost associated with it. Is it stored in system tables? If yes, where can I find it? Also, how can I prove that its cost is relatively low compared to classic ...

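Serverless usage does land in system tables, under the `system.billing` schema. A sketch of a cost estimate query; column and SKU name patterns should be verified against the system-tables documentation, and the price-validity window of `list_prices` is ignored here for brevity:

```sql
-- Hypothetical sketch: sum serverless DBU usage and multiply by list
-- price. Verify column names and SKU naming against the current
-- system.billing docs before relying on the numbers.
SELECT u.usage_date,
       u.sku_name,
       SUM(u.usage_quantity * p.pricing.default) AS estimated_cost
FROM system.billing.usage u
JOIN system.billing.list_prices p
  ON u.sku_name = p.sku_name
WHERE u.sku_name LIKE '%SERVERLESS%'
GROUP BY u.usage_date, u.sku_name
ORDER BY u.usage_date;
```

Comparing against classic compute is then a matter of running the same aggregation without the SKU filter and grouping by `sku_name`.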
Avinash_Narala
by Contributor
  • 28 Views
  • 0 replies
  • 0 kudos

File Trigger VS Autoloader

Hi, I recently came across File Trigger in Databricks and find it mostly similar to Autoloader. My first question is: why use a file trigger when we already have Autoloader? In which scenarios should I go with file triggers, and in which with Autoloader? Can you please differentiate?

Avinash_Narala
by Contributor
  • 77 Views
  • 2 replies
  • 1 kudos

Custom Endpoints for AI functions In Databricks

Hi Community. Recently I went through the AI Functions and was amazed by the results. I just wanted to know whether we can use our custom endpoints (instead of Databricks foundation models) and leverage these AI Functions (ai_classify, ai_mask, etc.). https://...

Latest Reply
Kaniz_Fatma
Community Manager

Hi @Avinash_Narala, I'm glad to hear that you're excited about AI Functions in Databricks. The article provides more details on setting up web endpoints for use with Custom Commands in Azure. While this is specific to Azure, the general principles o...

1 More Reply
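On the custom-endpoint question: the task-specific helpers (ai_classify, ai_mask, etc.) are tied to Databricks-hosted models, but `ai_query` accepts the name of any model serving endpoint, including custom ones. A sketch; the endpoint and table names below are made up:

```sql
-- Hypothetical sketch: ai_query against a custom serving endpoint.
-- 'my-custom-endpoint' and the reviews table are examples, not real names.
SELECT ai_query(
         'my-custom-endpoint',
         'Classify the sentiment of this review: ' || review_text
       ) AS sentiment
FROM reviews;
```

So the usual route to "custom endpoints with AI Functions" is `ai_query`; check the current docs for which endpoint types it supports.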
rimaissa
by New Contributor
  • 96 Views
  • 2 replies
  • 0 kudos

Autoloader file notification mode error using UC

We have a DLT pipeline we've created that is using autoloader file notification mode. The pipeline ran fine before moving it to UC. Now that we're using UC, we are getting an AWS permissions issue when the autoloader file notification mode is set to ...

Latest Reply
Kaniz_Fatma
Community Manager

Hi @rimaissa, ensure that the user or service principal running the DLT pipeline has the necessary permissions to access the S3 bucket and set up the required cloud resources (SNS, SQS) in the Unity Catalog context. This may require additional permi...

1 More Reply
RamaTeja
by New Contributor II
  • 9659 Views
  • 12 replies
  • 4 kudos

In Azure Databricks Delta Lake not able to see unity catalog databases or tables in the drop down.

I have created an Azure Data Factory pipeline with a copy data activity to copy data from an ADLS path to a Delta table. In the Delta table dropdowns I am able to see only the hive metastore databases and tables; the Unity Catalog tables are not...

Latest Reply
StdyFriend1
New Contributor II

Hi all, please try the following syntax in the database field: `<catalog_name>`.`<database_name>`. It is important that you use a backtick (`) and not a single quote ('). This should force the connection to use a Unity Catalog database.

11 More Replies