Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

labromb
by Databricks Partner
  • 16612 Views
  • 10 replies
  • 4 kudos

How to pass configuration values to a Delta Live Tables job through the Delta Live Tables API

Hi Community, I have successfully run a job through the API, but I need to be able to pass parameters (configuration) to the DLT workflow via the API. I have tried passing JSON in this format: { "full_refresh": "true", "configuration": [ ...

Latest Reply
Edthehead
Contributor III
  • 4 kudos

You cannot pass parameters from a Databricks job to a DLT pipeline. At least not yet. You can see from the DLT REST API that there is no option for it to accept any parameters. But there is a workaround. With the assumption tha...
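The workaround hinted at above can be sketched as follows: before starting an update, rewrite the pipeline's `configuration` block via the Pipelines REST API, then read the values back with `spark.conf.get` inside the DLT code. A minimal sketch, with a hypothetical host and pipeline id:

```python
# Hypothetical workspace URL -- replace with your own.
HOST = "https://<workspace>.cloud.databricks.com"

def build_pipeline_update(pipeline_id, spec, params):
    """Merge job-supplied parameters into the pipeline spec's
    `configuration` block; returns the URL and JSON body for
    PUT /api/2.0/pipelines/{pipeline_id}."""
    body = dict(spec)
    body["configuration"] = {**spec.get("configuration", {}), **params}
    return f"{HOST}/api/2.0/pipelines/{pipeline_id}", body

url, body = build_pipeline_update(
    "1234-abcd",  # hypothetical pipeline id
    {"name": "my_dlt", "configuration": {"env": "dev"}},
    {"run_date": "2025-01-01"},
)
# PUT `body` to `url`, then trigger the update; inside the pipeline,
# read the value with spark.conf.get("run_date").
```

The job task that "passes" the parameter is then just a notebook or script step that performs this PUT before the pipeline task runs.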

9 More Replies
ashraf1395
by Honored Contributor
  • 1792 Views
  • 1 reply
  • 0 kudos

Resolved! Column level and table level tagging in dlt pipeline

Can I set column-level and table-level tags programmatically in a DLT pipeline? I tried the normal way using spark.sql(f"alter table and set tags (key=value)"), a syntax I found in one of the Databricks Community posts. But can we do...

Latest Reply
julie598doyle
New Contributor III
  • 0 kudos

@ashraf1395 wrote:Can i set column level and table level tags programmatically in a dlt pipeline. I tried it the normal way using spark.sql(f"alter table and set tags (key=value)") using this syntax I found it out in one of the databricks community p...
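For reference, Unity Catalog exposes table- and column-level tags through `ALTER TABLE ... SET TAGS` and `ALTER TABLE ... ALTER COLUMN ... SET TAGS`. A small sketch that builds those statements for use with `spark.sql` (helper names are illustrative; whether they can run from inside a live DLT update, rather than in a step after the pipeline completes, should be verified for your setup):

```python
def table_tag_sql(table, tags):
    # Unity Catalog table-level tags: ALTER TABLE ... SET TAGS (...)
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER TABLE {table} SET TAGS ({pairs})"

def column_tag_sql(table, column, tags):
    # Column-level tags: ALTER TABLE ... ALTER COLUMN ... SET TAGS (...)
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER TABLE {table} ALTER COLUMN {column} SET TAGS ({pairs})"

# e.g. spark.sql(table_tag_sql("main.sales.orders", {"domain": "finance"}))
```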

oishimbo
by New Contributor
  • 20899 Views
  • 3 replies
  • 1 kudos

Databricks time travel - how to get ALL changes ever done to a table

Hi time travel gurus, I am investigating creating a reporting solution with an AsOf functionality. Users will be able to create a report based on the current data or on the data AsOf some time ago. Due to the nature of our data this AsOf feature is qu...

Latest Reply
mcveyroosevelt
New Contributor III
  • 1 kudos

In Databricks, you can use time travel to access historical versions of a table using the versionAsOf or timestampAsOf options in a SELECT query. To retrieve all changes made to a table, you would typically query the table's historical versions, spec...
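The approach in the reply can be sketched as: list every commit with `DESCRIBE HISTORY`, then read each snapshot with `VERSION AS OF`. A small query-building sketch (table name and version count are illustrative):

```python
def change_history_queries(table, latest_version):
    """DESCRIBE HISTORY lists every commit; VERSION AS OF reads each snapshot."""
    history = f"DESCRIBE HISTORY {table}"
    snapshots = [
        f"SELECT * FROM {table} VERSION AS OF {v}"
        for v in range(latest_version + 1)
    ]
    return history, snapshots

history_q, snapshot_qs = change_history_queries("my_catalog.my_schema.sales", 2)
```

Note that if the table has Change Data Feed enabled (`delta.enableChangeDataFeed = true`), `SELECT * FROM table_changes('my_catalog.my_schema.sales', 0)` returns row-level changes directly, which is usually cheaper than diffing full snapshots.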

2 More Replies
ashap551
by New Contributor II
  • 4248 Views
  • 2 replies
  • 0 kudos

JDBC Connection to NetSuite SuiteAnalytics Using Token-Based-Authentication (TBA)

I'm trying to connect to NetSuite2.com using PySpark from a Databricks notebook utilizing a JDBC driver. I was successful in setting up my DBVisualizer connection by installing the JDBC Driver (JAR) and generating the password with the one-time hashin...

Latest Reply
alicerichard65
New Contributor II
  • 0 kudos

It seems like the issue might be related to the password generation or the JDBC URL configuration. Here are a few things you can check: 1. Password Generation: Ensure that the generate_tba_password function is correctly implemented ...
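On the password-generation check, NetSuite's TBA scheme derives the JDBC "password" by HMAC-SHA256-signing a nonce/timestamp message with the consumer and token secrets. The exact field order and formatting are defined in NetSuite's SuiteAnalytics Connect documentation; the sketch below only illustrates the general shape and must be verified against the driver docs before use:

```python
import base64
import hashlib
import hmac
import secrets
import time

def tba_password(account, consumer_key, token_id, consumer_secret, token_secret):
    # NOTE: the field order and separators here are an assumption --
    # verify against NetSuite's SuiteAnalytics Connect documentation
    # for your driver version.
    nonce = secrets.token_hex(16)
    timestamp = str(int(time.time()))
    message = "&".join([account, consumer_key, token_id, nonce, timestamp])
    key = f"{consumer_secret}&{token_secret}".encode()
    digest = hmac.new(key, message.encode(), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode()
    # The JDBC password combines the signed message, the signature,
    # and the algorithm name.
    return f"{message}&{signature}&HMAC-SHA256"
```

A common failure mode is a password built from a stale timestamp or a mismatched algorithm name, so logging the generated components (never the secrets) during debugging can help.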

1 More Replies
Sanjeev
by New Contributor II
  • 2121 Views
  • 1 reply
  • 0 kudos

Resolved! Triggering a Databricks job more than once daily

Hi Team, I have a requirement to trigger a Databricks job more than once daily, maybe twice or thrice. I have checked Workflows but couldn't find any option in the UI. Please advise.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @Sanjeev, you might want to try this option: please refer to the document below. You can use cron syntax to schedule jobs. https://docs.databricks.com/en/jobs/scheduled.html
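The cron option linked above supports Quartz expressions, so a job can fire at several fixed times per day. A sketch of the request body for the Jobs API update endpoint (the job id is a placeholder; the same expression can also be entered in the job's schedule UI):

```python
job_update = {
    "job_id": 123,  # hypothetical job id
    "new_settings": {
        "schedule": {
            # Quartz cron: seconds minutes hours day-of-month month day-of-week
            "quartz_cron_expression": "0 0 6,18 * * ?",  # fires at 06:00 and 18:00
            "timezone_id": "UTC",
            "pause_status": "UNPAUSED",
        }
    },
}
# POST this body to /api/2.1/jobs/update on your workspace.
```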

Pingleinferyx
by New Contributor
  • 2931 Views
  • 7 replies
  • 0 kudos

jdbc integration returning header as data for read operation

package com.example.databricks; import org.apache.spark.sql.Dataset; import org.apache.spark.sql.Row; import org.apache.spark.sql.SparkSession; public class DatabricksJDBCApp { public static void main(String[] args) { // Initialize Spark Ses...

Latest Reply
Dengineer
New Contributor II
  • 0 kudos

After reading through the Driver documentation I've finally found a solution that appears to work for me. I've added .option("UseNativeQuery", 0) to my JDBC connection. The query that was being passed from the Databricks Driver to the Databricks Clus...
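For context, the fix above can be expressed as a read-options sketch (the URL placeholder and helper name are illustrative): with `UseNativeQuery` set to 0, the Databricks JDBC driver translates the incoming SQL into its native dialect instead of passing it through verbatim, which is what stopped the header row being returned as data.

```python
def jdbc_options(url, query):
    # UseNativeQuery=0 asks the Databricks JDBC driver to translate the
    # incoming query rather than pass it through unchanged.
    return {
        "url": url,
        "query": query,
        "UseNativeQuery": "0",
    }

opts = jdbc_options(
    "jdbc:databricks://<host>:443/default",  # hypothetical URL
    "SELECT id, name FROM my_table",
)
# df = spark.read.format("jdbc").options(**opts).load()
```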

6 More Replies
Tmm35
by New Contributor II
  • 1819 Views
  • 1 reply
  • 0 kudos

Migrating from Snowflake to Databricks

How are you offloading data from Snowflake & repointing raw S3 staging dumps to Parquet/Delta?

Latest Reply
thelogicplus
Contributor II
  • 0 kudos

@Tmm35 The Travinto Technologies tool may help you. If you want to migrate ETL, database, and SQL workloads to Databricks (or elsewhere), their tool is very good.

OthmaneH
by New Contributor II
  • 3371 Views
  • 2 replies
  • 2 kudos

Migration Teradata to Databricks

We are currently working on a migration from Teradata to Databricks. I want to know how I can replace the Teradata connector in DataStage in order to send the data to Azure Data Lake using MFT. Thank you for your help.

Latest Reply
thelogicplus
Contributor II
  • 2 kudos

@OthmaneH The Travinto Technologies tool may help you. Using their tool, we have migrated more than 100 sources to Databricks, including ETL, database, and SQL workloads.

1 More Replies
Jay_rockstar
by New Contributor
  • 1685 Views
  • 1 reply
  • 0 kudos

Data Migration

I am looking for a data migration solution. I want to connect an SFTP server to Databricks on the new platform, without ADF in between. Is that possible?

Latest Reply
thelogicplus
Contributor II
  • 0 kudos

@Jay_rockstar Use the Travinto Technologies tool.

jeremy98
by Honored Contributor
  • 2324 Views
  • 3 replies
  • 0 kudos

using VSCode extension to interact with Databricks

Hello community, I want to understand whether it is possible to use Databricks Connect inside the VSCode IDE to interact with notebooks locally and interactively, like in a Databricks notebook. Is it possible? Because now I can only use the cluster and wait after t...

Latest Reply
jeremy98
Honored Contributor
  • 0 kudos

Does anyone know what the problem could be?

2 More Replies
meghana_tulla
by Databricks Partner
  • 1568 Views
  • 2 replies
  • 0 kudos

How to Set Expiration Time for Delta Sharing URL in Databricks Using Terraform?

 I am automating Delta Sharing from Databricks to non-Databricks recipients using Terraform. I can successfully create shares and recipients with my Terraform code, retrieve the sharing URL after creating the recipient, and see that the URL gets a de...

Latest Reply
jeremy98
Honored Contributor
  • 0 kudos

Hello, I did it yesterday through the account console (I don't know if you can do it using Terraform). If you are an admin at a higher level, you can go to that window and enable your metastore to set a token with an expiration date. I hope this answers your question.

1 More Replies
s3
by Databricks Partner
  • 1443 Views
  • 1 reply
  • 0 kudos

extracting attachments from outlook

Can we fetch attachments from Outlook in Databricks?

Latest Reply
Stefan-Koch
Databricks Partner
  • 0 kudos

Hi s3, you could use Microsoft Graph for that. Here is an example: https://learn.microsoft.com/en-us/answers/questions/1631663/using-graph-api-to-retrieve-email Another way I have always done this is through a Logic App. It is pretty easy to set up an...
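The Microsoft Graph route mentioned above boils down to calling the `/messages/{id}/attachments` endpoint with an OAuth bearer token (Mail.Read scope). A minimal URL-building sketch; the message id in the usage comment is hypothetical:

```python
def attachments_url(message_id, user_id=None):
    """Build the Graph URL for listing a message's attachments.
    GET it with an OAuth bearer token that has the Mail.Read scope."""
    base = "https://graph.microsoft.com/v1.0"
    who = f"users/{user_id}" if user_id else "me"
    return f"{base}/{who}/messages/{message_id}/attachments"

# Usage from a notebook (hypothetical message id and token):
# import requests
# resp = requests.get(attachments_url("AAMkAD..."),
#                     headers={"Authorization": f"Bearer {token}"})
# File attachments come back base64-encoded in contentBytes.
```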

Data_Analytics1
by Contributor III
  • 37646 Views
  • 10 replies
  • 10 kudos

Failure starting repl. How to resolve this error? I got this error in a job which is running.

Failure starting repl. Try detaching and re-attaching the notebook.java.lang.Exception: Python repl did not start in 30 seconds. at com.databricks.backend.daemon.driver.IpykernelUtils$.startIpyKernel(JupyterDriverLocal.scala:1442) at com.databricks.b...

Latest Reply
PabloCSD
Valued Contributor II
  • 10 kudos

I have had this problem many times. Today I made a copy of the cluster and it got "de-saturated"; this could help someone in the future.

9 More Replies
Shreyash_Gupta
by New Contributor III
  • 7655 Views
  • 1 reply
  • 1 kudos

Resolved! How do Databricks notebooks differ from traditional Jupyter notebooks

Can someone please explain the key differences between a Databricks notebook and a Jupyter notebook?

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

The key differences between a Databricks notebook and a Jupyter notebook are as follows: Integration and Collaboration: Databricks notebooks are integrated within the Databricks platform, providing a unified experience for data science and ma...

Harsha777
by New Contributor III
  • 2517 Views
  • 5 replies
  • 1 kudos

Resolved! Does column masking work with job clusters

Hi, we are trying to experiment with the column masking feature. Here is our use case: we have added a masking function to one of the columns of a table; the table is part of a notebook with some transformation logic; the notebook is executed as part of a w...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Hello, a shared cluster used by a job behaves the same as an all-purpose cluster: the cluster is available to any user with permissions on it. In a job there will not be many actions to perform, but when an action you are r...
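To make the setup above concrete: a Unity Catalog column mask is a SQL UDF attached with `ALTER TABLE ... ALTER COLUMN ... SET MASK`, and it is enforced at read time on compute that supports masks, which is why job clusters behave like all-purpose ones here. A sketch of the two statements (the group name and helper are illustrative):

```python
def mask_ddl(table, column, func_name):
    # 1) Define a masking UDF that only reveals the value to members
    #    of the 'admins' account group.
    create = (
        f"CREATE OR REPLACE FUNCTION {func_name}(v STRING) RETURN "
        "CASE WHEN is_account_group_member('admins') THEN v ELSE '***' END"
    )
    # 2) Attach the mask to the column; it then applies on every read.
    attach = f"ALTER TABLE {table} ALTER COLUMN {column} SET MASK {func_name}"
    return create, attach

create_stmt, attach_stmt = mask_ddl("cat.sch.t", "email", "mask_email")
# e.g. spark.sql(create_stmt); spark.sql(attach_stmt)
```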

4 More Replies