Hello Community, I am currently working on populating gold layer tables. The source for these gold layer tables is the silver layer tables. A query runs against the silver layer tables; the Spark SQL query contains joins between multiple tables, e.g. select colum...
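A minimal sketch of the silver-to-gold pattern described above, using sqlite3 as a stand-in for Spark SQL so it runs anywhere; the table and column names are hypothetical, and in Databricks the same join would run via `spark.sql(...)` against silver Delta tables:

```python
import sqlite3

# Stand-in silver tables; in Databricks these would be silver Delta tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE silver_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
CREATE TABLE silver_customers (customer_id INTEGER, name TEXT);
INSERT INTO silver_orders VALUES (1, 10, 99.5), (2, 11, 15.0);
INSERT INTO silver_customers VALUES (10, 'Alice'), (11, 'Bob');

-- Gold table materialized from a join over silver tables.
CREATE TABLE gold_customer_orders AS
SELECT o.order_id, c.name, o.amount
FROM silver_orders o
JOIN silver_customers c ON o.customer_id = c.customer_id;
""")
rows = conn.execute(
    "SELECT order_id, name, amount FROM gold_customer_orders ORDER BY order_id"
).fetchall()
print(rows)
```

In a Databricks job you would typically wrap the same SELECT in a `CREATE OR REPLACE TABLE gold.<name> AS ...` or write the joined DataFrame out with `saveAsTable`.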
Hi @bodempudi venkat​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you.T...
I have a Docker image for Debezium in my ECR repo and added IAM roles from Databricks to pull this image for my cluster, but I am seeing this error when the cluster is created.
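For reference, a cluster spec for pulling a custom image from ECR usually looks like the sketch below; the account ID, region, pool of values, and instance-profile name are placeholders, and with ECR the pull is authenticated through the instance profile rather than `basic_auth`, so the profile's IAM role must grant ECR read permissions:

```json
{
  "cluster_name": "debezium-cluster",
  "spark_version": "11.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "num_workers": 2,
  "aws_attributes": {
    "instance_profile_arn": "arn:aws:iam::123456789012:instance-profile/ecr-pull-role"
  },
  "docker_image": {
    "url": "123456789012.dkr.ecr.us-east-1.amazonaws.com/debezium:latest"
  }
}
```

If the error mentions authentication, checking that the instance profile's role has `ecr:GetAuthorizationToken`, `ecr:BatchGetImage`, and `ecr:GetDownloadUrlForLayer` is a common first step.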
Can you please help with the technical pros and cons of running a SQL query in a Databricks notebook (Data Engineering workspace) versus in the serverless SQL warehouse SQL editor?
Notebook
PROS: More traditional cluster, Git integration, choice of DBR version
CONS: Cluster startup time; Photon not automatically part of the cluster

Serverless
PROS: Faster, almost immediate startup time; less expensive for a single query; Photon enable...
I have a table with latitude and longitude for a few addresses (no more than 10 at the moment), but when I select the appropriate columns in the visualization editor for Map (Markers) I get a message that states "error while rendering visualization"....
Hi Databricks Community,I ran into the following issue when setting up a new cluster with the latest LTS Databricks runtime (11.3). When trying to install the package with the coordinates com.microsoft.azure.kusto:kusto-spark_3.0_2.12:3.1.4 from Mave...
A customer is trying to generate a Databricks token for a service principal (SP). They've created the SP in Azure AD and have used the Databricks REST API to add it as an admin. When using the Databricks REST API /api/2.0/token-management/on-behalf-of...
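For comparison, the on-behalf-of request is typically a POST to the token-management endpoint with the SP's application ID in the body. The sketch below only builds the request without sending it; the host and application ID are placeholders, and the field names follow the token-management API as I understand it:

```python
import json

def build_obo_token_request(host, application_id, lifetime_seconds, comment):
    """Build (but do not send) an on-behalf-of token request.

    host and application_id below are placeholder values; in practice
    application_id is the SP's Azure AD application (client) ID.
    """
    url = f"{host}/api/2.0/token-management/on-behalf-of/tokens"
    body = json.dumps({
        "application_id": application_id,
        "lifetime_seconds": lifetime_seconds,
        "comment": comment,
    })
    return url, body

url, body = build_obo_token_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",
    "00000000-0000-0000-0000-000000000000",
    3600,
    "sp token",
)
print(url)
```

The call itself must be made with an admin's credentials (or an AAD token for the workspace); a 403 at this point usually means the caller is not a workspace admin.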
I have a Databricks workflow where the first task sets up task parameters for other notebooks to use later in the process. Since these variables are used in all of my notebooks, I have opted to assign them in a shared notebook and call ...
We have built multiple tables and views under Databricks SQL and are unable to figure out how to take this code and deploy it to our higher environments. We need some guidance, as we're unable to find any information in the documentation we've searched.
Hi, I agree with @Josef Prakljacic​. If Databricks would like to compete with SQL DWH/Synapse or Snowflake and target DWH users, it should prepare some guidelines on how to manage "database" objects. Yea, @Werner Stinckens​, with an engineering workspace and py...
Hi, I'm facing an issue when writing to a Salesforce object. I'm using the springml/spark-salesforce library and have the above libraries installed, as recommended based on my research. I try to write like this: (_sqldf .write .format("com.springml.spar...
I'm trying to implement an incremental ingestion logic in the following way:
- database tables have a DbUpdatedDate column
- during the initial load I perform a full copy of the database table
- during the incremental load I:
  - scan the data already in the DLT to see what...
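The watermark idea in the steps above can be sketched in plain Python; the row shapes and values here are fabricated, and in a real pipeline the "tables" would be Delta/DLT tables with the filter expressed as a WHERE clause on DbUpdatedDate:

```python
from datetime import datetime

# Rows already ingested into the target (e.g. the DLT table).
target = [
    {"id": 1, "DbUpdatedDate": datetime(2023, 1, 1)},
    {"id": 2, "DbUpdatedDate": datetime(2023, 1, 5)},
]
# Rows currently in the source database table.
source = [
    {"id": 2, "DbUpdatedDate": datetime(2023, 1, 7)},  # updated row
    {"id": 3, "DbUpdatedDate": datetime(2023, 1, 8)},  # new row
    {"id": 1, "DbUpdatedDate": datetime(2023, 1, 1)},  # unchanged row
]

# 1. Find the high-water mark already ingested.
watermark = max(r["DbUpdatedDate"] for r in target)

# 2. Pull only rows updated after the watermark.
increment = [r for r in source if r["DbUpdatedDate"] > watermark]
print([r["id"] for r in increment])
```

Note this only selects the increment; merging it into the target (updates vs. inserts) is the part that `apply_changes` handles for you in DLT.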
Hi @Chris Nawara​, I had the same issue you had. I was trying to avoid apply_changes, but in the end I implemented it and I'm happier than I expected hehe. And if you have any additional standardization columns that you need to implement, you can...
Hi @Mohammad Saber​, I think you first need to configure audit logging in Databricks, and then you can use it. Please refer to the blog below, which will help you with this: Configure audit logging | Databricks on AWS
Hi Team, I need assistance understanding the Databricks workspace service principal token expiry calculation. Issue: when I create a token I set lifetime = 3600, but when I get the token I see an unexpected expiry number, and even when I ...
Hi Team, please help with my issue. Is there any way to find the expiry of a token, i.e. how much time the token still has before it expires? creation_time - expiry_time is not giving me the exact output. Kindly let me know if there is any way, as soon as possible. T...
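A sketch of the calculation asked about above. Assuming the token-management API reports creation_time and expiry_time as epoch milliseconds (worth verifying against your own response), the time remaining is expiry_time minus the current time, not expiry_time minus creation_time:

```python
import time

def seconds_until_expiry(expiry_time_ms, now_ms=None):
    """Seconds of lifetime left on a token.

    expiry_time_ms: the API's expiry_time field, in epoch milliseconds
    (an assumption to verify); -1 conventionally means "never expires".
    """
    if expiry_time_ms == -1:
        return None  # token never expires
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return (expiry_time_ms - now_ms) / 1000.0

# Example with fabricated timestamps: a token expiring 3600 s from "now".
now = 1_700_000_000_000
print(seconds_until_expiry(now + 3_600_000, now_ms=now))
```

With real data you would pass only the API's expiry_time value and let the function read the current clock.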
When we run a Databricks job, it takes some time for the job cluster to become active. I also created a pool and attached it to the job cluster, but it still takes time for the cluster to attach and for the job cluster to become active so the job run can start. Is there any way we can run d...
If you want instant processing, you will have to have a cluster running all the time. As mentioned above, Databricks is testing serverless compute for data engineering workloads (comparable to serverless SQL). This fires up a cluster in a few seconds...
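Until then, the main lever with pools is keeping warm instances idle. A job-cluster fragment along these lines (the pool ID is a placeholder) attaches to pre-provisioned VMs, and setting `min_idle_instances` above zero on the pool itself is what actually removes the VM provisioning wait:

```json
{
  "new_cluster": {
    "spark_version": "11.3.x-scala2.12",
    "instance_pool_id": "0101-120000-pool1-pool-abcdefgh",
    "num_workers": 2
  }
}
```

Note that pools skip the cloud VM startup but not the Spark/DBR startup on those VMs, which is why pooled job clusters still take some time to become active.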