Data Engineering

Forum Posts

Navashakthi
by New Contributor
  • 822 Views
  • 4 replies
  • 2 kudos

Resolved! Community Edition Sign-up Issue

Hi, I'm trying to sign up for Community Edition for learning purposes. The sign-up page has an issue with selecting a country: the dropdown doesn't work, and the Continue option redirects back to the same page, so I couldn't complete the sign-up. Kindly help!

Latest Reply
amitdas2k6
New Contributor II
  • 2 kudos

For me it always displays the error below, even though I entered the correct username and password. My username: amit.das2k16@gmail.com. The error: "Invalid email address or password. Note: Emails/usernames are case-sensitive."

3 More Replies
Shadowsong27
by New Contributor III
  • 8310 Views
  • 14 replies
  • 4 kudos

Resolved! Mongo Spark Connector 3.0.1 seems not working with Databricks-Connect, but works fine in Databricks Cloud

On the latest DB-Connect==9.1.3 and dbr == 9.1, retrieving data from Mongo using the Maven coordinate of the Mongo Spark Connector (org.mongodb.spark:mongo-spark-connector_2.12:3.0.1 - https://docs.mongodb.com/spark-connector/current/) was working fine previously t...

Latest Reply
mehdi3x
New Contributor II
  • 4 kudos

Hi everyone, the solution for me was to replace spark.read.format("mongo") with spark.read.format("mongodb"). My Spark version is 3.3.2 and my MongoDB version is 6.0.6.
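
A minimal sketch of the fix described in this reply, assuming the MongoDB Spark Connector 10.x is installed on the cluster (the "mongodb" source name belongs to the 10.x connector); the connection URI, database, and collection names are hypothetical placeholders.

```python
# Sketch of the fix above (assumes MongoDB Spark Connector 10.x on the cluster).
# The URI, database, and collection names are hypothetical placeholders.
df = (
    spark.read.format("mongodb")   # "mongodb" replaces the older "mongo" source name
    .option("connection.uri", "mongodb+srv://user:pass@cluster0.example.net")
    .option("database", "sales")
    .option("collection", "orders")
    .load()
)
df.printSchema()
```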

13 More Replies
erigaud
by Honored Contributor
  • 1079 Views
  • 4 replies
  • 1 kudos

Deploying existing queries and alerts to other workspaces

I have several queries and associated alerts in a workspace, and I would like to be able to deploy them to another workspace, for example a higher environment. Since neither of these objects is supported in Repos, what is the way to go to easi...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @erigaud, we haven't heard from you since the last response from @btafur, and I was checking back to see if their suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to others. Also, ...

3 More Replies
dprutean
by New Contributor III
  • 382 Views
  • 0 replies
  • 0 kudos

JDBC DatabaseMetaData.getCatalogs()

Calling DatabaseMetaData.getCatalogs() returns 'spark_catalog' instead of 'hive_metastore' when connected to a traditional Databricks cluster that is not assigned the uc_catalog tag. Please check this.
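
Not part of the report above, but a quick way to cross-check what the cluster actually exposes is to list catalogs from a notebook with Spark SQL (assuming a runtime where SHOW CATALOGS is available) and compare the result with what JDBC reports:

```python
# Cross-check from a notebook (Spark SQL, not the JDBC metadata API):
# compare this list with what java.sql.DatabaseMetaData.getCatalogs() returns.
spark.sql("SHOW CATALOGS").show(truncate=False)
```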

VD10
by New Contributor
  • 674 Views
  • 1 reply
  • 0 kudos

Data Engineering Professional Certificate

On the way to obtaining the certificate. Any preparation tips would be appreciated! Thanks!

Latest Reply
dplante
Contributor II
  • 0 kudos

Disclaimer: I haven't taken this exam yet. A couple of suggestions (from this forum, Google searches, etc.): check out this blog post - https://medium.com/@sjrusso/passing-the-databricks-professional-data-engineer-exam-115cccc90aba#:~:text=I%20recent...

KKo
by Contributor III
  • 7328 Views
  • 4 replies
  • 2 kudos

Resolved! Union Multiple dataframes in loop, with different schema

Within a loop I have a few dataframes created. I can union them without an issue if they have the same schema, using df_unioned = reduce(DataFrame.unionAll, df_list). Now my problem is how to union them if one of the dataframes in df_list has a different nu...

Latest Reply
anoopunni
New Contributor II
  • 2 kudos

Hi, I have come across the same scenario. Using reduce() and unionByName we can implement the solution as below: val lstDF: List[DataFrame] = List(df1, df2, df3, df4, df5); val combinedDF = lstDF.reduce((df1, df2) => df1.unionByName(df2, allowMissingColumns = tru...
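
Since the original question is in PySpark, here is a minimal PySpark sketch of the same idea (my own illustration, not taken from the thread): unionByName with allowMissingColumns=True (Spark 3.1+) fills columns missing from either dataframe with nulls.

```python
# PySpark sketch of the approach above (illustration, not from the thread).
from functools import reduce

df1 = spark.createDataFrame([(1, "a")], ["id", "name"])
df2 = spark.createDataFrame([(2, "b", 3.5)], ["id", "name", "score"])  # extra column

df_list = [df1, df2]
df_unioned = reduce(
    lambda left, right: left.unionByName(right, allowMissingColumns=True),
    df_list,
)
df_unioned.show()  # rows from df1 get score = null
```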

3 More Replies
VikashKumar
by New Contributor
  • 2427 Views
  • 0 replies
  • 0 kudos

Is there any way to convert delta share short-lived presigned URLs to CSV files at Client End

Hello all, I have a requirement where I need to expose the data at the client end, and the client is supposed to access the data in CSV format. I am planning to use Delta Sharing integrated with Unity Catalog. As we know, according to the Delta Sharing protoc...
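
One way to achieve this on the recipient side (a sketch of my own, not from the thread) is the open-source delta-sharing Python client, which resolves the short-lived presigned URLs internally; the profile file path and share/schema/table names below are hypothetical.

```python
# Sketch (not from the thread): read a shared table with the open-source
# delta-sharing client and write it out as CSV on the recipient side.
# pip install delta-sharing
import delta_sharing

profile = "/path/to/config.share"                      # credential file issued by the provider
table_url = f"{profile}#my_share.my_schema.my_table"   # <profile>#<share>.<schema>.<table>

pdf = delta_sharing.load_as_pandas(table_url)          # presigned URLs are handled internally
pdf.to_csv("my_table.csv", index=False)
```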

180122
by New Contributor II
  • 1122 Views
  • 3 replies
  • 1 kudos

Data Engineering Professional - Practice exam?

Hi, when will we get practice exams for the Data Engineering Professional certification exam? It seems like we already have them for a good number of the associate exams, and this Professional exam seems more difficult than the associate ones, so ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @180122, hope you are well. Just wanted to see if you were able to find an answer to your question, and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!

2 More Replies
bradleyjamrozik
by New Contributor III
  • 1448 Views
  • 3 replies
  • 3 kudos

Resolved! Questions about Lineage and DLT

Hey there!
1. Does column lineage work across multiple catalogs and schemas?
2. Do Delta Live Tables support lineage? If yes, does that work across multiple pipelines or only within a single one?

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @bradleyjamrozik, we haven't heard from you since the last response from @Vinay_M_R and @erigaud, and I was checking back to see if their suggestions helped you. If you have found a solution, please share it with the community, as it can be he...

2 More Replies
YS1
by New Contributor III
  • 898 Views
  • 3 replies
  • 1 kudos

Updating tables from SQL Server to Databricks

Hi, I have SQL Server tables which are the primary location where all live transactions happen. Currently I read them through PySpark as dataframes and overwrite them every day to keep the latest copy of them in Databricks. The problem is it takes lon...
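
For context, a minimal sketch of the daily full-refresh pattern described here (my own illustration; the host, database, table, secret scope, and target names are hypothetical):

```python
# Sketch of the full-refresh pattern described above (illustration only;
# host, database, table, secret scope, and target names are hypothetical).
jdbc_url = "jdbc:sqlserver://myserver.example.com:1433;database=sales"

src = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.orders")
    .option("user", dbutils.secrets.get("my_scope", "sql_user"))
    .option("password", dbutils.secrets.get("my_scope", "sql_password"))
    .load()
)

# Full overwrite of the Delta copy -- the step that gets slower as the tables grow.
src.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```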

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @YS1, hope you are well. Just wanted to see if you were able to find an answer to your question, and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!

2 More Replies
samuraidjakk
by New Contributor II
  • 895 Views
  • 2 replies
  • 1 kudos

Resolved! Lineage from Unity Catalog on GCP

We are in the process of trying to do a PoC of our pipelines using DLT. Normally we use another tool, and we have created a custom program to extract lineage. We want to try to get / display lineage using Unity Catalog. But we are on GCP, and it see...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @samuraidjakk  Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!

1 More Replies
SRK
by Contributor III
  • 3082 Views
  • 6 replies
  • 5 kudos

Resolved! How to deploy Databricks SQL queries and SQL Alerts from lower environment to higher environment?

We are using Databricks SQL Alerts to handle one scenario. We have written the queries for it and also created the SQL Alert. However, I was looking for the best way to deploy it to higher environments like Pre-Production and Production. I ...

Latest Reply
valeryuaba
New Contributor III
  • 5 kudos

Thanks!

5 More Replies
erigaud
by Honored Contributor
  • 4129 Views
  • 7 replies
  • 4 kudos

Resolved! Autoloader Excel Files

Hello, I looked at the documentation but could not find what I wanted. Is there a way to load Excel files using Auto Loader, and if yes, what options should be given to specify the format, sheet name, etc.? Thank you, friends!

Latest Reply
Hemant
Valued Contributor II
  • 4 kudos

Unfortunately, Databricks Auto Loader doesn't support the Excel file type for incrementally loading new files. Link: https://docs.databricks.com/ingestion/auto-loader/options.html If your Excel file contains a single sheet, there is a workaround (see the sketch below).
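
The reply does not spell the workaround out; one commonly used approach (an assumption on my part, not confirmed by the reply) is to ingest the files incrementally with Auto Loader's binaryFile format and parse the single sheet with pandas in a UDF. The paths, table name, and string-typed output schema below are hypothetical.

```python
# Sketch of one possible single-sheet workaround (assumption, not from the reply).
# Requires openpyxl for .xlsx parsing; paths and table names are hypothetical.
import io
import pandas as pd
from pyspark.sql.functions import col, explode, udf
from pyspark.sql.types import ArrayType, MapType, StringType

@udf(ArrayType(MapType(StringType(), StringType())))
def parse_excel(content):
    # Parse the single sheet from the raw bytes and return rows as string maps.
    pdf = pd.read_excel(io.BytesIO(content), dtype=str).fillna("")
    return [row.to_dict() for _, row in pdf.iterrows()]

raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")   # Auto Loader tracks new files
    .option("pathGlobFilter", "*.xlsx")
    .load("/mnt/landing/excel/")                 # hypothetical landing path
)

parsed = raw.select("path", explode(parse_excel(col("content"))).alias("row"))

(
    parsed.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/excel/")  # hypothetical path
    .toTable("bronze.excel_rows")                             # hypothetical target table
)
```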

6 More Replies
sumit23
by New Contributor
  • 738 Views
  • 0 replies
  • 0 kudos

[Error] [SECRET_FUNCTION_INVALID_LOCATION]: While running secret function with create or replace

Hi, recently we upgraded our Databricks warehouse, transitioning from SQL Classic to SQL Pro. However, we started encountering the following error message when attempting to execute the "CREATE OR REPLACE" table query with the secret functio...
