Data Engineering

Forum Posts

Soma
by Valued Contributor
  • 1794 Views
  • 5 replies
  • 3 kudos

Resolved! Unable to create Key Vault secrets scope with NPIP Workspace

Hi Team, for a secure connection we created a secured cluster with NPIP (https://learn.microsoft.com/en-us/azure/databricks/security/secure-cluster-connectivity) workspace hosted in a private VNet. We had a hub VNet with a private endpoint for Key Vault. We pe...

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @somanath Sankaran, We haven’t heard from you since the last response from @Hubert Dudek, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others....

4 More Replies
db-avengers2rul
by Contributor II
  • 5024 Views
  • 2 replies
  • 6 kudos

Resolved! AttributeError: 'list' object has no attribute 'columns' - PySpark

Hi All, I am getting the below error when ingesting data from the source file (also attached). I have tried in both Community Edition and Azure Databricks and get the same error. Can anyone suggest a solution? # ...

Latest Reply
Kaniz
Community Manager
  • 6 kudos

Hi @Rakesh Reddy Gopidi, We haven’t heard from you since the last response from me, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others. Otherwise...

1 More Replies
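The `AttributeError: 'list' object has no attribute 'columns'` in the thread above usually means the variable holds a plain Python list (for example a literal list of rows, or the result of `collect()`) rather than a DataFrame; `.columns` only exists on DataFrames. A minimal sketch in plain Python (no Spark session; the row data is hypothetical) reproduces the failure and notes the usual fix:

```python
# Minimal sketch: a plain list has no `.columns` attribute, so code that
# treats it like a DataFrame raises exactly this AttributeError.
rows = [("alice", 1), ("bob", 2)]  # hypothetical source rows

try:
    rows.columns  # this is effectively what the failing code does
except AttributeError as exc:
    print(exc)  # 'list' object has no attribute 'columns'

# In PySpark the usual fix is to build a DataFrame first (not run here):
# df = spark.createDataFrame(rows, ["name", "id"])
# df.columns  # -> ['name', 'id']
```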
parulpaul
by New Contributor III
  • 1355 Views
  • 5 replies
  • 7 kudos
Latest Reply
parulpaul
New Contributor III
  • 7 kudos

No solution found

4 More Replies
Ken1
by New Contributor III
  • 913 Views
  • 4 replies
  • 7 kudos

PySpark Error in Azure Databricks

I have this error - com.databricks.WorkflowException: com.databricks.NotebookExecutionException: FAILED - when I ran dbutils.notebook.run(.......)

Latest Reply
Kaniz
Community Manager
  • 7 kudos

Hi @Godswill Mbata, We haven’t heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to othe...

3 More Replies
DataRabbit
by New Contributor II
  • 7021 Views
  • 2 replies
  • 0 kudos

Resolved! py4j.security.Py4JSecurityException: Constructor public org.apache.spark.ml.feature.VectorAssembler(java.lang.String) is not whitelisted.

Hello, I have a problem. When I try to run the MLlib VectorAssembler (from pyspark.ml.feature import VectorAssembler) I get this error and I don't know what to do anymore. Please help.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

Is it a High Concurrency cluster with credential passthrough enabled? In that case, you can use a different cluster mode: https://docs.azuredatabricks.net/spark/latest/data-sources/azure/adls-passthrough.html This exception is thrown when you have access...

1 More Replies
jm99
by New Contributor III
  • 1492 Views
  • 2 replies
  • 3 kudos

Ingesting Kafka Avro into a Delta STREAMING LIVE TABLE

Using Azure Databricks: I can create a DLT table in Python using import dlt import pyspark.sql.functions as fn from pyspark.sql.types import StringType @dlt.table( name = "<<landingTable>>", path = "<<storage path>>", comment = "<< descri...

Latest Reply
lninza
New Contributor II
  • 3 kudos

Hi @John Mathews, did you find a way to progress here? I am stuck at the same point...

1 More Replies
jon1
by New Contributor II
  • 487 Views
  • 1 reply
  • 0 kudos

How to dedupe a source table prior to merge through JDBC SQL driver integration

Hi! We're working with change event data from relational and NoSQL databases, then processing and ingesting it into Databricks. It's streamed from source to our messaging platform, and our connector then pushes it to Databricks. Right now we're doing th...

Latest Reply
jon1
New Contributor II
  • 0 kudos

Update on the theory we are looking at. It'd be similar to the below (with necessary changes to support best practices for MERGE, such as reducing the search space): -- View for deduping pre-merge CREATE OR REPLACE TEMPORARY VIEW {view} AS SELECT * EXCEPT ...

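The pre-merge dedupe view sketched in the reply above keeps only the latest change event per primary key. The same idea can be illustrated in plain Python (field names `pk` and `seq` are hypothetical stand-ins for the primary key and a change-sequence number), mirroring a `ROW_NUMBER() OVER (PARTITION BY pk ORDER BY seq DESC) = 1` filter:

```python
# Hedged sketch: for each primary key, keep only the event with the highest
# sequence number, so downstream MERGE sees at most one row per key.
def dedupe_latest(events):
    latest = {}
    for e in events:
        pk, seq = e["pk"], e["seq"]
        if pk not in latest or seq > latest[pk]["seq"]:
            latest[pk] = e
    return list(latest.values())

events = [
    {"pk": 1, "seq": 1, "val": "a"},
    {"pk": 1, "seq": 3, "val": "c"},  # supersedes seq 1 for pk 1
    {"pk": 2, "seq": 2, "val": "b"},
]
print(dedupe_latest(events))
```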
KKo
by Contributor III
  • 2330 Views
  • 3 replies
  • 7 kudos

Incompatible format detected while writing in Parquet format.

I am writing/reading data from Azure Databricks to the data lake. I wrote a dataframe to a path in Delta format using the query below; later I realized that I need the data in Parquet format, so I went to the storage account and manually deleted the filepat...

Latest Reply
Kaniz
Community Manager
  • 7 kudos

Hi @Kris Koirala, We haven’t heard from you since the last response from @Jose Gonzalez, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others. Ot...

2 More Replies
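The "Incompatible format detected" error above typically occurs because the original Delta write left a `_delta_log` directory at the path; Spark then treats the path as a Delta table and refuses a plain Parquet write even after the data files were deleted by hand. A minimal sketch using the local filesystem as a stand-in for the storage account (all paths hypothetical) shows the cleanup idea:

```python
# Hedged sketch: a path is treated as a Delta table while it still contains
# a `_delta_log` directory. Removing that leftover directory (along with the
# old data files) lets a fresh Parquet write to the path succeed.
import os
import shutil
import tempfile

table_path = tempfile.mkdtemp()  # stand-in for the storage-account path
os.makedirs(os.path.join(table_path, "_delta_log"), exist_ok=True)  # leftover log

def clear_delta_remnants(path):
    log_dir = os.path.join(path, "_delta_log")
    if os.path.isdir(log_dir):
        shutil.rmtree(log_dir)

clear_delta_remnants(table_path)
print(os.path.isdir(os.path.join(table_path, "_delta_log")))  # False
```

On Databricks the equivalent cleanup would use `dbutils.fs.rm(path, recurse=True)` against the storage path before rewriting in Parquet.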
Raagavi
by New Contributor
  • 1445 Views
  • 2 replies
  • 1 kudos

Is there a way to read the CSV files automatically from on-premises network locations and write back to the same from Databricks?


Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Raagavi Rajagopal, We haven’t heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to ot...

1 More Replies
NickMendes
by New Contributor III
  • 897 Views
  • 3 replies
  • 2 kudos

Resolved! Alert e-mail is not recognizing my html text

I've always used alert e-mail notifications with my custom message, written in HTML. The problem is that today it suddenly stopped working and the alert e-mail notification arrives distorted, as the HTML no longer renders. Does anyone know w...

Latest Reply
NickMendes
New Contributor III
  • 2 kudos

Apparently, it has been corrected and is working again. Thank you, everyone.

2 More Replies
g96g
by New Contributor III
  • 3274 Views
  • 2 replies
  • 1 kudos

Resolved! how can I pass the df columns as a parameter

I'm doing self-study and want to pass a df column name as a parameter. I have defined the widget column_name = dbutils.widgets.get('column_name'), which executes successfully (giving me a column name). Then I'm reading the df and doing some transformation and ...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Givi Salu, We haven’t heard from you since the last response from @Hubert Dudek, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others. Otherwis...

1 More Replies
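For the widget question above, the usual PySpark pattern is to reference the column through `pyspark.sql.functions.col` with the widget string, e.g. `df.select(col(column_name))` or `df.withColumn(column_name, ...)` (not run here, since it needs a Spark session). A plain-Python analogue of selecting a column by a name passed as a parameter (the row data is hypothetical):

```python
# Hedged sketch: select a "column" from dict rows by name, mimicking
# df.select(col(column_name)) where column_name comes from a widget string.
def select_column(rows, column_name):
    return [row[column_name] for row in rows]

rows = [{"city": "Oslo", "pop": 1}, {"city": "Bergen", "pop": 2}]
print(select_column(rows, "city"))  # ['Oslo', 'Bergen']
```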
mghildiy
by New Contributor
  • 646 Views
  • 2 replies
  • 1 kudos

Checking spark performance locally

I am experimenting with Spark on my local machine. Is there some tool/API available to check the performance of the code I write? For example, I write: val startTime = System.nanoTime() invoicesDF .select( count("*").as("Total Number Of Inv...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @mghildiy, We haven’t heard from you since the last response from @Hubert Dudek, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others. Otherwis...

1 More Replies
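The poster's `System.nanoTime()` wrapper is a reasonable first measurement; the Python equivalent is sketched below. Note that for real Spark tuning the Spark UI (http://localhost:4040 by default when running locally) and `df.explain()` are usually more informative, since Spark's lazy evaluation means only actions trigger the work being timed:

```python
# Hedged sketch: a minimal wall-clock timer around any callable, analogous
# to wrapping a Spark action between two System.nanoTime() calls.
import time

def timed(fn, *args, **kwargs):
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

total, secs = timed(sum, range(1_000_000))
print(total, secs)
```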
NOOR_BASHASHAIK
by Contributor
  • 2035 Views
  • 5 replies
  • 4 kudos

Azure Databricks VM type for OPTIMIZE with ZORDER on a single column

Dears, I was trying to check which Azure Databricks VM type is best suited for executing OPTIMIZE with ZORDER on a single timestamp-value (but string data type) column for around 5000+ tables in the Delta Lake. I chose Standard_F16s_v2 with 6 workers & 1...

Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @NOOR BASHA SHAIK, Please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question.

4 More Replies
William_Scardua
by Valued Contributor
  • 3222 Views
  • 4 replies
  • 4 kudos

How do you structure and store your medallion architecture?

Hi guys, what do you suggest for creating a medallion architecture? How many and which data lake zones, how to store the data, which databases to use to store it, anything. I think these zones: 1. landing zone, file storage in /landing_zone - databricks database.bro...

Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @William Scardua, We haven’t heard from you since the last response from @Jose Gonzalez, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpful to others....

3 More Replies
John_BardessGro
by New Contributor II
  • 4569 Views
  • 3 replies
  • 4 kudos

Cluster Reuse for delta live tables

I have several Delta Live Table notebooks that are tied to different Delta Live Table jobs so that I can use multiple target schema names. I know it's possible to reuse a cluster for job segments, but is it possible for these Delta Live Table jobs (w...

Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @John Fico, We haven’t heard from you since the last response from @Hubert Dudek and @Jose Gonzalez, and I was checking back to see if you have a resolution yet. If you have any solution, please share it with the community as it can be helpf...

2 More Replies