Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Dataengineer_mm
by New Contributor
  • 1784 Views
  • 2 replies
  • 1 kudos

Surrogate key using identity column.

I want to create a surrogate key in a Delta table, and I used an identity column (id GENERATED AS DEFAULT). Can I insert rows into the Delta table using only spark.sql, i.e. an INSERT query? Or can I also use the Delta-format write options? If I use df.write ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Menaka Murugesan, we haven't heard from you since the last response from @Nandini N, and I was checking back to see if her suggestions helped you. Otherwise, if you have a solution, please share it with the community, as ...

1 More Replies
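On the surrogate-key question above: a minimal sketch of the identity-column approach. Table and column names here are hypothetical, and on a cluster the statements would be run through spark.sql(...); this sketch only builds them:

```python
# Sketch only: table/column names are hypothetical.
ddl = """
CREATE TABLE IF NOT EXISTS dim_customer (
  customer_sk BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key, auto-assigned
  customer_id STRING,
  customer_name STRING
) USING DELTA
"""

# SQL insert: omit the identity column and Delta fills it in.
insert_sql = "INSERT INTO dim_customer (customer_id, customer_name) VALUES ('C001', 'Alice')"

# A DataFrame append works too, again without the identity column, e.g.:
# df.write.format("delta").mode("append").saveAsTable("dim_customer")
```

Note that GENERATED ALWAYS AS IDENTITY forbids supplying the key explicitly; GENERATED BY DEFAULT AS IDENTITY allows an explicit value when you need one.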
bluesky111
by New Contributor II
  • 1306 Views
  • 2 replies
  • 3 kudos

Resolved! I entered the wrong schedule time for the exam; can it be rescheduled?

Hello, I was scheduled to take an exam today at 2:15 PM, but I mistakenly set the time to 2:15 AM. Could it be rescheduled? I already submitted a ticket to https://help.databricks.com/s/contact-us?ReqType=training but no reply yet...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @heron halim, we haven't heard from you since the last response from @Akshay Padmanabhan, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it ca...

1 More Replies
Philearner
by New Contributor II
  • 1715 Views
  • 3 replies
  • 3 kudos

Unable to find an input by typing in the Multiselect Widget

In AWS Databricks widgets.multiselect, I'm unable to find an input by typing in the multiselect bar. It was working before. Although I can find the inputs by scrolling down the list, it's annoying if the list is long. Here's my script: measlis...

[Attachments: databrick widget problem, databrick widget problem 2]
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Philip Teu, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
Sas
by New Contributor II
  • 2926 Views
  • 3 replies
  • 4 kudos

Resolved! Confusion in string comparison

Hello experts, I am new to Spark. I am using the same piece of code but getting different results. When I use the piece of code below, I get the error py4j.Py4JException: Method or([class java.lang.String]) does not exist: df.filter(F.col("state").isNull() ...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Saswata Dutta, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedbac...

2 More Replies
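A py4j error of this shape usually means a plain Python string reached Column.__or__. One common cause is operator precedence: in Python, | and & bind more tightly than ==, so un-parenthesized PySpark filters combine the wrong operands. A pure-Python sketch of the precedence rule (no Spark required; column names in the comments are hypothetical):

```python
# | binds more tightly than ==, so  a | b == c  parses as  (a | b) == c.
assert (1 | 0 == 0) is False      # (1 | 0) == 0  ->  1 == 0
assert (1 | (0 == 0)) == 1        # 0 == 0 -> True -> 1 | True -> 1

# In PySpark, therefore, parenthesize each comparison:
#   df.filter(F.col("state").isNull() | (F.col("state") == "CA"))
# and never combine a bare string with | / &, which is what raises
#   py4j.Py4JException: Method or([class java.lang.String]) does not exist
```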
sagiatul
by New Contributor II
  • 3544 Views
  • 2 replies
  • 3 kudos

Databricks driver logs

I am running jobs on Databricks clusters. When the cluster is running, I am able to find the executor logs by going to the Spark Cluster UI Master dropdown, selecting a worker, and going through the stderr logs. However, once the job is finished and the cluste...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Atul Arora, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Replies
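On the driver-logs question: the live Spark UI disappears with the cluster, but cluster log delivery can persist driver and executor logs to storage. A sketch of the relevant Clusters API fragment (the destination path is a hypothetical example):

```python
# Fragment of a cluster spec enabling log delivery; with this set, driver and
# executor logs are copied under <destination>/<cluster-id>/ and survive
# cluster termination.
cluster_spec_fragment = {
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs"}  # hypothetical path
    }
}
```

The same setting is exposed in the cluster UI under Advanced Options > Logging.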
saikrishna3390
by New Contributor II
  • 5172 Views
  • 2 replies
  • 2 kudos

How do I configure a managed identity on a Databricks cluster and access Azure storage using Spark config?

A partner wants to use an ADF managed identity to connect to my Databricks cluster, connect to my Azure storage, and copy the data from my Azure storage to their Azure storage.

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @SAI PUSALA, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Replies
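On the managed-identity question: one way to let a cluster reach ADLS Gen2 with a managed identity is the Hadoop ABFS OAuth MSI token provider. A sketch of the Spark configs involved — the storage account name, tenant ID, and client ID below are placeholders, so verify the keys against the ABFS connector docs for your runtime:

```python
# Hypothetical values; these would be set as Spark configs on the cluster,
# e.g. spark.conf.set(k, v) or in the cluster's Spark config box.
storage = "mystorageacct"  # placeholder storage account name
abfs_conf = {
    f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net": "OAuth",
    f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.MsiTokenProvider",
    f"fs.azure.account.oauth2.msi.tenant.{storage}.dfs.core.windows.net": "<tenant-id>",
    f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net": "<msi-client-id>",
}
```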
js54123875
by New Contributor III
  • 1608 Views
  • 2 replies
  • 2 kudos

'Failure to initialize configuration' on SQL Warehouse Tables

Yesterday I had a basic DLT pipeline up and running and was able to query the hive_metastore tables successfully. The pipeline uses Auto Loader to ingest a few CSV files from cloud storage into streaming live bronze and silver tables. Today, after star...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Jennette Shepard, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feed...

1 More Replies
User16752244127
by Contributor
  • 1088 Views
  • 2 replies
  • 4 kudos

Resolved! DLT code examples and notebooks?

We like the examples that you show in webinars, especially DLT with Hugging Face or DLT with ingestion from Kafka. Are they publicly available?

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Frank Munz, thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback w...

1 More Replies
Vijay_Bhau
by New Contributor II
  • 1790 Views
  • 4 replies
  • 3 kudos
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Vijay Gadhave, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...

3 More Replies
nirajtanwar
by New Contributor
  • 1232 Views
  • 2 replies
  • 2 kudos

Collecting the elements of a SparkDataFrame and coercing them into an R data frame

Hello everyone, I am facing a challenge while collecting a Spark DataFrame into an R data frame. I need to do this because I am using the TraMineR algorithm, which is implemented only in R, while the data pre-processing was done in PySpark. I am trying this: event...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Niraj Tanwar, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...

1 More Replies
Arunsundar
by New Contributor III
  • 1767 Views
  • 4 replies
  • 3 kudos

Automating the initial configuration of dbx

Hi Team, good morning. As of now, for the deployment of our code to Databricks, dbx is configured by providing parameters such as the cloud provider, git provider, etc. Say I have a code repository with any one of the git providers. Can this process of co...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Arunsundar Muthumanickam, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear fr...

3 More Replies
Mado
by Valued Contributor II
  • 3560 Views
  • 4 replies
  • 1 kudos

Resolved! How to set properties for a delta table when I want to write a DataFrame?

Hi, I have a PySpark DataFrame with 11 million records. I created the DataFrame on a cluster; it is not saved on DBFS or in a storage account. import pyspark.sql.functions as F from pyspark.sql.functions import col, when, floor, expr, hour, minute, to_time...

Latest Reply
Lakshay
Esteemed Contributor
  • 1 kudos

Hi @Mohammad Saber, are you getting the error while writing the file to the table, or before that?

3 More Replies
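On the table-properties question above: one reliable pattern is to save the DataFrame as a table first and then set properties with ALTER TABLE ... SET TBLPROPERTIES; Delta also honors spark.databricks.delta.properties.defaults.* for tables created afterwards. A sketch, with an illustrative table name and property (on a cluster the statements would go through spark.sql):

```python
# 1) Write the DataFrame as a Delta table (run on a cluster):
#    df.write.format("delta").saveAsTable("my_schema.events")
# 2) Then set properties on the resulting table:
alter_sql = """
ALTER TABLE my_schema.events
SET TBLPROPERTIES ('delta.logRetentionDuration' = 'interval 30 days')
"""
# Alternatively, set a default that applies to *newly created* tables:
# spark.conf.set("spark.databricks.delta.properties.defaults.logRetentionDuration",
#                "interval 30 days")
```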
Andrei_Radulesc
by Contributor III
  • 3834 Views
  • 3 replies
  • 3 kudos

Resolved! FutureWarning: Deprecated in 3.0.0. Use SparkSession.builder.getOrCreate() instead.

I'm trying to get rid of the warning below: /databricks/spark/python/pyspark/sql/context.py:117: FutureWarning: Deprecated in 3.0.0. Use SparkSession.builder.getOrCreate() instead. In my setup, I have a front-end notebook that gets parameters from the ...

Latest Reply
Andrei_Radulesc
Contributor III
  • 3 kudos

That fixes it. Thanks. I need to do

spark = SparkSession.builder.getOrCreate()
df = spark.table("prod.some_schema.some_table")

instead of

sc = SparkSession.builder.getOrCreate()
sqlc = SQLContext(sc)
df = sqlc.table(f"prod.some_schema.some...

2 More Replies
sage5616
by Valued Contributor
  • 3398 Views
  • 1 reply
  • 3 kudos

Resolved! Set Workflow Job Concurrency Limit

Hi everyone, I need a job to be triggered every 5 minutes. However, if that job is already running, it must not be triggered again until that run is finished. Hence, I need to set the maximum run concurrency for that job to one instance at a time...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

@Michael Okulik: To ensure that a Databricks job is not triggered again until a running instance of the job has completed, you can set the maximum concurrency for the job to 1. Here's how you can configure this in Databricks: go to the Databricks work...

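The knob described in the reply is max_concurrent_runs in the job settings. A sketch of the Jobs API fragment for a 5-minute schedule that never overlaps itself — the job name is illustrative, and the cron expression uses Quartz syntax:

```python
# Fragment of a Jobs API job spec: the schedule fires every 5 minutes, but a
# new run is skipped while a previous one is still active because
# max_concurrent_runs is 1.
job_settings = {
    "name": "every-5-min-no-overlap",               # illustrative name
    "max_concurrent_runs": 1,                       # at most one active run
    "schedule": {
        "quartz_cron_expression": "0 0/5 * * * ?",  # every 5 minutes
        "timezone_id": "UTC",
    },
}
```

The same setting appears in the job's UI as the maximum concurrent runs field.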
sandeepv
by New Contributor II
  • 1651 Views
  • 3 replies
  • 0 kudos

Databricks Spark certification voucher code expired

Hi Team, I am getting an error that the voucher code has expired when trying to register for the "Databricks Certified Associate Developer for Apache Spark 3.0 - Python" certification. Can you please help here?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Sandeep Venishetti, hope everything is going great. Just wanted to check in: were you able to resolve your issue? If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell ...

2 More Replies