Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Host
by New Contributor
  • 2823 Views
  • 1 replies
  • 0 kudos


Hostinc is the best place to match the price and quality of the product at the most affordable price. If you are looking for a server that can make your marketing campaign a huge success here you go with our one of the most powerful Dedicated Server ...

Latest Reply
Sovchenko
New Contributor II
  • 0 kudos

Thanks for sharing! Before you hire mobile app developers, you need to carefully study this topic.

User16790091296
by Databricks Employee
  • 21358 Views
  • 6 replies
  • 1 kudos

How to delete from a temp view, or the equivalent, in Spark SQL on Databricks?

I need to delete rows from a temp view in Databricks, but it looks like I can only MERGE, SELECT, and INSERT. Maybe I missed something, but I did not find any documentation on this.

Latest Reply
crazy_horse
New Contributor II
  • 1 kudos

What about: %sql DROP TABLE IF EXISTS xxxxx

5 More Replies
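To expand on the replies: a temp view cannot be the target of a DELETE, because it is just a named query. You can either drop it, or redefine it without the unwanted rows. A minimal Spark SQL sketch (the view, source table, and filter column are hypothetical):

```sql
-- "Deleting" rows from a temp view means redefining the view without them.
CREATE OR REPLACE TEMP VIEW events_v AS
SELECT * FROM events_source WHERE status <> 'bad';

-- Or discard the view entirely. Note that for a view the statement is
-- DROP VIEW, not DROP TABLE as in the reply above.
DROP VIEW IF EXISTS events_v;
```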
Bin
by New Contributor
  • 11363 Views
  • 0 replies
  • 0 kudos

How to do an "overwrite" output mode with Spark Structured Streaming without deleting all the data and the checkpoint

I have this delta lake in ADLS to sink data through spark structured streaming. We usually append new data from our data source to our delta lake, but there are some cases when we find errors in the data that we need to reprocess everything. So what ...

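One commonly used pattern for this reprocess-everything case (a sketch, not the only option; the table path and source name are hypothetical) is to stop the stream, rewrite the Delta sink with a batch overwrite, and restart the stream with a fresh checkpoint location so stale streaming state is not replayed against the rewritten table:

```sql
-- Batch overwrite of the Delta sink after correcting the source data.
-- Run this with the stream stopped; then restart the streaming query
-- pointing at a new checkpointLocation.
INSERT OVERWRITE delta.`/mnt/lake/events`
SELECT * FROM corrected_source;
```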
mp
by New Contributor II
  • 4264 Views
  • 2 replies
  • 6 kudos

How can I convert a Parquet table into a Delta table?

I am looking to migrate my legacy warehouse data. How can I convert a Parquet table into a Delta table?

Latest Reply
spark-user
Contributor
  • 6 kudos

Hi mp, you have two options to convert a Parquet table into a Delta table through SQL:
CONVERT TO DELTA parquet.`/data-pipeline/`
CREATE TABLE events USING DELTA LOCATION '/data-pipeline/'
or
CREATE TABLE events USING PARQUET OPTIONS (path '/data-pipeline/') CO...

1 More Replies
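Spelled out, the two documented Spark SQL routes the truncated reply is pointing at look like this (the `/data-pipeline/` path and `events` table name are the example values from the reply):

```sql
-- Option 1: convert the Parquet files in place, then register a Delta table
-- on the same location.
CONVERT TO DELTA parquet.`/data-pipeline/`;
CREATE TABLE events USING DELTA LOCATION '/data-pipeline/';

-- Option 2: register a Parquet table first, then convert the table by name.
CREATE TABLE events USING PARQUET OPTIONS (path '/data-pipeline/');
CONVERT TO DELTA events;
```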
ilarsen
by Contributor
  • 1501 Views
  • 0 replies
  • 1 kudos

Trouble referencing a column that has been added by schema evolution (Auto Loader with Delta Live Tables)

Hi, I have a Delta Live Tables pipeline, using Auto Loader, to ingest from JSON files. I need to do some transformations - in this case, converting timestamps. Except one of the timestamp columns does not exist in every file. This is causing the DLT p...

serg-v
by New Contributor III
  • 3236 Views
  • 2 replies
  • 0 kudos

Running large-window Spark Structured Streaming aggregations with a small slide duration

I want to run aggregations on large windows (90 days) with a small slide duration (5 minutes). The straightforward solution leads to giant state, around hundreds of gigabytes, which doesn't look acceptable. Are there any best practices for doing this? Now I conside...

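One commonly suggested way to keep streaming state small in this situation (a sketch; the table and column names are assumed) is to let the streaming query maintain only non-overlapping 5-minute buckets, and compute the 90-day sliding result over those buckets in a cheap downstream query:

```sql
-- Streaming query: pre-aggregate into non-overlapping 5-minute buckets, so
-- state holds one row per key per bucket rather than per sliding window.
SELECT window(event_time, '5 minutes') AS bucket, user_id, count(*) AS cnt
FROM events
GROUP BY window(event_time, '5 minutes'), user_id;

-- Downstream (batch or scheduled) query: roll the buckets up to 90 days.
SELECT user_id, sum(cnt) AS cnt_90d
FROM bucketed_counts
WHERE bucket.start >= current_timestamp() - INTERVAL 90 DAYS
GROUP BY user_id;
```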
SailajaB
by Databricks Partner
  • 3662 Views
  • 2 replies
  • 8 kudos

Resolved! How to prevent Azure users with the Owner or Contributor role from using Launch Workspace to log in to the ADB workspace as admin

Hi, is there any way to disable the Launch Workspace option in the Azure portal for ADB? We grant user access at the resource group level, so we need to restrict users who have the Owner or Contributor role from launching the ADB workspace as admin. Thank you.

Latest Reply
none_ranjeet
New Contributor III
  • 8 kudos

Deny assignments don't block a subscription Contributor from launching the workspace and becoming admin. Actually, I haven't found any way to block that, after trying many different methods.

1 More Replies
Malcoln_Dandaro
by New Contributor
  • 2767 Views
  • 0 replies
  • 0 kudos

Is there any way to navigate/access cloud files using the direct abfss URI (no mount) with default python functions/libs like open() or os.listdir()?

Hello, today in our workspace we access everything via mount points. We plan to change this to "abfss://" for security, governance, and performance reasons. The problem is that we sometimes interact with files using Python-only code, and apparently ...

danny_edm
by New Contributor
  • 1198 Views
  • 0 replies
  • 0 kudos

collect_set weird result when Photon enabled

Cluster: DBR 10.4 LTS with Photon
Sample schema: seq_no (decimal), type (string)
Sample data:
seq_no  type
1       A
1       A
2       A
2       B
2       B
Command: F.size(F.collect_set(F.col("type")).over(Window.partitionBy("seq_no"))...

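For reference, the same computation expressed in Spark SQL (equivalent to the PySpark window expression in the post; the table name is assumed). With the sample rows (1,A), (1,A), (2,A), (2,B), (2,B), the expected size is 1 for seq_no = 1 and 2 for seq_no = 2:

```sql
-- size(collect_set(...)) over a partition window counts the distinct
-- values of "type" within each seq_no partition.
SELECT seq_no, type,
       size(collect_set(type) OVER (PARTITION BY seq_no)) AS distinct_types
FROM sample_data;
```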
Mamdouh_Dabjan
by New Contributor III
  • 6637 Views
  • 6 replies
  • 2 kudos

Importing a large CSV file into Databricks free

Basically, I have a large CSV file that does not fit in a single worksheet; I can only use it in Power Query. I am trying to import this file into my Databricks notebook. I imported it and created a table from that file. But when I looked at the table, i...

Latest Reply
weldermartins
Honored Contributor
  • 2 kudos

Hello, if you manually open one of the parts of the CSV file, does it look different?

5 More Replies
yannickmo
by Databricks Partner
  • 9603 Views
  • 7 replies
  • 14 kudos

Resolved! Adding JAR from Azure DevOps Artifacts feed to Databricks job

Hello, we have some Scala code which is compiled and published to an Azure DevOps Artifacts feed. The issue is we're now trying to add this JAR to a Databricks job (through Terraform) to automate the creation. To do this I'm trying to authenticate using...

Latest Reply
alexott
Databricks Employee
  • 14 kudos

As of right now, Databricks can't use non-public Maven repositories, because resolution of the Maven coordinates happens in the control plane. That's different from the R and Python libraries. As a workaround, you may try to install libraries via an init script or ...

6 More Replies
User16752245312
by Databricks Employee
  • 7091 Views
  • 2 replies
  • 2 kudos

How can I automatically capture the heap dump on the driver and executors in the event of an OOM error?

If you have a job that repeatedly runs into out-of-memory (OOM) errors on either the driver or the executors, automatically capturing a heap dump on the OOM event will help you debug the memory issue and identify the cause of the error. Spark config: spark.execu...

Latest Reply
John_360
New Contributor II
  • 2 kudos

Is it necessary to use exactly that HeapDumpPath? I find I'm unable to get driver heap dumps with a different path but otherwise the same configuration. I'm using spark_version 10.4.x-cpu-ml-scala2.12.

1 More Replies
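The truncated Spark config in the post typically takes the following shape, using the standard JVM heap-dump flags. The dump path shown here is an example only; as the latest reply suggests, the exact path may matter on some runtimes:

```
spark.driver.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/heapDumps
spark.executor.extraJavaOptions -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dbfs/heapDumps
```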
Serhii
by Contributor
  • 4628 Views
  • 1 replies
  • 1 kudos

Resolved! Behaviour of cluster launches in multi-task jobs

We are adapting the multi-task workflow example from the dbx documentation for our pipelines: https://dbx.readthedocs.io/en/latest/examples/python_multitask_deployment_example.html. As part of the configuration, we specify the cluster configuration and provide ...

Latest Reply
User16873043099
Databricks Employee
  • 1 kudos

Tasks within the same multi-task job can reuse clusters. A shared job cluster allows multiple tasks in the same job to use the same cluster. The cluster is created and started when the first task that uses it starts, and it terminates after the last ...

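In Jobs 2.1-style JSON, a shared job cluster is declared once and referenced by key from each task. A trimmed sketch, with placeholder keys and values:

```json
{
  "job_clusters": [
    {
      "job_cluster_key": "shared_cluster",
      "new_cluster": { "spark_version": "10.4.x-scala2.12", "num_workers": 2 }
    }
  ],
  "tasks": [
    { "task_key": "ingest", "job_cluster_key": "shared_cluster" },
    {
      "task_key": "transform",
      "job_cluster_key": "shared_cluster",
      "depends_on": [ { "task_key": "ingest" } ]
    }
  ]
}
```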
Ashok1
by New Contributor II
  • 2041 Views
  • 2 replies
  • 1 kudos
Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hey there @Ashok ch​! Hope everything is going great. Does @Ivan Tang​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Otherwise, please let us know if you need more hel...

1 More Replies