Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Alexrc
by New Contributor

Buy Amazon Account

Do you want to buy an Amazon account? Our store is the best place to buy Amazon accounts. Only fully verified Amazon accounts are available in our store.

amazon.jpg
  • 485 Views
  • 0 replies
  • 0 kudos
AlexR
by New Contributor

Buy Tripadvisor Account

Do you want to buy a Tripadvisor account? Our store is the best place to buy Tripadvisor accounts. Only fully verified Tripadvisor accounts are available in our store.

tripadvisor.jpg
  • 1549 Views
  • 0 replies
  • 0 kudos
alex97
by New Contributor

Buy Facebook account

Do you want to buy an Upwork account? Our store is the best place to buy Upwork accounts. Only fully verified Upwork accounts are available in our store.

facebook.jpg
  • 763 Views
  • 0 replies
  • 0 kudos
alexc96
by New Contributor

Buy Match Account

Do you want to buy a Match account? Our store is the best place to buy Match accounts. Only fully verified Match accounts are available in our store.

Match.jpg
  • 497 Views
  • 0 replies
  • 0 kudos
alexc95
by New Contributor

Buy Badoo account

Do you want to buy a Badoo account? Our store is the best place to buy Badoo accounts. Only fully verified Badoo accounts are available in our store.

Badoo.jpg
  • 2661 Views
  • 0 replies
  • 0 kudos
BhargaviKapu13
by New Contributor

Error 403

Unable to access the page while attempting the quiz for basics of the Databricks Lakehouse Platform

  • 902 Views
  • 0 replies
  • 0 kudos
alj_a
by New Contributor III

Resolved! Delta Live Table - not reading the changed record from cloud file

Hi, I am trying to ingest data from cloudFiles into a bronze table. DLT works the first time and loads the data into the bronze table, but when I add a new record and change a field in an existing record, the DLT pipeline succeeds but it should be inserted...

Data Engineering
Databricks Delta Live Table
  • 3663 Views
  • 2 replies
  • 2 kudos
Latest Reply
alj_a
New Contributor III
  • 2 kudos

Thank you Emil. I tried all the suggestions. .read works fine; it picks up the new or changed data. But my problem is that the target is a bronze table, and in this case my bronze table has duplicate records. However, let me look at the other options to ...

1 More Replies
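A note for readers of this thread: the duplicate-records problem with an append-only bronze target usually pairs with apply_changes for the dedup step the reply hints at. Below is a minimal sketch only; the table names (bronze_orders, silver_orders), key column (order_id), ordering column (ingest_time), file format, and landing path are placeholders, not from the original thread.

import dlt
from pyspark.sql import functions as F

@dlt.table(name="bronze_orders", comment="Raw files ingested with Auto Loader (append-only)")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")          # placeholder format
        .load("s3://my-bucket/landing/orders/")       # placeholder landing path
    )

# Upsert changed records into a downstream table instead of expecting the
# append-only bronze table to reflect updates in place.
# (The helper name varies by DLT release; newer releases use create_streaming_table.)
dlt.create_streaming_table("silver_orders")

dlt.apply_changes(
    target="silver_orders",
    source="bronze_orders",
    keys=["order_id"],                 # placeholder business key
    sequence_by=F.col("ingest_time"),  # placeholder ordering column
)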
chevichenk
by New Contributor III

Resolved! Why do I still see Delta history after running VACUUM with just 5 hours retention?

Hi, everyone! I executed a VACUUM with 5 hours retention, but I can still see the full version history, and I can even query those older versions of the table. Plus, when I look at the version history, it doesn't start at zero (supposed to be the creation of the t...

chevichenk_1-1699661687787.png
  • 2187 Views
  • 1 replies
  • 1 kudos
Latest Reply
Rom
New Contributor III
  • 1 kudos

Hi, when disk caching is enabled, a cluster might contain data from Parquet files that have been deleted with VACUUM. Therefore, it may be possible to query the data of previous table versions whose files have been deleted. Restarting the cluster will...

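A minimal sketch of the behavior discussed above, assuming a placeholder table name (my_schema.my_table). VACUUM prunes unreferenced data files but never rewrites the transaction log, which is why DESCRIBE HISTORY keeps listing the old versions.

# Retention below the 7-day default requires disabling the safety check first.
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
spark.sql("VACUUM my_schema.my_table RETAIN 5 HOURS")

# History is still listed because the log is intact; only old data files are removed.
spark.sql("DESCRIBE HISTORY my_schema.my_table").show(truncate=False)

# If cached copies of the deleted files still answer time-travel queries,
# restart the cluster or turn off the disk cache.
spark.conf.set("spark.databricks.io.cache.enabled", "false")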
MikeK_
by New Contributor II

Resolved! SQL Update Join

Hi, I'm importing some data and stored procedures from SQL Server into Databricks. I noticed that updates with joins are not supported in Spark SQL; what's the alternative I can use? Here's what I'm trying to do: update t1 set t1.colB=CASE WHEN t2....

  • 44906 Views
  • 6 replies
  • 0 kudos
Latest Reply
LyderIversen
New Contributor II
  • 0 kudos

Hi! This is way late, but did you ever find a solution to the CROSS APPLY part of your question? Is it possible to do CROSS APPLY in Spark SQL, or is there something you can use instead?

5 More Replies
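For later readers: the usual substitute for UPDATE ... JOIN on Databricks is MERGE INTO, and LATERAL VIEW explode covers many CROSS APPLY use cases. A sketch reusing the t1/t2/colB names from the question; the join key (id), the t2.colC column, and the items array column are assumptions, and t1 must be a Delta table for MERGE.

spark.sql("""
    MERGE INTO t1
    USING t2
    ON t1.id = t2.id
    WHEN MATCHED THEN
      UPDATE SET t1.colB = CASE WHEN t2.colC IS NULL THEN t1.colB ELSE t2.colC END
""")

# CROSS APPLY-style row expansion: explode an array column per input row.
spark.sql("""
    SELECT t1.id, item
    FROM t1
    LATERAL VIEW explode(t1.items) AS item
""")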
boyarkaandre
by New Contributor

Buy ByBit Account

Hello! Are you looking for a ByBit account? You can go to our store and buy a ByBit account here. You can also see crypto accounts in our store!

bybit.jpg
  • 1349 Views
  • 0 replies
  • 0 kudos
andyh
by New Contributor

Resolved! Job queue for pool limit

I have a cluster pool with a max capacity limit, to make sure we're not burning too much extra silicon. We use this for some of our less critical workflows/jobs. They still spend a lot of time idle, but sometimes hit this max capacity limit. Is there a way...

  • 3141 Views
  • 2 replies
  • 0 kudos
Latest Reply
SSundaram
Contributor
  • 0 kudos

Try increasing your max capacity limit; you might also want to bring down the minimum number of nodes the job uses. At the job level, try configuring retries and the time interval between retries.

1 More Replies
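A sketch of the retry-related task settings the reply refers to, using Jobs API 2.1 field names as a plain Python dict; the task key, notebook path, pool ID, Spark version, and worker count are placeholders.

task_settings = {
    "task_key": "nightly_load",
    "notebook_task": {"notebook_path": "/Jobs/nightly_load"},
    "new_cluster": {
        "instance_pool_id": "<pool-id>",
        "spark_version": "13.3.x-scala2.12",
        "num_workers": 2,  # keep the minimum small so pooled nodes free up sooner
    },
    "max_retries": 3,                     # retry when the pool is temporarily at capacity
    "min_retry_interval_millis": 300000,  # wait 5 minutes between attempts
    "retry_on_timeout": True,
}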
yutaro_ono1_558
by New Contributor II

How to read data from S3 Access Point by pyspark?

I want to read data from an S3 access point. I successfully accessed the data through the S3 access point using the boto3 client:
s3 = boto3.resource('s3')
ap = s3.Bucket('arn:aws:s3:[region]:[aws account id]:accesspoint/[S3 Access Point name]')
for obj in ap.object...

  • 11925 Views
  • 2 replies
  • 1 kudos
Latest Reply
shrestha-rj
New Contributor II
  • 1 kudos

I'm reaching out to seek assistance as I navigate an issue. Currently, I'm trying to read JSON files from an S3 Multi-Region Access Point using a Databricks notebook. While reading directly from the S3 bucket presents no challenges, I encounter an "j...

1 More Replies
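One commonly suggested way to move from boto3 to Spark here is to address the access point through its alias, which S3 accepts wherever a bucket name is expected. This is a sketch only: the alias below is a placeholder, the format is assumed to be JSON, and multi-region access points may have additional signing requirements not covered here.

# Placeholder access point alias; single-region access points expose one in the AWS console.
access_point_alias = "my-ap-alias-xxxxxxxx-s3alias"

df = (spark.read
      .format("json")
      .load(f"s3a://{access_point_alias}/path/to/data/"))

df.show(5, truncate=False)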
rbricks007
by New Contributor II

Trying to use pivot function with pyspark for count aggregate

I'm trying this code but getting the following error:
testDF = (eventsDF
    .groupBy("user_id")
    .pivot("event_name")
    .count("event_name"))
TypeError: _api() takes 1 positional argument but 2 were given
Please guide me on how to fix...

Data Engineering
count
pivot
python
  • 4189 Views
  • 1 replies
  • 0 kudos
Latest Reply
Krishnamatta
Contributor
  • 0 kudos

Try this:
from pyspark.sql import functions as F

# pivot() returns GroupedData, so pass the count through agg() instead of
# calling count() with a column argument.
testDF = (eventsDF
    .groupBy("user_id")
    .pivot("event_name")
    .agg(F.count("event_name")))

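Equivalent shorthand, for reference: after pivot(), GroupedData.count() takes no column argument, which is what triggered the TypeError in the question.

testDF = eventsDF.groupBy("user_id").pivot("event_name").count()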
victor-nj-miami
by New Contributor II

Resolved! Cannot create a metastore anymore

Hi Community, I am trying to create a metastore for Unity Catalog, but I am getting an error saying that there is already a metastore in the region, which is not true, because I deleted all the metastores. I used to have one working properly, but ...

  • 9352 Views
  • 2 replies
  • 2 kudos
Latest Reply
karthik_p
Esteemed Contributor
  • 2 kudos

@ashu_sama I see your issue got resolved by clearing or purging the revision history. Can you mark this thread as resolved?

1 More Replies
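A quick way to double-check what is still registered for the account, sketched with the databricks-sdk; this is not from the original thread and assumes the SDK is installed, credentials are configured, and you have permission to list metastores.

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for m in w.metastores.list():
    print(m.metastore_id, m.name, m.region)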
Baldur
by New Contributor II

Unable to follow H3 Quickstart

Hello, I'm following the H3 quickstart (Databricks SQL) tutorial because I want to do point-in-polygon queries on 21k polygons and 95B points. The volume is pushing me towards using H3. In the tutorial, they use geopandas. According to H3 geospatial functio...

Baldur_0-1698768174045.png
Data Engineering
Geopandas
H3
  • 4004 Views
  • 3 replies
  • 1 kudos
Latest Reply
siddhathPanchal
Databricks Employee
  • 1 kudos

Hi @Baldur. I hope the above answer solved your problem. If you have any follow-up questions, please let us know. If you like the solution, please don't forget to press the 'Accept as Solution' button.

2 More Replies
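For anyone following the same tutorial at this scale, a rough sketch of the H3 cell-join approach (approximate containment only; an exact point-in-polygon check on the candidate pairs would still be needed). The table names (polygons, points), columns (polygon_id, wkt, point_id, lon, lat), and resolution 7 are assumptions.

res = 7  # placeholder H3 resolution; choose based on polygon size vs. point density

polygon_cells = spark.sql(f"""
    SELECT polygon_id, explode(h3_polyfillash3(wkt, {res})) AS cell
    FROM polygons
""")

point_cells = spark.sql(f"""
    SELECT point_id, h3_longlatash3(lon, lat, {res}) AS cell
    FROM points
""")

# Joining on the shared H3 cell turns the spatial join into an equi-join,
# which is what makes the 95B-point volume tractable.
candidates = point_cells.join(polygon_cells, "cell")
candidates.write.mode("overwrite").saveAsTable("point_polygon_candidates")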
