Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

slakshmanan
by New Contributor III
  • 539 Views
  • 1 reply
  • 0 kudos

How to get an access_token from the REST API without a user password

Using the REST API /oauth2/token endpoint, how do I get an access_token programmatically?

Latest Reply
LauJohansson
Contributor
  • 0 kudos

Have you read this blog post? https://medium.com/@wernkai95/generate-azure-databricks-workspace-personal-access-token-for-azure-service-principal-8dff72d045c6 Also refer to the API docs: https://docs.databricks.com/api/azure/workspace/tokens/create
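
For anyone landing here later, a minimal sketch of the client-credentials flow those links describe, using an Azure service principal. The tenant ID, client ID, and secret below are placeholders, not values from this thread:

```python
# Sketch: fetch an Azure AD access token for Azure Databricks with a
# service principal (client-credentials flow). All IDs/secrets are
# placeholders -- substitute your own and keep secrets out of source.
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<sp-client-id>"
CLIENT_SECRET = "<sp-client-secret>"

# 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the well-known Azure Databricks
# resource application ID used as the token scope.
resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# Use the token as a Bearer token against workspace REST APIs.
```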

jonathanjone
by New Contributor
  • 762 Views
  • 1 reply
  • 0 kudos

Large Dataset Processing: How AI-Powered PCs Measure Up Against Cloud Solutions

Hey community, I’ve been diving deep into AI-powered PCs and their growing capabilities, particularly when it comes to processing large datasets. As cloud solutions like AWS, Google Cloud, and Azure have been the go-to for scaling data-heavy tasks, I’...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

My answer to the same question in another topic: https://community.databricks.com/t5/data-engineering/how-does-an-ai-powered-pc-handle-large-datasets-compared-to/m-p/91889#M38291

nadia
by New Contributor II
  • 27295 Views
  • 4 replies
  • 2 kudos

Resolved! Executor heartbeat timed out

Hello, I'm trying to read a table that is located on PostgreSQL and contains 28 million rows. I get the following result: "SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in sta...

Latest Reply
SparkJun
Databricks Employee
  • 2 kudos

Please also review the Spark UI to find the failed Spark job and stage. Check the GC time and any data spill to memory and disk, and look for errors in the failed task in the stage view. This will confirm data skew or GC/memory...
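
A common root cause of this error is the JDBC read arriving as a single task on one executor. A sketch of a partitioned read that spreads the 28M rows across tasks; host, table, column, and secret names are hypothetical:

```python
# Sketch: partitioned JDBC read from PostgreSQL so the load is split
# across numPartitions parallel tasks instead of one. Placeholder names.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/mydb")
    .option("dbtable", "public.big_table")
    .option("user", "reader")
    .option("password", dbutils.secrets.get("my-scope", "pg-password"))
    .option("partitionColumn", "id")    # a numeric/date column to split on
    .option("lowerBound", "1")
    .option("upperBound", "28000000")
    .option("numPartitions", "16")      # 16 parallel reads instead of 1
    .load()
)
df.write.mode("overwrite").saveAsTable("bronze.big_table_copy")
```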

3 More Replies
amitprasad01
by New Contributor II
  • 782 Views
  • 4 replies
  • 2 kudos

I want to consolidate the delta tables historical versions and write to another table in append mode

Hi Team, I have a table, let's say employee, and it has 5 versions. Versions 0 and 1 have column A and column B; from version 2, column A has been changed to column C and column B to column D. I want to ingest version by version without manual intervention....

Latest Reply
uday_satapathy
Databricks Employee
  • 2 kudos

Have you checked out deltaTable.cloneAtVersion()?
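
If a full clone of one version is more than needed, here is a sketch of replaying the history version by version with time travel and schema evolution. Table names are hypothetical:

```python
# Sketch: replay every version of a Delta table into a target table in
# append mode. mergeSchema lets the renamed columns (C/D) appear alongside
# the old ones (A/B) without manual intervention. Placeholder table names.
from delta.tables import DeltaTable
from pyspark.sql.functions import lit

history = DeltaTable.forName(spark, "dev.employee").history()
versions = sorted(row.version for row in history.select("version").collect())

for v in versions:
    snapshot = spark.read.option("versionAsOf", v).table("dev.employee")
    (snapshot.withColumn("source_version", lit(v))   # track origin version
             .write.format("delta")
             .mode("append")
             .option("mergeSchema", "true")
             .saveAsTable("dev.employee_history"))
```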

3 More Replies
demo0404
by New Contributor II
  • 640 Views
  • 1 reply
  • 1 kudos

Upload files to DBFS

Hello, since yesterday I cannot upload files to DBFS; only the S3 option appears. I am a little desperate about this because I have to teach some courses and the tool does not work for me. Is there a way to upload the CSV files? Thank you!

Latest Reply
gchandra
Databricks Employee
  • 1 kudos

https://community.databricks.com/t5/data-engineering/suddenly-can-t-find-the-option-to-uplaod-files-into-databricks/m-p/94359/highlight/true#M38884

SandraFernando
by New Contributor II
  • 325 Views
  • 1 reply
  • 0 kudos

Databricks community edition browse data to create New Table not available

Hi, from today (17-10-24) I can't seem to browse data to create a New Table. Has anything changed? If so, how do I import data on Community Edition to run a PySpark cluster? Please let me know, as I run academic classes using this tool. Thank you, Sandra

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

https://community.databricks.com/t5/data-engineering/suddenly-can-t-find-the-option-to-uplaod-files-into-databricks/m-p/94359/highlight/true#M38884

Mitali1
by New Contributor
  • 334 Views
  • 1 reply
  • 0 kudos

Not able to create table using upload file option

I was trying to create a table using a CSV file in Community Edition. I am not able to find any option to do that on the portal.

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

https://community.databricks.com/t5/data-engineering/suddenly-can-t-find-the-option-to-uplaod-files-into-databricks/m-p/94359/highlight/true#M38884

petter777
by New Contributor
  • 480 Views
  • 1 reply
  • 0 kudos

DBFS FILE BROWSER BUTTON

Hello, I cannot see the FileStore when I open the 'Catalog' tab, and when I want to activate the 'DBFS File Browser', the option to enable it under 'Settings/Advanced' does not appear. How could I do it so I can have the FileStore active in my Databri...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

Please mount your AWS S3 bucket. https://community.databricks.com/t5/data-engineering/suddenly-can-t-find-the-option-to-uplaod-files-into-databricks/m-p/94359/highlight/true#M38884
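
For reference, a sketch of the legacy S3 mount pattern the linked thread covers. The bucket name and keys below are placeholders; real keys belong in a secret scope, not notebook source:

```python
# Sketch: mount an S3 bucket so files are browsable under /mnt/.
# Bucket and keys are placeholders -- do not hardcode real credentials.
access_key = "<AWS_ACCESS_KEY_ID>"
secret_key = "<AWS_SECRET_ACCESS_KEY>"
encoded_secret = secret_key.replace("/", "%2F")  # URL-encode any slashes

dbutils.fs.mount(
    source=f"s3a://{access_key}:{encoded_secret}@my-bucket",
    mount_point="/mnt/my-bucket",
)
display(dbutils.fs.ls("/mnt/my-bucket"))  # confirm the mount worked
```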

mplang
by New Contributor
  • 1431 Views
  • 1 reply
  • 1 kudos

DLT x UC x Auto Loader

Now that the Directory Listing Mode of Auto Loader is officially deprecated, is there a solution for using File Notification Mode in a DLT pipeline writing to a UC-managed table? My understanding is that File Notification Mode is only available on si...

Labels: Data Engineering, autoloader, dlt, UC
Latest Reply
drewipson
New Contributor III
  • 1 kudos

I have the same concern and am reaching out to our Solutions Architect to better understand how Auto Loader & DLT can be used together. DLT and Auto Loader should go hand in hand, especially when using file notification mode.
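
For context, a sketch of what an Auto Loader source in a DLT pipeline looks like. Whether cloudFiles.useNotifications is honored for a UC-managed target is exactly the open question above, so treat that flag as illustrative; the path is hypothetical:

```python
# Sketch: DLT table fed by Auto Loader. The useNotifications flag selects
# file notification mode; its availability with UC + DLT is the question
# raised in this thread. Storage path is a placeholder.
import dlt

@dlt.table(name="raw_events")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.useNotifications", "true")  # notification mode
        .load("abfss://landing@myaccount.dfs.core.windows.net/events/")
    )
```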

irfanaziz
by Contributor II
  • 32963 Views
  • 8 replies
  • 8 kudos

Resolved! How to merge small parquet files into a single parquet file?

I have thousands of parquet files with the same schema, each holding one or more records, but reading these files with Spark is very slow. I want to know if there is any way to merge the files before reading them with Spark, or is there any ...

Latest Reply
Sailaja
New Contributor II
  • 8 kudos

We can combine all these small parquet files into a single file using the OPTIMIZE command: OPTIMIZE delta_table_name
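
Worth noting: OPTIMIZE operates on Delta tables, so a plain parquet directory needs converting first. A sketch with hypothetical paths:

```python
# Sketch: compact thousands of small parquet files. Paths are placeholders.
# Option 1: convert the directory to Delta, then let OPTIMIZE bin-pack it.
spark.sql("CONVERT TO DELTA parquet.`/mnt/data/small_parquet_files/`")
spark.sql("OPTIMIZE delta.`/mnt/data/small_parquet_files/`")

# Option 2: stay on plain parquet and rewrite with fewer, larger files.
# spark.read.parquet("/mnt/data/small_parquet_files/") \
#      .repartition(8) \
#      .write.mode("overwrite").parquet("/mnt/data/compacted/")
```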

7 More Replies
shashigunda0211
by New Contributor III
  • 950 Views
  • 3 replies
  • 7 kudos

Cant find DBFS File Browser option in community edition settings

Hi, I was able to access DBFS, but I can no longer find the DBFS File Browser option in the settings. I used it two days ago, but now it's missing. Please help. Thanks, Shashi

Latest Reply
shashigunda0211
New Contributor III
  • 7 kudos

Thanks for the update @szymon_dybczak. It's reassuring to know that others are experiencing the same issue. I'll keep an eye out for any official announcements regarding the change.

2 More Replies
Vetrivel
by Contributor
  • 1244 Views
  • 5 replies
  • 0 kudos

Internal Errors in Databricks Queries via Go Library.

While executing a query directly from the Databricks editor for the entire table, no errors are observed. However, when querying through the Go library, we encounter an internal error. Your guidance on this issue would be greatly appreciated.

Latest Reply
michael569gardn
New Contributor III
  • 0 kudos

@Vetrivel wrote: While executing a query directly from the Databricks editor for the entire table, no errors are observed. However, when querying through the Go library, we encounter an internal error. Your guidance on this issue would be greatly appre...

4 More Replies
Henrik_
by New Contributor III
  • 946 Views
  • 2 replies
  • 1 kudos

Connect a SFTP to Databricks

What would be the best way to set up a connection to an SFTP server from Databricks? In JupyterLab, this can be done from the terminal. Of course, there are alternatives like using the paramiko library. But is there perhaps a more Databricks-ish solution?

Latest Reply
Panda
Valued Contributor
  • 1 kudos

@Henrik_ Try using Python libraries like spark-sftp or paramiko.
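
A minimal paramiko sketch; host, credentials, and paths are placeholders, and real credentials should live in a secret scope:

```python
# Sketch: download a file from SFTP to local disk, then copy it into
# cloud storage. All connection details are placeholders.
import paramiko

transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(
    username="svc_user",
    password=dbutils.secrets.get("my-scope", "sftp-password"),
)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    sftp.get("/outbound/data.csv", "/tmp/data.csv")          # land locally
    dbutils.fs.cp("file:/tmp/data.csv", "dbfs:/landing/data.csv")
finally:
    sftp.close()
    transport.close()
```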

1 More Replies
haydenLiQ
by New Contributor
  • 648 Views
  • 2 replies
  • 0 kudos

Databricks Tables in Excel

I attended the Databricks World Tour in 2023 (not the most recent one), where I remember mention of a feature in development, an add-in (or similar) for Excel that would allow native connection and import of a Databricks tab...

Latest Reply
Panda
Valued Contributor
  • 0 kudos

@haydenLiQ Power BI has native support for Databricks connections via the Azure Databricks connector. You can pull data from Databricks into Power BI, create datasets, and then use Power BI’s Excel integration to export these datasets into Excel. C...

1 More Replies
zmsoft
by Contributor
  • 14271 Views
  • 4 replies
  • 2 kudos

Resolved! Why is mounts = dbutils.fs.mounts() not available now?

Hi there, I am currently using cluster version 15.4 LTS with UC enabled. Azure Data Lake Storage Gen2 has hierarchical namespaces enabled. I tried the following three ways to mount external storage and all got an error: Mount point via the ADLS Gen2 acc...

Latest Reply
Panda
Valued Contributor
  • 2 kudos

@zmsoft Unity Catalog (UC) enforces strict access control policies, and traditional mounting techniques, such as using access keys or the dbutils.fs.mount command, are not recommended; see the best practices for DBFS and Unity Catalog. Databricks advises agains...
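
For completeness, a sketch of the UC-era alternative to mounts: direct access via a path governed by an external location. Storage account, container, and object names are placeholders:

```python
# Sketch: with Unity Catalog, skip mounts and read ADLS Gen2 directly
# through an abfss:// URI covered by an external location. Placeholders.
df = spark.read.format("parquet").load(
    "abfss://raw@mystorageacct.dfs.core.windows.net/sales/2024/"
)

# Or register the path as an external volume and use /Volumes/... paths:
# spark.sql("""
#   CREATE EXTERNAL VOLUME IF NOT EXISTS main.raw.sales_vol
#   LOCATION 'abfss://raw@mystorageacct.dfs.core.windows.net/sales/'
# """)
```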

3 More Replies
