Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

anonymous_567
by New Contributor II
  • 1663 Views
  • 1 reply
  • 0 kudos

Ingesting Non-Incremental Data into Delta

Hello, I have non-incremental data landing in a storage account. This data contains old data from before as well as new data. I would like to avoid doing a complete table deletion and table creation just to upload the data from storage and have an upd...

Latest Reply
AmanSehgal
Honored Contributor III
  • 0 kudos

Well, if you know the conditions that separate new data from old data, then while reading the data into your dataframe, use a filter or where clause to select the new data and ingest it into your Delta table. This is how you can do it in general. But if you ha...
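If the rows carry a unique key, a Delta MERGE achieves the same end (no table deletion and recreation) by inserting only the rows the target does not already hold. A minimal sketch, where the table names (`events`, `staged_updates`) and key column (`event_id`) are illustrative placeholders, not names from the thread:

```python
# Sketch: build a MERGE statement so re-ingested old rows are skipped
# and only genuinely new rows are inserted into the Delta table.
def build_merge(target: str, source: str, key: str) -> str:
    """Return a Delta MERGE that inserts only unmatched (new) rows."""
    return (
        f"MERGE INTO {target} t USING {source} s ON t.{key} = s.{key} "
        "WHEN NOT MATCHED THEN INSERT *"
    )

stmt = build_merge("events", "staged_updates", "event_id")
# Inside a Databricks notebook you would run: spark.sql(stmt)
```

Compared with a filter on an ingest-date column, the MERGE also stays correct when old and new rows are interleaved in the landed file.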

Olfa_Kamli
by New Contributor III
  • 1564 Views
  • 1 reply
  • 0 kudos

Log delivery is not creating data in S3 bucket

Hi, does anyone have an idea of the typical duration for Databricks to create logs in an S3 bucket using the databricks_mws_log_delivery Terraform resource? I've implemented the code provided in the Databricks official documentation, but I've be...

Latest Reply
Olfa_Kamli
New Contributor III
  • 0 kudos

The issue has been resolved. There was no problem with the code or the API. However, it took over 12 hours for logs to start appearing in my bucket, despite Databricks documentation indicating that logs should appear within 1 hour. Thank you!

TheIceBrick
by New Contributor III
  • 14430 Views
  • 3 replies
  • 1 kudos

Is there a (request) size limit for the Databricks REST API SQL statements?

When inserting rows through the SQL API (/api/2.0/sql/statements/), once more than a certain number of records (about 25 records with 8 small columns) are included in the statement, the call fails with the error: "The request could not be processed by...

Get Started Discussions
REST API
Sql Statements
Latest Reply
ChrisCkx
New Contributor II
  • 1 kudos

@TheIceBrick did you find out anything else about this? I am experiencing exactly the same: I can insert up to 35 rows but it breaks at about 50 rows. The payload size is 42 KB, and I am passing parameters for each row. @Debayan This is nowhere near the 16 MiB /...
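Until the underlying limit is clarified, a common workaround is to split the rows into batches small enough to stay under whatever threshold the API enforces, issuing one INSERT per batch. A minimal sketch; the table name, column handling, and the 25-row batch size are illustrative assumptions, and a production version should use the API's statement parameters rather than string interpolation:

```python
# Sketch: batch rows into multiple smaller INSERT statements so each
# request to /api/2.0/sql/statements/ stays under the observed limit.
from typing import Iterator, List


def chunked(rows: List[dict], size: int) -> Iterator[List[dict]]:
    """Yield successive batches of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]


def insert_statements(table: str, rows: List[dict], batch: int = 25) -> List[str]:
    """Build one multi-row INSERT statement per batch of rows."""
    stmts = []
    for group in chunked(rows, batch):
        # NOTE: repr() is only for illustration; real code should bind
        # values via the API's `parameters` field to avoid SQL injection.
        values = ", ".join(
            "(" + ", ".join(repr(r[c]) for c in sorted(r)) + ")" for r in group
        )
        stmts.append(f"INSERT INTO {table} VALUES {values}")
    return stmts
```

Each statement in the returned list would then be posted as a separate request to the Statement Execution API.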

2 More Replies
jenshumrich
by Contributor
  • 12265 Views
  • 2 replies
  • 0 kudos

Long running jobs get lost

Hello, I tried to schedule a long-running job and, surprisingly, it seems to neither terminate (and thus does not let the cluster shut down) nor continue running, even though the state is still "Running". But the truth is that the job has miserably ...

[screenshots attached]
Latest Reply
Lakshay
Databricks Employee
  • 0 kudos

Have you looked at the SQL plan to see what Spark job 72 was doing?

1 More Replies
chari
by Contributor
  • 2804 Views
  • 2 replies
  • 0 kudos

Reading CSV file with Spark throws [insufficient privilege] error

Hello Community, I have some CSV files saved in the Databricks workspace and want to read them with Spark. I make use of the command df = spark.read.format('csv').load(r'filepath'). However, it throws the error org.apache.spark.SparkSecurityException: [INSU...

Latest Reply
Lakshay
Databricks Employee
  • 0 kudos

If this is a UC-enabled workspace, you need to provide the right access.
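For a Unity Catalog table read, "the right access" typically means USE CATALOG and USE SCHEMA on the parents plus SELECT on the table itself. A minimal sketch of those grants; the catalog, schema, table, and principal names are hypothetical placeholders:

```python
# Sketch: the three grants a principal usually needs to read a UC table.
def uc_read_grants(catalog: str, schema: str, table: str, principal: str) -> list:
    """Return the GRANT statements for reading one Unity Catalog table."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{principal}`",
        f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO `{principal}`",
    ]

# A privileged user would run each statement with spark.sql(...).
```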

1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 5142 Views
  • 3 replies
  • 2 kudos

Resolved! Update regarding Community Reward Store

Hi Team, is there any update on the Community Reward Store? It's been discontinued from the old portal, and we still can't see the new portal for it. Is there any expected date when this will be available for community members?

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 2 kudos

Thanks for the update.

2 More Replies
chloeh
by New Contributor II
  • 1447 Views
  • 1 reply
  • 0 kudos

Using SQL for Structured Streaming

Hi! I'm new to Databricks. I'm trying to create a data pipeline with structured streaming. A minimal example data pipeline would look like: read from an upstream Kafka source, do some data transformation, then write to a downstream Kafka sink. I want to do...

Latest Reply
chloeh
New Contributor II
  • 0 kudos

OK, I figured out why I was getting an error on the usage of `read_kafka`. My default cluster was set up with the wrong Databricks runtime.
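For context, the SQL in question uses the `read_kafka` table-valued function, which is only available on newer Databricks runtimes, which is why an older runtime produces an error on it. A hedged sketch of such a query; the broker address and topic name are illustrative placeholders:

```python
# Sketch: a streaming SQL read from Kafka via read_kafka, to be run
# with spark.sql(...) on a runtime that supports the function.
query = """
SELECT CAST(value AS STRING) AS payload
FROM STREAM read_kafka(
  bootstrapServers => 'broker:9092',
  subscribe => 'events'
)
"""
# spark.sql(query) would start the streaming read on a supported runtime.
```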

liormayn
by New Contributor III
  • 2351 Views
  • 0 replies
  • 3 kudos

Error while encoding: java.lang.RuntimeException: org.apache.spark.sql.catalyst.util.GenericArrayDa

Hello :) We are trying to run an existing flow, which currently works on EMR, on Databricks. We use LTS 10.4, and when loading the data we get the following error: at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:...

RakeshRakesh_De
by New Contributor III
  • 4674 Views
  • 1 reply
  • 0 kudos

If a user has only 'SELECT' permission on a table in Unity Catalog but no permission on the external location

Hi, suppose a user has 'SELECT' permission on the table but does not have any permission on the table's external location. Will the user be able to read the data from the table? If yes, how will the user be able to read the wh...

Latest Reply
RakeshRakesh_De
New Contributor III
  • 0 kudos

Hi @Retired_mod, thanks for the response. Why is the hyperlinked command not showing in full?

RobinK
by Contributor
  • 3478 Views
  • 5 replies
  • 0 kudos

How to switch workspaces via the menu

Hello, in various webinars and videos featuring Databricks instructors, I have noticed that it is possible to switch between different workspaces using the top menu within a workspace. However, in our organization, we have three separate workspaces wi...

Latest Reply
Rajani
Contributor II
  • 0 kudos

Hi @RobinK, looking at the screenshots provided, I can see you have access to different workspaces, but the dropdown is still not visible for you. I also checked whether there is any setting for this, but I didn't find one. You can raise a ticket with Databricks and ...

4 More Replies
dustint121
by New Contributor II
  • 3858 Views
  • 1 reply
  • 1 kudos

Resolved! Issue with creating cluster on Community Edition

I have recently signed up for Databricks Community Edition and have yet to successfully create a cluster. I get this message when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the pro...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @dustint121, it's a Databricks internal issue; wait for some time and it will resolve itself.

halox6000
by New Contributor III
  • 4213 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks community edition down?

I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"

Latest Reply
dustint121
New Contributor II
  • 1 kudos

I still have this issue and have yet to successfully create a cluster instance. Please advise on how this error was fixed.

2 More Replies
anonymous_567
by New Contributor II
  • 3089 Views
  • 3 replies
  • 0 kudos

Autoloader update table when new changes are made

Hello, every day a new file of the same name gets sent to my storage account with old and new data appended at the end. Columns may also be added during one of these file updates. This file does a complete overwrite of the previous file. Is it possibl...

Latest Reply
data-grassroots
New Contributor III
  • 0 kudos

This may be helpful - the bit on allowing overwrites: https://docs.databricks.com/en/ingestion/auto-loader/faq.html
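Concretely, the FAQ entry points at the `cloudFiles.allowOverwrites` option, which lets Auto Loader reprocess a file that is re-uploaded under the same name, while schema evolution picks up newly added columns. A minimal sketch of the option set; the paths and target table name are illustrative placeholders:

```python
# Sketch: Auto Loader options for a source file that is overwritten in
# place, with columns occasionally added between uploads.
autoloader_options = {
    "cloudFiles.format": "csv",
    # Re-ingest a file when it is overwritten under the same name.
    "cloudFiles.allowOverwrites": "true",
    "cloudFiles.schemaLocation": "/mnt/schemas/events",
    # Evolve the schema when new columns appear in the file.
    "cloudFiles.schemaEvolutionMode": "addNewColumns",
}

# In a notebook, roughly:
# (spark.readStream.format("cloudFiles").options(**autoloader_options)
#      .load("/mnt/landing/")
#      .writeStream.option("mergeSchema", "true").toTable("events"))
```

Note that with overwrites enabled, rows already ingested from the previous version of the file will be read again, so downstream deduplication (e.g. a MERGE on a key) is usually still needed.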

2 More Replies
Chinu
by New Contributor III
  • 3548 Views
  • 1 reply
  • 0 kudos

System Tables - Billing schema

Hi Experts! We enabled UC and also the system table (billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id, but I'd...

Latest Reply
Kaizen
Valued Contributor
  • 0 kudos

@Retired_mod I'm also not seeing the compute names logged in the system billing tables. Is this located elsewhere?
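One possible route, assuming the documented system-table layout: the usage table records compute IDs under its `usage_metadata` struct rather than names, so joining to the compute system table surfaces a human-readable cluster name. A hedged sketch; verify the table and column names against the system-tables reference in your own workspace before relying on it:

```python
# Sketch: attribute DBU usage to cluster names by joining the billing
# usage table to the compute clusters system table.
query = """
SELECT u.workspace_id,
       c.cluster_name,
       SUM(u.usage_quantity) AS dbus
FROM system.billing.usage u
LEFT JOIN system.compute.clusters c
  ON u.usage_metadata.cluster_id = c.cluster_id
GROUP BY u.workspace_id, c.cluster_name
"""
# spark.sql(query) would run this in a UC-enabled workspace with
# access to the system schemas.
```

SQL-warehouse usage carries a warehouse ID instead of a cluster ID, so a similar join against the warehouse system table would be needed for that portion of the spend.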

Miguel_Grafana
by New Contributor
  • 1110 Views
  • 0 replies
  • 0 kudos

Azure OAuth Passthrough with the Go Driver

Can anyone point me towards some resources for achieving this? I already have the token. Trying with: dbsql.WithAccessToken(settings.Token). But I'm getting the following error: Unable to load OAuth Config: request error after 1 attempt(s): unexpected HT...

