Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

manupmanoos
by New Contributor III
  • 3665 Views
  • 1 reply
  • 0 kudos

How to save the best model checkpoint through the epochs of a deep learning network using callbacks?

I have created a neural network and I am training the model with the code below. The code fails to write to the Databricks file storage. Is there any other way to write the checkpoint to Databricks storage or to an S3 bucket directly? custom_early_...

Latest Reply
manupmanoos
New Contributor III
  • 0 kudos

Hi @Retired_mod, I am also not able to save it to local storage in Databricks DBFS. It shows an invalid operation when I try to save to the Databricks file storage. Additionally, I have valid AWS credentials with which I am able to save a model t...

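For context on the thread above, here is a minimal sketch of one common workaround: save the best checkpoint to local driver disk with a Keras callback, then copy it to DBFS (or upload it to S3). The paths, file names, and the use of TensorFlow/Keras are assumptions, not taken from the original post.

```python
# Illustrative only: save the best checkpoint locally, then copy it to DBFS.
# (Writing TF checkpoints straight to /dbfs is what the thread reports failing.)
import shutil
import tensorflow as tf

local_ckpt = "/tmp/best_model.h5"       # local driver disk (hypothetical path)
dbfs_ckpt = "/dbfs/tmp/best_model.h5"   # DBFS FUSE path (hypothetical path)

callbacks = [
    tf.keras.callbacks.ModelCheckpoint(
        filepath=local_ckpt,
        monitor="val_loss",
        save_best_only=True,   # keep only the best epoch
    )
]

# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=10, callbacks=callbacks)

# After training, copy the best checkpoint to durable storage
# (or upload it to S3 with boto3 instead).
shutil.copy(local_ckpt, dbfs_ckpt)
```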
eric2
by New Contributor II
  • 2769 Views
  • 3 replies
  • 0 kudos

Databricks Delta table Insert Data Error

When trying to insert data into a Delta table in Databricks, an error occurs as shown below. [TASK_WRITE_FAILED] Task failed while writing rows to abfss://cont-01@dlsgolfzon001.dfs.core.windows.net/dir-db999_test/D_RGN_INFO_TMP. In SQL, the results ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Seems OK to me. Have you tried to display the data from table A and also the B/C join?

2 More Replies
elgeo
by Valued Contributor II
  • 2796 Views
  • 1 reply
  • 0 kudos

Retrieve DBU per query executed

Hello experts, do you know how we can retrieve the DBUs consumed for a specific query? Thank you

Latest Reply
elgeo
Valued Contributor II
  • 0 kudos

I couldn't find a metadata table. However, the workaround is to take the DBU rating of the current cluster (retrieve it either online or, to be more accurate, from the compute page on the right) and multiply it by the time in minutes that the query took ...

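A quick illustration of the workaround described above, with made-up numbers (the DBU rate and query duration are hypothetical):

```python
# Illustrative only: estimate DBUs for one query from the cluster's DBU rate.
dbu_per_hour = 4.0      # hypothetical rate taken from the compute page
query_minutes = 12      # wall-clock time the query took
estimated_dbus = dbu_per_hour * query_minutes / 60
print(estimated_dbus)   # 0.8
```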
NielsMH
by New Contributor III
  • 2937 Views
  • 2 replies
  • 1 kudos

Running a notebook job from a remote GitHub repository fails, but does not fail for the Python script type

Hi all, I am trying to run a notebook from a remote repository, but the job fails. I set up the job as follows; my project structure is as shown in the attachments, but the output I get is as shown. The thing is, if I set the job type to "Python Script" I don't encounter this...

[Attachments: job-setup.png, folder_structure.png, job_output.png]
Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@NielsMH, if you want to run your jobs based on job name, please use the new preview feature that Databricks released, Databricks Asset Bundles (DAB); there you can run your job based on its name. Regarding the remote repo, are you using GitHub Actions or the API? Loo...

1 More Replies
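As a side note to the reply above, triggering an existing job by its name can also be done with the Databricks Python SDK. A rough sketch (the job name is a placeholder, and the SDK is assumed to pick up credentials from the environment):

```python
# Rough sketch: look up a job by name and trigger a run (databricks-sdk assumed).
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg

# "my-notebook-job" is a placeholder; jobs.list can filter on the job name.
job = next(w.jobs.list(name="my-notebook-job"))
w.jobs.run_now(job_id=job.job_id)
```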
FatemaMalu
by New Contributor II
  • 1750 Views
  • 1 reply
  • 1 kudos

Query Hash missing

From the following Databricks API, /api/2.0/preview/sql/queries, query_hash is missing from the actual response, but the sample response in the API documentation has it. { "count": 0, "page": 0, "page_size": 0, "results": [ { ...

hari007
by New Contributor II
  • 1921 Views
  • 1 reply
  • 1 kudos

Databricks cluster automated

Is there any way to automatically start a Databricks cluster when an event occurs, such as the cluster terminating for some reason, and have the Databricks cluster restart automatically thereafter? It should avoid a manual start.

Labels: Cluster automated
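There is no reply in the excerpt above, but as a purely illustrative sketch of one approach: a small script run from an external scheduler could poll the cluster state with the Databricks Python SDK and restart the cluster whenever it is terminated. The cluster ID and polling interval are hypothetical.

```python
# Illustrative sketch only: restart a cluster whenever it is found terminated.
import time
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import State

w = WorkspaceClient()
cluster_id = "0101-123456-abcdefgh"   # hypothetical cluster ID

while True:
    info = w.clusters.get(cluster_id=cluster_id)
    if info.state == State.TERMINATED:
        w.clusters.start(cluster_id=cluster_id)  # bring it back up
    time.sleep(300)                              # check every 5 minutes
```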
sg-vtc
by New Contributor III
  • 3425 Views
  • 1 reply
  • 0 kudos

Problem with workspace after metastore deleted

I am completely new to Databricks on AWS and started working on it a week ago. Please excuse me if I ask or did something silly. I created a workspace and a single-node cluster for testing. A metastore was created from the Databricks quickstart and it was automa...

Latest Reply
sg-vtc
New Contributor III
  • 0 kudos

I restarted the compute node and this problem went away: [ErrorClass=METASTORE_DOES_NOT_EXIST] Metastore 'b11fb1a0-a462-4dfb-b91b-e0795fde10b0' does not exist. New question: I am testing Databricks with non-AWS S3 object storage. I can access the non-A...

aerofish
by New Contributor III
  • 3781 Views
  • 3 replies
  • 1 kudos

Drop duplicates within watermark

Recently we have been using Structured Streaming to ingest data. We want to use a watermark to drop duplicated events, but we encountered some weird behavior and an unexpected exception. Can anyone help me understand what the expected behavior is and how should ...

Latest Reply
aerofish
New Contributor III
  • 1 kudos

Can any maintainer help me with this question?

2 More Replies
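For reference, a minimal sketch of watermark-based de-duplication in Structured Streaming on Spark 3.5+ (the Kafka source, schema, and column names are assumptions; `spark` is the session a Databricks notebook provides):

```python
# Illustrative sketch: drop repeated event_ids within the watermark delay.
from pyspark.sql import functions as F

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                       # placeholder topic
    .load()
    .selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", "event_id STRING, event_time TIMESTAMP").alias("e"))
    .select("e.*")
)

deduped = (
    events
    .withWatermark("event_time", "10 minutes")
    # Spark 3.5+: state for each event_id is kept only for the watermark delay.
    .dropDuplicatesWithinWatermark(["event_id"])
)
```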
bigt23
by New Contributor II
  • 5649 Views
  • 2 replies
  • 1 kudos

Resolved! Read zstd file from Databricks

I just started to read `zstd`-compressed files in Databricks on Azure, Runtime 14.1 on Spark 3.5.0. I've set the PySpark commands as follows: path = f"wasbs://{container}@{storageaccount}.blob.core.windows.net/test-zstd" schema = "some schema" df = spark.read...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

The available compression types are format-dependent. For JSON, zstd is not (yet) available, whereas for Parquet it is.

1 More Replies
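To illustrate the reply above: reading and writing zstd-compressed Parquet works out of the box, while the JSON reader has no zstd codec. The path below is a placeholder and `spark` is the notebook-provided session.

```python
# Illustrative only: zstd works for Parquet, not for spark.read.json.
path = "wasbs://container@account.blob.core.windows.net/test-zstd"  # placeholder

# Reading Parquet files that were written with zstd compression just works:
df = spark.read.parquet(path)

# Writing Parquet with zstd compression:
df.write.option("compression", "zstd").parquet(path + "-out")
```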
floringrigoriu
by New Contributor II
  • 3295 Views
  • 0 replies
  • 0 kudos

Can Error Messages be Un-Redacted?

Is there a way to un-redact the logging of error messages? Alternatively, it would be nice to have access to the source code of the involved classes, like com.databricks.backend.common.util.CommandLineHelper or com.databricks.util.UntrustedUtils. I'm getting t...

Abhiqa
by New Contributor II
  • 5526 Views
  • 1 reply
  • 1 kudos

How to schedule/refresh Databricks alerts using the REST API?

Hi, I am deploying Databricks SQL alerts using the REST API, but I can't seem to figure out how to schedule their refresh task. I went through the documentation; it says "Alerts can be scheduled using the sql_task type of the Jobs API, e.g. Jobs/Create". How...

[Attachments: Abhiqa_0-1697550139434.png, Abhiqa_1-1697550638337.png]
Labels: Alerts, REST API, sql query, sql_task
Latest Reply
btafur
Databricks Employee
  • 1 kudos

What they mention in the API docs is that you can create a job with a sql_task of type Alert. To make it easier, you can try creating the job in the UI first and downloading the JSON config. Here is an example with the main parameters that should ...

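Building on the reply above, a rough sketch of a Jobs API 2.1 payload that schedules an alert refresh via a sql_task; the host, token, alert ID, warehouse ID, and cron expression are all placeholders.

```python
# Rough sketch only: create a scheduled job whose single task refreshes an alert.
import requests

host = "https://<workspace-host>"          # placeholder
token = "<personal-access-token>"          # placeholder

job_spec = {
    "name": "refresh-my-alert",
    "schedule": {
        "quartz_cron_expression": "0 0 * * * ?",  # top of every hour
        "timezone_id": "UTC",
    },
    "tasks": [
        {
            "task_key": "refresh_alert",
            "sql_task": {
                "alert": {"alert_id": "<alert-id>"},   # placeholder
                "warehouse_id": "<warehouse-id>",      # placeholder
            },
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
)
print(resp.json())
```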
bfrank1972
by New Contributor III
  • 1062 Views
  • 0 replies
  • 0 kudos

Small files and discrepancy in S3 vs catalog

Hello all, I'm in the process of optimizing my tables and I'm running into a confusing situation. I have a table named "trace_messages_fg_streaming_event". If I navigate to the Databricks catalog, it shows these stats: Size: 6.7 GB, Files: 464. But when I look ...

[Attachment: bfrank1972_0-1697559008309.png]
