Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Kiranrathod
by New Contributor III
  • 3053 Views
  • 2 replies
  • 0 kudos

Autoloader - "cloudFiles.backfillInterval"

1. How do I use the cloudFiles.backfillInterval option in a notebook? 2. Does it need to be set as a property? 3. Where exactly is it placed - in the readStream portion of the code or the writeStream portion? 4. Do you have any sample code? 5. Where we find ...

Latest Reply
Kiranrathod
New Contributor III
  • 0 kudos

1. Is the following code correct for specifying the .option("cloudFiles.backfillInterval", 300)? df = spark.readStream.format("cloudFiles") \ .option("cloudFiles.format", "csv") \ .option("cloudFiles.schemaLocation", f"dbfs:/FileStore/xyz/back_fill_opti...

1 More Replies
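For reference, the backfillInterval option is set on the read side of the Auto Loader stream. A minimal sketch, assuming a CSV source; the DBFS paths, table name, and the "1 week" interval value below are hypothetical:

# Minimal Auto Loader stream with a periodic backfill, assuming a CSV source.
# Paths, table name, and the interval value are placeholders.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "dbfs:/FileStore/example/schema")
    # Ask Auto Loader to run a directory-listing backfill on this cadence,
    # to pick up any files missed by event notifications.
    .option("cloudFiles.backfillInterval", "1 week")
    .load("dbfs:/FileStore/example/input")
)

(
    df.writeStream
    .option("checkpointLocation", "dbfs:/FileStore/example/checkpoint")
    .trigger(availableNow=True)
    .toTable("example.bronze_events")
)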
mvmiller
by New Contributor III
  • 1876 Views
  • 0 replies
  • 0 kudos

Sharing compute between tasks of a job

Is there a way to set up a workflow with multiple tasks so that different tasks can share the same compute resource at the same time? I understand that an instance pool may be an option here. Wasn't sure if there were other possible options to cons...

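One option worth noting alongside instance pools: a Databricks job can declare a shared job cluster and point several tasks at it via job_cluster_key, so tasks in the same run reuse one cluster. A sketch of a Jobs API 2.1 payload; the job name, notebook paths, node type, and runtime version are placeholders:

# Sketch of a Jobs API 2.1 job definition where two tasks share one job cluster.
# Job name, notebook paths, node type, and runtime version are placeholders.
job_spec = {
    "name": "example-shared-compute-job",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Repos/example/ingest"},
        },
        {
            # Runs after "ingest" on the same shared job cluster.
            "task_key": "transform",
            "depends_on": [{"task_key": "ingest"}],
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Repos/example/transform"},
        },
    ],
}

The equivalent can be configured in the Workflows UI by selecting the same job cluster under Compute for each task.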
sg-vtc
by New Contributor III
  • 2993 Views
  • 1 reply
  • 0 kudos

Can the same Databricks account be used for both AWS and Azure

I am testing Databricks with non-AWS S3 object storage and want to test it with Databricks on Azure and Databricks on AWS. My Databricks account is currently using Databricks on AWS, with metadata and a single-node compute running. Can the same accou...

Latest Reply
sg-vtc
New Contributor III
  • 0 kudos

Thank you Kaniz for posting the link. Looking at that, I believe the answer is: this is not possible in Databricks for now.

neca36
by New Contributor
  • 1966 Views
  • 0 replies
  • 0 kudos

Databricks data engineer associate got paused

Hi team, I've faced a disappointing experience during my first certification attempt and need help in resolving the issue. While attending the certification - Databricks Data Engineer Associate - on every 2-3 questions I kept receiving a message that the ...

Reshma_123
by New Contributor II
  • 3370 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks data engineer associate Exam got suspended.

@Cert-Team I have registered for the Databricks Certified Data Engineer Associate exam, completed all biometric and other prerequisites required, and launched my exam. While writing the exam, it exited twice with some technical issue although I don...

Latest Reply
Reshma_123
New Contributor II
  • 0 kudos

Thank you for your support. I have given my test.

2 More Replies
Ankur_K
by New Contributor II
  • 4624 Views
  • 1 reply
  • 0 kudos

Connecting to Databricks using Python(VS Code)

I'm trying to connect to tables/views in Databricks using Python (via VS Code). However, I'm getting the following error: File "C:\Users\XXXXXXXX\AppData\Roaming\Python\Python311\site-packages\urllib3\util\retry.py", line 592, in increment raise MaxRe...

SSLCertificateError
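The traceback in the excerpt is cut off, but urllib3's MaxRetryError typically surfaces when the TLS handshake fails or the hostname/HTTP path is wrong (a corporate proxy that re-signs certificates is a common culprit for SSL certificate errors). For comparison, a minimal working connection with the databricks-sql-connector package; the hostname, HTTP path, and token below are placeholders:

from databricks import sql  # pip install databricks-sql-connector

# Placeholder workspace coordinates: copy the real values from the SQL
# warehouse's "Connection details" tab, plus a personal access token.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())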
MeggieFox
by New Contributor III
  • 10621 Views
  • 3 replies
  • 3 kudos

Azure Databricks User Alerts: query_result_table

Hi, How can I set up the notification email to show all rows from query_result_table, not only the first 10?

Latest Reply
MeggieFox
New Contributor III
  • 3 kudos

I see that my previous message has been cut. So, I just wanted to check where I can change the email setting using the code above, because when I open User > Account Settings > Notifications I can see only an option like this: there is no place to configur...

2 More Replies
NC
by New Contributor III
  • 5355 Views
  • 1 reply
  • 0 kudos

Unable to use job cluster for task in workflows

Hi, I have a workflow set up in Databricks using 12.2 LTS ML. I am trying to use a job cluster for the task but I am getting the following error: Spark Conf: ‘spark.databricks.acl.enabled’ is not allowed when choosing an access mode. As a result I have to...

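That error usually indicates the job cluster both selects an access mode and sets spark.databricks.acl.enabled explicitly in its Spark config; removing the conf and letting the access mode govern table ACLs is the usual fix. A sketch of a job-cluster spec along those lines; the runtime string, node type, and access mode are assumptions:

# Hypothetical job cluster spec: no spark.databricks.acl.enabled entry in
# spark_conf, with access governed by data_security_mode instead.
new_cluster = {
    "spark_version": "12.2.x-cpu-ml-scala2.12",  # 12.2 LTS ML, as in the post
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "data_security_mode": "SINGLE_USER",  # or "USER_ISOLATION"
    "spark_conf": {},  # intentionally empty: the ACL conf conflicts with the access mode
}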
PiotrU
by Contributor II
  • 18667 Views
  • 1 reply
  • 1 kudos

Resolved! Some streams terminated before this command could finish! -> java.lang.NoClassDefFoundError: scala/c

Hello, I do face: Some streams terminated before this command could finish! java.lang.NoClassDefFoundError: scala/compat/java8/FutureConverters$ Running some very simple query on Event Hubs: df = spark \ .readStream \ .format("eventhubs") \ .options(**ehConf) ...

Latest Reply
PiotrU
Contributor II
  • 1 kudos

Of course, just after writing that post I realized how silly this question is... after adding scala_java8_compat_2_12_1_0_2.jar it works as expected.

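For anyone hitting the same NoClassDefFoundError: the fix above corresponds to attaching the scala-java8-compat library to the cluster (the Maven coordinate org.scala-lang.modules:scala-java8-compat_2.12:1.0.2 matches the jar name mentioned), alongside the azure-eventhubs-spark connector. With both in place, the stream from the excerpt looks roughly like this; the connection string and consumer group are placeholders:

# Assumes the azure-eventhubs-spark connector and scala-java8-compat are
# attached to the cluster. Connection string and consumer group are placeholders.
connection_string = "Endpoint=sb://<namespace>.servicebus.windows.net/;EntityPath=<eventhub>;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"

sc = spark.sparkContext
ehConf = {
    # The connector expects the connection string to be encrypted client-side.
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(connection_string),
    "eventhubs.consumerGroup": "$Default",
}

df = (
    spark.readStream
    .format("eventhubs")
    .options(**ehConf)
    .load()
)

display(df)  # the "body" column is binary; cast it to string downstream as needed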
Bhanu1
by New Contributor III
  • 1844 Views
  • 0 replies
  • 0 kudos

Thoughts on how to improve string search queries

Please see the sample code I am running below. What options can I explore to improve the speed of query execution in such a scenario? The current full code takes about 4 hrs to run on 1.5 billion rows. Thanks! SELECT fullVisitorId, VisitId, EventDate, PagePath, d...

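Without the full query it is hard to be specific, but for LIKE-style string filters on a large Delta table two levers usually matter most: making the non-string predicates (dates, IDs) selective enough for file skipping, and clustering the table on those columns. A rough sketch with hypothetical table and filter values:

from pyspark.sql import functions as F

# Hypothetical table and filter values, based on the column names in the excerpt.
hits = spark.table("analytics.page_hits")

result = (
    hits
    # Selective date predicate first, so Delta can skip files via statistics.
    .filter(F.col("EventDate").between("2024-01-01", "2024-01-31"))
    # Substring match is equivalent to LIKE '%/checkout%' and cannot be skipped,
    # so it should run over as little data as possible.
    .filter(F.col("PagePath").contains("/checkout"))
    .select("fullVisitorId", "VisitId", "EventDate", "PagePath")
)

# Clustering on the selective column improves the file skipping above.
spark.sql("OPTIMIZE analytics.page_hits ZORDER BY (EventDate)")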
AJ270990
by Contributor II
  • 3959 Views
  • 1 reply
  • 0 kudos

API for Databricks code functionality

I have a Databricks notebook for which I want to create an API. From that API I will have to call the notebook and perform certain operations. The result will be sent back to the API. I don't want to do it via Postman, as someone has to install Postman at their ...

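One way to avoid Postman entirely is a small Python client against the Jobs REST API: wrap the notebook in a job, trigger it with run-now, and read back whatever the notebook returns via dbutils.notebook.exit(). Workspace URL, token, job id, and parameters below are placeholders:

import time
import requests

# Placeholder workspace URL, token, and job id.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
HEADERS = {"Authorization": "Bearer dapiXXXXXXXXXXXXXXXX"}
JOB_ID = 123456

# Trigger the notebook job with parameters.
run = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers=HEADERS,
    json={"job_id": JOB_ID, "notebook_params": {"input_date": "2024-01-01"}},
).json()

# Poll until the run finishes, then fetch the value the notebook passed
# to dbutils.notebook.exit().
while True:
    status = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get",
        headers=HEADERS,
        params={"run_id": run["run_id"]},
    ).json()
    if status["state"]["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        break
    time.sleep(10)

output = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get-output",
    headers=HEADERS,
    params={"run_id": run["run_id"]},
).json()
print(output.get("notebook_output"))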
Anonym
by New Contributor II
  • 1576 Views
  • 1 reply
  • 0 kudos

Error ingesting files with databricks jobs

The source path that I want to ingest files from is: "gs://bucket-name/folder1/folder2/*/*.json". I have a file in this path that ends with ".json.gz", and the Databricks job ingests this file even though it isn't supposed to. How can I fix it? Thanks.

Latest Reply
Anonym
New Contributor II
  • 0 kudos

Thanks Kaniz

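If the directory glob alone keeps pulling in the compressed object, two extra guards can be applied on the read itself: pathGlobFilter, which matches file names only, and a filter on input_file_name(). A sketch, assuming a plain batch JSON read of the bucket path from the post:

from pyspark.sql import functions as F

# Hypothetical source path from the post above.
source_path = "gs://bucket-name/folder1/folder2/*/*.json"

df = (
    spark.read
    .format("json")
    # pathGlobFilter is applied to file names only, independent of the
    # directory glob in the path itself.
    .option("pathGlobFilter", "*.json")
    .load(source_path)
    # Belt and braces: drop anything whose physical file name is still not
    # a plain .json file (e.g. the stray .json.gz object).
    .withColumn("_source_file", F.input_file_name())
    .filter(F.col("_source_file").endswith(".json"))
    .drop("_source_file")
)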
