Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

JohnSmith2
by New Contributor II
  • 5451 Views
  • 4 replies
  • 2 kudos

Resolved! Error on Workflow

Hi, I have a mysterious situation here. My workflow (job) ran and got an error -> [INVALID_IDENTIFIER] The identifier transactions-catalog is invalid. Please, consider quoting it with back-quotes as `transactions-catalog`.(line 1, pos 12) == SQL ==...
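For context, Spark SQL treats a hyphen in an unquoted identifier as an operator, so a catalog, schema, or table name containing one must be wrapped in backticks. A minimal sketch, assuming a Databricks notebook where `spark` is predefined; the schema and table names are placeholders:

```python
# The catalog name comes from the post; schema/table names are hypothetical.
# Without backticks, Spark parses "transactions-catalog" as a subtraction.
df = spark.sql("SELECT * FROM `transactions-catalog`.some_schema.some_table")

# Alternatively, set the catalog once, then reference objects normally:
spark.sql("USE CATALOG `transactions-catalog`")
```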

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Jobs are just notebooks executed in the background, so if the notebook is the same between an interactive (manual) run and a job run, there should be no difference. So I don't see what is wrong. Is the job using DLT, perhaps?

3 More Replies
DBEnthusiast
by New Contributor III
  • 6013 Views
  • 1 reply
  • 1 kudos

Databricks Cluster

Hi All, I am curious to know the difference between a Spark cluster and a Databricks one. As per the info I have read, a Spark cluster creates the driver and workers when the application is submitted, whereas in Databricks we can create a cluster in advance in c...

Mohan2
by New Contributor
  • 4755 Views
  • 0 replies
  • 0 kudos

SQL Warehouse - several issues

Hi there, I am facing several issues while trying to run the SQL warehouse starter on Azure Databricks. Please note I am new to this data world, Azure, and Databricks. While starting the SQL starter warehouse in the Databricks trial version, I am getting these ...

eimis_pacheco
by Contributor
  • 5638 Views
  • 2 replies
  • 2 kudos

Resolved! Is it no longer necessary to preserve data in its original format when using the medallion architecture?

Hi Community, I have a question. The bronze layer always causes confusion for me. Someone mentioned, "File Format: Store data in Delta Lake format to leverage its performance, ACID transactions, and schema evolution capabilities" for bronze layers. Then, ...
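For what it's worth, a common pattern is to land the raw payload untransformed in a bronze Delta table, so the Delta copy itself preserves the original records. A minimal sketch, assuming a JSON landing path; the path and table name are placeholders:

```python
# Read the raw files as-is (hypothetical S3 landing path).
raw = spark.read.format("json").load("s3://landing-bucket/events/")

# Append untransformed into a bronze Delta table (hypothetical name), so the
# original data can always be replayed by downstream layers.
raw.write.format("delta").mode("append").saveAsTable("bronze.events_raw")
```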

vivek2612
by New Contributor II
  • 10960 Views
  • 5 replies
  • 0 kudos

Databricks Certified Data Engineer Associate exam got suspended

Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 20 minutes. My exam was suspended due to eye movement, without any warning. I was not moving my eyes away from the laptop screen. Some questions in the exam are so big, so I...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

@rajib_bahar_ptg funny, not funny, right?! I just posted that tip today in this post: https://community.databricks.com/t5/certifications/minimize-the-chances-of-your-exam-getting-suspended-tip-3/td-p/45712

4 More Replies
Mbinyala
by New Contributor II
  • 8719 Views
  • 6 replies
  • 3 kudos

Cluster policy not showing while creating Delta Live Tables pipeline

Hi all! I have created a cluster policy, but when I want to use it while creating a DLT pipeline, it shows none. I have checked that I have all the necessary permissions to create cluster policies. Still, the DLT UI shows none.

Latest Reply
Rishitha
New Contributor III
  • 3 kudos

@btafur Can we also set the auto_terminate minutes with the policy (for the DLT cluster type)?
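For reference, a cluster policy can pin autotermination via the `autotermination_minutes` attribute. A hedged sketch using the Databricks Python SDK; the policy name and value are placeholders, and note that DLT manages its own compute lifecycle, so this attribute may not apply to DLT pipeline clusters:

```python
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up credentials from the environment or a config profile

# Fix autotermination to 30 minutes for clusters created under this policy.
definition = {"autotermination_minutes": {"type": "fixed", "value": 30}}

w.cluster_policies.create(
    name="auto-terminate-30",          # hypothetical policy name
    definition=json.dumps(definition),
)
```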

5 More Replies
RithwikMR
by New Contributor
  • 1979 Views
  • 1 reply
  • 0 kudos

How to automate creating notebooks when I have multiple .html or .py files

Hi all, I have 50+ .html and .py files, for which I have to create a separate notebook for each and every one of them. Manually creating a notebook using the UI and importing the .html/.py file is a bit tedious and time-consuming. Is t...

Latest Reply
btafur
Databricks Employee
  • 0 kudos

Depending on your use case and requirements, one alternative would be to create a script that loops through your files and uploads them using the API. You can find more information about the API here: https://docs.databricks.com/api/workspace/workspa...
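As a hedged sketch of that approach, the loop below posts each file to the Workspace Import API (POST /api/2.0/workspace/import). The host, token, and directory paths are all placeholders:

```python
import base64
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # personal access token
local_dir = "./notebooks_src"                        # hypothetical local folder
target_dir = "/Users/someone@example.com/imported"   # hypothetical workspace folder

for fname in os.listdir(local_dir):
    stem, ext = os.path.splitext(fname)
    if ext.lower() not in (".py", ".html"):
        continue
    with open(os.path.join(local_dir, fname), "rb") as f:
        content = base64.b64encode(f.read()).decode("utf-8")

    body = {"path": f"{target_dir}/{stem}", "content": content, "overwrite": True}
    if ext.lower() == ".py":
        body.update(format="SOURCE", language="PYTHON")  # import .py as a Python notebook
    else:
        body.update(format="HTML")                       # import exported notebook HTML

    resp = requests.post(
        f"{host}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {token}"},
        json=body,
    )
    resp.raise_for_status()
    print(f"Imported {fname} -> {body['path']}")
```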

adrianna2942842
by New Contributor III
  • 6903 Views
  • 2 replies
  • 3 kudos

Resolved! Limitations of committing ipynb notebook output with Repos

During my experimentation with the latest feature that allows including notebook output in a commit, I ran into a specific issue. While attempting to commit my recent changes, I encountered an error message stating "Error fetching Git status." Intere...

Latest Reply
adrianna2942842
New Contributor III
  • 3 kudos

I've found that the restriction I've encountered isn't related to the file size within Repos, but rather the maximum file size that can be shown in the Azure Databricks UI. You can find this limitation documented at https://learn.microsoft.com/en-us/...

1 More Replies
PériclesTD
by New Contributor
  • 15755 Views
  • 2 replies
  • 2 kudos

workspace

How can I access the Workspace?

Latest Reply
Debayan
Databricks Employee
  • 2 kudos

Hi, you can try checking https://docs.databricks.com/en/administration-guide/workspace/index.html; please let us know if this helps. Also, please tag @Debayan in your next response, which will notify me. Thank you!

1 More Replies
dvmentalmadess
by Valued Contributor
  • 9405 Views
  • 1 reply
  • 1 kudos

Structured Streaming of S3 source

I am trying to set up S3 as a structured streaming source. The bucket receives ~17K files/day and the original load to the bucket was ~54K files. The bucket was first loaded 3 months ago and we haven't started reading from it until now. So let's say there...

Latest Reply
dvmentalmadess
Valued Contributor
  • 1 kudos

Thanks, we were able to make things work by increasing the driver instance size so it has more memory for the initial load. After the initial load we scaled the instance down for subsequent runs. We're still testing; if we aren't able to make it work we'l...
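Another knob worth mentioning for large initial backfills: bounding how many files each micro-batch reads, rather than (or in addition to) scaling the driver. A hedged sketch using Auto Loader; the bucket, schema/checkpoint locations, and table name are placeholders:

```python
# Cap the backlog per micro-batch so the initial load stays within driver memory.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")               # assumed source format
    .option("cloudFiles.maxFilesPerTrigger", 1000)     # bound files per batch
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schema/events")
    .load("s3://my-bucket/events/")
)

(
    stream.writeStream
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/events")
    .trigger(availableNow=True)  # drain the backlog in bounded batches, then stop
    .toTable("bronze.events")
)
```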

jl1
by New Contributor
  • 1920 Views
  • 0 replies
  • 0 kudos

"command complete" but not executed

I have a notebook that calls other notebooks with `dbutils.notebook.run` and executes them as a 'Notebook job'. But sometimes, when a notebook is taking a long time and the cluster is just waiting for, for instance, an API response, the subsequent comm...
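For reference, a minimal sketch of the pattern being described; the child notebook path and arguments are placeholders. A timeout of 0 means no timeout, so setting one explicitly makes a long API wait fail rather than hang:

```python
# Run a child notebook synchronously as an ephemeral notebook job.
result = dbutils.notebook.run(
    "/Users/someone@example.com/child_notebook",  # hypothetical path
    3600,                                          # timeout_seconds
    {"run_date": "2023-09-01"},                    # hypothetical parameter
)
print(result)  # whatever the child passed to dbutils.notebook.exit(...)
```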

Sujitha
by Databricks Employee
  • 35295 Views
  • 8 replies
  • 6 kudos

Get up to speed on Generative AI with this free on-demand training

Build foundational knowledge of generative AI, including large language models (LLMs), with 4 short videos. Here is how it works:
  • Watch 4 short tutorial videos
  • Pass the knowledge test
  • Earn a badge for Generative AI Fundamentals you can share on your ...

Latest Reply
rahmat_kun
New Contributor II
  • 6 kudos

Videos not running

7 More Replies
Theor
by New Contributor III
  • 17070 Views
  • 6 replies
  • 2 kudos

Exam retest voucher

Hi Databricks Team, I sat for the Databricks Certified Machine Learning Professional exam for the 2nd time (10 Sept 2023), but didn't pass again. I got 66.66% overall. I am a seasoned Databricks user, but this particular exam is quite an unorthodox one. Nevertheles...

Get Started Discussions
Certificate
machine learning
Latest Reply
Cert-Team
Databricks Employee
  • 2 kudos

Hi @Theor, thank you for submitting a ticket to our support team! They are working on it.

5 More Replies
