Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

halox6000
by New Contributor III
  • 2397 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Community Edition down?

I am getting this error when trying to create a cluster: "Self-bootstrap failure during launch. Please try again later and contact Databricks if the problem persists. Node daemon fast failed and did not answer ping for instance"

Latest Reply
dustint121
New Contributor II
  • 1 kudos

I still have this issue and have yet to successfully create a cluster instance. Please advise on how this error was fixed.

2 More Replies
Chinu
by New Contributor III
  • 1535 Views
  • 3 replies
  • 0 kudos

System Tables - Billing schema

Hi Experts! We enabled UC and also the system table (Billing) to start monitoring usage and cost. We were able to create a dashboard where we can see the usage and cost for each workspace. The usage table in the billing schema has workspace_id but I'd...

Latest Reply
Kaizen
Valued Contributor
  • 0 kudos

@Kaniz_Fatma I'm also not seeing the compute names logged in the system billing tables. Are they located elsewhere?

2 More Replies
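
A hedged sketch of the join being asked about above: billing usage rows carry IDs rather than names, so compute names have to be looked up in the compute inventory system table. Table and column names (system.billing.usage, system.compute.clusters, usage_metadata.cluster_id) follow the system tables documentation, but verify them in your workspace, and the system.compute schema must be enabled.

# Sketch only: run in a Databricks notebook with both system schemas enabled.
# clusters keeps one row per config change, so dedupe before joining.
usage_by_cluster = spark.sql("""
    SELECT u.workspace_id,
           c.cluster_name,
           SUM(u.usage_quantity) AS dbus
    FROM system.billing.usage AS u
    LEFT JOIN (SELECT DISTINCT cluster_id, cluster_name
               FROM system.compute.clusters) AS c
           ON u.usage_metadata.cluster_id = c.cluster_id
    GROUP BY u.workspace_id, c.cluster_name
""")
display(usage_by_cluster)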
kp12
by New Contributor II
  • 5256 Views
  • 4 replies
  • 1 kudos

column "id" is of type uuid but expression is of type character varying.

Hello, I'm trying to write to an Azure PostgreSQL flexible server database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but getting the follow...

Latest Reply
Student-Learn
New Contributor II
  • 1 kudos

Yes, this Stack Overflow post was my reference too, and adding the option below made the load succeed with no error on the UUID data type in the Postgres column: .option("stringtype", "unspecified") https://stackoverflow.com/questions/409739...

3 More Replies
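
For anyone hitting the same uuid/varchar error, a minimal sketch of the workaround from the reply above, with hypothetical host, database, table, and credential values; stringtype = "unspecified" is the Postgres JDBC setting that leaves string literals untyped so the server can cast them to uuid. This is the fix reported by the poster, not an officially documented connector flag.

# Sketch only: every connection detail below is a placeholder.
(
    df.write.format("postgresql")
    .option("host", "my-server.postgres.database.azure.com")  # hypothetical
    .option("database", "mydb")                               # hypothetical
    .option("dbtable", "public.my_table")                     # hypothetical
    .option("user", "my_user")                                # hypothetical
    .option("password", my_password)                          # hypothetical
    .option("stringtype", "unspecified")  # let Postgres cast strings to uuid
    .mode("append")
    .save()
)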
Raja_fawadAhmed
by New Contributor
  • 511 Views
  • 1 reply
  • 0 kudos

Databricks job compute price w.r.t. running time

I have two workflows (jobs) in Databricks (AWS) with the cluster specs below (job cluster, NOT general purpose). Driver: i3.xlarge · Workers: i3.xlarge · 2-8 workers. Job 1 takes 10 min to complete; Job 2 takes 50 min to complete. Questions: DBU cost is sam...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Raja_fawadAhmed, DBU Cost for Both Jobs: Databricks pricing is based on DBUs (Databricks Units) consumed. The cost depends on the type of compute instances used and the specific workload. For your two jobs, the DBU cost would be calculated...

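
To make the relationship concrete, a rough sketch of the arithmetic: DBU cost scales linearly with runtime and node count, so with an identical cluster spec the 50-minute job costs about five times the 10-minute one. The DBU rate and $/DBU below are hypothetical placeholders; substitute the published Jobs Compute rates for i3.xlarge.

DBU_PER_NODE_HOUR = 1.0  # hypothetical DBU rate per i3.xlarge node
USD_PER_DBU = 0.15       # hypothetical Jobs Compute price per DBU

def job_cost_usd(runtime_min, nodes):
    # cost = nodes x DBU rate x hours x price per DBU
    return nodes * DBU_PER_NODE_HOUR * (runtime_min / 60.0) * USD_PER_DBU

print(job_cost_usd(10, nodes=3))  # job 1: driver + 2 workers, 10 min
print(job_cost_usd(50, nodes=3))  # job 2: driver + 2 workers, 50 min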
Surajv
by New Contributor III
  • 489 Views
  • 1 reply
  • 0 kudos

Difference between the delete token API and the revoke token API in Databricks

Hi Community, I am trying to understand the difference between the Delete token API (DELETE /api/2.0/token-management/tokens/{token_id}) and the Revoke token API (POST /api/2.0/token/delete), as when I create more than 600 tokens I get a QUOTA_EXCEEDED error....

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Delete token API doc link: https://docs.databricks.com/api/workspace/tokenmanagement/delete
Revoke token API doc link: https://docs.databricks.com/api/workspace/tokens/revoketoken

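
A minimal sketch contrasting the two endpoints from the links above; the workspace URL, bearer tokens, and token IDs are placeholders. The token-management endpoint is an admin API that can delete any user's token, while POST /api/2.0/token/delete revokes a token belonging to the caller.

import requests

HOST = "https://<workspace-url>"                               # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder

# Token Management API (admin): delete any token in the workspace by id.
requests.delete(f"{HOST}/api/2.0/token-management/tokens/<token_id>",
                headers=HEADERS)

# Tokens API (caller): revoke one of your own tokens by id.
requests.post(f"{HOST}/api/2.0/token/delete",
              headers=HEADERS, json={"token_id": "<token_id>"})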
NC
by New Contributor III
  • 531 Views
  • 1 reply
  • 0 kudos

Using libpostal in Databricks

Hi, I am trying to work on address parsing and would like to use libpostal in Databricks. I have used the official Python bindings (GitHub - openvenues/pypostal: Python bindings to libpostal for fast international address parsing/normalization): pip insta...

Latest Reply
NC
New Contributor III
  • 0 kudos

I managed to install pylibpostal via the cluster library, but I cannot seem to download the data needed to run it. Please help. Thank you.

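
For anyone following along, a minimal usage sketch. It assumes the libpostal C library and its large (roughly 2 GB) data files are already installed on every node, for example via a cluster init script; installing the Python bindings alone is not sufficient, which matches the symptom described above.

# Sketch only: fails unless libpostal and its downloaded data are present
# on each node of the cluster. The sample address is arbitrary.
from postal.parser import parse_address

parsed = parse_address("781 Franklin Ave Crown Heights Brooklyn NYC NY 11216 USA")
print(parsed)  # a list of (value, label) pairs, e.g. ('781', 'house_number')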
hpicatto
by New Contributor III
  • 767 Views
  • 1 reply
  • 0 kudos

Download event and run logs

How can I download the run and event logs? The Spark UI is loading them from somewhere, but I couldn't find them in DBFS or on S3.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @hpicatto, The Spark Web UI provides a suite of user interfaces (UIs) for monitoring your Spark cluster. You can access it by navigating to http://<driver-node-ip>:18080 in your web browser. On the appropriate application page, click the Download ...

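
One way to make these logs land somewhere browsable is cluster log delivery, configured on the cluster itself; once set, driver logs, executor logs, and event logs are delivered to the chosen DBFS (or S3) path every few minutes. A hedged sketch via the Clusters API follows; all IDs and the destination path are placeholders, and note that clusters/edit requires the full cluster spec, not just the changed field.

import requests

HOST = "https://<workspace-url>"                               # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder

requests.post(f"{HOST}/api/2.0/clusters/edit", headers=HEADERS, json={
    "cluster_id": "<cluster-id>",        # placeholder
    "spark_version": "<spark-version>",  # restate the cluster's existing spec
    "node_type_id": "<node-type>",       # here; edit replaces the whole config
    "num_workers": 2,
    "cluster_log_conf": {"dbfs": {"destination": "dbfs:/cluster-logs"}},
})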
super7admin
by New Contributor
  • 1236 Views
  • 2 replies
  • 0 kudos

Unable to see AI Playground under Machine Learning in the dashboard

Unable to see AI Playground under Machine Learning in the dashboard.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @super7admin, Please check this document - https://docs.databricks.com/en/large-language-models/ai-playground.html

1 More Replies
Surajv
by New Contributor III
  • 688 Views
  • 1 reply
  • 0 kudos

What is the quota limit for the create user token API?

Hi Community, I was going through this doc: https://docs.databricks.com/api/workspace/tokens/create and learned that there is a quota limit on how many tokens one can generate using the API POST /api/2.0/token/create; having breached the thre...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Surajv, Let’s dive into the details of token quotas in Databricks. Quota Limit for Token Creation: The quota limit for creating user tokens via the API (specifically, using POST /api/2.0/token/create) is essential to manage token usage. Each u...

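
A minimal sketch of the call in question, with the workspace URL and token as placeholders; requesting a short lifetime_seconds and revoking tokens you no longer need is the usual way to stay under the per-user cap.

import requests

HOST = "https://<workspace-url>"                               # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder

resp = requests.post(f"{HOST}/api/2.0/token/create", headers=HEADERS,
                     json={"lifetime_seconds": 3600,
                           "comment": "short-lived job token"})
resp.raise_for_status()  # a QUOTA_EXCEEDED rejection surfaces as an HTTP error
print(resp.json()["token_value"])  # shown only once; store it securely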
Surajv
by New Contributor III
  • 521 Views
  • 1 reply
  • 0 kudos

Number of tokens generated for a service principal

Hi community, Is there any API or option to view all PAT tokens generated by a Databricks service principal?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Surajv, When working with Databricks service principals, you can manage and view personal access tokens (PATs) associated with them. Here’s how you can achieve this: What is a Service Principal? A service principal is an identity created in ...

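
A hedged sketch using the Token Management API's list endpoint, which accepts created-by filters; the admin token and the service principal's application ID below are placeholders.

import requests

HOST = "https://<workspace-url>"                     # placeholder
HEADERS = {"Authorization": "Bearer <admin-token>"}  # placeholder

resp = requests.get(f"{HOST}/api/2.0/token-management/tokens",
                    headers=HEADERS,
                    params={"created_by_username": "<sp-application-id>"})
for info in resp.json().get("token_infos", []):
    print(info["token_id"], info.get("comment"), info.get("expiry_time"))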
DatabricksGuide
by Community Manager
  • 948 Views
  • 0 replies
  • 0 kudos

Join Our Databricks Free Trial Experience feedback AMA on Friday March 29, 2024!

We're looking for feedback on the Databricks free trial experience, and we need your help! Whether you've used it for data engineering, data science, or analytics, Sujit Nair, our Product Manager on the free trial experience, and our journey archite...

Еmil
by New Contributor III
  • 3226 Views
  • 3 replies
  • 1 kudos

Resolved! source set to GIT for Databricks Asset Bundle notebook_task - git authentication fails on run

My post was marked as spam after I tried to post the description of my issue, so I have posted the question on Stack Overflow instead.

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Еmil, I've read through your question and believe I have a solution for you. Since your job runs as a service principal, consider using OAuth M2M authentication for accessing your Azure DevOps Git repository. En...

2 More Replies
Frustrated_DE
by New Contributor II
  • 1309 Views
  • 1 reply
  • 0 kudos

DLT SQL demo pipeline issue

Hi, first foray into DLT, following code excerpts from the sample DLT notebook. I'm creating a notebook with the SQL below:
CREATE STREAMING LIVE TABLE sales_orders_raw
COMMENT "The raw sales orders, ingested from /databricks-datasets."
TBLPROPERTIES ...

Latest Reply
Frustrated_DE
New Contributor II
  • 0 kudos

Fixed by changing the notebook's default language rather than using a magic command. I normally have it set to Python, and I wrongly assumed DLT would transpose; you can't use a magic command, you have to change the default language for it to work.

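
Since the fix was to make Python the notebook's default language, a minimal sketch of what the equivalent pipeline step looks like in Python DLT; the source path and options are illustrative rather than taken from the original notebook.

import dlt

@dlt.table(
    comment="The raw sales orders, ingested from /databricks-datasets.",
)
def sales_orders_raw():
    # Auto Loader over the sample dataset; the path is illustrative.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/databricks-datasets/retail-org/sales_orders/")
    )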
hpicatto
by New Contributor III
  • 1615 Views
  • 3 replies
  • 0 kudos

Using the API for getting cost in USD

I'm trying to use the billable usage API, and I do get a report, but I have not been able to get the USD cost report, only the dbuHours. I guess I have to change the meter_name, but I cannot find the key for that parameter anywhere.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @hpicatto, AWS Usage Reports: AWS provides detailed usage and cost reports through the AWS Cost and Usage Report. You can access this report via the AWS Management Console. Here are the steps: Log in to the AWS Management Console. Navigate to the B...

2 More Replies
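
A hedged sketch of the account-level billable usage download; the account ID and credentials are placeholders. Note the report returns DBU quantities rather than dollars, so a USD figure has to be computed by joining the rows with your contracted rates (or, where system tables are enabled, with system.billing.list_prices).

import requests

ACCOUNT_ID = "<account-id>"  # placeholder
resp = requests.get(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/usage/download",
    auth=("<account-admin-email>", "<password>"),  # placeholder credentials
    params={"start_month": "2024-01", "end_month": "2024-03",
            "personal_data": "false"},
)
resp.raise_for_status()
with open("usage.csv", "wb") as f:
    f.write(resp.content)  # CSV of DBU usage per workspace, SKU, and hour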
Floody
by New Contributor II
  • 1454 Views
  • 1 reply
  • 0 kudos

New draft for every post I visit

When I visit my profile page, under the drafts section I see an entry for every post I visit in the discussions. Is this normal?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Floody, Yes, it is normal to see an entry for every post you visit in the discussions under the drafts section of your profile page. This feature allows you to easily access and continue working on drafts of posts that you have started or viewed ...
