Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Leszek
by Contributor
  • 1173 Views
  • 1 reply
  • 1 kudos

SQL Serverless - cost view

Hi, does anyone know how I can monitor the cost of SQL Serverless? I'm using Databricks on Azure and I'm not sure where to find the cost generated by compute resources hosted on Databricks.

Latest Reply
Debayan
Databricks Employee

Hi, you can calculate the pricing at https://www.databricks.com/product/pricing/databricks-sql and also at https://azure.microsoft.com/en-in/pricing/details/databricks/#:~:text=Sign%20in%20to%20the%20Azure,asked%20questions%20about%20Azure%20pricing. For A...
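Beyond the price calculators, actual DBU consumption can also be queried from Databricks system tables. A minimal sketch, assuming Unity Catalog system tables are enabled in the workspace; the `sku_name` filter is an assumption about how serverless SQL SKUs are named, so verify it against the values in your account:

```python
# Minimal sketch, assuming system.billing.usage is available.
# The LIKE pattern on sku_name is an assumption; check your actual SKU names.
usage = spark.sql("""
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name LIKE '%SERVERLESS_SQL%'
    GROUP BY usage_date, sku_name
    ORDER BY usage_date
""")
display(usage)
```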

Shubhanshu
by New Contributor II
  • 3133 Views
  • 2 replies
  • 0 kudos

Error while creating external table in unity catalog

I am trying to create an external table using a CSV file stored in ADLS Gen2. My account owner has created a storage credential and an external location. I am a Databricks user with all privileges on the external location. When trying to create a tabl...

Get Started Discussions
ADLS gen2
SQL warehouse
Unity Catalog
Latest Reply
Debayan
Databricks Employee

Hi @Shubhanshu, could you please try https://learn.microsoft.com/en-us/answers/questions/1314651/error-invalid-configuration-value-detected-for-fs and see if this works?
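If the external location and privileges are in place, a statement along these lines should work; this is a hypothetical sketch, and the catalog, schema, table name, and abfss path below are all placeholders:

```python
# Hypothetical sketch: an external table over CSV in ADLS Gen2, assuming a
# Unity Catalog external location already covers this abfss path and the user
# holds CREATE EXTERNAL TABLE on it. All names below are placeholders.
spark.sql("""
    CREATE TABLE main.default.my_csv_table
    USING CSV
    OPTIONS (header = 'true', inferSchema = 'true')
    LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/<path>/'
""")
```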

1 More Reply
rustem17
by New Contributor II
  • 2132 Views
  • 1 reply
  • 0 kudos

Issue with Databricks notebooks and the 'requests' Python library. Inconsistent output.

I have a strange issue with Databricks notebooks and Google Colab notebooks, where I cannot get results from the requests library that are consistent with what I get on my local computer. Dir Surveys (wyo.gov) ("http://pipeline.wyo.gov/r_Direction...

Latest Reply
rustem17
New Contributor II

Additional information: when I download the HTML file from the above-mentioned website, I still do not get the same HTML page as I do from the same code run on my computer or any other computer. The size of the downloaded HTML file is the same as I get ...
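A common cause of this behaviour is the server varying its response by request headers or redirects, so comparing environments with pinned headers can help. A hypothetical sketch; the URL is a placeholder, since the original link is truncated above:

```python
import requests

# Hypothetical sketch: pin a browser-like User-Agent and inspect redirects,
# since servers often return different HTML per client. URL is a placeholder.
headers = {"User-Agent": "Mozilla/5.0"}
resp = requests.get("http://pipeline.wyo.gov/<page>", headers=headers, timeout=30)
print(resp.status_code, len(resp.text))
print([r.url for r in resp.history])  # any redirects that were followed
```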

VMeghraj
by New Contributor II
  • 2356 Views
  • 1 reply
  • 0 kudos

REST API

Creating an application to capture cluster metrics and sending an HTTP REST request to the Spark History Server's API endpoint to retrieve a list of applications. This request doesn't generate logs in the Spark History Server's log files. The Spark Hist...
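For reference, the Spark History Server exposes its application list through the standard monitoring REST API. A minimal sketch, assuming the server is reachable on the default port 18080; the host name is a placeholder:

```python
import requests

# Minimal sketch: list applications from a Spark History Server via the
# standard /api/v1/applications endpoint. Host name is a placeholder.
resp = requests.get("http://<history-server-host>:18080/api/v1/applications",
                    timeout=30)
for app in resp.json():
    print(app["id"], app["name"])
```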

Bagger
by New Contributor II
  • 3172 Views
  • 1 reply
  • 0 kudos

Monitoring job metrics

Hi, we need to monitor Databricks jobs and we have a setup where we are able to get the Prometheus metrics; however, we lack an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...

Get Started Discussions
jobs
metrics
prometheus
Latest Reply
Bagger
New Contributor II

I have reposted this in "Administration and Architecture": Monitoring job metrics - Databricks - 42956
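As a stopgap while the Prometheus metric names are being mapped, failed jobs can also be detected from the Jobs API. A hypothetical sketch; the host and token are placeholders, and this is an alternative route, not the Prometheus mapping the post asks about:

```python
import requests

# Hypothetical sketch: count recently failed job runs via the Jobs API 2.1.
# Host and token are placeholders; this complements, not replaces, Prometheus.
host = "https://<workspace-url>"
token = "<personal-access-token>"
resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"completed_only": "true"},
    timeout=30,
)
failed = [r for r in resp.json().get("runs", [])
          if r.get("state", {}).get("result_state") == "FAILED"]
print(f"{len(failed)} failed runs in the most recent page")
```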

Cryptocurentcyc
by New Contributor
  • 931 Views
  • 0 replies
  • 0 kudos

ListBucket

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<s3-bucket-name>"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:Pu...

kll
by New Contributor III
  • 3765 Views
  • 1 reply
  • 0 kudos

pass a tuple as parameter to sql query

at_lst = ['131','132','133']
at_tup = (*at_lst,)
print(at_tup)  # ('131','132','133')

In my SQL query, I am trying to pass this as a parameter; however, it doesn't work.

%sql
select * from ma...

Latest Reply
kll
New Contributor III

@Retired_mod I am writing SQL using the magic command in the cell block, `%%sql`. Is there a way to pass a parameter in the query without using the `execute` method of the cursor object? Can you please share an example?
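For a pure-notebook route without a cursor, one option is to expand the list into the IN clause before calling `spark.sql`. A minimal sketch; the table and column names are placeholders, and this is plain string interpolation, so only use it with trusted values:

```python
# Minimal sketch: expand a Python list into a SQL IN clause. Table and
# column names are placeholders; interpolation is not injection-safe for
# arbitrary input, so only use values you trust.
at_lst = ['131', '132', '133']
in_clause = ", ".join(f"'{v}'" for v in at_lst)   # "'131', '132', '133'"
df = spark.sql(f"SELECT * FROM my_table WHERE my_col IN ({in_clause})")
display(df)
```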

esi
by New Contributor
  • 3152 Views
  • 0 replies
  • 0 kudos

Ingesting PowerBI Tables to databricks

Hi Community, I am looking for a way to access Power BI tables from Databricks and import them as a Spark DataFrame into my Databricks notebook. As far as I have seen, there is a Power BI connector to load data from Databricks into Power BI, but not...
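One possible route is the Power BI REST API's executeQueries endpoint, which runs a DAX query against a dataset. A hypothetical sketch, assuming an Azure AD access token with dataset read permission; the token, dataset ID, and table name are all placeholders:

```python
import requests

# Hypothetical sketch: pull rows from a Power BI dataset with the
# executeQueries REST endpoint, then load them into a Spark DataFrame.
# Token, dataset ID, and table name are placeholders.
dataset_id = "<dataset-id>"
access_token = "<azure-ad-token>"
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries",
    headers={"Authorization": f"Bearer {access_token}"},
    json={"queries": [{"query": "EVALUATE '<TableName>'"}]},
    timeout=60,
)
rows = resp.json()["results"][0]["tables"][0]["rows"]
df = spark.createDataFrame(rows)
```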

lin
by New Contributor
  • 1156 Views
  • 0 replies
  • 0 kudos

Facing UNKNOWN_FIELD_EXCEPTION.NEW_FIELDS_IN_FILE

[UNKNOWN_FIELD_EXCEPTION.NEW_FIELDS_IN_FILE] Encountered unknown fields during parsing: [<field_name>], which can be fixed by an automatic retry: true. I am using Azure Databricks and writing Python code. I want to catch the error and raise it. Tried wi...
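One pattern that may work is catching the streaming exception and matching on the error-class string in its message. A hypothetical sketch, assuming an Auto Loader stream; the paths are placeholders, and on older runtimes the exception class lives in `pyspark.sql.utils` rather than `pyspark.errors`:

```python
# Hypothetical sketch: catch the schema-evolution error from a stream and
# re-raise it. Paths are placeholders; import location varies by version.
from pyspark.errors import StreamingQueryException

try:
    query = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/path/to/schema")
        .load("/path/to/source")
        .writeStream
        .option("checkpointLocation", "/path/to/checkpoint")
        .start("/path/to/target")
    )
    query.awaitTermination()
except StreamingQueryException as e:
    if "UNKNOWN_FIELD_EXCEPTION" in str(e):
        # New fields were found: log or handle the schema change here,
        # then surface the error as the post intends.
        raise
    raise
```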

mano7438
by New Contributor III
  • 8857 Views
  • 6 replies
  • 1 kudos

Resolved! Unable to create table with primary key

Hi Team, I'm getting the below error while creating a table with a primary key: "Table constraints are only supported in Unity Catalog." Table script: CREATE TABLE persons(first_name STRING NOT NULL, last_name STRING NOT NULL, nickname STRING, CONSTRAINT persons_...

Latest Reply
Debayan
Databricks Employee

Hi, this needs further investigation; could you please raise a support case with Databricks?
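Since the error says constraints are only supported in Unity Catalog, the usual fix is to create the table under a UC catalog and schema. A minimal sketch; the catalog and schema names below are placeholders:

```python
# Minimal sketch: primary keys require Unity Catalog, so qualify the table
# with a UC catalog and schema (names here are placeholders).
spark.sql("""
    CREATE TABLE main.default.persons (
        first_name STRING NOT NULL,
        last_name  STRING NOT NULL,
        nickname   STRING,
        CONSTRAINT persons_pk PRIMARY KEY (first_name, last_name)
    )
""")
```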

5 More Replies
