Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

rotate_employee
by New Contributor
  • 721 Views
  • 1 reply
  • 0 kudos

Connecting to databricks ipython kernel from VSCode

I'd like to run python notebooks (.ipynb) from VSCode connecting to the ipython kernel from databricks. I have already connected to an execution cluster from VSCode and am able to run python scripts (.py files) and see the output on my local console....

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @rotate_employee, I recommend checking the Databricks or VS Code documentation, or filing a support ticket for further assistance.

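
As a rough illustration of the reply above (a minimal sketch, not from the thread): rather than attaching VS Code to a remote ipython kernel directly, one common route is Databricks Connect (available for DBR 13+), which lets local notebook cells execute against a remote cluster. This assumes databricks-connect is installed and authentication is already configured (for example by the Databricks VS Code extension or a default CLI profile); the table name is only an example.

    # Minimal sketch: run a local .ipynb cell against a remote Databricks cluster
    # via Databricks Connect. Assumes databricks-connect (13.x+) is installed and
    # a default auth profile / environment variables are already configured.
    from databricks.connect import DatabricksSession

    spark = DatabricksSession.builder.getOrCreate()   # picks up the configured profile

    df = spark.read.table("samples.nyctaxi.trips")    # placeholder table; use your own
    df.limit(5).show()                                # executes remotely, prints locally
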
Maria_fed
by New Contributor III
  • 1808 Views
  • 6 replies
  • 0 kudos

Need help migrating company customer and partner academy accounts to work properly

Hi, originally I accidentally made a customer academy account with my company, which is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the partner academy account. That account ...

Latest Reply
APadmanabhan
Moderator
  • 0 kudos

Hi @Maria_fed, thanks again. I have assigned your case to my colleague and you should be hearing from them soon. Regards, Akshay

5 More Replies
dannythermadom
by New Contributor III
  • 739 Views
  • 2 replies
  • 1 kudos

Getting FileNotFoundException while using cloudFiles

Hi, following is the code I am using to ingest the data incrementally (weekly): val ssdf = spark.readStream.schema(schema).format("cloudFiles").option("cloudFiles.format", "parquet").load(sourceUrl).filter(criteriaFilter) val transformedDf = ssdf.tran...

Latest Reply
BilalAslamDbrx
Honored Contributor III
  • 1 kudos

Danny, is another process mutating or deleting the incoming files?

1 More Replies
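
For readers skimming this thread, here is a hedged PySpark rendering of the Auto Loader pattern in the excerpt (the original is Scala; the schema, path, and names below are placeholders, not the poster's values):

    # Minimal Auto Loader (cloudFiles) incremental read, PySpark version.
    # `spark` is the ambient SparkSession in a Databricks notebook.
    from pyspark.sql.types import StructType, StructField, StringType

    schema = StructType([StructField("id", StringType(), True)])   # placeholder schema
    source_url = "s3://example-bucket/incoming/"                   # placeholder path

    ssdf = (
        spark.readStream
             .format("cloudFiles")
             .option("cloudFiles.format", "parquet")
             .schema(schema)
             .load(source_url)
    )
    # A FileNotFoundException mid-stream is commonly caused by source files being
    # deleted or rewritten after they were listed, which is what the reply asks about.
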
Phani1
by Valued Contributor
  • 580 Views
  • 1 reply
  • 0 kudos

RBAC, Security & Privacy controls

Could you please share best practices for implementing RBAC, security, and privacy controls in Databricks?

Get Started Discussions
Privacy controls
RBAC
Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, could you please check https://docs.databricks.com/en/lakehouse-architecture/security-compliance-and-privacy/best-practices.html and see if this helps? Also, please tag @Debayan in your next comment, which will notify me. Thanks!

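
As a small, hedged illustration of the RBAC building blocks the linked best-practices page covers (catalog, schema, table, and group names below are placeholders, not a recommendation specific to this thread):

    # Unity Catalog privileges granted to an account group, issued from a notebook.
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA main.default TO `data_analysts`")
    spark.sql("GRANT SELECT ON TABLE main.default.customers TO `data_analysts`")
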
seefoods
by New Contributor III
  • 477 Views
  • 1 reply
  • 0 kudos

ganglia metrics

Hello everyone, I have built this script to collect Ganglia metrics, but the size of the Ganglia stderr and stdout is 0; it doesn't work. I have put this script in the Workspace because, with the Databricks migration, all init scripts should be placed in the Workspace...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, is there any error you are getting? Also, please tag @Debayan in your next comment, which will notify me. Thanks!

MetaMaestro
by New Contributor III
  • 547 Views
  • 2 replies
  • 0 kudos

GCP hosted Databricks - DBFS temp files - Not Found

I've been working on obtaining DDL at the schema level in the Hive metastore within GCP-hosted Databricks. I've implemented Python code that generates SQL files in the dbfs/temp directory. However, when running the code, I'm encountering a "file path n...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, the code snippet along with the whole error message may help to determine the issue; also, considering the above points may work as a fix.

1 More Replies
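
One frequent cause of a "file path not found" error in this situation, offered as a hedged aside rather than a confirmed diagnosis for this thread: a relative path such as dbfs/temp resolves on the driver's local disk, not on DBFS. Two ways to target DBFS explicitly (the paths and DDL string are placeholders):

    # Writing a generated SQL/DDL file to DBFS from a Databricks notebook.
    ddl_text = "CREATE TABLE example (id INT);"      # stand-in for the generated DDL

    # Option 1: dbutils with an explicit dbfs:/ URI (dbutils is ambient in notebooks)
    dbutils.fs.put("dbfs:/tmp/ddl/example.sql", ddl_text, overwrite=True)

    # Option 2: the local FUSE mount, usable with ordinary Python file I/O
    with open("/dbfs/tmp/ddl/example.sql", "w") as f:
        f.write(ddl_text)
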
Rajan_651
by New Contributor
  • 790 Views
  • 2 replies
  • 0 kudos

Resolved! Unable to find permission button in Sql Warehouse for providing Access

Hi everyone, I am unable to see the permission button in SQL Warehouse to provide access to other users. I have admin rights and Databricks is a Premium subscription.

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, could you please provide a screenshot of the SQL warehouse? Also, you can go through https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html. Also, please tag @Debayan in your next comment, which will notify me. Th...

1 More Replies
Leszek
by Contributor
  • 384 Views
  • 1 reply
  • 1 kudos

SQL Serverless - cost view

Hi, does anyone know how I can monitor the cost of SQL Serverless? I'm using Databricks on Azure and I'm not sure where to find the cost generated by compute resources hosted on Databricks.

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, you can calculate the pricing at https://www.databricks.com/product/pricing/databricks-sql and also at https://azure.microsoft.com/en-in/pricing/details/databricks/#:~:text=Sign%20in%20to%20the%20Azure,asked%20questions%20about%20Azure%20pricing. For A...

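
In addition to the pricing pages in the reply, one possible way to see actual serverless SQL consumption is the billing system table, assuming system tables are enabled for the account (a sketch only; column availability may vary):

    # Aggregate serverless SQL usage (in DBUs) by day from the billing system table.
    usage_df = spark.sql("""
        SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE lower(sku_name) LIKE '%serverless%'
        GROUP BY usage_date, sku_name
        ORDER BY usage_date DESC
    """)
    usage_df.show(truncate=False)
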
Cryptocurentcyc
by New Contributor
  • 396 Views
  • 1 reply
  • 0 kudos

ListBucket

{  "Version": "2012-10-17",  "Statement": [    {      "Effect": "Allow",      "Action": [        "s3:ListBucket"      ],     "Resource": [        "arn:aws:s3:::<s3-bucket-name>"      ]    },    {      "Effect": "Allow",      "Action": [        "s3:Pu...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Cryptocurentcyc, The version in the given JSON is "Privilege Model version 1.0". The statement in the JSON is about upgrading to Privilege Model version 1.0 to take advantage of privilege inheritance and new features. It also highlights the diffe...

Bagger
by New Contributor II
  • 1561 Views
  • 2 replies
  • 0 kudos

Monitoring job metrics

Hi, we need to monitor Databricks jobs and we have made a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...

Get Started Discussions
jobs
metrics
prometheus
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Bagger, To monitor the metrics you specified, you can use a combination of Databricks features and Prometheus: 1. Failed Jobs: You can monitor failed jobs using Databricks’ built-in job monitoring capabilities. The status of each job run, inc...

1 More Replies
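
To make the "failed jobs" part of the reply concrete, here is a hedged sketch using the Jobs API 2.1 runs/list endpoint; the host and token are placeholders, and this is one option alongside the Prometheus setup described in the post, not the only way:

    # Count failed job runs via the Jobs API 2.1.
    import requests

    host = "https://<workspace-url>"                 # placeholder
    token = "<personal-access-token>"                # placeholder

    resp = requests.get(
        f"{host}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {token}"},
        params={"completed_only": "true", "limit": 25},
    )
    resp.raise_for_status()

    failed = [
        r for r in resp.json().get("runs", [])
        if r.get("state", {}).get("result_state") == "FAILED"
    ]
    print(f"{len(failed)} failed runs in the most recent page")
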
ff-paulo-barbos
by New Contributor
  • 1112 Views
  • 1 reply
  • 0 kudos

Spark Remote error when connecting to cluster

Hi, I am using the latest version of PySpark and I am trying to connect to a remote cluster with runtime 13.3. My doubts are: do I need Databricks Unity Catalog enabled? My cluster is already in a Shared policy in Access Mode, so what other configur...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, is your workspace already Unity Catalog enabled? Also, did you go through the considerations for enabling a workspace for Unity Catalog? https://docs.databricks.com/en/data-governance/unity-catalog/enable-workspaces.html#considerations-before-yo...

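
For reference, a hedged sketch of connecting a local PySpark session to a DBR 13.3 cluster with Databricks Connect, which is the usual path for this scenario; the workspace URL, token, and cluster ID are placeholders, and, as the reply suggests, the workspace does need Unity Catalog enabled with the cluster in a UC-capable access mode such as Shared:

    from databricks.connect import DatabricksSession

    spark = (
        DatabricksSession.builder
        .remote(
            host="https://<workspace-url>",          # placeholder
            token="<personal-access-token>",         # placeholder
            cluster_id="<cluster-id>",               # placeholder
        )
        .getOrCreate()
    )

    print(spark.range(5).count())                    # runs on the remote cluster
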
mano7438
by New Contributor III
  • 4127 Views
  • 6 replies
  • 1 kudos

Resolved! Unable to create table with primary key

Hi Team, I am getting the below error while creating a table with a primary key: "Table constraints are only supported in Unity Catalog." Table script: CREATE TABLE persons(first_name STRING NOT NULL, last_name STRING NOT NULL, nickname STRING, CONSTRAINT persons_...

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, this needs further investigation, could you please raise a support case with Databricks? 

5 More Replies
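
For anyone hitting the same error: primary key constraints are only supported on Unity Catalog tables, so one hedged sketch of a fix is to create the table in a UC catalog and schema rather than in hive_metastore, with the key columns declared NOT NULL. The catalog/schema names and the constraint definition below are assumptions completing the truncated script, not the poster's exact code:

    # Hypothetical completion of the truncated script, targeting a Unity Catalog schema.
    spark.sql("""
        CREATE TABLE main.default.persons (
            first_name STRING NOT NULL,
            last_name  STRING NOT NULL,
            nickname   STRING,
            CONSTRAINT persons_pk PRIMARY KEY (first_name, last_name)
        )
    """)
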
inesandres567
by New Contributor
  • 817 Views
  • 1 reply
  • 0 kudos

Problem starting cluster

I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, This error "GCE out of resources" typically means that Google compute engine is out of resources as in out of nodes (can be a quota issue or can be node issues in that particular region in GCP). Could you please raise a google support case on thi...

inesandres567
by New Contributor
  • 558 Views
  • 1 reply
  • 0 kudos

Fail start cluster

I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, This error "GCE out of resources" typically means that Google compute engine is out of resources as in out of nodes (can be a quota issue or can be node issues in that particular region in GCP). Could you please raise a google support case on thi...

diego_poggioli
by Contributor
  • 1322 Views
  • 1 reply
  • 1 kudos

Resolved! Run tasks conditionally "Always" condition missing?

Does the new 'Run if' feature, which allows you to run tasks conditionally, lack an 'Always' option to execute the task both when the dependencies succeed and when they fail?

Latest Reply
Lakshay
Esteemed Contributor
  • 1 kudos

You can choose the All Done option to run the task in both scenarios.

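
For completeness, a hedged sketch of how the equivalent choice looks in a Jobs API 2.1 task definition, where the run_if field ("ALL_DONE" here) corresponds to the All Done option mentioned in the reply; the task keys and notebook path are placeholders:

    # Fragment of a Jobs API 2.1 task definition with conditional execution.
    cleanup_task = {
        "task_key": "cleanup",
        "depends_on": [{"task_key": "ingest"}, {"task_key": "transform"}],
        "run_if": "ALL_DONE",    # runs whether the dependencies succeed or fail
        "notebook_task": {"notebook_path": "/Workspace/jobs/cleanup"},
    }
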