Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

ChriZhan_93142
by New Contributor
  • 1691 Views
  • 1 reply
  • 0 kudos

How to upgrade pip associated with the default Python

We have a job scheduled and submitted via Airflow to Databricks using the API api/2.0/jobs/runs/submit. Each time the job runs, an ephemeral cluster is launched, and during the process a virtual env named /local_disk0/.ephemeral_nfs/cluster_librarie...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, I found an interesting article on the same topic. You can follow it and let us know if it helps. Please tag @Debayan in your next comment, which will notify me!

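For reference on the thread above, here is a minimal sketch of one way to upgrade pip inside the Python environment a job task runs under, assuming it is acceptable to do the upgrade from the job's own code; the use of subprocess here is illustrative, not the approach from the linked article.

```python
# Sketch: upgrade pip for the interpreter the job task is running under.
# Assumes the task is Python code executing inside the ephemeral cluster's
# virtual environment, so sys.executable points at that environment's Python.
import subprocess
import sys

# "-m pip" upgrades pip for the active environment rather than a system-wide one.
subprocess.check_call([sys.executable, "-m", "pip", "install", "--upgrade", "pip"])

print("pip upgraded for:", sys.executable)
```
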
JohnJustus
by New Contributor III
  • 1687 Views
  • 1 reply
  • 0 kudos

Pyspark API reference

All, I am using Azure Databricks, and at times I refer to the PySpark APIs to interact with data in Azure Data Lake using Python and SQL, here: https://spark.apache.org/docs/3.5.0/api/python/reference/pyspark.sql/index.html Does the Databricks website have the list o...

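As a small illustration of the PySpark SQL APIs the post above refers to, here is a sketch of reading data from Azure Data Lake Storage in a Databricks notebook; the abfss:// path is a placeholder, and `spark` is the session Databricks pre-creates in notebooks.

```python
# Sketch: read data from Azure Data Lake Storage Gen2 with the PySpark SQL API.
# "spark" is pre-created in Databricks notebooks; the abfss:// path is a placeholder.
path = "abfss://<container>@<storage_account>.dfs.core.windows.net/path/to/delta_table"

df = spark.read.format("delta").load(path)  # DataFrameReader, as in the pyspark.sql reference
df.printSchema()
df.show(10)

# The same DataFrame can also be queried with SQL from Python:
df.createOrReplaceTempView("my_view")
spark.sql("SELECT COUNT(*) AS row_count FROM my_view").show()
```
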
Data_Analytics1
by Contributor III
  • 1501 Views
  • 1 reply
  • 2 kudos

The base provider of Delta Sharing Catalog system does not exist.

I have enabled system tables in Databricks by following the procedure mentioned here. The owner of the system catalog is System user. I cannot see the schemas or tables of this catalog. It is showing me the error: The base provider of Delta Sharing C...

Latest Reply
Data_Analytics1
Contributor III
  • 2 kudos

I have already enabled all these schemas using the Databricks CLI command. After enabling, I was able to see all the tables and data inside these schemas. Then I disabled all the schemas using the CLI command mentioned here. Now, even after re-en...

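For context on the reply above, here is a sketch of the REST calls for enabling, disabling, and listing Unity Catalog system schemas on a metastore (the operations the CLI commands wrap); the workspace host, token, metastore ID, and schema name are placeholders.

```python
# Sketch: enable / disable a Unity Catalog system schema via the REST API.
# Placeholders: workspace URL, personal access token, metastore ID, schema name.
import requests

HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"
METASTORE_ID = "<metastore-id>"
SCHEMA = "access"  # e.g. access, billing, compute

headers = {"Authorization": f"Bearer {TOKEN}"}
url = f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas/{SCHEMA}"

# Enable the schema (PUT) ...
resp = requests.put(url, headers=headers)
resp.raise_for_status()

# ... or disable it again (DELETE):
# requests.delete(url, headers=headers).raise_for_status()

# List schema states to verify what is currently enabled:
listing = requests.get(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas",
    headers=headers,
)
print(listing.json())
```
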
deepthakkar007
by New Contributor
  • 868 Views
  • 1 reply
  • 1 kudos
Latest Reply
User16539034020
Databricks Employee
  • 1 kudos

Hello,  Thanks for contacting Databricks Support.  It appears you're employing a CloudFormation template to establish a Databricks workspace. The recommended method for creating workspaces is through the AWS Quick Start. Please refer to the documenta...

Upen_databricks
by New Contributor II
  • 1026 Views
  • 0 replies
  • 0 kudos

Databricks access to Microsoft SQL Server

Hi, I am facing the below error while accessing Microsoft SQL Server. Please suggest what permissions I need to check at the database level. I have the scope and secret created and the Key Vault set up as expected. I suspect a DB permission issue. Error: com.mi...

[Attached screenshot: Upen_databricks_0-1696539860979.png]
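Related to the thread above, here is a sketch of a JDBC read from SQL Server in a Databricks notebook, pulling the password from a secret scope backed by Key Vault. The scope, key, server, database, and table names are placeholders, and the underlying fix may be a SQL Server login or permission grant rather than code; `spark` and `dbutils` are pre-created in notebooks.

```python
# Sketch: read a SQL Server table over JDBC, fetching the password from a secret scope.
# Placeholders: scope "kv-scope", secret "sql-password", server, database, user, table.
jdbc_url = "jdbc:sqlserver://<server-name>.database.windows.net:1433;database=<database>"

password = dbutils.secrets.get(scope="kv-scope", key="sql-password")

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.<table>")
    .option("user", "<sql-user>")
    .option("password", password)
    .option("encrypt", "true")  # extra options are passed to the driver as connection properties
    .load()
)
df.show(5)
```
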
horatiug
by New Contributor III
  • 2857 Views
  • 3 replies
  • 2 kudos

Resolved! Changing GCP billing account

Hello, we need to change the billing account associated with our Databricks subscription. Is there any documentation available describing the procedure to be followed? Thanks, Horatiu

Latest Reply
Priyag1
Honored Contributor II
  • 2 kudos

Start by logging into the Google Cloud Platform. If you are a new user, you need to create an account before you subscribe to Databricks. Once in the console, start by selecting an existing Google Cloud project, or create a new project, and confirm ...

2 More Replies
horatiug
by New Contributor III
  • 1080 Views
  • 1 reply
  • 0 kudos

Infrastructure question

We've noticed that the GKE worker nodes, which are automatically created when a Databricks workspace is created inside a GCP project, are using the default Compute Engine SA, which is not the best security approach; even Google doesn't recommend using defaul...

nramya
by New Contributor
  • 1119 Views
  • 0 replies
  • 0 kudos

How do I add static tag values in the aws databricks-multi-workspace.template.yaml

Hello Team, I have a Databricks workspace running in an AWS environment. I have a requirement where the team wants to add a few customized tags. As per the docs, I see below the recommendation: TagValue: Description: All new AWS objects get a tag with t...

Rsa
by New Contributor II
  • 4517 Views
  • 4 replies
  • 2 kudos

CI/CD pipeline using Github

Hi Team, I've recently begun working with Databricks and I'm exploring options for setting up a CI/CD pipeline to pull the latest code from GitHub. I have to pull the latest code (.sql) from GitHub whenever a push is done to the main branch and update the .sql notebo...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

FWIW: we pull manually, but it is possible to automate that without any cost if you use Azure DevOps. There is a free tier (depending on the number of pipelines/duration).

3 More Replies
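As a sketch of one automation route for the thread above (separate from the Azure DevOps approach mentioned in the reply), a CI job triggered on pushes to main could call the Databricks Repos API to pull the latest commit into a workspace repo so the .sql notebooks stay in sync; the workspace host, token, and repo ID below are placeholders.

```python
# Sketch: update a Databricks Repo to the head of "main" from a CI job,
# so the .sql notebooks in the workspace match the latest push to GitHub.
# Placeholders: workspace URL, token (e.g. injected as a CI secret), repo ID.
import requests

HOST = "https://<workspace-host>"
TOKEN = "<databricks-token>"
REPO_ID = "<repo-id>"  # numeric ID returned by GET /api/2.0/repos

resp = requests.patch(
    f"{HOST}/api/2.0/repos/{REPO_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"branch": "main"},  # checks out and pulls the latest main
)
resp.raise_for_status()
print(resp.json())
```
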

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group