Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Azeez
by New Contributor II
  • 4471 Views
  • 8 replies
  • 1 kudos

Resolved! BAD_REQUEST:Failed to get oauth access token.Please try logout and login again

We deployed a test Databricks workspace cluster on GCP. A single cluster was spun up. Later we deleted the workspace. Now, when we try to create a new one, it gives this error: "BAD_REQUEST: Failed to get oauth access token. Please try logout ...

Latest Reply
Prabakar
Esteemed Contributor III
  • 1 kudos

@Azeez Sayyad​ you can try this workaround. Remove the Databricks App from your Google account. In Google account settings, go to "Manage third-party access", and remove Databricks from both Third-party apps with account access and Sign-in with Googl...

7 More Replies
isaac_gritz
by Valued Contributor II
  • 796 Views
  • 0 replies
  • 3 kudos

Optimize Azure VM / AWS EC2 / GKE Cloud Infrastructure Costs

Tips on Reducing Cloud Compute Infrastructure Costs for Azure VM, AWS EC2, and GCP GKE on Databricks. Databricks takes advantage of the latest Azure VM / AWS EC2 / GKE VM/instance types to ensure you get the best price performance for your workloads on...

Ryan512
by New Contributor III
  • 1415 Views
  • 3 replies
  • 2 kudos

Autoloader (GCP) Custom PubSub Queue

I want to know if what I describe below is possible with Auto Loader on the Google Cloud Platform. Problem description: We have GCS buckets for every client/account. Inside these buckets is a path/blob for each client's instances of our platform. A clie...

Latest Reply
Noopur_Nigam
Valued Contributor II
  • 2 kudos

Hello @Ryan Ebanks​, please let us know if more help is needed on this.

2 More Replies
Tahseen0354
by Valued Contributor
  • 1961 Views
  • 5 replies
  • 2 kudos

Why does setting up audit log delivery in Databricks on GCP fail?

I am trying to set up audit log delivery in Google Cloud. I have followed this page https://docs.gcp.databricks.com/administration-guide/account-settings-gcp/log-delivery.html and have added log-delivery@databricks-prod-master.iam.gserviceaccount.co...

Latest Reply
Kaniz_Fatma
Community Manager
  • 2 kudos

Hi @Md Tahseen Anam​, we haven't heard from you since the last response from @Prabakar, and I was checking back to see if his suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to others. A...

4 More Replies
ishantjain194
by New Contributor II
  • 1566 Views
  • 4 replies
  • 4 kudos

AWS OR AZURE OR GCLOUD??

I want to know which cloud is better to learn and which cloud's services offer more career opportunities.

Latest Reply
Kaniz_Fatma
Community Manager
  • 4 kudos

Hi @ishant jain​, we haven't heard from you since the last response from me and @Cedric Law Hing Ping​​, and I was checking back to see if our solutions helped you. If you have found a solution, please share it with the community, as it can be hel...

3 More Replies
Tahseen0354
by Valued Contributor
  • 1332 Views
  • 3 replies
  • 1 kudos

Resolved! Configure CLI on databricks on GCP

Hi, I have a service account in my GCP project, and the service account is added as a user in my Databricks GCP account. Is it possible to configure the CLI on Databricks on GCP using that service account? Something similar to: databricks configure ---tok...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Md Tahseen Anam​, note: the CLI feature is unavailable on Databricks on Google Cloud as of this release. This article explains the configuration options available when you create and edit Databricks clusters. It focuses on creating and editing c...

2 More Replies
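For readers landing on this thread later: on clouds where the Databricks CLI is supported, configuration is typically stored in a `~/.databrickscfg` profile. A minimal sketch, assuming a workspace URL and a personal access token (both values below are placeholders, not real credentials):

```ini
# ~/.databrickscfg — illustrative only; host and token are placeholders
[DEFAULT]
host = https://<your-workspace-url>
token = <personal-access-token>
```

Whether a GCP service account can be used in place of a personal access token depends on the authentication options available in your Databricks on GCP release; check the current docs rather than relying on this sketch.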
MoJaMa
by Valued Contributor II
  • 932 Views
  • 1 replies
  • 0 kudos
Latest Reply
MoJaMa
Valued Contributor II
  • 0 kudos

Each local disk is 375 GB. So, for example, n2-standard-4 has 2 local disks (0.75 TB total). https://databricks.com/wp-content/uploads/2021/05/GCP-Pricing-Estimator-v2.pdf

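The arithmetic in the reply above can be sketched as follows. The 375 GB per-disk figure and the 2-disk count for n2-standard-4 come from the reply; any other instance shapes would be assumptions to verify against the linked pricing estimator.

```python
# Sketch of the local SSD capacity calculation from the reply above.
LOCAL_DISK_GB = 375  # each GCP local SSD is 375 GB, per the reply

def total_local_ssd_gb(num_disks: int) -> int:
    """Total local SSD capacity in GB for a given number of attached disks."""
    return num_disks * LOCAL_DISK_GB

# n2-standard-4 with 2 local disks -> 750 GB, i.e. 0.75 TB, matching the reply.
print(total_local_ssd_gb(2))  # 750
```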