Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by udays22222 (New Contributor II)
  • 4317 Views
  • 6 replies
  • 1 kudos

Error writing data to Google BigQuery

Hi, I am able to read data from a BigQuery table, but I am getting an error writing data to a table in BigQuery. I followed the instructions in this document: Connecting Databricks to BigQuery | Google Cloud.

%scala
import scala.io.Source
val contentCred = "/dbfs/FileSt...

Latest Reply
GeoPer (New Contributor III)

@udays22222 did you find any solution for this one? I face the same problem when I use a cluster with Shared access mode: I can read, but I cannot write, with the error you mentioned.

  • 1 kudos
5 More Replies
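For anyone comparing against their own setup, here is a minimal PySpark sketch of the read and write paths through the BigQuery connector. All project, dataset, bucket, and key-file names are placeholders; the practical difference is that writes also stage data in GCS, so they need a temporary bucket and write permissions that reads do not.

```python
# Placeholder key path; /dbfs makes the file visible to driver and executors.
key_file = "/dbfs/FileStore/keys/bq-service-account.json"

# Reading needs only read permissions on the dataset.
df = (spark.read.format("bigquery")
      .option("credentialsFile", key_file)
      .option("parentProject", "my-gcp-project")   # placeholder project
      .option("table", "my_dataset.my_table")      # placeholder table
      .load())

# Writing stages data through GCS first, so the service account also
# needs a writable temporary bucket and write access on the dataset.
(df.write.format("bigquery")
 .option("credentialsFile", key_file)
 .option("parentProject", "my-gcp-project")
 .option("temporaryGcsBucket", "my-temp-bucket")   # placeholder bucket
 .option("table", "my_dataset.my_table_copy")
 .mode("append")
 .save())
```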
by Fernando_Messas (New Contributor II)
  • 9781 Views
  • 6 replies
  • 3 kudos

Resolved! Error writing data to Google BigQuery

Hello, I'm facing some problems while writing data to Google BigQuery. I'm able to read data from the same table, but when I try to append data I get the following error: Error getting access token from metadata server at: http://169.254.169.254/compu...

Latest Reply
asif5494 (New Contributor III)

Sometimes this error occurs when your private key or service-account key is not being sent in the request header. If you are using Spark or Databricks, configure the JSON key in the Spark config so it is added to the request header; a minimal sketch follows this thread.

  • 3 kudos
5 More Replies
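To make that concrete, a sketch of one way to hand the key to the connector, assuming a service-account JSON file uploaded to DBFS (path and names are placeholders). Supplying credentials explicitly stops the connector from falling back to the GCE metadata server at 169.254.169.254, which is where the "Error getting access token" message comes from.

```python
import base64

# Placeholder path; a Databricks secret scope is a safer home for the key.
with open("/dbfs/FileStore/keys/bq-service-account.json", "rb") as f:
    creds_b64 = base64.b64encode(f.read()).decode("utf-8")

# Passed per write here; the same base64 string can instead be set once
# in the cluster's Spark config under the key "credentials".
(df.write.format("bigquery")
 .option("credentials", creds_b64)
 .option("temporaryGcsBucket", "my-temp-bucket")   # placeholder bucket
 .option("table", "my_dataset.my_table")           # placeholder table
 .mode("append")
 .save())
```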
by boyelana (Contributor III)
  • 2617 Views
  • 3 replies
  • 7 kudos

Resolved! How to start with Databricks in Google Cloud?

I am exploring Google Cloud Platform and am looking to get started with Databricks on GCP. Happy if anyone can point me toward guidance on how to get started. Thanks!

Latest Reply
martinez (New Contributor III)

Hey boyelana! Databricks on Google Cloud Platform is definitely an interesting and powerful combination, and I'm thrilled to see that you're looking to get started with it. To begin your journey with Databricks on GCP, there are a few steps y...

  • 7 kudos
2 More Replies
by f2008700 (New Contributor III)
  • 15248 Views
  • 6 replies
  • 7 kudos

Configuring average parquet file size

I have S3 as a data source containing a sample TPC dataset (10G, 100G). I want to convert it into Parquet files with an average size of about 256 MiB. What configuration parameter can I use to set that? I also need the data to be partitioned. And withi...

Latest Reply
Anonymous (Not applicable)

Hi @Vikas Goel, we haven't heard from you since the last response from @Werner Stinckens, and I was checking back to see if those suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to o...

  • 7 kudos
5 More Replies
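Plain Parquet writes have no direct target-file-size setting; a common workaround is to cap records per file from a measured average row size. A rough sketch with placeholder paths and partition column, assuming roughly uniform rows of about 1 KiB (measure your own sample; compression will shrink files further):

```python
df = spark.read.parquet("s3://my-bucket/tpc-input/")   # placeholder path

target_bytes = 256 * 1024 * 1024   # ~256 MiB per output file
avg_row_bytes = 1024               # assumption: ~1 KiB/row, measured on a sample

(df.write
 .option("maxRecordsPerFile", target_bytes // avg_row_bytes)
 .partitionBy("order_date")        # placeholder partition column
 .mode("overwrite")
 .parquet("s3://my-bucket/tpc-output/"))
```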
by Ryan512 (New Contributor III)
  • 6003 Views
  • 2 replies
  • 5 kudos

Resolved! Does the `pathGlobFilter` option work on the entire file path or just the file name?

I'm working in the Google Cloud environment. I have an Auto Loader job that uses cloud file notifications to load data into a Delta table. I want to filter the files from the Pub/Sub topic based on the path in GCS where the files are located, not...

Latest Reply
Ryan512 (New Contributor III)

Thank you for confirming what I observed that differed from the documentation.

  • 5 kudos
1 More Reply
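As generally documented for Spark file sources, pathGlobFilter matches the file name only, not the directory portion of the path; directory filtering goes in the load path's own glob. A sketch with a placeholder bucket layout:

```python
# Sketch of an Auto Loader stream; bucket layout and format are placeholders.
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.useNotifications", "true")
      .option("pathGlobFilter", "*.json")             # applies to file names only
      .load("gs://my-bucket/landing/*/sensor-a/"))    # directory filtering via the path glob
```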
by david_bernstein (New Contributor III)
  • 3228 Views
  • 6 replies
  • 0 kudos

DLT autoloader credentials not available error in Azure

I'm just getting started with Databricks and DLTs. I've followed all the docs and tutorials I can find on this and believe I have set up everything in Azure correctly: service principal, paths, and spark configs. When I run a simple DLT autoloader pi...

Latest Reply
david_bernstein (New Contributor III)

Thank you, I will look into this.

  • 0 kudos
5 More Replies
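One frequent cause of credential errors in DLT is that the service-principal settings were applied in the notebook session instead of the pipeline's configuration. A sketch of the standard ADLS Gen2 OAuth keys for a service principal; every angle-bracketed value is a placeholder, and in a DLT pipeline these key/value pairs normally belong in the pipeline settings rather than in notebook code:

```python
storage = "<storage-account>"   # placeholder storage account name

# Standard ABFS OAuth configuration for a service principal.
spark.conf.set(f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net",
               "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net",
               "<application-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage}.dfs.core.windows.net",
               dbutils.secrets.get(scope="<scope>", key="<service-credential-key>"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")
```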
by mkd_140848 (New Contributor II)
  • 3188 Views
  • 5 replies
  • 0 kudos

Open notebook in new tab not working as of 2 March 2023 (web browser: Google Chrome 110.0.5481.178; Databricks version: Standard; cloud platform: Azure)

Before Feb 2023, I often opened notebooks in new tabs with Ctrl+Click (or Cmd+Click on Mac) on the notebook names in the Workspace explorer UI. Recently it stopped working (Ctrl+Click on a notebook name does nothing). Is this a bug?

Latest Reply
Anonymous (Not applicable)

Hi @David H, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

  • 0 kudos
4 More Replies
by jose_herazo (New Contributor III)
  • 3293 Views
  • 5 replies
  • 5 kudos

Databricks doesn't stop compute resources in GCP

I started using Databricks in Google Cloud, but it is incurring some unexpected costs. When I create a cluster I notice some compute resources being created in GCP, but when I stop the cluster these resources stay up and never shut down. This issue res...

Latest Reply
antquinonez (New Contributor II)

The answer to the question about the Kubernetes cluster that keeps running regardless of Databricks compute and DWH resources is provided in this thread: https://community.databricks.com/s/question/0D58Y00009TbWqtSAF/auto-termination-for-clusters-jobs-and-delta-live-t...

  • 5 kudos
4 More Replies
by horatiug (New Contributor III)
  • 2964 Views
  • 4 replies
  • 1 kudos

Databricks workspace with custom VPC using terraform in Google Cloud

I am working on Google Cloud and want to create a Databricks workspace with a custom VPC using Terraform. Is that supported? If yes, is it similar to the AWS approach? Thank you, Horatiu

Latest Reply
Anonymous (Not applicable)

Hi @horatiu guja, GCP workspace provisioning using Terraform is in public preview now. Please refer to the doc below for the steps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/gcp-workspace

  • 1 kudos
3 More Replies
by asif5494 (New Contributor III)
  • 2177 Views
  • 3 replies
  • 0 kudos

preActions in Databricks while writing into a Google BigQuery table?

I am writing into a Google BigQuery table using append mode. I need to delete the current day's data before writing new data. I just want to know whether there is any preActions parameter that can be used to delete data before writing into the table. Below is the s...

Latest Reply
Cami (Contributor III)

Can you use overwrite mode instead of append?

  • 0 kudos
2 More Replies
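As far as I can tell, the spark-bigquery connector has no preActions-style option (that idiom comes from the Redshift and Synapse connectors), so one workaround is to run the DELETE through the BigQuery client library before appending. A sketch; project, dataset, table, and date column are placeholders, and `df` stands for the new day's DataFrame:

```python
# Requires the google-cloud-bigquery package (e.g. %pip install google-cloud-bigquery).
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # placeholder project

# Delete today's rows and wait for the job to finish before appending.
client.query(
    "DELETE FROM `my-gcp-project.my_dataset.my_table` "
    "WHERE load_date = CURRENT_DATE()"
).result()

(df.write.format("bigquery")
 .option("temporaryGcsBucket", "my-temp-bucket")     # placeholder bucket
 .option("table", "my_dataset.my_table")
 .mode("append")
 .save())
```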
by stupendousenzio (New Contributor III)
  • 2252 Views
  • 4 replies
  • 7 kudos

Unable to access workspace after the trial period in databricks in Google cloud provider.

I was using the 14-day Databricks trial and had some important notebooks where I had made all my changes. Now I have extended the service and subscribed to Databricks on GCP. When I enter the workspace section I cannot see the w...

Latest Reply
Anonymous (Not applicable)

Hi @Aditya Aranya, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...

  • 7 kudos
3 More Replies
by ellaine (New Contributor III)
  • 6267 Views
  • 12 replies
  • 8 kudos

Google Ad Manager Reporting API Authentication via Databricks

Hello - I was wondering whether anyone has had any experience fetching data through GAM's Reporting API via Databricks. The Reporting API requires installation of the "googleads" library, as well as a googleads.yaml file. I was able to find some docu...

Latest Reply
Anonymous (Not applicable)

Hi @Ellaine Ho, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

  • 8 kudos
11 More Replies
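For anyone attempting the same thing: the googleads client loads its configuration from a YAML file, so one approach is to upload googleads.yaml (with the OAuth2 credentials filled in) to DBFS and point the client at it. A sketch; the DBFS path and API version are assumptions to check against the current library docs, and the library itself comes from `%pip install googleads`:

```python
from googleads import ad_manager

# Placeholder DBFS path to the uploaded googleads.yaml.
client = ad_manager.AdManagerClient.LoadFromStorage("/dbfs/FileStore/googleads.yaml")

# Placeholder API version; use one the installed library supports.
network_service = client.GetService("NetworkService", version="v202405")
print(network_service.getCurrentNetwork())
```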
by horatiug (New Contributor III)
  • 4247 Views
  • 8 replies
  • 3 kudos

Create workspace in Databricks deployed in Google Cloud using terraform

In the documentation (https://registry.terraform.io/providers/databricks/databricks/latest/docs and https://docs.gcp.databricks.com/dev-tools/terraform/index.html) I could not find how to provision Databricks workspaces in GCP. Only cre...

Latest Reply
Anonymous (Not applicable)

Hi @horatiu guja, does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Otherwise, we can help you with more details.

  • 3 kudos
7 More Replies
by PChan (New Contributor II)
  • 971 Views
  • 1 reply
  • 0 kudos

www.googleapis.com

It happens after Databricks deleted my cluster:
{
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "status": {},
    "serviceName": "container.googleapis.com",
    "methodName": "google.container.v1.ClusterMa...

Label: error
Latest Reply
PChan (New Contributor II)

Attached the error log.

  • 0 kudos
by data_testing1 (New Contributor III)
  • 3719 Views
  • 5 replies
  • 5 kudos

Resolved! How much of this tutorial or blog post can I run before starting a cloud instance of databricks?

I'm new to Python and Databricks, so I'm still running tests on features, and I'm not sure how much of this can be run without Databricks, which I guess requires an AWS or Google Cloud account. Can I do all three stages without AWS Databricks, or how fa...

Latest Reply
Hubert-Dudek (Esteemed Contributor III)

Hi, to run it you need Databricks. You can try opening a free Community Edition account; here is how: https://community.databricks.com/s/feed/0D53f00001ebEasCAE

  • 5 kudos
4 More Replies