Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Hi, I am able to read data from a BigQuery table, but I am getting an error writing data to a table in BigQuery. I followed the instructions in this document: Connecting Databricks to BigQuery | Google Cloud.
%scala
import scala.io.Source
val contentCred = "/dbfs/FileSt...
@udays22222 did you find a solution for this one? I face the same problem when I use a Shared access mode cluster: I can read, but I cannot write, with the error you mentioned.
Hello, I'm facing some problems while writing data to Google BigQuery. I'm able to read data from the same table, but when I try to append data I get the following error: Error getting access token from metadata server at: http://169.254.169.254/compu...
Sometimes this error occurs when your private key or service account key is not being sent in the request header. If you are using Spark or Databricks, configure the JSON key in the Spark config so it gets added to the request header.
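For reference, here is a minimal sketch of wiring the key into the spark-bigquery-connector, assuming a service-account key file already uploaded to DBFS; the key path, project, table, and bucket names below are hypothetical:

```scala
// Sketch: pass a service-account JSON key to the spark-bigquery-connector
// as a base64-encoded "credentials" option. Names and paths are hypothetical.
import java.nio.file.{Files, Paths}
import java.util.Base64

val keyPath = "/dbfs/FileStore/keys/my-sa-key.json"           // hypothetical key location
val credBase64 = Base64.getEncoder.encodeToString(Files.readAllBytes(Paths.get(keyPath)))

spark.conf.set("credentials", credBase64)                     // connector uses this for auth
spark.conf.set("parentProject", "my-gcp-project")             // project billed for BigQuery jobs

val df = spark.table("staging_table")                         // hypothetical data to write
df.write
  .format("bigquery")
  .option("table", "my_dataset.my_table")
  .option("temporaryGcsBucket", "my-temp-bucket")             // staging bucket for indirect writes
  .mode("append")
  .save()
```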
I am looking through Google Cloud Platform and want to get started with Databricks on GCP. Happy if anyone can point me toward guidance on how to get started. Thanks
Hey boyelana, Databricks on Google Cloud Platform is definitely an interesting and powerful combination, and I'm thrilled to see that you're looking to get started with it! To begin your journey with Databricks on GCP, there are a few steps y...
I have S3 as a data source containing a sample TPC dataset (10G, 100G). I want to convert it into Parquet files with an average size of about ~256 MiB. What configuration parameter can I use to set that? I also need the data to be partitioned. And withi...
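Spark has no direct "target MiB per file" setting for Parquet output, so one common approach is to cap records per file from an estimated row size. A sketch under those assumptions (the paths, partition column, and average row size are hypothetical):

```scala
// Sketch: steer Parquet output-file size via spark.sql.files.maxRecordsPerFile.
val df = spark.read.parquet("s3://my-bucket/tpc-input/")      // hypothetical source

val targetBytes = 256L * 1024 * 1024                          // ~256 MiB per file
val avgRowBytes = 200L                                        // measure on a sample of your data
spark.conf.set("spark.sql.files.maxRecordsPerFile", targetBytes / avgRowBytes)

df.write
  .partitionBy("sale_date")                                   // hypothetical partition column
  .mode("overwrite")
  .parquet("s3://my-bucket/tpc-parquet/")
```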
Hi @Vikas Goel, we haven't heard from you since the last response from @Werner Stinckens, and I was checking back to see if those suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to o...
I'm working in the Google Cloud environment. I have an Auto Loader job that uses cloud files notifications to load data into a Delta table. I want to filter the files from the Pub/Sub topic based on the path in GCS where the files are located, not...
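If a glob over the GCS path is enough for your case, one option is Auto Loader's pathGlobFilter option. A hedged sketch (bucket, format, glob, and table names are hypothetical; this filters the files Auto Loader picks up rather than the Pub/Sub subscription itself):

```scala
// Sketch: filter notification-discovered files by path with pathGlobFilter.
val stream = spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.useNotifications", "true")              // file-notification mode
  .option("pathGlobFilter", "*/landing/orders/*")             // glob relative to the load path;
                                                              // verify matching semantics on your DBR
  .option("cloudFiles.schemaLocation", "gs://my-bucket/_schemas/orders")
  .load("gs://my-bucket/")

stream.writeStream
  .option("checkpointLocation", "gs://my-bucket/_checkpoints/orders")
  .table("orders_bronze")
```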
I'm just getting started with Databricks and DLTs. I've followed all the docs and tutorials I can find on this and believe I have set up everything in Azure correctly: service principal, paths, and spark configs. When I run a simple DLT autoloader pi...
Before Feb 2023, I often opened notebooks in new tabs with Ctrl+Click (or Command+Click on Mac) on the notebook names in the Workspace explorer UI. Recently it stopped working (Ctrl+Click on a notebook name does nothing). Is this a bug?
Hi @David H, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
I started using Databricks on Google Cloud, but I'm seeing some unexpected costs. When I create a cluster I notice some compute resources being created in GCP, but when I stop the cluster these resources are still up and never shut down. This issue res...
The answer to the question about the Kubernetes cluster that keeps running regardless of the Databricks compute and DWH resources is provided in this thread: https://community.databricks.com/s/question/0D58Y00009TbWqtSAF/auto-termination-for-clusters-jobs-and-delta-live-t...
I am working on Google Cloud and want to create a Databricks workspace with a custom VPC using Terraform. Is that supported? If yes, is it similar to the AWS approach? Thank you, Horatiu
Hi @horatiu guja, GCP workspace provisioning using Terraform is in public preview now. Please refer to the doc below for the steps: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/gcp-workspace
I am writing into a Google BigQuery table using append mode. I need to delete the current day's data before writing new data. I just want to know if there is a preActions parameter that can be used to delete data before writing into the table? Below is the s...
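As far as I know the spark-bigquery-connector has no preActions option (that exists in connectors such as the Redshift one), so one workaround is to run the DELETE through the BigQuery client library first and then append. A hedged sketch, with hypothetical project, dataset, table, column, and bucket names:

```scala
// Sketch: delete today's rows via the BigQuery Java client, then append with Spark.
import com.google.cloud.bigquery.{BigQueryOptions, QueryJobConfiguration}

val bq = BigQueryOptions.getDefaultInstance.getService
val deleteSql =
  "DELETE FROM `my-project.my_dataset.my_table` WHERE event_date = CURRENT_DATE()"
bq.query(QueryJobConfiguration.newBuilder(deleteSql).build())  // DML runs before the write

val df = spark.table("staging_today")                          // hypothetical new data
df.write
  .format("bigquery")
  .option("table", "my_dataset.my_table")
  .option("temporaryGcsBucket", "my-temp-bucket")
  .mode("append")
  .save()
```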
I was using the 14-day Databricks trial and had some important notebooks where I had made all my changes. Now I have extended the service and subscribed to Databricks on GCP. When I enter the workspace section I cannot see the w...
Hi @Aditya Aranya, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Than...
In the documentation (https://registry.terraform.io/providers/databricks/databricks/latest/docs, https://docs.gcp.databricks.com/dev-tools/terraform/index.html) I could not find how to provision Databricks workspaces in GCP. Only cre...
Hi @horatiu guja, does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Else, we can help you with more details.
I'm new to Python and Databricks, so I'm still running tests on features, and I'm not sure how much of this can be run without Databricks, which I guess requires an AWS or Google Cloud account? Can I do all three stages without AWS Databricks, or how fa...
Hi, to run it you need Databricks. You can try opening a free Community Edition account; here is how: https://community.databricks.com/s/feed/0D53f00001ebEasCAE
While connecting Databricks and Grafana, I have gone through the following approaches:
- Install the Grafana Agent on Databricks clusters from the Databricks console --> not working, since the system is not booted with systemd as the init system.
- Since Spark 3 has Pro...
There is a repo with a Prometheus gateway example: https://gist.github.com/Lowess/3a71792d2d09e38bf8f524644bbf8349. In the community, we usually use Datadog, as the two play nicely together: https://docs.datadoghq.com/integrations/databricks/?tabs=driveronly
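Along the lines of that gateway repo, here is a minimal sketch of pushing a custom job metric to a Prometheus Pushgateway, which Grafana can then read via Prometheus. It assumes the simpleclient and simpleclient_pushgateway libraries are attached to the cluster; the gateway host and metric are hypothetical:

```scala
// Sketch: push a custom metric from a Databricks job to a Prometheus Pushgateway.
import io.prometheus.client.{CollectorRegistry, Gauge}
import io.prometheus.client.exporter.PushGateway

val registry = new CollectorRegistry()
val rowsProcessed = Gauge.build()
  .name("job_rows_processed")
  .help("Rows processed by this Databricks job run.")
  .register(registry)

rowsProcessed.set(12345)                                      // e.g. df.count() in a real job

// Push once per batch/run; Prometheus scrapes the gateway, Grafana reads Prometheus.
new PushGateway("pushgateway.internal:9091").pushAdd(registry, "databricks_job")
```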