Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Pras1
by New Contributor II
  • 8142 Views
  • 2 replies
  • 2 kudos

Resolved! AZURE_QUOTA_EXCEEDED_EXCEPTION - even with more vCPUs than Databricks recommends

I am running this Delta Live Tables PoC from databricks-industry-solutions/industry-solutions-blueprints: https://github.com/databricks-industry-solutions/pos-dlt. I have Standard_DS4_v2 with 28GB and 8 cores x 2 workers - so a total of 16 cores. This is...
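
A quick way to sanity-check the quota math (a hedged sketch; the point the "16 cores" total above leaves out is that the driver node draws from the same Azure vCPU quota as the workers):

# Hedged sketch: Azure's regional vCPU quota must cover the driver node too.
# With the driver on the same Standard_DS4_v2 size (the default), the
# cluster actually requests 24 vCPUs, not 16.
cores_per_node = 8            # Standard_DS4_v2
workers = 2
driver = 1                    # driver defaults to the worker VM size
total_vcpus = cores_per_node * (workers + driver)
print(total_vcpus)            # 24 - compare against the DSv2-family quota in your region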

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Prasenjit Biswas, we haven't heard from you since the last response from @Jose Gonzalez. Kindly share the information with us, and in return we will provide you with the necessary solution. Thanks and Regards

1 More Reply
shrutis23
by New Contributor III
  • 4101 Views
  • 4 replies
  • 4 kudos

How to use Delta Live Tables with Google Cloud Storage

Hi Team, I have been working on a POC exploring Delta Live Tables with a GCS location. I have some doubts: how do we access the GCS bucket? We have a connection established using a Databricks service account. In a normal cluster creation, we go to the cluster page...
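
For reference, a minimal sketch of direct GCS access, assuming the pipeline's cluster has a Google service account attached (cluster page > Advanced options); the bucket and path names are placeholders:

import dlt

# Hedged sketch: with a Google service account attached to the DLT cluster,
# gs:// paths are readable directly - no mount required.
@dlt.table(comment="Raw events ingested from a GCS bucket")
def raw_events():
    return spark.read.format("json").load("gs://<bucket>/raw-events/")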

Latest Reply
Senthil1
Contributor
  • 4 kudos

Kindly mount the GCS cloud storage to a DBFS location, see: Mounting cloud object storage on Databricks | Databricks on Google Cloud
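
A minimal sketch of that mount, following the linked doc's pattern; the bucket and mount names are placeholders, and the cluster must already have the Google service account attached:

# Hedged sketch: mount a GCS bucket at a DBFS path. The Google service
# account configured on the cluster is what authorizes the access.
bucket_name = "<bucket-name>"
mount_name = "<mount-name>"
dbutils.fs.mount(f"gs://{bucket_name}", f"/mnt/{mount_name}")

# Verify the mount by listing the bucket contents through DBFS.
display(dbutils.fs.ls(f"/mnt/{mount_name}"))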

3 More Replies
explore
by New Contributor
  • 1497 Views
  • 0 replies
  • 0 kudos

Hi, can we connect to Teradata Vantage installed in a VM via the Community notebook? I am working on a POC to fetch data from Teradata Vantage (just Teradata, as it uses JDBC) and process it in a Community notebook. Downloaded the terajdbc4.jar

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def load_data(driver, jdbc_url, sql, user, password):
    return spark.read \
        .format('jdbc') \
        .option('driver', driver) \
        .option('url', jdbc_url) \
        .option('dbt...
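
The excerpt cuts off mid-option; here is a hedged completion of the same helper, assuming the standard Spark JDBC options and the usual Teradata driver class and URL format for terajdbc4.jar (host, database, and query are placeholders):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def load_data(driver, jdbc_url, sql, user, password):
    # Wrapping the SQL in an aliased subquery lets Spark push the full
    # query down through the generic "dbtable" option.
    return (spark.read
            .format("jdbc")
            .option("driver", driver)
            .option("url", jdbc_url)
            .option("dbtable", f"({sql}) AS src")
            .option("user", user)
            .option("password", password)
            .load())

df = load_data(
    driver="com.teradata.jdbc.TeraDriver",
    jdbc_url="jdbc:teradata://<vm-host>/DATABASE=<db>",
    sql="SELECT * FROM <db>.<table>",
    user="<user>",
    password="<password>",
)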

Ashley1
by Contributor
  • 2509 Views
  • 5 replies
  • 1 kudos

Resolved! Can ADLS be mounted in DBFS using only ADLS account key?

I realise this is not an optimal configuration, but I'm trying to pull together a POC and I'm not at the point that I wish to ask the AAD admins to create an application for OAuth authentication. I have been able to use direct references to the ADLS co...

  • 2509 Views
  • 5 replies
  • 1 kudos
Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hey there @Ashley Betts, thank you for posting your question. And you found the solution. This is awesome! Would you be happy to mark the answer as best so that other members can find the solution more quickly? Cheers!
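
For anyone landing here, a minimal sketch of what an account-key-only mount for ADLS Gen2 typically looks like (all names are placeholders; keeping the key in a secret scope rather than plain text is strongly preferable):

# Hedged sketch: mounting an ADLS Gen2 container with just the storage
# account key - no AAD application or OAuth involved.
storage_account = "<storage-account>"
container = "<container>"
dbutils.fs.mount(
    source=f"abfss://{container}@{storage_account}.dfs.core.windows.net/",
    mount_point=f"/mnt/{container}",
    extra_configs={
        f"fs.azure.account.key.{storage_account}.dfs.core.windows.net":
            dbutils.secrets.get(scope="<scope>", key="<storage-key>")
    },
)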

4 More Replies
alonisser
by Contributor
  • 2668 Views
  • 3 replies
  • 4 kudos

Resolved! How to migrate an existing workspace to an external metastore

Currently we're on an Azure Databricks workspace we set up during the POC, a long time ago. In the meantime we have built quite a production workload on top of Databricks. Now we want to split workspaces - one for analysts and one for data engineeri...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 4 kudos

From a Databricks notebook, just run mysqldump. You can take the server address and details from the logs or configuration. I am also including a link to an example notebook: https://docs.microsoft.com/en-us/azure/databricks/kb/_static/notebooks/2016-election-tweets.h...
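
A hedged sketch of that dump step, assuming the metastore is MySQL-compatible; host, database name, and credentials are placeholders to be taken from the cluster's metastore connection config (e.g. javax.jdo.option.ConnectionURL) or the driver logs:

# Hedged sketch: dump the Hive metastore from a notebook so it can be
# imported into the new external metastore. mysqldump may first need
# installing on the driver (e.g. apt-get install -y mysql-client).
import subprocess

cmd = [
    "mysqldump", "--single-transaction",
    "-h", "<metastore-host>",
    "-u", "<user>", "-p<password>",
    "<metastore-db>",
]
# Write to /dbfs so the dump survives cluster termination.
with open("/dbfs/tmp/metastore_dump.sql", "wb") as out:
    subprocess.run(cmd, stdout=out, check=True)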

2 More Replies