by Pras1 • New Contributor II
- 8508 Views
- 2 replies
- 2 kudos
I am running this Delta Live Tables PoC from databricks-industry-solutions/industry-solutions-blueprints (https://github.com/databricks-industry-solutions/pos-dlt). I have Standard_DS4_v2 with 28GB and 8 cores x 2 workers - so a total of 16 cores. This is...
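For context, pipelines in that accelerator are defined as decorated Python functions; below is a minimal hedged sketch in that style (the table name, source path, and input format are illustrative placeholders, not the repo's actual definitions). Worker count mainly governs how many partitions of such a streaming table are processed in parallel.

    # Hedged sketch of a DLT table in the style of the pos-dlt accelerator.
    # 'spark' is provided by the DLT runtime; names and paths are placeholders.
    import dlt

    @dlt.table(
        name="pos_raw",
        comment="Raw point-of-sale records ingested incrementally with Auto Loader",
    )
    def pos_raw():
        return (
            spark.readStream.format("cloudFiles")   # Auto Loader source
            .option("cloudFiles.format", "json")    # assumed input format
            .load("/mnt/pos/raw/")                  # placeholder path
        )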
Latest Reply
Hi @Prasenjit Biswas, we haven't heard from you since the last response from @Jose Gonzalez. Kindly share the information with us so that we can provide you with the necessary solution. Thanks and Regards
1 More Replies
- 4246 Views
- 4 replies
- 4 kudos
Hi Team, I have been working on a POC exploring Delta Live Tables with a GCS location. I have some doubts: how do we access the GCS bucket? We have a connection established using a Databricks service account. In a normal cluster creation, we go to the cluster page...
Latest Reply
Kindly mount the GCS bucket to a DBFS location; see below: Mounting cloud object storage on Databricks | Databricks on Google Cloud
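On Google Cloud, where the cluster's service account already has access to the bucket, the mount itself is short; a hedged sketch with placeholder bucket and mount names:

    # Hedged sketch: mounting a GCS bucket to DBFS on Databricks on Google Cloud.
    # Access comes from the cluster's service account, so no credentials are
    # passed here; bucket and mount point names are placeholders.
    dbutils.fs.mount(
        source="gs://my-poc-bucket",     # placeholder bucket
        mount_point="/mnt/poc",          # placeholder mount point
    )
    display(dbutils.fs.ls("/mnt/poc"))   # verify the mount is readable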
3 More Replies
- 1565 Views
- 0 replies
- 0 kudos
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def load_data(driver, jdbc_url, sql, user, password):
    return spark.read \
        .format('jdbc') \
        .option('driver', driver) \
        .option('url', jdbc_url) \
        .option('dbt...
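The excerpt cuts off mid-option; a hedged completion, assuming the remaining lines set the usual JDBC options (dbtable, user, password) and finish with load():

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    def load_data(driver, jdbc_url, sql, user, password):
        # Standard Spark JDBC read; the truncated tail is assumed to be the
        # usual dbtable/user/password trio followed by load().
        return (
            spark.read.format('jdbc')
            .option('driver', driver)
            .option('url', jdbc_url)
            .option('dbtable', sql)       # assumption: 'dbt...' was 'dbtable'
            .option('user', user)
            .option('password', password)
            .load()
        )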
- 2638 Views
- 5 replies
- 1 kudos
I realise this is not an optimal configuration, but I'm trying to pull together a POC and I'm not at the point that I wish to ask the AAD admins to create an application for OAuth authentication. I have been able to use direct references to the ADLS co...
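Without an AAD app registration, a common stopgap is authenticating with the storage account access key in the Spark session configuration; a hedged sketch, with the account name and secret scope/key as placeholders:

    # Hedged sketch: direct ADLS Gen2 access via account key instead of OAuth.
    # 'mystorageacct' and the secret scope/key names are placeholders.
    spark.conf.set(
        "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
        dbutils.secrets.get(scope="adls-poc", key="storage-account-key"),
    )
    df = spark.read.format("delta").load(
        "abfss://container@mystorageacct.dfs.core.windows.net/path/to/table"
    )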
Latest Reply
Hey there @Ashley Betts, thank you for posting your question, and you found the solution. This is awesome! Would you be happy to mark the answer as best so that other members can find the solution more quickly? Cheers!
4 More Replies
- 2747 Views
- 3 replies
- 4 kudos
Currently we're on an Azure Databricks workspace that we set up during the POC, a long time ago. In the meantime we have built quite a production workload on top of Databricks. Now we want to split workspaces - one for analysts and one for data engineeri...
Latest Reply
From a Databricks notebook, just run mysqldump. You can take the server address and details from the logs or configuration. I am also including a link to an example notebook: https://docs.microsoft.com/en-us/azure/databricks/kb/_static/notebooks/2016-election-tweets.h...
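For illustration, a hedged sketch of invoking mysqldump from a notebook cell; every connection detail below is a placeholder to be replaced with values from your own metastore configuration:

    # Hedged sketch: dumping an external metastore database from a notebook.
    # Host, user, database name, and secret scope/key are placeholders --
    # take the real values from your cluster's metastore settings.
    import subprocess

    with open("/dbfs/tmp/metastore_dump.sql", "w") as out:
        subprocess.run(
            [
                "mysqldump",
                "--host=metastore-host.mysql.database.azure.com",  # placeholder
                "--user=metastore_user",                           # placeholder
                "--password=" + dbutils.secrets.get("poc", "metastore-pw"),
                "metastore_db",                                    # placeholder
            ],
            stdout=out,
            check=True,  # raise if mysqldump exits non-zero
        )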
2 More Replies