Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Forum Posts

by anushajalesh28 (New Contributor II) • 1724 Views • 2 replies • 1 kudos

Catalog issue

When I was trying to create a catalog, I got an error saying to mention the Azure storage account and storage container in the following query: CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024 MANAGED LOCATION 'abfss://<databricks-workspace-stack-anu...

Azure Databricks
Latest Reply: Kaniz (Community Manager) • 1 kudos

Hi @anushajalesh28, To create a catalog in Azure Databricks, you need to specify the Azure storage account and storage container in the MANAGED LOCATION clause. Let’s break down the query: CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_270220...

1 More Replies
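
For reference, here is a minimal PySpark sketch of the full statement the reply describes, run via spark.sql. The storage container and storage account names are placeholders, and it assumes Unity Catalog is enabled with an external location or storage credential that already grants access to that path:

# Sketch only: create a Unity Catalog catalog with an explicit managed location.
# <storage-container> and <storage-account> are placeholders, not real values.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024
    MANAGED LOCATION 'abfss://<storage-container>@<storage-account>.dfs.core.windows.net/'
""")
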
by Peter_Jones (New Contributor III) • 2496 Views • 6 replies • 0 kudos

Resolved! Clusters are failing to launch

Hi guys, I am a complete newbie to Databricks; we are trying to figure out if our data models and ETL can run on it. I have got the failure-to-launch message. I have read this message as well: https://community.databricks.com/t5/data-engineering/cluste...

[Screenshot attached: PeterJones_0-1708350996925.png]
Latest Reply: Kaniz (Community Manager) • 0 kudos

Hi @Peter_Jones, Let’s tackle this cluster launch issue step by step. Quota Exceeded Error: The error message indicates that your cluster launch failed due to exceeding the approved quota for standardEDSv4Family Cores in the westeurope location. ...

5 More Replies
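
As a follow-up to the quota explanation above, here is a hedged Python sketch of one way to inspect the regional vCPU quota, using the azure-identity and azure-mgmt-compute packages. The subscription ID is a placeholder, and the exact family name string may differ in your subscription, so treat the filter as illustrative:

# Sketch only: list current vCPU usage vs. approved limit per VM family in westeurope,
# to see whether the Standard EDSv4 Family cores quota is exhausted.
# Assumes you are already authenticated (for example via `az login`) and that
# azure-identity and azure-mgmt-compute are installed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

for usage in client.usage.list("westeurope"):
    if "EDSv4" in usage.name.value:
        print(f"{usage.name.localized_value}: {usage.current_value}/{usage.limit}")

If the limit is already reached, the usual options are to request a quota increase for that VM family or to switch the cluster to a node type from a family that still has headroom.
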
by Data_Engineer3 (Contributor II) • 1691 Views • 2 replies • 0 kudos

Resolved! spark context in databricks

Hi all, in Azure Databricks I am using Structured Streaming's foreachBatch functionality. In one of the functions I am creating a temp view from a PySpark DataFrame (not a GlobalTempView) and trying to access the same temp view by using the spark.sql functiona...

Latest Reply: Lakshay (Esteemed Contributor) • 0 kudos

Do you face this issue without Spark streaming as well? Also, could you share minimal repro code, preferably without streaming?

1 More Replies
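
One common cause of the behaviour described in this thread is that the temp view gets registered on a different SparkSession than the one used for the spark.sql call inside foreachBatch. Below is a minimal sketch of the pattern, assuming Spark 3.3+ where DataFrame.sparkSession is available; the rate source, view name, and timeout are illustrative only:

# Sketch only: inside foreachBatch, create and query the temp view through the
# session that owns the micro-batch DataFrame rather than the driver-level `spark`.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def process_batch(batch_df, batch_id):
    session = batch_df.sparkSession           # session owning this micro-batch
    batch_df.createOrReplaceTempView("events_batch")
    session.sql("SELECT COUNT(*) AS n FROM events_batch").show()

query = (
    spark.readStream.format("rate").option("rowsPerSecond", 10).load()
    .writeStream
    .foreachBatch(process_batch)
    .start()
)
query.awaitTermination(30)   # run briefly for the sketch, then stop
query.stop()
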