Get Started Discussions

Forum Posts

JohnJustus
by New Contributor III
  • 556 Views
  • 3 replies
  • 1 kudos

Pyspark API reference

All, I am using Azure Databricks, and at times I refer to the PySpark APIs to interact with data in Azure Data Lake using Python and SQL, here: https://spark.apache.org/docs/3.5.0/api/python/reference/pyspark.sql/index.html. Does the Databricks website have the list o...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @JohnJustus, Yes, Databricks has its own API reference set, which you can refer to. However, as you use PySpark on Azure Databricks, you can also refer to the Apache Spark™ PySpark API reference for more detailed information. Both sources can be h...

2 More Replies
sensanjoy
by Contributor
  • 549 Views
  • 2 replies
  • 1 kudos

Monitor all Streaming jobs to make sure they are in RUNNING status.

Hi Experts, Is there any way that we can monitor all our Streaming jobs in the workspace to make sure they are in "RUNNING" status? I could see there is one option: create a batch job that runs frequently and checks the status (through the REST API) of all str...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @sensanjoy, You can monitor streaming jobs in the workspace through:
1. **Job Run Dashboard**: Displays information about all running jobs.
   - Components: Job ID, Run Page, Run Name, Start Time, Created By.
   - Divided into two sections: Job...
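
A minimal sketch of the REST-API polling approach raised in the question, assuming the Jobs API 2.1 runs/list endpoint; the workspace host is a placeholder and the token is read from an environment variable:

```python
# Hedged sketch: poll the Jobs API for active runs and flag anything not RUNNING.
import os
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"   # placeholder
TOKEN = os.environ["DATABRICKS_TOKEN"]                    # assumed PAT token

def list_active_runs():
    """Return all currently active job runs in the workspace (Jobs API 2.1)."""
    resp = requests.get(
        f"{HOST}/api/2.1/jobs/runs/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"active_only": "true"},
    )
    resp.raise_for_status()
    return resp.json().get("runs", [])

# Flag runs whose life-cycle state is anything other than RUNNING.
for run in list_active_runs():
    state = run.get("state", {}).get("life_cycle_state")
    if state != "RUNNING":
        print(f"Run {run.get('run_id')} of job {run.get('job_id')} is {state}")
```

Scheduling something like this as a small, frequently triggered job gives the periodic status check the question describes.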

1 More Replies
NanthakumarYoga
by New Contributor
  • 505 Views
  • 1 replies
  • 0 kudos

Partitioning or Processing: Reading a CSV file of 5 to 9 GB

Hi Team, Would you please guide me on the following, for an instance with 28 GB and 8 cores:
1. How does Databricks read 5 to 9 GB files from Blob storage? (Is the full file loaded directly into one node's memory?)
2. How many tasks will be created based on the cores? How many executors wil...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @NanthakumarYoga, Databricks reads data from Blob storage in a distributed way, breaking the data into partitions processed by separate tasks in Spark.
- The size of partitions can be user-controlled, enabling efficient processing of large files w...
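
A small sketch of the partition-size control described above, assuming an uncompressed CSV (the path and the 128 MB value are placeholders; gzip-compressed files are not splittable):

```python
# Hedged sketch: cap input partition size so a 5-9 GB CSV is split across many tasks
# rather than pulled into a single executor's memory.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# ~128 MB per input partition; an 8 GB file then yields roughly 64 partitions/tasks,
# which are scheduled across the 8 cores a few at a time.
spark.conf.set("spark.sql.files.maxPartitionBytes", str(128 * 1024 * 1024))

df = (
    spark.read.option("header", "true")
    .csv("abfss://<container>@<account>.dfs.core.windows.net/data/big_file.csv")
)
print(df.rdd.getNumPartitions())
```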

LJacobsen
by New Contributor II
  • 1495 Views
  • 2 replies
  • 2 kudos

Resolved! Call a workspace notebook from a repository notebook

We have a Databricks workspace with several repositories. We'd like to have a place with shared configuration variables that can be accessed by notebooks in any repository. I created a folder named Shared under the root workspace and in that folder, c...

[screenshots attached]
Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @LJacobsen, you cannot directly call a workspace notebook from inside a repository in Databricks. The error message you're seeing suggests that Databricks cannot find the notebook you're trying to reference, possibly because it's looking within t...
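
For context, the call pattern under discussion looks roughly like this; a sketch only, with /Shared/config as a placeholder path, and, per the reply above, an absolute workspace path may still fail to resolve from a repo notebook:

```python
# Relative paths are resolved inside the current repo:
# dbutils.notebook.run("./config", 60)

# An absolute workspace path is what the post attempts; /Shared/config is a placeholder.
result = dbutils.notebook.run("/Shared/config", 60)
print(result)
```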

1 More Replies
non
by New Contributor
  • 572 Views
  • 1 replies
  • 0 kudos

Resolved! Not able to reset password

Hello, when I click "forgot password" I receive the mail, but when I enter the new password and click submit, it keeps loading the whole time. Please help me.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @non, Please look at this link related to the Community Edition, which might solve your problem. I appreciate your interest in sharing your Community Edition query with us. However, at this time, we are not entertaining any Community Edition q...

Data_Analytics1
by Contributor III
  • 570 Views
  • 2 replies
  • 2 kudos

The base provider of Delta Sharing Catalog system does not exist.

I have enabled system tables in Databricks by following the procedure mentioned here. The owner of the system catalog is System user. I cannot see the schemas or tables of this catalog. It is showing me the error: The base provider of Delta Sharing C...

Latest Reply
Data_Analytics1
Contributor III
  • 2 kudos

I have already enabled all these schemas using the Databricks CLI command. After enabling, I was able to see all the tables and data inside these schemas. Then I disabled all the schemas using the CLI command mentioned here. Now, even after re-en...
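
For reference, a hedged sketch of re-enabling a system schema over the REST API (the same operation the CLI command wraps); the host, token, metastore ID, and schema name are placeholders:

```python
# Hedged sketch: enable (PUT) or disable (DELETE) a Unity Catalog system schema.
import os
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = os.environ["DATABRICKS_TOKEN"]
METASTORE_ID = "<metastore-id>"

def set_system_schema(schema: str, enable: bool) -> None:
    """Enable or disable one system schema (e.g. 'access', 'billing')."""
    url = f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas/{schema}"
    method = requests.put if enable else requests.delete
    method(url, headers={"Authorization": f"Bearer {TOKEN}"}).raise_for_status()

set_system_schema("access", enable=True)
```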

1 More Replies
jgrycz
by New Contributor III
  • 656 Views
  • 2 replies
  • 1 kudos

Resolved! Delivery audit logs to multiple S3 buckets

Hi! Am I able to configure delivery of Databricks audit logs to multiple S3 buckets (on different AWS accounts)? Thanks in advance!

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @jgrycz , Yes, you are able to configure the delivery of Databricks audit logs to multiple S3 buckets on different AWS accounts. This can be achieved by setting up a separate storage configuration for each S3 bucket using the Databricks API. Here...
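
A hedged sketch of that per-bucket setup via the Account API; the account ID, auth, bucket names, and credentials IDs are placeholders, and the exact response fields should be checked against the Account API docs:

```python
# Hedged sketch: one storage configuration and one audit-log delivery configuration
# per target S3 bucket (credentials are created separately for each AWS account).
import requests

ACCOUNT_HOST = "https://accounts.cloud.databricks.com"
ACCOUNT_ID = "<account-id>"
AUTH = ("<account-admin-email>", "<password-or-token>")  # placeholder account-level auth

def deliver_audit_logs(bucket_name: str, credentials_id: str) -> dict:
    """Create a storage configuration for one bucket and point audit-log delivery at it."""
    storage = requests.post(
        f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/storage-configurations",
        auth=AUTH,
        json={
            "storage_configuration_name": f"audit-logs-{bucket_name}",
            "root_bucket_info": {"bucket_name": bucket_name},
        },
    ).json()

    return requests.post(
        f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/log-delivery",
        auth=AUTH,
        json={"log_delivery_configuration": {
            "config_name": f"audit-{bucket_name}",
            "log_type": "AUDIT_LOGS",
            "output_format": "JSON",
            "credentials_id": credentials_id,
            "storage_configuration_id": storage["storage_configuration_id"],
        }},
    ).json()

# One call per bucket / AWS account.
deliver_audit_logs("audit-logs-account-a", "<credentials-id-a>")
deliver_audit_logs("audit-logs-account-b", "<credentials-id-b>")
```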

1 More Replies
Akshith_Rajesh
by New Contributor III
  • 611 Views
  • 1 replies
  • 0 kudos

Get the thrift hive.metastore.uri for Databricks unity catalog

I am trying to connect to Unity Catalog metastore tables using Presto. Based on the Presto documentation, I need to use the below configuration to connect to Delta tables in Unity Catalog: https://prestodb.io/docs/current/connector/hive.html. So from...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Akshith_Rajesh , In Databricks, the Hive metastore URI is not directly exposed to users. However, you can interact with the metastore using Spark SQL commands. If you're using an external metastore, the URI would be something you've configured an...
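
If it helps, a quick hedged check of what Hive metastore URI, if any, the cluster itself is configured with; this uses an internal accessor, and on a pure Unity Catalog setup it typically returns nothing:

```python
# Hedged sketch: read hive.metastore.uris from the cluster's Hadoop configuration.
uri = spark.sparkContext._jsc.hadoopConfiguration().get("hive.metastore.uris")
print(uri or "no external Hive metastore URI configured on this cluster")
```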

Ivaylokrastev12
by New Contributor
  • 516 Views
  • 1 replies
  • 0 kudos

Can someone explain the key differences between Databricks Community Edition and Databricks Workspace?

Hello Databricks community! I'm relatively new to Databricks and I'm trying to understand the distinctions between Databricks Community Edition and Databricks Workspace.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Ivaylokrastev12 ,  This article describes how to sign up for Databricks Community Edition. Unlike the Databricks Free Trial, Community Edition doesn’t require that you have your own cloud account or supply cloud compute or storage resources. Howe...

6502
by New Contributor III
  • 460 Views
  • 1 replies
  • 0 kudos

Blue/Green Deployment, Table Cloning, and Delta live table pipelines

This is a rather complex question that addresses Databricks users only. Let me recap a bit of the context that produced it. In attempting to adopt the Blue/Green deployment protocol, we found good applications of the table cloning capabilities offered ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @6502, To clone tables used in Delta Live Tables (DLT) pipelines, you can utilize Delta Lake's versioning and time travel features. Here are the steps you can follow:
1. First, identify the version of the table you want to clone. This could be...
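
A minimal sketch of that versioned clone, assuming Unity Catalog three-level names; the table names and the version number are placeholders:

```python
# Hedged sketch: inspect table history, then clone a specific version into the
# "green" environment's schema.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# 1. Identify the version to clone.
spark.sql("DESCRIBE HISTORY main.prod.events").show(truncate=False)

# 2. Deep-clone that version (SHALLOW CLONE would copy only metadata).
spark.sql("""
    CREATE OR REPLACE TABLE main.green.events
    DEEP CLONE main.prod.events VERSION AS OF 42
""")
```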

hafeez
by New Contributor III
  • 379 Views
  • 1 replies
  • 0 kudos

Repos section in Admin Settings Page is not visible

Hello, We have multiple Azure Databricks workspaces, and we recently noticed that in some of them the Administrator workspace settings do not show the Repos section (https://learn.microsoft.com/en-us/azure/databricks/repos/repos-setup#--restrict-usag...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @hafeez, To enable the Repos section in the Administrator workspace settings, you need to follow these steps:
1. Click your username in the top bar of the workspace.
2. Select Admin Settings.
3. Navigate to the Repos section and enable it.
For mor...

Hareesh1980
by New Contributor
  • 312 Views
  • 1 replies
  • 0 kudos

Calculation on a dataframe

Hi, I need to do the following calculations on a dataframe. They should be done for each period, and the calculated value will be used for the next period's calculation. Adding sample data and the formula from Excel here. Thanks in advance for your help. Need to calcula...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Hareesh1980, In general, to perform these calculations on a DataFrame in Spark, you would need to use the withColumn function to create new columns 'NewCashFlow' and 'NewAllocation', and apply the necessary calculations using the existing column...
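
A hedged sketch of that withColumn pattern with placeholder column names and formulas (the actual Excel formula from the question isn't reproduced here); note the caveat in the final comment about truly recursive period-over-period dependencies:

```python
# Hedged sketch: carry the previous period's value forward and derive new columns.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [(1, 100.0), (2, 120.0), (3, 90.0)],
    ["Period", "CashFlow"],
)

w = Window.orderBy("Period")

df = (
    df.withColumn("PrevCashFlow", F.lag("CashFlow", 1, 0.0).over(w))
      .withColumn("NewCashFlow", F.col("CashFlow") + F.col("PrevCashFlow"))  # placeholder formula
      .withColumn("NewAllocation", F.col("NewCashFlow") * F.lit(0.1))        # placeholder formula
)
df.show()

# If NewCashFlow for period N must use the *computed* NewCashFlow of period N-1
# (a true recursive carry-forward), lag() alone is not enough; iterate over periods
# or use a grouped pandas UDF instead.
```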

Upen_databricks
by New Contributor II
  • 322 Views
  • 1 replies
  • 0 kudos

Databricks access to Microsoft Sql Server

Hi, I am facing the below error while accessing Microsoft SQL Server. Please suggest what permissions I need to check at the database level. I have the scope and secret created and the key vault set up as expected. I feel it is some DB permission issue. Error: com.mi...

[screenshot attached]
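
For context, a hedged sketch of the JDBC read pattern typically used here, with the password pulled from the Key Vault-backed secret scope; the server, database, table, user, scope, and key names are all placeholders:

```python
# Hedged sketch of a JDBC read against Azure SQL / SQL Server from a Databricks notebook.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>;encrypt=true"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")
    .option("user", "<sql-user>")
    .option("password", dbutils.secrets.get(scope="<kv-scope>", key="<sql-password-key>"))
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# A login/permission failure at this point usually means the SQL user still needs
# CONNECT on the database and SELECT on the target schema or table.
```
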
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Upen_databricks, The issue you're experiencing with your DLT pipeline could be due to a couple of factors:
1. Development Optimizations: As per the Databricks release notes from September 7-13, 2021, new pipelines run in development mode by defau...

deepthakkar007
by New Contributor
  • 344 Views
  • 1 replies
  • 1 kudos
Latest Reply
User16539034020
Contributor II
  • 1 kudos

Hello,  Thanks for contacting Databricks Support.  It appears you're employing a CloudFormation template to establish a Databricks workspace. The recommended method for creating workspaces is through the AWS Quick Start. Please refer to the documenta...

Mohan2
by New Contributor
  • 490 Views
  • 1 replies
  • 0 kudos

SQL Warehouse - several issues

Hi there, I am facing several issues while trying to run the SQL warehouse starter on Azure Databricks. Please note I am new to this data world, Azure, and Databricks. While starting the SQL starter warehouse in the Databricks trial version, I am getting these ...

[screenshots attached]
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Mohan2, Based on the errors you're encountering, you're having issues with cluster creation and quota limitations. Here are some potential solutions:
1. **Increase your Azure quota:** The error message indicates that your Azure subscription does...
