Get Started Discussions

Forum Posts

Akshith_Rajesh
by New Contributor III
  • 461 Views
  • 1 replies
  • 0 kudos

Get the thrift hive.metastore.uri for Databricks unity catalog

I am trying to connect to Unity Catalog metastore tables using Presto. Based on the Presto documentation, I need to use the below configuration to connect to Delta tables in the Unity Catalog: https://prestodb.io/docs/current/connector/hive.html So from...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Akshith_Rajesh, In Databricks, the Hive metastore URI is not directly exposed to users. However, you can interact with the metastore using Spark SQL commands. If you're using an external metastore, the URI would be something you've configured an...
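For reference, the Presto Hive connector docs linked in the question describe a catalog properties file of the following shape. This is only a sketch of that config — the thrift host and port are hypothetical placeholders, and (per the reply above) Databricks does not publish a thrift metastore endpoint for Unity Catalog:

```properties
# etc/catalog/hive.properties — shape described by the Presto Hive connector docs.
# The host/port are hypothetical; Unity Catalog does not expose a thrift URI.
connector.name=hive-hadoop2
hive.metastore.uri=thrift://example-metastore-host:9083
```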

Ivaylokrastev12
by New Contributor
  • 287 Views
  • 1 replies
  • 0 kudos

Can someone explain the key differences between Databricks

Hello Databricks community! I'm relatively new to Databricks and I'm trying to understand the distinctions between Databricks Community Edition and Databricks Workspace

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Ivaylokrastev12 ,  This article describes how to sign up for Databricks Community Edition. Unlike the Databricks Free Trial, Community Edition doesn’t require that you have your own cloud account or supply cloud compute or storage resources. Howe...

6502
by New Contributor III
  • 336 Views
  • 1 replies
  • 0 kudos

Blue/Green Deployment, Table Cloning, and Delta live table pipelines

This is a rather complex question that addresses Databricks users only. Let me recap a bit of the context that produced it. In the attempt to adopt the Blue/Green deployment protocol, we found good applications of the table cloning capabilities offered ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @6502, To clone tables used in Databricks Delta Live Tables (DLT) pipelines, you can utilize Delta Lake's versioning and time travel features. Here are the steps you can follow: 1. First, identify the version of the table you want to clone. This could be...
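A minimal sketch of the clone-at-a-version approach described above, using Databricks SQL. The table names and version number are hypothetical placeholders:

```sql
-- Deep-clone a specific version of a table into another schema.
-- `main.prod.events`, `main.staging.events_clone`, and version 12 are placeholders.
CREATE TABLE main.staging.events_clone
  DEEP CLONE main.prod.events VERSION AS OF 12;
```

A `SHALLOW CLONE` is cheaper when you only need metadata pointing at the source files rather than a full copy.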

hafeez
by New Contributor III
  • 298 Views
  • 1 replies
  • 0 kudos

Repos section in Admin Settings Page is not visible

Hello, We have multiple workspaces of Azure Databricks, and we recently noticed that in some of the Administrator workspace settings we are not able to see the Repos section (https://learn.microsoft.com/en-us/azure/databricks/repos/repos-setup#--restrict-usag...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @hafeez, To enable the Repos section in the Administrator workspace settings, you need to follow these steps:
1. Click your username in the top bar of the workspace.
2. Select Admin Settings.
3. Navigate to the Repos section and enable it.
For mor...

Hareesh1980
by New Contributor
  • 243 Views
  • 1 replies
  • 0 kudos

Calculation on a dataframe

Hi, I need to do the following calculations on a dataframe. It should be done for each period, and the calculated value will be used for the next period's calculation. Adding sample data and the formula from Excel here. Thanks in advance for your help. Need to calcula...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Hareesh1980, In general, to perform these calculations in a DataFrame in Spark, you would need to use the withColumn function to create new columns 'NewCashFlow' and 'NewAllocation', and apply the necessary calculations using the existing column...
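Since the actual Excel formula is elided in the post, here is a plain-Python sketch of the general pattern: a period-by-period recurrence where each period's result feeds the next. The formula itself (`balance * (1 + rate) + cashflow`) is hypothetical — substitute the real logic from the spreadsheet:

```python
def run_periods(cashflows, rate):
    """Carry a running value forward period by period.

    Hypothetical recurrence: balance = prev_balance * (1 + rate) + cashflow.
    Replace with the actual Excel formula from the post.
    """
    balance = 0.0
    balances = []
    for cf in cashflows:
        balance = balance * (1 + rate) + cf  # depends on the previous period
        balances.append(balance)
    return balances

print(run_periods([100.0, 50.0, -30.0], 0.10))
```

Note that strictly sequential recurrences like this are awkward to express with `withColumn` alone; in Spark they usually need a window function over an ordered period column, or a driver-side loop when the dependency cannot be vectorized.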

Upen_databricks
by New Contributor II
  • 252 Views
  • 1 replies
  • 0 kudos

Databricks access to Microsoft Sql Server

Hi, I am facing the below error while accessing Microsoft SQL Server. Please suggest what permissions I need to check at the database level. I have the scope and secret created and the key vault set up as expected. I feel it is some DB permission issue. Error: com.mi...

[screenshot attachment: Upen_databricks_0-1696539860979.png]
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Upen_databricks, The issue you're experiencing with your DLT pipeline could be due to a couple of factors: 1. Development Optimizations: As per the Databricks release notes from September 7-13, 2021, new pipelines run in development mode by defau...

deepthakkar007
by New Contributor
  • 298 Views
  • 1 replies
  • 1 kudos
Latest Reply
User16539034020
Contributor II
  • 1 kudos

Hello,  Thanks for contacting Databricks Support.  It appears you're employing a CloudFormation template to establish a Databricks workspace. The recommended method for creating workspaces is through the AWS Quick Start. Please refer to the documenta...

Mohan2
by New Contributor
  • 350 Views
  • 1 replies
  • 0 kudos

SQL Warehouse - several issues

Hi there, I am facing several issues while trying to run the SQL warehouse starter on Azure Databricks. Please note I am new to this data world, Azure & Databricks. While starting the SQL starter warehouse in the Databricks Trial version, I am getting these ...

[screenshot attachments: Mohan2_0-1695443100723.png, Mohan2_0-1695441032770.png]
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Mohan2, Based on the errors you're encountering, you're having issues with cluster creation and quota limitations. Here are some potential solutions: 1. **Increase your Azure quota:** The error message indicates that your Azure subscription does...

bento
by New Contributor
  • 336 Views
  • 1 replies
  • 0 kudos

Model serving is not available for trial workspaces. Please contact Databricks

Hi, as mentioned in the title, I'm getting this error when I try to use model serving, despite being on the premium plan. My trial account ends on 28th September 2023. Is there a way to use model serving immediately, or am I stuck until 28th September...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @bento, It's best to contact Databricks support or check the terms of service for your trial account to get accurate information.

horatiug
by New Contributor III
  • 786 Views
  • 3 replies
  • 2 kudos

Resolved! Changing GCP billing account

Hello, we need to change the billing account associated with our Databricks subscription. Is there any documentation available describing the procedure to be followed? Thanks, Horatiu

Latest Reply
Priyag1
Honored Contributor II
  • 2 kudos

Start by logging into the Google Cloud Platform. If you are a new user, you need to create an account before you subscribe to Databricks. Once in the console, start by selecting an existing Google Cloud project, or create a new project, and confirm ...

2 More Replies
horatiug
by New Contributor III
  • 313 Views
  • 2 replies
  • 0 kudos

Infrastructure question

We've noticed that the GKE worker nodes which are automatically created when a Databricks workspace is created inside a GCP project are using the default Compute Engine SA, which is not the best security approach; even Google doesn't recommend using defaul...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @horatiug, there is an option to avoid using the default service account when creating a Databricks workspace in a GCP project. You can create your workspaces in an existing customer-managed Virtual Private Cloud (VPC) that you make in your accoun...

1 More Reply
nramya
by New Contributor
  • 379 Views
  • 1 replies
  • 0 kudos

How do I add static tag values in the aws databricks-multi-workspace.template.yaml

Hello Team, I have a Databricks workspace running in an AWS environment. I have a requirement where the team wanted to add a few customized tags. As per the docs, I see below the recommendation: TagValue: Description: All new AWS objects get a tag with t...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @nramya, You can use custom variables to make your bundle settings files more modular and reusable. You can declare a variable that represents the ID of an existing cluster and then change that variable's value to different cluster IDs for various...
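A minimal sketch of the custom-variable approach described above, in a Databricks Asset Bundle `databricks.yml`. The variable name, cluster IDs, and target name are hypothetical placeholders:

```yaml
# databricks.yml (fragment) — declare a variable and override it per target.
# `my_cluster_id` and the ID values are hypothetical placeholders.
variables:
  my_cluster_id:
    description: ID of an existing cluster to run jobs on
    default: 1234-567890-abcde123

targets:
  dev:
    variables:
      my_cluster_id: 1234-567890-dev00000
```

Resources in the bundle can then reference the value as `${var.my_cluster_id}`.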

DatBoi
by Contributor
  • 1279 Views
  • 3 replies
  • 2 kudos

Resolved! Recreating Unity Catalog object through different environments

Hi all! I am working on a DevOps project to automate the creation of UC objects through different environments (dev-test-prod). Each time we deploy our code to a different environment (using a Github workflow, not really relevant) we want to also cre...

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @DatBoi, To copy or recreate a function from one UC location to another, you would need first to understand the definition and implementation of the original function and then replay that function in the new location using the CREATE FUNCTION SQL co...
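As a sketch of the CREATE FUNCTION approach mentioned above: the same DDL can be re-run against each environment's catalog from the deployment workflow. The catalog, schema, and function below are hypothetical placeholders:

```sql
-- Re-run this DDL per environment, swapping the catalog name (dev/test/prod).
CREATE OR REPLACE FUNCTION dev_catalog.utils.to_upper(s STRING)
  RETURNS STRING
  RETURN upper(s);
```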

2 More Replies
Geebigib
by New Contributor
  • 303 Views
  • 1 replies
  • 0 kudos

xgboost.spark.core' has no attribute 'SparkXGBClassifierModel'

I got the error: module 'xgboost.spark.core' has no attribute 'SparkXGBClassifierModel' when attempting to load a model. I have upgraded to xgboost-2.0.0.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Geebigib, The error you're experiencing is likely because the SparkXGBClassifierModel class is no longer available in the xgboost.spark.core module in xgboost-2.0.0. This could be due to changes or deprecations in the newer versions of the librar...

Rsa
by New Contributor II
  • 547 Views
  • 4 replies
  • 2 kudos

CI/CD pipeline using Github

Hi Team, I've recently begun working with Databricks and I'm exploring options for setting up a CI/CD pipeline to pull the latest code from GitHub. I have to pull the latest code (.sql) from GitHub whenever a push is done to the main branch and update the .sql notebo...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

FWIW: we pull manually, but it is possible to automate that without any cost if you use Azure DevOps. There is a free tier (depending on the number of pipelines/duration).
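A minimal sketch of what that Azure DevOps automation could look like: a pipeline triggered on pushes to main that imports the repo's .sql notebooks into a workspace folder via the (legacy) Databricks CLI. The paths and variable names are hypothetical placeholders:

```yaml
# azure-pipelines.yml (sketch) — run on pushes to main, push ./sql to the workspace.
# DATABRICKS_HOST / DATABRICKS_TOKEN are assumed to be pipeline secrets.
trigger:
  branches:
    include:
      - main

steps:
  - script: |
      pip install databricks-cli
      databricks workspace import_dir --overwrite ./sql /Shared/sql
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```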

3 More Replies