Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

BillGuyTheScien
by New Contributor II
  • 887 Views
  • 2 replies
  • 0 kudos

How do committed-use discounts work?

How do committed-use discounts work for Databricks? Do I purchase a chunk of DBUs for a flat fee and then draw down on them until exhausted? Or am I purchasing a % discount on all DBUs I use until the time period ends? In either case, is this reflec...

Latest Reply
BillGuyTheScien
New Contributor II
  • 0 kudos

Thanks @Kaniz_Fatma, that helps! What is the commitment term? One month? One year?

1 More Replies
VANNGA
by New Contributor II
  • 2950 Views
  • 3 replies
  • 1 kudos

POC

Hi, I wonder if you could help me with the below, please. We tried the Databricks Data Intelligence Platform for one of our clients and found that it's very expensive compared to AWS EMR. I understand it's not an apples-to-apples comparison, as one being platform...

Latest Reply
VANNGA
New Contributor II
  • 1 kudos

Hi @Kaniz_Fatma Thanks for getting back with such valuable information.
System | File size | Duration | System | Duration | Comments
EMR | 225 GB | 22 mins | Databricks | 63 mins | EMR is cheaper than Databricks by 5 times. This involves various S3 writes with m5d4xlarge.
EMR | 225...

2 More Replies
philipkd
by New Contributor III
  • 2800 Views
  • 4 replies
  • 2 kudos

Resolved! Idle Databricks trial costs me $1/day on AWS

I created a 14-day trial account on Databricks.com and linked it to my AWS. I'm aware that DBUs are free for 14 days, but any AWS charges are my own. I created one workspace, and the CloudFormation was successful. I haven't used it for two days and t...

Latest Reply
dataguru
New Contributor II
  • 2 kudos

I also faced the same issue; not sure how to disable or limit the usage.

3 More Replies
haseeb2001
by New Contributor II
  • 695 Views
  • 2 replies
  • 0 kudos

Feature Store with Spark Pipeline

Hi, I am using a Spark pipeline with the stages VectorAssembler, StandardScaler, StringIndexers, VectorAssembler, GBTClassifier, and then logging this pipeline using the Feature Store log_model function as follows: fe = FeatureStoreClient() // I have tried ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @haseeb2001, It seems you’re encountering an issue after logging your Spark pipeline model using the fe.log_model function in MLflow. Let’s break down the steps and address the error: Pipeline Stages: You’ve mentioned several stages in your S...

1 More Replies
hpicatto
by New Contributor III
  • 1475 Views
  • 5 replies
  • 2 kudos

Problem updating a one-time run job

I'm creating a series of runs using /api/2.1/jobs/runs/submit. I wanted to add some tags for better control over cost and usage, but I noticed it's not an option. My first idea was using /api/2.1/jobs/update, but it returns that it doesn't have any...

Latest Reply
hpicatto
New Contributor III
  • 2 kudos

It could be, but I can still list the job permissions, so it's creating some kind of job... Is there a way of adding tags to that job from the beginning, or updating them afterwards?
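One common pattern (sketched below; the runtime version and node type are placeholder assumptions, and you should verify field names against the Jobs 2.1 API docs for your workspace) is to attach cost tags to the ephemeral cluster in the runs/submit payload itself via new_cluster.custom_tags, since a one-time run has no persistent job object to update afterwards:

```python
# Hedged sketch: tags for a one-time run are typically carried on the
# run's ephemeral cluster (new_cluster.custom_tags), which propagate to
# billing records. spark_version and node_type_id below are assumptions.
import json

def build_submit_payload(run_name, notebook_path, cost_tags):
    """Build a /api/2.1/jobs/runs/submit payload with cluster cost tags."""
    return {
        "run_name": run_name,
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",  # assumed runtime
                "node_type_id": "i3.xlarge",          # assumed node type
                "num_workers": 2,
                "custom_tags": cost_tags,  # tags land on the cluster for cost tracking
            },
        }],
    }

payload = build_submit_payload(
    "nightly-run", "/Workspace/etl/main", {"team": "data-eng", "env": "dev"}
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to /api/2.1/jobs/runs/submit with your usual REST client; tags set this way appear on the cluster rather than on a job object.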

4 More Replies
Surajv
by New Contributor III
  • 513 Views
  • 2 replies
  • 0 kudos

Can I use Databricks service principals with Databricks Connect 12.2?

Hi community, Is it possible to use Databricks service principals for authentication in Databricks Connect 12.2 to connect my notebook or code to Databricks compute, rather than using a personal access token? I checked the docs and got to know that upgr...

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Hi @Kaniz_Fatma, Thanks for your response. I was able to generate the token of the service principal following this doc, and later saved it in the <Databricks Token> variable prompted when running the databricks-connect configure command in the terminal. And was a...
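For reference, the token generation step above can be sketched as a standard OAuth client-credentials exchange. The /oidc/v1/token endpoint and "all-apis" scope reflect the Databricks OAuth machine-to-machine flow as I understand it; the workspace URL and credentials are placeholders, so verify against the current authentication docs:

```python
# Hedged sketch of the OAuth M2M token exchange a service principal can use
# in place of a personal access token. Endpoint and scope are assumptions
# based on the documented client-credentials flow.
import base64

def build_token_request(workspace_url, client_id, client_secret):
    """Return (url, headers, form_data) for a client-credentials token request."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return (
        f"{workspace_url}/oidc/v1/token",
        {"Authorization": f"Basic {creds}",
         "Content-Type": "application/x-www-form-urlencoded"},
        {"grant_type": "client_credentials", "scope": "all-apis"},
    )

url, headers, data = build_token_request(
    "https://example.cloud.databricks.com", "my-client-id", "my-secret"
)
# POST with e.g. requests.post(url, headers=headers, data=data); the
# "access_token" field of the JSON response is what databricks-connect
# configure can consume where a PAT is expected.
```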

1 More Replies
Phani1
by Valued Contributor
  • 4932 Views
  • 4 replies
  • 1 kudos

UCX Installation

We aim to streamline the UCX installation process by utilizing the Databricks CLI and automating the manual input of required details at each question. Could you please guide us on how to automate the parameters during installation? Wha...

Latest Reply
Phani1
Valued Contributor
  • 1 kudos

Hi Team, We don't see an option at the UCX command level to pass parameters as a JSON/config file. Could you please help us automate the installation in this case?

3 More Replies
nidhin
by New Contributor
  • 326 Views
  • 1 replies
  • 0 kudos

Error handling for web data retrieval and storage in Databricks Unity clusters

The following code works well in a normal Databricks cluster, where it passes a null JSON and retrieves content from the web link. However, in a Unity cluster, it produces the following error: 'FileNotFoundError: [Errno 2] No such file or directory: ...

Latest Reply
Ayushi_Suthar
Honored Contributor
  • 0 kudos

Hi @nidhin, Good day! The reason behind the below error when trying to access the external DBFS mount file using "with open" is that you are using a shared access mode cluster. 'FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/mnt/ra...
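The usual workaround for this failure mode is to stop going through the /dbfs FUSE mount (which plain Python file APIs cannot reach on shared access mode clusters) and address the same file through Spark or dbutils with a dbfs:/ URI, or move the data to a Unity Catalog volume. A minimal sketch of the path rewrite, with the dbutils/spark calls only indicated in comments since they need a cluster:

```python
# Minimal sketch: on shared access mode clusters, open("/dbfs/mnt/...")
# raises FileNotFoundError because the FUSE mount is unavailable. Rewriting
# the path to its dbfs:/ URI form lets dbutils.fs or Spark readers reach it.

def to_dbfs_uri(fuse_path: str) -> str:
    """Translate a /dbfs FUSE path into the equivalent dbfs:/ URI."""
    prefix = "/dbfs/"
    if not fuse_path.startswith(prefix):
        raise ValueError(f"not a /dbfs FUSE path: {fuse_path}")
    return "dbfs:/" + fuse_path[len(prefix):]

uri = to_dbfs_uri("/dbfs/mnt/raw/response.json")
print(uri)  # dbfs:/mnt/raw/response.json
# On a cluster: text = dbutils.fs.head(uri)  or  df = spark.read.json(uri)
```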

Phani1
by Valued Contributor
  • 1779 Views
  • 1 replies
  • 0 kudos

UCX Installation without CLI

Hi Team, Can we install the UCX toolkit into a Databricks workspace without installing it via the Databricks CLI? If it's possible, then how? https://github.com/databrickslabs/ucx

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Phani1, The UCX Toolkit is a powerful tool for upgrading your Databricks workspace to Unity Catalog. It helps you migrate various assets within your workspace, including: Legacy Table ACLs, Entitlements, AWS instance profiles, Clusters and cluster p...

Phani1
by Valued Contributor
  • 3788 Views
  • 1 replies
  • 1 kudos

Resolved! SAP SuccessFactors

Hi Team, We are working on onboarding a new Data Product onto the current Databricks Lakehouse Platform. The first step is the foundation, where we should get data from SAP SuccessFactors to the S3 + Bronze layer and then do the initial setup of Lakehouse + Power B...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Phani1, Retrieving data from SAP SuccessFactors and storing it in your Databricks Lakehouse Platform involves several considerations.    Let’s break it down step by step:   Data Extraction from SAP SuccessFactors: You’ve correctly identified that...

Phani1
by Valued Contributor
  • 1053 Views
  • 1 replies
  • 1 kudos

Cost finding and optimization

Hi Team, Could you please suggest the best way to track the cost of Databricks objects/components? Could you please share any best practices for optimizing costs and conducting detailed cost analysis? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Phani1, Understand Databricks Units (DBUs): A DBU (Databricks Unit) is the fundamental unit of consumption within the Databricks platform. DBUs are based on the number of nodes and the computational power of the VM instance types in your clusters...
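One concrete way to do the cost analysis asked about above is to query the billable-usage system table and group by SKU or tag. The table and column names below (system.billing.usage, sku_name, usage_quantity, usage_date) reflect the system-tables schema as I understand it; confirm they exist in your workspace before building a dashboard on them:

```python
# Hedged sketch: summarize DBU consumption by SKU over a recent window using
# the billing system table. Run the returned SQL with spark.sql on a cluster.

def usage_by_sku_query(days: int = 30) -> str:
    """Build a SQL query summing DBU usage by SKU over the last N days."""
    return f"""
        SELECT sku_name,
               SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE usage_date >= date_sub(current_date(), {days})
        GROUP BY sku_name
        ORDER BY dbus DESC
    """

print(usage_by_sku_query(7))
# On a cluster: spark.sql(usage_by_sku_query(7)).display()
```

Joining on custom_tags (if populated on your clusters) extends the same query to per-team or per-project cost attribution.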

Ramacloudworld
by New Contributor
  • 860 Views
  • 1 replies
  • 0 kudos

Cross join issue generating surrogate keys in a Delta table

I used the below code to populate the target table. It is working as expected, except for the surrogatekey column. After I inserted a dummy entry (-1) and ran the merge code, it is generating the numbers in the surrogatekey column like 1, 3, 5, 7, 9 (odd numbers); it ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Ramacloudworld, It seems you’re encountering an issue with the surrogate key generation in your merge code. Let’s break it down and address the problem. Surrogate Keys: A surrogate key is a system-assigned unique value used to identify an ent...
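The odd-number pattern described above is typical of Delta identity columns, which guarantee unique, increasing values but not consecutive ones, so gaps like 1, 3, 5 are expected behavior rather than a bug. If gap-free keys are required, one common workaround is to assign them explicitly as current max key plus a row number over the new rows. Sketched here in plain Python (the function name is illustrative; in PySpark this would be row_number() plus MAX(surrogatekey)):

```python
# Sketch of the "max key + row_number" pattern that yields gap-free
# surrogate keys, demonstrated with plain Python lists.

def assign_surrogate_keys(existing_max: int, new_rows: list) -> list:
    """Assign consecutive surrogate keys starting after the current max."""
    return [
        {**row, "surrogatekey": existing_max + i}
        for i, row in enumerate(new_rows, start=1)
    ]

rows = assign_surrogate_keys(4, [{"name": "a"}, {"name": "b"}])
print(rows)  # surrogatekey values 5 and 6
```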

Eric76
by New Contributor
  • 2374 Views
  • 1 replies
  • 0 kudos

Can the Community Edition of Databricks run model training examples?

Hi, newcomer here. I am experimenting with the Community Edition of Databricks. I wanted to run the notebook example provided here: https://community.cloud.databricks.com/?o=6085264701896358#notebook/2691200955149229 It failed because it cannot import th...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Eric76, Welcome to the Databricks community! Let’s address the issue you’re facing with importing data in your notebook. The example notebook you’re trying to run relies on a dataset located at /dbfs/databricks-datasets/wine-quality/winequalit...

Phani1
by Valued Contributor
  • 746 Views
  • 2 replies
  • 0 kudos

Capture changes at the object level in Databricks

Could you please suggest how to capture changes at the object level in Databricks, such as notebook changes, table DDL changes, view DDL, function DDL, and workflow changes? We would like to build a dashboard for these changes.
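For the table-DDL part of the question above, Delta's DESCRIBE HISTORY command exposes each table's change log (operation, timestamp, user), which can feed such a dashboard; notebook and workflow changes would instead come from audit logs. A minimal sketch (the helper only builds the SQL string; the table name is a placeholder, and the query would be run with spark.sql on a cluster):

```python
# Sketch assuming Delta tables: DESCRIBE HISTORY returns one row per change
# (version, timestamp, userName, operation, ...), suitable for a dashboard.

def history_query(table: str, limit: int = 20) -> str:
    """Build a DESCRIBE HISTORY query for a Delta table's recent changes."""
    return f"DESCRIBE HISTORY {table} LIMIT {limit}"

print(history_query("main.sales.orders", 10))
# On a cluster: spark.sql(history_query("main.sales.orders", 10)).display()
```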

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

1 More Replies
wellington
by New Contributor III
  • 656 Views
  • 2 replies
  • 0 kudos

Log notebook activities

Hi friends; I'm working on a project where we are 4 programmers. We are working in a single environment, using only the "Workspaces" folder. Each has their own user, which is managed by Azure AD. We had a peak in consumption on the 5th of Feb. So I can see ...

Latest Reply
wellington
New Contributor III
  • 0 kudos

Hi @Kaniz_Fatma, thanks for your quick answer. Is there no other way to monitor notebook runs? I ask this because adding tags to the cluster and workspace does not solve my problem, considering that everyone uses the same cluster and the same workspa...
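When tags cannot separate users on a shared cluster, per-user attribution can instead come from audit logs. The sketch below assumes the audit-log system table (system.access.audit) with notebook command events under service_name = 'notebook', which as I understand it requires verbose audit logs to be enabled; verify the table schema and action names in your workspace:

```python
# Hedged sketch: count notebook audit events per user since a given date.
# Column names (user_identity.email, action_name, event_date) are assumptions
# based on the documented audit-log system table schema.

def notebook_activity_query(since_date: str) -> str:
    """Build a SQL query counting notebook audit events per user."""
    return f"""
        SELECT user_identity.email AS user,
               action_name,
               COUNT(*) AS events
        FROM system.access.audit
        WHERE service_name = 'notebook'
          AND event_date >= '{since_date}'
        GROUP BY user_identity.email, action_name
        ORDER BY events DESC
    """

print(notebook_activity_query("2024-02-05"))
# On a cluster: spark.sql(notebook_activity_query("2024-02-05")).display()
```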

1 More Replies
