Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

VANNGA
by New Contributor II
  • 22944 Views
  • 2 replies
  • 0 kudos

POC

Hi, I wonder if you could help me on the below please. We tried the Databricks Data Intelligence Platform for one of our clients and found that it's very expensive when compared to AWS EMR. I understand it's not an apples-to-apples comparison, as one being platform...

Latest Reply
VANNGA
New Contributor II
  • 0 kudos

Hi @Retired_mod Thanks for getting back with such valuable information.

System | File size | Duration | System | Duration | Comments
EMR | 225 GB | 22 mins | Databricks | 63 mins | EMR is cheaper than Databricks by 5 times; this involves various S3 writes with m5d4xlarge
EMR | 225...

1 More Replies
markwilliam8506
by New Contributor
  • 1647 Views
  • 1 replies
  • 0 kudos

What causes the QB Won't Open issue and how to fix it?

What Can Be Causing the QB Won't Open Issue and How Can I Fix It? I need help immediately to fix this annoying issue! Has anybody else had such problems with QB refusing to open? My personal attempts at troubleshooting have yielded no results. I would be...

Latest Reply
kartanjohn29
New Contributor II
  • 0 kudos

@markwilliam8506 If your QB won't open even after multiple tries, you might be facing some common error messages. This scenario can be a result of damaged program files or a faulty installation process, among other possible reasons. The error message...

kfab
by New Contributor II
  • 6371 Views
  • 1 replies
  • 0 kudos

Serving GPU Endpoint, can't find CUDA

Hi everyone! I'm encountering an issue while trying to serve my model on a GPU endpoint. My model is using DeepSpeed, and I got the following error: "An error occurred while loading the model. CUDA_HOME does not exist, unable to compile CUDA op(...

philipkd
by New Contributor III
  • 15058 Views
  • 3 replies
  • 2 kudos

Resolved! Idle Databricks trial costs me $1/day on AWS

I created a 14-day trial account on Databricks.com and linked it to my AWS. I'm aware that DBUs are free for 14 days, but any AWS charges are my own. I created one workspace, and the CloudFormation was successful. I haven't used it for two days and t...

Latest Reply
dataguru
New Contributor II
  • 2 kudos

I also faced the same issue; not sure how to disable or limit the usage.

2 More Replies
haseeb2001
by New Contributor II
  • 1863 Views
  • 1 replies
  • 0 kudos

Feature Store with Spark Pipeline

Hi, I am using a Spark pipeline with the stages VectorAssembler, StandardScaler, StringIndexers, VectorAssembler, GBTClassifier, and then logging this pipeline using the Feature Store log_model function as follows: fe = FeatureStoreClient() // I have tried ...

image.png
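
For readers following the Feature Store question above, here is a minimal sketch of logging a Spark ML pipeline with the Feature Store client. The feature table, column names, and label DataFrame are illustrative assumptions, not the poster's actual setup.

```python
from databricks.feature_store import FeatureStoreClient, FeatureLookup
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler, StandardScaler
from pyspark.ml.classification import GBTClassifier
import mlflow

fe = FeatureStoreClient()

# Hypothetical feature table and label data (placeholders for illustration)
label_df = spark.table("main.default.labels")  # assumed to contain customer_id and label
feature_lookups = [FeatureLookup(table_name="main.default.customer_features",
                                 lookup_key="customer_id")]
training_set = fe.create_training_set(df=label_df,
                                      feature_lookups=feature_lookups,
                                      label="label")
train_df = training_set.load_df()

# Pipeline with the kinds of stages mentioned in the post
indexer = StringIndexer(inputCol="category", outputCol="category_idx")
assembler = VectorAssembler(inputCols=["category_idx", "amount"], outputCol="raw_features")
scaler = StandardScaler(inputCol="raw_features", outputCol="features")
gbt = GBTClassifier(featuresCol="features", labelCol="label")
model = Pipeline(stages=[indexer, assembler, scaler, gbt]).fit(train_df)

# Log the fitted PipelineModel with the Spark flavor so the feature lookup
# metadata is packaged alongside the model
fe.log_model(model,
             artifact_path="model",
             flavor=mlflow.spark,
             training_set=training_set,
             registered_model_name="gbt_pipeline_model")
```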
hpicatto
by New Contributor III
  • 3197 Views
  • 5 replies
  • 2 kudos

Problem updating a one-time run Job

I'm creating a series of runs using /api/2.1/jobs/runs/submit. I wanted to add some tags for more control over cost and usage, but I noticed it's not an option. My first idea was using /api/2.1/jobs/update, but it returns that it doesn't have any...

Latest Reply
hpicatto
New Contributor III
  • 2 kudos

It could be, but I can still list the job permissions, so it's creating some kind of job... Is there a way of adding tags from the beginning, or updating tags on that job?

4 More Replies
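
On the tagging question in this thread, one possible workaround (an assumption, not a confirmed answer from the thread) is to put custom_tags on the new_cluster spec of the one-time run submitted via /api/2.1/jobs/runs/submit, so the tags show up in cluster usage and cost reports. A minimal sketch, assuming DATABRICKS_HOST and DATABRICKS_TOKEN environment variables and a hypothetical notebook path:

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # personal access token (assumed)

payload = {
    "run_name": "one-time-run-with-tags",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Shared/example_notebook"},  # hypothetical path
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "m5d.4xlarge",
                "num_workers": 2,
                # custom_tags on the job cluster propagate to cost/usage reporting
                "custom_tags": {"team": "data-eng", "cost_center": "1234"},
            },
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # response contains the run_id of the submitted one-time run
```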
sanjay
by Valued Contributor II
  • 2309 Views
  • 0 replies
  • 0 kudos

Deploy MLflow model to SageMaker

Hi, I am trying to deploy an MLflow model to SageMaker. My MLflow model is registered in Databricks. I followed the URL below to deploy, and it needs ECR for deployment. For ECR, either I can create a custom image and push it to ECR, or it's mentioned in the URL below to get...

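
As a rough illustration of the deployment flow described above (not a confirmed answer to the post), MLflow's deployments API can create a SageMaker endpoint from a registered model once an inference image exists in ECR, for example one pushed with the mlflow sagemaker build-and-push-container CLI. The model URI, role ARN, image URL, and region below are placeholders:

```python
from mlflow.deployments import get_deploy_client

# SageMaker deployment client from the MLflow deployments API
client = get_deploy_client("sagemaker")

client.create_deployment(
    name="my-endpoint",                         # hypothetical endpoint name
    model_uri="models:/my_registered_model/1",  # placeholder registry URI
    config={
        "region_name": "us-east-1",
        "execution_role_arn": "arn:aws:iam::123456789012:role/sagemaker-role",            # placeholder
        "image_url": "123456789012.dkr.ecr.us-east-1.amazonaws.com/mlflow-pyfunc:latest",  # ECR image
        "instance_type": "ml.m5.xlarge",
        "instance_count": 1,
    },
)
```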
Sanky
by New Contributor
  • 3138 Views
  • 0 replies
  • 0 kudos

SQL query on information_schema.tables via service principal

Hi, I have a simple Python notebook with the below code: query = "select table_catalog, table_schema, table_name from system.information_schema.tables where table_type!='VIEW' and table_catalog='TEST' and table_schema='TEST'" test = spark.sql(query) disp...

Get Started Discussions
information_schema
Service Principal
troubleshooting
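
A common reason a query like the one above returns no rows when run as a service principal (an assumption about the cause, not confirmed in the post) is that information_schema only exposes objects the caller has privileges on. A minimal sketch of the query plus the grants an admin could issue so the TEST catalog becomes visible; the service principal's application ID is a placeholder:

```python
# Runs in a Databricks notebook, where `spark` and `display` are provided by the runtime
query = """
    SELECT table_catalog, table_schema, table_name
    FROM system.information_schema.tables
    WHERE table_type != 'VIEW'
      AND table_catalog = 'TEST'
      AND table_schema = 'TEST'
"""
display(spark.sql(query))

# As a catalog owner or metastore admin, grant the service principal access so the
# rows become visible in information_schema (application ID is a placeholder)
spark.sql("GRANT USE CATALOG ON CATALOG TEST TO `00000000-0000-0000-0000-000000000000`")
spark.sql("GRANT USE SCHEMA ON SCHEMA TEST.TEST TO `00000000-0000-0000-0000-000000000000`")
spark.sql("GRANT SELECT ON SCHEMA TEST.TEST TO `00000000-0000-0000-0000-000000000000`")
```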
arkiboys
by Contributor
  • 2494 Views
  • 1 replies
  • 1 kudos

reading databricks tables

Hello, currently I have created Databricks tables in the hive_metastore databases. To read these tables using a select * query inside a Databricks notebook, I have to make sure the Databricks cluster is started. My question is to do with reading the databr...

Latest Reply
arkiboys
Contributor
  • 1 kudos

thank you

lwoodward
by New Contributor II
  • 2692 Views
  • 1 replies
  • 0 kudos

Resolved! ETL Advice for Large Transactional Database

I have a SQL server transactional database on an EC2 instance, and an AWS Glue job that pulls full tables in parquet files into an S3 bucket. There is a very large table that has 44 million rows, and records are added, updated and deleted from this t...

Latest Reply
ScottSmithDB
Databricks Employee
  • 0 kudos

If you have a CDC stream capability, you can use the APPLY CHANGES INTO API to perform SCD1 or SCD2 into a Delta Lake table in Databricks. You can find more information here. This is the best way to go if CDC is a possibility. If you do not have a CD...

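
To make the APPLY CHANGES suggestion above concrete, here is a minimal Delta Live Tables sketch; the source path, table names, key, sequence column, and delete marker are assumptions for illustration:

```python
import dlt
from pyspark.sql.functions import col, expr

# Hypothetical CDC feed landed in cloud storage (e.g. from the Glue/S3 pipeline)
@dlt.view
def orders_cdc():
    return (spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "parquet")
            .load("s3://my-bucket/cdc/orders/"))  # placeholder path

dlt.create_streaming_table("orders_scd2")

# APPLY CHANGES INTO: keys identify the record, sequence_by orders the changes,
# stored_as_scd_type=2 keeps history rows instead of overwriting in place
dlt.apply_changes(
    target="orders_scd2",
    source="orders_cdc",
    keys=["order_id"],
    sequence_by=col("updated_at"),
    apply_as_deletes=expr("op = 'DELETE'"),  # assumed CDC operation column
    stored_as_scd_type=2,
)
```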
Nagasundaram
by New Contributor II
  • 9614 Views
  • 2 replies
  • 0 kudos

Connect to Databricks using Java SDK through proxy

I'm trying to connect to Databricks from Java using the Java SDK and get the cluster/SQL warehouse state. I'm able to connect and get the cluster state from my local machine, but once I deploy it to the server, my company's network is not allowing the connection. We...

Latest Reply
Allia
Databricks Employee
  • 0 kudos

Hi @Nagasundaram You can make use of the below init script in order to use a proxy server with a Databricks cluster. The content of the init script can be added at "Workspace/shared/setproxy.sh": ================================================== v...

1 More Replies
Surajv
by New Contributor III
  • 1499 Views
  • 1 replies
  • 0 kudos

Can I use Databricks service principals on Databricks Connect 12.2?

Hi community, is it possible to use Databricks service principals for authentication on Databricks Connect 12.2 to connect my notebook or code to Databricks compute, rather than using a personal access token? I checked the docs and got to know that upgr...

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Hi @Retired_mod Thanks for your response. I was able to generate a token for the service principal following this doc, then saved it in the <Databricks Token> variable prompted when running the databricks-connect configure command in the terminal, and was a...

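
Related to the reply above, one way to mint a token on behalf of a service principal (an assumption, not necessarily the doc the poster followed) is the workspace token-management API in the Databricks Python SDK; the resulting token_value can then be pasted into the databricks-connect configure prompt. The application ID and lifetime below are placeholders:

```python
from databricks.sdk import WorkspaceClient

# Authenticates from the environment / .databrickscfg of a workspace admin
w = WorkspaceClient()

# Create an on-behalf-of token for the service principal (application ID is a placeholder)
obo = w.token_management.create_obo_token(
    application_id="11111111-2222-3333-4444-555555555555",
    lifetime_seconds=3600,
    comment="token for databricks-connect",
)

# Supply this value when `databricks-connect configure` asks for the Databricks Token
print(obo.token_value)
```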
samarth_solanki
by New Contributor II
  • 10789 Views
  • 1 replies
  • 0 kudos

How to add instance profile permissions for all users via the databricks-sdk workspace client

How to add instance profile permissions for all users via the databricks-sdk workspace client? Just like Terraform, where we can specify "users" for all users, how can we do the same using the databricks-sdk WorkspaceClient? I cannot find permission for instance pro...

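
For the question above, there does not appear to be a dedicated instance-profile permissions call in the Python SDK; one hedged approach (an assumption mirroring what Terraform's group instance-profile resource does via SCIM) is to patch the built-in "users" group and add the instance profile ARN as a role:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import iam

w = WorkspaceClient()

# Look up the built-in "users" group, which contains all workspace users
users_group = next(iter(w.groups.list(filter='displayName eq "users"')))

# Instance profile ARN is a placeholder; it must already be registered in the workspace
arn = "arn:aws:iam::123456789012:instance-profile/my-profile"

# SCIM patch: add the instance profile as a role on the group, similar to
# Terraform's databricks_group_instance_profile with group = "users"
w.groups.patch(
    id=users_group.id,
    operations=[iam.Patch(op=iam.PatchOp.ADD, path="roles", value=[{"value": arn}])],
    schemas=[iam.PatchSchema.URN_IETF_PARAMS_SCIM_API_MESSAGES_2_0_PATCH_OP],
)
```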
