Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

kfab
by New Contributor II
  • 6622 Views
  • 1 reply
  • 0 kudos

Serving GPU Endpoint, can't find CUDA

Hi everyone! I'm encountering an issue while trying to serve my model on a GPU endpoint. My model uses DeepSpeed, which needs to compile CUDA ops. I got the following error: "An error occurred while loading the model. CUDA_HOME does not exist, unable to compile CUDA op(...

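DeepSpeed JIT-compiles its CUDA ops at model load time and locates the toolkit through the CUDA_HOME (or CUDA_PATH) environment variable, so a serving container without the CUDA toolkit on its path fails exactly like this. A minimal sketch of that lookup order, useful for debugging the serving environment (the helper name and fallback paths are illustrative):

```python
import os
import shutil
from pathlib import Path

def find_cuda_home(env=None):
    """Return a best-guess CUDA installation path, or None.

    Mirrors the common lookup order: the CUDA_HOME / CUDA_PATH
    environment variables first, then the directory containing
    `nvcc` on PATH, then the conventional /usr/local/cuda symlink.
    """
    env = os.environ if env is None else env
    for var in ("CUDA_HOME", "CUDA_PATH"):
        if env.get(var):
            return env[var]
    nvcc = shutil.which("nvcc")
    if nvcc:
        # nvcc lives in <cuda_home>/bin/nvcc
        return str(Path(nvcc).resolve().parent.parent)
    default = Path("/usr/local/cuda")
    if default.exists():
        return str(default)
    return None
```

If this returns None inside the serving container, the environment likely ships only the CUDA runtime, not the full toolkit DeepSpeed needs for compilation; packaging the toolkit with the logged model environment, or precompiling the DeepSpeed ops, are the usual directions to investigate.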
philipkd
by New Contributor III
  • 15394 Views
  • 3 replies
  • 2 kudos

Resolved! Idle Databricks trial costs me $1/day on AWS

I created a 14-day trial account on Databricks.com and linked it to my AWS. I'm aware that DBUs are free for 14 days, but any AWS charges are my own. I created one workspace, and the CloudFormation was successful. I haven't used it for two days and t...

Latest Reply
dataguru
New Contributor II
  • 2 kudos

I also faced the same issue; not sure how to disable or limit the usage.

2 More Replies
haseeb2001
by New Contributor II
  • 1945 Views
  • 1 reply
  • 0 kudos

Feature Store with Spark Pipeline

Hi, I am using a Spark pipeline with the stages VectorAssembler, StandardScaler, StringIndexers, VectorAssembler, GBTClassifier, and then logging this pipeline using the Feature Store log_model function as follows: fe = FeatureStoreClient() // I have tried ...

hpicatto
by New Contributor III
  • 3341 Views
  • 5 replies
  • 2 kudos

Problem updating a one time run Job

I'm creating a series of runs using /api/2.1/jobs/runs/submit and wanted to add some tags for more control over cost and usage, but I noticed it's not an option. My first idea was using /api/2.1/jobs/update, but it returns that it doesn't have any...

Latest Reply
hpicatto
New Contributor III
  • 2 kudos

It could be, but I can still list the job permissions, so it's creating some kind of job... Is there a way of adding or updating tags on that job from the beginning?

4 More Replies
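One-time runs created through /api/2.1/jobs/runs/submit indeed expose no tags field, and /api/2.1/jobs/update only operates on named jobs. A workaround worth validating against the Jobs 2.1 docs is to create a regular job carrying a tags map via /api/2.1/jobs/create and trigger it with /api/2.1/jobs/run-now. A sketch of the two request bodies (job name, notebook path, and tag keys are illustrative):

```python
def job_create_payload(name, notebook_path, tags):
    """Build a /api/2.1/jobs/create request body carrying cost tags.

    Unlike one-time runs from /api/2.1/jobs/runs/submit, a job created
    this way keeps its `tags` map for cost and usage attribution.
    """
    return {
        "name": name,
        "tags": dict(tags),  # e.g. {"team": "analytics", "env": "dev"}
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
    }

def run_now_payload(job_id):
    """Build a /api/2.1/jobs/run-now request body for the created job."""
    return {"job_id": job_id}
```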
sanjay
by Valued Contributor II
  • 2384 Views
  • 0 replies
  • 0 kudos

Deploy mlflow model to Sagemaker

Hi, I am trying to deploy an MLflow model to SageMaker. My MLflow model is registered in Databricks. I followed the URL below to deploy, and it needs ECR for deployment. For ECR, either I can create a custom image and push it to ECR, or it's mentioned in the URL below to get...

Sanky
by New Contributor
  • 3265 Views
  • 0 replies
  • 0 kudos

SQL query on information_schema.tables via service principal

Hi, I have a simple Python notebook with the below code: query = "select table_catalog, table_schema, table_name from system.information_schema.tables where table_type!='VIEW' and table_catalog='TEST' and table_schema='TEST'" test = spark.sql(query) disp...

Labels: Get Started Discussions, information_schema, Service Principal, troubleshooting
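For reference, the query in the post can be parameterized like this (a sketch; note that a service principal only sees rows in system.information_schema.tables for objects it has privileges on, so an empty result does not necessarily mean the tables are missing):

```python
def tables_query(catalog, schema, exclude_views=True):
    """Compose the information_schema query from the post, with the
    catalog/schema filter values parameterized (values illustrative)."""
    where = [
        f"table_catalog = '{catalog}'",
        f"table_schema = '{schema}'",
    ]
    if exclude_views:
        where.append("table_type != 'VIEW'")
    return (
        "select table_catalog, table_schema, table_name "
        "from system.information_schema.tables "
        "where " + " and ".join(where)
    )
```

In a notebook, the rendered string would then be passed to spark.sql() as in the original post.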
arkiboys
by Contributor
  • 2668 Views
  • 1 reply
  • 1 kudos

reading databricks tables

Hello, currently I have created Databricks tables in the hive_metastore databases. To read these tables using a select * query inside a Databricks notebook, I have to make sure the Databricks cluster is started. The question is to do with reading the databr...

Latest Reply
arkiboys
Contributor
  • 1 kudos

thank you

lwoodward
by New Contributor II
  • 2913 Views
  • 1 reply
  • 0 kudos

Resolved! ETL Advice for Large Transactional Database

I have a SQL server transactional database on an EC2 instance, and an AWS Glue job that pulls full tables in parquet files into an S3 bucket. There is a very large table that has 44 million rows, and records are added, updated and deleted from this t...

Latest Reply
ScottSmithDB
Databricks Employee
  • 0 kudos

If you have a CDC stream capability, you can use the APPLY CHANGES INTO API to perform SCD1 or SCD2 in a Delta Lake table in Databricks. You can find more information here. This is the best way to go if CDC is a possibility. If you do not have a CD...

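When CDC is not available and you only get periodic extracts, the usual fallback to APPLY CHANGES INTO is a Delta Lake MERGE that upserts the staged batch into the target table (SCD1). A minimal sketch that renders such a statement (table and column names are placeholders):

```python
def scd1_merge_sql(target, source, keys, columns):
    """Render a Delta Lake MERGE that upserts (SCD1) a staged batch
    into the target table, keyed on the business key columns."""
    on = " and ".join(f"t.{k} = s.{k}" for k in keys)
    sets = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join(keys + columns)
    vals = ", ".join(f"s.{c}" for c in keys + columns)
    return (
        f"merge into {target} t using {source} s on {on} "
        f"when matched then update set {sets} "
        f"when not matched then insert ({cols}) values ({vals})"
    )
```

If the extract carries an operation column, deletes can be folded in with an extra `when matched and s.op = 'D' then delete` clause before the update branch.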
Nagasundaram
by New Contributor II
  • 9812 Views
  • 2 replies
  • 0 kudos

Connect to Databricks using Java SDK through proxy

I'm trying to connect to Databricks from Java using the Java SDK and get cluster/SQL warehouse state. I'm able to connect and get cluster state from my local machine, but once I deploy it to the server, my company's network is not allowing the connection. We...

Latest Reply
Allia
Databricks Employee
  • 0 kudos

Hi @Nagasundaram, you can make use of the below init script in order to use a proxy server with a Databricks cluster. The content of the init script can be added at "Workspace/shared/setproxy.sh" ================================================== v...

1 More Replies
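The general shape of such an init script is to export the standard proxy environment variables so clients on the cluster pick them up. A sketch that renders one (the profile.d path and the variable set are assumptions; match them to what your proxy and SDK actually honor):

```python
def proxy_init_script(proxy_url, no_proxy=("localhost", "127.0.0.1")):
    """Render a cluster init script exporting proxy variables in both
    lower- and upper-case forms, which most JVM and Python clients
    read. Adjust the target file and bypass list for your environment."""
    lines = ["#!/bin/bash"]
    for var in ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY"):
        lines.append(
            f'echo "export {var}={proxy_url}" >> /etc/profile.d/proxy.sh'
        )
    lines.append(
        'echo "export NO_PROXY={0}" >> /etc/profile.d/proxy.sh'.format(
            ",".join(no_proxy)
        )
    )
    return "\n".join(lines)
```

For the Java SDK specifically, it may also be worth checking whether the JVM proxy system properties (https.proxyHost / https.proxyPort) need to be set on the application side rather than, or in addition to, the cluster.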
Surajv
by New Contributor III
  • 1551 Views
  • 1 reply
  • 0 kudos

Can I use Databricks service principals on Databricks Connect 12.2?

Hi community, is it possible to use Databricks service principals for authentication on Databricks Connect 12.2 to connect my notebook or code to Databricks compute, rather than using a personal access token? I checked the docs and got to know that upgr...

Latest Reply
Surajv
New Contributor III
  • 0 kudos

Hi @Retired_mod, thanks for your response. I was able to generate the token of the service principal following this doc, later saved it in the <Databricks Token> variable prompted when running the databricks-connect configure command in the terminal, and was a...

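Under the hood, service principal authentication is OAuth machine-to-machine: the client id and secret are exchanged for a short-lived token at the workspace's /oidc/v1/token endpoint using the client_credentials grant. A sketch of that request (the all-apis scope follows the Databricks OAuth docs; verify the details against your workspace):

```python
from urllib.parse import urlencode

def m2m_token_request(host, client_id, client_secret):
    """Build the client-credentials token request for Databricks OAuth
    M2M auth. Returns (url, basic_auth, form_body); POST the body with
    Content-Type application/x-www-form-urlencoded and HTTP basic auth,
    per the OAuth 2.0 client_credentials flow."""
    url = f"{host.rstrip('/')}/oidc/v1/token"
    auth = (client_id, client_secret)
    body = urlencode({"grant_type": "client_credentials", "scope": "all-apis"})
    return url, auth, body
```

The databricks-sdk can also do this exchange for you when configured with host, client_id, and client_secret, which avoids handling raw tokens at all.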
samarth_solanki
by New Contributor II
  • 10883 Views
  • 1 reply
  • 0 kudos

How to add instance profile permissions to all users via the databricks-sdk WorkspaceClient

How to add instance profile permissions to all users via the databricks-sdk WorkspaceClient? Just like Terraform, where we can give "users" for all users, how can we do the same using the databricks-sdk WorkspaceClient? I cannot find permissions for instance pro...

Madalian
by New Contributor III
  • 1077 Views
  • 0 replies
  • 0 kudos

How managed tables are useful in the Medallion Architecture

I have a basic question. Managed tables don't store their data in ADLS Gen2, but in our architecture we created 3 containers in ADLS Gen2 (Bronze, Silver and Gold). If I choose managed tables, then neither metadata nor data is stored in ...

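With Unity Catalog, managed tables do live in cloud storage: they land in the metastore's default location unless a narrower one is configured. To keep each medallion layer's managed tables in its existing ADLS Gen2 container, a MANAGED LOCATION can be set per schema. A sketch rendering that DDL (storage account and container names are placeholders):

```python
def medallion_schema_ddl(catalog, account):
    """Render CREATE SCHEMA statements that pin each medallion layer's
    managed tables to its own ADLS Gen2 container (names illustrative)."""
    ddl = []
    for layer in ("bronze", "silver", "gold"):
        location = f"abfss://{layer}@{account}.dfs.core.windows.net/managed"
        ddl.append(
            f"create schema if not exists {catalog}.{layer} "
            f"managed location '{location}'"
        )
    return ddl
```

With this in place, tables created without a LOCATION clause in each schema are still managed tables, yet their files sit in the matching Bronze/Silver/Gold container.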
Phani1
by Databricks MVP
  • 10941 Views
  • 2 replies
  • 1 kudos

UCX Installation

We aim to streamline the UCX installation process by utilizing the Databricks CLI and automating the manual input of required details at each question level. Could you please guide us on how we can automate the parameters during installation? Wha...

Latest Reply
Phani1
Databricks MVP
  • 1 kudos

Hi team, we don't see an option at the UCX command level to pass parameters as a JSON/config file. Could you please help us in this case with how we can automate the installation?

1 More Replies
nidhin
by New Contributor
  • 1244 Views
  • 1 reply
  • 0 kudos

Error Handling for Web Data Retrieval and Storage in Databricks Unity Clusters

The following code works well in a normal Databricks cluster, where it passes a null JSON and retrieves content from the web link. However, in a Unity cluster, it produces the following error: 'FileNotFoundError: [Errno 2] No such file or directory: ...

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 0 kudos

Hi @nidhin, good day! The reason behind the below error while trying to access the external DBFS mount file using "with open" is that you are using a shared access mode cluster. 'FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/mnt/ra...

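On shared access mode clusters, POSIX-style reads of mount points through /dbfs with open() are restricted; the common workarounds are Unity Catalog volumes, or going through dbutils.fs / Spark readers with the dbfs:/ URI form instead. A small helper for that path translation (a sketch):

```python
def to_dbfs_uri(path):
    """Translate a POSIX-style /dbfs path (used with open()) into the
    dbfs:/ URI form accepted by dbutils.fs and Spark readers."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    if path.startswith("dbfs:/"):
        return path
    raise ValueError(f"not a DBFS path: {path}")
```

For example, a file previously read with open('/dbfs/mnt/raw/a.json') could instead be fetched via dbutils.fs.head(to_dbfs_uri('/dbfs/mnt/raw/a.json')) or spark.read.json on the translated URI.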
