Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

dataailearner
by New Contributor II
  • 166 Views
  • 1 reply
  • 0 kudos

Cluster Auto Termination Best Practices

Are there any recommended practices to set cluster auto termination for cost optimization?

Latest Reply
Ravivarma
New Contributor III
  • 0 kudos

Hello @dataailearner , Greetings of the day! Here are a few steps that you can follow for cost optimizations: 1. Choose the most efficient compute size: Databricks runs one executor per worker node. The total number of cores across all executors is a...
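
The sizing advice above pairs well with setting `autotermination_minutes` when the cluster is created. Below is a minimal sketch against the Clusters API (`POST /api/2.0/clusters/create`); the host, token, runtime version, and node type are placeholders, not recommendations:

```python
import json
import urllib.request

def build_cluster_spec(name, minutes=30):
    """Build a cluster spec with auto-termination enabled.

    Databricks terminates the cluster after `minutes` of inactivity;
    10-60 minutes is a common range for interactive clusters.
    """
    return {
        "cluster_name": name,
        "spark_version": "14.3.x-scala2.12",  # example runtime version
        "node_type_id": "Standard_DS3_v2",    # example Azure node type
        "num_workers": 2,
        "autotermination_minutes": minutes,   # 0 disables auto-termination
    }

def create_cluster(host, token, spec):
    # POST the spec to /api/2.0/clusters/create with a bearer token.
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/create",
        data=json.dumps(spec).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same `autotermination_minutes` field can also be set from the cluster UI or in a cluster policy to enforce it workspace-wide.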

vaidhaicha
by New Contributor II
  • 272 Views
  • 2 replies
  • 0 kudos

Databricks Custom model Serving endpoint Failing

Hello all, I have created a custom model serving endpoint in Azure Databricks. This endpoint connects to an Azure OpenAI model and an Azure Postgres connection. All of these Azure services use Private Endpoints. When I run this notebook, I am able ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @vaidhaicha, It sounds like you’re encountering issues with your custom model serving endpoint in Azure Databricks, specifically when querying through the serving endpoint using your Personal Access Token (PAT). A private endpoint is a network ...

1 More Replies
pjv
by New Contributor III
  • 199 Views
  • 2 replies
  • 0 kudos

VSCode Databricks Extension

Hi all, I've been trying to sync my VSCode IDE with our Databricks GCP workspace using the Databricks extension. I am able to connect, authenticate my account and workspace, and find our clusters. However, when I try to sync a destination it throws a st...

Latest Reply
pjv
New Contributor III
  • 0 kudos

@Kaniz_Fatma thanks for your response. I am not running through a proxy. At least, not on purpose. How do I know if I am running through a proxy? And where can I find the values of <proxy_url> and <port> so that I can try restarting my VSCode. I have tr...

1 More Replies
User1234
by New Contributor II
  • 1750 Views
  • 4 replies
  • 2 kudos

Cluster compute metrics

I want to fetch compute metrics (hardware, GPU and Spark) and use them in a dashboard on Databricks; however, I'm not able to fetch them. I have tried GET API requests and system tables. The system tables only have CPU utilization and memory utili...

Tags: Community Discussions, cluster, compute, metrics
Latest Reply
BlakeWood
New Contributor II
  • 2 kudos

Replying for the updates.
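
For the metrics question above, the system tables can be queried directly from a notebook. A hedged sketch follows; the table and column names (`system.compute.node_timeline`, `cpu_user_percent`, `mem_used_percent`) are assumptions based on the Databricks system-tables preview, so verify them in your workspace before building a dashboard on top:

```python
def node_metrics_query(cluster_id, hours=24):
    """Build a SQL query over the per-node compute metrics system table.

    Table and column names are assumed from the Databricks system-tables
    preview (system.compute.node_timeline); check your workspace.
    """
    return f"""
        SELECT start_time, instance_id,
               cpu_user_percent, cpu_system_percent, mem_used_percent
        FROM system.compute.node_timeline
        WHERE cluster_id = '{cluster_id}'
          AND start_time >= now() - INTERVAL {hours} HOURS
        ORDER BY start_time
    """

# In a notebook: df = spark.sql(node_metrics_query("<your-cluster-id>"))
```

The resulting DataFrame can feed a Lakeview/DBSQL dashboard; GPU metrics may not be covered by these tables and may still require the cluster metrics UI.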

3 More Replies
Neelofer
by New Contributor
  • 136 Views
  • 1 reply
  • 0 kudos

Fetching CPU and memory data using REST APIs

Hi, I am trying to fetch CPU and memory details from Databricks. Are there any APIs I can connect to using Postman to fetch these details?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Neelofer, To fetch CPU and memory details from Databricks, you can utilize the Databricks REST APIs. While it doesn’t provide real-time data, the Spark UI offers information related to Spark-specific performance, such as job and task breakdo...
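
As a Postman-equivalent sketch of what the reply describes: the Clusters API reports cluster configuration (node types, worker counts, state), though not live utilization, which comes from system tables or the compute metrics UI. Host, token, and cluster ID are placeholders:

```python
import json
import urllib.request

def get_cluster(host, token, cluster_id):
    """Fetch cluster details via GET /api/2.0/clusters/get.

    Returns configuration (node type, workers, state) rather than
    live CPU/memory utilization.
    """
    url = f"{host}/api/2.0/clusters/get?cluster_id={cluster_id}"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def summarize(cluster_info):
    # Pull out the fields most dashboards care about.
    return {"state": cluster_info.get("state"),
            "node_type": cluster_info.get("node_type_id"),
            "workers": cluster_info.get("num_workers")}
```

In Postman, the same call is a GET to `<host>/api/2.0/clusters/get?cluster_id=...` with an `Authorization: Bearer <token>` header.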

unity_Catalog
by New Contributor II
  • 419 Views
  • 1 reply
  • 0 kudos

UCX Installation Error

While downloading and installing ucx from a shell script, I am facing the error below. Can anyone provide a solution? [i] Creating isolated Virtualenv with Python: /c/Program Files/Python312/python. Actual environment location may have moved due to redirect...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @unity_Catalog, The error message suggests that you need to install a Python package. To do this, open your terminal or command prompt and run: pip install databricks. This should resolve the missing module issue. Ensure that you are in the correc...

Gerlex
by New Contributor
  • 176 Views
  • 1 reply
  • 1 kudos

Restore committed changes to Azure Databricks Git after abandoned pull request

I want to restore the committed changes (before and after view) in my branch. Since this pull request was abandoned in Azure DevOps, the branch was not merged. The modified notebooks still exist, but the commits do not. How can I retrieve aga...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Gerlex, If you want to restore the committed changes from an abandoned pull request in Azure DevOps, follow these steps: Navigate to the Azure DevOps portal and open the abandoned pull request. Check the Changes tab to review the modifications ...

Apoorva2
by New Contributor
  • 175 Views
  • 1 reply
  • 1 kudos

ipywidgets not loading in DataBricks Community Edition

Hi All, I am trying to run commands with ipywidgets but it just says: Loading widget... The same error occurs even when I re-run the cell. Databricks version used: 14.2

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Apoorva2,  Make sure you’re using the latest version of the ipywidgets library. Ensure that the .observe function for the dropdown (if you’re using one) is correctly set up. Verify that it’s calling the on_select function when the dropdown value ...

86conventional
by New Contributor
  • 111 Views
  • 1 reply
  • 0 kudos

Error performing a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.s

I'm trying to perform a merge inside a streaming foreachBatch using the command: microBatchDF._jdf.sparkSession().sql(self.merge_query)
Tags: Streaming

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @86conventional, For clusters with Databricks Runtime version 10.5 and above, you can access the local Spark session within the foreachBatch method. If you encounter any further issues, feel free to ask for more assistance! For additional referen...
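
A sketch of that DBR 10.5+ pattern: the batch DataFrame's own `sparkSession` property replaces the private `microBatchDF._jdf.sparkSession()` handle. Table, view, and column names below are placeholders:

```python
def build_merge_query(target, source_view, key):
    """Compose a Delta MERGE statement; names are placeholders."""
    return (
        f"MERGE INTO {target} t "
        f"USING {source_view} s ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

# Runs once per micro-batch inside foreachBatch (DBR 10.5+):
def upsert_batch(micro_batch_df, batch_id):
    # Register the micro-batch under a hypothetical temp-view name...
    micro_batch_df.createOrReplaceTempView("updates")
    # ...then merge using the batch-local session, not _jdf internals.
    micro_batch_df.sparkSession.sql(
        build_merge_query("main.sales.orders", "updates", "order_id"))

# Wiring it up:
# stream.writeStream.foreachBatch(upsert_batch).start()
```

Using `micro_batch_df.sparkSession` keeps the temp view scoped to the batch's session, which avoids cross-query collisions that the `_jdf` workaround can hit.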

86conventional
by New Contributor
  • 124 Views
  • 1 reply
  • 0 kudos

Is there any specific error you are receiving when running the init script? Does the run complete st

Is there any specific error you are receiving when running the init script? Does the run complete start up or fail due to the init script?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @86conventional, If the init script fails, check the script's content for any syntax errors or missing dependencies. Verify that the script is accessible from the Databricks cluster. You can store the script in Azure Blob Storage or a DBFS locati...

hkadhao
by New Contributor
  • 554 Views
  • 1 reply
  • 0 kudos

Spot label in pool even though the configuration selected is all on-demand

Why is there a Spot label in the pool even though the configuration selected is all on-demand? Can someone explain?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @hkadhao, Let me explain. When configuring a pool in Databricks, you have the option to use either all spot instances or all on-demand instances. If you choose the “All Spot” option, the pool will launch clusters with spot instances for all nodes,...
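
For reference, here is a hedged sketch of a pool spec that pins every node to on-demand capacity. Field names follow the Instance Pools API (`azure_attributes.availability`); the node type is an example, so verify both against your workspace docs:

```python
def build_pool_spec(name):
    """Instance pool spec requesting only on-demand Azure VMs.

    Field names assumed from the Instance Pools API; verify before use.
    """
    return {
        "instance_pool_name": name,
        "node_type_id": "Standard_DS3_v2",  # example node type
        "min_idle_instances": 0,
        "azure_attributes": {
            # ON_DEMAND_AZURE pins all nodes to on-demand capacity;
            # SPOT_AZURE / SPOT_WITH_FALLBACK_AZURE would use spot VMs.
            "availability": "ON_DEMAND_AZURE",
        },
    }
```

If the pool was created before the availability setting was chosen, the UI label may reflect the original configuration rather than the current one.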

leelee3000
by New Contributor III
  • 1668 Views
  • 4 replies
  • 2 kudos

Resolved! Setting up Unity Catalog in Azure

Trying to create a metastore that will be connected to an external storage (ADLS) but we don't have the option to create a new metastore in 'Catalog' tab in the UI. Based on some research, we see that we'll have to go into "Manage Account" and then c...

Latest Reply
bsadler
New Contributor II
  • 2 kudos

I have been wrestling with this question for days now. I seem to be the only one with this question, so I am sure I am doing something wrong. I am trying to create a UC metastore but there is no option in "Catalog" to create a metastore. This s...

3 More Replies
eheinlein
by New Contributor
  • 135 Views
  • 1 reply
  • 0 kudos

How to confirm a workspace ID via an api token?

Hello! We are integrating with Databricks and we collect the API key, workspace ID, and host from our users in order to connect to Databricks. We need to validate the workspace ID because we need it outside of the context of the API key (with webh...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @eheinlein, You can obtain the workspace ID from within a Databricks Notebook by running the following command in a Python or Scala cell: spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId") This command will return the worksp...
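
When a notebook isn't available (e.g. validating user-supplied credentials server-side), one hedged approach: Databricks REST responses generally carry the workspace/org ID in an `x-databricks-org-id` response header. Treat that header name as an assumption to verify against your deployment; the endpoint below is just a lightweight call to produce a response:

```python
import json
import urllib.request

def fetch_org_id(host, token):
    """Call a cheap endpoint and read the org/workspace ID from the
    x-databricks-org-id response header (header name assumed; verify)."""
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/spark-versions",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("x-databricks-org-id")

def matches_workspace(claimed_id, header_id):
    # Compare the user-supplied workspace ID with what the API reports.
    return header_id is not None and str(claimed_id) == str(header_id)
```

If the header proves unreliable in your environment, the notebook-side `spark.conf.get(...)` value from the reply above can serve as the ground truth during onboarding.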

samsonite
by New Contributor
  • 155 Views
  • 1 reply
  • 0 kudos

Failed deploying bundle via gitlab - Request failed for POST

I'm encountering an issue in my .gitlab-ci.yml file when attempting to execute databricks bundle deploy -t prod. The error message I receive is: Error: Request failed for POST <path>/state/deploy.lock. Interestingly, when I run the same command locally...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @samsonite, The error message you’re encountering seems related to a lock request.  Ensure that the credentials and permissions used for deploying to the production environment are correct. Differences in credentials between local and CI/CD enviro...

Ramakrishnan83
by New Contributor III
  • 9984 Views
  • 7 replies
  • 0 kudos

Renaming the database Name in Databricks

Team, initially our team created the databases with the environment name appended, e.g. cust_dev, cust_qa, cust_prod. I am looking to standardize on a consistent database name across environments. I want to rename to "cust". All of my tables are ...

Latest Reply
Avvar2022
Contributor
  • 0 kudos

You can also use CASCADE to drop the schema along with its tables; it is recursive.
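
Since Hive-style schemas cannot simply be renamed, the pattern the thread converges on is create-new, copy, drop-old. A sketch that generates the statements (schema and table names are placeholders, and DEEP CLONE assumes Delta tables; run each with `spark.sql(...)`):

```python
def rename_schema_statements(old, new, tables):
    """Generate SQL to 'rename' a schema by recreating it: create the
    new schema, deep-clone each Delta table into it, then drop the old
    schema recursively with CASCADE (which also drops its tables)."""
    stmts = [f"CREATE SCHEMA IF NOT EXISTS {new}"]
    for t in tables:
        stmts.append(f"CREATE TABLE {new}.{t} DEEP CLONE {old}.{t}")
    stmts.append(f"DROP SCHEMA {old} CASCADE")  # recursive drop
    return stmts

# e.g. for stmt in rename_schema_statements("cust_dev", "cust", table_list):
#          spark.sql(stmt)
```

Only issue the final DROP after verifying the clones, since CASCADE removes every table in the old schema irreversibly (subject to your retention/UNDROP settings).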

6 More Replies