Metastore is down
I am trying to run a Scala notebook, but my job just spins and says "Metastore is down." Can someone help me? Thanks in advance.
I get this error when I run dbutils.notebook.run(.......): com.databricks.WorkflowException: com.databricks.NotebookExecutionException: FAILED
Hi @Godswill Mbata, this looks like an issue with a High Concurrency cluster. Could you please confirm whether you are using a High Concurrency cluster? Please refer to: https://community.databricks.com/s/question/0D53f00001cx3ybCAA/strange-error-with-d...
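For reference, this is roughly how the call is usually wrapped so the failure surfaces with a readable message instead of just FAILED. The notebook path, timeout, and argument below are made up for illustration:

```python
# Runs inside a Databricks notebook, where `dbutils` is already available.
try:
    # dbutils.notebook.run(path, timeout_seconds, arguments)
    # Path and argument are hypothetical placeholders.
    result = dbutils.notebook.run("/Shared/child_notebook", 60, {"run_date": "2022-08-23"})
    print(f"Child notebook returned: {result}")
except Exception as e:
    # com.databricks.WorkflowException / NotebookExecutionException: FAILED
    # lands here when the child notebook raises or cannot be scheduled
    # (e.g. metastore down, or a limitation of the cluster mode in use).
    print(f"Child notebook failed: {e}")
    raise
```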
I have a table that I'll update with multiple inputs (csv). Is there a simple way to update my target when the source fields won't be a 1:1 match? Another challenge I've run into is that my sources don't have a header field, though I guess I could ...
Read your CSV as a DataFrame and then update the target using MERGE (upsert).
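In case it helps, here is a rough sketch of that pattern with the Delta Lake Python API. The path, table name, and column mapping are placeholders; since your CSV has no header row, Spark names the columns _c0, _c1, ..., so you map them explicitly to the target columns (this also covers the fields not being a 1:1 match):

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Hypothetical source path and column mapping; adjust to your data.
src = (
    spark.read
    .option("header", "false")
    .csv("/mnt/landing/updates.csv")
    # Headerless CSV: rename _c0, _c1, ... to the target's column names.
    .select(
        F.col("_c0").alias("id"),
        F.col("_c1").alias("amount"),
    )
)

# Hypothetical target; must be a Delta table for MERGE.
target = DeltaTable.forName(spark, "my_db.target_table")

(
    target.alias("t")
    .merge(src.alias("s"), "t.id = s.id")
    .whenMatchedUpdate(set={"amount": "s.amount"})
    .whenNotMatchedInsert(values={"id": "s.id", "amount": "s.amount"})
    .execute()
)
```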
Hi, right now I am trying to run a PyFlink script that connects to a Kafka server. When I run the script, I get the error "An error occurred while trying to connect to the Java server 127.0.0.1:35529". Do I need to install an extra JDK for that? er...
Did you get Flink running on the Databricks cluster? That seems to be the issue here.
Hi, I'm trying the Databricks Premium. Got registered etc. No problem. Then I set up AWS and the workspace etc. No problem. Back to Databricks and now I can't log in. I tried to reset my password but get nothing in the email. Trying another browser also doesn't work. C...
Hi @Debayan Mukherjee, could you please help me out? Is there any way I can log in to my Databricks account?
I have created a workspace in AWS with private link. When we launch a cluster we get the following error: "Security Daemon Registration Exception: Failed to set up the spark container due to an error when registering the container to security daemon"...
Thanks @Sivaprasad C S, it's already working. The problem was that the AWS STS private link was not associated with the correct subnet.
Hi there, I used my company email to register an account at https://customer-academy.databricks.com/learn/signin a while back. Now what I need to do is create an account at https://partner-academy.databricks.com/learn/signin using my company ema...
Hello Ajay, thank you for reaching out. Please submit a ticket at https://help.databricks.com/s/contact-us?ReqType=training. Thank you!
Databricks released new visualization tools: https://docs.databricks.com/notebooks/visualizations/index.html. But I don't see them in my notebook (DBR 10.4), nor can I find out how to enable them. How do I use the new tools?
@Atanu Sarkar , I am still not seeing this. I am running a cluster on runtime 10.4 LTS ML. The "Legacy Visualizations" are the only option available. This feature should be fully deployed, yes?
I uploaded some files in Community Edition. But when I try to delete them I get the popup asking if I want to delete, and when I select delete, the popup goes away but the file isn't deleted. Seems the GUI is bugged. However, when I use d...
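If the UI delete keeps failing, the usual fallback is deleting through dbutils from a notebook. A minimal sketch, assuming a hypothetical DBFS path under /FileStore/tables/ (adjust to where your uploads landed):

```python
# List the uploaded files first to confirm the exact path.
display(dbutils.fs.ls("/FileStore/tables/"))

# Remove a single file; use recurse=True only if you are deleting a directory.
dbutils.fs.rm("/FileStore/tables/my_upload.csv", recurse=False)
```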
Hi Team, as I'm performing the Databricks workspace migration, I'm facing the issue below during the Metastore migration. As we found differences in the Metastore table count between the legacy and target workspaces, we checked the error logs. After going through Failed...
I am using a Databricks job cluster for multitask jobs. When my job fails/succeeds I can't see any logs. Do I need to add a location in Advanced options > Cluster logging to see the logs for failed/succeeded jobs, or what is it and how does it work...
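For what it's worth: if you want driver and executor logs to remain available after the job cluster terminates, you can set a delivery location under Advanced options > Logging in the cluster spec (cluster_log_conf in the Jobs API). A rough sketch of the relevant block as a Python dict, with a hypothetical DBFS destination and example instance/worker settings:

```python
# Part of a jobs/create (Jobs API 2.1) new_cluster spec; values are illustrative.
new_cluster = {
    "spark_version": "10.4.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    # Equivalent to Advanced options > Logging in the cluster UI:
    # driver/executor logs are delivered to this path and survive
    # the job cluster's termination.
    "cluster_log_conf": {
        "dbfs": {"destination": "dbfs:/cluster-logs/multitask-job"}
    },
}
```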
Hi @swetha kadiyala Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
Super curious about people's workflows for analyzing A/B tests. Trying to learn how to optimize my approach as it is very manual right now.
Hi @Ben Mathew Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
We are distributing .pbids files providing the connection info to Databricks. They contain options passed to the "Databricks.Catalogs" function implementing the connection to Databricks. It is my understanding that Databricks has made this together wi...
Hi @Erik Parmann Does @Hubert Dudek's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!
The S3 buckets are a likely source location for the new EDL builder uploads. Is there a way to search Databricks to find the naming convention for the S3 buckets that have been assigned to our team? We uploaded some files using EDL this morning but...
Hi @James Longstreet Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
Our SQL Warehouse Serverless endpoint started failing this morning (2022-08-23 18:00:00 UTC): "org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to build AWSGlueClient: com.amazonaws.SdkClientException: Unable to find...
Hi @Bo Zhu Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!