Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

AWS1567
by New Contributor III
  • 20260 Views
  • 10 replies
  • 6 kudos

We've encountered an error logging you in.

I've been trying to log in for the past two days and I'm still facing this error: "We've encountered an error logging you in." I've tried resetting the password multiple times and nothing happened. My friend is also not able to log in. I request you to resolve t...

Databricks_login_issue
Latest Reply
rmutili
New Contributor II
  • 6 kudos

Hey, I am not able to log in to my work Databricks account. I am getting the above errors.

9 More Replies
BeardyMan
by New Contributor III
  • 6092 Views
  • 9 replies
  • 3 kudos

Resolved! MLFlow Serve Logging

When using Azure Databricks and serving a model, we have received requests to capture additional logging. In some instances, they would like to capture input and output or even some of the steps from a pipeline. Is there any way we can extend the lo...

Latest Reply
Dan_Z
Databricks Employee
  • 3 kudos

Another word from a Databricks employee: "You can use the custom model approach, but configuring it is painful. Plus you have to embed every loggable model in the custom model. Another, less intrusive solution would be to have a proxy server do the loggi...

8 More Replies
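
For readers landing on this thread: the "custom model approach" mentioned in the reply above can be sketched roughly as below. This is not code from the thread; the wrapper class, logger name, and the commented-out registration call are illustrative assumptions built on MLflow's pyfunc API.

```python
import logging

import mlflow.pyfunc

logger = logging.getLogger("serving_audit")

class LoggingWrapper(mlflow.pyfunc.PythonModel):
    """Wraps an already-trained model so every request/response pair is logged."""

    def __init__(self, inner_model):
        self.inner_model = inner_model

    def predict(self, context, model_input):
        # model_input arrives as a pandas DataFrame when the model is served.
        logger.info("request: %s", model_input.to_dict(orient="records"))
        output = self.inner_model.predict(model_input)
        logger.info("response: %s", output)
        return output

# Hypothetical usage, assuming `trained_model` is any fitted estimator with .predict():
# mlflow.pyfunc.log_model("wrapped_model", python_model=LoggingWrapper(trained_model))
```

The trade-off flagged in the reply is visible here: every model you want logged has to be wrapped this way, which is why a logging proxy in front of the serving endpoint is the less intrusive alternative.
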
brickster_2018
by Databricks Employee
  • 12586 Views
  • 3 replies
  • 6 kudos

Resolved! How to add custom logging in Databricks

I want to add custom logs that are redirected to the Spark driver logs. Can I use the existing logger classes to have my application logs or progress messages appear in the Spark driver logs?

Latest Reply
Kaizen
Valued Contributor
  • 6 kudos

1) Is it possible to save all the custom logging to its own file? Currently it is being logged with all the other cluster logs (see image). 2) Also, it seems like a lot of blank files are being created for this. Is this a bug? This include...

2 More Replies
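
As a rough illustration of what the question above asks for (not taken from the thread's accepted answer), here is a minimal sketch using the standard Python logging module in a notebook; it assumes that whatever the driver writes to stdout/stderr is surfaced under the cluster's driver logs.

```python
import logging
import sys

# Application logger whose output goes to the driver's stderr, which the
# cluster UI captures under Driver Logs.
logger = logging.getLogger("my_pipeline")
logger.setLevel(logging.INFO)

handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s - %(message)s"))
logger.addHandler(handler)

logger.info("starting ingestion step")
logger.warning("row count lower than expected")
```

Kaizen's follow-up (routing these messages to their own file instead of mixing them with the other cluster logs) would amount to swapping the StreamHandler for a FileHandler pointed at a dedicated path.
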
marksachin_k
by New Contributor
  • 2624 Views
  • 1 replies
  • 0 kudos

Python custom Logging on Databricks

I am planning to introduce custom logging to the Databricks workload. To achieve this I am using the Python logging module. I am storing logs in the driver-local "file:/tmp/" directory before I move those logs to blob storage. In my personal Databricks ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @MARKSACHIN K​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

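
The pattern described in the question above (write logs to driver-local /tmp, then move them to blob storage) might look roughly like this; the mount point and file names are hypothetical, and `dbutils` is assumed to be available as it is inside a Databricks notebook.

```python
import logging

# 1) Log to driver-local disk, which behaves like a normal filesystem.
local_log = "/tmp/job_run.log"
logging.basicConfig(
    filename=local_log,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
logging.info("job started")
logging.info("job finished")

# 2) Flush and close handlers, then copy the finished file to blob storage.
logging.shutdown()
dbutils.fs.cp(f"file:{local_log}", "dbfs:/mnt/my-logs/job_run.log")  # hypothetical mount path
```
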
Snowhow1
by New Contributor II
  • 9172 Views
  • 1 replies
  • 1 kudos

Logging when using multiprocessing with joblib

Hi, I'm using joblib for multiprocessing in one of our processes. The logging works well (apart from weird py4j errors, which I suppress), except when it's within multiprocessing. Also, how do I suppress the other errors that I always receive on DB - perha...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Sam G: It seems like the issue is related to the py4j library used by Spark, and not specifically related to joblib or multiprocessing. The error message indicates a network error while sending a command between the Python process and the Java Virt...

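
Not from the reply above, but one common workaround for the symptom described (log records disappearing inside joblib workers): with joblib's default loky backend each worker is a separate process that does not inherit the driver's logging configuration, so the handler has to be set up inside the worker function. A minimal sketch with hypothetical names:

```python
import logging

from joblib import Parallel, delayed

def process_partition(idx):
    # Each loky worker process configures its own logger and file handler,
    # because the driver's logging setup is not inherited across processes.
    logger = logging.getLogger(f"worker_{idx}")
    if not logger.handlers:
        handler = logging.FileHandler(f"/tmp/worker_{idx}.log")
        handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    logger.info("processing partition %d", idx)
    return idx * idx

results = Parallel(n_jobs=4)(delayed(process_partition)(i) for i in range(8))
```
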
elgeo
by Valued Contributor II
  • 1212 Views
  • 1 replies
  • 0 kudos

User logging in Databricks

Hello experts. Is there a way to see user logs, i.e. who is running which notebook or a specific query? I couldn't find anything in the "Driver logs" (stdout, log4j). Just to specify, I have the admin role in the workspace. Thank you in advance.

Latest Reply
elgeo
Valued Contributor II
  • 0 kudos

Any update on this? Thank you

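
One way to approach the question above, assuming the workspace has Unity Catalog system tables enabled (an assumption, not something confirmed in this thread): user activity such as notebook and SQL actions is exposed through the system.access.audit table, which an admin can query from a notebook. The column and service names below follow the documented audit schema and may need adjusting for your workspace.

```python
# Sketch only: requires system tables to be enabled and access to the
# system catalog; `spark` is the notebook's SparkSession.
recent_activity = spark.sql("""
    SELECT event_time,
           user_identity.email AS user_email,
           service_name,
           action_name
    FROM system.access.audit
    WHERE service_name IN ('notebook', 'databrickssql')
    ORDER BY event_time DESC
    LIMIT 100
""")
recent_activity.show(truncate=False)
```
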
Gim
by Contributor
  • 65541 Views
  • 3 replies
  • 9 kudos

Best practice for logging in Databricks notebooks?

What is the best practice for logging in Databricks notebooks? I have a bunch of notebooks that run in parallel through a workflow. I would like to keep track of everything that happens such as errors coming from a stream. I would like these logs to ...

Latest Reply
karthik_p
Esteemed Contributor
  • 9 kudos

@Gimwell Young As @Debayan Mukherjee mentioned, if you configure verbose logging at the workspace level, logs will be delivered to the storage bucket that you provided during configuration. From there you can pull the logs into any of your licensed log mo...

2 More Replies
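
To make the reply above a little more concrete: once verbose audit logging is configured and delivery is running, the logs arrive as JSON files in the bucket chosen during configuration and can be read back with Spark. The path and the selected columns below are illustrative assumptions, not details from the thread.

```python
# Hypothetical delivery location; adjust to the bucket and prefix configured
# for audit log delivery in your account.
audit_df = spark.read.json("s3://my-audit-log-bucket/audit-logs/")
audit_df.select("timestamp", "serviceName", "actionName").show(20, truncate=False)
```
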
Vashista
by New Contributor II
  • 3652 Views
  • 4 replies
  • 0 kudos

Blank page after logging into community cloud

After logging in on community.cloud.databricks.com, a blank white page appears as if it is still loading. It does not open even when I refresh. It is not an internet connectivity issue. How do I access the page? I'm running Safari.

Latest Reply
Vidula
Honored Contributor
  • 0 kudos

Hi @Vashista Thakuri Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...

3 More Replies
kjoth
by Contributor II
  • 6991 Views
  • 4 replies
  • 3 kudos

Resolved! Pyspark logging - custom to Azure blob mount directory

I'm using the logging module to log the events from the job, but it seems the log file is created with only 1 line. The consecutive log events are not being recorded. Is there any reference for custom logging in Databricks?

Latest Reply
Anonymous
Not applicable
  • 3 kudos

@karthick J​ - If Jose's answer helped solve the issue, would you be happy to mark their answer as best so that others can find the solution more easily?

3 More Replies
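
The accepted answer is truncated above, so the following is only a guess at one usual cause of the "single line" symptom rather than a restatement of it: handlers being re-attached on every notebook re-run, combined with writing straight to the blob mount. A hedged sketch that logs to local disk first and copies once at the end (paths are hypothetical):

```python
import logging
import shutil

LOCAL_LOG = "/tmp/pipeline.log"            # driver-local disk handles appends normally
MOUNT_LOG = "/dbfs/mnt/logs/pipeline.log"  # hypothetical Azure blob mount path

logger = logging.getLogger("pipeline")
logger.setLevel(logging.INFO)

# Guard so re-running the setup cell does not attach a second FileHandler,
# which would write every record twice.
if not logger.handlers:
    handler = logging.FileHandler(LOCAL_LOG)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)

logger.info("event 1")
logger.info("event 2")

# Copy the complete file to the mount once, rather than appending to the
# mounted path record-by-record.
shutil.copy(LOCAL_LOG, MOUNT_LOG)
```
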
zachary_jones
by New Contributor
  • 4521 Views
  • 3 replies
  • 0 kudos

Resolved! Python logging: 'Operation not supported' after upgrading to DBRT 6.1

My organization has an S3 bucket mounted to the Databricks filesystem under /dbfs/mnt. When using Databricks Runtime 5.5 and below, the following logging code works correctly: log_file = '/dbfs/mnt/path/to/my/bucket/test.log' logger = logging.getLogg...

Latest Reply
lycenok
New Contributor II
  • 0 kudos

It's probably worth trying to rewrite emit ... https://docs.python.org/3/library/logging.html#handlers This works for me: class OurFileHandler(logging.FileHandler): def emit(self, record): # copied from https://github.com/python/cpython/bl...

2 More Replies
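
lycenok's code is cut off above, so the snippet below is not a reconstruction of it; it is only a hedged sketch of the same general idea (override emit so records are appended to a driver-local file and the whole file is then mirrored to the mounted path, since direct appends under /dbfs/mnt fail on newer runtimes). The local path is a made-up default.

```python
import logging
import shutil

class MountedFileHandler(logging.FileHandler):
    """Append to a driver-local file, then mirror the full file to a DBFS mount
    after each record, avoiding random writes against the mount itself."""

    def __init__(self, mount_path, local_path="/tmp/mirrored.log"):
        self.mount_path = mount_path
        super().__init__(local_path)

    def emit(self, record):
        super().emit(record)                              # local append is supported
        shutil.copy(self.baseFilename, self.mount_path)   # overwrite the mounted copy

logger = logging.getLogger("dbrt61_logger")
logger.setLevel(logging.INFO)
logger.addHandler(MountedFileHandler("/dbfs/mnt/path/to/my/bucket/test.log"))
logger.info("log record reaches the mounted bucket")
```
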