Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Abel_Martinez
by Contributor
  • 14179 Views
  • 11 replies
  • 38 kudos

Why do Python logs show the [REDACTED] literal in place of spaces when I use dbutils.secrets.get in my code?

When I use dbutils.secrets.get in my code, spaces in the log are replaced by the literal "[REDACTED]". This is very annoying and makes the log difficult to read. Any idea how to avoid this? See my screenshot...

Latest Reply
jlb0001
New Contributor III
  • 38 kudos

I ran into the same issue and found that the notebook included some test keys with values of "A" and "B" for simple testing. I noticed that any string with a substring of "A" or "B" was "[REDACTED]". So, in my case, it was an eas...

10 More Replies
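The accepted explanation above can be simulated in a few lines. This is a minimal sketch, not Databricks' actual redaction code: it assumes redaction simply replaces every occurrence of each registered secret value, which is why a one-character secret like "A" mangles ordinary text while a normal-length secret does not.

```python
# Minimal simulation (an assumption, not the real implementation) of
# substring-based secret redaction.
def redact(line: str, secrets: list[str]) -> str:
    for s in secrets:
        if s:
            line = line.replace(s, "[REDACTED]")
    return line

# A one-character secret value hits ordinary words:
print(redact("All systems OK", ["A"]))
# A normal-length secret leaves unrelated text alone:
print(redact("All systems OK", ["s3cr3t-value"]))
```

The fix reported in the thread follows directly: avoid storing trivial test values like "A" or "B" in a secret scope, since every log line containing those substrings will be masked.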
vinaykumar
by New Contributor III
  • 4912 Views
  • 6 replies
  • 0 kudos

Log files are not getting deleted automatically after the logRetentionDuration interval

Hi team, log files are not getting deleted automatically from the delta log folder after the logRetentionDuration interval, and after analysis I see checkpoint files are not getting created after 10 commits. Below are the table properties, set using spark.sql(    f"""  ...

No checkpoint.parquet
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @vinay kumar, hope all is well! Just wanted to check in if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...

5 More Replies
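The properties the post truncates are normally set per table. A minimal sketch, only building the DDL string so it can be inspected (the table name and values here are assumptions; on a cluster you would pass the result to spark.sql). Note that automatic log cleanup only happens once checkpoints exist, so `delta.checkpointInterval` matters as much as `delta.logRetentionDuration`:

```python
# Build an ALTER TABLE statement that sets Delta log-retention and
# checkpoint properties. Hypothetical table name; run the string with
# spark.sql(ddl) on a Databricks cluster.
def delta_retention_ddl(table: str,
                        log_retention: str = "interval 7 days",
                        checkpoint_interval: int = 10) -> str:
    props = {
        "delta.logRetentionDuration": f"'{log_retention}'",
        "delta.checkpointInterval": f"'{checkpoint_interval}'",
    }
    prop_list = ", ".join(f"{k} = {v}" for k, v in props.items())
    return f"ALTER TABLE {table} SET TBLPROPERTIES ({prop_list})"

ddl = delta_retention_ddl("my_db.events")
print(ddl)
# spark.sql(ddl)  # uncomment on a cluster
```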
Kaijser
by New Contributor II
  • 3466 Views
  • 4 replies
  • 1 kudos

Logging clogged up with error messages (OSError: [Errno 95] Operation not supported, --- Logging error ---)

I have encountered this issue for a while now, and it happens on each run that is triggered. I discovered 2 things: 1) If I run my script on a cluster that is not active and the cluster is activated by a scheduled trigger (not manually!), this doesn't happ...

Latest Reply
manasa
Contributor
  • 1 kudos

Hi @Aaron Kaijser, are you able to write your logfile to ADLS? If yes, could you please explain how you did it?

3 More Replies
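One common workaround for `OSError: [Errno 95] Operation not supported` is worth sketching here. The sketch assumes the error comes from appending to a mounted path (mounts generally only support sequential writes): log to the driver's local disk, then copy the finished file to the mount at the end of the run. The mount path is hypothetical.

```python
# Sketch: write logs locally, copy to the mount once at the end.
import logging
import os
import shutil
import tempfile

local_log = os.path.join(tempfile.gettempdir(), "run.log")

# force=True replaces any handlers configured by earlier notebook cells.
logging.basicConfig(filename=local_log, level=logging.INFO, force=True)
logging.info("job started")
logging.shutdown()  # flush and close handlers before copying

target_dir = "/dbfs/mnt/logs"  # hypothetical mount point
if os.path.isdir(target_dir):  # only exists on a Databricks driver
    shutil.copy(local_log, os.path.join(target_dir, "run.log"))
```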
Nilofar
by New Contributor II
  • 2787 Views
  • 7 replies
  • 0 kudos

I am not able to reset the password for Databricks Community cloud

Hi, I am not able to log in to https://community.cloud.databricks.com/login.html. Please assist.

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Nilofar Sharma, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...

6 More Replies
Therdpong
by New Contributor III
  • 1603 Views
  • 2 replies
  • 0 kudos

How to check whether a jobs cluster has expanded its disk

We would like to know how to check whether a jobs cluster has expanded its disk.

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

You can check in the cluster's event logs. Type "disk" in the search box and you will see all the disk-related events there.

1 More Replies
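The same check can be scripted against the cluster events REST endpoint (`POST /api/2.0/clusters/events`) instead of the UI search box. A minimal sketch, assuming a personal access token; the workspace URL, token, and cluster ID are placeholders, and only the small filtering helper runs anywhere:

```python
# Sketch: request cluster events and keep the disk-related ones
# (e.g. types such as DID_NOT_EXPAND_DISK). Host/token are placeholders.
import json
from urllib import request

def events_payload(cluster_id: str, limit: int = 50) -> bytes:
    # Request body for POST /api/2.0/clusters/events.
    return json.dumps({"cluster_id": cluster_id, "limit": limit}).encode()

def disk_events(events: list[dict]) -> list[dict]:
    # Keep events whose type mentions DISK.
    return [e for e in events if "DISK" in e.get("type", "")]

# On a real workspace (values are assumptions):
# req = request.Request("https://<workspace>/api/2.0/clusters/events",
#                       data=events_payload("<cluster-id>"),
#                       headers={"Authorization": "Bearer <token>"})
# events = json.load(request.urlopen(req))["events"]
# print(disk_events(events))

sample = [{"type": "AUTOSCALING_STATS_REPORT"},
          {"type": "DID_NOT_EXPAND_DISK"}]
print(disk_events(sample))
```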
elgeo
by Valued Contributor II
  • 2156 Views
  • 0 replies
  • 5 kudos

Clean up _delta_log files

Hello experts. We are trying to clarify how to clean up the large number of files that are accumulating in the _delta_log folder (json, crc and checkpoint files). We went through the related posts in the forum and followed the below: SET spark.da...

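Before touching retention settings, it helps to confirm whether checkpoints are being written at all, since Delta only cleans old _delta_log entries once a newer checkpoint exists and the retention window has passed. A read-only sketch (the DBFS path is an assumption; never delete _delta_log files by hand):

```python
# Sketch: count commit, crc, and checkpoint files in a _delta_log folder.
# Many commits with zero checkpoints suggests cleanup will never trigger.
from pathlib import Path

def log_summary(delta_log_dir: str) -> dict:
    p = Path(delta_log_dir)
    return {
        "commits": len(list(p.glob("*.json"))),
        "crc": len(list(p.glob("*.crc"))),
        "checkpoints": len(list(p.glob("*.checkpoint.parquet"))),
    }

# Example on a Databricks driver (hypothetical path):
# print(log_summary("/dbfs/mnt/lake/events/_delta_log"))
```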
dibo
by New Contributor II
  • 615 Views
  • 0 replies
  • 0 kudos

I can't log in to https://community.cloud.databricks.com/login.html

Now, I can't log in to https://community.cloud.databricks.com/login.html with the correct username and password. I then clicked the button to reset my password and received the password-reset email; I have changed the password, but I still can't ...

chandan_a_v
by Valued Contributor
  • 8590 Views
  • 11 replies
  • 6 kudos

Resolved! logging.basicConfig not creating a file in Databricks

Hi, I am using the logger to log some parameters in my code and I want to save the file under DBFS. But for some reason the file is not getting created under DBFS. If I clear the state of the notebook and check the DBFS dir, then the file is present. Pleas...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Perhaps PyCharm sets a different working directory, meaning the file ends up in another place. Try providing a full path.

10 More Replies
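The reply's diagnosis (wrong working directory, plus the file only appearing after the notebook state is cleared) suggests two fixes: use an absolute path, and shut the logging system down so buffered records reach the file before you look for it. A minimal sketch under those assumptions; the /dbfs path is hypothetical:

```python
# Sketch: absolute log path plus explicit shutdown so the file exists
# and is complete immediately, not only after the notebook state clears.
import logging
import os
import tempfile

log_path = os.path.join(tempfile.gettempdir(), "pipeline.log")
# On a Databricks driver you might use (hypothetical path):
# log_path = "/dbfs/FileStore/logs/pipeline.log"

logging.basicConfig(filename=log_path, level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s",
                    force=True)  # replace handlers left over from earlier cells
logging.info("parameters logged")
logging.shutdown()  # flush; without this the file can look empty or missing
```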
Debashis
by New Contributor II
  • 1299 Views
  • 1 replies
  • 3 kudos

Resolved! Can not log in to https://community.cloud.databricks.com/login.html

Hi, I created the community cloud account and even got a mail for resetting the password. But once I try to log in to https://community.cloud.databricks.com/login.html, it does not give an error; it simply hangs for some time and shows the login screen again ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 3 kudos

Hi @Debashis Mallick, thank you for reaching out! Let us look into this for you, and we'll check back with an update. Please share your relevant details as well as screenshots to community@databricks.com.

kjoth
by Contributor II
  • 6166 Views
  • 6 replies
  • 6 kudos

Resolved! Pyspark logging - custom to Azure blob mount directory

I'm using the logging module to log the events from the job, but it seems the log file is created with only 1 line. The consecutive log events are not being recorded. Is there any reference for custom logging in Databricks?

Latest Reply
Anonymous
Not applicable
  • 6 kudos

@karthick J - If Jose's answer helped solve the issue, would you be happy to mark their answer as best so that others can find the solution more easily?

5 More Replies
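A log file stuck at one line is consistent with a destination that only supports writing a file in a single sequential pass, so each append effectively overwrites it. One workaround can be sketched as a handler that buffers records in memory and writes the whole log once at close; this is an illustrative pattern, not an official Databricks API:

```python
# Sketch: buffer log records and write them in one sequential pass at
# close, instead of appending per event.
import logging
import os
import tempfile

class WriteOnCloseHandler(logging.Handler):
    def __init__(self, path: str):
        super().__init__()
        self.path = path
        self.lines = []

    def emit(self, record: logging.LogRecord) -> None:
        self.lines.append(self.format(record))  # keep in memory

    def close(self) -> None:
        with open(self.path, "w") as f:         # single sequential write
            f.write("\n".join(self.lines) + "\n")
        super().close()

path = os.path.join(tempfile.gettempdir(), "job.log")  # mount path on Databricks
log = logging.getLogger("job")
log.setLevel(logging.INFO)
handler = WriteOnCloseHandler(path)
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
log.addHandler(handler)

log.info("step 1 done")
log.info("step 2 done")
handler.close()  # all events land in the file together
```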
MGH1
by New Contributor III
  • 5195 Views
  • 8 replies
  • 3 kudos

Resolved! how to log the KerasClassifier model in a sklearn pipeline in mlflow?

I have a set of pre-processing stages in a sklearn `Pipeline` and an estimator which is a `KerasClassifier` (`from tensorflow.keras.wrappers.scikit_learn import KerasClassifier`). My overall goal is to tune and log the whole sklearn pipeline in `mlflo...

Latest Reply
shan_chandra
Esteemed Contributor
  • 3 kudos

Could you please share the full error stack trace?

7 More Replies
brickster_2018
by Esteemed Contributor
  • 15139 Views
  • 1 replies
  • 2 kudos

Resolved! How do I change the log level in Databricks?

How can I change the log level of the Spark Driver and executor process?

Latest Reply
brickster_2018
Esteemed Contributor
  • 2 kudos

Change the log level of the driver:

%scala
spark.sparkContext.setLogLevel("DEBUG")
spark.sparkContext.setLogLevel("INFO")

Change the log level of a particular package in the driver logs:

%scala
org.apache.log4j.Logger.getLogger("shaded.databricks.v201809...
