Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

berserkersap
by Contributor
  • 12549 Views
  • 1 replies
  • 0 kudos

How to deal with Decimal data type arithmetic operations?

I am dealing with values ranging from 10^9 to 10^-9, the sum of values can go up to 10^20, and I need accuracy. So I wanted to use the Decimal data type [using SQL in the Data Science & Engineering workspace]. However, I got to know the peculiar behavior of D...

Latest Reply
berserkersap
Contributor
  • 0 kudos

Hello everyone, I understand that there is no best answer for this question, so I could only do the same thing I found when I surfed the net. The method I found works when you know the range of values you deal with (not just the input data but also ...

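The workaround above (choosing precision and scale from the known value range) can be sketched outside Spark with Python's `decimal` module; this is a hypothetical illustration, not the poster's actual code, and the values are made up:

```python
from decimal import Decimal, getcontext

# Values span 10^-9 to 10^9 and sums reach ~10^20, so we need up to
# 20 integer digits plus 9 fractional digits. Setting precision to 38
# mirrors the widest Spark SQL type, DECIMAL(38, 9).
getcontext().prec = 38

values = [Decimal("1e9"), Decimal("1e-9"), Decimal("123456789.123456789")]
total = sum(values)
print(total)  # exact: 1123456789.123456790
```

The same idea in Spark SQL would be an explicit `CAST(col AS DECIMAL(38, 9))` before aggregating, so intermediate results never silently lose scale.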
190809
by Contributor
  • 2170 Views
  • 2 replies
  • 0 kudos

Invalid port error when trying to read from PlanetScale MySQL database

Using the code below I am attempting to connect to a PlanetScale MySQL database. I get the following error: java.sql.SQLException: error parsing url : Incorrect port value. However, the port is the default 3306, and I have used the correct url based o...

Latest Reply
Pat
Esteemed Contributor
  • 0 kudos

Hi @Rachel Cunningham​, maybe you can share your `driver` and `url` values (masked)?

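An "Incorrect port value" from a MySQL JDBC driver usually means the port segment of the URL is not a plain integer (a stray query string, space, or placeholder). A minimal, hypothetical sanity check with Python's stdlib (hostnames are made up):

```python
from urllib.parse import urlsplit

def jdbc_port(url: str):
    """Return the port embedded in a jdbc:mysql:// URL, or None if it
    cannot be parsed as an integer (the usual cause of this error)."""
    # Strip the leading "jdbc:" so urlsplit sees an ordinary URL.
    try:
        return urlsplit(url.removeprefix("jdbc:")).port
    except ValueError:
        return None

print(jdbc_port("jdbc:mysql://aws.connect.psdb.cloud:3306/mydb"))  # 3306
print(jdbc_port("jdbc:mysql://aws.connect.psdb.cloud:abc/mydb"))   # None
```

Running the actual `url` option through a check like this quickly shows whether the driver is seeing the port you think it is.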
1 More Replies
eques_99
by New Contributor II
  • 4484 Views
  • 2 replies
  • 0 kudos

Remove a category (slice) from a Pie Chart

I added a grand total row to a "Count" in SQL, which I needed for some counter visualisations; I used the ROLLUP command to get the grand total. However, I have a pie chart which references the same count, and so the grand total row has been added...

Latest Reply
eques_99
New Contributor II
  • 0 kudos

Hi, as per the picture above, the slice disappears but the name ("null" in this case) remains in the legend.

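One common approach is to keep the ROLLUP total for the counter visualisations but point the pie chart at a query that filters the grand-total row out (in SQL, roughly `WHERE category IS NOT NULL`). A hypothetical sketch of the same filtering, with made-up categories:

```python
# Rows as (category, count); ROLLUP emits the grand total with category None.
rows = [("red", 10), ("blue", 5), ("green", 3), (None, 18)]

# Feed the pie chart only the real categories; keep the total for counters.
chart_rows = [(c, n) for c, n in rows if c is not None]
grand_total = next(n for c, n in rows if c is None)

print(chart_rows)   # [('red', 10), ('blue', 5), ('green', 3)]
print(grand_total)  # 18
```

This sidesteps the legend issue entirely, since the "null" row never reaches the chart.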
1 More Replies
Jayanth746
by New Contributor III
  • 8526 Views
  • 2 replies
  • 2 kudos

Databricks <-> Kafka - SSL handshake failed

I am receiving an SSL handshake error even though the trust store I have created is based on the server certificate, and the fingerprint in the certificate matches the trust-store fingerprint: kafkashaded.org.apache.kafka.common.errors.SslAuthenticationExcept...

Latest Reply
Debayan
Databricks Employee
  • 2 kudos

Hi @Jayanth Goulla​, this is worth a try: https://stackoverflow.com/questions/54903381/kafka-failed-authentication-due-to-ssl-handshake-failed. Did you follow https://docs.microsoft.com/en-us/azure/databricks/spark/latest/structured-streaming/kafka?

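For reference, the SSL-related options the Spark Kafka source expects are prefixed with `kafka.`. A hypothetical option set (paths, hosts, and passwords are placeholders); note the truststore must contain the CA chain that signed the broker certificate, not just a matching leaf fingerprint:

```python
# Hypothetical configuration for reading Kafka over SSL from Databricks.
kafka_options = {
    "kafka.bootstrap.servers": "broker1.example.com:9093",
    "kafka.security.protocol": "SSL",
    "kafka.ssl.truststore.location": "/dbfs/FileStore/certs/kafka.truststore.jks",
    "kafka.ssl.truststore.password": "<truststore-password>",
    # If the broker requires client auth (mTLS), a keystore is needed too:
    # "kafka.ssl.keystore.location": "...",
    # "kafka.ssl.keystore.password": "...",
    "subscribe": "my-topic",
}

# With Structured Streaming this would be used roughly as:
# df = spark.readStream.format("kafka").options(**kafka_options).load()
```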
1 More Replies
elgeo
by Valued Contributor II
  • 2683 Views
  • 1 replies
  • 2 kudos

Disable auto-complete (tab button)

Hello. How can we disable the autocomplete that appears on the Tab key? Thank you.

Latest Reply
elgeo
Valued Contributor II
  • 2 kudos

Thank you @Kaniz Fatma​ 

vs_29
by New Contributor II
  • 4303 Views
  • 1 replies
  • 3 kudos

Custom Log4j logs are not being written to the DBFS storage.

I used a custom Log4j appender to write custom logs through the init script, and I can see the custom log file in the driver logs, but Databricks is not writing those custom logs to DBFS. I have configured the Logging Destination in the Advanced sec...

Tags: init script, driver logs, logs destination
Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi @VIjeet Sharma​, do you receive any error? This can be an issue with using the DBFS mount point /dbfs in an init script: the DBFS mount point is installed asynchronously, so at the very beginning of init script execution that mount point might not be ava...

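The async-mount issue above suggests an obvious workaround: have the init script wait until /dbfs appears before writing to it. A stdlib-only sketch of that loop (in a real cluster init script the same idea is usually written in bash; the timeout values are illustrative):

```python
import os
import time

def wait_for_mount(path: str = "/dbfs",
                   timeout_s: float = 60.0,
                   poll_s: float = 1.0) -> bool:
    """Poll until the mount point directory appears, or give up
    after timeout_s seconds. Returns True if the path showed up."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if os.path.isdir(path):
            return True
        time.sleep(poll_s)
    return False
```

If the mount never appears within the timeout, the script can fall back to local disk and let the cluster's log delivery pick the files up later.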
sharonbjehome
by New Contributor
  • 2752 Views
  • 1 replies
  • 1 kudos

Structured Streaming from MongoDB Atlas not parsing JSON correctly

Hi all, I have a table in MongoDB Atlas that I am trying to read continuously into memory, and I will eventually write that file out. However, when I look at the in-memory table it doesn't have the correct schema. Code here: from pyspark.sql.types impo...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi @sharonbjehome​, this has to be checked thoroughly via a support ticket. Did you follow https://docs.databricks.com/external-data/mongodb.html? Also, could you please check with MongoDB support? Was this working before?

dara
by New Contributor
  • 1573 Views
  • 1 replies
  • 1 kudos

How to count DelayCategories?

I would like to know the count of each category in each year. When I run count, it doesn't work.

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi @Dara Tourt​, when you say it does not work, what is the error? You can run the count aggregate function: https://docs.databricks.com/sql/language-manual/functions/count.html. Please let us know if this helps.

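For a count per category per year, the usual shape is `SELECT year, category, COUNT(*) FROM t GROUP BY year, category`. The same grouping, sketched with stdlib Python on made-up delay data:

```python
from collections import Counter

# Hypothetical rows: (year, delay_category)
rows = [
    (2020, "Small Delay"), (2020, "Large Delay"), (2020, "Small Delay"),
    (2021, "Large Delay"), (2021, "Small Delay"),
]

# Equivalent of GROUP BY year, category with COUNT(*)
counts = Counter(rows)
print(counts[(2020, "Small Delay")])  # 2
```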
547284
by New Contributor II
  • 1441 Views
  • 1 replies
  • 1 kudos

How to read in CSVs from an S3 directory with different columns

I can read all CSVs under an S3 URI by doing: files = dbutils.fs.ls('s3://example-path') df = spark.read.options(header='true', encoding='iso-8859-1', dateFormat='yyyyMMdd', ignoreLeadingWhiteSpace='true', i...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi @Anthony Wang​, as of now I think that's the only way. Please refer to https://docs.databricks.com/external-data/csv.html#pitfalls-of-reading-a-subset-of-columns. Please let us know if this helps.

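Since Spark's CSV reader applies one schema (typically the first file's header) across the whole directory, one manual alternative is to read each file with its own header and union on the combined column set. A stdlib-only sketch with two hypothetical in-memory files:

```python
import csv
import io

# Two hypothetical CSV files with overlapping but different columns.
file_a = io.StringIO("id,name\n1,alice\n")
file_b = io.StringIO("id,age\n2,30\n")

# Read each file with its own header...
rows = []
for f in (file_a, file_b):
    rows.extend(csv.DictReader(f))

# ...then union on the full column set, filling gaps with None.
columns = sorted({k for r in rows for k in r})
unioned = [{c: r.get(c) for c in columns} for r in rows]
print(unioned)
# [{'age': None, 'id': '1', 'name': 'alice'}, {'age': '30', 'id': '2', 'name': None}]
```

In Spark the analogous pattern would be reading each path into its own DataFrame and combining with `unionByName(..., allowMissingColumns=True)`.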
sage5616
by Valued Contributor
  • 9262 Views
  • 3 replies
  • 6 kudos

Saving PySpark standard out and standard error logs to cloud object storage

I am running my PySpark data pipeline code on a standard Databricks cluster. I need to save all Python/PySpark standard output and standard error messages into a file in an Azure Blob account. When I run my Python code locally I can see all messages i...

Latest Reply
sage5616
Valued Contributor
  • 6 kudos

This is the approach I am currently taking. It is documented here: https://stackoverflow.com/questions/62774448/how-to-capture-cells-output-in-databricks-notebook. from IPython.utils.capture import CapturedIO; capture = CapturedIO(sys.stdout, sys.st...

2 More Replies
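A related stdlib-only pattern (not the poster's IPython approach) is to redirect stdout and stderr to a file with `contextlib`, then persist that file; on Databricks the path could point at a mounted Azure Blob location such as /dbfs/mnt/<container>/logs, which is hypothetical here:

```python
import sys
from contextlib import redirect_stdout, redirect_stderr
from pathlib import Path

# Capture everything the pipeline prints and persist it to a file.
log_path = Path("pipeline.log")
with log_path.open("w") as f, redirect_stdout(f), redirect_stderr(f):
    print("starting pipeline")
    print("warning: something odd", file=sys.stderr)

print(log_path.read_text())
```

Note this only captures Python-side prints; executor and JVM logs still go through the cluster's log delivery settings.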
flora2408
by New Contributor II
  • 1736 Views
  • 1 replies
  • 2 kudos

I have passed the Fundamentals Accreditation but I haven't received my badge and certificate.

I have just passed the Fundamentals Accreditation and I don't have the badge.

Latest Reply
LandanG
Databricks Employee
  • 2 kudos

Hi @FRANCISCO LORA​, @Kaniz Fatma​ knows more than me, but you could probably submit a ticket to the Databricks Training Team here: https://help.databricks.com/s/contact-us?ReqType=training. They will get back to you shortly.

Rahul_Tiwary
by New Contributor II
  • 8907 Views
  • 1 replies
  • 4 kudos

Getting error "java.lang.NoSuchMethodError: org.apache.spark.sql.AnalysisException" while writing streaming data to Event Hubs. It works fine if I write to another Databricks table.

import org.apache.spark.sql._ import scala.collection.JavaConverters._ import com.microsoft.azure.eventhubs._ import java.util.concurrent._ import scala.collection.immutable._ import org.apache.spark.eventhubs._ import scala.concurrent.Future import scala.c...

Latest Reply
Gepap
New Contributor II
  • 4 kudos

The dataframe to write needs to have the following schema:
Column | Type
body (required) | string or binary
partitionId (*optional) | string
partitionKey...

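Following the schema above, the write-side fix is to serialize each record into a single `body` column before handing the dataframe to the Event Hubs connector. A hypothetical pre-serialization step with stdlib Python (the records are made up):

```python
import json

# The Event Hubs sink expects a column named `body` (string or binary),
# plus optional partitionId / partitionKey / properties columns.
records = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
payload = [{"body": json.dumps(r)} for r in records]

# In Spark this would correspond to something like:
# df.select(to_json(struct("*")).alias("body"))
#   .write.format("eventhubs").options(**ehConf).save()
print(payload[0]["body"])
```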
196083
by New Contributor II
  • 2537 Views
  • 1 replies
  • 2 kudos

IPython shell `set_next_input` not working

I'm running on 11.3 LTS. Expected behavior vs. Databricks notebook behavior (it does nothing): you can also do `shell.set_next_input("test", replace=True)` to replace the current cell content, which also doesn't work on Databricks. `set_next_input` stores...

horatiug
by New Contributor III
  • 6672 Views
  • 8 replies
  • 3 kudos

Create workspace in Databricks deployed in Google Cloud using terraform

In the documentation (https://registry.terraform.io/providers/databricks/databricks/latest/docs and https://docs.gcp.databricks.com/dev-tools/terraform/index.html) I could not find how to provision Databricks workspaces in GCP. Only cre...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @horatiu guja​, does @Debayan Mukherjee​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Otherwise, we can help you with more details.

7 More Replies
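For GCP specifically, workspace provisioning goes through the account-level `databricks_mws_workspaces` resource. A hedged sketch only; field names and values are illustrative and should be checked against the current provider docs:

```hcl
# Hypothetical sketch: create a GCP workspace at the Databricks account level.
resource "databricks_mws_workspaces" "this" {
  provider       = databricks.accounts   # account-level provider alias
  account_id     = var.databricks_account_id
  workspace_name = "demo-workspace"
  location       = "us-central1"

  cloud_resource_container {
    gcp {
      project_id = var.google_project
    }
  }
}
```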