Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Anonymous
by Not applicable
  • 5634 Views
  • 2 replies
  • 1 kudos
Latest Reply
wmespi
New Contributor II
  • 1 kudos

Is this random number not possible to extract from the notebook context? It is available in the browser_hash, but that is not populated when running a job. Is this random number static or does it change over time? If it is static, it can then be hardco...

1 More Replies
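A minimal sketch (not from the thread, and assuming it runs inside a Databricks notebook where dbutils is available) of how to inspect the notebook context tags mentioned above; tag names such as browserHash vary by runtime version and are typically absent in job runs.

import json

# dbutils is a Databricks notebook builtin; this dumps the full notebook context as JSON.
ctx = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())

# Print every context tag so you can compare an interactive run with a job run;
# identifiers like browserHash are usually only populated interactively.
for key, value in sorted(ctx.get("tags", {}).items()):
    print(key, "=", value)
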
Mado
by Valued Contributor II
  • 5925 Views
  • 1 reply
  • 0 kudos

Resolved! Error when query a table created by DLT pipeline; "Couldn't find value of a column"

Hi, I create a table using a DLT pipeline (triggered once). In the ETL process, I add a new column to the table with null values by: output = output.withColumn('Indicator_Latest_Value_Date', F.lit(None)). The pipeline works and I don't get any error. But, whe...

Latest Reply
josruiz22
New Contributor III
  • 0 kudos

Hi, try casting the None in that line to a concrete type, like this: output = output.withColumn('Indicator_Latest_Value_Date', F.lit(None).cast("String"))

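A minimal runnable sketch of the fix above, with a hypothetical stand-in DataFrame: F.lit(None) on its own has the null (VOID) type, which a Delta/DLT table cannot store, so the literal needs an explicit cast.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical stand-in for the pipeline's `output` DataFrame.
output = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# F.lit(None) alone yields a NullType column; casting gives it a concrete type
# that Delta can persist and downstream queries can read.
output = output.withColumn("Indicator_Latest_Value_Date", F.lit(None).cast("string"))
output.printSchema()
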
Ancil
by Contributor II
  • 5552 Views
  • 5 replies
  • 3 kudos

I have created a Databricks account using my organisation email ID and I need to delete that account. Can anyone please help me with how to do that?


Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hey there @Ancil P A​, just wanted to check in: were you able to resolve your issue, or do you need more help? We'd love to hear from you. Thanks!

4 More Replies
EDDatabricks
by Contributor
  • 3320 Views
  • 3 replies
  • 7 kudos

Resolved! Unable to perform VACUUM on Delta table

We have a table containing records from the last 2-3 years. The table size is around 7.5 TBytes (67 billion rows). Because there are periodic updates on historical records and daily optimizations of this table, we have tried repeatedly to execute a m...

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @EDDatabricks​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that ...

2 More Replies
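A minimal sketch (the table name and 168-hour retention are placeholders, not the poster's setup) of running VACUUM from a notebook, previewing with DRY RUN first.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# List the files that fall outside the retention window without deleting anything.
spark.sql("VACUUM my_schema.big_table RETAIN 168 HOURS DRY RUN").show(truncate=False)

# Run the actual clean-up once the dry run looks sane; on a multi-terabyte table
# this can take hours, so it is usually scheduled as its own job.
spark.sql("VACUUM my_schema.big_table RETAIN 168 HOURS")
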
carlosst01
by New Contributor II
  • 2089 Views
  • 2 replies
  • 2 kudos

Resolved! Running Libraries and/or modules in Databricks' lifecycle?

Hi, I have had this question for some weeks and didn't find any information about the topic. Specifically, my question is: what is the 'lifecycle', or the steps, to be able to use a new Python library in Databricks in terms of compatibility? For exam...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Carlos Caravantes​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ans...

1 More Replies
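A minimal sketch of the usual cycle in a Databricks notebook: install the library notebook-scoped, then verify which version actually resolved against the runtime's pre-installed packages. The package and version here are only examples.

# Cell 1: notebook-scoped install (affects only this notebook's Python environment).
%pip install beautifulsoup4==4.12.2

# Cell 2: confirm what actually resolved; if you upgraded a package the runtime
# pre-installs, run dbutils.library.restartPython() first so the new version loads.
import bs4
print(bs4.__version__)
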
adrianna2942842
by New Contributor III
  • 24448 Views
  • 3 replies
  • 7 kudos

Resolved! Exception "java.nio.charset.MalformedInputException: Input length = 1" when creating data profile on Docker Container Service (10.4 LTS)

I am encountering an issue while attempting to create a data profile on clusters using Docker Container Service (version 10.4 LTS). I keep receiving the following exception: java.nio.charset.MalformedInputException: Input length = 1. What's puzzling is ...

Latest Reply
Vartika
Databricks Employee
  • 7 kudos

Hi @Adrianna Klank​, we haven't heard from you since the last response from @Akash Bhat​, and I was checking back to see if the suggestion helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to other...

2 More Replies
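A minimal diagnostic sketch (an assumption on my part, not a confirmed fix from this thread): MalformedInputException often means the custom image's JVM or Python locale is not UTF-8, so it can help to print what the cluster actually uses and compare it with a stock runtime.

import locale
import os

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

print("Python preferred encoding:", locale.getpreferredencoding(False))
print("LANG / LC_ALL:", os.environ.get("LANG"), os.environ.get("LC_ALL"))

# Default charset of the driver JVM, fetched through the Py4J gateway.
print("JVM file.encoding:", spark.sparkContext._jvm.java.lang.System.getProperty("file.encoding"))
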
AhSon
by New Contributor II
  • 2228 Views
  • 2 replies
  • 5 kudos

Resolved! Databricks Certificate Renewal

I received an email reminder that my Databricks certificate is going to expire next month. May I check where we can renew the certificate, like how we did with Microsoft? Thank you.

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Jason Yap​, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Replies
Taha_Hussain
by Databricks Employee
  • 7777 Views
  • 5 replies
  • 8 kudos

Ask your technical questions at Databricks Office Hours! Register here for any of our upcoming dates: May 10 - 11:00 AM - 12:00 PM PT; May 17 - 8:00 AM -...

Ask your technical questions at Databricks Office Hours! Register here for any of our upcoming dates: May 10 - 11:00 AM - 12:00 PM PT; May 17 - 8:00 AM - 9:00 AM PT; May 24 - 9:00 AM - 10:00 AM GMT. Databricks Office Hours connects you directly with experts...

Latest Reply
Priyag1
Honored Contributor II
  • 8 kudos

Thanks for this info

4 More Replies
PriyaV
by New Contributor II
  • 13559 Views
  • 5 replies
  • 10 kudos

Suppress output in python notebooks

My dilemma is this - we use PySpark to connect to external data sources via JDBC from within Databricks. Every time we issue a Spark command, it spits out the connection options, including the username, URL and password, which is not advisable. So, is ...

Latest Reply
Pabeggetur
New Contributor II
  • 10 kudos

Thanks for taking the time to discuss this. I feel strongly about it and love learning more on this topic.

4 More Replies
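A minimal sketch (connection details are placeholders; it assumes a Databricks notebook with a secret scope you have created, here called jdbc-creds) of keeping credentials out of cell output: read the password from a secret, which is redacted if printed, and wrap the Python-side code in redirect_stdout to swallow anything printed to stdout. This does not silence JVM/log4j driver logs, which need separate logging configuration.

import io
from contextlib import redirect_stdout

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Secret values are shown as [REDACTED] if they end up in notebook output.
password = dbutils.secrets.get(scope="jdbc-creds", key="db-password")

with redirect_stdout(io.StringIO()):  # swallow Python-side prints while building the reader
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://example-host:5432/mydb")  # placeholder URL
        .option("dbtable", "public.my_table")                       # placeholder table
        .option("user", "my_user")                                  # placeholder user
        .option("password", password)
        .load()
    )
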
JordanYaker
by Contributor
  • 9983 Views
  • 3 replies
  • 6 kudos

Has anyone else seen state files disappear in low-volume delta tables?

I have some Delta tables in our dev environment that started popping up with the following error today: py4j.protocol.Py4JJavaError: An error occurred while calling o670.execute. : org.apache.spark.SparkException: Job aborted due to stage failure: Tas...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Jordan Yaker​, we haven't heard from you since the last response from @Kaniz Fatma​, and I was checking back to see if her suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to other...

2 More Replies
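A minimal sketch (the table name is a placeholder, and this treats the symptom rather than the root cause) of the usual recovery step when a Delta log references data files that no longer exist.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Preview the transaction-log entries that point at missing files.
spark.sql("FSCK REPAIR TABLE dev.my_schema.my_table DRY RUN").show(truncate=False)

# Drop those dangling entries so reads stop failing; separately, investigate what
# deleted the files (aggressive VACUUM, storage lifecycle rules, manual clean-up, ...).
spark.sql("FSCK REPAIR TABLE dev.my_schema.my_table")
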
ptutak
by New Contributor III
  • 10451 Views
  • 5 replies
  • 6 kudos

Databricks + Snowflake Snowpipe Streaming

Does anyone know whether it is possible to use Databricks Snowflake Connector together with the latest Snowflake feature which is Snowpipe Streaming?

Latest Reply
artsheiko
Databricks Employee
  • 6 kudos

@Piotr Tutak​, I believe you don't need Snowflake at all - just use the source files / events from your data lake / message broker and process them within Databricks with Auto Loader, which can be combined with DLT (see the Auto Loader and DLT docs and the how-to demo). With Au...

4 More Replies
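A minimal sketch of the Auto Loader pattern artsheiko suggests (all paths, the file format and the target table are placeholders): new files landing in cloud storage are picked up incrementally, much like Snowpipe, and written to a Delta table.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("cloudFiles")                          # Auto Loader source
    .option("cloudFiles.format", "json")                           # format of the landing files
    .option("cloudFiles.schemaLocation", "/tmp/_schemas/events")   # where the inferred schema is tracked
    .load("s3://my-landing-bucket/events/")                        # placeholder landing path
)

(
    stream.writeStream
    .option("checkpointLocation", "/tmp/_checkpoints/events")      # exactly-once bookkeeping
    .trigger(availableNow=True)                                    # process the backlog, then stop
    .toTable("bronze.events")                                      # placeholder Delta target
)
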
gilo12
by New Contributor III
  • 9836 Views
  • 3 replies
  • 2 kudos

merge into deletes from SOURCE

I am using the following query to make an upsert: MERGE INTO my_target_table AS target USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_table) AS source ON target.name = 'some_name' AND target.address = 'some_address' WHEN MATCHED AN...

Latest Reply
gilo12
New Contributor III
  • 2 kudos

I was using a view for my_source_table; once I changed that to be a table, the issue stopped. That unblocked me, but I think Databricks has a bug with using MERGE INTO from a VIEW.

2 More Replies
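A minimal sketch of gilo12's workaround (all names and the MATCHED / NOT MATCHED clauses are placeholders, since the original query is truncated): snapshot the view into a table first, then MERGE from that table instead of the view.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Materialize the view into a real table so MERGE reads from a table, not a view.
spark.sql("CREATE OR REPLACE TABLE my_source_snapshot AS SELECT * FROM my_source_view")

spark.sql("""
    MERGE INTO my_target_table AS target
    USING (SELECT MAX(__my_timestamp) AS checkpoint FROM my_source_snapshot) AS source
    ON target.name = 'some_name' AND target.address = 'some_address'
    WHEN MATCHED THEN UPDATE SET target.checkpoint = source.checkpoint
    WHEN NOT MATCHED THEN INSERT (name, address, checkpoint)
      VALUES ('some_name', 'some_address', source.checkpoint)
""")
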
PK225
by New Contributor III
  • 1483 Views
  • 2 replies
  • 1 kudos
Latest Reply
Vartika
Databricks Employee
  • 1 kudos

Hi @Pavan Kumar​, hope you are well. Just wanted to see if you were able to find an answer to your question, and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!

1 More Replies
James1100
by New Contributor II
  • 1678 Views
  • 1 reply
  • 1 kudos

Databricks connect to GCS

Hi, would like to ask if anyone knows how to connect to GCS - basically, read a CSV file from a GCS bucket. I have no issue connecting to Data Lake. Thank you so much in advance.

Latest Reply
Vartika
Databricks Employee
  • 1 kudos

Hi @James C​, just checking in. If @Kaniz Fatma​'s answer helped, would you let us know and mark the answer as best? If not, would you be happy to give us more information? We'd love to hear from you. Cheers!

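A minimal sketch of reading a CSV from GCS (bucket and path are placeholders), assuming the cluster already has access to the bucket, for example through a service-account key in the cluster's Spark config or a Unity Catalog external location.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.read.format("csv")
    .option("header", "true")       # first row holds column names
    .option("inferSchema", "true")  # let Spark guess column types
    .load("gs://my-bucket/path/to/file.csv")
)
df.show(5)
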

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
