Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

vs_29
by New Contributor II
  • 2349 Views
  • 2 replies
  • 3 kudos

Custom Log4j logs are not being written to the DBFS storage.

 I used a custom Log4j appender to write custom logs through an init script, and I can see the custom log file in the driver logs, but Databricks is not writing those custom logs to DBFS. I have configured the logging destination in the Advanced sec...

Tags: init script, driver logs, logs destination
Latest Reply
Kaniz_Fatma
Community Manager

Hi @VIjeet Sharma, we haven’t heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you. Otherwise, if you have found a solution, please share it with the community, as it can be helpful to...

1 More Replies
RohitKulkarni
by Contributor II
  • 6485 Views
  • 6 replies
  • 6 kudos

External table format issue in Databricks

I am new to Databricks. I am trying to create an external table in Databricks with the below format: CREATE EXTERNAL TABLE Salesforce.Account( Id string, IsDeleted bigint, Name string, Type string, RecordTypeId string, ParentId string, ShippingSt...

Latest Reply
AmitA1
Contributor

Databricks is awesome if you have SQL knowledge. I just came across a problem in my project, and Databricks helped me a lot, for example using a low watermark to hold the load success date.

5 More Replies
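The CREATE statement in the question is cut off above. As a hedged sketch (the column list is abbreviated and the storage path is hypothetical): on Databricks, a common fix is to declare the table with a data source and a `LOCATION` clause, which makes it unmanaged (external):

```sql
-- Hedged sketch: columns abbreviated, LOCATION path hypothetical.
-- Specifying a data source and a LOCATION makes the table external (unmanaged).
CREATE TABLE Salesforce.Account (
  Id           STRING,
  IsDeleted    BIGINT,
  Name         STRING,
  Type         STRING,
  RecordTypeId STRING,
  ParentId     STRING
)
USING PARQUET
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/salesforce/account';
```

Dropping the table afterwards removes only the metastore entry, not the files at the location.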
jt
by New Contributor III
  • 2463 Views
  • 3 replies
  • 3 kudos

Collapse partial code in a large cell?

In Databricks notebooks we have SQL cells that are over 700 lines long. Is there a way to collapse a portion of the code instead of scrolling? I'm looking for something similar to what exists in Netezza, "--region" and "--end region", where anything between those...

Latest Reply
Anonymous
Not applicable

Hi @james t, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
KrishZ
by Contributor
  • 4206 Views
  • 4 replies
  • 1 kudos

How to print the path of a .py file or a notebook?

I have stored a test.py in DBFS at the below location: "/dbfs/FileStore/shared_uploads/krishna@company.com/Project_Folder/test.py". I have a print statement in test.py that says print( os.getcwd() ), and it prints '/databricks/drive...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III

Hey @Krishna Zanwar, please use the below code; this will work. Since you want the specific location, you can create custom code and format the path using a Python formatter to get the desired result.

3 More Replies
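The reply's snippet is cut off, but the gap the question describes can be sketched in plain Python: `os.getcwd()` reports the process's working directory (on a Databricks driver that is `/databricks/driver`), while `__file__` is the path of the executing .py file itself. This is a minimal sketch for a .py script; a notebook has no `__file__` and would need the notebook context instead:

```python
import os

def script_location():
    # os.getcwd() is the interpreter's working directory -- on a Databricks
    # driver that is /databricks/driver, no matter where the script lives.
    # __file__ is the path of this .py file itself, which is what the
    # question is after.
    return os.path.abspath(__file__)
```

For the example in the question, running test.py as a script would make `script_location()` return the "/dbfs/FileStore/..." path rather than the working directory.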
cmilligan
by Contributor II
  • 2418 Views
  • 1 replies
  • 2 kudos

Resolved! org.apache.http.conn.ConnectTimeoutException: what does this mean and how can we resolve it?

My team has been running into this error pretty frequently on one of our larger jobs. I've set our retry policy to 5 and that seems to fix it and keep the job going. It seems like it's unable to pick up the task immediately but can after it's complete...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III

Hey @Coleman Milligan, I also faced this type of issue many times. You can add the below configuration to your cluster and it should work: spark.executor.heartbeatInterval 60s and spark.network.timeout 120s. For more details, you can explore this doc - https...

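The two settings run together in the reply, written out as they would go into the cluster's Spark config box (the values are the ones the reply suggests; the heartbeat interval must stay well below the network timeout):

```
# How often executors heartbeat to the driver
spark.executor.heartbeatInterval 60s
# Upper bound for network interactions; must exceed the heartbeat interval
spark.network.timeout 120s
```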
auser85
by New Contributor III
  • 3845 Views
  • 2 replies
  • 2 kudos

cannot convert Parquet type INT64 to Photon type double

I am trying to read in files via the COPY INTO command, but lately I am getting this error for a certain subset of the data: `Error while reading file: Schema conversion error: cannot convert Parquet type INT64 to Photon type double`. These are my option...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III

Hey @Andrew Fogarty, I also faced the same issue when I moved from the 7.3 LTS runtime to a higher runtime version. To mitigate it, you can use the below cluster configuration: spark.sql.storeAssignmentPolicy LEGACY, spark.sql.parquet.binaryAsS...

1 More Replies
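The reply's first setting as a cluster Spark config line (its second setting is truncated above and is left out here). `spark.sql.storeAssignmentPolicy` controls how Spark casts values whose type differs from the target column; `LEGACY` restores the permissive pre-Spark-3.0 behaviour:

```
# Permissive casting between mismatched types (pre-Spark-3.0 behaviour)
spark.sql.storeAssignmentPolicy LEGACY
```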
Anonymous
by Not applicable
  • 1931 Views
  • 4 replies
  • 0 kudos

Resolved! Safari problems after the maintenance on 12/9/2022

I'm experiencing some problems on Safari 15.3 (macOS). I would like to know if I am alone in this and how to fix it (if I can). This is in Databricks SQL and Data Science and Engineering (in this case, Workflows).

Latest Reply
Anonymous
Not applicable

The problem is fixed; everything works as usual.

3 More Replies
Aidonis
by New Contributor III
  • 10300 Views
  • 2 replies
  • 2 kudos

Resolved! Load Data from Sharepoint Site to Delta table in Databricks

Hi, new to the community, so sorry if my post lacks detail. I am trying to create a connection between Databricks and a SharePoint site to read Excel files into a Delta table. I can see there is a Fivetran partner connection that we can use to get sharepo...

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Hi @Aidan Heffernan, you can use the SharePoint REST API to connect from Databricks. Please refer to the below code:

from office365.sharepoint.client_context import ClientContext
from office365.runtime.auth.client_credential import ClientCredential
sharep...

1 More Replies
ossinova
by Contributor II
  • 1741 Views
  • 1 replies
  • 1 kudos

Jobs failing with repl error

Recently my Databricks jobs have failed with the error message: Failure starting repl. Try detaching and re-attaching the notebook. java.lang.Exception: Python repl did not start in 30 seconds seconds. at com.databricks.backend.daemon.driver.Ipyker...

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Yes, you can retry; if it still isn't resolved, raise a support ticket with Databricks.

User16826992666
by Valued Contributor
  • 15643 Views
  • 2 replies
  • 2 kudos

Can I query my Delta tables with PowerBI?

I would like to connect to the Delta tables I have created with PowerBI to use for reporting. Is it possible to do this with Databricks or do I have to write my data to some other serving layer?

Latest Reply
gbrueckl
Contributor II

If you want to read your Delta Lake table directly from storage, without needing a Databricks cluster up and running, you can also use the official Power BI connector for Delta Lake: https://github.com/delta-io/connectors/tree/m...

1 More Replies
KVNARK
by Honored Contributor II
  • 1932 Views
  • 5 replies
  • 11 kudos

Resolved! Delta table output to Databricks SQL

How to write a Delta table output to Databricks SQL for analysis purposes?

Latest Reply
Kaniz_Fatma
Community Manager

Hi @KVNARK, we haven’t heard from you since the last responses from @Brian Labrom and @Ajay Pandey, and I was checking back to see if their suggestions helped you. Otherwise, if you have a solution, please do share it with the community, as it can ...

4 More Replies
KVNARK
by Honored Contributor II
  • 1035 Views
  • 1 replies
  • 5 kudos

Resolved! Trigger another .py file using 2 .py files.

Hi, I have 3 .py files: a.py, b.py and c.py. After joining a.py and b.py, based on the output that I get, I need to trigger c.py.

Latest Reply
Ajay-Pandey
Esteemed Contributor III

Hi @KVNARK, please refer to the below link; it will help with this: Link

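The question above (run a.py and b.py, then trigger c.py depending on their output) can be sketched in plain Python with the standard library. The "OK" marker and the helper functions are illustrative assumptions, not anything from the thread; on Databricks the same chaining is often done with dbutils.notebook.run or job task dependencies instead.

```python
import subprocess
import sys

def run_script(path):
    """Run a .py file in a fresh interpreter and return its stripped stdout."""
    result = subprocess.run([sys.executable, path],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()

def trigger_if_ready(a_path, b_path, c_path, expected="OK"):
    """Run a and b; trigger c only when both printed the expected marker."""
    if run_script(a_path) == expected and run_script(b_path) == expected:
        return run_script(c_path)
    return None  # c.py was not triggered
```

If `trigger_if_ready` returns None, c.py was never run.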
