Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

MonishKumar
by New Contributor
  • 2577 Views
  • 1 replies
  • 0 kudos

SFTP - JSchException: Algorithm negotiation fail

When I tried to read an SFTP-hosted CSV file in Databricks, I got the error "JSchException: Algorithm negotiation fail". Code: var df = spark.read.options(Map("header"->"true","host"->"20.118.190.30","username"->"user","password"->"pass","fileForm...

Data Engineering
SFTP Spark SCALA Databricks CSV JSCH
Latest Reply
User16752239289
Databricks Employee
  • 0 kudos

@MonishKumar Could you provide the entire exception? From the one-line error message, I suspect the SSL cipher suites required by the SFTP server are not available on the cluster. You can run the below to get the cipher suites that sftp require...

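"Algorithm negotiation fail" means the client (JSch) and the SFTP server share no common key-exchange, host-key, cipher, or MAC algorithm. One way to see exactly what the server offers is to read its SSH_MSG_KEXINIT message, whose layout is defined in RFC 4253 §7.1. The sketch below parses only that payload (the network handshake needed to obtain it from a live server is out of scope here, so nothing about the poster's server is assumed):

```python
import struct

# The ten name-lists of SSH_MSG_KEXINIT, in wire order (RFC 4253 §7.1).
KEXINIT_FIELDS = [
    "kex_algorithms", "server_host_key_algorithms",
    "ciphers_client_to_server", "ciphers_server_to_client",
    "macs_client_to_server", "macs_server_to_client",
    "compression_client_to_server", "compression_server_to_client",
    "languages_client_to_server", "languages_server_to_client",
]

def parse_kexinit(payload: bytes) -> dict:
    """Parse an SSH_MSG_KEXINIT payload into its algorithm name-lists."""
    assert payload[0] == 20, "not a KEXINIT message (type byte must be 20)"
    offset = 1 + 16  # skip message-type byte and 16-byte random cookie
    result = {}
    for field in KEXINIT_FIELDS:
        (length,) = struct.unpack_from(">I", payload, offset)  # uint32 list length
        offset += 4
        names = payload[offset:offset + length].decode("ascii")
        result[field] = names.split(",") if names else []
        offset += length
    return result
```

Comparing `kex_algorithms` and the cipher lists against what your JSch version supports usually pinpoints the mismatch (older JSch builds lack newer algorithms such as rsa-sha2-256).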
Manjula_Ganesap
by Contributor
  • 1744 Views
  • 1 replies
  • 0 kudos

Delta Live Table Graph different with no change in Notebook code

I have DLT code that creates 40+ bronze tables. The tables are created on top of the latest parquet files for each of those tables. While executing the pipeline, I sometimes notice that the graph is different from the regular one I see. I do not under...

Manjula_Ganesap_0-1694001824330.png Manjula_Ganesap_1-1694001865082.png
Latest Reply
Manjula_Ganesap
Contributor
  • 0 kudos

@Retired_mod - Thank you for your response. There is no change in the table dependencies. The code to create the individual raw tables looks like this: The input to this is always the same 40 tables, with only the underlying parquet file changing. I c...

FabriceDeseyn
by Contributor
  • 1361 Views
  • 0 replies
  • 0 kudos

Merge breaking persistence of DataFrame

Hi all, in the minimal example below you can see that executing a merge statement triggers recomputation of a persisted DataFrame. How does this happen? from delta.tables import DeltaTable table_name = "hive_metastore.default.test_table" # initializ...

FabriceDeseyn_1-1694011507567.png
RP2007
by New Contributor
  • 2215 Views
  • 2 replies
  • 1 kudos

I would like to know why I am getting this error when I tried to earn badges for Lakehouse Fundamentals

I would like to know why I am getting this error when I tried to earn badges for Lakehouse Fundamentals. I can't access the quiz page. Can you please help with this? Getting the below error: 403 FORBIDDEN - You don't have permission to access this page 2023-08-...

Latest Reply
APadmanabhan
Databricks Employee
  • 1 kudos

Hello Both, This link would be of help.

1 More Replies
guostong
by New Contributor III
  • 6988 Views
  • 2 replies
  • 0 kudos

Resolved! how to set jobs permission with rest api

I created the job with the CLI but cannot set the permission with the CLI, so I have to use the REST API to set permissions: https://docs.databricks.com/api/workspace/permissions/set Below is my command on Windows to set the permission: curl -X PUT https://my-workspace-url.azureda...

Latest Reply
guostong
New Contributor III
  • 0 kudos

Thank you. The permission list in the request must be the whole list, not just the new permission.

1 More Replies
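The accepted takeaway above is that the Permissions API's PUT endpoint replaces the job's entire access-control list, so the payload must repeat every grant you want to keep. A minimal sketch of the request body (workspace URL, job ID, and principals are hypothetical):

```python
import json

job_id = 123  # hypothetical job ID

# The full list of grants, including existing ones; PUT overwrites whatever
# was there before, so omitting an entry effectively revokes it.
payload = {
    "access_control_list": [
        {"user_name": "owner@example.com", "permission_level": "IS_OWNER"},
        {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
        {"user_name": "new.user@example.com", "permission_level": "CAN_VIEW"},
    ]
}
body = json.dumps(payload)

# On a real workspace (not runnable here):
# curl -X PUT https://<workspace-url>/api/2.0/permissions/jobs/123 \
#      -H "Authorization: Bearer <token>" -d "$body"
```

If you only want to add a grant without restating the list, the PATCH variant of the same endpoint merges instead of replacing.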
parimalpatil28
by New Contributor III
  • 1128 Views
  • 0 replies
  • 0 kudos

Uploading a file to DBFS using "/api/2.0/dbfs/put"

Hello, I am trying to upload a file from a local Linux machine to DBFS using request.post(<URI>, <Headers>, params={"path": "dbfs:/tmp", "contents": local_path}) and getting the error b'{"error_code":"INVALID_PARAMETER_VALUE","message":"You must provid...

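The call above passes the file path and contents as query parameters, but `/api/2.0/dbfs/put` expects a JSON body whose `contents` field is the base64-encoded file data, and whose `path` names the target file itself, not just the directory. A sketch of building that body (the target path is hypothetical; note the inline-contents route is documented to cap out at about 1 MB, above which the streaming create/add-block/close endpoints are needed):

```python
import base64
import json

# Stand-in for open(local_path, "rb").read(); the real file bytes go here.
local_bytes = b"id,name\n1,alpha\n"

body = json.dumps({
    "path": "/tmp/data.csv",  # hypothetical: must include the file name
    "contents": base64.b64encode(local_bytes).decode("ascii"),
    "overwrite": True,
})

# On a real workspace (not runnable here):
# requests.post(f"https://<workspace-url>/api/2.0/dbfs/put",
#               headers={"Authorization": f"Bearer {token}"}, data=body)
```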
804082
by New Contributor III
  • 2159 Views
  • 2 replies
  • 1 kudos

Backup/Export Databricks SQL Column Comments

We've had users make comments on tables/columns throughout Databricks SQL using the Data Explorer UI. I'm looking for a way to back up these comments, but when I run DESCRIBE TABLE, the comment column is always null despite being non-null in Data Expl...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@804082 - Markdown does not render when returned by DESCRIBE statements; we can view them in the Data Explorer UI. Reference: https://docs.databricks.com/en/data/markdown-data-comments.html#document-data-with-markdown-comments

1 More Replies
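One practical backup format for comments is replayable DDL. Assuming the comments have already been collected into a dict (for example from DESCRIBE TABLE output, or from information_schema.columns on Unity Catalog), a small helper can emit the ALTER TABLE statements that restore them; the table and column names below are hypothetical:

```python
def comment_backup_ddl(table: str, column_comments: dict) -> list:
    """Emit ALTER TABLE statements that re-apply column comments.

    Sketch only: assumes `column_comments` maps column name -> comment text,
    and that single quotes in comments need escaping for Databricks SQL.
    """
    stmts = []
    for column, comment in column_comments.items():
        escaped = comment.replace("'", "\\'")
        stmts.append(
            f"ALTER TABLE {table} ALTER COLUMN {column} COMMENT '{escaped}'"
        )
    return stmts
```

Running the emitted statements against a restored workspace replays the comments without touching the data.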
venkat94
by New Contributor
  • 986 Views
  • 1 replies
  • 0 kudos

Databricks Job Runs API

/api/2.1/jobs/runs/list currently returns all job runs executed within the specified time window we provide as input. Is there any way to get only specific jobs by their status (only success)?

Data Engineering
API
azure
Databricks
jobruns
Latest Reply
BilalAslamDbrx
Databricks Employee
  • 0 kudos

@venkat94 thanks for the feedback. We are working on updating the Jobs Runs API so you can filter runs by status e.g. only success. Stay tuned in the next couple of months.

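Until server-side status filtering lands, the usual workaround is to filter the runs list client-side on the terminal `result_state` field of each run's `state` object. A sketch over the response shape returned by `/api/2.1/jobs/runs/list`:

```python
def successful_runs(runs_response: dict) -> list:
    """Keep only runs whose terminal result_state is SUCCESS.

    Client-side workaround sketch; `runs_response` is the parsed JSON from
    /api/2.1/jobs/runs/list, which holds a "runs" array where each run has a
    "state" object with "result_state" once the run has finished.
    """
    return [
        run for run in runs_response.get("runs", [])
        if run.get("state", {}).get("result_state") == "SUCCESS"
    ]
```

Remember the endpoint is paginated, so apply the filter per page while following `has_more`.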
dbdude
by New Contributor II
  • 1681 Views
  • 1 replies
  • 0 kudos

Re-running DLT Pipeline Does Not Add Data After Delete

I am using DLT and Unity Catalog with managed tables. The first table in this pipeline is a live streaming table. I first did this in the SQL editor: DELETE FROM my_table; This appears to have deleted all the records, which I wanted, since now when...

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 0 kudos

@Mo is correct! 

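The underlying behavior: a DLT streaming table tracks what it has already ingested in the pipeline's checkpoint, so manually deleting rows with SQL does not make the next update re-read old source data. Reprocessing everything requires a full refresh, which can be triggered through the Pipelines API; the pipeline ID below is hypothetical:

```python
import json

pipeline_id = "1234-abcd"  # hypothetical pipeline ID

# A full-refresh update truncates the streaming tables and reprocesses the
# sources from scratch, rather than resuming from the checkpoint.
body = json.dumps({"full_refresh": True})

# On a real workspace (not runnable here):
# requests.post(f"https://<workspace-url>/api/2.0/pipelines/{pipeline_id}/updates",
#               headers={"Authorization": f"Bearer {token}"}, data=body)
```

The same thing is available in the UI as "Full refresh all" on the pipeline's start button.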
Julie285720
by New Contributor
  • 1876 Views
  • 0 replies
  • 0 kudos

SQL Merge condition

Hi guys, I have a question regarding this merge step. I am a beginner with Databricks, trying to study data warehousing, but couldn't figure it out by myself and need your help with it. Appreciate your help in advance. I got this questi...

Julie285720_0-1693869289599.png
XavierPereVives
by New Contributor II
  • 2201 Views
  • 1 replies
  • 0 kudos

Azure Shared Clusters - Py4J Security Exception on non-whitelisted classes

When I try to use a third-party JAR on an Azure shared cluster - one that is installed via Maven and that I can successfully import - I get the following message: py4j.security.Py4JSecurityException: Method public static org.apache.spark.sql.Column com.da...

Latest Reply
XavierPereVives
New Contributor II
  • 0 kudos

Thanks Kaniz. I must use a shared cluster because I'm reading from a DLT table stored in Unity Catalog: https://docs.databricks.com/en/data-governance/unity-catalog/compute.html My understanding is that shared clusters are enforcing the Py4J policy I ...

alemo
by New Contributor III
  • 2743 Views
  • 3 replies
  • 1 kudos

Delta Live Tables UC Kinesis: options overwriteschema, ignorechanges not supported for data source

I am trying to build a DLT pipeline in UC with Kinesis as the producer. My first table looks like: @dlt.create_table( table_properties={ "pipelines.autoOptimize.managed": "true" }, spark_conf={"spark.databricks.delta.schema.autoMerge.enabled": "true"},)def feed_chu...

Latest Reply
Corbin
Databricks Employee
  • 1 kudos

If you use the "Preview" Channel in the "Advanced" section of the DLT Pipeline, this error should resolve itself. This fix is planned to make it into the "Current" channel by Aug 31, 2023

2 More Replies
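The fix the employee describes - switching the pipeline to the Preview channel - lives in the "Advanced" section of the pipeline UI, or in the pipeline's settings JSON as the `channel` field. A sketch of the relevant fragment (the pipeline name is hypothetical; other settings are omitted):

```json
{
  "name": "kinesis-dlt-pipeline",
  "channel": "PREVIEW"
}
```

Once the fix reaches the Current channel, the field can be set back to `"CURRENT"`.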
vroste
by New Contributor III
  • 2300 Views
  • 0 replies
  • 0 kudos

Delta Live Tables maintenance schedule

I have a DLT pipeline that runs every day and an automatically executed maintenance job that runs within 24 hours of each run. The maintenance operations are costly; is it possible to change the schedule to once a week or so?

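The maintenance schedule itself was not configurable at the time of this thread, but individual tables can opt out of the costly automatic OPTIMIZE step via the table property that appears in the Kinesis post above. Treat the exact property name and behavior as an assumption to verify against current DLT docs:

```python
# Assumed property (seen set to "true" in another post on this page):
# setting it to "false" should exclude the table from automatic optimization
# performed by the maintenance run.
maintenance_properties = {
    "pipelines.autoOptimize.managed": "false",
}

# Applied in the table definition on a real pipeline (not runnable here):
# import dlt
# @dlt.table(table_properties=maintenance_properties)
# def my_table():
#     ...
```

VACUUM retention can be tuned separately through the usual Delta table properties if the maintenance cost comes from file cleanup rather than compaction.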
scvbelle
by New Contributor III
  • 4176 Views
  • 3 replies
  • 3 kudos

Resolved! DLT failure: ABFS does not allow files or directories to end with a dot

My DLT pipeline outlined below, which generically cleans identifier tables, successfully creates the initial streaming tables from the append-only sources but then fails when trying to create the second set of cleaned tables with the following: It'**bleep** cl...

Data Engineering
abfss
azure
dlt
engineering
Latest Reply
Priyanka_Biswas
Databricks Employee
  • 3 kudos

Hi @scvbelle, the error message you're seeing is an IllegalArgumentException caused by the restriction in Azure Blob File System (ABFS) that files or directories cannot end with a dot. This error is thrown by the trailingPeriod...

2 More Replies
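Since ABFS rejects any path segment ending in a dot, a generic cleaning pipeline like the one above can defensively sanitize identifiers before they flow into table or file paths. A minimal sketch (the replacement character is an arbitrary choice):

```python
def sanitize_abfs_segment(name: str, replacement: str = "_") -> str:
    """Replace a trailing dot so the name is a legal ABFS path segment.

    ABFS disallows files/directories ending in '.', so identifiers derived
    from source data need this cleanup before being used in storage paths.
    """
    return name[:-1] + replacement if name.endswith(".") else name
```

Applying this to every derived table name at the top of the pipeline avoids the failure at the second-stage tables.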

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group