<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Databricks Delta Lake Sink Connector in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25236#M17528</link>
    <description>&lt;P&gt;I am trying to use the Databricks Delta Lake Sink Connector (Confluent Cloud) to write to S3.&lt;/P&gt;&lt;P&gt;The connector fails at startup with the following error. Any help would be appreciated.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.deltaLakeConnection(DatabricksDeltaLakeSinkTask.java:253)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.start(DatabricksDeltaLakeSinkTask.java:88)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:305)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:193)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
java.lang.Thread.run(Thread.java:750)&lt;/P&gt;&lt;P&gt;Caused by: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.handleTTransportException(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DowloadableFetchClient.handleTTransportException(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.openSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.&amp;lt;init&amp;gt;(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DowloadableFetchClient.&amp;lt;init&amp;gt;(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;... 4 more&lt;/P&gt;&lt;P&gt;Caused by: com.simba.spark.support.exceptions.ErrorException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. 
Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;... 14 more&lt;/P&gt;&lt;P&gt;Caused by: com.simba.spark.jdbc42.internal.apache.thrift.transport.TTransportException: HTTP Response code: 403, Error message: Unknown&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.handleHeaderErrorMessage(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.handleErrorResponse(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.flushUsingHttpClient(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.flush(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:147)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.send_OpenSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:139)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.callOpenSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.access$1700(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.executeWithRetry(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.OpenSession(Unknown Source)&lt;/P&gt;</description>
    <pubDate>Fri, 18 Mar 2022 11:51:39 GMT</pubDate>
    <dc:creator>Bency</dc:creator>
    <dc:date>2022-03-18T11:51:39Z</dc:date>
    <item>
      <title>Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25236#M17528</link>
      <description>&lt;P&gt;I am trying to use the Databricks Delta Lake Sink Connector (Confluent Cloud) to write to S3.&lt;/P&gt;&lt;P&gt;The connector fails at startup with the following error. Any help would be appreciated.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;org.apache.kafka.connect.errors.ConnectException: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.deltaLakeConnection(DatabricksDeltaLakeSinkTask.java:253)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at io.confluent.connect.databricks.deltalake.DatabricksDeltaLakeSinkTask.start(DatabricksDeltaLakeSinkTask.java:88)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:305)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:193)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
java.lang.Thread.run(Thread.java:750)&lt;/P&gt;&lt;P&gt;Caused by: java.sql.SQLException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.handleTTransportException(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DowloadableFetchClient.handleTTransportException(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.openSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2Client.&amp;lt;init&amp;gt;(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DowloadableFetchClient.&amp;lt;init&amp;gt;(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.jdbc.DownloadableFetchClientFactory.createClient(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.core.HiveJDBCCommonConnection.establishConnection(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.spark.core.SparkJDBCConnection.establishConnection(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc.core.LoginTimeoutConnection$1.call(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;... 4 more&lt;/P&gt;&lt;P&gt;Caused by: com.simba.spark.support.exceptions.ErrorException: [Simba][SparkJDBCDriver](500593) Communication link failure. Failed to connect to server. 
Reason: HTTP Response code: 403, Error message: Unknown.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;... 14 more&lt;/P&gt;&lt;P&gt;Caused by: com.simba.spark.jdbc42.internal.apache.thrift.transport.TTransportException: HTTP Response code: 403, Error message: Unknown&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.handleHeaderErrorMessage(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.handleErrorResponse(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.flushUsingHttpClient(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.TETHttpClient.flush(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.send_OpenSession(TCLIService.java:147)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.send_OpenSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.jdbc42.internal.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:139)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.callOpenSession(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.access$1700(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper$18.clientCall(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.executeWithRetry(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;at com.simba.spark.hivecommon.api.HS2ClientWrapper.OpenSession(Unknown Source)&lt;/P&gt;</description>
      <pubDate>Fri, 18 Mar 2022 11:51:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25236#M17528</guid>
      <dc:creator>Bency</dc:creator>
      <dc:date>2022-03-18T11:51:39Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25237#M17529</link>
      <description>&lt;P&gt;It is a connection issue. You can download the JDBC/ODBC driver to your local computer and validate all of the connection settings, or test the connection with a BI tool such as Power BI. If the settings are OK, one common cause is that your IP is blocked by security groups.&lt;/P&gt;</description>
      <pubDate>Fri, 18 Mar 2022 12:06:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25237#M17529</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-03-18T12:06:37Z</dc:date>
    </item>
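The reply above suggests validating the JDBC connection settings outside the connector. As a minimal stdlib-only sketch, the helpers below assemble the HTTPS Thrift endpoint and a Simba-style Spark JDBC URL so the settings can be eyeballed or probed independently; the hostname and HTTP path shown are placeholders, and a real connectivity test would use the Simba Spark JDBC driver or a SQL client as described in the thread.

```python
# Sketch: assemble the endpoints the Simba Spark JDBC driver connects to,
# so connection settings can be validated outside the Kafka connector.
# Hostname, HTTP path, and token below are placeholder assumptions.

def thrift_http_endpoint(server_hostname: str, http_path: str, port: int = 443) -> str:
    """Build the HTTPS URL the driver posts Thrift RPCs to."""
    if not http_path.startswith("/"):
        http_path = "/" + http_path
    return f"https://{server_hostname}:{port}{http_path}"

def jdbc_url(server_hostname: str, http_path: str, port: int = 443) -> str:
    """Assemble a Spark JDBC URL in the shape Databricks documents for the Simba driver."""
    return (
        f"jdbc:spark://{server_hostname}:{port}/default;"
        f"transportMode=http;ssl=1;httpPath={http_path};"
        "AuthMech=3;UID=token;PWD=<personal-access-token>"
    )

if __name__ == "__main__":
    host = "dbc-example.cloud.databricks.com"    # placeholder workspace host
    path = "sql/protocolv1/o/0/0123-456789-abc"  # placeholder cluster HTTP path
    print(thrift_http_endpoint(host, path))
    print(jdbc_url(host, path))
```

Hitting the Thrift endpoint with a valid token should not return 403; a 403 from a correct URL usually points at IP access lists or security groups rather than the connector itself.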
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25238#M17530</link>
      <description>&lt;P&gt;Hi @Hubert Dudek​&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for the response!&lt;/P&gt;&lt;P&gt;Yes, I did try it with a SQL client and the Simba Spark JDBC driver, and it works fine. But from the EC2 box where we run our Kafka connector, it fails with the above error. Is there any permission that has to be granted on the Databricks side (other than just using a user's token)?&lt;/P&gt;</description>
      <pubDate>Fri, 18 Mar 2022 12:09:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25238#M17530</guid>
      <dc:creator>Bency</dc:creator>
      <dc:date>2022-03-18T12:09:56Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25239#M17531</link>
      <description>&lt;P&gt;There is also a staging S3 bucket in the configuration; you can check whether anything is being written there. It can be an issue with access to S3 as well.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Please also check that everything is in the same region. The problem is that this error doesn't provide much information.&lt;/P&gt;</description>
      <pubDate>Fri, 18 Mar 2022 12:21:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25239#M17531</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-03-18T12:21:35Z</dc:date>
    </item>
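The checks suggested above (staging bucket present, everything in one region) can be sketched as a pure config validation before touching AWS at all. The key names below are hypothetical illustrations, not the connector's actual property names:

```python
# Hypothetical sanity check for a Delta Lake sink connector config:
# flags a missing staging S3 bucket and mismatched region settings.
def validate_sink_config(config: dict) -> list[str]:
    problems = []
    if not config.get("staging.bucket.name"):
        problems.append("no staging S3 bucket configured")
    # Collect whichever region keys are present (hypothetical key names).
    regions = {
        key: config[key]
        for key in ("s3.region", "databricks.region", "connector.region")
        if config.get(key)
    }
    if len(set(regions.values())) > 1:
        problems.append(f"region mismatch: {regions}")
    return problems
```

Running this over the connector config before deployment surfaces the two misconfigurations the reply calls out, which the driver's 403 otherwise hides.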
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25241#M17533</link>
      <description>&lt;P&gt;Hi @Kaniz Fatma​, yes we did; it looks like it was indeed a whitelisting issue. Thanks!&lt;/P&gt;&lt;P&gt;@Hubert Dudek​&amp;nbsp;@Kaniz Fatma​&lt;/P&gt;</description>
      <pubDate>Thu, 24 Mar 2022 18:53:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25241#M17533</guid>
      <dc:creator>Bency</dc:creator>
      <dc:date>2022-03-24T18:53:07Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25242#M17534</link>
      <description>&lt;P&gt;Great to hear. Could you choose my answer as the best one?&lt;/P&gt;</description>
      <pubDate>Thu, 24 Mar 2022 19:06:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25242#M17534</guid>
      <dc:creator>Hubert-Dudek</dc:creator>
      <dc:date>2022-03-24T19:06:36Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks Delta Lake Sink Connector</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25243#M17535</link>
      <description>&lt;P&gt;@Hubert Dudek​&amp;nbsp;sure, just did. Thanks again!&lt;/P&gt;</description>
      <pubDate>Thu, 24 Mar 2022 19:08:07 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-delta-lake-sink-connector/m-p/25243#M17535</guid>
      <dc:creator>Bency</dc:creator>
      <dc:date>2022-03-24T19:08:07Z</dc:date>
    </item>
  </channel>
</rss>

