<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Delta Lake S3 multi-cluster writes - DynamoDB in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/46285#M431</link>
    <description>&lt;P&gt;Hi there!&lt;/P&gt;&lt;P&gt;I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood.&lt;/P&gt;&lt;P data-unlink="true"&gt;I understand that a &lt;A href="https://docs.delta.io/latest/delta-storage.html#multi-cluster-setup" target="_blank"&gt;DynamoDB table&lt;/A&gt;&amp;nbsp;is used to provide mutual exclusion, but the question is: where is that table located? Is it in the control plane compute or the user's?&lt;/P&gt;&lt;P data-unlink="true"&gt;If it's in the data plane, how can I grant permissions to create/update this specific table?&lt;/P&gt;&lt;P&gt;If it's in the control plane compute, why is it failing with the following?&lt;/P&gt;&lt;LI-CODE lang="python"&gt;Py4JJavaError: An error occurred while calling o476.save.
: com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: The security token included in the request is invalid.&lt;/LI-CODE&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
    <pubDate>Tue, 26 Sep 2023 13:18:38 GMT</pubDate>
    <dc:creator>JonLaRose</dc:creator>
    <dc:date>2023-09-26T13:18:38Z</dc:date>
    <item>
      <title>Delta Lake S3 multi-cluster writes - DynamoDB</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/46285#M431</link>
      <description>&lt;P&gt;Hi there!&lt;/P&gt;&lt;P&gt;I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood.&lt;/P&gt;&lt;P data-unlink="true"&gt;I understand that a &lt;A href="https://docs.delta.io/latest/delta-storage.html#multi-cluster-setup" target="_blank"&gt;DynamoDB table&lt;/A&gt;&amp;nbsp;is used to provide mutual exclusion, but the question is: where is that table located? Is it in the control plane compute or the user's?&lt;/P&gt;&lt;P data-unlink="true"&gt;If it's in the data plane, how can I grant permissions to create/update this specific table?&lt;/P&gt;&lt;P&gt;If it's in the control plane compute, why is it failing with the following?&lt;/P&gt;&lt;LI-CODE lang="python"&gt;Py4JJavaError: An error occurred while calling o476.save.
: com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: The security token included in the request is invalid.&lt;/LI-CODE&gt;&lt;P&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Tue, 26 Sep 2023 13:18:38 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/46285#M431</guid>
      <dc:creator>JonLaRose</dc:creator>
      <dc:date>2023-09-26T13:18:38Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Lake S3 multi-cluster writes - DynamoDB</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/48683#M489</link>
      <description>&lt;P&gt;Thank you &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt;.&lt;/P&gt;&lt;P&gt;Does the S3 Commit service use the S3 endpoint configured for `s3a` (from the Spark session's Hadoop configuration)? If not, is there a way to configure the S3 endpoint that the S3 Commit service uses?&lt;/P&gt;</description>
      <pubDate>Sun, 08 Oct 2023 10:51:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/48683#M489</guid>
      <dc:creator>JonLaRose</dc:creator>
      <dc:date>2023-10-08T10:51:30Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Lake S3 multi-cluster writes - DynamoDB</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/80751#M1427</link>
      <description>&lt;P&gt;Hi, could you please help me here? How can I use this configuration in Databricks?&lt;BR /&gt;So I will maintain my transaction logs there, and in parallel, I can run the delta-rs job.&lt;/P&gt;&lt;LI-CODE lang="python"&gt;spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3DynamoDBLogStore")
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "delta_log")
spark.conf.set("spark.io.delta.storage.S3DynamoDBLogStore.ddb.region", "eu-west-1")&lt;/LI-CODE&gt;</description>
      <pubDate>Fri, 26 Jul 2024 18:08:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-lake-s3-multi-cluster-writes-dynamodb/m-p/80751#M1427</guid>
      <dc:creator>prem14f</dc:creator>
      <dc:date>2024-07-26T18:08:22Z</dc:date>
    </item>
  </channel>
</rss>

