Problem creating external delta table on non-AWS S3 bucket
Get Started Discussions, Databricks Community
Posted by sg-vtc on 2023-10-20 16:00 GMT
https://community.databricks.com/t5/get-started-discussions/problem-creating-external-delta-table-on-non-aws-s3-bucket/m-p/49612#M1619

I am testing Databricks with a non-AWS S3 object storage. I can access the non-AWS S3 bucket by setting these parameters:

    sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "XXXXXXXXXXXXXXXXXXXX")
    sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "XXXXXXXXXXXXXXXXXXXXXXXXXXXX")
    sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "XXXXXXXXXXXX.com")

I can read the CSV files in the bucket:

    spark.read.format("csv").option("inferschema", "true").option("header", "true").option("sep", "|").load("s3://deltalake/10g_csv/reason.csv")

When I try to create an external table from this CSV, I get an "AWS Security Token Service invalid" error. Since I am not using an AWS S3 bucket, is there a way to skip this check?

[screenshot: sgvtc_0-1697817308224.png - the STS error]

I can see that Databricks created the parquet file and the _delta_log folder in the external bucket location, but it did not complete the delta table creation: it did not create 00000000000000000000.crc and 00000000000000000000.json in the _delta_log folder.

[screenshots: sgvtc_1-1697817308223.png, sgvtc_2-1697817308221.png - bucket contents]

Any suggestion on how to bypass the AWS security token check, given that I am not using an AWS S3 bucket? When I test with Databricks Community Edition, external tables are created successfully in the same non-AWS S3 bucket. Both the Databricks-on-AWS and Community Edition clusters run the same Databricks Runtime, 14.0 (Scala 2.12, Spark 3.5.0).
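
For reference, a minimal sketch of the step that fails. The original post does not show the exact statement used, so the target path and table name below are placeholders, not taken from the thread:

    # Hypothetical reproduction of the failing step (path and table name are placeholders).
    df = (spark.read.format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .option("sep", "|")
          .load("s3://deltalake/10g_csv/reason.csv"))

    # Writing the DataFrame out as an external Delta table (explicit "path" option)
    # is the point where the cluster's AWS credential/STS validation kicks in.
    (df.write.format("delta")
       .mode("overwrite")
       .option("path", "s3://deltalake/10g_delta/reason")
       .saveAsTable("reason_external"))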
Re: Problem creating external delta table on non-AWS s3 bucket
Posted by sg-vtc on 2023-10-27 01:36 GMT
https://community.databricks.com/t5/get-started-discussions/problem-creating-external-delta-table-on-non-aws-s3-bucket/m-p/49960#M1649

Found the solution to disable it. This question can be closed.
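
The poster did not share which setting they changed. For readers hitting the same error, one commonly used configuration for S3-compatible (non-AWS) object stores is to pin the S3A credentials provider to static keys and enable path-style access, so no STS or instance-profile lookup is attempted. This is only an illustrative sketch, not the confirmed fix from this thread:

    # Illustrative only - not confirmed as the poster's fix.
    # Use static access keys instead of the default provider chain (which can fall
    # back to STS / instance-profile credentials that do not exist for a non-AWS endpoint).
    sc._jsc.hadoopConfiguration().set(
        "fs.s3a.aws.credentials.provider",
        "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")
    sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "XXXXXXXXXXXXXXXXXXXX")
    sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "XXXXXXXXXXXXXXXXXXXXXXXXXXXX")
    sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "XXXXXXXXXXXX.com")
    # Many S3-compatible stores also require path-style requests.
    sc._jsc.hadoopConfiguration().set("fs.s3a.path.style.access", "true")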