
Problem creating external delta table on non-AWS s3 bucket

sg-vtc
New Contributor III

I am testing Databricks with a non-AWS S3 object store. I can access the non-AWS S3 bucket by setting these parameters:

# Static credentials and custom endpoint for the S3-compatible store
sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", "XXXXXXXXXXXXXXXXXXXX")
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", "XXXXXXXXXXXXXXXXXXXXXXXXXXXX")
sc._jsc.hadoopConfiguration().set("fs.s3a.endpoint", "XXXXXXXXXXXX.com")
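
For reference, the Hadoop S3A connector also lets you scope these settings to a single bucket (fs.s3a.bucket.<bucket>.<option>), and many non-AWS object stores require path-style access. A minimal sketch, assuming the bucket name deltalake from the paths below; everything else is illustrative:

# Per-bucket overrides, so other buckets keep their own credentials/endpoint
hconf = sc._jsc.hadoopConfiguration()
hconf.set("fs.s3a.bucket.deltalake.access.key", "XXXXXXXXXXXXXXXXXXXX")
hconf.set("fs.s3a.bucket.deltalake.secret.key", "XXXXXXXXXXXXXXXXXXXXXXXXXXXX")
hconf.set("fs.s3a.bucket.deltalake.endpoint", "XXXXXXXXXXXX.com")
# Non-AWS stores often serve buckets at <endpoint>/<bucket> (path style)
# rather than <bucket>.<endpoint> (virtual-host style).
hconf.set("fs.s3a.path.style.access", "true")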

I can read the CSV files in the bucket:

spark.read.format("csv").option("inferSchema", "true").option("header", "true").option("sep", "|").load("s3://deltalake/10g_csv/reason.csv")

When I try to create an external table from this CSV, I get an AWS Security Token Service (STS) invalid-token error. Since I am not using an AWS S3 bucket, is there a way to skip this check?
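
The post does not show the failing DDL. For context, a typical way to create an external Delta table from that CSV would look like the sketch below; the target path and table name are placeholders, not taken from the original post:

# Read the CSV, write it out as Delta, then register an external table on top.
df = (spark.read.format("csv")
      .option("inferSchema", "true")
      .option("header", "true")
      .option("sep", "|")
      .load("s3://deltalake/10g_csv/reason.csv"))
# Placeholder output path for the Delta files
df.write.format("delta").mode("overwrite").save("s3://deltalake/10g_delta/reason")
# Register the external table over that location
spark.sql("CREATE TABLE reason USING DELTA LOCATION 's3://deltalake/10g_delta/reason'")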
(Screenshot: the STS invalid-token error.)
I can see that Databricks created a Parquet file and a _delta_log folder in the external bucket location, but it did not complete the Delta table creation: 00000000000000000000.crc and 00000000000000000000.json were never written to the _delta_log folder.
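
One way to confirm which commit files made it into the log (a quick sketch; the table path is the same placeholder as above):

# List the transaction log; a completed first commit should show both
# 00000000000000000000.json and 00000000000000000000.crc here.
for f in dbutils.fs.ls("s3://deltalake/10g_delta/reason/_delta_log/"):
    print(f.name, f.size)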
(Screenshots: the external bucket location with the Parquet file, and the _delta_log folder contents.)

Any suggestions on how to bypass the AWS security token check, given that I am not using an AWS S3 bucket? When I test with Databricks Community Edition, external tables are created successfully in the same non-AWS S3 bucket. Both the Databricks-on-AWS and Community Edition clusters run the same Databricks Runtime, 14.0 (Scala 2.12, Spark 3.5.0).

1 ACCEPTED SOLUTION

sg-vtc
New Contributor III

Found the solution to disable it. This question can be closed.
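
(The reply does not say which setting was changed. For readers who hit the same error: one documented S3A option for static-key setups is to pin the credentials provider so the connector never falls through to the STS/instance-profile chain. A sketch, assuming the Hadoop S3A connector rather than any Databricks-specific mechanism:)

# Use only the static access/secret keys configured above and skip the
# default provider chain, which can call AWS STS on non-AWS endpoints.
sc._jsc.hadoopConfiguration().set(
    "fs.s3a.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"
)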

