Data Engineering

Forum Posts

by DianGermishuiz1, New Contributor III
  • 1355 Views
  • 4 replies
  • 0 kudos

What is the purpose of the databricks-workspace-stack-lambdazipsbucket S3 bucket created on Databricks AWS Account Provisioning?

An S3 bucket with the prefix "databricks-workspace-stack-lambdazipsbucket" was created by default when I created my AWS Databricks account. It is set to public access. It has one zip file in it called "lambda.zip". What is the purpose of this S3 buck...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Dian Germishuizen, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ans...

3 More Replies
by LidorAbo, New Contributor II
  • 1053 Views
  • 1 reply
  • 0 kudos

Databricks can write to S3 bucket through pandas but not from Spark

Hey, I have a problem with access to an S3 bucket using cross-account bucket permissions; I got the following error. Steps to reproduce: Checking the role associated with the EC2 instance: { "Version": "2012-10-17", "Statement": [ { ...

Latest Reply
Nhan_Nguyen
Valued Contributor
  • 0 kudos

Could you try mapping the S3 bucket location with the Databricks File System, then writing the output to this new location instead of writing directly to the S3 location?

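Besides the mounting approach suggested in the reply, a cross-account Access Denied error often comes down to the instance role's policy missing an S3 action. A minimal, stdlib-only sketch of checking that (the helper name and the set of required actions are illustrative assumptions, not the poster's actual setup):

```python
import json

# Actions a cross-account write to S3 typically needs (an assumption
# for illustration; real requirements depend on the workload).
REQUIRED_ACTIONS = {"s3:GetObject", "s3:PutObject", "s3:ListBucket"}

def missing_s3_actions(policy_json: str) -> set:
    """Return the required S3 actions not granted by any Allow statement."""
    policy = json.loads(policy_json)
    granted = set()
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        granted.update(actions)
    # Wildcards grant everything, so nothing is missing.
    if "s3:*" in granted or "*" in granted:
        return set()
    return REQUIRED_ACTIONS - granted

# Example policy that forgot s3:PutObject:
policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["s3:GetObject", "s3:ListBucket"],
         "Resource": "*"}
    ],
})
print(missing_s3_actions(policy))
```

Running this against the role's actual policy document (fetched with the AWS CLI or boto3) shows at a glance which permission the write path is missing.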
by Navy__jup-frag-, New Contributor
  • 830 Views
  • 2 replies
  • 1 kudos

How to search for an S3 bucket

The S3 buckets are a likely source location for the new EDL builder uploads. Is there a way to search Databricks to find the naming convention for the S3 buckets that have been assigned to our team? We uploaded some files using EDL this morning but...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @James Longstreet, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. T...

1 More Replies
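For questions like the one above, once the bucket names are listed (e.g. via boto3's `list_buckets`, if the account allows it), finding the team's buckets reduces to filtering names against a convention. A small sketch; the `edl-` prefix pattern and the example names are purely hypothetical:

```python
import re

# With boto3 the real names would come from something like:
#   [b["Name"] for b in boto3.client("s3").list_buckets()["Buckets"]]
# Here we filter a hardcoded list instead. The naming pattern below
# is an assumed convention for illustration only.
def buckets_matching(names, pattern=r"^edl-.*-uploads$"):
    """Return bucket names that follow the team's naming convention."""
    rx = re.compile(pattern)
    return [n for n in names if rx.match(n)]

names = ["edl-teamA-uploads", "logs-archive", "edl-teamB-uploads"]
print(buckets_matching(names))
```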
by akshay1, New Contributor II
  • 997 Views
  • 0 replies
  • 2 kudos

Data unloading to S3 bucket from Databricks.

Hi, I am completely new to Databricks and have a task to unload the data from a Databricks table to an S3 location using Java/SQL. Is this possible? If yes, can you please help me?

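One possible route for the question above (a sketch, not a confirmed answer): from Java you can execute SQL over the Databricks JDBC driver, and a `CREATE TABLE ... USING PARQUET LOCATION ...` statement writes a table's rows out to an S3 path. The helper below only builds that statement; the table name, export table name, and bucket path are all placeholders:

```python
# Hypothetical helper: build the SQL that a JDBC client (for example,
# Java using the Databricks JDBC driver) could execute to unload a
# table to S3 as Parquet. All names and paths here are placeholders.
def unload_sql(table: str, s3_path: str) -> str:
    return (
        "CREATE TABLE export_tmp "
        "USING PARQUET "
        f"LOCATION '{s3_path}' "
        f"AS SELECT * FROM {table}"
    )

sql = unload_sql("sales", "s3://my-team-bucket/exports/sales/")
print(sql)
```

The cluster (or SQL warehouse) executing the statement still needs write access to the S3 path, e.g. via an instance profile.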
by 165036, New Contributor III
  • 978 Views
  • 1 reply
  • 1 kudos

Resolved! Mounting of S3 bucket via Terraform is frequently timing out

Summary of the problem: When mounting an S3 bucket via Terraform, the creation process is frequently timing out (running beyond 10 minutes). When I check the Log4j logs in the GP cluster, I see the following error message repeated: ```22/07/26 05:54:43 ER...

Latest Reply
165036
New Contributor III
  • 1 kudos

Solved. See here: https://github.com/databricks/terraform-provider-databricks/issues/1500

by Megan05, New Contributor III
  • 1183 Views
  • 4 replies
  • 1 kudos

Trying to write to S3 bucket but executed code not showing any progress

I am trying to write data from Databricks to an S3 bucket, but when I submit the code, it runs and runs and does not make any progress. I am not getting any errors, and the logs don't seem to recognize I've submitted anything. The cluster also looks un...

Latest Reply
User16753725469
Contributor II
  • 1 kudos

Can you please check the driver's log4j output to see what is happening?

3 More Replies
by Kaniz, Community Manager
  • 2050 Views
  • 1 reply
  • 1 kudos
Latest Reply
Kaniz
Community Manager
  • 1 kudos

We can check using this method:

    import boto3
    from botocore.errorfactory import ClientError

    s3 = boto3.client('s3')
    try:
        s3.head_object(Bucket='bucket_name', Key='file_path')
    except ClientError:
        # Not found
        pass

by User16826990884, New Contributor III
  • 427 Views
  • 0 replies
  • 0 kudos

Encrypt root S3 bucket

This is a 2-part question: How do I go about encrypting an existing root S3 bucket? Will this impact my Databricks environment (resources not being accessible, performance issues, etc.)?

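For the first part, one common route (an assumption about the setup, not Databricks-specific guidance) is enabling default server-side encryption on the existing bucket with `aws s3api put-bucket-encryption --bucket <root-bucket> --server-side-encryption-configuration file://encryption.json`, where `encryption.json` holds a configuration like:

```json
{
  "Rules": [
    {
      "ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "AES256"
      }
    }
  ]
}
```

Note that default encryption only applies to objects written after it is enabled; existing objects stay unencrypted until they are rewritten (e.g. copied in place).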