- 926 Views
- 0 replies
- 0 kudos
Hi there, I am trying to build a Delta Live Tables pipeline that ingests gzip-compressed archives as they're uploaded to S3. The archives contain two files in a proprietary format, and one is needed to determine how to parse the other. Once the file co...
by mihai • New Contributor III
- 6971 Views
- 7 replies
- 31 kudos
Hello, I have been trying to deploy a workspace on AWS using the quickstart feature, and I have been running into a problem where the stack fails when trying to create a resource. The following resource(s) failed to create: [CopyZips]. From the CloudWat...
Latest Reply
Dropping by with my experience in case anyone lands here via Google. Note that the databricks-prod-public-cfts bucket is located in us-west-2. If your AWS organisation has an SCP which whitelists specific regions (such as this example) and us-west-2 is...
6 More Replies
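For reference, the fix implied by the reply above (allowing us-west-2 through a region-restricting SCP) might look roughly like the following. This is a hypothetical sketch modeled on AWS's published example region-deny policy; the exempted services and the other allowed region are placeholders to adapt to your organisation:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyOutsideAllowedRegions",
      "Effect": "Deny",
      "NotAction": [
        "iam:*",
        "organizations:*",
        "sts:*",
        "support:*"
      ],
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "aws:RequestedRegion": [
            "eu-west-1",
            "us-west-2"
          ]
        }
      }
    }
  ]
}
```

Without us-west-2 in the allowed list, the CopyZips custom resource cannot read from the databricks-prod-public-cfts bucket and the CloudFormation stack rolls back.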
- 5878 Views
- 4 replies
- 15 kudos
I have a trigger in Lambda that gets triggered when a new file arrives in S3. I want this file to be straightaway processed using a notebook to upsert all the data into a Delta table. I'm looking for a solution with minimum latency.
Latest Reply
There are two possible solutions: Auto Loader (cloudFiles), ideally with a "file notification" queue to avoid unnecessary scans, or having the Lambda send a POST request to /api/2.1/jobs/run-now. Additionally, in both solutions it is important to have Private Link and...
3 More Replies
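The second option in the reply above, a Lambda posting to /api/2.1/jobs/run-now, might be sketched as follows. The workspace URL, token, and job ID are placeholders; in practice the token should be fetched from Secrets Manager rather than hard-coded:

```python
import json
import urllib.request


def build_run_now_request(host, token, job_id, notebook_params=None):
    """Build a POST request for the Databricks Jobs 2.1 run-now endpoint."""
    payload = {"job_id": job_id}
    if notebook_params:
        # Passed through to the notebook task as widget values.
        payload["notebook_params"] = notebook_params
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def lambda_handler(event, context):
    # S3 put event: forward the new object's key so the notebook
    # knows which file to upsert into the Delta table.
    record = event["Records"][0]["s3"]
    req = build_run_now_request(
        host="https://my-workspace.cloud.databricks.com",  # placeholder
        token="dapiXXXXXXXX",  # placeholder; load from Secrets Manager
        job_id=123,  # placeholder job ID
        notebook_params={"s3_key": record["object"]["key"]},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Note the latency trade-off the reply hints at: run-now gives you an immediate trigger per file but pays job/cluster start-up cost, while Auto Loader with file notifications amortises that by keeping a stream running.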