Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

naveenprabhun
by New Contributor III
  • 4909 Views
  • 2 replies
  • 3 kudos

Resolved! Unable to read data from ElasticSearch using Databricks (AWS): Cannot detect ES version - Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings) - all nodes failed; tried [IP:PORT]

I am trying to read data from ElasticSearch (ES version 8.5.2) using PySpark on Databricks (13.0, which includes Apache Spark 3.4.0 and Scala 2.12). The ecosystem is on AWS. I am able to run a curl command from the Databricks notebook to the ES ip:port and fetch...

[Attachment: error screenshot]
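
For context, a minimal PySpark sketch of the elasticsearch-hadoop connector options that are typically involved in this error; the host, port, index name, and credentials are placeholders, and es.nodes.wan.only is an assumption that the nodes are only reachable through the address you supply (e.g. behind a proxy or load balancer):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Read an index through the elasticsearch-spark connector; every value below is a placeholder.
    df = (
        spark.read.format("org.elasticsearch.spark.sql")
        .option("es.nodes", "10.0.0.12")        # ES host reachable from the cluster
        .option("es.port", "9200")
        .option("es.nodes.wan.only", "true")    # talk only to es.nodes, skip node discovery
        .option("es.net.ssl", "true")           # ES 8.x enables TLS by default
        .option("es.net.http.auth.user", "elastic")
        .option("es.net.http.auth.pass", "<password>")
        .load("my-index")
    )
    df.show(5)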
Latest Reply
Hoviedo
New Contributor III
  • 3 kudos

I have the same problem. Did you find any solution? Thanks.

1 More Replies
sanjay
by Valued Contributor II
  • 1796 Views
  • 2 replies
  • 1 kudos

Resolved! How can I prioritize messages in Auto Loader?

Hi, I am using Auto Loader; it picks up data from AWS S3 and stores it in a Delta table. When there is a large number of messages, I would like to process them by priority. Is it possible to prioritize messages in Auto Loader? Regards, Sanjay

Latest Reply
sanjay
Valued Contributor II
  • 1 kudos

Thank you, Sandeep. Another option is that I can keep messages in two different folders in S3. Can Auto Loader read messages from multiple folders?
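
A minimal sketch of that idea, assuming two made-up S3 prefixes (s3://bucket/high/ and s3://bucket/normal/), a JSON source format, and the notebook-provided spark session: run one Auto Loader stream per folder and trigger the high-priority folder first.

    def start_stream(source_path, checkpoint_path, target_table):
        # One Auto Loader (cloudFiles) stream per folder, each with its own checkpoint.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .option("cloudFiles.schemaLocation", checkpoint_path + "/schema")
            .load(source_path)
            .writeStream
            .option("checkpointLocation", checkpoint_path)
            .trigger(availableNow=True)
            .toTable(target_table)
        )

    # Drain the high-priority folder first, then the normal one.
    start_stream("s3://bucket/high/", "s3://bucket/_chk/high", "bronze.events").awaitTermination()
    start_stream("s3://bucket/normal/", "s3://bucket/_chk/normal", "bronze.events").awaitTermination()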

1 More Replies
gdoron
by New Contributor
  • 1651 Views
  • 2 replies
  • 0 kudos

Using PySpark, can I write to an S3 path I don't have GetObject permission on?

After Spark finishes writing the DataFrame to S3, it seems to check the validity of the files it wrote with `getFileStatus`, which is `HeadObject` behind the scenes. What if I'm only granted write and list-objects permissions, but not GetObject? I...

Latest Reply
Lakshay
Databricks Employee
  • 0 kudos

It is not possible in my opinion.

1 More Replies
Sweetnesh
by New Contributor
  • 1993 Views
  • 2 replies
  • 0 kudos

Not able to read S3 object through AssumedRoleCredentialProvider

SparkSession spark = SparkSession.builder()
    .appName("SparkS3Example")
    .master("local[1]")
    .getOrCreate();
spark.sparkContext().hadoopConfiguration().set("fs.s3a.access.key", S3_ACCOUNT_KEY);
spark.sparkContext().hadoopConf...
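
For comparison, a hedged sketch (in PySpark for brevity) of the usual S3A assumed-role wiring; the role ARN, access keys, and bucket path are placeholders, and the base keys are only used to call STS AssumeRole:

    # Assumes the notebook-provided spark session.
    hc = spark.sparkContext._jsc.hadoopConfiguration()
    hc.set("fs.s3a.aws.credentials.provider",
           "org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider")
    hc.set("fs.s3a.assumed.role.arn", "arn:aws:iam::123456789012:role/my-read-role")
    # Base credentials that are allowed to assume the role above:
    hc.set("fs.s3a.assumed.role.credentials.provider",
           "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider")
    hc.set("fs.s3a.access.key", "<access key>")
    hc.set("fs.s3a.secret.key", "<secret key>")

    df = spark.read.text("s3a://my-bucket/some/prefix/")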

Latest Reply
Vartika
Databricks Employee
  • 0 kudos

Hi @Sweetnesh Dholariya, does @Debayan Mukherjee's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Thanks!

1 More Replies
THIAM_HUATTAN
by Valued Contributor
  • 2055 Views
  • 2 replies
  • 3 kudos

Resolved! Grant Databricks Access

In the above screenshot of Grant Databricks Access, we see that we need to grant rights to a given bucket at the highest level. Why is this so? Are we able to limit the rights to only certain directories in a bucket when we need Databricks to have ...

[Attachment: screenshot]
Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @THIAM HUAT TAN, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Tha...

1 More Replies
Raghav_597352
by New Contributor II
  • 4929 Views
  • 2 replies
  • 4 kudos

Resolved! Workspace not getting created

Hey guys, I tried to create a workspace and got an error I hadn't encountered before (see the attached screenshot). I provided everything correctly but don't know why I'm getting this. I tried using a different Databricks ID and AWS ID, and also accessed this from the AWS root account.

[Attachment: error screenshot]
Latest Reply
Priyag1
Honored Contributor II
  • 4 kudos

https://docs.gcp.databricks.com/administration-guide/workspace/create-workspace.html

1 More Replies
Chinu
by New Contributor III
  • 1015 Views
  • 1 reply
  • 1 kudos

API to get Databricks status for AWS

Hi, do you have an API endpoint to call to get the Databricks status for AWS? Thanks,

Latest Reply
karthik_p
Esteemed Contributor
  • 1 kudos

@Chinu Lee, there is a webhook/Slack integration that can be used to fetch the status: https://docs.databricks.com/resources/status.html#webhook. Are you specifically looking for your account workspace or the level above it?

playermanny2
by New Contributor II
  • 1992 Views
  • 2 replies
  • 1 kudos

Reading data in Azure Databricks Delta Lake from AWS Redshift

We have Databricks set up and running on Azure. Now we want to connect it with Redshift (AWS) to perform further downstream analysis for our Redshift users. I could find documentation on how to do it within the same cloud (either AWS or Azure), but...
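
One concrete pattern, sketched under assumptions (the JDBC URL, staging bucket, and table names are placeholders, and this is just one option): read the Delta table in the Azure workspace and push it into Redshift with the Spark Redshift connector, so Redshift users query a native table downstream.

    # Assumes the notebook-provided spark session and network connectivity from the
    # Azure Databricks workspace to the Redshift endpoint; all identifiers are placeholders.
    df = spark.read.table("analytics.events")   # Delta table in the Azure workspace

    (df.write.format("redshift")
       .option("url", "jdbc:redshift://example.redshift.amazonaws.com:5439/dev?user=u&password=p")
       .option("dbtable", "public.events")
       .option("tempdir", "s3a://my-temp-bucket/redshift-staging/")  # S3 staging area used by the connector
       .option("forward_spark_s3_credentials", "true")
       .mode("append")
       .save())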

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Manny Cato: To allow Redshift to read data from Delta Lake hosted on Azure, you can use the AWS Glue Data Catalog as an intermediary. The Glue Data Catalog is a fully managed metadata catalog that integrates with a variety of data sources, including De...

1 More Replies
dceman
by New Contributor
  • 1172 Views
  • 1 reply
  • 0 kudos

How to skip "onboarding" wizard?

I have registered an account via AWS Marketplace. I have also deployed workspaces with Terraform. When I log in to the admin console, it redirects me to https://accounts.cloud.databricks.com/onboarding where I need to create a workspace manually, but I don't want ...

Latest Reply
Mounika_Tarigop
Databricks Employee
  • 0 kudos

Hi team, would you mind telling us how you provisioned the workspace? Are you using the same account ID that you used during creation? If so, could you please try logging in through an incognito window and see if that works?

User16826992783
by New Contributor II
  • 1330 Views
  • 1 reply
  • 0 kudos

Why are some of my AWS EBS volumes in my workspace unencrypted?

I noticed that 30 GB of my EBS volumes are unencrypted. Is there a reason for this, and is there a way to encrypt these volumes?

Latest Reply
Abishek
Databricks Employee
  • 0 kudos

https://docs.databricks.com/security/keys/customer-managed-keys-storage-aws.html#introduction
The Databricks cluster’s EBS volumes (optional) - For Databricks Runtime cluster nodes and other compute resources in the Classic data plane, you can option...

samruddhi
by New Contributor
  • 1666 Views
  • 1 reply
  • 0 kudos

Issue while creating a workspace in Databricks using AWS

I am trying to configure Databricks with AWS. I have configured the cloud resources as described in https://docs.databricks.com/administration-guide/account-api/iam-role.html#language-Databricks%C2%A0VPC and I have selected "Your VPC Default" as the...

[Attachment: image.png]
Latest Reply
Abishek
Databricks Employee
  • 0 kudos

@samruddhi Chitnis, can you please check the troubleshooting guide below: Credentials configuration error messages: "Malformed request: Failed credential configuration validation checks". The list of permission checks in the error message indicates the li...

nicole_wong
by New Contributor II
  • 11532 Views
  • 10 replies
  • 7 kudos

Resolved! Can Terraform be used to set configurations in Admin / workspace settings?

I am posting this on behalf of my customer. They are currently working on the deployment and configuration of their workspace on AWS via Terraform. Is it possible to set some configs in the Admin/workspace settings via TF? According to the Terraform module, it...

Latest Reply
francly
New Contributor II
  • 7 kudos

Hi, can I get a full list of the latest supported workspace_conf settings that are configurable in TF? I can't find the list on the TF registry site.

9 More Replies
kll
by New Contributor III
  • 3101 Views
  • 1 reply
  • 0 kudos

Fatal error: The Python kernel is unresponsive when attempting to query data from AWS Redshift within Jupyter notebook

I am running a Jupyter notebook on a cluster with configuration: 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12), worker type i3.xlarge (30.5 GB memory, 4 cores), min 2 and max 8 workers.
cursor = conn.cursor()
cursor.execute( """ ...
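
A hedged sketch of one common workaround, assuming the root cause is the DB-API cursor materializing a large result set on the driver: run the same query through the Spark Redshift connector so the result stays distributed (the JDBC URL, temp dir, and query text are placeholders).

    df = (
        spark.read.format("redshift")
        .option("url", "jdbc:redshift://example.redshift.amazonaws.com:5439/dev?user=u&password=p")
        .option("query", "SELECT ...")   # the original SQL, unchanged
        .option("tempdir", "s3a://my-temp-bucket/redshift-staging/")
        .option("forward_spark_s3_credentials", "true")
        .load()
    )
    display(df.limit(100))   # inspect a sample instead of collecting everything to the driver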

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please confirm the utilization of your cluster while running this job? You can monitor the performance with different metrics here: https://docs.databricks.com/clusters/clusters-manage.html#monitor-performance. Also, please tag @Debayan with...

MaheshDR
by New Contributor II
  • 8978 Views
  • 6 replies
  • 1 kudos

Open firewall to Azure Databricks workspace from AWS RDS machine/EC2 machine

Hi all, as part of our solution approach we need to connect to one of our AWS RDS Oracle databases from an Azure Databricks notebook. We need your help to understand which IP range of Azure Databricks to consider so we can whitelist it in the AWS RDS security gro...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Mahesh D, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

5 More Replies
Chris_Shehu
by Valued Contributor III
  • 4510 Views
  • 7 replies
  • 4 kudos

On-behalf-of tokens disabled for Azure environments?

While trying to set up a Power BI connection to the Azure Delta Lake, we ran into several issues around service principals. 1) The API listed on the learn.microsoft site (link 1 below) indicates that there is an API you can use to create SP tokens. Wh...
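
For reference, a hedged sketch of calling the Token Management "on-behalf-of" endpoint that the question seems to refer to; the workspace URL, admin token, and application_id are placeholders, the endpoint path is an assumption taken from the Databricks Token Management API docs, and the call only succeeds if on-behalf-of token creation is enabled for the workspace.

    import requests

    # Placeholders: an admin token and the service principal's application (client) ID.
    resp = requests.post(
        "https://adb-1234567890123456.7.azuredatabricks.net/api/2.0/token-management/on-behalf-of/tokens",
        headers={"Authorization": "Bearer <admin token>"},
        json={
            "application_id": "<service principal application id>",
            "lifetime_seconds": 3600,
            "comment": "Power BI service principal token",
        },
    )
    print(resp.status_code, resp.json())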

Latest Reply
meetskorun
New Contributor II
  • 4 kudos

Hello, I am new here from India, and I'm here to share some thoughts with you all.

6 More Replies