Data Engineering

Forum Posts

by costi9992 (New Contributor III)
  • 1196 Views
  • 2 replies
  • 0 kudos

Access Databricks API using IDP token

Hello, we have a Databricks account & workspace, provided on AWS, with SSO enabled. Is there any way to access the Databricks workspace API (jobs/clusters, etc.) using a token retrieved from the identity provider? We can access the Databricks workspace API with A...

Latest Reply
fpopa
New Contributor II
  • 0 kudos

Hey Costin and Anonymous user, have you managed to get this working? Do you have examples by any chance? I'm also trying something similar but I haven't been able to make it work. > authenticate and access the Databricks REST API by setting the Autho...

1 More Replies
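A minimal sketch of the approach discussed in this thread: passing an externally issued OAuth/OIDC token as a bearer token when calling the workspace REST API. The workspace URL, token value, and endpoint choice are placeholders, and whether the workspace accepts a token issued directly by your identity provider depends on how SSO/token federation is configured for the account.

```python
import requests

# Hypothetical placeholders - substitute your workspace URL and the token
# obtained from your identity provider (assumes token federation is configured).
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
idp_token = "<token-from-identity-provider>"

# List jobs in the workspace, authenticating with the bearer token.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.1/jobs/list",
    headers={"Authorization": f"Bearer {idp_token}"},
)
resp.raise_for_status()
print(resp.json())
```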
by MoJaMa (Valued Contributor II)
  • 4594 Views
  • 8 replies
  • 2 kudos
Latest Reply
User15848365773
New Contributor II
  • 2 kudos

Hi @amitca71 @atanu, yes, you can associate as many VPCs (the fundamental unit of workspace deployment) across regions and AWS accounts to one single Databricks AWS account; in fact, it's one of the superpowers of the Databricks platform, and you can even track all thei...

7 More Replies
by AndyAtINX (New Contributor III)
  • 1480 Views
  • 4 replies
  • 1 kudos

Resolved! Error inviting user to workspace "Failed to add user: A user with email ... or username ... in different cases already exist in the account"

We have 3 workspaces - 1 old version in one AWS account, 2 latest versions in another. We are PAYG full edition, not using SSO. Our admins (existing DBX users in the `admins` group) can invite new users via the Admin Console from the 1 old and 1 new wo...

Latest Reply
Schneider-Elect
New Contributor II
  • 1 kudos

We are facing the same issue; we are on Azure. @AndyAtINX, do you mean that if a user exists in workspace 1 as abc@gmail.com, we should add the user to workspace 2 as abc@gmail.com, not ABC@GMAIL.COM? If that is the case, we tried this and it's not working for us.

3 More Replies
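For readers hitting the same error, a small sketch (not from the thread itself) that lists existing users via the workspace-level SCIM API and compares usernames case-insensitively, to locate the account that conflicts only by letter case. The host and token are placeholders; if the conflict is at the account level, the equivalent check would be against the account SCIM API instead.

```python
import requests

# Hypothetical placeholders for the workspace URL and a personal access token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
NEW_EMAIL = "abc@gmail.com"

# Fetch users from the workspace-level SCIM API.
resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Report any existing user whose userName differs from the new email only by case.
for user in resp.json().get("Resources", []):
    if user["userName"].lower() == NEW_EMAIL.lower() and user["userName"] != NEW_EMAIL:
        print("Conflicting user:", user["userName"], user["id"])
```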
by Anonymous (Not applicable)
  • 20465 Views
  • 6 replies
  • 9 kudos

How to connect and extract data from SharePoint using Databricks (AWS)?

We are using Databricks (on AWS). We need to connect to SharePoint and extract & load data into a Databricks Delta table. Is there any possible solution for this?

Latest Reply
yliu
New Contributor III
  • 9 kudos

Wondering the same. Can we use the SharePoint REST API to download the file, save it to DBFS or an external location, and read it?

5 More Replies
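A rough sketch of the approach suggested in the last reply: download a file over the SharePoint REST API with an OAuth bearer token, land it on DBFS, and read it with Spark. The site URL, file path, and token are placeholders, and token acquisition (usually against Azure AD / Microsoft Entra ID) is not shown; a Unity Catalog volume can be used instead of /dbfs.

```python
import requests

# Hypothetical placeholders - substitute your SharePoint site, file path and token.
SITE_URL = "https://<tenant>.sharepoint.com/sites/<site>"
FILE_PATH = "<server-relative-file-path>"          # e.g. a path to a CSV file
ACCESS_TOKEN = "<azure-ad-access-token>"

# Download the file content via the SharePoint REST API.
resp = requests.get(
    f"{SITE_URL}/_api/web/GetFileByServerRelativeUrl('{FILE_PATH}')/$value",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Write the bytes to DBFS via the FUSE mount, then read them back with Spark
# (run in a notebook where `spark` is predefined).
with open("/dbfs/tmp/sharepoint_file.csv", "wb") as f:
    f.write(resp.content)

df = spark.read.option("header", "true").csv("dbfs:/tmp/sharepoint_file.csv")
df.write.mode("overwrite").saveAsTable("sharepoint_file")   # Delta table by default
```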
by bricksdata (New Contributor)
  • 4832 Views
  • 3 replies
  • 0 kudos

Unable to authenticate against https://accounts.cloud.databricks.com as an account admin.

Problem: I'm unable to authenticate against the https://accounts.cloud.databricks.com endpoint even though I'm an account admin. I need it to assign account-level groups to workspaces via the workspace assignment API (https://api-docs.databricks.com/re...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @lasse l, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...

2 More Replies
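For context, a hedged sketch of one way to call the AWS account-level API: basic authentication with an account admin's username and password against accounts.cloud.databricks.com (OAuth with a service principal is the other option). The account ID and credentials are placeholders; a 401/403 here suggests the user is not actually recognized as an account admin.

```python
import requests

# Hypothetical placeholders - account ID and account-admin credentials.
ACCOUNT_ID = "<databricks-account-id>"
USERNAME = "<account-admin-email>"
PASSWORD = "<password>"

# List workspaces in the account; a 200 response confirms account-level auth works.
resp = requests.get(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/workspaces",
    auth=(USERNAME, PASSWORD),
)
print(resp.status_code)
print(resp.json())
```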
by Fast_Lanes (New Contributor II)
  • 652 Views
  • 2 replies
  • 3 kudos

Why am I being shown estimated costs and charges (DBU in $) during the community trial edition?

I recently signed up for the 14-day community trial and noticed it was showing estimated costs on my Usage page shortly after I created my first workspace linked through AWS. Is this only for monitoring purposes, or am I actually going t...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 3 kudos

Hi @Fast_Lanes, no, you will not be charged for Databricks usage during the trial period. It's just showing the usual charges that would apply once you are no longer on the trial version.

1 More Replies
by danatsafe (New Contributor)
  • 2464 Views
  • 3 replies
  • 0 kudos

Amazon returns a 403 error code when trying to access an S3 Bucket

Hey! So far I have followed along with the Configure S3 access with instance profiles article to grant my cluster access to an S3 bucket. I have also made sure to disable IAM role passthrough on the cluster. Upon querying the bucket through a noteboo...

Latest Reply
winojoe
New Contributor III
  • 0 kudos

I had the same issue and I found a solution. For me, the permission problems only exist when the cluster's (compute's) access mode is "No Isolation Shared". When the access mode is either "Shared" or "Single User", the IAM configuration seems to a...

2 More Replies
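A quick sanity-check sketch along the lines of the reply above: on a cluster with the instance profile attached and an access mode of "Single User" or "Shared" (not "No Isolation Shared"), try listing and reading the bucket from a notebook. The bucket and path are placeholders; `dbutils` and `spark` are predefined in Databricks notebooks.

```python
# Run in a Databricks notebook on a cluster with the instance profile attached.
# Hypothetical bucket/path placeholders.
bucket = "s3://<your-bucket>"

# If the instance profile is picked up, this lists objects instead of raising a 403.
for info in dbutils.fs.ls(bucket):
    print(info.path)

# Read a sample file with Spark to confirm data access end to end.
df = spark.read.option("header", "true").csv(f"{bucket}/<path>/<file>.csv")
df.show(5)
```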
by parthsalvi (Contributor)
  • 1413 Views
  • 1 reply
  • 2 kudos

Amazon SES: boto3 credentials not found. DBR 11.2 Shared mode

We're trying to send email via Amazon SES using boto3.client in Python. We've added SES full access to the cluster's IAM role. We were able to send email in "No Isolation Shared" mode on DBR 11.2 using ses = boto3.client('ses', region_name='us-****-2') s...

Latest Reply
JameDavi_51481
New Contributor III
  • 2 kudos

This appears to be an intentional design choice to prevent users from using the credentials of the host machine to carry out arbitrary AWS API calls. I really wish there was a workaround or setting to disable this behavior because we put a lot of wor...

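For reference, a minimal sketch of the SES call pattern from the original post, relying on the cluster's instance profile for credentials (which, per the reply, is exactly what the standard access modes block for user code). The region, addresses, and message content are placeholders.

```python
import boto3

# Hypothetical placeholders - region and SES-verified addresses.
ses = boto3.client("ses", region_name="<aws-region>")

# Send a simple text email; credentials come from the cluster's instance profile.
response = ses.send_email(
    Source="sender@example.com",
    Destination={"ToAddresses": ["recipient@example.com"]},
    Message={
        "Subject": {"Data": "Test from Databricks"},
        "Body": {"Text": {"Data": "Hello from a Databricks job."}},
    },
)
print(response["MessageId"])
```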
by ivanychev (Contributor)
  • 984 Views
  • 2 replies
  • 1 kudos

Is there a way to avoid using EBS drives on workers with local NVMe SSD?

The Databricks on AWS docs claim that 30 GB + 150 GB EBS drives are mounted to every node by default. But if I use an instance type like r5d.2xlarge, it already has a local disk, so I want to avoid mounting the 150 GB EBS drive to it. Is there a way to do it? We ...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @ivanychev, based on the provided information, if you want to avoid mounting the 150 GB EBS drive to a node that has a local disk, you can set ebs_volume_count to 0 in the Clusters API when creating the cluster. Another option could be manually det...

1 More Replies
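A hedged sketch of the first suggestion in the reply: creating a cluster through the Clusters API with aws_attributes.ebs_volume_count set to 0 so only the instance's local NVMe storage is used. The workspace URL, token, and Spark version are placeholders; check the current API reference for the exact fields your workspace expects.

```python
import requests

# Hypothetical placeholders.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

payload = {
    "cluster_name": "nvme-only-workers",
    "spark_version": "<spark-version>",   # e.g. a current LTS runtime
    "node_type_id": "r5d.2xlarge",        # instance type with local NVMe SSD
    "num_workers": 2,
    "aws_attributes": {
        "ebs_volume_count": 0             # do not attach additional EBS volumes
    },
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```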
by jlb0001 (New Contributor III)
  • 943 Views
  • 3 replies
  • 1 kudos

[AWS] How do you replace the Account Admin?

I need to remove an older admin who previously set up the Databricks account. However, I get an error (even though I am also an account admin). How do I replace a prior account admin? Or at least remove their admin status and/or disable the accoun...

[Screenshot: Databricks Permission Error - Cannot Disable Original Account]
Latest Reply
jlb0001
New Contributor III
  • 1 kudos

I escalated via my Databricks rep yesterday and got an answer that seemed to be along the lines that "something is wrong here". He is going to try to find out internally and possibly work with the product development folks to come up with a solution...

2 More Replies
by Saurabh707344 (New Contributor III)
  • 2972 Views
  • 2 replies
  • 1 kudos

Platform and Approach Comparison

Does anyone have a structured and crisp comparison of the benefits of performing MLOps in the ways below, and of the strong areas of each platform? a) Standalone Databricks, where all pipelines and orchestration are done on Databricks and external third pa...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Saurabh Singh, here is a structured and crisp comparison of the benefits and strong areas of each platform for performing MLOps: a) Standalone Databricks: Benefits: Unified platform: Databricks provides a unified environment for data engineering, ...

1 More Replies
by Thaw (New Contributor III)
  • 1001 Views
  • 3 replies
  • 4 kudos

Resolved! How to change the Instance Family in CloudFormation in Databricks trial mode?

I implemented Databricks on AWS and the template used i3.xlarge. Could I move down to a smaller instance family for cost optimization? Is i3.xlarge the minimum size to use Databricks in trial mode? Thanks

Latest Reply
Thaw
New Contributor III
  • 4 kudos

Thank you so much for your reply to my question, @Vidula Khanna @Kaniz Fatma. After taking some study time, I understood the basics, and now I am on my way with Databricks.

2 More Replies
by Sudhir1 (New Contributor II)
  • 3382 Views
  • 5 replies
  • 1 kudos

Connecting to AWS MSK

How do we connect to AWS MSK, which uses IAM-based authentication?

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Sudhir Jaiswal, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...

4 More Replies
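A sketch of the usual Structured Streaming configuration for IAM-authenticated MSK, assuming the aws-msk-iam-auth library is available on the cluster classpath and the cluster's instance profile has the required MSK permissions. The bootstrap servers and topic are placeholders; this is not taken from the thread itself.

```python
# Run in a Databricks notebook (spark is predefined); requires the
# aws-msk-iam-auth library on the cluster classpath.
BOOTSTRAP_SERVERS = "<broker-1>:9098,<broker-2>:9098"   # hypothetical IAM-auth brokers
TOPIC = "<topic-name>"

df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", BOOTSTRAP_SERVERS)
    .option("subscribe", TOPIC)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "AWS_MSK_IAM")
    .option("kafka.sasl.jaas.config",
            "software.amazon.msk.auth.iam.IAMLoginModule required;")
    .option("kafka.sasl.client.callback.handler.class",
            "software.amazon.msk.auth.iam.IAMClientCallbackHandler")
    .load()
)

# Inspect the stream; keys and values arrive as binary and need casting.
display(df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)"))
```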
by Taha_Hussain (Valued Contributor II)
  • 6219 Views
  • 5 replies
  • 8 kudos


Ask your technical questions at Databricks Office Hours! Register here for any of our upcoming dates: May 10, 11:00 AM - 12:00 PM PT; May 17, 8:00 AM - 9:00 AM PT; May 24, 9:00 AM - 10:00 AM GMT. Databricks Office Hours connects you directly with experts...

Latest Reply
Priyag1
Honored Contributor II
  • 8 kudos

Thanks for this info

4 More Replies
by naveenprabhun (New Contributor III)
  • 2335 Views
  • 2 replies
  • 3 kudos

Resolved! Unable to read data from ElasticSearch using Databricks (AWS) Cannot detect ES version - Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [IP:PORT]

I am trying to read data from Elasticsearch (ES version 8.5.2) using PySpark on Databricks (13.0, which includes Apache Spark 3.4.0 and Scala 2.12). The ecosystem is on AWS. I am able to run a curl command from the Databricks notebook to the ES ip:port and fetch...

Latest Reply
Hoviedo
New Contributor II
  • 3 kudos

I have the same problem. Did you find any solution? Thanks.

1 More Replies
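A hedged sketch of reading from Elasticsearch with the elasticsearch-hadoop (elasticsearch-spark) connector, which must be installed on the cluster in a version compatible with ES 8.x. The host, port, credentials, and index are placeholders; setting es.nodes.wan.only is often needed when the nodes are only reachable through a single address, which is a common cause of the "Cannot detect ES version" error.

```python
# Run in a Databricks notebook; requires the elasticsearch-spark connector
# installed on the cluster. All connection values below are placeholders.
df = (
    spark.read.format("org.elasticsearch.spark.sql")
    .option("es.nodes", "<es-host>")
    .option("es.port", "<es-port>")
    .option("es.nodes.wan.only", "true")   # connect only via the given address
    .option("es.net.ssl", "true")
    .option("es.net.http.auth.user", "<username>")
    .option("es.net.http.auth.pass", "<password>")
    .load("<index-name>")
)
df.show(5)
```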