I am trying to configure Databricks with AWS. I have configured the cloud resources as described in https://docs.databricks.com/administration-guide/account-api/iam-role.html#language-Databricks%C2%A0VPC and have selected "Your VPC Default" as the...
@samruddhi Chitnis Can you please check the troubleshooting guide below: Credentials configuration error messages: "Malformed request: Failed credential configuration validation checks". The list of permission checks in the error message indicates the li...
I have a workspace in my ADB, and in that workspace a folder that contains lots of notebooks. For backup purposes I want to copy all the notebooks into my GitHub repo. How can I do that in one shot?
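One way to do the one-shot backup described above is via the Databricks Workspace REST API (a sketch, assuming the `/api/2.0/workspace/list` and `/api/2.0/workspace/export` endpoints; the host, token, and folder paths are placeholders you must fill in):

```python
# Sketch: export every notebook under a workspace folder so the files can be
# committed to a Git repo in one pass. Assumes the Databricks Workspace REST
# API; HOST, TOKEN, and the folder paths are placeholders.
import base64
import json
import pathlib
import urllib.parse
import urllib.request

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder


def notebook_paths(objects):
    """Pick notebook paths out of a /workspace/list response."""
    return [o["path"] for o in objects if o.get("object_type") == "NOTEBOOK"]


def _api_get(endpoint, params):
    """GET a Workspace API endpoint and decode the JSON response."""
    url = f"{HOST}/api/2.0/workspace/{endpoint}?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {TOKEN}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def export_folder(src="/Shared/my-folder", dst="backup"):
    """Download each notebook in `src` (SOURCE format) into local `dst`."""
    listing = _api_get("list", {"path": src})
    for path in notebook_paths(listing.get("objects", [])):
        exported = _api_get("export", {"path": path, "format": "SOURCE"})
        out = pathlib.Path(dst) / (pathlib.Path(path).name + ".py")
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_bytes(base64.b64decode(exported["content"]))
```

The legacy Databricks CLI wraps the same API in one command, e.g. `databricks workspace export_dir /Shared/my-folder ./backup`, after which you can commit and push the local copy from a Git clone.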
Hi @suman mukherjee Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking "Select As Best" if it does. Your feedb...
Based on what I learned from learning materials, documents, etc., it is good practice in Databricks to set up one non-prod workspace but separate clusters for Dev, QA, SIT, etc. Is it best practice to set up only one NON-PROD workspace instead of separate ...
I would like to install a library that is under the /Workspace/Shared/ directory using the init.sh script on a cluster. How can I access the /Workspace/Shared/ folder in shell? This page only shows how to access it manually but doesn't show how to access i...
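One possible pattern (a sketch, assuming a recent Databricks Runtime where /Workspace is mounted on the cluster nodes; the wheel path and script name are hypothetical): keep a small Python helper next to the library and have init.sh invoke it, e.g. `python /Workspace/Shared/install_lib.py`.

```python
# install_lib.py - a helper an init script could call, e.g. in init.sh:
#   python /Workspace/Shared/install_lib.py
# Assumes /Workspace is mounted on the cluster nodes (recent runtimes);
# the wheel path below is hypothetical.
import pathlib
import subprocess
import sys

WHEEL = pathlib.Path("/Workspace/Shared/libs/mylib-0.1-py3-none-any.whl")  # hypothetical


def pip_install_cmd(wheel):
    """Build the pip command that installs a wheel from the Workspace mount."""
    return [sys.executable, "-m", "pip", "install", str(wheel)]


# Only act when run as a script on a node where the wheel actually exists.
if __name__ == "__main__" and WHEEL.exists():
    subprocess.run(pip_install_cmd(WHEEL), check=True)
```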
Hi @Juned Mala Hope all is well! Just wanted to check in on whether you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
We have 11 people working on the Data Engineering Associate certification using Data Engineering with Databricks V3. We just finished the Foundation one and started the Engineering journey. We are registered partners, and Data Engineering with Dat...
Hi Team, I need assistance in understanding how a Databricks workspace service principal token's expiry is calculated. Issue: when I create a token I set lifetime = 3600, but when I do a get token call I get an unexpected expiry number, and even when I ...
Hi Team, Please help with my issue. Is there any way to find the expiry of a token, i.e. how much time the token still has before it expires? creation_time - expiry_time is not giving me the exact output. Kindly let me know if there is any way to find this as soon as possible. T...
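For what it's worth, the "unexpected expiry number" in the two posts above is usually a unit issue: as far as I know, the Databricks token endpoints report `creation_time` and `expiry_time` as Unix epoch milliseconds (with `expiry_time = -1` for tokens that never expire). A small sketch of the arithmetic, with no Databricks-specific code needed:

```python
# Sketch: Databricks token endpoints report creation_time and expiry_time as
# Unix epoch *milliseconds* (expiry_time is -1 when the token never expires),
# which is why the raw numbers look unexpected. Time left is
# expiry_time - now, not creation_time - expiry_time.
import time


def lifetime_seconds(creation_time_ms, expiry_time_ms):
    """Configured lifetime, e.g. 3600.0 when created with lifetime = 3600."""
    return (expiry_time_ms - creation_time_ms) / 1000.0


def seconds_remaining(expiry_time_ms, now_ms=None):
    """Seconds until expiry (negative if expired, None if it never expires)."""
    if expiry_time_ms == -1:
        return None
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return (expiry_time_ms - now_ms) / 1000.0
```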
Greetings, Recently we were doing cleanups in AWS and removed some Databricks-related resources that were used only once for setting up our workspace and have not been used since then. Since there is no plan to create any other workspaces, the decision was t...
The resources that were cleaned up were just the ones used for the initial setup of the workspace; everything else important for day-to-day operation is in place, and we are actively using the workspace, therefore there is no plan to de...
Can anyone let me know whether there is any way to access another workspace's Delta tables from the workspace where we run our pipelines, using Python?
@Hemanth A Go to the workspace you want data from; in the SQL warehouse tab you will find the connection details. Copy the host name and HTTP path, and generate a token for it. With these credentials you can access the data of that workspace from any other workspace.
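Concretely, the steps above can be sketched with the `databricks-sql-connector` package (`pip install databricks-sql-connector`); the hostname, HTTP path, token, and table names below are all placeholders:

```python
# Sketch: read a Delta table that lives in another workspace, using the
# host name, HTTP path, and token copied from that workspace's SQL warehouse
# connection details. All connection values and table names are placeholders.


def fq_table(catalog, schema, table):
    """Fully qualified table name, e.g. main.sales.orders."""
    return f"{catalog}.{schema}.{table}"


def read_remote_table():
    # Deferred import: requires `pip install databricks-sql-connector`.
    from databricks import sql

    with sql.connect(
        server_hostname="<other-workspace-host>",        # placeholder
        http_path="/sql/1.0/warehouses/<warehouse-id>",  # placeholder
        access_token="<token>",                          # placeholder
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(
                f"SELECT * FROM {fq_table('main', 'sales', 'orders')} LIMIT 10"
            )
            return cur.fetchall()
```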
I was using the 14-day trial period in Databricks and had some important notebooks where I had made all my changes. Now I have extended the service and subscribed to Databricks on GCP. When I enter the workspace section I cannot see the w...
Hi @Aditya Aranya Hope all is well! Just wanted to check in on whether you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Than...
AWS quickstart - CloudFormation failure: When deploying your workspace with the recommended AWS quickstart method, a CloudFormation template will be launched in your AWS account. If you experience a failure with an error message along the lines of ROL...
Navigate and discover content more efficiently with Search in Databricks. Hi all, Justin Kim here; I'm the Databricks product manager responsible for content organization and navigation in our product, which includes Search. Great to see you on the Com...
@Justin Kim Thank you for the quick reply. "Last Modified" usually means recent changes, right (that can be the last 24 hrs or a cap limit that we add), whereas "Anytime" should show all notebooks or tables from the start. That is where I got confused.
I had a working DLT pipeline that failed in the morning because it cannot read files from the Workspace or Repo directory. It is throwing an "Operation not permitted" error. Even when I am listing directories, it throws the same error (although it ...