Administration & Architecture

Forum Posts

mchirouze
by New Contributor
  • 899 Views
  • 1 reply
  • 0 kudos

Send formatted html email from email distribution address

Hi, I have created an email distribution list "#MyList@mycompany.com". In the RShiny world I was able to send emails by a) getting the IP of the server I was sending the emails from and b) whitelisting that IP address within my company's SMTP Relay r...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @mchirouze, To set up email services in Databricks, you have a few options depending on your requirements. Let’s explore them: Workspace Email Settings: As a workspace admin user, you can configure when users receive emails for certain events ...

Chaitanya07
by New Contributor
  • 759 Views
  • 1 reply
  • 0 kudos

Databricks Rest APIs CORS Issue

Hello Team, We are currently integrating Databricks REST APIs into our in-house application for managing access permissions. While testing with curl and Postman, we've successfully accessed certain APIs, like listing cluster permissions. However, we're ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Chaitanya07, Dealing with CORS (Cross-Origin Resource Sharing) issues can be a bit tricky, but I’ll provide some guidance to help you resolve this issue when integrating Databricks REST APIs into your in-house application. Understanding CORS:...

Sikalokym
by New Contributor II
  • 880 Views
  • 4 replies
  • 0 kudos

Databricks job of type "Python wheel" does not work if "Package name" contains a dash

Hello, I created a Databricks job of type "Python wheel". In the "Package name" field I assigned a Python package whose name contains a dash (see attachment). The job run failed, saying it could not import the Python package due to the dash in t...

test_job.PNG
Latest Reply
AndréSalvati
New Contributor III
  • 0 kudos

There you can see a complete template project with a Python wheel task and modules. Please follow the instructions for deployment: https://github.com/andre-salvati/databricks-template

3 More Replies
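On the dash issue above: Python packaging normalizes distribution names, so a wheel published as my-package is typically imported as my_package, and the job's "Package name" field needs the underscore form. A minimal sketch of that normalization (the package names below are placeholders, not from the original post):

```python
import re

def wheel_import_name(dist_name: str) -> str:
    """Map a distribution name (which may contain dashes or dots) to the
    underscore form Python uses for the importable top-level package."""
    return re.sub(r"[-._]+", "_", dist_name).lower()

print(wheel_import_name("my-package"))  # my_package
```

So a job configured with package name "my-package" generally needs "my_package" instead.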
eric-cordeiro
by New Contributor II
  • 3658 Views
  • 3 replies
  • 1 kudos

Databricks AWS Secrets Manager access

I have a workspace deployed in AWS and need to read some secrets from AWS Secrets Manager in my notebook. I'm aware that there is no default process similar to Azure Key Vault, however I know that we can try to access it using boto3, but I'm stuck at...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @eric-cordeiro,  IAM roles are used for authentication to access AWS Secrets Manager from a Databricks Notebook in AWS. - Create a Cross-Account IAM Role with permissions to access secrets in AWS Secrets Manager.- Create an access policy that gran...

2 More Replies
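A minimal boto3 sketch of the instance-profile approach described in that reply (secret name and region are placeholders; assumes the cluster's instance profile grants secretsmanager:GetSecretValue on the secret):

```python
import json

def parse_secret(secret_string: str) -> dict:
    # Key/value secrets come back as a JSON document in SecretString
    return json.loads(secret_string)

def get_secret(secret_name: str, region: str = "us-east-1") -> dict:
    # boto3 is preinstalled on Databricks AWS clusters and picks up the
    # cluster's instance profile credentials automatically, so no AWS
    # keys need to appear in the notebook
    import boto3
    client = boto3.client("secretsmanager", region_name=region)
    resp = client.get_secret_value(SecretId=secret_name)
    return parse_secret(resp["SecretString"])
```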
ossinova
by Contributor II
  • 664 Views
  • 2 replies
  • 0 kudos

Defaulting or overriding the cluster policy list order

I have numerous cluster policies that vary in size (Job - xsmall, Job - small, Job - medium...). However, when I create a new job and a new job cluster, the default policy selected from the drop-down menu is one of the larger sizes. Is th...

Job cluster.png
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I checked our sorting, and there does not seem to be any logic to it. They are not sorted by ID or description, for sure.

1 More Replies
smehta_0908
by New Contributor II
  • 607 Views
  • 1 reply
  • 0 kudos

Monitor and Alert Databricks Resource Utilization and Cost Consumption

We want to build a monitoring and alerting solution for Azure Databricks that should capture resource utilization details (like aggregated CPU%, memory%, etc.) and cost consumption at the account level. We have Unity Catalog enabled and there are multipl...

Latest Reply
AlliaKhosla
New Contributor III
  • 0 kudos

@smehta_0908 Greetings! You can utilize Datadog for monitoring CPU and memory of clusters. https://docs.datadoghq.com/integrations/databricks/?tab=driveronly For Cost consumption at accounts level you can make use of billable usage logs using the Acc...

GS24
by New Contributor II
  • 830 Views
  • 3 replies
  • 0 kudos

Connecting to Azure Databricks deployed using Vnet injection over Public Internet

I'm trying to connect to Azure Databricks (deployed using the VNet injection method) from a 3rd-party service running on Azure in the same region. When I try to connect using the Databricks hostname directly in my connection, the hostname always resolves...

Administration & Architecture
Azure Private Link
Databricks
Vnet Injection
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

2 More Replies
Learnit
by New Contributor II
  • 834 Views
  • 1 reply
  • 0 kudos

Managing databricks workspace permissions

I need assistance with writing API/Python code to manage a Databricks workspace permissions database (Unity Catalog). The task involves obtaining a list of workspace details from the account console, which includes various details like workspace name,...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Learnit, Here’s a high-level overview of the steps you might take:   Obtain Workspace Details: You can use the Databricks REST API to obtain workspace details. The API provides various endpoints to manage access for different objects and users. M...
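As a rough sketch of the REST calls that overview refers to, the Permissions API follows the pattern GET /api/2.0/permissions/{object_type}/{object_id}; the workspace host, token, and IDs below are hypothetical placeholders:

```python
def permissions_url(host: str, object_type: str, object_id: str) -> str:
    # object_type is e.g. "clusters", "jobs", or "sql/warehouses"
    return f"https://{host}/api/2.0/permissions/{object_type}/{object_id}"

# Hypothetical usage from a script authenticating with a PAT token:
# import requests
# resp = requests.get(
#     permissions_url("adb-1234567890.12.azuredatabricks.net",
#                     "clusters", "0123-456789-abcdef"),
#     headers={"Authorization": f"Bearer {token}"},
#     timeout=30,
# )
# acl = resp.json().get("access_control_list", [])
```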

Wojciech_BUK
by Contributor III
  • 3975 Views
  • 4 replies
  • 0 kudos

Unity Catalog - Lakehouse Federation: Permission to read data from foreign catalogs

I have set up a connection "SQL-SV-conn" to SQL Server and, based on that connection, created a foreign catalog "FC-SQL-SV". I have granted all permissions on the CATALOG to developers: USE CATALOG, USE SCHEMA, SELECT. But they cannot query tables (e.g. by running...

Administration & Architecture
Foreign Catalog
Lakehouse Federation
Unity Catalog
Latest Reply
Kaniz
Community Manager
  • 0 kudos

I want to express my gratitude for your effort in selecting the most suitable solution. It's great to hear that your query has been successfully resolved. Thank you for your contribution. 

3 More Replies
valjas
by New Contributor III
  • 965 Views
  • 1 reply
  • 0 kudos

Is it possible to change the Azure storage account of Unity Catalog?

We have a Unity Catalog metastore set up in storage account prod_1. Can we move this to the prod_2 storage account and delete prod_1? Also, is it possible to rename catalogs once they are created?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @valjas , Certainly! Let’s address your questions:   Moving Unity Catalog Metastore: Yes, you can move the Unity Catalog metastore from one storage account (e.g., prod_1) to another (e.g., prod_2).Follow these steps: Create a new Premium perform...

Learnit
by New Contributor II
  • 1214 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks deployment and automation tools comparison.

Hello All, As a newcomer to Databricks, I am seeking guidance on automation within Databricks environments. What are the best practices for deployment, and how do Terraform, the REST API, and the Databricks SDK compare in terms of advantages and...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Learnit , Certainly! As a newcomer to Azure Databricks, understanding best practices for deployment and automation is crucial.    Let’s explore some recommendations and compare the tools you’ve mentioned:   Best Practices for Deployment in Azure ...

Ishmael
by New Contributor III
  • 2894 Views
  • 7 replies
  • 3 kudos

Resolved! Connect to databricks from external non-spark cluster

Hi, I have an app/service on a non-Spark Kubernetes cluster. Is there a way to access/query a Databricks service from my app/service? I see documentation on connectors, particularly for Scala, which is the language of my app/service. Can I use these connec...

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @Ishmael, Yes, you are correct. ScalaPy is a library that enables seamless interoperability between Scala and Python. It is based on the Py4J project, which allows Python programs running in a Python interpreter to dynamically access Java objects ...

6 More Replies
gabo2023
by New Contributor III
  • 1245 Views
  • 1 reply
  • 3 kudos

Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)

Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from a base code on Databricks' own GitHub (https://github.com/databrick...

image.png image (1).png
Latest Reply
gabo2023
New Contributor III
  • 3 kudos

 

Lwin
by New Contributor
  • 492 Views
  • 1 reply
  • 0 kudos

Want to learn more about system tables

Where can I learn more about system tables? Looking for docs or usage stories! Thanks

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Please refer to this - https://docs.databricks.com/administration-guide/system-tables/index.html
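Once system tables are enabled, they are queried like any other Unity Catalog table. A sketch against the billing schema described at that link (column names follow the current docs, but verify against your workspace):

```python
# Aggregate daily DBU usage from the billing system table
USAGE_QUERY = """
SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
FROM system.billing.usage
GROUP BY usage_date, sku_name
ORDER BY usage_date
"""

# In a Databricks notebook, `spark` is predefined:
# display(spark.sql(USAGE_QUERY))
```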

Labels