Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.
I am having a tough time connecting to an SMTP server from Databricks with an ACL-enabled cluster. I am able to access it without an ACL-enabled cluster. What can be done to access it via an ACL-enabled cluster? import smtplib from email.mime.multipart import MIMEMu...
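For reference, a minimal sketch of what the truncated snippet above is likely doing; the SMTP host, port, addresses, and credentials are placeholders, and on an ACL-enabled cluster the outbound connection itself is usually where the failure shows up:

```python
# Hedged sketch of the kind of smtplib code referenced above; the SMTP host,
# port, sender/recipient addresses, and credentials are placeholders.
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()
msg["From"] = "sender@example.com"          # placeholder sender
msg["To"] = "recipient@example.com"         # placeholder recipient
msg["Subject"] = "Test from Databricks"
msg.attach(MIMEText("Hello from a Databricks notebook.", "plain"))

# On a restricted cluster, this connect call is typically where the error appears.
with smtplib.SMTP("smtp.example.com", 587, timeout=30) as server:  # placeholder host/port
    server.starttls()
    server.login("smtp-user", "smtp-password")                     # placeholder credentials
    server.send_message(msg)
```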
Hi! Using bundles, I want to update a running streaming job. All is good until the new job gets deployed, but then the job needs to be stopped manually so that the new assets are used, and it has to be started again manually. This might lead to the job running...
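One possible way to automate the stop/start step after a bundle deploy is to script it against the Jobs API with the Databricks Python SDK; this is only a sketch, not a bundles feature, and the job ID below is a placeholder:

```python
# Hedged sketch: stop the currently running streaming job and start it again
# after a bundle deploy, using the Databricks Python SDK. The job_id is a
# placeholder; error handling is omitted.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()          # picks up auth from the environment / CLI profile
job_id = 123456789             # placeholder: the streaming job's ID

# Cancel any active runs so the newly deployed assets are picked up...
w.jobs.cancel_all_runs(job_id=job_id)

# ...then kick off a fresh run against the updated job definition.
w.jobs.run_now(job_id=job_id)
```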
For the past two days, our entire project team has been experiencing a recurring "Page Not Responding" error in Databricks. This issue occurs roughly every five minutes, making it impossible to run any code effectively. What's notable is that this pr...
Hi everyone, I hope you are very well. As part of research for my university, we are conducting a survey of Databricks users, aimed at gathering information about how cluster configuration and optimization is approached in the industry. If you like, you can answer i...
Hi, I am trying to add a service principal using Microsoft Entra ID, but I encounter an issue as described in the following documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/oauth-m2m. I followed the instructions step by ...
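For anyone comparing notes, once the service principal and its OAuth secret exist, the M2M flow can be sanity-checked with a short Python SDK snippet like the sketch below; the workspace URL, client ID, and secret are placeholders:

```python
# Hedged sketch of verifying OAuth machine-to-machine auth for a service
# principal with the Databricks Python SDK; all values are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    client_id="00000000-0000-0000-0000-000000000000",           # service principal's client ID (placeholder)
    client_secret="<oauth-secret>",                              # OAuth secret generated for the principal (placeholder)
)

# A simple call to confirm the token exchange works.
print(w.current_user.me().user_name)
```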
Hi! I have a job running to process multiple streaming tables. In the beginning it was working fine, but now I have 80 tables running in this job, and the problem is that all the runs are trying to run at the same time, throwing an error. Is there a way ...
DABs seem like a great way to build a project-based workflow. However, I have a need to take a user's SQL and create a TABLE/VIEW based on it. We plan to use the API to submit the request, but is this the best way going forward? Should we consider ...
DABs use Terraform, which in turn calls the API anyway. So I guess it comes down to personal preference around deployment consistency, code maintenance, etc. However, from your description of "taking a user's SQL", I'm assuming some sort of real-time appli...
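If the API route is chosen, one hedged option for running the user's SQL is the SQL Statement Execution API; in the sketch below the workspace URL, token, warehouse ID, and SQL text are all placeholders:

```python
# Hedged sketch: submit user-provided SQL (e.g. a CREATE VIEW statement)
# through the SQL Statement Execution API. Workspace URL, token, and
# warehouse_id are placeholders; validate user SQL before running it.
import requests

host = "https://<workspace-url>"          # placeholder
token = "<access-token>"                  # placeholder
warehouse_id = "<sql-warehouse-id>"       # placeholder

user_sql = "CREATE OR REPLACE VIEW demo.sales_v AS SELECT * FROM demo.sales"  # placeholder

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "statement": user_sql,
        "warehouse_id": warehouse_id,
        "wait_timeout": "30s",            # wait synchronously up to 30 seconds
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json().get("status"))
```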
Hi, I hope this message finds you well. I am reaching out regarding a concern with Databricks administrator privileges. I have an Azure subscription and I use Azure Databricks for my tutorials, but I currently do not have Global Administrator access, w...
Hello, each account should have at least one account administrator, who has the permissions on the managed account and can give you this access if applicable, or assist you with the setup of Unity Catalog (UC). You might need to ask internal...
I'm looking to disable the Personal Compute policy in Terraform; however, I don't see an option to do this. According to this documentation: https://docs.databricks.com/en/admin/clusters/personal-compute.html#customize-the-personal-compute-policy ...
I have the same issue in terraform-provider-google. I found how to do it from the Admin UI (UI -> Settings -> Features Enablement -> disable Personal Compute). At the same time, I didn't find how to get the same behavior from Terraform.
Hi everyone, as in the topic, I would like to delete an unnecessarily created account. I have found outdated solutions (e.g. https://community.databricks.com/t5/data-engineering/how-to-delete-databricks-account/m-p/6323#M2501), but they do not work anym...
Hi @Edyta, for AWS see "Manage your subscription" in the Databricks on AWS docs: "Before you delete a Databricks account, you must first cancel your Databricks subscription and delete all Unity Catalog metastores in the account. After you delete all metastores associ...
Hi All, I'm very new to Databricks and trying to understand my use case a bit better. I have a Databricks script/job that I want to be reactive to events outside of my Databricks environment. A best-case scenario would be if my script/job could aut...
Hi @AmandaOhare, you can use AWS Lambda to achieve that. You can set up a queue trigger that will activate an AWS Lambda function. In that function, you can call the Databricks REST API to launch the workflow/job.
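A minimal sketch of that Lambda approach, assuming the workspace URL, token, and job ID are provided through environment variables (all placeholders):

```python
# Hedged sketch of an AWS Lambda handler (e.g. triggered by an SQS queue) that
# starts a Databricks job via the Jobs REST API. The workspace URL, job ID,
# and token source are placeholders.
import json
import os
import urllib.request

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace-url>
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]  # stored as a Lambda env var / secret
JOB_ID = int(os.environ["DATABRICKS_JOB_ID"])      # placeholder job ID

def handler(event, context):
    # Trigger a run of the job; event contents could be passed as job parameters.
    req = urllib.request.Request(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": JOB_ID}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {DATABRICKS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())
```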
Hi folks! I'm sure I'm not the only one, but our users have the tendency to click the big Refresh button on all dashboards every time they open them. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...
Hi @leo-machado, this is a very reasonable request. It's not feasible right now, but I'll pass it on to the product team, as I'm sure you're not the only one.
Hi Team, previously we were able to access Databricks workspaces using a username and password, but now it is asking for a token. We don't have access to the mail account for the username that we use while logging in. Could you please assist with the issue: how ca...
I am using Databricks Connect with PyCharm in order to locally develop my pipelines before deploying them to Databricks. I set up authentication for databricks-connect using the following guide: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/...
I managed to solve the problem by including my login and password in the HTTPS_PROXY environment variable. I set it to something like HTTPS_PROXY = "http://username:password@proxy.company.com:port", and databricks-connect can create Spark sessions...
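For anyone hitting the same proxy issue, a hedged sketch of setting the variable in code before creating the Databricks Connect session; the proxy host, port, and credentials are placeholders:

```python
# Hedged sketch: set HTTPS_PROXY (with credentials) before creating the
# Databricks Connect session. Proxy host, port, and credentials are placeholders.
import os

os.environ["HTTPS_PROXY"] = "http://username:password@proxy.company.com:8080"

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()  # auth taken from the default profile / env vars
print(spark.range(5).count())                    # quick check that the session works
```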