- 525 Views
- 0 replies
- 0 kudos
Please help me understand the % of savings and how Databricks calculates DBCU. They are telling me that if I take the DBCU 12500 plan, the price with the discount will be 12000, a 4% discount. That means if I consume 12500 DBU, I am paying $12000 for this and 4% sa...
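In case it helps other readers, the 4% figure is consistent with paying $12,000 for consumption that would cost $12,500 at list price; a minimal sketch of that arithmetic, using only the numbers from the question (your actual DBCU contract terms may differ):

```python
# Minimal sketch of the savings arithmetic, using the numbers from the question.
list_price_usage = 12500   # $ worth of DBUs consumed at list (pay-as-you-go) price
committed_spend = 12000    # $ paid under the DBCU commit plan

savings = list_price_usage - committed_spend      # 500
savings_pct = savings / list_price_usage * 100    # 4.0

print(f"Pay ${committed_spend} for ${list_price_usage} of list-price usage "
      f"-> {savings_pct:.1f}% savings")
```

If that reading is right, the savings are baked into the commit price itself rather than applied as an extra rebate on top of it.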
- 1088 Views
- 2 replies
- 0 kudos
Hi Team, I am trying to build a dynamic dashboard, for which I have created 2 datasets, and I am passing values from the 1st dataset to the 2nd dataset as parameters. When a user selects parameter values of the 1st dataset, the 2nd dataset should use those user input val...
Latest Reply
Here are a few suggestions:
Ensure that your query in the second dataset is correctly set up to accept and use the parameters from the first dataset. The parameters should be correctly mapped to the catalog, schema, and table name in your "select * ...
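To make that mapping concrete, here is a minimal sketch of a parameterized query in the style the reply describes. The parameter names, the example values, and the `region` filter are placeholders, and the `IDENTIFIER(...)` clause with named arguments assumes a recent runtime (roughly DBR 13+ / Spark 3.4+):

```python
# Sketch: map parameter values (e.g. coming from the first dataset) into the
# catalog/schema/table of the second query. All names below are placeholders.
query = """
    SELECT *
    FROM IDENTIFIER(:catalog || '.' || :schema_name || '.' || :table_name)
    WHERE region = :region
"""

df = spark.sql(
    query,
    args={
        "catalog": "main",
        "schema_name": "sales",
        "table_name": "orders",
        "region": "EMEA",
    },
)
df.show()
```

The point of `IDENTIFIER(...)` is that ordinary parameter markers cannot substitute object names directly, so the catalog/schema/table values have to be wrapped this way rather than dropped straight into the `FROM` clause.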
- 1627 Views
- 1 replies
- 0 kudos
Hi friends; I'm working on a project where we are 4 programmers. We are working in a single environment, using only the "Workspaces" folder. Each has their own user, which is managed by Azure AD. We had a peak in consumption on the 5th of Feb. So I can see ...
Latest Reply
Hi @Retired_mod, thanks for your quick answer. Is there no other way to monitor notebook runs? I ask this because adding tags to the cluster and workspace does not solve my problem, considering that everyone uses the same cluster and the same workspa...
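One option not mentioned in the thread: if system tables are enabled for the account, billed usage can be broken down by the identity that ran the workload instead of by cluster tags. A sketch, assuming the `system.billing.usage` schema with `identity_metadata.run_as` and `usage_quantity` columns (the date range is a placeholder around the reported 5 Feb peak):

```python
# Sketch: attribute DBU usage around the consumption peak to individual users via
# the billing system table. Assumes system tables are enabled in the account and
# that the columns below exist in your workspace's schema version.
usage_by_user = spark.sql("""
    SELECT
        identity_metadata.run_as AS run_as_user,
        usage_date,
        SUM(usage_quantity)      AS dbus
    FROM system.billing.usage
    WHERE usage_date BETWEEN DATE'2024-02-04' AND DATE'2024-02-06'   -- placeholder dates
    GROUP BY identity_metadata.run_as, usage_date
    ORDER BY dbus DESC
""")
usage_by_user.show(truncate=False)
```

This works even when everyone shares the same cluster and workspace, because the attribution comes from the identity that ran the workload rather than from tags.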
- 21390 Views
- 2 replies
- 1 kudos
Our application does storage autoscaling on Azure. We would like to deploy our solution with Azure Databricks. But even though the service principal associated with our application has the necessary roles and permissions to attach/detach a disk from ...
Latest Reply
Thank you for your reply. Is there any way Databricks provides to bypass the deny assignment for specific apps? I noticed that in the deny assignment, unity-catalog-access-connector has been given an exclusion under the excludePrincipals section. Is there a w...
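For what it's worth, the deny assignment on the managed resource group can at least be inspected programmatically to confirm which principals are excluded. A sketch using the Azure SDK for Python (`azure-mgmt-authorization`); the subscription ID and managed resource group name are placeholders:

```python
# Sketch: read the deny assignment on the Databricks managed resource group and
# print which principals are excluded. This only inspects the assignment; it does
# not (and cannot) modify it.
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"                 # placeholder
managed_rg = "databricks-rg-<workspace>-<suffix>"     # placeholder managed RG name

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

for deny in client.deny_assignments.list_for_resource_group(managed_rg):
    print(deny.deny_assignment_name, "-", deny.description)
    for principal in (deny.exclude_principals or []):
        print("  excluded principal:", principal.id, principal.type)
```

As far as I know there is no supported way to add your own service principal to that excludePrincipals list; the managed resource group is intended to stay under Databricks' control, so disk attach/detach operations generally have to happen on resources outside it.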
- 4209 Views
- 2 replies
- 0 kudos
Hi @all, in Azure Databricks I am using Structured Streaming's foreachBatch functionality. In one of the functions I am creating a temp view from a PySpark DataFrame (*not a GlobalTempView) and trying to access the same temp view by using spark.sql functiona...
Latest Reply
Do you face this issue without Spark streaming as well? Also, could you share minimal repro code, preferably without streaming?
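If the error is that the temp view cannot be found inside foreachBatch, one common cause (worth checking alongside the repro request above) is that the micro-batch DataFrame is bound to its own SparkSession, so a view registered on it is not visible to the global `spark` object. A sketch of the usual workaround; the rate source and view/table names are placeholders:

```python
# Sketch of the common foreachBatch pattern: register the temp view on the
# micro-batch DataFrame's own session and query it through that same session.
def process_batch(batch_df, batch_id):
    batch_df.createOrReplaceTempView("batch_updates")

    # Use the session the micro-batch DataFrame is bound to, not the global `spark`;
    # querying the view through the global session is what typically fails here.
    batch_session = batch_df.sparkSession
    result = batch_session.sql("SELECT COUNT(*) AS cnt FROM batch_updates")
    result.show()

# A toy rate source so the sketch is self-contained; in the real pipeline this
# would be the actual streaming source.
stream = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

(stream.writeStream
    .foreachBatch(process_batch)
    .option("checkpointLocation", "/tmp/checkpoints/foreach_batch_demo")
    .start())
```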
by valjas • New Contributor III
- 6675 Views
- 1 replies
- 1 kudos
We have two environments for our Azure Databricks: Dev and Prod. We had clusters created and tested in the Dev environment, then they were exported to the Prod environment through APIs. The clusters in Dev are performing as expected, whereas the cluster...
Latest Reply
Both Prod and Dev are connected to Unity Catalog, and I am working with the same table in both envs. Can something done during the creation of the workspace itself affect the performance of clusters? Do clusters update to the latest Databricks runtime versi...
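Since the clusters were copied between workspaces via the API, it may be worth diffing the actual cluster specs rather than assuming they match. A sketch against the Clusters API (`/api/2.0/clusters/get`); hostnames, tokens, and cluster IDs are placeholders:

```python
# Sketch: pull the cluster spec from Dev and Prod and print the fields that most
# often explain performance differences.
import requests

def get_cluster(host, token, cluster_id):
    resp = requests.get(
        f"{host}/api/2.0/clusters/get",
        headers={"Authorization": f"Bearer {token}"},
        params={"cluster_id": cluster_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

FIELDS = ["spark_version", "node_type_id", "driver_node_type_id",
          "num_workers", "autoscale", "spark_conf", "runtime_engine"]

dev = get_cluster("https://adb-dev.azuredatabricks.net", "<dev-token>", "<dev-cluster-id>")
prod = get_cluster("https://adb-prod.azuredatabricks.net", "<prod-token>", "<prod-cluster-id>")

for field in FIELDS:
    if dev.get(field) != prod.get(field):
        print(f"{field}: dev={dev.get(field)} prod={prod.get(field)}")
```

On the runtime question: `spark_version` is fixed when the cluster is created and does not update automatically, so a cluster spec exported some time ago can be pinned to an older runtime than a freshly created one.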
- 11459 Views
- 2 replies
- 1 kudos
Hello All, I'm not able to access the account console page. I'm a portal admin, my workspace is Premium, and yet the Databricks portal stays in a loop, always returning to the Workspaces Overview page and not going to the account console page so t...
Latest Reply
Guys, the problem was that I was not signed in as a Global Administrator in Azure AD. After signing in as one and resetting all the browser settings, I was able to get in. Here's the tip.
- 12188 Views
- 1 replies
- 0 kudos
Hi Team, we have an @adf pipeline which runs a set of activities before the #Azure Databricks notebooks get called. As and when the notebooks are called, our pipeline launches a new cluster for every job, with job compute as Standard F4 with a sing...
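For reference, the job-cluster shape described above (a new single-node Standard F4 cluster per notebook call) corresponds roughly to a `new_cluster` spec like the one below. The runtime version is a placeholder, and in ADF these fields normally live on the Databricks linked service rather than being submitted directly:

```python
# Sketch of a single-node job cluster spec in the shape the pipeline is described
# as using. Adjust the runtime version to whatever your jobs actually pin.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",   # placeholder DBR version
    "node_type_id": "Standard_F4",
    "num_workers": 0,                       # single node: driver only
    "spark_conf": {
        "spark.databricks.cluster.profile": "singleNode",
        "spark.master": "local[*]",
    },
    "custom_tags": {"ResourceClass": "SingleNode"},
}
```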
- 3010 Views
- 3 replies
- 0 kudos
I am trying to set up a recon activity between GCP Pub/Sub and Databricks. Is there any way to fetch the last 24 hrs' record count from Pub/Sub? I tried but did not get any direct solution for it. It would be great if anyone could suggest the way t... #pubsub, ...
Latest Reply
Hi @Ajay-Pandey
Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too.
Cheers!
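One approach, not confirmed in the thread, is to do the Databricks side of the recon from the landed data: if the Pub/Sub stream is already writing to a Delta table, the last-24-hour count can be taken from the message publish timestamp and compared against the source. A sketch, where the table and column names are placeholders:

```python
# Sketch: count records received in the last 24 hours on the Databricks side of
# the recon. `bronze.pubsub_events` and `publish_ts` are placeholder names for the
# table the Pub/Sub stream writes to and its publish-timestamp column.
from pyspark.sql import functions as F

events = spark.table("bronze.pubsub_events")

last_24h_count = (
    events
    .where(F.col("publish_ts") >= F.current_timestamp() - F.expr("INTERVAL 24 HOURS"))
    .count()
)
print(f"Records ingested in the last 24 hours: {last_24h_count}")
```

As far as I know, Pub/Sub itself does not expose a direct "messages in a time window" API, so the source-side figure for the comparison typically has to come from Cloud Monitoring metrics on the topic or subscription.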