- 402 Views
- 3 replies
- 2 kudos
Can I subscribe to Databricks in AWS Marketplace via Terraform/automation? How should this be handled?
How do I subscribe to the Databricks product from AWS Marketplace using Terraform or another automation tool?
- 2 kudos
For now, we have received clarification that the Databricks subscription on AWS Marketplace can't be automated. I will be in touch; more questions are on the way.
- 3916 Views
- 3 replies
- 0 kudos
How to upload a file to a Unity Catalog volume using Databricks Asset Bundles
Hi, I am working on a CI/CD blueprint for developers, with which they can create a bundle for jobs/workflows and then create a volume to which they upload a wheel or JAR file that will be used as a dependency in their noteboo...
- 0 kudos
With this setup, users who are entitled to access the catalog will have access to the volume, provided permissions are set that way. Users will also be able to use the notebook, and we need to provide documentation either to clone the noteboo...
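For reference, here is a minimal sketch of uploading a build artifact into a Unity Catalog volume with the Databricks Python SDK, which a CI/CD pipeline can run after `bundle deploy`; the catalog, schema, volume, and wheel names below are placeholders, not values from the thread.

```python
from databricks.sdk import WorkspaceClient

# Assumes databricks-sdk is installed and credentials come from the
# default profile or environment variables.
w = WorkspaceClient()

with open("dist/my_lib-0.1.0-py3-none-any.whl", "rb") as f:
    # Files API write into a Unity Catalog volume path.
    w.files.upload(
        "/Volumes/main/default/artifacts/my_lib-0.1.0-py3-none-any.whl",
        f,
        overwrite=True,
    )
```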
- 946 Views
- 6 replies
- 5 kudos
Resolved! Event-driven Architecture with Lake Monitoring without "Trigger on Arrival" on DABs
AWS Databricks. I want to create data quality monitoring and an event-driven architecture without a file-arrival trigger, running once at deploy time. I plan to create a job that triggers once at deploy. The job runs these tasks sequentially: 1. run a script to create ex...
- 5 kudos
@tana_sakakimiya Ah, I think I see the difference. My screenshot says that "external tables" backed by Delta Lake will work. This means you'll need to have the table already created in Databricks from your external location, i.e. make an external ta...
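To illustrate the reply above, a minimal sketch of registering an external Delta table over an existing external location, run in a Databricks notebook where `spark` is predefined; the catalog/schema/table names and the S3 path are placeholders.

```python
# Hypothetical names and path; the s3:// prefix must be covered by a
# Unity Catalog external location the caller can read.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.monitoring.events_ext
    USING DELTA
    LOCATION 's3://my-bucket/landing/events/'
""")
```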
- 917 Views
- 6 replies
- 3 kudos
Resolved! How to send parameters from an HTTP request to a notebook running in a job
I've tried to trigger a job run via an n8n workflow, which can start the notebook properly. But another goal to achieve is that I have to send some data to that job run as well; I googled it and can't find solutions anywhere. My setup wa...
- 3 kudos
@AmpolJon I don't think you should give up on this method; the API allows you to pass job parameters, and you can retrieve them from within the Python notebook. Here's an example. 1. Call the API, https://docs.databricks.com/api/workspace/jobs...
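A hedged sketch of that flow, assuming a job parameter named `payload`; the workspace host, token, job ID, and parameter name are placeholders.

```python
import requests

# Trigger the job via the Jobs API run-now endpoint, passing job parameters.
resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/run-now",
    headers={"Authorization": "Bearer <token>"},
    json={"job_id": 123, "job_parameters": {"payload": "hello from n8n"}},
)
resp.raise_for_status()
print("run_id:", resp.json()["run_id"])

# Inside the notebook task, the parameter arrives as a widget:
#   value = dbutils.widgets.get("payload")
```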
- 411 Views
- 1 replies
- 1 kudos
AWS Serverless NCC
I have set up a new Databricks workspace in AWS for one of my customers, and they are using serverless compute. We are trying to obtain durable IP addresses that we can whitelist on a Redshift instance so that Databricks can run federated queries again...
- 1 kudos
Dear @lmcconnell1665, greetings for the day! The serverless firewall feature (which enables the stable public IPs you're seeking via the NCC's default rules) is currently in public preview on Databricks AWS. This means it requires explicit enablement ...
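Once the preview is enabled, something along these lines should create an NCC and bind it to the workspace using the account-level Python SDK; this is a sketch under that assumption, and the account ID, region, and workspace ID are placeholders.

```python
from databricks.sdk import AccountClient

a = AccountClient(host="https://accounts.cloud.databricks.com",
                  account_id="<account-id>")

# Create a network connectivity configuration; its default rules carry the
# stable egress IPs that can be whitelisted on the Redshift side.
ncc = a.network_connectivity.create_network_connectivity_configuration(
    name="redshift-egress", region="us-east-1")
print(ncc.egress_config)  # inspect default rules / stable IPs

# Attach the NCC to the workspace so serverless compute egresses through it.
a.workspaces.update(
    workspace_id=123456,
    network_connectivity_config_id=ncc.network_connectivity_config_id)
```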
- 4078 Views
- 7 replies
- 9 kudos
Resolved! Enforcing Tags on SQL Warehouses
Is there a way to enforce tags on SQL Warehouses? Regular cluster policies do not apply to SQL Warehouses and budget policies do not cover SQL Warehouses either, which I find quite surprising given the fact that budget policies are documented as "Att...
- 9 kudos
@Michael_Appiah Yes, the default tag policies don't apply to warehouses. The solution I can recommend is to assign a tags block if you are deploying the warehouse using Terraform, asset bundles, etc. The other solution I use is to run a ...
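In the same spirit as the scheduled check mentioned above, a small audit sketch with the Databricks Python SDK; the required tag key is a hypothetical example.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
REQUIRED_TAG = "cost-center"  # hypothetical required key

# List all SQL warehouses and flag those missing the required custom tag.
for wh in w.warehouses.list():
    custom = (wh.tags.custom_tags or []) if wh.tags else []
    tags = {t.key: t.value for t in custom}
    if REQUIRED_TAG not in tags:
        print(f"Warehouse {wh.name} ({wh.id}) is missing '{REQUIRED_TAG}'")
```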
- 287 Views
- 1 replies
- 2 kudos
Resolved! Regarding - Serverless workspace deployment
When creating a Serverless workspace in Databricks, is there any option to have the workspace’s default (root) storage bucket created in our own AWS account instead of the Databricks-managed account? I know we can set up external locations for data, ...
- 2 kudos
Hi @APJESK, unfortunately this is not possible as of now.
- 4874 Views
- 2 replies
- 0 kudos
Databricks - Cost difference between Job Clusters and DLT
Wanted to know about the cost comparison and certain specific feature details between job clusters and DLT. Per the pricing page (based on both the Azure and Databricks pricing pages), the following is my understanding - Region: US East, Provisioned, Jobs ...
- 0 kudos
@smurug24 wrote: In DLT Provisioned, there will be two clusters - updates (for performing the actual data processing) and maintenance (for performing the maintenance operations). So in the case of DLT serverless as well, will it be internally running two ...
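For a rough feel of how such a comparison is computed, a back-of-the-envelope sketch; the $/DBU rates below are hypothetical placeholders, so substitute the current rates for your region and tier from the pricing pages.

```python
# Hypothetical $/DBU rates - NOT official pricing.
JOBS_COMPUTE_RATE = 0.15
DLT_ADVANCED_RATE = 0.36

dbu_per_hour = 4   # depends on instance type and cluster size
hours = 2

jobs_cost = JOBS_COMPUTE_RATE * dbu_per_hour * hours
dlt_cost = DLT_ADVANCED_RATE * dbu_per_hour * hours
print(f"Jobs compute: ${jobs_cost:.2f} vs DLT Advanced: ${dlt_cost:.2f}")

# Classic (provisioned) compute also adds the cloud VM cost on top of DBUs,
# while serverless rates bundle the underlying compute.
```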
- 144 Views
- 1 replies
- 0 kudos
Way to set 'source' format as default for everyone on the Workspace level (instead of ipynb)
Hi, I don't want any ipynb notebooks in the project I manage. I don't want any 'output' committed into our repo, I don't like running diff tools on ipynb notebooks, etc. I want every user in my workspaces to use simple .py files. They are easier to ...
- 0 kudos
Hi @mydefaultlogin, I don't think such an option exists. It's a per-user setting, and currently you can't enforce it.
- 518 Views
- 2 replies
- 2 kudos
Resolved! Lakebase -- Enable RLS in synced Table
Dear all, I am currently testing Lakebase for integration into our overall system. In particular, I need to enable RLS on a Lakebase table that is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...
- 2 kudos
Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
- 1903 Views
- 3 replies
- 0 kudos
Move metastore to another Azure subscription
Hi, we need to migrate our metastore with Unity Catalog to a new Azure subscription while remaining in the same Azure region. Currently, we have two workspaces attached to a single Unity Catalog. I'm looking for the best approach to move the metastor...
- 0 kudos
Hi @alesventus, there are a few points to consider before migrating from one metastore to another. We need to see how the catalogs, schemas, and tables are created as of now. If you have created everything as managed, like a managed catalog, schema, and...
- 443 Views
- 4 replies
- 1 kudos
Resolved! REST API for swapping cluster
Hi team, I am trying to find the REST API reference for swapping a cluster but am unable to find it in the documentation. Can anyone please tell me the REST API reference for swapping an existing cluster for another existing cluster, if present? If no...
- 1 kudos
Hi @szymon_dybczak, it helps! I am able to change the clusters. Thanks a lot!
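There is no dedicated "swap cluster" endpoint; a hedged sketch of the usual approach, assuming the goal is to repoint a single-task job at a different existing all-purpose cluster via the Jobs API. The host, token, job ID, task key, and cluster ID are placeholders.

```python
import requests

resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/update",
    headers={"Authorization": "Bearer <token>"},
    json={
        "job_id": 123,
        "new_settings": {
            # The tasks array is replaced as a whole, so this pattern fits
            # single-task jobs; multi-task jobs must restate every task.
            "tasks": [
                {"task_key": "main",
                 "existing_cluster_id": "0601-182128-xxxxxxxx"}
            ]
        },
    },
)
resp.raise_for_status()
```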
- 2090 Views
- 9 replies
- 6 kudos
Issue accessing Databricks secrets from ADF
Hello - seeing an issue where a notebook triggered from ADF is not able to access secret scopes, which was working earlier. Here are the steps I did: 1. Provide the ADF contributor role permission in the Databricks workspace - we tested this and were able to tri...
- 6 kudos
Hey @gs2 @IkuyoshiKuroda, I have reviewed the documentation. According to the Databricks documentation on secret scopes, there are two types: Databricks-backed scopes → secrets are stored inside Databricks; these do not support authentication via Azure...
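For context, reading a secret inside a notebook looks like this; the scope and key names are placeholders, `dbutils` is predefined in Databricks notebooks, and for Key Vault-backed scopes the calling identity also needs access to the underlying Key Vault.

```python
# Fetch a secret from a scope; this raises if the scope/key is missing or
# the caller lacks permission - which is what surfaces as the ADF failure.
token = dbutils.secrets.get(scope="my-scope", key="my-key")
```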
- 647 Views
- 2 replies
- 0 kudos
Resolved! Using pip cache for pypi compute libraries
I am able to configure pip's behavior w.r.t. the index URL by setting PIP_INDEX_URL, PIP_TRUSTED_HOST, etc. I would like to cache compute-wide PyPI libraries to improve cluster startup performance/reliability. However, I notice that PIP_CACHE_DIR has no...
- 0 kudos
Hi Isi, we moved away from Docker images for the reasons you mention, and because they otherwise had issues for us. We are already using Artifactory (as hinted by the environment variables mentioned in my post). I wanted to try further improving the s...
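A sketch of setting those pip variables compute-wide at cluster creation with the Python SDK; the mirror URLs, runtime version, and node type are placeholders, and whether the library installer honors PIP_CACHE_DIR is exactly the open question in this thread, so treat the last entry as an experiment.

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.clusters.create(
    cluster_name="pip-mirror-test",
    spark_version="15.4.x-scala2.12",   # placeholder runtime
    node_type_id="m5.xlarge",           # placeholder node type
    num_workers=1,
    spark_env_vars={
        "PIP_INDEX_URL": "https://artifactory.example.com/api/pypi/pypi/simple",
        "PIP_TRUSTED_HOST": "artifactory.example.com",
        "PIP_CACHE_DIR": "/local_disk0/pip-cache",  # may be ignored by the installer
    },
)
```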
- 1844 Views
- 7 replies
- 2 kudos
Unable to see Manage Account option in the Databricks Workspace
Hi, I have an organizational account that is the owner of the Databricks workspace (premium) and is also the global administrator. Still, I don't see the "Account Console" option in Databricks after clicking the "manage account" option. I have tried to cl...
- 2 kudos
Hi @szymon_dybczak, yes, I have a Databricks premium account and admin rights as well.