- 757 Views
- 3 replies
- 2 kudos
GCP Databricks GKE cluster with 4 nodes
I am working on setting up GCP Databricks and successfully created my first GCP Databricks workspace, but I observed that it is incurring additional charges even though I am using the 14-day free trial. It is a GKE cluster with 4 nodes which are spun up as part o...
Thank you @BigRoux! Just want to dig into this more: is there any way to reduce these nodes using the CLI or by creating a customer-managed network?
- 2517 Views
- 2 replies
- 2 kudos
Resolved! Databricks All-purpose compute Pricing
Hello, I am struggling with how to calculate the cost of my job cluster. My configuration is as below. If I have to run the above cluster 18 hours per day, in the Standard Tier and East Asia region, how much will the cluster cost? Any help provi...
@karen_c Let me make a small correction. It seems that you have checked the option for Spot Instances, which should make the cost slightly lower. Please refer to the far-right column of the attached pricing table for more details. Additionally, you hav...
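For reference, the arithmetic behind such an estimate can be sketched as below. All rates here are placeholder assumptions, not real Azure Databricks prices; take the actual VM rate, DBU rate, and per-DBU price for your tier and region from the pricing table mentioned above.

```python
# Rough cluster cost estimate: (VM cost + DBU cost) per day.
# All rates below are hypothetical placeholders, not real prices.

def estimate_daily_cost(hours_per_day: float,
                        num_nodes: int,
                        vm_rate_per_hour: float,
                        dbu_per_node_hour: float,
                        price_per_dbu: float) -> float:
    """Return an estimated daily cost in USD for an all-purpose cluster."""
    vm_cost = num_nodes * vm_rate_per_hour * hours_per_day
    dbu_cost = num_nodes * dbu_per_node_hour * price_per_dbu * hours_per_day
    return vm_cost + dbu_cost

# Example: 1 driver + 2 workers = 3 nodes, 18 hours/day, with
# made-up rates: $0.50/VM-hour, 0.75 DBU/node-hour, $0.40/DBU.
# Spot Instances would lower vm_rate_per_hour, not the DBU price.
daily = estimate_daily_cost(18, 3, 0.50, 0.75, 0.40)  # 43.2
```

Multiply the daily figure by the number of billing days to approximate a monthly cost.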
- 4030 Views
- 5 replies
- 2 kudos
Resolved! Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook
The problem: after setting up Unity Catalog and a managed volume, I can upload/download files to/from the volume in the Databricks Workspace UI. However, I cannot access the volume from a notebook. I created an All-purpose compute and ran dbutils.fs.ls("/Vo...
I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is. When I enable the ADLS Gen2 account's public network access from all networks as shown below, I can access the volume from a notebook. However, if I enable the...
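For reference, the notebook access pattern discussed in this thread looks like the sketch below. The catalog/schema/volume names are hypothetical, and the `dbutils.fs.ls` call itself only works on a Databricks cluster whose network can actually reach the storage account (e.g. via a private endpoint when public access is disabled).

```python
# Unity Catalog volumes are addressed by a three-level /Volumes path.
# Names below are hypothetical examples.

def volume_path(catalog: str, schema: str, volume: str) -> str:
    """Build the /Volumes path used by dbutils and Spark file APIs."""
    return f"/Volumes/{catalog}/{schema}/{volume}"

path = volume_path("main", "default", "landing")
# In a notebook (requires cluster-to-storage network connectivity):
# dbutils.fs.ls(path)
```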
- 1123 Views
- 2 replies
- 1 kudos
Access to system.billing.usage tables
I have the Account, Marketplace, and Billing Admin roles, but I have visibility into the system.billing.list_prices table only. How do I get access to the system.billing.usage tables? The Databricks instance is on AWS. Thanks
Hi @Alberto_Umana, thanks for your response. I needed Metastore Admin permissions too. In the account console, I changed the Metastore Admin to be a group and became a part of that group. With this, the other tables were visible. With this permission, using the gr...
- 1923 Views
- 3 replies
- 0 kudos
Best Practices for Daily Source-to-Bronze Data Ingestion in Databricks
How can we effectively manage source-to-bronze data ingestion from a project perspective, particularly when considering daily scheduling strategies using either Auto Loader or Serverless Warehouse COPY INTO commands?
No, it is not a strict requirement. You can have a single-node job cluster run the job if the job is small.
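A minimal Auto Loader (cloudFiles) bronze-ingestion sketch for the daily-scheduling scenario in the question is shown below. The paths, catalog names, and source format are placeholder assumptions; the stream itself can only run on a Databricks cluster, so only the option-building helper is plain Python.

```python
# Sketch: incremental source-to-bronze ingestion with Auto Loader,
# triggered as availableNow so a daily scheduled job processes only
# new files and then stops. Paths/names are hypothetical.

def cloudfiles_options(fmt: str, schema_location: str) -> dict:
    """Options commonly set for an Auto Loader stream."""
    return {
        "cloudFiles.format": fmt,
        "cloudFiles.schemaLocation": schema_location,
        "cloudFiles.inferColumnTypes": "true",
    }

def start_bronze_stream(spark, source_dir: str, target_table: str):
    """Incrementally load new files from source_dir into a bronze Delta table."""
    opts = cloudfiles_options("json", "/Volumes/main/default/checkpoints/schema")
    return (spark.readStream.format("cloudFiles")
            .options(**opts)
            .load(source_dir)
            .writeStream
            .option("checkpointLocation", "/Volumes/main/default/checkpoints/bronze")
            .trigger(availableNow=True)  # batch-like run, fits a daily job
            .toTable(target_table))
```

The `availableNow` trigger is what makes Auto Loader behave like a scheduled batch job while still keeping exactly-once file tracking in the checkpoint.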
- 3026 Views
- 5 replies
- 0 kudos
Do any Databricks system tables contain info on saved/pre-defined queries?
How can I find the saved/pre-defined queries in the Databricks system tables? system.query.history does NOT seem to have that info, such as the query ID or query name.
Hi Bryan, Databricks system tables do not store saved queries. The query history table captures query execution details, including:
- Statement ID
- Execution status
- User who ran the query
- Statement text (if not encrypted)
- Statement type
- Execution duration
- Res...
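Since saved queries live outside the system tables, one place to look is the SQL Queries REST API instead. The sketch below uses only the standard library; the endpoint path and response field are my assumptions based on the workspace REST API, so verify them against the Databricks REST API reference before relying on this.

```python
# Sketch: list saved Databricks SQL queries via the REST API, since
# system tables only record executions. Endpoint path is an assumption.
import json
import urllib.request

def queries_url(host: str) -> str:
    """Build the list-queries endpoint URL for a workspace host."""
    return f"https://{host}/api/2.0/sql/queries"

def list_saved_queries(host: str, token: str) -> list:
    """Return saved query objects (id, display name, query text, ...)."""
    req = urllib.request.Request(
        queries_url(host),
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("results", [])
```

The returned query IDs could then be joined by hand against the statement text captured in system.query.history.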
- 1696 Views
- 2 replies
- 2 kudos
Resolved! Seeking Practical Example for Structured Streaming with Delta Tables in Medallion Architecture
Hi everyone, I’m working on implementing Structured Streaming in Databricks to capture Change Data Capture (CDC) as part of a Medallion Architecture (Bronze, Silver, and Gold layers). While Microsoft’s documentation provides a theoretical approach, I’...
Hi @JissMathew, do you have access to Databricks Academy? I believe in their data engineering track there are plenty of example notebooks. Or you can try dbdemos; for example, here you can find the demo notebook for Auto Loader: Databricks Autoloader (cloudfile...
- 954 Views
- 1 reply
- 1 kudos
Resolved! Multiple volumes from same external location?
Hey all, do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location? For example, if I have two workspaces (test and prod) testing different versions of pipeline code but with static data, I'd ...
Yes, it is a limitation: it is not possible to create multiple volumes referencing the same S3 bucket. This restriction ensures consistency and prevents conflicts when accessing the same data source. Possible solution: use subdirectories within the...
- 578 Views
- 3 replies
- 0 kudos
Environment Notification / Message
Is it somehow possible to create a message or alert for specific Databricks environments to make people more aware that they are using, e.g., a PROD environment? It can be reflected in the environment name, like "dev" or "prod", yes, but it would be n...
It seems that for Azure the process is a little bit different; you might follow the steps in https://learn.microsoft.com/en-us/azure/databricks/resources/ideas
- 618 Views
- 2 replies
- 0 kudos
getting job_parameters object with sql
Hey, in order to create more meaningful monitoring of usage for a few platform jobs I am using, I need to be able to access the job_parameters object of job runs. While job_parameters exists in the system.workflow.job_run_timeline table, it is not populated ...
@yairofek wrote: Hey, in order to create more meaningful monitoring of usage for a few platform jobs I am using, I need to be able to access the job_parameters object of job runs. While job_parameters exists in the system.workflow.job_run_timeline table, it ...
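While the system table column is not populated, one workaround is to pull run details from the Jobs REST API, which returns a job_parameters array for jobs that define job-level parameters. The sketch below uses only the standard library; the host, token, and exact response shape are assumptions to check against the Jobs API reference.

```python
# Sketch: fetch job_parameters for a run via GET /api/2.1/jobs/runs/get,
# as a stopgap until the system table column is populated.
import json
import urllib.request

def run_get_url(host: str, run_id: int) -> str:
    """Build the runs/get endpoint URL for a given run."""
    return f"https://{host}/api/2.1/jobs/runs/get?run_id={run_id}"

def get_job_parameters(host: str, token: str, run_id: int) -> list:
    """Return the job_parameters array of a run (may be empty)."""
    req = urllib.request.Request(
        run_get_url(host, run_id),
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("job_parameters", [])
```

Run IDs from system.workflow.job_run_timeline could be fed into this call to backfill the missing parameter values.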
- 7325 Views
- 7 replies
- 6 kudos
Resolved! Secret scope with Azure RBAC
Hello! We have lots of Azure Key Vaults that we use in our Azure Databricks workspaces. We have created secret scopes that are backed by the Key Vaults. Azure supports two ways of authenticating to Key Vaults: access policies, which have been marked as l...
@Chamak You can find 'AzureDatabricks' under User, group or service principal assignment. You don't need to find the application ID, as it will be displayed automatically when you add AzureDatabricks as a member. cc: @daniel_sahal
- 1504 Views
- 5 replies
- 2 kudos
Resolved! NAT gateway with public IP for SCC disabled Databricks cluster
Hi Team, we need a single public IP for all outbound traffic flowing through our Databricks cluster. Secure Cluster Connectivity (SCC) is disabled for our cluster, and currently we get dynamic public IPs assigned to the VMs under the managed res...
- 526 Views
- 1 reply
- 0 kudos
Renaming a resource group on Azure: is it possible?
Hello community, a month ago I deployed a resource group with a particular name, with two Databricks workspaces deployed inside it. Is it possible to rename the resource group without any problems, or do I need to move the existing dbws to a n...
Hi @jeremy98, unfortunately, you cannot rename a resource group. You need to create a new resource group and recreate all required resources.
- 673 Views
- 1 reply
- 0 kudos
Resolved! System tables on workspace level
I could be mistaken, but it seems like the system tables contain data from all workspaces, even workspaces that you don't have access to. According to the principle of least privilege, I do not think that's a good idea. If the aforementioned is correct, has s...
As per the documentation, it is confirmed that system tables include data from all workspaces in your account, but they can only be accessed from a workspace with Unity Catalog; you can restrict which admins have access to these system tables. It is not possib...
- 7217 Views
- 1 reply
- 0 kudos
Managing databricks workspace permissions
I need assistance with writing API/Python code to manage Databricks workspace permissions (Unity Catalog). The task involves obtaining a list of workspace details from the account console, which includes various details like workspace name,...
Here's a start: https://docs.databricks.com/api/workspace/workspacebindings/updatebindings. As far as coding goes, I use cURL; see the attachment for the syntax. Note the example in the attachment is for workspace notebooks, as opposed to workspace envir...
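For readers who prefer Python over cURL, the linked update-bindings call can be sketched roughly as below with only the standard library. The securable type, payload field names, and binding type value are taken from the linked API page as I understand it, so treat them as assumptions and verify against the reference.

```python
# Sketch: bind a Unity Catalog securable (here a catalog) to a workspace
# via PATCH /api/2.1/unity-catalog/bindings/{type}/{name}.
# Host, token, and names are placeholders.
import json
import urllib.request

def bindings_url(host: str, securable_type: str, name: str) -> str:
    """Build the workspace-bindings endpoint URL for a securable."""
    return f"https://{host}/api/2.1/unity-catalog/bindings/{securable_type}/{name}"

def bind_catalog_to_workspace(host: str, token: str,
                              catalog: str, workspace_id: int) -> dict:
    """Add a read-write workspace binding to a catalog."""
    body = json.dumps({
        "add": [{"workspace_id": workspace_id,
                 "binding_type": "BINDING_TYPE_READ_WRITE"}]
    }).encode()
    req = urllib.request.Request(
        bindings_url(host, "catalog", catalog),
        data=body,
        method="PATCH",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Looping this over the workspace list from the account console would cover the bulk-permissions task described in the question.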
Labels:
- Access control (1)
- Apache spark (1)
- AWS (5)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta (4)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (26)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)