Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

AlbertWang
by Valued Contributor
  • 2241 Views
  • 5 replies
  • 2 kudos

Resolved! Azure Databricks Unity Catalog - Cannot access Managed Volume in notebook

The problem: After setting up Unity Catalog and a managed Volume, I can upload/download files to/from the volume in the Databricks Workspace UI. However, I cannot access the volume from a notebook. I created an All-purpose compute and ran dbutils.fs.ls("/Vo...

Latest Reply
AlbertWang
Valued Contributor
  • 2 kudos

I found the reason and a solution, but I feel this is a bug, and I wonder what the best practice is. When I enable the ADLS Gen2 storage account's public network access from all networks as shown below, I can access the volume from a notebook. However, if I enable the...
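For anyone hitting the same error, a minimal sketch of reading a Unity Catalog managed volume from a notebook once networking is sorted out (the catalog, schema, and volume names are placeholders, not taken from the original post):

# Hypothetical example: list a Unity Catalog managed volume from a notebook.
# Volumes are addressed via the /Volumes/<catalog>/<schema>/<volume> path scheme;
# all names below are placeholders.
for f in dbutils.fs.ls("/Volumes/my_catalog/my_schema/my_volume/"):
    print(f.path, f.size)

# On recent runtimes the same path also works with plain Python file APIs:
with open("/Volumes/my_catalog/my_schema/my_volume/example.csv") as fh:
    print(fh.readline())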

4 More Replies
raghu2
by New Contributor III
  • 558 Views
  • 2 replies
  • 1 kudos

Access to system.billing.usage tables

I have the Account, Marketplace, and Billing Admin roles, but I only have visibility into the system.billing.list_prices table. How do I get access to the system.billing.usage tables? The Databricks instance is on AWS. Thanks.

Latest Reply
raghu2
New Contributor III
  • 1 kudos

Hi @Alberto_Umana, thanks for your response. I needed Metastore Admin permissions too. In the account console, I changed the Metastore Admin to be a group and became a member of that group. With this, the other tables were visible. With this permission, using the gr...
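As a rough sketch of the grant flow described above, run by a metastore admin from a notebook (the group name is a placeholder, and the exact set of privileges required is an assumption rather than something confirmed in the thread):

# Hypothetical grants so a group can read the billing system tables.
# Depending on the setup, USE CATALOG on the system catalog may also be needed.
group = "`billing-readers`"  # placeholder group name
spark.sql(f"GRANT USE SCHEMA ON SCHEMA system.billing TO {group}")
spark.sql(f"GRANT SELECT ON TABLE system.billing.usage TO {group}")

# Quick check that the table is now visible:
display(spark.sql("SELECT * FROM system.billing.usage LIMIT 10"))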

1 More Replies
JissMathew
by Contributor III
  • 478 Views
  • 3 replies
  • 0 kudos

Best Practices for Daily Source-to-Bronze Data Ingestion in Databricks

How can we effectively manage source-to-bronze data ingestion from a project perspective, particularly when considering daily scheduling strategies using either Auto Loader or Serverless Warehouse COPY INTO commands?
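For reference, a minimal Auto Loader sketch for a daily source-to-bronze run; the paths, table name, and file format are placeholders, and a scheduled job using trigger(availableNow=True) is just one common pattern, not necessarily the asker's setup:

# Hypothetical daily source-to-bronze ingestion with Auto Loader.
source_path = "s3://my-landing-bucket/raw/orders/"                   # placeholder
checkpoint_path = "/Volumes/my_catalog/bronze/checkpoints/orders/"   # placeholder

(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(source_path)
    .writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)   # process new files since the last run, then stop
    .toTable("my_catalog.bronze.orders"))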

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

No, it is not a strict requirement. You can have a single node job cluster run the job if the job is small.

2 More Replies
BryanC
by New Contributor II
  • 596 Views
  • 5 replies
  • 0 kudos

Do any Databricks system tables contain info about saved/pre-defined queries?

How can I find the saved/pre-defined queries in the Databricks system tables? system.query.history does not seem to have that info, such as query ID or query name.

Labels: Administration & Architecture, query, System Tables, system-table
Latest Reply
tapash-db
Databricks Employee
  • 0 kudos

Hi Bryan, Databricks system tables do not store saved queries. The query history table captures query execution details, including:
  • Statement ID
  • Execution status
  • User who ran the query
  • Statement text (if not encrypted)
  • Statement type
  • Execution duration
  • Res...
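For the execution-detail side, a small sketch of pulling those fields out of system.query.history; the column names below are assumptions based on the fields listed in the reply and may differ slightly:

# Hypothetical query against the query history system table.
recent = spark.sql("""
    SELECT statement_id,
           execution_status,
           executed_by,
           statement_text,
           statement_type,
           total_duration_ms
    FROM system.query.history
    ORDER BY start_time DESC
    LIMIT 20
""")
display(recent)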

4 More Replies
JissMathew
by Contributor III
  • 1083 Views
  • 2 replies
  • 2 kudos

Resolved! Seeking Practical Example for Structured Streaming with Delta Tables in Medallion Architecture

Hi everyone, I'm working on implementing Structured Streaming in Databricks to capture Change Data Capture (CDC) as part of a Medallion Architecture (Bronze, Silver, and Gold layers). While Microsoft's documentation provides a theoretical approach, I'...
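In the meantime, here is one minimal bronze-to-silver sketch using Structured Streaming with foreachBatch and MERGE, which is a common way to apply CDC rows; all table names, the join key, and the checkpoint path are placeholders rather than the asker's actual pipeline:

from delta.tables import DeltaTable

# Hypothetical bronze-to-silver CDC apply; names are placeholders.
def upsert_to_silver(batch_df, batch_id):
    # In a real CDC feed you would first deduplicate to the latest change per key.
    silver = DeltaTable.forName(spark, "my_catalog.silver.customers")
    (silver.alias("t")
        .merge(batch_df.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

(spark.readStream
    .table("my_catalog.bronze.customers_cdc")
    .writeStream
    .foreachBatch(upsert_to_silver)
    .option("checkpointLocation", "/Volumes/my_catalog/silver/checkpoints/customers/")
    .trigger(availableNow=True)
    .start())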

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @JissMathew, do you have access to Databricks Academy? I believe their data engineering track has plenty of example notebooks. Or you can try dbdemos. For example, here you can find a demo notebook for Auto Loader: Databricks Autoloader (cloudfile...

1 More Replies
dsmoore
by New Contributor II
  • 334 Views
  • 1 reply
  • 1 kudos

Resolved! Multiple volumes from same external location?

Hey all, do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location? For example, if I have two workspaces (test and prod) testing different versions of pipeline code but with static data, I'd ...

Latest Reply
ozaaditya
Contributor
  • 1 kudos

Yes, it is a limitation: it is not possible to create multiple volumes referencing the same S3 bucket. This restriction ensures consistency and prevents conflicts when accessing the same data source. Possible solution: use subdirectories within the...
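One reading of that suggestion, sketched with placeholder names (a single external volume over the shared bucket, with each environment reading its own subdirectory):

# Hypothetical: one external volume over the shared bucket.
spark.sql("""
    CREATE EXTERNAL VOLUME IF NOT EXISTS my_catalog.shared.static_data
    LOCATION 's3://my-shared-bucket/static-data/'
""")

# Test and prod pipelines then point at different subpaths of the same volume:
test_df = spark.read.parquet("/Volumes/my_catalog/shared/static_data/test/")
prod_df = spark.read.parquet("/Volumes/my_catalog/shared/static_data/prod/")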

janhouf
by New Contributor
  • 1053 Views
  • 1 reply
  • 0 kudos

Query has been timed out due to inactivity.

Hi, we're experiencing an issue with a SQL Serverless Warehouse when running queries through the dbx-sql-connector in Python. The error we get is: "Query has been timed out due to inactivity." This happens intermittently, even for queries that should com...

Latest Reply
ozaaditya
Contributor
  • 0 kudos

Possible reasons for this error may include:
  • The warehouse is busy or waiting for compute resources.
  • Connection or network issues.
Solutions to try:
  • Increase the timeout duration and try again.
  • If the issue persists, please share the error message for fur...
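A hedged sketch of what raising the timeout through the connector might look like; the session_configuration key and its units are assumptions, and the hostname, HTTP path, and token are placeholders:

from databricks import sql

# Hypothetical connection with a longer statement timeout.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",                        # placeholder
    access_token="<personal-access-token>",                        # placeholder
    session_configuration={"STATEMENT_TIMEOUT": "3600"},           # assumed key, in seconds
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchall())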

DominikBraun
by New Contributor II
  • 278 Views
  • 3 replies
  • 0 kudos

Environment Notification / Message

Is it somehow possible to create a message or alert for specific Databricks environments to make people more aware that they are using, e.g., a PROD environment? It can be reflected in the environment name, like "dev" or "prod", yes, but it would be n...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

It seems that for Azure the process is a little bit different; you might follow the steps in https://learn.microsoft.com/en-us/azure/databricks/resources/ideas

2 More Replies
yairofek
by New Contributor
  • 241 Views
  • 2 replies
  • 0 kudos

Getting the job_parameters object with SQL

Hey, in order to create more meaningful monitoring of usage for a few platform jobs I am using, I need to be able to access the job_parameters object of job runs. While job_parameters exists in the system.workflow.job_run_timeline table, it is not populated ...
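For context, the kind of query being attempted probably looks something like the sketch below; the table and column names follow the post and are not otherwise verified here:

# Hypothetical look at job parameters for recent runs.
runs = spark.sql("""
    SELECT job_id,
           run_id,
           period_start_time,
           job_parameters
    FROM system.workflow.job_run_timeline
    WHERE job_parameters IS NOT NULL
    ORDER BY period_start_time DESC
    LIMIT 50
""")
display(runs)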

Latest Reply
michelle653burk
New Contributor III
  • 0 kudos

@yairofek wrote: Hey, in order to create more meaningful monitoring of usage for a few platform jobs I am using, I need to be able to access the job_parameters object of job runs. While job_parameters exists in the system.workflow.job_run_timeline table, it ...

1 More Replies
achistef
by New Contributor III
  • 3981 Views
  • 7 replies
  • 6 kudos

Resolved! Secret scope with Azure RBAC

Hello! We have lots of Azure Key Vaults that we use in our Azure Databricks workspaces. We have created secret scopes that are backed by the Key Vaults. Azure supports two ways of authenticating to Key Vaults: - Access policies, which has been marked as l...

Latest Reply
kuldeep-in
Databricks Employee
  • 6 kudos

@Chamak You can find 'AzureDatabricks' under User, group, or service principal assignment. You don't need to find the application ID, as it will be displayed automatically when you add AzureDatabricks as a member. cc: @daniel_sahal

6 More Replies
vsd
by New Contributor III
  • 666 Views
  • 5 replies
  • 2 kudos

Resolved! NAT gateway with public IP for SCC disabled Databricks cluster

Hi team, we need to have a single public IP for all outbound traffic flowing through our Databricks cluster. Secure Cluster Connectivity (SCC) is disabled for our cluster, and currently we get dynamic public IPs assigned to the VMs under the managed res...

Latest Reply
vsd
New Contributor III
  • 2 kudos

Thank you @szymon_dybczak! That's helpful!

4 More Replies
jeremy98
by Contributor III
  • 231 Views
  • 1 reply
  • 0 kudos

Renaming a resource group on Azure: is it possible?

Hello community, one month ago I deployed a resource group with a particular name, inside of which two Databricks workspaces are deployed. Is it possible to rename the resource group without any problem? Or do I need to move the existing dbws to a n...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @jeremy98, unfortunately you cannot rename a resource group. You need to create a new resource group and recreate all required resources.

Mario_D
by New Contributor III
  • 257 Views
  • 1 reply
  • 0 kudos

Resolved! System tables on workspace level

I could be mistaken, but it seems like the system tables contain data from all workspaces, even workspaces that you don't have access to. Following the principle of least privilege, I do not think that's a good idea. If the aforementioned is correct, has s...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

As per the documentation, it is confirmed that system tables include data from all workspaces in your account, but they can only be accessed from a workspace with Unity Catalog enabled. You can restrict which admins have access to these system tables. It is not possib...
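Since the tables are account-wide, scoping a query to a single workspace is typically a matter of filtering on the workspace ID, roughly as in the sketch below (the workspace ID is a placeholder and the column names are assumptions for system.billing.usage):

# Hypothetical: restrict an account-wide system table query to one workspace.
my_workspace_id = "1234567890123456"  # placeholder
usage = spark.sql(f"""
    SELECT usage_date, sku_name, usage_quantity
    FROM system.billing.usage
    WHERE workspace_id = '{my_workspace_id}'
    ORDER BY usage_date DESC
""")
display(usage)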

Learnit
by New Contributor II
  • 3474 Views
  • 1 reply
  • 0 kudos

Managing Databricks workspace permissions

I need assistance with writing API/Python code to manage a Databricks workspace permissions database (Unity Catalog). The task involves obtaining a list of workspace details from the account console, which includes various details like workspace name,...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Here's a start: https://docs.databricks.com/api/workspace/workspacebindings/updatebindings. As far as coding goes, I use cURL; see the attachment for the syntax. Note that the example in the attachment is for workspace notebooks, as opposed to workspace envir...
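A rough Python equivalent of that cURL call, for readers who prefer requests; the endpoint path and payload shape follow the linked updatebindings doc as I understand it, and the host, token, catalog name, and workspace ID are placeholders:

import requests

# Hypothetical call to the workspace bindings API.
host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "<personal-access-token>"                            # placeholder

resp = requests.patch(
    f"{host}/api/2.1/unity-catalog/bindings/catalog/my_catalog",
    headers={"Authorization": f"Bearer {token}"},
    json={"add": [{"workspace_id": 1234567890123456,
                   "binding_type": "BINDING_TYPE_READ_WRITE"}]},
)
resp.raise_for_status()
print(resp.json())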

xzero-trustx
by New Contributor
  • 337 Views
  • 1 reply
  • 0 kudos

Get hardware metrics like CPU usage and memory usage and send them to Azure Monitor

Hello guys, I would like to get hardware metrics like server load distribution, CPU utilization, and memory utilization and send them to Azure Monitor. Is there any way to do this? Can you help me with this? Thanks.

Latest Reply
michelle653burk
New Contributor III
  • 0 kudos

@xzero-trustx wrote: Hello guys, I would like to get hardware metrics like server load distribution, CPU utilization, and memory utilization and send them to Azure Monitor. Is there any way to do this? Can you help me with this? Thanks. Hello! Yes, you ca...


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group