- 900 Views
- 3 replies
- 0 kudos
How to get the Usage/DBU Consumption report without using system tables
Is there a way to get the usage/DBU consumption report without using system tables?
To @Cloud_Architect: The Databricks Account Console offers a "Usage" page that provides an overview of your DBU consumption across workspaces. You can access it by navigating to Account Settings -> Usage. This page allows you to view usage data in DB...
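If a programmatic route is acceptable alongside the Account Console page, a minimal sketch is shown below. It assumes the account-level billable usage download endpoint (`/api/2.0/accounts/{account_id}/usage/download`), the AWS account console host, and an account admin token; verify the endpoint and host for your cloud before relying on it.

```python
import requests

# Assumptions: AWS account console host, an account admin token, and the
# billable usage download endpoint -- all placeholders to verify.
ACCOUNT_HOST = "https://accounts.cloud.databricks.com"
ACCOUNT_ID = "<account-id>"
TOKEN = "<account-admin-token>"

resp = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/usage/download",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"start_month": "2024-01", "end_month": "2024-03", "personal_data": "false"},
)
resp.raise_for_status()

# The endpoint returns billable usage as CSV text, which can be saved and analyzed locally.
with open("billable_usage.csv", "w") as f:
    f.write(resp.text)
```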
- 1535 Views
- 2 replies
- 2 kudos
UI menu customisation
I want to customize the UI menu so that some users/groups can't, for example, create jobs or make experiments. As I see it, when Unity Catalog is enabled, everyone can create jobs (if they have attach-to-cluster permission). But in my organization, th...
I am also searching for a solution to restrict users from creating new jobs, but it seems there isn't one.
- 4213 Views
- 4 replies
- 2 kudos
Databricks Asset Bundles and Dashboards
Hi Databricks Team! I saw in the documentation that Databricks Asset Bundles will support dashboards as well in the future. Could you please share when we can expect that feature to be available? Is it coming only for the new Lakeview dashboards or ...
Why is it not possible to deploy dashboards in asset bundles?
- 1026 Views
- 1 replies
- 2 kudos
MLOps on Azure: API vs SDK vs Databricks CLI?
Hello fellow community members, in our organization we have developed, deployed, and utilized an API-based MLOps pipeline using Azure DevOps. The CI/CD pipeline has been developed and refined for about 18 months or so, and I have to say that it is pret...
Hello @ManiMar, in my opinion it's up to you to choose, and you're on the right path by comparing the pros and cons of each approach. I'd like to highlight that one of the advantages of the Databricks CLI is being able to use Databricks Asset Bundles. I...
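To make the comparison concrete, here is a small, hedged sketch of the two programmatic options mentioned in this thread. It assumes the `databricks-sdk` Python package, authentication via environment variables or a configured profile, and a recent Databricks CLI with bundle support; the target name is just an example.

```python
import subprocess

from databricks.sdk import WorkspaceClient

# Option 1: the Python SDK -- call workspace APIs directly from pipeline code.
w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN or a profile
for job in w.jobs.list(limit=5):
    print(job.job_id, job.settings.name if job.settings else None)

# Option 2: the Databricks CLI -- lets the pipeline deploy Databricks Asset Bundles
# instead of orchestrating raw API calls ("dev" is a placeholder target).
subprocess.run(["databricks", "bundle", "deploy", "--target", "dev"], check=True)
```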
- 1521 Views
- 3 replies
- 0 kudos
Databricks Certification Exam Got Suspended. Require support for the same.
Hello Team, I had a very poor experience while attempting my first Databricks certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam without giving any reaso...
@Kaniz @Cert-Team @Sujitha I have sent multiple emails to the Support team to reschedule my exam with a date, but I have not received any confirmation from them. Please look into this issue and reschedule the exam as soon as possible. This certification...
- 624 Views
- 0 replies
- 0 kudos
Got suspended while attempting Databricks Certified Associate Developer for Apache Spark 3.0 Python
Hi Team, my Databricks certification exam got suspended. I was continuously in front of the camera when an alert appeared, and then my exam resumed. Later, a support person asked me to show the entire table and the entire room; I showed them around the room...
- 1286 Views
- 1 replies
- 1 kudos
Resolved! Capture return value from Databricks job to local machine via CLI
Hi, I want to run Python code in a Databricks notebook and return the value to my local machine. Here is the summary: I upload files to volumes on Databricks. I generate an md5 for the local file. Once the upload is finished, I create a Python script with t...
Hello @pshuk, you could check the following CLI commands: get-run-output (get the output for a single run; the related REST API reference is https://docs.databricks.com/api/workspace/jobs/getrunoutput) and export-run. There's al...
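A small sketch of pulling that exit value back to the local machine with the command mentioned above. It assumes the notebook ends with `dbutils.notebook.exit(<value>)`, a locally configured Databricks CLI, and a known run ID (the run ID and the `--output json` flag behavior should be verified against your CLI version).

```python
import json
import subprocess

RUN_ID = "1234567890"  # placeholder run id returned when the job run was triggered

# Call the CLI command discussed above and parse its JSON output.
result = subprocess.run(
    ["databricks", "jobs", "get-run-output", RUN_ID, "--output", "json"],
    check=True,
    capture_output=True,
    text=True,
)
output = json.loads(result.stdout)

# For notebook tasks, the value passed to dbutils.notebook.exit() appears
# under notebook_output.result in the run output.
print(output.get("notebook_output", {}).get("result"))
```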
- 1914 Views
- 1 replies
- 0 kudos
Resolved! Error Code: METASTORE_DOES_NOT_EXIST when using Databricks API
Hello, I'm attempting to use the Databricks API to list the catalogs in the metastore. When I send the GET request to `/api/2.1/unity-catalog/catalogs`, I get this error. I have checked multiple times and yes, we do have a metastore associated with t...
It turns out I was using the wrong Databricks host URL when querying from Postman. I was using my Azure instance instead of my AWS instance.
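For anyone hitting the same error, a quick sketch of the check: the host must be the workspace URL of the instance that actually holds the metastore. The AWS-style host and token below are placeholders.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder: the workspace that owns the metastore
TOKEN = "<personal-access-token>"

# Same endpoint as in the question; a wrong host returns METASTORE_DOES_NOT_EXIST.
resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/catalogs",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

for catalog in resp.json().get("catalogs", []):
    print(catalog["name"])
```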
- 11645 Views
- 3 replies
- 3 kudos
Resolved! Use SQL Server Management Studio to Connect to Databricks?
The notebook UI doesn't always provide the best experience for running exploratory SQL queries. Is there a way for me to use SQL Server Management Studio (SSMS) to connect to Databricks? See also: https://learn.microsoft.com/en-us/answers/questions/74...
What you can do is define a SQL endpoint as a linked server. That way you can use SSMS and T-SQL. However, it has some drawbacks (no/bad query pushdown, no caching). Here is an excellent blog by Kyle Hale of Databricks: Tutorial: Create a Databricks S...
- 1103 Views
- 1 replies
- 2 kudos
Ingest an on-prem CSV file into a Delta table on Databricks
Hi, I want to create a Delta Live Table using a CSV file that I create locally (on-prem). A little background: I have a working ELT pipeline that finds newly generated files (since the last upload) and uploads them to a Databricks volume, and at th...
Hello @pshuk, based on your description you have an external pipeline that writes CSV files to a specific storage location, and you wish to set up a DLT pipeline on the output of that pipeline. DLT offers a feature called Auto Loader, whic...
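A minimal sketch of the Auto Loader approach referenced in the reply, assuming the files land in a Unity Catalog volume path; the catalog, schema, volume, and table names below are hypothetical.

```python
import dlt

# `spark` is provided by the DLT runtime when this runs inside a pipeline.
@dlt.table(name="raw_uploads", comment="CSV files ingested incrementally from the volume")
def raw_uploads():
    # Auto Loader (cloudFiles) only picks up files it has not processed yet.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "csv")
        .option("header", "true")
        .load("/Volumes/main/raw/uploads")  # hypothetical volume path
    )
```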
- 1425 Views
- 3 replies
- 3 kudos
I am facing an issue while generating the DBU consumption report and need help.
I am trying to access the following system tables to generate a DBU consumption report, but I am not seeing these tables in the system schema. Could you please help me with how to access them? system.billing.inventory, system.billing.workspaces, system.billing...
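Since billing system schemas generally have to be enabled per metastore by an admin before their tables appear, here is a hedged sketch using what I believe is the system schemas API; verify the endpoint against the current REST API reference. The host, token, and metastore ID are placeholders.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace host
TOKEN = "<admin-token>"                                  # placeholder admin token
METASTORE_ID = "<metastore-id>"                          # placeholder metastore id

# Assumed endpoint: PUT /api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/{schema}
resp = requests.put(
    f"{HOST}/api/2.0/unity-catalog/metastores/{METASTORE_ID}/systemschemas/billing",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
print("Enable request for system.billing returned:", resp.status_code)
```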
- 1593 Views
- 2 replies
- 0 kudos
Delta Sharing - Info about Share Recipient
What information do you know about a share recipient when they access a table shared with them via Delta Sharing? Wondering if we might be able to utilize something along the lines of is_member, is_account_group_member, session_user, etc. for ROW and COL...
Now that I'm looking closer at the share credentials and the recipient entity, you would really need a way to know the bearer token and relate that back to various recipient properties: databricks.name and any custom recipient property tags you may h...
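As a hedged illustration of the property-based idea above, the sketch below assumes the `current_recipient()` SQL function is available for views shared via Delta Sharing and that the recipient was created with a custom property named `region`; the catalog, schema, and table names are hypothetical.

```python
# Runs in a Databricks notebook where `spark` is available.
view_ddl = """
CREATE VIEW IF NOT EXISTS main.sales.shared_orders_v AS
SELECT *
FROM main.sales.orders
WHERE region = current_recipient('region')  -- filter rows per recipient property
"""
spark.sql(view_ddl)
```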
- 1494 Views
- 0 replies
- 0 kudos
Parallel Kafka consumers in Spark Structured Streaming
Hi, I have a Spark streaming job which reads from Kafka, processes the data, and writes to Delta Lake. Number of Kafka partitions: 100. Number of executors: 2 (4 cores each). So we have 8 cores in total reading from 100 partitions of a topic. I wanted to un...
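No reply yet, but as a hedged sketch of the knob most relevant here: the Kafka source normally maps topic partitions to input partitions, and the `minPartitions` option can ask Spark to split them further across the available cores. The broker, topic, checkpoint path, and table name below are placeholders.

```python
# Assumes a Databricks/Spark notebook where `spark` is available.
df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .option("minPartitions", "200")  # request more input splits than the 100 topic partitions
    .load()
)

# Write the stream to a Delta table (placeholder checkpoint path and table name).
query = (
    df.writeStream.format("delta")
    .option("checkpointLocation", "/Volumes/main/raw/checkpoints/events")
    .toTable("main.raw.events")
)
```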
- 1555 Views
- 1 replies
- 0 kudos
Updating Databricks SQL Warehouse using Terraform
We can update a SQL Warehouse manually in Databricks: click SQL Warehouses in the sidebar, and under Advanced options we can find the Unity Catalog toggle button there. While updating an existing SQL Warehouse in Azure to enable Unity Catalog using Terraform, I couldn'...
- 979 Views
- 0 replies
- 1 kudos
How to develop notebooks in VS Code for Git repos?
I am able to use the VS Code extension + Databricks Connect to develop notebooks on my local computer and run them on my Databricks cluster. However, I cannot figure out how to develop the notebooks that have the `.py` file extension but are identified by Dat...
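For reference, a `.py` file is recognized as a Databricks notebook when it begins with the notebook source header; below is a minimal sketch of that format (the cell contents are just examples), which both the workspace and the VS Code extension understand.

```python
# Databricks notebook source
# The header comment above is what marks this .py file as a notebook
# rather than a plain Python file.

print("first cell")

# COMMAND ----------

# Each "COMMAND ----------" comment starts a new notebook cell.
print("second cell")

# COMMAND ----------

# MAGIC %sql
# MAGIC SELECT 1
```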