- 1249 Views
- 1 reply
- 0 kudos
Linking Workspace IDs to Names in Billing Schema
Hi everyone, We recently enabled UC and the Billing system table to monitor our usage and costs. We've successfully set up a dashboard to track these metrics for each workspace. The usage table includes the workspace_id, but I'm having trouble finding...
- 0 kudos
I got this from their older version of the dashboard: dbdemos uc-04-system-tables. When everything is executed, go to your graph in the dashboard, click the three dots in the top right, select (in my case) "View dataset: usage_overview", then paste/modify the SQL ...
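As a rough illustration of the kind of join this reply points toward, here is a minimal PySpark sketch that maps workspace_id values from system.billing.usage to friendly names via a small lookup you maintain yourself. The workspace IDs and names below are placeholders, and your dataset's column grouping may differ.

```python
# Minimal sketch (assumptions: you have SELECT access to system.billing.usage;
# the workspace IDs/names below are placeholders for your own mapping).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hand-maintained lookup of workspace_id -> friendly name (hypothetical values).
workspace_names = spark.createDataFrame(
    [("1111111111111111", "dev"), ("2222222222222222", "prod")],
    ["workspace_id", "workspace_name"],
)
workspace_names.createOrReplaceTempView("workspace_names")

# Join usage records to the lookup so the dashboard can group by name instead of ID.
usage_by_name = spark.sql("""
    SELECT w.workspace_name,
           u.usage_date,
           SUM(u.usage_quantity) AS dbus
    FROM system.billing.usage u
    LEFT JOIN workspace_names w ON u.workspace_id = w.workspace_id
    GROUP BY w.workspace_name, u.usage_date
""")
usage_by_name.show()
```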
- 595 Views
- 1 reply
- 0 kudos
Unable to Access Data Apps/Templates in Databricks Workspace
Hi, We are currently unable to view any of the standard apps/templates (such as Data Apps, Chatbot, and Hello World) within our Databricks workspace. Workspace Details: Workspace Name: Kdataai; Workspace URL: https://1027317917071147.7.gcp.databricks.com; C...
- 0 kudos
Hello @aishwaryakdataa! Could you please verify if the Databricks Apps service is enabled in your workspace? You may also need to review the App configuration settings and ensure that the appropriate roles and permissions are in place to access the t...
- 983 Views
- 3 replies
- 0 kudos
Unable to add a microsoft security group as Workspace Admin
I'm a workspace admin for a Databricks workspace. I can add a Microsoft security group in the workspace. When I click on the group to view it, I can see the members of the group, and they match what Azure AD reflects, but it throws an error on the ...
- 0 kudos
@pranav5 This issue usually occurs because of how Databricks handles group provisioning via SCIM, especially with external groups from Azure AD. SCIM 404 Error: This generally means Databricks cannot find a matching SCIM identity for the Azure AD grou...
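One way to check whether the Azure AD group was actually provisioned is to query the workspace-level SCIM Groups API. The sketch below uses the public SCIM 2.0 preview endpoint; the host/token environment variables and the group name are assumptions, so treat it as a starting point rather than a verified diagnostic.

```python
# Minimal sketch (assumptions: DATABRICKS_HOST/DATABRICKS_TOKEN env vars are set and the
# workspace-level SCIM preview API is reachable; the group name is a placeholder).
import os
import requests

host = os.environ["DATABRICKS_HOST"]   # e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.0/preview/scim/v2/Groups",
    headers={"Authorization": f"Bearer {token}"},
    params={"filter": 'displayName eq "my-aad-security-group"'},  # hypothetical name
    timeout=30,
)
resp.raise_for_status()
# If totalResults is 0, the Azure AD group was never provisioned into the workspace,
# which would be consistent with a SCIM 404 when the UI tries to resolve it.
print(resp.json().get("totalResults"))
```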
- 2529 Views
- 1 reply
- 0 kudos
Databricks is not mounting with storage account, giving java.lang exception error 480
Hi Everyone, I am currently facing an issue in our Test Environment where Databricks is not able to mount the storage account. We are using the same mount in other environments (Dev, Preprod and Prod) and it works fine there witho...
- 856 Views
- 2 replies
- 0 kudos
JDBC Driver cannot connect when using TokenCachePassPhrase property
Hello all, I'm looking for suggestions on enabling the token cache when using browser-based SSO login. I'm following the instructions found here: Databricks-JDBC-Driver-Install-and-Configuration-Guide. For my users, I would like to enable the token ca...
- 0 kudos
The error encountered (Cannot invoke "java.nio.file.attribute.AclFileAttributeView.setAcl(...)" because "<local6>" is null) likely points to permission or file-system issues where the token cache store is being accessed. When EnableTokenCache=0, the to...
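For readers following the same guide, the properties discussed in this thread typically appear together in the JDBC URL. The sketch below only assembles such a URL as a string; the hostname, HTTP path and passphrase are placeholders, and the property names follow the driver guide referenced in the question rather than a tested configuration.

```python
# Minimal sketch (assumptions: property names follow the Databricks JDBC driver guide
# cited in the question; all values below are placeholders).
server_hostname = "adb-1234567890123456.7.azuredatabricks.net"  # hypothetical host
http_path = "/sql/1.0/warehouses/abcdef1234567890"              # hypothetical HTTP path

jdbc_url = (
    f"jdbc:databricks://{server_hostname}:443;"
    f"httpPath={http_path};"
    "AuthMech=11;Auth_Flow=2;"            # browser-based OAuth SSO
    "EnableTokenCache=1;"                 # keep tokens between sessions
    "TokenCachePassPhrase=<passphrase>"   # passphrase protecting the on-disk cache
)
print(jdbc_url)
```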
- 4624 Views
- 3 replies
- 1 kudos
Databricks Notebook says "Connecting..." for some users
For some users, after clicking on a notebook the screen says "connecting..." and the notebook does not open.The users are using Chrome browser and the same happens with Edge as well.What could be the reason?
- 1 kudos
I am facing the same issue. It always says it is opening the notebook. On the occasions it does open and connects to the cluster, it then times out.
- 2665 Views
- 8 replies
- 1 kudos
Notebook Detached Error: exception when creating execution context: java.net.SocketTimeoutException:
Hello Community, I have been facing this issue since yesterday. After attaching the cluster to a notebook and running a cell, I get the following error in the Community Edition of Databricks: Notebook detached: exception when creating execution cont...
- 1 kudos
Hi All, I am also facing this issue. If anyone knows how to resolve it, please post the solution here.
- 2120 Views
- 7 replies
- 0 kudos
preloaded_docker_images: how do they work?
At my org, when we start a Databricks cluster, it often takes a while to become available (due to (1) instance provisioning, (2) library loading, and (3) init script execution). I'm exploring whether an instance pool could be a viable strategy for im...
- 0 kudos
Hello, when we specify a Docker image with credentials in the instance pool configuration, should we also specify credentials in the cluster configuration, given that the image is already pulled onto the pool instances?
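For context on how the pool and cluster settings relate, the sketch below shows the general shape of an instance-pool payload with preloaded_docker_images next to a cluster spec that references the same image. The field names follow the Instance Pools/Clusters REST APIs as documented, but the registry URL and credentials are placeholders, and this is not a statement about whether the cluster-level basic_auth can be omitted.

```python
# Minimal sketch (assumptions: field names follow the Instance Pools/Clusters REST APIs;
# image URL and registry credentials are placeholders; other required fields omitted).
pool_payload = {
    "instance_pool_name": "docker-pool",
    "node_type_id": "Standard_DS3_v2",
    "preloaded_docker_images": [
        {
            "url": "myregistry.azurecr.io/runtime:latest",
            "basic_auth": {"username": "<user>", "password": "<token>"},
        }
    ],
}

cluster_payload = {
    "cluster_name": "docker-cluster",
    "instance_pool_id": "<pool-id-returned-by-create>",
    "docker_image": {
        "url": "myregistry.azurecr.io/runtime:latest",
        "basic_auth": {"username": "<user>", "password": "<token>"},
    },
}
print(pool_payload, cluster_payload)
```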
- 3375 Views
- 2 replies
- 0 kudos
Is there an automated way to strip notebook outputs prior to pushing to github?
We have a team that works in Azure Databricks on notebooks. We are not allowed to push any data to GitHub per corporate policy. Instead of everyone having to always remember to clear their notebook outputs prior to commit and push, is there a way this ...
- 0 kudos
Hi, since pushing to GitHub isn't allowed but clearing notebook outputs before internal version control is still important, you can automate this by using a pre-commit hook or a script within your internal CI/CD pipeline (if one exists). Tools like...
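The reply is cut off before naming specific tools, but the general pre-commit approach can be sketched without any external dependency: a small script that blanks outputs and execution counts in staged .ipynb files before they are committed. The hook wiring and file layout below are assumptions, not a tested setup.

```python
#!/usr/bin/env python3
# Minimal sketch of a pre-commit script (save e.g. as .git/hooks/pre-commit or call it
# from CI): strips outputs/execution counts from every staged .ipynb file.
import json
import subprocess
import sys

def strip_outputs(path: str) -> None:
    with open(path, encoding="utf-8") as f:
        nb = json.load(f)
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["outputs"] = []
            cell["execution_count"] = None
    with open(path, "w", encoding="utf-8") as f:
        json.dump(nb, f, indent=1, ensure_ascii=False)
        f.write("\n")

if __name__ == "__main__":
    staged = subprocess.run(
        ["git", "diff", "--cached", "--name-only"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    for path in staged:
        if path.endswith(".ipynb"):
            strip_outputs(path)
            subprocess.run(["git", "add", path], check=True)
    sys.exit(0)
```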
- 1364 Views
- 3 replies
- 1 kudos
Replacing Excel with Databricks
I have a client that currently uses a lot of Excel with VBA and advanced calculations. Their source data is often stored in SQL Server. I am trying to make the case to move to Databricks. What's a good way to make that case? What are some advantages t...
- 1 kudos
To add to this, my team and I have been using Databricks in an enterprise environment to replace Excel-based calculations relying on SQL-stored data with calculations served as model serving endpoints (APIs) - the initial 'translation' work can be tedi...
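To show what "calculations served as model serving endpoints" looks like from the consumer side, here is a hedged sketch of invoking a serving endpoint over REST. The endpoint name, input columns and auth handling are placeholders rather than details from the thread.

```python
# Minimal sketch (assumptions: an endpoint named "pricing-calc" exists and accepts these
# input columns; DATABRICKS_HOST/DATABRICKS_TOKEN env vars are set).
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

payload = {"dataframe_records": [{"quantity": 12, "unit_price": 3.5}]}  # hypothetical inputs

resp = requests.post(
    f"{host}/serving-endpoints/pricing-calc/invocations",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # predictions computed by the model that replaced the Excel formula
```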
- 733 Views
- 2 replies
- 1 kudos
Resolved! Search page to search code inside .py files
Hello, hope you are doing well. On the search page, it seems it's not searching for code inside .py files but rather only the filename. Is there an option somewhere I'm missing to be able to search inside .py files? Best, Alan
- 1 kudos
Hello, so it seems Databricks does not allow it - an easy workaround for us is to search directly on our Azure DevOps Repos.
- 940 Views
- 1 reply
- 0 kudos
Community Edition cluster detach: java.util.timeoutexception
Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of a JDBC driver java.util.timeoutexception. Basically the cluster connects and executes for 15 sec or so, which is a socket limit, and disables any...
- 0 kudos
@Kishore23 paturnpike wrote: Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of a JDBC driver java.util.timeoutexception. Basically the cluster connects and executes for 15 sec or so, which is a so...
- 49827 Views
- 11 replies
- 1 kudos
Error: Folder xxxx@xxx.com is protected
Hello, on Azure Databricks I'm trying to remove a folder under the Repos folder using the following command: databricks workspace delete "/Repos/xxx@xx.com". I got the following error message: databricks workspace delete "/Repos/xxxx@xx.com" Error: Folder ...
- 1 kudos
Hello Databricks Forums, When you see the Azure Databricks error message "Folder xxxx@xxx.com is protected," it means that you are attempting to remove a system-protected folder, which is usually connected to a user's workspace, particularly under the...
- 559 Views
- 1 reply
- 0 kudos
dev and prod
"SELECT * FROM' data call on my table in PROD is giving all the rows of data, but a call on my table in DEV is giving me just one row of data. what could be the problem??
- 0 kudos
Tell us more about your environment. Are you using Unity Catalog? What is the table format? What cloud platform are you on? More information is needed.
- 1997 Views
- 3 replies
- 2 kudos
Resolved! Cluster by auto pyspark
I can find documentation to enable automatic liquid clustering with SQL code: CLUSTER BY AUTO. But how do I do this with PySpark? I know I can do it with spark.sql("ALTER TABLE CLUSTER BY AUTO"), but ideally I want to pass it as an .option(). Thanks in...
- 2 kudos
Not at the moment. You have to use the SQL DDL commands, either at table creation or via an ALTER TABLE command. Hope this helps, Louis.
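Since the accepted answer says the SQL DDL is the only route, here is a hedged sketch of wrapping it from PySpark rather than a writer .option(); the catalog/schema/table names are placeholders.

```python
# Minimal sketch (assumptions: the table name is a placeholder and the runtime
# supports automatic liquid clustering).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.range(10).withColumnRenamed("id", "event_id")

# Write the table first, then apply automatic liquid clustering with SQL DDL,
# since there is no DataFrameWriter .option() for CLUSTER BY AUTO.
df.write.mode("overwrite").saveAsTable("main.default.events")
spark.sql("ALTER TABLE main.default.events CLUSTER BY AUTO")
```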