- 1813 Views
- 1 replies
- 1 kudos
UC migration: Mount Points in Unity Catalog
Hi All, in my existing notebooks we have used mount point URLs such as /mnt/, and we have notebooks where we have used the above URL to fetch the data/file from the container. Now as we are upgrading to Unity Catalog these URLs will no longer be supporting a...
Unfortunately no, mount points are no longer supported with UC, so you will need to update the URLs manually in your notebooks.
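In practice that usually means swapping the mount path for a Unity Catalog Volume path (or a cloud URI on an external location). A minimal sketch, with hypothetical paths:

```python
# Sketch: point reads at the UC Volume (or external location URI)
# instead of the old /mnt/ mount point. Paths below are hypothetical examples.
old_path = "/mnt/raw/sales/2024/"                   # legacy mount point
new_path = "/Volumes/main/raw/landing/sales/2024/"  # equivalent UC Volume path

df = spark.read.option("header", True).csv(new_path)
```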
- 503 Views
- 1 replies
- 0 kudos
Hi @Thekenal, you can follow this link to first connect to Azure SQL Server from Databricks: https://learn.microsoft.com/en-us/azure/databricks/connect/external-systems/sql-server. Then follow dashboard creation within Databricks: https://docs.databricks...
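As a rough sketch of the first step (the server, database, table, and secret names below are hypothetical), you could read an Azure SQL table into a DataFrame with Spark's generic JDBC reader and persist it as a table the dashboard can query:

```python
# Sketch: read an Azure SQL table over JDBC (all names below are hypothetical),
# then persist it so a Databricks dashboard can query it.
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.sales")
    .option("user", dbutils.secrets.get("my_scope", "sql-user"))
    .option("password", dbutils.secrets.get("my_scope", "sql-password"))
    .load()
)
df.write.mode("overwrite").saveAsTable("main.analytics.sales_raw")
```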
- 1761 Views
- 2 replies
- 1 kudos
Looking for HR use cases for Databricks
Human Resources use cases.
- 978 Views
- 3 replies
- 0 kudos
Issue with Validation After DBFS to Volume Migration in Databricks Workspace
Hello Databricks Community, I have successfully migrated my DBFS (Databricks File System) from a source workspace to a target workspace, moving it from a path in Browse DBFS -> Folders to a Catalog -> Schema -> Volume. Now, I want to validate the migra...
Hi @Sudheer2, thanks for your comments. You can try using the %sh magic to list the folder and sub-directories with Unix-like commands, for example:
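(The example cell itself is cut off in the excerpt; the reply's suggestion would be a %sh cell such as ls -R against the Volume path. A comparable validation check from Python with dbutils, using a hypothetical Volume path, could look like this:)

```python
# Sketch: recursively count files and bytes under a path, so the Volume in the
# target workspace can be compared against the original DBFS folder.
# The Volume path below is hypothetical.
def count_files(path):
    n_files, n_bytes = 0, 0
    for f in dbutils.fs.ls(path):
        if f.name.endswith("/"):          # sub-directory: recurse into it
            sub_files, sub_bytes = count_files(f.path)
            n_files, n_bytes = n_files + sub_files, n_bytes + sub_bytes
        else:                             # regular file
            n_files, n_bytes = n_files + 1, n_bytes + f.size
    return n_files, n_bytes

print(count_files("/Volumes/main/migrated/dbfs_copy/"))
```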
- 6199 Views
- 5 replies
- 2 kudos
Cluster compute metrics
I want to fetch compute metrics (hardware, GPU, and Spark) and use them in a dashboard on Databricks; however, I'm not able to fetch them. I have tried GET API requests and system tables. The system tables only have CPU utilization and memory utili...
How can we store the CPU and memory metrics for GCP Databricks centrally, set up alerts in case usage is high, and monitor the performance?
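One option, assuming the system.compute.node_timeline system table is enabled in your account (it records per-node hardware metrics), is to query it on a schedule and attach a SQL alert to the result. A rough sketch; the 80% thresholds are arbitrary:

```python
# Sketch: find cluster-hours in the last day with high average CPU or memory use.
# Assumes access to the system.compute.node_timeline system table.
high_usage = spark.sql("""
    SELECT
        cluster_id,
        date_trunc('HOUR', start_time)              AS hour,
        avg(cpu_user_percent + cpu_system_percent)  AS avg_cpu_pct,
        avg(mem_used_percent)                       AS avg_mem_pct
    FROM system.compute.node_timeline
    WHERE start_time >= current_timestamp() - INTERVAL 1 DAY
    GROUP BY cluster_id, date_trunc('HOUR', start_time)
    HAVING avg(cpu_user_percent + cpu_system_percent) > 80
        OR avg(mem_used_percent) > 80
""")
display(high_usage)
```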
- 2472 Views
- 3 replies
- 1 kudos
Resolved! Constantly Running Interactive Clusters Best Practices
Hello there, I’ve been creating an ETL/ELT pipeline with Azure Databricks Workflows, Spark, and Azure Data Lake. It should process changes in near real time (a Change Data Capture process) from an Azure SQL Database. For that purpose, I will have sev...
No problem! Let me know if you have any other questions.
- 1141 Views
- 3 replies
- 0 kudos
Dashboard sharing in Databricks with Unity Catalog enabled
Hello, I am planning to deploy a workspace with Unity Catalog enabled. Deploying permissions in one place sounds like a good solution. It can even simplify dataset architecture by masking rows and columns. As an architect, I’m concerned about the user’...
I would suggest submitting a feature request for this through https://docs.databricks.com/en/resources/ideas.html#ideas
- 1063 Views
- 1 replies
- 0 kudos
Clean Room Per Collaborator Pricing
For Clean Room pricing it is mentioned as $50/collaborator per day. 1) If I keep the clean room for maybe 5 hrs and delete it later, will I get charged for those 5 hrs ($10.41)/collaborator, or will I be charged $50 regardless of the active hrs of the clean room fo...
Hi @RohithChippa, Thanks for your questions! Please find below feedback: 1) If you keep the clean room for 5 hours and delete it later, you will be charged $50 per collaborator for that day, regardless of the number of active hours the clean room was...
- 1204 Views
- 1 replies
- 0 kudos
Databricks Community Edition Not working
Hello, is the Databricks Community Edition working? I am getting an error every time: after I enter my mail id for login and enter the pass-code, it loads the Sign-In page again. I have tried multiple browsers as well. Can anyone help giv...
Hello @AshishMehta, have you tried clearing cookies and cache? Also, can you please try resetting your password?
- 1508 Views
- 0 replies
- 1 kudos
Turn Your Dataframes into an Interactive Tableau-Styled Drag-and-Drop UI for Visual Analysis
You can create Tableau-styled charts without leaving your notebook with just a few lines of code. Imagine this: you’re working within a Databricks notebook, trying to explore your Spark/Pandas DataFrame, but visualizing the data or performing Explorator...
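The excerpt does not show the code or name the library; one open-source package that provides this kind of drag-and-drop explorer is pygwalker, so a minimal sketch (assuming that is the tool meant) would be:

```python
# Sketch using pygwalker (install first, e.g. %pip install pygwalker).
# For a Spark DataFrame, convert with .toPandas() before handing it over.
import pandas as pd
import pygwalker as pyg

df = pd.DataFrame({"region": ["EMEA", "AMER", "APAC"], "revenue": [120, 340, 210]})
pyg.walk(df)  # renders the Tableau-style drag-and-drop UI inline in the notebook
```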
- 2914 Views
- 6 replies
- 0 kudos
Migrating Service Principals from Non-Unity to Unity-Enabled Databricks Workspaces - Entitlements Mi
Hello Databricks Community, I am currently in the process of migrating Service Principals from a non-Unity workspace to a Unity-enabled workspace in Databricks. While the Service Principals themselves seem to be migrating correctly, I am facing an is...
Do you have the option to open a support ticket? If not, I would suggest running additional code that disables the entitlement for the new objects, as it seems the entitlement is not being properly passed in the original call.
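A rough sketch of what that cleanup could look like against the workspace SCIM API; the host, token, service-principal id, and entitlement name below are all hypothetical, and the exact patch payload should be verified against the SCIM API documentation:

```python
import requests

HOST = "https://<workspace-url>"     # hypothetical workspace URL
TOKEN = "<pat-token>"                # hypothetical workspace-admin token
SP_ID = "1234567890"                 # SCIM id of the migrated service principal

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/ServicePrincipals/{SP_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            # remove an entitlement that should not have been carried over
            {"op": "remove", "path": 'entitlements[value eq "allow-cluster-create"]'},
        ],
    },
)
resp.raise_for_status()
```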
- 1695 Views
- 0 replies
- 3 kudos
How to make your Service Principal or Group Workspace admin?
Do you require a Service Principal or a Group to have admin rights, to allow automation or to reduce the effort of adding the permission to each user? Solution for Service Principals: You need to be at least a Workspace Admin. You can eit...
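The full steps are cut off above; as a hedged sketch of the API route for a Service Principal (the workspace-local admins group exists by default, but the host, token, and ids below are hypothetical):

```python
import requests

HOST = "https://<workspace-url>"                     # hypothetical
HEADERS = {"Authorization": "Bearer <admin-token>"}  # hypothetical workspace-admin token

# 1) Look up the workspace-local "admins" group via the SCIM API.
groups = requests.get(
    f"{HOST}/api/2.0/preview/scim/v2/Groups",
    headers=HEADERS,
    params={"filter": 'displayName eq "admins"'},
).json()
admins_id = groups["Resources"][0]["id"]

# 2) Add the service principal (by its SCIM id) as a member of that group.
requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Groups/{admins_id}",
    headers=HEADERS,
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "add", "value": {"members": [{"value": "<sp-scim-id>"}]}}],
    },
).raise_for_status()
```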
- 1082 Views
- 3 replies
- 0 kudos
Unable to install UCX
Executing databricks labs install ucx via the Databricks CLI and running into 'WorkspaceInstaller is not supposed to be executed in databricks runtime.' How do I resolve it?
The Databricks CLI is designed to work in your local terminal and connect to your workspace. The error indicates that the command is not allowed to run on a Databricks cluster; since you are using the web terminal of a cluster in your workspace, it w...
- 1671 Views
- 3 replies
- 2 kudos
Problems registering models via /api/2.0/mlflow/model-versions/create
Initially, I tried registering while logging a model from MLflow and got: "Got an invalid source 'dbfs:/Volumes/compute_integration_tests/default/compute-external-table-tests-models/test_artifacts/1078d04b4d4b4537bdf7a4b5e94e9e7f/artifacts/model'. Onl...
I know, right! Look at the error message I posted: "Got an invalid source 'dbfs:/Volumes/compute_integration_tests/default/compute-external-table-tests-models/test_artifacts/1078d04b4d4b4537bdf7a4b5e94e9e7f/artifacts/model'. Only DBFS locations are cu...
- 2286 Views
- 2 replies
- 2 kudos
Azure Databricks to GCP Databricks Migration
Hi Team, Can you provide your thoughts on moving Databricks from Azure to GCP? What services are required for the migration, and are there any limitations on GCP compared to Azure? Also, are there any tools that can assist with the migration? Please ...
Hello Team, adding to @sunnydata's comments: moving Databricks from Azure to GCP involves several steps and considerations. Here are the key points based on the provided context: Services required for migration: Cloud Storage Data: Use GCP’s Storage T...