- 1052 Views
- 1 replies
- 1 kudos
When Will Databricks Enable Support for Rust and Go in Notebooks?
#Rust and #GoLang are trending for their efficiency and speed. When can Databricks enthusiasts leverage the power of Rust and Go in Databricks notebooks to create data/ETL pipelines? Any plans at #databricks?
- 1 kudos
Rust is an allowed language at Databricks if you must avoid a JVM process. I can see that the teams are working to provide additional support for Rust which might be available in the near future.
- 684 Views
- 1 replies
- 1 kudos
UC Migration: Mount Points in Unity Catalog
Hi All, in my existing notebooks we have used mount point URLs such as /mnt/, and we have notebooks where we use those URLs to fetch data/files from the container. Now, as we are upgrading to Unity Catalog, these URLs will no longer be supporting a...
- 1 kudos
Unfortunately no, mount points are no longer supported with UC, so you will need to modify the URLs manually in your notebooks.
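For illustration, a minimal sketch of the kind of change involved (the paths, catalog/schema names, and storage account below are placeholders, not from the thread):

```python
# Before (DBFS mount, illustrative path):
df = spark.read.format("csv").load("/mnt/raw/sales/2024/sales.csv")

# After, option 1: read through a Unity Catalog Volume (illustrative path):
df = spark.read.format("csv").load("/Volumes/main/raw/landing/sales/2024/sales.csv")

# After, option 2: read the cloud path directly via an external location
# (illustrative storage account and container names):
df = spark.read.format("csv").load(
    "abfss://raw@mystorageaccount.dfs.core.windows.net/sales/2024/sales.csv"
)
```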
- 452 Views
- 1 replies
- 0 kudos
- 0 kudos
Hi @Thekenal, you can use the following link to first connect to Azure SQL Server from Databricks: https://learn.microsoft.com/en-us/azure/databricks/connect/external-systems/sql-server. Then follow dashboard creation within Databricks: https://docs.databricks...
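A minimal sketch of that first step over JDBC (server, database, table, and credentials are placeholders; the linked Microsoft page also covers other connection options):

```python
# Read a table from Azure SQL Server over JDBC; all connection details
# below are placeholders.
jdbc_url = (
    "jdbc:sqlserver://<server-name>.database.windows.net:1433;"
    "database=<database-name>"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.<table-name>")
    .option("user", "<sql-user>")
    .option("password", "<sql-password>")
    .load()
)

display(df)  # then build the dashboard on top of a saved table or SQL query
```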
- 1544 Views
- 2 replies
- 1 kudos
Looking for HR use cases for Databricks
Human Resources use cases.
- 886 Views
- 3 replies
- 0 kudos
Issue with Validation After DBFS to Volume Migration in Databricks Workspace
Hello Databricks Community, I have successfully migrated my DBFS (Databricks File System) from a source workspace to a target workspace, moving it from a path under Browse DBFS -> Folders to a Catalog -> Schema -> Volume. Now, I want to validate the migra...
- 0 kudos
Hi @Sudheer2, thanks for your comments. You can try using the %sh magic to list the folder and sub-directories using Unix-like commands, for example:
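The example itself was cut off in this excerpt; as a sketch of the same validation (a Python alternative to the suggested %sh cell, with a placeholder Volume path):

```python
# Walk the migrated Volume and print every file with its size, so the
# listing can be compared against the source workspace. The path below
# is a placeholder.
def list_recursive(path):
    """Recursively print every file under a DBFS/Volume path with its size."""
    for entry in dbutils.fs.ls(path):
        if entry.isDir():
            list_recursive(entry.path)
        else:
            print(entry.path, entry.size)

list_recursive("/Volumes/<catalog>/<schema>/<volume>/")
```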
- 5906 Views
- 5 replies
- 2 kudos
Cluster compute metrics
I want to fetch compute metrics (hardware, GPU, and Spark) and use them in a dashboard on Databricks; however, I'm not able to fetch them. I have tried GET API requests and system tables. The system tables only have CPU utilization and memory utili...
- 2 kudos
How can we store the CPU and memory metrics for GCP Databricks centrally, set up some alerts in case the usage is high, and monitor the performance?
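One hedged sketch of that idea, assuming the system.compute.node_timeline system table is enabled for the account (column names can vary by release); the result could be scheduled on a job and paired with a Databricks SQL alert:

```python
# Hourly average CPU and memory per cluster over the last day, pulled from
# the compute system tables (table and column names assume the current
# system.compute.node_timeline schema).
usage = spark.sql("""
    SELECT
        cluster_id,
        date_trunc('HOUR', start_time)              AS hour,
        avg(cpu_user_percent + cpu_system_percent)  AS avg_cpu_percent,
        avg(mem_used_percent)                       AS avg_mem_percent
    FROM system.compute.node_timeline
    WHERE start_time >= current_timestamp() - INTERVAL 1 DAY
    GROUP BY cluster_id, date_trunc('HOUR', start_time)
""")

# Persist centrally so dashboards and SQL alerts can read it;
# the target table name is a placeholder.
usage.write.mode("append").saveAsTable("monitoring.compute_usage_hourly")
```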
- 2147 Views
- 3 replies
- 1 kudos
Resolved! Constantly Running Interactive Clusters Best Practices
Hello there, I've been creating an ETL/ELT pipeline with Azure Databricks Workflows, Spark, and Azure Data Lake. It should process changes in near real time (a Change Data Capture process) from an Azure SQL Database. For that purpose, I will have sev...
- 1 kudos
No problem! Let me know if you have any other questions.
- 1000 Views
- 3 replies
- 0 kudos
Dashboard sharing in Databricks with Unity Catalog enabled
Hello, I am planning to deploy a workspace with Unity Catalog enabled. Managing permissions in one place sounds like a good solution. It can even simplify dataset architecture by masking rows and columns. As an architect, I'm concerned about the user'...
- 0 kudos
I would suggest submitting a feature request for this through https://docs.databricks.com/en/resources/ideas.html#ideas
- 931 Views
- 1 replies
- 0 kudos
Clean Room Per Collaborator Pricing
For Clean Room pricing, it is mentioned as $50/collaborator per day. 1) If I keep the clean room for maybe 5 hrs and delete it later, will I get charged for those 5 hrs ($10.41)/collaborator, or will I be charged $50 regardless of the active hours of the clean room fo...
- 0 kudos
Hi @RohithChippa, thanks for your questions! Please find my feedback below: 1) If you keep the clean room for 5 hours and delete it later, you will be charged $50 per collaborator for that day, regardless of the number of active hours the clean room was...
- 1121 Views
- 1 replies
- 0 kudos
Databricks Community Edition Not working
Hello, is the Databricks Community Edition working? I am getting an error every time: after I enter my email ID for login and enter the passcode, it loads the Sign-In page again. I have tried multiple browsers as well. Can anyone help giv...
- 0 kudos
Hello @AshishMehta, have you tried clearing your cookies and cache? Also, can you please try resetting your password?
- 1295 Views
- 0 replies
- 1 kudos
Turn Your Dataframes into an Interactive Tableau-Styled Drag-and-Drop UI for Visual Analysis
You can create Tableau-styled charts without leaving your notebook with just a few lines of code. Imagine this: you're working within a Databricks notebook, trying to explore your Spark/pandas DataFrame, but visualizing the data or performing explorator...
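The excerpt does not name the library; pygwalker is one example of such a drag-and-drop explorer, so treat this sketch as an assumption rather than the post's exact code:

```python
# A sketch using pygwalker, one library that provides a Tableau-style
# drag-and-drop UI for DataFrames inside a notebook.
import pygwalker as pyg

# pygwalker works on pandas DataFrames; convert a Spark DataFrame first.
# Any small DataFrame works; samples.nyctaxi.trips is just an example table.
pdf = spark.table("samples.nyctaxi.trips").limit(10_000).toPandas()

# Renders an interactive drag-and-drop explorer inline in the notebook.
pyg.walk(pdf)
```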
- 2589 Views
- 6 replies
- 0 kudos
Migrating Service Principals from Non-Unity to Unity-Enabled Databricks Workspaces - Entitlements Mi
Hello Databricks Community, I am currently in the process of migrating Service Principals from a non-Unity workspace to a Unity-enabled workspace in Databricks. While the Service Principals themselves seem to be migrating correctly, I am facing an is...
- 0 kudos
Do you have the option to open a support ticket? If not, I would suggest running additional code that disables the entitlement for the new objects, as it seems the entitlement is not being properly passed in the original call.
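A rough sketch of what that extra step could look like via the workspace SCIM API (host, token, service principal id, and the entitlement name below are placeholders and assumptions, not the thread's actual script):

```python
# Remove an unwanted entitlement from a migrated service principal
# through the SCIM API. All identifiers below are placeholders.
import requests

host = "https://<target-workspace>.azuredatabricks.net"
token = "<personal-access-token>"
sp_id = "<scim-id-of-service-principal>"

resp = requests.patch(
    f"{host}/api/2.0/preview/scim/v2/ServicePrincipals/{sp_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {
                "op": "remove",
                "path": "entitlements",
                # Example: drop cluster-creation rights carried over by the migration.
                "value": [{"value": "allow-cluster-create"}],
            }
        ],
    },
)
resp.raise_for_status()
```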
- 1338 Views
- 0 replies
- 3 kudos
How to make your Service Principal or Group a Workspace Admin
Do you require a Service Principal or a Group to have admin rights to allow automation, or to reduce the effort of adding the permission to each user? Solution for Service Principals: you need to be at least a Workspace Admin. You can eit...
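The rest of the steps are truncated in this excerpt; a minimal sketch of one route (host, token, and ids are placeholders) that adds a service principal to the built-in workspace admins group through the SCIM API:

```python
# Add a service principal to the workspace "admins" group.
# Host, token, and the SCIM ids below are placeholders.
import requests

host = "https://<workspace>.azuredatabricks.net"
headers = {"Authorization": "Bearer <personal-access-token>"}

# 1) Look up the id of the built-in "admins" group.
groups = requests.get(
    f"{host}/api/2.0/preview/scim/v2/Groups",
    headers=headers,
    params={"filter": 'displayName eq "admins"'},
).json()
admins_id = groups["Resources"][0]["id"]

# 2) Add the service principal (by its SCIM id) as a member.
requests.patch(
    f"{host}/api/2.0/preview/scim/v2/Groups/{admins_id}",
    headers=headers,
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "add", "value": {"members": [{"value": "<sp-scim-id>"}]}}
        ],
    },
).raise_for_status()
```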
- 954 Views
- 3 replies
- 0 kudos
Unable to install UCX
Executing databricks labs install ucx via the Databricks CLI and running into 'WorkspaceInstaller is not supposed to be executed in databricks runtime.' How do I resolve it?
- 0 kudos
The Databricks CLI is designed to work in your local terminal and connect to your workspace. The error is saying that the command is not allowed to run on a Databricks cluster; as you are using the web terminal of a cluster in your workspace, it w...
- 1436 Views
- 3 replies
- 2 kudos
Problems registering models via /api/2.0/mlflow/model-versions/create
Initially, I tried registering while logging a model from MLflow and got: "Got an invalid source 'dbfs:/Volumes/compute_integration_tests/default/compute-external-table-tests-models/test_artifacts/1078d04b4d4b4537bdf7a4b5e94e9e7f/artifacts/model'. Onl...
- 2 kudos
I know, right! Look at the error message I posted: "Got an invalid source 'dbfs:/Volumes/compute_integration_tests/default/compute-external-table-tests-models/test_artifacts/1078d04b4d4b4537bdf7a4b5e94e9e7f/artifacts/model'. Only DBFS locations are cu...
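For context, a bare-bones sketch of calling the endpoint from the thread title directly (host, token, model name, run id, and the source path are placeholders):

```python
# Minimal call to the REST endpoint being discussed; all identifiers
# below are placeholders.
import requests

host = "https://<workspace>.azuredatabricks.net"
token = "<personal-access-token>"

resp = requests.post(
    f"{host}/api/2.0/mlflow/model-versions/create",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "name": "<registered-model-name>",
        "run_id": "<mlflow-run-id>",
        # The error above suggests the validator only accepts dbfs:/ sources,
        # even when the path actually points at a UC Volume.
        "source": "dbfs:/<path-to-logged-model-artifacts>",
    },
)
print(resp.status_code, resp.json())
```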