- 1765 Views
- 3 replies
- 2 kudos
PERMISSION_DENIED: Cannot access Spark Connect when trying to run serverless Databricks Connect
I am not able to run a file as "run as workflow" or "run with databricks connect" when I choose a serverless run on my paid account. However, I can perform this action in my Free Edition account. See error: pyspark.errors.exceptions.connect.SparkCon...
Hi @ivan7256, this might be because serverless compute isn't enabled for workflows in your paid workspace.
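For readers hitting the same error, a minimal sketch of opening a serverless session with Databricks Connect, assuming databricks-connect 15.1 or newer (where the serverless() builder option exists), serverless compute enabled in the workspace, and default authentication from a configuration profile or environment variables:

```python
# Minimal sketch: request a serverless Spark Connect session via Databricks Connect.
# Assumes databricks-connect >= 15.1 and that serverless compute is enabled for the
# workspace; authentication comes from the default profile or environment variables.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.serverless(True).getOrCreate()

# Smoke test: a PERMISSION_DENIED / "Cannot access Spark Connect" error here usually
# means serverless compute is not enabled (or not entitled) for the workspace.
spark.range(5).show()
```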
- 1036 Views
- 3 replies
- 5 kudos
Databricks Free Edition Needs Transparency About Data Access
When I first discovered the Databricks Free Edition, I thought it was a generous offering for data enthusiasts, researchers, and developers who just needed a personal sandbox. No cost. Easy setup. Promises of productivity. But what caught me off guar...
Thanks again for all the perspectives shared so far. I want to re-emphasize that the Databricks Free Edition offers real value. For data enthusiasts, learners, and builders, it’s a genuinely powerful environment to get hands-on without jumping throug...
- 5294 Views
- 3 replies
- 8 kudos
Resolved! RSS feeds for Databricks releases
Hi, are there any RSS feeds for the Databricks platform, SQL, and runtime releases? We have a big tech stack, so it is sometimes hard to keep up with the ever-changing technologies. We are using RSS feeds to keep up with all of that. Can't find anything for...
Databricks recently published an RSS feed for all their updates. As far as I can find, it is only for AWS at the moment: https://docs.databricks.com/aws/en/feed.xml
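To illustrate consuming that feed, a small sketch using the third-party feedparser package (an assumption, not Databricks tooling), pointed at the URL from the reply above:

```python
# Sketch: poll the Databricks (AWS) release-notes RSS feed and print recent entries.
# Assumes the third-party "feedparser" package is installed: pip install feedparser
import feedparser

FEED_URL = "https://docs.databricks.com/aws/en/feed.xml"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:10]:
    # Standard RSS/Atom entries expose at least a title and a link.
    print(entry.get("title", "(no title)"), "-", entry.get("link", ""))
```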
- 86211 Views
- 7 replies
- 7 kudos
Resolved! How to create temporary table in databricks
Hi Team, I have a requirement where I need to create a temporary table, not a temporary view. Can you tell me how to create a temporary table in Databricks?
I see, thanks for sharing. @abueno, can you mark the solution that worked for you as Accepted?
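The accepted answer isn't quoted in this digest; as a hedged sketch of the common workaround, a session-scoped temporary view created from a DataFrame behaves like a temporary table for the lifetime of the Spark session (the table and view names below are hypothetical):

```python
# Sketch: approximate a "temporary table" with a session-scoped temporary view.
# The source table and view names are hypothetical placeholders.
from pyspark.sql import functions as F

src = spark.table("main.sales.orders")

tmp = (
    src.where(F.col("order_date") >= "2024-01-01")
       .select("order_id", "customer_id", "amount")
)

# Registers a view that exists only for the current Spark session.
tmp.createOrReplaceTempView("orders_tmp")

# Query it with SQL like a table; it disappears when the session ends.
spark.sql(
    "SELECT customer_id, SUM(amount) AS total FROM orders_tmp GROUP BY customer_id"
).show()
```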
- 563 Views
- 1 reply
- 0 kudos
CLI: Export-dir provides LatestClone
Hi everyone, I want to download the current Databricks codebase out of a workspace and tried via databricks workspace export-dir /Sandbox/foo . Surprisingly, some of the subfolders appear twice in the export target: one with the expected name (`...
Hi @holunder, this could be because the backend stores both the original and cloned versions of folders, even if only one appears in the web UI. The Databricks CLI exports everything from the backend, not just what's visible in the UI.
- 4102 Views
- 5 replies
- 1 kudos
"PutWithBucketOwnerFullControl" privilege missing for storage configuration
Hi. I've been unable to create workspaces manually for a while now. The error I get is "MALFORMED_REQUEST: Failed storage configuration validation checks: List,Put,PutWithBucketOwnerFullControl,Delete". The storage configuration is on a bucket that ...
I faced the same issue because I had created the bucket in the wrong region.
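A quick way to double-check the bucket's region before retrying the storage configuration, assuming boto3 is available and using a hypothetical bucket name:

```python
# Sketch: verify an S3 bucket's region before using it in a Databricks storage
# configuration. Assumes boto3 is installed and AWS credentials are configured;
# the bucket name is a hypothetical placeholder.
import boto3

s3 = boto3.client("s3")
resp = s3.get_bucket_location(Bucket="my-databricks-root-bucket")

# get_bucket_location returns None in LocationConstraint for us-east-1.
region = resp.get("LocationConstraint") or "us-east-1"
print(f"bucket region: {region}")
```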
- 11686 Views
- 9 replies
- 3 kudos
Stored Procedure creation in Unity Catalog
Does Databricks support the direct creation of stored procedures using the CREATE PROCEDURE syntax?
Hi @sridharplv, thanks for the info, it worked.
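The working syntax isn't quoted in the thread. As a heavily hedged sketch of what CREATE PROCEDURE in Unity Catalog can look like on recent runtimes (treat the exact syntax and the required runtime version as assumptions and check the Databricks SQL reference; all object names are hypothetical):

```python
# Sketch only: the exact CREATE PROCEDURE syntax and runtime requirements are
# assumptions; verify against the current Databricks SQL reference.
# Catalog, schema, table, and procedure names are hypothetical.
spark.sql("""
CREATE OR REPLACE PROCEDURE main.demo.insert_customer(p_name STRING)
LANGUAGE SQL
AS BEGIN
  INSERT INTO main.demo.customers (name) VALUES (p_name);
END
""")

# CALL is the standard SQL way to invoke a stored procedure.
spark.sql("CALL main.demo.insert_customer('Acme Ltd')")
```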
- 9471 Views
- 12 replies
- 35 kudos
Are you part of a Databricks user group in your city/region?
Joining a regional user group is a great way to connect with data and AI professionals near you. These groups bring together practitioners to learn, network, and share real-world experiences — all within your local context. To join a group: Select y...
@Rishabh_Tiwari appreciate this initiative! I think user groups are the best way to bring together the community as well as to learn, share and grow. I would like to start a local user group in my city since there is none already. Could you please gu...
- 537 Views
- 1 reply
- 0 kudos
Prakash Hinduja Geneva (Swiss) fix access denied issues when using DBFS in Databricks Community?
Hello Community, I’m Prakash Hinduja, a financial strategist residing in Geneva, Switzerland (Swiss). My primary focus is on supporting Swiss businesses by crafting tailored financial strategies. These strategies attract global investments and foster...
Hello @prakashhinduja, the DBFS file limit in Community Edition is 10 GB. Are you trying to upload more than 10 GB?
- 2333 Views
- 11 replies
- 4 kudos
Resolved! How to create a widget in SQL with variables?
I want to create a widget in SQL and use it in R later. Below is my code: %sql declare or replace date1 date = "2025-01-31"; declare or replace date2 date; set var date2 = add_months(date1, 5); What's the correct syntax of using date2 to create a widget? I ...
Hi @zc, unfortunately I think in the case of SQL widgets the default value needs to be a string literal, so the above approach won't work. Regarding your second question about accessing variables declared in SQL from an R cell, you cannot do such a thing. Here's an e...
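One hedged workaround (widget and variable names are hypothetical): compute the derived date in a Python cell and expose it through a notebook widget, since widget defaults must be literal strings; later cells, including R, can read the value back with dbutils.widgets.get:

```python
# Sketch: compute the derived date in Python, then expose it via a notebook widget
# because SQL widget defaults must be string literals. The widget name is hypothetical.
import datetime
from dateutil.relativedelta import relativedelta  # python-dateutil, assumed preinstalled

date1 = datetime.date(2025, 1, 31)
date2 = date1 + relativedelta(months=5)

# Create (or overwrite) a text widget whose default is the computed date as a string.
dbutils.widgets.text("date2", date2.isoformat())

# Any later cell (Python, SQL, or R) can read the widget value back as a string.
print(dbutils.widgets.get("date2"))
```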
- 1620 Views
- 4 replies
- 12 kudos
Resolved! Want to See More Resolved Posts? Try This Simple Step
In community discussion, it's common to see a question get a helpful reply, but then the conversation stalls. There is no follow-up, no marked solution, and no real closure. A small but effective way to keep things moving is to @mention the person you're...
Thanks, @TheOC — appreciate you joining the effort!
- 1043 Views
- 5 replies
- 0 kudos
Is Model Registry possible in Databricks Free Edition?
Has anyone registered a model in Databricks Free Edition? It seems it is not working or not available in the Free Edition.
Hi @xyz999, have you tried upgrading to models in Unity Catalog? `import mlflow; mlflow.set_registry_uri("databricks-uc")`
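Expanding on that reply, a hedged sketch of logging and registering a model in the Unity Catalog registry, assuming scikit-learn is available and the workspace exposes Unity Catalog; the three-level model name is a hypothetical placeholder:

```python
# Sketch: register a model in the Unity Catalog model registry via MLflow.
# Assumes scikit-learn is available and the workspace has Unity Catalog;
# the catalog.schema.model name below is a hypothetical placeholder.
import mlflow
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

mlflow.set_registry_uri("databricks-uc")  # point MLflow at the UC registry

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        input_example=X[:5],  # lets MLflow infer the signature UC registration needs
        registered_model_name="main.default.iris_classifier",
    )
```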
- 1201 Views
- 2 replies
- 1 kudos
Resolved! Old unwanted accounts
I am having trouble removing several accounts that were used for trial purposes, both AWS and Azure. One of my AWS logins still works; however, when I attempt to manage the account and cancel my "Enterprise Plan" via the Manage Account section, I enco...
Hello @Thayal! You can refer to this post for detailed steps on deleting a Databricks account: https://community.databricks.com/t5/administration-architecture/delete-databricks-account/td-p/87187 Regarding the error about canceling via the account co...
- 2339 Views
- 3 replies
- 0 kudos
Resolved! Databricks Community Edition is unable to sign in
I created a Free Edition of Databricks, but when I try to log in to the Community Edition with the same email as the Free Edition, it throws an error like this: "User is not a member of this workspace." Please solve the issue.
Hello @diwakarnayudu! The Community Edition and the Free Edition are separate environments. Since you’ve created an account for the Free Edition, you can access that environment by logging in here: https://www.databricks.com/learn/free-edition.
- 527 Views
- 0 replies
- 0 kudos
ML-based profiling of data skew and bottlenecks on Databricks
Automatically detect skew and pipeline issues using ML profiling. Data skew is a persistent performance issue in distributed data pipelines. On platforms like Databricks, skewed partitions can quietly degrade performance, inflate compute costs, and d...
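The post has no replies in this digest; as a simpler baseline than the ML profiling it describes, per-partition row counts already surface skew directly in Spark (the table name is a hypothetical placeholder):

```python
# Baseline sketch (not the ML profiling approach the post describes): measure
# rows per Spark partition to spot skew. The table name is a hypothetical placeholder.
from pyspark.sql import functions as F

df = spark.table("main.sales.orders")

per_partition = (
    df.withColumn("partition_id", F.spark_partition_id())
      .groupBy("partition_id")
      .count()
)

stats = per_partition.agg(
    F.max("count").alias("max_rows"),
    F.avg("count").alias("avg_rows"),
).first()

# A max/avg ratio well above 2-3x usually signals skew worth investigating.
print(f"partition skew ratio: {stats.max_rows / stats.avg_rows:.1f}")
```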