- 726 Views
- 3 replies
- 1 kudos
Resolved! Not able to create Databricks Compute in Central US
I created my Databricks account in Central US and I am not able to create compute, so I need help creating compute.
If everything looks fine, opening a support ticket with Databricks or your cloud provider (Azure/AWS) would be the quickest way to resolve this and get your compute set up.
- 977 Views
- 2 replies
- 2 kudos
Resolved! Databricks Claude Access Error - Permission Denied
I'm using databricks-claude-sonnet-3.7 through Azure Databricks, and it was working until yesterday, but when I accessed it now, I got this error: Error: 403 {"error_code":"PERMISSION_DENIED","message":"PERMISSION_DENIED: Endpoint databricks-claude-s...
@szymon_dybczak Thank you! I'll wait a couple days and try again. Much appreciated!
- 1583 Views
- 2 replies
- 1 kudos
Pandas API on Spark creates huge query plans
Hello, I have a piece of code written in PySpark and the Pandas API on Spark. On comparing the query plans, I see the Pandas API on Spark creates huge query plans whereas the PySpark plan is a tiny one. Furthermore, with the Pandas API on Spark, we see a lot of incon...
@FRB1984 could you provide some examples? I'm curious. My first thoughts would be around the shuffling. Check this out: https://spark.apache.org/docs/3.5.4/api/python/user_guide/pandas_on_spark/best_practices.html. There's an argument to be made abo...
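To see the gap the reply is pointing at, a minimal sketch that prints both plans side by side; it assumes a Databricks notebook (or any Spark 3.2+ session) and uses only throwaway example data:

```python
import pyspark.pandas as ps
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined in a Databricks notebook

# Plain PySpark: a filter plus an aggregation keeps the physical plan short.
sdf = spark.range(1_000_000).withColumnRenamed("id", "value")
sdf_out = sdf.filter("value % 2 = 0").groupBy((sdf.value % 10).alias("bucket")).count()
sdf_out.explain()

# Pandas API on Spark: index handling and ordering guarantees can add extra
# projections and exchanges, which is what inflates the plan.
psdf = sdf.pandas_api()
psdf_out = psdf[psdf["value"] % 2 == 0]
psdf_out.spark.explain()

# The linked best-practices page discusses cheaper default indexes (and
# checkpointing long pandas-style chains) to keep these plans manageable.
ps.set_option("compute.default_index_type", "distributed")
```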
- 456 Views
- 1 reply
- 1 kudos
Resolved! Table Counts
Hello, my company loads a lot of tables into a Databricks schema. I would like to build a dashboard on what has been loaded, but SQL commands like select * from information_schema do not work. Instead we have SHOW TABLES {FROM} LIKE {}; And that fails...
Just trying to rule out some of the lower-hanging stuff. When you run your SQL statements, i.e. select * from information_schema, are you using the correct namespace syntax, i.e. {catalog_here}.information_schema? Are you using Unity Catalog? Example of th...
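To make the namespace point concrete, a minimal sketch for counting what has been loaded; the catalog and schema names are placeholders, and it assumes a Unity Catalog-enabled workspace:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

catalog = "my_catalog"   # placeholder: your actual catalog
schema = "my_schema"     # placeholder

# In Unity Catalog, information_schema lives inside each catalog, so it has to
# be qualified with the catalog name (or reached via system.information_schema).
spark.sql(f"""
    SELECT table_schema, COUNT(*) AS table_count
    FROM {catalog}.information_schema.tables
    GROUP BY table_schema
    ORDER BY table_count DESC
""").show()

# The SHOW TABLES form also works once the schema is fully qualified:
spark.sql(f"SHOW TABLES IN {catalog}.{schema}").show()
```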
- 458 Views
- 3 replies
- 0 kudos
What permissions are needed to fix [INSUFFICIENT_PERMISSIONS] User does not have permission to SELECT
Hi, I am getting the following error in Databricks when running a SELECT query: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any file. SQLSTATE: 42501. Context: Environment: Unity Catalog enabled. I am tryin...
If you’re getting an “Insufficient Permissions” error in Databricks, it usually means your user is missing one or more privileges required for the action you’re trying to perform. In Unity Catalog, for example, querying a view in dedicated compute mo...
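To make the privilege chain concrete, a hedged sketch of the grants that typically resolve this; the principal and object names are placeholders, and the ANY FILE grant only applies to legacy (non-Unity Catalog) table ACLs:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

principal = "`user@example.com`"                                 # placeholder user or group
catalog, schema, table = "my_catalog", "my_schema", "my_table"   # placeholders

# Unity Catalog: SELECT alone is not enough; the principal also needs
# USE CATALOG and USE SCHEMA on the parents of the table it queries.
spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO {principal}")
spark.sql(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO {principal}")
spark.sql(f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO {principal}")

# The specific wording "permission SELECT on any file" usually points at legacy
# table ACLs / direct file access rather than Unity Catalog; there, an admin
# on a table-ACL cluster would instead run:
# spark.sql(f"GRANT SELECT ON ANY FILE TO {principal}")
```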
- 767 Views
- 4 replies
- 2 kudos
Resolved! Adding new columns to a table and previous jobs failing
Hello community! I’m new to Databricks and currently working on a project structured in Bronze / Silver / Gold layers using Delta Lake and Change Data Feed. I recently added 3 new columns to a table and initially applied these changes via PySpark SQ...
Hello @leticialima__, good day. Can you please share the error observed in the driver log? Is it [Errno 13] Permission denied or No such file or directory? Please let me know the error from the driver log. Thank you.
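While waiting on the driver log, a minimal sketch of the schema-change pattern the question describes, assuming a Delta table with Change Data Feed enabled; the table, column names, and starting version are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

table = "my_catalog.silver.customers"   # placeholder table name

# Adding columns to a Delta table is a metadata-only change; existing rows
# read the new columns as NULL.
spark.sql(f"""
    ALTER TABLE {table}
    ADD COLUMNS (segment STRING, score DOUBLE, updated_by STRING)
""")

# Downstream jobs reading the Change Data Feed must tolerate the wider schema,
# otherwise previously working runs can start failing on a schema mismatch.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 0)       # placeholder starting version
    .table(table)
)
changes.printSchema()
```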
- 245 Views
- 1 reply
- 0 kudos
Streamlit in Databricks
Hi, I have developed a Streamlit app locally on my desktop using dummy data, and now I want to be able to use actual data stored in Azure Blob Storage. I have tried to run the same code within a notebook, but keep on getting dependency errors. Is there...
Hello @aw1! What exact dependency errors or permission failures are you getting? Can you please share the error message?
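If the blocker is simply getting real data from Azure storage into the app, one hedged sketch; the abfss path and limits are placeholders, and it assumes the cluster (or Databricks App) has already been granted access to the container:

```python
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder ABFSS path into Azure Blob / ADLS Gen2.
path = "abfss://mycontainer@mystorageacct.dfs.core.windows.net/sales/2024/"

# Read with Spark (credentials come from a Unity Catalog external location,
# cluster config, or a service principal - not shown here), then hand a plain
# pandas DataFrame to the Streamlit code so the local logic stays unchanged.
sdf = spark.read.format("parquet").load(path)
pdf: pd.DataFrame = sdf.limit(100_000).toPandas()

# A notebook is only good for prototyping the data access; the Streamlit UI
# itself is better hosted as a Databricks App or run locally against an export.
print(pdf.head())
```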
- 284 Views
- 0 replies
- 0 kudos
Exclusive Masterclass: Oracle Fusion + Databricks Integration using Orbit Analytics
Transform your Oracle Fusion and Databricks data into AI-powered business intelligence with Orbit Analytics. Exclusive Masterclass: Oracle Fusion + Databricks Integration (for more information, click here). Why this matters to you: Are you struggling to unloc...
- 346 Views
- 2 replies
- 0 kudos
Databricks dashboard deployment (schema and catalog modification)
I have a Databricks dashboard. I have deployed the lvdash.json file through yml (resource.json) from dev to the qa env. Now I can see my dashboard's published version in the resources folder. I want to change the catalog and schema of those underlying queries I ...
You can try using DAB to promote the dashboard and parameterize the query. For more details, check out the DAB dashboard documentation.
- 20919 Views
- 34 replies
- 1 kudos
Resolved! My Databricks exam got suspended just for coming closer to the laptop screen to read the question and options
Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 10 minutes. I had also shown my exam room to the proctor. My exam got suspended due to eye movement. I was not moving my eyes away from the laptop screen. It's hard to focus...
@Cert-TeamOPS I am writing to raise a concern regarding an interruption that occurred during my Databricks Certified Data Engineer Associate exam scheduled for today at 1:15 PM. I began the exam at 1:00 PM, and the experience was smooth until I recei...
- 635 Views
- 7 replies
- 5 kudos
Resolved! How to create classes that can be instantiated from other notebooks?
Hi, I am familiar with object-oriented programming and cannot really get my head around the philosophy of coding in Databricks. My approach, which naturally consists in creating classes and instantiating objects, does not seem to be the right one. Can som...
Legendary, @szymon_dybczak. All the best, BS
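For readers with the same question, a minimal sketch of one common pattern: keep the class in a plain .py file next to the notebooks (for example in a Repo) and import it. The file, package, and class names below are placeholders.

```python
# utils/pricing.py - a regular Python module checked into the Repo
from dataclasses import dataclass


@dataclass
class PriceCalculator:
    """Small example class that any notebook can instantiate."""
    tax_rate: float = 0.2

    def gross(self, net: float) -> float:
        return net * (1 + self.tax_rate)


# In any notebook in the same Repo, workspace files are importable directly,
# because the repo root is placed on sys.path:
#
#     from utils.pricing import PriceCalculator
#     calc = PriceCalculator(tax_rate=0.25)
#     print(calc.gross(100.0))
#
# The older alternative is `%run ./utils/pricing_notebook`, which executes
# another notebook and drops its definitions into the current namespace.
```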
- 1889 Views
- 3 replies
- 1 kudos
Google PubSub for DLT - Error
I'm trying to create a Delta Live Table from a Google PubSub stream. Unfortunately I'm getting the following error: org.apache.spark.sql.streaming.StreamingQueryException: [PS_FETCH_RETRY_EXCEPTION] Task in pubsub fetch stage cannot be retried. Partiti...
@itamarwe can you please share which permission resulted in the issue and how it got resolved?
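For context, a rough sketch of a DLT Pub/Sub source; the project, topic, subscription, and credential values are placeholders, the option names should be verified against the Databricks Pub/Sub connector docs, and the usual culprit for fetch-stage failures is the service account missing Pub/Sub subscriber/viewer rights on the subscription:

```python
import dlt
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder GCP details; in practice pull the key material from a secret scope.
pubsub_options = {
    "projectId": "my-gcp-project",
    "topicId": "my-topic",
    "subscriptionId": "my-subscription",
    "clientEmail": "svc-account@my-gcp-project.iam.gserviceaccount.com",
    "clientId": "1234567890",
    "privateKeyId": "<from-secret-scope>",
    "privateKey": "<from-secret-scope>",
}


@dlt.table(name="pubsub_raw")
def pubsub_raw():
    # Structured Streaming source for Google Pub/Sub; requires a recent DBR.
    return spark.readStream.format("pubsub").options(**pubsub_options).load()
```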
- 1731 Views
- 2 replies
- 2 kudos
Data load issue
I have a job in Databricks which completed successfully, but the data has not been written into the target table. I have checked all the possible ways; everything is correct in the code: target table name, source table name, etc. It is a Fu...
This looks like a misconfigured Query Watchdog, specifically the below config: spark.conf.get("spark.databricks.queryWatchdog.outputRatioThreshold") Please check the value of this config - it is 1000 by default. Also, we recommend using Jobs Comput...
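A quick way to inspect and, where the fan-out is legitimate, relax that setting from a notebook on the interactive cluster; the new threshold below is just an example value:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Query Watchdog cancels queries whose output-to-input row ratio exceeds this
# threshold; the default on interactive (all-purpose) clusters is 1000.
print(spark.conf.get("spark.databricks.queryWatchdog.outputRatioThreshold"))

# If the job genuinely fans rows out (exploding joins, etc.), raise the
# threshold for the session - or disable the watchdog - and rerun the write.
spark.conf.set("spark.databricks.queryWatchdog.outputRatioThreshold", 10000)
# spark.conf.set("spark.databricks.queryWatchdog.enabled", "false")
```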
- 377 Views
- 1 reply
- 1 kudos
Delta UniForm
When we save a delta table using the UniForm option we are seeing a 50% drop in table size. When we add UniForm to an existing delta table after the fact, we are seeing no change in data size. Is this expected, or are others seeing this as well?
Re: "When we save a delta table using the UniForm option we are seeing a 50% drop in table size" What format are you starting with? e.g. CSV -> Delta.
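For reference, a hedged sketch of the two ways UniForm is typically enabled, which is where the size comparison differs; table names are placeholders and the exact properties are worth checking against the current UniForm docs:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# 1) A brand-new table written with UniForm (Iceberg metadata) from the start.
#    Any size drop here is more likely the source-to-Delta/Parquet conversion
#    (e.g. CSV -> compressed Parquet) than UniForm itself.
spark.sql("""
    CREATE TABLE main.demo.events_uniform
    TBLPROPERTIES (
      'delta.enableIcebergCompatV2' = 'true',
      'delta.universalFormat.enabledFormats' = 'iceberg'
    )
    AS SELECT * FROM main.demo.events_source
""")

# 2) Enabling UniForm on an existing Delta table afterwards only adds Iceberg
#    metadata next to the existing Parquet files, so seeing no change in data
#    size is the expected behaviour.
spark.sql("""
    ALTER TABLE main.demo.events_existing SET TBLPROPERTIES (
      'delta.enableIcebergCompatV2' = 'true',
      'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")
```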
- 554 Views
- 1 reply
- 2 kudos
Resolved! AutoLoader Pros/Cons When Extracting Data (Cross-Post)
Cross-posting from: https://community.databricks.com/t5/data-engineering/autoloader-pros-cons-when-extracting-data/td-p/127400Hi there, I am interested in using AutoLoader, but I'd like to get a bit of clarity if it makes sense in my case. Based on e...
You’ve already identified data duplication as a potential con of landing the data first, but there are several benefits to this approach that might not be immediately obvious: Schema Inference and Evolution: AutoLoader can automatically infer the sche...
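A minimal Auto Loader sketch showing the schema inference and evolution piece described above; the storage paths and target table are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

landing_path = "abfss://landing@mystorageacct.dfs.core.windows.net/vendor_x/"         # placeholder
schema_path = "abfss://landing@mystorageacct.dfs.core.windows.net/_schemas/vendor_x"  # placeholder

stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", schema_path)            # inferred schema persisted here
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")   # new columns evolve the stream
    .load(landing_path)
)

(
    stream.writeStream
    .option("checkpointLocation", schema_path + "/_checkpoint")
    .option("mergeSchema", "true")                # let the Delta target pick up new columns
    .trigger(availableNow=True)                   # incremental, batch-style run
    .toTable("main.bronze.vendor_x_raw")          # placeholder target table
)
```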
- .CSV (1)
- Access Data (2)
- Access Databricks (1)
- Access Delta Tables (2)
- Account reset (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (1)
- AI (1)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- API Documentation (3)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (3)
- Auto-loader (1)
- Autoloader (4)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (5)
- Azure data disk (1)
- Azure databricks (14)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (5)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (1)
- Blackduck (1)
- Bronze Layer (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Cluster Init Script (1)
- Community Edition (3)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- Custom Python (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineering (3)
- Data Explorer (1)
- Data Ingestion & connectivity (1)
- Data Processing (1)
- Databrick add-on for Splunk (1)
- databricks (2)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (3)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (3)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks Model (1)
- Databricks notebook (2)
- Databricks Notebooks (3)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- databricks_cluster_policy (1)
- DatabricksJobCluster (1)
- DataCleanroom (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free trial (1)
- GCP Databricks (1)
- GenAI (1)
- Getting started (2)
- Google Bigquery (1)
- HIPAA (1)
- import (1)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (3)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model Training (1)
- Module (1)
- Networking (1)
- Notebook (1)
- Onboarding Trainings (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Schedule (1)
- Serverless (3)
- serving endpoint (1)
- Session (1)
- Sign Up Issues (2)
- Spark Connect (1)
- sparkui (2)
- Splunk (2)
- SQL (8)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (1)
- terraform (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (1)
- Venicold (3)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)