- 1768 Views
- 2 replies
- 1 kudos
How to Determine Which CLI Methods to Use?
The Databricks CLI is considered Legacy for versions below 0.17 and is in Public Preview for versions above 0.20. I have access to documentation for both these versions separately. As I am developing a new project, I prefer not to use legacy options ...
- 1 kudos
While the legacy CLI (versions 0.18 and below) is still available, it is not receiving any non-critical updates, and it is recommended to migrate to the new CLI as soon as possible. "Databricks recommends that you use Databricks CLI version 0.205 or abov...
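For anyone in the same position, a minimal sketch (assuming the `databricks` executable is already installed and on the PATH) of checking which CLI generation is installed before starting a new project:

```python
# Minimal sketch: shell out to the installed CLI and print its version string.
# Assumes only that the `databricks` executable is on the PATH.
import subprocess

result = subprocess.run(
    ["databricks", "--version"], capture_output=True, text=True, check=True
)
print(result.stdout.strip())
# The new CLI reports a 0.2xx version (e.g. "Databricks CLI v0.2xx.x"),
# while the legacy Python CLI reports 0.18 or below.
```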
- 2162 Views
- 1 replies
- 0 kudos
Community version now shows only the ML persona
Hi, I have been using the Community Edition for some time and had saved some notebooks. While the notebooks are still there in the workspace, the different personas are not available as of today. Only the machine learning persona is visible. Would like to check...
- 0 kudos
Hi MSD, we removed the different personas that were available earlier, and now it is a flat system (it is not a recent change; it has been this way for some time). The notebooks are saved in your workspace, hence they are accessible. Irrespective of the perso...
- 1829 Views
- 0 replies
- 0 kudos
dbdemos LLM Chatbot RAG
I have an issue when running the below code using the default dbdemos in the advanced preparation. I have reduced the chunk_size and max_batch_size and am running the code on proper compute resources. Could anyone help with that, please? :(spark.readStr...
- 6850 Views
- 4 replies
- 0 kudos
pytest error
Hello, I have a quick question. If my source code calls PySpark collect() or any RDD-related method, then pytest on my local PC reports the following error. My local machine doesn't have any specific settings for PySpark, and I used findspa...
- 0 kudos
Thank you very much, brockb. I will probably try it in Databricks. Thanks.
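A common way to make such tests runnable locally is a pytest fixture that starts a local SparkSession; here is a minimal sketch, assuming `pyspark` is pip-installed on the local machine (file names are illustrative):

```python
# conftest.py (illustrative): a session-scoped local SparkSession for unit tests.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    session = (
        SparkSession.builder
        .master("local[1]")      # run Spark in-process on the local machine
        .appName("pytest-local")
        .getOrCreate()
    )
    yield session
    session.stop()

# test_example.py (illustrative): a test that can safely call collect().
def test_collect_returns_rows(spark):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
    assert len(df.collect()) == 2
```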
- 5427 Views
- 0 replies
- 0 kudos
SQL Statement Execution API w/ Javascript (REST)
I need to use the Databricks SQL Statement Execution API with JavaScript (see example post). For some reason, curl works and Python works, but JavaScript fails. This works (curl): curl --request POST \ https://adb-5750xxxxxxx.azured...
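For reference, the Python variant the poster says already works looks roughly like the sketch below (host, token, and warehouse ID are placeholders, not values from the post). If the JavaScript version runs in a browser, CORS restrictions are a common culprit, since curl and server-side Python are not subject to them.

```python
# Sketch of a SQL Statement Execution API call; host/token/warehouse_id are placeholders.
import requests

HOST = "https://<workspace-host>.azuredatabricks.net"   # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
payload = {
    "warehouse_id": "<warehouse-id>",                    # placeholder
    "statement": "SELECT 1",
    "wait_timeout": "30s",
}
resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```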
- 1241 Views
- 2 replies
- 0 kudos
Databricks Certification Exam Got Suspended. Require Immediate Support
Hi Team, today (29th May 2024) I began my Databricks assessment exam, but it was abruptly suspended by the proctor without any explanation. This was my first exam, and it has been a disappointing experience. I started the exam calmly, but the proctor w...
- 0 kudos
Thank you, @Cert-Team, for your quick response. I am looking forward to the resolution of my issue.
- 1034 Views
- 0 replies
- 0 kudos
Create a community user group in Romania
Hello, I would like to receive some support in creating a Community User Group in Cluj-Napoca, Romania. Our intention as a company is to become a partner and to create a community: events, meetups, and so on. Looking forward to your help. Thank you, Nicu
- 2423 Views
- 3 replies
- 1 kudos
"AWS S3 resource has been disabled" error on job, not appearing on notebook
I am getting an "INTERNAL_ERROR" on a Databricks job submitted through the API, which says: "Run result unavailable: run failed with error message All access to AWS S3 resource has been disabled". However, when I click on the notebook created by the job...
- 1 kudos
@Retired_mod In the S3 logs of the run, I am seeing this:
24/05/29 06:39:30 WARN FileSystem: Failed to initialize fileystem dbfs:///: java.io.FileNotFoundException: Bucket user-workspace-s3-bucket does not exist
24/05/29 06:39:30 ERROR DbfsHadoop3: Fai...
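A quick way to reproduce what the log complains about is to check whether the DBFS root is reachable at all from a notebook on the same cluster configuration; a small sketch (Databricks notebook context assumed, `dbutils` and `display` are the notebook built-ins):

```python
# Run in a Databricks notebook: list the DBFS root, which is backed by the workspace S3 bucket.
# If the bucket is missing or access is disabled, this raises the same kind of error as the job.
files = dbutils.fs.ls("dbfs:/")
display(files)
```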
- 2569 Views
- 2 replies
- 0 kudos
Insufficient privileges error when running query from Notebook
Hello all, I am running into a permission issue when running a simple MERGE INTO query from a notebook: "AnalysisException: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have USE SCHEMA on Schema 'system.query'." I can run the que...
- 0 kudos
Are you using an all-purpose cluster or a SQL warehouse to run this query in the notebook?
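For context, the grants the error message points at look like the sketch below; the principal is hypothetical, the statements must be run by someone with sufficient privileges (for example a metastore admin), and access to system schemas may additionally need to be enabled at the account level.

```python
# Hedged sketch, run from a notebook by an admin; `spark` is the notebook's built-in session
# and the grantee below is a hypothetical example user.
spark.sql("GRANT USE SCHEMA ON SCHEMA system.query TO `user@example.com`")
spark.sql("GRANT SELECT ON SCHEMA system.query TO `user@example.com`")
```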
- 4103 Views
- 3 replies
- 0 kudos
issue related to Cluster Policy
Hello Databricks Community, I am currently working on creating a Terraform script to provision clusters in Databricks. However, I've noticed that by default, the clusters created using Terraform have the policy set to "Unrestricted." I would like to co...
- 0 kudos
The policy ID will persist; it is tied to the configuration you have set. Even if you change the configuration of a custom policy, it will keep the same policy ID.
- 2281 Views
- 1 replies
- 2 kudos
MLOps on Azure: API vs SDK vs Databricks CLI?
Hello fellow community members, in our organization we have developed, deployed, and utilized an API-based MLOps pipeline using Azure DevOps. The CI/CD pipeline has been developed and refined for about 18 months or so, and I have to say that it is pret...
- 2 kudos
Hello @ManiMar, in my opinion it's up to you to choose, and you're on the right path by comparing the pros/cons of each approach. I'd like to highlight that one of the advantages of the Databricks CLI is being able to use Databricks Asset Bundles. I...
- 1163 Views
- 1 replies
- 0 kudos
Unity Catalog - Quality - Monitor error
Monitor error: An error occurred while configuring your monitor for this table: Error while creating dashboard for unity-catalog-xxx: com.databricks.api.base.DatabricksServiceException: INTERNAL_ERROR: An internal error occurred. Please delete and recreat...
- 0 kudos
If you have too many dashboards, there's a chance that the workspace has reached its quota. I recommend contacting Databricks Support for a more in-depth analysis.
- 3172 Views
- 2 replies
- 0 kudos
count or toPandas taking too long
Hi, I am fetching data from Unity Catalog in notebooks using spark.sql(). The query takes just a few seconds (I am actually trying to retrieve 2 rows), but some operations like count() or toPandas() take forever. I wonder why it takes so long...
- 0 kudos
Hey @jimcast, how are you? You can check the internals and get a good hint of what's happening using the Spark UI. Filter and select the jobs that are taking the longest and check what is being requested on the SQL/DataFrame tab, as well as their pla...
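Before digging through the Spark UI, it can also help to look at the plan and to bound how much data actually reaches the driver; a minimal sketch, assuming the notebook's built-in `spark` session and a hypothetical table name:

```python
# Illustrative only: the table and filter below are placeholders, not the poster's query.
df = spark.sql("SELECT * FROM main.sales.orders WHERE order_id IN (1, 2)")
df.explain(mode="formatted")   # shows the scans/filters that count()/toPandas() will trigger
pdf = df.limit(2).toPandas()   # bound the rows materialized on the driver
print(len(pdf))
```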
- 1727 Views
- 2 replies
- 2 kudos
- 2 kudos
Yes, storage-partitioned joins can be optimized for data skewness. Techniques like adaptive query execution and dynamic repartitioning help distribute the workload evenly across nodes. By identifying and addressing dat...
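The adaptive execution and skew-join settings the reply alludes to look roughly like this; on recent Databricks runtimes they are typically enabled by default, so treat the sketch as illustrative rather than a required change:

```python
# Illustrative AQE / skew-join settings; `spark` is the notebook's built-in session.
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")
spark.conf.set("spark.sql.adaptive.skewJoin.skewedPartitionFactor", "5")
```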
- 2383 Views
- 3 replies
- 0 kudos
Databricks Certification Exam Got Suspended. Require Support for the Same.
Hello Team, I had a pathetic experience while attempting my first Databricks certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam without giving any reaso...
- 0 kudos
@Kaniz @Cert-Team @Sujitha I have sent multiple emails to the support team to reschedule my exam with a date, but I have not received any confirmation from them. Please look into this issue and reschedule the exam as soon as possible. This certification...