- 1884 Views
- 3 replies
- 1 kudos
Replacing Excel with Databricks
I have a client that currently uses a lot of Excel with VBA and advanced calculations. Their source data is often stored in SQL Server. I am trying to make the case to move to Databricks. What's a good way to make that case? What are some advantages t...
To add to this, my team and I have been using Databricks in an enterprise environment to replace Excel-based calculations relying on SQL-stored data with calculations served as model serving endpoints (API) - the initial 'translation' work can be tedi...
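The approach in the reply above, serving calculations from a model serving endpoint, can be sketched as a small Python client. This is a minimal sketch, assuming a standard Databricks serving endpoint; the host, endpoint name, token, and input fields are hypothetical placeholders, not values from the thread.

```python
import json
import urllib.request

def build_payload(rows):
    """Package input rows in the dataframe_records format accepted
    by Databricks model serving invocation endpoints."""
    return json.dumps({"dataframe_records": rows}).encode("utf-8")

def score(host, endpoint_name, token, rows):
    """POST rows to a serving endpoint and return the parsed JSON response.
    host, endpoint_name, and token are placeholders for your workspace values."""
    req = urllib.request.Request(
        url=f"{host}/serving-endpoints/{endpoint_name}/invocations",
        data=build_payload(rows),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# The same kind of inputs a spreadsheet calculation might take:
payload = build_payload([{"principal": 1000.0, "rate": 0.05, "years": 10}])
```

Calling `score(...)` needs a live workspace, so only the payload construction runs locally here.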
- 896 Views
- 2 replies
- 1 kudos
Resolved! Search page to search code inside .py files
Hello, hope you are doing good. When on the search page, it seems it's not searching for code inside .py files but rather only the filename. Is there an option somewhere I'm missing to be able to search inside .py files? Best, Alan
Hello, so it seems Databricks does not allow it - an easy workaround for us is to search directly on our Azure DevOps Repos.
- 1106 Views
- 1 replies
- 0 kudos
Community Edition cluster detach: java.util.TimeoutException
Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of the JDBC driver throwing java.util.TimeoutException. Basically the cluster connects and executes for 15 sec or so, which is a socket limit, and disables any...
@Kishore23 wrote: Hi folks, I was exploring the Databricks Community Edition and came across a cluster issue, mostly because of the JDBC driver throwing java.util.TimeoutException. Basically the cluster connects and executes for 15 sec or so, which is a so...
- 50474 Views
- 11 replies
- 1 kudos
Error: Folder xxxx@xxx.com is protected
Hello, on Azure Databricks I'm trying to remove a folder under the Repos folder using the following command: databricks workspace delete "/Repos/xxx@xx.com". I got the following error message: Error: Folder ...
Hello Databricks Forums, when you see the Azure Databricks error message "Folder xxxx@xxx.com is protected," it means that you are attempting to remove a system-protected folder, which is usually connected to a user's workspace, particularly under the...
- 659 Views
- 1 replies
- 0 kudos
dev and prod
A "SELECT * FROM" call on my table in PROD returns all the rows of data, but a call on my table in DEV returns just one row of data. What could be the problem?
Tell us more about your environment. Are you using Unity Catalog? What is the table format? What cloud platform are you on? More information is needed.
- 2654 Views
- 0 replies
- 0 kudos
Issue with disabled "Repair DAG" and "Repair All DAGs" buttons in the Airflow UI; the functionality itself is working.
We are encountering an issue in the Airflow UI where the 'Repair DAG' and 'Repair All DAGs' options are disabled when a specific task fails. While the repair functionality itself is working properly (i.e., the DAGs can still be repaired through execu...
- 703 Views
- 2 replies
- 0 kudos
Will auto loader read files if it doesn't need to?
I want to run Auto Loader on some very large JSON files. I don't actually care about the data inside the files, just the file paths of the blobs. If I do something like ``` spark.readStream .format("cloudFiles") .option("cloudFiles.fo...
Hi @charliemerrell, yes, Databricks will still open and parse the JSON files, even if you're only selecting _metadata. It must infer the schema and perform basic parsing unless you explicitly avoid it. So even if you do .select("_metadata"), it doesn't skip...
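Since the reply above notes that Auto Loader still parses file contents even when only `_metadata` is selected, one alternative, if you truly only need paths, is to list the storage directly instead of streaming through Spark. A minimal local sketch (in a real workspace you would list blobs with `dbutils.fs.ls` or a cloud storage SDK; the directory layout here is hypothetical):

```python
import os

def list_json_paths(root):
    """Collect .json file paths under root without ever opening the
    files, unlike Auto Loader, which still parses each file it reads."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".json"):
                paths.append(os.path.join(dirpath, name))
    return sorted(paths)
```

This sidesteps schema inference entirely, at the cost of losing Auto Loader's incremental checkpointing.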
- 1003 Views
- 2 replies
- 1 kudos
Enroll, Learn, Earn Databricks !!
Hello Team, I attended the session at CTS Manyata on 22nd April. I am interested in pursuing the certifications, but while enrolling it shows "you are not a member of any group". Link for the available certifications and courses: https://community...
Hi @samgupta88, you can find it in the Partner Academy. Everything is listed in the partner portal.
- 880 Views
- 4 replies
- 0 kudos
UCX Installation error
Error Message: databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 00127F76E005AE12.
Click into each policy in the Compute UI of the Workspace to see if the policy ID exists. If it does, then the account that invoked the SDK method didn't have workspace admin permissions.
- 1366 Views
- 3 replies
- 0 kudos
Resolved! .py file running stuck on waiting
Hello, hope you are doing well. We are facing an issue when running .py files. This is fairly recent; we were not experiencing this issue last week. As shown in the screenshots below, the .py file hangs on "waiting" after we press "run all". No matt...
Hello, thanks a lot for your answer. We were getting the required permissions to use Firefox in our org, but in the meantime it seems it worked again in Edge when it updated to version 135.0.3179.85 (Official build) (64-bit).
- 7360 Views
- 2 replies
- 2 kudos
Resolved! requirements.txt with cluster libraries
Cluster libraries are supported from version 15.0 - Databricks Runtime 15.0 | Databricks on AWS. How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative to the root of th...
How to install requirements.txt using a GitHub Action:

- name: Install workspace requirements.txt on cluster
  env:
    CLUSTER_ID: ${{ secrets.DATABRICKS_CLUSTER_ID }}
  run: |
    databricks libraries install \
      --cluster-id "$CLUSTER_ID" \
      --whl "dbfs:/FileStore/enginee...
- 1697 Views
- 5 replies
- 0 kudos
Debugging notebook access to external REST API
I'm using a Python notebook with a REST API to access a system outside Databricks; in this case it's to call a SAS program. Identical Python code works fine when I call it from Jupyter on my laptop, but fails with a timeout when I run it from my Databr...
What happens if you run this command in a notebook: nc -vz hostname 443? If it fails to connect, this means the firewall or security groups associated with the VPC or VNet are not allowing this connection; you will need to check with your networking te...
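The `nc -vz` check suggested above can also be done from pure Python when `nc` is not available on the cluster image. A small sketch (the host and port are whatever your external system uses; 443 matches the example in the reply):

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Rough equivalent of `nc -vz host port`: attempt a TCP
    connection and report success without sending any data."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. can_connect("my-sas-server.example.com", 443)
```

A False result points at network policy (firewall, NSG, VPC/VNet rules) rather than your application code.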
- 769 Views
- 1 replies
- 0 kudos
Trusted assets vs query examples
Hi community! In recent days I explored trusted assets in my Genie space, and this is working very well! But I feel a little confused: in my Genie space I have many query examples. When I create a new function with the same query example to verify th...
Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...
- 1216 Views
- 2 replies
- 0 kudos
Resolved! Need help to add personal email to databricks partner account
I have been actively using the Databricks Partner Academy for the past three years through my current organization. As I am planning to transition to a new company, I would like to ensure continued access to my training records and certifications. Cur...
- 1115 Views
- 1 replies
- 0 kudos
Python versions - Notebooks and DBR
Hi, I have a problem with conflicting Python versions in a notebook running on the Databricks 14-day free trial. One example: spark.conf.get("spark.databricks.clusterUsageTags.clusterName") # Returns: "Python versions in the Spark Connect client and...
Hi @Terje, were you able to fix it? From what I know, during the free trial period we’re limited to the default setup, so version mismatches can’t be resolved unless we upgrade to a paid workspace.