- 734 Views
- 1 replies
- 0 kudos
Logging: Unable to read a /volume based file
Hi, we've just started using Databricks, so I'm a little new to the file system, especially regarding Unity Catalog. The issue is that we're creating a logger and want to write the files based on a queue handler/listener pattern. The pattern...
When using the CLI you need to add the scheme: dbfs:/Volumes/... The rest should be fine to reference with "/Volumes/...". For more info, see Manage files in volumes | Databricks Documentation. Hope this solves the issue!
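For anyone hitting the same thing, here is a minimal sketch of the queue handler/listener pattern that logs to a local file first and then copies the result to a Unity Catalog volume. The catalog/schema/volume names are hypothetical, and `dbutils` is the usual Databricks notebook global; plain Python I/O can address the volume as /Volumes/..., while dbutils/CLI-style file utilities accept the dbfs:/ scheme mentioned above.

```python
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

# Hypothetical paths: log locally first, then copy the finished file to a UC volume.
LOCAL_LOG = "/tmp/etl_run.log"
VOLUME_LOG = "dbfs:/Volumes/my_catalog/my_schema/logs/etl_run.log"

log_queue: queue.Queue = queue.Queue()
file_handler = logging.FileHandler(LOCAL_LOG)
file_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

# The listener drains the queue on a background thread and writes via the file handler.
listener = QueueListener(log_queue, file_handler)
listener.start()

logger = logging.getLogger("etl")
logger.setLevel(logging.INFO)
logger.addHandler(QueueHandler(log_queue))

logger.info("pipeline started")
# ... pipeline work ...
listener.stop()

# dbutils is available in Databricks notebooks; the dbfs:/ scheme works for fs utilities.
dbutils.fs.cp(f"file:{LOCAL_LOG}", VOLUME_LOG)  # noqa: F821
```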
- 2832 Views
- 3 replies
- 6 kudos
Resolved! How to use variable-overrides.json for environment-specific configuration in Asset Bundles?
Hi all, could someone clarify the intended usage of the variable-overrides.json file in Databricks Asset Bundles? Let me give some context. Let's say my repository layout looks like this: databricks/ ├── notebooks/ │ └── notebook.ipynb ├── resources/ ...
It does. Thanks for the response. I also continued playing around with it and found a way using the variable-overrides.json file. I'll leave it here in case anyone is interested. Repository layout: databricks/ ├── notebooks/ │ └── notebook.ipynb ...
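To make the accepted approach concrete, here is a rough sketch of how the pieces fit together as I read the docs: a variable is declared (with a default) in databricks.yml, and a per-target value can be supplied in a .databricks/bundle/<target>/variable-overrides.json file, which the CLI should pick up before falling back to the default. All names and values below are hypothetical.

```yaml
# databricks.yml (hypothetical bundle) -- declare the variable and the targets
bundle:
  name: my_bundle

variables:
  catalog:
    description: Catalog that jobs in this bundle write to
    default: dev_catalog

targets:
  dev:
    default: true
  prod:
    mode: production
```

```json
{
  "catalog": "prod_catalog"
}
```

Saved as .databricks/bundle/prod/variable-overrides.json, the override above should apply when deploying with `databricks bundle deploy -t prod`, and the variable can be referenced elsewhere in the bundle as ${var.catalog}.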
- 1107 Views
- 1 replies
- 0 kudos
Resolved! Workspace Consolidation Strategy in Databricks
Hi Team, the customer is facing a challenge related to increasing Databricks workspace maintenance costs. Apparently, every project is creating its own workspace for specific functionalities, and this has become a standard practice. As a result, the n...
This is something that you should discuss with your Databricks rep, in my opinion. Even with standard tools, migrating and consolidating 200 workspaces is something that needs very careful planning and testing.
- 663 Views
- 0 replies
- 0 kudos
Introduction: Dario Schiraldi, Deutsche Bank Executive
Dario Schiraldi is a Deutsche Bank executive known for his strong leadership in the financial and banking sector. Dario Schiraldi brings 20 years of leadership experience to major worldwide organizations, where his expertise extends into both market acqui...
- 641 Views
- 0 replies
- 0 kudos
SAS TO DATABRICKS MIGRATION
SAS to PY is an AI/ML-based accelerator designed for "SAS to Python or PySpark" code migration. This accelerator is engineered to convert legacy, proprietary SAS code to the more flexible, open-source Python or PySpark environment with 95% automatica...
- 610 Views
- 1 replies
- 0 kudos
Dario Schiraldi : How do I integrate Databricks with AWS?
Hi everyone, I am Dario Schiraldi, CEO of Travel Works, and I am reaching out to the community for some insights. We are in the process of integrating Databricks with AWS for a new project, and I would love to hear from anyone who has experience with t...
Hello Dario, good to meet you. You can connect with your Databricks account manager. Azure also provides first-party assistance for Databricks, so you can check the Azure services as well. Thank you.
- 4786 Views
- 4 replies
- 0 kudos
Resolved! vscode python project for development
Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Linting is also enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...
How did you solve the type-error checks on `pyspark.sql`? mypy doesn't create the missing stubs for that one?
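On the stubs question: newer PySpark releases bundle their own type hints (if I recall correctly, stub files arrived around 3.1 and moved inline later), so Pylance/mypy can usually check pyspark.sql code without extra packages; for older versions the third-party pyspark-stubs project filled that gap. A minimal sketch of an annotated helper that should satisfy strict checking (the function and column names are made up):

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def add_row_count(df: DataFrame, group_col: str) -> DataFrame:
    """Return df with a per-group row count column (hypothetical example)."""
    counts = df.groupBy(group_col).agg(F.count("*").alias("row_count"))
    return df.join(counts, on=group_col, how="left")


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    data = spark.createDataFrame([("a", 1), ("a", 2), ("b", 3)], ["key", "value"])
    add_row_count(data, "key").show()
```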
- 1440 Views
- 1 replies
- 1 kudos
Resolved! How to trigger Power BI refresh from Databricks pipeline without keeping cluster alive?
I have a Databricks pipeline that pulls data from AWS, which takes ~90 minutes. After this, I need to refresh a series of Power BI dataflows (~45 mins) and then datasets (~45 mins). I want to trigger the Power BI refresh automatically from Databricks ...
Hi @chandataeng, the Power BI task that is currently available in Databricks Workflows will wait for the refresh process to return its final status (whether it succeeded or failed). But you can start the refresh process with an asynchronous REST API call. The ref...
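A minimal sketch of that fire-and-forget call, assuming a service principal (or other Azure AD identity) with access to the Power BI workspace; the IDs and token below are placeholders. The dataset refresh endpoint returns 202 Accepted as soon as the refresh is queued, so the Databricks job can finish without waiting for the refresh to complete.

```python
import requests

# Hypothetical identifiers -- substitute your own Power BI workspace (group) and dataset IDs.
GROUP_ID = "<powerbi-workspace-id>"
DATASET_ID = "<dataset-id>"
ACCESS_TOKEN = "<azure-ad-token-for-the-power-bi-api>"  # e.g. obtained via a service principal

# POST .../refreshes queues the refresh and returns 202 Accepted immediately.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "NoNotification"},
    timeout=30,
)
resp.raise_for_status()
print("Refresh queued, status:", resp.status_code)  # expect 202
```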
- 2507 Views
- 2 replies
- 0 kudos
Resolved! information_schema not populating with columns
We started migrating databases from hive_metastore into Unity Catalog back in October 2024, and I've noticed that periodically the Catalog UI will not show columns or a data preview for some of the migrated tables, but not all of them. After some di...
This is definitely a bug related to older instances of Azure Databricks that were upgraded to use the Unity Catalog platform. After going back and forth with MS support for 2+ months, we made the decision to just spin up a new instance of Azure Databricks and co...
- 7615 Views
- 8 replies
- 0 kudos
Delta Sharing - Alternative to config.share
I was recently given a credential file to access shared data via delta sharing. I am following the documentation from https://docs.databricks.com/en/data-sharing/read-data-open.html. The documentation wants the contents of the credential file in a fo...
Hi, the most feasible way would be to convert the contents of your credential file into Base64 and only set the Spark config as below: credentials <base64-encoded contents>
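If the goal is simply to avoid handling a config.share file by hand, another pattern (distinct from the Base64 Spark-config suggestion above) is to keep the credential JSON in a secret and write it to a temporary profile file at runtime for the open-source delta-sharing client. The secret scope/key and the share/schema/table names below are hypothetical.

```python
import tempfile

import delta_sharing  # open-source client: pip install delta-sharing

# Hypothetical: the credential file contents are stored in a Databricks secret.
profile_json = dbutils.secrets.get(scope="sharing", key="config_share")  # noqa: F821

# Materialize the profile so the client can read it like a normal config.share file.
with tempfile.NamedTemporaryFile("w", suffix=".share", delete=False) as f:
    f.write(profile_json)
    profile_path = f.name

# Table URL format: "<profile-file>#<share>.<schema>.<table>" (names here are made up).
table_url = f"{profile_path}#my_share.my_schema.my_table"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```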
- 2960 Views
- 3 replies
- 2 kudos
Resolved! is Spark UI available on the Databricks Free Edition?
Hi all, I have a noob question. I am currently using the Databricks Free Edition, which runs on serverless compute. To access the Spark UI one would normally click on the attached compute; however, with serverless I can not find the menu to access Spar...
So, there is no way we can run Spark in the Free Edition, since we would need general-purpose clusters?
- 3201 Views
- 4 replies
- 0 kudos
Databricks Features
Hi all, I am new to Databricks and am using the Community version. So far, I have noticed some limitations: features like DBFS (the file system) are restricted, and cluster configuration is locked. So I am thinking of using the trial version; it will give 14 da...
Hello. The DBFS file browser is often disabled by default in the user interface; it can typically be re-enabled through the admin settings. In the free edition you would face some limitations with cluster size. However, if you...
- 4613 Views
- 7 replies
- 0 kudos
Capture data from a Specific SharePoint Site (List) in M365 into Azure Databricks
Hello. We are using Azure Databricks and would like to ingest data from a specific M365 SharePoint Online Site/List. I was originally trying to use this recommendation, https://learn.microsoft.com/en-us/answers/questions/2116616/service-principal-a...
We achieved the same using the SharePoint API. You can follow the steps outlined in this documentation: https://learn.microsoft.com/en-us/graph/auth-v2-service?tabs=http. Additionally, you can grant the Sites.Selected permission to the Azure AD applic...
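For reference, a minimal sketch of that Graph-based approach using the client-credentials flow, assuming an app registration that has been granted Sites.Selected on the target site. Every identifier below is a placeholder, and dbutils/spark/display are the usual Databricks notebook globals.

```python
import requests

# Hypothetical app registration and site/list identifiers -- replace with your own.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = dbutils.secrets.get(scope="m365", key="client_secret")  # noqa: F821
SITE_ID = "<sharepoint-site-id>"
LIST_ID = "<list-id>"

# Client-credentials flow; the app needs a Graph application permission
# such as Sites.Selected granted on the target SharePoint site.
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://graph.microsoft.com/.default",
    },
    timeout=30,
).json()["access_token"]

# Read the list items; expand=fields returns the column values of each item.
items = requests.get(
    f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/lists/{LIST_ID}/items?expand=fields",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
).json()["value"]

rows = [item["fields"] for item in items]
df = spark.createDataFrame(rows)  # noqa: F821 (spark is available in Databricks notebooks)
display(df)  # noqa: F821
```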
- 1021 Views
- 2 replies
- 1 kudos
Dario Schiraldi Deutsche Bank Executive : Excited to Join
I'm Dario Schiraldi, a Deutsche Bank executive. During my time there, I led global institutional sales and investment businesses, honing my expertise in strategy, leadership, and financial markets. As someone who's passionate about the transformative p...
Hi @ds01, welcome, Dario! It's great to have someone with your deep experience in finance and leadership join the Databricks community. Looking forward to your insights and contributions!
- 1243 Views
- 1 replies
- 0 kudos
Efficiently Copying Single or Multiple Cells in Databricks Notebooks
I found some hidden features in Databricks notebooks, but sometimes when copying cells, titles are lost. Maybe someone knows the reason?
When you copy cells in Databricks, titles (headers) can be lost because: the TOC (table of contents) is built from Markdown titles like "# Level 1"; if you copy only the content and not the Markdown formatting, it won't show in the TOC; and hidden cells under the title migh...
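For reference, a minimal sketch of a Markdown cell whose heading feeds the notebook table of contents (the heading text is made up); copying only the cell body without the %md magic and the # marker is what makes it disappear from the TOC:

```
%md
# Level 1: Ingestion
Notes for this section of the notebook.
```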