Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

stucas
by New Contributor II
  • 734 Views
  • 1 replies
  • 0 kudos

Logging: Unable to read a /volume based file

Hi, we've just started using Databricks, so I'm a little new to the file system, especially regarding Unity Catalog. The issue is that we're creating a logger and want to write the files based on a queue handler/listener pattern. The pattern...

Latest Reply
FedeRaimondi
Contributor II
  • 0 kudos

When using the CLI you need to add the scheme: dbfs:/Volumes/... The rest should be fine to refer to with "/Volumes/..."; for more info see Manage files in volumes | Databricks Documentation. Hope this solves the issue!
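Since the original question was about a queue handler/listener logger writing to a volume, here is a minimal runnable sketch of that pattern with Python's standard `logging.handlers`; the `/Volumes/...` path in the comment is a placeholder, and a temp file stands in so the snippet runs anywhere:

```python
import logging
import os
import queue
import tempfile
from logging.handlers import QueueHandler, QueueListener

# On Databricks this would be a UC volume path such as
# "/Volumes/<catalog>/<schema>/<volume>/app.log" (placeholder);
# a temp file stands in so the sketch runs anywhere.
log_path = os.path.join(tempfile.mkdtemp(), "app.log")

log_queue = queue.Queue(-1)                        # unbounded queue
file_handler = logging.FileHandler(log_path)       # writes via the POSIX path
file_handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))

listener = QueueListener(log_queue, file_handler)  # drains on a background thread
listener.start()

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)
logger.addHandler(QueueHandler(log_queue))         # producers only enqueue

logger.info("hello from the queue")
listener.stop()                                    # flushes remaining records
```

The key point for volumes: the Python side uses the plain `/Volumes/...` POSIX path, while the `dbfs:` scheme is only needed for CLI commands.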

  • 0 kudos
esistfred
by New Contributor III
  • 2833 Views
  • 3 replies
  • 6 kudos

Resolved! How to use variable-overrides.json for environment-specific configuration in Asset Bundles?

Hi all, could someone clarify the intended usage of the variable-overrides.json file in Databricks Asset Bundles? Let me give some context. Let's say my repository layout looks like this: databricks/ ├── notebooks/ │ └── notebook.ipynb ├── resources/ ...

Latest Reply
esistfred
New Contributor III
  • 6 kudos

It does. Thanks for the response. I also continued playing around with it and found a way using the variable-overrides.json file. I'll leave it here just in case anyone is interested. Repository layout: databricks/ ├── notebooks/ │ └── notebook.ipynb ...
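For anyone skimming the thread: a minimal sketch of what such a per-target override file can contain (the variable names here are made up for illustration; the documented location is `.databricks/bundle/<target>/variable-overrides.json`):

```json
{
  "catalog": "dev_catalog",
  "notification_email": "team@example.com"
}
```

Each key must correspond to a variable declared in the bundle's `databricks.yml`; the values override that variable for the target whose folder the file sits in.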

2 More Replies
Phani1
by Valued Contributor II
  • 1108 Views
  • 1 replies
  • 0 kudos

Resolved! Workspace Consolidation Strategy in Databricks

Hi Team, the customer is facing a challenge related to increasing Databricks workspace maintenance costs. Apparently, every project is creating its own workspace for specific functionalities, and this has become a standard practice. As a result, the n...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

This is something that you should discuss with your Databricks rep, imo. Even with standard tools, migrating and consolidating 200 workspaces is something that needs very careful planning and testing.

sastopy
by New Contributor II
  • 641 Views
  • 0 replies
  • 0 kudos

SAS to Databricks Migration

SAS to PY is an AI/ML-based Accelerator designed for "SAS to Python or PySpark" code migration. This Accelerator is engineered to convert legacy proprietary SAS code to the more flexible, open-source Python or PySpark environment with 95% automatica...

darioschiraldi9
by New Contributor II
  • 610 Views
  • 1 replies
  • 0 kudos

Dario Schiraldi : How do I integrate Databricks with AWS?

Hi everyone, I am Dario Schiraldi, CEO of Travel Works, and I am reaching out to the community for some insights. We are in the process of integrating Databricks with AWS for a new project, and I would love to hear from anyone who has experience with t...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello Dario, good to meet you. You can connect with your Databricks account manager. Azure also provides first-party assistance for Databricks, so you can check Azure services as well. Thank you.

Alexandru
by New Contributor III
  • 4786 Views
  • 4 replies
  • 0 kudos

Resolved! vscode python project for development

Hi, I'm trying to set up a local development environment using Python / VS Code / Poetry. Also, linting is enabled (Microsoft Pylance extension) and python.analysis.typeCheckingMode is set to strict. We are using Python files for our code (.py) whit...

Latest Reply
A_N
New Contributor II
  • 0 kudos

How did you solve the type-check errors on `pyspark.sql`? mypy doesn't create the missing stubs for that one?

3 More Replies
chandataeng
by New Contributor
  • 1443 Views
  • 1 replies
  • 1 kudos

Resolved! How to trigger Power BI refresh from Databricks pipeline without keeping cluster alive?

I have a Databricks pipeline that pulls data from AWS, which takes ~90 minutes. After this, I need to refresh a series of Power BI dataflows (~45 mins) and then datasets (~45 mins). I want to trigger the Power BI refresh automatically from Databricks ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @chandataeng, the current Power BI task available in Databricks Workflows will wait for the refresh process to return a final status (whether it succeeded or failed). But you can start the refresh process by using an asynchronous REST API call. The ref...
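A sketch of that asynchronous call, assuming a pre-acquired Azure AD access token (the workspace and dataset IDs below are placeholders): the POST to the Power BI refreshes endpoint returns 202 Accepted as soon as the refresh is queued, so the Databricks job can finish without keeping a cluster alive for the ~90-minute refresh.

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    # Power BI REST endpoint that starts a dataset refresh.
    return f"{API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(token: str, workspace_id: str, dataset_id: str) -> int:
    # Fire-and-forget: the service answers 202 Accepted once the refresh
    # is queued; polling for completion is a separate GET if needed.
    req = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status
```

Dataflow refreshes have an analogous endpoint under `/groups/{id}/dataflows/{id}/refreshes`; the same fire-and-forget shape applies.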

KIRKQUINBAR
by New Contributor III
  • 2510 Views
  • 2 replies
  • 0 kudos

Resolved! information_schema not populating with columns

We started migrating databases from hive_metastore into Unity Catalog back in October 2024, and I've noticed that periodically the Catalog UI will not show columns or a data preview for some, but not all, of the migrated tables. After some di...

Latest Reply
KIRKQUINBAR
New Contributor III
  • 0 kudos

This is definitely a bug related to older instances of Azure Databricks that were upgraded to use the Unity platform. After going back and forth with MS support for 2+ months, we made the decision to just spin up a new instance of Azure Databricks and co...

1 More Replies
alex-syk
by New Contributor II
  • 7617 Views
  • 8 replies
  • 0 kudos

Delta Sharing - Alternative to config.share

I was recently given a credential file to access shared data via Delta Sharing. I am following the documentation from https://docs.databricks.com/en/data-sharing/read-data-open.html. The documentation wants the contents of the credential file in a fo...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, the most feasible way would be to convert the contents of your key file to base64 and set only the Spark config as below: credentials <base64-encoded content>
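A quick sketch of that encoding step in Python; the profile fields are placeholders in the shape of the open Delta Sharing config.share format, not real credentials:

```python
import base64
import json

# Placeholder profile in the shape of a Delta Sharing config.share file;
# none of these values are real credentials.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "<token>",
}

# The base64 string is what goes into the Spark config value.
encoded = base64.b64encode(json.dumps(profile).encode()).decode()

# Round-trip check: decoding recovers the original profile.
assert json.loads(base64.b64decode(encoded)) == profile
```

The same encoding can be done from a shell with `base64 < config.share` if you already have the credential file on disk.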

7 More Replies
Nietzsche
by New Contributor III
  • 2961 Views
  • 3 replies
  • 2 kudos

Resolved! is Spark UI available on the Databricks Free Edition?

Hi all, I have a noob question. I am currently using the Databricks Free Edition, which runs on serverless compute. To access the Spark UI one would normally click on the attached compute; however, with serverless, I can not find the menu to access Spar...

Latest Reply
dyusuf
New Contributor II
  • 2 kudos

So, there is no way we can run the Spark UI in the Free Edition, as we would need general-purpose clusters?

2 More Replies
Ganeshch
by New Contributor III
  • 3201 Views
  • 4 replies
  • 0 kudos

Databricks Features

Hi All, I am new to Databricks and am using the Community Edition. So far, I have noticed some limitations: features like DBFS (the file system) are restricted, and cluster configuration is locked. So I am thinking of using the trial version; it will give 14 da...

Latest Reply
Khaja_Zaffer
Contributor III
  • 0 kudos

Hello. The DBFS file browser is often disabled by default in the user interface; it can typically be re-enabled through the admin settings. In the Free Edition, you would face some limitations with cluster size. However, if you...

3 More Replies
SmileyVille
by New Contributor III
  • 4617 Views
  • 7 replies
  • 0 kudos

Capture data from a Specific SharePoint Site (List) in M365 into Azure DataBricks

Hello.  We are using Azure Databricks and would like to ingest data from a specific M365 SharePoint Online Site/List.  I was originally trying to use this recommendation, https://learn.microsoft.com/en-us/answers/questions/2116616/service-principal-a...

Get Started Discussions
M365
Service Principal
SharePoint Online
Latest Reply
Divya_Bhadauria
New Contributor III
  • 0 kudos

We achieved the same using the SharePoint API. You can follow the steps outlined in this documentation: https://learn.microsoft.com/en-us/graph/auth-v2-service?tabs=http. Additionally, you can grant the Sites.Selected permission to the Azure AD applic...
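The client-credentials (app-only) flow from that linked doc can be sketched like this; the tenant and client values are placeholders, and only the request shape is shown (no call is actually made):

```python
import urllib.parse

def token_request(tenant_id: str, client_id: str, client_secret: str):
    # Token endpoint and form body for the Microsoft identity platform
    # client-credentials flow; all IDs here are placeholders.
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
        "grant_type": "client_credentials",
    })
    return url, body
```

POSTing that body to the URL returns a JSON payload with an `access_token`, which is then sent as a Bearer header on Graph/SharePoint requests; with Sites.Selected, the app only sees the sites it was explicitly granted.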

6 More Replies
ds01
by New Contributor
  • 1024 Views
  • 2 replies
  • 1 kudos

Dario Schiraldi Deutsche Bank Executive : Excited to Join

I’m Dario Schiraldi, a Deutsche Bank executive. During my time there, I led global institutional sales and investment businesses, honing my expertise in strategy, leadership, and financial markets. As someone who’s passionate about the transformative p...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @ds01 ,Welcome, Dario! It’s great to have someone with your deep experience in finance and leadership join the Databricks community. Looking forward to your insights and contributions!

1 More Replies
protmaksim
by New Contributor II
  • 1244 Views
  • 1 replies
  • 0 kudos

Efficiently Copying Single or Multiple Cells in Databricks Notebooks

I found some hidden features in Databricks Notebooks, but sometimes when copying, titles are lost. Maybe someone knows the reason?

Latest Reply
intuz
Contributor II
  • 0 kudos

When you copy cells in Databricks, titles (headers) can be lost because the TOC (table of contents) is built from Markdown titles like # Level 1. If you copy only the content and not the Markdown format, it won't show in the TOC. Hidden cells under the title migh...

