How do we leverage Databricks skills into full stack?
09-21-2025 01:54 PM - edited 09-21-2025 02:05 PM
Hi Community experts,
Thanks for the replies to my earlier threads.
I'd like to understand what "full stack" means in the context of Databricks. I keep hearing things like "we're doing Databricks and full stack development."
Is this a new framework or a set of new tools?
Can we leverage existing PySpark, dbt, DLT, notebooks, Workflows, and medallion architecture skills in Databricks full stack development/delivery?
Are there additional skills/tools needed to support Databricks full stack development/delivery?
Is there a guide or document I can go through on this topic?
Is there a GenAI-based strategy to consider?
Thanks for your guidance.
09-21-2025 06:47 PM
Hi @RIDBX, How are you doing today?
Great question! As I understand it, when people say "Databricks full stack development" today, they usually don't mean a new framework; it just refers to the end-to-end development lifecycle using Databricks tools. Think of it as covering every layer:
- Ingestion: Auto Loader, notebooks
- Transformation: PySpark, DLT, dbt
- Orchestration: Workflows
- Storage and governance: Delta Lake, Unity Catalog
- Presentation: Dashboards, Lakeview

So yes, your current skills with PySpark, dbt, DLT, and the medallion architecture are definitely part of this "full stack." Additional helpful tools include CI/CD with Databricks Repos, MLflow for machine learning, and possibly GenAI capabilities for intelligent assistants, model deployment, or automation.

There isn't one single guide titled "Databricks Full Stack," but you can piece it together from the official Databricks documentation and blogs on architecture best practices. GenAI strategies, such as using MosaicML or Databricks Foundation Models, are becoming part of the stack too, especially for teams building intelligent data products. So it's more about combining tools smartly than learning something totally new.
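To make the CI/CD and orchestration pieces above concrete, here is a minimal sketch of a Databricks Asset Bundle file (`databricks.yml`) that deploys a two-task Workflows job running notebooks. The bundle name, job name, notebook paths, and workspace URL are hypothetical placeholders, and compute settings are omitted (serverless workspaces can default them); check the Asset Bundles documentation for the current schema before relying on this.

```yaml
# databricks.yml -- illustrative Asset Bundle sketch, not a complete config
bundle:
  name: medallion_demo          # hypothetical bundle name

targets:
  dev:
    mode: development
    workspace:
      host: https://<your-workspace>.cloud.databricks.com  # replace with your workspace URL

resources:
  jobs:
    bronze_to_gold:             # hypothetical job key
      name: "Medallion pipeline (dev)"
      tasks:
        - task_key: ingest_bronze
          notebook_task:
            notebook_path: ./notebooks/ingest_bronze      # hypothetical path
        - task_key: transform_silver
          depends_on:
            - task_key: ingest_bronze
          notebook_task:
            notebook_path: ./notebooks/transform_silver   # hypothetical path
```

With the Databricks CLI installed, a bundle like this is typically deployed with `databricks bundle deploy -t dev`, which versions your notebooks and job definition together and is one common way teams wire the "full stack" into CI/CD.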
Regards,
Brahma
09-22-2025 02:32 PM
Thanks for weighing in.
I am reviewing the Training Catalog | Databricks (Learning Library).
E.g., a listed training:
Build Data Pipelines with Lakeflow Declarative Pipelines
How/where can I get the notebooks shown in the session, so I can try the lab setup and exercises?
I did not see a download option. Do these notebooks work with the latest Free Edition of Databricks?
Thanks.