- 2149 Views
 - 3 replies
 - 1 kudos
 
Is there any way to add a Matplotlib visualization to a notebook Dashboard?
So I love that Databricks lets you display a dataframe, create a visualization of it, then add that visualization to a notebook dashboard to present. However, the visualizations lack some customization that I would like. For example the heat map vis...
 
- 1 kudos
 
That is correct; it seems the approach you want to implement is not currently supported.
 
- 2451 Views
 - 0 replies
 - 0 kudos
 
Unlock the Full Potential of Databricks with the Demo Center!
Hello Databricks community! If you're eager to explore how Databricks can revolutionize your data workflows, I highly recommend checking out the Databricks Demo Center. It's packed with insights and tools designed to cater to both beginners and season...
 
- 3000 Views
 - 0 replies
 - 0 kudos
 
Understanding Databricks Workspace IP Access List
What is a Databricks Workspace IP Access List? The Databricks Workspace IP Access List is a security feature that allows administrators to control access to the Databricks workspace by specifying which IP addresses or IP ranges are allowed or denied a...
 
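As a sketch of how such a list can be configured programmatically, the snippet below builds the request body for the IP access list REST API (`POST /api/2.0/ip-access-lists`). The label and CIDR range are placeholder values; the actual call (shown in a comment) needs a workspace URL and token.

```python
import json

# Sketch: build the request body for the Databricks IP access list REST API
# (POST /api/2.0/ip-access-lists). The label and CIDR below are placeholders.

def build_ip_access_list(label: str, list_type: str, cidrs: list[str]) -> str:
    """Return the JSON body for creating an ALLOW or BLOCK list."""
    if list_type not in ("ALLOW", "BLOCK"):
        raise ValueError("list_type must be ALLOW or BLOCK")
    return json.dumps({
        "label": label,
        "list_type": list_type,
        "ip_addresses": cidrs,
    })

body = build_ip_access_list("office", "ALLOW", ["203.0.113.0/24"])
# Send it with, e.g.:
#   curl -X POST https://<workspace-url>/api/2.0/ip-access-lists \
#        -H "Authorization: Bearer $DATABRICKS_TOKEN" -d "$body"
print(body)
```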
- 663 Views
 - 0 replies
 - 1 kudos
 
Python step-through debugger for Databricks Notebooks and Files is now Generally Available
The Python step-through debugger for Databricks Notebooks and Files is now Generally Available: https://www.databricks.com/blog/announcing-general-availability-step-through-debugging-databricks-notebooks-and-files
 
- 1381 Views
 - 1 replies
 - 4 kudos
 
Orchestrate Databricks jobs with Apache Airflow
You can orchestrate Databricks jobs with Apache Airflow. The Databricks provider implements the following operators:
- DatabricksCreateJobsOperator: Create a new Databricks job or reset an existing job
- DatabricksRunNowOperator: Runs an existing Spark job run...
 
- 4 kudos
 
Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper.
 
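A minimal sketch of the DatabricksRunNowOperator usage mentioned above: the job ID, connection name, and parameters are placeholders, and the Airflow imports are guarded since the `apache-airflow-providers-databricks` package must be installed at runtime.

```python
# Sketch of triggering an existing Databricks job from Airflow with
# DatabricksRunNowOperator. job_id and the connection name are placeholders.

run_now_kwargs = {
    "task_id": "run_nightly_etl",
    "databricks_conn_id": "databricks_default",  # Airflow connection to the workspace
    "job_id": 12345,                             # placeholder: an existing job's ID
    "notebook_params": {"run_date": "{{ ds }}"}, # templated per-run parameters
}

try:
    from datetime import datetime
    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import (
        DatabricksRunNowOperator,
    )

    with DAG("databricks_etl", start_date=datetime(2024, 1, 1),
             schedule="@daily", catchup=False) as dag:
        run_job = DatabricksRunNowOperator(**run_now_kwargs)
except ImportError:
    pass  # Airflow + Databricks provider not installed in this environment
```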
- 1376 Views
 - 1 replies
 - 2 kudos
 
Use Retrieval-augmented generation (RAG) to boost performance of LLM applications
Retrieval-augmented generation (RAG) is a method that boosts the performance of large language model (LLM) applications by utilizing tailored data. It achieves this by fetching pertinent data or documents related to a specific query or task and presen...
 
- 2 kudos
 
Thanks for sharing such valuable insight, @Sourav-Kundu. Your breakdown of how RAG enhances LLMs is spot on: clear and concise!
 
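The RAG flow described above can be sketched in a few lines: retrieve the most relevant documents for a query, then prepend them to the prompt sent to the LLM. The naive word-overlap scoring here is only a stand-in for a real vector search.

```python
# Toy RAG sketch: retrieve top-k documents, then build an augmented prompt.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query and return the top k."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the question, as RAG does before the LLM call."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Delta Lake supports ACID transactions on data lakes.",
    "Unity Catalog governs access to data and AI assets.",
    "Football is played with eleven players per side.",
]
prompt = build_prompt("What governs access to data assets?", docs)
```

In a real application the retriever would query a vector index and the prompt would be passed to an LLM endpoint; the structure stays the same.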
- 2005 Views
 - 1 replies
 - 2 kudos
 
You can use Low Shuffle Merge to optimize the MERGE process in Delta Lake
Low Shuffle Merge in Databricks is a feature that optimizes the way data is merged when using Delta Lake, reducing the amount of data shuffled between nodes.
- Traditional merges can involve heavy data shuffling, as data is redistributed across the cl...
 
- 2 kudos
 
Great post, @Sourav-Kundu. The benefits you've outlined, especially regarding faster execution and cost efficiency, are valuable for anyone working with large-scale data processing. Thanks for sharing!
 
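A sketch of a MERGE that benefits from this: on recent Databricks runtimes Low Shuffle Merge is on by default, while on older ones it could (to my understanding) be toggled with the conf shown in the comment. Table names are placeholders; the Spark calls are commented out since they need a cluster.

```python
# Sketch of a Delta Lake upsert MERGE. Low Shuffle Merge changes how Databricks
# executes this statement, not how you write it.

def build_merge_sql(target: str, source: str, key: str) -> str:
    """Compose a standard upsert MERGE statement for Delta Lake."""
    return (
        f"MERGE INTO {target} AS t "
        f"USING {source} AS s ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

sql = build_merge_sql("sales.orders", "staging.orders_updates", "order_id")
# In a Databricks notebook you would run:
#   spark.conf.set("spark.databricks.delta.merge.enableLowShuffle", "true")  # older runtimes
#   spark.sql(sql)
```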
- 808 Views
 - 0 replies
 - 0 kudos
 
Utilize Unity Catalog alongside your Delta Live Tables pipelines
Delta Live Tables support for Unity Catalog is in Public Preview. Databricks recommends setting up Delta Live Tables pipelines using Unity Catalog. When configured with Unity Catalog, these pipelines publish all defined materialized views and streaming ...
 
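As an illustration of the configuration difference, the pipeline settings below point a Delta Live Tables pipeline at Unity Catalog by setting a `catalog` and target schema instead of a Hive metastore storage location. All names and paths are placeholders.

```python
import json

# Sketch of Unity Catalog-backed DLT pipeline settings. The "catalog" and
# "target" fields determine where materialized views and streaming tables land.

settings = {
    "name": "orders_dlt",
    "catalog": "main",          # Unity Catalog catalog receiving the outputs
    "target": "orders_schema",  # schema for the pipeline's tables and views
    "libraries": [{"notebook": {"path": "/Repos/etl/orders_pipeline"}}],
    "continuous": False,        # triggered rather than continuous execution
}
print(json.dumps(settings, indent=2))
```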
- 875 Views
 - 0 replies
 - 1 kudos
 
Databricks Asset Bundles package and deploy resources like notebooks and workflows as a single unit.
Databricks Asset Bundles help implement software engineering best practices like version control, testing and CI/CD for data and AI projects.
1. They allow you to define resources such as jobs and notebooks as source files, making project structure, t...
 
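A minimal `databricks.yml` sketch of the idea: a job defined as a source file alongside the notebook it runs. The bundle name, paths, and target are placeholders.

```yaml
# Minimal Databricks Asset Bundle definition (placeholder names and paths)
bundle:
  name: my_project

resources:
  jobs:
    nightly_job:
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/etl_notebook.py

targets:
  dev:
    default: true
```

You would then validate and deploy it with the Databricks CLI: `databricks bundle validate` followed by `databricks bundle deploy -t dev`.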
- 1047 Views
 - 0 replies
 - 0 kudos
 
Databricks serverless budget policies are now available in Public Preview
Databricks serverless budget policies are now available in Public Preview, enabling administrators to automatically apply the correct tags to serverless resources without relying on users to manually attach them.
1. This feature allows for customized ...
 
- 3860 Views
 - 0 replies
 - 1 kudos
 
How to recover Dropped Tables in Databricks
Have you ever accidentally dropped a table in Databricks, or had someone else mistakenly drop it? Databricks offers a useful feature that allows you to view dropped tables and recover them if needed.
1. You need to first execute SHOW TABLES DROPPED
2. T...
 
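The recovery flow above, as SQL you would run in a notebook. UNDROP applies to Unity Catalog managed tables within the retention window; the schema and table names are placeholders, and the Spark call is commented out since it needs a cluster.

```python
# Sketch of listing and recovering dropped Unity Catalog tables.

schema = "main.sales"  # placeholder catalog.schema

list_dropped = f"SHOW TABLES DROPPED IN {schema}"
# Each row of the listing includes the table name and a table ID. Recover by
# name, or by ID when the name has since been reused:
undrop_by_name = f"UNDROP TABLE {schema}.orders"
undrop_by_id = "UNDROP TABLE WITH ID '<table-id-from-listing>'"

# In a notebook: spark.sql(list_dropped); spark.sql(undrop_by_name)
```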
- 3345 Views
 - 4 replies
 - 2 kudos
 
Resolved! Want to learn LakeFlow Pipelines in community edition.
Hello Everyone. I want to explore LakeFlow Pipelines in the community version but don’t have access to Azure or AWS. I had a bad experience with Azure, where I was charged $85 while just trying to learn. Is there a less expensive, step-by-step learni...
 
- 2 kudos
 
Hi @nafikazi, sorry, this is not possible in Community Edition. Your only option is to use an AWS or Azure account.
 
- 4867 Views
 - 6 replies
 - 7 kudos
 
🚀 Databricks Custom Apps! 🚀
Whether you're a data scientist or a sales executive, Databricks is making it easier than ever to build, host, and share secure data applications. With our platform, you can now run any Python code on serverless compute, share it with non-technical c...
 
- 7 kudos
 
Can we somehow play with the hosting and expose this app externally?
 
- 2957 Views
 - 5 replies
 - 1 kudos
 
Writing append blob files to a Unity Catalog volume
The workspace is assigned to Unity Catalog, and all access to ADLS Gen2 is now handled via Unity Catalog only, meaning no SPN, no connection string, no access keys, etc. I have to create append blob files in a volume. Is this possible in a works...
 
- 1 kudos
 
Now I get your point. No, you can't create Append Blob files directly in Volumes, as this is native Azure functionality. A volume is basically just an abstraction over native storage. You will still need to use libraries like azure-storage-blob wi...
 
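Building on the reply above, a sketch of the azure-storage-blob route: append blobs cap each `append_block` call at 4 MiB, so larger payloads must be chunked. The chunking is plain Python; the client calls are left as comments since they need real Azure credentials, and the account, container, and blob names are placeholders.

```python
# Sketch: chunk a payload for Azure append blob writes (4 MiB per append_block).

MAX_APPEND_BLOCK = 4 * 1024 * 1024  # 4 MiB service limit per append_block call

def chunk_bytes(data: bytes, size: int = MAX_APPEND_BLOCK) -> list[bytes]:
    """Split a payload into append_block-sized chunks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

payload = b"x" * (9 * 1024 * 1024)  # 9 MiB example payload
chunks = chunk_bytes(payload)

# from azure.storage.blob import BlobServiceClient
# client = BlobServiceClient(account_url, credential=...) \
#     .get_blob_client("container", "logs/app.log")
# client.create_append_blob()
# for c in chunks:
#     client.append_block(c)
```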
- 4519 Views
 - 1 replies
 - 0 kudos
 
Unity Catalog
Is disaster recovery possible in Unity Catalog now? For the data level we have enabled geo-redundancy, but what about the objects, permissions, and other components in Unity Catalog? Can we restore the Unity Catalog metadata in another region?
 
- 0 kudos
 
An official product release is in development and will be available as a Private Preview (PrPr) in a few months.
 
Labels:
- Access Data (1)
- ADF Linked Service (1)
- ADF Pipeline (1)
- Advanced Data Engineering (3)
- AI Agents (1)
- AI Readiness (1)
- Apache spark (1)
- ApacheSpark (1)
- Associate Certification (1)
- Automation (1)
- AWSDatabricksCluster (1)
- Azure (1)
- Azure databricks (3)
- Azure devops integration (1)
- AzureDatabricks (2)
- Big data (1)
- Blog (1)
- Caching (2)
- CICDForDatabricksWorkflows (1)
- Cluster (1)
- Cluster Policies (1)
- Cluster Pools (1)
- Community Event (1)
- Cost Optimization Effort (1)
- custom compute policy (1)
- CustomLibrary (1)
- Data (1)
- Data Analysis with Databricks (1)
- Data Engineering (4)
- Data Governance (1)
- Data Mesh (1)
- Data Processing (1)
- Databricks Assistant (1)
- Databricks Community (1)
- Databricks Delta Table (1)
- Databricks Demo Center (1)
- Databricks Job (1)
- Databricks Migration (2)
- Databricks Mlflow (1)
- Databricks Notebooks (1)
- Databricks Support (1)
- Databricks Unity Catalog (2)
- Databricks Workflows (1)
- DatabricksML (1)
- DBR Versions (1)
- Declartive Pipelines (1)
- DeepLearning (1)
- Delta Live Table (1)
- Delta Live Tables (1)
- Delta Time Travel (1)
- Devops (1)
- DimensionTables (1)
- DLT (2)
- DLT Pipelines (3)
- DLT-Meta (1)
- Dns (1)
- Dynamic (1)
- Free Databricks (3)
- GenAI agent (1)
- GenAI and LLMs (2)
- GenAIGeneration AI (1)
- Generative AI (1)
- Genie (1)
- Governance (1)
- Hive metastore (1)
- Hubert Dudek (1)
- Lakeflow Pipelines (1)
- Lakehouse (1)
- Lakehouse Migration (1)
- Lazy Evaluation (1)
- Learning (1)
- Library Installation (1)
- Llama (1)
- Medallion Architecture (1)
- Migrations (1)
- MSExcel (2)
- Multiagent (1)
- Networking (2)
- Partner (1)
- Performance (1)
- Performance Tuning (1)
- Private Link (1)
- Pyspark (1)
- Pyspark Code (1)
- Pyspark Databricks (1)
- Pytest (1)
- Python (1)
- Reading-excel (1)
- Scala Code (1)
- Scripting (1)
- SDK (1)
- Serverless (2)
- Spark Caching (1)
- SparkSQL (1)
- SQL (1)
- SQL Serverless (1)
- Support Ticket (1)
- Sync (1)
- Tutorial (1)
- Unit Test (1)
- Unity Catalog (4)
- Unity Catlog (1)
- Warehousing (1)
- Workflow Jobs (1)
- Workflows (3)