- 1712 Views
- 1 reply
- 3 kudos
How to Grant Workspace Admin Permissions to an ID Using Parent Groups
Hello, there are several ways to grant Workspace Admin permissions in Databricks. While this may seem straightforward, I found it a bit confusing when I started using Databricks, so I’d like to share my experience. This guide is aimed at beginners. How...
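One way to script the approach the title describes (nesting a group under the workspace `admins` group so its members inherit admin rights) is the workspace SCIM Groups REST API. Below is a minimal sketch, assuming a `requests` client, a `DATABRICKS_TOKEN` environment variable, and placeholder workspace URL and group IDs; verify the endpoint and payload against the current Databricks docs before relying on it.

```python
import os
import requests

# Placeholders: your workspace URL, a token with admin rights, the SCIM ID of the
# workspace "admins" group, and the SCIM ID of the group you want to nest under it.
HOST = "https://<workspace-url>"                # hypothetical
TOKEN = os.environ["DATABRICKS_TOKEN"]          # personal access token
ADMINS_GROUP_ID = "<admins-group-scim-id>"      # hypothetical
GROUP_TO_NEST_ID = "<parent-group-scim-id>"     # members of this group become admins

# SCIM PatchOp that adds the group as a member of the workspace "admins" group.
payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [
        {"op": "add", "value": {"members": [{"value": GROUP_TO_NEST_ID}]}}
    ],
}

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Groups/{ADMINS_GROUP_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("Group nested under admins, status:", resp.status_code)
```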
- 8788 Views
- 3 replies
- 1 kudos
Databricks Academy Labs Coupon Instructions
To see all Databricks training and enablement offerings, please visit our Learning Library and Certifications Catalog. To use your Databricks Academy Labs coupons, please:
- Head to Databricks Academy
- Across the top navigation, select Subscriptions C...
Shekhar, you will have to do the below: Hi Shubham, thank you for reaching out to Databricks, and we're sorry to see the error message you're facing. Could you kindly raise a support ticket - https://help.databricks.com/s/contact-us?ReqType=training and g...
- 559 Views
- 0 replies
- 0 kudos
Learn Data Engineering on Databricks step by step
For aspiring Data Engineers, it has always been difficult to start their learning. With a decade of experience in Data Engineering, I have now put together a series of articles that can help new aspirants. The list is a small attempt to help new Data E...
- 1834 Views
- 3 replies
- 1 kudos
Is there any way to add a Matplotlib visualization to a notebook Dashboard?
So I love that Databricks lets you display a dataframe, create a visualization of it, then add that visualization to a notebook dashboard to present. However, the visualizations lack some customization that I would like. For example, the heat map vis...
This is correct; it seems the approach you want to implement is not currently supported.
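As the reply above notes, attaching a fully customized Matplotlib chart to the legacy notebook dashboard is not supported, but you can still render the customized figure directly in a notebook cell. A minimal sketch with placeholder data standing in for your own DataFrame:

```python
import matplotlib.pyplot as plt
import numpy as np

# Placeholder data: in a real notebook this would come from your Spark DataFrame,
# e.g. pdf = spark_df.toPandas() followed by a pivot into a 2-D matrix.
matrix = np.random.rand(10, 12)

fig, ax = plt.subplots(figsize=(8, 5))
im = ax.imshow(matrix, cmap="viridis", aspect="auto")  # full control over colormap, scale, etc.
fig.colorbar(im, ax=ax, label="value")
ax.set_title("Custom heat map rendered with Matplotlib")
ax.set_xlabel("column")
ax.set_ylabel("row")
plt.show()  # Databricks notebooks render the Matplotlib figure inline
```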
- 2313 Views
- 0 replies
- 0 kudos
Unlock the Full Potential of Databricks with the Demo Center!
Hello Databricks community! If you're eager to explore how Databricks can revolutionize your data workflows, I highly recommend checking out the Databricks Demo Center. It’s packed with insights and tools designed to cater to both beginners and season...
- 2516 Views
- 0 replies
- 0 kudos
Understanding Databricks Workspace IP Access List
What is a Databricks Workspace IP Access List? The Databricks Workspace IP Access List is a security feature that allows administrators to control access to the Databricks workspace by specifying which IP addresses or IP ranges are allowed or denied a...
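For reference, the list can also be managed programmatically. A minimal sketch using the REST API via `requests`, with a placeholder workspace URL, a `DATABRICKS_TOKEN` environment variable, and example CIDR ranges; confirm the endpoints and the required workspace tier in the IP access list documentation before applying this.

```python
import os
import requests

HOST = "https://<workspace-url>"  # hypothetical workspace URL
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# 1) Turn the IP access list feature on for the workspace.
requests.patch(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    json={"enableIpAccessLists": "true"},
    timeout=30,
).raise_for_status()

# 2) Create an ALLOW list for the office networks (placeholder CIDR values).
resp = requests.post(
    f"{HOST}/api/2.0/ip-access-lists",
    headers=HEADERS,
    json={
        "label": "office-networks",
        "list_type": "ALLOW",
        "ip_addresses": ["203.0.113.0/24", "198.51.100.10"],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```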
- 2061 Views
- 5 replies
- 1 kudos
Databricks App Availability
Hi there, I recently came across this post about Databricks Apps that says it is available for public preview: https://www.databricks.com/blog/introducing-databricks-apps However, when I go to Previews in the workspace, I don't see an option to enable it, i...
Hi there, just following up: does anyone know when Apps will be supported in southeastasia?
- 565 Views
- 0 replies
- 1 kudos
Python step-through debugger for Databricks Notebooks and Files is now Generally Available
Python step-through debugger for Databricks Notebooks and Files is now Generally Available: https://www.databricks.com/blog/announcing-general-availability-step-through-debugging-databricks-notebooks-and-files
- 1152 Views
- 1 reply
- 4 kudos
Orchestrate Databricks jobs with Apache Airflow
You can orchestrate Databricks jobs with Apache Airflow. The Databricks provider implements the below operators:
- DatabricksCreateJobsOperator: Create a new Databricks job or reset an existing job
- DatabricksRunNowOperator: Runs an existing Spark job run...
Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper.
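A minimal DAG sketch using the operators listed above, assuming the `apache-airflow-providers-databricks` package is installed, an Airflow connection named `databricks_default` points at your workspace, and the `job_id` is a placeholder:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="databricks_orchestration_example",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger an existing Databricks job by id; the job_id and connection name
    # are placeholders for your own workspace setup.
    run_existing_job = DatabricksRunNowOperator(
        task_id="run_existing_job",
        databricks_conn_id="databricks_default",
        job_id=12345,
        notebook_params={"run_date": "{{ ds }}"},  # optional parameters passed to the job
    )
```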
- 1124 Views
- 1 reply
- 2 kudos
Use Retrieval-augmented generation (RAG) to boost performance of LLM applications
Retrieval-augmented generation (RAG) is a method that boosts the performance of large language model (LLM) applications by utilizing tailored data. It achieves this by fetching pertinent data or documents related to a specific query or task and presen...
Thanks for sharing such valuable insight, @Sourav-Kundu. Your breakdown of how RAG enhances LLMs is spot on: clear and concise!
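To make the retrieve-then-augment flow concrete, here is a deliberately simplified, library-free sketch; the keyword-overlap scorer stands in for a real vector search, and the prompt template is only illustrative:

```python
# Toy RAG flow: retrieve the most relevant documents for a query, then
# prepend them to the prompt that would be sent to an LLM.
DOCUMENTS = [
    "Delta Lake provides ACID transactions on data lakes.",
    "Unity Catalog centralizes governance for data and AI assets.",
    "Databricks Workflows orchestrates jobs, notebooks, and pipelines.",
]

def score(query: str, doc: str) -> int:
    """Naive keyword-overlap score standing in for vector-similarity search."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents that best match the query."""
    return sorted(DOCUMENTS, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Augment the user question with the retrieved context."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

print(build_prompt("How does Delta Lake handle transactions?"))
# The resulting prompt would then be passed to the LLM of your choice.
```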
- 1712 Views
- 1 reply
- 2 kudos
You can use Low Shuffle Merge to optimize the merge process in Delta Lake
Low Shuffle Merge in Databricks is a feature that optimizes the way data is merged when using Delta Lake, reducing the amount of data shuffled between nodes.
- Traditional merges can involve heavy data shuffling, as data is redistributed across the cl...
Great post, @Sourav-Kundu. The benefits you've outlined, especially regarding faster execution and cost efficiency, are valuable for anyone working with large-scale data processing. Thanks for sharing!
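For context, the optimization applies to a standard Delta Lake `MERGE`. A minimal notebook sketch with placeholder table names; the configuration flag comes from the Low Shuffle Merge documentation for older runtimes (newer Databricks Runtime versions reportedly enable it by default), so verify it for your DBR version:

```python
# `spark` is the SparkSession provided by the Databricks notebook.
# On older runtimes, Low Shuffle Merge is controlled by this flag (verify the name
# against the docs for your DBR version); newer runtimes enable it by default.
spark.conf.set("spark.databricks.delta.merge.enableLowShuffle", "true")

spark.sql("""
    MERGE INTO main.sales.orders AS target            -- placeholder target table
    USING main.staging.orders_updates AS source        -- placeholder source table
    ON target.order_id = source.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```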
- 685 Views
- 0 replies
- 0 kudos
Utilize Unity Catalog alongside your Delta Live Tables pipelines
Delta Live Tables support for Unity Catalog is in Public Preview. Databricks recommends setting up Delta Live Tables pipelines using Unity Catalog. When configured with Unity Catalog, these pipelines publish all defined materialized views and streaming ...
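A minimal sketch of a Python pipeline definition that would publish to a Unity Catalog target (the catalog and schema are chosen in the pipeline settings, and the source path below is a placeholder):

```python
import dlt
from pyspark.sql import functions as F

# `spark` is provided by the pipeline runtime; the target catalog and schema are
# selected in the pipeline configuration, not in this code.

@dlt.table(comment="Raw events ingested with Auto Loader (path is a placeholder).")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/landing/events")  # hypothetical volume path
    )

@dlt.table(comment="Cleaned events published to the Unity Catalog target schema.")
def clean_events():
    return (
        dlt.read_stream("raw_events")
        .where(F.col("event_type").isNotNull())
        .withColumn("ingested_at", F.current_timestamp())
    )
```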
- 781 Views
- 0 replies
- 1 kudos
Databricks Asset Bundles package and deploy resources like notebooks and workflows as a single unit.
Databricks Asset Bundles help implement software engineering best practices like version control, testing and CI/CD for data and AI projects.
1. They allow you to define resources such as jobs and notebooks as source files, making project structure, t...
- 905 Views
- 0 replies
- 0 kudos
Databricks serverless budget policies are now available in Public Preview
Databricks serverless budget policies are now available in Public Preview, enabling administrators to automatically apply the correct tags to serverless resources without relying on users to manually attach them.
1. This feature allows for customized ...
- 3157 Views
- 0 replies
- 1 kudos
How to recover Dropped Tables in Databricks
Have you ever accidentally dropped a table in Databricks, or had someone else mistakenly drop it? Databricks offers a useful feature that allows you to view dropped tables and recover them if needed.
1. You need to first execute SHOW TABLES DROPPED
2. T...
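A minimal notebook sketch of those two steps, with placeholder catalog, schema, and table names; this applies to Unity Catalog managed tables within the recovery window:

```python
# `spark` and `display` are provided by the Databricks notebook environment.

# 1. List recently dropped tables in a schema (placeholder names).
display(spark.sql("SHOW TABLES DROPPED IN main.sales"))

# 2. Recover one of them by name (or by the table id shown in the listing above).
spark.sql("UNDROP TABLE main.sales.orders")
```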
- Access Data (1)
- ADF Linked Service (1)
- ADF Pipeline (1)
- Advanced Data Engineering (3)
- AI Agents (1)
- AI Readiness (1)
- ApacheSpark (1)
- Associate Certification (1)
- Automation (1)
- AWS (1)
- AWSDatabricksCluster (1)
- Azure databricks (1)
- Azure devops integration (1)
- AzureDatabricks (1)
- Big data (1)
- Caching (2)
- CICDForDatabricksWorkflows (1)
- Cluster (1)
- Cluster Policies (1)
- Cluster Pools (1)
- Cost Optimization Effort (1)
- custom compute policy (1)
- CustomLibrary (1)
- Data (1)
- Data Engineering (3)
- Data Governance (1)
- Data Mesh (1)
- Data Processing (1)
- Databricks Assistant (1)
- Databricks Community (1)
- Databricks Delta Table (1)
- Databricks Demo Center (1)
- Databricks Job (1)
- Databricks Migration (2)
- Databricks Mlflow (1)
- Databricks Support (1)
- Databricks Unity Catalog (2)
- Databricks Workflows (1)
- DatabricksML (1)
- DBR Versions (1)
- Declartive Pipelines (1)
- DeepLearning (1)
- Delta Lake (4)
- Delta Live Table (1)
- Delta Live Tables (1)
- Delta Time Travel (1)
- Devops (1)
- DimensionTables (1)
- DLT (2)
- DLT Pipelines (3)
- DLT-Meta (1)
- Dns (1)
- Dynamic (1)
- Free Databricks (2)
- GenAI agent (1)
- GenAI and LLMs (2)
- Generative AI (1)
- Genie (1)
- Governance (1)
- Hive metastore (1)
- Lakeflow Pipelines (1)
- Lakehouse (1)
- Lakehouse Migration (1)
- Lazy Evaluation (1)
- Library Installation (1)
- Llama (1)
- Medallion Architecture (1)
- Migrations (1)
- MSExcel (1)
- Multiagent (1)
- Networking (1)
- Performance (1)
- Performance Tuning (1)
- Private Link (1)
- Pyspark Code (1)
- Pyspark Databricks (1)
- Pytest (1)
- Python (1)
- Scala Code (1)
- SDK (1)
- Serverless (2)
- Spark (6)
- Spark Caching (1)
- SparkSQL (1)
- SQL Serverless (1)
- Support Ticket (1)
- Sync (1)
- Tutorial (1)
- Unit Test (1)
- Unity Catalog (3)
- Unity Catlog (1)
- Workflow Jobs (1)
- Workflows (3)