- 1824 Views
- 5 replies
- 7 kudos
From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications
In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...
@ms_ccg You are correct. I got that error too. It seems Databricks has removed some of these. I would suggest searching for them separately via Databricks Academy or external resources. Let me know if you need any help.
- 356 Views
- 1 replies
- 2 kudos
Use Retrieval-augmented generation (RAG) to boost performance of LLM applications
Retrieval-augmented generation (RAG) is a method that boosts the performance of large language model (LLM) applications by utilizing tailored data. It achieves this by fetching pertinent data or documents related to a specific query or task and presen...
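To make the retrieve-then-prompt flow concrete, here is a minimal sketch in plain Python. The keyword-overlap retriever, document list, and function names are all illustrative assumptions on my part; a production RAG system would use vector embeddings, a vector database, and an actual LLM client instead.

```python
# Minimal RAG sketch: retrieve the most relevant documents for a query,
# then prepend them to the prompt sent to the LLM. Scoring here is naive
# word overlap; real systems use vector embeddings. Names are illustrative.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query and keep the top_k."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the user question with retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Delta Lake provides ACID transactions on data lakes.",
    "Unity Catalog governs access to data and AI assets.",
    "Football is played with eleven players per side.",
]
prompt = build_prompt("What does Delta Lake provide?", docs)
```

The augmented `prompt` would then be passed to the LLM in place of the raw question, grounding the answer in the retrieved context.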
Thanks for sharing such valuable insight, @Sourav-Kundu. Your breakdown of how RAG enhances LLMs is spot on: clear and concise!
- 794 Views
- 1 replies
- 2 kudos
You can use Low Shuffle Merge to optimize the MERGE process in Delta Lake
Low Shuffle Merge in Databricks is a feature that optimizes the way data is merged when using Delta Lake, reducing the amount of data shuffled between nodes.
- Traditional merges can involve heavy data shuffling, as data is redistributed across the cl...
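For context, Low Shuffle Merge requires no change to the MERGE statement itself. It is enabled by default on recent Databricks Runtime versions; on older runtimes it was toggled by a Spark conf (the conf name below is stated as an assumption, so verify it against your runtime's documentation). Table and column names are illustrative.

```sql
-- Assumed conf name for older runtimes; newer DBR enables this by default.
SET spark.databricks.delta.merge.enableLowShuffle = true;

-- A standard Delta MERGE; Low Shuffle Merge changes how it executes,
-- not how it is written.
MERGE INTO target t
USING updates u
  ON t.id = u.id
WHEN MATCHED THEN UPDATE SET t.value = u.value
WHEN NOT MATCHED THEN INSERT (id, value) VALUES (u.id, u.value);
```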
Great post, @Sourav-Kundu. The benefits you've outlined, especially regarding faster execution and cost efficiency, are valuable for anyone working with large-scale data processing. Thanks for sharing!
- 310 Views
- 0 replies
- 0 kudos
Utilize Unity Catalog alongside your Delta Live Tables pipelines
Delta Live Tables support for Unity Catalog is in Public Preview. Databricks recommends setting up Delta Live Tables pipelines using Unity Catalog. When configured with Unity Catalog, these pipelines publish all defined materialized views and streaming ...
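As a hedged illustration, a Unity Catalog-enabled pipeline is configured by setting a catalog and target schema in the pipeline settings; the pipeline name, catalog, schema, and notebook path below are placeholders, and the full settings schema should be checked against the Delta Live Tables documentation.

```json
{
  "name": "my_dlt_pipeline",
  "catalog": "main",
  "target": "reporting",
  "libraries": [
    { "notebook": { "path": "/Repos/me/project/dlt_pipeline" } }
  ]
}
```

With `catalog` and `target` set, the pipeline's materialized views and streaming tables are published under `main.reporting` instead of the legacy Hive metastore.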
- 421 Views
- 0 replies
- 1 kudos
Databricks Asset Bundles package and deploy resources like notebooks and workflows as a single unit.
Databricks Asset Bundles help implement software engineering best practices like version control, testing and CI/CD for data and AI projects.
1. They allow you to define resources such as jobs and notebooks as source files, making project structure, t...
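For orientation, here is a minimal `databricks.yml` sketch; the bundle name, job, notebook path, and targets are invented for illustration, and the full schema is in the Asset Bundles documentation.

```yaml
# Minimal databricks.yml sketch (all names are illustrative).
bundle:
  name: my_project

resources:
  jobs:
    daily_etl:
      name: daily-etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py

targets:
  dev:
    mode: development
    default: true
  prod:
    mode: production
```

From the project root you could then run `databricks bundle validate` followed by `databricks bundle deploy -t dev` to deploy everything as one unit.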
- 353 Views
- 0 replies
- 0 kudos
Databricks serverless budget policies are now available in Public Preview
Databricks serverless budget policies are now available in Public Preview, enabling administrators to automatically apply the correct tags to serverless resources without relying on users to manually attach them.
1. This feature allows for customized ...
- 758 Views
- 0 replies
- 1 kudos
How to recover Dropped Tables in Databricks
Have you ever accidentally dropped a table in Databricks, or had someone else mistakenly drop it? Databricks offers a useful feature that allows you to view dropped tables and recover them if needed.
1. You need to first execute SHOW TABLES DROPPED
2. T...
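The steps above look roughly like this in SQL; the catalog, schema, and table names are placeholders, and my understanding is that this applies to Unity Catalog managed tables within their retention window.

```sql
-- Step 1: list recently dropped tables in a schema.
SHOW TABLES DROPPED IN my_catalog.my_schema;

-- Step 2: restore one of them by name.
UNDROP TABLE my_catalog.my_schema.my_table;
```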
- 2387 Views
- 6 replies
- 4 kudos
Library Management via Custom Compute Policies and ADF Job Triggering
This guide is intended for those looking to install libraries on a cluster using a Custom Compute Policy and trigger Databricks jobs from an Azure Data Factory (ADF) linked service. While many users rely on init scripts for library installation, it i...
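As a sketch of the approach, a custom compute policy can pin cluster attributes and attach libraries in one place; the JSON below follows the cluster-policy format, but the runtime version, node type, and packages are invented for illustration and should be adapted to your workspace.

```json
{
  "name": "adf-jobs-policy",
  "definition": {
    "spark_version": { "type": "fixed", "value": "14.3.x-scala2.12" },
    "node_type_id": { "type": "allowlist", "values": ["Standard_DS3_v2"] }
  },
  "libraries": [
    { "pypi": { "package": "great-expectations==0.18.12" } },
    { "maven": { "coordinates": "com.databricks:spark-xml_2.12:0.18.0" } }
  ]
}
```

A job cluster created under this policy (for example, one triggered from an ADF linked service) would then get the libraries installed without any init script.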
Hi @hassan2, I had the same issue and found a solution. When I created the pool, I created it as on-demand (not spot), and the policy only worked once I removed the entire "azure_attributes.spot_bid_max_price" section from the policy. Looks like "azure_attributes.spot_bi...
- 1104 Views
- 4 replies
- 2 kudos
Resolved! Want to learn LakeFlow Pipelines in community edition.
Hello Everyone. I want to explore LakeFlow Pipelines in the community version but don’t have access to Azure or AWS. I had a bad experience with Azure, where I was charged $85 while just trying to learn. Is there a less expensive, step-by-step learni...
Hi @nafikazi, sorry, this is not possible in Community Edition. Your only option is to use an AWS or Azure account.
- 3139 Views
- 6 replies
- 7 kudos
🚀 Databricks Custom Apps! 🚀
Whether you're a data scientist or a sales executive, Databricks is making it easier than ever to build, host, and share secure data applications. With our platform, you can now run any Python code on serverless compute, share it with non-technical c...
Can we somehow play with hosting, and expose this app outside?
- 1024 Views
- 5 replies
- 1 kudos
Writing append blob files to a Unity Catalog volume
The workspace is assigned to Unity Catalog, and all access to ADLS Gen2 is now handled via Unity Catalog only, meaning no SPN, no connection string, no access keys, etc. I have to create append blob files in a volume. Is this possible in a works...
Now I got your point. No, you can't create Append Blob files directly in Volumes, as this is native Azure functionality. A volume is basically just an abstraction over native storage. You will still need to use libraries like azure-storage-blob wi...
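To illustrate the reply's suggestion, here is a hedged sketch using the `azure-storage-blob` package; the account, container, blob name, and SAS token are placeholders, and this writes to ADLS Gen2 directly, outside Unity Catalog governance.

```python
# Sketch only: assumes azure-storage-blob is installed and you hold a
# SAS URL (or other credential) for the storage account. Placeholders
# in angle brackets must be filled in before this can run.
from azure.storage.blob import BlobClient

blob = BlobClient.from_blob_url(
    "https://<account>.blob.core.windows.net/<container>/log.txt?<sas-token>"
)
if not blob.exists():
    blob.create_append_blob()          # create the blob as Append type
blob.append_block(b"new log line\n")   # append without rewriting the blob
```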
- 1574 Views
- 1 replies
- 0 kudos
Unity Catalog
Is disaster recovery possible in Unity Catalog now? For the data itself we have enabled geo-redundancy, but what about the objects, permissions, and other components in Unity Catalog? Can we restore the Unity Catalog metadata in another region?
An official product release is in development and will be available as Private Preview (PrPr) in a few months.
- 1230 Views
- 2 replies
- 0 kudos
Resolved! Standardized Framework to update Databricks job definition using CI/CD
Hi Databricks support, I am looking for a standardized Databricks framework to update job definitions using DevOps from non-production until they get productionized. Our current process of updating a Databricks job definition is as follows: In our sourc...
Hi from the Git folders/Repos PM: DAB is the way to go, and we are working on an integration to author DABs directly in the workspace. Here's a DAIS talk where the DAB PM and I demo'ed some recommendations for source controlling jobs: https://www.da...
- 2582 Views
- 3 replies
- 4 kudos
Build & Refresh a Calendar Dates Table
Introduction
Maintaining accurate and up-to-date calendar date tables is crucial for reliable reporting, yet manual updates can be time-consuming and prone to error. This fundamental component serves as the backbone for date-based analysis, enabling a...
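As a rough illustration of the idea, here is a minimal pure-Python sketch that generates calendar rows; on Databricks the same rows could be loaded into a Delta table (for example via `spark.createDataFrame(rows)` and `saveAsTable`). The column set and names are my assumptions, not the author's actual schema.

```python
# Build one row per day between start and end (inclusive), with the
# derived attributes a date dimension typically carries.
from datetime import date, timedelta

def build_calendar(start: date, end: date) -> list[dict]:
    """Generate calendar rows from start to end, one dict per day."""
    rows = []
    current = start
    while current <= end:
        rows.append({
            "date": current.isoformat(),
            "year": current.year,
            "month": current.month,
            "day": current.day,
            "day_of_week": current.strftime("%A"),
            "is_weekend": current.weekday() >= 5,  # Saturday or Sunday
        })
        current += timedelta(days=1)
    return rows

rows = build_calendar(date(2024, 1, 1), date(2024, 1, 7))
```

Refreshing the table then reduces to re-running the generator over an extended date range and overwriting (or merging into) the dimension table.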
- 1949 Views
- 2 replies
- 0 kudos
Resolved! Feature Engineering for Data Engineers: Building Blocks for ML Success
For a UK Government Agency, I made a comprehensive presentation titled "Feature Engineering for Data Engineers: Building Blocks for ML Success". I turned it into an article on LinkedIn together with the relevant GitHub code. In summary, the code delve...
Hi, excellent presentation and article! Your insights on feature engineering and practical code examples are incredibly useful for building strong ML models. Thanks for sharing!
Thanks, Anushree