- 1466 Views
- 5 replies
- 7 kudos
From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications
In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...
@ms_ccg You are correct. I got that error too. It seems Databricks has removed some of these. I would suggest searching for those separately via Databricks Academy or external resources. Let me know if you need any help.
- 597 Views
- 5 replies
- 1 kudos
Databricks App Availability
Hi there, I recently came across this post about Databricks Apps that says it's available for Public Preview: https://www.databricks.com/blog/introducing-databricks-apps However, when I go to Previews in the workspace, I don't see an option to enable it, i...
Hi there, just following up: does anyone know when Apps will be supported in Southeast Asia?
- 3558 Views
- 3 replies
- 0 kudos
Editing value of widget parameter within notebook code
I have a notebook with a text widget where I want to be able to edit the value of the widget within the notebook and then reference it in SQL code. For example, assuming there is a text widget named Var1 that has input value "Hello", I would want to ...
Hi @DavidOBrien, how are you? You can try the following approach:
# Get the current value of the widget
current_value = dbutils.widgets.get("widget_name")
# Append the new value to the current value
new_value = current_value + "appended_value"
# Se...
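The preview above cuts off before the final step. As a hedged sketch of the full pattern (the widget name Var1 is taken from the question; dbutils.widgets has no in-place update call, so the widget has to be removed and recreated, and the docs advise doing the remove and recreate in separate cells):

```python
# Cell 1: read the current value, then remove the old widget
# (dbutils is available implicitly in Databricks notebooks)
current_value = dbutils.widgets.get("Var1")   # e.g. "Hello"
new_value = current_value + " world"
dbutils.widgets.remove("Var1")

# Cell 2: recreate the widget with the new value as its default
# (creating a widget right after removing it in the same cell can
# fail, hence the separate cell)
dbutils.widgets.text("Var1", new_value)

# A SQL cell can then reference it, e.g.:
#   %sql SELECT getArgument("Var1")
```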
- 175 Views
- 0 replies
- 1 kudos
Python step-through debugger for Databricks Notebooks and Files is now Generally Available
Python step-through debugger for Databricks Notebooks and Files is now Generally Available: https://www.databricks.com/blog/announcing-general-availability-step-through-debugging-databricks-notebooks-and-files
- 213 Views
- 1 reply
- 2 kudos
Orchestrate Databricks jobs with Apache Airflow
You can orchestrate Databricks jobs with Apache Airflow. The Databricks provider implements the below operators:
- DatabricksCreateJobsOperator: Create a new Databricks job or reset an existing job
- DatabricksRunNowOperator: Runs an existing Spark job run...
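As an illustrative sketch (not from the post), a minimal DAG that triggers an existing job with DatabricksRunNowOperator; the connection ID and job ID below are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="trigger_databricks_job",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    # Trigger an existing Databricks job by its (hypothetical) job ID
    DatabricksRunNowOperator(
        task_id="run_existing_job",
        databricks_conn_id="databricks_default",   # Airflow connection to the workspace
        job_id=12345,                              # hypothetical job ID
        notebook_params={"run_date": "{{ ds }}"},  # forwarded to the job's notebook task
    )
```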
Good one @Sourav-Kundu! Your clear explanations of the operators really simplify job management, plus the resource link you included makes it easy for everyone to dive deeper.
- 246 Views
- 1 reply
- 2 kudos
Use Retrieval-augmented generation (RAG) to boost performance of LLM applications
Retrieval-augmented generation (RAG) is a method that boosts the performance of large language model (LLM) applications by utilizing tailored data. It achieves this by fetching pertinent data or documents related to a specific query or task and presen...
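To make the flow concrete, a minimal sketch in Python of the retrieve-then-generate loop; retriever and llm are hypothetical stand-ins for a vector index and an LLM client, not any specific library:

```python
def answer_with_rag(query: str, retriever, llm, k: int = 3) -> str:
    # 1. Retrieval: fetch the k documents most relevant to the query
    #    (e.g. from a vector index); `retriever` is a hypothetical object
    docs = retriever.search(query, top_k=k)

    # 2. Augmentation: inject the retrieved text into the prompt as context
    context = "\n\n".join(d.text for d in docs)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

    # 3. Generation: the LLM answers grounded in the tailored data
    return llm.generate(prompt)
```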
Thanks for sharing such valuable insight, @Sourav-Kundu. Your breakdown of how RAG enhances LLMs is spot on: clear and concise!
- 443 Views
- 1 reply
- 2 kudos
You can use Low Shuffle Merge to optimize the Merge process in Delta Lake
Low Shuffle Merge in Databricks is a feature that optimizes the way data is merged when using Delta Lake, reducing the amount of data shuffled between nodes.
- Traditional merges can involve heavy data shuffling, as data is redistributed across the cl...
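For context, a standard Delta MERGE that Low Shuffle Merge optimizes; on recent Databricks Runtime versions it is on by default, the conf flag for older runtimes is quoted from the Databricks docs (treat it as an assumption for your runtime), and the table names are hypothetical:

```python
# On older runtimes, Low Shuffle Merge had to be enabled explicitly;
# on recent DBR versions it is the default (flag name per Databricks docs)
spark.conf.set("spark.databricks.delta.merge.enableLowShuffle", "true")

# A standard Delta MERGE; Low Shuffle Merge changes how unmodified
# rows are written back, avoiding a full reshuffle across the cluster
spark.sql("""
  MERGE INTO target t
  USING updates u
  ON t.id = u.id
  WHEN MATCHED THEN UPDATE SET t.value = u.value
  WHEN NOT MATCHED THEN INSERT (id, value) VALUES (u.id, u.value)
""")
```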
Great post, @Sourav-Kundu. The benefits you've outlined, especially regarding faster execution and cost efficiency, are valuable for anyone working with large-scale data processing. Thanks for sharing!
- 207 Views
- 0 replies
- 0 kudos
Utilize Unity Catalog alongside your Delta Live Tables pipelines
Delta Live Tables support for Unity Catalog is in Public Preview. Databricks recommends setting up Delta Live Tables pipelines using Unity Catalog. When configured with Unity Catalog, these pipelines publish all defined materialized views and streaming ...
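A minimal sketch of a pipeline dataset in Python; with Unity Catalog, the target catalog and schema are set in the pipeline configuration rather than in code, and the source path below is a hypothetical example:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested with Auto Loader")
def raw_events():
    # Hypothetical source path; with Unity Catalog the resulting table is
    # published to the catalog/schema configured on the pipeline
    return (
        spark.readStream.format("cloudFiles")  # spark is ambient in DLT
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/default/landing/events")
    )

@dlt.table(comment="Cleaned events")
def clean_events():
    return dlt.read_stream("raw_events").where(F.col("event_type").isNotNull())
```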
- 298 Views
- 0 replies
- 1 kudos
Databricks Asset Bundles package and deploy resources like notebooks and workflows as a single unit.
Databricks Asset Bundles help implement software engineering best practices like version control, testing and CI/CD for data and AI projects.
1. They allow you to define resources such as jobs and notebooks as source files, making project structure, t...
- 191 Views
- 0 replies
- 0 kudos
Databricks serverless budget policies are now available in Public Preview
Databricks serverless budget policies are now available in Public Preview, enabling administrators to automatically apply the correct tags to serverless resources without relying on users to manually attach them.
1. This feature allows for customized ...
- 475 Views
- 0 replies
- 1 kudos
How to recover Dropped Tables in Databricks
Have you ever accidentally dropped a table in Databricks, or had someone else mistakenly drop it?
Databricks offers a useful feature that allows you to view dropped tables and recover them if needed.
1. You need to first execute SHOW TABLES DROPPED
2. T...
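A hedged sketch of the two steps, run via spark.sql with hypothetical catalog/schema/table names; per the docs, UNDROP applies to Unity Catalog managed tables within the retention window (7 days):

```python
# Step 1: list recently dropped tables in a schema to find the target
spark.sql("SHOW TABLES DROPPED IN main.default").show(truncate=False)

# Step 2: recover the dropped table by name (Unity Catalog managed
# tables, within the retention window)
spark.sql("UNDROP TABLE main.default.my_table")
```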
- 2054 Views
- 6 replies
- 4 kudos
Library Management via Custom Compute Policies and ADF Job Triggering
This guide is intended for those looking to install libraries on a cluster using a Custom Compute Policy and trigger Databricks jobs from an Azure Data Factory (ADF) linked service. While many users rely on init scripts for library installation, it i...
Hi @hassan2, I had the same issue and found a solution. When I created the pool, I created it as on-demand (not spot), and then the policy only worked when I removed the entire "azure_attributes.spot_bid_max_price" section from the policy. Looks like "azure_attributes.spot_bi...
- 782 Views
- 4 replies
- 2 kudos
Resolved! Want to learn LakeFlow Pipelines in community edition.
Hello Everyone. I want to explore LakeFlow Pipelines in the community version but don’t have access to Azure or AWS. I had a bad experience with Azure, where I was charged $85 while just trying to learn. Is there a less expensive, step-by-step learni...
Hi @nafikazi, sorry, this is not possible in Community Edition. Your only option is to have an AWS or Azure account.
- 2848 Views
- 6 replies
- 7 kudos
🚀 Databricks Custom Apps! 🚀
Whether you're a data scientist or a sales executive, Databricks is making it easier than ever to build, host, and share secure data applications. With our platform, you can now run any Python code on serverless compute, share it with non-technical c...
Can we somehow play with hosting, and expose this app outside?
- 803 Views
- 5 replies
- 1 kudos
Writing append blob files to a Unity Catalog volume
The workspace is assigned to Unity Catalog, and all access to ADLS Gen2 is now handled via Unity Catalog only, meaning no SPN, no connection string, no access keys, etc. I have to create append blob files in a volume. Is this possible in a works...
Now I got your point. No, you can't create Append Blob files directly in Volumes, as this is native Azure functionality. A volume is basically just an abstraction over native storage. You will still need to use libraries like azure-storage-blob wi...
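To illustrate the reply, a rough sketch that writes an Append Blob with azure-storage-blob directly against the storage account, bypassing the volume path; the account URL, container, and blob names are hypothetical placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Authenticate directly against the storage account; a UC volume cannot
# create Append Blobs, so this goes through the Azure SDK instead
service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
blob = service.get_blob_client(container="mycontainer", blob="logs/app.log")

# Create the blob as an Append Blob once, then append blocks to it
if not blob.exists():
    blob.create_append_blob()
blob.append_block(b"new log line\n")
```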