Knowledge Sharing Hub
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

SumitSingh
by Contributor
  • 2398 Views
  • 6 replies
  • 8 kudos

From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications

In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...

Latest Reply
jem
New Contributor II
  • 8 kudos

This is great! I have worked with Databricks for almost three years and have decided to pursue the Databricks Engineer Professional certification. This will certainly help with setting up an effective plan.

5 More Replies
Danny_Lee
by Valued Contributor
  • 1316 Views
  • 0 replies
  • 1 kudos

Databricks AI Security Framework

Today Databricks announced the release of the Databricks AI Security Framework (LinkedIn Post). You can download the paper (PDF) from the blog post. Anyone else download this and have thoughts? My first thought is it's a great start and has an excellent G...

avrm91
by Contributor
  • 1179 Views
  • 0 replies
  • 0 kudos

GCP - Initial External Location to GCP Bucket is wrong

When creating a new Workspace in GCP, the default GCP External Location is wrong. It's easily fixed via Catalog (on the left) > External Data (at the bottom) > External Locations > choose the connection and edit the URL by deleting the second BucketId af...
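For reference, a minimal sketch of the same fix done in SQL instead of the UI, assuming Unity Catalog's ALTER EXTERNAL LOCATION command; the location name and bucket URL below are placeholders, not values from the post:

```python
# Placeholder names: "my_default_location" and the bucket path are illustrative only.
# Inspect the current (wrong) URL of the external location.
spark.sql("DESCRIBE EXTERNAL LOCATION my_default_location").show(truncate=False)

# Point the external location at the corrected bucket URL (i.e. without the
# duplicated bucket id segment) and force the change.
spark.sql("""
    ALTER EXTERNAL LOCATION my_default_location
    SET URL 'gs://my-workspace-bucket'
    FORCE
""")
```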

MichTalebzadeh
by Valued Contributor
  • 1404 Views
  • 0 replies
  • 0 kudos

Feature article: Leveraging Generative AI with Apache Spark: Transforming Data Engineering

I created this article on LinkedIn to allow both this community and the Apache Spark user community to have access to it. It is particularly useful for data engineers who want to have a basic understanding of what Generative AI with Spark can do. Leverag...

Knowledge Sharing Hub
Generative AI
spark
Hubert-Dudek
by Esteemed Contributor III
  • 3452 Views
  • 1 reply
  • 3 kudos

DBR 15.0 beta

Databricks Runtime 15 is out there! Some breaking changes. More info here: https://docs.databricks.com/en/release-notes/runtime/15.0.html

Latest Reply
jose_gonzalez
Databricks Employee
  • 3 kudos

Thanks for sharing this information @Hubert-Dudek!!!

Hubert-Dudek
by Esteemed Contributor III
  • 2764 Views
  • 1 reply
  • 1 kudos

Notebook IDE

This is an excellent step for #databricks notebooks. An integrated debugger and a CLI in the notebook terminal are a big step towards a fully functional cloud IDE.

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Thank you for sharing this @Hubert-Dudek!!!

MichTalebzadeh
by Valued Contributor
  • 4308 Views
  • 2 replies
  • 0 kudos

Build a machine learning model to detect fraudulent transactions using PySpark's MLlib library

Introduction: Financial fraud is a significant concern for businesses and consumers alike. I have written about this concern a few times in LinkedIn articles. Machine learning offers powerful tools to combat this issue by automatically identifying sus...
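As a rough illustration of the kind of pipeline the article describes, here is a minimal PySpark MLlib sketch trained on synthetic transactions; the column names, the synthetic data, and the fraud rule are illustrative assumptions, not taken from the article:

```python
# Minimal sketch: train a fraud classifier on synthetic transactions with PySpark MLlib.
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

# Synthetic transactions: amount, hour of day, and a crude fraud label for illustration.
df = (spark.range(10_000)
        .select((F.rand(seed=1) * 500).alias("amount"),
                (F.rand(seed=2) * 24).cast("int").alias("hour"))
        .withColumn("is_fraud", (F.col("amount") > 450).cast("int")))

assembler = VectorAssembler(inputCols=["amount", "hour"], outputCol="features")
train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)

model = LogisticRegression(labelCol="is_fraud", featuresCol="features").fit(train)
model.transform(test).select("is_fraud", "prediction").show(5)
```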

Knowledge Sharing Hub
Financial Fraud
PySpark MLlib
spark
Latest Reply
deborah621
New Contributor II
  • 0 kudos

Looking to build a machine learning model for detecting fraudulent transactions using PySpark’s MLlib? Generating synthetic transaction data provides a dataset for model training without using sensitive real-world data and enables the creation of diverse...

1 More Reply
alexgv12
by New Contributor III
  • 1475 Views
  • 1 reply
  • 2 kudos

Is it possible to have class-level separation in Databricks or implement a design pattern in Databricks?

If you have thought about making your code inside Databricks and notebooks more reusable and organized, and about implementing a design pattern or class-level separation in Databricks, the answer is yes: I am going to tell you the deta...
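One common way to get that kind of separation, sketched below assuming the reusable class lives in a workspace file imported by the notebook; the module path, class name, and volume path are hypothetical placeholders, not the author's exact approach:

```python
# utils/ingestion.py - a hypothetical reusable module kept alongside the notebook.
from pyspark.sql import DataFrame, SparkSession


class BronzeIngestor:
    """Encapsulates one ingestion concern so the notebook itself stays thin."""

    def __init__(self, spark: SparkSession, source_path: str):
        self.spark = spark
        self.source_path = source_path

    def read(self) -> DataFrame:
        # Read the raw landing data; the format is fixed here only for the sketch.
        return self.spark.read.format("json").load(self.source_path)


# In a notebook cell the class is then imported like any other Python module:
# from utils.ingestion import BronzeIngestor
# df = BronzeIngestor(spark, "/Volumes/main/raw/events").read()
```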

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Thanks! I have spent quite some time figuring out what the best way is. Your approach is certainly a valid one. Myself, I prefer to package reused classes in a jar (we mainly code in Scala). Works fine too.

MichTalebzadeh
by Valued Contributor
  • 4609 Views
  • 1 reply
  • 1 kudos

Building Event-Driven Real-Time Data Processor with Spark Structured Streaming and API Integration

I recently saw an article from Databricks titled "Scalable Spark Structured Streaming for REST API Destinations", a great article, about a year old, focusing on continuous Spark Structured Streaming (SSS). I then decided, given customer demands, to wo...
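For anyone curious what that pattern looks like in code, here is a minimal sketch of pushing micro-batches to a REST endpoint with foreachBatch; the endpoint URL, the demo rate source, and the checkpoint path are placeholder assumptions, not details from the article:

```python
# Sketch: forward each micro-batch of a structured stream to a REST endpoint.
import requests

def post_to_api(batch_df, batch_id: int):
    # POST each row of the micro-batch as JSON; for real volumes you would batch
    # the payloads or parallelise the calls instead of collecting to the driver.
    for row in batch_df.toJSON().collect():
        requests.post("https://example.com/ingest", data=row,
                      headers={"Content-Type": "application/json"}, timeout=10)

(spark.readStream
      .format("rate")                 # demo source emitting timestamped rows
      .option("rowsPerSecond", 5)
      .load()
      .writeStream
      .foreachBatch(post_to_api)
      .option("checkpointLocation", "/tmp/checkpoints/rest_demo")
      .start())
```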

Knowledge Sharing Hub
Event-driven architecture
Flask
spark
Spark Structure Streaming
Spark Structured Streaming
Hubert-Dudek
by Esteemed Contributor III
  • 1675 Views
  • 0 replies
  • 0 kudos

stored procedures

The plan for stored procedures in Databricks Spark has been announced in a few places. How might stored procedures look in Spark SQL?

hanlinsun
by Databricks Employee
  • 554 Views
  • 0 replies
  • 0 kudos

Redesigned Move File & Clone File Experiences

Hi everyone! We are redesigning the Move File and Clone File experiences. We want to make it as seamless as possible to organize your files, and would love your feedback on the designs! Move File: Move Option 1, Move Option 2. Clone File: Cl...

Hubert-Dudek
by Esteemed Contributor III
  • 1633 Views
  • 1 reply
  • 2 kudos

liquid partitioning

Based on my experience with data partitioning, it often diminishes performance rather than enhancing it. There are exceptions, like when handling tables over 1 TB, or when EVERY single query filters on the partition column in the WHERE clause - for instance, a Pow...
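For comparison, a minimal sketch of the two layouts the post contrasts: a traditionally partitioned Delta table versus one using liquid clustering's CLUSTER BY. The table and column names are placeholders (Delta is the default table format on Databricks):

```python
# Traditional partitioning: only helps when queries filter on the partition column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_partitioned (id BIGINT, country STRING, amount DOUBLE)
    PARTITIONED BY (country)
""")

# Liquid clustering: CLUSTER BY co-locates data without rigid partition directories,
# and the clustering keys can be changed later without rewriting the layout by hand.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales_clustered (id BIGINT, country STRING, amount DOUBLE)
    CLUSTER BY (country)
""")
```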

Knowledge Sharing Hub
optimize
Partitions
Latest Reply
jose_gonzalez
Databricks Employee
  • 2 kudos

Thank you for sharing this @Hubert-Dudek !!

Taha_Hussain
by Databricks Employee
  • 2508 Views
  • 0 replies
  • 2 kudos

Your updated resource guide to the Databricks Data Intelligence Platform

Want to increase your Databricks knowledge? Look no further! Here’s a guide filled with key resources you’ll need while working on the Databricks Data Intelligence Platform. Bookmark these pages for future reference, or apply these learnings during the...

Taha_Hussain
by Databricks Employee
  • 2968 Views
  • 0 replies
  • 0 kudos

✨New in Notebooks: AI-powered Databricks Assistant, improved visualizations, web terminal and more!

We are excited to share some of the latest updates in Databricks Notebooks. From AI-powered Databricks Assistant that automates code development to new charts with better performance, these features help you build faster. See the latest features live...

Sujitha
by Databricks Employee
  • 10350 Views
  • 1 reply
  • 1 kudos

Simplify complex workflows with modular jobs

Thousands of Databricks customers use Databricks Workflows every day to orchestrate business-critical workloads on the Databricks Lakehouse Platform. A great way to simplify those critical workloads is through modular orchestration. This is now possi...
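A rough sketch of what modular orchestration can look like programmatically, assuming the databricks-sdk Python package (the post itself does not show code); the job name and the child job id are placeholders:

```python
# A parent job whose only task triggers an existing "child" job by id.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

parent = w.jobs.create(
    name="nightly-pipeline",
    tasks=[
        jobs.Task(
            task_key="run_ingest_job",
            run_job_task=jobs.RunJobTask(job_id=1234),  # 1234 is a placeholder job id
        )
    ],
)
print(f"Created parent job {parent.job_id}")
```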

Knowledge Sharing Hub
jobs
Modular Orchestration
run jobs
Workflows
Latest Reply
UiliamVenerio
New Contributor II
  • 1 kudos

Hello, is the "if/else condition" task type available for testing?

