<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic What are the practical differences between bagging and boosting algorithms? in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142051#M4480</link>
    <description>&lt;P&gt;How are bagging and boosting different when you use them in real machine-learning projects?&lt;/P&gt;</description>
    <pubDate>Wed, 17 Dec 2025 07:38:01 GMT</pubDate>
    <dc:creator>Suheb</dc:creator>
    <dc:date>2025-12-17T07:38:01Z</dc:date>
    <item>
      <title>What are the practical differences between bagging and boosting algorithms?</title>
      <link>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142051#M4480</link>
      <description>&lt;P&gt;How are bagging and boosting different when you use them in real machine-learning projects?&lt;/P&gt;</description>
      <pubDate>Wed, 17 Dec 2025 07:38:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142051#M4480</guid>
      <dc:creator>Suheb</dc:creator>
      <dc:date>2025-12-17T07:38:01Z</dc:date>
    </item>
    <item>
      <title>Re: What are the practical differences between bagging and boosting algorithms?</title>
      <link>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142116#M4485</link>
      <description>&lt;P&gt;Bagging and boosting differ mainly in how they reduce error and when you’d choose them:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Bagging (e.g., Random Forest) trains many models independently in parallel on different bootstrap samples to reduce variance, making it ideal for unstable, high-variance models and noisy data; it’s robust, easy to tune, and rarely overfits.&lt;/LI&gt;
&lt;LI&gt;Boosting (e.g., XGBoost, LightGBM) trains models sequentially, where each new model focuses on previous mistakes to reduce bias, making it powerful for complex patterns and structured/tabular data, but more sensitive to noise and hyperparameters.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;Use bagging when your model overfits and the data is noisy; use boosting when you need maximum accuracy and can carefully tune and validate.&lt;/P&gt;</description>
      <pubDate>Wed, 17 Dec 2025 18:53:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142116#M4485</guid>
      <dc:creator>iyashk-DB</dc:creator>
      <dc:date>2025-12-17T18:53:56Z</dc:date>
    </item>
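The trade-off described above can be seen end to end in a short experiment. This is a minimal sketch, not a benchmark: it assumes scikit-learn is installed, and the synthetic dataset, estimator counts, and hyperparameters are all illustrative choices, not recommendations.

```python
# Contrast bagging (independent deep trees, averaged) with boosting
# (sequential shallow trees) on a synthetic classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Bagging-style ensemble: many deep trees, each fit independently on a
# bootstrap sample; averaging their votes reduces variance.
bagger = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Boosting: shallow trees fit one after another, each correcting the
# mistakes of the ensemble so far; this reduces bias but needs tuning.
booster = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                     learning_rate=0.1,
                                     random_state=0).fit(X_tr, y_tr)

print("bagging  accuracy:", accuracy_score(y_te, bagger.predict(X_te)))
print("boosting accuracy:", accuracy_score(y_te, booster.predict(X_te)))
```

On clean synthetic data like this the two usually land close together; the differences the answer describes (noise robustness vs. tuning sensitivity) show up once you add label noise or shift the hyperparameters.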
    <item>
      <title>Re: What are the practical differences between bagging and boosting algorithms?</title>
      <link>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142461#M4494</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/112558"&gt;@iyashk-DB&lt;/a&gt;&amp;nbsp;, this helps.&lt;/P&gt;</description>
      <pubDate>Tue, 23 Dec 2025 17:33:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142461#M4494</guid>
      <dc:creator>aaravmehta</dc:creator>
      <dc:date>2025-12-23T17:33:10Z</dc:date>
    </item>
    <item>
      <title>Re: What are the practical differences between bagging and boosting algorithms?</title>
      <link>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142570#M4505</link>
      <description>&lt;P&gt;The practical differences between bagging and boosting mostly come down to how they build models and how they handle errors:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P&gt;Model Training Approach:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Bagging (Bootstrap Aggregating): Builds multiple models in parallel using random subsets of the data. Each model is independent.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Boosting: Builds models sequentially, where each new model focuses on correcting the mistakes of the previous ones.&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Error Reduction:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Bagging: Reduces variance, so it’s great for high-variance models like decision trees. It helps prevent overfitting.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Boosting: Reduces bias, making weak models stronger, but it can sometimes overfit if not carefully tuned.&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Sensitivity to Outliers:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Bagging: Less sensitive to outliers because errors are averaged across models.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Boosting: More sensitive to outliers because it tries harder to correct errors, including noisy data.&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Examples:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Bagging: Random Forest is the classic example.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;Boosting: AdaBoost, Gradient Boosting, XGBoost, and LightGBM.&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;In short: Use bagging when you want to stabilize high-variance models, and boosting when you want to improve weak learners and reduce bias, keeping an eye on potential overfitting.&lt;/P&gt;</description>
      <pubDate>Sat, 27 Dec 2025 12:18:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/what-are-the-practical-differences-between-bagging-and-boosting/m-p/142570#M4505</guid>
      <dc:creator>jameswood32</dc:creator>
      <dc:date>2025-12-27T12:18:42Z</dc:date>
    </item>
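The two training loops in point 1 of the list above can also be written out by hand, which makes the mechanics concrete: bagging averages independently trained models over bootstrap samples, while boosting repeatedly fits the residuals left by the ensemble so far. This is a hand-rolled sketch for illustration only; it assumes scikit-learn for the base decision trees, and the ensemble sizes, depth, and learning rate are arbitrary.

```python
# Hand-rolled bagging (parallel bootstrap averaging) vs. boosting
# (sequential residual fitting) on a small regression task.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=1500, n_features=10, noise=10.0,
                       random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rng = np.random.default_rng(0)

# Bagging: each deep tree sees its own bootstrap sample (drawn with
# replacement); the final prediction is the average over all trees.
bag_preds = []
for _ in range(50):
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    tree = DecisionTreeRegressor(random_state=0).fit(X_tr[idx], y_tr[idx])
    bag_preds.append(tree.predict(X_te))
bag_pred = np.mean(bag_preds, axis=0)

# Boosting: each shallow tree is fit to the residuals of the current
# ensemble, and its (shrunken) prediction is added to the running total.
boost_tr = np.zeros(len(y_tr))
boost_te = np.zeros(len(y_te))
lr = 0.1  # learning rate / shrinkage
for _ in range(50):
    tree = DecisionTreeRegressor(max_depth=3,
                                 random_state=0).fit(X_tr, y_tr - boost_tr)
    boost_tr += lr * tree.predict(X_tr)
    boost_te += lr * tree.predict(X_te)

print("bagging  test MSE:", np.mean((bag_pred - y_te) ** 2))
print("boosting test MSE:", np.mean((boost_te - y_te) ** 2))
```

Note how the bagging loop iterations are independent (they could run in parallel), whereas each boosting iteration depends on the residuals produced by the previous ones, which is exactly why boosting cannot be trivially parallelized across rounds.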
  </channel>
</rss>

