<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Data Engineering topic: How can I simplify my data ingestion by processing the data as it arrives in cloud storage?</title>
    <link>https://community.databricks.com/t5/data-engineering/how-can-i-simplify-my-data-ingestion-by-processing-the-data-as/m-p/31473#M22917</link>
    <description>&lt;P&gt;This post will help you simplify your data ingestion by using Auto Loader, Delta Optimized Writes, Delta write jobs, and Delta Live Tables.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;Prerequisite:&amp;nbsp;&lt;/I&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;You are working with JSON data and Delta write commands&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 1: Simplify ingestion with Auto Loader&amp;nbsp;&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Delta Lake helps unlock the full capabilities of working with JSON data in Databricks. Auto Loader makes it easy to ingest JSON data and manage semi-structured data in the Databricks Lakehouse.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Get hands-on and import this &lt;A href="https://databricks.com/notebooks/autoloader-ingest-complex-json-sql-query.html" alt="https://databricks.com/notebooks/autoloader-ingest-complex-json-sql-query.html" target="_blank"&gt;&lt;U&gt;notebook&lt;/U&gt;&lt;/A&gt; for a walkthrough of continuous and scheduled ingestion of JSON data with Auto Loader.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If you want to learn more, check out this &lt;A href="https://databricks.com/blog/2021/11/11/10-powerful-features-to-simplify-semi-structured-data-management-in-the-databricks-lakehouse.html" alt="https://databricks.com/blog/2021/11/11/10-powerful-features-to-simplify-semi-structured-data-management-in-the-databricks-lakehouse.html" target="_blank"&gt;&lt;U&gt;overview blog&lt;/U&gt;&lt;/A&gt; and this &lt;A href="https://customer-academy.databricks.com/learn/course/761/how-to-use-databricks-auto-loader-for-incremental-etl-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=12eaf982b6a6f095dcba8696c55b3dd5fb9c913c" 
alt="https://customer-academy.databricks.com/learn/course/761/how-to-use-databricks-auto-loader-for-incremental-etl-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=12eaf982b6a6f095dcba8696c55b3dd5fb9c913c" target="_blank"&gt;&lt;U&gt;short video&lt;/U&gt;&lt;/A&gt;, and come back to this post to follow Steps 2-3.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 2: Reduce latency by optimizing your writes to Delta tables&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Now that you&#8217;re using Delta tables, reduce read latency by enabling Auto Optimize to automatically compact small files during individual writes.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Set your table&#8217;s properties to&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;B&gt;delta.autoOptimize.optimizeWrite = true &lt;/B&gt;and &lt;B&gt;delta.autoOptimize.autoCompact = true&lt;/B&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;in the &lt;B&gt;CREATE TABLE &lt;/B&gt;command.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;Tip: Tables with many active queries and latency requirements on the order of minutes benefit most from Auto Optimize.&lt;/I&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Find examples &lt;A href="https://docs.databricks.com/delta/optimizations/auto-optimize.html#enable-auto-optimize" alt="https://docs.databricks.com/delta/optimizations/auto-optimize.html#enable-auto-optimize" target="_blank"&gt;&lt;U&gt;here&lt;/U&gt;&lt;/A&gt; for enabling Auto Optimize on all tables.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 3: Set up automated ETL processing&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Finally, use Databricks workflows and jobs to author, manage, and orchestrate ingestion of your semi-structured and streaming data.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Here's a quick walkthrough 
on &lt;A href="https://customer-academy.databricks.com/learn/course/758/how-to-schedule-a-job-and-automate-a-workload-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=a29052770ace3560ad639bc0aa657945cb36541d" alt="https://customer-academy.databricks.com/learn/course/758/how-to-schedule-a-job-and-automate-a-workload-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=a29052770ace3560ad639bc0aa657945cb36541d" target="_blank"&gt;&lt;U&gt;How to Schedule a Job and Automate a Workload&lt;/U&gt;&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Did you know Databricks also provides powerful ETL capabilities with Delta Live Tables (DLT)? With DLT, treat your data as code and apply software engineering best practices like testing, monitoring and documentation to deploy reliable pipelines at scale.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;To learn more about DLT...&lt;/P&gt;&lt;P&gt;- Follow the&lt;A href="https://databricks.com/discover/pages/getting-started-with-delta-live-tables" alt="https://databricks.com/discover/pages/getting-started-with-delta-live-tables" target="_blank"&gt;&lt;U&gt; DLT Getting Started Guide&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- &lt;A href="https://databricks.com/discover/demos/delta-live-tables-demo" alt="https://databricks.com/discover/demos/delta-live-tables-demo" target="_blank"&gt;&lt;U&gt;Watch a demo&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- Download &lt;A href="https://github.com/databricks/delta-live-tables-notebooks" alt="https://github.com/databricks/delta-live-tables-notebooks" target="_blank"&gt;&lt;U&gt;example notebooks&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- Join the &lt;A href="https://community.databricks.com/s/topic/0TO3f000000CiKMGA0/dlt" alt="https://community.databricks.com/s/topic/0TO3f000000CiKMGA0/dlt" target="_blank"&gt;&lt;U&gt;DLT discussions&lt;/U&gt;&lt;/A&gt; in the Databricks Community&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Congrats 
you have now optimized your data ingestion to get the most out of your data!&lt;/P&gt;&lt;P&gt;Drop your questions, feedback, and tips below!&lt;/P&gt;</description>
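Step 1 above describes ingesting JSON as it lands in cloud storage with Auto Loader. A minimal PySpark sketch follows; the source path, schema location, and the `build_stream` helper are hypothetical placeholders (not from the post), and a live Spark session on Databricks is assumed.

```python
# Sketch of Step 1: ingesting JSON from cloud storage with Auto Loader.
# All paths and names here are hypothetical placeholders.

def autoloader_options(schema_location: str) -> dict:
    """Options commonly passed to Auto Loader (the 'cloudFiles' source) for JSON."""
    return {
        "cloudFiles.format": "json",
        # Auto Loader tracks the inferred, evolving schema at this location.
        "cloudFiles.schemaLocation": schema_location,
    }

def build_stream(spark, source_path: str, schema_location: str):
    """Return a streaming DataFrame that picks up new files as they arrive.

    Requires a Spark session on Databricks; shown for illustration only.
    """
    return (
        spark.readStream.format("cloudFiles")
        .options(**autoloader_options(schema_location))
        .load(source_path)
    )
```

In a notebook this stream would typically be written to a bronze Delta table with `writeStream`, using a checkpoint location so ingestion resumes where it left off.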
    <pubDate>Fri, 16 Sep 2022 23:20:11 GMT</pubDate>
    <dc:creator>User16835756816</dc:creator>
    <dc:date>2022-09-16T23:20:11Z</dc:date>
    <item>
      <title>How can I simplify my data ingestion by processing the data as it arrives in cloud storage?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-can-i-simplify-my-data-ingestion-by-processing-the-data-as/m-p/31473#M22917</link>
      <description>&lt;P&gt;This post will help you simplify your data ingestion by using Auto Loader, Delta Optimized Writes, Delta write jobs, and Delta Live Tables.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;Prerequisite:&amp;nbsp;&lt;/I&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;You are working with JSON data and Delta write commands&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 1: Simplify ingestion with Auto Loader&amp;nbsp;&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Delta Lake helps unlock the full capabilities of working with JSON data in Databricks. Auto Loader makes it easy to ingest JSON data and manage semi-structured data in the Databricks Lakehouse.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Get hands-on and import this &lt;A href="https://databricks.com/notebooks/autoloader-ingest-complex-json-sql-query.html" alt="https://databricks.com/notebooks/autoloader-ingest-complex-json-sql-query.html" target="_blank"&gt;&lt;U&gt;notebook&lt;/U&gt;&lt;/A&gt; for a walkthrough of continuous and scheduled ingestion of JSON data with Auto Loader.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If you want to learn more, check out this &lt;A href="https://databricks.com/blog/2021/11/11/10-powerful-features-to-simplify-semi-structured-data-management-in-the-databricks-lakehouse.html" alt="https://databricks.com/blog/2021/11/11/10-powerful-features-to-simplify-semi-structured-data-management-in-the-databricks-lakehouse.html" target="_blank"&gt;&lt;U&gt;overview blog&lt;/U&gt;&lt;/A&gt; and this &lt;A href="https://customer-academy.databricks.com/learn/course/761/how-to-use-databricks-auto-loader-for-incremental-etl-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=12eaf982b6a6f095dcba8696c55b3dd5fb9c913c" 
alt="https://customer-academy.databricks.com/learn/course/761/how-to-use-databricks-auto-loader-for-incremental-etl-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=12eaf982b6a6f095dcba8696c55b3dd5fb9c913c" target="_blank"&gt;&lt;U&gt;short video&lt;/U&gt;&lt;/A&gt;, and come back to this post to follow Steps 2-3.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 2: Reduce latency by optimizing your writes to Delta tables&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Now that you&#8217;re using Delta tables, reduce read latency by enabling Auto Optimize to automatically compact small files during individual writes.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Set your table&#8217;s properties to&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;B&gt;delta.autoOptimize.optimizeWrite = true &lt;/B&gt;and &lt;B&gt;delta.autoOptimize.autoCompact = true&lt;/B&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;in the &lt;B&gt;CREATE TABLE &lt;/B&gt;command.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;I&gt;Tip: Tables with many active queries and latency requirements on the order of minutes benefit most from Auto Optimize.&lt;/I&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Find examples &lt;A href="https://docs.databricks.com/delta/optimizations/auto-optimize.html#enable-auto-optimize" alt="https://docs.databricks.com/delta/optimizations/auto-optimize.html#enable-auto-optimize" target="_blank"&gt;&lt;U&gt;here&lt;/U&gt;&lt;/A&gt; for enabling Auto Optimize on all tables.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;U&gt;Step 3: Set up automated ETL processing&lt;/U&gt;&lt;/P&gt;&lt;P&gt;Finally, use Databricks workflows and jobs to author, manage, and orchestrate ingestion of your semi-structured and streaming data.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Here's a quick walkthrough 
on &lt;A href="https://customer-academy.databricks.com/learn/course/758/how-to-schedule-a-job-and-automate-a-workload-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=a29052770ace3560ad639bc0aa657945cb36541d" alt="https://customer-academy.databricks.com/learn/course/758/how-to-schedule-a-job-and-automate-a-workload-with-the-databricks-data-science-and-data-engineering-workspace?generated_by=231809&amp;amp;hash=a29052770ace3560ad639bc0aa657945cb36541d" target="_blank"&gt;&lt;U&gt;How to Schedule a Job and Automate a Workload&lt;/U&gt;&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Did you know Databricks also provides powerful ETL capabilities with Delta Live Tables (DLT)? With DLT, treat your data as code and apply software engineering best practices like testing, monitoring and documentation to deploy reliable pipelines at scale.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;To learn more about DLT...&lt;/P&gt;&lt;P&gt;- Follow the&lt;A href="https://databricks.com/discover/pages/getting-started-with-delta-live-tables" alt="https://databricks.com/discover/pages/getting-started-with-delta-live-tables" target="_blank"&gt;&lt;U&gt; DLT Getting Started Guide&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- &lt;A href="https://databricks.com/discover/demos/delta-live-tables-demo" alt="https://databricks.com/discover/demos/delta-live-tables-demo" target="_blank"&gt;&lt;U&gt;Watch a demo&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- Download &lt;A href="https://github.com/databricks/delta-live-tables-notebooks" alt="https://github.com/databricks/delta-live-tables-notebooks" target="_blank"&gt;&lt;U&gt;example notebooks&lt;/U&gt;&lt;/A&gt;&lt;/P&gt;&lt;P&gt;- Join the &lt;A href="https://community.databricks.com/s/topic/0TO3f000000CiKMGA0/dlt" alt="https://community.databricks.com/s/topic/0TO3f000000CiKMGA0/dlt" target="_blank"&gt;&lt;U&gt;DLT discussions&lt;/U&gt;&lt;/A&gt; in the Databricks Community&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Congrats 
you have now optimized your data ingestion to get the most out of your data!&lt;/P&gt;&lt;P&gt;Drop your questions, feedback, and tips below!&lt;/P&gt;</description>
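Step 2 sets two table properties in the CREATE TABLE command. A sketch of that DDL follows; the table and column names are hypothetical, and the statement is built as a string here only so it can be shown without a running cluster.

```python
# Sketch of Step 2: enabling Auto Optimize via TBLPROPERTIES in CREATE TABLE.
# Table and column names are hypothetical placeholders.

def create_table_ddl(table_name: str) -> str:
    """Build a Delta CREATE TABLE statement with Auto Optimize enabled."""
    return (
        f"CREATE TABLE {table_name} (id BIGINT, payload STRING) "
        "USING DELTA "
        "TBLPROPERTIES ("
        "'delta.autoOptimize.optimizeWrite' = 'true', "
        "'delta.autoOptimize.autoCompact' = 'true')"
    )

# On Databricks this would be run as, e.g.:
#   spark.sql(create_table_ddl("events_bronze"))
```

The same properties can also be applied to an existing table with ALTER TABLE ... SET TBLPROPERTIES, as the linked docs page shows.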
      <pubDate>Fri, 16 Sep 2022 23:20:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-can-i-simplify-my-data-ingestion-by-processing-the-data-as/m-p/31473#M22917</guid>
      <dc:creator>User16835756816</dc:creator>
      <dc:date>2022-09-16T23:20:11Z</dc:date>
    </item>
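Step 3 orchestrates the ingestion with a scheduled job. The settings below are a minimal sketch: the notebook path, job name, and cron expression are hypothetical, and the field names follow the general shape of a Databricks job definition rather than an exact payload from the post.

```python
# Sketch of Step 3: a scheduled job definition for the ingestion notebook.
# Paths, names, and the cron schedule are hypothetical placeholders.

def ingestion_job_settings(notebook_path: str, cron: str) -> dict:
    """Assemble minimal job settings resembling a Databricks job definition."""
    return {
        "name": "autoloader-json-ingest",
        "schedule": {
            "quartz_cron_expression": cron,  # Quartz syntax, e.g. every 15 min
            "timezone_id": "UTC",
        },
        "tasks": [
            {
                "task_key": "ingest",
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
    }

job = ingestion_job_settings("/Repos/etl/ingest_json", "0 0/15 * * * ?")
```

For a continuous pipeline, Delta Live Tables (discussed above) replaces this hand-built job with a managed pipeline definition.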
  </channel>
</rss>

