<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Simple integration to push data from third-party into a client's Databricks instance in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/simple-integration-to-push-data-from-third-party-into-a-client-s/m-p/136814#M50653</link>
    <description>&lt;P&gt;Hi there, we have an industry data platform with multiple customers using it. We provide each customer with their own data every night via .csv. Some of our customers use Databricks and import their data from us into it.&lt;/P&gt;&lt;P&gt;We would like to offer a simpler solution that removes the following steps: exporting a CSV, uploading it to the customer's S3 bucket, then the customer importing it into their Databricks workspace.&lt;/P&gt;&lt;P&gt;What is the best way we can partner with Databricks to integrate once and offer this benefit to any or all of our customers that use Databricks?&lt;/P&gt;</description>
    <pubDate>Thu, 30 Oct 2025 18:40:29 GMT</pubDate>
    <dc:creator>67</dc:creator>
    <dc:date>2025-10-30T18:40:29Z</dc:date>
    <item>
      <title>Simple integration to push data from third-party into a client's Databricks instance</title>
      <link>https://community.databricks.com/t5/data-engineering/simple-integration-to-push-data-from-third-party-into-a-client-s/m-p/136814#M50653</link>
      <description>&lt;P&gt;Hi there, we have an industry data platform with multiple customers using it. We provide each customer with their own data every night via .csv. Some of our customers use Databricks and import their data from us into it.&lt;/P&gt;&lt;P&gt;We would like to offer a simpler solution that removes the following steps: exporting a CSV, uploading it to the customer's S3 bucket, then the customer importing it into their Databricks workspace.&lt;/P&gt;&lt;P&gt;What is the best way we can partner with Databricks to integrate once and offer this benefit to any or all of our customers that use Databricks?&lt;/P&gt;</description>
      <pubDate>Thu, 30 Oct 2025 18:40:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/simple-integration-to-push-data-from-third-party-into-a-client-s/m-p/136814#M50653</guid>
      <dc:creator>67</dc:creator>
      <dc:date>2025-10-30T18:40:29Z</dc:date>
    </item>
    <item>
      <title>Re: Simple integration to push data from third-party into a client's Databricks instance</title>
      <link>https://community.databricks.com/t5/data-engineering/simple-integration-to-push-data-from-third-party-into-a-client-s/m-p/136833#M50654</link>
      <description>&lt;P&gt;&lt;span class="lia-unicode-emoji" title=":light_bulb:"&gt;💡&lt;/span&gt;&lt;SPAN&gt;You could use external volumes with a Cloudflare R2 bucket as an intermediary: you write the nightly data files to R2 (using its S3-compatible API), and your customers create external volumes in their Databricks workspaces pointing to their designated R2 paths, with read-only credentials you provide. This eliminates the manual CSV export/upload steps, has &lt;STRONG&gt;zero egress costs&lt;/STRONG&gt; from R2, and doesn't require you to maintain your own Databricks infrastructure. Customers can then use Auto Loader or scheduled &lt;STRONG&gt;COPY INTO&lt;/STRONG&gt; commands to automatically ingest new files as they arrive.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Oct 2025 21:11:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/simple-integration-to-push-data-from-third-party-into-a-client-s/m-p/136833#M50654</guid>
      <dc:creator>jeffreyaven</dc:creator>
      <dc:date>2025-10-30T21:11:33Z</dc:date>
    </item>
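    <!--
      The reply above suggests that customers ingest via scheduled COPY INTO from an
      external volume backed by Cloudflare R2. A minimal Databricks SQL sketch of that
      ingest step, assuming a hypothetical volume path /Volumes/main/vendor/r2_drop and
      target table main.vendor.daily_extract (both names are illustrative, not from the
      thread):

          CREATE TABLE IF NOT EXISTS main.vendor.daily_extract;

          COPY INTO main.vendor.daily_extract
          FROM '/Volumes/main/vendor/r2_drop/'
          FILEFORMAT = CSV
          FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
          COPY_OPTIONS ('mergeSchema' = 'true');

      COPY INTO is idempotent: files it has already loaded are skipped on subsequent
      runs, so a customer can schedule this nightly and only the new drop is ingested.
    -->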
  </channel>
</rss>

