<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Docker image for runtime 16.4 LTS with Scala 2.13 in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120527#M46185</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/136871"&gt;@mslow&lt;/a&gt;, as far as I understand, DBR 16.4 LTS supports both Scala 2.12 and 2.13, but Docker Hub provides only one base image. The choice of Scala version doesn’t come from separate Docker images; it is made at cluster creation time via the runtime configuration. The base image stays the same regardless of which Scala version is used.&lt;/P&gt;</description>
    <pubDate>Thu, 29 May 2025 10:34:37 GMT</pubDate>
    <dc:creator>Renu_</dc:creator>
    <dc:date>2025-05-29T10:34:37Z</dc:date>
    <item>
      <title>Docker image for runtime 16.4 LTS with Scala 2.13</title>
      <link>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120425#M46165</link>
      <description>&lt;P&gt;I'm trying to test a custom Python package with the new 16.4 LTS runtime, pulling the official Databricks Docker image from&lt;/P&gt;&lt;P&gt;&lt;A href="https://hub.docker.com/layers/databricksruntime/standard/16.4-LTS/images/sha256-604b73feeac08bc902ab16110218ffc63c1b24ac31f5f04a6798c0cb0000cfc8" target="_blank" rel="noopener"&gt;https://hub.docker.com/layers/databricksruntime/standard/16.4-LTS/images/sha256-604b73feeac08bc902ab16110218ffc63c1b24ac31f5f04a6798c0cb0000cfc8&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Are there supposed to be two images available, one with Scala 2.12 and another with 2.13, or just this one image?&lt;/P&gt;</description>
      <pubDate>Wed, 28 May 2025 13:02:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120425#M46165</guid>
      <dc:creator>mslow</dc:creator>
      <dc:date>2025-05-28T13:02:55Z</dc:date>
    </item>
    <item>
      <title>Re: Docker image for runtime 16.4 LTS with Scala 2.13</title>
      <link>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120527#M46185</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/136871"&gt;@mslow&lt;/a&gt;, as far as I understand, DBR 16.4 LTS supports both Scala 2.12 and 2.13, but Docker Hub provides only one base image. The choice of Scala version doesn’t come from separate Docker images; it is made at cluster creation time via the runtime configuration. The base image stays the same regardless of which Scala version is used.&lt;/P&gt;</description>
      <pubDate>Thu, 29 May 2025 10:34:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120527#M46185</guid>
      <dc:creator>Renu_</dc:creator>
      <dc:date>2025-05-29T10:34:37Z</dc:date>
    </item>
    <item>
      <title>Re: Docker image for runtime 16.4 LTS with Scala 2.13</title>
      <link>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120546#M46189</link>
      <description>&lt;P&gt;Thanks&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/151751"&gt;@Renu_&lt;/a&gt;. That makes sense; I understand that in Databricks workspaces you can choose between the two 16.4 Spark versions when creating a compute.&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;My confusion was about using the Docker image in a local environment. I pull it from the registry, but then I also install dependencies with pip. So if I understand correctly, I will still pull the one and only 16.4 Docker image, and then, depending on which Scala version I want to test with, I need to install packages compiled for that Scala version.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;I haven't found a simple way of pip-installing Scala 2.13-compiled packages, and what I did find was this open issue (PySpark installation doesn't support Scala 2.13 binaries):&lt;/P&gt;&lt;P&gt;&lt;A href="https://issues.apache.org/jira/browse/SPARK-39995" target="_blank"&gt;https://issues.apache.org/jira/browse/SPARK-39995&lt;/A&gt;&lt;/P&gt;&lt;P&gt;If anyone has a suggestion for how to set up a local Dockerfile that runs the 16.4 image with Scala 2.13 packages, it would be greatly appreciated.&lt;/P&gt;</description>
      <pubDate>Thu, 29 May 2025 13:27:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/docker-image-for-runtime-16-4-lts-with-scala-2-13/m-p/120546#M46189</guid>
      <dc:creator>mslow</dc:creator>
      <dc:date>2025-05-29T13:27:43Z</dc:date>
    </item>
  </channel>
</rss>
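One way to approach the thread's open question — testing Scala 2.13-compiled packages locally on top of the single 16.4 LTS base image — is to layer a Scala 2.13 Spark distribution over it, since the pip-installed PySpark only bundles Scala 2.12 jars (SPARK-39995). The following Dockerfile is a sketch under stated assumptions, not a verified recipe: the Spark version (DBR 16.4 LTS tracks Spark 3.5.x) and the exact `-scala2.13` tarball name should be checked against the Apache archive before building.

```dockerfile
# Sketch: run Scala 2.13-compiled packages against the DBR 16.4 LTS base image.
# Assumptions to verify: the Spark minor version bundled in DBR 16.4 LTS, and
# that the Apache archive publishes a -scala2.13 build of that release.
FROM databricksruntime/standard:16.4-LTS

ARG SPARK_VERSION=3.5.2
ENV SPARK_HOME=/opt/spark-scala213

# Fetch a Spark distribution built against Scala 2.13; the pip-installed
# PySpark in the image only ships Scala 2.12 jars (see SPARK-39995).
RUN curl -fsSL "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3-scala2.13.tgz" \
      | tar -xz -C /opt \
 && mv "/opt/spark-${SPARK_VERSION}-bin-hadoop3-scala2.13" "${SPARK_HOME}"

# Point Python and the shell at the Scala 2.13 Spark instead of the pip-installed one.
ENV PYTHONPATH="${SPARK_HOME}/python:${SPARK_HOME}/python/lib/py4j-0.10.9.7-src.zip"
ENV PATH="${SPARK_HOME}/bin:${PATH}"
```

With this layout, `pyspark` and `spark-submit` resolve the Scala 2.13 runtime, and Scala 2.13 artifacts can be added by dropping jars into `${SPARK_HOME}/jars` or via `--packages com.example:my-lib_2.13:1.0.0` (hypothetical coordinates). Note this only approximates the Databricks runtime; cluster-side behavior is still governed by the Spark version chosen at compute creation.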

