<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Topic: use DeltaLog class in Databricks cluster (Data Engineering)</title>
    <link>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/7380#M3268</link>
    <description>&lt;P&gt;I need to use the &lt;B&gt;DeltaLog&lt;/B&gt; class in my code to get the &lt;B&gt;AddFile&lt;/B&gt; dataset. I have to keep the implementation in a repo and run it on a Databricks cluster.&lt;/P&gt;&lt;P&gt;Some docs say to use the &lt;B&gt;org.apache.spark.sql.delta.DeltaLog&lt;/B&gt; class, but Databricks appears to replace it at runtime, and I get &lt;B&gt;NoClassDefFoundError: org/apache/spark/sql/delta/DeltaLog$&lt;/B&gt; when running this on the cluster:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;val files = org.apache.spark.sql.delta.DeltaLog.forTable(spark, path(db, table))
        .unsafeVolatileSnapshot
        .allFiles&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;This is because I use the &lt;I&gt;Provided&lt;/I&gt; scope for the &lt;I&gt;"io.delta" %% "delta-core"&lt;/I&gt; dependency; when I try running without &lt;I&gt;Provided&lt;/I&gt;, I get the exception &lt;B&gt;IllegalArgumentException: requirement failed: Config entry spark.databricks.delta.timeTravel.resolveOnIdentifier.enabled already registered!&lt;/B&gt;&lt;/P&gt;&lt;P&gt;A Databricks KB article (&lt;A href="https://kb.databricks.com/en_US/sql/find-size-of-table" target="_blank"&gt;https://kb.databricks.com/en_US/sql/find-size-of-table&lt;/A&gt;) says to use &lt;B&gt;com.databricks.sql.transaction.tahoe.DeltaLog&lt;/B&gt;, but that class lives outside the io.delta package, which causes a compilation error. I can't even reference the JAR that provides &lt;B&gt;com.databricks.sql.transaction.tahoe.DeltaLog&lt;/B&gt; to import it explicitly into my build. This code works on the cluster:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;val deltaTable = DeltaTable.forPath(spark, path)
deltaTable.getClass.getMethod("deltaLog").invoke(deltaTable)
      .asInstanceOf[com.databricks.sql.transaction.tahoe.DeltaLog]
      .snapshot
      .allFiles&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;but, as I said, I can't keep that in my code because of the compilation issue.&lt;/P&gt;&lt;P&gt;How can I use DeltaLog in my code and still be able to run it on a cluster?&lt;/P&gt;</description>
    <pubDate>Tue, 21 Mar 2023 09:23:27 GMT</pubDate>
    <dc:creator>pokus</dc:creator>
    <dc:date>2023-03-21T09:23:27Z</dc:date>
    <item>
      <title>use DeltaLog class in databricks cluster</title>
      <link>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/7380#M3268</link>
      <description>&lt;P&gt;I need to use the &lt;B&gt;DeltaLog&lt;/B&gt; class in my code to get the &lt;B&gt;AddFile&lt;/B&gt; dataset. I have to keep the implementation in a repo and run it on a Databricks cluster.&lt;/P&gt;&lt;P&gt;Some docs say to use the &lt;B&gt;org.apache.spark.sql.delta.DeltaLog&lt;/B&gt; class, but Databricks appears to replace it at runtime, and I get &lt;B&gt;NoClassDefFoundError: org/apache/spark/sql/delta/DeltaLog$&lt;/B&gt; when running this on the cluster:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;val files = org.apache.spark.sql.delta.DeltaLog.forTable(spark, path(db, table))
        .unsafeVolatileSnapshot
        .allFiles&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;This is because I use the &lt;I&gt;Provided&lt;/I&gt; scope for the &lt;I&gt;"io.delta" %% "delta-core"&lt;/I&gt; dependency; when I try running without &lt;I&gt;Provided&lt;/I&gt;, I get the exception &lt;B&gt;IllegalArgumentException: requirement failed: Config entry spark.databricks.delta.timeTravel.resolveOnIdentifier.enabled already registered!&lt;/B&gt;&lt;/P&gt;&lt;P&gt;A Databricks KB article (&lt;A href="https://kb.databricks.com/en_US/sql/find-size-of-table" target="_blank"&gt;https://kb.databricks.com/en_US/sql/find-size-of-table&lt;/A&gt;) says to use &lt;B&gt;com.databricks.sql.transaction.tahoe.DeltaLog&lt;/B&gt;, but that class lives outside the io.delta package, which causes a compilation error. I can't even reference the JAR that provides &lt;B&gt;com.databricks.sql.transaction.tahoe.DeltaLog&lt;/B&gt; to import it explicitly into my build. This code works on the cluster:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;val deltaTable = DeltaTable.forPath(spark, path)
deltaTable.getClass.getMethod("deltaLog").invoke(deltaTable)
      .asInstanceOf[com.databricks.sql.transaction.tahoe.DeltaLog]
      .snapshot
      .allFiles&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;but, as I said, I can't keep that in my code because of the compilation issue.&lt;/P&gt;&lt;P&gt;How can I use DeltaLog in my code and still be able to run it on a cluster?&lt;/P&gt;</description>
      <pubDate>Tue, 21 Mar 2023 09:23:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/7380#M3268</guid>
      <dc:creator>pokus</dc:creator>
      <dc:date>2023-03-21T09:23:27Z</dc:date>
    </item>
    <item>
      <title>Re: use DeltaLog class in databricks cluster</title>
      <link>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/7381#M3269</link>
      <description>&lt;P&gt;I was able to resolve the issue using reflection only:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;val deltaTable = DeltaTable.forPath(spark, path(db, table))
val deltaLog = deltaTable.getClass.getMethod("deltaLog").invoke(deltaTable)
val snapshot = deltaLog.getClass.getMethod("unsafeVolatileSnapshot").invoke(deltaLog)
val allFiles = snapshot.getClass.getMethod("allFiles").invoke(snapshot).asInstanceOf[DataFrame]&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;but it would still be good to resolve the dependency issue and be able to get the &lt;B&gt;DeltaLog&lt;/B&gt; through the Delta API.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Mar 2023 15:29:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/7381#M3269</guid>
      <dc:creator>pokus</dc:creator>
      <dc:date>2023-03-22T15:29:01Z</dc:date>
    </item>
    <item>
      <title>Re: use DeltaLog class in databricks cluster</title>
      <link>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/64035#M32439</link>
      <description>&lt;P&gt;Thanks for providing a solution,&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/48954"&gt;@pokus&lt;/a&gt;.&lt;/P&gt;&lt;P&gt;What I don't understand is why Databricks cannot provide the DeltaLog at runtime. How can this be the official solution? We need a better answer than depending on reflection.&lt;/P&gt;</description>
      <pubDate>Mon, 18 Mar 2024 20:02:39 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/64035#M32439</guid>
      <dc:creator>dbal</dc:creator>
      <dc:date>2024-03-18T20:02:39Z</dc:date>
    </item>
    <item>
      <title>Re: use DeltaLog class in databricks cluster</title>
      <link>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/133780#M49924</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/48954"&gt;@pokus&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;You don't need to go through reflection.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;You can access DeltaLog with &lt;CODE&gt;spark._jvm&lt;/CODE&gt;:&lt;/STRONG&gt; Unity Catalog and Delta Lake tables expose their metadata and transaction log via the JVM backend, so through &lt;CODE&gt;spark._jvm&lt;/CODE&gt; you can interact with the DeltaLog directly.&lt;/P&gt;
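&lt;P&gt;A minimal PySpark sketch of that approach (the table path is hypothetical, and &lt;CODE&gt;com.databricks.sql.transaction.tahoe.DeltaLog&lt;/CODE&gt; is an internal Databricks API, so the method names below are assumptions that may change between runtime versions):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;# Reach the Databricks-internal DeltaLog through the Py4J JVM gateway.
# Assumption: the internal "tahoe" DeltaLog mirrors the OSS Delta API.
delta_log = spark._jvm.com.databricks.sql.transaction.tahoe.DeltaLog.forTable(
    spark._jsparkSession,
    "/mnt/some/delta/table"  # hypothetical table path
)
snapshot = delta_log.snapshot()
print(snapshot.allFiles().count())  # number of AddFile entries in the snapshot&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Because this goes through Py4J, the returned objects are JVM proxies, not Python Delta objects; any further calls on them must also use the JVM-side method names.&lt;/P&gt;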
&lt;P&gt;Thanks!&lt;/P&gt;
</description>
      <pubDate>Sat, 04 Oct 2025 07:41:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/use-deltalog-class-in-databricks-cluster/m-p/133780#M49924</guid>
      <dc:creator>NandiniN</dc:creator>
      <dc:date>2025-10-04T07:41:46Z</dc:date>
    </item>
  </channel>
</rss>

