<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: databricks academy setup error -data engineering in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/141045#M51612</link>
    <description>&lt;P&gt;Databricks is transitioning from the Community Edition to the Free Edition, which I am currently using.&lt;/P&gt;&lt;P&gt;Inspecting the code, the problem seems to be related to the spark.conf.get() method, which is declared as follows in the documentation:&lt;/P&gt;&lt;P&gt;--------------------------------------------------------------------&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;get(self, key: str, default: Union[str, NoneType, pyspark._globals._NoValueType] = &amp;lt;no value&amp;gt;) -&amp;gt; Optional[str] Parameters &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;key : str | key of the configuration to get. &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;default : str, optional | value of the configuration to get if the key does not exist.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Returns The string value of the configuration set, or None.&amp;nbsp; &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Examples : spark.conf.get("non-existent-key", "my_default")&amp;nbsp; // 'my_default'&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;--------------------------------------------------------------------&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;However, testing the method clearly shows that it raises an exception instead of returning the default value for the missing key. &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;This seems to be solvable using&amp;nbsp;spark.conf.getAll.get(key, default), where&amp;nbsp;spark.conf.getAll returns a Python dict containing all the configurations. However, this problem is also present in the internal code of the &lt;/SPAN&gt;&lt;SPAN&gt;"v3.0.23" dbacademy library version I am using, which raises the [CONFIG_NOT_AVAILABLE] error when importing modules of the library (in my case, the&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;dbgems module).&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 03 Dec 2025 17:37:55 GMT</pubDate>
    <dc:creator>iFoxz17</dc:creator>
    <dc:date>2025-12-03T17:37:55Z</dc:date>
    <item>
      <title>databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/90844#M38006</link>
      <description>&lt;P&gt;I am trying to run the setup notebook&amp;nbsp;"_COMMON" for my Academy Data Engineering course,&lt;/P&gt;&lt;P&gt;and I am getting the error below: "&lt;SPAN&gt;Configuration dbacademy.deprecation.logging is not available.&lt;/SPAN&gt;"&lt;/P&gt;</description>
      <pubDate>Wed, 18 Sep 2024 08:39:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/90844#M38006</guid>
      <dc:creator>osas</dc:creator>
      <dc:date>2024-09-18T08:39:32Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/90972#M38047</link>
      <description>&lt;P&gt;Same issue here, and the AI Assistant cannot help...&lt;/P&gt;&lt;LI-CODE lang="java"&gt;JVM stacktrace:
org.apache.spark.sql.AnalysisException
	at com.databricks.sql.connect.SparkConnectConfig$.assertConfigAllowedForRead(SparkConnectConfig.scala:203)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler$RuntimeConfigWrapper.get(SparkConnectConfigHandler.scala:106)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler.transform(SparkConnectConfigHandler.scala:241)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler.$anonfun$handleGetWithDefault$1(SparkConnectConfigHandler.scala:282)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler.$anonfun$handleGetWithDefault$1$adapted(SparkConnectConfigHandler.scala:280)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler.handleGetWithDefault(SparkConnectConfigHandler.scala:280)
	at org.apache.spark.sql.connect.service.SparkConnectConfigHandler.handle(SparkConnectConfigHandler.scala:199)
	at org.apache.spark.sql.connect.service.SparkConnectService.config(SparkConnectService.scala:122)
	at org.apache.spark.connect.proto.SparkConnectServiceGrpc$MethodHandlers.invoke(SparkConnectServiceGrpc.java:805)
	at grpc_shaded.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)
	at com.databricks.spark.connect.service.AuthenticationInterceptor$AuthenticatedServerCallListener.$anonfun$onHalfClose$1(AuthenticationInterceptor.scala:312)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
	at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103)
	at com.databricks.spark.connect.service.RequestContext.$anonfun$runWith$3(RequestContext.scala:265)
	at com.databricks.spark.connect.service.RequestContext$.com$databricks$spark$connect$service$RequestContext$$withLocalProperties(RequestContext.scala:525)
	at com.databricks.spark.connect.service.RequestContext.$anonfun$runWith$2(RequestContext.scala:265)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:435)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:433)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:427)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:27)
	at com.databricks.spark.util.UniverseAttributionContextWrapper.withValue(AttributionContextUtils.scala:225)
	at com.databricks.spark.connect.service.RequestContext.$anonfun$runWith$1(RequestContext.scala:264)
	at com.databricks.spark.connect.service.RequestContext.withContext(RequestContext.scala:277)
	at com.databricks.spark.connect.service.RequestContext.runWith(RequestContext.scala:257)
	at com.databricks.spark.connect.service.AuthenticationInterceptor$AuthenticatedServerCallListener.onHalfClose(AuthenticationInterceptor.scala:312)
	at grpc_shaded.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:340)
	at grpc_shaded.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:866)
	at grpc_shaded.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at grpc_shaded.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.lang.Thread.run(Thread.java:840)
File &amp;lt;command-2080937168644028&amp;gt;, line 2
      1 import pyspark.sql.functions as F
----&amp;gt; 2 from dbacademy import dbgems
      3 from dbacademy.dbhelper import DBAcademyHelper, Paths, CourseConfig, LessonConfig
      5 # The following attributes are externalized to make them easy
      6 # for content developers to update with every new course.
File /databricks/python/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py:1988, in SparkConnectClient._handle_rpc_error(self, rpc_error)
   1985             info = error_details_pb2.ErrorInfo()
   1986             d.Unpack(info)
-&amp;gt; 1988             raise convert_exception(
   1989                 info,
   1990                 status.message,
   1991                 self._fetch_enriched_error(info),
   1992                 self._display_server_stack_trace(),
   1993             ) from None
   1995     raise SparkConnectGrpcException(status.message) from None
   1996 else:&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 18 Sep 2024 21:20:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/90972#M38047</guid>
      <dc:creator>FlorianC</dc:creator>
      <dc:date>2024-09-18T21:20:05Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/98301#M39677</link>
      <description>&lt;P&gt;Did anyone figure out what this is? I have the exact same issue:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Configuration dbacademy.deprecation.logging is not available. SQLSTATE: 42K0I&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 11 Nov 2024 01:42:57 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/98301#M39677</guid>
      <dc:creator>robtrevino</dc:creator>
      <dc:date>2024-11-11T01:42:57Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/98304#M39680</link>
      <description>&lt;P class="p1"&gt;The SQLSTATE code 42K0I corresponds to a configuration not being available. This means that the configuration key you are trying to use does not exist or is not supported in your current setup.&lt;/P&gt;
&lt;P class="p1"&gt;Please note that these files were created specifically for the Databricks Academy Labs environment (through our lab partner,&amp;nbsp;&lt;STRONG&gt;Vocareum&lt;/STRONG&gt;) and may not work as expected in other environments.&lt;/P&gt;
&lt;P class="p1"&gt;If you’re interested in working on labs in a provided Databricks environment, you can purchase the Databricks Academy Labs subscription directly from the Databricks Academy website&lt;/P&gt;
&lt;UL class="ul1"&gt;
&lt;LI class="li1"&gt;Databricks customers and the general public -&amp;nbsp;&lt;A href="http://customer-academy.databricks.com/" target="_blank"&gt;&lt;STRONG&gt;http://customer-academy.databricks.com/&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI class="li1"&gt;Databricks partners -&amp;nbsp;&lt;A href="http://partner-academy.databricks.com/" target="_blank"&gt;&lt;STRONG&gt;http://partner-academy.databricks.com/&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Mon, 11 Nov 2024 04:37:56 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/98304#M39680</guid>
      <dc:creator>MuthuLakshmi</dc:creator>
      <dc:date>2024-11-11T04:37:56Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/101010#M40513</link>
      <description>&lt;P&gt;Create a separate cluster with DBR 13.3 to resolve this error.&lt;/P&gt;</description>
      <pubDate>Thu, 05 Dec 2024 06:08:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/101010#M40513</guid>
      <dc:creator>shashi_soni</dc:creator>
      <dc:date>2024-12-05T06:08:02Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/107606#M42857</link>
      <description>&lt;P&gt;I'm using a provided Databricks environment and I've installed the &lt;SPAN&gt;&lt;STRONG&gt;dbacademy&lt;/STRONG&gt; package, which you can find on GitHub.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I get the same error; how can I use this installed package in the setup scripts?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 29 Jan 2025 14:17:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/107606#M42857</guid>
      <dc:creator>Luipiu</dc:creator>
      <dc:date>2025-01-29T14:17:04Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/107888#M42939</link>
      <description>&lt;P&gt;I reported the solution here:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.databricks.com/t5/data-engineering/setup-learning-environment-failed-configuration-dbacademy/td-p/82441" target="_blank"&gt;Setup learning environment failed: Configuration d... - Databricks Community - 82441&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 30 Jan 2025 17:26:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/107888#M42939</guid>
      <dc:creator>Luipiu</dc:creator>
      <dc:date>2025-01-30T17:26:36Z</dc:date>
    </item>
    <item>
      <title>Re: databricks academy setup error -data engineering</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/141045#M51612</link>
      <description>&lt;P&gt;Databricks is transitioning from the Community Edition to the Free Edition, which I am currently using.&lt;/P&gt;&lt;P&gt;Inspecting the code, the problem seems to be related to the spark.conf.get() method, which is declared as follows in the documentation:&lt;/P&gt;&lt;P&gt;--------------------------------------------------------------------&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;get(self, key: str, default: Union[str, NoneType, pyspark._globals._NoValueType] = &amp;lt;no value&amp;gt;) -&amp;gt; Optional[str] Parameters &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;key : str | key of the configuration to get. &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;default : str, optional | value of the configuration to get if the key does not exist.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Returns The string value of the configuration set, or None.&amp;nbsp; &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Examples : spark.conf.get("non-existent-key", "my_default")&amp;nbsp; // 'my_default'&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;--------------------------------------------------------------------&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;However, testing the method clearly shows that it raises an exception instead of returning the default value for the missing key. &lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;This seems to be solvable using&amp;nbsp;spark.conf.getAll.get(key, default), where&amp;nbsp;spark.conf.getAll returns a Python dict containing all the configurations. However, this problem is also present in the internal code of the &lt;/SPAN&gt;&lt;SPAN&gt;"v3.0.23" dbacademy library version I am using, which raises the [CONFIG_NOT_AVAILABLE] error when importing modules of the library (in my case, the&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;dbgems module).&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 03 Dec 2025 17:37:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-academy-setup-error-data-engineering/m-p/141045#M51612</guid>
      <dc:creator>iFoxz17</dc:creator>
      <dc:date>2025-12-03T17:37:55Z</dc:date>
    </item>
  </channel>
</rss>
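
The final post in the thread describes a workaround: on Spark Connect backends (e.g. Databricks Free Edition / serverless), `spark.conf.get(key, default)` can raise `[CONFIG_NOT_AVAILABLE]` for an unknown key instead of returning the default, while `spark.conf.getAll` returns a plain dict that can be queried safely. A minimal sketch of that workaround as a helper function; the name `get_conf_or_default` is ours, and it assumes the `conf` object behaves like `spark.conf` as described in the post (a `get(key, default)` method that may raise, and a `getAll` attribute holding a dict):

```python
def get_conf_or_default(conf, key, default=None):
    """Return a Spark configuration value, falling back to `default`
    without raising.

    Pass `spark.conf` as `conf`. On Spark Connect, conf.get(key, default)
    can raise [CONFIG_NOT_AVAILABLE] for keys the server refuses to read;
    conf.getAll is a plain dict of the visible configurations, so an
    ordinary dict lookup sidesteps that server-side check.
    """
    try:
        return conf.get(key, default)
    except Exception:
        # Fallback path described in the post: query the dict directly.
        return conf.getAll.get(key, default)
```

Usage would look like `get_conf_or_default(spark.conf, "dbacademy.deprecation.logging", "disabled")`; this only works around the symptom at call sites you control, so for the dbacademy library itself (which calls `spark.conf.get` internally on import) the earlier suggestion of a DBR 13.3 classic cluster remains the practical fix.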

