<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Java - FAILED_WITH_ERROR when saving to snowflake in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/java-failed-with-error-when-saving-to-snowflake/m-p/48938#M28429</link>
    <description>&lt;P&gt;Found the problem.&lt;BR /&gt;The sub-roles didn't have grants on the warehouse.&lt;/P&gt;&lt;P&gt;I hope this helps someone one day &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 11 Oct 2023 11:01:20 GMT</pubDate>
    <dc:creator>orso</dc:creator>
    <dc:date>2023-10-11T11:01:20Z</dc:date>
    <item>
      <title>Java - FAILED_WITH_ERROR when saving to snowflake</title>
      <link>https://community.databricks.com/t5/data-engineering/java-failed-with-error-when-saving-to-snowflake/m-p/48841#M28397</link>
      <description>&lt;P&gt;I'm trying to move data from database A to B on Snowflake. There's no permission issue, since the same operation works with the Python package snowflake.connector.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Databricks runtime version:&amp;nbsp;12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12)&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Inserting into database B fails with the code below; the same code works when inserting into database A.&lt;/SPAN&gt;&lt;/P&gt;&lt;DIV&gt;&lt;DIV&gt;data.write \&lt;BR /&gt;.format('net.snowflake.spark.snowflake') \&lt;BR /&gt;.options({...}) \&lt;BR /&gt;.option('dbtable', 'MY_TABLE') \&lt;BR /&gt;.mode('Append') \&lt;BR /&gt;.save()&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;&lt;SPAN&gt;Error:&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;DIV&gt;java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. Results not generated.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Py4JJavaError&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;Traceback (most recent call last)&lt;/DIV&gt;&lt;DIV&gt;File &amp;lt;command-4230377357719442&amp;gt;:1&lt;/DIV&gt;&lt;DIV&gt;----&amp;gt; 1 snowflake_source_provider.save_table(a, schema, table_name, save_query_context)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File &amp;lt;command-2628261293911194&amp;gt;:45, in SnowflakeProvider.save_table(self, data, schema, table_name, query_context)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;44 def save_table(self, data, schema, table_name, query_context):&lt;/DIV&gt;&lt;DIV&gt;---&amp;gt; 45&amp;nbsp; &amp;nbsp; &amp;nbsp;data.write \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;46&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;.format(self.SNOWFLAKE_FORMAT) \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;47&amp;nbsp; &amp;nbsp;
&amp;nbsp; &amp;nbsp; &amp;nbsp;.options(**self._get_connection_properties(schema)) \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;48&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;.option('dbtable', table_name) \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;49&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;.option('QUERY_TAG', self._get_query_tag(query_context)) \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;50&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;.mode('Append') \&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;51&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;.save()&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /databricks/spark/python/pyspark/instrumentation_utils.py:48, in _wrap_function.&amp;lt;locals&amp;gt;.wrapper(*args, **kwargs)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;46 start = time.perf_counter()&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;47 try:&lt;/DIV&gt;&lt;DIV&gt;---&amp;gt; 48&amp;nbsp; &amp;nbsp; &amp;nbsp;res = func(*args, **kwargs)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;49&amp;nbsp; &amp;nbsp; &amp;nbsp;logger.log_success(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;50&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;module_name, class_name, function_name, time.perf_counter() - start, signature&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;51&amp;nbsp; &amp;nbsp; &amp;nbsp;)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;52&amp;nbsp; &amp;nbsp; &amp;nbsp;return res&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /databricks/spark/python/pyspark/sql/readwriter.py:1395, in DataFrameWriter.save(self, path, format, mode, partitionBy, **options)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1393&amp;nbsp; &amp;nbsp; &amp;nbsp;self.format(format)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1394 if path is None:&lt;/DIV&gt;&lt;DIV&gt;-&amp;gt; 1395&amp;nbsp; &amp;nbsp; 
&amp;nbsp;self._jwrite.save()&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1396 else:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1397&amp;nbsp; &amp;nbsp; &amp;nbsp;self._jwrite.save(path)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1315 command = proto.CALL_COMMAND_NAME +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1316&amp;nbsp; &amp;nbsp; &amp;nbsp;self.command_header +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1317&amp;nbsp; &amp;nbsp; &amp;nbsp;args_command +\&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1318&amp;nbsp; &amp;nbsp; &amp;nbsp;proto.END_COMMAND_PART&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1320 answer = self.gateway_client.send_command(command)&lt;/DIV&gt;&lt;DIV&gt;-&amp;gt; 1321 return_value = get_return_value(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1322&amp;nbsp; &amp;nbsp; &amp;nbsp;answer, self.gateway_client, self.target_id, self.name)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1324 for temp_arg in temp_args:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp;1325&amp;nbsp; &amp;nbsp; &amp;nbsp;temp_arg._detach()&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /databricks/spark/python/pyspark/errors/exceptions.py:228, in capture_sql_exception.&amp;lt;locals&amp;gt;.deco(*a, **kw)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 226 def deco(*a: Any, **kw: Any) -&amp;gt; Any:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 227&amp;nbsp; &amp;nbsp; &amp;nbsp;try:&lt;/DIV&gt;&lt;DIV&gt;--&amp;gt; 228&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;return f(*a, **kw)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 229&amp;nbsp; &amp;nbsp; &amp;nbsp;except Py4JJavaError as e:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 230&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;converted = 
convert_exception(e.java_exception)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;File /databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 325 if answer[1] == REFERENCE_TYPE:&lt;/DIV&gt;&lt;DIV&gt;--&amp;gt; 326&amp;nbsp; &amp;nbsp; &amp;nbsp;raise Py4JJavaError(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 327&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;"An error occurred while calling {0}{1}{2}.\n".&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 328&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;format(target_id, ".", name), value)&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 329 else:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 330&amp;nbsp; &amp;nbsp; &amp;nbsp;raise Py4JError(&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 331&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;"An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp; &amp;nbsp; 332&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;format(target_id, ".", name, value))&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Py4JJavaError: An error occurred while calling o3841.save.&lt;/DIV&gt;&lt;DIV&gt;: java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. 
Results not generated.&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.client.jdbc.SFAsyncResultSet.getRealResults(SFAsyncResultSet.java:128)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.client.jdbc.SFAsyncResultSet.getMetaData(SFAsyncResultSet.java:265)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.io.StageWriter$.executeCopyIntoTable(StageWriter.scala:603)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.io.StageWriter$.writeToTableWithStagingTable(StageWriter.scala:471)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.io.StageWriter$.writeToTable(StageWriter.scala:299)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.io.StageWriter$.writeToStage(StageWriter.scala:238)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.io.package$.writeRDD(package.scala:106)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.SnowflakeWriter.save(SnowflakeWriter.scala:91)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at net.snowflake.spark.snowflake.DefaultSource.createRelation(DefaultSource.scala:156)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:49)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:82)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:91)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:238)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:165)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:238)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:233)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:417)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:178)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1038)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:128)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:367)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:237)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:220)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:233)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:226)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:519)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:106)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:519)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:316)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:312)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:495)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:226)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:372)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:226)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:180)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:171)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:287)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:964)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:429)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:396)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:258)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at 
py4j.Gateway.invoke(Gateway.java:306)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.commands.CallCommand.execute(CallCommand.java:79)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at py4j.ClientServerConnection.run(ClientServerConnection.java:115)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;DIV&gt;&lt;SPAN&gt;at java.lang.Thread.run(Thread.java:750)&lt;/SPAN&gt;&lt;/DIV&gt;&lt;/DIV&gt;&lt;DIV&gt;Any idea is appreciated&lt;/DIV&gt;&lt;/DIV&gt;</description>
      <pubDate>Tue, 10 Oct 2023 10:14:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-failed-with-error-when-saving-to-snowflake/m-p/48841#M28397</guid>
      <dc:creator>orso</dc:creator>
      <dc:date>2023-10-10T10:14:01Z</dc:date>
    </item>
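The `.options({...})` call in the question elides the actual connection properties. As a minimal sketch (not from the original post), the dictionary passed there would typically use the Snowflake Spark connector's standard `sf*` option keys; the account, credential, warehouse, and role values below are placeholders:

```python
def snowflake_options(database: str, schema: str, role: str) -> dict:
    """Build the options dict passed to DataFrameWriter.options(**...) for a
    Spark write to Snowflake. All concrete values here are placeholders."""
    return {
        "sfURL": "myaccount.snowflakecomputing.com",  # placeholder account URL
        "sfUser": "my_user",                          # placeholder credentials
        "sfPassword": "********",
        "sfDatabase": database,
        "sfSchema": schema,
        "sfWarehouse": "MY_WH",   # the warehouse the COPY INTO will run on
        "sfRole": role,           # the role the write executes as
    }

# The write into database B executes as whatever role sfRole resolves to.
# If that role lacks USAGE on sfWarehouse, the connector's COPY INTO fails
# with FAILED_WITH_ERROR even though the same role can read elsewhere.
opts = snowflake_options("DATABASE_B", "PUBLIC", "SUB_ROLE_B")
```

This is why the plain snowflake.connector test in the question can succeed while the Spark write fails: the two paths may run under different roles and warehouses.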
    <item>
      <title>Re: Java - FAILED_WITH_ERROR when saving to snowflake</title>
      <link>https://community.databricks.com/t5/data-engineering/java-failed-with-error-when-saving-to-snowflake/m-p/48938#M28429</link>
      <description>&lt;P&gt;Found the problem.&lt;BR /&gt;The sub-roles didn't have grants on the warehouse.&lt;/P&gt;&lt;P&gt;I hope this helps someone one day &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Oct 2023 11:01:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/java-failed-with-error-when-saving-to-snowflake/m-p/48938#M28429</guid>
      <dc:creator>orso</dc:creator>
      <dc:date>2023-10-11T11:01:20Z</dc:date>
    </item>
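The fix in the reply — giving the sub-roles grants on the warehouse — can be sketched as follows. This is a hypothetical helper, not from the thread; it just builds the standard Snowflake `GRANT USAGE ON WAREHOUSE` statements an admin would run (role and warehouse names are placeholders):

```python
def warehouse_grants(warehouse: str, roles: list[str]) -> list[str]:
    """Build the GRANT statements needed so each role can use the warehouse."""
    return [
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};"
        for role in roles
    ]

# An admin would execute these (e.g. via snowflake.connector or a Snowflake
# worksheet) before retrying the Spark write.
stmts = warehouse_grants("MY_WH", ["SUB_ROLE_A", "SUB_ROLE_B"])
```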
  </channel>
</rss>

