03-10-2023 11:45 AM
// Azure SQL DB Spark connector imports
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
import com.microsoft.azure.sqldb.spark.query._

// Custom statement to run against the database
val query = "TRUNCATE TABLE tablename"

// Connection details are read from an Azure Key Vault-backed secret scope
val config = Config(Map(
  "url"          -> dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-URL"),
  "databaseName" -> dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-DBName"),
  "user"         -> dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-Username"),
  "password"     -> dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-Password"),
  "queryCustom"  -> query
))

// Execute the custom query through the connector
sqlContext.sqlDBQuery(config)
While executing the above code on a cluster with Databricks Runtime 12.0, I am getting a NoClassDefFoundError.
03-13-2023 12:22 AM
@Someswara Durga Prasad Yaralgadda :
A NoClassDefFoundError occurs when a class that was available at compile time cannot be found at runtime. This is usually caused by a missing dependency, or by an incompatible version of a dependency, on the cluster.
In your case, the error appears to be related to the Azure SQL DB Spark connector library (com.microsoft.azure.sqldb.spark): the missing class is most likely part of that library or one of its dependencies. Installing the connector JAR, together with its dependent JARs, as a cluster library should resolve the issue.
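As a quick sanity check (this is only a sketch, not part of the original steps), you can verify at runtime whether the connector class is visible on the cluster classpath; if the lookup throws, the JAR is not attached:

// Hypothetical check: confirm the connector class is on the classpath.
// If this throws ClassNotFoundException, the connector JAR is not installed
// on the cluster, which would also explain the NoClassDefFoundError.
try {
  Class.forName("com.microsoft.azure.sqldb.spark.config.Config")
  println("Azure SQL DB Spark connector is available on the classpath")
} catch {
  case _: ClassNotFoundException =>
    println("Connector JAR not found - install it as a cluster library")
}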
03-21-2023 03:10 AM
Yes @Suteja, by adding the additional JAR files I was able to resolve this issue, but I then hit errors while running read and write operations against the SQL DB. I found different code to execute the TRUNCATE command, and that is working fine now.
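For reference, a minimal sketch of running such a statement over plain JDBC instead of the connector (this is an illustration, not necessarily the exact code used above; the secret scope and key names are reused from the original post, and DW-URL is assumed to hold the server host name):

import java.sql.DriverManager

// Pull connection details from the same Key Vault-backed secret scope
val server   = dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-URL")
val dbName   = dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-DBName")
val user     = dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-Username")
val password = dbutils.secrets.get(scope = "azurekeyvault-scope", key = "DW-Password")

// Assumes DW-URL is a host name such as myserver.database.windows.net
val connectionUrl =
  s"jdbc:sqlserver://$server;databaseName=$dbName;user=$user;password=$password"

val connection = DriverManager.getConnection(connectionUrl)
try {
  val stmt = connection.createStatement()
  // Execute the TRUNCATE statement directly over JDBC
  stmt.executeUpdate("TRUNCATE TABLE tablename")
  stmt.close()
} finally {
  connection.close()
}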
09-12-2023 07:52 AM - edited 09-12-2023 08:15 AM
Hi @YSDPrasad, could you please let me know which additional JAR files need to be installed to resolve this?