Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

jakubk
by Contributor
  • 14905 Views
  • 7 replies
  • 4 kudos

Unity Catalog - spark.* functions throwing Py4JSecurityException - org.apache.spark.sql.internal.CatalogImpl.currentCatalog() is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl

I'm looking to migrate onto Unity Catalog, but a number of my data ingestion notebooks throw SecurityException/whitelist errors for numerous spark.* functions. Is there some configuration setting I need to enable to whitelist the spark.* methods/functi...
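This exception typically appears on Table ACL / shared access mode clusters, where py4j calls into non-whitelisted internal classes such as CatalogImpl are blocked. A hedged workaround sketch, assuming a Databricks notebook where spark is predefined, is to fall back to the SQL equivalents via spark.sql; the catalog, schema and table names below are illustrative.

# Sketch: SQL equivalents of spark.catalog.* calls that hit the py4j whitelist
# on Table ACL / shared access mode clusters. Names are illustrative.

# Instead of spark.catalog.currentCatalog():
current_catalog = spark.sql("SELECT current_catalog()").first()[0]

# Instead of spark.catalog.listDatabases():
schemas = [row[0] for row in spark.sql("SHOW SCHEMAS IN my_catalog").collect()]

# Instead of spark.catalog.tableExists(...):
table_exists = spark.sql(
    "SHOW TABLES IN my_catalog.my_schema LIKE 'my_table'"
).count() > 0

print(current_catalog, schemas, table_exists)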

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Jakub K, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest provid...

6 More Replies
HemanthRatakond
by New Contributor II
  • 1988 Views
  • 2 replies
  • 2 kudos

Reading an Athena table created on top of S3 in Databricks

Hi, we have Databricks set up to use the AWS Glue catalog as the metastore. I am trying to read an Athena table which is created on top of S3, and I am getting the following error: com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: java.lang.RuntimeExc...
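Errors raised from TableDesc.getDeserializerClass usually mean the Glue/Athena table definition references a SerDe class that is not on the Databricks cluster classpath. One hedged workaround, assuming the underlying files are in a format Spark reads natively, is to bypass the Hive reader and load the S3 data directly; the bucket, prefix and format below are illustrative, and spark/display are predefined in a Databricks notebook.

# Sketch: read the Athena table's underlying S3 files with the native reader
# instead of going through the Glue/Hive table definition.
df = (
    spark.read
    .format("parquet")  # or "csv" / "json", matching how the data is stored
    .load("s3://my-bucket/path/to/athena_table_data/")
)
df.createOrReplaceTempView("athena_table_direct")
display(spark.sql("SELECT * FROM athena_table_direct LIMIT 10"))

A longer-term fix is usually to install the missing SerDe jar as a cluster library so the Glue table definition can be read as-is.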

Latest Reply
HemanthRatakond
New Contributor II
  • 2 kudos

@Daniel Sahal
at org.apache.hadoop.hive.ql.plan.TableDesc.getDeserializerClass(TableDesc.java:79)
at org.apache.spark.sql.hive.execution.HiveTableScanExec.addColumnMetadataToConf(HiveTableScanExec.scala:127)
at org.apache.spark.sql.hive.execution.Hi...

1 More Replies
Jitu
by New Contributor II
  • 35716 Views
  • 6 replies
  • 3 kudos
Latest Reply
Chaitanya_Raju
Honored Contributor
  • 3 kudos

@Jog Giri I also recently encountered a similar scenario; the below code solved my purpose without any issues: import zipfile for i in dbutils.fs.ls('/mnt/zipfilespath/'): with zipfile.ZipFile(i.path.replace('dbfs:','/dbfs'), mode="r") as zip_ref:...
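The snippet in this reply is cut off by the forum preview. A complete, hedged version of that kind of unzip loop might look like the following, where the mount point and extraction directory are illustrative and dbutils is the object predefined in a Databricks notebook.

import zipfile

src_dir = "/mnt/zipfilespath/"            # illustrative mount point
dst_dir = "/dbfs/mnt/zipfilespath/out/"   # illustrative extraction directory

for f in dbutils.fs.ls(src_dir):
    if not f.name.endswith(".zip"):
        continue
    # dbutils.fs returns dbfs:/ URIs; zipfile needs the local /dbfs FUSE path
    local_path = f.path.replace("dbfs:", "/dbfs")
    with zipfile.ZipFile(local_path, mode="r") as zip_ref:
        zip_ref.extractall(dst_dir)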

5 More Replies
amitca71
by Contributor II
  • 26975 Views
  • 4 replies
  • 2 kudos

DeltaTable.forPath(spark, path) doesn't recognize the table

Hi, I've been working with Unity Catalog for the last week. I'm referring to a Delta table by path, as follows: path='s3://<my_bucket_name>/silver/data/<table_name>'; DeltaTable.forPath(spark, path). I get an exception that it "is not a Delta table". Using the table na...
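With Unity Catalog the usual advice is to reference the table by its three-level name rather than by its storage path, since managed tables are not meant to be addressed through their S3 location. A minimal sketch, with illustrative catalog/schema/table names:

from delta.tables import DeltaTable

# Preferred with Unity Catalog: the three-level name
dt = DeltaTable.forName(spark, "my_catalog.silver.my_table")
dt.toDF().show(5)

# forPath only resolves when the path is the exact root of a Delta table the
# cluster is allowed to read (e.g. an external table backed by an external
# location). Hypothetical path:
# dt = DeltaTable.forPath(spark, "s3://my-bucket/silver/data/my_table")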

Latest Reply
amitca71
Contributor II
  • 2 kudos

It's even more weird: in one of the next cells it doesn't... see below. On the older version, even by name it doesn't work.

3 More Replies
129876
by New Contributor III
  • 2151 Views
  • 2 replies
  • 2 kudos

Unable to run spark sql commands from ipywidget button click event

I'm unable to run any command that queries data from Unity Catalog within a function that executes on an ipywidget button click event. Code block below. I cannot do queries such as spark.sql(f"SHOW SCHEMAS;") or spark.sql(f"select * from d...
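ipywidget callbacks run outside the notebook's command execution context, which is typically why Unity Catalog queries fail inside an on_click handler. A hedged alternative is to drive the selection with Databricks widgets so the query runs in the main command flow; the catalog and schema names below are illustrative.

# Sketch: use a Databricks widget instead of an ipywidget button so spark.sql
# executes with the notebook's Unity Catalog credentials. Names are illustrative.
dbutils.widgets.dropdown("target_schema", "default",
                         ["default", "bronze", "silver"], "Schema")

schema = dbutils.widgets.get("target_schema")
display(spark.sql(f"SHOW TABLES IN my_catalog.{schema}"))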

Latest Reply
Atanu
Databricks Employee
  • 2 kudos

Can you try to println it out? val databricksApiTokenKey = CredentialContext.INHERITED_PROPERTY_DATABRICKS_API_TOKEN val databricksApiCredentialOpt = CredentialContext.getCredential(databricksApiTokenKey) val rawUrlProp = spark.sparkContext.get...

1 More Replies