- 14905 Views
- 7 replies
- 4 kudos
Unity Catalog - spark.* functions throwing Py4JSecurityException - org.apache.spark.sql.internal.CatalogImpl.currentCatalog() is not whitelisted on class class org.apache.spark.sql.internal.CatalogImpl
I'm looking to migrate onto Unity Catalog, but a number of my data ingestion notebooks throw SecurityException/whitelist errors for numerous spark.* functions. Is there some configuration setting I need to enable to whitelist the spark.* methods/functi...
Hi @Jakub K, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest provid...
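For reference, on Unity Catalog clusters in shared access mode many spark.catalog.* helpers are blocked because their underlying JVM methods are not on the Py4J allowlist. A minimal workaround sketch, assuming a runtime where the current_catalog() SQL function is available, is to route the same lookup through SQL:

```python
# Direct JVM call that can raise Py4JSecurityException on shared-access-mode
# clusters (CatalogImpl.currentCatalog() is not whitelisted):
#   spark.catalog.currentCatalog()

# Workaround sketch: the SQL path is not subject to the Py4J allowlist.
current_catalog = spark.sql("SELECT current_catalog()").first()[0]
print(current_catalog)

# The same pattern works for other blocked helpers, e.g. listing schemas:
schemas = [row[0] for row in spark.sql("SHOW SCHEMAS").collect()]
```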
- 1988 Views
- 2 replies
- 2 kudos
Reading Athena table created on top of S3 in Databricks
Hi, we have Databricks using the AWS Glue Catalog as metastore. I am trying to read an Athena table created on top of S3, and I am getting the following error: com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: java.lang.RuntimeExc...
@Daniel Sahal
at org.apache.hadoop.hive.ql.plan.TableDesc.getDeserializerClass(TableDesc.java:79)
at org.apache.spark.sql.hive.execution.HiveTableScanExec.addColumnMetadataToConf(HiveTableScanExec.scala:127)
at org.apache.spark.sql.hive.execution.Hi...
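The truncated stack trace above fails inside TableDesc.getDeserializerClass, which usually means the Hive SerDe class recorded in the Glue table definition is not on the Databricks cluster classpath. A hedged workaround sketch, assuming the Athena table's underlying files are Parquet (the S3 location is a placeholder), is to bypass the SerDe lookup and read the files directly:

```python
# Read the table's files straight from its S3 LOCATION instead of going
# through the Glue/Hive SerDe. Path and format are assumptions; check
# DESCRIBE EXTENDED <table> or the Glue console for the real values.
df = spark.read.parquet("s3://my-bucket/path/to/athena-table/")
df.createOrReplaceTempView("athena_table_direct")
spark.sql("SELECT count(*) FROM athena_table_direct").show()
```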
- 35716 Views
- 6 replies
- 3 kudos
I have to read a zipped CSV file using Spark without unzipping it. Can anyone please provide PySpark/Spark SQL code for that?
Zipped CSV files are arriving in the S3 raw layer.
@Jog Giri I also recently encountered a similar scenario; the code below solved my purpose without any issues.
import zipfile
for i in dbutils.fs.ls('/mnt/zipfilespath/'):
    with zipfile.ZipFile(i.path.replace('dbfs:', '/dbfs'), mode="r") as zip_ref:
        ...
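To round off the truncated reply above, here is a self-contained sketch of the unzip-then-read flow in a Databricks notebook; the mount point, extract directory, and CSV options are assumptions:

```python
import zipfile

src_dir = "/mnt/zipfilespath/"                # assumed mount of the S3 raw layer
dst_dir = "/dbfs/mnt/zipfilespath/unzipped/"  # assumed extract target

# dbutils.fs returns 'dbfs:' URIs, while zipfile needs the local FUSE
# mount, hence the 'dbfs:' -> '/dbfs' replacement.
for f in dbutils.fs.ls(src_dir):
    if f.path.endswith(".zip"):
        with zipfile.ZipFile(f.path.replace("dbfs:", "/dbfs"), mode="r") as zip_ref:
            zip_ref.extractall(dst_dir)

# Read the extracted CSVs back through Spark (header option assumed).
df = spark.read.option("header", "true").csv("dbfs:/mnt/zipfilespath/unzipped/")
```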
- 26975 Views
- 4 replies
- 2 kudos
DeltaTable.forPath(spark, path) doesn't recognize table
Hi, I'm working with Unity Catalog for the last week. I'm referring to a Delta table by path, as follows:
path = 's3://<my_bucket_name>/silver/data/<table_name>'
DeltaTable.forPath(spark, path)
I get an exception that it "is not a Delta table". Using the table na...
It's even more weird: in one of the next cells it doesn't... see below. On the older version it doesn't work even by name.
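For anyone hitting the same error: Unity Catalog managed tables keep their data under a metastore-managed location, so path-based access with DeltaTable.forPath often fails; the supported route is the three-level name via DeltaTable.forName. A minimal sketch (catalog, schema, and table names are placeholders):

```python
from delta.tables import DeltaTable

# Path-based access is for external tables whose LOCATION you control:
dt_external = DeltaTable.forPath(spark, "s3://<my_bucket_name>/silver/data/<table_name>")

# For Unity Catalog managed tables, reference by three-level name instead:
dt_managed = DeltaTable.forName(spark, "my_catalog.silver.my_table")
```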
- 2151 Views
- 2 replies
- 2 kudos
Unable to run Spark SQL commands from an ipywidget button click event
I'm unable to run any command that queries data from the Unity Catalog within a function that executes on an ipywidget button click. Code block below. I cannot do queries such as spark.sql(f"SHOW SCHEMAS;") or spark.sql(f"select * from d...
Can you try to println it out?
val databricksApiTokenKey = CredentialContext.INHERITED_PROPERTY_DATABRICKS_API_TOKEN
val databricksApiCredentialOpt = CredentialContext.getCredential(databricksApiTokenKey)
val rawUrlProp = spark.sparkContext.get...
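For context, here is a minimal reproduction sketch of the pattern the question describes; the ipywidgets wiring below is standard, and the spark.sql call inside the callback is exactly the piece the thread reports as failing against Unity Catalog:

```python
import ipywidgets as widgets
from IPython.display import display

out = widgets.Output()
button = widgets.Button(description="Run query")

def on_click(_):
    with out:
        # This is the call that fails on button click in the thread:
        spark.sql("SHOW SCHEMAS").show()

button.on_click(on_click)
display(button, out)
```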