Hey,
I'm running into a weird issue with the following code:
def getDf(query, preamble_sql=None):
    jdbc_url = f"jdbc:oracle:thin:@//{host}:{port}/{service_name}"
    request = spark.read \
        .format("jdbc") \
        .option("driver", "oracle.jdbc.driver.OracleDriver") \
        .option("url", jdbc_url) \
        .option("query", query) \
        .option("user", username) \
        .option("password", password)
    if preamble_sql is not None:
        request = request.option("sessionInitStatement", preamble_sql)
    df = request.load()
    return df
display(getDf(
    "SELECT sys_context('USERENV','CON_NAME') AS container_name FROM dual",
    "ALTER SESSION SET CONTAINER = CDB$ROOT"
))
The query returns "CDB$ROOT" as expected.
I would assume that I can now run statements on the root container. But when I switch the statement to the required LogMiner query:
display(getDf(
    "SELECT * FROM SYS.V_$ARCHIVED_LOG",
    "ALTER SESSION SET CONTAINER = CDB$ROOT"
))
I'm getting a java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist.
This query works perfectly fine in DBeaver though.
My best guess is that Spark runs some pre-check, for example resolving the table schema, which would explain why the first query works.
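If I read the Spark JDBC source correctly, that pre-check resolves the schema by wrapping the user query in a subquery with a WHERE 1=0 predicate, so Oracle parses and validates it without returning rows. A rough sketch of what I think the probe looks like (the alias name is illustrative, not taken from my logs):

```python
# Sketch of the zero-row probe I believe Spark's JDBC reader issues to
# resolve the schema before loading any data. The alias format is an
# assumption based on reading the Spark source, not confirmed output.
def schema_probe(query: str, alias: str = "SPARK_GEN_SUBQ_0") -> str:
    # The user query becomes an inline view; WHERE 1=0 makes Oracle
    # validate the statement (and raise ORA-00942 if the object is not
    # visible in the current container) without fetching rows.
    return f"SELECT * FROM ({query}) {alias} WHERE 1=0"

print(schema_probe("SELECT * FROM SYS.V_$ARCHIVED_LOG"))
```

So if that probe runs on a connection where the sessionInitStatement has not taken effect, the view would be missing and ORA-00942 would match what I see.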
So is there any way to make this work?
Best regards
Samuel