- 5359 Views
- 6 replies
- 0 kudos
SELECT position_no, position_function, work_function, job_profile_id, pos_cat as position_category, pos_cat_desc, job_posting_title as pos_title, employee_status as emp_status, emp_status_desc, clevel, subs...
by
DAUR
• New Contributor
- 1367 Views
- 2 replies
- 2 kudos
Hello, I'm using Unity Catalog managed tables and Unity Catalog DLTs. I urgently need to use a UDF in DBSQL because PySpark UDFs don't work in UC. I found this way to create a SQL function that uses the Python language inside: CREATE FUNCTION redact(a STRING)
...
Latest Reply
It will be available soon in public preview. You will need a Pro SQL Warehouse or a Serverless SQL Warehouse to use Python UDFs in Databricks SQL.
1 More Replies
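The thread's `CREATE FUNCTION redact(a STRING)` statement is cut off, so the actual body is unknown. As an illustrative sketch only: in a Databricks SQL Python UDF the body between the `AS $$ ... $$` markers is plain Python that ends in a `return`. A standalone Python version of a hypothetical redaction rule (masking digits is an assumption, not from the thread) might look like this:

```python
import re

def redact(a: str) -> str:
    """Hypothetical redaction rule: replace every digit with '*'.

    In a Databricks SQL Python UDF, roughly this body (the import
    plus the return statement) would sit between AS $$ and $$.
    """
    return re.sub(r"\d", "*", a)

print(redact("card 4111-1111"))  # prints "card ****-****"
```

Testing the logic as a regular Python function first, as above, is an easy way to debug before wrapping it in the SQL UDF syntax.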
- 4871 Views
- 9 replies
- 4 kudos
Hi all, there is no official doc with a step-by-step process to enable Unity Catalog in Azure Databricks. If anyone has created a doc or has the process, please share it here. Thanks in advance.
Latest Reply
I recorded a video showing how to set up Unity Catalog step by step: https://www.youtube.com/watch?v=M2Et5aBj2aw
8 More Replies
by
oteng
• New Contributor III
- 2125 Views
- 2 replies
- 0 kudos
Hi, I am not able to access the schemas inside the hive_metastore in the schema browser in the SQL Editor in Databricks SQL. I tried on 2 Databricks instances. One has Unity Catalog enabled, and I am able to access the other catalogs/schemas but no...
Latest Reply
This is something weird; can we connect and work through your issue?
1 More Replies
by
129876
• New Contributor III
- 1275 Views
- 2 replies
- 2 kudos
I'm unable to run any command that queries data from Unity Catalog within a function that executes on an ipywidget button click. Code block below. I cannot run queries such as spark.sql(f"SHOW SCHEMAS;") or spark.sql(f"select * from d...
Latest Reply
Atanu
Esteemed Contributor
Can you try to println these out?
val databricksApiTokenKey = CredentialContext.INHERITED_PROPERTY_DATABRICKS_API_TOKEN
val databricksApiCredentialOpt = CredentialContext.getCredential(databricksApiTokenKey)
val rawUrlProp = spark.sparkContext.get...
1 More Replies
- 4910 Views
- 7 replies
- 5 kudos
I have created a premium cluster on Azure. There was no problem in Data Science and Engineering (DSAE) while I was binding the PostgreSQL Hive metastore. I did all the settings via Global Init Scripts from the Admin Console. However, when I try to adjust...
Latest Reply
Found the solution:
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:postgresql://#########.postgres.database.azure.com:5432/databricks_metastore?ssl=true
needs to be
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:postgresql://#########.postgres.databas...
6 More Replies
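The corrected ConnectionURL in the reply above is truncated, so the exact fix is not recoverable here. For orientation only, a minimal sketch of what an external PostgreSQL Hive metastore binding typically looks like in a cluster's Spark config or a Global Init Script; the host, database name, credentials, and metastore version below are placeholders and assumptions, not values from this thread:

```properties
# Hypothetical external Hive metastore config (placeholders, not thread values)
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:postgresql://<server>.postgres.database.azure.com:5432/<metastore_db>?ssl=true
spark.hadoop.javax.jdo.option.ConnectionDriverName org.postgresql.Driver
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <password>
# Hive client version must match the metastore schema; 2.3.9 pairs
# with Databricks' built-in client jars (assumption for this sketch)
spark.sql.hive.metastore.version 2.3.9
spark.sql.hive.metastore.jars builtin
```

A common failure mode with Azure Database for PostgreSQL is omitting `?ssl=true` (or the equivalent `sslmode` parameter) from the JDBC URL, since the server enforces SSL by default, which matches the shape of the fix quoted above.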