Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Create a UDF Table Function with DLT in UC

alexiswl
Contributor

Hello, 

I am trying to build a DLT pipeline, but I need to use a UDF table function in the process.

This is what I have so far; everything works when run without the CREATE OR REFRESH LIVE TABLE wrapper:

```sql
CREATE OR REPLACE FUNCTION silver.portal.get_workflows_from_workflow_id(workflow_id INT)
  RETURNS TABLE(libraryrun_id INT)
  RETURN
    SELECT libraryrun_id
    FROM bronze.portal.libraryrun_workflows
    WHERE workflow_id = get_workflows_from_workflow_id.workflow_id
;

CREATE OR REFRESH LIVE TABLE workflow_run_ids_by_bclconvert
  COMMENT "View all workflows triggered downstream of a bclconvert run"
  AS (
    WITH
      bclconvert_workflow_ids AS (
        SELECT id AS bclconvert_workflow_id, split(wfr_name, '__')[3] AS bclconvert_run_name
        FROM bronze.portal.workflow
        WHERE type_name = 'bcl_convert'
      ),
      workflow_runs_with_bclconvert_run_name AS (
        SELECT
          bclconvert_workflow_ids.bclconvert_workflow_id,
          bclconvert_workflow_ids.bclconvert_run_name,
          get_workflows_from_workflow_id.libraryrun_id
        FROM bclconvert_workflow_ids,
        LATERAL get_workflows_from_workflow_id(bclconvert_workflow_ids.bclconvert_workflow_id)
      )
    SELECT
      workflow_runs_with_bclconvert_run_name.bclconvert_workflow_id,
      workflow_runs_with_bclconvert_run_name.bclconvert_run_name,
      workflow_runs_with_bclconvert_run_name.libraryrun_id,
      workflow.type_name,
      workflow.portal_run_id,
      workflow.`start`,
      workflow.`end`
    FROM workflow_runs_with_bclconvert_run_name
    INNER JOIN bronze.portal.libraryrun AS libraryrun
      ON workflow_runs_with_bclconvert_run_name.libraryrun_id = libraryrun.id
    INNER JOIN bronze.portal.libraryrun_workflows AS libraryrun_workflows
      ON workflow_runs_with_bclconvert_run_name.libraryrun_id = libraryrun_workflows.libraryrun_id
    INNER JOIN bronze.portal.workflow AS workflow
      ON libraryrun_workflows.workflow_id = workflow.id
  )
```
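For reference, a UC table function can also be queried on its own, which is a quick way to confirm it works outside the pipeline. A minimal sketch (the workflow id `42` is made up for illustration):

```sql
-- Hypothetical standalone invocation of the registered table function
SELECT libraryrun_id
FROM silver.portal.get_workflows_from_workflow_id(42);
```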

However, this results in the following error:


```
[UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s): Delta Live Table operations are not supported in Unity Catalog.
```



1 ACCEPTED SOLUTION

Accepted Solutions

Hi all,

I have figured out my error.

The function creation should NOT be in the same notebook as the one used by the DLT pipeline.

Instead, create the function separately and register it in Unity Catalog, then reference it from the DLT pipeline.

As a side note, UDF support for DLT requires DBR 13.3 or above. At the time of writing, you will need to set the pipeline channel to 'preview' rather than 'current' to enable UDFs in DLT.
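To make the split concrete, here is a sketch using the names from this thread (the exact pipeline wiring and the simplified query are assumptions, not a verified setup):

```sql
-- Notebook/script 1 — run OUTSIDE the DLT pipeline (e.g. from a regular
-- notebook or SQL warehouse) to register the function in Unity Catalog:
CREATE OR REPLACE FUNCTION silver.portal.get_workflows_from_workflow_id(workflow_id INT)
  RETURNS TABLE(libraryrun_id INT)
  RETURN
    SELECT libraryrun_id
    FROM bronze.portal.libraryrun_workflows
    WHERE workflow_id = get_workflows_from_workflow_id.workflow_id;

-- Notebook 2 — attached to the DLT pipeline; contains ONLY DLT statements
-- and references the already-registered UC function:
CREATE OR REFRESH LIVE TABLE workflow_run_ids_by_bclconvert
  COMMENT "View all workflows triggered downstream of a bclconvert run"
  AS
    SELECT b.bclconvert_workflow_id, f.libraryrun_id
    FROM (
      SELECT id AS bclconvert_workflow_id
      FROM bronze.portal.workflow
      WHERE type_name = 'bcl_convert'
    ) AS b,
    LATERAL silver.portal.get_workflows_from_workflow_id(b.bclconvert_workflow_id) AS f;
```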



3 REPLIES

shan_chandra
Databricks Employee

@alexiswl - could you please use CREATE OR REPLACE FUNCTION instead of CREATE OR REFRESH LIVE TABLE?

Hi @shan_chandra, unfortunately this is in the context of creating a DLT, so if I update as suggested, I get the following error:

```

org.apache.spark.sql.AnalysisException: The following code at '/Repos/services+databricks@umccr.org/data-portal-analytics/delta_live_tables/silver/silver_portal_dlt' is not supported in a DLT pipeline: CREATE OR REPLACE FUNCTION silver.portal.get_workf... Please correct your code and make sure you only have DLT statements. For reference: DLT currently accepts 'CREATE MATERIALIZED VIEW', 'CREATE STREAMING TABLE', 'APPLY CHANGES INTO', and 'SET' statements.

```

 


