guangyi
Contributor II
since 06-20-2024
a month ago

User Stats

  • 27 Posts
  • 2 Solutions
  • 1 Kudos given
  • 6 Kudos received

User Activity

Here is how I define the UDF inside the file udf_define.py:

    from pyspark.sql.functions import length, udf
    from pyspark.sql.types import IntegerType
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    def strlen(s):
        ret...
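The preview is cut off after `ret...`; below is a minimal sketch of how such a udf_define.py file might look. The body of strlen, the NULL guard, and the registration step are assumptions added for illustration, not the original code.

```python
# udf_define.py -- a minimal sketch; the original snippet is truncated,
# so the function body and the registration below are assumed.
from pyspark.sql.functions import udf
from pyspark.sql.types import IntegerType
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def strlen(s):
    # Assumed completion: return the length of the input string,
    # guarding against NULL values.
    return len(s) if s is not None else None

# Wrap the plain Python function as a Spark UDF so it can be imported
# from other modules and applied to DataFrame columns.
strlen_udf = udf(strlen, IntegerType())

# Optionally register it so it is also callable from SQL.
spark.udf.register("strlen", strlen, IntegerType())
```

With this layout, a notebook could do something like `from udf_define import strlen_udf` and then `df.select(strlen_udf("name"))`, assuming the module is on the Python path.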
I am trying to follow the instructions in Monitor Delta Live Tables pipelines to query the DLT expectation log. Here is the simple code version I copied from the Querying the event log section:

    CREATE TEMPORARY LIVE VIEW event_log_raw AS (
      SELECT * FROM event_log(T...
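For comparison, a hedged sketch of querying the same event log from a regular notebook rather than from inside the pipeline, assuming the `event_log` table-valued function is available in the workspace and `<pipeline_id>` is a placeholder for a real pipeline ID:

```python
# A sketch only: assumes the event_log() table-valued function is available
# and that <pipeline_id> is replaced with the actual DLT pipeline ID.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Pull the raw event log for one pipeline into a DataFrame.
event_log_raw = spark.sql("SELECT * FROM event_log('<pipeline_id>')")

# Expectation results are recorded on flow_progress events, so a first
# look usually filters on event_type.
event_log_raw.filter("event_type = 'flow_progress'").show(truncate=False)
```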
I need a DLT pipeline to create a materialized view for fetching event logs. All of the approaches below failed:
  • Attaching a notebook containing pure SQL (no magic cells like `%sql`): failed
  • Attaching a notebook with `spark.sql` Python code: failed beca...
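For reference, the usual way to declare a materialized view in a Python DLT notebook is the `@dlt.table` decorator. The sketch below only illustrates that pattern; the dataset name and source query are placeholders, and whether the event log itself can be read from inside a pipeline depends on the runtime.

```python
# A minimal sketch of a Python DLT notebook defining a materialized view.
# "event_log_summary" and "some_source_table" are hypothetical names.
import dlt
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided by the DLT runtime in practice

@dlt.table(
    name="event_log_summary",
    comment="Hypothetical materialized view built by the pipeline."
)
def event_log_summary():
    # Replace this with the actual source the pipeline should read from.
    return spark.sql("SELECT * FROM some_source_table")
```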
I know how to validate a column-level constraint, like checking whether the specified column value is larger than a target value. Can I validate some table-level constraints? For example, validating whether the total record count of a table is larger t...
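One pattern for this, sketched below with hypothetical names and thresholds: aggregate the table down to a single row of metrics in a view, then attach a row-level expectation to that one row, so the expectation effectively enforces a table-level property.

```python
# A sketch of a table-level check expressed through DLT expectations.
# Assumes "orders" is a dataset defined in the same pipeline; the names
# and the 1000-row threshold are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.view(name="orders_row_count")
def orders_row_count():
    # One row containing the total record count of the target table.
    return dlt.read("orders").agg(F.count("*").alias("total_rows"))

@dlt.table(name="orders_row_count_check")
@dlt.expect_or_fail("min_row_count", "total_rows > 1000")
def orders_row_count_check():
    # The row-level expectation on this single-row table acts as a
    # table-level constraint on "orders".
    return dlt.read("orders_row_count")
```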
Here is the policy I just created:

    {
      "node_type_id": {
        "defaultValue": "Standard_D8s_v3",
        "type": "allowlist",
        "values": [
          "Standard_D8s_v3",
          "Standard_D16s_v3"
        ]
      },
      "num_workers": {...