Hi, 
I am a bit stumped at the moment because I cannot figure out how to get a DLT table definition picked up in a Python notebook.
1. I created a new notebook in python
2. added the following code: 
 
%python
import dlt
from pyspark.sql.functions import *
@dlt.table(
    comment="Some simple test",
    name="trx_dlt"
)
def transactions_live_table():
    df = spark.read.json("s3://FOLDER_LOCATION/Transaction/")
    return df
 
3. I created a new DLT pipeline and linked the notebook. The linking did work, because the notebook UI now allows me to validate my pipeline code.
4. I press play and get
   a. a warning that the Python magic command is not supported:
Magic commands (e.g. %py, %sql and %run) are not supported with the exception of %pip within a Python notebook. Cells containing magic commands are ignored. Unsupported magic commands were found in the following notebooks
/Users/mathias.peters@epicompany.eu/DLT-TRX: %python
   b. an error that no DLT tables were discovered in the code.

 
How is this supposed to work? Where is my mistake?
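Based on the warning, my best guess is that the whole cell is being ignored because of the %python line, so the @dlt.table decorator is never evaluated and no tables are discovered. Is the fix really just to drop the magic command, like this (same code as above, only without %python)?

import dlt

@dlt.table(
    comment="Some simple test",
    name="trx_dlt"
)
def transactions_live_table():
    # spark is provided by the DLT pipeline runtime; same S3 folder as above
    return spark.read.json("s3://FOLDER_LOCATION/Transaction/")

Or is there something else required for the pipeline to pick the table up?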
 
Kind regards, 
Mathias