Nope, at first I thought you were using DLT. I think you're confusing things. A managed table means that the table's lifecycle is managed by the catalog - so if you drop it, the metadata is deleted along with the underlying data. You can also have an external table, where a DROP statement removes only the metadata from the catalog (the data itself is not removed).
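To make the distinction concrete, here's a minimal sketch (the table names and the storage path are hypothetical):

# Managed table: the catalog owns the data, so DROP TABLE deletes the data too
df.write.format("delta").saveAsTable("my_db.managed_table")

# External table: data lives at a path you control; DROP TABLE only removes metadata
df.write.format("delta").option("path", "/mnt/data/external_table").saveAsTable("my_db.external_table")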
Back to your question. In your case, you probably saved your table using the batch approach:
df_transformed.write.format("delta").saveAsTable("my_db.my_table")
Then the easiest way to convert it to a streaming table would be to rewrite the DataFrameWriter part. Note that .writeStream only exists on a streaming DataFrame, so df_transformed must come from spark.readStream rather than a batch read (see the sketch after the code):
(df_transformed.writeStream
    .format("delta")
    .outputMode("append")  # or "complete", depending on your aggregation
    .option("checkpointLocation", "/mnt/checkpoints/my_table_cp")
    .toTable("my_db.my_table"))  # creates or appends to the managed streaming table
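Since the read side has to change too, here's a minimal end-to-end sketch, assuming your source is itself a Delta table (the source table name and the transformation are hypothetical):

from pyspark.sql import functions as F

# Read the source as a stream instead of a batch DataFrame
df_source = spark.readStream.format("delta").table("my_db.source_table")

# Apply your transformations as usual
df_transformed = df_source.withColumn("processed_at", F.current_timestamp())

query = (df_transformed.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/my_table_cp")
    .toTable("my_db.my_table"))

query.awaitTermination()  # block until the stream is stopped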
But keep in mind that not all operations are supported in Spark Structured Streaming - for example, a global sort without aggregation, distinct, and certain join types are rejected - so if you have many complex transformations it's possible you won't be able to use streaming for them without restructuring the logic.
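For instance, something as simple as a global sort fails once the query starts; a quick way to check your pipeline is just to try starting it (sketch, with a hypothetical column and table name):

try:
    # Sorting a streaming DataFrame without aggregation is rejected at query start
    (df_transformed.orderBy("event_time")
        .writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/sorted_cp")
        .toTable("my_db.sorted_table"))
except Exception as e:  # AnalysisException about sorting on streaming DataFrames
    print(e)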