Hey Dan, good to hear you're getting started with Databricks. This isn't a limitation of Databricks; it's a restriction built into Spark itself. Spark isn't a data store, it's a distributed computation framework, so there's no in-place DELETE of individual rows. If you don't need certain rows, you just filter them out, either in a query or by writing the filtered result to a new table, as below.
%sql
CREATE TABLE new_table AS
SELECT * FROM prop0 WHERE prop_id IS NOT NULL
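If you'd rather stay in Python, the same idea looks roughly like this with the DataFrame API. This is just a minimal sketch using the table and column names from your example; "prop0_filtered" is a hypothetical name for the new table.

# Minimal PySpark sketch: keep only the rows you want and save them as a new table.
# In a Databricks notebook, `spark` (the SparkSession) is already defined.
df = spark.table("prop0")                      # read the existing table
kept = df.filter(df.prop_id.isNotNull())       # drop the rows you no longer need
kept.write.saveAsTable("prop0_filtered")       # "prop0_filtered" is a hypothetical name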
It's probably worth your time to read a bit more about the tools Spark provides; the learning curve is steep, but once you get past the first steps you'll start seeing the value! 🙂 I'd recommend some of the material we have in Community Edition, like the CS100 coursework.