To read from or write to Delta Lake with Spark outside of Databricks, follow the instructions here.
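As a rough illustration of what that setup looks like, the sketch below uses the open-source `delta-spark` Python package to configure a local Spark session for Delta, then writes and reads a small table. The table path `/tmp/delta-table` is just an example location.

```python
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

# Enable Delta Lake's SQL extensions and catalog on a plain (non-Databricks) Spark session
builder = (
    SparkSession.builder.appName("delta-outside-databricks")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write a small DataFrame as a Delta table, then read it back
spark.range(0, 5).write.format("delta").mode("overwrite").save("/tmp/delta-table")
spark.read.format("delta").load("/tmp/delta-table").show()
```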
Because Delta Lake is open source and built on open standards, the community has developed connectors for other engines, not just Spark. There is a growing ecosystem of connectors, and the integration details can be found here.
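For example, one such community connector is delta-rs, whose Python bindings (`deltalake`) can read a Delta table without any Spark cluster at all. A minimal sketch, assuming the table written above exists at `/tmp/delta-table`:

```python
from deltalake import DeltaTable

# Open an existing Delta table directly from its storage path (no Spark required)
dt = DeltaTable("/tmp/delta-table")

print(dt.version())          # current table version from the transaction log
print(dt.files())            # underlying Parquet data files

df = dt.to_pandas()          # materialize the table as a pandas DataFrame
print(df.head())
```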