a week ago - last edited a week ago
Hi!
Any time I try to import `dlt` in a notebook session to develop pipelines, I get an error saying DLT is not supported on Spark Connect clusters. These are very generic clusters; I've tried runtimes 14, 15, and the latest 16, using shared clusters as recommended.
How are we supposed to use DLT?
a week ago
Hey Oakhill, are you trying to create a DLT pipeline outside of the Databricks environment? Could you provide more details about your setup? Thanks, Louis.
a week ago
Delta Live Tables does not support shared clusters. You need to configure the cluster (a jobs cluster) via the Delta Live Tables UI (under Workflows). You still write your notebook in Python using DLT, but the pipeline itself has to be configured via the UI and run from the UI.
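To illustrate, here is a minimal sketch of what such a notebook might look like. The table names, source path, and options below are made up for illustration; the `dlt` module is provided by the pipeline runtime (not pip), so the import is guarded and the definitions are skipped outside a pipeline run:

```python
# Minimal DLT notebook sketch -- table names and the landing path are
# hypothetical. The `dlt` module is injected by the Delta Live Tables
# runtime; in an interactive session the import fails (on shared /
# Spark Connect clusters it can raise a non-ImportError, hence the
# broad except), so we skip the table definitions there.
try:
    import dlt
    IN_PIPELINE = hasattr(dlt, "table")
except Exception:
    IN_PIPELINE = False  # interactive/local session

if IN_PIPELINE:

    @dlt.table(name="raw_events", comment="Bronze: raw JSON events")
    def raw_events():
        # Auto Loader ingestion from a hypothetical landing path.
        # `spark` is a global provided by the Databricks runtime.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/Volumes/main/landing/events")
        )

    @dlt.table(name="clean_events", comment="Silver: deduplicated events")
    def clean_events():
        return dlt.read_stream("raw_events").dropDuplicates(["event_id"])
```

The notebook is just declarations; nothing executes until a pipeline you configure in the UI picks it up and runs an update.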
a week ago - last edited a week ago
I am aware that I have to specify a job cluster/pipeline when deploying. But I am asking how to actually develop on a newer DBR. Back on DBR 12 and 13 we would share a cheap development cluster and then push to prod with autoscaling and larger clusters.
I just booted up a cluster on DBR 13 and it worked. Why is that? It's an old version and doesn't support Liquid Clustering, for example.
EDIT: OK, so I had to create a user-isolated cluster on DBR 15, unlike the shared one on DBR 13. It's unfortunate that we have to make individual clusters for DLT.
a week ago
Hey Oakhill,
Are you trying to execute the notebook by clicking the Run button? If so, that won't work. You need to write all the code first. After that, go to the sidebar menu, click Workflows, choose the Delta Live Tables tab, and click the blue button in the top-right corner to create a pipeline.
Then set up your pipeline with the information about your cluster, the destination, and the path to your notebook. After that, click Create, and your pipeline will be created. Then click the Start button to run your DLT pipeline.
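For reference, the settings you fill in through the UI end up as a JSON document roughly like the one below (a sketch with made-up names and paths; the UI writes and manages this for you):

```json
{
  "name": "my_dlt_pipeline",
  "target": "analytics",
  "libraries": [
    { "notebook": { "path": "/Repos/me/project/dlt_notebook" } }
  ],
  "clusters": [
    {
      "label": "default",
      "autoscale": { "min_workers": 1, "max_workers": 4 }
    }
  ],
  "development": true,
  "continuous": false
}
```

Note the cluster here belongs to the pipeline itself, which is why the DBR of your interactive cluster does not matter to the pipeline run.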
a week ago
No, this happens when I try to develop on a DBR above 13. I can't even import the dlt package when developing 🙂
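For what it's worth, the failure is easy to probe from any session with a small check like this (a sketch, nothing Databricks-specific; on shared/Spark Connect clusters the import can fail with a runtime error rather than a plain ImportError, so it catches any exception):

```python
def dlt_importable() -> bool:
    """Best-effort check: can `dlt` actually be imported here?

    Returns False both when the module is absent (local/plain Python)
    and when it exists but raises at import time, as reported on
    shared / Spark Connect clusters.
    """
    try:
        import dlt  # noqa: F401  -- provided by the DLT runtime
        return True
    except Exception:
        return False


print("dlt importable:", dlt_importable())
```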
Saturday
Oakhill, when building a DLT pipeline you are not prompted for a DBR when defining the cluster; DLT handles the infrastructure rollout for you. Can you please share a screenshot of the error you are receiving? I want to see where you are in the platform when you get the error.
a week ago
Oakhill, we do provide free onboarding training. You might be interested in the "Get Started with Data Engineering on Databricks" session. You can register here: https://www.databricks.com/training/catalog. When searching the training catalog, be sure to filter "Type" on "Get Started." Cheers, Louis.