12-25-2021 02:31 PM
Has anyone successfully used Petastorm + Databricks-Connect + Delta Lake?
The use case is being able to use DeltaLake as a data store regardless of whether I want to use the databricks workspace or not for my training tasks.
I'm using a cloud-hosted jupyterlab environment(in Paperspace), and trying to use Petastorm + Databricks Connect.
What I'm trying to do:
The exact same code works on the same cluster when run from the Databricks notebook environment. But when I run `make_spark_converter()` from my hosted JupyterLab environment, it throws an "Unable to infer schema" error, even though checking the `.schema` attribute of the DataFrame I pass in shows a Spark-compatible schema.
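For context, the setup looks roughly like this (a minimal sketch; the Delta table path and cache directory URL are placeholders, and this needs a live Spark session via Databricks Connect, so it won't run standalone):

```python
# Sketch of the Petastorm + Delta Lake setup described above.
# Assumes `spark` is a SparkSession created through Databricks Connect.
from petastorm.spark import SparkDatasetConverter, make_spark_converter

# Petastorm requires a parent cache directory where it materializes the
# DataFrame as Parquet; a missing or inaccessible value here is a common
# source of converter errors. The path below is a placeholder.
spark.conf.set(
    SparkDatasetConverter.PARENT_CACHE_DIR_URL_CONF,
    "file:///dbfs/tmp/petastorm_cache",
)

# Placeholder Delta table path.
df = spark.read.format("delta").load("/path/to/delta/table")
print(df.schema)  # shows a valid Spark schema locally

# This is the call that fails with "Unable to infer schema" from JupyterLab.
converter = make_spark_converter(df)
```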
12-26-2021 07:07 AM
I would definitely not use Databricks-Connect in production.
12-29-2021 08:22 PM
Because it's janky, or for some other reason? I don't need it for customer-facing production, more for when I'm using my own HPC or local workstation but want to access data from Delta Lake. I figured it was easier than (and preferable to) setting up my own Spark environment locally, and since I'm paying for Databricks I might as well get the benefits of the runtime.
Can you elaborate on your answer?
01-30-2022 11:22 PM
Hi @Yusuf Khan , Please go through the link here.