04-08-2022 01:33 AM
Hello. I'm fairly new to Databricks and Spark.
I have a requirement to connect to Databricks using JDBC, and that works perfectly with the driver I downloaded from the Databricks website (com.simba.spark.jdbc.Driver).
What I would like to do now is have a locally running database instance in Docker that I can connect to using the same driver. I'd like to automatically initialise the database by creating tables when it starts up, much like how you would use docker-entrypoint-initdb.d to create tables on startup for PostgreSQL.
I'd then like to insert some data and run some tests locally.
Is any of this possible?
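For reference, this is roughly what the working connection looks like today; a minimal sketch, assuming the legacy Simba Spark driver is on the classpath and with placeholder values for the workspace host, HTTP path and access token (substitute your own), and with the table creation standing in for the kind of init script described above:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksJdbcExample {
    public static void main(String[] args) throws Exception {
        // Placeholder host, HTTP path and token -- replace with your workspace's values.
        String url = "jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;"
                   + "httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>";

        Class.forName("com.simba.spark.jdbc.Driver");
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Initialise a table and insert a row -- roughly what an init-on-startup
            // script (a la docker-entrypoint-initdb.d) would do for local tests.
            stmt.execute("CREATE TABLE IF NOT EXISTS demo (id INT, name STRING)");
            stmt.execute("INSERT INTO demo VALUES (1, 'local-test')");
            try (ResultSet rs = stmt.executeQuery("SELECT id, name FROM demo")) {
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " " + rs.getString("name"));
                }
            }
        }
    }
}
```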
04-08-2022 05:27 AM
The Simba driver is for Spark connections; I doubt it will work with a regular database.
Why would you use this driver to connect to a database in a container? Or do you mean running Databricks in a local container?
If the latter: that is not available.
04-08-2022 06:01 AM
OK, so maybe I've not asked the right question.
At the moment we use the Simba driver to connect to Databricks, and we can perform SQL queries.
Can I achieve the same thing locally using a dockerized Databricks or spark runtime?
04-08-2022 06:06 AM
Databricks: no.
Spark: I guess so, but it will take some effort to gather all the necessary dependencies and create a container (or look for one on Docker Hub).
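If you do go the plain-Spark route, one common setup is to run the Spark Thrift Server (started with sbin/start-thriftserver.sh inside the container) and connect to it with the standard Hive JDBC driver rather than the Simba one. A minimal sketch, assuming a Thrift Server is already listening on localhost:10000 (the host, port and credentials are assumptions):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LocalSparkThriftExample {
    public static void main(String[] args) throws Exception {
        // Assumes a Spark Thrift Server (HiveServer2-compatible) is exposed on localhost:10000,
        // e.g. started inside a Spark container with sbin/start-thriftserver.sh.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://localhost:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "anonymous", "");
             Statement stmt = conn.createStatement()) {
            // Same kind of SQL you would run against Databricks, now against local Spark.
            stmt.execute("CREATE TABLE IF NOT EXISTS demo (id INT, name STRING)");
            stmt.execute("INSERT INTO demo VALUES (1, 'local-test')");
        }
    }
}
```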
04-08-2022 06:07 AM
If you are just running queries on tables, you could also look into something like Dremio, which can be run in Docker as a single node.
04-08-2022 06:11 AM
Can I connect to that using the same Simba driver?
04-08-2022 07:21 AM
I don't know if the Databricks driver is the same as the classic Simba driver.
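Dremio ships its own JDBC driver, so in practice you would likely connect with that rather than the Simba Spark driver. A minimal sketch, assuming a single-node Dremio container on localhost; the driver class, URL format and default port below are assumptions to verify against your Dremio version's documentation:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DremioJdbcExample {
    public static void main(String[] args) throws Exception {
        // Assumed driver class and URL format for Dremio's JDBC driver; 31010 is
        // assumed to be the default client port -- check the docs for your version.
        Class.forName("com.dremio.jdbc.Driver");
        String url = "jdbc:dremio:direct=localhost:31010";

        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
```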
04-11-2022 01:29 PM
@Gurps Bassi , a "running instance of a database in docker" would essentially be the Hive metastore, which just maps to data that usually lives physically in the data lake. Databricks is so cloud-centric that setting up the metastore locally doesn't make sense. Instead, set up two Databricks workspaces: one with a stage repo (where you do development) and another workspace with the master repo.
For those who want to develop locally in an IDE, the Databricks tunnel for Visual Studio Code will be available soon.