Hi @Tito, you can use the standard cluster (formerly known as the shared cluster) from VS Code with Databricks Connect. Here is an example:
```python
from databricks.connect import DatabricksSession

# Option 1: Use the cluster_id from .databrickscfg automatically.
# Since your [fielddemo] profile has cluster_id configured, just use the profile.
spark = DatabricksSession.builder.profile("fielddemo").getOrCreate()

# Option 2: Use serverless compute instead.
# spark = DatabricksSession.builder.profile("fielddemo").serverless().getOrCreate()

df = spark.read.table("samples.nyctaxi.trips")
df.show(5)
```
Set up your .databrickscfg profile with the cluster_id of the standard cluster. You can obtain the cluster_id from the cluster's configuration page, and the profile should look like this:
```ini
[fielddemo]
host = https://.........cloud.databricks.com/
token = ..................................
jobs-api-version = 2.0
cluster_id = ....-.....-cnhxf2p6
```
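As a quick sanity check, note that `.databrickscfg` is plain INI, so you can verify a profile parses and carries a `cluster_id` with Python's standard `configparser`. This is just a sketch: the host, token, and cluster_id values below are placeholders, and in practice you would read `~/.databrickscfg` rather than an inline string.

```python
import configparser

# Placeholder profile mirroring the snippet above; real values live in
# ~/.databrickscfg (config.read(os.path.expanduser("~/.databrickscfg"))).
sample = """
[fielddemo]
host = https://example.cloud.databricks.com/
token = dapi-placeholder
jobs-api-version = 2.0
cluster_id = 0123-456789-abcdefgh
"""

config = configparser.ConfigParser()
config.read_string(sample)

profile = config["fielddemo"]
# Databricks Connect needs host, token, and cluster_id to target a cluster.
for key in ("host", "token", "cluster_id"):
    assert key in profile, f"missing {key} in [fielddemo]"

print(profile["host"])
print(profile["cluster_id"])
```

If the assertions pass, `DatabricksSession.builder.profile("fielddemo")` should be able to pick up the same values.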