dbutils conflicts with a custom spark extension

maaaxx
New Contributor III

Hello dear community,

we have installed a custom Spark extension that filters which files are allowed to be read in a notebook. Everything works as expected as long as the files are accessed through the Spark APIs.

However, the files are not filtered properly when a user accesses them via, e.g., dbutils.fs.cp.

Does anyone have an idea why dbutils does not respect the Spark extension in this scenario?
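
For illustration, this is roughly what we observe (the paths and the read format below are placeholders, not our actual setup):

# reading through the Spark APIs is filtered by the extension as expected
df = spark.read.parquet("dbfs:/mnt/data/restricted/")

# copying the same files through dbutils is not filtered
dbutils.fs.cp("dbfs:/mnt/data/restricted/file.csv", "dbfs:/tmp/file.csv")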

Many thanks!

Cheers,

Max

4 REPLIES

Tayyab_Vohra
Contributor

Hello @Yuan Gao,

On Databricks, spark and dbutils are automatically injected only into the main entry point, i.e., your notebook; they are not propagated to the Python modules you import. For spark the fix is easy: call SparkSession.getActiveSession(). For dbutils, however, you have to keep passing it explicitly until you abstract obtaining it into a function of its own.

The Databricks Connect documentation shows an example of how this can be done. That example takes the SparkSession as an explicit parameter, but it can be modified to avoid that entirely, with something like this:

def get_dbutils():
  # Return a dbutils handle that works both under Databricks Connect and in a notebook.
  from pyspark.sql import SparkSession
  spark = SparkSession.getActiveSession()
  # Under Databricks Connect this config is "true"; default to "false" if the key is absent.
  if spark.conf.get("spark.databricks.service.client.enabled", "false") == "true":
    from pyspark.dbutils import DBUtils
    return DBUtils(spark)
  else:
    # In a notebook, dbutils already lives in the IPython user namespace.
    import IPython
    return IPython.get_ipython().user_ns["dbutils"]

Then, in your own functions, you can call get_dbutils() to obtain the dbutils functionality.
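
For example, a shared Python module could use the helper like this (the function name and paths are only illustrative):

def copy_files(src, dst):
  # works both in a notebook and under Databricks Connect
  dbutils = get_dbutils()
  dbutils.fs.cp(src, dst, recurse=True)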

Debayan
Databricks Employee

Hi, could you please explain a little more about the custom spark extension?

Also, please tag @Debayan in your next response, which will notify me. Thank you!

Vartika
Databricks Employee

Hi @Yuan Gao,

Checking in. If @Tayyab_Vohra's answer helped, would you let us know and mark the answer as best? If not, would you be happy to give us more information?

Thanks!

maaaxx
New Contributor III
This is the best answer. Thank you.
