Data Engineering

dbutils conflicts with a custom spark extension

maaaxx
New Contributor III

Hello dear community,

we have installed a custom Spark extension that filters which files are allowed to be read in a notebook. Everything works as expected when users go through the Spark APIs.

However, the files are not filtered properly when a user accesses them through dbutils, e.g., dbutils.fs.cp.

Does anyone have an idea why dbutils does not go through the Spark extension in this scenario?
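To make the failure mode concrete: a Spark session extension hooks into Spark's own read path, while dbutils.fs goes to the filesystem layer directly, so a filter installed on one path never sees calls made on the other. A toy sketch of that situation (all names and the ALLOWED list are hypothetical, purely for illustration):

```python
# Toy model: a filter wired into the "spark" read path does not
# protect a second, independent access path (analogous to dbutils.fs).
ALLOWED = {"dbfs:/data/public.csv"}

def spark_read(path):
    # The custom extension's check runs only on this path.
    if path not in ALLOWED:
        raise PermissionError(f"blocked: {path}")
    return f"contents of {path}"

def dbutils_fs_cp(src, dst):
    # Bypasses the read path entirely; the extension never runs.
    return f"copied {src} -> {dst}"

print(spark_read("dbfs:/data/public.csv"))                     # allowed
print(dbutils_fs_cp("dbfs:/data/secret.csv", "dbfs:/tmp/x"))   # not filtered!
```

The sketch shows why the fix has to sit at a layer both paths share (or why dbutils access has to be restricted separately), rather than in the Spark extension alone.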

Many thanks!

Cheers,

Max

4 REPLIES

Tayyab_Vohra
Contributor

Hello @Yuan Gao​ ,

On Databricks, spark and dbutils are automatically injected only into the main entry point, i.e., your notebook; they are not propagated to imported Python modules. For spark the solution is easy: just call the getActiveSession function of the SparkSession class (as SparkSession.getActiveSession()). For dbutils, however, you need to keep passing it explicitly unless you abstract obtaining dbutils into a function of its own.

The documentation for Databricks Connect shows an example of how this can be achieved. That example takes SparkSession as an explicit parameter, but it can be modified to avoid that entirely, with something like this:

def get_dbutils():
  from pyspark.sql import SparkSession

  spark = SparkSession.getActiveSession()
  # Databricks Connect sets this flag; there, DBUtils must be
  # constructed explicitly from the active session.
  if spark.conf.get("spark.databricks.service.client.enabled") == "true":
    from pyspark.dbutils import DBUtils
    return DBUtils(spark)
  else:
    # Inside a Databricks notebook, dbutils already lives in the
    # IPython user namespace.
    import IPython
    return IPython.get_ipython().user_ns["dbutils"]

You can then call get_dbutils() inside your own functions and modules to obtain the dbutils functionality.
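The alternative the reply mentions is to keep passing dbutils explicitly into module code. A minimal sketch of that injection pattern (copy_file and the _FakeDBUtils stand-in are hypothetical, used only so the sketch runs outside a Databricks runtime, where the real dbutils object would be passed instead):

```python
# Hypothetical module function: accepts dbutils explicitly instead of
# relying on the notebook-global injection.
def copy_file(dbutils, src, dst):
    # On Databricks, dbutils.fs.cp copies a file on DBFS; here dbutils
    # is simply whatever object the caller hands in.
    dbutils.fs.cp(src, dst)

# Stand-ins so this sketch is runnable outside Databricks.
class _FakeFS:
    def __init__(self):
        self.calls = []
    def cp(self, src, dst):
        self.calls.append((src, dst))

class _FakeDBUtils:
    def __init__(self):
        self.fs = _FakeFS()

dbutils = _FakeDBUtils()
copy_file(dbutils, "dbfs:/src/a.csv", "dbfs:/dst/a.csv")
print(dbutils.fs.calls)  # [('dbfs:/src/a.csv', 'dbfs:/dst/a.csv')]
```

Explicit injection keeps module code testable (as the fake above shows), at the cost of threading dbutils through every call; the get_dbutils() helper trades that away for a single lookup.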

Debayan
Esteemed Contributor III

Hi, could you please explain a little more about the custom Spark extension?

Also, please tag @Debayan​ in your next response, which will notify me. Thank you!

Vartika
Moderator

Hi @Yuan Gao​,

Checking in. If @tayyab vohra​'s answer helped, would you let us know and mark it as the best answer? If not, would you be happy to give us more information?

Thanks!

maaaxx
New Contributor III
This is the best answer. Thank you.