
Code coverage on Databricks notebook

santhoshKumarV
New Contributor II

I have a scenario where my application code is a Scala package and the notebook code (Scala) is maintained under a /resources folder.

I am looking for the easiest way to perform code coverage on my notebook. Does Databricks provide any option for this?

Right now, I have removed the code from resources/notebook, moved it into the package, and wrapped it in a Scala object, since my notebook code did not have an object; it only contained my methods and a sequence of commands.
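For context, a minimal sketch of what that refactor looks like; the package, object, and column names are made up for illustration:

```scala
package com.example.etl  // hypothetical package name

import org.apache.spark.sql.{DataFrame, SparkSession}

// Methods that previously lived as loose notebook cells, wrapped in an object
// inside the application package so a coverage tool can instrument them.
object IngestionJob {
  // Formerly a standalone notebook method; unchanged apart from the wrapping object.
  def filterActive(df: DataFrame): DataFrame =
    df.filter(df("status") === "active")

  // Formerly a sequence of notebook commands; now a callable entry point.
  def run(spark: SparkSession, inputPath: String, outputPath: String): Unit = {
    val raw = spark.read.parquet(inputPath)
    filterActive(raw).write.mode("overwrite").parquet(outputPath)
  }
}
```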

2 REPLIES

saurabh18cs
Valued Contributor III

Performing code coverage on notebooks in Databricks can be challenging because notebooks are typically used for interactive development and analysis rather than structured software development. As you mentioned, you have already moved the code from the notebook into a Scala package and added it to a Scala object. This is good practice, as it allows you to write unit tests and measure code coverage more effectively.
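One concrete way to get coverage numbers on the extracted object is the sbt-scoverage plugin together with a plain ScalaTest suite run against a local SparkSession. A minimal sketch, assuming the IngestionJob object sketched in the question above; the plugin version and file paths are assumptions:

```scala
// project/plugins.sbt -- enable scoverage instrumentation (version is an assumption)
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "2.0.9")
```

```scala
// src/test/scala/com/example/etl/IngestionJobSpec.scala -- hypothetical test suite
package com.example.etl

import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

class IngestionJobSpec extends AnyFunSuite {
  // Local SparkSession so the test runs on a build agent; no cluster needed.
  private lazy val spark = SparkSession.builder()
    .master("local[2]")
    .appName("coverage-test")
    .getOrCreate()

  test("filterActive keeps only active rows") {
    import spark.implicits._
    val df = Seq(("a", "active"), ("b", "inactive")).toDF("id", "status")
    assert(IngestionJob.filterActive(df).count() == 1)
  }
}
```

Running sbt coverage test coverageReport should then produce HTML/XML coverage reports (by default under target/scala-*/scoverage-report), which a CI step such as a GitHub Action can publish.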

santhoshKumarV
New Contributor II

An important thing I missed adding in the post: we maintain the notebook code as .scala files under resources and keep them in GitHub. The .scala files from resources get deployed as notebooks using a GitHub Action.

With my approach of moving the code under a package, I will have to adopt a new deployment approach (have my action copy the .scala files under my package --> GitHub Action --> notebook) instead of my older approach (resources --> GitHub Action --> notebook).

With my new approach I was able to write test cases and get coverage, but there are certain difficulties: some of the notebook-native commands (%run etc.) cannot be tested.
With this, I am trying to test my notebook code while still deploying it as a notebook and keeping it interactive.

To avoid these difficulties with my new approach, I am looking for ideas that would cut down my time.
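One pattern that may help with the notebook-native commands is to hide them behind a small trait so the surrounding logic stays unit-testable, and let the deployed notebook supply the real dbutils-backed implementation. A minimal sketch; the trait, classes, and notebook path are hypothetical:

```scala
package com.example.etl

// Abstraction over notebook-native calls such as dbutils.notebook.run.
trait NotebookRunner {
  def runNotebook(path: String, timeoutSeconds: Int, args: Map[String, String]): String
}

// The real implementation lives in the deployed notebook, where dbutils is in scope:
//
//   class DbutilsRunner extends NotebookRunner {
//     def runNotebook(path: String, timeoutSeconds: Int, args: Map[String, String]): String =
//       dbutils.notebook.run(path, timeoutSeconds, args)
//   }

// Test double so orchestration logic can be covered without a cluster.
class StubRunner(result: String) extends NotebookRunner {
  def runNotebook(path: String, timeoutSeconds: Int, args: Map[String, String]): String = result
}

// Orchestration code depends only on the trait, so it is coverable in plain unit tests.
object Orchestrator {
  def process(runner: NotebookRunner, date: String): String =
    runner.runNotebook("/jobs/daily_load", 600, Map("date" -> date))  // hypothetical path
}
```

In a test, Orchestrator.process(new StubRunner("ok"), "2024-01-01") exercises the logic around the notebook call without needing %run or a live workspace.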
