Data Engineering

How can I test Python code that I wrote using notebooks?

alexott
Databricks Employee

I've written code in notebooks using Python, and I want to add tests to it to make sure that it won't break when I make further changes.

What tools can I use for that task?

2 REPLIES

alexott
Databricks Employee

Because notebooks aren't real files, many existing testing libraries may not work with code in notebooks. There are two approaches to this:

  1. explicitly create & execute test suites
  2. use the Nutter library for Python

(For actual testing of Spark code, we can use the libraries mentioned in this answer.)
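
As one illustration, the chispa library (also mentioned in the Nutter section below) provides DataFrame comparison helpers; a minimal sketch, where the example DataFrames are assumptions:

from chispa import assert_df_equality

# compare an actual DataFrame against an expected one; raises an error on mismatch
expected_df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
actual_df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
assert_df_equality(actual_df, expected_df)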

The first step in making code testable is correct code organization: it's common for notebooks to contain code written linearly, without explicit functions or classes. To make notebook code testable, we need to split the code into several notebooks:

  1. Create a separate notebook with functions that you want to test
  2. Create a separate notebook for tests
  3. Create a separate notebook for your entry point ("main notebook")

Explicitly creating & executing test suites

The notebook with the functions you want to test (the first item) needs to be included in the second and third notebooks using the %run directive.
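
For illustration, the "LibraryFunctions" notebook used in the example below could look like this - the function bodies here are only assumptions that match the tests, not a real implementation:

# Contents of a hypothetical "LibraryFunctions" notebook

def generate_data(n, name):
    # create a temporary view with n rows that the tests can query
    spark.range(n).createOrReplaceTempView(name)

def get_data_prediction():
    # stand-in for some real computation
    return 42

The test notebook and the main notebook then each begin with a cell containing "%run LibraryFunctions".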

In the notebook with tests:

  1. include your favorite test framework
  2. define test classes or test functions
  3. create a test suite - either explicitly, or by looking up test classes in the Python environment and generating the suite automatically
  4. execute the test suite

For example, if I have functions generate_data and get_data_prediction in a notebook named "LibraryFunctions", I can define a test for them as follows (we need to have "%run LibraryFunctions" before this test):

import unittest

class SimpleTest(unittest.TestCase):
    def test_data_generation(self):
        # generate_data comes from the "LibraryFunctions" notebook included via %run
        n = 100
        name = "tmp42"
        generate_data(n=n, name=name)
        # verify that the generated view contains the expected number of rows
        df = spark.sql(f"select * from {name}")
        self.assertEqual(df.count(), n)

    def test_data_prediction(self):
        predicted = get_data_prediction()
        self.assertEqual(predicted, 42)

I can generate the test suite explicitly, but this is cumbersome:

def generate_test_class_suite():
    # explicitly add every test method of SimpleTest to the suite
    suite = unittest.TestSuite()
    suite.addTest(SimpleTest('test_data_generation'))
    suite.addTest(SimpleTest('test_data_prediction'))

    return suite

test_suite = generate_test_class_suite()

But it's easier to use an auxiliary function that discovers all test cases automatically:

def discover_test_classes():
    # find all classes defined in this notebook whose name ends with 'Test'
    # and that directly inherit from unittest.TestCase
    test_classes = [obj for name, obj in globals().items()
                    if name.endswith('Test') and obj.__module__ == '__main__'
                    and isinstance(obj, type) and unittest.TestCase in set(obj.__bases__)]

    suite = unittest.TestSuite()
    for test_class in test_classes:
        # add every test method of the class to the suite
        for test in unittest.defaultTestLoader.getTestCaseNames(test_class):
            suite.addTest(test_class(test))

    return suite

test_suite = discover_test_classes()

and then we can execute our test suite using a test runner (you can use other runners, such as XMLTestRunner from the unittest-xml-reporting library, if you want to generate JUnit XML files with the test results):

runner = unittest.TextTestRunner()
results = runner.run(test_suite)

(Please note that if you want to re-run the tests, you may need to regenerate the test suite.)
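
If the test notebook is executed as part of a job or CI/CD pipeline, you usually also want the notebook run itself to fail when any test fails. A minimal sketch (the report directory in the commented-out variant is an assumption):

# fail the notebook run if any test failed, so a job or CI pipeline can detect it
if not results.wasSuccessful():
    raise Exception(f"{len(results.failures) + len(results.errors)} test(s) failed")

# variant: generate JUnit XML reports with unittest-xml-reporting instead
# import xmlrunner
# runner = xmlrunner.XMLTestRunner(output="/dbfs/tmp/test-reports")
# results = runner.run(discover_test_classes())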

Using the Nutter library

The Nutter library was developed by Microsoft specifically for unit testing Databricks notebooks. It supports the following functionality:

  • Tests can be executed interactively, scheduled as a job, or triggered via the command line
  • It can automatically discover all notebooks with tests (including in subfolders) and execute them
  • Code is split into run / assert stages, with optional before / after calls (see the sketch after the example below) - you need to follow the naming conventions! For example, you define a function run_<name> that calls the tested function, and a corresponding function assertion_<name> that checks the result of the execution
  • The actual checks are done with frameworks such as Chispa
  • Tests can be executed in parallel (but be careful with that)
  • It can publish results in JUnit format
  • It's easy to integrate with Azure DevOps & other CI/CD systems

You still need to split the functions into a separate notebook and include it in the notebook where you define the tests using Nutter. Here is a small example of how you can define a test using Nutter and execute it (it works both interactively and when triggered from the command line or a CI/CD pipeline):

from runtime.nutterfixture import NutterFixture, tag

class Test1Fixture(NutterFixture):
    def __init__(self):
        self.code2_table_name = "my_data"
        self.code1_view_name = "my_cool_data"
        self.code1_num_entries = 100
        NutterFixture.__init__(self)

    def run_name1(self):
        # "run" stage: call the function under test (included via %run)
        generate_data1(n=self.code1_num_entries, name=self.code1_view_name)

    def assertion_name1(self):
        # "assertion" stage: check the result of the run stage
        df = spark.read.table(self.code1_view_name)
        assert df.count() == self.code1_num_entries

result = Test1Fixture().execute_tests()
print(result.to_string())
# when running as a job, return the results via dbutils so the caller can inspect them
is_job = dbutils.notebook.entry_point.getDbutils().notebook().getContext().currentRunId().isDefined()
if is_job:
    result.exit(dbutils)
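
The optional before / after calls mentioned in the feature list above are additional methods on the fixture; a minimal sketch following the same naming convention (the table names, SQL, and the generate_data1 call here are assumptions):

class Test2Fixture(NutterFixture):
    def before_all(self):
        # runs once before all run_/assertion_ methods - e.g. prepare test inputs
        spark.sql("CREATE TABLE IF NOT EXISTS test_input (id INT)")

    def run_name2(self):
        # hypothetical call to another function under test
        generate_data1(n=10, name="test_output")

    def assertion_name2(self):
        assert spark.read.table("test_output").count() == 10

    def after_all(self):
        # runs once after all tests - e.g. clean up
        spark.sql("DROP TABLE IF EXISTS test_input")

result = Test2Fixture().execute_tests()
print(result.to_string())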

A full end-to-end example, including instructions on how to set up CI/CD pipelines on Azure DevOps, can be found in this repository.

Ryan_Chynoweth
Esteemed Contributor

@Alex Ott has an awesome answer!

Here is a great blog from our engineering team that may help as well.

https://databricks.com/blog/2020/01/16/automate-deployment-and-testing-with-databricks-notebook-mlfl...
