Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Running unit tests from a different notebook (using Python unittest package) doesn't produce output (can't discover the test files)

FG
New Contributor II

I have a test file (test_transforms.py) containing a series of tests written with Python's unittest package. I can run the tests from inside that file successfully, with the expected output. But when I try to run the test file from a different notebook (run_unit_tests), it doesn't produce any test results ('Ran 0 tests in 0.000s').

I have tried keeping both files in the same directory, but it still doesn't work. Here's how the tests are being run:

import os
import unittest

# print(os.getcwd())
# print(os.listdir())

# Run the tests and capture the result instead of exiting the process
run_test = unittest.main(argv=[' '], verbosity=2, exit=False)

assert run_test.result.wasSuccessful(), 'Test failed; see logs above'

When run from a different notebook (run_unit_tests):

(screenshot: 'Ran 0 tests in 0.000s')

When run from the test file (test_transforms.py):

(screenshot: tests run with the expected output)

NB: I have my notebooks and .py files in a repo. How can I successfully run test files from a different notebook? My intent is to have these tests triggered with GitHub Actions (I have the workflow set up successfully).

5 REPLIES

Anonymous
Not applicable

@Fuad Goloba:

When running tests on Databricks, you need to ensure that the test file is uploaded to the Databricks workspace and that the correct path is specified when importing the test module in the notebook that is running the tests. Here's an example of how you could modify your run_unit_tests notebook to run the tests from test_transforms.py on Databricks:

# Import the required modules
import sys
import unittest

# Set the path to the directory containing the test file
test_dir = '/path/to/test/files/'

# Add the test directory to the Python path
sys.path.append(test_dir)

# Import the test module
import test_transforms

# Run the tests
run_test = unittest.main(argv=[' '], verbosity=2, exit=False)

# Check if the tests were successful
assert run_test.result.wasSuccessful(), 'Test failed; see logs above'

Make sure to replace /path/to/test/files/ with the actual path to the directory containing your test file; you can check the path with dbutils.fs.ls(). Once you have verified that the tests run correctly on Databricks, you can set up a GitHub Actions workflow to trigger them as part of your continuous integration process. In the workflow, you can use the Databricks CLI to upload the test file to the workspace and run the tests on Databricks.
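One caveat: unittest.main() collects tests from the __main__ module by default, so even after importing test_transforms it may still report 'Ran 0 tests' when called from a separate notebook. A minimal sketch of pointing the runner at the imported module explicitly (same placeholder path as above):

import sys
import unittest

# Placeholder path to the directory containing the test file
test_dir = '/path/to/test/files/'
sys.path.append(test_dir)

import test_transforms

# unittest.main() defaults to module='__main__'; passing the imported module
# tells it where to collect TestCase classes from instead.
run_test = unittest.main(module=test_transforms, argv=[' '], verbosity=2, exit=False)

assert run_test.result.wasSuccessful(), 'Test failed; see logs above'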

FG
New Contributor II

Thanks for your reply. I have tried the proposed solution, but it still doesn't produce any results. Alternatively, I ran the tests using unittest's discovery methods, collecting all the test files in the directory and running them; this worked (a sketch of that approach is shown below).

Attached is a screenshot of both solutions:

(screenshot)
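A minimal sketch of the discovery-based approach described above, assuming a placeholder test_dir and test files named test_*.py:

import sys
import unittest

# Placeholder path to the directory that holds the test files
test_dir = '/path/to/test/files/'
sys.path.append(test_dir)

# Discover every test_*.py file in the directory and build a single suite
suite = unittest.TestLoader().discover(start_dir=test_dir, pattern='test_*.py')

# Run the suite with verbose output and check the result
result = unittest.TextTestRunner(verbosity=2).run(suite)
assert result.wasSuccessful(), 'Test failed; see logs above'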

Anonymous
Not applicable

Can you send me the code snippets, please?

sparklearner233
New Contributor II

Try this:

import sys
import unittest

# Set the path to the directory containing the test file
test_dir = '/dbfs/mnt/repository/yunjchen/customer_ai_project/packages/visualization/test/model_performance'

# Add the test directory to the Python path
sys.path.append(test_dir)

# Import the test module
import test_ranking

# Load every test defined in the module
all_tests = unittest.TestLoader().loadTestsFromModule(test_ranking)

# Run all tests with verbosity and check the result
test_runner = unittest.TextTestRunner(verbosity=2)
assert test_runner.run(all_tests).wasSuccessful(), 'Test failed; see logs above'

SpaceDC
New Contributor II

Hello, I have exactly the same issue.

In my case, using the ipytest library on a Databricks cluster, this is the error I get when I try to run the tests:

EEEEE [100%]
============================================== ERRORS ==============================================
____________________________ ERROR at setup of test_simple_train_model _____________________________
file /root/.ipykernel/24539/command-4149222638982994-1805159126, line 3
def test_simple_train_model(model: BaseEstimator, data: tuple):
E fixture 'model' not found
> available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.

/root/.ipykernel/24539/command-4149222638982994-1805159126:3
_______________________ ERROR at setup of test_plot_masive_confusion_matrix ________________________
file /root/.ipykernel/24539/command-4149222638982995-2606421544, line 3
def test_plot_masive_confusion_matrix(model: BaseEstimator, data: tuple):
E fixture 'model' not found
> available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.

/root/.ipykernel/24539/command-4149222638982995-2606421544:3
______________________________ ERROR at setup of test_plot_roc_curve _______________________________
file /root/.ipykernel/24539/command-4149222638982996-3955143249, line 3
def test_plot_roc_curve(model: BaseEstimator, data: tuple):
E fixture 'model' not found
> available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.

/root/.ipykernel/24539/command-4149222638982996-3955143249:3
___________________________ ERROR at setup of test_changeAdaptativeDtype ___________________________
file /root/.ipykernel/24539/command-4149222638982997-2293422769, line 3
def test_changeAdaptativeDtype(df_fixture: pd.DataFrame) -> pd.DataFrame:
E fixture 'df_fixture' not found
> available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.

/root/.ipykernel/24539/command-4149222638982997-2293422769:3
__________________________ ERROR at setup of test_noisyVariableSuppressor __________________________
file /root/.ipykernel/24539/command-4149222638982998-15294889, line 3
def test_noisyVariableSuppressor(df_fixture: pd.DataFrame) -> pd.DataFrame:
E fixture 'df_fixture' not found
> available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, capfd, capfdbinary, caplog, capsys, capsysbinary, doctest_namespace, monkeypatch, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
> use 'pytest --fixtures [testpath]' for help on them.

/root/.ipykernel/24539/command-4149222638982998-15294889:3
========================================= warnings summary =========================================
../../local_disk0/.ephemeral_nfs/envs/pythonEnv-177398d8-2e8f-43fd-9db3-1d841e854457/lib/python3.10/site-packages/_pytest/config/__init__.py:1277
/local_disk0/.ephemeral_nfs/envs/pythonEnv-177398d8-2e8f-43fd-9db3-1d841e854457/lib/python3.10/site-packages/_pytest/config/__init__.py:1277: PytestAssertRewriteWarning: Module already imported so cannot be rewritten: anyio
self._mark_plugins_for_rewrite(hook)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
===================================== short test summary info ======================================
ERROR t_7524c4aa66964b16be82dde4f1f448d9.py::test_simple_train_model
ERROR t_7524c4aa66964b16be82dde4f1f448d9.py::test_plot_masive_confusion_matrix
ERROR t_7524c4aa66964b16be82dde4f1f448d9.py::test_plot_roc_curve
ERROR t_7524c4aa66964b16be82dde4f1f448d9.py::test_changeAdaptativeDtype
ERROR t_7524c4aa66964b16be82dde4f1f448d9.py::test_noisyVariableSuppressor

And when I check the pytest fixtures log, I see that the rootdir is set to the Databricks default:

============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-8.3.3, pluggy-1.5.0
rootdir: /databricks/driver
plugins: anyio-3.5.0, typeguard-2.13.3
collected 0 items
cache -- .../_pytest/cacheprovider.py:556
Return a cache object that can persist state between testing sessions.

capsysbinary -- .../_pytest/capture.py:1006
Enable bytes capturing of writes to ``sys.stdout`` and ``sys.stderr``.

capfd -- .../_pytest/capture.py:1034
Enable text capturing of writes to file descriptors ``1`` and ``2``.

capfdbinary -- .../_pytest/capture.py:1062
Enable bytes capturing of writes to file descriptors ``1`` and ``2``.

capsys -- .../_pytest/capture.py:978
Enable text capturing of writes to ``sys.stdout`` and ``sys.stderr``.

doctest_namespace [session scope] -- .../_pytest/doctest.py:741
Fixture that returns a :py:class:`dict` that will be injected into the
namespace of doctests.

pytestconfig [session scope] -- .../_pytest/fixtures.py:1345
Session-scoped fixture that returns the session's :class:`pytest.Config`
object.

record_property -- .../_pytest/junitxml.py:280
Add extra properties to the calling test.

record_xml_attribute -- .../_pytest/junitxml.py:303
Add extra xml attributes to the tag for the calling test.

record_testsuite_property [session scope] -- .../_pytest/junitxml.py:341
Record a new ``<property>`` tag as child of the root ``<testsuite>``.

tmpdir_factory [session scope] -- .../_pytest/legacypath.py:298
Return a :class:`pytest.TempdirFactory` instance for the test session.

tmpdir -- .../_pytest/legacypath.py:305
Return a temporary directory path object which is unique to each test
function invocation, created as a sub directory of the base temporary
directory.

caplog -- .../_pytest/logging.py:598
Access and control log capturing.

monkeypatch -- .../_pytest/monkeypatch.py:31
A convenient fixture for monkey-patching.

recwarn -- .../_pytest/recwarn.py:35
Return a :class:`WarningsRecorder` instance that records all warnings emitted by test functions.

tmp_path_factory [session scope] -- .../_pytest/tmpdir.py:242
Return a :class:`pytest.TempPathFactory` instance for the test session.

tmp_path -- .../_pytest/tmpdir.py:257
Return a temporary directory path object which is unique to each test
function invocation, created as a sub directory of the base temporary
directory.

------------------ fixtures defined from anyio.pytest_plugin -------------------
anyio_backend -- ../python/lib/python3.10/site-packages/anyio/pytest_plugin.py:135
no docstring available

anyio_backend_name -- ../python/lib/python3.10/site-packages/anyio/pytest_plugin.py:140
no docstring available

anyio_backend_options -- ../python/lib/python3.10/site-packages/anyio/pytest_plugin.py:148
no docstring available

============================ no tests ran in 0.01s =============================

However, these tests work perfectly fine locally. I would really appreciate your support.
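The 'fixture ... not found' errors suggest pytest never sees the @pytest.fixture definitions: with the rootdir resolving to /databricks/driver, a conftest.py sitting next to the tests in the repo is not picked up, which would explain why the same tests pass locally. A minimal sketch of one workaround, with hypothetical fixture bodies standing in for the real model and df_fixture, is to define the fixtures in the same notebook context that ipytest runs:

import ipytest
import pandas as pd
import pytest
from sklearn.dummy import DummyClassifier

ipytest.autoconfig()

# Hypothetical stand-ins for the fixtures the tests expect; locally these
# would normally be discovered from a conftest.py under the repo root.
@pytest.fixture
def model():
    return DummyClassifier(strategy="most_frequent")

@pytest.fixture
def df_fixture():
    return pd.DataFrame({"feature": [1, 2, 3], "target": [0, 1, 0]})

# Example test in the same cell, so ipytest collects it from the notebook
def test_model_can_fit(model, df_fixture):
    model.fit(df_fixture[["feature"]], df_fixture["target"])
    assert hasattr(model, "classes_")

ipytest.run("-v")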
