
What is the alternative for sys.exit(0) in Databricks

Ramana
Contributor

Hi,

We are working on a migration project from Cloudera to Databricks.

All our code is in .py files, and we decided to keep it that way in Databricks as well and execute the same files from Git through Databricks Workflows.

We have two kinds of exit functionality requirements:

1. Soft exit with sys.exit(0): when the job criteria meet some condition, we do a soft exit with sys.exit(0), which terminates the job gracefully and marks it as successful.

2. Job termination with sys.exit(1): when the job criteria meet some condition, we terminate the job with sys.exit(1), which marks the job as failed.

The above behavior works as expected in any Python environment, but not in Databricks.

In Databricks, both sys.exit(1) and sys.exit(0) cause the job to be marked as failed.

I read an article in the Community where somebody mentioned: "Usage of spark.stop(), sc.stop(), System.exit() in your application can cause this behavior. Databricks manages the context shutdown on its own. Forcefully closing it can cause this abrupt behavior."

Link: https://community.databricks.com/t5/data-engineering/why-do-i-see-my-job-marked-as-failed-on-the-dat...

If this is true, then what is the best alternative to achieve sys.exit(0) in Databricks?

Any help would be greatly appreciated.

FYI: our code is in .py files and we don't want to use notebooks for PROD jobs.

Here is an example to execute and see the behavior:

import sys

bucket_name = "prod"

if bucket_name == "prod":
    print("Success. It is PROD. Exiting with 0")
    sys.exit(0)
    # Alternatives tried:
    # sys.exit(None)
    # quit()
    # exit()
    # return 0
    # dbutils.exit(0)
    # dbutils.quit()
    # return
    # soft_exit
else:
    print("Fail. It is DEV. Exiting with 1")
    sys.exit(1)
print("outside if else")

 

3 REPLIES

AndrewN
Databricks Employee (Accepted Solution)

Ramana,

I checked internally and the suggestion is to structure your code in such a way that it returns from the main function when a certain condition is met. I modified your code to what you see below.

# sys.exit(0) equivalent
def main():
    bucket_name = "prod"
    if bucket_name == "prod":
        return  # soft exit: main() ends, the script finishes normally, and the job is marked as succeeded
    # Rest of your code

if __name__ == "__main__":
    main()
 
I ran your original code and the job showed as failed in Databricks. Then I ran the modified function that returns from main and the job showed as succeeded. I believe this is the best alternative to achieve your expected sys.exit(0) behavior in Databricks. 
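As a sketch of how both of the original requirements could be kept with this pattern (assuming that an uncaught exception, like sys.exit(1) in the original post, still marks the job run as failed):

import sys

def main():
    bucket_name = "prod"

    if bucket_name == "prod":
        # Soft exit: returning ends main(), the script finishes normally,
        # and the job run is marked as succeeded.
        print("Success. It is PROD. Exiting softly.")
        return

    # Hard failure: an uncaught exception ends the run with a non-zero
    # status, so the job is marked as failed.
    print("Fail. It is DEV. Failing the job.")
    raise RuntimeError("Job criteria not met")
    # sys.exit(1) also works here, per the original post.

if __name__ == "__main__":
    main()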
 

Ramana
Contributor

I tested with simple code and it worked: return statements only work inside functions, so converting all the code into a main function makes this approach work. I will test the same with the full-scale code, and if that works, I will mark this as the solution.

Ramana
Contributor (Accepted Solution)

I tested with different levels of nesting and it is working as expected.

Here is the sample code:

 

import sys

bucket_name = "prod"  # str(sys.argv[1]).lower()

def main():
    i, j = 0, 0
    while j <= 2:
        print(f"while loop iteration: {j}")
        for i in range(0, 3):
            print(f"for loop iteration: {i}")
            if bucket_name == "prod" and j == 1 and i == 1:
                print("Success. It is PROD. Exiting with 0")
                return True
                print("After return")  # never executes: return has already exited main()
            # Rest of your code
            else:
                # print("Fail. It is DEV. Exiting with 1")
                # sys.exit(1)
                print(f"Else: while iteration: {j} and for iteration: {i}")
                continue
            print("outside if else")
            i += 1
            print("inside for loop")
        j += 1
        print("end of for loop")
    print("end of while loop")

if __name__ == "__main__":
    main()
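For reference, in this sample the j = 0 pass runs all three for-loop iterations through the else branch; on the j = 1, i = 1 iteration the success message prints and the return unwinds both loops at once, so the statement after return, the prints after continue, and the final "end of while loop" never execute, and the run finishes as succeeded, matching the "working as expected" result above.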

 

 
