Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Running a notebook job from a remote GitHub repository fails, but it does not fail with the Python script task type

NielsMH
New Contributor III

Hi all

I am trying to run a notebook from a remote repository, but the job fails. 

I set up the job as follows:

job-setup.png

My project structure is as follows:

folder_structure.png

but the output I get is this:

job_output.png

 

The thing is, if I set the job type to "Python script" I don't encounter this problem; the job then runs as expected with no errors. However, I need poc_template_job_script.py to be read as a notebook, so I think the job is not recognizing my project notebook (poc_template_job_script.py).

Does anyone have input on how I can solve this issue? I have not found a similar issue reported anywhere.
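For context, this is roughly the shape of the Jobs API task definition for running a notebook from a remote Git repository. The URL, branch, and cluster ID are placeholders, not values from my actual setup; note that `notebook_path` is given relative to the repo root and without the `.py` extension:

```json
{
  "name": "poc-template-job",
  "git_source": {
    "git_url": "https://github.com/<org>/<repo>",
    "git_provider": "gitHub",
    "git_branch": "main"
  },
  "tasks": [
    {
      "task_key": "run_notebook",
      "notebook_task": {
        "notebook_path": "poc_template_job_script",
        "source": "GIT"
      },
      "existing_cluster_id": "<cluster-id>"
    }
  ]
}
```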

Thanks a lot in advance 🙂

2 REPLIES

karthik_p
Esteemed Contributor

@NielsMH If you want to run your jobs by job name, please try the new preview feature Databricks released, Databricks Asset Bundles (DABs). With bundles, you can run a job based on its job name.
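As a rough illustration (all names here are placeholders, not taken from the original setup), a minimal `databricks.yml` bundle definition might declare the job like this, after which `databricks bundle deploy` and `databricks bundle run poc_template_job` run it by name:

```yaml
# Minimal sketch of a Databricks Asset Bundle config (illustrative names).
bundle:
  name: poc_template_bundle

resources:
  jobs:
    poc_template_job:
      name: poc-template-job
      tasks:
        - task_key: run_notebook
          notebook_task:
            # Path relative to the bundle root, without the .py extension.
            notebook_path: ./poc_template_job_script.py
```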

Regarding the remote repo: are you using GitHub Actions or the API? It looks like the notebook format is not being detected until you specify the file format (in your case, .py).

NielsMH
New Contributor III

Hi karthik_p, thanks for the reply.

If I specify the extension (poc_template_job_script.py), I get the same error. The job setup asks you to omit the extension, which leads to the result in my initial post. And I agree that specifying the extension should work if you look at the documentation:

Supported notebook formats

Databricks can import and export notebooks in the following formats:

  • Source file: A file containing only source code statements with the extension .scala, .py, .sql, or .r.

But somehow it still does not recognize poc_template_job_script.py as a Databricks notebook.

I also tried adding "# Databricks notebook source" at the top of the file, as specified in the documentation:

Import a file and convert it to a notebook

You can convert Python, SQL, Scala, and R scripts to single-cell notebooks by adding a comment to the first cell of the file:

# Databricks notebook source
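To make that concrete, here is a minimal sketch of what a source-format notebook file can look like (the variable and cell contents are purely illustrative): the marker comment on the first line tells Databricks to treat the .py file as a notebook, and "# COMMAND ----------" lines separate cells in multi-cell notebooks.

```python
# Databricks notebook source
# The first-line marker above makes Databricks import this .py file
# as a notebook rather than a plain Python script.

# COMMAND ----------

# Cell 1: define a value (illustrative content).
greeting = "hello from cell 1"

# COMMAND ----------

# Cell 2: use the value defined in the previous cell.
print(greeting)
```

As a plain Python file this also runs top to bottom, since the cell separators are ordinary comments.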

 
