Command skipped but no error message

valskyyy
New Contributor II

Hi all! ✌️

This is my first post here!

I have a problem when I launch a "Run all" on my notebook: at some point (always at the same cell), all the following cells are skipped.

As you can see, command 38 is executed correctly, and command 40 shows an OK status. When I check the temp view "post_reco_transactions" it contains data, and when I check the underlying table "...rod_sui.post_reco_transactions_${country}a..." (sorry, I prefer to partially hide the path), it contains data too. There is no explicit error message.

Of course, all the cells marked "command skipped" have empty objects and produce no output.

When I manually execute the cells from #41 onward, they work, with no error message.

The problem is that because of this I cannot create a job: it would fail on every execution.

Do you have any idea?

1 ACCEPTED SOLUTION


Lakshay
Esteemed Contributor

Hi Valskyyy,

I tried the same steps on my end, and it works fine even when I use Run All. My guess is that some other issue at the cluster level is resetting the cluster, and as a result the temporary view definition is being dropped, since temporary views are tied to a SparkSession.

Could you try putting just the four steps you mentioned in a separate notebook and see if the issue reproduces then?

Also, could you try creating the view using the "df.createOrReplaceTempView" method, to check whether the issue is with the SQL implementation only?


5 REPLIES

Lakshay
Esteemed Contributor

Are all the cells after cell 40 skipped or is it just cell 41?

valskyyy
New Contributor II

Hi Lakshay,

Thanks for your answer. All the following cells are skipped.

I ran more tests; here are the details of the steps (with generic view/path/table names).

I put all these steps in the notebook and do a Run all:

1/ I declare SQL parameters, for example "country" or "ab_test_id"

2/ I create a table my_database.my_table_${country}_ab_test_id_${ab_test_id}

This table is OK; I can display it when I run a simple:

SELECT * FROM my_database.my_table_${country}_ab_test_id_${ab_test_id}

3/ I have to create a temp view from this table

Just to be sure, I first run:

DROP VIEW IF EXISTS my_temp_view

Then I create my temp view:

CREATE OR REPLACE TEMP VIEW my_temp_view AS

SELECT * FROM my_database.my_table_${country}_ab_test_id_${ab_test_id}

>> Status OK, no error message

4/ Then I want to display the temp view:

SELECT * FROM my_temp_view

>> Command skipped (as all the following cells)

If I manually relaunch this cell (SELECT * FROM my_temp_view), the content of the view is displayed, and if I then manually execute all the following cells, they work.

-------------------------------------

I have also tried with a scheduled job. I get the same behavior, but this time there is an explicit error message on the job screen:

[TABLE_OR_VIEW_NOT_FOUND] The table or view my_database.my_table__ab_test_id_ cannot be found. Verify the spelling and correctness of the schema and catalog.

If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.

But I don't understand: in step 2/ I can correctly see the content of my table, so it is created and exists with the parameters in its name.
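For what it's worth, the doubled underscore in the job error (`my_table__ab_test_id_`) suggests that `${country}` and `${ab_test_id}` resolved to empty strings in the scheduled run, i.e. the parameters may not have been passed to the job. A hedged sketch of declaring widget defaults in SQL, so an unparametrized run still substitutes non-empty values (the names come from the thread; the default values are made-up illustrations):

```sql
-- Hedged sketch: give the widgets defaults so a scheduled run that passes
-- no parameters still substitutes non-empty values. 'fr' and '42' are
-- illustrative assumptions, not values from the thread.
CREATE WIDGET TEXT country DEFAULT 'fr';
CREATE WIDGET TEXT ab_test_id DEFAULT '42';

SELECT * FROM my_database.my_table_${country}_ab_test_id_${ab_test_id};
```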


Debayan
Esteemed Contributor III

Could you also please try with a different cluster DBR version and check?

Please tag @Debayan in your next response, which will notify me. Thank you!

Vartika
Moderator

Hi @valskyyy,

Hope all is well!

Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
