
Error on Workflow

JohnSmith2
New Contributor II

Hi, I have a mysterious situation here.

My workflow (job) ran and got this error:

[INVALID_IDENTIFIER] The identifier transactions-catalog is invalid. Please, consider quoting it with back-quotes as `transactions-catalog`. (line 1, pos 12)
== SQL ==
transactions-catalog.transactions-schema.transactions_bronze_table_eu

But when I ran it manually, it succeeded.

Does anyone have any idea about this?

Code:

%python
silver_df = spark.sql(f"""
    SELECT column
    FROM table_changes("`transactions-catalog`.`transactions-schema`.table", {start_version}, {end_version})
""")
*** I have concealed the column and table names.
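
For reference, here is a minimal sketch of one way to write the same query so the hyphenated catalog and schema names are never parsed as bare identifiers: pass the three-level name to table_changes as a single-quoted string literal, with backticks around the hyphenated parts. The column, table, and version values below are placeholders for illustration, not the poster's real objects.

%python
# Sketch with placeholder names: quote the hyphenated catalog/schema with backticks
# inside a single-quoted string literal, so the parser never sees bare hyphens.
start_version = 1   # placeholder versions for illustration
end_version = 5

silver_df = spark.sql(f"""
    SELECT some_column
    FROM table_changes(
        '`transactions-catalog`.`transactions-schema`.some_table',
        {start_version},
        {end_version}
    )
""")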

4 REPLIES

-werners-
Esteemed Contributor III

That is weird indeed.
Are you sure the manual test you did uses the same code as the one in the job (perhaps a different Repos branch)?

JohnSmith2
New Contributor II

I am sure that I am using the same code in both.

-werners-
Esteemed Contributor III (Accepted Solution)

Jobs are just notebooks executed in the background, so if the notebook is the same between the interactive (manual) run and the job run, there should be no difference.
So I don't see what is wrong. Is the job using DLT, perhaps?

JohnSmith2
New Contributor II

OK, sorry about this problem, I just found the difference. It's the runtime version: my manual run uses 12.2, but my job uses 13.3. Thank you for your advice to check for the difference.
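
As a follow-up, here is a small sketch of how a runtime mismatch like this could be spotted earlier: print the runtime at the top of the notebook in both the interactive run and the job run, then compare. This assumes the predefined notebook SparkSession named spark and that the DATABRICKS_RUNTIME_VERSION environment variable is set on the cluster.

%python
# Sketch: print the environment at the top of the notebook so an interactive run
# and a job run can be compared side by side.
import os

print("Databricks Runtime:", os.environ.get("DATABRICKS_RUNTIME_VERSION", "unknown"))
print("Spark version:", spark.version)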
