Single task job that runs SQL notebook, can't retrieve results
10-01-2024 04:50 AM
Hello,
We are integrating Databricks, and I need to run a job with a single task that runs a notebook containing a SQL query.
I can only use a SQL warehouse (no cluster), and I need to retrieve the result of the notebook task, but I can't see the results.
Is there something like dbutils.notebook.exit() for a SQL notebook? (I can't run Python or Java.)
The reason for going this way is that I need to run the job on behalf of another user.
Please help. Thank you.
10-01-2024 07:15 AM
May I know what this query does? Is it a SELECT on a table that isn't showing any results?
10-01-2024 10:16 AM
> I need to retrieve the result of the notebook task
If you want to know whether a task run succeeded, you can enable the "lakeflow" system schema, where you'll find the logs of job and task runs.
You could then run a SQL query against those logs in a task that follows your job's query task.
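As a rough sketch of that approach: once the lakeflow system schema is enabled, a follow-up SQL task could check the outcome of the most recent run of a given job. The table and column names below (`system.lakeflow.job_run_timeline`, `result_state`) reflect the current system-tables layout as I understand it and may differ in your workspace; the job ID is a placeholder.

```sql
-- Sketch: look up the result of the latest completed run of a job.
-- Assumes the lakeflow system schema is enabled; replace 123 with your job_id.
SELECT job_id,
       run_id,
       result_state,
       period_end_time
FROM system.lakeflow.job_run_timeline
WHERE job_id = 123
  AND result_state IS NOT NULL   -- only rows for finished runs carry a state
ORDER BY period_end_time DESC
LIMIT 1;
```

This only tells you whether the run succeeded or failed, not the query's result set itself.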
In any case, please provide a fuller description of the problem you are trying to solve, so people can understand what you are trying to achieve.

