08-24-2022 02:24 AM
We got the following error when running queries on Redash connected to
Databricks earlier today (2022-08-24):
```
Error running query: [HY000] [Simba][Hardy] (35) Error from server:
error code: '0' error message:
'org.apache.spark.sql.catalyst.expressions.UnsafeRow cannot be cast to
org.apache.spark.sql.Row'. (35) (SQLFetch)
```
This happened when the result was cached. If the query was new, then
Redash could get the result.
The query works fine in the Databricks SQL editor.
Any idea how to resolve this? It started earlier today (2022-08-24); before that it worked fine.
The Redash version we are using is v10.0.0 (9c928bd).
- Labels: BI Integrations, Cache, Error Message
Accepted Solutions
09-02-2022 07:16 AM
It was fixed by Databricks last week.
There was no problem with permissions, since whenever the query did not hit the cache we could get the result. In the meantime we worked around it by running `select random(), * from table` instead of `select * from table`, so the query no longer returned the cached result.
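For reference, a minimal sketch of that workaround, assuming a hypothetical table `my_schema.events` (table and column names here are illustrative, not from the original post):

```sql
-- Original query: the Simba/Hardy driver failed when this hit the cached result.
-- select * from my_schema.events;

-- Workaround: include a non-deterministic expression such as random()
-- so the query is treated as new and a fresh result set is fetched.
select random() as cache_buster, *
from my_schema.events;
```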
08-25-2022 09:15 AM
This could be related to user permissions, specifically whether the user has the necessary permission to access the table in the database instance. I understand it works fine in the SQL editor, but can we still check the permissions?
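If it helps, a hedged sketch of how the table permissions could be checked and granted in Databricks SQL, assuming a hypothetical table `my_schema.events` and a hypothetical Redash user `redash@example.com`:

```sql
-- Inspect the existing grants on the table the Redash query reads.
SHOW GRANTS ON TABLE my_schema.events;

-- If the user Redash connects with is missing read access, grant it.
GRANT SELECT ON TABLE my_schema.events TO `redash@example.com`;
```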