I also had this issue, and I resolved it by casting all the RECORD columns in BigQuery to strings before dumping the data.
First, I create a view like this:
create view xxx as
select
  string_1,
  string_2,
  string_3,
  to_json_string(record_1) as record_1,
  to_json_string(record_2) as record_2,
  -- ... one to_json_string() per remaining record column
from yyy
I don't know which RECORD column has the issue, so I cast them all.
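If you would rather identify the RECORD columns instead of casting blindly, BigQuery's INFORMATION_SCHEMA.COLUMNS can list them. Here is a small sketch using the google-cloud-bigquery Python client; the project and dataset names are placeholders:

from google.cloud import bigquery

client = bigquery.Client()  # uses your default GCP credentials

# List every STRUCT (RECORD) column of the source table yyy.
# `my_project.my_dataset` is a placeholder for your actual dataset.
sql = """
    select column_name, data_type
    from `my_project.my_dataset.INFORMATION_SCHEMA.COLUMNS`
    where table_name = 'yyy'
      and data_type like 'STRUCT%'
"""
for row in client.query(sql).result():
    print(row.column_name, row.data_type)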
Then, in Databricks, I query the data only from the view xxx instead of the original table yyy. With this method, I can dump millions of rows from the BigQuery view xxx at once.
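Here is a minimal sketch of the Databricks side, assuming you read through the spark-bigquery-connector (the `spark` session is predefined in Databricks notebooks; the project and dataset names are placeholders). Reading a view, as opposed to a table, requires the viewsEnabled and materializationDataset options:

from pyspark.sql import functions as F

# Read from the view xxx instead of the table yyy.
df = (
    spark.read.format("bigquery")
    .option("viewsEnabled", "true")
    .option("materializationDataset", "my_dataset")  # placeholder dataset for temp results
    .option("table", "my_project.my_dataset.xxx")    # the view created above
    .load()
)

# Each former RECORD column now arrives as a plain JSON string. If you need
# the nested fields back, parse them with from_json (this schema is a placeholder).
df = df.withColumn(
    "record_1_parsed",
    F.from_json("record_1", "field_a string, field_b bigint"),
)

The JSON round trip costs some parsing time, but it sidesteps whichever RECORD column the connector was failing on.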