The way I would make this work for you is to call the R notebooks from your Python notebook, saving each dataframe as a Delta table to pass data between the languages. For how to call a notebook from another notebook, here is a link. A sketch of the R side of that hand-off follows.
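As a rough illustration of the hand-off, here is a minimal sketch of what the called R notebook could look like; the Python notebook would first write the input Delta table and then run this notebook with dbutils.notebook.run. The table names (shared_input, shared_output) and the value column are placeholders I am assuming for the example, not anything from the original thread.

```r
# Sketch of the R notebook that the Python notebook runs via
# dbutils.notebook.run(); table and column names are placeholders.
library(SparkR)

# Read the Delta table that the Python notebook saved.
input_df <- tableToDF("shared_input")

# Collect to a plain data.frame so existing native R code runs unchanged.
local_df <- collect(input_df)
local_df$score <- local_df$value * 2  # stand-in for the real R logic

# Push the result back to Spark and save it as a Delta table
# for the Python notebook to read after the run returns.
output_df <- createDataFrame(local_df)
saveAsTable(output_df, "shared_output", source = "delta", mode = "overwrite")
```

Note that collect() assumes the dataframe fits in driver memory; for larger data you can keep the work in SparkR and skip the collect step entirely.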
Also, confirm that you are not hitting any of these limitations. From dbt's website: "Some databases limit where and how descriptions can be added to database objects. Those database adapters might not support persist_docs, or might offer only partial support for it."
Yes, you can use SparkR in Databricks notebooks, so you can keep your native R code. At the top of the notebook in the Databricks UI you can set the default language to R, so you do not need to add the %r magic to every cell (see the sketch after this answer). You can also ...
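For example, with the notebook's default language set to R, every cell is just ordinary R and SparkR is available directly. This is only a sketch under assumed names: my_table and some_column are placeholders.

```r
# Plain R cell in an R-language notebook -- no %r magic needed.
# Table and column names below are placeholders.
library(SparkR)

# Query an existing table with Spark SQL; the result is a SparkDataFrame.
sdf <- sql("SELECT * FROM my_table LIMIT 1000")

# Collect to a local data.frame and carry on with ordinary R.
df <- collect(sdf)
summary(df)

# Or stay distributed and use SparkR's built-in verbs.
agg <- count(groupBy(sdf, "some_column"))
showDF(agg)
```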