Hi, thank you. There were some issues with my permissions (apparently on the Dataverse resource, not the Data Lake), which is why I was not able to query successfully using abfss. For future reference, I was able to query the Data Lake directly (not the endpoint) using
spark.read.format('delta').load('abfss://<workspace_id>@onelake.dfs.fabric.microsoft.com/<lakehouse_id>/Tables/dbo/<table_name>')
You can also write there, and the data should show up successfully in the endpoint unless this happens.
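As a small sketch of the pattern above: the read and write both target the same OneLake abfss path, so a helper that builds that path from the workspace and lakehouse IDs keeps the two consistent. The helper name and the example IDs below are illustrative, not part of any Fabric API; the commented Spark calls mirror the read shown above and the corresponding Delta write.

```python
# Hypothetical helper: build the OneLake abfss path for a lakehouse table.
# The structure matches the path used in the read example above.
def onelake_table_path(workspace_id: str, lakehouse_id: str,
                       table_name: str, schema: str = "dbo") -> str:
    return (f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse_id}/Tables/{schema}/{table_name}")

path = onelake_table_path("<workspace_id>", "<lakehouse_id>", "<table_name>")

# With a SparkSession available (e.g. in a Fabric notebook):
# df = spark.read.format("delta").load(path)             # read the table
# df.write.format("delta").mode("append").save(path)     # write back to it
```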
This documentation was also quite useful.