03-21-2022 10:30 PM
@Danilo Mendes, Table schemas are stored in the default Azure Databricks internal metastore, and you can also configure and use an external metastore. You can ingest data into Azure Databricks, access data in Apache Spark formats and from external data sources, and use the Apache Spark APIs to work with that data.
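To illustrate the external-metastore option mentioned above, here is a minimal sketch of the cluster Spark configuration for pointing Azure Databricks at an external Hive metastore. The JDBC host, database, user, and password are placeholders, and the Hive version shown is an assumption; use the values for your own metastore.

```
spark.sql.hive.metastore.version 2.3.9
spark.sql.hive.metastore.jars builtin
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<host>:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <password>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
```

These properties go in the cluster's Spark config (or an init script); storing the password in a secret scope rather than plain text is recommended.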
03-21-2022 05:08 AM
@Danilo Mendes, You can cache, filter, and perform any operations supported by Apache Spark DataFrames on Databricks tables. You can query tables with the Spark APIs and with Spark SQL.
03-22-2022 05:10 AM
Hi @Danilo Mendes, Can you please elaborate on the question? Do you want to connect through SSAS?