<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic docs.databricks.com in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/docs-databricks-com/m-p/26801#M18812</link>
    <description>&lt;P&gt;&lt;B&gt;&lt;U&gt;What is a Databricks database?&lt;/U&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;A Databricks database is a collection of tables. A Databricks table is a collection of structured data. You can cache, filter, and perform any operations supported by Apache Spark &lt;A href="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" alt="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" target="_blank"&gt;DataFrames&lt;/A&gt; on Databricks tables. You can query tables with &lt;A href="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" alt="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" target="_blank"&gt;Spark APIs&lt;/A&gt; and &lt;A href="https://docs.databricks.com/spark/latest/spark-sql/index.html#spark-sql-lang-manual" alt="https://docs.databricks.com/spark/latest/spark-sql/index.html#spark-sql-lang-manual" target="_blank"&gt;Spark SQL&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;There are two types of tables: global and local. A &lt;I&gt;global table&lt;/I&gt; is available across all clusters. Databricks registers global tables either in the Databricks &lt;A href="https://hive.apache.org/" alt="https://hive.apache.org/" target="_blank"&gt;Hive&lt;/A&gt; metastore or in an &lt;A href="https://docs.databricks.com/data/metastores/index.html#metastores" alt="https://docs.databricks.com/data/metastores/index.html#metastores" target="_blank"&gt;external Hive metastore&lt;/A&gt;. A &lt;I&gt;local table&lt;/I&gt; is not accessible from other clusters and is not registered in the Hive metastore. 
This is also known as a &lt;I&gt;temporary view&lt;/I&gt;.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;You can create a table using the &lt;A href="https://docs.databricks.com/data/tables.html#create-table-ui" alt="https://docs.databricks.com/data/tables.html#create-table-ui" target="_blank"&gt;Create Table UI&lt;/A&gt; or &lt;A href="https://docs.databricks.com/data/tables.html#create-table-programmatically" alt="https://docs.databricks.com/data/tables.html#create-table-programmatically" target="_blank"&gt;programmatically&lt;/A&gt;. A table can be populated from files in &lt;A href="https://docs.databricks.com/data/databricks-file-system.html#dbfs" alt="https://docs.databricks.com/data/databricks-file-system.html#dbfs" target="_blank"&gt;DBFS&lt;/A&gt; or data stored in any of the supported &lt;A href="https://docs.databricks.com/data/data-sources/index.html#data-sources" alt="https://docs.databricks.com/data/data-sources/index.html#data-sources" target="_blank"&gt;data sources&lt;/A&gt;.&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-redshift.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-redshift.html" target="_blank"&gt;Amazon Redshift&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-s3.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-s3.html" target="_blank"&gt;Amazon S3&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-s3-select.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-s3-select.html" target="_blank"&gt;Amazon S3 Select&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/azure-storage.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-storage.html" target="_blank"&gt;Azure Blob storage&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A 
href="https://docs.databricks.com/data/data-sources/azure/azure-datalake.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-datalake.html" target="_blank"&gt;Azure Data Lake Storage Gen1&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/azure-datalake-gen2.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-datalake-gen2.html" target="_blank"&gt;Azure Data Lake Storage Gen2&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/cosmosdb-connector.html" alt="https://docs.databricks.com/data/data-sources/azure/cosmosdb-connector.html" target="_blank"&gt;Azure Cosmos DB&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/synapse-analytics.html" alt="https://docs.databricks.com/data/data-sources/azure/synapse-analytics.html" target="_blank"&gt;Azure Synapse Analytics&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/sql-databases.html" alt="https://docs.databricks.com/data/data-sources/sql-databases.html" target="_blank"&gt;SQL Databases using JDBC&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/binary-file.html" alt="https://docs.databricks.com/data/data-sources/binary-file.html" target="_blank"&gt;Binary file&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/cassandra.html" alt="https://docs.databricks.com/data/data-sources/cassandra.html" target="_blank"&gt;Cassandra&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/couchbase.html" alt="https://docs.databricks.com/data/data-sources/couchbase.html" target="_blank"&gt;Couchbase&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/elasticsearch.html" alt="https://docs.databricks.com/data/data-sources/elasticsearch.html" 
target="_blank"&gt;ElasticSearch&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/image.html" alt="https://docs.databricks.com/data/data-sources/image.html" target="_blank"&gt;Image&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/hive-tables.html" alt="https://docs.databricks.com/data/data-sources/hive-tables.html" target="_blank"&gt;Hive tables&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/mlflow-experiment.html" alt="https://docs.databricks.com/data/data-sources/mlflow-experiment.html" target="_blank"&gt;MLflow experiment&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/mongodb.html" alt="https://docs.databricks.com/data/data-sources/mongodb.html" target="_blank"&gt;MongoDB&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/neo4j.html" alt="https://docs.databricks.com/data/data-sources/neo4j.html" target="_blank"&gt;Neo4j&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/oracle.html" alt="https://docs.databricks.com/data/data-sources/oracle.html" target="_blank"&gt;Oracle&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-avro.html" alt="https://docs.databricks.com/data/data-sources/read-avro.html" target="_blank"&gt;Avro files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-csv.html" alt="https://docs.databricks.com/data/data-sources/read-csv.html" target="_blank"&gt;CSV files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-json.html" alt="https://docs.databricks.com/data/data-sources/read-json.html" target="_blank"&gt;JSON files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-lzo.html" alt="https://docs.databricks.com/data/data-sources/read-lzo.html" 
target="_blank"&gt;LZO compressed files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-parquet.html" alt="https://docs.databricks.com/data/data-sources/read-parquet.html" target="_blank"&gt;Parquet files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/redis.html" alt="https://docs.databricks.com/data/data-sources/redis.html" target="_blank"&gt;Redis&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/riak-ts.html" alt="https://docs.databricks.com/data/data-sources/riak-ts.html" target="_blank"&gt;Riak Time Series&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/snowflake.html" alt="https://docs.databricks.com/data/data-sources/snowflake.html" target="_blank"&gt;Snowflake&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/zip-files.html" alt="https://docs.databricks.com/data/data-sources/zip-files.html" target="_blank"&gt;Zip Files&lt;/A&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;Managed and Unmanaged Tables&lt;/B&gt;&lt;/P&gt;&lt;P&gt;Every Spark SQL table has metadata information that stores the schema and the data itself.&lt;/P&gt;&lt;P&gt;A &lt;I&gt;managed table&lt;/I&gt; is a Spark SQL table for which Spark manages both the data and the metadata. In the case of managed table, Databricks stores the metadata and data in DBFS in your account. Since Spark SQL manages the tables, doing a DROP TABLE example_data deletes &lt;I&gt;both the metadata and data&lt;/I&gt;.&lt;/P&gt;&lt;P&gt;Another option is to let Spark SQL manage the metadata, while you control the data location. We refer to this as an &lt;I&gt;unmanaged table&lt;/I&gt;. Spark SQL manages the relevant metadata, so when you perform DROP TABLE &amp;lt;example-table&amp;gt;, Spark removes only the metadata and not the data itself. 
The data is still present in the path you provided.&lt;/P&gt;&lt;P&gt;You can create an unmanaged table with your data in data sources such as Cassandra, a JDBC table, and so on. See &lt;A href="https://docs.databricks.com/data/data-sources/index.html" alt="https://docs.databricks.com/data/data-sources/index.html" target="_blank"&gt;Data sources&lt;/A&gt; for more information about the data sources supported by Databricks.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;More info here: &lt;A href="https://docs.databricks.com/data/tables.html" alt="https://docs.databricks.com/data/tables.html" target="_blank"&gt;https://docs.databricks.com/data/tables.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
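The global vs. local distinction described above can be sketched in Spark SQL; table and view names here are illustrative, not from the original post:

```sql
-- Global table: registered in the metastore and visible to all clusters.
CREATE TABLE my_global_table AS SELECT 1 AS id, 'alpha' AS name;

-- Local table (temporary view): scoped to the current Spark session and
-- never registered in the Hive metastore.
CREATE TEMPORARY VIEW my_temp_view AS SELECT 1 AS id, 'alpha' AS name;
```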
    <pubDate>Fri, 21 May 2021 18:29:21 GMT</pubDate>
    <dc:creator>User16790091296</dc:creator>
    <dc:date>2021-05-21T18:29:21Z</dc:date>
    <item>
      <title>docs.databricks.com</title>
      <link>https://community.databricks.com/t5/data-engineering/docs-databricks-com/m-p/26801#M18812</link>
      <description>&lt;P&gt;&lt;B&gt;&lt;U&gt;What is a Databricks database?&lt;/U&gt;&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;A Databricks database is a collection of tables. A Databricks table is a collection of structured data. You can cache, filter, and perform any operations supported by Apache Spark &lt;A href="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" alt="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" target="_blank"&gt;DataFrames&lt;/A&gt; on Databricks tables. You can query tables with &lt;A href="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" alt="https://docs.databricks.com/spark/latest/dataframes-datasets/index.html#dataframes" target="_blank"&gt;Spark APIs&lt;/A&gt; and &lt;A href="https://docs.databricks.com/spark/latest/spark-sql/index.html#spark-sql-lang-manual" alt="https://docs.databricks.com/spark/latest/spark-sql/index.html#spark-sql-lang-manual" target="_blank"&gt;Spark SQL&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;There are two types of tables: global and local. A &lt;I&gt;global table&lt;/I&gt; is available across all clusters. Databricks registers global tables either in the Databricks &lt;A href="https://hive.apache.org/" alt="https://hive.apache.org/" target="_blank"&gt;Hive&lt;/A&gt; metastore or in an &lt;A href="https://docs.databricks.com/data/metastores/index.html#metastores" alt="https://docs.databricks.com/data/metastores/index.html#metastores" target="_blank"&gt;external Hive metastore&lt;/A&gt;. A &lt;I&gt;local table&lt;/I&gt; is not accessible from other clusters and is not registered in the Hive metastore. 
This is also known as a &lt;I&gt;temporary view&lt;/I&gt;.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;You can create a table using the &lt;A href="https://docs.databricks.com/data/tables.html#create-table-ui" alt="https://docs.databricks.com/data/tables.html#create-table-ui" target="_blank"&gt;Create Table UI&lt;/A&gt; or &lt;A href="https://docs.databricks.com/data/tables.html#create-table-programmatically" alt="https://docs.databricks.com/data/tables.html#create-table-programmatically" target="_blank"&gt;programmatically&lt;/A&gt;. A table can be populated from files in &lt;A href="https://docs.databricks.com/data/databricks-file-system.html#dbfs" alt="https://docs.databricks.com/data/databricks-file-system.html#dbfs" target="_blank"&gt;DBFS&lt;/A&gt; or data stored in any of the supported &lt;A href="https://docs.databricks.com/data/data-sources/index.html#data-sources" alt="https://docs.databricks.com/data/data-sources/index.html#data-sources" target="_blank"&gt;data sources&lt;/A&gt;.&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-redshift.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-redshift.html" target="_blank"&gt;Amazon Redshift&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-s3.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-s3.html" target="_blank"&gt;Amazon S3&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/aws/amazon-s3-select.html" alt="https://docs.databricks.com/data/data-sources/aws/amazon-s3-select.html" target="_blank"&gt;Amazon S3 Select&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/azure-storage.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-storage.html" target="_blank"&gt;Azure Blob storage&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A 
href="https://docs.databricks.com/data/data-sources/azure/azure-datalake.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-datalake.html" target="_blank"&gt;Azure Data Lake Storage Gen1&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/azure-datalake-gen2.html" alt="https://docs.databricks.com/data/data-sources/azure/azure-datalake-gen2.html" target="_blank"&gt;Azure Data Lake Storage Gen2&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/cosmosdb-connector.html" alt="https://docs.databricks.com/data/data-sources/azure/cosmosdb-connector.html" target="_blank"&gt;Azure Cosmos DB&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/azure/synapse-analytics.html" alt="https://docs.databricks.com/data/data-sources/azure/synapse-analytics.html" target="_blank"&gt;Azure Synapse Analytics&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/sql-databases.html" alt="https://docs.databricks.com/data/data-sources/sql-databases.html" target="_blank"&gt;SQL Databases using JDBC&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/binary-file.html" alt="https://docs.databricks.com/data/data-sources/binary-file.html" target="_blank"&gt;Binary file&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/cassandra.html" alt="https://docs.databricks.com/data/data-sources/cassandra.html" target="_blank"&gt;Cassandra&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/couchbase.html" alt="https://docs.databricks.com/data/data-sources/couchbase.html" target="_blank"&gt;Couchbase&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/elasticsearch.html" alt="https://docs.databricks.com/data/data-sources/elasticsearch.html" 
target="_blank"&gt;ElasticSearch&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/image.html" alt="https://docs.databricks.com/data/data-sources/image.html" target="_blank"&gt;Image&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/hive-tables.html" alt="https://docs.databricks.com/data/data-sources/hive-tables.html" target="_blank"&gt;Hive tables&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/mlflow-experiment.html" alt="https://docs.databricks.com/data/data-sources/mlflow-experiment.html" target="_blank"&gt;MLflow experiment&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/mongodb.html" alt="https://docs.databricks.com/data/data-sources/mongodb.html" target="_blank"&gt;MongoDB&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/neo4j.html" alt="https://docs.databricks.com/data/data-sources/neo4j.html" target="_blank"&gt;Neo4j&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/oracle.html" alt="https://docs.databricks.com/data/data-sources/oracle.html" target="_blank"&gt;Oracle&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-avro.html" alt="https://docs.databricks.com/data/data-sources/read-avro.html" target="_blank"&gt;Avro files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-csv.html" alt="https://docs.databricks.com/data/data-sources/read-csv.html" target="_blank"&gt;CSV files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-json.html" alt="https://docs.databricks.com/data/data-sources/read-json.html" target="_blank"&gt;JSON files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-lzo.html" alt="https://docs.databricks.com/data/data-sources/read-lzo.html" 
target="_blank"&gt;LZO compressed files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/read-parquet.html" alt="https://docs.databricks.com/data/data-sources/read-parquet.html" target="_blank"&gt;Parquet files&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/redis.html" alt="https://docs.databricks.com/data/data-sources/redis.html" target="_blank"&gt;Redis&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/riak-ts.html" alt="https://docs.databricks.com/data/data-sources/riak-ts.html" target="_blank"&gt;Riak Time Series&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/snowflake.html" alt="https://docs.databricks.com/data/data-sources/snowflake.html" target="_blank"&gt;Snowflake&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;&lt;A href="https://docs.databricks.com/data/data-sources/zip-files.html" alt="https://docs.databricks.com/data/data-sources/zip-files.html" target="_blank"&gt;Zip Files&lt;/A&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;Managed and Unmanaged Tables&lt;/B&gt;&lt;/P&gt;&lt;P&gt;Every Spark SQL table has metadata information that stores the schema and the data itself.&lt;/P&gt;&lt;P&gt;A &lt;I&gt;managed table&lt;/I&gt; is a Spark SQL table for which Spark manages both the data and the metadata. In the case of managed table, Databricks stores the metadata and data in DBFS in your account. Since Spark SQL manages the tables, doing a DROP TABLE example_data deletes &lt;I&gt;both the metadata and data&lt;/I&gt;.&lt;/P&gt;&lt;P&gt;Another option is to let Spark SQL manage the metadata, while you control the data location. We refer to this as an &lt;I&gt;unmanaged table&lt;/I&gt;. Spark SQL manages the relevant metadata, so when you perform DROP TABLE &amp;lt;example-table&amp;gt;, Spark removes only the metadata and not the data itself. 
The data is still present in the path you provided.&lt;/P&gt;&lt;P&gt;You can create an unmanaged table with your data in data sources such as Cassandra, a JDBC table, and so on. See &lt;A href="https://docs.databricks.com/data/data-sources/index.html" alt="https://docs.databricks.com/data/data-sources/index.html" target="_blank"&gt;Data sources&lt;/A&gt; for more information about the data sources supported by Databricks.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;More info here: &lt;A href="https://docs.databricks.com/data/tables.html" alt="https://docs.databricks.com/data/tables.html" target="_blank"&gt;https://docs.databricks.com/data/tables.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
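The managed vs. unmanaged behavior of DROP TABLE described above can be sketched in Spark SQL; the table names and the storage path are illustrative assumptions, not from the original post:

```sql
-- Managed table: Spark SQL controls both metadata and data (stored in DBFS).
CREATE TABLE example_data (id INT, name STRING);

-- Unmanaged (external) table: you control the data location via LOCATION.
CREATE TABLE example_external (id INT, name STRING)
LOCATION '/mnt/my-bucket/example-external';  -- hypothetical path

-- Dropping the managed table deletes metadata AND data;
-- dropping the unmanaged table removes only the metadata, and the files
-- remain at the LOCATION path.
DROP TABLE example_data;
DROP TABLE example_external;
```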
      <pubDate>Fri, 21 May 2021 18:29:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/docs-databricks-com/m-p/26801#M18812</guid>
      <dc:creator>User16790091296</dc:creator>
      <dc:date>2021-05-21T18:29:21Z</dc:date>
    </item>
  </channel>
</rss>

