- 2684 Views
- 7 replies
- 3 kudos
Whenever my cluster is terminated, I lose my whole database (I'm not sure if it's related, but I made those databases with the Delta format). And since the cluster is terminated after 2 hours of not using it, I wake up with no database every morning. I don't wa...
Latest Reply
Once the cluster gets terminated, the info will be lost.
- 3347 Views
- 5 replies
- 3 kudos
I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A, hope everything is going great. Just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
- 15623 Views
- 13 replies
- 8 kudos
Latest Reply
Alright, we've implemented a workaround for this, and so far it's been working very well: first, we created a reusable notebook to wait until Hive has been initialized (see code below). We then execute this notebook using the %run command at the top of...
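The reply's actual notebook code is truncated, but the pattern it describes (poll until the metastore answers, then proceed) can be sketched as a generic wait loop. This is a hypothetical reconstruction, not the poster's code; the function and parameter names are my own.

```python
import time

def wait_until_ready(check, timeout_s=300, interval_s=5):
    """Poll `check()` until it returns True or the timeout expires.

    `check` is any zero-argument callable that returns True once the
    dependency (e.g. the Hive metastore) responds; on Databricks it might
    wrap a cheap call like spark.sql("SHOW DATABASES") in a try/except
    (that usage is an assumption, not shown in the thread).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval_s)
    raise TimeoutError(f"dependency was not ready within {timeout_s}s")
```

A notebook like this can then be invoked via `%run` at the top of every dependent notebook, as the reply describes.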
- 2495 Views
- 5 replies
- 9 kudos
I am trying to create a sqlite database in databricks and add a few tables to it. Ultimately, I want to export this using Azure. Is this possible?
Latest Reply
@Hubert Dudek We currently have a process in place that reads in a SQLite file. We recently transitioned to using Databricks. We were hoping to be able to create a SQLite file so we didn't have to alter the current process we have in place.
- 1834 Views
- 3 replies
- 4 kudos
I'm following the class "DE 3.1 - Databases and Tables on Databricks", but it is not possible to create databases due to "AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: org.apache.hadoop.fs.Unsupp...
Latest Reply
A colleague from my work figured out the problem: the cluster being used wasn't configured to use DBFS when running notebooks.
- 641 Views
- 0 replies
- 4 kudos
Hello Bricksters, we organize the delta lake in multiple storage accounts: one storage account per data domain and one container per database. This helps us isolate resources and cost at the business-domain level. Earlier, when a schema/database...
by Dicer • Valued Contributor
- 1750 Views
- 5 replies
- 5 kudos
I wanted to save my Delta tables in my Databricks database. When I saveAsTable, there is an error message: Azure Databricks: AnalysisException: Database 'bf' not found. Yes, there is no database named "bf" in my database. Here is my full code: import os
i...
Latest Reply
Some data can be saved as Delta tables, while some cannot.
- 2369 Views
- 6 replies
- 4 kudos
Hello, I want to create a database (schema) and tables in my Databricks workspace using Terraform. I found this resource: databricks_schema. It requires databricks_catalog, which requires metastore_id. However, I have databricks_workspace and I did not cre...
Latest Reply
Hi @Łukasz Jaremek, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s and @Atanu Sarkar's responses help you find the solution? Please let us know.
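For workspaces where a Unity Catalog metastore is attached, the resources the poster mentions chain together roughly as follows. This is a hedged sketch, not a verified configuration for the poster's workspace; all names are placeholders, and without a metastore these resources cannot be applied at all.

```hcl
# Hypothetical sketch: a catalog plus a schema via the Databricks
# Terraform provider. Assumes a Unity Catalog metastore is already
# attached to the workspace.
resource "databricks_catalog" "sandbox" {
  name    = "sandbox"
  comment = "managed by terraform"
}

resource "databricks_schema" "things" {
  catalog_name = databricks_catalog.sandbox.name
  name         = "things"
  comment      = "managed by terraform"
}
```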
by Jeff1 • Contributor II
- 2025 Views
- 6 replies
- 5 kudos
I'm new to integrating the sparklyr / R interface in Databricks. In particular, it appears that sparklyr and R commands and functions depend on the type of dataframe one is working with (Hive, SparkR, etc.). Is there a recommended best practice...
Latest Reply
Hi @Jeff Reichman, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s response help you find the solution? Please let us know.
- 1755 Views
- 8 replies
- 5 kudos
Hello. I'm fairly new to Databricks and Spark. I have a requirement to connect to Databricks using JDBC, and that works perfectly using the driver I downloaded from the Databricks website ("com.simba.spark.jdbc.Driver"). What I would like to do now is ha...
Latest Reply
Hi @Gurps Bassi, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s and @Werner Stinckens's responses help you find the solution? Please let us know.
by Jan_A • New Contributor III
- 2799 Views
- 5 replies
- 5 kudos
Hi, I have a Databricks database that was created in the DBFS root S3 bucket, containing managed tables. I am looking for a way to move/migrate it to a mounted S3 bucket instead and keep the database name. Any good ideas on how this can be done? T...
Latest Reply
Hi @Jan Ahlbeck, did @DARSHAN BARGAL's solution work in your case?
- 1577 Views
- 3 replies
- 4 kudos
Hello, I'm working in Databricks Community Edition, so I terminate my cluster after my work because it will be terminated after 2 hours anyway. I'm creating a database to store all my transformed data. Will the database be deleted when I termina...
Latest Reply
@Hubert Dudek - Thanks for answering so quickly! @Sriram Devineedi - If Hubert's answer solved the issue for you, would you be happy to mark his answer as best? That helps others know where to look.
- 4055 Views
- 5 replies
- 12 kudos
Hi guys, I have a trial Databricks account. I realized that when I shut down the cluster, my databases and tables disappear. Is that correct, or is it because my account is a trial?
Latest Reply
@William Scardua if it's an external Hive metastore or Glue catalog, you might be missing the configuration on the cluster: https://docs.databricks.com/data/metastores/index.html Also, as mentioned by @Hubert Dudek, if it's a Community Edition, then t...
- 942 Views
- 1 reply
- 0 kudos
Hello people, I'm trying to build a facial recognition application, and I have a working API that takes in an image of a face and spits out a vector that encodes it. I need to run this on a million faces, store them in a db, and when the system goes o...
Latest Reply
Dan_Z • Honored Contributor
You could do this with Spark, storing in Parquet/Delta. For each face you would write out a record with a column for metadata, a column for the encoded vector array, and other columns for hashing. You could use a Pandas UDF to do the distributed dista...
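The per-record comparison that reply alludes to comes down to a vector-distance function over the stored encodings. A minimal pure-Python sketch (the in-memory dict here stands in for records read from Parquet/Delta; inside a Pandas UDF the same arithmetic would be vectorized over columns):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query, gallery):
    """Return (id, similarity) of the stored encoding most similar to `query`.

    `gallery` maps face id -> encoding vector, a stand-in for the
    records the reply suggests persisting in Parquet/Delta.
    """
    return max(
        ((fid, cosine_similarity(query, vec)) for fid, vec in gallery.items()),
        key=lambda pair: pair[1],
    )
```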
- 1475 Views
- 2 replies
- 1 kudos
Is it possible to have a folder or database within a database in Azure Databricks? I know you can use "create database if not exists xxx" to get a database, but I want to have folders within that database where I can put tables.
Latest Reply
The default location of a database will be /user/hive/warehouse/<databasename>.db. Irrespective of the location of the database, the tables in it can have different locations, which can be specified at the time of creation. Databas...