- 40749 Views
- 15 replies
- 8 kudos
Latest Reply
Getting the same issue after installing the ojdbc driver for Oracle.
14 More Replies
- 6497 Views
- 8 replies
- 3 kudos
Whenever my cluster is terminated, I lose my whole database (I'm not sure if it's related, but I created those databases with the Delta format). And since the cluster is terminated after 2 hours of inactivity, I wake up with no database every morning. I don't wa...
Latest Reply
As the files are still in DBFS, you can just recreate the references to your tables and continue working, with something like this:
from pathlib import Path
db_name = "mydb"
path_db = f"dbfs:/user/hive/warehouse/{db_name}.db/"
tables_dirs = dbutils.fs....
7 More Replies
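The truncated reply above can be fleshed out as a rough sketch. Assuming the standard Hive warehouse layout, one way to rebuild the metastore references is to list each table directory and issue a CREATE TABLE ... USING DELTA LOCATION statement for it. The helper below only builds the DDL strings (the database and table names are hypothetical); in a real notebook you would get the directory listing from dbutils.fs.ls(path_db) and run each statement with spark.sql(...).

```python
def rebuild_table_ddl(db_name, table_dirs):
    """Given a database name and its table directory names, return the
    SQL statements that re-register each Delta table in the metastore."""
    path_db = f"dbfs:/user/hive/warehouse/{db_name}.db/"
    statements = [f"CREATE DATABASE IF NOT EXISTS {db_name}"]
    for table in table_dirs:
        location = f"{path_db}{table}"
        statements.append(
            f"CREATE TABLE IF NOT EXISTS {db_name}.{table} "
            f"USING DELTA LOCATION '{location}'"
        )
    return statements

# Hypothetical table directories found under the database path:
ddl = rebuild_table_ddl("mydb", ["sales", "customers"])
for stmt in ddl:
    print(stmt)
```

Because the Delta files themselves survive cluster termination, re-running these statements is cheap: it only rewrites metastore entries, not data.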
- 6893 Views
- 3 replies
- 4 kudos
I was going through the Data Engineering with Databricks training, and in the DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A, hope everything is going great. Just wanted to check in to see whether you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
2 More Replies
- 5618 Views
- 4 replies
- 8 kudos
I am trying to create a SQLite database in Databricks and add a few tables to it. Ultimately, I want to export this using Azure. Is this possible?
Latest Reply
@Hubert Dudek We currently have a process in place that reads in a SQLite file. We recently transitioned to using Databricks. We were hoping to be able to create a SQLite file so we didn't have to alter the current process we have in place.
3 More Replies
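Since a SQLite database is just a single file, one hedged approach is to build it with Python's built-in sqlite3 module on the driver's local disk and then copy it out to cloud storage. A minimal sketch, assuming a throwaway table and paths (all names here are hypothetical); the final copy step only works inside Databricks, so it is left as a comment.

```python
import os
import sqlite3
import tempfile

# Build the SQLite file on local disk first (hypothetical path).
local_path = os.path.join(tempfile.mkdtemp(), "export.db")

conn = sqlite3.connect(local_path)
conn.execute("CREATE TABLE metrics (name TEXT, value REAL)")
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?)",
    [("precision", 0.91), ("recall", 0.87)],
)
conn.commit()

# Sanity check: count the rows we just wrote.
rows = conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
conn.close()
print(rows)

# Inside Databricks you could then copy the file to a mounted Azure
# container, e.g.:
# dbutils.fs.cp(f"file:{local_path}", "dbfs:/mnt/azure-container/export.db")
```

This keeps the downstream process that reads a SQLite file unchanged: Databricks produces the same artifact, just on cloud storage.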
- 4384 Views
- 3 replies
- 4 kudos
I'm following the class "DE 3.1 - Databases and Tables on Databricks", but it is not possible to create databases due to "AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: org.apache.hadoop.fs.Unsupp...
Latest Reply
A colleague from my work figured out the problem: the cluster being used wasn't configured to use DBFS when running notebooks.
2 More Replies
- 1506 Views
- 0 replies
- 4 kudos
Hello Bricksters, we organize the delta lake in multiple storage accounts: one storage account per data domain and one container per database. This helps us isolate the resources and cost at the business-domain level. Earlier, when a schema/database...
by Dicer • Valued Contributor
- 3818 Views
- 5 replies
- 5 kudos
I wanted to save my delta tables in my Databricks database. When I saveAsTable, there is an error message: "Azure Databricks: AnalysisException: Database 'bf' not found". Yes, there is no database named "bf" in my database. Here is my full code:
import os
i...
Latest Reply
Some data can be saved as delta tables while some cannot.
4 More Replies
- 5204 Views
- 4 replies
- 4 kudos
Hello, I want to create a database (schema) and tables in my Databricks workspace using Terraform. I found this resource: databricks_schema. It requires databricks_catalog, which requires metastore_id. However, I have databricks_workspace and I did not cre...
Latest Reply
Atanu • Databricks Employee
https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/schema — I think this is for UC: https://docs.databricks.com/data-governance/unity-catalog/index.html
3 More Replies
by Jeff1 • Contributor II
- 4649 Views
- 5 replies
- 5 kudos
I'm new to integrating the sparklyr / R interface in Databricks. In particular, it appears that sparklyr and R commands and functions depend on the type of dataframe one is working with (Hive, SparkR, etc.). Is there a recommended best practice...
Latest Reply
The recommended format is Delta in the data lake. Here is a code example: https://docs.databricks.com/delta/quick-start.html#language-r
4 More Replies
- 4367 Views
- 7 replies
- 5 kudos
Hello. I'm fairly new to Databricks and Spark. I have a requirement to connect to Databricks using JDBC, and that works perfectly using the driver I downloaded from the Databricks website ("com.simba.spark.jdbc.Driver"). What I would like to do now is ha...
Latest Reply
@Gurps Bassi, a "running instance of a database in docker" — that is the hive metastore, so it just maps to data which usually lives physically on the data lake. Databricks is so cloud-oriented that setting up a metastore locally doesn't make sense. Inste...
6 More Replies
by Jan_A • New Contributor III
- 5375 Views
- 3 replies
- 5 kudos
Hi, I have a Databricks database that was created in the dbfs root S3 bucket, containing managed tables. I am looking for a way to move/migrate it to a mounted S3 bucket instead and keep the database name. Any good ideas on how this can be done? T...
Latest Reply
Hi @Jan Ahlbeck, we can use the below property to set the default location: "spark.sql.warehouse.dir": "S3 URL/dbfs path". Please let me know if this helps.
2 More Replies
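If the goal is to default new databases to the mounted bucket rather than the DBFS root, the property above can be set as a cluster-level Spark config; a minimal sketch, assuming a hypothetical mount path:

```
spark.sql.warehouse.dir dbfs:/mnt/mybucket/warehouse
```

Note that this only affects databases created afterwards; existing databases keep their recorded locations, so migrating already-managed tables still means copying the data and recreating the tables at the new location.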
- 3080 Views
- 3 replies
- 4 kudos
Hello, I'm working in Databricks Community Edition, so I terminate my cluster after my work because it will be terminated after 2 hours anyway. I'm creating a database to store all my transformed data. Will the database be deleted when I termina...
Latest Reply
@Hubert Dudek - Thanks for answering so quickly!! @Sriram Devineedi - If Hubert's answer solved the issue for you, would you be happy to mark his answer as best? That helps others know where to look.
2 More Replies
- 7848 Views
- 5 replies
- 12 kudos
Hi guys, I have a trial Databricks account. I realized that when I shut down the cluster, my databases and tables disappear. Is that correct, or is it because my account is a trial?
Latest Reply
@William Scardua, if it's an external hive metastore or Glue catalog, you might be missing the configuration on the cluster: https://docs.databricks.com/data/metastores/index.html Also, as mentioned by @Hubert Dudek, if it's a Community Edition then t...
4 More Replies
- 2395 Views
- 1 replies
- 0 kudos
Hello people, I'm trying to build a facial recognition application, and I have a working API that takes in an image of a face and spits out a vector that encodes it. I need to run this on a million faces, store them in a db, and when the system goes o...
Latest Reply
Dan_Z • Databricks Employee
You could do this with Spark, storing in Parquet/Delta. For each face you would write out a record with a column for metadata, a column for the encoded vector array, and other columns for hashing. You could use a PandasUDF to do the distributed dista...
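As a rough, Spark-free illustration of the distance step in the reply above: each stored record pairs an id with its encoded vector, and a query vector is compared against the whole set. Inside a PandasUDF this same logic would run per batch over the Delta table; the names and vectors below are made up.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(query, records):
    """records: list of (face_id, vector); return the closest face_id."""
    return max(records, key=lambda r: cosine_similarity(query, r[1]))[0]

# Hypothetical stored encodings:
faces = [("alice", [1.0, 0.0, 0.0]), ("bob", [0.0, 1.0, 0.0])]
match = best_match([0.9, 0.1, 0.0], faces)
print(match)  # closest to "alice"
```

At a million faces this brute-force scan is exactly what Spark would parallelize: the records live in a Delta table and each partition computes similarities against the broadcast query vector.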
- 2835 Views
- 2 replies
- 1 kudos
Is it possible to have a folder or database within a database in Azure Databricks? I know you can use "create database if not exists xxx" to get a database, but I want to have folders within that database where I can put tables.
Latest Reply
The default location of a database will be /user/hive/warehouse/<databasename.db>. Irrespective of the location of the database, the tables in the database can have different locations, and they can be specified at the time of creation. Databas...
1 More Replies
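To illustrate the reply above, here is a small helper that builds the two kinds of DDL (all database, table, and path names are hypothetical): omitting LOCATION puts the table under the database's default directory, while supplying one creates the table at an explicit path, effectively giving you per-table "folders".

```python
def create_table_ddl(db, table, location=None):
    """Build a CREATE TABLE statement; an explicit LOCATION overrides
    the database's default directory for this one table."""
    stmt = f"CREATE TABLE {db}.{table} (id INT, payload STRING)"
    if location:
        stmt += f" LOCATION '{location}'"
    return stmt

# Lands under /user/hive/warehouse/mydb.db/events by default:
default_tbl = create_table_ddl("mydb", "events")
# Lands at an explicit (hypothetical) Azure storage path:
external_tbl = create_table_ddl(
    "mydb", "archive", "abfss://cold@acct.dfs.core.windows.net/archive"
)
print(default_tbl)
print(external_tbl)
```

In a notebook each statement would be executed with spark.sql(...); the database itself stays one flat namespace, but its tables' data can be organized into whatever directory layout you like.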