- 3238 Views
- 1 replies
- 0 kudos
Hello people, I'm trying to build a facial recognition application, and I have a working API that takes in an image of a face and spits out a vector that encodes it. I need to run this on a million faces, store them in a db, and when the system goes o...
Latest Reply
Dan_Z
Databricks Employee
You could do this with Spark, storing the data in Parquet/Delta. For each face you would write out a record with a column for metadata, a column for the encoded vector array, and other columns for hashing. You could use a Pandas UDF to do the distributed dista...
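A minimal sketch of the batch scoring that would run inside such a Pandas UDF, assuming the encoded vectors are stored as float arrays and cosine distance is the metric (the function and column names here are illustrative, not from the original reply):

```python
import numpy as np
import pandas as pd

def cosine_dist(vectors: pd.Series, query: np.ndarray) -> pd.Series:
    """For a batch of encoded face vectors, return cosine distance to `query`.

    This is the per-batch logic a Spark pandas UDF would apply; each element
    of `vectors` is one face's encoding as a NumPy array.
    """
    mat = np.stack(vectors.to_numpy())  # shape (batch_size, dim)
    sims = mat @ query / (np.linalg.norm(mat, axis=1) * np.linalg.norm(query))
    return pd.Series(1.0 - sims)

# In Spark this could be wrapped and applied over the Delta table, roughly:
#   from pyspark.sql.functions import pandas_udf
#   dist_udf = pandas_udf(lambda v: cosine_dist(v, query_vec), "double")
#   faces = spark.read.format("delta").load("/mnt/faces")  # illustrative path
#   faces.withColumn("dist", dist_udf("vector")).orderBy("dist").limit(10)
```

Each Spark task then scores its own batch of vectors in vectorized NumPy, which is what makes the million-row scan tractable.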
- 1370 Views
- 1 replies
- 0 kudos
Can we use Databricks Delta Lake as a data warehouse kind of thing, where business analysts can explore data according to their needs?
Delta Lake provides the following features, which I think support this idea:
- support for SQL syntax
- provides ACID guarante...
Latest Reply
Dan_Z
Databricks Employee
@austiamel47, Yes, you can certainly do this. Delta Lake is designed to be competitive with traditional data warehouses and, with some tuning, can power low-latency dashboards. https://databricks.com/glossary/data-lakehouse
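As a sketch of the analyst-facing workflow this enables (table name and path are hypothetical, not from the reply), a Delta table can be registered and queried with plain SQL from a Databricks notebook or SQL endpoint:

```sql
-- Register an existing Delta table so analysts can query it by name
CREATE TABLE IF NOT EXISTS sales
USING DELTA
LOCATION '/mnt/lake/sales';

-- Analysts then explore it with ordinary SQL
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
ORDER BY total DESC;
```

Because Delta provides ACID transactions, these reads stay consistent even while pipelines are writing to the same table.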