- 1171 Views
- 1 replies
- 0 kudos
Vector Database
What vector database can I generate in Databricks?
- 0 kudos
Not sure if I understood the question. If you want to use the Databricks Vector Database, just go to your table > Create vector search index. First you need to create a Vector Search Endpoint (Compute > Vector Search), and you need to have an enabled ser...
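For anyone who prefers doing the same steps in code, here is a minimal sketch using the databricks-vectorsearch Python SDK; the endpoint, table, index, and embedding-model names are placeholders, not values from the original question.

```python
from databricks.vector_search.client import VectorSearchClient

client = VectorSearchClient()

# 1. Create a Vector Search endpoint (the UI equivalent of Compute > Vector Search).
client.create_endpoint(name="my-vs-endpoint", endpoint_type="STANDARD")

# 2. Create a Delta Sync index on an existing Delta table.
#    The source table needs Change Data Feed enabled, and the embedding model
#    endpoint below is just an example serving endpoint name.
index = client.create_delta_sync_index(
    endpoint_name="my-vs-endpoint",
    index_name="main.default.docs_index",
    source_table_name="main.default.docs",
    pipeline_type="TRIGGERED",
    primary_key="id",
    embedding_source_column="text",
    embedding_model_endpoint_name="databricks-bge-large-en",
)
```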
- 441 Views
- 0 replies
- 0 kudos
Databricks User Group Meetups
Are there any Databricks User Group Meetups in Charlotte?
- 815 Views
- 1 replies
- 0 kudos
Creating table in Unity Catalog with file scheme dbfs is not supported
Code:

# Define the path for the staging Delta table
staging_table_path = "dbfs:/user/hive/warehouse/staging_order_tracking"
spark.sql(
    f"CREATE TABLE IF NOT EXISTS staging_order_tracking USING DELTA LOCATION '{staging_table_path}'"
)

Error: Creating table in U...
- 0 kudos
In UC, using a dbfs file location or dbfs mount point for a table is considered an anti-pattern by Databricks, because dbfs is scoped to the workspace and has security limitations (https://learn.microsoft.com/en-us/azure/databricks/dbfs/unity-...
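As an illustration of the usual workaround, here is a small sketch that creates the staging table as a managed Unity Catalog table instead of pointing it at a dbfs:/ location; the catalog, schema, and column names are assumptions.

```python
# Placeholder three-level name; use a catalog/schema you have CREATE TABLE rights on.
table_name = "main.staging.staging_order_tracking"

# Managed UC table: no LOCATION clause, so storage is handled by the
# catalog/schema's managed storage location instead of dbfs.
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS {table_name} (
        order_id   STRING,
        status     STRING,
        updated_at TIMESTAMP
    ) USING DELTA
""")

# Or create/overwrite it directly from an existing DataFrame.
df.write.mode("overwrite").saveAsTable(table_name)
```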
- 825 Views
- 0 replies
- 1 kudos
Databricks asset bundles dependencies
Is anyone aware of a way to include a requirements.txt within the job definition of a Databricks asset bundle? The documentation mentions how to have dependencies in workspace files or Unity Catalog volumes, but I wanted to ask if it is possible to decl...
- 1039 Views
- 0 replies
- 0 kudos
Creating Zip file on Azure storage explorer
I need to create a zip file containing the csv files I have created from the dataframe, but I am unable to create a valid zip file containing all the csv files. Is it possible to create a zip file from code in Databricks on Azure storag...
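No replies yet, but one common approach is sketched below (all paths, container, and account names are placeholders): write the CSVs and build the zip on the driver's local disk with Python's zipfile, then copy the finished archive to Azure storage, since streaming a zip directly to remote storage can leave it invalid.

```python
import os
import zipfile

# Placeholder paths: local driver disk for staging, abfss URI for the destination.
local_dir = "/tmp/csv_export"
zip_path = "/tmp/csv_export.zip"
dest_path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/exports/csv_export.zip"

os.makedirs(local_dir, exist_ok=True)

# Write each (reasonably small) Spark DataFrame to a single CSV on local disk.
# df is assumed to be an existing DataFrame from the original post.
df.toPandas().to_csv(os.path.join(local_dir, "orders.csv"), index=False)

# Bundle every CSV in the staging directory into one zip archive.
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    for name in os.listdir(local_dir):
        zf.write(os.path.join(local_dir, name), arcname=name)

# Copy the finished, valid archive to Azure storage in one operation.
dbutils.fs.cp(f"file:{zip_path}", dest_path)
```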
- 1021 Views
- 2 replies
- 2 kudos
New member
Excited to be at Data Summit 2024
- 1451 Views
- 2 replies
- 0 kudos
How can I create multiple workspaces in an existing single Azure Databricks resource?
I have an Azure Databricks resource created in my Azure portal. I want to achieve departmental secrecy within a single Databricks resource. Hence, I am looking for a solution where I can add multiple workspaces to my single Databricks resource. Is it even ...
- 0 kudos
Hi, it is possible to create multiple workspaces from a single Azure account. Go to the Azure portal and click on Azure Databricks. Click on Create. Fill in all the details and your new workspace is ready.
- 330 Views
- 0 replies
- 0 kudos
Data Summit 2023
I learned about Unity Catalog, LLM features, and Delta Sharing today at the Data Summit.
- 681 Views
- 1 replies
- 0 kudos
Data Summit 2023
I learned about Unity Catalog, LLM features, and Delta Sharing today at the Data Summit.
- 305 Views
- 0 replies
- 0 kudos
Python Spark support
Good to see Python as a first-class language for Spark application development.
- 423 Views
- 0 replies
- 0 kudos
How does the development target work for multiple users?
Hi, I'm using Databricks asset bundles to deploy my job to Azure Databricks. I want to configure the Databricks bundle so that when anyone runs the Azure pipeline, a job is created under their name in the format dev_username_job. Using a personal ac...
- 639 Views
- 1 replies
- 1 kudos
Looking for HR use cases for Databricks
Human Resources use cases.
- 1 kudos
Yeah, I have worked on some HR use cases, and they were implemented on Databricks itself.
- 580 Views
- 0 replies
- 0 kudos
Spark structured streaming
Hi, could someone please help me with this code? The input parameter df is a Spark Structured Streaming DataFrame.

def apply_duplicacy_check(df, duplicate_check_columns):
    if len(duplicate_check_columns) == 0:
        return None, df
    valid_df = df.dr...
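The snippet is cut off, so the following is only a hedged guess at the intended shape: in Structured Streaming, dropDuplicates keeps state across micro-batches, so it is usually paired with a watermark to bound that state. The event_time column, the 10-minute delay, and the return values are assumptions, not taken from the original code.

```python
from pyspark.sql import DataFrame

def apply_duplicacy_check(df: DataFrame, duplicate_check_columns: list):
    """Return (deduplicated_df, original_df), or (None, df) when no key columns are given."""
    if len(duplicate_check_columns) == 0:
        return None, df

    # The watermark bounds the state Spark keeps for streaming deduplication;
    # the column name and delay below are placeholders.
    valid_df = (
        df.withWatermark("event_time", "10 minutes")
          .dropDuplicates(duplicate_check_columns)
    )
    return valid_df, df
```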