- 1215 Views
- 2 replies
- 2 kudos
Azure Databricks to GCP Databricks Migration
Hi Team, Can you provide your thoughts on moving Databricks from Azure to GCP? What services are required for the migration, and are there any limitations on GCP compared to Azure? Also, are there any tools that can assist with the migration? Please ...
Hello Team, adding to @sunnydata's comments: Moving Databricks from Azure to GCP involves several steps and considerations. Here are the key points based on the provided context: Services Required for Migration: Cloud Storage Data: Use GCP’s Storage T...
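One hedged way to move individual tables, once both clouds are reachable from a Databricks cluster, is a plain Spark read/write; the storage paths below are placeholders and assume credentials for both ADLS and GCS are already configured on the cluster:

# Hypothetical sketch: copy one Delta table from ADLS to GCS with Spark.
# Both paths are placeholders; authentication for abfss:// and gs:// must already be set up.
src = "abfss://data@myadlsaccount.dfs.core.windows.net/tables/orders"
dst = "gs://my-gcs-bucket/tables/orders"
df = spark.read.format("delta").load(src)
df.write.format("delta").mode("overwrite").save(dst)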
- 577 Views
- 1 reply
- 0 kudos
Structuring RAG Projects in Python Using
Understanding Retrieval-Augmented Generation (RAG): Retrieval-Augmented Generation (RAG) is a cutting-edge AI paradigm that enhances traditional generative models by integrating real-time data retrieval. By combining retrieval and generation, RAG ensur...
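As a toy illustration of the retrieve-then-generate pattern described above (all function names and documents here are hypothetical; a real pipeline would use a vector index and an LLM):

# Toy sketch of retrieve-then-generate; names and data are hypothetical.
def retrieve(query, documents, k=2):
    # Naive keyword-overlap scoring stands in for a real vector search index.
    tokens = set(query.lower().split())
    return sorted(documents, key=lambda d: len(tokens & set(d.lower().split())), reverse=True)[:k]

def generate(query, context):
    # A real implementation would prompt an LLM with the retrieved context.
    return f"Answer to '{query}' grounded in: {context}"

docs = ["Delta Lake stores table history.", "Unity Catalog governs data access."]
print(generate("What governs data access?", retrieve("What governs data access?", docs)))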
Here's a demo of an LLM chatbot built with RAG: https://www.databricks.com/resources/demos/tutorials/data-science-and-ai/lakehouse-ai-deploy-your-llm-chatbot
- 564 Views
- 1 reply
- 0 kudos
Databricks Support
Hi everyone, I'm new to Databricks and to this platform. My organization just got started with Databricks and we're looking to procure enterprise support to help with things like training, setup, maintenance, and development of our warehousing...
Hi @victor_okrobodo, Please see: https://www.databricks.com/professional-services Let me know if you have any other questions.
- 2284 Views
- 4 replies
- 5 kudos
Data Modelling
What is the 'implicit' or 'default' data model of Databricks or Unity Catalog? Is it Data Vault?
Thank you so much for the information. You made it easy for me.
- 590 Views
- 3 replies
- 0 kudos
Suggest ways to get Unity Catalog data to AWS S3 or SageMaker
Please suggest the best ways to get Databricks Unity Catalog data to AWS S3 or SageMaker. Data could be around 1 GB in some tables and 20 GB in others. Currently, SageMaker pipelines use data from S3 as batches in different Parquet files. But now we would li...
You could try using Delta Sharing with your provider, as described in the docs: https://docs.databricks.com/en/delta-sharing/set-up.html
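For example, a minimal recipient-side sketch with the open-source delta-sharing Python client, assuming the provider has issued a credential (profile) file; the profile path and the share, schema, and table names are placeholders:

# pip install delta-sharing
import delta_sharing

profile = "/dbfs/FileStore/config.share"              # placeholder path to the credential file from the provider
table_url = f"{profile}#my_share.my_schema.my_table"  # placeholder share.schema.table coordinates
df = delta_sharing.load_as_pandas(table_url)          # for larger tables, load_as_spark is also available
print(df.head())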
- 839 Views
- 2 replies
- 0 kudos
Unable to capture the Query result via JDBC client execution
As shown in the screenshots below, the MERGE INTO command produces information about the result (num_affected_rows, num_updated_rows, num_deleted_rows, num_inserted_rows). I am unable to get this information when the same query is executed via a JDBC client. Is...
The Delta history API can help you get these details. Reference: https://docs.databricks.com/en/delta/history.html#history-schema
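A minimal sketch of pulling those counts from the table history after the MERGE runs (the table name is a placeholder, and the exact operationMetrics keys should be checked against your own table's history):

# Inspect the most recent operation on the target table and its row-count metrics.
last_op = spark.sql("DESCRIBE HISTORY my_catalog.my_schema.my_table LIMIT 1").collect()[0]
print(last_op["operation"])           # e.g. MERGE
print(last_op["operationMetrics"])    # map containing inserted/updated/deleted row counts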
- 1518 Views
- 1 reply
- 0 kudos
DIAS 2023 -- recommend the training!
Did the SparkUI training yesterday with Mark Ott, and I highly recommend it. It was super helpful and provided a lot of clarity around some of the vaguer terms and metrics, and some surprise penalties. In-memory partition size is the main thing to...
Hi, I can't find this course... can you please share the full name of this course? Thanks in advance
- 933 Views
- 3 replies
- 4 kudos
Resolved! How to read Databricks UniForm format tables present in ADLS
We have Databricks UniForm format (Iceberg) tables present in Azure Data Lake Storage (ADLS), which is already integrated with Databricks Unity Catalog. How do we read UniForm format tables using Databricks as a query engine?
Query using Unity Catalog:
SQL: SELECT * FROM catalog_name.schema_name.table_name;
PySpark: df = spark.sql("SELECT * FROM catalog_name.schema_name.table_name"); df.display()
Direct access by path: If not using Unity Catal...
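For the path-based option, a hedged sketch of reading the same UniForm (Delta) table directly from ADLS; the abfss path is a placeholder and assumes the cluster can already authenticate to the container:

# UniForm tables remain Delta tables, so Databricks can read them as Delta by path.
path = "abfss://container@storageaccount.dfs.core.windows.net/path/to/table"  # placeholder
df = spark.read.format("delta").load(path)
df.display()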
- 7969 Views
- 2 replies
- 0 kudos
Cluster Memory Issue (Termination)
Hi, I have a single-node personal cluster with 56 GB memory (Node type: Standard_DS5_v2, runtime: 14.3 LTS ML). The same configuration is used for the job cluster as well, and the following problem applies to both clusters: To start with, once I start my ...
- 1463 Views
- 2 replies
- 1 kudos
Creating table in Unity Catalog with file scheme dbfs is not supported
Code:
# Define the path for the staging Delta table
staging_table_path = "dbfs:/user/hive/warehouse/staging_order_tracking"
spark.sql(f"CREATE TABLE IF NOT EXISTS staging_order_tracking USING DELTA LOCATION '{staging_table_path}'")
Error: Creating table in U...
I believe we can only connect to our storage account containers using a mount point. If this is an anti-pattern according to Databricks, what is the recommended way? Can you please explain what an external location in UC is? Is it our local system folders or something n...
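For context, a hedged sketch of what the Unity Catalog alternative generally looks like: the table location points at cloud storage registered as a UC external location rather than a dbfs:/ path. Every name below is a placeholder:

# Hypothetical example: external table backed by a UC external location (placeholder names).
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.staging_order_tracking
    USING DELTA
    LOCATION 'abfss://staging@mystorageaccount.dfs.core.windows.net/order_tracking'
""")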
- 1371 Views
- 2 replies
- 3 kudos
Resolved! Cannot find "Databricks Apps"
Hi, I saw a demo about "Databricks Apps" 2 months ago. I haven't used Databricks for about 3 months, and I recently recreated a Premium Workspace to try something out (I use Azure), but I can't find "Apps" when I click "New". How can I enable and...
- 599 Views
- 4 replies
- 0 kudos
Issue Querying Registered Tables on Glue Catalog via Databricks
I'm having an issue querying tables registered in the Glue catalog through Databricks, with the following error: AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view looker.ccc_data cannot be found. Verify the spelling and correctness of the schema a...
Can you specify the full catalog.schema.table? Also check the current schema with SELECT current_schema();
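A small sketch of both checks, where my_glue_catalog stands in for whatever name the Glue catalog is registered under in your workspace:

# Confirm the session context, then query with the fully qualified name (placeholder catalog).
spark.sql("SELECT current_catalog(), current_schema()").display()
spark.sql("SELECT * FROM my_glue_catalog.looker.ccc_data LIMIT 10").display()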
- 439 Views
- 2 replies
- 1 kudos
Delete non-community Databricks account
Hi everyone! I have mistakenly created a non-community account using my personal email address. I would like to delete it in order to create a new account using my business email. How should I proceed? I tried to find this option on the console, with no...
Hello @gabrielsantana! Could you please try raising a ticket with the Databricks support team?
- 4379 Views
- 1 reply
- 1 kudos
How to overwrite an existing file using the Databricks CLI
If I use databricks fs cp, it does not overwrite the existing file; it just skips copying the file. Any suggestions on how to overwrite the file using the Databricks CLI?
You can use the --overwrite option to overwrite your file: https://docs.databricks.com/en/dev-tools/cli/fs-commands.html
- 644 Views
- 1 reply
- 1 kudos
Resolved! The Databricks JDBC driver has a memory leak
https://community.databricks.com/t5/community-platform-discussions/memory-leak/td-p/80756 My question is the same as the one above. I was unable to upload pictures, so I had to dictate it. The question is about the m_requestList parameters of ResultFileDownloadMonitor, because ResultFi...
Hello @gf, thanks for your question. It seems that this has been reported to Simba, but no fix has been provided yet. As a temporary workaround, you can consider using reflection to periodically clean up the m_requestList by removing KV pairs whose ...