- 4319 Views
- 6 replies
- 5 kudos
PowerApps + Delta Lake
Is there any native way to insert data from PowerApps directly into Delta Lake? I can of course create a workflow and push the data via the API, but is there any alternative?
- 5 kudos
I have started creating a solution for that. Here is the initial version, using a PAT: https://github.com/lakime/powerappstodelta
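For anyone weighing the "push via API" route the question mentions, below is a minimal sketch of what that call can look like using the SQL Statement Execution API against a SQL warehouse. It is not part of the linked project; the workspace URL, warehouse ID, token, and table name are placeholders, and from PowerApps the same POST would typically be issued through a Power Automate HTTP action or a custom connector rather than Python.

```python
import requests

# Placeholders - replace with your workspace URL, SQL warehouse ID, token, and table.
HOST = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"
WAREHOUSE_ID = "<sql-warehouse-id>"

def insert_row(name: str, amount: float) -> dict:
    """Run an INSERT against a Delta table via the SQL Statement Execution API."""
    resp = requests.post(
        f"{HOST}/api/2.0/sql/statements",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "warehouse_id": WAREHOUSE_ID,
            # Assumed target table, for illustration only.
            "statement": f"INSERT INTO main.powerapps.submissions (name, amount) "
                         f"VALUES ('{name}', {amount})",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # statement status and result metadata

if __name__ == "__main__":
    print(insert_row("widget", 42.0))
```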
- 749 Views
- 0 replies
- 0 kudos
Best Tools for API Design? SwaggerHub Alternatives?
Curious if anyone has moved from SwaggerHub to other platforms for API design, especially within data projects. Tools like Apidog, Postman, or Insomnia have been mentioned to me – does anyone have thoughts on how they compare for efficiency and colla...
- 2736 Views
- 7 replies
- 1 kudos
Coursera Apache SQL Course
Hi, I have enrolled in the Databricks Apache Spark SQL for Data Analysts course on Coursera. I am in Module 3, and I imported the file through the URL given in the course material. In 3.2 Basic Queries I ran the first block, which contains the following command "%run ....
- 1 kudos
Hi, I enrolled in the same Coursera Databricks course and ran into the same issue. I unmounted and re-ran the classroom setup as follows, but when I move on to list the mounted directory, the same issue happens.
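For readers hitting the same thing, the unmount-and-remount step described above is roughly the following; the /mnt/training mount point and the setup notebook path are assumptions about how the course materials are usually laid out, not taken from the post.

```python
# Remove the stale mount if it exists, then re-run the course setup to remount.
# "/mnt/training" and the setup notebook path are assumed names for illustration.
if any(m.mountPoint == "/mnt/training" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/training")

# %run ./Includes/Classroom-Setup   # the course's setup notebook recreates the mount

display(dbutils.fs.ls("/mnt/training"))  # verify the directory now lists correctly
```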
- 398 Views
- 0 replies
- 0 kudos
Enterprise chatbot ML model from data lake Parquet data
I want to create a chatbot ML model to answer questions about the data my organization has in the data lake. It should answer comprehension questions about the data. Example: what do you understand from the data received yesterday and today?
- 4246 Views
- 2 replies
- 0 kudos
File size upload limit through CLI
Does anyone know the size limit for uploading files through the CLI? I'm not finding it in the documentation.
- 877 Views
- 0 replies
- 0 kudos
Compatibility Issue Between Databricks Runtime and Delta Live Tables: Request for v15.2 Release Time
We are experiencing a compatibility issue between the Databricks Runtime version and the Delta Live Tables (DLT) Runtime. We need to upgrade to Databricks Runtime 15.2 to enable the auto schema evolution feature. However, Databricks Runtime 15.2 requ...
- 1522 Views
- 0 replies
- 0 kudos
CREATE Community_User_Group [IF NOT EXISTS] IN MADRID(SPAIN)
Hi, I would like to get some support in creating a Community User Group in Madrid, Spain. It would be nice to host events/meetings/discussions... Regards, Ángel
- 7452 Views
- 4 replies
- 0 kudos
Resolved! Have code stay hidden even when the notebook is copied
When I save a certain Python notebook where I have selected Hide Code and Hide Results on certain cells, those conditions persist. For example, when I come back the next day in a new session, the hidden material is still hidden. When the notebook is ...
- 0 kudos
In my situation we cannot split this notebook, as the ADF pipeline is already in PROD. I have tried the %%capture option. It helps run the notebook within the size limits, but somehow it corrupts the output. Also checked with the Databricks AI and...
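For context, %%capture is the IPython cell magic being referred to: it swallows a cell's printed and displayed output so the notebook's stored results stay small. A minimal sketch is below; note the poster reports it corrupted their output, so treat it as a workaround to test rather than a guaranteed fix.

```python
%%capture captured
# All stdout/stderr and rich output from this cell is captured instead of
# rendered, which keeps the saved notebook output small.
for i in range(100_000):
    print(i)
# The captured stream can still be replayed later with: captured.show()
```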
- 1104 Views
- 0 replies
- 0 kudos
Databricks Asset Bundles - Failed to install provider
Hello all, when I try to deploy my bundle, I get the following error. I can't edit bundle.tf.json; I suppose it is created automatically. Does anyone have a solution for the same problem? Many thanks, Can $ databricks bundle deploy -t dev Building my_p...
- 1471 Views
- 2 replies
- 0 kudos
CLI is not helpful in exporting - Error: expected to have the absolute path of the object or directory
I am trying to export a job as a DAB in order to create an Asset Bundle, following this: https://community.databricks.com/t5/data-engineering/databricks-asset-bundle-dab-from-existing-workspace/td-p/49309 I am on Windows 10 Pro x64 with Databricks CLI v0.223...
- 0 kudos
To export an existing folder under /Workspace/..., the export-dir command can be used: databricks workspace export-dir /Workspace/Applications/ucx/logs/migrate-tables/run-123-0/ /Users/artem.sheiko/logs
- 453 Views
- 0 replies
- 0 kudos
Loading parquet files with earlier timestamp looking for newer files
I have a setup where I am replicating Delta Live Tables parquet and checkpoint files using Azure RA-GZRS to a peer region for disaster recovery. When I load the replicated files in the peer region using the Delta format, I get an error that _delta_log/0000...
- 2697 Views
- 1 replies
- 0 kudos
Databricks serverless vs Snowflake
Hi all, we want to switch from Snowflake to Databricks SQL Warehouse/serverless to simplify our data layers and reduce data copies before the reporting layer. Please share the benefits of using serverless over Snowflake and any limitations you see. We...
- 0 kudos
One big pro is that you do not need to copy data to the DWH. Also, your transformations and analytics queries reside on the same platform (Databricks). Whether Databricks can cover all the requirements compared to Snowflake is hard to tell. Probably ther...
- 14779 Views
- 5 replies
- 2 kudos
Concurrent Update to Delta - Throws error
Team, I get a "ConcurrentAppendException: Files were added to the root of the table by a concurrent update" when trying to update a table that runs via jobs with a ForEach activity in ADF. I tried with Databricks Runtime 14.x and set the delete vect...
- 2 kudos
In case of such an issue, I would suggest applying retry and try/except logic (you can use one of the existing libraries) in both concurrent updates - it should help, and the jobs won't report any error.
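A minimal sketch of that retry/try-except pattern is below, assuming a Delta MERGE with placeholder table and column names. Including the partition column in the merge condition also narrows which files each writer touches, which reduces the chance of the conflict in the first place.

```python
import time

from delta.exceptions import ConcurrentAppendException
from delta.tables import DeltaTable

def merge_with_retry(spark, updates_df, target_name="main.sales.orders", retries=5):
    """Retry a Delta MERGE when a concurrent writer conflicts with it."""
    target = DeltaTable.forName(spark, target_name)
    for attempt in range(1, retries + 1):
        try:
            (target.alias("t")
                   .merge(updates_df.alias("s"),
                          "t.order_id = s.order_id AND t.region = s.region")
                   .whenMatchedUpdateAll()
                   .whenNotMatchedInsertAll()
                   .execute())
            return
        except ConcurrentAppendException:
            if attempt == retries:
                raise  # give up after the last attempt
            time.sleep(2 ** attempt)  # exponential backoff before retrying
```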
- 1316 Views
- 3 replies
- 0 kudos
Unable to set shuffle partitions on DLT pipeline
Hello, we are using a 5-worker-node DLT job compute for a continuous-mode streaming pipeline. The worker configuration is Standard_D4ads_v5, i.e. 4 cores, so the total across 5 workers is 20 cores. We have wide transformations at some places in the pipe...
- 0 kudos
Hi @PushkarDeole, each Delta Live Tables pipeline has two associated clusters: the updates cluster processes pipeline updates, and the maintenance cluster runs daily maintenance tasks. According to the docs, if you want to configure settings at the pipeline lev...
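As a sketch of the pipeline-level option mentioned above: Spark confs for a DLT pipeline are supplied in the pipeline's settings under a configuration block. It is shown here as a Python dict for clarity (normally it is edited as JSON in the pipeline UI or API), and the value of 20 matching the 20 worker cores is illustrative, not taken from the thread.

```python
# Illustrative fragment of DLT pipeline settings (normally JSON in the pipeline
# configuration). The value is an assumption sized to the 20 total cores above.
pipeline_settings = {
    "configuration": {
        "spark.sql.shuffle.partitions": "20",
    }
}
```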
- 1271 Views
- 1 replies
- 0 kudos
Clarification on Acceptable ID Proof for Databricks Associate Exam
Hi, I am planning to take the Databricks Associate exam in the upcoming week. My current ID proof is my original driving license issued by the Tamil Nadu government; however, it is laminated rather than a hard plastic card. Could you please confirm if ...
- 0 kudos
Hi Databricks community and @Cert-Team, can you please help me here? The reason I am asking is that Microsoft certification doesn't accept laminated ID proofs.