- 1442 Views
- 1 replies
- 1 kudos
Summit23
I have really enjoyed the summit so far! I am a biostatistics graduate student at Harvard and am learning so much here.
Hi, If you had a great time at the Summit, be sure to visit our community events portal for upcoming events and join your regional user group for meetups. Thanks, Anushree
- 1443 Views
- 1 replies
- 0 kudos
Summit 2024
It has been an exciting event that brought together industry leaders, researchers, and practitioners to explore the latest advancements in artificial intelligence and data science. With cutting-edge presentations, hands-on workshops, and networking o...
Hi, If you had a great time at the Summit, be sure to visit our community events portal for upcoming events and join your regional user group for meetups. Thanks, Anushree
- 2899 Views
- 1 replies
- 0 kudos
You can leverage this code base. It works as expected using the "next_page_token" parameter. Don't forget to mark this solution as correct if this helped you.
import requests
token = 'your token'
url = 'your URL'
params = {'expand_tasks': 'true'}
heade...
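The snippet above is cut off by the forum preview; below is a minimal sketch of the pagination pattern it appears to describe, assuming the Databricks Jobs API 2.1 `/api/2.1/jobs/list` endpoint. The workspace URL and token are placeholders.

```python
import requests

# Placeholders: replace with your workspace URL and a personal access token.
host = "https://<your-workspace>.cloud.databricks.com"
token = "your token"

url = f"{host}/api/2.1/jobs/list"
headers = {"Authorization": f"Bearer {token}"}
params = {"expand_tasks": "true"}

jobs = []
while True:
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    payload = resp.json()
    jobs.extend(payload.get("jobs", []))

    # The response carries next_page_token while more pages remain.
    next_token = payload.get("next_page_token")
    if not next_token:
        break
    params["page_token"] = next_token

print(f"Fetched {len(jobs)} jobs")
```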
- 1198 Views
- 1 replies
- 0 kudos
Unity Catalog
Can MLflow Experiments be incorporated into Unity Catalog (maybe in a volume or somewhere else in Unity Catalog), similar to models and feature tables?
As of now, Databricks supports DBFS, S3, and Azure Blob storage artifact locations. Our teams are working on introducing Volumes as a repository, which might be coming soon.
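For illustration, a minimal sketch (experiment name and bucket path are hypothetical) of pinning an MLflow experiment's artifact store to one of the currently supported locations via `mlflow.create_experiment`:

```python
import mlflow

# Hypothetical names/paths; artifact_location can point at DBFS, S3, or
# Azure Blob storage, per the reply above.
mlflow.set_tracking_uri("databricks")
experiment_id = mlflow.create_experiment(
    name="/Users/someone@example.com/uc-artifact-demo",
    artifact_location="s3://my-bucket/mlflow-artifacts",
)
print(experiment_id)
```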
- 1123 Views
- 1 replies
- 1 kudos
Troubleshoot in Databricks
Hi everyone, I’m having trouble with a query in Databricks. My code seems correct, but I’m getting unexpected results. Can anyone help me troubleshoot this or provide tips for debugging in Databricks? Thanks in advance!
It's hard to advise when you didn't provide any code. But you can try to use the debugger inside the notebook:
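The excerpt cuts off before the example; as a stand-in, here is a minimal sketch using Python's standard `pdb` module, which works inside a notebook cell (the `transform` function is a hypothetical stand-in for the query logic):

```python
import pdb

def transform(rows):
    # Execution pauses at set_trace(); inspect variables with p/pp,
    # step with n, and continue with c.
    pdb.set_trace()
    return [r * 2 for r in rows]

transform([1, 2, 3])
```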
- 10978 Views
- 13 replies
- 3 kudos
Resolved! Unable to access Account console under Azure Databricks
I am the only user for the Azure environment and assigned global admin. For the resource group, I am the owner. For the Azure Databricks Service, I am the service admin already. I would like to try to go into the Account console to make some changes for ...
My issue was resolved with a recommendation of this article here
- 2434 Views
- 3 replies
- 0 kudos
INVALID_PARAMETER_VALUE: There are more than 1001 files. Please specify [LIMIT] to limit the num....
I am running Azure Databricks DBX Runtime 13.3 and I am seeing the following error in my workspace: Last execution failed
# doesn't work in DBX Runtime 13.3 and above
# limitation is it can't read more than 1001 files
dbutils.fs.ls("[path]")
AnalysisExcep...
@Retired_mod, how do we resolve this issue? I have faced the same issue on DBR 14.3 and DBR 15.4 Unity-enabled clusters. Can you please let us know the workaround for it?
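One possible workaround, assuming the files live in a Unity Catalog volume (the path below is a placeholder), is to list them through the /Volumes FUSE mount with Python's standard library instead of `dbutils.fs.ls`:

```python
import os

# Placeholder path; UC volumes are exposed under /Volumes/<catalog>/<schema>/<volume>.
volume_path = "/Volumes/my_catalog/my_schema/my_volume/raw"

# os.scandir yields entries lazily, so it sidesteps the 1001-file listing
# cap reported above for dbutils.fs.ls.
files = [entry.name for entry in os.scandir(volume_path) if entry.is_file()]
print(f"{len(files)} files found")
```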
- 9491 Views
- 6 replies
- 2 kudos
Resolved! Error on UC Liquid Clustering
I know we have a max of 4 keys on CLUSTER BY () for both z-order and partition keys. I got some issues when adding 4 keys, and 1 specific key triggers that error (I was not expecting it, as this is about CREATE TABLE). Stats make sense if you need to optimiz...
Is Liquid Clustering also helpful for high-cardinality columns used in a table join?
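For context on the CREATE TABLE error discussed above, a minimal sketch (catalog, schema, table, and columns are hypothetical) of declaring liquid clustering keys with CLUSTER BY and then triggering clustering with OPTIMIZE:

```python
# `spark` is the SparkSession predefined in a Databricks notebook.
# Hypothetical table; liquid clustering keys are declared in CLUSTER BY (max 4 keys).
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.orders (
        order_id    BIGINT,
        customer_id BIGINT,
        order_date  DATE,
        region      STRING
    )
    CLUSTER BY (customer_id, order_date)
""")

# OPTIMIZE (re)clusters newly written data according to the keys above.
spark.sql("OPTIMIZE main.sales.orders")
```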
- 3114 Views
- 2 replies
- 1 kudos
Resolved! Databricks Bundle - 'include' section does not match any files.
Hello Community, I’m encountering the following error while working on my project: Error: ${var.FILE_NAME} defined in 'include' section does not match any files. Has anyone faced this issue before? I'm using variables to include specific files, but it s...
Hi @AkhilVydyula, You can start by updating your CLI. I've seen some really weird errors with older CLI versions.
- 659 Views
- 0 replies
- 0 kudos
Creation of live consumption of MSSQL CDC via DLT
Hello, I am a bit surprised, but I don't see any default possibility of creating a continuous pipeline for fetching updates from MSSQL and inserting them into the delta lake. Is that true, or am I missing something? To be clear, I know we can do that semi-li...
- 3369 Views
- 5 replies
- 5 kudos
Resolved! Looking for Databricks Study Materials/Resources
Hello, I’m looking for study materials or resources to help me learn Databricks more effectively. Any recommendations would be greatly appreciated!
@igor779, @Rishabh-Pandey and @szymon_dybczak. Thank you all for the suggestions! Unfortunately, I don’t have access to content from partner companies and mainly rely on free resources for my studies.
- 1233 Views
- 0 replies
- 0 kudos
Issues with incremental data processing within Delta Live Tables
Hi! I have a problem with incrementally processing data within my Delta Live Tables (DLT) pipeline. I have a raw file (in Delta format) where new data is added each day. When I run my DLT pipeline, I only want the new data to be processed. As an examp...
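For context, a minimal sketch of the pattern the question describes (source path and table name are hypothetical): defining the DLT table as a streaming read over the Delta source, so each pipeline update only processes newly appended records.

```python
import dlt

# Hypothetical location of the raw Delta data that grows daily.
# `spark` is the SparkSession predefined inside a DLT pipeline.
RAW_PATH = "/Volumes/main/raw/daily_feed"

@dlt.table(name="daily_feed_bronze", comment="Incremental copy of the raw daily feed")
def daily_feed_bronze():
    # readStream over a Delta source makes the table incremental:
    # only records appended since the last update are processed.
    return spark.readStream.format("delta").load(RAW_PATH)
```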
- 4576 Views
- 4 replies
- 0 kudos
Resolved! Data Masking Techniques and Issues with Creating Tables
Hello Databricks Team, I understand that the mask function can be used to mask columns, but I have a few questions: When users with access use a masked TABLE to create a downstream TABLE, the downstream TABLE does not inherit the mask function directly...
Hi @weilin0323, How are you doing today? As per my understanding, when creating a downstream table from a masked table, you’ll need to reapply the mask function to the necessary columns in the new table if you want to maintain the same level of data p...
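To make the "reapply the mask" point concrete, a minimal sketch (function, catalog, table, and column names are hypothetical) using a SQL UDF as a column mask on the downstream table:

```python
# `spark` is the SparkSession predefined in a Databricks notebook.
# Hypothetical masking UDF: reveals the value only to a privileged group.
spark.sql("""
    CREATE FUNCTION IF NOT EXISTS main.governance.mask_ssn(ssn STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('hr_admins') THEN ssn ELSE '***-**-****' END
""")

# Re-apply the mask on the downstream table, since it is not inherited.
spark.sql("""
    ALTER TABLE main.analytics.employees_downstream
    ALTER COLUMN ssn SET MASK main.governance.mask_ssn
""")
```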
- 1286 Views
- 1 replies
- 1 kudos
How to get a list of users in a workspace with Account Level APIs
Hi there, We are planning to move to Unity Catalog soon, so we started replacing workspace APIs with Account Level APIs. One use case I have is getting a list of users, pretty straightforward with workspace: {workspaceUrl}/api/2.0/preview/scim/v2/Users?coun...
Hi @unseen007, To get a list of all users in a given workspace, you should use the api/2.0/preview/scim/v2/Users API endpoint. Why do you assume this API is inappropriate? Unity Catalog has nothing to do with this. One API lets you list users at account l...
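A minimal sketch of the workspace-level call the reply refers to, using the `/api/2.0/preview/scim/v2/Users` endpoint from the thread (workspace URL and token are placeholders):

```python
import requests

# Placeholders: workspace URL and a token with workspace-admin rights.
workspace_url = "https://<your-workspace>.cloud.databricks.com"
token = "your token"

resp = requests.get(
    f"{workspace_url}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {token}"},
    params={"count": 100, "startIndex": 1},  # SCIM paging is 1-based
)
resp.raise_for_status()
users = [u.get("userName") for u in resp.json().get("Resources", [])]
print(users)
```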
- 1778 Views
- 1 replies
- 1 kudos
Resolved! Task runs missing from system database
I have a workflow with 11 tasks (each task executes one notebook) that run in sequence. The task was run on 9/1 and again today (9/10). I am working on reporting task history and status using the system table `system.lakeflow.job_task_run_timeline`. The...
Hi @DavidKxx, There are chances that the table has not been updated today. Check tomorrow. Here is what the documentation says:
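A quick way to check what the system table currently contains, as suggested above, is to query it directly (the 7-day filter is arbitrary, and the column names are assumed from the documented schema); note that system tables can lag behind recent runs:

```python
# `spark` is the SparkSession predefined in a Databricks notebook.
# Query the task-run timeline mentioned in the thread; very recent runs may
# not appear until the system table is refreshed.
spark.sql("""
    SELECT job_id, task_key, period_start_time, period_end_time, result_state
    FROM system.lakeflow.job_task_run_timeline
    WHERE period_start_time >= current_date() - INTERVAL 7 DAYS
    ORDER BY period_start_time DESC
    LIMIT 50
""").show(truncate=False)
```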