- 8760 Views
- 7 replies
- 1 kudos
How to move a metastore to a new Storage Account in unity catalog?
Hello, I would like to change the Metastore location in Databricks Account Console. I have one metastore created that is in an undesired container/storage account. I could see that it's not possible to edit a metastore that is already created. I coul...
We ended up 1) deleting the metastore (which only contained catalogs/schemas/tables), 2) creating a new one in the desired storage account, and 3) re-populating it by running all Delta Live Tables pipelines. All our underlying raw data is stored in ano...
- 7949 Views
- 13 replies
- 3 kudos
Resolved! Unable to access Account console under Azure Databricks
I am the only user for the Azure environment and assigned global admin. For the resource group, I am the owner. For the Azure Databricks Service I am already the service admin. I would like to try to go into the Account console to make some changes for ...
My issue was resolved with a recommendation of this article here
- 1524 Views
- 3 replies
- 0 kudos
INVALID_PARAMETER_VALUE: There are more than 1001 files. Please specify [LIMIT] to limit the num....
I am running Azure Databricks DBX Runtime 13.3 and I am seeing the following error in my workspace:
Last execution failed
# doesn't work in DBX Runtime 13.3 and above
# limitation is it can't read more than 1001 files
dbutils.fs.ls("[path]")
AnalysisExcep...
@Retired_mod how can we resolve this issue? I have faced the same issue on DBR 14.3 and DBR 15.4 Unity-enabled clusters. Can you please let us know the workaround for it?
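The thread does not include an accepted workaround. One generic approach, outside of `dbutils.fs.ls`, is to enumerate the directory lazily with Python's `os.scandir` (on Databricks that would mean a fuse path such as `/dbfs/...`, which is an assumption here), so no single call has to materialize the full listing. A minimal local sketch:

```python
import os
import tempfile

def iter_files(path, batch_size=500):
    """Yield file paths in batches instead of one giant listing.

    os.scandir returns a lazy iterator, so directories with many
    thousands of entries can be walked without a fixed cap.
    """
    batch = []
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file():
                batch.append(entry.path)
                if len(batch) == batch_size:
                    yield batch
                    batch = []
    if batch:
        yield batch

# Demo on a throwaway local directory; on Databricks the path might
# instead be a fuse mount such as "/dbfs/mnt/..." (hypothetical).
with tempfile.TemporaryDirectory() as d:
    for i in range(1200):
        open(os.path.join(d, f"f{i}.txt"), "w").close()
    total = sum(len(b) for b in iter_files(d))
    print(total)  # 1200
```

This sidesteps the listing limit entirely rather than raising it; whether it is appropriate depends on the storage being reachable as an ordinary directory.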
- 7026 Views
- 6 replies
- 1 kudos
Resolved! Error on UC Liquid Clustering
I know we have a maximum of 4 keys in CLUSTER BY () for both z-order and partition keys. I got some issues when adding 4 keys, and 1 specific key triggers that error (which I was not expecting, as this is about CREATE TABLE). Stats make sense if you need to optimiz...
Is liquid clustering also helpful for high-cardinality columns in a table join?
- 1649 Views
- 2 replies
- 1 kudos
Resolved! Databricks Bundle - 'include' section does not match any files.
Hello Community, I'm encountering the following error while working on my project: Error: ${var.FILE_NAME} defined in 'include' section does not match any files. Has anyone faced this issue before? I'm using variables to include specific files, but it s...
Hi @AkhilVydyula, you can start by updating your CLI. I've seen some really weird errors with older CLI versions.
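Beyond updating the CLI, one commonly reported cause of this error is that the bundle's `include` section is resolved before variable substitution, so `${var.…}` references may simply never match. A hypothetical minimal `databricks.yml` using literal glob patterns instead (the bundle name and paths are made up for illustration):

```yaml
# databricks.yml -- hypothetical minimal bundle config
bundle:
  name: my_bundle

# Prefer literal paths or glob patterns here; ${var.FILE_NAME}-style
# substitutions may not be resolved in the include section.
include:
  - resources/*.yml
```

If per-environment file selection is needed, targets or separate bundle files are usually a safer route than variables inside `include`.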
- 416 Views
- 0 replies
- 0 kudos
Creation of live consumption of MSSQL CDC via DLT
Hello, I am a bit surprised, but I don't see any default possibility of creating a continuous pipeline for fetching updates from MSSQL and inserting them into the Delta lake - is that true, or am I missing something? To be clear - I know we can do that semi-li...
- 2321 Views
- 5 replies
- 5 kudos
Resolved! Looking for Databricks Study Materials/Resources
Hello, I'm looking for study materials or resources to help me learn Databricks more effectively. Any recommendations would be greatly appreciated!
@igor779, @Rishabh-Pandey, and @szymon_dybczak, thank you all for the suggestions! Unfortunately, I don't have access to content from partner companies and mainly rely on free resources for my studies.
- 662 Views
- 0 replies
- 0 kudos
Issues with incremental data processing within Delta Live Tables
Hi! I have a problem with incrementally processing data within my Delta Live Tables (DLT) pipeline. I have a raw file (in Delta format) where new data is added each day. When I run my DLT pipeline, I only want the new data to be processed. As an examp...
- 2639 Views
- 4 replies
- 0 kudos
Resolved! Data Masking Techniques and Issues with Creating Tables
Hello Databricks Team, I understand that the mask function can be used to mask columns, but I have a few questions: When users with access use a masked TABLE to create a downstream TABLE, the downstream TABLE does not inherit the mask function directly...
Hi @weilin0323, how are you doing today? As per my understanding, when creating a downstream table from a masked table, you'll need to reapply the mask function to the necessary columns in the new table if you want to maintain the same level of data p...
- 769 Views
- 1 reply
- 1 kudos
How to get a list of users in a workspace with Account Level APIs
Hi there, we are planning to move to Unity Catalog soon, so we started replacing workspace APIs with account-level APIs. One use case I have is getting a list of users, which is pretty straightforward with the workspace API: {workspaceUrl}/api/2.0/preview/scim/v2/Users?coun...
Hi @unseen007, to get a list of all users in a given workspace, you should use the api/2.0/preview/scim/v2/Users API endpoint. Why do you assume this API is inappropriate? Unity Catalog has nothing to do with this. One API lets you list users at account l...
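For comparison, the account-level SCIM endpoint follows a similar shape to the workspace one, with the account ID in the path. A minimal sketch that only builds the two URLs (no network call); the host and account ID below are placeholders, not values from the thread:

```python
from urllib.parse import urlencode

def workspace_users_url(workspace_url: str, count: int = 100) -> str:
    """Workspace-level SCIM Users endpoint, as quoted in the question."""
    return f"{workspace_url}/api/2.0/preview/scim/v2/Users?{urlencode({'count': count})}"

def account_users_url(account_host: str, account_id: str, count: int = 100) -> str:
    """Account-level SCIM Users endpoint (host and account_id are placeholders)."""
    return (f"{account_host}/api/2.0/accounts/{account_id}"
            f"/scim/v2/Users?{urlencode({'count': count})}")

w = workspace_users_url("https://adb-123.4.azuredatabricks.net")
a = account_users_url("https://accounts.azuredatabricks.net",
                      "00000000-0000-0000-0000-000000000000")
print(w)
print(a)
```

Both calls would be authenticated the same way (bearer token); only the base URL and path differ, which matches the point in the reply that the two APIs serve different scopes rather than replacing each other.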
- 1132 Views
- 1 reply
- 1 kudos
Resolved! Task runs missing from system database
I have a workflow with 11 tasks (each task executes one notebook) that run in sequence. The workflow was run on 9/1 and again today (9/10). I am working on reporting task history and status using the system table `system.lakeflow.job_task_run_timeline`. The...
Hi @DavidKxx, there's a chance that the table has not been updated today. Check tomorrow. Here's what the documentation says:
- 3040 Views
- 2 replies
- 0 kudos
Resolved! exclude (not like) filter using pyspark
I am trying to exclude rows with a specific variable when querying using PySpark, but the filter is not working - similar to the NOT LIKE function in SQL, e.g. not like '%var4%'. The part of the code that is not working is: (col('col4').rlike('var...
- 1252 Views
- 3 replies
- 2 kudos
Resolved! Curl command working in 12.2 but not in 13.3
Hello, one of my teammates is trying to put some observability on our Databricks flows. When he tries to contact our OpenTelemetry server, he gets a timeout. I had a look, and the same command (on the same Databricks workspace) works well with runtime 12....
Hello, thank you for your answer. I tried updating the library version and it was almost the solution. I realized it doesn't work in 12.2 either, but the output was quite different and misled me. It's probably a network configuration to set up on the target server.
- 397 Views
- 0 replies
- 0 kudos
Show document name in AI agent
Hi everyone! I have successfully deployed the Review App of my AI agent following these instructions: Create and log AI agents | Databricks on AWS. However, one question came up regarding the sources (names). To be precise, is there a possibility to sho...
- 6849 Views
- 10 replies
- 10 kudos
Resolved! how to include checkboxes in markdown cells in databricks notebook
Hi, has anyone else tried to include checkboxes in markdown cells in a Databricks notebook? I believe I followed the correct way for checkboxes: - [ ] and - [x], but the result I got is still no checkboxes. Please help! Thanks!
%md
#to do:
- [x] static vari...
Hi @teaholic, I faced the same problem and found the ✓ and ✗ notation that did work for me. Hope that helps.
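To illustrate the workaround, a %md cell along these lines uses the Unicode marks directly, which render even where GitHub-style - [ ] / - [x] task-list syntax is not supported (whether task lists render at all may depend on the notebook's markdown renderer):

```markdown
%md
#### To do:
- ✗ static variables   (open)
- ✓ data load          (done)
```

The to-do item names above are placeholders standing in for the truncated list in the question.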