Discussions
Engage in dynamic conversations covering diverse topics within the Databricks Community. Explore discussions on data engineering, machine learning, and more. Join the conversation and expand your knowledge base with insights from experts and peers.

Browse the Community

Community Discussions

Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Expl...

4386 Posts

Activity in Discussions

by APJESK (Contributor)
  • 23 Views
  • 0 replies
  • 0 kudos

Workspace Folder ACL design

How should the Databricks workspace folder architecture be designed to support cross-team collaboration, access governance, and scalability in an enterprise platform? Please suggest below or share some ideas from your experience. Thanks! Note: I'm new t...

by Brahmareddy (Esteemed Contributor)
  • 193 Views
  • 2 replies
  • 4 kudos

Invitation to the Databricks Developer Community on Medium

Hi all, we are starting a new Databricks Developer Community publication on Medium, and this is a warm invitation to every Databricks professional, developer, and aspirant who believes in learning by sharing. We are doing this because Databricks learni...

Latest Reply
Brahmareddy
Esteemed Contributor
  • 4 kudos

Thanks @Louis_Frolio. Let us do it better together.

1 More Replies
by Brahmareddy (Esteemed Contributor)
  • 107 Views
  • 2 replies
  • 3 kudos

Data + AI Is Not the Future at Databricks. It’s the Present.

One thing becomes very clear when you spend time in the Databricks community: AI is no longer an experiment. It is already part of how real teams build, ship, and operate data systems at scale. For a long time, many organizations treated data engineer...

Latest Reply
Brahmareddy
Esteemed Contributor
  • 3 kudos

Thanks @Louis_Frolio for your kind words. Happy to contribute here.

1 More Replies
by usatopseoshop (Visitor)
  • 22 Views
  • 0 replies
  • 0 kudos

MARKETING

Are you looking to simplify your international money transfers and manage your finances with ease? Buying a verified Wise account could be the game-changer you need. If you’re looking to send money internationally at low costs, consider opening a mul...

WISW ACCOUNTS.jpg
by ADBricksExplore (New Contributor)
  • 64 Views
  • 1 reply
  • 0 kudos

Child subqueries/sub-statements history metrics from a parent [CALL...] statement in Query History

Hi, so far I cannot find a way to programmatically (SQL/Python) get the subquery/sub-statement execution history records, shown in the Databricks UI Query History/Profile, that were executed during a task run of a job, as shown in the [red boxes] on the atta...

image.png
Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @ADBricksExplore, short answer: there isn’t a supported public API that returns the “Substatements / Subqueries” panel you see in the Query History or Profile UI. The GraphQL endpoints the UI relies on are internal and not stable or suppo...
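
If top-level statements are enough, a rough sketch against system.query.history may get part of the way there. Column names such as statement_id, statement_text, total_duration_ms, and start_time are assumptions here, and the child sub-statements from the UI panel may not appear at all:

    -- Sketch only: recent top-level CALL statements from the query history system table.
    -- Child sub-statements shown in the UI panel may not be returned, per the reply above.
    SELECT statement_id, statement_text, total_duration_ms
    FROM system.query.history
    WHERE start_time >= current_timestamp() - INTERVAL 1 DAY
      AND statement_text ILIKE 'CALL%'
    ORDER BY start_time DESC;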

by Danish11052000 (New Contributor III)
  • 137 Views
  • 5 replies
  • 2 kudos

How to get read/write bytes per table using Databricks system tables?

I’m working on a data usage use case and want to understand the right way to get read bytes and written bytes per table in Databricks, especially for Unity Catalog tables. What I want: for each table, something like: Date, Table name (catalog.schema.table)...

Latest Reply
pradeep_singh
New Contributor II
  • 2 kudos

system.access.audit focuses on governance and admin/security events. It doesn’t capture per-table I/O metrics such as read_bytes or written_bytes. Use system.query.history for per-statement I/O metrics (read_bytes, written_bytes, read_rows, written_ro...
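
A minimal sketch of the per-day rollup, assuming a start_time timestamp column alongside the I/O columns cited above; attributing bytes to individual tables would still require parsing statement_text or similar:

    -- Sketch: daily I/O totals from statement-level history.
    -- start_time is an assumed column name; read_bytes/written_bytes are cited above.
    SELECT DATE(start_time)   AS query_date,
           SUM(read_bytes)    AS total_read_bytes,
           SUM(written_bytes) AS total_written_bytes
    FROM system.query.history
    GROUP BY DATE(start_time)
    ORDER BY query_date;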

4 More Replies
by Carlton (Contributor II)
  • 86 Views
  • 2 replies
  • 0 kudos

Resolved! No Longer Getting the Option to Log In to Databricks Legacy Community Edition

Hello Support,In the past, whenever I have logged in to access my Databricks Community Edition, I was given the option to log into the Legacy Edition or the Databricks Free Edition. However, now I'm forced to login to the Free Edition. Can someone le...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Carlton, that's because they shut down Community Edition on January 1, 2026. Now you can use the Free Edition only (which is way superior). https://community.databricks.com/t5/announcements/psa-community-edition-retires-on-january-1-2026-move-to-the-fre...

1 More Replies
by danny_frontgrad (New Contributor)
  • 175 Views
  • 11 replies
  • 3 kudos

Resolved! Question on Ingestion Pipelines

Is there a better way to select source tables than having to manually select them one by one? I have 96 tables and it's a pain. The GUI keeps going back to the schema and I have to search through all the tables again. Is there a way to import the tables using ...

Latest Reply
pradeep_singh
New Contributor II
  • 3 kudos

So you don't see the option to edit the pipeline? Or once you click on Edit pipeline, you don't see the option to Switch to code version (YAML)? Or after you Switch to code version (YAML), can you only view that YAML and not edit it?

10 More Replies
by b_pinter (New Contributor)
  • 83 Views
  • 1 reply
  • 0 kudos

NetSuite JDBC Driver 8.10.184.0 Support

Hello, I am currently attempting to integrate NetSuite with Databricks using the NetSuite JDBC driver version 8.10.184.0. When I attempt to ingest information from NetSuite to Databricks, I find that the job fails with a checksum error and informs ...

Latest Reply
pradeep_singh
New Contributor II
  • 0 kudos

Requirements: to configure NetSuite for Databricks ingestion, you must have the following:
  • A NetSuite account with a SuiteAnalytics JDBC drivers license.
  • Access to the NetSuite2.com data source. The legacy netsuite.com data source is not supported.
  • Admini...

by Ericsson (New Contributor II)
  • 5760 Views
  • 3 replies
  • 1 kudos

SQL week format issue: it's not showing the result as 01 (ww)

Hi folks, I have a requirement to show the week number in ww format. Please see the code below: select weekofyear(date_add(to_date(current_date, 'yyyyMMdd'), +35)). Also, please refer to the screenshot for the result.

result
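
weekofyear() returns an integer, so week 1 comes back as 1 rather than 01; one way to get the two-character ww string is to zero-pad it. A minimal sketch:

    -- Sketch: zero-pad weekofyear() so week 1 renders as '01'
    SELECT lpad(CAST(weekofyear(date_add(current_date(), 35)) AS STRING), 2, '0') AS week_ww;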
Latest Reply
Fowlkes
New Contributor
  • 1 kudos

What Is Papa’s Freezeria?Papa’s Freezeria is part of the famous Papa Louie game series, where players take on the role of restaurant employees running one of Papa Louie’s many eateries. http://papasfreezeria.online/

2 More Replies
by greengil (New Contributor III)
  • 238 Views
  • 5 replies
  • 0 kudos

Resolved! Databricks database

In Oracle, I create schemas and tables and link tables together via primary/foreign keys to do SQL queries. In Databricks, I notice that I can create tables, but how do I link tables together for querying? Do Databricks queries need the key in t...

Latest Reply
balajij8
New Contributor
  • 0 kudos

Primary and foreign keys are informational, unlike in Oracle. You can use the Unity Catalog lineage graph to easily find the relationships between tables in Databricks.
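
For reference, a minimal sketch of declaring such informational constraints; the catalog, schema, and table names here are hypothetical, the primary key columns must be NOT NULL, and the constraints are not enforced:

    -- Hypothetical names; constraints are informational only (not enforced)
    ALTER TABLE main.sales.customers
      ADD CONSTRAINT customers_pk PRIMARY KEY (customer_id);
    ALTER TABLE main.sales.orders
      ADD CONSTRAINT orders_customers_fk FOREIGN KEY (customer_id)
      REFERENCES main.sales.customers (customer_id);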

4 More Replies
by kenny_hero (New Contributor III)
  • 439 Views
  • 7 replies
  • 1 kudos

How do I import a python module when deploying with DAB?

Below is what the folder structure of my project looks like:
resources/
|- etl_event/
   |- etl_event.job.yml
src/
|- pipeline/
   |- etl_event/
      |- transformers/
         |- transformer_1.py
      |- utils/
         |- logger.py
databricks.ym...

Latest Reply
pradeep_singh
New Contributor II
  • 1 kudos

You don't need to use wheel files. Use glob as the key instead of file: https://docs.databricks.com/aws/en/dev-tools/bundles/resources#pipelinelibraries. Here is the screenshot.

6 More Replies
by anchalgarg0709 (New Contributor)
  • 73 Views
  • 0 replies
  • 0 kudos

Where Data + Gen AI Adds Value in Modern Data Platforms

Data + Gen AI is most effective when grounded in real data constraints. On Databricks, combining Gen AI with Spark and Delta accelerates prototyping and testing, but fundamentals still matter: schema design, realistic distributions, and domain understa...

by Daniela_Boamba (New Contributor III)
  • 330 Views
  • 1 reply
  • 0 kudos

Databricks certificate expired

Hello, I have a Databricks workspace with SSO authentication; the IdP is on Azure. The client certificate expired and now I can't log on to Databricks to add the new one. What can I do? Any idea is welcome. Thank you! Best regards, Daniela

Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

This is an AWS Databricks workspace and your SSO is with Entra ID? You'll need to create a support ticket, and then Engineering can disable SSO temporarily, allowing you to log in with user + OTP. The long-term solution here is that you should: Set up Acco...

by Danish11052000 (New Contributor III)
  • 146 Views
  • 5 replies
  • 1 kudos

How to incrementally back up system.information_schema.table_privileges (no streaming, no unique keys)

I'm trying to incrementally back up system.information_schema.table_privileges but facing challenges:
  • No streaming support: Is streaming supported: False
  • No unique columns for MERGE: All columns contain common values, no natural key combination
  • No timest...

Latest Reply
MoJaMa
Databricks Employee
  • 1 kudos

information_schema is not made up of Delta tables, which is why you can't stream from it. They are basically views on top of the information coming straight from the control-plane database. Also, your query is actually going to be quite slow/expensive (you prob...
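
Given those constraints, one common workaround is appending a dated full snapshot on a schedule and diffing snapshots to derive increments; a minimal sketch, where the backup catalog and schema names are hypothetical:

    -- Hypothetical target table; create it empty once, then append a dated snapshot per run
    CREATE TABLE IF NOT EXISTS backup.admin.table_privileges_snapshots AS
    SELECT current_date() AS snapshot_date, *
    FROM system.information_schema.table_privileges
    WHERE 1 = 0;

    INSERT INTO backup.admin.table_privileges_snapshots
    SELECT current_date(), *
    FROM system.information_schema.table_privileges;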

4 More Replies