Databricks Learning Festival (Virtual): 15 January - 31 January 2025

Join us for the return of the Databricks Learning Festival (Virtual)! Mark your calendars from 15 January - 31 January 2025! Upskill today across data engineering, data analysis, machine learning, and generative AI. Join the thousands who have el...

  • 1030 Views
  • 2 replies
  • 1 kudos
a week ago
Get Started With SQL Analytics and BI on Databricks

Upskill on SQL analytics and BI with three short self-paced videos. As organizations seek to democratize their data, there is an increasing demand to enable users to better understand and work with data. Learn how to better understand and analyze data...

  • 74 Views
  • 0 replies
  • 0 kudos
yesterday
Announcing the Winners of the Generative AI World Cup

We are thrilled to announce the winners of the Generative AI World Cup! This event brought together over 1500 data scientists and AI engineers from over 18 countries. The competition showcased innovative generative AI solutions for solving real-world...

  • 206 Views
  • 0 replies
  • 0 kudos
Friday
Introducing an exclusively Databricks-hosted Assistant

We’re excited to announce that the Databricks Assistant, now fully hosted and managed within Databricks, is available in public preview! This version ensures the Assistant relies exclusively on Databricks-hosted models, leveraging the same secure inf...

  • 598 Views
  • 0 replies
  • 1 kudos
2 weeks ago
How to present and share your Notebook insights in AI/BI Dashboards

Summary: the new integration between notebooks and AI/BI dashboards allows insights in notebooks to be seamlessly integrated into a polished, shareable dashboard. We’re excited to announce a new integration between Databricks Notebooks and AI/BI Dashbo...

  • 383 Views
  • 0 replies
  • 0 kudos
2 weeks ago
Meet the Databricks MVPs

The Databricks MVP Program is our way of thanking and recognizing the community members, data scientists, data engineers, developers and open source enthusiasts who go above and beyond to uplift the data and AI community. Whether they’re speaking at ...

  • 618 Views
  • 0 replies
  • 11 kudos
2 weeks ago

Community Activity

RobsonNLPT
Contributor
  • 216 Views
  • 2 replies
  • 0 kudos

Delta Identity latest value after insert

Hi all. I would like to know if Databricks has some feature to retrieve the latest identity column value (always generated) after insert or upsert operations (DataFrame APIs and SQL). Database engines such as Azure SQL and Oracle have a feature that enable ...

Latest Reply
tapash-db
Databricks Employee
  • 0 kudos

Hi, You can always query "SELECT MAX(identity_column) FROM your_table_name" and see the latest value of the identity column. However, there are no direct functions available to give the latest identity column value.
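The workaround above can be illustrated outside Databricks with Python's built-in sqlite3 (the table and column names here are made up for the demo):

```python
import sqlite3

# Toy table with an auto-generated identity-style column
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY AUTOINCREMENT, item TEXT)")
conn.execute("INSERT INTO orders (item) VALUES ('widget')")
conn.execute("INSERT INTO orders (item) VALUES ('gadget')")

# The pattern from the reply: query MAX() of the identity column after inserting
latest_id = conn.execute("SELECT MAX(id) FROM orders").fetchone()[0]
print(latest_id)  # 2
```

Note that under concurrent inserts MAX() may return another session's value, and Delta identity columns are guaranteed unique but not consecutive, so this is a best-effort pattern rather than an exact equivalent of SQL Server's SCOPE_IDENTITY().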

1 More Replies
dsmoore
Visitor
  • 7 Views
  • 0 replies
  • 0 kudos

Multiple volumes from same external location?

Hey all, do you know if it's possible to create multiple volumes referencing the same S3 bucket from the same external location? For example, if I have two workspaces (test and prod) testing different versions of pipeline code but with static data I'd ...

RobsonNLPT
Contributor
  • 2 Views
  • 0 replies
  • 0 kudos

Delta Live Tables Permissions

Hi all, I'm the owner of Delta Live Tables pipelines, but I don't see the option described in the documentation to grant permissions to different users. The only options available are "settings" and "delete". In the sidebar, click Delta Live Tables. Select the nam...

eballinger
New Contributor
  • 54 Views
  • 2 replies
  • 0 kudos

Looking for ways to speed up DLT testing

Hi guys, I am new to this community. I am guessing we have a typical setup (DLT tables, 3 layers: bronze, silver, and gold), and while it works fine in our development environment, I have always looked for ways to speed things up for testers. For exampl...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

There isn't a direct way to achieve this within the current DLT framework. When a DLT table is undeclared, it is designed to be removed from the pipeline, which includes the underlying data. However, there are a few strategies you can consider to spe...

1 More Replies
kfloresip
New Contributor
  • 4 Views
  • 0 replies
  • 0 kudos

Bloomberg API and Databricks

Dear Databricks Community, do you have any experience connecting Databricks (Python) to the Bloomberg Terminal to retrieve data in an automated way? I tried running the following code without success:
%python
%pip install blpapi
Thanks for your help, Kevi...

darkolexis
New Contributor
  • 880 Views
  • 2 replies
  • 1 kudos

Service Principal types in Azure Databricks

In Azure Databricks, we can create two types of service principals, namely: 1. a Databricks-managed SP, and 2. a Microsoft Entra ID-managed SP. What is the difference between the two, other than one being specific to a single workspace and the other being usable from m...

Latest Reply
arunprakash1986
  • 1 kudos

So, what use would it be in a situation where I have a Docker image that runs as a job using Databricks compute? Here the job has "Run As" set to a service principal, say "svc1", which is a Databricks-managed service principal. I believe that...

1 More Replies
mban-mondo
New Contributor II
  • 27 Views
  • 2 replies
  • 1 kudos

Resolved! Notebook Paths Errors in Community Edition

I have the following notebook in the Databricks UI:
dbutils.entry_point.getDbutils().notebook().getContext().toJson()
notebook_path = dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get()
print(f"Current notebook path: {not...

Latest Reply
mban-mondo
New Contributor II
  • 1 kudos

Many thanks. It would be helpful if the error message said 'No access has been granted to this resource' instead of 'Resource not found'.

1 More Replies
Cosimo_F_
Contributor
  • 1388 Views
  • 4 replies
  • 0 kudos

Autoloader schema inference

Hello, is it possible to turn off schema inference with Auto Loader? Thank you, Cosimo

Latest Reply
shivagarg
New Contributor
  • 0 kudos

https://docs.databricks.com/en/ingestion/cloud-object-storage/auto-loader/patterns.html#language-python
You can enforce the schema or use "cloudFiles.schemaHints" to override the inference.
df = spark.readStream.format("cloudFiles") \
  .option("...
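Conceptually, cloudFiles.schemaHints overrides the inferred type only for the columns it names, leaving the rest as inferred. A toy, Spark-free sketch of that per-column precedence (the column names and types are invented for illustration):

```python
# Types Auto Loader might infer from sample files (columns default to string
# for JSON/CSV unless cloudFiles.inferColumnTypes is enabled)
inferred = {"id": "string", "amount": "string", "event_ts": "string"}

# Equivalent of .option("cloudFiles.schemaHints", "id bigint, amount decimal(10,2)")
hints = {"id": "bigint", "amount": "decimal(10,2)"}

# Hinted columns take the hinted type; unhinted columns keep the inferred one
effective = {**inferred, **hints}
print(effective)  # {'id': 'bigint', 'amount': 'decimal(10,2)', 'event_ts': 'string'}
```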

3 More Replies
Rishabh_Tiwari
by Databricks Employee
  • 76 Views
  • 1 replies
  • 1 kudos

Data + AI World Tour Atlanta - December 5, 2024

December 5, 2024 | AmericasMart Atlanta (Building 2). Discover how leading companies near you are taking control of their data and building custom AI on the Databricks Data Intelligence Platform. Data + AI World Tour Atlanta is almost at capacity. Please ...

Latest Reply
Doug-Leal
New Contributor III
  • 1 kudos

I'm looking forward to the Data + AI World Tour in Atlanta! On a related note, is anyone interested in attending an in-person Databricks User Group meeting? I'm curious to gauge interest (and topics?), as we could potentially plan an in-person evening mee...

aranjan99
New Contributor III
  • 2072 Views
  • 4 replies
  • 1 kudos

system.access.table_lineage table missing data

I am using the system.access.table_lineage table to figure out the tables accessed by SQL queries and the corresponding SQL queries. However, I am noticing this table is missing data or values very often. For example, for SQL queries executed by our dbt jobs, ...

Latest Reply
goldenmountain
  • 1 kudos

@aranjan99 did you ever get an answer or conclusion to the limitations of Unity Catalog in regards to tracking access via SQL?

3 More Replies
Direo
Contributor
  • 484 Views
  • 2 replies
  • 0 kudos

Migrating to Unity Catalog: Read-Only Connections to SQL Server and Snowflake

We are in the process of migrating to Unity Catalog, establishing connections to SQL Server and Snowflake, and creating foreign catalogs that mirror our SQL Server and Snowflake databases. This allows us to leverage Unity Catalog’s query syntax and ...

Data Engineering
UnityCatalog SQLServer Snowflake Governance Permissions
Latest Reply
goldenmountain
  • 0 kudos

I’m also trying to figure out if this is a limitation in Unity Catalog. I recently used a JDBC URL to write data to an Amazon Aurora PostgreSQL database, but noticed that no entries appeared in the `system.access.table_lineage` table. Has anyone else...

1 More Replies
MOUNIKASIMHADRI
New Contributor
  • 4930 Views
  • 3 replies
  • 1 kudos

Insufficient Permissions Issue on Databricks

I have encountered a technical issue on Databricks. While executing commands both in Spark and SQL within the Databricks environment, I’ve run into permission-related errors when selecting files from DBFS: "org.apache.spark.SparkSecurityException: [IN...

Latest Reply
mpalacio
Visitor
  • 1 kudos

Hi, I am having the same issue. The Databricks extension is installed and configured correctly, and my user has enough permissions, as I have been working without issues the whole time. But now, when I run my notebooks to read tables in the same Databricks ...

2 More Replies
Tamizh035
New Contributor II
  • 551 Views
  • 3 replies
  • 1 kudos

[INSUFFICIENT_PERMISSIONS] Insufficient privileges:

While reading a CSV file using Spark and listing the files under a folder using Databricks utils, I am getting the error below: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have permission SELECT on any file. SQLSTATE: 42501 File <comma...

Latest Reply
mpalacio
Visitor
  • 1 kudos

I have the same issue, did you manage to solve it? I have the Databricks extension configured correctly and my role has enough permissions. Everything used to work properly, but now running my notebooks gives me this issue and 'no module named dbrun...

2 More Replies
Sen
New Contributor
  • 8083 Views
  • 9 replies
  • 1 kudos

Resolved! Performance enhancement while writing dataframes into Parquet tables

Hi, I am trying to write the contents of a dataframe into a Parquet table using the command below:
df.write.mode("overwrite").format("parquet").saveAsTable("sample_parquet_table")
The dataframe contains an extract from one of our source systems, which h...

Latest Reply
jhoon
Visitor
  • 1 kudos

Great discussion on performance optimization! Managing technical projects like these alongside academic work can be demanding. If you need expert academic support to free up time for your professional pursuits, Dissertation Help Services is here to a...

8 More Replies
yash_verma
Visitor
  • 58 Views
  • 6 replies
  • 0 kudos

Resolved! error while setting up permission for job via api

Hi guys, I am getting the error below when I am trying to set up permissions for the job via the API, though I am able to create a job via the API. Can anyone help identify the issue, or has anyone faced the error below?
{"error_code": "INVALID_PARAMETER_VALUE","m...
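For context, the jobs permissions endpoint (PUT /api/2.0/permissions/jobs/{job_id}) expects an access_control_list body, and an unsupported permission_level is a common cause of INVALID_PARAMETER_VALUE. A sketch of building such a payload in Python; the user, group, and levels shown are placeholders, not taken from the thread:

```python
import json

# Hypothetical principals; jobs accept levels such as CAN_VIEW,
# CAN_MANAGE_RUN, IS_OWNER, and CAN_MANAGE
payload = {
    "access_control_list": [
        {"user_name": "someone@example.com", "permission_level": "CAN_MANAGE"},
        {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
    ]
}
body = json.dumps(payload)  # request body for the PUT call
print(body)
```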

Latest Reply
yash_verma
Visitor
  • 0 kudos

Thanks Alberto. It worked.

5 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group

Latest from our Blog

Understanding Unity Catalog

Throughout the dozens of engagements I’ve had since joining Databricks, I’ve found that customers often struggle to understand the scope and concept of Unity Catalog. Questions like “Does it store my ...

  • 772 Views
  • 3 kudos