Databricks Learning Festival (Virtual): 10 October - 31 October

Join the Databricks Learning Festival (Virtual)! Mark your calendars from 10 October - 31 October 2024! Upskill today across data engineering, data analysis, machine learning, and generative AI. Join the thousands who have elevated their career w...

  • 43313 Views
  • 176 replies
  • 41 kudos
a month ago
Introducing Databricks Apps

Databricks Apps, a new way to build and deploy internal data and AI applications, is now available in Public Preview on AWS and Azure. Ideal use cases include data visualization, AI applications, self-service analytics, and data quality monitoring. It...

  • 839 Views
  • 0 replies
  • 4 kudos
a week ago
Struggling with BI? We want to hear from you!

What if anyone in your organization could get insights from their data using just natural language? What if you didn’t have to manage a BI system separate from your data platform? Databricks is working to make your job easier by delivering these and o...

  • 144 Views
  • 0 replies
  • 1 kudos
Thursday
Databricks Community Champion - September 2024 - Szymon Dybczak

Meet Szymon Dybczak, a valued member of our community! Szymon is a Senior Data Engineer at Nordcloud. He brings a wealth of knowledge and expertise to the group, and we're thrilled to have him here. We presented him with a range of questions, and be...

  • 1270 Views
  • 9 replies
  • 9 kudos
2 weeks ago
Intelligent Data Engineering: Beyond the AI Hype

Don't Miss Out on "Making AI-Powered Data Engineering Practical"! Join us for this exciting virtual event where we’ll cut through the hype and explore how AI-powered data intelligence is transforming data engineering. AMER: Nov 4 / 10 AM PT · EMEA: ...

  • 2115 Views
  • 2 replies
  • 3 kudos
3 weeks ago
GenAI: The Shift to Data Intelligence

Shifting to customized GenAI that deeply understands your data. AMER: October 8 / 10 AM PT · EMEA: October 9 / 9 AM BST / 10 AM CEST · APJ: October 10 / 12 PM SGT. Click here to check out the agenda and speakers and register now! Why are 9 out of 10 organiz...

  • 1270 Views
  • 2 replies
  • 2 kudos
4 weeks ago
Big Book of Data Engineering — 3rd Edition

Get practical guidance, notebooks, and code snippets. You know this better than anyone: the best GenAI models in the world will not succeed without good data. That’s why data engineers are even more critical today. The challenge is staying ahead of the ra...

  • 2161 Views
  • 0 replies
  • 3 kudos
4 weeks ago

Community Activity

by dataslicer (Contributor)

How to export/clone Databricks Notebook without results via web UI?

When a Databricks Notebook exceeds the size limit, it suggests `clone/export without results`. This is exactly what I want to do, but the current web UI does not provide the ability to bypass/skip the results in either the `clone` or `export` context...

  • 1259 Views
  • 3 replies
  • 0 kudos
Latest Reply
dataslicer
Contributor

Thank you @Yeshwanth for the response. I am looking for a way without clearing the current outputs. This is necessary because I want to preserve the existing outputs and fork off another notebook instance to run with a few parameter changes and come...

  • 0 kudos
2 More Replies
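For the export question above: the web UI may not expose this, but the Workspace REST API can export a notebook's source without results while leaving the notebook and its outputs untouched in the workspace. A minimal sketch, assuming a workspace URL and personal access token in environment variables and a hypothetical notebook path:

```python
# Minimal sketch: export a notebook's source only (no cell results) via the
# Workspace API, without clearing anything in the workspace itself.
# Assumes DATABRICKS_HOST (e.g. https://adb-....azuredatabricks.net) and
# DATABRICKS_TOKEN are set in the environment; the notebook path is hypothetical.
import base64
import os

import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {token}"},
    params={
        "path": "/Users/me@example.com/my_notebook",  # hypothetical path
        "format": "SOURCE",  # SOURCE returns code only, results are not included
    },
)
resp.raise_for_status()

# The API returns the notebook content base64-encoded.
with open("my_notebook.py", "wb") as f:
    f.write(base64.b64decode(resp.json()["content"]))
```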
by PabloCSD (Contributor II)

Resolved! How to deploy to Databricks Assets Bundle from Azure DevOps using Service Principal?

I have a CI/CD process that deploys to staging after a Pull Request (PR) to main. It works using a Personal Access Token with Azure Pipelines. From local, deploying using a Service Principal works (https://community.databricks.com/t5/administration-a...

  • 97 Views
  • 1 reply
  • 0 kudos
Latest Reply
PabloCSD
Contributor II

I needed to deploy a job using CI/CD Azure Pipelines without using OAuth, and this is the way: first you need to have the Service Principal configured; for that, generate it in your workspace. With this you will have: A host: which is your wo...

  • 0 kudos
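As a rough illustration of the approach described in the reply (not necessarily the poster's exact setup): the Databricks CLI can authenticate as a service principal through the DATABRICKS_HOST, DATABRICKS_CLIENT_ID, and DATABRICKS_CLIENT_SECRET environment variables, after which a pipeline script step can run `databricks bundle deploy`. A sketch with placeholder values, wrapped in Python for illustration:

```python
# Minimal sketch of what a CI step might do: authenticate the Databricks CLI as a
# service principal via OAuth M2M environment variables, then deploy the bundle.
# The host, client ID, and secret values are placeholders; in Azure DevOps they
# would normally come from pipeline variables or a variable group.
import os
import subprocess

env = {
    **os.environ,
    "DATABRICKS_HOST": "https://adb-1234567890123456.7.azuredatabricks.net",
    "DATABRICKS_CLIENT_ID": "<service-principal-application-id>",
    "DATABRICKS_CLIENT_SECRET": "<oauth-secret-generated-for-the-principal>",
}

# Equivalent to running `databricks bundle deploy -t staging` in a pipeline script step.
subprocess.run(["databricks", "bundle", "deploy", "-t", "staging"], env=env, check=True)
```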
by slakshmanan (New Contributor II)

How to get an access_token from the REST API without a user password

Using the REST API /oauth2/token, how do I get an access_token programmatically?

  • 12 Views
  • 0 replies
  • 0 kudos
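One common way to obtain a token without a user password is the OAuth client-credentials flow for a service principal. The sketch below uses the workspace /oidc/v1/token endpoint rather than the /oauth2/token path mentioned in the post; the workspace URL, client ID, and secret are placeholders:

```python
# Minimal sketch, assuming a Databricks service principal with an OAuth secret:
# request a workspace-level access token via the client-credentials grant.
import requests

workspace = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder

resp = requests.post(
    f"{workspace}/oidc/v1/token",
    auth=("<service-principal-client-id>", "<oauth-secret>"),  # HTTP basic auth
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()

access_token = resp.json()["access_token"]
# Use it like a PAT: Authorization: Bearer <access_token>
```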
by Garrus990 (Visitor)

Passing UNIX-based parameter to a task

Hey, I would like to pass to a task a parameter that is based on a UNIX function. Concretely, I would like to specify dates, dynamically calculated with respect to the date of running my job. I wanted to do it like this: ["--period-start", "$(date -d '-7...

  • 11 Views
  • 0 replies
  • 0 kudos
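Job parameters are generally passed through literally rather than being expanded by a shell, so one workaround is to compute the date inside the task itself. A minimal sketch, assuming the goal is "seven days before the run date" and a hypothetical --period-start argument:

```python
# Minimal sketch: instead of relying on shell substitution like $(date -d '-7 days' ...)
# in the job's parameter list, compute a default inside the task and only override it
# when the parameter is explicitly supplied.
import argparse
from datetime import date, timedelta

parser = argparse.ArgumentParser()
parser.add_argument("--period-start", default=None)
args = parser.parse_args()

# Fall back to a dynamically calculated date when the parameter is not supplied.
period_start = args.period_start or (date.today() - timedelta(days=7)).isoformat()
print(f"Processing data since {period_start}")
```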
by ElaPG1 (Visitor)

all-purpose compute for Oracle queries

Hi, I am looking for any guidelines or best practices regarding compute configuration for extracting data from an Oracle DB and saving it as Parquet files. Right now I have a DBR workflow with a for-each task, concurrency = 31 (as I need to copy the data fro...

  • 22 Views
  • 0 replies
  • 0 kudos
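Independent of the cluster-sizing question, a partitioned JDBC read is usually the main lever for pulling Oracle tables in parallel. A minimal sketch, assuming a Databricks notebook (spark and dbutils available), placeholder connection details and bounds, a hypothetical secret scope, and the Oracle JDBC driver installed as a cluster library:

```python
# Minimal sketch of a partitioned JDBC read from Oracle followed by a Parquet write.
# Connection details, table name, partition bounds, and the output path are placeholders.
jdbc_url = "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", dbutils.secrets.get("oracle", "etl_password"))  # hypothetical scope/key
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("fetchsize", 10000)             # larger fetch size reduces round trips
    .option("partitionColumn", "ORDER_ID")  # numeric column used to split the read
    .option("lowerBound", 1)
    .option("upperBound", 50_000_000)
    .option("numPartitions", 16)            # parallel JDBC connections
    .load()
)

df.write.mode("overwrite").parquet(
    "abfss://raw@storageaccount.dfs.core.windows.net/oracle/orders/"
)
```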
by Derek_Czarny (New Contributor II)

Resolved! Creating Groups with API and Python

I am working on a notebook to help me create Azure Databricks Groups.  When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them.  When I create a group using the API, and I create the w...

  • 156 Views
  • 5 replies
  • 1 kudos
Latest Reply
Derek_Czarny
New Contributor II

That was it, thank you.  I was looking at the wrong details.  I really appreciate it.

  • 1 kudos
4 More Replies
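For reference, one way to mirror what the UI does is to create the group at the account level and then assign it to the workspace; this may not be the exact fix the thread landed on. A sketch with placeholder account ID, workspace ID, and an account-admin token:

```python
# Minimal sketch, assuming the goal is an account-level group that is then assigned
# to a workspace (instead of a workspace-local group). All IDs and the token are
# placeholders; the token must belong to an account admin.
import requests

ACCOUNT_HOST = "https://accounts.azuredatabricks.net"
ACCOUNT_ID = "<databricks-account-id>"
WORKSPACE_ID = "<workspace-id>"
headers = {"Authorization": "Bearer <account-admin-token>"}

# 1. Create the group at the account level via the account SCIM API.
group = requests.post(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/scim/v2/Groups",
    headers=headers,
    json={"displayName": "data-engineers"},
).json()

# 2. Assign the account-level group to a workspace.
requests.put(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/workspaces/{WORKSPACE_ID}"
    f"/permissionassignments/principals/{group['id']}",
    headers=headers,
    json={"permissions": ["USER"]},
).raise_for_status()
```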
by itamarwe (New Contributor II)

Google PubSub for DLT - Error

I'm trying to create a Delta Live Table from a Google PubSub stream. Unfortunately I'm getting the following error: org.apache.spark.sql.streaming.StreamingQueryException: [PS_FETCH_RETRY_EXCEPTION] Task in pubsub fetch stage cannot be retried. Partiti...

  • 570 Views
  • 2 replies
  • 1 kudos
Latest Reply
itamarwe
New Contributor II

Hi @Retired_mod, it was indeed a permissions issue. Nevertheless, I must admit that the error message is slightly misleading. Thanks.

  • 1 kudos
1 More Replies
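For anyone hitting the same error: the root cause here was missing Pub/Sub permissions for the service account. A minimal sketch of the connector usage, with placeholder project/subscription names and credentials taken from a service-account key file (option names as assumed from the Databricks Pub/Sub connector documentation):

```python
# Minimal sketch of a DLT table reading from Google Pub/Sub. The service account
# supplying these credentials must have the Pub/Sub subscriber role on the
# subscription; all names and key fields below are placeholders.
import dlt

auth_options = {
    "clientId": "<service-account-client-id>",
    "clientEmail": "<service-account>@<project>.iam.gserviceaccount.com",
    "privateKey": "<private-key-from-the-json-key-file>",
    "privateKeyId": "<private-key-id>",
}

@dlt.table
def pubsub_raw():
    return (
        spark.readStream.format("pubsub")
        .option("subscriptionId", "my-subscription")
        .option("topicId", "my-topic")
        .option("projectId", "my-gcp-project")
        .options(**auth_options)
        .load()
    )
```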
by dbx_687_3__1b3Q (New Contributor III)

Resolved! Databricks Asset Bundle (DAB) from a Git repo?

My earlier question was about creating a Databricks Asset Bundle (DAB) from an existing workspace. I was able to get that working but after further consideration and some experimenting, I need to alter my question. My question is now "how do I create...

  • 7179 Views
  • 10 replies
  • 4 kudos
Latest Reply
mflyingget
Visitor

How can I deploy a custom Git repo to the .bundle workspace?

  • 4 kudos
9 More Replies
by abhishekdas (Visitor)

Databricks on AWS - Changes to your Unity Catalog storage credentials

Hi. Context: On June 30, 2023, AWS updated its IAM role trust policy, which requires updating Unity Catalog storage credentials. Databricks previously sent an email communication to customers in March 2023 on this topic and updated the documentation a...

  • 19 Views
  • 0 replies
  • 0 kudos
by shadowinc (New Contributor III)

Databricks SQL endpoint as Linked Service in Azure Data Factory

We have a special endpoint that grants access to Delta tables, and we want to know if we can use SQL endpoints as a linked service in ADF. If yes, which ADF linked service would be suitable for this? Appreciate your support on this.

Data Engineering
SQL endpoint
  • 1995 Views
  • 7 replies
  • 3 kudos
Latest Reply
yashrg
New Contributor

Azure Databricks Delta Lake (Dataset) uses a linked service that can only connect to an All-Purpose/Interactive cluster. If you want to use the SQL endpoint, you would need a Self-Hosted Integration Runtime for ADF with the Databricks ODBC driver installed...

  • 3 kudos
6 More Replies
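For context, a self-hosted integration runtime would reach the SQL warehouse through the Databricks (Simba Spark) ODBC driver, and the same connection attributes can be tested directly with pyodbc. A sketch with placeholder host, HTTP path, and token:

```python
# Minimal sketch of connecting to a Databricks SQL warehouse through the Simba Spark
# ODBC driver, i.e. the same driver an ODBC-based linked service would use.
# Host, HTTPPath, and the personal access token are placeholders.
import pyodbc

conn_str = (
    "Driver=Simba Spark ODBC Driver;"
    "Host=adb-1234567890123456.7.azuredatabricks.net;"
    "Port=443;"
    "SSL=1;"
    "ThriftTransport=2;"  # HTTP transport
    "HTTPPath=/sql/1.0/warehouses/abcdef1234567890;"
    "AuthMech=3;"         # username/password auth; username is the literal 'token'
    "UID=token;"
    "PWD=<personal-access-token>"
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    rows = conn.execute("SELECT current_catalog(), current_schema()").fetchall()
    print(rows)
```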
by tonylax6 (Visitor)

Azure Databricks to Adobe Experience Platform

I'm using Azure Databricks and am attempting to stream near real-time data from Databricks into the Adobe Experience Platform to ingest into the AEP schema for profile enrichment. We are running into an issue with the API and streaming, so we are curr...

  • 25 Views
  • 0 replies
  • 0 kudos
by umccanna (New Contributor II)

Resolved! Unable to Create Job Task Using Git Provider Invalid Path

I am attempting to create a task in a job using the Git Provider as a source, with GitHub as the provider. The repo is a private repo. Regardless of how I enter the path to the notebook, I receive the same error that the notebook path is invalid and o...

  • 185 Views
  • 5 replies
  • 0 kudos
Latest Reply
umccanna
New Contributor II

As I said in a previous response, this started working automatically a few days ago with no changes on our end. The developer who was working on this decided to try it one more time and it just worked, with no error this time. I don't know if Databri...

  • 0 kudos
4 More Replies
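A common cause of the "invalid notebook path" error is including a file extension or an absolute path; with a Git source the notebook path is relative to the repo root and extension-free. A hedged sketch of a Jobs 2.1 API call with placeholder repo, branch, cluster, and path values (not necessarily what fixed it here, since the error cleared on its own):

```python
# Minimal sketch of creating a job whose notebook task runs from a remote GitHub repo.
# Repo URL, branch, cluster ID, and notebook path are placeholders; note the
# notebook_path is relative to the repo root and has no .py/.ipynb extension.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

job_spec = {
    "name": "git-sourced-job",
    "git_source": {
        "git_url": "https://github.com/my-org/my-private-repo",
        "git_provider": "gitHub",
        "git_branch": "main",
    },
    "tasks": [
        {
            "task_key": "run_notebook",
            "notebook_task": {
                "notebook_path": "notebooks/etl/daily_load",  # relative, no extension
                "source": "GIT",
            },
            "existing_cluster_id": "<cluster-id>",
        }
    ],
}

resp = requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json=job_spec)
resp.raise_for_status()
print(resp.json()["job_id"])
```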
by Brahmareddy (Valued Contributor II)

Let's Build the Austin Databricks Community Group Together

Hello Austin Databricks Community, I hope this message finds you well! I’m excited to be leading this group and to connect with all of you who are passionate about Databricks and data engineering. Our goal is to create a thriving community here in Austin, whe...

  • 23 Views
  • 0 replies
  • 0 kudos
by EvanMarth (New Contributor III)

Cannot create an account to try Community Edition

Hi, whenever I try to sign up for an account, I keep getting the following message: "an error has occurred. please try again later" when I click on the button "get started with databricks community edition". Could you please let me know why this could...

  • 4401 Views
  • 10 replies
  • 1 kudos
Latest Reply
mwang611
Visitor

May I ask if anyone has the solution?

  • 1 kudos
9 More Replies
by ankitmit (New Contributor)

DLT Apply Changes

Hi, in DLT, how do we specify which columns we don't want to overwrite when using the “apply changes” operation (in the attached example, we want to avoid overwriting the “created_time” column)? I am using this sample code: dlt.apply_changes(...

  • 46 Views
  • 2 replies
  • 0 kudos
Latest Reply
Finn-Ol
Visitor

There might be a misunderstanding regarding the except_column_list parameter in the apply_changes function. This parameter is used to specify which columns to exclude from the changes, but in this case, it seems like it’s dropping the created_time co...

  • 0 kudos
1 More Replies
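To make the parameter shapes concrete, here is a minimal apply_changes sketch with placeholder table names and keys. As the reply notes, except_column_list omits the listed columns from the target table entirely; it is not a "keep the existing value on update" option:

```python
# Minimal sketch of the apply_changes call under discussion. Target/source names,
# keys, and the sequencing column are placeholders.
import dlt
from pyspark.sql.functions import col

dlt.create_streaming_table("customers_silver")

dlt.apply_changes(
    target="customers_silver",
    source="customers_bronze_changes",
    keys=["customer_id"],
    sequence_by=col("event_ts"),
    except_column_list=["created_time"],  # excluded from the target, not preserved
    stored_as_scd_type=1,
)
```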

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group

Latest from our Blog

How not to build an AI/BI Demo

Firstly.  Don’t make a demo and then release it under its old name Lakeview, 6 weeks before the name changes to AI/BI, because now no one in the future knows what you're on about. Secondly, I spoke a...

  • 138 Views
  • 1 kudos