Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

saraM
by New Contributor III
  • 319 Views
  • 1 reply
  • 0 kudos

My Databricks exam got suspended on 18th March and is still in the suspended state.

My Databricks exam got suspended on 18th March and is still in the suspended state. I have raised a support request using the link below: https://help.databricks.com/s/contact-us?ReqType=training and my request ID is #00635015. I would really like to ge...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @saraM! It looks like this post duplicates the one you recently posted. The original post has already been answered. I recommend continuing the discussion there to keep the conversation focused and organised.

  • 0 kudos
ewe
by New Contributor III
  • 2169 Views
  • 3 replies
  • 2 kudos

Resolved! Databricks Apps (Streamlit) not able to install Python libs

So, I have a Databricks Streamlit app that is not able to install any Python lib defined in requirements.txt. The issue is not specific to one lib; I tried other ones, but no Python lib can be installed. Anyone with a similar issue who can help? [2025-02-19 10...

Latest Reply
ewe
New Contributor III
  • 2 kudos

Hi, returning to explain how the issue was solved. A bit of context: it looks like Databricks Apps run on the Databricks/Microsoft backend, not on our clusters. The control of the network ingress rules is done via SEG (https://www.databricks.com...

  • 2 kudos
2 More Replies
Upendra_Dwivedi
by New Contributor III
  • 1085 Views
  • 3 replies
  • 0 kudos

On-Prem SQL Server Direct Connection with Azure Databricks

Hi All, I have SSMS installed and have some data there which I want to export to Databricks and process there. Databricks is hosted on Azure, and I am wondering if this is possible. I have tested it using a JDBC connection, but I am getting an error: ...

Latest Reply
MariuszK
Valued Contributor III
  • 0 kudos

If you want to play with SQL Server, the easy way is to use Azure SQL, which will be visible to Databricks. Alternatively, you can use ADF with a self-hosted integration runtime to extract data to Azure from SQL Server. Network configuration requires many steps and i...
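For the JDBC route mentioned in the question, here is a minimal sketch of reading a SQL Server / Azure SQL table that is network-reachable from the workspace; the server name, table, and secret scope/key names are placeholders, not values from this thread.

# Sketch: read one SQL Server / Azure SQL table over JDBC from a Databricks notebook.
jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<database>;encrypt=true"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")
    .option("user", dbutils.secrets.get("my_scope", "sql_user"))
    .option("password", dbutils.secrets.get("my_scope", "sql_password"))
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Persist the extract as a table for further processing in Databricks.
df.write.mode("overwrite").saveAsTable("bronze.my_table_copy")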

  • 0 kudos
2 More Replies
Mumrel
by Contributor
  • 7404 Views
  • 5 replies
  • 1 kudos

Resolved! Usage of forum: How to find the threads I commented on and my bookmarks

Hi, quick questions: 1) The other day I commented on a thread and cannot find it. Is there a feature to find all my posts? I cannot find it under my profile. 2) I find interesting threads and want to watch (in the sense: email me when updates occur) t...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Jan St. Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...

  • 1 kudos
4 More Replies
Prasanna_N
by New Contributor
  • 2381 Views
  • 0 replies
  • 0 kudos

Inference table Monitoring

I have data from March 1 to March 14 in the final inference table, and I have set a 1-week granularity. After that, the profile and drift tables are generated, and I see the window start time as an object like this: start: "2025-02-24T00:00:00.000Z" end: "2025-03-03...

Danny_Lee
by Valued Contributor
  • 2576 Views
  • 0 replies
  • 1 kudos

re: Welcoming Bladebridge to Databricks!

Hi @Sujitha and Databricks team, Congrats on the acquisition of Bladebridge. We used this tool a couple of years back to migrate an important ETL process from Informatica. I'm glad to see it's part of the Data Intelligence Platform and have already take...

databicky
by Contributor II
  • 866 Views
  • 1 reply
  • 0 kudos

How to copy notebooks from one environment to another

I have a requirement to copy notebooks from one environment to another automatically, using a single notebook. How can I achieve this?

Latest Reply
Isi
Honored Contributor II
  • 0 kudos

Hey @databicky, you can automate the process of copying notebooks from one Databricks environment to another using the Databricks REST API within a notebook. Here is the easiest way I found to do it:

import json
import requests
import base64
# ====...
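For reference, a minimal sketch of that idea (not the exact script from the reply); the workspace URLs, tokens, and notebook path below are placeholders.

import requests

SRC_HOST, SRC_TOKEN = "https://<source-workspace>", "<source-pat>"
DST_HOST, DST_TOKEN = "https://<target-workspace>", "<target-pat>"
NOTEBOOK_PATH = "/Shared/my_notebook"

# 1) Export the notebook from the source workspace (content is returned base64-encoded).
resp = requests.get(
    f"{SRC_HOST}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {SRC_TOKEN}"},
    params={"path": NOTEBOOK_PATH, "format": "SOURCE"},
)
resp.raise_for_status()
content = resp.json()["content"]

# 2) Import it into the target workspace, overwriting any notebook already at that path.
resp = requests.post(
    f"{DST_HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {DST_TOKEN}"},
    json={
        "path": NOTEBOOK_PATH,
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()

Looping the same export/import pair over the results of a workspace/list call lets you copy an entire folder of notebooks.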

  • 0 kudos
Kabil
by New Contributor
  • 2190 Views
  • 0 replies
  • 0 kudos

Using DLT metadata as a runtime parameter

I have started using DLT pipelines, and I have common code which is used by multiple DLT pipelines. Now I need to read metadata information, like the name of the pipeline and its start time, at run time, but since I'm using common code and pip...
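One possible approach (an assumption, not something confirmed in this thread): set per-pipeline values in each pipeline's configuration and read them from the shared code via spark.conf; the configuration key names below are made up for illustration.

# Sketch: each DLT pipeline sets its own values under "configuration" in the pipeline
# settings, e.g. {"my.pipeline_name": "sales_bronze"}, and the shared module reads them
# at run time. The "my.*" keys are illustrative, not a built-in Databricks API.
import datetime

def pipeline_context(spark):
    return {
        "pipeline_name": spark.conf.get("my.pipeline_name", "unknown"),
        # Start time captured when the shared code first runs in this update.
        "start_time": datetime.datetime.utcnow().isoformat(),
    }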

797646
by New Contributor II
  • 1433 Views
  • 5 replies
  • 2 kudos

Resolved! Calculated measures not working in Dashboards for queries with big results

Queries with big results are executed on the cluster. If we specify a calculated measure as something like cal1 as count(*) / count(distinct field1), it will wrap it in backticks as `count(*) / count(distinct field1)` as `cal1`, and the functions are not identified in...

Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello Team, Could you all try with all caps? COUNT(DISTINCT xxx)

  • 2 kudos
4 More Replies
abelian-grape
by New Contributor III
  • 1324 Views
  • 5 replies
  • 0 kudos

Trigger a Databricks Job When there is an insert to a Snowflake Table?

I need to automatically trigger a Databricks job whenever a new row is inserted into a Snowflake table. Additionally, I need the job to receive the exact details of the newly inserted row as parameters. What are the best approaches to achieve this? I’m ...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

I think a Lambda function / EventBridge would be a good way. You can query your Snowflake table there, create logic for any new row insert (maybe CDC, etc.), and then send a job trigger using the Databricks API / Databricks SDK, where you can pass your new...
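To illustrate the "send a job trigger" step, here is a minimal sketch of what the Lambda side could call; the host, token, job id, and row fields are placeholders, and the Jobs run-now REST endpoint is used (the Databricks SDK would work equally well).

# Sketch: start one run of a Databricks job, passing the inserted row as a notebook parameter.
import json
import requests

DATABRICKS_HOST = "https://<workspace-url>"
DATABRICKS_TOKEN = "<pat-or-secret>"
JOB_ID = 123  # the Databricks job to trigger

def trigger_job_for_row(row: dict) -> int:
    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
        json={
            "job_id": JOB_ID,
            "notebook_params": {"inserted_row": json.dumps(row)},
        },
    )
    resp.raise_for_status()
    return resp.json()["run_id"]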

  • 0 kudos
4 More Replies
Trifa
by New Contributor II
  • 1109 Views
  • 3 replies
  • 1 kudos

Resolved! Override DLT Full Refresh using a Job parameter

Hello, I have a Job with a DLT pipeline as the first task. From time to time, I want to execute this Job with a full refresh of the DLT pipeline. How could I override my default "full_refresh = false"? This was possible before using the legacy parameters...

Latest Reply
adriennn
Valued Contributor
  • 1 kudos

@Trifa luckily, it's simple to implement. You can bet the guys are going to release Pipeline Parameters® a week after you have deployed your solution, though.
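For anyone landing here later, a minimal sketch of one way to do this (an assumption, not necessarily the exact solution from this thread): start the job through the Jobs run-now REST endpoint, whose pipeline_params field carries the full-refresh flag. Host, token, and job id are placeholders.

# Sketch: trigger the job and force a full refresh of its DLT pipeline task.
import requests

DATABRICKS_HOST = "https://<workspace-url>"
DATABRICKS_TOKEN = "<pat>"
JOB_ID = 456

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json={"job_id": JOB_ID, "pipeline_params": {"full_refresh": True}},
)
resp.raise_for_status()
print(resp.json()["run_id"])

Leaving full_refresh out (or setting it to False) keeps the default incremental behaviour.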

  • 1 kudos
2 More Replies
Kumarn031425
by New Contributor
  • 752 Views
  • 1 reply
  • 0 kudos

Automating Migration of Delta Live Tables Pipelines Across Environments Using Azure DevOps CI/CD

I am seeking guidance on automating the migration of Delta Live Tables (DLT) pipelines across various environments—specifically from development to testing, and ultimately to production—utilizing Azure DevOps for Continuous Integration and Continuous...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @Kumarn031425, I guess this video tutorial will answer most of your questions: https://youtu.be/SZM49lGovTg?si=X7Cwp0Wfqlo1OnuS Here, a tutorial on deploying workspace resources using Databricks, Azure DevOps, and Databricks Asset Bundles is...

  • 0 kudos
WYO
by New Contributor II
  • 396 Views
  • 1 reply
  • 0 kudos
Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hi there @WYO, I don't think we have a way to add multiple notebooks for a single DLT pipeline from the DLT pipeline configuration settings. But there can be another way: you can create a single notebook which has multiple code blocks which use the run ...

  • 0 kudos
Meghana89
by New Contributor II
  • 896 Views
  • 2 replies
  • 0 kudos

Read Write Stream Data from Event Hub to Databricks Delta Lake

I am trying to read streaming data from Event Hub which is in JSON format. I am able to read the data into a DataFrame, but the body type was coming as binary; I have converted it to string and decoded it, but while implementing the write stream I am facing an ongoin...

Latest Reply
Meghana89
New Contributor II
  • 0 kudos

@SantoshJoshi Thanks for the reply. Please find the code snippet below:

from pyspark.sql import functions as F
from pyspark.sql.types import StringType
import base64

# Define the Event Hubs connection string
connectionString = endpoint (replace with endpoint f...
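For context, here is a minimal sketch of the overall read/write pattern using the Event Hubs Kafka-compatible endpoint; the namespace, event hub name, secret scope, JSON schema, checkpoint path, and target table are all placeholders, not the poster's actual values.

# Sketch: stream JSON events from Event Hubs (Kafka endpoint) into a Delta table.
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType

EH_BOOTSTRAP = "<namespace>.servicebus.windows.net:9093"
EH_NAME = "<eventhub-name>"
EH_CONN_STR = dbutils.secrets.get("my_scope", "eh_connection_string")

kafka_options = {
    "kafka.bootstrap.servers": EH_BOOTSTRAP,
    "subscribe": EH_NAME,
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "PLAIN",
    "kafka.sasl.jaas.config": (
        'kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{EH_CONN_STR}";'
    ),
}

# The Kafka source delivers the body as binary: cast it to string, then parse the JSON.
schema = StructType([StructField("id", StringType()), StructField("payload", StringType())])

events = (
    spark.readStream.format("kafka").options(**kafka_options).load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("body"))
    .select("body.*")
)

# Write the parsed stream to Delta; a checkpoint location is required for streaming writes.
(
    events.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/eh_events")
    .toTable("main.bronze.eh_events")
)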

  • 0 kudos
1 More Replies
908314
by New Contributor II
  • 696 Views
  • 3 replies
  • 2 kudos

Cluster logs stopped getting written to S3

We have two Databricks workspaces, and since a couple of days ago, cluster logs are not getting persisted to S3 in either workspace. Driver logs are available in the Databricks UI only while the job is active. We haven't seen any errors in the job logs relate...

Latest Reply
adriantaut
New Contributor II
  • 2 kudos

Hello, we are facing the same issue in both our workspaces: our cluster logs suddenly stopped being delivered to S3 on the 12th of March. There were no changes to the cluster settings or on the IAM side, and all IAM permissions should be in place according to the Databricks Off...

  • 2 kudos
2 More Replies
