Data Engineering

Forum Posts

Anonymous
by Not applicable
  • 4685 Views
  • 15 replies
  • 8 kudos

Resolved! What are some best practices for CI/CD?

A number of people have questions about using Databricks in a productionized environment. What are the best practices for enabling CI/CD automation?

Latest Reply
BaivabMohanty
New Contributor II
  • 8 kudos

Any leads/posts for Databricks CI/CD integration with a Bitbucket pipeline? I am facing the below error while creating my CI/CD pipeline:

pipelines:
  branches:
    master:
      - step:
          name: Deploy Databricks Changes
          image: docker:19.03.12
          services:
            - docker
          script:
            # U...

14 More Replies
Sambit_S
by New Contributor II
  • 10 Views
  • 0 replies
  • 0 kudos

Error while deserializing protobuf data

I am receiving protobuf data in a JSON attribute, and along with it I receive a descriptor file. I am using from_protobuf to deserialize the data as below. It works most of the time, but gives an error when there are some recursive fields within the protob...

[attached screenshot: Sambit_S_0-1713966940987.png]
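In case it is useful to anyone hitting the same thing: the Spark protobuf reader raises an error on recursive message definitions unless recursive.fields.max.depth is set. Below is a minimal sketch of that option; the table, column, message, and descriptor-path names are placeholders, not taken from the post.

```python
from pyspark.sql.protobuf.functions import from_protobuf

df = spark.table("raw_events")  # hypothetical source table

decoded = df.select(
    from_protobuf(
        df.payload,                                   # binary protobuf column (placeholder name)
        "MyMessage",                                  # message name from the descriptor file
        descFilePath="/dbfs/schemas/events.desc",     # hypothetical descriptor path
        options={"recursive.fields.max.depth": "2"},  # unroll recursive fields to depth 2 instead of failing
    ).alias("event")
)
```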
drag7ter
by New Contributor II
  • 213 Views
  • 2 replies
  • 0 kudos

Resolved! Not able to set run_as service_principal_name

I'm trying to run:

databricks bundle deploy -t prod --profile PROD_Service_Principal

My bundle looks like:

bundle:
  name: myproject

include:
  - resources/jobs/bundles/*.yml

targets:
  # The 'dev' target, for development purposes. This target is the de...

Latest Reply
drag7ter
New Contributor II
  • 0 kudos

In my case I replaced the alias PROD_Service_Principal with the id c250831b-5a2a-4461-a855-83b9102f797e and it works. Not intuitive; this is probably a bug in the CLI or in bundles:

service_principal_name: c250831b-5a2a-4461-a855-83b9102f797e

1 More Reply
deng_dev
by New Contributor III
  • 24 Views
  • 0 replies
  • 0 kudos

Cached Views in MERGE INTO operation

Hi everyone! I want to use in-memory cached views in a MERGE INTO operation, but I am not entirely sure whether the exact cached in-memory view is used in this operation or not. So, suppose I have a table named table_1 and a cached view named cached_view_1...
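A minimal sketch of the setup being asked about, reusing the table_1 / cached_view_1 names from the post (the source table and join key are assumptions, and table_1 is assumed to be a Delta table). Whether the cached result is actually reused can be checked with EXPLAIN or in the Spark UI:

```python
src = spark.table("updates")                  # hypothetical source of new rows
src.createOrReplaceTempView("cached_view_1")
spark.sql("CACHE TABLE cached_view_1")        # pin the view's result in memory (eager by default)

spark.sql("""
    MERGE INTO table_1 AS t
    USING cached_view_1 AS s
    ON t.id = s.id                            -- assumed join key
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```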

madrhr
by Visitor
  • 41 Views
  • 0 replies
  • 0 kudos

SparkContext lost when running %sh script.py

I need to execute a .py file in Databricks from a notebook (with arguments, which for simplicity I exclude here). For this I am using:

%sh script.py

script.py:

from pyspark import SparkContext

def main():
    sc = SparkContext.getOrCreate()
    print(sc...
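For context: %sh runs the script in a separate OS process on the driver, which has no connection to the notebook's Spark gateway, so SparkContext.getOrCreate() there cannot return the cluster's context. One workaround is to execute the file inside the notebook's own Python process; a minimal sketch, with a hypothetical path and arguments:

```python
import sys

sys.argv = ["script.py", "--date", "2024-04-01"]   # hypothetical arguments for the script
with open("/Workspace/Repos/me/script.py") as f:   # hypothetical path to the .py file
    exec(f.read())                                 # runs in the driver process, so
                                                   # getOrCreate() sees the live SparkContext
```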

Data Engineering
%sh
.py
bash shell
SparkContext
SparkShell
EhsanSaba
by New Contributor
  • 32 Views
  • 0 replies
  • 0 kudos

RocksDB results in empty stream/stream joins dataframe

Since we enabled RocksDB in our spark.conf, stream-to-stream joins/unions result in an empty DataFrame. Does anyone else have the same experience? It is on AWS.

spark.conf.set("spark.sql.streaming.stateStore.providerClass", "com.databricks.sql.streaming...

RakeshRakesh_De
by New Contributor III
  • 370 Views
  • 7 replies
  • 0 kudos

Spark CSV file read option to read blank/empty values from a file as empty values instead of null

Hi, I am trying to read a file which has some blank values in a column, and we know Spark converts blank values to null during reading. How do I read a blank/empty value as an empty value? Tried DBR 13.2 and 14.3. I have tried all possible ways but it's not w...

[attached screenshot: RakeshRakesh_De_0-1713431921922.png]
Data Engineering
csv
EmptyValue
FileRead
Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

OK, after some tests: the trick is to surround text in your CSV with quotes. That way Spark can actually tell the difference between a missing value and an empty value. Missing values are null and can only be converted to something else implicitel...
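A minimal sketch of the quoting behaviour this reply describes (the path and data are made up): a quoted "" field survives as an empty string, while an unquoted missing field comes back as null.

```python
# Write a tiny CSV where row 1 has a quoted empty value and row 2 has a
# missing value ("/tmp/quoting_demo.csv" is a hypothetical path).
dbutils.fs.put("/tmp/quoting_demo.csv", 'id,name\n1,""\n2,\n', True)

df = spark.read.option("header", "true").csv("/tmp/quoting_demo.csv")
df.show()
# Expected, per the reply above: row 1 -> name is an empty string
# (it was quoted); row 2 -> name is null (it was missing).
```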

6 More Replies
ajbush
by New Contributor III
  • 9172 Views
  • 6 replies
  • 2 kudos

Connecting to Snowflake using an SSO user from Azure Databricks

Hi all, I'm just reaching out to see if anyone has information or can point me in a useful direction. I need to connect to Snowflake from Azure Databricks using the connector: https://learn.microsoft.com/en-us/azure/databricks/external-data/snowflakeT...

Latest Reply
aagarwal
Visitor
  • 2 kudos

@ludgervisser We are trying to connect to Snowflake via an Azure AD user through the externalbrowser method, but the browser window doesn't open. Could you please share example code of how you managed to achieve this, or point to some documentation? @BobGeo...
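Not authoritative, but a minimal sketch of how the externalbrowser option is usually wired into the Snowflake Spark connector (account, user, database, and table names are placeholders). Note that this flow has to open a browser on the machine executing the code, which a cluster driver cannot do; that is consistent with the "browser window doesn't open" symptom, and key-pair or OAuth token auth is the usual cluster-side alternative.

```python
options = {
    "sfUrl": "myaccount.snowflakecomputing.com",  # hypothetical account URL
    "sfUser": "user@company.com",                 # hypothetical Azure AD user
    "sfAuthenticator": "externalbrowser",         # pops an SSO browser window locally
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
}

df = (spark.read
      .format("snowflake")
      .options(**options)
      .option("dbtable", "MY_TABLE")              # hypothetical table
      .load())
```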

5 More Replies
Brad
by Contributor
  • 73 Views
  • 2 replies
  • 0 kudos

Pushdown in Postgres

Hi team, in Databricks I need to query a Postgres source like:

select * from postgres_tbl where id in (select id from df)

where df comes from a Hive table. If I use the JDBC driver and do:

query = '(select * from postgres_tbl) as t'
src_df = spark.read.format(...
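One possible approach, sketched under the assumption that the id list is small enough to inline into SQL (URL, credentials, and table names are placeholders): collect the ids from the Hive-backed DataFrame and embed them in the JDBC query option, so Postgres filters the rows before anything is transferred.

```python
# Collect the ids driver-side; only sensible when the distinct id set is small.
ids = [r["id"] for r in spark.table("hive_table").select("id").distinct().collect()]
id_list = ",".join(str(i) for i in ids)

src_df = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://host:5432/db")  # hypothetical URL
          .option("query", f"select * from postgres_tbl where id in ({id_list})")
          .option("user", "me")                             # placeholder credentials
          .option("password", "secret")
          .load())
```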

Latest Reply
Brad
Contributor
  • 0 kudos

Thanks for the response. I cannot do that, as we are loading incrementally from the source very frequently. We cannot read the full data each time.

1 More Reply
alpine
by New Contributor
  • 462 Views
  • 2 replies
  • 0 kudos

Deploy lock force acquired error when deploying an asset bundle using the Databricks CLI

I'm running this command in a DevOps pipeline:

databricks bundle deploy -t dev

I receive this error and have tried using --force-lock, but it still doesn't work:

Error: deploy lock force acquired by name@company.com at 2024-02-20 16:38:34.99794209 +0000 ...

Latest Reply
Li_Li
Visitor
  • 0 kudos

Hi, I had the same error. Could I ask if this --force-lock has anything to do with the Terraform lock, or is it a separate lock only for bundles? Where can I find documentation about this flag? Thank you in advance.

1 More Reply
VovaVili
by New Contributor
  • 436 Views
  • 2 replies
  • 0 kudos

Databricks Runtime 13.3 - can I use Databricks Connect without Unity Catalog?

Hello all, the official documentation for Databricks Connect states that, for Databricks Runtime versions 13.0 and above, my cluster needs to have Unity Catalog enabled for me to use Databricks Connect and use a Databricks cluster through an IDE like...

Latest Reply
mohaimen_syed
New Contributor III
  • 0 kudos

Hi, I'm currently using Databricks Connect without Unity Catalog in VS Code. Although I have connected Unity Catalog separately on multiple occasions, I don't think it's required. Here is the doc: https://docs.databricks.com/en/dev-tools/databrick...
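For anyone landing here, a minimal sketch of the Databricks Connect (v2) entry point that doc describes, with placeholder workspace values:

```python
from databricks.connect import DatabricksSession

spark = (DatabricksSession.builder
         .remote(
             host="https://adb-1234567890123456.7.azuredatabricks.net",  # hypothetical workspace URL
             token="dapiXXXXXXXX",                                       # hypothetical personal access token
             cluster_id="0401-123456-abcdef12",                          # hypothetical cluster id
         )
         .getOrCreate())

spark.range(5).show()  # quick smoke test of the remote session
```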

1 More Reply
AnaMocanu
by New Contributor
  • 291 Views
  • 2 replies
  • 0 kudos

Best way to parse Google Analytics data in Databricks notebook

I managed to extract the Google Analytics data via Lakehouse Federation and the BigQuery connection, but the events table values are in a weird JSON format: {"v":[{"v":{"f":[{"v":"ga_session_number"},{"v":{"f":[{"v":null},{"v":"2"},{"v":null},{"v":null...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@AnaMocanu I was using this function, with a few modifications on my end: https://gist.github.com/shreyasms17/96f74e45d862f8f1dce0532442cc95b2 Maybe this will be helpful for you.

1 More Reply
Clampazzo
by New Contributor
  • 68 Views
  • 1 reply
  • 0 kudos

Can I see queries sent to All Purpose Compute from Power BI?

I am brand new to Databricks and am working on connecting a Power BI semantic model to our Databricks instance. I have successfully connected it to all-purpose compute, but was wondering if there was a way I could see the queries that Power BI is ...

Data Engineering
Power BI
sql
Latest Reply
Gaut23
New Contributor II
  • 0 kudos

For all-purpose compute, your best bet would be to use the system tables, specifically the system.access.audit table: https://docs.databricks.com/en/administration-guide/system-tables/index.html
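A minimal sketch of querying that table from a notebook; the service_name filter is an assumption about how SQL activity is labelled, so check the linked docs for the exact event names in your workspace:

```python
recent = spark.sql("""
    SELECT event_time, user_identity.email, service_name, action_name
    FROM system.access.audit
    WHERE event_date >= current_date() - INTERVAL 7 DAYS
      AND service_name = 'databrickssql'   -- assumed label for SQL query events
    ORDER BY event_time DESC
    LIMIT 100
""")
recent.show(truncate=False)
```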
