Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

cs_de
by New Contributor II
  • 1770 Views
  • 4 replies
  • 3 kudos

How do I deploy or run one job if I have multiple jobs in a Databricks Asset Bundle?

How do I deploy or run a single job if I have two or more jobs defined in my asset bundle? "$ databricks bundle deploy job1" does not work, and I do not see a flag to identify which job to run.

Latest Reply
mark_ott
Databricks Employee
  • 3 kudos

I haven't done it with multiple jobs, but I believe you name each job under resources with its own key; then, when you deploy, you just call that job key.
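If it helps, a minimal sketch of that layout (all names here are invented):

```yaml
# Hypothetical databricks.yml fragment: two jobs, each keyed under resources.jobs
resources:
  jobs:
    job1:
      name: job-one
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/job_one.py
    job2:
      name: job-two
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/job_two.py
```

"databricks bundle deploy" deploys the bundle as a whole; to trigger just one job, pass its resource key to "databricks bundle run", e.g. "databricks bundle run job1".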

3 More Replies
Chris_sh
by New Contributor II
  • 4499 Views
  • 2 replies
  • 1 kudos

[STREAMING_TABLE_OPERATION_NOT_ALLOWED.REQUIRES_SHARED_COMPUTE]

Currently trying to refresh a Delta Live Table using a Full Refresh but an error keeps coming up saying that we have to use a shared cluster or a SQL warehouse. I've tried both a shared cluster and a SQL warehouse and the same error keeps coming up. ...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

You are not using "No Isolation Shared" mode, right?  Also, can you share the chunk of code that is causing the failure? Thanks, Louis.

1 More Replies
guest0
by New Contributor III
  • 2063 Views
  • 6 replies
  • 3 kudos

Spark UI Simulator Not Accessible

Hello, The Spark UI Simulator has not been accessible for the last few days. I was able to refer to it last week, at https://www.databricks.training/spark-ui-simulator/index.html. I already have access to Partner Academy (if that is relevant).  <Error...

Data Engineering
simulator
spark-ui
Latest Reply
guest0
New Contributor III
  • 3 kudos

Just a short update: the request I raised was closed on the grounds that there is no active support contract for the org (based on the email I used), so they could not look into this. Perhaps someone else could try raising a request using the link above.

5 More Replies
Vasu_Kumar_T
by New Contributor II
  • 1282 Views
  • 3 replies
  • 1 kudos

Data Migration using Bladebridge

Hi, We are planning to migrate from Teradata to Databricks using Bladebridge. Going through various portals, I am not able to determine which component facilitates data movement between Teradata and Databricks. Please clarify the end-to-end tool and acti...

Latest Reply
RiyazAliM
Honored Contributor
  • 1 kudos

I'm not aware of Bladebridge having a data movement tool, and I don't see anything advertised by them either. Let me know if you find anything on this.

2 More Replies
yashojha1995
by New Contributor
  • 878 Views
  • 1 reply
  • 0 kudos

Error while running update statement using delta lake linked service through ADF

Hi All, I am getting the below error while running an update query in a lookup activity using the Delta Lake linked service: ErrorCode=AzureDatabricksCommandError, Hit an error when running the command in Azure Databricks. Error details: <span class='a...

Latest Reply
RiyazAliM
Honored Contributor
  • 0 kudos

Hi @yashojha1995, "EOL while scanning string literal" hints that there might be a syntax error in the update query. Could you share your update query here, and any other info such as how you are creating the linked service to your Delta Lake? Does it mean ...
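For anyone hitting this, a small illustration of that error class (table and column names are invented): the message is Python's complaint about a quoted string broken across lines, which can happen when a multi-line UPDATE is pasted into a single-quoted string.

```python
# Broken: a plain quoted string cannot span lines; Python raises
# "EOL while scanning string literal" (a SyntaxError) on code like this:
# query = "UPDATE my_table
#          SET status = 'done' WHERE id = 1"

# Working: triple-quoted strings may span multiple lines.
query = """
UPDATE my_table
SET status = 'done'
WHERE id = 1
""".strip()
print(query.splitlines()[0])  # UPDATE my_table
```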

Dharinip
by Contributor
  • 6266 Views
  • 5 replies
  • 3 kudos

Resolved! How to decide on creating views vs Tables in Gold layer?

We have the following use case: we receive raw data from an application, and it is ingested into the Iron layer. The raw data is in JSON format. The Bronze layer is the first level of transformation; the flattening of the JSON file happens ...

Latest Reply
artus2050189155
New Contributor II
  • 3 kudos

The whole medallion architecture is unnecessarily complex: Bronze, Silver, Gold. In some places I have seen people do RAW, Trusted RAW, Silver, Trusted Silver, Gold.

4 More Replies
manish_tanwar
by New Contributor III
  • 3120 Views
  • 5 replies
  • 4 kudos

Databricks streamlit app for data ingestion in a table

I am using this code in a notebook to save a data row to a table, and it is working perfectly. Now I am using the same function to save data from a chatbot in a Streamlit chatbot application on Databricks, and I am getting an error: ERROR ##############...

Latest Reply
pradeepvatsvk
New Contributor III
  • 4 kudos

Hi @manish_tanwar, how can we work with Streamlit apps in Databricks? I have a use case where I want to ingest data from different CSV files into Delta tables.
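Not an official recipe, just one way this could be sketched. The table name and the row-by-row INSERT approach are illustrative assumptions; the databricks-sql-connector accepts :name parameter markers with a dict of values, and for large files a Volume upload plus COPY INTO would be more typical.

```python
# Turn CSV text (e.g. from a Streamlit file_uploader) into parameterized
# INSERT statements that a Databricks SQL warehouse cursor could execute.
import csv
import io

def csv_to_inserts(csv_text: str, table: str):
    """Yield (statement, params) pairs, one per CSV data row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    cols = reader.fieldnames or []
    placeholders = ", ".join(f":{c}" for c in cols)
    stmt = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    for row in reader:
        yield stmt, dict(row)

# With the connector (hypothetical connection details):
#   with sql.connect(...) as conn, conn.cursor() as cur:
#       for stmt, params in csv_to_inserts(text, "main.demo.people"):
#           cur.execute(stmt, params)
```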

4 More Replies
harman
by New Contributor II
  • 1522 Views
  • 3 replies
  • 0 kudos

Serverless Compute

Hi Team, We are using Azure Databricks serverless compute to execute workflows and notebooks. My question is: does serverless compute support Maven library installations? I appreciate any insights or suggestions you might have. Thanks in advance for yo...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

So, it appears that there is conflicting documentation on this topic. I checked our internal documentation, and what I found is that you CANNOT install JDBC or ODBC drivers on serverless. See the limitations here: https://docs.databricks.com/aws...

2 More Replies
annagriv
by New Contributor II
  • 5599 Views
  • 6 replies
  • 5 kudos

Resolved! How to get git commit ID of the repository the script runs on?

I have a script in a repository on Databricks. The script should log the current git commit ID of the repository. How can that be implemented? I tried various commands, for example: result = subprocess.run(['git', 'rev-parse', 'HEAD'], stdout=subproce...

Latest Reply
bestekov
New Contributor II
  • 5 kudos

Here is a version of @vr 's solution that can be run from any folder within the repo. It uses a regex to extract the root from the path, in the form /Repos/<username>/<some-repo>: import os import re from databricks.sdk import WorkspaceClient w = Worksp...
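To flesh out the truncated snippet's idea without guessing at its exact code, here is a hedged sketch of the path-parsing step (the helper names, the regex, and the /Workspace prefix are my assumptions, not @bestekov's exact code):

```python
# Sketch: extract the repo root (/Repos/<username>/<repo>) from a workspace
# path with a regex, then ask git for the HEAD commit of that checkout.
import re
import subprocess

def repo_root_from_path(notebook_path: str) -> str:
    """Return /Repos/<user>/<repo> for any path inside a repo checkout."""
    m = re.match(r"^(/Repos/[^/]+/[^/]+)", notebook_path)
    if m is None:
        raise ValueError(f"not inside /Repos: {notebook_path}")
    return m.group(1)

def head_commit(repo_root: str) -> str:
    """Run git rev-parse HEAD against the checkout's filesystem location."""
    out = subprocess.run(
        ["git", "-C", "/Workspace" + repo_root, "rev-parse", "HEAD"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()
```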

5 More Replies
Vasu_Kumar_T
by New Contributor II
  • 1043 Views
  • 3 replies
  • 0 kudos

Default Code generated by Bladebridge converter

Hello all, 1. What is the default code generated by the Bladebridge converter? For example, when we migrate Teradata or Oracle to Databricks using Bladebridge, what is the default code base? 2. If the generated code is PySpark, do I have any control over the generate...

Latest Reply
RiyazAliM
Honored Contributor
  • 0 kudos

Hello @Vasu_Kumar_T - We've used Bladebridge to convert from Redshift to Databricks. Bladebridge can definitely convert to Spark SQL; I'm not sure about Scala Spark, though.

2 More Replies
AsgerLarsen
by New Contributor III
  • 2087 Views
  • 7 replies
  • 0 kudos

Using yml variables as table owner through SQL

I'm trying to change the ownership of a table in Unity Catalog created through a SQL script. I want to do this through code. I'm using a standard Databricks bundle setup, which uses three workspaces: dev, test, and prod. I have created a variable in ...
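For reference, a hedged sketch of how the variable side of this can look (all names are invented). Note that bundle variables are interpolated in the bundle YAML itself, not inside a .sql file's text, so the value typically has to reach the script as a task parameter:

```yaml
# Hypothetical databricks.yml fragment
variables:
  table_owner:
    description: Principal that should own the gold tables
    default: data-eng-dev

targets:
  prod:
    variables:
      table_owner: data-eng-prod
```

A job task can then pass ${var.table_owner} as a parameter, and the SQL script can run something like ALTER TABLE my_catalog.my_schema.my_table SET OWNER TO <that parameter>.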

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I guess that is a safe bet. Good luck!

6 More Replies
Aatma
by New Contributor
  • 5286 Views
  • 3 replies
  • 1 kudos

Resolved! DABs require library dependencies from a private GitHub repository.

I'm developing a Python wheel file using DABs which requires library dependencies from a private GitHub repository. Please help me understand how to set up the git user and token in the resource.yml file, and how to authenticate the GitHub package. pip install...

Latest Reply
sandy311
New Contributor III
  • 1 kudos

Could you please give a detailed example? How do we define the env variables? BUNDLE_VAR?
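Not the original poster, but here is my understanding, sketched with invented names: bundle variables can be supplied from the environment as BUNDLE_VAR_<name> at deploy time, and the token variable can then be interpolated into the pip requirement:

```yaml
# Hypothetical databricks.yml fragment
variables:
  git_token:
    description: GitHub token used to install the private package

# a task's pip dependency could then reference it, e.g.:
#   git+https://${var.git_token}@github.com/my-org/my-private-lib.git
```

Before deploying: export BUNDLE_VAR_git_token=<token>, then databricks bundle deploy. A Databricks secret scope is generally a safer home for the token than a raw environment variable.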

2 More Replies
minhhung0507
by Valued Contributor
  • 779 Views
  • 1 reply
  • 0 kudos

Handling Hanging Pipelines in Real-Time Environments: Leveraging Databricks’ Idle Event Monitoring

Hi everyone, I’m running multiple real-time pipelines on Databricks using a single job that submits them via a thread pool. While most pipelines are running smoothly, I’ve noticed that a few of them occasionally get “stuck” or hang for several hours w...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

May I ask why you use thread pools? With jobs you can define multiple tasks, which do the same thing. I'm asking because thread pools and Spark resource management can interfere with each other.
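To make that concrete, a hedged sketch of the tasks-instead-of-thread-pool layout (names invented): tasks with no depends_on run in parallel, and a per-task timeout_seconds can surface exactly the kind of hang described above.

```yaml
# Hypothetical job fragment: each pipeline is its own task, so the job
# scheduler, not a thread pool, manages concurrency and failures.
resources:
  jobs:
    realtime_pipelines:
      name: realtime-pipelines
      tasks:
        - task_key: pipeline_a
          timeout_seconds: 14400   # fail the task if it hangs past 4 hours
          notebook_task:
            notebook_path: ./notebooks/pipeline_a.py
        - task_key: pipeline_b
          timeout_seconds: 14400
          notebook_task:
            notebook_path: ./notebooks/pipeline_b.py
```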

rodrigocms
by New Contributor
  • 3398 Views
  • 1 reply
  • 0 kudos

Get information from Power BI via XMLA

Hello everyone, I am trying to get information from Power BI semantic models via the XMLA endpoint, using PySpark in Databricks. Can someone help me with that? Thanks.

Latest Reply
CacheMeOutside
New Contributor II
  • 0 kudos

I would like to see this too. 

PunithRaj
by New Contributor
  • 6835 Views
  • 2 replies
  • 2 kudos

How to read a PDF file from Azure Datalake blob storage to Databricks

I have a scenario where I need to read a PDF file from Azure Data Lake blob storage into Databricks, where the connection is made through AD access. Generating a SAS token has been restricted in our environment due to security issues. The below script ca...
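One SAS-free alternative often suggested for this situation is OAuth with an Azure service principal. A hedged sketch follows: the helper function is mine and every value is a placeholder, but the keys are the standard Hadoop ABFS OAuth settings.

```python
# Build the Spark conf entries for service-principal (OAuth) access to
# ADLS Gen2, instead of a SAS token. Apply them with spark.conf.set.
def adls_oauth_conf(account: str, client_id: str,
                    client_secret: str, tenant_id: str) -> dict:
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook (values would come from a secret scope, not literals):
#   for k, v in adls_oauth_conf("mystore", cid, secret, tid).items():
#       spark.conf.set(k, v)
```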

Latest Reply
Mykola_Melnyk
New Contributor III
  • 2 kudos

@PunithRaj You can try the PDF DataSource for Apache Spark to read PDF files directly into a DataFrame, so you will get the extracted text and each rendered page as an image in the output. More details here: https://stabrise.com/spark-pdf/ df = spark.read.forma...

1 More Replies
