Data Engineering
Forum Posts

Kingston
by New Contributor II
  • 225 Views
  • 3 replies
  • 0 kudos

Unable to overwrite table to Azure SQL DB

Hi, I have a requirement to read a table from Azure SQL DB, update the table in Azure Databricks with transformations, and overwrite the updated table back to Azure SQL DB, but due to the lazy evaluation of PySpark I'm unable to overwrite the table in Azure SQL ...

Latest Reply
YuliyanBogdanov
New Contributor II
  • 0 kudos

Hi @Kingston, make sure you have the proper permissions on the SQL server for the user you authenticate with through JDBC (i.e., database reader / database writer). Then your approach can go in two directions: push the data from Databrick...
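
A minimal PySpark sketch of the usual workaround for the lazy-evaluation problem, assuming JDBC access to Azure SQL (the URL, credentials, table name, and sample transformation are placeholders, not the poster's actual values):

    jdbc_url = "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>"
    props = {
        "user": "<user>",
        "password": "<password>",
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

    df = spark.read.jdbc(url=jdbc_url, table="dbo.my_table", properties=props)
    transformed = df.filter("amount > 0")  # stand-in for the real transformations

    # Materialize the result before overwriting; otherwise mode="overwrite"
    # truncates the source table while the lazy plan still needs to read it.
    transformed.cache()
    transformed.count()

    transformed.write.jdbc(url=jdbc_url, table="dbo.my_table",
                           mode="overwrite", properties=props)

If the cached data could be evicted, writing to a staging table first and swapping it in on the SQL Server side is the safer variant.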

2 More Replies
Vsleg
by Contributor
  • 427 Views
  • 4 replies
  • 0 kudos

Enabling enableChangeDataFeed on Streaming Table created in DLT

Hello, can I enable Change Data Feed on streaming tables? How should I do this? I couldn't find this in the existing documentation: https://learn.microsoft.com/en-us/azure/databricks/delta/delta-change-data-feed
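
For reference, a hedged sketch of how Change Data Feed is usually requested on a DLT table, via Delta table properties on the @dlt.table decorator (the table and source names are placeholders):

    import dlt

    @dlt.table(
        name="my_streaming_table",  # placeholder name
        table_properties={"delta.enableChangeDataFeed": "true"},
    )
    def my_streaming_table():
        return spark.readStream.table("source_table")  # placeholder source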

Latest Reply
Vsleg
Contributor
  • 0 kudos

@Kaniz ?

3 More Replies
hossein_kolahdo
by New Contributor II
  • 569 Views
  • 3 replies
  • 0 kudos

Accessing data from a legacy hive metastore workspace on a new Unity Catalog workspace

Hello, for the purposes of testing I'm interested in creating a new workspace with Unity Catalog enabled, and from there I'd like to access (external, S3) tables on an existing legacy Hive metastore workspace (not UC enabled). The goal is for both wo...

Data Engineering
hivemetastore
unitycatalog
Workspaces
Latest Reply
MichTalebzadeh
New Contributor III
  • 0 kudos

Your aim is to access external S3 tables from a Unity Catalog workspace without data duplication, while keeping data updates synchronized. Configure external location permissions. This ensures that both your Unity Catalog and Hive metastore workspaces h...
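
A hedged sketch of that setup from the UC-enabled workspace, assuming a storage credential for the bucket already exists (the location, bucket, credential, and group names are placeholders):

    spark.sql("""
        CREATE EXTERNAL LOCATION IF NOT EXISTS legacy_s3_location
        URL 's3://my-legacy-bucket/tables'
        WITH (STORAGE CREDENTIAL my_storage_credential)
    """)

    # Grant file access so both workspaces' principals stay in sync.
    spark.sql("""
        GRANT READ FILES, WRITE FILES
        ON EXTERNAL LOCATION legacy_s3_location
        TO `data_engineers`
    """)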

2 More Replies
sgupta
by New Contributor II
  • 385 Views
  • 3 replies
  • 0 kudos

Select from a dynamic table name returned by databricks function

I have a Databricks function that returns a table_name:

CREATE OR REPLACE FUNCTION test_func()
  RETURNS STRING
  READS SQL DATA
  RETURN 'table_name'

I want to select from the table that is returned by this function. How can I make it work in SQL, some...

Latest Reply
sgupta
New Contributor II
  • 0 kudos

I looked at this post: https://stackoverflow.com/questions/77475436/in-databricks-workbook-using-spark-sql-how-to-pass-parameters-thru-sql-udf-func What I want is to replace the static table name with the table name passed as a parameter (param_table_nam...
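
One hedged workaround in PySpark: resolve the function's return value first, then interpolate it into the query (test_func comes from the post; everything else is a placeholder, and the table name must be trusted input):

    # Resolve the table name returned by the SQL UDF, then query it.
    table_name = spark.sql("SELECT test_func()").first()[0]

    # String interpolation of an identifier: only safe for trusted names.
    df = spark.sql(f"SELECT * FROM {table_name}")
    df.show()

Recent runtimes also offer the IDENTIFIER() clause for parameterizing table names in pure SQL.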

2 More Replies
Meshynix
by New Contributor III
  • 1288 Views
  • 5 replies
  • 0 kudos

Resolved! Not able to create external table in a schema under a Catalog.

Problem Statement: Cluster 1 (Shared Cluster) is not able to read the file location at "dbfs:/mnt/landingzone/landingzonecontainer/Inbound/", and hence we are not able to create an external table in a schema inside the Enterprise Catalog. Cluster 2 (No Isola...

Latest Reply
YuliyanBogdanov
New Contributor II
  • 0 kudos

Hi @Meshynix, can you provide the code snippet you execute to create your tables? That would give us better insight into both use cases. Also, can you provide the error that is returned in the first use case? This would help a lot.

4 More Replies
Nathant93
by New Contributor II
  • 383 Views
  • 1 reply
  • 0 kudos

Autoloader exclude one directory

Hi, I have a bunch of CSV files in directories within an Azure blob container and I am using Auto Loader to ingest them into a raw (bronze) table; all CSVs apart from one have the same schema. Is there a way to get Auto Loader to ignore the directory wi...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Nathant93, you can use the pathGlobFilter option to filter files based on a glob pattern. For instance, if you want to skip files with filenames like A1.csv, A2.csv, …, A9.csv, you can specify the filter as follows: df = spark.read.load("/f...
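
A hedged Auto Loader sketch of the same idea; the container path, directory names, and schema location are placeholders, and the glob in the load path is what keeps the divergent directory out:

    df = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "csv")
          .option("cloudFiles.schemaLocation", "/mnt/checkpoints/raw_schema")
          # Enumerate only the directories to ingest; the one with the
          # different schema is simply left out of the glob.
          .load("abfss://raw@myaccount.dfs.core.windows.net/landing/{dir_a,dir_b}/*.csv"))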

Deepa710
by New Contributor II
  • 427 Views
  • 1 reply
  • 0 kudos

DBAcademy DLT cluster policy missing, and no permission to run Workspace-Setup

I am taking the Data Engineering Associate course on the Databricks Partner Academy. To create a pipeline for DLT, the cluster policy DBAcademy DLT is needed. I read in previous community forum threads that "/Users/<YOUR USER NAME>/Data Engineering with Dat...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Deepa710, Thank you for posting your concern on Community! To expedite your request, please list your concerns on our ticketing portal. Our support staff would be able to act faster on the resolution (our standard resolution time is 24-48 hours).

Mkk1
by New Contributor
  • 305 Views
  • 1 reply
  • 0 kudos

Joining tables across DLT pipelines

How can I join a silver table (S1) from a DLT pipeline (D1) to another silver table (S2) from a different DLT pipeline (D2)? #DLT #DeltaLiveTables

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Mkk1, to join a silver table from one Delta Live Tables (DLT) pipeline to another silver table from a different DLT pipeline, you can follow these steps: Read the silver tables: in your DLT pipeline code, read the silver tables you want to jo...
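
A hedged sketch of that pattern, assuming both pipelines publish to Unity Catalog (the catalog, schema, table names, and join key are placeholders):

    import dlt

    @dlt.table(name="gold_joined")
    def gold_joined():
        # Tables published by the other pipeline are read by their fully
        # qualified names, like any other table.
        s1 = spark.read.table("main.pipeline_d1.silver_s1")
        s2 = spark.read.table("main.pipeline_d2.silver_s2")
        return s1.join(s2, on="id", how="inner")  # "id" is an assumed key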

raghu2
by New Contributor III
  • 318 Views
  • 1 reply
  • 0 kudos

DLT table from a text source

I am trying to create a Delta Live Table by reading a text source. I get an error message that states that both source and target should be in Delta format. Am I missing something?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @raghu2, the error message you're encountering might indicate that no tables were discovered in your specified source code. Verify that your source code includes proper table definitions. Consider using either a notebook with pure SQL or pure Pyth...
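
For context, a hedged sketch of a DLT table over a text source: only the target has to be Delta; the source can be plain text read with Auto Loader (the path is a placeholder):

    import dlt

    @dlt.table(name="raw_text")
    def raw_text():
        return (spark.readStream
                .format("cloudFiles")
                .option("cloudFiles.format", "text")
                .load("/mnt/landing/text/"))  # placeholder path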

healthcareds
by New Contributor
  • 640 Views
  • 1 reply
  • 0 kudos

CLI: Cannot Configure Additional Profiles in web terminal

Hello, I'm having a hard time adding secrets since the change from the legacy CLI. Executing "databricks configure" as referenced in the personal access token section of this document does nothing. I have verified that the CLI is installed as v0.216.0.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @healthcareds, I understand that you’re facing difficulties with adding secrets to Databricks since the transition from the legacy CLI. Let’s troubleshoot this together. Databricks personal access tokens are now one of the most well-supported ...
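
For reference, a hedged sketch of what a working multi-profile setup looks like in ~/.databrickscfg (hosts and tokens are placeholders); the new CLI then selects a profile with its global -p/--profile flag, e.g. databricks secrets list-scopes -p dev:

    [DEFAULT]
    host  = https://adb-1111111111111111.1.azuredatabricks.net
    token = <personal-access-token>

    [dev]
    host  = https://adb-2222222222222222.2.azuredatabricks.net
    token = <personal-access-token>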

toolhater
by New Contributor II
  • 266 Views
  • 1 reply
  • 0 kudos

Self Bootstrap Failure Community Edition

Trying to start a new compute this morning, I get the "Self Bootstrap Error." I saw some people with a similar error, but their fixes involved DNS settings. I haven't made any DNS changes, and it was working fine without any problem last night. I real...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @toolhater, as you mentioned, some users have resolved this issue by adjusting DNS settings. Ensure that your DNS servers are correctly configured and reachable. Verify that the DNS servers set on the Databricks workspace VNet are accessible. Some...

MartinIsti
by New Contributor III
  • 552 Views
  • 3 replies
  • 1 kudos

Resolved! DLT - runtime parameterisation of execution

I have started to use DLT in a prototype framework and I now face the below challenge, for which any help would be appreciated. First, let me give a brief context: I have metadata sitting in a .json file that I read as the first task and put it into a lo...

Data Engineering
configuration
Delta Live Table
job
parameters
workflow
Latest Reply
data-engineer-d
New Contributor III
  • 1 kudos

@Kaniz Can you please provide a reference for the REST API approach? I do not see it available in the docs. TIA
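
Not authoritative, but the REST API approach usually means editing the pipeline spec's configuration map through the Pipelines API, then reading the values back inside the pipeline with spark.conf.get. A sketch with placeholder host, token, and pipeline id:

    import requests

    host = "https://<workspace>.azuredatabricks.net"
    headers = {"Authorization": "Bearer <token>"}
    pipeline_id = "<pipeline-id>"

    # Fetch the current spec, update its configuration, and write it back.
    spec = requests.get(f"{host}/api/2.0/pipelines/{pipeline_id}",
                        headers=headers).json()["spec"]
    spec["configuration"] = {**spec.get("configuration", {}),
                             "mypipeline.run_date": "2024-03-31"}
    requests.put(f"{host}/api/2.0/pipelines/{pipeline_id}",
                 headers=headers, json=spec).raise_for_status()

    # Inside the pipeline: run_date = spark.conf.get("mypipeline.run_date")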

2 More Replies
pSdatabricks
by New Contributor II
  • 1987 Views
  • 3 replies
  • 0 kudos

Azure Databricks Monitoring & Alerting (Data Observability) Tools / Frameworks for Enterprise

I am trying to evaluate options for monitoring and alerting tools like New Relic, Datadog, and Grafana with Databricks on Azure. None of them offered support when we reached out. I would like to hear from the Databricks team on the recommended tool / framework ...

Latest Reply
Sruthivika
New Contributor II
  • 0 kudos

I'd recommend this new tool we've been trying out. It's really helpful for monitoring and provides good insights on how Azure Databricks clusters, pools & jobs are doing – like if they're healthy or having issues. It brings everything together, makin...

2 More Replies
FlexException
by New Contributor II
  • 1589 Views
  • 5 replies
  • 0 kudos

Dynamic Number of Tasks in Databricks Workflow

Do Databricks workflows support creating a workflow with a dynamic number of tasks? For example, let's say we have a DAG in which T1 fans out to n parallel tasks T2(1), T2(2), ..., T2(n-1), T2(n), which all feed into T3...

Latest Reply
tanyeesern
New Contributor II
  • 0 kudos

@FlexException The Databricks API supports job creation and execution (see Task Parameters and Values in Databricks Workflows | by Ryan Chynoweth | Medium). One possibility is, after running an earlier job, to process the output to create a dynamic number of tasks in s...
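
A hedged sketch of that idea: build the fan-out task list programmatically and submit it through the Jobs API 2.1 (host, token, cluster id, notebook paths, and n are placeholders):

    import requests

    n = 4
    host = "https://<workspace>.azuredatabricks.net"
    headers = {"Authorization": "Bearer <token>"}

    def task(key, path, depends_on=()):
        return {"task_key": key,
                "existing_cluster_id": "<cluster-id>",
                "notebook_task": {"notebook_path": path},
                "depends_on": [{"task_key": d} for d in depends_on]}

    tasks = [task("t1", "/Repos/etl/t1")]
    tasks += [task(f"t2_{i}", "/Repos/etl/t2", ["t1"]) for i in range(1, n + 1)]
    tasks += [task("t3", "/Repos/etl/t3", [f"t2_{i}" for i in range(1, n + 1)])]

    requests.post(f"{host}/api/2.1/jobs/create", headers=headers,
                  json={"name": "dynamic-fan-out", "tasks": tasks}
                  ).raise_for_status()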

4 More Replies
superspan
by New Contributor II
  • 287 Views
  • 2 replies
  • 0 kudos

How to access Spark UI metrics in an automated way (API)

I am doing some automated testing and would ultimately like to access per-job/stage/task metrics as shown in the UI (e.g. Spark UI -> SQL / DataFrame -> plan visualization) in an automated way (an API is ideal, but some ad-hoc metrics pipelines from loca...

Latest Reply
superspan
New Contributor II
  • 0 kudos

Thanks for the response. This enables the event logs, but the event logs seem to be empty. Would you know where I can get the Spark metrics as seen in the Spark UI?
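
One hedged option: Spark's monitoring REST API serves the same data the Spark UI renders, under /api/v1 on the driver. On Databricks the UI is proxied, so the base URL below is a placeholder for however you reach the driver:

    import requests

    base = "http://<driver-host>:4040/api/v1"

    app_id = requests.get(f"{base}/applications").json()[0]["id"]
    for stage in requests.get(f"{base}/applications/{app_id}/stages").json():
        print(stage["stageId"], stage["status"], stage["name"])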

1 More Replies