Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

harraz
by New Contributor III
  • 1333 Views
  • 2 replies
  • 0 kudos

How to set up the path to a remote notebook in Bitbucket to run as a job

I tried everything in the path and nothing is working. I keep getting this error: Run result unavailable: run failed with error message Notebook not found. Note that I already connec...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi @mohamed harraz, could you please confirm whether Files in Repos has been enabled? https://docs.databricks.com/files/workspace.html#configure-support-for-files-in-repos. You can use the command %sh pwd in a notebook inside a repo to check if Files in...

  • 0 kudos
1 More Replies
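A hedged sketch of one way to wire this up: the Jobs 2.1 API accepts a git_source block that points a notebook task at a Bitbucket repo. The host, token, repo URL, branch, cluster ID, and notebook path below are placeholders rather than values from the thread; the notebook path must be relative to the repo root and given without a file extension.

```python
# Sketch: create a job that runs a notebook directly from a Bitbucket repo
# via the Jobs 2.1 API. All identifiers below are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                # placeholder PAT

job_spec = {
    "name": "run-notebook-from-bitbucket",
    "git_source": {
        "git_url": "https://bitbucket.org/my-org/my-repo",    # placeholder repo
        "git_provider": "bitbucketCloud",
        "git_branch": "main",
    },
    "tasks": [
        {
            "task_key": "notebook_task",
            "notebook_task": {
                "notebook_path": "notebooks/etl_job",  # relative to repo root, no extension
                "source": "GIT",
            },
            "existing_cluster_id": "1234-567890-abcde123",     # placeholder cluster
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # returns the new job_id
```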
fhmessas
by New Contributor II
  • 2453 Views
  • 1 reply
  • 0 kudos

Resolved! Autoloader stream with EventBridge message

Hi all, I have a few streaming jobs running, but we have been facing an issue related to messaging. We have multiple feeds within the same root folder, i.e. logs/{accountId}/CloudWatch|CloudTrail|vpcflow/yyyy-mm-dd/logs. Hence, the SQS allows to setup o...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Fernando Messas: Yes, you can configure Autoloader to consume messages from an SQS queue using EventBridge. Here are the steps you can follow: Create an EventBridge rule to filter messages from the SQS queue based on specific criteria (such as the...

  • 0 kudos
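A minimal sketch of what the Autoloader side of such a setup can look like, assuming an SQS queue that an EventBridge rule already feeds with S3 event notifications for a single feed; the queue URL, paths, and table name are placeholders.

```python
# Sketch: Auto Loader in file-notification mode pointed at an existing SQS
# queue (e.g. one fed by an EventBridge rule filtering events for one feed
# under logs/{accountId}/...). Paths and the queue URL are placeholders.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.useNotifications", "true")
    # Reuse a queue you manage yourself instead of letting Auto Loader create one
    .option("cloudFiles.queueUrl",
            "https://sqs.us-east-1.amazonaws.com/123456789012/cloudtrail-feed")
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/cloudtrail")
    .load("s3://my-bucket/logs/")
)

(df.writeStream
   .option("checkpointLocation", "s3://my-bucket/_checkpoints/cloudtrail")
   .trigger(availableNow=True)
   .toTable("raw.cloudtrail_events"))
```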
VVill_T
by Contributor
  • 3006 Views
  • 4 replies
  • 7 kudos

How to write a Delta Live Table(dlt) pipeline output to Databricks SQL directly

Hi, I am trying to see if it is possible to set up a direct connection from a DLT pipeline to a table in Databricks SQL by configuring the Target Schema, with poc being the schema location, like "dbfs:/***/***/***/poc.db". The error message was just a...

Latest Reply
youssefmrini
Honored Contributor III
  • 7 kudos

Whenever you store a Delta table in the Hive metastore, the table becomes available in the Databricks SQL workspace (Data Explorer) under the hive_metastore catalog.

  • 7 kudos
3 More Replies
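For illustration, a minimal DLT definition, assuming the pipeline's Target setting holds a schema name such as poc rather than a dbfs:/ path; the source path and table name are placeholders.

```python
# Sketch: a minimal Delta Live Tables table. The pipeline's "Target" setting
# should be a schema name (e.g. "poc"), not a dbfs:/ path; the pipeline's
# storage location is configured separately. Paths below are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="poc_events",
    comment="Events published to the poc schema in the Hive metastore",
)
def poc_events():
    return (
        spark.read.format("json")
        .load("dbfs:/mnt/raw/events/")          # placeholder source
        .withColumn("ingested_at", F.current_timestamp())
    )
```

With that target, the published table would appear in Data Explorer as hive_metastore.poc.poc_events, matching the behaviour described in the reply.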
elgeo
by Valued Contributor II
  • 1396 Views
  • 1 reply
  • 2 kudos

Databricks - DBeaver error

Hello experts. While trying to set up the connection between DBeaver and Databricks, we receive the following error: [Databricks][DatabricksJDBCDriver](700120) Host adb-xxxxxxxxxx.azuredatabricks.net cannot be resolved through DnsResolver com.databrick...

Latest Reply
elgeo
Valued Contributor II
  • 2 kudos

We identified the problem: the hostname couldn't be resolved because of our proxy.

  • 2 kudos
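A quick way to confirm that diagnosis from the machine running DBeaver is a plain DNS lookup; a small standard-library sketch, with the workspace hostname as a placeholder.

```python
# Sketch: check whether the workspace hostname resolves locally. If this fails
# but the proxy can reach the host, the JDBC connection needs to go through
# the proxy. The hostname is a placeholder.
import socket

host = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace host
try:
    print(host, "->", socket.gethostbyname(host))
except socket.gaierror as err:
    print(f"Could not resolve {host}: {err}")
```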
User16753725182
by Contributor III
  • 1962 Views
  • 1 reply
  • 0 kudos

How to set up a private Git repository in my workspace?

Latest Reply
atulsahu
New Contributor II
  • 0 kudos

As a platform engineer, I would go to the admin console, click on "workspace settings", and start by looking into the settings below. Repos: true, so that Repos integration is possible. The next two settings are important to make the overall experi...

  • 0 kudos
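Once Repos is enabled, the repo itself can be added programmatically; a hedged sketch using the Git Credentials and Repos REST APIs, where the host, tokens, provider, repo URL, and workspace path are all placeholders.

```python
# Sketch: register a Git credential and clone a private repo into the
# workspace via the Git Credentials and Repos REST APIs. All identifiers
# below are placeholders.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
TOKEN = "dapiXXXXXXXXXXXXXXXX"                                 # placeholder PAT
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1) Store the Git provider credential (PAT for the repo host)
requests.post(
    f"{HOST}/api/2.0/git-credentials",
    headers=headers,
    json={
        "git_provider": "gitHub",                 # or bitbucketCloud, gitLab, ...
        "git_username": "my-user",                # placeholder
        "personal_access_token": "ghp_XXXX",      # placeholder
    },
).raise_for_status()

# 2) Clone the private repo into /Repos/<user>/<folder>
resp = requests.post(
    f"{HOST}/api/2.0/repos",
    headers=headers,
    json={
        "url": "https://github.com/my-org/private-repo",    # placeholder repo
        "provider": "gitHub",
        "path": "/Repos/me@example.com/private-repo",        # placeholder path
    },
)
resp.raise_for_status()
print(resp.json())
```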