Databricks Platform Discussions

Browse the Community

Activity in Databricks Platform Discussions

Colombia
by Visitor
  • 151 Views
  • 0 replies
  • 0 kudos

Use OF API from package enerbitdso 0.1.8 PYPI

Hello! I have code that uses an API supplied by the enerbitdso package (this is the repository: https://pypi.org/project/enerbitdso/). I adapted the code to Azure Databricks in Python, but although there is a connection with the API, it does ...

Ravikumashi
by Contributor
  • 137 Views
  • 3 replies
  • 0 kudos

issue with azure databricks workspace after we disable public network access

Hi All, we had Azure Databricks workspaces created through Terraform with public network access enabled, and everything was working great. Recently we disabled public network access and started to face issues. Terraform is unable to add us...

Latest Reply
Ravikumashi
Contributor
  • 0 kudos

We use the following code to create the private endpoint, and in the UI we can see the private endpoint connection status as approved: resource "azurerm_private_endpoint" "example" { name = "example-endpoint" location = azurerm_re...

2 More Replies
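For readers following along: the truncated resource in the reply above might look like the following when written out. This is a hedged sketch, not the poster's actual code; the resource names, the referenced workspace, and the `subresource_names` value are assumptions.

```hcl
# Hypothetical completion of the truncated azurerm_private_endpoint
# resource; all names and references are placeholders.
resource "azurerm_private_endpoint" "example" {
  name                = "example-endpoint"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  subnet_id           = azurerm_subnet.example.id

  private_service_connection {
    name                           = "example-psc"
    private_connection_resource_id = azurerm_databricks_workspace.example.id
    is_manual_connection           = false
    # "databricks_ui_api" covers the workspace UI/API; a separate endpoint
    # with "browser_authentication" is typically needed for web auth.
    subresource_names              = ["databricks_ui_api"]
  }
}
```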
Ramakrishnan83
by New Contributor III
  • 35 Views
  • 0 replies
  • 0 kudos

Intermittent SQL Failure on Databricks SQL Warehouse

Team, I set up a SQL Warehouse cluster to support requests from mobile devices through the REST API. I read through the documentation on the concurrent query limit, which is 10. But in my scenario I had 5 small clusters, and the query monitoring indicated the...

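Regarding the concurrent query limit mentioned above: a common client-side mitigation is to cap the number of in-flight statements at the documented per-warehouse limit. A hedged Python sketch; `run_query` here is a placeholder function, not a Databricks API call.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 10  # documented per-warehouse concurrent query limit
gate = threading.Semaphore(MAX_CONCURRENT)

def run_query(statement: str) -> str:
    """Placeholder for a REST call to the SQL warehouse; the semaphore
    keeps at most MAX_CONCURRENT statements in flight at once."""
    with gate:
        return f"ran: {statement}"

# Submit more work than the limit; the gate throttles concurrency.
with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(run_query, [f"SELECT {i}" for i in range(50)]))
print(len(results))
```

This only prevents the client from exceeding the limit; it does not explain intermittent failures below it.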
Anske
by New Contributor
  • 381 Views
  • 1 reply
  • 0 kudos

One-time backfill for DLT streaming table before apply_changes

Hi, absolute Databricks noob here, but I'm trying to set up a DLT pipeline that processes CDC records from an external SQL Server instance to create a mirrored table in my Databricks delta lakehouse. For this, I need to do some initial one-time backfi...

Data Engineering
Delta Live Tables
Latest Reply
Anske
New Contributor
  • 0 kudos

So since nobody responded, I decided to try my own suggestion and hack the snapshot data into the table that gathers the change data capture. After some straying, I ended up with the notebook as attached. The notebook first creates 2 DLT tables (lookup...

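For anyone hitting the same backfill question: the core of what `apply_changes` does with `sequence_by` can be simulated in plain Python to reason about how one-time snapshot rows and later CDC rows interleave. This is an illustrative sketch of the deduplication semantics only, not the DLT implementation; the idea is to give backfill rows a sequence value lower than any real CDC record so genuine changes always win.

```python
def latest_per_key(changes, key="id", seq="seq"):
    """Keep the latest record per key, ordered by a sequence column:
    a plain-Python sketch of apply_changes' sequence_by semantics."""
    best = {}
    for rec in changes:
        k = rec[key]
        if k not in best or rec[seq] > best[k][seq]:
            best[k] = rec
    return sorted(best.values(), key=lambda r: r[key])

# Backfill rows get seq=0 so any real CDC record supersedes them.
rows = [
    {"id": 1, "val": "snapshot", "seq": 0},  # one-time backfill
    {"id": 2, "val": "snapshot", "seq": 0},
    {"id": 1, "val": "updated", "seq": 5},   # later CDC update
]
print(latest_per_key(rows))
```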
dollyb
by New Contributor III
  • 42 Views
  • 0 replies
  • 0 kudos

Differences between Spark SQL and Databricks

Hello, I'm using a local Docker Spark 3.5 runtime to test my Databricks Connect code. However, I've come across a couple of cases where my code would work in one environment but not the other. Concrete example: I'm reading data from BigQuery via spark....

User16826987838
by Contributor
  • 2056 Views
  • 2 replies
  • 4 kudos
Latest Reply
User16826994223
Honored Contributor III
  • 4 kudos

Yes, it does! https://databricks.com/session/secured-kerberos-based-spark-notebook-for-data-science

1 More Replies
AChang
by New Contributor III
  • 1717 Views
  • 2 replies
  • 1 kudos

How to fix this runtime error in this Databricks distributed training tutorial workbook

I am following along with the notebook found in this article. I am attempting to fine-tune the model with a single node and multiple GPUs, so I run everything up to the "Run Local Training" section, but from there I skip to "Run distributed traini...

Latest Reply
KYX
Visitor
  • 1 kudos

Hi AChang, did you eventually resolve the error? I'm also having the same error.

1 More Replies
kiko_roy
by New Contributor III
  • 1406 Views
  • 4 replies
  • 3 kudos

Resolved! Permission error loading dataframe from azure unity catalog to GCS bucket

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the df or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. On running: df1.write.for...

Data Engineering
GCS bucket
permission error
Latest Reply
ruloweb
Visitor
  • 3 kudos

Hi, is there any Terraform resource to apply this GRANT, or does this always have to be done manually?

3 More Replies
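On the Terraform question in the reply above: the Databricks Terraform provider does have a `databricks_grants` resource for Unity Catalog privileges, so such a GRANT need not be manual. A hedged sketch; the external-location reference, principal, and privilege list below are placeholders, not values from this thread.

```hcl
# Hypothetical grant on an external location; names are placeholders.
resource "databricks_grants" "gcs_location" {
  external_location = databricks_external_location.gcs.id

  grant {
    principal  = "data-engineers"  # placeholder group name
    privileges = ["READ_FILES", "WRITE_FILES"]
  }
}
```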
cubanDataDude
by Visitor
  • 79 Views
  • 1 replies
  • 0 kudos

Job Claiming NotebookNotFound Incorrectly (Seemingly)

I have the code captured below in the screenshot. When I run this individually it works just fine; when a JOB runs this it fails out with 'ResourceNotFound'. Not sure what the issue is... - Checked 'main' branch, which is where this job is pulling f...

Latest Reply
cubanDataDude
  • 0 kudos

Figured it out: ecw_staging_nb_List = ['nb_UPSERT_stg_ecw_insurance', 'nb_UPSERT_stg_ecw_facilitygroups'] Works just fine.

amal15
by New Contributor
  • 274 Views
  • 2 replies
  • 1 kudos

Resolved! import ml.dmlc.xgboost4j.scala.spark.{XGBoostEstimator, XGBoostClassificationModel}

How can I import: import com.microsoft.ml.spark.{LightGBMClassifier, LightGBMClassificationModel} and import ml.dmlc.xgboost4j.scala.spark.{XGBoostEstimator, XGBoostClassificationModel} in a Spark & Scala project in Databricks?

Latest Reply
amal15
New Contributor
  • 1 kudos

XGBoostEstimator is not a member of package ml.dmlc.xgboost4j.scala.spark. How can I resolve this error? With Maven: ml.dmlc:xgboost4j-spark_2.12:2.0.3

1 More Replies
amal15
by New Contributor
  • 39 Views
  • 0 replies
  • 0 kudos

XGBoostEstimator is not a member of package ml.dmlc.xgboost4j.scala.spark ?

XGBoostEstimator is not a member of package ml.dmlc.xgboost4j.scala.spark. How can I resolve this error?

Dom1
by Visitor
  • 39 Views
  • 0 replies
  • 0 kudos

Show log4j messages in run output

Hi, I have an issue when running JAR jobs. I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j is only ...

jp_allard
by Visitor
  • 28 Views
  • 0 replies
  • 0 kudos

Selective Overwrite to a Unity Catalog Table

I have been able to perform a selective overwrite using replaceWhere to a hive_metastore table, but when I use the same code for the same table in Unity Catalog, no data is written. Has anyone else had this issue, or are there common mistakes that ar...

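For anyone comparing behavior here: Delta's `replaceWhere` atomically deletes the rows matching the predicate and appends the new data, all of which must satisfy the predicate. A plain-Python sketch of those semantics as a mental model while debugging; this is not Spark code.

```python
def replace_where(table_rows, new_rows, predicate):
    """Simulate Delta Lake's replaceWhere: drop existing rows that match
    the predicate, then append new_rows, each of which must match it."""
    if not all(predicate(r) for r in new_rows):
        raise ValueError("all new rows must satisfy the replaceWhere predicate")
    return [r for r in table_rows if not predicate(r)] + new_rows

table = [{"date": "2024-01-01", "v": 1}, {"date": "2024-02-01", "v": 2}]
fresh = [{"date": "2024-02-01", "v": 99}]
result = replace_where(table, fresh, lambda r: r["date"] >= "2024-02-01")
print(result)  # January row kept, February row replaced
```

In Spark itself this corresponds to `df.write.mode("overwrite").option("replaceWhere", "date >= '2024-02-01'")`; whether a predicate mismatch explains the Unity Catalog behavior in the question is not established here.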
dannythermadom
by New Contributor III
  • 2216 Views
  • 6 replies
  • 7 kudos

Dbutils.notebook.run command not working with /Repos/

I have two GitHub repos configured in the Databricks Repos folder. repo_1 is run using a job, and repo_2 is run/called from repo_1 using the dbutils.notebook.run command: dbutils.notebook.run("/Repos/repo_2/notebooks/notebook", 0, args). I am getting the follo...

Latest Reply
cubanDataDude
  • 7 kudos

I am having a similar issue... ecw_staging_nb_List = ['/Workspace/Repos/PRIMARY/UVVC_DATABRICKS_EDW/silver/nb_UPSERT_stg_ecw_insurance', '/Repos/PRIMARY/UVVC_DATABRICKS_EDW/silver/nb_UPSERT_stg_ecw_facilitygroups'] Adding workspace d...

5 More Replies
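The workaround that surfaces in the replies (absolute /Workspace paths) can be wrapped in a small helper. This is a hypothetical utility, not part of any Databricks API; it simply normalizes /Repos/... paths to the /Workspace/Repos/... form that reportedly resolved the NotebookNotFound errors.

```python
def to_workspace_path(path: str) -> str:
    """Normalize a notebook path for dbutils.notebook.run by prefixing
    /Repos/... paths with /Workspace (hypothetical helper)."""
    if path.startswith("/Repos/"):
        return "/Workspace" + path
    return path

# Placeholder paths for illustration only.
paths = [to_workspace_path(p) for p in [
    "/Repos/PRIMARY/project/silver/nb_upsert",
    "/Workspace/Repos/PRIMARY/project/silver/nb_other",
]]
print(paths)
```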
my_super_name
by Visitor
  • 24 Views
  • 0 replies
  • 0 kudos

Auto Loader Schema Hint Behavior: Addressing Nested Field Errors

Hello, I'm using Auto Loader to stream a table of data and have added schema hints to specify field values. I've observed that when my initial data file is missing fields specified in the schema hint, Auto Loader correctly identifies this and ad...

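A small illustrative helper for composing the `cloudFiles.schemaHints` option string from a mapping; the nested struct hint shown is an assumed schema, not the poster's. In real use the resulting string would be passed to `.option("cloudFiles.schemaHints", ...)` on the Auto Loader stream.

```python
def schema_hints(hints: dict) -> str:
    """Build a cloudFiles.schemaHints string from column -> Spark SQL type."""
    return ", ".join(f"{col} {typ}" for col, typ in hints.items())

hint = schema_hints({
    "version": "INT",
    # Nested fields can be hinted with a full struct type (assumed schema):
    "meta": "STRUCT<id: BIGINT, name: STRING>",
})
print(hint)
```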