Error with mosaic.enable_mosaic() when created DLT Pipeline with Mosaic lib

slothPetete
New Contributor II

The error was raised when I tried to start a DLT pipeline with some simple code, just to start experimenting with DLT.

The primary library is Mosaic, which the documentation says should be installed before it is imported.

The code is roughly as follows:

 

%pip install databricks-mosaic

import mosaic as mos
mos.enable_mosaic(spark, dbutils)  # <- error is raised on this line

import dlt
from pyspark.sql.functions import *
from pyspark.sql.types import *

@dlt.table(comment="Testing a DLT table for geospatial coverage")
def geo_area():
    # Simple passthrough read of an existing table, just to exercise DLT
    return spark.read.table("geo_coverage_area")
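
For context, here is a variant I am considering, assuming (and this is only an assumption, not something the Mosaic docs quoted here confirm) that enable_mosaic treats its dbutils argument as optional and uses it only for notebook conveniences:

%pip install databricks-mosaic

import mosaic as mos

# Hedged sketch, not a confirmed fix: a DLT run may not expose the notebook
# dbutils object the way an interactive notebook does, so skip passing it and
# let Mosaic fall back to its default behaviour (assumption to verify against
# the installed databricks-mosaic version).
mos.enable_mosaic(spark)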

 

Below are the DLT pipeline settings (JSON):

 

{
  "id": "<>",
  "pipeline_type": "WORKSPACE",
  "clusters": [
    {
      "label": "default",
      "node_type_id": "m5d.large",
      "driver_node_type_id": "m5d.large",
      "custom_tags": {
        "type": "test"
      },
      "num_workers": 1
    },
    {
      "label": "maintenance",
      "custom_tags": {
        "type": "test"
      }
    }
  ],
  "development": true,
  "continuous": false,
  "channel": "CURRENT",
  "photon": false,
  "libraries": [
    {
      "notebook": {
        "path": "/<>/test DLT"
      }
    }
  ],
  "name": "helloCellCoverage",
  "edition": "CORE",
  "catalog": "sigint_workspace",
  "target": "default",
  "data_sampling": false
}
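
For what it's worth, Photon is off and the channel is CURRENT in this run; toggling either of them only changes the two top-level fields below (shown as an illustrative fragment of the same settings document, not a recommendation):

"channel": "PREVIEW",
"photon": true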

 

And here is the error that was raised:

java.lang.RuntimeException: Failed to execute python command for notebook "<Removed>" and error AnsiResult(---------------------------------------------------------------------------
Py4JError                                 Traceback (most recent call last)
File <command--1>:3
      1 import mosaic as mos
----> 3 mos.enable_mosaic(spark, dbutils)
      5 import dlt
      6 # import pyspark.sql.functions as 
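
To narrow down whether the DLT execution context exposes dbutils at all, I plan to drop a small probe like this into the same pipeline notebook (purely diagnostic, no Mosaic calls involved):

# Hypothetical probe: check whether the dbutils global injected by Databricks
# notebooks resolves inside the DLT run.
try:
    dbutils  # noqa: F821 - provided by the Databricks runtime, may be absent here
    print("dbutils resolves in this context:", type(dbutils))
except NameError:
    print("dbutils is not defined in this execution context")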

2 REPLIES

Kaniz
Community Manager

Hi @slothPetete, the error message you're encountering, "OAuth error response, generally means someone clicked cancel: access_denied (errorCode=180002)", indicates that the authorization process was interrupted or denied.

Let’s break it down:

  1. OAuth: OAuth is a protocol used for secure authorization between applications. It allows users to grant limited access to their resources without sharing their credentials directly.

  2. Access Denied (errorCode=180002): This specific error code signifies that the user denied the requested permissions during the OAuth flow. Essentially, someone clicked “cancel” or declined the authorization request.

  3. Possible Solutions:

    • Authorization Flow: Ensure that the user is guided through the entire authorization process. Sometimes users accidentally click “cancel” or deny access.
    • Review Scopes: Check the requested scopes (permissions) in your OAuth configuration. If the user is uncomfortable accepting certain permissions, consider modifying the requested scopes.
  4. Tableau Server and Azure SQL Database:

    • You mentioned using Tableau Server with Azure SQL Database. Make sure that your OAuth configuration is correctly set up in Tableau Server.
    • Verify that the credentials for Azure SQL Database are valid within Tableau Server’s account settings.
    • Additionally, if you have SAML enabled with multi-factor authentication (MFA), ensure that it’s configured correctly.

For more detailed troubleshooting, you can refer to Tableau’s official documentation on OAuth connections. If you need further assistance, consider reaching out to Tableau support. 🚀

slothPetete
New Contributor II

Hi, thank you for the reply. However, I couldn't relate the OAuth issue to the Java runtime error in my question. Additionally, I tried checking the Photon option when starting the workflow, and the error still persists.

Thank you for your response anyway.
