Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

DBSQL subscriptions method returning `410: Gone`

Salty
New Contributor II

We've been using the DBSQL API to perform CRUD on queries and alerts.

Part of that process added a Slack channel as an alert destination using the /subscriptions element on an alert POST, as below.

As of today I am getting a 410 'Gone' error from the API when trying to add a subscription.

I understand the DBSQL API is in preview and therefore subject to change without notice; however, we are a Slack-heavy org and it was very useful!

Does anyone know of a workaround?

ALERTS_API = DOMAIN + '/api/2.0/preview/sql/alerts'
response = requests.post(
    ALERTS_API + '/' + alertId + '/subscriptions'
    , headers=HEADER
    , data=jsonPayload
    )
Exception: Error adding alert subscription: 410; Reason: Gone
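
For anyone reproducing this: DOMAIN and HEADER are just the usual workspace URL and personal-access-token auth header (placeholder values below), and jsonPayload is the JSON-serialised subscription body.

import json
import requests

# Placeholder workspace URL and token - not real values
DOMAIN = 'https://<workspace>.cloud.databricks.com'
HEADER = {'Authorization': 'Bearer <personal-access-token>'}

# jsonPayload is the JSON-serialised subscription body, built with json.dumps(...)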

1 ACCEPTED SOLUTION

Accepted Solutions

Anonymous
Not applicable

@Jeremy Salt:

The 410 Gone error typically indicates that the resource you are trying to access no longer exists. It's possible that the DBSQL API has removed support for adding subscriptions to alerts via the /subscriptions element.
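
If you want the existing call to fail more gracefully in the meantime, a minimal sketch reusing the names from your snippet (ALERTS_API, HEADER, alertId, jsonPayload) would be:

response = requests.post(ALERTS_API + '/' + alertId + '/subscriptions',
                         headers=HEADER, data=jsonPayload)
if response.status_code == 410:
    # The endpoint itself has been removed, so retrying won't help;
    # fall back to another notification path instead.
    print('Subscriptions endpoint no longer available:', response.reason)
else:
    response.raise_for_status()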

One workaround you could try is to create a Slack incoming webhook for the channel and configure the webhook URL as the alert destination. This would allow you to continue receiving alerts in Slack even if the DBSQL API no longer supports adding subscriptions.

Here's an example of how to create a webhook for a Slack channel:

  1. Go to your Slack workspace and navigate to the channel you want to receive alerts in.
  2. Click the settings icon and select "Add apps".
  3. Search for "Incoming WebHooks" and click "Add to Slack".
  4. Follow the prompts to authorize the app and configure the webhook.
  5. Once you've configured the webhook, you'll see a URL that you can use as the alert destination in the DBSQL API.
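
Once you have the URL, you can sanity-check it before wiring it into anything; Slack incoming webhooks accept a simple JSON body with a "text" field (the URL below is a placeholder):

import requests

# Placeholder webhook URL from the "Incoming WebHooks" app configuration
webhook_url = 'https://hooks.slack.com/services/XXXXXXX/YYYYYYY/ZZZZZZZ'

# Slack responds with HTTP 200 and the body "ok" on success
resp = requests.post(webhook_url, json={'text': 'Test message from the alert workaround'})
resp.raise_for_status()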

Once you have the webhook URL, you can configure it as the alert destination using the DBSQL API: create an alert with the "Webhook" type and set the "destination" field to the webhook URL. Here's an example of how to create an alert with a webhook destination using Python:

import json
import requests

# DOMAIN and HEADER are the same workspace URL and auth header used in the question
ALERTS_API = DOMAIN + '/api/2.0/preview/sql/alerts'
alert_payload = {
    "name": "My Alert",
    "description": "This is my alert",
    "query": "SELECT * FROM my_table",
    "type": "Webhook",
    "destination": "https://hooks.slack.com/services/XXXXXXX/YYYYYYY/ZZZZZZZ",
    "condition": {
        "threshold": 10
    }
}
response = requests.post(ALERTS_API, headers=HEADER, data=json.dumps(alert_payload))

Replace the "destination" value with the webhook URL you created in Slack, passed as a plain string as shown in the example above.


4 REPLIES

Anonymous
Not applicable

Hi @Jeremy Salt

Thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs.

Please help us select the best solution by clicking on "Select As Best" if it does.

Your feedback will help us ensure that we are providing the best possible service to you. Thank you!

Salty
New Contributor II

Hi @Suteja Kanuri, @Vidula Khanna

Thanks for getting back with a solution.

The suggested solution looks fine, but for a number of reasons I went with another option: using the Jobs API. This allowed me to preserve more of the automation I had already written, simplified scheduling, and gives me a base for adding other task types.

def create_acl() -> str:
    """
    Return an ACL as a json string
    """
    can_manage_list = CAN_MANAGE_USERS.split(',')
    acl = ''
    user_id = spark.sql('select current_user() as user').collect()[0]['user']
 
    for user in can_manage_list:
        if user == user_id:
            continue
        if acl == '':
            acl = acl + '[{"user_name":"' + user + '", "permission_level":"CAN_MANAGE_RUN"}'
        else:
            acl = acl + ', {"user_name":"' + user + '", "permission_level":"CAN_MANAGE_RUN"}'
    acl = acl + ']'
    return acl
 
 
def create_alert_task_array(alerts:[str], subscription_id:str) -> str:
    """
    Create a list of tasks that can be assigned to a job during creation
    Returned as a json string
    """
    # at some point we may need to pass in a list of destinations and add them all
    
    
    WAREHOUSE_ID = '[redacted]'
 
    sql_tasks = ''
    for alert in alerts:
        if get_alert_id(alert) is None:
            jobs_logger.warning(f'Could not find alert_id. "{alert}" has not been added to the job')
            continue
        tmp = '{"task_key": "TASK_' + alert + '", ' + \
            '"sql_task": {' + \
                '"alert": {' + \
                    '"alert_id": "' + get_alert_id(alert) + '", ' + \
                '"subscriptions": [' + \
                     '{"destination_id": "' + subscription_id + '"}]' + \
                    '}, ' + \
                '"warehouse_id": "' + WAREHOUSE_ID + '" }}'
        if sql_tasks == '':
            sql_tasks = tmp
        else:
            sql_tasks = sql_tasks + ', ' + tmp
 
    return sql_tasks
 
 
def create_alert_job(name:str, alert_names:[str], schedule: str, subscription_id: str):
    """
    create a job 
    """
    try:
        try:
            delete_job_by_name(name) # may not have existed before, will raise a warning
        except Exception as e:
            jobs_logger.debug('keep calm and carry on: likely that the job has not been created before. user will be warned and should spot any dupes')
        finally:
            task_list = create_alert_task_array(alert_names, subscription_id)
            if schedule != '':
                schedule_clause = '"schedule": {' + \
                        '"quartz_cron_expression": "' + schedule + '", ' + \
                        '"timezone_id": "UTC", ' + \
                        '"pause_status": "UNPAUSED"' + \
                        '}, '
            else:
                schedule_clause = ''
 
            body = '{"name": "' + name + '", ' + \
                    '"tasks": [' + task_list + '], ' + \
                    schedule_clause + \
                    '"format": "MULTI_TASK", ' + \
                    '"access_control_list": ' + create_acl() + \
                    '}'    
                    # '"tags": {}, ' + \
            jobs_logger.debug(body)
            response = requests.post(
                JOBS_API + '/create'
                , data=body
                , headers=HEADER
            )
 
            if response.status_code == 200:
                jobs_logger.info(f"Successfully created job: {name}")
            else:
                if response.status_code == 403:
                    jobs_logger.warning(f'Authorisation has failed, check API token')
                jobs_logger.info(response.json())
                raise Exception(f'Error creating job {name}: Error: {response.status_code}; Reason: {response.reason}')
    except Exception as e:
        jobs_logger.warning(f'{e}')
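
For reference, the functions above can be wired together like this (the alert names, cron expression and destination id are placeholders, not real values):

# Hypothetical values: your own alert names and the UUID of the Slack
# notification destination configured in the workspace
alert_names = ['my_first_alert', 'my_second_alert']
slack_destination_id = '00000000-0000-0000-0000-000000000000'

# Run the alerts at the top of every hour (Quartz cron) and notify the Slack destination
create_alert_job(
    name='hourly_alert_runner',
    alert_names=alert_names,
    schedule='0 0 * * * ?',
    subscription_id=slack_destination_id
)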

Anonymous
Not applicable

@Jeremy Salt: thanks for providing the solution and letting us know how you went ahead with solving this.
