Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.
While connecting to an API from a Databricks notebook with a bearer token, I am getting the error below: HTTPSConnectionPool(host='sandvik.peakon.com', port=443): Max retries exceeded with url: /api/v1/segments?page=1 (Caused by SSLError(SSLCertVerifica...
Hi @SunilSamal
The error you are encountering, SSLCertVerificationError, indicates that the SSL certificate verification failed because the local issuer certificate could not be obtained. This is a common issue when the SSL certificate chain is inco...
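When the cluster image is missing the issuing root CA (common behind corporate TLS-inspecting proxies), one fix is to point the client at an exported CA bundle rather than disabling verification. A minimal stdlib sketch, assuming the endpoint from the question; the helper names and the `ca_bundle_path` parameter are illustrative, not part of any Databricks API:

```python
import ssl
import urllib.request

def make_verified_context(ca_bundle_path=None):
    # The default context verifies the server certificate and hostname.
    # Passing cafile lets you trust a corporate/proxy root CA that the
    # cluster image does not ship with, instead of turning checks off.
    return ssl.create_default_context(cafile=ca_bundle_path)

def fetch_segments(token, ca_bundle_path=None):
    # Hypothetical call matching the URL in the question above.
    req = urllib.request.Request(
        "https://sandvik.peakon.com/api/v1/segments?page=1",
        headers={"Authorization": f"Bearer {token}"},
    )
    ctx = make_verified_context(ca_bundle_path)
    with urllib.request.urlopen(req, context=ctx) as resp:
        return resp.read()
```

The same idea applies if the code uses `requests`: pass the exported bundle via the `verify=` argument instead of setting it to `False`.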
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
I am an Account Admin at Databricks (Azure), trying to delete users that are being offboarded. I have managed to delete most users. However, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...
I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...
I can't get past the error below. I've read and reread the instructions several times at the URL below and for the life of me cannot figure out what I'm missing in my AWS setup. Any tips on how to track down my issue? https://docs.databricks.com/en/c...
Hi, I'm trying to leverage CACHE TABLE to create temporary tables that are cleaned up at the end of the session. In creating one of these, I'm getting "Data too long for column 'session_data'". The query I'm using isn't referencing a session_data colu...
We are looking to do an integration with Databricks, and I've noticed that the samples database doesn't have an INFORMATION_SCHEMA. We rely on the existence of the information_schema to help us understand what views/tables exist in each catalog. W...
The "samples" catalog in Databricks does not have an INFORMATION_SCHEMA because it is designed primarily for demonstration and educational purposes, rather than for production use. This schema is typically included in catalogs created on Unity Catalo...
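For catalogs that do have an INFORMATION_SCHEMA (Unity Catalog-managed catalogs other than samples), enumerating tables and views is a query against `<catalog>.information_schema.tables`. A small sketch; `main` is a stand-in catalog name and the helper is illustrative:

```python
def tables_query(catalog: str) -> str:
    # information_schema lives inside each Unity Catalog catalog;
    # table_type distinguishes views ('VIEW') from base tables.
    return (
        "SELECT table_schema, table_name, table_type "
        f"FROM {catalog}.information_schema.tables "
        "ORDER BY table_schema, table_name"
    )

# In a notebook, against a Unity Catalog-enabled workspace:
# display(spark.sql(tables_query("main")))
```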
Following the instructions on Authentication settings for the Databricks ODBC Driver | Databricks on AWS I have created both Personal access token and OAuth 2.0 token connection options in an application. However, I realized that, when I use OAuth 2....
This will require further analysis; based on the information provided, it seems there is no direct way to restrict this. Are you able to submit a support case so this can be analyzed?
Hello everyone, I'm using the Databricks Connect feature to connect to a SQL Server in the cloud. I created a foreign catalog based on the connection, but whenever I try to access the tables, I get a login error. I have tried with a serverless cluster and...
Solved. It turns out it was a networking issue: once the Databricks subnets were allowed by the cloud SQL Server, we managed to connect. The error message is misleading, because the credentials were correct.
Hi, we have recently added a service principal for running and managing all of our jobs. The service principal has ALL PRIVILEGES on our catalogs, schemas, and tables. But we're still seeing the error message `PERMISSION_DENIED: User is not an owner of T...
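One likely cause, assuming the error is ownership-related: ALL PRIVILEGES does not confer ownership, and certain operations (e.g. altering or replacing a table) require the owner. If that applies, ownership can be transferred to the service principal. A sketch; the table name and principal are placeholders:

```python
def transfer_owner_sql(table: str, principal: str) -> str:
    # Ownership is separate from granted privileges in Unity Catalog;
    # this statement must be run by the current owner or a metastore admin.
    return f"ALTER TABLE {table} OWNER TO `{principal}`"

# In a notebook (placeholder names):
# spark.sql(transfer_owner_sql("main.sales.orders", "my-service-principal"))
```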
Context: Hello, I was using a workflow for a periodic process. With my team we were using a Job Compute, but the libraries were not working (even though we had a PIP_EXTRA_INDEX_URL defined in the environment variables of the cluster), so we now use a ...
I installed this library on the cluster: spark_mssql_connector_2_12_1_4_0_BETA.jar. A colleague passed me this .jar file; it seems it can be obtained from here: https://github.com/microsoft/sql-spark-connector/releases. This allows the task to end succ...
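For reference, once that .jar is installed, reads go through the connector's `com.microsoft.sqlserver.jdbc.spark` format, per the microsoft/sql-spark-connector README. A sketch with placeholder server, database, table, and user names:

```python
def mssql_read_options(server: str, database: str, table: str, user: str) -> dict:
    # Option keys follow the microsoft/sql-spark-connector README;
    # the password is supplied separately (ideally from a secret scope).
    return {
        "url": f"jdbc:sqlserver://{server};databaseName={database}",
        "dbtable": table,
        "user": user,
    }

# In a notebook, with the connector .jar installed on the cluster:
# df = (spark.read
#       .format("com.microsoft.sqlserver.jdbc.spark")
#       .options(**mssql_read_options("myhost", "mydb", "dbo.mytable", "sqluser"))
#       .option("password", dbutils.secrets.get("my-scope", "my-key"))
#       .load())
```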
Hi All, based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features (that is done); 2. Enable Genie from the Previews page. I do not see Genie among the Previews. I am using Azure Databricks. Any idea how ...
We are trying to leverage Azure PIM. This works great for most things; however, we've run into a snag. We want to limit the Contributor role to a group, and only at the resource group level, not the subscription. We wish to elevate via PIM. This will ...
My Databricks Professional Data Engineer certification exam got suspended. My exam ran for only half an hour; it was showing me an error for eye movement while I was reading a question. The exam was suspended on 11th of July 2024 and is still showing as an in-progress assess...
I'm sorry to hear your exam was suspended. Please file a ticket with our support team and allow them 24-48 hours for a resolution. You should also review this documentation: Room requirements, Behavioral considerations.
In my Workspace, I have a repository with a Git folder. I would like to access the following programmatically with Python from within a notebook: the name of the repo, and the currently checked-out branch in the repo. I want to do this in two different ways: (1) access said informa...
Hi @johnb1, you can use one of the following options to achieve what you want: Databricks CLI repos commands, the Databricks Python SDK, or Databricks REST API calls.
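As a sketch of the REST-API route: the Repos API (GET /api/2.0/repos) returns each repo's path and currently checked-out branch. The workspace host and token below are placeholders; the helper names are illustrative:

```python
import json
import urllib.request

def repos_request(host: str, token: str) -> urllib.request.Request:
    # Repos API 2.0: lists repos with their path, url, and branch.
    return urllib.request.Request(
        f"https://{host}/api/2.0/repos",
        headers={"Authorization": f"Bearer {token}"},
    )

def repo_summary(repo: dict) -> tuple:
    # Repo name is the last segment of the workspace path.
    return repo["path"].rsplit("/", 1)[-1], repo.get("branch")

# Usage (placeholder host; token e.g. from dbutils secrets):
# with urllib.request.urlopen(repos_request("adb-123.azuredatabricks.net", token)) as r:
#     for repo in json.load(r).get("repos", []):
#         print(repo_summary(repo))
```

The Databricks Python SDK wraps the same endpoint (`WorkspaceClient().repos.list()`), so the parsing logic carries over.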