- 1529 Views
- 1 reply
- 0 kudos
Resolved! Databricks connect - SQL Server - Login error with all-purpose cluster
Hello everyone, I'm using the Databricks connect feature to connect to a SQL Server in the cloud. I created a foreign catalog based on the connection, but whenever I try to access the tables, I get a login error. I have tried with a serverless cluster and...
- 0 kudos
Solved. Turns out it was a networking issue: once the subnets from Databricks were allowed by the cloud SQL server, we managed to connect. The error message is misleading because the credentials were correct.
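For context, the setup described here is a Lakehouse Federation connection plus a foreign catalog built on top of it. A minimal sketch of those two objects, with placeholder host, secret scope, and object names rather than the poster's actual values:

```python
# Sketch: create a SQL Server connection and a foreign catalog over it.
# The host, secret scope/keys, and all object names are placeholders.
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS sqlserver_conn TYPE sqlserver
    OPTIONS (
        host 'myserver.database.windows.net',
        port '1433',
        user secret('my_scope', 'sql_user'),
        password secret('my_scope', 'sql_password')
    )
""")

spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS sqlserver_cat
    USING CONNECTION sqlserver_conn
    OPTIONS (database 'my_database')
""")
```

If queries against the foreign catalog then fail with a login error even though the credentials are correct, check (as in this thread) that the SQL server's firewall allows traffic from the Databricks workspace subnets.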
- 1214 Views
- 3 replies
- 1 kudos
Resolved! PERMISSION_DENIED: User is not an owner of Table/Schema
Hi, we have recently added a service principal for running and managing all of our jobs. The service principal has ALL PRIVILEGES on our catalogs, schemas, and tables, but we're still seeing the error message `PERMISSION_DENIED: User is not an owner of T...
- 1 kudos
I think the feedback button is the right place. At least I don't know of another way.
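Worth noting for anyone who lands here: some Unity Catalog operations check object ownership rather than granted privileges, so ALL PRIVILEGES alone may not be enough. A hedged sketch of transferring ownership to the service principal; the three-level names and the application ID below are placeholders:

```python
# Transfer ownership of a schema and a table to a service principal,
# identified by its application ID (all values are placeholders).
sp_app_id = "11111111-2222-3333-4444-555555555555"

spark.sql(f"ALTER SCHEMA my_catalog.my_schema OWNER TO `{sp_app_id}`")
spark.sql(f"ALTER TABLE my_catalog.my_schema.my_table OWNER TO `{sp_app_id}`")
```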
- 691 Views
- 1 reply
- 0 kudos
Resolved! [DATA_SOURCE_NOT_FOUND] Failed to find data source
Context: Hello, I was using a workflow for a periodic process. With my team we were using a Job Compute, but the libraries were not working (even though we had a PIP_EXTRA_INDEX_URL defined in the Environment Variables of the cluster), so we now use a ...
- 0 kudos
I installed this library on the cluster: spark_mssql_connector_2_12_1_4_0_BETA.jar. A colleague passed me this .jar file. It seems it can be obtained from here: https://github.com/microsoft/sql-spark-connector/releases. This allows the task to end succ...
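A minimal read sketch using that connector once the jar is installed; the format name comes from the connector's README, while the server, database, table, and secret names are placeholders:

```python
# Read a SQL Server table through the Microsoft Spark connector.
# The JDBC URL, table name, and secret scope/keys are placeholders.
df = (
    spark.read.format("com.microsoft.sqlserver.jdbc.spark")
    .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;databaseName=my_db")
    .option("dbtable", "dbo.my_table")
    .option("user", dbutils.secrets.get("my_scope", "sql_user"))
    .option("password", dbutils.secrets.get("my_scope", "sql_password"))
    .load()
)
df.show(5)
```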
- 1015 Views
- 2 replies
- 2 kudos
Resolved! How to enable Genie?
Hi All, based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features (that is done). 2. Enable Genie from the Previews page. I do not see Genie among the Previews. I am using Azure Databricks. Any idea how ...
- 5364 Views
- 2 replies
- 0 kudos
Leverage Azure PIM with Databricks with Contributor role privilege
We are trying to leverage Azure PIM. This works great for most things; however, we've run into a snag. We want to limit the Contributor role to a group and only at the resource group level, not the subscription level. We wish to elevate via PIM. This will ...
- 0 kudos
Did you find a solution to the 20-40 minute delay?
- 518 Views
- 1 reply
- 0 kudos
Exam for Databricks Certified Data Engineer Associate
My Databricks Professional Data Engineer certification exam got suspended. My exam had only run for half an hour when it showed me an error about eye movement while I was reading a question. The exam was suspended on 11th of July 2024 and is still showing an in-progress assess...
- 0 kudos
I'm sorry to hear your exam was suspended. Please file a ticket with our support team and allow the support team 24-48 hours for a resolution. You should also review this documentation: Room requirements and Behavioral considerations.
- 190 Views
- 1 reply
- 0 kudos
Access Git folder information from notebook
In my Workspace, I have a repository with a Git folder. I would like to programmatically access, with Python from within a notebook: the name of the repo, and the currently checked out branch in the repo. I want to do this in two different ways: (1) Access said informa...
- 0 kudos
Hi @johnb1, you can use one of the following options to achieve what you want: the Databricks CLI repos commands, the Databricks Python SDK, or Databricks REST API calls. (A Python SDK sketch follows below.)
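A short sketch of the Python SDK option, assuming it runs inside a notebook where default authentication applies; the path prefix is a placeholder:

```python
# List repos under a workspace path and print each repo's path and branch.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for repo in w.repos.list(path_prefix="/Repos/someone@example.com"):
    # repo.path ends with the repo name; repo.branch is the current checkout
    print(repo.path, repo.branch)
```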
- 2154 Views
- 6 replies
- 3 kudos
Resolved! Hard reset programmatically
Is it possible to trigger a git reset --hard programmatically? I'm running a platform service where, as part of CI/CD, repos get deployed into the Databricks workspace. Normally, our developers work with upstream repos both from their local IDEs and fr...
- 3 kudos
Thank you for the feedback there! We recently added more docs for SP OAuth support for DevOps. SP OAuth support for GitHub is being discussed.
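As far as the Repos API goes, there is no literal `git reset --hard` endpoint; the closest programmatic equivalent is updating the repo to a branch (or tag), which checks out the latest upstream state for it. A sketch with a placeholder repo ID:

```python
# Point a workspace repo at the tip of a branch via the Repos API.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.repos.update(repo_id=123456789, branch="main")  # repo_id is a placeholder
```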
- 238 Views
- 0 replies
- 0 kudos
Azure Network Connectivity Configurations API failing
It seems that since yesterday evening (Europe time) there has been a platform-side issue with the Network Connectivity Configurations API on Azure Databricks accounts. API calls are being redirected to a login page, causing multiple different tools, such as Terr...
- 273 Views
- 1 reply
- 2 kudos
Resolved! Trouble with host URL parameterization
I am attempting to parameterize a Databricks YAML file so I can deploy it to multiple Databricks accounts via GitLab CI/CD, and have run into a snag when parameterizing the workspace host value. My variable block looks like this: variables: databricks_ho...
- 2 kudos
Hi @jasont41, your assumption is correct: you can't use a variable for the host mapping. You can find information about this in the following documentation entry: https://docs.databricks.com/en/dev-tools/bundles/settings.html#other-workspace-mappings
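Given that restriction, the usual workaround is one bundle target per workspace with the host hard-coded, selected at deploy time. A sketch with placeholder workspace URLs:

```yaml
# databricks.yml -- hosts fixed per target instead of a variable
targets:
  dev:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
  prod:
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
```

A GitLab CI/CD job would then run `databricks bundle deploy -t dev` or `-t prod` instead of passing the host as a variable.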
- 751 Views
- 3 replies
- 3 kudos
Resolved! Use a Service Principal Token instead of Personal Access Token for Databricks Asset Bundle
How can I connect using a Service Principal token? I did this, but it is not a PAT: databricks configure Databricks host: https:// ... Personal access token: **** I also tried this, but it didn't work either: [profile] host = <workspace-url> client_id ...
- 3 kudos
Thanks Pedro, we did it. For anyone in the future (I added fake host and service principal IDs): 1. Modify your databricks.yml so it has the service principal ID and the Databricks host: bundle: name: my_workflow # Declare to Databricks Assets Bu...
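For the `.databrickscfg` route the question attempted, the CLI also supports OAuth machine-to-machine profiles for service principals, using a client ID and secret instead of a PAT. A sketch with placeholder values:

```ini
# ~/.databrickscfg -- OAuth M2M profile for a service principal (placeholders)
[sp-deploy]
host          = https://adb-1111111111111111.11.azuredatabricks.net
client_id     = 11111111-2222-3333-4444-555555555555
client_secret = <oauth-secret-created-for-the-service-principal>
```

The profile is then selected with `databricks bundle deploy -p sp-deploy`.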
- 237 Views
- 2 replies
- 0 kudos
Automatic schema rendering of files in Unity Catalog
Hi team, can anyone please confirm whether Unity Catalog supports automatic schema rendering from CSV, JSON, PDF, and structured/unstructured files? Meaning, if I create a volume with a path/location to a folder (or S3 bucket) containing such files, can Unity Cata...
- 0 kudos
OK, thanks a lot @gchandra. So, I am new to Unity Catalog and particularly interested in (and evaluating) the open-sourced version of Unity Catalog (https://www.unitycatalog.io/). I know that we can create volumes and those in turn can point to CSV, JSON...
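One distinction that may help the evaluation: the catalog (managed or open source) stores metadata about volumes and tables, while schema inference over the raw files happens in the query engine at read time. On Databricks, for example, the `read_files` function infers a schema from files in a volume; the path and options below are placeholders:

```python
# Infer a schema from CSV files sitting in a Unity Catalog volume.
# The volume path, format, and options are placeholders.
df = spark.sql("""
    SELECT * FROM read_files(
        '/Volumes/my_catalog/my_schema/my_volume/landing/',
        format => 'csv',
        header => true
    )
""")
df.printSchema()  # the inferred ("rendered") schema
```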
- 694 Views
- 2 replies
- 1 kudos
Resolved! Error getting locations: Unsupported response format: STREAM [Azure Databricks - Catalog Explorer]
Hi all, I saw this error when I checked my External Locations and Storage Credentials in Catalog Explorer. The error message gives zero information for diagnosis. Do you have any idea what the reason is? Thank you.
- 1 kudos
And it works now ... No idea what happened ~
- 496 Views
- 4 replies
- 0 kudos
How to disable Spark Connect in Databricks compute?
I want to be able to access the RDD methods of a DataFrame, but it seems that this is not supported in Spark Connect. I have been trying to disable Spark Connect in the Spark config using spark.databricks.service.server.enabled false, but when I check...
- 0 kudos
I have found that when the cluster is shared, it automatically uses that type of session, and in that case I have not been able to disable it. I don't know if this is your situation. I have avoided some problems that I had by using the previous clause.
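For anyone debugging the same thing, a heuristic sketch for confirming which session type a notebook actually received; this relies on PySpark's module layout rather than any official API:

```python
# Spark Connect sessions come from the pyspark.sql.connect package,
# while classic sessions live in pyspark.sql.session.
def is_spark_connect(session) -> bool:
    return type(session).__module__.startswith("pyspark.sql.connect")

print(is_spark_connect(spark))  # True => df.rdd and other RDD APIs unavailable
```

On shared access mode clusters the Connect-style session is enforced, so the RDD APIs remain unavailable regardless of that config flag.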
- 134 Views
- 0 replies
- 0 kudos
MLflow Tracking versions
Hi team, we are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, there are concerns about the unclear process of version changes and releases on the Tracking server side. Is there any public information ava...