Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

OwenDB
by New Contributor III
  • 7273 Views
  • 4 replies
  • 3 kudos

Spark jobs not starting for saving

I am having issues with a new workspace that has been created and am having trouble diagnosing what the issue is. It seems to be related to compute resources, Spark, and storage. I am an admin and able to create compute resources. Using SQL warehouses I ca...

Latest Reply
OwenDB
New Contributor III
  • 3 kudos

OK, looks like it might be this for us: "You must have a separate pair of host/container subnets for each workspace that you deploy. It is unsupported to share subnets across workspaces or to deploy other Azure resources on the subnets that are used ...

3 More Replies
Kroy
by Contributor
  • 4242 Views
  • 2 replies
  • 1 kudos

Not able to edit the user group entitlement

I am a workspace admin, but when editing the entitlement (allow unrestricted cluster creation) of a user group, it is not happening: it gives a prompt to confirm the removal, but even after confirming it does not get removed. After clicking remove, i...

(Screenshots attached: Kroy_0-1704885328747.png, Kroy_1-1704885377008.png)
Latest Reply
Avvar2022
Contributor
  • 1 kudos

Is your workspace Unity Catalog enabled?

1 More Replies
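
For anyone hitting the same wall with the UI toggle: one workaround worth trying is patching the group's entitlements directly through the workspace SCIM Groups API. The sketch below is a rough illustration only; the host, token, and group ID are placeholders, and the PatchOp body reflects the documented SCIM format as best I recall, so verify it against the API reference before relying on it.

import requests

# Placeholders -- substitute your workspace URL, a personal access token, and the group's SCIM ID.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<personal-access-token>"
GROUP_ID = "<scim-group-id>"

# SCIM PatchOp that grants the unrestricted-cluster-creation entitlement to the group;
# removing it uses the same endpoint with a "remove" operation.
payload = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [
        {
            "op": "add",
            "path": "entitlements",
            "value": [{"value": "allow-cluster-create"}],
        }
    ],
}

resp = requests.patch(
    f"{HOST}/api/2.0/preview/scim/v2/Groups/{GROUP_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()

If the group is an account-level group (the Unity Catalog angle raised in the reply), behaviour can differ, so check which API level actually owns the group before patching.
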
Wojciech_BUK
by Valued Contributor III
  • 9290 Views
  • 2 replies
  • 1 kudos

Resolved! Unity Catalog - Lakehouse Federation: Permission to read data from foreign catalogs

I have set up a connection "SQL-SV-conn" to SQL Server and, based on that connection, I have created a foreign catalog "FC-SQL-SV". I have granted all permissions on the CATALOG to developers: USE CATALOG, USE SCHEMA, SELECT. But they cannot query a table (e.g. by running...

Administration & Architecture
Foreign Catalog
Lakehouse Federation
Unity Catalog
Latest Reply
Wojciech_BUK
Valued Contributor III
  • 1 kudos

OK, I have found the answer in the documentation below (https://learn.microsoft.com/en-us/azure/databricks/query-federation/#limitations): "Single-user access mode is only available for users that own the connection." So when I use e.g. a Job Cluster that ru...

1 More Replies
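
For readers landing here, the grants described in the question look roughly like the sketch below when run from a notebook; the catalog and group names come from the post, everything else is illustrative. The accepted answer's point still applies: on a single-user access mode cluster these grants only help the user who owns the connection, so a shared access mode cluster or a SQL warehouse is the usual way around it.

# Grants from the question, expressed as Unity Catalog SQL run via PySpark.
for stmt in [
    "GRANT USE CATALOG ON CATALOG `FC-SQL-SV` TO `developers`",
    "GRANT USE SCHEMA ON CATALOG `FC-SQL-SV` TO `developers`",
    "GRANT SELECT ON CATALOG `FC-SQL-SV` TO `developers`",
]:
    spark.sql(stmt)

# Quick smoke test against the foreign catalog (table name is a placeholder).
spark.sql("SELECT * FROM `FC-SQL-SV`.dbo.some_table LIMIT 10").show()
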
sachinw
by New Contributor II
  • 3976 Views
  • 0 replies
  • 1 kudos

Workspace Model Registry with Unity Catalog

if the "default catalog for the workspace" is to Unity Catalog, how can we access a model from the workspace model registry?I have already tried  mlflow.set_tracking_uri("databricks") but still try to find catalog in UC.  

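
No replies on this one, but a direction worth checking (a sketch, not a confirmed answer): MLflow treats the tracking URI and the registry URI as separate settings, so setting only the tracking URI leaves model lookups pointed at whatever registry is the default, which is Unity Catalog when the workspace default catalog is UC. The model name below is hypothetical.

import mlflow

mlflow.set_tracking_uri("databricks")

# Point the model registry explicitly at the legacy workspace registry ...
mlflow.set_registry_uri("databricks")

# ... or back at Unity Catalog, where models are addressed as <catalog>.<schema>.<model>:
# mlflow.set_registry_uri("databricks-uc")

# With the workspace registry selected, models are referenced by plain name (hypothetical name).
model = mlflow.pyfunc.load_model("models:/my_workspace_model/1")
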
Aria
by New Contributor III
  • 6382 Views
  • 1 reply
  • 1 kudos

Resolved! Security and vulnerability for Azure databricks clusters in Data Plane

Microsoft Defender is not supported for Azure Databricks clusters. Can someone point me to a document which describes how security vulnerabilities are reported and fixed for Azure Databricks clusters in the data plane.

Administration & Architecture
security vulnerability patch dataplane
Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

Hi Aria, can you check this link? Basically, Databricks checks for vulnerabilities, does pentesting, etc.: "Databricks will use commercially reasonable efforts to address critical vulnerabilities within 14 days, high severity within 30 days, and medium ...

leelee3000
by Databricks Employee
  • 3277 Views
  • 1 reply
  • 0 kudos

Connect to Salesforce

Curious if there's a Databricks connector for Salesforce on AWS? 

Latest Reply
Wojciech_BUK
Valued Contributor III
  • 0 kudos

There is no "databricks" connector like the on you have in Unity Fedarating e.g. for Snowflake.You can use partner ecosystem e.g. Fivetran https://www.fivetran.com/connectors/salesforceto integrate Salesforce data to your Lakehouse. You also have spa...

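
Beyond the partner connectors mentioned in the reply, a lightweight do-it-yourself option is to pull data with the open-source simple_salesforce package and land it as a table. The credentials, SOQL query, and target table below are placeholders, and this is only a sketch of the pattern, not a supported connector.

from simple_salesforce import Salesforce

# Placeholder credentials -- use a secret scope rather than literals in real code.
sf = Salesforce(
    username="user@example.com",
    password="<password>",
    security_token="<security-token>",
)

# Pull Account records via SOQL and drop the SOQL metadata key from each record.
records = sf.query_all("SELECT Id, Name, Industry FROM Account")["records"]
rows = [{k: v for k, v in rec.items() if k != "attributes"} for rec in records]

df = spark.createDataFrame(rows)
df.write.mode("overwrite").saveAsTable("main.raw.salesforce_account")  # placeholder target table
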
riturralde-p
by New Contributor II
  • 4318 Views
  • 1 reply
  • 1 kudos

How to allocate costs per SQL query?

By using system tables (system.billing.usage) I'm able to identify DBU usage per query, but I'm not able to identify who ran each query, because that is not part of the table. I'm also aware of the query history, where all the queries and who ran them are listed...

Latest Reply
riturralde-p
New Contributor II
  • 1 kudos

Thanks @Retired_mod for the reply; however, query_id is not part of the system.billing.usage table, so there is no way to join them by IDs. What my Databricks account team suggested is to join them by timestamps, since both tables contain a column like that....

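
To make the timestamp-overlap suggestion above concrete, a rough sketch of the join is below. The column names (usage_metadata.warehouse_id, compute.warehouse_id, executed_by, and so on) are my best understanding of the system-table schemas and should be verified in your workspace; any per-query split of the overlapping DBUs is necessarily an approximation.

# Overlap join between billing usage windows and queries that ran during them.
overlap_df = spark.sql("""
    SELECT
        q.statement_id,
        q.executed_by,
        u.sku_name,
        u.usage_quantity   AS dbus_in_window,
        u.usage_start_time,
        u.usage_end_time
    FROM system.billing.usage AS u
    JOIN system.query.history AS q
      ON u.usage_metadata.warehouse_id = q.compute.warehouse_id
     AND q.start_time < u.usage_end_time
     AND q.end_time   > u.usage_start_time
""")

display(overlap_df)
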
lgepp11
by New Contributor III
  • 5412 Views
  • 2 replies
  • 0 kudos

NPIP tunnel setup failure during launch

In AWS, we get this current error when spinning up the SQL warehouse or personal compute. Backend private link is enabled. Error: NPIP tunnel setup failure during launch. Please try again later and contact Databricks if the problem persists. Instance boot...

Latest Reply
User16539034020
Databricks Employee
  • 0 kudos

Hello, thanks for contacting Databricks Support. Based on the error message NPIP_TUNNEL_SETUP_FAILURE, it indicates that bootstrap failed due to network connectivity issues between the data plane and the control plane. It seems like you have already dow...

1 More Replies
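
Since the support answer points at data plane to control plane connectivity, a quick probe from a VM (or any working compute) in the same subnets can narrow things down. The endpoints below are placeholders only; substitute the regional secure cluster connectivity relay and workspace endpoints from the Databricks networking documentation for your region.

import socket

ENDPOINTS = [
    ("scc-relay.example.cloud.databricks.com", 6666),  # placeholder: secure cluster connectivity relay
    ("my-workspace.cloud.databricks.com", 443),        # placeholder: workspace / REST API endpoint
]

for host, port in ENDPOINTS:
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"OK      {host}:{port}")
    except OSError as exc:
        print(f"FAILED  {host}:{port} -> {exc}")
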
alexlod
by New Contributor III
  • 5313 Views
  • 2 replies
  • 1 kudos

How to monitor a python wheel job with Prometheus?

Hi Community, we have a Databricks job with a single Python wheel task that runs our streaming PySpark job. The job runs on a single-node compute cluster and consumes from Kafka. Our monitoring stack is Prometheus + Grafana. I want the job's metrics to ...

Latest Reply
amelia1
New Contributor II
  • 1 kudos

Hi, I'm trying to use the metrics registry object inside a UDF, but I can't because it's not serializable due to a Lock. Our goal is to be able to count the number of messages parsed, and the number of messages we can't parse (due to exceptio...

1 More Replies
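
One way around the Lock serialization problem described in the reply, sketched under assumptions: keep the Prometheus registry on the driver, count parsed and unparsed records with DataFrame expressions inside foreachBatch, and push to a Pushgateway. The gateway address, Kafka settings, and JSON schema are placeholders, with from_json standing in for whatever parsing the job actually does.

from prometheus_client import CollectorRegistry, Counter, push_to_gateway
from pyspark.sql import functions as F

registry = CollectorRegistry()
parsed_total = Counter("messages_parsed_total", "Messages parsed successfully", registry=registry)
failed_total = Counter("messages_parse_failed_total", "Messages that failed to parse", registry=registry)

def publish_metrics(batch_df, batch_id):
    # from_json yields NULL on malformed input, so a null struct marks a parse failure.
    parsed = batch_df.withColumn("payload", F.from_json(F.col("value").cast("string"), "id LONG, body STRING"))
    counts = parsed.select(
        F.count(F.when(F.col("payload").isNotNull(), 1)).alias("ok"),
        F.count(F.when(F.col("payload").isNull(), 1)).alias("bad"),
    ).first()
    parsed_total.inc(counts["ok"])
    failed_total.inc(counts["bad"])
    push_to_gateway("pushgateway.example.com:9091", job="databricks_stream", registry=registry)

(spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker.example.com:9092")  # placeholder brokers
      .option("subscribe", "events")                                  # placeholder topic
      .load()
      .writeStream
      .foreachBatch(publish_metrics)
      .start())
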
leelee3000
by Databricks Employee
  • 1454 Views
  • 0 replies
  • 0 kudos

Handling Kafka Topics with Avro Schema

Our input data resides in a Kafka topic, and we utilize the Kafka schema registry with Avro schemas. While I can retrieve the schema from the registry, I am facing challenges creating a Spark DataFrame that correctly serializes data for streaming rea...

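
No replies yet, so here is a sketch of the pattern that usually works for Confluent-encoded Avro, under stated assumptions: fetch the writer schema from the schema registry, strip the 5-byte Confluent header (magic byte plus schema id) from the Kafka value, and decode with from_avro. The registry URL, subject name, brokers, and topic are placeholders, and schema evolution across messages is ignored here.

from confluent_kafka.schema_registry import SchemaRegistryClient
from pyspark.sql import functions as F
from pyspark.sql.avro.functions import from_avro

registry = SchemaRegistryClient({"url": "https://schema-registry.example.com"})
avro_schema = registry.get_latest_version("my-topic-value").schema.schema_str

raw = (spark.readStream
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker.example.com:9092")
            .option("subscribe", "my-topic")
            .load())

decoded = (raw
    # Drop the Confluent wire-format header before handing the bytes to from_avro.
    .withColumn("avro_bytes", F.expr("substring(value, 6, length(value) - 5)"))
    .withColumn("payload", from_avro("avro_bytes", avro_schema, {"mode": "PERMISSIVE"}))
    .select("payload.*"))
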