- 854 Views
- 1 replies
- 0 kudos
Spark Executor - Parallelism Question
I was reading the book *Spark: The Definitive Guide* and came across the following statement in Chapter 2 on partitions: "If you have many partitions but only one executor, Spark will still have a parallelism of only one because there is only one computation res...
Hey @SANJAYKJ, it is correct in the sense that a single executor is a limiting factor, but the actual parallelism within that executor depends on the number of cores assigned to it. If you want to leverage multiple partitions effectively, you either n...
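A quick way to see the rule the reply describes: a stage's effective parallelism is bounded by both the partition count and the total number of executor cores. This is a minimal illustrative sketch (the helper name is made up, not a Spark API):

```python
# Illustrative helper (not a Spark API): tasks in a stage run concurrently
# only up to the total number of executor cores, no matter how many
# partitions the data has.

def effective_parallelism(num_executors: int, cores_per_executor: int,
                          num_partitions: int) -> int:
    """Upper bound on concurrently running tasks for a Spark stage."""
    total_cores = num_executors * cores_per_executor
    return min(total_cores, num_partitions)

# One executor with a single core: 200 partitions still run one at a time.
print(effective_parallelism(1, 1, 200))  # -> 1
# The same executor with 8 cores can process 8 partitions at once.
print(effective_parallelism(1, 8, 200))  # -> 8
```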
- 2193 Views
- 4 replies
- 0 kudos
Resolved! Possible to programmatically adjust Databricks instance pool more intelligently?
We'd like to adopt Databricks instance pool in order to reduce instance-acquisition times (a significant contributor to our test latency). Based on my understanding of the docs, the main levers we can control are: min instance count, max instance cou...
Hi Steve, if the goal is to pre-warm 100 instances in the Databricks Instance Pool, you could create a temporary job that requests instances from the pool. This ensures that Databricks provisions the required instances before the actual test run. T...
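Another lever worth noting, sketched below and not verified against a live workspace: the Instance Pools Edit endpoint accepts a `min_idle_instances` field, so a pre-test step can raise it to pre-warm the pool and a teardown step can lower it again. The pool id, name, and node type here are placeholders:

```python
import json

# Hedged sketch: build the request body for POST /api/2.0/instance-pools/edit,
# raising `min_idle_instances` to pre-warm the pool before a test run
# (and lowering it again afterwards). All values below are placeholders.

def build_pool_edit_payload(pool_id: str, pool_name: str,
                            node_type: str, min_idle: int) -> str:
    return json.dumps({
        "instance_pool_id": pool_id,      # placeholder id
        "instance_pool_name": pool_name,
        "node_type_id": node_type,
        "min_idle_instances": min_idle,   # e.g. 100 to pre-warm
    })

payload = build_pool_edit_payload("pool-1234", "ci-pool", "i3.xlarge", 100)
print(payload)
```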
- 3427 Views
- 1 replies
- 0 kudos
Notebook runs not found due to retention limits with dbutils.notebook.run
We saw this odd error in an AWS deployment, we have one notebook calling another one through dbutils.notebook.run(...) and this suddenly stopped working and failed with "Notebook runs not found due to retention limits", the "learn more" points to Dat...
Hey @ErikApption, maybe I am wrong, but I will give you my opinion. Each time you execute dbutils.notebook.run(), it launches a new and independent execution within the same cluster. So, if you run the cell today and then run it again tomorrow, there s...
- 10025 Views
- 3 replies
- 1 kudos
Current Azure Managed Identity capabilities 2024?
Hello everyone, I have a few questions about MI capabilities. Is it possible to define a managed identity for the Azure Databricks Service resource and use it for, e.g.: writing to an Azure SQL Server database; authenticating to Azure DevOps in order to downlo...
Kaniz, thank you very much, you are the best! I will get to work implementing your advice
- 2779 Views
- 1 replies
- 1 kudos
Resolved! Community Edition - Photon enabled possible?
Is it possible to use a Photon-enabled cluster in the Community Edition? I want to use DBR 13.3 LTS, but when choosing that there is no option to enable Photon. I want to test the spatial functionality in the Databricks library Mosaic, and it appears Photon...
Hi @sam_tw, Photon is not available in Community Edition.
- 2337 Views
- 1 replies
- 1 kudos
Resolved! Databricks app - permissions needed
Hi. I am trying to create a new Databricks app and I get the following error: "Failed to create app [appname]. User does not have permission to grant resource sql-warehouse." Can someone tell me what level of access I require in order to generate a data...
Hi @SarahA, required permissions to create a Databricks app: you'll need the following permissions in Databricks.

| Permission | Required Role / Grant | Purpose |
|---|---|---|
| CAN MANAGE on the SQL Warehouse | SQL Warehouse Admin or Owner | To manage warehouse settings and assign it to ... |
- 1580 Views
- 2 replies
- 0 kudos
Mismatch of Columns in databricks vs Athena.
We are trying to expose one of our external tables to Databricks via Unity Catalog, but we are having an issue with a column mismatch, i.e. a few of our columns are not visible in Databricks. Is this a known issue? If so, can anyone best advise me on where...
Hi @DavidSzedlak, 1. Unity Catalog caches metadata for performance. If new columns were added to the source table after the initial creation, they may not be reflected. Run the following command to refresh metadata: ALTER TABLE <catalog>.<schema>.<ta...
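Before touching metadata, it can help to diff the column list Athena reports against what Databricks shows, to pinpoint exactly which columns are missing. A small hypothetical helper (the column names below are made up):

```python
# Hypothetical diagnostic: list the columns present in the source (Athena /
# Glue) schema but missing from what Databricks displays. The comparison is
# case-insensitive, since both engines fold identifier case.

def missing_columns(source_cols, databricks_cols):
    seen = {c.lower() for c in databricks_cols}
    return [c for c in source_cols if c.lower() not in seen]

print(missing_columns(["id", "Name", "created_at"], ["ID", "name"]))
# -> ['created_at']
```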
- 1138 Views
- 3 replies
- 0 kudos
Timeout on docker pull in Databricks Container Services
Hello, there is a timeout that limits the size of images used in Databricks Container Services. When using images containing large ML libraries, the size often exceeds the limit that can be pulled. Is there any plan to add parametrization of this timeout...
Are there any new or planned changes in the policy?
- 3282 Views
- 8 replies
- 0 kudos
How do I simply disable someone's user account
I'm trying to do something seemingly very simple - disable someone's user account. I don't even want to delete the user, just disable it for the time being. How do I go about doing that?
In Databricks, go to the Admin Console. Navigate to the Service Principals section. Create a service role (make sure it has the necessary permissions, i.e. admin access). Generate an OAuth token: follow the instructions in the Databricks documentation to generate ap...
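For the original question (deactivating rather than deleting a user), the SCIM Users API supports flipping the `active` flag. Below is a sketch of the PATCH body, with the endpoint shown as a comment; treat the exact path as something to confirm against the docs for your deployment, and the user id as a placeholder:

```python
import json

# Sketch: deactivate (not delete) a workspace user by setting `active`
# to false via the SCIM API.
# PATCH /api/2.0/preview/scim/v2/Users/{user_id}

def build_deactivate_payload() -> str:
    return json.dumps({
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "replace", "path": "active", "value": False},
        ],
    })

print(build_deactivate_payload())
```

Re-enabling the account later is the same call with `"value": True`.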
- 2470 Views
- 2 replies
- 0 kudos
Proper way to collect Statement ID from JDBC Connection
Hi, we are executing DML calls on Databricks SQL Warehouse programmatically, with Java and Python. There can be thousands of executions running daily, so if an error occurs it would be very beneficial to spot the Statement ID of the ...
Found a way to extract it for the databricks-jdbc library, version `2.6.32`:

    private static String extractQueryIdFromDbxStatement(Hive42PreparedStatement statement) {
        byte[] guid = ((HiveJDBCNativeQueryExecutor) state...
- 1238 Views
- 1 replies
- 0 kudos
Secret Creation for Service Principal using API
There is an API available to create a secret for a Service Principal: /api/2.0/accounts/{account_id}/servicePrincipals/{service_principal_id}/credentials/secrets. Can anyone please help with what has to be passed as authentication for this API? This is looking a...
Hi @AnkurMittal008, in general I do not recommend using tokens anymore. Instead, if you want to log in via the Databricks CLI, you can use this command: databricks auth login --host https://accounts.azuredatabricks.net/ --account-id <YOUR_ACCOUNT_ID> This ...
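To then call the account-level endpoint from the question, the usual shape (a hedged sketch, not verified against a live account) is an OAuth access token for an account admin sent as a Bearer header. The account id, service principal id, and token below are placeholders:

```python
# Hedged sketch: build the URL and headers for the account-level secrets
# endpoint, authenticated with an OAuth access token for an account admin
# (for example one obtained after `databricks auth login`). All values
# below are placeholders.

def build_secret_request(account_id: str, sp_id: int, token: str):
    url = ("https://accounts.azuredatabricks.net/api/2.0/accounts/"
           f"{account_id}/servicePrincipals/{sp_id}/credentials/secrets")
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

url, headers = build_secret_request("11111111-2222", 4455, "<OAUTH_TOKEN>")
print(url)
```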
- 748 Views
- 1 replies
- 0 kudos
Reading delta table from ADLS gen2 using ABFS driver
Scenario - I have an ADLS Gen2 account and am trying to read a Delta table using the ABFS driver. I am using Databricks serverless compute. There are no firewalls in place as I am working with sample data. There's network line of sight between the Databricks serv...
Hi @jeet414, you can always read a Delta table from ADLS storage in Databricks via mount points: https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts This is just experimental; please give it a try. Best regards, Nivethan V
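For reading directly over ABFS without mounts, the path shape is the main thing to get right. A small helper that only builds the URI (container, account, and path are placeholders; the table itself would then be loaded with spark.read.format("delta").load(uri)):

```python
# Hypothetical helper: build the abfss:// URI for a Delta table path in an
# ADLS Gen2 account. The resulting URI is what you would pass to
# spark.read.format("delta").load(...). All names below are placeholders.

def abfss_uri(container: str, storage_account: str, path: str) -> str:
    return (f"abfss://{container}@{storage_account}"
            f".dfs.core.windows.net/{path.lstrip('/')}")

print(abfss_uri("data", "mystorageacct", "/delta/events"))
# -> abfss://data@mystorageacct.dfs.core.windows.net/delta/events
```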
- 463 Views
- 1 replies
- 0 kudos
Querying view gives spurious error
When trying to query a view, my_view, we sometimes see a spurious error. This seems to occur after the table underlying the view has been updated. The error persists for a while and then it seems to fix itself. Error running query: [42501] [Simba][Ha...
- 1233 Views
- 3 replies
- 0 kudos
Databricks Workspace Access and Permissions
Hi Team, the GCP Databricks URL https://accounts.gcp.databricks.com/ is linked to the GCP Billing Account. We have two clients with separate GCP Organizations: client1.example.com and client2.example.com. Both GCP Organizations share the sam...
@karthiknuvepro The Databricks account should be handled by a third-party cloud administration team. The workspace admins can work with them to set up the necessary cloud resources to support their catalogs and user adds/removes from their selected a...
- 13439 Views
- 2 replies
- 2 kudos
Authentication for Databricks Apps
Databricks Apps allows us to define dependencies & an entrypoint to execute a Python application like Gradio, Streamlit, etc. It seems I can also run a FastAPI application and access via an authenticated browser which is potentially a very powerful c...
Hello, thank you for your questions and answers regarding this topic. Is this feature available right now, or is it still not supported? Thank you in advance.
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (47)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)