- 107 Views
- 1 reply
- 0 kudos
Resolved! How can I run a single task in a job from the REST API?
How can I run a single task in a job that has many tasks? I can do it in the UI, but I can’t find a way to do it using the REST API. Does anyone know how to accomplish this?
Hi @200842, greetings! It is currently not possible to run a single task or a group of tasks from a job via the API without running the entire job. Leave a like if this helps; follow-ups are appreciated. Kudos, Ayushi
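For reference, a minimal sketch of what the API does support, triggering the whole job with the Jobs `run-now` endpoint; the workspace URL, token, and job ID below are placeholders:

```python
import requests

# Hypothetical values; substitute your workspace URL, PAT, and job ID.
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

# POST /api/2.1/jobs/run-now starts every task in the job; there is no
# parameter for selecting an individual task.
resp = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": 123},
)
resp.raise_for_status()
print(resp.json()["run_id"])
```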
- 184 Views
- 3 replies
- 3 kudos
Subscription and billing: Directly or via AWS?
Is there any difference in price & features if I subscribe to Databricks directly or via an AWS account? Are any startup discount programs available?
No problem, and there is no difference in pricing between subscribing to Databricks directly or via the AWS Marketplace; it's just a different billing mechanism.
- 259 Views
- 5 replies
- 1 kudos
Resolved! Payment receipts for Databricks payments
Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the finance department of the organization I am working for. The only billing information I have access to is the usage dashboards and the tables ...
- 258 Views
- 3 replies
- 2 kudos
Resolved! DLT asset bundle: Pipelines do not support setting a run_as user that is different from the owner
Hello! We're using Databricks Asset Bundles to deploy to several environments using a DevOps pipeline. The service principal running the CI/CD pipeline and creating the job (owner) is not the same as the SPN that will be running the jobs (run_as). This...
By default, whoever creates the pipeline is the owner/run_as. However, yes, this is a feature in progress. I would still encourage you to share your request here: https://docs.databricks.com/en/resources/ideas.html#ideas ...
- 194 Views
- 3 replies
- 0 kudos
Resolved! Issue in building Databricks source code on an ARM platform
Hi, I am attempting to build the Databricks Koalas source code on an ARM platform but am encountering an error. Could anyone provide specific steps for building it on ARM?
Hi @mohakgoel, You’re on the right track with the steps you’ve mentioned for building Koalas. It’s important to note that building Koalas is not the same as building the entire Databricks source code. Koalas is an open-source project that provides a ...
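As a side note, since Koalas itself is pure Python, a quick smoke test after installation is usually enough to confirm an ARM setup. This sketch assumes `pip install koalas pyspark` has already succeeded, and that any ARM build failure came from compiled dependencies rather than Koalas' own code:

```python
# Quick smoke test of a Koalas install on ARM. Koalas itself is pure
# Python; ARM build failures usually trace back to compiled
# dependencies (pyarrow, numpy), not Koalas' own code.
import databricks.koalas as ks

kdf = ks.DataFrame({"x": [1, 2, 3]})
print(kdf["x"].sum())  # expect 6
```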
- 254 Views
- 5 replies
- 0 kudos
Resolved! Using the Databricks Docker CLI image with Asset Bundles - Azure
I'm trying to deploy an asset bundle from a CI/CD pipeline, and I'd like to use the Databricks CLI Docker image for that, but I can't get it to authenticate. I'm using an Entra service principal for my deployments, and we are using TeamCity as our CI/CD tool. T...
OK, tested, and now everything is working. According to the docs, bundle settings have the highest priority: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication#auth-eval No wonder none of my env vars or .databrickscfg setting...
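One hedged way to confirm which credentials unified auth actually resolved, assuming the Databricks Python SDK is available (it follows the same precedence order as the CLI):

```python
# Inspect what unified auth resolved to. The Databricks SDK follows the
# same precedence order as the CLI, so this helps explain why env vars
# or .databrickscfg entries were ignored when the bundle pinned its own
# auth settings. Config() raises if nothing resolves at all.
from databricks.sdk.core import Config

cfg = Config()
print("host:", cfg.host)
print("auth type:", cfg.auth_type)
```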
- 1139 Views
- 1 reply
- 0 kudos
We are getting AttributeError: module 'numpy.typing' has no attribute 'NDArray' with Tidal jobs
File "/databricks/python/lib/python3.8/site-packages/PIL/_typing.py", line 10, in <module>
NumpyArray = npt.NDArray[Any]
AttributeError: module 'numpy.typing' has no attribute 'NDArray'
We are using the 10.4 runtime version. Please suggest a fix.
The file path in the above error suggests it comes from the Pillow library dependency: "/databricks/python/lib/python3.8/site-packages/PIL/_typing.py". Pillow's _typing.py checks for NumPy >=1.21, while the 10.4 runtime has these versions installed ...
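A quick notebook check of the mismatch, plus hedged workarounds; the Pillow version bound below is an assumption, so verify it against Pillow's changelog before pinning:

```python
# Confirm the mismatch: npt.NDArray was added in NumPy 1.21, and the
# NumPy shipped with DBR 10.4 predates it, so the attribute is missing.
import numpy
import numpy.typing as npt

print(numpy.__version__)
print(hasattr(npt, "NDArray"))  # False on NumPy < 1.21, matching the error

# Workarounds (pick one): upgrade NumPy, or pin Pillow to an older
# release that does not import npt.NDArray.
# %pip install "numpy>=1.21"
# %pip install "pillow==9.*"  # bound is an assumption; check Pillow's changelog
```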
- 791 Views
- 1 reply
- 1 kudos
Troubleshooting Cluster
We had a failure on a previously running fact table load (our biggest one), and it looked like an executor was failing due to a timeout error. As a test we upped the cluster size and changed spark.executor.heartbeatInterval to 300s and the spark....
The XXKDA error code is a general indicator for task scheduler issues or SPARK_JOB_CANCELLED.
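A small sketch for verifying the effective timeout settings, assuming a Databricks notebook where `spark` is predefined. Note these confs must be set in the cluster's Spark config before startup, and Spark expects `spark.network.timeout` to exceed `spark.executor.heartbeatInterval`:

```python
# Verify the effective timeout settings on the running cluster. These
# confs must be set in the cluster's Spark config before startup; Spark
# also requires spark.network.timeout > spark.executor.heartbeatInterval.
for key in ("spark.executor.heartbeatInterval", "spark.network.timeout"):
    try:
        print(key, "=", spark.conf.get(key))
    except Exception:
        print(key, "is unset (Spark default applies)")
```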
- 133 Views
- 1 reply
- 0 kudos
Community Accounts Getting Deleted Automatically
This is the second time in the same week that accounts have vanished without any prior notice. All the saved work has no backup, I guess, and no one will take responsibility because it's the Community Edition. Not a good thing to happen twice, ...
I would suggest following the steps in https://community.databricks.com/t5/support-faqs/databricks-community-sso-august-3rd-2024/ta-p/78459; if the issue persists, please contact databricks-community@databricks.com.
- 184 Views
- 1 reply
- 0 kudos
Account missing
I have been using my Community Edition account for the past two weeks. Now, while trying to log in, it shows this: Can anyone help me fix this? Thanks in advance!
Hello, @udhayakumar! You can find helpful resources for Databricks Community Edition here. If the available resources don’t resolve your concern, please submit a ticket with the Databricks Support team for further assistance.
- 309 Views
- 3 replies
- 3 kudos
Drop table - permission management
Hello, I'm trying to wrap my head around the permission management for dropping tables in UC-enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or be the owner of the schema, catalog, or meta...
Hi @PiotrM, I see there is a feature request already in place. It's being considered for the future: https://databricks.aha.io/ideas/ideas/DB-I-7480
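Until then, a minimal sketch of the documented route, granting MANAGE on the table so a non-owner can drop it. The catalog, schema, table, and group names are placeholders, and MANAGE requires a recent DBR/warehouse version:

```python
# Grant MANAGE on a specific table so a non-owner principal can drop it.
# Catalog/schema/table and group names below are placeholders.
spark.sql("""
  GRANT MANAGE ON TABLE main.analytics.orders TO `data-engineers`
""")

# The grantee can then drop the table without owning the schema/catalog:
# spark.sql("DROP TABLE main.analytics.orders")
```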
- 120 Views
- 1 reply
- 0 kudos
Databricks SQL Warehouse Hung - Queries Stuck in Queued State & No Alerts Triggered
We have been facing critical challenges with Databricks SQL Warehouse for the last four weeks. We are using Databricks SQL Warehouse ingestion from IICS, and we have observed the following issues: SQL Warehouse going into a hung state – the SQL Wareho...
Hey @sdheepak, the first thing you need to identify is the type of SQL Warehouse you are using in Databricks: • Is it Serverless? If so, it is fully managed by Databricks, and you must contact Databricks support, because you won't have access to logs in...
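Since no alerts fired, one hedged option is an external poll of the SQL Warehouses REST API to catch state or health degradation; the workspace URL, token, and warehouse ID below are placeholders:

```python
import requests

# Hypothetical values; substitute your workspace URL, PAT, warehouse ID.
HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"
WAREHOUSE_ID = "<warehouse-id>"

# Poll warehouse state/health from outside the workspace; wire this into
# your own alerting if the built-in monitoring stays silent.
resp = requests.get(
    f"{HOST}/api/2.0/sql/warehouses/{WAREHOUSE_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
info = resp.json()
print(info.get("state"), info.get("health"))
```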
- 143 Views
- 0 replies
- 0 kudos
Why is writing directly to a Unity Catalog Volume slower than to Azure Blob Storage (xarray -> zarr)?
Hi, I have some workloads whereby I need to export an xarray object to a Zarr store. My UC volume is using ADLS. I tried to run a simple benchmark and found that the UC Volume is considerably slower. a) Using an fsspec ADLS store pointing to the same containe...
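For anyone reproducing this, a sketch of the two write paths being compared, with placeholder paths and auth omitted; it assumes `xarray`, `zarr`, `fsspec`, and `adlfs` are installed. One plausible factor, hedged: the UC Volume write goes through the /Volumes FUSE mount, which can add per-file overhead to Zarr's many-small-objects layout:

```python
import time
import fsspec
import numpy as np
import xarray as xr

# Synthetic dataset for the comparison (shape is arbitrary).
ds = xr.Dataset({"t": (("x", "y"), np.random.rand(2000, 2000))})

# (a) Direct ADLS via fsspec/adlfs -- talks to storage directly.
# Account/container are placeholders; credentials (account_key or
# credential=...) are omitted here and must be supplied.
store = fsspec.get_mapper(
    "abfs://container@account.dfs.core.windows.net/bench.zarr",
    account_name="account",
)
t0 = time.time()
ds.to_zarr(store, mode="w")
print("fsspec ADLS:", time.time() - t0)

# (b) UC Volume -- goes through the /Volumes FUSE mount.
t0 = time.time()
ds.to_zarr("/Volumes/catalog/schema/volume/bench.zarr", mode="w")
print("UC Volume:", time.time() - t0)
```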
- 455 Views
- 1 reply
- 0 kudos
Can we provide a custom DNS name for a Databricks app?
Hi All, I want to access my Databricks app https://myapp.aws.databricksapps.com/ using https://myapp.mycompaney.com. Is this possible? We tried DNS mapping, but it is not working.
Hello @satniks_o, it should be possible to set up a custom DNS name, but it requires SSL and other settings. How are you setting it up?
- 100 Views
- 1 reply
- 0 kudos
Why doesn't Databricks allow setting executor metrics?
I have an all-purpose compute cluster that processes different data sets for various jobs. I am struggling to optimize executor settings like the one below: spark.executor.memory 4g. Is it allowed to override the default executor settings and specify such configurati...
Hello @marc88, as you mentioned, you can set it in the Spark config under Advanced cluster options; once the cluster boots up, it will be applied at the run level. Alternatively, you can draft a cluster policy and apply it across job computes when creating your workflow.
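To make that concrete, a small sketch of what you can and can't do from a running notebook (assumes the Databricks `spark` session is predefined):

```python
# spark.executor.memory must be set in the cluster's Spark config
# (Advanced options) before startup; it cannot be changed at runtime.
# From a notebook you can only verify what the cluster booted with:
print(spark.conf.get("spark.executor.memory", "unset"))

# spark.conf.set() only affects SQL/session-level confs; it has no
# effect on executors that already exist.
```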