Access Ganglia metrics via endpoint
How do I access the raw Ganglia metrics? Is there an endpoint, or a DBFS location they are stored in?
- 3029 Views
- 2 replies
- 0 kudos
How do I get the cost of my notebook run?
- 1248 Views
- 1 reply
- 0 kudos
Resolved! How can we pause jobs?
- 9752 Views
- 2 replies
- 1 kudos
You can update the schedule via the Jobs API; the pause setting is part of the cron schedule field.
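A minimal sketch of what that call could look like (workspace URL, token, and job ID are placeholders; this assumes the Jobs API 2.1 `pause_status` field on the cron schedule):

```python
import requests

# Placeholder values: substitute your own workspace URL, token, and job ID.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
JOB_ID = 123

# Jobs API 2.1: partially update the job, setting the schedule's pause_status.
# The job's existing cron expression and timezone must be supplied along with it.
resp = requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": JOB_ID,
        "new_settings": {
            "schedule": {
                "quartz_cron_expression": "0 0 2 * * ?",  # keep the job's current cron
                "timezone_id": "UTC",
                "pause_status": "PAUSED",                 # "UNPAUSED" to resume
            }
        },
    },
)
resp.raise_for_status()
```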
Resolved! AWS Databricks Pay-as-you-Go
Hi team, we are currently using AWS cloud services for our AI/ML project. We want to use Databricks' GPU-based service for model training. We do not find a pay-as-you-go option in the marketplace for this purpose. Can someone help me with this? Tha...
- 9605 Views
- 9 replies
- 0 kudos
See https://databricks.com/product/aws-pricing. I guess you are trying to roll out something else?
Resolved! Databricks and DDD
Our architecture follows Domain-Driven Design, so the data is distributed among different domains. We would like to run workloads on top of our data, but we want to avoid having a dedicated (duplicated) data lake just for Databric...
- 3958 Views
- 4 replies
- 4 kudos
So basically you do not want to persist data outside of your source systems. I think the so-called 'Kappa architecture' could be a fit, where everything is treated as a stream. Hubert already mentioned Kafka, which is an excellent source to build thi...
Resolved! GDPR/LGPD Compliance
How do you achieve GDPR/LGPD compliance, especially for personal data? Any suggestions?
- 3458 Views
- 4 replies
- 5 kudos
There is identifying PII data, and there is handling/storing PII data. Identifying can be done with Purview, Macie, or other tools. Those are not free, of course, so if your environment is fairly large they can be interesting; otherwise you could also do manual checks. For storin...
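On the storage side, a common erasure pattern in Delta looks roughly like the sketch below (table and column names are hypothetical): delete the subject's rows, then VACUUM so the old files that still contain them are eventually removed.

```python
# A minimal sketch for handling a GDPR/LGPD erasure request in Delta Lake.
# Table and column names are hypothetical; `spark` is the Databricks session.

user_id = "subject-123"  # identifier of the person requesting erasure

# 1. Delete the subject's rows from the Delta table.
spark.sql(f"DELETE FROM customer_data WHERE user_id = '{user_id}'")

# 2. The deleted rows still exist in older data files until they are vacuumed.
#    VACUUM removes files that are no longer referenced and older than the
#    retention period (168 hours / 7 days is the default).
spark.sql("VACUUM customer_data RETAIN 168 HOURS")
```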
Resolved! Architecture choice, streaming data
I have sensor data coming into Azure Event Hub and need some help deciding how best to ingest it into the Data Lake and Delta Lake. Option 1: Azure Event Hub > Databricks Structured Streaming > Delta Lake (bronze). Option 2: Azure Event Hub > Event Hu...
- 6343 Views
- 4 replies
- 7 kudos
If a batch job is possible and you need to process the data, I would probably use: Azure Event Hub (events since the previous job run) > Databricks job processing them as a DataFrame > save the DataFrame to Delta Lake. No streaming or capture is needed in that case.
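A rough sketch of that job-based approach, assuming Event Hubs' Kafka-compatible endpoint and placeholder names: a Structured Streaming read with an availableNow trigger picks up only the events since the previous run, because the checkpoint remembers where the last run stopped, so a scheduled job behaves like an incremental batch.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical namespace, event hub name, and connection string.
EH_NAMESPACE = "my-namespace"
EH_NAME = "sensor-events"
EH_CONN_STR = "Endpoint=sb://my-namespace.servicebus.windows.net/;..."

# Event Hubs exposes a Kafka-compatible endpoint, so Spark's Kafka source can read it.
kafka_options = {
    "kafka.bootstrap.servers": f"{EH_NAMESPACE}.servicebus.windows.net:9093",
    "subscribe": EH_NAME,
    "kafka.sasl.mechanism": "PLAIN",
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.jaas.config": (
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{EH_CONN_STR}";'
    ),
}

df = spark.readStream.format("kafka").options(**kafka_options).load()

# availableNow processes everything new since the last checkpoint, then stops.
(
    df.selectExpr("CAST(value AS STRING) AS body", "timestamp")
    .writeStream.format("delta")
    .option("checkpointLocation", "/mnt/bronze/_checkpoints/sensor-events")
    .trigger(availableNow=True)
    .toTable("bronze_sensor_events")
)
```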
- 2197 Views
- 2 replies
- 1 kudos
A great place to learn more about Databricks integrations with AWS services is https://www.databricks.com/aws. There is information on this page regarding integrations with Glue, SageMaker, Redshift, and others. Many of these pages also point to our bl...
- 9257 Views
- 3 replies
- 0 kudos
Multiple writers (inserts/appends) present no problems with Delta. You can have two users appending data to a Delta table at the same time without issue. Updates, deletes, merges, and compaction can run into issues when multiple users are trying to d...
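A small illustration of that behaviour (the table path is hypothetical): two blind appends running in parallel both commit cleanly, while overlapping updates, deletes, merges, or compaction are the operations that can conflict.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical table; the two "writers" here are just two append jobs.
path = "/mnt/demo/events_delta"

writer_a = spark.range(0, 1000).withColumnRenamed("id", "event_id")
writer_b = spark.range(1000, 2000).withColumnRenamed("id", "event_id")

# Appends are blind inserts: they only add new files and never rewrite existing
# ones, so running these from two clusters/users at the same time is safe.
writer_a.write.format("delta").mode("append").save(path)
writer_b.write.format("delta").mode("append").save(path)

# Updates/deletes/merges/compaction rewrite files; if two of them overlap, Delta's
# optimistic concurrency check fails one transaction and that writer must retry.
```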
- 1309 Views
- 1 reply
- 0 kudos
Not as of this comment, but our product team is aware of this request and is working on getting it implemented soon! Keep in mind that for feature requests and direct updates/communication from our product team, you can post here (or just vote up o...
- 1187 Views
- 1 reply
- 0 kudos
The admin console exists within the workspace and lets you control access and privileges for that specific workspace. An existing admin can get to it from the drop-down at the very top right by selecting Admin Console. The first screen you'll land o...
- 979 Views
- 1 reply
- 0 kudos
Anything that can reach the control plane and use the SCIM API should work. For Azure AD Premium, there is specifically an enterprise app that does this for the customer.
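A quick way to confirm the SCIM endpoint is reachable (workspace URL and token are placeholders) is a plain REST call that lists users:

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<admin-personal-access-token>"

# SCIM 2.0 Users endpoint; any identity provider (Azure AD, Okta, a custom script)
# that can reach the control plane and speak SCIM can provision against it.
resp = requests.get(
    f"{HOST}/api/2.0/preview/scim/v2/Users",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/scim+json",
    },
)
resp.raise_for_status()
for user in resp.json().get("Resources", []):
    print(user.get("userName"))
```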
- 1800 Views
- 1 reply
- 1 kudos
Ganglia metrics are available only if the job runs for more than 15 minutes. For jobs that complete within 15 minutes, the metrics won't be available.
- 1604 Views
- 1 reply
- 1 kudos
As of June 2021, no. However, Public Preview features are stable, intended to advance to GA, and fully supported by Databricks Support.
Resolved! Databricks SQL dashboard refresh
In Databricks SQL, can you prohibit a dashboard from being refreshed?
- 1439 Views
- 1 reply
- 0 kudos
It looks like this can be done by not granting CAN_RUN to a user/group: https://docs.databricks.com/sql/user/security/access-control/dashboard-acl.html#dashboard-permissions
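If you wanted to script that, a sketch along these lines could work, assuming the legacy Databricks SQL permissions endpoint described in the linked docs (the dashboard ID and group name are placeholders): granting a group only CAN_VIEW, and not CAN_RUN, lets members open the dashboard without refreshing it.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
DASHBOARD_ID = "<dashboard-id>"  # placeholder

# Assumed legacy SQL permissions endpoint: replaces the dashboard's ACL so the
# group gets only CAN_VIEW (no CAN_RUN), i.e. members can open but not refresh it.
resp = requests.post(
    f"{HOST}/api/2.0/preview/sql/permissions/dashboards/{DASHBOARD_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "access_control_list": [
            {"group_name": "analysts", "permission_level": "CAN_VIEW"}
        ]
    },
)
resp.raise_for_status()
```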
Labels: Access control (1), Access Delta Tables (2), ActiveDirectory (1), AmazonKMS (1), Apache spark (1), App (1), Availability (1), Availability Zone (1), AWS (5), Aws databricks (1), AZ (1), Azure (8), Azure Data Lake Storage (1), Azure databricks (6), Azure databricks workspace (1), Best practice (1), Best Practices (2), Billing (2), Bucket (1), Cache (1), Change (1), Checkpoint (1), Checkpoint Path (1), Cluster (1), Cluster Pools (1), Clusters (1), ClustersJob (1), Compliance (1), Compute Instances (1), Cost (1), Credential passthrough (1), Data (1), Data Ingestion & connectivity (6), Data Plane (1), Databricks Account (1), Databricks Control Plane (1), Databricks Error Message (2), Databricks Partner (1), Databricks Repos (1), Databricks Runtime (1), Databricks SQL (3), Databricks SQL Dashboard (1), Databricks workspace (1), DatabricksJobs (1), DatabricksLTS (1), DBFS (1), DBR (3), Dbt (1), Dbu (3), Deep learning (1), DeleteTags Permissions (1), Delta (4), Delta Sharing (1), Delta table (1), Dev (1), Different Instance Types (1), Disaster recovery (1), DisasterRecoveryPlan (1), DLT Pipeline (1), EBS (1), Email (2), External Data Sources (1), Feature (1), GA (1), Ganglia (3), Ganglia Metrics (2), GangliaMetrics (1), GCP (1), GCP Support (1), Gdpr (1), Gpu (2), Group Entitlements (1), HIPAA (1), Hyperopt (1), Init script (1), InstanceType (1), Integrations (1), IP Addresses (1), IPRange (1), Job (1), Job Cluster (1), Job clusters (1), Job Run (1), JOBS (1), Key (1), KMS (1), KMSKey (1), Lakehouse (1), Limit (1), Live Table (1), Log (2), LTS (3), Metrics (1), MFA (1), ML (1), Model Serving (1), Multiple workspaces (1), Notebook Results (1), Okta (1), On-premises (1), Partner (57), Pools (1), Premium Workspace (1), Public Preview (1), Redis (1), Repos (1), Rest API (1), Root Bucket (2), SCIM API (1), Security (1), Security Group (1), Security Patch (1), Service principal (1), Service Principals (1), Single User Access Permission (1), Sns (1), Spark (1), Spark-submit (1), Spot instances (1), SQL (1), Sql Warehouse (1), Sql Warehouse Endpoints (1), Ssh (1), Sso (2), Streaming Data (1), Subnet (1), Sync Users (1), Tags (1), Team Members (1), Thrift (1), TODAY (1), Track Costs (1), Unity Catalog (1), Use (1), User (1), Version (1), Vulnerability Issue (1), Welcome Email (1), Workspace (2), Workspace Access (1)