- 2454 Views
- 2 replies
- 0 kudos
Disable personal compute for everyone including workspace admins
If we disable the personal compute feature in the account console, it gets deactivated only for non-admin users, but admin users are still able to create personal compute clusters. Is there a way to restrict it for everyone? If not, can you raise a feature requ...
- 0 kudos
@Debayan I am talking about the personal compute feature here, not the way clusters are created. If the personal compute feature is set to delegate, it should be disabled for workspace admin users as well. If this is not supported, it's good to have a fea...
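For anyone auditing this in the meantime, one thing that can be checked is who currently has CAN_USE on the Personal Compute policy via the Permissions API; note this only lists explicit grants and does not itself restrict workspace admins. A rough sketch, assuming placeholder workspace URL, token, and policy ID:

```python
# Hypothetical check of who can use the Personal Compute policy via the
# Permissions API (GET /api/2.0/permissions/cluster-policies/{policy_id}).
# Workspace URL, token, and policy ID below are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
POLICY_ID = "<personal-compute-policy-id>"

resp = requests.get(
    f"{HOST}/api/2.0/permissions/cluster-policies/{POLICY_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Print each principal with an explicit grant on the policy.
for entry in resp.json().get("access_control_list", []):
    principal = (
        entry.get("user_name")
        or entry.get("group_name")
        or entry.get("service_principal_name")
    )
    levels = [p["permission_level"] for p in entry.get("all_permissions", [])]
    print(principal, levels)
```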
- 1220 Views
- 0 replies
- 0 kudos
Cloning GitHub Repository Accessibility
Hi all, I am trying to clone a GitHub repository from a Databricks notebook, but I am running into accessibility issues. I have already linked GitHub in the User Settings, but the command still does not work in the notebook.
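A minimal sketch of one way to clone inside a notebook, assuming a GitHub personal access token stored in a secret scope (the scope, key, and repository names are placeholders, and dbutils is only available inside Databricks notebooks):

```python
# Rough sketch: clone over HTTPS inside the notebook using a GitHub personal
# access token kept in a (hypothetical) secret scope named "github".
import subprocess

token = dbutils.secrets.get(scope="github", key="pat")  # placeholder scope/key

# <org>/<repo> are placeholders for the repository you actually need.
subprocess.run(
    ["git", "clone", f"https://{token}@github.com/<org>/<repo>.git", "/tmp/repo"],
    check=True,
)

# List the cloned contents to confirm the clone succeeded.
print(subprocess.run(["ls", "/tmp/repo"], capture_output=True, text=True).stdout)
```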
- 2579 Views
- 1 replies
- 0 kudos
how to upgrade pip associated with the default python
We have a job scheduled and submitted via Airflow to Databricks using the API api/2.0/jobs/runs/submit. Each time the job runs, an ephemeral cluster is launched, and during the process a virtual env named /local_disk0/.ephemeral_nfs/cluster_librarie...
- 0 kudos
Hi, I found an interesting article on the same topic. You can follow it and let us know if it helps. Please tag @Debayan in your next comment, which will notify me!
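A hedged sketch of one approach: upgrading pip for the interpreter the job runs under from within the job itself; for a cluster-wide upgrade the same command would usually go into a cluster init script instead:

```python
# Minimal sketch: upgrade pip for the Python interpreter the current job or
# notebook is running under. For a cluster-wide upgrade, this command is
# typically placed in a cluster init script rather than in the job code.
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "--upgrade", "pip"])

# Confirm the resulting pip version.
print(subprocess.run([sys.executable, "-m", "pip", "--version"],
                     capture_output=True, text=True).stdout)
```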
- 3569 Views
- 1 replies
- 1 kudos
How to connect to an on-premises Oracle database from Databricks to extract data
Hello, how do I connect Databricks Enterprise to an on-premises Oracle database, and what permissions are necessary? Thank you.
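A rough sketch of what a JDBC read from an on-premises Oracle database can look like, assuming network connectivity from the cluster, the Oracle JDBC driver (ojdbc8) installed as a cluster library, and placeholder host, service, table, and credentials:

```python
# Hedged example: read an Oracle table over JDBC from a Databricks notebook,
# where `spark` is the pre-created SparkSession. All connection values are
# placeholders.
jdbc_url = "jdbc:oracle:thin:@//<oracle-host>:1521/<service-name>"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "<schema>.<table>")
    .option("user", "<username>")
    .option("password", "<password>")
    .option("driver", "oracle.jdbc.driver.OracleDriver")
    .load()
)

df.show(5)
```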
- 1896 Views
- 0 replies
- 0 kudos
Using map_filter / filter to return only map elements for a given predicate
I want to apply a filter to a map structure (on a column called "ActivityMap") to keep only the elements where a given predicate holds. Representative data is below. Applying both "map_filter" and "array_contains" will return rows where the predicate holds, ho...
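A small self-contained example of how map_filter keeps only the matching map entries (rather than just flagging the row); the column contents and keys are invented for illustration:

```python
# map_filter returns a new map containing only the entries for which the
# predicate is true, so the result is element-level, not row-level.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [({"click": 3, "view": 10},), ({"click": 0, "view": 2},)],
    ["ActivityMap"],
)

filtered = df.withColumn(
    "FilteredMap", F.map_filter("ActivityMap", lambda k, v: v > 2)
)
filtered.show(truncate=False)
```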
- 1030 Views
- 0 replies
- 0 kudos
Is it possible to embed the Databricks Workspace platform in my site?
I want to use Databricks on my site because there is some functionality the user needs to access from my website, but I want the user to use the Databricks workspace. Is it possible to embed it or do something like that?
- 1850 Views
- 0 replies
- 0 kudos
Questions about the new Serverless SQL
I have a list of questions about the Serverless SQL option: As per the docs, it is hosted by Databricks. Is there any exception to that? Will it ever create EC2 instances on AWS for the serverless option? Are any serverless assets stored on custo...
- 2078 Views
- 1 replies
- 1 kudos
Monitor all Streaming jobs to make sure they are in RUNNING status.
Hi Experts, is there any way we can monitor all our Streaming jobs in the workspace to make sure they are in "RUNNING" status? I can see there is one option to create a batch job that runs frequently and checks the status (through the REST API) of all str...
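A hedged sketch of the batch-check approach described in the question, using the Jobs API runs/list endpoint with active_only=true and comparing the result against an expected set of job IDs (workspace URL, token, and job IDs are placeholders; pagination is omitted):

```python
# List currently active runs via GET /api/2.1/jobs/runs/list and flag any
# expected streaming job that has no active run.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
EXPECTED_STREAMING_JOB_IDS = {111, 222, 333}  # placeholder job IDs

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"active_only": "true"},
)
resp.raise_for_status()

running_job_ids = {run["job_id"] for run in resp.json().get("runs", [])}
missing = EXPECTED_STREAMING_JOB_IDS - running_job_ids
print("Jobs without an active run:", missing or "none")
```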
- 5926 Views
- 2 replies
- 4 kudos
Can we apply multiple exam discount vouchers to one exam?
I have a 75% off certification exam voucher under my work email, and I also noticed that you can redeem a 25% off voucher from the rewards store. I am wondering if you can apply these 2 vouchers together at the same time so that you can get 1 exam 1...
- 4 kudos
Hi chenqian0562, could you please share the solution with the community? I have the same issue.
- 1857 Views
- 0 replies
- 0 kudos
How can I resolve a problem between AWS and Databricks platform?
This is Algograp Co., Ltd., a partner of Databricks. There was a problem in the process of subscribing to Databricks and linking our AWS account. I had a problem using my existing platform and canceled my subscription. Account activation is not possible du...
- 6780 Views
- 0 replies
- 0 kudos
DLT use case
Is Delta Live Tables (DLT) appropriate for data that is in the millions of rows and GB-sized? Or is DLT only optimal for larger data with billions of rows and TB-sized? Please consider the Total Cost of Ownership: Development costs (engineering time), O...
- 3893 Views
- 1 replies
- 1 kudos
Call a workspace notebook from a repository notebook
We have a Databricks workspace with several repositories. We'd like to have a place with shared configuration variables that can be accessed by notebooks in any repository. I created a folder named Shared under the workspace root, and in that folder, c...
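A minimal sketch of calling a workspace notebook by its absolute path from a Repo notebook; the path /Shared/config_vars is a placeholder and assumes that notebook returns a value via dbutils.notebook.exit:

```python
# Run a workspace notebook by absolute path from inside a Repo notebook.
# dbutils is available implicitly in Databricks notebooks; the 60 is a
# timeout in seconds.
result = dbutils.notebook.run("/Shared/config_vars", 60)
print(result)
```

Alternatively, %run with the same absolute path (including the leading slash) executes the shared notebook in the current session, so its variables become available directly without a return value.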
- 2031 Views
- 1 replies
- 0 kudos
PySpark API reference
All, I am using Azure Databricks, and at times I refer to the PySpark APIs to interact with data in Azure Data Lake using Python and SQL, here: https://spark.apache.org/docs/3.5.0/api/python/reference/pyspark.sql/index.html Does the Databricks website have the list o...
- 1582 Views
- 0 replies
- 0 kudos
Partitioning or Processing: Reading a CSV file of 5 to 9 GB
Hi Team, would you please guide me on the following, for an instance with 28 GB and 8 cores:
1. How does Databricks read 5 to 9 GB files from BLOB storage? (Is the full file loaded directly into one node's memory?)
2. How many tasks will be created based on the cores? How many executors wil...
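A quick way to see how Spark actually splits a large CSV, rather than guessing: read it and inspect the number of input partitions (roughly the number of tasks in the first read stage). The storage path below is a placeholder:

```python
# An uncompressed CSV is splittable, so Spark reads it in chunks of about
# spark.sql.files.maxPartitionBytes rather than loading the whole file into
# one node's memory. `spark` is the notebook's pre-created SparkSession.
df = spark.read.option("header", "true").csv(
    "abfss://<container>@<account>.dfs.core.windows.net/data/large_file.csv"
)

print("Input partitions:", df.rdd.getNumPartitions())
print("Max partition bytes:", spark.conf.get("spark.sql.files.maxPartitionBytes"))
```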
- 7517 Views
- 1 replies
- 1 kudos
Set default database thru Cluster Spark Configuration
Set the default catalog (AKA default SQL database) in a cluster's Spark configuration. I've tried the following: spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod") - this works in a notebook but doesn't do anything in the cluster. spark.sq...
- 1 kudos
I've tried different commands in the cluster's Spark config; none work. They execute at cluster startup without any errors shown in the logs, but once you run a notebook attached to the cluster, the default catalog is still set to 'default'.
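A hedged workaround sketch, assuming you are willing to store the desired default schema under an invented custom Spark config key (spark.myapp.defaultDatabase) in the cluster configuration and apply it at the top of each notebook, since the session-level default does not appear to carry over from cluster startup:

```python
# Read the schema name from a custom cluster Spark config key (the key name
# is hypothetical) and switch the session to it; falls back to the schema
# mentioned in the post if the key is not set.
default_db = spark.conf.get("spark.myapp.defaultDatabase", "cbp_reporting_gold_preprod")
spark.sql(f"USE {default_db}")
print(spark.catalog.currentDatabase())
```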