- 5787 Views
- 2 replies
- 4 kudos
Can we apply multiple exam discount vouchers to one exam?
I have a 75% off certification exam voucher tied to my work email, and I also noticed that you can redeem a 25% off voucher from the rewards store. I am wondering if you can apply these 2 vouchers together at the same time so that you can get 1 exam 1...
- 4 kudos
Hi chenqian0562, could you please share the solution with the community? I have the same issue.
- 1791 Views
- 0 replies
- 0 kudos
How can I resolve a problem between AWS and the Databricks platform?
This is Algograp Co., Ltd., a partner of Databricks. There was a problem in the process of subscribing to Databricks and linking our AWS account. I had a problem using my existing platform and canceled my subscription. Account activation is not possible du...
- 6723 Views
- 0 replies
- 0 kudos
DLT use case
Is Delta Live Tables (DLT) appropriate for data in the millions of rows and gigabytes in size? Or is DLT only optimal for larger data with billions of rows and terabytes in size? Please consider the total cost of ownership: development costs (engineering time), o...
- 3814 Views
- 1 replies
- 1 kudos
Call a workspace notebook from a repository notebook
We have a Databricks workspace with several repositories. We'd like to have a place with shared configuration variables that can be accessed by notebooks in any repository. I created a folder named Shared under the root workspace and in that folder, c...
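A minimal sketch of the usual options, assuming a shared workspace notebook at the hypothetical path /Shared/config_vars that defines the configuration variables: %run pulls the callee's variables into the caller's scope, while dbutils.notebook.run executes it in a separate context and can only hand back a single string.

```python
# Minimal sketch inside a Databricks repo notebook; /Shared/config_vars is a
# hypothetical workspace notebook that defines variables such as `storage_root`.

# Option 1: %run executes the target notebook in this notebook's scope, so its
# variables become available directly. (Run this as its own cell.)
#   %run /Shared/config_vars
#   print(storage_root)

# Option 2: dbutils.notebook.run executes the target in a separate context and
# can only return a single string via dbutils.notebook.exit(...).
returned_value = dbutils.notebook.run("/Shared/config_vars", 60)
print(returned_value)
```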
- 1956 Views
- 1 replies
- 0 kudos
PySpark API reference
All, I am using Azure Databricks and at times I refer to the PySpark APIs to interact with data in Azure Data Lake using Python and SQL, here: https://spark.apache.org/docs/3.5.0/api/python/reference/pyspark.sql/index.html. Does the Databricks website have the list o...
- 1537 Views
- 0 replies
- 0 kudos
Partitioning or Processing: Reading a CSV file of 5 to 9 GB
Hi Team, would you please guide me on the following, for an instance with 28 GB of memory and 8 cores: 1. How does Databricks read 5 to 9 GB files from Blob storage? (Is the full file loaded directly into one node's memory?) 2. How many tasks will be created based on the cores? How many executors wil...
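For context, Spark does not pull the whole CSV into one node's memory: a splittable (uncompressed) CSV is divided into input partitions of roughly spark.sql.files.maxPartitionBytes (128 MB by default), each partition becomes one task, and an 8-core worker runs 8 tasks at a time. A minimal sketch with a hypothetical ABFSS path:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical path: a 5-9 GB CSV accessed from Blob storage via abfss://
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "false")   # avoid an extra full pass over a large file
    .csv("abfss://container@account.dfs.core.windows.net/data/big_file.csv")
)

# Each input partition (~128 MB by default) becomes one task; with 8 cores,
# 8 tasks run concurrently until all partitions are processed.
print("max partition bytes:", spark.conf.get("spark.sql.files.maxPartitionBytes"))
print("number of input partitions:", df.rdd.getNumPartitions())
```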
- 7341 Views
- 1 replies
- 1 kudos
Set default database through Cluster Spark Configuration
Set the default catalog (a.k.a. default SQL database) in a cluster's Spark configuration. I've tried the following: spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod") - this works in a notebook but doesn't do anything in the cluster. spark.sq...
- 1 kudos
I've tried different commands in the cluster's Spark config; none work. They execute at cluster startup without any errors shown in the logs, but once you run a notebook attached to the cluster, the default catalog is still set to 'default'.
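Since the cluster-level Spark config did not take effect here, a minimal per-notebook fallback sketch is to select the schema explicitly at the top of each notebook (or in one shared notebook that the others pull in with %run):

```python
# Per-notebook fallback sketch, run in a Databricks notebook where `spark` is
# predefined: set the schema explicitly instead of relying on cluster config.
spark.sql("USE cbp_reporting_gold_preprod")        # schema name from the question
print(spark.catalog.currentDatabase())             # should print: cbp_reporting_gold_preprod
```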
- 4099 Views
- 4 replies
- 1 kudos
Resolved! My exam was suspended by an unprofessional proctor - need help to reschedule
Case #00378268. I passed the Spark Developer Associate exam about 6 months ago with a great experience. However, this time the proctor did not even bother to show up to start the exam - checking my ID, the room, and the surroundings. Somehow, I was able to st...
- 1967 Views
- 1 replies
- 2 kudos
The base provider of Delta Sharing Catalog system does not exist.
I have enabled system tables in Databricks by following the procedure mentioned here. The owner of the system catalog is the System user. I cannot see the schemas or tables of this catalog. It is showing me the error: The base provider of Delta Sharing C...
- 2 kudos
I have already enabled all these schemas using the Databricks CLI command. After enabling, I was able to see all the tables and data inside these schemas. Then I disabled all the schemas using the CLI command mentioned here. Now, even after re-en...
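A hedged sketch of double-checking the enablement state over the REST API, using the system-schema endpoints described in the Databricks system tables documentation; the host, token, and metastore ID below are placeholders:

```python
import requests

# Hedged sketch: list system schemas and re-enable one via the REST API.
# Endpoint paths follow the system tables docs; adjust if your API version differs.
host = "https://<workspace-host>"
token = "<personal-access-token>"
metastore_id = "<metastore-id>"
headers = {"Authorization": f"Bearer {token}"}

# List system schemas and their availability/enablement state.
resp = requests.get(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas",
    headers=headers,
)
print(resp.status_code, resp.json())

# Re-enable a schema (e.g. 'access'); disabling uses DELETE on the same path.
resp = requests.put(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/access",
    headers=headers,
)
print(resp.status_code, resp.text)
```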
- 2148 Views
- 1 replies
- 1 kudos
Delivering audit logs to multiple S3 buckets
Hi! Am I able to configure delivery of Databricks audit logs to multiple S3 buckets (on different AWS accounts)? Thanks in advance!
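Assuming the account-level log delivery API for AWS, one hedged approach is to create a separate audit-log delivery configuration per target bucket, each with its own credential and storage configuration; the account ID, credentials, and configuration IDs below are placeholders:

```python
import requests

# Hedged sketch: one audit-log delivery configuration per target bucket /
# AWS account, via the Databricks account log delivery API. All IDs are placeholders.
account_id = "<databricks-account-id>"
auth = ("<account-admin-user>", "<password-or-token>")   # or an OAuth bearer header

def create_audit_delivery(config_name, credentials_id, storage_configuration_id):
    body = {
        "log_delivery_configuration": {
            "config_name": config_name,
            "log_type": "AUDIT_LOGS",
            "output_format": "JSON",
            "credentials_id": credentials_id,
            "storage_configuration_id": storage_configuration_id,
        }
    }
    resp = requests.post(
        f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/log-delivery",
        auth=auth,
        json=body,
    )
    print(resp.status_code, resp.text)

# One configuration per destination bucket.
create_audit_delivery("audit-to-security-account", "<cred-id-1>", "<storage-config-id-1>")
create_audit_delivery("audit-to-analytics-account", "<cred-id-2>", "<storage-config-id-2>")
```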
- 4699 Views
- 2 replies
- 0 kudos
How to Create a Databricks Notebook Using the API
import requests
import json
# Databricks workspace API URL
databricks_url = "https://dbc-ab846cbe-f48b.cloud.databricks.com/api/2.0/workspace/import"
# Databricks API token (generate one from your Databricks account)
databricks_token = "xxxxxxxxxxxxxxxxxx...
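A hedged completion of the snippet above, posting to the Workspace API's /api/2.0/workspace/import endpoint; the workspace URL, token, and target notebook path are placeholders:

```python
import base64
import json
import requests

# Hedged sketch: create a notebook via the Workspace import API.
databricks_url = "https://<workspace-host>/api/2.0/workspace/import"
databricks_token = "<personal-access-token>"

notebook_source = "print('hello from an API-created notebook')"

payload = {
    "path": "/Users/someone@example.com/api_created_notebook",  # hypothetical target path
    "format": "SOURCE",
    "language": "PYTHON",
    "content": base64.b64encode(notebook_source.encode("utf-8")).decode("utf-8"),
    "overwrite": True,
}

response = requests.post(
    databricks_url,
    headers={"Authorization": f"Bearer {databricks_token}"},
    data=json.dumps(payload),
)
print(response.status_code, response.text)
```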
- 1234 Views
- 1 replies
- 1 kudos
- 1 kudos
Hello, Thanks for contacting Databricks Support. It appears you're employing a CloudFormation template to establish a Databricks workspace. The recommended method for creating workspaces is through the AWS Quick Start. Please refer to the documenta...
- 1263 Views
- 0 replies
- 0 kudos
Databricks access to Microsoft SQL Server
Hi, I am facing the below error while accessing Microsoft SQL Server. Please suggest what permissions I need to check at the database level. I have the scope and secret created and the Key Vault set up as expected. I suspect a DB permission issue. Error: com.mi...
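A minimal JDBC read sketch with a secret-backed password (server, database, scope, key, and table names are placeholders); if it still fails with a login error, the SQL login typically also needs at least CONNECT on the database and SELECT on the target table:

```python
# Hedged sketch, run in a Databricks notebook where `spark` and `dbutils` exist:
# read from SQL Server over JDBC with a password pulled from a secret scope.
jdbc_url = (
    "jdbc:sqlserver://<server-name>.database.windows.net:1433;"
    "database=<database-name>;encrypt=true;trustServerCertificate=false;"
)

password = dbutils.secrets.get(scope="<scope-name>", key="<sql-password-key>")

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.some_table")        # hypothetical table
    .option("user", "<sql-login>")
    .option("password", password)
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.show(5)
```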
- 4616 Views
- 5 replies
- 2 kudos
Resolved! URGENT - Databricks Certification Exam Suspended
Hi Team, I scheduled my exam today and showed the room to the proctor. They said the lighting was dull, but I turned on the lights and moved to a better place. They again wanted me to show the room and then suspended the exam. Please help me ASAP. Webassessor ID: npt.senthil@gmail.com
- 2 kudos
Thanks, I got the rescheduled invite. Thanks very much!
- 1993 Views
- 0 replies
- 1 kudos
Connect to a Delta table from an MLflow pyfunc serving endpoint
Hi, I'm creating an MLflow pyfunc serving endpoint and I would like to connect to a Delta table to retrieve some information within the pyfunc. Is this possible? I ask because I don't think the serving endpoint environment has access to Spark, and we...
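One hedged option, since the serving container has no Spark session, is to query the Delta table through a SQL warehouse with the Databricks SQL connector inside the pyfunc; the hostname, HTTP path, token environment variables, and table name below are placeholders:

```python
import os
import mlflow.pyfunc
from databricks import sql   # databricks-sql-connector; add it to the model's pip requirements

class DeltaLookupModel(mlflow.pyfunc.PythonModel):
    """Hedged sketch: query a Delta table via a SQL warehouse from inside a
    pyfunc model, since the serving environment has no Spark session.
    Hostname, HTTP path, token env vars, and table name are placeholders."""

    def predict(self, context, model_input):
        with sql.connect(
            server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
            http_path=os.environ["DATABRICKS_HTTP_PATH"],
            access_token=os.environ["DATABRICKS_TOKEN"],
        ) as connection:
            with connection.cursor() as cursor:
                cursor.execute("SELECT * FROM main.default.lookup_table LIMIT 10")
                rows = cursor.fetchall()
        return rows
```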