- 2437 Views
- 2 replies
- 0 kudos
Enable system schemas
Hello all, I'm new to Databricks and have an issue with enabling system schemas. When I run the API call to check system schema status in the metastore, I see that all schemas are in the "Unavailable" state (except "information_schema", which is "ENABLE_COMPLETED"). Is ...
- 6399 Views
- 4 replies
- 1 kudos
Databricks Job Failure + ServiceNow Integration
Hi Team, could you please suggest how to raise a ServiceNow ticket in case of a Databricks job failure? Regards, Phanindra
- 1 kudos
Hi, can this JSON response to ServiceNow be edited before being sent? What are the different ways it can be edited?
- 6601 Views
- 8 replies
- 2 kudos
Expose Delta table data to Salesforce - OData?
Hi, looking for suggestions to stream on-demand data from Databricks Delta tables to Salesforce. Is OData a good option?
- 2 kudos
Hey, I think this might help: https://www.salesforce.com/uk/news/press-releases/2024/04/25/zero-copy-partner-network/
- 5532 Views
- 7 replies
- 0 kudos
How to get databricks performance metrics programmatically?
How can I retrieve all Databricks performance metrics on an hourly basis? Is there a recommended method or API available for retrieving performance metrics?
- 0 kudos
The Spark logs are available through cluster logging, which is enabled at the cluster level and lets you choose the destination for the logs. Just a heads up: interpreting them at scale is not trivial. I'd recommend having a read through the Overwatch...
- 3457 Views
- 4 replies
- 1 kudos
Auto Loader in file notification mode to get files from S3 on AWS - Error
I configured Auto Loader in file notification mode to get files from S3 on AWS:
spark.readStream
  .format("cloudFiles")
  .option("cloudFiles.format", "json")
  .option("cloudFiles.inferColumnTypes", "true")
  .option("cloudFiles.schemaLocation", "dbfs:/au...
- 1 kudos
In case anyone else stumbles across this, I was able to fix my issue by setting up an instance profile with the file notification permissions and attaching the instance profile to the job cluster. It wasn't clear from the documentation that the file ...
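For reference, the fix described above can be sketched as a stream configuration. This is a minimal sketch, assuming a job cluster that already has an instance profile with the S3/SQS/SNS file notification permissions attached; the schema location and bucket paths here are hypothetical placeholders, not the poster's actual values.

```python
# Sketch of Auto Loader in file notification mode (assumes the job cluster
# has an instance profile granting the file notification permissions).
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.inferColumnTypes", "true")
    # Switch from directory listing to file notification mode.
    .option("cloudFiles.useNotifications", "true")
    .option("cloudFiles.schemaLocation", "dbfs:/example/schema")  # hypothetical path
    .load("s3://example-bucket/landing/")  # hypothetical bucket
)
```

This is a configuration fragment that only runs on a Databricks cluster; the key point is that `cloudFiles.useNotifications` is what activates notification mode, and the cluster identity (here, the instance profile) is what needs the extra permissions.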
- 6541 Views
- 4 replies
- 3 kudos
[DeltaTable] Usage with Unity Catalog (ParseException)
Hi, I'm migrating my workspaces to Unity Catalog and the application to use three-level notation (catalog.database.table). See: Tutorial: Delta Lake | Databricks on AWS. I'm having the following exception when trying to use DeltaTable.forName(string name...
- 3 kudos
Thank you for the quick feedback, @saipujari_spark. Indeed, it's working great within a notebook on Databricks Runtime 13.2, which most likely has custom behavior for Unity Catalog. It's not working in my Scala application running locally with dire...
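For anyone hitting the same thing locally: a plain local Spark application needs Delta's session extensions configured explicitly, and even then this only enables Delta itself, not Unity Catalog; resolving three-level UC names from outside Databricks generally goes through Databricks Connect instead. A minimal sketch of the local Delta configuration (the two config keys are the standard Delta Lake ones):

```python
from pyspark.sql import SparkSession

# Standard Delta Lake configuration for a local SparkSession.
# Note: this enables Delta table support, not Unity Catalog; three-level
# catalog.schema.table names from a local app typically require
# Databricks Connect rather than plain Spark.
spark = (
    SparkSession.builder
    .appName("local-delta")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)
```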
- 1660 Views
- 3 replies
- 0 kudos
Supporting Material for self-paced Data Analysis with Databricks Course
Hi all, newbie here. Any idea where I can find the supporting materials used in the online "Data Analysis with Databricks" course that the instructor is using? They seem to include the scripts to create schemas, tables, etc. Thanks in advance.
- 0 kudos
Hello @BSalla, here is our suggested learning path. I hope it helps you!
- 708 Views
- 2 replies
- 1 kudos
Finding Materials for Databricks Course
Hi all, where can I find the supporting materials used by the instructor in the online "Data Analysis with Databricks" course? It appears to include scripts for creating schemas, tables, and other database structures. Thanks in advance.
- 1 kudos
Hello @RoseCliver1 and @BSalla! To access the course supporting materials, please raise a support ticket here.
- 1307 Views
- 1 replies
- 1 kudos
Resolved! read file from local machine(my computer) and create a dataframe
I want to create a notebook and add a widget that will allow the user to select a file from their local machine (my computer), read the contents of the file, and create a DataFrame. Is it possible, and how? In dbutils.widgets I don't have any options for ...
- 1 kudos
Hi @khaansaab, currently there is no out-of-the-box mechanism that will allow you to do that. As a workaround, you can create a UC volume and tell your users to upload files into that volume. Then you can create a notebook that has a file_name parame...
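A sketch of that workaround, assuming a hypothetical UC volume `main.default.uploads` and a widget named `file_name` (all names here are illustrative, not from the thread). The path helper is plain Python; the commented lines show how it might be wired up in a notebook:

```python
def uploaded_file_path(catalog, schema, volume, file_name):
    # UC volumes are exposed in the workspace filesystem under /Volumes.
    return f"/Volumes/{catalog}/{schema}/{volume}/{file_name}"

# In a notebook (hypothetical names):
# dbutils.widgets.text("file_name", "")
# path = uploaded_file_path("main", "default", "uploads",
#                           dbutils.widgets.get("file_name"))
# df = spark.read.option("header", "true").csv(path)
```

The user uploads the file to the volume via the UI, types its name into the widget, and the notebook reads it from the volume path.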
- 5382 Views
- 2 replies
- 1 kudos
Resolved! Configure Databricks in VSCode through WSL
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error on step 5 of "Step 4: Set up Databricks authentication"...
- 1 kudos
What worked for me was NOT opening the browser using the pop-up (which generated the 3-legged-OAuth flow error), but clicking on the link provided by the CLI (or copy-pasting the link into the browser).
- 857 Views
- 1 replies
- 0 kudos
Merging personal and company accounts into a single account
I have two accounts in the Databricks Community: one is my company account and the other is my personal account. I want to merge them into a single one. Kindly let me know how to do it.
- 0 kudos
Hello @mahfooz_iiitian! Please send an email to community@databricks.com with both of your email addresses, specifying which account you’d like to retain. The IT team will assist you with merging the accounts.
- 1132 Views
- 2 replies
- 0 kudos
Is anyone using Databricks for Advanced HR Analytics?
We are starting out with Databricks and I would like to use the tools to build out Advanced Analytics for HR.
- 608 Views
- 1 replies
- 0 kudos
Databricks Data Engineer exam got suspended with 8 minutes still left
Hi @Cert-Team, I hope this message finds you well. Request ID: #00556592. I am writing to seek clarification regarding my recent exam, which was suspended due to a reflection issue caused by my spectacles. During the exam, the proctor paused it and aske...
- 0 kudos
Hello @Dharshan777, we are sorry to hear that your exam was suspended. Thank you for filing a ticket with our support team. Please allow the support team 24-48 hours for a resolution. In the meantime, you can review the following documentation: Beh...
- 13084 Views
- 3 replies
- 0 kudos
Spaces in column names when writing to Hive
All, I have the following code:
df_Warehouse_Utilization = (
  spark.table("hive_metastore.dev_ork.bin_item_detail")
  .join(df_DIM_Bins, col('bin_tag') == df_DIM_Bins.BinKey, 'right')
  .groupby(col('BinKey'))
  .agg(count_distinct(when(col('serial_lo...
- 0 kudos
Hi, I have faced this issue a few times. When overwriting DataFrames to the Hive catalog in Databricks, it doesn't naturally allow column names to have spaces or special characters. However, you can add an option statement to bypass that ru...
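An alternative to bypassing the restriction is to rename the columns before writing, so the names contain no spaces or special characters at all. A minimal, Spark-free sketch of that renaming step (the rejected character set below is the usual Hive/Delta list; the column names are hypothetical):

```python
import re

# Characters commonly rejected in Hive/Delta column names:
# space , ; { } ( ) newline tab =
_INVALID = re.compile(r"[ ,;{}()\n\t=]+")

def sanitize_column_names(columns):
    """Replace each run of invalid characters with a single underscore,
    and strip any underscores this leaves at the ends of a name."""
    return [_INVALID.sub("_", c).strip("_") for c in columns]
```

In PySpark this could then be applied just before the write, e.g. `df.toDF(*sanitize_column_names(df.columns))`.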
- 14019 Views
- 11 replies
- 1 kudos
Cannot create an account to try Community Edition
Hi, whenever I try to sign up for an account, I keep getting the following message - "an error has occurred. please try again later" - when I click the button "get started with databricks community edition". Could you please let me know why this could...
- 1 kudos
I got the same problem when I tried to register or log in through the Community Edition link. But when I clicked the "Try Databricks" button in the top-right corner of the https://www.databricks.com/ home page, I was able to register and log in successfully j...