- 2169 Views
- 2 replies
- 0 kudos
Resolved! Databricks Certificate
I cleared the Databricks Associate-level certification on 25th September, but I have yet to receive my certificate from Databricks. I raised a ticket for this but got no response from the support team.
- 0 kudos
Hello there, can you share the ticket number? Note that we have a high influx of cases; the team is actively working on clearing the backlog, and you should receive a reply within the next 48 hours. I appreciate your patience.
- 5594 Views
- 5 replies
- 5 kudos
Databricks workspace unstable
Our company's Databricks workspace has been unstable lately. It can't launch any compute cluster. I have never seen this issue before. In addition, I have seen a storage credential error on the main Unity Catalog. Why would this happen? AWS Databricks ins...
- 5 kudos
Hello @Retired_mod and @jose_gonzalez, I couldn't locate the support ticket we opened. How can we track that ticket down? It came from the peopletech.com domain. If it is more efficient to create another ticket, please let me know. Let us know the UR...
- 2668 Views
- 1 replies
- 6 kudos
Webinar: Accelerate Data and AI Projects With Databricks Notebooks
Register now: October 24, 2023 | 8:00 AM PT. Use new capabilities in Databricks Notebooks to speed up innovation. This webinar will walk you through the features designed to take the manual effort and delays out of building and deploying dat...
- 6 kudos
This will help Databricks users speed up development.
- 6000 Views
- 3 replies
- 3 kudos
Application extracting data from Unity Catalog
Dear Databricks community, I'm seeking advice on the best method for applications to extract data from Unity Catalog. One suggested approach is to use JDBC, but there seems to be a dilemma. Although using a job cluster has been recommended due t...
- 3 kudos
What exactly do you mean by 'extracting'? If you want to load tables defined in Unity Catalog into a database, I would indeed do this using job clusters and a notebook. If you want to extract some data once in a while into a CSV, for example, you could perfectly do...
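The occasional-CSV route the reply describes can be sketched with a generic DB-API connection. This is a minimal sketch: sqlite3 stands in for a real Databricks SQL connection (in practice you would obtain `conn` from the `databricks-sql-connector` package instead), and the `sales` table and column names are hypothetical.

```python
import csv
import sqlite3  # stand-in for a Databricks SQL / JDBC connection (assumption)

def export_query_to_csv(conn, query: str, out_path: str) -> int:
    """Run a query over a DB-API connection and dump the rows to a CSV file.

    Returns the number of data rows written (header excluded).
    """
    cur = conn.cursor()
    cur.execute(query)
    header = [col[0] for col in cur.description]  # column names from the cursor
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        rows = cur.fetchall()
        writer.writerows(rows)
    return len(rows)

# Demo with an in-memory SQLite table standing in for a Unity Catalog table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 9.5), (2, 20.0)])
n = export_query_to_csv(conn, "SELECT id, amount FROM sales", "sales.csv")
```

Because the export works against any DB-API cursor, the same function runs unchanged whether the connection comes from a job cluster notebook or a local script.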
- 5139 Views
- 4 replies
- 2 kudos
Resolved! Error on Workflow
Hi, I have a mysterious situation here. My workflow (job) ran and got an error -> [INVALID_IDENTIFIER] The identifier transactions-catalog is invalid. Please, consider quoting it with back-quotes as `transactions-catalog`. (line 1, pos 12) == SQL ==...
- 2 kudos
Jobs are just notebooks executed in the background, so if the notebook is the same between an interactive (manual) run and a job run, there should be no difference. So I don't see what is wrong. Is the job using DLT, perhaps?
- 5834 Views
- 1 replies
- 1 kudos
Databricks Cluster
Hi all, I am curious to know the difference between a Spark cluster and a Databricks one. As per the info I have read, a Spark cluster creates the driver and workers when the application is submitted, whereas in Databricks we can create a cluster in advance in c...
- 1757 Views
- 0 replies
- 0 kudos
Model serving is not available for trial workspaces. Please contact Databricks
Hi, as mentioned in the title, I'm getting this error when I try to use model serving, despite being on the premium plan. My trial account ends on 28th September 2023. Is there a way to use model serving immediately, or am I stuck until 28th September...
- 4651 Views
- 0 replies
- 0 kudos
SQL Warehouse - several issues
Hi there, I am facing several issues while trying to run a starter SQL warehouse on Azure Databricks. Please note I am new to this data world, Azure, and Databricks. While starting the starter SQL warehouse in the Databricks trial version, I am getting these ...
- 5377 Views
- 2 replies
- 2 kudos
Resolved! With the medallion architecture, is it no longer necessary to preserve data in its original format?
Hi Community, I have a doubt. The bronze layer always causes confusion for me. Someone mentioned, "File Format: Store data in Delta Lake format to leverage its performance, ACID transactions, and schema evolution capabilities" for bronze layers. Then, ...
- 10644 Views
- 5 replies
- 0 kudos
Exam got suspended Databricks Certified Data Engineer Associate exam
Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 20 minutes. It was suspended due to eye movement, without any warning, although I was not moving my eyes away from the laptop screen. Some questions in the exam are so big, so I...
- 0 kudos
@rajib_bahar_ptg funny, not funny, right?! I just posted that tip today in this post: https://community.databricks.com/t5/certifications/minimize-the-chances-of-your-exam-getting-suspended-tip-3/td-p/45712
- 8268 Views
- 6 replies
- 3 kudos
Cluster policy not showing while creating delta live table pipeline
Hi all! I have created a cluster policy, but when I want to use it while creating a DLT pipeline, it shows none. I have checked that I have all the necessary permissions to create cluster policies; still, the DLT UI shows none.
- 3 kudos
@btafur Can we also set the auto_terminate minutes with the policy? (for the dlt cluster type)
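For context on the follow-up question, cluster policies are JSON definitions, and Databricks policies support a `cluster_type` attribute for targeting DLT clusters as well as an `autotermination_minutes` attribute. A minimal sketch of such a definition follows; note the field names are assumed to follow the cluster-policy schema, and whether DLT-managed clusters actually honor the auto-termination setting is exactly the open question in this thread.

```python
import json

# Sketch of a cluster-policy definition (assumption: field names follow
# the Databricks cluster-policy definition schema). It targets DLT
# clusters and pins auto-termination to 30 minutes. DLT manages cluster
# shutdown itself, so this setting may not apply there.
policy_definition = {
    "cluster_type": {"type": "fixed", "value": "dlt"},
    "autotermination_minutes": {"type": "fixed", "value": 30},
}

policy_json = json.dumps(policy_definition, indent=2)
```

The serialized `policy_json` is what would be pasted into the policy editor or sent via the cluster-policies API.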
- 1845 Views
- 1 replies
- 0 kudos
How to automate creating notebooks when I have multiple .html or .py files
Hi all, I have 50+ .html and .py files, and I have to create a separate notebook for each and every one of them. Manually creating a notebook using the UI and importing the .html/.py file is tedious and time-consuming. Is t...
- 0 kudos
Depending on your use case and requirements, one alternative would be to create a script that loops through your files and uploads them using the API. You can find more information about the API here: https://docs.databricks.com/api/workspace/workspa...
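The loop-and-upload approach the reply links to can be sketched against the documented Workspace Import endpoint (`POST /api/2.0/workspace/import`, which takes a base64 `content` plus `format`/`language` fields). The file and directory names below are hypothetical; only the payload construction is shown, with the actual HTTP call left as a comment.

```python
import base64
from pathlib import Path

# Maps file extension to the Workspace Import API's format/language fields.
FORMATS = {
    ".py": {"format": "SOURCE", "language": "PYTHON"},
    ".html": {"format": "HTML"},
}

def build_import_payload(local_file: str, workspace_dir: str) -> dict:
    """Build the JSON body for one /api/2.0/workspace/import call."""
    path = Path(local_file)
    spec = FORMATS[path.suffix.lower()]
    return {
        "path": f"{workspace_dir}/{path.stem}",          # target notebook path
        "content": base64.b64encode(path.read_bytes()).decode("ascii"),
        "overwrite": True,
        **spec,
    }

# Demo with a hypothetical exported file.
Path("demo_notebook.py").write_text("print('hello')\n")
payload = build_import_payload("demo_notebook.py", "/Workspace/Users/someone")

# Uploading all 50+ files then becomes (hypothetical host/token):
#   for f in Path("exports").glob("*"):
#       requests.post(f"{host}/api/2.0/workspace/import",
#                     headers={"Authorization": f"Bearer {token}"},
#                     json=build_import_payload(str(f), "/Workspace/Users/someone"))
```

Keeping the payload builder separate from the HTTP call makes the extension-to-format mapping easy to extend (e.g. `.ipynb` to the `JUPYTER` format).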
- 2905 Views
- 0 replies
- 0 kudos
Move files and folders from workspace/repo to an external location
I would like to move a folder from my repo under /Workspace/Repos/ar... to an external Azure Blob location. I tried dbutils.fs.mv(repo_path, az_path), but this gave me a file-not-found error. Also, I am not able to see workspace -> repo usin...
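One likely cause of the file-not-found error: `dbutils.fs` resolves bare paths against DBFS, so workspace paths such as `/Workspace/Repos/...` must be addressed with the `file:/` scheme. A sketch under that assumption, with hypothetical repo and container names (the `dbutils` call only works inside a Databricks notebook and is not executed here):

```python
def as_local_uri(workspace_path: str) -> str:
    """dbutils.fs resolves bare paths against DBFS, so a workspace path
    like /Workspace/Repos/... must be addressed with the file:/ scheme."""
    if not workspace_path.startswith("/Workspace/"):
        raise ValueError("expected a /Workspace/... path")
    return "file:" + workspace_path

def copy_to_blob(workspace_path: str, abfss_uri: str) -> None:
    """Run inside a Databricks notebook only (dbutils is defined there)."""
    dbutils.fs.cp(as_local_uri(workspace_path), abfss_uri, recurse=True)  # noqa: F821

src = as_local_uri("/Workspace/Repos/user/my_repo")  # hypothetical repo path
# copy_to_blob("/Workspace/Repos/user/my_repo",
#              "abfss://container@account.dfs.core.windows.net/backup/")
```

Using `cp` with `recurse=True` and deleting afterwards is a safer pattern than `mv` across storage schemes, since the source is only removed once the copy is confirmed.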
- 6556 Views
- 2 replies
- 3 kudos
Resolved! Limitations of committing ipynb notebook output with Repos
During my experimentation with the new feature that allows including notebook output in a commit, I ran into a specific issue. While attempting to commit my recent changes, I encountered an error message stating "Error fetching Git status." Intere...
- 3 kudos
I've found that the restriction I encountered isn't related to the file size within Repos, but rather to the maximum file size that can be shown in the Azure Databricks UI. You can find this limitation documented at https://learn.microsoft.com/en-us/...
- 15625 Views
- 2 replies
- 2 kudos
Workspace
How can I access the workspace?
- 2 kudos
Hi, you can try checking https://docs.databricks.com/en/administration-guide/workspace/index.html; please let us know if this helps. Also, please tag @Debayan in your next response, which will notify me. Thank you!