- 1702 Views
- 1 replies
- 0 kudos
Materialized view in DLT pipeline
When setting up a DLT pipeline, there are 3 product editions: Core, Pro and Advanced. When I compare DLT Classic Core and DLT Classic Pro, the difference is that DLT Classic Pro can handle CDC. Does it mean that if I'm using DLT Classic Core, I hav...
- 0 kudos
Also, if I'm using DLT Classic Core for a DLT pipeline, will the materialized view do a full refresh or only update the rows that have changed?
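For context, CDC handling in DLT is expressed with APPLY CHANGES (the `dlt.apply_changes` API in Python), which, per the comparison above, is what the Pro edition adds over Core. A minimal sketch, assuming a hypothetical JSON CDC feed on a volume and hypothetical column names:

```python
import dlt
from pyspark.sql.functions import col

@dlt.view
def customers_cdc_raw():
    # Hypothetical Auto Loader source of CDC events (path and format are assumptions)
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/default/cdc_events/customers")
    )

# Target streaming table that APPLY CHANGES keeps up to date row by row
dlt.create_streaming_table(name="customers")

dlt.apply_changes(
    target="customers",
    source="customers_cdc_raw",
    keys=["customer_id"],          # hypothetical primary key
    sequence_by=col("event_ts"),   # hypothetical ordering column
    stored_as_scd_type=1,          # type 1 keeps only the latest row per key; type 2 keeps history
)
```

This only runs inside a DLT pipeline and is a sketch of the CDC pattern, not an answer to the refresh question, which depends on how the pipeline engine plans each update.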
- 1469 Views
- 0 replies
- 0 kudos
Unzip multiple zip files in databricks
I have a zip file which in turn has multiple zip files inside it. I tried to write code in a Databricks notebook to unzip all these files at once, but I ran into an error. So I started to unzip them one by one, but the code which worked in unzipping...
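As a reference for the nested-zip case, a minimal sketch using Python's standard zipfile module, assuming the archive sits on a driver-local or volume path (both paths below are hypothetical):

```python
import os
import zipfile

SRC_ZIP = "/Volumes/main/default/raw/outer.zip"  # hypothetical outer archive
OUT_DIR = "/tmp/unzipped"                        # hypothetical working directory

def extract_nested(zip_path: str, out_dir: str) -> None:
    """Extract a zip file, then recursively extract any .zip files found inside it."""
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
    inner_zips = [
        os.path.join(root, name)
        for root, _, files in os.walk(out_dir)
        for name in files
        if name.lower().endswith(".zip")
    ]
    for inner in inner_zips:
        # Each inner archive gets its own sub-folder named after the file
        extract_nested(inner, os.path.splitext(inner)[0])

extract_nested(SRC_ZIP, OUT_DIR)
```

Note that zipfile needs a POSIX-style path, so a dbfs:/ location would have to be read via its /dbfs/... mount (or copied locally) before this can open it.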
- 2459 Views
- 4 replies
- 2 kudos
Resolved! About resuming the test - [ ref:_00D61JGc4._500Vp6jMk5:ref ]
Hello @Cert-Team, I am very glad to hear from you. In response to the ticket (00481969), I have received a mail about rescheduling my test so it can resume. I am very thankful for your quick response in this matter. I have replied in the mail about the date and tim...
- 2 kudos
@Cert-Team, my issue has been resolved; my test has been rescheduled to the time I mentioned. I am very grateful for your help. Thanks and regards, @SHASHANK2
- 1420 Views
- 0 replies
- 0 kudos
Unable to unzip files recursively and copy into a different folder
I am currently trying to unzip files recursively from one folder (source folder) and copy all the unzipped files into a destination folder using Databricks (PySpark). The destination path is still empty even after running this code. I tried looking f...
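For the recursive unzip-and-copy case, one possible sketch that walks the source folder, extracts every .zip it finds, and mirrors the folder structure under the destination (paths are hypothetical; an empty destination is often a sign the extraction wrote to a driver-local path rather than the intended DBFS or volume location):

```python
import os
import zipfile

SRC_DIR = "/Volumes/main/default/landing/zips"      # hypothetical source folder
DST_DIR = "/Volumes/main/default/landing/unzipped"  # hypothetical destination folder

for root, _, files in os.walk(SRC_DIR):
    for name in files:
        if not name.lower().endswith(".zip"):
            continue
        zip_path = os.path.join(root, name)
        # Recreate the source sub-folder layout under the destination,
        # with one sub-folder per archive so extracted files do not collide.
        rel = os.path.relpath(root, SRC_DIR)
        out_dir = os.path.join(DST_DIR, rel, os.path.splitext(name)[0])
        os.makedirs(out_dir, exist_ok=True)
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(out_dir)
        print(f"extracted {zip_path} -> {out_dir}")
```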
- 2829 Views
- 1 replies
- 1 kudos
Issue with Creating and Running Databricks Jobs with the new Databricks CLI v0.214.0
Hi Databricks Support, I'm encountering an issue with creating and running jobs on Databricks. Here are the details. Problem description: when attempting to create and run a job using the old JSON (which was successfully used to create and run jobs usin...
- 1 kudos
Hi @Retired_mod, thanks for the reference links. I found the solution in this GitHub discussion: https://github.com/databricks/databricks-sdk-go/discussions/384. By using the get-run API, I was able to retrieve the running status of my...
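To illustrate the flow described in the reply (create a job from an existing JSON spec, trigger it, then poll its status via the get-run endpoint), a minimal sketch against the Jobs REST API 2.1, assuming a personal access token in environment variables and a local job.json file:

```python
import json
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # assumed env var, e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # assumed env var holding a PAT
headers = {"Authorization": f"Bearer {token}"}

# Create the job from the same JSON settings previously used with the CLI
with open("job.json") as f:
    settings = json.load(f)
job_id = requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json=settings).json()["job_id"]

# Trigger a run, then fetch its current state with the runs/get ("get-run") endpoint
run_id = requests.post(f"{host}/api/2.1/jobs/run-now", headers=headers, json={"job_id": job_id}).json()["run_id"]
run = requests.get(f"{host}/api/2.1/jobs/runs/get", headers=headers, params={"run_id": run_id}).json()
print(run["state"])
```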
- 4118 Views
- 5 replies
- 0 kudos
clusters/configuration doesn't work
The cluster configuration page doesn't work; it displays a blank page. I need to add some libraries to the cluster.
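While the configuration page is blank, cluster libraries can still be attached through the Libraries API; a minimal sketch, assuming a personal access token and a hypothetical cluster ID and package:

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # assumed env var, e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # assumed env var holding a PAT

payload = {
    "cluster_id": "1234-567890-abcdefgh",                # hypothetical cluster ID
    "libraries": [{"pypi": {"package": "simplejson"}}],  # hypothetical PyPI package
}
resp = requests.post(
    f"{host}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()  # the endpoint returns an empty body on success
```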
- 2472 Views
- 1 replies
- 0 kudos
Microsoft PNP Azure Log Analytics
There is a com.microsoft.pnp.Util package used inside my Scala notebook to push my application logs from Azure Databricks to Azure Log Analytics tables. When I updated my Databricks Runtime from 10.4 to 14 on my clusters, it starts t...
- 0 kudos
Hi @aarshps, how were you installing the library for the package com.microsoft.pnp.Util? Was it a cluster library, a whl, an init script, etc.?
- 1729 Views
- 2 replies
- 0 kudos
REST API call throwing 403
Hi, calling curl works just fine from my workspace: %sh curl --request GET "https://${DATABRICKS_HOST}/api/2.0/clusters/list" --header "Authorization: Bearer ${DATABRICKS_API_TOKEN}". But when I transpose the same logic to a script in javascrip...
- 0 kudos
Thanks for sharing your findings; you can archive the ticket from Options on the right-hand side.
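One way to isolate whether the 403 comes from the token itself or from the JavaScript side is to replay the identical request from Python with the same environment variables as the working curl call (variable names taken from the post above):

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]        # same value the %sh curl call used (no scheme)
token = os.environ["DATABRICKS_API_TOKEN"]  # same token variable as in the curl call

resp = requests.get(
    f"https://{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code)
print(resp.text)  # a 403 body usually states whether the token is invalid or just lacks permission
```

If this succeeds, the token and endpoint are fine, and the JavaScript script is most likely not sending the Authorization header as expected (or is reading a different or empty environment variable).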
- 1235 Views
- 0 replies
- 0 kudos
How to optimize a view?
I have created a view from my underlying Delta Live Table. Generally, Delta Live Tables get auto-optimized, with vacuum in place. If that's so, why does my view take 1 hour to be queried? Is there any other way to optimize it?
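Worth noting: a standard view stores only its query text, so every SELECT re-runs the underlying query; auto-optimize and vacuum maintenance applies to the Delta tables behind it, not to the view itself. A small sketch of the distinction, with hypothetical table and column names:

```python
# A plain view: nothing is materialized, so the SELECT below re-executes the full query each time.
spark.sql("""
    CREATE OR REPLACE VIEW main.default.sales_summary AS
    SELECT region, SUM(amount) AS total_amount
    FROM main.default.sales
    GROUP BY region
""")

# Tuning happens on the underlying Delta table, e.g. compacting files and clustering by a filter column:
spark.sql("OPTIMIZE main.default.sales ZORDER BY (region)")
```

If the aggregation itself is the slow part, a DLT materialized view (which stores the computed result) is the usual alternative to a plain view.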
- 1825 Views
- 2 replies
- 1 kudos
How to Determine Which CLI Methods to Use?
The Databricks CLI is considered Legacy for versions below 0.17 and is in Public Preview for versions above 0.20. I have access to documentation for both these versions separately. As I am developing a new project, I prefer not to use legacy options ...
- 1 kudos
While the legacy CLI (versions 0.18 and below) is still available, it is not receiving any non-critical updates, and it is recommended to migrate to the new CLI as soon as possible. "Databricks recommends that you use Databricks CLI version 0.205 or abov...
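A practical note when starting a new project on the new CLI (0.205+): it writes its authentication profiles to ~/.databrickscfg, which the Databricks SDKs can reuse. A minimal sketch, assuming the Python SDK (databricks-sdk) is installed and a DEFAULT profile was created with `databricks auth login`:

```python
from databricks.sdk import WorkspaceClient

# Reuses the profile written by the new CLI, so no tokens are hard-coded here
w = WorkspaceClient(profile="DEFAULT")

for cluster in w.clusters.list():
    print(cluster.cluster_id, cluster.cluster_name)
```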
- 2227 Views
- 1 replies
- 0 kudos
Community version now shows only the ML persona
Hi, I have been using the Community version for some time and had saved some notebooks. While the notebooks are there in the workspace, the different personas are not available as of today; only the Machine Learning persona is visible. Would like to check...
- 0 kudos
Hi MSD, we removed the different personas that were available earlier, and it is now a flat system (it is not a recent change; it has been this way for some time). The notebooks are saved in your workspace, hence they are accessible. Irrespective of the perso...
- 1899 Views
- 0 replies
- 0 kudos
dbdemos LLM Chatbot RAG
I have an issue when running the below code using the default dbdemos in the advanced preparation. I have reduced the chunk_size and max_batch_size and am running the code on proper compute resources. Could anyone help with that please? (spark.readStr...
- 6981 Views
- 4 replies
- 0 kudos
pytest error
Hello, I have a quick question. If my source code calls PySpark collect() or any RDD-related method, then pytest on my local PC reports the following error. My local machine doesn't have any specific settings for PySpark, and I used findspa...
- 0 kudos
Thank you very much, brockb. I will probably try it in Databricks. Thanks.
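For running such tests locally, the usual pattern is a session-scoped pytest fixture that builds a local-mode SparkSession, so collect() and other RDD-backed calls work without a remote cluster; a minimal sketch, assuming pyspark and a local JDK are installed:

```python
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Single-threaded local Spark, enough for small unit tests
    session = (
        SparkSession.builder
        .master("local[1]")
        .appName("unit-tests")
        .getOrCreate()
    )
    yield session
    session.stop()

def test_collect_roundtrip(spark):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
    rows = df.collect()  # works locally because the session above is local-mode
    assert rows[0].id == 1
```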
- 5479 Views
- 0 replies
- 0 kudos
SQL Statement Execution API w/ Javascript (REST)
I need to use the Databricks SQL Statement Execution API with JavaScript (see example post). For some reason, curl works and Python works, but JavaScript fails. This works (curl): curl --request POST https://adb-5750xxxxxxx.azured...
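As a reference for the request shape that works from Python (and can then be mirrored header-for-header in JavaScript), a minimal sketch against /api/2.0/sql/statements, with a hypothetical warehouse ID:

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # assumed env var, e.g. adb-5750xxxxxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # assumed env var holding a PAT

resp = requests.post(
    f"https://{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "abcdef0123456789",  # hypothetical SQL warehouse ID
        "statement": "SELECT 1",
        "wait_timeout": "30s",               # wait synchronously up to 30 seconds for the result
    },
)
print(resp.status_code, resp.json())
```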
- 1269 Views
- 2 replies
- 0 kudos
Databricks Certification Exam Got Suspended. Require Immediate Support
Hi Team, today (29th May 2024) I began my Databricks assessment exam, but it was abruptly suspended by the proctor without any explanation. This was my first exam and it has been a disappointing experience. I started the exam calmly, but the proctor w...
- 0 kudos
Thank you, @Cert-Team, for your quick response. I am looking forward to the resolution of my issue.