- 6285 Views
- 6 replies
- 1 kudos
Materialized Views Without DLT?
I'm curious, is DLT *required* to use Materialized Views in Databricks? Is it not possible to create and refresh a Materialized view via a standard Databricks Workflow?
- 1 kudos
Hi @ChristianRRL, when creating a materialized view in Databricks, the data is stored in DBFS, cloud storage, or a Unity Catalog volume. You can still create a materialized view by overwriting the same table each time, instead of using Append, Update, ...
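For reference, a minimal sketch of the "full refresh by overwrite" workaround the reply describes, intended to run as a scheduled Workflow notebook task rather than DLT. The catalog/schema/table names are hypothetical placeholders, and `spark` is assumed to be the notebook's predefined session.

```python
# Sketch: refresh a reporting table by overwriting it on each scheduled run,
# as an alternative to a DLT-managed materialized view.
# All catalog/schema/table names below are hypothetical.
spark.sql("""
    CREATE OR REPLACE TABLE main.reporting.daily_sales_summary AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM main.sales.orders
    GROUP BY order_date
""")
```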
- 4450 Views
- 4 replies
- 1 kudos
Purpose of DLT Table table_properties > quality:medallion
Hi there, silly question here, but can anyone help me understand what practical purpose labelling the table_properties with "quality":"<specific_medallion>" serves? For example: @dlt.table( comment="Bronze live streaming table for Test data", name="...
- 1 kudos
I have the same doubt, @ChristianRRL, did you figure out anything related to it? My doubt is whether it's possible to apply any kind of access control based on this property.
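For context, a minimal sketch of the kind of definition the question refers to. As far as I know, "quality" is just a free-form table property set by convention to the medallion layer (useful for discovery and filtering rather than enforced behavior); the table name and source used here are hypothetical, and the pipeline's `spark` session is assumed.

```python
import dlt

# Hypothetical bronze table; "quality" is a conventional, free-form table
# property tagging the medallion layer, not something DLT enforces.
@dlt.table(
    name="test_bronze",
    comment="Bronze live streaming table for Test data",
    table_properties={"quality": "bronze"},
)
def test_bronze():
    # Hypothetical streaming source table.
    return spark.readStream.table("samples.raw.test_events")
```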
- 3204 Views
- 6 replies
- 3 kudos
Resolved! Plotly Express not rendering in Firefox but fine in Safari
Using a basic example of Plotly Express, I see no output in Firefox, but it renders fine in Safari. Any ideas why this may occur? import plotly.express as px import pandas as pd # Create a sample dataframe df = pd.DataFrame({ 'x': range(10), 'y': [2, 3, 5, 7...
- 3 kudos
UPDATE: I reached out further to Databricks support and they have since deployed a fix. Works fine for me now!
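For anyone wanting a self-contained reproduction of the kind of basic example the question describes, a small sketch is below. The y-values are made-up placeholders, since the original snippet is truncated.

```python
import pandas as pd
import plotly.express as px

# Basic Plotly Express example similar to the one in the question;
# the y-values are placeholders (the original snippet is truncated).
df = pd.DataFrame({"x": list(range(10)), "y": [v * v for v in range(10)]})
fig = px.line(df, x="x", y="y", title="Basic Plotly Express test")
fig.show()
```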
- 7335 Views
- 8 replies
- 3 kudos
Unity Catalog enabled workspace - Is there any way to disable workflow/job creation for certain users?
Currently, in a Unity Catalog enabled workspace, users with "Workspace access" can create workflows/jobs; there is no access control available to restrict users from creating jobs/workflows. Use case: in production there is no need for users, data enginee...
- 3 kudos
@Lakshay Databricks offers a robust platform with a variety of features, including data ingestion, engineering, science, dashboards, and applications. However, I believe that some features, such as workflow/job creation, alerts, dashboards, and Genie...
- 3172 Views
- 3 replies
- 0 kudos
String to date conversion errors
Hi, I am getting data from CDC on SQL Server using Informatica, which is writing Parquet files to ADLS. I read the Parquet files using DLT and end up with the date data as a string such as this: '20240603164746563'. I couldn't get this to convert using m...
- 0 kudos
Checking on my current code, this is what I am using, which works for me because we don't use daylight savings time. from_utc_timestamp(date_time_utc, 'UTC-7') as date_time_local
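Not from the thread itself, but a hedged sketch of one way to parse the '20240603164746563' style string with PySpark before the timezone shift shown above: take the first 14 characters (yyyyMMddHHmmss) and drop the milliseconds to keep the format pattern simple. The column name and example DataFrame are hypothetical; a Databricks notebook `spark` session is assumed.

```python
from pyspark.sql import functions as F

# Made-up example of the CDC string (yyyyMMddHHmmss followed by milliseconds).
df = spark.createDataFrame([("20240603164746563",)], ["date_time_raw"])

# Parse only the first 14 characters, ignoring the millisecond suffix.
parsed = df.withColumn(
    "date_time_utc",
    F.to_timestamp(F.substring("date_time_raw", 1, 14), "yyyyMMddHHmmss"),
)
parsed.show(truncate=False)
```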
- 24997 Views
- 5 replies
- 1 kudos
Insufficient privileges: User does not have permission SELECT on any file
Hello, after switching to "shared cluster" usage, a Python job is failing with the error message: Py4JJavaError: An error occurred while calling o877.load. : org.apache.spark.SparkSecurityException: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User...
- 1 kudos
Hi @GeKo, the checkpoint directory, is that set at the cluster level, or how do we set it? Can you please help me with this?
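On the checkpoint question: the checkpoint location is normally set per streaming write on the writer, not at the cluster level. A minimal sketch follows; the source/target tables and the Unity Catalog volume path are hypothetical, and on shared clusters the path generally has to be somewhere the user is allowed to access (e.g. a UC volume).

```python
# Checkpoint location is configured per streaming write, not on the cluster.
# Table names and volume path below are hypothetical placeholders.
(
    spark.readStream.table("main.bronze.events")                      # hypothetical source
    .writeStream
    .option("checkpointLocation", "/Volumes/main/bronze/checkpoints/events")
    .toTable("main.silver.events")                                    # hypothetical target
)
```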
- 1642 Views
- 1 reply
- 0 kudos
Databricks UC Data Lineage Official Limitations
Hi all. I have a huge data migration project using medallion architecture, UC, notebooks and workflows. One of the relevant requirements we have is to capture all data dependencies (upstream and downstream) using data lineage. I've followed all re...
- 0 kudos
Hello @RobsonNLPT, yes, SQL CTEs are supported by the data lineage service. You can track tables that were created using CTEs. Here is an example that demonstrates the feature: CREATE TABLE IF NOT EXISTS mpelletier.dbdemos.menu ( recipe_id INT, ...
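Since the reply's example is truncated, here is a generic self-contained sketch of the same pattern: a downstream table built via a CTE, which Unity Catalog lineage should then record as downstream of the source table. The catalog/schema/table and column names are hypothetical (only loosely following the reply's `menu` example).

```python
# Sketch: CTAS through a CTE; lineage should record main.dbdemos.top_menu as
# downstream of main.dbdemos.menu. All names here are hypothetical.
spark.sql("""
    CREATE OR REPLACE TABLE main.dbdemos.top_menu AS
    WITH ordered AS (
        SELECT recipe_id, COUNT(*) AS orders
        FROM main.dbdemos.menu
        GROUP BY recipe_id
    )
    SELECT * FROM ordered ORDER BY orders DESC LIMIT 10
""")
```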
- 3325 Views
- 3 replies
- 1 kudos
Resolved! Ingesting and Transforming NetCDF Data in Delta Table on Databricks Cluster
Hi, I need to ingest and transform historical climate data into a Delta table. The data is stored in .nc format (NetCDF). To work with this format, specific C libraries for Python are required, along with particular versions of Python libraries (e.g.,...
- 1 kudos
Great, please let us know in case any assistance is needed
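Not from the thread, but a minimal sketch of one common approach, assuming the netCDF4/xarray libraries (and their C dependencies) have already been installed on the cluster, e.g. via an init script or %pip. The file path and target table name are hypothetical, and the notebook's `spark` session is assumed.

```python
# Sketch: read a NetCDF file with xarray (needs netCDF4 + its C libraries on the
# cluster) and land it in a Delta table. Path and table name are hypothetical.
import xarray as xr

ds = xr.open_dataset("/Volumes/main/climate/raw/era5_2020.nc")  # hypothetical path
pdf = ds.to_dataframe().reset_index()   # flatten coordinates into regular columns
sdf = spark.createDataFrame(pdf)

sdf.write.format("delta").mode("overwrite").saveAsTable("main.climate.era5_2020")
```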
- 2331 Views
- 5 replies
- 0 kudos
Service Principal Access to Users Directory in Databricks - Creating Git Folders
I am trying to automate the creation of git folders in user workspace directories triggered by GitHub feature branch creation. When developers create feature branches in GitHub, we want a service principal to automatically create corresponding git fo...
- 0 kudos
Hi @Brianhourigan, can you please DM me your suggestions? I can add them to our internal Aha! idea.
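Not an answer from the thread, but a rough sketch of the automation the question describes, using the Repos REST API with a service principal token. The workspace URL, repo URL, token, and target path are hypothetical, and the service principal would need permission to write under the target user's folder.

```python
import requests

# Sketch: create a Git folder (repo) under a user's workspace directory from a
# service principal. Host, token, repo URL, and path are hypothetical.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"   # hypothetical
TOKEN = "<service-principal-token>"                            # hypothetical

resp = requests.post(
    f"{HOST}/api/2.0/repos",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "url": "https://github.com/my-org/my-repo.git",              # hypothetical
        "provider": "gitHub",
        "path": "/Repos/dev.user@example.com/my-repo-feature-x",     # hypothetical
    },
    timeout=30,
)
resp.raise_for_status()
```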
- 3712 Views
- 5 replies
- 0 kudos
Restore deleted databricks jobs and job runs
Hi all, is there a way to restore deleted Databricks jobs? Thank you.
- 0 kudos
Hi @iptkrisna, currently there is no option to recover deleted items. In such architectures, it is not necessary to control or manage the final code available in the system. Instead, the focus should be on controlling and managing how code and jobs are deploye...
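Since deleted jobs cannot be recovered, a common mitigation is to keep job definitions outside the workspace (asset bundles, Terraform, or a periodic export) so they can be recreated. A rough sketch of such an export with the Jobs API; the host and token are hypothetical, and pagination is omitted for brevity.

```python
import json
import requests

# Sketch: export each job's settings to JSON so a deleted job can be recreated.
# Host and token are hypothetical placeholders; pagination is omitted.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<pat-or-sp-token>"
headers = {"Authorization": f"Bearer {TOKEN}"}

jobs = requests.get(f"{HOST}/api/2.1/jobs/list", headers=headers, timeout=30).json()
for job in jobs.get("jobs", []):
    detail = requests.get(
        f"{HOST}/api/2.1/jobs/get",
        headers=headers,
        params={"job_id": job["job_id"]},
        timeout=30,
    ).json()
    with open(f"job_{job['job_id']}.json", "w") as f:
        json.dump(detail.get("settings", {}), f, indent=2)
```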
- 3018 Views
- 1 reply
- 1 kudos
Resolved! Unity Catalog: RDD Issue
In our existing notebooks, the scripts are reliant on RDDs. However, with the upgrade to Unity Catalog, RDDs will no longer be supported. We need to explore alternative approaches or tools to replace the use of RDDs. Could you suggest the best practi...
- 1 kudos
To transition from using RDDs (Resilient Distributed Datasets) to alternative approaches supported by Unity Catalog, you can follow these best practices and migration strategies: Use DataFrame API: The DataFrame API is the recommended alternative to...
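A tiny before/after illustration of that first point (an RDD transformation rewritten with the DataFrame API); the example data is made up, and the notebook's `spark` session is assumed.

```python
from pyspark.sql import functions as F

# Before (RDD style, not supported on UC shared clusters):
#   rdd = sc.parallelize([("a", 1), ("b", 2)])
#   doubled = rdd.map(lambda kv: (kv[0], kv[1] * 2))

# After (DataFrame API), with made-up example data:
df = spark.createDataFrame([("a", 1), ("b", 2)], ["key", "value"])
doubled = df.withColumn("value", F.col("value") * 2)
doubled.show()
```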
- 3203 Views
- 7 replies
- 0 kudos
Resolved! Tutorial docs for running a job using serverless?
I'm exploring whether serverless (https://docs.databricks.com/en/jobs/run-serverless-jobs.html#create-a-job-using-serverless-compute) could be useful for our use case. I'd like to see an example of using serverless via the API. The docs say "To learn...
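Not from the thread, but a rough sketch of what job creation for serverless compute can look like via the REST API, on the assumption that omitting any cluster specification makes the task run on serverless jobs compute (the workspace must have it enabled). The host, token, and notebook path are hypothetical.

```python
import requests

# Sketch (assumption: with no cluster spec, the task runs on serverless jobs
# compute where the workspace supports it). Host/token/path are hypothetical.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<token>"

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "serverless-example",
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {
                    "notebook_path": "/Workspace/Users/me@example.com/etl"
                },
                # no new_cluster / existing_cluster_id / job_cluster_key
            }
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["job_id"])
```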
- 1148 Views
- 1 reply
- 1 kudos
Resolved! Is there a cluster option for dashboards?
Hi everyone, I do not want to use a 4 DBU/h XS warehouse since I have very tiny data at the new startup. I want to create a minimal cluster and run it as the underlying SQL engine for my dashboard. Thanks.
- 1 kudos
Unfortunately no; as dashboards are part of the SQL service on the platform, they are designed to work with SQL warehouses only. You can create notebook dashboards that will work with regular clusters, but functionality will be limited in ...
- 2134 Views
- 5 replies
- 1 kudos
Resolved! Databricks workflow with sequenced tasks
I have a continuous workflow. It is continuous because I would like it to run every minute, and if it has stuff to do, the first task will take several minutes. As I understand it, continuous workflows won't requeue while a job is currently running, where...
- 1 kudos
Hi @h2p5cq8, no problem! You can disable the queue option to stop it: go to Advanced settings in the Job details side panel and toggle off the Queue option to prevent jobs from being queued.
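For completeness, the same Queue toggle is also exposed in the job's JSON settings; a hedged sketch of flipping it via the Jobs API is below. The host, token, and job_id are hypothetical placeholders.

```python
import requests

# Sketch: disable queueing on an existing job via the Jobs API.
# Host, token, and job_id are hypothetical placeholders.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "<token>"

requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": 123456789, "new_settings": {"queue": {"enabled": False}}},
    timeout=30,
).raise_for_status()
```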
- 1289 Views
- 1 reply
- 0 kudos
How Development Target works for multiple users?
Hi, I'm using Databricks Asset Bundles to deploy my job to Azure Databricks. I want to configure the Databricks bundle so that when anyone runs the Azure pipeline, a job is created under their name in the format dev_username_job. Using a personal ac...