- 3054 Views
- 2 replies
- 1 kudos
R Package Installation Best Practices
Hello, we are new to Databricks and are wondering what the best practices are for R package installation. We currently have cluster spin-up wait times of more than 20 minutes with our init scripts. We have tried the following: 1. Libraries tab in the c...
- 1 kudos
@Retired_mod Thank you for your detailed response! I think we would like to use Docker if we can, because we are not using RStudio but R directly in the Databricks notebooks and workflows. So, any more information about R and Docker and Databricks woul...
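For readers hitting the same 20-minute init-script waits: the slowdown is usually CRAN compiling packages from source on every cluster start. Two common mitigations are installing pre-built binaries from a repository such as Posit Public Package Manager, or baking the packages into a custom image with Databricks Container Services so nothing installs at startup. A minimal sketch of the first approach, written from a Python notebook; the volume path, repo URL, and package list are illustrative assumptions, not the poster's setup:

```python
# Sketch: write a cluster-scoped init script that installs R packages as
# pre-built Linux binaries instead of compiling them from source.
init_script = """#!/bin/bash
# Install R packages cluster-wide from Posit's binary repository.
Rscript -e 'install.packages(
  c("data.table", "arrow"),
  repos = "https://packagemanager.posit.co/cran/__linux__/jammy/latest"
)'
"""

# Write the script to a (hypothetical) Unity Catalog volume, then reference
# it under the cluster's Advanced options > Init scripts.
dbutils.fs.put("/Volumes/main/default/init/install_r_packages.sh",
               init_script, True)
```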
- 6463 Views
- 6 replies
- 2 kudos
Data quality check in data engineering
Can we use the deequ library with Azure Databricks? If yes, please provide some support material or examples. Is there any similar data quality library or suggestion for achieving automatic data quality checks during data engineering (Azure Databricks)? Thanks i...
- 2 kudos
Hi there! You could also take a look at Rudol; it provides no-code data quality validations, so non-technical roles such as Business Analysts or Data Stewards can configure quality checks by themselves.
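On the original question: deequ does run on Azure Databricks through its Python wrapper, pydeequ, provided the deequ JAR matching your cluster's Spark version is installed (e.g., via Maven coordinates on the Libraries tab). A minimal sketch follows; the input table and column names are hypothetical, and `spark`/`display` are the notebook's built-ins. If you use Delta Live Tables, its expectations are a native alternative.

```python
import os
os.environ["SPARK_VERSION"] = "3.3"  # pydeequ reads this before import

from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

df = spark.table("bronze.orders")  # hypothetical input table

# Declare row-level quality rules on hypothetical columns.
check = (Check(spark, CheckLevel.Error, "basic quality checks")
         .isComplete("order_id")      # no NULLs
         .isUnique("order_id")        # no duplicates
         .isNonNegative("amount"))    # amount >= 0

result = (VerificationSuite(spark)
          .onData(df)
          .addCheck(check)
          .run())

# Inspect pass/fail status per constraint.
display(VerificationResult.checkResultsAsDataFrame(spark, result))
```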
- 4132 Views
- 8 replies
- 9 kudos
Resolved! Facing StorageContext Error while trying to access DBFS
This issue has hindered my practice for the whole day. I scoured the web and couldn't find anybody who has faced this particular error. The error I am getting in the DBFS file browser is: StorageContext com.databricks.backend.storage.StorageContextType$DbfsR...
- 9 kudos
Yeah, I'm unable to save any file with rdd.saveTextfile or to upload any file using the workspace.
- 4223 Views
- 5 replies
- 5 kudos
Resolved! Unable to upload files from DBFS
When clicking on Upload, I am seeing the error below: StorageContext com.databricks.backend.storage.StorageContextType$DbfsRoot$@2f3c3220 for workspace 1406865167171326 is not set in the CustomerStorageInfo.
- 5 kudos
Same issue at my end since yesterday. Does anyone know the reason, and what needs to be done from our side (if anything) to fix this?
- 880 Views
- 0 replies
- 0 kudos
DBCU plans are costlier than Job Compute Premium at $0.30 per DBU; please justify
Please help me understand the % of savings and how Databricks calculates DBCU. They are telling me that if I take the DBCU 12500 plan, the price with the discount will be 12000, a 4% discount. That means if I consume 12500 DBU, I am paying $12000 and 4% sa...
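For anyone else puzzling over these numbers: my understanding (an assumption worth verifying with your Databricks rep) is that a DBCU commit is denominated in dollars of consumption rather than in DBUs, so the 4% discount applies to spend, not to a per-DBU rate. A quick back-of-the-envelope check using only the figures from the post:

```python
# Assumption: the 12500 commit is dollars of consumption, not DBUs.
commit_value = 12_500      # dollars of consumption covered by the plan
price_paid = 12_000        # discounted up-front price
savings = 1 - price_paid / commit_value
print(f"effective discount: {savings:.1%}")          # -> 4.0%

# At the premium job-compute rate from the thread title, that commit
# covers a fixed amount of compute:
rate_per_dbu = 0.30
dbus_covered = commit_value / rate_per_dbu
print(f"DBUs covered at $0.30/DBU: {dbus_covered:,.0f}")  # -> ~41,667
```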
- 1376 Views
- 2 replies
- 2 kudos
Resolved! I am facing an issue with the DBFS file server
Hi, I am facing an issue with the DBFS file server. Can anyone guide me on how to resolve it? What steps should I take to resolve the storage issue?
- 1728 Views
- 3 replies
- 3 kudos
Unable to Install Python Wheel Library
Hello Team, can someone let me know if there have been changes to Databricks Community Edition such that it's no longer possible to install Python Wheel libraries? I was able to install Python Wheel libraries as recently as a few days ago, but now...
- 3 kudos
Hi there, check your runtime version to confirm you are using one that supports this. I think the recommended pattern nowadays (don't quote me) is to store your wheel files in the workspace tree.
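A sketch of the workspace-tree pattern the reply describes; the path and wheel file name are hypothetical:

```python
# Notebook cell 1: install a wheel that was uploaded to the workspace tree
# (path and file name are placeholders).
%pip install /Workspace/Shared/wheels/my_package-0.1.0-py3-none-any.whl

# Notebook cell 2: restart the Python process so the package is importable.
dbutils.library.restartPython()
```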
- 2814 Views
- 3 replies
- 4 kudos
Resolved! How does Auto Loader work when triggered via Azure Data Factory?
Hi, I am currently creating an Auto Loader job in Databricks and will be using ADF as an orchestrator. I am quite confused about how this will handle my data, so please clarify if I have misunderstood it. First, I will run my ADF pipeline, which includes an activity to c...
- 4 kudos
Auto Loader processes files incrementally. Let's say you have files in an existing directory called /input_files. The first time you run Auto Loader, it will read all files in that directory (unless you set the option includeExistingFiles to false, like y...
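A minimal Auto Loader sketch matching the reply; the checkpoint path, format, and target table are hypothetical. With `trigger(availableNow=True)` the stream processes everything that arrived since the last run and then stops, which fits an ADF-orchestrated batch schedule.

```python
checkpoint = "/Volumes/main/default/checkpoints/input_files"  # hypothetical

df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", checkpoint)
      # default is true: the very first run also picks up pre-existing files
      .option("cloudFiles.includeExistingFiles", "true")
      .load("/input_files"))

(df.writeStream
   .option("checkpointLocation", checkpoint)
   .trigger(availableNow=True)   # process the backlog, then stop
   .toTable("bronze.input_files"))
```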
- 3997 Views
- 3 replies
- 2 kudos
How can I deduplicate data from my stream?
Hi, I'm new to Databricks and I'm trying to use streaming for my incremental data. This data has duplicates, which can be solved using a window function. Can you check where my code goes wrong? 1------- # Using Auto Loader to read new files schema = df1.sche...
- 2 kudos
Hi @zll_0091, change the output mode to update. Other than that, your code looks fine, but I would rename the variable microdf to windowSpec, because right now it's a little confusing.
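Since the original code is truncated here, a hedged sketch of the other common approach: instead of a window function, Structured Streaming can deduplicate natively with a watermark plus `dropDuplicates`, which keeps `append` output mode and bounds state size. Column and path names are hypothetical, and `event_time` is assumed to be a timestamp column.

```python
deduped = (spark.readStream.format("cloudFiles")
           .option("cloudFiles.format", "csv")
           .option("cloudFiles.schemaLocation", "/tmp/checkpoints/dedup")
           .load("/input_files")
           .withWatermark("event_time", "1 hour")   # bound dedup state
           .dropDuplicates(["id", "event_time"]))   # keep first arrival per key

(deduped.writeStream
 .outputMode("append")
 .option("checkpointLocation", "/tmp/checkpoints/dedup")
 .toTable("silver.events"))
```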
- 3691 Views
- 3 replies
- 0 kudos
Resolved! DLT Compute: "Ephemeral" Job Compute vs. All-purpose compute 2.0 ... WHY?
Hi there, this is a follow-up to a discussion I started last month: Solved: Re: DLT Compute: "Ephemeral" Job Compute vs. All-p... - Databricks Community - 71661. Based on what was discussed, I understand that it's not possible to use "All Purpose Clust...
- 0 kudos
@ChristianRRL, regarding why DLT doesn't allow you to use all-purpose clusters: 1. The DLT runtime is derived from the shared compute DBR; it's not the same runtime and has different features than the common all-purpose runtime. A DLT pipeline is n...
- 2838 Views
- 3 replies
- 0 kudos
Error Creating Primary Key Constraint in DLT
Hello there! Greetings!! I am getting the following error when trying to create a DLT table in my gold layer: com.databricks.sql.managedcatalog.PrimaryKeyColumnsNullableException: Cannot create the primary key `x_key` because its child column(s) `x_key...
- 0 kudos
Thank you @szymon_dybczak for the response. As you can see, I have already defined the not-null constraint in my definition for the primary key x_key: CONSTRAINT pk_key_not_null EXPECT (x_key IS NOT NULL). But I am still getting the same error. Also, I chec...
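For anyone hitting the same PrimaryKeyColumnsNullableException: the likely cause is that a DLT expectation (`CONSTRAINT ... EXPECT (x_key IS NOT NULL)`) only validates rows at runtime; it does not change the column's nullability in the table schema, and a primary key requires a schema-level NOT NULL. A sketch in the Python DLT API; the table and column names are hypothetical, and this assumes a Unity Catalog pipeline, where informational primary keys are supported:

```python
import dlt

@dlt.table(
    # Declare NOT NULL in the schema itself, not only as an expectation.
    schema="""
        x_key STRING NOT NULL,
        x_value STRING,
        CONSTRAINT pk_x PRIMARY KEY (x_key)
    """
)
def gold_dim_x():
    # Filter out NULL keys so rows satisfy the schema-level constraint.
    return dlt.read("silver_dim_x").where("x_key IS NOT NULL")
```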
- 5388 Views
- 2 replies
- 1 kudos
Oracle Data Warehouse Replacement with Databricks Using Delta Lake
I am new to Spark and Databricks and am exploring them to understand how to replace an Oracle data warehouse with Databricks (Delta Lake) and how to use Spark to improve the ELT/ETL performance of the existing DW. Now, I have done some lookups in Databricks blogs, Spark do...
- 811 Views
- 0 replies
- 1 kudos
Regarding registration for the Databricks certification exam
I have a doubt: can only working professionals register for certification, or can students studying a Databricks course also register for the exam? I ask because some steps of the registration ask for a company and a company email.
- 25896 Views
- 6 replies
- 0 kudos
Error "Root storage credential for metastore does not exist"While creating the Databricks Volume in
Hi, I tried to create the databricks volume in unity catalog but it threw this error:Root storage credential for metastore XXXXXX does not exist. Please contact your Databricks representative or consider updating the metastore with a valid storage cr...
- 0 kudos
I had a similar issue that was resolved by assigning a storage credential to the metastore using a REST API call. - Make sure that you are a metastore or account admin. - Make sure that the storage credential is correctly configured and its access connector ide...
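A sketch of the REST call described above, using the Unity Catalog metastore-update endpoint; the host, token, and IDs are placeholders you must fill in:

```python
import requests

host = "https://<workspace-host>"
token = "<pat-token>"
metastore_id = "<metastore-id>"
credential_id = "<storage-credential-id>"

# Point the metastore's root storage at an existing storage credential.
resp = requests.patch(
    f"{host}/api/2.1/unity-catalog/metastores/{metastore_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"storage_root_credential_id": credential_id},
)
resp.raise_for_status()
print(resp.json())
```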