- 1917 Views
- 4 replies
- 2 kudos
Databricks All Delta Tables Data Read
If we want to read the data from all of the Databricks tables at once, how can we do it?
- 2 kudos
Hi @Krishna2110, here it is, it should work now:

```python
tables = spark.sql("SHOW TABLES IN ewt_edp_prod.crm_raw").collect()
for row in tables:
    table_name = f"ewt_edp_prod.{row[0]}.{row[1]}"
    try:
        df = spark.table(table_name)
        count = df...
```
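Since the reply above is cut off, here is a fuller sketch of the same pattern. The catalog and schema names (`ewt_edp_prod.crm_raw`) come from the reply; the helper names (`qualified_name`, `count_all_tables`) are illustrative, not an official API.

```python
def qualified_name(catalog, row):
    """Build a three-level name from a SHOW TABLES row (database, tableName, isTemporary)."""
    return f"{catalog}.{row[0]}.{row[1]}"

def count_all_tables(spark, catalog="ewt_edp_prod", schema="crm_raw"):
    """Read every table in one schema and return {fully.qualified.name: row count}."""
    counts = {}
    for row in spark.sql(f"SHOW TABLES IN {catalog}.{schema}").collect():
        name = qualified_name(catalog, row)
        try:
            counts[name] = spark.table(name).count()
        except Exception as e:  # some tables may be unreadable (permissions, corruption)
            counts[name] = f"error: {e}"
    return counts
```

The try/except matters here: one unreadable table should not abort the whole sweep.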
- 5263 Views
- 2 replies
- 2 kudos
Resolved! Convert Date String to Date type Returning null
Hi, I am using Databricks SQL and I am converting an integer field which is of format '20240719' (i.e. 'yyyyMMdd'). I am able to convert it to date type using to_date(forecastedHorizonPeriodDate, 'yyyyMMdd'). I then tried to change the format of this dat...
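A plain-Python analogue of the conversion being described, assuming the goal is to parse 'yyyyMMdd' and re-emit a different display format. A common cause of nulls here is applying to_date a second time to a value that is already a DATE; once parsed, the reformat step is date_format, not to_date.

```python
from datetime import datetime

raw = "20240719"
# like to_date(col, 'yyyyMMdd') in Databricks SQL
parsed = datetime.strptime(raw, "%Y%m%d").date()
# like date_format(col, 'yyyy-MM-dd') -- a formatting step, not a second parse
display = parsed.strftime("%Y-%m-%d")
```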
- 4260 Views
- 1 replies
- 0 kudos
Login issue
Hi, I'm getting the error message below: "We were not able to find a Community Edition workspace with this email. Please login to non-community-edition workspaces you may have access to."
- 0 kudos
Hi @virendra1212, take a look at the thread below; maybe some of the suggestions will help in your case: "Community Edition Login Issues" on Databricks Community. You signed up before May 14, 2022 - please confirm that th...
- 838 Views
- 1 replies
- 0 kudos
auto statistics cost
Hi, are there any cost implications for automatic statistics collection, or does Databricks provide it as a feature at no extra cost on my cluster?
- 783 Views
- 0 replies
- 0 kudos
Shuffle Partitions
What if I have a lot of empty shuffle partitions due to data skew? Secondly, what happens if the shuffle partition size is 128 MB but the size of one key's partition is 700 MB?
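This thread got no replies, but both situations are what Spark's Adaptive Query Execution (AQE) is designed for: it coalesces small or empty post-shuffle partitions and splits oversized (skewed) ones at join time. A sketch of the relevant session settings (the 128 MB advisory size mirrors the question; tune it for your workload):

```python
aqe_conf = {
    "spark.sql.adaptive.enabled": "true",
    # merge tiny/empty post-shuffle partitions into fewer, fuller ones:
    "spark.sql.adaptive.coalescePartitions.enabled": "true",
    # split oversized skewed partitions (e.g. a 700 MB key) during joins:
    "spark.sql.adaptive.skewJoin.enabled": "true",
    # target post-shuffle partition size in bytes (128 MB here):
    "spark.sql.adaptive.advisoryPartitionSizeInBytes": str(128 * 1024 * 1024),
}

def apply_aqe(spark, conf=aqe_conf):
    """Apply the AQE settings to an active SparkSession."""
    for key, value in conf.items():
        spark.conf.set(key, value)
```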
- 1758 Views
- 3 replies
- 4 kudos
The way to visualize logging of Spark Job graphically
Is there any way to visualize logging such as execution runtime or memory usage of a Spark job graphically, just like the image below, by utilizing Databricks Partner Connect for free? Also, I'd appreciate knowing any other ways to visualize the logging o...
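One free, tool-agnostic option (separate from Partner Connect) is the Spark UI's REST status API on the driver, which exposes per-stage runtime and spill metrics you can feed into any plotting library. The host, port, and application id below are placeholders for your cluster; `stage_summary` is a helper name of mine.

```python
import json
from urllib.request import urlopen

def stage_summary(stages):
    """Reduce the /stages payload to the fields worth charting."""
    return [
        {
            "stageId": s["stageId"],
            "executorRunTime": s.get("executorRunTime", 0),       # ms across executors
            "memoryBytesSpilled": s.get("memoryBytesSpilled", 0),  # memory pressure signal
        }
        for s in stages
    ]

def fetch_stages(base="http://localhost:4040", app_id="app-123"):
    """Pull stage metrics from the Spark UI REST API (driver, default port 4040)."""
    with urlopen(f"{base}/api/v1/applications/{app_id}/stages") as resp:
        return stage_summary(json.load(resp))
```

The returned list of dicts drops straight into e.g. a pandas DataFrame for a bar chart of runtime per stage.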
- 4798 Views
- 1 replies
- 0 kudos
How to Add value to Comment Column in Databricks Catalog View for Foreign table (Azure SQL)
Hi. I'm struggling to add a description by script to the comment column within the catalog view in Databricks, particularly for foreign/external tables sourced from Azure SQL. I have no issue doing that for Delta tables. Also, for information schema columns f...
- 0 kudos
@Retired_mod Sorry, I did read this a while back but forgot to reply. Thanks for the information. I found all those steps as well, and regarding "3. Programmatic Approach (For Azure SQL)": if you are adding a description on the SQL side that way, I don't think it will app...
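For reference, the scripted pattern that works for Unity Catalog tables is COMMENT ON plus ALTER COLUMN ... COMMENT. Whether these propagate for foreign (Azure SQL) tables is exactly what this thread questions, so treat this as the pattern to try, not a confirmed fix; the table and column names are placeholders.

```python
def comment_statements(table="main.foreign_sql.customers",
                       table_comment="Customer master from Azure SQL",
                       column_comments=None):
    """Build the SQL statements that set table- and column-level comments."""
    column_comments = column_comments or {"id": "Primary key"}
    stmts = [f"COMMENT ON TABLE {table} IS '{table_comment}'"]
    for col, comment in column_comments.items():
        stmts.append(f"ALTER TABLE {table} ALTER COLUMN {col} COMMENT '{comment}'")
    return stmts

# run each statement with spark.sql(stmt) in a notebook
```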
- 2888 Views
- 2 replies
- 0 kudos
Resolved! DLT Online Table with VNnet Enable on Blob Storage Get 403 Issue
I am trying to create an online table in Unity Catalog. However, I get a GET 403 error: DataPlaneException: Failed to start the DLT service on cluster . Please check the stack trace below or driver logs for more details. com.databricks.pipelines....
- 0 kudos
I figured it out. It was because of the Network Connectivity Configurations: I did not have one set up with a private endpoint connection to the ADLS Gen2. I followed the instructions here: https://learn.microsoft.com/en-us/azure/databricks/security/...
- 11216 Views
- 8 replies
- 3 kudos
What are the different ways to pull the log data from Splunk to Databricks?
Hi, I have recently started a Splunk integration with Databricks. Basically, I am trying to ingest data from Splunk into Databricks. I have gone through the documentation regarding the Splunk integration; there is some basic information about the integrat...
- 3 kudos
Hi @Arch_dbxlearner, have you done the integration with Splunk? If yes, could you please help?
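One common pull pattern, for anyone landing here: Splunk's REST search export endpoint (`/services/search/jobs/export` on the management port, default 8089) streams results as JSON that can then be written into a Databricks table. The host, query, and function name below are placeholders, not a vetted integration.

```python
from urllib.parse import urlencode

def export_request(base="https://splunk.example.com:8089",
                   query="search index=main earliest=-1h"):
    """Build the URL and form body for a Splunk REST export call."""
    url = f"{base}/services/search/jobs/export"
    body = urlencode({"search": query, "output_mode": "json"})
    return url, body

# POST `body` to `url` with an Authorization (token) header; each response
# line is a JSON result row you can parse and save as a DataFrame.
```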
- 728 Views
- 1 replies
- 1 kudos
"No API found for 'POST /workspace-files" error while trying to upload a JAR
Hi, I'm using CE and trying to upload a JAR library of about 45 MB into my workspace so I can use it from PySpark, but I'm getting the error "No API found for 'POST /workspace-files'". Any thoughts?
- 1220 Views
- 2 replies
- 1 kudos
Regarding course
Hello, I am a newbie on this platform. Can anyone please tell me how I can enroll in the courses we are supposed to complete to get a voucher for the exams? I came to know about the Databricks Learning Festival.
- 1 kudos
Hi @Nikhilkamode , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your fe...
- 1228 Views
- 2 replies
- 0 kudos
Issues around float datatypes
Float data from a file is never getting copied to the temp_table created, though the schema matches. As a workaround, using CREATE TABLE [USING] I am able to insert the file data into a temp_table. Is this a known issue? The COPY INTO issue is explained w...
- 0 kudos
Hi @inagar , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback...
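One workaround worth trying before abandoning COPY INTO: cast the float column explicitly in the source SELECT, since silent NULLs often come from an implicit-cast mismatch between the inferred file schema and the target column. The target table, path, and column name here are placeholders.

```python
def copy_into_with_cast(target="main.stage.temp_table",
                        path="/Volumes/raw/files/",
                        float_col="price"):
    """Build a COPY INTO statement that casts a column explicitly in the source SELECT."""
    return f"""COPY INTO {target}
FROM (SELECT CAST({float_col} AS FLOAT) AS {float_col} FROM '{path}')
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true')"""

# run with spark.sql(copy_into_with_cast())
```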
- 2377 Views
- 2 replies
- 2 kudos
Learning via Free Trial Azure
Hi, I am following a Databricks course on Udemy, and the course instructed me to access Databricks via the free trial of Azure. Once I created my account on Azure and loaded Databricks, I tried to create a cluster, but this never succeeds. It takes an extremel...
- 2 kudos
Hi @clock4eva , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...
- 2754 Views
- 2 replies
- 1 kudos
How to Optimize Delta Lake Performance for Large-Scale Data Ingestion?
Hi everyone,I'm currently working on a project that involves large-scale data ingestion into Delta Lake on Databricks. While the ingestion process is functioning, I've noticed performance bottlenecks, especially with increasing data volumes. Could yo...
- 1 kudos
Hi @Syleena23 , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedb...
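For anyone with the same ingestion-throughput question, the usual starting points are Delta's write-time bin-packing and auto-compaction table properties plus periodic OPTIMIZE with Z-ordering on a common filter column. The table name and Z-order column below are illustrative, and the right column depends on your query patterns.

```python
table_properties = {
    "delta.autoOptimize.optimizeWrite": "true",  # bin-pack files at write time
    "delta.autoOptimize.autoCompact": "true",    # compact small files after writes
}

def tuning_statements(table="main.raw.events", zorder_col="event_date"):
    """Build the SQL to apply the properties above and run a periodic compaction."""
    props = ", ".join(f"'{k}' = '{v}'" for k, v in table_properties.items())
    return [
        f"ALTER TABLE {table} SET TBLPROPERTIES ({props})",
        f"OPTIMIZE {table} ZORDER BY ({zorder_col})",
    ]

# run each statement with spark.sql(stmt), OPTIMIZE on a schedule rather than per batch
```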
- 1587 Views
- 2 replies
- 0 kudos
Monitoring Databricks
Good day, all. Has anyone done a Databricks monitoring integration with any 3rd-party applications like Grafana or ELK, to get infrastructure monitoring such as cpu_utilization, memory, and job monitoring? I am able to write Spark code to get cpu_utili...
- 0 kudos
Hi @SainnathReddy_M , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your...