- 6705 Views
- 2 replies
- 1 kudos
Power BI Import Model Refresh from Databricks SQL Warehouse - Query has been timed out due to inactivity
We have an intermittent issue where occasionally a partition in our Power BI Import Dataset times out at 5 hours. When I look at Query History in Databricks SQL, I see a query that failed with the following error message: "Query has been timed out ...
- 1 kudos
The only solution we have been able to come up with is to create a Notebook in Databricks that uses the Power BI API to check the status of a Refresh. We schedule it a bit after we expect the Refresh to complete. If it is still running, we kill th...
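The workaround described in that reply can be sketched against the Power BI REST API's refresh-history endpoint. Everything below is an assumption for illustration: the group/dataset IDs, the bearer token, and the convention that Power BI refresh history reports an in-flight refresh with status "Unknown".

```python
import json
import urllib.request
from datetime import datetime, timezone

API = "https://api.powerbi.com/v1.0/myorg"

def should_cancel(refresh: dict, max_minutes: float) -> bool:
    """Decide whether a refresh has run past its allowed window.

    `refresh` is one entry from the Power BI 'refreshes' response, e.g.
    {"status": "Unknown", "startTime": "2024-01-01T02:00:00Z", ...}.
    Refresh history reports an in-flight refresh with status "Unknown".
    """
    if refresh.get("status") != "Unknown":  # completed or failed: nothing to do
        return False
    started = datetime.fromisoformat(refresh["startTime"].replace("Z", "+00:00"))
    elapsed_min = (datetime.now(timezone.utc) - started).total_seconds() / 60
    return elapsed_min > max_minutes

def latest_refresh(group_id: str, dataset_id: str, token: str) -> dict:
    """Fetch the most recent refresh entry for a dataset (IDs and token are
    hypothetical placeholders)."""
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top=1"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"][0]
```

If `should_cancel` returns true, the scheduled notebook would then cancel the refresh (for enhanced refreshes this is a DELETE on the refresh's request ID), which matches the "kill it" step the poster describes.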
- 14221 Views
- 11 replies
- 1 kudos
DLT Pipeline issue - Failed to read dataset. Dataset is not defined in the pipeline.
Background: I have created a DLT pipeline in which I am creating a temporary table. There are 5 temporary tables as such. When I executed these in an independent notebook they all worked fine with DLT. Now I have merged this notebook (keeping same ...
- 1 kudos
I am sorry, but the information you are providing is not helping at all. Please paste your code here.
- 8130 Views
- 0 replies
- 0 kudos
Databricks for practice at no cost: which cloud service or combination do I need to use?
Hi All, Context: I want to use Databricks for practice to create projects and keep polishing my knowledge. My free credits are already used up. Now can you please give me tips on how to run Databricks, and in which cloud provider (storage account com...
- 1872 Views
- 0 replies
- 0 kudos
Not able to download Certificate
Hi All, I took the course Get Started With Data Engineering from the course link below: https://www.databricks.com/learn/training/getting-started-with-data-engineering#data-video But after completing the quiz, I am not able to download the certificate. The a...
- 10260 Views
- 2 replies
- 0 kudos
Unlock Data Engineering Essentials in Just 90 Minutes - Get Certified for FREE!
There’s an increasing demand for data, analytics and AI talent in every industry. Start building your data engineering expertise with this self-paced course — and earn an industry-recognized Databricks certificate. This course provides four short tu...
- 0 kudos
Same here. I am not able to download any certificate even after passing the quiz. But the course link https://www.databricks.com/learn/training/getting-started-with-data-engineering#data-video clearly says: take a short knowledge test and earn a com...
- 2991 Views
- 1 reply
- 0 kudos
Resolved! Is it possible to pass a Spark session to other python files?
I am setting up pytest for my repo. I have my functions in separate Python files and run pytest from one notebook. For each testing file, I have to create a new Spark session as follows: @pytest.fixture(scope="session") def spark(): spark = (SparkSe...
- 0 kudos
I was able to do it by placing the Spark session fixture in the conftest.py file in the root directory.
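The accepted conftest.py approach can be sketched as follows. This is a minimal illustration, not the poster's exact code: the builder options (local master, app name) are assumptions for running tests outside Databricks, where the active session would typically be reused instead.

```python
# conftest.py -- placed in the repo root so every test file can use `spark`
import pytest

@pytest.fixture(scope="session")
def spark():
    # Import lazily so test collection stays cheap even when pyspark is heavy.
    from pyspark.sql import SparkSession

    session = (
        SparkSession.builder
        .master("local[2]")       # assumption: local test runs
        .appName("pytest-spark")  # hypothetical app name
        .getOrCreate()
    )
    yield session
    session.stop()
```

Any test file in the repo can then simply declare the fixture as a parameter, e.g. `def test_count(spark): assert spark.range(3).count() == 3`, and pytest reuses one session for the whole run because of `scope="session"`.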
- 1886 Views
- 0 replies
- 0 kudos
Databricks Certification Exam Got Suspended - Requesting Support
I encountered numerous challenges during my exam, starting with issues related to system compatibility and difficulties with my microphone and other settings. Despite attempting to contact support multiple times, it was not easy to get assistance. Aft...
- 1755 Views
- 1 reply
- 0 kudos
Efficient Detection of Schema Mismatch in CSV Files During Single Pass Reading
Hello, when I read a CSV file with a schema object, if a column in the original CSV contains a value of a different datatype than specified in the schema, the result is a null cell. Is there an efficient way to identify these cases without having to ...
- 0 kudos
Maybe you can try to read the data and let Auto Loader move mismatched data to, e.g., the rescued data column: https://learn.microsoft.com/en-us/azure/databricks/ingestion/auto-loader/schema#--what-is-the-rescued-data-column Then you can decide what you do with rescue...
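As a plain-Python illustration of the "identify mismatched cells in a single pass" idea (outside Spark, which nulls such cells silently in PERMISSIVE mode), a reader can validate each cell against a declared schema while it streams through the file. The schema, column names, and sample data below are hypothetical.

```python
import csv
import io

# Hypothetical schema: column name -> parser that raises on bad input.
SCHEMA = {"id": int, "amount": float, "city": str}

def find_mismatches(csv_text: str):
    """Single pass over a CSV, yielding (row_number, column, raw_value)
    for every cell that does not parse as its declared type."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row_no, row in enumerate(reader, start=2):  # line 1 is the header
        for col, caster in SCHEMA.items():
            try:
                caster(row[col])
            except (TypeError, ValueError):
                yield (row_no, col, row[col])

sample = "id,amount,city\n1,9.5,Oslo\ntwo,3.1,Bergen\n3,abc,Paris\n"
print(list(find_mismatches(sample)))
# [(3, 'id', 'two'), (4, 'amount', 'abc')]
```

The same intent maps onto the rescued data column the reply links to: instead of nulling a bad cell, the mismatched raw value is kept alongside the row so you can inspect or repair it later.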
- 10755 Views
- 5 replies
- 2 kudos
[Unity Catalog]-CosmosDB: Data source v2 are not supported
I've worked on Azure Databricks connected to Azure Cosmos DB. It works when my cluster does not have Unity Catalog (UC) enabled. But when I enable UC, it returns an error like below: AnalysisException: [UC_COMMAND_NOT_SUPPORTED.WITHOUT_RECOMMENDATION] The command(s...
- 1688 Views
- 1 reply
- 0 kudos
Cluster
I’m not able to see any workspace view or cluster.
- 0 kudos
Hey, could you be more precise with your question?
- 3599 Views
- 2 replies
- 1 kudos
dbutils.fs.ls versus pathlib.Path
Hello community members, dbutils.fs.ls('/') exposes the distributed file system (DBFS) on the Databricks cluster. Similarly, the Python library pathlib can also expose 4 files in the cluster, like below: from pathlib import Path; mypath = Path('/'); for i...
- 1 kudos
I think it will be useful if you look at this documentation to understand the different kinds of files and how you can interact with them: https://learn.microsoft.com/en-us/azure/databricks/files/ There is not much to say beyond that dbutils is "Databricks code" that ...
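One way to see the difference the thread is circling: pathlib walks the driver node's local POSIX filesystem, while dbutils.fs addresses DBFS. A minimal sketch, assuming the standard /dbfs FUSE mount that Databricks exposes on the driver (the mount's presence is the assumption here):

```python
from pathlib import Path

# pathlib sees the driver node's local filesystem. On Databricks, DBFS is
# surfaced into that filesystem under /dbfs, so the local-path counterpart
# of dbutils.fs.ls('/') would be listing /dbfs.
local_root = sorted(p.name for p in Path("/").iterdir())
print(local_root[:5])  # ordinary OS entries such as bin, etc, usr, ...

dbfs_root = Path("/dbfs")
if dbfs_root.exists():  # only true on a Databricks driver with the FUSE mount
    print([p.name for p in dbfs_root.iterdir()])
```

So the two listings agree only where the FUSE mount bridges them; on a plain laptop, `Path('/')` shows the OS root and `/dbfs` simply does not exist.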
- 4535 Views
- 1 reply
- 0 kudos
How to ingest files from volume using autoloader
I am doing a test run. I am uploading files to a volume and then using Auto Loader to ingest the files and create a table. I am getting this error message: com.databricks.sql.cloudfiles.errors....
- 0 kudos
Hey, I think you are mixing DLT syntax with PySpark syntax. In DLT you should use: CREATE OR REFRESH STREAMING TABLE <table-name> AS SELECT * FROM STREAM read_files('<path-to-source-data>', format => '<file-format>'); or in Python: @dlt....
- 9819 Views
- 5 replies
- 2 kudos
DLT Compute Resources - What Compute Is It???
Hi there, I'm wondering if someone can help me understand what compute resources DLT uses? It's not clear to me at all if it uses the last compute cluster I had been working on, or something else entirely. Can someone please help clarify this?
- 2 kudos
Well, one thing they emphasize in the 'Advanced Data Engineer' training is that job clusters will terminate within 5 minutes after a job is completed. So this could support your theory about lowering costs. I think job clusters are actually design...
- 2153 Views
- 1 reply
- 0 kudos
python library in databricks
Hello community members, I am seeking to understand where Databricks keeps all the Python libraries. For a start, I tried the two lines below: import sys; sys.path This lists all the paths, but I can't look inside them. How is DBFS different from these paths...
- 0 kudos
Hello, all your libraries are installed on the Databricks cluster driver node, on the OS disk. DBFS is like a mounted cloud storage account. You have various ways of working with libraries, but Databricks only loads some of the libraries that come with the cluster image....
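A quick way to inspect where libraries actually live, from any Python environment (nothing Databricks-specific is assumed; on a Databricks driver these paths point at the node's local OS disk, not DBFS):

```python
import json
import sys
import sysconfig

# sys.path is a list (not a callable) of the directories the interpreter
# searches for imports.
print(sys.path[:3])

# site-packages, where pip-installed libraries land:
print(sysconfig.get_path("purelib"))

# Any imported module reports its own on-disk location:
print(json.__file__)
```

Comparing these paths with `dbutils.fs.ls('/')` output makes the reply's point concrete: the library directories are ordinary local paths on the driver, while DBFS is a separate mounted storage layer.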
- 3069 Views
- 1 reply
- 0 kudos
Power BI keeps SQL Warehouse Running
Hi, I have a SQL Warehouse in serverless mode, set to shut down after 5 minutes. Using the Databricks web IDE, this works as expected. However, if I connect Power BI, import data to PBI and then leave the application open, the SQL Warehouse does not s...
- 0 kudos
Repeating this test today, the SQL Warehouse shut down properly. Thanks for your helpful reply.