- 3053 Views
- 2 replies
- 0 kudos
Autoloader - "cloudFiles.backfillInterval"
1. How do I use the cloudFiles.backfillInterval option in a notebook?
2. Does any other property need to be set for it?
3. Where exactly does it go: the readStream portion of the code or the writeStream portion?
4. Do you have any sample code?
5. Where we find ...
1. Is the following code correct for specifying the .option("cloudFiles.backfillInterval", 300)?
df = spark.readStream.format("cloudFiles") \
    .option("cloudFiles.format", "csv") \
    .option("cloudFiles.schemaLocation", f"dbfs:/FileStore/xyz/back_fill_opti...
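The question above was truncated, so here is a minimal sketch of where cloudFiles.backfillInterval fits: it belongs on the readStream side of the pipeline. Per the Auto Loader documentation the option takes an interval string such as "1 day" (not a bare number of seconds), though that should be verified against the current docs; the paths and option values below are hypothetical.

```python
# Sketch only: Auto Loader read with a backfill interval. The option is
# attached to readStream, never to writeStream. Paths are hypothetical.
autoloader_options = {
    "cloudFiles.format": "csv",
    "cloudFiles.schemaLocation": "dbfs:/FileStore/example/_schema",  # hypothetical
    "cloudFiles.backfillInterval": "1 day",  # interval string, per the docs
}

def read_with_autoloader(spark, source_path, options=autoloader_options):
    """Attach every cloudFiles option to the streaming reader."""
    reader = spark.readStream.format("cloudFiles")
    for key, value in options.items():
        reader = reader.option(key, value)
    return reader.load(source_path)
```

On a cluster this would be followed by a writeStream with a checkpointLocation; the backfill option itself never appears on the write side.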
- 1876 Views
- 0 replies
- 0 kudos
Sharing compute between tasks of a job
Is there a way to set up a workflow with multiple tasks, so that different tasks can share the same compute resource at the same time? I understand that an instance pool may be an option here. Wasn't sure if there were other possible options to cons...
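For the question above: in Databricks Workflows, multiple tasks in one job can run on the same compute by referencing a shared job_cluster_key in the job definition. A hedged sketch of the Jobs API 2.1 payload shape follows; cluster sizing, paths, and names are all placeholders.

```python
# Sketch of a Jobs API 2.1 job spec where two tasks share one job cluster
# via the same job_cluster_key. All names and values are illustrative.
job_spec = {
    "name": "shared-compute-example",
    "job_clusters": [{
        "job_cluster_key": "shared_cluster",
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",  # placeholder
            "node_type_id": "i3.xlarge",          # placeholder
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "ingest",
         "job_cluster_key": "shared_cluster",
         "notebook_task": {"notebook_path": "/Repos/example/ingest"}},
        {"task_key": "transform",
         "job_cluster_key": "shared_cluster",
         "depends_on": [{"task_key": "ingest"}],
         "notebook_task": {"notebook_path": "/Repos/example/transform"}},
    ],
}
```

Tasks that share a job cluster and run in parallel use that cluster concurrently; an instance pool, by contrast, shares warm VMs across separate clusters rather than sharing one cluster.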
- 1697 Views
- 0 replies
- 0 kudos
Unable to Login to community edition
I am unable to log in to my Databricks Community Edition account; my email is "sureshsuthar1251@gmail.com". When I try to log in, it says the account was not found.
- 2993 Views
- 1 replies
- 0 kudos
Can the same Databricks account be used for both AWS and Azure
I am testing Databricks with non-AWS S3 object storage and want to test it with Databricks on Azure and Databricks on AWS. My Databricks account is currently using Databricks on AWS with metadata and a single node compute running. Can the same accou...
Thank you Kaniz for posting the link. Looking at that, I believe the answer is: this is not possible in Databricks for now.
- 1966 Views
- 0 replies
- 0 kudos
Databricks data engineer associate got paused
Hi team, I've had a disappointing experience during my first certification attempt and need help resolving the issue. While attempting the certification (Databricks Data Engineer Associate), every 2-3 questions I kept receiving a message that the ...
- 3370 Views
- 3 replies
- 0 kudos
Resolved! Databricks data engineer associate Exam got suspended.
@Cert-Team I registered for the Databricks Certified Data Engineer Associate exam, completed all the biometric and other required prerequisites, and launched my exam. While writing the exam, it exited twice with some technical issue, although I don...
Thank you for your support. I have taken my test.
- 2178 Views
- 0 replies
- 0 kudos
DataFrame column StructType metadata not getting saved to Unity Catalog
I have the below schema:
schema = StructType([StructField(name="Test", dataType=StringType(), nullable=False, metadata={"description": "This is to test metadata description."})])
data = [('Test1',), ('Test2',), ('Test3',)]
df = spark.createDataFrame(data, ...
- 4624 Views
- 1 replies
- 0 kudos
Connecting to Databricks using Python (VS Code)
I'm trying to connect to tables/views in Databricks using Python (via VS Code). However, I'm getting the following error:
File "C:\Users\XXXXXXXX\AppData\Roaming\Python\Python311\site-packages\urllib3\util\retry.py", line 592, in increment
raise MaxRe...
- 6994 Views
- 0 replies
- 0 kudos
Get Network usage read and write from a job cluster on databricks
Hi! I am writing because I am trying to get network usage data from a Databricks job cluster. I know that the data is in the Spark UI under Storage, but when the job cluster is terminated I cannot access the information, and I am trying to make some ...
- 10621 Views
- 3 replies
- 3 kudos
Azure Databricks User Alerts: query_result_table
Hi, how can I set up the notification email to show all of the rows from query_result_table, not only the first 10?
I see that my previous message was cut off. So, I just wanted to check where I can change the email setting using the code above, because when I open User > Account Settings > Notifications I can see only an option like this: there is no place to configur...
- 5355 Views
- 1 replies
- 0 kudos
Unable to use job cluster for task in workflows
Hi, I have a workflow set up in Databricks using 12.2 LTS ML. I am trying to use a job cluster for the task, but I am getting the following error: Spark Conf: 'spark.databricks.acl.enabled' is not allowed when choosing an access mode. As a result I have to...
- 18667 Views
- 1 replies
- 1 kudos
Resolved! Some streams terminated before this command could finish! -> java.lang.NoClassDefFoundError: scala/c
Hello, I am facing:
Some streams terminated before this command could finish!
java.lang.NoClassDefFoundError: scala/compat/java8/FutureConverters$
Running a very simple query on Event Hubs:
df = spark \
    .readStream \
    .format("eventhubs") \
    .options(**ehConf) ...
Of course, just after writing that post I realized how silly this question is ... after adding scala_java8_compat_2_12_1_0_2.jar it works as expected.
- 1844 Views
- 0 replies
- 0 kudos
Thoughts on how to improve string search queries
Please see the sample code I am running below. What options can I explore to improve the speed of query execution in such a scenario? The current full code takes about 4 hrs to run on 1.5 billion rows. Thanks!
SELECT fullVisitorId, VisitId, EventDate, PagePath, d...
- 3959 Views
- 1 replies
- 0 kudos
API for Databricks code functionality
I have a Databricks notebook for which I want to create an API. From that API I will have to call the notebook and perform certain operations, and the result will be sent back to the API. I don't want to do this via Postman, as someone would have to install Postman at their ...
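One Postman-free route for the question above is the Databricks Jobs API 2.1: wire the notebook to a job, then POST to the run-now endpoint from any HTTP client. The workspace URL, token, and job ID below are placeholders, and the actual network call is shown commented out so the sketch stays self-contained.

```python
# Sketch: trigger a notebook (via its job) with the Jobs API 2.1
# run-now endpoint. All identifiers are placeholders.
DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder

def build_run_now_request(job_id, notebook_params):
    """Assemble the URL and JSON body for POST /api/2.1/jobs/run-now."""
    return {
        "url": f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        "json": {"job_id": job_id, "notebook_params": notebook_params},
    }

# With the `requests` package and a personal access token:
# req = build_run_now_request(123, {"input": "value"})
# requests.post(req["url"], json=req["json"],
#               headers={"Authorization": f"Bearer {token}"})
```

The response contains a run_id, which can then be polled with the runs/get endpoint to retrieve the run's result state.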
- 1576 Views
- 1 replies
- 0 kudos
Error ingesting files with databricks jobs
The source path that I want to ingest files from is: "gs://bucket-name/folder1/folder2/*/*.json". I have a file in this path that ends with ".json.gz", and the Databricks job ingests this file even though it isn't supposed to. How can I fix it? Thanks.