- 1077 Views
- 1 replies
- 0 kudos
Databricks Data Engineer Associate certification exam suspended suddenly
Hi @Cert-Team, I hope this message finds you well. I am writing to request a review of my recently suspended exam. I believe that my situation warrants reconsideration, and I would like to provide some context for your understanding. I applied for Datab...
Hi @Sujitha or @Kaniz, can you please respond to the above query swiftly? I need to complete my test soon. Kindly help me resume or restart my test. Regards, Fasih Ahmed
- 1524 Views
- 0 replies
- 0 kudos
Databricks Certification voucher not received
Hi Team, I attended the Advantage Lakehouse: Fueling Innovation in the Era of Data and AI webinar and completed Databricks Lakehouse Fundamentals and the feedback survey, but I still have not received the Databricks voucher. Could you please look i...
- 1370 Views
- 0 replies
- 0 kudos
Sharing compute between tasks of a job
Is there a way to set up a workflow with multiple tasks so that different tasks can share the same compute resource at the same time? I understand that an instance pool may be an option here, but I wasn't sure if there were other possible options to cons...
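For reference, Databricks Jobs do support sharing one cluster across tasks via the `job_clusters` block and a `job_cluster_key` on each task; tasks without dependencies between them can then run concurrently on that cluster. A minimal sketch of a Jobs API 2.1 payload (job name, paths, and node types below are placeholders):

```python
import json

# Sketch of a Jobs API 2.1 job definition in which two tasks share one
# job cluster via `job_cluster_key`. All names and sizes are placeholders.
job_spec = {
    "name": "example-shared-compute-job",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {"task_key": "task_a", "job_cluster_key": "shared_cluster",
         "notebook_task": {"notebook_path": "/Repos/example/task_a"}},
        # No depends_on here, so task_b may run at the same time as task_a
        # on the same shared cluster.
        {"task_key": "task_b", "job_cluster_key": "shared_cluster",
         "notebook_task": {"notebook_path": "/Repos/example/task_b"}},
    ],
}

print(json.dumps(job_spec, indent=2))
```

An instance pool, by contrast, shares VMs between clusters but still gives each task its own cluster; the shared job cluster above runs the tasks on literally the same compute.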
- 1464 Views
- 0 replies
- 0 kudos
Databricks data engineer associate got paused
Hi team, I've had a disappointing experience during my first certification attempt and need help resolving the issue. While taking the Databricks Data Engineer Associate certification, every 2-3 questions I kept receiving a message that the ...
- 1655 Views
- 0 replies
- 0 kudos
DataFrame column StructType metadata not getting saved to Unity Catalog
I have the below schema:
schema = StructType([StructField(name="Test", dataType=StringType(), nullable=False, metadata={"description": "This is to test metadata description."})])
data = [('Test1',), ('Test2',), ('Test3',)]
df = spark.createDataFrame(data, ...
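One hedged workaround for this situation: StructField metadata is a Spark-level construct, and what catalogs typically surface is the column comment, so the description can be copied into a `COMMENT` via `ALTER TABLE ... ALTER COLUMN`. The helper and table name below are hypothetical illustrations, not confirmed Unity Catalog behavior:

```python
# Hypothetical helper: generate ALTER TABLE statements that copy each
# column's description text into a catalog-visible column comment.
def comment_statements(table, descriptions):
    """descriptions: mapping of column name -> description text."""
    stmts = []
    for col, desc in descriptions.items():
        escaped = desc.replace("'", "\\'")  # naive quoting for the sketch
        stmts.append(
            f"ALTER TABLE {table} ALTER COLUMN {col} COMMENT '{escaped}'"
        )
    return stmts

stmts = comment_statements(
    "main.default.test_table",
    {"Test": "This is to test metadata description."},
)
print(stmts[0])
```

In a notebook, each generated statement would then be passed to `spark.sql(...)` after the table is written.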
- 12241 Views
- 8 replies
- 0 kudos
Resolved! databricks data engineer associate exam
Hello Team, I had a frustrating experience while attempting my first Databricks certification. I was taking the exam when, abruptly, the proctor asked me to show my desk; I showed everything, every corner of my bed. It was neat and clean, with no suspiciou...
- 17213 Views
- 1 replies
- 1 kudos
Resolved! Some streams terminated before this command could finish! -> java.lang.NoClassDefFoundError: scala/c
Hello, I am facing:
Some streams terminated before this command could finish!
java.lang.NoClassDefFoundError: scala/compat/java8/FutureConverters$
when running a very simple query on Event Hubs:
df = spark \
  .readStream \
  .format("eventhubs") \
  .options(**ehConf) ...
Of course, just after writing that post I realized how trivial this question is: after adding scala_java8_compat_2_12_1_0_2.jar, it works as expected.
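For anyone hitting the same `NoClassDefFoundError`: the missing class lives in the scala-java8-compat library, and instead of uploading the jar by hand it can usually be attached as a Maven library on the cluster. A sketch of the library spec as it would appear in a Libraries API call or job definition (the coordinate matches the jar named above, but verify the version against your runtime's Scala version):

```python
import json

# Sketch: Maven library spec for scala-java8-compat, matching the jar
# mentioned in the reply (Scala 2.12 build, version 1.0.2).
library_spec = {
    "maven": {
        "coordinates": "org.scala-lang.modules:scala-java8-compat_2.12:1.0.2"
    }
}
print(json.dumps(library_spec))
```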
- 1470 Views
- 0 replies
- 0 kudos
Thoughts on how to improve string search queries
Please see the sample code I am running below. What options can I explore to improve query execution speed in such a scenario? The current full code takes about 4 hours to run on 1.5 billion rows. Thanks!
SELECT fullVisitorId, VisitId, EventDate, PagePath, d...
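One general lever for LIKE-heavy scans of this size (a sketch, not specific to the truncated query above): narrow the scanned data with a date or partition filter before applying the string predicates, and avoid leading wildcards where possible, since `'%foo%'` patterns defeat file-level data skipping. A hypothetical rewrite, with table and column values invented for illustration:

```python
# Hypothetical rewrite: filter on a date/partition column first so the
# engine can prune files, then apply the cheaper anchored string predicate.
base = """
SELECT fullVisitorId, VisitId, EventDate, PagePath
FROM hits
WHERE EventDate BETWEEN '{start}' AND '{end}'  -- prunes files first
  AND PagePath LIKE '/checkout%'               -- anchored: no leading wildcard
"""
query = base.format(start="2023-01-01", end="2023-01-31")
print(query)
```

If leading wildcards are unavoidable, precomputing a lowercased or tokenized search column at write time is another common option.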
- 3303 Views
- 1 replies
- 0 kudos
API for Databricks code functionality
I have a Databricks notebook for which I want to create an API. Through that API I will call the notebook and perform certain operations, and the result will be sent back to the API. I don't want to do this via Postman, as someone would have to install Postman on their ...
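A notebook job can be triggered without Postman by calling the Databricks Jobs API directly from any script or service. A minimal sketch using only the Python standard library; the host, token, and job_id are placeholders you would replace with your own:

```python
import json
import urllib.request

def build_run_now_request(host, token, job_id, notebook_params=None):
    """Build a POST request for the Jobs API 2.1 run-now endpoint."""
    body = {"job_id": job_id}
    if notebook_params:
        body["notebook_params"] = notebook_params
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_run_now_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "dapi-XXXX",                             # placeholder access token
    123,                                     # placeholder job_id
    notebook_params={"input_date": "2024-01-01"},
)
print(req.full_url)
# urllib.request.urlopen(req) would submit the run and return a run_id,
# which can then be polled via /api/2.1/jobs/runs/get-output.
```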
- 1080 Views
- 1 replies
- 0 kudos
Error ingesting files with databricks jobs
The source path from which I want to ingest files is: "gs://bucket-name/folder1/folder2/*/*.json". I have a file in this path that ends with ".json.gz", and the Databricks job ingests this file even though it is not supposed to. How can I fix it? Thanks.
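One hedged option when a path glob matches more than intended is to add an explicit `pathGlobFilter` (a standard Spark file-source option) so only filenames ending exactly in `.json` pass, e.g. `.option("pathGlobFilter", "*.json")` on the reader. The filter applies to the filename, with semantics resembling `fnmatch`, as this small demonstration shows:

```python
from fnmatch import fnmatch

# Demonstration of the filename-level filtering: "*.json" requires the
# name to END in ".json", so ".json.gz" files are excluded.
paths = [
    "gs://bucket-name/folder1/folder2/x/a.json",
    "gs://bucket-name/folder1/folder2/x/b.json.gz",
]
kept = [p for p in paths if fnmatch(p.rsplit("/", 1)[-1], "*.json")]
print(kept)
```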
- 9257 Views
- 0 replies
- 0 kudos
Deleted the S3 bucket associated with metastore
I deleted the AWS S3 bucket for the Databricks metastore by mistake. How can I fix this? Can I re-create the S3 bucket? Or can I delete the metastore (I don't have much data in it) and generate a new one? Thank you!
- 1622 Views
- 0 replies
- 0 kudos
org.apache.spark.SparkException - FileReadException
Sometimes I get this kind of error: "org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 12224.0 failed 4 times, most recent failure: Lost task 1.5 in stage 12224.0 (TID ) (12.xxx.x.xxx executor 1): com.datab...
- 2405 Views
- 3 replies
- 1 kudos
Add Oracle Jar to Databricks cluster policy
I created a policy for users to use when they create their own job clusters. When I edit the policy, I don't see the UI options for adding a library (I can only see the Definitions and Permissions tabs). I need to add, via JSON, the option to allow th...
@adrianhernandez, are you an admin of the workspace? If not, you might be missing permissions; if you have policies enabled, an admin can allow you. See https://docs.databricks.com/en/administration-guide/clusters/policies.html#libraries. If your workspace is Unity Cat...
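For reference, newer cluster policies can pin libraries directly in the policy JSON via a top-level `libraries` array, so every cluster created from the policy gets them installed. A sketch of such a fragment; verify it against your workspace's policy schema, and note that the Oracle JDBC Maven coordinate and version below are an assumption to adapt:

```python
import json

# Sketch of a cluster-policy fragment: the `libraries` array installs a
# library on every cluster created from the policy. Coordinate/version
# are placeholders for the Oracle JDBC driver.
policy_fragment = {
    "libraries": [
        {"maven": {"coordinates": "com.oracle.database.jdbc:ojdbc8:21.9.0.0"}}
    ]
}
print(json.dumps(policy_fragment, indent=2))
```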
- 1750 Views
- 0 replies
- 2 kudos
dbutils.fs.ls MAX_LIST_SIZE_EXCEEDED
Hi! I'm experiencing different behaviours between two DBX workspaces when trying to list file contents from an abfss: location. In workspace A, running len(dbutils.fs.ls('abfss://~~@~~~~.dfs.core.windows.net/~~/')) results in "Out[1]: 1551", while runni...
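When a single `dbutils.fs.ls` call trips a list-size limit, one common workaround is to recurse directory by directory and accumulate file paths incrementally. A sketch with a pluggable list function so it runs anywhere; in a notebook you would pass `dbutils.fs.ls` itself, whose entries expose `.path` and `.isDir()`:

```python
def walk(path, ls):
    """Yield file paths under `path`, listing one directory at a time.
    `ls` must return objects with .path and .isDir(), like dbutils.fs.ls."""
    for entry in ls(path):
        if entry.isDir():
            yield from walk(entry.path, ls)
        else:
            yield entry.path

# Tiny in-memory stand-in for dbutils.fs.ls, for demonstration only.
class Entry:
    def __init__(self, path, is_dir):
        self.path, self._is_dir = path, is_dir
    def isDir(self):
        return self._is_dir

tree = {
    "/root/": [Entry("/root/a/", True), Entry("/root/f1", False)],
    "/root/a/": [Entry("/root/a/f2", False)],
}
files = list(walk("/root/", tree.__getitem__))
print(files)
```

In a notebook this becomes `len(list(walk('abfss://...', dbutils.fs.ls)))`, which avoids asking the driver for one gigantic listing in a single call.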
- 2849 Views
- 3 replies
- 1 kudos
getArgument works fine in interactive cluster 10.4 LTS, raises error in job run on 10.4 LTS
Hello, I am trying to use the getArgument() function in a spark.sql query. It works fine if I run the notebook via an interactive cluster, but gives an error when executed via a job run in an instance pool. Query:
OPTIMIZE <table> WHERE date = replace(re...
Hi @Retired_mod, would you be able to respond to my last comment? I couldn't manage to get it working yet. Thank you in advance.
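A hedged workaround for the getArgument()-in-SQL difference between interactive and job runs: read the parameter value on the Python side (in a notebook, `dbutils.widgets.get("date")`) and substitute it into the SQL string before calling `spark.sql`. The table name, parameter, and helper below are hypothetical:

```python
# Hypothetical sketch: fetch the job/widget parameter in Python, then
# build the SQL explicitly instead of calling getArgument() inside SQL.
def build_optimize_sql(table, date_value):
    safe = date_value.replace("'", "")  # naive sanitization for the sketch
    return f"OPTIMIZE {table} WHERE date = '{safe}'"

sql = build_optimize_sql("my_schema.my_table", "2023-10-01")
print(sql)
# In a notebook: spark.sql(sql)
```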