- 7955 Views
- 9 replies
- 1 kudos
My Databricks Data Engineer Associate exam got suspended - need immediate help please (10/09/2023)
Hello Team, I had a terrible experience while attempting my Databricks Data Engineer certification. The proctor abruptly asked me to show my desk; after I showed it, he/she asked multiple more times, wasted my time, and then suspended my exam. I want to file ...
- 1665 Views
- 2 replies
- 2 kudos
Is the Databricks-Salesforce integration already available?
Reference: Salesforce and Databricks Announce Strategic Partnership to Bring Lakehouse Data Sharing and Shared AI Models to Businesses - Salesforce News. I was going through this article and wanted to know if anyone in the community is planning to use this...
- 2 kudos
Hi @Ruby8376, I will get back to you on this.
- 2622 Views
- 3 replies
- 3 kudos
Resolved! Terraform provider: problem creating a dependent task
Hello all. I have a serious problem; perhaps I missed something, but I can't find the solution. I need to push a job definition to Databricks using Terraform. I wrote the code, but there is no way to make a task depend on two different tasks. Conside...
- 3 kudos
@6502 You need to add a separate depends_on block for each dependency, e.g. depends_on { task_key = "ichi" } depends_on { task_key = "ni" }
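The fan-in pattern described in that answer can be sketched in Terraform as follows. This is a minimal sketch assuming the Databricks Terraform provider's databricks_job resource; the resource name, task keys, and notebook paths are hypothetical placeholders:

```hcl
resource "databricks_job" "fan_in" {
  name = "fan-in-example"

  task {
    task_key = "ichi"
    notebook_task {
      notebook_path = "/Shared/ichi" # hypothetical path
    }
  }

  task {
    task_key = "ni"
    notebook_task {
      notebook_path = "/Shared/ni" # hypothetical path
    }
  }

  # "san" waits for both upstream tasks: one depends_on block per dependency.
  task {
    task_key = "san"
    depends_on {
      task_key = "ichi"
    }
    depends_on {
      task_key = "ni"
    }
    notebook_task {
      notebook_path = "/Shared/san" # hypothetical path
    }
  }
}
```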
- 1548 Views
- 2 replies
- 0 kudos
Connecting an AWS VPC in Databricks to extract data from on-premises Oracle
Hi, please help. How can I use a VPC that is already attached in the Databricks "Network" configuration to extract information from a server that responds to ping and is reachable?
- 0 kudos
Hello @Kaniz_Fatma, I have already configured the private endpoint links and the linked VPC in Databricks under "Cloud resources" / "Network". How can I connect to Oracle using the EC2 virtual machine? Thank you
- 763 Views
- 1 reply
- 0 kudos
Error while attempt to give Lakehouse fundamentals exam
I would like to know why I am getting this error when I try to earn badges for Lakehouse fundamentals. I can't access the quiz page; I am getting a 403: Forbidden error. Can you please help with this?
- 7738 Views
- 3 replies
- 1 kudos
Structured Streaming of S3 source
I am trying to set up S3 as a structured streaming source. The bucket receives ~17K files/day, and the original load to the bucket was ~54K files. The bucket was first loaded 3 months ago, and we haven't started reading from it since. So let's say there...
- 1 kudos
Thanks. We were able to make things work by increasing the driver instance size so it has more memory for the initial load. After the initial load we scaled the instance down for subsequent runs. We're still testing; if we aren't able to make it work we'l...
- 2906 Views
- 7 replies
- 6 kudos
Databricks workspace unstable
Our company's Databricks workspace has been unstable lately. It can't launch any compute cluster. I have never seen this issue before. In addition, I have seen a storage credential error on the main Unity Catalog. Why would this happen on an AWS Databricks ins...
- 6 kudos
Hello @Kaniz_Fatma and @jose_gonzalez, I couldn't locate the support ticket we opened. How can we track that ticket down? It came from the peopletech.com domain. If it is more efficient to create another ticket, please let me know. Let us know the UR...
- 2650 Views
- 3 replies
- 3 kudos
Application extracting data from Unity Catalog
Dear Databricks community, I'm seeking advice on the best method for applications to extract data from Unity Catalog. One suggested approach is to use JDBC, but there seems to be a dilemma. Although using a job cluster has been recommended due t...
- 3 kudos
What exactly do you mean by 'extracting'? If you want to load tables defined in Unity Catalog into a database, I would indeed do this using job clusters and a notebook. If you want to extract some data once in a while into a CSV, for example, you could perfectly do...
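For the occasional CSV extract mentioned in that reply, a small client-side script can page query results into a file. Below is a minimal sketch: the CSV-writing helper is self-contained, while the connection part (commented out in the main guard) assumes the databricks-sql-connector package and hypothetical hostname, HTTP path, token, and table names:

```python
import csv


def rows_to_csv(header, rows, path):
    """Write an iterable of row tuples to a CSV file with the given header."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        for row in rows:
            writer.writerow(row)


if __name__ == "__main__":
    # Hypothetical connection details; requires `pip install databricks-sql-connector`.
    # from databricks import sql
    # with sql.connect(server_hostname="...", http_path="...", access_token="...") as conn:
    #     with conn.cursor() as cur:
    #         cur.execute("SELECT * FROM main.sales.orders")  # hypothetical table
    #         header = [d[0] for d in cur.description]
    #         rows_to_csv(header, cur.fetchall(), "orders.csv")
    pass
```

A SQL warehouse endpoint (rather than a job cluster) is often a good fit for this kind of occasional pull, since nothing has to be scheduled on the cluster itself.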
- 2422 Views
- 4 replies
- 2 kudos
Resolved! Error on Workflow
Hi, I have a mysterious situation here. My workflow (job) ran and got an error -> [INVALID_IDENTIFIER] The identifier transactions-catalog is invalid. Please, consider quoting it with back-quotes as `transactions-catalog`. (line 1, pos 12) == SQL ==...
- 2 kudos
Jobs are just notebooks executed in the background, so if the notebook is the same between the interactive (manual) run and the job run, there should be no difference. So I don't see what is wrong. Is the job using DLT, perhaps?
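The fix the error message suggests, back-quoting identifiers that contain hyphens, can be automated when SQL text is built dynamically. A minimal sketch (the helper names are my own, and the schema/table names are hypothetical; Spark SQL escapes a literal backtick inside a quoted identifier by doubling it):

```python
def quote_ident(name: str) -> str:
    """Wrap an identifier in backticks, doubling any embedded backticks
    (Spark SQL's escape rule for backticks inside quoted identifiers)."""
    return "`" + name.replace("`", "``") + "`"


def qualified(*parts: str) -> str:
    """Backtick-quote each part of a multi-part name, e.g. catalog.schema.table."""
    return ".".join(quote_ident(p) for p in parts)


# Building the statement from the thread's example:
stmt = f"SELECT * FROM {qualified('transactions-catalog', 'bronze', 'events')}"
# -> SELECT * FROM `transactions-catalog`.`bronze`.`events`
```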
- 3241 Views
- 2 replies
- 2 kudos
Resolved! DataBricks Cluster
Hi All, I am curious to know the difference between a Spark cluster and a Databricks one. As per the info I have read, a Spark cluster creates a driver and workers when the application is submitted, whereas in Databricks we can create a cluster in advance in c...
- 2 kudos
Hi @DBEnthusiast, In a Spark cluster, the SparkContext object in your main program (the driver program) connects to a cluster manager, which could be Spark's standalone cluster manager, Mesos, YARN, or Kubernetes. This cluster manager allocates resou...
- 3529 Views
- 4 replies
- 2 kudos
Resolved! With the medallion architecture, is it no longer necessary to preserve the data in its original format?
Hi Community, I have a doubt. The bronze layer always causes confusion for me. Someone mentioned, "File Format: Store data in Delta Lake format to leverage its performance, ACID transactions, and schema evolution capabilities" for bronze layers. Then, ...
- 2 kudos
Hi @eimis_pacheco, You can store the data in its original format in the Bronze layer. The recommendation to use Delta Lake format for the Bronze layer is mainly for better performance and reliability. The purpose of the Bronze layer in the medallion architecture is to store data...
- 8109 Views
- 5 replies
- 0 kudos
Exam got suspended Databricks Certified Data Engineer Associate exam
Hi team, My Databricks Certified Data Engineer Associate exam got suspended within 20 minutes. It was suspended due to eye movement, without any warning. I was not moving my eyes away from the laptop screen. Some questions in the exam are so big, so I...
- 0 kudos
@rajib_bahar_ptg funny, not funny, right?! I just posted that tip today in this post: https://community.databricks.com/t5/certifications/minimize-the-chances-of-your-exam-getting-suspended-tip-3/td-p/45712
- 4536 Views
- 6 replies
- 3 kudos
Cluster policy not showing while creating a Delta Live Tables pipeline
Hi all! I have created a cluster policy, but when I want to use it while creating a DLT pipeline, it shows none. I have checked that I have all the necessary permissions to create cluster policies. Still, the DLT UI shows none.
- 3 kudos
@btafur Can we also set the auto-termination minutes with the policy (for the DLT cluster type)?
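On the auto-termination question: a cluster policy definition can pin autotermination_minutes with a fixed rule, as in the minimal JSON sketch below (the value 60 is an arbitrary example). Note that DLT-managed clusters handle their own lifecycle, so this attribute is generally relevant to all-purpose and job clusters rather than DLT compute:

```json
{
  "autotermination_minutes": {
    "type": "fixed",
    "value": 60,
    "hidden": true
  }
}
```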
- 838 Views
- 1 reply
- 0 kudos
How to automate creating notebooks when I have multiple .html or .py files
Hi all, I have 50+ .html and .py files, and I have to create a separate notebook for each and every one of them. Manually creating a notebook using the UI and importing the .html/.py file is tedious and time-consuming. Is t...
- 0 kudos
Depending on your use case and requirements, one alternative would be to create a script that loops through your files and uploads them using the API. You can find more information about the API here: https://docs.databricks.com/api/workspace/workspa...
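The loop-and-upload approach from that reply can be sketched as follows. The payload builder is self-contained and targets the workspace import endpoint (POST /api/2.0/workspace/import); the actual upload loop is commented out because the workspace URL, token, and source folder are hypothetical:

```python
import base64
import os


def build_import_payload(local_file: str, target_dir: str) -> dict:
    """Build the JSON body for the Databricks workspace import endpoint
    from a local .py or .html file."""
    name, ext = os.path.splitext(os.path.basename(local_file))
    with open(local_file, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    payload = {
        "path": f"{target_dir}/{name}",
        "content": content,
        "overwrite": True,
    }
    if ext == ".py":
        # Source files also need a language; .py maps to PYTHON.
        payload.update({"format": "SOURCE", "language": "PYTHON"})
    elif ext == ".html":
        payload["format"] = "HTML"
    else:
        raise ValueError(f"unsupported extension: {ext}")
    return payload


# Hypothetical upload loop (requires `pip install requests` and a workspace token):
# import glob, requests
# for path in glob.glob("exports/*"):
#     requests.post("https://<workspace-url>/api/2.0/workspace/import",
#                   headers={"Authorization": f"Bearer {token}"},
#                   json=build_import_payload(path, "/Shared/imported"))
```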
- 1928 Views
- 1 reply
- 0 kudos
Move files and folders from a workspace repo to an external location
I would like to move a folder from my repo under /Workspace/Repos/ar... to an external Azure blob location. I tried dbutils.fs.mv(repo_path, az_path), but this gave me a file-not-found error. Also, I am not able to see workspace -> repo usin...
- 0 kudos
Hi @Arinjay, you seem to be trying to move a folder from your Databricks workspace repo to an Azure blob storage location. The error you encountered might be because the file system utilities (dbutils.fs) in Databricks do not directly support operat...