Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insight...
I was using a trial account and my $40 credit was used up. I set a budget limit, but it was exceeded without warning and now I'm being charged an extra $90! But this is only the first problem. The second problem is that since my credit card reject...
@BS_THE_ANALYST I have not, but that support alias looks like it's specific to the `databricks-community` (i.e., this website) rather than the product.
Let's say we have a big data application where data loss is not an option. With GZRS (geo-zone-redundant storage) redundancy, we would achieve zero data loss as long as the primary region is alive: the writer waits for acks from two or more Azure availability zo...
In Azure and Databricks environments, ensuring zero data loss during a primary-to-secondary failover—especially for Delta Lake/streaming workloads—is extremely challenging due to asynchronous replication, potential ordering issues, and inconsistent s...
In a DLT pipeline I have a bronze table that ingests files using Autoloader, and a derived silver table that, for this example, just stores the number of rows for each file ingested into bronze. The basic code example: import dlt from pyspark.sql impo...
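A minimal sketch of that silver aggregation, written outside the DLT runtime so the logic is plain to see (column and file names here are assumptions; in the actual pipeline this would be a `@dlt.table` grouping bronze rows by the source-file column):

```python
from collections import Counter

def rows_per_file(bronze_rows):
    """Count ingested rows per source file — the same shape of result a
    groupBy on a source-file-path column would produce in the silver table."""
    counts = Counter(row["file_path"] for row in bronze_rows)
    return [{"file_path": path, "row_count": n} for path, n in sorted(counts.items())]

# Hypothetical bronze rows, each tagged with the file it came from
bronze = [
    {"file_path": "f1.json", "value": 1},
    {"file_path": "f1.json", "value": 2},
    {"file_path": "f2.json", "value": 3},
]
print(rows_per_file(bronze))
```

In DLT the silver table would simply read the bronze streaming table and apply the equivalent groupBy/count.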
Do we get a certificate for beta tests? I recently took the Data Analyst beta test, but there has been no communication about my score or certification. Could someone clarify?
It seems results are out. I received an email with the subject line: Result Notification: Databricks Certified Data Analyst Associate Certification Exam Beta Test
Hi All. I'm following the Gen AI Certification material, but does anybody know if notebooks are available? Thanks in advance
@szymon_dybczak if @XaviMarti is learning through the partner academy, there's a chance the labs are there (and included), i.e. searching the partner academy: I think these red/purple ones (if that's the correct colour lol) are labs relating to a le...
Is there any way/plan for Databricks to use Delta Sharing to provide data access to Celonis?
Hi @cbhoga, Delta Sharing is an open protocol for secure data sharing. Databricks already supports it natively, so you can publish data using Delta Sharing. However, whether Celonis can directly consume that shared data depends on whether Celonis sup...
Hi there, I would appreciate some help to compare the runtime performance of two approaches to performing ELT in Databricks: spark.read vs. Autoloader. We already have a process in place to extract highly nested json data into a landing path, and fro...
Hi @ChristianRRL, For that kind of ingestion scenario Autoloader is the winner. It will scale much better than the batch approach, especially if we are talking about a large number of files. If you configure Autoloader with file notification mode, it can sca...
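As a sketch of what such a reader configuration might look like (the paths are placeholders, not real values; `cloudFiles.*` option names follow Autoloader's documented convention):

```python
# Hypothetical Autoloader (cloudFiles) options; paths are placeholders.
autoloader_options = {
    "cloudFiles.format": "json",
    # File notification mode avoids re-listing the directory on every batch,
    # which scales better than directory listing for large numbers of files.
    "cloudFiles.useNotifications": "true",
    # Where Autoloader persists the inferred schema across runs.
    "cloudFiles.schemaLocation": "/Volumes/raw/_schemas/events",
}

def reader_call(options):
    """Render the equivalent spark.readStream chain, for illustration only."""
    opts = "".join(f'.option("{k}", "{v}")' for k, v in sorted(options.items()))
    return f'spark.readStream.format("cloudFiles"){opts}.load("<landing-path>")'

print(reader_call(autoloader_options))
```

The dict/render split is just so the options are easy to read; in a real notebook you would chain the `.option(...)` calls directly on `spark.readStream`.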
Hi there, I would appreciate some input on AutoLoader best practice. I've read that some people recommend that the latest data should be loaded in its rawest form into a raw delta table (i.e. highly nested json-like schema) and from that data the app...
I think the key thing with holding the raw data in a table, and not transforming that table, is that you have more flexibility at your disposal. There's a great resource available via Databricks Docs for best practices in the Lakehouse. I'd highly re...
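To make the "raw first, flatten later" idea concrete, a minimal sketch in plain Python (the record shape and field names are made up): the raw table keeps the nested shape untouched, and a separate downstream step flattens it.

```python
def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict, joining key segments with `sep`."""
    out = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            out.update(flatten(value, new_key, sep))
        else:
            out[new_key] = value
    return out

# Hypothetical raw record as it would land in the raw/bronze table
raw = {"id": 1, "device": {"os": "linux", "metrics": {"cpu": 0.42}}}
print(flatten(raw))
```

Because the raw record is stored as-is, any new downstream shape can be derived later without re-ingesting from the landing path — which is the flexibility argument above.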
Bit of a silly question, but wondering if someone can help me better understand what `read_files` is? read_files table-valued function | Databricks on AWS There are at least 3 ways to pull raw json data into a spark dataframe: df = spark.read... df = spark...
Also, @ChristianRRL, with a slight adjustment to the syntax, it does indeed behave like Autoloader: https://docs.databricks.com/aws/en/ingestion/cloud-object-storage/auto-loader/patterns?language=SQL I'd also advise looking at the different options th...
I plan to take the Professional Exam. Since the certification was updated on September 30, I would like to know the details — specifically the course content, how long this version of the course will remain valid (given that the course content is u...
Thank you for the response. Is https://partner-academy.databricks.com/learn/learning-plans/10/data-engineer-learning-plan the correct link for the professional exam? Thanks
Hi, I originally (and accidentally) made a Customer Academy account with my company email; my company is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the Partner Academy account. That account ...
Need help merging my customer portal ID with my partner email ID. My case number is 00754330.
What I'm trying to achieve: ingest files into bronze tables with Autoloader, then produce Kafka messages for each file ingested using a DLT sink. The issue: the latency between a file being ingested and the message being produced gets exponentially higher the more tables ar...
Hi, I think it is a delay in Autoloader, as it doesn't yet know about the ingested files. It has nothing to do with the state; it is just Autoloader keeping a list of processed files. Autoloader scans the directory every minute, usually a...
I have created an init script stored in a Volume which I want to execute on a cluster with runtime 16.4 LTS. The cluster has policy = Unrestricted and Access mode = Standard. I have additionally added the init script to the allowlist. This should be ...
Hi @jimoskar, Since you're using standard access mode you need to add the init script to the allowlist. Did you add your init script to the allowlist? If not, do the following: In your Databricks workspace, click Catalog. Click the gear icon. Click the metastore ...
Hi, I have completed the Data Warehousing with Databricks certification. When can I expect the voucher?
Hello @Krisna_91! To add to what Sumit mentioned, if your question is regarding the incentives offered in the Learning Festival, kindly ensure that you complete at least one self-paced learning pathway within Customer Academy between October 10 and O...
If you were creating Unity Catalogs again, what would you do differently based on your past experience?
From my experience:
- don't create separate catalogs for every project; try to think about your design before implementation
- try to come up with a consistent naming convention to avoid cognitive overhead
- principle of least privilege: grant users and ...
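The naming-convention point can be made concrete with a tiny helper. The `<env>_<domain>` pattern below is just one possible convention (an assumption, not a Databricks standard), but enforcing whichever pattern you pick with a check like this keeps the catalog list predictable:

```python
import re

# Lowercase snake_case only — an assumed house rule, not a platform requirement.
NAME_RE = re.compile(r"^[a-z][a-z0-9_]*$")

def catalog_name(env, domain):
    """Build a catalog name following an assumed <env>_<domain> convention,
    rejecting anything outside lowercase snake_case."""
    name = f"{env}_{domain}"
    if not NAME_RE.match(name):
        raise ValueError(f"non-conforming catalog name: {name!r}")
    return name

# One catalog per environment + domain, rather than one per project
print(catalog_name("prod", "finance"))
```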