Certifications
Join dynamic discussions on Databricks certifications within the Community. Exchange insights, tips,...
Explore discussions on Databricks training programs and offerings within the Community. Get insights...
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and ...
Engage in discussions about the Databricks Free Edition within the Databricks Community. Share insig...
Hi everyone, I'm curious to hear your thoughts on the benefits of having both Azure OpenAI and Azure Databricks within the same ecosystem. From what I can see, Databricks provides a strong foundation for data engineering, governance, and model lifecycl...
Two use cases I can think of: RAG: use Databricks for vector indexing (e.g., via Delta Lake or FAISS) and Azure OpenAI for inference. Example: a chatbot that queries Databricks-hosted documents and uses GPT-4 for response generation. Agentic Workflows:...
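A minimal sketch of that RAG pattern, assuming a Databricks notebook, a Mosaic AI Vector Search index (used here as one option instead of FAISS), and placeholder names for the endpoint, index, secret scope, and Azure OpenAI deployment:

```python
# All endpoint, index, deployment, and secret names below are hypothetical placeholders.
from databricks.vector_search.client import VectorSearchClient
from openai import AzureOpenAI

# 1) Retrieve relevant chunks from a Databricks-hosted vector index.
vsc = VectorSearchClient()
index = vsc.get_index(endpoint_name="docs_endpoint", index_name="main.docs.chunks_index")

question = "How do I configure Unity Catalog?"
hits = index.similarity_search(query_text=question, columns=["chunk_text"], num_results=3)
context = "\n\n".join(row[0] for row in hits["result"]["data_array"])

# 2) Generate the answer with an Azure OpenAI chat deployment.
aoai = AzureOpenAI(
    api_key=dbutils.secrets.get("kv", "aoai-key"),   # notebook-only dbutils; placeholder scope/key
    api_version="2024-02-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
)
resp = aoai.chat.completions.create(
    model="gpt-4",  # the Azure deployment name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(resp.choices[0].message.content)
```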
Hi, I am trying to create some tables by calling the APIs of Shopify/Meta Ads/Google Ads and so on. Where should I make the API calls? Is making API calls in notebooks considered the standard way to ingest in these cases? I intend to make a daily call to ge...
Hello @int32lama, I can help you with that if you are interested.
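For the ingestion question above, one common pattern is a notebook scheduled as a daily job that calls the API and appends to a Delta table. A minimal sketch, assuming a hypothetical Shopify endpoint, secret scope, and target table:

```python
# Endpoint URL, secret scope/key, and table name are hypothetical placeholders.
import requests
from pyspark.sql import functions as F

token = dbutils.secrets.get("ingest", "shopify-token")   # notebook-only dbutils
resp = requests.get(
    "https://example.myshopify.com/admin/api/2024-01/orders.json",
    headers={"X-Shopify-Access-Token": token},
    timeout=60,
)
resp.raise_for_status()
orders = resp.json().get("orders", [])

if orders:
    # Deeply nested payloads may need an explicit schema; simple payloads infer fine.
    (spark.createDataFrame(orders)
          .withColumn("_ingested_at", F.current_timestamp())
          .write.mode("append")
          .saveAsTable("raw.shopify_orders"))
```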
Looking for a free certification voucher.
Virtual Learning Festival: 10 October - 31 October... - Databricks Community - 127652. Complete this event and the Databricks Community will provide a 50% discount voucher for any certification course.
Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or noteboo...
Materialized views running on a SQL warehouse are super cost-efficient, and they are also a really simple and powerful data engineering tool - just be sure that Enzyme updates them incrementally. Read more: - https://databrickster.medium.com/sql-wa...
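A minimal sketch of the idea, assuming Unity Catalog and compute that supports materialized views; the catalog, schema, and table names are placeholders:

```python
# Placeholder catalog/schema/table names; run on compute that supports materialized views.
spark.sql("""
    CREATE MATERIALIZED VIEW main.reporting.daily_sales AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM main.sales.orders
    GROUP BY order_date
""")

# Later refreshes can be applied incrementally when Enzyme determines that is possible.
spark.sql("REFRESH MATERIALIZED VIEW main.reporting.daily_sales")
```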
Hello. I'm testing Agent Bricks right now. As I'm testing, I have a few questions that I'd like to ask: 1) Is it possible to input the schema in a format with input and output? 2) In the free trial, when I create a vector search index and utilize it as a...
From my experience, free-trial quotas differ between regions and accounts (for example, on one client it included serverless SQL warehouse, on another it didn't, in the same region). Additionally, Microsoft manages limitations. My recommendation woul...
Small, hidden, but useful cluster setting: you can set that no jobs are allowed on an all-purpose cluster. Or vice versa, you can set up an all-purpose cluster that can be used only by jobs. Read more: - https://databrickster.medium.com/purpose-for-your-...
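A sketch of that setting via the Databricks Python SDK; the cluster details are placeholders, and the workload_type/ClientsTypes fields are an assumption based on the Clusters API rather than something confirmed in the post:

```python
# Placeholders throughout; workload_type/ClientsTypes are assumed from the Clusters API.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()
w.clusters.edit(
    cluster_id="0123-456789-abcdefgh",   # placeholder cluster id
    spark_version="15.4.x-scala2.12",    # placeholder runtime
    node_type_id="Standard_DS3_v2",      # placeholder node type
    num_workers=2,
    # Allow notebooks but block jobs on this all-purpose cluster (flip the booleans for the reverse).
    workload_type=compute.WorkloadType(
        clients=compute.ClientsTypes(notebooks=True, jobs=False)
    ),
)
```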
Hi everyone, I missed my Databricks certification exam because of a personal emergency and a time zone confusion. The attempt was consumed. Does anyone know if it’s possible to reschedule or get a one-time voucher in situations like this? Any advice wo...
Hi Deepak! I’m really sorry to hear that. As far as I know, Databricks doesn’t usually allow rescheduling once an attempt is consumed. However, if you take one of their official training courses before October 31, you can get a 50% discount voucher fo...
I had the pleasure of benchmarking Databricks Free Edition (yes, really free — only an email required, no credit card, no personal data).My task was to move 2 billion records, and the fastest runs took just under 7 minutes — completely free. One curi...
Hello guys, I was taking the test for Databricks Certified Data Engineer Associate and, without any solid reason, they suspended my test, saying I was not making eye contact with the camera. How can I read the questions and options while keeping eye contact wi...
Hello @rondarangareddy, thank you for filing a ticket with our support team; the support team will respond shortly. Please note that we cannot provide support or handle exam suspensions via the community (it is not advised that you post your email address her...
Hey everyone, I’m currently exploring machine learning model development and I’m interested in understanding how to effectively integrate ML workflows within Databricks. Specifically, I’d like to hear from the community about: How do you structure ML pi...
You can integrate machine learning model development into Databricks Workflows pretty smoothly using the platform’s native tools. The main idea is to treat your ML lifecycle (data prep → training → evaluation → deployment) as a series of tasks within...
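A minimal sketch of that task-per-stage layout using the Databricks Python SDK; the notebook paths and cluster id are hypothetical placeholders:

```python
# Notebook paths and cluster id are placeholders; one task per ML lifecycle stage.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

def stage(key, path, depends_on=None):
    """Build a notebook task for one lifecycle stage."""
    return jobs.Task(
        task_key=key,
        notebook_task=jobs.NotebookTask(notebook_path=path),
        existing_cluster_id="0123-456789-abcdefgh",   # placeholder
        depends_on=[jobs.TaskDependency(task_key=d) for d in (depends_on or [])],
    )

created = w.jobs.create(
    name="ml-lifecycle",
    tasks=[
        stage("prep", "/Repos/ml/01_prep"),
        stage("train", "/Repos/ml/02_train", ["prep"]),
        stage("evaluate", "/Repos/ml/03_evaluate", ["train"]),
        stage("deploy", "/Repos/ml/04_deploy", ["evaluate"]),
    ],
)
print(created.job_id)
```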
We followed this document https://docs.databricks.com/aws/en/connect/streaming/kafka?language=Python#msk-aad to use the Kafka client to read events from our Event Hub for a feature. As part of the SFI, the guidance is to move away from client secrets and u...
Currently, Databricks does not support using Managed Identities directly for Kafka client authentication (e.g., MSK IAM or Event Hubs Kafka endpoint) in Python Structured Streaming connections. However, there is a supported and secure alternative tha...
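For reference, a skeleton of the Structured Streaming read from the linked doc; the SASL/OAuth options are deliberately left as placeholders because the exact auth settings depend on which supported alternative you adopt:

```python
# Bootstrap server, topic (Event Hub name), and auth options are placeholders.
kafka_options = {
    "kafka.bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "kafka.security.protocol": "SASL_SSL",
    "kafka.sasl.mechanism": "OAUTHBEARER",
    # ...remaining kafka.sasl.* options exactly as shown in the linked doc for your auth mode...
}

df = (
    spark.readStream.format("kafka")
    .options(**kafka_options)
    .option("subscribe", "my-event-hub")     # the Event Hub name is used as the Kafka topic
    .option("startingOffsets", "earliest")
    .load()
)

display(df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)"))
```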
Hi everyone, I am currently facing an issue in our Test environment where Databricks is not able to mount the storage account. We use the same mount in our other environments (Dev, Preprod and Prod) and it works fine there witho...
This issue in your Test environment, where Databricks fails to mount an Azure Storage account with the error java.lang.Exception: 480, is most likely related to expired credentials or cached authentication tokens, even though the same configuration w...
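If it does turn out to be stale credentials, one way to rule that out is to unmount and remount with fresh OAuth settings. A sketch, assuming a service principal whose secret lives in a secret scope; all names are placeholders:

```python
# Placeholder names for the mount point, container, storage account, secret scope, and tenant/app ids.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv", "sp-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.unmount("/mnt/test-data")          # drop the stale mount first
dbutils.fs.mount(
    source="abfss://container@storageaccount.dfs.core.windows.net/",
    mount_point="/mnt/test-data",
    extra_configs=configs,
)
```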
Hi, I have the following directory structure:
- mypkg/
  - setup.py
  - mypkg/
    - __init__.py
    - module.py
  - scripts/
    - main  # notebook
From the `main` notebook I have a cell that runs: %pip install -e /path/to/mypkg
This command appears to succ...
Hey @newenglander — always great to meet a fellow New Englander! Could you share a bit more detail about your setup? For example, are you running on classic compute or serverless? And are you working in a customer workspace, or using Databricks Free ...
Hi, I'm trying to execute the following code:
%sql
SELECT LSOA21CD, ST_X(ST_GeomFromWKB(Geom_Varbinary)) AS STX, ST_Y(ST_GeomFromWKB(Geom_Varbinary)) AS STY
FROM ordnance_survey_lsoas_december_2021_population_weighted_centroids
WHERE LSOA21CD ...
@Corar You might want to enable that explicitly by setting 'spark.databricks.geo.st.enabled' configuration to value 'true'.
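Putting the two posts together, a minimal notebook cell (the WHERE filter from the original query is omitted since it was truncated):

```python
# Enable the ST_* geospatial expressions, then run the query from the post above.
spark.conf.set("spark.databricks.geo.st.enabled", "true")

spark.sql("""
    SELECT LSOA21CD,
           ST_X(ST_GeomFromWKB(Geom_Varbinary)) AS STX,
           ST_Y(ST_GeomFromWKB(Geom_Varbinary)) AS STY
    FROM ordnance_survey_lsoas_december_2021_population_weighted_centroids
""").show()
```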