Databricks Platform Discussions
Dive into comprehensive discussions covering various aspects of the Databricks platform. Join the co...
Engage in vibrant discussions covering diverse learning topics within the Databricks Community. Expl...
Hi community! Yesterday I tried to extract chat history from my Genie spaces, but I can't export chats from other users. I get the following error: {'error_code': 'PERMISSION_DENIED', 'message': 'User XXXXXXXX does not own conversation XXXXXXXX', 'details': [{...
Hello! Newbie here, so apologies if this is a super basic question. Trying to figure out if we can connect Purview to an AWS instance of Databricks (as opposed to an Azure instance), but I have only seen articles on connecting Azure Databricks to Purview. I r...
Great question! As of today, there's no official support for connecting Azure Purview directly to a Databricks instance running on AWS (i.e. non-Azure); most of the documentation and integrations assume Azure Databricks. That said, there are a few w...
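One common workaround in this situation is to extract the metadata yourself from the AWS workspace via the Unity Catalog REST API and push it into Purview through its Apache Atlas-compatible API. A minimal sketch of the extraction side, assuming the standard `GET /api/2.1/unity-catalog/tables` endpoint; the host, token, catalog, and schema below are placeholders:

```python
"""Hedged sketch: build (but don't send) a Unity Catalog request listing
tables in one schema. The response JSON could then be mapped to Purview
entities via its Atlas API. Host/token/catalog/schema are placeholders."""
import urllib.parse
import urllib.request

def list_tables_request(host: str, token: str,
                        catalog: str, schema: str) -> urllib.request.Request:
    """Request for all tables in catalog.schema via the Unity Catalog API."""
    query = urllib.parse.urlencode({"catalog_name": catalog,
                                    "schema_name": schema})
    url = f"{host}/api/2.1/unity-catalog/tables?{query}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})

req = list_tables_request("https://example.cloud.databricks.com",  # placeholder
                          "dapi-REDACTED",                          # placeholder
                          "main", "sales")
```

Sending the request (e.g. with `urllib.request.urlopen`) returns a JSON body whose `tables` array carries names, columns, and comments, which is the raw material a custom Purview sync would need.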
We are planning to implement a chat interface in our portal application using the Genie Conversational API, where clients, partners, and internal users can ask questions in natural language and receive answers based on our data.I have the following q...
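For a portal chat integration like the one described above, the backend typically opens a conversation per user question through the Genie Conversation API. A minimal sketch, assuming the documented `POST /api/2.0/genie/spaces/{space_id}/start-conversation` endpoint; the workspace URL, token, and space ID are placeholders, not real values:

```python
"""Hedged sketch: build (but don't send) the request that starts a Genie
conversation on behalf of a portal user. All identifiers are placeholders."""
import json
import urllib.request

def start_conversation_request(host: str, space_id: str, token: str,
                               question: str) -> urllib.request.Request:
    """Request opening a new conversation in the given Genie space."""
    url = f"{host}/api/2.0/genie/spaces/{space_id}/start-conversation"
    body = json.dumps({"content": question}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = start_conversation_request(
    "https://example.cloud.databricks.com",   # placeholder workspace URL
    "01ef0000000000000000000000000000",       # placeholder space id
    "dapi-REDACTED",                          # placeholder token
    "What were last month's sales by region?",
)
```

In a real deployment each portal user would authenticate with their own credentials (e.g. OAuth on-behalf-of), since, as the thread above shows, Genie conversations are owned by the user who started them.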
Hello Team, I have been setting up SAT in my Databricks workspace and I am able to run it and scan my workspace. I have granted my SP access to all the other workspaces as well. When I run the initialize job (SAT Initializer Notebook (one-time)), I c...
I was reading the SAT GitHub page, and this might be a network issue as well. If you run SAT on serverless compute or behind IP ACLs, cross-workspace API calls can be blocked. The setup guide notes that SAT can't analyze other workspaces when: the destinat...
Hey, we've been testing ai_query (on Azure Databricks here) against preconfigured model serving endpoints like databricks-meta-llama-3-3-70b-instruct, and the initial results look nice. I'm trying to limit the number of requests that could be sent to those...
Hey @BS_THE_ANALYST, before writing that post I went through exactly the docs you posted. I wasn't able to find specific confirmation (or denial) that this function is affected by the rate limits, which led me to believe that it's worth a...
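One way to cap traffic regardless of how the endpoint is called (ai_query or direct REST) is to configure AI Gateway rate limits on the serving endpoint itself. A minimal sketch that only builds the request, assuming the `PUT /api/2.0/serving-endpoints/{name}/ai-gateway` endpoint and its `rate_limits` payload shape as I understand them from the REST reference; verify against current docs, and note the host and token are placeholders:

```python
"""Hedged sketch: build (but don't send) an AI Gateway update that caps a
serving endpoint at N calls per minute. Host/token are placeholders and the
payload shape is an assumption to be checked against the REST reference."""
import json
import urllib.request

def rate_limit_request(host: str, endpoint_name: str, token: str,
                       calls_per_minute: int) -> urllib.request.Request:
    """PUT request setting an endpoint-wide per-minute rate limit."""
    url = f"{host}/api/2.0/serving-endpoints/{endpoint_name}/ai-gateway"
    payload = {
        "rate_limits": [
            {"calls": calls_per_minute,
             "key": "endpoint",            # limit applies to the whole endpoint
             "renewal_period": "minute"},
        ],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        method="PUT",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

req = rate_limit_request("https://example.cloud.databricks.com",  # placeholder
                         "databricks-meta-llama-3-3-70b-instruct",
                         "dapi-REDACTED", 100)                    # placeholder
```

Setting `"key": "user"` instead would apply the quota per calling user rather than across the whole endpoint.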
I'm encountering an issue where a serving endpoint I create disappears from the list of serving endpoints after a day. This has happened both when I created the endpoint from the Databricks UI and using the Databricks SDK.
Hey @prashant_089, what you are experiencing should not happen on its own except in some extremely unusual circumstances. If you are using Databricks Free Edition, you should ignore everything below. Here are some troubleshooting suggestions/tips: ...
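While troubleshooting, it can help to snapshot the endpoint list periodically and diff the snapshots, so you know exactly when an endpoint vanished. The comparison is pure Python; in practice you would fill `current` from `GET /api/2.0/serving-endpoints` (or the Databricks SDK's `WorkspaceClient().serving_endpoints.list()`). The endpoint names below are made up:

```python
"""Hedged sketch: detect serving endpoints present in an earlier snapshot
but missing now. Names are hypothetical; wire `current` to the real
serving-endpoints listing in your workspace."""

def missing_endpoints(previous: list[str], current: list[str]) -> list[str]:
    """Return names seen before but absent now, preserving original order."""
    current_set = set(current)
    return [name for name in previous if name not in current_set]

yesterday = ["chat-llm", "embeddings-prod", "rerank-test"]  # hypothetical
today = ["chat-llm", "rerank-test"]                          # hypothetical
vanished = missing_endpoints(yesterday, today)
```

Run on a schedule, this pins down the disappearance window, which you can then correlate with audit logs or workspace cleanup policies.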
Hi Team, While creating a Declarative ETL pipeline in Databricks, I tried to configure a notebook using the "Add existing assets" option by providing the notebook path. However, I received a warning message: "Legacy configuration detected. Use files in...
Thank you @szymon_dybczak, that clears things up for me.
We're in the process of transitioning our Azure Databricks instance from SCIM-based provisioning to Automated Identity Management (AIM), now that AIM is generally available. Once enabled, AIM becomes the authoritative source for managing users, group...
@DavidRobinson Let me know how it goes. This is on my to-do list too, as we are facing a lot of issues with SCIM, like nested group sync and SPN syncs. One issue that I can think of is that AIM respects nested groups from Entra, which SCIM didn't. So...
Hello everyone,I'm reaching out to the community for assistance regarding Databricks certification and the ongoing Databricks Learning Festival. I am looking to enroll in the Data Engineer Associate Certification program, and I understand that comple...
Hello @sai_sakhamuri! According to the Learning Festival post, the Learning Festival incentives are for users who complete at least one self-paced learning pathway within Customer Academy between October 10 and October 31. This incentive is intended ...
Hi everyone, I am working on setting up success/failure notifications for a large number of jobs in our Databricks environment. Manually configuring email notifications through the UI for each job individually is not scalable and is becoming ver...
@Raj_DB Databricks sends notifications via its internal email service, which often requires the address to be a valid individual mailbox or a distribution list that accepts external mail. If your group email is a Microsoft 365 group, please check whether "Allow...
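For the scalability side of the question, the same notification settings can be pushed to many jobs through the Jobs API (`POST /api/2.1/jobs/update`) instead of the UI. A minimal sketch that only builds the per-job payloads; the job IDs and the address are placeholders:

```python
"""Hedged sketch: generate Jobs API update payloads that attach identical
success/failure email notifications to a batch of jobs. Job ids and the
email address are placeholders; send each payload to /api/2.1/jobs/update."""

def notification_payload(job_id: int, emails: list[str]) -> dict:
    """Partial-update payload adding email notifications to one job."""
    return {
        "job_id": job_id,
        "new_settings": {
            "email_notifications": {
                "on_success": emails,
                "on_failure": emails,
            },
        },
    }

payloads = [
    notification_payload(jid, ["data-team@example.com"])  # placeholder address
    for jid in (101, 102, 103)                            # placeholder job ids
]
```

Using the update (partial) endpoint rather than reset keeps each job's other settings intact; the job ID list itself can come from `GET /api/2.1/jobs/list`.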
Hello, using the JDBC driver, when I retrieve the metadata of a ResultSet, the type reported for a TIMESTAMP_NTZ column is not correct (it comes back as TIMESTAMP). My SQL is a simple SELECT * on a table with a TIMESTAMP_NTZ column. This works when retrieving metad...
Hello @EricCournarie! Just to confirm, were you initially using the JDBC driver v2.7.3? According to the release notes, this version adds support for the TIMESTAMP_NTZ data type.
Hello @Cert-Team @cert-ops @Cert-TeamOPS @Advika @Jim_Anderson,I hope you are doing well.During my attempt to take the Databricks Data Engineer Associate Certification Exam, it was unexpectedly suspended due to a technical error. I was unable to pr...
Hello @cert-ops,I kindly request your support regarding an urgent issue — my exam was suspended due to a technical error, and I’m currently unable to proceed or reschedule. I’ve submitted a support ticket (Request #00756378) but haven’t received an u...
In a DLT pipeline I have a bronze table that ingests files using Autoloader, and a derived silver table that, for this example, just stores the number of rows for each file ingested into bronze. The basic code example: import dlt from pyspark.sql impo...
Gotcha, thanks for the reply. We have already had a medallion architecture in production for a while, based on DLT pipelines holding most bronze/silver tables, with some separate jobs for silver tables that are more complex to materialize. It works, ...
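The pattern discussed in this thread can be sketched as a small pipeline: a bronze streaming table fed by Auto Loader, and a silver table counting rows per ingested file. The `dlt` module and the `spark` session exist only inside a Databricks pipeline runtime, so the definitions are guarded to stay importable elsewhere; the landing path is a placeholder:

```python
"""Hedged sketch: bronze via Auto Loader, silver as a per-file row count.
Runs as a no-op outside a Databricks pipeline; the path is a placeholder."""
try:
    import dlt  # only importable inside a Databricks pipeline runtime
    from pyspark.sql import functions as F

    @dlt.table(comment="Files ingested with Auto Loader")
    def bronze():
        return (
            spark.readStream.format("cloudFiles")      # noqa: F821
            .option("cloudFiles.format", "json")
            .load("/Volumes/main/raw/landing")          # placeholder path
            .withColumn("source_file", F.col("_metadata.file_path"))
        )

    @dlt.table(comment="Row count per file ingested into bronze")
    def silver_file_counts():
        # Batch read of bronze, so the count is recomputed on each update.
        return dlt.read("bronze").groupBy("source_file").count()

    DLT_AVAILABLE = True
except Exception:  # ImportError when running outside Databricks
    DLT_AVAILABLE = False
```

Reading bronze with `dlt.read` (rather than `dlt.read_stream`) makes silver a recomputed aggregate, which sidesteps streaming-aggregation restrictions for this simple per-file count.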
Hello, I'm trying to fill a STRUCT field with a PreparedStatement in Java by passing a JSON string to the PreparedStatement. But it complains: Cannot resolve "infos" due to data type mismatch: cannot cast "STRING" to "STRUCT&lt;AGE: BIGINT, NAME: STRING&gt;"....
Could you provide a sample of the JSON string along with the code you're using? Otherwise it will be hard for us to help you.
In all Databricks documentation, the examples use import dlt to create streaming tables and views. But when generating sample Python code in an ETL pipeline, the import in the sample is: from pyspark import pipelines as dp. Which one is the correct libr...
@yit Functionally, they are equivalent concepts (declarative definitions for streaming tables, materialized views, expectations, CDC, etc.). The differences you'll notice are mostly naming/ergonomics. Module name: Databricks docs & most existing notebo...