- 33151 Views
- 22 replies
- 46 kudos
Databricks Announces the Industry’s First Generative AI Engineer Learning Pathway and Certification
Today, we are announcing the industry's first Generative AI Engineer learning pathway and certification to help ensure that data and AI practitioners have the resources to be successful with generative AI. At Databricks, we recognize that generative ...
Dear Certifications Team, I have completed the full Generative AI Engineering Pathway and received the module-wise knowledge badges, but I did not receive the overall certificate mentioned in the description, Generative AI Engineer with one star. Req...
- 1044 Views
- 3 replies
- 0 kudos
Agent Bricks Information Extraction
I am facing a problem with information extraction from PDFs. I have done all the necessary steps: 1) I loaded the data into a Volume. 2) I ran the Use PDFs functionality to create a structured table of the PDFs. 3) I now have the table with the column name...
Thank you @NandiniN for responding. For now, I have changed the type of "raw_parsed" to string and created a vector index after that. But can you help me with one more thing? I am creating a Multi-Agent Supervisor; can you explain the need for OBO U...
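The workaround described in this thread (casting `raw_parsed` to string, then building a vector index over the parsed table) can be sketched as the payload a Delta Sync index creation expects. This is a hedged sketch, not the poster's actual setup: the catalog, table, index, and embedding-endpoint names are placeholders.

```python
def delta_sync_index_spec(source_table: str, index_name: str,
                          text_column: str = "raw_parsed",
                          embedding_model: str = "databricks-gte-large-en") -> dict:
    """Assemble a Delta Sync vector index definition over a parsed-PDF table.

    The text column must be a string type, which is why casting raw_parsed
    to string before indexing matters. All object names are illustrative.
    """
    return {
        "name": index_name,
        "primary_key": "id",
        "index_type": "DELTA_SYNC",
        "delta_sync_index_spec": {
            "source_table": source_table,
            "pipeline_type": "TRIGGERED",
            "embedding_source_columns": [
                {"name": text_column,
                 "embedding_model_endpoint_name": embedding_model}
            ],
        },
    }

spec = delta_sync_index_spec("main.docs.parsed_pdfs", "main.docs.pdf_index")
```

A payload like this would then be handed to the Vector Search client or REST API; check the current API reference before relying on exact field names.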
- 2112 Views
- 2 replies
- 2 kudos
Resolved! Using Genie Conversational API with External Users and Data-Level Security
We are planning to implement a chat interface in our portal application using the Genie Conversational API, where clients, partners, and internal users can ask questions in natural language and receive answers based on our data. I have the following q...
Hello @JohnnyA, I'll try to explain some ideas and hope something works for you, because I don't have the whole context. 1) Authentication & authorization for external users. Recommended (best practice): federated identity + OBO. Your portal authenticates with ...
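The OBO pattern in the reply can be made concrete as the request a portal backend would assemble for Genie's start-conversation call. A sketch under assumptions: the path follows the documented Genie Conversation API shape, and `user_token` is the per-user token the portal obtains through identity federation (not a service principal's token, so Unity Catalog row/column security applies per user).

```python
def genie_start_conversation_request(host: str, space_id: str,
                                     question: str, user_token: str) -> dict:
    """Build the HTTP call a portal backend would make on behalf of a user.

    Sending the user's own (OBO) token, rather than a shared service
    principal token, is what makes data-level security apply per user.
    Host, space ID, and token here are placeholders.
    """
    return {
        "method": "POST",
        "url": f"{host}/api/2.0/genie/spaces/{space_id}/start-conversation",
        "headers": {"Authorization": f"Bearer {user_token}"},
        "json": {"content": question},
    }
```

A real client would pass these fields to `requests.request(**req)` or similar, then poll the returned message until its status is completed.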
- 2031 Views
- 2 replies
- 0 kudos
Resolved! Clarification on Databricks Claude 3.7 Sonnet native serving endpoint data processing location.
Hello, I am trying to clarify whether the Databricks Claude 3.7 native serving endpoint is safe for use in our enterprise or whether it needs to be disabled. I am trying to understand whether the data sent to this endpoint is kept within our geo region (UK South/Europe)....
Hi @ChrisLawford_n1, the Databricks Claude 3.7 endpoint for UK South, as you can see in the documentation, can only be used if it is supported based on GPU availability and requires cross-geo routing to be enabled, so cross-geo processing that allows data for ...
- 6341 Views
- 1 reply
- 1 kudos
Resolved! API GENIE
Hi community! Yesterday I tried to extract chat history from my Genie spaces, but I can't export chats from other users; I get the following error: {'error_code': 'PERMISSION_DENIED', 'message': 'User XXXXXXXX does not own conversation XXXXXXXX', 'details': [{...
Based on current documentation and available resources, exporting chat histories from Genie Spaces is restricted by ownership rules: only the user who owns the conversation can export that specific chat history, regardless of admin permissions or wor...
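Given that ownership gates export, a harvesting script has to skip conversations the caller doesn't own rather than fail on PERMISSION_DENIED. A minimal sketch; the field names (`id`, `user_id`) are assumptions about the conversation-listing response, so verify them against the actual API payload.

```python
def exportable_conversations(conversations: list, caller_user_id) -> list:
    """Keep only conversations the caller owns, since Genie denies export
    of the rest regardless of admin or workspace permissions.

    `conversations` is assumed to be a list of dicts as returned by a
    list-conversations call; the key names are illustrative.
    """
    return [c for c in conversations if c.get("user_id") == caller_user_id]
```

A full export loop would call this first, then fetch each surviving conversation's messages; other users would need to export their own histories (or an admin-level workaround would be needed, which the reply says does not exist today).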
- 2254 Views
- 7 replies
- 10 kudos
Resolved! ai_query not affected by AI gateway's rate limits?
Hey, we've been testing ai_query (Azure Databricks here) on preconfigured model serving endpoints like databricks-meta-llama-3-3-70b-instruct, and the initial results look nice. I'm trying to limit the number of requests that could be sent to those...
Hey @BS_THE_ANALYST, before writing that post I went through exactly the docs you've posted. I wasn't able to find specific confirmation (or denial) that this function is affected by the rate limits, which led me to believe that it's worth a...
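Until it's confirmed whether AI Gateway rate limits apply to ai_query traffic, a defensive option is throttling on the caller's side before issuing batches. A sliding-window sketch in pure Python; the limit values are illustrative, and the injectable clock exists only to make the behaviour testable.

```python
import time

class RequestThrottle:
    """Cap requests per sliding window on the client side, for the case
    where gateway-side limits may not cover ai_query (unconfirmed)."""

    def __init__(self, max_requests: int, window_s: float, clock=time.monotonic):
        self.max_requests = max_requests
        self.window_s = window_s
        self._clock = clock          # injectable for testing
        self._stamps = []

    def allow(self) -> bool:
        # Drop timestamps that fell out of the window, then admit the
        # request only if we are still under the cap.
        now = self._clock()
        self._stamps = [t for t in self._stamps if now - t < self.window_s]
        if len(self._stamps) < self.max_requests:
            self._stamps.append(now)
            return True
        return False
```

A batch driver would check `allow()` (and sleep/retry on `False`) before each chunk of rows handed to ai_query, giving a hard ceiling independent of whatever the endpoint enforces.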
- 1625 Views
- 2 replies
- 1 kudos
Finally fix Claude Opus
It's been almost 3 months since the announcement of first-party Anthropic Claude Opus 4 on Databricks Mosaic AI Model Serving, but the model is still unavailable. It's listed in the pricing and documentation, but on the Serving endpoints page it's be...
Hi @qlmahga2 , it appears that Claude Sonnet 4 and Claude Sonnet 4.5 are both currently available. I'm not sure what it looked like when you asked this question, as I didn't check at that time. However, it looks like they should be available now. Can...
- 3193 Views
- 8 replies
- 8 kudos
Resolved! Can I get the notebooks used in an e-learning video?
I'm currently watching the videos in the "Generative AI Engineering Pathway."In the "Demo" chapters, it appears that the instructor is explaining based on pre-prepared notebooks (for example, a notebook named "2.1 - Preparing Data for RAG"). Would it...
Hello @Advika! Could you provide details about "the lab materials"? Are they just notebooks, or do they include other materials like raw data files to be processed, environment variables (like "DA"), scripts, etc.? I'm very new to Databricks, so it wou...
- 1813 Views
- 4 replies
- 2 kudos
AI Agents - calling custom code and databricks jobs
Hi everyone, I am building AI agents; my requirement is to call custom tool logic (which was not possible using a Unity Catalog function) and Databricks jobs. I could not find much documentation on these scenarios. If someone could share any referen...
Hi @GiriSreerangam, can you share what you are trying to do with the custom tool? You might be able to implement it with a custom MCP server. Here are some other resources that may help: https://github.com/JustTryAI/databricks-mcp-server, https://github...
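When a Unity Catalog function can't express the tool logic, one common pattern is registering plain Python functions in a tool registry and dispatching serialized tool calls to them. A minimal sketch: `run_job_stub` is a hypothetical stand-in for a real Databricks SDK call such as triggering a job, kept as a stub so the dispatch pattern stays self-contained.

```python
import json

TOOLS = {}

def tool(fn):
    """Register a plain Python function as an agent-callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def run_job_stub(job_id: int) -> str:
    # Placeholder for triggering a Databricks job via the SDK
    # (e.g. WorkspaceClient().jobs.run_now); stubbed for illustration.
    return f"triggered job {job_id}"

def dispatch(tool_call: str) -> str:
    """Execute a tool call serialized as {"name": ..., "arguments": {...}},
    the shape most LLM tool-calling frameworks emit."""
    call = json.loads(tool_call)
    return TOOLS[call["name"]](**call["arguments"])
```

The same registry can back a custom MCP server or a LangGraph tool node; the framework only needs the name-to-callable mapping and a JSON argument schema.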
- 2764 Views
- 2 replies
- 2 kudos
Export/Share Genie Space Across DEV, QA, and PROD Environments
Hi Team, what is the procedure for exporting a Genie space across multiple workspace environments such as DEV, QA, and PROD? Can you provide any details around this? Thanks, Phani
@Pilsner When I try to export, I get "Error: dataRoom is not user-facing." Trying to download in the workspace browser results in a 400 Bad Request. The bundle documentation for AWS and Azure does not mention Genie spaces as existing resources. See the r...
- 788 Views
- 1 reply
- 1 kudos
Resolved! Vectorisation job automation and errors
Hey there! So I'm fairly new to AI and RAG, and at the moment I'm trying to automatically vectorise documents (.pdf, .txt, etc.) each time a new file arrives in a volume that I created. For that, I created a job that's triggered each time a new file...
To address the question about automating and optimizing document vectorization pipelines (PDF, TXT, etc.) like the Databricks unstructured data pipeline with challenges around HuggingFace model downloads and job flexibility, here are insights and alt...
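The file-arrival trigger this thread describes can be expressed as a Jobs API configuration. A sketch assuming the documented `trigger.file_arrival` field; the volume path, job name, and notebook path are placeholders, not the poster's actual objects.

```python
def file_arrival_job(volume_url: str, notebook_path: str) -> dict:
    """Job settings that run a vectorisation notebook whenever a new file
    lands in a Volume. All names and paths are illustrative."""
    return {
        "name": "vectorise-new-documents",
        "trigger": {
            "file_arrival": {"url": volume_url},
            "pause_status": "UNPAUSED",
        },
        "tasks": [
            {
                "task_key": "vectorise",
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
    }

job = file_arrival_job("/Volumes/main/docs/incoming/",
                       "/Workspace/Users/me/vectorise_documents")
```

A settings dict like this would be passed to the Jobs create/update API; for the HuggingFace-download cost mentioned in the reply, caching the embedding model in a Volume and pointing the notebook at it avoids re-downloading on every run.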
- 643 Views
- 1 reply
- 1 kudos
Resolved! Prakash Hinduja Switzerland (Swiss) How can I manage spending while optimizing compute resources?
Hi, I am Prakash Hinduja, Visionary Financial Strategist, born in Amritsar (India) and now living in Geneva, Switzerland. I'm looking for advice on how to better manage costs in Databricks while still keeping performance efficient. If you've foun...
To optimize costs in Databricks while maintaining strong performance, consider a blend of strategic cluster configurations, autoscaling, aggressive job scheduling, and robust monitoring tools. These proven practices are used by leading enterprises in...
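The cost levers the reply lists (autoscaling, aggressive termination, monitoring) translate directly into a cluster specification. A hedged sketch of such a spec for Azure Databricks; the node type, timeout, and availability values are illustrative defaults, not recommendations for any specific workload.

```python
def cost_conscious_cluster(min_workers: int = 1, max_workers: int = 4) -> dict:
    """Cluster settings combining common cost levers: autoscaling,
    aggressive auto-termination, and spot workers with an on-demand
    driver. All concrete values here are placeholders."""
    return {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "autoscale": {"min_workers": min_workers, "max_workers": max_workers},
        "autotermination_minutes": 20,          # idle clusters stop billing
        "azure_attributes": {
            "availability": "SPOT_WITH_FALLBACK_AZURE",
            "first_on_demand": 1,               # keep the driver on-demand
        },
    }
```

Pairing a spec like this with job clusters (rather than long-running all-purpose clusters) and budget/usage monitoring usually covers the bulk of avoidable spend.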
- 2222 Views
- 1 reply
- 1 kudos
Resolved! streaming llm response
I am deploying an agent that works well without streaming. It is using the following packages: "mlflow==2.22.1", "langgraph", "langchain", "pydantic==2.8.2", "langgraph-checkpoint-sqlite", "databricks-langchain", "p...
To implement streaming output for your agent in Databricks and resolve the error "This model does not support predict_stream method.", the key requirement is that your underlying MLflow model must support the predict_stream method. Most l...
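The shape the serving layer looks for (a `predict_stream` that yields chunks alongside the usual `predict`) can be illustrated with a tiny wrapper. This is a structural sketch only: the list of chunks stands in for what a LangGraph graph's streaming output would produce, and a real deployment would implement this interface on an MLflow model class.

```python
class StreamingAgentWrapper:
    """Illustrates the predict/predict_stream pair a streaming-capable
    model exposes. The `chunks` list is a stand-in for the wrapped
    agent's incremental output (e.g. a LangGraph stream)."""

    def __init__(self, chunks):
        self._chunks = chunks

    def predict(self, messages):
        # Non-streaming path: return the fully assembled response.
        return "".join(self._chunks)

    def predict_stream(self, messages):
        # Streaming path: a generator yielding partial responses,
        # which is what the "does not support predict_stream" error
        # says is missing from the deployed model.
        for chunk in self._chunks:
            yield chunk
```

The key point from the reply stands: if the logged model class lacks a `predict_stream` method, the endpoint cannot stream, regardless of what the underlying agent supports.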
- 1620 Views
- 5 replies
- 3 kudos
Model deprecation issue while serving on Databricks
I am facing the below error while serving the codellama model: Exception: Request failed with status 400, {"error_code":"INVALID_PARAMETER_VALUE","message":"The provisioned throughput model Llama 2 7B is deprecated and no longer supported in serving. See...
Hello @HemantvIkani32! Did the responses shared above help resolve your concern? If yes, please consider marking the relevant response(s) as the accepted solution.
- 751 Views
- 1 reply
- 0 kudos
How to utilize cluster GPUs for large HF models
Hi, I am using a GPU cluster (driver: 1 GPU, workers: 3 GPUs) and caching model data in Unity Catalog, but while loading model checkpoint shards it always uses driver memory and fails due to insufficient memory. How do I use the complete cluster's GPUs while loadi...
1. Are you using a model-parallel library such as FSDP or DeepSpeed? Otherwise, every GPU will load the entire model weights. 2. If yes to 1, Unity Catalog Volumes are exposed on every node at /Volumes/<catalog>/<schema>/<volume>/..., so w...
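As the reply notes, sharding across nodes needs FSDP or DeepSpeed; on a single multi-GPU node, the `max_memory` map passed with `device_map="auto"` in transformers/accelerate is what spreads checkpoint shards across GPUs instead of filling one device. A sketch of a hypothetical helper that builds that map; the memory sizes are placeholders to adjust for the actual hardware.

```python
def max_memory_map(n_gpus: int, gpu_mem_gib: int, cpu_mem_gib: int = 64) -> dict:
    """Build the max_memory argument used alongside device_map='auto'
    (e.g. AutoModelForCausalLM.from_pretrained(..., device_map="auto",
    max_memory=...)) so shards spread across all visible GPUs rather
    than loading on one device. Sizes here are illustrative."""
    mm = {i: f"{gpu_mem_gib}GiB" for i in range(n_gpus)}
    mm["cpu"] = f"{cpu_mem_gib}GiB"   # CPU offload budget as a spillover
    return mm
```

Note this only helps within one node's GPUs; a Spark driver/worker split does not pool GPU memory, which is why a single worker node with multiple GPUs (or FSDP/DeepSpeed for true multi-node) is the usual answer.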
- 1787 Views
- 3 replies
- 1 kudos
AgentBricks Query Error
Has anyone gotten: [REMOTE_FUNCTION_HTTP_FAILED_ERROR] The remote HTTP request failed with code 400, and error message 'HTTP request failed with status: {"error_code": "BAD_REQUEST", "message": "Errored due to unknown error. Please contact Databricks sup...
Hi @gascarin, is this endpoint optimized or the base endpoint? Try optimizing the endpoint and check it again. Are you running ai_query? Can you please run it with a very simple, short prompt to see if it works?
- agent (2)
- agent bricks (2)
- Agent Skills (1)
- agents (2)
- AI (2)
- AI Agents (10)
- ai gateway (2)
- Anthropic (1)
- API Documentation (1)
- App (3)
- Application (1)
- Asset Bundles (1)
- Authentication (1)
- Autologging (1)
- automoation (1)
- Aws databricks (2)
- ChatDatabricks (1)
- claude (5)
- Cluster (1)
- Credentials (1)
- crewai (1)
- cursor (1)
- Databricks App (3)
- Databricks Course (1)
- Databricks Delta Table (1)
- Databricks Mlflow (1)
- Databricks Notebooks (1)
- Databricks SQL (1)
- Databricks Table Usage (1)
- Databricks-connect (1)
- databricksapps (1)
- delta sync (1)
- Delta Tables (1)
- Developer Experience (1)
- DLT Pipeline (1)
- documentation (1)
- Ethical Data Governance (1)
- Foundation Model (4)
- gemini (1)
- gemma (1)
- GenAI (11)
- GenAI agent (2)
- GenAI and LLMs (4)
- GenAI Generation AI (1)
- GenAIGeneration AI (38)
- Generation AI (2)
- Generative AI (5)
- Genie (18)
- Genie - Notebook Access (2)
- GenieAPI (4)
- Google (1)
- GPT (1)
- healthcare (1)
- Index (1)
- inference table (1)
- Information Extraction (1)
- Langchain (4)
- LangGraph (1)
- Llama (1)
- Llama 3.3 (1)
- LLM (2)
- machine-learning (1)
- mcp (2)
- MlFlow (4)
- Mlflow registry (1)
- MLModels (1)
- Model Serving (3)
- modelserving (1)
- mosic ai search (1)
- Multiagent (2)
- NPM error (1)
- OpenAI (1)
- Pandas udf (1)
- Playground (1)
- productivity (1)
- Pyspark (1)
- Pyspark Dataframes (1)
- RAG (3)
- ro (1)
- Scheduling (1)
- Server (1)
- serving endpoint (3)
- streaming (2)
- Tasks (1)
- Vector (1)
- vector index (1)
- Vector Search (2)
- Vector search index (6)