Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

Forum Posts

Sujitha
by Databricks Employee
  • 29959 Views
  • 18 replies
  • 33 kudos

Databricks Announces the Industry’s First Generative AI Engineer Learning Pathway and Certification

Today, we are announcing the industry's first Generative AI Engineer learning pathway and certification to help ensure that data and AI practitioners have the resources to be successful with generative AI. At Databricks, we recognize that generative ...

Latest Reply
AIChief
New Contributor II
  • 33 kudos

thanks for sharing

17 More Replies
Dulce42
by New Contributor II
  • 5753 Views
  • 1 reply
  • 1 kudos

Resolved! Genie API

Hi community! Yesterday I tried to export the chat history from my Genie spaces, but I can't export chats from other users. I get the following error: {'error_code': 'PERMISSION_DENIED', 'message': 'User XXXXXXXX does not own conversation XXXXXXXX', 'details': [{...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Based on current documentation and available resources, exporting chat histories from Genie Spaces is restricted by ownership rules: only the user who owns the conversation can export that specific chat history, regardless of admin permissions or wor...

PiotrM
by New Contributor III
  • 735 Views
  • 7 replies
  • 10 kudos

Resolved! ai_query not affected by AI gateway's rate limits?

Hey, we've been testing ai_query (Azure Databricks here) on preconfigured model serving endpoints like databricks-meta-llama-3-3-70b-instruct, and the initial results look nice. I'm trying to limit the number of requests that could be sent to those...
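For context, this is roughly how ai_query is invoked from a notebook against such an endpoint; the table and column names in this sketch are placeholders, not details from the post:

```python
# Minimal sketch: calling ai_query against the endpoint named in the post.
# Assumptions: a table my_catalog.my_schema.reviews with a review_text column
# exists; both names are illustrative placeholders.
df = spark.sql("""
    SELECT
        review_text,
        ai_query(
            'databricks-meta-llama-3-3-70b-instruct',
            CONCAT('Summarize in one sentence: ', review_text)
        ) AS summary
    FROM my_catalog.my_schema.reviews
    LIMIT 10
""")
display(df)
```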

Latest Reply
PiotrM
New Contributor III
  • 10 kudos

Hey @BS_THE_ANALYST, before writing that post I went through exactly the docs you posted. I wasn't able to find specific confirmation (or denial) that this function is affected by the rate limits, which led me to believe that it's worth a...

6 More Replies
qlmahga2
by New Contributor III
  • 961 Views
  • 2 replies
  • 1 kudos

Finally fix Claude Opus

It's been almost 3 months since the announcement of First-party Anthropic Claude Opus 4 on Databricks Mosaic AI Model Serving, but the model is still unavailable. It's listed in the pricing and documentation, but on the Serving endpoints page, it's be...

Latest Reply
jack_zaldivar
Databricks Employee
  • 1 kudos

Hi @qlmahga2 , it appears that Claude Sonnet 4 and Claude Sonnet 4.5 are both currently available. I'm not sure what it looked like when you asked this question, as I didn't check at that time. However, it looks like they should be available now. Can...

1 More Replies
Nobuhiko
by New Contributor III
  • 2258 Views
  • 8 replies
  • 8 kudos

Resolved! Can I get the notebooks used in an e-learning video?

I'm currently watching the videos in the "Generative AI Engineering Pathway." In the "Demo" chapters, it appears that the instructor is explaining based on pre-prepared notebooks (for example, a notebook named "2.1 - Preparing Data for RAG"). Would it...

Latest Reply
Nobuhiko
New Contributor III
  • 8 kudos

Hello @Advika! Could you provide details about "the lab materials"? Are they just notebooks, or do they include other materials like raw data files to be processed, environment variables (like "DA"), scripts, etc.? I'm very new to Databricks, so it wou...

7 More Replies
GiriSreerangam
by New Contributor III
  • 454 Views
  • 4 replies
  • 2 kudos

AI Agents - calling custom code and Databricks jobs

Hi everyone, I am building AI agents; my requirement is to call custom tool logic (which was not possible using a Unity Catalog function) and Databricks jobs. I could not find much documentation on these scenarios. If someone could share any referen...

Latest Reply
jamesl
Databricks Employee
  • 2 kudos

Hi @GiriSreerangam, can you share what you are trying to do with the custom tool? You might be able to implement it with a custom MCP server. Here are some other resources that may help: https://github.com/JustTryAI/databricks-mcp-server https://github...
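To illustrate the custom MCP server route suggested above, here is a hedged sketch that exposes a Databricks job trigger as an MCP tool; the package choice (mcp plus databricks-sdk), the tool name, and the job ID are assumptions, not the poster's setup:

```python
# Sketch of a custom MCP server exposing a "run Databricks job" tool.
# Assumptions: the `mcp` and `databricks-sdk` packages are installed, and the
# job_id passed in refers to an existing job in your workspace.
from mcp.server.fastmcp import FastMCP
from databricks.sdk import WorkspaceClient

mcp = FastMCP("databricks-tools")
w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN from the environment

@mcp.tool()
def run_databricks_job(job_id: int) -> str:
    """Trigger a Databricks job and wait for it to finish (hypothetical tool)."""
    run = w.jobs.run_now(job_id=job_id).result()  # blocks until the run terminates
    return f"Run {run.run_id} finished with state {run.state.result_state}"

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```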

3 More Replies
Phani1
by Valued Contributor II
  • 590 Views
  • 2 replies
  • 2 kudos

Export/Share Genie Space Across DEV, QA, and PROD Environments

Hi Team, what is the procedure for exporting a Genie space across multiple workspace environments such as DEV, QA, and PROD? Can you provide any details around this? Thanks, Phani

Latest Reply
spoltier
New Contributor III
  • 2 kudos

@Pilsner When I try to export, I get Error: dataRoom is not user-facing. Trying to download in the workspace browser results in a 400 Bad Request. The bundle documentation for AWS and Azure does not mention Genie spaces as existing resources. See the r...

1 More Replies
brahaman
by New Contributor II
  • 508 Views
  • 1 reply
  • 1 kudos

Resolved! Vectorisation job automation and errors

Hey there! So I'm fairly new to AI and RAG, and at the moment I'm trying to automatically vectorise documents (.pdf, .txt, etc.) each time a new file arrives in a volume that I created. For that, I created a job that's triggered each time a new file...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To address the question about automating and optimizing document vectorization pipelines (PDF, TXT, etc.) like the Databricks unstructured data pipeline with challenges around HuggingFace model downloads and job flexibility, here are insights and alt...
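As a rough sketch of one building block of such a pipeline, the snippet below embeds .txt files from a Unity Catalog volume with a locally cached sentence-transformers model (sidestepping repeated HuggingFace downloads) and appends the vectors to a Delta table; the paths, table name, and model choice are placeholders, not details from the thread:

```python
# Rough sketch: embed .txt files from a UC volume into a Delta table.
# Assumptions: sentence-transformers is installed, the model was copied once
# into the volume, and all paths / table names below are placeholders.
import os
from sentence_transformers import SentenceTransformer

VOLUME_DIR = "/Volumes/my_catalog/my_schema/raw_docs"
MODEL_DIR = "/Volumes/my_catalog/my_schema/models/all-MiniLM-L6-v2"

model = SentenceTransformer(MODEL_DIR)  # loads from the local copy, no network call

rows = []
for name in os.listdir(VOLUME_DIR):
    if name.endswith(".txt"):
        with open(os.path.join(VOLUME_DIR, name)) as f:
            text = f.read()
        rows.append((name, text, model.encode(text).tolist()))

if rows:
    df = spark.createDataFrame(
        rows, "file_name string, content string, embedding array<float>"
    )
    df.write.mode("append").saveAsTable("my_catalog.my_schema.doc_embeddings")
```

In a file-arrival-triggered job you would additionally filter this loop down to files that have not been processed yet.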

prakashhinduja1
by New Contributor
  • 361 Views
  • 1 reply
  • 1 kudos

Resolved! Prakash Hinduja Switzerland (Swiss) How can I manage spending while optimizing compute resources?

Hi, I am Prakash Hinduja, Visionary Financial Strategist, born in Amritsar (India) and now living in Geneva, Switzerland (Swiss). I'm looking for advice on how to better manage costs in Databricks while still keeping performance efficient. If you've foun...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To optimize costs in Databricks while maintaining strong performance, consider a blend of strategic cluster configurations, autoscaling, aggressive job scheduling, and robust monitoring tools. These proven practices are used by leading enterprises in...
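As a hedged illustration of the cluster-configuration side of this, the Databricks Python SDK can create an autoscaling cluster with aggressive auto-termination; the node type, runtime version, and sizes below are placeholders, not recommendations from the reply:

```python
# Sketch: an autoscaling all-purpose cluster with a short auto-termination
# window, created via the Databricks SDK. Node type, Spark version, and worker
# counts are illustrative; adjust to your cloud and workload.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()

cluster = w.clusters.create_and_wait(
    cluster_name="cost-conscious-cluster",
    spark_version="15.4.x-scala2.12",
    node_type_id="Standard_D4ds_v5",                      # Azure example; pick per cloud
    autoscale=compute.AutoScale(min_workers=1, max_workers=4),
    autotermination_minutes=20,                           # shut down quickly when idle
)
print(cluster.cluster_id)
```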

chunky35
by New Contributor
  • 1173 Views
  • 1 reply
  • 1 kudos

Resolved! Streaming LLM response

I am deploying an agent that works well without streaming. It is using the following packages: "mlflow==2.22.1", "langgraph", "langchain", "pydantic==2.8.2", "langgraph-checkpoint-sqlite", "databricks-langchain", "p...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

To implement streaming output for your agent in Databricks and resolve the error "This model does not support predict_stream method.", the key requirement is that your underlying MLflow model must support the predict_stream method. Most l...
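A minimal sketch of what that requirement looks like for a custom pyfunc model follows; the chunked echo logic is purely illustrative and stands in for the poster's LangGraph agent:

```python
# Minimal sketch of an MLflow pyfunc model that supports predict_stream.
# The "agent" here just echoes the input token by token; a real agent would
# yield its streamed events instead.
import mlflow
import mlflow.pyfunc

class StreamingAgent(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input, params=None):
        return {"output": f"echo: {model_input}"}

    def predict_stream(self, context, model_input, params=None):
        # Must be a generator; serving calls this when the client requests streaming.
        for token in f"echo: {model_input}".split():
            yield {"delta": token}

with mlflow.start_run():
    info = mlflow.pyfunc.log_model("agent", python_model=StreamingAgent())

loaded = mlflow.pyfunc.load_model(info.model_uri)
for chunk in loaded.predict_stream("hello streaming world"):
    print(chunk)
```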

HemantvIkani32
by New Contributor II
  • 768 Views
  • 5 replies
  • 3 kudos

Model deprecation issue while serving on Databricks

I am facing this below error while serving codellama model:Exception: Request failed with status 400, {"error_code":"INVALID_PARAMETER_VALUE","message":"The provisioned throughput model Llama 2 7B is deprecated and no longer supported in serving. See...

Latest Reply
Advika
Databricks Employee
  • 3 kudos

Hello @HemantvIkani32! Did the responses shared above help resolve your concern? If yes, please consider marking the relevant response(s) as the accepted solution.

4 More Replies
dk_g
by New Contributor
  • 522 Views
  • 1 reply
  • 0 kudos

How to utilize clustered GPUs for large HF models

Hi, I am using a GPU cluster (driver: 1 GPU, workers: 3 GPUs) and caching model data in Unity Catalog, but while loading model checkpoint shards it always uses driver memory and fails due to insufficient memory. How can I use the full cluster's GPUs while loadi...

Latest Reply
lin-yuan
Databricks Employee
  • 0 kudos

1. Are you using any model-parallel library, such as FSDP or DeepSpeed? Otherwise, every GPU will load the entire model weights. 2. If yes to 1, Unity Catalog Volumes are exposed on every node at /Volumes/<catalog>/<schema>/<volume>/..., so w...
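To make the Volumes point concrete, here is a hedged sketch that loads a checkpoint from a volume path and lets device_map="auto" spread layers across the GPUs visible to one node; the model path is a placeholder, and cross-node sharding still requires a model-parallel framework such as FSDP or DeepSpeed as noted above:

```python
# Sketch: load a checkpoint cached in a Unity Catalog volume and let
# accelerate's device_map="auto" place layers across this node's GPUs.
# Requires transformers + accelerate; the volume path is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/Volumes/my_catalog/my_schema/models/llama-3-8b"

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.bfloat16,   # halves memory versus float32
    device_map="auto",            # shards layers across the locally visible GPUs
)

inputs = tokenizer("Hello from Databricks", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```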

gascarin
by New Contributor II
  • 461 Views
  • 3 replies
  • 1 kudos

AgentBricks Query Error

Has anyone got: [REMOTE_FUNCTION_HTTP_FAILED_ERROR] The remote HTTP request failed with code 400, and error message 'HTTP request failed with status: {"error_code": "BAD_REQUEST", "message": "Errored due to unknown error. Please contact Databricks sup...

Latest Reply
Krishna_S
Databricks Employee
  • 1 kudos

Hi @gascarin  Is this endpoint optimized or the base endpoint? Try optimizing the endpoint and check it again. Are you running ai_query? Can you please run it with a very simple and short prompt to see if it works? 

2 More Replies
bhanu_gautam
by Valued Contributor III
  • 556 Views
  • 4 replies
  • 3 kudos

Resolved! Agent Bricks not visible on free edition yet

Hi, I am from the Asia region and Agent Bricks is not yet enabled in the Free Edition. Any idea when it is expected to be released?

Latest Reply
bhanu_gautam
Valued Contributor III
  • 3 kudos

Thank you @Advika  and @nayan_wylde 

3 More Replies
kay0291
by New Contributor
  • 421 Views
  • 1 reply
  • 1 kudos

Resolved! Databricks app crash

Hello everyone, I am working on a Databricks App project. It's a chatbot in a monorepo with a frontend built with Next.js and a FastAPI backend. Sometimes the app crashes unexpectedly. In the logs I see these messages: -> sending sigterms to other pro...

Latest Reply
sarahbhord
Databricks Employee
  • 1 kudos

Hey Kay0291 - Your logs suggest the app is running out of memory: exit code 137 means the OS killed your process (likely exceeding Databricks Apps’ 2 vCPU/6GB RAM limit). This often happens when both Next.js and FastAPI run together, especially under...
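One common way to avoid running two processes inside the app container is to export the Next.js frontend statically and let FastAPI serve it alongside the API; the sketch below assumes a static export in an out/ directory, which may not match the poster's actual setup:

```python
# Sketch: single-process layout for a Databricks App. FastAPI serves both the
# API routes and a statically exported Next.js frontend (assumes `next build`
# with output: 'export' produced an `out/` directory in the repo).
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

@app.get("/api/health")
def health():
    return {"status": "ok"}

# Mount the static export last so the API routes above take precedence.
app.mount("/", StaticFiles(directory="out", html=True), name="frontend")
```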

kbaig8125
by New Contributor III
  • 830 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks Genie as Managed MCP server

I am trying out a Genie space to be used as an MCP server with a client. When I add the URL details and the authentication PAT, I get an error (attached image file for more details): Unable to validate mcp server capabilities! Error: MCP server validation...

Latest Reply
Advika
Databricks Employee
  • 2 kudos

Hello @kbaig8125! Did the steps shared above help resolve your issue? If yes, please consider marking it as the accepted solution. If you found a different approach, please share it with the community so others can benefit as well.

1 More Replies
