Generative AI
Explore discussions on generative artificial intelligence techniques and applications within the Databricks Community. Share ideas, challenges, and breakthroughs in this cutting-edge field.

Databricks Genie API - Get conversation message

avinashk
New Contributor III

Hello team, 

I am trying to call the Databricks Genie APIs from a frontend with Node.js. I was able to test most of the APIs and successfully got responses from all of them.

But since yesterday afternoon, the API below

GET /api/2.0/genie/spaces/{space_id}/conversations/{conversation_id}/messages/{message_id}

has not been returning any response, and today it started throwing an error with code 500:

http://localhost:3000/api/spaces/01f0xxxxxx55/conversations/01f0xxxxxxx3cae/messages/01f0xxxxxxx8ab9 500 (Internal Server Error)

Any help fixing this error would be appreciated. Thanks
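For context, here is a minimal sketch of calling this endpoint directly from Node.js 18+ (which provides a global fetch). The host and token environment variables and the placeholder IDs are assumptions for illustration, not values from this thread:

// Minimal sketch: call the Genie "get message" endpoint from Node.js 18+.
// DATABRICKS_HOST and DATABRICKS_TOKEN are assumed environment variables;
// the space/conversation/message IDs passed in below are placeholders.
const host = process.env.DATABRICKS_HOST;   // e.g. your workspace URL
const token = process.env.DATABRICKS_TOKEN; // a token with access to the Genie space

async function getGenieMessage(spaceId, conversationId, messageId) {
  const url = `${host}/api/2.0/genie/spaces/${spaceId}/conversations/${conversationId}/messages/${messageId}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) {
    // Surface the upstream status and body instead of a bare 500.
    throw new Error(`Genie API ${res.status}: ${await res.text()}`);
  }
  return res.json();
}

getGenieMessage('SPACE_ID', 'CONVERSATION_ID', 'MESSAGE_ID')
  .then((msg) => console.log(msg.status))
  .catch((err) => console.error(err));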


4 REPLIES

BS_THE_ANALYST
Esteemed Contributor

@avinashk I'm probably being naive here, but looking at your error message, the endpoint is:

/api/2.0/genie/spaces/...

Your request is missing "2.0" and "genie": http://localhost:3000/api/spaces/01f0xxxxxx55/conversations/01f0xxxxxxx3cae/messages/01f0xxxxxxx8ab9

Worth checking this?

All the best,
BS

avinashk
New Contributor III

Hello. Apologies for the confusion. My server-side call looks like this:

https://adb-xxxxx.7.azuredatabricks.net/api/2.0/genie/spaces/01f06xxxxxxxxxx6001/conversations/01f0x...

The one in my original post is a proxy API call from the frontend. It was working until yesterday afternoon, then it stopped responding for a long time and eventually started throwing ERROR 500.
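A note on the proxy layer described above: the frontend path (/api/spaces/...) differs from the Databricks path (/api/2.0/genie/spaces/...), so the mapping has to happen server side. Below is a rough Express sketch of such a route; it is not the poster's actual code, and the environment variables and the 60-second timeout are assumptions:

// Hypothetical Express proxy route: maps the frontend path /api/spaces/...
// onto the full Databricks path /api/2.0/genie/spaces/... and forwards a token.
// Requires Node.js 18+ for the global fetch and AbortSignal.timeout.
const express = require('express');
const app = express();

const host = process.env.DATABRICKS_HOST;
const token = process.env.DATABRICKS_TOKEN;

app.get('/api/spaces/:spaceId/conversations/:conversationId/messages/:messageId', async (req, res) => {
  const { spaceId, conversationId, messageId } = req.params;
  const url = `${host}/api/2.0/genie/spaces/${spaceId}/conversations/${conversationId}/messages/${messageId}`;
  try {
    // Abort after 60s so a hung upstream call fails fast instead of surfacing as a generic 500.
    const upstream = await fetch(url, {
      headers: { Authorization: `Bearer ${token}` },
      signal: AbortSignal.timeout(60000),
    });
    res.status(upstream.status).send(await upstream.text());
  } catch (err) {
    res.status(502).json({ error: String(err) });
  }
});

app.listen(3000);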

avinashk
New Contributor III

Hello everyone, 

I believe it was just a temporary server issue. I am able to get a response now.
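A side note, since the resolution points to a transient server-side problem: a small retry with backoff around the call can smooth over short-lived 500s. This helper is an illustration only, not something from the thread:

// Hypothetical retry helper for transient 5xx responses (not from this thread).
async function fetchWithRetry(url, options = {}, retries = 3) {
  let res;
  for (let attempt = 1; attempt <= retries; attempt++) {
    res = await fetch(url, options);
    if (res.status < 500 || attempt === retries) break;
    // Exponential backoff: 1s, 2s, 4s, ...
    await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** (attempt - 1)));
  }
  return res;
}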

BS_THE_ANALYST
Esteemed Contributor

@avinashk thanks for the update! 

Glad it's resolved.

All the best,
BS