Description
How are you running AnythingLLM?
Docker (local)
What happened?
Hello,
I am trying to use the Jina API as the embedding engine in AnythingLLM by setting the embedding provider to openai and pointing OPENAI_API_BASE_URL to https://api.jina.ai (I also tested https://api.jina.ai/v1 and https://api.jina.ai/v1/embeddings). The API itself works correctly when tested with curl, e.g.:
```
curl -X POST https://api.jina.ai/v1/embeddings ^
  -H "Content-Type: application/json" ^
  -H "Authorization: Bearer API_KEY" ^
  -d "{\"model\":\"jina-embeddings-v2-base-de\",\"input\":\"Hello world\"}"
```
This returns a valid embedding response. The application, however, fails to embed with the following error message:

```
GenericOpenAI Failed to embed: [failed_to_embed]: 404 status code (no body)
```
It appears that the GenericOpenAI provider in AnythingLLM does not handle the Jina API correctly, even though the API is OpenAI-compatible.
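I do not know exactly how the GenericOpenAI provider constructs the request URL, but if it behaves like the official OpenAI Python client, the base URL has to already end in /v1, because the client appends /embeddings to it. A base URL of just https://api.jina.ai would then resolve to https://api.jina.ai/embeddings, which might explain the 404. A minimal sketch of that assumption (my own test script with a placeholder key, not AnythingLLM code):

```python
# My own test script (not AnythingLLM code): the official OpenAI Python client
# appends "/embeddings" to the configured base URL, so the base URL must
# already include "/v1" for Jina.
from openai import OpenAI  # pip install openai

client = OpenAI(
    api_key="JINA_API_KEY",             # placeholder
    base_url="https://api.jina.ai/v1",  # without "/v1" the request would hit
                                        # https://api.jina.ai/embeddings -> 404
)

resp = client.embeddings.create(
    model="jina-embeddings-v2-base-de",
    input="Hello world",
)
print(len(resp.data[0].embedding))  # embedding dimension
```

If the provider strips or ignores the /v1 segment, or appends its own path differently, that could produce exactly this kind of 404.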
The Docker logs show only this error message, with no additional details or response body, which makes it difficult to diagnose the root cause.
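To get more detail than the logs provide, I put together a rough probe script (again my own code, placeholder API key) that posts the same payload to each base-URL variant I tried, with /embeddings appended the way an OpenAI-style client would, and prints the status code and response body:

```python
# Rough diagnostic probe (my own script, placeholder API key): post the same
# payload to each base-URL variant, with "/embeddings" appended the way an
# OpenAI-style client would do it, and print the status code plus body.
import requests

payload = {"model": "jina-embeddings-v2-base-de", "input": "Hello world"}
headers = {"Authorization": "Bearer JINA_API_KEY"}  # placeholder key

for base in (
    "https://api.jina.ai",
    "https://api.jina.ai/v1",
    "https://api.jina.ai/v1/embeddings",
):
    url = base if base.endswith("/embeddings") else base.rstrip("/") + "/embeddings"
    r = requests.post(url, json=payload, headers=headers, timeout=30)
    print(f"{url} -> {r.status_code}: {r.text[:200]}")
```

Something like this might help narrow down which URL the provider actually ends up calling, but it does not tell me what AnythingLLM does internally.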
Could you please advise whether this is a known issue, or whether there is a way to configure AnythingLLM to work correctly with the Jina embeddings API?
Thanks in advance for your help!
Are there known steps to reproduce?
No response