
Elastic has announced that the Elasticsearch Open Inference API now supports Jina AI’s latest embedding models and reranking products. Developers building semantic search and RAG applications with the Elasticsearch vector database now benefit from Jina AI’s high-performance, cost-effective tools for GenAI information retrieval and semantic applications. The integration includes support for multilingual text embeddings and multilingual reranking, and is optimized for retrieval, clustering, and classification.
“Integrating Jina AI’s embeddings and reranker models with the Elasticsearch Open Inference API brings enterprise-grade semantic search to production environments while strengthening our open-source communities,” said Dr. Saahil Ognawala, head of product at Jina AI. “Combining Jina’s resource-efficient and open-weight search foundation models with Elasticsearch’s proven scalability enables developers to easily build reliable semantic search and RAG applications.”
“Elastic is committed to providing open GenAI solutions that enable developers to build the next generation of search experiences,” said Ajay Nair, general manager, Platform at Elastic. “Our collaboration with Jina AI gives our users access to Jina’s high-performance tools on a singular Elasticsearch platform alongside Elastic models like ELSER, providing a seamless, streamlined building experience designed to create first-class generative AI applications.”
Support for Jina AI is available today in Elastic Cloud Serverless. Supported Jina AI models include jina-embeddings-v3 and jina-reranker-v2-base-multilingual.
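
For developers who want to try the integration, the sketch below shows how Jina AI embedding and rerank endpoints might be created and queried through the Open Inference API. It is a minimal illustration only: the cluster URL, API keys, endpoint names, and task settings shown are placeholders and assumptions, and the exact service options should be confirmed against the Elasticsearch and Jina AI documentation.

# Minimal sketch (Python): creating Jina AI inference endpoints via the
# Elasticsearch Open Inference API. URL, credentials, endpoint names, and
# task settings below are placeholders, not verified production values.
import requests

ES_URL = "https://localhost:9200"          # placeholder cluster address
ES_API_KEY = "<elasticsearch-api-key>"     # placeholder Elasticsearch API key
JINA_API_KEY = "<jina-ai-api-key>"         # placeholder Jina AI API key

headers = {
    "Authorization": f"ApiKey {ES_API_KEY}",
    "Content-Type": "application/json",
}

# Text-embedding endpoint backed by jina-embeddings-v3.
requests.put(
    f"{ES_URL}/_inference/text_embedding/jina-embeddings",
    headers=headers,
    json={
        "service": "jinaai",
        "service_settings": {
            "api_key": JINA_API_KEY,
            "model_id": "jina-embeddings-v3",
        },
    },
)

# Rerank endpoint backed by jina-reranker-v2-base-multilingual.
# The task settings here (top_n, return_documents) are illustrative assumptions.
requests.put(
    f"{ES_URL}/_inference/rerank/jina-rerank",
    headers=headers,
    json={
        "service": "jinaai",
        "service_settings": {
            "api_key": JINA_API_KEY,
            "model_id": "jina-reranker-v2-base-multilingual",
        },
        "task_settings": {"top_n": 5, "return_documents": True},
    },
)

# Once created, an endpoint can be queried directly, for example:
resp = requests.post(
    f"{ES_URL}/_inference/rerank/jina-rerank",
    headers=headers,
    json={
        "query": "vector database",
        "input": [
            "Elasticsearch is a distributed search and analytics engine.",
            "Bananas are a good source of potassium.",
        ],
    },
)
print(resp.json())

In practice, such endpoints would typically be referenced from an index mapping (for example, a semantic_text field) or a retriever so that embedding and reranking happen inside the search pipeline rather than in application code.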