Key Capabilities
- Instruction Following: Dynamically control document ranking through natural language commands
- Conflict Resolution: Intelligently handle contradictory information from multiple knowledge sources
- Superior Accuracy: Achieve state-of-the-art performance on industry benchmarks
- Seamless Integration: Drop-in replacement for existing rerankers in your RAG pipeline
The integration is built on the contextual-client Python SDK; see the Contextual AI documentation for more details.
Overview
This integration invokes Contextual AI's instruction-following reranker.
Integration details
| Class | Package | Local | Serializable | JS support |
| --- | --- | --- | --- | --- |
| ContextualRerank | langchain-contextual | ❌ | beta | ❌ |
Setup
To access Contextual's reranker models you'll need to create a Contextual AI account, get an API key, and install the langchain-contextual integration package.
Credentials
Head to app.contextual.ai to sign up for Contextual AI and generate an API key. Once you've done this, set the CONTEXTUAL_AI_API_KEY environment variable:
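One minimal way to do this from Python (a sketch; getpass is used only so the key is not echoed, and the prompt text is illustrative):

```python
import getpass
import os

# Prompt for the key only if it is not already set in the environment.
if not os.environ.get("CONTEXTUAL_AI_API_KEY"):
    os.environ["CONTEXTUAL_AI_API_KEY"] = getpass.getpass(
        "Enter your Contextual AI API key: "
    )
```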
Installation
The LangChain Contextual integration lives in the langchain-contextual package:
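For example, with pip:

```bash
pip install -U langchain-contextual
```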
Instantiation
The Contextual Reranker arguments are:

| Parameter | Type | Description |
| --- | --- | --- |
| documents | list[Document] | A sequence of documents to rerank. Any metadata contained in the documents is also used for reranking. |
| query | str | The query to use for reranking. |
| model | str | The version of the reranker to use. Currently, only "ctxl-rerank-en-v1-instruct" is available. |
| top_n | Optional[int] | The number of results to return. If None, all results are returned. Defaults to self.top_n. |
| instruction | Optional[str] | The instruction to be used for the reranker. |
| callbacks | Optional[Callbacks] | Callbacks to run during the compression process. |
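To make this concrete, here is a minimal sketch of instantiating and calling the reranker. It assumes the API key is already set in the environment as described above, that model, top_n, and instruction are passed at construction time while documents and query go to compress_documents (LangChain's standard document-compressor interface), and that the documents, query, and instruction shown are illustrative placeholders.

```python
from langchain_core.documents import Document
from langchain_contextual import ContextualRerank

# Create the reranker; the API key is assumed to be read from CONTEXTUAL_AI_API_KEY.
reranker = ContextualRerank(
    model="ctxl-rerank-en-v1-instruct",
    top_n=3,
    instruction="Prioritize the most recent information.",  # optional natural-language instruction
)

# Placeholder documents and query for illustration.
documents = [
    Document(page_content="Apple announced the iPhone 15 in September 2023."),
    Document(page_content="Apple announced the iPhone 14 in September 2022."),
    Document(page_content="Apple is headquartered in Cupertino, California."),
]
query = "When was the latest iPhone announced?"

# Rerank: returns up to top_n documents ordered by relevance to the query.
reranked_docs = reranker.compress_documents(documents=documents, query=query)

for doc in reranked_docs:
    print(doc.page_content)
```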