Search RPM Packages
48 RPM packages match your search terms.
  1. python3-langchain-mongodb-0.6.1-1.lbn42.noarch.rpm
    Sep 13, 2025     ok
    langchain-mongodb. Installation: pip install -U langchain-mongodb. Usage: see Getting Started with the LangChain Integration for a walkthrough of your first LangChain implementation with MongoDB Atlas. Using MongoDBAtlasVectorSearch (the snippet below restores the imports and defines the client before it is used):

```python
import os

from pymongo import MongoClient
from langchain_mongodb import MongoDBAtlasVectorSearch
from langchain_openai import OpenAIEmbeddings

MONGODB_ATLAS_CLUSTER_URI = os.environ.get("MONGODB_ATLAS_CLUSTER_URI")
DB_NAME = "langchain_db"
COLLECTION_NAME = "test"
ATLAS_VECTOR_SEARCH_INDEX_NAME = "index_name"

# Create the client before referencing the collection through it.
client = MongoClient(MONGODB_ATLAS_CLUSTER_URI)
MONGODB_COLLECTION = client[DB_NAME][COLLECTION_NAME]

vector_search = MongoDBAtlasVectorSearch.from_connection_string(
    MONGODB_ATLAS_CLUSTER_URI,
    DB_NAME + "." + COLLECTION_NAME,
    OpenAIEmbeddings(disallowed_special=()),
    index_name=ATLAS_VECTOR_SEARCH_INDEX_NAME,
)
```
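The integration reads the cluster URI from an environment variable and addresses the vector store by a "database.collection" namespace string. A minimal stdlib-only sketch of that configuration pattern (the URI value here is a hypothetical placeholder, not real credentials):

```python
import os

# Hypothetical placeholder URI; a real deployment sets this outside the program.
os.environ.setdefault(
    "MONGODB_ATLAS_CLUSTER_URI",
    "mongodb+srv://user:pass@cluster0.example.mongodb.net",
)

MONGODB_ATLAS_CLUSTER_URI = os.environ["MONGODB_ATLAS_CLUSTER_URI"]
DB_NAME = "langchain_db"
COLLECTION_NAME = "test"

# from_connection_string() takes the namespace as a single dotted string.
namespace = DB_NAME + "." + COLLECTION_NAME
print(namespace)  # langchain_db.test
```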
  2. python3-langchain-nvidia-ai-endpoints-0.3.8-1.lbn42.noarch.rpm
    Sep 13, 2025     ok
    NVIDIA NIM Microservices. The langchain-nvidia-ai-endpoints package contains LangChain integrations for chat models and embeddings powered by NVIDIA AI Foundation Models hosted on the NVIDIA API Catalog. NVIDIA AI Foundation models are community- and NVIDIA-built models, optimized to deliver the best performance on NVIDIA accelerated infrastructure. Using the API, you can query live endpoints on the NVIDIA API Catalog to get quick results from a DGX-hosted cloud compute environment. All models are source-accessible and can be deployed on your own compute cluster with NVIDIA NIM™ microservices, which are part of NVIDIA AI Enterprise. Models can be exported from NVIDIA's API catalog with NVIDIA NIM (included with the NVIDIA AI Enterprise license) and run on-premises, giving enterprises ownership of their customizations and full control of their IP and AI applications. NIM microservices are packaged as container images on a per-model/model-family basis.
  3. python3-langchain-ollama-0.3.2-1.lbn42.noarch.rpm
    Sep 14, 2025     ok
    langchain-ollama. This package contains the LangChain integration with Ollama. Installation: pip install -U langchain-ollama. You will also need to run the Ollama server locally; it is available from the Ollama website.
    Chat models: the ChatOllama class exposes chat models from Ollama.

```python
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3-groq-tool-use")
llm.invoke("Sing a ballad of LangChain.")
```

    Embeddings: the OllamaEmbeddings class exposes embeddings from Ollama.

```python
from langchain_ollama import OllamaEmbeddings

embeddings = OllamaEmbeddings(model="llama3")
embeddings.embed_query("What is the meaning of life?")
```

    LLMs: the OllamaLLM class exposes completion-style LLMs from Ollama.

```python
from langchain_ollama import OllamaLLM

llm = OllamaLLM(model="llama3")
llm.invoke("The meaning of life is")
```
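Since every class above talks to a locally running Ollama server, a quick reachability check can save confusing connection errors. A stdlib-only sketch, assuming Ollama's default local address of http://localhost:11434:

```python
import urllib.error
import urllib.request

# Ollama serves its HTTP API on localhost:11434 by default.
OLLAMA_BASE_URL = "http://localhost:11434"


def ollama_is_running(base_url: str = OLLAMA_BASE_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url, False otherwise."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Calling `ollama_is_running()` before constructing ChatOllama lets you fail fast with a clear message instead of a timeout mid-request.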
  4. python3-langchain-openai-0.3.16-1.lbn42.noarch.rpm
    Sep 14, 2025     ok
    langchain-openai. This package contains the LangChain integrations for OpenAI through their openai SDK. Installation and setup: install the LangChain partner package (pip install langchain-openai), then get an OpenAI API key and set it as the OPENAI_API_KEY environment variable.
    LLM:

```python
from langchain_openai import OpenAI
```

    If you are using a model hosted on Azure, use a different wrapper:

```python
from langchain_openai import AzureOpenAI
```

    For a more detailed walkthrough of the Azure wrapper, see the package documentation.
    Chat models:

```python
from langchain_openai import ChatOpenAI
from langchain_openai import AzureChatOpenAI
```

    Text embedding models:

```python
from langchain_openai import OpenAIEmbeddings
from langchain_openai import AzureOpenAIEmbeddings
```
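Because the wrappers above read OPENAI_API_KEY from the environment, a small guard that fails fast when the variable is unset is a common pattern. A minimal sketch using only the standard library (the key value below is a hypothetical placeholder):

```python
import os


def require_openai_key() -> str:
    """Return OPENAI_API_KEY, failing fast with a clear message if it is unset."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it before constructing ChatOpenAI."
        )
    return key


os.environ["OPENAI_API_KEY"] = "sk-example-not-a-real-key"  # hypothetical placeholder
key = require_openai_key()
```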
  5. python3-langchain-perplexity-0.1.1-1.lbn42.noarch.rpm
    Sep 14, 2025     ok
    langchain-perplexity
  6. python3-langchain-pinecone-0.2.2-1.lbn42.noarch.rpm
    Sep 15, 2025     ok
    langchain-pinecone. This package contains the LangChain integration with Pinecone. Installation: pip install -U langchain-pinecone. Configure credentials by setting the PINECONE_API_KEY and PINECONE_INDEX_NAME environment variables. Usage: the PineconeVectorStore class exposes the connection to the Pinecone vector store.

```python
from langchain_pinecone import PineconeVectorStore

embeddings = ...  # use a LangChain Embeddings class
vectorstore = PineconeVectorStore(embeddings=embeddings)
```
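Since the package expects both environment variables to be set before use, a small stdlib-only preflight check can report exactly which ones are missing. A sketch with hypothetical placeholder values:

```python
import os

REQUIRED_VARS = ("PINECONE_API_KEY", "PINECONE_INDEX_NAME")


def missing_pinecone_config() -> list:
    """Return the names of required Pinecone environment variables that are unset."""
    return [name for name in REQUIRED_VARS if not os.environ.get(name)]


# Hypothetical placeholder values; real credentials come from your Pinecone console.
os.environ["PINECONE_API_KEY"] = "pc-example-key"
os.environ["PINECONE_INDEX_NAME"] = "demo-index"
missing = missing_pinecone_config()
print(missing)  # []
```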
  7. python3-langchain-sambanova-0.1.0-1.lbn42.noarch.rpm
    Sep 13, 2025     ok
    langchain-sambanova