Modified items

All recently modified items, latest first.
RPMPackage python3-nbconvert-7.16.4-1.lbn36.noarch
nbconvert: Jupyter Notebook Conversion. The nbconvert tool, jupyter nbconvert, converts notebooks to various other formats via Jinja templates. It lets you convert an .ipynb notebook file into various static formats, including HTML, LaTeX, PDF, Reveal JS, Markdown (md), reStructuredText (rst), and executable script.
Usage: from the command line, use nbconvert to convert a Jupyter notebook (input) to a different format (output). The basic command structure is
$ jupyter nbconvert --to <output format> <input notebook>
where <output format> is the desired output format and <input notebook> is the filename of the Jupyter notebook.
Example: convert the Jupyter notebook file mynotebook.ipynb to HTML with
$ jupyter nbconvert --to html mynotebook.ipynb
This command creates an HTML output file named mynotebook.html.
Dev install: check whether pandoc is installed (pandoc --version); if needed, install it with sudo apt-get install pandoc or brew install pandoc.
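nbconvert can also be driven from Python through its exporter classes. A minimal sketch, assuming a local notebook named mynotebook.ipynb (the filename is illustrative):

# Hedged sketch: HTML export via nbconvert's Python API
import nbformat
from nbconvert import HTMLExporter

nb = nbformat.read("mynotebook.ipynb", as_version=4)   # load the notebook file
exporter = HTMLExporter()                              # same conversion as --to html
body, resources = exporter.from_notebook_node(nb)      # body is the rendered HTML string

with open("mynotebook.html", "w") as f:
    f.write(body)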
RPMPackage python3-mistralclient-4.5.0-1.lbn36.noarch
Python client for the Mistral REST API. Includes a Python library for the Mistral API and a command-line interface (CLI) library.
RPMPackage python3-mistralai-1.7.0-1.lbn36.noarch
Mistral Python Client for the Mistral AI API, covering the Chat Completion and Embeddings APIs.
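A minimal chat-completion sketch with the 1.x client; the model name and prompt are illustrative, and MISTRAL_API_KEY is assumed to be set in the environment:

# Hedged sketch: chat completion with the mistralai 1.x client
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat.complete(
    model="mistral-small-latest",    # illustrative model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)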
RPMPackage python3-litellm+proxy-1.69.0-1.lbn36.noarch
This is a metapackage bringing in proxy extras requires for python3-litellm. It makes sure the dependencies are installed.
RPMPackage python3-litellm-1.69.0-1.lbn36.noarch
🚅 LiteLLM: call all LLM APIs using the OpenAI format (Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq, etc.). LiteLLM manages:
translating inputs to the provider's completion, embedding, and image_generation endpoints;
consistent output, so text responses are always available at ['choices'][0]['message']['content'];
retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) via the Router;
budgets and rate limits per project, API key, and model.
It also ships the LiteLLM Proxy Server (LLM Gateway).
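The single completion() call shape works across providers. A minimal sketch, assuming the chosen provider's API key (e.g. OPENAI_API_KEY) is set in the environment and using illustrative model names:

# Hedged sketch: one call shape for any provider supported by LiteLLM
from litellm import completion

response = completion(
    model="gpt-4o-mini",   # illustrative; swap e.g. "groq/llama3-8b-8192" to change providers
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
# Text is available at the same path regardless of provider
print(response["choices"][0]["message"]["content"])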
RPMPackage python3-langwatch-0.1.16-1.lbn36.noarch
LangWatch Python SDK. Go to https://docs.langwatch.ai to get started.
Contributing: after changing code, verify that all integrations are working by running the examples integration test manually (you will need all env vars to be set up):
poetry run pytest tests/test_examples.py -p no:warnings -s -x
Or, to test only a specific example, run:
poetry run pytest tests/test_examples.py -p no:warnings -s -x -k <example_name>
RPMPackage python3-language-server-0.36.2-6.fc35.noarch
A Python implementation of the Language Server Protocol.
RPMPackage python3-langfuse-2.53.9-1.lbn36.noarch
Langfuse Python SDK
RPMPackage python3-langflow+postgresql-1.4.0-1.lbn36.noarch
This is a metapackage bringing in postgresql extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+nv-ingest-1.4.0-1.lbn36.noarch
This is a metapackage bringing in nv-ingest extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+local-1.4.0-1.lbn36.noarch
This is a metapackage bringing in local extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+deploy-1.4.0-1.lbn36.noarch
This is a metapackage bringing in deploy extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+couchbase-1.4.0-1.lbn36.noarch
This is a metapackage bringing in couchbase extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+clickhouse-connect-1.4.0-1.lbn36.noarch
This is a metapackage bringing in clickhouse-connect extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow+cassio-1.4.0-1.lbn36.noarch
This is a metapackage bringing in cassio extras requires for python3-langflow. It makes sure the dependencies are installed.
RPMPackage python3-langflow-1.4.0-1.lbn36.noarch
Langflow is a low-code app builder for RAG and multi-agent AI applications. It's Python-based and agnostic to any model, API, or database.
✨ Core features:
Python-based and agnostic to models, APIs, data sources, or databases.
Visual IDE for drag-and-drop building and testing of workflows.
Playground to immediately test and iterate workflows with step-by-step control.
Multi-agent orchestration and conversation management and retrieval.
Free cloud service to get started in minutes with no setup.
Publish as an API or export as a Python application.
Observability with LangSmith, LangFuse, or LangWatch integration.
Enterprise-grade security and scalability with the free DataStax Langflow cloud service.
Customize workflows or create flows entirely just using Python.
Ecosystem integrations as reusable components for any model, API, or database.
RPMPackage python3-langchainhub-0.1.15-1.lbn36.noarch
The LangChain Hub API client
RPMPackage python3-langchain-xai-0.2.3-1.lbn36.noarch
langchain-xai: this package contains the LangChain integrations for xAI through their APIs.
Installation and setup: install the LangChain partner package with pip install -U langchain-xai, get your xAI API key from the xAI Dashboard, and set it as an environment variable (XAI_API_KEY).
Chat completions: this package contains the ChatXAI class, which is the recommended way to interface with xAI chat models; a usage sketch follows below.
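A minimal sketch of ChatXAI usage; the model name is illustrative, and XAI_API_KEY is assumed to be set in the environment:

# Hedged sketch: chat with an xAI model through ChatXAI
from langchain_xai import ChatXAI

llm = ChatXAI(model="grok-beta")   # illustrative model name; reads XAI_API_KEY from the environment
message = llm.invoke("Summarize what xAI builds in one sentence.")
print(message.content)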
RPMPackage python3-langchain-unstructured-0.1.5-1.lbn36.noarch
langchain-unstructured: this package contains the LangChain integration with Unstructured.
Installation: pip install -U langchain-unstructured. You should also configure credentials by setting the following environment variable: export UNSTRUCTURED_API_KEY="your-api-key"
Loaders: partition and load files either via the Unstructured API (using the unstructured-client SDK) or locally using the unstructured library.
API: to partition via the Unstructured API, pip install unstructured-client, set partition_via_api=True, and define api_key. If you are running the Unstructured API locally, you can change the API URL by defining url when you initialize the loader. The hosted Unstructured API requires an API key; see the Unstructured documentation to learn more about the API offerings and get an API key.
Local: by default the file loader uses the Unstructured partition function and will automatically detect the file type. In addition to document-specific partition parameters, Unstructured offers a rich set of chunking options.
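A minimal sketch of the API-backed loader; the file path is hypothetical, and UNSTRUCTURED_API_KEY is assumed to be set in the environment:

# Hedged sketch: partition a file via the hosted Unstructured API
import os
from langchain_unstructured import UnstructuredLoader

loader = UnstructuredLoader(
    file_path="example.pdf",                        # hypothetical input file
    api_key=os.environ["UNSTRUCTURED_API_KEY"],
    partition_via_api=True,                         # use the API instead of local partitioning
)
docs = loader.load()                                # one Document per partitioned element
print(docs[0].page_content)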
RPMPackage python3-langchain-together-0.3.0-1.lbn36.noarch
langchain-together: this package contains the LangChain integrations for Together AI through their APIs.
Installation and setup: install the LangChain partner package with pip install -U langchain-together, get your Together AI API key from the Together Dashboard, and set it as an environment variable (TOGETHER_API_KEY).
Chat completions: this package contains the ChatTogether class, which is the recommended way to interface with Together AI chat models.
Embeddings: togethercomputer/m2-bert-80M-8k-retrieval is used as the default model for embeddings. A usage sketch covering both follows below.
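A minimal sketch of chat and embeddings usage; the chat model name is illustrative, and TOGETHER_API_KEY is assumed to be set in the environment:

# Hedged sketch: chat and embeddings with the Together AI integration
from langchain_together import ChatTogether, TogetherEmbeddings

llm = ChatTogether(model="meta-llama/Llama-3-8b-chat-hf")     # illustrative chat model
print(llm.invoke("Name one use case for embeddings.").content)

embeddings = TogetherEmbeddings(model="togethercomputer/m2-bert-80M-8k-retrieval")
vector = embeddings.embed_query("hello world")                # list of floats
print(len(vector))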