-
python3-langflow+couchbase-1.4.0-1.lbn36.noarch
This is a metapackage bringing in the couchbase extras requirements for python3-langflow.
It ensures the dependencies are installed.
-
python3-langflow+deploy-1.4.0-1.lbn36.noarch
This is a metapackage bringing in the deploy extras requirements for python3-langflow.
It ensures the dependencies are installed.
-
python3-langflow+local-1.4.0-1.lbn36.noarch
This is a metapackage bringing in the local extras requirements for python3-langflow.
It ensures the dependencies are installed.
-
python3-langflow+nv-ingest-1.4.0-1.lbn36.noarch
This is a metapackage bringing in the nv-ingest extras requirements for python3-langflow.
It ensures the dependencies are installed.
-
python3-langflow+postgresql-1.4.0-1.lbn36.noarch
This is a metapackage bringing in the postgresql extras requirements for python3-langflow.
It ensures the dependencies are installed.
-
python3-langfuse-2.53.9-1.lbn36.noarch
Langfuse Python SDK
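A rough sketch of typical 2.x SDK usage, for orientation only (the trace and generation names below are made-up placeholders, and the client is assumed to pick up its credentials from the environment):
from langfuse import Langfuse

# The client reads LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY and LANGFUSE_HOST
# from the environment; they can also be passed as keyword arguments.
langfuse = Langfuse()

# Record a trace with a single generation attached (placeholder names/values).
trace = langfuse.trace(name="example-trace")
trace.generation(
    name="example-generation",
    model="gpt-3.5-turbo",
    input="What is Langfuse?",
    output="An open-source LLM observability platform.",
)

# Events are sent asynchronously; flush before the process exits.
langfuse.flush()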
-
python3-language-server-0.36.2-6.fc35.noarch
A Python implementation of the Language Server Protocol.
-
python3-langwatch-0.1.16-1.lbn36.noarch
LangWatch Python SDK
Go to https://docs.langwatch.ai to get started.
Contributing
After changing code, verify that all integrations still work by running the examples integration test manually (you will need all required env vars set up):
poetry run pytest tests/test_examples.py -p no:warnings -s -x
Or to test only a specific example, run:
poetry run pytest tests/test_examples.py -p no:warnings -s -x -k <example_name>
-
python3-litellm-1.69.0-1.lbn36.noarch
🚅 LiteLLM
Call all LLM APIs using the OpenAI format [Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq etc.]
LiteLLM manages:
- Translating inputs to the provider's completion, embedding, and image_generation endpoints
- Consistent output: text responses are always available at ['choices'][0]['message']['content']
- Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - Router
- Budgets and rate limits per project, API key, and model - LiteLLM Proxy Server (LLM Gateway)
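A minimal sketch of the unified interface described above (the model name is a placeholder, and the matching provider API key is assumed to be exported in the environment):
from litellm import completion

# One call shape for every provider; switching providers only changes the model string.
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, world"}],
)

# Regardless of provider, the text lands in the same place.
print(response["choices"][0]["message"]["content"])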
-
python3-litellm+proxy-1.69.0-1.lbn36.noarch
This is a metapackage bringing in the proxy extras requirements for python3-litellm.
It ensures the dependencies are installed.
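With the proxy extras installed, a LiteLLM proxy exposes an OpenAI-compatible endpoint, so a standard OpenAI client can be pointed at it. A minimal sketch, assuming a proxy is already running locally on port 4000 (base_url, port, and the api_key value are placeholders for this sketch):
from openai import OpenAI

# Point the standard OpenAI client at a locally running LiteLLM proxy.
client = OpenAI(api_key="anything", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello via the proxy"}],
)
print(response.choices[0].message.content)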