-
python3-inkex-1.4.1-1.lbn36.noarch
This package supports Inkscape extensions.
It provides:
- a simplification layer for SVG manipulation through lxml
- base classes for common types of Inkscape extensions
- simplified testing of those extensions
- a user interface library based on GTK3
At their core, Inkscape extensions take in a file and output a file.
- For effect extensions, those two files are SVG files.
- For input extensions, the input file may be any arbitrary
file and the output is an SVG.
- For output extensions, the input is an SVG file while the
output is an arbitrary file.
- Some extensions (e.g. the extensions manager) don't manipulate files.
The package also contains the stock Inkscape extensions, i.e. the scripts
that implement commands you can use from within Inkscape. Most of these
commands appear in the Extensions menu or in the Open / Save dialogs.
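As a rough sketch of how the base classes fit together (the class name, option, and behaviour below are illustrative, not one of the stock extensions), an effect extension subclasses inkex.EffectExtension, declares its options, and edits the parsed SVG in effect():

import inkex

class FadeSelection(inkex.EffectExtension):
    """Hypothetical effect: set the opacity of every selected element."""

    def add_arguments(self, pars):
        # Options declared here correspond to parameters in the extension's .inx file.
        pars.add_argument("--opacity", type=float, default=0.5)

    def effect(self):
        # self.svg is the parsed document; inkex elements wrap lxml elements,
        # so set() writes an attribute directly.
        for elem in self.svg.selection:
            elem.set("opacity", str(self.options.opacity))

if __name__ == "__main__":
    FadeSelection().run()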
-
python3-joblib-1.4.2-5.lbn36.noarch
Joblib is a set of tools to provide lightweight pipelining in Python.
In particular, joblib offers:
* transparent disk-caching of the output values and lazy
re-evaluation (memoize pattern)
* easy, simple parallel computing
* logging and tracing of the execution
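A minimal sketch of the first two features (the cache directory and function are illustrative): results are memoized to disk with Memory and the calls are fanned out with Parallel and delayed.

from joblib import Memory, Parallel, delayed

# Cache results under ./joblib_cache; repeat calls with the same argument
# are read back from disk instead of being recomputed.
memory = Memory("./joblib_cache", verbose=0)

@memory.cache
def slow_square(x):
    return x * x

# Evaluate the cached function over ten inputs using two worker processes.
results = Parallel(n_jobs=2)(delayed(slow_square)(i) for i in range(10))
print(results)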
-
python3-langchain+anthropic-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the anthropic extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+azure-ai-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the azure-ai extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+cohere-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the cohere extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+community-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the community extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+fireworks-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the fireworks extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+google-vertexai-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the google-vertexai extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain+openai-0.3.25-1.lbn36.noarch
This is a metapackage bringing in the openai extras requirements for
python3-langchain. It ensures the corresponding dependencies are installed.
-
python3-langchain-anthropic-0.3.12-1.lbn36.noarch
langchain-anthropic
This package contains the LangChain integration for Anthropic's generative models.
Installation
pip install -U langchain-anthropic
Chat Models
Anthropic recommends using their chat models over text completions; see
Anthropic's documentation for the currently recommended models.
To use this integration, you should have an Anthropic API key configured. Initialize the model as:
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import AIMessage, HumanMessage

model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)

# Define the input message
message = HumanMessage(content="What is the capital of France?")

# Generate a response using the model
response = model.invoke([message])
For a more detailed walkthrough, see the LangChain chat model documentation.
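As a brief follow-on sketch (not part of the upstream README; the system prompt below is an illustrative assumption), invoke() returns an AIMessage whose text is available via .content, and a system message can be prepended to steer the reply:

from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage, SystemMessage

model = ChatAnthropic(model="claude-3-opus-20240229", temperature=0, max_tokens=1024)
messages = [
    SystemMessage(content="Answer in one short sentence."),
    HumanMessage(content="What is the capital of France?"),
]
# invoke() returns an AIMessage; its text is in the .content attribute.
response = model.invoke(messages)
print(response.content)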
LLMs (Legacy)
You can use the Claude 2 models for text completions.
from langchain_anthropic import AnthropicLLM
model = AnthropicLLM(model="claude-2.1", temperature=0, max_tokens=1024)
response = model.invoke("The best restaurant in San Francisc