
python3-tokenizers-0.15.1-1.lbn36.x86_64

Package Attributes
RPM  python3-tokenizers-0.15.1-1.lbn36.x86_64.rpm
Architecture  x86_64
Size  6334532 bytes
Created  2025/05/13 02:08:08 UTC
Package Specification
Summary Implementation of today's most used tokenizers, with a focus on performance and versatility
Group Unspecified
License APL
Home Page https://pypi.org/project/tokenizers
Description

Tokenizers provides an implementation of today's most used tokenizers, with a focus on performance and versatility. This package contains the Python bindings over the Rust implementation. If you are interested in the high-level design, see the upstream documentation. Otherwise, let's dive in! Main features:

Train new vocabularies and tokenize, using 4 pre-made tokenizers (BERT WordPiece and the 3 most common BPE versions).
Extremely fast (both training and tokenization), thanks to the Rust implementation: tokenizing a GB of text takes less than 20 seconds on a server's CPU.
Easy to use, but also extremely versatile.
Designed for research and production.
Normalization comes with alignment tracking, so it is always possible to recover the part of the original sentence that corresponds to a given token.
Does all the pre-processing: truncate, pad, and add the special tokens your model needs.
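The sketch below illustrates these features with the Python API shipped in this package: it trains a small BPE vocabulary, enables truncation and padding, and encodes a sentence with alignment offsets. The corpus file name and the chosen special tokens are placeholders, not part of this package.

```python
# Minimal sketch, assuming a plain-text corpus file "corpus.txt" (placeholder)
# and that this package is importable as the `tokenizers` module.
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Train a new BPE vocabulary from raw text files.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]"])
tokenizer.train(files=["corpus.txt"], trainer=trainer)

# Pre-processing: truncate long inputs and pad with the chosen special token.
tokenizer.enable_truncation(max_length=128)
tokenizer.enable_padding(pad_token="[PAD]", pad_id=tokenizer.token_to_id("[PAD]"))

# Encode a sentence; offsets map each token back to the original text
# (the alignment tracking mentioned above).
encoding = tokenizer.encode("Tokenizers are fast and versatile.")
print(encoding.tokens)   # produced tokens, including any special tokens
print(encoding.ids)      # vocabulary ids fed to the model
print(encoding.offsets)  # (start, end) character spans in the input string
```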

Requires
rpmlib(PayloadFilesHavePrefix)  
rpmlib(PayloadIsZstd)  
(python3.10dist(huggingface-hub) < 1~~ with python3.10dist(huggingface-hub) >= 0.16.4)  
rpmlib(CompressedFileNames)  
rpmlib(RichDependencies)  
rpmlib(PartialHardlinkSets)  
rpmlib(TildeInVersions)  
rpmlib(FileDigests)  
Provides
python-tokenizers
python3-tokenizers
python3-tokenizers(x86-64)
python3.10-tokenizers
python3.10dist(tokenizers)
python3dist(tokenizers)
Obsoletes
python-tokenizers
