tokenizers

cran · v0.3.0

Fast, Consistent Tokenization of Natural Language Text. Convert natural language text into tokens. Includes tokenizers for shingled n-grams, skip n-grams, words, word stems, sentences, paragraphs, characters, shingled characters, lines, Penn Treebank, and regular expressions, as well as functions for counting characters, words, and sentences, and a function for splitting longer texts into separate documents, each with the same number of words.
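Two of the less common schemes listed above can be illustrated in a few lines. This is a plain-Python sketch of the underlying idea, not the package's own implementation (the package itself is R): a shingled n-gram keeps every contiguous n-word window, while a k-skip-n-gram also admits subsequences that skip up to k words between picks.

```python
from itertools import combinations

def shingled_ngrams(words, n):
    """Every contiguous n-word window ('shingle') in order."""
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def skip_ngrams(words, n, k):
    """All n-word subsequences where consecutive picks are at most
    k words apart (k = 0 reduces to plain shingled n-grams)."""
    grams = []
    for idx in combinations(range(len(words)), n):
        if all(b - a <= k + 1 for a, b in zip(idx, idx[1:])):
            grams.append(" ".join(words[i] for i in idx))
    return grams

words = "the quick brown fox".split()
print(shingled_ngrams(words, 2))  # ['the quick', 'quick brown', 'brown fox']
print(skip_ngrams(words, 2, 1))   # adds 'the brown' and 'quick fox'
```

With k = 1, each bigram may jump over one intervening word, which is why the skip variant yields a superset of the shingled bigrams.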

License: MIT + file LICENSE · 0 versions · 1 maintainer · 3 deps · 11,480 weekly dl
ropensci/tokenizers
Health: 46/100 · safe to use

tokenizers@0.3.0 is safe to use (health: 46/100)

Health breakdown (0–100):
- maintenance: 0/25
- popularity: 10/20
- security: 25/25
- maturity: 9/15
- community: 2/15
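The five category scores appear to sum to the headline number. Assuming that is how DepScope combines them (an inference from the figures shown here, not documented behavior), the arithmetic checks out:

```python
# (earned, maximum) per category, as shown in the breakdown above
breakdown = {
    "maintenance": (0, 25),
    "popularity": (10, 20),
    "security": (25, 25),
    "maturity": (9, 15),
    "community": (2, 15),
}

score = sum(earned for earned, _ in breakdown.values())
ceiling = sum(cap for _, cap in breakdown.values())
print(f"{score}/{ceiling}")  # 46/100, matching the headline score
```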
Vulnerabilities: 0 (none known)

Dependencies (3)
API access

Get this data programmatically — free, no authentication.

curl https://depscope.dev/api/check/cran/tokenizers
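The same endpoint can be called from Python. A minimal sketch of the curl call above; the `check_url` and `check` helpers are illustrative names, and the shape of the JSON response is not documented here, so it is returned untouched for the caller to inspect:

```python
import json
import urllib.request

# Endpoint taken from the curl example above.
API_BASE = "https://depscope.dev/api/check"

def check_url(ecosystem: str, package: str) -> str:
    """Build the check URL for a package, mirroring the curl call."""
    return f"{API_BASE}/{ecosystem}/{package}"

def check(ecosystem: str, package: str) -> dict:
    """Fetch the health report; no authentication is required per the
    text above. The response JSON is parsed and returned as-is."""
    with urllib.request.urlopen(check_url(ecosystem, package)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(check("cran", "tokenizers"))
```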

First published · 2022-12-22T10:32:39

Last updated · 2022-12-22T07:50:02+00:00
