Shamanth Shetty for Kern AI

Tokenization of text with spaCy tokenizer

In the latest version of our software, your text is now automatically tokenized with the spaCy tokenizer of your choice, saving time and energy with added features built for our users. This not only helps you build better labelling functions via pre-integrated metadata, but also lets you easily label data manually. Kern sparks innovation with an intuitive navigation system and features that help you manage labelling tasks rapidly in-house. This drastically reduces the time, money, and support needed to deliver high-quality AI solutions in a hassle-free way.
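For context, here is a minimal sketch of what spaCy tokenization looks like on its own. The pipeline name "en_core_web_sm" is just an assumption for this example; inside the app, the tokenizer is whichever spaCy pipeline you configure for your project.

```python
import spacy

# Load a spaCy pipeline; "en_core_web_sm" is only an example here --
# any installed spaCy model of your choice works the same way.
nlp = spacy.load("en_core_web_sm")

doc = nlp("Tokenization of text with the spaCy tokenizer saves time and energy.")

# Each token carries metadata (lemma, part-of-speech tag, etc.)
# that can feed into labelling functions.
for token in doc:
    print(token.text, token.lemma_, token.pos_)
```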

Subscribe to our newsletter 👉🏼 https://www.kern.ai/pages/open-source and stay up to date with the release so you don't miss out on the chance to win a GeForce RTX 3090 Ti for our launch 😉