
mltype

Command line tool for improving typing speed and accuracy. The main goal is to help programmers practise programming languages.


Installation

Python environment

pip install --upgrade mltype
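If you prefer to keep things isolated, installing into a virtual environment works as usual (a minimal sketch; any environment manager will do):

$ python -m venv venv
$ source venv/bin/activate
$ pip install --upgrade mltype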

Docker

Make sure that Docker and Docker Compose are installed.

docker-compose run --rm mltype

You will get a shell in a running container and the mlt command should be available.
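For example, you can confirm this from inside the container (mlt ls is one of the documented subcommands; it simply lists installed language models):

$ mlt --help
$ mlt ls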

See the documentation for more information.

Main features

Text generation

  • Using neural networks to generate text. One can use pretrained networks (see below) or train new ones from scratch.
  • Alternatively, one can read text from a file or provide it manually.

Typing interface

  • Dead simple (implemented in curses)
  • Basic statistics - WPM and accuracy
  • Setting target speed
  • Playing against past performances

Documentation and usage

The entrypoint is mlt. To get information on how to use the subcommands, use the --help flag (e.g. mlt file --help).

$ mlt
Usage: mlt [OPTIONS] COMMAND [ARGS]...

  Tool for improving typing speed and accuracy

Options:
  --help  Show this message and exit.

Commands:
  file    Type text from a file
  ls      List all language models
  random  Sample characters randomly from a vocabulary
  raw     Provide text manually
  replay  Compete against a past performance
  sample  Sample text from a language
  train   Train a language
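A typical session might chain these subcommands together. The sketch below uses placeholder arguments; the exact argument forms and the options for saving and replaying runs are listed under each subcommand's --help:

$ mlt raw "hello world"      # practise on text provided manually
$ mlt file my_script.py      # practise on text taken from a file
$ mlt replay my_performance  # compete against a past performance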

Pretrained models

See below for a list of pretrained models. They are stored on Google Drive and one needs to download the entire archive.

Name                  Trained on                                               Download
C++                   https://github.com/TheAlgorithms/C-Plus-Plus             link
C#                    https://github.com/TheAlgorithms/C-Sharp                 link
CPython               https://github.com/python/cpython/tree/master/Python     link
Crime and Punishment  http://www.gutenberg.org/ebooks/2554                     link
Dracula               http://www.gutenberg.org/ebooks/345                      link
Elixir                https://github.com/phoenixframework/phoenix              link
Go                    https://github.com/TheAlgorithms/Go                      link
Haskell               https://github.com/jgm/pandoc                            link
Java                  https://github.com/TheAlgorithms/Java                    link
JavaScript            https://github.com/trekhleb/javascript-algorithms        link
Kotlin                https://github.com/square/leakcanary                     link
Lua                   https://github.com/nmap/nmap                             link
Perl                  https://github.com/mojolicious/mojo                      link
PHP                   https://github.com/symfony/symfony                       link
Python                https://github.com/TheAlgorithms/Python                  link
R                     https://github.com/tidyverse/ggplot2                     link
Ruby                  https://github.com/jekyll/jekyll                         link
Rust                  https://github.com/rust-lang/rust/tree/master/compiler   link
Scala                 https://github.com/apache/spark/tree/master/mllib        link
Scikit-learn          https://github.com/scikit-learn/scikit-learn             link
Swift                 https://github.com/raywenderlich/swift-algorithm-club    link

Once you download the file, place it in ~/.mltype/languages; if the folder does not exist, you will have to create it. You can rename the file to anything you like, and that name is then used to refer to the model.
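On a Unix-like system this amounts to something like the following (the downloaded file name here is just an example):

$ mkdir -p ~/.mltype/languages
$ mv ~/Downloads/python ~/.mltype/languages/my_new_model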

To verify that the model was downloaded successfully, try to sample from it. Note that this might take 20+ seconds the first time around.

mlt sample my_new_model

Feel free to create an issue if you want me to train a model for you. You can also do it yourself easily by reading the documentation (mlt train) and getting a GPU on Google Colab (click the badge below for a ready-to-use notebook).

Open In Colab
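If you train locally instead, the invocation is roughly of the following shape. This is a sketch only: the corpus file and model name are placeholders, the argument order is assumed, and the actual arguments and options (network size, number of epochs, etc.) are documented under mlt train --help.

$ mlt train my_corpus.txt my_new_model  # corpus path and model name are placeholders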

Credits

This project is very much motivated by The Unreasonable Effectiveness of Recurrent Neural Networks by Andrej Karpathy.
