Commit 1cf1f819 authored by Piotr Przybyła
Update README.md

parent 4b260cd6
## Extending LAMBO
You don't have to rely on the models trained so far in COMBO. You can use the included code to train on new corpora and languages, tune to specific use cases, or simply retrain larger models with more resources. The scripts in [`examples`](src/lambo/examples) demonstrate how to do that:
- `run_training.py` -- train simple LAMBO models. This script was used with [UD treebanks](https://universaldependencies.org/#language-) to generate `LAMBO_no_pretraining` models.
- `run_pretraining.py` -- pretrain unsupervised LAMBO models. This script was used with [OSCAR](https://oscar-corpus.com/).
- `run_training_pretrained.py` -- train LAMBO models on UD training data, starting from pretrained models. This script was used to generate `LAMBO` models.
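The example scripts above can be launched directly from a repository checkout. The sketch below wraps such an invocation in Python; the script paths come from the list above, but any command-line arguments (corpus paths, output directories) are assumptions omitted here and would need to be filled in according to the scripts themselves.

```python
import subprocess
import sys

# The three training example scripts listed in this README.
SCRIPTS = [
    "src/lambo/examples/run_training.py",
    "src/lambo/examples/run_pretraining.py",
    "src/lambo/examples/run_training_pretrained.py",
]

def run_example(script: str) -> int:
    """Run one example script with the current Python interpreter.

    Hypothetical helper: assumes it is executed from the repository root
    with LAMBO's dependencies installed. Returns the script's exit code.
    """
    return subprocess.call([sys.executable, script])
```

A typical session would call, for instance, `run_example(SCRIPTS[0])` after editing the corpus paths inside `run_training.py`.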