From ac65eb6950c5e440886ff21aec18e3fc1afb6bbd Mon Sep 17 00:00:00 2001
From: alina-wroblewska <45227016+alina-wroblewska@users.noreply.github.com>
Date: Wed, 28 Oct 2020 09:23:02 +0100
Subject: [PATCH 01/12] Create LICENSE

---
 LICENSE | 674 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 674 insertions(+)
 create mode 100644 LICENSE

diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..f288702
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,674 @@
+                    GNU GENERAL PUBLIC LICENSE
+                       Version 3, 29 June 2007
+
+ Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+                            Preamble
+
+  The GNU General Public License is a free, copyleft license for
+software and other kinds of works.
+
+  The licenses for most software and other practical works are designed
+to take away your freedom to share and change the works.  By contrast,
+the GNU General Public License is intended to guarantee your freedom to
+share and change all versions of a program--to make sure it remains free
+software for all its users.  We, the Free Software Foundation, use the
+GNU General Public License for most of our software; it applies also to
+any other work released this way by its authors.  You can apply it to
+your programs, too.
+
+  When we speak of free software, we are referring to freedom, not
+price.  Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+them if you wish), that you receive source code or can get it if you
+want it, that you can change the software or use pieces of it in new
+free programs, and that you know you can do these things.
+
+  To protect your rights, we need to prevent others from denying you
+these rights or asking you to surrender the rights.  Therefore, you have
+certain responsibilities if you distribute copies of the software, or if
+you modify it: responsibilities to respect the freedom of others.
+
+  For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must pass on to the recipients the same
+freedoms that you received.  You must make sure that they, too, receive
+or can get the source code.  And you must show them these terms so they
+know their rights.
+
+  Developers that use the GNU GPL protect your rights with two steps:
+(1) assert copyright on the software, and (2) offer you this License
+giving you legal permission to copy, distribute and/or modify it.
+
+  For the developers' and authors' protection, the GPL clearly explains
+that there is no warranty for this free software.  For both users' and
+authors' sake, the GPL requires that modified versions be marked as
+changed, so that their problems will not be attributed erroneously to
+authors of previous versions.
+
+  Some devices are designed to deny users access to install or run
+modified versions of the software inside them, although the manufacturer
+can do so.  This is fundamentally incompatible with the aim of
+protecting users' freedom to change the software.  The systematic
+pattern of such abuse occurs in the area of products for individuals to
+use, which is precisely where it is most unacceptable.  Therefore, we
+have designed this version of the GPL to prohibit the practice for those
+products.  If such problems arise substantially in other domains, we
+stand ready to extend this provision to those domains in future versions
+of the GPL, as needed to protect the freedom of users.
+
+  Finally, every program is threatened constantly by software patents.
+States should not allow patents to restrict development and use of
+software on general-purpose computers, but in those that do, we wish to
+avoid the special danger that patents applied to a free program could
+make it effectively proprietary.  To prevent this, the GPL assures that
+patents cannot be used to render the program non-free.
+
+  The precise terms and conditions for copying, distribution and
+modification follow.
+
+                       TERMS AND CONDITIONS
+
+  0. Definitions.
+
+  "This License" refers to version 3 of the GNU General Public License.
+
+  "Copyright" also means copyright-like laws that apply to other kinds of
+works, such as semiconductor masks.
+
+  "The Program" refers to any copyrightable work licensed under this
+License.  Each licensee is addressed as "you".  "Licensees" and
+"recipients" may be individuals or organizations.
+
+  To "modify" a work means to copy from or adapt all or part of the work
+in a fashion requiring copyright permission, other than the making of an
+exact copy.  The resulting work is called a "modified version" of the
+earlier work or a work "based on" the earlier work.
+
+  A "covered work" means either the unmodified Program or a work based
+on the Program.
+
+  To "propagate" a work means to do anything with it that, without
+permission, would make you directly or secondarily liable for
+infringement under applicable copyright law, except executing it on a
+computer or modifying a private copy.  Propagation includes copying,
+distribution (with or without modification), making available to the
+public, and in some countries other activities as well.
+
+  To "convey" a work means any kind of propagation that enables other
+parties to make or receive copies.  Mere interaction with a user through
+a computer network, with no transfer of a copy, is not conveying.
+
+  An interactive user interface displays "Appropriate Legal Notices"
+to the extent that it includes a convenient and prominently visible
+feature that (1) displays an appropriate copyright notice, and (2)
+tells the user that there is no warranty for the work (except to the
+extent that warranties are provided), that licensees may convey the
+work under this License, and how to view a copy of this License.  If
+the interface presents a list of user commands or options, such as a
+menu, a prominent item in the list meets this criterion.
+
+  1. Source Code.
+
+  The "source code" for a work means the preferred form of the work
+for making modifications to it.  "Object code" means any non-source
+form of a work.
+
+  A "Standard Interface" means an interface that either is an official
+standard defined by a recognized standards body, or, in the case of
+interfaces specified for a particular programming language, one that
+is widely used among developers working in that language.
+
+  The "System Libraries" of an executable work include anything, other
+than the work as a whole, that (a) is included in the normal form of
+packaging a Major Component, but which is not part of that Major
+Component, and (b) serves only to enable use of the work with that
+Major Component, or to implement a Standard Interface for which an
+implementation is available to the public in source code form.  A
+"Major Component", in this context, means a major essential component
+(kernel, window system, and so on) of the specific operating system
+(if any) on which the executable work runs, or a compiler used to
+produce the work, or an object code interpreter used to run it.
+
+  The "Corresponding Source" for a work in object code form means all
+the source code needed to generate, install, and (for an executable
+work) run the object code and to modify the work, including scripts to
+control those activities.  However, it does not include the work's
+System Libraries, or general-purpose tools or generally available free
+programs which are used unmodified in performing those activities but
+which are not part of the work.  For example, Corresponding Source
+includes interface definition files associated with source files for
+the work, and the source code for shared libraries and dynamically
+linked subprograms that the work is specifically designed to require,
+such as by intimate data communication or control flow between those
+subprograms and other parts of the work.
+
+  The Corresponding Source need not include anything that users
+can regenerate automatically from other parts of the Corresponding
+Source.
+
+  The Corresponding Source for a work in source code form is that
+same work.
+
+  2. Basic Permissions.
+
+  All rights granted under this License are granted for the term of
+copyright on the Program, and are irrevocable provided the stated
+conditions are met.  This License explicitly affirms your unlimited
+permission to run the unmodified Program.  The output from running a
+covered work is covered by this License only if the output, given its
+content, constitutes a covered work.  This License acknowledges your
+rights of fair use or other equivalent, as provided by copyright law.
+
+  You may make, run and propagate covered works that you do not
+convey, without conditions so long as your license otherwise remains
+in force.  You may convey covered works to others for the sole purpose
+of having them make modifications exclusively for you, or provide you
+with facilities for running those works, provided that you comply with
+the terms of this License in conveying all material for which you do
+not control copyright.  Those thus making or running the covered works
+for you must do so exclusively on your behalf, under your direction
+and control, on terms that prohibit them from making any copies of
+your copyrighted material outside their relationship with you.
+
+  Conveying under any other circumstances is permitted solely under
+the conditions stated below.  Sublicensing is not allowed; section 10
+makes it unnecessary.
+
+  3. Protecting Users' Legal Rights From Anti-Circumvention Law.
+
+  No covered work shall be deemed part of an effective technological
+measure under any applicable law fulfilling obligations under article
+11 of the WIPO copyright treaty adopted on 20 December 1996, or
+similar laws prohibiting or restricting circumvention of such
+measures.
+
+  When you convey a covered work, you waive any legal power to forbid
+circumvention of technological measures to the extent such circumvention
+is effected by exercising rights under this License with respect to
+the covered work, and you disclaim any intention to limit operation or
+modification of the work as a means of enforcing, against the work's
+users, your or third parties' legal rights to forbid circumvention of
+technological measures.
+
+  4. Conveying Verbatim Copies.
+
+  You may convey verbatim copies of the Program's source code as you
+receive it, in any medium, provided that you conspicuously and
+appropriately publish on each copy an appropriate copyright notice;
+keep intact all notices stating that this License and any
+non-permissive terms added in accord with section 7 apply to the code;
+keep intact all notices of the absence of any warranty; and give all
+recipients a copy of this License along with the Program.
+
+  You may charge any price or no price for each copy that you convey,
+and you may offer support or warranty protection for a fee.
+
+  5. Conveying Modified Source Versions.
+
+  You may convey a work based on the Program, or the modifications to
+produce it from the Program, in the form of source code under the
+terms of section 4, provided that you also meet all of these conditions:
+
+    a) The work must carry prominent notices stating that you modified
+    it, and giving a relevant date.
+
+    b) The work must carry prominent notices stating that it is
+    released under this License and any conditions added under section
+    7.  This requirement modifies the requirement in section 4 to
+    "keep intact all notices".
+
+    c) You must license the entire work, as a whole, under this
+    License to anyone who comes into possession of a copy.  This
+    License will therefore apply, along with any applicable section 7
+    additional terms, to the whole of the work, and all its parts,
+    regardless of how they are packaged.  This License gives no
+    permission to license the work in any other way, but it does not
+    invalidate such permission if you have separately received it.
+
+    d) If the work has interactive user interfaces, each must display
+    Appropriate Legal Notices; however, if the Program has interactive
+    interfaces that do not display Appropriate Legal Notices, your
+    work need not make them do so.
+
+  A compilation of a covered work with other separate and independent
+works, which are not by their nature extensions of the covered work,
+and which are not combined with it such as to form a larger program,
+in or on a volume of a storage or distribution medium, is called an
+"aggregate" if the compilation and its resulting copyright are not
+used to limit the access or legal rights of the compilation's users
+beyond what the individual works permit.  Inclusion of a covered work
+in an aggregate does not cause this License to apply to the other
+parts of the aggregate.
+
+  6. Conveying Non-Source Forms.
+
+  You may convey a covered work in object code form under the terms
+of sections 4 and 5, provided that you also convey the
+machine-readable Corresponding Source under the terms of this License,
+in one of these ways:
+
+    a) Convey the object code in, or embodied in, a physical product
+    (including a physical distribution medium), accompanied by the
+    Corresponding Source fixed on a durable physical medium
+    customarily used for software interchange.
+
+    b) Convey the object code in, or embodied in, a physical product
+    (including a physical distribution medium), accompanied by a
+    written offer, valid for at least three years and valid for as
+    long as you offer spare parts or customer support for that product
+    model, to give anyone who possesses the object code either (1) a
+    copy of the Corresponding Source for all the software in the
+    product that is covered by this License, on a durable physical
+    medium customarily used for software interchange, for a price no
+    more than your reasonable cost of physically performing this
+    conveying of source, or (2) access to copy the
+    Corresponding Source from a network server at no charge.
+
+    c) Convey individual copies of the object code with a copy of the
+    written offer to provide the Corresponding Source.  This
+    alternative is allowed only occasionally and noncommercially, and
+    only if you received the object code with such an offer, in accord
+    with subsection 6b.
+
+    d) Convey the object code by offering access from a designated
+    place (gratis or for a charge), and offer equivalent access to the
+    Corresponding Source in the same way through the same place at no
+    further charge.  You need not require recipients to copy the
+    Corresponding Source along with the object code.  If the place to
+    copy the object code is a network server, the Corresponding Source
+    may be on a different server (operated by you or a third party)
+    that supports equivalent copying facilities, provided you maintain
+    clear directions next to the object code saying where to find the
+    Corresponding Source.  Regardless of what server hosts the
+    Corresponding Source, you remain obligated to ensure that it is
+    available for as long as needed to satisfy these requirements.
+
+    e) Convey the object code using peer-to-peer transmission, provided
+    you inform other peers where the object code and Corresponding
+    Source of the work are being offered to the general public at no
+    charge under subsection 6d.
+
+  A separable portion of the object code, whose source code is excluded
+from the Corresponding Source as a System Library, need not be
+included in conveying the object code work.
+
+  A "User Product" is either (1) a "consumer product", which means any
+tangible personal property which is normally used for personal, family,
+or household purposes, or (2) anything designed or sold for incorporation
+into a dwelling.  In determining whether a product is a consumer product,
+doubtful cases shall be resolved in favor of coverage.  For a particular
+product received by a particular user, "normally used" refers to a
+typical or common use of that class of product, regardless of the status
+of the particular user or of the way in which the particular user
+actually uses, or expects or is expected to use, the product.  A product
+is a consumer product regardless of whether the product has substantial
+commercial, industrial or non-consumer uses, unless such uses represent
+the only significant mode of use of the product.
+
+  "Installation Information" for a User Product means any methods,
+procedures, authorization keys, or other information required to install
+and execute modified versions of a covered work in that User Product from
+a modified version of its Corresponding Source.  The information must
+suffice to ensure that the continued functioning of the modified object
+code is in no case prevented or interfered with solely because
+modification has been made.
+
+  If you convey an object code work under this section in, or with, or
+specifically for use in, a User Product, and the conveying occurs as
+part of a transaction in which the right of possession and use of the
+User Product is transferred to the recipient in perpetuity or for a
+fixed term (regardless of how the transaction is characterized), the
+Corresponding Source conveyed under this section must be accompanied
+by the Installation Information.  But this requirement does not apply
+if neither you nor any third party retains the ability to install
+modified object code on the User Product (for example, the work has
+been installed in ROM).
+
+  The requirement to provide Installation Information does not include a
+requirement to continue to provide support service, warranty, or updates
+for a work that has been modified or installed by the recipient, or for
+the User Product in which it has been modified or installed.  Access to a
+network may be denied when the modification itself materially and
+adversely affects the operation of the network or violates the rules and
+protocols for communication across the network.
+
+  Corresponding Source conveyed, and Installation Information provided,
+in accord with this section must be in a format that is publicly
+documented (and with an implementation available to the public in
+source code form), and must require no special password or key for
+unpacking, reading or copying.
+
+  7. Additional Terms.
+
+  "Additional permissions" are terms that supplement the terms of this
+License by making exceptions from one or more of its conditions.
+Additional permissions that are applicable to the entire Program shall
+be treated as though they were included in this License, to the extent
+that they are valid under applicable law.  If additional permissions
+apply only to part of the Program, that part may be used separately
+under those permissions, but the entire Program remains governed by
+this License without regard to the additional permissions.
+
+  When you convey a copy of a covered work, you may at your option
+remove any additional permissions from that copy, or from any part of
+it.  (Additional permissions may be written to require their own
+removal in certain cases when you modify the work.)  You may place
+additional permissions on material, added by you to a covered work,
+for which you have or can give appropriate copyright permission.
+
+  Notwithstanding any other provision of this License, for material you
+add to a covered work, you may (if authorized by the copyright holders of
+that material) supplement the terms of this License with terms:
+
+    a) Disclaiming warranty or limiting liability differently from the
+    terms of sections 15 and 16 of this License; or
+
+    b) Requiring preservation of specified reasonable legal notices or
+    author attributions in that material or in the Appropriate Legal
+    Notices displayed by works containing it; or
+
+    c) Prohibiting misrepresentation of the origin of that material, or
+    requiring that modified versions of such material be marked in
+    reasonable ways as different from the original version; or
+
+    d) Limiting the use for publicity purposes of names of licensors or
+    authors of the material; or
+
+    e) Declining to grant rights under trademark law for use of some
+    trade names, trademarks, or service marks; or
+
+    f) Requiring indemnification of licensors and authors of that
+    material by anyone who conveys the material (or modified versions of
+    it) with contractual assumptions of liability to the recipient, for
+    any liability that these contractual assumptions directly impose on
+    those licensors and authors.
+
+  All other non-permissive additional terms are considered "further
+restrictions" within the meaning of section 10.  If the Program as you
+received it, or any part of it, contains a notice stating that it is
+governed by this License along with a term that is a further
+restriction, you may remove that term.  If a license document contains
+a further restriction but permits relicensing or conveying under this
+License, you may add to a covered work material governed by the terms
+of that license document, provided that the further restriction does
+not survive such relicensing or conveying.
+
+  If you add terms to a covered work in accord with this section, you
+must place, in the relevant source files, a statement of the
+additional terms that apply to those files, or a notice indicating
+where to find the applicable terms.
+
+  Additional terms, permissive or non-permissive, may be stated in the
+form of a separately written license, or stated as exceptions;
+the above requirements apply either way.
+
+  8. Termination.
+
+  You may not propagate or modify a covered work except as expressly
+provided under this License.  Any attempt otherwise to propagate or
+modify it is void, and will automatically terminate your rights under
+this License (including any patent licenses granted under the third
+paragraph of section 11).
+
+  However, if you cease all violation of this License, then your
+license from a particular copyright holder is reinstated (a)
+provisionally, unless and until the copyright holder explicitly and
+finally terminates your license, and (b) permanently, if the copyright
+holder fails to notify you of the violation by some reasonable means
+prior to 60 days after the cessation.
+
+  Moreover, your license from a particular copyright holder is
+reinstated permanently if the copyright holder notifies you of the
+violation by some reasonable means, this is the first time you have
+received notice of violation of this License (for any work) from that
+copyright holder, and you cure the violation prior to 30 days after
+your receipt of the notice.
+
+  Termination of your rights under this section does not terminate the
+licenses of parties who have received copies or rights from you under
+this License.  If your rights have been terminated and not permanently
+reinstated, you do not qualify to receive new licenses for the same
+material under section 10.
+
+  9. Acceptance Not Required for Having Copies.
+
+  You are not required to accept this License in order to receive or
+run a copy of the Program.  Ancillary propagation of a covered work
+occurring solely as a consequence of using peer-to-peer transmission
+to receive a copy likewise does not require acceptance.  However,
+nothing other than this License grants you permission to propagate or
+modify any covered work.  These actions infringe copyright if you do
+not accept this License.  Therefore, by modifying or propagating a
+covered work, you indicate your acceptance of this License to do so.
+
+  10. Automatic Licensing of Downstream Recipients.
+
+  Each time you convey a covered work, the recipient automatically
+receives a license from the original licensors, to run, modify and
+propagate that work, subject to this License.  You are not responsible
+for enforcing compliance by third parties with this License.
+
+  An "entity transaction" is a transaction transferring control of an
+organization, or substantially all assets of one, or subdividing an
+organization, or merging organizations.  If propagation of a covered
+work results from an entity transaction, each party to that
+transaction who receives a copy of the work also receives whatever
+licenses to the work the party's predecessor in interest had or could
+give under the previous paragraph, plus a right to possession of the
+Corresponding Source of the work from the predecessor in interest, if
+the predecessor has it or can get it with reasonable efforts.
+
+  You may not impose any further restrictions on the exercise of the
+rights granted or affirmed under this License.  For example, you may
+not impose a license fee, royalty, or other charge for exercise of
+rights granted under this License, and you may not initiate litigation
+(including a cross-claim or counterclaim in a lawsuit) alleging that
+any patent claim is infringed by making, using, selling, offering for
+sale, or importing the Program or any portion of it.
+
+  11. Patents.
+
+  A "contributor" is a copyright holder who authorizes use under this
+License of the Program or a work on which the Program is based.  The
+work thus licensed is called the contributor's "contributor version".
+
+  A contributor's "essential patent claims" are all patent claims
+owned or controlled by the contributor, whether already acquired or
+hereafter acquired, that would be infringed by some manner, permitted
+by this License, of making, using, or selling its contributor version,
+but do not include claims that would be infringed only as a
+consequence of further modification of the contributor version.  For
+purposes of this definition, "control" includes the right to grant
+patent sublicenses in a manner consistent with the requirements of
+this License.
+
+  Each contributor grants you a non-exclusive, worldwide, royalty-free
+patent license under the contributor's essential patent claims, to
+make, use, sell, offer for sale, import and otherwise run, modify and
+propagate the contents of its contributor version.
+
+  In the following three paragraphs, a "patent license" is any express
+agreement or commitment, however denominated, not to enforce a patent
+(such as an express permission to practice a patent or covenant not to
+sue for patent infringement).  To "grant" such a patent license to a
+party means to make such an agreement or commitment not to enforce a
+patent against the party.
+
+  If you convey a covered work, knowingly relying on a patent license,
+and the Corresponding Source of the work is not available for anyone
+to copy, free of charge and under the terms of this License, through a
+publicly available network server or other readily accessible means,
+then you must either (1) cause the Corresponding Source to be so
+available, or (2) arrange to deprive yourself of the benefit of the
+patent license for this particular work, or (3) arrange, in a manner
+consistent with the requirements of this License, to extend the patent
+license to downstream recipients.  "Knowingly relying" means you have
+actual knowledge that, but for the patent license, your conveying the
+covered work in a country, or your recipient's use of the covered work
+in a country, would infringe one or more identifiable patents in that
+country that you have reason to believe are valid.
+
+  If, pursuant to or in connection with a single transaction or
+arrangement, you convey, or propagate by procuring conveyance of, a
+covered work, and grant a patent license to some of the parties
+receiving the covered work authorizing them to use, propagate, modify
+or convey a specific copy of the covered work, then the patent license
+you grant is automatically extended to all recipients of the covered
+work and works based on it.
+
+  A patent license is "discriminatory" if it does not include within
+the scope of its coverage, prohibits the exercise of, or is
+conditioned on the non-exercise of one or more of the rights that are
+specifically granted under this License.  You may not convey a covered
+work if you are a party to an arrangement with a third party that is
+in the business of distributing software, under which you make payment
+to the third party based on the extent of your activity of conveying
+the work, and under which the third party grants, to any of the
+parties who would receive the covered work from you, a discriminatory
+patent license (a) in connection with copies of the covered work
+conveyed by you (or copies made from those copies), or (b) primarily
+for and in connection with specific products or compilations that
+contain the covered work, unless you entered into that arrangement,
+or that patent license was granted, prior to 28 March 2007.
+
+  Nothing in this License shall be construed as excluding or limiting
+any implied license or other defenses to infringement that may
+otherwise be available to you under applicable patent law.
+
+  12. No Surrender of Others' Freedom.
+
+  If conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License.  If you cannot convey a
+covered work so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you may
+not convey it at all.  For example, if you agree to terms that obligate you
+to collect a royalty for further conveying from those to whom you convey
+the Program, the only way you could satisfy both those terms and this
+License would be to refrain entirely from conveying the Program.
+
+  13. Use with the GNU Affero General Public License.
+
+  Notwithstanding any other provision of this License, you have
+permission to link or combine any covered work with a work licensed
+under version 3 of the GNU Affero General Public License into a single
+combined work, and to convey the resulting work.  The terms of this
+License will continue to apply to the part which is the covered work,
+but the special requirements of the GNU Affero General Public License,
+section 13, concerning interaction through a network will apply to the
+combination as such.
+
+  14. Revised Versions of this License.
+
+  The Free Software Foundation may publish revised and/or new versions of
+the GNU General Public License from time to time.  Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+  Each version is given a distinguishing version number.  If the
+Program specifies that a certain numbered version of the GNU General
+Public License "or any later version" applies to it, you have the
+option of following the terms and conditions either of that numbered
+version or of any later version published by the Free Software
+Foundation.  If the Program does not specify a version number of the
+GNU General Public License, you may choose any version ever published
+by the Free Software Foundation.
+
+  If the Program specifies that a proxy can decide which future
+versions of the GNU General Public License can be used, that proxy's
+public statement of acceptance of a version permanently authorizes you
+to choose that version for the Program.
+
+  Later license versions may give you additional or different
+permissions.  However, no additional obligations are imposed on any
+author or copyright holder as a result of your choosing to follow a
+later version.
+
+  15. Disclaimer of Warranty.
+
+  THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
+APPLICABLE LAW.  EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
+HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
+OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
+THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
+IS WITH YOU.  SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
+ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
+
+  16. Limitation of Liability.
+
+  IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
+THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
+GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
+USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
+DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
+PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
+EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
+SUCH DAMAGES.
+
+  17. Interpretation of Sections 15 and 16.
+
+  If the disclaimer of warranty and limitation of liability provided
+above cannot be given local legal effect according to their terms,
+reviewing courts shall apply local law that most closely approximates
+an absolute waiver of all civil liability in connection with the
+Program, unless a warranty or assumption of liability accompanies a
+copy of the Program in return for a fee.
+
+                     END OF TERMS AND CONDITIONS
+
+            How to Apply These Terms to Your New Programs
+
+  If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+  To do so, attach the following notices to the program.  It is safest
+to attach them to the start of each source file to most effectively
+state the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+    <one line to give the program's name and a brief idea of what it does.>
+    Copyright (C) <year>  <name of author>
+
+    This program is free software: you can redistribute it and/or modify
+    it under the terms of the GNU General Public License as published by
+    the Free Software Foundation, either version 3 of the License, or
+    (at your option) any later version.
+
+    This program is distributed in the hope that it will be useful,
+    but WITHOUT ANY WARRANTY; without even the implied warranty of
+    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+    GNU General Public License for more details.
+
+    You should have received a copy of the GNU General Public License
+    along with this program.  If not, see <https://www.gnu.org/licenses/>.
+
+Also add information on how to contact you by electronic and paper mail.
+
+  If the program does terminal interaction, make it output a short
+notice like this when it starts in an interactive mode:
+
+    <program>  Copyright (C) <year>  <name of author>
+    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+    This is free software, and you are welcome to redistribute it
+    under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License.  Of course, your program's commands
+might be different; for a GUI interface, you would use an "about box".
+
+  You should also get your employer (if you work as a programmer) or school,
+if any, to sign a "copyright disclaimer" for the program, if necessary.
+For more information on this, and how to apply and follow the GNU GPL, see
+<https://www.gnu.org/licenses/>.
+
+  The GNU General Public License does not permit incorporating your program
+into proprietary programs.  If your program is a subroutine library, you
+may consider it more useful to permit linking proprietary applications with
+the library.  If this is what you want to do, use the GNU Lesser General
+Public License instead of this License.  But first, please read
+<https://www.gnu.org/licenses/why-not-lgpl.html>.
-- 
GitLab


From 9d339859a63787b5211a66616d523b6c066159c6 Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Mon, 14 Sep 2020 08:56:56 +0200
Subject: [PATCH 02/12] Add herberta configuration.

---
 README.md                 |  4 ++++
 combo/training/trainer.py | 12 ++++++------
 setup.py                  |  8 ++++----
 3 files changed, 14 insertions(+), 10 deletions(-)

diff --git a/README.md b/README.md
index 0ef5706..680673c 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,9 @@
 ## Installation
 
+### HERBERTA notes:
+
+Install herberta transformers package **before** running command below
+
 Clone this repository and run:
 ```bash
 python setup.py develop
diff --git a/combo/training/trainer.py b/combo/training/trainer.py
index 2096d48..234bdd7 100644
--- a/combo/training/trainer.py
+++ b/combo/training/trainer.py
@@ -54,12 +54,12 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
                  batch_callbacks: List[training.BatchCallback] = None,
                  epoch_callbacks: List[training.EpochCallback] = None, distributed: bool = False, local_rank: int = 0,
                  world_size: int = 1, num_gradient_accumulation_steps: int = 1,
-                 opt_level: Optional[str] = None) -> None:
+                 use_amp: bool = False) -> None:
         super().__init__(model, optimizer, data_loader, patience, validation_metric, validation_data_loader, num_epochs,
                          serialization_dir, checkpointer, cuda_device, grad_norm, grad_clipping,
                          learning_rate_scheduler, momentum_scheduler, tensorboard_writer, moving_average,
                          batch_callbacks, epoch_callbacks, distributed, local_rank, world_size,
-                         num_gradient_accumulation_steps, opt_level)
+                         num_gradient_accumulation_steps, use_amp)
         # TODO extract param to constructor (+ constructor method?)
         self.validate_every_n = 5
 
@@ -125,7 +125,8 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
                             self.model,
                             val_loss,
                             val_reg_loss,
-                            num_batches,
+                            num_batches=num_batches,
+                            batch_loss=None,
                             reset=True,
                             world_size=self._world_size,
                             cuda_device=self.cuda_device,
@@ -231,7 +232,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
             world_size: int = 1,
             num_gradient_accumulation_steps: int = 1,
             opt_level: Optional[str] = None,
-            no_grad: List[str] = None,
+            use_amp: bool = False,
             optimizer: common.Lazy[optimizers.Optimizer] = None,
             learning_rate_scheduler: common.Lazy[learning_rate_schedulers.LearningRateScheduler] = None,
             momentum_scheduler: common.Lazy[momentum_schedulers.MomentumScheduler] = None,
@@ -258,8 +259,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
             distributed=distributed,
             world_size=world_size,
             num_gradient_accumulation_steps=num_gradient_accumulation_steps,
-            opt_level=opt_level,
-            no_grad=no_grad,
+            use_amp=use_amp,
             optimizer=optimizer,
             learning_rate_scheduler=learning_rate_scheduler,
             momentum_scheduler=momentum_scheduler,
diff --git a/setup.py b/setup.py
index d3aac1a..228e025 100644
--- a/setup.py
+++ b/setup.py
@@ -3,17 +3,17 @@ from setuptools import find_packages, setup
 
 REQUIREMENTS = [
     'absl-py==0.9.0',
-    'allennlp==1.0.0',
+    'allennlp==1.1.0',
     'conllu==2.3.2',
     'dataclasses-json==0.5.2',
     'joblib==0.14.1',
     'jsonnet==0.15.0',
     'requests==2.23.0',
-    'overrides==3.0.0',
+    'overrides==3.1.0',
     'tensorboard==2.1.0',
-    'torch>=1.5.0,<1.6.0',
+    'torch==1.6.0',
     'tqdm==4.43.0',
-    'transformers==2.9.1',
+    'transformers>=3.0.0,<3.1.0',
     'urllib3==1.24.2',
 ]
 
-- 
GitLab


From dfec6d56ca1f642c45150497d0fdf986b2c62c9d Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Mon, 14 Sep 2020 14:28:37 +0200
Subject: [PATCH 03/12] Fix herberta training.

---
 README.md                 |  6 +-----
 combo/data/api.py         | 11 +++++++++--
 combo/data/dataset.py     | 19 +++++++++++++++++--
 combo/main.py             | 12 +++++++-----
 combo/predict.py          |  2 +-
 combo/training/trainer.py |  1 +
 setup.py                  |  2 +-
 7 files changed, 37 insertions(+), 16 deletions(-)

diff --git a/README.md b/README.md
index 680673c..c5d0fd0 100644
--- a/README.md
+++ b/README.md
@@ -1,9 +1,5 @@
 ## Installation
 
-### HERBERTA notes:
-
-Install herberta transformers package **before** running command below
-
 Clone this repository and run:
 ```bash
 python setup.py develop
@@ -86,7 +82,7 @@ Input: one sentence per line.
 Output: List of token jsons.
 
 ```bash
-combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent
+combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent --noconllu_format
 ```
 #### Advanced
 
diff --git a/combo/data/api.py b/combo/data/api.py
index b0763b6..10a3a72 100644
--- a/combo/data/api.py
+++ b/combo/data/api.py
@@ -20,6 +20,7 @@ class Token:
     deprel: Optional[str] = None
     deps: Optional[str] = None
     misc: Optional[str] = None
+    semrel: Optional[str] = None
 
 
 @dataclass_json
@@ -37,8 +38,14 @@ class _TokenList(conllu.TokenList):
         return 'TokenList<' + ', '.join(token['token'] for token in self) + '>'
 
 
-def sentence2conllu(sentence: Sentence) -> conllu.TokenList:
-    tokens = [collections.OrderedDict(t.to_dict()) for t in sentence.tokens]
+def sentence2conllu(sentence: Sentence, keep_semrel: bool = True) -> conllu.TokenList:
+    tokens = []
+    for token in sentence.tokens:
+        token_dict = collections.OrderedDict(token.to_dict())
+        # Remove semrel to have default conllu format.
+        if not keep_semrel:
+            del token_dict["semrel"]
+        tokens.append(token_dict)
     # Range tokens must be tuple not list, this is conllu library requirement
     for t in tokens:
         if type(t["id"]) == list:
diff --git a/combo/data/dataset.py b/combo/data/dataset.py
index b5f5c30..459a755 100644
--- a/combo/data/dataset.py
+++ b/combo/data/dataset.py
@@ -41,7 +41,7 @@ class UniversalDependenciesDatasetReader(allen_data.DatasetReader):
                 "Features and targets cannot share elements! "
                 "Remove {} from either features or targets.".format(intersection)
             )
-        self._use_sem = use_sem
+        self.use_sem = use_sem
 
         # *.conllu readers configuration
         fields = list(parser.DEFAULT_FIELDS)
@@ -49,7 +49,7 @@ class UniversalDependenciesDatasetReader(allen_data.DatasetReader):
         field_parsers = parser.DEFAULT_FIELD_PARSERS
         # Do not make it nullable
         field_parsers.pop("xpostag", None)
-        if self._use_sem:
+        if self.use_sem:
             fields = list(fields)
             fields.append("semrel")
             field_parsers["semrel"] = lambda line, i: line[i]
@@ -113,8 +113,23 @@ class UniversalDependenciesDatasetReader(allen_data.DatasetReader):
                         fields_[target_name] = allen_fields.SequenceLabelField(target_values, text_field,
                                                                                label_namespace=target_name + "_labels")
 
+        # Restore feats fields to string representation
+        # parser.serialize_field doesn't handle key without value
+        for token in tree.tokens:
+            if "feats" in token:
+                feats = token["feats"]
+                if feats:
+                    feats_values = []
+                    for k, v in feats.items():
+                        feats_values.append('='.join((k, v)) if v else k)
+                    field = "|".join(feats_values)
+                else:
+                    field = "_"
+                token["feats"] = field
+
         # metadata
         fields_["metadata"] = allen_fields.MetadataField({"input": tree, "field_names": self.fields})
+
         return allen_data.Instance(fields_)
 
     @staticmethod
diff --git a/combo/main.py b/combo/main.py
index 4dc0056..c7aac87 100644
--- a/combo/main.py
+++ b/combo/main.py
@@ -13,7 +13,7 @@ from allennlp.common import checks as allen_checks, util
 from allennlp.models import archival
 
 from combo import predict
-from combo.data import dataset
+from combo.data import api, dataset
 from combo.utils import checks
 
 logger = logging.getLogger(__name__)
@@ -76,6 +76,8 @@ flags.DEFINE_string(name="model_path", default=None,
                     help="Pretrained model path.")
 flags.DEFINE_string(name="input_file", default=None,
                     help="File to predict path")
+flags.DEFINE_boolean(name="conllu_format", default=True,
+                     help="Prediction based on conllu format (instead of raw text).")
 flags.DEFINE_integer(name="batch_size", default=1,
                      help="Prediction batch size.")
 flags.DEFINE_boolean(name="silent", default=True,
@@ -136,13 +138,13 @@ def run(_):
                 model=model,
                 dataset_reader=dataset_reader
             )
-            test_path = FLAGS.test_path
-            test_trees = dataset_reader.read(test_path)
+            test_trees = dataset_reader.read(FLAGS.test_path)
             with open(FLAGS.output_file, "w") as file:
                 for tree in test_trees:
-                    file.writelines(predictor.predict_instance_as_tree(tree).serialize())
+                    file.writelines(api.sentence2conllu(predictor.predict_instance(tree),
+                                                        keep_semrel=dataset_reader.use_sem).serialize())
     else:
-        use_dataset_reader = ".conllu" in FLAGS.input_file.lower()
+        use_dataset_reader = FLAGS.conllu_format
         predictor = _get_predictor()
         if use_dataset_reader:
             predictor.line_to_conllu = True
diff --git a/combo/predict.py b/combo/predict.py
index 0ee80a9..ebbb372 100644
--- a/combo/predict.py
+++ b/combo/predict.py
@@ -128,7 +128,7 @@ class SemanticMultitaskPredictor(predictor.Predictor):
         # Check whether serialized (str) tree or token's list
         # Serialized tree has already separators between lines
         if self.line_to_conllu:
-            return sentence2conllu(outputs).serialize()
+            return sentence2conllu(outputs, keep_semrel=self._dataset_reader.use_sem).serialize()
         else:
             return outputs.to_json()
 
diff --git a/combo/training/trainer.py b/combo/training/trainer.py
index 234bdd7..772b9b0 100644
--- a/combo/training/trainer.py
+++ b/combo/training/trainer.py
@@ -127,6 +127,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
                             val_reg_loss,
                             num_batches=num_batches,
                             batch_loss=None,
+                            batch_reg_loss=None,
                             reset=True,
                             world_size=self._world_size,
                             cuda_device=self.cuda_device,
diff --git a/setup.py b/setup.py
index 228e025..dd21555 100644
--- a/setup.py
+++ b/setup.py
@@ -14,7 +14,7 @@ REQUIREMENTS = [
     'torch==1.6.0',
     'tqdm==4.43.0',
     'transformers>=3.0.0,<3.1.0',
-    'urllib3==1.24.2',
+    'urllib3>=1.25.11',
 ]
 
 setup(
-- 
GitLab
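Patch 03 above restores `feats` dictionaries to their CoNLL-U string representation before serialization, since `parser.serialize_field` cannot handle a key without a value. The conversion added to `combo/data/dataset.py` can be sketched as a standalone helper (the function name is hypothetical, not part of the codebase):

```python
from typing import Dict, Optional


def feats_to_str(feats: Optional[Dict[str, Optional[str]]]) -> str:
    """Serialize a CoNLL-U feats mapping back to its string form.

    Mirrors the loop in the patch: keys with a value become "k=v",
    keys without a value are emitted bare, pairs are joined with "|",
    and an empty or missing mapping becomes the CoNLL-U placeholder "_".
    """
    if not feats:
        return "_"
    return "|".join("=".join((k, v)) if v else k for k, v in feats.items())
```

For example, `{"Case": "Nom", "Number": "Sing"}` serializes to `Case=Nom|Number=Sing`, while a value-less key such as `Foreign` is kept as-is rather than producing a dangling `Foreign=`.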


From 2a41300666a4b8bd5e9ebc97f92f56367b8b13fa Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Tue, 3 Nov 2020 09:54:50 +0100
Subject: [PATCH 04/12] Update allennlp to 1.2.0.

---
 combo/data/token_indexers/__init__.py         |  1 +
 ...etrained_transformer_mismatched_indexer.py | 85 +++++++++++++++++++
 combo/models/embeddings.py                    |  8 +-
 combo/training/trainer.py                     | 20 +++--
 config.template.jsonnet                       |  6 +-
 setup.py                                      |  5 +-
 6 files changed, 111 insertions(+), 14 deletions(-)
 create mode 100644 combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py

diff --git a/combo/data/token_indexers/__init__.py b/combo/data/token_indexers/__init__.py
index 1b918b3..550a80b 100644
--- a/combo/data/token_indexers/__init__.py
+++ b/combo/data/token_indexers/__init__.py
@@ -1,2 +1,3 @@
+from .pretrained_transformer_mismatched_indexer import PretrainedTransformerMismatchedIndexer
 from .token_characters_indexer import TokenCharactersIndexer
 from .token_features_indexer import TokenFeatsIndexer
diff --git a/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py b/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py
new file mode 100644
index 0000000..fbe2368
--- /dev/null
+++ b/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py
@@ -0,0 +1,85 @@
+from typing import Optional, Dict, Any, List, Tuple
+
+from allennlp import data
+from allennlp.data import token_indexers, tokenizers
+
+
+@data.TokenIndexer.register("pretrained_transformer_mismatched_fixed")
+class PretrainedTransformerMismatchedIndexer(token_indexers.PretrainedTransformerMismatchedIndexer):
+
+    def __init__(self, model_name: str, namespace: str = "tags", max_length: int = None,
+                 tokenizer_kwargs: Optional[Dict[str, Any]] = None, **kwargs) -> None:
+        # The matched version v.s. mismatchedńskie
+        super().__init__(model_name, namespace, max_length, tokenizer_kwargs, **kwargs)
+        self._matched_indexer = PretrainedTransformerIndexer(
+            model_name,
+            namespace=namespace,
+            max_length=max_length,
+            tokenizer_kwargs=tokenizer_kwargs,
+            **kwargs,
+        )
+        self._allennlp_tokenizer = self._matched_indexer._allennlp_tokenizer
+        self._tokenizer = self._matched_indexer._tokenizer
+        self._num_added_start_tokens = self._matched_indexer._num_added_start_tokens
+        self._num_added_end_tokens = self._matched_indexer._num_added_end_tokens
+
+
+class PretrainedTransformerIndexer(token_indexers.PretrainedTransformerIndexer):
+
+    def __init__(
+            self,
+            model_name: str,
+            namespace: str = "tags",
+            max_length: int = None,
+            tokenizer_kwargs: Optional[Dict[str, Any]] = None,
+            **kwargs,
+    ) -> None:
+        super().__init__(model_name, namespace, max_length, tokenizer_kwargs, **kwargs)
+        self._namespace = namespace
+        self._allennlp_tokenizer = PretrainedTransformerTokenizer(
+            model_name, tokenizer_kwargs=tokenizer_kwargs
+        )
+        self._tokenizer = self._allennlp_tokenizer.tokenizer
+        self._added_to_vocabulary = False
+
+        self._num_added_start_tokens = len(self._allennlp_tokenizer.single_sequence_start_tokens)
+        self._num_added_end_tokens = len(self._allennlp_tokenizer.single_sequence_end_tokens)
+
+        self._max_length = max_length
+        if self._max_length is not None:
+            num_added_tokens = len(self._allennlp_tokenizer.tokenize("a")) - 1
+            self._effective_max_length = (  # we need to take into account special tokens
+                    self._max_length - num_added_tokens
+            )
+            if self._effective_max_length <= 0:
+                raise ValueError(
+                    "max_length needs to be greater than the number of special tokens inserted."
+                )
+
+
+class PretrainedTransformerTokenizer(tokenizers.PretrainedTransformerTokenizer):
+
+    def _intra_word_tokenize(
+            self, string_tokens: List[str]
+    ) -> Tuple[List[data.Token], List[Optional[Tuple[int, int]]]]:
+        tokens: List[data.Token] = []
+        offsets: List[Optional[Tuple[int, int]]] = []
+        for token_string in string_tokens:
+            wordpieces = self.tokenizer.encode_plus(
+                token_string,
+                add_special_tokens=False,
+                return_tensors=None,
+                return_offsets_mapping=False,
+                return_attention_mask=False,
+            )
+            wp_ids = wordpieces["input_ids"]
+
+            if len(wp_ids) > 0:
+                offsets.append((len(tokens), len(tokens) + len(wp_ids) - 1))
+                tokens.extend(
+                    data.Token(text=wp_text, text_id=wp_id)
+                    for wp_id, wp_text in zip(wp_ids, self.tokenizer.convert_ids_to_tokens(wp_ids))
+                )
+            else:
+                offsets.append(None)
+        return tokens, offsets
diff --git a/combo/models/embeddings.py b/combo/models/embeddings.py
index edb37a1..5cad959 100644
--- a/combo/models/embeddings.py
+++ b/combo/models/embeddings.py
@@ -1,5 +1,5 @@
 """Embeddings."""
-from typing import Optional
+from typing import Optional, Dict, Any
 
 import torch
 import torch.nn as nn
@@ -110,8 +110,10 @@ class TransformersWordEmbedder(token_embedders.PretrainedTransformerMismatchedEm
                  projection_dim: int,
                  projection_activation: Optional[allen_nn.Activation] = lambda x: x,
                  projection_dropout_rate: Optional[float] = 0.0,
-                 freeze_transformer: bool = True):
-        super().__init__(model_name)
+                 freeze_transformer: bool = True,
+                 tokenizer_kwargs: Optional[Dict[str, Any]] = None,
+                 transformer_kwargs: Optional[Dict[str, Any]] = None):
+        super().__init__(model_name, tokenizer_kwargs=tokenizer_kwargs, transformer_kwargs=transformer_kwargs)
         self.freeze_transformer = freeze_transformer
         if self.freeze_transformer:
             self._matched_embedder.eval()
diff --git a/combo/training/trainer.py b/combo/training/trainer.py
index 772b9b0..aeb9f09 100644
--- a/combo/training/trainer.py
+++ b/combo/training/trainer.py
@@ -3,7 +3,7 @@ import logging
 import os
 import time
 import traceback
-from typing import Any, Dict, List, Optional
+from typing import Any, Dict, List, Optional, Union
 
 import torch
 import torch.distributed as dist
@@ -30,7 +30,8 @@ logger = logging.getLogger(__name__)
 @training.EpochCallback.register("transfer_patience")
 class TransferPatienceEpochCallback(training.EpochCallback):
 
-    def __call__(self, trainer: "training.GradientDescentTrainer", metrics: Dict[str, Any], epoch: int) -> None:
+    def __call__(self, trainer: "training.GradientDescentTrainer", metrics: Dict[str, Any], epoch: int,
+                 is_master: bool) -> None:
         if trainer._learning_rate_scheduler and trainer._learning_rate_scheduler.patience is not None:
             trainer._metric_tracker._patience = trainer._learning_rate_scheduler.patience
             trainer._metric_tracker._epochs_with_no_improvement = 0
@@ -45,20 +46,23 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
                  patience: Optional[int] = None, validation_metric: str = "-loss",
                  validation_data_loader: data.DataLoader = None, num_epochs: int = 20,
                  serialization_dir: Optional[str] = None, checkpointer: checkpointer.Checkpointer = None,
-                 cuda_device: int = -1,
+                 cuda_device: Optional[Union[int, torch.device]] = -1,
                  grad_norm: Optional[float] = None, grad_clipping: Optional[float] = None,
                  learning_rate_scheduler: Optional[learning_rate_schedulers.LearningRateScheduler] = None,
                  momentum_scheduler: Optional[momentum_schedulers.MomentumScheduler] = None,
                  tensorboard_writer: allen_tensorboard_writer.TensorboardWriter = None,
                  moving_average: Optional[moving_average.MovingAverage] = None,
                  batch_callbacks: List[training.BatchCallback] = None,
-                 epoch_callbacks: List[training.EpochCallback] = None, distributed: bool = False, local_rank: int = 0,
+                 epoch_callbacks: List[training.EpochCallback] = None,
+                 end_callbacks: List[training.EpochCallback] = None,
+                 trainer_callbacks: List[training.TrainerCallback] = None,
+                 distributed: bool = False, local_rank: int = 0,
                  world_size: int = 1, num_gradient_accumulation_steps: int = 1,
                  use_amp: bool = False) -> None:
         super().__init__(model, optimizer, data_loader, patience, validation_metric, validation_data_loader, num_epochs,
                          serialization_dir, checkpointer, cuda_device, grad_norm, grad_clipping,
                          learning_rate_scheduler, momentum_scheduler, tensorboard_writer, moving_average,
-                         batch_callbacks, epoch_callbacks, distributed, local_rank, world_size,
+                         batch_callbacks, epoch_callbacks, end_callbacks, trainer_callbacks, distributed, local_rank, world_size,
                          num_gradient_accumulation_steps, use_amp)
         # TODO extract param to constructor (+ constructor method?)
         self.validate_every_n = 5
@@ -93,7 +97,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
             metrics["best_validation_" + key] = value
 
         for callback in self._epoch_callbacks:
-            callback(self, metrics={}, epoch=-1)
+            callback(self, metrics={}, epoch=-1, is_master=True)
 
         for epoch in range(epoch_counter, self._num_epochs):
             epoch_start_time = time.time()
@@ -190,7 +194,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
                 dist.barrier()
 
             for callback in self._epoch_callbacks:
-                callback(self, metrics=metrics, epoch=epoch)
+                callback(self, metrics=metrics, epoch=epoch, is_master=self._master)
 
             epoch_elapsed_time = time.time() - epoch_start_time
             logger.info("Epoch duration: %s", datetime.timedelta(seconds=epoch_elapsed_time))
@@ -243,7 +247,7 @@ class GradientDescentTrainer(training.GradientDescentTrainer):
             batch_callbacks: List[training.BatchCallback] = None,
             epoch_callbacks: List[training.EpochCallback] = None,
     ) -> "training.Trainer":
-        if tensorboard_writer.construct() is None:
+        if tensorboard_writer is None:
             tensorboard_writer = common.Lazy(combo_tensorboard_writer.NullTensorboardWriter)
         return super().from_partial_objects(
             model=model,
diff --git a/config.template.jsonnet b/config.template.jsonnet
index 57f02ae..8e5ddc9 100644
--- a/config.template.jsonnet
+++ b/config.template.jsonnet
@@ -112,8 +112,10 @@ assert pretrained_tokens == null || pretrained_transformer_name == null: "Can't
         use_sem: if in_targets("semrel") then true else false,
         token_indexers: {
             token: if use_transformer then {
-                type: "pretrained_transformer_mismatched",
+                type: "pretrained_transformer_mismatched_fixed",
                 model_name: pretrained_transformer_name,
+                tokenizer_kwargs: if std.startsWith(pretrained_transformer_name, "allegro/herbert")
+                                  then {use_fast: false} else {},
             } else {
                 # SingleIdTokenIndexer, token as single int
                 type: "single_id",
@@ -202,6 +204,8 @@ assert pretrained_tokens == null || pretrained_transformer_name == null: "Can't
                     type: "transformers_word_embeddings",
                     model_name: pretrained_transformer_name,
                     projection_dim: projected_embedding_dim,
+                    tokenizer_kwargs: if std.startsWith(pretrained_transformer_name, "allegro/herbert")
+                                      then {use_fast: false} else {},
                 } else {
                     type: "embeddings_projected",
                     embedding_dim: embedding_dim,
diff --git a/setup.py b/setup.py
index dd21555..6ce7e3c 100644
--- a/setup.py
+++ b/setup.py
@@ -3,8 +3,9 @@ from setuptools import find_packages, setup
 
 REQUIREMENTS = [
     'absl-py==0.9.0',
-    'allennlp==1.1.0',
+    'allennlp==1.2.0',
     'conllu==2.3.2',
+    'dataclasses==0.5',
     'dataclasses-json==0.5.2',
     'joblib==0.14.1',
     'jsonnet==0.15.0',
@@ -13,7 +14,7 @@ REQUIREMENTS = [
     'tensorboard==2.1.0',
     'torch==1.6.0',
     'tqdm==4.43.0',
-    'transformers>=3.0.0,<3.1.0',
+    'transformers>=3.4.0,<3.5',
     'urllib3>=1.25.11',
 ]
 
-- 
GitLab
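The `_intra_word_tokenize` override in patch 04 maps each whitespace token to an inclusive `(start, end)` range of wordpiece positions, recording `None` for tokens that yield no wordpieces. The offset bookkeeping can be sketched independently of any real tokenizer (the per-token wordpiece lists here are stand-ins for `encode_plus` output):

```python
from typing import List, Optional, Tuple


def intra_word_offsets(
        tokens_wordpieces: List[List[str]],
) -> Tuple[List[str], List[Optional[Tuple[int, int]]]]:
    """Flatten per-token wordpiece lists, keeping inclusive offset ranges.

    Mirrors the offset logic in the patched _intra_word_tokenize:
    a token producing no wordpieces gets offset None instead of an
    empty or degenerate range.
    """
    flat: List[str] = []
    offsets: List[Optional[Tuple[int, int]]] = []
    for pieces in tokens_wordpieces:
        if pieces:
            # Inclusive range: last index is start + len - 1.
            offsets.append((len(flat), len(flat) + len(pieces) - 1))
            flat.extend(pieces)
        else:
            offsets.append(None)
    return flat, offsets
```

The `None` sentinel is what lets the mismatched embedder later distinguish "token with no subwords" from a genuine single-wordpiece token.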


From ea87067930488a1b0cbf3d06944824d504f89531 Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Tue, 3 Nov 2020 10:33:11 +0100
Subject: [PATCH 05/12] Add TODO note for next AllenNLP update.

---
 .../pretrained_transformer_mismatched_indexer.py               | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py b/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py
index fbe2368..3eee80e 100644
--- a/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py
+++ b/combo/data/token_indexers/pretrained_transformer_mismatched_indexer.py
@@ -6,10 +6,11 @@ from allennlp.data import token_indexers, tokenizers
 
 @data.TokenIndexer.register("pretrained_transformer_mismatched_fixed")
 class PretrainedTransformerMismatchedIndexer(token_indexers.PretrainedTransformerMismatchedIndexer):
+    """TODO(mklimasz) Remove during next allennlp update, fixed on allennlp master."""
 
     def __init__(self, model_name: str, namespace: str = "tags", max_length: int = None,
                  tokenizer_kwargs: Optional[Dict[str, Any]] = None, **kwargs) -> None:
-        # The matched version v.s. mismatchedńskie
+        # The matched version v.s. mismatched
         super().__init__(model_name, namespace, max_length, tokenizer_kwargs, **kwargs)
         self._matched_indexer = PretrainedTransformerIndexer(
             model_name,
-- 
GitLab


From 7ebf88b84bbccfd91a023ba79c321fecbc8bdf13 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Alina=20Wr=C3=B3blewska?= <alina@ipipan.waw.pl>
Date: Tue, 28 Jul 2020 18:13:38 +0200
Subject: [PATCH 06/12] Readme added

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index c5d0fd0..0ef5706 100644
--- a/README.md
+++ b/README.md
@@ -82,7 +82,7 @@ Input: one sentence per line.
 Output: List of token jsons.
 
 ```bash
-combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent --noconllu_format
+combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent
 ```
 #### Advanced
 
-- 
GitLab


From 43a618224e91fa5018f6633f64e4e9a47163c15a Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Tue, 3 Nov 2020 11:31:21 +0100
Subject: [PATCH 07/12] Fix dataclass package issue for python 3.6.

---
 setup.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/setup.py b/setup.py
index 6ce7e3c..9529a0c 100644
--- a/setup.py
+++ b/setup.py
@@ -5,7 +5,7 @@ REQUIREMENTS = [
     'absl-py==0.9.0',
     'allennlp==1.2.0',
     'conllu==2.3.2',
-    'dataclasses==0.5',
+    'dataclasses;python_version<"3.7"',
     'dataclasses-json==0.5.2',
     'joblib==0.14.1',
     'jsonnet==0.15.0',
-- 
GitLab
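A side note on the fix above: `dataclasses` entered the standard library in Python 3.7, so pinning the backport unconditionally (`dataclasses==0.5`) installs a shadow copy on newer interpreters. The PEP 508 environment marker restricts the backport to Python 3.6. A minimal sketch of what the marker expresses at runtime (the `requirement` string is the literal line from `setup.py`; the rest is illustrative):

```python
import dataclasses  # stdlib module on 3.7+, backport package on 3.6
import sys

# The requirement line from setup.py: the PEP 508 marker after the
# semicolon tells pip to install the package only on old interpreters.
requirement = 'dataclasses;python_version<"3.7"'

# The marker is equivalent to this runtime condition:
needs_backport = sys.version_info < (3, 7)
print(needs_backport)
```

On any Python 3.7+ interpreter this prints `False`, and `import dataclasses` resolves to the standard library rather than the pinned backport.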


From 90d7c1ef6de2eeda855a136285615f56ec61a08f Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Tue, 3 Nov 2020 14:32:11 +0100
Subject: [PATCH 08/12] Remove unused patching in test.

---
 tests/test_predict.py | 12 ------------
 1 file changed, 12 deletions(-)

diff --git a/tests/test_predict.py b/tests/test_predict.py
index 42bc493..2a56bd9 100644
--- a/tests/test_predict.py
+++ b/tests/test_predict.py
@@ -1,8 +1,6 @@
 import os
 import pathlib
-import shutil
 import unittest
-from unittest import mock
 
 import combo.data as data
 import combo.predict as predict
@@ -14,16 +12,6 @@ class PredictionTest(unittest.TestCase):
     TESTS_ROOT = PROJECT_ROOT / "tests"
     FIXTURES_ROOT = TESTS_ROOT / "fixtures"
 
-    def setUp(self) -> None:
-        def _cleanup_archive_dir_without_logging(path: str):
-            if os.path.exists(path):
-                shutil.rmtree(path)
-
-        self.patcher = mock.patch(
-            "allennlp.models.archival._cleanup_archive_dir", _cleanup_archive_dir_without_logging
-        )
-        self.mock_cleanup_archive_dir = self.patcher.start()
-
     def test_prediction_are_equal_given_the_same_input_in_different_form(self):
         # given
         raw_sentence = "Test."
-- 
GitLab


From 828a3ec56a6cd32a406d840ce987c103d3ec763f Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Sat, 7 Nov 2020 19:27:00 +0100
Subject: [PATCH 09/12] Add header for readme.md

---
 README.md | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/README.md b/README.md
index 0ef5706..0213823 100644
--- a/README.md
+++ b/README.md
@@ -1,3 +1,14 @@
+# COMBO
+<p align="center">
+    A GPL-3.0 system, built on top of PyTorch and AllenNLP, for morphosyntactic analysis.
+</p>
+<hr/>
+<p align="center">
+    <a href="https://github.com/ipipan/combo/blob/master/LICENSE">
+        <img alt="License" src="https://img.shields.io/github/license/ipipan/combo.svg?color=blue&cachedrop">
+    </a>
+</p>
+
 ## Installation
 
 Clone this repository and run:
-- 
GitLab


From 92b4e5c32ea6979815dbb3dda2b24fe06237fb70 Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Mon, 9 Nov 2020 10:22:28 +0100
Subject: [PATCH 10/12] Add information about pre-trained models.

---
 README.md | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/README.md b/README.md
index 0213823..87a4657 100644
--- a/README.md
+++ b/README.md
@@ -9,6 +9,14 @@
     </a>
 </p>
 
+[Pre-trained models!](http://mozart.ipipan.waw.pl/~mklimaszewski/models/)
+```python
+import combo.predict as predict
+nlp = predict.SemanticMultitaskPredictor.from_pretrained("polish-herbert-base")
+sentence = nlp("Moje zdanie.")
+print(sentence.tokens)
+```
+
 ## Installation
 
 Clone this repository and run:
-- 
GitLab


From 8fba2eddfad66783b8e06b33cef03509e4d7098e Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Mon, 9 Nov 2020 12:59:44 +0100
Subject: [PATCH 11/12] Fix console prediction.

---
 combo/main.py    | 3 +++
 combo/predict.py | 3 +++
 2 files changed, 6 insertions(+)

diff --git a/combo/main.py b/combo/main.py
index c7aac87..44ad091 100644
--- a/combo/main.py
+++ b/combo/main.py
@@ -146,6 +146,9 @@ def run(_):
     else:
         use_dataset_reader = FLAGS.conllu_format
         predictor = _get_predictor()
+        if FLAGS.input_file == "-":
+            use_dataset_reader = False
+            predictor.without_sentence_embedding = True
         if use_dataset_reader:
             predictor.line_to_conllu = True
         if FLAGS.silent:
diff --git a/combo/predict.py b/combo/predict.py
index ebbb372..b6c7172 100644
--- a/combo/predict.py
+++ b/combo/predict.py
@@ -32,6 +32,7 @@ class SemanticMultitaskPredictor(predictor.Predictor):
         self._dataset_reader.generate_labels = False
         self._dataset_reader.lazy = True
         self._tokenizer = tokenizer
+        self.without_sentence_embedding = False
         self.line_to_conllu = line_to_conllu
 
     def __call__(self, sentence: Union[str, List[str], List[List[str]], List[data.Sentence]]):
@@ -127,6 +128,8 @@ class SemanticMultitaskPredictor(predictor.Predictor):
     def dump_line(self, outputs: data.Sentence) -> str:
         # Check whether serialized (str) tree or token's list
         # Serialized tree has already separators between lines
+        if self.without_sentence_embedding:
+            outputs.sentence_embedding = []
         if self.line_to_conllu:
             return sentence2conllu(outputs, keep_semrel=self._dataset_reader.use_sem).serialize()
         else:
-- 
GitLab
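The control flow added in `combo/main.py` above can be sketched in isolation (the function and variable names here are illustrative, not the actual `combo` API): passing `-` as the input file switches the predictor into console mode, which bypasses the dataset reader (console input is raw text, not CoNLL-U) and suppresses the bulky sentence embedding in the printed output.

```python
def configure_console_mode(input_file: str, conllu_format: bool):
    """Illustrative sketch of the branch added to combo/main.py."""
    use_dataset_reader = conllu_format
    without_sentence_embedding = False
    if input_file == "-":                  # "-" conventionally means stdin
        use_dataset_reader = False         # console input is raw text
        without_sentence_embedding = True  # keep interactive output compact
    return use_dataset_reader, without_sentence_embedding

# Console mode: reader disabled, embeddings suppressed.
print(configure_console_mode("-", conllu_format=True))            # (False, True)
# File mode: flags pass through unchanged.
print(configure_console_mode("input.conllu", conllu_format=True)) # (True, False)
```

The matching change in `combo/predict.py` then empties `sentence_embedding` in `dump_line` whenever the flag is set.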


From aaea8f5512b0dbae513bd2a078448996acd4c03c Mon Sep 17 00:00:00 2001
From: Mateusz Klimaszewski <mk.klimaszewski@gmail.com>
Date: Tue, 10 Nov 2020 14:43:10 +0100
Subject: [PATCH 12/12] Split documentation into multiple markdown files.

---
 README.md            | 122 ++++++-------------------------------------
 docs/installation.md |  13 +++++
 docs/models.md       |  19 +++++++
 docs/prediction.md   |  40 ++++++++++++++
 docs/training.md     |  52 ++++++++++++++++++
 5 files changed, 139 insertions(+), 107 deletions(-)
 create mode 100644 docs/installation.md
 create mode 100644 docs/models.md
 create mode 100644 docs/prediction.md
 create mode 100644 docs/training.md

diff --git a/README.md b/README.md
index 87a4657..19491b3 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # COMBO
 <p align="center">
-    A GPL-3.0 system, built on top of PyTorch and AllenNLP, for morphosyntactic analysis.
+    A language-independent NLP system for dependency parsing, part-of-speech tagging, lemmatisation, and more, built on top of PyTorch and AllenNLP.
 </p>
 <hr/>
 <p align="center">
@@ -9,118 +9,26 @@
     </a>
 </p>
 
-[Pre-trained models!](http://mozart.ipipan.waw.pl/~mklimaszewski/models/)
-```python
-import combo.predict as predict
-nlp = predict.SemanticMultitaskPredictor.from_pretrained("polish-herbert-base")
-sentence = nlp("Moje zdanie.")
-print(sentence.tokens)
-```
-
-## Installation
-
-Clone this repository and run:
+## Quick start
+Clone this repository and install COMBO (we suggest using virtualenv/conda with Python 3.6+):
 ```bash
+git clone https://github.com/ipipan/combo.git
+cd combo
 python setup.py develop
 ```
-
-### Problems & solutions
-* **jsonnet** installation error
-
-use `conda install -c conda-forge jsonnet=0.15.0`
-
-## Training
-
-Command:
-```bash
-combo --mode train \
-      --training_data_path your_training_path \
-      --validation_data_path your_validation_path
-```
-
-Options:
-```bash
-combo --helpfull
-```
-
-Examples (for clarity without training/validation data paths):
-
-* train on gpu 0
-
-    ```bash
-    combo --mode train --cuda_device 0
-    ```
-
-* use pretrained embeddings:
-
-    ```bash
-    combo --mode train --pretrained_tokens your_pretrained_embeddings_path --embedding_dim your_embeddings_dim
-    ```
-
-* use pretrained transformer embeddings:
-
-    ```bash
-    combo --mode train --pretrained_transformer_name your_choosen_pretrained_transformer
-    ```
-
-* predict only dependency tree:
-
-    ```bash
-    combo --mode train --targets head,deprel
-    ```
-
-* use part-of-speech tags for predicting only dependency tree
-
-    ```bash
-    combo --mode train --targets head,deprel --features token,char,upostag
-    ```
-
-Advanced configuration: [Configuration](#configuration)
-
-## Prediction
-
-### ConLLU file prediction:
-Input and output are both in `*.conllu` format.
-```bash
-combo --mode predict --model_path your_model_tar_gz --input_file your_conllu_file --output_file your_output_file --silent
-```
-
-### Console
-Works for models where input was text-based only.
-
-Interactive testing in console (load model and just type sentence in console).
-
-```bash
-combo --mode predict --model_path your_model_tar_gz --input_file "-" --nosilent
-```
-### Raw text
-Works for models where input was text-based only. 
-
-Input: one sentence per line.
-
-Output: List of token jsons.
-
-```bash
-combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent
-```
-#### Advanced
-
-There are 2 tokenizers: whitespace and spacy-based (`en_core_web_sm` model).
-
-Use either `--predictor_name semantic-multitask-predictor` or `--predictor_name semantic-multitask-predictor-spacy`.
-
-### Python
+Run the following lines in your Python console to make predictions with a pre-trained model:
 ```python
 import combo.predict as predict
 
-model_path = "your_model.tar.gz"
-nlp = predict.SemanticMultitaskPredictor.from_pretrained(model_path)
-sentence = nlp("Sentence to parse.")
+nlp = predict.SemanticMultitaskPredictor.from_pretrained("polish-herbert-base")
+sentence = nlp("Moje zdanie.")
+print(sentence.tokens)
 ```
 
-## Configuration
+## Details
+
+- [**Installation**](docs/installation.md)
+- [**Pre-trained models**](docs/models.md)
+- [**Training**](docs/training.md)
+- [**Prediction**](docs/prediction.md)
 
-### Advanced
-Config template [config.template.jsonnet](config.template.jsonnet) is formed in `allennlp` format so you can freely modify it.
-There is configuration for all the training/model parameters (learning rates, epochs number etc.).
-Some of them use `jsonnet` syntax to get values from configuration flags, however most of them can be modified directly there.
diff --git a/docs/installation.md b/docs/installation.md
new file mode 100644
index 0000000..559c9c7
--- /dev/null
+++ b/docs/installation.md
@@ -0,0 +1,13 @@
+# Installation
+Clone this repository and install COMBO (we suggest using virtualenv/conda with Python 3.6+):
+```bash
+git clone https://github.com/ipipan/combo.git
+cd combo
+python setup.py develop
+combo --helpfull
+```
+
+## Problems & solutions
+* **jsonnet** installation error
+
+use `conda install -c conda-forge jsonnet=0.15.0`
diff --git a/docs/models.md b/docs/models.md
new file mode 100644
index 0000000..485f761
--- /dev/null
+++ b/docs/models.md
@@ -0,0 +1,19 @@
+# Models
+
+Pre-trained models are available [here](http://mozart.ipipan.waw.pl/~mklimaszewski/models/).
+
+## Automatic download
+The Python `from_pretrained` method downloads a pre-trained model when the provided name (without the `.tar.gz` extension) matches one of the models listed [here](http://mozart.ipipan.waw.pl/~mklimaszewski/models/).
+```python
+import combo.predict as predict
+
+nlp = predict.SemanticMultitaskPredictor.from_pretrained("polish-herbert-base")
+```
+Otherwise, it looks for the model in the local environment.
+
+## Console prediction/Local model
+If you want to use the console version of COMBO, you need to download a pre-trained model manually
+```bash
+wget http://mozart.ipipan.waw.pl/~mklimaszewski/models/polish-herbert-base.tar.gz
+```
+and pass it as a parameter (see [prediction doc](prediction.md)).
diff --git a/docs/prediction.md b/docs/prediction.md
new file mode 100644
index 0000000..89cc74c
--- /dev/null
+++ b/docs/prediction.md
@@ -0,0 +1,40 @@
+# Prediction
+
+## CoNLL-U file prediction
+Input and output are both in `*.conllu` format.
+```bash
+combo --mode predict --model_path your_model_tar_gz --input_file your_conllu_file --output_file your_output_file --silent
+```
+
+## Console
+Works only for models trained on raw-text input.
+
+Interactive testing in the console: the model is loaded once and you simply type sentences.
+
+```bash
+combo --mode predict --model_path your_model_tar_gz --input_file "-" --nosilent
+```
+## Raw text
+Works only for models trained on raw-text input.
+
+Input: one sentence per line.
+
+Output: a list of token JSONs.
+
+```bash
+combo --mode predict --model_path your_model_tar_gz --input_file your_text_file --output_file your_output_file --silent --noconllu_format
+```
+### Advanced
+
+There are two tokenizers: whitespace-based and spaCy-based (using the `en_core_web_sm` model).
+
+Use either `--predictor_name semantic-multitask-predictor` or `--predictor_name semantic-multitask-predictor-spacy`.
+
+## Python
+```python
+import combo.predict as predict
+
+model_path = "your_model.tar.gz"
+nlp = predict.SemanticMultitaskPredictor.from_pretrained(model_path)
+sentence = nlp("Sentence to parse.")
+```
diff --git a/docs/training.md b/docs/training.md
new file mode 100644
index 0000000..9dc430a
--- /dev/null
+++ b/docs/training.md
@@ -0,0 +1,52 @@
+# Training
+
+Command:
+```bash
+combo --mode train \
+      --training_data_path your_training_path \
+      --validation_data_path your_validation_path
+```
+
+Options:
+```bash
+combo --helpfull
+```
+
+Examples (training/validation data paths omitted for clarity):
+
+* train on GPU 0:
+
+    ```bash
+    combo --mode train --cuda_device 0
+    ```
+
+* use pretrained embeddings:
+
+    ```bash
+    combo --mode train --pretrained_tokens your_pretrained_embeddings_path --embedding_dim your_embeddings_dim
+    ```
+
+* use pretrained transformer embeddings:
+
+    ```bash
+    combo --mode train --pretrained_transformer_name your_chosen_pretrained_transformer
+    ```
+
+* predict only the dependency tree:
+
+    ```bash
+    combo --mode train --targets head,deprel
+    ```
+
+* use part-of-speech tags when predicting only the dependency tree:
+
+    ```bash
+    combo --mode train --targets head,deprel --features token,char,upostag
+    ```
+  
+## Configuration
+
+### Advanced
+The config template [config.template.jsonnet](config.template.jsonnet) follows the `allennlp` format, so you can freely modify it.
+It configures all the training/model parameters (learning rates, number of epochs, etc.).
+Some of them use `jsonnet` syntax to take values from configuration flags, but most can be edited directly in the file.
\ No newline at end of file
-- 
GitLab