Commit cd9ff7d6 authored by Grzegorz Kostkowski's avatar Grzegorz Kostkowski

Workaround to avoid unexpected stop of execution

Unfortunately, there are limitations when multiprocessing and
multithreading are used together (forking combined with threads
managed by OpenMP).
As a workaround, the graph is not shared; each worker has its own
instance.
It is also possible to switch the multiprocessing mode to 'spawn'
(this case was checked, and it does work with OpenMP multithreading),
but it results in the same behavior: the model is copied in memory,
because the OS cannot postpone copying under the copy-on-write
strategy, which is only available in 'fork' mode.
parent 49600add
Branch: shared-model-fix
Merge request !4: Workaround to avoid unexpected stop of execution
@@ -28,13 +28,14 @@ class Worker(nlp_ws.NLPWorker):
         logging_lvl = config["logging_levels"]["wsd_worker"]
         if logging_lvl:
             logging.basicConfig(level=logging_lvl)
-        _log.info("Worker started loading models")
-        cls.model = wosedon_plugin.WoSeDonPlugin(
-            config['tool']['config_file'], config['tool']['model_dir'])
-        _log.info("Using config: %s and model: %s",
-                  config['tool']['config_file'],
-                  config['tool']['model_dir'])
+        cls.configtool = config['tool']
+        _log.info("Using config: %s", config['tool']['config_file'])

     def init(self):
+        _log.info("Using model: %s", self.configtool['model_dir'])
+        _log.info("Worker started loading models")
+        self.model = wosedon_plugin.WoSeDonPlugin(
+            self.configtool['config_file'], self.configtool['model_dir'])
+        _log.info("Worker finished loading models ")

     def process(self, inputFile, taskOptions, outputFile):