Capture standard output when loading the predictor
This commit fixes a bug where we failed to flush the StreamRedirector
when catching an exception during the loading of the predictor module.
We now use the existing `_handle_setup_error` function to ensure that
the streams are flushed. I've kept the naming of the context manager
the same because this all happens as part of the model setup.
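The shape of the fix can be sketched as below. This is a hypothetical, self-contained sketch: the stand-in `StreamRedirector`, `_handle_setup_error`, and `capture_setup_output` bodies are assumptions for illustration, not the real implementations, and the actual signatures may differ.

```python
# Hypothetical sketch of the fix. StreamRedirector and
# _handle_setup_error mirror the names in the commit, but their
# bodies here are minimal stand-ins, not the real code.
import contextlib
import io
import sys


class StreamRedirector:
    """Minimal stand-in: captures writes to stdout into a buffer."""

    def __init__(self) -> None:
        self.buffer = io.StringIO()
        self._original = None

    def start(self) -> None:
        self._original = sys.stdout
        sys.stdout = self.buffer

    def flush(self) -> None:
        # In the real code, flushing drains any buffered output to the
        # log consumer; here we just flush the in-memory buffer.
        self.buffer.flush()

    def stop(self) -> None:
        sys.stdout = self._original


def _handle_setup_error(redirector: StreamRedirector, exc: BaseException) -> None:
    # The key point of the fix: flush (and tear down) the redirector
    # even when setup fails, so captured output is not lost.
    redirector.flush()
    redirector.stop()
    print(f"setup failed: {exc}", file=sys.stderr)


@contextlib.contextmanager
def capture_setup_output():
    redirector = StreamRedirector()
    redirector.start()
    try:
        yield redirector
    except Exception as exc:
        # Before the fix, this except path skipped the flush, dropping
        # any output the predictor module printed while loading.
        _handle_setup_error(redirector, exc)
        raise
    else:
        redirector.flush()
        redirector.stop()
```

With this shape, output printed while the predictor module loads is preserved whether loading succeeds or raises, which is what the regression tests exercise.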
Two regression tests have been added that reproduce the issue and
verify the fix, in both normal and concurrent/async modes.