Changes in IPython Parallel


Small encoding fix for Python 2.


Due to a compatibility change and semver, this is a major release. However, it is not a big release. The main compatibility change is that all timestamps are now timezone-aware UTC timestamps. This means you may see comparison errors if you have code that uses datetime objects without timezone info (so-called naïve datetime objects).
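The failure mode is easy to reproduce with the standard library alone; a minimal illustration (no ipyparallel objects involved) of why comparing a naïve datetime against an aware one breaks, and how to repair the naïve value:

```python
from datetime import datetime, timezone

aware = datetime.now(timezone.utc)  # timezone-aware, as ipyparallel timestamps now are
naive = datetime.utcnow()           # naive: same wall-clock time, but no tzinfo attached

# Comparing naive and aware datetimes raises a TypeError:
try:
    naive < aware
except TypeError as e:
    print(e)  # can't compare offset-naive and offset-aware datetimes

# Fix: attach UTC tzinfo to the naive value before comparing
fixed = naive.replace(tzinfo=timezone.utc)
print(fixed <= datetime.now(timezone.utc))  # True
```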

Other fixes:

  • Rename Client.become_distributed() to Client.become_dask(). become_distributed() remains as an alias.
  • import joblib from a public API instead of a private one when using IPython Parallel as a joblib backend.
  • Compatibility fix in extensions for security changes in notebook 4.3


  • Fix compatibility with changes in ipykernel 4.3, 4.4
  • Improve inspection of @remote decorated functions
  • Client.wait() accepts any Future.
  • Add --user flag to ipcluster nbextension
  • Default to one core per worker in Client.become_distributed(). Override by specifying the ncores keyword argument.
  • Subprocess logs are no longer sent to files by default in ipcluster.


dask, joblib

IPython Parallel 5.1 adds integration with other parallel computing tools, such as dask.distributed and joblib.

To turn an IPython cluster into a dask.distributed cluster, call become_distributed():

executor = client.become_distributed(ncores=1)

which returns a distributed Executor instance.

To register IPython Parallel as the backend for joblib:

import ipyparallel as ipp

ipp.register_joblib_backend()


IPython Parallel now supports the notebook 4.2 API for enabling server extensions, to provide the IPython clusters tab:

jupyter serverextension enable --py ipyparallel
jupyter nbextension install --py ipyparallel
jupyter nbextension enable --py ipyparallel

though you can still use the more convenient single call:

ipcluster nbextension enable

which does all three steps above.

Slurm support

Slurm support is added to ipcluster.



5.0.1 on GitHub

  • Fix imports in use_cloudpickle(), use_dill().
  • Various typos and documentation updates to catch up with 5.0.


5.0 on GitHub

The highlight of ipyparallel 5.0 is that the Client has been reorganized a bit to use Futures. AsyncResults are now a Future subclass, so they can be yielded in coroutines, etc. Views have also received an Executor interface. This rewrite better connects results to their handles, so the Client.results cache should no longer grow unbounded.

See also

  • The Executor API ipyparallel.ViewExecutor
  • Creating an Executor from a Client: ipyparallel.Client.executor()
  • Each View has an executor attribute
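
The Executor interface here is the standard concurrent.futures.Executor API: submit() returns a Future, and map() yields results in order. As a cluster-free stand-in, the same calls against the stdlib ThreadPoolExecutor (which a View's executor mirrors):

```python
from concurrent.futures import ThreadPoolExecutor

# ThreadPoolExecutor stands in for a View's executor here;
# the calling pattern is the same concurrent.futures.Executor API.
with ThreadPoolExecutor(max_workers=2) as ex:
    fut = ex.submit(pow, 2, 10)            # submit() returns a Future
    print(fut.result())                    # 1024
    print(list(ex.map(abs, [-1, -2, 3])))  # [1, 2, 3]
```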

Part of the Future refactor is that Client IO is now handled in a background thread, which means that Client.spin_thread() is obsolete and deprecated.

Other changes:

  • Add ipcluster nbextension enable|disable to toggle the clusters tab in Jupyter notebook

Less interesting development changes for users:

Some IPython-parallel extensions to the IPython kernel have been moved to the ipyparallel package:

  • ipykernel.datapub is now ipyparallel.datapub
  • ipykernel Python serialization is now in ipyparallel.serialize
  • apply_request message handling is implemented in a Kernel subclass, rather than the base ipykernel Kernel.


4.1 on GitHub

  • Add Client.wait_interactive()
  • Improvements for specifying engines with SSH launcher.


4.0 on GitHub

First release of ipyparallel as a standalone package.