A scikit-learn compatible neural network library that wraps PyTorch.
The goal of skorch is to make it possible to use PyTorch with sklearn. This is achieved by providing a wrapper around PyTorch that has an sklearn interface. In that sense, skorch is the spiritual successor to nolearn, but instead of using Lasagne and Theano, it uses PyTorch.
skorch does not re-invent the wheel; instead, it gets out of your way as much as possible. If you are familiar with sklearn and PyTorch, you don't have to learn any new concepts, and the syntax should be well known. (If you're not familiar with those libraries, it is worth getting familiar with them.)
Additionally, skorch abstracts away the training loop, making a lot of boilerplate code obsolete. A simple `net.fit(X, y)` is enough. Out of the box, skorch works with many types of data, be it PyTorch Tensors, NumPy arrays, Python dicts, and so on. However, if you have other data, extending skorch to support it is easy.

Overall, skorch aims at being as flexible as PyTorch while having as clean an interface as sklearn.
- Saving and Loading
- REST Service
- How do I apply L2 regularization?
- How can I continue training my model?
- How do I shuffle my train batches?
- How do I use sklearn GridSearchCV when my data is in a dictionary?
- How do I use sklearn GridSearchCV when my data is in a dataset?
- I want to use sample_weight, how can I do this?
- I already split my data into training and validation sets, how can I use them?
- What happens when NeuralNet is passed an initialized PyTorch module?
- How do I use a PyTorch Dataset with skorch?
- How can I deal with multiple return values from forward?
- How can I perform gradient accumulation with skorch?
- How can I dynamically set the input size of the PyTorch module based on the data?