skorch.helper

Helper functions and classes for users.

They should not be used by skorch directly.

class skorch.helper.SliceDict(**kwargs)[source]

Wrapper for Python dict that makes it sliceable across values.

Use this if your input data is a dictionary and you have problems with sklearn not being able to slice it. Wrap your dict with SliceDict and it should usually work.

Note: SliceDict cannot be indexed with integers; to get a single row, say row 3, use the slice [3:4].

Examples

>>> X = {'key0': val0, 'key1': val1}
>>> search = GridSearchCV(net, params, ...)
>>> search.fit(X, y)  # raises error
>>> Xs = SliceDict(key0=val0, key1=val1)  # or Xs = SliceDict(**X)
>>> search.fit(Xs, y)  # works
Attributes:
shape

Methods

clear()
copy()
fromkeys(iterable[, value]) Returns a new dict with keys from iterable and values set to value.
get(k[,d])
items()
keys()
pop(k[,d]) If key is not found, d is returned if given, otherwise KeyError is raised
popitem() Removes and returns a (key, value) pair as a 2-tuple; raises KeyError if D is empty.
setdefault(k[,d])
update([E, ]**F) Updates D from dict/iterable E and F.
values()
update([E, ]**F) → None. Update D from dict/iterable E and F.[source]

If E is present and has a .keys() method: for k in E: D[k] = E[k]
If E is present and lacks a .keys() method: for k, v in E: D[k] = v
In either case, this is followed by: for k in F: D[k] = F[k]
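The slicing behavior described above can be illustrated with a minimal pure-Python sketch. Note that MiniSliceDict below is a hypothetical stand-in for illustration only, not the real skorch class (which additionally supports numpy-style indexing and validates value lengths):

```python
class MiniSliceDict(dict):
    """Hypothetical sketch of SliceDict-style slicing across values."""

    def __getitem__(self, key):
        # string keys behave like a normal dict lookup
        if isinstance(key, str):
            return super().__getitem__(key)
        # anything else (e.g. a slice) is applied to every value,
        # producing a new sliced dict
        return MiniSliceDict(**{k: v[key] for k, v in self.items()})

    @property
    def shape(self):
        # mirror len(X) for array-like inputs: the number of rows
        return (len(next(iter(self.values()))),)


Xs = MiniSliceDict(key0=[[0, 1], [2, 3], [4, 5]], key1=[10, 20, 30])
print(Xs.shape)         # → (3,)
print(Xs[1:3]['key1'])  # → [20, 30]
```

Slicing with `Xs[1:3]` slices every value the same way, which is exactly what sklearn's cross-validation routines need when splitting the data.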

skorch.helper.filter_requires_grad(pgroups)[source]

Returns parameter groups where parameters that don’t require a gradient are filtered out.

Parameters:
pgroups : dict

Parameter groups to be filtered
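How such a filter operates on parameter groups can be sketched as follows. The function filter_grad_sketch and the SimpleNamespace stand-ins for torch parameters are hypothetical illustrations, not skorch's implementation:

```python
from types import SimpleNamespace


def filter_grad_sketch(pgroups):
    """Sketch: drop parameters whose requires_grad flag is False."""
    for group in pgroups:
        out = dict(group)  # keep other group options (e.g. lr) intact
        out['params'] = [p for p in group['params'] if p.requires_grad]
        yield out


# stand-ins for torch parameters (real code would use torch tensors)
frozen = SimpleNamespace(requires_grad=False)
trainable = SimpleNamespace(requires_grad=True)

groups = [{'params': [frozen, trainable], 'lr': 0.01}]
filtered = list(filter_grad_sketch(groups))
print(len(filtered[0]['params']))  # → 1
```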

skorch.helper.filtered_optimizer(optimizer, filter_fn)[source]

Wraps an optimizer so that filter_fn is applied to its parameter groups before the optimizer is initialized. This can be used, for example, to filter out parameters that do not require a gradient:

>>> from skorch.helper import filtered_optimizer, filter_requires_grad
>>> optimizer = filtered_optimizer(torch.optim.SGD, filter_requires_grad)
>>> net = NeuralNetClassifier(module, optimizer=optimizer)
Parameters:
optimizer : torch optim (class)

The uninitialized optimizer that is wrapped

filter_fn : function

Use this function to filter the parameter groups before passing them to the optimizer.
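The wrapping idea can be sketched without torch. DummyOptimizer, filtered_optimizer_sketch, and drop_empty below are hypothetical stand-ins for illustration, not the real skorch code:

```python
class DummyOptimizer:
    """Stand-in for an uninitialized torch optimizer class."""

    def __init__(self, pgroups, lr=0.1):
        self.pgroups = pgroups
        self.lr = lr


def filtered_optimizer_sketch(optimizer_cls, filter_fn):
    # return a factory that filters the parameter groups,
    # then builds the actual optimizer from what remains
    def make(pgroups, **kwargs):
        return optimizer_cls(filter_fn(pgroups), **kwargs)
    return make


# example filter: discard groups that ended up with no parameters
drop_empty = lambda pgroups: [g for g in pgroups if g['params']]

opt_factory = filtered_optimizer_sketch(DummyOptimizer, drop_empty)
opt = opt_factory([{'params': []}, {'params': ['w']}], lr=0.05)
print(len(opt.pgroups), opt.lr)  # → 1 0.05
```

Because the factory has the same call signature as the optimizer class, it can be passed wherever an uninitialized optimizer is expected.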

skorch.helper.predefined_split(dataset)[source]

Uses dataset for validation in NeuralNet.

Parameters:
dataset: torch Dataset

Validation dataset

Examples

>>> valid_ds = skorch.Dataset(X, y)
>>> net = NeuralNet(..., train_split=predefined_split(valid_ds))
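Conceptually, the returned train_split ignores how the incoming data would otherwise be split and always hands back the same predefined validation set. A hypothetical sketch (predefined_split_sketch is an illustration, not the real skorch function):

```python
from functools import partial


def _split_sketch(dataset, y=None, valid_ds=None):
    # use the whole incoming dataset for training
    # and the fixed, predefined one for validation
    return dataset, valid_ds


def predefined_split_sketch(valid_ds):
    """Sketch: build a train_split that always yields the same valid set."""
    return partial(_split_sketch, valid_ds=valid_ds)


train_split = predefined_split_sketch(valid_ds=['v1', 'v2'])
train, valid = train_split(['t1', 't2', 't3'])
print(train, valid)  # → ['t1', 't2', 't3'] ['v1', 'v2']
```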