Core Module Documentation#

neuralprophet.utils_torch.create_optimizer_from_config(optimizer_name, optimizer_args)#

Translate the optimizer name and arguments into a torch optimizer. If an optimizer object is provided, it is returned as-is. The optimizer is not instantiated at this point, since instantiation is handled by the trainer.

Parameters
  • optimizer_name (str or torch.optim.Optimizer) – Object provided to NeuralProphet as optimizer.

  • optimizer_args (dict) – Arguments for the optimizer.

Returns
  • optimizer (torch.optim.Optimizer) – The optimizer object.

  • optimizer_args (dict) – The optimizer arguments.
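The name-to-optimizer translation can be sketched as follows. This is an illustrative re-implementation, not the library's exact code: the `getattr`-based lookup on `torch.optim` and the helper name `optimizer_from_name` are assumptions.

```python
import torch

def optimizer_from_name(optimizer_name, optimizer_args):
    """Sketch: translate an optimizer name into a torch optimizer class.

    If an optimizer class is passed directly, it is returned as-is.
    The class is not instantiated here; the trainer does that later.
    """
    if isinstance(optimizer_name, type) and issubclass(optimizer_name, torch.optim.Optimizer):
        return optimizer_name, optimizer_args
    optimizer = getattr(torch.optim, str(optimizer_name), None)  # e.g. "AdamW" -> torch.optim.AdamW
    if optimizer is None:
        raise ValueError(f"Unknown optimizer: {optimizer_name}")
    return optimizer, optimizer_args

opt_cls, opt_args = optimizer_from_name("AdamW", {"weight_decay": 1e-3})
```

The returned class can then be instantiated by the trainer as `opt_cls(model.parameters(), **opt_args)`.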

neuralprophet.utils_torch.init_parameter(dims)#

Create and initialize a new torch Parameter.

Parameters

dims (list or tuple) – Desired dimensions of parameter

Returns

Initialized parameter.

Return type

nn.Parameter
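A minimal sketch of such a helper, assuming a Xavier-normal initialization scheme (the actual scheme used by `init_parameter` may differ; the name `init_parameter_sketch` is hypothetical).

```python
import torch
import torch.nn as nn

def init_parameter_sketch(dims):
    """Create and initialize a new torch Parameter (illustrative sketch)."""
    if len(dims) > 1:
        return nn.Parameter(nn.init.xavier_normal_(torch.randn(dims)), requires_grad=True)
    # Xavier init needs at least 2 dims: add a leading dim, then squeeze it out.
    return nn.Parameter(
        nn.init.xavier_normal_(torch.randn([1] + list(dims))).squeeze(0), requires_grad=True
    )

p = init_parameter_sketch([3, 4])  # nn.Parameter of shape (3, 4)
```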

neuralprophet.utils_torch.interprete_model(target_model: pytorch_lightning.core.module.LightningModule, net: str, forward_func: str, _num_in_features: Optional[int] = None, _num_out_features: Optional[int] = None, _input: Optional[torch.Tensor] = None, additional_forward_args: Optional[Any] = None)#

Returns model input attributions for a given network and forward function.

Parameters
  • target_model (pl.LightningModule) – The model for which input attributions are to be computed.

  • net (str) – Name of the network for which input attributions are to be computed.

  • forward_func (str) – Name of the forward function for which input attributions are to be computed.

  • _input (torch.Tensor) – Input for which the attributions are to be computed.

Returns

Input attributions for the given network and forward function.

Return type

torch.Tensor
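The kind of input attribution returned here can be illustrated with a plain gradient (saliency) computation on a small network. This is only an analogy: the attribution technique NeuralProphet applies internally may differ, and the toy network below is hypothetical.

```python
import torch
import torch.nn as nn

# Hypothetical toy network standing in for a NeuralProphet sub-network.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

# Saliency-style attributions: gradient of the output w.r.t. the input.
x = torch.randn(2, 4, requires_grad=True)
net(x).sum().backward()
attributions = x.grad  # same shape as the input, one attribution per input feature
```

Larger-magnitude entries in `attributions` indicate inputs to which the forward function's output is more sensitive.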