nbeats

Classes

NBeatsBaseModel(net[, ...])

Base class for N-BEATS models.

NBeatsGenericModel(input_size, output_size)

Generic N-BEATS model.

NBeatsInterpretableModel(input_size, output_size)

Interpretable N-BEATS model.

class NBeatsBaseModel(net: etna.models.nn.nbeats.nets.NBeatsBaseNet, window_sampling_limit: Optional[int] = None, train_batch_size: int = 1024, test_batch_size: int = 1024, trainer_params: Optional[dict] = None, train_dataloader_params: Optional[dict] = None, test_dataloader_params: Optional[dict] = None, val_dataloader_params: Optional[dict] = None, split_params: Optional[dict] = None, random_state: Optional[int] = None)[source]

Base class for N-BEATS models.

Init NBeatsBaseModel.

Parameters
  • net (NBeatsBaseNet) – network to train

  • train_batch_size (int) – batch size for training

  • test_batch_size (int) – batch size for testing

  • trainer_params (Optional[dict]) – PyTorch Lightning trainer parameters (API reference: pytorch_lightning.trainer.trainer.Trainer)

  • train_dataloader_params (Optional[dict]) – parameters for the train dataloader, e.g. a sampler (API reference: torch.utils.data.DataLoader)

  • test_dataloader_params (Optional[dict]) – parameters for test dataloader

  • val_dataloader_params (Optional[dict]) – parameters for validation dataloader

  • split_params (Optional[dict]) –

    dictionary with parameters for torch.utils.data.random_split() for train-test splitting (a minimal sketch follows this parameter list):
    • train_size: (float) value from 0 to 1 - fraction of samples to use for training

    • generator: (Optional[torch.Generator]) - generator for reproducible train-test splitting

    • torch_dataset_size: (Optional[int]) - number of samples in the dataset, in case the dataset does not implement __len__

  • window_sampling_limit (Optional[int]) – size of history for sampling training data; if None, the full series history is used for sampling

  • random_state (Optional[int]) – random state for train batches generation
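
A minimal sketch of a split_params dictionary, assuming only the keys described above; the split fraction and the seed are illustrative:

>>> import torch
>>> split_params = {
...     "train_size": 0.8,  # use 80% of samples for training
...     "generator": torch.Generator().manual_seed(42),  # reproducible split
... }

Such a dictionary can be passed as split_params to any of the N-BEATS models on this page.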

fit(ts: etna.datasets.tsdataset.TSDataset) etna.models.base.DeepBaseModel

Fit model.

Parameters

ts (etna.datasets.tsdataset.TSDataset) – TSDataset with features

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

forecast(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make autoregressive predictions.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns forecast components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset

get_model() etna.models.base.DeepBaseNet

Get model.

Returns

Torch Module

Return type

etna.models.base.DeepBaseNet

classmethod load(path: pathlib.Path) typing_extensions.Self

Load an object.

Warning

This method uses the dill module, which is not secure. It is possible to construct malicious data that will execute arbitrary code during loading. Never load data that could have come from an untrusted source, or that could have been tampered with.

Parameters

path (pathlib.Path) – Path to load object from.

Returns

Loaded object.

Return type

typing_extensions.Self
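
A minimal save/load round trip; here model stands for any fitted N-BEATS model from this page, and the file name is illustrative:

>>> import pathlib
>>> path = pathlib.Path("nbeats_model.zip")  # illustrative file name
>>> model.save(path)
>>> loaded_model = NBeatsGenericModel.load(path)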

params_to_tune() Dict[str, etna.distributions.distributions.BaseDistribution]

Get grid for tuning hyperparameters.

This is the default implementation with an empty grid.

Returns

Empty grid.

Return type

Dict[str, etna.distributions.distributions.BaseDistribution]

predict(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make predictions using true values instead of the predictions from previous steps. It can be useful for making in-sample forecasts.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns prediction components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset
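
A sketch of in-sample prediction, assuming ts is the dataset the model was fitted on and that it is longer than the model's context; the sizing is illustrative:

>>> prediction = model.predict(ts, prediction_size=len(ts.index) - model.context_size)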

raw_fit(torch_dataset: torch.utils.data.dataset.Dataset) etna.models.base.DeepBaseModel

Fit model on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model fitting

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

raw_predict(torch_dataset: torch.utils.data.dataset.Dataset) Dict[Tuple[str, str], numpy.ndarray]

Make inference on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model inference

Returns

Dictionary with predictions

Return type

Dict[Tuple[str, str], numpy.ndarray]

save(path: pathlib.Path)

Save the object.

Parameters

path (pathlib.Path) – Path to save object to.

set_params(**params: dict) etna.core.mixins.TMixin

Return new object instance with modified parameters.

The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model inside a Pipeline.

Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by a dot.

Parameters
  • **params – Estimator parameters

Returns

New instance with changed parameters

Return type

etna.core.mixins.TMixin

Examples

>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )

to_dict()

Collect all information about the etna object into a dict.

property context_size: int

Context size of the model.

class NBeatsGenericModel(input_size: int, output_size: int, loss: Union[Literal['mse'], Literal['mae'], Literal['smape'], Literal['mape'], torch.nn.modules.module.Module] = 'mse', stacks: int = 30, layers: int = 4, layer_size: int = 512, lr: float = 0.001, window_sampling_limit: Optional[int] = None, optimizer_params: Optional[dict] = None, train_batch_size: int = 1024, test_batch_size: int = 1024, trainer_params: Optional[dict] = None, train_dataloader_params: Optional[dict] = None, test_dataloader_params: Optional[dict] = None, val_dataloader_params: Optional[dict] = None, split_params: Optional[dict] = None, random_state: Optional[int] = None)[source]

Generic N-BEATS model.

Paper: https://arxiv.org/pdf/1905.10437.pdf

Official implementation: https://github.com/ServiceNow/N-BEATS

Init generic N-BEATS model.

Parameters
  • input_size (int) – Input data size.

  • output_size (int) – Forecast size.

  • loss (Union[Literal['mse'], Literal['mae'], Literal['smape'], Literal['mape'], torch.nn.Module]) – Optimisation objective. The loss function should accept three arguments: y_true, y_pred and mask. The last argument is a binary mask that denotes which points are valid forecasts. Several loss functions are implemented in the etna.models.nn.nbeats.metrics module; a sketch of a custom loss follows this parameter list.

  • stacks (int) – Number of block stacks in model.

  • layers (int) – Number of inner layers in each block.

  • layer_size (int) – Inner layers size in blocks.

  • lr (float) – Optimizer learning rate.

  • window_sampling_limit (Optional[int]) – Size of history for sampling training data. If set to None, the full series history is used for sampling.

  • optimizer_params (Optional[dict]) – Additional parameters for the Adam optimizer (API reference: torch.optim.Adam).

  • train_batch_size (int) – Batch size for training.

  • test_batch_size (int) – Batch size for testing.

  • trainer_params (Optional[dict]) – PyTorch Lightning trainer parameters (API reference: pytorch_lightning.trainer.trainer.Trainer).

  • train_dataloader_params (Optional[dict]) – Parameters for the train dataloader, e.g. a sampler (API reference: torch.utils.data.DataLoader).

  • test_dataloader_params (Optional[dict]) – Parameters for test dataloader.

  • val_dataloader_params (Optional[dict]) – Parameters for validation dataloader.

  • split_params (Optional[dict]) –

    Dictionary with parameters for torch.utils.data.random_split() for train-test splitting
    • train_size: (float) value from 0 to 1 - fraction of samples to use for training

    • generator: (Optional[torch.Generator]) - generator for reproducible train-test splitting

    • torch_dataset_size: (Optional[int]) - number of samples in the dataset, in case the dataset does not implement __len__

  • random_state (Optional[int]) – Random state for train batches generation.
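
A minimal sketch of constructing the model with a custom loss that follows the (y_true, y_pred, mask) contract described above; the loss class and all hyperparameter values are illustrative, not part of the library:

>>> import torch
>>> import torch.nn as nn
>>> class MaskedMAE(nn.Module):
...     """Illustrative masked MAE: averages the error over valid points only."""
...     def forward(self, y_true, y_pred, mask):
...         return torch.sum(torch.abs(y_true - y_pred) * mask) / torch.clamp(mask.sum(), min=1.0)
>>> model = NBeatsGenericModel(
...     input_size=4 * 7,  # four weeks of daily history as context
...     output_size=7,     # forecast one week ahead
...     loss=MaskedMAE(),
...     trainer_params={"max_epochs": 100},
... )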

fit(ts: etna.datasets.tsdataset.TSDataset) etna.models.base.DeepBaseModel

Fit model.

Parameters

ts (etna.datasets.tsdataset.TSDataset) – TSDataset with features

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

forecast(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make autoregressive predictions.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns forecast components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset
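
A sketch of the fit/forecast flow, assuming ts is a TSDataset and following the usual etna pattern for deep models, where the future dataset must carry context_size trailing points of history:

>>> HORIZON = 7
>>> model.fit(ts)
>>> future = ts.make_future(future_steps=HORIZON, tail_steps=model.context_size)
>>> forecast = model.forecast(future, prediction_size=HORIZON)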

get_model() etna.models.base.DeepBaseNet

Get model.

Returns

Torch Module

Return type

etna.models.base.DeepBaseNet

classmethod load(path: pathlib.Path) typing_extensions.Self

Load an object.

Warning

This method uses the dill module, which is not secure. It is possible to construct malicious data that will execute arbitrary code during loading. Never load data that could have come from an untrusted source, or that could have been tampered with.

Parameters

path (pathlib.Path) – Path to load object from.

Returns

Loaded object.

Return type

typing_extensions.Self

params_to_tune() Dict[str, etna.distributions.distributions.BaseDistribution][source]

Get default grid for tuning hyperparameters.

This grid tunes parameters: stacks, layers, lr, layer_size. Other parameters are expected to be set by the user.

Returns

Grid to tune.

Return type

Dict[str, etna.distributions.distributions.BaseDistribution]
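
The default grid can be inspected directly; its keys are the four parameters named above:

>>> grid = model.params_to_tune()
>>> for name, distribution in grid.items():
...     print(name, distribution)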

predict(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make predictions using true values instead of the predictions from previous steps. It can be useful for making in-sample forecasts.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns prediction components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset

raw_fit(torch_dataset: torch.utils.data.dataset.Dataset) etna.models.base.DeepBaseModel

Fit model on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model fitting

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

raw_predict(torch_dataset: torch.utils.data.dataset.Dataset) Dict[Tuple[str, str], numpy.ndarray]

Make inference on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model inference

Returns

Dictionary with predictions

Return type

Dict[Tuple[str, str], numpy.ndarray]

save(path: pathlib.Path)

Save the object.

Parameters

path (pathlib.Path) – Path to save object to.

set_params(**params: dict) etna.core.mixins.TMixin

Return new object instance with modified parameters.

The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model inside a Pipeline.

Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by a dot.

Parameters
  • **params – Estimator parameters

Returns

New instance with changed parameters

Return type

etna.core.mixins.TMixin

Examples

>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )

to_dict()

Collect all information about the etna object into a dict.

property context_size: int

Context size of the model.

class NBeatsInterpretableModel(input_size: int, output_size: int, loss: Union[Literal['mse'], Literal['mae'], Literal['smape'], Literal['mape'], torch.nn.modules.module.Module] = 'mse', trend_blocks: int = 3, trend_layers: int = 4, trend_layer_size: int = 256, degree_of_polynomial: int = 2, seasonality_blocks: int = 3, seasonality_layers: int = 4, seasonality_layer_size: int = 2048, num_of_harmonics: int = 1, lr: float = 0.001, window_sampling_limit: Optional[int] = None, optimizer_params: Optional[dict] = None, train_batch_size: int = 1024, test_batch_size: int = 1024, trainer_params: Optional[dict] = None, train_dataloader_params: Optional[dict] = None, test_dataloader_params: Optional[dict] = None, val_dataloader_params: Optional[dict] = None, split_params: Optional[dict] = None, random_state: Optional[int] = None)[source]

Interpretable N-BEATS model.

Paper: https://arxiv.org/pdf/1905.10437.pdf

Official implementation: https://github.com/ServiceNow/N-BEATS

Init interpretable N-BEATS model.

Parameters
  • input_size (int) – Input data size.

  • output_size (int) – Forecast size.

  • loss (Union[Literal['mse'], Literal['mae'], Literal['smape'], Literal['mape'], torch.nn.Module]) – Optimisation objective. The loss function should accept three arguments: y_true, y_pred and mask. The last argument is a binary mask that denotes which points are valid forecasts. Several loss functions are implemented in the etna.models.nn.nbeats.metrics module; see the custom loss sketch under NBeatsGenericModel above.

  • trend_blocks (int) – Number of trend blocks.

  • trend_layers (int) – Number of inner layers in each trend block.

  • trend_layer_size (int) – Inner layer size in trend blocks.

  • degree_of_polynomial (int) – Polynomial degree for trend modeling.

  • seasonality_blocks (int) – Number of seasonality blocks.

  • seasonality_layers (int) – Number of inner layers in each seasonality block.

  • seasonality_layer_size (int) – Inner layer size in seasonality blocks.

  • num_of_harmonics (int) – Number of harmonics for seasonality estimation.

  • lr (float) – Optimizer learning rate.

  • window_sampling_limit (Optional[int]) – Size of history for sampling training data. If set to None, the full series history is used for sampling.

  • optimizer_params (Optional[dict]) – Additional parameters for the Adam optimizer (API reference: torch.optim.Adam).

  • train_batch_size (int) – Batch size for training.

  • test_batch_size (int) – Batch size for testing.

  • trainer_params (Optional[dict]) – PyTorch Lightning trainer parameters (API reference: pytorch_lightning.trainer.trainer.Trainer).

  • train_dataloader_params (Optional[dict]) – Parameters for the train dataloader, e.g. a sampler (API reference: torch.utils.data.DataLoader).

  • test_dataloader_params (Optional[dict]) – Parameters for test dataloader.

  • val_dataloader_params (Optional[dict]) – Parameters for validation dataloader.

  • split_params (Optional[dict]) –

    Dictionary with parameters for torch.utils.data.random_split() for train-test splitting
    • train_size: (float) value from 0 to 1 - fraction of samples to use for training

    • generator: (Optional[torch.Generator]) - generator for reproducible train-test splitting

    • torch_dataset_size: (Optional[int]) - number of samples in the dataset, in case the dataset does not implement __len__

  • random_state (Optional[int]) – Random state for train batches generation.
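
A minimal sketch of an interpretable model for daily data with weekly seasonality; all hyperparameter values are illustrative:

>>> model = NBeatsInterpretableModel(
...     input_size=4 * 7,        # four weeks of daily history as context
...     output_size=7,           # forecast one week ahead
...     degree_of_polynomial=2,  # quadratic trend basis
...     num_of_harmonics=1,      # fundamental weekly harmonic
...     lr=1e-3,
...     trainer_params={"max_epochs": 100},
... )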

fit(ts: etna.datasets.tsdataset.TSDataset) etna.models.base.DeepBaseModel

Fit model.

Parameters

ts (etna.datasets.tsdataset.TSDataset) – TSDataset with features

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

forecast(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make autoregressive predictions.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns forecast components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset

get_model() etna.models.base.DeepBaseNet

Get model.

Returns

Torch Module

Return type

etna.models.base.DeepBaseNet

classmethod load(path: pathlib.Path) typing_extensions.Self

Load an object.

Warning

This method uses the dill module, which is not secure. It is possible to construct malicious data that will execute arbitrary code during loading. Never load data that could have come from an untrusted source, or that could have been tampered with.

Parameters

path (pathlib.Path) – Path to load object from.

Returns

Loaded object.

Return type

typing_extensions.Self

params_to_tune() Dict[str, etna.distributions.distributions.BaseDistribution][source]

Get default grid for tuning hyperparameters.

This grid tunes parameters: trend_blocks, trend_layers, trend_layer_size, degree_of_polynomial, seasonality_blocks, seasonality_layers, seasonality_layer_size, lr. Other parameters are expected to be set by the user.

Returns

Grid to tune.

Return type

Dict[str, etna.distributions.distributions.BaseDistribution]

predict(ts: etna.datasets.tsdataset.TSDataset, prediction_size: int, return_components: bool = False) etna.datasets.tsdataset.TSDataset

Make predictions.

This method will make predictions using true values instead of the predictions from previous steps. It can be useful for making in-sample forecasts.

Parameters
  • ts (etna.datasets.tsdataset.TSDataset) – Dataset with features and expected decoder length for context

  • prediction_size (int) – Number of last timestamps to leave after making prediction. Previous timestamps will be used as a context.

  • return_components (bool) – If True, additionally returns prediction components

Returns

Dataset with predictions

Return type

etna.datasets.tsdataset.TSDataset

raw_fit(torch_dataset: torch.utils.data.dataset.Dataset) etna.models.base.DeepBaseModel

Fit model on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model fitting

Returns

Model after fit

Return type

etna.models.base.DeepBaseModel

raw_predict(torch_dataset: torch.utils.data.dataset.Dataset) Dict[Tuple[str, str], numpy.ndarray]

Make inference on a torch-like Dataset.

Parameters

torch_dataset (torch.utils.data.dataset.Dataset) – torch-like dataset for model inference

Returns

Dictionary with predictions

Return type

Dict[Tuple[str, str], numpy.ndarray]

save(path: pathlib.Path)

Save the object.

Parameters

path (pathlib.Path) – Path to save object to.

set_params(**params: dict) etna.core.mixins.TMixin

Return new object instance with modified parameters.

The method also allows changing parameters of nested objects within the current object. For example, it is possible to change parameters of a model inside a Pipeline.

Nested parameters are expected to be in a <component_1>.<...>.<parameter> form, where components are separated by a dot.

Parameters
  • **params – Estimator parameters

Returns

New instance with changed parameters

Return type

etna.core.mixins.TMixin

Examples

>>> from etna.pipeline import Pipeline
>>> from etna.models import NaiveModel
>>> from etna.transforms import AddConstTransform
>>> model = NaiveModel(lag=1)
>>> transforms = [AddConstTransform(in_column="target", value=1)]
>>> pipeline = Pipeline(model, transforms=transforms, horizon=3)
>>> pipeline.set_params(**{"model.lag": 3, "transforms.0.value": 2})
Pipeline(model = NaiveModel(lag = 3, ), transforms = [AddConstTransform(in_column = 'target', value = 2, inplace = True, out_column = None, )], horizon = 3, )

to_dict()

Collect all information about the etna object into a dict.

property context_size: int

Context size of the model.