ELSA (Autoencoder)

Description

The ELSA (Scalable Linear Shallow Autoencoder) policy implements a shallow linear autoencoder for implicit-feedback collaborative filtering. It learns item-item relationships by reconstructing each user's interaction vector. To scale better than full item-item models such as EASE, it replaces the dense item-item weight matrix with a factorized (low-rank plus sparse) structure.

Policy Type: elsa
Supports: embedding_policy, scoring_policy
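
To make the factorized reconstruction concrete, here is a minimal PyTorch-style sketch (illustrative only; the class name, shapes, and initialization are assumptions, not this library's elsa implementation). The dense item-item weight matrix of EASE is replaced by A·Aᵀ − I, where A is an n_items × factors matrix with L2-normalized rows:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ELSASketch(nn.Module):
    """Illustrative sketch of the ELSA reconstruction, not the actual policy code."""

    def __init__(self, n_items: int, factors: int):
        super().__init__()
        # Low-rank item factor matrix: n_items x factors.
        self.A = nn.Parameter(torch.randn(n_items, factors) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Row-normalize A so that diag(A @ A.T) == 1; subtracting x below
        # then removes trivial self-reconstruction (EASE's zero diagonal).
        A = F.normalize(self.A, dim=-1)
        # x @ (A @ A.T - I) computed as (x @ A) @ A.T - x, which never
        # materializes the dense n_items x n_items item-item matrix.
        return (x @ A) @ A.T - x

Because only the n_items × factors matrix is stored, memory and compute grow linearly with the catalog size rather than quadratically as with EASE's full weight matrix.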

Hyperparameter tuning

  • batch_size: Number of samples processed before updating model weights.
  • n_epochs: Number of complete passes through the training dataset.
  • factors: Rank of the low-rank item factor matrix (number of latent factors).
  • lr: Learning rate for gradient descent optimization.
  • device: Compute device used for training (e.g., cpu or cuda).
  • strategy: Training strategy for the underlying trainer (e.g., single-device or distributed).
  • patience: Number of epochs to wait without improvement before early stopping.

V1 API

policy_configs:
  embedding_policy: # Or scoring_policy
    policy_type: elsa
    # Training Hyperparameters
    batch_size: 512 # Samples per training batch
    n_epochs: 20 # Number of training epochs
    lr: 0.1 # Learning rate
    # Model Hyperparameters
    factors: 10 # Rank (number of latent factors) for the low-rank part
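
The training-loop sketch below (hypothetical: synthetic data, a plain MSE loss for brevity, and invented variable names, not this library's trainer) shows how batch_size, n_epochs, lr, and patience interact:

import torch
import torch.nn.functional as F

# Synthetic binary interaction matrix standing in for real implicit feedback.
n_users, n_items, factors = 1000, 500, 10
X = (torch.rand(n_users, n_items) < 0.05).float()

A = torch.nn.Parameter(torch.randn(n_items, factors) * 0.01)
opt = torch.optim.Adam([A], lr=0.1)                    # lr

best_loss, bad_epochs, patience = float("inf"), 0, 3   # patience
for epoch in range(20):                                # n_epochs
    for x in X.split(512):                             # batch_size rows per step
        An = F.normalize(A, dim=-1)
        x_hat = (x @ An) @ An.T - x                    # factorized reconstruction
        # Plain MSE for brevity; the ELSA paper uses a normalized (cosine) loss.
        loss = F.mse_loss(x_hat, x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Early stopping on training loss; a real setup would track a validation metric.
    if loss.item() < best_loss - 1e-5:
        best_loss, bad_epochs = loss.item(), 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break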

Usage

Use this model when:

  • You have large-scale datasets and need scalability
  • You're working with implicit feedback data
  • You want better scalability than EASE while maintaining similar performance
  • You need efficient item-item similarity computation (see the sketch after this list)
  • You want a modern autoencoder approach for collaborative filtering
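
As referenced in the list above, item-item similarity comes almost for free from the trained factor matrix. A small sketch (a random matrix stands in for trained factors; the helper name is made up): with row-normalized A, A @ A.T is a cosine-similarity matrix, and top-k neighbors per item can be computed without ever materializing it.

import torch
import torch.nn.functional as F

# Stand-in for a trained factor matrix with L2-normalized rows.
A = F.normalize(torch.randn(500, 10), dim=-1)

def top_k_similar(item: int, k: int = 5) -> torch.Tensor:
    scores = A @ A[item]           # one row of A @ A.T: O(n_items * factors)
    scores[item] = -float("inf")   # exclude the query item itself
    return scores.topk(k).indices

print(top_k_similar(42))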

Choose a different model when:

  • You have very small datasets (EASE might be simpler)
  • You need to incorporate item content features (use Two-Tower or BeeFormer)
  • You want to model sequential patterns (use sequential models)
  • You have explicit feedback only (SVD might be more appropriate)

Use cases

  • Large e-commerce platforms with millions of products
  • Content streaming services with extensive catalogs
  • Social media feed recommendations
  • Any large-scale implicit feedback recommendation system

Reference

Vančura, V., Alves, R., Kasalický, P., & Kordík, P. (2022). Scalable Linear Shallow Autoencoder for Collaborative Filtering. In Proceedings of the 16th ACM Conference on Recommender Systems (RecSys '22).