Item2Vec (Sequential)
> **Warning:** This is an article from the Shaped 1.0 documentation. The APIs have changed and the information may be outdated. See the Shaped 2.0 docs for current information.
Description
The Item2Vec policy adapts the Word2Vec algorithm (CBOW or Skip-gram) to learn item embeddings from user interaction sequences. It captures item co-occurrence patterns within a defined context window, placing items that are frequently interacted with together closer in the embedding space. It primarily models similarity based on co-occurrence rather than strict sequential order.
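A minimal sketch (not Shaped's implementation) of how a context window turns one user's interaction sequence into Word2Vec-style training pairs. The item names and the `context_pairs` helper are illustrative; note that every item within the window counts as context regardless of its exact position, which is why the policy models co-occurrence rather than strict order.

```python
def context_pairs(sequence, window_size):
    """Generate (target, context) pairs from one user's interaction
    sequence. Every item within `window_size` positions of the target
    counts as context, regardless of exact order."""
    pairs = []
    for i, target in enumerate(sequence):
        lo = max(0, i - window_size)
        hi = min(len(sequence), i + window_size + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, sequence[j]))
    return pairs

# One hypothetical session; a window of 2 items on each side.
session = ["shoes", "socks", "laces", "hat", "scarf"]
pairs = context_pairs(session, window_size=2)
# "laces" pairs with "shoes", "socks", "hat", and "scarf" alike,
# even though they appear on different sides of it.
```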
Policy Type: item2vec
Supports: embedding_policy, scoring_policy
Configuration Example
```yaml
policy_configs:
  scoring_policy:
    policy_type: item2vec
    embedding_size: 512 # Dimensionality of the learned item embeddings
    window_size: 20     # Context window size (items before/after)
    min_count: 1        # Minimum frequency for an item to be included
    algorithm: "cbow"   # Training algorithm ('cbow' or 'skipgram')
```
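To illustrate what `min_count` controls, here is a hedged, stdlib-only sketch of vocabulary filtering (the helper name and item data are made up for this example): items interacted with fewer than `min_count` times across all sequences receive no embedding at all.

```python
from collections import Counter

def build_vocab(sequences, min_count):
    """Keep only items that appear at least `min_count` times across
    all interaction sequences; rarer items are dropped before training
    and therefore get no embedding."""
    counts = Counter(item for seq in sequences for item in seq)
    return {item for item, c in counts.items() if c >= min_count}

# Three hypothetical user sessions.
sequences = [
    ["shoes", "socks", "laces"],
    ["shoes", "hat"],
    ["shoes", "socks"],
]
vocab = build_vocab(sequences, min_count=2)
# "laces" and "hat" each appear once, so they are filtered out.
```

With `min_count: 1` as in the example configuration above, every item that appears at least once is kept.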
References
- Mikolov, T., et al. (2013). Efficient Estimation of Word Representations in Vector Space. arXiv. (Foundational Word2Vec paper.)
- Barkan, O., & Koenigstein, N. (2016). Item2Vec: Neural Item Embedding for Collaborative Filtering. arXiv.