Hyperparameter tuning
Shaped automatically tunes model hyperparameters to optimize performance. You can configure hyperparameter tuning by specifying exact values, setting min/max ranges for partial tuning, or letting Shaped handle all tuning automatically.
Autotuning by default
If you don't specify hyperparameters in your model configuration, Shaped automatically tunes all tunable hyperparameters. This applies to all model policies.
training:
  models:
    - name: my_model
      policy_type: lightgbm
      # No hyperparameters specified - all will be autotuned
Event values filtering
At minimum, you can specify event_values to filter which interaction events are used for training. This is useful when your interaction table contains multiple event types and you want to train on specific events.
training:
  models:
    - name: purchase_model
      policy_type: lightgbm
      event_values:
        - purchase
        - checkout_complete
Partial tuning
Many hyperparameters support partial tuning: specify min and max values, and Shaped tunes the parameter within that range.
LightGBM example
training:
  models:
    - name: lightgbm_model
      policy_type: lightgbm
      max_depth:
        type: tunable_int
        min: -1
        max: 10
      num_leaves:
        type: tunable_int
        min: 20
        max: 40
      learning_rate:
        type: tunable_float
        min: 0.001
        max: 0.1
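Note that in LightGBM itself, a max_depth of -1 disables the depth limit entirely, which is why the range above starts below zero: it lets the tuner consider both unconstrained and shallow trees.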
ELSA example
training:
  models:
    - name: elsa_model
      policy_type: elsa
      factors:
        type: tunable_int
        min: 10
        max: 200
      lr:
        type: tunable_float
        min: 0.01
        max: 0.1
BERT4Rec example
training:
  models:
    - name: bert4rec_model
      policy_type: bert4rec
      batch_size:
        type: tunable_int
        min: 32
        max: 2048
      learning_rate:
        type: tunable_float
        min: 0.0001
        max: 0.1
      attn_dropout_prob:
        type: tunable_float
        min: 0.0
        max: 0.5
      hidden_dropout_prob:
        type: tunable_float
        min: 0.0
        max: 0.5
Combining approaches
You can combine exact values, partial tuning, and autotuning in the same model configuration. Any hyperparameters not specified will be autotuned.
training:
  models:
    - name: tuned_model
      policy_type: lightgbm
      event_values:
        - click
        - view
      max_depth:
        type: tunable_int
        min: 5
        max: 8
      # learning_rate, num_leaves, and other hyperparameters will be autotuned
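The example above combines event filtering, partial tuning, and autotuning, but pins nothing to an exact value. Below is a minimal sketch that also fixes one hyperparameter, assuming exact values are written as plain scalars (verify this syntax against your Shaped version):

training:
  models:
    - name: tuned_model
      policy_type: lightgbm
      event_values:
        - click
        - view
      learning_rate: 0.05  # assumed plain-scalar syntax for an exact, untuned value
      max_depth:
        type: tunable_int
        min: 5
        max: 8
      # num_leaves and other unspecified hyperparameters will be autotuned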
End-to-end example
This example trains an ELSA model and a LightGBM model with hyperparameter tuning, then combines their scores in a single query.
Engine configuration
data:
  item_table:
    name: products
    type: table
  user_table:
    name: users
    type: table
  interaction_table:
    name: interactions
    type: table
training:
  models:
    - name: elsa_model
      policy_type: elsa
      factors:
        type: tunable_int
        min: 10
        max: 200
      lr:
        type: tunable_float
        min: 0.01
        max: 0.1
    - name: lightgbm_model
      policy_type: lightgbm
      event_values:
        - click
        - purchase
      max_depth:
        type: tunable_int
        min: 5
        max: 10
      num_leaves:
        type: tunable_int
        min: 20
        max: 40
      learning_rate:
        type: tunable_float
        min: 0.001
        max: 0.1
Query with combined scoring
Use a value model expression in the score stage to combine outputs from both models:
queries:
  personalized_feed:
    query:
      type: rank
      from: item
      retrieve:
        - type: column_order
          columns:
            - name: _derived_popular_rank
              ascending: true
          limit: 1000
      score:
        type: score_ensemble
        value_model: 0.6 * elsa_model + 0.4 * lightgbm_model
        input_user_id: $parameters.user_id
        limit: 20
    parameters:
      user_id:
        default: null
The value model expression 0.6 * elsa_model + 0.4 * lightgbm_model weights the ELSA model at 60% and the LightGBM model at 40%. Adjust these weights based on your performance requirements.
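For example, if the ELSA model scores a candidate item at 0.9 and the LightGBM model scores it at 0.5, the combined score is 0.6 × 0.9 + 0.4 × 0.5 = 0.74.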