Conversational Recommendations
Conversational interfaces – chatbots, voice assistants, interactive helpdesks – are transforming how users interact with technology. Instead of browsing static pages, users can now simply ask for what they need. Integrating recommendation systems into these conversations offers a powerful way to provide personalized guidance, drive product discovery, and enhance user engagement in real time. Imagine a chatbot not just answering questions, but proactively suggesting relevant products, articles, or actions based on the needs the user expresses during the conversation.
However, building effective conversational recommendation systems presents a unique and complex set of challenges that go far beyond those of traditional recommender systems. It requires blending natural language understanding (NLU), dialogue management, recommendation algorithms, and seamless real-time integration.
The Shaped Approach: Simplifying Conversational Recommendations with LLMs
Building a high-performing conversational recommendation system from scratch involves mastering complex AI/ML across NLU, dialogue, and recommendations, plus intricate system integration. Shaped significantly simplifies the recommendation candidate generation piece, which is often a major bottleneck.
Shaped handles the heavy lifting of learning user preferences and item relationships from your historical interaction data and catalog. It provides simple, low-latency APIs to retrieve highly relevant, personalized candidate items based on user history and real-time signals (like keywords or filters derived from the conversation). You can then feed these high-quality candidates directly into your LLM's context window.
How Shaped Streamlines Conversational Recommendations:
- Focus on Conversation: Your team focuses on building the best NLU, dialogue management, and LLM prompting for a natural user experience.
- Shaped Handles Recommendations: Connect your user interaction data (views, clicks, purchases) and item catalog to Shaped. Shaped automatically trains powerful embedding-based models.
- Fetch Candidates via API: When the dialogue state indicates a need for recommendations:
- Extract key entities or keywords from the conversation (e.g., "trail running shoes", "sci-fi movies", "vegetarian recipes").
- Call Shaped's rank API with the user_id and apply filters based on the extracted entities, OR use the search API with the extracted keywords and user_id for personalization.
- Shaped returns a ranked list of relevant item_ids and their metadata in milliseconds.
- Inform the LLM: Format the candidate items (e.g., titles, key features) from Shaped's response.
- Generate Conversational Output: Include the formatted candidates in the prompt to your LLM, instructing it to present them naturally within the conversation.
Shaped manages the complex model training, infrastructure scaling, real-time candidate generation, and MLOps for the recommendation part, allowing you to integrate powerful personalization into your conversational AI with significantly less effort.
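To make the candidate-fetching step concrete, here is a minimal sketch of both retrieval paths using the Shaped Python client. The rank call uses the same parameters as the fuller chatbot example later in this post; the search call is commented out because its exact method and parameter names (e.g., query) are assumptions here, so check the Shaped SDK docs for the precise signature.

from shaped import Shaped

shaped_client = Shaped()  # Assumes SHAPED_API_KEY env var

# Path 1: personalized ranking constrained by a facet filter derived from the
# conversation (e.g., the user asked for sci-fi).
ranked = shaped_client.rank(
    model_name='movie_recs_conversational',
    user_id='USER_123',
    limit=5,
    filter_predicate="genre = 'sci-fi'",
    return_metadata=True,
)

# Path 2 (illustrative only): personalized keyword search. The method and
# parameter names below are assumptions -- consult the Shaped docs for the
# exact signature.
# results = shaped_client.search(
#     model_name='movie_recs_conversational',
#     query='sci-fi movies',
#     user_id='USER_123',
#     limit=5,
# )

# Candidate titles and metadata can then be dropped straight into the LLM prompt.
for item in ranked.metadata:
    print(item['metadata'].get('title', item['id']))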
Building Conversational Recommendations with Shaped & LLMs
Let's illustrate implementing recommendations within a chatbot using Shaped to fetch candidates and an LLM to present them.
Goal: A chatbot helps users find movies. When a user asks "Suggest some sci-fi movies starring Harrison Ford", the bot should provide personalized suggestions matching the criteria.
1. Ensure Data is Connected to Shaped: Assume you have connected:
- user_movie_interactions: Contains user_id, item_id (movie ID), timestamp, and event_type (watch, rate, wishlist).
- movie_catalog: Contains item_id, title, genre, actors, description, and image_url.
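For orientation, here are a couple of illustrative records. The values are made up, the column names follow the schema above, and a rating field is included only because the label logic in the next step references it.

# Hypothetical sample rows -- values are illustrative only.
user_movie_interactions_sample = [
    {"user_id": "USER_123", "item_id": "MOVIE_42", "timestamp": "2024-05-01T20:15:00Z",
     "event_type": "watch"},
    {"user_id": "USER_123", "item_id": "MOVIE_77", "timestamp": "2024-05-03T21:40:00Z",
     "event_type": "rate", "rating": 4},  # rating is referenced by the label logic below
]

movie_catalog_sample = [
    {"item_id": "MOVIE_42", "title": "Blade Runner", "genre": "Sci-Fi",
     "actors": ["Harrison Ford", "Rutger Hauer"],
     "description": "A blade runner hunts rogue replicants in a dystopian Los Angeles.",
     "image_url": "https://example.com/posters/movie_42.jpg"},
]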
2. Define Your Shaped Model (YAML): Focus on learning movie preferences.
model:
  name: movie_recs_conversational
connectors:
  - type: Dataset
    name: user_movie_interactions
    id: interactions
  - type: Dataset
    name: movie_catalog
    id: movies
fetch:
  events: |
    SELECT
      user_id,
      item_id,
      timestamp AS created_at,
      CASE
        WHEN event_type = 'watch' THEN 1.0
        WHEN event_type = 'rate' AND rating > 3 THEN 0.8 -- Example: use rating if available
        WHEN event_type = 'wishlist' THEN 0.5
        ELSE 0.1
      END AS label
    FROM interactions
  items: |
    SELECT
      item_id,
      title,
      genre,
      actors, -- Assuming 'actors' is a searchable field or array
      description,
      image_url
    FROM movies
3. Create & Activate the Model:
shaped create-model --file movie_recs_model.yaml
# Monitor until ACTIVE
shaped view-model --model-name movie_recs_conversational
4. Integrate into Chatbot Logic:
import json

from shaped import Shaped
# Assume you have an LLM client initialized (e.g., OpenAI, Anthropic)
from openai import OpenAI

llm_client = OpenAI()
shaped_client = Shaped()  # Assumes SHAPED_API_KEY env var
model_name = 'movie_recs_conversational'


def recognize_intent(message):
    """
    Use an LLM to recognize the user's intent from the message.
    """
    prompt = f"""
    You are an intent recognition system. Analyze the following message and
    determine the user's intent.
    Possible intents include: 'recommend_movies', 'other'
    Message: "{message}"
    Respond with the intent only.
    """
    response = llm_client.chat.completions.create(
        model="gpt-4o-mini",  # Any chat-capable model works here
        messages=[{"role": "user", "content": prompt}],
        max_tokens=10,
        temperature=0,
    )
    return response.choices[0].message.content.strip()


def extract_entities_and_filter(message):
    """
    Use an LLM to extract entities (e.g., genre, actor) from the user's message
    and propose a facet filter for the Shaped API.
    """
    prompt = f"""
    You are an entity extraction system and facet filter query generator. Extract
    relevant entities from the following message and then propose a facet filter
    for the Shaped API. Filters should be of the form: genre = 'sci-fi', or
    actor = 'Harrison Ford', or price < 20.0, etc.
    Message: "{message}"
    Respond with a JSON object of the form:
    {{"entities": {{...}}, "filter": "..."}}
    """
    response = llm_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=100,
        temperature=0,
    )
    try:
        return json.loads(response.choices[0].message.content.strip())
    except json.JSONDecodeError:
        return {}


def handle_user_message(user_id, message):
    # 1. Use NLU to understand the user's intent
    intent = recognize_intent(message)
    if intent != 'recommend_movies':
        return "I'm here to help with movie recommendations! What are you looking for?"

    # 2. Extract entities and a hard filter from the conversation
    extraction = extract_entities_and_filter(message)
    shaped_filter = extraction.get('filter')

    try:
        # 3. Call Shaped Rank API to get personalized candidates matching filters
        response = shaped_client.rank(
            model_name=model_name,
            user_id=user_id,
            limit=5,  # Get top 5 candidates
            filter_predicate=shaped_filter,  # Apply conversational context filters
            return_metadata=True
        )

        if response.ids:
            # 4. Format candidates for the LLM prompt
            candidates_text = "\n".join([
                f"- {item['metadata'].get('title', item['id'])} (Genre: "
                f"{item['metadata'].get('genre', 'N/A')})"
                for item in response.metadata
            ])

            # 5. Craft the LLM prompt
            prompt = f"""The user is looking for movies based on their request:
            "{message}".
            Based on their preferences and request, here are some relevant movie
            suggestions:
            {candidates_text}
            Please present these suggestions to the user in a friendly, conversational
            tone. Briefly mention why they fit the request (e.g., genre, actor).
            """

            # 6. Call the LLM to generate the conversational response
            llm_response = llm_client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[{"role": "user", "content": prompt}],
                temperature=0.7,
            )
            chatbot_reply = llm_response.choices[0].message.content.strip()
        else:
            chatbot_reply = (
                f"Sorry, I couldn't find specific matches for '{message}' "
                "based on your preferences. Maybe try a broader search?"
            )
    except Exception as e:
        print(f"Recommendation lookup failed: {e}")
        chatbot_reply = (
            "Sorry, I encountered an error while looking for "
            "recommendations."
        )

    return chatbot_reply


# Example Usage:
user_id = 'USER_123'
user_input = "Suggest some sci-fi movies starring Harrison Ford"
reply = handle_user_message(user_id, user_input)
print(f"Chatbot: {reply}")
Explanation:
- The chatbot logic uses NLU to parse the user's request.
- It translates extracted entities (genre, actor) into filters for Shaped's rank API.
- Shaped returns a personalized list of movies matching these filters, ranked according to the user's interaction history.
- The candidate movie titles and key metadata are formatted.
- This formatted list is embedded within a prompt instructing an LLM to present these specific suggestions conversationally.
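To make the first two bullets concrete, the entity-extraction step for the example request might produce something like the following. The exact JSON shape and filter syntax depend on your extraction prompt and on the fields available in your Shaped model, so treat this as illustrative.

# Hypothetical output of extract_entities_and_filter() for the request
# "Suggest some sci-fi movies starring Harrison Ford".
extraction = {
    "entities": {"genre": "sci-fi", "actor": "Harrison Ford"},
    # Whether and how clauses can be combined (e.g., with AND) depends on the
    # filter grammar your Shaped model supports -- this is a single-clause example.
    "filter": "genre = 'sci-fi'",
}

# The filter string is what gets passed to rank(..., filter_predicate=...).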
Conclusion: Focus on the Conversation, Let Shaped Handle the Ranking
Building effective conversational recommendations requires tackling significant challenges in NLU, dialogue management, context-aware ranking, real-time integration, and natural language generation. While LLMs excel at generation, they aren't inherently recommendation engines; they need relevant, personalized candidates to work with.
Attempting to build the entire stack from scratch is a massive undertaking. Shaped provides the specialized AI engine for the crucial candidate generation step. By leveraging Shaped's ability to learn from user behavior and serve personalized, contextually relevant candidates via a simple API, you drastically reduce the complexity. Your team can then focus its efforts on crafting a seamless conversational experience, refining NLU, managing dialogue flow, and optimizing LLM prompts, using Shaped's high-quality recommendations as fuel.
Ready to power your chatbot or voice assistant with truly personalized, context-aware recommendations?
Request a demo of Shaped today to see how it can integrate with your conversational AI stack. Or, start exploring immediately with our free trial sandbox.