10. Program (10/3–10/5)
Keynote 1 / Keynote 2 / Keynote 3
Paper Session 1: Explanations
Paper Session 2: Products
Paper Session 3: Learning & Optimization
Paper Session 4: Travel & Entertainment
Paper Session 5: RecSys that Care
Paper Session 6: Metrics & Evaluations
Paper Session 7: Beyond Users & Items
Industry Session 1: Algorithms
Industry Session 2: System Considerations
RecSys Overview
11. Keynote
Five E’s: Reflecting on the Design of Recommendations
Elizabeth F. Churchill (Google)
• Explainable (understandable/intelligible)
• Equitable (fair & impartial)
• Ethical (morally good or correct)
• Expedient (convenient & practical)
• Exigent (pressing & demanding)
13. How Algorithmic Confounding in Recommendation Systems
Increases Homogeneity and Decreases Utility
• Feedback loop: recommendations shape what users consume, and those
interactions become the training data for the next model
https://arxiv.org/abs/1710.11214
14. Unbiased Offline Recommender Evaluation for
Missing-Not-At-Random Implicit Feedback
• Inverse Propensity Score (IPS) weighting for unbiased offline
evaluation on missing-not-at-random implicit feedback
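The IPS idea can be sketched as follows (a minimal illustration; `metric_values`, `propensities`, and `n_total` are hypothetical toy inputs, not the paper's notation):

```python
import numpy as np

def ips_estimate(metric_values, propensities, n_total):
    """Inverse-Propensity-Score estimator: each observed metric value is
    weighted by 1/propensity of being observed, so rarely observed
    (e.g., unpopular) items are up-weighted; dividing by the total
    number of user-item pairs gives an unbiased estimate of the mean."""
    m = np.asarray(metric_values, dtype=float)
    p = np.asarray(propensities, dtype=float)
    return float(np.sum(m / p) / n_total)

# Two observed pairs, each seen with probability 0.5, out of 4 pairs in total:
est = ips_estimate([1.0, 1.0], [0.5, 0.5], n_total=4)  # -> 1.0
```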
Causal Embeddings for Recommendation
• Criteo AI Lab; RecSys 2018 Best Long Paper award
• Domain Adaptation
15. Calibrated Recommendations
• Netflix
• Example: a user whose play history is 70% one genre and 30% another
should get a recommended list with roughly the same 70% / 30% split
• Calibration metric: per-user KL divergence between the genre
distribution of the play history and that of the recommended list
• Related to, but distinct from, diversity
Figure: black = play history, red = before calibration, green = after calibration
https://dl.acm.org/citation.cfm?id=3240372
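A minimal sketch of the KL calibration metric, assuming the genre distributions are given as arrays (the small-alpha smoothing of q toward p follows the paper; function and variable names here are my own):

```python
import numpy as np

def calibration_kl(p_history, q_recs, alpha=0.01):
    """KL divergence between the genre distribution p of the user's play
    history and the distribution q of the recommended list. q is
    smoothed slightly toward p so the KL stays finite when some genre
    receives no recommendations at all."""
    p = np.asarray(p_history, dtype=float)
    q = np.asarray(q_recs, dtype=float)
    q_tilde = (1 - alpha) * q + alpha * p
    mask = p > 0  # genres absent from the history contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q_tilde[mask])))

# A perfectly calibrated 70% / 30% list scores 0; a one-genre list does not:
calibration_kl([0.7, 0.3], [0.7, 0.3])  # -> 0.0
calibration_kl([0.7, 0.3], [1.0, 0.0])  # > 0
```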
16. Artwork Personalization at Netflix
• Netflix Industry session
• Personalizes which artwork (thumbnail image) is shown for each title
• Contextual Bandit approach
https://medium.com/netflix-techblog/artwork-personalization-c589f074ad76
17. Explore, Exploit, and Explain: Personalizing Explainable
Recommendations with Bandits
• Spotify; contextual bandits for personalized recommendations
• BART (BAndits for Recsplanations as Treatments): a contextual bandit
that jointly selects the recommendation and the explanation shown with it
https://hellogiggles.com/reviews-coverage/music/spotify-playlist-favorite-songs-of-2017-wrapped/
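The explore/exploit step in such bandits can be sketched with an epsilon-greedy policy (a minimal illustration; the candidates and the reward model below are stand-ins, not Spotify's implementation):

```python
import random

def epsilon_greedy(candidates, predicted_reward, epsilon=0.1, rng=random):
    """With probability epsilon, explore: pick a uniformly random
    candidate. Otherwise exploit: pick the candidate whose predicted
    reward (e.g., probability of a stream) is highest."""
    if rng.random() < epsilon:
        return rng.choice(candidates)
    return max(candidates, key=predicted_reward)

# epsilon=0 always exploits the best-scoring candidate:
epsilon_greedy(["a", "b", "c"], {"a": 0.2, "b": 0.9, "c": 0.5}.get, epsilon=0.0)  # -> "b"
```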
18. GENERATION MEETS RECOMMENDATION:
Proposing Novel Items for Groups of Users
• Generates novel items for a group of users, instead of retrieving
existing items
• Train a VAE: the encoder maps item features into a latent space Z,
and the decoder maps a latent vector z back to item features
• Embed the group's k items into Z and aggregate them
• Decode the aggregated z to generate a novel item (its features)
https://haroldsoh.files.wordpress.com/2018/10/sohvo_recsys18.pdf
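The encode → aggregate → decode pipeline can be sketched with stand-in linear maps (the real model is a trained VAE, and the paper optimizes over the latent space rather than taking a plain mean; everything below is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))  # stand-in for a trained encoder: 4 features -> 2-d Z

def encode(x):
    return x @ W                  # plays the role of the VAE encoder mean

def decode(z):
    return z @ np.linalg.pinv(W)  # plays the role of the VAE decoder

# k items chosen for the group, embedded in Z and aggregated (mean, for brevity):
group_items = rng.normal(size=(3, 4))
z = encode(group_items).mean(axis=0)
novel_item_features = decode(z)   # a new item's features, not one of the inputs
```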
19. Interpreting User Inaction in Recommender Systems
• Survey of MovieLens users on why they did not act on recommended items
• Seven inaction categories: "Would Not Enjoy", "Watched", "Not
Noticed", "Not Now", "Others Better", "Explore Later", "Decided To
Watch"
20. A Field Study of Related Video Recommendations: Newest,
Most Similar, or Most Relevant?
• Compares related-video ranking strategies (newest, most similar,
most relevant) by CTR
• Field study on MovieLens
Judging Similarity: A User-Centric Study of Related Item
Recommendations
• User study with MovieLens users comparing 6 related-item algorithms
• Includes collaborative filtering (CF) based similarity