3. Q: What, say, 3 recent papers in machine learning do you think will be influential in directing the cutting edge of research these days?
Peter Norvig: I’ve never been able to pick lasting papers in the past, so don’t trust me now, but here are a few:
● Rendle’s “Factorization Machines”
● Wang et al. “Bayesian optimization in high dimensions via random embeddings”
● Dean et al. “Fast, Accurate Detection of 100,000 Object Classes on a Single Machine”
http://blog.teamleada.com/2014/08/ask-peter-norvig/
9. Example
U (users) = {Alice (A), Bob (B), Charlie (C), . . .}
I (items) = {Titanic (TI), Notting Hill (NH), Star Wars (SW), Star Trek (ST), . . .}
10. Example
Observed transactions, as tuples (user, movie, time, rating):
{(A, TI, 2010-1, 5), (A, NH, 2010-2, 3), (A, SW, 2010-4, 1),
(B, SW, 2009-5, 4), (B, ST, 2009-8, 5),
(C, TI, 2009-9, 1), (C, SW, 2009-12, 5)}
How do we predict the rating for the interaction between Alice and Star Trek, when that pair never occurs in the data (zero direct interaction)?
B-SW and C-SW are similar: Bob and Charlie both rated Star Wars highly, so their factors end up similar.
A and C are different: Alice rated Star Wars 1 while Charlie rated it 5, so their factors differ.
ST and SW are similar: Bob rated both highly, so the two movies get similar factor vectors.
Hence the A-ST interaction should come out similar to the observed A-SW interaction.
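The transfer argument above can be sketched numerically. In an FM (Rendle's model named earlier), the pairwise interaction between features i and j is the dot product of their latent vectors, ⟨v_i, v_j⟩. The 2-d factor values below are hand-set for illustration, not learned; the point is only that once v_SW ≈ v_ST, the unobserved A-ST interaction must land close to the observed A-SW interaction.

```python
def dot(u, w):
    """Plain dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, w))

# Hypothetical 2-d latent factor vectors (illustrative, not learned):
v = {
    "A":  [1.0, 0.2],   # Alice
    "SW": [0.3, 1.1],   # Star Wars
    "ST": [0.4, 1.0],   # Star Trek (similar vector to SW)
}

def fm_interaction(i, j):
    """FM pairwise interaction term <v_i, v_j>."""
    return dot(v[i], v[j])

# Because v["SW"] and v["ST"] are close, Alice's interaction with
# the unseen pair (A, ST) is close to her observed (A, SW) one:
print(fm_interaction("A", "SW"))  # 0.52
print(fm_interaction("A", "ST"))  # 0.60
```

This is exactly why FMs generalize to feature pairs never seen together: the interaction is factorized through shared per-feature vectors instead of being estimated per pair.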
17. FFM: ideas
Features can be grouped into fields: users, movies, context, SSPs, publishers, etc.
FFMs exploit this grouping: each feature learns a separate factor vector per field it interacts with.
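The per-field idea can be sketched as follows. In an FFM, feature i keeps one latent vector for every field, and the interaction of features i and j uses i's vector specific to j's field (and vice versa). Feature names, fields, and factor values below are hypothetical placeholders, not learned parameters.

```python
def dot(u, w):
    """Plain dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, w))

# v[feature][field] -> latent vector; one vector per (feature, field)
# pair, with illustrative hand-set 2-d values:
v = {
    "user=A":   {"movie": [0.9, 0.1], "context": [0.2, 0.5]},
    "movie=SW": {"user":  [0.3, 1.0], "context": [0.7, 0.1]},
}

# Which field each feature belongs to:
field = {"user=A": "user", "movie=SW": "movie"}

def ffm_interaction(i, j):
    """FFM pairwise term: i's vector for j's field dotted with
    j's vector for i's field."""
    return dot(v[i][field[j]], v[j][field[i]])

print(ffm_interaction("user=A", "movie=SW"))  # 0.37
```

The design choice: a user may interact with movies differently than with publishers, so giving each feature a dedicated vector per opposing field captures that, at the cost of (number of fields) times more parameters than a plain FM.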