#29: Transformers for Recommender Systems with Craig Macdonald and Sasha Petrov

Duration: 1:37:25
 

In episode 29 of Recsperts, I welcome Craig Macdonald, Professor of Information Retrieval at the University of Glasgow, and Aleksandr “Sasha” Petrov, PhD researcher and former applied scientist at Amazon. Together, we dive deep into sequential recommender systems and the growing role of transformer models such as SASRec and BERT4Rec.

Our conversation begins with their influential replicability study of BERT4Rec, which revealed inconsistencies in reported results and showed that the training objective matters more than architectural tweaks. From there, Craig and Sasha guide us through their award-winning research on making transformers for sequential recommendation over large item catalogues both more effective and more efficient. We discuss how recency-based sampling of sequences (RSS) cuts training times dramatically, and how gSASRec overcomes the overconfidence of models trained with negative sampling. By generalizing the sigmoid function into the gBCE loss, they reconcile negative sampling with cross-entropy-based optimization, matching the effectiveness of full-softmax training while keeping training scalable to large catalogues.
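
To make the gBCE idea concrete, here is a minimal PyTorch sketch (our illustration, not the authors' code): the sigmoid of each positive score is raised to a power beta derived from the negative-sampling rate, which counteracts the overconfidence that plain BCE with sampled negatives induces. The beta formula follows the gSASRec paper; the function and parameter names are ours.

```python
import torch
import torch.nn.functional as F

def gbce_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor,
              num_items: int, t: float = 0.75) -> torch.Tensor:
    """Generalized binary cross-entropy (gBCE), sketched after gSASRec.

    pos_scores: (batch,) logits of the true next items
    neg_scores: (batch, k) logits of k sampled negatives
    t: calibration hyperparameter in [0, 1]; t=0 recovers plain BCE
    """
    k = neg_scores.size(1)
    alpha = k / (num_items - 1)                  # negative sampling rate
    beta = alpha * (t * (1 - 1 / alpha) + 1 / alpha)
    # log(sigmoid(s)^beta) = beta * log(sigmoid(s)); logsigmoid is the
    # numerically stable form, and log(1 - sigmoid(s)) = logsigmoid(-s)
    pos_term = beta * F.logsigmoid(pos_scores)          # (batch,)
    neg_term = F.logsigmoid(-neg_scores).sum(dim=-1)    # (batch,)
    return -(pos_term + neg_term).mean()

# e.g. 256 sampled negatives per step against a one-million-item catalogue:
# loss = gbce_loss(torch.randn(32), torch.randn(32, 256), num_items=1_000_000)
```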

We also explore RecJPQ, their recent work on joint product quantization for item embeddings. This approach makes transformer-based sequential recommenders substantially faster at inference and shrinks their item-embedding memory footprint dramatically, while sometimes even improving effectiveness thanks to a regularization effect. Towards the end, Craig and Sasha share their perspective on generative approaches like GPTRec, the promise and limits of large language models in recommendation, and the challenges that remain for the future of sequential recommender systems.
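
The following is a minimal sketch of the product-quantization idea behind RecJPQ (illustrative, not the authors' implementation): each item keeps a handful of small codes instead of a full d-dimensional vector, and its embedding is assembled by concatenating sub-embeddings from shared codebooks. In RecJPQ the codes are derived from an SVD of the user-item matrix and the codebooks are trained jointly with the recommender; the random code assignment, class name, and parameters below are placeholders.

```python
import torch
import torch.nn as nn

class PQItemEmbedding(nn.Module):
    """Product-quantized item embedding table (sketch of the RecJPQ idea).

    For 1M items and dim=128, a full fp32 table needs ~512 MB; with m=8
    codes per item (one byte each when codebook_size <= 256) this drops
    to ~8 MB of codes plus ~130 KB of codebook weights.
    """
    def __init__(self, num_items: int, dim: int = 128,
                 m: int = 8, codebook_size: int = 256):
        super().__init__()
        assert dim % m == 0
        # One small code per (item, sub-space). RecJPQ assigns these once
        # (e.g. via SVD) and keeps them fixed; random here is a placeholder.
        self.register_buffer(
            "codes", torch.randint(codebook_size, (num_items, m)))
        # m codebooks of sub-embeddings, trained jointly with the model.
        self.codebooks = nn.Parameter(
            torch.randn(m, codebook_size, dim // m) * 0.02)

    def forward(self, item_ids: torch.Tensor) -> torch.Tensor:
        codes = self.codes[item_ids]               # (batch, m)
        subs = [self.codebooks[j, codes[:, j]]     # (batch, dim // m)
                for j in range(codes.size(1))]
        return torch.cat(subs, dim=-1)             # (batch, dim)
```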

Enjoy this enriching episode of RECSPERTS – Recommender Systems Experts.

Don’t forget to follow the podcast and please leave a review.

  • (00:00) - Introduction
  • (04:09) - About Craig Macdonald
  • (04:46) - About Sasha Petrov
  • (13:48) - Tutorial on Transformers for Sequential Recommendations
  • (19:24) - SASRec vs. BERT4Rec
  • (21:25) - Replicability Study of BERT4Rec for Sequential Recommendation
  • (32:52) - Training Sequential RecSys using Recency Sampling
  • (40:01) - gSASRec for Reducing Overconfidence by Negative Sampling
  • (01:00:51) - RecJPQ: Training Large-Catalogue Sequential Recommenders
  • (01:21:37) - Generative Sequential Recommendation with GPTRec
  • (01:29:12) - Further Challenges and Closing Remarks

Links from the Episode:

Papers (all by Aleksandr Petrov and Craig Macdonald):
  • A Systematic Review and Replicability Study of BERT4Rec for Sequential Recommendation (RecSys 2022)
  • Effective and Efficient Training for Sequential Recommendation Using Recency Sampling (RecSys 2022)
  • gSASRec: Reducing Overconfidence in Sequential Recommendation Trained with Negative Sampling (RecSys 2023)
  • RecJPQ: Training Large-Catalogue Sequential Recommenders (WSDM 2024)
  • Generative Sequential Recommendation with GPTRec (Gen-IR Workshop at SIGIR 2023)

Disclaimer:
Craig holds concurrent appointments as a Professor of Information Retrieval at the University of Glasgow and as an Amazon Scholar. This podcast describes work performed at the University of Glasgow and is not associated with Amazon.
