[Paper Summary] End-To-End Memory Networks

2020. 7. 12. 10:33 · Machine Learning/NLP-UGRP

https://arxiv.org/abs/1503.08895

 


Authors: Sainbayar Sukhbaatar, Arthur Szlam, Jason Weston, Rob Fergus

 


[Abstract]

In this work, the authors present a novel recurrent neural network (RNN) architecture in which the recurrence reads from a possibly large external memory multiple times before outputting a symbol. Unlike the original Memory Networks (Weston et al., 2015), the model is trained end-to-end and therefore requires much less supervision.
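The core of the architecture is a "hop": the controller state attends over the memory, reads a weighted sum of memory vectors, and folds it back into the state before the next read. Below is a minimal NumPy sketch of this mechanism; the function names, dimensions, and random toy data are my own illustration, not the paper's code.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def memory_hop(query, memory_in, memory_out):
    """One read from external memory (a single 'hop').

    query:      (d,)   controller state u
    memory_in:  (n, d) input embeddings m_i, used for addressing
    memory_out: (n, d) output embeddings c_i, used for reading
    """
    # Attention weights p_i = softmax(u . m_i)
    p = softmax(memory_in @ query)
    # Read vector o = sum_i p_i * c_i
    o = p @ memory_out
    # The next controller state combines the query and the read vector
    return query + o

# Toy example: n = 4 memory slots, embedding size d = 3 (hypothetical sizes)
rng = np.random.default_rng(0)
u = rng.normal(size=3)
m = rng.normal(size=(4, 3))
c = rng.normal(size=(4, 3))

# Stacking several hops gives the multi-hop ("recurrent attention") model
for _ in range(3):
    u = memory_hop(u, m, c)
```

In the full model, the final state `u` is passed through a softmax output layer to predict the answer symbol; the embeddings themselves are learned end-to-end by backpropagation.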

 
