Description
Thanks for your repository, which gives me a lot of inspiration. To the best of my knowledge, attention and pointer mechanisms are popular in sequence-to-sequence tasks such as chatbots. I have read about the attention mechanisms of Luong et al. (2015) and Bahdanau et al. (2015), and about the pointer networks used in some summarization tasks, but I am confused by the formulas. Would you please add some attention or pointer mechanism examples based on your current model?
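To make the request concrete, here is a minimal NumPy sketch of the dot-product ("global") variant of Luong-style attention. The function name, shapes, and toy numbers are my own illustration, not from this repository:

```python
import numpy as np

def luong_dot_attention(query, keys, values):
    """Luong et al. (2015) dot-product attention (illustrative sketch).

    query:  (d,)   decoder hidden state at the current step
    keys:   (T, d) encoder hidden states
    values: (T, d) typically the same encoder states
    """
    scores = keys @ query                 # (T,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax over source positions
    context = weights @ values            # (d,) weighted sum of source states
    return context, weights

# Toy usage with random states (hypothetical sizes).
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 4))  # 5 source steps, hidden size 4
decoder_state = rng.normal(size=(4,))
context, weights = luong_dot_attention(decoder_state, encoder_states, encoder_states)
```

Bahdanau-style attention differs mainly in scoring: it feeds the query and each key through a small feed-forward layer instead of taking a plain dot product, and a pointer network reuses the attention weights themselves as an output distribution over source positions.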