Data Skeptic

seq2seq

22 min · 1 March 2019

A sequence-to-sequence (or seq2seq) model is a neural architecture used for translation (and other tasks) which consists of an encoder and a decoder.

The encoder/decoder architecture has obvious promise for machine translation, and has been successfully applied this way. Encoding an input into a small number of hidden nodes, which can then be decoded into a matching output string, forces the model to learn an efficient representation of the essence of the input.
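The architectural point above can be sketched in code. The toy below is an illustrative assumption, not a trained model: it uses tiny hand-seeded random weights and a one-scalar-per-token "embedding", but it shows the shape of a seq2seq model — an encoder folds a variable-length input into a fixed-size hidden state, and a decoder unfolds that state into an output sequence of any chosen length.

```python
import math
import random

random.seed(0)
H = 4  # hidden state size (assumed; real models use hundreds of units)

def rand_vec(n):
    # Toy stand-in for learned weights; a real model trains these.
    return [random.uniform(-0.5, 0.5) for _ in range(n)]

def encode(tokens, w_in, w_h):
    """Encoder RNN: fold the whole input sequence into one hidden vector."""
    h = [0.0] * H
    for t in tokens:
        # Simple recurrence: new state depends on the token and the old state.
        h = [math.tanh(w_in[i] * t + w_h[i] * h[i]) for i in range(H)]
    return h  # fixed-size summary of the input, whatever its length

def decode(h, steps, w_out, w_h):
    """Decoder RNN: unfold the hidden vector into an output sequence."""
    out = []
    for _ in range(steps):
        h = [math.tanh(w_h[i] * h[i]) for i in range(H)]
        out.append(sum(w_out[i] * h[i] for i in range(H)))
    return out

w_in, w_h, w_out = rand_vec(H), rand_vec(H), rand_vec(H)
summary = encode([1, 2, 3, 4, 5], w_in, w_h)      # 5 input tokens
print(len(summary))                               # 4: fixed-size state
print(len(decode(summary, 3, w_out, w_h)))        # 3: output length is free
```

Note that the input has 5 tokens but the state has a fixed size of 4, and the decoder emits 3 tokens: input and output lengths are decoupled through the bottleneck state, which is exactly what makes the architecture suitable for translation between languages whose sentences differ in length.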

In addition to translation, seq2seq models have been used in a number of other NLP tasks such as summarization and image captioning.


Data Skeptic with Kyle Polich is available on several platforms. The information on this page comes from public podcast feeds.