Closed
Description
Hi
This is for informational purposes only, not a bug report. I am trying to train the seq2seq algorithm for a summarization task. How can I introduce pretrained word embeddings (GloVe / word2vec) before submitting the training job? The example notebook only maps words to integer indices. Is it possible to also provide the vector embeddings corresponding to those indices? There are hyperparameters to specify the embedding size, but I could not find a way to supply the actual embedding vectors.
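For context, what I would like to do is map my word-to-index vocabulary onto pretrained vectors, i.e. build an embedding matrix where row i is the GloVe vector for the word with index i. A minimal illustrative sketch (the function name and the tiny in-memory "GloVe file" are my own, not part of the notebook):

```python
import numpy as np

def build_embedding_matrix(vocab, glove_lines, dim):
    """Map each word index in `vocab` (word -> int) to its GloVe vector.

    `glove_lines` is an iterable of "word v1 v2 ..." lines, the format of
    the released GloVe text files. Words missing from GloVe keep a small
    random vector so training can still adjust them.
    """
    rng = np.random.default_rng(0)
    matrix = rng.normal(scale=0.1, size=(len(vocab), dim))
    for line in glove_lines:
        parts = line.split()
        word, vec = parts[0], np.asarray(parts[1:], dtype=np.float32)
        if word in vocab:
            matrix[vocab[word]] = vec
    return matrix

# Tiny illustrative "GloVe file" with 3-dimensional vectors.
glove = ["the 0.1 0.2 0.3", "cat 0.4 0.5 0.6"]
vocab = {"the": 0, "cat": 1, "dog": 2}   # "dog" is out-of-vocabulary
emb = build_embedding_matrix(vocab, glove, dim=3)
print(emb.shape)  # (3, 3)
```

The open question is how to hand a matrix like `emb` to the training job, since the hyperparameters only let me set the embedding dimension.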
Thanks in advance.
Metadata
Assignees
Labels
No labels