Seq2Seq Word Embeddings (Not an Issue) #332

Closed
Description

@duasahil8

Hi,
This is for information purposes only and not an issue. I am trying to train the seq2seq model for a summarization task. How do we introduce pretrained word embeddings (GloVe / word2vec) before submitting the training job? The example notebook only maps words to integers; is it possible to also provide the vector embeddings corresponding to those indices? There are hyperparameters to specify the embedding size, but I could not figure out a way to supply the actual embeddings. A rough sketch of what I have in mind is below.
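To make the question concrete, here is a minimal sketch of the kind of initialization I mean. `load_glove` and `build_embedding_matrix` are hypothetical helpers of my own, `vocab` stands in for the word-to-integer mapping the example notebook produces, and `glove.6B.100d.txt` is a standard GloVe download; the question is whether a matrix built like this can be handed to the training job.

```python
import numpy as np


def load_glove(path, embedding_dim):
    """Parse a GloVe text file: one token per line, followed by its vector."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vec = np.asarray(parts[1:], dtype=np.float32)
            if vec.shape[0] == embedding_dim:  # skip malformed lines
                vectors[parts[0]] = vec
    return vectors


def build_embedding_matrix(vocab, pretrained, embedding_dim, seed=0):
    """Build a (vocab_size, embedding_dim) matrix aligned with the
    word-to-integer vocab; words missing from the pretrained set get
    small random rows so they can still be learned during training."""
    rng = np.random.default_rng(seed)
    matrix = rng.uniform(
        -0.05, 0.05, (len(vocab), embedding_dim)
    ).astype(np.float32)
    for word, idx in vocab.items():
        vec = pretrained.get(word)
        if vec is not None:
            matrix[idx] = vec
    return matrix


# Toy vocab standing in for the notebook's real word-to-integer mapping:
vocab = {"<pad>": 0, "<unk>": 1, "the": 2, "summary": 3}
glove = load_glove("glove.6B.100d.txt", embedding_dim=100)
matrix = build_embedding_matrix(vocab, glove, embedding_dim=100)
print(matrix.shape)  # (4, 100), rows indexed by the vocab integers
```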
Thanks in advance.
