
Seq2Seq Translator

GRU-based sequence-to-sequence translation model with teacher forcing and multiple optimizer comparisons.

2022 · research · 5 min read
Source Code
PyTorch / GRU

Project Snapshot

  • Impact: Built a GRU-based encoder-decoder translation model trained on 7,000 sentence pairs, achieving 50% test accuracy.
  • Tags: seq2seq · translation · encoder-decoder · gru · pytorch · nlp
  • Architecture: GRU Encoder-Decoder
  • Training: Teacher Forcing + L2 Reg
  • Framework: PyTorch 2.1 / Python 3.11
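The optimizer benchmark in the snapshot above could be set up as follows. This is a minimal sketch, not the project's actual training script: the learning rates and the `weight_decay` value of 1e-4 are illustrative assumptions, and the `nn.Linear` stand-in takes the place of the real GRU model.

```python
import torch
import torch.nn as nn

# Placeholder model; the real project uses the GRU encoder-decoder.
model = nn.Linear(8, 8)

# The four optimizers compared, each with L2 regularization applied
# through the weight_decay parameter (values here are assumptions).
optimizers = {
    "Adam": torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4),
    "AdamW": torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-4),
    "RMSprop": torch.optim.RMSprop(model.parameters(), lr=1e-3, weight_decay=1e-4),
    "SGD": torch.optim.SGD(model.parameters(), lr=1e-2, weight_decay=1e-4),
}
```

One subtlety worth noting: for Adam, RMSprop, and SGD, `weight_decay` adds a classic L2 penalty to the gradients, while AdamW decouples the decay from the adaptive gradient update, which is why the two Adam variants can behave differently under the same setting.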

01. Overview

A sequence-to-sequence neural machine translation model built in PyTorch. Both the encoder and the decoder are GRUs, and the decoder is trained with teacher forcing. The model was trained on 7,000 sentence pairs (≤8 words each) and benchmarked across the Adam, AdamW, RMSprop, and SGD optimizers, each with L2 regularization, reaching 50% test accuracy.
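The architecture described above can be sketched as below. This is not the project's source code: hidden sizes, the `teacher_forcing_ratio` of 0.5, and the assumption that each target sequence begins with a start-of-sentence token are all illustrative. The key idea is that at each decoding step, the decoder input is either the ground-truth token (teacher forcing) or the model's own previous prediction.

```python
import random
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids -> final hidden state (1, batch, hidden)
        _, hidden = self.gru(self.embed(src))
        return hidden

class Decoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, token, hidden):
        # token: (batch, 1) -> logits over the vocabulary and the new hidden state
        output, hidden = self.gru(self.embed(token), hidden)
        return self.out(output.squeeze(1)), hidden

def train_step(encoder, decoder, src, tgt, teacher_forcing_ratio=0.5):
    """One step of seq2seq training with teacher forcing: with probability
    teacher_forcing_ratio, feed the ground-truth target token as the next
    decoder input instead of the decoder's own prediction."""
    criterion = nn.CrossEntropyLoss()
    hidden = encoder(src)
    token = tgt[:, :1]  # assumes position 0 of tgt holds an <sos> token
    loss = torch.tensor(0.0)
    for t in range(1, tgt.size(1)):
        logits, hidden = decoder(token, hidden)
        loss = loss + criterion(logits, tgt[:, t])
        if random.random() < teacher_forcing_ratio:
            token = tgt[:, t : t + 1]                # teacher forcing
        else:
            token = logits.argmax(1, keepdim=True)   # model's own prediction
    return loss / (tgt.size(1) - 1)
```

Teacher forcing speeds up convergence because early in training the decoder's own predictions are mostly noise; mixing in ground-truth inputs keeps the later decoding steps learning from sensible contexts.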