seq2seq

Keras and sequence to sequence

In the previous article, we implemented an LSTM model; now we will implement a sequence-to-sequence (seq2seq) model. Today, sequence-to-sequence and attention-based models are widely used in natural language processing tasks such as machine translation, and BERT is likewise built on the attention mechanism. In this section, we review the basic sequence-to-sequence architecture and implement it, building a model that translates $y=\sin x$ into $y=\cos x$.
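As a starting point, the encoder-decoder structure described above can be sketched in Keras as follows. This is a minimal illustration, not the article's implementation: the hyperparameters (`seq_len`, `latent_dim`) and the single sin/cos training pair are assumptions for demonstration. The encoder reads the $\sin x$ sequence and passes only its final LSTM state to the decoder, which generates the $\cos x$ sequence with teacher forcing.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical hyperparameters, chosen only for this sketch
seq_len, latent_dim = 50, 32

# One training pair: input from y = sin(x), target from y = cos(x)
x = np.linspace(0, 2 * np.pi, seq_len)
encoder_in = np.sin(x)[np.newaxis, :, np.newaxis]       # shape (1, seq_len, 1)
decoder_target = np.cos(x)[np.newaxis, :, np.newaxis]   # shape (1, seq_len, 1)

# Encoder: consume the sin sequence, keep only the final hidden/cell state
encoder_inputs = keras.Input(shape=(seq_len, 1))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: emit the cos sequence, initialised from the encoder's state
decoder_inputs = keras.Input(shape=(seq_len, 1))
decoder_seq = layers.LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
outputs = layers.Dense(1)(decoder_seq)

model = keras.Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="mse")

# Teacher forcing: the decoder input is the target shifted one step right
decoder_in = np.zeros_like(decoder_target)
decoder_in[:, 1:, :] = decoder_target[:, :-1, :]

pred = model.predict([encoder_in, decoder_in], verbose=0)
print(pred.shape)  # (1, 50, 1): one predicted value per time step
```

Training would proceed with `model.fit([encoder_in, decoder_in], decoder_target, ...)` on many such pairs; at inference time the decoder is run step by step, feeding each output back in as the next input.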