What is the purpose of the "attention mechanism" in neural networks, particularly in sequence-to-sequence models?
- A) To enhance the training speed of the neural network
- B) To selectively focus on different parts of the input sequence when generating the output sequence
- C) To reduce the complexity of the model by limiting the number of layers
- D) To increase the size of the dataset by generating synthetic samples
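The behavior described in option B, weighting input positions by relevance to the current output step, can be illustrated with a minimal NumPy sketch of scaled dot-product attention (the function name and toy shapes here are illustrative, not from any specific library):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """For each query row in Q, compute a softmax distribution over the
    rows of K, then return the correspondingly weighted sum of V.
    The weights are the "selective focus" over input positions."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over keys: each row sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 2 decoder steps attending over 3 encoder positions
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))  # queries (one per output step)
K = rng.standard_normal((3, 4))  # keys (one per input position)
V = rng.standard_normal((3, 4))  # values (one per input position)
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` is a probability distribution over the three input positions, so the output at every decoding step is a different weighted mix of the input representations rather than a fixed summary vector.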