In self-attention, a given word's context is computed from its neighboring words by summing up the word values, weighted by the attention scores related to that given word.

a) True
b) False

1 Answer


Final answer:

True. Self-attention in NLP computes a word's context as an attention-weighted sum of the values of its neighboring words.

Step-by-step explanation:

The statement given in the question is true. In self-attention, as used in natural language processing (NLP) models, the context representation of a word is computed as a weighted sum of the values of the words around it. The attention mechanism assigns a weight to each word in the sentence based on its relevance to the given word, typically by taking a softmax over query-key similarity scores, and these weights are then used to combine the value vectors into the word's context representation.
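For intuition, here is a minimal NumPy sketch of that weighted-sum computation, using scaled dot-product self-attention. The function name self_attention and the projection matrices Wq, Wk, Wv are illustrative choices, not part of the question or answer:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of word vectors.

    X: (seq_len, d_model) word embeddings
    Wq, Wk, Wv: (d_model, d_k) projection matrices (hypothetical names)
    """
    Q = X @ Wq  # queries: what each word is looking for
    K = X @ Wk  # keys: what each word offers
    V = X @ Wv  # values: the content to be summed
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # relevance of every word to every other word
    # Softmax over each row so the attention weights for a word sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted sum of the value vectors: the word's context
    return weights @ V

# Toy usage: a "sentence" of 4 words with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
context = self_attention(X, Wq, Wk, Wv)
print(context.shape)  # (4, 8): one context vector per word
```

Each row of the returned matrix is exactly the weighted sum described above: the value vectors of all words in the sentence, combined according to their attention weights relative to that word.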

answered by Mamachanko