In machine learning, the self-attention mechanism assigns a weight to each part of an input sentence, capturing how important each word is and how the words relate to one another. Named for the way a sequence "attends to itself," self-attention is a core component of the transformer architecture (see AI transformer).
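The weighting described above can be sketched as scaled dot-product self-attention. This is a minimal NumPy illustration, not a full transformer layer: the toy embeddings are invented, and the learned projection matrices that real transformers use to form queries, keys, and values are omitted for clarity.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over token embeddings X.

    Sketch only: queries, keys, and values are all X itself; real
    transformer layers first multiply X by learned matrices W_Q, W_K, W_V.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # similarity of every token pair
    # Softmax each row so the weights for one token sum to 1.
    exp = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = exp / exp.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all input tokens.
    return weights @ X, weights

# Three toy "word" embeddings of dimension 4 (hypothetical values).
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
out, weights = self_attention(X)
```

Each row of `weights` shows how strongly one token attends to every other token, which is the per-word importance assignment the entry describes.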