Token Attention Visualizer

Interactive visualization of attention patterns in Large Language Models

The interface is arranged into three panels: Model & Generation (model loading, prompt, and generation sliders), Visualization Controls (a generation-step slider, an attention-threshold slider, and a Filter by Token selector for highlighting a single token), and the Attention Visualization plot, which reads "Ready to visualize" until text has been generated.

Instructions:

  1. Load a model from Hugging Face (default: Llama-3.2-1B)
  2. Enter a prompt and configure generation settings
  3. Click Generate to create text and visualize attention (a minimal sketch of this flow follows the list)
  4. Interact with the visualization:
    • Use the step slider to navigate through generation steps
    • Adjust the threshold to filter weak connections
    • Click on tokens in the plot to filter their connections
    • Click Reset View to show all connections
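
Steps 1-3 amount to a standard Hugging Face transformers generation call that also returns attention weights. The sketch below assumes that flow; the model id, prompt, and sampling values are illustrative defaults, not the app's exact configuration:

```python
# Minimal sketch (not the app's actual code): load a model, generate text,
# and keep the per-step attention weights for later visualization.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B"  # default model named above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="eager",  # eager attention is needed to return weights
)

inputs = tokenizer("The quick brown fox", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    temperature=0.8,
    output_attentions=True,          # capture weights at every generation step
    return_dict_in_generate=True,
)

# out.attentions: one entry per generated token; each entry is a per-layer
# tuple of tensors shaped (batch, heads, query_len, key_len).
first_step_last_layer = out.attentions[0][-1]
print(first_step_last_layer.shape)
```

From here, the step slider simply indexes into `out.attentions`, and the threshold slider decides which of those weights are drawn as connections.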

Understanding the Visualization:

  • Blue lines: Attention from input to output tokens
  • Orange curves: Attention between output tokens
  • Line thickness: Represents attention weight strength
  • Node colors: Blue = input tokens, Coral = generated tokens
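
The legend above maps onto how each generated token's attention row can be split: source indices that fall inside the prompt become blue input-to-output edges, later indices become orange output-to-output edges, and anything below the threshold slider is dropped, with the remaining weight setting line thickness. A hedged sketch of that bookkeeping (the function name and threshold value are illustrative, not taken from the app):

```python
# Sketch: split one generated token's attention row into the plot's
# edge categories and apply the weak-connection threshold.
import numpy as np

def build_edges(attn_row, num_input_tokens, threshold=0.05):
    """attn_row: attention weights over all earlier tokens for one generated
    token. Returns (blue, orange) lists of (source_index, weight) pairs."""
    blue, orange = [], []
    for src, weight in enumerate(attn_row):
        if weight < threshold:           # threshold slider filters weak links
            continue
        edge = (src, float(weight))      # weight drives line thickness
        if src < num_input_tokens:
            blue.append(edge)            # attention to prompt (input) tokens
        else:
            orange.append(edge)          # attention to earlier generated tokens
    return blue, orange

# Example: 4 prompt tokens; the current token attends over 6 earlier tokens.
row = np.array([0.40, 0.02, 0.15, 0.10, 0.25, 0.08])
print(build_edges(row, num_input_tokens=4))
```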