transformer_lens.components package
Submodules
- transformer_lens.components.abstract_attention module
  - AbstractAttention
  - AbstractAttention.IGNORE
  - AbstractAttention.OV
  - AbstractAttention.QK
  - AbstractAttention.__init__()
  - AbstractAttention.alibi
  - AbstractAttention.apply_causal_mask()
  - AbstractAttention.apply_rotary()
  - AbstractAttention.calculate_attention_scores()
  - AbstractAttention.calculate_qkv_matrices()
  - AbstractAttention.calculate_sin_cos_rotary()
  - AbstractAttention.calculate_z_scores()
  - AbstractAttention.create_alibi_bias()
  - AbstractAttention.create_alibi_multipliers()
  - AbstractAttention.create_alibi_slope()
  - AbstractAttention.forward()
  - AbstractAttention.k_norm
  - AbstractAttention.mask
  - AbstractAttention.q_norm
  - AbstractAttention.rotary_cos
  - AbstractAttention.rotary_sin
  - AbstractAttention.rotate_every_two()
- transformer_lens.components.attention module
- transformer_lens.components.bert_block module
- transformer_lens.components.bert_embed module
- transformer_lens.components.bert_mlm_head module
- transformer_lens.components.bert_nsp_head module
- transformer_lens.components.bert_pooler module
- transformer_lens.components.embed module
- transformer_lens.components.grouped_query_attention module
- transformer_lens.components.layer_norm module
- transformer_lens.components.layer_norm_pre module
- transformer_lens.components.pos_embed module
- transformer_lens.components.rms_norm module
- transformer_lens.components.rms_norm_pre module
- transformer_lens.components.t5_attention module
- transformer_lens.components.t5_block module
- transformer_lens.components.token_typed_embed module
- transformer_lens.components.transformer_block module
- transformer_lens.components.unembed module
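The AbstractAttention members listed above include methods such as calculate_attention_scores() and apply_causal_mask(). As a rough illustration of what that pair computes, here is a pure-Python sketch of scaled dot-product attention with a causal mask (a hypothetical stand-alone version, not the library's batched, multi-head torch implementation):

```python
import math

def causal_attention_scores(q, k):
    """Sketch of scaled dot-product attention scores with a causal mask:
    each query position i may only attend to key positions j <= i.
    Returns one softmax-normalized probability row per query position."""
    d = len(q[0])  # head dimension
    n = len(q)     # sequence length
    probs = []
    for i in range(n):
        # Scaled dot products; future positions (j > i) are masked to -inf,
        # so they receive zero probability after the softmax.
        scores = [
            sum(qx * kx for qx, kx in zip(q[i], k[j])) / math.sqrt(d)
            if j <= i else float("-inf")
            for j in range(n)
        ]
        # Numerically stable softmax over the row.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        probs.append([e / total for e in exps])
    return probs
```

For example, with a sequence of length two, the first row of the result always places all probability on position 0, since the mask hides position 1 from the first query.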
Module contents
Hooked Transformer Components.
This module contains all the components (e.g. Attention, MLP, LayerNorm)
needed to create many different types of generative language models. They are used by
transformer_lens.HookedTransformer.
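As a sketch of what one of these components computes, here is a minimal LayerNorm over a single vector in plain Python (a hypothetical illustration of the operation, not the library's torch-based LayerNorm module):

```python
import math

def layer_norm(x, w=None, b=None, eps=1e-5):
    """Normalize a vector to zero mean and unit variance, then apply an
    optional learned scale (w) and bias (b).  eps guards against division
    by zero for near-constant inputs."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    normed = [(v - mean) / math.sqrt(var + eps) for v in x]
    if w is not None:
        normed = [nv * wv for nv, wv in zip(normed, w)]
    if b is not None:
        normed = [nv + bv for nv, bv in zip(normed, b)]
    return normed
```

After normalization (with no scale or bias), the output sums to roughly zero and has variance close to one, which is the invariant the scale and bias parameters then adjust.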