Key breakthrough: from Transformer to "any complex neural network"

The conclusion section of the paper makes a striking statement:

"Our results remain valid if the self-attention layer is switched by other forms of contextual layers, like that of a RNN, or any layer that can take an in…"

That is, the claim does not appear to be specific to self-attention: any layer that mixes information across the positions of an input sequence can seemingly stand in for it (a rough sketch of such a drop-in swap follows the tags below).

#transformer #neural-networks #ICL #RNN #Mamba
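To make the "contextual layer" point concrete, here is a minimal PyTorch sketch of my own (names like AttentionContext, RNNContext, and Block are hypothetical; this illustrates the interface, not the paper's actual method): self-attention and a GRU both map a (batch, seq, dim) tensor to a tensor of the same shape while mixing information across positions, so either can slot into the same Transformer-style block unchanged.

```python
# Hypothetical illustration (not the paper's code): a Transformer-style block
# whose "contextual layer" is pluggable. Any module mapping (batch, seq, dim)
# -> (batch, seq, dim) while mixing information across positions fits.
import torch
import torch.nn as nn


class AttentionContext(nn.Module):
    """Self-attention as one choice of contextual layer."""
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(x, x, x)  # every token attends to every other token
        return out


class RNNContext(nn.Module):
    """A GRU as an alternative contextual layer with the same contract."""
    def __init__(self, dim: int):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.rnn(x)  # context flows through the recurrent hidden state
        return out


class Block(nn.Module):
    """Pre-norm Transformer block with a swappable context-mixing layer."""
    def __init__(self, context: nn.Module, dim: int):
        super().__init__()
        self.context = context
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.context(self.norm1(x))  # cross-position mixing (attention, RNN, ...)
        return x + self.mlp(self.norm2(x))   # position-wise transformation


x = torch.randn(2, 16, 32)  # (batch, seq_len, dim)
for ctx in (AttentionContext(32), RNNContext(32)):
    print(type(ctx).__name__, Block(ctx, 32)(x).shape)  # both give [2, 16, 32]
```

A Mamba-style state-space layer satisfies the same sequence-in, sequence-out contract, which presumably is why it is tagged alongside RNNs here.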