Attention as Kernel and Random Features
Posted in LLM
[[2023-03-26-Transformer_LLM]], [[2023-02-18-Attn_All_U_Need_Visual]]
LLM Instability
Posted in LLM
How to Improve LLM Error Resilience
Posted in LLM
Causal Attention Kernel Code
Posted in GenAI
Linear Attention Vs. Mamba
Posted in LLM
Causal Attention Kernel
Posted in LLM
Linear Attention with Topological Masking
Posted in LLM
VS Code on Colab
Posted in GenAI
The goal of this post is simple: write Python or Jupyter notebooks in local VS Code, while running remotely on Colab to take advantage of its compute. Below are ChatGPT's answers.
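For context, one common way to get this setup working (a minimal sketch of an assumed approach using the colab_ssh package and a cloudflared tunnel; the post's actual ChatGPT answers may use a different method) is to start an SSH tunnel from the Colab runtime and connect to it from local VS Code with the Remote-SSH extension:

```python
# Run inside a Colab notebook cell.
# Assumed setup via the colab_ssh package; install it first with
# the notebook shell command:
#   !pip install colab_ssh --upgrade

from colab_ssh import launch_ssh_cloudflared

# Starts an SSH server on the Colab VM and exposes it through a
# Cloudflare tunnel; prints the tunnel hostname plus step-by-step
# instructions for connecting from VS Code.
launch_ssh_cloudflared(password="choose-a-strong-password")

# On the local machine: install cloudflared, add the printed host
# to ~/.ssh/config (ProxyCommand cloudflared access ssh --hostname %h),
# then connect with the VS Code Remote-SSH extension and edit/run
# code against Colab's GPU/TPU runtime.
```

Note that the Colab VM is ephemeral, so the tunnel and any local files on it disappear when the runtime is recycled.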
Performer PyTorch Code Analysis
Posted in GenAI
[[2024-10-11-Linear_Attention]], [[2024-10-10-Attention_Math]]