Main

processing priority

3

site type

5 (wiki-type site, growing by topic rather than chronologically)

review version

11

html import

0 (new)

Events

first seen date

2026-02-18 16:42:56

expired found date

-

created at

2025-12-12 09:17:46

updated at

2026-02-18 16:42:56

Domain name statistics

length

22

crc

45133

tld

86

nm parts

3

nm random digits

0

nm rare letters

0
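
The field names above suggest the domain-name statistics are simple character-level metrics of the host name. A minimal sketch of how they *might* be computed — the exact semantics (and which letters count as "rare") are assumptions, and the host below is made up with the same shape as this record (22 characters, 3 parts, a github.io subdomain), not the actual domain:

```python
# Hypothetical derivation of the "Domain name statistics" fields.
# Assumptions: length = host character count, nm parts = dot-separated
# labels, nm random digits = digit characters, nm rare letters = letters
# uncommon in English (assumed here to be q, x, z).

def domain_name_stats(host: str) -> dict:
    rare = set("qxz")  # assumption: which letters count as "rare"
    return {
        "length": len(host),
        "nm_parts": len(host.split(".")),
        "nm_random_digits": sum(c.isdigit() for c in host),
        "nm_rare_letters": sum(c in rare for c in host.lower()),
    }

# Made-up host matching this record's shape (not the real domain):
stats = domain_name_stats("mysite-pages.github.io")
# -> length 22, 3 parts, 0 digits, 0 rare letters, as in the record
```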

Connections

is subdomain of id

87719371 (github.io)

previous id

0

replaced with id

0

related id

-

dns primary id

0

dns alternative id

0

lifecycle status

0 (unclassified, or currently active)

Subdomains and pages

deleted subdomains

0

page imported products

0

page imported random

0

page imported parking

0

Error counters

count skipped due to recent timeouts on the same server IP

0

count content received but rejected due to 11-799

0

count dns errors

0

count cert errors

0

count timeouts

0

count http 429

0

count http 404

0

count http 403

0

count http 5xx

0

next operation date

-
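
The error counters above partition fetch outcomes by cause. A hedged sketch of how an HTTP status code *could* be bucketed into the HTTP-specific counters — the mapping logic is an assumption; only the bucket names come from the record:

```python
# Assumed bucketing of an HTTP response status into the record's
# per-status error counters; non-HTTP causes (DNS, cert, timeout)
# are counted separately and not modeled here.

def error_bucket(status: int) -> str:
    if status == 429:
        return "count http 429"
    if status == 404:
        return "count http 404"
    if status == 403:
        return "count http 403"
    if 500 <= status <= 599:
        return "count http 5xx"
    return "other"
```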

Server

server bits

GITHUB COM

server ip

185.199.108.153

Mainpage statistics

mp import status

20

mp rejected date

-

mp saved date

2026-02-18 16:42:56

mp size orig

20302

mp size raw text

3417

mp inner links count

7

mp inner links status

10 (links queued, awaiting import)

Open Graph

title

-

description

A simple, whitespace theme for academics. Based on [*folio](https://github.com/bogoli/-folio) design.

image

-

site name

-

author

Songlin Yang

updated

2026-03-02 08:17:30

raw text

Songlin Yang Toggle navigation about (current) blog publications talks cv Songlin Yang Songlin (松琳) is a Member of Technical Staff at Thinking Machines Lab , working on language model architectures. She earned her PhD from MIT, where she was advised by Prof. Yoon Kim . Flash Linear Attention efficient attention implementations in Triton FLA Discord community for Flash Linear Attention ASAP Seminar Advances in Sequence Modeling from Algorithmic Perspectives latest posts Dec 3, 2024 DeltaNet Explained (Part III) Dec 3, 2024 DeltaNet Explained (Part II) Dec 3, 2024 DeltaNet Explained (Part I) selected publications ICML Gated Linear Attention Transformers with Hardware-Efficient Training Songlin Yang*,  Bailin Wang* ,  Yikang Shen ,  Rameswar Panda , and  Yoon Kim In , 2024 Abs HTML Code Poster Transformers with linear attention allow for efficient parallel training but can simultaneously be formulated as an RNN with 2D ...

Text analysis

redirect type

0 (-)

block type

0 (no issues)

detected language

1 (English)

category id

-

index version

1

spam phrases

0

Text statistics

text nonlatin

2

text cyrillic

0

text characters

2679

text words

461

text unique words

267

text lines

90

text sentences

16

text paragraphs

2

text words per sentence

28

text matched phrases

0

text matched dictionaries

0
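
The "text words per sentence" value appears to be derived from the counts above. Assuming truncating integer division (an assumption, but consistent with the stored value):

```python
# 461 words over 16 sentences is 28.8125; the record stores 28,
# which matches truncating (floor) division rather than rounding.
words, sentences = 461, 16
words_per_sentence = words // sentences  # -> 28
```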

RSS

rss path

-

rss status

0 (new)

rss found date

-

rss size orig

0

rss items

0

rss spam phrases

0

rss detected language

0 (awaiting analysis)

inbefore feed id

-

inbefore status

0 (new)

Sitemap

sitemap path

-

sitemap status

0 (new)

sitemap review version

0

sitemap urls count

0

sitemap urls adult

0

sitemap filtered products

0

sitemap filtered videos

0

sitemap found date

-

sitemap process date

-

sitemap first import date

-

sitemap last import date

-