Main

related bits: 0
processing priority: 3
site type: 5 (wiki-type site, growing by topic rather than chronologically)
review version: 11
html import: 20 (imported)

Events

first seen date: 2024-10-08 06:10:06
expired found date: -
created at: 2024-10-08 06:10:06
updated at: 2024-11-24 16:23:42

Domain name statistics

length: 17
crc: 64383
tld: 86
nm parts: 0
nm random digits: 0
nm rare letters: 0

Connections

is subdomain of id: 87719371 (github.io)
previous id: 0
replaced with id: 0
related id: -
dns primary id: 0
dns alternative id: 0
lifecycle status: 0 (unclassified, or currently active)

Subdomains and pages

deleted subdomains: 0
page imported products: 0
page imported random: 0
page imported parking: 0

Error counters

count skipped due to recent timeouts on the same server IP: 0
count content received but rejected due to 11-799: 0
count dns errors: 0
count cert errors: 0
count timeouts: 0
count http 429: 0
count http 404: 0
count http 403: 0
count http 5xx: 0
next operation date: -

Server

server bits:
server ip: -

Mainpage statistics

mp import status: 20
mp rejected date: -
mp saved date: -
mp size orig: 51160
mp size raw text: 6169
mp inner links count: 0
mp inner links status: 1 (no links)
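The gap between "mp size orig" (51160 bytes of fetched HTML) and "mp size raw text" (6169 characters) suggests the raw-text field measures visible text extracted from the original main page markup. A minimal stdlib-only sketch of such an extractor, under that assumption (the `TextExtractor` class and `raw_text` helper are illustrative names, not the actual pipeline):

```python
# Hedged sketch: extract visible text from HTML, skipping <script>/<style>,
# as a plausible way "mp size raw text" could be derived from "mp size orig".
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())


def raw_text(html: str) -> str:
    """Visible text of an HTML document, whitespace-normalized."""
    p = TextExtractor()
    p.feed(html)
    return " ".join(p.parts)


print(raw_text("<html><head><style>a{}</style></head>"
               "<body><h1>Junmo Kang</h1><p>About</p></body></html>"))
# → Junmo Kang About
```

The raw-text size would then simply be `len(raw_text(page_html))`, which is why it is an order of magnitude smaller than the original HTML size.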

Open Graph

title:
description: Homepage of Junmo Kang
image:
site name:
author: Junmo Kang
updated: 2026-03-09 14:51:33

raw text:

Junmo Kang Junmo Kang Ph.D. Student, Georgia Tech School of Interactive Computing Coda 1147M junmo.kang [AT] gatech.edu About Education Publications Vitae About Hello! I am currently a third-year Ph.D. student in the School of Interactive Computing at Georgia Tech advised by Alan Ritter and Wei Xu . Previously, I completed my M.S. in Computer Science at KAIST , where I started my NLP journey. I had a wonderful time interning at MIT-IBM Watson AI Lab twice. My research focuses on developing NLP models that are efficient and robust , with the goal of ensuring their practicality in real-world scenarios. How to induce NLP systems to be more scalable and cheaper in terms of data, compute, or parameters? How to design NLP models that remain robust when confronted with unseen cases in the wild? In particular, below are some keywords of my recent interests: Expert Large Language Models Instruction-Driv...

Text analysis

redirect type: 0 (-)
block type: 0 (no issues)
detected language: 1 (English)
category id: Zastosowania AI ("AI Applications", 149)
index version: 1
spam phrases: 0

Text statistics

text nonlatin: 0
text cyrillic: 0
text characters: 4519
text words: 834
text unique words: 350
text lines: 240
text sentences: 61
text paragraphs: 5
text words per sentence: 13
text matched phrases: 0
text matched dictionaries: 0
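The derived "text words per sentence" value is consistent with integer division of the word count by the sentence count: 834 / 61 ≈ 13.67, truncated to 13. A minimal sketch under that assumption (the `words_per_sentence` helper is illustrative, not the actual computation):

```python
# Hedged sketch: "text words per sentence" as floor(words / sentences),
# which matches the record's values (834 words, 61 sentences -> 13).
def words_per_sentence(words: int, sentences: int) -> int:
    """Integer words-per-sentence ratio; 0 when there are no sentences."""
    return words // sentences if sentences else 0


print(words_per_sentence(834, 61))  # → 13
```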

RSS

rss path:
rss status: 1 (priority 1 already searched, no matches found)
rss found date: -
rss size orig: 0
rss items: 0
rss spam phrases: 0
rss detected language: 0 (awaiting analysis)
inbefore feed id: -
inbefore status: 0 (new)

Sitemap

sitemap path:
sitemap status: 2 (priority 2 already searched, no matches found)
sitemap review version: 1
sitemap urls count: 0
sitemap urls adult: 0
sitemap filtered products: 0
sitemap filtered videos: 0
sitemap found date: -
sitemap process date: -
sitemap first import date: -
sitemap last import date: -