id
processing priority
3
site type
5 (wiki-type site, growing by topic rather than chronologically)
review version
11
html import
20 (imported)
first seen date
2024-01-22 07:10:40
expired found date
-
created at
2024-06-09 00:23:23
updated at
2026-01-01 21:37:31
length
21
crc
39716
tld
86
nm parts
0
nm random digits
0
nm rare letters
0
is subdomain of id
87719371 (github.io)
previous id
0
replaced with id
0
related id
-
dns primary id
0
dns alternative id
0
lifecycle status
0 (unclassified, or currently active)
deleted subdomains
0
page imported products
0
page imported random
0
page imported parking
0
count skipped due to recent timeouts on the same server IP
0
count content received but rejected due to 11-799
0
count dns errors
0
count cert errors
0
count timeouts
0
count http 429
0
count http 404
0
count http 403
0
count http 5xx
0
next operation date
-
server bits
-
server ip
-
mp import status
20
mp rejected date
-
mp saved date
-
mp size orig
4881
mp size raw text
1009
mp inner links count
3
mp inner links status
20 (imported)
title
description
Anything about NLP in Korean
image
site name
author
NLP in Korean
updated
2025-12-21 03:39:54
raw text
NLP in Korean – Anything about NLP in Korean Blog About The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) March 5, 2019 Following the previous post, this time I have translated a post on BERT and ELMo, other contextualized Language Models. As before, the post was taken, with permission, from the blog by Jay Alammar; the original can be found at this link. Read More The Illustrated Transformer December 20, 2018 Following the attention seq2seq model covered in the previous post, I would like to talk about the Transformer, another model built on attention. The Transformer, introduced by Google at NIPS 2017, drew enormous attention in the NLP community, perhaps because it proposed an entirely new model, moving away from research that had been dominated by CNNs and RNNs. It also showed large performance gains over recent work when actually applied. Read More Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention) December 17, 2018 In a survey asking researchers to pick the three most influential NLP studies of the past ten years, the work many of them named was the sequence-to-sequence (Seq2seq) + Attention model published in 2014 (Sutskever et al., 2014, Cho et al., 2014). ...
redirect type
0 (-)
block type
0 (no issues)
detected language
1 (English)
category id
16 (Other)
index version
2025110801
spam phrases
0
text nonlatin
262
text cyrillic
0
text characters
748
text words
184
text unique words
142
text lines
28
text sentences
8
text paragraphs
3
text words per sentence
23
text matched phrases
0
text matched dictionaries
0
links self subdomains
0
links other subdomains
1 - papers.nips.cc
links other domains
1 - emnlp2014.org
links spam adult
0
links spam random
0
links spam expired
0
links ext activities
0
links ext ecommerce
0
links ext finance
0
links ext crypto
0
links ext booking
0
links ext news
0
links ext leaks
0
links ext ugc
links ext klim
0
links ext generic
0
dol status
0
dol updated
2025-12-21 03:39:54
rss path
rss status
1 (priority 1 already searched, no matches found)
rss found date
-
rss size orig
0
rss items
0
rss spam phrases
0
rss detected language
0 (awaiting analysis)
inbefore feed id
-
inbefore status
0 (new)
sitemap path
sitemap status
30 (processing completed, results pushed to table crawler_sitemaps.ext_domain_sitemap_lists)
sitemap review version
1
sitemap urls count
5
sitemap urls adult
0
sitemap filtered products
0
sitemap filtered videos
0
sitemap found date
2024-01-24 14:06:53
sitemap process date
2024-11-05 02:47:04
sitemap first import date
-
sitemap last import date
-