Main

related bits

0

processing priority

4

site type

0 (generic, awaiting analysis)

review version

11

html import

27 (unknown)

Events

first seen date

2024-01-12 13:33:56

expired found date

-

created at

2024-06-21 13:54:32

updated at

2026-02-26 06:43:37

Domain name statistics

length

11

crc

61042

tld

2211

nm parts

0

nm random digits

0

nm rare letters

0
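The `crc` value above (61042) fits in 16 bits, so it looks like a 16-bit checksum of the domain name, but the record does not say which CRC variant or input normalization the crawler uses. A minimal sketch, assuming CRC-16/ARC over the ASCII name (the variant and the example domain are illustrative assumptions, not taken from this record):

```python
def crc16_arc(data: bytes) -> int:
    """CRC-16/ARC: polynomial 0x8005 (reflected 0xA001), init 0x0000."""
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

# Hypothetical usage: checksum of a lowercased domain name.
print(crc16_arc(b"example.com"))
```

Whatever variant the system actually uses, a fixed-width checksum like this gives a cheap secondary index for deduplicating and bucketing domain names without string comparison.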

Connections

is subdomain of id

-

previous id

0

replaced with id

0

related id

-

dns primary id

143779477

dns alternative id

0

lifecycle status

0 (unclassified, or currently active)

Subdomains and pages

deleted subdomains

0

page imported products

0

page imported random

0

page imported parking

0

Error counters

count skipped due to recent timeouts on the same server IP

0

count content received but rejected due to 11-799

0

count dns errors

0

count cert errors

0

count timeouts

0

count http 429

0

count http 404

0

count http 403

0

count http 5xx

0

next operation date

-

Server

server bits

server ip

-

Mainpage statistics

mp import status

27

mp rejected date

-

mp saved date

-

mp size orig

110221

mp size raw text

29346

mp inner links count

30

mp inner links status

20 (imported)

Open Graph

title

description

image

site name

author

updated

2026-02-20 22:12:44

raw text

Commercial Intelligence – systems that know and understand and think and learn

Posts

Posted on April 23, 2023

GPT under $100,000?

For the last several years, we’ve been hearing about how much it costs to build ever larger language models. Today, a state-of-the-art language model requires approaching a million-trillion-trillion (10^24) arithmetic operations involving hundreds of billions of parameters. Doing the math, assuming a decent, if older GPU, such as an A100, you come up with how many years this computation will take. Then you figure out how many GPUs you need given how many days you have to complete the computation. For example, Meta recently published that training a 65 billion parameter version of the LLaMA model using over a trillion tokens of text took approximately 21 days on roughly 2,000 such GPUs. That’s almost exa...
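The back-of-envelope in the excerpt above can be checked directly. A sketch using the common ~6 FLOPs-per-parameter-per-token estimate; the A100 peak throughput and the utilization figure are assumptions of this sketch, not numbers from the post:

```python
# Rough training-cost arithmetic for the LLaMA-65B example in the excerpt.
PARAMS = 65e9             # 65 billion parameters
TOKENS = 1.0e12           # "over a trillion tokens"
A100_PEAK_FLOPS = 312e12  # A100 peak BF16 throughput, FLOP/s (assumed figure)
UTILIZATION = 0.4         # assumed effective utilization
GPUS = 2000               # "roughly 2,000 such GPUs"

total_flops = 6 * PARAMS * TOKENS  # ~3.9e23, i.e. "approaching 10^24"
seconds = total_flops / (A100_PEAK_FLOPS * UTILIZATION * GPUS)
days = seconds / 86400
print(f"{total_flops:.2e} FLOPs, ~{days:.0f} days on {GPUS} GPUs")
```

Under these assumptions the estimate comes out in the high teens of days, in the same ballpark as the ~21 days the post reports; real utilization below 40% would push it toward the reported figure.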

Text analysis

redirect type

31 (document.location)

block type

0 (no issues)

detected language

1 (English)

category id

AI applications (149)

index version

1

spam phrases

0

Text statistics

text nonlatin

0

text cyrillic

0

text characters

23215

text words

4719

text unique words

1394

text lines

431

text sentences

274

text paragraphs

71

text words per sentence

17

text matched phrases

0

text matched dictionaries

0
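Most of the counters above can be reproduced from the raw text with simple tokenization. The exact rules (what counts as a word, a sentence boundary, a unique word) are assumptions here, not the crawler's actual definitions; a sketch:

```python
import re

def text_stats(text: str) -> dict:
    """Derive basic text statistics; tokenization rules are assumptions."""
    words = re.findall(r"[A-Za-z']+", text)                      # latin words only
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "text_characters": len(text),
        "text_words": len(words),
        "text_unique_words": len({w.lower() for w in words}),
        "text_sentences": len(sentences),
        # integer ratio, matching the whole-number value in the record
        "text_words_per_sentence": len(words) // max(len(sentences), 1),
    }

stats = text_stats("One two three. Four five six seven.")
print(stats)
```

Note that the recorded ratio (4719 words / 274 sentences ≈ 17) is consistent with this kind of integer division of the two raw counters.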

RSS

rss path

rss status

0 (new)

rss found date

-

rss size orig

0

rss items

0

rss spam phrases

0

rss detected language

0 (awaiting analysis)

inbefore feed id

-

inbefore status

0 (new)

Sitemap

sitemap path

sitemap status

0 (new)

sitemap review version

2

sitemap urls count

0

sitemap urls adult

0

sitemap filtered products

0

sitemap filtered videos

0

sitemap found date

-

sitemap process date

-

sitemap first import date

-

sitemap last import date

-