Fli
03-30-2014, 02:49 PM
http://domaintools.com
http://whoismind.com
-------------------
Update: There seems to be no cheap way to obtain millions of WHOIS records frequently enough, because of domain registrars' rate limits. https://stackoverflow.com/questions/2922981/whois-lookup-limits-how-to-work-around-daily-quota-query-limits
Hi,
I think it could be a good idea to create a PHP + MySQL site which stores WHOIS records, but I mean not the present records, but WHOIS history records (so far only one paid service offers that). For example, as a domain is sold, or expires and someone else registers it, its WHOIS details change.
So my idea is to create a super basic interface for people to check the WHOIS history of some domain.
The website should have a field where the user enters a domain name and clicks through to that domain's page with its WHOIS history details.
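As a rough illustration (not a spec), a minimal sketch of such a lookup page in PHP with PDO could look like the following. The table name whois_history, its columns (domain, checked_at, raw_record), and the database credentials are my assumptions for the example, not anything fixed:

<?php
// history.php -- minimal lookup-page sketch. Assumes a hypothetical
// whois_history(domain, checked_at, raw_record) table and placeholder
// MySQL credentials; adjust both to the real schema.
$pdo = new PDO('mysql:host=localhost;dbname=whoisdb;charset=utf8',
               'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$domain = isset($_GET['domain']) ? strtolower(trim($_GET['domain'])) : '';

if ($domain === '') {
    // No domain submitted yet: show the input field.
    echo '<form method="get">';
    echo '<input type="text" name="domain" placeholder="example.com">';
    echo '<input type="submit" value="Show WHOIS history">';
    echo '</form>';
} else {
    // List every stored snapshot for this domain, newest first.
    $stmt = $pdo->prepare(
        'SELECT checked_at, raw_record FROM whois_history
         WHERE domain = ? ORDER BY checked_at DESC');
    $stmt->execute(array($domain));
    foreach ($stmt as $row) {
        echo '<h3>' . htmlspecialchars($row['checked_at']) . '</h3>';
        echo '<pre>' . htmlspecialchars($row['raw_record']) . '</pre>';
    }
}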
-----------
The programming requirements:
- MySQL + PHP knowledge.
- knowing how to use indexes in the MySQL database (the database needs to be able to store billions of domain name records, so probably one table per TLD, or a table per first character of the domain; it is up to the programmer to make it high performance and keep performance in mind)
- many tables should be used, for performance and for the ability to extend the script with more features and filtering in the future (which I can order from the programmer)
- I can create a bash script for extracting WHOIS records and adding them into MySQL, or the programmer can write a universal cron.php which uses the default Linux whois client to fill the database with data periodically (see the sketch after this list).
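For illustration only, here is a minimal sketch of what such a cron.php could look like, assuming the same hypothetical whois_history table as in the page sketch above, plus a hypothetical domains(name) table listing which domains to track. It uses one big table with a composite (domain, checked_at) index instead of the per-TLD tables suggested above; swapping in per-TLD tables would work the same way:

<?php
// cron.php -- minimal ingestion sketch, meant to be run from crontab.
// Assumes a hypothetical domains(name) table with the domains to track;
// creates the whois_history snapshot table on first run.
$pdo = new PDO('mysql:host=localhost;dbname=whoisdb;charset=utf8',
               'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Composite index on (domain, checked_at) so the history lookup on the
// site is one index range scan even with many millions of rows.
$pdo->exec(
    'CREATE TABLE IF NOT EXISTS whois_history (
        id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        domain     VARCHAR(255)    NOT NULL,
        checked_at DATETIME        NOT NULL,
        raw_record MEDIUMTEXT      NOT NULL,
        KEY idx_domain_time (domain, checked_at)
    ) ENGINE=InnoDB');

$insert = $pdo->prepare(
    'INSERT INTO whois_history (domain, checked_at, raw_record)
     VALUES (?, NOW(), ?)');

$domains = $pdo->query('SELECT name FROM domains')
               ->fetchAll(PDO::FETCH_COLUMN);
foreach ($domains as $domain) {
    // Use the stock Linux whois client; escapeshellarg() guards the shell.
    $record = shell_exec('whois ' . escapeshellarg($domain));
    if ($record !== null && trim($record) !== '') {
        $insert->execute(array($domain, $record));
    }
    sleep(1); // roughly one query per second, per the notes below
}

Whether one big indexed table, per-TLD tables, or MySQL partitioning holds up best at billions of rows is exactly the performance question left to the programmer here.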
The programmer will have their advertising banner or message on the website for one year, and for 3 years a footer message that they are the author of the script (if they wish), plus I can pay around $5-$35 for the programming.
Thank You
----
notes:
here is what I found out regarding this:
a year has 31,536,000 seconds (roughly 32 million)
there are about 200 million registered domains (.info, .net, .com)
if the set were reduced by some criterion (the domain has an Alexa rank, backlinks, ...), around 32 million domains would be enough. One check per second consumes very little system resources. The checks can be run from multiple IP addresses.
one WHOIS record is about 3 KB of data (tested)
200 million records is 572.2 GB (I have the option to run a server with 4 TB+ of space); WHOIS for the last 4 years is enough for me
it is not so unrealistic
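For what it is worth, a quick PHP sanity check of the arithmetic in these notes (the 3 KB record size and the 200 million domain count are the figures above, not my own measurements; the 572.2 figure comes out when KB/GB are read as binary units):

<?php
// Back-of-envelope check of the numbers in the notes above.
$seconds_per_year = 365 * 24 * 3600;           // 31,536,000
echo "checks per year at 1/second: $seconds_per_year\n"; // ~31.5 million

$record_kib = 3;                               // ~3 KiB per WHOIS record
$records    = 200 * 1000 * 1000;               // 200 million domains
$total_gib  = $records * $record_kib / (1024 * 1024);
printf("one snapshot of 200M records: %.1f GiB\n", $total_gib); // 572.2

// IPs needed to snapshot all 200M domains once a year at 1 query/second:
printf("source IPs needed: %.1f\n", $records / $seconds_per_year); // ~6.3

So a single IP at one query per second roughly covers the reduced set of 32 million domains in a year, and six or seven IPs would cover all 200 million, which matches the idea of checking from multiple IP addresses.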