© 2015-2018 Alvaro Carballo Garcia
Project 10 is expected to be the last formal project of varocarbas.com. I will continue using this site as my main self-promotional, R&D-focused online resource, but through other, more suitable formats such as the domain ranking.
Note that the final versions of all the successfully completed projects (5 to 10) will remain available.
Ranking bot type 2

These bots are part of the domain ranking. They perform simple backlink-counting tasks and store only the following information:
  • Domain name.
  • Contents of the robots.txt file in the root directory of each domain.
  • Number of links pointing to each domain from random others.
  • Global assessment of the site/page, based on the aforementioned information and further aspects analysed on the fly.
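The four stored items above could be sketched as a single per-domain record. This is purely an illustration: the field names and types are hypothetical, and the actual bots use a PHP + MySQL/MariaDB implementation whose schema is not published.

```python
from dataclasses import dataclass

@dataclass
class DomainRecord:
    # Hypothetical sketch of the four stored items; the real
    # MySQL/MariaDB schema used by the bots is not published.
    domain: str          # e.g. "example.com"
    robots_txt: str      # raw contents of the domain's /robots.txt
    backlink_count: int  # links pointing here from random other domains
    assessment: float    # global on-the-fly quality score

record = DomainRecord(
    domain="example.com",
    robots_txt="User-agent: *\nDisallow:",
    backlink_count=12,
    assessment=0.8,
)
```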
All of them also have the following features in common:
  • Their user agent is: RankingBot2 -- https://varocarbas.com/bot_ranking2/ (they self-identify as RankingBot2).
  • All of them are running from the IP address
    NOTE: this is the IP address currently associated with my office's network, dynamically assigned by my internet provider (Telefónica de España).
  • They only visit URLs that are not forbidden by the corresponding robots.txt file, considering exclusively the entries that expressly refer to them ('RankingBot2') or to any bot ('*').
    When confirming reasonable suspicions of dishonest inter-linking (e.g., a group of sites created for the sole purpose of providing backlinks to each other), these bots might ignore the robots.txt indications.
  • As usual, I am the sole author of these bots and developed them completely from scratch (PHP + MySQL/MariaDB).
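The robots.txt behaviour described above can be illustrated with Python's standard-library parser. This is not the bots' actual code (they are written in PHP); it only shows the same matching idea: rules in a group expressly naming the agent take precedence, with the '*' group as the fallback.

```python
from urllib import robotparser

# Hypothetical robots.txt used only for this illustration.
ROBOTS_TXT = """\
User-agent: RankingBot2
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The group expressly naming RankingBot2 applies to it, so only
# /private/ is off-limits; other agents fall back to the '*' group.
print(parser.can_fetch("RankingBot2", "https://example.com/private/x"))  # False
print(parser.can_fetch("RankingBot2", "https://example.com/page"))       # True
print(parser.can_fetch("OtherBot", "https://example.com/page"))          # False
```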