Biobot is the web crawling agent of Biometrica Systems, Inc. (web crawlers are sometimes called "spiders"). Biobot crawls through a subset of the internet, digesting information to make it searchable for our clients. The crawl primarily looks for high-quality demographic and photographic data. Think of it as a targeted search index, a Google-lite with a very specific area of focus. This data is fed into our fusion center to improve the capabilities and accuracy of our near-real-time threat detection systems.
Biobot is in fact the coordinated effort of many computers crawling many different sites. We deliberately target a subset of the internet where we know good-quality information lives, so that we can avoid poor, flawed, or purposely incorrect sources.
For Webmasters
How Biobot accesses your website
Biobot is designed to crawl your site with minimal impact. It should initiate a crawl at most once per hour. Different types of crawls follow different patterns (for example, looking for new data versus checking for updates to old data), and depending on the size of your website, more than one crawl may run at the same time. We limit the rate of the crawl whenever possible so that we do not affect your site's overall performance. If our crawler is hurting your site's performance, please let us know through the form on the Contact Us page.
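If you want to see how often Biobot is visiting, your server access logs are the best reference. The sketch below, in Python, counts the crawler's requests per hour from a combined-format access log; the User-Agent token "Biobot" is an assumption for illustration, so check your own logs for the exact string our crawler sends.

    import re
    from collections import Counter

    # Matches the timestamp in a combined-format access log line,
    # capturing day/month/year:hour as the bucket key.
    TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}):\d{2}:\d{2}')

    def biobot_hits_per_hour(log_path):
        """Count requests per hour whose line mentions 'Biobot'
        ('Biobot' is an assumed token -- verify against your own logs)."""
        hits = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as f:
            for line in f:
                if "Biobot" not in line:
                    continue
                match = TIMESTAMP.search(line)
                if match:
                    hits[match.group(1)] += 1
        return hits

    if __name__ == "__main__":
        for hour, count in sorted(biobot_hits_per_hour("access.log").items()):
            print(hour, count)

If the counts show much more than roughly one crawl per hour, or the load arrives in concentrated bursts, that is exactly the kind of detail worth including when you contact us.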
Making your site crawlable
If you have a set of information that you think would be a strong contribution to our efforts, please let us know through the form on the Contact Us page. It may take a couple of days for us to point Biobot at your site, so we appreciate your patience.
robots.txt
Every site Biobot crawls was explicitly selected for crawling. For that reason, Biobot does not honor robots.txt directives. We apologize for any inconvenience.
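Because Biobot does not read robots.txt, any access restriction has to happen at the server or application layer rather than through crawler directives. Purely as an illustrative sketch, here is a minimal Python WSGI middleware that refuses such requests; again, the "Biobot" User-Agent token is an assumption, not a confirmed identifier.

    # Minimal WSGI middleware sketch: refuse requests whose User-Agent
    # contains "Biobot" (an assumed token -- verify against your logs).
    def deny_biobot(app):
        def middleware(environ, start_response):
            agent = environ.get("HTTP_USER_AGENT", "")
            if "Biobot" in agent:
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Forbidden"]
            return app(environ, start_response)
        return middleware

You would wrap your application as app = deny_biobot(app); the same User-Agent check can be expressed in most web servers' native configuration. That said, if you would rather we simply stop crawling your site, the form on the Contact Us page is the more direct route.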
Feel free to contact us at info@biometrica.com.