Strict and relaxed robots.txt parsing
This article describes the differences between strict and relaxed robots.txt parsing.
URL rewriting - modify or ignore URLs found during a crawl
This article describes the URL rewriting feature. It lets you simulate changes to the URL structure of a site.
New methods to confirm the ownership of your website
We just made it easier to confirm ownership of a website by releasing two additional confirmation methods. Read the details!
Deal with non-printable characters and control characters
Non-printable characters are a nightmare! This article shows how to find, display, and fix problems caused by non-printable characters on websites.
Mobile crawling and new hints
Mobile is becoming more and more important, so we focused on a solution for crawling mobile websites. Today's release brings this functionality along with some UI improvements and a number of new Hints.
Relaunch Announcement
It’s done - you can enjoy the relaunched website! We implemented your most requested features and we'll continue to do so.
How many pages do I need to crawl? - Crawling scenarios
With this guide we want to point out the difference between incomplete crawls and crawls that cover a whole site. Learn about different crawling scenarios and how many pages you need to crawl.
We’re celebrating! Audisto turned two!
We're celebrating 2 years of Audisto. It's been quite a journey so far. We have to say thank you to all of you!
API released to the public
We recently released the first public version of our API and made it available to all accounts. Read all the details!
How to write a good robots.txt
Advanced robots.txt usage - Learn how to address multiple robots, add comments, and use extensions like crawl-delay or wildcards with this robots.txt guide.
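
As a rough sketch of the features the guide covers, a robots.txt addressing multiple robots with comments and extensions might look like this (the bot names and paths are illustrative; crawl-delay and wildcards are non-standard extensions whose support varies by crawler):

```
# Comments start with a hash sign.

# Default rules for all robots.
User-agent: *
Disallow: /private/

# Address a specific robot; "$" and "*" are wildcard
# extensions supported by some crawlers.
User-agent: Googlebot
Disallow: /*.pdf$

# Crawl-delay is a non-standard extension honored by some crawlers.
User-agent: Bingbot
Crawl-delay: 10
```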