Simulate changes in the website architecture

Audisto is so powerful that it even allows you to simulate changes to your site architecture. You can use advanced features like a custom robots.txt or URL Rewriting to simulate these changes.

When Audisto crawls your site, it might find a high number of URLs caused by errors or poor design choices. Once you are aware of those URLs, you can use Audisto to simulate changes within your site's architecture.

There are various ways to do this. When you create a new crawl, you can submit a custom robots.txt file that will be used for the crawl. With this approach the blocked URLs will not be crawled, but they will still be found and reported.

You can go even further with the URL Rewriting feature of our ultimate edition. URL Rewriting allows you to ignore URLs or to rewrite them completely. Imagine we find a large number of URLs generated by a faceted navigation: you can simulate how your site would look without it by simply ignoring those URLs. Another example is rewriting URLs that contain session IDs; you can write rules that remove the session IDs from your URLs.
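For the custom robots.txt approach, a few Disallow patterns are usually enough to take such URLs out of the crawl. The parameter names below are only placeholders for this illustration, and the patterns assume the crawler honours the common * wildcard:

```
User-agent: *
# Illustrative patterns: keep faceted navigation and session-ID URLs out of the crawl
Disallow: /*?filter=
Disallow: /*&filter=
Disallow: /*sessionid=
```

URL Rewriting rules have their own format inside Audisto, but the effect of a session-ID rule can be sketched in a few lines of Python. This is only an illustration of the idea, not the actual rule syntax, and the parameter names are assumptions:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical session parameters; replace them with whatever your site uses.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}

def strip_session_ids(url: str) -> str:
    """Return the URL with session-ID query parameters removed."""
    parts = urlsplit(url)
    query = [(key, value)
             for key, value in parse_qsl(parts.query, keep_blank_values=True)
             if key.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(query)))

# Both variants collapse onto one canonical URL:
print(strip_session_ids("https://www.example.com/shoes?sessionid=a1b2c3&color=red"))
print(strip_session_ids("https://www.example.com/shoes?color=red"))
# -> https://www.example.com/shoes?color=red (twice)
```

A rule like this maps every session-ID variant of a page onto the same URL, so the duplicates disappear from the simulated crawl and the reports show the structure you would get after the cleanup.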

To simulate more complex changes, you can also use the Audisto Crawler to crawl your staging environments. The bot reference page has detailed information on how to whitelist the crawler.
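As a rough sketch of what such a whitelist could look like: if your staging environment sits behind Apache with IP-based access control, a rule along these lines would keep the site closed to the public while letting the crawler in. The address range below is only a documentation placeholder; use the addresses and instructions from the bot reference page:

```
# Apache 2.4 sketch for a protected staging site:
# only the crawler's addresses (placeholder range shown) are allowed in.
Require ip 203.0.113.0/24
```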
