Product Ideas Portal

Client side web crawler or batch load URL?

The current web crawler cannot test internal sites behind firewalls; reaching them requires additional whitelist requests and other changes that may be difficult.

Would there be a way to run the crawler from a browser extension on the client side? This could also help with pages that require logins.

The workflow could be: the user opens a site, optionally logs in, then runs Access Assistant to crawl through the page and other linked pages, instead of having to manually open each page and run tests against it.

Or, could Access Assistant somehow loop through a list of URLs (TXT/CSV/sitemap.xml) in a browser session?
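The batch idea above depends on turning a TXT/CSV file or a sitemap.xml into a flat list of URLs for the browser session to walk through. A minimal sketch of that parsing step (hypothetical helper names, not part of Access Assistant or any QA tool):

```python
import csv
import io
import xml.etree.ElementTree as ET

# Standard sitemap namespace per sitemaps.org
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Extract every <loc> entry from a sitemap.xml string."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def urls_from_csv(csv_text):
    """Read one URL per row (first column) from TXT/CSV text."""
    return [row[0].strip() for row in csv.reader(io.StringIO(csv_text)) if row]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(urls_from_sitemap(sample))
# A browser extension could then open each URL in turn and run the test.
```

The extension-side loop (opening each URL in a tab and triggering the test) would use the browser's tabs API and is omitted here, since it depends on how the testing tool is invoked.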

  • Guest
  • Jul 17 2020
  • Deferred to Next Gen
  • Admin
    Zahra Safavian commented
    4 Aug, 2020 03:05pm

    This is something we have considered in the past. There are some technical issues with this, namely that it would potentially tie up your browser for a longer-than-acceptable period of time. We recently released an alternate solution to spidering sites that are behind a firewall. Our Continuum Scripting Framework allows this sort of testing, while reporting results to AMP, with relative ease in comparison to the other Continuum SDKs, which will require much more scripting. You can read more about this solution in our most recent release announcement: