Customize Resources Crawled in JavaScript Crawls
Running JavaScript-enabled Site Audits requires each page to be rendered before crawling. This includes loading all the resources (CSS, JavaScript, etc.) on the page.
The seoClarity Clarity Audit crawler is configured to automatically block some common resources that pertain to analytics and popular ad services, to ensure that reporting in those systems is not inflated by the Clarity Audit crawler.
However, there may be instances when you want to block additional specific resources from loading and firing impressions beyond the ones we exclude automatically. To do this, enter the resource URL you would like blocked into the Block Resources field; our JavaScript crawler will then skip that resource when rendering the page.
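The article does not describe how the crawler applies these rules internally, so the following is a minimal, hypothetical sketch in Python of the general idea: before fetching each resource during rendering, the crawler checks its URL against the user-supplied block list. The should_block helper and the fnmatch-style wildcard syntax are illustration-only assumptions, not seoClarity's actual matching rules.

```python
# Illustrative only: a crawler-side check that skips loading any resource
# whose URL matches a user-supplied block pattern. Pattern syntax here is
# fnmatch-style wildcards, an assumption for this sketch.
from fnmatch import fnmatch


def should_block(resource_url: str, block_patterns: list[str]) -> bool:
    """Return True if the resource URL matches any block pattern."""
    return any(fnmatch(resource_url, pattern) for pattern in block_patterns)


# Hypothetical block list entered in the Block Resources field.
patterns = ["*google-analytics.com*", "*/ads/*"]

print(should_block("https://www.google-analytics.com/analytics.js", patterns))  # True
print(should_block("https://example.com/styles/main.css", patterns))            # False
```

In a real headless-browser crawler this check would typically run inside a network-request interception hook, so blocked resources are never fetched and therefore never register an impression in the analytics or ad system.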
Here are some examples of exclusion patterns you can enter in the Block Resources field:
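The original list of examples did not survive extraction, so the entries below are hypothetical illustrations of the kinds of values a block list like this commonly accepts (a full resource URL, or a wildcard pattern covering a host or path); they are not confirmed seoClarity syntax.

```
https://www.google-analytics.com/analytics.js
*doubleclick.net*
*/ads/*
```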
Below is a screenshot of where the Block Resources field can be set while setting up a new crawl:
Related Articles
What type of crawl should I set up, Standard or Javascript?
Site Health: View Details of all Resource Urls rendered in Javascript crawls
Site Audit Projects
Site Audit Settings
How do recurring crawls work?