Screaming Frog Log File Analyser (Screaming Frog Ltd.)
winget install --id=ScreamingFrog.LogFileAnalyser -e
The Screaming Frog SEO Log File Analyser allows you to upload your log files, verify search engine bots, identify crawled URLs and analyse search bot behaviour for invaluable SEO insight. The Log File Analyser is light but extremely powerful, able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions.

Some of the common uses include:
- View and analyse exactly which URLs Googlebot and other search bots are able to crawl, when and how frequently (see the parsing sketch below).
- Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.
- Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
- Find temporary and permanent redirects encountered by search bots, which might differ from those seen in a browser or simulated crawl.
- Analyse your most and least crawled URLs and directories to identify waste and improve crawl efficiency.
- Import and match any data with a 'URLs' column against log file data, such as crawls, directives or external link data, for advanced analysis.

Analyse 1k log events for free, or buy a Log File Analyser licence for £99 per year to remove the limit.
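As an illustration of the kind of per-URL bot analysis described above (not the tool's own code), here is a minimal sketch that counts Googlebot requests per URL and per status code. It assumes a Combined Log Format access log named access.log; both the file name and the format are assumptions for illustration only.

```python
import re
from collections import Counter

# Combined Log Format: ip ident user [time] "request" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

url_hits = Counter()
status_hits = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:  # assumed path
    for line in log:
        match = LOG_LINE.match(line)
        if not match:
            continue
        # Filter on the user-agent string only; genuine verification needs a DNS check.
        if "Googlebot" not in match.group("agent"):
            continue
        url_hits[match.group("url")] += 1
        status_hits[match.group("status")] += 1

print("Most crawled URLs:", url_hits.most_common(10))
print("Status codes seen by Googlebot:", dict(status_hits))
```

A spreadsheet or the Log File Analyser itself will do this at far greater scale; the sketch only shows the underlying idea of aggregating bot events by URL and response code.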
The Screaming Frog SEO Log File Analyser is a powerful tool designed to help SEO professionals and webmasters analyse search engine bot activity and optimise website performance. By uploading log files, users can verify search engine bots, identify crawled URLs, and gain insights into search bot behaviour for actionable SEO strategies.
Key Features:
- Verify Search Engine Bots: Automatically detect and validate search engine bots such as Googlebot and Bingbot (see the DNS verification sketch after this list).
- Analyse Crawl Frequency: Track how often search bots crawl your website and identify the most and least frequently visited URLs.
- Identify Errors and Redirects: Discover broken links, server errors (4XX/5XX), and redirects encountered by search bots.
- Optimise Crawl Efficiency: Analyse crawl patterns to improve site structure and reduce wasted crawl budget.
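Bot verification typically relies on a reverse DNS lookup followed by a forward lookup, the method Google documents for confirming Googlebot. Below is a hedged sketch of that check, independent of the Log File Analyser's own implementation; the example IP address is an assumption for illustration.

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Return True if the IP passes the reverse/forward DNS check for Googlebot."""
    try:
        # Reverse DNS: the hostname should end in googlebot.com or google.com.
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward DNS: the hostname must resolve back to the original IP.
        _, _, addresses = socket.gethostbyname_ex(host)
        return ip in addresses
    except socket.herror:    # no reverse DNS record
        return False
    except socket.gaierror:  # forward lookup failed
        return False

# Example call with an assumed Googlebot address:
print(is_verified_googlebot("66.249.66.1"))
```

The same two-step check works for other bots (e.g. Bingbot against search.msn.com), with the expected domain suffixes swapped accordingly.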