A bot, also known as a web robot, web spider, or web crawler, is a software application designed to perform simple, repetitive tasks automatically, far faster and more consistently than any human could.
The most common use of bots is web spidering, or web crawling: fetching pages and following the links they contain to discover new content.
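To make the idea of crawling concrete, here is a minimal, illustrative sketch of a crawler using only the Python standard library. It is not SEMrushBot's actual implementation; the `ExampleCrawler/1.0` user agent and the seed URL are placeholders, and a production crawler would also honor robots.txt and rate limits.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen


class LinkParser(HTMLParser):
    """Collects href values from anchor tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl starting from seed_url, staying on one host."""
    host = urlparse(seed_url).netloc
    queue = [seed_url]
    visited = set()

    while queue and len(visited) < max_pages:
        url = queue.pop(0)
        if url in visited:
            continue
        visited.add(url)

        try:
            # Placeholder user agent; a real bot identifies itself clearly
            request = Request(url, headers={"User-Agent": "ExampleCrawler/1.0"})
            with urlopen(request, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        parser = LinkParser()
        parser.feed(html)

        # Resolve relative links and enqueue only same-host URLs not yet seen
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == host and absolute not in visited:
                queue.append(absolute)

    return visited


if __name__ == "__main__":
    for page in crawl("https://example.com"):
        print(page)
```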
SEMrushBot is the search bot software that SEMrush sends out to discover and collect new and updated web data.
Data collected by SEMrushBot is used for:
- the public backlink search engine index maintained as a dedicated tool called Backlink Analytics (a webgraph of links)
- the Site Audit tool, which analyzes on-page SEO, technical, and usability issues
- the Backlink Audit tool, which helps discover and clean up potentially dangerous backlinks in your profile
- the Link Building tool, which helps you find prospects, reach out to them and monitor your newly acquired backlinks
- the SEO Writing Assistant tool, to check whether a URL is accessible
- the Brand Monitoring tool, to index and search for articles
- the Content Analyzer and Post Tracking tools' reports
- the On Page SEO Checker and SEO Content Template tools' reports
- the Topic Research tool's reports
- the SEO A/B Testing tool, to create A/B tests on your website