
6.2.1 Robots.txt update

Who

Search engine crawlers

What

Robots.txt file in docroot.

When

The previous rule caused crawlers to miss content linked on the site; paginated listing pages should be followable.

Where

Events list, news list, and all paginated content lists such as the glossary and degree explorer.

Why

The site needs to be fully transparent to search engines for better SEO.

How

Removed one line from robots.txt: Disallow: *?*
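
A minimal sketch of the change, assuming a typical robots.txt layout; the User-agent group and the example URL are illustrative, not taken from the actual file:

    # Before (illustrative): the wildcard rule blocked every URL containing a query string
    User-agent: *
    Disallow: *?*

    # After: the rule is removed, so query-string URLs remain crawlable
    User-agent: *

With the wildcard rule gone, crawlers can follow paginated links such as a hypothetical /events?page=2, so all listed content can be discovered and indexed.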

Tags
  • SEO
  • release
  • crawler