Consider the account management URL of an online store: https://surfoon.com/account/login
We have blocked it in robots.txt by adding the following directive: "Disallow: /account"
This prevents Google from spending crawl budget crawling and rendering a URL that does not serve any search intent.
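As a sketch, this is what the relevant part of the robots.txt file could look like (the /account path is taken from the example above; the rest is a minimal assumption):

```
# Hypothetical robots.txt for surfoon.com
# Blocks crawling of the whole account area, including /account/login
User-agent: *
Disallow: /account
```

Note that Disallow matches by prefix, so "/account" also covers "/account/login" and any other URL under that path.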
If you use Shopify, here is a tutorial on how to edit robots.txt.
2. Improving loading speed
The goal is to reduce the time it takes Google to discover and crawl our website, so that it can discover as many of our project's URLs as possible in less time.
To know the real status of our website's loading speed, I recommend the Core Web Vitals section of Google Search Console.
The goal will be to achieve "fast" URLs by applying good WPO (Web Performance Optimization) practices, with the aim of reducing the weight and the number of elements on each URL.
3. Using nofollow links
This involves marking links as nofollow so that Google, in its process of jumping from link to link, does not spend time on URLs that we have already blocked via robots.txt or marked with a noindex tag.
Following the previous example of an eCommerce account management page: links to it are usually found in the main menu.
The problem is that this follow link appears throughout the entire website, giving Google the impression that it is important. Therefore, we should mark it as nofollow, so that Google does not take it into account and instead prioritizes the links that do serve a search intent.
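In HTML, this comes down to adding the rel="nofollow" attribute to the menu link. A minimal sketch, using the example URL from above (link text and surrounding markup are assumptions):

```html
<!-- Hypothetical main-menu link to the account area.
     rel="nofollow" signals Google not to follow this link,
     so crawl budget goes to links that serve a search intent. -->
<a href="https://surfoon.com/account/login" rel="nofollow">My account</a>
```

Without the attribute (a plain follow link), Google treats the link as a normal signal of importance; with rel="nofollow", it is a hint to skip it.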
In the image, you can see how the online store's management sections are highlighted in red. You can check this yourself with the Linkparser extension, which lets you detect follow and nofollow links.