“There’s a small subset of pages with a template we generally no-index that we want to get on Google. Dev doesn’t want to mess with the template or create a new one, so they’re using JS to remove the global robots nofollow meta that’s generally part of the template.
First let me clarify, this is not my recommended solution; but will it work? I know Google will read the JS, so theoretically once it renders the page fully it’ll see it without the nofollow. But will the fact Googlebot sees nofollow in the initial HTML before full render prevent links within the page from being counted in the network graph?”
According to John Mueller, Google always uses the most restrictive setting it finds on a page:
“Google will use the most restrictive setting you have on the page (this matches how robots meta tags are generally processed, eg if you have a ‘noindex’ + ‘index’, then the ‘noindex’ will override the ‘index’).
If you have a ‘nofollow’ in static HTML and remove it with JS, Google will still use the ‘nofollow’. Similarly, if you don’t have any robots meta tag, and add a ‘noindex’ with JS, Google will use the ‘noindex’.
In short, adding a ‘nofollow’ via JS would work, removing it won’t.”
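The "most restrictive wins" rule can be sketched as a small function. This is an illustration of the behavior Mueller describes, not Google's actual code: directives seen at any stage (static HTML or after JS rendering) are pooled, and the restrictive value in each pair overrides the permissive one.

```javascript
// Sketch (not Google's implementation): combine robots directives seen in
// the static HTML with those seen after JS rendering, letting the most
// restrictive value win, as John Mueller describes.
function effectiveRobotsDirectives(staticHtmlDirectives, renderedDirectives) {
  // Google considers directives seen at any stage, so take the union.
  const seen = new Set(
    [...staticHtmlDirectives, ...renderedDirectives].map(d => d.trim().toLowerCase())
  );
  const result = [];
  // 'noindex' overrides 'index'; 'nofollow' overrides 'follow'.
  result.push(seen.has('noindex') ? 'noindex' : 'index');
  result.push(seen.has('nofollow') ? 'nofollow' : 'follow');
  return result.join(', ');
}
```

Under this model, a `nofollow` present in the static HTML still applies even if JS later removes the tag, while a `noindex` added only via JS still takes effect, which matches both halves of Mueller's answer.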
Can all pages of your website be indexed by Google?
If some pages of your website have the wrong settings, Google won’t be able to index them. Check the indexability of your web pages with the website audit tool in SEOprofiler: