Google’s John Mueller said on Twitter that you should not block Googlebot from crawling 404 pages on your website.
That sounds like a really bad idea which will cause all sorts of problems. You can't avoid that Googlebot & all other search engines will run into 404s. Crawling always includes URLs that were previously seen to be 404.
— John Mueller (@JohnMu) July 15, 2020
Billions of 404 pages are crawled every day – it's a normal part of the web, it's the proper way to signal that a URL doesn't exist. That's not something you need to, or can, suppress.
— John Mueller (@JohnMu) July 15, 2020
Instead of blocking Googlebot, redirect outdated URLs to the corresponding new pages on your website. The website audit tool in SEOprofiler shows you both the 404 pages and the redirects on your site.
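To illustrate the idea, here is a minimal audit sketch in Python. It is not SEOprofiler's implementation, and all paths and the test server are hypothetical: it requests each URL without following redirects, then reports whether the URL returns 404 (a candidate for a redirect) or already redirects to a new page.

```python
"""Minimal sketch of a 404/redirect check (illustration only, not SEOprofiler).

All routes below (/old-page, /new-page, /gone) are made-up examples served by
a local demo server, so the script is self-contained and runnable.
"""
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import threading
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Report 3xx responses instead of silently following them."""
    def redirect_request(self, *args, **kwargs):
        return None  # causes urllib to raise HTTPError with the 3xx status


def check(url: str):
    """HEAD-request a URL; return (status_code, redirect_target_or_None)."""
    req = urllib.request.Request(url, method="HEAD")
    opener = urllib.request.build_opener(NoRedirect())
    try:
        resp = opener.open(req, timeout=10)
        return resp.status, None
    except urllib.error.HTTPError as err:
        # 404s and un-followed redirects both arrive here as HTTPError.
        return err.code, err.headers.get("Location")


class DemoSite(BaseHTTPRequestHandler):
    """Hypothetical site: one live page, one redirected page, one 404."""
    ROUTES = {"/": 200, "/old-page": 301, "/gone": 404}

    def do_HEAD(self):
        status = self.ROUTES.get(self.path, 404)
        self.send_response(status)
        if status == 301:
            self.send_header("Location", "/new-page")
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass


server = ThreadingHTTPServer(("127.0.0.1", 0), DemoSite)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

results = {path: check(base + path) for path in ("/", "/old-page", "/gone")}
server.shutdown()

for path, (status, target) in results.items():
    note = f"redirects to {target}" if target else ("NOT FOUND" if status == 404 else "ok")
    print(f"{path}: {status} ({note})")
```

Note that the checker deliberately does not follow redirects: seeing the raw 301 and its `Location` header is exactly what an audit needs in order to distinguish an already-fixed URL from a plain 404.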