In a help thread in a webmaster forum, Google’s John Mueller said that you should avoid special characters in URLs:
“I generally recommend avoiding special characters like commas, semicolons, colons, spaces, quotes etc. in URLs, to help keep things simple.
URLs like that are often harder to automatically link (when someone posts in a forum or elsewhere), and hard for us to recognize correctly when we parse text content to try to find new URLs.
When they’re linked normally or submitted through a sitemap directly, they work as expected. However, when we try to recognize the URL in something that we crawl as an HTML or a text page, then we’ll probably ‘guess’ them wrong — which is fine, since we’ve probably already seen them through the normal links & sitemap usage.
In practice this doesn’t matter, finding links which don’t work is perfectly normal for us; it won’t break the crawling, indexing, or ranking of your site assuming we can crawl it otherwise. We’ll show these as 404s in Search Console because they return 404, but they’re not something critical that you need to suppress.
If you want to move to a cleaner URL structure that’s less-likely to be misinterpreted like that, you can use normal 301 redirects & rel=canonical elements on the page. It’ll generally take some time to crawl & reindex the URLs like that though, so you’ll continue to see these old URLs in Search Console in the meantime.”
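To see why characters like commas trip up automatic link detection, here is a minimal sketch in Python. The URL and the regex are illustrative assumptions, not Google’s actual parser: many autolinkers stop at punctuation such as commas or quotes, so part of the URL is silently cut off. Percent-encoding the path with `urllib.parse.quote` avoids the ambiguity.

```python
import re
from urllib.parse import quote

# Hypothetical URL containing special characters (semicolon, comma).
raw = "https://example.com/products;id=12,34"

# A naive autolinker regex that treats commas and quotes as URL
# terminators -- this mimics how forum software often "guesses" URLs.
naive_link = re.match(r"https?://[^\s,\"']+", raw).group(0)
print(naive_link)  # the ",34" tail is lost: https://example.com/products;id=12

# Percent-encoding the path removes the ambiguous characters entirely.
safe_path = quote("/products;id=12,34", safe="/")
print("https://example.com" + safe_path)
# https://example.com/products%3Bid%3D12%2C34
```

A crawler that later percent-decodes the path recovers the original value, while autolinkers no longer have a reason to truncate it.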
URLs with special characters can be misinterpreted by web crawlers. Several other technical factors can also influence the rankings of your web pages. Check your website with the website audit tool in SEOprofiler to make sure that everything is okay: