Magento 2.0 is a promising platform thanks to its improvements in performance and scalability, easier installation, higher code quality, and simplified integration. We took the time to examine common SEO mistakes in new Magento 2.0 stores; this blog post can serve as an SEO checklist before you launch your own Magento 2.0 store.
Rel canonical was introduced by Google to solve the complex problem of duplicate and near-duplicate content. Layered navigation filters change the content of the URL, and rel canonical isn't meant to be used in this manner. We can't have all those layered URLs in the index, as they can create duplicate content issues, but rel canonical is not the right tool to solve this problem.
Instead, you can simply place meta noindex, follow on those layered URLs. This gets them out of the index while still allowing link juice to flow through them and the product listings to other pages that you need to rank.
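In practice this means each filtered URL serves a robots meta tag in its `<head>`. A minimal sketch (the example URL and filter parameters are hypothetical):

```html
<!-- Served on layered-navigation URLs such as /shoes.html?color=red&price=50-100:
     drop the page from the index, but let crawlers follow its links so
     link juice still flows through to the product pages -->
<meta name="robots" content="noindex, follow" />
```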
Before discussing this case, it is important to know that Google considers HTTPS a ranking factor.
Thus, the HTTPS version is the preferred one to index, and the HTTP version of the same URLs should have a rel canonical pointing to the HTTPS version. Conversely, if HTTP is the preferred version, then the HTTPS version should point its canonical to it.
Having both HTTP and HTTPS versions of the same URL indexed is a textbook example of duplicate content, and exactly where rel canonical should be used to consolidate the duplicates into the preferred version.
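For example, assuming a hypothetical example.com store where HTTPS is the preferred version, the HTTP page would carry a canonical like this:

```html
<!-- In the <head> of http://example.com/some-product.html -->
<link rel="canonical" href="https://example.com/some-product.html" />
```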
Let's have a look at something worth examining and fixing: microdata markup. Microdata markup helps Google and other major search engines understand the content of your pages: your prices and special prices, your reviews, and so on.
It is not merely a feature that helps you get a better CTR; it is also essential for automatic product updates in Google Merchant Center.
Since we're missing the availability markup, SERP results won't feature our products as "In stock", nor will automatic availability product updates in Google Merchant Center work. Thus, this is an important aspect for both SEO and PPC.
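As a sketch, schema.org microdata for a product's price and availability could look like this (the product name, price, and currency below are placeholders):

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <meta itemprop="price" content="49.99" />
    <meta itemprop="priceCurrency" content="USD" />
    <!-- The availability property that enables "In stock" in SERPs and
         automatic availability updates in Google Merchant Center -->
    <link itemprop="availability" href="https://schema.org/InStock" />
  </div>
</div>
```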
Under the Google Panda algorithm, websites that allow indexation of large amounts of their internal site search results can be penalized. It's not that Google forbids you from having site search; it simply doesn't want those search result pages in its index.
Disallowing layered-navigation parameters through the robots.txt file is not recommended, but you can and should disallow your site search results, i.e. the /catalogsearch/ URL path. Many Magento 2.0 websites forget to block their site search results in robots.txt, and on top of that they link to those search results from the homepage.
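A minimal robots.txt entry for this (assuming the default Magento /catalogsearch/ path):

```text
# Block internal site search results from being crawled
User-agent: *
Disallow: /catalogsearch/
```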
For example, they link the homepage logos of various brands to the site search with that brand as a query parameter.
This is most probably done because of the lack of a "shop by brand" extension for Magento 2.0 at the time the website in question was developed. With Visual Merchandiser, you can fill an entire category automatically with products that match a certain value of your manufacturer attribute.
The homepage is generally the strongest page in terms of link equity. It is also the page that can rank for most of your important keywords. For this reason, it should not be titled "Home page", as that doesn't explain what your website is about.
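For instance (the store name and wording below are made up, purely for illustration):

```html
<!-- Vague: says nothing about the store -->
<title>Home page</title>

<!-- Better: describes what the site is about -->
<title>Acme Gear – Hiking Boots, Tents &amp; Camping Equipment</title>
```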
You should never let Google index your layered navigation, filtering, and sorting parameters on category pages, but blocking them with a robots.txt disallow is not a good way of handling this thin content issue: disallowed URLs can still be indexed even though they can't be crawled.
The robots.txt disallow approach also wastes a lot of link juice from internal navigation. Instead of disallowing those parameters through the robots.txt file, it is advisable to use meta noindex, follow on the URLs with those parameters.
If meta noindex, follow and a robots.txt disallow are applied simultaneously, Google won't see the meta noindex, follow on your URL, because the robots.txt disallow prevents it from being crawled.
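To illustrate the anti-pattern (the ?color= filter parameter is a hypothetical example):

```text
# robots.txt anti-pattern: this Disallow stops Googlebot from ever fetching
# the filtered pages, so the meta noindex, follow in their HTML is never seen
User-agent: *
Disallow: /*?color=
```

Pick one mechanism: if you rely on meta noindex, follow, your robots.txt must leave those URLs crawlable.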
With this list of common Magento 2 SEO mistakes, you can audit your own Magento 2 store. Remember that an online store is a dynamic system with many things you can't anticipate. Everything may be fine one day, and the next day Google Search Console shows strange 404 errors. Thus, you should keep tracking your site's optimization and watch for these common Magento 2 digital marketing mistakes, so you notice all important changes and solve issues quickly.
Minal Joshi is a content marketer at Krish with a flair for eCommerce and digital commerce. A MarTech enthusiast with a knack for writing, she helps brands curate, create, and launch their digital brand positioning. Sharing insights via articles, case studies, eBooks, infographics, and other forms of content is what she lives for. An ardent traveler, when not writing you'll find her sipping coffee in the mountains or petting a stray.