Google claims that it's looking out for search engine users' best interests by making all these changes to its search algorithms.
For instance, they say that Google Panda helps eliminate duplicate-content results, improving the user experience by reducing how often sites with duplicated content appear in top results and making it easier for the site of original publication to rank there.
However, if this is true, then there should be only ONE site that ranks for a given article. Yet I can still search for a random phrase and turn up EVERY site that has republished an article containing that phrase. Shouldn't those duplicate pages have been removed from the search results entirely, if the goal is to improve my search experience?
Also, Google Penguin is supposed to penalize spammy backlinks, so that bloggers and site owners who rely on natural SEO (such as earning backlinks organically from users) are more likely to rank in top results than sites running mass (blackhat) backlinking campaigns. But I've heard from some sites with quality content that they were penalized by Penguin even though they didn't consider their backlinks spammy.
Do you think all these updates help search engine users have a better search experience? Or, do they unjustifiably penalize sites?
Tommy Matalino