Negative SEO: 6 Little Things Hurting Your Rank You Probably Never Noticed

As a savvy online marketer, you know the importance of not only optimizing your pages for search engines but also staying on top of the latest search engine optimization (SEO) tactics and algorithms.

After all, increasing your traffic and brand awareness is impossible without understanding the ins and outs of SEO.

When the top 3 search results get over 50% of traffic and Google gets over 63,000 search queries per second, you want any advantage possible to help your content get a piece of the pie.

Even as you read this, thousands if not millions of people are looking for great content just like yours. So, why not help them find it by becoming even more of an SEO master?

Now, you know the best practices, but there are a lot of smaller, technical things that can play a big role in how your pages rank (and a lot of them will surprise you).

Don’t let these hidden SEO ranking factors negatively affect your pages any longer. Here are 6 little things hurting your SEO that you’ve probably never noticed:

1. Bad Blog Comments

Comment spam is something that every blog has to deal with.

Black hat SEOs use automated and semi-automated bots that can post thousands of spam comments across thousands of sites in a single day in an attempt to create backlinks and drive traffic to other sites.

Spam and other bad blog comments do more than just pollute your blog posts; they can also really hurt your site's ability to rank, even if the links in these comments aren't being followed.

Spam blog comments hurt your site because Google treats the sites you link out to as a strong signal of what kind of site you are and how qualified you are to receive traffic.

Birds of a feather flock together in the eyes of Google, and even if these links weren't intentionally put there by you, they're impacting your site authority.

A common black hat method of attacking a site is to link it to a bunch of spammy sites in an attempt to discredit a brand or its messaging.

Combining low-quality links with dangerous keywords that are often associated with spam will cause Google’s algorithms to consider your site suspicious.

So what can you do about it?

The first step is to moderate your comments.

Most platforms (like HubSpot) allow you to get comment notification emails, letting you approve or reject comments before they're published.

You can also use a strong spam filter. While they don’t catch everything, they’ll snag most obvious spam comments with inappropriate language.

Another option is having a formal comment policy on your site. Decide what you will and won’t allow in the discussion, and make people aware of it.

Remember: it’s your site and you can choose what is published.

There’s no obligation to publish every single comment that people submit, especially if comments use abusive or vulgar language not pertinent to the conversation.

Check out Copyblogger’s comment policy for a good example.

Last, but not least, talk to your developer about disabling links in blog comments. Even if the URL text remains, there are usually ways to keep visitors from actually clicking through to the other sites.
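If a developer isn't handy, even a rough filter helps. Here's a minimal Python sketch of the idea: strip anchor tags out of a submitted comment so spam links never render as clickable (the function name is just illustrative, and a production setup should use a proper HTML sanitizer instead of regexes):

```python
import re

def strip_comment_links(comment_html):
    """Replace <a> tags in a submitted comment with their inner text,
    so spam links never render as clickable."""
    no_anchors = re.sub(r"<a\b[^>]*>(.*?)</a>", r"\1", comment_html,
                        flags=re.IGNORECASE | re.DOTALL)
    # Drop stray href attributes left on any other tags, just in case
    return re.sub(r"\shref=\"[^\"]*\"", "", no_anchors, flags=re.IGNORECASE)

print(strip_comment_links('Great post! <a href="http://spam.example">cheap pills</a>'))
# -> Great post! cheap pills
```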

2. Poor Site Speed

Site speed is one of the biggest signals Google uses to rank pages, specifically the time to first byte (TTFB): the amount of time it takes for a visitor's browser to receive the first byte of information from your server.

Page speed not only affects your rank; it's also important to user experience. Pages with longer load times tend to have higher bounce rates, lower average time-on-page, and worse conversion rates.

A slow page speed also means search engines can’t crawl as many pages using their allocated crawl budget. In turn, your site index might not be complete.

How do you know your load time?

One of the ways we test pages is with WebPageTest.org.

This tool breaks down your page load times, giving you your time to first byte and total load time, as well as some valuable insights into what is slowing you down.

It also generates a waterfall chart for every file on your page, letting you easily identify which specific items are increasing your load times.

Most of the time, some very simple changes can significantly decrease your time to first byte, which you should aim to keep under 0.5 seconds.
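If you just want a quick number before diving into WebPageTest, here's a small Python sketch that approximates time to first byte using the requests library (its `elapsed` timer stops once response headers arrive, so treat it as a rough check, not a full waterfall):

```python
import requests  # pip install requests

def time_to_first_byte(url):
    """Approximate TTFB: `elapsed` runs from sending the request
    until the response headers are parsed."""
    response = requests.get(url, stream=True, timeout=10)
    return response.elapsed.total_seconds()

ttfb = time_to_first_byte("https://www.example.com/")
print(f"TTFB: {ttfb:.3f}s", "(OK)" if ttfb < 0.5 else "(too slow)")
```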

Here are some of the most common things slowing your site down:

  • Your host: Long story short, you get what you pay for. Cheap hosting offers are attractive but can slow your pages down.
  • Your images are too big: Pretty self-explanatory, but shoot to keep your images under 100kB and make sure they are progressive in order to keep your load times down.
  • Externally embedded video: Video is essential nowadays for any marketing strategy, but embedding videos hosted on sites like Vimeo or YouTube can slow you down. Sometimes this is necessary, but if you can, host them on your own server.
  • Inefficient code: If your HTML/CSS is not efficient or too dense, it will lower your page speed.

How do you fix these and other site page issues?

Compress Everything

Start off with your images.

Optimize your large images in Photoshop or online programs like Optimizilla. Be careful using online apps, though: while they're definitely faster than manually optimizing images one by one, they can sometimes sacrifice a little too much quality.
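If you'd rather script it, here's a minimal sketch using the Pillow imaging library that downscales an image and re-encodes it as a progressive JPEG, aiming for that 100 kB budget (the file names, width, and quality setting are just examples to tune per image):

```python
from pathlib import Path
from PIL import Image  # pip install Pillow

def compress_image(src, dest, max_width=1200, quality=75):
    """Downscale an image and save it as a progressive JPEG."""
    img = Image.open(src).convert("RGB")
    if img.width > max_width:
        img = img.resize((max_width, int(img.height * max_width / img.width)))
    img.save(dest, "JPEG", quality=quality, optimize=True, progressive=True)
    print(f"{dest}: {Path(dest).stat().st_size / 1024:.0f} kB")

compress_image("hero-original.png", "hero-optimized.jpg")
```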

Now compress your scripts.

There are many programs out there that can reduce the size of your CSS, HTML, and JavaScript files. By optimizing your code (including removing spaces, commas, and other unnecessary characters), you can dramatically improve your page speed.

Google recommends using HTMLMinifier to minify HTML, CSSNano and csso to minify CSS, and UglifyJS or Closure Compiler to minify JavaScript.
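Those are all Node-based tools; if your build pipeline is Python-based, a library like htmlmin can do the same job for HTML. A quick sketch:

```python
import htmlmin  # pip install htmlmin

raw_html = """
<html>
  <body>
    <!-- hero section -->
    <h1>  Welcome  </h1>
  </body>
</html>
"""

# Strip comments and collapse whitespace, same idea as HTMLMinifier
print(htmlmin.minify(raw_html, remove_comments=True, remove_empty_space=True))
```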

Reduce Redirects

Whenever a page redirects to another page, it slows down your load time.

For example, if you migrated your blog and just added SSL, your visitor could click a link sending them to “http://www.company.com/blog,” which redirects to “https://www.company.com/blog,” which finally redirects to the target page, “https://blog.company.com.”

Each extra redirect here adds wait time before the page loads.
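Not sure how many hops your URLs take? Here's a small Python sketch that prints the full redirect chain for a URL using the requests library (the URL below is the example from above):

```python
import requests

def show_redirect_chain(url):
    """Print every hop a URL takes before the final page loads;
    each entry in `response.history` is an extra round trip."""
    response = requests.get(url, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(response.status_code, response.url, "(final)")

show_redirect_chain("http://www.company.com/blog")
```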

Don’t Let Caches Expire

When visitors land on your site, their browser caches a lot of information so when they return, the browser doesn’t have to reload the whole page — this is good.

Adding an expiration tag to the head of your site (or an Expires/Cache-Control header to your server responses) lets you set how long before the visitor's browser re-downloads the page. In most cases, setting the expiration to one year is appropriate for pages you aren't updating often.

There is a caveat to this, however.

If you have certain pages that you update often, an expiration tag can leave visitors seeing the old version until the cache updates. Check out this article for more on using expiration tags.
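What this looks like in practice depends on your server, and most hosts let you set it in a config file, but here's a minimal sketch (assuming a Flask app, purely for illustration) that serves an asset with a one-year Cache-Control header:

```python
from pathlib import Path
from flask import Flask, make_response  # pip install flask

app = Flask(__name__)

@app.route("/logo.png")
def logo():
    response = make_response(Path("logo.png").read_bytes())
    response.headers["Content-Type"] = "image/png"
    # Cache for one year (in seconds); good for assets you rarely change
    response.headers["Cache-Control"] = "public, max-age=31536000"
    return response
```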

3. Broken Links & Images

When visitors follow links to or from your site, they are expecting certain content to be available to them once they get to that page.

Broken links are not only bad for user experience but can also hurt your search rankings.

Reactive Solution:

Regularly auditing your outbound and inbound links can ensure you aren’t pointing people (or Google’s page crawlers) to broken pages.

To check both internal and external links, try using the Check My Links Chrome extension.

This tool works well for small sites, but checking larger sites page by page can get tedious.

For larger sites, Ahrefs is a better option, as are paid tools like Screaming Frog, which can identify broken images at the same time as it finds 404 errors on your site.
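If you'd rather roll your own quick check, here's a minimal Python sketch using requests and BeautifulSoup that fetches a page and reports links returning an error (fine for a handful of pages; use the tools above at scale):

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4
from urllib.parse import urljoin

def find_broken_links(page_url):
    """Fetch a page, collect its links, and report any that error out."""
    html = requests.get(page_url, timeout=10).text
    broken = []
    for anchor in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, and same-page anchors
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
            if status >= 400:
                broken.append((link, status))
        except requests.RequestException:
            broken.append((link, "unreachable"))
    return broken

for link, status in find_broken_links("https://www.example.com/blog"):
    print(status, link)
```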

Proactive Solution (no, not the skincare stuff):

While you don’t have much control over other sites moving around their content, there are a few things you can do to prevent creating these broken links on your site.

The first thing is to update rather than remove content.

Publish updates at the same URLs instead of moving pages to new ones. This way you don't have to delete content that other pages are linking to.

You can also set up 301 redirects to a new page if you need to update URL structure. If the content still exists on your site but has simply changed location, setting up a redirect will ensure that any previously existing links don’t send users to a 404 page.
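How you add a 301 depends on your platform, and most CMSs let you do it without code, but as a sketch of the idea, here's what one looks like in a Flask app (the paths are hypothetical):

```python
from flask import Flask, redirect

app = Flask(__name__)

# The old URL still gets traffic and backlinks, so point it at the new home
@app.route("/old-blog-post")
def old_blog_post():
    return redirect("/blog/new-blog-post", code=301)  # 301 = moved permanently
```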

Don’t add these redirects needlessly though! As mentioned earlier, too many redirects can turn into site speed issues. Check out this Moz article for some more redirect best practices.

4. Not Having (or Forcing) SSL

Users want to be safe when researching online, and Google prefers sending users to trusted sites. Switching to HTTPS will therefore help build credibility with users and, in turn, improve your rankings.

We have seen the impact of Google’s HTTPS recommendation increase, and this trend will only continue.

An SSL certificate authenticates the website and server and encrypts data like browsing history and credit card information of your site visitors.

To encrypt your site with SSL, your first step is to get a dedicated IP address and buy an SSL certificate.

Once activated, updating your site to force HTTPS will let you secure your users' information. This article spells out how to set up your SSL, but if your site is on HubSpot, good news: HubSpot includes an SSL certificate with all of its subscriptions.
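Most hosts and web servers have a switch for forcing HTTPS, but to make the idea concrete, here's a minimal Flask sketch (an assumption for illustration, not HubSpot's mechanism) that bounces any plain-HTTP request to its HTTPS equivalent:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Behind a load balancer, check X-Forwarded-Proto instead
    if request.url.startswith("http://"):
        return redirect(request.url.replace("http://", "https://", 1), code=301)
```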

5. Poor/Duplicate Meta Information

The title tag is one of the most important on-page elements for SEO and optimizing yours is one of the easiest ways to give yourself an SEO boost.

Having generic titles like “Home,” “Home Page,” or “Brand Name” (if you're not already a household name) is a wasted ranking opportunity. Get more specific by incorporating your target keyword in your title.

Similarly, don’t duplicate your title tags or other metadata, either.

Having the same title tags (e.g., if every one of your pages said INBOUND MARKETING AGENCY) can confuse users if they see more than one of your pages on a SERP. You could also end up competing with your own pages for a better rank. The same goes for duplicate meta descriptions and headers.

Inconsistencies and Discrepancies

It’s also important to avoid inconsistencies in your meta information.

How a page is displayed on Google is the first interaction a reader has with your content, so making sure these show correctly on SERPs is important.

For example, if you update an article with 2017 in the title to 2018, make sure you do the same in the meta description.

Seeing mismatched or dated information in these areas can cause confusion and hurt your credibility. If people don’t think you’re credible, they’re not going to click through on your listing. 

This is especially true with your blog.

Readers are drawn to articles with a date or year but are also more reluctant to trust articles that were published a year ago.

While these discrepancies don't affect SEO directly, they do affect click-through rate and the time people spend on your pages, which can give you an SEO boost.

Once again, Screaming Frog can be used here to evaluate and identify duplicate title tags, meta descriptions, etc.
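For a quick scripted audit, here's a small Python sketch that pulls the <title> from a list of your URLs and flags any duplicates (swap in the URLs from your own sitemap):

```python
from collections import Counter
import requests
from bs4 import BeautifulSoup

def find_duplicate_titles(urls):
    """Collect <title> tags across a set of pages and flag repeats."""
    titles = []
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else "(missing)"
        titles.append(title)
    return {t: n for t, n in Counter(titles).items() if n > 1}

pages = ["https://www.example.com/", "https://www.example.com/about"]
print(find_duplicate_titles(pages))  # e.g. {'INBOUND MARKETING AGENCY': 2}
```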

6. Bad Spelling / Grammar

Bad spelling also correlates with lower page rank.

According to search engine expert Matt Cutts, reputable sites with higher PageRank tend to spell better than those with low PageRank.

Just as you struggle to get past typos when you are reading, search engines and your visitors do too.

If Google can answer the questions users are asking by sending them to a page that is enjoyable to read, then it will. Google is just like any other publisher, so the higher quality content it shows, the better.

For checking your site now, TypoSaurus is a great online tool that comes with a Chrome extension to help out as you publish new pages. Grammarly is another Chrome extension (my personal favorite) that catches errors as you type on any page.
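You can also script a rough first pass. Here's a small sketch using the pyspellchecker library that flags words the dictionary doesn't recognize; expect false positives on brand names and jargon, so treat the output as hints, not verdicts:

```python
import re
from spellchecker import SpellChecker  # pip install pyspellchecker

def find_typos(text):
    """Return words the English dictionary doesn't recognize."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    return SpellChecker().unknown(words)

print(find_typos("Our marketting team optimizes your conversion funnel."))
# -> {'marketting'}
```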

Takeaways

The biggest thing to remember is that search engine optimization is never truly complete.

As you shape your SEO strategy this year, give yourself an edge by addressing these simple mistakes and paying close attention to technical SEO.

As search engines become more focused on user experience, these elements are becoming just as important as your keywords.

What other little things have you seen affecting your search ranks? What are your theories? Let me know in the comments!
