Tag Archives: search rankings

SEO Techniques for 2016

Your brand’s most valuable tool is its website. Your website is what people will read when they are researching your company and what you do. It is what they will turn to when they want to get in touch with you. It’s what will entice new traffic to buy and existing customers and clients to stick around. And, because of the way the Web works in the second decade of the new millennium, all of that is dependent upon Search Engine Optimization (SEO).

SEO is key to how people find your website now. Many new website developers and builders believe that SEO is just about keywords. The truth is that there is a lot more to SEO than simply using a few choice words or phrases a few times in your content and tags. Here are some of the best things you can do to ensure your website’s SEO is on the right track for your audience in 2016.

Security

More and more companies are finding themselves the victims of hackers and data thieves. It is vitally important that you protect the people who visit your site and submit information through it. While your Web hosting provider will likely offer some security on their end, you also need to secure your own servers and the work you store on them.

According to Firewall Technical, a company that provides IT support in Ottawa, “A server acts not only as a space for business documents, but it can also control how documents are retrieved. In addition, servers protect the entire network. For this reason, it is critical that your server is operating smoothly.” Make sure that your local and cloud servers are as secure as possible. The last thing you want is someone stealing customer information from your own databases!

Search engines will even put up scary warning messages when users follow a link to a website that has been hacked (if the search engine catches the hack). That will drop your search visitors to almost zero, and they will be scared to come back even after you fix your server. No amount of SEO mastery will help you rebuild after a data breach that endangers your customers’ private information!

Links

According to a Moz report from last year, domain-level linking is the most important component of SEO success. A domain-level link is one that points to the site as a whole rather than to a specific page or piece of content within it.

A few years ago, this was easy enough to achieve. Businesses would hire content writers to churn out dozens of pieces of slapdash content and wallpaper the Internet with them (you remember E-zines, right?) to build up their inbound link counts. Starting around 2013, however, Google and the other search engines got hip to this gaming of their system and started reworking their algorithms. It was a move similar to what they did when keyword stuffing started to dilute their search results.

Today the links that point to your site need to be relevant and from quality sites. Setting up link wheels is a punishable offense and publishing the same piece of content in multiple places is no longer allowed (in fact, it can get your site delisted entirely). This means that you need to be proactive about giving interviews and guest posting on relevant and high-ranking sites both within and outside of your niche.

A great way to help this along is to sign up for services such as Help a Reporter Out. You can sign up as a “source” and then respond to the queries that fall within your expertise. If the reporter asking for help/information chooses your response, you might score an interview or a quote, both of which come with links back to your site. You can also sign up for contributor accounts on sites like The Huffington Post.

Keywords

We’ve talked a lot about how keywords aren’t as important as links and security. This doesn’t mean, however, that your keywords are irrelevant in the 2016 SEO game. Keywords and phrases are still important. They are still often the deciding factor in how your site is listed within a searcher’s results. The difference is that now the keywords you choose matter more than ever.

The keywords you choose must flow naturally within your content. This means that when doing your research, you want to focus on keywords and phrases that read naturally rather than words strung together haphazardly. You also need to be aware of how often you use them. While there is no definitive ideal percentage, you typically want to limit your keywords to one to two percent of the content.
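If you want a rough check of that guideline, you can simply count how often a phrase appears relative to the total word count of a draft. The short sketch below is only an illustration (the sample text and target phrase are made up for the example); real SEO tools do this for you.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return occurrences of `phrase` as a percentage of total words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    target = phrase.lower().split()
    hits = sum(
        1
        for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return 100.0 * hits / len(words)

draft = ("Our Ottawa IT support team keeps servers secure and monitored, "
         "so small businesses can focus on their customers instead of IT.")
print(f"{keyword_density(draft, 'IT support'):.1f}%")  # prints 4.8%; compare against the 1-2% guideline
```

A number well above the one-to-two-percent range is a sign the phrase is being forced into the copy rather than flowing naturally.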

Security, links and keywords are the most important parts of your SEO campaign. Make sure you get these right and the rest should fall into place!

Related: Keeping Up with SEO Changes

Don’t Hide Important Content Using Coding Gimmicks

My comment on: Does Hidden/Tabbed Content Still Get Picked Up By Google?

I would say that hidden tab stuff is bad Ux (most of the time). I couldn’t figure out what the heck was going on when I read this post, and it seemed to end without having addressed the issue sensibly. Oh, the content is hidden up above, I finally figured out. I think Google does exactly the right thing in making the hidden content a very low ranking factor for the page, because as a user it is hidden and not the focus of the page.

The conclusion of the original post is that hidden text is given very low weight by Google in search results. If you go to that page, note that you can’t see most of the content for that “page”; you have to find the links to unhide the content.

The hidden text being discussed here is content you hide that only becomes visible once the user clicks something (and instead of going to a page that highlights that content, some content is unhidden and other content is added to the hidden content on the page). It is just a bad Ux practice in general (as with many Ux issues, there are odd cases where it can make sense).

Related: Getting Around Bad Web Page Layouts – Poor Web Site User Experience (Ux) on Financial Sites – The Most Important Search Engine Ranking Factors – Don’t Use Short URL Services

Most Important Search Engine Ranking Factors

Moz published their annual Search Engine Ranking Factors, based on a survey of SEO experts. The experts’ opinions of the most important factors in 2015 are:

  1. Domain-level linking (8.2 out of 10) – quality and quantity of links to the entire domain
  2. Page-level linking (7.9) – quality and quantity of links to the page, anchor text
  3. Page-level keywords and content (7.9) – content relevance to the search term, content quality ranking factors, topic modeling factors
  4. Page-level keyword-agnostic measures (6.6) – readability, content length, uniqueness, load speed, markup, https, etc.
  5. Engagement data (6.6) – based on SERP clickstream data, visitor traffic and usage signals, on both the page and domain level

This reinforces the importance of links and also shows how rankings have evolved to include many other factors as significant determinants of search results.

Related: Keeping Up with SEO Changes – Site Spam Flags Score from Moz – Why Don’t Search Results Screen Better by Date?

New Site Spam Flags Score from Moz

Moz continues to provide interesting tools and site measures. I only follow this because I find it interesting (not as a profession). I am not an SEO person, and the $100 a month (or much more) they charge for their tools isn’t worth it for my curiosity. But they make some things available for free and publish some interesting blog posts on what they find and about their tools.

This new Spam Score analysis by Moz seems very interesting: Spam Score: Moz’s New Metric to Measure Penalization Risk. The idea is sensible: they try to determine the spam riskiness of a site based on the correlations they can draw between their web crawl data and Google search results. Moz can see where sites are not ranking well even though many factors indicate they should, and then conclude that Google has penalized certain sites (and has not given credit to sites with links from those sites, or worse, has penalized them).

This seems like a really good idea. They found 17 flags that are correlated with spam hits to a site, and as sites trip more and more of those flags, the likelihood of Google classifying them as spam rises. When a site has 0 spam flags, Moz calculates a 0.5% chance that the site shows up in Google search results (or, more likely, fails to show up) in a way that indicates Google sees it as spam. 4 spam flags equals a 7.5% chance of being a “spam site.” A site with 6 spam flags has a 16% chance of being spam, 7 flags means a 31% chance, 8 is a 57% chance, 9 a 72% chance and 14 a 100% chance.

[Screen shot of Moz’s Spam Flags report.]

In their post, Moz says that tripped spam flags are not meant to be an indication of something that needs to be fixed (after all, the flags are just correlation, not causation; “fixing them” may do nothing for search results). That may be true, but if a site is showing a scary yellow 5-out-of-17 for spamminess, it is highly likely that lots of people are going to want to reduce that feedback about their site.

What is likely to happen is that sites will change to avoid the flags: adding Twitter buttons and making whatever tweaks get rid of several more flags.
My guess is that a better spamminess rating wouldn’t just be x out of 17, but would factor in how many of the 17 flags were tripped plus an understanding of how important each one was (I would imagine including which interactions of spam flags are more critical).

I would be surprised if there isn’t a big difference between a certain 3 flags being tripped and 3 other flags being tripped (plus, say, 4 other random flags), even given Moz’s limited ability to know what Google is directly reacting to versus the correlations they can observe. I would imagine this could be improved into a 100-point (or whatever) system that gave much more valuable spam-site insight than just treating each flag as equally important (and ignoring especially deadly interactions between flags: which flags, when they are tripped together, cause the likely spam hit to be seen in Google results).
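To make that idea concrete, here is a minimal sketch of what such a weighted score could look like. The flag names, weights, and interaction penalties below are entirely made up for illustration; Moz has not published anything like this. The point is only to show a rating that treats some flags, and some combinations of flags, as worse than others rather than reporting a flat x out of 17.

```python
# Hypothetical weighted spam score: flag names, weights and interaction
# penalties are invented for illustration, not Moz's actual model.
FLAG_WEIGHTS = {
    "low_mozrank": 4,
    "large_site_few_links": 6,
    "thin_content": 8,
    "no_contact_info": 3,
    "external_links_in_nav": 5,
    "high_ad_ratio": 9,
    "no_https": 2,
}

# Pairs of flags assumed (for this sketch) to be far more damaging together.
INTERACTION_PENALTIES = {
    frozenset({"thin_content", "high_ad_ratio"}): 15,
    frozenset({"large_site_few_links", "external_links_in_nav"}): 10,
}

def spam_score(tripped_flags: set[str]) -> int:
    """Score a site 0-100: weighted flags plus bonuses for bad combinations."""
    score = sum(FLAG_WEIGHTS.get(flag, 0) for flag in tripped_flags)
    for pair, penalty in INTERACTION_PENALTIES.items():
        if pair <= tripped_flags:  # both flags in the pair were tripped
            score += penalty
    return min(score, 100)

print(spam_score({"thin_content", "high_ad_ratio", "no_https"}))   # 34
print(spam_score({"no_https", "no_contact_info", "low_mozrank"}))  # 9
```

Two sites with three flags each can end up with very different scores, which is exactly the distinction a flat flag count hides.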


Keeping Up with SEO Changes

Response to Heresy: Google and SEO Don’t Actually Change that Much by Rand Fishkin

If, in 2004, you balanced SEO best practices with usability/user experience needs and created a site with content that was optimal for people and engines, you probably had an SEO strategy that would still work today. You might need to make updates, especially with regards to your tactics for earning awareness (taking into account the rise of social networking), for optimizing to multiple devices (instead of just multiple browsers), and to keep your UI/UX enjoyable for modern users, but strategic-level SEO wouldn’t need nearly as much of a shift as these others.

I agree. While I am not an SEO person (I just read about it a bit when I have time because it is an interesting area) what you say is how it seems to me. Yes there are tricks that SEO people learn and can use to do well (I guess based on what I have read over the years).

But the core focus is the same, with just different tricks being valuable at different times. It does seem like these tricks (which are mainly about exploiting weaknesses in the algorithms) are worth lots of time and energy in the right situation, to the right people.

For most people I don’t think it is that important. The principles that matter stay pretty consistent over the long term; it is only trying to game the algorithm that is challenging. If it weren’t challenging, the people now making their money doing it would have a much more difficult time, because many other people would be doing it. The challenge of staying ahead of Google’s ability to eliminate gaming of the results is why those who do it well are rewarded; if it were easier, the rewards would be smaller.

If you just do the basic stuff that doesn’t change, there is little to keep up with: I barely make any changes over the years. The only change I can remember in the last few years was adding Google Authorship, which has now become essentially worthless, so even if I hadn’t done that it wouldn’t matter.

My basic guide is this: Google wants to provide the most relevant content. It does this with significant focus on others’ opinions of your content. Google has to make judgements about those opinions and doesn’t do so perfectly. So getting links is important, as a link is both an indication that others value your content (maybe) and something Google can measure.

Now, I don’t think Google is great at determining whether links actually endorse what they link to or are saying, “look at this totally idiotic article.” I imagine Google and other search engines will get much better at this.

I do think Google is absolutely foolish to ignore data from nofollow sources (since nofollow is more about companies choosing to game Google by nofollowing everything that doesn’t directly benefit their own company than about whether the link has value). Links from important people on Twitter are an extremely valuable indication that Google just wastes. Google, or some other search engine, will do this much better sometime (I would have thought that time would have come years ago).

But essentially, you need to create useful content and then get people to acknowledge that it is great. All the games of trying to take advantage of weaknesses in Google’s algorithm are mainly noise, which, yes, professional SEO people need to pay attention to. But the extent to which that works is mainly due to failures in Google’s algorithm, so Google will constantly be trying to fix it. Google doesn’t want to surface the sites that do SEO best; it wants to surface the sites that are relevant to the person searching.

Related: Why Can’t Search Results Screen Better by Date? – Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely – Google and Links

Google Still Providing Users Bad Search Results: Basing Results Not on Value to Users but on How Completely Sites Do What Google Tells Them to Do

Comments on: On MetaFilter Being Penalized By Google (Sadly their comment system is too buggy and kept failing, so I couldn’t post this there).

It is hard to be Google. But they have billions of dollars (tens of billions, I believe now) in profit from search every year which provides resources that dwarf those of all but a handful of companies.

When Google takes action against users (providing bad search results not because the site content isn’t valuable but because the site has some practice Google doesn’t like) or tells websites to change other websites (get rid of links we at Google don’t like you having from some other site), that is extremely bad behavior, even if the task is difficult. And given billions of dollars to do it right, I don’t agree it is anywhere near acceptable.

That Google still believes in giving search users bad results, because that is the best way Google can figure out to punish sites doing something Google doesn’t like, is super lame. If the site content isn’t useful to search users, Google shouldn’t rank it highly. If it is, Google should rank it highly. Basing what I see in search results not on what is useful to me, but on which sites had related content and avoided practices Google dislikes, is lame.

Sadly, without better competition Google can keep up this lazy behavior. Once one or more of DuckDuckGo, Bing, Yandex, etc. do a good enough job to pull away users, Google will stop basing results on whether sites behave how Google wants and instead base them on value to the user.

There is an easy way to see whether Google’s behavior is user driven or the result of lazy behavior by an incumbent without realistic competition. If the “bad practice” Google is trying to correct provides bad content to users, it is user driven. If the “bad practice” Google is trying to correct is about doing things Google doesn’t like, it is lazy behavior Google engages in because providing bad results won’t cost them, and doing so lets them threaten sites into compliance with Google’s desires.

Things like nofollow and removing links from sites Google doesn’t like have no user value. They are aimed at forcing sites to behave as Google wishes. Google can claim (often it stretches credibility, but in some cases with justification) that the trustworthiness of a site is degraded by certain practices and therefore the likely benefit to users is less. So if a site links to lots of lousy sites (per the lots of data Google has to value sites) and fits certain patterns that algorithms should be able to measure, it is reasonable to penalize that site.

Treating Twitter results differently because they use nofollow links rather than followed links has no value to the user. If Google can’t do what most somewhat sensible 10-year-olds can do and tell the difference between @timberners_lee, @neiltyson and @Atul_Gawande and spam Twitter accounts, then threatening Twitter into declaring that it doesn’t believe any of its users are trustworthy (by using the “nofollow” tag) is all Google can do. And those threats will likely keep working fairly well, with just the grumbling we have all seen for the last 5 years. If a reasonable competitor emerges, such tactics won’t work: Google will have to provide the best results for users and stop penalizing Google search users in order to keep threatening websites into compliance with its wishes (and removing good content from users’ view when sites don’t comply).

Related: Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely – Why Can’t Search Results Screen Better by Date?

Are the Values of Links Decreasing?

This interesting article shares 13 SEO experts’ opinions on questions such as:

Do you see the link losing value over time?
How – if at all – should SEO’s change their content marketing and link building strategies in the coming years, given inevitable changes to the PageRank algorithm?
If Google search did work without links, what metrics would replace it?

Some interesting quotes:

The Future of PageRank: 13 Experts on the Dwindling Value of the Link

Michelle Robbins: “Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.”

Julie Joyce: “I think that links could lose some value but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web I cannot imagine a successful search engine that doesn’t take their importance into effect.”

Rand Fishkin: “the link has been losing value for almost a decade. That said, I don’t think that in the next decade, we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.”

Glenn Gabe: “AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.”

One thing I find funny about the article is that it talks about “social factors” (social websites) as if they were not links. Google currently claims to ignore social links that use nofollow (and Google’s policies have driven companies, scared of being punished, to use nofollow). But Google using “social factors” is just Google using links it has been failing to use for the last few years. I agree Google is foolish to be ignoring those signals today, so I agree Google should use more links (social and likely non-social) to provide better results.

I suppose one “social factor” people could mean is the number of likes something gets. I think that would be a lame measure to give any value to. The use of +1 data in Google+ may have some tiny benefit, since there Google can know who gave which pluses (and devalue junk profiles, and give extra value to authorities plusing related, good content [using corroboration with other sources Google has to gauge quality]).

I have long thought that traffic and interaction with the site (comments, time on site, etc.) are useful measures for Google. Google has various ways of getting this data. I am not sure to what extent they use it now. Not that it would be a huge factor, just another factor to throw in with the huge number they use already.

I agree trusted authorship is likely to play an increasingly important role. Partially this seems like an extension of PageRank to me (basically passing value to the link based on the link to the author; the authorship system right now is all link based, since there has to be a link on the page tying the content to the author’s Google+ page).

Related: Very Simple Process to Claim Authorship of Pages with Google – Why Can’t Search Results Screen Better by Date? – Surprise Google Public PageRank Update – Use Multiple PageRank Site to Find PageRank of https Pages

Let me Filter Out Websites Using Popups from Search Results

I wish Google, DuckDuckGo or some decent search engine would let me filter websites using popups out of search results (as I have written before: Remove Popup Ad Sites From Search Results). By popups I mean all the coding used to create popup interactions; lately many sites have coded popups in a way that does not obey the preferences in my browser.

Many websites have simply found a way to get around my preferences, using code to create popups that circumvents my expressed desires for my browser’s behavior. Actively coding around my wishes is behavior that is obviously hostile to users, and I don’t really want to see sites that are hostile to users in my search results.

I would also want a way to whitelist sites that have convinced me they are worth using even though they intentionally code popups to appear when I have expressly indicated I don’t want them.
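The mechanics of such a filter are trivial, which is part of what makes the missing feature frustrating. Here is a minimal sketch of the blocklist-plus-whitelist idea; the domain names and the result format are invented for the example, since no search engine actually exposes this.

```python
from urllib.parse import urlparse

# Hypothetical personal lists; the domains are placeholders, not real sites.
POPUP_BLOCKLIST = {"popup-heavy-news.example", "worth-it-anyway.example"}
WHITELIST = {"worth-it-anyway.example"}  # uses popups, but I want it anyway

def filter_results(results):
    """Drop results from popup-using domains unless explicitly whitelisted."""
    kept = []
    for result in results:
        domain = urlparse(result["url"]).hostname or ""
        if domain in POPUP_BLOCKLIST and domain not in WHITELIST:
            continue
        kept.append(result)
    return kept

results = [
    {"title": "Useful article", "url": "https://worth-it-anyway.example/post"},
    {"title": "Popup wall", "url": "https://popup-heavy-news.example/story"},
]
print([r["title"] for r in filter_results(results)])  # ['Useful article']
```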

Ideally a search engine would downgrade results based on the level of customer hostility practiced (kind of like Hipmunk’s agony ranking of plane flight options). It seems obvious to me this is a desirable state, but I imagine it will likely take years for search engines to get there, so let me block bad sites for now. Google took 6 years to let me block bad sites, but like many of the adjustments since Larry Page became CEO, this advance seems to have been rolled back (or possibly just made next to impossible by lousy Ux).

Related: Web Search Improvements Suggestions from 2005 (most of which I am still waiting for) – Google Social Circle Results – Improve Google Search Results (2006)

Surprise Google Public PageRank Update

Google has published updated public PageRank values in a surprise move. Earlier this year Google Guy (Matt Cutts) had said he would be surprised if Google updated the public PageRank data before 2014.

Initial reports seem to be that people are seeing more drops in PageRank than increases, but that could just be a function of who is talking the loudest. There is also more speculation that the data is old. In a quick check of 2 high-PageRank blogs of mine, the latest posts with PageRank are from August; no September, October, November or December posts have PageRank (the cutoff may well be within a few days of September 1st in either direction). In past updates the data would often be 2 weeks old; this indicates it may well be over 3 months old (which others are also saying).

The previous update was in February 2012. In the past the updates have been published at about 3 month intervals but with a fair amount of variation.

It is unlikely they will publish new values 4 times in 2014, but we will have to see.

See how your sites have fared in the latest Google PageRank update.

Related: Use Multiple PageRank Site to Find PageRank of https PagesGoogle Has Deployed Penguin 2.0 (May 2013)

Why Can’t Search Results Screen Better by Date?

I realize there are some challenges with showing results by date. But the results shown now are pitiful. If you ask for results only from the last week, the vast majority of the results will still be ancient; they likely just had some minor change to superfluous text that isn’t even in the body content of the page.

It really doesn’t seem like making the time filters better would be too hard. Even if they can’t be perfect, making them much better shouldn’t be. Sadly, the various search engines range from bad to pitiful to non-existent in their ability to provide accurate results filtered by date.
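Even a crude heuristic would be an improvement: trust a page’s declared publication date (for example the Open Graph article:published_time meta tag, when present) instead of a last-modified timestamp that changes whenever a sidebar or footer does. The sketch below only illustrates that idea and is not how any search engine actually filters; the helper name and sample page are made up.

```python
import re
from datetime import datetime, timedelta, timezone

# Look for an Open Graph publication date declared in the page's HTML
# (simplified: assumes the property attribute appears before content).
PUBLISHED_META = re.compile(
    r'<meta[^>]+property=["\']article:published_time["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def published_within(html, days, now=None):
    """True only if the page declares a publication date within the last `days` days.

    Pages with no declared date are excluded rather than trusted, since a
    last-modified timestamp may only reflect trivial template changes.
    """
    match = PUBLISHED_META.search(html)
    if not match:
        return False
    published = datetime.fromisoformat(match.group(1).replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return now - published <= timedelta(days=days)

page = '<meta property="article:published_time" content="2016-01-10T08:00:00Z">'
print(published_within(page, days=7, now=datetime(2016, 1, 15, tzinfo=timezone.utc)))  # True
```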

This seems like another opportunity for all those non-Google search engines to take advantage of Google’s inability to do this competently and gain some market share. But I haven’t noticed any of them taking advantage of it. If you know a search engine that actually provides somewhat accurate date filters, add a comment.