Category Archives: SEO

Google Check of Whether a Website is Mobile Friendly

Google provides a tool to show whether they think a web site is “mobile friendly.” Google states that they will penalize sites in their search rankings if Google doesn’t believe they are mobile friendly. So obviously this matters if you care about your ranking in Google.

If the site passes Google’s test you will get a response similar to ours:

screen shot of site being deemed mobile-friendly by Google

Now Google’s automated tool isn’t so great at providing good usability advice (such as whether a design really works well for mobile users) but it does tell you if Google is going to punish the site or not. If Google thinks the site fails they will provide some feedback, such as:

  • Text too small to read
  • Links too close together
  • Mobile viewport not set

Then you can decide if those really are issues and if you want to fix them. Due to Google’s dominant market position you may feel forced to adjust a site (even if it means degrading real usability) in order to make Google happy so your site isn’t punished in search rankings. Or you can decide that you are going to do what is right for users regardless of what Google will do to the site.
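One of the flagged issues above, a missing mobile viewport, is easy to check for yourself. Here is a minimal sketch (an illustration only, not Google’s actual implementation) using Python’s standard-library HTML parser; the sample pages are made up:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Detect a <meta name="viewport"> tag, one signal of a mobile-ready page."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport = True

def has_mobile_viewport(html):
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport

# Hypothetical pages for illustration:
mobile_page = '<html><head><meta name="viewport" content="width=device-width, initial-scale=1"></head></html>'
desktop_page = '<html><head><title>No viewport</title></head></html>'
```

A real check would also need to judge font sizes and tap-target spacing, which can’t be determined from the markup alone.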

Note that if you don’t have javascript enabled Google’s tool just fails. I can’t imagine why this tool should require javascript, but it is pitifully lame to not provide a clear indication that they created a site that doesn’t work unless javascript is enabled, instead of just giving the completely useless message “There was a problem with the request. Please try again later.” as they do now. Google should punish sites that do such lame things, in my opinion. I also get that useless message about 20% of the time when I have tried validating a site (though if javascript is enabled just reloading makes it work).

The tool is useful in pointing out potential issues to improve for mobile users. I do wish, however, that Google wasn’t so autocratic about its opinions, acting as though failing their tests is equal to failing mobile users. It isn’t; it is a decent indication there may be a problem but it is not proof there is a problem.

Related: Google Still Providing Users Bad Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do – Don’t Use Short URL Services (bit.ly etc.) – Good Blogging Practices

The Importance of the User in SEO

Optimize for how users are actually using the page — as opposed to how you optimized the page ahead of time — and you’ll see significantly better traffic.

Cyrus Shepard in a good blog post: My Single Best SEO Tip for Improved Web Traffic.

I have always seen the first focus as creating content that users want (both the content itself and the usability of the medium of delivery). Some techniques related to SEO can be useful in tweaking how you manage your online presence, but that is secondary to creating great content focused on users.

His article is really focused on putting a large amount of effort into tweaking the content. I think this makes sense for some important pages (and some important flows within a web site or web application). I think it is way too much effort to expend on most pages (the payback won’t be worth the effort).

Related: Good Blogging Practices – Keeping Up with SEO Changes

SEO Techniques for 2016

Your brand’s most valuable tool is its website. Your website is what people will read when they are researching your company and what you do. It is what they will turn to when they want to get in touch with you. It’s what will entice new traffic to buy and existing customers and clients to stick around. And, because of the way the Web works in the second decade of the new millennium, all of that is dependent upon Search Engine Optimization (SEO).

SEO is key to how people find your website now. Many new website developers and builders believe that SEO is just about keywords. The truth is that there is a lot more to SEO than simply using a few choice words or phrases a few times in your content and tags. Here are some of the best things you can do to ensure your website’s SEO is on the right track for your audience in 2016.

Security

More and more companies are finding themselves the victims of hackers and data thieves. It is vitally important that you protect the people who visit your site and submit information through it. While your Web hosting provider will likely offer some security on their end, it is also vitally important that you protect your work on your own servers.

According to Firewall Technical, a company that provides IT support in Ottawa, “A server acts not only as a space for business documents, but it can also control how documents are retrieved. In addition, servers protect the entire network. For this reason, it is critical that your server is operating smoothly.” Make sure that your local and cloud servers are as secure as possible. The last thing you want is someone stealing customer information from your own databases!

Search engines will even put up scary messages warning users when they follow a link to a web site that has been hacked (if the search engine catches the hack). That will drop your search visitors to almost zero, and they will be scared to come back even if you fix your server. No amount of SEO mastery will help you rebuild after a data breach that endangers your customers’ private information!

Links

According to a Moz report done last year, domain level linking is the most important component of SEO success. Domain level linking is a link that points to the whole site instead of a specific page or piece of content within that site.

A few years ago, this was easy enough to achieve. Businesses would hire content writers to churn out dozens of pieces of slapdash content and wallpaper the Internet with them (you remember E-zines, right?) to build up their inbound link percentage. Starting around 2013, however, Google and the other search engines got hip to this gaming of their system and started reworking their algorithms. It was a move similar to what they did when keyword stuffing started to dilute their search results.

Today the links that point to your site need to be relevant and from quality sites. Setting up link wheels is a punishable offense and publishing the same piece of content in multiple places is no longer allowed (in fact, it can get your site delisted entirely). This means that you need to be proactive about giving interviews and guest posting on relevant and high-ranking sites both within and outside of your niche.

A great way to help this along is to sign up for services such as Help a Reporter Out. You can sign up as a “source” and then respond to the queries that fall within your expertise. If the reporter asking for help/information chooses your response, you might score an interview or a quote – both of which come with links back to your site. You can also sign up for contributor accounts on sites like The Huffington Post.

Keywords

We’ve talked a lot about how keywords aren’t as important as links and security. This doesn’t mean, however, that your keywords are irrelevant in the 2016 SEO game. Keywords and phrases are still important. They are still often the deciding factor in how your site is listed within a searcher’s results. The difference is that now the keywords you choose matter more than ever.

The keywords you choose must flow naturally within your content. This means that when doing your research, you want to focus on keywords that flow naturally instead of just words strung together haphazardly. You also need to be aware of how often you use them. While there is no definitive ideal percentage, you typically want to limit your keywords to one to two percent of the content.
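A rough way to check that one-to-two percent guideline is a simple word count. Here is a sketch (it handles single-word keywords only; the sample text and numbers are made up for illustration):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in text matching the keyword (single words only)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tips: write for users first. Good SEO follows good content."
# "SEO" is 2 of the 11 words here, roughly 18% - far above the 1-2% guideline.
```

A multi-word phrase would need n-gram matching instead of the per-word comparison used here.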

Security, links and keywords are the most important parts of your SEO campaign. Make sure you get these right and the rest should fall into place!

Related: Keeping Up with SEO Changes

Don’t Hide Important Content Using Coding Gimmicks

My comment on: Does Hidden/Tabbed Content Still Get Picked Up By Google?

I would say that hidden tab stuff is bad Ux (most of the time). I couldn’t figure out what the heck was going on when I read this post, and it seemed to end without having addressed the issue sensibly. Oh, the content is hidden up above, I finally figured out. I think Google does exactly the right thing in making the hidden content a very low ranking factor for the page, because as a user it is hidden and not the focus of the page.

The conclusion of the original post is that hidden text is given very low weight by Google in search results. If you go to that page, note that you can’t see most of the content for that “page”; you have to find the links to unhide the content.

The hidden text being discussed here is content that only becomes visible once the user clicks something (and instead of going to a page that highlights that content, some content is unhidden and other content on the page becomes hidden). It is just a bad Ux practice in general (though as with many Ux issues there are odd cases where it can make sense).

Related: Getting Around Bad Web Page Layouts – Poor Web Site User Experience (Ux) on Financial Sites – The Most Important Search Engine Ranking Factors – Don’t Use Short URL Services

Good Blogging Practices

One of the things I have always done is to read and comment on blogs I find worthwhile. The main reason I do this is to learn. Other advantages include growing a network of like-minded people (which grows from people recognizing you commenting on their blog or on blogs they read, and then some starting to read your blog…). And a growing following can result in more links to your site and better search rankings.

These are my comments sparked by a post with some good ideas on good blogging practices. They are edited and extended from the comment left on the blog.

Great thoughts.

Give your readers what they want: so important and yes to some extent people think of this, but that idea should get more attention from most bloggers.

Length of posts: as you say, make them appropriate. Sometimes what you have to share is best captured in a long post. Sometimes a short post is best. Trying to jam a post into a specific format/length is a recipe for failure.

I do think the long, detailed posts are valuable and if you are never doing that there is likely some value in seeing if some of what you have to say can be expressed well in a long post.

Comments I leave do sometimes spark me to write longer posts on my own blogs. I also started a management blog on blogspot (over 10 years ago) and when I created my own domain (also over 10 years ago) I left it there (urls should live forever).

A few years later I started to use that blog to republish comments I thought were worth keeping (one of the things I do is link to my previous content and trying to find some comment I want to reference is really hard, by collecting comments I think I might want to reference on that blog I can actually find them again). I often edit these a bit and add some links (which I often am prevented from including even when they would be really useful).

I was adding this to the related links that follow – Build Your Online Presence (another post that started as a comment). And this shows another reason to republish your comments that are worth keeping: the original article link is gone. I always include a link to the post I commented on; it is amazing how many are broken a few years later (lots of people violate a basic web usability and wise SEO practice by breaking their urls).

An illustration of why it is in your interest to have urls live forever: last year I did posts on the most popular posts on many of my blogs (based on views in 2014). A fairly typical example is from my Curious Cat Comments blog. The most popular posts by year of publication (the most recent year does have an advantage, as lots of regular readers read each new post): 2014 – 6 posts; 2013 – 1; 2010 – 1; 2009 – 1; 2008 – 1; 2007 – 1; 2006 – 2. This blog actually was more heavily weighted toward recent posts than most of my blogs. I just checked it for this year and 2 posts from 2007 and 1 post from 2008 that were not in the top last year are all in the top 6 this year (and the one from 2008 that made it last year is repeated again).

Related: Blog commenting options – Make Your Blog Welcoming – Don’t Use Short URL Services

Don’t Use Short URL Services

I am against using url shortening services to redirect urls for 4 reasons.

  1. Short urls add a point of failure – they go out of business and the urls go bad (or even worse get redirected to whoever buys the url service domain) or sometimes the short urls just expire and are reused (which is really lame).
    There is also the risk that the country owning the domain messes things up (bit.ly uses Libya’s domain – not exactly a stable country…). Likely, if the domain is owned by a super rich company, they will pay a huge ransom for the domain if a country demands it – but that isn’t certain. .be is owned by Belgium (which Google uses for YouTu.be short urls), and Belgium is probably less likely to mess with Google. But if the USA government messes with European privacy rights, one path for those countries is to mess with their domains and create trouble for .be domains – or whatever other domain is in question.
  2. You lose the tremendous information value that a real human readable url provides users. You also lose the small aid to building your brand available by having them see your name in the url. Finally short urls (by throwing away the human readable url information users would benefit from) contribute to security problems by encouraging people to blindly click on links they don’t know where they are being taken. Scammers take advantage of users that are willing to follow short url links.
  3. You lose Search Engine Optimization (SEO) value of links by not linking to the actual url. For this reason it is a particularly bad idea to use short urls for your own content (but I see this done). When you are posting your content on a site that tells Google not to trust the link you entered (nofollow attribute) this point is not relevant but the other 3 points still are. And I see people use short urls even for followed links.
  4. Url shorteners delay the page load times for users. I often find urls shorteners forwarded to another url shortener forwarded to another url shortener and so on. Just last week, following a link on Harvard Business School’s Twitter account I was forwarded to 7 different urls before the actual url (a page on one of their own sites).

    If you are on a fiber internet connection and all those url redirects respond immediately it probably won’t be noticeable (so the people at Harvard may have no clue how lame they look to users), but if you are on a connection with high latency (as many hundreds of millions of people across the world are) it can easily take a second or two before the page even starts to load. With all the evidence on how critical fast load times are for users, adding in delays with url shortener redirection is a bad practice.

    long urls written out on paper

    It would be better for this Grandmom to use short urls to write out her favorite urls to show her grandchild. via

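The redirect-chain delay described in point 4 is easy to picture: each hop is a full HTTP round trip before the destination page even begins to load. A sketch, with a dict standing in for the shorteners (the urls are made up; a real chain would involve live HTTP requests):

```python
# Hypothetical 3-hop shortener chain (the Harvard example above had 7):
redirects = {
    "https://short.example/a": "https://short.example/b",
    "https://short.example/b": "https://short.example/c",
    "https://short.example/c": "https://example.edu/article",
}

def resolve(url, redirects, max_hops=10):
    """Follow redirects to the final url, counting hops (capped to avoid loops)."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

# At ~300 ms of latency per round trip, 3 hops adds roughly a second
# before the real page starts loading.
```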

High MozRank DoFollow Blogs

Due to spam comments many sites add the nofollow tag to comment links. For many years the nofollow tag has been the default in WordPress (you have to use a plugin to revert to the original style where comment author links were not flagged as untrusted). With the nofollow tag Google (and Moz) do not give the link value.
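The nofollow flag is just a rel attribute on the link, so it is straightforward to see which comment links pass value. A sketch using Python’s standard-library parser (the urls are hypothetical):

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Split links into followed and nofollowed based on the rel attribute."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rels = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rels:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

comment_html = (
    '<a href="https://blogger.example/" rel="nofollow">comment author</a> '
    '<a href="https://trusted.example/">editorial link</a>'
)
audit = LinkAudit()
audit.feed(comment_html)
```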

Here is a list of blogs that moderate their comments and provide dofollow links giving those that contribute worthwhile comments the benefit of being considered real links by Google (and others). I will continue to keep this list updated.

Order of the list is based on MozRank with a penalty for using popups to interfere with visitors using the site. See the very bottom of this post for blogs that supposedly have dofollow comments but I have been unable to comment and my messages to them have not been answered.

Many of the best blogs that provide dofollow links require the use of your real name, a link to your home page or a blog that you obviously write, and comments that are valuable (not just meaningless drivel). They may also require numerous (normally between 3 and 10) approved comments before links become dofollow.

Unfortunately many people spam these blogs in an attempt to get dofollow links. That results in many of the blogs turning off dofollow links. Those that stay dofollow are usually impatient with spammy low quality comments and remove poor quality links that are not personal blogs. If you comment, post valuable comments if you expect to get a dofollow link; otherwise you are just contributing to the decline of blogs that provide dofollow links.

Why don’t I list 50 or 100 more that are nofollow, haven’t been used in years and where the domain was deleted? That doesn’t make sense to me. But, maybe I am crazy (so I explain my craziness here), since most other listings do that.

If you know of dofollow blogs with at least a 1 year track record and compelling posts (if a blog isn’t of high quality it will likely die, so it isn’t worth adding just to have to remove it later), add a comment with the information on the blog.

Related: Ignoring Direct Social Web Signals in Search Results – Google and Links (2012) – Using Twitter Data to Improve Search Results

* CommentLuvDF – they dofollow the blog-post-title link (usually only after between 3 and 10 approved comments) but not the author link

These blogs don’t work for me (or often don’t work but work sometimes). Either:

  • they don’t post my comments and don’t reply to my contact messages about why (if they decided to block them because they didn’t value the comment that would be fine, it is their blog – but most likely they have a spam filter that just trashes my comments) but do have some dofollow comments.
  • they removed links to author’s blog (and comment luv post link) from comments that were made. It is their right to do so. But the links removed were links to personal blogs and if they are removing those links they don’t really fit in a list of dofollow blogs.
  • or they delete (probably too aggressive spam filter but maybe manual action, there is no way to know) many comments without notice to the comment author.
  • 5.7 Adrienne Smith (MPA 49, MSS 2, CommentLuvDF)
  • 5.3 Sylvia Nenuccio (MPA 35, MSS 0, CommentLuvDF)
  • 5.2 Sherman Smith’s Blog (MPA 43, MSS 2, CommentLuvDF, popup)
  • 5.4 Power Affiliate Club (MPA 33, MSS 2, CommentLuvDF, popup)
Most Important Search Engine Ranking Factors

Moz published their annual Search Engine Ranking Factors based on a survey of SEO experts. The SEO experts’ opinions of the most important factors in 2015 are:

  1. Domain level linking (8.2 out of 10) – quality and quantity of links etc. to the entire domain
  2. Page level linking (7.9) – quality and quantity of the links to the page, anchor text
  3. Page level keyword and content (7.9) – content relevance to search term, content quality ranking factors, topic modeling factors
  4. Page level keyword agnostic measures (6.6) – readability, content length, uniqueness, load speed, markup, https, etc.
  5. Engagement data (6.6) – based on SERP clickstream data, visitor traffic and usage signals… on the page and domain level

This both reinforces the importance of links and shows how search result rankings have evolved to include many other factors as significant determinants.

Related: Keeping Up with SEO Changes – Site Spam Flags Score from Moz – Why Don’t Search Results Screen Better by Date?

New Site Spam Flags Score from Moz

Moz continues to provide interesting tools and site measures. I only follow these things because I find them interesting (not as a profession). I am not an SEO person and paying the $100 a month (or much more) they charge for their tools isn’t worth it for my curiosity. But they make some things available for free and provide some interesting blog posts on what they find and about their tools.

This new Spam Score analysis by Moz seems very interesting: Spam Score: Moz’s New Metric to Measure Penalization Risk. The idea is sensible: they are trying to determine the spam riskiness of a site based on the correlations they can draw from their web crawl data and Google search results. Moz can see where sites are not ranking well when many factors indicate they should rank, and then draw a conclusion that Google has penalized certain sites (and not given sites with links from those sites credit, or worse, penalized sites with links from those sites).

This seems like a really good idea. They found 17 flags that are correlated with spam hits to a site, and as sites trip more and more of those flags the likelihood of Google classifying them as spam rises. When a site has 0 spam flags Moz calculates a .5% chance of the site showing up in Google search results (or more likely, not showing) in a way that indicates Google sees the site as spam. 4 spam flags equals a 7.5% chance of being a “spam site.” A site with 6 spam flags has a 16% chance of being spam, 7 flags means a 31% chance, 8 is a 57% chance, 9 a 72% chance and 14 a 100% chance.

Screen shot of Moz’s Spam Flag report.

In their post Moz says that tripped spam flags are not meant to be an indication of something that needs to be fixed (after all, the flags are just correlation, not causation – “fixing them” may do nothing for search results). That may be true, but if a site is showing 5 yellow flags for spaminess it is highly likely lots of people are going to want to reduce this scary looking feedback about their site.

What is likely to happen is that sites will change to avoid flags: adding twitter buttons and making whatever tweaks get rid of several more flags.

My guess is that a spaminess rating that wasn’t just x/17, but instead factored in which of the 17 flags were tripped and how important each was, would be better (I would imagine including which interactions of spam flags were more critical…).

I would be surprised if there isn’t a big difference between a certain 3 flags being tripped versus 3 other flags being tripped (plus say 4 other random flags). That is to say, even with Moz’s limited ability to know what Google is directly reacting to versus correlations they can observe, I would imagine this could be improved into a 100 point (or whatever) system that gave much more valuable spam site insight than just treating each flag as equally important (and ignoring especially deadly interactions between flags – which flags, when tripped together, cause the likely spam hit to be seen in Google results).
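The weighting idea above can be sketched simply: score flags by importance rather than counting them equally. The flag names and weights below are invented for illustration; Moz has not published weights like these:

```python
# Hypothetical flag weights - invented for illustration, not Moz's data.
FLAG_WEIGHTS = {
    "thin_content": 12,
    "high_external_link_ratio": 10,
    "no_contact_info": 4,
    "no_https": 3,
}

def weighted_spam_score(tripped_flags):
    """Sum the weights of tripped flags instead of treating each as equal."""
    return sum(FLAG_WEIGHTS.get(flag, 0) for flag in tripped_flags)

# Two sites each trip 2 flags, so a simple count rates them the same,
# while a weighted score separates them:
risky = weighted_spam_score({"thin_content", "high_external_link_ratio"})
mild = weighted_spam_score({"no_contact_info", "no_https"})
```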


Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now

We have updated the MultiPageRank site to provide MozRank, Moz PageAuthority, Google PageRank and Alexa results. In one simple request you can retrieve all these measures for multiple domains.

Google provided an opening in the market to serve users interested in page authority/popularity when they slowed sharing updates to public Google page rank. Moz has filled that role extremely well. For a year or two Moz results have been much more useful than Google’s. We have finally added Moz results to our results page.

MozRank is the closest to Google page rank: a measure of raw link authority to the page. As with Google page rank, the link weight is based on the rank of the page providing the link. So 1 link on the home page of a very popular site can provide more rank to the linked page than thousands of links from low quality pages.

Moz page authority is enhanced with many extra factors to try and provide a better estimation of search result “authority.” Moz calculates it based on data from the Mozscape web index, including link counts, MozRank, MozTrust, and dozens of other factors.
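The core idea that a link’s weight depends on the rank of the page giving it can be shown with a minimal PageRank-style power iteration. This is a sketch of the general algorithm, not Moz’s or Google’s actual computation, and the page names are made up:

```python
def pagerank(links, iters=50, d=0.85):
    """Iteratively distribute rank: each page shares d * its rank among its outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

# "popular" receives links from both other pages; "hub" receives none,
# so its only rank is the base (1 - d) / n share.
links = {
    "popular": ["obscure"],
    "hub": ["popular"],
    "obscure": ["popular"],
}
ranks = pagerank(links)
```

One link from the well-linked “popular” page is worth far more to its target than a link from the unlinked “hub” page, which is the behavior described above.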

We also continue to include Alexa data, which has significant reliability issues but is of some interest. Alexa uses their data (largely from toolbar users) to rank websites by total visitors/visits (a combination). Their data is biased, with SEO sites in particular getting a big boost, as users of those sites are often running a toolbar that shares data with Alexa and they visit lots of SEO related sites.

We have had some issues (largely very slow response times for the results page) providing the additional Moz data, but I believe things are working well now. Still, I have kept the old results visible at www.multipagerank.com. The new results are found on multipagerank.com. I made the split when we first had issues, as we worked on them. I will likely eliminate the old results page in the next couple of weeks if everything continues to go well.

Related: Use Our Multiple PageRank Site to Find PageRank of https Pages – Is the Value of Links Decreasing? – Keeping Up with SEO Changes