Google Check of Whether a Website is Mobile Friendly

Google provides a tool that shows whether they think a web site is “mobile friendly.” Google states that they will penalize sites in their search rankings if Google doesn’t believe they are mobile friendly, so obviously this matters if you care about your ranking in Google.

If the site passes Google’s test you will get a response similar to ours:

[screen shot of our site being deemed mobile-friendly by Google]

Now Google’s automated tool isn’t so great at providing usability advice (such as whether the design really works well for mobile users), but it does tell you whether Google is going to punish the site. If Google thinks the site fails, it provides some feedback, such as:

  • Text too small to read
  • Links too close together
  • Mobile viewport not set

Then you can decide whether those really are issues and whether you want to fix them. Due to Google’s dominant market position you may feel forced to adjust a site (even if it means degrading real usability) in order to make Google happy so your site isn’t punished in Google’s search rankings. Or you can decide that you are going to do what is right for users regardless of what Google will do to the site.
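
The “Mobile viewport not set” item is easy to check yourself. Here is a minimal sketch (the URL is a placeholder, and this only looks for a viewport meta tag; it does not replicate Google’s full test):

```python
# Rough check for a mobile viewport meta tag, one of the items Google's
# mobile-friendly test complains about. It does not replicate Google's
# full test; it only looks for the tag in the page source.
import re
import urllib.request

def has_viewport_meta(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    return re.search(r'<meta[^>]+name=["\']viewport["\']', html, re.IGNORECASE) is not None

print(has_viewport_meta("https://example.com/"))  # placeholder URL
```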

Note that if you don’t have javascript enabled, Google’s tool just fails. I can’t imagine why this tool should require javascript, but it is certainly pitifully lame that instead of providing a clear indication that they created a site that doesn’t work unless javascript is enabled, they just give the completely useless message “There was a problem with the request. Please try again later.” Google should punish sites that do such lame things, in my opinion. I also get that useless message about 20% of the time when I have tried validating a site (but if javascript is enabled, just reloading makes it work).

The tool is useful in pointing out potential issues to improve for mobile users. I do wish, however, that Google wasn’t so autocratic about its opinions, acting as though failing their tests is the same as failing mobile users. It isn’t; it is a decent indication there may be a problem, but it is not proof there is a problem.

Related: Google Still Providing Users Bad Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do – Don’t Use Short URL Services (bit.ly etc.) – Good Blogging Practices

Customized Search Results by Only Searching Specific Web Site

One of the more interesting search features that most people know nothing about is customized search engines. I first experienced these with Rollyo in 2005.

I now use a similar feature, available via Google, to provide customized search results for topics such as investing, engineering and my Curious Cat web sites.

Setting up a Google Custom Search Engine is very easy. Essentially you just set up an account and then add the sites you want included in the search engine. Then you can get a search box to add to your site that will return results based on the web sites you selected.
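
If you would rather query your custom engine programmatically than embed Google’s search box, Google also offers a Custom Search JSON API. A minimal sketch (the API key and engine id below are placeholders for your own credentials):

```python
# Query a Google Custom Search Engine via the Custom Search JSON API.
# API_KEY and ENGINE_ID are placeholders for your own credentials.
import json
import urllib.parse
import urllib.request

API_KEY = "your-api-key"      # placeholder
ENGINE_ID = "your-engine-id"  # the "cx" id of your custom engine (placeholder)

def custom_search(query, num=5):
    params = urllib.parse.urlencode(
        {"key": API_KEY, "cx": ENGINE_ID, "q": query, "num": num}
    )
    url = "https://www.googleapis.com/customsearch/v1?" + params
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    # Each result item includes a title, link and snippet from the sites you selected.
    return [(item["title"], item["link"]) for item in data.get("items", [])]

for title, link in custom_search("investing"):
    print(title, link)
```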

Related: Don’t Hide Important Content Using Coding Gimmicks – Ignoring Direct Social Web Signals for Search Results – Don’t Use URL Shorteners

The Importance of the User in SEO

Optimize for how users are actually using the page — as opposed to how you optimized the page ahead of time — and you’ll see significantly better traffic.

Cyrus Shepard in a good blog post: My Single Best SEO Tip for Improved Web Traffic.

I have always seen the first focus as creating content that users want (both the content itself and the usability of the medium of delivery). Some techniques related to SEO can be useful in tweaking how you go about managing your online presence, but they are secondary to creating great content focused on users.

His article is really focused on putting a large amount of effort into tweaking the content. I think this makes sense for some important pages (and some important flows within a web site or web application). I think it is way too much effort to expend on most pages (the payback won’t be worth the effort).

Related: Good Blogging Practices – Keeping Up with SEO Changes

SEO Techniques for 2016

Your brand’s most valuable tool is its website. Your website is what people will read when they are researching your company and what you do. It is what they will turn to when they want to get in touch with you. It’s what will entice new traffic to buy and existing customers and clients to stick around. And, because of the way the Web works in the second decade of the new millennium, all of that is dependent upon Search Engine Optimization (SEO).

SEO is key to how people find your website now. Many new website developers and builders believe that SEO is just about keywords. The truth is that there is a lot more to SEO than simply using a few choice words or phrases a few times in your content and tags. Here are some of the best things you can do to ensure your website’s SEO is on the right track for your audience in 2016.

Security

More and more companies are finding themselves the victims of hackers and data thieves. It is vitally important that you protect the people who visit your site and submit information through it. While your Web hosting provider will likely offer some security on their end, it is also important that you secure your own servers.

According to Firewall Technical, a company that provides IT support in Ottawa, “A server acts not only as a space for business documents, but it can also control how documents are retrieved. In addition, servers protect the entire network. For this reason, it is critical that your server is operating smoothly.” Make sure that your local and cloud servers are as secure as possible. The last thing you want is someone stealing customer information from your own databases!

Search engines will even put up scary messages warning users if they follow a link to a web site that has been hacked (and the search engine catches the hack). That will drop your search visitors to almost zero, and they will be scared to come back even if you fix your server. No amount of SEO mastery will help you rebuild after a data breach that endangers your customers’ private information!

Links

According to a Moz report done last year, domain level linking is the most important component of SEO success. Domain level linking refers to links that point to the whole site rather than to a specific page or piece of content within that site.

A few years ago, this was easy enough to achieve. Businesses would hire content writers to churn out dozens of pieces of slapdash content and wallpaper the Internet with them (you remember E-zines, right?) to build up their inbound link counts. Starting around 2013, however, Google and the other search engines got hip to this gaming of their system and started reworking their algorithms. It was a move similar to what they did when keyword stuffing started to dilute their search results.

Today the links that point to your site need to be relevant and from quality sites. Setting up link wheels is a punishable offense and publishing the same piece of content in multiple places is no longer allowed (in fact, it can get your site delisted entirely). This means that you need to be proactive about giving interviews and guest posting on relevant and high-ranking sites both within and outside of your niche.

A great way to help this along is to sign up for services such as Help a Reporter Out. You can sign up as a “source” and then respond to the queries that fall within your expertise. If the reporter asking for help/information chooses your response, you might score an interview or a quote, both of which come with links back to your site. You can also sign up for contributor accounts on sites like The Huffington Post.

Keywords

We’ve talked a lot about how keywords aren’t as important as links and security. This doesn’t mean, however, that your keywords are irrelevant in the 2016 SEO game. Keywords and phrases are still important. They are still often the deciding factor in how your site is listed within a searcher’s results. The difference is that now the keywords you choose matter more than ever.

The keywords you choose must flow naturally within your content. This means that when doing your research, you want to focus on phrases that read naturally rather than words strung together haphazardly. You also need to be aware of how often you use them. While there is no definitive ideal percentage, you typically want to limit your keywords to one to two percent of the content.
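
A rough way to sanity check that is to compute the keyword density of a draft. A minimal sketch (the one to two percent figure is the guideline mentioned above, not an official threshold):

```python
# Rough keyword density check: what fraction of the words in a draft are
# part of an occurrence of the keyword phrase? The 1-2% figure is the
# guideline mentioned in the text, not a rule published by search engines.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words:
        return 0.0
    hits = sum(
        1
        for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return 100.0 * hits * len(target) / len(words)

draft = "custom search engine " + "other words about the topic " * 30
print(f"{keyword_density(draft, 'custom search engine'):.1f}%")  # ~2.0%
```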

Security, links and keywords are the most important parts of your SEO campaign. Make sure you get these right and the rest should fall into place!

Related: Keeping Up with SEO Changes

Moz Essentially Stops Sharing MozRank etc. via API

Moz had been allowing up to 1,000,000 rows of data to be retrieved via their API for free each month. We used that service when we updated our MultipageRank site to show Moz scores. We made this update because Google stopped updating their publicly shared pagerank values, and we were happy to support Moz’s commitment to providing data similar to what Google stopped sharing.

Moz updated their site (with no notice) to stop providing results after 25,000 rows a month (so 97.5% less than they used to – essentially none). To retrieve more you must pay $500+ every month. We don’t even get that much in revenue each year so that isn’t an option for us.

We will have to look at options for what to do, possibly finding an alternative to MozRank. At this point MozRank is still shared publicly via the toolbar, so it is still potentially a decent public metric. But just as Google removed the measure from the public once it had served its marketing value, maybe Moz will do the same thing and only provide the data to those paying for it. Obviously it is perfectly within their rights (both Google’s and Moz’s) to restrict access to their data.

It sure would be nice if companies provided a few months’ notice before they stop providing the data. It will take us some time to update our site, since it is just a simple project we do to try and help people.

Related: Use Multiple PageRank Site to Find PageRank of https Pages – Are the Values of Links Decreasing?

Don’t Hide Important Content Using Coding Gimmicks

My comment on: Does Hidden/Tabbed Content Still Get Picked Up By Google?

I would say that hidden tab stuff is bad Ux (most of the time). I couldn’t figure out what the heck was going on when I read this post; it seemed to end without having addressed the issue sensibly. Oh, the content is hidden up above, I finally figured out. I think Google does exactly the right thing in making the hidden content a very low ranking factor for the page, because to a user it is hidden and not the focus of the page.

The conclusion of the original post is that hidden text is given very low weight by Google in search results. If you go to that page, note that you can’t see most of the content for that “page”; you have to find the links to unhide the content.

The hidden text being discussed here is content that only becomes visible once the user clicks something (instead of going to a page that highlights that content, some content is unhidden and other content is hidden in its place). It is just a bad Ux practice in general (though, as with many Ux issues, there are odd cases where it can make sense).

Related: Getting Around Bad Web Page Layouts – Poor Web Site User Experience (Ux) on Financial Sites – The Most Important Search Engine Ranking Factors – Don’t Use Short URL Services

Good Blogging Practices

One of the things I have always done is to read and comment on blogs I find worthwhile. The main reason I do this is to learn. Other advantages include growing a network of like-minded people (which grows from people recognizing you commenting on their blog or on blogs they read, after which some of them start to read your blog…). And growing your following can result in more links to your site and better search rankings.

These are my comments, sparked by a post with some good ideas on good blogging practices. They are edited and extended from the comment I left on that blog.

Great thoughts.

Give your readers what they want: so important, and yes, to some extent people think of this, but that idea should get more attention from most bloggers.

Length of posts: as you say, make them appropriate. Sometimes what you have to share is best captured in a long post. Sometimes a short post is best. Trying to jam a post into a specific format/length is a recipe for failure.

I do think long, detailed posts are valuable, and if you never write them there is likely some value in seeing whether some of what you have to say can be expressed well in a long post.

Comments I leave do sometimes spark me to write longer posts on my own blogs. I also started a management blog on blogspot (over 10 years ago), and when I created my own domain (also over 10 years ago) I left it there (urls should live forever).

A few years later I started to use that blog to republish comments I thought were worth keeping (one of the things I do is link to my previous content, and trying to find some comment I want to reference is really hard; by collecting comments I think I might want to reference on that blog, I can actually find them again). I often edit these a bit and add some links (which I am often prevented from including in the original comment, even when they would be really useful).

I was adding this to the related links that follow – Build Your Online Presence (another post that started as a comment) – and this shows another reason to republish your comments that are worth keeping: the original article link is gone. I always include a link to the post I commented on; it is amazing how many are broken a few years later (lots of people violate a basic web usability and wise SEO practice by breaking their urls).

An illustration of why it is in your interest to have urls live forever: last year I did posts on the most popular posts on many of my blogs (based on views in 2014). A fairly typical example is from my Curious Cat Comments blog. The most popular posts, by the year they were published: 2014 – 6 (the most recent year does have an advantage, as lots of regular readers read each new post); 2013 – 1; 2010 – 1; 2009 – 1; 2008 – 1; 2007 – 1; 2006 – 2. This blog actually was more heavily weighted toward recent posts than most of my blogs. I just checked it for this year, and 2 posts from 2007 and 1 post from 2008 that were not in the top last year are all in the top 6 this year (and the 2008 post from last year’s list is in there again as well).

Related: Blog commenting options – Make Your Blog Welcoming – Don’t Use Short URL Services

Big Updates to Moz Index Results in Big Moves in Domain Authority and Page Authority Results

Moz posted a big update to their index this week that had a big impact on Moz Page Authority and Domain Authority. Why does it matter?

Really it doesn’t matter, but since Google is so secretive, the Moz data gives us some insight into what Google (and other search engines) are likely seeing. The changes to Moz have no direct effect on search results or traffic. What Moz believes (and it makes sense that they are right) is that the updates better match what Google (and the others) see.

Basically Moz found some weaknesses in their prior data and methods and have tried to improve them, as they explained here. Many sites are noticing lower Page Authority and Domain Authority numbers (as I am on mine). I am not clear yet, but it seems possible there was a general inflation in the numbers, so say the average number declined by 20% (this is just a made-up number for illustration purposes). If that were true, what really matters is whether you declined less (that would be good) or more (that would be bad) than 20%.
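
A tiny worked example of that relative comparison (all the numbers here are made up, like the 20% above):

```python
# Made-up numbers to illustrate the relative comparison: if scores were
# generally deflated, what matters is whether your drop was bigger or
# smaller than the average drop.
def pct_change(old, new):
    return 100.0 * (new - old) / old

average_change = -20.0            # assumed general deflation (made up)
my_change = pct_change(42, 36)    # e.g. Domain Authority 42 -> 36 (made up)

print(f"site: {my_change:.1f}%, average: {average_change:.1f}%")
# A ~14.3% drop is smaller than the assumed 20% average drop, so in
# relative terms the site came out ahead.
```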

And of course, there will be lots of variation in the changes in scores. These scores move around a fair amount (though Domain Authority scores do seem fairly stable over time) even when no big changes are happening at Moz.

Comments on the fluctuations of DA and PA scores from Rand Fishkin, in DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores:

because Mozscape indices take 3-4 weeks to process, the data collected in an index is between ~21-90 days old.

Since Domain and Page Authority are on a 100-point scale, the very top of that represents the most link-rich sites and pages, and with nearly every index it’s harder and harder to get these high scores; sites that aren’t growing their link profiles substantively will, on average, see PA/DA drops.

PA/DA are created using a machine-learning algorithm whose training set is search results in Google. Over time, as Google gets pickier about which types of links it counts, and as Mozscape picks up on those changes, PA/DA scores will change to reflect it.

My strongest suggestion if you ever have the concern/question “Why did my PA/DA drop?!” is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it’s more likely related to relative scaling or crawl biasing issues, not to anything you’ve done

Rand provides lots of good insight here. Moz is generally followed closely by people that pay a great deal of attention to SEO. I am not really in that camp; I pay some attention just because I find it interesting. I don’t spend time trying to figure out how to boost search rankings through various gimmicks.

I don’t pay much attention to ratings for other sites, but based on his suggestion I might start tracking a few similar sites to see how their scores vary over time, as a way of understanding my scores better. All I really did before was look at other sites’ authority scores and compare when I was bored (maybe 2 or 3 times a year), but I didn’t keep track of any of them.

I find Moz interesting because it gives us open access to interesting data. There are many other things that impact search results, but the authority that pages and sites have is an interesting thing to watch (and does have a real impact on search results, even if it is much less than people might suspect).

Earlier this year I wrote about Decreases in MozRank and Page Authority for some of my sites and I posted an update where most of the decreases had disappeared (the authority numbers had returned to the same or close to what they were before the decline). Hopefully that will happen for my sites this time too, but we will have to wait and see.

Related: Most Important Search Engine Ranking Factors – Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now – Keeping Up with SEO Changes

Don’t Use Short URL Services

I am against using url shortening services to redirect urls for 4 reasons.

  1. Short urls add a point of failure – the services go out of business and the urls go bad (or, even worse, get redirected to whoever buys the url service’s domain), or sometimes the short urls just expire and are reused (which is really lame).
    There is also the risk that the country owning the domain messes things up (bit.ly uses Libya’s domain – not exactly a stable country…). If the domain is owned by a super rich company, they will likely pay a huge ransom for the domain if a country demands it – but that is not certain. .be is owned by Belgium (which Google uses for YouTu.be short urls) and is probably less likely to mess with Google. But if the USA government messes with European privacy rights, one path for those countries is to mess with their domains and create trouble for the .be domain – or whatever other domain is in question.
  2. You lose the tremendous information value that a real human readable url provides users. You also lose the small aid to building your brand available by having them see your name in the url. Finally, short urls (by throwing away the human readable url information users would benefit from) contribute to security problems by encouraging people to blindly click on links without knowing where they are being taken. Scammers take advantage of users who are willing to follow short url links.
  3. You lose the Search Engine Optimization (SEO) value of links by not linking to the actual url. For this reason it is a particularly bad idea to use short urls for your own content (but I see this done). When you are posting your content on a site that tells Google not to trust the link you entered (the nofollow attribute), this point is not relevant, but the other 3 points still are. And I see people use short urls even for followed links.
  4. Url shorteners delay page load times for users. I often find url shorteners forwarding to another url shortener, which forwards to another url shortener, and so on. Just last week, following a link on Harvard Business School’s Twitter account, I was forwarded through 7 different urls before reaching the actual url (a page on one of their own sites).

    If you are on a fiber internet connection and all those url redirects respond immediately, it probably won’t be noticeable (so the people at Harvard may have no clue how lame they look to users), but if you are on a connection with high latency (as many hundreds of millions of people across the world are) it can easily take a second or two before the page even starts to load. With all the evidence on how critical fast load times are for users, adding delays with url shortener redirection is a bad practice (a sketch for tracing such redirect chains follows below).

    [photo: long urls written out on paper]

    It would be better for this Grandmom to use short urls when writing out her favorite urls to show her grandchild.
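
If you want to see how bad a particular short url chain is, you can trace the redirects yourself. A minimal sketch using the third-party requests library (the short url below is a placeholder):

```python
# Trace a short url's redirect chain and report each hop.
# Requires the third-party requests library: pip install requests
import requests

def trace_redirects(short_url):
    resp = requests.get(short_url, allow_redirects=True, timeout=10)
    hops = resp.history + [resp]   # each intermediate response plus the final page
    for i, r in enumerate(hops):
        print(f"{i}: {r.status_code} {r.url}")
    # elapsed covers request-to-headers time for each hop, so this is a rough total
    total = sum(r.elapsed.total_seconds() for r in hops)
    print(f"{len(resp.history)} redirects, roughly {total:.2f}s spent on the chain")

trace_redirects("https://bit.ly/example")  # placeholder short url
```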

High MozRank DoFollow Blogs

Due to spam comments, many sites add the nofollow attribute to comment links. For many years nofollow has been the default in WordPress (you have to use a plugin to revert to the original style where comment author links were not flagged as untrusted). With the nofollow attribute, Google (and Moz) do not give the link value.
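
A rough way to check whether a blog’s links carry nofollow is to fetch a post and look at the rel attribute on its anchor tags. A minimal sketch using only the standard library (the URL is a placeholder, and it checks every link on the page, not just comment author links):

```python
# List the links on a page and whether they carry rel="nofollow".
# This looks at every anchor tag; it cannot tell which links are
# comment-author links, so treat the output as a starting point.
import urllib.request
from html.parser import HTMLParser

class LinkRelParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            href = attrs.get("href")
            rel = (attrs.get("rel") or "").lower()
            if href:
                self.links.append((href, "nofollow" in rel))

def check_links(url):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkRelParser()
    parser.feed(html)
    for href, nofollow in parser.links:
        print("nofollow" if nofollow else "followed", href)

check_links("https://example.com/some-post/")  # placeholder URL
```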

Here is a list of blogs that moderate their comments and provide dofollow links, giving those who contribute worthwhile comments the benefit of having their links considered real links by Google (and others). I will continue to keep this list updated.

The order of the list is based on MozRank, with a penalty for using popups that interfere with visitors using the site. See the very bottom of this post for blogs that supposedly have dofollow comments but where I have been unable to comment and my messages to them have not been answered.

Many of the best blogs that provide dofollow links require the use of your real name, a link to your home page or a blog that you obviously write, and comments that are valuable (not just meaningless drivel). They may also require numerous (normally between 3 and 10) approved comments before links become dofollow.

Unfortunately many people spam these blogs in an attempt to get dofollow links. That results in many of the blogs turning off dofollow links. Those that stay dofollow are usually impatient with spammy, low quality comments and remove poor quality links that are not personal blogs. If you comment, post valuable comments if you expect to get a dofollow link; otherwise you are just contributing to the decline of blogs that provide dofollow links.

Why don’t I list 50 or 100 more that are nofollow, haven’t been used in years and where the domain was deleted? That doesn’t make sense to me. But, maybe I am crazy (so I explain my craziness here), since most other listings do that.

If you know of a dofollow blog with at least a 1 year track record and compelling posts (if it isn’t high quality it will likely die, so it isn’t worth adding just to have to remove it later), add a comment with the information on the blog.

Related: Ignoring Direct Social Web Signals in Search Results – Google and Links (2012) – Using Twitter Data to Improve Search Results

* CommentLuvDF – they dofollow the blog-post-title link (usually only after between 3 and 10 approved comments) but not the author link

These blogs don’t work for me (or often don’t work, though they work sometimes). Either:

  • they don’t post my comments and don’t reply to my contact messages asking why (if they decided to block them because they didn’t value the comments, that would be fine – it is their blog – but most likely they have a spam filter that just trashes my comments), though they do have some dofollow comments.
  • they removed links to the author’s blog (and the CommentLuv post link) from comments that were made. It is their right to do so. But the links removed were links to personal blogs, and if they are removing those links they don’t really fit in a list of dofollow blogs.
  • or they delete many comments without notice to the comment author (probably an overly aggressive spam filter, but maybe manual action – there is no way to know).
  • 5.7 Adrienne Smith (MPA 49, MSS 2, CommentLuvDF)
  • 5.3 Sylvia Nenuccio (MPA 35, MSS 0, CommentLuvDF)
  • 5.2 Sherman Smith’s Blog (MPA 43, MSS 2, CommentLuvDF, popup)
  • 5.4 Power Affiliate Club (MPA 33, MSS 2, CommentLuvDF, popup)