Tag Archives: SEO

High Domain Authority Blogs That Use DoFollow Commenter Links

Due to spam comments, many sites add the nofollow attribute to comment links. For many years nofollow has been the default in WordPress (you have to use a plugin to revert to the original behavior, where comment author links were not flagged as untrusted). With the nofollow attribute, Google (and Moz) do not give the link value.

Here is a list of blogs that moderate their comments and provide dofollow links, giving those who contribute worthwhile comments links that Google (and others) count as real links. I will remove blogs that switch to being nofollow.

This list is up to date – unlike nearly every other source I find online (most such lists bury a handful of blogs that are actually dofollow in a huge list of nofollow blogs). I also don’t list blogs that are no longer actively updated. If you wonder why this list is so short, those are the reasons.
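If you want to verify for yourself whether a blog treats its links as nofollow, you can fetch a post and inspect the rel attribute on its links. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries; the URL is a placeholder, and depending on the blog’s markup you would narrow the results down to the comment author links:

```python
# Minimal sketch: report which links on a page carry rel="nofollow".
# Requires: pip install requests beautifulsoup4
# The URL below is a placeholder - point it at a blog post with comments and
# filter to the comment section based on that blog's markup.
import requests
from bs4 import BeautifulSoup

def check_links(post_url):
    html = requests.get(post_url, timeout=15).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        rel = a.get("rel") or []          # bs4 parses rel as a list of tokens
        status = "nofollow" if "nofollow" in rel else "dofollow"
        print(f"{status:9} {a['href']}")

check_links("https://example.com/some-blog-post/")  # placeholder url
```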

View the 2015 version of this list (that was last updated in early 2017)

The order of the list is based on MozRank, but sites are moved down for using popups that interfere with visitors and for other usability problems.

Many of the best blogs that provide dofollow links require the use of your real name, a link to your home page or a blog that you obviously write, and comments that are valuable (not meaningless drivel like “great post,” which is often just deleted). They may also require numerous (normally between 3 and 10) approved comments before links become dofollow.

Unfortunately many people spam these blogs in an attempt to get dofollow links. That results in many of the blogs turning off dofollow links. Those that stay dofollow usually have little patience for spammy, low quality comments and remove links to poor quality sites that are not personal blogs. If you comment, post valuable comments if you expect to get a dofollow link; otherwise you are just contributing to the decline of blogs that provide dofollow links.

I also provide dofollow links to the blogs I list here (unless they have failed to post my comments without explanation – likely due to poorly performing spam filters; if they choose to delete comments for not being the quality they expect and say so, that is fine). If you see a list without links (just listing text urls) you can be confident it was not created with much care and skip it: go find more reliable lists, which will have real links to the blogs.

If you know of a dofollow blog with at least a 1 year track record and compelling posts (if it isn’t high quality it will likely die, so it isn’t worth adding just to have to remove it later), add a comment with information about the blog.

Related: Ignoring Direct Social Web Signals in Search Results – Google and Links (2012) – Using Twitter Data to Improve Search Results

* CommentLuvDF – these blogs dofollow the blog-post-title link (usually only after between 3 and 10 approved comments) but not the author link

Google Check of Whether a Website is Mobile Friendly

Google provides a tool to show whether they think a web site is “mobile friendly.” Google states that they will penalize sites in their search rankings if Google doesn’t believe they are mobile friendly, so obviously this matters if you care about your ranking in Google.

If the site passes Google’s test you will get a response similar to ours:

screen shot of site being deemed mobile-friendly by Google

Now Google’s automated tool isn’t so great at providing good usability advice (such as whether it really is a good design for mobile users), but it does tell you whether Google is going to punish the site or not. If Google thinks the site fails, they will provide some feedback, such as:

  • Text too small to read
  • Links too close together
  • Mobile viewport not set

Then you can decide whether those really are issues and whether you want to fix them. Due to Google’s dominant market position you may feel forced to adjust a site (even if it means degrading real usability) in order to make Google happy so your site isn’t punished in search rankings. Or you can decide that you are going to do what is right for users regardless of what Google will do to the site.
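For the “Mobile viewport not set” item you don’t even need Google’s tool; you can check for the viewport meta tag yourself. A minimal sketch, assuming Python with the requests and beautifulsoup4 libraries (the URL is a placeholder):

```python
# Quick check for one of the issues Google flags: "Mobile viewport not set".
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def has_mobile_viewport(url):
    html = requests.get(url, timeout=15).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "viewport"})
    # A typical mobile viewport declares a width (e.g. width=device-width).
    return tag is not None and "width" in (tag.get("content") or "")

print(has_mobile_viewport("https://example.com/"))  # placeholder url
```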

Note that if you don’t have javascript enabled, Google’s tool just fails. I can’t imagine why this tool should require javascript, but it is pitifully lame not to provide a clear indication that the page doesn’t work without javascript, instead of just giving the completely useless message “There was a problem with the request. Please try again later.” as they do now. Google should punish sites that do such lame things, in my opinion. I also get that useless message about 20% of the time when I have tried validating a site (but if javascript is enabled, just reloading makes it work).

The tool is useful in pointing out potential issues to improve for mobile users. I do wish, however, that Google wasn’t so autocratic about its opinions, acting as though failing their tests is equal to failing mobile users. It isn’t; it is a decent indication there may be a problem, but it is not proof there is a problem.

Related: Google Still Providing Users Bad Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do – Don’t Use Short URL Services (bit.ly etc.) – Good Blogging Practices

The Importance of the User in SEO

Optimize for how users are actually using the page — as opposed to how you optimized the page ahead of time — and you’ll see significantly better traffic.

Cyrus Shepard in a good blog post: My Single Best SEO Tip for Improved Web Traffic.

I have always seen the first focus as creating content that users want (both the content itself and the usability of the medium of delivery). Some techniques related to SEO can be useful in tweaking how you manage your online presence, but they are secondary to creating great content focused on users.

His article is really focused on putting a large amount of effort into tweaking the content. I think this makes sense for some important pages (and some important flows within a web site or web application). I think it is way too much effort to expend on most pages (the payback won’t be worth the effort).

Related: Good Blogging Practices – Keeping Up with SEO Changes

SEO Techniques for 2016

Your brand’s most valuable tool is its website. Your website is what people will read when they are researching your company and what you do. It is what they will turn to when they want to get in touch with you. It’s what will entice new traffic to buy and existing customers and clients to stick around. And, because of the way the Web works in the second decade of the new millennium, all of that is dependent upon Search Engine Optimization (SEO).

SEO is key to how people find your website now. Many new website developers and builders believe that SEO is just about keywords. The truth is that there is a lot more to SEO than simply using a few choice words or phrases a few times in your content and tags. Here are some of the best things you can do to ensure your website’s SEO is on the right track for your audience in 2016.

Security

More and more companies are finding themselves the victims of hackers and data thieves. It is vitally important that you protect the people who visit your site and submit information through it. While your Web hosting provider will likely offer some security on their end, it is just as important that you secure your own servers.

According to Firewall Technical, a company that provides IT support in Ottawa, “A server acts not only as a space for business documents, but it can also control how documents are retrieved. In addition, servers protect the entire network. For this reason, it is critical that your server is operating smoothly.” Make sure that your local and cloud servers are as secure as possible. The last thing you want is someone stealing customer information from your own databases!

Search engines will even put up scary warning messages when users follow a link to a web site that has been hacked (if the search engine catches the hack). That will drop your search visitors to almost zero, and they will be scared to come back even if you fix your server. No amount of SEO mastery will help you rebuild after a data breach that endangers your customers’ private information!

Links

According to a Moz report done last year, domain level linking is the most important component of SEO success. Domain level links are links that point to the site as a whole rather than to a specific page or piece of content within it.

A few years ago, this was easy enough to achieve. Businesses would hire content writers to churn out dozens of pieces of slapdash content and wallpaper the Internet with them (you remember E-zines, right?) to build up their inbound link percentage. Starting around 2013, however, Google and the other search engines got hip to this gaming of their system and started reworking their algorithms. It was a move similar to what they did when keyword stuffing started to dilute their search results.

Today the links that point to your site need to be relevant and from quality sites. Setting up link wheels is a punishable offense and publishing the same piece of content in multiple places is no longer allowed (in fact, it can get your site delisted entirely). This means that you need to be proactive about giving interviews and guest posting on relevant and high-ranking sites both within and outside of your niche.

A great way to help this along is to sign up for services such as Help a Reporter Out. You can sign up as a “source” and then respond to the queries that fall within your expertise. If the reporter asking for help/information chooses your response, you might score an interview or a quote–both of which come with links back to your site. You can also sign up for contributor accounts on sites like The Huffington Post.

Keywords

We’ve talked a lot about how keywords aren’t as important as links and security. This doesn’t mean, however, that your keywords are irrelevant in the 2016 SEO game. Keywords and phrases are still important. They are still often the deciding factor in how your site is listed within a searcher’s results. The difference is that now the keywords you choose matter more than ever.

The keywords you choose must flow naturally within your content, so when doing your research, focus on phrases that read naturally instead of words strung together haphazardly. You also need to be aware of how often you use them. While there is no definitive ideal percentage, you typically want to limit your keywords to one to two percent of the content.
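Since there is no definitive percentage, a rough sanity check is simply to compute what share of your total word count the keyword phrase accounts for. A small illustrative sketch in Python (the sample text and phrase are made up):

```python
# Rough keyword density check: what percentage of the words belong to the phrase?
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words:
        return 0.0
    # Count occurrences of the phrase in the word stream.
    hits = sum(
        1
        for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = "Our bakery ships sourdough starter kits. Each sourdough starter kit includes flour."
print(f"{keyword_density(sample, 'sourdough starter'):.1f}%")
# On real page copy you would typically aim for roughly one to two percent.
```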

Security, links and keywords are the most important parts of your SEO campaign. Make sure you get these right and the rest should fall into place!

Related: Keeping Up with SEO Changes

Big Updates to Moz Index Results in Big Moves in Domain Authority and Page Authority Results

Moz posted a big update to their index this week that had a big impact on Moz Page Authority and Domain Authority. Why does it matter?

Really it doesn’t matter, but since Google is so secretive the Moz data gives us some insight into what Google (and other search engines) are likely seeing. The changes to Moz have no direct effect on search results or traffic. What Moz believes (and it seems reasonable that they are right) is that the updates better match what Google (and the others) see.

Basically Moz found some weaknesses in their prior data and methods and has tried to improve them, as they explained here. Many sites are noticing lower Page Authority and Domain Authority numbers (as I am on mine). I am not clear yet, but it seems possible there was a general inflation in the numbers; say the average number declined by 20% (a made up number for illustration purposes). If that were true, what really matters is whether you declined less (that would be good) or more (that would be bad) than 20%.

And of course, there will be lots of variation in the changes in scores. These scores move around a fair amount (though Domain Authority scores do seem fairly stable over time) even when no big changes are happening at Moz.

Comments on the fluctuations of DA and PA scores from Rand Fishkin – DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores

because Mozscape indices take 3-4 weeks to process, the data collected in an index is between ~21-90 days old.

Since Domain and Page Authority are on a 100-point scale, the very top of that represents the most link-rich sites and pages, and with nearly every index it’s harder and harder to get these high scores; sites, on average, that aren’t growing their link profiles substantively will see PA/DA drops.

PA/DA are created using a machine-learning algorithm whose training set is search results in Google. Over time, as Google gets pickier about which types of links it counts, and as Mozscape picks up on those changes, PA/DA scores will change to reflect it.

My strongest suggestion if you ever have the concern/question “Why did my PA/DA drop?!” is to always compare against a set of competing sites/pages. If most of your competitors fell as well, it’s more likely related to relative scaling or crawl biasing issues, not to anything you’ve done

Rand provides lots of good insight here. Moz is generally followed closely by people who pay a great deal of attention to SEO. I am not really in that camp; I pay some attention just because I find it interesting. I don’t spend time trying to figure out how to increase SEO through various gimmicks.

I don’t pay much attention to ratings for other sites, but based on his suggestion I might start tracking a few similar sites to see how their scores vary over time as a way of understanding my own scores better. All I really did before was look at other sites’ authority scores and compare them when I was bored (maybe 2 or 3 times a year), but I didn’t keep track of any of them.
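A simple way to act on Rand’s suggestion is to record your score alongside a few comparable sites each time you check, then look at relative change rather than the raw drop. A small illustrative sketch in Python (all site names and scores below are made up):

```python
# Compare your Domain Authority change against a handful of similar sites.
# All site names and scores below are made up for illustration.
previous = {"mysite.com": 42, "peer-a.com": 47, "peer-b.com": 38, "peer-c.com": 51}
current  = {"mysite.com": 36, "peer-a.com": 41, "peer-b.com": 33, "peer-c.com": 44}

changes = {site: 100.0 * (current[site] - previous[site]) / previous[site]
           for site in previous}
peer_avg = sum(v for s, v in changes.items() if s != "mysite.com") / (len(changes) - 1)

print(f"mysite.com: {changes['mysite.com']:.1f}% vs peer average {peer_avg:.1f}%")
# If your drop is close to the peer average, it is probably an index-wide
# rescaling, not something you did.
```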

I find Moz interesting because it gives us open access to interesting data. There are many other things that impact search results but the authority pages and sites have is an interesting thing to watch (and does have a real impact on search results – even if it is much less than people might suspect).

Earlier this year I wrote about Decreases in MozRank and Page Authority for some of my sites and I posted an update where most of the decreases had disappeared (the authority numbers had returned to the same or close to what they were before the decline). Hopefully that will happen for my sites this time too, but we will have to wait and see.

Related: Most Important Search Engine Ranking Factors – Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now – Keeping Up with SEO Changes

Don’t Use Short URL Services

I am against using url shortening services to redirect urls for 4 reasons.

  1. Short urls add a point of failure – they go out of business and the urls go bad (or even worse get redirected to whoever buys the url service domain) or sometimes the short urls just expire and are reused (which is really lame).
    There is also the risk that the country owning the domain messes things up (bit.ly uses Libya’s domain – not exactly a stable country…). If the domain is owned by a super rich company they will likely pay a huge ransom for the domain if a country demands it – but that isn’t certain. .be is owned by Belgium (which Google uses for YouTu.be short urls) and is probably less likely to mess with Google. But if the USA government messes with European privacy rights, one path for those countries is to mess with their domains and create trouble for the .be domain – or whatever other domain is in question.
  2. You lose the tremendous information value that a real human readable url provides users. You also lose the small aid to building your brand available by having them see your name in the url. Finally short urls (by throwing away the human readable url information users would benefit from) contribute to security problems by encouraging people to blindly click on links they don’t know where they are being taken. Scammers take advantage of users that are willing to follow short url links.
  3. You lose Search Engine Optimization (SEO) value of links by not linking to the actual url. For this reason it is a particularly bad idea to use short urls for your own content (but I see this done). When you are posting your content on a site that tells Google not to trust the link you entered (nofollow attribute) this point is not relevant but the other 3 points still are. And I see people use short urls even for followed links.
  4. Url shorteners delay page load times for users. I often find url shorteners forwarding to another url shortener, which forwards to another url shortener, and so on. Just last week, following a link on Harvard Business School’s Twitter account, I was forwarded through 7 different urls before reaching the actual url (a page on one of their own sites); a simple script for counting those redirect hops is sketched after this list.

    If you are on a fiber internet connection and all those url redirects respond immediately it probably won’t be noticeable (so the people at Harvard may have no clue how lame they look to users), but if you are on a connection with high latency (as many hundreds of millions of people across the world are) it can easily take a second or two before the page even starts to load. With all the evidence on how critical fast load times are for users, adding delays with url shortener redirection is a bad practice.

    long urls written out on paper

    It would be better for this Grandmom to use short urls to write out her favorite urls to show her grandchild.

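If you are curious how many hops a shortened link goes through before reaching the real page, you can count the redirects yourself. A minimal sketch, assuming Python with the requests library (the short url is a placeholder):

```python
# Count the redirect hops behind a shortened url.
# Requires: pip install requests
import requests

def show_redirect_chain(short_url):
    resp = requests.get(short_url, timeout=15, allow_redirects=True)
    for i, hop in enumerate(resp.history, start=1):   # each intermediate redirect
        print(f"hop {i}: {hop.status_code} {hop.url}")
    print(f"final ({len(resp.history)} redirects): {resp.url}")

show_redirect_chain("https://bit.ly/example")  # placeholder short url
```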

Most Important Search Engine Ranking Factors

Moz published their annual Search Engine Ranking Factors based on a survey of SEO experts. The SEO experts’ opinions of the most important factors in 2015 are:

  1. Domain level linking (8.2 out of 10) – quality and quantity of links etc. to the entire domain
  2. Page level linking (7.9) – quality and quantity of the link to the page, anchor text
  3. Page level keyword and content (7.9) – content relevance to search term, content quality ranking factors, topic modeling factors
  4. Page level keyword agnostic measures (6.6) – readability, content length, uniqueness, load speed, markup, https, etc..
  5. Engagement data (6.6) – based on SERP clickstream data, visitor traffic and usage signals… on the page and domain level

This both reinforces the importance of links and also shows how search result rankings have evolved to include many other factors as significant and important determinants of search result rankings.

Related: Keeping Up with SEO Changes – Site Spam Flags Score from Moz – Why Don’t Search Results Screen Better by Date?

Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now

We have updated the MultiPageRank site to provide MozRank, Moz PageAuthority, Google PageRank and Alexa results now. In one simple request you can retrieve all these measures for multiple domains.

Google provided an opening in the market to serve users interested in page authority/popularity when they slowed sharing the updates to public Google page rank. Moz has filled that role extremely well. For a year or two Moz results have been much more useful than Google’s. We have finally added Moz results to our results page.

MozRank is the closest to Google PageRank, measuring raw link authority to the page; as with Google PageRank, the link weight is based on the rank of the page providing the link. So 1 link from the home page of some very popular site can provide more rank to the linked page than thousands of links from low quality pages.
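That idea – a link’s weight depends on the rank of the page providing it – is the heart of PageRank-style calculations. Here is a heavily simplified sketch of that iteration on a toy link graph; this is only the basic principle, not Moz’s or Google’s actual formula:

```python
# Toy PageRank-style iteration: a page's score is built from the scores of
# the pages linking to it, split across each linking page's outbound links.
# This is only the basic principle, not Moz's or Google's actual algorithm.
links = {                       # page -> pages it links to (made-up graph)
    "popular-home": ["small-blog"],
    "small-blog": ["popular-home", "low-quality-1"],
    "low-quality-1": ["small-blog"],
    "low-quality-2": ["small-blog"],
}
damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):             # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(rank[p] / len(outs) for p, outs in links.items() if page in outs)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:14} {score:.3f}")
```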

Moz page authority is enhanced with many extra factors to try and provide a better estimation of search result “authority.” Moz calculates it based off data from the Mozscape web index and includes link counts, MozRank, MozTrust, and dozens of other factors.

We also continue to include Alexa data, which has significant issues with reliability, but it is of some interest so we include it. Alexa uses their data (largely based on toolbar users) to rank websites by a combination of total visitors and visits. Their data is biased, with SEO sites in particular getting a big boost, since users of those sites are often running a toolbar that shares data with Alexa and they visit lots of SEO related sites.

We have had some issues (largely very slow response times for the results page) providing the additional Moz data, but I believe things are working well now. Still, I have the old results visible at www.multipagerank.com. The new results are found on multipagerank.com. I made the split when we first had issues, as we worked on them. I will likely eliminate the old results page in the next couple of weeks if everything continues to go well.

Related: Use Our Multiple PageRank Site to Find PageRank of https Pages – Is the Value of Links Decreasing? – Keeping Up with SEO Changes

Keeping Up with SEO Changes

Response to Heresy: Google and SEO Don’t Actually Change that Much by Rand Fishkin

If, in 2004, you balanced SEO best practices with usability/user experience needs and created a site with content that was optimal for people and engines, you probably had an SEO strategy that would still work today. You might need to make updates, especially with regards to your tactics for earning awareness (taking into account the rise of social networking), for optimizing to multiple devices (instead of just multiple browsers), and to keep your UI/UX enjoyable for modern users, but strategic-level SEO wouldn’t need nearly as much of a shift as these others.

I agree. While I am not an SEO person (I just read about it a bit when I have time because it is an interesting area) what you say is how it seems to me. Yes there are tricks that SEO people learn and can use to do well (I guess based on what I have read over the years).

But the core focus is the same, with just different tricks being valuable at different times. It does seem like these tricks (which are mainly about exploiting weaknesses in the algorithms) are worth lots of time and energy in the right situation, to the right people.

For most people I don’t think it is that important. The principles that matter seem to stay pretty consistent over the long term; it is just trying to game the algorithm that is challenging. If it weren’t challenging, those now making their money doing it would have a much more difficult time because many other people would be doing it. The challenge of staying ahead of Google’s ability to eliminate gaming of the results is why those who do it well are rewarded; if it were easier, the rewards would be smaller.

If you just do the basic stuff, it doesn’t change much; I barely make any changes over the years. The only change I can remember making in the last few years was adding Google Authorship – which has now become essentially worthless, so even if I hadn’t done that it wouldn’t matter.

My basic guide is this: Google wants to provide the most relevant content. It does this with significant focus on others’ opinions of your content. Google has to make judgements about others’ opinions and doesn’t do so perfectly. So getting links is important, as a link is both an indication that others value the content (maybe) and something Google can measure.

Now, I don’t think Google is great at determining whether links value what they link to or are instead saying “look at this totally idiotic article.” I imagine Google, and other search engines, will get much better at this.

I do think Google is absolutely foolish to ignore data from nofollow sources (since nofollow is more about companies choosing to game Google by nofollowing everything that doesn’t directly benefit their own company than about whether the link has value). Links from important people on Twitter are an extremely valuable indication that Google just wastes. They, or some other search engine, will use this much better sometime (I would have thought that time would have come years ago).

But essentially, you need to create useful content and then get people to acknowledge that it is great. All the games of trying to take advantage of lameness in Google’s algorithm are mainly noise – which, yes, professional SEO people need to pay attention to. But the extent to which that exists is mainly due to failures in Google’s algorithm, so they will constantly be trying to fix it. They don’t want to surface the sites that do SEO best; they want to surface sites that are relevant to the person searching.

Related: Why Can’t Search Results Screen Better by Date? – Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely – Google and Links

Are the Values of Links Decreasing?

This interesting article shares 13 SEO experts’ opinions on questions such as:

Do you see the link losing value over time?
How – if at all – should SEO’s change their content marketing and link building strategies in the coming years, given inevitable changes to the PageRank algorithm?
If Google search did work without links, what metrics would replace it?

Some interesting quotes:

The Future of PageRank: 13 Experts on the Dwindling Value of the Link

Michelle Robbins: “Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.”

Julie Joyce: “I think that links could lose some value but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web I cannot imagine a successful search engine that doesn’t take their importance into effect.”

Rand Fishkin: “the link has been losing value for almost a decade. That said, I don’t think that in the next decade, we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.”

Glenn Gabe: “AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.”

One thing I find funny about the article is that it talks about “social factors” (social websites) as if they were not links. Google currently claims to ignore social links that use the nofollow attribute (and Google policies have driven companies, scared of being punished, to use nofollow). But Google using “social factors” would largely just be Google using links it has been failing to use for the last few years. I agree Google is foolish to be ignoring those indications today. So I agree Google should use more links (social and likely non-social) to provide better results.

I suppose one “social factor” people could mean is the number of likes something gets. I think that would be a lame measure to give any value to. The use of +1 data in Google+ may have some tiny benefit, since there Google can know who gave which pluses (and can devalue junk plus profiles while giving extra value to authorities plusing related, good content [using corroboration with other sources Google has, to gauge quality]).

I have long thought traffic and interaction with the site (comments, time on site…) are useful measures for Google. Google has various ways of getting this data; I am not sure to what extent they use it now. It wouldn’t be a huge factor, just another factor to throw in with the huge number they use already.

I agree trusted authorship is likely to play an increasingly important role. Partially this seems like an extension of PageRank to me (basically passing value to the link based on the link to the author – and the authorship stuff right now is all link based: there has to be a link on the page tying the author to the author’s Google+ page).
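Since the authorship tie is itself just a link, you can check a page for it the way you would check any markup: look for a link with rel="author" pointing at the author’s Google+ profile. A minimal sketch, assuming Python with the requests and beautifulsoup4 libraries (the URL is a placeholder):

```python
# Look for the authorship link Google Authorship relied on:
# an <a> (or <link>) with rel="author", typically pointing at a Google+ profile.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def find_author_links(page_url):
    html = requests.get(page_url, timeout=15).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(["a", "link"], href=True):
        rel = tag.get("rel") or []        # bs4 parses rel as a list of tokens
        if "author" in rel:
            print(tag["href"])            # e.g. a plus.google.com profile url

find_author_links("https://example.com/some-article/")  # placeholder url
```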

Related: Very Simple Process to Claim Authorship of Pages with Google – Why Can’t Search Results Screen Better by Date? – Surprise Google Public PageRank Update – Use Multiple PageRank Site to Find PageRank of https Pages