Don’t Use Short URL Services

I am against using url shortening services to redirect urls for 4 reasons.

  1. Short urls add a point of failure: the service can go out of business and the urls go bad (or, even worse, get redirected to whoever buys the url service’s domain), or sometimes the short urls just expire and are reused (which is really lame).
    There is also the risk that the country controlling the top level domain messes things up (bit.ly relies on Libya’s .ly – not exactly a stable country…). A super rich company that owns the domain will likely pay a huge ransom if a country demands it – but that isn’t guaranteed. The .be domain (which Google uses for YouTu.be short urls) is owned by Belgium, which is probably less likely to interfere with Google. But if the USA government messes with European privacy rights, one path for those countries to respond is to mess with their domains and create trouble for .be – or whatever other domain is in question.
  2. You lose the tremendous information value that a real, human readable url provides users. You also lose the small brand building benefit of having them see your name in the url. Finally, by throwing away the human readable url information users would benefit from, short urls contribute to security problems by encouraging people to blindly click on links without knowing where they are being taken. Scammers take advantage of users who are willing to follow short url links.
  3. You lose the Search Engine Optimization (SEO) value of links by not linking to the actual url. For this reason it is a particularly bad idea to use short urls for your own content (but I see this done). When you are posting your content on a site that tells Google not to trust the link you entered (the nofollow attribute), this point is not relevant, but the other 3 points still are. And I see people use short urls even for followed links.
  4. Url shorteners delay page load times for users. I often find a url shortener forwarding to another url shortener, which forwards to another url shortener, and so on. Just last week, following a link on Harvard Business School’s Twitter account, I was forwarded through 7 different urls before reaching the actual url (a page on one of their own sites).

    If you are on a fiber internet connection and all those url redirects respond immediately, it probably won’t be noticeable (so the people at Harvard may have no clue how lame they look to users), but if you are on a connection with high latency (as many hundreds of millions of people across the world are) it can easily take a second or two before the page even starts to load. With all the evidence on how critical fast load times are for users, adding delays with url shortener redirection is a bad practice. (A small sketch for tracing these redirect chains follows this list.)
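If you want to see this for yourself, here is a minimal Python sketch (it assumes the third-party requests library is installed, and the bit.ly address in it is just a made-up placeholder) that prints every redirect hop a short url goes through and how long each hop took.

```python
import sys
import requests  # third-party library; pip install requests

def trace_redirects(url):
    """Print each redirect hop a url goes through and how long each hop took."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = resp.history + [resp]            # intermediate responses, then the final page
    total = sum(h.elapsed.total_seconds() for h in hops)
    for i, hop in enumerate(hops, start=1):
        print(f"{i}. {hop.status_code}  {hop.url}  ({hop.elapsed.total_seconds():.2f}s)")
    print(f"{len(resp.history)} redirect(s), {total:.2f}s before the real page responded")

if __name__ == "__main__":
    # The short url below is a made-up placeholder; pass a real one on the command line.
    trace_redirects(sys.argv[1] if len(sys.argv) > 1 else "https://bit.ly/example")
```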

    long urls written out on paper

    It would be better for this Grandmom to use short urls when writing out her favorite urls to show her grandchild.


High MozRank DoFollow Blogs

The links here are no longer being updated, see our new post for a current list:

See updated DoFollow Blog List

This page remains to provide a historical record.

– – – – – – – – – – – – – – – –

Due to spam comments, many sites add the nofollow attribute to comment links. For many years nofollow has been the default in WordPress (you have to use a plugin to revert to the original behavior, where comment author links were not flagged as untrusted). With the nofollow attribute, Google (and Moz) do not give the link value.
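As a rough illustration of what “dofollow” means in practice, here is a small standard-library Python sketch that counts how many links on a page carry rel="nofollow" versus how many are left as normal, followed links (the example.com url is a placeholder; real comment sections would need a more targeted check than scanning every link on the page).

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkRelCounter(HTMLParser):
    """Counts <a href> links that carry rel="nofollow" versus normal (followed) links."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if not attrs.get("href"):
            return
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" in rel:
            self.nofollowed += 1
        else:
            self.followed += 1

# Placeholder url; point this at the comment page you want to check.
html = urlopen("https://example.com/").read().decode("utf-8", "replace")
counter = LinkRelCounter()
counter.feed(html)
print(f"followed links: {counter.followed}, nofollow links: {counter.nofollowed}")
```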

Here is a list of blogs that moderate their comments and provide dofollow links, giving those who contribute worthwhile comments the benefit of links that Google (and others) treat as real. I will continue to keep this list updated.

Order of the list is based on MozRank with a penalty for using popups to interfere with visitors using the site. See the very bottom of this post for blogs that supposedly have dofollow comments but I have been unable to comment and my messages to them have not been answered.

Many of the best blogs that provide dofollow links require the use of your real name, a link to your home page or a blog that you obviously write, and comments that are valuable (not just meaningless drivel). They may also require numerous (normally between 3 and 10) approved comments before links become dofollow.

Unfortunately many people spam these blogs in an attempt to get dofollow links. That results in many of the blogs turning off dofollow links. Those that stay dofollow are usually impatient with spammy, low quality comments and remove poor quality links that are not personal blogs. If you comment, post valuable comments if you expect to get a dofollow link; otherwise you are just contributing to the decline of blogs that provide dofollow links.

Why don’t I list 50 or 100 more blogs that are nofollow, haven’t been updated in years, or whose domains have been deleted? That doesn’t make sense to me. But maybe I am crazy (so I explain my craziness here), since most other listings do that.

If you know of a dofollow blog with at least a 1 year track record and compelling posts (if it isn’t high quality it will likely die, so it isn’t worth adding just to have to remove it later), add a comment with information on the blog.

Related: Ignoring Direct Social Web Signals in Search Results – Google and Links (2012) – Using Twitter Data to Improve Search Results

* CommentLuvDF – they dofollow the blog-post-title link (usually only after between 3 and 10 approved comments) but not the author link

These blogs don’t work for me (or often don’t work, though they sometimes do). Either:

  • they don’t post my comments and don’t reply to my contact messages asking why (if they decided to block the comments because they didn’t value them, that would be fine, it is their blog – but most likely they have a spam filter that just trashes my comments), yet they do have some dofollow comments.
  • they removed links to the author’s blog (and the CommentLuv post link) from comments that were made. It is their right to do so. But the links removed were links to personal blogs, and if they are removing those links they don’t really fit in a list of dofollow blogs.
  • or they delete many comments without notice to the comment author (probably an overly aggressive spam filter, but maybe manual action, there is no way to know).
  • 5.7 Adrienne Smith (MPA 49, MSS 2, CommentLuvDF)
  • 5.3 Sylvia Nenuccio (MPA 35, MSS 0, CommentLuvDF)
  • 5.2 Sherman Smith’s Blog (MPA 43, MSS 2, CommentLuvDF, popup)
  • 5.4 Power Affiliate Club (MPA 33, MSS 2, CommentLuvDF, popup)
    Most Important Search Engine Ranking Factors

    Moz published their annual Search Engine Ranking Factors, based on a survey of SEO experts. The experts’ opinions of the most important factors in 2015 are:

    1. Domain level linking (8.2 out of 10) – quality and quantity of links etc. to the entire domain
    2. Page level linking (7.9) – quality and quantity of the link to the page, anchor text
    3. Page level keyword and content (7.9) – content relevance to search term, content quality ranking factors, topic modeling factors
    4. Page level keyword agnostic measures (6.6) – readability, content length, uniqueness, load speed, markup, https, etc..
    5. Engagement data (6.6) – based on SERP clickstream data, visitor traffic and usage signals… on the page and domain level

    This both reinforces the importance of links and shows how search result rankings have evolved to include many other factors as significant determinants.

    Related: Keeping Up with SEO Changes – Site Spam Flags Score from Moz – Why Don’t Search Results Screen Better by Date?

    Decreases in MozRank and Page Authority

    I have noticed a decrease in MozRank, and to a much lesser extent Moz Page Authority, on many of my sites very recently. I don’t know if it is some major MozRank update (I don’t see other posts about it, so probably not) or just something related to my sites. I don’t follow these metrics that closely, but on the sites I visit a lot I noticed a decrease today (which doesn’t necessarily mean it happened today).

    The sites I visit a lot are blogs and are interrelated, so I could imagine a change cascading through all of them.

    They are still doing well so I am not worried but it is always nicer to see increases than decreases.

    Looking at a couple, it seems like MozRank went down the most, and Moz PA went down slightly if at all. Examples (I am trying to remember the previous values, so I might be off by a bit – they don’t normally change much, so I haven’t bothered tracking them more than about twice a year):

    Format: Moz Page Authority and MozRank, from my memory of the recent values to today’s values (with the August 2014 values in parentheses).

    site 1 from 64 and 6.3 to 64 and 5.6 (August 2014 values: 52 and 6.0)
    site 2 from 60 and 6.2 to 58 and 5.3 (59 and 6.0)
    site 3 from 63 and 6.2 to 59 and 5.2 (48 and 6.1)
    site 4 from 58 and 6.1 to 55 and 5.2 (55 and 6.1)
    site 5 from 51 and 5.6 to 50 and 5.0 (39 and 5.8)
    site 6 from 51 and 5.5 to 47 and 5.0 (38 and 5.5)
    site 7 I can’t remember to 37 and 4.9 (37 and 5.5)
    site 8 I can’t remember to 33 and 4.9 (34 and 5.7)
    site 9 from 38 and 4.9 to 36 and 4.3 (new)

    As you can see, many sites increased from August (gradually over the months) and then gave some of those gains back in the last day or two (or even dropped below the August 2014 levels, especially for MozRank). On average since August 2014, Moz PA increased, then gave a bit of that increase back, but ended up higher than in August 2014; MozRank increased more, then gave back even more than the gain, ending up lower than in August 2014.

    This site isn’t very connected to the others. This blog was 31 Moz Page Authority and 5.0 MozRank in August 2014; today it is 31 Moz PA and 4.4 MozRank. The main Multi-pagerank site had a Moz PA of 41 and MozRank of 4.9 in August 2014. Now the main site is 40 and 4.9. I think maybe these values didn’t change today, but I can’t really remember. The other sites have pretty much stayed in the same area since August 2014.

    New Site Spam Flags Score from Moz

    Moz continues to provide interesting tools and site measures. I only follow this stuff because I find it interesting (not as a profession). I am not an SEO person, and paying the $100 a month (or much more) they charge for their tools isn’t worth it just for my curiosity. But they make some things available for free and provide some interesting blog posts on what they find and about their tools.

    This new Spam Score analysis by Moz seems very interesting: Spam Score: Moz’s New Metric to Measure Penalization Risk. The idea is sensible: they are trying to determine the spam riskiness of a site based on correlations they can draw between their web crawl data and Google search results. Moz can see where sites are not ranking well even though many factors indicate they should, and conclude that Google has penalized those sites (and has not given credit to sites with links from those sites, or has even penalized them).

    This seems like a really good idea. They found 17 flags that are correlated with spam penalties to a site, and as sites trip more and more of those flags, the likelihood of Google classifying them as spam rises. When a site has 0 spam flags, Moz calculates a 0.5% chance that the site shows up in Google search results (or, more likely, fails to show up) in a way that indicates Google sees it as spam. 4 spam flags equals a 7.5% chance of being a “spam site.” A site with 6 spam flags has a 16% chance of being spam, 7 flags means a 31% chance, 8 a 57% chance, 9 a 72% chance and 14 a 100% chance.
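Those quoted figures can be captured in a tiny lookup table; the linear interpolation for flag counts the Moz post doesn’t list is my own assumption, added only to make the sketch usable for any count.

```python
# The flag-count-to-spam-likelihood figures quoted above, as a lookup table.
# Interpolating between the published points is my own assumption, not Moz's.
SPAM_LIKELIHOOD = {0: 0.5, 4: 7.5, 6: 16.0, 7: 31.0, 8: 57.0, 9: 72.0, 14: 100.0}

def spam_likelihood(flags_tripped):
    flags = max(0, min(14, flags_tripped))          # clamp to the published range
    if flags in SPAM_LIKELIHOOD:
        return SPAM_LIKELIHOOD[flags]
    lo = max(k for k in SPAM_LIKELIHOOD if k < flags)
    hi = min(k for k in SPAM_LIKELIHOOD if k > flags)
    frac = (flags - lo) / (hi - lo)
    return SPAM_LIKELIHOOD[lo] + frac * (SPAM_LIKELIHOOD[hi] - SPAM_LIKELIHOOD[lo])

print(spam_likelihood(5))   # about 11.8 under this (assumed) interpolation
```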


    Screen shot of Moz’s Spam Flag report.

    In their post Moz says that tripped spam flags are not meant to be an indication of something that needs to be fixed (after all, the flags are just correlation, not causation – “fixing them” may do nothing for search results). That may be true, but if a site is shown with a yellow 5 for spamminess, it is highly likely lots of people are going to want to reduce that scary looking feedback about their site.

    What is likely to happen is that people will make changes just to avoid flags: adding Twitter buttons and making whatever tweaks get rid of several more flags.
    My guess is that a more useful spamminess rating would not be just x out of 17, but would factor in which of the 17 flags were tripped and how important each one is (I would imagine including which interactions of spam flags were more critical…).

    I would be surprised if there isn’t a big difference between a certain 3 flags being tripped versus 3 other flags being tripped (plus, say, 4 other random flags) – even with Moz’s limited ability to know what Google is directly reacting to versus correlations it can observe. I would imagine this could be improved into a 100 point (or whatever) system that gave much more valuable spam insight than treating each flag as equally important (and ignoring especially deadly interactions between flags: which flags, when tripped together, cause the likely spam hit to show up in Google results).
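A rough sketch of what such a weighted score might look like; the flag names, weights and interaction bonuses below are entirely invented for illustration, not anything Moz has published.

```python
# Invented flag names, weights and interaction bonuses, purely for illustration.
FLAG_WEIGHTS = {
    "low_link_diversity": 9,
    "thin_content": 7,
    "large_site_few_links": 6,
    "no_contact_info": 3,
    # ... one weight per flag, 17 in all
}
INTERACTION_BONUS = {
    # Pairs of flags assumed to be much worse when tripped together.
    frozenset({"low_link_diversity", "thin_content"}): 8,
}

def weighted_spam_score(tripped_flags, max_score=100):
    raw = sum(FLAG_WEIGHTS.get(flag, 0) for flag in tripped_flags)
    for pair, bonus in INTERACTION_BONUS.items():
        if pair <= set(tripped_flags):
            raw += bonus
    worst_case = sum(FLAG_WEIGHTS.values()) + sum(INTERACTION_BONUS.values())
    return round(max_score * raw / worst_case, 1)

# Two flags, but scored far worse than a flat 2-out-of-17 would suggest.
print(weighted_spam_score({"low_link_diversity", "thin_content"}))
```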


    Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now

    We have updated the MultiPageRank site to provide MozRank, Moz PageAuthority, Google PageRank and Alexa results now. In one simple request you can retrieve all these measures for multiple domains.

    Google created an opening in the market to serve users interested in page authority/popularity when it slowed the updates to public Google page rank. Moz has filled that role extremely well. For a year or two now, Moz results have been much more useful than Google’s. We have finally added Moz results to our results page.

    MozRank is the closest to Google page rank: it measures raw link authority to the page, and as with Google page rank, the link weight is based on the rank of the page providing the link. So 1 link on the home page of some very popular site would provide more rank to the linked page than thousands of links from low quality pages.
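To make that concrete, here is a toy, standard-library sketch of this kind of iterative link-rank calculation (in the spirit of PageRank/MozRank, not Moz’s actual formula): each page’s rank is divided among the pages it links to, so one link from a strong page outweighs many links from weak ones. The tiny link graph is invented for the example.

```python
# A made-up five-page link graph: one strong page and a few weak ones.
links = {
    "popular-home": ["your-page"],
    "weak-1": ["your-page", "other"],
    "weak-2": ["other"],
    "your-page": ["other"],
    "other": ["popular-home"],
}

def link_rank(links, damping=0.85, iterations=50):
    """Toy PageRank-style power iteration: each page splits its rank among its outlinks."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

for page, score in sorted(link_rank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page:14s} {score:.3f}")
```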

    Moz page authority is enhanced with many extra factors to try and provide a better estimation of search result “authority.” Moz calculates it based off data from the Mozscape web index and includes link counts, MozRank, MozTrust, and dozens of other factors.

    We also continue to include Alexa data, which has significant reliability issues but is of some interest. Alexa uses its data (largely based on toolbar users) to rank websites by total visitors/visits (a combination). Their data is biased, with SEO sites in particular getting a big boost, because the users of those sites often run a toolbar that shares data with Alexa and visit lots of SEO related sites.

    We have had some issues (largely very slow response times for the results page) providing the additional Moz data, but I believe things are working well now. Still, I have the old results visible at www.multipagerank.com. The new results are found on multipagerank.com. I made the split when we first had issues, as we worked on them. I will likely eliminate the old results page in the next couple of weeks if everything continues to go well.

    Related: Use Our Multiple PageRank Site to Find PageRank of https Pages – Is the Value of Links Decreasing? – Keeping Up with SEO Changes

    Ignoring Direct Social Web Signals for Search Results

    Eric Enge wrote a good post recently: Do Social Signals Drive SEO? He repeats that Google denies using social signals to drive search engine results. And he says that while the evidence shows socially popular links do rank well (and quickly), it is possible to explain this even if Google continues to ignore a signal that humans find useful (people we trust sharing links).

    Google has tied itself to the idea that nofollow is sensible; it is core to their demands that web sites comply with Google’s directions for how to make links. Google has been promoting its use, as directed, for years. So when social sites and other large sites (like Google+, Twitter, etc.) just put nofollow on everything that doesn’t directly benefit them, Google either has to change its thinking that nofollow is a good idea or reward sites that only follow links they directly benefit from.

    You have to remember Google attempts to use nofollow to mandate its view of what is a trusted link and what isn’t. Google seems to say it is fine to follow links your organization benefits from, as long as you are not being paid cash for that specific link. Of course, it is hard to draw that line in the real world. When an employee of some S&P 100 company writes an article on the company blog about the company’s new product, the employee is being paid to promote the company’s product. If the employee didn’t write it, the company wouldn’t be paying their salary for long. But Google doesn’t mind these links.

    But Google doesn’t like other kinds of links where sites have been paid. It is a tricky area, but Google’s solution seems very poor to me.

    And I don’t even know what their position is on other things, like partnerships where millions of dollars are exchanged and links are one of many things being paid for. Mainly with Google it seems that if enough money changes hands it is OK; it is the small stuff that Google really doesn’t like (if Coke pays millions to place those links it is fine; if Joe’s Fresh Drinks does something similar with a neighborhood blog, that is not OK with Google). Lots of sites can’t figure it out either, and many have just decided to make everything they don’t directly benefit from a nofollow link (like G+ does), with, I guess, the cost benefit analysis that there is a risk in making real links, so don’t take that risk unless you directly benefit from it.

    Well, I didn’t actually mean to get off on the problems with Google’s nofollow directives; back to what I meant to write about. But it is related. I can’t see any reason why Google refuses to use a signal every person experiences as important every day they browse the web, other than being trapped by the thinking they have been threatening people with for years around nofollow.

    One of the important points Eric made is that even if Google ignores social signals, human beings don’t. Those human beings will then create links based on finding good resources and sharing them (most often on personal blogs, as Google has frightened companies away from making real links with vague rules and penalties, resulting in many companies marking every link as untrustworthy to Google using nofollow).

    The other issue, of course, is that social has often become a very large portion of inbound links. Thus, even if it didn’t improve search engine rankings, popular social sharing is a substitute way of gaining traffic. It is not SEO by the initials (search engine optimization), but it is closely related to the role people responsible for SEO have, where it seems the role really grew beyond SEO into attracting traffic, and it still sometimes sits under the SEO name even if it isn’t actually SEO.

    Google can then take the portion of the social signal that remains (it is greatly reduced, as the indirect signal is much less clear, but for very popular things with strong signals some of the original signal will seep through into something Google will accept when ranking results). And then Google can use that indirect signal in search results.

    Two of the reasons I find this a poor solution:

    • using an indirect signal means a large portion of the value of the full signal is lost
    • Matt Cutts has been saying for over a decade to just provide a good user experience. While Google might have short term issues with an exploitable algorithm, if you forget all that and focus on providing content that is good for human users, you can trust that Google will keep getting better and better at using the signals an intelligent person uses to judge content. A huge majority of the people who browse the web today are enormously influenced by social signals directly. Google acting as if being blind to this direct signal is not a big failure just isn’t sensible, given my belief in Matt’s long term emphasis on the user over manipulations for search engines (like nofollow) that users don’t even notice.

    I will admit it is frustrating that other companies are not capitalizing on Google’s acceptance of ignoring useful signals of content quality. I do use DuckDuckGo by default, but I use Google when that doesn’t provide good results or when I want to find only recent results. And continued peeks at Yahoo and Bing remain unimpressive. As a stockholder of Google, this is a good thing, but as a search user I find it distressing how search result quality seems worse today than it was 5 years ago.

    Related: Google Still Providing Users Degraded Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do – Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

    Keeping Up with SEO Changes

    Response to Heresy: Google and SEO Don’t Actually Change that Much by Rand Fishkin

    If, in 2004, you balanced SEO best practices with usability/user experience needs and created a site with content that was optimal for people and engines, you probably had an SEO strategy that would still work today. You might need to make updates, especially with regards to your tactics for earning awareness (taking into account the rise of social networking), for optimizing to multiple devices (instead of just multiple browsers), and to keep your UI/UX enjoyable for modern users, but strategic-level SEO wouldn’t need nearly as much of a shift as these others.

    I agree. While I am not an SEO person (I just read about it a bit when I have time because it is an interesting area), what you say matches how it seems to me. Yes, there are tricks that SEO people learn and can use to do well (I am guessing, based on what I have read over the years).

    But the core focus is the same, with different tricks valuable at different times. It does seem like these tricks (which are mainly about exploiting weaknesses in the algorithms) are worth lots of time and energy in the right situation, to the right people.

    For most people I don’t think it is that important. The principles that matter seem to stay pretty consistent over the long term; it is just trying to game the algorithm that is challenging. If it weren’t challenging, the people now making their money doing it would have a much more difficult time, because many other people would be doing it. The challenge of staying ahead of Google’s ability to eliminate gaming of the results is why those who do it well are rewarded; if it were easier, the rewards would be smaller.

    If you just do the basic stuff, it doesn’t change much; I barely make any changes over the years. The only change I can remember in the last few years was adding Google Authorship, which has now become essentially worthless, so even if I hadn’t done that it wouldn’t matter.

    My basic guide is this: Google wants to provide the most relevant content. It does this with significant focus on others’ opinions of your content. Google has to make judgements about those opinions and doesn’t do so perfectly. So getting links is important, as a link is both an indication that others value the content (maybe) and something Google can measure.

    Now, I don’t think Google is great at determining whether a link endorses what it points to or is saying “look at this totally idiotic article.” I imagine Google, and other search engines, will get much better at this.

    I do think Google is absolutely foolish to ignore data from nofollow sources (since nofollow is more about companies choosing to game Google by nofollowing everything that doesn’t directly benefit them than about whether the link has value). Links from important people on Twitter are an extremely valuable indication that Google just wastes. Google, or some other search engine, will do this much better sometime (I would have thought that time would have been years in the past).

    But essentially, you need to create useful content, then get people to acknowledge that it is great. All the games of trying to take advantage of lameness in Google’s algorithm are mainly noise – noise that, yes, professional SEO people need to pay attention to. But the extent to which that exists is mainly due to failures in Google’s algorithm, so Google will constantly be trying to fix it. Google doesn’t want to surface the sites that do SEO best; it wants to surface the sites that are relevant to the person searching.

    Related: Why Can’t Search Results Screen Better by Date? – Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely – Google and Links

    Google Still Providing Users Bad Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do

    Comments on: On MetaFilter Being Penalized By Google (Sadly their comment system is too buggy and kept failing, so I couldn’t post this there).

    It is hard to be Google. But they have billions of dollars (tens of billions, I believe now) in profit from search every year which provides resources that dwarf those of all but a handful of companies.

    When Google takes action against users (providing bad search results, not because the site content isn’t valuable but because the site has some practice Google doesn’t like) or tells websites to change other web sites (get rid of links we at Google don’t like you having from some other site), that is extremely bad behavior, even if the task is difficult. And given billions of dollars to do it right, I don’t agree it is anywhere near acceptable.

    That Google still believes in giving search users bad results, because that is the best way Google can figure out to punish sites doing something Google doesn’t like, is super lame. If the site content isn’t useful to search users, Google shouldn’t rank it highly. If it is, Google should rank it highly. Basing what I see in search results not on what is useful to me, but on which sites have related content and practices Google doesn’t dislike, is lame.

    Sadly, without better competition Google can keep up this lazy behavior. Once one or more of DuckDuckGo, Bing, Yandex, etc. do a decent enough job to pull away users, Google will stop basing results on whether sites behave how Google wants and instead base them on value to the user.

    There is an easy way to see whether Google’s behavior is user driven or the result of lazy behavior by an incumbent without realistic competition. If the “bad practice” Google is trying to correct provides bad content to users, it is user driven. If the “bad practice” Google is trying to correct is about doing things Google doesn’t like, it is lazy behavior they engage in because providing bad results won’t cost them, and doing so lets them threaten sites into compliance with Google’s desires.

    Things like nofollow, and demands to remove links on sites Google doesn’t like, have no user value. They are aimed at forcing sites to behave as Google wishes. Google can claim (often it stretches credibility, but in some cases with justification) that the trustworthiness of a site is degraded by certain practices and therefore the likely benefit to users is less. So if a site links to lots of lousy sites (per the large amount of data Google has for valuing sites) and fits certain patterns that algorithms should be able to measure, it is reasonable to penalize that site.

    Treating Twitter results differently because they have nofollow links rather than followed links has no value to the user. If Google can’t do what most somewhat sensible 10 year olds can do and tell the difference between @timberners_lee, @neiltyson and @Atul_Gawande and spam Twitter accounts, then threatening Twitter into saying it doesn’t believe any of its users are trustworthy (by using the “nofollow” tag) is all Google can do. And those threats will likely work fairly well, just with the grumbling we have all seen for the last 5 years. If they get a reasonable competitor such tactics won’t work: Google will have to provide the best results for users and stop penalizing Google search users just so Google can keep threatening web sites into compliance with its wishes (and removing good content from users’ view if sites don’t comply).

    Related: Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely – Why Can’t Search Results Screen Better by Date?

    Are the Values of Links Decreasing?

    This interesting article shares 13 SEO experts’ opinions on questions such as:

    Do you see the link losing value over time?
    How – if at all – should SEO’s change their content marketing and link building strategies in the coming years, given inevitable changes to the PageRank algorithm?
    If Google search did work without links, what metrics would replace it?

    Some interesting quotes:

    The Future of PageRank: 13 Experts on the Dwindling Value of the Link

    Michelle Robbins: “Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.”

    Julie Joyce: “I think that links could lose some value but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web I cannot imagine a successful search engine that doesn’t take their importance into effect.”

    Rand Fishkin: “the link has been losing value for almost a decade. That said, I don’t think that in the next decade, we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.”

    Glenn Gabe: “AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.”

    One thing I find funny about the article is that it talks about “social factors” (social websites) as if they were not links. Google currently claims to ignore social links that use nofollow (and Google’s policies have driven companies, scared of being punished, to use nofollow). But Google using “social factors” would just be Google using links it has been failing to use for the last few years. I agree Google is foolish to ignore those indications today. So I agree Google should use more links (social and likely non-social) to provide better results.

    I suppose one “social factor” people could mean is the number of likes something gets. I think that would be a lame measure to give any value to. Using +1 data from Google+ may have some tiny benefit, since there Google can know who gave which pluses (and devalue junk plus profiles, and give extra value to an authority plusing related, good content – using corroboration with the other sources Google has for gauging quality).

    I have long thought traffic and interaction with the site (comments, time on site…) are useful measures for Google. Google has various ways of getting this data. I am not sure to what extent they use it now. It wouldn’t be a huge factor, just another factor to throw in with the huge number they already use.

    I agree trusted authorship is likely to play an increasingly important role. Partially this seems like an extension of PageRank to me (basically passing value to the link based on the link to the author – and the authorship stuff right now is all link based; there has to be a link on the page tying the content to the author’s Google+ page).

    Related: Very Simple Process to Claim Authorship of Pages with Google – Why Can’t Search Results Screen Better by Date? – Surprise Google Public PageRank Update – Use Multiple PageRank Site to Find PageRank of https Pages