Category Archives: SEO

Find MozRank, Moz PageAuthority, Google PageRank and Alexa Results Now

We have updated the MultiPageRank site to provide MozRank, Moz PageAuthority, Google PageRank and Alexa results now. In one simple request you can retrieve all these measures for multiple domains.

Google created an opening in the market for serving users interested in page authority/popularity when it slowed updates to public Google PageRank. Moz has filled that role extremely well. For a year or two, Moz results have been much more useful than Google's. We have finally added Moz results to our results page.

MozRank is the closest analogue to Google PageRank: it measures raw link authority to the page. As with Google PageRank, the weight of a link depends on the rank of the page providing it, so one link on the home page of a very popular site can pass more rank to the linked page than thousands of links from low-quality pages.
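
The link-weight idea that MozRank and PageRank share can be sketched with a toy power iteration. This is illustrative only, not Google's or Moz's actual implementation; the graph is made up and only the 0.85 damping factor is the commonly cited PageRank value.

```python
# Toy PageRank power iteration on a made-up five-page graph.
# Illustrative only: this shows the general idea behind PageRank/MozRank,
# not Google's or Moz's actual implementation.

DAMPING = 0.85  # the commonly cited PageRank damping factor

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each page splits its rank evenly among its outgoing links.
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "hub" is a popular page giving one link to "a";
# "b" instead gets links from two obscure pages.
graph = {
    "hub": ["a"],
    "x": ["hub", "b"],
    "y": ["hub", "b"],
    "a": [],
    "b": [],
}
ranks = pagerank(graph)
```

With these numbers, the single link from the well-linked "hub" page gives "a" a higher rank than "b" gets from its two low-rank links, matching the behavior described above.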

Moz Page Authority adds many extra factors to provide a better estimate of search-result “authority.” Moz calculates it from data in the Mozscape web index, including link counts, MozRank, MozTrust, and dozens of other factors.
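
Moz does not publish Page Authority's actual formula, so the following is only a toy illustration of the idea of blending several normalized link metrics into one 0-100 score; the weights and the domain cap are invented.

```python
# Hypothetical illustration only: Moz does not publish Page Authority's
# formula or weights. This just shows the idea of combining several
# normalized link metrics into one 0-100 score.

def toy_authority(mozrank, moztrust, linking_domains, max_domains=10000):
    # Normalize each factor to the 0..1 range (the weights below are made up).
    factors = {
        "mozrank": mozrank / 10.0,     # MozRank is reported on a 0-10 scale
        "moztrust": moztrust / 10.0,   # so is MozTrust
        "domains": min(linking_domains / max_domains, 1.0),
    }
    weights = {"mozrank": 0.4, "moztrust": 0.3, "domains": 0.3}
    score = sum(weights[k] * factors[k] for k in factors)
    return round(score * 100)

score = toy_authority(5.2, 6.0, 1200)  # 42 with these made-up weights
```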

We also continue to include Alexa data. It has significant reliability issues, but it is of some interest, so we include it. Alexa uses its data (largely gathered from toolbar users) to rank websites by a combination of total visitors and visits. Their data is biased; SEO sites in particular get a big boost, because users visiting those sites often run a toolbar that shares data with Alexa and visit lots of SEO-related sites.

We had some issues (largely very slow response times on the results page) when adding the Moz data, but I believe things are working well now. The old results are still visible at www.multipagerank.com; the new results are found on multipagerank.com. I made the split when we first ran into issues, while we worked on them. I will likely eliminate the old results page in the next couple of weeks if everything continues to go well.

Related: Use Our Multiple PageRank Site to Find PageRank of https Pages · Is the Value of Links Decreasing? · Keeping Up with SEO Changes

Ignoring Direct Social Web Signals for Search Results

Eric Enge wrote a good post recently: Do Social Signals Drive SEO? He notes that Google denies using social signals to drive search engine results, and says that while the evidence shows socially popular links do rank well (and quickly), it is possible to explain this even if Google continues to ignore this signal that humans find useful (people we trust sharing links).

Google has tied itself to the idea that nofollow is sensible, making it core to its demands for how web sites construct links. Google has been promoting its prescribed use for years. So when social sites and other large sites (like Google+, Twitter, etc.) just put nofollow on everything that doesn’t directly benefit them, Google either has to change its thinking that nofollow is a good idea, or reward sites that only follow links they directly benefit from.
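
For context, nofollow is just a value in a link's rel attribute. A crawler deciding which links to count can be sketched with the standard library's HTMLParser; the sample HTML below is made up.

```python
# Sketch of how a crawler might separate followed from nofollow links,
# using only the Python standard library. The sample HTML is invented.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollow.append(href)   # search engines say they ignore these
        else:
            self.followed.append(href)

html = '''
<a href="https://example.com/partner" rel="nofollow">partner</a>
<a href="https://example.com/our-product">our product</a>
'''
parser = LinkClassifier()
parser.feed(html)
```

A site nofollowing everything except links it directly benefits from looks, to such a crawler, like a site vouching only for itself, which is exactly the distortion discussed above.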

You have to remember Google attempts to use nofollow to mandate its view of what is a trusted link and what isn’t. Google seems to say it is fine to follow links your organization benefits from, as long as you are not being paid cash for the link. Of course it is hard to draw that line in the real world. When an employee of some S&P 100 company writes an article on the company blog about the company’s new product, the employee is paid to promote the company’s product. If the employee didn’t write it, the company wouldn’t be paying their salary for long. But Google doesn’t mind these links.

But Google doesn’t like other kinds of links, where sites have been paid for them. It is a tricky area, but Google’s solution seems very poor to me.

And I don’t even know what Google’s position is on other arrangements, like partnerships where millions of dollars are exchanged and links are one of many things being paid for. Mainly, it seems that if enough money changes hands Google finds it acceptable; it is the small stuff Google really doesn’t like. If Coke pays millions to place links, those links are fine; if Joe’s Fresh Drinks does something similar with a neighborhood blog, that is not ok with Google. Lots of sites can’t figure it out either, and many just decided to nofollow everything they didn’t directly benefit from (as Google+ does), apparently concluding that making real links carries risk, so don’t take the risk unless you directly benefit from it.

Well, I didn’t actually mean to get off on the problems with Google’s nofollow directives; back to what I meant to write about. But it is related. I can’t see any reason why Google refuses to use a signal every person experiences as important every day they browse the web, other than being trapped in the thinking it has been threatening people with for years on nofollow.

One of the important points Eric made is that even if Google ignores social signals, human beings don’t. Those human beings will then create links based on finding good resources and sharing them (most often on personal blogs, since Google has frightened companies away from making real links with vague rules and penalties, resulting in many companies marking every link as untrustworthy to Google via nofollow).

The other issue, of course, is that social sharing has often become a very large portion of inbound links. So even if it didn’t improve search rankings, popular social sharing is a substitute way of gaining traffic. That is not SEO by the initials (search engine optimization), but it is closely related to the role of people responsible for SEO, a role that has really grown beyond SEO to attracting traffic in general, even if it still sometimes operates under the SEO name.

Google can then use the portion of the social signal that remains. It is greatly reduced, as the indirect signal is much less clear, but for very popular things with strong signals, some of the original signal will seep through into something Google will accept when ranking search results. Google can then use that indirect signal in search results.

Two of the reasons I find this a poor solution:

  • using an indirect signal means a large portion of the value of the full signal is lost
  • Matt Cutts has been saying for over a decade to just provide a good user experience. While Google might have short-term issues with an exploitable algorithm, if you forget all that and focus on providing content that is good for human users, you can trust that Google will keep getting better at using the signals an intelligent person uses to judge content. A huge majority of people browsing the web today are directly and enormously influenced by social signals. For Google to act as though being blind to this direct signal is not a big failure is simply not sensible, given Matt’s long-term emphasis on the user over manipulations for search engines (like nofollow) that users don’t even notice.

I will admit it is frustrating that other companies are not capitalizing on Google’s willingness to ignore useful signals of content quality. I use DuckDuckGo by default, but I use Google when DuckDuckGo doesn’t provide good results or when I want to find only recent results. And continued peeks at Yahoo and Bing remain unimpressive. As a stockholder of Google this is a good thing, but as a search user I find it distressing that search result quality seems worse today than it was 5 years ago.

Related: Google Still Providing Users Degraded Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do · Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Keeping Up with SEO Changes

Response to Heresy: Google and SEO Don’t Actually Change that Much by Rand Fishkin

If, in 2004, you balanced SEO best practices with usability/user experience needs and created a site with content that was optimal for people and engines, you probably had an SEO strategy that would still work today. You might need to make updates, especially with regards to your tactics for earning awareness (taking into account the rise of social networking), for optimizing to multiple devices (instead of just multiple browsers), and to keep your UI/UX enjoyable for modern users, but strategic-level SEO wouldn’t need nearly as much of a shift as these others.

I agree. While I am not an SEO person (I just read about it a bit when I have time, because it is an interesting area), what you say is how it seems to me. Yes, there are tricks that SEO people learn and can use to do well (based on what I have read over the years).

But the core focus is the same, with different tricks valuable at different times. These tricks (mainly about exploiting weaknesses in the algorithms) do seem to be worth lots of time and energy in the right situation, to the right people.

For most people I don’t think it is that important. The principles that matter stay pretty consistent over the long term; it is trying to game the algorithm that is challenging. If it weren’t challenging, the people now making their money doing it would have a much harder time, because many others would be doing it too. The challenge of staying ahead of Google’s ability to eliminate gaming of the results is why those who do it well are rewarded; if it were easier, the rewards would be smaller.

If you just do the basic stuff that doesn’t change, you barely need to make any changes over the years. The only change I can remember making in the last few years was adding Google Authorship, which has now become essentially worthless, so even if I hadn’t done that it wouldn’t matter.

My basic guide is this: Google wants to provide the most relevant content. It does this with significant focus on others’ opinions of your content. Google has to make judgements about others’ opinions and doesn’t do so perfectly. So getting links is important, as a link is both an indication that others value your content (maybe) and something measurable by Google.

Now, I don’t think Google is great at determining whether a link values what it links to, or is saying “look at this totally idiotic article.” I imagine Google and other search engines will get much better at this.

I do think Google is absolutely foolish to ignore data from nofollow sources, since nofollow now says more about companies choosing to game Google by nofollowing everything that doesn’t directly benefit them than about whether a link has value. The links from important people on Twitter are an extremely valuable indication that Google just wastes. Google, or some other search engine, will do this much better sometime (I would have thought that time would have come years ago).

But essentially, you need to create useful content, then get people to acknowledge that it is great. All the games of trying to take advantage of lameness in Google’s algorithm are mainly noise, which, yes, professional SEO people need to pay attention to. But the extent to which those games work is mainly due to failures in Google’s algorithm, so Google will constantly be trying to fix it. Google doesn’t want to surface the sites that do SEO best; it wants to surface sites that are relevant to the person searching.

Related: Why Can’t Search Results Screen Better by Date? · Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely · Google and Links

Are the Values of Links Decreasing?

This interesting article shares 13 SEO experts’ opinions on questions such as:

  • Do you see the link losing value over time?
  • How, if at all, should SEOs change their content marketing and link building strategies in the coming years, given inevitable changes to the PageRank algorithm?
  • If Google search did work without links, what metrics would replace it?

Some interesting quotes:

The Future of PageRank: 13 Experts on the Dwindling Value of the Link

Michelle Robbins: “Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.”

Julie Joyce: “I think that links could lose some value but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web I cannot imagine a successful search engine that doesn’t take their importance into effect.”

Rand Fishkin: “the link has been losing value for almost a decade. That said, I don’t think that in the next decade, we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.”

Glenn Gabe: “AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.”

One thing I find funny about the article is that it talks about “social factors” (social websites) as if they were not links. Google currently claims to ignore social links that use nofollow (and Google’s policies have driven companies, scared of being punished, to use nofollow). But Google using “social factors” would just be Google using links it has been failing to use for the last few years. I agree Google is foolish to ignore those indications today. So I agree Google should use more links (social, and likely non-social) to provide better results.

I suppose one “social factor” people could mean is the number of likes something gets. I think that would be a lame measure to give any value to. The use of +1 data in Google+ may have some tiny benefit, since there Google can know who gave which pluses (and devalue junk Google+ profiles, and give extra value to an authority plusing related good content [using corroboration with other sources Google has to gauge quality]).

I have long thought traffic and interaction with the site (comments, time on site…) are useful measures for Google. Google has various ways of getting this data. I am not sure to what extent they use it now. Not that it would be a huge factor, but it is another factor to throw in with the huge number they already use.
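
No one outside Google knows whether or how it weighs engagement data, so the following is purely a hypothetical sketch of the kind of per-page signal this paragraph describes (visits, comments, time on page); the normalizations are made up.

```python
# Hypothetical engagement signal: the function, its weights, and its
# normalizations are all invented for illustration. Real ranking systems
# would calibrate such factors against many other signals.
import math

def engagement_signal(visits, comments, avg_seconds_on_page):
    visit_score = math.log10(1 + visits)      # dampen raw visit counts
    comment_score = math.log10(1 + comments)  # comments suggest interaction
    dwell_score = min(avg_seconds_on_page / 300, 1.0)  # cap at 5 minutes
    return visit_score + comment_score + dwell_score
```

The log damping reflects the point that such a signal would be just one modest factor among many, not something a burst of raw traffic should dominate.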

I agree trusted authorship is likely to play an increasingly important role. Partially this seems like an extension of PageRank to me (basically passing value to a link based on the link to the author; and the authorship system right now is all link based: there has to be a link on the page tying the author to the author’s Google+ page).

Related: Very Simple Process to Claim Authorship of Pages with Google · Why Can’t Search Results Screen Better by Date? · Surprise Google Public PageRank Update · Use Multiple PageRank Site to Find PageRank of https Pages

No More Google Toolbar PageRank Updates This Year

Google has essentially announced there will not be another toolbar PageRank update in 2013 (the last update was in February 2013). The toolbar PageRank is the value Google shares with all of us. The real PageRank is updated very frequently; Google just doesn’t publish that value (and, as I have posted before, I think they may well adjust the public value away from the value they really use in calculating search results).

Matt Cutts essentially speaks for Google on this topic (though Google likes to keep things vague and unofficial). Here is a webcast where he touches on the issue a bit.

Related: Use MultiPageRank Site to Find PageRank of https Pages · Google’s Search Results – Should Factors Other Than User Value be Used

Google Has Deployed Penguin 2.0

Penguin 2.0 rolled out today

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice…

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally.

What Google is doing is trying to correct weaknesses in their algorithm that provided less valuable results to users. That is certainly a worthy goal. Part of the method is sensible, it seems to me: they update to devalue links they had been incorrectly valuing too highly, which had led Google to over-rank pages (from the value-to-user perspective).

The second part, which Google likes to avoid talking about this way (and amazingly gets away with), is to punish sites that Google thinks have made Google’s job harder. In this step, Google intentionally provides worse search results to its users, hoping the threat of doing so will scare web sites into following practices that make Google’s job easier.

As I have been saying for years, Google can punish their users by providing intentionally worse results to users, because they do not suffer significantly for doing so. If/when other search engines (DuckDuckGo, Yahoo, Yandex…) start taking away significant amounts of search traffic, this practice will almost certainly end. Google will find ways to improve weaknesses in their algorithms without punishing those using Google to search for content.

This update doesn’t affect published PageRank significantly. Google might punish sites it is upset with by reducing their public PageRank, but it is questionable whether that action affects search results (it seems to be mostly a visible intimidation strategy). I am just guessing here; I don’t know of Google providing an explanation of this practice.

For some sites that Google punishes, Google does intentionally place them lower in the search results. In such an instance, Google will take a site that its algorithm finds of great value to the searcher, say the 3rd best page, and punish the searcher along with the site by showing less worthwhile sites and putting the 3rd best match in, say, 33rd place or 53rd place or whatever.

I think users of Google want Google to provide the search results that are the best match for them, not the best matches that Google isn’t mad at for some reason or another. So removing a 3rd result that was a lousy choice by Google to put in 3rd place: great. Removing the 3rd result because Google is mad at the site: not great.

Google says that if sites it punishes make changes to make Google happy (changes that in no way help those visiting the site), then Google will stop intentionally placing those results lower than it believes they should rank (based on the quality of the match for the searcher). Google talks a lot about punishing sites that do things Google doesn’t like (using keywords too often, linking in ways Google doesn’t like, etc.). I don’t recall Google ever acknowledging the result that has to follow: they are punishing their users every time they do this.

Related: Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Google’s actions have already weakened the usefulness of links, as many sites nofollowed tons of (or all) external links because suits at those companies are scared of Google punishing them. Google made an extremely smart decision to make links an important factor in search results. That led to people trying to improve their rankings by gaining links (“gaming the results”).

It was foolish for Google to attempt to enforce its desires for how pages should be coded, and it still is. Obviously, if all sites just went to nofollowing all links, Google would have a very big problem; then they would probably start punishing sites for not following links Google wanted followed.

People have a large monetary incentive to get better results for their sites, whether or not that is better for users of the search engine. That creates real challenges for Google in providing good search results. But I don’t think Google’s nofollow dictates are the right solution.

Sadly, other search sites haven’t taken much advantage of this foolish policy. DuckDuckGo and Yandex are good, but they haven’t taken much market share away.

I have written my thoughts on the topic several times previously (on another blog): Google’s Search Results, Should Factors Other Than User Value be Used (like blocking sites that didn’t do what Google wanted), Google and Links, Google’s Displayed PageRank and more.

The bottom line is that when Google “punishes” a site for doing something Google doesn’t like, Google is also punishing Google users. To the extent that Google dominates the market, it can give users less useful results and get away with it; with better competition it likely could not. Google has lots of smart people who know when the degradation of value to users goes too far, and to what extent they can provide worse results to users in order to punish sites for doing things Google doesn’t like.

I am surprised Google still hasn’t found a better way to value questionable links. I believe improvement probably won’t happen until competition requires it. Other search engines have a big potential advantage: they can use the signal value of links that Google’s own policy requires it to ignore. Those other search engines have failed to take advantage of this weakness so far. Until they do, I don’t imagine Google will find a way to provide users better value from paid, kind-of-paid, and influenced links that Google now ignores but that have search ranking value (lots of nofollow links are nofollowed to game Google, or because suits are scared of Google, not because the link is less true or valuable than any followed link).

Reaction to: Will Google Penalize Chromebooks, Google Analytics, AdWords & Google+ For Using Advertorials? by Danny Sullivan

Use Multiple PageRank Site to Find PageRank of https Pages

Welcome to the first post on our new blog; we hope you will enjoy our posts.

If you use a plugin that shows the Google PageRank of the page you are on, it won’t show any value if you are on an SSL page (with https instead of http).

But you can use our multiple page PageRank checker to get the PageRank of those pages.

Google+ and Twitter automatically use https, so your browser plugin fails to show the PageRank. But using our site you can get the values; for example, plugging links to my Google+ and Twitter profiles into the multipagerank text box:

https://plus.google.com/114308510941297788149
https://twitter.com/curiouscat_com

shows PageRanks of 3 and 4 respectively. If you look at those pages with a plugin that shows the PageRank of the page you are on, it shows no value (because the pages use the https protocol).

Google doesn’t value nofollow links in determining search results. Since Twitter nofollows all links, Google fails to take advantage of Twitter users’ links to sites. But not all search engines fail to use this data.

Google+ turns the first link in a post into the special details shown after the post, and that link is one Google does use in determining search rankings.

In my opinion Google is foolish to ignore Twitter users’ links, but for the time being that is what Google is doing (as I understand it).

Alexa also doesn’t show you the site ranking if the site uses https. So plugins that display the Alexa rank will also be blank for https pages. Using our site, that data is shown.