
Let me Filter Out Websites Using Popups from Search Results

I wish Google, DuckDuckGo or some decent search engine would let me filter out websites using popups from search results (as I have written before – Remove Popup Ad Sites From Search Results). By popups I mean any code that creates popup interactions; lately many sites have coded their popups so they do not obey the preferences set in my browser.

All many websites have done is find a way around my preferences, using code to create popups that circumvents my expressed desires for my browser’s behavior. Actively coding around my wishes is behavior that is obviously hostile to users. I don’t really want to see sites that are hostile to users in my search results.
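To illustrate the kind of circumvention I mean, here is a minimal sketch (hypothetical code, not taken from any particular site). A browser’s popup-blocker preference governs window.open(), so sites instead build “popups” as ordinary page elements, which no browser preference controls:

```typescript
// What the popup-blocker preference can stop: a real popup window.
function openRealPopup(url: string): void {
  window.open(url, "_blank", "width=400,height=300"); // subject to blocking
}

// What it cannot stop: an overlay injected into the page itself.
// This is just normal DOM manipulation; no browser preference applies.
function openOverlayPopup(message: string): void {
  const overlay = document.createElement("div");
  overlay.style.cssText =
    "position:fixed;inset:0;background:rgba(0,0,0,0.6);z-index:9999;";
  const box = document.createElement("div");
  box.style.cssText = "margin:15% auto;width:300px;padding:1em;background:#fff;";
  box.textContent = message;
  overlay.appendChild(box);
  document.body.appendChild(overlay);
}
```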

I would also want a way to whitelist sites that have convinced me they are worth using even though they intentionally create popups when I have expressly indicated I don’t want them.

Ideally a search engine would downgrade results based on the level of customer hostility practiced (kind of like Hipmunk’s agony ranking of plane flight options). It seems obvious to me this is a desirable state, but I imagine it will likely take years for search engines to get there, so let me block bad sites for now. Google took 6 years to let me block bad sites, but like many of the adjustments since Larry Page became CEO this advance seems to have been rolled back (or possibly just made next to impossible by lousy UX).

Related: Web Search Improvement Suggestions from 2005 (most of which I am still waiting for) – Google Social Circle Results – Improve Google Search Results (2006)

Surprise Google Public PageRank Update

Google has published updated public PageRank values in a surprise move. Earlier this year GoogleGuy (Matt Cutts) had said he would be surprised if Google updated the public PageRank data before 2014.

Initial reports seem to be that people are seeing more drops in PageRank than increases – but that could just be a function of who is talking the loudest. There is also speculation that the data is old. In a quick check of 2 high-PageRank blogs of mine, the latest posts with PageRank are from August – no September, October, November or December posts have PageRank (the cutoff may well be within a few days of September 1st in either direction). In past updates the data would often be 2 weeks old; this time it may well be over 3 months old (which others are also saying).

The previous update was in February 2013. In the past the updates have been published at roughly 3 month intervals, but with a fair amount of variation.

It is unlikely they will publish new values 4 times in 2014, but we will have to see.

See how your sites have fared in the latest Google PageRank update.

Related: Use Multiple PageRank Site to Find PageRank of https Pages – Google Has Deployed Penguin 2.0 (May 2013)

No More Google Toolbar PageRank Updates This Year

Google has essentially announced there will not be another toolbar PageRank update in 2013 (the last update was made in February 2013). The toolbar PageRank is the value Google shares with all of us. The real PageRank is updated very frequently; Google just doesn’t publish that value (and, as I have posted before, I think they may well adjust the public value away from the value they really use in calculating search results).
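For context on the distinction: the toolbar value is an integer from 0 to 10, and it is widely assumed (Google has never confirmed this) to be a coarse, roughly logarithmic bucketing of the continuously updated internal score. A sketch of that assumption, with the base purely an illustrative guess:

```typescript
// Widely assumed, never confirmed: toolbar PageRank as a logarithmic bucket
// of the internal score. The base (8 here) is purely an illustrative guess.
function toolbarBucket(internalScore: number, base = 8): number {
  if (internalScore <= 1) return 0;
  const bucket = Math.floor(Math.log(internalScore) / Math.log(base));
  return Math.min(10, bucket); // clamp to the published 0-10 range
}
```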

Matt Cutts essentially speaks for Google on this topic (though Google likes to keep things vague and unofficial). Here is a webcast where he touches on the issue a bit.

Related: Use Multiple PageRank Site to Find PageRank of https Pages – Google’s Search Results: Should Factors Other Than User Value be Used

Why Can’t Search Results Screen Better by Date?

I realize there are some challenges with filtering results by date. But the results shown now are pitiful. If you ask for results only from the last week, the vast majority of the results will still be ancient. They likely had some minor change to superfluous text that isn’t even in the body of the page’s content.

It really doesn’t seem like making the time filters better would be too hard. Even if they can’t be perfect, making them much better wouldn’t be hard. Sadly, the various search engines range from bad to pitiful to non-existent in their ability to provide accurate results filtered by date.
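One plausible direction (just a sketch; extractMainContent below is a naive stand-in for the boilerplate removal a real crawler would use): only count a page as recent when its main content actually changed since the last crawl, rather than trusting dates that move whenever a sidebar or template does:

```typescript
import { createHash } from "crypto";

// url -> hash of the main content seen at the last crawl
const contentHashes = new Map<string, string>();

// Naive stand-in: a real crawler would use proper boilerplate removal.
function extractMainContent(html: string): string {
  const match = /<article[^>]*>([\s\S]*?)<\/article>/i.exec(html);
  return (match ? match[1] : html).replace(/<[^>]+>/g, " ").trim();
}

// Return the date to use for date filtering: only advance it when the
// body content itself changed, not when superfluous text around it did.
function effectiveDate(
  url: string,
  html: string,
  crawlDate: Date,
  priorDate: Date
): Date {
  const hash = createHash("sha256").update(extractMainContent(html)).digest("hex");
  if (contentHashes.get(url) === hash) {
    return priorDate; // unchanged body: not a genuinely recent page
  }
  contentHashes.set(url, hash);
  return crawlDate; // body changed: treat as updated
}
```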

This seems like another opportunity for all those non-Google search engines to take advantage of Google’s inability to do this competently and gain some market share. But I haven’t noticed any of them taking advantage of this opportunity. If you know a search engine that actually provides somewhat accurate date filters, add a comment.

Google Has Deployed Penguin 2.0

Penguin 2.0 rolled out today

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice…

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally.

What Google is doing is trying to correct weaknesses in its algorithm that produce less valuable results for users. That is certainly a worthy goal. Part of the method seems sensible to me: the update devalues links that Google had been incorrectly valuing too highly, links which led Google to over-rank pages (from the value-to-the-user perspective).

The second part, which Google likes to avoid talking about this way (and amazingly gets away with), is punishing sites that Google thinks have made Google’s job harder. In this step Google intentionally provides worse search results to its users, hoping the threat of doing so will scare web sites into following practices that make Google’s job easier.

As I have been saying for years, Google can punish its users by intentionally providing worse results, because it does not suffer significantly for doing so. If/when other search engines (DuckDuckGo, Yahoo, Yandex…) start taking away significant amounts of search traffic, this practice will almost certainly end. Google will find ways to improve weaknesses in its algorithms without punishing those using Google to search for content.

This update doesn’t affect published PageRank significantly. Google might punish sites it is upset with by reducing their public PageRank, but it is questionable whether that action affects search results (it seems to be mainly a visible intimidation strategy). I am just guessing here; I don’t know of Google providing an explanation of this practice.

Some sites that Google punishes are intentionally placed lower in the search results. In such an instance, Google will take a site that its algorithm finds of great value to the searcher – say, the 3rd best page – and punish the searcher along with the site by showing less worthwhile sites and putting that 3rd best match in, say, 33rd place or 53rd place or whatever.

I think users of Google want Google to provide the search results that are the best match for them, not the best matches that Google isn’t mad at for some reason or another. So removing a 3rd result that was a lousy choice by Google to put in 3rd place – great. Removing the 3rd result because Google is mad at the site – not great.

Google says it punishes sites that do things it doesn’t like; if those sites then make changes to make Google happy (changes that in no way help people visiting the site), Google says it will stop intentionally ranking that result lower than it believes it should rank (based on the quality of the match for the searcher). Google talks a lot about punishing sites for doing things Google doesn’t like (using keywords too often, linking in ways Google doesn’t like, etc.). I don’t recall Google ever talking about the result that has to follow: it is punishing its users every time it does this.

Related: Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Google’s actions have already weakened the usefulness of links, as many sites simply nofollowed tons of (or all) external links because the suits at those companies are scared of Google punishing them. Google made an extremely smart decision in making links an important factor in search results. That led to people trying to improve their search results by gaining links (“gaming the results”).

It was foolish for Google to attempt to enforce its desires for how pages should be coded, and it still is. Obviously, if all sites just went to nofollowing all links, Google would have a very big problem – then it would probably start punishing sites for not following links Google wanted followed.

Providing good search results is challenging when people have a large monetary incentive to gain better rankings for their sites, whether or not that is better for users of the search engine. That creates real challenges for Google, but I don’t think Google’s nofollow dictates are the right solution.

Sadly, other search engines haven’t taken much advantage of this foolish policy. DuckDuckGo and Yandex are good, but they haven’t taken much market share away.

I have written my thoughts on the topic several times previously (on another blog): Google’s Search Results, Should Factors Other Than User Value be Used (like blocking sites that didn’t do what Google wanted), Google and Links, Google’s Displayed PageRank and more.

The bottom line is that when Google “punishes” a site for doing something Google doesn’t like, Google is also punishing Google’s users. To the extent that Google dominates the market, it can give users less useful results and get away with it. With better competition it likely could not. Google has lots of smart people who know whether the degradation of value to users goes too far, and to what extent Google can provide worse results to users in order to punish sites for doing things Google doesn’t like.

I am surprised Google still hasn’t found a better way to deal with valuing questionable links. I believe improvement probably won’t happen until competition requires Google to improve. Other search engines have a big potential advantage: they can use the signal value of links that Google’s own policy requires it to ignore. Those other search engines, though, have failed to take advantage of this weakness so far. Until they do, I don’t imagine Google will find a way to provide users better value from the paid, quasi-paid and influenced links that Google now ignores but that still have search ranking value (lots of nofollow links are nofollowed to game Google, or because suits are scared of Google, not because the link is any less genuine or valuable than a followed link).

Reaction to: Will Google Penalize Chromebooks, Google Analytics, AdWords & Google+ For Using Advertorials? by Danny Sullivan

Use Multiple PageRank Site to Find PageRank of https Pages

Welcome to the first post on our new blog; we hope you will enjoy our posts.

If you use a plugin that shows you the Google PageRank of the page you are on, it won’t show any details if you are on an SSL page (with https instead of http).

But you can use our multiple page PageRank checker to get the PageRank of those sites.

Google+ and Twitter automatically use https, so your browser plugin fails to show the PageRank. But using our site you can get the values; for example, plugging links to my Google+ and Twitter profiles into the multipagerank text box:

https://plus.google.com/114308510941297788149
https://twitter.com/curiouscat_com

shows PageRanks of 3 and 4 respectively. If you look at those pages with a plugin that shows the PageRank of the page you are on, no value is shown (because the protocol for the pages is https).

Google doesn’t value nofollow links in determining search results. Since Twitter nofollows all the links its users post, Google fails to take advantage of Twitter users’ links to sites. But not all search engines fail to use this data.
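For illustration, a nofollow link is just ordinary HTML with a rel="nofollow" attribute on the anchor tag, and a crawler could separate followed from nofollowed links along these lines (a sketch using standard browser DOM APIs):

```typescript
// Split a parsed page's links into followed and nofollowed sets.
function splitLinks(doc: Document): { followed: string[]; nofollowed: string[] } {
  const followed: string[] = [];
  const nofollowed: string[] = [];
  for (const a of Array.from(doc.querySelectorAll<HTMLAnchorElement>("a[href]"))) {
    const rels = (a.getAttribute("rel") ?? "").toLowerCase().split(/\s+/);
    (rels.includes("nofollow") ? nofollowed : followed).push(a.href);
  }
  return { followed, nofollowed };
}
// An engine honoring the nofollow hint (as Google says it does) would drop
// the nofollowed list from its link-signal calculations; on Twitter that
// list includes essentially every link users post.
```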

Google+ makes the first link in a post part of the special details shown after the post, and that is a link Google does use in determining search rankings.

In my opinion Google is foolish to ignore Twitter user links but for the time being that is what Google is doing (as I understand it).

Alexa also doesn’t show you the site ranking if the site uses https, so plugins that display the Alexa rank will also be blank for https pages. Using our site, that data is shown.