Tag Archives: search result quality

Don’t Hide Important Content Using Coding Gimmicks

My comment on: Does Hidden/Tabbed Content Still Get Picked Up By Google?

I would say that hidden tab content is bad Ux (most of the time). I couldn’t figure out what the heck was going on when I read this post, and it seems to end without having addressed the issue sensibly. Oh, the content is hidden up above, I finally figured out. I think Google does exactly the right thing in making hidden content a very low ranking factor for the page, because as a user it is hidden and not the focus of the page.

The conclusion of the original post is that hidden text is given very low weight by Google in search results. If you go to that page, note that you can’t see most of the content for that “page”; you have to find the links to unhide the content.

The hidden text being discussed here is content that is hidden until the user clicks something (and instead of going to a page that highlights that content, some content is unhidden while other content is added to the hidden content on the page). It is just a bad Ux practice in general (though, as with many Ux issues, there are odd cases where it can make sense).
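As a rough illustration of why a crawler can tell this content is de-emphasized, here is a minimal sketch (the page snippet is made up) that separates text a user sees immediately from text hidden behind an inline display: none style. Real pages also hide content via CSS classes and scripts, which this toy parser does not catch:

```python
# Sketch: how a crawler might flag content hidden behind tabs.
# Only handles inline "display: none" styles; a hedged illustration, not
# how Google actually does it.
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # > 0 while inside a hidden element
        self.visible, self.hidden = [], []

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        if self.hidden_depth or "display:none" in style.replace(" ", ""):
            self.hidden_depth += 1

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if text:
            (self.hidden if self.hidden_depth else self.visible).append(text)

page = ('<p>Intro shown to everyone.</p>'
        '<div style="display: none"><p>Tab content.</p></div>')
finder = HiddenTextFinder()
finder.feed(page)
print(finder.visible)  # text a user sees immediately
print(finder.hidden)   # text only revealed after a click
```

If the page’s real substance ends up in the second list, it is easy to see why a search engine would treat it as not the focus of the page.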

Related:
  • Getting Around Bad Web Page Layouts
  • Poor Web Site User Experience (Ux) on Financial Sites
  • The Most Important Search Engine Ranking Factors
  • Don’t Use Short URL Services

Ignoring Direct Social Web Signals for Search Results

Eric Enge wrote a good post recently: Do Social Signals Drive SEO? He repeats that Google denies using social signals to drive search engine results. And he says that while the evidence shows socially popular links do rank well (and quickly), it is possible to explain this even if Google continues to ignore this signal that humans find useful (people we trust sharing links).

Google has tied themselves to the idea that nofollow is sensible, and made it core to their demands that web sites comply with their directions for how to make links. Google has been promoting its use, as they direct, for years. So when social sites and other large sites (like Google+, Twitter, etc.) just put nofollow on everything that doesn’t directly benefit them, Google either has to change their thinking that nofollow is a good idea or reward sites that only follow links they directly benefit from.

You have to remember Google attempts to use nofollow to mandate its view of what is a trusted link and what isn’t. Google seems to say it is fine to follow links your organization benefits from, as long as you are not being paid cash for the link. Of course it is hard to draw that line in the real world. When an employee of some S&P 100 company writes an article on the company blog about the company’s new product, the employee is paid to promote the company’s product. If the employee didn’t write it, the company wouldn’t be paying their salary for long. But Google doesn’t mind those links.
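For readers unfamiliar with the mechanics, nofollow is just a token in a link’s rel attribute. A minimal sketch (made-up page snippet and URLs) of how a crawler might separate followed links, which pass on trust, from nofollow links, which a site tells search engines not to credit:

```python
# Sketch: classifying links as followed vs. nofollow.
# The HTML and URLs are invented for illustration.
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. rel="external nofollow"
        rel = (a.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

page = (
    '<a href="https://example.com/partner">partner</a>'
    '<a rel="nofollow" href="https://example.com/user-content">user link</a>'
)
c = LinkClassifier()
c.feed(page)
print(c.followed)   # links that pass on trust
print(c.nofollow)   # links the site disclaims to search engines
```

When a large site puts every outbound link in the second bucket, all the human judgement behind those links is thrown away.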

But Google doesn’t like other kinds of links, where sites have been paid for them. It is a tricky area, but Google’s solution seems very poor to me.

And I don’t even know what their position is on other things, like partnerships where millions of dollars are exchanged and links are one of many things being paid for. Mainly with Google it seems that if enough money changes hands it is OK; it is the small stuff that Google really doesn’t like. If Coke pays millions to place those links, they are fine; if Joe’s Fresh Drinks does something similar with a neighborhood blog, that is not OK with Google. Lots of places can’t figure it out either, and many sites have just decided to make everything they don’t directly benefit from a nofollow link (like Google+ does), with, I guess, the cost-benefit analysis that there is a risk in making real links, so don’t take the risk unless you directly benefit from it.

Well, I didn’t actually mean to get off on the problems with Google’s nofollow directives; back to what I meant to write about. But it is related. I can’t see any reason why Google refuses to use a signal every person experiences as important every day they browse the web, other than being trapped in the thinking they have been threatening people with for years over nofollow.

One of the important points Eric made is that even if Google ignores social signals, human beings don’t. And those human beings will create links based on finding good resources and sharing them (most often on personal blogs, as Google has frightened companies away from making real links with vague rules and penalties, resulting in many companies marking every link as untrustworthy to Google using nofollow).

The other issue, of course, is that social has often become a very large portion of inbound links. Thus even if it doesn’t improve search engine rankings, popular social sharing is a way of gaining traffic that is not SEO by the initials (search engine optimization) but is closely related to the role of the people responsible for SEO (a role that has really grown beyond SEO to attracting traffic, even if it still sometimes goes under the SEO name).

Google can then take the portion of the social signal that remains (it is greatly reduced, as the indirect signal is much less clear, but for very popular things with strong signals some of the original signal will seep through to something Google will accept in ranking the results of a search). And then Google can use that indirect signal in search results.

Two of the reasons I find this a poor solution:

  • using an indirect signal means a large portion of the value of the full signal is lost
  • Matt Cutts has been saying for over a decade to just provide a good user experience. While Google might have short-term issues with an algorithm that is exploitable, if you forget all that and focus on providing content that is good for human users, you can trust that Google will keep getting better and better at using the signals an intelligent person uses to judge content. A huge majority of people browsing the web today are enormously influenced by social signals directly. Google acting as though being blind to this direct signal is not a big failure is just not sensible, given my belief in Matt’s long-term emphasis on the user versus manipulation aimed at search engines (like nofollow) that users don’t even notice.

I will admit it is frustrating how other companies are not capitalizing on Google’s acceptance of ignoring useful signals of content quality. I do use DuckDuckGo by default, but I use Google when that doesn’t provide good results or when I want to find only recent results. And my continued peeks at Yahoo and Bing remain unimpressive. As a stockholder of Google this is a good thing, but as a search user I find it distressing how search result quality seems worse today than it was 5 years ago.

Related:
  • Google Still Providing Users Degraded Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do
  • Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely

Keeping Up with SEO Changes

Response to Heresy: Google and SEO Don’t Actually Change that Much by Rand Fishkin

If, in 2004, you balanced SEO best practices with usability/user experience needs and created a site with content that was optimal for people and engines, you probably had an SEO strategy that would still work today. You might need to make updates, especially with regards to your tactics for earning awareness (taking into account the rise of social networking), for optimizing to multiple devices (instead of just multiple browsers), and to keep your UI/UX enjoyable for modern users, but strategic-level SEO wouldn’t need nearly as much of a shift as these others.

I agree. While I am not an SEO person (I just read about it a bit when I have time because it is an interesting area), what you say matches how it seems to me. Yes, there are tricks that SEO people learn and can use to do well (I guess, based on what I have read over the years).

But the core focus is the same, with just different tricks valuable at different times. It does seem like these tricks (which are mainly about exploiting weaknesses in the algorithms) are worth lots of time and energy in the right situation, to the right people.

For most people I don’t think it is that important. The principles that matter seem to stay pretty consistent over the long term; it is just trying to game the algorithm that is challenging. If it weren’t challenging, those people now making their money doing it would have a much more difficult time, because many other people would be doing it. The challenge of staying ahead of Google’s ability to eliminate gaming of the results is why those who do it well are rewarded; if it were easier, the rewards would be smaller.

I just do the basic stuff that doesn’t change, so I barely make any changes over the years. The only change I can remember in the last few years was adding Google Authorship, which has now become essentially worthless, so even if I hadn’t done that it wouldn’t matter.

My basic guide is this: Google wants to provide the most relevant content. It does this with significant focus on others’ opinions of your content. Google has to make judgements about others’ opinions and doesn’t do so perfectly. So getting links is important, as a link is both an indication that others value the content (maybe) and something Google can measure.
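The link-counting core of this idea can be sketched as a simplified PageRank calculation. The tiny link graph below is invented, and real ranking layers many more signals on top, but it shows how rank flows along links as a measurable proxy for others’ opinions:

```python
# Sketch: simplified PageRank over a made-up link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
# "c" collects the most inbound links, so it ends up ranked highest
print(max(ranks, key=ranks.get))
```

The judgement problem the post describes is exactly that a link into “c” counts the same here whether it is an endorsement or a complaint.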

Now, I don’t think Google is great at determining whether links endorse what they link to or are saying “look at this totally idiotic article.” I imagine Google, and other search engines, will get much better at this.

I do think Google is absolutely foolish to ignore data from nofollow sources (since nofollow is now more about companies choosing to game Google by nofollowing everything that doesn’t directly benefit their own company than about whether the link has value). The links from important people on Twitter are an extremely valuable indication that Google just wastes. They, or some other search engine, will do this much better sometime (I would have thought that time would have been years in the past).

But essentially, you need to create useful content, then get people to acknowledge that it is great. All the games of trying to take advantage of lameness in Google’s algorithm are mainly noise (noise that, yes, professional SEO people need to pay attention to). But the extent to which that noise exists is mainly due to failures in Google’s algorithm, so Google will constantly be trying to fix it. Google doesn’t want to surface the sites that do SEO best; it wants to surface sites that are relevant to the person searching.

Related:
  • Why Can’t Search Results Screen Better by Date?
  • Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely
  • Google and Links

Google Still Providing Users Bad Search Results: Basing Results Not of Value to Users but How Completely Sites Do What Google Tells Them To Do

Comments on: On MetaFilter Being Penalized By Google (Sadly their comment system is too buggy and kept failing, so I couldn’t post this there).

It is hard to be Google. But they have billions of dollars (tens of billions, I believe now) in profit from search every year which provides resources that dwarf those of all but a handful of companies.

When Google takes action against users (providing bad search results not because the site content isn’t valuable but because the site has some practice Google doesn’t like), or tells websites to change other web sites (get rid of links we at Google don’t like you having from some other site), that is extremely bad behavior, even given a difficult task. And given billions of dollars to do it right, I don’t agree it is anywhere near acceptable.

That Google still believes in giving search users bad results, because that is the best way Google can figure out to punish sites doing something Google doesn’t like, is super lame. If the site content isn’t useful to search users, Google shouldn’t rank it highly. If it is, Google should rank it highly. Basing what I see in search results not on what is useful to me, but on whether a site with related content avoided practices Google dislikes, is lame.

Sadly, without better competition Google can keep up this lazy behavior. Once one or more of DuckDuckGo, Bing, Yandex, etc. do a good enough job to pull away users, Google will stop providing results based on whether sites behave how Google wants and instead base search results on value to the user.

There is an easy way to see whether Google’s behavior is user driven or the result of lazy behavior by an incumbent without realistic competition. If the “bad practice” Google is trying to correct provides bad content to users, it is user driven. If the “bad practice” Google is trying to correct is about doing things Google doesn’t like, it is lazy behavior they engage in because providing bad results won’t cost them, and doing so lets them threaten sites into compliance with Google’s desires.

Things like nofollow and demands to remove links on sites Google doesn’t like have no user value. They are aimed at forcing sites to behave as Google wishes. Google can claim (often it stretches credibility, but in some cases with justification) that the trustworthiness of a site is degraded by certain practices and therefore the likely benefit to users is less. So if a site links to lots of lousy sites (per the lots of data Google has to value sites) and fits certain patterns which algorithms should be able to measure, it is reasonable to penalize that site.

Treating Twitter results differently because the links are nofollow or followed has no value to the user. If Google can’t do what most somewhat sensible 10 year olds can do and tell the difference between @timberners_lee, @neiltyson and @Atul_Gawande and spam Twitter accounts, then threatening Twitter into declaring that it doesn’t trust any of its users (by using the “nofollow” tag) is all Google can do. And those threats will likely work fairly well, just with the grumbling we have all seen for the last 5 years. If a reasonable competitor emerges, such tactics won’t work: Google will have to provide the best results for users and stop penalizing Google search users in order to keep threatening web sites into compliance with its wishes (and removing good content from users’ view if sites don’t comply).

Related:
  • Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely
  • Why Can’t Search Results Screen Better by Date?

Are the Values of Links Decreasing?

This interesting article shows 13 SEO experts’ opinions on questions such as:

Do you see the link losing value over time?
How – if at all – should SEOs change their content marketing and link building strategies in the coming years, given inevitable changes to the PageRank algorithm?
If Google search did work without links, what metrics would replace it?

Some interesting quotes:

The Future of PageRank: 13 Experts on the Dwindling Value of the Link

Michelle Robbins: “Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.”

Julie Joyce: “I think that links could lose some value but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web I cannot imagine a successful search engine that doesn’t take their importance into effect.”

Rand Fishkin: “the link has been losing value for almost a decade. That said, I don’t think that in the next decade, we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.”

Glenn Gabe: “AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.”

One thing I find funny about the article is talking about “social factors” (social websites) as if they were not links. Google currently claims to ignore social links that use nofollow (and Google policies have driven companies, scared of being punished, to use nofollow). But Google using “social factors” would just be Google using links it has been failing to use for the last few years. I agree Google is foolish to be ignoring those indications today. So I agree Google should use more links (social and likely non-social) to provide better results.

I suppose one “social factor” people could mean is the number of likes something gets. I think that would be a lame measure to give any value to. The use of +1 data in Google+ may have some tiny benefit, as there Google can know who gave which pluses (and devalue junk profiles, and give extra value to authorities plusing related good content [using corroboration with the other sources Google has, to gauge quality]).

I have long thought traffic and interaction (comments, time on site…) with the site are useful measures for Google. Google has various ways of getting this data. I am not sure to what extent they use it now. It wouldn’t be a huge factor, just another factor to throw in with the huge number they already use.
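A sketch of what “just another factor” might look like: engagement signals capped and weighted so they can nudge a score without dominating the link-based signal. All the weights and field names here are invented for illustration; this is not how any search engine is known to combine factors:

```python
# Sketch: folding engagement signals into a score as minor factors.
def score(page):
    base = page["link_score"]            # the dominant, link-based signal
    engagement = (
        0.02 * min(page["monthly_visits"] / 10_000, 1.0)
        + 0.02 * min(page["comments"] / 100, 1.0)
        + 0.01 * min(page["avg_seconds_on_page"] / 120, 1.0)
    )
    return base + engagement             # engagement can nudge, not dominate

lively = {"link_score": 0.50, "monthly_visits": 50_000, "comments": 200,
          "avg_seconds_on_page": 180}
quiet = {"link_score": 0.50, "monthly_visits": 300, "comments": 2,
         "avg_seconds_on_page": 20}
print(score(lively) > score(quiet))  # True: engagement breaks the tie
```

Capping each signal keeps a page with huge traffic but weak links from outranking well-linked pages, which matches the idea of engagement as a tie-breaker rather than a driver.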

I agree trusted authorship is likely to play an increasingly important role. Partially this seems like an extension of PageRank to me (basically passing value to the link based on the link to the author; the authorship stuff right now is all link based, as there has to be a link on the page tying the author to the author’s Google+ page).

Related:
  • Very Simple Process to Claim Authorship of Pages with Google
  • Why Can’t Search Results Screen Better by Date?
  • Surprise Google Public PageRank Update
  • Use Multiple PageRank Site to Find PageRank of https Pages

Let me Filter Out Websites Using Popups from Search Results

I wish Google, DuckDuckGo or some decent search engine would let me filter websites using popups out of search results (as I have written before: Remove Popup Ad Sites From Search Results). By popups I mean all the coding that creates popup interaction; lately many sites have coded popups in a way that does not obey the preferences in my browser.

All many websites have done is find a way to get around my preferences, using code to create popups that circumvents my expressed desires for my browser’s behavior. Actively coding around my desires is behavior that is obviously hostile to users. I don’t really want to see sites that are hostile to users in my search results.

I would also want a way to whitelist sites that have convinced me they are worth using, even though they intentionally code to create popups when I have expressly indicated I don’t want them used.

Ideally a search engine would downgrade results based on the level of customer hostility practiced (kind of like Hipmunk’s agony ranking of flight options). It seems obvious to me this is a desirable state, but I imagine it will likely take years for search engines to get there, so let me block bad sites for now. Google took 6 years to let me block bad sites, but like many of the adjustments since Larry Page became CEO, this advance seems to have been rolled back (or possibly just made next to impossible by lousy Ux).
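The blocklist-plus-whitelist behavior I am describing could even live entirely on the client side. A minimal sketch (all domains invented) of filtering result URLs against a set of popup-abusing domains, with a whitelist override for sites worth the annoyance:

```python
# Sketch: drop results from domains known to force popups, unless whitelisted.
from urllib.parse import urlparse

POPUP_DOMAINS = {"popupnews.example", "worthit.example"}
WHITELIST = {"worthit.example"}  # hostile, but useful enough to keep anyway

def filter_results(results):
    kept = []
    for url in results:
        host = urlparse(url).hostname or ""
        if host in POPUP_DOMAINS and host not in WHITELIST:
            continue  # site codes around my browser preferences: hide it
        kept.append(url)
    return kept

results = [
    "https://popupnews.example/story",
    "https://calm.example/article",
    "https://worthit.example/guide",
]
print(filter_results(results))
```

A search engine doing this natively could go further and demote rather than drop, which is the agony-style ranking described above.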

Related:
  • Web Search Improvements Suggestions from 2005 (most of which I am still waiting for)
  • Google Social Circle Results
  • Improve Google Search Results (2006)

Why Can’t Search Results Screen Better by Date?

I realize there are some challenges with filtering results by date. But the results shown now are pitiful. If you ask for results only from the last week, the vast majority of the results will still be ancient. They likely had some minor change to superfluous text not even in the body of the page’s content.

It really doesn’t seem like making the time filters better would be too hard. Even if they were not perfect, making them much better wouldn’t be hard. Sadly, the various search engines range from bad to pitiful to non-existent in their ability to provide accurate results filtered by date.
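The core of the fix is simple enough to sketch: filter on when the content was published, not on a last-modified date that changes with every trivial edit to a sidebar or footer. The dates and fields below are invented for illustration:

```python
# Sketch: a "last week" filter keyed on publication date, not last trivial edit.
from datetime import date, timedelta

def within_last_week(results, today):
    cutoff = today - timedelta(days=7)
    # Use the content's publication date; last_modified changes whenever
    # superfluous text outside the body of the content is touched.
    return [r for r in results if r["published"] >= cutoff]

today = date(2013, 6, 1)
results = [
    {"url": "new-post", "published": date(2013, 5, 30),
     "last_modified": date(2013, 5, 30)},
    {"url": "old-post", "published": date(2009, 2, 1),
     "last_modified": date(2013, 5, 31)},  # ancient page, recent trivial edit
]
print([r["url"] for r in within_last_week(results, today)])  # only "new-post"
```

The hard part for a real engine is estimating the publication date reliably, but even an imperfect estimate would beat treating any byte change as "new".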

This seems like another opportunity for all those non-Google search engines to take advantage of Google’s inability to do this competently and gain some market share. But I haven’t noticed any of them taking advantage of the opportunity. If you know a search engine that actually provides somewhat accurate date filters, add a comment.

Google Has Deployed Penguin 2.0

Penguin 2.0 rolled out today

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice…

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally.

What Google is doing is trying to correct weaknesses in their algorithm that provide less valuable results to users. That is certainly a worthy goal. Part of the method for doing so is sensible, it seems to me. They update to devalue links that they had been incorrectly valuing too highly, which had led Google to over-rank pages (from the value-to-the-user perspective).

The second part, which Google likes to avoid talking about this way (and amazingly gets away with), is to punish sites that Google thinks have made Google’s job harder. So in this step, what Google does is intentionally provide worse search results to Google users, hoping the threat of doing so will scare web sites into following practices that make Google’s job easier.

As I have been saying for years, Google can punish their users by providing intentionally worse results to users, because they do not suffer significantly for doing so. If/when other search engines (DuckDuckGo, Yahoo, Yandex…) start taking away significant amounts of search traffic, this practice will almost certainly end. Google will find ways to improve weaknesses in their algorithms without punishing those using Google to search for content.

This update doesn’t affect published PageRank significantly. Google might punish sites it is upset with by reducing their public PageRank, but it is questionable whether that action affects search results (it seems to be mostly a visible intimidation strategy). I am just guessing here; I don’t know of Google providing an explanation of this practice.

For some sites that Google punishes, it does intentionally place them lower in the search results. In such an instance, for example, Google will take a site that its algorithm finds of great value to the searcher, say the 3rd best page, and punish the searcher along with the site by showing less worthwhile sites to the searcher and putting the 3rd best match in, say, 33rd place or 53rd place or whatever.

I think users of Google want Google to provide the search results that are the best match for them, not the best matches from sites Google isn’t mad at for some reason or another. So removing a 3rd result that was a lousy choice by Google to put in 3rd place: great. Removing the 3rd result because Google is mad at the site: not great.

Google says they punish sites that do things they don’t like; if those sites make changes to make Google happy (changes that in no way help those visiting the site), then Google says it will stop intentionally ranking that result lower than it believes the quality of the match for the searcher warrants. Google talks a lot about the punishment of sites doing things Google doesn’t like (using keywords too often, linking in ways Google doesn’t like, etc.). I don’t recall them ever talking about the result that has to follow: they are punishing their users every time they do this.

Related: Google Falls Victim to Google’s Confusing Dictates, Punishment to Google and Google Users Likely