SES San Jose Corrections

Published 19 years, 6 months past

A few days ago, I posted the entry Silly Expert Opinions, in which I made some snide comments and rebutted some points related in a post at compooter.org.  In so doing, I fell victim to one of the classic blunders:

Never take someone to task for saying something you weren’t there to hear.

…because it may turn out they didn’t actually say it, or didn’t mean it in the way it was reported.

In the comments on the compooter.org post, the SES conference organizer Danny Sullivan (founder and editor of Search Engine Watch) and two of the panelists have calmly and professionally explained the other side of the story—the one where some of the points attributed to them were never made, some were seriously spun, and others were taken out of context.  The comments are well worth reading from about #12 on, that being Danny’s first post.  See also the thread “SES slammed by designers” at the Cre8asite forums.  (Although I should note once more that I’m not a designer.)

Unfortunately, my post triggered other posts, such as one at Molly’s crib and a WaSP Buzz post this morning (thankfully there’s a more detailed followup).  We all fell victim to the blunder, but I fully take the blame for kicking things into high gear.  I sometimes forget that the entries I post are read and taken seriously by a whole lot of people; that my words have, at least in some circles, a certain weight.  And sometimes I let my penchant for smart-assed commentary get ahead of my more sober desire to speak with intelligence and accuracy.  My post of last Friday is such an example, and I’m sorry it’s caused confusion.  I apologize not only to the panelists and to Danny, but to anyone I inadvertently misled.

In my post, I did posit the idea that I might get into the SEO conference circuit, and now I have that ability, thanks to Danny’s deep professionalism—he could have easily, and with good reason, flamed me in e-mail and left it at that.  He didn’t.  He treated me with respect (probably more than I deserved) and opened the door I’d tried to slam.

In the afternoon WaSP post, Chris Kaminski said:

Here’s an idea: perhaps we standards folks and the SEO crowd should do a bit of knowledge sharing?  In the comments, Danny Sullivan said he’s already asked Eric Meyer to do just that, with an eye towards a possible speaking slot at an upcoming SES no less. That’s a great start. But I think we can do more. I think there’s gold to be found at the intersection of SEO and standards, or at least some good web development.

Let’s keep the beginning of dialogue in the comments to the compooter.org post, throw out the flames and ignorance, and use it to build a better set of best practices for web development. One that accounts for standards, accessibility, usability and search engines.

I agree wholly with Chris: let’s keep the dialogue going.  We’re lucky that the opportunity arose and wasn’t soured by me shooting off my mouth.  It’s time to see what can be done to harmonize the two fields, and where things can be improved.  I’m going to see what I can do about taking Danny up on his offer to attend an SES conference in the future.

I’m particularly interested because it seems, reading between the lines, that standards-oriented design isn’t as search-engine friendly as I’d thought (although it’s certainly much better than most alternatives).  Peter Janes created a test of Google’s treatment of heading levels, and the results weren’t exactly encouraging.  It bothers me that standards-oriented design and search engine optimization might be at odds, whether partially or fully.  This is definitely something that needs to be cleared up.  The results could affect the future evolution of search engines, which is a goal worth pursuing.

If you have ideas about how to get there faster, or have search engine tests of your own to share, let us know.
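
For anyone who wants to run a test along the lines of Peter’s, here is a minimal sketch of one way to set it up (this is not his actual methodology; the element list, file names, and invented token below are mine). It generates a set of otherwise identical pages, each wrapping the same made-up term in a different element, so that once the pages are crawled you can compare how they rank for that term:

    # Generate one test page per element, differing only in which element
    # wraps the invented search token.
    ELEMENTS = ["h1", "h2", "h3", "h4", "h5", "h6", "p"]
    TOKEN = "florblegrommit"  # a made-up term with no existing search results

    TEMPLATE = """<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head><title>Element weight test ({element})</title></head>
    <body>
    <{element}>{token}</{element}>
    <p>Identical filler text appears on every page so that only the test
    element differs from one page to the next.</p>
    </body>
    </html>
    """

    for element in ELEMENTS:
        with open(f"test-{element}.html", "w") as handle:
            handle.write(TEMPLATE.format(element=element, token=TOKEN))

Publish the pages where a crawler will find them, wait for them to be indexed, then search for the token; the order in which the pages come back is a rough hint at how much weight each element carries. Repeating the search over several weeks helps smooth out index churn.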


Comments (20)

  1. Pingback ::

    Peabody's Cre8tive Flow: Deep Linking from the Desk of Bluebert G. Peabody

    […] ts on meyerweb and another site lead to an interesting discussion on the Cre8site forums – SES San Jose Corrections. Oddly, I visited not to see […]

  2. Pingback ::

    developed traffic - web design, search engines and Internet marketing » of Web Design & Search Engines & Internet Marketing

    […] design folk learn from the SEOs, and vice-versa. Hats off to CSS Guru Eric Meyer for his SES San Jose Corrections post. It’s rare to see such an honest […]

  3. I attended SES 2004 in San Jose. I had a great time; however, I felt that something was missing: knowledge of CSS and accessibility. What I didn’t realize was that the search engines were missing knowledge about accessible, valid, semantic markup and how CSS works.

    The last day of the conference, during Meet the Crawlers, one of the slides mentioned that font size was one of the hundreds of things measured in the ranking of pages. Then it was brought up that search engines don’t read CSS. Therefore search engines must be reading font size from presentational markup, such as the font element, as part of their ranking system.

    I asked the panel if the search engines had any plans to stop supporting the use of deprecated markup. Jen Fitzpatrick, Director of Engineering, from Google answered my question something like this: “no, when the standards change, we’ll look into it. But right now, everyone does it so we’re still going to use it.” None of the other panel members from Ask Jeeves, Yahoo!, or MSN said a thing. Since it was the very end of the session, the rest of my comments went just to Jen. I told her that the standards had changed; a long time ago. I was concerned that a search engine in such a position of authority with the capability to influence the use of semantic markup wouldn’t use its power for good. I asked if they had someone working with the W3C. She “couldn’t remember the one guy’s name.”

    If the search engines do not promote valid, semantic, accessible markup, how can we expect the rest of the world to? I am shocked by the fact that we need to educate search engines. I figured they were experts on web design.

  4. I am shocked by the fact that we need to educate search engines. I figured they were experts on web design.

    If you have ever looked at Google’s markup, you might be less shocked. It seems they really couldn’t care less about web standards. But it’s very disappointing to hear that not only are standards more or less ignored, but non-standard and non-semantic markup is actually encouraged.

    Hopefully it won’t bring too much of an “it works in Google” crowd (like the “it works in IE” people who seem to make so many websites). At least in the case of Google, accessibility still counts. But it’s a shame to be encouraged to optimize for user agents that see and judge web content so much differently than humans.

    That depressing bit aside, it’s great to hear that SEO people and Web Design/Dev people have at least some common ground. Hopefully a more open dialogue will help to improve the quality of work done by both groups in the long run.

  5. I can’t see Google or any other search engine dropping tag-soup support, but I’m surprised they don’t give a bump to good markup. Not to be too elitist about it, but I’ve yet to come across a standards-based site whose content wouldn’t be nirvana to a crawler that understands it can be used semantically. (Notwithstanding the drivel on my own site, of course.) That’s not to say there aren’t great tag-soup sites that deserve to be ranked highly, but if the engines started giving a bonus to semantic, standards-based sites–in Google terms, say a .5 increase in PageRank–I don’t think it would be long before we saw a huge increase in semantic, standards-based sites.

    Ditto on the above kudos to Sullivan, Kaminski et al. It’s good to see there are at least a few SEOs out there who are interested in optimizing rather than spamming.

  6. I was under the impression that Robots really like structured markup? I’m sure I read an article on alistapart claiming standards based design helped with pagerank? Am I mistaken?

  7. No, to answer you, Neil: you were not mistaken.

    …We’re going to be focusing entirely on the benefits of using XHTML and CSS to show you how to improve the readability of your code for search engine spiders, maintain a good content-to-code ratio without going beyond file-size and word-count limits, and how to use CSS to mimic common image effects…

  8. Hi Eric-

    First of all, I LOVE your book, Eric Meyer on CSS. I want everyone who reads your blog to know that. I linked to your book companion site as soon as I finished reading it.

    I appreciate your comments. Thank you for being so professional and considerate.

    I’m all for working with you and Molly (whose books I love) and Steve Krug (whose Don’t Make Me Think is required reading at our firm).

    I just don’t want you and your colleagues to make some of the mistakes that others have made. Search-friendliness does not only mean having a “spiderable” web site. Being “spiderable” is only the tip of the iceberg.

    For example, doorway pages (which are considered search engine spam) are “spiderable,” but they do not meet the terms and conditions set forth by the search engines.

    Thanks again. And I hope to see you (or maybe be on a panel with you) at the Chicago Search Engine Strategies conference. Right in our back yard.

  9. This reminds me of a discussion in ALA about screen readers and image replacement. A number of web designers had been writing at length about accessible design, but they had never had the opportunity to look at one of their sites with a copy of JAWS (because it was, apparently, too expensive). Once the author had actually tested the theories with screen reader software, he was in for a surprise. Screen readers weren’t behaving like all the articles had been claiming. Because of Google’s proprietary algorithms, we have much the same situation here. Lots of theories, not much data. Thanks to Peter Janes for providing the first data point. I hope this is only the beginning.

    I’m all for semantic markup and CSS, but Google has to index the entire web, not just the parts you and I like. I don’t think it is Google’s place to skew their search engine ratings to satisfy our interest in standards. It’s a nice thought, though. Most of the talk about semantic markup and SEO seems to focus on speculations about how search engines ought to work. An H1 heading really ought to mean more than a casual reference deep into the third paragraph. That might be a more fruitful discussion with the search engine people.

    I am excited by Peter Janes’ work, and the prospect of reverse engineering the actual behavior of search engines. That might be the most fruitful path to clarity, until we can convince Google and others to be more open about how they rank their pages. Above all, let’s start replacing speculation with real data. It’s about time.

  10. Paul Martin mentions: “I don’t think it is Google’s place to skew their search engine ratings to satisfy our interest in standards”

    If any person or organisation is in a position to encourage valid and well structured markup, it is Google. People pay more attention to Google than to a standards-publishing consortium. In the longer run, Google gains more from well structured content. Looking at their Google Labs projects, there are implementations that rely on, or could be better accomplished using, structured markup – like Google Sets and Google Quotes.

    One of the holdbacks on these tools is the lack of well structured content out there. Anything Google can do to improve that signal turns into better structured information that they can take advantage of to return better, more focused results to the visitor.

  11. Apologies for referring to you as a ‘designer’, Eric; I wasn’t quite sure how to represent the ‘group’ that seemed to be speaking about it. Most mentions were by big standards advocates (of which I am one), and to me at least, it seemed like a rift between those designing pages and those optimising them for search engines.

    And as I said in the forum thread, though the way it’s come about maybe wasn’t great, I think the discussion’s been very worthwhile, and it seems like it could help the groups/areas work together better.

  12. Hmm, with regards to some of the other comments here, I don’t think any search engine would give a bonus to a page just because it was marked up semantically. That’s not what they are there for.

    They are there to crawl the web and suggest pages that meet people’s searches, not to promote a style of coding. A page that validates perfectly and is marked up semantically isn’t necessarily more relevant to a search than one that isn’t.

    To a search engine, relevancy of the content is what they are interested in, not whether the page creator used tables or not.

    Semantic, standards-based design can help search engines crawl a site, and if you really are making good use of headings and the title (I’m not overly confident of the test done; I’m sure I saw a similar one a while ago that provided better results, more in line with what you would expect), then you are helping the rank.

    Neil, standards-based design won’t help with PageRank specifically. PR, in an oversimplified way, is a measure of the number of sites linking to a page, and it’s just one of the ways Google ranks a page. I’m also slightly dubious that there are as many benefits to rank as have been put forward by some people. As I said above, standards-based design can make it easier for your site to be crawled, and if marked up well may help rank to a certain degree, but there’s a bit more to SEO than that.

    And Matthew, unfortunately there already are people who take an “it works in Google” approach. There are a lot of people who get so involved with their ranking in Google that they forget about everything else, it seems. A while ago, when Google were making regular monthly updates, you would see huge threads on the various search engine forums discussing it, with people moaning about how their site was high one minute, low the next. It’s where the whole thing about link exchanges and people asking for links has come from: trying to gain PR to get ranked higher in Google.

    Unfortunately many of these people forget that, once you have a visitor, you often then want to convert them into a customer. There are numerous stories of people spending a lot on SEO tactics, getting a lot of traffic, but not getting the custom to go with it.
    This is why a lot of other SEOs do see that it is only a part of making a successful site.

    To say that a semantic, valid site should automatically rank well is, I think, a bit of a biased view. I look at it as a bit like accessibility. There are a good few parts of accessibility that you can cover with no accessibility knowledge. Using good markup is inherently more accessible. But there are some other bits to it as well. SEO is like that: good design and markup are good for the search engines, but there’s a bit more to it that you can do (without needing to spam).

  13. With regards to Google’s own markup, I imagine that it’s less them not caring about standards, and more about them wanting to present a very consistent image to all users. Google’s immense popularity means that they must have greater numbers, and perhaps greater percentages, of people visiting using older user agents – IE3, Netscape 4 et al – that don’t have very good support for our shiny standards. So, for that aim, you’d want valid HTML 3.2, or something.

    I agree that focusing solely on Google rank will get you precisely nowhere. Any site should have clearly defined, real-world aims, and everything else (search engine rank, code, design, content, functionality) should only be used in as much as it furthers those aims.

    With regards to Google being more open about how it ranks pages… well, wouldn’t that just open the door to the spammers, thus crippling Google for the rest of us?

  14. I think persuading Google to promote valid pages (of any HTML version) would be an excellent first step. Semantic markup may follow, but realistically it’s too much to ask in the short term.

    I’m not convinced that Peter Janes’s tests are wholly valid, as the ‘inbound link count’ factor may not have been triggered. Many people linking to his intro page using the same anchor HTML would be a better measure.

    I don’t think there’s an SEO vs Standards bout to watch; it’s like saying Roof vs Walls.

    Sorry to say this, but the SEO community has more of an image problem than a knowledge problem.

  15. Bit unscientific this, not sure how useful it is, but…

    search term: ‘Athens 2004 hyperlink policy’

    Yesterday I put up a page on my website about the Athens 2004 hyperlink policy. It’s linked to twice from my (Google indexed) homepage (once with the search term in the link text, once without). I would be very, very surprised if anywhere else linked to it.

    The page itself contains the search term in the title and h1 tags, and the filename, and contains Athens 2004 in two paragraphs.

    It’s now top of the Google rankings for the search term described above.

    http://www.google.co.uk/search?hl=en&ie=UTF-8&q=athens+2004+hyperlink+policy&btnG=Google+Search&meta=

    I think this may be due to the rarity of the term itself on the internet – if you search for ‘Athens 2004 website hyperlink policy’, I’m nowhere to be found (my page doesn’t include the word ‘website’ anywhere).

    I was wondering if the presence of the search term in the filename had an effect. Maybe I’ll try that tonight with ‘Athens 2004 website hyperlink policy’.

  16. I’m sure I read an article on alistapart claiming standards based design helped with pagerank?

    To clarify, PageRank does not mean “how a page ranks”. It is a part of the Google formula that takes into account the number and quality of links *to* a website. That part of the ranking formula was devised by Google co-founder Larry Page; thus “PageRank”. Unfortunately, this means that it’s often confused with the *entire* formula; had Mr. Page been Mr. AnythingElse, this confusion would likely not exist. Note that, according to Google, the ranking formula takes into account over 100 other factors.
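
    For reference, the formula as published in the original PageRank paper (Brin and Page, 1998) is, roughly:

    PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}

    where d is a damping factor (the paper suggests 0.85), T_1 through T_n are the pages linking to page A, and C(T_i) is the number of links going out of T_i. The live ranking system layers many more signals on top of this, so the published formula is only a starting point.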

    > improve the readability of your code for search engine spiders

    This is always a good thing, in my experience. Choked-up/clogged-up/incorrect code *can* be a problem for search engines as opposed to leaner and meaner code. From what I can see, sites coded to Web Standards could definitely rank highly for search terms if they are also optimized for those search terms.

  17. I stumbled upon this blog today and just started reading randomly. I came to this particular post and some comments saying that Google should enforce standards compliance.

    I find that to be the wrong perspective. Someone like me, being an economics student, has no idea about web standards. If a company like Google enforced strict standards compliance, then I wouldn’t even be able to have people reach my web page through the search engine.

    Search engines have to follow what 99% of the world does. They cannot try to change the world; I don’t think that is their role.

    How about this: suppose a bunch of economists were discussing money, and then suddenly someone said that from now on, only those people who understand the “theory of money supply” may talk about money!

    You guys are doing a great job! I just wanted to give you an outside-the-group perspective.

  18. Kalu:
    Consider that a “money user” can only use “standards-compliant money”.
    Try using a red one-dollar bill and see what happens.
    Imagine that, although a standard for money existed, anyone could deviate from it… just for a second… How much more expensive would it be to sell a candy? What would be the cost of making money that every shopkeeper would accept? And what would happen if a new shopkeeper entered the market?

  19. Pingback ::

    Very True Things - The End of Civilisation as We Know It (part two of ???)

    […] the SEO gurus in question didn’t quite say the asinine things originally attributed to them. Eric Meyer tries to dampen some of the flames he helped to kindle and links to a selection of the subsequent […]

  20. Pingback ::

    Why People Hate SEO

    […] Bill Slawski and Will Critchlow step up with a response. But how many times do we have to do this? Wasn’t 2004 enough? How about 2009? Does it have to always repeat? Why on earth doesn’t anyone gripe about how […]
