FutureNow Post
Monday, Aug. 27, 2007 at 7:15 am

SEO Ethics: New York Times is Challenged

By Bryan Eisenberg
August 27th, 2007

Clark Hoyt, the New York Times public editor, serves as the readers’ representative. In his Op-Ed column, he writes:

A BUSINESS strategy of The New York Times to get its articles to pop up first in Internet searches is creating a perplexing problem: long-buried information about people that is wrong, outdated or incomplete is getting unwelcome new life.

People are coming forward at the rate of roughly one a day to complain that they are being embarrassed, are worried about losing or not getting jobs, or may be losing customers because of the sudden prominence of old news articles that contain errors or were never followed up.

Archived material is being pushed to the top of the search engine results pages by the New York Times’ Search Engine Optimization (SEO) efforts. That is considered good business, especially for a website that makes money from displaying ads, and the reader gets what might be relevant information. However, the practice raises a new crop of questions about journalistic ethics:

  • What is their responsibility to archive all of their published works?
  • Do they have a responsibility as a news organization to follow up on all published material to verify outcomes and then link it back to the older articles?
  • Should they allow people the ability to comment on this dated material?
  • Should they allow for the editing of the archives at a later date to change what was originally published as news?
  • Should some material just be deleted and forgotten in this digital age?
  • Whose responsibility is it to monitor and influence (if possible) what the search engines say about people?

Please let me know what you think about these new ethical challenges for journalists. What are our responsibilities as bloggers? Do the readers even care when things have finally been resolved?

P.S. If you’re in the mood for contemplating ethical challenges, Marshall Sponder raises another large one: What’s an honest SEO person to do when Universal Search clogs up the SERPs with results they can’t manipulate?

Add Your Comments

Comments (22)

  1. Hey Bryan,

    Very interesting dilemma here.

    My opinion is that the news should always be maintained as originally written. However, I do see wide applications of social media tools to amend news, much like a comment or trackback does to a blog post.

    News happens and then things change. It is inevitable. Imagine a story about, say, “Czechoslovakia.” But then the country disappears into the “Czech Republic” and “Slovakia”. That does not change the opinion of the reporter or what was said when it was first published. A comment-style addition saying that Prague is now the capital of the Czech Republic would be helpful to a story about Czechoslovakia, but I would not advocate a search-and-replace strategy to make wholesale changes to pre-existing news.

    I recall several years ago I was working with a public company whose CEO wanted to remove several press releases from the press pages because the news was no longer “appropriate”. For example, a new VP of Sales had been announced, but the person didn’t work out and was no longer with the company. Another release was a partner deal gone south. I argued (successfully) that a company’s news release archive needs to be maintained because it is the historical record. This is particularly true of a public company.

    But imagine the same VP of Sales press release scenario where the company amends the electronic release with a comment-like message saying the person is no longer with the company, plus a link to the new press release announcing the new VP of Sales.

    Cheers, David

  2. There are 2 separate issues here – errors and follow-ups.

    Regarding errors, if information published about a person has been found to be untrue/inaccurate then it is the responsibility of the news organization to go back to the original article and either note, correct or remove the inaccuracy when it’s brought to their attention. If they do not take these steps the news organization can be sued.

    Regarding follow-ups, news is news and, if the material is dated and accurate, then the news organization is under no moral obligation to fill in or follow up its content. It would be user-friendly for the reader, but it is not a moral issue.

    Ultimately, it is the reader’s responsibility to perform their own research and to evaluate those findings.

  3. At a minimum, all news articles should include the date published. Too many old stories can appear current, and there’s no easy way for the reader to verify when a story was current.

  4. G’Day Bryan,
    Firstly, The New York Times and any publisher has a responsibility to ensure that their archives work in their favor from an SEO standpoint.

    Journalist’s Ethics & The Fourth Estate:
    I believe society needs to truly debate the whole concept of the fourth estate, journalists’ ethics, their rights to protect sources, etc.

    The “Media” of the last two centuries no longer resembles that of today and tomorrow. Rupert Murdoch, Chairman of News Limited, more than perhaps any media owner, makes the case that media outlets, be they Newspapers, Magazines, TV, or Online Channels, are assets of the company first and foremost. Their chief purpose is to benefit the Company and its Shareholders; “they are products & services”.

    Citizen Journalists ~ Bloggers:
    We already see that a huge percentage of blogs, online video content, article submissions etc. are produced primarily to market products & services. “Citizen Journalists get it… media is a tool of business”

    Moral Responsibility:
    The answer should be a clear yes. Anyone who publishes information has a moral obligation to ensure it is factually correct, now and later. I certainly believe that where publication of facts or opinion occurs an equal avenue to respond, put a different point of view or additional facts should exist.

    Equality, Justice and Freedom though are often in the eye of the holder. No news there. Cheers, Brendan

  5. ‘Archived’ material shouldn’t, I feel, be tampered with. After all, you can’t really change what was written in a print newspaper in 1902, so why should it be any different online?

    However, the problem with the New York Times is that their SEO efforts are obviously poor if they are only getting out-of-date articles appearing at the top of the SERPs. Part of the job of a good SEO is to ensure that the RIGHT material appears in the results.

  6. Sounds like a perfect scenario for the new “unavailable_after” meta tag. Outdated “news” should be marked in that way. That’s not deleting the articles but telling Google that they’re outdated.
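
    [Editor’s note: For context, the tag this comment refers to is the “unavailable_after” directive Google announced in July 2007. A sketch of how it appears in a page’s head, following the date format from Google’s announcement; the specific date shown here is illustrative:]

    ```html
    <!-- Asks Googlebot to drop this page from Google's search results after
         the given date/time. The date uses the RFC 850 format shown in
         Google's 2007 announcement. Other search engines ignored this tag. -->
    <meta name="googlebot" content="unavailable_after: 25-Aug-2007 15:00:00 EST">
    ```

    [Note that the tag only removes the page from Google’s results after that date; the article itself stays online in the publisher’s archive.]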

  7. Interesting this is attributed to SEO when it’s bad news but when it’s a successful marketing program, MSM is often reluctant to characterize SEO initiatives as anything more than unethical manipulations of search results.

    I agree with David in that archived content should not be edited directly, but appended with appropriate corrections.

  8. In my opinion, making selective decisions about what gets archived in the engines, and where and how, is the content owner’s responsibility, whether blogger, journalist, or otherwise. Moreover, regarding any company whose SEO efforts eventually spur debates over relatively intermediate-level tactics like cloaking, it could be argued that such companies should be especially conscious of these subtleties and maintain a policy on them one way or another. Absolutely.

    If I were writing the policy, it would be to close comments on anything that’s old news, and anything I’m letting into the SERPs, cloaked or otherwise, should have appropriate editors’ correction notes appended, i.e., clearly visible in the SERPs. So in cases where titles and meta descriptions (what searchers tend to see in the SERPs) have themselves become misleading, the responsible thing to do is either correct them or remove them altogether.

    Search engines themselves don’t say things about people though, so much as simply aggregate what people put on the Web. The currency in which they trade is other people’s information, their profits the ad revenues shown in contexts thereof. The responsibility therefore lies on the day-to-day with SEOs, and at the end of said days with the publishers on whose behalf they work.

  9. Knowing Marshall Simmonds, the NYT’s head of Search Strategy, I can tell you that the company uses only the most ethical of SEO tactics to improve its traffic. The idea that leveraging your article archive would be unethical is preposterous. If Google’s aim is “to organize the world’s information and make it universally accessible and useful,” then Google needs to develop a better means of understanding the relevance of an archived article.

    If the Times were manipulating a Google technique to pretend that archived content was more recent, you’d have a case here, but they aren’t. The new meta tags are a step in that direction, but the tags assume that content value is a black-or-white issue. Why tell Google that the information isn’t of value when it might be? The tags need further context. Why would a publisher tell Google to drop pages altogether? The NYT’s archived content does have value in search results, especially when researching historical developments. If I do a search for “Enron” and get no results that detail the events as they came to fruition, it looks bad for Google. If I’m writing a paper on Enron, I clearly want to be able to cite the NYT as a source, but the meta tag offerings don’t offer the ability to cite the date of relevance. That is, of course, due to the heightened ability to manipulate that meta data’s influence on the index.

    What Google needs is a means of understanding news versus reference content and how to rank the two types of results as time elapses. They also need a way to identify the two types of content without being spammed. That’s a tall order, but don’t blame the Times for that. Blame the black hat SEM firms. It’s funny… your critique of the NYT is criticizing a white hat shop when it’s the black hats that destroy the possibility of filtering content easily, because of their efforts to manipulate/chase the algo.

  10. Here’s what I find really interesting. This appears at the bottom of the NYT article:

    “The public editor serves as the readers’ representative. His opinions and conclusions are his own. His column appears at least twice monthly in this section.”

    So, this writer is the readers’ representative, right? However, his articles don’t have comments enabled? This feels like another old-school journalist fighting change because he is uncomfortable with it, and he is using his public pulpit to champion his way of thinking. I could be wrong here, but it seems like an obvious oversight to pretend to be the voice of the reader and completely ignore the ability to let them respond publicly.

  11. I agree with others that the archive should be just that. But where the NYT misses the boat is that this is a perfect opportunity to leverage the new media and add corrections and follow-up articles. If their efforts are proving that the archives are valuable, then this is the time to get a team looking at the articles that are being found and doing the legwork to follow up and make corrections.

    Nothing like having story ideas lining up for you.

  12. Prescott,

    Please read the post again. We didn’t criticize the NYT at all. We are pointing out what they perceive as an ethical dilemma and asking how journalistic ethics need to adjust based on these developments.

  13. [...] leads Bryan Eisenberg to ask a few pointed questions, [...]

  14. It is a dilemma of a kind, but then again, it’s just one of the Old World things that the Web changes — for good and bad. Reputations have always been made or lost, based on true and false information. Think word-of-mouth in the pre-Gutenberg days. The press changed that by committing it to paper and getting it circulated. Now the web lends it a permanence and an accessibility.

    When print came along, we as editors offered to make corrections in later editions — that became standard. I agree with several others on this string that the solution here isn’t hard, and that web tools are tailor-made for it. Those taking exception to stories can easily use commenting tools, and in cases of factual wrong, a publisher can easily take that comment, research it as much (or as little) as it pleases, and then let the readers decide.

    Affixing comments to archival pieces isn’t simple, but it’s got to become a publisher responsibility as they unearth and SEO archives. We can look at the recently announced Google Comments as an early step in this process (as usual, Google is ahead of the NYT in how to harness technology for our age). We can debate Google Comments pros and cons, but it gets people on the record, out in the open, and ultimately that’s the best tonic for error.

  15. [...] I have seen the discussion arise regarding the New York Times employing search engine optimization techniques on old articles, [...]

  16. Interestingly enough, I was listening to part of Orwell’s “1984” on tape recently–the part, as it happens, that profiles Winston Smith’s typical day at work–altering old news stories to fit the current politics of the dictatorship.

    I’d forgotten that’s what he did for a living.

    I’m with David Meerman Scott on this: it’s fine to annotate old news stories to reflect current realities/correct errors–but it’s definitely not OK to alter stories and claim they were in the original.

    Shel Horowitz, author, Principled Profit: Marketing That Puts People First, and founder of the Business Ethics Pledge

  17. Shel & David – I think you are both right. I’d hate to see the Ministry of Truth editing news stories.

  18. The Observer, the New York City scandal sheet, is especially guilty here. They have published many career-halting articles, and unfortunately they have worked heavily on the SEO of their site, to the extent where they focus more on search engines than humans. Bordering on black hat SEO, they are promoting long-forgotten articles that are now coming back to haunt many people and do double, triple… exponential damage.

  19. All news articles should include the date published. Too many old stories can appear current, and there’s no easy way for the reader to verify when a story was current.

  20. They should be able to post and optimize so their content shows up high in the rankings. I do agree that they should have to post corrections/alterations along with the original article.

  21. There’s no denying the fact that search engine optimization has an ‘iffy’ reputation at best, and that’s being generous. Many people think SEO is worse than the plague itself.

  22. [...] I have seen the discussion arise regarding the New York Times employing search engine optimization techniques on old articles, [...]


Bryan Eisenberg, founder of FutureNow, is a professional marketing speaker and the co-author of the New York Times and Wall Street Journal bestselling books Call to Action, Waiting For Your Cat to Bark, and Always Be Testing. You can friend him on Facebook or Twitter.

