
SEO Fundamentals - Get More Traffic!

Search Engine Optimisation - Getting higher on Google.

See Also - SEO for Joomla - How to change page titles, headings, metadata etc. in Joomla.

You will also find videos in the Joomla Tutorial pages covering the practical basics of getting the words you want to be found for into positions that matter to search engines, such as your page titles and the HTML headings on your pages. Before you jump in, though, it pays to understand some of the basic strategies of search engine optimisation.


Been approached by a company trying to sell you SEO? Read this:

Clients often ask us to evaluate SEO proposals they've received after being cold called or spammed by SEO companies. It's hard to do that without explaining in detail what separates good SEO from bad SEO, and the time spent working through your site and some boilerplate SEO proposal would be better spent just teaching you the basics of good SEO - hence this page.

SEO in a nutshell

Google uses two main criteria to rank sites: "relevance" and "importance".

1. Relevance to the search performed. 

If a site is "highly relevant" to the keywords in a search, then it will rank higher.

Where and how often you put keywords on your site only affects "relevance", not "importance". The goal is to be seen by Google as "highly relevant", which will make you rank higher.

Unfortunately you can't just cram in more and more instances of a given keyword in an attempt to look "highly-highly-highly-relevant". Google sees this as "search engine spam" and penalises it aggressively.

If your keyword density and positioning look unnatural for the topic, then you're at risk. The good news is that if you've got the desired keywords in page titles, headings, and reasonably often in the page content, and if the page in its total meaning seems to be "all about" that topic, then it's probably as relevant as it can be, and the only way to rank higher is to look at importance (see below).

If you have a large site with lots of highly relevant, useful pages on a topic then the whole site will get a boost, but don't think you can just cram in rubbish content that says the same thing in 20 different ways, as Google will rightly recognise that as the spam it is.

Google looks primarily at the actual text on the page, starting with the page title, the headings, and the full text of the page. Metadata (like meta descriptions, meta keywords, alt tags on images, and text in file paths) is trivial in comparison, but SEO companies love sounding like they're the only ones with the secret knowledge, when the reality is simple: if you want Google to think your site is highly relevant to a certain topic, write about that topic, and use the words that you want to rank for in prominent locations on the page.

A clear indicator that an SEO company is shonky is if they start telling you to add things like meta keywords, since neither Google nor any other major search engine has looked at meta keywords since the 1990s.

Yes, there are some technical factors, and a number of ways you can inadvertently shoot yourself in the foot with SEO (see below), but the basics are easier than many would have you believe.

2. Importance

So if there are 500 sites that are all "highly relevant" to your desired keyword, it's the "important" sites that will rank higher - not the ones most manipulated by SEO companies (despite what their sales people will try to tell you).

Importance is determined by the total "importance" of the sites that link to your site. The importance of those sites that link to you is of course determined by the total importance of the sites that link to them, and so on, and so on...

So a single prominent, permanent link from the home page of www.smh.com.au would be worth literally millions of times more than a link from a free web directory. These days there really aren't any safe, reliable and affordable ways to artificially inflate this importance factor by paying people to point links at your site, but that won't stop SEO sales people trying to sell you such services! After all, the risk they're taking is with your site, not theirs, and by the time Google blacklists you, they'll already have your money. The ACCC gets many, many complaints about this and has issued public warnings on the topic.

So if you're up against competitors with higher importance scores for a given keyword, don't get desperate and pay some cold calling SEO company to point lots of fake links at your site, or start cramming keywords into every conceivable position on the page. Most of the suggestions we come across in such proposals are either ridiculously out of date, ill-informed, or just downright dishonest, and as such we no longer think it's worth our time, or our clients' money, to review such proposals - they're almost all shonky, and their fine print will protect them, not you, when your site gets blacklisted.

For some practical ways to get real links that are natural (as opposed to artificial "link spam" sold by SEO companies), see the section below on "links as a credibility indicator".

The Detailed Stuff:

1. Relevance as a ranking factor.

Make your site "highly relevant" to the search terms (keywords) that you want to rank for by including those words in prominent positions.

Important qualification: once a page is "highly relevant" to a given search, extra efforts to cram keywords into every possible position and meta tag offer little, if any, additional benefit, and if overdone can even get your site penalised for "keyword stuffing". Basically, once the keyword "density" of a certain term reaches the point of looking unnatural to Google (compared to what is normal for your topic), Google may actually penalise you.

You should make sure that your desired keywords appear in...

  • The Page Title (especially at the start of the title, and especially in your home page's page title),
  • Plus in a prominent H1 or H2 heading at the top of the page, with subheadings used logically to break up text. Avoid heading use beyond what makes sense from a user experience point of view (e.g. writing a whole page of text inside an H1 heading won't trick Google, and may even look like 'search engine spam' to Google). Note: HTML heading tags aren't highly significant in their own right. Yes, they're best practice, but don't panic if they're not used - if something looks like a heading (e.g. a single sentence or phrase in a large font at the top of the page), Google will probably treat it as a heading.
  • At least once but ideally a number of times in the body text and
  • Ideally in links pointing to the page (e.g. in menu links or linked words on other pages). Links from other sites to your site matter most, but it's still helpful if links between pages on your own site contain keywords you want to rank well for.
    Note that footer links in small text carry much less weight due to their size and relative position on the page, especially if they point to a page that's already been linked to more prominently in the main navigation. The practice of stuffing keyword rich footer links (especially those that point to the same pages as those in the main navigation) became much less popular about 5 years ago in line with Google coming to recognise that this practice was far more about SEO than about helping site visitors find useful content.
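
Concretely, those placements might look like this in a page's HTML. This is a hypothetical sketch - the business name, keywords and URL are invented for illustration:

```html
<html>
<head>
  <!-- Desired keywords at the start of the page title -->
  <title>Camper Van Hire Sydney | Acme Campers</title>
</head>
<body>
  <!-- One prominent heading near the top containing the keywords -->
  <h1>Camper Van Hire in Sydney</h1>
  <p>Acme Campers offers affordable camper van hire across Sydney,
     with one-way hires to most east coast destinations.</p>
  <!-- An internal link whose anchor text carries the keywords -->
  <p>See our <a href="/camper-van-hire-rates/">camper van hire rates</a>.</p>
</body>
</html>
```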

If these conditions are met (or even just most of them), then Google will already see that page as "highly relevant" to that term, and extra efforts to include the desired words in lots more places - like alt tags on images, meta keywords (no longer used at all by any major search engine), meta descriptions, URL paths and extra places in the body text - while sometimes helpful to a degree, will produce diminishing returns (and eventually negative returns if you take it so far that Google penalises you for "keyword stuffing").

For a second opinion on this point, just look at the top ranking sites on Google for just about any search. Most sites that rank well on Google have quite modest keyword densities and positioning compared with highly "optimised" sites, and ignore many of the positions recommended by some SEOs. In fact, many sites appear in the top few results despite breaking almost every "rule" in the book for "on-page" content optimisation, so getting the fundamentals right is the main goal.

1b. An exception to this is that you can use every possible position to expand the variety of words around a certain topic that you might want to rank for. Once again, don't go nuts: unnatural groupings of keywords that don't follow typical patterns of English usage for your topic may still trigger penalties on Google. But to a large extent, if you want to fill every possible location with keywords, use it to expand the variety of different keywords rather than repeating the same ones to the point where your site looks like "search engine spam" to Google. You should, however, be aware that there is some trade off between concentrating your efforts on a smaller number of keywords (or a single keyword) and diluting relevance in order to rank for a range of keywords. Given the extreme diversity of what people search for, it is generally better to aim for some breadth of different searches than for extremely narrow targeting, though the best balance between the two will vary from site to site.

Important: an exception to the basic tenet that you can't get a page to be more relevant than "highly relevant" is that you can make your whole site into a large, useful and authoritative source of information on a broad topic. In this sense, a large site that is all about a certain topic, with hundreds of pages of useful content on that topic, will certainly outrank a small site where only the home page is highly optimised around the same keywords (all other things being equal).

There are many factors at play here, but in a nutshell, if you add lots of useful content on a topic that attracts visitors, keeps them there, and keeps them coming back, then Google will notice. Such sites also attract more inbound links and cover a broader spread of keyword combinations, which can capture the "long tail" of the keyword demand curve - often the source of 50% or more of the total traffic going begging on a topic. Google's ultimate goal is to develop a true artificial intelligence that actually understands language, and they are already a long way down this track compared with the crude aggregators of keyword densities and meta tags that were the search engines of the 1990s. So the old adage that "content is king" will only become more important over time, and Google really can tell the difference between useful content and rubbish aimed at bulking up a site, or content merely written around popular keywords that otherwise adds little to the user experience.

2. Know what to optimise for.

Use free tools such as the keyword tool in Google AdWords (especially with the "exact match" setting) to find out what products or services your customers search for most often. Don't just optimise for what is popular: concentrate also on the search terms (keywords) that are most likely to turn into sales, and terms you have a better chance of ranking well for given your link profile (see below for more about links).

Sometimes it makes more sense to optimise for a niche where you have less competition than to try to rank at the top of Google for a broad generic term (like "jobs" or "real estate") when your competitors are spending hundreds of thousands of dollars on public relations and other efforts to build a profile of inbound links that you don't have the link profile or budget to match.

Don't ignore the long tail. Keywords range from "blockbusters" (highly popular, often generic terms) to the "long tail" of more diverse keywords and phrases. The sweet spot is typically just behind the blockbuster terms: generic searches are often not related to any intention to purchase, while qualifying words (such as product attributes, price qualifiers, or locations) often come from people who are ready to buy, and are at the same time much less competitive to rank for than the generic blockbusters.

It's also a curious fact that on well optimised sites, including ones that rank well for blockbuster generic terms, the bulk of the traffic still comes from a vast myriad of long tail search terms (read "The Long Tail" by Chris Anderson to understand why). So targeting only blockbusters ignores the source of most traffic (and the highest converting traffic), whilst pitting you against the toughest competition.

3. Links as a credibility indicator.


Google counts links (from other websites pointing to your website) as votes for your site. More importantly, if a site links to yours, then the value of the vote they are casting for your site is determined by the value of all the votes from the links pointing to their site. Read this again, then twice more for good measure, as it is one of the most important aspects of SEO for Google.

For example, a prominent link at the top of the home page of the Sydney Morning Herald would easily be worth as much as 100,000 times the value of a link from a small website that has just a few poor quality links pointing to it.

This is why it's not as easy as people might think to get good quality links. Getting low quality links is easy, but conveys little if any benefit. Getting a prominent link from a quality, high ranking site is very time consuming.

Furthermore, the value passed on by a link is divided between all the links on the same page (to put it simply), so sites that link to lots of other sites (like free directories, or forums that attract lots of "spam links") often pass little if any benefit. The take home message is that easy links are often not worth even the small amount of time it takes to get them, compared with spending many hours writing free material, or in some other way motivating a more important site that links to very few people to link to you.

As with all things Google, never overdo anything artificial. Google has mapped the entire link architecture of the Internet (this is the whole point of the endlessly recursive nature of link analysis as a credibility referral system), so they are very good at telling natural patterns of link acquisition from artificial ones or fake networks. A rapid acquisition of low quality "free for all" links is unlikely to help, and may even trigger penalties in certain circumstances. There are many, many ways Google can tell quality links from poor ones, but sticking to real links from real websites with real credibility is certainly the recommended path.

The best links are those that come from pages where there is at least some meaningful relevance between the content on the two pages. E.g. it would be perfectly natural for a site about travel, to link to a site about camper vans, but it would be unlikely for a cake shop website to link to a site about auto-repairs. A percentage of seemingly irrelevant links is quite natural and certainly not a problem, but if most of your links are low quality links coming from sites with no relevance to your own, then Google will either discount the links or view them as "link spam".

The following are all good strategies:

  • Public relations
  • Writing articles containing author backlinks used to be a worthwhile strategy, though it's probably no longer effective in the wake of various Google updates and the vast amount of low quality material on article sites. However, professional, useful articles published in prominent industry publications are still effective, provided you can get them published, which may require help from a professional public relations firm.
  • Joining trade associations that link to their members
  • Link baiting, and
  • Submitting to QUALITY directories - especially industry specific directories that only link to reputable sites - may be effective in some cases, though in most cases such links won't carry much weight, and low quality directories generally won't confer any benefit whatsoever.
  • Writing testimonials for suppliers that include a backlink to your site.
  • Sponsoring clubs or organisations that link to their sponsors.
  • Note: any patterns of large numbers of backlinks that seem unnatural can actually get your site penalised by Google.

It is far more important to put effort into this type of link building - and into developing more useful, interesting and link worthy content, along with the basic content optimisation discussed above - than to obsess over your site's keyword densities compared to your competitors', or to worry that you haven't put keywords into every possible URL path, meta tag or image alt tag. Google cares far more about putting credible, relevant and useful sites at the top of the results than about prioritising sites that are highly "optimised".

It is very easy to get caught up in content optimisation details, because these factors are easier to control than getting high ranking, relevant sites to link to you - and it is precisely for this reason that Google prioritises the latter: it is a far better measure of whether a site will be useful to the searcher than some crude measure of keyword positioning or density.

4. Avoid common problems.


Anything that prevents Google from finding and indexing all of your pages will obviously prevent those pages from ranking, for anything!

So don't use links or menus that require JavaScript or Flash in order to function (Google's "bot" won't follow such links and therefore won't find and rank those pages). Site maps can be used to get around this problem (but if your site doesn't have this problem, a site map won't help you).
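
For example, the difference is between a plain link with a real destination and one that only works via script. A simplified sketch - the URL and function name here are invented:

```html
<!-- Crawlable: a plain href that Google's bot can follow -->
<a href="/products/">Products</a>

<!-- Not reliably crawlable: no real destination without JavaScript -->
<a href="#" onclick="openPage('products'); return false;">Products</a>
```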

Avoid putting your main page content inside images (i.e. in pictures of words rather than in selectable text) or inside Flash.

Google can't normally read and index such content, but some text inside images or Flash is fine if it contains keywords you don't need to rank for, or keywords that already appear more prominently in real text elsewhere (so don't panic about isolated instances of this). There are also some ways of using Flash that Google can read and index, so before spending a large sum of money rebuilding your web site, check whether this really is a problem first (just search Google, with quote marks, for a unique phrase from various pages to see if the content is being found).
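
You can also get a rough feel for this yourself by stripping a page down to the plain text a crawler can actually read. The sketch below uses only Python's standard library, and the sample page, business name and phrases are invented; it is only a crude approximation, not a model of what Google really does:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of a page: tag contents only,
    skipping anything inside <script> or <style>."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def indexable_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# A made-up page: keywords live in the title, heading and body text.
page = """<html><head><title>Camper Van Hire Sydney</title></head>
<body><h1>Camper Van Hire</h1>
<img src="banner.jpg" alt="Camper vans parked at the beach">
<p>Affordable camper van hire in Sydney.</p>
<script>var tracking = "analytics code, not content";</script>
</body></html>"""

text = indexable_text(page)
print("camper van" in text.lower())  # → True
```

Note that words drawn inside the banner image never appear in this output - which is exactly the problem described above when main content lives in images.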

Example where putting text in images is actually better: the most prominent heading on many sites is often a purely marketing driven phrase such as "Real benefits without the cost", with a subheading that mentions keywords such as your products, services and where you sell them. In this case it makes far more sense to have the marketing heading in a form that Google can't read, and the keyword-bearing subheading in a true H1 heading tag - though even normal text in a bigger font at the top of a page will usually be recognised by Google as a heading whether it's in a heading tag or not.

Avoid tricks that are only intended for SEO, but not for any other purpose.

There is a constant cat and mouse game between Google and people trying to exploit its ranking systems with techniques done for no other reason than to trick it, and SEO has a long history of them: keyword stuffing, doorway pages, link spam, hidden text (e.g. white text on a white background) and many others. The basic rule of thumb is simply to align your interests with Google's and you'll be safe for the long haul. You also won't be sent into a complete panic every time Google tweaks its algorithms. Create a site that is interesting and useful to your visitors, gain links from real websites with relevance and credibility in your industry, and you will continue to rank well - and even improve your rankings - as Google gets better and better at sorting the wheat from the chaff.

Worried you might accidentally be using a technique that looks "spammy"? Don't panic ...

A good rule of thumb: if what you've done is accidental, or a natural result of how your shopping cart or CMS works, and is likely to be done the same way (without any tricky intentions) on millions of other web sites, then don't panic.

Google has a strong interest in not penalising accidental things that exist on large numbers of sites through the legitimate use of popular software, because they don't want to accidentally penalise (and thus remove) vast swathes of potentially good search results from their index when there's no deliberate attempt to trick Google.

Of course it may still be worth addressing, and if you suspect you've been penalised it's worth looking into but use this rule as a guide.

Case study: An example of something inadvertent that did cross the line and get penalised was a Joomla site I once built where I'd written the page headings manually onto each page (as is often preferable depending on the site), and then the client for some reason turned on a global configuration setting that turned the article titles on to being visible on the page as well, thus creating 2 nearly identical headings at the top of every page across the site.

Of course there's no legitimate reason why anyone would want to do this - it looks ridiculous - and to Google it looked like a blatant attempt at keyword stuffing. The client promptly dropped from 1st on Google (for the product they sold) to about 40th or worse. The client contacted me about the drop in rankings and it didn't take long to spot the problem (though why they didn't contact me about having two headings at the top of every page is another question).

Simply writing a reinclusion request to Google explaining the accidental cause of the problem was enough to get their rankings restored, something which would have been much more difficult to resolve had the cause been a deliberate spammy technique.

5. Avoid Common SEO Myths and SEO Snake Oil.

This section will help you spot the SEO sharks! If they've cold called or spammed you, chances are you'll find them recommending some of the techniques below. Because Google holds most ranking factors as closely guarded secrets, and because so many different factors contribute to rankings, myths are rife in the SEO industry - peddled by SEOs through inexperience, ignorance or, at worst, a motivation to get you spending big on major changes, to give them an excuse for poor results if you can't implement every recommendation for technical or budget reasons, or simply to make them look like they have more 'secret sauce' than the next SEO.

Myth: keywords in the URL (after the end of the domain name) are a strong ranking factor.

They're not (other than in the domain name itself, where they're a HUGE ranking factor). The reason this myth is so common is that you often see sites with so called SEF (search engine friendly) URLs in search results, and they're doubly noticeable because the words you searched for are displayed in bold in the results, creating a vague impression that keywords in URLs are common in high ranking sites. But don't confuse cause and effect.

Sites that do a lot of SEO typically do this as well, so there are lots of sites at the top of many search results that do this, but it is usually their other efforts that are delivering their results, not this one. I've never seen a significant result where I've added this feature to a page that's already well optimised for the same keywords, and this sentiment is echoed by most top SEOs and good resources on the topic.

Exceptions: on a page or site that is otherwise very poorly optimised, throwing Google a few crumbs in the form of URL keywords might give Google its only way of knowing what the page is about, and in that situation it will of course make a significant difference. For example, adding keywords to the URLs on an all-Flash site that Google can't read may suddenly allow Google to see those pages as sufficiently relevant to a given search to rank (especially if other off-page ranking factors are in play, or if the terms aren't very competitive). But on a page that is already well optimised it appears to make very little if any incremental difference.

Another area where keyword rich URLs can be useful: when people link to your site, they often use the exact URL as the linking text (the "anchor text"), and keywords in anchor text are a strong ranking factor in determining what the site is about (relevance). Incidentally, this is why some pages rank well for phrases that don't even appear on the page, if other sites link to them with that text.

But unless your site attracts a useful number of links to internal pages (pages other than your home page), this won't make a significant difference unless you're adding new words the page isn't already optimised for. The other reason this myth is so prevalent is that search engines did once have trouble indexing dynamically generated content (in the 1990s), but nowadays half the web is built on content management systems, forums, shopping carts and other dynamically generated, database driven technologies, and if Google (or any other search engine) couldn't index those pages, or considered them poorer results, they'd quite simply be out of the search engine business.

Myth: The duplicate content "penalty" and the PageRank™ dilution myth.

Google have gone out of their way to debunk this myth.

It is commonly believed because people often find pages on their site not being indexed by Google when another URL on their site contains the same content. This is fine: Google is just avoiding having multiple copies of the same content in their index. They don't attach any less weight to such pages, and have even started aggregating PageRank™ (from the various pages that contain the same content) onto their single chosen page, thus negating the PageRank™ dilution myth (and they have categorically spelled this out). Of course, if the duplicate content is on someone else's site and Google has picked that version as the original or most authoritative, then your version won't rank - so don't expect to rank for content you've copied off someone else's site. But that's very different from the notion that having more than one copy of the same thing on different URLs of your own site will hurt your rankings.

Exception: avoid excessive multiple URLs for the same content (common in some shopping carts, some CMS programs, and sites that can generate infinite "next" pages without new content).

Note: this is nowhere near the sort of problem many people think it is, for the reasons above, but it can still cause problems, and you can use various methods to canonicalise such URLs where it's an issue. It can especially cause problems if so many crawlable (but useless) URLs are generated that Google simply gives up crawling your site, because doing so consumes too many resources without leading to any new unique content.
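
Where a cart or CMS exposes the same content under several URLs, the usual fix is a canonical link in the page's head, telling Google which URL you'd prefer it to index. The domain and paths below are placeholders:

```html
<!-- On both /shoes?sort=price and /shoes?colour=red,
     declare one preferred URL for the content -->
<link rel="canonical" href="https://www.example.com/shoes/">
```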

Exception: avoid duplicate content that hurts the user experience, or substantially the same content "spun" in different ways to target different keywords. Examples are boilerplate pages that add nothing to the user experience and merely repeat content found elsewhere with a few keyword changes aimed solely at Google. This is a blatantly spammy technique and you should expect your site to suffer if you use it. Other examples are automatically generated pages that contain very little beyond the content found on all pages, such as pages generated automatically from search results or keyword tag clouds.

(Partial) Myth: PageRank™ sculpting.

This relates closely to the issue above, so whilst there is some merit in PageRank™ sculpting, keep the point above in perspective (namely PageRank™ aggregation on substantially identical pages), along with the fact that whilst extra pages might be taking PageRank™, they're also passing it on. In the grand scheme of things it's often not worth spending time and money on, especially if the "sculpting" blocks Google from indexing pages that could attract visitors.

Organise your navigation to give greatest priority to the pages that matter most to your visitors; in most cases this will be aligned with good SEO strategy and will achieve any degree of PageRank™ sculpting worth having.

Myth: Avoid using html tables.

There are some good coding reasons to avoid tables, but SEO isn't one of them, and Google is constantly getting better at interpreting the hierarchy of your content as it actually appears visually.

Myth: Using meta keywords (aka site keywords; keywords added in the metadata of an HTML page).

This is ancient history: a feature of HTML created before search engines had the processing power to read the full text of your site's pages. Meta keywords are not used at all by modern search engines.
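
For reference, this is the tag in question (the keyword list is invented). It's harmless, but a waste of effort - and a red flag if someone bills you for adding it:

```html
<!-- Ignored by all major search engines -->
<meta name="keywords" content="cheap widgets, best widgets, buy widgets online">
```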

Myth: Have a high text to code ratio (or a text to code ratio of X %)

This one is so absurd it is laughable, but it still does the rounds. Somebody, somewhere noticed that sites with a high text to code ratio seem to do better on Google, and it made it into the permanent swirling rumour mill of SEO folklore.

But the reason for this is that sites with lots of content rank well (and such sites obviously have more text relative to code, thus giving birth to the myth). So yes, adding more relevant, useful content will help you rank better, and this will coincidentally increase the text to code ratio - but simply changing the ratio by manipulating the code won't make one iota of difference to your rankings.

Myth: Google or other organisations certify SEOs in any credible way.

SEO certification organisations exist to promote the interests of the people who create them (or their own SEO companies). Google holds most of the hundreds of ranking factors (and their relative weightings) as closely guarded trade secrets, so they're not about to start certifying that anyone has any special expertise in them.

They do, however, certify people in search marketing through their advertising programs like AdWords, involving a long training program, many hours of yearly exams, and a requirement that the partner maintain a minimum client base.

Myth: A site map will help your rankings.

This is only true if Google can't find those pages any other way (e.g. by crawling your site's links and navigation, which is the normal way Google finds web pages).

Clarica will never build you a site where Google can't crawl your main navigation, so on such sites your menus are effectively Google's site map.

If you've had a site built by someone else where the menus don't work without JavaScript or Flash, then you're in urgent need of a site map; but on a normal website with HTML menus, adding a site map won't help one bit.

You also can't make Google visit your site more often (not that it would help in its own right), but you can use a sitemap to tell Google to visit your site less often if you have a site that rarely changes. Once again: don't confuse frequent visits with Google liking your site. The only advantage of Google visiting more often is getting Google to reflect changes on your site more quickly.
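If you do genuinely need one (say, a JavaScript-only menu), a sitemap is just an XML file listing your URLs; the `changefreq` element is the hint mentioned above for telling Google how often a page tends to change. A minimal sketch following the standard sitemap protocol (the domain is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal example sitemap; example.com is a placeholder domain. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <!-- A hint only: this page rarely changes, so crawl it less often. -->
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Note that `changefreq` is advisory: it influences how often Google crawls, not how high you rank.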

Myth: Linking out to other popular sites.

It's who links to you that builds your credibility in the eyes of Google, not the other way round.

Myth: Keyword Stuffing.

Google knows that the sites people judge to be the best results for a given search typically don't have unnaturally high keyword densities, so they're not about to start rewarding sites that do. In fact, they go out of their way to penalise such practices, so use what would be natural for your chosen topic as a guide.

Myth: Madlib doorway pages.

Bulk, automated, or otherwise rubbish content, or bulk minor variants of existing pages (e.g. a "product xyz (blue, size 10)" page, a "product xyz (red, size 12)" page, and so on). Aside from creating a usability nightmare for your visitors, such techniques are mostly well detected by Google, and Google is only going to get better over time at identifying them, so such pages are most likely to be ignored. At worst they might look like spam, which could hurt you, but as a rule of thumb, if the pages are a legitimate result of how your shopping cart works, then they're probably not a concern. Google has a strong interest in not penalising things that exist accidentally on large numbers of sites through the legitimate use of popular software, because they don't want to accidentally penalise vast swathes of potentially good search results when the site owners have not tried to deliberately trick Google.

Myth: Keyword rich domain redirections

Registering lots of domains like www.keyword-xyz.com and redirecting them to your main site won't help you.

Myth: Making regular minor changes to hold Google's interest.

Yes, Google loves big sites with rich content that provide quality resources on a topic, and that is a good reason to keep adding more relevant content to your site. But don't confuse Google's crawl rate (how often they visit your site) with how high they rank you. Yes, a site that changes frequently will be crawled more often to keep Google's results fresh, but that won't improve your rankings. Yes, sites that add regular content seem to rank well, but it's because they have a breadth and depth and quality and quantity of content on the topic that causes them to rank well, not because they constantly change.

Partial Myth: Code errors or validation errors will cause SEO problems.
A common scam started going around in 2012 where website owners are cold called by companies offering to "fix problems" on your site, often centering around validation errors and the like. The truth is that almost all complex sites have HTML validation errors. Amazon, Google, eBay, and major news sites ALL have a wide range of supposed errors on their sites.
So what's the deal?
Part 1 of this issue is that Microsoft thought they could use their market muscle to abandon the official standard for how browsers were meant to render HTML, which forced web designers to abandon the standard in how they built sites simply to get websites to display properly in Microsoft browsers. Thus some errors actually became a necessity, and there are many other situations where non-standard code is a necessity.
Part 2 is simply that the code that drives websites can get very complex (just right-click and select "view source" some time), and on large sites or sites that use complex software (like content management systems or shopping carts) some degree of mostly unimportant errors is unavoidable.
Part 3 is that some errors are so trivial as to be meaningless. For example, some features have been "deprecated" (abandoned) in the HTML standard, but if browsers stopped rendering those features as expected, then all the old sites out there would break, so browsers keep supporting them, and probably always will.
Does Google care?
For the most part, no. If they did, then you wouldn't be able to find Amazon or eBay on Google. But if errors hurt the user experience, then Google may care. The bottom line is that Google cares about whether the content on your site answers the question the searcher typed into Google.