Wednesday 31 October 2007

Adsense Alternative

Update 14/12/07 - read the latest about my experience with JustGoMedia here.


I was phoned a couple of days ago by a lady called Esther from JustGoMedia, asking if I would be interested in displaying their Yahoo ads on my Mortgages site. The timing was poor, since that site has just taken a nose dive in the search engine rankings, but aside from that I was interested in her offer.

What they offer is not contextual advertising, so as long as it doesn't look the same as the Google adverts, they are happy for the same page to run both. Instead of reading the page content, their system allows the web designer to specify a keyword.

There are restrictions in place to safeguard the system, which is good, but specifying the keyword yourself does mean the potential for more accurate targeting. I agreed to give it a go.

I replaced the current Google adverts with the Yahoo ones yesterday. Results were a touch disappointing, but then Google has been poor the last week so it's not a fair time to compare. The click through rate seemed lower than Google's normally is, but the earnings per click were higher.

A lot of this could be because I'd only put their advert onto the home page of the site, and I've never bothered to track what Google does on that page specifically. The Google CTR was also down yesterday.

Hopefully, with time I can get the site back up the search listings and then trial the schemes properly! Watch this space.

Tuesday 30 October 2007

Web Site Design

Web site design, and what looks good, is very much a matter of personal preference. I learnt very early on that text that is too big looks unprofessional. Obviously there are cases when an entire site should be built in a very large font, but on the whole that's not required.

Yet sometimes something unexpected happens when you are designing a web site. One currently about to be published looked good, but the customer asked for all text sizes to be increased by 50%. Using 18pt and 24pt Arial fonts across the whole site lost the look and feel of the design that we had spent ages putting together.

In another case we designed a web site in pastel colours. Lots of nice curved areas on the screen, all fitting into the web site nicely. Just what the customer had asked for, we thought. Then we were passed the current logo and told to colour match with shades of brown. The customer was delighted with the final web site, but I have to say the design wasn't one I was too impressed with when it was finished.

I'm not saying that as web site designers our tastes are perfect. What we pass to you in the first attempt may look good to us, but whether it looks good to you depends on your tastes. More importantly, what sort of design will look good to those visiting your web site?

Monday 29 October 2007

Sites Dropped From Google

The saga of why the Mortgage Rates site so suddenly dropped from Google continues, but may be getting closer to a resolution.

Yesterday, I searched Google for half a paragraph of text from the home page and was surprised to find 18 results. All very similar to my own site. Even more surprising, my site wasn't there - not until I clicked the 'show similar sites' link.

Right above my site was a directory listing that linked back to my site. The text it had used to describe my site basically consisted of 50% of my home page. I've already described my problems when the content of a web site was duplicated by a directory.

Looking at the page, it was cached in Google, but grey barred. Thanks to the weekend's PR export and my look at the Latest Page Rank Update on Saturday, I was able to conclude that the page had first been cached in the previous 8 days - my site was dropped 5 days ago.

So that's who I'm blaming (for now) for my site's fall from grace and the loss of a large part of our daily income. The first reaction here was to demand that the listing be removed or changed, but that assumes there are contact details on the site, that the person will actually respond, that they will respond quickly, and that Google will pick up the changes ASAP. And if their page is favoured above our own, it's likely just because it's new; long term Google will put appropriate weight on that page, and having a link out of it to us will be a benefit.

My solution was to reword our entire home page, take the chance to tidy a few pieces up and just make it look different from the other 18 I've found. Then wait for Google to come visiting and hope we are back where we were.

Sunday 28 October 2007

Second Week's Results

Time for a follow-up of the First Week's Results on my Backlink Experiment.

The results last week ended with all of the blogs falling from Google's cache, and this was still the same on day 7. I blamed this on duplicate content, so on day 7 I added some posts of my own and linked to these from that day's entry on this blog. Amazingly, the home page of blog 2 (normal links) appeared in Google's cache later on day 8. Probably as a result of linking out from here.

Day 9 still had just the home page of blog 2 cached.

Day 10 had my own posts and the home pages cached on blogs 2 & 3, plus on blog 2 another (random???) page.

Day 11 - sorry, too ill to be bothered looking!

Day 12 - this is where a difference was first noticed. Site 2 had the home page and all posts cached. Site 3 was down to the home page, site 1 nothing.

Day 13 - seemed to be a bit of a blip as site 2 lost a few pages from the cache.

Day 14 - the missing pages are back - as day 12.

Also, it is interesting to note (as per yesterday's post) that Google has updated page rank this weekend. Here's the results of that (again!).

Site 1 - grey barred throughout.
Site 2 - PR0 on all pages over 7 days old.
Site 3 - PR0 on the home page, grey barred on all other pages.

So, in terms of Google Page Rank and Google's caching, it would appear that adding target="_blank" to a link has a major detrimental effect. So far, because all of the links to site 3 use it, Google has taken little interest in pages beyond the home page. Yet with site 2, which is linked to from exactly the same pages but without that attribute, Google is caching every page, ranking every page it can, and I've even seen it show the count of hours since the last update to the home page.

If you are accepting links with target="_blank" in them, you are accepting a lesser quality linkback!

Saturday 27 October 2007

Google Page Rank

It's quite interesting (for me) that Google's Page Rank has updated today - after a gap of almost 6 months. What makes this update interesting, apart from the fact that I've had a few sites go down, is that because of this blog, it's the first time I've had new pages added daily that I can use to see how Google has tracked the page ranks.

And I record this as much out of interest and to compare to in the future as I do for a study into Google Page Rank. If you notice anything from my results or learn anything, don't keep it to yourself - post a comment!

First, the blog home page has gone straight in at Page Rank 3. Excellent! Only 2 sites link to it - my own and brit blog.

None of the pages added to this blog in the last 7 days have a Google Page Rank - all are grey barred. Pretty much to be expected.

All pages added in the previous days (3/10 to 20/10) have a Google page rank of 0. This means that Google has taken into account all pages it knew about up to and including 7 days ago.

Pages from 2/10 and earlier have a page rank of 1. This must mean that Google has given them some credit for existing some time back. Maybe it indicates that the previous actual update of Page Rank happened around this time - the toolbar page rank is just what is exported on an ad hoc basis. That would mean either that the extra 2 to 3 weeks has let those pages count as more aged, or (and I prefer this theory) that Google is updating Page Rank internally about every 2 to 3 weeks.

Lastly, for some reason the August posts are grey barred. This is probably because the blog was just starting up and the posts were quickly archived, before Google had a real chance to take an interest in the pages.

It's also interesting to note that 2 of the experimental blogs are now page rank 0 (as opposed to grey barred).

These are (of course) the 2 that are linked to. The one with the 'clean' links is PR0 on the home page and on pages up to 20/10 (1 unexplained exception). The one with target="_blank" is PR0 on the home page and grey barred on the rest.

But another similar (and slightly older) test blog remains grey barred, even with links from 1 site that don't use the target attribute. Another blog, again with 2 sites linking to it, starts with a PR0 (it's brand new - links from the last week or so). Looks like there's a big difference to explain there...

I'm sure this is one post I'll be referring to in the future - to compare how page ranks change!

Google Drops Websites

The reason my Mortgage Rate Comparison website so suddenly dropped from first listing to page 5 and then 7 has still not been explained, but it has begun to slowly crawl up the 7th page. Analysing the site, the statistics and the recent changes, there are just too many possible causes and explanations.

Working on the theory that Google will visit approximately once per week (visits to the home page were 6 days apart) then any changes I make could take a week before they are recorded. Typically, I've noticed that after Google has visited a page it's 2 days before you see the cache change.

But it remains a very worrying time when a website drops so suddenly. It's long been a worry of mine that it would happen with this website, but there's just not been the time and effort available to develop a new site to share the load. Other sites that I've got in place for other schemes have been neglected recently due to pressures of work and left to rot, whilst the Mortgage Rates site has been given the attention it needed.

It's a hard lesson learnt - don't put all of your eggs in one basket. Just one site generating a huge percentage of income through search engine optimisation is a disaster waiting to happen.

Friday 26 October 2007

Sometimes People Confuse Me

A really strange request accompanied a new website this week. Could we make sure that the URL we registered was NOT obvious and easy to find??? You would think from such a request that I had taken on a dodgy website, but no, this was a shop owner with a clothes shop on the high street.

He requested that we do everything possible to stop his competitors finding the website and 'stealing' his ideas. But, I tried to point out, if your competitors can't find the site, then surely neither will your customers. If you aren't advertising the website name on the shop front, in adverts, on receipts, on letterheads and business cards and everything else you get printed, how will your loyal customers know where to find your site - or even that you have one?

Would you ever not publicise a phone number so that competitors can't find out what type of phone line you are using and how good your telephone service is? Does it really matter if your competitors see your site - are they really going to copy your ideas (especially when the same web designer is building your site and theirs.....)?

Get your site published. Tell the world it exists and make it as easy as possible to find. It's hard enough getting traffic - don't make it harder for yourself!

Thursday 25 October 2007

When Google Drops Sites

It's very frightening when suddenly your traffic drops and you realise that your site that was top of Google is now down on the 5th page, and your major source of income has dropped with it.

And that's what's just happened to my Mortgage Rates site. I've had it happen to customer sites, and with a little work they are quickly back up again. It can happen for one of many reasons; sometimes it just happens and then a few days or weeks later the site reappears, back where you wanted it.

All it seems to take is for your server to be having a bad day at the wrong time: Google can't reach the site and you drop. As long as the server is back up the next time Google comes visiting, it should recover.

Or it could be that you have made changes to the site that have taken the optimisation over the top - it looks too spammy. Have a think (or keep a log of changes) about what you might have changed over the last couple of weeks and see if the changes can be undone.

It's unlikely with a massive drop that it's all your competitors suddenly outperforming you. More likely it's something you have done, or something that happened on your server.

It's a horrible time, but you just have to sit there, take stock of the situation and see if you can rebuild. You were there once, try to get back there!

But it's also a very good reason not to have all your eggs in one basket - don't rely on just one site and on it receiving visitors from search engines.

Tuesday 23 October 2007

Articles for Link Building

My troubled SEO experiment makes me wonder - is it worth writing articles any more? And by this I mean for submission to article directories, not blogging.

I suppose the answer is – why are you writing them? If you are hoping that loads of people will publish them and thus give you links back, there's a chance you are wasting your time. Some of these people will have ‘forgotten’ to include your link. But that's a minor problem. My experiment showed that Google just wasn't interested in syndicated articles. It ignored the pages. If it's ignoring the pages then it's ignoring the links on those pages.

So if you are writing in the hope of gaining lots of links, stop it. You are wasting your time!

Is there any use in writing articles? I believe so, and as always we are being pushed more towards honest openness with people browsing the internet. If you write interesting articles that people will read on ezines etc and those people then want to read more from you or about what you say, then this is the way forward.

Write articles that make people want to read more about what you have to say. Link to your blog or where you publish your thoughts – or just give the URL, it doesn't need to be a link.

Link building is dying - search engines created a monster and now are going to kill it. Just pity the poor SEO companies that survive on link building.

Monday 22 October 2007

Another Twist in the Tale…

Well here is another interesting twist in the saga of trying to explain missing link backs. After changing tack yesterday, because the duplicate content filter seemed to be doing its job, and linking directly to the new posts, the second blog, the one linked to without the target attribute, has reappeared in the Google listings.

This blog didn't appear there earlier in the day, before I added the link to the unique content, so at first glance it could be that the link without the target attribute has helped.

It might be too soon to draw proper conclusions, but it looks like something useful could come out of this experiment.

Now I just have to work out why the Promotional Items blog is not yet listed! One day, I will understand these search engines!

Sunday 21 October 2007

Turning The Experiment On Its Head

Well here's an interesting little twist to the experiment – playing with duplicate content filters.

I've added to the blogs one entry of my own thoughts. Each one is different - the first is about open water training in Lanzarote; the second is about Advanced Open Water Training In Lanzarote and the third about Diving In Lanzarote.

Over the next week or so I'll post to each blog every few days – my own diving thoughts rather than cribbed articles.

In theory, given time (hopefully no more than a week, but who knows with blogs) the 2 with inbound links should be cached. I'll watch how often they are cached for a month or so, then I'll turn the experiment around.

If the blogs haven't been cached because I've used content from article sites, what happens if I get the blogs cached and then submit the posts to the same article sites?

You would hope that the search engines would see that the blogs were there first and keep them cached. But I suspect that Google will decide that the article sites are older and have a higher PR and instantly drop the blogs.

It will be interesting to see what happens. And there are plenty of conclusions to be drawn if my blogs are instantly dropped.

Saturday 20 October 2007

The First Week’s Results

So I created 3 identical blogs, and every day I posted a different travel article to each blog. One blog didn't have any external links and just relied on the ping. The second had links from PR3, PR4 and even a PR5 site (on all pages of each of those sites). The third had the same links as the second, but all included target=”_blank”.

In popular belief, 2 & 3 should theoretically be seen as the same by the search engines, and 1 should be far less popular.

But, content is king and this experiment has proved that!

Day 1, all 3 sites cached on Google – caused by the initial ‘ping’ from blogspot.

Day 2, none of them cached.

Day 3, site 1 (the one without links) is cached, the others aren't…

Day 4, site 1 still cached and site 2 (without target) has 1 sub page cached.

Day 5, only the sub page on the second site is cached. Great, I thought, the expected results are starting to appear…

Day 6, where have all of the sites gone!!!

It looks like using the freely distributed articles was a big mistake. Google has already detected them as duplicate content and dropped them (and of course the links within them – the authors have wasted their time…).

It's proved to me how effective Google is at spotting duplicate content (there were some minor spelling corrections etc applied to some articles) and the importance of writing your own material.

So I'll be starting the experiment again, this time with 3 sub domains. But first, the task of writing an interesting article for each blog. I might as well leave them running and give each one a genuine piece of new content and see what's happened in a week.

Friday 19 October 2007

The Experiment in Progress

Well the three blogs are up and running and have been for a few days now. I can't put a link to the first blog here - that would defeat the Backlink Experiment! But the second blog and third blog are there for you to see – you might notice the target=”_blank” on the second of those links!

All three have had the ‘about me’ box removed so that there's no linking there and all use exactly the same basic template. I've not set up any adsense or affiliate banners as I don't foresee them having loads of traffic, but who knows!!!

I've also set up links to 2 of these blogs from 6 travel websites that I maintain. All of the links are from PR3, 4 or 5 sites, with the links to the third blog including target=”_blank”.

But what am I expecting to see?

Well the first blog will probably be cached very slowly, if at all, as it is unpopular (no links in).

The second blog should be cached reasonably quickly and hopefully, after a few weeks of hard work, it will get to the point whereby Google shows how many hours it is since the last update of the home page.

The third is a bit of a mystery. In theory it should be similar to the second, but if I'm right I expect it will be listed quickly, BUT then it will be unpopular (no links listed) so it might not get cached as often as the second blog.

I'll be recording the frequency of caching of the home page as a marker of how popular each blog is.

All 3 sites were on Google within 40 minutes of first publishing. Before I had time to link to them. But none were cached within an hour of the second post the following day, unlike when I posted to this blog that day.

The blogs have actually been running for a week now (I thought documenting the whole lot in one go would be too much for anyone to read!!!), so tomorrow I'll recap on the first week's results.

Thursday 18 October 2007

Experimenting With Backlinks

Does target=”_blank” prevent Google from counting a link or not? There's only one way to find out – set up some new web sites and vary the links in.

For this I'm going to set up 3 new (but similar) travel blogs. None will link to each other. I'll post articles from the same source into each one at the same time each day and will allow pings to be sent to search engines. Then I'll watch what happens to the blogs.

The first blog will not have any inbound links to it – this one will be relying totally on pings to get the search engine's interest. This, in scientific / mathematical speak, is the control. Without any links, what happens?

The second blog will have links from a variety of pages, mainly on travel sites. I'll pick a cross section of pages from sites. Some will be pages shown as linking to my own web design site; others won't be. All will be PR2 or above, and most importantly none of the links will include target=”_blank”.

The third blog will have links from the same sources as the second and the links will be in the same area of the page. Sometimes the link will be above the second link, sometimes below. But this one’s links will use target=”_blank” on every link.

I'll talk about the expected results tomorrow.

Wednesday 17 October 2007

Preventing Links Counting

As I mentioned yesterday (Is Link Building Being Ignored) I believe that target="_blank" may sometimes stop Google from counting a link as a backlink. This is a problem for link exchanges, where nearly every link uses it.
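
For anyone not familiar with it, here are two example links (the URL is just a placeholder), identical apart from that one attribute:

```html
<!-- A normal link: the visitor leaves your page for the other site -->
<a href="http://www.example.com/">A normal link</a>

<!-- The same link with target="_blank": it opens in a new window, keeping your page open -->
<a href="http://www.example.com/" target="_blank">A link that opens in a new window</a>
```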

First, what does this code mean? It opens the link in a new window so as not to lose the visitor to the other site. So I suppose it's saying “Give them the information on the page, but it’s not important to me and I want my visitor back.”

If it's not important to the web designer – and remember the search engines typically open their search results in the CURRENT window – why should the search engine find the link important?

If you don’t really want to point traffic to the other site, then why should the link really be important? If it's not important, then why should it count? If it doesn't count, Google shouldn't show it and search engines should ignore it.

So it would make sense for target=”_blank” to cause search engines to ignore a link. But not fully. It's still a valid link and they do follow it – I've had too many sites discovered quickly by search engines when the only link used target=”_blank” to believe it stops them following the link. But I do believe it stops them from counting the link.

So how can we find out if this is true? Well, I'm starting an experiment to find that out.

Tuesday 16 October 2007

Is Link Building Being Ignored?

Examining the sites linking to my own site (read this post for details) I've come across a possible pattern. At first, you would think it couldn't be true. Until you think about it.

I've noticed that almost without fail, all of my own sites listed as linking back to my own site have a PR2 or better and the link does not include target=”_blank”.

There was an exception. There's a PR4 travel site that is shown as linking to my web design site – but it does use target=”_blank”. I thought I'd broken the pattern, until I realised that hardly any pages on the travel site were listed – in fact only the links pages. When I looked carefully at the links pages, there was a second link to my web design site, without target=”_blank”.

So from first impressions, it's highly possible that target=”_blank” stops a link from being listed in Google. I wouldn't go as far as saying it stops Google following the link, only it’s not shown as a backlink.

I've still got loads of non-listed sites to check, but I’ll explain why I think Google might ignore links with target=”_blank” tomorrow. And this is bad news for most link building – nearly every link swap in history must use this piece of code!

Monday 15 October 2007

Returning Customers...

It's always nice when a customer's business is doing well and they branch out and return for a new website. It's really encouraging and complimentary when the email telling you that a new website is required arrives, without any question of price etc.

In today's case, instructions for a website for a Puggle Breeder arrived. The customer gave clear instructions on how the existing London Dog Walker site should have a new page added to link into the new site. The pictures for the new site arrived, brilliantly sorted by page, along with a Word document of her preferred text.

It's wonderful to have everything so easily to hand, but even better that customers trust me just to get on with the work and present them with a site they will like. There was no mention of style, just keep it the same colours as the first site.

In this business reputation is everything, and once you start to make contacts they spread your reputation. I'm looking forward to plenty more work coming my way from her friends!

Is Most Link Building A Waste Of Time?

As I said in yesterday’s post I've started to investigate why only 13 of my own sites show as linking to my site in Google.

I've identified a list of sites that I will look at carefully, excluding any published / taken over by me since April. This becomes a massive task – I've got to look through the 13 sites that are shown by Google as linking, document which pages are linking and examine the link on those pages. Then I've got to look at the home pages of the other sites and see what elements are different. There are patterns forming – but not surprisingly every so often there’s a site that breaks the rules.

But the fact is that only 13 of about 100 are shown as linking to me, and there has been no intention at all of breaking the link back. Therefore, if you are spending hours link building, what's the chance that Google is taking any note of those links? Out of the hundreds of link swaps I've agreed over time, hardly any appear in the Google search. And I wouldn't be surprised if I have spotted at least a couple of reasons why.

If I'm right – and I'm going to try to prove this theory with a small experiment – then it could mean that most link swaps you have ever agreed are a waste of time.

Sunday 14 October 2007

When Is A Link Not A Link?

We've all been sold the benefits of link building and we know full well that if we don’t have any links into a site it won't get listed on the search engines, or at least it won't stay there for long. But how much time and effort do you spend on this and is it really necessary?

SEO companies will tell us it is vital. And as these places are successful, it probably is. There are loads of pros and cons; I'll come back to them in a few weeks.

But a couple of weeks back I was looking at what sites link into my main web design site. Currently I've got around 100 sites live that I've built for customers, all of which should be linking to me.

When I look at Yahoo, there are thousands of links into my site. But Google shows only about 50 pages. And I would expect that Google won't count links it doesn't display in its PR algorithm.

This was very puzzling, so I started looking deeper into it: the approximately 50 pages linking in represent only 13 of my own sites. What's happened to the rest?

So I started looking at the listed pages and the sites that weren't listed as linking. I don't know which is most beneficial (looking at the included or excluded), but this week I'm taking a look at the results and I'm going to start an interesting experiment…

Saturday 13 October 2007

Learning From Successful Keywords

If you are carefully watching what pages people are visiting and what keywords they are using to get there (and which search engines they are using...), it won't take much effort to go one step further and work out what you are doing well on those pages.

You might find that you have accidentally optimised a page for a certain keyword, or that you have lots of internal links to a page that is generating a lot of traffic. Look for these sorts of quirks in your traffic and, as well as using them to target more sales from that page, review how the keywords relate to that page.

You are now in the position of being able to create new pages using different keywords and the same strategy as the successful page. It might be that you need a lot of pages, each with genuine information about different subjects, to get the required number of visitors. This can be a lot of effort, but a lot less than is required to get to the top (and remain top) for a hotly contested keyword.

Friday 12 October 2007

Reviewing Successful Keywords

Well my hosts have got my stats back up and running and are in the process of filling in the missing days, so now I can look at what's working on the site.

One of the first things I noticed when I reviewed what pages were being hit and what search terms were getting people there, is that a misspelling on one page was bringing in traffic. In short, AllianceandLeicester.co.uk was generating some traffic for those sorts of keywords.

When I reviewed the page, it had just 1 link to the Alliance & Leicester website, whereas through my affiliate schemes I was able to display 5 different adverts, with another couple I could sign up for.

This was a very useful weapon. If you know what search terms your visitors are using to find a popular page, then you know what they are interested in. Therefore, you can give them more of what they want. In this case, more links to different parts of the Alliance & Leicester website.

This is a big advantage of monitoring and reviewing successful keywords. You can be certain of why people are visiting you and use that knowledge to convert more sales / leads.

Thursday 11 October 2007

Traffic Monitoring

It is a little more than ironic that in a week when I'm spouting on about how important it is to run accurate monitoring of your website traffic, my own host's system for displaying and reporting traffic goes down totally.

So far, it's around 48 hours and counting since I was able to access my traffic reports. Also ironically, Google adsense reports are showing me that the traffic is really high this week and I'm desperate to know why!

What have I done well that's suddenly bringing in a lot of traffic - but at the same time leads are down compared to last month?

It's that sort of question that you must be able to answer. Long term monitoring of traffic would tell me that when certain keywords are performing well I get a lot of leads, when others perform well I get a lot of adsense. Then I'd know which keywords to push to really increase leads.

Hopefully the reports will be sorted today.

Wednesday 10 October 2007

Monitoring Traffic - When it Goes Wrong!

Of course, all of these ideas are great whilst they work!

Looking at my stats for yesterday, it appears that my main site had 0 visitors, 0 page impressions and 0 search engine activity. Hmmm. Considering how much it made in AdSense, not very likely.

Resorting to the raw server logs, they also agree - absolutely no activity. But Google is showing massive amounts of traffic and very respectable earnings.

This does come at just the wrong time as I've got an advertiser who wanted to know which relevant pages were doing well this week so that his advert could go onto them next week. This is where having multiple ways of checking traffic is a bonus, but it's not usually critical, so not really worth the bother.

But it would be possible to set up Google AdSense to monitor exactly which pages clicks are coming from, and this would also give me a view of the relevant traffic. Quite an easy way around the problem of monitoring traffic.

Back to the subject of monitoring traffic tomorrow - hopefully I'll have some logs to look at myself by then!

Tuesday 9 October 2007

Why Should You Monitor Site Traffic

I've already looked at How to monitor website traffic, but why? What are the advantages?

Quite simply put, if your website traffic is at a good level now, why should you spend time & money looking at where it's coming from?

Put it another way - if in 6 months time your website traffic has dropped - what are you doing right now to raise that traffic?

That's the key to monitoring website traffic. If it's good now, you want to keep it that way. Run reports, find out which search engines are sending visitors for important keywords and what pages they are being sent to. And record this information.

In the future, if traffic is low or sales are lower, then you can easily look back to what's happening now and compare results. If traffic is the same but sales lower then it could be that you are being found for different keywords. These could be keywords that aren't as important or relevant.

If you have a change log for the site, you can then look back and see what's happened and maybe you can quickly reverse the changes and get the traffic back up again.

Traffic comes in peaks and troughs, so it's easy to miss a gradual change, and by the time you have noticed, the server logs have gone. So record key weeks while you have the chance.

Tomorrow - we'll look at what benefits we can derive from studying traffic.

Monday 8 October 2007

Monitoring Your Site Traffic

If you are not already monitoring your website traffic, then you really should be! But how can you do this and what do you need to know?

Knowing how many visitors are finding your site is excellent, but what about when it drops or spikes? How can you reverse a drop? What can you do to repeat the spike? You need to know how people are finding your site and which pages they are arriving on.

The simplest form of this is just to look at your server logs. A simple log will tell you what search engine people arrived from, the search term and the page they visited. If you don't have huge mounds of traffic, then this is probably sufficient.

If you are a bit more technically involved, then you could write a piece of code to read through the logs and display counts of the above stats. Easy enough, when you know how.
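
For example, a rough PHP sketch along these lines will read an Apache-style 'combined' log and count search engine referrers and landing pages - the log path and the short list of engines are just assumptions for the example:

```php
<?php
// Rough sketch: count search engine referrers and landing pages from an
// Apache "combined" format access log. The path and engine list are examples only.
$logFile = '/var/log/apache2/access.log';

$lines = @file($logFile);
if (!$lines) {
    die("Couldn't read $logFile\n");
}

$engines = array();
$pages   = array();

foreach ($lines as $line) {
    // Combined format: ... "request" status size "referrer" "user agent"
    if (!preg_match('/"([^"]*)" \d+ \S+ "([^"]*)"/', $line, $m)) {
        continue;
    }
    $request  = $m[1];
    $referrer = $m[2];

    // Only count visits referred by a search engine.
    if (!preg_match('/(google|yahoo|msn|ask)\./i', $referrer, $e)) {
        continue;
    }
    $engine = strtolower($e[1]);

    // Landing page is the second word of the request line, e.g. "GET /page.html HTTP/1.1".
    $parts = explode(' ', $request);
    $page  = isset($parts[1]) ? $parts[1] : '(unknown)';

    $engines[$engine] = isset($engines[$engine]) ? $engines[$engine] + 1 : 1;
    $pages[$page]     = isset($pages[$page])     ? $pages[$page] + 1     : 1;
}

arsort($engines);
arsort($pages);
print_r($engines);
print_r($pages);
?>
```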

For some of my own sites (those written in PHP) I have a small script that I can drop into the trailer of the page (all of my PHP sites have a trailer file - update the file for that site and every page on the site is updated). This little piece of code examines the referring page and, if it's a search engine, sends me an email telling me the page found, the search engine and the search term. Great for new sites, but more difficult when a site is getting thousands of hits per day.
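
My own script is specific to my sites, but the idea is roughly this - the email address and the list of engines below are placeholders, and different engines use different query parameters for the search term:

```php
<?php
// Rough sketch of a trailer snippet: email me when a visitor arrives from a search engine.
// The address and the engine list are placeholders.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if ($referrer && preg_match('/(google|yahoo|msn|ask)\./i', $referrer, $m)) {
    $engine = strtolower($m[1]);

    // Most engines put the search term in the "q" or "p" query parameter.
    $params = array();
    $query  = parse_url($referrer, PHP_URL_QUERY);
    if ($query) {
        parse_str($query, $params);
    }
    if (isset($params['q'])) {
        $term = $params['q'];
    } elseif (isset($params['p'])) {
        $term = $params['p'];
    } else {
        $term = '(unknown)';
    }

    $page = $_SERVER['REQUEST_URI'];

    mail('me@example.com',
         'Search engine visitor: ' . $engine,
         "Page: $page\nEngine: $engine\nSearch term: $term");
}
?>
```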

In that case you usually have to go to your web host and see what statistics packages they offer. Usually these will be all-singing, all-dancing packages. My own host charges about £12 a site for these, so once the site is earning it can be a worthwhile investment.

Sunday 7 October 2007

Watching Your Traffic

How closely do you watch the traffic entering your site? I'm sure you have a good idea of how many visitors usually hit your site and the number of orders / sales / commissions, but do you watch for trends on your website? Do you take a look to see what keywords are working for you, what pages visitors are being sent to and if they are getting the information they want on those exact pages?

I admit, I don't do that carefully enough on my main sites. It's only when the income drops for a few days that I look. By then, it's too late to see what's been working.

Yesterday I was looking at one site and noticed that a misspelling of AllianceandLeicester.co.uk generated a lot of traffic. So I took a look at that page and was surprised that it was a PR1, yet was only linked to from 1 other page on the site, which is a PR0.

Over the next few days I'll take a look at:
- how you could be monitoring your traffic
- why you should be monitoring traffic
- looking at what keywords are working and getting more from them
- learning from your successful keywords

Come back soon!

Saturday 6 October 2007

Being Top Is Not Always Best

You would think that having a site in top position on Google for a major search term would be a sure fire way of getting plenty of traffic.

Recently I've had a site work up through the listings for the term Compare Mortgage Rates. It's hovered between 2nd and 4th for a while now, with quite impressive volumes of traffic.

But this week it's made top position. First site of 6,000,000 results. Very impressive indeed and a result after 3 years of hard work on the site. But strangely I've noticed a drop in the traffic this week!

Whether this is because the top place is immediately under the paid adverts and is ignored or whether the traffic generally for the keywords is currently down, only time will tell. Or maybe the site has dropped for lesser keywords. It's just so ironic and annoying!

Friday 5 October 2007

Graphic Designer or Web Designer???

I was talking yesterday with a customer about the pros and cons of dedicated graphic designers against website designers.

He was telling me about a guy he knows that is a fantastic graphic designer and creates some really special looking websites. But they are all created in an art package, chopped down and published as images.

Not only are they impossible for the search engines to read, but he and his customers find them hard to maintain. This story came from my telling him about a site I'd picked up recently, also designed by a graphic designer. This one used more code, but he'd panicked when asked to implement drop-down boxes in a search and tried to charge £150 for the changes. So his customer came to me and I'd done the changes in 5 minutes...

But there is a balance. Whilst neither of these 2 guys produces the technical sites I can, I would never be able to produce sites with the class of graphics that they have. And I think, finally, I've got the answer.

A third graphic designer contacted me a few weeks back. He wanted a few simple amendments to a site he was building, but didn't have a clue where to start. And it looks like the 2 of us could be working together going forward. He's already put together an animation for me.

He'll be producing the layouts for the websites, along with all the graphic work, then passing me the work of realising that site. When needed, I'll also pass him work for creating graphics.

Everyone wins here. The customers get fantastic looking sites with a whole host of complexities and the two of us are getting paid.

So Graphic Designer or Web Designer? Neither - use both. But remember that you'll be paying for both of their time.

Thursday 4 October 2007

When Communication Goes Wrong

Recently I started taking on work through a new sales person. He's producing plenty of new work, but being the middle man between me and the customer does add its complications.

At first the idea looked simple. He speaks to the customers, passes me their requirements, I build the site and he takes it back to them and talks about revisions.

On the whole this is great. But recently I produced an A and a B sample for one site. Personally, I thought the A sample far inferior to the B sample, but he talked with the customer and they agreed on A. I was concerned that it didn't look like it was trying to sell the product and wasn't 100% great, but the customer is always right.

We produced the full site, they signed it off and we published it. Then about a month later, after I suspect the customer had shown their new site to a competitor who had also had a new site built, they came back to us annoyed that the site wasn't as commercial as their competitor's new site and showed us the site.

No doubt they had seen the link from the competitor's site to the web designer and seen that he charged fees starting at three times the price they had paid, but they were unhappy. We've offered to make changes to the site, for free, but getting material out of them for these changes is a long slow process.

The moral of the story - don't sign off work that you aren't happy with!

Wednesday 3 October 2007

Case Matters

If, like me, you are used to working in a Windows environment for hosting your websites, then there's a chance that (like me...) you don't really pay too much attention to the correct case for file names. Why should you - it doesn't make any difference, does it?

For example, page.html and PaGe.HtMl are both the same on Windows 2003.

But not on Unix!

If you try to access Page.html by typing page.html, it won't work on Unix.

What does that matter to us Windows users - surely it's irrelevant.

I've just had to move a customer's website to a Unix server because he wanted to use a new package on the site and it's not compatible with Windows. I started moving pages over and came across some pages where the case of the links didn't match the actual page names. I corrected the links, reviewed the pages and to my horror noticed the once PR3 pages were suddenly grey barred!

This was easily solved - I just had to change the page names (and revert the links) so that the case matched what Google had cached and immediately the PRs were reinstated.

So this leads me to draw an interesting conclusion. If your website is hosted on Windows and you aren't consistent with the case of page names, then you could have some links to page.html, others to Page.html etc. Google is going to see these as 2 separate pages and penalise one for being duplicate content, whilst giving the other only some of the internal linking benefit it should have.

If you are on Windows, make sure that you still follow Unix file naming!
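
If you want to check an existing site for this, something along these lines will flag relative links whose case doesn't match the file on disk - the path is just an example and the regex only catches simple href="page.html" style links:

```php
<?php
// Rough sketch: report links whose case doesn't match the actual filename.
// Run it against a local copy of the site; the path is an example.
$siteDir = '/path/to/local/copy/of/site';

// Map of lowercased filename => actual filename on disk.
$actual = array();
foreach (scandir($siteDir) as $file) {
    $actual[strtolower($file)] = $file;
}

foreach (glob($siteDir . '/*.htm*') as $htmlFile) {
    $html = file_get_contents($htmlFile);
    preg_match_all('/href="([^":\/]+\.html?)"/i', $html, $matches);

    foreach ($matches[1] as $link) {
        $key = strtolower($link);
        if (isset($actual[$key]) && $actual[$key] !== $link) {
            echo basename($htmlFile) . " links to '$link' but the file is '" . $actual[$key] . "'\n";
        }
    }
}
?>
```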

Tuesday 2 October 2007

Affiliate Linking In Blogs

What is the most efficient way to link to affiliate schemes within your blog?

Well here I'm not talking about the high earning affiliate for whom the blog is an essential sales tool - they know exactly how to work it. I'm referring to those who blog for fun and then decide that a product or two could be included as affiliate links to try to earn a bit of extra cash.

Here's the way I would do it.

First, assume I have a successful marathon blog and have decided to buy a pair of shock absorbing soles - and they are available through an affiliate scheme. Well in one post I would say I've heard how good they are and am going to try them. From this post I would use an affiliate link to link directly to the item. Avoid linking to the merchant's home page - if someone is interested they might not find the product. Send them directly to where they need to go.

The next thing is that these fantastic products arrive. Another post describes them, what they look like etc, before I've tried them. But this time the post doesn't link to the item; it links to the FIRST post, in which there is a link to the item.

Yes, this goes against what I said for the first post - it makes the item harder to find. So make the first post short and snappy, and the link easy and clear to find.

But why add in this extra step? Because if you add in an internal link, especially one using the product name as the anchor text of the link, you are building up your internal linking, and this is good for the search engines.
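
To illustrate, the two sorts of links might look something like this (the URLs and affiliate ID are made up for the example):

```html
<!-- In the FIRST post only: the affiliate link, straight to the item -->
<a href="http://www.merchant-example.com/shock-absorbing-soles?affiliate=12345">shock absorbing soles</a>

<!-- In every later post: an internal link to that first post, with the product name as the anchor text -->
<a href="/2007/10/shock-absorbing-soles.html">shock absorbing soles</a>
```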

Later, add more posts as you try the items, start to feel the benefits etc. In these again you always link to the first post, not to the merchant.

This also has other benefits. Say you notice the same item for sale cheaper, by a merchant offering more commission; or the merchant closes their scheme or changes the page name. Using this method you can just pop into the first post and update the link. Otherwise, you would have to search your blog and update every link.

Also, there's a theory that affiliate links can damage your page ranking (keep coming back and I'll tell you why I believe this). Therefore, by keeping the affiliate link to just the one page, rather than loads of pages, if there are any consequences they are limited to just the one page.

Hope this helps.

Monday 1 October 2007

Getting Listed On Google

So just what does it take to get listed on Google and stay there?

You must have something that the search engines want - unique content. I've recently seen a couple of new sites that I've worked on appear on Google and then totally vanish. Nothing that I try can get the sites back. But when I looked around, the text used wasn't unique.

In one case the text on the site (a single pager) had also been used for a listing in a large directory. Although the directory listing was added after the site went live, the directory had been cached by the search engines long before the customer's site was registered and published.

It seems that the only answer, to get the site listed, is to get the text in the directory listing changed. It doesn't have to be changed by much, so long as it looks different.

The other site that this has happened to slowly fell out of Google's listing. This time it was a blog publishing phishing emails, and there are other sites doing exactly the same. The answer there is to add a load of text around the actual email and make sure that there isn't too much repetition (too many similar emails were also being published).

Give the search engines something new and hopefully they will treat you well. Give them what they already display elsewhere and you might find you don't have any traffic.