Web Design Industry Blog


How Should Web Designers Manage Outdated Web Browser Technology?

Published on March 31, 2011
Tags: Internet Communication

For the last couple of weeks, we have written about improvements to online technology (the Google Farmer update and the Chrome Personal Blocklist) that have the potential to make users’ experience of the internet noticeably better. This week, however, our attention turns to outdated technology, which can have just as big an impact on the internet, but this time for all the wrong reasons.

One of the most obvious examples here is Internet Explorer 6. This is the web browser that Microsoft released back in 2001, when Windows XP was just starting to make waves. The IE6 browser was somewhat problematic from the start: it didn’t properly comply with the web standards of the day, and sites built around its quirks often failed to display correctly in other browsers and on non-Windows operating systems. As this was in the days before Apple became such a big name, Microsoft was the dominant force in desktop computing, and so IE6 became the most used web browser, making it almost impossible for competing browsers to gain ground.

The flaws of IE6 were among the issues that led, through the early and mid 2000s, to a renewed push for web standards. This brought increased competition, most notably from Apple’s Safari browser and Mozilla Firefox, and so Microsoft had to act to make its browsers more standards-compliant and better behaved across other operating systems. Eventually IE7, IE8 and now IE9 followed, ironing out many of IE6’s compatibility issues and helping to make the web browser market more dynamic and competitive than ever before.

This increase in competition has generally been positive: with more players on the field, browser makers now compete with each other to produce the fastest-loading pages, the fastest downloads and the most comprehensive experience. This is great for web designers, as it means their creations are much more likely to end up looking exactly as they intended, no matter which operating system or browser is used to display them.

However, IE6 still refuses to die. Even though less than 4% of web users in Europe, America, Brazil, Russia and New Zealand have IE6 as their browser, it still accounts for 12% of users across the world. This causes massive problems for web designers, particularly in Asia where IE6 is still heavily used, as it stops their designs from displaying properly and drags down the quality of the online experience for users. It means that designers not only have to build their sites to make the most of new technology, but also have to cater for the sizeable number of people using outmoded browsers. It also means that Microsoft loses advertising revenue whenever people search on IE6, for complicated reasons involving default search engine settings, which differ between IE6 and later browsers.
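To give a flavour of what that catering involves, here is a minimal sketch of one common approach from this era: serving an extra stylesheet only to IE6 via Microsoft’s conditional comments, so that visitors on the old browser get simplified rules while everyone else gets the full design. The file names are illustrative placeholders rather than anything prescriptive.

    <!-- Main stylesheet for modern, standards-compliant browsers -->
    <link rel="stylesheet" href="styles.css">

    <!-- Conditional comment: only IE6 and older will read this block, so it
         can load simplified layout rules or fixes for IE6's quirks.
         "ie6-fixes.css" is just an illustrative placeholder. -->
    <!--[if lte IE 6]>
      <link rel="stylesheet" href="ie6-fixes.css">
    <![endif]-->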

It isn’t just IE6 and other outdated web browsers that have the potential to cause problems, either. In this age of search engine optimisation, the speed at which a web page loads is also really important. This can involve a delicate balancing act of coding, Flash and other multimedia in order to accommodate the whole of the market: while many people now have high-speed broadband, others still use dial-up connections, and others again use mobile internet with varying speeds. Web designers always have to be conscious of this, as it is important that websites reach the maximum number of web users as easily as possible, but there is also a dilemma that unfurls here.
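As a small, hedged illustration of that balancing act: one widely used habit is simply to stop non-essential scripts from holding up the first render of the page, so visitors on slower connections see the content sooner. The script name below is an illustrative placeholder, not a recommendation of any particular tool.

    <!-- Non-essential scripts placed just before the closing body tag, and
         marked 'defer', so they don't hold up the initial page render on
         slow connections. "analytics.js" is an illustrative placeholder. -->
    <script src="analytics.js" defer></script>
  </body>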

How long should you accommodate slower and older technologies before moving on and leaving them behind? It’s a tricky issue because, even though the web design industry moves fairly quickly, the ordinary web user isn’t always quite so fast, for any number of reasons. It could be the affordability (or lack thereof) of new technology, restrictive policies or laws limiting what people can install, connections too slow to download newer versions comfortably, or any number of other things. If you are a web designer or company and a significant proportion of your online base is still using slow connections or old browsers, you risk alienating them if you move on from that technology too soon.

The flipside of the argument is that unless website designers move on and take up new technology when it arises, people will have less of an incentive to move on with it. They have less of a need to upgrade their systems if the old ones are still being supported. So a line needs to be drawn in order to strike the right balance between old and new, progress and accommodation.

The hard part is deciding where that line needs to be in the first place.

By Chelsey Evans


The Chrome Personal Blocklist - What If We All Block Google?

Published on March 29, 2011
Tags: SEO

We wrote last week about the Google Farmer / Panda update: changes to the Google search algorithm that have had an impact on around 12% of searches since the update was launched in the United States, lowering the rankings of low quality sites and ‘content farms’. First analyses suggest that the Farmer update has been largely successful, with improved search results and the impact falling where it was intended to.

This is all great and it bodes well for when the update is rolled out in the UK and across the rest of the world over the coming weeks and months. One issue, though, has made us think a little bit and has got us wondering what might happen if some, shall we say, ‘unintended consequences’ occurred as a result.

That issue is the Google Chrome Personal Blocklist. You are no doubt aware that Chrome is Google’s web browser, launched to challenge the likes of Internet Explorer and Firefox. The Personal Blocklist is a recently introduced extension for Chrome which, as the name suggests, lets users block certain websites from appearing in their Google search results. So, if they were to do a web search through Google and the top results turned out to be low quality sites such as content farms, they could block those sites so they wouldn’t appear in their search results again. Any blocked sites are also sent to Google for analysis, and Google admits they may be used as a ranking signal in the future.

On the face of it, this is a really useful tool and it seems to be working well so far. Google says that people have benefitted from better search results, which really should be the ultimate aim. 

Another thing to note is that the sites most affected by the Personal Blocklist are largely the same sites that have been impacted by the Farmer update.

Google says that it didn’t use Personal Blocklist data to inform the Google Farmer / Panda update, and it can most likely be trusted on this – the Blocklist hadn’t been around long enough to be incorporated into the Farmer update and is still at an experimental stage. So the overlap appears to be a coincidence that backs up the success of the Farmer update, which seems to have been based on analysis of user click data and web content quality (such as sentence structure and keyword placement). If both web users and Google are coming to the same conclusions about websites, then something must be going right.

Some evidence does suggest, though, that the sites most commonly featured on Blocklists then fell down the Google rankings. So, we were wondering… what if everyone decided to block Google? Admittedly, this is a far-fetched possibility and would never actually happen, as it would be foolish to abandon one of the best and most extensive search engines out there. But if everyone did do it, what would happen?

  • Scenario 1: Nothing at all. But, it’d give the Google engineers something to smile about in seeing their technology in action.

  • Scenario 2: On a personal level, searching for Google on Google wouldn’t show any results. But, on a global level things carry on as normal.

  • Scenario 3: On a personal level, searching for Google on Google wouldn’t show any results, and on a global level Google’s PageRank is reduced algorithmically and it receives a +100 positions penalty (we bet around 12% of US websites are wishing this were the case right now!).

  • Scenario 4: You can’t access Google via Chrome at all.

  • Scenario 5: Google sulks, decides it doesn’t like people anymore, and targets the primate market instead. After all, according to the Infinite Monkey Theorem, give enough monkeys typewriters and they’ll eventually reproduce a work of Shakespeare, so give enough monkeys access to Google and who knows what our genetically close cousins could come up with… so Google Monkeys, anyone? (Please, Matt Cutts of Google – can the next update be codenamed Google Monkeys? After all, we’ve just had Pandas!)

OK, so we’re speaking in jest here, but it does raise a real point: should you just block websites because they’re low quality, or for other reasons too? For instance, should people take a moral stand against certain companies by blocking their sites through Chrome’s Personal Blocklist and punishing them through reduced search rankings?

After all, for all the good that business has brought the world, it also has its darker side, and barely a week goes by without some news report or other about underhand dealings and morally suspect happenings. The internet has proved to be a useful tool for protest before: what if blocking websites could be used in much the same way, to lodge a protest against a company or even a government?

It might not make much of a real impact on businesses just yet – if people were to block sites en masse for moral or ethical reasons, it would most likely be big business bearing the brunt, and so they’d be able to withstand any action, especially as Chrome is still a minority web browser (sorry Google – we love it, but it’s still true). Additionally, blocking would be unlikely to make it into a ranking signal unless the site being blocked was genuinely suspect. But on a personal search level, if the site no longer appeared in any of an individual’s results, and if this were done en masse, it might just make a point. This would be particularly noticeable if the site in question used Google Webmaster Tools and the owners saw a drop-off in page impressions.

Of course, the most likely scenario is that all will continue as before: the Farmer update will roll out across the rest of the world and improve search results by weeding out low quality sites, and people will continue to do much the same thing with the Chrome Personal Blocklist. It does raise the issue, though, that sometimes technological advances have more potential than we can ever realise at first. It’s certainly one to ponder.

If you’ve any thoughts on this article by Ampheon Web Design London, we’d be delighted to hear them – why not post your comments, or alternative scenarios, below?

By Chelsey Evans


Google Farmer / Google Panda Update: UK Website and SEO Advice

Published on March 18, 2011
Tags: SEO

When you search for information on Google, you might find from time to time that the search results aren’t exactly what you’re looking for. You sometimes find yourself faced with web content that is below par and nowhere near as useful as you might expect. Google recognises that this is a problem, which is why it has introduced what is known as the Google Farmer (or Google Panda) update.

What exactly is the issue?

The content issue is largely caused by what are known as ‘content farms’. These are clusters of sites that have low quality content but often do well in search results because they are heavily optimised around popular keywords. Then, the more people who click through to these results assuming they’re relevant to their search terms, the more popular they become, even though their content is lacking.

There is also an issue with websites that produce only small amounts of original content. Sometimes these are legitimate sites, with RSS feeds that are fully authorised, but other times they’re sites that have simply copied information from elsewhere. 

Google wants to improve the quality of the sites it shows you when you search for something, hence the introduction of the algorithmic change known as Google Farmer (or, officially inside Google, ‘Panda’).

What is the solution?

While Google isn’t officially targeting the Farmer update at the issues described above, the changes they are making primarily affect content farms and sites with ‘shallow’ content. They’ve done this by tweaking their search engine algorithm so higher quality results feature higher up on their lists. This change has only taken place in the US so far, but it’s reportedly had an impact on around 12% of US search results and is currently undergoing international testing, so it should be rolled out around the world, including the UK, soon.

Another update has also gone live to deal with the websites that have high quantities of copied content. This hasn’t had such a wide impact as the Farmer update, but reports suggest it’s definitely made a difference.

Are there any issues with the update?

On the whole, the Google Farmer update is positive for the internet user’s experience, as it makes it more likely that web users get search results that are relevant to what they’re looking for and of good quality. Sites with low-quality content are more likely to fall down the rankings as a result, which is good both for web users and for website developers who put a lot of effort into their sites, as these will naturally benefit.

One issue that does arise, however, is the fact that websites are constantly under development and so might register as being ‘low quality’ for a time, affecting their search results. For example, if a website is trying to start a debate amongst its users, it might just post a short piece of holding text on a page to start things off. Until people comment on the page, though, it could be viewed as low quality by the search engines.

How can this be avoided?

One solution here is to block pages from the search engines until you’re certain they’ll pass the algorithm’s analysis, and to put as much effort as possible into generating good, useful content. As a rule of thumb, make the text on each page of your site unique and at least 300 words long. This means that when Google crawls the pages, it is more likely to prioritise them in search results. If you keep your website up to date and filled with high quality information, the site will also be scanned more regularly, meaning that any changes you make will be picked up more quickly and your search engine positions will therefore update faster.
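For those unsure how to keep an unfinished page out of the index in the meantime, here is a minimal, hedged sketch of the standard mechanism: a robots meta tag in the page’s head. Remove the tag once the page has enough unique content, and it can then be crawled and ranked as normal.

    <!-- Placed in the <head> of a page that isn't ready yet: asks search
         engines not to index this page, while still following its links.
         Delete this tag once the page has enough unique content. -->
    <meta name="robots" content="noindex, follow">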

You can also make sure you don’t have too much advertising on your site. Of course, advertising can be a good and practical thing to have on your site, and having it doesn’t by itself mean the site will appear ‘low quality’. However, too much advertising can make your site look like a content farm, short on unique content, and it detracts from whatever content there is on the page. If Google sees your site in that way, the results could be damaging to your positions.

One of the main things you can do to make sure your websites benefit from the Google Farmer update is to make sure you tailor your content for your users, not the search engines. It can be all too easy to structure content around certain keywords that register highly in search engine results, but as Google makes more and more changes to favour high quality content, this strategy is increasingly risky. Make sure your users are getting useful, relevant content that’s tailored to their needs and you should have no problems in getting the results you want to achieve.

A final thought concerns the writing of the site text itself. We have come across numerous ‘professional’ website design companies where the site text has been lifted word-for-word from our site. And, of course, we know we wrote the original text! When we have contacted the directors of such companies, they have very often claimed that the writing of the site text was either outsourced or handed to an employee of the company. So, if you’re responsible for your company’s website, make sure that whoever writes the text produces unique copy rather than lifting it from other sites (which is, in itself, a breach of copyright anyway).

If you’d like further advice on the Google Farmer update, have noticed a drop in your own Google rankings, or just need general web design advice, please contact us today for a free, no-obligation discussion.

By Chelsey Evans


How Involved Should the Government be in the Internet?

Published on March 11, 2011
Tags: Web Site Law

One question that seems to come up from time to time is exactly how far governments should involve themselves in the Internet. Over 1.6 billion people across the world now have web access and this figure grows daily: control of the internet is, unsurprisingly, sought after by governments and businesses alike. One of the key selling points of the World Wide Web has always been its independence – an entity that transcends nations and unites communities across the world with relatively little interference from governments.

But is this a good thing? As with any question, there are at least two clearly defined sides to the debate.

On the one hand, governments have a duty to protect their citizens, across all platforms. It would be hard to find many people who disagree that Internet use ought to be restricted for people who pose a threat to others, be they terrorists, paedophiles, hackers or other malcontents. Some level of government oversight in this area is a good thing: in the manner of the police, it keeps the web as safe as it possibly can be for the majority while being tough on the criminals around the edges.

Also, the world is increasingly reliant on the Internet for services. As many companies move their operations online – and as government services themselves migrate to the Internet – there is a need for proper investment in infrastructure. This is needed on such a scale that it has to be coordinated by the government: in the UK, much of the broadband upgrade might be carried out by BT and Virgin Media and of course it’s in their commercial interests to do so, but without directives from the government, this project would be much harder. The EU requires that all citizens should have broadband access by 2013 and it’s hard to see how this would happen without involvement from the government.

The threat of cyber terrorism also warrants government intervention to shore up communication systems and ensure the country’s infrastructure could withstand an attack. The UK government is currently spending £650m on cyber security and the threat is considered to be as serious as terrorism. Government spending on such security is vital to maintain the free and open nature of the web. After all, the infrastructure has to come from somewhere.

From this, we can see the different areas where the government should be involved in the Internet – in terms of provision and security especially. But what about the other side of the debate? How much involvement is too much?

One of the main issues that come up on this side of the debate is the idea of the Internet ‘kill switch’. We’ve seen this recently during the Egyptian protests when people across the country suddenly found that they couldn’t get online. Other countries have their web use restricted and censored as a matter of course: when the Internet is supposed to be open to all users, why, the argument runs, should governments decide what information people should or should not be allowed to access?

There is, though, a much wider issue at stake, and it’s been around at least since the late 1990s. Back then, the administration of US President Bill Clinton developed a plan called the ‘Federal Intrusion Detection Network’. Had the plans gone ahead, it would have required major companies to run their Internet connections through the federal government, for ‘safety’ reasons. While you could argue that this was a shrewd plan to protect key industries in the event of a cyber attack, on the other side of the coin it can be seen as government meddling gone too far, and it opens up political questions as well as security ones.

The ‘Fidnet’ plan never went into practice, but now there is a bill going through the US Congress called the Cybersecurity and Internet Freedom Act of 2011. The bill makes provision so that, in the event of a ‘cyber emergency’, the White House can issue directives to Internet companies with which they must ‘immediately comply’. That the regulations governing the Internet could change on such a whim raises the question of how much security is too much, and the vague definition of a ‘cyber emergency’ raises questions of its own.

The backing for this bill comes from the (alleged) ability of cyber terrorists to shut down whole cities. That might be the case, but the same fears were voiced when Fidnet was proposed and they have never come to fruition; the biggest power cuts in the US are still caused by tree branches and thunderstorms.

There is also already a whole host of Internet security provision in place in the US and across the world, largely developed in response to growing terrorist threats and the growing reach of the World Wide Web. New Acts will arguably only create more regulations, placing further obligations on internet companies and slowing the pace of progress.

Perhaps this is a key point – the internet is still under development. It’s a project that isn’t finished yet, and that’s the way it’s supposed to be. Of course, there is a need for governments to make provision for its progression through investment in both infrastructure and security. But until it’s got to wherever it’s meant to be, it can seem a little hasty for governments and others to go to such efforts to try to control it. In some respects, the Internet is still a child, and it needs room to grow. Too much of the wrong kind of regulation risks hampering that.

By Chelsey Evans


Broadband Speeds - Misleading Advertising?

Published on March 3, 2011
Tags: Internet Communication

We wrote last week about new powers being given to the Advertising Standards Authority (ASA) to monitor online advertising content as well as more traditional forms of marketing. Now it seems the ASA will be even busier than expected, thanks to a report from Ofcom.

It’s long been known that broadband in the UK rarely achieves the maximum speeds advertised by providers, with speed being one of the biggest complaints about broadband. Ofcom have been looking into the matter and have found that the problem is fairly serious. For example, BT offers a service advertised at speeds of ‘up to’ 8Mbps. The average speed that customers actually get, however, is between 4.1 and 4.8Mbps. This is by no means an isolated case: some providers fare even worse in the average speed test.

One of the main problems here is the different ways of accessing broadband. For those people who receive their internet through existing phone lines or via mobile networks, the speed tends to be much slower because of the limitations on the line. By contrast, cable and fibre networks such as those used by Virgin Media tend to be much faster and the average speeds are much closer to the advertised maximum speed. Virgin Media’s ‘up to’ 50Mbps broadband, for instance, has an average speed of 43.9 to 47.2Mbps.
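To put those averages into everyday terms, here is a rough, back-of-the-envelope illustration (assuming an otherwise idle connection and ignoring protocol overheads): a 350MB file is 2,800 megabits, so at the advertised 8Mbps it would download in about 350 seconds – just under six minutes – whereas at the measured 4.1 to 4.8Mbps it would take roughly ten to eleven minutes. On the cable service the gap all but disappears: at either the advertised 50Mbps or the measured 43.9 to 47.2Mbps, the same file arrives in around a minute.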

If nothing else, this shows the importance of upgrading the broadband network with the latest technologies so people can achieve something nearer the download speeds they’re paying for. Of course, people accept that the standard of broadband is different in different areas, but it isn’t unreasonable to expect a certain level of service when you’re paying for a particular broadband speed. There is currently a big project underway, largely carried out by BT and Virgin Media, to install fibre optic broadband across much of the country. It’s hoped that this will increase broadband speeds by 50% within a year of the system going live. The government is also committed by EU law to provide basic broadband coverage to all citizens by 2013 (it’s currently a couple of years behind) and to provide speeds of at least 30Mbps by 2020.

But what about the current situation? The point made in Ofcom’s research is that it could be considered misleading advertising to say that broadband operates at ‘up to’ a certain speed if it then reaches nowhere near that speed. The ASA is looking into the matter but many broadband providers are unhappy about it. Ofcom thinks that providers should instead advertise ‘Typical Speed Rates’ (TSR) rather than ‘up to’ speeds as this is less likely to lead to confusion and false expectations.

BT, for instance, thinks this would not be a good development: broadband speeds currently vary so much that, in its view, it wouldn’t be wise to advertise a single average speed – this is its justification for using ‘up to’ speed claims. Another point raised is that enforcing TSR advertising would encourage internet service providers to pick their customers carefully in order to boost their average speed rating.

The impact of this could be that customers already annoyed at their slow connection speeds have their situations made worse, with fewer providers willing to take them on. It’s also hard to determine an average speed, because broadband speed depends on much more than just the means used to deliver it – household wiring, the applications in use and where people live all play a part. By contrast, Virgin Media – who fared pretty well in Ofcom’s research – have said they would welcome TSR advertising because, they say, too many internet service providers currently promise speeds they can’t deliver, which does their customers a disservice.

As with so many other things, much of this debate is over advertising. Perhaps, though, more of the debate and energy should be spent on upgrading broadband systems in order to give people across the UK better internet coverage no matter which provider they use. After all, with the planned upgrades currently getting underway, the whole forum for debate will shift again in a couple of years anyway. Of course, truthful advertising is important and the ASA should investigate if there are concerns over misleading advertising, especially when so many people use these services. There is, though, also something to be said for actively doing something to improve broadband speeds rather than just worrying about how best to advertise them.

If you want to test your personal broadband speed to see whether you’re getting the service you’re paying for, you can do so here.

By Chelsey Evans




Disclaimer: The contents of these articles are provided for information only and do not constitute advice. We are not liable for any actions that you might take as a result of reading this information, and always recommend that you speak to a qualified professional if in doubt.

Reproduction: These articles are © Copyright Ampheon. All rights are reserved by the copyright owners. Permission is granted to freely reproduce the articles provided that a hyperlink with a do follow is included linking back to this article page.