Published on March 18, 2011
When you search for information on Google, you might find from time to time that the search results aren’t exactly what you’re looking for. You sometimes find yourself faced with web content that is below par and nowhere near as useful as you might expect. Google recognises that this is a problem, so this is why they’ve introduced what is known as the Google Farmer (or Google Panda) update.
What exactly is the issue?
The content issue is largely caused by what are known as ‘content farms’. These are clusters of sites with low-quality content that often do well in search results because they are heavily optimised around popular keywords. The more people who click through to these results assuming they’re relevant to their search terms, the more popular the sites become, even though their content is lacking.
There is also an issue with websites that produce only small amounts of original content. Sometimes these are legitimate sites, with RSS feeds that are fully authorised, but other times they’re sites that have simply copied information from elsewhere.
Google wants to improve the quality of the sites it shows you when you search for something, hence the introduction of the algorithmic change known as Google Farmer (or, officially inside Google, ‘Panda’).
What is the solution?
Although Google hasn’t officially described the Farmer update as targeting the issues above, the changes primarily affect content farms and sites with ‘shallow’ content. Google has done this by tweaking its search engine algorithm so that higher quality results feature higher up the rankings. The change has only been rolled out in the US so far, but it has reportedly affected around 12% of US search results and is currently undergoing international testing, so it should reach the rest of the world, including the UK, soon.
Another update has also gone live to deal with the websites that have high quantities of copied content. This hasn’t had such a wide impact as the Farmer update, but reports suggest it’s definitely made a difference.
Are there any issues with the update?
On the whole, the Google Farmer update is positive for the Internet user’s experience as it makes it more likely that web users get good search results that are relevant to what they’re looking for and that are of good quality. Sites with low-quality content are more likely to fall down the rankings as a result, which is good for web users and for website developers who put a lot of effort into their sites, as these will naturally benefit.
One issue that does arise, however, is the fact that websites are constantly under development and so might register as being ‘low quality’ for a time, affecting their search results. For example, if a website is trying to start a debate amongst its users, it might just post a short piece of holding text on a page to start things off. Until people comment on the page, though, it could be viewed as low quality by the search engines.
How can this be avoided?
One solution here is to block pages until you’re certain they’ll pass the algorithm’s analysis and to put as much effort as possible into generating good, useful content. As a rule of thumb, make the text on each page of your site unique and at least 300 words long. This means that when Google crawls the pages, they’re more likely to be prioritised in search results. If you keep your websites up to date and filled with high quality information, the site will also be scanned more regularly, meaning that any changes you make will be picked up more quickly and your search engine positions will therefore update faster.
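As an illustration of the 300-word rule of thumb, a page’s visible word count can be checked with nothing more than the standard library. This is a minimal sketch (the 300-word threshold comes from the advice above, not from any published Google specification):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of an HTML page, ignoring tags.

    A real check would also skip the contents of <script> and <style>
    elements; this sketch keeps everything for simplicity.
    """
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def page_word_count(html):
    """Return the number of words in the page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

html = "<html><body><h1>Our Services</h1><p>We build websites.</p></body></html>"
print(page_word_count(html))         # 5
print(page_word_count(html) >= 300)  # False: this page needs more content
```

Run against each page of a site, anything well under 300 words is a candidate for expansion or for blocking from the crawler until it has matured.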
You can also make sure you don’t have too much advertising on your site. Of course, advertising can be a good and practical thing to have on your site, and having it doesn’t mean that the site will appear to be ‘low quality’. However, too much advertising can make your site look like a content farm, lacking unique content and detracting from any content there is on the page. If Google sees your site in that way, the results could be damaging to your positions.
One of the main things you can do to make sure your websites benefit from the Google Farmer update is to make sure you tailor your content for your users, not the search engines. It can be all too easy to structure content around certain keywords that register highly in search engine results, but as Google makes more and more changes to favour high quality content, this strategy is increasingly risky. Make sure your users are getting useful, relevant content that’s tailored to their needs and you should have no problems in getting the results you want to achieve.
A final thought concerns the writing of the site text itself. We have come across numerous ‘professional’ website design companies where the site text has been lifted word-for-word from our site – and, of course, we know we wrote the original text! When contacted, the directors of such companies very often claim that the authoring of the site text was outsourced or delegated to an employee. So, if you’re responsible for your company’s web site, make sure that whoever writes the text produces unique copy rather than text taken from other sites (which is a breach of copyright in any case).
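A rough way to spot lifted text is to measure the overlap of short word sequences (‘shingles’) between two pages. The snippet below is purely illustrative – a Jaccard similarity over word trigrams, not how Google’s own duplicate detection works:

```python
def shingles(text, n=3):
    """Break text into a set of overlapping word n-grams."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

ours = "we design and build websites for small businesses"
theirs = "we design and build websites for large companies"
print(similarity(ours, theirs))  # 0.5 - half the trigrams match
```

A score close to 1.0 between your page and someone else’s is a strong hint that one was copied from the other.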
If you’d like further advice on the Google Farmer update, have noticed a drop in your own Google rankings, or just need general web design advice, please contact us today for a free, no-obligation discussion.
Published on March 11, 2011
Tags: Web Site Law
One question that seems to come up from time to time is exactly how far governments should involve themselves in the Internet. Over 1.6 billion people across the world now have web access and this figure grows daily: control of the internet is, unsurprisingly, sought after by governments and businesses alike. One of the key selling points of the World Wide Web has always been its independence – an entity that transcends nations and unites communities across the world with relatively little interference from governments.
But is this a good thing? As with any question, there are at least two clearly defined sides to the debate.
On the one hand, governments have a duty to protect their citizens, across all platforms. It would be hard to find many people who disagree that Internet use ought to be restricted for people who pose a threat to others, be they terrorists, paedophiles, hackers or other malcontents. Some level of government-related oversight in this area is a good thing: in the manner of the police, it keeps the whole as safe for the majority as it possibly can while being tough on the criminals around the edges.
Also, the world is increasingly reliant on the Internet for services. As many companies move their operations online – and as government services themselves migrate to the Internet – there is a need for proper investment in infrastructure. This is needed on such a scale that it has to be coordinated by the government: in the UK, much of the broadband upgrade might be carried out by BT and Virgin Media and of course it’s in their commercial interests to do so, but without directives from the government, this project would be much harder. The EU requires that all citizens should have broadband access by 2013 and it’s hard to see how this would happen without involvement from the government.
The threat of cyber terrorism also warrants government intervention to shore up communication systems and ensure the country’s infrastructure could withstand an attack. The UK government is currently spending £650m on cyber security and the threat is considered to be as serious as terrorism. Government spending on such security is vital to maintain the free and open nature of the web. After all, the infrastructure has to come from somewhere.
From this, we can see the different areas where the government should be involved in the Internet – in terms of provision and security especially. But what about the other side of the debate? How much involvement is too much?
One of the main issues that come up on this side of the debate is the idea of the Internet ‘kill switch’. We’ve seen this recently during the Egyptian protests when people across the country suddenly found that they couldn’t get online. Other countries have their web use restricted and censored as a matter of course: when the Internet is supposed to be open to all users, why, the argument runs, should governments decide what information people should or should not be allowed to access?
There is, though, a much wider issue at stake and it’s been around at least since the late 1990s. Bill Clinton – then the US President – developed a plan called the ‘Federal Intrusion Detection Network’. Had the plans gone ahead, it would have required major companies to run their Internet connections through the federal government, for ‘safety’ reasons. While you could argue that this was a shrewd plan to protect key industries in the event of a cyber attack, on the other side of the coin, it can be seen as government meddling gone too far, and it opens up political questions as well as security ones.
The ‘Fidnet’ plan didn’t go into practice, but now in the US there is a bill going through Congress called the Cybersecurity and Internet Freedom Act of 2011. The Act makes provision so that in the event of a ‘cyber emergency’, the White House can issue directives to Internet companies with which they must ‘immediately comply’. That the regulations governing the Internet could change on such a whim raises the question of how much security is too much, as does the vague definition of a ‘cyber emergency’.
The backing for this Act comes from the (alleged) ability of cyber terrorists to shut down whole cities. This may well be possible, but the same fears were used to justify Fidnet and they have never come to fruition: the biggest power cuts in the US are caused by tree branches and thunderstorms.
There’s also already a whole host of Internet security provision in the US and across the world, largely developed as a response to growing terrorist threats and the growing reach of the World Wide Web. New Acts will arguably only create more regulations, placing further obligations on internet companies and slowing the pace of progress.
Perhaps this is a key point – the internet is still under development. It’s a project that isn’t finished yet and that’s the way it’s supposed to be. Of course, there is a need for governments to make provision for its progression through investment in both infrastructure and security. But until it’s got where it’s meant to be – wherever that is – it can seem a little hasty for governments and others who go to such efforts to try and control it. In some respects, the Internet is still a child, and it needs room to grow. Too much of the wrong kind of regulation risks hampering that.
Published on March 3, 2011
Tags: Internet Communication
We wrote last week about new powers being given to the Advertising Standards Agency to monitor online advertising content as well as more traditional means of marketing. Now it seems that the ASA is even busier than it expected to be, thanks to a report from Ofcom.
It’s long been known that broadband in the UK rarely achieves the maximum speeds advertised by providers, with speed being one of the biggest complaints about broadband. Ofcom have been looking into the matter and they’ve found that the problem is fairly serious. For example, BT offers a service with ‘up to’ 8Mbps in terms of speed. The average speed that customers actually get, however, is between 4.1 and 4.8Mbps. This is by no means an isolated case: some providers fare even worse in the average speed test.
One of the main problems here is the different ways of accessing broadband. For those people who receive their internet through existing phone lines or via mobile networks, the speed tends to be much slower because of the limitations on the line. By contrast, cable and fibre networks such as those used by Virgin Media tend to be much faster and the average speeds are much closer to the advertised maximum speed. Virgin Media’s ‘up to’ 50Mbps broadband, for instance, has an average speed of 43.9 to 47.2Mbps.
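Ofcom’s figures make the gap clearer when expressed as a percentage of the advertised ‘up to’ speed (the calculation below simply restates the numbers quoted above):

```python
def speed_ratio(advertised, measured_low, measured_high):
    """Measured speed range as a percentage of the advertised 'up to' speed."""
    return (round(100 * measured_low / advertised, 1),
            round(100 * measured_high / advertised, 1))

# BT's 'up to' 8Mbps ADSL delivers roughly half the headline speed...
print(speed_ratio(8, 4.1, 4.8))     # (51.2, 60.0)
# ...while Virgin's 'up to' 50Mbps cable gets close to its advertised maximum.
print(speed_ratio(50, 43.9, 47.2))  # (87.8, 94.4)
```

Seen this way, the advertising complaint is easy to understand: one provider’s ‘up to’ figure describes roughly double what customers typically receive, while another’s is within about 10% of reality.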
If nothing else, this shows the importance of upgrading the broadband network with the latest technologies so people can achieve something nearer the download speeds they’re paying for. Of course, people accept that the standard of broadband is different in different areas, but it isn’t unreasonable to expect a certain level of service when you’re paying for a particular broadband speed. There is currently a big project underway, largely carried out by BT and Virgin Media, to install fibre optic broadband across much of the country. It’s hoped that this will increase broadband speeds by 50% within a year of the system going live. The government is also committed by EU law to provide basic broadband coverage to all citizens by 2013 (it’s currently a couple of years behind) and to provide speeds of at least 30Mbps by 2020.
But what about the current situation? The point made in Ofcom’s research is that it could be considered misleading advertising to say that broadband operates at ‘up to’ a certain speed if it then reaches nowhere near that speed. The ASA is looking into the matter but many broadband providers are unhappy about it. Ofcom thinks that providers should instead advertise ‘Typical Speed Rates’ (TSR) rather than ‘up to’ speeds as this is less likely to lead to confusion and false expectations.
BT, for instance, thinks that this would not be a good development: broadband speeds currently vary so much that it wouldn’t be wise to advertise only one average speed, which is their justification for using ‘up to’ speed claims. Another point raised is that enforcing TSR advertising would encourage internet service providers to carefully pick their customers in order to boost their average speed rating.
The impact of this could be that customers already annoyed at their slow connection speeds could have their situations made worse with fewer providers willing to take them on. It’s also hard to determine an average speed because broadband speed depends on so much other than just the means used to deliver it – household wiring, applications and where people live. By contrast, Virgin Media – who fared pretty well in Ofcom’s research – have said they would welcome TSR advertising as currently, they say, internet service providers are not giving their customers the credit they deserve and can’t deliver on the speeds they claim to offer.
As with so many other things, much of this debate is over advertising. Perhaps, though, more of the debate and energy should be spent on upgrading broadband systems in order to give people across the UK better internet coverage no matter which provider they use. After all, with the planned upgrades currently getting underway, the whole forum for debate will shift again in a couple of years anyway. Of course, truthful advertising is important and the ASA should investigate if there are concerns over misleading advertising, especially when so many people use these services. There is, though, also something to be said for actively doing something to improve broadband speeds rather than just worrying about how best to advertise them.
If you want to test your personal broadband speed to see whether you’re getting the service you’re paying for, you can do so here.
Published on February 24, 2011
Tags: Web Site Law
Over the last decade, the internet has increased its depth and influence to an extent that was hard to predict. It has developed in many ways, with lots of different forms of media reflected in its content. One area that has grown massively over the past few years is user generated content, an increasingly popular means of engagement with the web. But what exactly is it, and what issues does it raise?
Simply put, user generated content is exactly what it says it is – online content written or created by the users of websites rather than website developers or site owners. This can include everything from forums and blog posts to videos, images and articles, depending on the website. Two of the most obvious examples of websites that thrive because of their user generated content are YouTube and Wikipedia, where users can upload videos and articles to the respective sites. There are also thousands of blog sites, forums on all manner of topics, online image galleries and more, all of which provide an opportunity for web users to engage with the websites by producing their own content. Often this is done in a small way, such as by commenting on blog posts, but some users build entire enterprises out of the content they create.
On the face of it, the growth of user generated content is an extremely positive development. It allows people to be more involved with the happenings of the internet and their favourite websites in particular. This helps to contribute to the free and open nature of the World Wide Web, providing ordinary users with a platform through which they can share their knowledge and interests, interact with other people in the same space and open up new ways of networking.
One of the most successful web inventions – social media – largely rests on user generated content. Without it, the Facebook, Twitter and LinkedIn sites would largely be just shells. They need content to fill them and make them successful. This comes from the website users, who set up accounts and engage with the services. This has led to people networking with others across the world and has also given rise to a new form of protest movement and means of organisation for demonstrators, as well as a whole host of other uses. Other websites such as YouTube are beneficial because they not only allow user generated content to flourish, but they can also be integrated into or linked to from company websites, giving businesses alternative means of advertising and interacting with their customers.
Despite these positives, user generated content is not without its pitfalls. For example, there is the question of quality. When people write their own content, there are no rules governing standards of accuracy, so even though they might put their all into it, there is no guarantee it will be watertight. Probably the best example of this is Wikipedia, which occasionally finds itself in the press for one sensationalist false article or another. While it may not be such a worrying issue in the case of Wikipedia, review sites such as TripAdvisor have come under fire as users have posted false reviews in order to make businesses appear worse than they are. When user generated content has the potential to damage companies’ reputations, it shows there is something of a problem emerging.
There is also the issue of copyright, which most often emerges in the context of visual content. For example, some users of YouTube and other video streaming sites upload clips of their favourite TV shows for others to view, or songs by their favourite artists for people to listen to. This can lead to something of a quandary as the rules of copyright in such cases aren’t always clear. While some rights holders are happy for their content to be viewed on such websites, others aren’t and have been known to take action to have it removed for copyright reasons. Copyright law is complicated in itself, too, so even if a user thinks they are acting perfectly legitimately in uploading content to a website, they may be accidentally in breach of the law because the product is still under some form of copyright or distribution limitation.
When copyright is breached, the course of action is generally to remove the offending content from the site in question. This is what happened when Comedy Central asked YouTube to remove all clips of The Daily Show from the site. Other companies, such as the BBC, try to control the amount of user generated content on the internet by uploading their own official versions. Occasionally, though, infringements can lead to prosecution if a company or individual feels their privacy has been breached or copyright broken. When users largely upload content because they’re interested in a topic or simply want to share opinions with other people, they are often not aware of the laws surrounding such things and can find themselves being prosecuted for any number of activities.
There is clearly a question here of intent: if someone uploads something to a website that deliberately breaks a law – whatever it may be – then there is a case for action. Otherwise, it must surely be something of a legal grey area. If people accidentally make mistakes and are not aware they have broken the law, it is much harder to know how to deal with it. This is especially true when you consider that one of the main purposes of user generated content is entertainment, not any enterprising intention. The location of the infringement can also play a part, with copyright law not being harmonious around the globe: the process for prosecuting copyright infringement in another country can be completely different to that of the UK.
Overall, then, this is something of a murky issue. On the one hand, copyright, libel and other laws must be upheld in order to protect businesses, artists, individuals and others. On the other hand, people have a right to generate their own content and contribute to the ongoing development of the internet. If nothing else, it’s fairly safe to say that this is an issue that is only going to grow in significance over time and may require a global approach in order to provide clear guidelines and solutions.
Published on February 21, 2011
Tags: Web Site Law
From 1 March 2011, the Advertising Standards Agency (ASA) has new powers over UK company web sites and online marketing material. This article explains more about how a change in the powers of the ASA might affect your organisation's online strategy.
Under current advertising laws, the Advertising Standards Agency can only monitor and take action over traditional forms of advertising. This includes billboards and adverts on television, as well as advertising in newspapers. Until now, however, it couldn't monitor the content of company websites.
These rules are changing on 1st March 2011, giving the ASA the power to pass judgement on company websites and other online commercial promotion. This means that consumers will be able to complain to ASA if they feel a website features indecent, misleading or false content. The upshot of this is that the ASA will have the power to force companies to pay fines and change the content of their websites if they are found to be in breach of the rules, even if the content in question is not in the form of a traditional online advert.
Therefore, while the current law allows companies to show adverts on their websites that they wouldn’t be allowed to broadcast on television, from March 1st all of this will change. So what does this mean for companies who rely on their websites for business? What will be the impact of the new rules?
On the one hand, it could be argued that this is a positive shift for consumers. It means that all claims made on company websites will have to be true, or else they would have grounds to make a complaint. This helps to protect consumers’ rights and promote quality of service. It also might mean that companies make more of an effort to keep their websites up to date – such as by not relying on old surveys or out of date endorsements to promote themselves – which will be good for both web users and the companies themselves.
On the other hand, it creates the potential for companies to land themselves in hot water despite their best efforts. After all, two different companies in the same industry could both claim to have "the best customer service in the business" because different customer surveys – even if conducted at the same time – can produce different results; the duplicate claims would still look suspicious and might lead to complaints.
The new ASA monitoring powers also apply to any other web software that may be controlled by a company. So, for example, if they have a blog, social networking account (Facebook page, Twitter account, LinkedIn company page, etc.) or paid search advertisements (Google AdWords, etc.), these will also be subject to the new scrutiny rules. There is a distinction made between marketing and editorial content (only marketing communication falls under the new rules), but the line can often be blurred on websites, so companies may prefer to be cautious rather than risk the wrath of customer complaints.
This opens up the possibility that many websites are going to have to be restructured and re-written in order to ensure compliance with the new ASA rules and to be on the safe side when making marketing claims. All of this opens up a window of opportunity for copywriters, who are likely to receive requests to help companies redevelop the content of their websites once the new rules come into effect.
As well as web content and marketing communications, there is also the potential that other aspects of companies’ online profiles will have to be restructured, providing another opportunity for copywriters. For example, many online businesses take steps to improve their Google rankings through search engine optimisation. This might include profiling ‘recommended links’ on a piece of editorial content, writing a blog post tailored around a certain key phrase or product, or perhaps posting sponsored tweets on Twitter. While these may not be directly construed as marketing in the traditional sense, the new ASA powers make the world of SEO far harder to navigate, as they open up the question of what counts as marketing and what, for example, counts as opinion or editorial copy. With many companies outsourcing their SEO efforts, too, a company may need to take a much closer interest in what its outsource supplier is posting online on its behalf. And with many SEO suppliers based outside the UK, they may lack the knowledge and experience needed to provide the level of service that protects the company.
Officially, if an individual posts a comment on a website remarking on a product or service they have received, this does not fall under the jurisdiction of ASA. If a company then uses that comment in its online marketing material, though, it will fall under the jurisdiction. This makes the practice of checking websites for truth and honesty more important than ever before. Another example would be product pages advertising a particular item or service. If a company was selling an action figure, for example, but the picture promoting the toy on their website showed the action figure surrounded by a larger play set that wasn’t actually included in the price, then someone might complain that it constitutes false advertising and an investigation might be conducted as a result.
This could prove to be a window of opportunity for web content writers with the skill to write good copy and the ability to check that what they are writing is all factually correct. Having a good copywriter to take care of things also allows the company to get on with business without having to worry that they might be fined for false advertising.
Of course, the ASA won’t be proactively searching the web for breaches of its rules; the whole thing depends on consumers raising issues as they come across them, so it’s possible the real impact of the new rules will be minimal. Last year, 2,500 complaints were made to the ASA about web content, but they couldn’t be investigated because the powers didn’t exist. There is no way of telling whether more complaints will be made once web content falls under the ASA remit or whether the number will stay roughly the same. However, from March 1st the potential will be there for false or misleading claims to be investigated, so ultimately it’s much better to be safe than sorry and take steps to ensure company websites are entirely truthful, in both their marketing and editorial content.
More information is available directly from the ASA web site at www.asa.org.uk.
Alternatively, if you are concerned about the impact of the new ASA regulations on your web site, and perhaps do not have the time or experience to review your site or marketing yourself, contact us today to arrange a consultation.
Reproduction: These articles are © Copyright Ampheon. All rights are reserved by the copyright owners. Permission is granted to freely reproduce an article provided that a ‘dofollow’ hyperlink back to this article page is included.