Published on August 24, 2012
To the list of Google updates aimed at promoting high quality, original content and penalising those sites that continue to break web etiquette in a range of ways, we can now add an update aimed at copyright issues. The update was announced on 8th August and came into effect the week after. Like the Panda and Penguin updates we’ve looked at previously, it shouldn’t affect most websites at all, but it’s definitely something worth being aware of.
The copyright update (it doesn’t yet seem to have an animal-based nickname of its own) is aimed at those sites that have valid copyright removal notices against them, meaning that the more such notices a site has, the more it is likely to be penalised. In its blog post announcing the update, Google acknowledges that only courts can decide on issues of copyright infringement, so even though sites could be affected in the SERPs, it won’t remove content unless action is taken by the owner of the relevant rights.
Significantly though, Google also reports that it receives more copyright removal notices now on a daily basis than it did throughout the whole of 2009. The aim of taking valid notices into account in the search engine results is to help users continue to find high quality, legitimate information on the internet.
Something else to be aware of with regards to this update is the ‘counter-notices’ tool. If a valid notice is filed against a site and Google takes the step of removing the content, the person whose content has been removed can, if they think the decision was wrong, file a counter-notice to challenge the action. As discussed above, removal should only happen in the event that the proper legal steps are first taken to get the content removed; if those steps don’t happen, it seems that the site’s ranking will be penalised rather than pages removed.
However, even though the impact for most sites will probably be non-existent, this is still an important issue. The issue of copyright is a hot one, both in terms of protecting your own copyrights where necessary and in making sure you don’t infringe anyone else’s, accidentally or otherwise.
There are steps that you can take to help mitigate any issues related to this. For example, if you ever use someone else’s work on your website in a legitimate manner, make sure it is properly credited. A good example of this is images; it isn’t uncommon for sites to use images for which they do not have the copyright but which the copyright owner allows to be used. Most sites already credit the owners of the images, but it doesn’t hurt to make sure all of your pictures are properly accredited.
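To make the crediting habit easy to keep up, it can help to generate the markup in one place. Below is a minimal Python sketch - the `credited_image` helper and the sample details are our own invention for illustration, not any standard API - that wraps an image in a `<figure>` with a crediting `<figcaption>`:

```python
from html import escape

def credited_image(src, alt, credit, credit_url=None):
    """Wrap an image in a <figure> whose <figcaption> credits the copyright owner."""
    if credit_url:
        caption = f'Image: <a href="{escape(credit_url, quote=True)}">{escape(credit)}</a>'
    else:
        caption = f"Image: {escape(credit)}"
    return (
        "<figure>\n"
        f'  <img src="{escape(src, quote=True)}" alt="{escape(alt)}">\n'
        f"  <figcaption>{caption}</figcaption>\n"
        "</figure>"
    )

print(credited_image("/img/panda.jpg", "A giant panda", "Jane Doe", "https://example.com"))
```

Because the credit sits in a `<figcaption>` right next to the image, it stays visible to both users and crawlers wherever the picture appears.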
Also, if you are yourself an owner of any content that people might want to share or distribute elsewhere, it is probably worth giving some thought to your own copyright policy. For example, make it clear on your site that the copyright is yours and, if you are happy for people to share your content (such as by using your photos on their own blogs), state how you would like to be credited.
This is an issue that is of particular interest to the entertainment industry, as they are the ones who most often feel that their content has been used without permission. However, it is something that can affect anyone, so vigilance certainly seems sensible.
Something else to consider is that if your website allows users to upload their own content, you might want to take steps to prevent copyright infringement in their uploads. This is something that should ideally be dealt with in your terms and conditions so it is clear what you expect from users on the site and what they can and cannot post; and make sure you take action if the rules are broken.
A review of your website to make sure all content is original or properly credited where you have used anyone else’s material (and that you have permission where necessary) should help uncover any issues you need to address.
Overall, it is not yet entirely clear the impact that this Google update will have on search rankings or wider issues of copyright, but since it is a widely-discussed issue right now, it’s certainly worth checking your site complies with all the relevant regulations.
Published on July 27, 2012
You will no doubt remember when Google released its major algorithm update towards the start of last year: the Panda update (sometimes referred to as the Farmer update) affected around 12% of searches at the time, and web designers and others working in the industry spent considerable time making sure websites were up to standard.
The aim of the Panda update was to help weed out sites considered to be less reputable, such as those that took part in the practice of keyword stuffing or otherwise employing low-quality content – yet that still managed to do well in the search results prior to Panda. The impact of this algorithm change was significant, and since then there have been multiple updates and refreshes to the Panda algorithm to help perfect what it was intended to do.
Fairly recently - on June 25th to be precise - a further refresh of Panda was released. It is thought that this has only had an impact on around 1% of searches, but it still makes sense to take this opportunity to review your site and make sure it won't fall foul of the algorithm designed to find low quality sites and bump them down the search results. After all, there are billions of Google searches done every day, so even though 1% might not sound like a huge amount, it is still something that cannot be ignored (if you want to be amazed by the number of Google searches per day, the live counters some sites publish are somewhat eye-opening, with the numbers flicking by faster than you can see).
So with this in mind, let’s take a look at the different things you can do to ensure your site falls into the ‘high quality content’ category and will not fall foul of the most recent or any future Panda algorithm updates that might be released.
‘High quality content’ is something that can be hard to define, but it’s what all sites need to achieve if they want to rank well in relevant search results. Last year, Google released a list of interesting questions that can help web designers work out whether a particular site is high quality. For example, trustworthiness is one issue that comes up quite a lot: is your website trustworthy? If it is an ecommerce site, would you be willing to input your credit card details? These questions can seem rather subjective, but most of us have a pretty good idea about the answer we would give when looking at a particular website.
However, there are also some very practical things we can do to make sure our websites are high quality and filled with useful information for web users. For instance, making sure there is no duplicate content on your site is one obvious way of improving your site, particularly as this is one of the things Panda can penalise you for.
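If you want a rough, automated first pass at spotting duplicate content, comparing word ‘shingles’ (overlapping word n-grams) between pages is a common technique. This Python sketch - the sample page text is invented for illustration - scores two pages from 0 to 1, where 1 means word-for-word duplication:

```python
import re

def shingles(text, n=5):
    """Lower-case word n-grams ('shingles') taken from a page's visible text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def similarity(a, b, n=5):
    """Jaccard similarity of two pages' shingle sets: 1.0 means identical text."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "Our London studio designs fast, responsive websites for small businesses."
page_b = "Our London studio designs fast, responsive websites for small businesses."
page_c = "Contact us to discuss your next web design project today."

print(similarity(page_a, page_b))  # identical pages score 1.0
print(similarity(page_a, page_c))  # unrelated pages score 0.0 here
```

In practice you would run a check like this across every pair of pages on your site and manually review any pair with a suspiciously high score.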
Sites that make use of a considerable amount of advertising can also take action to make sure they don’t fall foul of Panda. This is a particularly relevant issue since part of the aim of the most recent Panda update was to weed out sites that trick users into believing an ad is trustworthy when it isn’t necessarily the case. Websites that take part in link-building should also ensure that all of their links are reputable and genuine.
One of the big things that all websites can do – and that they should probably be doing anyway even if it weren’t for updates such as Panda – is to think about things from the point of view of the web user. After all, one of the aims of any website is to make sure the user has a good experience of that site, whether it’s in navigating the different pages to find what they’re looking for, making a purchase or reading about a topic of interest to them.
It might sound a little inconclusive to say that the main thing you can do to deal with the implications of Google Panda is to take care over all of the content on your site and make sure it is all engaging and well-written, but it’s true. We can all take action to make sure our websites aren’t negatively affected by algorithm changes, but ultimately our focus needs to be on creating a good web environment and a positive experience for web users – all of the time, not just whenever a new algorithm update is released.
Published on July 27, 2012
There are plenty of factors that have an impact on our search rankings. How well the website is optimised, for instance, or how good the content is. But could a slow website also affect our search rankings? One thing is for certain: we all want to make our websites as quick and efficient as possible. It isn’t just SEO specialists who are interested in this; arguably, web designers have just as much if not more of an interest in making sure our websites are fast and responsive. Here are some of the main reasons why it matters.
Web users don’t like slow sites
It is a truth widely acknowledged in the world of web design that web users tend not to like slow websites. While users naturally have varying degrees of patience when faced with a slow-loading site, many will walk away if it takes too long to load a page. According to some research, if a site takes more than four seconds to load, 1 in 4 people will abandon the site.
One of our main roles as web designers is to give the people who use our websites the best possible experience. How long it takes the page to load might be only one of the factors that influence this, but it certainly doesn’t get us off to a good start when they are stuck waiting for something to load – and it could cost us their attention altogether.
Growing smartphone usage
We have looked before on this blog at issues of growing smartphone usage and why we need to take account of it in web design, but the speed of websites is definitely one of the key issues to consider. Even though there have been massive improvements in recent years in terms of how long it takes mobile websites to load, we cannot deny that they are usually still significantly slower to load than desktop sites.
With this in mind, if a website were to take five seconds to load on a desktop, how much longer would that same site take to load on a mobile device? The difference can be considerable, and research from America suggests that if a mobile site doesn’t load within 10 seconds, half of users will abandon that site.
This is one of the reasons why web design techniques such as responsive design are becoming more popular; it helps to make mobile sites more reactive and efficient, as well as providing a better visual display and practical experience to web users.
Search rankings could be affected
Another reason to care about a slow-loading webpage is that it could end up having an impact on your search engine results. According to Google, only 1 per cent of searches are affected by the load time of websites, but we shouldn’t assume that our own websites are safe, so it makes sense to ensure they’re as efficient as possible and avoid being penalised.
This means that even though other issues such as the relevance and quality of the site have more impact on how well your website does in search results, the speed can also have an effect.
It could identify other issues
Also, if your website is loading slowly, it could be a sign that there are other issues you need to address in order to help it run properly. For example, has the site been properly optimised? Could the size of the graphics perhaps be slowing the site down? Are there any bugs that you might have missed that need to be fixed to speed it up? Is there anything you could do to reduce the size of your style sheets? Can you reduce the size of the webpage in any other way to help it load faster?
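A reasonable first step in answering those questions is simply listing every external asset a page pulls in, so you can then check the size of each one. Here is an illustrative Python sketch using the standard library’s HTML parser (the sample page is invented):

```python
from html.parser import HTMLParser

class AssetLister(HTMLParser):
    """Collect the external assets a page references, so their sizes can be audited."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets.append(("image", attrs["src"]))
        elif tag == "script" and "src" in attrs:
            self.assets.append(("script", attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(("stylesheet", attrs["href"]))

page = """<html><head>
  <link rel="stylesheet" href="/css/main.css">
  <script src="/js/app.js"></script>
</head><body><img src="/img/hero.jpg" alt="Hero"></body></html>"""

lister = AssetLister()
lister.feed(page)
for kind, url in lister.assets:
    print(kind, url)
```

Once you have the list, checking each file’s size on disk or over HTTP tells you which graphics or style sheets are worth slimming down.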
If we want our websites to run as efficiently as possible, we need to make sure we have analysed them properly and perfected the design so there is nothing on the site that could possibly lead to it being slower than it should be. If we do this, it won’t just be the load time that improves; the overall quality and experience of the site should improve too, so it is definitely an issue that deserves our attention whenever we are worried that a site isn’t loading quickly enough.
Published on May 4, 2012
Say what you will about Google, but they know how to come up with quirky names for their algorithm updates. Previously, we have discussed the Panda update and the Venice update. Today we turn our attention to the latest update to gain significant attention from web designers and SEO specialists – the Google Penguin update (this was originally known as the webspam update).
While it isn’t yet entirely clear what the update will do, it seems safe to say that it is targeted at quality – or rather, at weeding out sites that are in violation of Google’s quality guidelines in order to help sites that utilise good SEO techniques get better page rankings. The Google post outlining the upcoming Penguin algorithm change offers a couple of examples of the kind of pages that might be affected by the change.
That post also offers a useful explanation of what Google perceives to be the difference between good and bad SEO. The terms it uses for this are “white hat” SEO and “black hat webspam”. In brief, white hat SEO is defined as SEO that helps to improve a user’s experience of a website. It is to do with good content as well as good marketing techniques in order to build the profile of a site, according to Google.
By contrast, black hat webspam is defined as SEO techniques that don’t actually benefit users. For example, sites that include keyword links that aren’t actually relevant to the content in question, or keyword stuffing on a site with generally poor-quality content.
Google also helpfully offers some information to give us an idea of the number of searches that are likely to be affected by the Penguin update. It isn’t expected to have as significant an impact as the previous Panda update that helped to promote good quality sites, but it should still make a difference. It’s thought that just over 3% of English searches will be impacted.
So, even though most sites should be perfectly fine following the Google Penguin update and could even receive a boost as poor quality sites are weeded out, it still makes sense to review your web content to ensure it falls within the quality guidelines. There are a few different areas you can look at to make sure your site is properly promoted with white hat SEO.
For example, ensuring you don’t have duplicate content on your site is an issue we’ve looked at before as a result of previous Google updates, but if you still have any content that could be considered duplicate, it could be worth reviewing this. In theory, duplicate content doesn’t automatically get you a search penalty unless it can be proved that you were trying to manipulate the search rankings, but generally speaking, it isn’t worth taking the chance.
Keyword stuffing is another issue that Google mentions in its blogpost on the update, so this is something else you might want to look at to ensure your site falls within the quality guidelines. This isn’t just about ensuring your visible content utilises keywords properly; any hidden text on your website also matters and could lead to you being penalised in the search rankings if you have engaged in any keyword stuffing. If your site has been loaded with keywords for the express purpose of doing well in the search rankings even though the user experience isn’t as good as it could be, now is a very good time to review that practice.
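One rough way to sanity-check your own pages for stuffing is to measure keyword density - the share of a page’s words taken up by a single phrase. Any threshold is a judgement call, and the copy below is invented for illustration, but a sketch might look like this:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words on a page taken up by repetitions of one phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw) + 1) if words[i:i + len(kw)] == kw)
    return hits * len(kw) / len(words)

copy = ("Cheap web design. Our cheap web design beats other cheap web design "
        "because cheap web design is what we do.")
print(round(keyword_density(copy, "cheap web design"), 2))  # → 0.6, a clear red flag
```

A page where more than half the words are one phrase is almost certainly stuffed; well-written copy tends to sit far below that.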
Google also mentions links in its Penguin algorithm post. In its guidelines, it warns against linking to webspammers or what it refers to as ‘bad neighbourhoods’, so this is something to watch out for on your site. The examples given in the post linked above include that of an article ostensibly about getting fit but with keyword links for payday loans. This might seem like an obvious no-no, but it’s clearly something to be careful about as this sort of technique can be much more subtle.
Overall, the message from Google clearly seems to be that good content will be prioritised and that those websites utilising white hat SEO should not be pushed out or penalised by sites attempting to manipulate the search rankings. While most sites should have nothing to fear from the Penguin update, it’s still a timely reminder that it’s worth reviewing our content from time to time to ensure it is still as high quality and beneficial to our users as we initially intended it to be.
Something all web designers and SEO specialists need to be aware of is the Google Venice update. This is an update that was included in a series of algorithm changes that Google announced towards the end of February 2012, but the effects of it are just starting to become clear. You can read the full list of algorithm changes that make up the update on Google’s blog.
Google Venice itself is actually number 26 on that list of around 40 algorithm changes and the idea behind it is to improve the rankings of local search results. This is the explanation that is offered by Google:
“Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.”
This basically means that if you search for something, the results you get from Google will be much more personalised to your location. For example, if you search for ‘web designers’, you should hopefully see results from web designers that are local to you (e.g. ‘web designers london’), because Google is now utilising IP addresses to work out where you are and provide you with results accordingly.
This has several implications for businesses that we need to be aware of. First of all, this has the potential to benefit local businesses that have previously been disadvantaged by larger firms in terms of their SEO. If local businesses are able to optimise their sites for local search terms, they could potentially do quite well out of Google Venice.
However, as well as the benefits, it also throws up a couple of challenges. For instance, businesses that are based in one location but who carry out work in lots of locations will face a dilemma as to what they should optimise and how. As an example, as a web design firm, we can often just as easily design a website for a company in Scotland as we can in London because we’re able to work virtually. Does this mean we need to optimise separate pages for every location we work in?
The full impact of Google Venice is yet to be determined, but for such a significant update it hasn’t actually yet been analysed that extensively.
One issue that does seem to have come up, though, is that if you live in a place that shares its name with somewhere else (Birmingham UK and Birmingham Alabama, for instance), there is a chance that Google will pick the wrong place because even though websites are based in different countries, they’re optimised for the same place names. This doesn’t seem to be an extensive problem, but it’s something that many businesses will need to be aware of.
What is clear from the Google Venice update is that local search is now really coming into its own, and every business needs to start developing a plan to deal with it to make sure they stay properly optimised and take advantage of the opportunities the algorithm change has to offer. For example, developing localised page content or local landing pages can help firms to optimise for the areas they want to target. You’ll only be able to make use of the Google Places feature if you actually have a physical address in a certain place, but if you are based in one place yet work in many, there’s nothing to stop you optimising your site in other ways.
Websites might also need to rejig their site architecture to make sure it is properly optimised for local pages. For instance, take a look at your homepage. Is it currently optimised for local search? If not, then you might well need to make some updates to ensure it continues to rank for relevant search terms, and also to micro-format your address (if you don’t know what this is, please do ask us).
Also, just because Google has a significant new update out, don’t forget about the old ones. You may think that one of the easiest ways to optimise for local search might be to simply use the same or similar content on multiple pages and just change the keywords. However, this could end with you falling foul of last year’s Google Panda/Farmer update that punishes low quality or duplicate content. Making the extra effort to properly optimise your site for Google Venice is definitely worth it.
Local link building might also help, although it’s not yet clear how the recent updates have affected the way links are weighed in determining search rankings. Some suggest that they have become less important, but you still shouldn’t ignore this area, particularly if you’re trying to cement your local profile.
And, despite what you might think, you don’t even need inbound links to your site to improve your local rankings. Google uses NAP data (Name, Address, Postcode) to identify your business and gets this information from various sources on the web. So, search Google for your postcode, then your company name, and try to make sure the information found is consistent across the NAP on your site, your Google Places listing and anywhere else your NAP is listed (including places where you might be listed without there even being a link to your site). Consistency and relevancy are key – if you can get NAP listings on sites that serve the same area you are based in (so, for example, we might look to get NAP listings on London-centric sites), that will be more relevant than having a NAP listing on a global website, or a site that’s focussed on content on the other side of the world.
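To illustrate that consistency check, here is a small Python sketch (the business details are entirely made up for the example) that normalises a Name/Address/Postcode triple before comparing the copies of it found around the web:

```python
import re

def normalise_nap(name, address, postcode):
    """Canonical form of a Name/Address/Postcode triple for comparison."""
    def squash(s):
        # Lower-case and strip punctuation so cosmetic differences don't matter.
        return re.sub(r"[^a-z0-9]+", " ", s.lower()).strip()
    return (squash(name), squash(address), squash(postcode).replace(" ", ""))

site_nap    = ("Ampheon Ltd.", "1 Example Street, London", "EC1A 1BB")
listing_nap = ("AMPHEON LTD", "1 Example Street London", "ec1a1bb")
other_nap   = ("Ampheon Ltd", "2 Sample Road, London", "EC1A 1BB")

print(normalise_nap(*site_nap) == normalise_nap(*listing_nap))  # True: same business data
print(normalise_nap(*site_nap) == normalise_nap(*other_nap))    # False: addresses differ
```

Normalising case, punctuation and spacing first means purely cosmetic differences between listings don’t mask a genuine mismatch that needs correcting.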
Overall, Google Venice might be able to be explained in just a couple of sentences, but its potential impact could be really significant. Taking action now to ensure your site continues to rank well should definitely be on all of our ‘to do’ lists.
Reproduction: These articles are © Copyright Ampheon. All rights are reserved by the copyright owners. Permission is granted to freely reproduce the articles provided that a do-follow hyperlink is included linking back to this article page.