Published on October 29, 2010
We noticed this morning that Google UK may be testing search results tailored to the user’s location. Presumably, Google is identifying the user’s location from their IP address and is then showing a mix of results from Google Places and the normal search results. This is good and bad for site owners, as this article will go on to explain.
One such test was ‘car parking controls’. This is a non-geographic search (for example, we’re not searching for ‘car parking controls in London’) so it should return results from throughout the UK - and indeed this is exactly what it used to do. This morning, though, we noticed that Google UK has mixed in the map results from Google Maps (now called Google Places) as well as showing a map on the right hand side. Additionally, a new shortcut to Places is shown on the left hand side, which when clicked shows only the Places results. Each of the embedded Places results is highlighted with a marker, showing they are the results local to the user (our search was conducted from Kent). The image below demonstrates this:
What this means for site owners.
First off, Google may only be testing this feature so it could be removed quite quickly, or it could be rolled out to more keywords and eventually all of search in due course. At the time of writing it appears to be limited to a few of Google’s search servers and a few selected keywords.
If you haven’t already claimed your Google Places (formerly Google Maps) listing, do so now. If you’re not sure on how to do this, contact us for assistance.
Make sure you build up inbound links to your site that are relevant to your local area, and get listed on local directories. This will help to boost your local presence.
If you trade nationally and have relied on non-geographic keywords for your listings, there are going to be fewer spots available in the top-10 for natural search as some will be taken up by Google Places listings. This means that you need to work harder and aim to get into the top-5 to ensure your listing isn’t bumped to page 2 of Google’s rankings.
If you trade locally then this development is extremely beneficial to you – so long as you can optimise your listing on Google Places. This is because for searches taking place locally to you, Google is going to pick Google Places listings in your area and mix them in with the natural results. If you are ranking well on Google Places, then you could start to appear well for more non-geographic searches too.
If you currently pay for Google advertising, you may find your click through rate (CTR) drops if your average position is greater than 3. This is because the map now pushes the advertisements below the fold on the right hand side. This, in turn, could create a bidding war for positions 1, 2 and 3 so pushing up your advertising costs (note that Google varies the number of advertisements appearing above the natural listings).
Our advice at this time is to focus on optimising your local presence using the steps above, and in conjunction with a good optimisation team that knows how to optimise for Google Places effectively and ethically.
Right now it’s too early to say if Google will keep this format. Google likes to experiment and this could be another test to see how this format performs. The format may not stay exactly as it is, not least because Google’s paid adverts are also getting pushed below the fold on the right hand side and this could decrease advertising revenues – time will tell though, as it may be that increased revenue from positions 1-3 at the top of the page actually balances the decreased revenues for the lower-placed paid advertisements.
Published on October 27, 2010
We noticed that on 22 October one of our clients, who has held top-5 listings on Google for many keywords for many years, suddenly had a random batch of pages removed from Google. Our client's site was carefully optimised without being over-optimised, and had unique content, unique metadata, unbiased website reviews - basically, everything Google looks for in a site. Most of the random batch of pages were completely removed, others lost positions, whilst the rest maintained their existing positions without problem. To further complicate things, pages with a lower 'value' to Google started appearing near the top of the rankings in their place.
This left us baffled. Why has Google suddenly dropped pages and positions for a perfectly good site? A bit of research suggests that we're not alone and there are webmasters and SEO specialists around the globe scratching their heads asking exactly the same question having experienced exactly the same problem.
Then we started to look a bit further. Could this be down to another, coincidental issue? An error at Google?
On 22 October, reports started to emerge that Google wasn't indexing brand new content. CNN was one site flagged as an example. On 24 October Matt Cutts of Google reported via Twitter 'Just fyi, the right people on our indexing team are resolving the issue that people have reported', which suggests the problem was still apparent two days later. Which then led us to think that perhaps this is all a bit coincidental; Google admits an indexing problem around 22 October and good web sites around the globe suddenly start losing rankings at the same time.... hmmmmm...
Our theory is that Google's indexers failed (this we know), not only for new content but also for existing pages being reindexed (this we can't confirm). If that's the case, the pages went to be re-indexed, the indexers failed, and this led Google to 'believe' the pages being reindexed no longer existed, so they were pulled from the Google search results. Now, for sites like CNN that are indexed almost in real time we'd expect any missing pages to re-appear pretty quickly after the indexing technology is fixed. But, for those of us with sites that aren't quite that big, that are indexed perhaps every few weeks, it could be a few weeks before those 'missing' pages are reindexed, rediscovered and added back in to the indexes.
To us, this theory is backed up by the timing of Google's own admission, as well as by the random nature of the page removals our client experienced.
The other theory, of course, is there's been a large-scale algorithm update as happened in May / June 2010. Although, this doesn't seem to follow through quite as well, as we would have expected to see different results to those we saw with our client as well as with other sites we've seen reporting the same type of problem. At this stage, there doesn't seem to be a pattern that would back this up. We might be wrong of course, and if we are time will tell!
So, for webmasters, SEO experts and site owners what's our advice? Don't panic! As with all things Google, first sit tight for a few weeks and see what transpires. If our theory is right, then everything should return to normal in the next few weeks. If the algorithm change theory is right, then in the next few weeks it will become more evident who the winners and losers were, and how you can adjust your site to get back into favour with Google. Either way, action now wouldn't be prudent - hard as it is, take a deep breath, relax, and see what unfolds.
Published on October 20, 2010
We're often asked if it's possible to guarantee a number one ranking on Google, with clients showing us references to companies that claim they can. The following video, direct from Google, helps to dispel the myth that no. 1 rankings can be guaranteed. Should you be approached by a company claiming they can guarantee positions in this way, beware.
This is not to say that optimisation cannot work for a site, because it can - and does. We have many clients that have benefitted from search engine optimisation campaigns. But any SEO campaign should be undertaken in an ethical manner, with realistic expectations as to the placement results. Good SEO involves a good working partnership and a high level of understanding between the client and the optimisation company. When that's in place, you'll build a business plan that is achievable and successful, with search engine positions you'll be happy with not just tomorrow, but in the months and years to come.
Published on October 4, 2010
Tags: Web Design London
Time and again, we've seen clients who have had a bad experience with their web designer. Sometimes it’s obvious; the web design is broken, the functionality doesn't work properly, there are massive delays on the project, or the website designer just disappears.
Other times, though, things may not be so easy to spot but could have equally serious implications for your web site and for your business.
You run a business though, and you shouldn’t have to concern yourself with what goes on behind the scenes. So what is it that you need to look out for, and more importantly, why?
Poorly written, insecure code. This is the most common problem we find when we’re reviewing code written by other web designers. If you have any kind of database on your site (such as a content management system or ecommerce web site), or any kind of customer login facility, this can be the most serious issue you need to concern yourself with. Poorly written code can make your site vulnerable to SQL injection and HTML injection attacks, which can lead to anything from hacked, virus-infected web pages being displayed on your site through to far more serious issues such as the theft of customer data – which can also put you at risk of prosecution under the Data Protection Act.
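To illustrate the SQL injection risk in concrete terms, here is a minimal sketch (using an invented in-memory `users` table, not any client's actual code) contrasting string-built SQL with a parameterised query:

```python
import sqlite3

# Hypothetical example: an in-memory users table stands in for a real
# customer database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, password_hash TEXT)")
conn.execute("INSERT INTO users VALUES ('alice@example.com', 'abc123')")

def find_user_unsafe(email):
    # VULNERABLE: the input is concatenated straight into the SQL, so a
    # value like "' OR '1'='1" changes the meaning of the whole query.
    return conn.execute(
        "SELECT * FROM users WHERE email = '" + email + "'").fetchall()

def find_user_safe(email):
    # SAFE: a parameterised query treats the input purely as data.
    return conn.execute(
        "SELECT * FROM users WHERE email = ?", (email,)).fetchall()

# The classic injection string leaks every row through the unsafe
# version, but matches nothing when bound as a parameter.
print(len(find_user_unsafe("' OR '1'='1")))  # every row returned
print(len(find_user_safe("' OR '1'='1")))    # no rows returned
```

The fix is a one-character change per query, which is why code reviews catch this class of problem so reliably.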
Business logic mixed up with controller code. Code should be written in a structured, layered format with a layer for the database, a layer for the business logic (the rules and functions about how data gets in and out of a database), and a layer for the presentation (what actually gets displayed in the visitor’s web browser). Actually, there’s a bit more to it than that, but that’s the core of it in simple terms. Very often, poor programming will mix the layers together, which can make management of the code extremely difficult, can make the code unreliable, and can make it difficult to provide fixes and updates.
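The three layers described above can be sketched in a few lines; the function and field names here are invented for illustration, and real frameworks draw the same boundaries with much more machinery:

```python
# Data layer: the only code that knows how records are stored.
_PRODUCTS = {1: {"name": "Widget", "price_pence": 1000, "stock": 3}}

def fetch_product(product_id):
    """Retrieve a product record, or None if it does not exist."""
    return _PRODUCTS.get(product_id)

# Business-logic layer: rules about the data, with no storage
# details and no HTML.
def can_order(product_id, quantity):
    """Decide whether an order is fulfillable from current stock."""
    product = fetch_product(product_id)
    return product is not None and product["stock"] >= quantity

# Presentation layer: turns an answer into something a browser shows.
def order_button_html(product_id):
    if can_order(product_id, 1):
        return "<button>Add to basket</button>"
    return "<p>Out of stock</p>"
```

Because each layer only calls the one below it, the storage can be swapped or the stock rules changed without touching the HTML, and vice versa; when the layers are mixed, every change risks breaking all three concerns at once.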
Bad coding practices. Code can be written in a number of ways and it depends on the developer’s training and preference as to how they might write individual functions and pieces of code. A developer with a strong background in coding methodology and who keeps their skills regularly up-to-date will generally write code better than someone that may have learnt in a more ad-hoc way and doesn’t continue with any ongoing training. Poor coding practices can lead to the situations mentioned above, as well as poor-performing code that can affect, for example, the speed of page loads (which affect Google positions) or the speed of on-site searching.
Lack of commented code. Generally, good practice dictates that when code is written, each function within the code should be commented. This helps any other developer who updates the code to better understand what the code is intended to do, and how changing it will affect other parts of the website. Lack of comments indicates a lazy approach to coding, and can make it almost impossible for another developer to pick up the code and successfully make changes without having some other documentation (such as a technical specification) to work with.
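As a small example of what we mean, here is a hypothetical voucher function (the codes and rates are invented) commented the way we would expect to find it, so that the next developer knows not just what it does but why it was written that way:

```python
def apply_discount(price_pence, code):
    """Return the price after applying a voucher code.

    Prices are held in whole pence so that no floating-point
    rounding errors can creep into customer totals.
    """
    # Unknown codes are silently ignored rather than raising an
    # error, so a mistyped voucher never blocks a checkout.
    rates = {"SAVE10": 10, "SAVE25": 25}
    percent = rates.get(code, 0)
    # Integer arithmetic keeps the result an exact number of pence.
    return price_pence - (price_pence * percent) // 100
```

Strip the docstring and comments out and the code still runs, but the two deliberate design decisions (pence, and tolerating bad codes) become invisible, and the next developer may "fix" them into bugs.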
Sadly, unless you are familiar with code you’re not likely to be able to spot the above four issues yourself. So how can you tell whether your code is well written or not?
If the price is too good to be true, it probably is. The old adage certainly applies in web design. Whilst you may be working to a budget, don’t pick the lowest quote if all of the others are considerably more expensive. If you are going to outsource overseas then do be aware that the majority of poor code reviews that we conduct are from well known overseas outsourcing countries. That’s not to say that every company in an overseas location is incapable – there are good ones out there – but the prevalence of poor coding habits, driven perhaps by the educational process for developers, is higher.
If your project has had problems, get your code reviewed. If your project has been troublesome, running late, or your developers have had problems fixing bugs when you raise them, then consider having the source code reviewed by a third party. Very often, obvious warning signs such as these mean things are not ideal behind the scenes. The sooner in the development process you are able to do this, the better.
Don’t pay everything upfront! It sounds obvious, but it does happen. Typically, you may be asked to pay 30% on project commission, 40% when the beta site is delivered and 30% on project conclusion. Most developers won’t release the full code until you’ve paid in full, which is fair, but you should be able to ask for a portion of the code at beta which can be reviewed by a third party. Then, if problems are found with the code quality you can either ask your developer to rectify them or reach an agreement to close the project off at a discounted total fee and move to another developer who can conclude your project. It’s worth noting though, that a reputable developer may review your original developer’s code and tell you that you need to start again. This is the worst news to receive, but it is sometimes a necessity to go back to the beginning to get it absolutely right.
If you are currently looking for a web developer to complete your web site because you’re experiencing difficulties with your existing supplier, contact us today for a free assessment and no-obligation discussion.
Published on October 1, 2010
Tags: Web Design London
As you’ll no doubt know, on 1 January 2011 the UK VAT rate will change from 17.5% to 20%. Many businesses now run websites where VAT calculations are an integral part of the site, and we have already started planning to update clients’ web sites where necessary. In fact, we’ve even started receiving requests from companies that aren’t our clients!
Most of our clients have the ability to change the VAT rate on their web site design themselves, which is of course by design! But, that’s not always the case on some static sites, or even if self-management is possible some clients may not feel confident with making the change themselves.
This is where we come in. If you know your site needs updating to support the 20% value added tax rate, and either your site doesn’t allow you to do it or you don’t feel confident in doing it, contact us today. We’ll review your site and advise you whether we can add you to our schedule, and whether a charge will need to be levied or we can make the change for free.
Do hurry though, because our VAT change to web sites list is filling up fast!
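For the technically curious, this is why a well-built site makes the change trivial: the rate lives in one setting rather than being hard-coded into every price calculation. A minimal sketch (the names are our own invention, not any particular platform's API), using decimal arithmetic so pennies round correctly:

```python
from decimal import Decimal, ROUND_HALF_UP

# One configurable setting; when UK VAT moves from 17.5% to 20% on
# 1 January 2011, only this line needs to change.
VAT_RATE = Decimal("0.175")   # becomes Decimal("0.20") from 2011

def gross_price(net_pounds):
    """Add VAT to a net price, rounding to the nearest penny."""
    net = Decimal(str(net_pounds))
    gross = net * (1 + VAT_RATE)
    return gross.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(gross_price("100.00"))  # 117.50 at the current 17.5% rate
```

On a site where the rate is instead repeated inline across templates and scripts, the same change means hunting down every occurrence, which is exactly the work we are scheduling for clients now.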
Reproduction: These articles are © Copyright Ampheon. All rights are reserved by the copyright owners. Permission is granted to freely reproduce the articles provided that a do-follow hyperlink is included linking back to this article page.