Search engine optimisation (SEO) is that geeky thing that determines how many people see your website when they search for a specific word or phrase, and, as a result, how many people actually end up visiting it.
Contrary to what some geeks claim, SEO isn’t rocket science. But it does require understanding the basics and systematically covering your bases.
To start with, SEO has two categories: on-page SEO—page titles, internal linking, meta tags and descriptions, etc. And off-page SEO—link building, social networking, blogs, contributing to forums, etc.
In the interest of simplicity, this post will examine on-page SEO.
Our team recently optimised the on-page SEO for Dr Rob King's website. Within six months the site's traffic increased by 82.2%.
How did we do this? I want to share with my readers some of the best proven techniques that we found and applied to our clients' accounts.
1. Content comes first
If your website has brilliant copy, chances are it’s going to receive a lot of hits, with or without equally brilliant SEO. But if it does have brilliant SEO as well, it will receive even more hits.
So, what constitutes brilliant copy?
It starts with text descriptions. This means that even if your content is a photograph, infographic or video, it all still needs a text description that explains what the image or video is about.
Brilliant copy requires diversified content, not just text. Google likes multiple types of media content. Think videos, images, infographics, audio or a slideshow that are presented in an engaging way to increase the time people spend on your site and to yield better branding retention.
Your content needs to be useful. Don’t publish information for the sake of getting something out each week. If your content does not add value, don’t push it.
Your content must be well researched. No one, least of all a fussy search engine, wants to read sub-standard or flaky material. If you are going to put the effort into preparing content, make sure it contains a balanced view. And don't forget the somewhat counter-intuitive research finding that long articles rank better than short articles.
2. No duplicate content please
What is duplicate content?
Duplicate content, in the context of SEO, means the same content appearing in more than one webpage (URL) within the same or different domain (web site).
It happens when you or your staff copy content from another website and paste it onto your site, or when you use the same piece of content across different pages within your website.
This issue occurs more often when SEO guidelines are not in place during the architecture phase of building a site, so your designer or content writer might not pay attention to it.
The most common problem with duplicate content is that search engines can't decide which version to index, and as a result neither page may rank well.
Duplicate content comes in different forms; one of the most common errors is duplicate versions of the homepage:
One of the first things I check on a website is whether there are duplicate versions of the home page.
Here are some examples of duplicate homepages:
• Version 1: http://www.msccruises.com.au/au_en/homepage.aspx
• Version 2: http://www.msccruises.com.au/au_en/homepage.aspx/
If you are serving duplicate versions of your home page to the engines, the search engines view them as duplicate content. They usually figure out which version to index, but sometimes they get it wrong. Why make them think if you don’t have to?
I have seen quite a few duplicate URLs where upper-case and lower-case URL text is used inconsistently, confusing search bots and causing them to devalue the URL or not rank it well.
The issue here is if you have all of your internal or external links pointing at www.example.com/Apple-Iphone or www.example.com/apple-iphone and the “real” version of the page is www.example.com/apple-iphone, you’re splitting the link equity between these two URLs. You want all that linky goodness pointed at the one original URL you want to rank in the engines.
Almost every site I've worked on has had issues with content that is too similar to rank.
Minimise similar content. If you have many pages that are similar, consider expanding each page or consolidating the pages into one.
For instance, if you have a restaurant site with separate pages for two locations, but the same information on both pages, you could either merge the pages into one page about both locations or you could separate two pages and write unique content about each location.
Use the '301 redirect' status code to permanently redirect traffic from duplicated pages to the original ones. The 301 status code means that a page has permanently moved to a new location. (If this part is too technical for you, skip to the next point and leave this one for your web developer.)
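Assuming your site runs on an Apache server, a 301 redirect for the trailing-slash homepage duplicate above can be set up with a one-line rule in the site's .htaccess file (the paths here are illustrative):

```apache
# .htaccess: permanently redirect the duplicate trailing-slash
# homepage URL to the single canonical version
Redirect 301 /au_en/homepage.aspx/ http://www.msccruises.com.au/au_en/homepage.aspx
```

Other server platforms (Nginx, IIS) have their own equivalents, so if in doubt hand this to your web developer.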
Alternatively, place a rel="canonical" tag on duplicate pages, pointing at the original, to send a strong hint to search engines about the preferred version to index.
The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, and often takes much less development time to implement. The tag is part of the HTML head of a web page.
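Using the upper/lower-case example from earlier, the canonical tag sits in the head of the duplicate page and points at the one URL you want to rank:

```html
<head>
  <!-- placed on www.example.com/Apple-Iphone, pointing at the preferred URL -->
  <link rel="canonical" href="http://www.example.com/apple-iphone" />
</head>
```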
If you are not clear on what a 301 redirect or rel canonical tag is, just ask your web developer to fix these for you.
3. Page titles, descriptions and headings
Meta tags, including the page title and description, are considered to be some of the most important factors influencing search engine rankings.
Well-written and unique meta tags are also important for encouraging people to click through to your website from the search results pages.
Each page must have a unique title that helps both search engines and users understand what the page is about. Keep it to no more than about 65 characters. A page with the title "SEO Tips That Boost Your Site Search Ranking" is better than a page with the title "Result Driven SEO".
A page description should be up to about 155 characters and give a clear explanation of what the web page is about. It is the text displayed in the search engine results and your opportunity to entice the searcher to visit your site.
It is important for SEO purposes, as well as readability purposes, for all web pages to be properly formatted with each level of heading and text allocated its correct tag—h1, h2 or h3.
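Put together, a properly tagged page looks something like this (the title, description and heading text are illustrative):

```html
<head>
  <!-- unique title, roughly 65 characters or fewer -->
  <title>SEO Tips That Boost Your Site Search Ranking</title>
  <!-- unique description, up to about 155 characters -->
  <meta name="description" content="Ten proven on-page SEO techniques that increased one client's traffic by 82.2% in six months.">
</head>
<body>
  <h1>On-Page SEO Techniques</h1>   <!-- one main heading per page -->
  <h2>Content comes first</h2>      <!-- section headings -->
  <h3>What constitutes brilliant copy?</h3>
</body>
```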
You also need to use a readable font size of at least 12px, and keep your paragraphs to no more than four or five lines each.
4. Optimise images as well
Images are important but they should not increase the loading time of the website. Best practices for using images:
1) Use original images. If you need to use an existing image from the web, reference the source.
2) Optimise the size of your images – the smaller the file size (in bytes), the better. Use Photoshop or a free online tool such as http://www.picresize.com/ to reduce the size of an image without sacrificing quality. Keep image files below 100K.
Heavy images are one of the most common errors I have seen across many websites: owners with no understanding of speed optimisation or user experience upload whatever they have without trimming it down first.
3) Use the ALT attribute to describe the image – this helps search engines understand what the image is about.
To understand the content of an image, search engines use attributes such as the file name, Title and Alt text. They also use an image's attributes to help them understand the context of the page itself. Therefore, these attributes should contain relevant keywords.
What is an Alt tag by the way?
Alt text (alternative text) is a word or phrase inserted as an attribute in an HTML (Hypertext Markup Language) document to tell website viewers what an image is about.
Search engines can't read images, so they have to read the image name, Alt text and Title to understand the image.
4) Use descriptive filenames – don't just name your image 'image1.jpg'; try to describe what the image shows – for example, 'audi-a4-2016-for-sale.jpg'.
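An image that follows all four practices combines a descriptive filename, Alt text and Title in one tag (the car details here are hypothetical):

```html
<img src="/images/audi-a4-2016-for-sale.jpg"
     alt="Red 2016 Audi A4 sedan for sale at a Sydney dealership"
     title="Audi A4 2016 for sale">
```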
5. Have an SEO-friendly URL structure
URL best practices: use hyphens to separate words where needed for readability. Do not use underscores, spaces or any other characters to separate words, and do not use CAPITALS in URLs. The URL of a web document should ideally be as descriptive and brief as possible.
The URL structure is an important part of on-page SEO. Whenever I talk about URL structure, I prefer to split it into four major parts:
1) Permanent links – permanent links are the URLs of each page. Good URLs should be less than 255 characters and use hyphens (-) to separate the different parts.
For example, a good URL is: https://www.example.com/eyelid-surgery
A bad URL is: https://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f1
2) Categories – File your pages according to logical categories to facilitate fast and effective searches by users and search engines. Not categorising your pages is the equivalent to building a public library without a classification system.
Consider using sub-categories as well, but don't go deeper than three levels. For example, a good category structure is:
Naveen Somia > Eyelids > Lower Eyelid Surgery
3) Breadcrumb – A breadcrumb is important for all your pages because it allows users to navigate your website in a structured way, always knowing where they are and how deep below the home page they are.
Breadcrumbs typically appear near the top of a web page, providing a path back through the hierarchy of pages to each previous page the user navigated through to reach the current page. They provide a route back to the home page and may look something like this:
• home page –> section page –> sub section page
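In HTML, a simple breadcrumb is just a short navigation trail of links near the top of the page. Using the category example from above (the paths are illustrative):

```html
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/eyelids/">Eyelids</a> &gt;
  <span>Lower Eyelid Surgery</span>  <!-- current page, not linked -->
</nav>
```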
4) User Sitemap – One of your options in the main menu should be the User Sitemap. This is an html file that represents the structure of your website. Visit Result Driven SEO’s site sitemap for an example.
6. Have a strong internal linking structure
We have the mail-man deliver our post (snail mail) to our street address. The online equivalent of a street address is a URL. To find a page on the web, we type in a URL and the browser endeavours to show us the requested destination.
1) URLs help human visitors and search engines find pages. A site’s URL structure should be as simple as possible. Having simple, SEO-friendly URLs helps with website optimisation but also makes it easier for visitors to navigate their way to, and around, your website.
Linking to your website's URLs is an important part of good SEO. This tutorial by Google on how search works demonstrates that the first thing a search engine spider does is follow the links it finds.
2) When a search engine finds a page with links, it reads those pages as well. This means you can use this technique to show search engines pages of your website they have not yet discovered.
Another benefit is that using keywords in your URL, as part of the folder structure, adds value to your SEO efforts.
For example, if you're searching for information about 'Eyelid Surgery', an SEO-friendly and well-structured URL will look something like this: https://yourwebsite.com/eyelid-surgery.
The information provided in the URL helps the user decide whether to click that link, and lets the search engines know it's a page about eyelid surgery.
A URL like www.example.com/index.php?id_sezione=360&sid=3a5ebc944f1 is much less appealing to users. It is less clear where they will be taken and what they will find when they arrive, so they are less likely to click a link that looks like this.
In addition, complex URLs can provide barriers to search engines trying to index a site. They may not proceed to the page if the link contains special characters or is too long. This can mean that many pages on your site will not be indexed.
3) Every website has some pages that are more important than others. Highlight the most important pages by sending them more internal links.
4) Use internal linking to increase the time a user spends on your site. If they are reading your content, a link to a new page will increase both the time they spend on your website and the number of pages per visit.
Best practices for internal linking:
• Don't use only keywords for your internal links. Make them natural by linking with different words.
• Add useful internal links
• No more than 7-8 internal links per page (this is my opinion and not based on any research or studies)
• "Nofollow" unimportant pages. On the landing page (the page that you want to rank), try not to link out too much to other pages or to other sites. Keep these links nofollow so the link juice, or ranking power, doesn't get passed on to other pages.
Links are 'follow' by default, meaning that the 'link juice' and the authority of the page are transferred to the page the link points to. Use 'nofollow' to stop this flow of authority from your pages to external websites you don't want to pass your authority to. For example, 'nofollow' could be used on links to your developer's website or to Facebook.
• If applicable you can also use ‘related posts’ at the end of each post for internal linking.
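The difference between a normal (followed) internal link and a nofollow external link is a single rel attribute (the URLs are illustrative):

```html
<!-- internal link, followed by default: passes ranking power -->
<a href="/eyelid-surgery">read more about eyelid surgery</a>

<!-- external link marked nofollow: does not pass ranking power -->
<a href="https://www.facebook.com/yourpage" rel="nofollow">Follow us on Facebook</a>
```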
7. Speed – you need to have a fast loading website
Google is investing a huge amount of money to make the web faster. You will bump into lots of articles that talk about the importance of speed and Google’s desire to include the fastest websites in their index.
In order to 'force' website owners to take speed into account, Google has officially added speed as one of its ranking factors.
Make sure that your website loads as fast as possible by following Google's recommendations.
Fast, optimised pages lead to higher visitor engagement, retention, and conversions. Google's PageSpeed tool delivers more than scores and insights: when you run a PageSpeed test, Google tells you exactly what you need to change to improve your score and performance.
Speed checking tools:
• Google Speed Test: https://developers.google.com/speed/pagespeed/insights/
• Pingdom Speed Test: http://tools.pingdom.com/fpt/
Test your sites and let your developers know about the issues these tools raise.
8. Mobile friendly website is a must
Did you know?
• Mobile marketing spend in the US is predicted to hit $65 billion by 2019. This figure in itself may mean little in Australia, but the graph illustrating it clearly shows the expected linear growth of mobile versus desktop, which will affect us too.
• Mobile users are picky. They have low tolerance for poor mobile experiences. If your mobile site loads slowly, 43% of your users won’t try to come back and most will go to a competitor’s site, according to data from e-commercefacts.com.
Traffic coming from mobile devices on websites I have managed averages 35%, and it is continuously trending upwards.
With so much focus on usability, the demise of desktop browser dominance, and the prevalence of mobile devices, Google has made it very clear that 'no mobile experience' equates to 'no love from Google'!
Therefore, creating a user-friendly site design that works well and fast across all devices – especially mobile and tablet—is important.
A mobile version of your website can be built easily using one of three popular approaches:
• dedicated mobile site (m.example.com.au)
• responsive design (Google’s preferred method)
• native application (for complex designs/user interfaces).
I recommend responsive design for simple sites and for anyone with a limited budget and limited resources for maintenance. I built a Responsive Design Testing Tool on my website, so if you want to see how your website looks on different devices, check it out.
9. Rich snippets
Snippets are the few lines of text that appear under every search result. They are designed to give users a sense of what’s on the page and why it’s relevant to their query.
Implementing rich snippets can significantly improve your website's performance, helping you achieve:
• Better click-through rates
• More qualified traffic
• Better conversion rates
• Higher rankings in search results
Following are some of the types of rich snippets supported by Google:
- Rich snippets – Breadcrumbs
- Rich snippets – Events
- Rich snippets – Music
- Rich snippets – Organizations
- Rich snippets – People
- Rich snippets – Products
- Rich snippets – Recipes
- Rich snippets – Review ratings
- Rich snippets – Reviews
- Rich snippets – Software applications
- Rich snippets – Videos: Facebook Share and RDFa
- Schema.org markup for videos
- Rich Snippets – Social Profile Link
- Rich Snippets – Educational Organization
Adding rich snippets is sometimes overlooked by developers because it can be tedious: it involves marking up the relevant information on your website using the types listed above. But adding rich snippets to your website is necessary for good SEO results.
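As a sketch of what this markup looks like, here is a product review-rating snippet in Google's preferred JSON-LD format, placed in the page's HTML (the product name and figures are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Audi A4 2016",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "27"
  }
}
</script>
```

With markup like this in place, Google may display the star rating and review count directly under your listing in the search results.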
10. Other technical setup
Google Webmaster tool
Sign up for Google Webmaster Tools to submit your XML sitemap and receive notifications on your site's crawling performance and other updates from Google to webmasters.
Google Analytics
This free tool helps you monitor your website's traffic performance and user behaviour (time on site, pages per visit, and events users trigger such as phone clicks, email clicks, video views…).
Use URLs, time, pages/visit, and events to set up goals so you can track the essential metrics of your site. The closer these metrics are to activities that generate revenue, the better. You should definitely start tracking:
• Trial signups
• Account creations
• Newsletter signups
• White paper downloads
• Ebook downloads
• Anything else that helps you generate income
Measure the success of a visit when users fill in a contact form or make a purchase. Goal URLs, or funnels, show you exactly how many people step through each part of your marketing process.
It shows how many people abandon the funnel and at which part, giving you a fair indication of which pages need fixing.
E-commerce tracking measures the number of transactions and the amount of revenue generated by your website.
The data available to you in Google Analytics Ecommerce reports includes the following information by traffic source, so you get a clear picture of where your sales originated—online marketing, SEO, referral traffic:
• Conversion rate
• Number of transactions
• Total revenue
• Average order value
• Number of unique purchases
• Quantity of products sold
• Quantity of each product sold
• Revenue by product
• Average price of products
• Performance by date
• Days to transaction
• Visits to transaction
It also clearly highlights:
• Which sales were generated by organic traffic, and which were generated by paid marketing
• The time most purchases happen
• The most profitable landing pages
Google Analytics events include:
• External links
• Buttons (Emails, Phone Numbers)
• Time spent watching videos
• Social media buttons
• Widget usage
Any element that your visitors interact with can be tracked with events, so you can monitor users' behaviour on your landing page or anywhere on your website.
Check broken links
You can use a tool such as www.brokenlinkcheck.com to identify error pages on your website and fix them, or you can visit the Crawl report in Google Webmaster Tools to view Google's version of this report.
Setup XML sitemap
If you are going to the trouble and expense of creating pages and providing content then it’s important that you get the greatest possible value out of it in the search results. An XML sitemap can help ensure that the search engines are aware of all of your content.
There are different kinds of sitemaps. An XML sitemap is simply a list of all the pages on your website. Creating and submitting a sitemap to the various search engines helps make sure they know about all the pages on your site. This includes URLs that the crawlers may not be able to access.
A sitemap can be particularly helpful for sites that have poor internal linking, an extensive archive of content, few inbound links (such as new sites), dynamic content, or rich content created with AJAX.
It is best practice for all websites to submit an XML sitemap to the primary search engines.
Create an XML sitemap and submit it in Google Webmaster Tools so it's easy for search engines to crawl and index all pages of the website.
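An XML sitemap follows a simple standard format: a urlset element containing one url entry per page. A minimal example (the URLs and date are illustrative; sitemap generator tools and most CMS plugins can produce this file for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/eyelid-surgery</loc>
    <lastmod>2018-02-15</lastmod>
  </url>
</urlset>
```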
Search engines constantly send out web-crawlers to find new web pages. The search engines then index and rank them. It is possible to prevent some pages from being crawled, but this requires a special instruction.
The file used to provide the instructions about which pages to crawl and which pages to ignore is called a robots.txt file.
Pages of a site you may not want to become visible on Google, for example, may include private login areas of a website, or pages on a secure website (such as a banking site).
A robots.txt file that does not provide the correct instructions about what to crawl and what not to crawl can also mean that pages that content creators desire to be indexed may be overlooked and will therefore never be able to rank.
Add the path of the sitemap location to your robots.txt file.
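A minimal robots.txt that blocks a private area from crawling and declares the sitemap location might look like this (the paths are illustrative, and the Sitemap line must use the full URL):

```text
User-agent: *
Disallow: /private-login/

Sitemap: https://www.example.com/sitemap.xml
```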
Your on-page SEO checklist
Content is original – copyscape checked?
Content is first published on your website?
Can you add other types of content to add value?
Content has enough descriptive text?
Content is well researched with references?
Page titles, description and formatting
Page titles are unique for each page?
Descriptions are unique and up to 155 characters?
Text is properly formatted using H1, H2, H3, Bold, Italics?
Text is split into small paragraphs?
Font size is easy to read on small screens (tablets) as well?
Image size is optimised, file size smaller than 100K?
All images have Title and Alt tags defined?
Image filename is descriptive?
Permanent links use hyphens (-) as separators?
Website pages/posts are grouped into categories?
There is breadcrumb on all posts/pages?
There is an HTML User Sitemap?
Pages have internal links?
There is a ‘Related posts’ section at the end of each page?
Internal links use both keyword and non-keyword anchor text?
Speed and mobile site
Website scores more than 90% when checked by Google page speed insights?
Is a mobile-friendly design implemented for each and every post/page on the website?
Have you set up rich snippets for your products, reviews, articles, videos…?
Other technical setup
Have you signed up for Google Webmaster Tool?
Have you submitted XML sitemap to Google Webmaster Tool yet?
Are there any crawl errors reported by Google Webmaster Tools or the Broken Link Checker tool?
Are basic Goal URL, Ecommerce and Conversion tracking set up in Google Analytics?
Is the XML sitemap path in robots.txt?