Your Web Design Agency Doesn’t Know SEO – Despite What They Say!
I’ve worked for and with a lot of digital agencies over the past 16 years, and from this (and from working alongside third-party web developers on accounts built in-house or by another agency) I’ve learned one thing: web designers and developers simply don’t know SEO.
The real purpose of this post is to highlight what may be in your contract, and where you can arguably pass any technical SEO optimisation costs back to your original designers.
If you’ve recently had a new website created, your web design agency may have added something along the lines of “your website will be SEO optimised” to the spec. If so, I’ll eat my hat if the site is actually optimised using even the basics of Google’s best practices. There are some very simple reasons why I know this to be true:
- SEO changes every few months, and your web developers have probably read something that’s badly out of date and still hold it to be a valid set of tactics today.
- Developers tend to put in place the quickest solution to technical challenges, never the best one. This is because they’re typically building a site to a price and are conscious of time.
- There’s an assumption that the various off-the-shelf and open-source platforms are already SEO optimised.
- Shortcuts such as canonical tags are assumed to be the best solution.
If you’ve recently contracted an SEO agency then you’ll know that the first thing we do is fully audit a website to ensure it’s the best possible platform for marketing and that there’s nothing hidden within the code that will hamper SEO. The findings typically get passed back to your web developers, who cost out the changes, feed back any challenges (there are always challenges) and then repeatedly fail to follow instructions, contributing to more time and frustration all around.
Developers hate revisiting a website’s code. This is partly because they can’t remember the logic behind some of their coding decisions, partly down to fatigue from seeing the same site for many months, and partly because they’re being asked to rework all of the shortcuts they previously put in place.
If your website was built and your contract stipulates SEO then I highly recommend that any SEO optimisation costs are passed back to the design agency!
It can be argued that a lot of SEO is subjective; however, most issues are readily apparent once you know what you’re looking for.
Just a few of the common issues I encounter almost weekly are:
- Blocked sites – Usually a legacy of the site being built on a test domain: most developers forget to change the site’s robots.txt file to allow search engines to access the content. The X-Robots-Tag is another way developers block search engines, typically using noindex, nofollow attributes. Also be careful when asking your web designer to require users to log in before accessing a website’s content; in a significant majority of cases this will kill your website’s ability to convert. To check for this, visit yourdomain.com/robots.txt and look for a line such as `Disallow: /`. Also right-click on your home page, choose View Source, press CTRL + F to get a search box, and search for something like noindex.
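Both checks above can also be scripted. The sketch below is a deliberately naive, standard-library-only helper, assuming you’ve already fetched the robots.txt body and page HTML yourself; it ignores user-agent grouping in robots.txt and assumes the `name` attribute comes before `content` in the meta tag.

```python
# Sketch: check a robots.txt body and a page's HTML for common
# search-engine blocks. Offline helpers; fetching the files is up to you.
import re

def robots_blocks_all(robots_txt: str) -> bool:
    """True if any Disallow rule blocks the whole site (Disallow: /)."""
    for line in robots_txt.splitlines():
        rule = line.split("#", 1)[0].strip()   # drop comments
        if rule.lower().startswith("disallow:"):
            path = rule.split(":", 1)[1].strip()
            if path == "/":
                return True
    return False

def page_has_noindex(html: str) -> bool:
    """True if a robots meta tag contains a noindex directive."""
    tags = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    return any("noindex" in tag.lower() for tag in tags)

print(robots_blocks_all("User-agent: *\nDisallow: /"))                      # True
print(page_has_noindex('<meta name="robots" content="noindex,nofollow">'))  # True
```

A real audit tool would also parse the X-Robots-Tag HTTP header, but this is enough to catch the forgotten test-domain block.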
- Google Analytics not being installed correctly – I’ve seen partial integrations, multiple GA codes in use, GA only on the site’s home page, and so on. Without Google Analytics present across your website you really have no idea how the site is performing. To check this, download the free Screaming Frog SEO Spider (screamingfrog.co.uk), perform a crawl, and in the bottom right look for values in the No GA entry.
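If you’d rather script the check, a quick page-source scan is enough to flag pages with no tracking at all. The patterns below are an assumption covering the common Google Analytics snippets (the analytics.js loader, the gtag.js loader, Universal Analytics `UA-` IDs and GA4 `G-` measurement IDs); adjust them for your own setup.

```python
# Sketch: scan a page's HTML for common Google Analytics snippets.
import re

GA_PATTERNS = [
    r"www\.google-analytics\.com/analytics\.js",  # classic analytics.js
    r"googletagmanager\.com/gtag/js",             # gtag.js loader
    r"\bUA-\d{4,}-\d+\b",                         # Universal Analytics ID
    r"\bG-[A-Z0-9]{6,}\b",                        # GA4 measurement ID
]

def has_ga(html: str) -> bool:
    """True if the page source contains any recognisable GA snippet."""
    return any(re.search(pattern, html) for pattern in GA_PATTERNS)

snippet = '<script src="https://www.googletagmanager.com/gtag/js?id=G-ABC123XYZ"></script>'
print(has_ga(snippet))                    # True
print(has_ga("<p>No tracking here</p>"))  # False
```

Run this over every page of a crawl and any page returning False is missing tracking.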
- No goals or E-commerce tracking – An extension of the above: without event tracking, goals or e-commerce tracking set up in Google Analytics you are basically trading blind. To check this in Google Analytics, scroll down to the Ecommerce section and see whether any transactions and revenue are being recorded.
- Canonical patches – Symptomatic of pages with filtered products, bad URL structures and flaky development practices. A canonical reference is used when two similar pages are present on the site and lets you specify which one you want search engines to index. In most cases, simply switching from query strings to #-based parameters in your URLs avoids this issue completely. To check this, look for ? in filter URLs, e.g. ?colour=red.
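Here is a small sketch of both halves of that check: spotting query-string filter parameters in a URL, and reading whatever canonical tag the page declares. The URL and helper names are illustrative, and the regex assumes `rel` appears before `href` in the link tag.

```python
# Sketch: spot query-string filter URLs and read a page's canonical tag.
import re
from urllib.parse import parse_qs, urlsplit

def filter_params(url: str) -> dict:
    """Return the query-string parameters of a URL (empty dict if none)."""
    return parse_qs(urlsplit(url).query)

def canonical_url(html: str):
    """Return the href of the first rel=canonical link tag, or None."""
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I
    )
    return match.group(1) if match else None

print(filter_params("https://shop.example/t-shirts?colour=red&size=m"))
# {'colour': ['red'], 'size': ['m']}
print(canonical_url('<link rel="canonical" href="https://shop.example/t-shirts">'))
# https://shop.example/t-shirts
```

If a filtered URL has parameters but no canonical tag pointing at the clean version of the page, you’ve found a candidate duplicate.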
- View all – Probably one of the few exceptions to the above issue is when products are displayed in categories across multiple paginated pages. In this case I’d suggest all of these paginated pages have a canonical reference to a single view-all page containing a link to every available product in the category. This flattens the site architecture and ensures all products have the same Internal PageRank passed to them irrespective of click depth from the website’s home page.
- Query-strings in general – See canonical patches above
- Links to resources that redirect to log-in pages – Your log-in page is typically one of the most linked pages on your website, and usually one with no commercial benefit for new users. Your developers should check whether a user is logged in and, if so, show links to logged-in resource pages such as wishlists, product returns, my-account pages and so on; if not, just show a login button and remove all of these usually redirected links. To check this, click on your basket page without adding a product: if you’re redirected to a log-in page then you have this issue. Also try any My Account, Wishlist etc. sections that are linked but require login to use.
- Thin content – This can be down to clients failing to supply enough text, but each page of your website should simply be worth reading. A two- or three-line product description isn’t enough; simply stating that you supply a service isn’t enough. You need to provide insight and be the best possible resource for everything you want your users to read. A small website with long pages of interesting, engaging content is better than a massive site full of thin content. Thin content is a fast way to have your website penalised via Google’s Panda algorithm. To check this, browse your website and look for areas of white space in your product descriptions and service pages that could contain text.
- Copied content – Your web designer should know enough to ask whether you’re copying content from a third-party resource. This could be a spreadsheet provided by a supplier or something copied from another website or brochure. Every page on your website should be unique and the best possible resource for your customers, fully exploring your products/services and answering the relevant commonly asked questions. Duplicate content is another way to have your website penalised via Google’s Panda algorithm. To check this, copy a few sentences from some sample pages and paste them into a Google search; if you can find the same text on other websites then you may need to rewrite your content.
- Broken links – Covering both internal and external links: all links within your website should be current and active. Broken links not only lead to poor customer experience but also hurt your website’s Internal PageRank, resulting in lower than expected rankings for your whole site. To check this, use tools like the Screaming Frog SEO Spider or the free Microsoft SEO Toolkit, which will report on broken links.
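For the curious, the first half of what those tools do is simple to sketch: collect every link on a page so each can then be requested and its status code inspected. This standard-library version only does the collection step; the fetching (e.g. with `urllib.request`, reporting anything that returns a 404) is left out.

```python
# Sketch: pull every anchor href out of a page, the first step of a
# broken-link check.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href of every <a> tag that has one.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html: str) -> list:
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

page = '<a href="/about">About</a> <a href="https://example.com/gone">Old</a>'
print(extract_links(page))  # ['/about', 'https://example.com/gone']
```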
- Redirected internal links – Each time a link is redirected within your website it loses some Internal PageRank. As you control both the source and destination of these links, updating them should be a no-brainer.
- No Title Tag optimisation – I see too many home pages with the title tag “Home Page”. Every page on your website should have a unique title tag designed to align the searches made by your customers with the content on the page. Yes, you can procedurally generate category and product titles for large e-commerce sites, but even then your best sellers should be tailored to deliver maximum effect.
- No Meta Descriptions – It’s not a ranking factor, but a unique description of up to 155 characters for each page can improve click-through rates for organic traffic. There’s no reason why these aren’t created; again, for large e-commerce sites they can be automated.
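The automation mentioned for both points can be as simple as a template plus a length cap. The template shape and the 155-character limit here are assumptions to tune; best sellers should still be hand-crafted.

```python
# Sketch: procedurally generate title tags and meta descriptions
# for an e-commerce catalogue.
def make_title(product: str, category: str, brand: str) -> str:
    """Build a unique, descriptive title tag from catalogue fields."""
    return f"{product} | {category} | {brand}"

def make_meta_description(text: str, limit: int = 155) -> str:
    """Trim a product blurb to the limit without cutting a word in half."""
    if len(text) <= limit:
        return text
    return text[:limit].rsplit(" ", 1)[0].rstrip(",;") + "…"

title = make_title("Red Cotton T-Shirt", "Men's T-Shirts", "Example Apparel")
print(title)  # Red Cotton T-Shirt | Men's T-Shirts | Example Apparel
```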
- No Alt Attribute information on images – Also known as Alt Tags or Alt Text. Google can’t ‘read’ images and doesn’t know what’s in them; Alt Attributes provide this information and help improve page relevancy and ranking, especially where images are used as links. If your products are visually appealing you can also leverage Google Image search to attract even more custom.
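Auditing for missing alt text is another check that scripts easily. A minimal sketch, using only the standard library and assuming you feed it the HTML of each page from your crawl:

```python
# Sketch: list images that are missing (or have an empty) alt attribute.
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        # Flag any <img> whose alt attribute is absent or empty.
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

def images_missing_alt(html: str) -> list:
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing_alt

page = '<img src="shirt.jpg" alt="Red cotton t-shirt"><img src="logo.png">'
print(images_missing_alt(page))  # ['logo.png']
```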
- Internal anchor text – Using keywords as link text, and understanding that the first link to a page within the site’s code is the one that carries the anchor text’s relevance, helps you make decisions about menus and categorisation.
- Keyword use – Was keyword research part of your website’s new design? I’d guess probably not. Without good keyword research you really don’t know where you can add value to your website, how to align your products and services with your customers’ expectations, how to structure basic site architecture, or generally what and how to write your content.
- Architecture – In the majority of cases your web design company will ask what content you want on your website and expect you to write or provide the information to build the site. Creating a website in this manner means everything is based on assumption. SEO experts help design website architecture so that no content is wasted; after all, what’s the point in writing a page if it’s destined to get zero traffic?
- Same products in multiple categories – In many e-commerce systems, clients ask for products to be made available across multiple categories. This only becomes a problem if your site’s URLs are automatically generated along the lines of Domain/Category/ProductName, which causes the same product information to be displayed across multiple URLs. I see this all of the time. To fix it, put all of your products in the root, i.e. Domain/ProductName; you can then list products in multiple categories without any problems. It will usually cause an issue with any breadcrumbs, but just take them out.
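The fix amounts to a URL rewrite: however many category paths a product appears under, it resolves to one root-level address. A sketch, with purely illustrative URL shapes:

```python
# Sketch: flatten category-scoped product URLs to a single root-level URL,
# so a product listed in several categories has one address.
from urllib.parse import urlsplit, urlunsplit

def root_product_url(url: str) -> str:
    """Rewrite /category/.../product-name to /product-name."""
    parts = urlsplit(url)
    slug = parts.path.rstrip("/").rsplit("/", 1)[-1]  # last path segment
    return urlunsplit((parts.scheme, parts.netloc, "/" + slug, "", ""))

for u in ("https://shop.example/mens/t-shirts/red-cotton-tee",
          "https://shop.example/sale/red-cotton-tee"):
    print(root_product_url(u))
# both print https://shop.example/red-cotton-tee
```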
- Conversion optimisation – Your website’s workflow, user journey and calls to action need to be carefully crafted and adjusted over time to ensure a high return on investment. While basic elements like contact forms may seem simple to design, careful consideration should go into the fields, layout and overall design. In addition, many web developers rely on a framework or platform’s built-in workflows, which are typically over-complicated even when, for example, a one-page checkout is implemented. SEO lives and dies by its ability to attract and convert custom; web designers are more concerned with what looks ‘nice’ and developers with what can be achieved quickly.
- Slow websites – Scoring under 80% on both desktop and mobile checks, many websites aren’t optimised for speed at launch. Every extra second a website takes to load dramatically reduces your conversion rate. This matters even more on mobile, where conversion rates are typically half those of desktop traffic. Google gives you a free tool (PageSpeed Insights) to check your website’s speed, with insights into the work required to improve the rating.
- Websites that aren’t responsive – Some retailers have reported a 100% increase in conversion from having a well-designed responsive version of their desktop website. While you may think such gains are only to be made by e-commerce retailers, mobile access is incredibly important for service-led businesses too.
- Duplicated titles and meta
- Hidden content
I’ll update the rest of this post later.