This month ROAST’s SEO team made the trip down to Brighton to enjoy the sea, the sun and BrightonSEO, the city’s twice-yearly SEO conference. We regularly encourage our staff to attend external events to keep up to date with the latest industry innovations.
Final tip: Chrome 41 renders pages in the same way as Googlebot, which makes it great for debugging!
In the next sessions, various speakers provided insights on how to take advantage of APIs without needing to learn a new programming language or lean (too heavily) on the skills of a developer.
The talks focused on recommended tools that simplify working with API requests and build your confidence in doing so. All the suggested tools pave the way for insightful applications in SEO, and could even be expanded into further reviews and processes beyond their suggested uses.
Yiğit Konur, in his talk Connecting APIs without Coding Skills to Create Your Own Dashboards, suggested building drag-and-drop workflows with the data mining tools RapidMiner and KNIME to simplify your GET requests and the filtering of the data they return. Such tools can also be applied elsewhere, for example to create automated workflows for data mining and processing.
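For readers curious what those visual workflows automate under the hood, here is a minimal, hypothetical sketch in plain Python of the fetch-then-filter step that a KNIME or RapidMiner workflow would wrap (the response shape, field names and threshold are invented for illustration):

```python
import json

def filter_rows(rows, metric, threshold):
    """Keep only rows whose metric value exceeds the threshold."""
    return [row for row in rows if row.get(metric, 0) > threshold]

# Simulated API response; in a live workflow this JSON would come from a
# GET request (e.g. requests.get(url).json()) or the tool's REST node.
response = json.loads(
    '[{"query": "car insurance", "clicks": 120},'
    ' {"query": "cheap cover", "clicks": 3}]'
)

popular = filter_rows(response, "clicks", 10)
print([row["query"] for row in popular])  # → ['car insurance']
```

The drag-and-drop tools let you chain exactly these steps (request, parse, filter, output) as visual nodes, with no code required.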
Kostas Voudouris walked through performance-based monitoring and optimisation using the Search Console API and Google Sheets in his talk Performance-based optimisation using Google Search Console API. Doing this enables you to monitor your visibility for new search queries, and it also provides a rich data source for large-scale content analysis and keyword optimisation.
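As a hedged illustration of the new-query monitoring idea: once you have pulled the queries for two date ranges from the Search Console API, a simple set difference surfaces the terms you have only just started appearing for (the query strings below are made up):

```python
# Queries for two date ranges (sample data standing in for two
# Search Console API pulls).
last_month = {"car insurance", "car cover", "insurance quote"}
this_month = {"car insurance", "insurance quote", "telematics insurance"}

# Set difference: queries present this month but not last month.
new_queries = sorted(this_month - last_month)
print(new_queries)  # → ['telematics insurance']
```

In a Google Sheets setup, the same comparison can be done with a lookup formula across the two pulled ranges.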
Stephan Solomonidis suggested using IBM Watson to simplify API requests for NLP reviews. The tool allows you to identify common themes within SERP listings and keyword sets, or, if you’re feeling brave, it can enable you to build your own knowledge graph using IBM’s Knowledge Studio.
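Watson’s NLP services go much further than this, but the core idea of spotting common themes can be sketched with basic set logic, no API needed (the SERP titles below are invented for illustration):

```python
import re

# Hypothetical SERP titles returned for a single query.
titles = [
    "10 Best Car Insurance Deals",
    "Compare Car Insurance Quotes",
    "Cheap Car Insurance Comparison",
]

# Tokenise each title, then intersect: a term appearing in every
# title is a strong candidate theme for the query.
token_sets = [set(re.findall(r"[a-z]+", title.lower())) for title in titles]
common = set.intersection(*token_sets)
print(sorted(common))  # → ['car', 'insurance']
```

A real NLP service adds entity recognition and sentiment on top, but even this crude intersection reveals what a results page is “about”.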
At ROAST, we have a selection of tools that use APIs to gather data at scale, such as page speed and knowledge graph data, to name a few.
Kelvin Newman, the founder of BrightonSEO, gave an interesting talk, Scary SERPs (and keyword creep), on keyword research and how we, as SEOs, need to think differently about the future of keywords and keyword targeting.
A few years back, keyword targeting was as simple as repeating a keyword on the page as many times as possible, in the hope that Google would think ‘Okay, “car insurance” has been mentioned 50 times on this page, so it must be about car insurance and therefore worthy of ranking’. And nine times out of ten, it worked.
Now though, Google recognises so much more than this and is constantly improving its algorithm, much of it based on AI, to understand intent and what the user is really searching for. So, when creating a page about a topic, it’s about including all the phrases within the topic’s semantic field, with enough content that both a user and Google would expect to see. As well as incorporating “off the shelf” options (such as machine learning/NLP tools), Newman suggests taking the top ten search results for your topic’s query and extracting the text from those pages. Once you’ve done this, put all the words into a word cloud, which will give you a visual prioritisation of how often words are being used for the given query:
This should help you better understand which keywords and phrasing you should be including on your site.
As clever as Google is, Newman homes in on voice search, particularly the Google Home voice assistant, and the ‘one true answer’ trend: when featured snippets go a tad wrong.
When you search for something on desktop, you’re presented with the traditional ten listings on page one, with a featured snippet appearing in “position zero”. You’re then given the choice between clicking the snippet or any of the other listings. If you can see that the snippet is misleading or incorrect, there is an option to give feedback on the result so that Google can address it.
However, when searching with a voice assistant, you’re only given the ‘one true answer’, which in some cases can produce inaccurate results as can be seen in the example below:
Don’t always trust what Google says…
Newman also spoke on machine learning, and as mentioned earlier, Google’s continued efforts to understand what it is we’re searching for. For example, if you were to search for ‘computer game with Italian plumbers’, Google is now able to recognise this as Mario:
This means that not only does Google understand the keywords, but it understands the whole context of a given query, and sees the connections below the surface, to make sure you’re served the best result. Today’s keyword research is all about understanding our clients’ needs and motivations. This will help us figure out how to create the best keyword targeting strategy.
On average, 60% of searches are now conducted via mobile. As page speed is a large user experience factor for mobile users, there is now a high demand for optimising site speed to reduce bounce rate.
Google released AMP to help improve page speed both for big sites with restrictive legacy systems and for small businesses with too little resource to improve their speed. AMP uses simplified HTML, with optimised resources, to serve stripped-down versions of pages. In her talk, Setting AMP for Success, Aleyda Solis shed light on how to implement AMP successfully.
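For context, a valid AMP page is ordinary HTML plus a few mandatory pieces: the ⚡ attribute on the html tag, the AMP runtime script, a canonical link back to the regular page, and the AMP boilerplate style. A minimal skeleton (the URLs are placeholders, and the boilerplate CSS is abbreviated here; copy the full snippet from the official AMP documentation):

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime, loaded asynchronously -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Point back to the canonical (non-AMP) version of the page -->
  <link rel="canonical" href="https://www.example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- Mandatory AMP boilerplate (abbreviated; use the full version from the AMP docs) -->
  <style amp-boilerplate>body{-webkit-animation:-amp-start 8s steps(1,end) 0s 1 normal both;animation:-amp-start 8s steps(1,end) 0s 1 normal both}@keyframes -amp-start{from{visibility:hidden}to{visibility:visible}}</style>
  <noscript><style amp-boilerplate>body{-webkit-animation:none;animation:none}</style></noscript>
  <title>Hello AMP</title>
</head>
<body>
  <h1>Hello, AMP world</h1>
</body>
</html>
```

The constraints (no custom JavaScript, size-limited inline CSS, AMP components instead of arbitrary embeds) are what make the pages so fast to load.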
According to studies into site speed, people expect pages to load within two seconds; sites that miss this mark risk visitors abandoning them on mobile, which in turn decreases conversions.
For e-commerce companies, the consequences can be devastating. According to Qubit, ‘Online retailers lose an estimated £1.7bn in global sales each year due to slow loading speeds.’ The image below shows the bounce rate as the seconds tick by:
AMP gives everyone a way to create lightweight and fast pages for mobile users.
In his talk, Advanced Site Architecture – Testing architecture & keyword/page groupings, Dominic Woodman explained that Site IA refers to the structure of internal links that search engines use to understand relationships across a website, but more importantly, IA represents a huge opportunity to match up templates and pages with a specific user intent.
Answering some simple questions about the expected steps in your customers’ journey on your website, and how to reach them, should provide important cues about where pages should sit and which keyword(s) they should include.
Matching keywords and their intent to a page template can be done through semantic analysis and a good deal of iteration. Identifying the need for a new template can be as simple as looking at the SERPs and asking yourself the following questions:
Final tip: this also works for injecting nofollow, noindex and canonical tags.
In the coming months, the team will be at STAT’s City Crawl event and Google’s AMP Roadshow; keep an eye out for the write-ups.