This month ROAST’s SEO team made the trip down to Brighton to enjoy the sea, the sun and BrightonSEO, the city’s biannual search conference. We regularly encourage our staff to attend external events to keep up to date with the latest industry innovations.

Below, the team have pulled together some top tips and key insights from their day in Brighton, covering subjects from APIs to JavaScript to advanced keyword research.

JavaScript & Frameworks – Bartosz Góralewicz

Bartosz Góralewicz provided some useful insights in his talk, Can Google properly crawl and index JavaScript? SEO experiments – results and findings revealed, in which he concluded that even Google itself is still not perfect at crawling JavaScript, though he gave some top tips on how to get the most accurate outcome.

The table below, taken from Góralewicz’s research, provides an insight into which frameworks to use if you have developers keen to work with JavaScript. As you can see, React, jQuery, Vue.js and vanilla/plain JavaScript passed all four stages of indexation and would therefore be the best options to test.

 

Making sure your JavaScript is inline rather than external can also make a huge difference to whether Googlebot crawls the page, because:

  • ‘Not all JavaScript frameworks are crawled and indexed in the same way’
  • ‘JavaScript generated links aren’t always crawled’
  • ‘Inline vs. External JavaScript makes a huge difference’
  • ‘Angular 2 must always be server rendered’

We should also remember that Google is not the only search engine out there; by relying on JavaScript, you will lose support from search engines such as Bing, Yahoo and AOL, which are less capable of rendering it.


According to Góralewicz, the best approach to using JavaScript in SEO is isomorphic JavaScript (rendered on both server and client), which can be implemented in frameworks such as React and Angular.

Despite Góralewicz’s guidelines being useful for those planning on using JavaScript, remember that Google still might not be able to read the JavaScript perfectly, and it is always worth considering all of your options before making a final decision. If it’s too late for you and your site is already JavaScript-heavy, you can use a service such as Prerender.io, which serves an HTML snapshot of your site to the search engines.
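Where prerendering fits, the routing decision such a service makes is simple to sketch. This is an illustrative sketch, not Prerender.io’s actual implementation, and the bot patterns are a hypothetical, non-exhaustive list:

```typescript
// Illustrative sketch of a prerender routing decision: crawlers get the
// stored HTML snapshot, regular visitors get the JavaScript app.
// The pattern list is hypothetical and far from exhaustive.
const BOT_PATTERNS: RegExp[] = [/googlebot/i, /bingbot/i, /yandex/i, /baiduspider/i];

function shouldServeSnapshot(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// Usage: pick the response based on the request's User-Agent header.
// const body = shouldServeSnapshot(req.headers["user-agent"] ?? "")
//   ? readSnapshot(req.url)   // hypothetical snapshot lookup
//   : appShellHtml;           // normal client-rendered app
```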

Final tip: Chrome 41 renders pages in the same way as Googlebot, which is great for debugging!

APIs in Search – Yiğit Konur, Kostas Voudouris & Stephan Solomonidis

In the next sessions, various speakers provided insights on how to take advantage of APIs without the need to learn a new coding language or lean too heavily on the skills of a developer.

The talks focused on recommended tools to simplify API requests and build your confidence when working with them. All the suggested tools open up insightful applications in SEO, and could potentially be extended to reviews and processes beyond those the speakers covered.

Yiğit Konur, in his talk, Connecting APIs without Coding Skills to Create Your Own Dashboards, suggested building drag-and-drop workflows in the data mining tools RapidMiner and KNIME to simplify your GET requests and filter the output data. Such tools can also be put to other uses, such as creating automated workflows for data mining and processing.
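For readers who do want to peek behind the drag-and-drop, the GET-and-filter step such a workflow performs can be sketched in a few lines. The row shape, endpoint and threshold here are illustrative assumptions, not the schema of any particular API:

```typescript
// Sketch of the GET-and-filter step a RapidMiner/KNIME workflow performs.
// The row shape below is an illustrative assumption, not a real API schema.
interface ReportRow {
  query: string;
  clicks: number;
}

// Keep rows at or above a click threshold, highest-clicked first.
function filterRows(rows: ReportRow[], minClicks: number): ReportRow[] {
  return rows
    .filter((row) => row.clicks >= minClicks)
    .sort((a, b) => b.clicks - a.clicks);
}

// Usage with a hypothetical endpoint:
// const rows: ReportRow[] = await (await fetch("https://api.example.com/report")).json();
// const top = filterRows(rows, 100);
```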

Kostas Voudouris walked through building performance-based monitoring and optimisation with the Search Console API and Google Sheets in his talk, Performance-based optimisation using Google Search Console API. Doing this enables you to monitor your visibility for new search queries, and it also provides a large data source for large-scale content analysis and keyword optimisation.
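The “new queries” check described above boils down to a set difference between two pulls of query data. A minimal sketch, assuming you have already exported the query strings from the API into two lists:

```typescript
// Sketch of the "new queries" check: report queries that appear in the
// current Search Console pull but not in the previous one.
function newQueries(previous: string[], current: string[]): string[] {
  const seen = new Set(previous);
  return current.filter((query) => !seen.has(query));
}

// newQueries(["car insurance"], ["car insurance", "car insurance quotes"])
// returns ["car insurance quotes"]
```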

Stephan Solomonidis suggested using IBM Watson to simplify API requests for natural language processing (NLP) analysis. The tool allows you to identify common themes across SERP listings and keyword sets or, if you’re feeling brave, to build your own knowledge graph using IBM’s Knowledge Studio.

At ROAST, we have a selection of tools that utilise APIs to gather data at scale: page speeds and knowledge graph data, to name a few.

Advanced Keyword Research – Kelvin Newman

Kelvin Newman, the founder of BrightonSEO, gave an interesting talk on keyword research, Scary SERPs (and keyword creep), about how we as SEOs need to think differently about the future of keywords and keyword targeting.

A few years back, keyword targeting was as simple as repeating a keyword on the page as many times as possible, in the hope that Google would think ‘Okay, “car insurance” has been mentioned 50 times on this page, so it must be about car insurance and therefore worthy of ranking’, and nine times out of ten, it would have worked.

Now, though, Google recognises so much more than this and is constantly improving its algorithm, much of it based on AI, to understand intent and get a sense of what the user is really searching for. So, when creating a page about a topic, it’s about including the phrases within the topic’s semantic field and as much content as a user (and Google) would expect to see.

As well as incorporating “off the shelf” options such as machine learning and NLP tools, Newman suggests taking the top ten search results for your topic’s query and extracting the text from those pages. Then take all the words from the pages and put them into a word cloud, which will give you a visual prioritisation of how often each word is used for the given query:

 

 

This should help you better understand which keywords and phrasing you should be including on your site.
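The counting step that feeds the word cloud can be sketched as a simple word-frequency tally over the text extracted from the top ten pages. The tokenising regex here is a deliberately naive assumption:

```typescript
// Naive word-frequency tally over text scraped from the top-ten pages;
// the regex tokeniser is a deliberate simplification (no stemming, no
// stop-word removal).
function wordFrequencies(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const word of text.toLowerCase().match(/[a-z']+/g) ?? []) {
    counts.set(word, (counts.get(word) ?? 0) + 1);
  }
  return counts;
}

// wordFrequencies("Car insurance. Cheap car insurance!")
// → car: 2, insurance: 2, cheap: 1
```

In practice you would feed the counts into any word-cloud generator; the frequencies alone already give you the prioritised term list.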

As clever as Google is, Newman homed in on voice search, particularly the Google Home voice assistant, and the ‘one true answer’ trend: what happens when featured snippets go a tad wrong.

When you search for something on desktop, you’re presented with the traditional ten listings on page one, with a featured snippet appearing in “position zero”. You’re then given the choice between clicking the snippet or any of the other listings. If you can see that the snippet is misleading or incorrect, there is an option to give feedback about the result, which Google can then act on.

However, when searching with a voice assistant, you’re only given the ‘one true answer’, which in some cases can produce inaccurate results as can be seen in the example below:

 

Don’t always trust what Google says…

Newman also spoke about machine learning and, as mentioned earlier, Google’s continued efforts to understand what it is we’re searching for. For example, if you were to search for ‘computer game with Italian plumbers’, Google is now able to recognise this as Mario:

 

This means that Google not only understands the keywords but also the whole context of a given query, seeing the connections below the surface to make sure you’re served the best result. Today’s keyword research is all about understanding our clients’ needs and motivations, which will help us figure out how to create the best keyword targeting strategy.

 

AMP – Aleyda Solis

On average, 60% of searches are now conducted via mobile. As page speed is a large user experience factor for mobile users, there is now a high demand for optimising site speed to reduce bounce rate.

Google released AMP to help improve page speed both for big sites constrained by restrictive legacy systems and for small businesses with too little resource to improve their speed. AMP uses simplified HTML, with optimised resources, to serve stripped-down versions of pages. In her talk, Setting AMP for Success, Aleyda Solis shed light on how to implement AMP successfully in web design.

According to studies into site speed, people expect pages to load within two seconds, and sites must meet this expectation to stop customers leaving mobile sites, since those abandoned visits in turn decrease conversions.

For e-commerce companies, the consequences can be devastating. According to Qubit, ‘Online retailers lose an estimated £1.7bn in global sales each year due to slow loading speeds.’ The image below shows the bounce rate as the seconds tick by:

 

 

AMP gives everyone a way to create lightweight and fast pages for mobile users.

 

Technical SEO IA – Dominic Woodman 

In his talk, Advanced Site Architecture – Testing architecture & keyword/page groupings, Dominic Woodman explained that site IA (information architecture) refers to the structure of internal links that search engines use to understand relationships across a website. More importantly, IA represents a huge opportunity to match templates and pages to a specific user intent.

Answering some simple questions about the expected steps in your customers’ journey on your website, and how to reach them, should provide important cues about where pages should sit and which keyword(s) they should include.

Matching keywords and their intent to a page template can be done through semantic analysis and a good deal of iteration. Identifying the need for a new template can be as simple as looking at the SERPs and asking whether the pages already ranking match the intent that your existing templates serve.

GTM for SEO – Sebastian Monnier

In his talk, How Google Tag Manager Can Help Your SEO, Sebastian Monnier explained that one of the benefits of using Google Tag Manager (GTM) is that it allows ad-hoc injection of JavaScript, bypassing the need to hardcode everything and therefore keeping the code “lighter”. So, if you are not able to implement changes on your client’s website directly, you could use GTM to make those changes.

Monnier argued that it is in fact possible to rank for keywords that have been injected into the title and description tags with GTM, even though they don’t appear in your HTML source. One reason for injecting keywords through JavaScript would be for websites with faceted navigation (e-commerce), or for campaign pages, for example. GTM makes it possible to scrape parts of the HTML and push that into the meta tags, and Monnier showed that this doesn’t prevent your pages from appearing in SERPs.
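As a sketch of the kind of transformation such a GTM tag might perform, the helper below trims scraped page text to meta-description length at a word boundary. The function name and the 155-character limit are our own illustrative assumptions, not anything Monnier prescribed; the commented DOM call shows how the value could then be written into the page:

```typescript
// Hypothetical helper for a GTM Custom HTML tag: collapse whitespace in
// scraped page text and trim it to meta-description length at a word
// boundary. The 155-character cap is a common rule of thumb, not a
// Google requirement.
function toMetaDescription(scraped: string, maxLength: number = 155): string {
  const text = scraped.replace(/\s+/g, " ").trim();
  if (text.length <= maxLength) return text;
  const cut = text.slice(0, maxLength);
  const lastSpace = cut.lastIndexOf(" ");
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut) + "…";
}

// Inside the GTM tag the value would then be written into the page, e.g.:
// document.querySelector('meta[name="description"]')
//   ?.setAttribute("content", toMetaDescription(scrapedHeading));
```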

Final tip: this also works for injecting nofollow tags, noindex tags and canonical tags.

In the coming months, the team will be at STAT’s City Crawl event and Google’s AMP Roadshow; keep an eye out for the write-ups.