How to Drive Sustainability in Search
Google’s data centres are estimated to cost $110m a year to power, a figure driven almost entirely by their energy consumption. That energy comes at an environmental price: over 40 per cent of energy-related carbon dioxide emissions stem from burning fossil fuels to generate electricity. Eliminating our reliance on these centres is not an option, so we should instead look to reduce what we ask of them.
Because digital marketing adds to this demand, we wanted to highlight how large site owners and agency partners can help. Looking specifically at SEO workflows, here are five ways to optimise sites:
- Refresh on-page copy instead of continually creating new pages.
- Monitor Google’s crawling to identify and reduce pain points.
- Use XML sitemaps and robots.txt files to guide crawlers.
- Review image use for optimal size and resolution.
- Work closely with Google’s platforms for News and product inventory.
Method 1: Refresh Pages vs Always Creating New Ones
Maintaining existing live pages and their content is crucial for satisfying two key parts of Google Search: the Caffeine indexing system and the E-E-A-T guidelines.
Caffeine was built to index fresh and updated content quickly, which means that refreshing on-page copy can be more effective than continually publishing new pages. In addition, Quality Raters evaluate websites against the E-E-A-T criteria (experience, expertise, authoritativeness and trustworthiness) to ensure that search results are relevant, useful and trustworthy.
Here are six useful steps to follow when reviewing site content:
- Export and compare the performance of pages.
- Analyse the positions of their ranked keywords.
- Define the best course of action for each page (remove, refresh or merge with another).
- Measure the effort required for each course of action.
- Estimate the potential impact of each course of action.
- Use the effort × impact analysis to prioritise the pages and actions for the audit.
By following these steps, brands can identify which pages should be refreshed and updated to improve their search rankings, while reducing the request load placed on servers.
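As a rough illustration of the final prioritisation step, the sketch below scores pages from a hypothetical performance export. The column names and thresholds (url, clicks, avg_position, effort, impact) are assumptions for the example, not any standard report format.

```python
import pandas as pd

# Hypothetical export of page performance plus manual effort/impact scores.
# Column names and thresholds are assumptions for this sketch.
pages = pd.DataFrame([
    {"url": "/guide-a",  "clicks": 1200, "avg_position": 4.2,  "effort": 2, "impact": 4},
    {"url": "/guide-b",  "clicks": 35,   "avg_position": 28.7, "effort": 3, "impact": 5},
    {"url": "/old-news", "clicks": 2,    "avg_position": 55.1, "effort": 1, "impact": 1},
])

# Simple priority score: high expected impact for low effort floats to the top.
pages["priority"] = pages["impact"] / pages["effort"]

# Suggest an action based on current visibility (thresholds are illustrative).
def suggest_action(row):
    if row["avg_position"] <= 10:
        return "refresh"            # already ranking: keep the content current
    if row["clicks"] < 10:
        return "remove or merge"    # little traffic: consolidate or retire
    return "refresh and expand"

pages["action"] = pages.apply(suggest_action, axis=1)
print(pages.sort_values("priority", ascending=False))
```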
Method 2: Monitor and report on how well Google is crawling a site via Crawl Log Reporting
Google’s crawling process is essential to the success of a website, because it determines how well the site will rank in search results. To help improve the crawling process, website owners can monitor Google’s crawling and identify pain points.
This is where crawl logs come in useful, as they provide valuable statistics about the crawl history of a website. By closely analysing these logs, site owners can determine where search engine crawlers are experiencing problems and take action to resolve them.
The Crawl Stats report in Google Search Console provides these details about requests made to a site, allowing site owners to analyse crawl behaviour further.
By using this information to address issues, site owners can improve their website’s crawl efficiency, leading to better search engine rankings and a more positive user experience.
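Where raw server access logs are available, a short script can produce a similar breakdown of Googlebot activity. The sketch below assumes a standard combined log format and a local file called access.log; both are assumptions for illustration, and in practice Googlebot requests should also be verified by IP rather than user agent alone.

```python
import re
from collections import Counter

# Assumed inputs: a combined-format access log at ./access.log, with Googlebot
# identified by its user-agent string.
LOG_LINE = re.compile(r'"(?P<method>\w+) (?P<path>\S+) \S+" (?P<status>\d{3}) .* "(?P<agent>[^"]*)"$')

status_counts, path_counts = Counter(), Counter()

with open("access.log", encoding="utf-8") as log:
    for line in log:
        match = LOG_LINE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue
        status_counts[match.group("status")] += 1
        path_counts[match.group("path")] += 1

print("Googlebot requests by status code:", dict(status_counts))
print("Most-crawled paths:", path_counts.most_common(10))
```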
You can undertake a review of crawl logs in six steps:
- Collect crawl log data.
- Filter the data to remove noise and irrelevant information.
- Analyse the data to identify issues and opportunities.
- Prioritise issues by impact × effort.
- Develop an action plan to address the issues.
- Monitor and measure the impact of the action.
By diagnosing and resolving crawl issues, you ensure that a crawler’s time on your website is spent productively, reducing the electricity required to serve those requests.
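Building on that kind of summary, the short sketch below ranks potential problem URLs so the worst offenders are fixed first. The sample data and the idea of treating error and redirect responses as wasted crawl budget are assumptions for illustration.

```python
from collections import Counter

# Hypothetical (url, status) pairs extracted from crawl logs; in practice these
# would come from the parsing step above rather than being hard-coded.
crawled = [
    ("/old-promo", 404), ("/old-promo", 404), ("/old-promo", 404),
    ("/blog?page=97", 200), ("/blog?page=98", 200),
    ("/spring-sale", 301), ("/spring-sale", 301),
]

# Count how often the crawler hits URLs that waste crawl budget (errors, redirects).
wasted = Counter(url for url, status in crawled if status >= 300)

# The most frequently re-crawled problem URLs are the highest-impact fixes.
for url, hits in wasted.most_common():
    print(f"{url}: {hits} wasted requests")
```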
Method 3: Guide Google’s crawlers with XML sitemaps and robots.txt files, towards pages to crawl and away from pages to ignore.
XML sitemaps and robots.txt files work together to guide search engine crawlers and ensure that the correct pages on a website are indexed.
XML sitemaps provide information about the pages, videos and other files on a site, and the relationships between them. They help search engines find pages that might otherwise be missed while indicating relative importance. Robots.txt files, on the other hand, tell search engines which parts of a site should not be crawled. By using both XML sitemaps and robots.txt files accurately, site owners can direct search engine crawlers to the most important pages on their site.
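As a quick illustration, Python’s standard-library robots.txt parser can confirm how a live robots.txt file treats a given crawler and URL. The domain and paths below are placeholders for the example, not real recommendations.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and paths for illustration only.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

for path in ("/products/blue-widget", "/internal-search?q=widgets"):
    allowed = parser.can_fetch("Googlebot", f"https://www.example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'} for Googlebot")
```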
To ensure that these files are set up correctly, here are some checks to follow:
- Check for the presence of XML sitemaps and robots.txt files.
- Review the XML sitemaps to ensure that they contain all relevant pages and are properly formatted.
- Check the lastmod tag in the XML sitemap to ensure that it is up to date.
- Review the robots.txt file to ensure that all necessary pages are allowed to be crawled and that irrelevant pages are disallowed.
- Check for any errors or warnings in Google Search Console.
- Monitor crawl behaviour in Google Search Console to ensure that pages are being crawled as intended.
By guiding crawlers to high-priority pages through sitemaps and steering them away from low-value pages with robots.txt, site owners can increase crawler productivity and reduce wastage.
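To support the sitemap checks above, the sketch below parses a sitemap and flags entries whose lastmod dates look stale. The sitemap URL and the 12-month threshold are assumptions chosen for the example.

```python
from datetime import datetime, timedelta
from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Placeholder sitemap URL and staleness threshold; adjust both for a real audit.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
STALE_AFTER = timedelta(days=365)
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

for url_entry in tree.getroot().findall("sm:url", NS):
    loc = url_entry.findtext("sm:loc", default="", namespaces=NS)
    lastmod = url_entry.findtext("sm:lastmod", default="", namespaces=NS)
    if not lastmod:
        print(f"{loc}: no lastmod value")
        continue
    # Sitemaps allow full W3C datetimes; this sketch only reads the date part.
    modified = datetime.strptime(lastmod[:10], "%Y-%m-%d")
    if datetime.utcnow() - modified > STALE_AFTER:
        print(f"{loc}: lastmod {lastmod} looks stale")
```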
Method 4: Review image use for optimal size and resolution.
Images and other high-resolution assets can significantly increase a website’s load time. Large images also consume bandwidth and storage on servers, leading to higher costs and greater environmental impact. It is therefore crucial for SEOs to review image use regularly and optimise it for performance.
To check image use and identify images that can be optimised in keeping with Core Web Vitals, take the following steps:
- Use a performance analysis tool such as PageSpeed Insights, GTmetrix or Lighthouse to identify areas of your website that need improvement.
- Review the size and format of each image on your website to identify those that are excessively large or in the wrong format.
- Compress or resize images to reduce their file size without compromising image quality. You can use image optimisation tools such as TinyPNG to compress images automatically.
- Use lazy loading to load images only when they are needed, rather than all at once.
- Use responsive images to ensure that images are appropriately sized for the device they are being viewed on.
By optimising images for performance and sustainability, website owners and SEOs can improve user experience and search engine rankings while reducing energy costs.
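As one way to approach the compress-and-resize step, the sketch below uses the Pillow library to cap image dimensions and re-save JPEGs with modest compression. The directory, maximum size and quality setting are assumptions to adapt per project.

```python
from pathlib import Path
from PIL import Image  # Pillow

# Assumed settings for this sketch: adjust the folder, maximum dimensions and
# JPEG quality to suit the site's actual design requirements.
IMAGE_DIR = Path("static/images")
MAX_SIZE = (1600, 1600)   # longest edge a template realistically needs
JPEG_QUALITY = 80

for path in IMAGE_DIR.glob("*.jpg"):
    with Image.open(path) as img:
        original_kb = path.stat().st_size // 1024
        img.thumbnail(MAX_SIZE)  # shrinks in place, preserving aspect ratio
        img.save(path, "JPEG", quality=JPEG_QUALITY, optimize=True)
    print(f"{path.name}: {original_kb} KB -> {path.stat().st_size // 1024} KB")
```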
Method 5: Work closely with Google’s platforms for News and product inventory.
Integrating with search engine platforms is an effective way for businesses to supply data directly to the engines, instead of waiting for crawlers to discover and rank it. This reduces crawl requests and server load and, ultimately, enhances user experience.
Below are some examples of these platforms for different industries:
- For news and media companies, Google News is an excellent platform to submit articles to, increasing their chances of being discovered by those looking for stories.
- Ecommerce businesses can showcase their products through Google Shopping by optimising their product feeds, improving their chances of appearing there.
- Local businesses can manage their online presence through Google My Business, a powerful tool for keeping business information up to date, improving visibility in local search results and attracting more customers to physical locations.
Supplying key information through these platforms streamlines the engines’ work in extracting it; not using them makes that job harder than it needs to be.
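For ecommerce teams, a product feed is the most common integration point. The sketch below builds a minimal, hypothetical RSS-style feed of the kind Google Merchant Center accepts; the store, SKU and URLs are invented, and the attribute list is deliberately incomplete, so consult Merchant Center’s own specification for the full set of required fields.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of an RSS-style product feed for Google Shopping / Merchant Center.
# Product data is invented and the attribute list is not exhaustive.
G_NS = "http://base.google.com/ns/1.0"
ET.register_namespace("g", G_NS)

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Store product feed"
ET.SubElement(channel, "link").text = "https://www.example.com"

item = ET.SubElement(channel, "item")
for tag, value in [
    ("id", "SKU-001"),
    ("title", "Recycled cotton tote bag"),
    ("link", "https://www.example.com/products/tote-bag"),
    ("image_link", "https://www.example.com/images/tote-bag.jpg"),
    ("price", "12.00 GBP"),
    ("availability", "in stock"),
    ("condition", "new"),
]:
    ET.SubElement(item, f"{{{G_NS}}}{tag}").text = value

ET.ElementTree(rss).write("product_feed.xml", encoding="utf-8", xml_declaration=True)
```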
In summary
By following the above methods, brands, website owners and SEOs can implement more sustainable SEO practices.
Refreshing and optimising existing pages, analysing crawl logs, guiding crawlers with sitemaps and robots.txt, optimising image use and integrating with Google’s platforms all reduce server resource consumption, while also improving website performance and user experience. Most importantly, implementing these practices contributes to a more sustainable digital ecosystem while still achieving SEO goals.