Search engines are widely used software systems that search through vast quantities of information and data. Google, Bing and Yahoo are all examples of popular search engines. But how do search engines work? Understanding this can be beneficial to your website’s SEO.
Search engines index a complex, connected web of pages, documents, information and images. They analyse and index this data, then filter and present it to the user based on the specific keywords and phrases entered. The results are ranked by how relevant the search engine believes each web page is to the query, and presented in the form of SERPs (search engine results pages).
To fully comprehend how search engines work, there are a few key concepts to understand.
What is a bot?
A bot (also known as a spider or crawler) follows links across the web, collecting text, images, videos, news articles and other data, then saves an HTML version of each page into a huge database called the index. The index is updated whenever the bot/spider/crawler scans your web page and finds new, relevant information or content to add. If your website is deemed actively important, or if you update it regularly, bots will scan it more frequently; if changes are not made very often, it will not be re-scanned as regularly.
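To make the crawl-and-index idea concrete, here is a minimal sketch in Python of what a crawler does at its core: fetch a page, save its HTML, and collect the links it will follow next. The URL and the in-memory “index” dictionary are illustrative placeholders; real crawlers add politeness rules, scheduling and enormous scale.

```python
# A minimal sketch of what a crawler does: fetch a page, save its HTML,
# and collect the links it will follow next. The URL and the in-memory
# "index" are placeholders for illustration only.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag found in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url, index):
    """Fetch one page, store its HTML in the 'index', and return its outgoing links."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    index[url] = html                      # the saved HTML copy of the page
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(url, link) for link in parser.links]

index = {}                                 # stand-in for a search engine's index
to_visit = crawl("https://example.com/", index)
print(f"Indexed 1 page, found {len(to_visit)} links to follow next")
```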
What is an algorithm?
Once your website has been indexed, an algorithm takes the information from the index, filters and sorts it based on what the user has searched, then ranks and presents the findings for the user to explore. The top search results are the ones the algorithm believes are the most relevant responses to the query. The factors behind the algorithm’s decision-making are difficult to pin down, as there are many complex signals built into it, and which results are prioritised changes regularly as those signals evolve.

Search engines generally discover a new website by following a link from a page they already know, rather than through direct submission. Once a bot/crawler/spider finds your new website through such a link, it can be indexed. The search engine then uses both external and internal links to help determine the rank of your web pages; if your website has many external links, this can help it to be prioritised in the search results. Maintaining and updating your website regularly is crucial to improving its crawlability, and therefore the frequency with which crawlers revisit and re-index it.

It is also possible to block crawlers from specific web pages to prevent them from being indexed. This is not good for SEO, but you may wish to do it for particular pages if, for example, you have an authorised-access-only page (perhaps for running behind-the-scenes operational tasks or for staff training purposes).
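Blocking is typically done with a robots.txt file at the root of your site (a Disallow rule for the path you want kept out) or a noindex meta tag on the page itself. Well-behaved crawlers check robots.txt before fetching anything; the short Python sketch below illustrates that check, using a hypothetical domain and a hypothetical /staff-training/ path purely for illustration.

```python
# Sketch of how a well-behaved crawler respects robots.txt before fetching.
# The domain and the /staff-training/ path are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")
robots.read()  # download and parse the site's robots.txt rules

pages = [
    "https://www.example.com/services/",
    "https://www.example.com/staff-training/",  # imagine this path is disallowed
]

for page in pages:
    # can_fetch() returns False for any path the robots.txt disallows,
    # so a polite crawler skips it and the page is never indexed.
    if robots.can_fetch("*", page):
        print(page, "-> crawl and index")
    else:
        print(page, "-> skip (blocked by robots.txt)")
```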
Specific Algorithms that influence SEO
Google’s algorithm is updated regularly and in a plethora of ways; however, certain updates have stood out over the years and provide food for thought on how SEO can be influenced.
RankBrain is a very complex Google algorithm that helps to decipher the meaning and intent behind the words users choose when they search for something online. It identifies what it believes to be the best-matching search results, using website content, links, keywords and many other SEO elements to find relevant matches and present them in rank order in the SERPs.
Google does not like to be pandered to, however. Google’s Panda update in 2011 was designed specifically to reduce the ranking of low-quality web pages created mainly to manipulate search rankings. The Panda update scanned pages to identify whether a website’s content genuinely related to search engine users’ search terms. At the time this had a great impact on affiliate websites designed simply to link to other pages, and on websites with very little content. Google has been known to re-run the Panda update on occasion since its initial release.
Similarly, in 2012 Google released the Penguin update. This update was designed to assess whether the sites linked from your website supported and enriched your content, or whether the links were there for other motives. It was intended to reduce the number of websites created to artificially improve the ranking of other sites through excessive linking, and many websites lost their Google rank as a result. The update has been run on multiple occasions since its inception and is now said to be a permanent fixture of Google’s system.
Google’s focus on optimising SERPs resulted in the Hummingbird update, first introduced in 2013. This update was designed to analyse a whole search phrase rather than only specific words from the query, producing results for the overall search rather than for keywords alone. It was not initially successful in improving search results, but in time it arranged the SERPs so that the answer to a query is displayed at the very top of the page, enabling users to get an answer without necessarily having to click through to a web page. This update was also part of the beginnings of voice search, which is now used by many modern devices such as Alexa, Siri and Google Home.
Maintaining relevance and changing with the times led to the 2015 Mobilegeddon update. Designed to boost mobile-friendly web pages in mobile search results, it was introduced around the time Google stated that mobile devices accounted for 50% of all search queries.
Continuing the movement towards mobile searches, the Possum update was introduced in 2016 to alter Google’s ranking filter based on the physical proximity of the searcher to a location or business. Google then went further by switching to its mobile-first index in 2018, which bases rankings on the quality of the mobile version of a website. This was done in response to the rise in mobile search queries and the need for results to reflect that change: a Googlebot crawls the mobile site, assesses the quality of its content, performance and user experience, and the website’s ranking is updated based on those factors.
The Medic, or Query Intent, update was also rolled out in 2018. It was initially thought to target medical websites, but it was later found to affect organic results across a number of industries. The update was designed to take into account a user’s specific search intent based on the exact wording and phrasing of a query. For example, searching for “book publisher” would generate quite different results from searching “how to attract a book publisher”: the former would likely display a list of professional book publishers, whereas the latter would favour informative web pages offering advice on the subject.
Continuous updates have been active since 2018, making adjustments several times per day, and Google also releases larger core updates every few months. This can make it incredibly challenging to predict and continuously adapt to the algorithms run by Google, so rather than attempting this, it is best to simply focus on making your website as informative and usable as possible. Google’s mission is to provide people with the best information, so focus on achieving this to align with Google’s objective and improve your ranking.
If you would like to explore opportunities to optimise your SEO, at FreshOnline we offer a range of services and information that could benefit your company. Get in touch for a commitment-free conversation about how we can help you.