Understanding the Crawling and Indexing Algorithm for Relevance and Importance

In broad terms, whether a website is crawled and chosen to appear on a search engine results page (SERP) depends on several web ranking factors. First, a crawler backed by a mathematical ranking algorithm scans a website to judge whether the page is reputable. One signal is the domain: government (.gov) and educational (.edu) domains are generally seen as more reputable than a regular .com domain. Another way to determine the importance of a webpage is to match its indexed keywords against the search query originally typed into the search box. If they match, the crawler analyzes other metrics on the page to decide whether to include it on the user's SERP. Google then refers to an index of terms and, through its algorithm, rates the relevance and importance of the website against others with similar information. For more info on how to get a website crawled, Google has provided an article explaining how this works.

The website's relevance is determined by whether the page relates to the original search query and whether it has relevant anchor text. Importance is determined by whether the information is cited, and whether those citations point to a reputable source, such as a government site, rather than a simple blog post. If the keywords in the search query match the ranking factors stated by Google, the page's relevance and importance rise, helping it display high on the SERP.
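The two ideas above, matching query keywords against an index and weighting pages by domain reputability, can be sketched as a toy ranking function. This is only an illustration under invented assumptions: the mini inverted index, the page URLs, and the numeric domain weights are all made up, and real search engines combine hundreds of far richer signals.

```python
# Hypothetical domain weights standing in for reputability (.gov > .edu > .com).
DOMAIN_WEIGHT = {"gov": 3.0, "edu": 2.0, "com": 1.0}

# Hypothetical mini inverted index: keyword -> pages containing it.
INDEX = {
    "census": ["data.gov/census", "stats.edu/census"],
    "population": ["data.gov/census", "blogpost.com/population"],
    "trends": ["blogpost.com/population"],
}

def domain_weight(url: str) -> float:
    """Weight a page by its top-level domain as a crude reputability signal."""
    tld = url.split("/")[0].rsplit(".", 1)[-1]
    return DOMAIN_WEIGHT.get(tld, 1.0)

def rank(query: str) -> list[tuple[str, float]]:
    """Score each page: one hit per matched query keyword, scaled by domain weight."""
    scores: dict[str, float] = {}
    for word in query.lower().split():
        for page in INDEX.get(word, []):
            scores[page] = scores.get(page, 0.0) + domain_weight(page)
    # Highest combined relevance-and-importance score appears first on the "SERP".
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

results = rank("census population")
# The .gov page matches both keywords and carries the highest weight,
# so it lands at the top of the result list.
```

Here the .gov page wins because it both matches more query keywords (relevance) and sits on a higher-weighted domain (importance), mirroring the two factors described above.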

Which Web Ranking Factors to Improve and Include

An SEO crawler determines whether a web page is right for the searcher by scanning the page. It begins by looking for related keywords; from these, the crawler builds a semantic map of relevant keywords that helps the page be ranked against others. Because a crawler does not execute JavaScript, key metrics are analyzed through the HTML code. The most important factors include the title tag, meta description, alt description, and noscript code. It is important for a developer and marketer to understand who their core user is, and which keywords that user types, when creating their website. Search Engine Watch breaks down a great way to write effective HTML descriptions when optimizing your SEO here. If you understand who your target demographic is and familiarize yourself with them, it can help bring your website to the top of many SERPs. All of these elements are included on the main SERP and help a search engine, when crawling through web pages, determine which websites meet its criteria, placing them higher on the list; the pages that rank highest under the algorithm's criteria appear at the top. At its core, a well-written meta description helps a crawler understand the service or product, creating relevance and importance. For example, when Adidas is searched, its meta description states what type of products the company provides, helping Google rank it higher on the SERP.
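Because a crawler reads raw HTML rather than executing JavaScript, the signals above can be pulled straight out of the markup. Below is a minimal sketch of that extraction using Python's standard-library HTML parser; the sample page, its title, description, and alt text are all invented for illustration.

```python
from html.parser import HTMLParser

# Invented sample page carrying the signals discussed above:
# title tag, meta description, alt text, and a noscript fallback.
SAMPLE_PAGE = """
<html>
  <head>
    <title>Running Shoes | Example Store</title>
    <meta name="description" content="Shop performance running shoes.">
  </head>
  <body>
    <img src="shoe.jpg" alt="Blue performance running shoe">
    <noscript>Enable JavaScript for the interactive fitting tool.</noscript>
  </body>
</html>
"""

class MetaExtractor(HTMLParser):
    """Collect the HTML signals a crawler can read without running JavaScript."""

    def __init__(self):
        super().__init__()
        self.signals = {"title": "", "description": "", "alt": []}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.signals["description"] = attrs.get("content", "")
        elif tag == "img" and attrs.get("alt"):
            self.signals["alt"].append(attrs["alt"])

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.signals["title"] += data

parser = MetaExtractor()
parser.feed(SAMPLE_PAGE)
signals = parser.signals
```

If any of these tags were populated only by JavaScript after page load, this pass would come back empty, which is exactly why developers are advised to put the title, meta description, and alt text directly in the HTML.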