Well, search engine crawlers (bots) generally fetch the whole content of a webpage at once. They're often called "spiders" because of how they move around: they follow hyperlinks from page to page, traversing the web much like a spider moves along the strands of its web. The link structure they explore forms a network of connections, a bit like a web of interlinked nodes, where each page can lead to many others. As crawlers visit sites, they record which pages they've already seen and store the fetched content in the search engine's data banks (the index) so it can be retrieved and re-used later.
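To make the idea concrete, here's a minimal sketch of that crawl-follow-store loop. The `PAGES` dict is a hypothetical in-memory stand-in for the web (in a real crawler each lookup would be an HTTP request), and names like `LinkExtractor` and `crawl` are made up for illustration:

```python
from html.parser import HTMLParser
from collections import deque

# Hypothetical in-memory "web": URL -> HTML content (stand-in for real HTTP fetches).
PAGES = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a>',
    "/b": '<a href="/">home</a>',
}

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start):
    """Breadth-first crawl: fetch each page once, follow its links, store its content."""
    visited = set()
    index = {}                      # the crawler's "data bank"
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in visited or url not in PAGES:
            continue
        visited.add(url)
        html = PAGES[url]           # in a real crawler: an HTTP GET
        index[url] = html           # store the content for later indexing
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)  # follow links, spider-style
    return index

print(sorted(crawl("/")))  # → ['/', '/a', '/b']
```

The `visited` set is what keeps the crawler from fetching the same page twice, and the `index` dict plays the role of the search engine's stored data bank.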