A web crawler is a program, often called a bot or robot, that systematically browses the Web to collect data from webpages. Search engines typically use crawlers to build their indexes.
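The systematic browsing described above is usually a breadth-first traversal: fetch a page, extract its links, and queue any links not yet visited. The sketch below illustrates the idea in Python, using an in-memory dictionary of pages in place of real HTTP fetches; the `crawl` function, the `site` data, and the URLs are illustrative assumptions, not part of any real crawler.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(pages, start):
    """Breadth-first crawl: visit each page once, following its links.

    `pages` stands in for the Web as a dict mapping URL -> HTML;
    a real crawler would fetch over HTTP and respect robots.txt.
    """
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)
        parser = LinkExtractor()
        parser.feed(pages.get(url, ""))
        for link in parser.links:
            if link in pages and link not in seen:
                seen.add(link)
                queue.append(link)
    return order

# A tiny three-page "site" to crawl (hypothetical data).
site = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/">home</a>',
    "/b": '<a href="/a">A</a>',
}
print(crawl(site, "/"))  # ['/', '/a', '/b']
```

The `seen` set is what keeps the crawler from revisiting pages and looping forever on cyclic links, which is the core bookkeeping any real crawler must do.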

Document Tags and Contributors

 Contributors to this page: PetiPandaRou, Andrew_Pfeiffer, klez, hbloomer, jsx
 Last updated by: PetiPandaRou