
Multiple Choice

Which flood attack collects a list of pages or images and appears to be going through these pages, but secretly floods the target?

Explanation:

When an attacker floods a server with HTTP requests, the impact comes from the sheer volume of resource requests, not from long-lived connections or complex crawling. In this scenario, a list of pages or images is assembled, and the attacker steps through that list as if browsing. Each item on the list is requested once, so many distinct pages are fetched in a short time. This creates a flood of single, isolated GET requests that taxes the server’s ability to respond to legitimate users.

This best fits the single-request HTTP flood pattern: it simulates typical user navigation through a catalog of resources, but the objective is to generate overwhelming traffic through numerous discrete requests rather than sustaining open connections, following links recursively, or using slow, partial-header techniques.
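To make the pattern concrete, here is a minimal, hypothetical sketch (no real HTTP traffic) that models the request sequence of a single-request flood: a list of resources is assembled ahead of time, and each item is requested exactly once. The function name and the sample catalog are illustrative, not taken from any tool.

```python
# Hypothetical model of the single-request HTTP flood pattern:
# one isolated GET per item in a pre-collected list, no traversal,
# no connection reuse. This only models the request sequence;
# it performs no network activity.

def single_request_pattern(resource_list):
    """Return the sequence of GET requests the attacker would issue:
    one request per pre-collected resource, in order."""
    return [("GET", resource) for resource in resource_list]

# Example: a catalog of pages and images assembled in advance,
# as the question describes.
catalog = ["/index.html", "/products.html", "/images/logo.png"]
requests = single_request_pattern(catalog)

# Exactly one request per list item; nothing is discovered dynamically.
assert len(requests) == len(catalog)
assert requests[0] == ("GET", "/index.html")
```

The key property is that the request set is fixed before the flood begins: the volume comes from stepping through the prepared list quickly, not from crawling or from holding connections open.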

Why the other approaches don’t align as closely: A recursive GET flood actively crawls, fetching pages by following links and expanding the request set through traversal rather than simply processing a pre-collected list. A random recursive variant crawls similarly but chooses links at random, again emphasizing dynamic discovery over a straightforward pass through a prepared list. Slowloris, by contrast, keeps many connections open and sends headers slowly to exhaust the server’s connection slots, targeting connection resources rather than flooding many resources with immediate GET requests.
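The contrast with the recursive variants can be sketched the same way. In this hypothetical model (the toy site graph and function name are assumptions for illustration, not any tool's behavior), the request set grows through link traversal, so pages are fetched that were never in the starting list:

```python
from collections import deque

# Toy "site": each page maps to the links it contains.
# No real HTTP is involved; this only models which pages get requested.
SITE_LINKS = {
    "/index.html": ["/a.html", "/b.html"],
    "/a.html": ["/c.html"],
    "/b.html": [],
    "/c.html": [],
}

def recursive_get_pattern(seed):
    """Breadth-first traversal: each fetched page yields new links,
    expanding the request set dynamically as the crawl proceeds."""
    visited, queue = [], deque([seed])
    while queue:
        page = queue.popleft()
        if page in visited:
            continue
        visited.append(page)                     # "GET" this page
        queue.extend(SITE_LINKS.get(page, []))   # discover more requests
    return visited

fetched = recursive_get_pattern("/index.html")
assert "/c.html" in fetched  # reached via traversal, not a prepared list
assert len(fetched) == 4
```

Starting from a single seed page, the crawl ends up requesting four pages; a random recursive variant would traverse the same graph but pick links in random order. Neither matches the question's scenario of stepping through a list collected in advance.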
