After the recent spate of DDoS (Distributed Denial of Service) attacks on sites linked to the Wikileaks affair (PayPal, Mastercard, Amazon and others), a question occurred to me: is there any way a business can protect itself from being vulnerable to DDoS attacks?
First, to define a DDoS attack. Wikipedia describes it as “an attempt to make a computer resource unavailable to its intended users” (DoS Wiki). To use an analogy, think of a server as the plughole in a sink: it is a particular size because it can cope with the amount of water normally poured into the sink. But what happens if the amount of water (traffic) becomes too great to drain away? It spills over the sides of the sink.
A DoS attack works on much the same idea: the server is flooded with requests, can no longer respond to every one, and some visitors will instead be served an error page, typically an HTTP 503 (service unavailable), or simply a timeout. So what can be done to combat this kind of attack?
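The sink analogy can be sketched as a toy model: requests up to the server's capacity are served, and everything beyond that is rejected. The capacity figure of 100 concurrent requests is an arbitrary assumption for illustration, not a real server limit.

```python
# Toy model of server overload: traffic beyond capacity "spills over".
# The capacity of 100 is an illustrative assumption.

def handle_requests(incoming: int, capacity: int = 100) -> dict:
    """Return how many requests are served vs. rejected."""
    served = min(incoming, capacity)
    rejected = incoming - served
    return {"served": served, "rejected": rejected}

# Normal traffic fits comfortably within capacity.
print(handle_requests(80))    # {'served': 80, 'rejected': 0}

# A flood overwhelms the server; the excess would see an error
# page (e.g. HTTP 503) or a timeout.
print(handle_requests(5000))  # {'served': 100, 'rejected': 4900}
```

The point of the model is that nothing about the rejected requests marks them as hostile; the server simply runs out of room.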
The real difficulty with this kind of attack is that it is, at heart, just a massive burst of traffic to a particular site. Every server has a point at which it exhausts its resources and stops responding; I am sure most people with older machines have experienced something similar. But when your business or service is based online, it has a massive effect if people cannot reach or communicate with you.
As for solutions, I do not really see any, because the rogue requests are usually indistinguishable from genuine ones. So does this mean we have to ‘put up’ with DDoS attacks? Or can something be done?
Personally, I believe there is little we can do to overcome them, and as the speed of the internet explodes, with ever faster connections and lines, it may only become easier to flood even distributed servers. DDoS may well become a weapon of the future for guerrilla warfare and the like.
Wikipedia lists bandwidth management and creating clean links as possible defences, but all of these could be overcome by sheer volume of traffic, and there is currently no reliable way to tell good traffic from bad. You could potentially blacklist IP ranges, but with botnets commanding vast zombie armies of machines, this is not likely to be particularly effective. So… no answer yet!
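To make the limits of these defences concrete, here is a minimal sketch of per-IP rate limiting using a token bucket, a common bandwidth-management technique. The rate and burst numbers are illustrative assumptions, not tuned values. It blunts a flood from a single source, but, as noted above, a botnet spreads the load across thousands of IPs, each of which can stay under the threshold.

```python
import time

class TokenBucketLimiter:
    """Per-client token bucket: each IP earns `rate` tokens per second
    up to a cap of `burst`, and a request spends one token.
    The defaults here are illustrative assumptions only."""

    def __init__(self, rate: float = 5.0, burst: float = 10.0):
        self.rate = rate
        self.burst = burst
        self.state = {}  # ip -> (tokens_remaining, last_seen_timestamp)

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.state.get(ip, (self.burst, now))
        # Refill tokens for the time elapsed since this IP's last request.
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        allowed = tokens >= 1.0
        if allowed:
            tokens -= 1.0
        self.state[ip] = (tokens, now)
        return allowed

# A single source firing requests back-to-back is cut off once its
# burst allowance is spent; other clients are unaffected.
limiter = TokenBucketLimiter()
verdicts = [limiter.allow("198.51.100.7", now=100.0) for _ in range(12)]
print(verdicts)  # first 10 allowed, last 2 refused
```

The weakness is plain in the design: the limiter keys on the source IP, so 10,000 bots each sending a handful of requests per second all pass, while together they generate exactly the flood described above.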