Some people at Google have made an interesting Browser Security Handbook available under a free license (CC-3.0-BY). It is quite an interesting read, and it shows how brittle the security features of modern web browsers are. Certain features, like the content handling mechanisms, are really frightening!
In short, it is quite difficult to build secure web sites that accept and display content from external users. The main issue comes from the legacy of old HTML and related technologies. Even though modern versions of HTML, HTTP, and so on have attempted to improve the situation, they have failed in a number of ways (for example, HTTP 1.1 does not specify the behaviour when talking to 0.9/1.0 HTTP proxies). Of course, certain initial choices, like using a textual encoding, are a horrible legacy that we are going to carry until another technology takes over the web. If only the binary encoding of extprot had existed in 1992!
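To illustrate why displaying externally supplied content is so delicate, here is a minimal sketch in Python (standard library only; the function name and HTML fragment are hypothetical): interpolating user text directly into HTML lets a visitor inject markup, while escaping neutralizes it.

```python
import html

def render_comment(user_text: str) -> str:
    """Embed untrusted user text in an HTML fragment.

    Inserting user_text verbatim would let a payload such as
    '<script>...</script>' run in every visitor's browser;
    html.escape rewrites the markup-significant characters
    (<, >, &, quotes) into entities so the text is displayed
    instead of interpreted.
    """
    return '<p class="comment">' + html.escape(user_text) + "</p>"

payload = '<script>alert("xss")</script>'
print(render_comment(payload))
```

And escaping the output is only one of the necessary precautions; the handbook shows that content sniffing, character set handling and MIME type confusion can each undo such protections on their own.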
One could also note that the rapid adoption of Web technologies is directly linked to their fragility: the ability to constantly add new features also means a lack of the long-term thinking needed to harmonize those features. Are the tags of Open Street Map going to follow the same mess?