Last Sunday, the
New York Times Arts (or I should say "Arts") section carried an article to the effect that Hollywood movies carry themes of revenge, and that this does not represent US values. What a myth! The US is now the country of vengeance and revenge par excellence. Were the wars in Afghanistan and Iraq not wars of revenge? Ask American patriots.