This recent trend of Americans bashing the French has got to stop!

You're upset that France, your supposed ally, won't help you with the war on Iraq, but America has helped France in the past.

Mainly I'm hearing "The US helped liberate France in WWII and now they owe it to us to help." If you honestly believe that statement, I'm going to have to call you one of the stupidest, most short-sighted people around.

Let's get things straight: WHERE WAS THE US WHEN FRANCE WAS BEING OCCUPIED? America sat on its ass and watched France (along with a bunch of other countries) get their asses kicked. America WATCHED this happen, and got RICH supplying these countries fighting wars. The US didn't join the war until the Japanese brought it to their front door and it couldn't be ignored anymore.

France sacrificed a hell of a lot more in WWII. Their country got occupied, and they lost lives and property on a scale far beyond what the US suffered. French women and children were killed, their homes and fields burnt, their food taken, and their pets slain.

I think the American people should thank the French people for their superior efforts and sacrifices during WWII, not think that France owes them something, because in reality it's the other way around.