Rotten.com has an amazingly harsh article on Mother Teresa. It focuses on two main things. The one it spends the most time on is its perception that she was willing to overlook certain abuses by people who treated her well – Indira Gandhi’s 1975 state of emergency in India, her son Sanjay’s program of sterilization for the poor, and the Haitian Duvalier regime are given as examples. I’m sure there are other details involved, and I’m sure that she (like many people at the time, and much like the idiotic hollyweird celebrities with more teeth than brain cells over whose eyes Castro and Chavez routinely manage to pull the wool) was duped, and would probably have spoken differently had she not been on the wrong side of the dog-and-pony show.
The second point on which it attacks her is that, following the 1971 war which created Bangladesh, she called on the thousands of raped women not to abort the resulting fetuses. This touches one of those ongoing questions that tends to be very hard to answer: should abortion be allowed, or even encouraged, for a woman impregnated as the result of rape?
The question itself lends a hint as to why it is so hard to answer. The circumstances – the rape, the condition of the mother following rape, the fact that the pregnancy will inevitably remind the woman of what happened in a very obvious way – are nothing but grotesque. Depending on one’s beliefs, the options are no less vile.
On the one hand, you might believe (as do most religions) that “human life begins at conception.” In this case, there are a few very ugly points to consider:
1 – Having to carry to term (or even just long enough for a caesarean or induced labor) means a constant reminder of what happened, which may drive the mother to self-destructive acts, possibly up to and including suicide.
2 – On the belief that the fetus deserves the full rights of any human being, and as a baby is innocent of the circumstances of its conception, ending its life is at worst outright murder and at best the killing of one innocent to try to save the life of another victim.
The question from this perspective then becomes: what is the risk to the mother, and what are the chances the fetus/baby can be carried to term and then given some form of a life (foster/adoption care, etc) to live?
On the other hand, you might believe (as a sizable portion of the population does) that human life begins at some arbitrary point; when the heart first beats, brainwaves first appear, “when it could survive outside the womb” (which keeps getting earlier and earlier as medical technology advances, and may eventually reach the point where an “artificial womb” could raise a human from zygote to birth without the need of a mother at all), or so on. In that case, the calculation inevitably turns to “get rid of it before it reaches that point, since it was forced into the mother against her will.”
To my perspective, none of the options are (at present time) particularly appealing. Bad choices tend to stem from bad circumstances, and these being particularly bad circumstances, I’m not sure that an agreement could ever be fully reached on the “right” thing to do in general. I don’t necessarily think that her calling out to try to prevent the abortions was either evil, or unjustified (especially by the teachings of her church). Neither am I fully convinced that removing the option entirely, especially for those who might be driven to desperate measures, is necessarily the wisest course.
Recently, I’ve been seeing a large number of articles claiming that things are “less civil” in society than in the past. It’s gotten to the point where comedians Jon Stewart and Stephen Colbert actually held a rally in support of polite discourse.
Some people writing columns or discussing matters point to recent epithets like “rethuglican”, “demoncrat”, “teabaggers”, and on and on. They discuss whether the “decline of civility” leads to bad behavior and the occasional “off-camera, off-microphone” remark that nevertheless gets recorded and magnified since it can be played as a moment of “the candidate being honest” in a bad way. Instances and occasions that are more the result of one or two hotheads in a crowd of thousands or millions are claimed, by the other side, to be “representative” of everyone present.
A number of them from one side make the claim that Barack Hussein Obama, 44th President of the United States, is getting “more than his fair share” because of the color of his skin. Of course, many of these same article writers laughed along and said nothing about things like this.
The idea of political discourse being un-heated and non-insulting at some mythological point in the past… well, it’s just not true. Thomas Jefferson, 3rd President of the United States, was derided by his opponents as the “Negro President”. One guy’s done a great job translating the words of Thomas Jefferson and John Adams into modern-day attack ads. Jefferson’s opponents also circulated scurrilous verses regarding his alleged relationship with a slave by the name of Sally Hemings.
Alexander Hamilton and Aaron Burr settled their political differences with a duel that left Hamilton dead. Representative Preston Brooks beat Senator Charles Sumner with his cane on the Senate floor; Stephen Douglas had said of Sumner, “this damn fool is going to get himself shot by some other damn fool.” Lyndon Baines Johnson ran this ad. Spiro Agnew was skewered with a mere laugh track.
In 1986, comedian Robin Williams was already making Alzheimer’s/senility jokes about Ronald Reagan. When 1994 came around and Reagan was actually diagnosed with Alzheimer’s, there were certain very uncivil people who thought they were being inventive or unique by asking “does he even remember being president?”
It is a mark of some hilarity, actually, that for the man so often derided in recent memory as the “worst president ever”, the worst nickname anyone could come up with (at least until, post-presidency, he revealed a very nasty anti-Semitic streak) was “Jimmah Cardigan”, and the worst portrayal of him was that of a bumbling milquetoast. Carter is definitely an outlier.
So, to sum up: is Obama on the receiving end of insults? Undoubtedly. Are they any worse than those received by previous Presidents? I think “Chimpy McBushitler”, Thomas Jefferson, John Adams, as well as most of the others who were at all notable, have a definite case to make that they’re not.
Over at MSNBC, coverage of some disgusting behavior by some disgusting individuals masquerading as “first responders”:
Cranick, who lives outside the city limits, admits he “forgot” to pay the annual $75 fee. The county does not have a county-wide firefighting service, but South Fulton offers fire coverage to rural residents for a fee.
Cranick says he told the operator he would pay whatever is necessary to have the fire put out.
His offer wasn’t accepted, he said.
“They put water out on the fence line out here. They never said nothing to me. Never acknowledged. They stood out here and watched it burn,” Cranick said.
South Fulton’s mayor said that the fire department can’t let homeowners pay the fee on the spot, because the only people who would pay would be those whose homes are on fire.
As we’ve discussed earlier, the problem with people buying in “only when needed” on certain items like health insurance is ongoing. Will proposes a graduated-coverage solution for “previously existing conditions”, wherein people are given incentive to have continuing coverage.
This isn’t quite the same, but at the same time, it’s a point where the behavior of certain entities – hospitals, police, firefighters, certain mayors – goes beyond what I think any sane human would consider reasonable. Was Mr. Cranick uncovered for the year? Yes. Could the firefighters have come out, put the fire out, and then assessed a reasonable fee – even at 20x the $75 fee, it’s only $1,500 to save his irreplaceable family items and pets? Absolutely.
Instead, the firefighters – apparently by order of their chief, if not the mayor – actually stood by and did nothing while a man’s entire life and family pets burned to ashes.
I don’t really have much else to say on it. I’m at a loss for words as to how not a single firefighter would say “hold the phone, put the fire out, figure out the money crap later.” It goes against any code of ethics I’ve ever seen, and it strikes me that whoever made this decision and held the firefighters to it probably should go up on animal cruelty charges as well; they had every ability to save the lives of the family pets, but deliberately decided not to do so, for whatever reason.
My faith in human nature just took another dent.
Over at Half Sigma, a discussion on two different types of poverty.
Apparently, if you live in Europe or the US/Canada, being poor makes you fat and malnourished.
But if you live in India, or Africa, or South America, or so on, being poor makes you thin and malnourished.
I’m going to go out on a limb here and state that in the “developed” countries of the world, much of the problem is simply that individual people, to a large extent, no longer know how to cook, nor have the desire to do so. I’ll admit I am as guilty of this as the next guy; I tend to eat prepackaged meals (canned soup, canned noodle dishes, frozen pizzas) more times during the week than I make my own meals. Making my own meals is reserved for occasions when I have a female guest (they seem to love finding out that yes, guys can cook and cook well) or for weekends when I’m not reaching home tired and wanting to relax.
The reality is, of course, that some of the prepackaged foods I eat are clearly not as good for me as if I made something vaguely equivalent from scratch. Just about everything is likely to be higher in sodium than it needs to be, though being a borderline supertaster, I tend to want more salt to counteract the bitterness in certain foods that other people miss, unless I’m in a mood for something bitter.
At the same time, however, the “western” diet has changed over the past few decades. At one time, “pure meat” – that is to say, a chicken leg, or steak, or burger – was something people had 2-3 times per week. Lunch counter food four decades or more ago was much fresher and healthier as well. To what degree HFCS causes trouble, or the overabundance of gluten as cheap filler, I can’t say, except that HFCS was barely noticed back then and didn’t even get “GRAS” (“generally recognized as safe”) status from the US FDA until 1976.
At the same time, poor neighborhoods tend to lack healthy options. Comparing the produce aisles of “poor” and “middle class” neighborhood grocers, for instance, will give one a remarkable perspective: there are two versions of one particular chain that I tend to go by on a regular basis. The first, in the midst of the “poor zone” surrounding one side of Southern Tech, devotes less than 1/20 of the store’s floor space to produce, and what they do have tends to be wilted or otherwise unappetizing. On the other hand, the “flagship” version, a few miles south of my house, devotes approximately 1/8 of its floor space to produce and tends to have very fresh, clean and appetizing produce for purchase. Given that produce is listed at the same price at each store, I find myself wondering how much of the difference is because the poor around Southern Tech don’t buy it (and so it sits around and wilts), and how much of it came off the truck half-wilted as the “last pick” from the delivery run.
It’s also true that the number of fast-food restaurants and crappy little corner stores increases with poor neighborhoods. So by the same token, the neighborhood grocery’s produce is unappealing, the Popeye’s Chicken just outside the tenement door smells really good, and why walk the four blocks to the neighborhood grocery when you can buy (for a suitable markup) the same can of Chef Boyardee Overstuffed Ravioli at the corner store on your own block?
As well as that, neighbor-on-neighbor crime is up in those neighborhoods. Why try to go somewhere, even a local little park or a walk along the canal, when you’re likely to have someone try to mug you just for being outdoors?
The impact of mostly-sedentary jobs (when the poor are actually working) isn’t to be underestimated, either. In western nations, the poor are likely to be working “minimum wage” jobs. Some of these offer a little exercise, perhaps stocking shelves, but they may equally mean working in the neighborhood fast-food restaurants, sitting the counter at the corner store/gas station, or any number of “sit on your butt and watch this” types of jobs. By contrast, the poor in developing nations walk more to get where they’re going and tend to do more physical types of jobs.
Circling back around – when I was in school, there was a requirement that students “choose” between “home economics” and a couple of other optional courses. Because the other optional courses didn’t interest me, I wound up as one of the 4 boys taking home ec that semester (they wouldn’t let us do wood shop until 8th grade, which I did take when I could). Even looking at the course back then, it was rather a joke: there were 4 weeks of sewing that wound up creating one plush football, 4 weeks of “this is how you make a budget” (which most of the kids failed at), and 4 weeks of “meal planning” out of which 80% of the class wrote up exactly the same weekly plan based on the very few things they’d been taught to make. Since the home ec room had stoves but we weren’t allowed to turn the gas on to use them, “cooking” was rather pointless, and the most appetizing thing the class ever created was peanut butter and jelly sandwiches. As I was given to understand, by the time my brother and sister went through that school, home ec had been shut down entirely.
By comparison, looking back a few decades, it was expected that most households – and most individuals – knew how to cook, at least enough to survive. The basics of making a soup, making a sandwich, grilling, baking, broiling… as far as the middle and poor classes were concerned, at least, these were necessary life skills. In an age when one can stock up the freezer with “hungry man” dinners (or even “lean cuisine”, which are anything but), why would one bother to learn to really cook? The phenomenon of the stay-at-home wife also offered at least some path to a leaner, healthier diet, inasmuch as having someone who (a) had the time to be at home preparing a meal and (b) handled food preparation and meal planning regularly was bound to do wonders for keeping some of the nastier stuff away from a waistline.
Over at MSNBC, a news story regarding what might seem a small kerfuffle of a kind that happens quite often: a woman in a small town wants a local art gallery to take down and destroy one of its paintings.
The trick? The painting is a portrait of Adolf Hitler. It’s hanging in a young artist’s gallery, and apparently it’s part of a gallery of “icons”, portraying various figures both good and evil. And the local paper seems to just about sum up my position on the subject.
What sharpens me on the point, however, is the fact that the woman’s comments (though she’s free to make them, as I’ll get to in a second) offer a glimpse into a problem I see too often: people seem to assume they have a “right” to not be offended. Her quote: “Freedom of speech? What happened to taste and sensitivity in our country?” Unfortunately, it’s precisely this form of argument that is so odious. It’s obvious that this woman has every right to be upset; she has a very close family reason to despise Hitler and all he stood for, and if she thinks the painting doesn’t get the portrayal right, then she’s going to be offended. On the other hand, if speech is to be censored for reasons of “taste” or “sensitivity”, then certain subjects will never be debated.
Working at Southern Tech University, I’ve seen plenty of examples of odious, disgusting speech. Anti-abortion displays like this one, bizarre displays of raw anti-semitism masquerading as “palestinian solidarity”, and so on. I worry about the violence potential of the second (especially after having been stalked on-campus by members of said racist group), but as long as they stay peaceful, I subscribe to the notion that the proper response to their hate speech is not censorship, but counter-speech exposing them and whatever factual misrepresentations (hell with it: outright lies) for what they are.
So the other day I took the Bartle Test. Based on the player types Richard Bartle identified among players of the original 1978 MUD, it’s still relevant (more than many would think) in designing MMORPGs (World of Warcraft, EverQuest, etc).
In an overarching format, it does well describing why some games “win” and “lose” in the market. Games targeted to “Killers”, such as Ultima Online, Shadowbane, and Asheron’s Call 2, tend to die off. The problem is, if you populate with Killers and design around them, then the vast majority of players who are not primarily “Killers” will get tired of being picked on and leave the game. An all-Killer game will drive off enough players to not be financially sustainable.
The longest-running game I’ve ever played, MMORPG-wise, is City of Heroes. The nice thing about CoH is that the “Killer” mechanic almost completely vanishes. Player-vs-player combat exists only in certain non-storyline areas against “City of Villains” players (the “other side” of the game), or in the “Arena”, in exhibition matches where no penalty for losing carries over into the main game. Meanwhile, CoH has a tremendous amount of room for exploration and the enjoyment of various storylines, quests, and options to try out. The end of my CoH play came when the “social attitude” – by which I mean a personality-based falling out with a guild leader – left me with the option of either shutting down my account or paying way too much money to move my characters to new servers to avoid this “socially powerful” griefer’s behavior.
For those wondering, by the Bartle test I come up as an ESAK, with a mere 7% “Killer” score:
It’s not so much the wandering around and poking about, but that euphoric eureka moment the Explorer strives for. The joys of discovery do not necessarily involve geography, real or virtual. They may derive from the mental road less traveled, the uncovering of esoteric or hidden knowledge and its creative application. Explorers make great theory crafters. The most infinitesimal bit of newness can deliver the most delicious zing to an Explorer.
Explorer Socializers are the glue of the online world. Not only do they like to delve in to find all the cool stuff, but they also enjoy sharing that knowledge with others. Explorer socializers power the wikis, maps, forums and theory craft sites of the gamer world.
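For those curious how a test like Bartle’s turns answers into those percentage scores, here’s a minimal sketch. The questions, type mappings, and scoring here are entirely hypothetical (the real test has its own question bank); the mechanic is just that each question pits two of the four types – Achiever, Explorer, Socializer, Killer – against each other, and your percentages are each type’s share of the points.

```python
from collections import Counter

# Hypothetical mini-quiz: each question offers two answers, each mapped to
# one of Bartle's four types (A)chiever, (E)xplorer, (S)ocializer, (K)iller.
QUESTIONS = [
    ("gain a level",          "A", "find a hidden area", "E"),
    ("chat with guildmates",  "S", "win a duel",         "K"),
    ("top the leaderboard",   "A", "gank a passerby",    "K"),
    ("map the whole zone",    "E", "run a group event",  "S"),
]

def bartle_scores(choices):
    """choices[i] is 0 or 1, picking the first or second answer of question i.
    Returns each type's share of the total points, as a percentage."""
    tally = Counter()
    for (_, type0, _, type1), pick in zip(QUESTIONS, choices):
        tally[type1 if pick else type0] += 1
    total = sum(tally.values())
    return {t: 100 * tally[t] / total for t in "AESK"}

# Picking the first answer every time never feeds the Killer score:
print(bartle_scores([0, 0, 0, 0]))  # {'A': 50.0, 'E': 25.0, 'S': 25.0, 'K': 0.0}
```

A real implementation would need far more questions to smooth out noise, but the shape of the result – a dominant type with trailing percentages, like my 7% Killer score – falls out of exactly this kind of tally.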
Eye halve a spelling chequer
It came with my pea sea
It plainly marques four my revue
Miss steaks eye kin knot sea.
Eye strike a key and type a word
And weight four it two say
Weather eye am wrong oar write
It shows me strait a weigh.
As soon as a mist ache is maid
It nose bee fore two long
And eye can put the error rite
Its rare lea ever wrong.
Eye have run this poem threw it
I am shore your pleased two no
Its letter perfect awl the weigh
My chequer tolled me sew.
— Community poem based on the original Candidate for a Pullet Surprise, by Mark Eckman and Jerrold Zar.
The incoming admissions staff at the University of Waterloo have a problem with what they are seeing from their prospective students. Articles like these have been fairly common in the past fifteen years or so, and a backlash against some of the worst methods of teaching (especially the “whole language” nonsense and the idea of “open plan” schools) is slowly taking root.
Too little, too late? Can this be turned around? Working in my department at SoTech, where we “educate” the next generation of teachers, I am occasionally frightened by what I see. It is an open secret that our students average 20 IQ points lower than those of the next lowest-performing college. Our professors regularly give grades of B, or even A, to projects that would have been given a failing mark when I was in the fourth grade. One required test, supposedly meant to ensure that the curricula for a grade-school position have been memorized to a sufficient degree, is passed by students “brute-forcing” it. To wit, they repeat the test a dozen times or more (there is no limit on how many attempts one may have, save that it may only be taken once per day and costs a set fee per attempt at the SoTech Testing Center), entering random answers to multiple-choice questions until they eke out a “passing” grade once. “Prole Twang”, as Sheila would call it, abounds not only in hallway conversations but in classroom presentations. In the case of two African-American professors (who, oddly enough, carry bachelor’s degrees in “African-American studies”), it is actively encouraged.
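The arithmetic of that brute-forcing strategy is worth a quick sketch. The test’s actual length, pass mark, and fee aren’t public (or at least not known to me), so every number below is hypothetical; the point is that with unlimited retakes, even a modest per-attempt chance of passing makes the strategy viable, since the expected number of attempts is just one over that chance.

```python
import math

def pass_probability(n_questions=40, n_choices=4, cutoff=0.6, p_known=0.3):
    """Chance of clearing the cutoff on a single attempt, assuming a student
    genuinely knows a fraction p_known of the material and guesses the rest
    uniformly. (All parameters here are hypothetical stand-ins.)"""
    # Per-question success probability: known outright, or guessed correctly.
    p = p_known + (1 - p_known) / n_choices
    need = math.ceil(cutoff * n_questions)
    # Binomial upper tail: P(correct answers >= need).
    return sum(
        math.comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
        for k in range(need, n_questions + 1)
    )

# Attempts-until-first-pass is geometric, so expected attempts = 1/p.
p = pass_probability()
print(f"per-attempt pass chance: {p:.3f}, expected attempts: {1 / p:.1f}")
```

With these made-up numbers, a student who knows only a third of the material still passes within a dozen or so tries – which is exactly the pattern described above, and why a once-per-day limit and a per-attempt fee do nothing but slow the process down and pad the Testing Center’s revenue.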
It has been said that “Those who can, do. Those who can’t, teach.” The more time I spend examining the field of teaching, and the more students I see passing through these doors, the more frightened I become that this could be true. It is a statement that many would take to be rude and demeaning, and there are many good teachers employed in the world. At the same time, there are any number of people who entered the field of teaching because they believed it to be easy, and a large number who entered it because they lacked the mental acuity for other professions. Sadly, “promotion” in the field of teaching is largely about being given older students (kindergarten/preschool teachers are “promoted” to 1st/2nd grade, 1st/2nd grade teachers to 3rd/4th grade, and so on), and the system mostly revolves around “tenure”, by which a teacher who has been in a system for a number of years can be promoted or not, but never fired. The field has thus worked itself into the situation we have today: a large number of the people expected to educate middle-school or high-school children in more advanced grammatical, mathematical, or higher reasoning concepts are the very people who repeatedly proved their inability to grasp those same concepts throughout their own educational careers.
It is one thing to have a teacher who cannot understand basic geometry, but can still teach a kindergartener how to count to twelve. It is quite another to find out that, fifteen years later, this same teacher is now somehow teaching a trigonometry class because they have, through the magic of seniority and tenure, managed to “fail upwards” to teaching the ninth or tenth grade.
I ran across this image attached to a rather vitriolic post (the thrust of which was, in essence, “only stupid inbred hicks oppose gay marriage and this map proves it”), but it struck something of a thought process. Here goes.
First of all, the map’s not entirely accurate with respect to what the author was trying to say. Five states, at least, shouldn’t be listed as “allowing” cousin marriage, since their restrictions mean an impossibly small portion of their population could realistically participate. There’s considerable overlap between where gay marriage and cousin marriage are allowed in the northeastern section of the US. And of course the Granola State on the west coast, a place which carries almost entirely the opposite of the “inbred hick” stereotype, allows cousin marriage and has gone back and forth on the issue of gay marriage for a few years now.
Secondly, the science against cousin marriage is muddled. The usual argument put against it is that it encourages genetic diseases. In certain populations, specifically populations where cousin marriage is encouraged and founder effects come into play, this is true. Small, isolated rural villages of current/past ages, the inbred lines of European royalty, and the lines of fundamentalist Mormonism come to mind here. Another example is the Dutch settlers to South Africa (the “Afrikaners”), who carry magnified risk of Huntington’s Disease because an abnormal percentage of the original settlers were carriers.
On the other hand, research into larger, more diverse genetic populations indicates that “once in a while” cousin marriage carries relatively small risk – about the same risk as a woman having kids at the age of 40 rather than 30. The further argument is that laws against it in the US were motivated not by risk of genetic disease, but by a desire to force immigrants to intermarry into the population (and thus assimilate) in a quicker manner.
Oddly enough, the argument about “inbred hicks” falls apart when comparing the map of European gay marriage laws. I’d put up a map comparing it to European laws about cousin marriage, but there’s no real point: cousin marriage is legal in 100% of Europe. Two countries have recently begun discussing the option of banning it, and oddly enough, it’s not even the condition of their bucktoothed/colorblind/hemophiliac royal lines (that last being the origin of “blue-blood” as a reference to royalty) that did it, but rather the high rate of genetic diseases in recent immigrant populations from the rural sectors of Islamic countries. These populations perpetuate societal cousin marriage rates of 55% or above, in which it’s not uncommon to be the child of a chain of 8-10 consecutive cousin marriages – including “double cousin” marriages, wherein the kids are not simply cousins but where mother/aunt and father/uncle (or mother/uncle and father/aunt) constitute sibling pairs as well, making the kids almost genetic siblings.
The trouble with this discussion is that it’s a perfect example of a “where do we draw the line” sort of argument. On the one hand, in a (mostly healthy) genetic population where cousin marriage is rare and genetic diversity a given, arguments against cousin marriage quickly expire upon the line of “well, why do we let 40-year-old women have kids, then?” On the other hand, we have definitive proof of the genetic risks of allowing multigenerational cousin marriage. There even comes the risk that at some point, society could start stopping non-sibling people from marrying because they both carried a recessive gene for some debilitating genetic disease like Huntington’s or Tay-Sachs, or even something as merely inconvenient as Celiac. It’s not that farfetched; some states to this day still require a blood test, a holdover from times when they were screening for sexually transmitted diseases such as syphilis. Another justification (now that the technology exists) for genetic testing as a marriage requirement could be to ensure that a couple aren’t unknowingly marrying their half-sibling or even full sibling, given the high percentage of absentee/unknown fathers, or the potential for siblings to be separated too early in life to remember each other, in certain populations.
Slate’s advice columnist, Dear (p)Rudence, has been doing “web chats” for a while along with the regular letters column. I have yet to actually bother to connect to one in real time, but the transcripts provided are usually pretty entertaining, and evidence that Ol’ Rudie (at least the current one) doesn’t quite have a good handle on what’s rude or not when it comes to politeness advice.
My bone of contention comes in with her advice regarding someone who continually gives the “gift” to someone of a donation to charity in their name.
D.C. Metro: I have a family member who sends a gift of some animal to the Heifer fund as a Christmas present to us every year. Every year I get more and more offended, as this is not a “gift” to anyone except themselves, as they get a tax deduction. My kids understand about giving to charity, but I cannot explain how this is a “gift” to us. I would like to tell this person to please stop sending these donations as “gifts” and only a card is fine.
[Ol’ Rudie]: What a good lesson for the kids! A family member makes a contribution in your family’s name to a wonderful cause, and you want your children to understand this isn’t really a gift but a tax deduction, and you want to demand a refund from the giver…
My first objection, however minor, is that Ol’ Rudence immediately misconstrues the position of the writer. They aren’t asking for a “refund”, simply that the giver refrain from such a “gift.” They aren’t even asking for a gift of any sort – a simple greeting card would suffice, as they write.
When challenged, Rudence responds with an even snarkier attack:
[Ol’ Rudie]: Clarksville, I hope everyone on your list knows you’d rather get a puce scarf from the sale rack than a donation to a worthy cause in your name.
The larger problem I have with this idea is that “giving to charity in someone’s name” is a rather smug, self-serving gift. When done unbidden, the social message it sends could well be that the “giftee” is a person who wouldn’t think to give to charity on their own and thus, the “gift” from the “giver” is making up for their moral shortfall. Or the social message, depending on choice of charity, is “I gave to them, you should be giving too.”
The little cards saying “Hey, I gave $XX to Charity Y in your name” have all the social tact of a card saying “Merry Christmas! By the way, if you didn’t donate to Charity Y you’re a terrible person, but don’t worry, I got you covered.”
Now this isn’t always the case. If there is an adult who has a specific connection to a charity, or has requested that people give in their name for instance, it’s probably fine. For example, a monetary donation to a local soup kitchen where your friend or family member regularly volunteers would probably be a wonderful thing, or a donation to an animal shelter or Humane Society/SPCA for an animal lover who has expressed a desire to support those organizations (and might not have financial wherewithal to make a donation of their own), would probably be taken as a truly thoughtful gift.
On the other hand, to do it to a kid? First of all, most children (the younger, the worse in this regard) do not have the mental ability to make that kind of connection. The abstract “I gave to someone in your name”, in a kid’s mind, is going to degenerate into “I gave your gift to somebody else.” Second of all, making the choice of which charity to give to yourself, rather than giving the “giftee” that option, adds the pressure of socially trying to force the person into some public acknowledgement of the “goodness” of the charity. While the charity in question may indeed be noble, people have a tendency to rebel against such a pressure.
Especially in the case of a kid, there are many better ways to handle such a thing. You want it to be as direct as possible. If you’re going to give to an animal shelter, take the kid to an animal shelter, have them make the donation in person, and maybe volunteer some of your time helping to clean up or exercise/feed the animals. If you’re going to give to a children’s hospital, have the kid visit some of the sick kids there (like in the cancer ward) and make some new friends to write letters or emails to. If you’re giving long-range? Well, bite the bullet and send a real gift, at least until the kid’s reached the age of 10, and then ask them what kind of charity they’d like to give to.
Back to the neverending topic of weight loss and exercise, the NYT spotlights research that seems to come from the “well, duh” department: the secret to weight loss is to burn more calories than you consume.
The main problem today is, people have no idea what they’re consuming. As the article points out:
“The message of our work is really simple,” although not agreeable to hear, Melanson said. “It all comes down to energy balance,” or, as you might have guessed, calories in and calories out. People “are only burning 200 or 300 calories” in a typical 30-minute exercise session, Melanson points out. “You replace that with one bottle of Gatorade.”
Most people I know go through 4-5 cans of soda per day. I personally have felt a lot better, and noticed myself getting trimmer (and wanting to exercise more regularly thereby!) when I gave myself one simple rule: don’t stock soda cans in the house. I have juice, I have milk, and that’s it. Generally, after a glass of juice or milk, I don’t feel the need for more than water afterwards; if I drink soda, I find myself thinking “hey, I want another soda.”
I switched to using smaller bowls and smaller plates, and doling out smaller portions (I have “soup bowls” that are wide and shallow, with a circular imprint in the center, so I only fill the imprint and use some whole-grain bread to sop up the gravy from whatever I cooked).
This is not to say that these are easy things. Sticking your giant-sized bowls and glasses “out of the way” and replacing them is an expense, though not that expensive ($20 at Ikea replaced pretty much what I needed for daily use). Cutting how much you eat may require “feeling hungry” for a while as your body adjusts. But the research is clear; “inability to lose weight even though exercising” is much less likely to indicate that you have some hormonal/metabolic issue, and much more likely to indicate that you’re finding some hidden source of calories and not accurately measuring your caloric input or how many calories you’re burning.
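The energy-balance arithmetic here is simple enough to sketch in a few lines. The per-session burn comes from the article (“200 or 300 calories”); the per-can soda figure and the 3,500-kcal-per-pound rule of thumb are my own rough assumptions, so check actual labels before relying on them.

```python
# Rough back-of-the-envelope energy balance. All constants are approximate:
# the per-session burn is the article's range, the rest are common estimates.
KCAL_PER_CAN = 140      # assumption: typical 12-oz can of regular soda
KCAL_PER_SESSION = 250  # midpoint of the article's 200-300 kcal per workout
KCAL_PER_LB_FAT = 3500  # the usual rule of thumb for a pound of body fat

def weekly_deficit(sodas_cut_per_day, sessions_per_week):
    """Total weekly calorie deficit from cutting sodas and adding exercise."""
    return sodas_cut_per_day * KCAL_PER_CAN * 7 + sessions_per_week * KCAL_PER_SESSION

deficit = weekly_deficit(sodas_cut_per_day=4, sessions_per_week=3)
print(f"{deficit} kcal/week, roughly {deficit / KCAL_PER_LB_FAT:.1f} lb/week")
```

Run with those numbers – someone dropping the 4-can-a-day soda habit and exercising three times a week – the soda accounts for the large majority of the deficit, which is exactly Melanson’s point: the “calories in” side moves the needle far more easily than a half-hour on the treadmill.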