Greg Mankiw’s new post on his personal work incentives is required reading for anyone who wants to discuss taxes in this election cycle.
The idea is simple: our tax system uses marginal rates, meaning one rate applies to the first dollar earned and higher rates kick in at successive income thresholds. (That is, unless you’re so economically productive or generous as to get stuck in the Alternative Minimum Tax system and get taxed at a high flat rate.) Your marginal rate is the rate that applies to the last dollar you earn in a year, and it determines how much it’s worth to you to make the effort to earn that dollar. If you’re acting rationally, it’s the rate that determines whether you take a second job, have a one- or two-income family, or start that business on the side you’ve been talking about.
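The bracket arithmetic can be sketched in a few lines. The thresholds and rates below are hypothetical placeholders, not any actual tax schedule or candidate’s plan:

```python
# Hypothetical brackets for illustration only: (income threshold, rate).
# Each rate applies only to the slice of income above its threshold.
BRACKETS = [
    (0, 0.10),       # 10% on income from $0 up to the next threshold
    (10_000, 0.25),  # 25% on income above $10,000
    (50_000, 0.35),  # 35% on income above $50,000
]

def tax_owed(income: float) -> float:
    """Total tax: sum each bracket's rate times the income inside it."""
    owed = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > threshold:
            owed += (min(income, upper) - threshold) * rate
    return owed

def marginal_rate(income: float) -> float:
    """The rate on the last dollar earned -- the one that drives incentives."""
    for threshold, rate in reversed(BRACKETS):
        if income > threshold:
            return rate
    return BRACKETS[0][1]
```

Note the distinction the sketch makes plain: at $60,000 of (hypothetical) income, the average rate on all dollars is well below the 35% marginal rate that applies to the next dollar, and it’s the marginal rate that matters for the second-job decision.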
Mankiw takes it one step further and asks how much he could leave for his kids out of that last dollar under each presidential candidate’s plan. You could do the same thing for any long period of time, of course, like saving for retirement or for your kid’s college education.
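The shape of that calculation is easy to sketch: the last dollar is taxed once as income, its returns are taxed each year as it compounds, and the remainder is taxed again as an estate. Every rate below is an invented placeholder, not Mankiw’s figures or any candidate’s proposal:

```python
# A hedged sketch of the "what's left for the kids" arithmetic.
# All rates and the horizon are hypothetical placeholders.
def left_for_kids(income_tax=0.35, return_rate=0.08, capital_tax=0.15,
                  estate_tax=0.45, years=30):
    after_income_tax = 1.0 - income_tax                   # survives the paycheck
    annual_growth = 1 + return_rate * (1 - capital_tax)   # returns taxed yearly
    before_estate = after_income_tax * annual_growth ** years
    return before_estate * (1 - estate_tax)               # estate tax at the end
```

The same function works for the retirement or college-savings version of the question: just change the horizon and drop the estate tax.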
Do yourself a favor and read the post.
I offer a final thought for the evening. Last night, a dear friend and I were discussing the state of the world and the nation, particularly with reference to some of the more extreme economic proposals made by politicians and pundits of varying degrees of skill. My friend is one of the most intelligent, well-educated, level-headed, and reflective people I have ever known. He noted that the proposals in question reflected a radical embrace of a radical degree of government control of private affairs. He said, “I fear for America. The people really won’t stand for democracy much longer.”
Coming from this source, that sent chills down my spine. I hope he’s wrong. But I think he might be right.
[Author’s note: In the spirit of my four other posts today, I choose not to explicate this post any further. As one of my favorite math professors used to say, “The proof of whether I’m right or wrong – and I’m right – is left as an exercise to the reader.”]
Sometime in the last few generations, logic started getting short shrift. I don’t mean logic as a concept; plenty of people can, and do, invoke “logic” as a defense for completely absurd arguments. No, I mean LOGIC, the formal subject of study, the one involving connectives like “and,” “or,” and “xor,” as well as fancy Latin names for various fallacies. Logic has gone missing, and we’re all of us the worse off for its absence.
When I was young, I had to do lots of logic games. These were the kind involving a grid (or several grids) that the problem solver fills with Xs and Os while working out which statements or pairings of entities are correct and which are not. For example, a problem might center on allocating livestock to farmers or favorite subjects to school students, given sufficient but incomplete facts. There were a lot of variants, but these are the ones I remember most.
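The method behind those grids is just systematic elimination, which a computer can brute-force. Here is a minimal sketch with invented farmers, animals, and clues:

```python
# A tiny brute-force solver for a logic-grid puzzle of the kind described
# above. The farmers, animals, and clues are invented for illustration.
from itertools import permutations

farmers = ["Ann", "Ben", "Cal"]
animals = ["cow", "pig", "hen"]

def solve():
    """Try every pairing and keep the one consistent with all clues."""
    for assignment in permutations(animals):
        pairing = dict(zip(farmers, assignment))
        # Clue 1: Ann does not keep the cow.
        if pairing["Ann"] == "cow":
            continue
        # Clue 2: Ben keeps the pig.
        if pairing["Ben"] != "pig":
            continue
        return pairing  # all clues satisfied
    return None
```

The pencil-and-paper version works the same way, except the solver prunes the grid with Xs instead of looping over permutations.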
My complaint is not that I did this and “kids these days” don’t. My complaint is that most kids didn’t do problems like that, then, either. See, I only did those games because I was assigned to the school’s “academically gifted” or “gifted and talented” programs (the name changed at some point for political correctness reasons). The rest of my classmates got the chance to do exactly one of these problems during my elementary school years, as I recall. Only a few of us did them regularly.