Complete elimination of the corporate income tax would boost the American economy by 6 percent in the long run — enough to raise other tax revenues and, over time, make the tax cut revenue-neutral.
The government would immediately lose $273.5 billion in annual corporate tax revenue if the tax were eliminated. But within roughly ten years, income, payroll, and other tax revenues would rise by an annual $273.9 billion as a result, according to the Tax Foundation.
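A rough sketch of the break-even arithmetic implied by those Tax Foundation figures (the linear ten-year phase-in below is my own simplifying assumption; the report gives only the endpoint):

```python
# Break-even sketch for the Tax Foundation estimate above.
# Assumption: the $273.9B offsetting revenue gain phases in
# linearly over ten years -- the report gives only the endpoint.

immediate_loss = 273.5e9   # annual corporate tax revenue forgone
eventual_gain = 273.9e9    # eventual annual gain in income/payroll/other taxes

for year in range(1, 11):
    gain = eventual_gain * year / 10   # assumed linear phase-in
    net = gain - immediate_loss        # net annual revenue effect
    print(f"Year {year:2d}: net annual revenue {net / 1e9:+.1f}B")
```

Under this assumed phase-in, annual revenues only break even around year ten, which matches the "revenue-neutral over time" framing above.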
As the graph above shows, the more the corporate income tax rate is cut, the more economic growth rises; raising the rate, by contrast, would hurt growth and might actually cost tax revenue. Every cut to the corporate income tax rate would boost revenue, with the largest revenue gain coming at a rate of about 20 percent. A 20 percent rate, however, would boost economic growth by only 3 percent, whereas eliminating the corporate tax entirely would boost growth by 6 percent.
Government bans on the sale and distribution of raw milk and raw milk products are enforced in the name of public safety. But many people enjoy the health benefits of milk that has not been pasteurized, and some farms want to sell it. Are the health threats from raw milk significant enough to warrant a ban on its sale? Government data and the lack of regulation of other raw foods suggest that they are not.
The Food and Drug Administration (FDA) currently prohibits the interstate sale or distribution of raw milk and raw milk products, such as yogurt, ice cream, cheese, and sour cream, and requires anyone selling raw milk to be licensed. The FDA delegates all further regulation to the states by advising them to likewise regulate the sale and distribution of raw milk. Based on this advice, 40 states prohibit the retail sale of raw milk and raw milk products, and 13 states make unpasteurized dairy products completely illegal for human consumption.
Though they have the potential to make people sick if they are not prepared carefully, unpasteurized dairy products carry health benefits—especially for those people who are sensitive to lactose or have allergies. Kristin Canty, director of the documentary Farmageddon, told us, “My son was completely ridden with allergies and asthma until we started drinking raw milk as a family. We have been drinking it now for 15 years. I find it ridiculous that the government thinks that they have the right to tell us that we can’t consume a substance that has been used for sustenance for thousands of years.”
Dan Allgyer, an Amish farmer in Pennsylvania who violated the FDA’s ban on interstate raw milk sales, was subjected to an early-morning armed raid on his farm that put him out of business. Milk from Allgyer’s now-closed farm was never alleged to have made any of his customers in the Washington, D.C. area sick. Rather, his customers sought out his dairy products fully aware that they were unpasteurized.
The FDA and Centers for Disease Control and Prevention are determined to educate the public on the perceived dangers of raw milk, but their data suggest that raw milk is not as dangerous as their slanted language implies.
In its video explanation of the dangers of raw milk, the FDA claims that, “healthy people of any age can get very sick or even die if they drink contaminated raw milk.” This is not news—healthy people of any age can get very sick or even die if they drink or eat any contaminated food. All products carry risks, but their sale is not prohibited. Consumer freedom would be substantially limited if only fully-cooked chicken breasts or well-done steaks were available for sale.
Following months of tenacious lobbying by groups like the American Heart Association, State Sen. Ed Hernandez (D-Azusa) introduced a bill last week that would raise the smoking age in California from 18 to 21 and impose a $2 cigarette tax hike. While the proposal is certainly well-intended, Californians should be skeptical of it: new research suggests that cigarette taxes do not decrease smoking as much as conventional wisdom holds.
Since 2002, there have been more than 110 state-level increases in cigarette taxes. Taxes now make up about $2.50 of the $6 price of a typical pack of cigarettes. Adult smoking rates in the U.S. dropped from 22.5 percent in 2002 to 19 percent in 2011, according to the Centers for Disease Control and Prevention, which hopes to see rates fall to 12 percent by 2020. Both policymakers and public health advocates like the American Lung Association see cigarette tax increases as a valuable tool for discouraging smoking, and with numbers like these, the public health case seems hard to argue against.
Yet, this argument is misguided. New research by professors Kevin Callison and Robert Kaestner published in Regulation questions whether further raising cigarette taxes will do much to decrease smoking rates. They find that adult cigarette use is largely unaffected by taxes. Moreover, they estimate that a large increase, possibly about 100 percent, would be required to decrease smoking rates by just 2 to 3 percent. This flies in the face of the conventional arguments provided by cigarette tax advocates.
Moreover, the study finds, the burden of further cigarette taxes falls acutely on the nation’s poor. If, as noted above, tax increases do little to reduce smoking, then a $1 tax increase would cost the average low-income smoker an extra $480 per year. Even in the best-case scenario, in which the tax modestly reduces consumption, that person would still spend an extra $450 per year, every year going forward. Not only would higher taxes do little for public health, but their burden would fall disproportionately on those with lower incomes.
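As a back-of-the-envelope check on where a figure like $480 per year comes from (the consumption rate of roughly 1.3 packs per day is my assumption for illustration; the study’s exact inputs aren’t given here):

```python
# Back-of-the-envelope check on the annual cost of a $1/pack tax hike
# to a smoker whose consumption is unchanged by the tax.
# Assumption: roughly 1.3 packs per day (illustrative, not the study's input).

tax_increase_per_pack = 1.00   # dollars
packs_per_day = 1.3            # assumed consumption

annual_extra_cost = tax_increase_per_pack * packs_per_day * 365
print(f"Extra cost: ${annual_extra_cost:.0f} per year")
# roughly in line with the $480/year figure cited above
```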
With reforms to No Child Left Behind up for debate, House Republicans are wisely proposing that low-income families be allowed to take a portion of their federal funding to different public schools of their choosing. This form of school choice is known as “portable funding,” or sometimes “backpack funding” since money follows the child.
Low-income families are especially in need of school choice. The upper-class can afford private school tuition and the middle-class can afford to locate in neighborhoods with quality public schools. However, low-income families are too often left with no outlet from failing schools.
School choice is essential because every child has different needs, and what a student needs from a school cannot be reduced to a single test score or grade. Factors such as academic performance, school safety, and strength in a particular subject matter to each student differently; school choice allows parents to weigh those factors and do what is best for their children.
Whether or not a school has high test scores or meets government-mandated standards should not be of great importance for government funding. If enough families choose to send enough students to a school to make it viable, then that school should be considered a success. All that counts is whether a school serves its students’ needs, whatever those needs may be.
On the left, teachers unions and progressives claim that portable federal funding of education will take resources away from poor school districts. But this approach fails to recognize the educational benefits for individual families and students who can take advantage of school choice. It also perpetuates the falsehood that bad schools just need more taxpayer money to improve. Customized education helps all students individually, whereas pouring more money into failing education systems has failed to produce nationwide gains.
Rudy Giuliani made headlines this week when he stated that President Obama had been “influenced by communists since an early age.” The comments garnered critical reaction ranging from those calling Giuliani wrong to those calling him a racist. However, his critics ignore a very real and plain truth: Marxism has been a major influence in modern American liberalism since the 1960s and has played a large role in the president’s life.
The self-proclaimed Encyclopedia of Marxism details how communists helped to establish the peace movements of the 1960s in Europe and the U.S., which led to heavy Marxist involvement against the war in Vietnam. It’s important to note that while it was never proven that foreign communist regimes were involved with the anti-war movement, it is undoubtedly true that Americans influenced by and espousing Marxism were at the front of the anti-war movement.
It was in that movement that men like Bill Ayers would gain prominence. Ayers cofounded the self-described communist revolutionary group the Weather Underground, which orchestrated a string of bombings of U.S. government buildings during the 1960s and 70s.
In 2008, the media scrutinized the relationship between Barack Obama and Bill Ayers. Ayers had attended several functions at Obama’s home, yet Obama maintained that the two were merely acquaintances, a claim that was largely verified and then dropped. But the media missed the point: a communist terrorist was treated with such respect and reverence by the left that he passed nonchalantly through engagements attended by Democratic presidential candidates. And he continues to excuse his organization’s terrorism to this day.
American colleges are full of Marxist professors; this is not contested. The University of Illinois at Chicago, where Ayers taught, and the University of Chicago, where Obama briefly taught law, are obvious examples. One need only look at the Democratic Socialists of America, whose members include educators, activists, and public officials, to see the prevalence of Marxist thought in American politics. Their website proclaims that “Democratic Socialists believe that both the economy and society should be run democratically to meet human needs, not to make profits for a few.”
The US government has a hazardous waste problem. Many federal departments have properties that are contaminated with everything from radiation at Department of Energy facilities to petrochemicals on current and former military bases.
The departments of Agriculture and the Interior are no different, as highlighted in a new report from the Government Accountability Office (GAO). Over 5,000 sites managed by these two departments are either likely to be contaminated or have been confirmed to be polluted with some form of hazardous waste. Moreover, the Department of the Interior has a backlog of more than 30,000 possibly contaminated abandoned mines waiting to be assessed. The vast majority of these sites are on lands in the western half of the country, scattered throughout properties of the Bureau of Land Management.
The report is the latest in a long series of case studies showing that the federal government is generally a poor steward of the lands within its jurisdiction. In 2014, Newsweek deemed the Department of Defense one of the world’s biggest polluters. It’s only logical that other government agencies, especially those whose explicit mandate is to oversee federal lands, would have similar failings. The departments of Agriculture and the Interior claim to have about $500 million in environmental liabilities that will require future cleanup, nearly all of which will be paid with taxpayer dollars.
However, these estimates are an understatement. In 2012, the Army Corps of Engineers assessed Formerly Used Defense Sites (FUDS) on Agriculture and Interior department lands and found the potential cleanup cost to be much higher. Millions have already been spent to mitigate the environmental damage wrought by these sites, and yet the Corps determined that another $4.7 billion will be needed to complete the remediation of the hundreds of remaining sites.
Moreover, if we know anything about the history of government projects, that number is probably still far too low. Large government projects often end up costing far more than expected, even double by the time they are completed.
The biggest story so far surrounding this year’s Conservative Political Action Conference (CPAC) is the announcement that Phil Robertson, the controversial star of the hit A&E show Duck Dynasty, will receive the second annual Andrew Breitbart First Amendment Award.
No matter how much new buzz the announcement brings CPAC, the decision betrays a fundamental misunderstanding of how free speech works, and where the future of nationally competitive conservatism lies.
At the end of 2013, Robertson was briefly suspended by the network over remarks he made in a GQ interview calling homosexuality sinful and comparing it to bestiality. At the time, his suspension sparked a culture war flare-up between gay rights supporters and social conservatives, who felt Robertson’s freedom of speech was being suppressed.
However, the Duck Dynasty flap (pun intended) was a dispute within a private organization well within its rights to take the action it did. A&E had the authority to suspend Robertson from the moment he voluntarily signed the contract for the show, no matter how it handled the controversy afterwards.
Had the government taken Duck Dynasty off the air, I’d be up in arms, even as a member of the very LGBT community he marginalized. But that is not what happened here.
The First Amendment protects Americans from government censorship. A network’s decision to craft the messages it broadcasts is itself an exercise of free speech.
And let’s not forget that A&E’s decision to reinstate Robertson after vociferous protests shows that free speech is as healthy as ever. The government didn’t force anyone to do anything here.
Just as many argue that evangelical Christian bakers should not be forced to make wedding cakes for marriages they oppose, consistency demands that neither should a private television network be forced to air opinions it doesn’t want to promote. Freedom of speech cuts both ways, and conservatives who truly care about promoting the values of the Founding Fathers will defend it regardless of its popularity.
This year’s Valentine’s Day was disastrous — not just for me, but for many ex-couples. But as I sat there on Sunday nursing my broken heart, I realized what’s wrong with romance today: not enough regulation.
The United States government has wisely chosen to regulate most other aspects of life, from what wage you are allowed to work for to what medicines a patient is allowed to buy over the counter. Voluntary interactions are all well and good, but the bottom line is that people have to be protected from themselves. The trade-off between liberty and security exists not only in privacy and foreign policy: we must strike a similar balance in the arena of love.
I propose the creation of a new government organization, the Committee to Assure Romantic Equity (CARE), to bring an end to the current Wild West of romance. Three powerful sets of regulations would bring much-needed stability to the chaos of dating.
1. Who’s allowed to date?
Just as professionals — from hair-braiders to interior decorators — must be licensed, so too the government must step in to license daters.
Right now, the dating market is overrun with shoddy specimens. Sleazy men buy women drinks and sleep with them on the first date. Immoral women cheat on their loving boyfriends. Many people lack the discretion to choose good partners for themselves, and their poor decisions bring out the worst in everyone involved. Never mind that some of them have children.
To remedy this situation, any dating hopeful should have to submit an application to CARE. A licensing system should be set up whereby applicants pay for classes in order to certify both their good-heartedness and their ability to treat a partner well. In order to enforce this system, CARE agents would inspect couples, fining or jailing any individual engaged in dating without a CARE permit.
This wise step will remove the riff-raff from the dating market and ensure that good, kind individuals are never lured into romances they’ll regret. And if a few people find themselves forcibly removed from the dating pool, so what? They probably weren’t great partners to begin with.