Suppose someone from an insurance company came to you in the year 2050 and said, “We’ve run computer models many thousands of times using all kinds of different assumptions. In the worst-case scenario, a very small fraction of the computer runs—about 1 in 500—has you losing 20% of your income in the year 2100. In order to insure you against this extremely unlikely outcome that will occur in half a century, we want to charge you 3.4% of your income this year.”
Would you want to take that deal? Of course not. The premium is way too high in light of the very low probability and the relative modesty of the “catastrophe.” When someone’s house burns down, that’s a much bigger hit than 20% of annual income. And yet, the premiums for fire insurance are quite reasonable; they’re nowhere near 3.4% of income for most households. Moreover, the threat of your house burning down is immediate: It could happen tomorrow, not just fifty years from now. That’s why people have no problem buying fire insurance for their homes. Yet the situation and numbers aren’t anywhere close to analogous when it comes to climate change policies.
Recognizing that they can no longer make their case on the basis of down-the-middle projections, those favoring massive government intervention in the name of fighting climate change have resorted to focusing on very unlikely but devastating scenarios. In this context, they have likened their preferred government policies to a form of insurance.
However, this analogy fails for several reasons. First, insurance in the marketplace is voluntary; when the government forces people to buy it—as with ObamaCare—then there is indeed a public outcry. Second, actual insurance in the marketplace is based on extensive actuarial data; we have no such understanding with climate change, but instead the outcomes against which we are “insuring” live inside computer projections.
Finally, even taking the insurance analogy head-on, the numbers don’t work. Nobody would take out an insurance policy on the terms of likely payouts and expense of premium that climate change policy offers.
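A back-of-the-envelope sketch makes the point concrete. The probability, loss, and premium figures are the ones from the hypothetical offer in the opening paragraph; the 3% discount rate is an illustrative assumption of mine, not a number from the text:

```python
# Hypothetical insurance offer from the opening paragraph:
p_loss = 1 / 500        # probability of the bad outcome (0.2%)
loss_frac = 0.20        # fraction of income lost in the year 2100
premium = 0.034         # premium charged: 3.4% of this year's income

# Actuarially fair premium, ignoring discounting:
expected_loss = p_loss * loss_frac       # 0.04% of income
markup = premium / expected_loss         # how many times the fair price?

# The payout is also 50+ years away; discounting at an assumed
# 3% per year shrinks the fair premium even further.
years = 50
discount_rate = 0.03
fair_premium_pv = expected_loss / (1 + discount_rate) ** years

print(f"expected loss:       {expected_loss:.4%} of income")
print(f"premium / exp. loss: {markup:.0f}x")
print(f"fair premium (PV):   {fair_premium_pv:.4%} of income")
```

Even before discounting, the quoted premium is 85 times the expected loss; no buyer of fire insurance would accept anything like that markup.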
Consider two popular policies: a carbon tax and a minimum wage hike. The former drives up the cost of producing carbon dioxide. This incentivizes firms to reduce their carbon dioxide output, both by spurring innovation that makes carbon-based fuel more productive (doing more with less, making that fuel work harder) and by seeking technological alternatives to carbon-based energy altogether.
The latter drives up the cost of human labor. This incentivizes firms to reduce their payroll expenditure, both by spurring innovation that makes labor more productive (doing more with less, making that labor work harder) and by seeking technological alternatives to human labor altogether.
Generally, the people who support the former tend to support the latter, yet their expectations of each contradict one another: they expect a tax on carbon to reduce carbon emissions, while insisting that raising the price of labor will not reduce employment.
[T]he “social cost of carbon” is not an objective fact of the world, analogous to the charge on an electron or the boiling point of water. Many analysts and policymakers refer to the “science being settled” and so forth, giving the impression that the SCC is a number that is “out there” in Nature, waiting to be measured by guys in white lab coats.
On the contrary, by its very nature the SCC is an arbitrary number, which is completely malleable in the hands of an analyst who can make it very high, very low, or even negative, simply by adjusting parameters. Precisely because the SCC even at a conceptual level is so vulnerable to manipulation in this fashion, the analysts giving wildly different estimates are not “lying.” …
Generating estimates of the SCC involves using computer models with (arbitrary) simulated damages that go out centuries in the future, and then the analyst must arbitrarily select a discount rate to convert those future damages into present-dollar terms. Because of these ingredients in the estimation process, an analyst can generate just about any “estimate” of the SCC he wants, including a negative one—which would mean carbon dioxide emissions confer third-party benefits on humanity, and (using the Administration’s logic) ought to receive subsidies from the taxpayer.
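A toy calculation illustrates how malleable the SCC is. Every number below is invented purely for illustration (the benefit and damage streams are arbitrary); the point is only that the analyst's choice of discount rate, by itself, can flip the sign of the result:

```python
# Toy net-damage profile for one ton of CO2 (illustrative numbers only):
# a small near-term benefit (e.g. agricultural fertilization), followed
# by damages stretching centuries into the future.
def toy_scc(discount_rate, horizon=300):
    scc = 0.0
    for t in range(horizon):
        if t < 50:
            net_damage = -0.10   # $0.10/ton annual benefit early on
        else:
            net_damage = 0.50    # $0.50/ton annual damage later
        scc += net_damage / (1 + discount_rate) ** t
    return scc

for r in (0.025, 0.07):
    print(f"discount rate {r:.1%}: toy SCC = ${toy_scc(r):+.2f}/ton")
```

At a 2.5% rate the far-future damages dominate and the toy SCC is positive; at 7% they are discounted into insignificance and the near-term benefits dominate, making the toy SCC negative, exactly the kind of sign flip Murphy describes.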
In 2012, in a proceeding straight out of the Inquisition, an Italian court convicted six scientists for providing “inexact, incomplete and contradictory information” in the lead-up to the deadly 2009 L’Aquila earthquake. Now, a philosophy professor says that case may provide a worthwhile example for the treatment of scientific dissenters—specifically, “climate deniers who receive funding as part of a sustained campaign to undermine the public’s understanding of scientific consensus.”
Writing for The Conversation, a publication geared to academics, Lawrence Torcello, a professor of philosophy at the Rochester Institute of Technology, says the time for conversation is over.
The importance of clearly communicating science to the public should not be underestimated. Accurately understanding our natural environment and sharing that information can be a matter of life or death. When it comes to global warming, much of the public remains in denial about a set of facts that the majority of scientists clearly agree on. With such high stakes, an organised campaign funding misinformation ought to be considered criminally negligent.
Torcello then tries to parse the details of the earthquake case, saying, it was “actually about the failure of scientists to clearly communicate risks to the public.”
Even so, Torcello acknowledges the chilling effect of prosecuting scientists for not framing their public statements in a sufficiently prosecutor-friendly manner. Researchers might not try to warn anybody if they could face fines or prison time for being inexact, incomplete, or contradictory after the fact. He ultimately allows that he wouldn’t actually criminalize poor scientific communication—just anybody who might support dissenting scientists, or receive such support.
If those with a financial or political interest in inaction had funded an organised campaign to discredit the consensus findings of seismology, and for that reason no preparations were made, then many of us would agree that the financiers of the denialist campaign were criminally responsible for the consequences of that campaign. I submit that this is just what is happening with the current, well documented funding of global warming denialism….
We have good reason to consider the funding of climate denial to be criminally and morally negligent. The charge of criminal and moral negligence ought to extend to all activities of the climate deniers who receive funding as part of a sustained campaign to undermine the public’s understanding of scientific consensus.
If you’re trying to figure out how that doesn’t threaten the free exercise of speech, Torcello assures us, “We must make the critical distinction between the protected voicing of one’s unpopular beliefs, and the funding of a strategically organised campaign to undermine the public’s ability to develop and voice informed opinions.”
So…You can voice a dissenting opinion, so long as you don’t benefit from it or help dissenters benefit in any way?
By the way, according to RIT, Torcello researches “the moral implications of global warming denialism, as well as other forms of science denialism.” Presumably, his job is a paid one. But this is OK, because…the majority of scientists agree with his views on the issue?
Let’s allow that they do—and that a majority of scientists agree about man-made climate change and a host of other issues. Just when does the Tribunal of the Holy Office of the Inquisition meet to decide what is still subject to debate, and what is now holy writ? And is an effort to “undermine the public’s understanding of scientific consensus” always criminally negligent? Can it ever be simple scientific inquiry? Or even heroic?
Or maybe we just assume underhanded motives on the part of scientific outliers and their supporters once the committee has ruled.
Well, that’s my privilege [to retract previous apocalyptic predictions about climate change]. You see, I’m an independent scientist. I’m not funded by some government department or commercial body or anything like that. If I make a mistake, then I can go public with it. And you have to, because it is only by making mistakes that you can move ahead. …
Take this climate matter everybody is thinking about. They all talk, they pass laws, they do things, as if they knew what was happening. I don’t think anybody really knows what’s happening. They just guess. And a whole group of them meet together and encourage each other’s guesses.
The 94-year-old scientist, famous for his Gaia hypothesis that Earth is a self-regulating, single organism, asserts that environmentalism has “become a religion” and does not pay enough attention to facts; he also said that he had been too certain about the rate of global warming in his past book.
“The problem is we don’t know what the climate is doing. We thought we knew 20 years ago. That led to some alarmist books – mine included – because it looked clear-cut, but it hasn’t happened.”
People unfamiliar with how water works in California and the American West might conclude that this is a case of the government forcibly taking water from the farmers and handing it over to the environmentalists for their pet projects. This is, however, completely untrue. The first thing to know is that there is no functioning market in water in California, and there are no market prices. Virtually all water in California and the American West is controlled by, distributed by, and “priced” by government agencies. This system of water socialism (described by Bill Anderson here, and yours truly here) is what rules the allocation of water in the West, and by extension, it rules any industry or endeavor that requires water. Thus agriculture is a socialized industry in the West, where the main input for the industry, water, is allocated along socialistic lines. There is no market pricing, because the pricing is done in a way to benefit powerful political interests.
In many Western states, still including California, although cities are quickly eroding their power, the growers are very powerful lobbies who have for the past 70 years enjoyed the benefits of extremely cheap, subsidized water. They do not own the water, and so when one of the farmers commented on the delta smelt situation and said “We are not interested in welfare; we want water” he was being unintentionally funny. Cheap water for Central Valley farmers, who are growing food in a desert, and who only get water thanks to massive taxpayer-funded public works, is a major form of welfare for them.
So, it would be naive in the extreme to frame this story, as the conservatives have, as some sort of battle between the poor, beleaguered farmer who is having “his” water taken away, and the usurper environmentalists.
Indeed, the water situation is just the latest chapter in a long history of using government to pick winners and losers in California and the west using the allocation of water as a weapon.
The conservatives who have declared the farmers to be the rightful owners of the water are simply ignoring the history of river water along the West coast.
Long forgotten is the fact that once upon a time, there were massive fisheries of salmon along the West coast that supported large numbers of canneries, fishing villages, industrial fishing operations, and all the usual support economies that go along with any industry.
Those industries are all massively reduced now, not because of global warming or overfishing or some other environmentalist bogeyman, but because the fisheries were ruined by governments. The governments that dammed up hundreds of rivers along the West coast, and thus destroyed the salmon and steelhead trout breeding grounds, did so with the enthusiastic support and lobbying of the growers who now are whining about some water actually being allowed to flow out into the ocean.
Whole industries were destroyed at the behest of the growers and cities that wanted water storage for their favored interests. This is not limited to the west coast of course. The Colorado River delta at the Sea of Cortez was once a huge estuary where many native communities of fishermen thrived. The fishing industries there are now all also destroyed. They were destroyed so that American farmers can grow pecans (native to humid Mississippi) in scorchingly dry Phoenix.
All of this change came not due to shifts in the marketplace or the will of consumers. On the contrary, the fisheries provided extremely cheap protein to millions of people for many, many years. There was huge demand for the industry. No, these industries were eviscerated because of decisions by governments and special interests. It was simply decided, for political reasons, that damming up the rivers and drying up the deltas was better than allowing the rivers to flow.
It is also a fact that without massive amounts of government capital, these dams would have never been built. The infrastructure of water storage and damming was only ever possible thanks to governmental central planning and taxation. Would irrigation still exist were it not for the governmental meddling in water? Certainly it would. Irrigation farming predates government dams. But the scale of government projects is much, much larger, and has far greater impact on the surrounding geography.
One might counter that the fishing industries were also the beneficiaries of government largesse, because the government-owned rivers were being used by the fishing industries to breed their fish. That’s certainly true as well.
So the question we should be asking ourselves is: “Which group has the right to take control of the rivers? Farmers? Fishermen? Environmentalists?” All want the “publicly” owned water for their own purposes. The answer is that none of them should be using politics to seize control of water. Ownership of the rivers, instead, should be decentralized and privatized, and the water should be sold to the highest bidder.
In contrast, I can tell you that sending in the government to centrally plan the whole affair, as has been done, is decidedly not the correct answer. This is of course the answer the conservatives are fine with, however. For them, the fact that the government one day went in and decided that growers are to be the winners, and the fisheries are to be the losers, is a-okay. For anyone who is actually concerned about finding free-market solutions, however, it’s all just more central planning.
As Kathryn Muratore recently noted, there is no easy answer here, thanks to 70 years of water socialism. Nevertheless, there’s no time like the present, and the best way to end the drought is to establish a functioning system of prices and water ownership in the West. The farmers, the cities, and the politicians will cry bloody murder and complain that nothing can work in the West except the established system of prior-appropriation water rights. That system, however, has failed: history has shown that it requires government central planning, amounts to nothing more than water socialism, and is thus unsustainable.
Can other systems work? Perhaps a modified riparian system? We won’t know any time soon, because the farmers and politicians will cling to their precious status quo of “cheap” water for those who write the biggest checks, not for water, but for political influence.
Los Angeles City Council has voted to seize the local private business/large apartment trash hauling industry, take control of it, and sell off exclusive contracts to those it deems appropriate. The word “seize” is not used, of course, but instead it’s all being sold as a recycling and landfill-use reduction plan. It takes the Los Angeles Times nine paragraphs to get past the environmental back-patting to explain what’s actually going on:
Currently, landlords for businesses and apartments choose between competing businesses to haul their trash. Under the new “exclusive franchise” system, Los Angeles will be divided up into 11 zones. Haulers will bid for city contracts giving them the exclusive right to collect garbage in each zone.
The new system is hitched to environmental standards: To be eligible to win each zone, haulers would have to provide separate bins for recycling and use “clean fuel” vehicles, among other ecologically friendly requirements.
The plan is backed by environmentalists and labor groups, who say the system is the best way to help Los Angeles meet its goal of diverting 90% of its trash from landfills. Activists say the system will also mean fewer trucks crisscrossing city streets and safer conditions for workers in a dangerous industry.
The city is turning a private competitive service into a monopoly. The Times does note that the proposal puts unions and environmentalists against business and private property:
Business groups say the new system will put small haulers out of business and ultimately drive up rates.
"The environmental benefits are subterfuge for an effort to organize an industry that the unions couldn’t organize themselves," Central City Assn. of Los Angeles president and CEO Carol Schatz told The Times last week.
Indeed, labor unions were chanting “Sí se puede” (“Yes, it can be done”) outside the council meeting after the vote passed. Those union folks really, truly care a lot about the environment, eh?
Only one council member voted against the new regulation, Bernard C. Parks. He was also the only council member to vote against the city’s pointless plastic bag ban. Reason TV and Kennedy interviewed him in 2012. He was concerned this new trash plan would harm small businesses. The head of a local commerce association predicted the new monopolies could drive more than 100 small haulers out of business and suggested the city could require environmental and recycling policies among private haulers without resorting to exclusive contracts.
Speaking of small businesses being harmed in Los Angeles, the city is still shutting down medical marijuana dispensaries that don’t fall under the city’s protection racket put in place by a local ballot initiative. The Los Angeles Daily News notes the city is also extracting fines from and charging landlords who rent to unauthorized pot dispensaries, even though the city is still causing confusion by sending out tax certificates to applicants that don’t qualify to do business in the city. These certificates are then being shown to landlords as evidence that the dispensary is legal, even if it’s not.
David Friedman points us to how data has been grossly, and no doubt deliberately, misrepresented:
One problem in arguments about climate (and many other things) is that most of the information is obtained at second, third, or fourth hand, with the result that what you believe depends largely on what sources of information you trust. One result is that people on either side of the argument can honestly believe that the evidence strongly supports their view. They trust different sources; different sources report different evidence. It is thus particularly interesting when on some point, even a fairly minor one, you can actually check a claim for yourself. I believe I have found an example of such a claim.
Cook et al. (2013) is the paper, possibly one of two papers, on which the often repeated claim that 97% of climate scientists support global warming is based. Legates et al. (2013) is a paper which criticizes Cook et al. (2013). Bedford and Cook (2013) is a response to Legates et al. All three papers (the last a pre-publication version) are webbed, although Legates et al. is unfortunately behind a paywall.
Bedford and Cook (2013) contains the following sentence: “Cook et al. (2013) found that over 97% endorsed the view that the Earth is warming up and human emissions of greenhouse gases are the main cause.”
To check that claim, look at Cook et al. (2013). Table 2 shows three categories of endorsement of global warming reflected in the abstracts of articles. Category 1, explicit endorsement with quantification, is described as “Explicitly states that humans are the primary cause of recent global warming.” Category 2 is explicit endorsement without quantification. The description, “Explicitly states humans are causing global warming or refers to anthropogenic global warming/climate change as a known fact” is ambiguous, since neither “causing” nor “anthropogenic global warming” specifies how large a part of warming humans are responsible for. But the example for the category is clearer: ‘Emissions of a broad range of greenhouse gases of varying lifetimes contribute to global climate change.’ If human action produces ten percent of warming, it contributes to it, hence category 2, as implied by its label, does not specify how large a fraction of the warming humans are responsible for. Category 3, implicit endorsement, again uses the ambiguous “are causing,” but the example is ‘…carbon sequestration in soil is important for mitigating global climate change,’ which again would be consistent with holding that CO2 was responsible for some but less than half of the warming. It follows that only papers in category 1 imply that “human emissions of greenhouse gases are the main cause.” Authors of papers in categories 2 and 3 might believe that, or they might believe that human emissions of greenhouse gases were merely one cause among several.
Reading down in Cook et al., we find “To simplify the analysis, ratings were consolidated into three groups: endorsements (including implicit and explicit; categories 1–3 in table 2).” It is that combined group (“endorse AGW” in Table 4) that the 97.1% figure refers to. Hence that is the number of papers that, according to Cook et al., implied that humans at least contribute to global warming. The number that imply that humans are the primary cause (category 1) is some smaller percentage which Cook et al. do not report.
It follows that the sentence I quoted from Bedford and Cook is false. Cook et al. did not find that “over 97% endorsed the view that the Earth is warming up and human emissions of greenhouse gases are the main cause” (emphasis mine). Any interested reader can check that it is false by simply comparing the two papers of which Cook is a co-author. John Cook surely knows the contents of his own paper. Hence the sentence in question is a deliberate lie.
That Cook misrepresents the result of his own research does not tell us whether AGW or CAGW is true. It does not tell us if it is true that most climate scientists endorse AGW or CAGW. It is nonetheless interesting, for two related reasons.
In recent online exchanges on climate, I repeatedly encountered the claim that 97% of climate scientists believed humans were the main cause of global warming. That included an exchange with one of the very few reasonable and civil supporters of the CAGW claim that I encountered in the online arguments, where most participants on either side are neither. So far as I know, Cook et al. (2013) itself says nothing that is not true. But it appears designed to encourage the misreading that actually occurred. It does so by lumping together categories 1–3 and reporting only the sum, and by repeatedly referring to “the consensus” but never stating clearly what that consensus is.
The closest it came to defining the consensus is as the “position that humans are causing global warming,” which leaves it unclear whether “causing” means “are one cause of,” “are the chief cause of,” or “are the sole cause of.” To discover that it meant only the former, a reader had to pay sufficiently careful attention to the details of the paper to notice “contribute to” in the example of category 2 in Table 2, which few readers would do. The fact that Cook chose, in a second paper, to misrepresent the result of the first is pretty good evidence that the presentation of his results was deliberately designed to mislead.
There is a second, and more important, reason why all of this matters. Beliefs on either side depend largely on what sources of information you trust. I have now provided unambiguous evidence, evidence that anyone on either side willing to carefully read Cook et al. (2013) and check what it says against what Bedford and Cook claim it says can verify for himself, that John Cook cannot be trusted. The blog Skeptical Science lists John Cook as its maintainer, hence all claims on that blog ought to be viewed with suspicion and accepted only if independently verified. Since, as a prominent supporter of the position that warming is primarily due to humans and a very serious threat, Cook is taken seriously and quoted by other supporters of that position, one should reduce one’s trust in those others as well. Either they too are dishonest or they are overly willing to believe false claims that support their position.
The fact that one prominent supporter of a position is dishonest does not prove that the position is wrong. For all I know, there may be people on the other side who could be shown to be dishonest by a similar analysis. But it is a reason why those who support that side because they trust its proponents to tell them the truth should be at least somewhat less willing to do so.
P.S. A commenter has located the data file for Cook et al. (2013). By his count, the number of articles classified into each category was:
Level 1 = 64
Level 2 = 922
Level 3 = 2910
Level 4 = 7970
Level 5 = 54
Level 6 = 15
Level 7 = 9
The 97% figure was the sum of levels 1-3. Assuming the count is correct—readers can check it for themselves—that 97% breaks down as:
Level 1: 1.6%
Level 2: 23%
Level 3: 72%
Only Level 1 corresponds to “the Earth is warming up and human emissions of greenhouse gases are the main cause” (emphasis mine). Hence when John Cook attributed that view to 97% on the basis of his own Cook et al. (2013), he was misrepresenting 1.6% as 97%. Adding up categories 5-7, the levels of rejection of AGW, we find that more papers explicitly or implicitly rejected the claim that human action was responsible for half or more of warming than accepted it. According to Cook’s own data.
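The arithmetic is easy to reproduce from the counts above. One assumption: following Cook et al.'s convention, the denominator below includes roughly 40 "uncertain" abstracts (a figure the paper reports separately) alongside the endorsements and rejections; readers should check that against the data file themselves:

```python
# Counts from the Cook et al. (2013) data file, as listed above:
counts = {1: 64, 2: 922, 3: 2910, 4: 7970, 5: 54, 6: 15, 7: 9}

endorse = counts[1] + counts[2] + counts[3]   # levels 1-3
reject  = counts[5] + counts[6] + counts[7]   # levels 5-7
uncertain = 40   # assumption: "uncertain" abstracts reported by the paper
position = endorse + reject + uncertain       # abstracts taking a position

print(f"endorse (levels 1-3): {100 * endorse / position:.1f}%")
for level in (1, 2, 3):
    print(f"level {level}: {100 * counts[level] / position:.1f}%")
```

The combined endorsements come to 97.1% of position-taking abstracts, but level 1, the only category matching the "main cause" wording, is about 1.6%, which is exactly the gap Friedman identifies.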
Would anybody now like to claim that lumping levels 1, 2, and 3 together and only reporting the sum was not a deliberate attempt to mislead?
Back in the Bush II Administration, the American Association for the Advancement of Science (AAAS) nakedly tried to nudge the political process surrounding the passage of the environmentally horrific ethanol fuel mandate. It hung a large banner from the side of its Washington headquarters, picturing a corn stalk morphing into a gas pump, all surrounded by a beautiful, pristine, blue ocean. They got their way, and we got the bill, along with a net increase in greenhouse gas emissions.
So it’s not surprising that AAAS is on the Washington Insider side of global warming, releasing a report today that is the perfect 1-2-3 step-by-step how-to guide to climate change alarm.
This is how it is laid out in the counterfactually-titled AAAS report “What We Know”:
Step 1: State that virtually all scientists agree that humans are changing the climate,
Step 2: Highlight that climate change has the potential to bring low risk but high impact outcomes, and
Step 3: Proclaim that by acting now, we can do something to avert potential catastrophe.
To make this most effective, appeal to authority, or in this case, make the case that you are the authority. From the AAAS:
“We’re the largest general scientific society in the world, and therefore we believe we have an obligation to inform the public and policymakers about what science is showing about any issue in modern life, and climate is a particularly pressing one,” said Dr. Alan Leshner, CEO of AAAS. “As the voice of the scientific community, we need to share what we know and bring policymakers to the table to discuss how to deal with the issue.”
But despite promising to inform us as to “what the science is showing,” the AAAS report largely sidesteps the best and latest science that points to a much lowered risk of extreme climate change, choosing instead to inflate and then highlight what meager evidence exists for potential catastrophic outcomes—evidence that in many cases has been scientifically challenged (for example here and here).
The AAAS takes us through the standard litany of scare-scenarios and tipping points. If you can imagine it, the AAAS mentions it. Rapid sea level rise? Check. Heat waves, floods, droughts? Check. Check. Check. Deteriorating public health? Check. National security threat? Check. Ecological collapse? Check. And the list goes on.
The AAAS’s justification for such alarm?
Below are some of the high-side projections and tail risks we incur by following the current path for CO2 and other greenhouse gas emissions. Most of these projections derive from computer simulations of Earth and its climate system. These models apply the best understanding science has to offer about how our climate works and how it will change in the future. There are many such models and all of them have been validated, to varying degrees, by their ability to replicate past climate changes.
Somehow in its haste to scare us, the AAAS seems to have missed (or ignored) the two hottest topics in climate change these days: 1) that climate models have done remarkably poorly in replicating the evolution of global temperature during the past several decades, and 2) that high-end climate change scenarios from the models are largely unsupported by observations.
Thus, “what the science is showing” completely undermines the AAAS contentions regarding alarming climate change.
So here’s what we are left with.
As to the AAAS’s first point that human actions are causing climate change, this is largely correct, although the degree and details—the most important features—are uncertain and still being intensely studied and debated (a fact left out by the AAAS).
As to the second point, the current best science suggests that coming human-caused climate change is going to be less than expected with a much-diminished risk of abrupt changes with catastrophic outcomes (a fact left out by the AAAS).
Which means that the AAAS’s third point, that immediate action is required to reduce the risk of extreme change, is hardly applicable—especially when recognizing that no matter what action we take in the U.S. (the primary audience of the AAAS report) it will have such a small impact on the course of future climate change as to do nothing to alleviate the overblown worries of the AAAS (a fact left out by the AAAS).
Add this all up and you realize that the AAAS report is the epitome of climate alarmism—long in hype and short in fact and aimed squarely at influencing policymakers. We should expect better, but they drank the ethanol years ago.
Last summer I testified before a Senate subcommittee on the numerous problems with the estimates issued by the Administration’s Working Group on the Social Cost of Carbon. The Working Group’s estimates of the “social cost of carbon” were artificially inflated because of several modeling decisions that it made. Unfortunately, at the time I could only speculate as to the precise quantitative impact of their decisions, because the Working Group didn’t report the intermediate data that would be needed to tweak their findings.
After my testimony, the Heritage Foundation conducted an independent reproduction of (part) of the Working Group’s results, confirming just how significant its decisions were. At that time, Heritage analysts had re-run William Nordhaus’ “DICE” model—one of three computer programs used by the Working Group—to estimate the social cost of carbon, with different parameters fed into the model. When using a discount rate of 7 percent—a scenario that OMB requires federal agencies to include when performing cost/benefit analyses of proposed regulations—Heritage saw the estimates of the social cost of carbon collapse.
Now, Heritage’s analysts have finished a similar exercise with another one of the three Working Group models, this time using Richard Tol’s “FUND” model. The results here are even more striking: As I speculated in my testimony—and now Heritage has confirmed—using the discount rate of 7 percent actually shows a negative social cost of carbon. What this means is that additional tons of carbon dioxide emitted in the near future cause net spillover benefits on humanity. Following the Administration’s own logic, this particular result would imply the need for government subsidies to oil and coal producers.
The Choice of Discount Rate
To remind readers of the context of Heritage’s findings: OMB issues guidelines for federal agencies when they engage in cost/benefit analyses, to ensure uniformity in the reports and (in principle) to provide policymakers with accurate information. OMB clearly requires that cost/benefit analyses be performed using both a 3 percent and a 7 percent discount rate.[i] Therefore, when federal agencies (such as EPA or DOE) propose a new regulation, they must justify it economically using both rates. If they want to include the benefit of reduced carbon dioxide emissions, then they should plug in the “social cost of carbon” at both a 3 percent and a 7 percent rate.
Yet they can’t do so, because the Working Group neglected to generate a 7 percent estimate. This leads to the absurd situation of federal agencies reporting costs and benefits at a “7 percent” discount rate, then having to put in a footnote that actually those aren’t the right numbers because the relevant social cost of carbon estimates “are not available.”
In my Senate testimony, I speculated that this omission of a 7 percent discount rate was no accident. Because the (alleged) harms of carbon dioxide emissions occur in the distant future, using a higher discount rate will reduce the present value of those projected damages. Had the Working Group done the obviously correct thing by reporting a 7 percent estimate of the “social cost of carbon”—so that federal agencies wouldn’t need to fudge with footnotes when reporting their regulatory analyses at a “7 percent” discount rate—then the public would have gained some insight into just how dubious this whole enterprise was.
However, I couldn’t say exactly how much the reported social cost of carbon would fall, had the Working Group used a 7 percent figure, because they didn’t save their intermediate results. In other words, outside analysts would have to run the entire computer simulations from scratch, if they wanted to either double check the Administration’s numbers or tweak the inputs to see how much it affected the result.
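The arithmetic behind this speculation is easy to sketch: present value falls geometrically in the discount rate, so for damages projected many decades out, moving from 3 percent to 7 percent shrinks the result dramatically. Here is a minimal illustration in Python—the dollar figure and horizon are invented for illustration, not taken from the Working Group’s models:

```python
# Minimal sketch of discounting: PV = damage / (1 + rate)^years.
# The $100 damage and 80-year horizon are hypothetical, chosen only
# to show how sensitive present value is to the discount rate.

def present_value(damage, rate, years):
    """Present value of a single damage amount occurring `years` from now."""
    return damage / (1 + rate) ** years

# $100 of projected damage 80 years out:
pv_3 = present_value(100, 0.03, 80)
pv_7 = present_value(100, 0.07, 80)
print(f"PV at 3%: ${pv_3:.2f}")   # about $9.40
print(f"PV at 7%: ${pv_7:.2f}")   # about $0.45
```

Raising the rate from 3 to 7 percent cuts the present value of that distant damage by roughly 95 percent—which is why the choice of discount rate, and not the climate science itself, can dominate the reported “social cost of carbon.”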
Heritage Foundation First Ran DICE Model
The Heritage Foundation—especially its programmer Kevin Dayaratna—began work on reproducing the simulations that the Working Group conducted on its three chosen computer models. First the Heritage team chose William Nordhaus’ DICE model. The interested reader can click through to read Heritage’s full report, but in this post I want to focus on the impact of the discount rate. When they plugged in the value of 7 percent, here is what Heritage found:
The table above from Heritage shows just how significant this decision is. For example, in the year 2020, the estimated social cost of carbon using a 3 percent discount rate is about $38/ton of carbon dioxide. In contrast, the SCC in that same year at the 7 percent discount rate falls to about $6/ton, a drop of almost 85 percent. No wonder the Working Group entirely neglected to report its SCC estimates at the 7 percent rate, and no wonder federal agencies insist on using the 3 percent figure (when reporting cost/benefit results at the “7 percent” rate) even though a closer 5 percent figure is available to them.
Now Heritage Runs FUND Model
But wait, it gets better. The DICE model and the PAGE model are “pessimistic” when it comes to their projections of the impact of global warming on human welfare. In contrast, the FUND model—which was selected, remember, by the Working Group as being representative of the published literature in the field—projects that moderate warming will confer net social (“spillover”) benefits for the next several decades.
In other words, an additional ton of carbon dioxide emitted in the near future would—according to the FUND model—cause the world to get slightly warmer. This additional warmth would pull the stream of net benefits (through the year 2060 or so) slightly forward in time, but it would also pull the more-distant stream of net harms (from around the year 2060 onward) slightly forward in time as well. At a high enough discount rate, the “present discounted value” of this shift could well be positive, meaning that—from our current vantage point—emitting an additional ton of carbon dioxide is a socially beneficial activity that the government ought to be subsidizing. It is the mirror image of the standard argument for penalizing carbon dioxide emissions.
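That intuition can be sketched numerically. The cash flows below are purely hypothetical—they are not FUND’s actual damage estimates—but they have the qualitative shape just described: small net benefits per ton for roughly the next 45 years, then larger net harms for the following century. At a 3 percent discount rate the distant harms dominate and the net present value is negative (a positive social cost); at 7 percent the near-term benefits dominate and the sign flips.

```python
# Hypothetical illustration of the sign flip described above.
# The flow numbers are invented, NOT taken from the FUND model.

def npv(rate, flows):
    """Present value of annual net flows, where flows[t] occurs in year t."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Net effect per ton: +$1/year of benefits for ~45 years,
# then -$4/year of harms for the following 105 years.
flows = [1.0] * 46 + [-4.0] * 105

print(f"NPV at 3%: {npv(0.03, flows):+.2f}")  # negative: distant harms dominate
print(f"NPV at 7%: {npv(0.07, flows):+.2f}")  # positive: near-term benefits dominate
```

Nothing about the physical projections changes between the two lines—only the discount rate—yet the policy-relevant number switches sign.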
Heritage’s work has now confirmed the above intuition. In their recent post they outline what the FUND model estimates as the social cost of carbon, using the full range of discount rates as required by existing OMB guidelines:
Notice that in the final column, which shows the estimated SCC using a 7 percent discount rate, the entry is negative at least through the year 2030. In this context, a negative “social cost” is the same thing as a positive “social benefit.” For all the reasons that Americans are told they must accept taxes, subsidies, mandates, and huge public relations campaigns in order to reduce carbon emissions, this result of a negative SCC would flip it all on its head. Following the standard logic, these results would indicate a prima facie argument for increasing carbon emissions, since they generate benefits for third parties that the emitters themselves don’t capture.
To be clear, the Institute for Energy Research is committed to free-market approaches to energy markets and environmental challenges. I am being quite tongue-in-cheek when raising the possibility of government support for carbon-intensive operations.
My serious point, however, is that the alleged scientific case for cracking down on carbon is quite dubious. Even using one of the three computer models picked by the Obama Administration Working Group, if we run it using one of the discount rates required by OMB guidelines, then oops! Out pops a social benefit of carbon.
Of course, nobody can ever truly know what motivates others, but I am very suspicious that the Working Group consciously chose to ignore the OMB guidelines precisely because they didn’t want the awkwardness of very low SCC estimates floating around. In that embarrassing scenario, suddenly the weakness of the “it’s a consensus” claim would be revealed, and policymakers would see quite clearly how judgments about discount rates—which have nothing directly to do with climate analysis—can literally flip the policy conclusion on carbon from “tax” to “subsidize.”
Despite populist rhetoric to the contrary, there is no more efficient or just mechanism for conservation and allocation of resources than individuals acting freely under private ownership of property, making consensual, peaceful, and mutually beneficial exchanges of scarce resources. No “human sacrifice” required.
I’m seeing a lot of wrangling over the recent (15+ year) pause in global average warming…when did it start, is it a full pause, shouldn’t we be taking the longer view, etc.
These are all interesting exercises, but they miss the most important point: the climate models that governments base policy decisions on have failed miserably.
I’ve updated our comparison of 90 climate models versus observations for global average surface temperatures through 2013, and we still see that >95% of the models have over-forecast the warming trend since 1979, whether we use their own surface temperature dataset (HadCRUT4), or our satellite dataset of lower tropospheric temperatures (UAH)…
Whether we are the cause of 100% of the observed warming or not, the conclusion is that global warming isn’t as bad as was predicted. That should have major policy implications…assuming policy is still informed by facts more than emotions and political aspirations.
Even many of the people who are supportive of sounding the global warming alarm back off from catastrophism. It’s the politicians and the green movement that like to portray catastrophe. …
Global warming, climate change, all these things are just a dream come true for politicians. The opportunities for taxation, for policies, for control, for crony capitalism are just immense…
Judith Curry, climatologist and chair of the School of Earth and Atmospheric Sciences at Georgia Tech, gave a really good talk on EconTalk about how the models are not just wrong, they are broken. They are built in such a way that they don’t account for some very important information.
And because those models are broken, they’ve made a lot of predictions that have simply not been true.
You should listen to the whole thing if you haven’t already:
The warming numbers most commonly advanced are created by climate computer models built almost entirely by scientists who believe in catastrophic global warming. The rate of warming forecast by these models depends on many assumptions and on the engineering needed to replicate a complex world in tractable terms, such as how water vapor and clouds will react to the direct heat added by carbon dioxide, or the rate of heat uptake, or absorption, by the oceans.
We might forgive these modelers if their forecasts had not been so consistently and spectacularly wrong. From the beginning of climate modeling in the 1980s, these forecasts have, on average, always overstated the degree to which the Earth is warming compared with what we see in the real climate. …
The modelers insist that they are unlucky because natural temperature variability is masking the real warming. They might be right, but when a batter goes 0 for 10, he’s better off questioning his swing than blaming the umpire.
The models mostly miss warming in the deep atmosphere—from the Earth’s surface to 75,000 feet—which is supposed to be one of the real signals of warming caused by carbon dioxide. Here, the consensus ignores the reality of temperature observations of the deep atmosphere collected by satellites and balloons, which have continually shown less than half of the warming shown in the average model forecasts. …
We should not have a climate-science research program that searches only for ways to confirm prevailing theories, and we should not honor government leaders, such as Secretary Kerry, who attack others for their inconvenient, fact-based views.
Competitive markets with low costs of entry have a characteristic that consumers love and businesses lament: very low profit margins. GE, Philips and Sylvania dominated the U.S. market in incandescents, but they couldn’t convert that dominance into price hikes. Because of the light bulb’s low material and manufacturing costs, any big climb in prices would have invited new competitors to undercut the giants — and that new competitor would probably have won a distribution deal with Wal-Mart.
So, simply the threat of competition kept profit margins low on the traditional light bulb — that’s the magic of capitalism. GE and Sylvania searched for higher profits by improving the bulb — think of the GE Soft White bulb. These companies, with their giant research budgets, made advances with halogen, LED and fluorescent technologies, and even high-efficiency incandescents. They sold these bulbs at much higher prices — but they couldn’t get many customers to buy them at those prices. That’s the hard part about capitalism — consumers, not manufacturers, get to decide what something is worth.
Capitalism ruining their party, the bulb-makers turned to government. Philips teamed up with NRDC. GE leaned on its huge lobbying army — the largest in the nation — and soon they were able to ban the low-profit-margin bulbs.
The high-tech, high-cost, high-margin bulbs have advantages: They last longer and use much less electricity. In the long run, this can save people money. But depending on your circumstances, these gains might be reduced or wiped out entirely.
The current replacements for traditional bulbs are compact fluorescents (those curly bulbs). They give off UV rays, contain mercury gas, take a while to get bright and don’t last any longer than regular bulbs if you flip them on and off a lot.
Newer technologies, like LED bulbs, are better than CFLs, and they supposedly last 20 years. But they cost even more. In your office building, they probably make sense. In your house? Well, they won’t last two decades in a house full of kids who wrestle with the dog and throw footballs around the living room (maybe Congress should ban domestic wrestling and passing).
There is a middle ground between everyone using traditional bulbs and traditional bulbs being illegal. It’s called free choice: Let people choose if they want more efficient and expensive bulbs. Maybe they’ll choose LEDs for some purposes and cheap bulbs for others.
But consumer choice is no good either for nanny-staters or companies seeking high profit margins.
Technologies often run the course from breakthrough innovation to obsolescence. Think of the 8-track, the Model T or Kodachrome film. But the market didn’t kill the traditional light bulb. Government did it, at the request of big business.