How green-techs compete in US electric-power production


July 11, 2009

Exactly how much electricity is consumed in the United States? And what would it cost to replace 20% of electric generation with renewable sources? As difficult as these questions may seem, they can be broken down into more manageable parts.

Let's start by looking at where we are. As of 2007, about half of all electricity produced in the United States still came from coal, and the large majority of US electricity production still came from fossil fuels.
To consider the costs involved in switching to clean, alternative-energy based electricity, we must first answer a few questions:
  • How much electricity does the United States consume?
    Answer: 3.892 trillion kWh [1].
  • Are any two 40-megawatt power plants capable of producing the same number of kilowatt-hours in a given year? And what is 'capacity factor'?
    For the first question, the short answer is 'no'. The capacity factor of a power plant is the ratio of its actual output over a period of time to the output it would have produced had it operated at full capacity for that entire period. For example, a solar power plant obviously cannot operate at full capacity 24 hours a day; a solar plant rated at "40 megawatts" might therefore have a capacity factor of just 0.25, so for calculation purposes its net capacity is only 10 megawatts (40 MW x 0.25 = 10 MW). In contrast, a 40-megawatt nuclear power plant in the United States typically has a capacity factor of about 0.90 [2] (40 MW x 0.90 = 36 MW). So, to reiterate, the answer to the first question is: 'no, two 40-megawatt plants can have very different capabilities regarding the quantity of kilowatt-hours produced'.
  • How much net megawatt capacity does the United States need to meet its current demand of 3.892 trillion kWh?
    Short answer: about 444,300 megawatts
    The math: 3.892 trillion kWh/year ÷ (365 days/year x 24 hours/day) ≈ 444,300 MW
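The arithmetic in the answers above can be sketched in a few lines of Python; the 3.892 trillion kWh figure and the capacity factors are the ones cited above:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(nameplate_mw, capacity_factor):
    """Annual energy (kWh) produced by a plant, given its nameplate
    rating in megawatts and its capacity factor."""
    return nameplate_mw * 1_000 * capacity_factor * HOURS_PER_YEAR

# Two "40-megawatt" plants with very different capacity factors:
solar_kwh = annual_kwh(40, 0.25)    # 87,600,000 kWh/year
nuclear_kwh = annual_kwh(40, 0.90)  # 315,360,000 kWh/year

# Net capacity needed to meet US demand of 3.892 trillion kWh/year:
US_DEMAND_KWH = 3.892e12
net_mw_needed = US_DEMAND_KWH / HOURS_PER_YEAR / 1_000  # ~444,300 MW
```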
OK, so 444,300 MW of net capacity is what we have to 'chip away at' in order to replace (or eliminate) certain power plants.

To get a better feel for the actual costs involved in replacing fossil-fuel-based electric power plants, I uploaded these graphs, which illustrate capacity factor and cost per installed kilowatt for a variety of electric-power-plant types. The actual cost of net kilowatt-generating capacity is a function of both capacity factor (first chart) and cost per installed kilowatt (second chart).

Note: installed_kilowatt_cost / capacity_factor = actual_cost_per_kilowatt_of_net_generating_capacity

Capacity factors for typical electric power plants

Technical note: capacity factors shown as the average with +1 s.d. and -1 s.d. bands (outliers eliminated).
Source: NREL, 2009

Dollar Costs per installed kilowatt of generating capacity

(NREL, current 2009 estimates, but costs are in 2006 dollars)

20% replacement scenario

To better understand the cost obstacles in switching to green technologies, let's consider a scenario covering the replacement costs for 20% of US electricity production. The table below uses numbers from the NREL data displayed in the charts above. To frame the problem, we compare the costs of traditional forms of electric production, such as coal and gas, with green technologies such as solar and wind. Again, this is a theoretical cost for replacing 20% of electricity generation; there are about a million asterisks that would go with a formal study. However, this simple comparison should illustrate the magnitude of the problem.

Tables are pretty boring, so let's look at these costs in a way we can easily identify with. 'Billions' of dollars are also pretty abstract to most of us whose checking accounts are limited to a few zeros, so I included a cost per US citizen (man, woman, and child). In this context, with cost per person included, the magnitude of the problem becomes clearer.
If alternative energy were cheaper than traditional methods, the switch would happen naturally via market-based decisions. The problem (which is probably obvious) is that alternative-energy solutions are usually much more expensive than traditional electric-production methods. Perhaps with a carbon tax or the passage of the cap-and-trade CO2 bill, these economic decisions will change. But that's another story for another day.
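The scenario arithmetic can be sketched as follows. The installed cost, capacity factor, and the ~305 million population figure are my placeholder assumptions for illustration, not the NREL chart values or the table's numbers:

```python
US_POPULATION_2009 = 305e6          # rough 2009 estimate (assumption)
NET_MW_TO_REPLACE = 0.20 * 444_300  # 20% of US net capacity, ~88,860 MW

def scenario_cost(installed_cost_per_kw, capacity_factor):
    """Total build-out cost, and cost per citizen, to supply
    NET_MW_TO_REPLACE of net capacity with a single technology."""
    nameplate_kw = NET_MW_TO_REPLACE * 1_000 / capacity_factor
    total_dollars = nameplate_kw * installed_cost_per_kw
    return total_dollars, total_dollars / US_POPULATION_2009

# Hypothetical wind inputs: $1,700/kW installed, 0.35 capacity factor
wind_total, wind_per_person = scenario_cost(1700, 0.35)  # ~$432B, ~$1,415/person
```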

I included one cost that should stand out -- and is pure speculation. That is an estimated cost for conservation amounting to a 20% reduction in electricity use. I roughly tripled the cost of installing smart meters in every household (~$175 each x 130 million households) [3][4]. I believe that with
  • smart grids,
  • smart meters,
  • and investment by consumers in smart appliances and software,
it is conceivable to reduce electrical generation by 20% with the right technology investments.
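That conservation estimate works out as follows; the tripling multiplier is the rough guess described above, and the ~305 million population figure is my assumption for the per-citizen line:

```python
SMART_METER_COST = 175  # dollars per household [3]
HOUSEHOLDS = 130e6      # US households [4]
MULTIPLIER = 3          # rough tripling for grid/appliance investment

conservation_cost = SMART_METER_COST * HOUSEHOLDS * MULTIPLIER
per_citizen = conservation_cost / 305e6  # assuming ~305M people in 2009

# conservation_cost ~= $68 billion; per_citizen ~= $224
```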
I intend to write more about smart meters, smart grids, etc., and how I believe they can be used together to reduce electric consumption. But for now, I will conclude with a reference to some information I found on smart meters and some interesting software solutions that Google and Microsoft are working on with regard to smart meters.


  1. CIA World Fact Book, see Economy / Electricity Consumption (2007)
  2. NREL, Energy Technology Cost and Performance Data
  3. Time-Based Metering and Communications (p.21)
  4. US Census Bureau, American Housing Survey (based on 2007 data)
