There is a small problem though: everyone will need that last 10% at the same time. And because you need infrastructure that will be used only 10 percent of the time, expect the price per kWh to be 10 times the current one to compensate.
The less you need the grid, the more expensive each kWh you do pull from it becomes, because the infrastructure costs are spread over fewer kWh.
It'll be interesting to see if the sodium-ion battery hype can combine with solar to give a real base power alternative. I think claims of "cheapest" don't really count unless they're qualified as non-base power.
The problem with the idea of "base power" is that in the past, the cheapest electricity was from the big thermal generators that take a day to warm up, and operate most cheaply by pumping out at maximum efficiency 24 hours a day. You could then layer on the more expensive electricity sources that could spin up in 15 minutes or a few hours, and match the demand curve, as long as you matched baseload generators to the minimum of the demand curve, and have a cost-optimal electricity mix.
Now that we have cheaper sources of energy for parts of the day, "base" power is a much less desirable concept. It's gone from a simple and straightforward optimization problem that a middle-schooler could solve to a cost optimization problem that markets and linear solvers can solve.
Now that we have cheap storage, and solar-plus-storage is cheaper than coal in the UK, the cost optimization is getting simpler: get rid of all the base load coal!
Define "real base power"? You think the grid can only operate if every generator is always-on? How do you square that with the fact that the grid is operating just fine with this stuff already?
This is a fallacy, basically. Not least because electricity is by far the most mobile traded commodity in human history. Not enough sun today where you live? Buy your power from across the continent, where they have plenty. Or from your wind generators which are working fine. Or the wind generators across the continent if you have to. Or crank up the hydro dams (most of which rarely run at 100%) a bit to handle the shortfall. Or even fire up an idle gas plant if you absolutely can't get anything else.
The idea that solar and wind aren't (sigh) "real" is a lie that someone sold you. The real world relies on a lot of this stuff already and the promised apocalypse never arrived. Go figure.
> Or the wind generators across the continent if you have to.
Transmission losses are typically substantial in most AC-based grids. For example, cross-country power transmission on the US grid would result in ~36% losses (napkin math at about 20% loss per 1,000 km).
Reality isn't as simple as "ship the electricity" unfortunately; it makes a lot of sense to keep generation near consumption.
Where does your 20% for 1000km come from? Transmission losses for long distance lines are much lower, e.g. the below is from wikipedia:
> Depending on voltage level and construction details, HVDC transmission losses are quoted at 3.5% per 1,000 km (620 mi), about 50% less than AC (6.7%) lines at the same voltage.
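For anyone who wants to sanity-check the two figures being compared, here is a minimal sketch in Python, assuming losses simply compound per 1,000 km segment (which glosses over voltage, loading, and converter-station losses):

```python
# Rough comparison of cumulative transmission losses over distance, assuming
# a fixed fractional loss per 1,000 km segment (real losses depend on voltage,
# conductor, loading, and converter stations).

def delivered_fraction(distance_km: float, loss_per_1000km: float) -> float:
    """Fraction of sent power that arrives after compounding per-segment losses."""
    return (1 - loss_per_1000km) ** (distance_km / 1000)

for label, loss in [("napkin AC figure (20%/1,000 km)", 0.20),
                    ("quoted HVAC figure (6.7%/1,000 km)", 0.067),
                    ("quoted HVDC figure (3.5%/1,000 km)", 0.035)]:
    print(f"{label}: {delivered_fraction(2500, loss):.0%} delivered over 2,500 km")
```

Under the 20% napkin figure barely more than half the power survives a cross-country haul; under the quoted HVDC figure, over 90% does.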
They also have limits on how much power they can transfer before the lines sag too much.
Here in Norway the limitations of our rather poorly connected energy grid have become very apparent over the last few years, with 100x price differences between regions that aren't that far apart physically.
While we've been paying "winter prices" during summer, up north they've shut down hydro plants since the prices there are so low it's less than operating costs.
Clearly something we'll need to get over. HVDC has something like 3% loss per 1,000 km - and when we start generating massive quantities of solar, 3% is going to be pretty acceptable.
You're talking about roughly ~$5T (the prior estimates I'm aware of) in costs for replacing the grid with HVDC transmission. That's a massive cost and undertaking. It's much more likely we see an incremental transition to point-to-point transmission in major industrial corridors (Los Angeles/Irvine to IE solar generation capacity in the future, the Louisiana Basin, North Texas to the Gulf, etc.) where major benefits can be cost-effectively realized.
A "full" transition is unlikely in our lifetimes due to the fact that the majority of the benefits can be reaped without needing such an expense.
One thing that could help would be to see more folks who are bullish on solar actually push the new promise that it gives: not just that we'll have clean energy (yay, but abstract) - we'll have cheap energy, if we do it right. Tons and tons of it for so cheap that you'll almost not even have to think about it, especially in comparison to fossil fuels today. Every aspect of the supply chain is simpler and cleaner.
If people could see that at some point, keeping their house at a perfect temperature with an electric heat pump would lead to them _never thinking about a heating bill again_... that would be far more concrete than promises of staving off climate change.
A Spanish bank has calculated that the solar and wind built there between 2019 and 2024 reduced wholesale electricity prices by 40% compared with if they'd stuck at 2019 levels.
Well by all means, show us how to do it right at scale. The leaders in this area (California, European countries) haven’t exactly done much to deliver on the promise of cheap renewable energy.
Not every generator needs to be always on, but the generation needs to be relatively independent of external conditions, especially when high usage is correlated with disadvantageous external conditions (for example, it's colder and there's less sun in winter).
Recognizing this does not mean one is hostile to renewables, even though some people that are hostile use this talking point dishonestly.
> How do you square that with the fact that the grid is operating just fine with this stuff already?
Simple: every grid in the United States has enough reliable generation capacity to take up the slack when solar fails. But that means the cost of building all those natural gas peaker plants is part of the cost of solar (it's never included in the LCOE).
Right, because that would change the definition of LCOE. And you are right that it's important; there are other metrics to look at, such as LACE, which the EIA has been publishing for a long time. And Lazard's energy reports.
This is what we're hoping battery-style plants will supplant. I say "battery-style" because it turns out that filling a tank with compressed air, lifting a heavy weight with a crane, or pumping water up into a lake all have excellent energy storage potential while not requiring any kind of esoteric matter or construction techniques. It does remain to be seen if such activities can be profitable or not - but if you're effectively buying power when it's cheap / free / paid to take it, and you're selling it when it isn't, it should be possible....
I suppose it depends where you're doing it. Makes most sense to do these things where natural formations do 99% of the construction and you just build a dam. Honestly, we probably need to do a lot more damming to keep more fresh water in - we lose so much of it to the ocean where it becomes useless salt trash...
> But that means the cost of building all those natural gas peaker plants is part of the cost of solar (it's never included in the LCOE).
This is a weird position to take. How would you price out nuclear then? As that cannot respond to changes in demand quickly either? Should every power source's cost have some gas tacked on to it? Or can we just assume that for now we have a mix of sources where different sources have different pros and cons.
And as said in many other places here, fossils don't have their externalities priced in either. I wouldn't be surprised if future generations scold us for burning so much natural gas that can also be used for many other things than burning
Not the GP, but I think it's fairly reasonable, yes.
Ultimately, what we care about, from a grid policy perspective, is the cost to provide 99% (or whatever) guaranteed 24/7/365 power. Each energy source will have its own challenges in order to do that, and for solar, availability is clearly one of them. And yes, externalities should be factored in.
> Or can we just assume that for now we have a mix of sources where different sources have different pros and cons.
Of course, but the question is: what is the right mix for the right place. And saying "solar + storage" is cheaper than gas means very different things if it can only guarantee 60% availability like in England, or 95+% in the sunniest regions of the world.
The figures reported here come from (among others) Lazard' LCOE analysis. (https://www.lazard.com/media/xemfey0k/lazards-lcoeplus-june-...)
If you have a look at how it's calculated, "solar + storage" only has enough storage for 4h for instance. You cannot really meaningfully compare it directly with nuclear which is much more reliable in itself.
What LCOE says is that the system will produce X amount of energy at cost of Y. It says nothing about when that energy will be produced. An energy source that produces 365 MWh on January 1st only, and another that produces 1 MWh every day, for the same cost, will have the same LCOE. The latter is, provided you can scale it, much more useful in practice.
Look, I'm not saying that solar is bad or we shouldn't do it. It's just that the "solar is cheap" thing which is regularly reported is a bit misleading. We've heard it for years now, and yet electricity prices around the world are mostly increasing. Clearly there's a mismatch, but where does it come from? And I think part of the reason is that the "ancillary" costs of solar have been underestimated. Sure, the energy straight out of the panel is very cheap, but if you need 10s of billions in grid upgrades and storage/backup to make it work in practice, then it should definitely be included in the comparison! Just like the externalities of fossil fuels should be.
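To make the point about timing concrete: the textbook LCOE is just discounted lifetime cost divided by discounted lifetime energy, so when within a year the energy is produced never appears. A minimal sketch with made-up numbers:

```python
# Textbook LCOE: discounted lifetime costs divided by discounted lifetime
# energy. Note that *when* the energy is produced never enters the formula,
# which is exactly the limitation discussed above.

def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Simplified LCOE in $/MWh (no fuel, no degradation)."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Two hypothetical plants with the same annual totals get the same LCOE,
# whether the 365 MWh arrives on one day or as 1 MWh every day.
print(f"{lcoe(capex=1_000_000, annual_opex=10_000, annual_mwh=365, years=25, discount_rate=0.07):.0f} $/MWh")
```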
It should be noted that those low costs for solar installations are not including consumer rooftop solar. The consumer rooftop solar cost is usually one of the most expensive ways you can generate electricity - often several times the cost of utility solar installations:
The high rooftop solar price is usually hidden because no power source has been as subsidized as rooftop solar. Besides direct subsidies, wealthier home owners have often been paid the retail rate for the electricity they sell to the grid which causes higher electricity bills for those who can't afford to put panels on their roof. Also, in almost all cases, the home installation doesn’t have enough battery power to actually last through inclement weather and so is free riding on the reliability provided by the grid, putting more costs on the less well off. The whole thing is sort of a reverse Robin Hood scheme.
Any subsidies for solar power should go to utility grade solar. Money is limited and is fungible - a dollar spent subsidizing utility solar will go much, much, further than a dollar spent subsidizing wealthy homeowners who install panels on their roof.
> Also, in almost all cases, the home installation doesn’t have enough battery power to actually last through inclement weather
That's true today, but I bet it won't be in 5 years time. Around this year or last year batteries have hit a price point where they make financial sense for ordinary people, and in a couple of years you'd be mad not to have them if you have rooftop (and somewhere to put the batteries of course).
> Any subsidies for solar power should go to utility grade solar.
At this point, it should probably go towards storage, grid capacity, or things like EV charging infrastructure. Solar generation doesn't need it.
> The consumer rooftop solar cost is usually one of the most expensive ways you can generate electricity
This may well be true, but there are positive externalities:
- It has reduced land use compared to other energy generation methods
- Power is produced near the point of use, reducing transmission requirements
- Most of the cost is labour for installation which creates jobs
Given that rooftop solar usually pays for itself despite being less cost-efficient that other forms of solar, I see no reason to discourage it.
All new construction should have it, at least. The cost differential for building a new home's roof and outfitting that roof with solar just isn't high enough to justify doing it differently. Plus it improves cooling/heating costs and protects the roofing materials from direct sunlight.
That hugely depends on the region. In my corner of the world the scheme is net-billing, so homeowners are selling energy at wholesale prices.
My friend has a 10kW setup combined with a 10kWh battery. Main adjustment he made was prevent the heat pump from keeping the water hot during the night, as that was just wasting energy.
After this adjustment his electricity bills halved despite only really selling energy in the peak of the summer season. The savings translate to a 10-year return on investment before subsidies.
Solar "subsidies" are almost universally tax credits, meaning the only money involved is the money paid by the homeowners. So, for society, rooftop solar is by far the cheapest option. It costs the rest of us nothing, the homeowners pay for it.
Money is also not limited, it is in fact created by the banks when someone takes a loan, for example to put solar on their roofs.
Meaning there is no money lost from society that could instead be used to build utility solar, just because someone puts solar on their roofs. If your county borrows money to make utility solar, that money is also created by the bank then and there.
Also note that you are quoting last years Lazard report. Solar is way cheaper in this year's report. It will probably be even cheaper in the next one.
This is true. A lot of the older panels are also rather inefficient and come with a rather poor lifespan. It's part of the reason I don't personally have any form of solar despite working in the industry.
Technology is catching up on the solar panel front though. French Heliup are producing panels which are only 5kg per square meter. Which makes them significantly easier to install on roof-tops. I imagine I'll eventually have solar panels on my roof, but I'll likely wait another decade for battery tech to also be more viable.
I would have to cut down so many trees around my house to get effective sunlight on the roof more than a few hours a day. Plus my roof is pitched east/west so at best I have half the roof in sunlight for half the day. I do have quite a bit of clear space behind the house, so I could consider a system that sits on the ground.
I just like paying the utility every month and not having to worry about owning, maintaining, and repairing my own infrastructure though. As it is now, if anything goes wrong upstream of the meter, that's not my responsibility.
I think you're missing the realpolitik of rooftop installations. The person installing the panels has a few square meters of empty and unproductive space, over which they have exclusive construction rights. And which is already connected to a grid, with capacity to absorb its production, physically close to the consumers of its output.
That combination of factors is fairly rare in most of Europe & North America - especially when you’re looking for multiple contiguous kilometres of land for utility scale solar. There’s a lot of that type of land out in Texas and Australia, but there’s much less of it in the Scottish Highlands.
The high cost of rooftop solar in the US is mostly a regulatory choice. From tariffs driving up prices significantly, to completely scattered permitting processes city-to-city, to wild swings in utility policies that make solar a good-to-bad decision overnight, it all drives costs up. So the only solar installers who survive are those who are able to swoop in when the utility policies change and can drive customer acquisition super quickly ($$$), and have massive permitting office experience that lets them deploy. That all costs a lot of money and drives out competition from the space.
So we pay something like 3x-5x for residential solar in the US versus Australia. Because we choose to as a society.
Also, I heartily disagree that utility scale solar is obviously cheaper. Transmission and distribution costs are the biggest cost on the grid, and utility scale solar needs to pay for that whereas residential solar drives down T&D costs.
We definitely need a lot of utility scale solar, but if we want cheaper electricity we need to incentivize tons of residential solar so that we can keep our grid costs lower.
Australian rooftop solar is the cheapest consumer energy in history. This is despite the hardware and salaries being roughly equal to the US where the cost is about 3 times more. So it's not wise to extrapolate from US prices.
And in general, you should check your electricity bill to see how much is actually for generation vs transmission.
Totally. Grid solar and grid battery installations are gonna rock the world but rooftop solar is too bespoke and labor intensive. It goes directly against the benefit of solar’s mass manufacturing
Does this include the lifetime cost of needing to replace the panels? Batteries too, though I'd imagine that's accounted separately. But the panels would seem to be a direct additional cost.
RE ".. Solar energy is now the cheapest source of power, study ...."
Does this cost include the needed storage (up to several days) and, often, new transmission lines? Is any country running only on solar, if it is the cheapest? The all-up costs to actually allow 100% solar make it very expensive.
Storage is not needed. You can consume solar power as it's generated, and it is as useful then as if it came from oil, gas or coal.
When the sun goes down, you have saved tons of oil, gas and coal that didn't have to burn during the day. Which is very, very good. You don't have to "solve nighttime" before solar makes sense; it makes sense immediately.
Edit: And of course, nighttime is also being solved, in many ways, already.
I have a house that gets a lot of sun in the mid-Atlantic US. I have spent a modest amount of time casually exploring solar. The cost/benefit never seems that great. What am I missing?
Let's do some simple napkin math: let's say a solar system will cost $25k (that should get you an 8-10 kW rooftop system). Pay for it with, say, a 5% APR loan.
- 10 years: $265/mo
- 15 years: $197/mo
- 20 years: $164/mo
What are your current monthly electricity costs, and what time horizon gets you a good break-even?
Also consider what electricity costs are gonna do. The loan locks in a monthly cost, but electricity rates have lately been increasing around 3% per year. Say you currently have a $150 monthly electricity bill... in 10, 15, and 20 years that will be $202, $233, and $270/mo.
There's other ways to finance it (PPAs, leases, maybe incentives if we get serious about our energy again someday, etc.) that have their own advantages/disadvantages, but even this simple approach shows there's some wins depending on your exact consumption, time horizon, time-value of money, and what you expect your electric load to do (will you get an EV? will you adopt heat pumps? will you need to start using way more AC in the summers? etc.)
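A tiny script reproducing the napkin math above, using the standard amortized-loan formula and the ~3%/year bill escalation; the inputs are the ones stated in the comment:

```python
# Reproducing the napkin math above: fixed loan payment for a $25k system
# at 5% APR vs. a $150/month bill escalating ~3% per year.

def monthly_payment(principal, apr, years):
    """Standard amortized loan payment."""
    r, n = apr / 12, years * 12
    return principal * r / (1 - (1 + r) ** -n)

def escalated_bill(current_bill, annual_escalation, years):
    return current_bill * (1 + annual_escalation) ** years

for years in (10, 15, 20):
    loan = monthly_payment(25_000, 0.05, years)
    bill = escalated_bill(150, 0.03, years)
    print(f"{years:>2} yr: loan ${loan:.0f}/mo vs. bill grown to ${bill:.0f}/mo")
# -> $265/$202, $197/$233, $164/$270, matching the figures above
```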
I'm in a leaky, 80 year old 3 bed 2 bath in the Midwest (that doesn't have insulation in the walls... brick and plaster). Worst in the Summer is around $100/mth, worst in the Winter $40/mth (of course gas for heat is around $80 Winter, $15 Summer for water heater, stove, etc.).
Without digging into the study, hasn’t this been true for many years now? Would love to know an actual pivot date where commercial scale deployments became the cheapest option.
I dug down trying to get the actual paper and it appears it's a pre-print with only the abstract available (but maybe I wasn't looking in the right spot).
That being said, the quotes from the author were more to the point that it is a milestone that, in the UK, solar+battery systems are now less expensive than gas/coal.
To my understanding, this is a milestone vs "raw" production numbers, which you're correct in saying have had solar as the cheapest option for years.
The claim hinges entirely on the narrow definition of "large-scale energy generation." For the UK, with its high seasonal energy demand and low winter solar output, the cost of generation is almost irrelevant next to the cost of firming that power for 24/7/365 availability. While the paper[1] shows solar PV and daily-cycle batteries are getting cheaper, it also shows seasonal storage solutions like hydrogen are still an order of magnitude too expensive and inefficient (huge capex for electrolyzers/storage + poor round-trip efficiency). So providing reliable, 24/7/365 baseload power from PV + storage in the UK is demonstrably not cheaper than gas or nuclear today.
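A hedged illustration of why seasonal firming dominates the picture: if cheap summer energy has to pass through electrolysis and storage before coming back in winter, the round-trip efficiency alone multiplies its cost before any capex is added. The numbers below are placeholders, not values from the paper:

```python
# Illustrative only: how round-trip efficiency inflates the cost of energy
# that has to pass through seasonal hydrogen storage. All numbers below are
# placeholder assumptions, not values from the paper.

def firmed_cost(input_cost_per_mwh, round_trip_efficiency, storage_adder_per_mwh):
    """Cost of one MWh delivered back from storage, ignoring financing detail."""
    return input_cost_per_mwh / round_trip_efficiency + storage_adder_per_mwh

cheap_summer_mwh = 30        # $/MWh fed into the electrolyser (assumed)
round_trip = 0.40            # power -> hydrogen -> power efficiency (assumed)
capex_adder = 60             # $/MWh for electrolyser, storage, turbine (assumed)

print(f"${firmed_cost(cheap_summer_mwh, round_trip, capex_adder):.0f}/MWh firmed")
# -> ~$135/MWh: several times the headline solar LCOE, which is the point above
```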
Holy crap, the UK has absolutely awful solar insolation; that solar and battery is now cheaper than coal there really emphasizes how much the tech has advanced, and continues to advance!
Germany also has absolutely terrible solar resources, worse than any continental US state, and is also deploying tons and tons of solar.
Solar really is one of the more amazing technologies of our time, especially when combined with batteries, which advance almost as fast.
We will get far better reliability, costs, and air quality as coal is completely replaced by modern clean energy systems.
It's kind of funny because you just know that in 30 years or so Texas is going to be a gloriously wealthy producer of solar power, and it'll probably be owned by the same people who own the oil fields today, but the current stranglehold of oil is preventing those very same people from getting richer.
Germany has deployed a lot of solar because of the Stromeinspeisungsgesetz from the 90s. It was the first feed in tariff law in the world afaik.
It made it that you were guaranteed a certain rate for x amount of years. As a result people were driving to farmers to rent their roof to install solar on it.
There's a weird thing where contestants in the solar car challenge that Australia has been hosting since forever generate more power on hazy or partly cloudy days than on sunny ones. I don't know if they ever sorted out why. Speculation was reflected light off the clouds, but I suspect panel temperature also played a role. And road temperature affects panel temps.
What the UK cannot do is concentrating solar. The efficiency absolutely crashes in diffuse light.
Yeah. The temperature issue would have been my first guess.
Regarding concentrating solar: are people still trying to make that work for commercial generation? I thought this had generally failed to pan out for electricity generation.
I don't understand. Why is it that in every single discussion about power - and in particular, solar power - the cost of land is omitted?
This is literally the most important factor to consider, as land is a very scarce resource in some markets - never mind that it would be a significant driver of cost for solar, which requires massive amounts of land (and taxes). This is true for solar, wind, etc.
Edit: your comment has changed a bit, so let me address the new sentence here:
> I don't understand. Why is it that in every single discussion about power - and in particular, solar power - the cost of land is omitted?
You are incorrect that land costs are omitted; in fact I don't recall any sort of cost comparison that has ever omitted land costs, whether it's from Lazard or NREL or anywhere else. It's part of the CapEx, or as rent, or however it's modeled. NREL in particular looks at overall costs on completed and built systems. Further, the developers building these things know the land costs very well.
Can you point to a discussion or paper where land cost was not included? If it's still an open discussion I'd love to add the high quality solar-is-cheapest studies that all include land costs.
And as for this particular study:
First, it doesn't omit the cost of land. Secondly, land is only the most important resource in very, very few markets. Razing skyscrapers in Manhattan would be a bad idea. Instead of farming crops or the sun in Manhattan, it's done elsewhere and shipped in.
It's been included in every one I've read. You're not going to find it in broad surveys like this one, because it's a survey. They'll average together the cost of hundreds of projects. But when you look at the projects individually that were the sources, land acquisition is part of the capital costs or land lease is part of the operating costs.
Why are you saying land is not included in the cost calculations? I've not read this particular study, but every study looking at the cost of electricity generation I have seen in recent years essentially looked at investment/returns, which obviously includes land use cost. The reason it's not a big topic is that it's generally not a big contributor to overall cost, the main drivers being panels and construction. Incidentally, the main power sources where things are not included in the cost are nuclear (typically the risk of major accidents is taken on by the government, as no insurer will cover it, and waste storage costs are often capped in the calculations as well) and fossils (health effects of pollution, CO2...).
Residential rooftops in places like Asian cities are already >50x overcommitted. Land on Earth is a scarce resource. Among developed countries, basically only the US west coast has cheap surplus land for solar.
Go look up the US and Norway in the density list[1] before trying to use those countries as the reference standard for developed countries. Sort by density and scroll to the top and the bottom. Doing so also explains why EV drivers from basically nowhere but those countries appreciate the "full tank every morning" concept.
It's also not clear if they factored in battery costs. They mention lithium battery costs have also decreased, but it's not clear whether that's included. Batteries are a requirement for consistently delivering solar power. You can't just flip the sun on and off.
No generator runs 100%. Not so long ago the UK nuke fleet was managing only about a 60% capacity factor. The grid system runs as a whole, with generators backing one another up. The last two big load-shedding incidents on the GB grid (since the turn of the century; ~500k people had power cut to keep the rest of the grid up) involved thermal generators tripping; the older one was nuke+coal, the newer one gas+wind. And in general we know when the sun is going down a long time in advance, so grid operators can plan for it!
Because solar (for example) can often go on rooftops so not needing new land. And solar and wind are often good in remote areas where population is low and land is cheap.
In good faith, rooftops are not the solar that is generally talked about, when we discuss solar as the cheapest form of energy.
The solar discussed is massive investments into solar farm production.
To your point, sure you can do "remote" - you still have to get a lot of power transmission lines to remote locations, and yes, you still have to buy the land. It's not a zero-cost investment. But that's what the accounting looks like when discussing this topic.
We generally don't like to put our nukes (or coal plant) in the centre of cities even though that would reduce transmission costs, nor are major hydro resources often close to the areas that power from them goes to. So that is not unique to wind and solar even if sometimes amplified for them.
Separate reports in the last couple of days suggest that both Norway and Germany could get ~25% of their total solar power from viable rooftop spaces, IIRC.
And indeed one very positive feature of solar is that it IS safe to deploy AT load/demand centres, very much reducing costs and losses in the last 'distribution' mile.
Solar also works when there is no grid locally, which is useful from rural UK to Africa.
Yep, I can't quite get as much power out of the panels on my roof as I use on average, but that's okay because I am reducing the cost of transmission by a significant amount.
That doesn't really seem to answer the question, unless you are suggesting solar doesn't have any land costs? I completely understand that in some situations it's effectively zero. But in aggregate?
If UK golf courses were turned over entirely to solar PV that would more than cover the ~30GW extra called for in the UK's Clean Power 2030 plan, IIRC.
I would regard that as a huge improvement in use from areas that are otherwise often fairly sterile from a biodiversity point of view, and only get used by a fraction of the population.
What is the 'cost' of those courses, vs PV, vs horrible climate change?
But the point remains that very significant PV can be provided on already-in-use land from homes and warehouses to reservoirs and farm land agrivoltaics.
The cost of land is usually omitted in casual discussions because it's not a significant driver of cost for solar. Quantitative cost breakdowns do include it. See for example https://www.eia.gov/analysis/studies/powerplants/capitalcost..., where the cost estimate for a US$200 million 150-megawatt-AC solar plant in 02019 (six years ago) included US$133,000 per year in land leasing costs, one fourth the cost of "module cleaning" and one sixth the cost of "preventative maintenance".
However, not every single discussion does omit it. For example,
we were discussing this yesterday at https://news.ycombinator.com/item?id=45487268 and https://news.ycombinator.com/item?id=45487197, where my calculation was that even in solar-unfavorable places like the northern extreme of Germany, the cost of land barely reaches the same order of magnitude as the cost of the solar modules, even at today's record-low module prices. US$72 million of the US$200 million of the estimate mentioned above was the modules themselves, but today that would be closer to US$15 million.
The atom of truth in your confused assertion is that solar farms do take up enormous amounts of land compared to other kinds of power plants. But, at current human energy consumption levels, it's still only a tiny amount of overall land.
I'm assuming from your past submission history that you're in the US, because you mostly only post US politics stuff. The US generates about 4200 TWh electric per year, which is 480 gigawatts (https://en.wikipedia.org/wiki/Electricity_sector_of_the_Unit...). Under the 2000kWh/year/kWp conditions prevailing in the Californian Mojave Desert, parts of Arizona, and parts of New Mexico, according to https://solargis.com/resources/free-maps-and-gis-data?locali..., this would require about 2.1 terawatts (peak) of solar panels. A square meter of 22%-efficient solar panel produces 220 watts, peak, so this is about 9600 square kilometers, a 110-kilometer-diameter circle. (Although, if you don't leave spaces between the panels, you have to make them horizontal so they don't shade each other; in practice people set them further apart to use more land but less panels.)
How big is that?
It's almost 0.1% of the US, not counting the space between the panels. It's a little over twice the diameter of the VLA radio telescope in western New Mexico, a facility you might remember from the movie Contact. It's slightly larger than White Sands Missile Range (8300km²). It's 15 times the size of Lake Mead, which was created by Hoover Dam to generate electricity. It's about two thirds the size of California's Death Valley National Park (13'793km²). It would take up one eighth of the Navajo Reservation.
Almost everywhere in the US has at least half that much sunlight. Suppose you wanted to site the panels near Springport, Michigan, for some reason. You only have 1200 kWh/year/kWp there, so you need 16000km². The panels would cover a ten-county area centered on Jackson County; they would reach Lansing, and might reach halfway to Ann Arbor in one direction and Kalamazoo in the other.
In real life, it wouldn't be a good idea to put it all in one place like that, both because it's fragile and because it creates higher transmission-line losses; it's better to put the panels closer to where they're used, which implies spreading them apart. So probably 0.2% of every state. In an average-sized state like Iowa (145'746km²) it might be 300km², 20% of the size of an average-sized county like Hamilton County. To power the whole state.
But try to keep perspective on the total amount of land we're talking about here: White Sands Missile Range, two thirds of Death Valley, or a small part of the Navajo Reservation to power the whole country.
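For anyone who wants to check the arithmetic, here is the same calculation in a few lines, using only the figures stated above (4,200 TWh/year, 2,000 or 1,200 kWh/year/kWp, and a 22%-efficient panel at ~220 W/m² peak):

```python
# Re-running the parent's arithmetic for powering the whole US from solar.

US_ANNUAL_TWH = 4200          # annual US electricity generation (from above)
PEAK_W_PER_M2 = 220           # ~22%-efficient panel under peak sun

def solar_area_km2(annual_twh, yield_kwh_per_kwp):
    """Panel area needed, ignoring spacing between rows."""
    peak_kw = annual_twh * 1e9 / yield_kwh_per_kwp   # kWp of panels required
    return peak_kw * 1000 / PEAK_W_PER_M2 / 1e6      # m^2 -> km^2

print(f"Mojave-class sun (2000 kWh/yr/kWp): {solar_area_km2(US_ANNUAL_TWH, 2000):,.0f} km^2")
print(f"Michigan-class sun (1200 kWh/yr/kWp): {solar_area_km2(US_ANNUAL_TWH, 1200):,.0f} km^2")
# -> roughly 9,500 and 16,000 km^2, the figures used above
```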
Land is also spoiled and occupied for fossil fuel extraction, refinement, and distribution. And at least you can be within the near vicinity of solar panels without experiencing significant health issues, so we need to tabulate the very real public health costs too when comparing. I highly doubt there is a solar equivalent of “Cancer Alley” in the US. Another consideration is that we can slap solar panels on top of existing structures, which you cannot do with fossil fuel at any stage.
I don’t doubt that the costs are considerable but if we’re going to go down that path we need to make sure it’s applied across the board here and I imagine the gap is not as bad as you’re implying. I’m going off my gut though so I could be entirely wrong.
Land and exploration costs are definitely included in fossil fuel costs.
I think most major solar projects include land and infrastructure so only comparing the costs of the panels is like comparing hydro only based on operating the dam, not building it.
> Land and exploration costs are definitely included in fossil fuel costs.
> I think most major solar projects include land and infrastructure so only comparing the costs of the panels is like comparing hydro only based on operating the dam, not building it.
Fossil fuels, the way we're running things now, are only priced based on burning them, not on dealing with their externalities either.
You're very right to bring up how much land is needed for fossil fuel extraction and refinement.
Take all that land that's currently used for fossil fuels, and replace its use with solar, and you're replacing pretty much 100% of the world's total energy demand, not just fossil fuel demand. (I need to find that citation again...)
What's the cite for solar requiring "massive amounts of land and taxes"? That doesn't square with the back of my envelope but I'd be willing to look at numbers.
Because the land cost and amount needed is immaterial. Solar and wind actually contribute to the tax base where it is colocated, tax revenue that would otherwise not be provided (depending on jurisdiction). People overvalue land for some reason, most of the world is not Manhattan or Hong Kong. The US farms 60 million acres for corn and soybean for biofuels alone, for example, and of course that land is much more efficiently used if solar PV is installed for energy.
None of the links (and I checked) actually attempts to answer my point.
It's simple:
*Why is the cost of land omitted?*
A link to "rooftops" is not an answer when its primary use case is an ROI that doesn't work out unless it's measured in decades.
Sloppy accounting.
My initial answer was literally going to be a Billy Madison quote. Specifically, the reply to Principal Max Anderson to Billy's rambling answer. But that's offensive, so i will just state:
In the US, the lease cost is typically $1k-$5k/acre/year over a 20-25 year lease. Feel free to back that out per kWh: figure 1 MW of solar per ~5 acres and ~1,460 MWh per year, based on the national average of four peak sunlight hours a day.
You don't tend to see it because it's so small it gets buried and lumped in with a bunch of other costs, either capex or as operating costs (if modeled as a lease). It's ~1% of the total costs in most markets, so the only people who think it's worth mentioning are people advocating for nuclear power, since energy produced per acre is one of the only metrics where nuclear comes out on top.
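Working the suggested numbers through (the quoted lease range, ~5 acres/MW, ~1,460 MWh/MW/year) lands at roughly a third of a cent to under two cents per kWh:

```python
# Backing the lease cost out per kWh from the figures above.

ACRES_PER_MW = 5
MWH_PER_MW_YEAR = 1_460          # 1 MW * ~4 peak sun hours * 365 days

for lease_per_acre in (1_000, 5_000):                 # $/acre/year range above
    annual_lease = lease_per_acre * ACRES_PER_MW      # $ per MW per year
    cents_per_kwh = annual_lease / (MWH_PER_MW_YEAR * 1_000) * 100
    print(f"${lease_per_acre}/acre/yr -> {cents_per_kwh:.2f} cents/kWh for land")
# -> roughly 0.34 to 1.71 cents/kWh before any other project costs
```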
This isn't surprising. Or new. I've been saying for years solar is the future. Solar has so many advantages over every other form of power generation. It's flexible, scalable (meaning you can have an installation any size from a solar farm to a calculator) and robust because it has no moving parts. It has no pollution (noise or chemical) and can be dual-use on many structures, everything from rooftops to adding solar to highways.
The big problem with solar? Regulation around installing it that is entirely designed to protect the profits of utility companies.
We have predatory financing around solar where companies are allowed to put a lien on your house and then essentially extort the homeowner if they ever choose to sell such that solar can reduce the value of your house significantly.
We limit the amount of solar basically so the utility can keep selling you electricity.
One might say it's to cover the building and maintenance of the transmission infrastructure. There's some truth to that. But at the same time, utilities are generating massive profits, doing share buybacks and giving massive concessions to data centers that everyone else is paying for.
Basically we would all be better off if every electricity provider wasn't a private company but instead a municipal operation, like municipal broadband.
There's also the problem of rural opposition to solar farms. Not unusual to see a "no solar farms" sign every few hundred feet in some parts of Indiana where I wander (often in the same areas where the cars have bumper stickers celebrating coal).
As a steady income stream, solar is a huge help to smaller farmers that might otherwise go bankrupt from a few bad years' of harvests. Larger growers hate that they no longer get to gobble up the land of the smaller growers, so there's a strong political incentive to block solar from their side.
> The research team also found that the price of lithium-ion batteries has fallen by 89% since 2010, making solar-plus-storage systems as cost-effective as gas power plants.
Well, which one is it? Is it cheaper or the same price as a gas plant?
New energy generation from solar is cheaper. New energy generation from solar plus storage is cost competitive with natural gas peaker plants. By how much depends on the region.
CalISO is a weird entity. WECC has essentially subordinated contracting within a subset of the Western grid to CalISO.
However it's not actually a separate grid. So when analyzing stability issues from inverter based sources of power (solar/wind/batteries) we can't use CalISO numbers since they can (and do) actually draw power from outside the CalISO grid area.
Looking forward to the day when solar dominates and we can transmit from the sunny side of the world to the dark side... I think there are already some plans for a pan-European grid, maybe even a European/Asian grid. The bigger the grid, the more resilient solar is.
It would also make the world more interdependent and thus hopefully more peaceful.
I remember reading about this in IEEE. If you google "hypergrid IEEE" you can find papers in IEEE Xplore, but there was also a perspective piece that was more readable that I read a few years ago...
The Pan-European grid is already reality. The fluctuating generation costs between e.g. the Nordics and mainland Europe are already driving investment into undersea HVDC cables.
To transmit power you will need battery storage on both sides of long-distance power lines. We never build enough of them, but if you can peak-shave and trough-fill with stored power, you increase the MWh per day available on the far end.
Solar tends to come with battery storage. But you could also just build the battery storage.
I'm not sure how practical that would be. Transmitting power over distances incurs losses.
I did see a proposal to build out solar in Africa and pipe it undersea to Europe. That seemed wild, and, predictably, it got canned for its impracticality.
Edit - it looks like there are several such proposals, and that they're not all cancelled:
I feel like it would be more feasible than that to just put something in space and beam it down by laser/maser/whatever; that way it's outside the penumbra and can be in full light 24/7.
I suspect a mix of batteries and baseload delivery from the sunny side will be helpful. Even a few more hours of daylight from across the country can help batteries "last longer" by shortening the demand period.
Capacity factor for renewables is generally lower than for thermal plants, so some transmission costs may be higher. Though batteries firming the output mitigate that, for example.
Right, and the majority of solar is getting installed with batteries these days because the really profitable hours are in the evening. Something like 60% of new solar in 2024 in the US had batteries on site.
Also, the health impacts on us, by breathing the fumes of fossil fuel power plants, etc.
The cost of fossil fuel is much more than a price tag in the electricity bill.
No, solar panels do not contain "all sorts of heavy metals," they leave the land open to all sorts of use afterwards (unlike fossil fuel infrastructure) and disposal is no more difficult than shipping the panels somewhere.
When a site is repowered with new panels, the old panels are often reused, as well, and there's a robust secondary market for panels.
In short: disposal costs are well in line with other technologies, at least according to this consultant (top hit on a web search):
Solar panels are photodiode arrays. Photodiodes are made by lacing mid-purity Si with ultra-toxic metals. "The more neurotoxic the better" is the basic rule of all high-performance optoelectronic semiconductors. The less toxic you make them, the worse they get, and the less responsive they are to longer-wavelength light.
In, Ga, As, P, Si, Zn, Se, Ge, Cd, Te, Pb, etc. The duh materials. Sometimes the whole panel is thin crystals of GaAs. You have to melt that down to recycle.
Those metals are not like the Mo additive in greases. The machines that contained them came with warnings long before Prop 65 or EU RoHS, simply because of their toxicity. And deploying PVs means scaling that out into the fields. It's a mildly insane thing to do.
Even nukes are more eco-friendly, even if you include things like Fukushima, and maybe Chernobyl too. The elephant's foot is going to passivate itself before the collapse of human civilization could occur. I don't know if the same can be said about panels blown away in rainstorms.
It doesn't exclude construction. The abstract says they analyzed levelized cost of electricity (LCOE). LCOE includes investments, operations, and fuel, so it includes construction.
Is there a form of power that doesn't have these? A coal plant has all these plus fuel costs, workers to run the plant, parts for the turbines, and ash disposal to deal with.
It has become so incredibly cheap that in some parts of the world it has started eating itself - so-called solar cannibalization.
https://www.aalto.fi/en/news/rapid-growth-of-solar-power-in-...
https://www.reuters.com/business/energy/plunging-solar-captu...
It’s almost like market forces aren’t what you want for a utility.
Very few people realize that for most of the US, cheaper electricity costs for the utilities mean lower profits, because most are regulated utilities that take a fixed profit on costs. Models for this vary greatly; sometimes they only take a fixed profit on necessary grid investments (where "necessary" is determined by the utility, then stamped for approval by a Public Utility Commission, which has a board that might be elected and susceptible to bribery/election malfeasance, such as in Arizona...)
So-called "natural" monopolies are quite difficult to regulate correctly. And the solution we chose as a society a century ago might not be the right one for today.
A century isn't very long. I guess things have changed somewhat since then, but age isn't a very convincing argument against these regulations imo. Having to update all of this every hundred years sounds exhausting to me.
Utilities tend to make most profit as a fixed rate of return on infrastructure built, not really based on electricity costs. Build a new substation, some poles & wires to distribute it to the new subdivisions, and bammo, regulator grants you a fixed rate of return on that from your ratebase (paying customers) over the lifetime of that infrastructure.
You get a fixed return on that infrastructure, but you need to keep pace with inflation in all other areas of the business. Storms, errant vehicles, animals, fires, all sorts of things cause that same infrastructure to require maintenance personnel and equipment, and that equipment needs its own maintenance, as do the offices for the personnel, and so on. It becomes a balancing act because you only get to adjust the rates every so often, and with regulator approval.
>cheaper electricity costs for the utilities means lower profits
You say that like it's a bad thing. Maybe electricity generation shouldn't be a profit seeking enterprise?
I don't say it as a bad thing at all, I say it as a thing that's in contrast to the rest of the companies and proprietors and corporations we deal with. (And my last comment also hinted that I think it's time to reevaluate this social relationship with the utilities...)
Typically, if a person/corporation is clever and figures out a way to reduce costs, they are incentivized heavily to do so because they take the gains, at least for a while until others figure it out too and compete.
Utilities, by contrast, are actively incentivized to increase prices as much as possible. This is a crucial distinction for people asking why cheaper electricity generation methods are not resulting in cheaper utility bills: the regulatory structure is not operating the way it should, and people need to get involved democratically to change the system!
This is really a non-problem. Curtailment is easy and doesn't cost anything, it just needs a slightly better feedback mechanism than the current grid supplies, but honestly even a 24hr timer and ammeter would do it.
This was one of the actual problems highlighted by the people who coined the term "duck curve" over a decade ago.
One of the main ways we've avoided this problem so far is that solar kept getting cheaper.
So it's not really a problem caused by cheap solar, it's just basic market competition, lots of supply with no barrier to entry drives prices down towards the long term marginal cost.
This seems good for investors in utility-scale batteries, since they can charge them very cheaply and sell at night.
(Although, that might not work well in Finland in the summer.)
Nuclear is far superior as you don't need batteries and can have 24/7 reliable power. Cut the artificial costs caused by insane regulation and nuclear is the safest and most cost effective solution.
> These hybrid setups, which combine solar panels with batteries, are now standard in many regions and allow solar energy to be stored and released when needed, turning it into a more reliable, dispatchable source of power that helps balance grid demand.
On one hand, I think people underestimate how much energy our grids demand in a 24 hour cycle. The amount of lithium it would take to handle an unusually cloudy week would be astronomical.
On the other hand, one of the ironies of electric cars is that they are one of the least effective uses of battery capacity. A Tesla with a 60 kWh battery is probably touching less than 20 kWh of capacity every day.
So theoretically if you use the batteries for grid storage and actually cycle them regularly from 80% down to 20%, the battery capacity would be well over 2x - 4x more effective at offsetting carbon sources. (Even more so if you are offsetting worse sources like coal).
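Rough numbers behind that 2x-4x figure: the pack size and 20-80% cycling window come from the comment above, while the daily driving draw is an assumption:

```python
# Rough numbers behind the 2x-4x claim. The pack size and 20-80% window are
# from the comment above; the daily driving draw is an assumption.

PACK_KWH = 60
GRID_CYCLE_KWH = PACK_KWH * (0.80 - 0.20)   # 36 kWh moved per daily grid cycle

for daily_driving_kwh in (10, 20):
    ratio = GRID_CYCLE_KWH / daily_driving_kwh
    print(f"driving use of {daily_driving_kwh} kWh/day -> grid duty cycles {ratio:.1f}x more energy")
# -> roughly 1.8x to 3.6x, in line with the 2x-4x above
```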
> The amount of lithium it would take to handle an unusually cloudy week would be astronomical.
I don't think it's astronomical. At most a few dozen kg per person. Compare that to the amount of steel that we produce per person just to give everybody access to cars!
And it's certainly not more astronomical than the amount of natural gas that we already extract, transport, and burn for electricity.
We have built big things in the past, there's no reason we can't do it again today. In fact, it's going to be far easier today because our tech is better and our factories' productivity is so much higher than in the past.
During the height of the industrial revolution, humans were able to double steel production roughly every 6 years. We've doubled global lithium production in the last 3!
Even so, if you do the math about how much lithium we need to get to a few dozen kg per person given that rate of growth, we're still looking at 20-30 years. (There are also a lot of elemental and labor bottlenecks).
So you're right that it's not astronomical! We just might have speedier expectations for when it's achievable.
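A rough, runnable version of that math. The per-person target, population, and current production figure are assumptions for illustration (roughly 30 kg/person, 8 billion people, ~180 kt of lithium per year), with output doubling every 3 years as stated above:

```python
# Years of exponentially growing lithium output needed to accumulate ~30 kg per person.
target_tonnes = 30 * 8e9 / 1000        # ~240 million tonnes of lithium (assumed target)
production_t_per_year = 180_000        # assumed current annual output (rough)
doubling_years = 3

cumulative, years = 0.0, 0
while cumulative < target_tonnes:
    cumulative += production_t_per_year * 2 ** (years / doubling_years)
    years += 1

print(years)   # ~26 years under these assumptions
```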
Well, lithium is not the only option. For fixed storage, sodium is more and more feasible every day.
Fortunately other chemistries are making their way down the learning curve lithium's been on, and after a moment's reflection it would be astounding if the same types of storage that are optimal for mobile consumers also happened to be optimal for grid balancing. Daily grid balancing can be done pretty reasonably with a C rating[0] of about 0.2 or even 0.1, and as long as the batteries aren't so heavy they need to be placed on massive foundation piles, weight is not really a factor. You just need to make them as cheap as possible.
[0] A battery's C rating is the fraction of the battery that can discharge in 1 hour: https://wikibattery.org/en/wiki-us/battery/charging-rate-cha...
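A quick way to see why such a low C rating is enough for daily balancing, assuming the evening discharge window is what sets the requirement:

```python
# Required C rating if the battery mainly has to cover an evening discharge window.
for discharge_hours in (5, 10):
    print(f"{discharge_hours} h window -> C rating of {1 / discharge_hours:.1f}")
# 5 h -> 0.2, 10 h -> 0.1
```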
Also, sodium batteries are now coming online. Slightly lower energy density, but potentially vastly lower cost. From looking at cost trends and increased production, it also seems pretty obvious at this point that we probably are going to get those vast quantities of battery storage everyone is so surprised about.
I'm still a little bit skeptical because chemical energy storage is inherently corrosive and unstable and so maintaining installed capacity is more expensive.
Unlike power generation, where you can build a facility and it can more or less run for decades with basic maintenance, batteries effectively have a limited number of cycles they can be relied on for. So you would effectively have to remanufacture your entire storage capacity every 10-15 years. So scaling up on storage would theoretically get exponentially more expensive the more of the grid you take over.
Obviously battery technology is improving and each 1% gain in improvement represents drastic savings. But I think there is still lots of reason to diversify technologically and continue to look into complementary sources of base power like nuclear.
At grid scale, all sorts of different batteries become feasible. My favorite is the one that uses solar during the day to lift heavy objects, which then turn a generator at night when they come back down.
Grids don't need "batteries" as consumers think about them. They can store energy using gravity, for example pumping water up into a reservoir (to store) and letting it drain through a turbine using gravity (to convert back into electricity).
With $40/kWh batteries available soon, I think even having 100 kWh of storage for a house will be rather common. With 14.4 kWp of solar, 5 MWh of electricity use per year, and 100 kWh of battery at 50€/kWh, you have 90% autarky and a time-to-value of 8-9 years. Pretty sweet.
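A minimal payback sketch for a setup like the one above. The battery size, price, and annual consumption come from the comment; the installed solar cost, yield, and tariffs are assumptions to adjust for your own situation:

```python
# Rough payback estimate; all tariff and install-cost figures are assumptions.
battery_kwh, battery_eur_per_kwh = 100, 50
solar_kwp, solar_eur_per_kwp = 14.4, 900          # assumed installed cost per kWp
yield_kwh_per_kwp = 1_000                         # assumed annual yield (central Europe)
annual_use_kwh, autarky = 5_000, 0.90
retail_eur_per_kwh, feed_in_eur_per_kwh = 0.35, 0.06   # assumed tariffs

capex = battery_kwh * battery_eur_per_kwh + solar_kwp * solar_eur_per_kwp
generation = solar_kwp * yield_kwh_per_kwp
self_consumed = annual_use_kwh * autarky
annual_value = (self_consumed * retail_eur_per_kwh
                + (generation - self_consumed) * feed_in_eur_per_kwh)

print(f"{capex:.0f} EUR capex, {annual_value:.0f} EUR/yr, payback {capex / annual_value:.1f} yr")
# ~18,000 EUR, ~2,200 EUR/yr, ~8 years with these assumptions
```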
$40/kWh sounds so fantastic, I can't believe it. Could you please provide a source for that price? Where will I get such cheap batteries?
CATL sodium-ion batteries, currently in mass production.
https://aukehoekstra.substack.com/p/batteries-how-cheap-can-...
https://energynews.biz/catls-19-kwh-sodium-ion-claims-face-r...
https://www.nextbigfuture.com/2025/08/catl-sodium-ion-batter...
Not OP, but a recent auction in China has utility scale at $52/kWh. Given the cost decline curve of storage, I would assume we arrive at $40 within 1-2 years.
https://news.ycombinator.com/item?id=44513185
Not if you consider that you need to either renew or add battery capacity (and panels and power electronics) after x years? Or did you take that into account?
x = ~25 years for the panels. They're paid off long before that.
That doesn't impact payback cost, if the batteries & panels last longer than 8-9 years (and they do).
It's just one metric.
There is a small problem though: everyone will need that last 10% at the same time. And because you need infrastructure that will be used only 10 percent of the time, expect the price per kWh to be 10 times the current one to compensate.
The less you need the grid, the more expensive what you do pull from it becomes, because the infrastructure costs are spread over fewer kWh.
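A toy illustration of that cost-spreading effect, with made-up numbers for the fixed grid cost and baseline consumption:

```python
# Fixed grid costs spread over shrinking grid consumption (illustrative numbers).
fixed_grid_cost_eur = 1_000_000       # assumed annual cost of wires, poles, substations
baseline_grid_kwh = 10_000_000        # assumed kWh delivered before self-generation

for grid_share in (1.0, 0.5, 0.1):    # fraction of kWh still bought from the grid
    per_kwh = fixed_grid_cost_eur / (baseline_grid_kwh * grid_share)
    print(f"{grid_share:.0%} of kWh from grid -> {per_kwh:.2f} EUR/kWh grid charge")
# 100% -> 0.10, 50% -> 0.20, 10% -> 1.00 (10x)
```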
I'm not sure if everybody will need the last 10% at the same time. Perhaps on a grid that's only a few counties large?
100 kWh of energy will last the average US house three days. And when you throw in people's EV batteries too...
100 kWh would last me about the coldest (January) week with all mod cons and hot water and space heat via our heat pump:
https://www.earth.org.uk/energy-series-dataset.html#V-con
Yes. If the price per kWh goes up too far, guess what? More batteries. Luckily they are about to get very cheap.
It'll be interesting if the sodium-ion battery hype can combine with solar and give real base power alternatives. I think claims of "cheapest" don't really count unless they're qualified as non-base power.
The problem with the idea of "base power" is that in the past, the cheapest electricity was from the big thermal generators that take a day to warm up, and operate most cheaply by pumping out at maximum efficiency 24 hours a day. You could then layer on the more expensive electricity sources that could spin up in 15 minutes or a few hours, and match the demand curve, as long as you matched baseload generators to the minimum of the demand curve, and have a cost-optimal electricity mix.
Now that we have cheaper sources of energy for parts of the day, "base" power is a much less desirable concept. It's gone from a simple and straightforward optimization problem that a middle-schooler could solve to a cost optimization problem that markets and linear solvers can solve.
Now that we have cheap storage, and solar-plus-storage is cheaper than coal in the UK, the cost optimization is getting simpler: get rid of all the base load coal!
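To make the "linear solvers" remark above concrete, here is a toy single-hour dispatch problem with made-up costs and capacities, sketched with scipy's linprog (illustrative only, not how any real market clears):

```python
# Toy single-hour dispatch as a linear program (illustrative numbers only).
from scipy.optimize import linprog

cost = [0, 50, 80]        # assumed marginal cost per MWh: solar, coal, gas
capacity = [60, 70, 50]   # assumed available capacity in MW
demand = 100              # MW to serve this hour

res = linprog(
    c=cost,
    A_eq=[[1, 1, 1]], b_eq=[demand],           # supply must equal demand
    bounds=[(0, cap) for cap in capacity],
)
print(res.x)   # [60, 40, 0]: use all the near-free solar, fill the rest with the cheapest backup
```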
24-hour solar is already possible for large parts of the world with lithium batteries.
https://ember-energy.org/latest-insights/solar-electricity-e...
That study is already old as the prices for batteries have come down a lot more since then.
Define "real base power"? You think the grid can only operate if every generator is always-on? How do you square that with the fact that the grid is operating just fine with this stuff already?
This is a fallacy, basically. Not least because electricity is by far the most mobile traded commodity in human history. Not enough sun today where you live? Buy your power from across the continent, where they have plenty. Or from your wind generators which are working fine. Or the wind generators across the continent if you have to. Or crank up the hydro dams (most of which rarely run at 100%) a bit to handle the shortfall. Or even fire up an idle gas plant if you absolutely can't get anything else.
The idea that solar and wind aren't (sigh) "real" is a lie that someone sold you. The real world relies on a lot of this stuff already and the promised apocalypse never arrived. Go figure.
> Or the wind generators across the continent if you have to.
Transmission losses are typically very substantial in most grids that are AC based. For example, a cross-country power transmission with the USA's grid would result in ~36% losses (napkin math at about 20% loss per 1,000 km).
Reality isn't as simple as "ship the electricity" unfortunately; it makes a lot of sense to keep generation near consumption.
Edit: Since people like this comment, take a look at this: https://patternenergy.com/projects/southern-spirit-transmiss...
Where does your 20% for 1000km come from? Transmission losses for long distance lines are much lower, e.g. the below is from wikipedia:
> Depending on voltage level and construction details, HVDC transmission losses are quoted at 3.5% per 1,000 km (620 mi), about 50% less than AC (6.7%) lines at the same voltage.
https://en.wikipedia.org/wiki/High-voltage_direct_current
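For a sense of scale, compounding the two per-1,000 km figures quoted in this thread over a cross-country distance (the distance itself is an assumption):

```python
# Compounded line losses over distance, using the two per-1,000 km figures above.
def delivered(loss_per_1000km, distance_km):
    return (1 - loss_per_1000km) ** (distance_km / 1000)

distance_km = 2000   # assumed cross-country distance
for name, loss in [("napkin AC, 20%/1,000 km", 0.20), ("HVDC, 3.5%/1,000 km", 0.035)]:
    print(f"{name}: {1 - delivered(loss, distance_km):.0%} lost over {distance_km} km")
# ~36% vs ~7%
```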
The US is currently an AC grid. HVDC is not currently in any meaningful use in North America (yet)
They also have limits on how much power they can transfer before the lines sag too much.
Here in Norway the limitations of our rather poorly connected energy grid have become very apparent in the last few years, with 100x price differences between regions that aren't that far apart physically.
While we've been paying "winter prices" during summer, up north they've shut down hydro plants since the prices there are so low it's less than operating costs.
Clearly something we'll need to get over. HVDC has something like 3% per 1,000 km loss, and when we start generating massive quantities of solar, 3% is going to be pretty acceptable.
You're talking about roughly $5T (per the prior estimates I'm aware of) in costs for replacing the grid with HVDC transmission. That's a massive cost and undertaking. It's much more likely we see an incremental transition to point-to-point transmission in major industrial corridors (Los Angeles/Irvine to IE solar generation capacity in the future, the Louisiana Basin, North Texas to the gulf, etc.) where major benefits can cost-effectively be realized.
A "full" transition is unlikely in our lifetimes, because the majority of the benefits can be reaped without such an expense.
I never suggested replacing the entire network with HVDC. Point to point is exactly what I'd have recommended also.
> For example, a cross-country power transmission with the USA's grid would result in ~36% losses (napkin math at about 20% loss per 1,000 km)
Wait, is that it? I can get my electricity from a solar panel in Nevada, in the middle of winter, far longer than I have local sunlight for, for about 40 cents a kWh?
And that's treated as an existential problem?
Solar panels are so cheap we should be extremely overprovisioning anyway.
> for about 40 cents a kWh?
That’s fucking expensive if you don’t live in Germany or California! I pay a little less than a third of that for nuclear power.
> And that's treated as an existential problem?
Yes, tripling one’s electric bill is a problem.
One thing that could help would be to see more folks who are bullish on solar actually push the new promise that it gives: not just that we'll have clean energy (yay, but abstract) - we'll have cheap energy, if we do it right. Tons and tons of it for so cheap that you'll almost not even have to think about it, especially in comparison to fossil fuels today. Every aspect of the supply chain is simpler and cleaner.
If people could see that at some point, keeping their house at a perfect temperature with an electric heat pump would lead to them _never thinking about a heating bill again_... that would be far more concrete than promises of staving off climate change.
A Spanish bank has calculated that the solar and wind built there between 2019 and 2024 reduced wholesale electricity prices by 40% compared with if they'd stuck at 2019 levels.
They project a further 50% drop by 2030:
https://www.bde.es/wbe/en/publicaciones/analisis-economico-i...
> we'll have cheap energy, if we do it right.
Well by all means, show us how to do it right at scale. The leaders in this area (California, European countries) haven’t exactly done much to deliver on the promise of cheap renewable energy.
Probably not orders of magnitude cheaper than today. But energy prices will become very stable.
Not every generator needs to be always on, but generation needs to be relatively independent of external conditions, especially when there might be correlation between high usage and disadvantageous external conditions (for example, it's colder and there's less sun in winter).
Recognizing this does not mean one is hostile to renewables, even though some people that are hostile use this talking point dishonestly.
> How do you square that with the fact that the grid is operating just fine with this stuff already?
Simple: every grid in the United States has enough reliable generation capacity to take up the slack when solar fails. But that means the cost of building all those natural gas peaker plants is part of the cost of solar (it's never included in the LCOE).
> (it's never included in the LCOE).
Right, because that would change the definition of LCOE. And you are right that it's important, and there are other terms to look for, such as LACE, which the EIA has been putting out for a long time. And Lazard's energy reports:
https://www.lazard.com/news-announcements/lazard-releases-20...
have an entire section on "Cost of firming intermittency" where it estimates costs based on each region of the US.
Those peakers are what we're hoping battery-style plants will supplant. I say "battery-style" because it turns out that filling a tank with compressed air, lifting a heavy weight with a crane, or pumping water up into a lake all have excellent energy storage potential while not requiring any kind of esoteric matter or construction techniques. It does remain to be seen if such activities can be profitable or not - but if you're effectively buying power when it's cheap / free / paid to take it, and you're selling it when it isn't, it should be possible....
The only large-scale energy storage system I'm aware of that is also cost effective is pumped hydro.
However, pumped hydro is, shall we say, extremely environmentally bad.
Like casually-remove-a-mountain bad.
I suppose it depends where you're doing it. Makes most sense to do these things where natural formations do 99% of the construction and you just build a dam. Honestly, we probably need to do a lot more damming to keep more fresh water in - we lose so much of it to the ocean where it becomes useless salt trash...
Norway disagrees, it depends on your environment
> But that means the cost of building all those natural gas peaker plants is part of the cost of solar (it's never included in the LCOE).
This is a weird position to take. How would you price out nuclear then? As that cannot respond to changes in demand quickly either? Should every power source's cost have some gas tacked on to it? Or can we just assume that for now we have a mix of sources where different sources have different pros and cons.
And as said in many other places here, fossils don't have their externalities priced in either. I wouldn't be surprised if future generations scold us for burning so much natural gas that can also be used for many other things than burning
Not the GP, but I think it's fairly reasonable, yes.
Ultimately, what we care about from a grid policy perspective is the cost to provide 99% (or whatever) guaranteed 24/7/365 power. Each energy source will have its own challenges in order to do that, and for solar, availability is clearly one of them. And yes, externalities should be factored in.
> Or can we just assume that for now we have a mix of sources where different sources have different pros and cons.
Of course, but the question is: what is the right mix for the right place. And saying "solar + storage" is cheaper than gas means very different things if it can only guarantee 60% availability like in England, or 95+% in the sunniest regions of the world.
The figures reported here come from (among others) Lazard's LCOE analysis. (https://www.lazard.com/media/xemfey0k/lazards-lcoeplus-june-...) If you have a look at how it's calculated, "solar + storage" only has enough storage for 4h, for instance. You cannot really meaningfully compare it directly with nuclear, which is much more reliable in itself.
What LCOE says is that the system will produce X amount of energy at cost of Y. It says nothing about when that energy will be produced. An energy source that produces 365 MWh on January 1st only, and another that produces 1 MWh every day, for the same cost, will have the same LCOE. The latter is, provided you can scale it, much more useful in practice.
Look, I'm not saying that solar is bad or we shouldn't do it. It's just that the "solar is cheap" thing which is regularly reported is a bit misleading. We've heard it for years now, and yet electricity prices around the world are mostly increasing. Clearly there's a mismatch, but where does it come from? And I think part of the reason is that the "ancillary" costs of solar have been underestimated. Sure, the energy straight out of the panel is very cheap, but if you need 10s of billions in grid upgrades and storage/backup to make it work in practice, then it should definitely be included in the comparison! Just like the externalities of fossil fuels should be.
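For reference, a minimal sketch of the standard LCOE calculation (discounted lifetime costs divided by discounted lifetime energy); the plant figures are made up purely to illustrate the point above that when the energy arrives never enters the formula:

```python
# LCOE = discounted lifetime cost / discounted lifetime energy.
def lcoe(annual_cost, annual_mwh, years=20, discount=0.05):
    cost = sum(annual_cost / (1 + discount) ** t for t in range(1, years + 1))
    energy = sum(annual_mwh / (1 + discount) ** t for t in range(1, years + 1))
    return cost / energy

# Two hypothetical plants: one dumps 365 MWh on January 1st, the other delivers
# 1 MWh every day. Same annual cost, same annual energy -> identical LCOE.
print(round(lcoe(annual_cost=18_250, annual_mwh=365), 2))   # 50.0 $/MWh for both
```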
It should be noted that those low costs for solar installations are not including consumer rooftop solar. The consumer rooftop solar cost is usually one of the most expensive ways you can generate electricity - often several times the cost of utility solar installations:
https://www.lazard.com/media/xemfey0k/lazards-lcoeplus-june-...
The high rooftop solar price is usually hidden because no power source has been as subsidized as rooftop solar. Besides direct subsidies, wealthier home owners have often been paid the retail rate for the electricity they sell to the grid which causes higher electricity bills for those who can't afford to put panels on their roof. Also, in almost all cases, the home installation doesn’t have enough battery power to actually last through inclement weather and so is free riding on the reliability provided by the grid, putting more costs on the less well off. The whole thing is sort of a reverse Robin Hood scheme.
Any subsidies for solar power should go to utility grade solar. Money is limited and is fungible - a dollar spent subsidizing utility solar will go much, much, further than a dollar spent subsidizing wealthy homeowners who install panels on their roof.
> Also, in almost all cases, the home installation doesn’t have enough battery power to actually last through inclement weather
That's true today, but I bet it won't be in 5 years time. Around this year or last year batteries have hit a price point where they make financial sense for ordinary people, and in a couple of years you'd be mad not to have them if you have rooftop (and somewhere to put the batteries of course).
> Any subsidies for solar power should go to utility grade solar.
At this point, it should probably go towards storage, grid capacity, or things like EV charging infrastructure. Solar generation doesn't need it.
> The consumer rooftop solar cost is usually one of the most expensive ways you can generate electricity
This may well be true, but there are positive externalities:
- It has reduced land use compared to other energy generation methods
- Power is produced near the point of use, reducing transmission requirements
- Most of the cost is labour for installation which creates jobs
Given that rooftop solar usually pays for itself despite being less cost-efficient that other forms of solar, I see no reason to discourage it.
All new construction should have it, at least. The cost differential for building a new home's roof and outfitting that roof with solar just isn't high enough to justify doing it differently. Plus it improves cooling/heating costs and protects the roofing materials from direct sunlight.
That hugely depends on the region. In my corner of the world the scheme is net-billing, so homeowners are selling energy at wholesale prices.
My friend has a 10kW setup combined with a 10kWh battery. Main adjustment he made was prevent the heat pump from keeping the water hot during the night, as that was just wasting energy.
After this adjustment his electricity bills halved despite only really selling energy in the peak of the summer season. The savings translate to a 10-year return on investment before subsidies.
It's not as bad as all that.
Solar "subsidies" are almost universally tax credits, meaning the only money involved is the money paid by the homeowners. So, for society, rooftop solar is by far the cheapest option. It costs the rest of us nothing, the homeowners pay for it.
Money is also not limited; it is in fact created by banks when someone takes out a loan, for example to put solar on their roof.
Meaning there is no money lost from society that could instead be used to build utility solar just because someone puts solar on their roof. If your county borrows money to build utility solar, that money is also created by the bank then and there.
Also note that you are quoting last years Lazard report. Solar is way cheaper in this year's report. It will probably be even cheaper in the next one.
This is true. A lot of the older panels are also rather inefficient and come with a rather poor lifespan. It's part of the reason I don't personally have any form of solar despite working in the industry.
Technology is catching up on the solar panel front though. The French company Heliup is producing panels that are only 5 kg per square meter, which makes them significantly easier to install on rooftops. I imagine I'll eventually have solar panels on my roof, but I'll likely wait another decade for battery tech to also be more viable.
I would have to cut down so many trees around my house to get effective sunlight on the roof more than a few hours a day. Plus my roof is pitched east/west so at best I have half the roof in sunlight for half the day. I do have quite a bit of clear space behind the house, so I could consider a system that sits on the ground.
I just like paying the utility every month and not having to worry about owning, maintaining, and repairing my own infrastructure though. As it is now, if anything goes wrong upstream of the meter, that's not my responsibility.
Why don't you do a small installation where your generation does not exceed your usage so storage considerations are moot?
I think you're missing the realpolitik of rooftop installations. The person installing the panels has a few square meters of empty and unproductive space, over which they have exclusive construction rights. And which is already connected to a grid, with capacity to absorb its production, physically close to the consumers of its output.
That combination of factors is fairly rare in most of Europe & North America - especially when you’re looking for multiple contiguous kilometres of land for utility scale solar. There’s a lot of that type of land out in Texas and Australia, but there’s much less of it in the Scottish Highlands.
The high cost of rooftop solar in the US is mostly a regulatory choice. From tariffs driving up prices significantly, to completely scattered permitting processes city-to-city, to wild swings in utility policies that make solar a good-to-bad decision overnight, it all drives costs up. So the only solar installers who survive are those who are able to swoop in when the utility policies change and can drive customer acquisition super quickly ($$$), and have massive permitting office experience that lets them deploy. That all costs a lot of money and drives out competition from the space.
So we pay something like 3x-5x for residential solar in the US versus Australia. Because we choose to as a society.
Also, I heartily disagree that utility scale solar is obviously cheaper. Transmission and distribution costs are the biggest cost on the grid, and utility scale solar needs to pay for that whereas residential solar drives down T&D costs.
We definitely need a lot of utility scale solar, but if we want cheaper electricity we need to incentivize tons of residential solar so that we can keep our grid costs lower.
Australian rooftop solar is the cheapest consumer energy in history. This is despite the hardware and salaries being roughly equal to the US where the cost is about 3 times more. So it's not wise to extrapolate from US prices.
And in general, you should check your electricity bill to see how much is actually for generation vs transmission.
Totally. Grid solar and grid battery installations are gonna rock the world but rooftop solar is too bespoke and labor intensive. It goes directly against the benefit of solar’s mass manufacturing
Does this include the lifetime cost of needing to replace the panels? Also batteries, though I'd imagine that is separate. But the panels would seem to be a direct additional cost.
RE ".. Solar energy is now the cheapest source of power, study ...."
Does this cost include the needed storage (up to several days) and the often-needed new transmission lines? If it is the cheapest, is any country running only on solar? The all-up costs to actually allow 100% solar make it very expensive.
Storage is not needed. You can consume solar power as it's generated, and it is as useful then as if it came from oil, gas or coal.
When the sun goes down, you have saved tons of oil, gas and coal that didn't have to burn during the day. Which is very, very good. You don't have to "solve nighttime" before solar makes sense; it makes sense immediately.
Edit: And of course, nighttime is also being solved, in many ways, already.
I have a house that gets a lot of sun in the mid-Atlantic USA. I have spent a modest amount of time casually exploring solar. The cost/benefit never seems that great. What am I missing?
Let's do some simple napkin math: let's say a solar system will cost $25k (that should get you an 8-10kW rooftop system). Pay for it with, say, a 5% APR loan.
- 10 years: $265/mo
- 15 years: $197/mo
- 20 years: $164/mo
What are your current monthly electricity costs, and what time horizon gets you a good break-even?
Also consider what electricity costs are gonna do. The loan locks in a monthly cost, but electricity rates have lately been increasing around 3% per year. Say you currently have a $150 monthly electricity bill... in 10, 15, and 20 years that will be $202, $233, and $270/mo.
There's other ways to finance it (PPA's, leases, maybe incentives if we get serious about our energy again someday, etc.) that have their own advantages/disadvantages, but even this simple approach shows there's some wins depending on your exact consumption, time horizon, time-value of money, and what you expect your electric load to do (will you get an EV? will you adopt heat pumps? will you need to start using way more AC in the summers? etc)
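The napkin math above in runnable form, with the same assumptions ($25k system, 5% APR amortizing loan, a $150/month bill escalating 3% per year):

```python
# Loan payment vs. projected electricity bill, using the assumptions above.
def monthly_payment(principal, apr, years):
    r, n = apr / 12, years * 12
    return principal * r / (1 - (1 + r) ** -n)

for years in (10, 15, 20):
    loan = monthly_payment(25_000, 0.05, years)
    bill = 150 * 1.03 ** years
    print(f"{years} yr: loan ${loan:.0f}/mo vs projected bill ${bill:.0f}/mo")
# 10 yr: ~$265 vs ~$202; 15 yr: ~$198 vs ~$234; 20 yr: ~$165 vs ~$271
```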
I'm in a leaky, 80 year old 3 bed 2 bath in the Midwest (that doesn't have insulation in the walls... brick and plaster). Worst in the Summer is around $100/mth, worst in the Winter $40/mth (of course gas for heat is around $80 Winter, $15 Summer for water heater, stove, etc.).
The median cost/benefit of home solar is good, but not great. It tends to be a good investment, but not hugely different from other possibilities.
But the variance is huge.
Where are your calculations? It's hard for anyone to comment without seeing your cost/benefit analysis.
The actual paper seems to be available only as a docx file.
I've converted it to pdf.
https://files.catbox.moe/tfoim0.pdf
Does that include the delivery of power too? Or just for on site consumption?
Without digging into the study, hasn’t this been true for many years now? Would love to know an actual pivot date where commercial scale deployments became the cheapest option.
I dug down trying to get the actual paper and it appears it's a pre-print with only the abstract available (but maybe I wasn't looking in the right spot).
That being said, the quotes from the author were more to the point of it being a milestone that, in the UK, solar+battery systems were now less expensive than gas/coal.
To my understanding, this is a milestone vs "raw" production numbers, which you're correct in saying have had solar as the cheapest option for years.
The claim hinges entirely on the narrow definition of "large-scale energy generation." For the UK, with its high seasonal energy demand and low winter solar output, the cost of generation is almost irrelevant next to the cost of firming that power for 24/7/365 availability. While the paper[1] shows solar PV and daily-cycle batteries are getting cheaper, it also shows seasonal storage solutions like hydrogen are still an order of magnitude too expensive and inefficient (huge capex for electrolyzers/storage + poor round-trip efficiency). So providing reliable, 24/7/365 baseload power from PV + storage in the UK is demonstrably not cheaper than gas or nuclear today.
[1] https://www.authorea.com/users/960972/articles/1329770/maste...
It's now passed peer review, but hasn't yet been typeset into the journal's PDF format.
Abstract here: https://www.authorea.com/users/960972/articles/1329770-solar...
Which links to this .docx: https://www.authorea.com/users/960972/articles/1329770/maste...
Holy crap, the UK has absolutely awful solar insolation; that solar and battery is now cheaper than coal really emphasizes how much the tech has advanced, and continues to advance!
Germany also has absolutely terrible solar resources, worse than any continental US state, and is also deploying tons and tons of solar.
Solar really is one of the more amazing technologies of our time, especially when combined with batteries, which advance almost as fast.
We will have far better reliability, costs, and air quality as coal is completely replaced by modern clean energy systems.
It's kind of funny because you just know that in 30 years or so Texas is going to be a gloriously wealthy producer of solar power, and it'll probably be owned by the same people who own the oil fields today, but the current stranglehold of oil is preventing those very same people from getting richer.
Germany has deployed a lot of solar because of the Stromeinspeisungsgesetz from the 90s. It was the first feed-in tariff law in the world, afaik.
It guaranteed you a certain rate for a set number of years. As a result, people were driving out to farmers to rent their roofs and install solar on them.
There's a weird thing where contestants in the solar car challenge that Australia has been hosting since forever generate more power on hazy or partly cloudy days than on sunny ones. I don't know if they ever sorted out why. Speculation was reflected light off the clouds, but I suspect panel temperature also played a role. And road temperature affects panel temps.
What the UK cannot do is concentrating solar. The efficiency absolutely crashes in diffuse light.
Yeah. The temperature issue would have been my first guess.
Regarding concentrating solar: are people still trying to make that work for commercial generation? I thought this had generally failed to pan out for electricity generation.
Concentrated PV is pretty dead, the cost and complexity of the tracking is not at all worth it for the theoretical increase in efficiency.
UK insolation is very roughly half that of (say) California, IIRC...
(BTW, this is my crowd at Surrey, and Ravi is my (IfS) director!)
Wouldn't it depend a lot on where you live in terms of weather and/or required pollution controls?
I don't understand. Why is it that in every single discussion about power- and in particular, solar power- is the cost of land omitted ?
This is literally the most important factor to consider, as land is a very scarce resource in some markets - never mind that it would be a significant driver of cost for solar, which requires massive amounts of land (and taxes). This is true wherever solar, wind, etc. are built.
Edit: your comment has changed a bit, so let me address the new sentence here:
> I don't understand. Why is it that in every single discussion about power- and in particular, solar power- is the cost of land omitted ?
You are incorrect that land costs are omitted; in fact I don't recall any sort of cost comparison that has ever omitted land costs, whether it's from Lazard or NREL or anywhere else. It's part of the CapEx, or as rent, or however it's modeled. NREL in particular looks at overall costs on completed and built systems. Further, the developers building these things know the land costs very well.
Can you point to a discussion or paper where land cost was not included? If it's still an open discussion I'd love to add the high quality solar-is-cheapest studies that all include land costs.
And as for this particular study:
First, it doesn't omit the cost of land. Secondly, land is only the most important resource in very few markets. Razing skyscrapers in Manhattan would be a bad idea. Instead of farming crops or the sun in Manhattan, it's done elsewhere and shipped in.
It's been included in every one I've read. You're not going to find it in broad surveys like this one, because it's a survey. They'll average together the cost of hundreds of projects. But when you look at the projects individually that were the sources, land acquisition is part of the capital costs or land lease is part of the operating costs.
Why are you saying land is not included in the cost calculations? I've not read this particular study, but every study looking at the cost of electricity generation I have seen in recent years essentially looked at investment/returns, which obviously includes land use cost. The reason why it's not a big topic is that it's generally not a big contributor to overall cost, the main drivers being panels and construction. Incidentally, the main power sources where things are not included in the cost are nuclear (typically the risk of major accidents is taken on by the government, as no insurance will do it, and waste storage cost is often capped as well in calculations) and fossils (health effects of pollution, CO2...).
Because land doesn't need to be vacant to develop solar?
Residential or industrial rooftops are perfect examples. Heck, sound barriers along freeways etc are now even profitable
Residential rooftops in places like Asian cities are already >50x overcommitted. Land on Earth is a scarce resource. Basically, among developed countries, it's only the US west coast that has cheap surplus land for some solar.
Counter-examples:
[1] https://www.pv-magazine.com/2025/10/03/rooftop-solar-could-g...
[2] https://www.cleanenergywire.org/news/rooftop-solar-housing-b...
Go look up the US and Norway in the density list[1] before trying to use those countries as the reference standard for developed countries. Sort by density and scroll to the top and to the bottom. Doing so also explains why EV drivers from basically nowhere other than those countries appreciate the "full tank every morning" concept.
1: https://en.wikipedia.org/wiki/List_of_countries_and_dependen...
It's also not clear if they factored in battery costs. They mention lithium battery costs have also decreased, but it's not clear if that's included. Batteries are a requirement for consistently delivering solar power. You can't just flip the sun on and off.
No generator runs 100%. Not so long ago the UK nuke fleet was managing only about 60% capacity factor. The grid system runs as a whole, with generators backing one another up. The last two big load-shedding incidents on the GB grid (since the turn of the century; ~500k people had power cut to keep the rest of the grid up) involved thermal generators tripping; the older one was nuke+coal, the newer one gas+wind. And in general we know when the sun is going down a long time in advance, so grid operators can plan for it!
Because solar (for example) can often go on rooftops so not needing new land. And solar and wind are often good in remote areas where population is low and land is cheap.
In good faith, rooftops are not the solar that is generally talked about, when we discuss solar as the cheapest form of energy.
The solar discussed is massive investments into solar farm production.
To your point, sure you can do "remote" - you still have to get a lot of power transmission lines to remote locations, and yes, you still have to buy the land. It's not a zero-cost investment. But that's what the accounting looks like when discussing this topic.
It's sloppy.
We generally don't like to put our nukes (or coal plant) in the centre of cities even though that would reduce transmission costs, nor are major hydro resources often close to the areas that power from them goes to. So that is not unique to wind and solar even if sometimes amplified for them.
Separate reports in the last couple of days suggest that both Norway and Germany could get ~25% of their total solar power from viable rooftop spaces, IIRC.
And indeed one very positive feature of solar is that it IS safe to deploy AT load/demand centres, very much reducing costs and losses in the last 'distribution' mile.
Solar also works when there is no grid locally, which is useful from rural UK to Africa.
[1] https://www.pv-magazine.com/2025/10/03/rooftop-solar-could-g...
[2] https://www.cleanenergywire.org/news/rooftop-solar-housing-b...
Yep, I can't quite get as much power out of the panels on my roof as I use on average, but that's okay because I am reducing the cost of transmission by a significant amount.
That doesn't really seem to answer the question unless you are suggesting solar doesn't have any land costs? I completely understand that in some situations it's effectively zero. But in aggregate?
If UK golf courses were turned over entirely to solar PV that would more than cover the ~30GW extra called for in the UK's Clean Power 2030 plan, IIRC.
I would regard that as a huge improvement in use from areas that are otherwise often fairly sterile from a biodiversity point of view, and only get used by a fraction of the population.
What is the 'cost' of those courses, vs PV, vs horrible climate change?
But the point remains that very significant PV can be provided on already-in-use land from homes and warehouses to reservoirs and farm land agrivoltaics.
Why are we assuming that the cost of land isn't included in their LCoE calculation?
PV is fairly unique in that land is not required. It can be installed over existing land uses such as rooftops, canals, parking.
The cost of land is usually omitted in casual discussions because it's not a significant driver of cost for solar. Quantitative cost breakdowns do include it. See for example https://www.eia.gov/analysis/studies/powerplants/capitalcost..., where the cost estimate for a US$200 million 150-megawatt-AC solar plant in 02019 (six years ago) included US$133,000 per year in land leasing costs, one fourth the cost of "module cleaning" and one sixth the cost of "preventative maintenance".
However, not every single discussion does omit it. For example, we were discussing this yesterday at https://news.ycombinator.com/item?id=45487268 and https://news.ycombinator.com/item?id=45487197, where my calculation was that even in solar-unfavorable places like the northern extreme of Germany, the cost of land barely reaches the same order of magnitude as the cost of the solar modules, even at today's record-low module prices. US$72 million of the US$200 million of the estimate mentioned above was the modules themselves, but today that would be closer to US$15 million.
The atom of truth in your confused assertion is that solar farms do take up enormous amounts of land compared to other kinds of power plants. But, at current human energy consumption levels, it's still only a tiny amount of overall land.
I'm assuming from your past submission history that you're in the US, because you mostly only post US politics stuff. The US generates about 4200 TWh electric per year, which is 480 gigawatts (https://en.wikipedia.org/wiki/Electricity_sector_of_the_Unit...). Under the 2000kWh/year/kWp conditions prevailing in the Californian Mojave Desert, parts of Arizona, and parts of New Mexico, according to https://solargis.com/resources/free-maps-and-gis-data?locali..., this would require about 2.1 terawatts (peak) of solar panels. A square meter of 22%-efficient solar panel produces 220 watts, peak, so this is about 9600 square kilometers, a 110-kilometer-diameter circle. (Although, if you don't leave spaces between the panels, you have to make them horizontal so they don't shade each other; in practice people set them further apart to use more land but less panels.)
How big is that?
It's almost 0.1% of the US, not counting the space between the panels. It's a little over twice the diameter of the VLA radio telescope in western New Mexico, a facility you might remember from the movie Contact. It's slightly larger than White Sands Missile Range (8300km²). It's 15 times the size of Lake Mead, which was created by Hoover Dam to generate electricity. It's about two thirds the size of California's Death Valley National Park (13'793km²). It would take up one eighth of the Navajo Reservation.
Almost everywhere in the US has at least half that much sunlight. Suppose you wanted to site the panels near Springport, Michigan, for some reason. You only have 1200 kWh/year/kWp there, so you need 16000km². The panels would cover a ten-county area centered on Jackson County; they would reach Lansing, and might reach halfway to Ann Arbor in one direction and Kalamazoo in the other.
In real life, it wouldn't be a good idea to put it all in one place like that, both because it's fragile and because it creates higher transmission-line losses; it's better to put the panels closer to where they're used, which implies spreading them apart. So probably 0.2% of every state. In an average-sized state like Iowa (145'746km²) it might be 300km², 20% of the size of an average-sized county like Hamilton County. To power the whole state.
But try to keep perspective on the total amount of land we're talking about here: White Sands Missile Range, two thirds of Death Valley, or a small part of the Navajo Reservation to power the whole country.
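The back-of-the-envelope above, in runnable form with the same stated assumptions (4,200 TWh/yr, 22%-efficient panels at roughly 220 W/m² peak, and either 2,000 or 1,200 kWh/yr per kWp):

```python
# Land-area napkin math from the comment above, same assumptions.
us_twh_per_year = 4200
panel_w_per_m2 = 220      # ~22% efficient panel under ~1,000 W/m2 peak sun

for site, kwh_per_kwp_year in [("Mojave-class sun", 2000), ("Springport, MI", 1200)]:
    kwp_needed = us_twh_per_year * 1e9 / kwh_per_kwp_year     # TWh -> kWh
    area_km2 = kwp_needed * 1000 / panel_w_per_m2 / 1e6       # kWp -> W -> m2 -> km2
    print(f"{site}: {kwp_needed / 1e9:.1f} TWp of panels, ~{area_km2:,.0f} km2 of panel area")
# Mojave-class sun: 2.1 TWp, ~9,545 km2; Springport, MI: 3.5 TWp, ~15,909 km2
```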
Because the land is already owned for the most part?
Land is also spoiled and occupied for fossil fuel extraction, refinement, and distribution. And at least you can be within the near vicinity of solar panels without experiencing significant health issues, so we need to tabulate the very real public health costs too when comparing. I highly doubt there is a solar equivalent of “Cancer Alley” in the US. Another consideration is that we can slap solar panels on top of existing structures, which you cannot do with fossil fuel at any stage.
I don’t doubt that the costs are considerable but if we’re going to go down that path we need to make sure it’s applied across the board here and I imagine the gap is not as bad as you’re implying. I’m going off my gut though so I could be entirely wrong.
Land and exploration costs are definitely included in fossil fuel costs.
I think most major solar projects include land and infrastructure so only comparing the costs of the panels is like comparing hydro only based on operating the dam, not building it.
> Land and exploration costs are definitely included in fossil fuel costs. I think most major solar projects include land and infrastructure so only comparing the costs of the panels is like comparing hydro only based on operating the dam, not building it.
Fossil fuel, the way we're running it now, is only priced based on burning it, not on dealing with its externalities either.
What about the trillions of dollars per year in damages, health costs, etc. due to emissions?
You're very right to bring up how much land is needed for fossil fuel extraction and refinement.
Take all that land that's currently used for fossil fuels, and replace its use with solar, and you're replacing pretty much 100% of the world's total energy demand, not just fossil fuel demand. (I need to find that citation again...)
What's the cite for solar requiring "massive amounts of land and taxes"? That doesn't square with the back of my envelope but I'd be willing to look at numbers.
Because the land cost and amount needed is immaterial. Solar and wind actually contribute to the tax base where it is colocated, tax revenue that would otherwise not be provided (depending on jurisdiction). People overvalue land for some reason, most of the world is not Manhattan or Hong Kong. The US farms 60 million acres for corn and soybean for biofuels alone, for example, and of course that land is much more efficiently used if solar PV is installed for energy.
https://elements.visualcapitalist.com/how-much-land-power-us...
https://www.weforum.org/stories/2021/10/solar-panels-half-th...
https://thebreakthrough.org/issues/food-agriculture-environm...
https://www.canarymedia.com/articles/solar/california-first-...
None of the links (and I checked) actually attempts to answer my point.
It's simple:
*Why is the cost of land omitted?*
A link to "rooftops" is not an answer when its primary use case is an ROI that doesn't work out unless it's measured in decades.
Sloppy accounting.
My initial answer was literally going to be a Billy Madison quote. Specifically, the reply to Principal Max Anderson to Billy's rambling answer. But that's offensive, so i will just state:
you can do better!
In the US, the lease cost per acre is typically $1k-$5k/acre/year over a 20-25 year lease. Feel free to back that out per kWh based on 1MW of solar per ~5 acres and ~1,460 MWh per year based on the four peak sunlight hours a day per the national average.
https://www.agweb.com/news/business/4-500-acre-plus-signing-...
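Backing the lease cost out per kWh as suggested, using the figures in the comment above:

```python
# Lease cost per kWh from the figures above.
acres_per_mw = 5
mwh_per_mw_per_year = 1_460          # 4 peak-sun hours/day * 365 days

for lease_per_acre in (1_000, 5_000):                  # $/acre/year
    annual_lease = lease_per_acre * acres_per_mw
    cents_per_kwh = annual_lease / (mwh_per_mw_per_year * 1_000) * 100
    print(f"${lease_per_acre}/acre/yr -> {cents_per_kwh:.2f} cents/kWh")
# $1,000 -> 0.34 cents/kWh; $5,000 -> 1.71 cents/kWh
```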
You don't tend to see it because it's so small it gets buried and lumped in with a bunch of other costs, either capex or as operating costs (if modeled as a lease). It's ~1% of the total costs in most markets, so the only people who think it's worth mentioning are people advocating for nuclear power, since energy produced per acre is one of the only metrics where nuclear comes out on top.
This isn't surprising. Or new. I've been saying for years solar is the future. Solar has so many advantages over every other form of power generation. It's flexible, scalable (meaning you can have an installation any size from a solar farm to a calculator) and robust because it has no moving parts. It has no pollution (noise or chemical) and can be dual-use on many structures, everything from rooftops to adding solar to highways.
The big problem with solar? Regulation around installing it that is entirely designed to protect the profits of utility companies.
We have predatory financing around solar where companies are allowed to put a lien on your house and then essentially extort the homeowner if they ever choose to sell such that solar can reduce the value of your house significantly.
We limit the amount of solar basically so the utility can keep selling you electricity.
One might say it's to cover the building and maintenance of the transmission infrastructure. There's some truth to that. But at the same time utilities are generating massive profits, doing share buybacks and giving massive concessions to data centers that everyone else is paying for.
Basically we would all be better off if every electricity provider wasn't a private company but instead a municipal operation, like municipal broadband.
There's also the problem of rural opposition to solar farms. Not unusual to see a "no solar farms" sign every few hundred feet in some parts of Indiana where I wander (often in the same areas where the cars have bumper stickers celebrating coal).
I assume astroturfing is at play.
As a steady income stream, solar is a huge help to smaller farmers that might otherwise go bankrupt from a few bad years' of harvests. Larger growers hate that they no longer get to gobble up the land of the smaller growers, so there's a strong political incentive to block solar from their side.
> The research team also found that the price of lithium-ion batteries has fallen by 89% since 2010, making solar-plus-storage systems as cost-effective as gas power plants
Well, which one is it? Is it cheaper or the same price as a gas plant?
New energy generation from solar is cheaper. New energy generation from solar plus storage is cost competitive with natural gas peaker plants. By how much depends on the region.
This paper makes an error that is all too common in discussions of power grids.
It talks about the California grid. Except there is no such thing: California is on the Western grid, which is operated by WECC.
The actual mix of energy on the western grid is here.[1]
[1] https://feature.wecc.org/soti2025/soti2025/resources/index.h...
But California has this: https://www.caiso.com/todays-outlook
CalISO is a weird entity. WECC has essentially subordinated contracting within a subset of the Western grid to CalISO.
However it's not actually a separate grid. So when analyzing stability issues from inverter based sources of power (solar/wind/batteries) we can't use CalISO numbers since they can (and do) actually draw power from outside the CalISO grid area.
Looking forward to the day when solar dominates and we can transmit from the sunny side of the world to the dark side... I think there are already some plans for a pan-European grid, maybe even a European/Asian grid. The bigger the grid, the more resilient solar is.
It would also make the world more interdependent and thus hopefully more peaceful.
I remember reading about this in IEEE. If you google "hypergrid IEEE" you can find papers in IEEE Xplore, but there was also a perspective piece that was more readable that I read a few years ago...
The Pan-European grid is already reality. The fluctuating generation costs between e.g. the Nordics and mainland Europe are already driving investment into undersea HVDC cables.
To transmit power you will need battery storage on both sides of long-distance power lines. We never build enough of them, but if you can peak-shave and trough-fill with stored power, you increase the MWh per day available on the far end.
Solar tends to come with battery storage. But you could also just build the battery storage.
Why transmit it rather than just storing the unused power for later near the source?
Because only one of those things is something we know how to do on the scale required.
I'm not sure how practical that would be. Transmitting power over distances incurs losses.
I did see a proposal to build out solar in Africa and pipe it undersea to Europe. That seemed wild, and, predictably, it got canned for its impracticality.
Edit - it looks like there are several such proposals, and that they're not all cancelled:
Several travel across the Mediterranean:
https://gregy-interconnector.gr/index_en.html
https://www.ecofinagency.com/news-industry/0210-49221-egypt-...
Here's the one I thought was cancelled, which travels along the western coast of Africa to the UK:
https://en.wikipedia.org/wiki/Xlinks_Morocco%E2%80%93UK_Powe...
https://xlinks.co/morocco-uk-power-project/
https://thenational-the-national-prod.cdn.arcpublishing.com/...
I feel like it would be more feasible than that to just put something in space and beam it down by laser/maser/whatever; that way it's outside the penumbra and can be in full light 24/7.
I feel like this makes some sense, but imagine the laser gets a degree or two misaligned....
https://en.wikipedia.org/wiki/Desertec
For those interested
I suspect a mix of batteries and baseload delivery from the sunny side will be helpful. Even a few more hours of daylight from across the country can help batteries "last longer" by shortening the demand period.
skimming the article it looks like this excludes transmission, construction, and disposal costs
Do you think those are significantly different for solar versus other technologies? Seems they would be right in line with anything else...
Capacity factor for renewables is generally lower than thermal plant, so some transmission costs may be higher. Though batteries firming output mitigates that for example.
Right, and the majority of solar is getting installed with batteries these days because the really profitable hours are in the evening. Something like 60% of new solar in 2024 in the US had batteries on site.
Wind + solar + batteries at one site can be a good combo.
Depending on when the wind blows! (Which in most of the US is more at night, IIRC?)
For the UK there is more wind in the winter, solar in the summer.
Intraday anti-correlation is there, but it's more complicated:
https://bmrs.elexon.co.uk/generation-forecast-for-wind-and-s...
The US picture varies a lot over the contiguous states, I think, but often, again, there is anti-correlation in general.
In the UK, wind is primarily in Scotland away from most of the population - people don’t build where it’s windy.
So transmission of solar should be less, as the sun shines everywhere, and people like to build houses where it shines the most.
Hopefully coal is not built close to the population either...
There is no coal-fired electricity generation in the UK now.
But no, neither coal nor nukes would be welcome in cities just to reduce transmission costs!
Let's not forget the cost of carbon emissions which are not normally accounted for in pricing fossil fuels.
This. We're about to find out how expensive those are going to be...
Also, the health impacts on us, by breathing the fumes of fossil fuel power plants, etc. The cost of fossil fuel is much more than a price tag in the electricity bill.
It is. The output of solar panels is proportional to area, and they contain all sorts of heavy metals.
No, solar panels do not contain "all sorts of heavy metals," they leave the land open to all sorts of use afterwards (unlike fossil fuel infrastructure) and disposal is no more difficult than shipping the panels somewhere.
When a site is repowered with new panels, the old panels are often reused, as well, and there's a robust secondary market for panels.
In short: disposal costs are well in line with other technologies, at least according to this consultant (top hit on a web search):
https://thundersaidenergy.com/downloads/solar-power-decommis...
Solar panels are photodiode arrays. Photodiodes are made by lacing mid-purity Si with ultra-toxic metals. "The more neurotoxic the better" is the basic rule of all high-performance opto-electronic semiconductors. The less toxic you make them, the worse they get, and the less responsive they are to longer-wavelength light.
In, Ga, As, P, Si, Zn, Se, Ge, Cd, Te, Pb, etc. The duh materials. Sometimes the whole panel is thin crystals of GaAs. You have to melt that down to recycle.
Key word here being "lacing." Tractors, trucks, and cars with lead acid batteries are far more dangerous to have out in a field than solar panels.
Usually the lead used in the solder, if there is lead, is the biggest risk!
Those metals are not like the Mo additive in greases. The machines that contained them came with warnings long before Prop 65 or EU RoHS, simply because of their toxicity. And deploying PVs scales that out into the fields. It's a mildly insane thing to do.
Even nukes are more eco-friendly, even if you include things like Fukushima, and maybe Chernobyl too. The elephant's foot is going to passivate itself before the collapse of human civilization could occur. I don't know if the same can be said about panels blown away in rainstorms.
It doesn't exclude construction. The abstract says they analyzed levelized cost of electricity (LCOE). LCOE includes investments, operations, and fuel, so it includes construction.
Transmission is more of a problem with the centralized forms of power generation? So transmission should be a net positive for solar?
Construction and disposal I'm not sure to be honest, intuitively I don't think those are much more expensive.
Is there a form of power that doesn't have these? A coal plant has all these plus fuel costs, workers to run the plant, parts for the turbines, and ash disposal to deal with.