Space-Based Solar Power

A strange article in E+T which seems to support a very limited and costly source of electricity.

https://eandt.theiet.org/2025/08/20/solar-panels-placed-space-could-drastically-reduce-need-earth-based-renewables

Hoping for a bit more engineering content, I found the original publication,

https://www.cell.com/joule/fulltext/S2542-4351(25)00255-7

with the PDF here.

https://www.cell.com/action/showPdf?pii=S2542-4351%2825%2900255-7

I am fairly sure there was a discussion in a previous iteration of this forum looking at the problems and risks of beaming microwave energy from space to a ground station: what the energy density would need to be, and how to avoid the beam being deflected onto a nearby town.

The article itself seems to suggest that this is an alternative to batteries for when the sun doesn’t shine, since there are no clouds in space (do clouds absorb microwave energy?), and that it requires a fourteenfold cost reduction before it is viable.

The key statement is on page 9:

“Advanced nuclear, encompassing small modular reactors (SMRs) and Gen IV designs, with fusion more distant, aims to deliver continuous, carbon-free electricity. Demonstration SMRs have entered operation or advanced construction in recent years, and a Gen IV pebble-bed reactor began supplying power in 2021. By contrast, SBSP may be unlikely to mature before the 2040s, so advanced nuclear may achieve commercial readiness sooner.”

Even if SBSP might work, nuclear will get there sooner and do a better job.

What do the panel think?

  • I would take this as an example that the point of academic research is to explore possibilities; it's very often not possible to guess where these might lead.

    The cycle tends to be that:

    - the academic publishes a thought exercise in order to promote further research

    - the press leap on it and publish it as "this is coming NOW!"

    - the press and public together then ridicule it as "well they promised this but it never happened"

    - the academic goes and gets very drunk and depressed and wishes they'd never published it

    Incidentally I'm very aware that University press departments often don't help this cycle at all...they get pushed to promote early stage research in order to gain funding.

    These days, I regret to say, I read New Scientist rather than E&T. NS is very good at being clear whether research is "at the point of practical implementation" or is "here's an idea - what does it make the rest of you think of that might be a better idea?"

    Back in the engineering world, please let's start putting solar panels on all the acres and acres of flat (and flattish) roofing...I cringe every time I pass an industrial estate or trading estate at the wasted roof space potential...

  • I'm inclined to agree - any article that says "harvested electricity is then converted into microwave or laser beams for transmission. Microwave frequencies (1–10 GHz), especially 2.45 or 5.8 GHz, are commonly selected to balance transmission efficiency, atmospheric attenuation, and safety constraints." is very much at the "idea bouncing" stage.

    Given what I know of the current state of play of lasers and microwaves, this is unlikely to be simultaneously efficient and safe for folk on the ground. Consider a laser like DragonFire - one of the highest energy densities for which figures are available. There we have 50 kW or so (not sure if that is DC input or optical output, mind you), available for short bursts, in a spot perhaps 25 mm wide at a km, and presumably diverging with distance - and that is before we worry about cloud cover, dust storms and so on. The conversion efficiency is likely to be in the tens of percent, so let's assume for now that it can be as good as 50%. To beam down a gigawatt - a sensible single-power-station sort of figure - another gigawatt of waste heat then needs to be sweated off at the source, which is hard to do in the vacuum of space.
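
    A rough sketch of that scaling in Python - all figures assumed by me (50% space-side efficiency, 350 K radiators, emissivity 0.9), none from the article:

```python
# Back-of-envelope check of the waste-heat problem. Assumed figures (mine,
# not the article's): 50% space-side conversion efficiency, radiators at
# 350 K with emissivity 0.9, radiating from both faces, solar loading ignored.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(waste_heat_w, temp_k=350.0, emissivity=0.9, sides=2):
    """Panel area needed to radiate the waste heat to deep space."""
    flux = sides * emissivity * SIGMA * temp_k**4   # W per m^2 of panel
    return waste_heat_w / flux

delivered = 1e9                        # 1 GW beamed down
eff = 0.5                              # assumed space-side efficiency
waste = delivered * (1 - eff) / eff    # waste heat in orbit, W
area = radiator_area(waste)
print(f"waste heat {waste/1e9:.1f} GW -> radiator area {area/1e6:.2f} km^2")
```

    Even on these generous assumptions, the orbital radiator ends up at well over half a square kilometre for one gigawatt delivered.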

    At the receiving end, the sensible thing to do for a laser system is probably to boil water, or melt salt or heat oil to then boil water, and run turbines. The damage such a system could do if beam tracking went awry needs, I hope, no explanation.

    A microwave system might use rectennas - antennas with RF rectifiers built in - but there are problems: rectifying at thousands of MHz requires very small diode chips, and these are necessarily limited to a few volts peak and have very awkward impedances. Of course one can stack lots of them, which is presumably where the references to kilometer-scale receiving stations come in.
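
    To put numbers on the kilometer-scale point - a sketch with assumed figures (a ~250 W/m^2 ground-level beam density limit, roughly a quarter of midday sunlight, and ~1 W per rectenna element; both my own round numbers):

```python
import math

# Rough sizing of the ground receiving station. Assumed figures (my own
# round numbers): peak beam density held to ~250 W/m^2 at ground level,
# and ~1 W handled per rectenna diode/antenna element.
delivered_w = 1e9          # 1 GW delivered
density_w_m2 = 250.0       # assumed safe peak beam density
w_per_element = 1.0        # assumed power per rectenna element

area_m2 = delivered_w / density_w_m2
diameter_km = 2.0 * math.sqrt(area_m2 / math.pi) / 1e3
elements = delivered_w / w_per_element
print(f"area {area_m2/1e6:.0f} km^2, diameter {diameter_km:.1f} km, "
      f"~{elements:.0e} elements")
```

    That comes out at a few square kilometres of receiver and on the order of a billion rectenna elements per gigawatt delivered - consistent with kilometer-scale stations.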

    The efficiency is not good at either end for such systems - we'd probably need to shed a gigawatt on the ground, and two more in space, per gigawatt delivered.

    Doable, of course, but bigger than the study suggests, and probably never worth taking beyond a demo system - which would still be worthwhile as a training aid and to improve the modelling.
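
    That chain can be sketched with assumed 50% conversion efficiencies at each end (beam-path losses ignored for simplicity; both figures are my assumptions):

```python
# Sketch of the end-to-end chain: assumed 50% conversion in orbit and
# 50% at the ground receiver; beam-path losses ignored for simplicity.
def chain(delivered_gw, eff_space=0.5, eff_ground=0.5):
    beam_gw = delivered_gw / eff_ground      # RF power arriving at the ground
    space_in_gw = beam_gw / eff_space        # power collected in orbit
    waste_ground = beam_gw - delivered_gw    # heat shed at the receiver
    waste_space = space_in_gw - beam_gw      # heat shed by orbital radiators
    return space_in_gw, waste_space, waste_ground

space_in, w_space, w_ground = chain(1.0)
print(f"collect {space_in:.0f} GW in orbit; shed {w_space:.0f} GW in space "
      f"and {w_ground:.0f} GW on the ground, per GW delivered")
```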

    Mike

  • I also subscribe to New Scientist. As you say, it is generally fairly open about the viability of things; however, it sometimes goes a bit far.

    www.newscientist.com/.../

    I hope this isn’t subscriber-only.

    The basic premise that the amount of sunlight striking the earth is vastly more than our energy consumption is probably correct.

    How this can be collected and distributed is not really considered. Applying solar energy collection (PV or heat) to 0.3% of the earth’s land area sounds trivial, but in reality that is an immense amount of resources.

    The collection system would need to be placed in the equatorial belt for maximum energy collection. The distribution system, either as electricity or some form of chemical conversion like hydrogen or ammonia (degraded hydrogen) will also require an immense amount of resources.

    Is nuclear a better option? Apparently not:

    “But what about nuclear fusion? Would that be an even better option than solar if it ever becomes viable?

    The answer is no. Eric Chaisson at Harvard University estimated even assuming modest growth in global energy demand, in around three centuries the waste heat alone might be enough to warm the planet by 3°C. We are talking here about the waste heat produced as generated energy is used, when you boil a kettle or use a computer, say.”

    I don’t follow this logic. If, as we are told at the beginning of the article, our energy consumption is 1/6000 of the sun’s energy input how can this cause a 3°C temperature rise, even allowing for a 30% efficiency?

  • The basic premise that the amount of sunlight striking the earth is vastly more than our energy consumption is probably correct.

    I would hope so... Otherwise we really are on borrowed time.

  • how can this cause a 3°C temperature rise

    Nearly all assessments of this style miss out on stating the 'hidden' (unstated) assumption, or fact, that allows the effects to continue at the stated pace.

    It's like the original (1890s?) assessments that the Earth would cool naturally by radiation (it might have been Lord Kelvin's), but they didn't know about the nuclear decay heating within the Earth's core.

    We see the same with the various CO2 claims and counter-claims: they don't really mention that it is the high-level upper atmosphere where the extra CO2 makes the difference, or they compare 'too much' with 'far too much' [implying no difference].

    Do you have a reference to Eric Chaisson at Harvard University's work on that 'estimate'?

  • a reference to Eric Chaisson at Harvard University's work on that 'estimate'

    It may be this one: https://ui.adsabs.harvard.edu/abs/2008EOSTr..89..253C/abstract

    Hopefully it's informative about the assumptions.

    Long-Term Global Heating From Energy Usage

    • Chaisson, Eric J.

    Abstract

    Even if civilization on Earth stops polluting the biosphere with greenhouse gases, humanity could eventually be awash in too much heat, namely, the dissipated heat by-product generated by any nonrenewable energy source. Apart from the Sun's natural aging - which causes an approximately 1% luminosity rise for each 10^8 years and thus about a 1°C increase in Earth's surface temperature - well within 1000 years our technological society could find itself up against a fundamental limit to growth: an unavoidable global heating of roughly 3°C dictated solely by the second law of thermodynamics, a biogeophysical effect often ignored when estimating future planetary warming scenarios.

    Publication:
    Eos, Transactions American Geophysical Union, Volume 89, Issue 28, p. 253-254
    Pub Date:
    July 2008
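
    To see where the growth assumption bites: a rough reconstruction with my own round numbers (not Chaisson's) - ~18 TW of primary energy use today, assumed 2.3%/yr growth, and the Stefan-Boltzmann linearisation dT = T*dF/(4F):

```python
import math

# Rough reconstruction of a Chaisson-style estimate, with my own round
# numbers (not his): ~18 TW primary energy use today, assumed 2.3%/yr
# growth, 288 K mean surface temperature, 240 W/m^2 absorbed solar flux.
T_SURF = 288.0         # K, mean surface temperature
ABSORBED = 240.0       # W/m^2, absorbed solar flux
EARTH_AREA = 5.1e14    # m^2, Earth's surface area
power_now = 18e12      # W, today's primary energy use (assumed)
growth = 0.023         # assumed fractional growth per year

# Stefan-Boltzmann linearisation: dT = T * dF / (4 F)
dF = 4.0 * 3.0 * ABSORBED / T_SURF     # extra flux needed for +3 K, W/m^2
power_needed = dF * EARTH_AREA         # total dissipation for +3 K, W
years = math.log(power_needed / power_now) / growth
print(f"+3 K needs {power_needed/1e12:.0f} TW of dissipation; "
      f"~{years:.0f} years away at {growth*100:.1f}%/yr growth")
```

    So the 1/6000 ratio quoted above is only today's snapshot: sustained exponential growth closes the factor of a few hundred within a couple of centuries, which is how a 3°C figure can emerge on the "around three centuries" timescale.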
  • A quick look suggests it makes the same mistakes others make about the radiant surface of the Earth.

    In some wavebands we are a high-albedo gas giant; that is, the Earth reflects those wavelength ranges directly off the top of the atmosphere.

    In some wavebands it's the physical earth, and seas that are the radiant surfaces, rather than the top of the atmosphere..

    In others (e.g. the CO2 window) the atmosphere has partial transmission, so the absorption of the solar radiation essentially happens lower in the atmosphere than the transmission out of it (the emission and absorption temperatures being different). That upper-atmosphere layer is where the CO2 'blanket' effects happen that hold in the Earth's heat.

    I've also yet to see whether the analysis is a whole-Earth (including core) equilibrium, or just some local surface non-equilibrium effect.