This is less useful than most people expected. Redwood has been struggling because the expected battery turnover is not occurring. EV batteries are lasting a long time, so they stay in the cars and aren't being recycled or reused in any quantity yet.
If EV batteries last 20+ years in EV's, it'll be > 2040 before there are significant numbers of EV batteries available to recycle or reuse.
A lot of the early EV battery life projections were based on the Gen 1 Nissan Leaf, which had a horrendous battery pack that combined a poor choice of chemistry, aggressive usage, and a complete lack of active cooling.
When EVs with good battery pack engineering started hitting the streets, they outperformed those early projections by a lot. And by now, it's getting clear that battery pack degradation isn't as much of a concern - with some of the better designs, like in early Teslas, losing only about 5-15% of their capacity over a decade of use.
I am a bit more concerned about batteries now than I was a year ago.
We had this article from Electrek [1] about battery issues in South Korea. When I asked my local electric maintenance shop [2, sorry for the FB link], they said they have started seeing the same issue in Model 3s and Ys in Canada as well. (They also said that it is too early to tell how common it will become.)
This may bode well for recycling since the issue is an imbalance, not the whole pack failing.
Tesla made powerwalls a product for a reason. They were supposed to come from outdated Tesla cars, but that never materialized. If it is materializing now, they already know what they are going to do.
Don't forget that the original Leaf pack was only 24 kWh. So if you assume a ~1000 full-equivalent-charge-cycles lifespan, then the large Gen2 62 kWh pack will live 2.5 times longer than an original 24 kWh pack. If you average 3.5 miles/kWh, the 24 kWh battery will be expected to last somewhere around 84,000 miles. While the 62 kWh pack will last for 217,000 miles.
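The throughput arithmetic above can be sketched directly (the ~1000 full-equivalent cycles and 3.5 mi/kWh are the comment's assumptions, not measured figures):

```python
# Rough sketch: expected pack lifetime in miles, assuming (as the comment
# does) ~1000 full-equivalent charge cycles and 3.5 miles/kWh on average.
def lifetime_miles(pack_kwh, cycles=1000, miles_per_kwh=3.5):
    """Total energy throughput over the pack's life, converted to miles."""
    return pack_kwh * cycles * miles_per_kwh

leaf_gen1 = lifetime_miles(24)   # original 24 kWh Leaf pack
leaf_gen2 = lifetime_miles(62)   # 62 kWh Gen 2 pack
print(leaf_gen1, leaf_gen2)      # 84000.0 vs 217000.0 miles
```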
It didn't just have a horrendous service life; it was designed for a set number of years of life, to be regularly replaced and repurposed for battery storage. Nissan had business schemes outlined for that with Leaf packs.
I think Tesla deserves credit for rethinking that model into chassis-life battery packs, with surplus production rather than recovered cells going to grid storage.
Especially considering that resale prices of Gen1 Leafs, milked for EV and renewables incentives, are atrociously low - destination-fee money. You can find fairly low-mileage ones with a battery pack good for maybe 100 yards on sale for a couple hundred dollars in some places. Even crashed wrecks of a Tesla cost magnitudes more.
Lithium batteries were quite expensive when it was initially released. NiMH was really the only option in town.
And with a lower energy density battery that's also heavier, adding a cooling system would have also added a bunch of weight to the already heavy car with a barely usable range of 100 miles.
Gen 2, however, had no excuses. They had every opportunity to add active cooling and they still decided to go with just air cooling.
Every generation of the production Nissan Leaf has used lithium batteries. AFAIK no modern (~post-2000) mass-produced (>10k units sold) EV has ever used NiMH or lead-acid batteries.
Edit: Checking Wikipedia to verify my information, I found out that Nissan actually sold a lithium-battery EV in 1997 to comply with the same 90s CARB zero-emissions vehicle mandate that gave us the GM EV-1: https://en.wikipedia.org/wiki/Nissan_R%27nessa#Nissan_Altra
EVs no, but I think some Toyota hybrids (which are of course not even PHEVs) still use NiMH. Toyota tends to be very tight-lipped about their batteries and their sizes (or rather, lack thereof).
Tends to be tight lipped??? It is in the catalog[1]! It is more that American consumers aren't tech obsessed than Toyota being reluctant to share.
Even just looking at online media reports[2][3] clearly sourced from the exact same press event, it is obvious that the US English equivalents are much lighter in content than the Japanese versions. They're putting the information out; no one's reading it. It's just the type of information that didn't drive clicks. The language barrier has an effect too, in that Toyota is a Japanese company and the US is an export market, but it's fundamentally the same phenomenon as citizen-facing government reports that never get read and are often imagined as being "hidden and withheld from public eyes" - a communication issue.
Leaf Gen 1 didn't have NiMH. It had a lithium-based battery chemistry, but some bastard offshoot of it. One that really didn't fare well under high current draw, or deep discharge, or high temperatures, or being looked at wrong.
On the used market you'll find absolutely cooked (literally) Leafs whose first life was in Arizona and barely have enough range to back out of the driveway.
> Gen 2, however, had no excuses. They had every opportunity to add active cooling and they still decided to go with just air cooling.
The Lizard pack in the later Nissan Leafs has held up surprisingly well. I have a 2015 that still gets 75 miles of range. I'm sure they thought it wasn't necessary and they probably had the actuarial numbers to justify it.
> Redwood has been struggling because the expected battery turnover is not occurring
Redwood pitched recycling. But its principal business was primary production. (Processed black mass is analogous to lithium ore.) They're struggling because demand for American-made batteries remains low.
"It’s the largest microgrid in North America and it’s the largest second-life energy storage site in the world. So that’s like you said at the top, it’s a 12-megawatt AC, 63-megawatt-hour grid supporting about 2 or 3 megawatts of data centers and run by solar. So all the energy comes from another 12 megawatts of solar."
Sure, so while not supplying power to a city, they are proving this is viable. Just because it's not a "turn off the coal plants now" moment doesn't mean this isn't a very good direction. Everyone has to start and grow. I don't understand the urge to shit on something because it's not an immediate solve. If these guys waited until 2040 to start the business, well, that'd just be dumb. It sounds like capacity will just continue to increase year over year, and maybe around 2040 there will be a huge spike. Doesn't seem like anything is wrong here.
Still have mine. Battery capacity is around 80% of the new capacity. I'm not planning on switching anytime soon as it's got plenty of range still. I'll probably swap the pack out when it hits 70% in the next 2 or 3 years.
I've been intrigued by used solar panels for sale, seems like you can get an amazing price for ones that are only lightly degraded. Is there a downside, or do you just mean that it isn't popular currently?
In addition to my sibling comment: the cost of the panels is a rather small fraction of the total cost of a typical installation. Most of that cost is labor, some regulatory requirements, and the inverter. Whether you pay a factor of 2 for the panels or not typically doesn't matter. In other words: reusing used panels will only ever save you a minuscule amount.
These days it’s a stack of microinverters. Which are not cheaper but do improve array efficiency outside of ideal conditions. But that’s another up-front cost.
The low cost of the modules themselves has led to the suggestion of cost-optimized DC-coupled PV systems being used to directly drive resistive heaters. The cost per unit of thermal energy in a cost-optimized, moderate-scale system (> residential, < utility scale) may be in the range of $3-5/GJ, very competitive with natural gas. Low-cost maximum power point trackers would be useful; inverters would not be needed.
Low cost modules allow one to do away with things like optimally tilted modules and single axis tracking. The modules can also be tightly packed, reducing mounting and wiring costs.
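A quick sanity check on that $3-5/GJ range. Every input here is an assumption for illustration - the installed cost of a stripped-down DC-only system, the capacity factor, and a fixed charge rate standing in for financing and lifetime:

```python
# Back-of-envelope check on the $3-5/GJ claim. All inputs are assumptions.
SECONDS_PER_YEAR = 3.156e7

def heat_cost_per_gj(capex_per_w=0.30, capacity_factor=0.20,
                     fixed_charge_rate=0.08):
    """Annualized capital cost divided by annual thermal output per watt."""
    annual_joules = 1.0 * capacity_factor * SECONDS_PER_YEAR  # J per W of PV
    annual_gj = annual_joules / 1e9
    return capex_per_w * fixed_charge_rate / annual_gj

print(round(heat_cost_per_gj(), 2))  # ~3.8 $/GJ with these assumptions
```

With those (arguably optimistic) numbers the result lands inside the quoted range; the sensitivity to capacity factor and financing is obviously large.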
Is it worth using heat pumps in this setup (in addition to resistive elements)? I understand they can't reach the absolute temperature of resistive heating, but from an efficiency POV for the first few tens of degrees they are much more efficient.
What's the proposed system design? For example, in January, I get about 9 hours of sunlight and have an average daily high of 25 F. I'm gonna need to store heat somehow or another.
The place I saw this most clearly described was in Standard Thermal's concept, which will store the heat in huge piles of dirt heated to 600 C. The thermal time constant of such piles can be many years.
The surface will always be only slightly hot. Heat will be stored inside, insulated by overlying dirt. Dirt isn't the best insulator by thickness, but it's a very good insulator by $.
I haven't seen pfdietz's proposed system design, but a so-called "sand battery," consisting of a box of sand with a heating element running through it, should work fine. You can PWM the heating element with a power MOSFET to keep it from overheating; you can measure its temperature with its own resistance, but also want additional thermocouple probes for the sand and to measure the surface of the box. A fan can blow air over or through the sand to control the output power within limits.
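The resistance-thermometry idea above can be sketched in a few lines. The temperature coefficient alpha and the control band are assumed values you would calibrate for the actual wire (nichrome's coefficient is notoriously small; plain steel wire is easier to read this way):

```python
# Minimal sketch: infer the heating element's temperature from its own
# resistance, then back off the PWM duty cycle near the limit. Alpha and
# the proportional band are assumptions to be calibrated.
def wire_temp(r_measured, r_ref, t_ref=20.0, alpha=0.004):
    """Invert R = R_ref * (1 + alpha * (T - T_ref)) for T (in Celsius)."""
    return t_ref + (r_measured / r_ref - 1.0) / alpha

def duty_cycle(temp, t_max=700.0, band=50.0):
    """Full power below (t_max - band), ramping linearly to zero at t_max."""
    return max(0.0, min(1.0, (t_max - temp) / band))

# Example: a wire that measured 10 ohms at 20 C now reads 32 ohms.
t = wire_temp(32.0, 10.0)
print(round(t), duty_cycle(t))  # ~570 C, still at full duty
```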
I'll work out some rough figures.
Let's say your house is pretty big and badly insulated, so we want an average of 5000 watts of heating around the clock with a time constant on the order of 10 hours, and we don't want our heating element to go over 700°. (Honest-to-God degrees, not those pathetic little Fahrenheit ones.) That way we don't have to deal with the ridiculous engineering issues Standard Thermal is battling. There's a thermal gradient through the sand down to room temperature (20°) at the surface. Suppose the sand is in the form of a flat slab with the heating element just heating the center of it, which is kind of a worst case for amount of sand needed but is clearly feasible. Then, when the element is running at a 100% duty cycle, the average sand temperature is 360°. Let's say we need to store about 40 hours of our 5000W. Quartz (cheap construction sand) is 0.73J/g/K, so our 720MJ at ΔT averaging 340K is 2900kg, a bit over a cubic meter of sand. This costs about US$100 depending mostly on delivery costs.
The time constant is mostly determined by the thickness of the sand (relative to its thermal diffusivity), although you can vary it with the fan. The heating element needs to be closely enough spaced that it can heat up the sand in the few hours that it's powered. In practice I am guessing that this will be about 100mm, so 1.5 cubic meters of sand can be in a box that's 200mm × 2.7m × 2.7m. You can probably build the box mostly out of 15m² of ceramic tiles, deducting their thermal mass from the sand required. In theory thin drywall should be fine instead of ceramic if your fan never breaks, but a fan failure could let the surface get hot enough to damage drywall. Or portland cement, although lime or calcium aluminate cement should be fine. You can use the cement to support the ceramic tiles on an angle iron frame and grout between them if necessary.
7.5m² of central plane with wires 100mm apart requires roughly 27 2.7m wires, 75m, probably dozens of broken hair dryers if you want to recycle nichrome, though I suspect that at 700° you could just use baling wire, especially if you mix in a little charcoal with the sand to maintain a reducing atmosphere in the sand pore spaces. (But then if it gets wet you could get carbon monoxide until you dry it out.) We're going to be dumping the whole 720MJ thermal charge in in under 9 hours, say 5 hours when the sunshine is at its peak, so we're talking about maybe 40kW peak power here. This is 533 watts per meter of wire, which is an extremely reasonable number for a wire heating element, even a fairly fine wire in air without forced-air cooling.
If we believe https://www.nature.com/articles/s41598-025-93054-w/tables/1 the thermal conductivity of dry sand ranges from 0.18 W/m/K to 0.34 W/m/K. So if we have a linear thermal gradient from our peak design temperature of 700° to 20° over 100mm, which is 6800K/m, we should get a heat flux of 1200–2300W/m² over our 15m² of ceramic tiles, so at least 18kW, which is more than we need, but only about 3×, so 200mm thickness is in the ballpark even without air blowing through the sand itself. (As the core temperature falls, the heat gradient also falls, and so does the heat flux. 720MJ/18kW I think gives us our time constant, and that works out to 11 hours, but it isn't exactly an exponential decay.) Maybe 350mm would be better, with corresponding increases in heating-element spacing and decreases in wire length and box surface area and footprint.
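The sizing arithmetic above, re-run as a script with the same assumptions (5 kW average load, ~40 h of storage, quartz sand, 700 °C core with a linear gradient to 20 °C):

```python
# Re-running the sand-battery sizing with the figures from the text.
c_sand = 0.73                      # J/g/K, quartz sand
e_stored = 5_000 * 40 * 3600       # 40 h at 5 kW = 720 MJ
dt_avg = (700 + 20) / 2 - 20       # average delta-T across the slab, 340 K
mass_kg = e_stored / (c_sand * dt_avg) / 1000
print(round(mass_kg))              # ~2900 kg of sand

# Heating element: 7.5 m2 central plane, wires 100 mm apart, charged in ~5 h.
wire_m = 7.5 / 0.1                 # ~75 m of wire
peak_w = e_stored / (5 * 3600)     # 40 kW peak charge power
print(round(peak_w / wire_m))      # ~533 W per metre of wire

# Discharge: conductivity 0.18 W/m/K (low end), 680 K over 100 mm of sand.
gradient = (700 - 20) / 0.1        # 6800 K/m
flux_lo = 0.18 * gradient * 15     # W over 15 m2 of surface
print(round(flux_lo / 1000, 1), round(e_stored / flux_lo / 3600))  # ~18 kW, ~11 h
```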
To limit heat loss when the fan is off, instead of a single humongous wall, you can split the beast into 3–6 parallel walls with a little airspace between them, so they're radiating their heat at each other instead of you, and cement some aluminum foil on the outside surfaces to reduce infrared emissivity. The amount of air the fan blows between the walls can then regulate the heat output over at least an order of magnitude. (In the summer you'll probably want to leave it off.)
The sand, baling wire, aluminum foil, lime cement, angle irons, charcoal, thermocouples, power MOSFETs, microcontroller, fans, and ceramic tiles all together might work out to US$500. But the 40kW of solar panels required are about US$4000 wholesale, before you screw them to your siding or whatever. At US prices they'd apparently be US$10k.
Sand batteries have a much higher cost per unit of energy storage capacity, so they are in more direct competition with batteries for shorter term storage. It's hard to compete with a storage material you just dig out of a local hole. The economics pushes toward crude and very cheap.
Having said that: a good design for sand batteries would use insulated silos, pushing/dropping sand into a fluidized bed heat exchanger where some heat transfer gas is intimately mixed with it. This is the NREL concept that Babcock and Wilcox was (still is?) exploring for grid storage, with a round trip efficiency back to electricity of 54% (estimated) using a gas turbine. Having a separate heat exchanger means the silos don't have to be plumbed for the heat exchange fluid or have to contain its pressure.
Getting the sand back to the top (where it will be heated and dropping into silos) is a problem that could be solved with Olds Elevators, which were only recently invented (amazingly).
How much of a difference does it actually make in terms of the all-inclusive price of installation (e.g. panels, inverters, mounting hardware, and labor)?
(Asking because I genuinely don't know, not because I have a specific answer in mind.)
Labor is by far the top cost. But I'm intrigued by the economics of small setups paired with something like a <5kWh battery. And for something like that, where you literally just throw 4-6 panels out, you can just brute-force it by buying more panels instead of optimizing angles. Basically a slightly beefier version of a European balcony setup.
I think they were referring to the fact that the chief reason there is not large-scale PV panel recycling is that very few panels have ever been retired. It turns out that short of physical destruction by hail etc a PV panel does not degrade beyond economic usefulness simply by being out in the sun. In fact some panels actually get more powerful. The surprising-to-some conclusion of NREL's PV Lifetime Project is that the economic lifetime of a PV panel is basically forever.
The typical EV industry trade show has a small handful of cars and a vast amount of tangential businesses including many finance options, a vast amount of home charger gizmos, fast charging gizmos, electricity suppliers and the companies promising grid-scale storage, either from actively used cars or recycled EV batteries. There is a vast constellation of this stuff, with specialist insurance companies that nobody really asked for outnumbering the car brands or even e-bike brands present.
In time there will be consolidation. This constellation of EV startup bottom-feeders will be decimated along with the 'excuses' to not make money.
I don't think the problem is that EV batteries are lasting longer; it is just that the EV market from before the Model 3 came along is minuscule. Hence not many second-hand batteries to recycle.
As for EV batteries and their availability, when was the last time you saw an OG Tesla Model S with the fake grill? Those cars used to be everywhere, but where are they now? The German EVs that came out to compete, for example, Taycan and eTron, those things are not going to last the distance since the repairs cost a fortune and the parts supply is limited.
All considered, there will come a time before 2040+ when there are large quantities of these electric car batteries to upcycle, by which time the EV business will be consolidated with only a few players.
If there was money in recycling cars then every auto manufacturer would be in on it.
There is one constant to all these conversations, and that is that Silicon Valley tech dudes are grossly misinformed about the lifecycle of things. Solar panels don't wear out; batteries don't wear out as fast as they used to. This is evidenced both by their undertaking weird dead-end startup ideas and by their susceptibility to propaganda about the supposed downsides of solar energy and batteries.
"most people" even now are just parroting dumb FUD they read on facebook.
You really shouldn't give any weight to the opinions of laypeople on topics that are as heavily propagandized and politically charged as renewable energy.
Prius Plugin 2015 (last year of that model) - full charge/discharge at least 3-4 times a week, currently still at a bit more than 80% of capacity (granted, the battery seems somewhat overbuilt, yet it normally does 10-15C, which is a much tougher mode than a pure EV, where 2-3C is usually enough and only high-end Teslas and the like do 5-6C). There has been large continuous improvement in lithium batteries over the last couple decades.
From my model airplane experience, I believe it's "capacity per hour". So, a 1Ah battery discharged at 1c would mean 1 amp; discharged at 10c would be 10 amps. The higher the C, the harder the batteries are being used.
1C current fully discharges a battery in 1 hour. Thus a 4 kWh battery running a 60 kW motor means 15C current, and it would discharge the battery in 4 minutes (in a very simplified linear model).
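The definition above reduces to one division - power over capacity - with discharge time falling out as its reciprocal (same simplified linear model):

```python
# C-rate sketch: 1C is the current that empties the pack in one hour,
# so power / capacity gives the C-rate in this simplified linear model.
def c_rate(power_kw, capacity_kwh):
    return power_kw / capacity_kwh

c = c_rate(60, 4)        # 4 kWh pack driving a 60 kW motor
print(c, 60 / c)         # 15.0 C; empties in 4.0 minutes at full power
```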
There's currently no technological path for fusion to be cheaper than fission. It would require a technological breakthrough that we have not yet imagined.
And already, solar plus storage is cheaper than new nuclear. And solar and storage are getting cheaper at a tremendous rate.
It's hard to imagine a scenario where fusion could ever catch up to solar and storage technology. It may be useful in places with poor solar resources, like fission is now, but that's a very very long time from now.
The low energy future that was envisioned is not happening.
The AI arms race, which has become an actual arms race in the war in Ukraine, needs endless energy at all times of day.
China is already winning the AI cold war because it adds more capacity to its grid a year than Germany has in a century.
If we keep going with agrarian methods of energy production, don't be surprised if we suffer the same fate as the agrarian societies of the 19th century. Any country that doesn't have the capability to train and build drones en masse won't be a country for long.
You have that exactly backwards: solar + storage is what will give us energy abundance at less money than we could ever imagine from nuclear fission or fusion.
China is winning the AI Cold war because it's adding solar, storage, and wind at orders of magnitude more than nuclear.
I'm not sure who's doing your supposed "envisioning" but there is no vision for cheap abundant energy from fusion. Solar and storage deliver it today, fusion only delivers it in sci fi books.
Nuclear is 20th century technology that does not fit with a highly automated future. With high levels of automation, construction is super expensive. You want to spend your expensive construction labor on building factories, not individual power generation sites.
Building factories for solar and storage lets them scale to a degree that nuclear could never scale. Nuclear has basically no way of catching up.
China has been building out nuclear capacity at 5% a year for 25 years.
Solar and wind capacity has shot through the roof in the last five years because they can't sell hardware to the West any more.
The other big item is hydro power, which China has a ton of untapped potential for. Unfortunately for the West, every good river has already been dammed, so we can't follow them there.
> Solar and wind capacity had shot through the roof in the last five years because they can't sell hardware to the west any more.
"can't sell hardware??" hah! I've never heard that weird made-up justification, where did you pick it up from?
China installed 277GW of solar in 2024, capacity factor corrected that's 55.4 GW of solar power. That's equivalent to the entire amount of nuclear that China has ever built. One year versus all time. And then in the first half of 2025, China installed another 212GW of solar. In six months.
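The capacity-factor correction in that comparison is a single multiplication; the ~20% capacity factor is the assumption implied by the 55.4 GW figure quoted:

```python
# Nameplate solar derated by an assumed ~20% capacity factor, to compare
# average output against nuclear nameplate capacity.
solar_2024_gw = 277
capacity_factor = 0.20   # assumed; implied by the figures in the comment
avg_gw = solar_2024_gw * capacity_factor
print(round(avg_gw, 1))  # 55.4 GW of average output from one year's installs
```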
Nuclear is a footnote compared to the planned deployment of solar and wind and storage in China.
Anybody who's serious about energy is deploying massive amounts of solar, storage, and some wind. Some people that are slow to adapt are still building gas or coal, but these will be stranded assets far before their end of life. Nuclear fusion and fission are meme technologies, unable to compete with the scale and scope that batteries and solar deliver every day. This mismatch grows by the month.
I blame these for the unquestioned belief that fusion is desirable. It's a trope because it enables stories to be told and because readers became used to seeing it, not because science fiction has a good track record on such things.
The fact that the volumetric power density of ARC is 40x worse than a PWR (and ITER, 400x worse!) should tell one that DT fusion at least is unlikely to be cheap.
With continued progress down the experience curve, PV will reach the point where resistive heat is cheaper than burning natural gas at the Henry Hub price (which doesn't include the cost of getting gas through pipelines and distribution to customers.) And remember cheap natural gas was what destroyed the last nuclear renaissance in the US.
> It would require a technological breakthrough that we have not yet imagined.
Maybe, but not necessarily. The necessary breakthrough might have been high-temperature superconducting magnets, in which case not only has it been imagined, but it has already occurred, and we're just waiting for the engineering atop that breakthrough to progress enough to demonstrate a working prototype (the magnets have been demonstrated but a complete reactor using them hasn't yet).
Or it might be that the attempts at building such a prototype don't pan out, and some other breakthrough is indeed needed. It'll probably be a couple of years until we know for sure, but at this point I don't think there's enough data to say one way or the other.
> And already, solar plus storage is cheaper than new nuclear.
It depends how much storage you mean. If you're only worried about sub-24h load-shifting (like, enough to handle a day/night cycle on a sunny day), this is certainly true. If you care about having enough to cover for extended bad weather, or worse yet, for seasonal load-shifting (banking power in the summer to cover the winter), the economics of solar plus storage remain abysmal: the additional batteries you need cost just as much as the ones you needed for daily coverage, but get cycled way less and so are much harder to pay for. If the plan is to use solar and storage for _all generation_, though, that's the number that matters. Comparing LCoE of solar plus daily storage with the LCoE of fixed-firm or on-demand generation is apples-and-oranges.
I think solar plus storage absolutely has the potential to get there, but that too will likely require fundamental breakthroughs (probably in the form of much cheaper storage: perhaps something like Form Energy's iron-air batteries).
In the end we're still making steam and running a turbine. Just the steam turbine part of the power plant has a hard time competing with solar in sunny locations.
High temperature superconducting magnets are not a panacea for the problems with DT fusion. Those issues follow from limits on power/area at the first wall, and the needed thickness of the first wall; these ensure DT reactors will have low volumetric power density, regardless of the confinement scheme used.
With HTSC magnets, a tokamak much smaller than ITER could be built, but ITER is so horrifically bad that one can be much better than it and still be impractical.
And these are not new issues, they've been known for more than 40 years, but never addressed. From the 1983 Led
> But even though radiation damage rates and heat transfer requirements are much more severe in a fusion reactor, the power density is only one-tenth as large. This is a strong indication that fusion would be substantially more expensive than fission because, to put it simply, greater effort would be required to produce less power.
In terms of cost of materials to build a reactor, sure, that seems right. But most of the cost of fission is dealing with its regulatory burden, and fusion seems on track to largely avoid the worst of that. It seems conceivable that it ends up being cheaper for entirely political/bureaucratic reasons.
Regulatory costs and waste disposal are not significant cost centers for nuclear, at least as far as I can tell from any cost breakdowns.
One doesn't need super-high-quality welding and concrete pours because of regulations so much as because of the basic desire to have a properly engineered solution that lasts long enough to avoid costly repairs.
Take for example this recent analysis on how to make the AP1000 competitive:
There are no regulatory changes proposed because nobody has thought of a way that regulations are the cost drivers. Yet there's still a path to competitive energy costs by focusing hard on construction costs.
Similarly, reactors under completely different regimes such as the EPR are still facing exactly the same construction cost overruns as in the rest of the developed world.
If regulations are a cost driver, let's hear how to change them in a way that drives down build cost, and by how much. Let's say we get rid of ALARA and jack up acceptable radiation levels to the earliest ones established. What would that do to the cost? I have a feeling not much at all, but would like to see a serious proposal.
Relaxed regulatory burden doesn't seem to be making fission competitive in China; renewables are greatly overwhelming it now, particularly solar.
We might ask why regulations are so putatively damaging to nuclear, when they aren't to civil aviation. One possibility is that aircraft are simply easier to retrofit when design flaws are found. If there's a problem with welding in a nuclear plant (for example) it's extremely difficult to repair. Witness the fiasco of Flamanville 3 in France, the EPR plant that went many times over budget.
What would this imply for fusion? Nothing good. A fusion reactor is very complex, and any design flaw in the hot part will be extremely difficult to fix, as no hands on access will be allowed after the thing has started operation, due to induced radioactivity. This includes design or manufacturing flaws that cause mere operations problems, like leaks in cooling channels, not just flaws that might present public safety risks (if any could exist.) The operator will view a smaller problem that renders their plant unusable nearly as bad as a larger problem that also threatens the public.
I was struck by a recent analysis of deterioration of the tritium breeding blanket that just went ahead and assumed there were no initial cracks in the welded structure more than a certain very small size. Guaranteeing quality of all the welds in a very large complex fusion reactor, an order of magnitude or more larger than a fission reactor of the same power output, sounds like a recipe for extreme cost.
Oh for sure, I'm not claiming that CFS (or Tokamak Energy or Type One or whoever else) will for sure succeed, or if they do, that they've already solved all the problems that will need solving to do so. My only assertion/prediction is that I think if they end up succeeding, when future historians look back and write the history of this energy revolution or whatnot, HTSC magnets will turn out to have been the key breakthrough that made it possible.
Fission is expensive for regulation reasons more than technological reasons, so if fusion doesn't face the same barriers then it could be cheaper than fission.
But I agree that it doesn't look like fusion is going to be cheap any time soon.
Fission is also expensive for several mundane reasons, like the fact that massive steam turbines are expensive, and because any large construction project in the West is expensive. Neither fusion nor regulatory reform are going to solve those.
The steam generator that the fusion reactor connects to might be more expensive than solar at this point. That would hold even if fusion cost nothing and had infinite fuel; there would be no customers for its energy on a sunny afternoon.
Personally speaking, having just bought an Ioniq 5 and installed solar at home, what I see as the near-future improvement is adding V2L functionality, which I can hook up to the generator input of my solar inverter, essentially adding another 60kWh buffer to my grid storage.
Considering how expensive residential batteries are and how quickly EVs depreciate, I think soon it'll be cheaper to get a used EV as a cheap source of cells that accidentally happens to be able to drive itself around.
Imo V2G and V2H are unnecessary and add too much complication. I think for the future, solar inverters already have the necessary hardware and certifications to take power and safely connect to the grid - something that requires different hardware and standards compliance in basically every country (yes, even within the EU).
Residential batteries are not that expensive anymore, at least not all of them.
That's a misconception I also held until a few years ago ;-)
My first 14.3 kWh pack cost about 2800$ DDP from China, delivered 03/2023. For that one I did calculate how long it took for amortization, which I projected at about 5 years.
The second, identical pack was delivered 08/2024 and cost 2000$ DDP. Since we got an EV that's drawing about 14kWh per day, I didn't bother doing the math and just ordered it.
These are 280Ah 16S 51.6V packs, based on the EVE LF280K. In an enclosure, with a BMS (Seplos, 200A) and a dedicated balancer. They are good for 6000 cycles at 140A or less [each]. Mind these were both part of small bulk orders - I think each time we ordered 6 to 8 of these, which reduced shipping costs.
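The per-kWh cost of those packs, plus a rough payback check. The assumed price spread per cycle is hypothetical, chosen only to illustrate the kind of math behind the "~5 years" amortization mentioned above:

```python
# Cost per kWh of the DIY packs described (16S 51.6 V nominal x 280 Ah gives
# ~14.4 kWh; the commenter quotes 14.3 kWh). Price spread is an assumption.
pack_kwh = 51.6 * 280 / 1000
cost_2023, cost_2024 = 2800, 2000
print(round(cost_2023 / pack_kwh), round(cost_2024 / pack_kwh))  # ~$194 vs ~$138 per kWh

# Payback if one full cycle/day saves an assumed $0.10/kWh vs grid rates:
daily_saving = pack_kwh * 0.10
print(round(cost_2023 / daily_saving / 365, 1))  # ~5.3 years at 2023 pricing
```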
My new batteries were about 250 EUR/kWh - my 10 kWh unit cost 2500 EUR. Scaling that up against a decent used 5-year-old EV - you can have one for 15k with 60+ kWh of battery - I'd say it's at a very similar price.
In the US, V2L limits your ability to output power from the car to about 1500 W. It's not going to power your house as more than a stopgap, even if you do have supplementary house batteries. V2H/V2G justify their complexity by solving that problem, along with all the ancillary grid benefits.
Not sure if that's the case - however, doing V2L requires the manufacturer to add an inverter to the car, and making it powerful probably adds extra cost most customers wouldn't pay. I just looked it up and my Ioniq can only do about 2kW sustained - but since this charges the house battery, that's enough; idle load is just a couple hundred watts.
If you have solar panels or time-of-use electrical rates, you charge the car when power is cheap/free, and spend stored power when the grid costs are high. During a protracted outage, maybe you drive the car to a fast charger.
A typical house averages less than 1500W. And most of the higher usage overlaps the sun being out. So if you have supplemental house batteries to handle bursts then 1500W of V2L can go a very long way.
1. It does. The only issue is that the car can only output about 2kW sustained (this is a model limitation). That's fine since I have batteries in the house.
2. Tbh I'm not super familiar with V2G/V2H, other than it being super expensive for both the wall box and the car (only high-end models tend to support it).
3. No idea, but it's not a high end feature, I wouldn't count on any inverter to just have it, but if you're looking to buy one that does, I don't think you'll be breaking the bank.
Imo the future is for solar inverters to offer a dedicated DC car charger port, as once again all the hardware is already in there.
Thanks for the answers. I used to work for an EV smart charging company (Kaluza) that ran a V2G trial. V2G was a financial success for the users, but I always thought the wall box was a potential blocker. I don't think the 2 kW output is a big issue, as the customer could still reduce their load when required, but the elimination of a wall box makes onboarding much easier.
As long as the inverter can also provide charging this definitely has some potential.
> Can "second life" EV batteries work as grid-scale energy storage?
Yes
Is it profitable? Probably not.
Looking at the price for grid battery storage, it's dropping precipitously. The cost isn't so much in the batteries themselves; it's in packaging, placing, and then controlling them.
For example, if you want a 200 MWh / 100 MW storage site, you'll need to place it and join it to the grid - all doable. Then you need the switchgear to make it work as you want it to.
For day-ahead, 30-minute trading, that's fairly simple.
For grid stabilisation, that's a bit harder: you need to be able to lead/match/lag the grid frequency by n degrees instantaneously, which is trivial at a few kW, much harder at 100 MW.
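The lead/lag bit is the classic power-angle relation: active power transferred across a coupling reactance scales with the sine of the phase angle between inverter and grid. A minimal sketch (voltage and reactance are assumed values, not from any real site):

```python
import math

# Active power across a reactance X between an inverter at voltage V1,
# running a phase angle delta ahead of the grid at voltage V2:
#   P = (V1 * V2 / X) * sin(delta)
# Leading the grid (delta > 0) exports power; lagging imports it.
V1 = V2 = 400.0  # volts (assumed)
X = 0.5          # ohms, coupling reactance (assumed)

def power_at_angle(delta_deg: float) -> float:
    """Return active power in watts for a given phase angle in degrees."""
    return (V1 * V2 / X) * math.sin(math.radians(delta_deg))

for delta in (1, 5, 10):
    print(f"delta={delta:>2} deg -> P={power_at_angle(delta) / 1000:.1f} kW")
```

At 100 MW the same few degrees of angle correspond to enormous power swings, which is why the control problem gets much harder at scale.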
Having worked extensively with battery systems, I think the grid storage potential of second-life EV batteries is more complex than it appears. We found that typical EV batteries retain 70-80% capacity after 8-10 years of vehicle use, but the real challenge is standardization and integration. Different manufacturers use vastly different battery management systems (BMS) and cell configurations - a Tesla pack is fundamentally different from a Nissan Leaf pack.
The economics are interesting though. New grid storage batteries cost around $200-300/kWh, while second-life EV batteries can be acquired for $50-100/kWh. However, you need to factor in significant integration costs (~$50-75/kWh) to build compatible BMS systems and thermal management. We also found cycle life degrades about 20% faster in repurposed packs compared to new ones, likely due to accumulated stress patterns from automotive use.
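Putting those figures together (midpoints of the ranges quoted above, so this is a napkin sketch rather than real project economics):

```python
# Midpoints of the $/kWh ranges from the comment above.
new_cost = (200 + 300) / 2           # new grid storage, $/kWh
secondlife_acquire = (50 + 100) / 2  # second-life pack acquisition, $/kWh
integration = (50 + 75) / 2          # BMS + thermal integration, $/kWh
cycle_life_penalty = 0.20            # second-life packs wear ~20% faster

secondlife_total = secondlife_acquire + integration
# Normalize by relative cycle life to compare lifetime throughput costs.
new_effective = new_cost / 1.0
secondlife_effective = secondlife_total / (1.0 - cycle_life_penalty)
print(f"new: ${new_effective:.0f}, second-life: ${secondlife_effective:.0f} "
      "per kWh of (relative) lifetime capacity")
```

On these numbers second-life still comes out ahead (~$172 vs ~$250), but the margin is thin enough that the integration and standardization headaches can easily eat it.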
Has anyone here successfully integrated mixed second-life batteries at scale? I'm particularly curious about how you handled thermal management across different pack designs while maintaining safe operating parameters.
>So top of the list for us, of course, designing this thing is safety.
Funny issue I learned after talking to a founder at a similar company: although the battery packs were certified safe for cars (passing crash tests, wild heat differences from AK to AZ, people sitting on top of the battery packs in the car)
... the founder had issues re-certifying the batteries for safe use in a static location for grid storage.
The certification process treated his company like the batteries were made from scratch even though they used the same BMS/coolant lines/etc. already proven and tested.
It's clear you still need strong safety regulations and practices in the rare case there's an event, but the founder noted the grid storage industry regulations were adding redundant safety testing and slowing down adoption. The founder also added it's difficult to compete on cost even with effectively free used EV batteries in this startup space of grid storage against the low cost of Chinese made grid-specific batteries due to the added testing + custom hardware + space constraints and other items. (Caveat: I didn't fact check any of their statements)
We don't need EV batteries for this. We just need cheap enough LiFePO4 so we're not burning more shit down. Prismatics from China are a start, salt batteries showing some promise next.
In an ice storm and cold cloudy snap that sweeps the country, NO lithium batteries will save the grid. I'm weary of this tunnel vision of the absurd. The only 'storage' that works is to pump water uphill with 'surplus' energy, and there is not and never will be a surplus. And these evaporation tanks are on a scale that makes ecologists remember countless horror stories (e.g., Glen Canyon Dam). And of course there's always "mfft!" (another scheme with no numbers behind it, so it's credible because I'm talking about it).
The last time there was anything rational on the table was Perry under the Trump administration's 30-day rule proposal ( https://www.nucnet.org/news/nuclear-is-vital-to-us-national-... ). It gave a hard industry incentive to any energy supplier who could have 30 days' fuel on site. This means nuclear and coal. This was no gimmick; it was the first time anyone faced reality about national security as related to energy for the grid and survival. And now, AI datacenter yadda yadda, we're also talking about the luxury of keeping schools heated in winter. Adverse weather even for a week is grid-down game over for wind and solar. Proven natural gas reserves are ~300 years, joy! As soon as they get around to sending pregnant whales across the ocean (losing ~20% of the gas-energy in cooling) it may even last 50 years! Before we have to go to endless war again.
Without being a battery chemistry expert: why do these battery packs become unusable for an EV yet can still be useful for energy storage? They keep saying that below 80% capacity a pack is considered done for an EV, but that's still a lot of life. Is it that grid storage is more of a constant drain while an EV does lots of hard pulls (for lack of better wording)? In an EV, the battery cannot provide the higher voltage being requested within rating, but the grid is never demanding peak performance?
It's not just about capacity (80% is still a lot), it's that degraded batteries lose their ability to deliver high current under load—so acceleration suffers and voltage sags under hard pulls. For grid storage, you're doing slow, steady charge/discharge cycles over hours, so the same battery that can't handle aggressive driving anymore works perfectly fine. Plus, grid storage has virtually unlimited space and no range anxiety, so if you need 25% more packs to hit your capacity target, you just stack them in a warehouse where real estate is cheap.
Also, batteries degrade faster over time once they start to degrade, because they need more frequent charging. Their internal resistance increases, and that promotes heat buildup during fast charging/discharging - another thing that promotes degradation. Slow charge/discharge cycles also help with heat management.
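The internal-resistance point is just Joule heating: waste heat scales with I²R, so doubling resistance doubles the heat, and quadrupling the current (EV-style pulls vs grid-style trickle) gives sixteen times the heat. A toy illustration, with the resistance values being assumptions for a large prismatic cell:

```python
# I^2 * R waste heat per cell, for assumed internal resistances.
capacity_ah = 280.0
resistances = {"new": 0.0005, "aged": 0.0010}  # ohms (assumed, aged ~2x)

for c_rate in (0.5, 2.0):  # slow grid-style cycling vs hard EV-style draw
    current_a = capacity_ah * c_rate
    for label, r_ohm in resistances.items():
        heat_w = current_a ** 2 * r_ohm
        print(f"{c_rate}C, {label} cell: {heat_w:.0f} W waste heat")
```

Which is why the same aged cell that overheats under a hard pull can cycle happily at grid-storage rates.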
They claim to have taken the Moss Landing fire into account with how they are placing their batteries. We won't know if they've really solved the problem or not until their first battery pack experiences a runaway thermal event.
Space and weight are serious constraints in the car space, but not such a big deal on the side of a house. That’s how they retain their usefulness.
80% could indeed be plenty of usable life for your EV use cases, but it strongly depends on usage patterns. More degradation means more trips to the charger on a road trip. It means trips that you’d regularly make just charging at home at the end of day now require you to plug in at the destination too. It means more range anxiety as a whole.
I think car cells will be much more useful if they are packaged as replacement batteries for all the various battery powered tools, ebikes, etc.
There's a consumer profit margin to absorb the repackaging and teardown.
Maybe for home grid batteries they'll work too. Again, a consumer margin.
Sodium Ion and other grid-specific storage will simply be too cheap for secondhand EV batteries to compete. And the retiring cells won't be any better in density and will be less safe than the higher density sodium ion and LFP that is hitting the market.
Not second life, but first life. All EVs and charging stations should be reversible. In a world where fossil fuels cost their true value (~10x as much) and people still drive this would be a necessity for electricity generation
Seems like the market is going the hybrid route. It's kind of easy to see why: best of both worlds. Some BYD hybrids have crazy ranges, like 1500 km on a tank of gas. The more practical car is winning. They put a much smaller battery in these, for fast charging and the daily commute range. And you have gas for longer trips. Maybe smaller batteries would be better for grid-scale storage too, if they're lighter and easier to handle.
I don't have first-hand experience, but these guys have run an EV repair shop for a while and also do hybrids; their articles always offer lots of insight.
Short rundown:
- micro/mild hybrids are useless: batteries too small, electric motors too small to be the sole source of power, so the contribution to emission reduction is very small; batteries tend to fail early because they're very small
- full hybrids have bigger batteries and electric motors large enough to run in pure EV mode, but you still rely on the ICE engine for all the energy, so there's no ability to charge at home or save on gas
- plug-in hybrids are full hybrids, but you can charge them externally; according to many studies, real-world emissions are much higher than declared, because people simply don't charge them at home and run on ICE the whole time
In all these types of hybrids the batteries are smaller than in pure EVs, so they cycle faster and degrade faster. You're carrying two drivetrains all the time, with the added weight, and one of them has plenty of maintenance items. So they're not drop-in replacements.
From what I've seen from EVClinic above, many manufacturers use custom pouch cells, not cylindrical modules like the more advanced pure EVs, so you can't repair an individual failed cell. That means a full pack replacement. For many manufacturers you can't order replacement parts for the electric drivetrain, and if you can, they cost a huge chunk of the car's value.
So all in all if everything's well, you're good. If something goes wrong, be prepared to spend the same as you would spend for a battery replacement of a pure EV, or even more.
It's not like reliable gas cars ever had substantial maintenance problems in the gas part. So removing the gas part didn't do much in practice.
People do/did have frustrations with gas car mannerisms and mental approachability - like, everything was written in a mix of translated foreign-language documents and borderline insane gearhead jargon. That led them to imagine that removing the gas part would drastically change the industry in their favor.
But, in the end, gas cars are good with regular maintenance for something like 100k miles over 8 years, so I wouldn't know what consumer product was more reliable than a gas car in the first place.
Not that I'm aware of. I've heard that many hybrids actually require less maintenance - for instance, the car can use electric power for hard acceleration instead of stressing the engine, so oil tends to last longer, and regenerative braking causes the friction brakes to wear out more slowly.
Eh, my PHEV has a 2 year oil change interval, which is longer than my ICE only cars. You should probably bring in your EV every 2 years to get things looked at too.
The engine in a hybrid should live an easier life compared to an ICE. No extended idle, mostly running in the power band, etc. There are lots of different ways to set up the hybrid system, but typically, rather than a small starter motor, you have a larger motor/generator that also starts the engine; it's less likely to get worn out, because it's built for continuous use.
My PHEV has a 'Toyota Synergy'-style 'e-CVT' which eliminates gear selection and should be very low maintenance (although mine had to be replaced under a service bulletin due to bearing failure caused by a manufacturing error) - again, nicer than an ICE. But some hybrids have a more traditional transmission.
Certainly, you can do ICE only or EV only, but there's a lot of room to use the ICE for things it's good for, and the EV for things it's good for, and blend where there's overlap.
Ford Escape? I have a friend that needed the transmission on his 2023 PHEV replaced under warranty... no service bulletin, but mechanics caught a manufacturing error at a regular service. Hopeful my hybrid Maverick doesn't have similar problems.
That two-year oil change cycle is the minimum required to not void the warranty.
It shouldn’t be taken as the optimal interval to maximise engine life.
Of course, modern fully synthetic engine oils are longer lasting, and I believe the newer Toyotas, at least the hybrids anyway, have electric oil pumps, and use very thin engine oil to make sure the engine is well lubricated at startup.
It's possible it might actually be more reliable long term, once the technology matures. For example, in cold weather the gas engine might heat the battery for better battery performance, maybe even extending its life if it prevents it from being drawn down too much. The gas engine would also likely last longer, since it's not used for daily commutes.
"In many PHEV systems, there are different modes:
Electric mode (EV mode): The vehicle runs purely on the electric motor(s) and battery until the battery depletes to some extent.
Hybrid/Parallel mode: Both the petrol engine and electric motor(s) work together to drive the wheels, especially under high load, higher speeds or when battery is low.
Series mode (in some designs): The petrol engine acts only as a generator to charge the battery or power the electric motor(s), and the wheels are driven by the electric motor(s).
For the BYD Leopard 5 (and many BYD PHEVs) the petrol engine can drive the wheels (i.e., it is not purely a generator). It is part of the drive system, especially when high power or long range is needed.
At the same time, it likely can assist with charging the battery or maintaining battery state of charge (SOC) when needed (for example, to keep the battery at some reserve level or in "save" mode). User reports show that the petrol engine will kick in to support the electric system, charge the battery, or assist the drive under certain conditions" - Ithy
Betteridge's law of headlines finally fails? TL;DR: Yes, but you can also make it 'not work' if you choose to politicize the tech solution to the energy problem.
This is less useful than most people expected. Redwood has been struggling because the expected battery turnover is not occurring. EV batteries are lasting a long time, so they stay in the car and are not being recycled or reused in any quantity yet.
If EV batteries last 20+ years in EVs, it'll be past 2040 before there are significant numbers of EV batteries available to recycle or reuse.
https://www.geotab.com/blog/ev-battery-health/
A lot of the early EV battery life projections were based on the Nissan Leaf Gen 1, which had a horrendous battery pack that combined a poor choice of chemistry, aggressive usage, and a complete lack of active cooling.
When EVs with good battery pack engineering started hitting the streets, they outperformed those early projections by a lot. And by now, it's getting clear that battery pack degradation isn't as much of a concern - with some of the better designs, like in early Teslas, losing about 5-15% of their capacity over a decade of use.
I am a bit more concerned about batteries now than I was a year ago.
We had this article from Electrek [1] about battery issues in South Korea. When I asked my local electric maintenance shop [2, sorry for the FB link], they said they have started seeing the same issue in Model 3s and Ys in Canada as well. (They also said that it is too early to tell how common it will become.)
This may bode well for recycling, since the issue is an imbalance, not the whole pack failing.
[1] https://electrek.co/2025/10/14/tesla-is-at-risk-of-lossing-s...
[2] https://www.facebook.com/groups/albertaEV/posts/248558844207...
Tesla made Powerwalls a product for a reason. They were supposed to be built from retired Tesla car batteries, but that never materialized. If it's materializing now, they already know what they're going to do.
I would be more concerned if the source were anyone but Electrek~. Their vendetta against Tesla has forfeited all their credibility on Tesla news.
"many of these vehicles are now out of warranty, as they sometimes exceed the maximum mileage"
They have good numbers for the number of affected vehicles, but the best they can do for out-of-warranty stats is "many" and "sometimes". Convenient.
~To be fair this applies to a lot of popular tech sites I used to respect. Dunking on Tesla is its own industry these days, it seems.
Don't forget that the original Leaf pack was only 24 kWh. So if you assume a ~1000 full-equivalent-charge-cycles lifespan, then the large Gen2 62 kWh pack will live 2.5 times longer than an original 24 kWh pack. If you average 3.5 miles/kWh, the 24 kWh battery will be expected to last somewhere around 84,000 miles. While the 62 kWh pack will last for 217,000 miles.
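The arithmetic, spelled out (same assumptions as above: ~1000 full-equivalent cycles and 3.5 miles/kWh):

```python
def lifetime_miles(pack_kwh: float, cycles: int = 1000,
                   miles_per_kwh: float = 3.5) -> float:
    """Expected miles before the pack reaches its cycle-life limit."""
    return pack_kwh * cycles * miles_per_kwh

print(lifetime_miles(24))  # Gen1 Leaf 24 kWh pack
print(lifetime_miles(62))  # Gen2 62 kWh pack
```

Lifetime miles scale linearly with pack size for a fixed cycle count, which is the whole point: bigger packs cycle less per mile driven.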
https://coolienergy.com/lfp-vs-nmc-batteries-the-science-beh...
Why would you only assume 1000 cycles? Is the chemistry that bad? The LFP battery on my balcony is rated for 5000 cycles iirc.
It didn't just have a horrendous service life; it was designed for a set number of years, to be regularly replaced and repurposed for battery storage. Nissan had business schemes outlined for that with Leaf packs.
I think Tesla deserves credit for rethinking that model into chassis-life battery packs, using surplus cells rather than recovered cells for grid storage.
Especially considering that resale values of Gen1 Leafs, milked for EV and renewables incentives, are atrocious - destination-fee level. You can find nearly zero-mileage ones with a battery pack good for maybe 100 yards on sale for a couple hundred dollars in some places. Even crashed wrecks of a Tesla cost magnitudes more.
That's amazing good news for the environment, thank you I hadn't heard this.
I'll defend the leaf a little.
LiPo batteries were quite expensive when it was initially released. NiMH was really the only option in town.
And with a lower-energy-density battery that's also heavier, adding a cooling system would have added a bunch of weight to an already heavy car with a barely usable range of 100 miles.
Gen 2, however, had no excuses. They had every opportunity to add active cooling and they still decided to go with just air cooling.
Every generation of the production Nissan Leaf has used lithium batteries. AFAIK no modern (~post-2000) mass-produced (>10k units sold) EV has ever used NiMH or lead-acid batteries.
Edit: Checking Wikipedia to verify my information, I found out that Nissan actually sold a lithium-battery EV in 1997 to comply with the same 90s CARB zero-emissions vehicle mandate that gave us the GM EV-1: https://en.wikipedia.org/wiki/Nissan_R%27nessa#Nissan_Altra
EVs no, but I think some Toyota hybrids (which are of course not even PHEVs) still use NiMH. Toyota tends to be very tight-lipped about their batteries and their sizes (or rather, lack thereof).
Tends to be tight lipped??? It is in the catalog[1]! It is more that American consumers aren't tech obsessed than Toyota being reluctant to share.
Even just looking at online media reports[2][3] clearly sourced from the exact same press event, it is obvious that the US English versions are much lighter in content than the Japanese ones. They're putting the information out; no one's reading it. It's just the type of information that didn't drive clicks. The language barrier has an effect too - Toyota is a Japanese company and the US is an export market - but it's fundamentally the same phenomenon as citizen-facing government reports that never get read and are often imagined as "hidden and withheld from public eyes". Just a communication issue.
1: https://www.toyota.com/priuspluginhybrid/features/mpg_other_...
2: https://www.motortrend.com/news/toyota-aqua-prius-c-hybrid-b...
3: https://car.watch.impress.co.jp/docs/news/1339263.html
It's nice to get a reminder about this problem once in a while, I've fallen into the trap myself at times.
Leaf Gen 1 didn't have NiMH. It had a lithium-based battery chemistry, but some bastard offshoot of it. One that really didn't fare well under high current draw, or deep discharge, or high temperatures, or being looked at wrong.
On the used market you'll find absolutely cooked (literally) Leafs whose first life was in Arizona and barely have enough range to back out of the driveway.
I have a gen 1 leaf with a remaining range of about 500 yards if you drive gently...
I use it in my driveway to make it look to thieves like someone is home (around me, houses with no car get broken into).
Sounds like an old Roomba I used to have that could clean for about 2 minutes before it ran out of juice.
NiMH was used in Priuses for a very long time, and these seem to have lasted for ages.
> Gen 2, however, had no excuses. They had every opportunity to add active cooling and they still decided to go with just air cooling.
The Lizard pack in the later Nissan Leafs has held up surprisingly well. I have a 2015 that still gets 75 miles of range. I'm sure they thought it wasn't necessary and they probably had the actuarial numbers to justify it.
> Redwood has been struggling because the expected battery turnover is not occurring
Redwood pitched recycling. But its principal business was primary production. (Processed black mass is analogous to lithium ore.) They're struggling because demand for American-made batteries remains low.
From TFA:
David Roberts
When did automotive batteries become the majority of your input by volume?
Colin Campbell
That is a good question.
David Roberts
Was it recent or was that early on?
Colin Campbell
I would say the transition to EV batteries dominating what we received, it’s been in the last year or 18 months.
David Roberts
So the front edge of a very large wave of batteries has begun to arrive?
Colin Campbell
Yeah, the wave is out there, it’s coming. The waters have finally started to arrive at the beach here.
He's just talking his book. Their deployment this year was 1/4000th share of the BESS market.
Battery Energy Storage System for anyone else like me that has no knowledge of this world and their acronyms.
Reading between the lines of the corporate speak will validate my point. Redwood was founded in 2017.
"It’s the largest microgrid in North America and it’s the largest second-life energy storage site in the world. So that’s like you said at the top, it’s a 12-megawatt AC, 63-megawatt-hour grid supporting about 2 or 3 megawatts of data centers and run by solar. So all the energy comes from another 12 megawatts of solar."
Sure - so while they're not supplying power to a city, they are proving this is viable. Just because it's not a "turn off the coal plants now" moment doesn't mean this isn't a very good direction. Everyone has to start and grow. I don't understand the urge to shit on something because it's not an immediate solve. If these guys waited until 2040 to start the business, well, that'd just be dumb. It essentially sounds like capacity will just continue to increase year over year, with maybe a huge spike around 2040. Doesn't seem like anything is wrong here.
We are only 5-6 years into the EV car market. The Tesla Model 3 started being sold in meaningful numbers in 2018.
Still have mine. Battery capacity is around 80% of the new capacity. I'm not planning on switching anytime soon as it's got plenty of range still. I'll probably swap the pack out when it hits 70% in the next 2 or 3 years.
It probably will take a lot longer than that to hit 70%. Degradation on Tesla batteries slows down considerably after it hits 85%.
there are exceptions, though.
So, basically the same reason recycling of PV modules hasn't taken off.
I've been intrigued by used solar panels for sale, seems like you can get an amazing price for ones that are only lightly degraded. Is there a downside, or do you just mean that it isn't popular currently?
In addition to my sibling comment: the cost of the panels is a rather small fraction of the total cost of a typical installation. Most of that cost is labor, some regulatory requirements, and the inverter. Whether you pay a factor of 2 for the panels or not typically doesn't matter. In other words: reusing used panels will only ever save you a minuscule amount.
Yeah, we paid more for the little bits of metal that held up the panels than for the panels themselves (aluminum, but still).
These days it’s a stack of microinverters, which are not cheaper but do improve array efficiency outside of ideal conditions. But that’s another up-front cost.
The low cost of the modules themselves has led to the suggestion of cost-optimized DC-coupled PV systems being used to directly drive resistive heaters. The cost per unit of thermal energy in a cost-optimized, moderate-scale system (> residential, < utility scale) may be in the range of $3-5/GJ, very competitive with natural gas. Low-cost maximum power point trackers would be useful; inverters would not be needed.
Low cost modules allow one to do away with things like optimally tilted modules and single axis tracking. The modules can also be tightly packed, reducing mounting and wiring costs.
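A sanity check on the $/GJ claim. All the inputs here are my assumptions (module price, balance-of-system multiplier, capacity factor, lifetime), so this illustrates the shape of the calculation rather than confirming the number:

```python
# Lifetime thermal energy per watt of DC PV, then cost per GJ delivered.
panel_cost_per_watt = 0.25  # $/W module price (assumed)
bos_multiplier = 2.0        # mounting, wiring, MPPT overhead (assumed)
capacity_factor = 0.18      # fixed tilt, tightly packed (assumed)
lifetime_years = 25

cost_per_watt = panel_cost_per_watt * bos_multiplier
seconds = lifetime_years * 365.25 * 24 * 3600
joules_per_watt = capacity_factor * seconds  # ~0.14 GJ per installed watt
cost_per_gj = cost_per_watt / (joules_per_watt / 1e9)
print(f"~${cost_per_gj:.1f}/GJ thermal")
```

That lands in the quoted $3-5/GJ range for these inputs; cheaper modules or a sunnier site pushes it lower.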
Is it worth using heat pumps in this setup (in addition to resistive elements)? I understand they can't reach the absolute temperature of resistive heating, but from an efficiency POV for the first few tens of degrees they are much more efficient.
What's the proposed system design? For example, in January, I get about 9 hours of sunlight and have an average daily high of 25 F. I'm gonna need to store heat somehow or another.
The place I saw this most clearly described was in Standard Thermal's concept, which will store the heat in huge piles of dirt heated to 600 C. The thermal time constant of such piles can be many years.
https://www.orcasciences.com/articles/standard-thermal-copy
https://austinvernon.substack.com/p/building-ultra-cheap-ene...
https://news.ycombinator.com/item?id=45012942
I'm going to want that pile hot enough to kill all the bugs and pests that want to get near it.
The surface will always be only slightly hot. Heat will be stored inside, insulated by overlying dirt. Dirt isn't the best insulator by thickness, but it's a very good insulator by $.
I haven't seen pfdietz's proposed system design, but a so-called "sand battery," consisting of a box of sand with a heating element running through it, should work fine. You can PWM the heating element with a power MOSFET to keep it from overheating; you can measure its temperature with its own resistance, but also want additional thermocouple probes for the sand and to measure the surface of the box. A fan can blow air over or through the sand to control the output power within limits.
I'll work out some rough figures.
Let's say your house is pretty big and badly insulated, so we want an average of 5000 watts of heating around the clock with a time constant on the order of 10 hours, and we don't want our heating element to go over 700°. (Honest-to-God degrees, not those pathetic little Fahrenheit ones.) That way we don't have to deal with the ridiculous engineering issues Standard Thermal is battling. There's a thermal gradient through the sand down to room temperature (20°) at the surface. Suppose the sand is in the form of a flat slab with the heating element just heating the center of it, which is kind of a worst case for amount of sand needed but is clearly feasible. Then, when the element is running at a 100% duty cycle, the average sand temperature is 360°. Let's say we need to store about 40 hours of our 5000W. Quartz (cheap construction sand) is 0.73J/g/K, so our 720MJ at ΔT averaging 340K is 2900kg, a bit over a cubic meter of sand. This costs about US$100 depending mostly on delivery costs.
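Reproducing that sizing step (numbers straight from the paragraph above, with solid quartz density as the volume assumption):

```python
# Sand mass needed to store 40 hours of 5 kW at an average delta-T of 340 K.
energy_j = 5000 * 40 * 3600  # 720 MJ
cp_quartz = 0.73             # J/g/K, specific heat of quartz
avg_delta_t = 340.0          # K, average over the thermal gradient

mass_kg = energy_j / (cp_quartz * avg_delta_t) / 1000
volume_m3 = mass_kg / 2650   # solid quartz density; bulk sand packs looser
print(f"{mass_kg:.0f} kg of sand, ~{volume_m3:.1f} m^3 as solid quartz")
```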
The time constant is mostly determined by the thickness of the sand (relative to its thermal diffusivity), although you can vary it with the fan. The heating element needs to be closely enough spaced that it can heat up the sand in the few hours that it's powered. In practice I am guessing that this will be about 100mm, so 1.5 cubic meters of sand can be in a box that's 200mm × 2.7m × 2.7m. You can probably build the box mostly out of 15m² of ceramic tiles, deducting their thermal mass from the sand required. In theory thin drywall should be fine instead of ceramic if your fan never breaks, but a fan failure could let the surface get hot enough to damage drywall. Or portland cement, although lime or calcium aluminate cement should be fine. You can use the cement to support the ceramic tiles on an angle iron frame and grout between them if necessary.
7.5m² of central plane with wires 100mm apart requires roughly 27 2.7m wires, 75m, probably dozens of broken hair dryers if you want to recycle nichrome, though I suspect that at 700° you could just use baling wire, especially if you mix in a little charcoal with the sand to maintain a reducing atmosphere in the sand pore spaces. (But then if it gets wet you could get carbon monoxide until you dry it out.) We're going to be dumping the whole 720MJ thermal charge in in under 9 hours, say 5 hours when the sunshine is at its peak, so we're talking about maybe 40kW peak power here. This is 533 watts per meter of wire, which is an extremely reasonable number for a wire heating element, even a fairly fine wire in air without forced-air cooling.
If we believe https://www.nature.com/articles/s41598-025-93054-w/tables/1 the thermal conductivity of dry sand ranges from 0.18 W/m/K to 0.34 W/m/K. So if we have a linear thermal gradient from our peak design temperature of 700° to 20° over 100mm, which is 6800K/m, we should get a heat flux of 1200–2300W/m² over our 15m² of ceramic tiles, so at least 18kW, which is more than we need, but only about 3×, so 200mm thickness is in the ballpark even without air blowing through the sand itself. (As the core temperature falls, the heat gradient also falls, and so does the heat flux. 720MJ/18kW I think gives us our time constant, and that works out to 11 hours, but it isn't exactly an exponential decay.) Maybe 350mm would be better, with corresponding increases in heating-element spacing and decreases in wire length and box surface area and footprint.
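Checking those flux and time-constant figures:

```python
# Conductive heat flux through 100 mm of dry sand at full charge.
k_range = (0.18, 0.34)       # W/m/K, dry sand thermal conductivity
gradient = (700 - 20) / 0.1  # K/m across 100 mm of sand
area_m2 = 15.0               # radiating tile surface
stored_j = 720e6             # full thermal charge

flux = [k * gradient for k in k_range]  # W/m^2
power_w = [f * area_m2 for f in flux]   # W at the peak gradient
tau_h = stored_j / power_w[0] / 3600    # worst-case (low-k) decay time
print(f"flux {flux[0]:.0f}-{flux[1]:.0f} W/m^2, "
      f"power {power_w[0] / 1000:.0f}-{power_w[1] / 1000:.0f} kW, "
      f"tau ~{tau_h:.0f} h")
```

This matches the ~18 kW floor and ~11 hour time constant worked out above; as noted, the gradient (and hence the flux) falls as the core cools, so the decay isn't exactly exponential.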
To limit heat loss when the fan is off, instead of a single humongous wall, you can split the beast into 3–6 parallel walls with a little airspace between them, so they're radiating their heat at each other instead of you, and cement some aluminum foil on the outside surfaces to reduce infrared emissivity. The amount of air the fan blows between the walls can then regulate the heat output over at least an order of magnitude. (In the summer you'll probably want to leave it off.)
The sand, baling wire, aluminum foil, lime cement, angle irons, charcoal, thermocouples, power MOSFETs, microcontroller, fans, and ceramic tiles all together might work out to US$500. But the 40kW of solar panels required are about US$4000 wholesale, before you screw them to your siding or whatever. At US prices they'd apparently be US$10k.
What do you think?
Sand batteries have a much higher cost per unit of energy storage capacity, so they are in more direct competition with batteries for shorter term storage. It's hard to compete with a storage material you just dig out of a local hole. The economics pushes toward crude and very cheap.
Having said that: a good design for sand batteries would use insulated silos, pushing/dropping sand into a fluidized bed heat exchanger where some heat transfer gas is intimately mixed with it. This is the NREL concept that Babcock and Wilcox was (still is?) exploring for grid storage, with a round trip efficiency back to electricity of 54% (estimated) using a gas turbine. Having a separate heat exchanger means the silos don't have to be plumbed for the heat exchange fluid or have to contain its pressure.
Getting the sand back to the top (where it will be heated and dropping into silos) is a problem that could be solved with Olds Elevators, which were only recently invented (amazingly).
https://www.youtube.com/watch?v=-fu03F-Iah8
How much of a difference does it actually make in terms of the all-inclusive price of installation (e.g. panels, inverters, mounting hardware, and labor)?
(Asking because I genuinely don't know, not because I have a specific answer in mind.)
Find an installer who will warranty work using third party let alone used solar panels and then we can talk.
Labor is by far the top cost. But I'm intrigued by the economics of a small setup paired with something like a <5 kWh battery. For something like that, where you literally just throw 4-6 panels out, you can brute-force it by buying more panels instead of optimizing angles. Basically a slightly beefier version of a European balcony setup.
I think they were referring to the fact that the chief reason there is not large-scale PV panel recycling is that very few panels have ever been retired. It turns out that short of physical destruction by hail etc a PV panel does not degrade beyond economic usefulness simply by being out in the sun. In fact some panels actually get more powerful. The surprising-to-some conclusion of NREL's PV Lifetime Project is that the economic lifetime of a PV panel is basically forever.
The typical EV industry trade show has a small handful of cars and a vast constellation of tangential businesses: finance options, home charger gizmos, fast-charging gizmos, electricity suppliers, and companies promising grid-scale storage, either from actively used cars or recycled EV batteries. Specialist insurance companies that nobody really asked for outnumber the car brands, or even the e-bike brands, present.
In time there will be consolidation. This constellation of EV startup bottom-feeders will be decimated along with the 'excuses' to not make money.
I don't think the problem is that EV batteries are lasting longer, it is just that the EV market from before the Model 3 came along is minuscule. Hence not many second-hand batteries to recycle.
As for EV batteries and their availability: when was the last time you saw an OG Tesla Model S with the fake grille? Those cars used to be everywhere, but where are they now? The German EVs that came out to compete - for example, the Taycan and e-tron - are not going to last the distance, since repairs cost a fortune and the parts supply is limited.
All considered, there will come a time before 2040+ when there are large quantities of these electric car batteries to upcycle, by which time the EV business will be consolidated with only a few players.
If there was money in recycling cars then every auto manufacturer would be in on it.
There is one constant to all these conversations, and that is that Silicon Valley tech dudes are grossly misinformed about the lifecycle of things. Solar panels don't wear out, and batteries don't wear out as fast as they used to. This shows up both in weird dead-end startup ideas and in susceptibility to propaganda about the supposed downsides of solar energy and batteries.
> "most people"
"most people" even now are just parroting dumb FUD they read on facebook. You really shouldn't give any weight to the opinions of laypeople on topics that are as heavily propagandized and politically charged as renewable energy.
Tesla batteries, at least in models up to 2014, fail after about 8 years.
The number of Teslas sold up to 2014 is less than 1% of all Teslas sold.
Tesla has an 8-year battery and drivetrain warranty but they don't necessarily fail after that date.
Prius Plugin 2015 (last year of that model) - full charge/discharge at least 3-4 times a week, currently still a bit more than 80% of capacity. (Granted, the battery seems somewhat overbuilt, yet it normally does 10-15C, which is a much tougher regime than a pure EV, where 2-3C is usually enough and only high-end Teslas and the like would do 5-6C.) There has been large continuous improvement in lithium batteries over the last couple of decades.
What does any of this mean?
What is c in this context?
From my model airplane experience, I believe it's "capacity per hour". So, a 1Ah battery discharged at 1c would mean 1 amp; discharged at 10c would be 10 amps. The higher the C, the harder the batteries are being used.
1C current fully discharges the battery in 1 hour. Thus a 4 kWh battery running a 60 kW motor means 15C current, and it would discharge the battery in 4 minutes (in a very simplified linear model).
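A minimal illustration of the C-rate definition from the two comments above, using the 4 kWh / 60 kW example:

```python
# Minimal illustration of C-rate as defined above.
def c_rate(power_kw, capacity_kwh):
    """Discharge rate in C: 1C empties the pack in exactly one hour."""
    return power_kw / capacity_kwh

def minutes_to_empty(c):
    """Idealized linear discharge time at a constant C-rate."""
    return 60.0 / c

c = c_rate(60, 4)                 # 4 kWh pack feeding a 60 kW motor
print(c, minutes_to_empty(c))     # 15.0 4.0
```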
[citation needed]
In 2040 fusion energy advancements will have gotten far enough to be the next technological step and make this redundant anyway
There's currently no technological path for fusion to be cheaper than fission. It would require a technological breakthrough that we have not yet imagined.
And already, solar plus storage is cheaper than new nuclear. And solar and storage are getting cheaper at a tremendous rate.
It's hard to imagine a scenario where fusion could ever catch up to solar and storage technology. It may be useful in places with poor solar resources, like fission is now, but that's a very very long time from now.
The low energy future that was envisioned is not happening.
The AI arms race, which has become an actual arms race in the war in Ukraine, needs endless energy all times a day.
China is already winning the AI cold war because it adds more capacity to its grid a year than Germany has in a century.
If we keep going with agrarian methods of energy production, don't be surprised if we suffer the same fate as the agrarian societies of the 19th century. Any country that doesn't have the capability to train and build drones en masse won't be a country for long.
You have that exactly backwards: solar + storage is what will give us energy abundance at less money than we could ever imagine from nuclear fission or fusion.
China is winning the AI Cold war because it's adding solar, storage, and wind at orders of magnitude more than nuclear.
I'm not sure who's doing your supposed "envisioning" but there is no vision for cheap abundant energy from fusion. Solar and storage deliver it today, fusion only delivers it in sci fi books.
Nuclear is 20th century technology that does not fit with a highly automated future. With high levels of automation, construction is super expensive. You want to spend your expensive construction labor on building factories, not individual power generation sites.
Building factories for solar and storage lets them scale to a degree that nuclear could never scale. Nuclear has basically no way of catching up.
China has been building out nuclear capacity at 5% a year for 25 years.
Solar and wind capacity has shot through the roof in the last five years because they can't sell hardware to the West anymore.
The other big item is hydro power, which China has a ton of untapped potential for. Unfortunately for the West, every good river has already been dammed, so we can't follow them there.
> Solar and wind capacity has shot through the roof in the last five years because they can't sell hardware to the West anymore.
"can't sell hardware??" hah! I've never heard that weird made-up justification, where did you pick it up from?
China installed 277GW of solar in 2024, capacity factor corrected that's 55.4 GW of solar power. That's equivalent to the entire amount of nuclear that China has ever built. One year versus all time. And then in the first half of 2025, China installed another 212GW of solar. In six months.
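The capacity-factor correction above, as arithmetic. The ~20% solar capacity factor is an assumed average, not an official figure; actual values vary by region:

```python
# Capacity-factor correction from the comment above.
solar_added_gw = 277           # nameplate solar China installed in 2024
capacity_factor = 0.20         # assumed average for solar
effective_gw = solar_added_gw * capacity_factor
print(round(effective_gw, 1))  # 55.4 GW of average output
```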
Nuclear is a footnote compared to the planned deployment of solar and wind and storage in China.
Anybody who's serious about energy is deploying massive amounts of solar, storage, and some wind. Some people that are slow to adapt are still building gas or coal, but these will be stranded assets far before their end of life. Nuclear fusion and fission are meme technologies, unable to compete with the scale and scope that batteries and solar deliver every day. This mismatch grows by the month.
> sci fi books
I blame these for the unquestioned belief that fusion is desirable. It's a trope because it enables stories to be told and because readers became used to seeing it, not because science fiction has a good track record on such things.
The fact that the volumetric power density of ARC is 40x worse than a PWR (and ITER, 400x worse!) should tell one that DT fusion at least is unlikely to be cheap.
With continued progress down the experience curve, PV will reach the point where resistive heat is cheaper than burning natural gas at the Henry Hub price (which doesn't include the cost of getting gas through pipelines and distribution to customers.) And remember cheap natural gas was what destroyed the last nuclear renaissance in the US.
It's hard to imagine a form of energy production less desirable than fusion.
Okay, sure, burning lignite and using the exhaust as air heating in the children's hospital. You got me.
> It would require a technological breakthrough that we have not yet imagined.
Maybe, but not necessarily. The necessary breakthrough might have been high-temperature superconducting magnets, in which case not only has it been imagined, but it has already occurred, and we're just waiting for the engineering atop that breakthrough to progress enough to demonstrate a working prototype (the magnets have been demonstrated but a complete reactor using them hasn't yet).
Or it might be that the attempts at building such a prototype don't pan out, and some other breakthrough is indeed needed. It'll probably be a couple of years until we know for sure, but at this point I don't think there's enough data to say one way or the other.
> And already, solar plus storage is cheaper than new nuclear.
It depends how much storage you mean. If you're only worried about sub-24h load-shifting (like, enough to handle a day/night cycle on a sunny day), this is certainly true. If you care about having enough to cover for extended bad weather, or worse yet, for seasonal load-shifting (banking power in the summer to cover the winter), the economics of solar plus storage remain abysmal: the additional batteries you need cost just as much as the ones you needed for daily coverage, but get cycled way less and so are much harder to pay for. If the plan is to use solar and storage for _all generation_, though, that's the number that matters. Comparing LCoE of solar plus daily storage with the LCoE of fixed-firm or on-demand generation is apples-and-oranges.
I think solar plus storage absolutely has the potential to get there, but that too will likely require fundamental breakthroughs (probably in the form of much cheaper storage: perhaps something like Form Energy's iron-air batteries).
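The cost-per-cycle asymmetry described above can be sketched with illustrative numbers. The $150/kWh capex and 90% round-trip efficiency here are assumptions, not market prices:

```python
# Illustrative cost per kWh delivered: daily vs. seasonal cycling.
def cost_per_kwh_delivered(capex_per_kwh, lifetime_cycles, efficiency=0.9):
    """Capex amortized over every kWh the battery discharges in its life."""
    return capex_per_kwh / (lifetime_cycles * efficiency)

capex = 150.0  # $/kWh installed (assumed)

daily = cost_per_kwh_delivered(capex, 365 * 15)  # cycled daily for 15 years
seasonal = cost_per_kwh_delivered(capex, 15)     # cycled once a year for 15 years

print(round(daily, 3), round(seasonal, 1))       # 0.03 11.1
```

The same pack is roughly 350x more expensive per delivered kWh when it only banks energy once a year, which is the core of the seasonal-storage problem.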
In the end we're still making steam and running a turbine. Just the steam turbine part of the power plant has a hard time competing with solar in sunny locations.
High temperature superconducting magnets are not a panacea for the problems with DT fusion. Those issues follow from limits on power/area at the first wall, and the needed thickness of the first wall; these ensure DT reactors will have low volumetric power density, regardless of the confinement scheme used.
With HTSC magnets, a tokamak much smaller than ITER could be built, but ITER is so horrifically bad that one can be much better than it and still be impractical.
And these are not new issues; they've been known for more than 40 years, but never addressed. From Lidsky's 1983 paper:
> But even though radiation damage rates and heat transfer requirements are much more severe in a fusion reactor, the power density is only one-tenth as large. This is a strong indication that fusion would be substantially more expensive than fission because, to put it simply, greater effort would be required to produce less power.
https://orcutt.net/weblog/wp-content/uploads/2015/08/The-Tro...
In terms of cost of materials to build a reactor, sure, that seems right. But most of the cost of fission is dealing with its regulatory burden, and fusion seems on track to largely avoid the worst of that. It seems conceivable that it ends up being cheaper for entirely political/bureaucratic reasons.
Regulatory costs and waste disposal are not significant cost centers for nuclear, at least as far as I can tell from any cost breakdowns.
One doesn't need super-high-quality welding and concrete pours because of regulations so much as from the basic desire to have a properly engineered solution that lasts long enough to avoid costly repairs.
Take for example this recent analysis on how to make the AP1000 competitive:
https://gain.inl.gov/content/uploads/4/2024/11/DOE-Advanced-...
There are no regulatory changes proposed because nobody has thought of a way that regulations are the cost drivers. Yet there's still a path to competitive energy costs by focusing hard on construction costs.
Similarly, reactors under completely different regimes such as the EPR are still facing exactly the same construction cost overruns as in the rest of the developed world.
If regulations are a cost driver, let's hear how to change them in a way that drives down build cost, and by how much. Let's say we get rid of ALARA and jack up acceptable radiation levels to the earliest ones established. What would that do the cost? I have a feeling not much at all, but would like to see a serious proposal.
Relaxed regulatory burden doesn't seem to be making fission competitive in China; renewables are greatly overwhelming it now, particularly solar.
We might ask why regulations are so putatively damaging to nuclear, when they aren't to civil aviation. One possibility is that aircraft are simply easier to retrofit when design flaws are found. If there's a problem with welding in a nuclear plant (for example) it's extremely difficult to repair. Witness the fiasco of Flamanville 3 in France, the EPR plant that went many times over budget.
What would this imply for fusion? Nothing good. A fusion reactor is very complex, and any design flaw in the hot part will be extremely difficult to fix, as no hands on access will be allowed after the thing has started operation, due to induced radioactivity. This includes design or manufacturing flaws that cause mere operations problems, like leaks in cooling channels, not just flaws that might present public safety risks (if any could exist.) The operator will view a smaller problem that renders their plant unusable nearly as bad as a larger problem that also threatens the public.
I was struck by a recent analysis of deterioration of the tritium breeding blanket that just went ahead and assumed there were no initial cracks in the welded structure more than a certain very small size. Guaranteeing quality of all the welds in a very large complex fusion reactor, an order of magnitude or more larger than a fission reactor of the same power output, sounds like a recipe for extreme cost.
Oh for sure, I'm not claiming that CFS (or Tokamak Energy or Type One or whoever else) will for sure succeed, or if they do, that they've already solved all the problems that will need solving to do so. My only assertion/prediction is that I think if they end up succeeding, when future historians look back and write the history of this energy revolution or whatnot, HTSC magnets will turn out to have been the key breakthrough that made it possible.
Fusion reactors are self destroying, just ask any star.
More seriously: what to do about the neutron flux destroying the first wall inside the reactor vessel?
Fission is expensive for regulation reasons more than technological reasons, so if fusion doesn't face the same barriers then it could be cheaper than fission.
But I agree that it doesn't look like fusion is going to be cheap any time soon.
Fission is also expensive for several mundane reasons, like the fact that massive steam turbines are expensive, and because any large construction project in the West is expensive. Neither fusion nor regulatory reform are going to solve those.
The regulatory hurdles are probably bigger than the difficult enough technological ones you mention.
The steam generator that the fusion generator connects to might be more expensive than solar at this point. That would be even if fusion cost nothing and had infinite amounts of fuel, there would be no customers for its energy on a sunny afternoon.
This is like a “fusion is only 20 years away” (or 15 in this case) joke, right?
It used to be 30. So fifty more years?
Yep, it was 30 years back in the 60s. If it keeps halving every 85 years, we'll get it approximately never :)
Zeno's Fusion Paradox
Personally speaking, having just bought an Ioniq 5 and installing solar at home what I see as the near future improvement is adding V2L functionality, which I can hook up to the generator input of my solar inverter, essentially adding another 60kWh buffer to my grid storage.
Considering how expensive residential batteries are and how quickly EVs depreciate, I think soon it'll be cheaper to get a used EV as a cheap source of cells that accidentally happens to be able to drive itself around.
Imo V2G and V2H are unnecessary and add too much complication. I think the future is solar inverters: they already have the necessary hardware and certifications to take in power and safely connect to the grid - something that requires different hardware and standards compliance in basically every country (yes, even within the EU).
Residential batteries are not that expensive anymore, at least not all of them. That's a misconception I also held until a few years ago ;-)
My first 14.3 kWh pack cost about $2800 DDP from China, delivered 03/2023. For that one I did calculate how long amortization would take, which I projected at about 5 years.
The second, identical pack was delivered 08/2024 and cost $2000 DDP. Since we got an EV that's drawing about 14 kWh per day, I didn't bother doing the math and just ordered it.
These are 280Ah 16S 51.6V packs, based on the EVE LF280K. In an enclosure, with a BMS (Seplos, 200A) and a dedicated balancer. They are good for 6000 cycles at 140A or less [each]. Mind these were both part of small bulk orders - I think each time we ordered 6 to 8 of these, which reduced shipping costs.
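The pack specs quoted can be cross-checked with nominal figures only (real usable capacity depends on depth of discharge and BMS cutoffs):

```python
# Cross-check of the DIY pack figures above (nominal values).
cell_ah = 280           # EVE LF280K cell capacity
pack_v_nominal = 51.6   # 16S LiFePO4 nominal pack voltage
pack_kwh = cell_ah * pack_v_nominal / 1000
print(round(pack_kwh, 1))             # ~14.4 kWh, close to the 14.3 quoted

for price_usd in (2800, 2000):        # the two delivered-pack prices
    print(round(price_usd / pack_kwh))  # $/kWh: 194, then 138
```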
My new batteries were about 250 EUR/kWh - my 10 kWh unit cost 2500 EUR. Scaling that up to a decent used 5-year-old EV price - you can have one for 15k with a 60+ kWh battery - I'd say it's at a very similar price.
In the US, V2L limits your ability to output power from the car to about 1500 W. It's not going to power your house as more than a stopgap, even if you do have supplementary house batteries. V2H/V2G justify their complexity by solving that problem, along with all the ancillary grid benefits.
Not sure if that's the case - however, doing V2L requires the manufacturer to add an inverter to the car, and making that powerful probably adds extra cost most customers wouldn't pay. I just looked it up and my Ioniq can only do about 2kW sustained - but since this charges the house battery, that's enough - idle load is just a couple hundred watts.
If the car charges the house battery, what charges the car?
If you have solar panels or time-of-use electrical rates, you charge the car when power is cheap/free, and spend stored power when the grid costs are high. During a protracted outage, maybe you drive the car to a fast charger.
A typical house averages less than 1500W. And most of the higher usage overlaps the sun being out. So if you have supplemental house batteries to handle bursts then 1500W of V2L can go a very long way.
> And most of the higher usage overlaps the sun being out.
Aren’t most people at work / school when the sun is out?
Yes but high electricity use days correlate with air conditioning, and most people don't turn that off in the middle of the day.
If you're not worrying about A/C then 1.5kW goes an extra long way. Outside of cooking you'll rarely exceed it.
Very interesting - I did not know this was possible. A few questions:
1. Does the solar inverter do away with the need for a V2G or V2H unit?
2. What are the limitations vs a dedicated V2G/H unit?
3. Is generator input on your solar inverter a common feature across inverters?
1. It does. The only issue is that the car can only output about 2kW sustained (this is a model limitation). That's fine since I have batteries in the house.
2. Tbh I'm not super familiar with V2G/V2H, other than it being super expensive for both the wall box and the car (only high-end models tend to support it).
3. No idea, but it's not a high end feature, I wouldn't count on any inverter to just have it, but if you're looking to buy one that does, I don't think you'll be breaking the bank.
Imo the future is for solar inverters to offer a dedicated DC car charger port, as once again all the hardware is already in there.
Thanks for the answers. I used to work for an EV smart charging company (Kaluza) that ran a V2G trial. V2G was a financial success for the users, but I always thought the wall box was a potential blocker. I don't think the 2kW output is a big issue, as the customer could still reduce their load when required, but the elimination of a wall box makes onboarding much easier.
As long as the inverter can also provide charging this definitely has some potential.
> Can "second life" EV batteries work as grid-scale energy storage?
Yes
Is it profitable? Probably not.
Looking at the price for grid battery storage, it's dropping precipitously. The cost isn't so much in the batteries themselves; it's packaging, placing, and then controlling them.
For example, if you want a 200 MWh / 100 MW storage site, you'll need to place it and join it to the grid, all doable. Then you need the switchgear to make it work as you want it to.
For day-ahead, 30-minute trading, that's fairly simple.
For grid stabilisation, that's a bit harder: you need to be able to lead/match/lag the grid frequency by n degrees instantaneously, which is trivial at a few kW, much harder at 100 MW.
Sounds like what https://www.basepowercompany.com/ is doing
Having worked extensively with battery systems, I think the grid storage potential of second-life EV batteries is more complex than it appears. We found that typical EV batteries retain 70-80% capacity after 8-10 years of vehicle use, but the real challenge is standardization and integration. Different manufacturers use vastly different battery management systems (BMS) and cell configurations - a Tesla pack is fundamentally different from a Nissan Leaf pack.
The economics are interesting though. New grid storage batteries cost around $200-300/kWh, while second-life EV batteries can be acquired for $50-100/kWh. However, you need to factor in significant integration costs (~$50-75/kWh) to build compatible BMS systems and thermal management. We also found cycle life degrades about 20% faster in repurposed packs compared to new ones, likely due to accumulated stress patterns from automotive use.
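The per-kWh figures in this comment can be combined into rough all-in ranges (these are the comment's own estimates, not market data):

```python
# Rough all-in cost ranges from the figures above, in $/kWh.
def total_range(acquisition, integration=(0, 0)):
    """Sum the low ends and the high ends of two cost ranges."""
    return (acquisition[0] + integration[0], acquisition[1] + integration[1])

new_grid = total_range((200, 300))               # new grid-storage packs
second_life = total_range((50, 100), (50, 75))   # acquisition + integration

print(new_grid, second_life)   # (200, 300) (100, 175)
```

Even with integration costs folded in, the second-life range sits below new packs, though the faster cycle degradation narrows the gap further.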
Has anyone here successfully integrated mixed second-life batteries at scale? I'm particularly curious about how you handled thermal management across different pack designs while maintaining safe operating parameters.
> the real challenge is standardization and integration
In what sense? I'm a newbie, but curious because I'm working on stuff related to https://mesastandards.org/mesa-der-std/.
> but what if they could drain every last drop of energy from those batteries before recycling them?
Again, batteries are an energy store, not an energy source. That the author cannot distinguish the two makes their opinion less credible.
All I can think is embodied energy? That’s weird.
What we really need is standard two way charging on these cars. Every home with an EV should have a backup battery built into the deal.
>So top of the list for us, of course, designing this thing is safety.
Funny issue I learned after talking to a founder at a similar company: although the battery packs were certified safe for cars (passing crash tests, wild heat differences from AK to AZ, people sitting on top of the battery packs in the car) ... the founder had issues re-certifying the batteries for safe use in a static location for grid storage.
The certification process treated his company like the batteries were made from scratch even though they used the same BMS/coolant lines/etc. already proven and tested.
It's clear you still need strong safety regulations and practices in the rare case there's an event, but the founder noted the grid storage industry regulations were adding redundant safety testing and slowing down adoption. The founder also added it's difficult to compete on cost even with effectively free used EV batteries in this startup space of grid storage against the low cost of Chinese made grid-specific batteries due to the added testing + custom hardware + space constraints and other items. (Caveat: I didn't fact check any of their statements)
I've got a 13 year old EV and nobody has told me how to cash my EV in for reusable energy storage. (No, seriously, hit me up.)
Just sell it through normal channels. If your battery is worth more than your car, then there would be people making money on the arbitrage.
I totally get that, but as the owner, perhaps I'd like to make the money on the battery before the arbitragers.
We don't need EV batteries for this. We just need cheap enough LiFePO4 so we're not burning more shit down. Prismatic cells from China are a start, and sodium ("salt") batteries are showing some promise next.
This is great! Now I have a place to take my old puffy Li-Ion batteries.
https://www.redwoodmaterials.com/recycle-with-us/
In an ice storm and cold cloudy snap that sweeps the country, NO lithium batteries will save the grid. I'm wary of this tunnel vision of the absurd. The only 'storage' that works is to pump water uphill with 'surplus' energy, and there is not and never will be a surplus. And these evaporation tanks are on a scale that makes ecologists remember countless horror stories (e.g., Glen Canyon Dam). And of course there's always "mfft!" (another scheme with no numbers behind it, so it's credible because I'm talking about it).
The last time there was anything rational on the table was Perry under the Trump administration's 30-day rule proposal ( https://www.nucnet.org/news/nuclear-is-vital-to-us-national-... ). It gave a hard industry incentive to any energy supplier who can keep 30 days' fuel on site. This means nuclear and coal. This was no gimmick; it was the first time anyone faced reality about National Security as related to Energy for the grid and survival. And now AI datacenter yadda yadda, we're also talking about the luxury of keeping schools heated in Winter. Adverse weather even for a week is grid-down game over for wind and solar. Proven natural gas reserves are ~300 years, joy! As soon as they get around to sending pregnant whales across the ocean (losing ~20% of the gas-energy in cooling) it may even last 50 years! Before we have to go to endless war again.
Without being a battery chemistry expert: why do these battery packs stop being useful for an EV yet remain useful for energy storage? They keep saying a pack becomes unusable for an EV at 80% of its original capacity, but that's still a lot of life. Is it that grid storage is more of a constant drain while an EV does lots of hard pulls (for lack of better wording)? In an EV the degraded battery cannot provide the higher power being requested within its rating, but a grid is never demanding peak performance?
It's not just about capacity (80% is still a lot), it's that degraded batteries lose their ability to deliver high current under load—so acceleration suffers and voltage sags under hard pulls. For grid storage, you're doing slow, steady charge/discharge cycles over hours, so the same battery that can't handle aggressive driving anymore works perfectly fine. Plus, grid storage has virtually unlimited space and no range anxiety, so if you need 25% more packs to hit your capacity target, you just stack them in a warehouse where real estate is cheap.
Also, batteries degrade faster once they start degrading, because they need more frequent charging. Their internal resistance increases, which promotes heat buildup during fast charging/discharging, which in turn promotes further degradation. Slow charge/discharge cycles also help with heat management.
Looking forward to the grid-scale warehouse fire of battery packs popping off...
They claim to have taken the Moss Landing fire into account with how they are placing their batteries. We won't know if they've really solved the problem or not until their first battery pack experiences a runaway thermal event.
> For grid storage, you're doing slow, steady charge/discharge cycles over hours.
Only if the feed in is a bottleneck. For peak shaving you could go faster.
Space and weight are serious constraints in the car space, but not such a big deal on the side of a house. That’s how they retain their usefulness.
80% could indeed be plenty of usable life for your EV use cases, but it strongly depends on usage patterns. More degradation means more trips to the charger on a road trip. It means trips that you’d regularly make just charging at home at the end of day now require you to plug in at the destination too. It means more range anxiety as a whole.
For an EV you want a high energy density, because it impacts range. For grid storage, density doesn't matter as much.
I think car cells will be much more useful if they are packaged as replacement batteries for all the various battery powered tools, ebikes, etc.
There's a consumer profit margin to absorb the repackaging and teardown.
Maybe for home grid batteries they'll work too. Again, a consumer margin.
Sodium Ion and other grid-specific storage will simply be too cheap for secondhand EV batteries to compete. And the retiring cells won't be any better in density and will be less safe than the higher density sodium ion and LFP that is hitting the market.
Not second life, but first life. All EVs and charging stations should be reversible. In a world where fossil fuels cost their true value (~10x as much) and people still drive this would be a necessity for electricity generation
Seems like the market is going the hybrid route. It's kind of easy to see why: best of both worlds. Some BYD hybrids have crazy ranges, like 1500 km on a tank of gas. The more practical car is winning. They put a much smaller battery in these, enough for fast charge and the daily commute range, and you have gas for longer trips. Maybe smaller batteries would be better for grid-scale storage too, if they're lighter and easier to handle.
> best of both worlds
And the worst too: https://evclinic.eu/2025/09/27/if-you-drive-a-hybrid-may-god...
I don't have first-hand experience, but these guys have run an EV repair shop for a while and also do hybrids; their articles always offer lots of insight.
Short run down:
- micro/mild hybrids are useless: the batteries are too small and the electric motors too weak to be the sole source of power, so the contribution to emission reduction is very small; the batteries also tend to fail early because they're so small
- full hybrids have bigger batteries and electric motors large enough to run in pure-EV mode, but you still rely on the ICE for everything, so there's no ability to charge at home or save on gas
- plug-in hybrids are full hybrids, but you can charge them externally; according to many studies the estimated emissions are much higher than declared, because people simply don't charge them at home and run on ICE the whole time
In all these hybrid types the batteries are smaller than in pure EVs, so they cycle faster and degrade faster. You're also carrying two drivetrains all the time, with the added weight, one of which has plenty of maintenance items. So they're not drop-in replacements.
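The faster-cycling point is just arithmetic: for the same annual mileage, a smaller pack accumulates full-equivalent cycles proportionally faster. A sketch with illustrative numbers (pack sizes and the ~3.5 mi/kWh efficiency are assumptions, not measured data):

```python
def cycles_per_year(annual_miles, pack_kwh, mi_per_kwh=3.5):
    """Full-equivalent charge cycles per year for a given pack size."""
    return annual_miles / (pack_kwh * mi_per_kwh)

# Same 12,000 miles/year driven electrically:
print(round(cycles_per_year(12000, 75)))  # large pure-EV pack -> ~46 cycles
print(round(cycles_per_year(12000, 15)))  # typical PHEV pack  -> ~229 cycles
```

Under these assumptions the PHEV pack cycles roughly five times as fast, so for a fixed cycle-life budget it reaches the same wear point in a fifth of the miles.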
From what I've seen from EVClinic above, many manufacturers use custom pouch cells rather than the cylindrical modules found in the more advanced pure EVs, so you can't repair an individual failed cell; that means a full pack replacement. From many manufacturers you can't order replacement parts for the electric drivetrain at all, and when you can, they cost a huge chunk of the car's value.
So all in all, if everything's well, you're good. If something goes wrong, be prepared to spend the same as you would on a battery replacement for a pure EV, or even more.
I have heard that hybrids have a maintenance problem?
Isn't doubling up the technology in the same space a concern?
It's not like reliable gas cars ever had substantial maintenance problems in the gas part. So removing the gas part didn't do much in practice.
People do/did have frustrations with gas cars' quirks and approachability: everything was written in a mix of translated foreign-language documents and borderline insane gearhead jargon. That led them to imagine that removing the gas part would drastically change the industry in their favor.
But in the end, gas cars are good, with regular maintenance, for something like 100k miles over 8 years, so I wouldn't know what consumer product was more reliable than a gas car in the first place.
Reliable gas cars still require a lot more maintenance than an EV does.
Oil and oil filter changes. Fuel filters. Air cleaners. Brake pads (which mostly go away with hybrids too).
Not that I'm aware of. I've heard that many hybrids actually require less maintenance - for instance, the car can use electric power for hard acceleration instead of stressing the engine, so oil tends to last longer, and regenerative braking causes the friction brakes to wear out more slowly.
Eh, my PHEV has a 2 year oil change interval, which is longer than my ICE only cars. You should probably bring in your EV every 2 years to get things looked at too.
The engine in a hybrid should live an easier life compared to a pure ICE car: no extended idle, mostly running in its power band, etc. There are lots of different ways to set up the hybrid system, but typically, rather than a small starter motor, you have a larger motor/generator that also starts the engine; it's less likely to wear out, because it's built for continuous use.
My PHEV has a 'Toyota Synergy'-style 'e-CVT', which eliminates gear selection and should be very low-maintenance (although mine had to be replaced under a service bulletin due to a bearing failure caused by a manufacturing error), again nicer than an ICE. But some hybrids have a more traditional transmission.
Certainly, you can do ICE only or EV only, but there's a lot of room to use the ICE for things it's good for, and the EV for things it's good for, and blend where there's overlap.
Ford Escape? I have a friend who needed the transmission on his 2023 PHEV replaced under warranty... no service bulletin, but mechanics caught a manufacturing error at a regular service. Hopefully my hybrid Maverick doesn't have similar problems.
That two-year oil change cycle is the minimum required to not void the warranty.
It shouldn’t be taken as the optimal interval to maximise engine life.
Of course, modern fully synthetic engine oils are longer lasting, and I believe the newer Toyotas, at least the hybrids anyway, have electric oil pumps, and use very thin engine oil to make sure the engine is well lubricated at startup.
It's possible it might actually be more reliable long term, once the technology matures. For example, in cold weather the gas engine can heat the battery for better battery performance, maybe even extending its life if it prevents the battery from being drawn down too far. The gas engine would also likely last longer, since it's not used for daily commutes.
"In many PHEV systems, there are different modes:
Electric mode (EV mode): The vehicle runs purely on the electric motor(s) and battery until the battery depletes to some extent.
Hybrid/parallel mode: Both the petrol engine and electric motor(s) work together to drive the wheels, especially under high load, at higher speeds, or when the battery is low.
Series mode (in some designs): The petrol engine acts only as a generator to charge the battery or power the electric motor(s), and the wheels are driven by the electric motor(s).
For the BYD Leopard 5 (and many BYD PHEVs) the petrol engine can drive the wheels (i.e., it is not purely a generator). It is part of the drive system, especially when high power or long range is needed.
At the same time, it likely can assist with charging the battery or maintaining battery state of charge (SOC) when needed (for example, to keep the battery at some reserve level or in 'save' mode). User reports show that the petrol engine will kick in to support the electric system, charge the battery, or assist the drive under certain conditions" -
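The quoted modes could be sketched as a toy selector; the thresholds, power limit, and mode names below are invented for illustration and don't reflect any actual BYD control logic:

```python
def phev_mode(soc, demand_kw, speed_kmh, ev_limit_kw=60):
    """Pick a drive mode from battery state of charge (0..1),
    requested power, and speed. Purely illustrative thresholds."""
    if soc > 0.2 and demand_kw <= ev_limit_kw and speed_kmh < 120:
        return "EV"        # battery alone drives the wheels
    if soc <= 0.15:
        return "SERIES"    # engine runs as a generator to recharge
    return "PARALLEL"      # engine and motor drive the wheels together

print(phev_mode(0.8, 30, 60))   # -> EV (light load, healthy charge)
print(phev_mode(0.5, 90, 130))  # -> PARALLEL (high load/speed)
print(phev_mode(0.1, 30, 60))   # -> SERIES (battery nearly depleted)
```

Real controllers blend these continuously and factor in temperature, driver mode selection, and navigation data, but the branch structure captures the quoted description.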
Not sure about that, since I never owned one either. But I watched a review of a BYD car yesterday, and it's super nice.
https://www.youtube.com/watch?v=_6bqgR3NRHE&t=1s
Betteridge's law of headlines finally fails? TL;DR: Yes, but you can also make it 'not work' if you choose to politicize the tech solution to the energy problem.