
Temperature dependent charge curve #1844

Open
johnwb87 opened this issue Jan 5, 2025 · 19 comments
Labels
enhancement New feature or request

Comments

@johnwb87

johnwb87 commented Jan 5, 2025

Is your feature request related to a problem? Please describe.
First winter with a battery and I am starting to see temperature-dependent charge limits being applied in the battery firmware, affecting predbat's planning. This is especially true when trying to maximise short slots.

Describe the solution you'd like
Where the inverter control service allows, use a temperature reading and apply a charge curve based on the temperature, similar to the current charge curve implementation but keyed on temperature rather than SOC.

Describe alternatives you've considered
Setting the charge limit in apps.yaml to the most often observed reduced rate. But this reduces the charge rate permanently, not just when the temperature is low.

Additional context
For GivEnergy consumer batteries it can be observed in the GE app or GE Cloud that the BMS temperature, which GivTCP reports, is being used to apply a percentage scale to the max charge rate.

@springfall2008
Owner

I think this is a good idea but I don't know how the charge curve is impacted by temperature; do you have data on this?

Predbat can calculate the charge curve dynamically, so you can set it to auto, but it will clearly be from last night rather than right now.

@springfall2008 springfall2008 added the enhancement New feature or request label Jan 5, 2025
@johnwb87
Author

johnwb87 commented Jan 5, 2025

There is some discussion on this thread on the GE forum. https://community-beta.givenergy.cloud/t/battery-charging-rate/2219/9

The TL;DR is that GE say their standard spec for charge rates is: above 20°C: 100%, 10-20°C: 60%, 0-10°C: 50%.

On the face of it, a first release of this appears a little simpler than the current charge rate calculation. It would be a sort of expert mode, but in apps.yaml: if a user's inverter control service can report temperature, they would need to either know the limits from a specification or observe the charge changes, then set them.
The auto approach, similar to the current charge curve, would be the next evolution.
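
As a minimal sketch of what such an expert-mode curve could look like (the bands are the GE spec quoted above; the function name and curve format are illustrative, not Predbat's actual implementation):

```python
# Hypothetical lookup: fraction of max charge rate allowed at a given
# BMS temperature, using the GE bands quoted above.
def temperature_charge_scale(temp_c: float) -> float:
    if temp_c >= 20:
        return 1.0   # above 20C: 100%
    if temp_c >= 10:
        return 0.6   # 10-20C: 60%
    if temp_c >= 0:
        return 0.5   # 0-10C: 50%
    return 0.0       # assumption: no charging below freezing

# e.g. a 2600W max charge rate at 14C would plan for 2600 * 0.6 = 1560W
planned_rate_w = 2600 * temperature_charge_scale(14)
```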

@mpartington

mpartington commented Jan 6, 2025

It does get complicated because, as you say, this is often tweaked depending on firmware. On 3007 there is no temperature impact, and on 3019 (in beta) you will get the 100% rate all the way down to 10 degrees.

Then there is the number of batteries on an inverter (and its discharge rate). For example, 2x9.5s may not throttle (as, say, 30% of the total C-rating still exceeds the capacity of the inverter, so no derate is visible) when 1x9.5 would. Would you also need to consider a prediction in the plan of when the temperature would reach the next threshold?

I can see value for those that see it, but it would need some careful thought on how to implement.

@johnwb87
Author

johnwb87 commented Jan 6, 2025

I'm not expecting predbat to auto-discover all of this.

What I really want to avoid is predbat executing a charge period and only seeing that something is wrong when the SOC doesn’t rise as planned. Any forewarning must be positive for the plan.

I see it as something rather coarse and I would happily use an automation to change the charge rate if it wasn't inside apps.yaml.

I imagine it as something like the current manual charge curve, where the user defines the temperature and the percentage. If you don't define it, nothing happens; if you do, it does.

@gcoan
Collaborator

gcoan commented Jan 6, 2025

I see it as something rather coarse and I would happily use an automation to change the charge rate if it wasn't inside apps.yaml.

You can change this via an automation by using the predbat manual API https://springfall2008.github.io/batpred/manual-api/

I had inverter limit charge and discharge added to the list of controllable entities for just this reason.
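
As a rough sketch of that kind of automation (AppDaemon-style Python; the entity ids and the 2600W figure are hypothetical placeholders, so check the manual API docs above for the real controllable entities):

```python
import appdaemon.plugins.hass.hassapi as hass

class TempChargeLimit(hass.Hass):
    """Nudge the charge limit whenever the BMS temperature moves."""

    def initialize(self):
        # sensor name is a placeholder for whatever GivTCP exposes
        self.listen_state(self.on_temp, "sensor.battery_bms_temperature")

    def on_temp(self, entity, attribute, old, new, kwargs):
        temp = float(new)
        # coarse bands from the GE spec discussed earlier
        scale = 1.0 if temp >= 20 else 0.6 if temp >= 10 else 0.5
        self.call_service(
            "number/set_value",
            entity_id="number.predbat_inverter_limit_charge",  # hypothetical id
            value=round(2600 * scale),
        )
```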

However, I think there is still value in being able to configure the custom charge curve in Predbat, with this being an advanced feature as discussed above, because it does vary by inverter Gen and battery combination.

Predbat could then apply the current battery cell temperature to find the correct charge rate for the plan.
One side issue is that as the battery charges, the cell temperatures will rise.

@johnwb87
Author

johnwb87 commented Jan 7, 2025

Ah. Ok I will look at the manual API, thank you for highlighting it. I haven’t ventured beyond expert mode so far.

The battery temperature will always be a lagging indicator, unfortunately, but I do feel predbat knowing and acting on the current charge limit is better than not knowing.

@johnwb87
Author

johnwb87 commented Jan 7, 2025

Thanks @gcoan. I think I have an automation working now to call the manual API.

I have triggered it manually a few times, just need to wait for real events to occur now to fully verify.

@springfall2008
Owner

New release v8.11.0 will contain a first version of this feature

@johnwb87
Author

Thank you Trefor.

Will make changes tomorrow and get back to you.

@gcoan
Collaborator

gcoan commented Jan 12, 2025

Thanks for the new temperature curve feature Trefor, a very useful addition to Predbat.

I have seen the following table shared on the givenergy customer forum which I think originated from facebook:

[image: table of GivEnergy charge rate percentages by temperature band]

It doesn't have the extreme drop-off in charge rates below 10 degrees that's in the suggested temperature curve.

My own personal experience of the effects of temperature on charge rate is not always consistent with the table.

Two Gen 1 hybrid inverters, one with a 9.5 battery (called G) and one with a 5.2 (called H).

Looking at charge rate and cell temperatures:
[image: charge rate and battery cell temperature history graphs]

The 9.5 seems to charge at 2.4kW regardless of temperature, whether 10-20 degrees or above 20 degrees. According to the table it should be at the full 2.6kW rate when above 20 degrees, but it definitely isn't.

The 5.2 shows different behaviour based on temperature: below 20 degrees it's charging at 1.8kW, which is what the table says it should be, and above 20 degrees, again 2.4kW not 2.6kW.

So it does make it a bit difficult for me to know what to set the temperature curve to when the batteries behave differently. I'll probably model the 5.2 and then at least it will just be pessimistic for the 9.5.

I assume that, as I am able to configure multiple inverter/battery temperature sensors, Predbat will take account of the different battery charge rates depending on what cell temp each battery is at?

Also, how does this work with input_number.predbat_battery_rate_max_scaling?
I have this set to 0.92 to model that the inverter might say it can charge at 2.6kW but it's actually achieving 2.4kW.
Is this always applied on top of the battery temperature charge curve so I'd need to adjust the lower temperature rates accordingly slightly upwards?

@johnwb87
Author

That table isn't right; I'd say it's indicative and based on AC charge rates.

From my observations I'm closer to 0.3C on a 9.5kWh with a G3 Hybrid, and that is what I've set in apps.yaml. @gcoan, I would say that's consistent with what you see for your 9.5. But it's interesting that your 5.2 is at around 0.37, using your reported 1.8kW charge rate as a DC value.

@springfall2008 I can see temperature being pulled in correctly in the logs. Is there any other evidence to look for or do I need to either wait for or force a charge?

@johnwb87
Author

johnwb87 commented Jan 13, 2025

[image: Predbat plan showing overnight charging]
First evidence of this feature in overnight charging. I use combine charge slots and low power mode, so I wasn't actually expecting to see anything on a day without any extra intelligent slots.
Low power mode usually uses between 1.5kW and 1.7kW charge rates on a normal 6 hour slot, so that's below any temperature-related reduction anyway.

Going to turn combine charge slots off and see what plan I get.

@johnwb87
Author

I was checking the documentation on this after a couple of days of use. There is a potential inconsistency between the documentation and the apps.yaml template text.

Main body note says: "...gaps in the curve above 20 will used 20 degrees, gaps below 0 will use 0 degrees. Do not leave gaps in the curve between 20 and 0."

apps.yaml template says:

values unspecified will be assumed to be 1.0 hence rate is capped by max charge rate

@gcoan
Collaborator

gcoan commented Jan 16, 2025

Main body note says: "...gaps in the curve above 20 will used 20 degrees, gaps below 0 will use 0 degrees. Do not leave gaps in the curve between 20 and 0."

apps.yaml template says:

values unspecified will be assumed to be 1.0 hence rate is capped by max charge rate

I suspect the documentation is correct and the apps.yaml is wrong, but they ought to be consistent, I agree!

I am still struggling to work out how to set the temperature curve up for my setup.

It's made harder by the historic data including the effects of low power charging, and by the sensor history being purged so that only long-term stats remain, meaning less granularity of data...

From what I can see I am pretty confident that, at least in the 10-20 and 20+ degree ranges, both my 9.5 and my 5.2 are operating as per the table, APART FROM the max rate charge, which is only ever 2425W, not the 2600W the inverter is capable of. The difference is most probably because the table measures charging in AC, whereas the actual reported charge rate is in DC, after conversion losses.

The problem is how to model the temperature curve and the input_number.predbat_battery_rate_max_scaling correctly.

If I set input_number.predbat_battery_rate_max_scaling to 1.0 and just rely on the temperature curve then any charging that is not temperature limited (i.e. 5.2 > 20 degrees and 9.5 > 10 degrees) will incorrectly assume a 2600W charge rate rather than the 2425W that is achieved by the inverters.

If I leave input_number.predbat_battery_rate_max_scaling set to 0.92 to reflect the reduced max rate charging then it risks under-estimating the temperature limiting charging.

I think the logic I need is (sketched in code after the list):

  1. Lookup in the table if the charge rate is going to be limited.
  2. If the charge rate for the battery is temperature limited then use the temperature factor x battery reported size to calculate charge rate
  3. If the charge rate is not temperature limited then use inverter max charge rate x input_number.predbat_battery_rate_max_scaling to calculate charge rate
  4. NB: in step 2, use battery reported size and do not apply battery_scaling factor from apps.yaml as this would further under-estimate the charge rate
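
A minimal Python sketch of those four steps (illustrative names only; this is my reading of the proposed logic, not Predbat's actual code):

```python
def planned_charge_rate_w(
    temp_c: float,
    temp_curve: dict,          # e.g. {2: 0.25, 10: 0.33, 20: 0.5}, C-rate per band
    battery_size_wh: float,    # battery reported size, unscaled (step 4)
    inverter_max_rate_w: float,
    rate_max_scaling: float,   # input_number.predbat_battery_rate_max_scaling
) -> float:
    # Step 1: factor for the highest curve threshold at or below temp_c
    thresholds = sorted(temp_curve)
    applicable = [t for t in thresholds if temp_c >= t]
    factor = temp_curve[applicable[-1]] if applicable else temp_curve[thresholds[0]]
    # Steps 2 and 3, with min() deciding which limit actually applies
    temp_limited_w = factor * battery_size_wh
    scaled_max_w = inverter_max_rate_w * rate_max_scaling
    return min(temp_limited_w, scaled_max_w)

# e.g. the 5.2 at 15 degrees: min(0.33 * 5200, 2600 * 0.92) = 1716W
print(planned_charge_rate_w(15, {2: 0.25, 10: 0.33, 20: 0.5}, 5200, 2600, 0.92))
```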

@springfall2008 does this match how the code works?

@johnwb87
Author

A few things I have worked out playing with this.

Firstly, the table from the GE forum is almost certainly AC rates. We know GE inverters report the AC charge rate, but we actually see the DC rate in the GE app, GivTCP and predbat.

battery_rate_max_scaling is applied after the temperature curve. I had a poke around the pull request and you can see it in the code as a multiplier applied after a function called get_charge_rate_curve, and it is used to set charge_rate_now. This is consistent with the longer-standing charge power curve, which is also an argument of get_charge_rate_curve.
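
In other words (a sketch of my reading of the ordering; get_charge_rate_curve is the real function name from the PR, but this stand-in is illustrative):

```python
def effective_charge_rate_w(inverter_max_w: float, temp_factor: float,
                            battery_rate_max_scaling: float) -> float:
    curve_capped_w = inverter_max_w * temp_factor       # stand-in for get_charge_rate_curve()
    return curve_capped_w * battery_rate_max_scaling    # scaling applied after the curve

# a 0.6 temperature factor and 0.92 scaling compound:
print(effective_charge_rate_w(2600, 0.6, 0.92))  # 1435.2W, not 1560W
```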

If working back to get to your curve, we need to remember the AC/DC point for GE inverters, so reverse any scaling applied in battery_rate_max_scaling when doing the sums. I found the GE cloud das

I think the logic in predbat is sound, in that everything is calculated from inverter reported rates.

I'm most intrigued by the knee in the overnight charge I'm now seeing since this release. It is present with and without a temperature_charge_curve set in apps.yaml when low_power_charge is true. Turning low_power_charge off and using a temperature_charge_curve does not result in the knee, which leads me to think that the adjustments made to the low power mode calculations to accommodate temperature_charge_curve have impacted the output in an unexpected way.

@gcoan
Collaborator

gcoan commented Jan 16, 2025

Interesting investigation, thanks. I have today upgraded to 8.11.0 so will see what difference it makes.

I too thought there were inconsistencies in the table with AC vs DC rates and what's used where.

battery_rate_max_scaling is applied after the temperature curve. I had a poke around the pull request and you can see it in the code as a multiplier applied after a function called get_charge_rate_curve, and it is used to set charge_rate_now.

For clarity, I don't think battery_rate_max_scaling is applied to the charge rate used by the inverter (i.e. number.givtcp_xxxx_battery_charge_rate); my experience is that it's only used in Predbat for calculating what the effective charge rate is.

What I see is the inverter charge rate (for a full rate charge) being set to 2600W, but Predbat knows in the plan calculation that it will actually only achieve 2450W or whatever it is.

If you want to limit the inverter charge rate from the inverter maximum you need to set inverter_limit_charge in apps.yaml.

If working back to get to your curve, we need to remember the AC/DC point for GE inverters, so reverse any scaling applied in battery_rate_max_scaling when doing the sums

So if the table says C=0.33 for 10-20 degrees and battery_rate_max_scaling is 0.92, then is the conclusion that we should put a value of 0.33/0.92=0.36 in the apps.yaml temperature curve?

And for C=0.25 for 2-10 degrees, it's 0.25/0.92=0.27?

@johnwb87
Author

johnwb87 commented Jan 16, 2025

You are right about the use of battery_rate_max_scaling. I know from my logs and the GivTCP logs that predbat sets my inverter to 3600W, its AC max.

I didn't do the maths that way. I used observed limited charge rate / battery_rate_max_scaling to get the AC limited rate, then divided that by battery capacity to get the C rate.

@gcoan
Collaborator

gcoan commented Jan 16, 2025

I didn't do the maths that way. I used observed limited charge rate / battery_rate_max_scaling to get the AC limited rate, then divided that by battery capacity to get the C rate.

I think it comes out to much the same number.

Charging on my 5.2 earlier this evening was observed at 1720W (although the battery temp had just risen from 20 to 21, so in theory it shouldn't have been limited):

1720/0.92=1869
1869/5200=0.359

My calculation taking the figure given in the table as being correct is 0.33/0.92=0.359

Yours has the advantage of working out from evidence the actual AC limited rate rather than trusting the table.

Looking at the evidence shows there is variation in charge rates. Overnight, when the battery temperature was 18 degrees rising through to 21 degrees, the 5.2 battery was charging at between 1780 and 1790W:

1785/0.92=1940
1940/5200=0.373
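
The working-out above as a quick Python helper (a sketch of the arithmetic only, not Predbat code):

```python
def c_rate_from_observation(observed_w: float, rate_max_scaling: float,
                            capacity_wh: float) -> float:
    # reverse the scaling to get the AC-equivalent rate, then divide by capacity
    return (observed_w / rate_max_scaling) / capacity_wh

print(c_rate_from_observation(1720, 0.92, 5200))  # ~0.36 (this evening's charge)
print(c_rate_from_observation(1785, 0.92, 5200))  # ~0.373 (overnight charge)
```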

All useful discussion, and I can expand the documentation with more details on how to calculate your own temperature charging curve. Thanks!

@johnwb87
Author

I've gone deep down this rabbit hole so be prepared.

I also noticed the charge rate subtly rising. That tells me a true amps limit is being applied by the BMS, which, from my little bit of research into C-rating, is the first-principles way of calculating and using it.

From there I looked at the data sheet for the batteries. I think further variability between units comes from the stated max charge current of each unit.

For the 5.2, with a stated capacity of 102Ah and a max charge current of 50A, 50A is basically 0.5C as we would expect. So 0.33C is about 33.7A, which at the nominal voltage of 51.2V is ~1.7kW (DC, I think), matching the table. So it works for this battery.

The 9.5, whilst having a capacity of 186Ah, has a max charge current of 80A, so that is already only 0.43C. On the same logic (scaling the 80A max by the same two-thirds factor), for the 9.5 you get to 53.33A. My observed charge rate and observed voltage work out to around 50A, which gives you 0.26C.
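
That arithmetic as a quick check (datasheet figures as quoted above; the calculation is just C = amps / amp-hours and P = volts x amps):

```python
cap_52_ah, max_a_52 = 102, 50   # 5.2kWh battery: 102Ah, 50A max charge
cap_95_ah, max_a_95 = 186, 80   # 9.5kWh battery: 186Ah, 80A max charge

print(max_a_52 / cap_52_ah)       # ~0.49, i.e. basically 0.5C
print(0.33 * cap_52_ah * 51.2)    # 0.33C at 51.2V nominal -> ~1723W, the ~1.7kW figure
print(max_a_95 / cap_95_ah)       # ~0.43C
print(max_a_95 * 2 / 3)           # ~53.3A if the same two-thirds derate applies
print(50 / cap_95_ah)             # observed ~50A -> ~0.26-0.27C
```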

In summary, I think there is a lot of variability between battery and inverter combinations, and so, as with many things, the answer is: it depends. As heat is the enemy and heat comes from amps, it is likely a judgement is being made by GE about heat and heat shock, hence setting a max charge current to manage it.

What it does make me realise is that a big factor in the AIO having much higher battery charge and discharge rates is that its nominal battery voltage is higher at 307V, so with a lower max charge current of 25A it can hit 7.5kW and presumably generate a lot less heat in the battery cells. We see similar in EVs moving to 800V architectures: reduce heat, reduce losses, reduce the size of HV cables and so weight. Higher voltage is generally superior.
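
A rough sanity check of the voltage point (307V and 25A are the AIO figures quoted above; the heat comparison is illustrative, since resistive heating scales with current squared):

```python
lv_w = 51.2 * 50   # ~2560W DC at 50A on a 51.2V pack
hv_w = 307 * 25    # ~7675W DC at only 25A on a 307V pack

# relative I^2 heating per watt delivered:
print((50**2 / lv_w) / (25**2 / hv_w))  # ~12x more resistive heat per watt at low voltage
```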
