I would appreciate comments on whether the new SEER rating (EU directive 626/2011) is a good indicator of efficiency in warmer climates. It is interesting to note that for heating they have divided Europe into three zones, yet cooling is a single zone, even though summer temperatures can vary dramatically from Helsinki to Rome.
SEER takes partial loads into account at external temperatures of 20, 25, 30 and 35 degrees, with an internal condition of 27 degrees dry bulb / 19 degrees wet bulb. These temperatures represent loads of 21%, 47%, 74% and 100% respectively.
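To make the mechanics of a seasonal rating concrete, here is a minimal sketch of how a season-weighted efficiency can be built from those four part-load points. The EER values and seasonal weights below are invented for illustration, and this is a simplification, not the official EN 14825 bin-hour method:

```python
# Illustrative sketch of a seasonal weighted efficiency.
# All EER values and weights are hypothetical, not from any datasheet.
test_points = [
    # (outdoor temp in degrees C, load fraction, EER at that point)
    (20, 0.21, 5.8),
    (25, 0.47, 4.9),
    (30, 0.74, 3.8),
    (35, 1.00, 3.0),
]

# Hypothetical weighting: share of the cooling season spent near each point.
# Note how little weight the hottest condition gets.
weights = [0.15, 0.35, 0.35, 0.15]

# seasonal efficiency = total cooling delivered / total electricity consumed
cooling = sum(w * load for w, (_, load, _) in zip(weights, test_points))
power = sum(w * load / eer for w, (_, load, eer) in zip(weights, test_points))
seasonal_eer = cooling / power
print(round(seasonal_eer, 2))
```

The point of the sketch: because the mild-temperature points carry most of the weight, a unit tuned for part-load efficiency at 20-25 degrees can earn a high seasonal figure even if its performance at 35 degrees and above is mediocre, which is exactly the concern for southern Europe.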
My first issue is with that 35-degree limit. Most outdoor units on sale in southern Europe have maximum operating temperatures of 45/46 degrees, but their efficiency can be very different at 40/45 degrees when you examine the manufacturer's tables of power input versus cooling output. Further, cooling is typically needed from May to October, but few people would run their units at external temperatures of 20 or 25 degrees; most people need AC at 30-50 degrees in built-up urban areas.
So do these lab-condition points really indicate how units perform at higher external temperatures? Is the 47% load case in the lab (i.e. external 25 degrees, internal 27) really a good indicator of the efficiency of the same unit working at 47% capacity to maintain a target internal temperature against an external temperature of 40 degrees? Should people in southern Europe really be choosing units with a higher SEER over those with a higher EER?
There is also some confusion. The new ratings came into effect this January, and some models passed from a "B" grade under EER to A+ or even A++ under SEER (probably with a little tweaking from the manufacturer, but you can still divide nominal cooling in kW by power input in kW to confirm the EER). So units nobody wanted last year are now "the best", yet they still consume more power at higher loads and external temperatures...
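The sanity check mentioned above is trivial to do from the nameplate figures. The numbers here are made up for illustration:

```python
# Old-style EER check: nominal cooling capacity divided by rated power
# input at the full-load (35 degree) condition. Figures are hypothetical.
nominal_cooling_kw = 3.5   # hypothetical nameplate cooling capacity
power_input_kw = 1.12      # hypothetical rated power input

eer = nominal_cooling_kw / power_input_kw
print(round(eer, 2))  # ~3.1 in this made-up case
```

If a unit now labelled A+ or A++ under SEER comes out around 3.0-3.2 here, it would have sat in the old "B" band, which is exactly the mismatch described above.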