Yet another question here that has me stumped despite a fair amount of research. I suspect this issue may help many others, hence my thread.
Background: I have a 2,800 sq ft home in NY heated with a Williamson forced-air furnace burning a full 1 gal of oil per hour (139K BTU input). After deducting 20% for efficiency losses, my output is about 111K BTU. I also have a 2-zone system and a 5-ton Lennox heat pump. My question is only about the furnace sizing.
Of the many contractors I met with, only 2 would do a Manual J calc. Both came up with about 75K BTU as the max load under a worst-case scenario. Cooling came out at 56K BTU.
On the surface it would appear that my furnace is oversized by a full 36K BTU, which is quite a lot -- a full one-third too big.
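In case anyone wants to check my math, here's a quick sketch using my own figures from above (the 139K input and 20% loss are my numbers; nothing else is assumed):

```python
# Sanity check on the furnace sizing numbers in my post.
input_btu = 139_000          # BTU/hr input at 1 gph of oil
efficiency = 0.80            # after deducting ~20% for efficiency losses
output_btu = input_btu * efficiency

design_load = 75_000         # Manual J worst-case heat load (BTU/hr)
excess = output_btu - design_load
oversize_frac = excess / output_btu

print(f"Output:   {output_btu:,.0f} BTU/hr")
print(f"Excess:   {excess:,.0f} BTU/hr ({oversize_frac:.0%} oversized)")
```

Running that gives about 111K BTU/hr out and roughly a third of the capacity in excess of the design load, which matches what the contractors told me.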
At first it was suggested that I simply down-fire with a smaller nozzle. Can't do that per the manufacturer: under-firing will ruin the unit through excess corrosion. Their engineer was most emphatic on this one.
Both contractors said that the over-sizing is quite significant and results in substantial waste--on the order of 25% or so. They indicated that a "right sized" furnace, especially a variable one, would be far better and result in substantial savings since I'd not be pumping out 111K when all I really need is 50 or maybe 75 under really cold conditions.
The bottom line is this: how much does an oversized furnace like this waste on average, and what might I see in savings should I replace it? If I can recoup the cost in a 4-6 year time frame, I would consider a new unit.
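To frame the payback question, here's a rough sketch of how I'd run the numbers. Every input here is a placeholder assumption (annual oil use, oil price, savings fraction, install cost), not a measurement -- I'd plug in real figures before deciding:

```python
# Rough simple-payback estimate for replacing the oversized furnace.
# ALL inputs below are assumptions for illustration only.
annual_gallons = 800         # assumed annual oil consumption (gal)
oil_price = 4.00             # assumed oil price ($/gal)
savings_frac = 0.15          # assumed savings from right-sizing (contractors suggested up to ~25%)
install_cost = 7_000         # assumed installed cost of a new furnace ($)

annual_cost = annual_gallons * oil_price
annual_savings = annual_cost * savings_frac
payback_years = install_cost / annual_savings

print(f"Annual fuel cost: ${annual_cost:,.0f}")
print(f"Annual savings:   ${annual_savings:,.0f}")
print(f"Simple payback:   {payback_years:.1f} years")
```

With these placeholder numbers the simple payback comes out well past my 4-6 year window, which is exactly why the real consumption and savings figures matter so much.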
Any thoughts? Suggestions? Ideas?
Thanks in advance-