Stan, since one interesting thing in this thread is the discussion about correctly sizing a resistor's wattage rating for a circuit, I looked at your page:
I see this states the resistor is 1100 ohms and 1/4 watt and the current is 15 milliamps.
Well, this means the voltage drop across the resistor alone is (from Ohm's law) the current times the resistance: 0.015 * 1100 = 16.5 volts.
With a total "input" voltage of 23 volts, the two LEDs must then drop 23 - 16.5 = 6.5 volts, or 3.25 volts each, which makes sense.
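In case anyone wants to follow along, here is that arithmetic as a few lines of Python (just restating the numbers above; the variable names are mine):

    I = 0.015            # resistor current in amps (15 mA)
    R = 1100             # resistor value in ohms
    V_supply = 23        # total input voltage

    V_resistor = I * R               # Ohm's law: 0.015 * 1100 = 16.5 V
    V_leds = V_supply - V_resistor   # 23 - 16.5 = 6.5 V across both LEDs
    print(V_resistor, V_leds / 2)    # expect 16.5 and 3.25, give or take float rounding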
Power is P = I^2 * R (derived from Ohm's law and the fact that P = IV, which is the definition of electrical power in watts).
So the power in the resistor is equal to the current squared times the resistance.
0.015 * 0.015 * 1100 = 0.2475 watts ... just shy of the 1/4 watt rating.
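The same check in Python (again just my restatement of the figures above):

    P_resistor = 0.015**2 * 1100   # I squared times R
    print(f"{P_resistor:.4f}")     # 0.2475 W, right at the resistor's 1/4 W rating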
Your picture indicates that the total power is 105 milliwatts. I cannot work backwards to anything in the circuit at 105 milliwatts: at 15 milliamps you would need to be dropping 7 volts to dissipate 105 mW, and nothing in the circuit is at 7 volts... the two LEDs total 6.5 volts and the resistor is at 16.5 volts.
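Working backwards in Python (assuming the calculator's figure is meant as power at the stated current):

    P = 0.105    # the calculator's 105 mW figure, in watts
    I = 0.015    # the stated 15 mA
    print(f"{P / I:.2f} V")   # 7.00 V, and no element in the circuit drops 7 volts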
So I'm guessing that your calculator just indicates the resistor "next up" in size.
I think there may be an error here, but check my work.
The only thing I was hoping to contribute in this thread is that any resistor running at its full rated wattage is HOT! Too hot in my opinion: it could melt or soften plastic it is in contact with, and it will definitely burn your fingers.
So maybe it would be good (after finding the discrepancy) to recommend a wattage rating of double the actual wattage dissipated in the resistor.
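If it were me, I'd pick the rating with something like this (a sketch only; the wattage list is just the common stock sizes I know of, and pick_rating is a name I made up):

    STANDARD_WATTAGES = [0.125, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0]   # check what your supplier stocks

    def pick_rating(p_dissipated, margin=2.0):
        # Smallest standard rating at least `margin` times the dissipated power.
        target = p_dissipated * margin
        for rating in STANDARD_WATTAGES:
            if rating >= target:
                return rating
        raise ValueError("dissipation too high for the listed ratings")

    print(pick_rating(0.2475))   # 0.5, i.e. use a 1/2 watt part in this circuit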
Regards, Greg