It doesn't matter how long the oil is in the sump; it only matters how much
heat is lost from the outside of the sump to the air. Do we agree that the
only variable is how great the temperature difference is between the
surface of the sump and the air? If so, it doesn't matter how long the oil
is in the sump. For your theory to be correct, doesn't the surface of the
sump have to be hotter? Also, have you considered that on many engines the
sump picks up a lot of radiated heat from the exhaust, and that there
isn't a lot of airflow over the sump? The outer surface of the sump may
actually be hotter than the oil.
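
To put a number on it, heat loss from the sump surface is just Newton's
law of cooling, Q = h*A*(T_surface - T_air). A back-of-the-envelope sketch
in Python; the coefficient, area, and temperatures are made-up
illustrative values, not measurements:

    # Convective heat loss from the sump surface (Newton's law of cooling).
    # All numbers here are illustrative assumptions, not measurements.
    def sump_heat_loss_watts(h_w_per_m2_k, area_m2, t_surface_c, t_air_c):
        # Q = h * A * dT: the loss depends on the surface-to-air temperature
        # difference and the exposed area, not on how long the oil sits there.
        return h_w_per_m2_k * area_m2 * (t_surface_c - t_air_c)

    # Modest airflow (h ~ 25 W/m^2-K), 0.1 m^2 of sump, 90 C surface, 30 C air:
    print(sump_heat_loss_watts(25.0, 0.1, 90.0, 30.0))  # 150.0 W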
Mike
MU-2
"Matt Whiting" wrote in message
...
John_F wrote:
There are two terms in this temperature equation.
One is the steady-state thermal transfer rate. If you add a
quantity of heat, say one BTU, it will raise one quart of oil X
degrees F. If you add twice as much heat to the same oil, it will
raise the temperature of that quart 2X degrees F.
At any given RPM the oil pump will pump "Y" quarts of oil per minute
whether you have 2 quarts or 10 quarts in the sump, since the pump is a
positive-displacement gear pump. This means that the oil will carry
off the SAME amount of heat per minute if the temperature delta is the
same. If you want to get rid of more heat, you have to pump the
oil faster or run the oil hotter to get a larger temperature
difference. This is the steady-state condition. This is the condition
the engine is in when the temperature gauge quits moving up.
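
The heat the oil carries off can be put in numbers too: Q = m_dot * c_p * dT.
A quick sketch; the flow rate, density, and specific heat below are
assumed for illustration:

    # Heat carried away by the circulating oil: Q = m_dot * c_p * dT.
    # A positive-displacement gear pump moves the same volume per revolution
    # no matter how much oil is in the sump, so m_dot is fixed at a given RPM.
    OIL_DENSITY_KG_PER_L = 0.85   # assumed
    OIL_CP_J_PER_KG_K = 2000.0    # assumed specific heat of hot engine oil

    def heat_carried_watts(flow_l_per_min, delta_t_c):
        m_dot_kg_per_s = flow_l_per_min / 60.0 * OIL_DENSITY_KG_PER_L
        return m_dot_kg_per_s * OIL_CP_J_PER_KG_K * delta_t_c

    # Same pump speed and same delta-T give the same heat removal,
    # whether there are 2 quarts or 10 quarts aboard:
    print(heat_carried_watts(20.0, 15.0))  # ~8500 W with these assumptions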
Or you start with cooler oil in the sump. This is what, I believe, will
happen when you have more oil in the engine. The oil has a longer
residence time in the sump and contacts more surface area of the sump
through which it may dissipate heat.
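
Residence time is just sump quantity divided by pump flow, so it scales
directly with how much oil you carry. A trivial sketch with an assumed
flow rate:

    # Residence time in the sump scales with oil quantity, because the
    # pump's flow rate is fixed at a given RPM (assumed 8 qt/min here).
    PUMP_QPM = 8.0

    def residence_time_min(sump_quarts):
        return sump_quarts / PUMP_QPM

    for quarts in (2.0, 6.0, 10.0):
        print(quarts, "qt ->", residence_time_min(quarts), "min in the sump")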
Let's run a "thought" experiment at the limits. Let's assume that the
oil level is so low that no oil is ever in the sump. The oil pump pulls
it out just as fast as it comes in, just short of the point of sucking
air. I realize this isn't possible in the real world, but that is why
this is a thought experiment. In this case, the oil will get very hot
as it is constantly being circulated through the heads, which are among
the hottest parts of most engines. The oil has very little opportunity
to reject heat in the coolest part of the engine, the sump. The
equilibrium temperature will be rather high.
Now take the other extreme. The oil sump has infinite capacity, so the
oil starts out at the same temperature regardless of how hot the hot
parts of the engine are. The oil enters the oil pump relatively
cool and picks up heat, but any given parcel will never again be
circulated through the engine, so it has "forever" to dissipate its heat.
A real engine is somewhere in between these two limit cases, so it
is reasonable to expect some slope connecting the steady-state oil
temperature at one limit with the other. I don't think it reasonable
to believe that both steady-state temperatures will be the same,
giving a zero-slope line in between, yet that is what would have to
be the case for the oil temperature to be completely independent of
the amount of oil in the engine.
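
That interpolation can be made concrete with a crude lumped model: in
steady state the heat going into the oil equals the heat lost through the
sump, Q_in = h*A(V)*(T_oil - T_air), and if the wetted area A grows with
oil quantity, the equilibrium temperature falls. Every constant below is
an assumption for illustration, and the sump is treated as the only
cooling path:

    # Crude lumped steady-state model: Q_in = h * A(V) * (T_oil - T_air),
    # solved for T_oil. All constants are assumed, and sump cooling is
    # (unrealistically) treated as the only heat-rejection path.
    Q_IN_W = 300.0       # assumed heat load rejected through the sump
    H_W_PER_M2_K = 25.0  # assumed convection coefficient
    T_AIR_C = 30.0

    def wetted_area_m2(quarts):
        # Assumed: more oil wets more of the sump's inner walls.
        return 0.05 + 0.01 * quarts

    def equilibrium_oil_temp_c(quarts):
        return T_AIR_C + Q_IN_W / (H_W_PER_M2_K * wetted_area_m2(quarts))

    for q in (2.0, 6.0, 10.0):
        print(q, "qt ->", round(equilibrium_oil_temp_c(q)), "C")
    # With these assumptions: 2 qt -> ~201 C, 6 qt -> ~139 C, 10 qt -> ~110 C.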
Matt