You know what I mean... lmao. Too stoned to get all technical or use big words, but I'll try. Maybe he can understand it better if I did. Basically it is atmospheric gas colliding due to air pressure differences, which ends up causing more atmospheric gas to spread out over a larger surface area. I was going to use confused flow or laminar flow in a non-linear fashion, but that was a bit too much for me right now. However, atmospheric gas does compress or "slam" into each other. Air is made out of particles such as nitrogen, dust, oxygen, water, etc., and when those particles are hit they move.
There is nothing slamming into something else per se.......air (or more correctly, gas) moves due to pressure differences.......in either a laminar or confused flow.......
You guys have made some very valid points, some of which I had never pondered on myself.
But still, doesn't compute in my brain. I just feel it's way too inefficient. Blowing into the light, to get the max air flux into the fins to dissipate heat, makes no sense. We're talking ambient air temps, not actively cooled air.
Heat dissipation scales with CFM, which is a measure of volume exchange, and will be much higher if one blows out of the unit. If you blow in, I agree more homogeneous dissipation occurs, but the added pressure will cause a higher overall heat build-up. Also, the lower actual CFM due to the increase in pressure will increase the overall heat. Basic physics here. Also, the fan will start working against a pressure, becoming slightly less efficient in its air flux rating. All of these together add up to a significant amount.
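To put rough numbers on the CFM side (my own ballpark figures, nothing measured): the heat an airstream carries off is roughly Q = rho * V_dot * c_p * dT_air, so any CFM lost to back pressure comes straight off the watts you can move.

# Rough sketch with assumed numbers (60 CFM free-blowing, 40 CFM against
# back pressure, 5 °C air temperature rise) rather than measured ones.
RHO_AIR = 1.2      # kg/m^3, air at roughly room temperature
CP_AIR = 1005.0    # J/(kg*K), specific heat of air

def heat_removed_watts(cfm, air_temp_rise_c):
    """Heat (W) carried away by `cfm` of airflow warming by `air_temp_rise_c`."""
    v_dot_m3s = cfm * 0.000471947   # convert CFM to m^3/s
    return RHO_AIR * v_dot_m3s * CP_AIR * air_temp_rise_c

print(heat_removed_watts(60, 5))   # ~171 W with the fan blowing freely
print(heat_removed_watts(40, 5))   # ~114 W if back pressure cuts flow to 40 CFM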
Yes, air follows the path of least resistance, so blowing out obviously causes "cold channels", but the capacity of a system to dissipate heat is primarily down to the temperature coefficient, meaning the delta-T across the interface, not the mean average. If you blow in, the mean average temp will be lower, but so will the temperature coefficient, rendering the cooling system less efficient.
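Just to pin down what I mean by the coefficient and delta-T, the relation I'm leaning on is Newton's law of cooling, Q = h * A * (T_sink - T_air). A quick sketch with made-up ballpark numbers, not anyone's actual heatsink:

# Minimal sketch: for a given airflow (fixed h) and fin area, the watts
# exchanged scale directly with the sink-to-air delta-T at the interface.
H_W_M2K = 30.0      # assumed heat transfer coefficient for a fan-blown sink
FIN_AREA_M2 = 0.25  # assumed total fin area

def heat_exchanged_watts(t_sink_c, t_air_c):
    """Convective heat transfer (W) from the sink surface into the passing air."""
    return H_W_M2K * FIN_AREA_M2 * (t_sink_c - t_air_c)

print(heat_exchanged_watts(45, 28))  # 17 °C delta-T -> ~128 W exchanged
print(heat_exchanged_watts(33, 28))  #  5 °C delta-T -> ~38 W exchanged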
Blowing outwards, dissipation may not be so homogeneous, but with "cold channels" the outer parts of the heatsink will be much colder than the internal zone, causing a higher coefficient and promoting a higher thermal exchange, essentially being way better overall at getting rid of that excess heat at the interface.
Plus, blowing in causes higher turbulence = added temperature increase.
I don't know… I may be totally wrong here, but I truly believe the best way to cool a specific zone is to direct the excess heat away from it, not blow ambient-temp air towards it. I am of course talking about a heated grow room and not cooled, compressed air to combat excess heat. I'm talking about using ambient warm temps to cool an already hot zone. It's all about maxing out the thermal differential at the interface.
This didn't bring any direct answers, but did help reaffirm my intuitions
Understanding Thermal Dissipation and Design of a Heatsink - Texas Instruments - Nikhil Seshasayee
http://www.ti.com/lit/an/slva462/slva462.pdf
What I am trying to get at is essentially that if the ambient temperature is hot, then blowing out makes much more sense. If you have cooled, compressed air, then blowing in makes much more sense, because the thermal difference compensates for the added internal pressure plus the fan inefficiency.
If the ambient temperature is grow room temps, then by all means one must make an effort to create the highest thermal coefficient possible, meaning getting that extra heat out of there as fast as possible!
By blowing in, the delta-T is so low that the added pressure and lower CFM rating are not compensated for by a more homogeneous temperature reading throughout the heatsink. So there is a much narrower delta-T to work with.
Hope my reasoning makes sense to you…?
__
Well, I have a 40cm heatsink with 4 COBs on it and a 120mm fan on top blowing into it. I have a laser thermometer to check temps, so I am going to take the temps of the heatsink with the fan blowing onto it, then I will flip the fan upside down and take the temps with it sucking air from the heatsink. Maybe this can clear up a few questions?
Hold on, I'm basically talking about closed enclosures, not open. If it's open, there is no significant pressure build-up and all my ramblings are moot.
You guys have made some very valid points, some of which I had never pondered on myself.
But still, doesn't compute in my brain. I just feel it's way too inefficient. Blowing into the light to get the max air flux into the fins to dissipate heat, makes no sense. We're talking ambient air temps here, not actively cooled air.
All my thoughts have been for using the same ambient air temperatures between the grow space and the heat sinks... not actively cooled air, such as using air conditioning and then blowing that air over the heat sink...
Heat dissipation scales with CFM, which is a measure of volume exchange, and will be much higher if one blows out of the unit. If you blow in, I agree more homogeneous dissipation occurs, but the added pressure will cause a higher overall heat build-up. Also, the lower actual CFM due to the increase in pressure will increase the overall heat. Basic physics here. The fan will start working against a pressure, becoming less efficient in its air flux rating (sometimes quite significantly). All of these together add up to a significant amount.
I can't see how the added pressure within the case will cause a temperature rise... I understand the air will be causing resistance, and yes, that resistance may be converted to heat, but realistically it's negligible... whether the fan is pulling air up and through or forcing it down and out, the heat increase from that resistance is almost zero and not worth thinking about...
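For what it's worth, a quick back-of-envelope check (my own numbers, not from anyone's build) says the same: even if every bit of the fan's pressure rise ended up as heat, the air would only warm by roughly dP / (rho * c_p).

# Sketch with assumed pressure figures: how much the air could warm if the
# whole fan pressure rise were converted to heat.
RHO_AIR = 1.2    # kg/m^3
CP_AIR = 1005.0  # J/(kg*K)

def pressure_heating_c(delta_p_pa):
    """Air temperature rise (°C) if the entire pressure rise ends up as heat."""
    return delta_p_pa / (RHO_AIR * CP_AIR)

print(pressure_heating_c(50))   # ~0.04 °C for a typical 50 Pa case fan
print(pressure_heating_c(500))  # ~0.41 °C even at 500 Pa, far beyond an axial fan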
Yes, air follows the path of least resistance, so blowing out obviously causes "cold channels", but the capacity of a system to dissipate heat is primarily down to the temperature coefficient, meaning the delta-T across the interface, not the mean average. If you blow in, the mean average temp will be lower, but so will the temperature coefficient, rendering the cooling system less efficient. There is no cooling action the closer you get to delta-T = 0.
Sorry, this may sound rude, but I really don't understand the point about the temperature differences... the whole point of a heat sink is to minimise the temperature difference between the ambient air and the heatsink... it sounds as if you want to create a high delta-T between the air and the heatsink, when actually you want the complete opposite... you want the heatsink to sit at the same temperature as the air, and if you are managing to do this then you have a very efficient 'cooling' system in place... I agree the higher the delta-T, the greater the amount of heat transferred... but that's like saying I want my COB to run at 100°C so I get better heat transfer, when realistically you want it as low as possible... so if this means you have a delta-T of 0, then fair play to you, it means your cooling system is as good as it can get!
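If it helps, here's that point as a simple series thermal-resistance chain, T_cob = T_ambient + P * (R_jc + R_interface + R_sink_to_air), with made-up example values rather than real COB or heatsink specs:

# Sketch: better airflow lowers the sink-to-air resistance, which lowers the
# COB temperature for the same load. All resistances here are illustrative.
def cob_temp_c(t_ambient_c, power_w, r_jc, r_interface, r_sink_air):
    """Steady-state COB temperature (°C) from a simple series resistance model."""
    return t_ambient_c + power_w * (r_jc + r_interface + r_sink_air)

# Same 50 W load and 28 °C grow-room air; only the sink-to-air resistance changes.
print(cob_temp_c(28, 50, 0.3, 0.1, 0.8))  # poor airflow over the sink:   88 °C
print(cob_temp_c(28, 50, 0.3, 0.1, 0.4))  # better airflow over the sink: 68 °C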
Blowing outwards, dissipation may not be so homogeneous, but with "cold channels" the outer parts of the heatsink will be much colder than the internal zone, causing a higher coefficient and promoting a higher thermal exchange, essentially being way better overall at getting rid of that excess heat at the interface.
The issue with creating a 'cold channel', as you call it, is that the center of the heat sink will be at a MUCH higher temperature, as there is basically zero air flow there, so zero heat transfer apart from natural conduction will occur over those fins of the heat sink... heat sinks are designed to have the heat evenly distributed around them, and considering the COB you are trying to protect is in the center of the heat sink, why would you want to cool the outside edges and let the center of the heatsink reach very high temperatures? You want to cool the material that is directly in contact with the heat source, as that's where damage would occur... obviously you will get some natural conduction through the heat sink, but it's not enough to dissipate the heat coming directly from the COB in the center...
Plus, blowing in causes higher turbulence = added temperature increase.
Negligible in our case, really! I've never felt a section of ductwork that's warm to the touch from resistance... even with 30 m³/s at 15 m/s through it.
I don't know… I may be totally wrong here, but I truly believe the best way to cool a specific zone is to direct the excess heat away from it, not blow ambient-temp air towards it. I am of course talking about a heated grow room and not cooled, compressed air specifically created to combat excess heat (created using added energy). I'm talking about using ambient warm temps to cool an already hot zone. It's all about maxing out the thermal differential at the interface.
As I've said above, yes, you get greater heat transfer when the delta-Ts are highest, but when cooling something, why would you want to let it get hotter just so you have a larger temperature differential and can get a greater amount of heat transfer... it's like letting it run hotter so you can just cool it some more, if that makes sense...
This didn't bring any direct answers, but did help reaffirm my intuitions
Understanding Thermal Dissipation and Design of a Heatsink - Texas Instruments - Nikhil Seshasayee
http://www.ti.com/lit/an/slva462/slva462.pdf
What I am trying to get at is essentially that if the ambient temperature is hot, then blowing out makes much more sense. If you have cooled, compressed air, then blowing in makes much more sense, because the thermal difference compensates for the added internal pressure plus the fan inefficiency.
If the ambient temperature is grow room temps, then by all means one must make an effort to create the highest thermal coefficient possible, meaning getting that extra heat out of there as fast as possible!
By blowing in, the delta-T is so low that the added pressure and lower CFM rating are not compensated for by a more homogeneous temperature reading throughout the heatsink. So there is a much narrower delta-T to work with.
What I don't get is how the delta-T is lower because you're blowing in rather than drawing it out. The air is coming from the same space, so it should be the same temperature... the only difference is the air being blown out will be a lot closer to the ambient air temperature, as the heat transfer is a lot less efficient when it is only cooling, say, 20% of the heat sink around the edges where it has a much clearer air path...
The temperature of the air leaving when being blown in will be a lot higher, as it's cooling 100% of the heat sink and reaching the center and hard-to-reach places... if I could set up this scenario at home I would, but I don't have any means to test it really...
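Here's a rough way to picture the 20% vs 100% coverage point without building the rig (the h, fin area and 20% figure are all assumptions for illustration): the sink temperature needed to shed a fixed load climbs fast when only part of the fin area actually sees airflow.

# Sketch: sink temperature required to dump a fixed load through whatever
# fraction of the fin area the airflow actually sweeps. Values are guesses.
def sink_temp_c(t_air_c, power_w, h_w_m2k, fin_area_m2, coverage):
    """Sink temperature (°C) needed to shed `power_w` through the swept fin area."""
    return t_air_c + power_w / (h_w_m2k * fin_area_m2 * coverage)

# 50 W load, 28 °C air, ~30 W/(m^2*K), ~0.25 m^2 of fins
print(sink_temp_c(28, 50, 30, 0.25, 1.0))  # ~34.7 °C with the whole sink swept
print(sink_temp_c(28, 50, 30, 0.25, 0.2))  # ~61.3 °C if only ~20% of the fins see flow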
Hope my reasoning makes sense to you…?
__