0-10 V and 2-10 V
What is the difference between the two? I was told there is really no difference. I'm thinking, if there is a device that is 2-10 V, anything under 2 volts and the device should not move. If the device is 0-10 V, the device should move as soon as it is above 0 volts. Is this correct?
Correct. But, way back when, engineers settled on some standard signal ranges so every device manufacturer could use semi-standard input/output sensors. Of course a 0-10 V signal moves through its full range of 0-100%. The 2-10 V signal also moves through its full range of 0-100%.
The advantage of 2-10 V is: if you are reading less than 2 V, there must be a problem. On a 0-10 V signal, if you lose your voltage, will you know there is a problem? Maybe not, because 0 is a valid signal in the 0-10 V range.
Also, common "live zero" signals are 2-10 V, 1-5 V, and 4-20 mA.
Note these all have the same exact ratio: the zero point is 20% of full scale (2/10 = 1/5 = 4/20).
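To make that concrete, here is a minimal sketch (my own, not from the thread; the function names are just illustrative) of scaling a live-zero signal to percent of range, and using the live zero for fault detection:

```python
def signal_to_percent(value, zero, full):
    """Map a signal reading to 0-100% of range.

    zero -- signal at 0%   (0 V, 2 V, 1 V, or 4 mA)
    full -- signal at 100% (10 V, 5 V, or 20 mA)
    """
    return (value - zero) / (full - zero) * 100.0

def is_fault(value, zero, tolerance=0.1):
    """A reading noticeably below the live zero suggests a broken wire."""
    return value < zero - tolerance

# All three common live-zero ranges share the same 20% offset ratio:
for zero, full in [(2, 10), (1, 5), (4, 20)]:
    assert zero / full == 0.2

print(signal_to_percent(6.0, 2, 10))  # 6 V on a 2-10 V signal -> 50.0 (%)
print(is_fault(0.3, 2))               # 0.3 V on a 2-10 V signal -> True
print(is_fault(0.0, 0))               # 0 V on 0-10 V -> False: can't tell!
```

The last line is the point of the thread: on a 0-10 V signal, a dead wire reads the same as a valid 0% command.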
Hope that helps explain (and doesn't confuse).
With a 2-10 V device on a 0-10 V output (from what I've seen this is the most common mismatch), you will lose 20% of the range, i.e. the output has to ramp up to 2 V before anything happens. This may not be a problem if the control loop has integral gain; it will just ramp up as needed.
A 0-10 V device on a 2-10 V output (not as common) will never reach the zero position/stroke/fully closed or whatever, so this may be a problem. I've never tried a zener diode (1.8-2 VDC) in series with the control signal, but it might work; you would only have 8 V at 100% signal, though.
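The two mismatch cases above can be sketched numerically (again my own illustration, assuming the device simply clamps outside its calibrated range):

```python
def device_position(output_v, dev_zero, dev_full):
    """Position (0-100%) a device reaches for a given output voltage."""
    pct = (output_v - dev_zero) / (dev_full - dev_zero) * 100.0
    return max(0.0, min(100.0, pct))  # device clamps outside its range

# Case 1: 2-10 V device on a 0-10 V output -- dead band below 2 V.
print(device_position(1.0, 2, 10))   # 0.0: first 20% of output does nothing
print(device_position(10.0, 2, 10))  # 100.0: full stroke still reachable

# Case 2: 0-10 V device on a 2-10 V output -- can never fully close.
print(device_position(2.0, 0, 10))   # 20.0: minimum output leaves it open
```

Case 1 only costs resolution (integral action rides through the dead band); case 2 leaves a floor the device can never get below, which matters for anything that must drive fully closed.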