Why 3.3V instead of 3V?
Typical voltage values used in electronics and power supplies are almost always round numbers:
- 5V
- 12V
- 28V
- 48V
- 60V
I understand that in some cases ICs are built to accept the voltage coming directly from a lithium-ion cell in order to simplify the power stage, and therefore take a 4.2V input.
But what is the reason the 3.3V level is omnipresent, instead of 3V?
Answer
This question popped up in the feed and I got curious. Here is what I could find. Note that I am not an IC engineer, so my interpretation of some of the facts may be off.
The 3.3V level is defined in the JESD8 standard, published by JEDEC.
3.3V is a stepping stone on the path of decreasing supply voltages. The decrease came about because of improvements in chip design and the introduction of CMOS, which moved the optimal operating voltage down.
The standard is also nice in that it provides some backward compatibility: its logic levels are compatible with TTL, which means 3.3V CMOS chips can work alongside TTL chips. See the chart below for the logic level comparison (from Analog Devices). Note that the standard also allows operation at the 3V level, which it calls the "Extended range".
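As a quick illustration of why that compatibility works, here is a minimal sketch in Python. The threshold values are nominal worst-case figures commonly quoted for 5V TTL and 3.3V LVTTL parts; treat them as approximate, the JESD8 documents and individual datasheets are the authority.

```python
# Nominal worst-case logic thresholds in volts (typical datasheet values,
# not copied from the standard).
FAMILIES = {
    "TTL (5V)":     {"VOH_min": 2.4, "VOL_max": 0.4, "VIH_min": 2.0, "VIL_max": 0.8},
    "LVTTL (3.3V)": {"VOH_min": 2.4, "VOL_max": 0.4, "VIH_min": 2.0, "VIL_max": 0.8},
}

def compatible(driver: dict, receiver: dict) -> bool:
    """A driver can talk to a receiver if its guaranteed high output sits above
    the receiver's high-input threshold and its guaranteed low output sits below
    the receiver's low-input threshold, i.e. both noise margins are positive."""
    high_margin = driver["VOH_min"] - receiver["VIH_min"]
    low_margin = receiver["VIL_max"] - driver["VOL_max"]
    return high_margin >= 0 and low_margin >= 0

for drv_name, drv in FAMILIES.items():
    for rcv_name, rcv in FAMILIES.items():
        ok = compatible(drv, rcv)
        print(f"{drv_name} -> {rcv_name}: {'OK' if ok else 'not compatible'}")
```

Note that the thresholds are only half of the story: driving a 3.3V input from a 5V TTL output also requires the 3.3V part to be 5V tolerant, which is a separate absolute-maximum rating that a threshold check like this does not capture.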
Why the number 3.3? The standard itself dates to the 90s, but the value seems to go back further, to early IC development. I can find some anecdotal evidence that the level is a consequence of RTL design in the early days of semiconductor technology, but nothing I can reference.
So, in short: we use 3.3V now because it is a standard. The value itself comes from the properties of silicon and the semiconductor processes the early manufacturers used. Why exactly that value was chosen is a question for the history buffs on this site.