
# Why 3.3V instead of 3V?

+10
−0

The usual voltage values in electronics and power supplies are almost always round numbers:

• 5V
• 12V
• 28V
• 48V
• 60V

I understand that in some cases ICs are built to accept the voltage coming directly from a lithium-ion cell (a 4.2V input) in order to simplify the power stage.

But what is the reason the 3.3V level is omnipresent, rather than 3V?


+4
−0

Sorry, but I have to challenge your premise that those values are "round numbers" and that this could be a reason for choosing them.

You are comparing those values by their number of decimal digits, but that's the wrong comparison: you should compare significant digits.

In fact, what matters for any nominal value is its tolerance (i.e., its relative accuracy), not its absolute accuracy.

That's why many standard voltage values are also standard preferred values: it's a matter of tolerance spread.

Here is the first decade of the E24 series (with E12 series values in bold):

**1.0** 1.1 **1.2** 1.3 **1.5** 1.6 **1.8**
2.0 **2.2** 2.4 **2.7** 3.0 **3.3** 3.6
**3.9** 4.3 **4.7** 5.1 **5.6** 6.2 **6.8**
7.5 **8.2** 9.1

These are the same series from which standard values for component parameters are taken (for example, resistance values for resistors or Zener voltage values for Zener diodes).
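The "tolerance spread" point can be made concrete: adjacent E12 values are spaced by a roughly constant *ratio* of about 10^(1/12) ≈ 1.21, so the ±10% tolerance bands of neighboring values roughly abut. A quick sketch (the series values are from the table above; the check itself is only illustrative):

```python
# E12 preferred values (first decade), as listed above.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

# The series is approximately geometric: each value is the previous
# one times ~10**(1/12) ~ 1.2115, so the *relative* spacing is constant.
ideal_step = 10 ** (1 / 12)
steps = [b / a for a, b in zip(E12, E12[1:] + [10.0])]  # wrap to next decade

# Every step is within a few percent of the ideal geometric ratio.
assert all(abs(s - ideal_step) / ideal_step < 0.05 for s in steps)
```

This constant-ratio spacing is exactly why absolute "roundness" is irrelevant: 3.3 is no less natural a member of the series than 1.0.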

Note that these series cover many (most?) common "standard" voltage values, for example: 12V, 15V, 18V, 24V, 75V, 120V, 240V.

The only notable outlier is 5.0V.

So, to answer your question, although the historical truth about that choice is quite obscure, I wouldn't rule out that when they decided they needed a standard voltage around 3V they simply picked the nearest E12 series value.

I stumbled across this JEITA (Japan Electronics and Information Technology Industries Association) document:

> 3.3V±0.3V (Normal Range), and 2.7V to 3.6V (Wide Range) Power Supply Voltage and Interface Standard for Nonterminated Digital Integrated Circuit

It contains an explanatory section that reports some of the history of the standardization discussion (mentioning JEDEC). On page 5 it reads (emphasis mine):

> A power supply voltage of digital circuits had been kept 5V, actually for a long time from 1980's. But, in 1990's, a requirement of low power supply voltage has become increasing to attain a low power consumption and a high noise immunity of electric equipments, in a main application of portable equipments (note PC, etc) which need a long battery operation and high performance equipments (WS, etc) which require a high speed.
>
> In 1990's, also an age of deep sub-micron process technology (below 0.5μm process technology) has begun. Needs of low power supply voltage have become the most important issue to obtain a keeping of reliability and continuities of the trends of high density, high speed, together. According to above back ground, discussion for standard of low power supply voltage, firstly 3.3V, have begun in JEDEC, from early of 1990's. 3.3V standard (JESD8-A) was enacted in June, 1994, 2.5V standard (JESD8-5) was in October, 1995 and 1.8V standard (JESD8-7) was in February, 1997, respectively.
>
> Especially, 3.3V JEDEC standard (JESD8-A) was required to maintain 5V TTL and 5V CMOS compatibility, because both 5V and 3.3V power supply voltages were used in a transition period from 5V to 3.3V when it was a first case of lower power supply voltage. To obtain a 5V compatibility, this standard defined a specifications of LVTTL and LVCMOS.
>
> IC Low Voltage Operation Sub-committee began the discussion for standard of low power supply voltage since April, 1996 in EIAJ, according to JEDEC's activities of power supply discussion, in anticipation of a real popularization of 3.3V power supply voltage from the half of 1990's and coming of next lower supply voltage than 3.3V. EIAJ standard of 3.3V power supply voltage was established in May, 1998. This standard corresponds to that of JEDEC about specifications, because it has been already known and used widely in the world. But, this standard is amended from that of JEDEC about notation and sentence for accomplishing the unify of them among three JEDEC standards (3.3V, 2.5V, 1.8V).

So it seems that the value (probably picked to be a preferred number) was chosen because the logic levels of the new digital CMOS ICs had to be compatible with the existing 5V CMOS and TTL families.

What exactly "compatible" means in this context should be explained in the JESD8-A standard itself.

I guess that it is something about input and output logic levels, but the diagrams in @Mu3's answer don't explain all the implications.

EDIT2 (found JEDEC standard)

I've retrieved a copy of the JESD8C.01 standard, which supersedes the JESD8A (incorrectly mentioned as JESD8-A in the JEITA document cited above).

I got the JESD8C.01 from the JEDEC website (requires free registration). Unfortunately JESD8A is no longer available and so we can't do anything but assume that JESD8C.01 contains essentially the same information.

NOTE: I've asked JEDEC and got permission to quote excerpts of the aforementioned standard, provided I give attribution with a format they required (that's the reason for all the boilerplate attribution text below).

The standard is named:

> JESD8C.01 (Interface Standard for Nominal 3 V/3.3 V Supply Digital Integrated Circuits) [1]

which already says a lot. It's a standardization effort to ease the transition from 5V logic chips to lower voltage chips and it specifies the input/output voltage thresholds for the new "3.3V families".

It defines three power supply ranges, named narrow, normal and extended, which correspond to commercial, industrial and military ranges often used by manufacturers. The main requirements are these:

This table is reproduced, with permission, from JEDEC document JESD8C.01, table 1.
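Two of those ranges can be recovered from the JEITA document title quoted earlier; here is a small sketch (I'm assuming JEITA's "wide range" corresponds to the standard's extended range, and the narrow/commercial range is not given in this post, so it is omitted):

```python
# Supply ranges from the JEITA title quoted earlier. Assumption: the
# JEITA "wide range" maps to the JEDEC extended range; the narrow
# (commercial) range is not reproduced in the post, so it is left out.
NORMAL = (3.0, 3.6)    # 3.3 V +/- 0.3 V
EXTENDED = (2.7, 3.6)  # "wide range"

def in_range(vdd, limits):
    """True if a supply voltage falls inside the (min, max) window."""
    lo, hi = limits
    return lo <= vdd <= hi

# A plain 3.0 V rail is legal, but only at the very bottom of the
# normal range; at 2.8 V only the extended range still covers it.
print(in_range(3.0, NORMAL))    # True
print(in_range(2.8, NORMAL))    # False
print(in_range(2.8, EXTENDED))  # True
```

This is why "3V logic" and "3.3V logic" parts can usually interoperate: a nominal 3V supply sits inside the standardized 3.3V windows.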

Then it goes on to define two compatibility classes, LVTTL-compatible devices and LVCMOS-compatible devices, which I'll simply call LVTTL devices and LVCMOS devices below.

NOTE: all the requirements stated in the standard must be met by the devices regardless of their range (narrow, normal, extended) and over their entire VDD range.

LVTTL and LVCMOS input requirements

This table is reproduced, with permission, from JEDEC document JESD8C.01, table 2.

The difference between LVTTL and LVCMOS is in their output requirements.

LVTTL output requirements

This table is reproduced, with permission, from JEDEC document JESD8C.01, table 3.

LVCMOS output requirements

This table is reproduced, with permission, from JEDEC document JESD8C.01, table 4.
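Since the tables themselves are not reproduced here, the commonly quoted LVTTL DC levels (an assumption on my part; the normative numbers are in the standard) at least illustrate the compatibility point: the input thresholds are the classic TTL 0.8 V / 2.0 V, so a 5V TTL driver and an LVTTL receiver understand each other.

```python
# Commonly quoted LVTTL DC levels (assumed here, since the standard's
# own tables are not reproduced above). The input thresholds are the
# same 0.8 V / 2.0 V as classic 5 V TTL; that is the compatibility point.
LVTTL = {
    "VIL_max": 0.8,  # highest input voltage still read as logic low
    "VIH_min": 2.0,  # lowest input voltage still read as logic high
    "VOL_max": 0.4,  # worst-case guaranteed output low
    "VOH_min": 2.4,  # worst-case guaranteed output high
}

# DC noise margins: the slack between what a driver guarantees and
# what a receiver requires, for each logic state.
nm_low = LVTTL["VIL_max"] - LVTTL["VOL_max"]
nm_high = LVTTL["VOH_min"] - LVTTL["VIH_min"]
# Both margins come out to about 0.4 V.
```

With these numbers, the noise margin in each state is around 0.4 V, which is why mixed 5V-TTL/LVTTL systems were practical during the transition period.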


+7
−0

This question popped up in the feed and I got curious. Here is what I could find. Note that I am not an IC engineer so my interpretation of some facts may be off.

The 3.3V level is defined in the JESD8 standard, published by JEDEC.

3.3V is a stepping stone on the path of decreasing supply voltages. This came about because of the improvements in chip design and introduction of CMOS, which in turn moved the optimal operating voltage down.
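The low-power motivation can be quantified with the standard CMOS dynamic-power relation P = C·V²·f: the supply voltage enters squared, so dropping from 5V to 3.3V cuts switching power by more than half. The capacitance and frequency below are illustrative, not from the standard:

```python
# CMOS dynamic (switching) power: P = C * Vdd^2 * f. The supply
# voltage enters squared, which is why process shrinks kept pushing
# supply voltages down. C and f here are purely illustrative.
def dynamic_power(c_farads: float, vdd: float, freq_hz: float) -> float:
    return c_farads * vdd ** 2 * freq_hz

p_at_5v0 = dynamic_power(1e-9, 5.0, 100e6)  # 1 nF switched, 100 MHz
p_at_3v3 = dynamic_power(1e-9, 3.3, 100e6)

# Same chip, same clock: 3.3 V burns only (3.3/5)^2 ~ 44% of the power.
ratio = p_at_3v3 / p_at_5v0
```

The same square law explains each later step (2.5V, 1.8V, ...): every reduction in the rail buys a quadratic saving in switching power.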

This standard is nice in that it also provides some backward compatibility: its logic levels are compatible with TTL, so CMOS chips can work alongside TTL chips. See the chart below for the logic-level comparison (Analog Devices). Note that the standard actually allows operation at the 3V level, calling it the "extended range".

Why the number 3.3? This seems to go way back, past the 1990s standards, to the development of the first ICs. I can find some anecdotal evidence about this level being a consequence of RTL design in the early days of semiconductor technology, but nothing that I can reference.

So in short: we use 3.3V now because it is a standard. The value itself comes from the properties of the silicon and semiconductor processes that the early manufacturers used; why that is so is a question for the history buffs on this site.


+0
−0

The other answers give excellent justifications for the industry standards imposed on CMOS logic. The historical migration to smaller-lithography CMOS led to lower Cds values, which in turn gave faster rise times and higher toggle frequencies. However, realize that alongside the documented standards there has been an undocumented convention of roughly 50 Ω, and later 25 Ω, driver output impedance at nominal voltage across the reductions from 5V to all the lower-threshold logic families. Naturally, RdsOn drops with rising Vdd, and a ±10% Vdd tolerance yields a wide variation in RdsOn (= Vol/Iol).

3.3V was more or less an arbitrary 2/3 of the 5V standard. But if you choose a lower Vdd such as 3V for a 3.3V family, RdsOn is slightly higher, Pd is slightly lower for a given fmax, and the rise time is slightly longer because of the lower drive current.

I hope this satisfies your curiosity.


+1
−1

Actually, I think this is a really good question. It's more of a historical one, I guess (and I don't know the historical answer either), but from a product-building perspective I think the following are probably the reasons.

75V DC is a cut-off value for safety testing on products (OSHA/UL requirements in North America; the LVD in Europe). That's a good reason not to use 75V DC.

I think the real answer is probably multi-disciplinary and comes down to the economics of producing batteries.

I have some family friends who own battery factories overseas, and I know a bit about LiPos from a consumer-goods and safety perspective, but I'm no expert on battery chemistry.

For some reason, battery chemistries appear to be really easy to produce in the 3V and 5V regions. My guess is that batteries got commercialized first at 3V/5V, and the ICs were then built to use the cheapest and most readily available batteries.

(Your dollar-store AAA battery is 1.5V; put two in series and you are at 3V. So if I'm building semiconductors, do I require my end user to go fabricate a custom 4V battery, or do I design the whole thing around the fact that they can go to the dollar store and power up my gear with \$0.30 of batteries?)

The whole point of hardware is to sell it.

Same thing on the regulatory side: the FCC creates new rules for how you can use the RF airwaves, and voilà, your local RF semiconductor company creates a new chip to make the most of the new rules. First comes the regulation, then comes the hardware.
