Distilled Automotive Electronics Design

The Rise of the 12V Net

The electrical network in automobiles has come a long way over the last century, and it continues to be a work in progress. The first useful American electric starter was patented by Kettering and Leland in 1911. Before that, the internal combustion engine automobile was a non-starter: the market was dominated by all-electric vehicles powered by lead acid batteries. Electric vehicles were quiet and did not need the over-half-hour preheat of competing steam cars. Internal combustion engines, to the detriment of their owners, had to be hand cranked or pull-cord cranked, producing more “cranky,” or severely injured, drivers than clean starts. The electric starter changed the game with pushbutton starts. The battery went from powering the motor to simply starting it.

The nominal battery voltage has not always been 12V. It was around 1955 that Chevrolet replaced the 6V system with the 12V system we see today, and Volkswagen switched over after 1966. Larger engines and increasing electrical demands made the step up necessary. Today, increasing power demands are driving the call for an even higher voltage level in an effort to keep down the currents of the high power systems. Even so, discussions over the 42V power net have shown that there is no single higher voltage step (such as 42V) that satisfies all power and safety requirements. Instead, the future probably lies in a dual voltage system, similar to the type currently used in hybrid and electric vehicles, with a 12V system for the lower power systems and a higher voltage for the high power systems. The 12V system is here to stay, so it makes sense for power supply designers to understand it well.



How to Survive the Test: Remember These Three Things

Over its 60-year lifetime, the 12V system has been increasingly used and abused, resulting in unique, extensive and growing test catalogues at each automotive manufacturer, even with attempts to simplify and unify requirements through the ISO7637-1 standard. The breadth and depth of these various test catalogues can be daunting to an ECU (electronic control unit) designer. Furthermore, 60 years of history have led to some vestigial requirements: outdated specifications that can soak up unnecessary design time. It is important for an ECU or module designer to understand the reasons behind the test pulses. With a good understanding of the issues behind the tests, most of which are the same across manufacturers, it is possible to design generically, covering most manufacturers’ requirements. The three big issues are high voltage excursions, low voltage cold crank conditions and quiescent current.

One: High Voltage Excursions—Normal vs Nominal vs Worst Case

Let’s start with the basics, the nominal system voltage. In modern automobiles, the nominal 12V battery voltage comes from (typically) six 2V AGM (absorbent glass mat) lead acid cells. Charging voltages are generally higher when the battery is cold. Battery temperature is measured, along with current and voltage via sensors at one of the battery poles. From these monitored parameters, the battery’s state of charge (SoC) and state of health (SoH) can be calculated and the alternator voltage adjusted accordingly, so that 12V is nominal, but not necessarily normal.
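As an illustrative sketch only, not any manufacturer’s actual battery management algorithm, the simplest way to derive SoC from the pole-mounted current sensor is coulomb counting: integrate current over time against the battery’s rated capacity. The 70Ah capacity and the initial SoC below are assumptions.

```python
# Minimal coulomb-counting SoC sketch (illustrative; real BMS code also
# fuses voltage and temperature and corrects for battery aging).

def update_soc(soc, current_a, dt_s, capacity_ah=70.0):
    """Advance the state of charge by integrating battery current.
    current_a > 0 means charging; soc is a fraction in [0, 1]."""
    delta_ah = current_a * dt_s / 3600.0   # charge moved, in amp-hours
    soc += delta_ah / capacity_ah          # as a fraction of capacity
    return min(max(soc, 0.0), 1.0)         # clamp to physical limits

# Example: 30 minutes of 7A charging on a 70Ah battery from 50% SoC.
soc = update_soc(0.5, current_a=7.0, dt_s=1800)
print(f"SoC: {soc:.0%}")  # 7A * 0.5h = 3.5Ah = +5% -> 55%
```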

There is more. Simple brake energy recuperation can be achieved by controlling the alternator to its maximum current during braking periods. Add to that automated start-stop systems, which force a higher charge/discharge dynamic on the battery and a higher voltage dynamic on the 12V system. In sum, the highest voltages expected in normal operation are about 16.5V when cold.

Of course, normal doesn’t take into account worst-case scenarios, which are not that abnormal in the real world. For example: the dual battery jump-start. Enterprising tow truck operators have discovered that the 24V generated by two series-connected truck batteries can jump start a car engine much faster than a single battery. Never mind that this trick presents a very real danger of an exploding battery and goes against clear manufacturer admonitions.

Another example: sudden disconnect of the battery while the alternator is charging at high current. Usually, alternators are regulated via their rotor field current. The rotor field winding has a high inductance, so its current decays with a relatively slow downslope. The typical timescale for such a “load dump” event is about 400ms before the alternator current dials down.

To avoid overvoltage, B6 rectifier bridge diodes act as avalanche diodes, clamping the voltage to about 34V. B6 is an abbreviation for the typical 6-diode rectifier found in the 3-phase system of the alternator. To protect against a load dump event alone, it would be optimal to design the B6 diode avalanche at 18V, but then the diodes would explode in a dual battery jump start. So they are designed for about 25V–26V avalanche. Depending on their impedance, the total load dump voltage is about 28V to 34V. So, to allow for some margin, 36V or 40V absolute maximum parts should be comfortable in modern 12V automotive applications. Retrofitting a very old car from the late 50s to the 70s with a brush and commutator type DC generator is a different story, but that’s not relevant for modern production automobiles. 40V is the realistic upper end of any sustained voltages with really low source impedance in the sub-ohm range. Of course there are higher voltage pulses, mainly produced by switched inductive loads like solenoids or the harness itself, but these have higher source impedance and can be passively filtered.
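The margin argument above can be reduced to a one-line check. This is a hedged sketch using the numbers from the text: a worst-case clamped load dump of about 34V and a design margin that I have assumed at 2V.

```python
# Sanity check: does a part's absolute-maximum voltage rating cover the
# avalanche-clamped load dump? Clamp voltage from the text (~34V worst
# case); the 2V design margin is an assumed, illustrative value.

def survives_load_dump(abs_max_v, clamp_v=34.0, margin_v=2.0):
    """True if the part's abs-max rating exceeds the worst-case
    clamped load dump voltage plus a design margin."""
    return abs_max_v >= clamp_v + margin_v

print(survives_load_dump(36.0))  # 36V part: just enough
print(survives_load_dump(40.0))  # 40V part: comfortable
print(survives_load_dump(30.0))  # 30V part: not acceptable
```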

Two: Low Voltage Cold Crank Conditions

At the low “steady state” input voltage end, cold crank requirements have increased in severity over the years. The lowest voltage typically occurs when the starter has just kicked into gear, so engine turnover is very slow. This period is in the millisecond range.

On the low voltage side of things, it is important to differentiate mission critical systems from the rest of the electronics. If the alternator does not increase the system voltage to a decent level, or if the ignition, fuel injection or starter does not work, you are stuck in the parking lot. Mission critical systems must work through the minimum voltage during the crank period. In the ISO7637 standard this is called Pulse 4.

There are other systems that are not mission critical, but must also ride through cold crank conditions. These include systems that take significant time to reboot, which would cause driver frustration if they did not survive the cold crank pulses. These require at least some internal supply voltage to carry them through.

Car manufacturers have different ideas of how the cold crank test pulse looks, some adding low frequency sinusoids during the starter cranking period. Modern start-stop starters get the engine running in about 400ms. In most ECUs, it is a burden to try to bridge about 50ms of cold crank pulse down to 3V–4V with capacitors. If your ECU only draws 100mA, you can get away with reasonably sized aluminium capacitors. With modern high current demands, the required capacitance quickly becomes excessive. Larger capacitor cans demand additional mounting brackets to protect against vibration, which is not attractive for a surface mount design.
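A back-of-the-envelope sizing shows why the hold-up capacitor grows so quickly. This sketch uses the basic relation C = I·t/ΔV; the 4V allowed droop is my assumption (input sagging from a nominal rail to the regulator’s minimum input), not a figure from any test catalogue.

```python
# Hold-up capacitor sizing, C = I * t / dV. The 50ms bridge time and
# 100mA light load come from the text; the 4V allowed droop and the
# 2A "modern high current" load are assumed, illustrative values.

def holdup_capacitance_f(load_a, bridge_time_s, allowed_droop_v):
    """Capacitance (farads) needed to supply load_a for bridge_time_s
    while the capacitor voltage sags by allowed_droop_v."""
    return load_a * bridge_time_s / allowed_droop_v

c_small = holdup_capacitance_f(0.1, 0.050, 4.0)  # 100mA ECU
c_large = holdup_capacitance_f(2.0, 0.050, 4.0)  # 2A ECU
print(f"{c_small*1e3:.2f} mF vs {c_large*1e3:.0f} mF")  # 1.25 mF vs 25 mF
```

At 100mA the 1.25mF result is an ordinary aluminium can; at 2A the 25mF requirement is exactly the excessive bulk the text describes.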

The easiest way to accommodate cold crank conditions is to simply design the power supply unit with very low dropout. Then processor I/O and core voltages are held up regardless of cold crank pulse duration, and fancy recovery pulse shapes are no problem, since the control loop easily takes care of them.

Three: Keep the Quiescent Current Low

An important automotive requirement is keeping quiescent current low. Few real switches are used anymore. Instead, ECUs shut down via pushbutton, control panel or bus command. The units then have to be switched on again somehow: a push button, a wake-on-bus-activity signal such as wake-on-CAN, or an ignition-on signal. Many new cars no longer have an old school ignition-switch-on signal; they rely on wake-on-bus signals only. Part of the ECU needs to remain active, listening for this wake up signal.

Today’s cars contain easily 50 to over 100 ECUs, all looking for activation signals when the automobile is “off.” They must minimize their current draw as they wait, or risk killing the battery during standby. The allowed current in standby is typically 100µA/unit. That is the input current at the ECU terminal block or wire harness.
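To see why the 100µA/unit budget matters, here is a rough standby-drain estimate. The ECU count and per-unit current come from the text; the 70Ah battery capacity and the 20% of charge one might be willing to sacrifice in standby are assumptions for illustration.

```python
# Standby battery budget: 100 ECUs at the typical 100uA each is 10mA
# total. Battery capacity (70Ah) and the allowed discharge fraction
# (20%) are assumed values, not automotive requirements.

def standby_days(n_ecus, ua_per_ecu=100.0, battery_ah=70.0,
                 usable_frac=0.2):
    """Days until the allowed fraction of battery charge is consumed
    by the combined standby current of n_ecus units."""
    total_a = n_ecus * ua_per_ecu * 1e-6        # total drain in amps
    hours = battery_ah * usable_frac / total_a  # hours to drain budget
    return hours / 24.0

print(f"{standby_days(100):.0f} days")   # ~58 days of parking
print(f"{standby_days(100, ua_per_ecu=1000.0):.1f} days")  # 10x worse
```

At the 100µA budget a parked car survives weeks; let each unit creep up to 1mA and the same battery budget is gone in under a week.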

Simplify the ECU Power Supply

The way to survive most tests is to meet the three requirements discussed above. The easiest way to do that is to start by using the right power supply, one that will do most of the work for you. The perfect automotive power supply would regulate with 36V or more at its input, taking care of load dumps or jump starts by risk-taking tow truck operators. Its very low dropout would maintain regulation through cold crank pulses regardless of shape. And its low operating quiescent current and high efficiency would preserve the battery charge and meet automotive standby requirements.

In my next installment, I’ll talk about a power supply that meets these ideals.

Christian Kueck

Christian Kueck is Strategic Marketing Manager, Europe, for Power Products. A graduate in Electrical Engineering from the University of Dortmund with a focus on microelectronics, he has 25 years’ experience in the industry.