GHz
What is GHz?
Definition
GHz, short for gigahertz, is a unit of frequency equal to one billion hertz. It is commonly used to measure computer processing speed, alternating current, and electromagnetic (EM) frequencies.
When used in terms of computer processing speed, it is the measure of the processor's clock rate, which is the rate at which it generates pulses to synchronize the operations of its components. This is normally the frequency of a crystal oscillator.
The hertz is defined as one cycle per second (cps) and is named after Heinrich Hertz, who proved the existence of EM waves. It is equivalent to the reciprocal second (s^{-1}).
Hz, kHz, MHz, and GHz
The base unit of frequency is the hertz, which is equal to one cycle per second. Other common units are kHz, MHz, and GHz, which are multiples of Hz following standard SI prefix conventions. A kilohertz is a thousand hertz, a megahertz is a million hertz, and a gigahertz is a billion hertz.
| Symbol | Name | Value |
| --- | --- | --- |
| GHz | gigahertz | 10^9 Hz (1 billion Hz) |
| MHz | megahertz | 10^6 Hz (1 million Hz) |
| kHz | kilohertz | 10^3 Hz (1 thousand Hz) |
| Hz | hertz | 1 Hz |
How to convert Hz to kHz to MHz to GHz
These common units of frequency differ by factors of one thousand: 1 kHz is 1,000 Hz, 1 MHz is 1,000 kHz, and 1 GHz is 1,000 MHz. To convert from a smaller unit to a larger one (such as MHz to GHz), divide by 1,000. To convert from a larger unit to a smaller one (such as GHz to MHz), multiply by 1,000 instead.
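The conversion steps above can be sketched in a short Python snippet; the `UNITS` table and `convert` function are illustrative, not part of any standard library:

```python
# Each unit maps to its value in hertz; each step up is a factor of 1,000.
UNITS = {"Hz": 1, "kHz": 10**3, "MHz": 10**6, "GHz": 10**9}

def convert(value, from_unit, to_unit):
    """Convert a frequency between Hz, kHz, MHz, and GHz."""
    return value * UNITS[from_unit] / UNITS[to_unit]

# 2,400 MHz expressed in GHz: divide by 1,000.
print(convert(2400, "MHz", "GHz"))  # 2.4
# 3.5 GHz expressed in MHz: multiply by 1,000.
print(convert(3.5, "GHz", "MHz"))  # 3500.0
```

Converting through a common base unit (Hz) avoids writing a separate rule for every pair of units.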