Voltage vs Frequency is not the most exciting topic in the world, but having a basic understanding of electricity can go a long way in your life. It might even make you more interesting if you bump into an electrical engineer at a tea party, although nowadays, during this pandemic, it’s likely to be a virtual tea party. Sad, I know. This pandemic will end, so hold on. This article is intended to teach you a few things about electricity.
It also answers the question of why electricity standards differ from country to country, something you have to pay attention to when you travel. Knowing a few key points will make your life a little easier.
When we start trying to understand the relationship between voltage and frequency, things can get technical. People who travel internationally have a common question: Why do voltage and frequency differ between the United States and Europe or Great Britain? This article will explain how AC power works, why voltage and frequency are so important, and the reason for the different electric supply conventions around the world.
Anyone who has traveled to another country knows all too well that electric equipment used in the US cannot be plugged into a conventional outlet in most other countries. The voltage in Japan is lower than that of any other power grid in the world, and the Japanese power grid uses two different frequencies. Why the difference?
The relationship between voltage and frequency is more important than we may think. In order to maintain a stable electrical supply, voltage and frequency need to remain constant. To fully understand this, and why it is so important, we need to start with a basic understanding of electric current. The evolution of power generation is not only a fascinating topic, it also provides insight into how the conventions for voltage and frequency came to be.
What is Electric Current?
An electric current is the movement of charged particles, either ions or electrons. In most cases, an electric current is the flow of electrons through a metal conductor. The nucleus of each atom is held in a fixed position while the electrons move through a conductive material, like a copper wire. This leaves a positively charged point at the nucleus, with the moving, negatively charged electrons forming the electric current.
Electric current is measured in Amperes (A). This is the amount of electric charge (Coulomb) per second.
The movement of electrons through a conductor is the norm for power generation plants. Electric current can also be ionic, as in the electrolyte used in batteries.
Semiconductors are more complicated. These are made using different elements to control electrons and holes (the absence of electrons). Silicon is usually the base material, which is “doped” with other elements to achieve the desired effect. This allows the current to be switched on and off, which is interpreted as binary zeros and ones.
When dealing with electronic semiconductors, the relationship between frequency and voltage is of particular importance. If the flow of electrons deviates, because of a change in voltage or frequency, the information generated by semiconductors will not be accurate. Furthermore, the electric circuit may be damaged by repeated distortion of the sine wave. This will be explained in greater detail when we discuss an electric sine wave.
There are two types of electric current: Alternating Current (AC) and Direct Current (DC).
Direct Current
A direct current (DC) is a flow of charge in one direction only, from the positive pole to the negative pole (the electrons themselves drift in the opposite direction). This is a very efficient use of electric energy over short distances. As the length of the conductor increases, DC power experiences significant voltage loss. This makes DC power unsuitable for large-scale electric distribution.
The primary advantage of using DC power is its efficiency when used as a local power source, where the current only has to be conducted over a short distance. It can also be stored in a battery, using an ionically charged electrolyte.
Alternating Current
Alternating current (AC) is the oscillation of an electric charge between two points, unlike DC power, which flows in one direction only. When generating AC power, the electrons constantly move back and forth between the neutral and live (hot) terminals of an alternator. The rate at which the current completes this back-and-forth cycle is known as the frequency, measured in Hertz (Hz). This indicates how many times the current oscillates per second. A 50 Hz current completes 50 full cycles every second.
AC power is the standard method for mass power generation, or electrical grids. This is because high-voltage AC transmission experiences the lowest voltage drop over long distances, and transformers make it easy to increase or decrease the AC voltage between the point of supply and the point of use. Three-phase AC generation further improves the efficiency of large-scale distribution.
Potential Difference
Potential difference is the potential energy of an electric charge between two points. Energy is measured in joules and the potential difference is measured in volts (V). We measure the potential difference between positive and negative for DC current, or between live (hot) and neutral for AC current. It is important to understand that this is the potential energy and not the actual energy. It is possible to supply any amount of power regardless of the voltage. A low voltage can produce the same power (watts) as a high voltage, provided the conductor is able to carry the required current (amps).
Voltage is calculated by measuring the potential energy (joules) in relation to one coulomb of electric charge. If the potential difference is 12V, this means one coulomb of electric charge will gain 12 joules of potential energy. 1 watt is equivalent to 1 joule per second. 1A is equivalent to 1 coulomb per second. To supply 100W of power, at 12V, the current will be 8.3A. If we increase the potential difference to 120V, one coulomb of electric charge will gain 120 joules of potential energy. This means less current is required to supply the same power, or wattage. To supply 100W at 120V, the current will be 0.83A. By increasing the voltage by a factor of ten, we reduce the amperage (electric current) 10 times.
For a given power, the electric current, or amperage, is inversely proportional to the potential difference, or voltage. If we increase the voltage, the amperage is reduced when supplying the same wattage. To calculate the relationship between power, volts, and amps, the following formula is used: power (watts) = volts (potential difference) multiplied by amps (current), abbreviated to P = VI. This applies to both alternating and direct current.
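If you like to see the arithmetic spelled out, here is a minimal Python sketch of the P = VI relationship, reusing the 100W appliance and the 12V and 120V supplies from the example above:

```python
def current_for_power(power_watts: float, volts: float) -> float:
    """Rearrange P = VI to find the current (in amps) needed for a given power."""
    return power_watts / volts

# The 100 W example from the text, supplied at 12 V and at 120 V.
for volts in (12, 120):
    amps = current_for_power(100, volts)
    print(f"100 W at {volts} V draws {amps:.2f} A")

# 100 W at 12 V draws 8.33 A
# 100 W at 120 V draws 0.83 A
```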
Electric Sine Wave
When observing alternating current, the oscillation of electrons forms a sine wave. This is a visualization of the relationship between voltage and frequency. The height, or amplitude, of the sine wave indicates the voltage. The width of each cycle is determined by the frequency. A perfect sine wave is obtained when voltage and frequency remain constant. However, this is virtually impossible to achieve.
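As a rough illustration, here is a minimal Python sketch of an ideal 50 Hz sine wave. The 230V figure is a nominal RMS supply voltage chosen purely for the example; the peak of the wave is √2 times higher than the RMS value:

```python
import numpy as np

RMS_VOLTS = 230.0                      # supply voltage as normally quoted (RMS)
FREQ_HZ = 50.0                         # cycles per second
PEAK_VOLTS = RMS_VOLTS * np.sqrt(2)    # amplitude of the sine wave (about 325 V)

# One full cycle sampled at 1000 points: v(t) = V_peak * sin(2*pi*f*t)
t = np.linspace(0.0, 1.0 / FREQ_HZ, 1000)
v = PEAK_VOLTS * np.sin(2 * np.pi * FREQ_HZ * t)

print(f"Peak voltage: {v.max():.0f} V, period: {1000 / FREQ_HZ:.0f} ms")
# Peak voltage: 325 V, period: 20 ms
```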
Changes in either voltage or frequency distort the shape of the sine wave. This is known as harmonic distortion (HD). Total Harmonic Distortion (THD) is calculated using the fundamental frequency, the ideal sine wave, as the base. The combined deviation of all the harmonic components is expressed as a percentage of the fundamental.
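For readers who want to see the calculation, the sketch below uses the standard textbook definition of THD, the RMS sum of the harmonic components divided by the fundamental. The harmonic amplitudes are made-up example values:

```python
import math

def total_harmonic_distortion(fundamental: float, harmonics: list[float]) -> float:
    """THD = sqrt(V2^2 + V3^2 + ...) / V1, returned as a fraction."""
    return math.sqrt(sum(v * v for v in harmonics)) / fundamental

# Hypothetical reading: a 230 V fundamental with small 3rd and 5th harmonics.
thd = total_harmonic_distortion(230.0, [6.9, 4.6])
print(f"THD = {thd * 100:.1f}%")   # THD = 3.6%
```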
For electrical engineers managing the supply of electricity on a large scale, reducing THD is one of the greatest challenges. Traditional steam turbine generators use valves to control the flow of steam to the turbine. Since the demand for electricity over a power grid is constantly changing, the generating capacity has to change accordingly. This cannot be achieved with 100% accuracy.
The introduction of renewable energy has made this even more complex. Unlike steam generating plants, solar and wind generators are not as easy to control. In order to control the output relative to electricity demand, battery banks and capacitors are required to store and release electricity according to the grid requirements. Nuclear power plants have, in the past, generated a constant electrical supply according to base demand. This means that the output of a nuclear power generator is calculated to meet the minimum power requirements for the area it supplies. Additional demand is supplied by coal-fueled steam turbine generators, which can be regulated as demand increases.
Ideally, a well-managed power supply will experience a voltage deviation of less than 10% and a frequency deviation of less than 1%.
The Power Grid
In the UK, and most English-speaking countries, people commonly refer to mains power. In the US, it is usually known as utility power. Both terms refer to the electric power grid.
A power grid is a mass electricity distribution network covering a large area. The area supplied by a power grid can be any size, from a group of municipalities to several countries. The United States has three power grids, with over 7,300 power plants. Two main interconnections supply the Eastern and Western regions, while the Electric Reliability Council of Texas (ERCOT) supplies electricity to most of Texas. The UK national power grid supplies England and Wales. It is managed by National Grid Plc. Northern Ireland and the Republic of Ireland are connected through the Single Electricity Market. Scotland utilizes two power grids, one serving Northern Scotland and one serving Southern and Central Scotland, with interconnectors.
The UK power grid is connected by a submarine (undersea) interconnector to Northern France. Another submarine interconnector links Great Britain and the Netherlands. In Europe, there are several power grids, with many interconnectors. These are all multi-national power grids, connecting different countries. There are plans for a European super-grid which will connect all of Europe to a single grid, including some North African countries. Japan has two electrical grids, serving the eastern and western regions. The two Japanese power grids operate at different frequencies and are linked by frequency-converter stations.
Regardless of its size, a power grid is supplied by several power generation plants. All the generators supplying a power grid need to be synchronized to ensure a stable voltage and frequency. Because of the extent of the power grid, power is transmitted over a large area. The distance between the point of generation and the point of use can be up to 300 miles. In order to reduce the voltage loss over these distances, high-voltage transmission is used. Long-distance power transmission can be anything from 115,000V to 765,000V.
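A rough resistive-loss estimate shows why the voltage is stepped up. The figures below, 10 MW delivered over a line with an assumed 5 ohms of resistance, compared at 11kV and 230kV, are purely illustrative; real lines also have reactance and other losses:

```python
def line_loss_watts(power_watts: float, volts: float, resistance_ohms: float) -> float:
    """Approximate I^2 * R loss in a line carrying a given amount of power."""
    current = power_watts / volts           # I = P / V
    return current ** 2 * resistance_ohms   # P_loss = I^2 * R

POWER = 10e6        # 10 MW delivered
RESISTANCE = 5.0    # assumed total line resistance, in ohms

for volts in (11_000, 230_000):
    loss = line_loss_watts(POWER, volts, RESISTANCE)
    print(f"At {volts // 1000} kV: {loss / 1000:.0f} kW lost ({loss / POWER * 100:.2f}%)")

# At 11 kV: 4132 kW lost (41.32%)
# At 230 kV: 9 kW lost (0.09%)
```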
Transformers are used at the point of supply to obtain the correct voltage. Three-phase transformers are used mostly for industrial or large-scale commercial electric supply. In Europe, the standard for 3-phase power is 380V. In the UK it is 400V. In the US, 3-phase power can be 400V – 415V, 240V, or 208V. In order to supply a wider variety of voltages, many US 3-phase transformers utilize multiple taps to produce different voltages. Japan is the only country with two distinctly different supply standards: 3-phase power is 200V (50 Hz) in the east, and 200 – 210V (60 Hz) in the west.
Domestic power supply in Europe, and most other countries, is 220V – 240V (50 Hz). In the US, it is nominally 120V (60 Hz). Most homes in the US are supplied by two-pole transformers, allowing for either 120V or 240V circuits. Japan uses 100V, with the frequency differing between 50 Hz (eastern Japan) and 60 Hz (western Japan).
Voltage is never constant. For this reason, all electric devices rated for a certain region will allow for a maximum and minimum voltage, typically around ±10% of the nominal value.
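As a simple illustration, the sketch below checks whether a measured supply voltage falls inside a ±10% band around its nominal rating. The 230V nominal value and the sample readings are assumptions chosen for the example:

```python
def within_tolerance(measured_volts: float, nominal_volts: float, tolerance: float = 0.10) -> bool:
    """Return True if the measured voltage is within ±tolerance of the nominal value."""
    return nominal_volts * (1 - tolerance) <= measured_volts <= nominal_volts * (1 + tolerance)

# A 230 V (±10%) supply must stay between 207 V and 253 V.
for reading in (218, 251, 259):
    print(reading, "V:", "OK" if within_tolerance(reading, 230) else "out of range")

# 218 V: OK
# 251 V: OK
# 259 V: out of range
```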
Because voltage and frequency are not the same around the world, it is not always possible to use electrical equipment from one region in a country that uses a different voltage or frequency. However, many appliances, particularly electronic devices and chargers, are designed for a wide input voltage (110V – 240V) and can be used with a 50 Hz or 60 Hz electrical supply. Portable power converters, using a transformer, can be purchased, allowing you to use a 120V appliance in a country with a 240V supply voltage and vice versa.
Why is electricity supply not the same for all countries?
It seems logical to use one standard for voltage and frequency in all countries. This would make it easier for international travelers and it would simplify the import and export of electrical equipment. To understand the international differences in voltage and frequency, we need to look at the history of the different regions. Decisions on the standard for voltage and frequency were often made based on the economic realities of the time. The evolution of electrical standards has also been affected by electrical safety and the reliability of the supply network.
History of Electric Power Generation
The electricity generating dynamo was invented in France by Hippolyte Pixii in 1832. A reliable DC power generator only became a reality in 1867. It is debatable who was the first to invent the “self-exciting dynamo-electric generator”, as three engineers introduced this technology around the same time: S.A. Varley, Werner von Siemens, and Charles Wheatstone, all in 1867. This principle was refined by the Belgian inventor Zénobe Gramme in 1870, when he developed the first dynamo capable of supplying a stable DC current for electric motors.
Through the course of the 1870s, electric arc lights began to replace gas lighting for city streets in Europe and the USA. The idea of mass electric supply for domestic use came about only in 1879, when Thomas Edison developed the first practical incandescent light bulb. This was the first practical electric lighting for indoor use.
In the early days of electricity generation, there were no centralized power generation plants and no standard for voltage or the type of current used. Privately-owned power plants were built for industrial entities and managed by the owners of the plant. At this time, two influential engineers played a role in the development of electricity generating plants in the US: Thomas Edison held that DC power was the most effective, while Nikola Tesla was of the opinion that AC power was the preferred option.
The first central power plant, the Pearl Street Power Station, was built by Edison's company in 1882 to supply power to the central business district of New York City, and it is recognized as the first mass electricity supply in the world. Nikola Tesla, who went on to work for the Westinghouse Electric Company, ultimately succeeded in making AC power the standard for large-scale distribution. However, the first municipal power generating plant to supply electricity for street lighting was built in London in 1878. This was an experimental venture, which lasted only six months.
Because Edison’s light bulb was the driving force behind the spread of mass power distribution, it was his invention that determined the voltage standard for the world. Edison calculated that 110V provided sufficient potential difference, with reasonable safety, to supply his invention. High voltage poses a greater risk of electrocution, and it wasn’t until the invention of the Ground Fault Circuit Interrupter (in 1961) that electrical power became much safer. Nikola Tesla, an advocate for the use of AC power, calculated that 60 Hz was the most efficient frequency for AC power generation. It was the combination of Edison’s decision to use 110V and Tesla’s conclusion that 60 Hz was ideal for AC power that set the initial standard for power distribution around the world.
Engineers at AEG, the largest manufacturer of generators in Europe, decided to adopt 50 Hz as the standard for frequency. This made it easier to perform metric calculations. After World War II, 50 Hz became the European standard. The US maintained the 110V 60Hz standard.
In Japan, the first electricity generating plant, built in Tokyo in 1895, used a 50 Hz generator supplied by AEG. The second power plant, built in Osaka in 1896, was supplied by General Electric, using the US 60 Hz standard. As a result, Japan has two frequency standards, one for the eastern region and one for the western region.
As the number of households connected to the electricity grid increased, it became apparent that the supply voltage could be problematic. At a lower voltage, the voltage loss is greater as the distance increases. As a result, supply transformers were ill-equipped to meet changes in demand. Close to the transformer, the voltage could be as high as 127V, while in the outer regions it could be as low as 105V. Furthermore, as demand grew, larger conductors were required to supply the increased amperage. High-powered heating equipment and large electric motors are also less efficient at a lower voltage.
The exception to this rule was the Berlin electricity supply utility, Berliner Elektrizitäts-Werke (BEW), which decided to adopt 220V as the standard for the city in 1899. The cost of replacing residents’ lighting equipment was offset by the savings in supply network conductors. At that time, domestic electricity use was limited to lighting, making the change relatively inexpensive. As the use of electricity increased rapidly from the 1920s onwards, the benefit of using the higher voltage became increasingly apparent. At twice the voltage, half the amps are needed to supply the same watts. This meant that the cost of expanding the electricity supply network was much lower, as increased amperage requires more copper for the conductors and transformers.
In the 1950s, European countries started increasing the voltage used for power distribution. This made it more efficient to deliver increased power over a greater area, with less voltage loss over distance. Even at this time, the standard was not uniform. In the UK, power distribution was standardized at 240V (± 10%), while the rest of Europe decided upon 220V (± 10%), with both regions using 50 Hz. It was only in 1987 that a common standard of 230V was set for the UK, Europe, and most other regions. At first, the voltage tolerance was +10%, -5%. In 2009, the European standard was set at 230V (± 10%).
When Europe first decided to change to a higher voltage (1950 – 1960), electricity usage in the average European and UK household was limited. Appliances like refrigerators, washing machines, and electric heating weren’t common. In the US, the situation was the opposite. The post-war boom meant that most households relied on multiple electrical appliances. A decision was made that it would be too costly to change the standard to 240V, as too many people would have to replace all the electrical appliances in their homes.
To offset the effects of voltage loss, the US gradually increased the nominal supply voltage from 110V to 120V. The use of dual-pole 120V/240V transformers also became more widespread, and by the 1980s most homes in the US had access to both a 120V and a 240V supply. This meant that low-consumption equipment, like lighting and electronic devices, could use the 120V supply, while heating equipment and other high-powered devices could use the 240V supply, thereby reducing the amps needed and allowing for a more stable electrical supply.
Today all of North America, most countries in Central America, and a few South American countries use the US electrical standard, 120V/240V 60 Hz. With the exception of Japan and Liberia, the rest of the world complies with the European standard of 230V 50 Hz. Liberia is the only African country to use the US 120V/240V 60 Hz power supply standard. Liberia was founded by the American Colonization Society in 1822, to offer freed slaves better opportunities in Africa. Although Liberia declared independence in 1847 (recognized by the US in 1862), the country continued to receive American economic support and was a strategic ally during the two world wars. As a result, infrastructure, like electricity generation, was supplied by the US. Consequently, the US electric supply voltage and frequency became the standard for Liberia.