All about renovation and decoration of apartments

How is electricity generated, and where does it come from? What is current, and what is its nature? How does electricity get into our home?

We are often contacted by readers who have never encountered electrical work before but want to figure it out. The section “Electricity for Beginners” was created for this category of readers.

Figure 1. Movement of electrons in a conductor.

Before you begin work related to electricity, you need to get a little theoretical knowledge on this issue.

The term "electricity" refers to the movement of electrons under the influence of an electromagnetic field.

The main thing is to understand that electricity is the energy of the smallest charged particles that move inside conductors in a certain direction (Fig. 1).

Direct current changes neither its direction nor, for practical purposes, its magnitude over time. An ordinary battery, for example, supplies direct current: charge flows from minus to plus, unchanged, until the battery runs out.

Alternating current is a current that changes its direction and magnitude with a certain periodicity. Think of it as a stream of water flowing through a pipe: for some interval (say, 5 s) the water rushes in one direction, then in the other.

Figure 2. Transformer design diagram.

With current this happens much faster: 50 times per second (a frequency of 50 Hz). During one period of oscillation the current rises to a maximum, passes through zero, and then the same process repeats with the opposite sign. Why does this happen, and why is such a current needed? Because generating and transmitting alternating current is much simpler than doing the same with direct current. Both generation and transmission of alternating current are closely tied to a device called the transformer (Fig. 2).
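The variation over one period can be sketched numerically. This is a minimal illustration; the 311 V amplitude corresponds to a 220 V (RMS) mains supply, an assumption made for the example:

```python
import math

f = 50.0           # mains frequency, Hz
amplitude = 311.0  # peak voltage of a 220 V (RMS) supply: 220 * sqrt(2) ≈ 311
period = 1.0 / f   # one full oscillation takes 20 ms

# Sample one period: the voltage rises to its maximum, passes through
# zero, then repeats the same excursion with the opposite sign.
for k in range(5):
    t = k * period / 4
    v = amplitude * math.sin(2 * math.pi * f * t)
    print(f"t = {t * 1000:4.1f} ms   v = {v:+7.1f} V")
```

At 5 ms the voltage reaches +311 V, at 10 ms it crosses zero, and at 15 ms it reaches -311 V: one full period every 20 ms.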

A generator that produces alternating current is much simpler in design than a direct current generator. In addition, alternating current is best suited for transmitting energy over long distances. With its help, less energy is lost.

Using a transformer (a special device in the form of coils), alternating current is converted from low voltage to high voltage, and vice versa, as shown in the illustration (Fig. 3).
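The conversion itself follows from the transformer's turns ratio: in an ideal transformer the voltages relate as the numbers of turns in the windings. A minimal sketch (the winding counts and voltages below are made-up round numbers for illustration, and real transformers have losses):

```python
def transform(v_in, n_primary, n_secondary):
    """Ideal transformer: V_out / V_in = N_secondary / N_primary (losses ignored)."""
    return v_in * n_secondary / n_primary

# Step a generator's 10 kV up to 110 kV for the transmission line...
print(transform(10_000, 100, 1_100))   # -> 110000.0
# ...and back down again near the consumer.
print(transform(110_000, 1_100, 100))  # -> 10000.0
```

The same device works in both directions: which winding you feed determines whether the voltage is stepped up or down.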

It is for this reason that most devices operate from a network in which the current is alternating. However, direct current is also used quite widely: in all types of batteries, in the chemical industry and some other areas.

Figure 3. AC transmission circuit.

Many people have heard such mysterious words as one phase, three phases, zero, ground or earth, and know that these are important concepts in the world of electricity. However, not everyone understands what they mean and how they relate to the surrounding reality. Nevertheless, it is imperative to know this.

Without going into technical details that the home handyman does not need, we can say that a three-phase network is a way of transmitting electric current in which alternating current flows out through three wires and returns through one. This needs some clarification. Any electrical circuit consists of two wires: along one, the current travels to the consumer (a kettle, for example), and along the other it returns. Break such a circuit, and no current will flow. That is the whole description of a single-phase circuit (Fig. 4 A).

The wire through which the current arrives is called the phase wire, or simply the phase; the one through which it returns is the neutral, or zero. A three-phase circuit consists of three phase wires and one common return. This is possible because the alternating current in each of the three wires is shifted relative to its neighbour by 120° (Fig. 4 B). A textbook on electromechanics will answer this question in more detail.
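The 120° shift is exactly what lets one return wire serve all three phases: with a balanced load the three instantaneous currents always sum to zero. A small sketch (the 10 A amplitude is an arbitrary example value):

```python
import math

def phase_currents(t, amplitude=10.0, f=50.0):
    """Instantaneous currents in the three phases, each shifted by 120 degrees."""
    w = 2 * math.pi * f
    return [amplitude * math.sin(w * t - k * 2 * math.pi / 3) for k in range(3)]

# Under a balanced load the three currents cancel at every instant,
# which is why a single shared return (neutral) wire is enough.
for t in (0.0, 0.003, 0.007):
    i_a, i_b, i_c = phase_currents(t)
    print(f"t = {t * 1000:.0f} ms   i_a + i_b + i_c = {i_a + i_b + i_c:+.9f} A")
```

Whatever instant you pick, the sum stays at zero (up to floating-point rounding); in a real network the neutral carries only the imbalance between phases.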

Figure 4. Electrical circuit diagram.

Alternating current is transmitted precisely over three-phase networks. This is economically advantageous: two extra neutral wires are not needed. Near the consumer, the supply is split into three phases, each of which is paired with a neutral. This is how electricity reaches apartments and houses, although sometimes a three-phase supply is brought directly into the house. As a rule this concerns private housing, and the arrangement has its pros and cons.

Earth, or more correctly grounding, is the third wire in a single-phase network. It does not carry the working load; rather, it serves as a kind of fuse.

When electricity gets out of control (in a short circuit, for example), there is a risk of fire or electric shock. Grounding is introduced to prevent this, that is, to keep the current from exceeding a level that is safe for people and devices. Through this wire the excess electricity literally goes into the ground (Fig. 5).

Figure 5. The simplest grounding scheme.

One more example. Let's say a small fault occurs in the electric motor of a washing machine and part of the current reaches the outer metal shell of the appliance.

If there is no grounding, this charge will wander around the body of the washing machine. A person who touches it instantly becomes the most convenient outlet for this energy; that is, he receives an electric shock.

If there is a ground wire in this situation, the excess charge drains away through it without harming anyone. We might add that the neutral conductor could also serve as a ground, and in principle it does, but only at the power plant.

The situation when there is no grounding in the house is unsafe. How to deal with it without changing all the wiring in the house will be discussed later.

ATTENTION!

Some craftsmen, relying on basic knowledge of electrical engineering, install the neutral wire as a ground wire. Never do this.

If the neutral wire breaks, the housings of grounded devices will be under voltage of 220 V.

This question is like a cabbage: you peel it and peel it, but the “fundamental” stalk is still far away. And although the question apparently concerns that very stalk, we will still have to work our way through the whole cabbage.

At the most superficial glance the nature of current seems simple: current is charged particles moving. (If a particle does not move, there is no current, only an electric field.) Early researchers, trying to understand the nature of current without knowing what it consists of, chose the direction of the current to match the direction of motion of positive particles. It later turned out that an indistinguishable current, exactly the same in its effects, is produced by negative particles moving in the opposite direction. This symmetry is a remarkable feature of the nature of current.

Depending on where the particles are moving, the nature of the current also differs, as does the medium that carries it:

  • Metals have free electrons;
  • In metal and ceramic superconductors there are also electrons;
  • In liquids - ions, formed during chemical reactions or under the action of an applied electric field;
  • In gases there are again ions, as well as electrons;
  • In semiconductors, however, the electrons are not free and move in a “relay race”. That is, it is not so much the electron that moves as the place where an electron is missing: a “hole”. This kind of conductivity is called hole conductivity. At the junctions of different semiconductors, the nature of such current gives rise to the effects that make all of our radio electronics possible.

Current has two measures: current strength and current density. Between a current of charges and a current of, say, water in a hose there are more differences than similarities, yet the analogy is quite productive for understanding the nature of electric current. Strictly, the current in a conductor is a vector field of particle velocities (for particles carrying the same charge), but we usually ignore these details when describing the current: we average it.
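This averaging is captured by the textbook relation I = n·q·v·A, where n is the carrier concentration, q the charge of one carrier, v the average (drift) velocity and A the wire's cross-section. A quick estimate, using typical textbook values for copper (assumed here for the example), shows how slow the averaged drift really is:

```python
# Drift velocity of electrons in a copper wire carrying household current.
# All numbers are typical textbook values, assumed for this estimate.
e = 1.602e-19   # elementary charge, C
n = 8.5e28      # free-electron concentration in copper, 1/m^3
area = 1.5e-6   # cross-section of a 1.5 mm^2 wire, m^2
current = 10.0  # current, A

v_drift = current / (n * e * area)  # from I = n * q * v * A
print(f"drift velocity ≈ {v_drift * 1000:.2f} mm/s")
```

The electrons themselves creep along at a fraction of a millimetre per second; it is the field, not the particles, that propagates almost instantly.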

If we take just one particle (charged, naturally, and moving), then a current equal to the product of its charge and its instantaneous velocity exists exactly where that particle happens to be at that moment. Remember the song by the Ivasi duo, “It's Time for Beer”: “...if the climate is harsh and the astral is hostile, if the train has left and all the rails have been TAKEN UP...” :)

And now we come to the stalk mentioned at the beginning. Why does a particle have a charge? (Movement seems clear enough, but what is charge?) The most fundamental particles carrying charge — now, it seems, truly indivisible — are electrons, positrons (anti-electrons) and quarks. An individual quark cannot be pulled out and studied because of confinement; an electron seems easier, but much remains unclear there too. What is clear at the moment is that charge is quantized: no charges smaller than the charge of the electron are observed (quarks are observed only inside hadrons, with a total charge equal to the electron's or zero). An electric field separate from a charged particle can exist only together with a magnetic field, as an electromagnetic wave, whose quantum is the photon. Perhaps some interpretations of the nature of electric charge lie in the realm of quantum physics. For example, the Higgs field it predicted, discovered relatively recently (if there is a boson, there is a field), explains the mass of a number of particles, and mass is a measure of how a particle responds to a gravitational field. Perhaps a similar story will be revealed for charge as a measure of response to an electric field. Why there is mass and why there is charge are related questions.

Much is known about the nature of electric current, but the most important thing is not yet known.

In an electrical circuit, including a current source and a consumer of electricity, an electric current arises. But in what direction does this current arise? It is traditionally believed that in an external circuit the current flows from plus to minus, while inside the power source it flows from minus to plus.

Indeed, electric current is the ordered movement of electrically charged particles. If the conductor is made of metal, such particles are electrons - negatively charged particles. However, in an external circuit, electrons move precisely from minus (negative pole) to plus (positive pole), and not from plus to minus.

Include a semiconductor diode in the external circuit, and it becomes clear that current flows only when the diode's cathode faces the minus side. It follows that the direction of the electric current in a circuit is taken to be opposite to the actual movement of the electrons.

If you trace the history of the formation of electrical engineering as an independent science, you can understand where such a paradoxical approach came from.

The American researcher Benjamin Franklin once put forward a unitary (unified) theory of electricity. According to this theory, electrical matter is a weightless liquid that can flow out of some bodies while accumulating in others.

According to Franklin, electric fluid is present in all bodies, but bodies become electrified only when they have an excess or deficiency of it. A lack of electric fluid, in Franklin's view, meant negative electrification; an excess meant positive.

This was the beginning of the concepts of positive and negative charge. When a positively charged body is connected to a negatively charged one, the electric fluid flows from the body with more fluid to the body with less, like water in communicating vessels. Thus the stable concept of electric current as the movement of electric charges entered science.

This hypothesis of Franklin's preceded the electronic theory of conductivity, but it was far from flawless. The French physicist Charles Dufay discovered that there are in fact two kinds of electricity, each separately obeying Franklin's theory but neutralizing each other on contact. A new, dualistic theory of electricity emerged, put forward by the naturalist Robert Symmer on the basis of Dufay's experiments.

When bodies are rubbed together to electrify them, not only the body being rubbed acquires a charge but also the body doing the rubbing. The dualistic theory asserted that in their ordinary state bodies contain the two kinds of electric fluid in quantities that neutralize each other, and explained electrification as a change in the ratio of negative and positive electricity in the electrified bodies.

Both Franklin's hypothesis and Simmer's hypothesis successfully explained electrostatic phenomena and even competed with each other.

The voltaic pile, invented in 1799, and the discovery of electrolysis led to the conclusion that in solutions and liquids two charges, opposite in their direction of motion, are observed: negative and positive. This was a triumph for the dualistic theory: in the decomposition of water one could now watch oxygen bubbles being released at the positive electrode while hydrogen bubbles were released at the negative one.

But not everything was smooth here. The amounts of gas released differed: twice as much hydrogen was released as oxygen. This baffled physicists, for at the time chemists still had no idea that a water molecule contains two hydrogen atoms and only one oxygen atom.

These theories were not understood by everyone.

But in 1820 André-Marie Ampère, in a paper presented to the Paris Academy of Sciences, was the first to choose one of the two directions of current as the principal one, and then gave a rule by which the action of magnets on electric currents could be accurately determined.

In order not to speak constantly of two opposite currents of the two electricities, and to avoid needless repetition, Ampère decided to take the direction of motion of positive electricity, strictly, as the direction of the electric current. Thus Ampère was the first to introduce the still generally accepted convention for the direction of current.

Maxwell himself later adhered to this position and came up with the “gimlet” (right-hand screw) rule, which determines the direction of a coil's magnetic field. But the question of the true direction of the electric current remained open. Faraday wrote that this state of affairs is purely conventional: it is convenient for scientists and helps them define the directions of currents clearly, but it is only a convenient device.

After Faraday's discovery of electromagnetic induction it became necessary to determine the direction of the induced current. The Russian physicist Lenz gave the rule: if a metal conductor moves near a current or a magnet, a galvanic current arises in it, and its direction is such that a stationary wire would, under its action, move opposite to the original motion. A simple rule that makes things easier to understand.

Even after the discovery of the electron, this convention has persisted for more than a century and a half. With the invention of the vacuum tube, and then with the widespread introduction of semiconductors, difficulties began to arise. But electrical engineering still operates with the old definitions. Sometimes this causes real confusion; changing the convention now, however, would cause even more inconvenience.

Few people think about when electricity appeared, yet its history is quite interesting. Electricity makes life more comfortable: thanks to it, television, the Internet and much more became available. Modern life is impossible to imagine without electricity; it has greatly accelerated the development of mankind.

History of electricity

If you begin to explore when electricity appeared, you must remember the Greek philosopher Thales. He was the first to draw attention to the phenomenon, around 700 BC: Thales discovered that when amber was rubbed with wool, it began to attract light objects.

In what year did electricity appear? After the Greek philosopher, no one studied the phenomenon for a long time, and knowledge in this area did not grow until 1600, when William Gilbert introduced the term “electricity” while studying magnets and their properties. From that time, scientists began to study the phenomenon intensively.

First discoveries

When did electricity appear in technical solutions? In 1663 the first electrostatic machine was created, making it possible to observe the effects of repulsion and attraction. In 1729 the English scientist Stephen Gray carried out the first experiment transmitting electricity over a distance. Four years later the French scientist C. Dufay discovered that electricity has two kinds of charge: “resinous” and “vitreous” (glass). In 1745 the first electric capacitor appeared: the Leyden jar.

In 1747 Benjamin Franklin created the first theory explaining the phenomenon, and in 1785 Coulomb's law was discovered. Galvani and Volta studied electricity for a long time: a treatise was written on its action in muscle movement, and the galvanic cell was invented. The Russian scientist V. Petrov became the discoverer of the electric arc.

Lighting

When did electricity appear in houses and apartments? For many, the phenomenon is associated first of all with lighting, so it is worth recalling when the first light bulb was invented. That happened in 1809; the inventor was the Englishman Delarue. Somewhat later came spiral-filament bulbs filled with inert gas, which began to be produced in 1909.

The advent of electricity in Russia

Some time after the term “electricity” was introduced, the phenomenon began to be studied in many countries. The beginning of real change can be dated to the appearance of lighting. In what year did electricity appear in Russia? The usual answer is 1879, when electric lamps were used for the first time in St. Petersburg.

But a year earlier, in Kyiv, electric lights had been installed in one of the railway workshops, so the date of the appearance of electricity in Russia is somewhat debatable. Since that event went unnoticed, however, the lighting of the Liteiny Bridge can be considered the official date.

There is also another version of when electricity appeared in Russia. From a legal point of view the date is January 30, 1880, the day the first electrical engineering department appeared in the Russian Technical Society. Its duty was to oversee the introduction of electricity into daily life. In 1881 Tsarskoye Selo became the first city in Europe to be completely illuminated.

Another significant date is May 15, 1883, the day the Kremlin was illuminated for the first time. The event was timed to coincide with the accession of Alexander III to the Russian throne; to light the Kremlin, electricians installed a small power station. After this, lighting appeared first on the main street of St. Petersburg and then in the Winter Palace.

In the summer of 1886, by decree of the emperor, the Electric Lighting Society was established to electrify all of St. Petersburg and Moscow. In 1888 the first power plants began to be built in the largest cities, and in the summer of 1892 Russia's first electric tram was launched. In 1895 the first hydroelectric station appeared, built in St. Petersburg on the Bolshaya Okhta River.

The first power plant in Moscow appeared in 1897, built on the Raushskaya embankment. It generated three-phase alternating current, which made it possible to transmit electricity over long distances without significant loss of power. In other cities, construction began at the dawn of the twentieth century, before the First World War.

Dear readers and visitors to our magazine! We write quite a lot, and in some detail, about how electricity is produced at power plants from various energy resources. Atom, gas, water have been our “heroes”, though we have not yet managed to get to the alternative, “green” options. But on closer inspection those stories were far from complete: we had never tried to trace in detail the path of electricity from the turbine to our sockets, along the byways that light our towns and roads and drive the countless pumps that keep our homes comfortable.

These roads and paths are by no means simple; they wind and change direction many times, but knowing what they look like is the duty of every cultured person of the 21st century, a century whose face is largely shaped by the electricity we have tamed and learned to transform so that all our needs are met, both in industry and in private use. The current in the wires of power lines and the current in the batteries of our gadgets are very different currents, yet they remain the same electricity. What efforts must power engineers make to provide the most powerful currents in steel mills and the tiny currents in, say, a wristwatch? How much work falls to all those who maintain the system of transformation, transmission and distribution of electricity, and what methods keep that system stable? How does the “System Operator” differ from the “Federal Grid Company”, and why are both of these Russian companies state-owned rather than private?

There are many questions, and you need the answers to understand, even roughly, why we need so many power engineers and what, broadly speaking, they do. We are so used to everything being in perfect order with electricity in our homes and cities that we remember the electrical engineers only when something suddenly stops working and we fall out of our usual comfort zone. When it is dark and cold, that is when we talk about the power engineers, and in words we certainly will not print.

We are sure we were frankly lucky: a true professional agreed to take on this difficult, necessary and genuinely huge topic. Please love and welcome Dmitry Talanov, an Engineer with a capital E. You know, there is a country, Finland, where the title of engineer means so much that at one time a catalogue was published annually listing the specialists who held it. I would like such a glorious tradition to appear someday in Russia, and in our electronic Internet age it is much easier to maintain such an annually updated catalogue.

The article we bring to your attention is short, precise and succinct. Of course, everything Dmitry wrote could be described in much more detail, and at one time our magazine began a series of articles about how the conquest of electricity unfolded in the 19th century.

Georg Ohm, Heinrich Hertz, André-Marie Ampère, Alessandro Volta, James Watt, Faraday, Jacobi, Lenz, Gramme, Fontaine, Lodygin, Dolivo-Dobrovolsky, Tesla, Yablochkov, Deprez, Edison, Maxwell, Kirchhoff, the Siemens brothers and Westinghouse: the history of electricity holds many glorious names worth remembering. If anyone wants to recall the details of how it all began, you are welcome; Dmitry's article is the beginning of an altogether different story. We very much hope you will like it, and that we will see more of Dmitry Talanov's articles in the near future.

Dear Dmitry, from me personally: congratulations on your debut! And we ask all readers not to skimp on comments!

What is electric current, where does it come from and how does it get to our homes?

Anyone can find out why we need electricity, and how much it helps us live, by taking a critical look around their home and workplace.

The first thing that catches the eye is lighting. Truly, without it even an 8-hour working day would turn into torture. Getting to work in many big cities is already a small feat, but what if you had to do it in the dark? And in winter, in the dark both ways! Gas lamps would help on the main streets, but turn a little to the side and you cannot see a thing; you could easily fall into a basement or a pit. And outside the city, in nature, lit only by starlight?

Night street lighting, Photo: pixabay.com

Without electricity there is also no way to remove the heat from offices that fresh air can hardly reach. You could, of course, open the windows and tie a wet towel around your head, but how long would that help? The pumps that deliver water also need electricity; otherwise you would be making regular trips to a hand pump with a bucket.

Coffee in the office? Forget it! Only if everyone brews at once, and not often, so that the smoke of burning coal does not poison the working atmosphere. Or you could fetch it from a nearby tavern for extra money.

Send a letter to the next office? Take paper, write it by hand, then deliver it on foot. To the other end of town? Call a courier. To another country? Do you know what that would cost? And do not expect an answer in under six months from neighbouring countries, or from one to five years from overseas.

Back home, we must light the candles. Reading by candlelight is torture for the eyes, so you will have to occupy yourself some other way. With what? There is no TV, no computer, no smartphone, for there is nothing to power them with. Lie on the bench and stare at the ceiling! Though the birth rate would certainly rise.

It should be added that today all plastics and fertilizers are obtained from natural gas in factories where thousands of motors spin, driven by that same electricity. Take it away, and the list of available fertilizers shrinks to those that can be prepared from natural raw materials in vats, with the toxic slurry stirred by hand-, water- or steam-driven paddles. The volume of output shrinks accordingly.

Forget plastics! Ebonite becomes our greatest happiness from a long list, and among metals, cast iron the most affordable. In medicine, the stethoscope and the quickly rusting scalpel return to the stage as the main instruments; the rest sinks into oblivion.

One could go on for a long time, but the idea should be clear by now. We need electricity. We could survive without it, but what kind of life would that be! So where did this magical electricity come from?

Discovery of electricity

We all know the physical truth that nothing disappears without a trace; it merely passes from one state to another. The Greek philosopher Thales of Miletus encountered this truth in the 7th century BC when he discovered electricity as a form of energy by rubbing a piece of amber with wool. Part of the mechanical energy turned into electrical energy, and the amber (in ancient Greek, “electron”) became electrified, that is, it acquired the property of attracting light objects.

This type of electricity is now called static, and it has found wide application, including in gas-purification systems at power plants. In Ancient Greece, though, there was no use for it, and had Thales of Miletus not left records of his experiments, we would never have known who was the first thinker to focus his attention on what is perhaps the purest form of energy familiar to us to this day. It is also the most convenient to control.

The term “electricity” itself, from the Greek word for amber, was coined by William Gilbert in 1600. From that time on, experiments with electricity multiplied as researchers tried to unravel its nature.

As a result, between 1600 and 1747 a series of exciting discoveries followed, and the first theory of electricity appeared, created by the American Benjamin Franklin. He introduced the concepts of positive and negative charge, invented the lightning rod and used it to prove the electrical nature of lightning.

Then, in 1785, Coulomb's law was discovered, and in 1800 the Italian Volta invented the galvanic cell (the first source of direct current, predecessor of modern batteries and accumulators): a pile of zinc and silver discs separated by paper soaked in salt water. With the advent of this source of electricity, stable by the standards of the time, new and important discoveries quickly followed one after another.

Michael Faraday giving his Christmas lecture at the Royal Institution. Lithography fragment, Photo: republic.ru

In 1820 the Danish physicist Oersted discovered electromagnetic interaction: closing and opening a direct-current circuit, he noticed oscillations of a compass needle placed near the conductor. In 1821 the French physicist Ampère established that current-carrying conductors interact through the magnetic fields around them. This paved the way for Michael Faraday, who in 1831 discovered electromagnetic induction, described the electric and magnetic fields, and created the first electric generator: pushing a magnet into a coil of wire, he found that an electric current appeared in the coil's winding. Faraday also invented the first electric motor: a current-carrying conductor rotating around a permanent magnet.

It is impossible to mention all the participants in the “race for electricity” in this article, but the result of their efforts was an experimentally verified theory describing electricity and magnetism in detail, the theory by which we now build everything that needs electricity to function.

Direct or alternating current?

In the late 1880s, before the advent of global standards for the production, distribution and consumption of industrial electricity, a battle broke out between supporters of the use of direct and alternating current. Tesla and Edison stood at the head of the opposing armies.

Both were talented inventors, except that Edison had far better developed business abilities, and by the time the “war” began he had managed to patent many technical solutions that used direct current, which at that time was the default standard in the USA. (Direct current is a current whose direction does not change over time.)

But there was one problem: in those days direct current was very difficult to transform to a higher or lower voltage. Today, if the mains supplies 240 volts and our phone needs 5, we plug into the socket a universal box that converts almost anything into anything across the range we need, using modern transistors controlled by tiny logic circuits with sophisticated software. What could be done then, 70 years before the invention of even the most primitive transistor? And when, to limit transmission losses, the voltage had to be raised to 100,000 volts to deliver electricity over 100 or 200 kilometres, voltaic piles and primitive direct-current generators were powerless.
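Why does raising the voltage cut transmission losses? For the same delivered power P, the line current is I = P / V, and the heat dissipated in the line's resistance is I² × R, so a tenfold increase in voltage reduces the loss a hundredfold. A rough sketch (the 1 MW load and 10 Ω line resistance are round numbers assumed for the example):

```python
def line_loss(power_w, voltage_v, resistance_ohm):
    """Heat lost in the line: current is I = P / V, loss is I**2 * R."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

P = 1_000_000  # 1 MW to deliver
R = 10.0       # assumed total line resistance, ohms

for v in (10_000, 100_000):
    loss = line_loss(P, v, R)
    print(f"{v:>7} V: loss = {loss / 1000:6.1f} kW ({100 * loss / P:.2f}% of the power sent)")
```

At 10 kV the line wastes 100 kW (10% of the megawatt); at 100 kV only 1 kW (0.1%). This is the whole economic case for high-voltage transmission, and in Edison's day only alternating current could be stepped up that far.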

Understanding this, Tesla advocated alternating current, which even in those days was easy to transform to any voltage level. (Alternating current is a current whose magnitude and direction change periodically over time, even with a constant load; at a mains frequency of 50 Hz this happens 50 times per second.) Edison, unwilling to lose his patent royalties, launched a campaign to discredit alternating current, insisting that this type of current was especially dangerous to all living things; as proof, he publicly killed stray cats and dogs by applying electrodes connected to an alternating-current source.

Edison lost the battle when Tesla offered to light the entire city of Buffalo for $399,000 against Edison's bid of $554,000 for the same job. On the day the city was lit with electricity generated by an alternating-current station at Niagara Falls, General Electric dropped direct current from its future business plans and threw its influence and money fully behind alternating current.

Thomas Edison (USA), Fig.: cdn.redshift.autodesk.com

It may seem that alternating current conquered the world forever. But it has congenital defects that stem from its very variability. First of all, there are the losses in the inductive component of the power-line wires used to transmit electricity over long distances; these are 10-20 times higher than the losses the same lines would incur carrying direct current. Add the increased complexity of synchronizing the nodes of the power system (individual cities, say): not only the voltages of the nodes must be equalized but also their phases, since alternating current is a sine wave.

Alternating current also makes the nodes far more prone to “swinging” against each other, when voltage and frequency start to wander up and down; the ordinary consumer notices this when the lights in his apartment flicker. Usually it is a harbinger of the end of the nodes' joint operation: the ties between them break, and some nodes are left with an energy deficit, which lowers their frequency (that is, slows those same electric motors and fans), while others are left with a surplus, driving voltages dangerously high across the whole node, including our sockets and whatever is plugged into them. And on a sufficiently long power line, which matters greatly for the Russian Federation, other effects appear to spoil the electricians' mood. Without going into detail: transmitting alternating current over very long distances becomes difficult, sometimes impossible. For reference, the wavelength at 50 Hz is about 6000 km, and as a line approaches half that length, 3000 km, the effects of travelling and standing waves, plus resonance effects, come into play.
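The 6000 km figure is simply the free-space wavelength of a 50 Hz electromagnetic wave, the speed of light divided by the frequency (propagation along a real line is somewhat slower, so the true electrical wavelength is a bit shorter):

```python
c = 299_792_458.0  # speed of light in vacuum, m/s
f = 50.0           # mains frequency, Hz

wavelength = c / f
print(f"wavelength at 50 Hz: {wavelength / 1000:.0f} km")  # ≈ 5996 km
print(f"half the wavelength: {wavelength / 2000:.0f} km")  # ≈ 2998 km
```

Once a line's length becomes comparable to half the wavelength, it stops behaving like a simple resistor and must be treated as a transmission line with standing-wave and resonance effects.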

These effects are absent with direct current, which means the stability of the power system as a whole increases. Considering this, and the fact that computers, LEDs, solar panels, batteries and much else run on direct current, we can conclude that the war over direct current is not yet lost. Modern DC converters for any power and voltage have very little ground left to cover before they match the price of the AC transformers familiar to mankind. After that, apparently, direct current's triumphant march across the planet will begin.
