What I know about it from high school and from general articles on the net doesn't satisfy me. Maybe that's because I have critical holes in my knowledge.

From what I think I know: we have AC running in the lines. AC means that if we zoom down, we'll see an electron zipping along in one direction, and after 1/50 sec (or is it 1/100?) that very electron will zip back in the opposite direction, ideally returning to the specific point we're looking at, because the phases are supposed to be equal.

So how does resistance come into the picture at the atomic scale? Conductors heat up after a while, so maybe that's because some of the electrons' kinetic energy gets transferred into the wire as heat? Does this mean the electrons slow down? But then does that mean electricity will somehow, at some point, propagate slower than light?

Most if not all of our devices actually use DC, using relay(s) to get it from AC. From the only type of relay that was ever explained to me, the DC current the device receives seems to be on & off. This moment the electrons are moving forward, and the relay lets them flow into the device. 1/50 s later the electrons move "backward", and it cuts the circuit so they can't flow back and the device doesn't lose electrons that way (but it doesn't gain anything either, hence my 'on & off' understanding). So my question is: is this detrimental to the device? Is it responsible for the flickering of lights & other stuff? If so, was the figure of 50 Hz chosen mainly to make that flickering imperceptible to us?

This leads to another big pondering: why the fuck don't they just use DC from the source? There are methods to transfer DC over long distances, and they seem to be tried and probably true. Or is the reason simply inertia? That people are so used to AC, and AC systems are all over the place, so switching is not cost-effective? Is there research on this very subject yet?

Last but not least, I wonder how exactly devices "consume" electricity. Like, is it that many electrons enter the device but fewer exit? If not, how do the meters count our consumption?


gjm

Feb 24, 2020

180

The speed at which electrical signals propagate is much faster than the speed at which electrons move in an electrical conductor. (Possibly helpful metaphor: suppose I take a broomstick and poke you with it. You feel the poke very soon after I start shoving the stick, even though the stick is moving slowly. You don't need to wait until the very same bit of wood I shoved reaches your body.)

The speed at which electrical signals propagate is slower than the speed of light, but it's a substantial fraction of the speed of light and it doesn't depend on the speed at which the electrons move. (It may correlate with it -- e.g., both may be a consequence of how the electrons interact with the atoms in the conductor. Understanding this right is one of the quantum-mechanical subtleties I mention below.)
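For a rough ballpark of that propagation speed: on a cable it works out to 1/sqrt(L'C'), where L' and C' are the line's inductance and capacitance per unit length. A minimal sketch in Python, assuming typical coax-like per-metre values (the figures are illustrative, not from the thread):

```python
import math

# Signal speed on a cable is v = 1 / sqrt(L' * C'), where L' and C'
# are the inductance and capacitance per unit length of the line.
L_per_m = 250e-9   # henries per metre (assumed, coax-like ballpark)
C_per_m = 100e-12  # farads per metre (assumed, coax-like ballpark)

v = 1 / math.sqrt(L_per_m * C_per_m)  # metres per second
c = 299_792_458.0                     # speed of light in vacuum, m/s
print(f"signal speed ~ {v:.2e} m/s, i.e. {v / c:.0%} of c")
# -> about 2e8 m/s: a substantial fraction of c, and vastly faster
#    than the electrons themselves drift.
```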

When current flows through a conductor with some resistance, some of the energy in the flow of the electrons gets turned into random-ish motion in the material, i.e., heat. This will indeed make the electrons move more slowly but (see above) this doesn't make much difference to the speed at which electrical effects propagate through the conductor.

(What actually happens in electrical conductors is more complicated than individual electrons moving around, and understanding it well involves quantum-mechanical subtleties, about most of which I know too little to speak.)

It is not usual to convert AC to DC using relays.

It is true that if you take AC power, rectify it using the simplest possible circuit, and use that to supply a DC device then it will alternate between being powered and not being powered -- and also that during the "powered" periods the voltage it gets will vary. Some devices can work fine that way, some not so fine.

In practice, AC-to-DC conversion doesn't use the simplest possible circuit. It's possible to smooth things out a lot so that the device being powered gets something close to a constant DC supply.
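To make "simplest possible circuit" versus "smoothed" concrete, here is a toy numerical sketch of a half-wave rectifier feeding a smoothing capacitor and a resistive load. All component values are assumptions picked for illustration, not anything specified in the thread:

```python
import math

# Toy half-wave rectifier: 50 Hz sine, ideal diode, smoothing
# capacitor discharging into a resistive load between peaks.
f, V_peak = 50.0, 325.0        # ~230 V RMS mains -> ~325 V peak
R_load, C = 1000.0, 470e-6     # assumed load and smoothing capacitor
dt, t_end = 1e-5, 0.06         # time step; simulate three cycles

v_cap, t = 0.0, 0.0
v_lo, v_hi = float("inf"), 0.0
while t < t_end:
    v_in = V_peak * math.sin(2 * math.pi * f * t)
    if v_in > v_cap:
        v_cap = v_in  # diode conducts: capacitor follows the input
    else:
        v_cap *= math.exp(-dt / (R_load * C))  # diode blocks: cap discharges
    if t > 0.02:  # ignore the start-up transient
        v_lo, v_hi = min(v_lo, v_cap), max(v_hi, v_cap)
    t += dt

print(f"smoothed output ripples between {v_lo:.0f} V and {v_hi:.0f} V")
# Without the capacitor the output would sit at 0 V for half of
# every cycle; with it, the voltage only sags slightly between peaks.
```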

But there are similar effects even when no rectification is being done. You mentioned flickering lights, and until recently they were an example of this. If you power an incandescent bulb using AC at 50Hz then the amount of current flowing in it varies and accordingly so does the light output. (At 100Hz, not 50Hz; figuring out why is left as an exercise for the reader.) However, because it takes time for the filament to heat up and cool down the actual fluctuation in light output is small. Fluorescent bulbs respond much faster and do flicker, and some people find their light very unpleasant for exactly that reason. LED lights, increasingly often used where incandescents and fluorescents used to be, are DC devices. I think there's a wide variety in the circuitry used to power them, but most will flicker at some rate. Good ones will be driven in such a way that they flicker so fast you will never notice it. (Somewhere in the kHz range.)
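Spoiler for the exercise, for readers who want it: the filament doesn't care which way the current flows; its light output tracks dissipated power, which goes as the square of the sinusoidal voltage, and squaring doubles the frequency:

$$P(t) = \frac{V(t)^2}{R} = \frac{V_0^2}{R}\sin^2(2\pi f t) = \frac{V_0^2}{2R}\bigl(1 - \cos(2\pi \cdot 2f \cdot t)\bigr)$$

So at f = 50 Hz, the power, and hence the flicker, cycles at 2f = 100 Hz.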

Sometimes DC (at high voltages) is used for power transmission. I think AC is used, where it is used, because conversion between (typically very high) transmission voltage and the much lower voltages convenient for actual use is easy by means of transformers; transformers only work for AC. (Because they depend on electromagnetic induction, which works on the principle that changes in current produce magnetic fields and changes in magnetic field produce currents.) I don't know whether AC or DC would be a better choice if we were starting from scratch now, but both systems were proposed and tried very early in the history of electrical power generation and I'm pretty sure all the obvious arguments on both sides were aired right from the start.

When a device "consumes" electrical energy it isn't absorbing electrons. (In that case it would have to accumulate a large electrical charge. That's usually a Bad Thing.) It's absorbing (or using in some other way) energy carried in the electric field. It might help to imagine a system that transmits energy hydraulically instead, with every household equipped with high-pressure pipes, with a constant flow of water maintained by the water-power company, and operating its equipment using turbines. These wouldn't consume water unless there were a leak; instead they would take in fast-moving water and return slower-moving water to the system. An "AC" hydraulic system would have water moving to and fro in the pipes; again, the water wouldn't be consumed, but energy would be transferred from the water-pipes to the devices being operated. Powering things with electricity is similar.


DC wasn't really a viable option at the start because of the transformer issue you mentioned. The local power lines carry ~100x higher voltage than what you get in your house, and the long distance power lines up to another 100x on top of that. Without that voltag... (read more)

Thank you. Using the water pipe analogy, one can see some obvious flaws with the AC system. What if something needs power right at the moment the water is in the middle state between to & fro, i.e. at a standstill? How about installing a converter device at the entrance of each household? Surely it'd be better to provide a continuous flow to devices, not to mention there'd be no need to manufacture trillions of small relays or rectifiers inside devices.

If what devices do is get fast water and release slow water, then it can be understood tha... (read more)

3 · gjm · 4y
The water (or, rather, the electricity) sloshes to and fro 50 times a second, so there's never enough delay between flicking the switch and getting usable power that a human being would notice. Typically other things are slower; e.g., if you're turning on an incandescent lightbulb then it may take longer than that for the filament to get hot enough that it starts glowing.

For many devices (e.g., your phone) there is a converter device, and when you attach your phone to its USB wall-plug it's getting DC electricity from it. It would be possible to have some sort of converter for every household, but every such converter has some losses, and many devices are perfectly happy just running off AC, and ones that aren't don't necessarily all want the same operating voltage. Again, if we were doing everything from scratch now it might be worth considering something like that (or it might not; the details matter and I'm not an electrical engineer myself), but we have a basically-working system and replacing it wholesale with something new would need to be a big improvement to be worth the tremendous cost and inconvenience.

It would be more accurate to say that devices use the energy in the electromagnetic field rather than the kinetic energy of electrons, as such. (There isn't a clear distinction between using the electric field and using the magnetic field; the two are very intimately linked and, e.g., if two observers are moving rapidly relative to one another, then what one sees as the electric field the other may see as the magnetic field.)

The motor in an electric fan works something like this. (Unfortunately it involves effects that don't have a close analogue in terms of flowing water.) There are coils of wire. You pass an alternating current through these coils; changing currents generate a magnetic field. (This isn't meant to be obvious. It was one of the big discoveries of 19th-century physics.) There's a lump of iron placed so that this magnetic field pulls on
1 · Long try · 4y
I think using water as an analogy to electricity is still somehow not adequate to the task. For example, to make it slosh back & forth would require a tremendous amount of energy, which seems not to be the case with electricity. But still, I also think that if a device consumes electricity, no matter in what way -- say, using the electromagnetic field -- then it must reflect into the lifeline in the wire (the electrons) in some way. Since the power source propagates energy using the jiggling of electrons, then by using them up, the device must impede that movement. This slowing of the jiggling will then propagate back and show up as a slowing of the turbine... ... which is to say, actually we convert kinetic energy into whatever type of energy we use, and that's the essence of "electricity"? BTW, thank you for your explanations on fans & stuff! The bits about computers & fridges were a gloss-over, but I guess I can have a vague understanding.
3 · gjm · 4y
Yes, water and electricity are different in important ways even though the analogy is informative sometimes. The energy in the electromagnetic field really truly is different from the kinetic energy of the electrons. (This is one of the important differences from water in a pipe, in fact.)

You can see this fairly easily in a "static" case: if I use electricity to charge up a big capacitor, I've stored lots of energy in the capacitor but it's potential not kinetic energy. (There's a lot of potential energy there because there's extra positive charge in one place and extra negative charge in another, and energy will be released if they are allowed to move together so that the net charge everywhere becomes approximately zero.) You might want to describe this situation by saying that the electrons involved have a certain amount of potential energy, just as you might say that when you lift a heavy object from the surface of the earth that object has acquired (gravitational) potential energy. That point of view works fine for this sort of static situation, but once your charges start moving around it turns out to be more insightful to think of the energy as located in the electromagnetic fields rather than in the particles that interact with those fields.

So, for instance, suppose you arrange for an alternating current to flow in a conductor. Then it will radiate, transmitting energy outward in the form of electromagnetic waves. (Radio waves, at the sort of frequencies you can readily generate in a wire. At much higher frequencies you get e.g. light waves, but you typically need different hardware for that.) This energy is carried by the electromagnetic field. It will propagate just fine in a vacuum: no need for any charged particles in between.

When you have an actual electrical circuit, things are more complicated, but it turns out that the electrical energy is not flowing through the wires, it's flowing through the field around the wires. And, again, this energy i
1 · Long try · 4y
I still have not achieved a breakthrough. See, when we broadcast a wave, say radio, it will propagate into space and be lost forever. Now, per your words, an AC flow in a wire will radiate energy outward => this means a lot of energy is lost all the time. Since the wattage in a wire is constant, we lose a big and constant amount of energy no matter what we do. That seems not to be the case in real life. Furthermore, if we accept that electrical energy actually flows in the field around the line, then why do we even need outlets and sockets? Just put a device near the wire, like those cordless chargers. Besides, electricity theft would be easy, since almost anyone could put a specialized stealing device near a public line.
3 · gjm · 4y
It's not a big amount. (For, e.g., a typical mains cable.) And cabling, especially if the currents flowing in it are at high frequencies (which means more radiation), is often designed to reduce that radiation. That's one reason why we have coaxial cables and twisted pairs. For a 50Hz or 60Hz power cable, though, the radiative losses are tiny. You can power devices wirelessly -- using "those cordless chargers". They are designed to maximize the extent to which these effects happen, and of course the devices need to be designed to work that way. Ordinary mains cables don't radiate a lot and it isn't practical to power anything nontrivial by putting it near a mains cable. But the most effective way of getting energy from the field around a pair of wires is ... to connect the wires into an electric circuit. Indeed, it's only when they're connected in such a circuit that the current will flow through the wires and the energy will flow around them.
2 · Robert Miles · 4y
I recall seeing something about a very low-powered (and cheaply made) LED lightbulb which could never be turned off. With the light switch on, it was bright, and with the light switch off it was much more dim, but not actually off. It turned out this was because in certain common house wiring configurations, electrical field effects between nearby wires allow enough power through to light the bulb https://www.youtube.com/watch?v=1uEmX5XClPY
1 · Long try · 4y
Yeah, I did have that experience too. But come to think of it, his explanation in the video sounds counter-intuitive for AC & DC. With the bulb connected to the mains via a wire (even though it's the neutral line and that line is severed), like in the better part of the video, as long as the mains is AC the bulb should always glow at least dimly... TBH I'm a bit more confused :)

reallyeli

Feb 24, 2020

100

Perhaps you already know this, but some of your statements made me think you don't. In an electric circuit, individual electrons do not move from the start to the end at the speed of light. Instead, they move much more slowly. This is true regardless of whether the current is AC or DC.

The thing that travels at the speed of light is the *information* that a push has happened. There's an analogy to a tube of ping-pong balls, where pushing on one end will cause the ball at the other end to move very soon, even though no individual ball is moving very quickly.

http://wiki.c2.com/?SpeedOfElectrons

Ooh, indeed I didn't know, thanks! The actual snail speed does surprise me. I guess an important hole has been patched.

Caveat that I have no formal training in physics.

cousin_it

Feb 25, 2020

50

I think Bill Beaty's page on electricity might be what you're looking for. Here's a joking teaser which shows the kinds of questions he's trying to answer:

Electricity is quite simple: "electricity" is just the flowing motion of electricity! Electricity is a mysterious incomprehensible entity which is invisible and visible, both at the same time. Also, electricity is both a form of energy and a type of matter. Both. Electricity is a kind of low-frequency radio wave which is made of protons. It's a mysterious force which cannot be seen, and yet it looks like blue-white fire that arcs across the clouds. It moves forward at the speed of light... yet it sits and vibrates inside your AC cord without flowing forwards at all. It's totally weightless, yet it has a small weight. When electricity flows through a light bulb's filament, it gets changed entirely into light. Yet not one bit of electricity is ever used up by the light bulb, and all the electricity flows out of the filament and back down the other wire. College textbooks are full of electricity, yet they have no electric charge! Electricity is like sound waves, no no, it's just like wind, no, the electricity is like the air molecules. Electricity is like cars on a highway, no, the electricity is the speed of the cars, no, electricity is just like "traffic waves." Electricity is a class of phenomena ...a class of phenomena which can be stored in batteries! If you want to measure a quantity of electricity, what units should you use? Why Volts of electricity, of course. And also Coulombs of electricity. And Amperes of electricity. Watts of electricity and Joules, all at the same time. Yet "electricity" is definitely a class of phenomena; merely a type of event. Since we can't have an amount of an event, we can't really measure the quantity of electricity at all... right? Right?

And then he goes on to answer all the questions one by one, in a very straightforward way.

Holy cow, I've just read to the "poynty" part of his work. Now I have a vague sense of why Tesla wanted to put wireless electricity into every household. And even Feynman was afraid of explaining the truth because of its complexity/difficulty.

Jumpman

Feb 24, 2020

30

The electrons in a current never move anything close to the speed of light (https://en.wikipedia.org/wiki/Drift_velocity). It is the propagation of the changes in the electric field caused by the electrons moving that moves at the speed of light. It is more like a tube full of marbles (a stretched analogy). If you push the marble on one end the marble at the other end moves almost instantly. The marble you pushed didn't move all that distance.
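To put a number on how slowly the marbles themselves move, here is a minimal sketch using the standard relation I = nqAv_d; the current and wire diameter are assumed values, and the electron density is the usual textbook figure for copper:

```python
import math

# Drift velocity from I = n * q * A * v_d, solved for v_d.
n = 8.5e28       # free electrons per cubic metre in copper (textbook value)
q = 1.602e-19    # elementary charge in coulombs
I = 10.0         # assumed current, amperes
d = 2.0e-3       # assumed wire diameter: 2 mm
A = math.pi * (d / 2) ** 2  # cross-sectional area in square metres

v_d = I / (n * q * A)
print(f"drift velocity ~ {v_d * 1e3:.2f} mm/s")
# -> about 0.23 mm/s: the electrons crawl along at a literal snail's pace.
```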

Yes, the heat in conductors comes from the electrons' kinetic energy. No, it doesn't really change the propagation speed of the current, since that is the electric field propagating. There is certainly power lost there, though.

It is not easy to transmit DC over long distances (https://en.wikipedia.org/wiki/War_of_the_currents). Edison tried hard to push the adoption of DC, going so far as to publicly electrocute elephants with high-voltage AC as a PR stunt to scare people. You can find videos of this online if you want. It didn't work, because it is just so much more efficient to transmit at high AC voltage and use transformers to step it down.

The wiki War of the Currents article ends with a brief mention of HVDC. China was utilizing it as of 2019, and they certainly are not stupid, so...

The HVDC article lists some pros & cons of it versus AC. At a quick glance, there are more pros. And what is the biggest disadvantage? Converter station cost. And what do those stations do? They convert that DC into AC, so it can be distributed to households and then switched back to DC inside the devices so they can use electricity! All of this clusterfuck nonsense could be avoided if they used an all-out DC system in the 1st place!

I guess using a war from more than 120 years ago to justify the current (pun intended) situation is not very good.

lincolnquirk

Feb 24, 2020

30

I can give some partial answers based on my own models:

AC is used for transmission because transformers are ubiquitous and incredibly valuable at all stages of transmission, and transformers work using AC (you need a changing current to generate a changing magnetic field, which in turn induces a voltage in the other winding). Transformers allow you to convert the voltage and isolate circuits. Isolation is important for safety, and voltage conversion is important to achieve the cross purposes of safety and efficiency. High voltage allows you to transfer more energy with fewer losses, but is far more dangerous to work with. This gets to your resistance question -- resistance / heat generation are related to the amount of current and the thickness of the material. To transfer a given amount of energy, higher voltage means less current needed for the same wire, which means lower heat losses.
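To put rough numbers on that last point, here is a toy comparison of the I²R loss at two transmission voltages; the line resistance and delivered power are assumed, illustrative figures:

```python
# Line loss is P_loss = I^2 * R, and for a fixed power delivered,
# I = P / V: raising the voltage cuts the current, and the loss
# falls with the square of the current. Figures below are assumed.
P = 1e6        # power to deliver: 1 MW
R_line = 5.0   # assumed total line resistance in ohms

for V in (10_000, 100_000):  # compare 10 kV vs 100 kV transmission
    I = P / V
    loss = I ** 2 * R_line
    print(f"{V:>7,} V: I = {I:6.1f} A, loss = {loss / 1e3:6.1f} kW"
          f" ({loss / P:.2%} of the power sent)")
# 10x the voltage -> 1/10 the current -> 1/100 the heat loss.
```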

Why 50Hz (or 60 in the US)? As far as I know, this is largely arbitrary. I do know that subtle differences in the frequency are used for signaling grid load. https://en.wikipedia.org/wiki/Utility_frequency has a lot of info though!

As for metering, I have no idea how current meters (ammeters/watt meters) work, but I am pretty sure no net electrons are entering or leaving e.g. your house or your appliance. Electrons in a circuit should be conserved, they're just the means of transfer of energy.
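For what it's worth, a watt-hour meter effectively integrates instantaneous power (voltage times current) over time, so it tallies energy delivered without any net flow of electrons in or out. A minimal numeric sketch of that idea, assuming an ordinary resistive load (all figures illustrative):

```python
import math

# A watt-hour meter effectively integrates v(t) * i(t) over time.
# Here: numeric integration for a purely resistive load. The mains
# and load figures are assumptions for illustration.
V_rms, f = 230.0, 50.0
R_load = 52.9             # assumed resistive load (~1 kW heater)
dt, t_end = 1e-5, 1.0     # integrate one second, then scale up

energy_j, t = 0.0, 0.0
while t < t_end:
    v = V_rms * math.sqrt(2) * math.sin(2 * math.pi * f * t)
    i = v / R_load          # the same electrons flow in and back out;
    energy_j += v * i * dt  # only the energy they deliver is tallied
    t += dt

print(f"over one hour this meters {energy_j * 3600 / 3.6e6:.2f} kWh")
```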

Tks. You mentioned isolation is important for safety. Can you elaborate with some specific examples? As far as I can imagine, unless the threat has been predicted, AC transformers are useless against sudden issues. Say, an abrupt surge will still propagate via its magnetic field before we can do anything.

3 · kpreid · 4y
Isolation is not about surges, but about preventing current from flowing in a particular path at all. In a transformer, there is no conductive (only magnetic) path from the input side to the output side. So, if you touch one or more of the low-voltage output terminals of a transformer, you can't thereby end up part of a high-voltage circuit no matter what else you're also touching; you only experience the low voltage.

This is how wall-plug low voltage power supplies work. Even the ones that are using electronic switching converters (nearly all of them today) are using a transformer to provide the isolation: the line-voltage AC is converted to higher-frequency AC, run through a small transformer (the higher the frequency, the smaller a transformer you need for the same power) and converted back to DC.
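The parenthetical about frequency and size follows from the standard transformer EMF equation, V_rms = 4.44 * f * N * B_max * A_core. A small sketch of the scaling; the turns count and flux density are assumed values, chosen only to illustrate:

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * B_max * A_core.
# For fixed voltage, turns and flux density, the core area needed
# shrinks in proportion to frequency. All figures are assumed.
V_rms = 230.0   # volts
N = 200         # assumed number of primary turns
B_max = 0.3     # assumed peak flux density in tesla

for f in (50, 50_000):  # mains frequency vs switching frequency
    A_core = V_rms / (4.44 * f * N * B_max)
    print(f"{f:>6} Hz: core area ~ {A_core * 1e4:8.3f} cm^2")
# 1000x the frequency -> roughly 1/1000 the core cross-section.
```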
1 · Long try · 4y
Oh, I was too focused on the system function while forgetting that safety can primarily apply to human health too :)

MoritzG

Feb 28, 2020

10

Why 50/60 Hz? It has to be too low to be heard, too high to be seen, high enough for transformation, low enough for low induction losses, and low enough for simple rotating machines. Trains could not use 50/60 Hz, so they went with a third of it (16 2/3 Hz or 20 Hz).
Grid frequency is controlled to within ±150 mHz; if that fails, private customers might get disconnected/dropped.
The time derivative of the grid frequency is a measure of the relative power mismatch.
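For what it's worth, that relation is usually formalized as the grid's swing equation. A minimal sketch, with the inertia constant and system rating as assumed round numbers:

```python
# Aggregated swing equation: df/dt = f0 * (P_gen - P_load) / (2 * H * S),
# i.e. the rate of change of grid frequency is proportional to the
# power mismatch. H (inertia constant) and S (system rating) are
# assumed round numbers for illustration.
f0 = 50.0      # nominal grid frequency, Hz
H = 5.0        # assumed inertia constant, seconds
S = 1e9        # assumed system rating: 1 GW

for mismatch_mw in (-50, 0, 50):  # generation minus load, in MW
    df_dt = f0 * (mismatch_mw * 1e6) / (2 * H * S)
    print(f"mismatch {mismatch_mw:+4d} MW -> df/dt = {df_dt:+.3f} Hz/s")
# A 5% generation shortfall drags the frequency down at 0.25 Hz/s.
```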

50-60Hz is not too low to be heard: https://www.youtube.com/watch?v=bslHKEh7oZk

It's not really too high to be seen either, lights that flicker at mains frequency can be pretty unpleasant on the eyes, and give some people headaches.

1 · MoritzG · 4y
True, I had not claimed that all criteria could be, or have been, met. Because of the noise and the heat, I just the other day replaced the inductive load in some of my very old but still fully functioning kitchen counter lights with modern switching current regulators. The 50 Hz produced a 100 Hz tone that had been bothering me for decades. But even some of those can be heard by some people. (Not me, I am deaf to anything >10 kHz.) It is a compromise in an area of sensory overlap, but the human senses are not equally sensitive to all frequencies. Your hearing is way better at 3 kHz. At your age you will still remember CRT monitors that would operate at 60 Hz at max resolution; bad, but they did get used.
11 comments

Downvoted for apparently not even trying to check online sources, like Wikipedia and physics stackexchange.

Research is a skill that requires specialized knowledge and a good deal of practice to do well. Long try did say that "general articles on the net doesn't satisfy" in their post, and I think we have a responsibility to assume that this represents a good faith effort. After all, the Internet is pretty hit-or-miss at explaining things in an accessible way. Often, explainers are aimed at small children and don't actually lead to the kinds of questions that would allow one to proceed deeper into a topic. And it can be very discouraging to approach a new topic when you don't even know what you don't know.

Criticism without any attempt at education is unhelpful, and there's no harm in approaching these things with kindness. Builds community, you know, and we could all use a little more of that.

Oh come on, many say one can't rely on wiki. On higher topics like quantum & maybe electricity, wiki uses high words that confuse the hell out of me. For example, it uses the term "drifting speed" to describe "electrons' velocity in wires" -- how was I supposed to know to find that and read it in the 1st place?

OTOH, I posted another question here asking where I should ask questions. Some people suggested posting on as many sites as possible, which means LW included. Even the FAQ and some other "official" documents here encourage asking any and all kinds of questions.

If by downvoting you meant that the community only accepts high-level questions where one must do substantial research (how substantial being defined by those who read the questions) before even considering writing it, then I think you succeeded. I do feel bad seeing my question downvoted to a rotund 0, and I do feel discouraged from asking questions in the future.

Some notes on researching new topics:

  1. You're right that many people say you can't rely on wiki. Unfortunately, statements as broad as that are rarely useful. A more nuanced approach would be something like "wiki can often be a good starting point, but don't stop there". Check the sources, especially on topics (e.g. drifting speed) that aren't particularly clear, but really on anything that catches your interest. When you feel like you have enough information to narrow your search terms a bit, do that and see if more sources come up that you weren't able to access with the more general question.
  2. You won't be able to keep the whole topic in your head at once. Make brief notes on each source you used and quick summaries of any interesting information you got out of it. Number the notes so you can cross-reference them (any numbering system will do as long as each note has a short but unique identifier). Write down your questions and thoughts as more notes as they come to you (maybe set them in a special pile so you can find them more easily), then append those notes with answers or partial answers when you find them. Record the full answer in your own words, and link to the notes that helped you write it. Back-link related notes to the question to make it easier to follow your cross-references using that question as an entry point.
  3. If you're thinking this sounds like a lot of work, it is. Keep a list of sources you want to check out and why, and take on only as much as you're comfortable doing at a time. Even if you tend to process only one article or chapter per day, you are making progress! And don't be hard on yourself if you feel like you can't give the project the time you want to: that only leads to feeling frustrated and spending even less time on it.
  4. On forums, you're likely to get friendlier results for asking questions like, "I'm curious about [Specific Thing]. [Source] and [Source] seem to suggest [Brief Summary], but I'd like a little clarification on [Even More Specific Thing]. Sources would be appreciated!" than really general questions like "How does [thing] work?", because you're giving the community a starting point for the discussion instead of a general topic.
  5. [ETA] From the conversation in the Answers section, it looks like you're good at asking follow-up questions. That's a huge help when you're doing research!

I hope you find these notes useful. If you would like to go deeper into any of them I'd be happy to discuss them with you. :)

My appreciation - that's really helpful, especially point 2. I hesitated a bit when I saw the number of links in cousin_it's link, but point 3 encourages me to do it, even if slowly.

Point 4 is kinda hard from my POV. I admit I'm too lazy to dig all the sources to display in a post. But then, if a question is formatted like that, wouldn't it be way too long? I thought titles should be concise & provoking.

Remember, you have a title and a body to work with when asking a question. Pithy titles are good for getting attention, and there's room for a bit more elaboration once people click through. The key is to keep it both open-ended and specific so the conversation has somewhere solid to start from. Otherwise you'll get a lot more off-topic discussion.

I'm glad you found my notes helpful!

The problem is that you currently lack so much information about the things you are asking about, that no short explanation is possible. The atomic constitution of matter, electromagnetism, electrical engineering. Even to just a high school level, that is a lot of ground to cover. No-one can pour a few paragraphs into your head that will give you all that knowledge.

About the reason for 60 Hz/50 Hz: keep in mind that for most power plant types, there is an actual spinning turbine generating that sine wave of power as a result of its rotating motion. When you attach a device to the grid that draws power, the energy comes out of those spinning turbines, and they would physically slow down except that grid operators closely match the grid energy supply to demand. They can monitor demand by watching the frequency: if demand goes up, like when you turn on a lightbulb, the turbines slow down and the frequency drops. You turn off the lights, and the reverse happens.

I do think you're right that flickering incandescent bulbs needing to be too fast to see was one of the reasons for that specific frequency. Too much lower and people notice. Conversely, too much higher and it gets harder to engineer turbines that spin fast enough and are still efficient and durable, especially with early 1900s era metallurgy and manufacturing tolerances.

Woah, it's a thought that never occurred to me: turbines slow down when we use electricity. Makes sense when one thinks hard about it. Did you work in a power plant or something?

There's another relevant question. When turbines rotate, they must be doing it inside a set of huge magnets; or they must themselves rotate magnets inside a huge coil. In either case, there's a need for magnets. As per my understanding, they can't be electromagnets, because that would defeat the purpose of generating electricity in the 1st place. So they must be natural ones. Those will decay over time because their field energy is being used all day. Therefore... theoretically, if humans exist long enough, we will run out of magnets and thus have no electricity? For now I have no idea what the Earth's capacity for magnetic materials is.

No, I never worked in a power plant or anything like it, but I have a physics background and back in school I took a class that involved a lot of modeling of the economics of electricity generation, including power grid management, and this came up.

And permanent magnets don't get used up. The energy that gets used is the mechanical energy moving them back and forth, which ultimately comes from the fuel (coal, gas, biomass, nuclear, wind, geothermal, or solar thermal). Their magnetic field just exists, and transfers that mechanical energy to the electrons that flow through the wires in the electric grid. So that one we don't need to worry about.

Edit to add: yes there are ways to generate any AC frequency you want. Obviously wind turbines don't spin at 50Hz, they use gearboxes to convert mechanical motion to the desired frequency before converting to electricity. Each such conversion costs some energy, though.