• bobs_monkey@lemm.ee
    11 months ago

    Not exactly. There’s a specific ratio between the RPMs of the drive motor and the output frequency of the alternator, and it depends on how the alternator is designed (i.e. the number of poles), almost like a gear ratio that’s optimized for efficiency. Power plants have to constantly make slight adjustments to the drive motor speed to keep the frequency exact (usually done automatically within the drive control system).
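
    For reference, the standard synchronous-machine relationship behind that "gear ratio" is frequency = poles × RPM / 120 (equivalently, RPM = 120 × frequency / poles). A minimal Python sketch of it, with illustrative pole counts that aren't from the comment:

    ```python
    # Standard relationship: synchronous RPM = 120 * frequency / poles.
    # Pole counts below are illustrative, not tied to any specific plant.

    def synchronous_rpm(frequency_hz: float, poles: int) -> float:
        """Shaft speed the alternator must hold to output the target frequency."""
        return 120.0 * frequency_hz / poles

    for poles in (2, 4, 24):
        print(f"{poles:>2} poles: {synchronous_rpm(60, poles):6.0f} RPM for 60 Hz, "
              f"{synchronous_rpm(50, poles):6.0f} RPM for 50 Hz")
    # 2 poles -> 3600 RPM at 60 Hz; 24 poles -> 300 RPM at 60 Hz
    ```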

    I’ve never seen frequency be an issue in a residential system, but in theory it could happen.

    • datelmd5sum@lemmy.world
      11 months ago

      I don’t know how it is in 60 Hz regions, but here the generators produce 3 phases, 120 degrees apart. The voltage gets stepped up to as much as 400 kV, still in 3 phases, and then back down to 400 V when it’s distributed to people’s homes. Then you can pull 400 V 3-phase or 230 V 1-phase from your wall.
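
      For what it’s worth, the 400 V / 230 V pairing follows from the √3 factor between line-to-line and line-to-neutral voltage in a balanced three-phase system; a quick arithmetic check (not from the comment itself):

      ```python
      import math

      # In a balanced three-phase system:
      # line-to-line voltage = line-to-neutral voltage * sqrt(3)
      line_to_line = 400.0
      line_to_neutral = line_to_line / math.sqrt(3)
      print(f"{line_to_line:.0f} V line-to-line ≈ {line_to_neutral:.0f} V line-to-neutral")
      # -> 400 V line-to-line ≈ 231 V line-to-neutral (nominally 230 V)
      ```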

      • bobs_monkey@lemm.ee
        11 months ago

        It’s the same here, though we have varying transmission and distribution voltages via transformers and regulators. In my area, power comes in over the 500 kV lines through the open desert, enters the valley at 33 kV, and is stepped down to 5 kV for neighborhood distribution, which the single-phase 240/120 V transformers tap off for the EOL.

        More of what I was getting at was that generation is more or less the same across regions. Some external fuel source (diesel, natural gas, nuclear, steam, etc.) drives a rotor connected to an alternator, which is essentially an electric motor run in reverse: instead of the motor doing the driving, it’s being driven, which generates power. And the RPMs of whatever fueled drive mechanism is in use are not necessarily 1:1 with the alternator speed.
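
        As a hypothetical illustration of that last point (the numbers are made up, not from the comment): if the prime mover’s natural speed doesn’t match the alternator’s synchronous speed, a fixed gear ratio bridges the two.

        ```python
        # Hypothetical example of a prime mover geared to an alternator.
        # Alternator synchronous RPM = 120 * frequency / poles.

        def required_gear_ratio(prime_mover_rpm: float, frequency_hz: float, poles: int) -> float:
            """Ratio of alternator shaft speed to prime mover speed."""
            alternator_rpm = 120.0 * frequency_hz / poles
            return alternator_rpm / prime_mover_rpm

        # e.g. a turbine spinning at 5000 RPM driving a 4-pole, 60 Hz alternator
        print(required_gear_ratio(5000, 60, 4))  # 1800 / 5000 = 0.36 (a step-down ratio)
        ```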