War of Currents – Is It Advantage Edison in the 21st Century?

What is the ‘War of Currents’? An excerpt from Wikipedia:

The War of Currents was a series of events surrounding the introduction of competing electric power transmission systems in the late 1880s and early 1890s including commercial competition, a debate over electrical safety, and a media/propaganda campaign that grew out of it, with the main players being the direct current (DC) based Edison Electric Light Company and the alternating current (AC) based Westinghouse Electric Company. It took place during the introduction and rapid expansion of the alternating current standard (already in use and advocated by several US and European companies) and its eventual adoption over the direct current distribution system.

More than 100 years later, it may be worthwhile to deliberate on the merits of both types of electrical power systems. A constructive debate could promote radical approaches to the design of electrical machines. The imperatives and demands of society in the era of the resource revolution and circular economy may compel us to look beyond merely adapting the existing infrastructure to renewable energy options.

The acceptance of the AC system was fast and worldwide, owing to the economies of scale in generation, transmission and distribution. The entire system – the grid, as it is often called – could power the population of a state or nation. With its network of generation plants, transformers, and transmission lines, the grid appears to the consumer as an infinite source of energy. To do so, all the generators were required to run at their rated load and at a frequency of 50/60 Hz to deliver energy to the consumers. Demand usually matched generation, or was managed by ‘load shedding’. Over the course of a day, as the demand for electric power rose from morning to noon, more generation plants were connected to the grid, each running at its rated load. This network was adapted for various types of energy sources – coal, hydro, nuclear, and other fossil fuels.

The oil crisis of the 1970s and the subsequent turmoil, coupled with environmental concerns, led to the development of alternative or green energy solutions – solar, wind and biomass. But in each case the new power generation technology was required to comply with the supply-frequency restriction of 50/60 Hz. On the other hand, consumer-side loads started becoming more digital (running on DC) – DC drives, microprocessors, electronic goods. The constraint of supply frequency led to various losses and inefficiencies, such as harmonics, which were partly addressed by rectifiers and other power-electronic solutions. But this was still accepted as a system requirement.

However, with the emergence of distributed power solutions, the constraint of supply frequency became, in effect, a functional requirement for electrical machines. I am borrowing the terms ‘functional requirements’ and ‘constraints’ from Prof. Nam P. Suh’s Principles of Axiomatic Design. Functional requirements (FRs) are a minimum set of independent requirements that completely characterize the functional needs of the product (or software, organization, system, etc.) in the functional domain. By definition, each FR is independent of every other FR at the time the FRs are established. Constraints (Cs) are bounds on acceptable solutions. There are two kinds of constraints: input constraints and system constraints. Input constraints are imposed as part of the design specifications; system constraints are imposed by the system in which the design solution must function. With reference to these definitions of FRs and Cs, the constraint of supplying AC power at 50/60 Hz was treated as a functional requirement. In my humble opinion, it is at this point that inefficiencies were built into electrical machines at the concept stage.
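To make the FR-versus-constraint distinction concrete, here is a rough sketch of how the same generating-set requirements could be classified in the two views. The labels and groupings are my own illustrative choices, not drawn from Prof. Suh's texts or from any particular product specification.

```python
# Hypothetical classification of a generating set's requirements,
# purely to illustrate functional requirements (FRs) versus constraints (Cs).

conventional_view = {
    "functional_requirements": [
        "deliver AC power at 50/60 Hz at rated load",   # frequency treated as an FR
    ],
    "constraints": [
        "comply with safety and emissions regulations",
    ],
}

reconsidered_view = {
    "functional_requirements": [
        "deliver electric power in proportion to the load applied",
    ],
    "constraints": [
        "interface with a 50/60 Hz grid where required",  # frequency back to a constraint
        "comply with safety and emissions regulations",
    ],
}
```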

What Prof. Nam P. Suh and Prof. Genichi Taguchi promoted was an approach that moves towards an ideal, efficient design, where the output of the machine is calibrated to be linearly proportional to the signal (the load exerted). Reflecting these philosophies on the machines from my industry – AC alternators – the machines were designed to deliver a fixed frequency at the rated load, whereas ideally they should have been designed to deliver electric power in proportion to the load applied. In doing so, designers forced the mechanical motive power source – a variable-speed internal combustion engine – to run at a fixed 1500/1800 rpm. This builds a loss of efficiency in at the design stage.
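The coupling comes from the standard synchronous-machine relation f = P × N / 120 (P poles, N in rpm): a four-pole alternator must spin at exactly 1500 rpm to produce 50 Hz, or 1800 rpm for 60 Hz. A minimal sketch of that relationship, using the common four-pole case rather than any specific machine from the article:

```python
def electrical_frequency(poles: int, rpm: float) -> float:
    """Synchronous alternator output frequency in Hz: f = P * N / 120."""
    return poles * rpm / 120.0

def required_rpm(poles: int, frequency_hz: float) -> float:
    """Shaft speed the engine is forced to hold for a fixed output frequency."""
    return 120.0 * frequency_hz / poles

# A 4-pole alternator tied to the 50 Hz / 60 Hz standard:
print(required_rpm(poles=4, frequency_hz=50))   # 1500.0 rpm
print(required_rpm(poles=4, frequency_hz=60))   # 1800.0 rpm

# If the engine were allowed to slow down at part load, the raw AC
# frequency would drift with it - which is why the fixed-frequency
# requirement pins the engine to one speed regardless of load.
print(electrical_frequency(poles=4, rpm=1200))  # 40.0 Hz
```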

However, the emergence of DC technologies – HVDC (high-voltage DC transmission), 400 V DC architecture for commercial buildings, and renewable energy sources like wind and solar – may prompt designers to revisit the approach to designing electrical machines.

We are already seeing variable-speed diesel generator sets with outputs as high as 750 kVA. If the energy technologies of the future can be calibrated to the demand, should we constrain them to a supply frequency, or harness them for a sustainable society? Perhaps DC could hold the key. Hence the theme!
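One way such variable-speed sets escape the frequency constraint is to rectify the alternator's raw AC onto a DC link, so the engine speed can follow the load while the output is either used as DC or recreated at 50/60 Hz by an inverter. A minimal sketch of that decoupling, assuming a four-pole machine; the engine speeds shown are illustrative, not manufacturer data:

```python
def alternator_frequency(poles: int, engine_rpm: float) -> float:
    """Raw AC frequency before rectification: f = P * N / 120."""
    return poles * engine_rpm / 120.0

def dc_link_output(engine_rpm: float, poles: int = 4) -> str:
    """In a variable-speed set, the raw AC is rectified to a DC link, so the
    delivered power no longer carries the engine speed as a frequency; an
    inverter can recreate 50/60 Hz downstream if the load needs AC."""
    raw_hz = alternator_frequency(poles, engine_rpm)
    return f"engine at {engine_rpm:.0f} rpm -> raw {raw_hz:.1f} Hz -> rectified DC bus"

# The engine is free to slow down at part load and speed up at full load:
for rpm in (1100, 1400, 1800):  # illustrative speeds only
    print(dc_link_output(rpm))
```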
