The idea behind watts seems deceptively simple. By definition, a watt is the power delivered when one ampere of current flows across a potential difference of one volt. Put another way, a watt is the power dissipated by a 1V source driving a 1Ω resistor. That’s easy to say, but how do you measure it in the real world? [DiodeGoneWild] has the answer in a recent video where he tears a few wattmeters open.
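As a quick sanity check of that definition (our own illustration, not anything from the video), Ohm’s law ties the quantities together: a 1V source across a 1Ω resistor draws 1A, so it dissipates exactly 1W.

```python
def power_from_voltage_resistance(volts, ohms):
    """P = V^2 / R, combining Ohm's law (I = V / R) with P = V * I."""
    return volts * volts / ohms

# The textbook case: 1 V across 1 ohm draws 1 A and dissipates 1 W
p = power_from_voltage_resistance(1.0, 1.0)  # 1.0 watt
```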
There are plenty of practical concerns. With AC, for example, the phase of the components matters. The first 11 minutes of the video are somewhat of a theory review, but then the cat intervenes and we get to see some actual hardware.
Inside the first wattmeter, he finds essentially the same circuit he was drawing at the start of the video, with some practical additions like range selection. The implementation differs slightly from his sketch, but the core principle is the same: measure the voltage and the current to find the power.
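One way to sketch that principle in software (our own illustration, not the circuit in the video) is to sample voltage and current simultaneously and average their instantaneous product. Averaging the product automatically accounts for the AC phase shift mentioned earlier: a 60° lag cuts the real power in half compared to the volt-amp product.

```python
import math

def real_power(v_samples, i_samples):
    """Real (active) power: the mean of instantaneous v*i over the window."""
    return sum(v * i for v, i in zip(v_samples, i_samples)) / len(v_samples)

# Simulate one full cycle: 230 V RMS mains, 1 A RMS current lagging by 60 degrees
N = 1000
v_pk, i_pk = 230 * math.sqrt(2), 1.0 * math.sqrt(2)
phi = math.radians(60)  # power factor cos(60°) = 0.5
v = [v_pk * math.sin(2 * math.pi * n / N) for n in range(N)]
i = [i_pk * math.sin(2 * math.pi * n / N - phi) for n in range(N)]

p = real_power(v, i)  # ≈ 230 V * 1 A * 0.5 = 115 W, not 230 VA
```

A purely resistive load (phi = 0) would give the full 230 W, which is why a wattmeter has to multiply the waveforms rather than just multiply the two meter readings.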
We tend to be a little more cautious around mains power, but unlike some other famous YouTubers, he manages not to shock himself or set any fires, at least on camera.
We marvel at the mechanical design of these meters, and we also like the homebrew power meter power strip. So whether you like the theory, the teardowns, or a homebrew project, there’s something for you.
Measuring power at RF is a whole other science. Naturally, there are many ways to measure wattage, and not every instrument uses the same method.