The thing that fascinates me is that every single digital microwave I’ve ever used behaves the same way, and allows the “seconds-place” to be 0-99.
My best guesses are:
- There’s some ASIC that’s been around forever and everyone uses it (a cockroach chip like the 555).
- The first digital microwave did this and all subsequent ones followed.
- There are actually some implementation reasons why this is way more sensible.
Writing it in software, there are different ways folks would probably implement it; for example, “subtract one, calculate minutes and seconds, display” seems reasonable. But nope, every one I’ve ever used is just the Wild West in the seconds department.
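For what it’s worth, here’s a minimal C sketch (purely hypothetical firmware, not from any real microwave) contrasting the two approaches: one keeps a single total-seconds counter, the other treats the display digits themselves as the state.

```c
#include <stdio.h>

/* Approach A (hypothetical): keep one total-seconds counter,
 * derive minutes:seconds for the display every tick. */
static void tick_total(int *total) {
    if (*total > 0)
        (*total)--;
    printf("%d:%02d\n", *total / 60, *total % 60);
}

/* Approach B (hypothetical): the display digits ARE the state;
 * the seconds pair is just a two-digit down-counter, so an
 * entered "1:99" is perfectly valid and counts 99, 98, ... */
static void tick_digits(int *min, int *sec) {
    if (*sec > 0) {
        (*sec)--;
    } else if (*min > 0) {
        (*min)--;
        *sec = 59;  /* base-60 only appears at the borrow */
    }
    printf("%d:%02d\n", *min, *sec);
}

int main(void) {
    int total = 120;            /* 2:00 entered as 120 s */
    tick_total(&total);         /* prints 1:59 */

    int min = 1, sec = 99;      /* the "Wild West" entry */
    tick_digits(&min, &sec);    /* prints 1:98 */
    return 0;
}
```

With raw digits, accepting 0–99 in the seconds place costs nothing: no range check on entry, and base-60 only matters at the borrow.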
Tap into the button wiring in the control panel (if you’re qualified!). Maybe you’ll be able to type FF (hexadecimal) seconds, resulting in a cooking time of 150 + 15 = 165 seconds!
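A toy illustration of that arithmetic, assuming (hypothetically) that each keypress just loads an unchecked 4-bit nibble into the tens and ones places:

```c
#include <stdio.h>

/* Hypothetical: each "seconds" keypress loads an unchecked
 * 4-bit nibble. Injecting 0xF into both places gives
 * 15*10 + 15 = 165 seconds of cooking time. */
int main(void) {
    unsigned tens = 0xF, ones = 0xF;            /* "FF" seconds */
    printf("%u seconds\n", tens * 10 + ones);   /* 165 */
    return 0;
}
```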
Now seriously, a big portion of BCD (binary-coded decimal) equipment actually counts in plain binary (i.e., hexadecimal per digit); the counter is just expected to reset before reaching A. Sometimes it doesn’t:
https://www.youtube.com/watch?v=MvZgfj0_hJU&t=86
Another video, super compressed, by me, shot on a feature phone:
You also see it go to 3 st 14 lb, which should not be possible because it’s supposed to show 4 st 0 lb at that point. (I use kilograms; I was just wondering what the switch in the battery compartment does.)

I originally thought the ASIC inside would be a kind of microcontroller stripped down to what’s needed: a CPU (8- or even 4-bit, likely von Neumann) with a few registers (hard to call that RAM), (EE?)PROM for the program and calibration data (there’s a pair of pins labeled “CAL” for a jumper or serial interface; I won’t mess with them), a 1:4 multiplexed LCD driver, digital input (not even GPIO) pins for the unit switch and calibration interface, a differential amplifier and A/D converter, plus some support circuitry like an RC oscillator for the CPU clock, a charge pump for the LCD, and a low-battery detector and sleep/wake circuit for power saving. This could enable the same ASIC to be used in personal and kitchen scales, maybe even pressure gauges, thermometers, or more, with just a different program, LCD, and external components.

However, now I see that there is likely no CPU, because doing the math in software would make such errors impossible. So it’s most likely hardwired logic with hardware counters (just a little more complex than those cheap 3½-digit multimeters), binary-to-hex 7-segment digit converters, and a flaky analog threshold system in the st:lb mode (the kg mode is a robust 1500-count decimal counter).
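To make the “no CPU” argument concrete: if firmware derived the display from a single binary weight count, the pounds field could never read 14, because the modulo pins it to 0–13. A minimal sketch (hypothetical code, not the scale’s actual firmware):

```c
#include <stdio.h>

/* If a CPU derived st:lb from one binary weight count,
 * "3 st 14 lb" could never appear: total_lb % 14 is
 * always in 0..13 by construction. */
static void show_st_lb(int total_lb) {
    printf("%d st %d lb\n", total_lb / 14, total_lb % 14);
}

int main(void) {
    show_st_lb(55);   /* 3 st 13 lb */
    show_st_lb(56);   /* 4 st 0 lb, never 3 st 14 lb */
    return 0;
}
```

Hardwired counters with an analog threshold deciding the stone carry have no such guarantee, which fits the glitch in the video.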
So you’re saying they accept up to 99 in the seconds place when typed in but go from 2:00 to 1:59? (I don’t have a digital microwave at home)
yes
Why does OOP’s microwave go from 60 to 1:20 when the “+60” button is pressed?
I don’t think it does—I think OOP is doing the math and then inputting the sum.
There is no +60 button; there is a +1:00 min button.
Microwaves only see minutes and seconds: the first two digits are minutes and the last two are seconds (see the sketch below).
So 00:99 seconds is longer than 01:00.
Microwaves never leave off the leading zeros; they can’t. At least no microwave I’ve ever seen in 20 years can, and I’ve seen microwaves from the 70s.
They basically just have always worked this way.
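A minimal sketch of that digit interpretation (hypothetical code, but it matches the behavior described: the two digit pairs are independent, with no carry between them):

```c
#include <stdio.h>

/* Hypothetical digits-to-time parse: first two digits are
 * minutes, last two are seconds, no normalization.
 * "0099" -> 99 s, "0100" -> 60 s, so 00:99 outlasts 01:00. */
static int entered_seconds(int d3, int d2, int d1, int d0) {
    return (d3 * 10 + d2) * 60 + (d1 * 10 + d0);
}

int main(void) {
    printf("00:99 = %d s\n", entered_seconds(0, 0, 9, 9));  /* 99 */
    printf("01:00 = %d s\n", entered_seconds(0, 1, 0, 0));  /* 60 */
    return 0;
}
```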