This is the case for a lot of inventions.
Distributed computing would eliminate the water usage, since the heat output wouldn’t be so highly concentrated, but it would probably somewhat increase power consumption.
In an ideal world I think data center waste heat would be captured for use in a district thermal grid / seasonal thermal energy store like the one in Vantaa.
Of course that isn’t to say that we shouldn’t be thinking about whether we’re using software efficiently and for good reasons. Plenty of computations that take place in datacenters serve to make a company money but don’t actually make anyone’s lives better.
There are actually some fossils of dinosaur mummies, a rare preservation of a rare preservation. For some species these give us direct evidence of their physical appearance beyond their bone structure.
Phones can figure out their location using gyroscopes and accelerometers
This is plainly false.
The error stack-up from the imprecision of a phone’s MEMS sensors would make positioning basically impossible after a couple dozen feet, let alone after hours of walking around.
There are experimental inertial navigation systems that can do what you describe, but they use ultra-sensitive magnetometers to detect tiny changes in the behavior of laser-suspended, ultra-cold gas clouds only a few hundred atoms in size. That is not inside your phone.
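To make the error stack-up concrete, here’s a minimal Monte Carlo sketch of dead reckoning with a stationary phone. The noise and bias figures are assumptions in the rough ballpark of consumer MEMS accelerometers, not measured values; double-integrating even a tiny uncorrected bias blows up the position estimate within minutes.

```python
import random

def drift_after(seconds, noise_std=0.05, bias_std=0.02, dt=0.02,
                trials=100, seed=0):
    """Mean 1-D position error (m) from double-integrating a noisy,
    biased accelerometer while the phone is actually standing still.

    noise_std and bias_std are assumed values in m/s^2; real consumer
    MEMS parts are often in this ballpark or worse.
    """
    rng = random.Random(seed)
    steps = int(seconds / dt)
    total = 0.0
    for _ in range(trials):
        bias = rng.gauss(0.0, bias_std)  # uncorrected per-run sensor bias
        v = x = 0.0
        for _ in range(steps):
            a = bias + rng.gauss(0.0, noise_std)  # true acceleration is zero
            v += a * dt   # integrate once: velocity drifts
            x += v * dt   # integrate twice: position drifts even faster
        total += abs(x)
    return total / trials

for t in (10, 60, 300):
    print(f"after {t:>3} s: ~{drift_after(t):.0f} m of position error")
```

Because the bias term integrates to an error growing like t², the estimate is off by meters within a minute and by city blocks within a few, which is why real systems fuse GPS, Wi-Fi, and map matching instead of trusting the IMU alone.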
Yeah, I really wish it weren’t like this, but replacing a phone’s OS is a lot more like flashing a custom BIOS than installing an OS on a hard drive.
It should probably say, “off the Antarctic coast”, or even “X kilometers off the Antarctic coast”.
What is the “everything” that Rust is being used in? From what I’ve heard, it’s being used in the same places you’d use C or C++, not in any other niches.
Yes, thermal mass only serves to even out fluctuations in temperature. If the outside environment swings between hot and cold then a building with high thermal mass will tend to have a temperature in the middle of those two extremes. Like how a heavier ship is tossed about on the waves much less than a small boat.
But if a place is consistently hot or cold for a long time, thermal mass doesn’t really do anything. At most you can use it as a battery: for example, run a heat pump while electricity is cheapest and use the thermal mass to hold the temperature you established through the expensive period.
So many people think it’s a substitute for insulation, though. Insulation is what slows the rate of heat transfer in or out, and it actually does let you use less energy to maintain a given temperature.
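The “battery” framing can be put in numbers with a back-of-the-envelope sensible-heat calculation. The material constants below are commonly quoted textbook figures for concrete, and the slab size and overshoot are made-up illustrative assumptions:

```python
SPECIFIC_HEAT_CONCRETE = 880.0  # J/(kg*K), commonly quoted value
DENSITY_CONCRETE = 2400.0       # kg/m^3, commonly quoted value

def stored_heat_kwh(volume_m3, delta_t_k):
    """Sensible heat banked by warming a concrete mass by delta_t_k kelvin."""
    mass_kg = volume_m3 * DENSITY_CONCRETE
    joules = mass_kg * SPECIFIC_HEAT_CONCRETE * delta_t_k
    return joules / 3.6e6  # joules -> kWh

# e.g. a 10 m x 10 m x 0.2 m floor slab deliberately overshot by 3 K:
print(f"{stored_heat_kwh(10 * 10 * 0.2, 3):.1f} kWh banked")
```

A few degrees of overshoot in a modest slab stores tens of kWh of heat, which is why load-shifting with thermal mass works; it just doesn’t reduce the total heat that leaks out, which is insulation’s job.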
I’m sure it makes the bean counters happier to have another asset valued at X amount, but in practice the software will just be locked in some vault where it won’t do anyone any good.
It’s an instance where the number on the screen doesn’t actually correspond to any useful economic activity.
Support for this feature would lessen the need for such players though, and anything that lessens the amount of JavaScript in the world is an objective moral good.
Basically any CEO, and even most small business owners, would kill you if it meant a few percent more yearly profit.
For a while now I’ve been thinking about how nice it would be to have something like a modern version of the Poqet PC.
The Poqet PC had a much nicer keyboard than the laptop in the article, and between the simplicity of its software and a very aggressive power management strategy (it actually paused the CPU between keystrokes) it could last for weeks to months on two AA batteries.
Imagine a modern device with the same design sensibilities. Instead of an LCD screen you could use e-ink. For power efficiency, and because e-ink isn’t well suited to full-motion video, the user interface could be text/keyboard based (though it could still display static images). Instead of the 8088 you could use something like an ARM Cortex-M0+, which would give you roughly the computing power of a 486 at less than 1/100th the wattage of the 8088. Instead of the AAs you could use sodium-ion or lithium-titanate cells for their wide temperature range and high cycle life (and although those chemistries have a lower energy density than lithium-ion, they’d probably still give you more capacity than the AAs, especially with prismatic cells). With such a minuscule power draw you could keep a device like that charged with a solar panel built into the case.
Such a device would have very little computing power compared to even a smartphone, but it could still be useful for a lot of things. Besides things like text editors or spreadsheets, you could replicate the functionality of the Wiki Reader and the Cybiko (imagine something like the Cybiko with LoRaWAN). You could maybe even keep a copy of Open Street Map on there, though I don’t know how computationally expensive parsing its data format and displaying a map segment is.
Does a larger MRI produce more data than a smaller one (same data density over a larger volume), or is it the same resolution spread out over a larger space?
It is from 2018, but how do you imagine that this was written by AI given that LLMs barely existed at the time and weren’t accessible by the general public?
Yeah, I’m not an expert in construction, but I don’t really know what this buys you vs. using, for example, insulating concrete forms.
Modern day chemists have succeeded where the alchemists of old failed by finally isolating phlogiston.
It might be less the quality of the research and more this:
(This comic is a bit outdated nowadays, but you get the idea).
Except the headlines say “scientists report discovery of miraculous new battery technology using A!”.
Also, I think people don’t realize how long it takes to commercialize battery technology. I think they put it in the same mental category as computers and other electronics, where a company announces something and then it’s out that same year. The first lithium-ion batteries were made in a lab in the 1970s. A person in 2000 could have said, “I’ve been hearing about lithium-ion batteries for decades now and they’ve never amounted to anything,” and they would have been right, but not because it’s a bunk technology or the researchers were quacks.
With electric cars you might not even need a special charger so much as a special charging cycle. It’s already the norm for cars to tell the charger what voltage and current they want, and it’s already the norm for cars to carefully control their battery’s temperature during charging.
That’s not to say you’d necessarily be able to do this with just a software update, but it’s not too far off from the current paradigm.
IDK, it wouldn’t be the first time a news org published some random shit as fact because they’re too eager to be the first to report on something.
I remember the days when bugs in x86 CPUs were almost unheard of. The Pentium FDIV bug and the F00F bug were considered these unicorn things.