
in reply to Cory Doctorow

Hail, hail entropy!

S = k · log W

A common interpretation of entropy is that it is somehow a measure of chaos or randomness. There is some utility in that concept: since entropy measures the dispersal of energy in a system, the more chaotic a system is, the more dispersed its energy, and thus the greater its entropy.
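Boltzmann's formula above can be illustrated with a tiny numerical sketch. Note that the formula uses the natural logarithm (the tombstone inscription writes "log"); the constant and example microstate counts below are illustrative, not from the original comment.

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k * ln(W) for a system with W equally likely microstates."""
    return K_B * math.log(microstates)

# More accessible microstates means more dispersal of energy, hence more
# entropy. Doubling W raises S by a fixed increment of k * ln(2),
# regardless of how large W already was.
s1 = boltzmann_entropy(10**6)
s2 = boltzmann_entropy(2 * 10**6)
print(s2 - s1)  # k * ln(2), about 9.57e-24 J/K
```

The logarithm is what makes entropy additive: combining two independent systems multiplies their microstate counts but adds their entropies.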

in reply to Cory Doctorow

Worth reading overall, but additionally for the clever neologisms "Gretacene" and "Manchin-Sinematic Universe".

Cory Doctorow reshared this.

in reply to Cory Doctorow

Your article reminds me so much of a recent @TechConnectify video. Are you aware of Technology Connections (aka angry dishwasher man)? Alex explains a lot of technology, old and new. If not, whoo boy, are you going to learn a thing or two about dishwashers.

Their recent video discusses, I think, electrification (mainly of cars) just so darn well that I heartily recommend sending it to all those who are still somewhat on the fence.

youtube.com/watch?v=KtQ9nt2ZeG…

Cory Doctorow reshared this.