The issue of ECC is so off-topic it's not even funny. :-)
I agree wholeheartedly though -- ECC (even if just single-bit-correcting) is something that should be commonplace these days, but it isn't, and that's all about cost-savings (compounded by gamers who are like "UHHH UHHH UHHH I NEED ANOTHER 4 FPS!" -- companies love gamers cuz they're cash cows). With CPU manufacturers having moved the MCH on-die (read: the northbridge now lives in the physical CPU itself), motherboard manufacturers can no longer offer boards that pair desktop CPUs with ECC-supporting northbridges. I used exactly those boards heavily throughout Parodius' existence, as I refused to pay the exorbitant cost of Xeons; now you're forced to buy Xeon ("server-class") CPUs if you want ECC, even though they really don't offer much more in the way of performance (there's lots of marketing blahblah trying to say otherwise). This applies to Intel exclusively, BTW; AMD so far has yet to pull something like this, but I imagine it's only a matter of time before their marketing goons decide to milk it too. Oh, and to clarify: AFAIK, on-die L1/L2/L3 caches use ECC, even on desktop CPUs.
On the flip side, one of the complications ECC brings with it is lots of fun MCEs (machine-check exceptions) that users can't decode or comprehend. "Is it external DRAM that's causing the MCE? Is it L2 cache? It just says 'memory'! How do I know which DIMM?" Most of the time users get it wrong, and the only way to figure out the cause is to replace one part at a time. These MCEs vary per CPU model, so quite often when a new CPU or chipset comes out, an OS update is required to properly parse/handle the new MCEs (some of which are generated under normal operation, believe it or not). OSes like FreeBSD have badly neglected MCE-handling code, so newer CPUs or chipsets often spew stuff that's possibly harmless -- but you can't be sure without decoding it, which is difficult.
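To give an idea of what "decoding" means here: below is a minimal sketch (in OCaml, since that's the language I reference later) that pulls apart an x86 IA32_MCi_STATUS value. The VAL/OVER/UC/PCC/ADDRV bit positions are the architectural ones from Intel's SDM as I remember them; the sample raw value is made up, and actually mapping the error code to "that DIMM" or "L2 cache" still requires per-family/model tables -- which is exactly the problem.

    (* Pull out the architecturally-defined bits of an IA32_MCi_STATUS
       value.  Everything beyond these -- including what the error code
       points at physically -- is family/model-specific. *)
    let bit reg n = Int64.(logand (shift_right_logical reg n) 1L) = 1L

    let decode_mci_status (status : int64) =
      if not (bit status 63) then
        print_endline "VAL=0: this bank holds no valid error"
      else begin
        Printf.printf "UC    (uncorrected error)   : %b\n" (bit status 61);
        Printf.printf "PCC   (context corrupt)     : %b\n" (bit status 57);
        Printf.printf "OVER  (earlier errors lost) : %b\n" (bit status 62);
        Printf.printf "ADDRV (MCi_ADDR is valid)   : %b\n" (bit status 58);
        (* Bits 15:0 hold the "compound" MCA error code.  Its encoding
           narrows things to e.g. "memory hierarchy, level 2", but
           mapping that to a physical component needs the model-specific
           code in bits 31:16 plus vendor documentation. *)
        let mca   = Int64.(to_int (logand status 0xFFFFL)) in
        let model = Int64.(to_int (logand (shift_right_logical status 16) 0xFFFFL)) in
        Printf.printf "MCA error code              : 0x%04x\n" mca;
        Printf.printf "model-specific code         : 0x%04x (meaning varies per CPU)\n" model
      end

    let () =
      (* hypothetical raw value, as a kernel might log it *)
      decode_mci_status 0xBE00000000800400L

Note how little of that is generic: four or five status bits. Everything interesting lives in the model-specific half, which is why OSes need updating for every new CPU generation.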
Back on topic: I could talk for days about this typing thing, but honestly I have colleagues/peers who grasp it (esp. the importance of strict types) way better than I do. It doesn't matter, though, because the "webshit-o-sphere", with its ever-growing number of horrible cargo-cult PLs and their vocal advocates, will continue to deny the importance of strong/strict typing. What saddens me greatly is seeing people who are old enough to know better, guys like Rob Pike, making stuff that caters to that mentality. I think it all boils down to people not wanting to have to type things like string_of_bool or int_of_string (to use native OCaml functions as an example), or to explicitly annotate (at declaration time) what type something is. They truly believe the PL should magically figure these things out at run-time and know what they intended. When it goes awry, like in the case of the above example, I can only shake my head.
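For what it's worth, here's a tiny sketch of the explicitness I'm talking about, using those same stock OCaml functions. Nothing exotic: the point is that every conversion is spelled out, and garbage input fails visibly instead of being silently coerced (int_of_string_opt is the non-raising variant of int_of_string, in the stdlib since 4.05).

    let () =
      (* Explicit annotation at declaration time: port is an int, period. *)
      let (port : int) = int_of_string "8080" in
      let debug = true in

      (* No implicit bool->string or int->string coercion; you say what
         you mean with string_of_bool / string_of_int. *)
      print_endline ("debug enabled: " ^ string_of_bool debug);
      print_endline ("listening on : " ^ string_of_int port);

      (* Garbage input is a visible failure, not a guessed-at value. *)
      match int_of_string_opt "42abc" with
      | Some n -> Printf.printf "parsed %d\n" n
      | None   -> print_endline "\"42abc\" is not an int -- no magic coercion"

A coerce-happy language will cheerfully turn "42abc" into 42 (or NaN) and keep going; here the programmer has to decide what failure means.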