Earlier, tepples wrote:
Without JavaScript or WebAssembly, OS-independent rich apps would have to run in an x86-64 VM instead.
Sites are broken in Safari because not every web developer can afford a separate computer just for testing on a 2% browser.
Recent discussions about scripting in the browser, both here and on another forum, have inspired me to flesh out the reasoning behind this.
In this post, Hojo_Norem wrote:
koitsu wrote:
And stop using Internet Explorer for god's sake -- still the #1 vector for malware and viruses that I know of.
I can't agree more! My choice of poison is Firefox with noscript.
Yeah, it does make using some sites a little bit of a bother when I'm paranoid enough that I rarely whitelist a domain.
But if I have to enable a dozen seemingly unrelated domains to access one page, do they really deserve my traffic in the first place?
I'm in the middle of a discussion on Slashdot with some members of the vocal minority who are against any use of JavaScript on the web. My understanding of their position is as follows:
Purist: "Whitelist nothing because 'web applications' should never have existed in the first place. Websites should be static, or have HTML forms at the most, and applications should be native. Screw Pirates Love Daisies, screw Cookie Clicker, and screw JSNES and em-fceux."
Me: "This means you'd miss out on ability to use applications whose developer happens to use a different operating system from your own."
Purist: "The developer can instead use cross-platform middleware, such as Qt, Unity 3D, or some classic game console for which Free emulators exist, to build a cross-platform app."
Me: "The developer would still need to buy a Mac to build the Mac version of a native application using a cross-platform library, and not all hobbyists and startups can afford that for launch day."
Purist: "It's no different from a developer having to buy a Mac to test a web application in Safari. There are several things that work in Chrome but break in Safari. And it wouldn't be a cost in the first place had you chosen Macs for your developers in the first place instead of Lenovo or HP."
Me: "But Safari isn't quite as important because a Mac's owner can choose to run Chrome or Firefox instead. Recent stats from caniuse.com show Safari for Mac at about 2 percent of usage share. I imagine that a lot of hobbyists are stuck with the hardware they had before the project began. And had you chosen a Mac, you'd still have to spend $120 for a retail Windows license for each Mac. At least with web applications, Microsoft offers feature-limited virtual machine images with IE 11 and Edge without charge."
Purist: "A user who cannot make use of the provided executable can still make use of 1. the list of features, 2. documentation, 3. the bug tracker, 4. the developer's blog, 5. things made with the application that its users have chosen to share publicly, and 6. complete corresponding source code so that he can port it."
Me: "A fully public bug tracker for a game might enable users to cheat by reading bugs and trying to reproduce them. And not every project's business model is aligned with distribution as free software. Some fear the ability to make a rebranded port, make it available on a separate website, and take all the credit and revenue."
So I suggested distributing an application as a disk image containing a very stripped-down Linux* distribution bundled with your app, which users of other operating systems can run in a virtual machine. But is that such an improvement over JavaScript running in a virtual machine? Do I fundamentally misunderstand something here?
* Size-optimized Linux is often not GNU/Linux, as it uses BusyBox instead of Bash and Coreutils, and Newlib, Bionic, or uClibc instead of glibc.
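For concreteness, here's roughly what I have in mind for the user's side: booting the bundled disk image in a virtual machine with one command. This is only a sketch; the image name app.img is hypothetical, and it assumes QEMU is installed on the host and the image contains a bootable, size-optimized Linux guest with the app set to start automatically.

```python
# Rough sketch, not a recommendation: launch a hypothetical bundled
# disk image ("app.img") in QEMU on the user's own OS.
import subprocess

subprocess.run([
    "qemu-system-x86_64",
    "-m", "256",                          # a BusyBox-based guest needs little RAM
    "-drive", "file=app.img,format=raw",  # the disk image shipped with the app
])
```

In practice a developer might instead ship a VirtualBox appliance or a launcher script, but the point is the same: the host OS only has to run a generic VM, not a native build of the application.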