I upgraded the nesdev wiki from 1.15.5 to 1.18.0 a day or two ago, per this thread.
Since then, things seem to be working, but I need folks to browse around and make sure everything looks OK. If folks could do that I'd appreciate it.
However, there's one major drawback: it appears that starting with MediaWiki 1.17.x, the developers moved away from using flat files for CSS and JavaScript bits and instead "siphon" the data through PHP. This is documented in the MediaWiki 1.17.x release notes as follows:
Code:
* ResourceLoader, a new framework for delivering client-side resources such as
JavaScript and CSS, has been introduced. These resources are now delivered
through the new entry point script "load.php", instead of as static files
served directly by the web server. This allows minification, compression and
client-side caching to be used more effectively, which should provide a net
performance improvement for most users.
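To make the release note concrete, here's a minimal sketch of why this breaks static mirroring. The URLs below are illustrative, not the wiki's actual paths:

```shell
# Pre-1.17: a skin's stylesheet was a static file the web server could
# hand straight to wget or dumpHTML, and the dumped HTML's <link> tags
# pointed at it by the same path:
old_url='http://example.org/wiki/skins/monobook/main.css'

# 1.17+: the same CSS comes out of the load.php entry point, keyed by
# query-string parameters, so there is no flat file on disk to archive:
new_url='http://example.org/wiki/load.php?modules=site.styles&only=styles'

# A recursive mirror saves that response under a filename derived from
# the query string, which a browser reading the dump off local disk
# will never resolve, so pages render with no CSS or JavaScript:
echo "${new_url##*/}"
```

The point being: the resource is no longer identified by a filesystem path at all, only by parameters interpreted by PHP at request time.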
Why this is a problem: this badly breaks CSS/JavaScript in the "weekly archive" that folks wanted so badly so they could browse the Wiki without a network connection; Banshaku has verified the breakage. Specifically, the dumpHTML extension does not handle this situation well. Technically I understand why, and I don't believe there's an easy solution; alternate tools like wget --recursive and so on will not address the problem either.
As such, I've disabled weekly archive generation and taken the existing archives offline for the time being. I need to get an idea of how many people actually use the archive, because I really don't see an easy way to solve this problem.
Too much software is doing this crap these days, and for no good reason; why people can't let the actual web server/daemon do its job I'll never know. Sigh...