Porting my tools to C or C++

This is an archive of a topic from NESdev BBS, taken in mid-October 2019 before a server upgrade.
View original topic
Porting my tools to C or C++
by on (#234603)
Some of the data conversion and compression tools in the build process of my public and private retro console projects are time-consuming. For example, 240p Test Suite takes 10 seconds to run a DTE encoder on the help pages, and per Amdahl's law, the remainder of the build process has to stall on this one step no matter how many cores I devote to everything else. My pipelines are slow in part because they are written in the dynamic language Python and run in the CPython interpreter. I know of two ways to speed up a program written in Python:

Run the tools in PyPy
PyPy is a JIT compiler that speeds up pure-Python code. I tried this, and it didn't speed things up much. It turned out that much of the time was spent in PyPy's compatibility layer for CPython's extension API, which is slow. Many of these tools use Pillow (Python Imaging Library) to crop an image into 8x8- or 16x16-pixel chunks and do operations on each chunk's data, but the round trip from PyPy into Pillow and back to produce each such chunk is slow. A feature request for PyPy support in Pillow was closed as soon as it built, not as soon as it was fast. A fast PyPy extension uses CFFI.
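To illustrate where the time goes, here is a pure-Python sketch of the chunking step (hypothetical code, not from any actual repository; the real tools call Pillow's Image.crop() here, which is exactly the PyPy-to-C-extension boundary described above):

```python
# Pure-Python sketch of the tile-chunking step. Pixels are one byte each,
# row-major, as in an indexed-color image; the real tools use Pillow's
# Image.crop() and per-tile pixel access instead of slicing a bytes object.

def iter_tiles(pixels, width, height, tile_w=8, tile_h=8):
    """Yield each tile's bytes, left to right, top to bottom."""
    for ty in range(0, height, tile_h):
        for tx in range(0, width, tile_w):
            tile = bytearray()
            for row in range(tile_h):
                start = (ty + row) * width + tx
                tile += pixels[start:start + tile_w]
            yield bytes(tile)

pixels = bytes(range(256)) * 2  # stand-in for a 32x16-pixel indexed image
tiles = list(iter_tiles(pixels, 32, 16))
print(len(tiles))  # 4 tiles across times 2 tiles down = 8 tiles
```

Under CPython each crop is one cheap C call; under PyPy each crossing into a CPython-API extension carries emulation overhead, so a loop like this comes to dominate the run time.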

Rewrite the extension in a less dynamic language
In this post, jroatch reported well over a hundredfold speedup by rewriting a data compression utility from Python to C. This is practical on Debian or Ubuntu, where I can just switch what packages I ask the user to apt install. But I'm not aware of how to make installing a C++ compiler and image reading and writing libraries for said compiler easy for my collaborators who use the Windows operating system. I've made a walkthrough for installing Git Bash, Make, Python, Pillow, and cc65 on Windows, but I'd need the same for a C or C++ compiler. I would also need to support my collaborators in case they miss a step in the walkthrough, the Windows GUI changes out from under them (which it has done), or a bug fix or enhancement to a tool breaks functionality on only Windows. And I imagine this will prove more difficult seeing as I have only occasional access to a Windows PC.

Which of these two options (or a third that I didn't mention but you will) is any good? And what else will I need to know to make it practical? For example, what image reading and writing library for Python is better than Pillow, and what image reading and writing library for C or C++ is any good?
Re: Porting my tools to C or C++
by on (#234611)
You're wasting your time trying to cater to Windows users with source. They expect exes, so build them an exe via mingw.
Re: Porting my tools to C or C++
by on (#234615)
It really depends on what your main goals are. The two important questions are:
- what would you enjoy doing
- what do your collaborators want and care about?

Are your collaborators complaining about tool speed or difficulty with python? Are you excited about the idea of switching to C or C++? If not, then it just might not be worth the effort of converting and maintaining a C or C++ version. Tool speed is often not the most important metric.

calima wrote:
You're wasting your time trying to cater to Windows users with source. They expect exes, so build them an exe via mingw.

100% agreed. The standard windows way to do it is to provide an exe. Linux users are comfortable getting C or C++ code and doing something with it. Windows users (even most developers) aren't.
Re: Porting my tools to C or C++
by on (#234619)
Mac/Linux machines ship with Python out of the box, and installing a C compiler is one command away. Windows ships with neither. To get Python _or_ a C compiler they are going to need to install something like Cygwin anyway so they can be a dozen clicks away from installing what they need.

Admittedly I'm not much of a Windows user, but I don't see Python having any advantage there.
Re: Porting my tools to C or C++
by on (#234623)
How about both... Pre-built EXE for windows users, and source code for people who need to maintain or modify the tools.
Re: Porting my tools to C or C++
by on (#234627)
slembcke wrote:
Mac/Linux machines ship with Python out of the box

Mac still ships with only Python 2.7 so they might need to install Python 3 anyway. :P

In response to OP though, honestly I don't think you'll get any meaningful benefit out of porting all your tools to C.

jroatch had a high performance need for a compressor, that's a good reason to rewrite something, but that situation probably doesn't apply to most of your python tools, as far as I've seen. (Even your DTE encoder problem could probably be sidestepped a million times more easily with a simple cached result.)
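The cached-result idea can be as small as a content-hash check wrapped around the slow step. A sketch, where the file names and the convert() callback are hypothetical:

```python
# Sketch of the "simple cached result" idea: re-run the slow encoder only
# when the input's content hash has changed, even in a fresh build folder
# if the stamp file is preserved.
import hashlib
import os

def build_if_changed(src, dst, stamp, convert):
    """Run convert(src, dst) only when src differs from the recorded hash."""
    with open(src, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if os.path.exists(dst) and os.path.exists(stamp):
        with open(stamp) as f:
            if f.read() == digest:
                return False  # cache hit: output is already up to date
    convert(src, dst)
    with open(stamp, "w") as f:
        f.write(digest)
    return True
```

Unlike make's timestamp check, a content hash survives a fresh checkout as long as the stamp and output are kept around.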


Installing Python 3 on Windows is pretty easy IMO, at least in 2019. A lot better than it used to be. Installing PIL, not quite so easy, but at least it's a well maintained library that PIP isn't going to barf on.


Personally I've had a few requests to offer EXE versions of python things I'd written and made freely available, and I'm usually more than happy to just say no.


Also there's the question of Python-to-EXE compilers: in my experience trying them, these have always been and still are terrible.
Re: Porting my tools to C or C++
by on (#234628)
One thing I care about is ability to build the whole project quickly in a fresh folder when I add a new file to the build, to ensure I didn't forget to add an essential source code file to the repository. Sometimes even parallel make doesn't help when one long-running task, such as compressing all of a game's dialogue, dominates the Gantt chart of a build process.

gauauu wrote:
Are your collaborators complaining about tool speed or difficulty with python?

Both: some complain about one, others about the other. In particular, koitsu (though not a collaborator on any of my current projects) falls into the latter camp, having complained of difficulty using a program developed against a different version of Python than the one installed. (Search for coiler by author koitsu for examples.) The switch from Python 2 to Python 3 was especially troublesome.

gauauu wrote:
The standard windows way to do it is to provide an exe.

But how would the executables be distributed to my collaborators? I thought it was standard practice not to check executables into a version control repository. Is it standard to distribute them separately? If so, how is it standard to ensure that the version of the executable matches the version of the source code, in case a change to the tool changes the format of the tool's output and a corresponding change to the source of the game engine expects the changed format?

slembcke wrote:
To get Python _or_ a C compiler they are going to need to install something like Cygwin anyway

The steps that I currently supply to my collaborators install Git Bash, ezwinports Make, and Python.org Python. To what extent is this "something like Cygwin"? I'm asking sincerely, as I don't know to what extent these incorporate Cygwin.

I guess I could cross-compile the Windows version on my GNU/Linux box. That would need four things:

  1. Instructions to build a MinGW-w64 cross-compiler under GNU/Linux or Darwin (or should I use the Windows native compiler in Wine? or is Ubuntu's mingw-w64 package OK?)
  2. Confidence that an executable that works as expected in Wine will work as expected in Windows as well
  3. Means to release the executables in sync with the source code
  4. Again, suggestions for libraries to load and save images
Re: Porting my tools to C or C++
by on (#234630)
tepples wrote:
One thing I care about is ability to build the whole project quickly in a fresh folder when I add a new file to the build, to ensure I didn't forget to add an essential source code file to the repository. Sometimes even parallel make doesn't help when one long-running task, such as compressing all of a game's dialogue, dominates the Gantt chart of a build process.


I have a personal jenkins server that rebuilds fresh copies of my whole project after I push. It may be overkill, but it has saved me from making that mistake with forgetting to add a file to the repo.

Quote:
gauauu wrote:
Are your collaborators complaining about tool speed or difficulty with python?

Both: some complain about one, others about the other. In particular, koitsu (though not a collaborator on any of my current projects) falls into the latter camp, having complained of difficulty using a program developed against a different version of Python than the one installed. (Search for coiler by author koitsu for examples.) The switch from Python 2 to Python 3 was especially troublesome.


If it were me, I'd limit my efforts to my current collaborators. Don't spend a ton of time on a situation that MIGHT happen.

Quote:
gauauu wrote:
The standard windows way to do it is to provide an exe.

But how would the executables be distributed to my collaborators? I thought it was standard practice not to check executables into a version control repository. Is it standard to distribute them separately? If so, how is it standard to ensure that the version of the executable matches the version of the source code, in case a change to the tool changes the format of the tool's output and a corresponding change to the source of the game engine expects the changed format?

As far as I can tell, the standard is to have a build machine that auto-compiles the windows version from the repo source, and makes that available. Or just build it and post on the web somewhere.

But again, it all depends on who your collaborators are. Which is why I suggest that it's important to know who you're making this for, and to know what you're really trying to accomplish.
Re: Porting my tools to C or C++
by on (#234636)
tepples wrote:
Sometimes even parallel make doesn't help when one long-running task, such as compressing all of a game's dialogue, dominates the Gantt chart of a build process.

I wanna echo the "who is this for" question. If you're having an actual performance problem with your build tool on a project that you're working on, you really don't need us to help you make a decision for that.

Instead, let's not be vague. You've published a lot of python script tools. Some are being actively used by people, some are not. There's no reason for you to undertake a project to convert them all to another language, where they'll do the exact same thing. It's a waste of your time that could be better spent making real improvements, or making new things, or otherwise just enjoying life.

Whatever language you choose, if enough people try to use it, there's going to be someone who complains that it's not their favourite language. That's just something you have to accept. If you'd written a tool in C maybe Koitsu would have had an easier time running it that one time, but some other time someone else would have complained that it wasn't in Python. :P (To be fair, Koitsu even tried to run it, and I'm sure the problem that came up for him could have been easily addressed within Python.)

If you wrote (past tense) a compressor tool in Python and it was slow, the question of whether you should rewrite it comes down to who needs it to be faster, how much, and why. All of that is critical to making a decision, there is no blanket answer anyone here could give to this question when posed vaguely and generically. Only you know the specifics, and all I can advise is that you really shouldn't spend 6 hours rewriting something for 5 people to save 20 seconds each over the next few years.


If you're concerned about Python as your default language going forward... IMO it's pretty fine on Windows these days. These are some things I sometimes try to do with my own releases if I think they'll help:
  • Make a point of saying Python 3 instead of just Python.
  • Start a script with a hashbang and assert for Python 3.
  • Avoid using any additional libraries if I can. If I can't, try to only use something well supported like PIL.
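The second suggestion amounts to a two-line guard at the top of each script, for example:

```python
#!/usr/bin/env python3
# Minimal version guard: fail early with a clear message under Python 2.
import sys

assert sys.version_info[0] >= 3, "This tool requires Python 3"
```

The hashbang picks the right interpreter on Unix-likes, and the assert gives a readable error instead of a confusing SyntaxError if someone runs it with Python 2 anyway.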

Though really I think the whole Python schism problem is kinda dying out at this point in time. It doesn't come up much at all for me anymore.


And tepples, your dedication to helping people do mundane things like learn how to use a command line, or install and set up a compiler, is pretty remarkable and commendable, but I don't think you should treat every single program you write as if it's someone's first programming experience ever. Every program has a context, and expected prerequisites. Anything that enough people try to use will find a user that doesn't understand the prerequisites, but you can only do so much to prevent that upstream.

As far as I'm concerned, since almost everything of yours is offered free and open source, you've got no obligation to make any of them easier to enter than they are. Most of the ones I've tried are OK to get into, and honestly most of them are of such a niche purpose that a change of language isn't really going to have much effect.
Re: Porting my tools to C or C++
by on (#234640)
Oh yeah, I also forgot, I heard about Numba as an "easy" solution to just make Python code faster.

http://numba.pydata.org/

Unfortunately my experience with it so far has been that it works fine on Linux, and doesn't work at all on Windows. Might be better today (it says it works on the website), or might not, but it might be an option for someone looking to improve Python performance without a lot of extra work.

Edit: Tried it again today, and it installed fine through PIP in Windows. Tried it on a fractal rendering I'd recently rewritten from Python to C++ for performance. Numba gave about 10x speedup. C++ was more like 100x. So... at least in this case Numba isn't as good as the rewrite, but it's still a pretty huge boost for very little extra work.
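For reference, the Numba usage pattern is just a decorator on the hot numeric loop; this sketch uses a hypothetical Mandelbrot inner loop as a stand-in, with a no-op fallback so it runs even where Numba is absent:

```python
# Shape of the Numba approach: decorate the hot numeric loop with @njit and
# leave the rest of the program alone. The fallback keeps the sketch
# runnable when Numba is not installed.
try:
    from numba import njit
except ImportError:
    def njit(func):  # no-op stand-in when Numba is unavailable
        return func

@njit
def mandelbrot_iters(cr, ci, limit):
    """Count iterations until z = z*z + c escapes the radius-2 circle."""
    zr = zi = 0.0
    for n in range(limit):
        zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
        if zr * zr + zi * zi > 4.0:
            return n
    return limit

print(mandelbrot_iters(0.0, 0.0, 100))  # interior point: never escapes, prints 100
print(mandelbrot_iters(2.0, 2.0, 100))  # escapes on the first iteration, prints 0
```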
Re: Porting my tools to C or C++
by on (#234654)
tepples wrote:
But how would the executables be distributed to my collaborators? I thought it was standard practice not to check executables into a version control repository. Is it standard to distribute them separately? If so, how is it standard to ensure that the version of the executable matches the version of the source code, in case a change to the tool changes the format of the tool's output and a corresponding change to the source of the game engine expects the changed format?

GitHub will contact you and complain quite loudly if you check humongous binaries (of any sort; think several hundred megabytes, or a large accumulation of smaller binaries) into your source repository directly (I speak from experience: a 120+ employee company with a high-end GitHub account did exactly that). That isn't how you do it.

How you do it is through GitHub's Releases tab/page. There are programs a thousand times larger than yours released this way; OBS Studio is the first that comes to my mind. GitHub talks about it. You can tie this in to a CI/CD (continuous integration / continuous deployment) service like Travis CI, which connects directly with GitHub to do this. That way, you aren't storing binary releases in your GitHub source repo directly. Travis CI is not the only one; for example, Sour uses AppVeyor. Heroku might be another possibility. Generally speaking, people tend to use CI when they integrate some form of unit testing performed on each commit (that's what those "Build Passing" status icons represent; you can see OBS uses both Travis CI and AppVeyor). Or you can do it yourself using shell scripts and Python and Wine or whatever.

You also, obviously, do not have to use GitHub if you prefer a different VCS (GitLab, BitBucket, etc.).

I mirror calima's sentiments about Windows users, of which I am one: give us a standalone binary if possible; if not, give us binary + DLLs + whatever in a zip file that can be extracted into a single directory and run directly from there (i.e. "standalone"). Nobody wants to deal with installing PLs and other crap just to run whatever this program is, I assure you.

Edit: words.
Re: Porting my tools to C or C++
by on (#234659)
After having to stop working on my project for a long period of time, what I learned is that it's not worth rewriting something unless the gain is substantial. Most of the time, you just end up playing in your sandbox with your toys without accomplishing anything new :lol:
Re: Porting my tools to C or C++
by on (#234664)
calima wrote:
You're wasting your time trying to cater to Windows users with source. They expect exes, so build them an exe via mingw.

koitsu wrote:
I mirror calima's sentiments about Windows users, of which I am one: give us a standalone binary if possible; if not, give us binary + DLLs + whatever in a zip file that can be extracted into a single directory and run directly from there (i.e. "standalone"). Nobody wants to deal with installing PLs and other crap just to run whatever this program is, I assure you.

I'd agree with this for any software that's intended for widespread use. A portable EXE with all of its dependencies already included is usually the ideal way to receive a program.

Part of the reason this is practical is just how good Windows has been about backward compatibility. Binaries have a much longer shelf life there than in Linux or Mac. (Mac users will need binaries too, but they're even more of a hassle to prepare.)

It's also why I've advocated that any commercial modern NES release needs a stand-alone executable, and not just a ROM. Assuming someone has or can install an emulator is a pretty big filter.


However... and I can't stress this enough, the "who is this for" question is relevant here. Tepples has made a lot of tools with a lot of different purposes, and they don't deserve a one-size-fits-all rule. Most of them are not for general widespread use. A lot of them are very specific niche tools for NES developers.

If this is a tool for software development, or for a software developer, there's usually not a huge barrier to installing a widely used programming language, and in a lot of cases they'll already have it ready to use. It's easier for the individual not to have to install anything, but sometimes it's a good trade for the developer to require the user to have something that makes developing the software easier. This is especially true when you're providing it for free. Providing nicely packaged releases is extra work which should be undertaken for a clear purpose.

Like for example you shouldn't need to package an NES emulator, or instructions for how to find and set up your first NES emulator on 6 different platforms with releases for the NESDev compo. There is context to everything, and there's always things that are reasonable to expect as a prerequisite.

...and maybe also just consider that sometimes it's just a good filter to not try to pick up every possible user. Some software, even free software, should be a bit exclusive, and should require some prerequisite learning or understanding before it's accessible to a person. If someone who isn't ready yet is keen enough to try, they might even learn something more broadly useful in the process of preparing to use your tool.


For example, in my own work, something like NSFPlay absolutely gets a binary release, because it's made for people who just want to listen to NES music. That's who it's for and why I maintain it. On the other hand, my EZNSF tool for building an .NES ROM album out of an .NSF music file has a Python prerequisite, and I'll never change that. Its intended user is someone who's willing to work hard enough to produce an NES ROM, and if they're not willing to meet that minimum bar I'm not interested in working hard to lower it for them. It was much easier to write in Python, and bending over backwards to provide binaries, or porting to C++ or whatever, would have been a detriment to the project: wasted time that would have been better spent improving its actual function (or doing just about anything else). It's also much easier to customize and modify the project because it doesn't have a binary form and the source is directly accessible. I think there are many good reasons to go either way for different projects.
Re: Porting my tools to C or C++
by on (#234668)
For source on Windows, make it compile with VS. Installing a bunch of linux stuff to compile a 2 file project is more hassle than it is worth. If one can just drop the files into a command line project and hit build, 10000% better.

Maybe go for C#: it has nice parallel support, and its async/await system might be handy for you as well. It has a bunch of built-in image tools as part of the .NET Framework, and there is Mono for Mac and Linux people. It's also really nice to code in, compiles quickly, and runs fast.
Re: Porting my tools to C or C++
by on (#234687)
Let me explain some of the "who is this for":

NES homebrew community in general

I have uploaded several of my retro console projects as public repositories on GitHub. Building these requires GNU Coreutils, GNU Make, Python, Pillow, and ca65. Python and Pillow are used for image conversion and data compression. On Windows, I have recommended obtaining Coreutils via Git Bash and Make via ezwinports, through an installation procedure that I fine-tuned with the help of Greg Caldwell at Retrotainment Games.

Just because you have installed a toolchain to cross-assemble NES binaries doesn't mean you have installed one to native-assemble PC binaries. I had hoped to avoid requiring Visual Studio, as its multi-gigabyte download size could cause someone behind a harshly capped Internet connection, such as satellite or cellular, to run up a substantial data transfer overage bill, angering the head of household who has to pay the ISP bill. In several forums or chat communities to which I have belonged for more than a year, I've seen at least one member mention being behind dial-up or satellite because neither cable nor fiber nor DSL serves his or her address: R*** N**** (dial-up), T**H****S******* (satellite), and our own Rahsennor (satellite). I'll admit that since Python 3.4 bundled pip, the "coiler" compatibility issues that koitsu mentioned in the past have become much less apparent.

Retrotainment Games

I am lead programmer for several projects by Retrotainment Games, a video game developer owned by the used game store Cash-In Culture based out of Pennsylvania. Each game has a repository containing assembly language source code and the Python source code of the data conversion tools. Many of these tools are released in my public repositories; others have been cleared for release but not yet released; still others are private because they are specific to a game.

I feel comfortable explaining the following work process publicly because it could be reasonably inferred from the data structures of the Haunted games and the NES projects in my public repositories.

When an artist wants to add a level to a game, he pulls the repository, adds a PNG file of the background, adds a text file containing collision rectangles, enemy positions, and exit positions, and edits the master level list to add the map's background PNG filename, background palette, and collision map filename. Then he tests the level by building the game on his computer. The game's makefile runs a conversion program written in Python on the PNG file, or on all PNG files if the repository was freshly cloned or if the compressed background format has changed. Converting a background a dozen screens long can take several seconds because it has to find the optimal subpalette for each 16x16-pixel piece of the background. The main Mall map in HH85, for instance, is roughly 24 screens long. Conversion will fail when the background violates some constraint of the engine, such as having more than 256 unique tiles whose visibility ranges cross a given 264x240-pixel window. If this happens, the software produces an error report, and the artist must edit the background to fix the violations and re-run the build process on any backgrounds in violation.
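As a rough illustration of the subpalette step (not the actual converter, which searches for an optimal assignment; this greedy pass only shows the shape of the constraint):

```python
# Greatly simplified sketch of the per-block subpalette problem. NES
# backgrounds get 4 subpalettes of 4 colors each, counting the shared
# backdrop color, and each 16x16-pixel block must fit one subpalette.

MAX_COLORS = 4
MAX_PALETTES = 4

def assign_subpalettes(block_colorsets):
    """Return (subpalette index per block, final subpalettes) or raise."""
    palettes = []    # each entry is a set of colors
    assignment = []
    for colors in block_colorsets:
        if len(colors) > MAX_COLORS:
            raise ValueError("a block uses more than %d colors" % MAX_COLORS)
        for i, pal in enumerate(palettes):
            if len(pal | colors) <= MAX_COLORS:  # block fits this subpalette
                pal |= colors
                assignment.append(i)
                break
        else:
            if len(palettes) >= MAX_PALETTES:
                raise ValueError("more than %d subpalettes needed" % MAX_PALETTES)
            palettes.append(set(colors))
            assignment.append(len(palettes) - 1)
    return assignment, palettes

blocks = [{0, 1, 2}, {0, 1, 3}, {0, 4, 5, 6}, {0, 1, 2, 3}]
print(assign_subpalettes(blocks))  # two subpalettes suffice for these blocks
```

Finding the truly optimal grouping is where the several seconds go: a greedy pass like this can fail on inputs an exhaustive or smarter search would fit.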

When an enemy designer wants to change speed, flight pattern, or other build-time constants or data tables for an enemy that I have coded, he pulls the repository, edits whatever constants are needed, and builds the game to test it. Again, if he hasn't built the game in a while, the background images must be reconverted.

When a writer wants to change a dialogue cue, he edits the text and places a dialogue trigger in the level. A text compression program written in Python compresses the entire script, and currently the compression takes several seconds. I have yet to thoroughly investigate whether this is a genuine algorithmic issue or just dynamic language overhead.
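For readers unfamiliar with DTE: it is essentially byte-pair encoding, repeatedly replacing the most common adjacent pair of bytes with a spare code. A minimal sketch (the 0x80-0xFF code range, and input text staying below 0x80, are assumptions for illustration, not the actual engine's format):

```python
# Minimal DTE (byte-pair) compressor sketch: repeatedly replace the most
# frequent adjacent byte pair with a spare code byte, then record the
# replacement table so the game engine can expand codes at run time.
from collections import Counter

def dte_compress(data, first_code=0x80, last_code=0xFF):
    replacements = []                     # (code, pair) in creation order
    data = bytearray(data)
    for code in range(first_code, last_code + 1):
        pairs = Counter(zip(data, data[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break                         # nothing left worth replacing
        out = bytearray()
        i = 0
        while i < len(data):
            if i + 1 < len(data) and (data[i], data[i + 1]) == pair:
                out.append(code)
                i += 2
            else:
                out.append(data[i])
                i += 1
        data = out
        replacements.append((code, pair))
    return bytes(data), replacements

def dte_expand(data, replacements):
    """Undo dte_compress by expanding codes back into their pairs."""
    table = dict(replacements)
    out = bytearray()
    stack = list(reversed(data))
    while stack:
        b = stack.pop()
        if b in table:
            a, c = table[b]
            stack.append(c)  # push second half first so it pops second
            stack.append(a)
        else:
            out.append(b)
    return bytes(out)

text = b"the cat and the hat sat on the mat"
packed, table = dte_compress(text)
print(len(text), len(packed))  # packed is shorter than the original
```

The pair-counting pass over the whole script on every round is quadratic-ish, which is one reason a naive encoder in an interpreted language takes seconds on a full game script.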

If I were to rewrite the most time-consuming among these tools in a compiled language, I'd probably need to make two kinds of artifact: a release of the game itself and a release of just the compiled data conversion tools. Releases of the latter would generally come between releases of the former. Do binary release platforms generally support this? Or do they require all artifacts to be built from the same tag? Would it be better to make two separate repositories for each game: one for the game's source code and assets and one for its asset conversion tools?

In fact, I'm already facing this problem with ft2pently, a tool by NovaSquirrel to convert a FamiTracker text export to a Pently score. ft2pently is written in C and must be compiled before use. A composer who prefers to work in FamiTracker and prefers to keep a FamiTracker module would need a compiled copy of ft2pently around. I suspect that inability to instantly hear the Pently conversion of a module is why at least one Retrotainment project stuck with what is now called FamiTone4.

Eventually, the public

I may want to rewrite the Action 53 builder to allow a more drag-and-drop procedure to build a collection. This will require writing a GUI app for PC. The same might be true if I decide to make a game with a level editor.

At times, I've tried to develop PC games, distributing Windows binaries built with MinGW and source code. I've occasionally received complaints that they do not build on GNU/Linux or macOS, or if they do build, they don't run or lack sound or joystick support or whatever. This was before I switched from Windows to GNU/Linux as a daily driver, and I still don't own a sufficiently recent Mac on which to test. If I ever get back into PC game development, I'll need a way to reach PC platforms other than my own.
Re: Porting my tools to C or C++
by on (#234691)
tepples wrote:
NES homebrew community in general

While I have sympathy for people with poor internet connections, that isn't a problem that you are REQUIRED to solve for them. But if you WANT to consider them, then providing a compiled exe is usually the smallest thing. If it's worth it to you to invest your time converting from python to C in order to provide those people a tiny binary to download, then you've answered your question: yes it's worth it. Release a compiled binary version of your tools.

Quote:
Retrotainment Games

I'd ask them what they personally would want. Explain the trade-offs, and how much development time it would take you, and just ask them what they prefer. The 20 seconds of rebuilding might be annoying to them, but maybe it's worth it to let you focus your development time on the game itself. Now that they already have python set up, do they hate dealing with it and want you to change it? Or do they just want you to buckle down and get working on Full Quiet? The only way to know is to ask them.

Quote:
Do binary release platforms generally support this? Or do they require all artifacts to be built from the same tag?

There are quite a few different binary release platforms, but in general, they're pretty flexible.

Quote:
Eventually, the public


"The Public" as a general category is really hard to plan for. Because that can be anyone from my mother-in-law on a chromebook, to Tokumaru. Supporting multiple platforms well is a LOT OF WORK. You have to decide what subset of the public it's worth supporting. There's no silver bullet.
Re: Porting my tools to C or C++
by on (#234692)
Quote:
Would it be better to make two separate repositories for each game: one for the game's source code and assets and one for its asset conversion tools?


If you're allowed to make the tools open source, then there's a lot more flexibility with pricing on build servers. Appveyor, for example, is free for open source projects.

So if that's the case, having an open-source tools repo and a closed source game repo might make sense.
Re: Porting my tools to C or C++
by on (#234705)
Eh. I get pretty annoyed when I need to install yet another copy of visual studio on my Windows machine for some project. That crap is huge. The last contract I worked on targeted a couple of consoles and I cleaned up almost a hundred GB of SSD space deleting those tools. Ridiculous. I get why people don't like installing "yet another tool".

I tend to stick to Python (2) and plain C/Make depending on what I'm doing. Installing both is dozens of megabytes of disk space. A fraction of one of those ridiculous Electron apps. I don't feel too bad about using smaller tools like that as a requirement.

Why Python 2? It comes installed on Macs and Linuxes. Does anything ship with Python 3? I don't even know (or, quite honestly, care) what the difference with 3 is. To me, it seems like a lot of pontification about a language that's already a little haphazard.
Re: Porting my tools to C or C++
by on (#234709)
Okay, so this isn't just about Python, then.

It seems like you have your standard setup that requires:
  • GNU Coreutils
  • GNU Make
  • Python
  • Pillow
  • ca65

And you've gone far and above most open source projects with a cross-platform environment setup guide:
https://github.com/pinobatch/nrom-template/blob/master/README.md

So, if you want commentary on this... I guess personally I would have little problem setting this up, especially with these directions, but honestly it's a few steps beyond what I'd want to undertake casually. If I had a need to build one of your things, I'd definitely be willing, but admittedly you've put up an unpleasant barrier here. As it stands, I've often looked at the source code for things you've published, but the only time I've actually set up to build something of yours was when I was getting paid to do it for Haunted Halloween 85.

First and foremost this is obviously Linux-centric. Except for CC65, everything here is easy to get or very standard tools on Linux. Not so on Windows.

One thing I would immediately suggest is not to ask any users to manually modify their PATH, ever. This should simply not be done, except by an installer for a big well maintained stable program. Having to add CC65 to the PATH is a non-starter for me, I refuse to make it a global part of the system. I personally know how to create a temporary PATH in a command environment, but it'd be an annoying work-around. Python on the path is fine. MinGW / MSYS is maybe OK (though they recommend against it, see below). CC65 no.

GNU Coreutils / GNU Make are not very standard on Windows. You probably have them if you're a Windows programmer who needs to build a lot of Linux-centric open source software that doesn't try to make concessions for Windows, but otherwise these just aren't standard toolkit for Windows programming. I don't know what you require from Coreutils but it might be worth trying to drop it as a dependency. Make is less obscure for Windows than Coreutils, but most Windows-centric development environments will do without it.

Your install process directions for these two break down entirely. You give a few options:
  • You mention MSYS but don't give a link to download. To be honest, I'm not entirely sure what MSYS is, I just think of it as a component of MinGW. The primary website for it seems to be this one which is an absolute nightmare to read.
  • You mention Git for Windows, and then some hideous process to add Make. Also not a good solution for this problem. (Maybe you'd be surprised but I think most Windows users of Git are doing it through a GUI like Sourcetree that keeps a local copy of git rather than a globally installed command line tool.)
  • You mention devkitPro, which I think is an awful place to get these tools. devkitPro's wiki doesn't even have a clear description of what it is or what it's for. It seems to be about homebrew development for Nintendo consoles GBA and later? This is not even remotely something I'd want to trust to do global installation stuff on my system. Maybe if I was already making homebrew for these other platforms and was familiar with it, but this is too obscure looking for me to be confident its installer won't mess up my system.

I think what you should really recommend to install for this is MinGW. It's well known, well maintained, and has a relatively nice installer. Only small problem is it doesn't install to your PATH (oops). That's actually good though, because I wouldn't want it to install to my path. I do use MinGW from time to time, but I much prefer keeping it in its own isolated environment. The MinGW Getting Started guide also recommends this:
Quote:
The MinGW team do not recommend modifying the system wide Windows PATH variable. We prefer that you use a script to set PATH for the processes on a per session basis.


Alternatively, if you can ditch coreutils the problem becomes much simpler. Just get Make for Windows. All you need from this is a single EXE. You can drop it locally in the folder. Done. (I think for HH85 I used this + some small modifications to the makefile to get around whatever is in Coreutils.)

Okay, so that was the worst of it.

Python, I think, has a mostly easy install process on Windows these days; not going to comment further on that.

Installing Pillow is OK. You need to open a command line, but at least it's a command that should reliably work for most people, provided they know what a command line is. Unlike a lot of other Python packages, Pillow has always installed correctly via pip for me, and I think it's well maintained for that.

Finally CC65. I've already said this above, but don't require this to be a global install, i.e. do not ask users to set their PATH. Use a local version of it.

Personally I have a lot of CC65 projects, and I use a local copy of CC65 in every one of them. CC65 is not rigorously maintained to be continuously backward compatible, and there have been several incidents of this. If I'm working on something, I won't allow it to be disrupted by an unnecessary update to the tool.

So...

All of that said, if you have a user install MinGW, there are two approaches that I think might be easy for setting up that missing part of the environment:

1. Run C:\MinGW\msys\1.0\msys.bat to open the MSYS shell, which will have access to Coreutils and Make, and they can use this pseudo-Linux environment to build your thing. (I guess put it in your MSYS home folder.)

2. Create a batch file that sets its own temporary PATH before running Make locally.
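A batch file like the one in option 2 might look something like this. The tool locations below are illustrative guesses, not paths from any actual project; the point is that `set PATH` inside a batch file only affects that one cmd process, so nothing global is touched:

```bat
@echo off
rem Hypothetical wrapper: extend PATH for this process only, then build.
rem Substitute wherever MinGW/MSYS and cc65 were actually unzipped.
set PATH=C:\MinGW\msys\1.0\bin;C:\cc65\bin;%PATH%
make %*
```

The `%*` passes any arguments through, so `build.bat clean` still works as expected.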



So... this turned out pretty long, eh... sorry there was so much to say, but maybe it will help understand what it looks like for at least this particular Windows user to consider building one of your projects.
Re: Porting my tools to C or C++
by on (#234711)
slembcke wrote:
Does anything ship with Python 3? I don't even know (or, quite honestly, care) what the difference with 3 is. I get the impression it's a lot of language pontification about a language that's already a little haphazard.

Many Linux distros ship with both 2 and 3. The most important improvement in 3 for me has been much better support for Unicode across the whole language and toolset, but beyond that there have been a ton of small and worthwhile additions to the Python standard library. Python 2 has essentially been on life support since 3.0 came out in 2008, which has its own appealing stability, I suppose.
Re: Porting my tools to C or C++
by on (#234713)
rainwarrior wrote:
Maybe you'd be surprised but I think most Windows users of Git are doing it through a GUI like Sourcetree that keeps a local copy of git rather than a globally installed command line tool.
I used to use SourceTree when I started off using Git, but nowadays the little that Visual Studio doesn't support on its own, I just do in a command prompt (I could very well be in the minority here, though).

Generally speaking, though, I agree. I would usually never bother installing that much software just to compile something, unless I desperately needed it. As someone who develops using VS, C# is my go-to language for any small utility I want to make, so I don't even have Python installed.

Also, regarding adding programs to the PATH. Having a PATH variable that is too long (in terms of the number of characters in the string) can break a lot of stuff - though I don't quite remember the specifics (it may have been fixed in Win8 or 10). In Win7, it's definitely been a problem for me in the past at work (due to having multiple versions of SQL Server installed, each of which adds quite a lot of stuff in the PATH var). So yea, it's usually best to avoid adding stuff to the PATH variable unless it's really necessary.
Re: Porting my tools to C or C++
by on (#234714)
Is there a particular reason to avoid Cygwin? I've had a few people respond with utter revulsion to the idea of it, but it makes it pretty trivial to install most Unix tools painlessly in a couple of minutes, including GCC, Make, Python, etc. It even puts everything in a single directory so you can clean it up easily.
Re: Porting my tools to C or C++
by on (#234716)
All those things considered, if I was part of a team (i.e. Retrotainment) that already had a set of tools installed and working, I'd want to discuss it as a team before making changes to that workflow. Maybe it's worth changing, but maybe they'd just want to leave their build flow alone, not touch it, and keep working.
Re: Porting my tools to C or C++
by on (#234717)
slembcke wrote:
Is there a particular reason to avoid Cygwin? I've had a few people respond with utter revulsion to the idea of it, but it makes it pretty trivial to install most Unix tools painlessly in a couple of minutes, including GCC, Make, Python, etc. It even puts everything in a single directory so you can clean it up easily.

Cygwin slipped my mind as another alternative. To be honest, I haven't used it in over a decade, so I don't have any current experience to offer as a comparison.

Is there anything you can think of about it that's better or worse than using MinGW/MSYS for this purpose? MinGW is something I do use, and think it's usually pretty good, has a relatively nice installer, and keeps itself nicely contained. (Related: does anyone know what MSYS2 is for?)

Though as I suggested above, probably a more ideal solution would be to ditch this whole Linux toolchain dependency, and just provide make.exe and a batch file to set a temporary PATH and run Make.


...and just a reminder, all of this is just suggestions on what to do if you want to make your Windows user experience better. You don't owe free maintenance to anybody, and while the current setup is unpleasant it's not terribly unreasonable. I've had much worse times building much higher profile open source projects than these. :P The only red flag here for me is that you're telling users to set their PATH, which I think is bad.
Re: Porting my tools to C or C++
by on (#234735)
I have rarely needed to install two versions of VS at once (the 2008 version was a bit special, though). On my current 2017 install I've built VS6 projects for some old tools. I have to fix the modern security warnings, but that's not a problem. (MS actually has a help page dedicated to this: a handy-dandy "so you want to build something from Win95; well, you're going to get this bug, fix it by changing this, toggle this option, and then add this line" guide.) This is why VS is such a massive, gigantic blob: it supports, and has kept supporting, these things.

Also, I would point out that I'm talking about small one- or two-file command line tools, and tepples making them Windows-safe, not actually providing an MSVC sln file. As a practical example of what I'm talking about: https://sourceforge.net/p/tass64/code/HEAD/tree/trunk/ is probably a limit case. Soci is a pure no-Windows guy; when he tests Windows, he tests in ReactOS. However, this code is very pure, and with a couple of minor fixes I was able to get it compiling on MSVC. Soci was then willing to make said changes to his code base, and now I can just grab his code, make a "solution with existing files", remove the DOS and Amiga custom stuff, and be done.

Another, smaller example is https://github.com/bitshifters/exomizer ... 2/rawdecrs (not the original author, as the original code is only distributed in a zip): simple, small C files that do a thing; again, throw them into a sln with the source and done.

Cygwin is a name that gives me nightmares, it did some horrible stuff to my machines back in the day, might be fine now.

I think another path to this would be:
  • install VirtualBox
  • download lubuntu.iso
  • make a Lubuntu VM
  • follow the instructions as per normal

(Although if anybody has a good guide on how to set up cross-building on Linux, that would be great, thanks. Everything I've read gives one tiny step and assumes you understand the other 50 archaic Linux commands needed to patch it all up.)

I would also vote that CC65/CA65 is somewhat Linux-centric and seems to be a bit of a pain to use on Windows. Possible, but I get the impression it really wants a Linux host.

I use GitHub Desktop for my Git, which has its own isolated git, and when you go to its menu and say "open git command line prompt" it will bitch that it can't find git :D
Re: Porting my tools to C or C++
by on (#234757)
We're drifting a little off-topic from the original post, but here's one thing I've done to address the windows/linux build stuff:

My development machine, and my entire build ecosystem and tools, are Linux-based. When I want to give a Windows collaborator the ability to compile things, instead of making a Windows version of the build, I use Vagrant. If you aren't familiar with Vagrant, it's a tool to manage a local virtual machine specifically for development. I have a Vagrant script that knows how to set up and create a Linux VM that can do all the builds. The Windows user needs to install Vagrant (and a VM program like VirtualBox). Then they just run a "build.bat" script that I gave them, which ensures that the VM is running and asks the VM to run the build process.
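For readers unfamiliar with Vagrant, a setup along these lines can be sketched in a Vagrantfile like the following. The box name and package list are illustrative guesses, not the poster's actual configuration:

```ruby
# Vagrantfile -- defines a throwaway Linux build VM.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/bionic64"   # any stock Linux box would do
  # Provisioning runs once, when the VM is first created,
  # and installs the whole build toolchain inside the VM.
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y build-essential make python3-pil cc65
  SHELL
end
```

The accompanying build.bat would then boil down to `vagrant up` followed by something like `vagrant ssh -c "cd /vagrant && make"`, since Vagrant shares the project directory into the VM at /vagrant by default.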

It's incredibly slow the very first time (it has to download the vm image and set up the vm, which can take 5 minutes), but reasonably fast after that, and requires very little fiddling about on the windows side.

There are obvious disadvantages (a vm is slower and heavier than running tools in the host OS, and requires a huge image download), but it's nice being able to provide a windows user with the ability to build without having to set up a fiddly development environment. Installing Vagrant is generally easier than fiddling with multiple languages and tools and paths.
Re: Porting my tools to C or C++
by on (#236076)
As I mentioned in an aside to this post, I'm ready to discuss deprecating recommending that users modify the system-wide PATH.

As I understand it, "ditch Coreutils" means that if the script detects Windows (through the presence of the COMSPEC environment variable), it should not try calling cat or rm. Instead, it should call cmd /c copy /b (with + signs between each source file path argument and the next) or cmd /c del. Were the "small modifications to the makefile" of this nature?
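The platform check described above might be sketched like this in GNU Make. This is an illustration of one reading, not a tested recipe, and note that copy /b joins its sources with + signs, so a multi-input recipe still can't be written identically for both branches:

```make
# Choose native file commands: COMSPEC is set by Windows' cmd.exe,
# so its presence suggests a Windows host.
ifdef COMSPEC
  RM  := cmd /c del
  # copy /b wants sources joined by +, e.g.: copy /b a.bin+b.bin out.bin
  CAT := cmd /c copy /b
else
  RM  := rm -f
  CAT := cat
endif

clean:
	$(RM) game.nes
```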

Would it be a good idea to include such a "batch file that sets its own temporary PATH before running Make locally" with each of my projects? The user would still have to edit each file depending on where in the file system he or she unzipped cc65 and Make.

What's the best way to work around the likelihood that the user will have spaces in pathnames? Examples include C:\Program Files (x86) and C:\Users\John Doe. Is there a prerequisite-driven build system preferable to GNU Make on Windows, such as one with better support for spaces in pathnames and for the plus signs required between source arguments to copy /b?

I'm also not certain of the most appropriate way to implement "use a local copy of CC65". I don't expect that you meant checking a vendored copy of the cc65 source code into the repository. Did you mean some procedure to copy ca65 and ld65 (and, if necessary, cc65 and the C runtime library) into a subdirectory that's listed in .gitignore?
Re: Porting my tools to C or C++
by on (#236080)
tepples wrote:
As I understand it, "ditch Coreutils" means that if the script detects Windows (through the presence of the COMSPEC environment variable), it should not try calling cat or rm. Instead, it should call cmd /c copy /b (with + signs between each source file path argument and the next) or cmd /c del instead. Were the "small modifications to the makefile" of this nature?

I think the implied question in "ditch Coreutils" is why does your makefile need to use cat or rm at all? I don't know what you expect a makefile to do, but probably it should be doing as little as possible. Can't you just build binaries to the location they're needed? Why is there some extra layer of OS level file manipulation on top of it? Maybe try not to do that? (Or if you really must... maybe one of your other, harder dependencies can handle it. Python maybe?)
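If the concatenation really is needed, one way to push it onto the Python dependency the projects already have is a tiny helper script. The file name `concat.py` and its usage are hypothetical, a sketch of the idea rather than anything from the thread:

```python
#!/usr/bin/env python3
"""Cross-platform stand-in for `cat a.bin b.bin > out.bin` in a makefile rule.

Usage: python3 concat.py OUT IN1 [IN2 ...]
"""
import sys

def concat(out_path, in_paths):
    # Copy every input file into out_path, byte for byte, in order.
    with open(out_path, "wb") as out:
        for path in in_paths:
            with open(path, "rb") as src:
                out.write(src.read())

if __name__ == "__main__" and len(sys.argv) > 2:
    concat(sys.argv[1], sys.argv[2:])
```

A makefile rule can then invoke `python3 concat.py game.nes header.bin prg.bin chr.bin` identically on every platform, with no Coreutils involved.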

tepples wrote:
Would it be a good idea to include such a "batch file that sets its own temporary PATH before running Make locally" with each of my projects? The user would still have to edit each file depending on where in the file system he or she unzipped cc65 and Make.

That would be an improvement I guess. If CC65 was local to the project they shouldn't have to edit any paths?

tepples wrote:
What's the best way to work around the likelihood that the user will have spaces in pathnames? Examples include C:\Program Files (x86) or C:\Users\John Doe. Is there a prerequisite-driven build system that's preferable to GNU Make on the Windows platform, such as with better support for spaces in pathnames and the requirement of a plus sign between source arguments to copy /b?

That's not really a problem unless you're using absolute paths for things? If your project has no spaces in its directories, it won't come up in a relative path. Where is this actually causing a problem in whatever you're thinking about doing?

tepples wrote:
I'm also not certain of the most appropriate way to implement "use a local copy of CC65". I don't expect that you meant check a vendored copy of the cc65 source code into the repository. Or some procedure to copy ca65 and ld65, and if necessary cc65 and the C runtime library, into a subdirectory that's specified in .gitignore?

In all of my projects I just tell people to unzip CC65 into a specific folder. I don't know what could possibly complicate that, and I'm not willing to go down whatever rabbit hole you're drawing in with the word "vendoring".
Re: Porting my tools to C or C++
by on (#236081)
rainwarrior wrote:
why does your makefile need to use cat or rm at all?

cat or copy /b: At one point, I used to build the iNES header, PRG ROM, and CHR ROM to separate files rather than adding CHR ROM as a MEMORY area and SEGMENT in the linker configuration. I'm definitely not using copy /b to combine the header and PRG ROM anymore, and I don't think I'm still doing it for iNES+PRG ROM and CHR ROM either, but I'd have to check all my repos to make sure.
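For context, the linker-config approach that removes the need for concatenation looks roughly like the stock cc65 NES config. This is a sketch for a plain NROM-style layout; addresses, sizes, and segment names are illustrative and vary by mapper:

```
MEMORY {
    # Header, PRG, and CHR all write to the same output file (%O),
    # so ld65 emits a complete .nes image with no post-link cat step.
    HEADER: start = $0000, size = $0010, type = ro, file = %O, fill = yes;
    PRG:    start = $8000, size = $8000, type = ro, file = %O, fill = yes;
    CHR:    start = $0000, size = $2000, type = ro, file = %O, fill = yes;
}
SEGMENTS {
    HEADER: load = HEADER, type = ro;
    CODE:   load = PRG,    type = ro;
    CHARS:  load = CHR,    type = ro;
}
```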

rm or del: I currently use rm for make clean.

rainwarrior wrote:
[A batch file that sets PATH] would be an improvement I guess. If CC65 was local to the project they shouldn't have to edit any paths?

Someone building one of my projects would have to edit the path to GCC in order to build cc65 from source code, making MinGW a dependency on Windows. Or he or she would have to download and unzip Make and cc65 built for Windows x86 or x64 into a particular subdirectory (listed in .gitignore) and then repeat the process for each other project of mine that he or she wants to build.

rainwarrior wrote:
That's not really a problem unless you're using absolute paths for things?

The paths to Make and cc65 on the user's system would be absolute unless I instruct the user to copy Make and cc65 into a subdirectory of each project.

To others reading this: What are the drawbacks of instructing the user to copy Make and cc65 into a subdirectory of each project?
Re: Porting my tools to C or C++
by on (#236082)
tepples wrote:
To others reading this: What are the drawbacks of instructing the user to copy Make and cc65 into a subdirectory of each project?

That I can think of, very few. The main two are disk space and "technical debt". Disk space isn't a big concern here because the programs are pretty small, while the latter usually encompasses many caveats; but in this specific case (of Make + cc65) the only debt I can think of would be the administrative maintenance of updating the binaries if/when new releases of the software come out.

I think those "drawbacks" are pretty simple/easy (thus acceptable and reasonable) compared to, say, requiring a person to install Visual Studio.

The "keep everything in a local directory" approach is honestly better than screwing with %PATH% for several reasons, the biggest being that introducing a new "commonplace name" binary/program into %PATH% trickles down into literally everything, and you can end up with painful binary conflicts. make.exe is a great example: say you have D:\bin\make.exe (the gnuwin32 version), but you also run Cygwin or MinGW, which have their own make. In Cygwin or an MSYS2 shell you type make -- and you end up getting the wrong one. This isn't just limited to make, either. Yes, this has to do with %PATH% and $PATH argument order and the like, but it's a serious problem if/when you encounter it (I have, many times) with all sorts of utilities. It's why I don't use Cygwin or MSYS2 unless I have to (and if I do, I always pick MSYS2). Keeping everything in the project's directory "mostly" relieves this.

P.S. -- All of those utilities on the gnuwin32 SourceForge page are horribly old and often fragile compared to the MSYS2 or Cygwin alternatives (both of which are actively maintained), all the way down to not having good UTF-8 support (including in filenames; though Windows 7's native CLI utilities often botch these too). I do have several gnuwin32 binaries that I keep lying around in my D:\bin directory, and I fight with/scream at them every time I have to deal with them. wget is the most notorious. The way they break is often weird, too. I'm not saying don't use them; I'm just saying that if/when you encounter oddities with them, a lot of the time (in my own anecdotal experience) they tend to be due to their age/how they were built/whatever, with no real way to solve the problem other than ditching them. I tend to just fall back on my FreeBSD or Linux boxes to do things if they don't work for me. It's a bummer, too, because I really love how "standalone" the gnuwin32 programs tend to be. :/

I really do urge folks to give MSYS2 a try if they've found Cygwin to be bloated, slow, or just generally don't like it -- it even gives you an actual terminal (mintty) with good terminal support alongside bash. Their wiki page goes over a lot of stuff. MinGW fits into it quite well.