I'm enrolling in high school next year, and none of the school's tech programs teach anything in the C family; it's always Python, Java/JavaScript, and HTML. The same is true at the school I currently attend. The tech teachers seem to treat C as if it's some sort of crazy sophisticated language because it's "low level," which is far from the truth. C is a far more useful language to learn than any other because it can be used on practically any device: embedded systems, mobile devices, smart watches, game consoles, and even PC applications. So why would they rather teach a very niche language that works on niche devices than teach a language that would actually be useful?
The school seems to be business oriented. There are fewer and fewer job opportunities for people who specialize in C and C++, and a growing number if you learn something else. On one hand, C and C++ coders can expect a higher salary for their knowledge and expertise in the future, but the number of jobs is slowly diminishing, at least compared to other fields.
You need to write a computation-heavy simulation or science thingy? Use the C family. You want to get some job done quickly at a tight price point, without having to worry about memory management and whatnot? Use something high-level.
You can't "architecture astronaut" in C, I'm afraid, so that's why.
In all honesty, people learn more about OOP and design patterns, which are much more useful in the real world nowadays; they can learn the low-level stuff elsewhere if they desire.
My advice is take the course and learn Java and Python anyway. Doing so will only make it easier for you to eventually learn C, should you want to.
Most of what you will learn in a beginner's course is fundamental stuff that applies to most programming. The specific choice of language isn't a big deal here.
What rainwarrior said. Java is pretty much C with a new coat of paint, and probably easier to learn anyway. Javascript and Python are further removed but the basics are still the same, and that's all you'll be covering in a high school course anyway.
FWIW I think the teachers are right to call C a crazy sophisticated language because it's "low level". It is a low-level language. It's trivially easy to shoot yourself in the foot. You have to do all the safety checks yourself, manually, and even the pros stuff them up on a regular basis.
I would go so far as to blame the modern acceptance of buggy programs as simply the way things are squarely on the C family's shoulders, because the most frequent 90% of all bugs could be (and are!) reliably caught at compile time in more advanced languages. You really do need to know what you're doing.
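For example, here's a minimal C sketch (hypothetical, not from any real codebase) of the kind of foot-shot that compiles without complaint:
Code:
#include <stdio.h>

int main(void) {
    int scores[4] = {90, 85, 77, 68};

    /* Off-by-one: i <= 4 walks one element past the end of the array.
       C compiles this happily; the write to scores[4] is undefined
       behaviour that may corrupt the stack or appear to work fine. */
    for (int i = 0; i <= 4; i++) {
        scores[i] = 0;
    }

    printf("done\n");
    return 0;
}
Java throws an ArrayIndexOutOfBoundsException the moment this runs; in C it's on you (or your test suite) to notice.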
That's another thing... bug hunts and manual bug prevention & testing can be expensive. You want to cut unnecessary production costs, and automating away a good portion of the human error factor is one such cut.
Maybe it's just a high school thing mostly..? Not sure about others with EE/CS degrees, but the majority of my EE/CS classes used C. But I also got a taste of asm, Java, and Python, along with Verilog and MATLAB if you count them. Not sure how things have changed since 2012 when I graduated, though. There were absolutely no programming classes when I attended high school in the early 2000s.
There was no programming instruction at all in my high school until the last year I was there when it became a new course. At the time it was C++, but they switched to Java a year or two later. This was not quite 20 years ago, and Java was a bit less popular at that point (and a whole lot slower).
Throughout my university computer science program, the beginner courses were Java, and so were most courses, but you were expected to learn at least the basics in a variety of other languages too (e.g. Haskell, Prolog, Matlab, assembly). I only had one course that used C++, which was for computer graphics. I don't think there are any courses there that use C at all, though there are certainly graduate students and faculty doing some work in it.
So, C has never been on offer for formal instruction in the educational system I grew up in. Even C++ was rare. I don't know what kind of justifications you're hearing about C being "crazy sophisticated" or whatever, but I think the real reason is just popularity, not language features. Java is the most popular programming language. That's the same reason my high school briefly offered C++ while I was there, but they would have had something similarly spurious to say about why they didn't use Java instead. There's always something bad you can say about any programming language, especially if you want to make an excuse not to use it.
I learned BASIC from library books and magazines. Later I learned C++ from a book, and then quickly learned C and a whole lot more when the internet became a household thing. By the time my high school offered any course in programming, it was too late for me to learn anything from it. I would have loved an opportunity to have classroom instruction in it earlier on. To have peers to learn with, and a teacher to ask questions of would have been amazing. (Nowadays my old high school has 3 programming courses! Java, of course.)
So... take it anyway. If you want, try to learn C on your own, at the same time or later, or whenever you have the interest and inclination to do so.
I still think C is good, and so is assembly language. And there are many other programming languages that are good for different purposes, too.
Quote:
Java/Javascript
Just to make sure you are aware...Java and JavaScript are not related, and don't resemble each other in any way.
If you are interested in web programming, HTML and JavaScript will be useful to learn.
If you want to make apps, particularly for Android devices, then Java is a good language to learn.
I like C and C++. You can do so many things with either.
Python is a good language for high-school-aged programmers because it is easier to read, i.e. user-friendly.
It's also easy to get simple programs up and running very quickly, and on any machine.
There was absolutely no programming course in high school back when I was there (if you don't count Excel "programming"), and I think there still isn't, but they're planning to introduce one. Bear in mind that the general public is ordinary people, who may not be that interested in programming computers. Low-level programming is harder, so it's not the priority to teach to that audience.
Can't say anything that hasn't been said here, but yeah - while I think C is good to know even if you aren't going to work with it, it's not good for teaching the basics of programming!
Java and C# are designed around the object-oriented design idea (unlike C++, which sort of had it tacked on), do memory management for you, and are much more strict about when you are allowed to build your program, pretty much enforcing ideal design patterns on you. The "only" disadvantages to using either of those would be lower performance (which is surprisingly rarely relevant in most real-life scenarios) and less control over dependencies.
Both of those languages are absolutely ideal for teaching the concepts of object oriented programming, which is likely what you will be doing if you ever get a job working with this kind of stuff, no matter what language you'll end up coding.
One thing those languages don't teach you, however, is how things work closer to the internals of the computer, and in some cases that can be really good to know. C definitely teaches you some very important points about memory management that C# or Java like to pretend don't exist, but for high-school-level beginner classes, an assembly language would be much more beneficial for teaching kids how things work, and I'd imagine most programming classes probably do take a week or two to at least touch on that subject?
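To make that concrete, here's a minimal C sketch (hypothetical, not from any course material) of the memory bookkeeping that Java and C# quietly do for you:
Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    /* In C you ask for memory explicitly... */
    char *name = malloc(32);
    if (name == NULL) {
        return 1; /* ...and allocation can fail, which is on you to check. */
    }
    strcpy(name, "hello");
    printf("%s\n", name);

    /* ...and you must give it back yourself. Forget this line and you
       leak; use 'name' after it and you have a dangling pointer. A
       garbage collector makes both mistakes impossible to write. */
    free(name);
    return 0;
}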
I imagine there are a lot more choices now, when I went to what corresponds to "high school" in my country around 1999, my only option was a computer class lasting about a semester, which would touch on programming in terms of Visual Basic and a short flirt with some virtual assembly language we'd be making small tasks in using some kind of simple emulator.
Neither of those would be remotely useful in finding a job later in my life, but I think they worked just fine for the purpose: giving high school students an idea of what programming languages are, how they work, and what you can do with them... although as a big nerd I was familiar with most of that stuff beforehand...
Rahsennor wrote:
I would go so far as to blame the modern acceptance of buggy programs as simply the way things are squarely on the C family's shoulders, because the most frequent 90% of all bugs could be (and are!) reliably caught at compile time in more advanced languages. You really do need to know what you're doing.
Which is a good thing, because that means you're the one actually writing the program and not the computer.
infiniteneslives wrote:
Maybe it's just a high school thing mostly..?
It's not. I'm taking my first semester of programming at a college (even though I'm still in high school), and we've been using Java. Hopefully C is better; we've only been making really simple programs so far, but the thing practically makes itself. I don't even feel like I'm doing anything.
Sumez wrote:
an assembly language would be much more beneficial for teaching kids how things work, and I'd imagine most programming classes probably do take a week or two to at least touch on that subject?
Our textbook had a few pages on it, but then that was it. It felt useless for them to teach that; things like binary and machine code have absolutely no relevance to Java programming as far as I can tell.
It's frustrating to me how little everyone (not here, obviously) seems to value efficiency. I can think of several programs that aren't even "super sophisticated" but could bog down even a modern computer due to brute-force checking and whatnot. And why even make something take 2 seconds if it only needs to take 1? My conspiracy theory is that Intel and other companies encourage learning institutions not to value code efficiency so programs run poorly unless you buy their newest hardware.
Espozo wrote:
Which is a good thing, because that means you're the one actually writing the program and not the computer.
While I can see where you're coming from (compilers sure do generate awful code sometimes), no, it really isn't.
I'm talking about static analysis: the ability of a compiler to detect certain runtime behaviours at compile time. Even a straightforward strong type system can keep you from accidentally adding feet to meters, for instance. C is really bad at this, as it has a handful of fixed types whose sizes aren't even exactly specified (is an int 16 or 32 bits? how about a long?), doesn't allow you to define new ones (a typedef is just an alias), will cheerfully allow you to store a value in a variable too small for it, silently discarding the high bits, and lets you put == anywhere you can use = (true story: I once wasted an entire day hunting for that typo in a 1000-line program).
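A couple of those pitfalls in one minimal sketch (hypothetical values, but every line compiles):
Code:
#include <stdio.h>

int main(void) {
    long big = 100000L;
    /* If short is 16 bits, the high bits are silently discarded;
       many compilers accept this without so much as a warning. */
    short narrow = big;
    printf("%d\n", (int)narrow);

    int x = 0;
    /* The day-wasting typo: '==' where '=' was meant. This compiles,
       computes a comparison, and throws the result away; x stays 0. */
    x == 1;
    printf("%d\n", x);
    return 0;
}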
In this particular case, there's no reason you can't have your cake and eat it too - a strong type system in a low-level programming language will work just fine, since it doesn't affect the generated code at all.
Rust takes things a step further and enforces compile-time memory safety: null pointers, uninitialized values, dangling pointers, double-frees and all those other hard-to-debug errors are completely prevented, with no runtime impact. The compiler simply checks that you've dotted all your i's and crossed all your t's. That sort of thing is tedious and error-prone for humans, but is literally what computers were built for. As a programmer, why wouldn't you want to make the computer do your work for you?
Anyway, I'm derailing the thread. My point is, there are languages out there that go out of their way to tell you when you're doing something that's going to cause you grief later on, and those languages are much better introductions to programming than ones that don't - but that doesn't have to mean those languages are only for beginners. The second-best-selling game of all time was written in Java, you know.
Heck, as far as install base goes, it has even been (and still is) commercially sound to learn ActionScript (the language of Adobe Flash & AIR). Flash alone has 1.5 billion installations, it runs on every PC, modern console, and mobile device, and games like Angry Birds and FarmVille speak for themselves in terms of economic viability. ActionScript via Flash is also responsible for vast amounts of shovelware.
While Flash is rapidly becoming less significant as a web presence and is going to see development discontinued by 2020 at the latest (in favor of HTML5), the more app- and game-oriented Adobe AIR is a more or less direct continuation, so ActionScript is still a valuable skill in some fields of the industry.
Espozo wrote:
infiniteneslives wrote:
Maybe it's just a high school thing mostly..?
It's not. I'm taking my first semester of programming at a college (even though I'm still in high school), and we've been using Java. Hopefully C is better; we've only been making really simple programs so far, but the thing practically makes itself. I don't even feel like I'm doing anything.
My first college intro to programming was Java as well. Once I started taking data structures we were upgraded to C. Networking and operating systems classes continued in C. Microcontroller classes all used C after an intro to asm, but those are the only viable options for micros.
Rahsennor wrote:
doesn't allow you to define new ones (a typedef is just an alias)
What about struct?
infiniteneslives wrote:
Espozo wrote:
infiniteneslives wrote:
Maybe it's just a high school thing mostly..?
It's not. I'm taking my first semester of programming at a college (even though I'm still in high school), and we've been using Java. Hopefully C is better; we've only been making really simple programs so far, but the thing practically makes itself. I don't even feel like I'm doing anything.
My first college intro to programming was Java as well. Once I started taking data structures we were upgraded to C. Networking and operating systems classes continued in C. Microcontroller classes all used C after an intro to asm, but those are the only viable options for micros.
Java and C++ were the only languages we were really taught in university (at a major university with a strong CS program), but we regularly received assignments that had to be done in other languages (particularly C). You learned it on your own or dropped the major.
gauauu wrote:
Java and C++ were the only languages we were really taught in university (at a major university with a strong CS program), but we regularly received assignments that had to be done in other languages (particularly C). You learned it on your own or dropped the major.
Yeah I guess it was fairly similar for me, they didn't formally teach us C, we were taught Java, then subsequent classes expected you to be able to quickly pick up C and start putting it to work.
I don't think it's an issue of schools against teaching C, it's that back in the late 90s/early 2000s, Sun really forced (bought) their way into academia and pushed Java big time. Colleges and universities latched on, and you ended up with trash like the APCS/CSAP exams being done exclusively in Java. Microsoft has done similarly with C#.
While the commercial implications of both languages are obviously problematic, I'd say you can easily defend why these languages are absolutely optimal for teaching students the concepts of OOP patterns, since they are designed entirely around it.
C# in particular has the advantage of full integration into Visual Studio which is arguably the best programming IDE ever created, no matter what you may think of Microsoft.
Fortunately, Mono also exists if you aren't happy with relying on Microsoft's own framework. And realising this, MS themselves have gone completely open source for all recent developments in the area.
Plus, Visual Studio Code is lightweight, portable, and has lots of independently developed extensions for anything you'd like to write.
FrankenGraphics wrote:
Plus, Visual Studio Code is lightweight, portable, and has lots of independently developed extensions for anything you'd like to write.
And has almost nothing to do with Visual Studio itself
The branding of Code as Visual Studio is admittedly confusing. I get that they felt the need to piggyback on an established product, but their download links sit next to each other as if they were two flavours, or one lite and one premium.
thefox wrote:
What about struct?
Using struct for scalars (which are the largest single source of bugs in C code, for me and for at least one study I can't seem to find anymore) is a pain in the posterior. Trust me, I've tried it.
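For anyone curious, here's roughly what the struct-wrapper trick looks like (a hypothetical sketch, not code from that study):
Code:
/* Wrap each scalar in its own struct so the compiler refuses to mix
   them up: the feet/meters mistake becomes a compile error. */
typedef struct { double value; } Meters;
typedef struct { double value; } Feet;

static Meters add_meters(Meters a, Meters b) {
    Meters r = { a.value + b.value };
    return r;
}

int main(void) {
    Meters m = { 10.0 };
    Feet f = { 30.0 };
    Meters ok = add_meters(m, m);   /* fine */
    /* add_meters(m, f);               error: incompatible type */
    (void)ok;
    (void)f;
    return 0;
}
The catch is that every access needs .value and every operator has to be written by hand, which is exactly the pain being described.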
koitsu wrote:
I don't think it's an issue of schools against teaching C, it's that back in the late 90s/early 2000s, Sun really forced (bought) their way into academia and pushed Java big time. Colleges and universities latched on, and you ended up with trash like the APCS/CSAP exams being done exclusively in Java. Microsoft has done similarly with C#.
Assuming that what you said is true, I don't see the point, since both the tools required to develop Java and Java itself are free; i.e., you can develop and distribute a Java application without giving a single cent to Sun. For C#, however, it's another story.
Bregalad wrote:
For C#, however, it's another story.
The .NET framework, and basic versions of Visual Studio are available entirely for free.
The issue comes only if you're planning on running web services using Microsoft platforms, which is a common use for C#/.NET. That said, you get what you're paying for - MS's server software has gotten incredibly solid over the last 10-15 years.
Bregalad wrote:
koitsu wrote:
I don't think it's an issue of schools against teaching C, it's that back in the late 90s/early 2000s, Sun really forced (bought) their way into academia and pushed Java big time. Colleges and universities latched on, and you ended up with trash like the APCS/CSAP exams being done exclusively in Java. Microsoft has done similarly with C#.
Assuming that what you said is true, I don't see the point, since both the tools required to develop Java and Java itself are free; i.e., you can develop and distribute a Java application without giving a single cent to Sun. For C#, however, it's another story.
Sun definitely made some pretty heavy donations to educational institutions around here in the early 2000s. I remember we had a whole lab full of Sun Ray computers that nobody used much. I would sometimes use them just for variety's sake.
I also recall that there wasn't an open source reference for the Java VM and libraries until 2006, I think? It definitely used to be more of a proprietary thing than it is now.
Sumez wrote:
The .NET framework, and basic versions of Visual Studio are available entirely for free.
The issue comes only if you're planning on running web services using Microsoft platforms, which is a common use for C#/.NET. That said, you get what you're paying for - MS's server software has gotten incredibly solid over the last 10-15 years.
No, you still need the non-free Windows operating system to run them.
How is that different in practice from needing the non-free Windows operating system to run your hardware drivers? This can happen when no laptops in your local Best Buy have a penguin logo to imply that free drivers are available for the hardware, and System76 (which specializes in Linux PCs) doesn't offer any laptops in your preferred form factor.
ASUS T100 still has no suspend on Linux after years.
Even if Windows is non-free as in speech, Linux is non-free as in beer because of increased hardware support cost for the manufacturer. Last I checked, the Dell XPS 13 with Windows 10 Home sold for $50 less than the same laptop with Ubuntu.
tepples wrote:
How is that different in practice from needing the non-free Windows operating system to run your hardware drivers? This can happen when no laptops in your local Best Buy have a penguin logo to imply that free drivers are available for the hardware, and System76 (which specializes in Linux PCs) doesn't offer any laptops in your preferred form factor.
ASUS T100 still has no suspend after years.
In the last 15 years, I've never had a laptop that wouldn't run linux reasonably well. Sometimes some non-essential bits don't work right (suspend/hibernate, or fingerprint readers, etc) but I've had about the same chance of problems when I upgrade a laptop to Windows 10 from an older version.
Quote:
Even if Windows is non-free as in speech, Linux is non-free as in beer because of increased hardware support cost for the manufacturer. Last I checked, the Dell XPS 13 with Windows cost $50 less than the same laptop with Ubuntu.
For a personal user, you're right. For all practical purposes, Windows is usually "free-as-in-beer" for a home computer because of the way it's sold. That said, there are a few important places where it's not free, but Linux is:
1. Used computers. I've bought used computers with no legal windows license, or something ancient like XP. To get a modern OS on them, I'll use linux since I don't want to pay for a modern version of windows.
2. Servers and VMs. If I want to run some web service on a windows server, that's going to cost me money for the OS. I can run a linux server without paying for the OS.
Quote:
Sometimes some non-essential bits don't work right (suspend/hibernate, or fingerprint readers, etc) but I've had about the same chance of problems when I upgrade a laptop to Windows 10 from an older version.
Not to mention that things like hibernation/suspension sometimes don't work that well even on a designated software/hardware package, and tend to break gradually over time with all the mandatory updates.
My current HP laptop, which is the most expensive one I've bought to date, sometimes forgets its multitouch functions (and it's like 4 months old), which is a shame because they really speed up my workflow. Hibernation or a reboot solves it.
Bregalad wrote:
No, you still need the non-free Windows operating system to run them.
Not to mention non-free computers to run that operating system on.
I think we're grasping at straws here
If you are strongly against anything Microsoft branded, Mono compilers and compatible IDEs do exist for other platforms. Also, the latest version of the official C# compiler is completely open source on Github (yes, they aren't even using Microsoft's own code repository).
gauauu wrote:
tepples wrote:
How is that different in practice from needing the non-free Windows operating system to run your hardware drivers? This can happen when no laptops in your local Best Buy have a penguin logo [...].
ASUS T100 still has no suspend after years.
In the last 15 years, I've never had a laptop that wouldn't run linux reasonably well. Sometimes some non-essential bits don't work right (suspend/hibernate, or fingerprint readers, etc)
Some people drive everywhere. Others get motion-sick from reading while riding transit. I can see how they would consider suspend "non-essential bits", as they don't need to quickly suspend when transferring to another bus at the transit station and can instead rely on shutting down before leaving and booting when arriving. But I don't drive, and fortunately, I'm not affected by reading motion sickness. So to me, and probably to any other frequent passenger who doesn't get motion-sick, suspend is "essential bits".
gauauu wrote:
1. Used computers. I've bought used computers with no legal windows license, or something ancient like XP.
Such as the off-lease ThinkPad X61 that I bought for $101 shipped on a tip from mikejmoffitt. It arrived with its Windows certificate of authenticity torn off and Windows 10 on a volume license that could no longer connect to the LAN where its activation server had been located. It now runs Debian 9 "Stretch", for which C and C++ are a sudo apt install build-essential away.
I guess there's a difference between hibernation behaving oddly from time to time or suboptimally, and not working at all. I'm working from the seat of a bus a lot.
A bit off topic, but I'm always upset by how tutorials use integers and constants when unnecessary. The Arduino tutorials use an entire int just for a bool value. And they tend to use constants when they could just use #define VariableName Number.
DementedPurple wrote:
A bit off topic, but I'm always upset by how tutorials use integers and constants when unnecessary. The Arduino tutorials use an entire int just for a bool value. And they tend to use constants when they could just use #define VariableName Number.
1. C has no built-in bool type, and on many C++ compilers the size of a bool is the same as an int, so the only gain from using bool is the type information, not storage size.
Ref: https://en.wikipedia.org/wiki/Boolean_data_type#C,_C++,_Objective-C,_AWK
2. Most people would complain about using #define for constant values when unnecessary. #define has a whole host of potential unwanted side effects, and a const value has very few.
Ref: https://www.baldengineer.com/const-vs-define-when-do-you-them-and-why.html
C++11 also offers constexpr, which can more strongly enforce "compile time only", if you need it.
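To illustrate the side-effect problem with a minimal sketch (hypothetical constants):
Code:
#include <stdio.h>

/* A macro is plain textual substitution, so precedence can bite: */
#define SCALE_MACRO 2 + 3

/* A const has a real type and behaves like any other value: */
static const int SCALE_CONST = 2 + 3;

int main(void) {
    printf("%d\n", 4 * SCALE_MACRO); /* expands to 4 * 2 + 3 = 11 */
    printf("%d\n", 4 * SCALE_CONST); /* 4 * 5 = 20, as intended */
    return 0;
}
(Yes, parenthesizing the macro fixes this one, but the const never needed fixing in the first place.)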
C didn't use to support booleans as a first-order primitive; that was added with C99. (stdbool.h)
(Yes, I know you could use bitfields)
Most computer architectures don't support direct addressing of individual bits; you have to explicitly shift and mask wider values to emulate the same behavior. At that point, it's faster to just use the full word instead of whatever subset anyway.
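A minimal sketch of what that emulation looks like (hypothetical flags):
Code:
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

static uint8_t flags; /* eight flags packed into one byte */

/* Every packed access pays for an explicit shift and mask: */
static bool get_flag(int i) { return (flags >> i) & 1u; }
static void set_flag(int i, bool v) {
    if (v) flags |= (uint8_t)(1u << i);
    else   flags &= (uint8_t)~(1u << i);
}

int main(void) {
    set_flag(3, true);
    printf("packed flag 3 = %d\n", get_flag(3));

    /* ...while one int per flag is a single load or store. */
    int fast_flag = 1;
    printf("unpacked flag = %d\n", fast_flag);
    return 0;
}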
Eh, I dunno. I'm of the opinion that you shouldn't learn just one programming language anyway. I know quite a few devs that are uncomfortable outside of their only language (ex: Unity/C# or HTML/JavaScript). Many of them consider C++ to be black magic even though they've never used it. I did a Global Game Jam game last weekend for NES, and it blew a few people's minds that I wrote a dozen lines of assembly to handle music selection. Not that these are bad people, but don't be like that. :p
Be ready to use Java if you are hired to work on Android, C# if it's for MS, Swift if it's Apple, C if it's for Linux, etc. (Yeah, I know there are overlaps and what about JS/Python/etc, I'm just sayin'...) The more adaptable you are, the more valuable you are. If you are learning a language, figure out what makes it unique, and it will make you a better programmer in other languages even if you don't like it.
My experience with a C programming class at college was that in the end we'd spent about 80% of the time dealing with all the intricacies of printf format strings.
And yes under a microscope printf does some bizarre and tricky things that'll get points marked off. Something that teachers love when they're under pressure to make their grades look like a bell curve.
Hahaha, as if a large percentage of a project's development time were spent on polishing the layout of some console output no user will ever see.
Sadly at the schools I went to in the UK wayyy back, no-one ever taught us how to program. Which was a shame but they did at least have the goodwill to give us access to the Basic family of things in secondary school (BBC Basic, QBasic and Visual Basic)
No lessons as I said, but the I.T. rooms were always open during break times and were frequented by me and a few others a lot.
At University much of the teaching was very hands-off so you'd just go and learn your own style through research and practice. My course wasn't strictly a programming course, more a mix of design and coding but you get the picture.
Recently there's been a push for more visual programming languages such as Scratch for kids around the age of 8 years old, which is fine. It's an entryway into programming I guess; you can set up logic loops, etc., which is cool. Very bare-bones, but a good starting point.
I think you guys are lucky to at least have classes with computers.
When I was at school, there were only a few PCs, and most of the time they were broken.
Now that my kids go to school, I have a weird feeling that most teachers are afraid of computers, since most of the time I see the computers staying idle or off, when they're not being used by someone to surf the Web.
I'm under the impression that smartphones and tablets made computer classes for kids kinda pointless. Kids start using tablets/phones when they're only 2 years old nowadays, and they're probably capable of doing much more advanced stuff by the time they're teenagers than the things I was doing when I had computer classes at that age.
Learning to program is getting harder, not easier:
https://developers.slashdot.org/story/1 ... ing-harder
http://oneweekwonder.blogspot.co.uk/201 ... llacy.html :
Quote:
I've just been watching this BBC news article, which features luminaries from the UK industry talking about the woeful programming skills in the new generation of kids.
Let's make a case here: not only does the education system have computing wrong because we focus on ICT, but the programming industry has the wrong emphasis on computer science, because they believe in teaching kids using powerful environments on powerful systems; when what we need is dirt simple systems and environments. ...
Report: 80’s kids started programming at an earlier age than today’s millennials:
https://thenextweb.com/dd/2018/01/23/re ... llennials/
@Garth What frustrates me is that what we're doing seemingly has no basis in reality. Like, what the hell actually is a "String" in data? We're taught all these different "data types", but one of the first things we learned in the class is that all data is represented by "1's and 0's." I know it's about how the compiler wants to treat the data, but I think most people are a bit confused by it. Just give me the option for "byte," "word," "dword," and "qword."
We have functions like "Math.random()", but how it actually gets this "random" number is anyone's guess. Most everyone in the class probably believes that it's actually generating a completely random number, which is, way more than likely, impossible.
A type like "String" is a convention that different parts of a program agree to use for a particular block of ones and zeroes. Features in language implementations for enforcing conventions like these, such as "strong typing" and "static typing",* help prove that a program does not contain certain classes of errors related to inadvertent violation of these conventions.
Nowadays, random numbers can be generated by sampling actually random data, such as room noise picked up by a modern desktop, laptop, tablet, or pocket computer's microphone, and mixing it around.
* Which are not the same.
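A minimal C sketch of that idea (hypothetical data): the same bytes, read under two different conventions.
Code:
#include <stdio.h>

int main(void) {
    /* One block of ones and zeroes... */
    const char text[] = "Hi!";

    /* ...read under the "string" convention: */
    printf("as text:  %s\n", text);

    /* ...and read as raw numbers: */
    for (int i = 0; text[i] != '\0'; i++) {
        printf("as bytes: %d\n", (unsigned char)text[i]); /* 72 105 33 */
    }
    return 0;
}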
tokumaru wrote:
they're probably capable of doing much more advanced stuff
I agree with that. But sometimes they have difficulties doing pretty simple things, like writing texts, making slideshows, spreadsheets, etc.; abilities that will probably be needed in their future jobs.
I think the use of computers should be integrated with the teaching of other subjects, like teaching math with a spreadsheet or how to write a good composition in a word processor.
Other than that, I think that not just C, but any programming language will probably help kids learn a bit about algorithms, which in the future can help them think of better solutions to problems, even ones that are not directly related to programming/EE.