Personally I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is still being used to this day to write ...applications.
Are you a programmer, AlexGR? I gathered from the discussion we had in the other thread that you mostly aren't, but I'm not sure.
It depends on the definition. I don't consider myself one, but I've written some simple programs in the past, mainly for my own use.
I still occasionally code things I need. For example, I recently started (attempted) to implement some neural network training to find patterns in SHA256, to see whether it might be feasible to create a counterweight to ASIC mining by pooling CPU and GPU resources in a distributed-processing neural network with tens of thousands of nodes (the equivalent of a mining pool). While I think the idea has merit, I didn't get very far. Now I'm contemplating whether neural networks can find weaknesses in hashing functions like CryptoNight, X11, etc., and leverage them to shortcut the hashing. In theory all hash output should be pretty much (pseudo)random, but since there is no such thing as true randomness, it's just a question of finding the pattern... and where "bad crypto" is involved, that could be far easier.
Anyway, these projects are a mess in terms of complexity (for my skills) and I'm in way over my head due to my multifaceted ignorance. My background wasn't much to begin with: I started with ZX Spectrum BASIC back in the 80s, then did some BASIC and Pascal for the PC, then some assembly... and then the Internet came along and I kind of lost interest in "singular" programming. In the "offline" days it was easy to just sit at a screen, with not much distraction, and "burn" yourself to exhaustion, staring at the screen for 15 hours to write a program. In the post-Internet era that became impossible... ICQ, MSN, forums - all of these gradually reduced my attention span and my will to focus.
I always hated C with a passion, while Pascal was much better aligned with how I think. It helped enormously that Turbo Pascal had a great IDE for DOS, a nice F1 help reference, etc. Now I use Free Pascal on Linux, which is fairly similar, but not for anything serious. As for C, I try to understand what programs do so that I can change a few lines to make them do my work differently.
In theory, both languages (or most languages?) do the same job... you call the library you want, use the part you want as the "command", and voila. Where they may differ is code execution speed and other, subtler things. Library support is obviously not the same, although I believe FPC can link against some C libraries as well.
Still, despite my preference for Pascal's design as a more human-friendly language, it falls waaaaaaaaaay short of what I think a language should actually be.
Syntax should be very simple and understandable - kind of like pseudocode. Things like integers overflowing, variables lacking the precision they need, programs crashing over idiotic stuff like division by zero, buffer overflows, etc. - things like that are ridiculous and shouldn't even exist.
By default a language should allow the maximum sizes a user could fit into vars and consts, but also let the user fine-tune them - and only if he wants to, to increase speed (although a good compiler should be able to tell which sizes can be safely downsized and do so by itself anyway). Compilers should be infinitely better at taking PROPER advantage of our hardware. How can we be more than 15 years past SSE2 and still not have it used properly? You write a password cracker and you have to manually insert ...SSE intrinsics. I mean, wtf? Is this for real? It's not an instruction set that shipped last month - it's been there for ages. Same for SSE3/SSSE3/SSE4.1/4.2, which are 8+ years old and go unused. How a language can pretend to be "fast" while throwing away SIMD acceleration is beyond me. But that's not the fault of the language (=> the compiler is at fault). Taking advantage of GPU resources should also be a more automated process by now.
I think, though I'm not sure, that most C coding these days (and C still seems to be quite popular) is indeed systems programming, firmware for devices, performance-critical callouts from other languages (Python, for example), etc. But I have few if any sources for that, and it may be completely wrong.
Operating systems are definitely C, but the majority of OS-level applications are also C, especially on Linux.