javascript
I avoid it as much as possible
ruby
not aware of using it... I'm a real hypocrite here, as I should know whether I do or not, but I'm unaware of common uses of Ruby, period
python
1. I check the Python-based dependencies of all applications I use pretty carefully (and don't use pip/PyPI at all; either the OS repo or compiling from Python source is the way to go)
2. I haven't yet found anything on PyPI whose source I can't byte-compile myself and install into Python's annoying package system
3. I can read/write Python, so I understand it just about well enough to review code for weird/suspicious-looking tricks such as the one in the OP; it'd have to be a subtler trick to work
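A minimal sketch of the kind of manual review described in point 3, using only the stdlib `ast` module to flag constructs that deserve a closer look. The set of "suspicious" names here is my own illustrative choice, not a complete or authoritative audit rule; subtler tricks (string obfuscation, `getattr` chains) would slip past it, which is rather the point.

```python
# Walk a module's syntax tree and report calls to names commonly
# involved in obfuscated/malicious Python (exec, eval, etc.).
# SUSPICIOUS_CALLS is an illustrative, deliberately incomplete list.
import ast

SUSPICIOUS_CALLS = {"exec", "eval", "compile", "__import__"}

def flag_suspicious(source: str):
    """Return (line_number, name) pairs for direct calls to flagged names."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                hits.append((node.lineno, node.func.id))
    return hits

sample = "data = input()\nexec(data)\n"
print(flag_suspicious(sample))  # [(2, 'exec')]
```

This only catches direct calls by name, so it's a first-pass triage tool for reading a dependency's source, not a substitute for actually reading it.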
or rust
hmmm, I wish that weren't true (AFAIK Firefox/Thunderbird are the only applications I currently use that involve Rust, and presumably Mozilla enlist the so-called dependency-orgy ecosystem, seeing as they're the developers of much of it)
That's why I'm happier using applications/libraries/system daemons/utilities written in C/C++ for anything but the most trivial purposes. The language standards and the (typical) libraries are far less numerous, and many have been around longer than Python has.
Sure, they're harder languages to learn, and it's easy to make mistakes with them (and the amount of control C/C++ gives you makes those mistakes potentially calamitous for your computer). For anything at all critical, that's a good thing: software developers should be using powerful tools that they understand (including the underlying operating system) to produce critical components. And if that means only a tiny proportion of coders can work on critical code, great. We want exactly those types of people (i.e. competent ones) attracted to that part of the computing science spectrum.
I worry that there will eventually be a modern-day witch-hunt against people using free and powerful computing tools. Open source software, hardware with user-flashable firmware, or programming languages that let you talk to the kernel could all become locked down at the hardware level (supremely ironic), but you'd be allowed to use these things so long as you submit to a million screening procedures and adjunct hardware "unlockers" that grant you the access while constantly monitoring everything you type. Computers would become like Gameboys, and
real computers would need military export licensing
These hand-holding programming languages with their universe-loads of standard libraries and constant emphasis on every resource being behind a URL... it just seems like part of the same trend to me. Instead of learning the encyclopaedia of Java objects and exceptions, why not use your brain to learn how the underlying operating system works? It's maddening that programmers, of all people, don't recognize that it could be only one newspaper-article trend or an executive order away from being made inaccessible
Sorry for the rant, but the article in the OP just reminds me of all this: the trend where learning what's actually going on on your computer is increasingly less possible, because there's either too much material for one person to cope with, or the chain of trust is immature, opaque and qualitatively inadequate. I mean, packages on PyPI aren't even author-signed; you just rely on the SSL connection for authenticity