Collisions with the windows of buildings kill more birds than wind turbines do, by some orders of magnitude. And that's before we get to cats ...
(Current numbers for the US: ~250,000 per year killed by turbines, ~1,000,000,000 per year by windows, and ~2,400,000,000 per year by cats. Numbers via Dunning, B. "Wind Turbines and Birds." Skeptoid Podcast. Skeptoid Media, 7 Jan 2020. Web. 12 Jan 2020. https://skeptoid.com/episodes/4709 )
So switching to Linux might reduce the use of Windows ... but probably increases use of cat ...
Actually @freemo I think that @design_RG is much more accurate.
An even more accurate #infographic would show deaths from #famine, #starvation and #disease in the #poor histogram of "How #Capitalism actually works"
Looking at this #Microsoft leak from 2000 I see where #Google learned its own tricks.
See the slides on #developers' #MindControl from page 118 onward:
http://techrights.org/wp-content/uploads/2008/08/comes-3096.pdf
Actually I see a rationale in his answer, but I wonder how other scripting systems address the issue.
(sorry if I take it from a historical perspective, but I think it's important to understand what Ashinn is saying)
He basically means you have (and should stick to) two ways of _distributing_ a piece of code:
- libraries
- programs
In a programming language that only supports statically linked binaries this appears quite obvious: you either distribute an executable or a library archive.
But if you distribute the executable, it's self-contained: it actually contains all the required code (except for the kernel, obviously, or any other program it invokes through `exec`, which count as dependencies).
Dynamic linking complicates things. People can distribute a library that is linked _at_run_time_ to the executable, so that programs are not self-contained anymore. BUT at least all of the code from a certain team ends up in a single file (the binary executable) that can be installed in the system paths without name clashes.
Scripting languages stretch this flexibility even further and let you distribute source files that can be executed directly.
BUT developers split code to ease development, not distribution.
So you might end up with a `utils.smc` in your project, but I could have one with the same name in my project: what if a user tries to install both in /bin?
The name clash causes the second installation to break the first.
So basically Ashinn says: do not distribute #Scheme programs split into different files.
If you want to build a library that people actually use, distribute it separately. But each program (and script) should be distributed as a single file.
I think this reasoning is quite correct (it's not by chance that #Jehanne only supports statically linked binaries), but I wonder how other scripting languages solve this issue.
For example, #Python probably has the same kind of issue.
Do they stick to this "one-file scripts" for programs that are going to be installed in /bin?
To be honest, I've never had to delve into this issue, as most of my Python programs weren't distributed to run from a system path.
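For what it's worth, my (hedged) understanding of how the Python ecosystem dodges the problem: installers like pip never copy individual source files into /bin; the whole project, its `utils` module included, goes under site-packages in a directory named after the package, and the only thing that lands in /bin is a tiny launcher generated from a console_scripts / `[project.scripts]` entry point. A minimal sketch of such a layout, where all names (`myprog`, `greet`, ...) are hypothetical:

```
# Hedged sketch of a typical layout after `pip install myprog`
# (all names are hypothetical, not a real project):
#
#   site-packages/myprog/__init__.py
#   site-packages/myprog/utils.py   <- stays namespaced under the package
#   site-packages/myprog/cli.py
#   bin/myprog                      <- thin launcher generated by pip from a
#                                      console_scripts / [project.scripts] entry

# --- site-packages/myprog/utils.py ---
def greet(name: str) -> str:
    return f"Hello, {name}!"

# --- site-packages/myprog/cli.py ---
def main() -> None:
    from myprog import utils  # resolved via the package name, never via /bin
    print(utils.greet("Scheme hackers"))
```

So two projects can both ship a `utils.py` without clashing, as long as their top-level package names differ: the constraint moves from "one file per program" to "one unique package name per project, plus a generated stub in /bin".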
Malwarebytes says low-end smartphones sold to low-income Americans via a government-subsidized program contain unremovable malware: https://arstechnica.com/information-technology/2020/01/us-government-funded-android-phones-come-preinstalled-with-unremovable-malware/ - discuss at https://freepo.st/freepost.cgi/post/luqh71ksfb #freepost
"The Cobra Effect"
The British government was concerned about the number of venomous cobra snakes in Delhi.
The government therefore offered a bounty for every dead cobra. Initially this was a successful strategy as large numbers of snakes were killed for the reward.
Eventually, however, enterprising people began to breed cobras for the income.
Apparently nobody really knows why it's called this way.
A "Canadian cross" compilation is simply the cross compilation of a cross compiler.
Indeed, during #GCC configuration you can specify 3 different systems:
1) the `build` system, where the compilation is going to run
2) the `host` system, where the produced compiler is going to run
3) the `target` system, which will run the binaries produced by the resulting compiler.
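For concreteness, here is a minimal sketch of what such a configure invocation could look like; the triplets are illustrative examples, not a tested recipe:

```
# Hypothetical Canadian cross configuration (triplets are illustrative only):
#   --build  : the system where this compilation itself runs
#   --host   : the system where the resulting gcc will run
#   --target : the system the resulting gcc will generate code for
../gcc/configure \
  --build=x86_64-pc-linux-gnu \
  --host=arm-unknown-netbsdelf \
  --target=x86_64-w64-mingw32
```

When `build`, `host` and `target` all differ you get the Canadian cross; when `build` and `host` coincide you get an ordinary cross compiler.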
SHA-1 is a Shambles
> We have computed the very first chosen-prefix collision for SHA-1. In a nutshell, this means a complete and practical break of the SHA-1 hash function, with dangerous practical implications if you are still using this hash function. To put it in another way: all attacks that are practical on MD5 are now also practical on SHA-1.
Beware: if #Dijkstra had seen what #JavaScript is doing to all of us, he would have been a #COBOL evangelist.
@alexcleac not sure what the "there" is, and I am surely missing a bunch of context, but my answer (as a software developer, sysadmin, infosec person, and a user of technology in 2020) is: software engineering is still largely missing the "engineering" part.
By that I mean the ethos, the risk aversion, and the personal responsibility in case of catastrophic failure.
Because we are having way too many catastrophic failures in IT still. This needs to be fixed.
define "everyone", please 😉
I'd almost say the opposite: there are simple concepts, simple algorithms and so on, but their structure is insane.
You can build GCC on an x86 Linux system (glibc) so that it will run on an AArch32 machine under NetBSD and build statically linked binaries for Windows x86_64 (newlib-cygwin), for several languages.
The simple fact that Canadian crossing is possible and supported should give some insight into the internal #complexity of #GCC.
Building GCC also builds `libgcc`, a support library against which every GCC-built binary is linked, to ease some optimizations. This is another hint: compilers are not as simple and modular as one might think from a high-level description of them.
Finally, it's not entirely true that you can disable every optimization: not only because there is no real difference between optimizations and the other transformations performed during compilation, but also because most combinations of optimizations have never really been tested.
So I'd argue that, for a tester, modern compilers are one of the worst possible nightmares of today's computing.
Yes, they are functional, but I'd guess nobody would live long enough to seriously test every possible combination of options of a single GCC release and ensure it maps each possible input to the correct output.
@Xipiryon @ekaitz_zarraga @suetanvil@mastodon.technology