It's been a while but from what I remember the easiest way to block this was by disallowing outbound network requests from search/the start menu in the firewall settings. It worked across all versions of Windows I tried it on.
"End-users need to read and understand shell scripts to make sure they're safe" is a completely unacceptable threat model. The way I see it installing software from the AUR is about as safe as installing software from the pirate bay. Nevertheless, this distribution keeps getting discussed and recommended to people, with the AUR often cited as a reason to use it.
How is that an unacceptable threat model for a repo of packages that are optional and user-made? One that clearly says, "DISCLAIMER: AUR packages are user produced content. Any use of the provided files is at your own risk." (1)
The AUR, along with Arch's minimalism, is one of my favorite things about it. Instead of cloning the source repo, reading the build instructions, building, and then installing, I download a script, read it to make sure it looks okay (e.g. the source points to what I expect), and then `makepkg -si`.
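The "read it to make sure it looks okay" step mostly means checking the `source=()` array and looking for anything surprising the script runs. A minimal sketch of that audit, using a toy PKGBUILD written locally in place of one cloned from `https://aur.archlinux.org/<pkgname>.git` (the package contents here are made up):

```shell
# Toy PKGBUILD standing in for one cloned from the AUR.
cat > PKGBUILD <<'EOF'
pkgname=hello
pkgver=2.12
pkgrel=1
source=("https://ftp.gnu.org/gnu/hello/hello-$pkgver.tar.gz")
sha256sums=('SKIP')
EOF

# The quick audit: where does the source actually come from?
grep -E '^(source|url)=' PKGBUILD

# Does the script fetch anything outside the declared source array?
grep -E 'curl|wget' PKGBUILD || echo "no ad-hoc downloads outside source=()"

# If it looks sane: makepkg -si  (build and install; not run here)
```

`makepkg` verifies the declared checksums and runs the build as your user, so the PKGBUILD itself is the main thing to eyeball.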
> The way I see it installing software from the AUR is about as safe as installing software from the pirate bay.
No, if I trust the source - and I often follow the source link to GitHub to check out the project - then it's like one of my distro's packages, except I'm the one saying it's safe for me to install. I'm not claiming it's risk free, but it's been a great boon to me. (2)
2: I used the AUR to compile and install Goldendict-ng, an actively maintained fork of the dictionary software Goldendict. It accepts my Apple dictionaries converted to Stardict format, and it supports Wayland!
> How is that an unacceptable threat model for a repo of packages that are optional and user-made? One that clearly says, "DISCLAIMER: AUR packages are user produced content. Any use of the provided files is at your own risk." (1)
The AUR is an official part of Arch Linux. It's hosted on the archlinux.org domain, with a prominent link to it from the main page. You enable package installation from it either by using one of the many transparent pacman wrappers recommended in Arch community spaces and on the Arch wiki, or by ticking a checkbox in a graphical package manager like pamac. IMO a one-line disclaimer on the AUR main page doesn't fix the problem at all.
Security isn't about the trustworthiness of the code you're running; it's about the trustworthiness of the person who's giving you the code. No matter how good you are at auditing bash scripts, there's a malicious bash script that will slip by you, even if you're diligent (which most aren't, even among so-called "power users"). With official packages, I have to trust the people who distribute my OS. With vendor-distributed software (Windows software, PPAs, curl | sh) I have to trust the person who wrote the software. With the AUR, I have to trust the first person to park the name of the package.
I mean, 99.9% of the problems can be averted by just not installing some random new AUR package with zero votes or popularity.
The vast majority of packages an average user needs are built by Arch anyway, and the AUR is, by and large, not nearly as needed. Still, I'd take easily reviewable PKGBUILDs over adding some random PPA, as all too many Ubuntu users tend to do, or similar.
I think they have a point. You might (and should) evaluate the PKGBUILD for each new package you install, but when you do a full system upgrade, are you telling me you'll review every AUR package again?
Most AUR helpers (well, the ones I've used at least, those being yay and pacaur) include the option to show a diff of the PKGBUILD (and the other provided files) for AUR package upgrades.
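Even without a helper, the same review works with plain git, since every AUR package is a git repository. A self-contained sketch, simulating locally what a `git pull` of the package repo would bring in (the package and URLs are made up):

```shell
# Stand-in for a previously cloned AUR package repo.
git init -q example-pkg && cd example-pkg
git config user.email you@example.com && git config user.name you
printf 'pkgver=1.0\nsource=("https://example.com/v1.tar.gz")\n' > PKGBUILD
git add PKGBUILD && git commit -qm 'initial import'

# Simulate the upstream update a `git pull` would fetch.
printf 'pkgver=1.1\nsource=("https://example.com/v1.1.tar.gz")\n' > PKGBUILD

# Review exactly what changed before rebuilding with makepkg.
git diff -- PKGBUILD
```

In the real workflow you'd run `git pull` in the package directory and `git diff HEAD@{1}..HEAD -- PKGBUILD` (or just read the pull's diff), then rebuild only if the changes look sane.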
Well, I don't use any of those dirty helpers (now THAT'S crazy talk), so my AUR packages mostly get built on a separate schedule (whether fully manually or in CI) from running `pacman -S`.
I think there are only one or two non-mainline packages I depend on that get frequent updates, where it would matter.
> I think the biggest problem is that the DOM was built for documents, not apps.
The world wide web was invented in 1989, JavaScript was released in 1995, and the term "web application" was coined in 1999. In other words: the web has been an application platform for most of its existence. It's wrong to say at this point that any part of it was primarily designed to serve documents, unless you completely ignore all of the design work that has happened for the past 25 years.
Now, whether it was designed well is another issue...
> Instead of slow React renders (50ms?), every interaction is a client-server round trip.
This is true only if you use zero JavaScript, which isn't what the article is advocating for (and even with zero JavaScript there's quite a bit of client-side interactivity built into CSS and HTML). Besides: in practice, most interactions in an SPA also involve network round trips, in addition to that slow React render.
This library seems to have the annotation on every function, though, so it's possible the author is just following a convention of always using it for functions defined in header files (it'd be required if the functions weren't declared `static`).
My understanding of API standards that need to be implemented by multiple vendors is that there's a tradeoff between having something that's easy for the programmer to use and something that's easy for vendors to implement.
A big complaint I hear about OpenGL is that it has inconsistent behavior across drivers, which you could argue is because of the amount of driver code that needs to be written to support its high-level nature. A lower-level API can require less driver code to implement, effectively moving all of that complexity into the open source libraries that eventually get written to wrap it. As a graphics programmer you can then just vendor one of those libraries and win better cross-platform support for free.
For example: I've never used Vulkan personally, but I still benefit from it in my OpenGL programs thanks to ANGLE.
For every person who says on the internet that you can just use a C++ subset, there's another who insists that C is the bad C++ subset. So compiling C code with a C++ compiler promotes your code from "good C code" to "bad C++ code" (most C code isn't "exception safe," for example).
It's arguably irrational to evaluate a language based on this, but you can think of "this code could be better" as a sort of mild distraction. C++ is chock full of this kind of distraction.