importantly it’s (hopefully) an ISP that operates from a less copyright-happy country and isn’t tied down to tons of expensive infrastructure and long-term contracts
to be even more pedantic, if we follow the relevant official RFCs for HTTP (formerly RFC 2616, now RFC 7230–7235, which changed the relevant wording), a 403 can substitute for a 401, but a 401 has specific requirements:
The server generating a 401 response MUST send a WWW-Authenticate header field (Section 4.1) containing at least one challenge applicable to the target resource.
(the old RFC 2616 said a 403 must not be answered with a request for authentication, but the newer versions don’t seem to mention that)
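As a minimal sketch of what that requirement looks like in practice (using Python’s stdlib `http.server`; the handler name and realm string are made up for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class AuthDemoHandler(BaseHTTPRequestHandler):
    """Toy handler illustrating the RFC 7235 rule: any 401 response
    MUST carry at least one WWW-Authenticate challenge."""

    def do_GET(self):
        if self.headers.get("Authorization") is None:
            self.send_response(401)
            # Required for every 401 response (the "Section 4.1" the quote refers to)
            self.send_header("WWW-Authenticate", 'Basic realm="demo"')
            self.end_headers()
        else:
            # (a real server would actually validate the credentials here)
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet
```

You can run it with something like `HTTPServer(("127.0.0.1", 8000), AuthDemoHandler).serve_forever()` and `curl -v` the URL to see the challenge header; a bare 403 carries no such obligation.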
with another OS, Nix is not going to be “in control”, so it’s probably more limited. I’m not sure how common using Nix is outside of NixOS.
also I’ll point out that many other Linux distros, I think, recommend doing a full system backup even immediately after installation. The “grep history” approach is not very reproducible either: e.g. apt installing a package today will default to the newest version, which didn’t exist a year ago when you last ran that same command.
with NixOS, the states of all the config files are collected into the Nix configuration, which you can modify manually. And if there’s something that can’t be handled through that, I think the common solution is to isolate the “dirty” environment into a VM or some other sort of container, which I think comes with NixOS
(and there’s always going to be “data” which isn’t part of the “configuration” … which can just be used as a configuration for individual applications)
assuming you have never used anything except apt commands to change the state of your system (and are fine with replaying superfluous changes, e.g. apt install foo && apt remove foo)
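To illustrate the “superfluous changes” point: if you replayed your whole command history, an install immediately followed by a remove is wasted work, and nothing in the history records which versions you’d get. A rough sketch (the helper name and tuple format are made up for illustration):

```python
def net_packages(history):
    """Replay an apt-style history of ('install'|'remove', package)
    actions and return the set of packages left installed at the end.
    Note what is *missing*: no versions -- replaying "install foo"
    today fetches whatever is newest today, not what you got back then."""
    installed = set()
    for action, package in history:
        if action == "install":
            installed.add(package)
        elif action == "remove":
            installed.discard(package)
    return installed

# "apt install foo && apt remove foo" nets out to nothing:
print(net_packages([("install", "foo"), ("remove", "foo")]))  # set()
```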
it’s replicable and “atomic”, which for a well-designed modern package manager shouldn’t make that noticeable a difference. But when it’s applied to an operating system à la NixOS, you can (at least in theory) copy your centralized, exact configuration to another computer and get an OS that behaves exactly the same and has all the same packages. You can also back up the system state with only a few dozen kilobytes of config files instead of having to back up the entire hard drive (well, assuming the online infrastructure needed to build it in the first place continues to work as expected), and probably roll back a bad change much more easily
Rules of thumb can be very useful for a relatively inexperienced programmer, and once you understand why they exist you can choose to ignore them when they would get in the way. Clean Code is totally unhinged though
Actually I think he has already had an adequate amount of recognition:
“In 1999, Red Hat and VA Linux, both leading developers of Linux-based software, presented Torvalds with stock options in gratitude for his creation.[29] That year both companies went public and Torvalds’s share value briefly shot up to about US$20 million”
his autobiography is in several hundred library collections worldwide
Awards he’s received:
2 honorary doctorates
2 celestial objects named after him
Lovelace Medal
IEEE Computer Pioneer Award
EFF Pioneer Award
Vollum Award
Hall of Fellows of the Computer History Museum
C&C prize
Millennium Technology Prize
Internet Hall of Fame
IEEE Masaru Ibuka Consumer Electronics Award
Great Immigrants Award
for a large project, you can probably look at the history of issues; if there are lots of issues that are 5 years old, it’s almost certainly legit
All 9k stars, 10k PRs, 400 forks & professional web site are fake?
Technically, it is entirely possible to find a real existing project, make a carbon copy of its website (there are automated tools for this), then have a massive number of bots give it 9K stars and generate lots of PRs, issues and forks (bonus points if these are also copies of actual existing issues/PRs), plus a fake commit history (entirely possible with git); a bunch of releases could be quickly generated too. Though you would probably notice pretty quickly that the timestamps don’t match, since I don’t think GitHub features like issues can have fake timestamps (unlike git commits)
though I don’t think this has ever actually been done, there are services that claim to sell not only stars but issues, pull requests and forks too. Assuming the service is not just a scam in itself, any cursory look at the contents of the issues etc. would probably give away that they are AI-generated
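On the “git timestamps can be faked” point: backdating commits takes nothing more than two environment variables. A sketch (the helper is hypothetical; it assumes `git` is on your PATH):

```python
import os
import subprocess

def backdated_commit(repo, message, fake_date):
    """Create a commit whose author and committer timestamps are
    whatever string you pass in -- git records them without question,
    which is why commit dates prove nothing about a repo's real age."""
    env = dict(
        os.environ,
        GIT_AUTHOR_DATE=fake_date,
        GIT_COMMITTER_DATE=fake_date,
    )
    subprocess.run(
        ["git", "-C", repo, "commit", "--allow-empty", "-m", message],
        env=env,
        check=True,
    )
```

For example, `backdated_commit(path, "initial commit", "2011-04-01T12:00:00 +0000")` makes a repo look years older than it is, whereas GitHub’s own event timestamps (issues, PRs, releases) are assigned server-side and can’t be forged this way.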
looks like work on the android client started in 2011 (or at least, that’s when it seemingly started using version control)
the app was released in 2014
so it has likely inherited decisions from ~14 years ago, I’d guess there is a several year gap where having a native desktop app was not even a concern
Also, the smartphone landscape was totally different back then: Qt’s Android support was in alpha (or totally nonexistent, if the Signal project is a bit older than the GitHub repository makes it seem), and the average smartphone had extremely weak processing power and a tiny screen resolution by today’s standards. Making the same GUI function on both desktop and mobile was probably a pretty ridiculous proposition.
what’s wrong with them? are you sure it’s not just set to use 100% of all cores, with the OS then doing some shuffling?
the “will linearly speedup anything [to the amount of parallel computation available]” claim is so stupid that I think it’s more likely they meant “only has a linear slowdown compared to a basic manual parallel implementation of the same algorithm”
if all 330 million 'mericans tried living in the woods, shit would go south really quickly
Is there something I should be keeping an eye out for, or preparing for so everything goes smoothly at least with regards to this community?
On the 6th of May, 2028, travel to 2 Augusta Hills Drive, Bakersfield, Kern County, California, United States. At exactly 4 PM local time, place an orange traffic cone on top of the nearest garbage can and await further instructions.
ironically, suyu is still up on GitHub
and of course Ryujinx hasn’t received any legal threats yet
YouTube actually uses 128 kbps Opus, which should be significantly better than 160 kbps MP3
but the real problem is that you can’t know what quality the uploader used; it all gets recompressed by YouTube.
They could do it without recompilation, but something like changing the obfuscation and recompiling for every copy would likely make it much harder to get rid of the watermarks, even if you can compare several different copies
(though they could also have multiple watermarked sections, so that for any group of, say, 3 copies there would be some section that is identical across them but still watermarked, uniquely identifying all three leakers. The number of combinations you need to distinguish grows quickly, but the watermark payload only grows with its logarithm: with, say, 1000 review copies and resistance against 4 copies being “merged”, you only need to distinguish between 1000^4 combinations, so you can theoretically get away with a watermark that contains only about 40 bits of data)
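The arithmetic behind that last estimate, as a quick sketch (the function name is made up; this mirrors the loose copies^collusion bound from the comment rather than the tighter binomial count):

```python
import math

def watermark_bits(copies, collusion):
    """Bits of watermark payload needed to give every possible merged
    group of up to `collusion` leakers (out of `copies` review copies)
    a distinct identifier, using the loose copies**collusion bound."""
    return math.ceil(math.log2(copies ** collusion))

print(watermark_bits(1000, 4))  # 40, matching the ~40-bit estimate above
```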
it is difficult to get a man to understand something, when his salary depends upon his not understanding it
if you can’t connect to a VPN using only open source software, that’s a crappy VPN