3:42 this is a really insightful realization. I wish society would learn it; we keep becoming more totalitarian, piling up rules to catch bad behavior and using them as an excuse to scoop up all the data from citizens. We're trading freedom for security and ending up with neither.
@grokitall · 4 months ago
Robert Heinlein has a character in The Moon Is a Harsh Mistress who says "you make whatever rules make you feel comfortable, and if I need to break them I will do my best not to get caught". Paraphrasing here, as I don't remember the exact wording.
@codingblues3181 · 4 months ago
Secular liberalism is as authoritarian as any other, just with different mechanisms.
@jshowao · 4 months ago
I blame Jia Tan every time something breaks when I use Linux
@vilian9185 · 4 months ago
lmao gonna do that too
@TomJakobW · 3 months ago
DINKLEBEEEEEERG!
@BabylonianBaboon · 4 months ago
Pity they never addressed the chance that there are already a lot of successful attacks like the XZ one, active and in use... undetected...
@unaimillian · 4 months ago
The pity nobody cares about is the number of attacks that target proprietary software, since its source code and executables are observed by far fewer people
@Raspredval1337 · 4 months ago
@@unaimillian exactly
@BabylonianBaboon · 4 months ago
@@unaimillian sure. It's an age-old argument. But why use it to ignore the threat? We can control the OSS/FOSS dev process but not the closed-source one... The XZ attack shows how the attackers play the long game and are skilled at hiding backdoors in "plain sight"
@autohmae · 4 months ago
@@unaimillian the executables are observed a lot; updates especially have people looking at the changes compared to the previous version, trying to figure out how to break into systems.
@angrydachshund · 4 months ago
Agreed, the whole thing is cope, because the FOSS world is already dead; they just can't admit it yet.
@Turalcar · 4 months ago
For every attack like this in open source there are a dozen (I made up the number) in closed source that go undetected for years.
@gusthomas6872 · 4 months ago
you made up the number because it is unknowable! assuming a large number is probably good in that case
@jfbeam · 4 months ago
For every one _known_ in OSS, there are many _UNKNOWN._ Have you audited every line of code for every package installed on your Ubuntu desktop? (I didn't think so. Nobody has; not even the people who would know what they're looking at.) Just as there are many unknown backdoored commercial programs. Of the two, one is much, much harder to detect, but neither is trivial.
@dansanger5340 · 4 months ago
Another example of the denialism in the open-source world.
@philipstephens5960 · 4 months ago
@@jfbeam Well, what SHOULD be happening is that every pull request goes through a thorough code review. Even imperfect code reviews should increase the chances of catching malicious code.
@autohmae · 4 months ago
@@jfbeam actually, even "many" is an unknown
@jackthatmonkey8994 · 4 months ago
It feels similar to logging to me. If you keep logs but don't audit them, what do those logs do? Relying on open source in a context where security really matters feels much the same to me, however daunting that implication is. Though I'd rather live in a world where "it's open source so it's safe" holds up. I don't use arch btw
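The logging analogy above can be made concrete: a log nobody reads catches nothing. A minimal, illustrative sketch of auditing auth-style logs for repeated failed logins — the line format and the threshold are assumptions for illustration, not anything from the talk:

```python
# Minimal sketch of actually auditing logs instead of just keeping them.
# The sshd-style line format and the threshold are assumed for illustration.
from collections import Counter

def flag_repeated_failures(lines, threshold=3):
    """Return {ip: count} for IPs with at least `threshold` failed-login lines."""
    fails = Counter()
    for line in lines:
        if "Failed password" in line:
            # assumes '... Failed password for <user> from <ip> port <n>'
            ip = line.rsplit("from", 1)[-1].split()[0]
            fails[ip] += 1
    return {ip: n for ip, n in fails.items() if n >= threshold}
```

Run something like this over yesterday's log on a schedule; otherwise the log is write-only and the analogy to unreviewed source code holds.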
@whohan779 · 4 months ago
You should really give Arch a try. 😉
@sergeykish · 4 months ago
Visibility. For example, the AUR has more visibility than instructions on a forum, and there are people who review PKGBUILDs. Same all the way down.
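Reviewing a PKGBUILD is mostly reading it yourself, but some red flags can be screened automatically. A hypothetical sketch (the patterns are my own illustration, not an official AUR tool):

```python
# Illustrative screen for PKGBUILD red flags, e.g. build steps that fetch
# and execute code at build time. Patterns are assumptions for illustration.
import re

RED_FLAGS = [
    r"\bcurl\b.*\|\s*(ba)?sh",   # pipe-a-download-straight-to-shell
    r"\bwget\b.*\|\s*(ba)?sh",
    r"\beval\b",                 # dynamic execution of constructed strings
]

def suspicious_lines(pkgbuild_text):
    """Return (line_number, line) pairs matching any red-flag pattern."""
    hits = []
    for n, line in enumerate(pkgbuild_text.splitlines(), 1):
        if any(re.search(p, line) for p in RED_FLAGS):
            hits.append((n, line.strip()))
    return hits
```

A screen like this is a pre-filter for the human reviewer, not a replacement: the XZ backdoor is exactly the kind of thing a pattern list misses.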
@Dragon905 · 4 months ago
You can use logs and open code if, for example, you are suspicious of a behaviour on your system. Of course logs don't do anything on their own, but in case of doubt, reference the logs or the code!
@bloodyorphan · 4 months ago
Using smaller teams for security and fidelity reasons was the answer for Solaris; it's less true for Microsoft these days. I think the better answer is better quality assurance, especially for non-standard ports in the base kernel etc., before a release candidate can be considered for actual release. You know the default ports for every service, so make sure spurious ports are not enabled before publishing. You can automate something like that and have a much greater sense of security as a result, keeping the hours down for said smaller team. Bless Ya's **EINSTEIN** ;-)
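The automated check described above could look something like this minimal sketch: connect to each port on the candidate system and flag listeners that aren't on the documented allowlist. The allowlist values and scan range are made-up examples:

```python
# Sketch of a pre-release check: flag TCP listeners that are not on the
# documented allowlist. ALLOWED_PORTS and the scan range are illustrative.
import socket

ALLOWED_PORTS = {22, 80, 443}  # assumed default services for this example

def unexpected_open_ports(host="127.0.0.1", scan_range=range(1, 1024),
                          allowed=ALLOWED_PORTS):
    """Return ports that accept a TCP connection but are not allowlisted."""
    unexpected = []
    for port in scan_range:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.05)
            if s.connect_ex((host, port)) == 0 and port not in allowed:
                unexpected.append(port)
    return unexpected
```

Wired into a release pipeline, a spurious listener then fails the build instead of shipping.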
@ZombieLincoln666 · 4 months ago
Linus is so smart
@Nilruin · 4 months ago
I love how nobody in frame in the audience is looking at the stage and paying attention.
@jamesclark7380 · 4 months ago
I think it's more of a monkish respectful bow thing.
@TomJakobW · 3 months ago
@@jamesclark7380 it’s a weird angle; looking up for so long is tough on the neck. In either case, hearing the words is more important than seeing two mouths move.
@TeaTree-e8y · 4 months ago
Would love to hear some of your thoughts and advice on repurposing content. You seem to have a really good pipeline here, and I'm attempting something similar using the newer tools.
@nixigaj11 · 4 months ago
sauce: kzbin.info/www/bejne/mYHZg3yNjbOal68
@brunoferreira1125 · 4 months ago
thanks bro
@evikone · 4 months ago
Trust? Among the programmers of the 70s and 80s (even the 90s), a prevailing idea was to never trust anything; being a paranoid programmer was a good thing. We know the first rule of security: any security is "only as good as its weakest link", and the weakest link is always trust. These days everything is moving to such a high level that no one is paranoid anymore, and everyone trusts everything so long as it appears to be trustworthy. That's my snazzy sassy.
@elalemanpaisa · 2 months ago
As long as people are involved somewhere, there is risk. No macOS, no Linux, no BSD, no Windows will ever be secure
@notthere83 · 4 months ago
I understand that one has to be able to trust maintainers. But maintainers merging malicious code reminds me of something I've been advocating at companies for years: never approve/merge a PR that you don't understand! It is preposterous that many people believe they should be able to just trust someone else's code. Usually this is of course about introducing bugs rather than malicious code, but the principle is the same. And then there are those who argue that that's an unreasonable burden on open source maintainers. To which I would say: if you don't want to accept this basic responsibility towards society, you should probably just work privately on your fun projects that may or may not cause harm.
@kazioo2 · 4 months ago
Even experienced programmers don't just see all the bugs in code. A carefully crafted bug hidden in code can be easily missed even by an expert who understands it all.
@notthere83 · 4 months ago
@@kazioo2 Just because something can happen doesn't mean that you shouldn't make a reasonable effort to avoid it.
@notthere83 · 4 months ago
@@kazioo2 Automated testing is of course essential too. Yes, something can slip through here as well. But - see above.
@foldionepapyrus3441 · 4 months ago
When you are talking about a project as complex and diverse as the Linux kernel, no one person, however gifted, could really hold all the interactions of the current version in memory and know there isn't a carefully crafted bit of malicious code sneaking in as they review a submission. At some point with a complex project, you can't spend a week or four figuring out every detail of every interaction of this one submission out of the 40 that have come in; all you can do is check that the code standard and comments are up to scratch. You simply have to trust the fellows who have earned it for anything to happen at all. Which is still better than trusting the fellow programmers employed by your company - those people are there for the wage, and the industry standard seems to mean being frequently treated poorly, ignored by the management and marketing/sales folks, put under insane time pressure, etc. - so they just don't care about anything but getting paid. That makes them ripe targets for getting paid more by slipping something 'useful' in. Most open source developers, by contrast, do it because they also use and care about what they are developing (hopefully they are still getting paid as well).
@Hagaskill · 4 months ago
Did you listen to the audio?
@TheRealStevenPolley · 4 months ago
Who is Jia Tan, really?
@TheGeorey · 4 months ago
The friends we made along the way
@ecereto · 4 months ago
Most probably not a real person, but a state-sponsored group.
@Daydream_Dynamo · 4 months ago
GUY/GIRL behind the XZ Security breach Fiasco!!!!
@angusmacgyver · 3 months ago
The one who glows in the dark.
@jfbeam · 4 months ago
"Didn't do it very well." Bull. F'ing. Shit. They did it expertly; that's why it pissed Linus off so much. So much so that EVERY patch EVER submitted from ANYONE with that university's domain was pulled, despite the researchers not using university email addresses (for the record, they used gmail). They reacted _entirely_ out of spite. NONE of their bad code ever actually made it into the mainline kernel; they proved their point by getting maintainers to accept their bad code, and upper levels blindly accepted the commits from those maintainers. If I want to test how blindly you trust shit, I'm not about to tell you beforehand. They proved it was possible - even trivial - and that it could have already happened. Sure, these things "get caught"... at some point in time, through sheer, random f...ing luck. How many of these sorts of things have people NOT randomly stumbled into? That we don't know. And that's the point.
@timoruohomaki · 4 months ago
So what would be a better way to do all that at that scale? Of the original UMN study, one patch did make it into the repositories but didn't cause any harm, according to K-H. A large number of the patches were generated with software they had developed in an earlier project; it fixed some bugs but was otherwise not very high quality code. It wasn't really a big loss for all of that to be reverted.
@coversine479 · 4 months ago
I’m pretty sure he meant they didn’t do it well in an ethical sense, not a technical sense
@grokitall · 4 months ago
@@coversine479 there are ways to do such research which protect both the subjects and the researchers. they ignored all of them, acting like bad actors. upon discovery, they got roasted for it, and so did every link in their chain of supervision. in the meantime, the kernel was left with the only choice being to block every submission from them until they could audit the credibility of the contributors. the reason the maintainers were pissed is that maintainer time is always less than what is needed, and they wasted a lot of it. if they had done the study properly, they could have said upon being discovered: so and so knew, and here is the list of patches from only these email addresses. but that process took way too long.
@vilian9185 · 4 months ago
lmao it wasn't him who banned the university; read the mailing list. the university pissed off the devs there too
@grokitall · 4 months ago
@@coversine479 no, he meant both. the idea of doing the study was fine, but there are a number of ethical and technical steps that should be taken prior to starting it which they completely failed to even consider. the first of which is: should we even do it, and if so, what rules should we set up? the standard way to do this is for the university to look at the size of the project and see if it is big enough to absorb any potential harm caused by the study, and to document the potential harm prior to beginning so as to minimise it when setting the rules of engagement for the study. they did not do this. as this was a code study, the next step should have been to find someone connected to the project who did not do code review, who could be a point of contact and keep a full audit trail of all the submissions. they did not take either step, as far as i have been able to discern. this is what pissed off the devs: having discovered someone looking like a bad actor, and traced them back to the university, it was for a while impossible to determine whether it was a student or faculty, and whether this was a one-off or systematic. this is what caused the fallout. yes, they blocked the gmail account, but they should then have been able to ask the developer what was going on and get a reply of "here is what we were doing, these people knew about it, and here is every patch involved". they could not do any of that, so the university got blocked until that information could be independently created and confirmed, at which time the university got unblocked. they implemented the study protocols so badly that they were not only technically bad and ethically questionable, but because hacking is illegal to some extent in most countries, their behaviour skirted being criminal. all of these problems would have been caught if a proper review had been done by the university legal and ethics board prior to starting the project.
not doing so not only slimed themselves, but brought the university into disrepute for allowing it to happen.
@elalemanpaisa · 2 months ago
this poor guy gets bothered all the time to do something he hates, which steals the time he'd spend on what he loves.
@MarkHall-cf6ji · 4 months ago
3:15 is such bullshit. He says having no rules is good because attackers don't follow rules, but they got mad when a school tried to submit malicious code - and that's exactly what bad guys will do too.
@vilian9185 · 4 months ago
people were mad about being tested on. also, he was talking about trust, and the university broke that trust. the university also pissed off the devs in the mailing list, so it's their fault
@MarkHall-cf6ji · 4 months ago
@vilian9185 But he contradicted himself by talking about the benefits of having no fixed rules, because the bad guys are likely to use those rules to their advantage. Well, that's exactly what the school did to the Linux dev process, which is only fair imo.
@MarkHall-cf6ji · 4 months ago
@vilian9185 He then twisted it into a positive by pointing out that despite having no fixed rules, the devs were able to detect all attempted attacks. This is survivorship bias, as they're only aware of the attacks they've detected. For all they know, the Linux kernel could be teeming with backdoors submitted by bad guys that they just haven't noticed yet.
@vilian9185 · 4 months ago
@@MarkHall-cf6ji fine then, good luck creating rules that only make the devs' job harder and don't stop anyone malicious
@XGD5layer · 4 months ago
@@MarkHall-cf6ji rules are only needed for cohabitation; they don't matter in open source, but they will impact engagement