System76 on Age Verification Laws
(blog.system76.com)
Most System76 employees installed operating systems and created accounts on their computer when they were under 18. They did this out of curiosity. Many started writing software. Some were already writing operating systems. I’m sure the story is similar at most tech companies. Limiting a child’s ability to explore what they can do with a computer limits their future. Removing user limitations to the computer (proprietary software, locked-down platforms like Android and iOS) is why System76 exists.
Fuck yeah. I was pirating software before I turned 18, and the world is a better place for it.
(And I contribute to open source now, in the hope that the next generation can learn without needing to resort to piracy.)
Lovely article. All the context up top perfectly leads to the buried lede at the end:
We are accustomed to adding operating system features to comply with laws. Accessibility features for ADA, and power efficiency settings for Energy Star regulations are two examples. We are a part of this world and we believe in the rule of law. We still hope these laws will be recognized for the folly they are and removed from the books or found unconstitutional.
Just forbid any user from any place that has these laws from using your OS through your ToS but don't implement a way to check who uses it.
Wow, that's a refreshing take on this whole stupidity. AI can identify us anyway, leave us alone!
I and many others I know who grew up with unrestricted internet access (before and after the corporatization of the internet) were exposed to terrible shit. Like, I grew up with unusually tech-savvy parents who were able to protect me from the worst of it, but even I have been somewhat traumatized by accessing graphic content I shouldn't have. I personally know people with worse parents who grew up browsing shock/gore websites and who were repeatedly groomed and abused by pedophiles.
Honestly, I don't really get the backlash to this legislation, beyond that it's perhaps being applied to devices it shouldn't be. It's a local, safe option for reducing child access to things they shouldn't access. While yes, freedom is important, we're talking about providing the option to limit access to mature content, not preventing them from downloading Python or using the internet. There is a justified reason for wanting this, and this seems like the ideal way to do it.
Edit: I'm genuinely confused as to why people are against this. All the arguments sound like they're treating this as another variant of ID collection or AI tracking. From my understanding, this is an optional flag, set locally by the user, about as decentralized and pro-user-choice as it gets. I'm going to reread the law to make sure I'm not missing something.
Edit 2: Reading the law, section 4a seems impractically vague, but in favour of blocking data collection. From my understanding, that would ban the use of things like user agents and theme settings in browsers. Notably, the law also specifies fines for both accidental and intentional data sharing. This seems like about as good an option as you can get for protecting children, while keeping it user-choice driven, decentralized, and anonymous.
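To make the mechanism being debated concrete, here is a minimal sketch of a purely local, user-set age signal as described above (an optional flag, set locally, defaulting to adult, with apps able to read only the bracket). All names, brackets, and the query API are hypothetical illustrations, not taken from the actual statute:

```python
# Hypothetical sketch of a device-level age signal. The class names,
# bracket values, and API below are illustrative assumptions, not the
# real California law or any shipping OS interface.
from enum import Enum

class AgeBracket(Enum):
    UNDER_13 = "under_13"
    AGE_13_15 = "13_15"
    AGE_16_17 = "16_17"
    ADULT = "18_plus"

class DeviceAgeSignal:
    """A local, user-set flag; nothing leaves the device but the bracket."""

    def __init__(self):
        # Default to adult, so doing nothing discloses nothing unusual.
        self._bracket = AgeBracket.ADULT

    def set_bracket(self, bracket: AgeBracket) -> None:
        # Set locally by the user (or a parent); optional and unverified.
        self._bracket = bracket

    def query(self) -> AgeBracket:
        # Apps may read only the coarse bracket: no ID, no birthdate,
        # no other device data.
        return self._bracket

signal = DeviceAgeSignal()
print(signal.query().value)   # prints "18_plus" (the default)
signal.set_bracket(AgeBracket.AGE_13_15)
print(signal.query().value)   # prints "13_15"
```

The point of the sketch is that, under this reading of the law, the only datum crossing the app boundary is a coarse, self-asserted bracket with an adult default.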
Edit 3: Actually, in combination with the CCPA, possibly COPPA, and California Civil Code, wouldn't this effectively work as a "tracking me is now illegal" switch?
It's a local, safe option for reducing child access to things they shouldn't access.
With the proposed measures in place, any app can know exactly which devices children are using, something no one can do now.
When you implement a feature, there's no way in the world you can guarantee only "good people" can use it, and malicious individuals are way more interested in getting info about children than anyone else.
That doesn't protect children, it puts them in even more danger than they are in now.
I mean, from my understanding, this would be both hyper-illegal and extremely impractical. You'd need a large enough site to lure users in, and to collect identifying information and republish it, but you can't draw enough attention to become a target for data poisoning (given that this flag is freely set by the user) or for law enforcement. It seems like this would be unlikely enough that the benefit gained from having this flag would far outweigh the risks, especially on the modern, hyper-corporate internet.
Individual sites will have their data leaked then aggregated by data brokers. Those data brokers both sell the aggregated data and experience data leaks themselves. The data keeps moving from actor to actor while the aggregation is continued until eventually finding its way into a public repo or security researchers' data sets.
This is a compelling argument, but do you think it's really a significant attack vector? It's already illegal to share or leak (even unintentionally) this data, and from my understanding, if you choose to set your age to a lower bracket via this process, companies sharing (also collecting? Currently unclear on this.) this data would also break the CCPA and possibly COPPA, and from my understanding, the companies are required to provide additional data privacy measures under the California Civil Code.
Yes, these laws will be broken, but will it be on a significant enough scale, and with reliable enough information, to be worthwhile? Like, since this bans the use of data from those who set their age low, wouldn't this likely reduce the data collection pool overall, not to mention incentivizing adults to poison this data? For those who do illegally collect this data anyway, is it that much of an advantage compared to just asking the user's age upon reaching the site, as most sites currently do? Beyond that, when these sites operating illegally do leak their data, will that data be a realistic attack vector? Like I said to another commenter, collating data in this way seems extremely impractical and unreliable for predators. Wouldn't those who want to seek out children just go to existing spaces where they can connect directly, like Roblox or Discord? Like, don't get me wrong, I don't like data collection, but compared to everything else, this seems like a relatively unreliable and unhelpful data point, especially given all the legal restrictions.
Edit: also, would be interested to hear if your opinion changes if even storing this value is illegal, if unnecessary data collection as a whole is banned, and/or if this value has a legally defined default of using the 18+ value, and doesn't have to be made obvious in account setup.
Edit 2: Also, wanted to say thanks for responding genuinely and with a well-articulated argument. I know the Fediverse tends to be very... unfriendly... towards anything that may impact privacy and towards government regulation in general, so your civility is really appreciated.
would be both hyper-illegal and extremely impractical
Has that ever stopped criminals before?
illegal
Yes, in that they can be stopped if noticed. Police are incompetent, but if something is that bad, and draws enough attention, the person will generally be arrested.
extremely impractical
Yes, all the time. That's why safes, passwords, and similar exist. Or, more relevant in this case, the adage that the best way to avoid a break-in is to be a less appealing target than your neighbors. Roblox, Minecraft, Discord, and other platforms where kids gather and regularly self-identify are still going to exist, and they are far safer and far more appealing for targeted abuse of children. On the other hand, setting up a public website/app and trying to lure children to it is expensive, risky, and unlikely to succeed on the modern internet.
On the other hand, setting up a public website/app and trying to lure children to it is expensive, risky, and unlikely to succeed on the modern internet.
Right, when has any website become a platform where kids gather and regularly self-identify?
You're completely ignoring my argument. How many of these websites where children gather and self-identify are created and maintained by paedophiles specifically to prey on children? So far as I know, there has never been a site like this on the modern internet, let alone one that remains up and has been running for an extended period. I don't see any reason to expect this to change.
How many of these websites where children gather and self-identify are created and maintained by paedophiles specifically to prey on children?
In light of the Epstein files I would hesitate to say that number is zero. Nevermind that most such platforms are smaller than the giants you mentioned. Or that anyone working for or with kid-filled sites of any size could make it incidentally about preying on said kids. Apparently people manage when they're just anonymous users.
roblox
There is no benefit.
You can't glibly assert that people can just lie, so it's not a big deal - and then pretend it'll do the thing it's for. Which again, is a bad idea anyway, which this approach would not achieve, if it even worked. It's fractally stupid. It is dangerous bullshit, at every scale.
There is no benefit.
This is obvious hyperbole and you know it. Kids are stupid and vulnerable, and measures to protect them aren't useless. That said, I am open to the idea that this law isn't worth the cost. Basically every other age verification law (especially those based on user ID or AI) is very clearly not. I just haven't seen a compelling argument as to why this one isn't.
You can't glibly assert that people can just lie, so it's not a big deal - and then pretend it'll do the thing it's for. Which again, is a bad idea anyway, which this approach would not achieve, if it even worked. It's fractally stupid. It is dangerous bullshit, at every scale.
Okay, but why? You keep repeating that it's dangerous, limits freedoms, and causes privacy issues, but so far, the only argument I've seen is that it can help kids identify themselves, and given that it's handled locally and is unreliable, I don't see this being usable on any meaningful scale. Setting up a "free candy" website or app is going to be way less effective and way more dangerous than just creating a Roblox account. Is there something I'm missing?
Companies shouldn't even be allowed to demand more than a username and password, on any machine I could pick up and throw. Making anything beyond that a legal requirement is intolerable, in itself. My age is not this object's business. It sure isn't this website's business.
Stop excusing these intrusions against adult life, for the sake of children who will bypass them anyway. You know they will. You use the flimsiness of this alleged protection as an excuse for enabling it. There is literally no benefit if it doesn't fucking work. Even pretending the immediate goal is something you should want - this won't do that.
Companies shouldn't even be allowed to demand more than a username and password, on any machine I could pick up and throw. Making anything beyond that a legal requirement is intolerable, in itself. My age is not this object's business. It sure isn't this website's business.
Edit, because I forgot this part: I agree with this, but it isn't realistic, unfortunately. That said, even with this law, you can still make unnecessary storage or sharing of user data illegal.
Stop excusing these intrusions against adult life, for the sake of children who will bypass them anyway. You know they will. You use the flimsiness of this alleged protection as an excuse for enabling it. There is literally no benefit if it doesn't fucking work. Even pretending the immediate goal is something you should want - this won't do that.
I do know they will. The whole reason I'm even okay with this idea is because it is completely optional for the user. I don't see how it'll impact adult life. That is why I'm so confused at the backlash. It's asking for an option to increase user control and user choice over their experience. Hell, from my understanding, this would provide a means for users to make it actually illegal to collect any user data, but I need to re-read the CCPA to confirm this. It seems that the benefits of user choice provided by this option far outweigh the loss of one more fingerprinting metric, one that is illegal to share anyway.
If I had to take a photo of my genitals to sign into my own computer, promises against storage or sharing are not addressing my complaints about privacy. Asking my age is a lot less personal - but it's still information about me, which this object does not need.
'I'm only okay with this idea because I know it won't work' is, just, why are we even talking? What is the function of an argument when you're not listening to yourself?
This won't fix that.
we’re talking about providing the option to limit access to mature content, not preventing them from downloading Python or using the internet.
We're talking about stopping adults from using a computer without surrendering their privacy. Whatever excuses you make about that will not last. This is a flying leap down a slippery slope, and it won't even fucking work.
The California law is a local flag for age range. It's not a law that requires ID, or tracking, or anything else like that. Given that it's set by the user optionally and, from my understanding, is illegal to use for anything but age verification, I don't understand how this is that negative for privacy or freedom.
Edit: Also, setting the age accurately is entirely optional. I don't see how this impacts freedom either.
And it stops here. Yeah? Today is the end of history. Nevermind any resemblance to rampant demands for facial scans and government ID, just to use a website; this demand for every computer to be 18+ will never cause problems.
Have you ever taken a hint in your life.
This is a slippery slope fallacy. Just because the option is provided to self-identify age doesn't mean that it will be replaced with more complex and direct data collection (which I am against, if it wasn't clear) later, especially considering that if it's based on this law, that would be literally impossible. Section 4a bans the collection of data from your system besides age, and the fact that it is all handled locally and sharing it is prohibited means that it would be impractical to implement anything fancier than a text box to collect data. If anything, this looks like a way to be seen "doing something" without having to change anything for most users. Hell, if California wanted to implement a law for data collection, why would they have implemented the CCPA, why would they have written this law to ban the sharing of data, and why wouldn't they just write the data collection law instead, given (as you said) that there is already significant backing for the idea?
The worst-case scenario is already happening - aforementioned facial scans are not theoretical. Only their scope has been limited, and suddenly we're talking about legally-mandated age gating at an OS level.
Pattern recognition is a requirement for survival.
Many abuses start small so that people like you will let it happen. Some caveats only exist for you to point to while bickering with critics, and when you're not looking, they quietly vanish. Others were just empty words the whole time.
This law is not some compromise over widely-demanded change. It would be a pointless intrusion even if, by some miracle, it stopped right here. It will not stop here. Be serious. You lived through last year; you know the general state of everything. These exact companies have been spying on you. These governments sure aren't stopping them, for some mysterious reason. Scoffing about blindingly obvious expectations is a choice of comforting fantasy over worthwhile argument.
Okay, but should we not oppose laws about data collection and facial recognition in that case, rather than a law that implements an entirely separate, optional, user-driven approach? Saying this is bad because those are bad is no more an argument than saying the CCPA and GDPR are bad because the government wants to collect data. Your argument isn't against this law, or even the concept of having age verification in general. It's against government overreach as a broad concept. You're again relying on the slippery slope fallacy to say that because I'm okay with this one specific form of age gating, I'm okay with every other one, which I have repeatedly made clear is not true.
If you're going to reference the slippery slope fallacy so much, you should probably read where and when it actually applies.
From the wikipedia entry:
When the initial step is not demonstrably likely to result in the claimed effects, this is called the slippery slope fallacy.
You yourself just acknowledged that the worst-case is already happening, so the assumption that the worst case will continue to happen is reasonable.
Unless you wish to argue that:
The worst-case scenario is already happening
followed by you saying
Okay, but
isn’t an acknowledgement?
Mandatory OS integration is not separate, optional, or user-driven.
I have explicitly argued against this, in itself, for its own sake.
Under the other submission, I am even arguing against age verification in general.
But sure, let's talk about this on its merits, in a vacuum, like there's nothing else happening. What the fuck is it for? You endlessly insist it's super minor, barely an inconvenience, and obviously any idiot can bypass it. That is your defense. If you freely acknowledge all of the other efforts went too far and didn't work, why is this one worth trying? How is this encroachment on all operating systems not a waste of time, at best?
even I have been somewhat traumatized by accessing graphic content I shouldn’t have
Why did you access it if it made you feel bad? It is (and has been since I remember) very difficult to accidentally run across anything shocking on the Internet.
No one ever linked you to lemonparty, huh? No escalating chains of "hot singles in your area" ads? No, you know, human tendency to explore and pursue novel experiences?
OK, if someone actively links me to it, then yes, but there's also no solution to that because they could just send it (or a screenshot of it) directly to me and circumvent any filters there might be.
I've never clicked on a "hot singles in your area" ad, so no idea what that is about.
The entire Internet is of course IMHO about exploring and pursuing novel experiences; but how quickly do you imagine children can get from websites actively recommended by parents to shocking websites? Not very, I think?
It didn't take me long! I learned the shortcuts to hide what I doing from them and was pretty quickly the one being asked for tech tips. Plotting revolution and pirating media in IRC while mom thought I was playing "Where In The World Is Carmen San Diego?" >:)
Kids are way smarter than a lot of people want to admit. I would say more intelligent than adults on average, balanced out by lack of experience, of course. That's why I'm so against government measures to limit their exposure and experience, whatever the pretext. They are our future and they will surpass our capabilities; we're fucked as a species if they don't. They deserve our support, not disingenuous constraints or to be weaponized for fear mongering.
I definitely agree with all of that.
But if you "learned the shortcuts to hide" what you were doing, then you were clearly accessing things you actively wanted to see, which was my entire point.
Not like alt-tab is rocket surgery :p
What I wanted to see was "the world", you know? That drive to explore and pursue novelty we talked about? I think it's a pretty universal experience, and one companies have absolutely learned to prey on. I don't think yearning to know the unknown is quite equivalent to actively wanting to see anything specific, and you seem like a smart enough guy to be aware of the ways companies abuse that curiosity. That people, children or not, are only shown things they actively want to see is measurably, provably not true. We go down rabbitholes and off on tangents and towards intensity and in all kinds of directions that all kinds of people have all kinds of motivations for influencing.
Alright, I agree with you that modern "social media recommendation algorithms" are a bad thing that shouldn't have been invented, if that is what you're getting at.
Because I was a stupid kid and didn't realize that watching combat footage might be a bad idea. I thought I was just learning about military history. Same way kids don't realize they're being groomed, or don't realize that watching graphic horror movies might be a bad idea. Kids are dumb, and to be clear, I know you can't shield them from everything and parents are still the primary solution. Still, a local flag for age range seems like exactly the sort of tool that would help a parent to moderate access without limiting privacy or freedom.
