Data Protection Regulations - GDPR, CCPA, CPRA, LGPD, STFU (with special guest COPPA)

What is the "ideal" method of data regulation?

  • The GDPR as it is now.

  • The CCPA

  • The CPRA as it is now.

  • The LGPD as it is now.

  • Transparency of data collection and use + more agency over automated data collection. Nothing else.

  • Transparency of data collection and use. Nothing else.

  • Fuck regulations, let them do whatever they want.

  • Other (post in thread)



ScatmansWorld

kiwifarms.net
I've been doing a lot of thinking recently about these data protection laws, and more and more I can't help but feel unnerved about certain overreaches of power that could end up ruining things in the future. Perhaps the best recent example of this I've found was a post over on the Knockout.chat forums, where Garry Newman explains why he closed the Facepunch forums, and also why he couldn't just give out an archive of them. (The "data protection laws" in his post refer to the GDPR.)

[Screenshot: Garry Newman's post explaining why the Facepunch forums were closed and why an archive couldn't be released]

This should be setting off alarm bells in everyone's head over the dangers this regulation could mean for the future of digital archiving, the transfer and sharing of information, and the internet in general. But the bigger question is: why were these regulations seen as necessary in the first place, and why is the general public, for the most part, accepting of them?

From what I can tell, the creation of these laws derives from three main concerns within society that have only grown with time and the advancement of technology.

1. Businesses and organizations don't show enough transparency over what data they collect from you, what specific data they've already collected, and what they do with that data.

This is the one talking point of these regulations that I am 100% on board with. While I mostly lean toward the libertarian side of issues, one of the greatest aspects of government (ideally) is ensuring truthfulness and punishing falsehoods, whether it's an outright lie or a lie by omission. To use a very exaggerated example, a meat packaging company can't serve you beef laced with shards of glass and then use the excuse "We never said there WEREN'T shards of glass in the meat!" Even if consumers catch on to the lie and stop buying from them, the damage of the lie by omission has already been done. Similarly, Facebook, Google, Apple, etc. shouldn't be allowed to collect data from you that you didn't even know about, and then claim "We never said we WEREN'T collecting hours of voice recording data from you and selling it to advertisers!" Anyone who handles data recorded from other users should make it known in a clear and concise manner what they use it for, and if they're caught in a lie, they should face consequences. Moreover, users should have view access to all data collected from their specific account, including all identifying data, voice recordings, video footage, and GPS tracking. Whether or not they should know who their data was sold to is iffy and could itself be a breach of privacy laws, so I don't have any conclusive thoughts on that.
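To make the "view access to all collected data" idea concrete, here's a minimal sketch of what a per-account data export could look like. Everything here is hypothetical (the record structure, field names, and the in-memory "database" are all my own illustration, not any real company's API), but it captures the principle: one request returns a complete, readable dump of everything held on one account.

```python
from dataclasses import dataclass, field, asdict
import json

# Hypothetical record of everything a service holds on one account.
@dataclass
class CollectedData:
    account_id: str
    identifying_data: dict = field(default_factory=dict)
    voice_recordings: list = field(default_factory=list)  # e.g. clip references
    gps_points: list = field(default_factory=list)        # (lat, lon, timestamp)

def export_user_data(store: dict, account_id: str) -> str:
    """Return everything collected for one account as readable JSON.

    This is the 'view access to all data collected from their specific
    account' idea from the post: one request, one complete dump.
    """
    record = store.get(account_id)
    if record is None:
        return json.dumps({"account_id": account_id, "data": None})
    return json.dumps(asdict(record), indent=2)

# Usage: a tiny in-memory "database" standing in for the real thing.
store = {"u1": CollectedData("u1", {"email": "a@example.com"},
                             ["clip-001.ogg"], [(51.5, -0.1, 1700000000)])}
print(export_user_data(store, "u1"))
```

The point of the sketch is that none of this is technically hard; whether companies expose it is purely a policy question.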

2. Businesses and organizations are using automated methods of collecting data from users, methods over which users have little to no input or agency.

I want to emphasize the "automated" part of that concern, because I believe this is where the biggest fears of data collection are coming from, and what these regulations are ultimately fueled by. When you create a post on Facebook, you are making a conscious decision to post specific information to Facebook's database and have them share it with whoever they wish. On the other hand, when you ask Google Nest or Amazon Alexa a question, most people aren't aware that it creates a recording that is saved in a database and could be used however the company wishes, or that the GPS on your phone can let Facebook track your movement. In the United States, this also raises many questions regarding the Children's Online Privacy Protection Act (COPPA), such as: what happens when someone under 13 has their voice recorded by a home assistant? How can you enforce COPPA when these automatic methods of data collection can so easily collect information about children without even intending to? Of course, you could argue users DO have agency in these situations, since they can simply not use a product whose terms of service or methods of data collection they disagree with, but those methods are being employed by basically every major technology manufacturer and producer, leaving many to feel trapped in an endless state of surveillance. One solution already used by data regulation laws is to require all non-essential methods of data collection to be opt-in choices, so users can feel somewhat more secure in their privacy.
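That opt-in approach can be sketched mechanically: every non-essential collection purpose is gated on a consent flag that defaults to off, so nothing extra is gathered unless the user explicitly opts in first. This is a hypothetical sketch (the purposes, class names, and storage are my own illustration, not any vendor's actual consent system):

```python
# Hypothetical list of non-essential collection purposes; all default to
# off, matching the opt-in model described above.
DEFAULT_CONSENTS = {
    "voice_recording": False,  # home-assistant audio retention
    "gps_tracking": False,     # location history
    "ad_analytics": False,     # sharing with advertisers
}

class ConsentRegistry:
    """Tracks which purposes each user has explicitly opted in to."""

    def __init__(self) -> None:
        self._consents: dict[str, dict[str, bool]] = {}

    def opt_in(self, user_id: str, purpose: str) -> None:
        self._consents.setdefault(user_id, dict(DEFAULT_CONSENTS))[purpose] = True

    def allowed(self, user_id: str, purpose: str) -> bool:
        # Unknown users fall back to the all-off defaults.
        return self._consents.get(user_id, DEFAULT_CONSENTS).get(purpose, False)

def maybe_record_voice(registry: ConsentRegistry, user_id: str,
                       clip: bytes, storage: list) -> bool:
    """Store a voice clip only if the user opted in beforehand."""
    if not registry.allowed(user_id, "voice_recording"):
        return False  # discarded, never written to the database
    storage.append((user_id, clip))
    return True
```

The key design choice is that the default is refusal: forgetting to check consent, or a user never touching the settings, results in no collection rather than silent collection.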

3. There's a fundamental lack of trust between data collectors and their users.

This lack of trust is felt both by those who know very little about technology and by those who have a considerable understanding of it. It's the reason many desktop and laptop users keep a plastic flap or a piece of tape stuck over their webcam. Even if companies like Amazon, Google, Facebook, Microsoft, etc. swear to every god there is that they only collect information in specific circumstances, their words will feel hollow to a vast portion of their userbase. How can you be sure that Alexa ISN'T always recording just because Jeff Bezos says so? How would you investigate the issue when all the incriminating evidence could be wiped with the press of a button? Should someone be constantly monitoring Amazon's database just to make sure there's no illegal collection of data happening? I'm not sure how you'd solve this, and I think anything involving government oversight would just create a horribly bloated and inefficient system of bureaucracy.

With all that being said, there is one extremely important rule we should all agree on for the sake of the internet's future, and unfortunately it seems everyone's already disregarded it.

[Screenshots: "right to erasure" provisions]


YOU DO NOT "OWN" ANY PERSONAL INFORMATION YOU GIVE OUT WITH YOUR CONSENT.

No, you should not be allowed to retroactively delete any information about yourself from any internet site just because you want to. Information is not a copyright. The knowledge that I have a specific brand of furniture in my house or that I ate 3 slices of pizza today is not fucking proprietary. You should not be able to demand that a company not sell your information when you've agreed to all their terms of service and given it to them firsthand. Online archives should not have to be constantly maintained databases just in case someone wants to delete something they said 10 years ago. When you type a post or fill out a form on the internet and hit that little "send" button, you are giving that website full access to the data you've inscribed with your own two hands. At the end of the day, YOU ARE THE ONE RESPONSIBLE FOR YOUR OWN INFORMATION.

There can be no free and open internet in a world where all information is proprietary. I've seen people fearful about Article 13 or the possible end of Section 230 and how those will be the death of the internet, but meanwhile no one is paying attention to these changes happening right under our noses. This so-called "right to erasure" is Article 13 on steroids: the ability for anyone to point to any piece of information on the internet they've posted, or that's been posted about them, and memory-hole it forever. Worst of all, I feel as if any attempt to speak out against this will be hopeless, because the general public won't care. All they'll hear is that they'd be given the authority to permanently smite all their embarrassing photos and moments off the face of the earth, and they'd gladly accept that power regardless of the overall consequences. With the CCPA already enforced and the CPRA recently passed at the ballot and to be enforced by January 2023 (along with the creation of the California Privacy Protection Agency), it seems as if things might already be too late for us in America as well. The only saving grace of California's laws is that, unlike the GDPR, they don't apply to non-profit organizations, but who knows if that too will be changed in the future.

In today's society, citizens and government officials act as if these businesses' interest in information is some dangerous, unheard-of paradigm shift caused directly by the advancement of technology and recording hardware, but the truth is that the interest has always been there. In the old world, if a business wanted analytics about its customers, it would walk around requesting information via physical surveys, maybe offering a free burger or something to encourage participation. Services like Facebook and Twitter have simply replaced surveys as the most efficient way for these businesses to solve that problem, as advances in tech tend to do. Why would anyone spend time walking around malls and cities for information when you could just pay Facebook a lump sum of cash and have all the analytics you'd ever need?

I don't know what the future holds, but I can only hope that cooler heads will prevail and that we find some happy middle ground that ensures the freedom of information while giving us confidence that nothing is being taken from under our noses. Whew, that's enough rambling from me.

TL;DR: These regulations happen because of a lack of trust in corporations and in the automated methods used to collect information, methods we have less and less agency over. We might need to do something to alleviate these concerns, but it shouldn't come at the cost of free and universal information, or of crippling our systems with bureaucracy. Also fuck the NSA, just because.
