
Privacy

Internet Service Providers Plan to Subvert Net Neutrality. Don’t Let Them

In the absence of strong net neutrality protections, internet service providers (ISPs) have made all sorts of plans that would allow them to capitalize on something called "network slicing." While this technology has all sorts of promise, what the ISPs have planned would subvert net neutrality—the principle that all data be treated equally by your service provider—by allowing them to recreate the kinds of “fast lanes” we've already agreed should not be allowed. If their plans succeed, then the new proposed net neutrality protections will end up doing far less for consumers than the old rules did.

The FCC released draft rules to reinstate net neutrality, with a vote on adopting them scheduled for April 25. Overall, the order is a great step for net neutrality. However, to be truly effective, the rules must not preempt states from protecting their residents with stronger laws, and they must clearly find that the creation of “fast lanes” via positive discrimination and unpaid prioritization of specific applications or services is a violation of net neutrality.

Fast Lanes and How They Could Harm Competition

Since “fast lanes” aren’t a technical term, what do we mean when we are talking about a fast lane? To understand, it is helpful to think about data traffic and internet networking infrastructure like car traffic and public road systems. As roads connect people, goods, and services across distances, so does network infrastructure allow for data traffic to flow from one place to another. And just as a road with more capacity in the way of more lanes theoretically means the road can support more traffic moving at speed [1], internet infrastructure with more “lanes” (i.e. bandwidth) should mean that a network can better support applications like streaming services and online gaming.

Individual ISPs have a maximum network capacity, and speed, of internet traffic they can handle. To continue the analogy, the road leading to your neighborhood has a set number of lanes. This is why the speed of your internet may change throughout the day: at peak hours your service may slow down because too much requested traffic is clogging up those lanes.

It’s not inherently a bad thing to have specific lanes for certain types of traffic. Actual fast lanes on freeways can improve congestion by not making faster-moving vehicles compete for space with slower-moving traffic, and exit and entry lanes on freeways allow cars to perform specialized maneuvers without impeding other traffic. A lane only for buses isn’t a bad thing as long as every bus gets equal access to that lane and everyone has equal access to riding those buses. Where this becomes a problem is if there is a special lane only for Google buses, or only for consuming entertainment content instead of participating in video calls. In these scenarios you would be increasing the quality of certain bus rides at the expense of degraded service for everyone else on the road.

An internet “fast lane” would be the designation of part of the network with more bandwidth and/or lower latency to be used only for certain services. On a technical level, the physical network infrastructure would be split amongst several different software-defined networks with different use cases using network slicing. One network might be optimized for high-bandwidth applications such as video streaming, another might be optimized for applications needing low latency (e.g. a short distance between the client and the server), and another might be optimized for IoT devices. The maximum physical network capacity is split among these slices. To continue our tortured metaphor, your original six-lane general road is now a four-lane general road with two lanes reserved for, say, a select list of streaming services. Think dedicated high-speed lanes for Disney+, HBO, and Netflix, but those services only. In a network-neutral construction of the infrastructure, all internet traffic shares all lanes, and no specific app or service is unfairly sped up or slowed down. This isn’t to say that we are inherently against network management techniques like quality of service or network slicing. But it’s important that quality-of-service efforts be undertaken, as much as possible, in an application-agnostic manner.
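To make the capacity arithmetic concrete, here is a minimal Python sketch of the split described above. The slice names and numbers are purely illustrative assumptions, not figures from any actual ISP offering:

```python
# Illustrative only: how reserving a "fast lane" slice shrinks what is left
# for all other traffic on the same physical link. Numbers are made up.

TOTAL_CAPACITY_MBPS = 600  # the whole "six-lane road"

# Network-neutral setup: every application shares the full capacity.
neutral_capacity = TOTAL_CAPACITY_MBPS

# Sliced setup: a reserved lane for a hand-picked list of streaming services.
fast_lane_reservation = 200  # two of the six lanes
general_capacity = TOTAL_CAPACITY_MBPS - fast_lane_reservation

print(f"Neutral network, capacity any app can use:    {neutral_capacity} Mbps")
print(f"Sliced network, reserved for chosen services: {fast_lane_reservation} Mbps")
print(f"Sliced network, left for everything else:     {general_capacity} Mbps")
# Everything outside the chosen list now competes for 400 Mbps instead of 600,
# even though the physical infrastructure never changed.
```

The point of the sketch is only the arithmetic: whatever is reserved for a chosen slice comes out of the shared pool, which is why speeding some traffic up is, from the user’s perspective, indistinguishable from slowing everything else down.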

The fast-lanes metaphor isn’t ideal. On the road, having fast lanes is a good thing: it can protect slower, more cautious drivers from dangerous driving and improve the flow of traffic. Bike lanes are a good thing because they make cyclists safer and allow cars to drive more quickly without having to navigate around them. But with traffic lanes it’s the driver, not the road, that decides which lane they belong in (with penalties for doing obviously bad-faith things such as driving in the bike lane).

Internet service providers (ISPs) are already testing their ability to create these network slices. They already have plans for market offerings where certain applications and services, chosen by them, are given exclusive reserved fast lanes while the rest of the internet must shoulder its way through what is left. This kind of network slicing is a violation of net neutrality. We aren’t against network slicing as a technology: it could be useful for things like remote surgery or vehicle-to-vehicle communication, which require low-latency connections and are in the public interest, and which are separate offerings rather than part of the broadband services covered in the draft order. We are against network slicing being used as a loophole to circumvent principles of net neutrality.

Fast Lanes Are a Clear Violation of Net Neutrality

Net neutrality is the principle that ISPs should treat all legitimate traffic coming over their networks equally; discriminating between certain applications or types of traffic is a clear violation of that principle. When fast lanes speed up certain applications or certain classes of applications, they cannot do so without having a negative impact on other internet traffic, even if only by comparison. This is throttling, plain and simple.

Further, because ISPs choose which applications or types of services get to be in the fast lane, they choose winners and losers within the internet, which has clear harms to both speech and competition. Whether your access to Disney+ is faster than your access to Indieflix because Disney+ is sped up or because Indieflix is slowed down doesn’t matter because the end result is the same: Disney+ is faster than Indieflix and so you are incentivized to use Disney+ over Indieflix.

ISPs should not be able to harm competition even by deciding to prioritize incumbent services over new ones, or that one political party’s website is faster than another’s. It is the consumer who should be in charge of what they do online. Fast lanes have no place in a network neutral internet.

[1] Urban studies research shows that this isn’t actually the case; still, it remains the popular wisdom among politicians and urban planners.
Categories: Openness, Privacy, Rights

EFF, Human Rights Organizations Call for Urgent Action in Case of Alaa Abd El Fattah

Following an urgent appeal filed to the United Nations Working Group on Arbitrary Detention (UNWGAD) on behalf of blogger and activist Alaa Abd El Fattah, EFF has joined 26 free expression and human rights organizations calling for immediate action.

The appeal to the UNWGAD was initially filed in November 2023 just weeks after Alaa’s tenth birthday in prison. The British-Egyptian citizen is one of the most high-profile prisoners in Egypt and has spent much of the past decade behind bars for his pro-democracy writing and activism following Egypt’s revolution in 2011.

EFF and Media Legal Defence Initiative submitted a similar petition to the UNWGAD on behalf of Alaa in 2014. This led to the Working Group issuing an opinion that Alaa’s detention was arbitrary and calling for his release. In 2016, the UNWGAD declared Alaa's detention (and the law under which he was arrested) a violation of international law, and again called for his release.

We once again urge the UN Working Group to urgently consider the recent petition and conclude that Alaa’s detention is arbitrary and contrary to international law. We also call for the Working Group to find that the appropriate remedy is a recommendation for Alaa’s immediate release.

Read our full letter to the UNWGAD and follow Free Alaa for campaign updates.

Categories: Openness, Privacy, Rights

NCSC reports active exploitation of critical Palo Alto firewall flaw in the Netherlands

Security.NL - 19 April 2024 - 4:58pm
A critical vulnerability in Palo Alto Networks firewalls has been exploited in the Netherlands, according to the ...

Nearly six hundred Polish citizens spied on via Pegasus spyware

Security.NL - 19 April 2024 - 4:31pm
In recent years, nearly six hundred Polish citizens have been spied on by the government via the Pegasus spyware, the Polish ...

Congress: Don't Let Anyone Own The Law

We should all have the freedom to read, share, and comment on the laws we must live by. But yesterday, the House Judiciary Committee voted 19-4 to move forward the PRO Codes Act (H.R. 1631), a bill that would limit those rights in a critical area. 

TAKE ACTION

Tell Congress To Reject The Pro Codes Act

A few well-resourced private organizations have made a business of charging money for access to building and safety codes, even when those codes have been incorporated into law. 

These organizations convene volunteers to develop model standards, encourage regulators to make those standards into mandatory laws, and then sell copies of those laws to the people (and city and state governments) that have to follow and enforce them.

They’ve claimed it’s their copyrighted material. But court after court has said that you can’t use copyright in this way—no one “owns” the law. The Pro Codes Act undermines that rule and the public interest, changing the law to state that the standards organizations that write these rules “shall retain” a copyright in them, as long as the rules are made “publicly accessible” online.

That’s not nearly good enough. These organizations already have so-called online reading rooms that aren’t searchable, aren’t accessible to print-disabled people, and condition your ability to read mandated codes on agreeing to onerous terms of use, among many other problems. That’s why the Association of Research Libraries sent a letter to Congress last week (supported by EFF, disability rights groups, and many others) explaining how the Pro Codes Act would trade away our right to truly understand and educate our communities about the law for cramped public access to it. Congress must not let well-positioned industry associations abuse copyright to control how you access, use, and share the law. Now that this bill has passed committee, we urgently need your help—tell Congress to reject the Pro Codes Act.

TAKE ACTION

TELL CONGRESS: No one owns the law

Categories: Openness, Privacy, Rights

German politicians get help from government, X and TikTok in securing their accounts

Security.NL - 19 April 2024 - 3:26pm
In the run-up to the European Parliament elections, German politicians are receiving help from the government, X, TikTok and LinkedIn with ...

Reporting a data breach, complaint or tip not possible 19-21 April 2024

Autoriteit Persoonsgegevens (news) - 19 April 2024 - 2:48pm

Due to maintenance, it will not be possible to report a data breach to the Autoriteit Persoonsgegevens (AP) from 17:00 on Friday 19 April 2024. You will also be unable to submit a complaint, tip or data breach tip.

AP: Government, do not use Facebook while privacy remains unclear

Autoriteit Persoonsgegevens (news) - 19 April 2024 - 2:48pm

Government organisations had better not use Facebook as long as it is unclear what happens to the personal data of visitors to their Facebook page. The government must, after all, be able to guarantee that the processing of this data complies with the law. This is the advice of the Autoriteit Persoonsgegevens (AP) to the Ministry of the Interior (BZK).

Man who stole nearly 80,000 euros via bank helpdesk fraud gets two years in prison

Security.NL - 19 April 2024 - 2:41pm
A 21-year-old man from Kudelstaart who stole nearly 80,000 euros from elderly people via bank helpdesk fraud has been sentenced to a ...

Autoriteit Persoonsgegevens advises government to stop using Facebook

Security.NL - 19 April 2024 - 2:20pm
Because of privacy risks for citizens, the government should stop using Facebook pages, advises the Autoriteit ...

'British regulator finds Google's tracking cookie plan insufficient'

Security.NL - 19 April 2024 - 1:58pm
The British privacy regulator ICO finds Google's plan to replace tracking cookies insufficient. The proposal must ...

Google discovers fifty kernel vulnerabilities during Windows Registry research

Security.NL - 19 April 2024 - 1:16pm
During research into the Windows Registry, Google's Project Zero team found fifty vulnerabilities in the Windows kernel ...

French hospital cancels operations due to cyberattack, falls back on paper

Security.NL - 19 April 2024 - 12:03pm
A French hospital has cancelled several non-urgent operations because of a cyberattack, forcing staff to ...

ACM has SIDN remove a web shop's domain names for the first time

Security.NL - 19 April 2024 - 11:35am
Last month the Autoriteit Consument & Markt (ACM) instructed the Stichting Internet Domeinregistratie Nederland (SIDN) to ...

Study: Dutch citizens' digital skills and knowledge could be much better

Security.NL - 19 April 2024 - 11:22am
The Dutch still have a lot to learn when it comes to digital skills and digital knowledge, according to the Amsterdam School of ...

Education sector must check measures for using Zoom without privacy risks

Security.NL - 19 April 2024 - 10:58am
Dutch educational institutions must check whether they have taken the necessary measures to use Zoom without privacy risks ...

NCSC, together with FBI, advises against periodic password changes

Security.NL - 19 April 2024 - 10:04am
Companies and organisations that want to protect themselves against ransomware would be wise not to have their staff ...

Municipality of Utrecht bans TikTok and AliExpress on civil servants' work phones

Security.NL - 19 April 2024 - 9:16am
The municipality of Utrecht is banning apps such as TikTok, AliExpress, Temu, WeChat and other applications originating from China, Iran, ...

There's a backdoor in my NAS, can I get my money back?

IusMentis - 19 April 2024 - 8:15am

A reader asked me: I'm dealing with the following. I have a D-Link NAS that contains a backdoor account. Now, I understand perfectly well that software and other products contain security flaws. But a backdoor account is something you, as a manufacturer, really add yourself. Can I hold a party like D-Link (and plenty of others, unfortunately) liable for this? Ideally I'd simply like my money back, or a product without a backdoor. A warning about this was indeed issued recently: "It concerns a command injection vulnerability and the use of hardcoded credentials, or a 'backdoor account' as D-Link calls it. Through these vulnerabilities an attacker can execute arbitrary commands on the NAS system without authentication, which can lead to access to sensitive information, modification of the system configuration, or a denial of service."

The hardcoded credentials were not a deliberate feature but a slip: behind them lies a typical Unix construction that was simply not implemented properly. Ultimately, though, it does not matter whether it was intent, recklessness, carelessness or something else. The backdoor is there and the product is therefore not secure; what can you do about that as a consumer?
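To make the technical point concrete, here is a deliberately simplified, hypothetical Python sketch of the pattern being described: a login check with a built-in account that always works. It is not D-Link's actual firmware code, just the general shape of a hardcoded-credential backdoor compared with an ordinary credential check:

```python
import hmac

# Hypothetical owner-configured account; real firmware stores this elsewhere.
CONFIGURED_ACCOUNTS = {"admin": "owner-chosen-password"}

def check_login_with_backdoor(username: str, password: str) -> bool:
    # The problematic pattern: a baked-in service account that is accepted
    # with a fixed (here: empty) password, no matter what the owner set up.
    if username == "service_account" and password == "":
        return True
    expected = CONFIGURED_ACCOUNTS.get(username)
    return expected is not None and hmac.compare_digest(expected, password)

def check_login(username: str, password: str) -> bool:
    # Without the hardcoded shortcut, only owner-configured credentials work.
    expected = CONFIGURED_ACCOUNTS.get(username)
    return expected is not None and hmac.compare_digest(expected, password)

print(check_login_with_backdoor("service_account", ""))  # True: anyone gets in
print(check_login("service_account", ""))                # False
```

Whether such a shortcut is left in deliberately or by accident, the effect for the owner of the device is the same, which is exactly the legal point at issue here.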

The simple answer is, of course: you may expect a product to live up to reasonable expectations. That does not mean it must always be 100% free of bugs and backdoors; you have to look at how the product is marketed, how easy the flaw is to exploit, and to what extent D-Link should have foreseen this. Not every flaw is a conformity defect.

Still, I think that with an impact as enormous as this one, you have a good case that the product does not meet reasonable expectations. Getting in this easily should not be possible with such an important product. But this quickly becomes a difficult technical discussion, and one that is hard to win when the other party is being paid to disagree with you.

In the near future, laws such as the Cyber Resilience Act will make this much easier to tackle. They make updates and a high-quality security process mandatory, leaving little room for nuance if a security flaw slips through anyway.

Finally, there remains the age-old problem in consumer law that the shop (which is the party you must address, and which is legally obliged to refund your money now that repair is no longer an option because the NAS units are end-of-life) simply refuses to do so, usually with an excuse such as "it only comes with a two-year warranty" or "the light comes on, so it isn't broken." And then security escorts you out, because raising your voice triggers the Difficult Customer Protocol. Getting your due as a consumer can therefore be quite an ordeal, and the question is always whether it is worth it given the price of the device.

Arnoud

The post There's a backdoor in my NAS, can I get my money back? appeared first on Ius Mentis.

Two Years Post-Roe: A Better Understanding of Digital Threats

It’s been a long two years since the Dobbs decision to overturn Roe v. Wade. Between May 2022 when the Supreme Court accidentally leaked the draft memo and the following June when the case was decided, there was a mad scramble to figure out what the impacts would be. Besides the obvious perils of stripping away half the country’s right to reproductive healthcare, digital surveillance and mass data collection caused a flurry of concerns.

Although many activists fighting for reproductive justice had been operating under assumptions of little to no legal protections for some time, the Dobbs decision was for most a sudden and scary revelation. Everyone implicated in that moment somewhat understood the stark difference between pre-Roe 1973 and post-Roe 2022; living under the most sophisticated surveillance apparatus in human history presents a vastly different landscape of threats. Since 2022, some suspicions have been confirmed, new threats have emerged, and overall our risk assessment has grown smarter. Below, we cover the most pressing digital dangers facing people seeking reproductive care, and ways to combat them.

Digital Evidence in Abortion-Related Court Cases: Some Examples

Social Media Message Logs

A case in Nebraska resulted in a woman, Jessica Burgess, being sentenced to two years in prison for obtaining abortion pills for her teenage daughter. Prosecutors used a Facebook Messenger chat log between Jessica and her daughter as key evidence, bolstering the concerns many had raised about using such privacy-invasive tech products for sensitive communications. At the time, Facebook Messenger did not have end-to-end encryption.

In response to criticisms about Facebook’s cooperation with law enforcement that landed a mother in prison, a Meta spokesperson issued a frustratingly laconic tweet stating that “[n]othing in the valid warrants we received from local law enforcement in early June, prior to the Supreme Court decision, mentioned abortion.” They followed this up with a short statement reiterating that the warrants did not mention abortion at all. The lesson is clear: although companies do sometimes push back against data warrants, we have to prepare for the likelihood that they won’t.

Google: Search History & Warrants

Well before the Dobbs decision, prosecutors had already used Google Search history to indict a woman for her pregnancy outcome. In this case, it was keyword searches for misoprostol (a safe and effective abortion medication) that clinched the prosecutor’s evidence against her. Google acquiesced, as it so often has, to the warrant request.

Related to this is the ongoing and extremely complicated territory of reverse keyword and geolocation warrants. Google has promised that it would remove from user profiles all location data history related to abortion clinic sites. Researchers tested this claim and it was shown to be false, twice. Late in 2023, Google made a bigger promise: it would soon change how it stores location data to make it much more difficult–if not impossible–for Google to provide mass location data in response to a geofence warrant, a change we’ve been asking Google to implement for years. This would be a genuinely helpful measure, but we’ve been conditioned to approach such claims with caution. We’ll believe it when we see it (and refer to external testing for proof).

Other Dangers to Consider

Doxxing

Sites set up to dox healthcare professionals who offer abortion services are about as old as the internet itself. Doxxing comes in a variety of forms, but a quick and loose definition of it is the weaponization of open source intelligence with the intention of escalating to other harms. There’s been a massive increase in hate groups abusing public records requests and data broker collections to publish personal information about healthcare workers. Doxxing websites hosting such material are updated frequently. Doxxing has led to steadily rising material dangers (targeted harassment, gun violence, and arson, just to name a few) over the past few years.

There are some piecemeal attempts at data protection for healthcare workers in more protective states like California (one which we’ve covered). Other states may offer some form of an address confidentiality program that provides people with proxy addresses. Though these can be effective, they are not comprehensive. Since doxxing campaigns are typically coordinated through a combination of open source intelligence tactics, it presents a particularly difficult threat to protect against. This is especially true for government and medical industry workers whose information may be subjected to exposure through public records requests.

Data Brokers

Recently, Senator Wyden’s office released a statement about a long investigation into Near Intelligence, a data broker company that sold geolocation data to The Veritas Society, an anti-choice think tank. The Veritas Society then used the geolocation data to target individuals who had traveled near healthcare clinics that offered abortion services and delivered pro-life advertisements to their devices.

That alone is a stark example of the dangers of commercial surveillance, but it’s still unclear what other ways this type of dataset could be abused. Near Intelligence has filed for bankruptcy, but they are far from the only, or the most pernicious, data broker company out there. This situation bolsters what we’ve been saying for years: the data broker industry is a dangerously unregulated mess of privacy threats that needs to be addressed. It not only contributes to the doxxing campaigns described above, but essentially creates a backdoor for warrantless surveillance.

Domestic Terrorist Threat Designation by Federal Agencies

Midway through 2023, The Intercept published an article about a tenfold increase in federal designation of abortion-rights activist groups as domestic terrorist threats. This projects a massive shadow of risk for organizers and activists at work in the struggle for reproductive justice. The digital surveillance capabilities of federal law enforcement are more sophisticated than that of typical anti-choice zealots. Most people in the abortion access movement may not have to worry about being labeled a domestic terrorist threat, though for some that is a reality, and strategizing against it is vital.

Looming Threats

Legal Threats to Medication Abortion

Last month, the Supreme Court heard oral arguments challenging the FDA’s approval of and regulations governing mifepristone, a widely available and safe abortion pill. If the anti-abortion advocates who brought this case succeed, access to the most common medication abortion regimen used in the U.S. would end across the country—even in those states where abortion rights are protected.

Access to abortion medication might also be threatened by a 150-year-old obscenity law. Many people now recognize the long-dormant Comstock Act as a potential avenue to criminalize procurement of the abortion pill.

Although the outcomes of these legal challenges are yet to be determined, it’s reasonable to prepare for the worst: if there is no longer a way to access medication abortion legally, there will be even more surveillance of the digital footprints prescribers and patients leave behind.

Electronic Health Records Systems

Electronic Health Records (EHRs) are digital transcripts of medical information meant to be easily stored and shared between medical facilities and providers. Since abortion restrictions are now dictated on a state-by-state basis, the sharing of these records across state lines presents a serious matrix of concerns.

As some academics and privacy advocates have outlined, the interoperability of EHRs can jeopardize the safety of patients when reproductive healthcare data is shared across state lines. Although the Department of Health and Human Services has proposed a new rule to help protect sensitive EHR data, it’s currently possible that data shared between EHRs can lead to prosecutions over reproductive healthcare.

The Good Stuff: Protections You Can Take

Perhaps the most frustrating aspect of what we’ve covered thus far is how much is beyond individual control. It’s completely understandable to feel powerless against these monumental threats. That said, you aren’t powerless. Much can be done to protect your digital footprint, and thus, your safety. We don’t propose reinventing the wheel when it comes to digital security and data privacy. Instead, rely on the resources that already exist and re-tool them to fit your particular needs. Here are some good places to start:

Create a Security Plan

It’s impossible, and generally unnecessary, to implement every privacy and security tactic or tool out there. What’s more important is figuring out the specific risks you face and finding the right ways to protect against them. This process takes some brainstorming around potentially scary topics, so it’s best done well before you are in any kind of crisis. Pen and paper works best. Here's a handy guide.

After you’ve answered those questions and figured out your risks, it’s time to locate the best ways to protect against them. Don’t sweat it if you’re not a highly technical person; many of the strategies we recommend can be applied in non-tech ways.

Careful Communications

Secure communication is as much a frame of mind as it is a type of tech product. When you are able to identify which aspects of your life need to be spoken about more carefully, you can then make informed decisions about who to trust with what information, and when. It’s as much about creating ground rules with others about types of communication as it is about normalizing the use of privacy technologies.

Assuming you’ve already created a security plan and identified some risks you want to protect against, begin thinking about the communication you have with others involving those things. Set some rules for how you broach those topics, where they can be discussed, and with whom. Sometimes this might look like the careful development of codewords. Sometimes it’s as easy as saying “let’s move this conversation to Signal.” Now that Signal supports usernames (so you can keep your phone number private), as well as disappearing messages, it’s an obvious tech choice for secure communication.

Compartmentalize Your Digital Activity

As mentioned above, it’s important to know when to compartmentalize sensitive communications to more secure environments. You can expand this idea to other parts of your life. For example, you can designate different web browsers for different use cases, choosing those browsers for the privacy they offer. One might offer significant convenience for day-to-day casual activities (like Chrome), whereas another is best suited for activities that require utmost privacy (like Tor).

Now apply this thought process towards what payment processors you use, what registration information you give to social media sites, what profiles you keep public versus private, how you organize your data backups, and so on. The possibilities are endless, so it’s important that you prioritize only the aspects of your life that most need protection.

Security Culture and Community Care

Both tactics mentioned above incorporate a sense of community when it comes to our privacy and security. We’ve said it before and we’ll say it again: privacy is a team sport. People live in communities built on trust and care for one another; your digital life is imbricated with others in the same way.

If a node on a network is compromised, it will likely implicate others on the same network. This principle of computer network security is just as applicable to social networks. Although traditional information security often builds from a paradigm of “zero trust,” we are social creatures and must work against that idea. It’s more about incorporating elements of shared trust and pushing for a culture of security.

Sometimes this looks like setting standards for how information is articulated and shared within a trusted group. Sometimes it looks like choosing privacy-focused technologies to serve a community’s computing needs. The point is to normalize these types of conversations, to let others know that you’re caring for them by attending to your own digital hygiene. For example, when you ask for consent to share images that include others from a protest, you are not only pushing for a culture of security, but normalizing the process of asking for consent. This relationship of community care through data privacy hygiene is reciprocal.

Help Prevent Doxxing

As touched on above in the Other Dangers to Consider section, doxxing can be a frustratingly difficult thing to protect against, especially when it’s public records that are being used against you. It’s worth looking into whether your state-level voter registration records are public, and how you can request to have that information redacted (success may vary by state).

Similarly, although business registration records are publicly available, you can appeal to websites that mirror that information (like Bizapedia) to have your personal information taken down. This is of course only a concern if you have a business registration tied to your personal address.

If you work for a business that is susceptible to public records requests revealing personal sensitive information about you, there’s little to be done to prevent it. You can, however, apply for an address confidentiality program if your state has one. You can also do the somewhat tedious work of scrubbing your personal information from other places online (since doxxing is often a combination of information resources). Consider subscribing to a service like DeleteMe (or follow a free DIY guide) for a more thorough process of minimizing your digital footprint. Collaborating with trusted allies to monitor hate forums is a smart way to unburden yourself from having to look up your own information alone. Sharing that responsibility with others makes it easier, and it also allows for group planning around prevention and incident response.

Take a Deep Breath

It’s natural to feel bogged down by all the thought that has to be put towards privacy and security. Again, don’t beat yourself up for feeling powerless in the face of mass surveillance. You aren’t powerless. You can protect yourself, but it’s reasonable to feel frustrated when there is no comprehensive federal data privacy legislation that would alleviate so many of these concerns.

Take a deep breath. You’re not alone in this fight. There are guides for you to learn more about stepping up your privacy and security. We've even curated a special list of them. And there is Digital Defense Fund, a digital security organization for the abortion access movement, who we are grateful and proud to boost. And though it can often feel like privacy is getting harder to protect, in many ways it’s actually improving. With all that information, continued trust in your communities, and a culture of security within them, safety is much easier to attain. With a bit of privacy, you can go back to focusing on what matters, like healthcare.

Categories: Openness, Privacy, Rights
