
EDRI Newsletter ...

Swarfendor437

Fri May 01, 2020 10:09:40 pm

European Digital Rights Information Newsletter latest:


======================================================================

EDRi-gram

fortnightly newsletter about digital civil rights in Europe

EDRi-gram 18.8, 29 April 2020
Read online: https://edri.org/edri-gram/18.8

=======================================================================
Contents
=======================================================================

1. #*****: DSA and political microtargeting
2. COVID-19: A Commission hitchhiker’s tech guide to the App Store
3. Everything you need to know about the DSA
4. Member in the spotlight: Homo Digitalis
5. Why COVID-19 is a Crisis for Digital Rights
6. Recommended Action
7. Recommended Reading
8. Agenda
9. About

=======================================================================
1. #*****: DSA and political microtargeting
=======================================================================

Europe is about to overhaul its 20-year-old e-Commerce Directive and it
is a once-in-a-decade chance to correct the power imbalance between
platforms and users. As part of this update, the Digital Services Act
(DSA) must address the issue of political microtargeting (PMT).

Microtargeting, and PMT in particular, has the alarming power to derail
democracy, and should be regulated. According to self-assessment
reports, political advertisers spent €31 million (excluding the UK) on
Facebook, and only €5 million on Google between March and September
2019. Facebook’s role in developing and targeting adverts goes far
beyond that of a simple presentation medium: its tools for optimising ad
delivery, targeting audiences and defining delivery criteria are far
beyond the capacity of most political parties alone. A detailed report
by Panoptykon and partners, based on data collected during two Polish
election campaigns in 2019, sheds critical light on the role of the
company, and what it reveals is extremely informative:

The study found that the transparency and control tools Facebook offers
to researchers and users to explain how ad targeting works are
“insufficient and superficial”. Users are targeted by Facebook’s
algorithm based on potentially thousands of distinct selectors,
following a set of criteria that only the company knows.
Advertisers on Facebook can opt to select audiences on obvious factors
such as age, gender, language spoken and location. But the Facebook
machine also steers them towards increasingly narrower criteria such as
interests (political affiliation, sexual orientation, musical tastes,
etc.), “life events” and behaviour, as well as more than 250,000
free-text attributes including, for example, “Adult Children of
Alcoholics” or “Cancer Awareness”, which raise deeper privacy concerns.

Facebook is not merely a passive intermediary; its algorithms interpret
criteria selected by advertisers and deliver ads in a way that fulfils
advertisers’ objectives, and actively curate the content that users see
in their timelines based on those assumptions. In 2016, the company
introduced a feature allowing them to target “lookalikes” – profiles
similar to a target audience. It also allows A/B testing so advertisers
can compare which ads are more effective.

But Facebook’s “why am I seeing this ad?” transparency tool can be
misleading, revealing only the “lowest common denominator” attribute.
For example, according to the report, during the European elections
campaign in Poland in May 2019, a person who was pregnant saw a
political ad referring to prenatal screenings and perinatal care. “Why
am I seeing this ad?” informed her that she was targeted because she was
interested in “medicine” (potential reach 668 million) rather than
“pregnancy” (potential reach of 316 million). Users can only verify
(check, delete, or correct) a short list of interests that the platform
is willing to reveal.
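The “lowest common denominator” behaviour described above can be
illustrated with a toy sketch (our illustration with hypothetical data
and function names, not Facebook’s actual logic): among all the
attributes an ad targeted, the tool surfaces only the one with the
largest potential reach.

```python
# Toy model of the "lowest common denominator" effect: the transparency
# tool reveals only the broadest targeted attribute (largest potential
# reach), hiding the narrower, more revealing criteria.
def shown_attribute(targeted_attributes):
    """Return the attribute with the largest potential reach."""
    return max(targeted_attributes, key=targeted_attributes.get)

# Hypothetical targeting set for the prenatal-care ad from the report;
# the reach figures are the ones quoted in the article.
targeted = {
    "pregnancy": 316_000_000,  # the sensitive, narrow criterion
    "medicine": 668_000_000,   # the broad, innocuous-looking one
}

print(shown_attribute(targeted))  # prints "medicine"
```

The pregnant user in the example is thus told she was targeted for
“medicine”, while the sensitive “pregnancy” criterion stays hidden.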

Here is where upcoming regulation comes into play: At the very least,
the Digital Services Act should prohibit PMT based on characteristics
which expose our mental or physical vulnerabilities (e.g. depression,
anxiety, addiction, illness). But if the EU wants to be ambitious and
tackle many of the problems associated with the current business model,
the DSA should go further and regulate any sort of advertising aimed at
profiling users, particularly as there appears to be a gap between ads
labelled as “political” by the platform, and ads perceived as political
by researchers.

Regulating targeted ads, requiring greater transparency for researchers
and users, making targeting opt-in rather than opt-out, setting tighter
requirements for political advertising and recognising PMT as an
application of AI that poses serious risks to human rights will not
solve all the problems of political disinformation in society, but these
measures would certainly eliminate some of the worst practices seen today.

Read more:

Who (really) targets you? Facebook in Polish election campaigns:
https://www.panoptykon.org/political-ads

Annual self-assessment reports of signatories to the Code of Practice on
Disinformation 2019 (29.10.2019):
https://ec.europa.eu/digital-single-mar ... ation-2019


(Contribution by Karolina Iwańska, from EDRi member Panoptykon)

=======================================================================
2. COVID-19: A Commission hitchhiker’s tech guide to the App Store
=======================================================================

“We’re being asked what do we want these systems to look like. If we
don’t make the decision it will be made for us (…) This virus will pass,
but the measures will last”
Edward Snowden

According to the World Health Organisation (WHO), closely watching
contacts during a pandemic “will prevent further transmission of the
virus”. In response to the COVID-19 crisis, many technical responses (or
acts of techno-solutionism) arose shortly after the pandemic was
declared by the WHO. Contact-tracing applications are one of the notable
solutions brought forward, and currently occupy the centre of the public
debate in the European space.

Whether contact-tracing technology will help or not, however, is still
contested. Technology is not a silver bullet, as Carly Kind, director of
the AI research centre the Ada Lovelace Institute, puts it. Moreover,
Dr. Michael Ryan, a key advisor to the WHO, warned that "when collecting
information on citizens or tracking their movements there are always
serious data protection and human rights principles involved". Several
voices in the EDRi community also question whether the risks in using
apps may outweigh the benefits (La Quadrature du Net) and if apps are
just “we-did-something” political responses (FIPR - Ross Anderson).

That said, if apps (and technology in general) are proven to be useful
in any significant way, they need to fully protect fundamental rights,
since the risks created by these technologies could outlast the pandemic
itself.

European Digital Rights, as the voice of 44 organisations working to
advance and uphold human rights in the digital space, warned early on of
the potential problems that a rushed technological solution could lead
us to.

In reaction to the debate regarding the safeguards potential technical
solutions must provide, the European Commission (EC) has published a
toolbox and guidelines for ensuring data protection standards. The two
instruments aim to guide the responses that Member States are already
preparing nationally, sometimes in very different directions.

In this article, we aim to provide insight into the European
Commission’s proposals and how they fit with civil society views on this
subject.

A techie toolbox

"A fragmented and uncoordinated approach to contact tracing apps risks
hampering the effectiveness of measures aimed at combating the COVID-19
crisis, whilst also causing adverse effects to the single market and to
fundamental rights and freedoms”
European Commission Common EU Toolbox for Member States

The EC argued that a toolbox is needed as national authorities are
developing mobile applications (apps) to monitor and mitigate the
COVID-19 pandemic. The Commission agrees that contact tracing, as
usually done manually by public health authorities, is a time-consuming
process and that the “promising” technology and apps in particular could
be useful tools for Member States.

However, the EC points out that, in order for apps to be effective, they
need to be adopted by 60-75% of the population - a very high threshold
for a voluntary app. By comparison, in the much-cited case of Singapore,
only 20% of the population downloaded the app.

The toolbox calls for a series of concrete requirements for these apps:
interoperability (apps must work well with each other in order to be
able to trace transnational cases); voluntary use; approval by the
national health authorities; privacy preservation; and dismantling as
soon as they are no longer needed.

The time principle was a key point in our statement laying out
fundamental rights-based recommendations for COVID-19 responses. On
apps in particular, EDRi member Access Now advocates that access to
health data should be limited to those who need the information for
treatment, research, and otherwise addressing the crisis. Finally, EDRi
members Chaos Computer Club (CCC), Free Software Foundation Europe
(FSFE) and noyb are among those that agree on the need for the apps to
be voluntary.

Decentralised or centralised, that is the question

The Toolbox describes two categories of apps: those that operate via
decentralised processing of personal data, which would be stored only on
a person’s own device; and those operating via a centralised back-end
server which would collect the data. The EC argues that this data should
be reduced to the “absolute minimum” necessary, with technical
requirements compiled by ENISA (encryption, communications security,
user authentication, etc.) and “preferably” the Member State should be
the controller for the processing of personal data. The Annexes list key
recommendations, background information on contact tracing, background
on symptom checker functionalities and an inventory of existing mobile
solutions against COVID-19.

Our member noyb agrees with the Commission requiring strong encryption,
an essential element of secure technologies for which we have also
advocated before. Moreover, EDRi member CCC sides with the
decentralised option rather than a centralised one, as well as with
strong communication security and privacy requirements.
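The decentralised model the Toolbox describes can be sketched in a few
lines (a minimal simplification of ours, not DP-3T or any specific
protocol; class and method names are hypothetical): phones broadcast
rotating random tokens, remember the tokens they hear, and the exposure
check runs on the device itself.

```python
import secrets

# Sketch of decentralised contact tracing: personal data stays on the
# phone, and the server only republishes the ephemeral tokens of users
# who report a positive test.
class Device:
    def __init__(self):
        self.own_tokens = []  # ephemeral IDs this device has broadcast
        self.heard = set()    # ephemeral IDs observed from nearby devices

    def broadcast(self):
        token = secrets.token_hex(16)  # unlinkable random identifier
        self.own_tokens.append(token)
        return token

    def observe(self, token):
        self.heard.add(token)

    def check_exposure(self, published_infected_tokens):
        # Matching happens locally: the contact graph never leaves the phone.
        return not self.heard.isdisjoint(published_infected_tokens)

alice, bob, carol = Device(), Device(), Device()
bob.observe(alice.broadcast())          # Bob was near Alice
published = set(alice.own_tokens)       # Alice tests positive and uploads
print(bob.check_exposure(published))    # True: Bob is notified on-device
print(carol.check_exposure(published))  # False: Carol never met Alice
```

In the centralised alternative, the `heard` sets would be uploaded to a
back-end server that performs the matching, which is precisely the
design CCC and others argue against.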

Readers who liked the Toolbox… also liked the Guidelines

"People must have the certainty that compliance with fundamental rights
is ensured and that the apps will be used only for the
specifically-defined purposes, that they will not be used for mass
surveillance, and that individuals will remain in control of their data."
European Commission Guidance on Apps supporting the fight against COVID
19 pandemic in relation to data protection

The Commission guidance summarises some of the key points of the Toolbox
but provides more insight on some of the features, as well as details on
ensuring data protection and privacy safeguards. The guidance focuses on
apps which are voluntary and that offer one or more functionalities:
provide accurate information to individuals about the pandemic or
provide questionnaires for self-assessment and guidance for individuals
(symptom checker). Other functionalities could include alerting
individuals if they have been in close contact with an infected person
(contact tracing and warning functionality) and/or provide means of
communication between patients and doctors.

The guidance relies heavily on references to the ePrivacy Directive
(currently blocked by EU Member States from becoming an updated
Regulation for 3 years and 4 months) and the General Data Protection
Regulation (GDPR). The references include data minimisation, purpose
limitation, time limitation (apps deactivated after the pandemic is
over) and state-of-the-art security protections.

Our member Access Now has thoroughly gone through the data protection
and privacy requirements of purpose limitation, data minimisation and
time limitation, largely coinciding with the Commission, while Bits of
Freedom has also mentioned the minimal use of data needed and time
limitation, in addition to the apps being based on scientific insight
and demonstrable effectiveness.

Location data is not necessary, decentralisation is

The Commission states that location data is not necessary for the
purpose of contact tracing functionalities and that it would even be
“difficult to justify in light of the principle of data minimisation”,
and that it can create “security and privacy issues”. Regarding the
debate of centralisation vs decentralisation, the Commission believes
that decentralisation is more in line with the minimisation principle
and that, as Bits of Freedom, CCC and many other groups have suggested,
only “health authorities should have access to proximity data [which
should be encrypted]” and therefore no law enforcement agencies can
access the data. What about the well-intended but risky use of data for
“statistics and scientific research”? The Commission says no, unless it
is necessary, included in the general list of purposes and clearly
communicated to users.

Get us some open code. And add good-old strong encryption to go with it,
please

The Commission asks for the source code to be made public and available
for review. In addition to this, the Commission calls for the use of
encryption when transmitting the data to national health authorities, if
that is one of the functionalities. Both of these conclusions echo key
requests from EDRi members such as FSFE, made both for transparency and
security purposes and as an appeal for solidarity. We consider the call
for openness a positive request from the Commission.

Finally, the guidance brings back the forgotten Data Protection
Authorities (DPAs) who, as we have also suggested, should be the ones
consulted and fully involved when developing and implementing the apps.

Moving forward

Many uncertainties remain regarding the pandemic itself, especially
regarding whether any technical solution will help or not. Furthermore,
it is unclear how these technologies should be designed, developed and
deployed in order to avoid mass surveillance of citizens, stigmatisation
of those who are sick, and reinforced discrimination against people
living in poverty, people of colour and members of other at-risk groups
who are already disproportionately affected by the pandemic.

The voices of experts and civil society must be taken into
consideration, before taking the road of an endless “war on virus” that
normalises mass surveillance. If technologies are proven to be helpful
in combating this crisis, technological solutions need to comply with
very strong core principles. Many of these principles are already
present in the Commission’s two documents and in many of the civil
society views in this ongoing debate.

In the meantime, strong public health systems, strong human rights
protections (including extra protections for key workers), a
human-rights-centric patent system and open access to scientific
knowledge are key principles that should be implemented now.

Read more:

Press Release: EDRi calls for fundamental rights-based responses to
COVID-19 (01.04.2020)
https://edri.org/edri-calls-for-fundame ... -covid-19/


noyb – Active overview of projects using personal data to combat
SARS-CoV-2.
https://gdprhub.eu/index.php?title=Data ... SARS-CoV-2

Privacy International – Extraordinary powers need extraordinary
protections. (20. 03. 2020)
https://privacyinternational.org/news-a ... rotections


Access Now – Protect digital rights, promote public health: toward a
better coronavirus response. (05. 03. 2020)
https://www.accessnow.org/protect-digit ... -response/


Ada Lovelace Institute: Exit through the App Store? (20. 04. 2020)
https://www.adalovelaceinstitute.org/wp ... 2020-1.pdf


European Commission - Mobile applications to support contact tracing in
the EU’s fight against COVID-19: Common EU Toolbox for Member States
(15. 04. 2020)
https://ec.europa.eu/health/sites/healt ... pps_en.pdf


European Commission (COMMUNICATION)- Guidance on Apps supporting the
fight against COVID 19 pandemic in relation to data protection (16. 04.
2020)
https://ec.europa.eu/info/sites/info/fi ... rt1_v3.pdf

(Contribution by Diego Naranjo, head of policy at EDRi)

=======================================================================
3. Everything you need to know about the DSA
=======================================================================

In her political guidelines, the President of the European Commission
Ursula von der Leyen has committed to “upgrade the Union’s liability and
safety rules for digital platforms, services and products, with a new
Digital Services Act” (DSA). The upcoming DSA will revise the rules
contained in the E-Commerce Directive of 2000 that affect how
intermediaries regulate and influence user activity on their platforms,
including people’s ability to exercise their rights and freedoms online.
This is why reforming those rules has the potential to be either a big
threat to fundamental rights or a major improvement of the current
situation online. It is also an opportunity for the European Union to
decide how central aspects of the internet will look in the coming ten
years.

A public consultation by the European Commission is planned to be
launched in May 2020 and legislative proposals are expected to be
presented in the first quarter of 2021.

In the meantime, three different Committees of the European Parliament
have announced or published Own-Initiative Reports as well as Opinions
with a view to setting the agenda of what the DSA should regulate and
how it should achieve its goals.

We have created a document pool listing relevant articles and documents
related to the DSA. This will allow you to follow developments in
content moderation and regulatory action in Europe.

Read more:

Document pool: Digital Service Act (27. 04. 2020)
https://edri.org/digital-service-act-document-pool

=======================================================================
4. Member in the spotlight: Homo Digitalis
=======================================================================

This is the tenth article of the series “EDRi member in the spotlight”,
in which our members introduce themselves and their work in depth.

Today we introduce our Greek member: Homo Digitalis.

1. Who are you and what is your organisation’s goal and mission?

Homo Digitalis is the only digital rights civil society organisation in
Greece. Our goal is the protection of human rights and freedoms in the
digital age. We strive to influence legislators and policy makers at the
national level, and to raise awareness amongst the people of Greece
regarding digital rights issues. Moreover, when digital rights are
jeopardised by public or private actors, we carry out investigations,
conduct studies and take legal action.

2. How did it all begin, and how did your organisation develop its work?

Homo Digitalis was founded in 2018 by six tech lawyers with a strong
passion for the protection and promotion of digital rights. No digital
rights organisations existed in Greece before. So, we wanted to create
an organisation that could bring like-minded people together and shake
things up. After two years of voluntary work, we have managed to grow
into an organisation with more than 100 members, who bring together a
wide variety of disciplines such as law, computer science, humanities
and social sciences.

We aim to transform Homo Digitalis from an organisation based on
voluntary work into a strong watchdog with a long-term strategy and
full-time personnel. It will be a long and difficult path, but we have
started acquiring our first grants and we are confident that we will
grow, gaining more recognition and support for our vision.

3. The biggest opportunity created by advancements in information and
communication technology is…

…facilitating access to information all around the globe, and building
bridges between people. These advancements constitute a driver for
positive change in our societies, and could lead to enhanced equality
and transparency.

4. The biggest threat created by advancements in information and
communication technology is…

…mass surveillance of our societies and power asymmetry in the
information economy.

5. Which are the biggest victories/successes/achievements of your
organisation?

Becoming a full member of EDRi is certainly a great success of Homo
Digitalis so far!

Additionally, Homo Digitalis has achieved important milestones over the
last two years. We have increased public awareness of digital rights
issues by generating media interest in our actions, visiting educational
institutions, and participating in events, campaigns and talks all
around Greece. Moreover, we were instrumental in shaping the public
debate around data protection reform in Greece by cooperating with
relevant stakeholders, and by filing complaints and requests before EU
and national authorities.

Also, through access to information requests, complaints, and
investigations we have attained a high level of scrutiny regarding
projects on technology-led policing and border management activities in
Greece. In addition, we have collaborated with investigative journalists
to reveal important facts. Even though we are an organisation run solely
by volunteers, we do our best to respond quickly to the challenges that
arise.

Furthermore, we have been fortunate enough to participate shoulder to
shoulder with powerful digital rights organisations in EU-wide projects
and campaigns and to learn from their expertise and knowledge. Finally,
we also had the great opportunity to present our views and opinions in
important fora, such as the UN Human Rights Council 39th session in
Geneva or the European Parliament in Brussels.

All these accomplishments over the last two years give us the strength
to continue our work towards the protection and promotion of human
rights in the digital age.

6. If your organisation could now change one thing in your country, what
would that be?

Active participation of people in collective activities such as digital
rights activism. If individuals could devote a part of their knowledge
and time to such activities, we would have a stronger voice to push
policy makers and legislators towards political decisions that respect
our rights and freedoms rather than violate them.

7. What is the biggest challenge your organisation is currently facing
in your country?

After 10 years of financial crisis and austerity measures in Greece that
limited public spending, we have witnessed in recent years an increase
in funds used for technology-led policing and border management
projects. Thus, we must stay wide awake in order to challenge and fight
back against the implementation of intrusive tools and technologies that
limit our rights and freedoms.

8. How can one get in touch with you if they want to help as a
volunteer, or donate to support your work?

You can visit our website to help us as a volunteer or to donate and
support our work. Also, we always appreciate a good conversation, so
feel free to reach out to info@homodigitalis.gr. Last but not least, you
can subscribe to our newsletter here.

Read more:

Homo Digitalis - volunteers page: https://www.homodigitalis.gr/en/join-us

Homo Digitalis - donation page:
https://www.homodigitalis.gr/en/donations/help-us-grow

EDRi member in the spotlight series:
https://edri.org/member-in-the-spotlight/

=======================================================================
5. Why COVID-19 is a Crisis for Digital Rights
=======================================================================

The COVID-19 pandemic has triggered an equally urgent digital rights crisis.

New measures being hurried in to curb the spread of the virus, from
“biosurveillance” and online tracking to censorship, are potentially as
world-changing as the disease itself. These changes aren’t necessarily
temporary, either: once in place, many of them can’t be undone.

That’s why activists, civil society and the courts must carefully
scrutinise questionable new measures, and make sure that – even amid a
global panic – states are complying with international human rights law.

Human rights watchdog Amnesty International recently commented that
human rights restrictions are spreading almost as quickly as coronavirus
itself. Indeed, the fast-paced nature of the pandemic response has
empowered governments to rush through new policies with little to no
legal oversight.

There has already been a widespread absence of transparency and
regulation when it comes to the rollout of these emergency measures,
with many falling far short of international human rights standards.

Tensions between protecting public health and upholding people’s basic
rights and liberties are rising. While it is of course necessary to put
in place safeguards to slow the spread of the virus, it’s absolutely
vital that these measures are balanced and proportionate.

Unfortunately, this isn’t always proving to be the case. What follows is
an analysis of the impact of the COVID-19 pandemic on the key subset of
policy areas related to digital rights:

a) The Rise of Biosurveillance

A panopticon world on a scale never seen before is quickly materialising.

“Biosurveillance” – which involves the tracking of people’s movements,
communications and health data – has already become a buzzword, used to
describe certain worrying measures being deployed to contain the virus.

The means by which states, often aided by private companies, are
monitoring their citizens are increasingly extensive: phone data, CCTV
footage, temperature checkpoints, airline and railway bookings, credit
card information, online shopping records, social media data, facial
recognition, and sometimes even drones.

Private companies are exploiting the situation and offering
rights-abusing products to states, purportedly to help them manage the
impact of the pandemic. One Israeli spyware firm has developed a product
it claims can track the spread of coronavirus by analysing two weeks’
worth of data from people’s personal phones, and subsequently matching
it up with data about citizens’ movements obtained from national phone
companies.

In some instances, citizens can also track each other’s movements –
leading to not only vertical, but also horizontal sharing of sensitive
medical data.

Not only are many of these measures unnecessary and disproportionately
intrusive, they also give rise to secondary questions, such as: how
secure is our data? How long will it be kept? Is there transparency
around how it is obtained and processed? Is it being shared or
repurposed, and if so, with whom?

b) Censorship and Misinformation

Censorship is becoming rife, with many arguing that a “censorship
pandemic” is surging in step with COVID-19.

Oppressive regimes are rapidly adopting “fake news” laws. This is
ostensibly to curb the spread of misinformation about the virus, but in
practice, this legislation is often used to crack down on dissenting
voices or otherwise suppress free speech. In Cambodia, for example,
there have already been at least 17 arrests of people for sharing
information about coronavirus.

At the same time, many states have themselves been accused of fuelling
disinformation to their citizens to create confusion, or are arresting
those who express criticism of the government’s response.

As well as this, some states have restricted free access to information
on the virus, either by blocking access to health apps, or cutting off
access to the internet altogether.

c) AI, Inequality and Control

The deployment of AI can have consequences for human rights at the best
of times, but now, it’s regularly being adopted with minimal oversight
and regulation.

AI and other machine learning technologies are the foundation for many
surveillance and social control tools. Because of the pandemic, they are
being increasingly relied upon to fight misinformation online and to
process the huge increase in applications for emergency social
protection, which are, naturally, more urgent than ever.

Prior to the COVID-19 outbreak, the digital rights field had
consistently warned about the human rights implications of these
inscrutable “black boxes”, including their biased and discriminatory
effects. The adoption of such technologies without proper oversight or
consultation should be resisted and challenged through the courts, not
least because of their potential to exacerbate the inequalities already
experienced by those hardest hit by the pandemic.

d) Eroding Human Rights

Many of the human rights-violating measures that have been adopted to
date are taken outside the framework of proper derogations from
applicable human rights instruments, which would ensure that emergency
measures are temporary, limited and supervised.

Legislation is being adopted by decree, without clear time limitations,
and technology is being deployed in a context where clear rules and
regulations are absent.

This is of great concern for two main reasons.

First, this type of “legislating through the back door” of measures that
are not necessarily temporary avoids going through a proper democratic
process of oversight and checks and balances, resulting in de facto
authoritarian rule.

Second, if left unchecked and unchallenged, this could set a highly
dangerous precedent for the future. This is the first pandemic we are
experiencing at this scale – we are currently writing the playbook for
global crises to come.

If it becomes clear that governments can use a global health emergency
to introduce measures that infringe human rights without being
challenged, and without having to reverse those measures, making them
permanent instead of temporary, we will essentially be handing
authoritarian regimes a blank cheque to wait until the next pandemic to
impose whatever measures they want.

Therefore, any and all measures that are not strictly necessary,
sufficiently narrow in scope, and of a clearly defined temporary nature,
need to be challenged as a matter of urgency. If they are not, we will
not be able to push back on a certain path towards a dystopian
surveillance state.

e) Litigation: New Ways to Engage

In tandem with advocacy and policy efforts, we will need strategic
litigation to challenge the most egregious measures through the court
system. Going through the legislature alone will be too slow and, with
public gatherings banned, public demonstrations will not be possible at
scale.

The courts will need to adapt to the current situation – and are in the
process of doing so – by offering new ways for litigants to engage.
Courts are still hearing urgent matters and questions concerning
fundamental rights and our democratic system will fall within that
remit. This has already been demonstrated by the first cases requesting
oversight of government surveillance in response to the pandemic.

These issues have never been more pressing, and it’s abundantly
clear that action must be taken.

If you want to read more on the subject, follow EDRi’s new series
#COVIDTech here:
https://edri.org/emergency-responses-to ... he-crisis/


This article was originally published at:
https://digitalfreedomfund.org/why-covi ... al-rights/

(Contribution by Nani Jansen Reventlow, Digital Freedom Fund)

Read more:

Tracking the Global Response to COVID-19:
https://privacyinternational.org/exampl ... e-covid-19

Russia: doctor who called for protective equipment detained (03.04.2020):
https://www.amnesty.org.uk/press-releas ... t-detained

A project to demystify litigation and artificial intelligence
(06.12.2019):
https://digitalfreedomfund.org/a-projec ... elligence/

Making Accountability Real: Strategic Litigation (30.01.2020):
https://digitalfreedomfund.org/making-a ... itigation/

Accessing Justice in the Age of AI (09.04.2020):
https://digitalfreedomfund.org/accessin ... age-of-ai/

=======================================================================
6. Recommended Action
=======================================================================

EDRi is looking for a Communications and Media Manager (Permanent position)

EDRi is looking for an experienced Communications and Media Manager to
join EDRi’s team in Brussels. This is a unique opportunity to help shape
and lead on the communications of a well-respected network of NGOs at a
time of numerous challenges to our rights and freedoms in the digital
age. The deadline to apply is 22 May 2020. This is a full-time,
permanent position and the start date is expected to be July 2020.

https://edri.org/edri-job-communications-media-manager/

=======================================================================
7. Recommended Reading
=======================================================================

Blind faith in technology diverts EU efforts to fight terrorism

https://www.euractiv.com/section/digita ... terrorism/


EU: Terrorist Content Regulation must protect freedom of expression rights

https://www.article19.org/resources/eu- ... on-rights/

=======================================================================
8. Agenda
=======================================================================

30.04.2020, Bielefeld, Germany
German Big Brother Awards 2020 – POSTPONED
https://bigbrotherawards.de/

09.06.2020, Costa Rica
RightsCon 2020 – POSTPONED
https://www.rightscon.org/

10.08.2020, Berlin, Germany
re:publica20 (new date)
https://20.re-publica.com/en

06.11.2020, Brussels, Belgium
Freedom not Fear 2020
https://www.freedomnotfear.org/fnf-2020 ... ember-2020


26.01.2021, Brussels, Belgium
Privacy Camp 2021
https://privacycamp.eu/

=======================================================================
9. About
=======================================================================

EDRi-gram is a fortnightly newsletter about digital civil rights by
European Digital Rights (EDRi), an association of civil and human rights
organisations from across Europe. EDRi takes an active interest in
developments in the EU accession countries and wants to share knowledge
and awareness through the EDRi-gram.

All contributions, suggestions for content, corrections or agenda-tips
are most welcome. Errors are corrected as soon as possible and are
visible on the EDRi website.

Except where otherwise noted, this newsletter is licensed under the
Creative Commons Attribution 3.0 License. See the full text at
http://creativecommons.org/licenses/by/3.0/

Newsletter editor: Guillermo Peris - edrigram@edri.org

Information about EDRi and its members: http://www.edri.org/

European Digital Rights needs your help in upholding digital rights in
the EU. If you wish to help us promote digital rights, please consider
making a private donation.
https://edri.org/donate/

star treker

Fri May 08, 2020 6:20:11 pm

This is what happens when you don't vote for the right people. The effect is that you get a government ruled by a dictator who thinks they are granted all the powers of the universe, simply because their daddy left them millions of dollars. You can't put lipstick on a pig and hide the fact that it's a pig; it's still a pig, and it always will be a pig. There are too many uneducated people in this world who don't know what is going on, and that's another issue when it comes to voting for the right or wrong people.

Yes, democracy is dying, but that's because all these people who don't give a c*** are freely giving away their rights. The truth is, people who have always lived in democracies don't know what it's like to live in a dictatorship, because they never had to, so they don't know how bad it is. People should ask what it's like to live in North Korea and see what they say – reality, what a concept. And of course corporations and governments will abuse your rights in the digital age; what did people think would happen?

Are people really so naive as to believe that the constitution would have applied to the internet? I hate to break it to you, folks, but the internet was not even a gleam in anyone's eye when the constitution was written. About the only technology the country had back then was guns and trains; there wasn't even radio yet, let alone TV, computers, and the internet. The EU, however, does appear to take the internet more seriously than most countries, so it will probably be the first to impose strong regulations.