Every day, every week,
we agree to terms and conditions.
And when we do this,
we provide companies with the lawful right
to do whatever they want with our data
and with the data of our children.
Which makes us wonder:
how much data of our children are we giving away,
and what are its implications?
I'm an anthropologist,
and I'm also the mother of two little girls.
And I started to become interested in this question in 2015
when I suddenly realized that there were vast, almost unimaginable amounts of data traces
being produced and collected about children.
So I launched a research project,
which is called Child Data Citizen,
with the aim of filling in the blanks.
Now you may think that I'm here to blame you
for posting photos of your children on social media,
but that's not really the point.
The problem is way bigger than so-called "sharenting."
This is about systems, not individuals.
You and your habits are not to blame.
For the very first time in history,
we are tracking the individual data of children
from long before they're born --
sometimes from the moment of conception,
and then throughout their lives.
You see, when parents decide to conceive,
they go online to look for "ways to get pregnant,"
or they download ovulation-tracking apps.
When they do get pregnant,
they post ultrasounds of their babies on social media,
they download pregnancy apps
or they consult Dr. Google for all sorts of things,
for "miscarriage risk when flying"
or "abdominal cramps in early pregnancy."
I know because I've done it --
And then, when the baby is born, they track every nap,
every life event on different technologies.
And all of these technologies
transform the baby's most intimate behavioral and health data into profit
by sharing it with others.
So to give you an idea of how this works,
in 2019, the British Medical Journal published research that showed
that out of 24 mobile health apps,
19 shared information with third parties.
And these third parties shared information with 216 other organizations.
Of these 216 other fourth parties,
only three belonged to the health sector.
The other companies that had access to that data were big tech companies
like Google, Facebook or Oracle,
they were digital advertising companies
and there was also a consumer credit reporting agency.
So you get it right:
ad companies and credit agencies may already have data points on little babies.
But mobile apps, web searches and social media
are really just the tip of the iceberg,
because children are being tracked by multiple technologies
in their everyday lives.
They're tracked by home technologies and virtual assistants in their homes.
They're tracked by educational platforms
and educational technologies in their schools.
They're tracked by online records
and online portals at their doctor's office.
They're tracked by their internet-connected toys,
and many, many, many, many other technologies.
So during my research,
a lot of parents came up to me and they were like, "So what?
Why does it matter if my children are being tracked?
We've got nothing to hide."
Well, it matters.
It matters because today individuals are not only being tracked,
they're also being profiled on the basis of their data traces.
Artificial intelligence and predictive analytics are being used
to harness as much data as possible of an individual life
from different sources:
family history, purchasing habits, social media comments.
And then they bring this data together
to make data-driven decisions about the individual.
And these technologies are used everywhere.
Banks use them to decide loans.
Insurance companies use them to decide premiums.
Recruiters and employers use them
to decide whether one is a good fit for a job or not.
Also the police and courts use them
to determine whether one is a potential criminal
or is likely to recommit a crime.
We have no knowledge or control
over the ways in which those who buy, sell and process our data
are profiling us and our children.
But these profiles can come to impact our rights in significant ways.
To give you an example,
in 2018 the "New York Times" published the news
that the data that had been gathered
through online college-planning services --
that are actually completed by millions of high school kids across the US
who are looking for a college program or a scholarship --
had been sold to educational data brokers.
Now, researchers at Fordham who studied educational data brokers
revealed that these companies profiled kids as young as two
on the basis of different categories:
ethnicity, religion, affluence,
and many other random categories.
And then they sell these profiles together with the name of the kid,
their home address and the contact details
to different companies,
including trade and career institutions,
and student credit card companies.
To push the boundaries,
the researchers at Fordham asked an educational data broker
to provide them with a list of 14-to-15-year-old girls
who were interested in family planning services.
The data broker agreed to provide them the list.
So imagine how intimate and how intrusive that is for our kids.
But educational data brokers are really just an example.
The truth is that our children are being profiled in ways that we cannot control
but that can significantly impact their chances in life.
So we need to ask ourselves:
can we trust these technologies when it comes to profiling our children?
My answer is no.
As an anthropologist,
I believe that artificial intelligence and predictive analytics can be great
to predict the course of a disease
or to fight climate change.
But we need to abandon the belief
that these technologies can objectively profile humans
and that we can rely on them to make data-driven decisions
about individual lives.
Because they can't profile humans.
Data traces are not the mirror of who we are.
Humans think one thing and say the opposite,
feel one way and act differently.
Algorithmic predictions or our digital practices
cannot account for the unpredictability and complexity of human experience.
But on top of that,
these technologies are always --
in one way or another, biased.
You see, algorithms are by definition sets of rules or steps
that have been designed to achieve a specific result, OK?
But these sets of rules or steps cannot be objective,
because they've been designed by human beings
within a specific cultural context
and are shaped by specific cultural values.
So when machines learn,
they learn from biased algorithms,
and they often learn from biased databases as well.
At the moment, we're seeing the first examples of algorithmic bias.
And some of these examples are frankly terrifying.
This year, the AI Now Institute in New York published a report
that revealed that the AI technologies
that are being used for predictive policing
have been trained on "dirty" data.
This is basically data that had been gathered
during historical periods of known racial bias
and nontransparent police practices.
Because these technologies are being trained with dirty data,
they're not objective,
and their outcomes are only amplifying and perpetuating
police bias and error.
So I think we are faced with a fundamental problem.
We are starting to trust technologies when it comes to profiling human beings.
We know that in profiling humans,
these technologies are always going to be biased
and are never really going to be accurate.
So what we need now is actually a political solution.
We need governments to recognize that our data rights are our human rights.
(Applause and cheers)
Until this happens, we cannot hope for a more just future.
I worry that my daughters are going to be exposed
to all sorts of algorithmic discrimination and error.
You see the difference between me and my daughters
is that there's no public record out there of my childhood.
There's certainly no database of all the stupid things that I've done
and thought when I was a teenager.
But for my daughters this may be different.
The data that is being collected from them today
may be used to judge them in the future
and can come to prevent their hopes and dreams.
I think that it's time.
It's time that we all step up.
It's time that we start working together
as organizations and as institutions,
and that we demand greater data justice for us
and for our children
before it's too late.