Who, exactly, is watching?
A woman is pacing up and down the sidewalk in
front of your child’s school, taking pictures of the children and the school. She seems especially interested in the front and side entrances. A kid runs in to tell his teacher. An alert parent in the carpool lane calls
911.
Your daughter sees a shadow outside her
bedroom window and hears a rustling in the flower bed. You rush outside too late to see the culprit
but find a large shoe print in the newly raked soil and call 911.
The coach catches the shy sophomore boy
everyone believed to be kind and polite sharing nude photos on his phone
with the guys in the locker room. He
takes the phone and calls the principal – who then has to call the girl’s
parents to let them know their daughter’s body is no longer her own private
self.
When children’s privacy is violated in ways
that are overt, visible, knowable, the violation is unquestioned. It is
unacceptable. In most cases it is illegal.
Even when a perpetrator is caught, prosecuted and punished, the unease
remains. The fear doesn’t go away. The child may not feel safe, may not be able
to trust. And the parent – or teacher –
lives with the vulnerability of not being able to protect that child.
So why is it different when the violation is
hidden, opaque, electronic, commercial, and complicated?
Kim Slingerland, a mother of three in
Alberta, Canada, discovered an intrusion into her children’s privacy by an
unlikely source: a game app marketed
to children in the “family” section of Google Play. The app, which has kids racing cartoon cars
driven by animals, looks innocuous, and is even labeled as age-appropriate for
young children.
In a report titled “How Game Apps Collect Data on Children,” New York Times
reporters say that the game is designed to do more than make racing against
animal drivers fun. They write, “Until
last month, the app also shared users’ data, sometimes including the precise
location of devices, with more than a half-dozen advertising and online tracking
companies.” The attorney general of New
Mexico filed a lawsuit claiming the maker of Fun Kid Racing had “violated a
federal children’s privacy law through dozens of Android apps that shared
children’s data.” The suit accuses the
maker of this and other apps designed for children of
“flouting a law intended to prevent the personal data of children under 13 from
falling into the hands of predators, hackers and manipulative marketers.” And
the suit charges Google with misleading parents and other consumers by listing
these un-private apps in the “family” section of its products.
The Times
analysts did additional research, confirming academic studies and other
investigations that show that without parental consent (or even knowledge),
many apps aimed at children are collecting personal details such as names,
physical locations, and email addresses, and tracking activity such as the ads
users click on.
The article, which extends to a full-page
story in the Business section, follows attempts to regulate apps designed for
children, and restrict their capacity to collect data on children, through laws
and regulatory policies. The writers
document pushback from Google and others, who both deny and defend their
practices while urging “self-regulation” by the industry. Graphics accompanying the article show how
the data moves from the “user” (a child!) to being collected to then being
marketed to third parties.
The article concludes with statements by regulators
and industry leaders about whether new laws can protect children’s privacy
(their “data” is their lives!), suggesting that without good, consistent enforcement
these companies will never comply.
So about that 911 call: if we don’t know that the person lurking
outside the window is really an electronic signal already hiding inside the
child’s room, how can we call to report the violation?
Some speculate that the current generation of
parents who grew up on computers and cell phones and social media don’t even
expect their personal information to be confidential. They think privacy is
passé – it’s over. But I’m not so
sure. If these same parents call 911
about that woman with the camera taking pictures of kids who aren’t hers, these
parents clearly do care about privacy. They do care about protecting their
children. The intruders are just not
visible from the carpool lane. They’re not leaving footprints in the flowerbed.
I’ve written before about the ways the
instructional software and digital testing programs that increasingly dominate our
classrooms gather data on students – not just from their digital
homework and tests but from any other social media they have open on their
devices. Researchers are making these
intrusions – and the very inadequate laws meant to regulate them – known, but
policymakers, often reliant on the big campaign contributions of the tech
companies, have been slow to act. And most have little knowledge of how any of
this works.
So it may be up to parents to step up, speak
up. I suggest sending this article to all the parents you know who may be
concerned in general about the amount of “screen time” they allow their young
children but think these cute kiddie apps are harmless. The apps and the hidden systems behind them are not safe, and there is no way to know just how much of your children’s information has already been sold to these third parties or what those companies may be doing with it.
Let’s stop them from watching.