The promises of big data are convincing. Organizations across every major industry are using data-mining techniques and databases as competitive differentiators to:
- Detect fraud and cyber-security issues
- Manage and eliminate risk
- Anticipate resource demands
- Increase response rates for marketing campaigns
- Discern voter preferences for election campaigns
- Solve today’s toughest big data challenges
As wonderful as these sound, the concerned citizen needs to examine applications that have a downside and must be addressed very soon. The issue here is not nefarious designs but rather a creeping invasion into human life.
On mental health and beyond. Through big data, we will soon witness the development of personal counselor programs. Such programs have already existed for half a century. Consider the first example, ELIZA, a computer program that operated by processing users' responses to scripts, the most famous of which was DOCTOR. It was written around 1965 by Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory. It used little information about human thought or emotion, yet was startlingly interactive; people were convinced ELIZA was a real person (https://en.wikipedia.org/wiki/ELIZA).
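In the spirit of ELIZA's scripted pattern-matching, here is a minimal sketch of how such a program responds without any understanding. The rules and reflections below are invented for illustration; they are not Weizenbaum's original DOCTOR script.

```python
import re

# Illustrative ELIZA-style rules: a regex pattern and a canned reply template.
# These particular rules are invented, not from the original DOCTOR script.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),          # catch-all when nothing matches
]

# Swap first- and second-person words so the echoed fragment sounds natural.
REFLECTIONS = {"my": "your", "me": "you", "i": "you", "am": "are"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text):
    for pattern, template in RULES:
        m = re.match(pattern, text.lower().strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I am feeling lonely"))  # How long have you been feeling lonely?
print(respond("My dog ignores me"))    # Tell me more about your dog ignores you.
```

Note the second reply is ungrammatical: the program merely echoes a transformed fragment, which is exactly why it "uses little information about human thought or emotion" yet still feels interactive.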
If such code is online, free for the taking, users will reveal their deepest secrets in exchange for comfort, conciliation, or a little understanding. Such code will become like a close friend, in many cases the only friend. This information will be extensively mined and then used in ways we can only guess. The process will accelerate when psychologists and the like are required to put their notes online, ostensibly for insurance purposes. At first, the data will be promised to be highly confidential. That sincere promise will not hold.
It is not cynical to say, these days, that every single thing can be hacked. Already, tax, medical, and credit information is hacked on a daily basis. We used to believe that they (who's they?) would figure out a way to protect us. So, like the ants in the forest and the wildebeests on the Serengeti plain, we hope for safety in numbers, that our data will not be the data selected for invasion.
The codes to "read" and interpret writing are already well developed. It is not that the codes actually understand what you write; they simply know how to respond. It is like when a friend relates something you don't get: in most cases, you only know (i.e., have learned) what to say. Such codes will greet you when you log in and ask about your day, your family, your dog, and other topics they have learned you like. You will almost be convinced the code cares about you. Yet this is just what your shrink does now: he or she learns to begin the session with things you care about.
You say: yes, but. The computer is too impersonal and too cold. It can't convince. You know it is merely a machine. Remember ELIZA? To repair this, codes are now available that will show you a face or avatar with lip-synced animation and voice, in real time no less. These are well developed. Such a code will learn to talk to you in the way you prefer. Can you imagine how it could be used to manipulate you, change your politics or buying preferences, or even radicalize you?
And now the beyond. Regular users of social media, such as Facebook, are already well analyzed, and that is only from passive data mining. There is more to come. For example, if I needed to find 20,000 people who seem to be Republican but have many Democratic tendencies (or vice versa), I could order this up from the big-data business managers, perhaps at five dollars apiece, and then carefully engage them and convert them to another belief. There is big money here. Eventually, the machines will learn to do this themselves, all at a price per subject. The parties will be regular customers at this store. The media, bloggers, pundits, and writers will become secondary resources. For kids, even parents will assume a lesser role. To a great extent, this is social engineering gone wild.
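Ordering up such a list is, mechanically, just a filter over model scores. A hypothetical sketch, assuming an upstream model has already assigned each user a probability of identifying with one party (the user IDs and scores below are invented):

```python
# Hypothetical scored profiles from an assumed upstream affiliation model.
# "p_republican" is P(user identifies Republican); all values are invented.
users = [
    {"id": 1, "p_republican": 0.93},   # firmly committed: not worth the spend
    {"id": 2, "p_republican": 0.55},
    {"id": 3, "p_republican": 0.61},
    {"id": 4, "p_republican": 0.12},   # firmly on the other side
]

# "Seems Republican but with Democratic tendencies": leans one way,
# yet sits close enough to the fence to be worth a persuasion campaign.
persuadable = [u["id"] for u in users if 0.5 < u["p_republican"] < 0.7]
print(persuadable)  # [2, 3]
```

The campaign then buys contact with just those IDs, which is why a per-subject price is plausible: the selection itself costs almost nothing.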
It will not be all bad, but the potential for badness electrifies these comments. How this is achieved technically is postponed to another day, but the favored techniques, such as random forests and neural nets, are remarkably simple to understand and apply.
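To back up the claim of simplicity, here is a toy random forest built from scratch: bootstrap sampling plus a majority vote over single-feature "stumps." The survey data is invented (three yes/no answers, with the hidden label being the majority of the three), and a real forest would use full decision trees, but the core idea fits in a few dozen lines.

```python
import random

def train_stump(data, n_features):
    """Pick one random feature; predict the majority label on each side of it."""
    f = random.randrange(n_features)
    side = {0: [], 1: []}
    for x, y in data:
        side[x[f]].append(y)
    vote = {v: round(sum(ys) / len(ys)) if ys else 0 for v, ys in side.items()}
    return f, vote

def train_forest(data, n_trees=25):
    """Each stump trains on a bootstrap resample: the essence of bagging."""
    n_features = len(data[0][0])
    forest = []
    for _ in range(n_trees):
        boot = [random.choice(data) for _ in data]   # sample with replacement
        forest.append(train_stump(boot, n_features))
    return forest

def predict(forest, x):
    votes = [vote[x[f]] for f, vote in forest]
    return round(sum(votes) / len(votes))            # majority vote

# Invented survey: three yes/no answers; label = majority of the answers.
random.seed(0)
data = [((a, b, c), int(a + b + c >= 2))
        for a in (0, 1) for b in (0, 1) for c in (0, 1)]

forest = train_forest(data * 4)
print(predict(forest, (1, 1, 1)))  # a unanimous "yes" profile -> 1
print(predict(forest, (0, 0, 0)))  # a unanimous "no" profile  -> 0
```

Nothing here requires deep mathematics, which is exactly the point: the barrier to applying these techniques to mined personal data is low.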
Such software will be infinitely patient and always available. It will be used to counsel prisoners, schoolchildren with problems, the terminally ill, agoraphobics, the lovelorn, psychopaths, and the like. It may provide help for autism, a condition that has resisted almost every clinical approach. It will be used to test for compliance, attitude, loyalty, obedience, honesty, or any quality desired. It will be used to decide on hires, paroles, and promotions. Not far off are the commercials and promotions on your favorite shows targeted specifically to you; Netflix already does this.
The purpose will, at first, be promoted as benign. Eventually, the codes will be able to test for anything, and deception will be all but impossible. Even now, if you take a basic survey with enough questions, you will not be able to conceal anything it targets. Books will be written to help participants avoid giving untoward revelations.
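Why do "enough questions" defeat concealment? Because even if the sensitive trait is never asked directly, several innocuous answers correlate with it, and a crude score over those proxies recovers it. A sketch with invented respondents, questions, and weights:

```python
# Invented survey data: three innocuous yes/no answers that, by assumption,
# correlate with a hidden trait the survey never asks about directly.
respondents = [
    # (night_owl, action_movies, energy_drinks) -> hidden trait (e.g. under 30)
    ((1, 1, 1), 1),
    ((1, 1, 0), 1),
    ((0, 0, 1), 0),
    ((0, 0, 0), 0),
]

def guess_trait(answers, threshold=2):
    """A majority of correlated proxy answers stands in for the hidden trait."""
    return int(sum(answers) >= threshold)

correct = sum(guess_trait(a) == t for a, t in respondents)
print(f"{correct}/{len(respondents)} hidden traits recovered from proxies")
```

Refusing to answer the one sensitive question does nothing when a dozen harmless ones carry the same signal.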
Below is a simple illustration of a future hiring process. Almost everything is online, and all of the data is stored and then used not only with regard to the applicant but to newer applicants downstream.

The online application → The pre-interview → The actual interview → You get the job → But then… you are asked to sign releases to previously collected information (e.g., Facebook, Twitter, school data, and more) → Pre-employment orientation → Orientation → Office climate interview → Wellness interview → End-of-probationary-period evaluation. Now do you keep the job?
----------------------------------------------------------------------------------------
All of this is underway, and our government hasn't a clue about it. Who knows what will happen if it does?
Please Comment.