On the concept of "care"
I cannot help sharing with you this conversation I have been following about the role of machines in our society of knowledge and technological progress. Machines take on "care" - the computer, telecommunications, long-distance loves, ... - and look after us, so there you have it, machines "care" too! But humans still need another kind of "care", one not implemented by the social rules that were made so that society itself would be "cared" for. Yet in today's world there is time for everything except "care" in the broader sense of "loving" (one's neighbour!) - I have grown used to meeting people of flesh, bone and thought who turn out to be genuine machines of... something or other and, by extension, even machines of loving - and so misunderstandings grow, and so that we humans do not have to "care" about any of that, we invent machines and deposit in them the minimal rules of our needs. In truth, I think that those who got lost somewhere in over-explaining the things of the spirit, or in the material transformation associated with them, embracing their segmentation raised in excelsis, do not help much in the day-to-day, where all one wants is a simple, plain "care".
But for this there is an excellent explanation - one I read today - for I criticise all this in the light of what I know best, which is "knowing myself", as was inscribed at the temple of Apollo (a place I never got around to visiting), and I know that I too may be turning into a certain machine of thoughts about what "care" actually is. Would anyone like to help explain properly what it is? Read below, it is excellent!
Thank God I also know people of flesh and bone, thoughts and feelings, who humanise my day - no worries! Even so, those I speak of obviously confine themselves to what are necessarily rules of "care", beyond the social rules and banking commitments... and, to throw one more log on the fire and relativise a little further, this "beyond" is the normal amount for many people, but... let us now confine ourselves to the machines...
machines don't care
We often seek to interact with entities that "don't care" regardless of
whether or not those entities are machines. Physicians supposedly
"don't care" about our bodies the way our lovers might so we are allowed
to believe that exposing our bodies to physicians risks no personal
judgment (about beauty, say, or desirability) on the part of the
physician. Policemen supposedly don't care if we're stupid or foolish
when we ask for help. Soldiers in combat don't care that the enemy are
human beings against whom they have no personal animus. Not caring is
crucial for facilitating some functions important to humans and so we
create social categories that theoretically obviate caring. In that
sense, machines may be perfect exemplars of roles we developed
collectively long before even the idea of those machines existed. A
robot surgeon will never be misguided by its feelings; a robot policeman
will never unwittingly insult someone standing across the street from
the location sought. A robot bombardier will not suffer remorse in
releasing its payload and become subsequently less effective. In the
U.S., judges and juries are asked to consider demonstrable facts with no
weight given to any possible emotional responses the facts or witnesses
may occasion. Banking functions are among those we usually want to be
uninfluenced by personal concerns such as impatience or favoritism, just
as we want bankers to perform mathematical calculations without any
possibility of the influence of fatigue. While your revised reading of
"machines don't care" may suggest a useful revision of the Turing Test,
I think more broadly your rereading highlights the fact that, despite
the undeniable centrality of caring as a function of most people's ideas
of human beings, humans themselves often "steel themselves" (become
machine-like) in order to perform better the very functions human beings
need, and this denial of a "natural" caring is often considered
virtuous, as would any other discipline adopted at personal cost to lead
to socially desired ends. The problem arises, of course, if the
discipline so influences us that we don't care when we should, if the
surgeon always sees people only as mechanical problems or the policeman
always sees people only as a job or the soldier always sees The Other
only as a target. We want those who adopt roles that "don't care" to
maintain a dual cognition with the role as the conscious overlay, an
overlay we want to believe can be put aside. Why? Because we want to
believe that at bottom it is only a role, at bottom the surgeon wants to
heal us, the policeman to guide us, the soldier to protect us. We want
caring to be the fundamental condition of these people, while not caring
is the role. The disturbance one feels in having to revise your first
reading of the advertisement, I think, comes from having to acknowledge
that sometimes not caring is so important that we may be willing to give
up the notion that that is only a role and accept it as the fundamental
identity. We would not want to do that with people because that would
dehumanize them and, by extension, suggest that we too could become
dehumanized and lose our identities. But we don't mind inhuman machines
failing to be human. It just takes a moment sometimes to accept the
notion that the inhuman may serve human desires better than humans can.
All best,
Eric S. Rabkin,
Dept. of English
University of Michigan