Friday, October 31, 2008

Cinema.

I recommend this film. Although it has many subplots and multiple messages, what has always fascinated me about Evelyn Waugh's novel is its decadent, aestheticist portrait of the English aristocracy. And the Oxbridge landscapes...

Basurto Congress (part 2)

Yesterday I had the opportunity to attend the medical symposium in honour of the Hospital de Basurto, which was celebrating its centenary (thank you, Maite).

The Hospital de Basurto is a hospital in Greater Bilbao, an institution very much present in the lives of the people of Bilbao and, why not say it, of all Bizkaians.

It is part of Osakidetza, the public health service and care network run by the Basque Government.

The Hospital de Basurto was one of the first in Spain to have a medical faculty for teaching and training doctors and nurses, but as a consequence of the circumstances surrounding the Spanish Civil War, that project was interrupted.

It had a great past and it will have a great future.



At 9:00 in the morning the opening address of the symposium was given by the Basque Minister of Health, Dr. Gabriel Mª Inclán Iribar, who briefly presented the history of the hospital and the biography and contribution to medical science of each of the speakers, and invited everyone to enjoy a day that he considered the pinnacle of the celebrations for the centenary of the Hospital de Basurto.

Next, Dr. Kevin O. Leslie, pulmonologist and specialist in lung pathology, in perfect Spanish and with a talk entitled "Pulmonary carcinomas", focused on the basic concepts of lung carcinoma: he told us how 28% of cancer deaths are due to lung cancer, and reviewed the nutritional and genetic factors and the lifestyle habits (smoking) that cause it.

Jean Marco, cardiologist at the Cardiothoracic Centre of Monaco, spoke to us about "coronary intervention", an umbrella term for a set of techniques for treating coronary artery disease without the use of bypass or other perfusion techniques.

In the panel devoted to the future of new technologies in medicine, with a focus on cell therapy, robotic surgery and targeted radiotherapy, Dr. Alvaro Martinez, of Colombian origin and an expert in radiation oncology, told us about a revolutionary technique that makes it possible to synchronize the physiological movement of the organs with the delivery of the radiotherapy.

Richard L. Gardner, professor of zoology at the University of Oxford and a pioneer in the identification and culture of mammalian embryonic stem cells, spoke to us about the techniques for obtaining them.



In the photo above, which is slightly out of focus, you can make out Richard L. Gardner walking just in front of the lectern, behind two women who are talking.

He is the speaker I would have liked to ask what he thinks of the social aspects of stem cell research.

What he thinks about the social and technical barriers to stem cell research, and which are harder to break down: the technical barriers and limitations, or the social ones.

Also, in connection with the British authorities' decision to allow animal-human hybrid or chimeric embryos because of the shortage of egg donors, I would have liked to ask him what he thinks of the social fears about possibly aberrant results of mixing species.

Richard Gaston, chief surgeon of the urology department of the San Agustin clinic and one of the leading authorities on robotic urology, spoke to us about the application of robotics in surgery and, in particular, in surgery for the removal of prostate cancer.

He showed us films of his work, and this man really does have prodigious "hands".

And at last it was the turn of Jean Pierre Changeux, the leading expert on nicotinic receptors.



In the photo above we see him going over his notes for the presentation, entitled "Towards a Neuroscience for a capable person: Implications for Neuropsychiatry".



The photo above shows the first slide of Changeux's talk.

As Changeux's presenter, Dr. Miguel Angel González Torres, head of the Psychiatry department of the Hospital de Basurto, said, quite a few neuroscientists are moving on from examining the "micro" structures of the nervous system to asking about complex behaviours such as consciousness and social and empathic behaviour.

I agree with these words, but given the universality of the brain, which is involved in every human activity, this is something obvious that sooner or later had to happen.

What happens is that until big names in neuroscience respected for their scientific rigour (Damasio, Crick, Kandel, Gazzaniga, Koch...) embarked on investigating these subjects, consciousness, social behaviour, free will... were considered unscientific because they are hard to study objectively.

What puzzles me is how phenomena so inherently human were not examined by science until philosophers, the first to ask these questions, had influenced the scientists' agendas.

Changeux's talk, drawing on concepts from the French philosopher Paul Ricoeur such as that of the "free and conscious capable person", set out a research programme for the future which, resting on strong scientific evidence, may one day show us the neural bases of consciousness and of other human phenomena and experiences.



In the photo above, Changeux in action!

Changeux warned us that there are many levels of causation and organization in the brain and its functions, and that he sees many of his colleagues jumping from the "micro" levels to the "macro" levels without covering the whole path through the "meso" levels.

Nevertheless, he said that the only valid strategy is the reductionist one, and that from there one should build up and produce a synthesis with data derived from cultural influence, epigenetics, etc.

He spoke to us about intentional communication and interpersonal relationships, which involve the representation of the world, self-awareness, the recognition and perception of the other and the search for mutual rewards, and about empathy...



He spoke about the genetic universality of the Homo sapiens brain, about single genes responsible for brain disorders (ASPM, responsible for microcephaly; FOXP2, possibly responsible for language disorders...) and also about two paradoxes of the reductionist strategy.

The first paradox is the "non-linearity" between genes and behaviour.

Genes do not encode behaviours, but they do predispose them.

The second paradox is that of parsimony, identified with Ockham's razor.

Parsimony is very probably a human cognitive limitation: unable to take all the data into account, we resort to what is simple, which is not simplistic, but is in any case a partial representation.

He spoke about the radical transformation that culture works on the organization of the brain, or what these days is called "cultural neuroscience".



He offered a tentative definition of consciousness, whose evolution correlates directly with the expansion of the prefrontal lobe in human beings:

a subjective, internal space, a physical milieu where actions are replaced by simulations, plans, goals and any course of action, all of which are processed and evaluated in a global way


And he said goodbye with the hope that in the future neuroscience may contribute to a better understanding of ourselves, or what the oracle at Delphi commanded of us, and that this may promote peace in the world.

The last speaker, Dr. Jeffrey M. Drazen, editor-in-chief of the prestigious medical journal The New England Journal of Medicine, after Changeux's high-flying and in some respects speculative talk, as he himself acknowledged, brought us back to hard science and the rigour that scientific journals must apply when publishing research.



His talk focused on publication ethics and on the commitment and moral duty of journals to present research results honestly, precisely and truthfully.

Moral dilemma.

Juan is in the middle of a desert with no means of transport, and the nearest outpost of civilization is more than 50 kilometres away. But Juan will not be able to get there, because he has no strength left, he only has a canteen with very little water, and the temperature is over 35º. After a while Juan comes across a group of travellers, also lost and dehydrated, who managed to radio for help before the signal was lost for good. If one of them does not drink water immediately, he will die.

Is it moral for Juan to share/offer his water with the lost traveller who is about to die, knowing that if he does, Juan may well die because the help may not arrive and they will not be able to reach the town, but that if he does not, the traveller will certainly die?

Thursday, October 30, 2008

Basurto Congress.

I have just this minute got back from the medical congress commemorating the centenary of the Hospital de Basurto, held at the Euskalduna Conference Centre in Bilbao.

As you can see, the speakers were first-rate and the talks did not disappoint either.

The ones that interested me most were the talks by Jean Pierre Changeux and Richard L. Gardner, although the last one, by the editor of the medical journal The New England Journal of Medicine, Dr. Jeffrey M. Drazen, was relevant for explaining and making known the responsibility that scientific (medical) journals have to publish articles that are precise and truthful.

If I have time tomorrow I will post some pictures and write a more detailed commentary.

Quote of the day.

"Nature and Nature's laws lay hid in night: God said, Let Newton be! and all was light."
-Alexander Pope-

Tuesday, October 28, 2008

So much economic mathematics, so much analysis...

...this is the clearest explanation of the financial crisis. Via Significativos:





But of course, the NINJAs are not the only ones who should take the blame. The MONJAs (those who have "more than others in terms of job and assets") are just as guilty or more so, because they have used the victims of social exclusion as instruments.

Neuro-Oncology.

There is a fairly heterogeneous typology of brain cancers.

The list goes as follows:

Primary brain tumours: caused by cancerous cells arising in the brain itself.

Metastatic brain cancer: cancer from another part of the body that has spread to the brain.

Benign brain tumour

Primary central nervous system lymphoma

Cerebral sarcoma

Childhood brain cancer

Spinal cancer

Subtypes by the type of cell affected

Gliomas

Astrocytoma

Brainstem glioma

Ependymoma

Oligodendrogliomas

Medulloblastoma: also called primitive neuroectodermal tumour.

Meningioma

Schwannomas: they start in the Schwann cells; acoustic neuromas are of this type.

Craniopharyngiomas: located around the pituitary gland.

Germ cell tumours

Germinoma

Tumours of the pineal region

Neuroblastoma

Neurocytoma

Ganglioneuroma

Oligodendroglioma

Monday, October 27, 2008

An evolutionary analysis of leadership and followership.

Why is leadership so important?

Put a group of people in a room to carry out a task and you will soon see leader-follower structures emerge.

Leadership is present in social life, in organizations and industry, in politics, in culture, in groups of friends...

Although we know a great deal about leadership and the literature is enormous, we lack an interpretive framework that gives coherence to the evolution of leadership.

Because even though we know that leadership is a universal trait, the evolution of leadership goes hand in hand with that of followership.

Both strategies arose as solutions to problems of social coordination: the movement of groups, peaceful cohesion within the group, and competition between groups.

That is why the leadership-followership relationship is an arms race.

An ambivalent relationship between leaders and followers, because wherever there is a leader, the leader may abuse the followers, or the followers may conspire, challenge the leader and unseat him.

Just as there has to be a leader, there also have to be followers, and one side of the pairing is as important as the other.

But the literature has romanticized the leader to the point of attributing idealized qualities to him when this is not justified.

Leadership is not so much an individual attribute or characteristic as a resource of the group.

However much charisma, competence or skill a leader may have, it is the group that grants him leadership.

Our ancestors, who led a nomadic hunter-gatherer life on the African savannah, lived mainly in small groups of relatives in egalitarian communities.

It is in that environment that our "leadership psychology" evolved.

To hunt, to find resources, and even to resolve conflicts within the group (maintaining social order and peaceful harmony) or between groups (competition), they had to develop an effective decision-making system, and at times one person had to indicate which decision to take.

But group dynamics, evaluated and interpreted from an evolutionary perspective that takes the history of leadership psychology into account, tells us that the "psychology of followership" is even more interesting.

Most of the psychological literature on leadership has framed the study of leadership badly.

Part of the problem is that it has concentrated exclusively on studying the person in charge: the leader.

From an evolutionary perspective followership is as important as leadership, and the psychology of followership should therefore be studied with the same emphasis.

2.5 million years of living in egalitarian communities must have influenced the way we see the leader today.

All of this for one simple reason: the leader's goals and the followers' goals do not always coincide.

Depending on the group's needs, the prototype of leader is chosen for each context and situation.

There is no single mould or pattern for the leader.

Click here.

Saturday, October 25, 2008

Innate morality.

Is Morality Natural?
Science is tracing the biological roots of our intuitive sense of what is right and what is wrong.

Marc D. Hauser, Ph.D.
NEWSWEEK
From the magazine issue dated Sep 22, 2008

On Jan. 2, 2007, a large woman entered the Cango caves of South Africa and wedged herself into the only exit, trapping 22 tourists behind her. Digging her out appeared not to be an option, which left a terrible moral dilemma: take the woman's life to free the 22, or leave her to die along with her fellow tourists? It is a dilemma because it pushes us to decide between saving many and using someone else's life as a means to this end.

A new science of morality is beginning to uncover how people in different cultures judge such dilemmas, identifying the factors that influence judgment and the actions that follow. These studies suggest that nature provides a universal moral grammar, designed to generate fast, intuitive and universally held judgments of right and wrong.

Consider yourself a subject in an experiment on the Moral Sense Test (moral.wjh.harvard.edu), a site presenting dilemmas such as these: Would you drive your boat faster to save the lives of five drowning people knowing that a person in your boat will fall off and drown? Would you fail to give a drug to a terminally ill patient knowing that he will die without it but his organs could be used to save three other patients? Would you suffocate your screaming baby if it would prevent enemy soldiers from finding and killing you both, along with the eight others hiding out with you?

These are moral dilemmas because there are no clear-cut answers that obligate duty to one party over the other. What is remarkable is that people with different backgrounds, including atheists and those of faith, respond in the same way. Moreover, when asked why they make their decisions, most people are clueless, but confident in their choices. In these cases, most people say that it is acceptable to speed up the boat, but iffy to omit care to the patient. Although many people initially respond that it is unthinkable to suffocate the baby, they later often say that it is permissible in that situation.

Why these patterns? Cases 1 and 3 require actions, case 2 the omission of an action. All three cases result in a clear win in terms of lives saved: five, three and nine over one death. In cases 1 and 2, one person is made worse off, whereas in case 3, the baby dies no matter what choice is made. In case 1, the harm to the one arises as a side effect. The goal is to save five, not drop off and drown the one. In case 2, the goal is to end the life of the patient, as he is the means to saving three others.

Surprisingly, our emotions do not appear to have much effect on our judgments about right and wrong in these moral dilemmas. A study of individuals with damage to an area of the brain that links decision-making and emotion found that when faced with a series of moral dilemmas, these patients generally made the same moral judgments as most people. This suggests that emotions are not necessary for such judgments.

Our emotions do, however, have a great impact on our actions. How we judge what is right or wrong may well be different from what we choose to do in a situation. For example, we may all agree that it is morally permissible to kill one person in order to save the lives of many. When it comes to actually taking someone's life, however, most of us would turn limp.

Another example of the role that emotions have on our actions comes from recent studies of psychopaths. Take the villains portrayed by Heath Ledger and Javier Bardem, respectively, in "The Dark Knight" and "No Country for Old Men." Do such psychopathic killers know right from wrong? New, preliminary studies suggest that clinically diagnosed psychopaths do recognize right from wrong, as evidenced by their responses to moral dilemmas. What is different is their behavior. While all of us can become angry and have violent thoughts, our emotions typically restrain our violent tendencies. In contrast, psychopaths are free of such emotional restraints. They act violently even though they know it is wrong because they are without remorse, guilt or shame.

These studies suggest that nature handed us a moral grammar that fuels our intuitive judgments of right and wrong. Emotions play their strongest role in influencing our actions—reinforcing acts of virtue and punishing acts of vice. We generally do not commit wrong acts because we recognize that they are wrong and because we do not want to pay the emotional price of doing something we perceive as wrong.

So, would you have killed the large woman stuck in the cave or allowed her to die with the others? If you are like other subjects taking the moral sense test, you would say that it is permissible to take her life because you don't make her worse off. But could you really do it? Fortunately, there was a simpler solution: she was popped out with paraffin after 10 hours.

Hauser is a professor of psychology and human evolutionary biology at Harvard, and author of “Moral Minds” (Ecco/HarperPerennial).

I hope Newsweek doesn't rap me on the knuckles for this.

A little bit of rock and roll... doesn't hurt either.

Friday, October 24, 2008

Nonsense reaches neurophilosophy.

Here.

Copyright laws.

The press officer of the journal Nature has pulled me up for posting Philip Ball's article on the blog. Regretfully, I have to take it down: I was violating copyright.

There would be a lot to discuss about copyright, which has become obsolete in today's information society because it stifles creativity, "generativity" (a term coined by Zittrain), and many other things.

Philip Ball's article here.

(By the way, I am very much a layman in copyright law; could anyone tell me the difference between copying and pasting the text of an article into a blog, acknowledging the original source of publication, its publication date and its author, and posting a direct link to that article?)

Quote of the day.

"In theory, there is no difference between theory and practice. But in practice, there is."
-Jan L. A. van de Snepscheut-

Thursday, October 23, 2008

Plant rights?

I don't like taking things to extremes, but any attempt to ground human rights in a worldview of human uniqueness is doomed to failure.

Science since Darwin has shown us the direct line of interrelation between all living beings.

But what about plants?

Today we know that many of the qualities and competences once considered exclusively human (language, morality, culture, tool use...) are present in rudimentary form in other animals.

This fact, recognized by our science, underlies the discussion about extending some minimal legal rights to the more developed animals.

But again, what about plants?

At the request of the Swiss government, an expert ethics panel has been created to draw up a doctrine of plant rights, with the aim of preventing the arbitrary extermination of the flora.

A few years ago the Swiss government also introduced the following provision into its constitution:


"to take into account the dignity of creation when handling animals, plants and other organisms"


The philosophical-moral position defended by this panel of ethics experts is a "biocentric" one, according to which the life of any organism must be respected and the organism regarded as a moral subject for the simple fact of being alive.

The debate on extending human rights to other organisms remains open.

Advocates of animal rights are accused of a hyper-anthropomorphization of biological reality.

This debate is spurious, because in fact there is a gradation in anthropomorphization whose limits are not well defined.

But again, what about plants?

Now the defenders of plant rights are going to be accused of a "phytocentrism" or "phytomorphization" of reality.

Click here.

Update: Eduardo also has a post on this topic, as controversial as topics come (but consistent with the Darwinian paradigm). It is surprising how homogeneity of thought takes shape: I had had this post scheduled for a couple of days, and Eduardo and I have converged on the same day, with a similar photograph and the same message.

Wednesday, October 22, 2008

Jonathan Haidt on the moral foundations that shape our political ideology.

Psychological advice for a good investor.

The solution to the credit crisis, the housing boom and the subsequent collapse of subprime mortgages, banks and so on, as in any other financial crisis, will not be known until the measures that have been decided upon start to produce results; by a process of trial and error, with a memory of past events, we gradually learn our way to the solution.

But the fact that the solution (coordination and [semi-]nationalization of banks, purchase of toxic bank assets, and the injection of liquidity by the state) is not known in advance does not mean that the explanation of the financial crisis has not been available from the very first moment.

It is another matter that there are conflicts of interest, and that the hype of certain analysts, consultants, credit rating agencies, managers, the press and other pressure groups tries to hide the reality.

The explanation is that the Fed, the Treasury and the SEC in the U.S. should have been more active regulators, stopping bubbles before they start.

But of course, when a model in economic theory capable of predicting expectations within a theoretical framework based on principles of rationality (maximum expected utility, Bayesian updating...) comes up against the psychology of real, flesh-and-blood investors, it runs into market failures, crises, and financial and political myopia.

I don't know how many lessons we will draw from this crisis.

Perhaps the financial architecture will not change, nor will the causes that brought on the crisis disappear (perverse incentives, multimillion retirement packages and bonuses for top executives, greed, avarice...), nor will we move towards a new economic philosophy (although perhaps we will, because a neo-Keynesian has just won the Nobel Prize).

Nevertheless, one thing is already clear.

Economics can no longer be theorized and practised without regard to human psychology (the brain).

Advice for the investor here.

Tuesday, October 21, 2008

A philosophical life.

Via Leiter Reports.

A philosophical life according to:

David Chalmers.

David Rosenthal.

Michael Tye.

and

Stephen Stich.

Reading perception.

"Sgeun un etsduio de la uivenrsdiad ignlsea de Cmaridgbe, no ipmotra el odren en el que las ltears etsan ersciats, la uicna csoa ipormtnate es que la pmrirea y la utlima ltera esten ecsritas en la psiocion cocrrtea. El rsteo peuden estar ttaolmntee mal y aun pordas lerelo sin pobrleams. Etso es pquore no lemeos cada ltera por si msima prouqe la paalbra es un tdoo"

No, en realidad este es un meme apocrifo en forma de correo electronico que ha mutado varias veces desde que en 2003 empezo a propagarse por la red.

Pero, de hecho, si que contiene varias verdades y principios de la psicolinguistica sobre la forma en que nuestro cerebro procesa la informacion visual, y en concreto, el orden y posicion de las letras de las palabras para ser leidas.

Este fenomeno se conoce como "word-letter advantage": las palabras son un todo y se procesan antes que las letras aisladas.
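
Just to play with the idea, here is a minimal Python sketch (my own toy, not taken from the linked article) that applies the meme's manipulation: it shuffles the interior letters of each word while keeping the first and last letters in place.

import random

def scramble_word(word):
    # Keep the first and last letters fixed and shuffle the interior ones.
    if len(word) <= 3:
        return word
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

def scramble_text(text):
    # Word-by-word scramble; punctuation handling is deliberately naive.
    return " ".join(scramble_word(w) for w in text.split())

print(scramble_text("According to a researcher at Cambridge University it does not matter"))

Run it a few times: the output changes every time, yet it stays surprisingly readable.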

Click here.

Monday, October 20, 2008

The natural and the surnatural in Picasso's Les Demoiselles d'Avignon.


Charles Baudelaire distinguished between two ways of seeing: natural perception and "surnatural" perception.

The first way of seeing is the ordinary one, the one that lets you see the visual world dispassionately, disinterestedly, as though portraying reality.

The second way of seeing involves an affective perception, with a heavy emotional charge and a thaumaturgical connection, and it is the chief way of seeing in the service of pictorial contemplation.

The contemplation of art creates "surnatural" perceptions in you that only drugs or wine can create by way of surrogate, and then only artificially.

Surnatural perception distorts time and space; it draws you into the pictorial representation, into the pictorial space of the painting.

Picasso's Les Demoiselles d'Avignon leads you to a "surnatural" perception.

This work, completed in 1907, marked a turning point in modern art.

Many artistic, anthropological, cross-cultural, philosophical and historical interpretations have been made of this painting, which was euphemistically titled: The Young Ladies of Avignon.

The young ladies (or damsels, as my basic French suggests I translate it) are prostitutes.

A whole pictorial genre that Picasso borrowed from the Impressionists.

Prostitutes, according to Baudelaire, are "beauty in the midst of the modern city".

They represent the wild, the archaic, as against civilized norms (which is why several of them have facial features similar to African tribal masks).

This painting reflects everyday life in the brothels of the early twentieth century.

The more fortunate prostitutes could work in a "maison de tolérance", where they received daily medical inspections, unlike the less fortunate ones who plied their trade in a "maison de passe".

This painting captures precisely the moment in which the prostitutes, five in all, wait naked for the routine medical inspection.

In the final version of the painting five women are portrayed, looking at the viewer.

One of them enters from the left, pulling back a curtain.

The second and the third from the left wait circumspectly (staring at the viewer).

The fourth also seems to be entering, making her way from behind some curtains, with tribal physiognomic features.

The fifth and last, with tribal features most resembling an African mask, is seated.

In the centre there are pieces of fruit (watermelon, grapes and peaches).

In the beginning, in the original sketch for the painting, there are two men, the doctors who are going to carry out the inspection:



But what is it that makes this painting one of the most representative works of twentieth-century art?

It introduces a new pictorial style (it gives rise to a new artistic grammar that the critics called Cubism).

It also does away with depth perspective.

It is an unfinished painting.

The "pictorial fact" is no longer painted from beginning to end.

What Picasso introduces are new forms that take in every angle from a single viewpoint and appeal to the viewer's participation, because it is the viewer who has to finish the work, mentally rotating the scene, the image, in order to construct it.

Saturday, October 18, 2008

Artificial noses (olfactory systems).

No, this is not a post about the reconstructive, or cosmetic, surgery of Letizia's nose.

Animal olfactory systems share many features with one another, which implies that nature has found an optimal solution for discriminating smells.

Smells (chemical mixtures) are smells because of the shape of the molecules given off by materials.

According to the molecular-shape theory and its "lock and key" process, the specific shape of a molecule associated with a smell binds to a receptor in the nasal epithelium, producing neuronal transduction and the consequent activation of the nerve cells of the nose, whose signals travel directly to the hippocampus (which is why smells evoke so many memories) and finally encode the subjective character, or qualia, of the smell.

Although there are other theories, such as the one proposed by the biophysicist and fragrance designer Luca Turin, called the vibrational theory of smell.

The olfactory system is, together with the visual system, one of the best studied and best understood sensory information-processing systems (perceptual systems).

Neuroengineers such as Dominique Martinez draw their inspiration from the biological olfactory systems of living organisms in order to mimic olfactory perception with artificial neural networks and learning algorithms.

Artificial noses, or artificial olfactory systems, built into robots can be used to detect gases, explosives, drugs...
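
To give a feel for the kind of learning involved, here is a minimal, self-contained Python sketch (my own illustration under simplified assumptions, not Martinez's actual system): it trains a small neural network to tell two simulated "odours" apart from the noisy readings of an eight-sensor array.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Each odour gets a characteristic response profile across 8 chemical sensors;
# every simulated "sniff" is that profile plus measurement noise.
profiles = {"ethanol": rng.uniform(0, 1, 8), "acetone": rng.uniform(0, 1, 8)}
X, y = [], []
for label, profile in profiles.items():
    for _ in range(200):
        X.append(profile + rng.normal(0, 0.1, 8))
        y.append(label)

# A tiny multilayer perceptron stands in for the artificial olfactory system.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(np.array(X), y)

sniff = profiles["acetone"] + rng.normal(0, 0.1, 8)
print(clf.predict([sniff]))  # should print ['acetone']

Real electronic noses work on the same principle, only with far messier sensor responses and far more sophisticated learning algorithms.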

A sample of these applications is the following video:




Click here.

Friday, October 17, 2008

Quote of the day.

"In 1995 we began to read the complete genetic sequence of bacteria, insects, plants, animals, humans.
It is written in a four-letter code (A, T, C, G)...
If you change this code, just as if you changed the code on a PC's hard drive... you change the message, the product, the outcome.
We are beginning to acquire, direct and exercise control over the evolution of every form of life on this planet... including ourselves."
-Juan Enriquez-

Thursday, October 16, 2008

An evolutionary theory of financial markets.

The mantra everyone keeps repeating is that the state is inefficient at providing services and that the market is the solution because it always reaches an equilibrium.

According to the standard conception in the modelling of markets, the efficient market hypothesis, prices and goods converge to a competitive equilibrium.

Even if one good is over- or undervalued, the price of other goods serves as a signal that provides the information to correct the deviation.

The efficient market hypothesis also says that all economic agents have complete information, which they use at exactly the right moment.

(Has anyone wondered, as I have, which human being possesses all the relevant information?)

However, for years psychology, experimental economics, behavioural economics and, more recently, neuroeconomics have been pointing out the innumerable contradictions of the rational model that lies behind the behaviour of economic agents in the markets.

Investors, fund managers and shareholders not only never have all the relevant information (otherwise, how do fund and company managers walk away with juicy compensation packages, incentives, bonuses or multimillion retirement deals at the shareholders' expense, unless they are hiding information?), but they are driven by enthusiasm and uncontrolled frenzy just as much as by fear and depression, as if they suffered from bipolar affective disorder.

The market is not as rational as it seems to us, and the efficient market hypothesis is dead.

But for Andrew Lo, director of MIT's Laboratory for Financial Engineering, these two views are two sides of the same coin, and he has proposed a replacement: the adaptive market hypothesis.

Lo has proposed an interpretive framework based on the theory of evolution, in which financial markets are to be understood through the biological principles of competition, adaptation and natural selection.

What is interesting about Lo's proposal (although in my view it is too optimistic about the possible reconciliation, though perhaps much to the taste of those attracted by a Darwinism applied to finance in which firms and organisms are equated) is that it asserts that two paradigms considered antagonistic can coexist, and moreover it does not rule out arbitrage or occasional regulation in the markets.

On the one hand neuroeconomics, behavioural economics, neurofinance and the psychology of economics, and on the other rational decision theory applied to economics, or the traditional hypotheses of economics, can coexist peacefully.

For Lo, the systematic errors that investors make by not taking into account the way our brain (mind) accommodates signals from the environment are nothing more than heuristics applied out of context.

In Lo's words: "Just as the wobbling movement of a lizard on land is inefficient, the same movement in water is adaptive."
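
To make the "out of context" point concrete, here is a toy Python simulation (my own sketch, not Lo's model): a naive trend-following heuristic tends to earn money in a drifting, trending market and to lose it in a mean-reverting one.

import numpy as np

rng = np.random.default_rng(1)

def trend_following_pnl(prices):
    # Heuristic: hold +1 after an up day and -1 after a down day.
    moves = np.diff(prices)
    positions = np.sign(moves[:-1])               # decided from yesterday's move
    return float(np.sum(positions * moves[1:]))   # paid off by today's move

# Environment 1: a trending market (random walk with upward drift).
trending = 100 + np.cumsum(rng.normal(0.5, 1.0, 1000))

# Environment 2: a mean-reverting market (prices pulled back towards 100).
reverting = np.empty(1000)
reverting[0] = 100.0
for t in range(1, 1000):
    reverting[t] = reverting[t - 1] + 0.8 * (100.0 - reverting[t - 1]) + rng.normal(0, 1.0)

print("P&L in the trending market:      ", round(trend_following_pnl(trending), 1))
print("P&L in the mean-reverting market:", round(trend_following_pnl(reverting), 1))

The heuristic itself never changes; only the environment does, which is exactly the sense in which a bias can be adaptive in one market and maladaptive in another.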

The problem is: the cognitive biases that investors, fund managers, shareholders... display, for which market did they evolve, if for the current market we live in they are not very adaptive?

The claim is that certain cognitive traits that evolved in certain markets (environments) are retained and applied to other markets (environments).

This betrays a misunderstanding of biology. My locomotor system and its biomechanics have the same intrinsic constraints whether the medium is water or land.

The two media offer different resistances, but I cannot change my potential.

In other words, natural selection is highly conservative: it uses the same tricks in different environments.

(One must be very careful with economic pseudo-Darwinisms.)

Click here and here.

Malcolm Gladwell on how to hire the right person.

Wednesday, October 15, 2008

Quote of the day.

"Those tax cuts, and not indulgent spending, are the main cause of the (federal) deficit."
-Paul Krugman-

Timothy Williamson's tribute to Ruth Barcan Marcus.

Via Leiter Reports


Laudatio: Professor Ruth Barcan Marcus

Timothy Williamson


The central methodological advantage that analytic philosophy enjoys over all other forms of philosophy, past and present, is the rigorous framework of formal logic within which it can conduct its inquiries.

Although different systems of logic are needed for different branches of philosophical inquiry, in the core area of metaphysics and surrounding fields for the past forty years the most natural and fruitful setting for inquiry has been quantified modal logic, in which we not only have the resources of first-order logic with identity but can also raise explicit questions of possibility and necessity with elegantly perspicuous generality.

The first study of quantified modal logic as a branch of formal logic was published in March 1946 in The Journal of Symbolic Logic, under the title ‘A functional calculus of first order based on strict implication’, by Ruth C. Barcan, a logician whose identity with Professor Marcus is of course necessary.

The system that she presented there did not simply combine pre-existing non-modal quantified logics with pre-existing unquantified modal logics. It identified a crucial axiom about the interaction of the two sides, the interchange of modal operators with quantifiers.

The axiom says that if there can be something that has a certain property, then there is something that can have that property. This is the famous Barcan formula; most logicians can only dream of having a formula named after them. Its converse is also derived in the paper.
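
In the standard notation of quantified modal logic (my rendering, not a quotation from the 1946 paper), the Barcan formula and its converse read:

\[
\Diamond \exists x\, Fx \;\rightarrow\; \exists x\, \Diamond Fx \qquad \text{(Barcan formula)}
\]
\[
\exists x\, \Diamond Fx \;\rightarrow\; \Diamond \exists x\, Fx \qquad \text{(converse Barcan formula)}
\]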

The Barcan formula and its converse are neither obviously correct nor obviously incorrect (on the intended interpretation), but they are of the utmost importance, both technical and philosophical, to the distinctive nature of quantified modal logic. Technically, their presence or absence makes a large strategic difference to the ways in which the proof theory and formal semantics of quantified modal systems can be developed.

But this is closely connected to their philosophical significance too, for together they are tantamount to the claim that it is non-contingent what individuals there are. Although that non-contingency claim may sound implausible on first hearing, it can be given a sustained defence in more than one way, either by taking a narrow view of what individuals there can be or by taking a broad view of what individuals there are.

In metaphysics there are disputes whose content is notoriously hard to pin down, for example concerning actualism (the thesis that ‘everything is actual’) and its analogue for time, presentism (the thesis that ‘everything is present’). These disputes are threatened by trivialization; they can easily sound verbal. It is increasingly appreciated that the best way to focus them on worthwhile issues may be to reconfigure them as disputes over the validity of the Barcan formula and its converse and their analogues in tense logic.

Those formulas lie at the heart of other metaphysical debates too: for example, they present a lethal threat to one contemporary version of the correspondence theory of truth, according to which a truth has to be made true by some thing, a truthmaker. We are going to be hearing much, much more of the Barcan formula and its converse in metaphysics.

The 1946 paper did not initially meet with universal acclaim, although its importance was recognized by C.I. Lewis, one of the founders of modern modal logic. Its main critic was W.V.O. Quine, who argued that its application of modal operators to formulas with free variables was incoherent and unintelligible (although he did concede that Miss Barcan ‘is scrupulous over the distinction between use and mention of expressions — a virtue rare in the modality literature’: rare praise from the guardian of the use-mention distinction).

Quine’s original criticisms were technically unsound, and he was forced over the years into a series of revisions that eventually reduced the charge to one of a commitment to Aristotelian essentialism. Even there, technical results vindicated Professor Marcus’s later reply that the commitment was to the intelligibility, not the truth, of essentialism, and that in any case there may well be a scientific basis for some form of essentialism. Philosophy has gone Marcus’s way, not Quine’s, but the vindication of her paper was a gradual process: it was years ahead of its time.

In 1947, Miss Barcan published another pioneering paper on quantified modal logic, ‘The identity of individuals in a strict functional calculus of second order’. It is best known for the first proof of the necessity of identity, the thesis that individuals cannot be contingently identical. For many years this was regarded as a paradox, perhaps even a reductio ad absurdum of quantified modal logic.
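
In symbols (a standard reconstruction, not the paper's own formulation), the thesis is:

\[
\forall x\, \forall y\, \bigl( x = y \;\rightarrow\; \Box\,(x = y) \bigr)
\]

and the proof idea is that \(\Box(x = x)\) holds for every x, so if x = y, then by the substitution of identicals \(\Box(x = y)\) holds as well.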

There were thought to be obvious examples of contingent identity. But on further analysis the apparent counter-examples turned out to rely on philosophical confusions, concerning either the scope of definite descriptions or the distinction between the contingent and the a posteriori, or at least on deeply questionable metaphysical assumptions. In contemporary philosophy, the necessity of identity is widely seen as a vital insight into modal metaphysics, and a valuable constraint on philosophical theorizing.

The 1947 paper is pioneering in another respect too. It is a system of second-order modal logic. That is, it permits quantification into predicate position, not just into name position as in first-order logic. In cruder terms, it lets one generalize over properties, not just over the individuals that have those properties. Despite Quine’s opposition, second-order non-modal logic is now widely recognized as the appropriate logical framework for many mathematical theories and other applications. But very little attention has been paid to second-order modal logic. I predict that it will play an increasingly central role as the framework for many debates in metaphysics and other areas of philosophy, and that this aspect of the 1947 paper will turn out to have been more than sixty years ahead of its time.

Who was the author of these seminal works? She was born in 1921 in New York, to Sam and Rose Barcan, Russian Jewish immigrants who had settled in the Bronx. Ruth Barcan graduated in 1941 with a B.A. in mathematics and philosophy from New York University, where she had also studied some physics, history and classics.

Informally, she learnt mathematical logic there from J.C.C. McKinsey, who encouraged her precocious interest in both technical and philosophical aspects of modal logic and her move to Yale for graduate studies. There she received her Ph.D. in 1946 with a dissertation on quantified modal logic, supervised by Frederic Fitch. Her early papers were the fruits of that research. She spent the year 1947-8 as a postdoctoral fellow at the University of Chicago, in Rudolf Carnap’s seminar; he too made early contributions to quantified modal logic. Astonishingly, from 1948 to 1963 Ruth Barcan Marcus, as she became, had no regular affiliation with a major department, and never applied for one.

She was a wife and mother, living the life of a housewife and modal logician. However, it was not a life of complete professional isolation: she participated in the life of the greater Chicago philosophical community, and occasionally gave invited lectures or courses. The change of professional name was Alonzo Church’s doing, in his capacity as editor of The Journal of Symbolic Logic. She had married in 1942, but published under her maiden name until seven or eight years later, when he found out and informed her that future submissions would have to be under her ‘legal’ name.

Only gradually did the philosophical community realize that she had struck gold, not fool’s gold. Of course, gold is fool’s fool’s gold, but not only fools were deceived: the proper appreciation of her work required a diametric change of philosophical perspective, feeling one’s way out of deeply but often tacitly held commitments. It also required a willingness to learn about logic and metaphysics from a woman. Such changes do not happen overnight. Nevertheless, it should have been clear from the beginning that whether or not what she had found was gold, it was at least a mineral of quite unusual quality.

Things improved. Increasing attention was paid to her papers. Philosophical logicians such as Arthur Prior, Saul Kripke and Dagfinn Føllesdal saw their interest and significance. From 1960 onwards, after a gap of seven years, Professor Marcus published a burst of articles in which she reflected on the interpretation of quantified modal logic and answered Quine’s criticisms. One of the ideas in them that resonates most with current philosophy of language is that of proper names as mere tags, without descriptive content.

This is not Kripke’s idea of names as rigid designators, designating the same object with respect to all relevant worlds, for ‘rigidified’ definite descriptions are rigid designators but still have descriptive content. Rather, it is the idea, later developed by David Kaplan and others, that proper names are directly referential, in the sense that they contribute only their bearer to the propositions expressed by sentences in which they occur. Direct reference entails rigid designation but not vice versa. It was the wildest unorthodoxy when she wrote, and is the purest orthodoxy now.

These papers on modal logic and metaphysics open up and analyse a network of further themes: the nature of extensionality as a principle in semantics; the philosophical groundwork for the necessity of identity; the possibility of a substitutional interpretation of the quantifiers, but also of an objectual interpretation restricted to actually existing concrete individuals, both of which can validate the Barcan formula and its converse; the status of essentialism; the extension of these ideas to properties, sets and other ‘collectives’.

The discussion is extraordinarily fertile, tersely open-minded and exploratory, as befits the state of the discipline, although still sharply constrained by logic: the emphasis is more on raising questions than on settling them. She is laying out the agenda for a discussion that has been at the heart of philosophy ever since, concerning issues that are as alive now as they were then. Many have contributed substantively to that discussion; there is so much credit to go round that all can afford to be generous over its distribution.

Institutional recognition flooded in too. In 1963, Ruth Barcan Marcus became the founding chair and professor of the Department of Philosophy at the University of Illinois at Chicago, a position of great trust which she held until 1970. After three years at Northwestern University, she was then Halleck professor of philosophy at Yale from 1973 to 1991, and subsequently Senior Research Scholar at Yale and Visiting Distinguished Professor at the University of California at Irvine.

She was a long-standing Chairman of the National Board of Officers of the American Philosophical Association and President of its Central Division, President of the Association for Symbolic Logic (which she helped to achieve financial autonomy) and President of the Institut International de Philosophie and thereafter Honorary President, in addition to extensive service on editorial boards, external review panels and other committees that underpin the collective life of the profession. She has held visiting research fellowships at Oxford, Cambridge, Edinburgh, the Stanford Center for Advanced Study in the Behavioral Sciences and the National Humanities Center, and a residency in Bellagio. She has been a Fellow of the American Academy of Arts and Sciences since 1977.

A festschrift packed with distinguished authors was published in her honour. She was awarded an Honorary Doctorate by the University of Illinois, the Wilbur Cross Medal by Yale, a medal by the College de France, the Machette Prize for contributions to the profession from the Machette Foundation, the American Philosophical Association’s Quinn Prize for service to philosophy and philosophers — and now, of course, the Lauener Prize itself. Not least amongst those services to the professions has been her formidable defence of the highest intellectual standards, of rigour and other core philosophical values, against compromise with fashionable political and cultural pressures.

We do not always expect much activity from great monuments of the profession, but in Professor Marcus’s case recognition coincided with a remarkable widening of the range of her creativity. Already in 1966 a pregnant note in Mind had helped clarify the interpretation of iterated deontic modalities. Her most-cited paper is one in The Journal of Philosophy from 1980 on moral dilemmas and consistency, in which she refuted the popular idea that moral dilemmas involve mutually inconsistent moral principles, and showed that they provide no support for moral anti-realism.

She made a powerful case against conceptions of belief that put too much weight on language-use rather than non-linguistic interaction with the external environment, and defended an elegant analogy between belief and knowledge, on which belief requires consistency just as knowledge requires truth. In the history of philosophy, she applied her expertise on modal matters to Spinoza’s ontological ‘proof’ of the existence of God and her ideas on names to the development of Russell’s later views on ontology and reference.

The link with Bertrand Russell is no coincidence. If you look at the index to Ruth’s selected philosophical essays, Modalities, you will see that he has by far the longest entry of anyone. Like Russell, she uses logic as an essential and creative discipline for philosophy. She is every bit as good as he was at suffering fools gladly. Like Russell, she is willing to try out a variety of ideas with undogmatic experimentation, to follow the argument where it leads, however unpopular the conclusion, while still retaining exactly what he called ‘that feeling for reality, which ought to be preserved even in the most abstract studies’. In reading her work, one has a strong sense that there is truth and falsity in philosophy, just as in other sciences, however hard it is to tell the difference.

Sometimes, in sincerely honouring a genuinely distinguished philosopher, one nevertheless feels that in the end all their distinctive ideas will turn out to lie on the false side of the line. So it is a special pleasure to have been praising Ruth, many of whose main ideas are not just original, and clever, and beautiful, and fascinating, and influential, and way ahead of their time, but actually — I believe — true. The award of the Lauener Prize to her must encourage us all to have the courage and patience to carry on the work of analytic philosophy according to the highest standards in our tradition.

Tuesday, October 14, 2008

The first musical instrument.


Many archaeological remains of musical instruments have been found, particularly in Neanderthal contexts, but almost all of them were in a very poor state of preservation.

To date, the oldest musical instruments are some multi-note flutes made from the ulna of a red-crowned crane, Grus japonensis, found at the Jiahu excavation in Henan province, China.

They are about 9,000 years old, have between 5 and 8 holes, and can still be played.

Analysis of the flutes with a Stroboconn (a stroboscopic sound analyser) has shown that their tonal scale is very similar to the Western eight-note scale (do-re-mi).

Two recordings of the flutes can be heard here.

Original article here, and another related article here.

Monday, October 13, 2008

The 2008 Nobel Prize in economics...

It goes to Paul Krugman, for his analyses of trade patterns and the geography of economic activity.

A dangerous idea.

What is the most striking idea of recent years?

The idea of the prosumers.

A neologism derived from the English words "professional", "producer" and "consumer", referring to all the unpaid volunteers, open-source programmers, bloggers, parents, freelance writers, amateurs who offer their talent, their knowledge, their skill, their time... expecting nothing in return, to be consumed by others as a free lunch.

In other words, prosumers are outside the monetary system that governs the economy.

The originator of this idea is the American intellectual, futurist and writer Alvin Toffler, and this idea really is a headache for the capitalist system.


Because, as Alvin Toffler explains very well:

Society needs people who take care of the elderly and who know how to be compassionate and honest. Society needs people who work in hospitals. Society needs all kinds of skills that are not only cognitive; they are emotional, they are affective. We cannot build society on data.


The current version of the market economy is the knowledge economy, the economy of the information society.

And it has been able, by indirect means, to put a price on the intangible we may call intellectual capital.

But how do you put a price on diligence, on motivation, on affection... it cannot be done even indirectly.

Click here, and become a prosumer!

Friday, October 10, 2008

Neurohistory.

When Does History Begin?
by Daniel Lord Smail

Back when I was in grade school — I was born in 1961 — it was pretty clear that history began in 1492. We did cover the Native American peoples in our social studies classes, and since I grew up in Wisconsin this meant the Chippewa. But the Chippewa nation didn't exactly have a history. All they had was a collection of timeless customs, encapsulated in the frozen dioramas we went to see in the State Historical Society Museum in Madison. We never had to memorize any dates associated with the Chippewa. In this sense, Wisconsin came into the stream of history only when the first French traders arrived and set things on the move, in the same way that Christopher Columbus magically brought history to North America as a whole. A thick curtain shrouded all that lay before. There was something back there, but it wasn't connected to the time stream of what we called "history." It never would have occurred to any of us to ask what the Chippewa were doing in Wisconsin at the same time that the Romans were doing things in Rome.
So when does history begin? In the K-12 and university curricula nowadays, it starts a lot earlier than 1492. But even if our history curricula have been stretched, there's an eerie correspondence between my schoolboy's sense of history and the way that professional historians and school curricula frame history today. Several years ago I was a fairly junior faculty member at a university in New York State. A senior colleague and I were talking about what it was like to teach college history back in the 1960s. "I taught four courses a semester," he told me. "I did everything: Greeks, Romans, all the way back to the beginning of history." He meant Sumeria around six thousand years ago. The phrase stuck out because at the time I was teaching an undergraduate history course, "A Natural History," which began around three million years ago. To be fair, he didn't mean that humanity began in the watery, irrigated fields of Mesopotamia, the secular equivalent of the Garden of Eden. He meant that something we conventionally call "history" began there. But there is still that thick curtain shrouding the other side from view. Beyond it, there are no dates. No history. An undifferentiated assortment of hunter-gatherers, with their timeless customs, their cave paintings, and their solitary, nasty, brutish lives.

This chronological habit whereby "history" is demarcated from "prehistory" is typical of departments of history in colleges and universities in the United States. Actually, many departments now lack a historian of ancient Greece or Rome, and Sumeria itself is long gone. But the Sumerian origins of history live on in Western Civ courses and textbooks because it is a comforting and familiar place to begin. We have to start somewhere, right? Otherwise, the dark abyss of time opens sharply beneath our feet, and we teeter precariously on the brink, facing the awful immensity of the past. Dimly, we can make out the archaeologists, paleoanthropologists, and biologists who toil away on the other side of the chasm. Yet we can safely leave all that to them, since for some reason it's not history.

Few professional historians deny that there is something over there, on the other side of the chasm. The problem is that we don't know how to think of it as history. Hence, there is widespread if tacit agreement with the memorable phrase coined by two influential historians in 1898: "No documents, no history." And since writing is a little over five thousand years old, this limits history to the past that we're used to. Oddly enough, it's okay to use nondocumentary evidence after writing has reached a given society. Thus, U.S. historians allow themselves to consult archaeology in their efforts to reconstruct the history of the New World after 1492. What they allow themselves to do less often is to rely on archaeological evidence in the absence of documents. The existence of contemporary documents somehow "cleanses" the archaeological evidence of its scientific taint and makes it worthy of being history.

By this logic, we are limited to a history encompassing no more than five or six thousand years. It is hardly a coincidence that this time frame corresponds to the Judeo-Christian chronology, according to which the world was created in 4004 BC. One hundred and fifty years ago, the limits of Judeo-Christian chronology were cast off during the course of an intellectual revolution at least as significant as the discoveries of heliocentricity and relativity. Thanks to Charles Lyell, Charles Darwin, and others, we are now aware of the immensity of geological and astronomical time. We have learned about our primate ancestry and the unity of life. Yet we still teach history as if it begins between the two great rivers of Mesopotamia around six thousand years ago. In this way, we translate the story of Genesis into suitably secular terms but leave the basic narrative intact. For all intents and purposes, "history," as framed in our curricula and syllabuses, has not yet experienced the Darwinian revolution.

So what would history look like if we jettisoned the idea of prehistory, pulled aside the curtain, and launched ourselves into the abyss of time? A deep history can never enjoy the full range of sources available to historians of the more recent past. Biography and the history of ideas are pretty hopeless, and we can never touch the people of the deep past through their own words. But there's history there to be written. Jared Diamond, Tim Flannery, and others have shown how ecology, environment, and disease can provide mind-capturing ways to connect the deep past to the recent past. We can plot the movements of peoples, things, and phonemes over the past fifty thousand years with considerable fidelity, writing histories that talk about human diasporas, trade, and status hierarchies. Bones and fossilized excrement provide extraordinary insights about health, diet, and the gendered division of labor, themes that connect to the work of historians of the recent past. The dark abyss isn't so dark anymore. As long as we give up the association of human history with cities and empires, as long as we acknowledge that bones, tools, grave goods, fireplaces, trash heaps, clothes, phonemes, and genes are just as worthy as documents, there is no end to the possibilities of a deep history.

More than anything, the new science of neurobiology has provided manifold ways to write a long history centered on humanity's defining feature, the brain. We now know a good deal about the operation of neurotransmitters like dopamine and serotonin in the brain-body system. We have begun to piece together how these neurochemicals were involved in the co-evolution of the human body and human culture over several hundred thousand years. Obviously we cannot measure levels of dopamine in the synapses of dead people. What we can do is develop histories sensitive to the fact that a great many human practices and human institutions — liturgies, rituals, spectacles, foods, drugs, forms of torture and deprivation — have innumerable physiological consequences. This did not come about by accident. The human institutions that have emerged in the past five thousand years may have been designed by kings, priests, administrators, or artisans, but they were also "designed" by the process of cultural selection to modulate or manipulate the brain-body chemistry of oneself or one's subjects or clients. This kind of insight can help us understand, say, the evolution of practices of sensory deprivation in monastic religions. What better way is there to inculcate an addiction to the prayers, liturgies, and ascetic practices that lighten the unpleasant sensation of dopamine deprivation? A neurohistory, written in light of neurobiology, can help us see how the modern world economy is designed to deliver dopamine, serotonin, epinephrine, and all the neurotransmitters and hormones that lend color and pizzazz and comfort to our lives. But none of this will make any sense until we grasp the long history of the brain.

The first time I taught the deep history of humanity, seven years ago, an anonymous student made this comment on a course evaluation form: "This is the first history class that ever made sense to me." My students pressed, and pushed, and interrogated, but they were excited about what they learned and their sense of history was stretched. And, boy, does history ever need stretching. Last year, by my reckoning, half of the senior honors theses written in my history department had a chronological balance point located after 1939. Three-quarters dealt with the twentieth century. In this way history has been reduced to a branch of current affairs. So let's make history historical again. It's time to foster anew our native sense of wonder about the deep past. And by refashioning our idea of what history is, by coming to terms with the Darwinian revolution, we can abandon the secular Eden of Mesopotamia and start our history where it ought to begin: in Africa.


÷ ÷ ÷

Daniel Lord Smail is Professor of History at Harvard University. He is the author of Imaginary Cartographies (1999), which won the American Historical Association's Herbert Baxter Adams Prize and the Social Science History Association's President's Award; and The Consumption of Justice (2003), which won the Law and Society Association's James Willard Hurst Prize. He is also co-editor of Fama: The Politics of Talk and Reputation in Medieval Europe (2003).

Related links here.

Thursday, October 09, 2008

The first African-American neurosurgeon.


There are professions that demand the noblest character, the steadiest aptitude, the most devoted disposition, and the finest talent the human mind can cultivate.

In many of these professions there is little more than vanity, and plenty of ego running loose.

And within one of these professions there is a practice that demands even more: neurosurgery.

It is no hyperbole to say that neurosurgeons are a breed apart.

But when you come across someone above the average in a distinguished and exclusive profession reserved for the very best, someone who is humble and, on top of that, a true pioneer, one can only feel admiration and deep respect.

In mid-twentieth-century America, black men ate in areas set aside for them, away from any contact with whites. They rode at the back of the bus. They were discriminated against socially and in the workplace.

The black man was inferior in the eyes of the white man.

At the dawn of the civil rights movement, black Americans began to breathe new air.

They could enter university and, if truly deserving, go on to practice a liberal profession.

As you can imagine, becoming a doctor, and a black doctor at that, was a social phenomenon and marked an unprecedented cultural shift: proof that, given the right social opportunities, a black man was capable of doing the same as a white man.

Born in December 1901, Clarence Sumner Greene Sr. received his degree in dental surgery from the University of Pennsylvania in 1926.

After a year of dental practice he did not feel fulfilled, and he enrolled at Harvard University for specialist medical training from 1927 to 1929.

In 1932 he returned to Pennsylvania to earn a master's degree, and in 1936 he received his medical degree from Howard University (known as the "Black Harvard").

He completed his internship year at the Cleveland hospital. He later served as a surgical resident under Dr. John Turner at Douglass Hospital in Philadelphia.

Once back in Washington, he was appointed instructor of anatomy and physiology at Howard University.

While there, the great neurosurgeon Wilder Penfield called on him to spend a two-year residency at the famous Montreal Neurological Institute, and after his stay these were the words of recognition that the legendary Dr. Penfield dedicated to Dr. Greene:

"El Dr. Greene ha trabajado con nosotros en el laboratorio de neuropatologia, y ha realizado un trabajo excelente, y cuando estaba de guardia su trabajo fue impecable. Su actitud hacia la profesion es algo que uno desearia tener, es un hombre etico, afabale y recto... es un serio estudiante de la neurocirujia y un buen neurocirujano, y amable con sus pacientes"

(Wilder Penfield's written communication, American Board of Neurological Surgery, September 4, 1985.)


After his return to Howard, he was appointed head of the division of neurosurgery, and there over the years he performed the first craniotomies for intracranial aneurysms, surgery for herniated intervertebral discs, and sympathectomies for hypertension.

He married Evelyn Gardner, with whom he had two children, Carla and Clarence Jr. (Clarence Jr. is currently a pediatric surgeon in Kansas City.)

He was promoted to full professor (roughly the equivalent of a chair in neurosurgery), and a hospital unit was named after him during his lifetime.

Dr. Greene died in 1957, becoming a symbol for subsequent generations of African-American neurosurgeons.

Article here.

Tuesday, October 07, 2008

The 2008 Nobel Prize in Physics...

Goes to Yoichiro Nambu for his discoveries concerning spontaneous symmetry breaking in subatomic physics, and to Makoto Kobayashi and Toshihide Maskawa for their discoveries concerning the origin of the broken symmetry that predicts the existence of at least three families of quarks in nature.

Daniel Dennett's candidacy for the Nobel Prize in Literature: now!


Once again the Nobels arrive, and one of the most penetrating and provocative philosophical pens on the current scene has still not received a prize that was destined to be his.

Like every great intellectual, Dennett does not only devote himself to revealing and untangling the deepest unsolved mysteries of the structure of our world; he is also a thinker engaged with the planetary problems that trouble us.

His campaign, perhaps excessively harsh, against every supernatural belief that carries dramatic repercussions in this "natural world" (think of the suicide bombers who blow themselves up believing they are headed straight to paradise) is one sign of this commitment.


For his books, his philosophical thought, his personality, and his impact beyond the circle of academic and professional philosophy, Dennett deserves this prize more than anyone.

Monday, October 06, 2008

The 2008 Nobel Prize in Physiology or Medicine...

Goes to the co-discoverers of the human immunodeficiency virus, Françoise Barré-Sinoussi and Luc Montagnier, and to Harald zur Hausen, the leading authority on cervical cancer.

Interview with Daniel Dennett.

SCIENCE & SPIRIT
Daniel Dennett's Darwinian Mind: An Interview with a 'Dangerous' Man
by Chris Floyd

The outspoken philosopher of science distills his rigorous conceptions of consciousness, and aims withering fire at the dialogue between science and religion.


In matters of the mind—the exploration of consciousness, its correlation with the body, its evolutionary foundations, and the possibilities of its creation through computer technology—few voices today speak as boldly as that of philosopher Daniel Dennett. His best-selling works—among them Consciousness Explained and Darwin’s Dangerous Idea—have provoked fierce debates with their rigorous arguments, eloquent polemic and witty, no-holds-barred approach to intellectual combat. He is often ranked alongside Richard Dawkins as one of the most powerful—and, in some circles, feared—proponents of thorough-going Darwinism.

Dennett has famously called Darwinism a "universal acid," cutting through every aspect of science, culture, religion, art and human thought. "The question is," he writes in Darwin’s Dangerous Idea, "what does it leave behind? I have tried to show that once it passes through everything, we are left with stronger, sounder versions of our most important ideas. Some of the traditional details perish, and some of these are losses to be regretted, but...what remains is more than enough to build on."

Consciousness has arisen from the unwilled, unordained algorithmic processes of natural selection, says Dennett, whose work delivers a strong, extensive attack on the "argument from design" or the "anthropic principle." But a world without a Creator or an "Ultimate Meaning" is not a world without creation or meaning, he insists. When viewed through the solvent of Darwinism, he writes, "the ‘miracles’ of life and consciousness turn out to be even better than we imagined back when we were sure they were inexplicable."

Dennett’s prominence does not rest solely on his high public profile in the scientific controversies of our day; it is also based on a large body of academic work dealing with various aspects of the mind, stretching back almost 40 years. Dennett has long been associated with Tufts University, where he is now Distinguished Arts and Sciences Professor and director of the Center for Cognitive Studies. Boston-born, Oxford-educated, he now divides his time between North Andover, Massachusetts, and his farm in Maine, where he grows hay and blueberries, and makes cider wine.

In this exclusive interview with Science & Spirit, Dennett talks about his ideas on consciousness, evolution, free will, and the "slowly eroding domain" of religion.

Science & Spirit: Can you give us an overview of your ideas on consciousness? What is it? Where does it come from? Where might it be going?

Dennett: The problem I have answering your question is that my views on consciousness are initially very counterintuitive, and hence all too easy to misinterpret, so any short summary is bound to be misleading. Those whose curiosity is piqued by what I say here are beseeched to consult the long version carefully. Aside from my books, there are dozens of articles available free on my website, at www.ase.tufts.edu/cogstud.

With that caveat behind us (and convinced that in spite of it, some people will leap on what I say here and confidently ride off with a caricature), I claim that consciousness is not some extra glow or aura or "quale" caused by the activities made possible by the functional organization of the mature cortex; consciousness is those various activities. One is conscious of those contents whose representations briefly monopolize certain cortical resources, in competition with many other representations. The losers—lacking "political clout" in this competition—quickly fade leaving few if any traces, and that’s the only difference between being a conscious content and being an unconscious content.

There is no separate medium in the brain, where a content can "appear" and thus be guaranteed a shot at consciousness. Consciousness is not like television—it is like fame. One’s "access" to these representations is not a matter of perceiving them with some further inner sensory apparatus; one’s access is simply a matter of their being influential when they are. So consciousness is fame in the brain, or cerebral celebrity. That entails, of course, that those who claim they can imagine a being that has all these competitive activities, all the functional benefits and incidental features of such activities, in the cortex but is not conscious are simply mistaken. They can no more imagine this coherently than they can imagine a being that has all the metabolic, reproductive, and self-regulatory powers of a living thing but is not alive.

There is no privileged center, no soul, no place where it all comes together—aside from the brain itself. Actually, Aristotle’s concept of a soul is not bad—the "vegetative soul" of a plant is not a thing somewhere in the plant; it is simply its homeostatic organization, the proper functioning of its various systems, maintaining the plant’s life. A conscious human soul is the same sort of phenomenon, not a thing, but a way of being organized and maintaining that organization. Parts of that organization are more persistent, and play more salient (and hence reportable) roles than others, but the boundaries between them—like the threshold of human fame—are far from sharp.

S&S: What are the implications of all this for the notion of free will and moral choice?

Dennett: The implications of all this for the notion of free will are many. I have come to realize over the years that the hidden agenda for most people concerned about consciousness and the brain (and evolution, and artificial intelligence) is a worry that unless there is a bit of us that is somehow different, and mysteriously insulated from the material world, we can’t have free will—and then life will have no meaning. That is an understandable mistake. My 1984 book, Elbow Room: the Varieties of Free Will Worth Wanting, set out to expose this mistake in all its forms and show how what really matters in free will is handsomely preserved in my vision of how the brain works. I am returning to this subject in my next book, with a more detailed theory that takes advantage of the tremendous advances of outlook in the last 15 years.

S&S: What then of religion, or, more specifically, of the relationship between religion and science? Stephen Jay Gould speaks of "Non-Overlapping Magesteria," where the two realms of knowledge—or inquiry—stay within their own spheres, operating with mutual respect but maintaining a strict policy of non-interference. Is this possible, in your views? Is it even desirable?

Dennett: The problem with any proposed detente in which science and religion are ceded separate bailiwicks or "magisteria" is that, as some wag has put it, this amounts to rendering unto Caesar that which is Caesar’s and unto God that which Caesar says God can have. The most recent attempt, by Gould, has not found much favor among the religious precisely because he proposes to leave them so little. Of course, I’m certainly not suggesting that he should have left them more.

There are no factual assertions that religion can reasonably claim as its own, off limits to science. Many who readily grant this have not considered its implications. It means, for instance, that there are no factual assertions about the origin of the universe or its future trajectory, or about historical events (floods, the parting of seas, burning bushes, etc.), about the goal or purpose of life, or about the existence of an afterlife and so on, that are off limits to science. After all, assertions about the purpose or function of organs, the lack of purpose or function of, say, pebbles or galaxies, and assertions about the physical impossibility of psychokinesis, clairvoyance, poltergeists, trance channeling, etc. are all within the purview of science; so are the parallel assertions that strike closer to the traditionally exempt dogmas of long-established religions. You can’t consistently accept that expert scientific testimony can convict a charlatan of faking miracle cures and then deny that the same testimony counts just as conclusively—"beyond a reasonable doubt"—against any factual claims of violations of physical law to be found in the Bible or other religious texts or traditions.

What does that leave for religion to talk about? Moral injunctions and declarations of love (and hate, unfortunately), and other ceremonial speech acts. The moral codes of all the major religions are a treasury of ethical wisdom, agreeing on core precepts, and disagreeing on others that are intuitively less compelling, both to those who honor them and those who don’t. The very fact that we agree that there are moral limits that trump any claim of religious freedom—we wouldn’t accept a religion that engaged in human sacrifice or slavery, for instance—shows that we do not cede to religion, to any religion, the final authority on moral injunctions.

Centuries of ethical research and reflection, by philosophers, political theorists, economists, and other secular thinkers have not yet achieved a consensus on any Grand Unified Theory of ethics, but there is a broad, stable consensus on how to conduct such an inquiry, how to resolve ethical quandaries, and how to deal with as-yet unresolved differences. Religion plays a major role as a source of possible injunctions and precepts, and as a rallying point for public appeal and organization, but it does not set the ground rules of ethical agreement and disagreement, and hence cannot claim ethics or morality as its particular province.

That leaves ceremonial speech acts as religion’s surviving domain. These play a huge role in stabilizing the attitudes and policies of those who participate in them, but the trouble is that ceremony without power does not appear to be a stable arrangement—and appearances here are all important. Once a monarch is stripped of all political power, as in Great Britain, the traditions and trappings tend to lose some of their psychological force, so that their sole surviving function—focusing the solidarity of the citizenry—is somewhat undercut. Whether or not to abolish the monarchy becomes an ever less momentous decision, rather like whether or not to celebrate a national holiday always on a Monday, instead of on its traditional calendar date. Recognizing this threat of erosion, religious people will seldom acknowledge in public that their God has been reduced to something like a figurehead, a mere constitutional monarch, even while their practices and decisions presuppose that this is so.

It is seldom remarked (though often observed in private, I daresay) that many, many people who profess belief in God do not really act the way people who believed in God would act; they act the way people would act who believed in believing in God. That is, they manifestly think that believing in God is—would be—a good thing, a state of mind to be encouraged, by example if possible, so they defend belief-in-God with whatever rhetorical and political tools they can muster. They ask for God’s help, but do not risk anything on receiving it, for instance. They thank God for their blessings, but, following the principle that God helps those who help themselves, they proceed with the major decisions of their lives as if they were going it alone.

Those few individuals who clearly do act as if they believed in God, really believed in God, are in striking contrast: the Christian Scientists who opt for divine intervention over medical attention, for instance, or those who give all their goods to one church or another in expectation of the Apocalypse, or those who eagerly seek martyrdom.

Not wanting the contrast to be so stark, the believers in belief-in-God respond with the doctrine that it is a sin (or at least a doctrinal error) to count on God’s existence to have any particular effect. This has the nice effect of making the behavior of a believer in belief-in-God and the behavior of a believer in God so similar as to be all but indistinguishable.

Once nothing follows from a belief in God that doesn’t equally follow from the presumably weaker creed that it would be good if I believed in God—a doctrine that is readily available to the atheist, after all—religion has been so laundered of content that it is quite possibly consistent with science. Peter de Vries, a genuine believer in God and probably the funniest writer on religion ever, has his hyper-liberal Reverend Mackerel (in his book The Mackerel Plaza) preach the following line: "It is the final proof of God’s omnipotence that he need not exist in order to save us."

The Reverend Mackerel’s God can co-exist peacefully with science. So can Santa Claus, who need not exist in order to make our yuletide season more jolly.