Lamarckian evolution in action? Alas, no.

This sounded very cool from the beginning:

Ahornia inhabits the thickly wooded mountains along what once was the fortified border between West Germany and Czechoslovakia. At the height of the Cold War, a high electric fence, barbed wire and machine-gun-carrying guards cut off Eastern Europe from the Western world. The barriers severed the herds of deer on the two sides as well.

* * *

But one species is boycotting the reunified animal kingdom: red deer. Herds of them roam both sides of the old NATO-Warsaw Pact border here but mysteriously turn around when they approach it. This although the deer alive today have no memory of the ominous fence.

But alas, there’s a perfectly good non-Lamarckian explanation:

One reason, he says, is that deer have traditional trails, passed on through the generations, with a collective memory that their grounds end at the erstwhile barrier. Females, who stay with their mothers longer than males and spend more time absorbing their mothers’ movements, stick even more closely to the traditional turf.

Though, really, WSJ? “Collective memory?” WTF stupid science reporter? What about “learned behavior from mothers?”

11 Responses to “Lamarckian evolution in action? Alas, no.”

  1. Daniel S. Goldberg Says:

    Just out of curiosity, why so immediately dismissive of the possibility that deer might have some kind of collective memory?

  2. Paul Gowder Says:

    Materialist reasons, mostly. Representational states, like other cognitive phenomena, have to have biological instantiations. And collectives don’t have brains.

    There are any number of conceivable ways for herds to sensibly transmit information across generations, but to call them “collective memory” would be a misuse either of the term “collective” (deer-to-deer transmission of behavior patterns) or of the term “memory” (instinctual behavioral patterns transmitted either genetically or through some weird biological trait-environment interaction triggered only in very odd environments).

  3. Daniel S. Goldberg Says:

    Mmm. So, under your account, humans would not have collective memory either, right?

    (Sidenote: I would imagine that animal behavior is generally emergent with respect to brains, such that all sorts of things we note in animals, including ourselves, may be causally attributed to brains without being reducible to them).

  4. Paul Gowder Says:

    Humans have a better case for something called “collective memory” because at least we have, you know, written language. But I wouldn’t use a term like “collective memory” for any phenomenon I know about in humans either, on balance.

    Don’t get me started on emergence though. Suffice it to say that I find it really difficult to believe that any such thing as actual metaphysical emergence exists. There might be epistemological emergence, where X is reducible to Y but we either don’t know how to carry out the reduction or it is in principle impossible to know how (think of a perfect cryptographic algorithm, the ciphertext of which would be emergent in that very weak sense). But anything beyond that? Just seems incoherent to me.

  5. Steve M. Says:

    I’m having trouble coming up with a definition of “memory” that includes my memory of what I had for dinner last night but excludes things like the instinctual human fear of spiders, at least without cheating and specifying in the definition that information about the past has to be recorded in a particular physical medium, e.g., synapses, &c. And if instinctual fears count, then obviously lots of species have collective memories. Or do you get to specify the recording medium?

  6. Daniel S. Goldberg Says:

    I’m having difficulty understanding the difference between epistemic and metaphysical emergence, esp. if you are willing to say that it is inherently impossible to “know how to carry out the reduction” of X to Y.

  7. Paul Gowder Says:

    And you guys came to the right place, because I’m the problem-solver.

    Steve: I’d prefer to specify that memory at a minimum has to have representational content. Not just emotions or behaviors triggered by representations. Otherwise we’d have to say that Pavlov’s dog drools because he remembers something about the bell and the food. (Or are you willing to take that claim on?)

    Daniel: when a lot of people talk about consciousness, they seem to want to get something like substance dualism out of this idea of emergence — it’s not just that we can’t explain the relationship of observations in consciousness to observations in the brain, but that consciousness is something more than phenomena in the brain. It’s that latter claim that I want to resist.

  8. Steve M. Says:

    It depends on what representational content is. Surely you can have memories that lack propositional or intentional content. The easiest case involves having seen something, but being unaware of where. You can walk by a bakery, and remember having smelled that scent before, without knowing that it’s similar to the smell of your grandmother’s delicious chocolate chip cookies. I think you ought to call the scent, or your awareness of your familiarity with it, a memory, even though you don’t recall what originally caused it. That’s not exactly the same as representation, and I can imagine a view of representation on which, in order for X to represent Y, it simply need be the case that X was caused by Y and that X usefully records information about Y. If all you’re trying to say is that, in order to be a memory, information has to be related to something that happened, that’s pretty weak beer. Surely the bits in your hard drive represent digital photographs even when the computer’s turned off. But then, setting aside the pun, do computers have memories?

    Now, maybe you can take the position that representing and intentionality don’t imply awareness of the nature of the representation or of the intentional state. In that case, your reaction to the bakery isn’t a memory. I’m not sure I’m comfortable with that.

  9. Daniel S. Goldberg Says:

    Paul,

    This is material I cover in my dissertation. Searle is exactly right when he says that believing that material substrate is a sine qua non for consciousness does not license the conclusion that consciousness is nothing but material structure. We need neither dualism nor monism.

    However, if you think this move is valid, chalk it up as one of the many things we would disagree about pretty vehemently!

  10. ben Says:

    Daniel,

    What do you think, if you’ve read it, of The Nature of Mental Things?

  11. Daniel S. Goldberg Says:

    Ben,

    Haven’t read it, but it looks extremely interesting upon Googling.

    (FWIW: Aside from Searle’s accounts of consciousness, I am also quite partial to Grant Gillett’s work, in no small part because he privileges a phenomenological account of the role of subjectivity in mind. I think this can be read in terms quite friendly to Searle’s account).
