Why do cords always get tangled? The answer is inertia, not entropy.

When mathematics types get asked why cords get tangled, they sometimes answer “because there are many more tangled states than untangled states, so a little jolt of energy is much more likely to knock the cord into a tangled state than an untangled one.”

But that can’t be right, because it confuses messiness with difficulty of leaving. Doubtless there are more “messy” than “neat” states of a cord, but an arbitrarily messy cord doesn’t count as “tangled” unless you have to struggle to get it out. And there’s no a priori reason to think there are more tangled than untangled states.

That observation, however, points the way to the correct answer. Because tangled states are much harder to leave, at any random time over the life of a system that’s getting energy put into it, we’re more likely to observe one, no matter how many tangled states there actually are relative to untangled states.

To see this, imagine a cord with 20 possible states, 19 of them (the untangled states) taking an energy of X to leave, and the 20th (the tangled state) taking an energy of Y to leave, where Y is much larger than X. Now suppose that the cord is subjected to a shock every second, with shock energies normally distributed with mean X and standard deviation (Y - X)/3. And suppose that once the cord leaves a state, it picks its new state uniformly at random from the 20 possible states.

It’s pretty obvious that it’s going to end up in that tangled state pretty quickly, and stay there for a long time, isn’t it? Just because it takes much more energy to get out of there.
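
For the skeptics, here’s a minimal simulation of this toy model. The specific numbers (X = 1, Y = 4, a million one-second shocks) are my own illustrative assumptions; any Y sufficiently far above X tells the same story:

```python
import random

# Toy model: 19 untangled states take energy X to leave; the 1 tangled state
# takes energy Y >> X. A shock arrives each second, ~ Normal(X, (Y - X) / 3).
X, Y = 1.0, 4.0          # illustrative values, not anything principled
SIGMA = (Y - X) / 3
N_STATES = 20
TANGLED = N_STATES - 1   # states 0..18 untangled, state 19 tangled
STEPS = 1_000_000

state = 0
time_tangled = 0
for _ in range(STEPS):
    if state == TANGLED:
        time_tangled += 1
    barrier = Y if state == TANGLED else X
    if random.gauss(X, SIGMA) >= barrier:    # shock big enough to escape?
        state = random.randrange(N_STATES)   # new state chosen uniformly

print(f"fraction of time tangled: {time_tangled / STEPS:.2f}")
```

With these numbers the cord spends roughly 95% of its time in the single tangled state: an escape attempt from an untangled state succeeds about half the time, while an escape from the tangled state succeeds about 0.1% of the time, since Y sits three standard deviations above the mean shock.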

Essentially, this is the same abstract-level idea as evolution. Stable states get observed more than unstable states just because stable states tend to stick around long enough to be observed.

7 Responses to “Why do cords always get tangled? The answer is inertia, not entropy.”

  1. Physical Chemist Says:

    This is pretty far off base.

First, there is not only an a priori reason, but an outright proof, that tangled states are more numerous than untangled states for the type of string you’re talking about getting tangled. Here’s a paragraph from the paper “Spontaneous knotting of an agitated string” (PNAS, 2007, the first answer on the stackexchange link you posted):

    “Formation of knots in mathematical self-avoiding random walks has been extensively studied (10–16). In the 1960s, Frisch and Wasserman (10) and Delbruck (11) conjectured that the probability of finding a knot would approach 100% with an increasing walk length. In 1988, Sumners and Whittington (15) proved this conjecture rigorously by showing that exponentially few arcs would remain unknotted as the length tends to infinity. Numerical studies of finite-length random walks find that the probability of knotting and the average complexity of knots increase sharply with the number of steps (16).”

    Self-avoiding random walks are an excellent model for the type of strings physicists like to talk about, not necessarily because they are entirely realistic, but because in a precise sense many other models behave the same way.
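
    To put the quoted result in symbols: the usual statement (the constants C and n_0 are model-dependent; I’m giving the standard form, not numbers from the paper) is that for a random arc of n segments,

    $$P_{\text{unknot}}(n) \approx C\, e^{-n/n_0},$$

    so the unknotted fraction decays exponentially with length, and knotting becomes all but certain for long strings.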

    Second, the fact that you say ‘confuses messiness with difficulty of leaving’ indicates to me that you do not understand the implications of the ergodic hypothesis for residence times. Messiness and difficulty of leaving are directly related in a wide class of models of strings. Those models probably do not apply to cords on the human scale (centimeters and longer), so I hesitated to write this one down, but it’s nonetheless a signal of ignorance that you should avoid in the future.

    Third, the example with 20 states is a deeply flawed comparison to the string. For a conceptual string, all observed configurations of the string can roughly be assumed equal in energy (and this assumption can be justified fairly rigorously). In your example, state 20 has a far lower energy than the other 19. If this is not apparent to you after some reflection, take it as a sign that you do not understand how an energy surface is defined.

    Fourth, building on two and three: I actually think you’re right about why strings are found tangled (because once they’re tangled, they stay tangled), but the barrier to getting untangled is not energetic. It’s entropic. The inertia is entropic, contrary to your title’s opposition of the two.

    Finally, what you’ve described is very, very different from evolution. Evolution is essentially non-equilibrium, and what you’ve described in the ‘imagine’ example is an equilibrium model.

    I came from Leiter. I appreciate his style of dealing with physicists who talk nonsense about philosophy and I figure I should return the favor.

  2. Paul Gowder Says:

    Real quick:

    1) Fair, but not relevant. My claim was that we don’t need to think there are many more tangled than untangled states to explain why we observe more tangled states. Indeed, I said that “doubtless” there are more tangled states.

    2) Irrelevant as well as arrogant. Irrelevance comes from your own point that this doesn’t apply to human scale. Arrogant because you charge me (guilty as charged, I don’t claim to be a physicist) with ignorance of a physical fact that doesn’t apply on the scale under consideration and don’t even bother to summarize the basic idea. (Why should the ergodic hypothesis have implications for residence times? This doesn’t seem like an obvious implication — generally, the probability distribution over states shouldn’t imply anything about the cost of leaving a state in the abstract — so it would have been helpful and constructive, rather than just pointlessly brick-throwing, to say why there is such a relationship.)

    3) Doesn’t that just assume away any difference between tangled states and untangled states? In which case, you’re answering a different question. There’s no point in trying to understand idealized conceptual strings in which no states are dramatically less easy to leave than others, if we’re trying to figure out why real strings are disproportionately observed in states that are dramatically less easy to leave than others. Whether that entails counter-idealized assumptions about energy surfaces or not isn’t really relevant.

    4) That’s actually a fair criticism, I agree. Thanks for the correction.

    5) Just wrong. Evolution is essentially non-equilibrium? Eh? That’s certainly not true on a population level, where you can reach equilibrium distributions of traits (see, generally, Weibull’s Evolutionary Game Theory for the math). A stable state in the evolutionary context is just one in such an equilibrium, and its stability (its being-in-equilibrium) is just the reason we observe it. Note that the population level is the correct level of analysis for evolutionary claims. Evolutionary explanations don’t purport to explain why a trait appears in an individual member of the population (it could just be random variation); they purport to explain why the traits in the population as a whole have the distribution they do. That is, at the level of generality at which evolution has explanatory value, it’s an equilibrium phenomenon. (We then get explanations of observed traits in individual population members from explanations of population distributions, but the primary object of explanation is a population.)
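
    To make the equilibrium point concrete, here is a minimal replicator-dynamics sketch. The Hawk-Dove payoffs (V = 2, C = 4), the starting point, and the Euler step size are my own illustrative assumptions, not anything from Weibull:

    ```python
    # Replicator dynamics for a Hawk-Dove game, integrated with Euler steps.
    # The trait distribution settles at the equilibrium Hawk fraction V / C.
    V, C = 2.0, 4.0
    payoff = [[(V - C) / 2, V],      # Hawk vs (Hawk, Dove)
              [0.0, V / 2]]          # Dove vs (Hawk, Dove)

    x, dt = 0.9, 0.1                 # initial Hawk fraction, step size
    for _ in range(2000):
        f_hawk = payoff[0][0] * x + payoff[0][1] * (1 - x)
        f_dove = payoff[1][0] * x + payoff[1][1] * (1 - x)
        f_mean = x * f_hawk + (1 - x) * f_dove
        x += dt * x * (f_hawk - f_mean)    # replicator equation

    print(f"equilibrium fraction of Hawks: {x:.3f}")   # converges to V/C = 0.5
    ```

    Whatever the starting mix (short of all-Hawk or all-Dove), the population settles into the same stable distribution, which is exactly the sense of equilibrium I mean.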

    If I may indulge in a little bit of reciprocal nastiness, this is a signal of ignorance about evolutionary explanation that you might want to avoid. Also, in terms of the norms of intellectual discussion, note how I didn’t just flatly (and arrogantly) declare “clearly you don’t understand the difference between individual explanation and population explanation.” Instead, I explained the error. Maybe you might try that kind of communication in the future?

  3. Physical Chemist Says:

    I meant to be critical, but not nasty, and seeing nastiness and arrogance here is reading me wrong. I thought that this post was filled with embarrassing things that I wanted to warn you about, because you seem to be a genuinely decent man whom I’d like to see taken seriously. Overall: I was trying to say that the post was incoherent and used technical terms poorly, with a lot of funny accidental equivocations, false oppositions, and implicit misrepresentations of others’ positions. I didn’t mean it to be nasty; I meant it as a blunt but friendly warning that you look like a bit of a goofball. I’m sorry I’ve rubbed you the wrong way, and I’ll explain myself a bit better, since I didn’t mean to annoy you.

    1) If you think it’s not relevant, you may have missed the importance of the states they’re enumerating. I spoke as if you had understood that in my first comment. The states they enumerate are not the states our brains automatically generate when we look at a collection of strings (I’ll call these macrostates). The states (call them microstates) are defined precisely at the microscopic level, and all of them are equally easy to leave. There is a sense in which we don’t need to know that there are more tangled microstates than untangled microstates to explain why we see strings tangled: we can say ‘they don’t leave tangled macrostates, so they are found there.’ But to us, that’s tantamount to saying ‘they are there, so they are there.’ The problem is to predict which macrostates are hard to leave, so your explanation totally misses the point of the explanations you criticize. In the string case, the tangled macrostates are hard to leave because there are a lot more microstates in tangled macrostates than in untangled ones (higher entropy). In the example you gave with the twenty states, either they are macrostates, in which case the ‘difficulty of leaving’ was given rather than explained, or they are microstates, in which case the ‘difficulty of leaving’ implies an energy difference that misrepresents the physics of a string. In the evolution example, the ‘difficulty of leaving’ is less important than the ‘difficulty of getting here from the past.’ Evolutionary equilibria are not thermodynamic equilibria; there are scalars like entropy that track mixing in population dynamics, but they aren’t thermodynamic entropy proper.
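
    Here is a minimal sketch of that entropic picture, under deliberately cartoonish assumptions of my own (microstates as points on a line, nearest-neighbor moves, reflecting ends, every move equally easy):

    ```python
    import random

    # All microstates have equal energy; "untangled" is just a small region.
    N = 200                  # total microstates (an arbitrary illustrative number)
    UNTANGLED = 10           # microstates 0..9 count as the untangled macrostate
    STEPS = 1_000_000

    pos, time_untangled = 0, 0
    for _ in range(STEPS):
        if pos < UNTANGLED:
            time_untangled += 1
        pos = min(max(pos + random.choice((-1, 1)), 0), N - 1)  # reflect at ends

    print(f"fraction of time untangled: {time_untangled / STEPS:.3f}")
    ```

    No microstate is energetically harder to leave than any other, yet the walk is found ‘tangled’ about 95% of the time, simply because the tangled macrostate contains far more microstates. That is an entropic barrier.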

    2) In cases where the ergodic hypothesis is valid, residence times and probability distributions are intimately linked, because ergodicity connects long-time averages to spatial averages over the equilibrium probability distribution. See http://en.wikipedia.org/wiki/Ergodic_theory, section “Sojourn time.” This is something I assume is obvious to people who understand the ergodic hypothesis; it seems to me to be the essence of it. I’m surprised you’re aware of the hypothesis but not of the time-space link; I usually think of it as exactly that, so what did you think it was? Regardless, I don’t think it’s arrogant when philosophers criticize others for poor understanding of philosophy and yet don’t explain the modern debates to them, and I don’t think it’s arrogant to return the favor. I think it’s natural to say “oh, this guy has no handle on the literature or what people actually argue about these days, and I don’t want to be the one to make him a syllabus, but I’ll at least tell him that he’s talking about noncognitivism vs. error theory so he has something to Google.” Maybe lazy, but not arrogant.
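
    For reference, the sojourn-time statement is just Birkhoff’s ergodic theorem applied to an indicator function: for an ergodic measure-preserving map T on a probability space with measure \mu, and a measurable set A,

    $$\lim_{n \to \infty} \frac{1}{n} \sum_{k=0}^{n-1} \mathbf{1}_A\!\left(T^k x\right) = \mu(A) \quad \text{for } \mu\text{-almost every } x,$$

    i.e., the long-run fraction of time a trajectory spends in A equals A’s equilibrium probability. That is the residence-time link.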

    3) This is tricky, because either my criticism is good or the definition of ‘state’ has implicitly changed during the course of your argument. By one definition of ‘state,’ one that uses degree of tangling as a classifier, your example makes sense. However, that is not the definition of state that the mathematicians or physicists would be referring to; we use one based on the precise coordinates of the string all along its length. So the example you gave was either a non sequitur or a misrepresentation of the physics of strings. Either way, it was clear you didn’t understand the mathematicians you were criticizing. Again, I’m not criticizing the argument on its merits, but as something people will judge your understanding on, because I want you to be respected (because of the strength of your other entries).

    4) I promise the rest is fair, too, at least according to my standards. I’m following the Golden Rule, not being nasty and throwing bricks.

    5) Nah, you’ve misunderstood me. Equilibria play a large role in reasoning about evolution, but it’s a different meaning of ‘equilibrium’ with very different implications. The equilibrium I’m talking about is thermodynamic equilibrium, and biological evolution definitely happens away from thermodynamic equilibrium. It’s an ugly equivocation. Both concepts are population concepts, I definitely understand that aspect, but surely you’ve been subjected to the evolution vs. entropy ‘paradox’ some people like to use to argue against evolution? The sort of explanation you were giving would make that argument against evolution seem reasonable even though it’s not, and since you’re the sort who wants to advocate for evolution, you may want to be wary of that explanation. The word ‘stable’ as you use it is fraught with ambiguity; the intuition one applies to that word is either vacuous or mistaken in the comparison between the tangled string and evolution. Evolutionary stability and thermodynamic stability are very different concepts, even though the same language and the same stochastic-process and information-theory mathematics can be applied to both from certain perspectives. If your point was that the same language applies, that the same tautologies (like ‘things that stick around for a long time are observed for a long time’) can be formulated in each case, then what’s the point of criticizing mathematicians who have a more ambitious object? But if it was something deeper, it’s almost surely wrong. And evolution is deep, so I don’t want to trivialize it by reducing it to old-style evolutionary game theory and equilibrium population dynamics. There are too many interesting things beyond that, like spatiotemporal limits to metabolism, dynamic environment-organism interactions, and the wild fractal structures in real speciation patterns.

    Well, you did say “just wrong” to point 5, and you told me again and again, without much defense, that my points weren’t relevant, so I can’t give you too much credit for not flatly and mistakenly saying I didn’t understand things. Still, as my last statement about Leiter was meant to indicate, I prefer blunt, curt criticism, I consider getting it a favor, and I didn’t mind being called ‘just wrong’ or ‘irrelevant’ in the least. Our intellectual cultures must differ; to me this was something nice to do for you. I’m sorry if that got lost in translation. I’ve tried to explain a bit more this time, since I seem to have offended you, but after this I may not do it again. It’s too much work to criticize you if you insist on knowing how and why you’re wrong in addition to the fact that you were wrong, but I thought telling you that there were problems was something I had time for. I do like most of what you write, so thank you for that, and I meant no disrespect.

  4. Paul Gowder Says:

    Longer thought in the offing when I find time, but I did want to drop a quick reply to thank you for clarifying the tone and say that no offense is taken post-explanation — and I hope none given as well.

  5. Physical Chemist Says:

    Thanks, and no, none taken. We each have different priors for inferring the tone of written text, and my assumptions about yours were unfounded. I should have taken a different approach.

  6. Joann Leonard Says:

    Why do dialogues always get tangled?

  7. Just Call Me St00f Says:

    Well played, Joann Leonard. Well played indeed. I would try to use some fancy wording and explanation, but I’m afraid these guys would tear me apart for writing such a thing, as their eloquence is intimidating.
