Comments on: Doing the Time Warp (A revolution in time). Thu, 11 Apr 2013 01:11:28 +0000

By: Taurus Londono, Wed, 25 Jul 2012 21:48:01 +0000

“The answer to that is that I’m a person who realizes that all existing systems of government are deeply flawed, and that the real future of freedom and of humanity lies in our ability, if we have it, to transcend ourselves and to overcome the limitations of our evolutionarily ‘randomly engineered’ beginnings.”

As succinct and eminently agreeable a statement on politics as anyone is likely to find. The only cause I can fathom for argument in the further comments below is the reflexive, instinctual need for self-defense of the ego.

For all the endless volumes written, the outpouring of the deepest meditative thoughts humans have mustered about government, one definitive, unequivocal scientific *FACT* (i.e., as opposed to a political theory) underlies them all:

Our brains are the product of random, undirected forces of natural selection. This unpleasant truth underlies all political ideologies. The brains of show dogs have been subject to more thoughtful, intelligent design than our own. As such, we should be brave enough and honest enough to face our own natural limitations with humility, and agree together (whatever political labels we find comfortable to wield for ourselves) that the *best* path forward involves our directed efforts to break free from the biological boundaries imposed by nature.

By: admin, Fri, 14 Oct 2011 22:00:17 +0000

This is in response to your two comments, to Fundie’s remarks, and to anyone else out there who shares your views.

It would be far better if we were in actual conversation, face to face, rather than doing this in writing, if only because it would be a lot easier to communicate not just my position, but also my attitude and emotional state. I don’t want to appear denigrating, hypercritical, or somehow “superior” in any way. I not only agree with your position about not wanting to have my life run by other people without my consent, I also have a pretty jaundiced view of what happens to the character of people when they are in power. In short, in terms of what we all want and consider “acceptable,” I don’t see any meaningful disagreement.

Here’s my problem: when you write: “I do not believe that ANY human institution or group of humans should have any kind of monopoly power over others. Rather, I believe in a competitive meta-system where all human institutions are subject to competition and no institution is able to exert monopoly status over others. I simply reject out of hand the very concept of monopoly-authoritarianism. If I were religious, I would describe myself as a maltheist (this is the belief that god exists, but is evil). I do not know if there is a label that describes what I believe in. However, I tend to describe myself as libertarian for the simple reason that it is the label that most closely approximates what I believe in”

In saying this, you have just told me what you WANT. I’m not quite so absolute in my condemnation of humanity (even in power), because I have met, and been the beneficiary of, many fair, decent and very good men in high places. Sure, it is easy to point to the Riverside County Coroner, Raymond Carillo, and say, “This is why government is evil – look at what he tried to do to Dora Kent, to her son, and to cryonics in general, and look at the damage he did,” without also considering the many other people in positions of power who acted not only responsibly, but fairly, under conditions when it would have been far easier for them to have just “gone along with the time and tide.” The good decisions that were made were not made because of politics, profit or corruption, but because they were the RIGHT decisions. When I look over all of my many years of interaction with the government and the police in the US, I can point to a number of instances of unfairness, prejudice, and even corruption, but mostly my experiences were fair and reasonable. And yes, I realize that depending upon who you are and where you are, you may be treated very differently.

That point being made, I will now return to the question at hand. As I said above, you have told me what you WANT, but what you have not told me is who you ARE. When I ask a man who he is, he may respond in a wide variety of ways. He may tell me his age, his heritage/ethnicity, his job, that he is a parent, a Democrat, and so on. All of those things are reasonable and honest answers, because they tell me how he lives. He may not be a good father, or a good Democrat or Republican, but at least I know he’s achieved some level of contact with the real world. Of course, I also understand that being a Democrat, or even a Mormon or a Catholic, can encompass a fairly wide distribution of beliefs and behaviors – but I also know that these labels refer to real-world institutions that, despite their many flaws and shortcomings, actually exist, and which do manage to function and maintain some semblance of continuity over time.

When people speak of being Anarcho-Capitalists, or even of being Ron Paul Libertarians, my problem is that they have put themselves squarely in the realm of the hypothetical – of what they WANT, rather than what they, or anyone else for that matter, have managed to practically live. There’s no great sin in that. In fact, all of us are in that realm in some mode or another. The whole of the mystical part of religion is really about what people WANT, rather than about what they will GET.

Now at this point you may say, “Well, isn’t that exactly what cryonics is?” My answer is, “Close, but no cigar.” Cryonics has a large, complicated mass of rationally based things that must be done to achieve it, and most importantly, it is not just falsifiable, it WILL be either falsified or proven if the experiment is allowed to run to conclusion. So, cryonics is an EXPERIMENT, indeed a SCIENTIFIC EXPERIMENT, with people working very hard in very practical ways to carry it to conclusion. If we imagined cryonics as just an idea, as a proposition like the Singularity, or the Second Coming, which people simply wait for in faith or hope, then that would be another matter altogether.

My problem with libertarians, Anarcho-Capitalists, and others similarly inclined, is that they have no credible pathway to achieve the end they want. In fact, despite the fact that the ideas are quite old, they have never even been attempted experimentally. Why is that? One reason that seems obvious to me is that the mechanics, if not the ideals, cannot be implemented with any now extant technology. That doesn’t mean that the ideals or the “wants” are bad, wrong, or impossible to achieve. But it does mean that the technology doesn’t exist. And yes, to be fair, it may mean that those ends aren’t achievable; I may want the laws of gravity and thermodynamics to go away – but wishing and wanting doesn’t make it so.

I could write countless pages about why efforts to achieve the high degree of personal responsibility and freedom that we all want have consistently failed. I would also point out that in traveling around the world and keenly observing the “base” state of man, the Western Republics truly are remarkable achievements. They are a huge part of the way from the state of chimpanzee society to the ideal of complete personal responsibility and freedom. Such failure analysis is (or can be) constructive. But beyond that, what is required is for a group of men to come together, sit down, and try to ENGINEER the next and better iteration of “cooperative interaction,” or if you prefer, self-government.

I believe we are due for such a rethink not because the current system is “broken” or shortsighted, but primarily because we have more information and knowledge about our biology and our nature. We are not yet “truly INDIVIDUAL free living organisms” that can exist in the universe roaming as single individuals from star to star and living off the raw energy and matter present in the cosmos. Human beings require other human beings (and also a lot of other life forms) in order to survive. When we exist in great number and proximity to each other, there will be very, very difficult problems that will either have to be solved, or that will result in our perishing – either as communities, or as a species.


I used to see this problem as separate from cryonics. I now realize that such a separation is not possible, because cryonics absolutely requires a stable and technologically capable society, be it large or small, and what’s more, any stable human society (culture) requires cryonics. Without a shift in the individual’s timescale from the trivial and personal to the species or cosmic, no human society can survive long term; it will be overcome by the shortsightedness and expediency that are the legacy of our evolution from unreasoning animals. All of human religion, and much of human culture, has been an attempt at, and a yearning for, transcendence; and to the extent that effort has been realized, civilizations have been stable. Absent the biomedical technology for practical immortality, all such attempts at achieving that transcendence and escaping the “range of the moment” timescale in which most of our species exists are just so much fantasy, easily manipulated into exploitation and subjugation. It is all a pseudo-experiment, which can never be either falsified or concluded. We have a better answer, rooted in the advance of human scientific and technological knowledge. Cryonics and life extension are manifestations of that – just as were the Enlightenment and the classically liberal ideas and practical engineering of the Founding Fathers in the US. And those, too, were born out of scientific and technological advance.

However, these are “general answers;” they do not constitute the actual engineering required to build the world we want. That engineering largely remains to be done; at this point, we may know, or think we know, that in order to fly, what we build will probably have wings and a tail. But we may be surprised and find ourselves looking at a helicopter after we are “done”; there is more than one engineering approach to leaving the earth and flying.

However, we won’t get airborne just by wishing it were so. We must act, and the time to do that is now – because time is running out for us. — Mike Darwin

By: Fundie, Thu, 13 Oct 2011 00:48:22 +0000

Give me a break! I have no obligation to endorse any such system which has no rigor and no rigorous evidence to support its workability. I would hastily add that I feel the same way about dictatorships, monarchies, oligarchies, technocracies, and virtually all other large-scale human government apparatuses.

I feel the same way about democracy, as well. :)

By: Fundie, Thu, 13 Oct 2011 00:46:44 +0000

In cryonics, it’s often hard to get government permission to do the experiments you would like to do.

In politics, the same is also true. The experiment I would like to perform would involve giving all citizens the right to secede. I can’t seem to get through the regulatory red tape to get approval granted, though. :)

By: Fundie, Thu, 13 Oct 2011 00:45:01 +0000

Just now saw this.

Mike, I didn’t mean to imply you said anything about my thoughts specifically. I didn’t feel that you were addressing me specifically. But you did say what you believed libertarians thought, and I am one, so I spoke as a libertarian to shed more light on what I as a libertarian believe.

No offense taken, and I hope none was given.

Disclosure: I’m all sorts of awful things. I am an anarcho-capitalist, I like 99% of what Murray Rothbard and his student Walter Block say (I’ll give Block a 99.9%). I don’t believe in democracy. I’m just simply awful. :)

By: Abelard Lindsey, Wed, 12 Oct 2011 21:12:31 +0000

I will clarify my point. I consider most humans, and by association, human institutions, to be corrupt and ineffectual. I have lived long enough and done enough things (on two continents) to convince me of this reality. For this reason, I do not believe that ANY human institution or group of humans should have any kind of monopoly power over others. Rather, I believe in a competitive meta-system where all human institutions are subject to competition and no institution is able to exert monopoly status over others. I simply reject out of hand the very concept of monopoly-authoritarianism. If I were religious, I would describe myself as a maltheist (this is the belief that god exists, but is evil).

I do not know if there is a label that describes what I believe in. However, I tend to describe myself as libertarian for the simple reason that it is the label that most closely approximates what I believe in.

By: Abelard Lindsey, Wed, 12 Oct 2011 00:26:47 +0000

That’s quite a rant, Mike!

I even agree with it to a certain extent.

However, it does support my previous comment. If other humans are so incompetent and flawed, why put up with any kind of system that allows any of them to have influence over you?

By: admin, Thu, 06 Oct 2011 05:11:27 +0000

Binary, binary, binary…capitalist or communist, conservative or liberal…. Sigh. I have no confidence in, let alone advocacy of, bureaucracy, or of the nation-state as “the answer” to the problem of rational, WORKABLE human government. And I’ve never said that I think that such are workable solutions. They aren’t. What I’ve said is that NONE of the solutions we humans have come up with so far are workable over any meaningful period of time (e.g., centuries and millennia).

At this point in my life, I’m completely uninterested in ideologies and labels, and acutely interested in what works. I have no interest in politicos of any stripe telling me why their hypotheses are the best ones and why they should work. Instead, I want EVIDENCE. In other words, do the fucking experiment – don’t whinge on and on about the ideological justification - because I no longer care.

I detest government regulation, but I would be a hypocrite, an idiot, or both, if I were to say I didn’t see and understand the reason for why it invariably comes about. The primary reasons are human stupidity, short-sightedness (in part a result of our very short lifespans) and human cognitive disabilities, of which I’ve copied the list from Wikipedia, below. Please, look at that list, and realize that neither you nor I are immune from them – in fact, we are afflicted with multiples of them, daily.

We are also not very smart as a species, and those individuals who are “smarter than the rest” invariably (not occasionally, but invariably) become statistically disconnected from, and unaware of, the feelings, capabilities, and ultimately the actions of the “balance” of the species, whose IQs are lower than the mean, and whose EQs are, on average, lower still. Libertarianism and democracy both posit that “everything will be just fine in the long run” in any human population if government is restricted to protecting people and their property from predation – either by their immediate neighbors, or by their less immediate neighbors. Not defined in such systems are things like what exactly constitutes property: how it is created, who has rights to it, and what happens when “property rights” are tangled and interleaved.

Give me a break! I have no obligation to endorse any such system which has no rigor and no rigorous evidence to support its workability. I would hastily add that I feel the same way about dictatorships, monarchies, oligarchies, technocracies, and virtually all other large-scale human government apparatuses. They are varying degrees of failure and misery, writ large.

And yes, I am acutely aware of (and deeply grateful for) the (classically) liberal demi-democracies of the West, of which the US and the UK are the shining examples. But these are not libertarian principalities; they are nation-states offering varying degrees and types of oppression that inevitably progress towards failure whilst following a highly predictable historical arc. Abelard, the staunch libertarian, has repeatedly noted here that he has suffered grievously from the recent economic madness. Interestingly, the PROXIMATE causes of the current real estate and stock bubbles were the removal of REGULATORY constraints, in no small measure as a result of the actions of a man that I have (for decades) called the “Evil Gnome”: Alan Greenspan. The same Alan Greenspan who suckled at Ayn Rand’s teat. It would be disingenuous and wrong to leave the blame there, because the UNDERLYING CAUSES are laid out below, in the various cognitive biases (shortcomings), lack of long-term world view, and yes, stupidity, exhibited by people who have lived such a short time (and will continue to live for an even shorter time) that they had no opportunity to viscerally learn, let alone forget, the lessons so well taught by the past. Am I to understand that Abelard wants another, bigger serving of the same?

So, what am I? The answer to that is that I’m a person who realizes that all existing systems of government are deeply flawed, and that the real future of freedom and of humanity lies in our ability, if we have it, to transcend ourselves and to overcome the limitations of our evolutionarily “randomly engineered” beginnings. — Mike Darwin

Decision-making and behavioral biases

Anchoring – the common human tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions.
Attentional Bias – implicit cognitive bias defined as the tendency of emotionally dominant stimuli in one’s environment to preferentially draw and hold attention.
Backfire effect – Evidence disconfirming our beliefs only strengthens them.
Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.
Bias blind spot – the tendency to see oneself as less biased than other people.[2]
Choice-supportive bias – the tendency to remember one’s choices as better than they actually were.[3]
Confirmation bias – the tendency to search for or interpret information in a way that confirms one’s preconceptions.[4]
Congruence bias – the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
Contrast effect – the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.[5]
Denomination effect – the tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).[6]
Distinction bias – the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[7]
Endowment effect – “the fact that people often demand much more to give up an object than they would be willing to pay to acquire it”.[8]
Experimenter’s or Expectation bias – the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[9]
Focusing effect – the tendency to place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.[10]
Framing effect – drawing different conclusions from the same information, depending on how that information is presented.
Hostile media effect – the tendency to see a media report as being biased due to one’s own strong partisan views.
Hyperbolic discounting – the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, where the tendency increases the closer to the present both payoffs are.[11]
Illusion of control – the tendency to overestimate one’s degree of influence over other external events.[12]
Impact bias – the tendency to overestimate the length or the intensity of the impact of future feeling states.[13]
Information bias – the tendency to seek information even when it cannot affect action.[14]
Irrational escalation – the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.
Loss aversion – “the disutility of giving up an object is greater than the utility associated with acquiring it”.[15] (see also Sunk cost effects and Endowment effect).
Mere exposure effect – the tendency to express undue liking for things merely because of familiarity with them.[16]
Money illusion – the tendency to concentrate on the nominal (face value) of money rather than its value in terms of purchasing power.[17]
Moral credential effect – the tendency of a track record of non-prejudice to increase subsequent prejudice.
Negativity bias – the tendency to pay more attention and give more weight to negative than positive experiences or other kinds of information.
Neglect of probability – the tendency to completely disregard probability when making a decision under uncertainty.[18]
Normalcy bias – the refusal to plan for, or react to, a disaster which has never happened before.
Omission bias – the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).[19]
Outcome bias – the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
Planning fallacy – the tendency to underestimate task-completion times.[13]
Post-purchase rationalization – the tendency to persuade oneself through rational argument that a purchase was a good value.
Pseudocertainty effect – the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.[20]
Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
Restraint bias – the tendency to overestimate one’s ability to show restraint in the face of temptation.
Selective perception – the tendency for expectations to affect perception.
Semmelweis reflex – the tendency to reject new evidence that contradicts an established paradigm.[21]
Social comparison bias – the tendency, when making hiring decisions, to favour potential candidates who don’t compete with one’s own particular strengths.[22]
Status quo bias – the tendency to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[23][24]
Unit bias — the tendency to want to finish a given unit of a task or an item. Strong effects on the consumption of food in particular.[25]
Wishful thinking – the formation of beliefs and the making of decisions according to what is pleasing to imagine instead of by appeal to evidence or rationality.[26]
Zero-risk bias – preference for reducing a small risk to zero over a greater reduction in a larger risk.
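As a side note, a few of the biases in the list above are quantitative enough to sketch in code. Hyperbolic discounting, for instance, is commonly modeled as V = A / (1 + kD), where A is the reward, D the delay, and k a discount-rate parameter. The sketch below is mine, not from the text; the value k = 0.1 and the dollar amounts are arbitrary illustrative choices. It shows the preference reversal the list entry describes: the tendency favoring the immediate payoff grows as both payoffs draw closer to the present.

```python
def hyperbolic_value(amount, delay, k=0.1):
    # Perceived present value under the one-parameter hyperbolic
    # model V = A / (1 + k*D); larger k means steeper discounting.
    return amount / (1 + k * delay)

# Choosing today: $80 now beats $100 in 5 days...
now_small = hyperbolic_value(80, 0)      # 80.0
now_large = hyperbolic_value(100, 5)     # ~66.7

# ...but push the same pair 30 days into the future and it reverses:
later_small = hyperbolic_value(80, 30)   # 20.0
later_large = hyperbolic_value(100, 35)  # ~22.2
```

An exponential discounter, by contrast, would never reverse: if the larger-later reward wins at a distance, it also wins up close, which is why hyperbolic curves are used to model impulsive choice.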

Biases in probability and belief

Many of these biases are often studied for how they affect business and economic decisions and how they affect experimental research.

Ambiguity effect – the tendency to avoid options for which missing information makes the probability seem “unknown.”[27]
Anchoring effect – the tendency to rely too heavily, or “anchor,” on a past reference or on one trait or piece of information when making decisions (also called “insufficient adjustment”).
Attentional bias – the tendency to neglect relevant data when making judgments of a correlation or association.
Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
Availability cascade – a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).
Base rate neglect or Base rate fallacy – the tendency to base judgments on specifics, ignoring general statistical information.[28]
Belief bias – an effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.[29]
Clustering illusion – the tendency to see patterns where actually none exist.
Conjunction fallacy – the tendency to assume that specific conditions are more probable than general ones.[30]
Forward Bias – the tendency to create models based on past data which are validated only against that past data.
Gambler’s fallacy – the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the Law of large numbers. For example, “I’ve flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads.”
Hindsight bias – sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable[31] at the time those events happened (sometimes phrased as “Hindsight is 20/20”).
Illusory correlation – inaccurately perceiving a relationship between two events, either because of prejudice or selective processing of information.[32]
Observer-expectancy effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
Optimism bias – the tendency to be over-optimistic about the outcome of planned actions.[33]
Ostrich effect – ignoring an obvious (negative) situation.
Overconfidence effect – excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.[34][35]
Positive outcome bias – the tendency of one to overestimate the probability of a favorable outcome coming to pass in a given situation (see also wishful thinking, optimism bias, and valence effect).
Pareidolia – a vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse.
Pessimism bias – the tendency for some people, especially those suffering from depression, to overestimate the likelihood of negative things happening to them.
Primacy effect – the tendency to weigh initial events more than subsequent events.[36]
Recency effect – the tendency to weigh recent events more than earlier events (see also peak-end rule).
Disregard of regression toward the mean – the tendency to expect extreme performance to continue.
Stereotyping – expecting a member of a group to have certain characteristics without having actual information about that individual.
Subadditivity effect – the tendency to judge probability of the whole to be less than the probabilities of the parts.
Subjective validation – perception that something is true if a subject’s belief demands it to be true. Also assigns perceived connections between coincidences.
Well travelled road effect – underestimation of the duration taken to traverse oft-traveled routes and overestimation of the duration taken to traverse less familiar routes.
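The base rate fallacy in the list above is one of the few entries that reduces to plain arithmetic, via Bayes’ theorem. A minimal sketch; the disease-and-test numbers below are invented for illustration and are not from the text.

```python
def posterior(prior, sensitivity, false_positive_rate):
    # Bayes' theorem for P(condition | positive test):
    #   prior: base rate of the condition in the population
    #   sensitivity: P(positive | condition)
    #   false_positive_rate: P(positive | no condition)
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# A test that is right 99% of the time, for a 1-in-1000 condition:
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.01)
# Neglecting the base rate, intuition says ~99%; Bayes gives ~9%.
```

The gap between the intuitive answer and the computed one is exactly the bias: the specific test result swamps the general statistical information in most people’s judgment.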

Social biases

Most of these biases are labeled as attributional biases.

Actor–observer bias – the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also Fundamental attribution error), and for explanations of one’s own behaviors to do the opposite (that is, to overemphasize the influence of our situation and underemphasize the influence of our own personality).
Dunning–Kruger effect – a twofold bias. On one hand the lack of metacognitive ability deludes people, who overrate their capabilities. On the other hand, skilled people underrate their abilities, as they assume the others have a similar understanding.[37]
Egocentric bias – occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
Forer effect (aka Barnum effect) – the tendency to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people. For example, horoscopes.
False consensus effect – the tendency for people to overestimate the degree to which others agree with them.[38]
Fundamental attribution error – the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).[39]
Halo effect – the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them (see also physical attractiveness stereotype).[40]
Illusion of asymmetric insight – people perceive their knowledge of their peers to surpass their peers’ knowledge of them.[41]
Illusion of transparency – people overestimate others’ ability to know them, and they also overestimate their ability to know others.
Illusory superiority – overestimating one’s desirable qualities, and underestimating undesirable qualities, relative to other people. (Also known as “Lake Wobegon effect,” “better-than-average effect,” or “superiority bias”).[42]
Ingroup bias – the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
Just-world phenomenon – the tendency for people to believe that the world is just and therefore people “get what they deserve.”
Moral luck – the tendency for people to ascribe greater or lesser moral standing based on the outcome of an event rather than the intention
Outgroup homogeneity bias – individuals see members of their own group as being relatively more varied than members of other groups.[43]
Projection bias – the tendency to unconsciously assume that others (or one’s future selves) share one’s current emotional states, thoughts and values.[44]
Self-serving bias – the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).[45]
System justification – the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
Trait ascription bias – the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
Ultimate attribution error – similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.

By: admin, Thu, 06 Oct 2011 04:25:32 +0000

Fundie writes: “As a libertarian, I don’t presume any of the things Mike says I do.” Say what? I didn’t say you thought anything! Indeed, I have no idea what your ideas are on such matters! What made you think (and write) that any ideas I’ve expressed are also yours? — Mike Darwin

By: Fundie, Wed, 05 Oct 2011 20:45:44 +0000

I am firmly with Abelard’s comment, here. As a libertarian, I don’t presume any of the things Mike says I do. In fact, I presume pretty much the exact opposite, and I conclude after much thought that the best solution is to empower people to protect themselves from the consequences in the way they personally deem best.

But I figure that’s beginning to get off-topic. :)