I've often been accused of being a pessimist. I'd like to think of myself, rather, as a realist.
Like most people, however, I dislike being around grumpy people. These people have also been termed pessimists. They seem to always expect the worst in others---and many times, they eventually turn out to be right.
I have developed my philosophy after many years of observation and testing: expect the best, be prepared for the worst. You've probably heard this before; the concept is certainly not new. But why should we expect the best? Through my observations, I've learned that people tend to eventually get what they expect. This is a very strange result, but in my experience and tests, it seems to hold with some regularity.
My theory is that there are a couple of reasons for this. One involves other people, and the other involves yourself. People are very good at perceiving the expectations of others, and they generally tend to follow the norms present around them---thus, if you want people to behave a certain way, simply be surprised when they do not. Another reason expecting the best seems to work is that it changes your own attitude. No longer are you frightened to try new things, no longer are you bitter, no longer are you stuck.
Saturday, June 20, 2009
Tuesday, June 16, 2009
Decisions
Last night I watched an interesting talk on how people make silly decisions.
The most interesting part of this talk was when he showed how people's decisions became more rational as they distanced themselves from the consequences. I thought, however, that much of his actual theory was far too simplified to be reasonable. For instance, his expectation equation is very simple: Utility * Probability of Payoff. This does not take into account the possible negative payoff in the case of failure. For example, say you were going to rob a bank, which would give you a 5 million dollar payoff. Let's for now assume you have no moral qualms about robbing a bank (i.e. you're Ben Bernanke). Now let's say your probability of success is 0.05. This means your expected payoff is 5,000,000 * 0.05 = 250,000. If it costs less than 250,000 to attempt this caper, you should go for it, right? Not necessarily. What about if you fail? There is also a cost for failing. So, in every decision there is a payoff for success, a payoff for failure, a fixed price of attempting, and a price for not attempting. Imagine a binary decision---to act, or not to act. You could quantify the expectation of acting as:
({probability of success} * {utility of success}) +
((1 - {probability of success}) * {utility of failure}) -
{cost of attempt}.
This must be weighed against the expectation of not acting; a purely rational agent should take whichever option has the higher expected value.
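To make this concrete, here is a minimal Python sketch of the comparison, using the numbers from the bank example above; the utility of failure, the cost of the attempt, and the value of not acting are assumptions I am inventing purely for illustration.

def expected_value_of_acting(p_success, utility_success, utility_failure, cost_of_attempt):
    # E[act] = p * U(success) + (1 - p) * U(failure) - cost of attempting
    return (p_success * utility_success
            + (1 - p_success) * utility_failure
            - cost_of_attempt)

# Success-side numbers from the example; everything else is made up for illustration.
p_success = 0.05
utility_success = 5_000_000     # payoff if the caper works
utility_failure = -2_000_000    # assumed: lawyers, prison, and so on
cost_of_attempt = 100_000       # assumed: planning and equipment
utility_of_not_acting = 0       # assumed: nothing changes

e_act = expected_value_of_acting(p_success, utility_success, utility_failure, cost_of_attempt)
e_not_act = utility_of_not_acting

print(f"Naive expectation (utility * probability): {p_success * utility_success:,.0f}")
print(f"Expectation of acting:                     {e_act:,.0f}")
print(f"Expectation of not acting:                 {e_not_act:,.0f}")
print("Act" if e_act > e_not_act else "Do not act")

With the cost of failure priced in, the naive 250,000 expectation becomes roughly -1,750,000, and the rational choice flips to not acting.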
Another interesting point in the talk was that people make decisions by comparing gradients. That is, people prefer payoffs that are trending upward over ones that are trending downward, even when the total payoff is lower. There were at least two examples of this in the talk: the salary and the burger. In the case of the salary, most people chose the salary that was increasing, even though its total was less than the total of the salary that was decreasing. In the case of the burger, people tied its worth to what they had paid for it in the past. I do not see this as being dumb or ignorant. In fact, we have learned from nature that things at rest tend to stay at rest and things in motion tend to stay in motion. Thus, we use the slope of the payoff to predict future rewards.
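As a toy illustration of this slope-versus-total tension (the salary figures below are hypothetical, not the ones used in the talk):

# Hypothetical three-year salary schedules, not the figures from the talk.
rising = [40_000, 45_000, 50_000]    # increasing slope, lower total
falling = [60_000, 55_000, 50_000]   # decreasing slope, higher total

print("Total of the rising schedule: ", sum(rising))    # 135000
print("Total of the falling schedule:", sum(falling))   # 165000

# A total-maximizing agent would take the falling schedule, yet most people
# in the talk preferred the rising one: the slope is being used as a
# predictor of payoffs beyond the years actually shown.
better_total = "falling" if sum(falling) > sum(rising) else "rising"
print("Higher total payoff:", better_total)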
These shortcuts are very interesting to me, as we may need them to create machines that are able to make good decisions---or at least machines that understand the decisions that humans are making.
Friday, June 05, 2009
Facts and Theories
What causes us to accept as fact something we have not observed? Why do some people have so much faith in something? What causes someone to cling tightly to a belief?
It amazes me how some people suddenly become experts in an area simply because they heard a rumor somewhere from someone who heard it somewhere else. This phenomenon is a strange one. I hear them proclaim it with complete confidence---as if they had directly observed many facts supporting the claim and none contradicting it. But, many times, the fact is that they never did.
It seems that they so dearly want this fact to be true that they end up ignoring all evidence that contradicts it. They build up a model of the world, but then, instead of constantly revising it, they freeze it.
In science, we have the scientific method:
1. Observe.
2. Make a theory.
3. Create a test.
4. Go back to 1!
When people present their theories, they should make sure they present them as theories and not as facts. Just because you think X is Y doesn't make it so. If you want to tell me a theory, tell me that it's your theory and give me the evidence you have. If you have a fact, tell it to me as a fact; don't give me your interpretation.