BBO Discussion Forums: Smart math people, help?


Smart math people, help?

#81 User is offline   david_c 

  • Group: Advanced Members
  • Posts: 1,178
  • Joined: 2004-November-14
  • Location:England
  • Interests:Mathematics; 20th century classical music; Composing.

Posted 2007-July-19, 11:33

jtfanclub, on Jul 19 2007, 06:14 PM, said:

david_c, on Jul 19 2007, 11:54 AM, said:

This is all true, but it only gives you useful information in the case where the expected amount of money in the envelopes is finite. Since we are told that the amounts in the envelopes can "approach infinity", it is perfectly possible that the expectation is infinite.

So, you open the envelope, and it has infinity dollars in it.

No it doesn't.

The amounts in the envelopes are always finite, but they can be chosen in such a way that the expectation before you open an envelope is infinite.

If you don't understand how this is possible you need to learn some basic probability theory.

#82 User is offline   helene_t 

  • The Abbess
  • Group: Advanced Members
  • Posts: 17,198
  • Joined: 2004-April-22
  • Gender:Female
  • Location:Copenhagen, Denmark
  • Interests:History, languages

Posted 2007-July-19, 11:42

david_c, on Jul 19 2007, 07:30 PM, said:

helene_t, on Jul 19 2007, 06:22 PM, said:

This sounds a little strange. Did I do something wrong?

No, you didn't. And like I keep saying, it is perfectly possible for it to be right to switch no matter what amount you find in the envelope. Does that sound strange too?

I suddenly understand why most of the theorems whose proofs I was too lazy to learn at uni started with conditions like "consider a probability measure with finite mean" or "... finite variance". Loosening those conditions would have led to absurd results, like gamblers expecting either to lose or gain an infinity of dollars and still basing their decision on a possible gain of $2. But in fact they do, at least if they work for the British government (it is described in the book "Parkinson's Law") :lol:
The world would be such a happy place, if only everyone played Acol :) --- TramTicket

#83 User is offline   jtfanclub 

  • Group: Advanced Members
  • Posts: 3,937
  • Joined: 2004-June-05

Posted 2007-July-19, 12:07

david_c, on Jul 19 2007, 12:33 PM, said:

No it doesn't.

The amounts in the envelopes are always finite, but they can be chosen in such a way that the expectation before you open an envelope is infinite.

If you don't understand how this is possible you need to learn some basic probability theory.

By all means, teach me.

You have an envelope with 2N times 2^(-X) dollars in it, where N is the amount of money in the largest envelope and X is an integer ranging from 1 to the number of envelopes, so the 2nd-largest envelope will contain 2N times 2^(-2).

You can switch that envelope for an adjacent envelope: i.e., a 50% chance of switching to X' = X+1, and 50% that X' = X-1. Should you make the switch?

That's the actual formula, crudely translated into English.

When you actually use the formula, instead of just English, it should become clear that if N is infinite, the question becomes meaningless, and if N is not infinite, then the expectation cannot be infinite.

#84 User is offline   david_c 

  • Group: Advanced Members
  • Posts: 1,178
  • Joined: 2004-November-14
  • Location:England
  • Interests:Mathematics; 20th century classical music; Composing.

Posted 2007-July-19, 12:57

jtfanclub, on Jul 19 2007, 07:07 PM, said:

david_c, on Jul 19 2007, 12:33 PM, said:

No it doesn't.

The amounts in the envelopes are always finite, but they can be chosen in such a way that the expectation before you open an envelope is infinite.

If you don't understand how this is possible you need to learn some basic probability theory.

By all means, teach me.

You have an envelope with 2N times 2^(-X) dollars in it, where N is the amount of money in the largest envelope and X is an integer ranging from 1 to the number of envelopes, so the 2nd-largest envelope will contain 2N times 2^(-2).

You can switch that envelope for an adjacent envelope: i.e., a 50% chance of switching to X' = X+1, and 50% that X' = X-1. Should you make the switch?

That's the actual formula, crudely translated into English.

When you actually use the formula, instead of just English, it should become clear that if N is infinite, the question becomes meaningless, and if N is not infinite, then the expectation cannot be infinite.

The way you've chosen to set this up (having a fixed number of envelopes to choose from) means that there are only a finite number of different amounts that can be in the two envelopes.

But this needn't be the case. If someone asks you to put two amounts of money into the envelopes, with only the condition that one must be twice as much as the other, there are (in theory) infinitely many different ways you could do it.

Here's one way you might go about it. Pick a number at random between 0 and 1. Let's call it x. Now,

If 1/2 < x <= 1, then let the amounts in the two envelopes be $1 and $2. (And toss a coin, say, to decide which envelope gets the larger amount.)
If 1/4 < x <= 1/2, then let the amounts in the two envelopes be $2 and $4.
If 1/8 < x <= 1/4, then let the amounts in the two envelopes be $4 and $8.
... and so on.
Finally, if x=0, let's say (for the sake of it) that the amounts are $5 and $10.

That covers all possible cases, and no matter what x you've chosen, the amounts in the two envelopes are always finite.

But, if you do it this way, the expected amount of money in the less valuable envelope is

(1/2 x 1) + (1/4 x 2) + (1/8 x 4) + ...

which is 1/2 + 1/2 + 1/2 + ...

which is infinite.

This is basically the same situation as the one Helene looked at.

If you want an example where it is always "right" to switch, then you could take something like the one I gave before, where the probability of the smaller amount being 2^n was

k.(1-t)^(|n|)

for some very small number t. (Anything sufficiently small will work.) Here k is whatever constant is required to make the sum of the probabilities equal 1.
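Here is a rough Python sketch of the first construction above (just an illustration, not part of the original argument; the band boundaries are handled up to a measure-zero convention). Every sampled pair is finite, but the running average of the smaller amount never settles down, which is the fingerprint of an infinite expectation:

import random

def deal_envelopes():
    # With probability 1/2 the amounts are ($1, $2), with probability 1/4
    # they are ($2, $4), with probability 1/8 they are ($4, $8), and so on.
    x = random.random()              # x uniform in [0, 1)
    if x == 0.0:
        return 5, 10                 # the special case for x = 0
    n = 1
    while x < 2.0 ** -n:             # locate the band 2^-n < x <= 2^-(n-1)
        n += 1
    smaller = 2 ** (n - 1)
    return smaller, 2 * smaller

random.seed(1)
total = 0
for i in range(1, 10**6 + 1):
    total += deal_envelopes()[0]
    if i in (10**3, 10**4, 10**5, 10**6):
        print(i, "deals; average smaller amount so far:", total / i)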

#85 User is offline   frouu 

  • Group: Full Members
  • Posts: 90
  • Joined: 2007-January-13

Posted 2007-July-19, 13:29

Well, I don't know whether this paradox has been mentioned, but here it is:

There's a game that you have a 50% chance of winning.
If you bet $x and win, you get $2x back (total profit is $x).


So the gambler first bets $1. If he wins, his profit is $1.
If not, the next game he bets $2; if he wins, his total profit is 2-1 = $1.
If he can't win the 2nd game, he bets $4; if he wins, his total profit is 4-2-1 = $1.

If not, he doubles his bet and tries again. With this strategy his expected profit is always $1 (or whatever he starts with) :)
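A quick sanity check with a finite bankroll (a rough Python sketch of my own, assuming a fair coin): with $1000 the strategy wins its $1 about 511 times out of 512, but the one bust loses $511, so the average comes out to zero.

import random

def martingale_session(bankroll):
    # Play the doubling strategy once.  Returns +1 if a bet is eventually won,
    # or the (negative) total lost once the bankroll cannot cover the next bet.
    bet, lost = 1, 0
    while bet <= bankroll - lost:    # stop when the next doubled bet can't be covered
        if random.random() < 0.5:    # fair coin; a win pays back 2x the stake
            return bet - lost        # the net result is always +$1 here
        lost += bet
        bet *= 2
    return -lost                     # busted

random.seed(0)
results = [martingale_session(bankroll=1000) for _ in range(200_000)]
print("P(ending +$1):", sum(r > 0 for r in results) / len(results))   # about 511/512
print("average result:", sum(results) / len(results))                 # close to 0, not +1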

#86 User is offline   Echognome 

  • Deipnosophist
  • Group: Advanced Members
  • Posts: 4,386
  • Joined: 2005-March-22

Posted 2007-July-19, 13:31

frouu's setup is known as the Gambler's Ruin.

The solution to that one is fairly obvious. The gambler has a positive probability of busting (hitting the limit of his funds, so that he cannot double the bet again).
"Half the people you know are below average." - Steven Wright

#87 User is offline   frouu 

  • Group: Full Members
  • Posts: 90
  • Joined: 2007-January-13

Posted 2007-July-19, 13:38

david_c, on Jul 19 2007, 05:31 AM, said:

frouu, on Jul 19 2007, 08:03 AM, said:

david_c, on Jul 18 2007, 08:53 PM, said:


[For the smart math people: just take a very slowly decaying distribution. For example, let the probability that the envelopes contain $(2^n) and $(2*2^n) be k.(1-epsilon)^(|n|).]

In this case, when analysing whether it's right to switch for a particular amount M, the calculations are so close to what they would be if the probabilities were 1/2 that it makes no difference. The conclusion is

For every amount M you see in the envelope, your expectation if you switch is greater than M.

(In fact, greater than 1.2M, say. We can get any multiple less than 1.25.)

Is this a paradox? It shouldn't be - it's true. Do you believe me?

I don't believe you because you're restricting the outcomes of the experiment to the {k*2^n} sequence and you don't know a priori what "k" is.

Well, two points:

(i) You've misread the example. k is a normalising factor for the probabilities, to make sure they sum to 1.

(ii) Yes it's a discrete distribution. That's the easiest example to write down. However, you could find a continuous distribution with the same properties if you like.

I'm sorry, my "k" and your "k" had different meanings; I meant {alpha*2^n}.

Anyway, you're right.
The problem occurs because there's a nonzero probability that you may see $2^10000 in the first envelope, i.e. there's no meaningful limit. It's like the gambler's paradox I mentioned above. The amount of money available in the world is finite.

So if you put a limit on the outcome of the events, then switching isn't necessarily better.

#88 User is offline   helene_t 

  • The Abbess
  • Group: Advanced Members
  • Posts: 17,198
  • Joined: 2004-April-22
  • Gender:Female
  • Location:Copenhagen, Denmark
  • Interests:History, languages

Posted 2007-July-19, 13:56

Oh my, this wikipedia article contains everything that has been said in this thread. I suppose the reason why my colleagues hate wikipedia (yes, I'm a statistician) is that they are afraid it will make us all unemployed. (Or worse, make it obsolete to write on the BBO forum at work.)
The world would be such a happy place, if only everyone played Acol :) --- TramTicket

#89 User is offline   EricK 

  • Group: Advanced Members
  • Posts: 2,303
  • Joined: 2003-February-14
  • Location:England

Posted 2007-July-19, 15:26

helene_t, on Jul 19 2007, 07:56 PM, said:

Oh my, this wikipedia article contains everything that has been said in this thread. I suppose the reason why my colleagues hate wikipedia (yes, I'm a statistician) is that they are afraid it will make us all unemployed. (Or worse, make it obsolete to write on the BBO forum at work.)

This thread brings back memories. The first ever page I edited on Wikipedia was on the envelope paradox!

It was in the very early days of the article, and I added the section about Raymond Smullyan's reworking of the problem to not involve probabilities at all. It has all been rewritten countless times since, and I think the only words of mine which remain unchanged are in the Bibliography section!

#90 User is offline   luke warm 

  • Group: Advanced Members
  • Posts: 6,951
  • Joined: 2003-September-07
  • Gender:Male
  • Interests:Bridge, poker, politics

Posted 2007-July-19, 17:14

i really haven't been able to tell when (or even if) we got away from justin's original puzzle... to refresh my mind, i'll paste it below

Quote

2 [people] were having the following discussion:

A: Say you have 2 envelopes filled with money. The value of one envelope is half the value of the other. These values can approach infinite $.

B: Ok.

A: Say you are handed one envelope and open it: it contains 5000 dollars. You are offered the choice to switch envelopes; do you?

han answered and said that to switch would not result in a higher EV, and from what i've read almost everyone with a mathematical bent has agreed with that... but i still don't quite understand

the 'givens' of justin's problem were:
1) there are 2 and only 2 envelopes
2) the value of 1 of the envelopes is exactly half the value of the other
3) you open an envelope and it contains $5,000

are these statements accurate?
1) if i switch i'll either have $2,500 or $10,000
2) if i switch and am "wrong" i'll lose 50% of what i have in hand
3) if i switch and am "right" i'll gain 100% of what i have in hand

assuming one can live with oneself if wrong, why is it not better to switch? layman's terms please
"Paul Krugman is a stupid person's idea of what a smart person sounds like." Newt Gingrich (paraphrased)

#91 User is offline   Trumpace 

  • Hideous Rabbit
  • Group: Advanced Members
  • Posts: 1,040
  • Joined: 2005-January-22
  • Gender:Male

Posted 2007-July-19, 17:20

luke warm, on Jul 19 2007, 06:14 PM, said:

han answered and said that to switch would not result in a higher EV, and from what i've read almost everyone with a mathematical bent has agreed with that... but i still don't quite understand

the 'givens' of justin's problem were:
1) there are 2 and only 2 envelopes
2) the value of 1 of the envelopes is exactly half the value of the other
3) you open an envelope and it contains $5,000

are these statements accurate?
1) if i switch i'll either have $2,500 or $10,000
2) if i switch and am "wrong" i'll lose 50% of what i have in hand
3) if i switch and am "right" i'll gain 100% of what i have in hand

assuming one can live with oneself if wrong, why is it not better to switch? layman's terms please

I am not sure if Hannie said that a switch will not result in a higher EV.

In fact, the reason that _always switch_ does not work is that we can choose the numbers in such a way (probability-wise) that switching will reduce your EV.

In fact, we could also choose the numbers in such a way that switching will increase your EV.

Basically, as the problem is stated, we have insufficient information to decide whether to switch.

#92 User is offline   luke warm 

  • Group: Advanced Members
  • Posts: 6,951
  • Joined: 2003-September-07
  • Gender:Male
  • Interests:Bridge, poker, politics

Posted 2007-July-19, 18:08

if i misstated han's position, i apologize... so are you saying the answers to my questions are true or false? also, taking the original as given, why do we not have enough info? what other info do you need?
"Paul Krugman is a stupid person's idea of what a smart person sounds like." Newt Gingrich (paraphrased)

#93 User is offline   cherdano 

  • 5555
  • Group: Advanced Members
  • Posts: 9,519
  • Joined: 2003-September-04
  • Gender:Male

Posted 2007-July-19, 18:28

luke warm, on Jul 19 2007, 05:14 PM, said:

but i still don't quite understand

the 'givens' of justin's problem were:
1) there are 2 and only 2 envelopes
2) the value of 1 of the envelopes is exactly half the value of the other
3) you open an envelope and it contains $5,000

are these statements accurate?
1) if i switch i'll either have $2,500 or $10,000
2) if i switch and am "wrong" i'll lose 50% of what i have in hand
3) if i switch and am "right" i'll gain 100% of what i have in hand

assuming one can live with oneself if wrong, why is it not better to switch? layman's terms please

You are assuming that being right and being wrong are equally likely. This can't be right for all possible amounts of money you find in the envelope, for the reasons explained by several in this thread.
The easiest way to count losers is to line up the people who talk about loser count, and count them. -Kieran Dyke

#94 User is offline   cherdano 

  • 5555
  • Group: Advanced Members
  • Posts: 9,519
  • Joined: 2003-September-04
  • Gender:Male

Posted 2007-July-19, 18:33

Btw, somewhat related is the following fact: If you look at (large) numbers that you happen to encounter (a distance in some unit, the annual earnings of a company in $, the number of cells in your brain, the number of donuts sold per year in the US), many more than 1/9 of them begin with 1. Why is that?
The easiest way to count losers is to line up the people who talk about loser count, and count them. -Kieran Dyke

#95 User is offline   Trumpace 

  • Hideous Rabbit
  • Group: Advanced Members
  • Posts: 1,040
  • Joined: 2005-January-22
  • Gender:Male

Posted 2007-July-19, 18:44

luke warm, on Jul 19 2007, 07:08 PM, said:

if i misstated han's position, i apologize... so are you saying the answers to my questions are true or false? also, taking the original as given, why do we not have enough info? what other info do you need?

It completely depends on the way the numbers are chosen.

There are ways of choosing the numbers so that if you open the envelope and see $5000, you should switch.

There are ways of choosing the numbers so that if you open the envelope and see $5000, you should stick to it.

So the information we require is "how were the numbers chosen?".


For instance, consider the following classic problem, known as Bertrand's paradox.

(Note: I am using the site http://www.cut-the-k.../bertrand.shtml for the write-ups.)

Question: "Given a circle. Find the probability that a chord chosen at random be longer than the side of an inscribed equilateral triangle."

There are at least three ways of looking at it:

1) Probability = 1/3

We have to choose randomly two points on a circle and measure the distance between the two. Therefore, the only important thing is the position of the second point relative to the first one. In other words, position of the first point has no effect on the outcome. Thus let us fix the point A and consider only the chords that emanate from this point. Then it becomes clear that 1/3 of the outcomes will result in a chord longer than the side of an equilateral triangle.

2) Probability = 1/4

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints inside a smaller circle with radius equal to 1/2 that of the given one. Hence, its area is 1/4 of the big circle which also defines the proportion of favorable outcomes - 1/4.

3) Probability = 1/2

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints closer to the center than half the radius. If the midpoints are distributed uniformly over the radius (instead of over the area, as was the case in the second solution), the probability becomes 1/2.

Which of the three answers is right?
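To make "chosen at random" concrete, here is a rough Monte Carlo sketch (my own Python illustration, not from the cut-the-knot page; for a unit circle the triangle's side is sqrt(3), and a chord is longer than that exactly when its midpoint lies within distance 1/2 of the centre):

import math, random

random.seed(0)
N = 1_000_000
side = math.sqrt(3)          # side of the equilateral triangle inscribed in a unit circle

def chord_from_midpoint(r):
    # a chord whose midpoint is at distance r from the centre has length 2*sqrt(1 - r^2)
    return 2 * math.sqrt(1 - r * r)

# 1) two endpoints chosen uniformly on the circumference
m1 = sum(2 * math.sin(abs(random.uniform(0, 2 * math.pi)
                          - random.uniform(0, 2 * math.pi)) / 2) > side
         for _ in range(N)) / N

# 2) midpoint chosen uniformly over the area of the disc
m2 = sum(chord_from_midpoint(math.sqrt(random.random())) > side for _ in range(N)) / N

# 3) midpoint chosen uniformly along a fixed radius
m3 = sum(chord_from_midpoint(random.random()) > side for _ in range(N)) / N

print(m1, m2, m3)            # comes out near 1/3, 1/4 and 1/2 respectively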

#96 User is offline   david_c 

  • Group: Advanced Members
  • Posts: 1,178
  • Joined: 2004-November-14
  • Location:England
  • Interests:Mathematics; 20th century classical music; Composing.

Posted 2007-July-19, 19:25

cherdano, on Jul 20 2007, 01:28 AM, said:

You are assuming that being right and being wrong are equally likely. This can't be right for all possible amounts of money you find in the envelope, for the reasons explained by several in this thread.

:D Again, this is all true, but if you think this is the explanation for the paradox then you're missing the really beautiful part.

OK, let's actually play this damn game. I'm going to put the amounts of money in the envelopes for you. I'm going to do it in such a way that the probability of the envelopes containing $M and $2M is at least 9/10 of the probability of the envelopes containing $M/2 and $M, for all possible values of $M. (Let's say the amounts will always be a power of 2, for simplicity.)

Trivial calculation: this means that if you open the envelope and find that it contains $M, then the expected amount in the other envelope, given this information, is at least $(6M/5). [In fact the best possible bound involves 19s, but I want to keep the numbers simple.]
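[Spelling that out, for anyone who wants the arithmetic (nothing new, just the steps): write p(M) for the probability that the pair is ($M, $2M), so the assumption is p(M) >= (9/10).p(M/2). Since you were equally likely to open either envelope of the pair, seeing $M gives

P(pair is (M, 2M) | you see M) = p(M) / (p(M) + p(M/2)) >= (9/10) / (9/10 + 1) = 9/19,

so the expected amount in the other envelope is at least

(9/19).(2M) + (10/19).(M/2) = 23M/19, which is about 1.21M, and in particular more than 6M/5.]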

But what does this actually mean? Does it mean that it's always right to switch? Well, yes and no. It turns out to be difficult to pin down exactly what we mean by "the right thing to do". But we can get some insight into what's going on by looking at what happens if you adopt various strategies.

Let's play this game a large number of times, and keep track of how much money you've collected in total. We'll also keep track of how much money you would have won if you'd never switched. We'll say you're "in profit" if the amount of money you've collected is bigger than the amount you would have got by never switching. Sounds fair enough?

OK, the first strategy we'll look at is if you only switch if the amount in your envelope is $1. This guarantees you a profit of $1 every time it happens (assuming you can't get any smaller amounts of money). So, in the long run you will be making a profit, albeit slowly.

Now, let's instead try the strategy where you switch only if the amount in your envelope is $2. From the calculation we did earlier, we expect to make a profit on average when this comes up, of over 40 cents per time. Again, in the long run you are going to make a profit. It's not a guaranteed profit any more, but the probability that you will be in profit after playing, say, 1000000 times is very close to 1.

I hope there is nothing paradoxical about this so far.

But we can do the same thing for any other amount. No matter what amount we choose, we expect to make a profit in the long run, provided that we only switch when that amount comes up.

Now here's another strategy: let's switch provided that the amount in our envelope is at most $64. Now, this means that we will be switching a lot of the time when the amounts in the envelopes are $1 and $2. These will cancel out in the long run, since on those occasions we expect to open the envelope with the larger amount exactly half the time. The same goes for when the envelopes contain $2 and $4, or when they contain $4 and $8, and so on - these situations all give us zero profit. BUT we are also switching sometimes when the envelopes contain $64 and $128, and here our cleverly-chosen strategy means we only switch when it is right for us to do so! So again, by following this strategy we will be making a profit in the long run.

More generally, we can choose any finite subset of the possible amounts, and if we only switch when we get one of those amounts, we will make a profit (which you can calculate in terms of expected gain per play). If we have a fixed strategy like this, then the probability that we are in profit after N plays approaches 1 as N gets large.

And the more often you switch, the larger your expected profit per play is.

But now let's try and be greedy and switch every time. Now, we are no longer steadily accumulating profit at a rate we can calculate. [In fact the expectation is undefined, because the money that you expect to make is infinite whether you choose to switch or not.] If you track what happens to your profit as time goes on, you will find that at times you will be making huge profits: if you look at your maximum profit of all times up to the present, then this will get larger and larger as time goes on. But at other times you will be making huge losses, and the size of these losses will also approach infinity. In fact, you will find yourself in profit exactly half the time in the long run. There will be infinitely many points in time when you are making a profit, but also infinitely many points in time when you are making a loss.

So, if you want to actually make a steady profit, you should only switch when you get a particular amount [or amounts] in your envelope. This guarantees ("almost surely" as the experts say) that whatever profit you want to make, you will eventually reach that level and stay there for the rest of time. If you switch all the time, then you will eventually reach that level but you won't stay there.
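For anyone who wants to watch this happen, here is a rough Python sketch (my own; it uses one concrete distribution satisfying the 9/10 condition above, namely P(pair is $2^n and $2^(n+1)) proportional to 0.95^n, and measures each strategy's total winnings minus the total for never switching):

import random

random.seed(2)
r = 0.95                                  # ratio between consecutive pairs, at least 9/10

def deal():
    n = 0
    while random.random() < r:            # geometric: P(n) = (1 - r) * r^n
        n += 1
    small = 2 ** n
    if random.random() < 0.5:
        return small, 2 * small           # (opened, other)
    return 2 * small, small

def profit_vs_never_switching(should_switch, deals=200_000):
    total = 0
    for _ in range(deals):
        opened, other = deal()
        if should_switch(opened):
            total += other - opened
    return total

print("switch only on $1  :", profit_vs_never_switching(lambda m: m == 1))
print("switch when <= $64 :", profit_vs_never_switching(lambda m: m <= 64))
print("always switch      :", profit_vs_never_switching(lambda m: True))

The first two come out positive run after run; the last one swings between astronomical gains and astronomical losses from seed to seed, much as described above.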

#97 User is offline   kenrexford 

  • Brain Farts and Actual Farts Increasing with Age
  • Group: Advanced Members
  • Posts: 9,586
  • Joined: 2005-September-21
  • Gender:Male
  • Location:Lima, Allen County, North-West-Central Ohio, USA
  • Interests:www.limadbc.blogspot.com editor/contributor

Posted 2007-July-19, 20:49

I'll admit to only reading a part of this, but one thought has occurred to me. It seems that we have an informational advantage that skews this thing and that, in fact, the tendency is toward a gain on a switch.

That informational advantage is that we know that we are on the winning side of the equation because we know that the amount is positive. If the amount was negative (do you owe $5000 or do you owe double or half that), then a switch would not benefit us.

It also seems like we are moving in two different directions, in a sense. From one standpoint, we gain or lose the same amount. It is only from another standpoint, the one we care about, that we gain or lose by different amounts.

I mean, take, for instance, the question of whether a "50-50" game is a win or loss proposition. I have $5 in my pocket. I can toss a coin with the chap next to me, who bets his $5, and whoever wins gets all of it. This seems like an equal chance, equal gain.

However, if it takes $10 to buy a ticket, $5 does nothing, as does $0. They are equally useless. So, I gain more than I lose if I gamble.

A similar concept might occur, and perhaps more on point, if I were to consider being on the top of a mountain. If I move two feet to the right, I might be OK. However, if I move two feet to the left, the slope may be so steep that I fall.

So, I think you gain by the exchange. I think you gain because you are talking two-dimensional instead of one-dimensional, and you care only about the X axis but are trading on the Y axis.
"Gibberish in, gibberish out. A trial judge, three sets of lawyers, and now three appellate judges cannot agree on what this law means. And we ask police officers, prosecutors, defense lawyers, and citizens to enforce or abide by it? The legislature continues to write unreadable statutes. Gibberish should not be enforced as law."

-P.J. Painter.

#98 User is offline   helene_t 

  • The Abbess
  • Group: Advanced Members
  • Posts: 17,198
  • Joined: 2004-April-22
  • Gender:Female
  • Location:Copenhagen, Denmark
  • Interests:History, languages

Posted 2007-July-19, 22:56

One should keep in mind that experiments with an infinity of possible outcomes are mathematical abstractions. Experiments that can be conducted in real life have a finite number of outcomes, if only because our finite number of brain cells puts an upper limit on the number of different outcomes we can distinguish.

For many purposes, the mathematical abstraction is adequate, since the upper bound in the real-life experiment is unknown and it doesn't matter what it is, and it's easier to calculate with the symbol "Inf" than with some very large upper bound, especially if one would otherwise have to try many different values of that upper bound to make sure the arbitrary choice doesn't matter.

But whenever the reasoning makes use of the assumption that there is no upper bound, as opposed to a very large upper bound, one should be cautious. In the envelope case, the "paradox" vanishes completely if you assume that no envelope can contain more than, say, $1,000,000,000. This is because the sure loss when the first envelope contains more than $500,000,000 is so large that, however unlikely it is, on average it cancels out the expected gain when the first envelope contains a smaller amount. (Or something like that; again, the details depend on the probabilities.)

Jimmy said:

layman's terms please

We have done our best. It's not that easy to explain; actually, I didn't fully understand the problem before David pointed me to the consequences of assuming a probability distribution with infinite mean. The wiki article does a good job. I still think that the male-and-female-weight problem should be easier to understand, but since no-one seems to appreciate it, maybe I should stop promoting it :D

Arend said:

If you look at (large) numbers that you happen to encounter (a distance in some unit, the annual earnings of a company in $, the number of cells in your brain, the number of donuts sold per year in the US), many more than 1/9 of them begin with 1. Why is that?

Take a real stochastic variable that is known to be positive and has a known mean. It follows from the maximum entropy principle that it must follow an exponential distribution. It also follows from information theory that you need some prior knowledge about the variable (not necessarily the exact value of the mean, but otherwise something equivalent) in order to observe it. Other prior information may give you a different distribution, but in general it will be something with a long concave tail, which is also what one observes in real-life histograms.

Such distributions favor numbers that begin with "1". This is because, on a log-2 scale, the range [1; 1.999999] has width 1, while the ranges corresponding to the other leading digits are slimmer.
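For reference (my addition, the same argument run in base 10): if the logarithm of the quantity is spread smoothly over several orders of magnitude, the leading digit d appears with probability log10(1 + 1/d), which is the usual statement of Benford's law:

import math

for d in range(1, 10):
    print(d, round(math.log10(1 + 1 / d), 3))
# digit 1: about 0.301 of the time; digit 9: only about 0.046 -- far from a uniform 1/9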
The world would be such a happy place, if only everyone played Acol :) --- TramTicket

#99 User is offline   barmar 

  • Group: Admin
  • Posts: 21,594
  • Joined: 2004-August-21
  • Gender:Male

Posted 2007-July-20, 01:49

Trumpace, on Jul 19 2007, 07:44 PM, said:

For instance, consider the following classic problem, known as Bertrand's paradox.

(Note: I am using the site http://www.cut-the-k.../bertrand.shtml for the write-ups.)

Question: "Given a circle. Find the probability that a chord chosen at random be longer than the side of an inscribed equilateral triangle."

There are at least three ways of looking at it:

1) Probability = 1/3

We have to choose randomly two points on a circle and measure the distance between the two. Therefore, the only important thing is the position of the second point relative to the first one. In other words, position of the first point has no effect on the outcome. Thus let us fix the point A and consider only the chords that emanate from this point. Then it becomes clear that 1/3 of the outcomes will result in a chord longer than the side of an equilateral triangle.

2) Probability = 1/4

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints inside a smaller circle with radius equal to 1/2 that of the given one. Hence, its area is 1/4 of the big circle which also defines the proportion of favorable outcomes - 1/4.

3) Probability = 1/2

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints closer to the center than half the radius. If the midpoints are distributed uniformly over the radius (instead of over the area, as was the case in the second solution), the probability becomes 1/2.

Which of the three answers is right?

I think I've got this one. The problem is with the informal phrase "chosen at random". To calculate the probability of a result, you need to know the probability distribution of this input. Even if you assume it means a uniform distribution (the typical layman's definition), is it uniform along the circumference (result 1), over the area of the circle (result 2), or along the radius (result 3)? Going 1/4 of the way around the circumference doesn't result in the same chord as choosing a midpoint 1/4 of the way towards the center.

#100 User is offline   barmar 

  • Group: Admin
  • Posts: 21,594
  • Joined: 2004-August-21
  • Gender:Male

Posted 2007-July-20, 02:02

A number of responders have answered based on the fact that in the real world there isn't an unbounded amount of money -- if one envelope contains a trillion dollars, it's intuitively more likely that the other one only contains a half trillion rather than 2 trillion. But this is supposed to be a math paradox, and math works in an ideal, platonic world in which there are no fundamental limits.

Is this an equivalent formulation of the original problem that avoids that issue? Rather than having money in the envelopes, they each just contain a piece of paper with a number on it, and the numbers are different. There are also two prize checks, one for $10,000 and the other for $5,000. If you eventually pick the envelope with the higher number, you win and get the bigger check; otherwise you get the smaller check.

Now that we're just writing numbers, there's no limit. The numbers don't have to be written out in full, they can be in exponential notation, names like Avogadro's number, googol, or googolplex, or expressions like 2*googol. So if you pick an envelope that contains googol, the other envelope could contain googol+1 or googol*googol, but just as likely contain googol-1 or sqrt(googol).
