juef
He/Him
Player (155)
Joined: 1/29/2007
Posts: 208
Location: Québec, Canada
That's absolutely right. I'm guessing one reason we usually don't carry those conditions along is that many equations based on real-life situations don't have such mathematical singularities.
Player (201)
Joined: 7/6/2004
Posts: 511
Ok, here is a problem (I guess it could be called a math problem?) that I was thinking about recently after a game of 21. When I play basketball with two friends and the ball goes out of bounds, the person who caused the out picks a secret number; then player 1 guesses a number, and then player 2 guesses a number. Whoever is closer gets possession. Does player 2 have an advantage?

On the one hand, if player 2 had chosen his number simultaneously, he would have an equal chance; but if he chooses simultaneously in his head, then refines it to just higher or just lower than player 1's guess, he will have an even better chance, giving him an advantage. (And this increase in chance is not infinitely small, because the random number is typically in a finite range, and the two players' predictions are typically finitely different.) But on the other hand, player 1 can just choose a number such that he thinks it is equally probable that the correct number is either higher or lower, giving him a 50% chance of winning.

(Please ignore the chance that player 1 guesses the number exactly, and that player 2 can then not guess this number. Possible ways this could be enforced: say, redo if either player gets it exact, or say the correct number has infinite precision and is sufficiently random.)

It is purposefully vague what number can be chosen; there are no limits and no fixed probability distribution to choose from. The players' chances are affected by how well they can read and predict the originator's randomness, but both players have equal knowledge of his behavior. So does player 2 have the advantage, and why?
g,o,p,i=1e4,a[10001];main(x){for(;p?g=g/x*p+a[p]*i+2*!o: 53^(printf("%.4d",o+g/i),p=i,o=g%i);a[p--]=g%x)x=p*2-1;}
Joined: 10/20/2006
Posts: 1248
If it's totally random, there's a set range, and player 1 makes a random guess, then it's pretty easy for player 2 to pick the bigger range of numbers for himself. If the selected range of numbers is uneven and player 1 takes the middle number, he has an advantage. If it's an even range, player 1 takes one of the middle numbers and player 2 the other, and their chances are equal. So player 1 would have a slight advantage for uneven ranges. Am I missing something?

If it's not totally random, then it depends on how much you can narrow it down, whether the narrowed-down space is even or uneven, etc. If player 1 is a douche, player 2 has a big advantage. If player 1 makes a smart guess, player 2 still has about a 50/50 chance of winning at worst (except if player 1 can almost exactly guess the correct number, which depends on how much you can narrow it down). So player 2 has an unfair advantage in most cases, if you ask me. But if player 1 were really, really smart and wouldn't make any mistakes, he'd have a slight advantage.
Player (201)
Joined: 7/6/2004
Posts: 511
A couple of clarifications: guesses and answers don't have to be whole numbers. And it's neither random nor non-random, nor is there a fixed range.
g,o,p,i=1e4,a[10001];main(x){for(;p?g=g/x*p+a[p]*i+2*!o: 53^(printf("%.4d",o+g/i),p=i,o=g%i);a[p--]=g%x)x=p*2-1;}
Joined: 10/20/2006
Posts: 1248
Sorry, I seem to have misinterpreted your words. Anyway, I'm curious to see any mathematical/logical explanation for this.
Joined: 2/19/2010
Posts: 248
flagitious wrote:
So does player 2 have the advantage and why?
If we assume that both players know the distribution of numbers, and we assume the distributions are sufficiently large, then perfect play is as follows: player 1 picks the median, m. Player 2 picks m + e, where e can be made arbitrarily small. In this way, player 1 wins when the number is less than m, and player 2 wins when it is greater than m.

If we redo situations where player 1 guesses exactly, then player 2 has an advantage for certain small skewed discrete probability distributions. For example, if the distribution is 1, 2, 3 with probabilities 0.2, 0.4, 0.4: if player 1 chooses 2, then player 2 chooses 2.5 with twice the chance of success (0.4 vs 0.2); if player 1 chooses 2.5, then player 2 chooses 2.1 with 1.5 times the chance of success (0.6 vs 0.4). All other choices are the same or worse for player 1. This player 2 advantage becomes far less pronounced for larger distributions. Furthermore, if the distribution is symmetric or continuous, then the advantage disappears.

----

Things get trickier if we assume the players know something about the distribution, but not the full distribution: say the players know that only positive whole numbers less than 10,000 will be chosen, and that distributions skewed towards the bottom of this range are more likely than distributions skewed towards the top. Nevertheless, I think it reasonable to encode the knowledge the players have as a single distribution, by taking an average over all distributions weighted by each distribution's probability. Then once more, player 1 chooses the median m and player 2 chooses m + e. But basically, player 1 can always, to the best of either player's knowledge, choose a number x such that player 2 does not prefer taking one side or the other of x. At this point, I think it can be described as fair.

----

Taking things a bit further: if you know the distribution but your opponent does not, do you want to go first or second?
If you go first, your opponent will pick either side of your chosen number with equal probability, so you have a 50% chance of success. If you go second, your opponent is unlikely to pick the median, so you can pick the larger half and get a greater than 50% chance of success. This by itself shows that you always want to go second: if you know less than your opponent, he will not be able to exploit it; while if you know more than your opponent, you get an advantage.
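The small skewed example above can be checked exactly with a short script. This is only a sketch: the distribution (1, 2, 3 with probabilities 0.2, 0.4, 0.4) and the guesses come from the post above, and exact guesses are treated as redos, so they count for neither player.

```python
# Secret number is 1, 2 or 3 with probabilities 0.2, 0.4, 0.4.
dist = {1: 0.2, 2: 0.4, 3: 0.4}

def win_probs(g1, g2):
    """P(player 1 strictly closer), P(player 2 strictly closer).

    Exact hits are excluded: the post says those situations are redone.
    """
    p1 = sum(p for n, p in dist.items()
             if n != g1 and n != g2 and abs(n - g1) < abs(n - g2))
    p2 = sum(p for n, p in dist.items()
             if n != g1 and n != g2 and abs(n - g2) < abs(n - g1))
    return p1, p2

# Player 1 takes the median 2, player 2 responds just above:
print(win_probs(2, 2.5))    # player 2 wins twice as often (0.4 vs 0.2)
# Player 1 tries 2.5 instead, player 2 undercuts with 2.1:
print(win_probs(2.5, 2.1))  # player 2 is still ahead (0.6 vs 0.4)
```

Both outcomes match the numbers claimed in the post.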
Player (201)
Joined: 7/6/2004
Posts: 511
This looks like good analysis at first glance. It will be about a day before I can get time to fully try to understand your reasoning. (And I may have led you to believe that I had the 'answer', when I still need to figure it out too :)
g,o,p,i=1e4,a[10001];main(x){for(;p?g=g/x*p+a[p]*i+2*!o: 53^(printf("%.4d",o+g/i),p=i,o=g%i);a[p--]=g%x)x=p*2-1;}
Tub
Joined: 6/25/2005
Posts: 1377
It's pretty difficult to give a precise answer unless you give a precise definition of the problem. The problem here seems to be whether we know something about the distribution or not. Given the real-life source, we can safely assume that there's a finite range (i.e. surely no more than 10^90 digits) with a non-uniform probability distribution. It's also a discrete set, since there's only a limited amount of numbers that could be memorized by a human brain / expressed in human language in a reasonable amount of time.

Case A: both players know the RNG (from previous matches) and have the ability to guess his probability distribution. Let's assume they've both guessed the distribution perfectly. Player 1 chooses exactly the median; player 2 can pick higher or lower, each choice having roughly a 50% success rate.

Case B: both players guessed the same probability distribution, but both guessed wrong. Since player 2 considers player 1's guess to be the median, he'll throw a coin to pick higher or lower, yielding 50/50. Given equal knowledge of both players, player 2 does not have an advantage. (As kuwaga explained, there's a slight advantage for one of the players since we're operating on a discrete set, but only a very slight one, and we don't know for which of the players.)

Case C: both players guessed a different probability distribution with a different median. Player 1 guesses the median to be M1, player 2 guesses M2. If each player just said his number, their chances would only depend on the quality of their guesses. But if one player goes first, the second one will adjust his guess to M1+\epsilon or M1-\epsilon. And there's the advantage of player 2: if both guesses are far apart, he can claim the whole interval in between instead of only half. If neither knows the distribution, player 2 actually has an advantage! This will even work when both players just pick a random number: player 2 will have an advantage because he adjusts his number.
But this will only hold true if we're limited to a finite range of numbers. Now if we ignore human restrictions and accept any number with any distribution, we have an entirely different problem. Why? There is no distribution over all numbers and our concept of median goes right out the window. Without an actual median, the strategies break down. Does player two still have an advantage? See this similar question for inspiration. http://blog.xkcd.com/2010/02/09/math-puzzle/comment-page-1/
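Case C can also be sketched with a quick Monte Carlo. The model below is purely illustrative and not from the thread: I assume the secret number is standard normal and both players form independent, normally distributed noisy estimates of the median; player 2 then shades player 1's guess toward his own estimate.

```python
import random

def trial(rng, eps=1e-6):
    secret = rng.gauss(0.0, 1.0)   # secret number (assumed distribution)
    m1 = rng.gauss(0.0, 0.5)       # player 1's noisy estimate of the median
    m2 = rng.gauss(0.0, 0.5)       # player 2's independent noisy estimate
    g1 = m1                        # player 1 plays his estimated median
    # Player 2 picks just above or below g1, on the side of his own estimate:
    g2 = g1 + eps if m2 > g1 else g1 - eps
    return abs(secret - g2) < abs(secret - g1)   # True if player 2 wins

rng = random.Random(42)
n = 200_000
p2_wins = sum(trial(rng) for _ in range(n)) / n
print(p2_wins)   # noticeably above 0.5
```

Player 2 wins clearly more than half the time here, because whenever the two estimates disagree about which side of g1 the median lies on, he claims the whole interval between them, exactly as described above.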
m00
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Decision problems involving probability distributions are usually very hard to solve and, IIRC, no consensus has been reached yet. An example that illustrates the non-triviality of these puzzles is the "two envelopes paradox". Suppose that you have to choose between two envelopes: one contains X dollars and the other 2X dollars. After choosing an envelope with A dollars, you're given the chance to switch to the other one. To see if it's worth it, you assume that there's a 50% chance the other one will have A/2, and another 50% that it'll have 2A. Summing, your expected gain will be 5A/4, thus it's better to switch. However, since that's true for every value, switching before even choosing the envelope would raise your expected value, and this is absurd.

These problems introduce a concept called "expectation", which is hard to formalize in mathematics. By picking a large value, you may think "it's unlikely that the other envelope contains an even larger sum". Basically, in an infinite distribution, the probability of the numbers appearing is not the same. This may influence your players' problem.

I lack the mathematical background to offer a reasonable solution to this problem; I suggest that you look at other paradoxes of probability and decision theory and their proposed solutions and see if it helps you.

EDIT: Wow, it seems Tub posted something similar while I was writing, lol
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
p4wn3r wrote:
Suppose that you have to choose between two envelopes, one contains X dollars and the other 2X dollars. After choosing an envelope with A dollars, you're given the chance to switch to the other one. To see if it's worthy, you assume that there's 50% chance the other one will have A/2, and another 50% that it'll have 2A. Summing, your expected gain will be 3A/2, thus it's better to switch. However, since that's true for every value, switching before even choosing the envelope would rise your expected value, and this is absurd.
I think there's an extremely simple reason why it makes no difference if you switch or not. This would be a proof by contradiction: Let's assume that switching to the other envelope is advantageous, so you switch. However, now you are in the exact same situation as before, just with the envelopes switched. If the original assumption were correct, it would now also be advantageous to switch to the original envelope, which is a contradiction (because we started with the assumption that switching from the original is advantageous). Or stated in other words: If it were indeed advantageous to switch, each switch would increase the probability of getting the larger sum ad infinitum, so you should switch as many times as possible to "increase" your odds. Which makes no sense.
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Oops, I miscalculated before: 1/2*(A/2) + 1/2*(2A) = 5A/4, not 3A/2. It really makes no difference whether you switch or not; I implied this when I said that conclusion was absurd. The paradox arises when you reach a false conclusion using classical probability theory. To completely solve a paradox, you have to point out a mistake in the calculation of the final value 5A/4 (which is not the case here, the value is obtained using correct premises of classical probability) OR say why the theory you're using to calculate it is mistaken; that's the difficulty of the problem. In other words, it's easy to understand why your proof by contradiction is correct and the previous one is wrong, because human intuition gives the answer. However, take Bertrand's paradox: there are at least three proofs leading to different results that don't have any flaws when you consider classical theory. Determining which value is the correct one would need a counter-argument to the other proofs.
Tub
Joined: 6/25/2005
Posts: 1377
p4wn3r wrote:
Suppose that you have to choose between two envelopes, one contains X dollars and the other 2X dollars. After choosing an envelope with A dollars, you're given the chance to switch to the other one. To see if it's worthy, you assume that there's 50% chance the other one will have A/2, and another 50% that it'll have 2A. Summing, your expected gain will be 5A/4, thus it's better to switch. However, since that's true for every value, switching before even choosing the envelope would rise your expected value, and this is absurd.
I coloured your dinosaur problem. Calculating the expected gain is valid, but assuming a 50/50 chance of holding the smaller amount is flawed. If there's a limit to the money in an envelope (say between $100 and $10,000), you don't have a 50/50 chance if the envelope contains $190 or $6000. Note that if you pick an envelope with $1000, you're absolutely correct to say that switching is advantageous. But you'll only know that after you actually open the envelope and compare the value to the possible range.

Now you're saying: any amount goes. To which I reply: there is no uniform distribution over all numbers, so your premise is flawed.
m00
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Tub wrote:
Now you're saying: any amount goes. To which I reply: there is no uniform distribution over all numbers, your premise is flawed.
p4wn3r, in his original post wrote:
Basically, in an infinite distribution, the probability of the numbers appearing is not the same.
p4wn3r, later wrote:
To completely solve a paradox, you have to point out a mistake on the calculation of the final value 5A/4 (which is not the case here, the value is gotten using correct premises of classical probability) OR say why the theory you're using to calculate it is mistaken, that's the difficulty of the problem.
Where exactly did I say that any amount goes? I never said my premises weren't flawed, either; I just cited that problem to explain why classical probability theory can't be used in flagitious's problem when there's no determined range.
Joined: 2/19/2010
Posts: 248
Tub wrote:
There is no distribution over all numbers and our concept of median goes right out the window. Without an actual median, the strategies break down.
Yes, there is. The normal distribution is over all reals.
p4wn3r wrote:
However, take Bertrand's Paradox, there are at least three proofs leading to different results that don't have any flaws when you consider classical theory. Determining which value is the correct one would need a counter-argument to the other proofs.
Bertrand's Paradox is no paradox -- just an underspecified problem with three solutions which complete the specification in different ways. They are measuring different distributions and get different results.
Tub wrote:
Calculating the expected gain is valid, but assuming 50/50 to hold the smaller amount is flawed. If there's a limit to the money in an envelope (say between $100 and $10.000), you don't have a 50/50-chance if the envelope contains $190 or $6000. Note that if you pick an envelope with $1000, you're absolutely correct to say that switching is advantageous. But you'll only know after you actually opened the envelope and compared the value to the possible range.
This is a valid criticism of the original problem, but it can be restated such that this argument no longer applies.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Tub wrote:
Note that if you pick an envelope with $1000, you're absolutely correct to say that switching is advantageous.
I don't understand why switching would be advantageous. It being advantageous would result in a contradiction (because once you switch, switching again would also be advantageous by the same reasoning).
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
rhebus wrote:
Bertrand's Paradox is no paradox -- just an underspecified problem with three solutions which complete the specification in different ways. They are measuring different distributions and get different results.
It was. I think it was this problem that brought down the old theory and made people aware that they had to consider the distribution. I mentioned it as a way of showing how mathematical proofs can be non-intuitive, even when they're clear. Looking back at my post, I seemed to say that the measuring had to be unique; thanks for clarifying.
Warp wrote:
Tub wrote:
Note that if you pick an envelope with $1000, you're absolutely correct to say that switching is advantageous.
I don't understand why switching would be advantageous. It being advantageous would result in a contradiction (because once you switch, switching again would also be advantageous by the same reasoning).
Well, I only teach math up to calculus and linear algebra, so I'm far from being a specialist in this area, but I had a deeper look into this and I'll try to answer (I may say something stupid, be warned).

Old/classical theory, in a nutshell, considered that the probability of an event is computed as the number of favorable events divided by the total number of events (or, in the case of a continuous domain, the length/area/volume of the favorable domain divided by the total length/area/volume). However, Bertrand's paradox shows that this fails in some cases. As rhebus pointed out, there must be a probability distribution so that we can evaluate it. I can consider that the chords in a circumference magically appeared out of nowhere, or that the two envelopes were conceived by a mysterious entity, but then there's no way to tackle the problem mathematically; it would be the same as asking "what's the chance that adelikat is thinking about redoing the Gradius run right now?" To properly compute the probability, we have to take into account all possible distributions.

In Tub's case, switching is favorable because you looked at the value of the envelope and can compare it to the possible range. As an example, consider the uniform distribution over the pairs (1,2), (2,4), (4,8), ..., (2^99,2^100), which is bounded (my explanation can be made for an arithmetic distribution instead of a geometric one, but it's more tiresome). See that, when you look at the envelope and see 1, it's obvious that you must switch, while if you have 2^100, a switch will only reduce your money. For other values 2^n, switching is better overall; we can compute the gain as (2^(n+1)-2^n)-(2^n-2^(n-1)) = 2^(n-1).

The contradiction would arise if it were better to switch without looking at the envelope, i.e. for any value there. For that, we add the gains for all values you can open:

1 = 2^0: gain is (2-1) = 1
2 = 2^1: gain is 2^(1-1) = 1
4 = 2^2: gain is 2^(2-1) = 2
...
2^99: gain is 2^(99-1) = 2^98
2^100: gain is -2^99

Adding everything: S = 1 + (1 + 2 + 4 + ... + 2^98) - 2^99 = 1 + 2^99 - 1 - 2^99 = 0. Thus, it makes no difference to switch the envelope without looking at its value first.

See the problem now? When the value is not bounded, there is no last term to compensate the other gains, and this procedure would conclude that it's better to switch in all cases. It can be argued that it's impossible to have a uniform infinite distribution according to modern probability theory. While this resolves the problem I brought up, the setup can be changed so that the set in discussion is { (2^n, 2^(n+1)), n in Naturals }. For this case, by computing the distribution of each value and evaluating the probability, we still come to the conclusion that it's better to switch for every value. Here my knowledge fails me and I can't really say more than I already have. Hope this was helpful.
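The bounded example (uniform pairs (1,2), (2,4), ..., (2^99,2^100)) can be checked exactly: with the probability weights included, the expected gain from switching blind is exactly zero. A short sketch using exact rational arithmetic:

```python
from fractions import Fraction

# 100 equally likely pairs (2^k, 2^(k+1)), k = 0..99; you hold either
# envelope of the chosen pair with probability 1/2 and always switch
# without looking.
half = Fraction(1, 2)
pair_prob = Fraction(1, 100)

expected_gain = Fraction(0)
for k in range(100):
    low, high = 2**k, 2**(k + 1)
    expected_gain += pair_prob * half * (high - low)  # held low, switch gains
    expected_gain += pair_prob * half * (low - high)  # held high, switch loses

print(expected_gain)  # 0
```

Within each pair the gain and the loss cancel exactly, which is the same cancellation as the telescoping sum S = 0 above.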
Tub
Joined: 6/25/2005
Posts: 1377
p4wn3r wrote:
It can be argued that it's impossible to have a uniform infinite distribution according to modern probability theory. While this solves the problem I brought up, it can be changed so that the set in discussion is { (2^n,2^(n+1)), n in Naturals }.
Nope. There is no uniform probability distribution for any discrete infinite set (or for any unbounded interval in continuous distributions), not in any variant of probability theory. Here's why, for the discrete case: A uniform distribution must satisfy these conditions: * the sum of all probabilities must be 1 (otherwise it's not a distribution) * P(X=a) = P(X=b) for all a,b (otherwise it's not uniform) Now what's P(X=a)? If P(X=a) = 0, the sum is 0. If P(X=a) > 0, the infinite sum doesn't converge. You can use a different distribution that is valid on all positive numbers, but then you'll need to re-examine the part I coloured earlier. Depending on the value of money in your envelope, the chance that it's the lower amount is anything but 50%. (Of course, since Bertrand was mentioned, we must also specify the exact method for determining the amounts. After picking a random number x, we could fill the envelopes with (x, 2*x), or (x/2, x), or maybe (42*x, 84*x). As soon as we use non-uniform distributions, the exact method matters.)
m00
Player (246)
Joined: 8/6/2006
Posts: 784
Location: Connecticut, USA
First off, thanks everyone for the help with the x / sqrt(x) thing. Also, I just learned how to do related rate problems. They got under my skin at first but now I think they're kinda neat. Here's one that I particularly liked (though I realize that this is probably a common question for first year Calc students): A streetlight sits 12 feet off the ground. A woman who is 5 feet tall is walking away from the streetlight at a rate of 3.5 feet / second. How fast is the tip of her shadow moving? How fast is her shadow lengthening?
Tub
Joined: 6/25/2005
Posts: 1377
That can be answered by trivially applying the intercept theorem, which I learned in 8th grade when I was 13. I'm not familiar with your school system; how old is a "first year calc student", actually?
m00
Player (246)
Joined: 8/6/2006
Posts: 784
Location: Connecticut, USA
Tub wrote:
That can be answered by trivially applying the intercept theorem, which I learned in 8th class when I was 13. I'm not familiar with your scholar system, how old is a "first year calc student" actually?
Yes, as you mention, knowledge of similar triangles is needed to solve this. But you also need to find the derivative of a couple of different things, since the question asks about a specific moment in time. I'm not really sure it can be done otherwise, or if there's a quick way to do it without derivatives. As for your question, I suppose it could vary. I'm not familiar with too many people who took calculus in high school (which would be up to the twelfth year), but it's fairly common for students going to college to take it in their first or second year there (so the 13th or 14th).
Tub
Joined: 6/25/2005
Posts: 1377
Wait.. at time t (in seconds) the woman is at a horizontal distance of 3.5t feet from the streetlight (assuming she's directly under it at t=0). A straight line from the streetlight through the woman's head touches the floor at (3.5t) / 7 * 12 = 6t feet (intercept theorem), thus the tip of the shadow moves at 6 ft/s. Of course, the length of the shadow at any given time is the horizontal distance between the woman and the tip of the shadow, which is (6t - 3.5t) = 2.5t feet, so it lengthens at 2.5 ft/s. Did I misunderstand the question? Where's the derivative?
m00
Player (246)
Joined: 8/6/2006
Posts: 784
Location: Connecticut, USA
Tub wrote:
Wait.. at time t (in seconds) the woman has a horizontal distance of 3.5*t feet to the streetlight (assuming she's directly under the streetlight at t=0). A straight line from streetlight through the woman's head will touch the floor at (3.5t) / 7 * 12 = 6t feet (interception theorem), thus the tip of the shadow moves at 6 ft/s. Of course the length of the shadow at any given time is the horizontal distance between the woman and the tip of the shadow, which is (6t - 3.5t) = 2.5t feet Did I misunderstand the question? Where's the derivative?
You're correct on both counts, of course... The way I was taught is detailed here. You basically take the derivative of both sides of an equation with respect to time (d/dt) to find the answer. There are probably other related rates problems that can't be solved by easier methods though, am I right? Otherwise, what is the point of learning this way of solving? =[
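The two approaches agree, and that's easy to check numerically: write down the positions as functions of time (from the intercept theorem) and take a finite-difference derivative. A small sketch, using the numbers from the problem (12 ft light, 5 ft woman, 3.5 ft/s):

```python
# By similar triangles, 12 / tip = 5 / (tip - x), so tip(x) = 12x/7.
def tip(t):
    x = 3.5 * t            # woman's distance from the base of the light
    return 12.0 * x / 7.0  # position of the shadow's tip

def shadow(t):
    return tip(t) - 3.5 * t  # shadow length: tip minus woman's position

h = 1e-6
t = 10.0  # any instant works, since both rates turn out constant
tip_rate = (tip(t + h) - tip(t)) / h
shadow_rate = (shadow(t + h) - shadow(t)) / h
print(round(tip_rate, 4), round(shadow_rate, 4))  # 6.0 2.5
```

Because the geometry makes both positions linear in t, the derivatives are the constants 6 ft/s and 2.5 ft/s; for problems where the relation isn't linear (e.g. a ladder sliding down a wall), the d/dt technique is the one that still works.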
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Tub wrote:
p4wn3r wrote:
It can be argued that it's impossible to have a uniform infinite distribution according to modern probability theory. While this solves the problem I brought up, it can be changed so that the set in discussion is { (2^n,2^(n+1)), n in Naturals }.
Nope. There is no uniform probability distribution for any discrete infinite set (or for any unbounded interval in continuous distributions), not in any variant of probability theory.
Huh, I guess you missed the point. One distribution on (2^n, 2^(n+1)) has probability P(n) = 2^n/3^(n+1). The sum of this series converges to 1, so it's clearly a valid distribution. (I never said that set had a uniform distribution either; please read that again.)

So, for a value 2^n with n larger than 0, if I calculate the gain like I did before, I get:

( (2^(n+1) - 2^n) p(n) - (2^n - 2^(n-1)) p(n-1) ) / ( p(n) + p(n-1) )
= ( 2^n (2^n/3^(n+1)) - 2^(n-1) (2^(n-1)/3^n) ) / ( 2^n/3^(n+1) + 2^(n-1)/3^n )
= ( 2^n/3 - 2^(n-2) ) / ( 1/3 + 1/2 )
= 2^n (1/3 - 1/4) / (5/6)
= 2^n (1/12)(6/5)
= 2^n/10

So, using a non-uniform distribution, I'd still gain an extra tenth of my sum on average by switching the envelope. While this analysis is not trivial, I didn't spell it out because rhebus had already linked it before. It's usually good to read the previous posts, so that people don't need to spend time explaining an idea when it isn't necessary, and so that you don't need to teach me things I already know.
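The derivation above can be verified exactly with rational arithmetic; the sketch below reproduces the conditional expected gain 2^n/10 for the first few n:

```python
from fractions import Fraction

# P(n) = 2^n / 3^(n+1): probability of the pair (2^n, 2^(n+1)).
def p(n):
    return Fraction(2**n, 3**(n + 1))

def expected_switch_gain(n):
    # Seeing the value 2^n (n >= 1): the pair is (2^n, 2^(n+1)) with
    # weight p(n), or (2^(n-1), 2^n) with weight p(n-1).
    up = (2**(n + 1) - 2**n) * p(n)        # switching wins 2^n
    down = (2**n - 2**(n - 1)) * p(n - 1)  # switching loses 2^(n-1)
    return (up - down) / (p(n) + p(n - 1))

for n in range(1, 10):
    assert expected_switch_gain(n) == Fraction(2**n, 10)
print(expected_switch_gain(3))  # 4/5, i.e. 2^3/10
```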
Patashu
He/Him
Joined: 10/2/2005
Posts: 4045
p4wn3r wrote:
Decision problems involving probability distributions are usually very hard to solve and, IIRC, no consensus has been reached yet. An example that illustrates the non-triviality of these puzzles is the "Two envelopes paradox". Suppose that you have to choose between two envelopes, one contains X dollars and the other 2X dollars. After choosing an envelope with A dollars, you're given the chance to switch to the other one. To see if it's worthy, you assume that there's 50% chance the other one will have A/2, and another 50% that it'll have 2A. Summing, your expected gain will be 5A/4, thus it's better to switch. However, since that's true for every value, switching before even choosing the envelope would rise your expected value, and this is absurd. These problems introduce a concept called "expectation", which is hard to formalize in mathematics. By picking a large value, you may think "it's unlikely that the other envelope contains an even larger sum". Basically, in an infinite distribution, the probability of the numbers appearing is not the same. This may influence your players' problem. I lack the mathematical background to offer a reasonable solution to this problem, I suggest that you look at other paradoxes of probability and decision theories and their proposed solutions and see if it helps you. EDIT: Wow, it seems Tub posted something similar while I was writing, lol
It's not a paradox; it's like the 'case of the missing dollar', where mathematical concepts are misused to create an apparent contradiction. I'd resolve it something like this: pick an envelope. It either has X or 2X, so its expected value is 1.5X. The other envelope will also either have X or 2X, making its expected value 1.5X as well.

The 'trick' employed is to treat the cases of the other envelope holding 2x your amount or 0.5x your amount as independent of which amount you picked. Imagine a problem where this actually happened: two envelopes either have A or 2A in them, and when you pick one, the other updates to randomly hold either 2x or 0.5x your amount. Now it WOULD be advantageous to switch. BUT in this problem, if the other envelope doubles your amount, you were holding 1X, and if it halves your amount, you were holding 2X, which essentially cancels out.
My Chiptune music, made in Famitracker: http://soundcloud.com/patashu My twitch. I stream mostly shmups & rhythm games http://twitch.tv/patashu My youtube, again shmups and rhythm games and misc stuff: http://youtube.com/user/patashu
Tub
Joined: 6/25/2005
Posts: 1377
p4wn3r wrote:
One distribution of (2^n,2^(n+1)) has probability P(n) = 2^n/3^(n+1). The sum of this series converges to 1, so it's clearly a valid distribution. (I never said that set had uniform distribution either, please read that again)
Sorry, you neither said "non-uniform" nor gave an actual distribution in that post. Yes, I missed rhebus's link. In any case, the solution is given in rhebus's link as well:
* just opening an envelope has an expected infinite payoff
* switching repeatedly has an expected payoff of infinity * 1.1^(number of switches)
So while I'm aware that I can't just calculate with infinity like that, you should intuitively agree that an expected infinite payoff cannot get any higher by randomly switching closed envelopes, right? (More precisely: if you want to argue that repeatedly switching is favorable, you need a different approach than expected gains, to avoid the infinity in your formulas.)

Once you actually open an envelope and notice that you're well below your expected infinite payoff, switching once is favorable. But since you opened the envelope, you can only switch once, avoiding the paradox. If you're concerned that the correct strategy is "switch" no matter which number you picked: see the Monty Hall problem, which has the same resolution. IMHO it's a non-issue.
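The infinite expected payoff is easy to see directly: under P(n) = 2^n/3^(n+1), the pair (2^n, 2^(n+1)) has mean value 3*2^(n-1), so each term of the expectation is (1/2)*(4/3)^n and the partial sums grow without bound. A quick sketch:

```python
# Partial sums of the expected value of a single (unopened) envelope
# under P(n) = 2^n / 3^(n+1). Each term is
#   (2^n / 3^(n+1)) * (2^n + 2^(n+1)) / 2  =  (1/2) * (4/3)^n,
# a divergent geometric-style series.
def partial_sum(N):
    return sum((2**n / 3**(n + 1)) * (2**n + 2**(n + 1)) / 2
               for n in range(N))

print(partial_sum(10), partial_sum(50), partial_sum(100))
```

The printed partial sums keep growing geometrically, which is why any argument comparing "expected value before switching" with "expected value after switching" is comparing infinity with infinity.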
m00