OK, since you disagree with both of the premises, let's start with the first.
The premise states that if an organism requires multiple independent mutations (say 3 mutations) in order to have a benefit that would be selected by natural selection, then such a step would not occur (at least not by a process of random mutation).
So, for example, if an organism requires 3 mutations in order to become immune to an antibiotic, then this organism will not become immune to the antibiotic. (In this example, having 1 or 2 of the mutations would be completely useless; you need all 3 to gain a benefit.)
Behe justifies his claim as follows: his point is that 2 mutations are within a limit (very improbable but possible), but 3 mutations would be impossible. So why is Behe wrong?
(Michael Behe, The Edge of Evolution: The Search for the Limits of Darwinism, p. 135 (Free Press, 2007).)
He is wrong in several different ways. First, the idea that the probabilities multiply is the *definition* of being independent in the sense of probability. But at no point does he demonstrate that the required mutations are independent in this sense.
Second, he gives a very distorted initial probability for a mutation (1 part in 10^20) which, I would assume, also relies on an assumption of probabilistic independence.
In general, the probability of two events both happening is the product of the two probabilities only when the presence of one has *no* effect on whether the other happens. In particular, when changes happen in series, rather than simultaneously, the product of probabilities is very seldom the correct calculation to make (the actual calculation is much harder and requires the evaluation of the conditional probabilities).
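To see how much the conditional probabilities matter, here is a small simulation I can sketch. The numbers (0.01 for each event on its own, 0.5 for the second event once the first is in place) are made up purely for illustration; the point is only that multiplying the raw probabilities gives the wrong answer as soon as the events are not independent.

```python
import random

random.seed(1)
trials = 1_000_000

p_a = 0.01          # probability of event A per trial
p_b = 0.01          # probability of event B on its own
p_b_given_a = 0.5   # assumed probability of B once A has happened

# Independent case: B's chance does not depend on A.
both_indep = sum(
    random.random() < p_a and random.random() < p_b
    for _ in range(trials)
)

# Dependent (serial) case: once A happens, B is far more likely.
both_dep = sum(
    random.random() < p_a and random.random() < p_b_given_a
    for _ in range(trials)
)

print(both_indep / trials)  # near p_a * p_b = 0.0001
print(both_dep / trials)    # near p_a * p_b_given_a = 0.005, 50x larger
```

Same two events, but the joint probability in the serial case is fifty times what the naive product predicts.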
This is precisely where many calculations attempting to discredit evolution fail.
As an example that I worked out for myself quite some time ago: suppose that we have a target string of, say, 50 symbols, each of which can be one of 90 possibilities. The number of possible strings is then 90^50, which is about 5*10^97. So, the probability that a one-shot string is correct is one part in 5*10^97, an incredibly small probability. If attempts were made 1000 times a second, the expected time to get a 'hit' would be much, much longer than the age of the universe.
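The one-shot arithmetic above can be checked directly:

```python
# One-shot model: 50 symbols, 90 possibilities each.
n_symbols, alphabet_size = 50, 90

possibilities = alphabet_size ** n_symbols
print(f"{possibilities:.2e}")  # about 5.15e+97 possible strings

# Expected waiting time at 1000 attempts per second, in years.
seconds = possibilities / 1000
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.2e}")  # vastly more than the ~1.4e10-year age of the universe
```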
But suppose instead that we try to get the first symbol and, once it is correct, we then try to get the second. And once those are correct, we try to get the third, and so on. In this case, the average number of attempts to get a 'hit' would be only 90*50 = 4500 (an average of 90 attempts per symbol), which, at 1000 attempts a second, would give the result in 4.5 seconds.
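A sketch of that second model (the choice of which 90 symbols make up the alphabet is arbitrary):

```python
import random
import string

random.seed(0)
alphabet = string.printable[:90]  # any 90 symbols will do
target = "".join(random.choice(alphabet) for _ in range(50))

# Fix each symbol in turn: keep guessing one position until it matches,
# then move on to the next. Expected attempts: 90 per symbol, 4500 total.
attempts = 0
result = []
for ch in target:
    while True:
        attempts += 1
        if random.choice(alphabet) == ch:
            result.append(ch)
            break

print("".join(result) == target, attempts)
```

The attempt count fluctuates around 4500 from run to run, nowhere near the 5*10^97 of the one-shot model.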
A more refined model would be to have a population at each stage, where mutations happen randomly in the children and the 'best' children are selected to be the parents of the next generation. This tends to give numbers higher than the previous model, but still well within the range of possibilities. For example, with 50 individuals in each generation, I was able to get a 'hit' within 200 generations, which is about 10000 individuals.
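A sketch of that population model. The details here (one point mutation per child, keeping the best child only if it is at least as fit as the parent) are my own choices, and the exact generation count depends heavily on the mutation scheme and population size, so this will not reproduce the 200-generation figure exactly; it only shows that selection-with-inheritance finds the target in hundreds of generations rather than astronomical time.

```python
import random
import string

random.seed(3)
alphabet = string.printable[:90]
target = "".join(random.choice(alphabet) for _ in range(50))
POP = 50  # children per generation

def fitness(s):
    # number of positions that already match the target
    return sum(a == b for a, b in zip(s, target))

def mutate(s):
    # point mutation: replace one random position with a random symbol
    i = random.randrange(len(s))
    return s[:i] + random.choice(alphabet) + s[i + 1:]

parent = "".join(random.choice(alphabet) for _ in range(50))
generations = 0
while parent != target and generations < 100_000:
    generations += 1
    children = [mutate(parent) for _ in range(POP)]
    best = max(children, key=fitness)
    if fitness(best) >= fitness(parent):  # elitist selection
        parent = best

print(parent == target, generations)
```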
In the case of evolution, the correct probability calculation is closer to the second and third methods than to the first. First, one beneficial mutation is found. That mutation stays around (is fixed), and then the second mutation arises probabilistically. Once that is done, the third becomes the 'target' (bad wording, but it shows what is needed for the probabilities).
Behe is using the incorrect calculation for the probabilities if he is attempting to model evolution.