Oh, we have verified that evolutionary processes can and do increase the fitness of organisms in nature through adaptations. I have provided multiple examples, from cichlid fish to DDT resistance in mosquitoes.
How many turned into humans, or any other species for that matter?
This is completely wrong. If we knew what design or code we wanted, we would not be using iterative evolutionary and genetic algorithms or optimization schemes at all; we would just code or design it directly. In actual practice, genetic and evolutionary algorithms are used to find solutions to design and optimization problems whose answers are unknown in advance.
You may have more programming experience and expertise than I do, but I have a fair bit, in commercial applications as well as some gaming. We assign fitness functions, and the algorithm calculates the most efficient way to satisfy that fitness function. Dawkins cites a program for designing efficient spider webs as an analogy for evolution. Again, the program is set a specific task and given specific parameters to adjust, and it will perform that task, no more and no less; it will give you nothing that was not specifically set as a goal in the first place.
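To make the "it only satisfies the fitness function you assign" point concrete, here is a minimal genetic-algorithm sketch. The OneMax problem (bitstrings scored by how many ones they contain) is my own illustrative choice, a standard textbook toy, not something either side cited:

```python
import random

def run_ga(bits=20, pop_size=50, generations=100, seed=1):
    """Minimal genetic algorithm for OneMax: fitness = number of 1-bits."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)  # the one goal we explicitly set
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        parents = [max(rng.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        # Mutation: flip each bit with small probability.
        pop = [[b ^ (rng.random() < 1 / bits) for b in p] for p in parents]
    return max(fitness(ind) for ind in pop)

print(run_ga())  # climbs toward the maximum of 20
```

The search only ever maximizes `fitness`; it cannot invent a goal that was not coded in, which is exactly the claim being made here.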
Dawkins is not without his naive charm, but life, DNA, operates on complex (dare I use the word) information systems, as does the whole crux of the question today; like many biologists, he is way out of his depth in this key area.
In evolution through natural selection, a mutation with a 0.01% selection advantage will be favored over neutral variants and fixed with a substantially higher probability. This is another way in which natural selection differs from human selection: it can identify, choose, and fix even very small improvements in phenotype. Here is the math.
On the Fixation Process of a Beneficial Mutation in a Variable Environment | Genetics
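As a rough illustration of that math (the diffusion approximation for fixation probability is a standard population-genetics result due to Kimura; the population size and selection coefficient below are my own illustrative choices, not numbers from the linked paper):

```python
import math

def fixation_prob(s, N):
    """Kimura's diffusion approximation for the fixation probability of a
    single new mutation with selection coefficient s in a population of N."""
    if s == 0:
        return 1 / (2 * N)  # neutral case: fixation by drift alone
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

N = 1_000_000
beneficial = fixation_prob(0.0001, N)  # a 0.01% selective advantage
neutral = fixation_prob(0.0, N)
print(beneficial, neutral, beneficial / neutral)
```

Even a 0.01% advantage is fixed roughly 2s, about 400 times more often than a neutral variant in this population, although any single such mutant is still usually lost.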
Made irrelevant by previous considerations. A mutant allele that increases the mean number of offspring from 4 to 4.04 will also be selected for over the generations.
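An increase from 4 to 4.04 offspring is a selection coefficient of s = 0.01, and the standard single-locus recursion (my own worked example, assuming deterministic selection in a large population) shows how such an allele climbs in frequency generation by generation:

```python
def allele_trajectory(s=0.01, p0=0.01, generations=2000):
    """Deterministic frequency of a haploid allele with relative fitness 1+s,
    iterated with the standard selection recursion p' = p(1+s) / (1+ps)."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
    return p

print(allele_trajectory())  # rises from 1% toward fixation
```

The per-generation change is tiny, but it compounds: the log-odds of the allele grow by ln(1.01) every generation, so the allele goes from rare to nearly fixed within a couple of thousand generations.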
It may seem so intuitively, but that's really an anthropomorphic bias.
As humans we can identify a 0.01% advantage in a design; an airline operator might well preserve it for a later accumulated saving in fuel. That decision requires forethought, purpose, desire: things found only in a conscious mind.
Natural selection cannot make these forward-looking decisions (and you'd have to argue this with Dawkins and Darwin if you disagree!); it has no way to specifically preserve and save up insignificantly beneficial mutations for rainy days. Not only that: we could grant a raccoon a whopping 50% advantage in its gestation period, and it's just as likely to get run over by a semi before reaching sexual maturity as the rest, so nature is at a huge disadvantage here.
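Population genetics actually concedes part of this point: a single new mutant, however beneficial, is often lost to sheer accident. A simple Galton-Watson branching-process simulation (my own sketch, assuming Poisson-distributed offspring counts with mean 1 + s) shows that even a 50% advantage is lost outright in a sizable fraction of trials:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's algorithm for one Poisson-distributed draw with mean lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def loss_fraction(s=0.5, trials=5000, max_gens=100, seed=42):
    """Fraction of single beneficial mutants whose lineage dies out."""
    rng = random.Random(seed)
    lost = 0
    for _ in range(trials):
        n = 1  # one initial mutant carrier
        for _ in range(max_gens):
            if n == 0 or n > 200:  # extinct, or safely established
                break
            n = sum(poisson(rng, 1 + s) for _ in range(n))
        if n == 0:
            lost += 1
    return lost / trials

print(loss_fraction())  # roughly 40% of lineages die out despite s = 0.5
```

The theoretical extinction probability here is about 0.42 (the root of x = e^{1.5(x-1)}), so selection really is blind to any single mutant's fate; the claim on the other side is only that, across many occurrences, beneficial variants fix far more often than neutral ones.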
No. Both are simple, though the random pile is in one sense simpler: a random distribution is captured by simple statistical measures of the "group" arrangement of the bricks more easily than a well-arranged brick wall is. You need to understand what complexity is and how to measure it.
Centre for Complexity Science
Ha ha, try replicating a particular random pile of bricks versus a brick wall with a simple design, and you will soon find out which is the more complicated task. But again we are getting into semantic weeds. The point is that if we see a brick wall next to a pile of the same number of bricks, most of us don't have much trouble figuring out which is random and which was designed.
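Both posts are gesturing at a real distinction. One common proxy for descriptive (Kolmogorov-style) complexity is compressed size, and a quick sketch (my own example, using Python's `zlib`) shows the "wall", a repeated pattern, compresses far better than the "random pile", precisely because the pile admits no short exact description:

```python
import random
import zlib

wall = b"brick" * 2000  # highly ordered: has a very short description
rng = random.Random(0)
pile = bytes(rng.randrange(256) for _ in range(10_000))  # incompressible noise

# Compressed size approximates how hard each is to specify exactly.
print(len(zlib.compress(wall)), len(zlib.compress(pile)))
```

So the random pile is harder to replicate exactly (high algorithmic complexity) even though its bulk statistics are simpler; the two posts are using "complexity" in these two different senses.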
Like 'HELP' spelled out in rocks on a deserted island: waves, or an intelligent agent?
And I don't take advice from a group who can't spell 'center' correctly!
If you are going to use the second law, then you have to use the scientific definition of entropy, as the layman's dictionary definition does not follow the second law.
https://en.wikipedia.org/wiki/Entropy
In statistical thermodynamics, entropy (usual symbol S) (Greek: Εντροπία, εν + τρέπω) is a measure of the number of microscopic configurations Ω that a thermodynamic system can have when in a state as specified by certain macroscopic variables. Specifically, assuming that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant k_B (which provides consistency with the original thermodynamic concept of entropy discussed below, and gives entropy the dimension of energy divided by temperature). Formally,

S = k_B ln Ω
Specifically, scientific entropy measures the amount of hidden information present in the microscopic states of a system. Thus the second law basically states: "For a system completely disconnected from the outside world, the amount of hidden information in the microscopic states of the system tends to increase with time."
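A worked number may help (my own illustration, not from the quoted article): for N independent two-state particles there are Ω = 2^N equally probable microstates, so S = k_B ln Ω = N k_B ln 2, and entropy grows linearly with system size.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by SI definition)

def boltzmann_entropy(omega):
    """S = k_B ln(omega) for omega equally probable microstates."""
    return K_B * math.log(omega)

# N two-state particles have omega = 2**N microstates, so S = N * k_B * ln 2.
for n in (1, 10, 100):
    print(n, boltzmann_entropy(2**n))
```

Note the units: joules per kelvin, which is why the dictionary sense of "disorder" is only a loose gloss on the physical quantity.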
Okay, and if I cite the 2nd law I'll be sure to cut and paste the same blurb.
In the meantime:

Entropy
2. lack of order or predictability; gradual decline into disorder.
synonyms: deterioration, degeneration, crumbling, decline, degradation, decomposition, breaking down, collapse
We've all seen a document that someone kept photocopying from the last generation: it's still functional, but it slowly deteriorates with each copy. When it comes time to regenerate it, sure, you might select the 'fittest' of that generation to reproduce for the next generation, but this in no way produces a fittER new generation (although perhaps someone at the complexity science campus would disagree!).
But if they were organized, they kept a master to copy from, and if after 100 million re-issues of the memo it's still as fresh as the first, you know the copy is derived from that independent master.
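The photocopy analogy can be put in code. In this toy model (entirely my own construction, not anything either side cited), serial copying with a per-character error rate accumulates damage even when the least-damaged copy is picked each round, while copying from a preserved master keeps errors from compounding:

```python
import random

def copy_doc(text, err, rng):
    """Copy a string, corrupting each character with probability err."""
    return "".join("#" if rng.random() < err else c for c in text)

def errors(a, b):
    """Number of positions where two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

rng = random.Random(0)
master = "the quick brown fox " * 50  # a 1000-character "document"

serial = master
for _ in range(100):  # 100 generations of copy-of-a-copy
    copies = [copy_doc(serial, 0.005, rng) for _ in range(5)]
    serial = min(copies, key=lambda c: errors(c, serial))  # keep the "fittest"

fresh = copy_doc(master, 0.005, rng)  # any generation made from the master

print(errors(serial, master), errors(fresh, master))
```

Selecting among already-degraded copies slows the decay in this model but does not reverse it, whereas copies struck from the master carry only one generation's worth of errors, which is the intuition the post is appealing to.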