
Consciousness

Copernicus

Industrial Strength Linguist
No, the problem is that everything is not an objective relationship between a certain cognitive state in a brain and observable measurements and data.
That is the core philosophical assumption in your model: that which you do in a subjective sense as a behaviour, in effect in your brain, is in effect not really a subjective behaviour.

Mikkel, you are putting words in my mouth. I did not claim that everything is... You are claiming that everything has to be... And then you are attributing that assumption to me. Observation and data gathering are used as evidence to support an objective study of an act of experiencing something. The investigator is not the subject of the study, but a third person describing it from an objective perspective. Observing a walking robot that is experiencing a maze that it must navigate and observing a human being that is experiencing a maze that it must navigate are roughly the same thing. The behavior of the physical navigators can be described in terms of the physical interactions between their bodies and the structure of the puzzles they are solving to navigate the maze. The term "objective" inherently excludes first person descriptions. That is part of what the adjective means--a third person perspective.

That is the core problem. I use objective and subjective as meaning human behaviours: different aspects of human behaviour and different relationships to the world as such, including humans with psychological, social and cultural behaviour.
The falsification of the claim that the world/the universe/reality/everything is only objective is that I can do a non-objective behaviour, and I will now do that.
No, I don't do everything in an objective sense as a behaviour, and I am showing that right now, because I subjectively chose to write this text.

Do you think that physical objects that interact have subjective experiences? After all, they detect each other's presence and are either attracted or repelled by physical forces. Does an iron filing detect a magnet that changes its location in space? It certainly doesn't "experience" the magnet in the same way that a human being experiences it, but one can say that its interaction with the magnet is a type of primal "experience". Think of a human body as another type of physical object that interacts with the magnet, but in vastly more complex ways. The interaction is still entirely physical from an objective perspective, but the objective description of all the physical processes that go into that interaction is unimaginably more complex. The human body doesn't noticeably experience the attractive force that the iron filing does, but it still interacts physically with the magnet. For example, the physical human body uses the magnet to conduct experiments with iron filings.
 

mikkel_the_dane

My own religion
Mikkel, you are putting words in my mouth. I did not claim that everything is... You are claiming that everything has to be... And then you are attributing that assumption to me. Observation and data gathering are used as evidence to support an objective study of an act of experiencing something. The investigator is not the subject of the study, but a third person describing it from an objective perspective. Observing a walking robot that is experiencing a maze that it must navigate and observing a human being that is experiencing a maze that it must navigate are roughly the same thing. The behavior of the physical navigators can be described in terms of the physical interactions between their bodies and the structure of the puzzles they are solving to navigate the maze. The term "objective" inherently excludes first person descriptions. That is part of what the adjective means--a third person perspective.



Do you think that physical objects that interact have subjective experiences? After all, they detect each other's presence and are either attracted or repelled by physical forces. Does an iron filing detect a magnet that changes its location in space? It certainly doesn't "experience" the magnet in the same way that a human being experiences it, but one can say that its interaction with the magnet is a type of primal "experience". Think of a human body as another type of physical object that interacts with the magnet, but in vastly more complex ways. The interaction is still entirely physical from an objective perspective, but the objective description of all the physical processes that go into that interaction is unimaginably more complex. The human body doesn't noticeably experience the attractive force that the iron filing does, but it still interacts physically with the magnet. For example, the physical human body uses the magnet to conduct experiments with iron filings.

Yeah, the problem is that if I see a cat and tell you it is multicolored, you could see that too under certain conditions. But you can't see a physical object. That is a mental abstract idea in your mind.
Now you can, as you do, describe how you experience and understand something, but that is you doing it in the first person. You are, as an "I", never a third person, so you didn't write this in the third person.
The idea in your world view is that there is no first person in a third person view. There is, and here is the joke: there would be no observation without the first person.
What does the claim "I know something" require? Answer that and we will see if you can avoid the "I" as your first-person individuality.

In effect I describe the world, including in part that I describe the world and how it works in order to do that. And I observe and infer that you do the same.
 

Copernicus

Industrial Strength Linguist
Yeah, the problem is that if I see a cat and tell you it is multicolored, you could see that too under certain conditions. But you can't see a physical object. That is a mental abstract idea in your mind.
Now you can, as you do, describe how you experience and understand something, but that is you doing it in the first person. You are, as an "I", never a third person, so you didn't write this in the third person.
The idea in your world view is that there is no first person in a third person view. There is, and here is the joke: there would be no observation without the first person.
What does the claim "I know something" require? Answer that and we will see if you can avoid the "I" as your first-person individuality.

In effect I describe the world, including in part that I describe the world and how it works in order to do that. And I observe and infer that you do the same.
Both objects and colors are mental abstracts. They are just different categories of abstracts.

You continue to miss my point. Let me call your attention to the stems of the words "subjective" and "objective". They are "subject" and "object", respectively. The verb "observe" takes a subject and an object. The concept of observing something implies both. You can imagine yourself as the one doing the observing or being observed. In my opinion, you keep confusing the subject with the object. I am not denying either. You can shift perspectives back and forth between the two. If you keep confusing the two, then you arrive at a contradiction--a paradox that you can't resolve. That is the source of your problem here, not me. I am just telling you to distinguish the subject from the object. Don't confuse the two.
 

mikkel_the_dane

My own religion
Both objects and colors are mental abstracts. They are just different categories of abstracts.

You continue to miss my point. Let me call your attention to the stems of the words "subjective" and "objective". They are "subject" and "object", respectively. The verb "observe" takes a subject and an object. The concept of observing something implies both. You can imagine yourself as the one doing the observing or being observed. In my opinion, you keep confusing the subject with the object. I am not denying either. You can shift perspectives back and forth between the two. If you keep confusing the two, then you arrive at a contradiction--a paradox that you can't resolve. That is the source of your problem here, not me. I am just telling you to distinguish the subject from the object. Don't confuse the two.

Yes, I get it. You are being subjective there and talking to me subjectively, telling me that I subjectively have to keep the two apart. I do get it. You will always, as a human, end in how you make sense of the world, and you just did that.
The point of this exchange is not to eliminate the objective part of the world as objective (having reality independent of the mind), but to point out that the mind is still there. That is all.
 

Copernicus

Industrial Strength Linguist
Yes, I get it. You are being subjective there and talking to me subjectively, telling me that I subjectively have to keep the two apart. I do get it. You will always, as a human, end in how you make sense of the world, and you just did that.
The point of this exchange is not to eliminate the objective part of the world as objective (having reality independent of the mind), but to point out that the mind is still there. That is all.

This has never been in dispute. I am not an eliminative materialist. I have no trouble talking about the mind--what eliminativists deride as folk psychology. All I'm saying is that materialism doesn't really eliminate folk psychology. Materialism is the only means we have of explaining it.
 

gnostic

The Lost One
The idea in your world view is that there is no first person in a third person view. There is, and here is the joke: there would be no observation without the first person.

That is not necessarily always true.

In the world of technology, we have designed & constructed instruments, devices, machines, computers, etc., to do some of our observations, so with such technology it is technically no longer a first-person perspective, and no longer direct observation, because the “observer” is no longer a “person”, a “human”.

Of course, a person or people using the device or machine that observes can look at the outcomes of the observations.

Just because a person didn’t observe whatever it is “first” doesn’t invalidate the observation of the device or machine.

Many of the “observing” instruments or devices people use aren’t just for “observing” or “detecting” alone; these devices can do other kinds of observation, such as counting (hence quantities), measuring, analysing physical properties, comparing one piece of evidence against other evidence, and so much more. The point is that human sensory perception is limited, and devices can be used to perceive what people cannot naturally perceive with their own eyes, ears, nose, touch, etc.

For instance, a person cannot see or measure electricity running in some circuitry, but that person can use a multimeter or oscilloscope to do all the observations: detecting the charge or current and obtaining information or data about the electricity that goes through any component of the circuitry, such as measurements of current, charge, voltage, resistance and power, whether the circuitry uses AC or DC, and the type of waveform in an AC circuit. Multimeters & oscilloscopes aren't just measuring tools; they are also versatile enough to allow any electrician or technician to diagnose faults in the circuitry.
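
As a rough, hypothetical illustration of the kind of quantities such a meter reports (the numbers below are invented, not real measurements), two readings plus Ohm's law and the power law are enough to derive the rest:

```python
# Minimal sketch: deriving circuit quantities from two hypothetical multimeter
# readings. The values are invented for illustration, not real measurements.

def circuit_quantities(voltage_v: float, resistance_ohm: float) -> dict:
    """Apply Ohm's law (I = V / R) and the power law (P = V * I)."""
    current_a = voltage_v / resistance_ohm
    power_w = voltage_v * current_a
    return {"current_A": current_a, "power_W": power_w}

print(circuit_quantities(voltage_v=12.0, resistance_ohm=470.0))
# -> {'current_A': 0.0255..., 'power_W': 0.3063...}
```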

Even a camera is a device that can either capture still images of evidence or record it on video, which anyone (observers) can view at any time. The camera doesn’t even need a person to be there while it is operating.

A lot of evidence is not directly observed by the first person. If you were using a telescope or microscope, you would be seeing many things that your eyesight wouldn’t see with the naked eye. Our eyes are limited to visible light and see colours of certain wavelengths. There are many telescopes capable of observing stars, galaxies, nebulae, etc., going beyond the visible light spectrum, e.g. infrared, ultraviolet, radio waves, microwave, gamma ray. It is not direct observation whenever we use these telescopes.

Scientists can use mass spectrometers to break down solid, liquid, gas or plasma specimens, to accurately find out what chemical compounds or atoms those specimens are composed of.

Hospitals, clinics and research facilities often operate all sorts of machines that look inside any part of a human body without ever cutting it open: machines such as ultrasound, X-ray, CT scan, MRI, EEG, the digital sphygmomanometer, etc.

My point is that we use technology to do the observing & measuring of the evidence, which provides the data needed to understand it. That’s what provides objective information for any scientist, engineer or whoever uses that technology. You don’t necessarily need a first-person observer to have objective observations.
 

Whateverist

Active Member
The problem is that most of what is being said applies to thought and not to consciousness. I believe consciousness is life and all life is individual. But we homo omnisciencis use different programming, which I call modern languages, than all other species that ever existed. This programming makes us very different and gives rise to abstract thought that hides the nature of consciousness from us.

I’m in complete agreement. The part of our brain with which we manipulate the world to get what we need is that with which we express ourselves verbally for the most part although, as with any other function we can name, both hemispheres contribute albeit in different ways. We don’t have two different brains, just two different centers which focus differently. The left attends where we decide with our narrow beam focused conscious attention. The right attends to everything at a preconscious level as it becomes present to us and triages what will come to our conscious attention. They can and should work cooperatively but hyperactive attempts to control everything consciously lead to mental illness such as schizophrenia.

In my opinion religion has succeeded because it provides a mental framework for dealing with some things and accepting that other things are beyond our control. Having faith that the preconscious part of our brains may provide intuition/insight/inspiration when confronted with the unexpected where no solution is familiar could be the upshot of belief in God. Leastwise it provides an incentive to look beyond what we can figure out through examining what we already know or can find by researching what others have discovered. Of course belief in any particular religious schema is not required but an open mind and some receptivity is.
 

mikkel_the_dane

My own religion
This has never been in dispute. I am not an eliminative materialist. I have no trouble talking about the mind--what eliminativists deride as folk psychology. All I'm saying is that materialism doesn't really eliminate folk psychology. Materialism is the only means we have of explaining it.

Okay. Well, since I am neither a materialist nor an idealist, we could probably go a round or two about materialism. Or rather about what we actually know about objective reality in itself.
 

Copernicus

Industrial Strength Linguist
Okay. Well, since I am neither a materialist nor an idealist, we could probably go a round or two about materialism. Or rather about what we actually know about objective reality in itself.

What we know can only be built on knowledge of how our bodies interact with objective reality. I think we may be in agreement on that point.
 

RestlessSoul

Well-Known Member
That is not necessarily always true.

In the world of technology, we have designed & constructed instruments, devices, machines, computers, etc., to do some of our observations, so with such technology it is technically no longer a first-person perspective, and no longer direct observation, because the “observer” is no longer a “person”, a “human”.

Of course, a person or people using the device or machine that observes can look at the outcomes of the observations.

Just because a person didn’t observe whatever it is “first” doesn’t invalidate the observation of the device or machine.

Many of the “observing” instruments or devices people use aren’t just for “observing” or “detecting” alone; these devices can do other kinds of observation, such as counting (hence quantities), measuring, analysing physical properties, comparing one piece of evidence against other evidence, and so much more. The point is that human sensory perception is limited, and devices can be used to perceive what people cannot naturally perceive with their own eyes, ears, nose, touch, etc.

For instance, a person cannot see or measure electricity running in some circuitry, but that person can use a multimeter or oscilloscope to do all the observations: detecting the charge or current and obtaining information or data about the electricity that goes through any component of the circuitry, such as measurements of current, charge, voltage, resistance and power, whether the circuitry uses AC or DC, and the type of waveform in an AC circuit. Multimeters & oscilloscopes aren't just measuring tools; they are also versatile enough to allow any electrician or technician to diagnose faults in the circuitry.

Even a camera is a device that can either capture still images of evidence or record it on video, which anyone (observers) can view at any time. The camera doesn’t even need a person to be there while it is operating.

A lot of evidence is not directly observed by the first person. If you were using a telescope or microscope, you would be seeing many things that your eyesight wouldn’t see with the naked eye. Our eyes are limited to visible light and see colours of certain wavelengths. There are many telescopes capable of observing stars, galaxies, nebulae, etc., going beyond the visible light spectrum, e.g. infrared, ultraviolet, radio waves, microwave, gamma ray. It is not direct observation whenever we use these telescopes.

Scientists can use mass spectrometers to break down solid, liquid, gas or plasma specimens, to accurately find out what chemical compounds or atoms those specimens are composed of.

Hospitals, clinics and research facilities often operate all sorts of machines that look inside any part of a human body without ever cutting it open: machines such as ultrasound, X-ray, CT scan, MRI, EEG, the digital sphygmomanometer, etc.

My point is that we use technology to do the observing & measuring of the evidence, which provides the data needed to understand it. That’s what provides objective information for any scientist, engineer or whoever uses that technology. You don’t necessarily need a first-person observer to have objective observations.


“The eyepiece of even the largest telescope cannot be larger than the human eye.”

- Ludwig Wittgenstein
 

shunyadragon

shunyadragon
Premium Member
There is a significant advance in technology whereby the images people see can be traced, monitored, and reconstructed using brain scans. What these and other similar advances in AI and other technology achieve is to link consciousness and intelligence to the direct function of the brain.

Images are no longer simply mental abstracts.

More references to follow on the advances in AI and the understanding of consciousness and intelligence's relationship to the brain.


Source: https://www.science.org/content/article/ai-re-creates-what-people-see-reading-their-brain-scans


AI re-creates what people see by reading their brain scans

A new artificial intelligence system can reconstruct images a person saw based on their brain activity

7 MAR 2023 BY KAMAL NAHAS
As neuroscientists struggle to demystify how the human brain converts what our eyes see into mental images, artificial intelligence (AI) has been getting better at mimicking that feat. A recent study, scheduled to be presented at an upcoming computer vision conference, demonstrates that AI can read brain scans and re-create largely realistic versions of images a person has seen. As this technology develops, researchers say, it could have numerous applications, from exploring how various animal species perceive the world to perhaps one day recording human dreams and aiding communication in people with paralysis.

Many labs have used AI to read brain scans and re-create images a subject has recently seen, such as human faces and photos of landscapes. The new study marks the first time an AI algorithm called Stable Diffusion, developed by a German group and publicly released in 2022, has been used to do this. Stable Diffusion is similar to other text-to-image “generative” AIs such as DALL-E 2 and Midjourney, which produce new images from text prompts after being trained on billions of images associated with text descriptions.

For the new study, a group in Japan added additional training to the standard Stable Diffusion system, linking additional text descriptions about thousands of photos to brain patterns elicited when those photos were observed by participants in brain scan studies.
© Copyright Original Source
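
As a very rough sketch of the general idea described above (linking brain-scan patterns to the embedding space of an image generator), the toy code below fits a linear mapping from synthetic "fMRI" vectors to synthetic image embeddings. It is an illustration of the concept only, with invented data and a plain ridge regression; it does not reproduce the study's Stable Diffusion pipeline.

```python
# Toy illustration: map synthetic "brain patterns" to synthetic image embeddings
# with ridge regression. Not the study's method; all data here are invented.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_images, n_voxels, n_embed = 200, 500, 64

# Fake voxel activity and fake image embeddings related by a hidden linear map.
true_map = rng.normal(size=(n_voxels, n_embed))
fmri = rng.normal(size=(n_images, n_voxels))
image_embed = fmri @ true_map + 0.1 * rng.normal(size=(n_images, n_embed))

# Fit a linear decoder from voxel activity to the embedding space.
decoder = Ridge(alpha=1.0).fit(fmri[:150], image_embed[:150])

# In the study, predicted embeddings for held-out scans condition a generative
# model (Stable Diffusion) to reconstruct the seen image; here we just check fit.
pred = decoder.predict(fmri[150:])
print("held-out correlation:", round(np.corrcoef(pred.ravel(),
                                                 image_embed[150:].ravel())[0, 1], 3))
```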

 

shunyadragon

shunyadragon
Premium Member
Source: Brain Activity Decoder Can Reveal Stories in People’s Minds

More interesting research on the relationship between the brain and consciousness, using AI.

May 1, 2023 • by Marc Airhart

The work relies in part on a transformer model, similar to the ones that power ChatGPT.

A new artificial intelligence system called a semantic decoder can translate a person’s brain activity — while listening to a story or silently imagining telling a story — into a continuous stream of text. The system developed by researchers at The University of Texas at Austin might help people who are mentally conscious yet unable to physically speak, such as those debilitated by strokes, to communicate intelligibly again.

The study, published in the journal Nature Neuroscience, was led by Jerry Tang, a doctoral student in computer science, and Alex Huth, an assistant professor of neuroscience and computer science at UT Austin. The work relies in part on a transformer model, similar to the ones that power OpenAI's ChatGPT and Google's Bard.

Unlike other language decoding systems in development, this system does not require subjects to have surgical implants, making the process noninvasive. Participants also do not need to use only words from a prescribed list. Brain activity is measured using an fMRI scanner after extensive training of the decoder, in which the individual listens to hours of podcasts in the scanner. Later, provided that the participant is open to having their thoughts decoded, their listening to a new story or imagining telling a story allows the machine to generate corresponding text from brain activity alone.

“For a noninvasive method, this is a real leap forward compared to what’s been done before, which is typically single words or short sentences,” Huth said. “We’re getting the model to decode continuous language for extended periods of time with complicated ideas.”
The result is not a word-for-word transcript. Instead, researchers designed it to capture the gist of what is being said or thought, albeit imperfectly. About half the time, when the decoder has been trained to monitor a participant’s brain activity, the machine produces text that closely (and sometimes precisely) matches the intended meanings of the original words.

For example, in experiments, a participant listening to a speaker say, “I don’t have my driver’s license yet” had their thoughts translated as, “She has not even started to learn to drive yet.” Listening to the words, “I didn’t know whether to scream, cry or run away. Instead, I said, ‘Leave me alone!’” was decoded as, “Started to scream and cry, and then she just said, ‘I told you to leave me alone.’”

© Copyright Original Source
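
To make "captures the gist rather than a word-for-word transcript" concrete, here is a toy check on the article's own example pair, using a crude word-overlap score (this is only an illustration, not the evaluation the researchers used):

```python
# Toy illustration of "gist, not transcript", using the article's example pair.
# Crude word-overlap (Jaccard) score, not the researchers' actual metric.

def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

heard   = "I don't have my driver's license yet"
decoded = "She has not even started to learn to drive yet"

print("exact match:", heard.lower() == decoded.lower())    # False: not a transcript
print("word overlap:", round(jaccard(heard, decoded), 2))  # tiny, yet the meaning is close
```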
 

shunyadragon

shunyadragon
Premium Member
One of the complicated problems in relating the physical brain to consciousness and behavior is the actual ability to map this relationship. Science found a way toward mapping it in animals, including humans, by studying the relationship in a worm at the simplest level.


Neural Navigators: How MIT Cracked the Code That Relates Brain and Behavior in a Simple Animal

MIT researchers model and map how neurons across the tiny brain of a C. elegans worm encode its behaviors, revealing many new insights about the robustness and flexibility of its nervous system


To understand the intricate relationship between brain activity and behavior, scientists have needed a way to map this relationship for all of the neurons across a whole brain. Thus far this has been an insurmountable challenge. But after inventing new technologies and methods for the purpose, a team of scientists in The Picower Institute for Learning and Memory at MIT has produced a meticulous accounting of the neurons in the tractably tiny brain of a humble C. elegans worm, mapping out how its brain cells encode almost all of its essential behaviors, such as movement and feeding.

In the journal Cell on August 21, the team presented new brain-wide recordings and a mathematical model that accurately predicts the versatile ways that neurons represent the worm’s behaviors. Applying that model specifically to each cell, the team produced an atlas of how most cells, and the circuits they take part in, encode the animal’s actions. The atlas, therefore, reveals the underlying “logic” of how the worm’s brain produces a sophisticated and flexible repertoire of behaviors, even as its environmental circumstances change.

Insights from the Research

“This study provides a global map of how the animal’s nervous system is organized to control behavior,” said senior author Steven Flavell, Associate Professor in MIT’s Department of Brain and Cognitive Sciences. “It shows how the many defined nodes that make up the animal’s nervous system encode precise behavioral features, and how this depends on factors like the animal’s recent experience and current state.”

Graduate students Jungsoo Kim and Adam Atanas, who each earned their PhDs this spring for the research, are the study’s co-lead authors. They’ve also made all their data, and the findings of their model and atlas, freely available to fellow researchers at a website called the WormWideWeb.
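
A minimal sketch, on synthetic data, of what an "encoding model" of behavior means in principle: predict a neuron's activity from behavioral features and read off which features it encodes. The MIT model is far richer (state-dependence, time lags, whole-brain recordings); this only shows the basic idea.

```python
# Minimal encoding-model sketch: predict one neuron's activity from behavior.
# Synthetic data; illustrates the principle, not the paper's actual model.
import numpy as np

rng = np.random.default_rng(1)
n_timepoints = 1000

# Stand-in behavioral features: locomotion velocity and a feeding signal.
velocity = rng.normal(size=n_timepoints)
feeding = (rng.random(n_timepoints) > 0.7).astype(float)
X = np.column_stack([velocity, feeding, np.ones(n_timepoints)])  # + intercept

# A synthetic neuron that weights velocity strongly and feeding weakly.
activity = 1.5 * velocity + 0.3 * feeding + 0.2 * rng.normal(size=n_timepoints)

# Least-squares weights describe what this "neuron" encodes.
weights, *_ = np.linalg.lstsq(X, activity, rcond=None)
print("velocity weight: %.2f, feeding weight: %.2f" % (weights[0], weights[1]))
```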
 

incites

Member
There have been a number of threads and posts that challenge the scientific basis for a naturally evolved consciousness.

The following are basic definitions of consciousness:

The Cambridge Dictionary defines consciousness as "the state of understanding and realizing something." The Oxford Living Dictionary defines consciousness as "the state of being aware of and responsive to one's surroundings" and "a person's awareness or perception of something."

I will argue the following:

1. Degrees of consciousness exist throughout the evolutionary history of animals with a nervous system and a brain, with neurological responses to awareness of the environment. Consciousness increases with complexity over time.
2. Consciousness represents the collective thoughts, reasoning, understanding, and realizing of relationships and responses to the environment, which have been shown by falsifiable scientific methods to originate from the brain and nervous system.
3. Science has a reasonable explanation of the nature of consciousness in the animal kingdom.

Research is constantly expanding our scientific understanding of consciousness in the animal kingdom. The following article is representative of the current advances of science:


Tracing the Evolutionary Roots of Cognitive Flexibility

Summary: A new study provides insights into the evolutionary origins of cognitive flexibility, an essential skill for adaptation and survival.

Participants were studied using functional magnetic resonance imaging (fMRI) while learning a sensorimotor task, the findings of which showed the importance of sensory brain regions in decision-making. The researchers also discovered surprising similarities between the brain activity of humans and mice during this task.

These results suggest that the interplay between the frontal brain and sensory brain regions for decision-making formed early in evolutionary development.

Key Facts:


  1. Cognitive flexibility, which allows quick adaptation to changing conditions, is crucial for survival and is based on the functions of the orbitofrontal cortex located in the frontal brain.
  2. Sensory brain regions are critical in decision-making processes as discovered in the study, suggesting the need for further investigation in this area.
  3. The similarity in cognitive processes between mice and humans suggests that these decision-making mechanisms likely developed early in evolutionary history.

Source: RUB

Get up. Go to the kitchen. Prepare some cereal – but a look into the fridge shows: the milk bottle is empty. What now? Skip breakfast? Ask the neighbour for milk? Eat jam sandwiches? Every day, people are confronted with situations that were actually planned quite differently. Flexibility is what helps.

The origin of this skill in the brain is called cognitive flexibility.

A neuroscientific research team at the Berufsgenossenschaftliches Universitätsklinikum Bergmannsheil, University Hospital of Ruhr University Bochum, Germany, and the Biosciences Institute at Newcastle University has now succeeded in getting a little closer to the evolutionary origin of cognitive flexibility.

The researchers published their findings in the journal Nature Communications, online since 9 June 2023.

Key factor in many neuropsychiatric diseases

Cognitive flexibility is essential for the survival of all species on Earth. It is particularly based on functions of the so-called orbitofrontal cortex located in the frontal brain.

“The loss of cognitive flexibility in everyday life is a key factor in many neuropsychiatric diseases,” Professor Burkhard Pleger and first author Dr. Bin Wang from the Berufsgenossenschaftliches Universitätsklinikum Bergmannsheil describe their motivation for the study.

“Understanding the underlying network mechanisms is therefore essential for the development of new therapeutic methods.”

Using functional magnetic resonance imaging (fMRI), the Bochum team and their cooperation partner Dr. Abhishek Banerjee from the Biosciences Institute at Newcastle University examined the brain functions of 40 participants while they were learning a sensorimotor task.

While lying in the MRI, the volunteers had to learn to recognise the meaning of different touch signals – similar to those used in Braille – on the tip of the right index finger. One touch signal told the participants to press a button with their free hand, while another signal instructed them not to do so and to remain still.

The connection between the two different touch signals and pressing the button or not pressing the button had to be learned from trial to trial. The challenge: after a certain time, the touch signals changed their meaning.

What had previously meant “pressing the button” now meant “holding still” – an ideal experimental set-up to investigate the volunteers’ cognitive flexibility. The fMRI provided images of the corresponding brain activity.
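
For readers unfamiliar with this kind of reversal-learning paradigm, here is a toy simulation of the trial structure just described (hypothetical code, not the experiment's software): the stimulus-response rule flips partway through, and a simple learner has to relearn it from its errors.

```python
# Toy reversal-learning simulation: two touch signals map to "press" / "hold",
# and the mapping flips halfway through. Hypothetical illustration only.
import random

random.seed(0)
rule = {"signal_A": "press", "signal_B": "hold"}    # initial stimulus-response rule
belief = {"signal_A": "press", "signal_B": "hold"}  # the learner's current mapping

correct = 0
for trial in range(1, 101):
    if trial == 51:                                 # reversal: the meanings swap
        rule = {"signal_A": "hold", "signal_B": "press"}
    stimulus = random.choice(["signal_A", "signal_B"])
    if belief[stimulus] == rule[stimulus]:
        correct += 1
    else:
        belief[stimulus] = rule[stimulus]           # relearn from the error
print(f"correct on {correct} of 100 trials")
```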


Similarities between humans and mice

“Similar studies had already been done with mice in the past,” says Pleger.

“The learning task we chose now allowed us to observe the brains of mice and humans under comparable cognitive demands.”

A surprising finding is the comparability between the Bochum results in humans and the previously published data from mice, Wang points out.

The similarity shows that cognitive functions that are important for survival, such as the flexibility to adapt quickly to suddenly changing conditions, are following comparable rules in different species.

In addition, the Bochum scientists were able to determine a close involvement of sensory brain regions in the processing of the decisions made during tactile learning. Wang emphasises: “Besides the frontal brain, sensory regions are essential for decision-making in the brain.”

More to follow . . .
Elon Musk said that if it had taken another ten seconds for consciousness to develop, then it never would have happened, and that is just how precise and balanced nature and the cosmos are. Humans have survived every one of the challenges they have faced so far from nature and the cosmos, and that's why they are here: they are Homo sapiens, modern-day humans, 200,000 years old. But I'm saying I have identified over 20 breeds of humans and 4 sub-breeds. Today it's 90% Homo sapiens, but 10% is an unknown breed that is migrating among the Homo sapiens, not to replace them but to give them greater ability mentally and physically, like an increase in the Homo sapiens to make them even stronger and more able to do the important things that they must do in their daily routine, as they will be quicker at computing and faster thinking, as their intelligence will be boosted and they will be more equipped to face whatever challenges they face from nature or the cosmos.
 