Getting from cause and effect to awareness

LegionOnomaMoi

Veteran Member
Premium Member
Opcodes are the lowest level
The lowest level is procedures. They're the actual instantiation of code.


This includes structures that represent (that is, has the properties of)
The structures have the properties we understand. A calculator can do complicated mathematics. Does it understand addition?

I understand the very idea of understanding to be this: you understand an object when you have a comprehensive (not necessarily complete) knowledge of the object's properties.
Objects exist only in abstraction. There is no "tree", no "red", no "idea", no "program"; there are only the abstractions of these. Computers don't abstract. We might try to write programs which use words like "tree" and have it inherit from "plant", but we could do all the same with meaningless symbols, because this is what the computer ends up processing. The words in programming languages are meaningful to us.
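A minimal C++ sketch of that point (hypothetical class names, purely an illustration): rename the meaningful identifiers to arbitrary symbols and the machine processes exactly the same thing; the meaning lives only in our reading of the source.

// Version with names that mean something to us
struct Plant { int height_cm; };
struct Tree : Plant { int ring_count; };

// Version with meaningless symbols -- to the compiler and CPU it is the same structure
struct X107 { int a; };
struct Q942 : X107 { int b; };

int main() {
    Tree t;  t.height_cm = 300; t.ring_count = 40;
    Q942 q;  q.a = 300;         q.b = 40;
    return 0;  // either way, the machine only manipulates offsets into memory
}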
You understand arithmetic on integers when you know that, e.g. adding two integers always produces an integer
I know a lot more than that. I have a concept of what adding is and what numerosity is and this is based on the extension of numerosity to groupings of just about every category represented in my brain. Brains are superb at categorizing. Computers are terrible at it because they cannot represent concepts at all. They can only implement procedures.
This can be represented in the computer system, which means the computer understands arithmetic
Were that true, we wouldn't need to program every single step. Understanding arithmetic means seeing how numbers get added together and being able to add two numbers one has never seen, like 6,349,586,086 and 3.25260346 because one understands addition, not because one was programmed with an addition algorithm.
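For illustration (my own sketch, not anything from the thread, and integers only): the kind of digit-by-digit procedure a machine follows when it adds. It produces correct sums for numbers it has never "seen", but only because every step was spelled out in advance.

#include <algorithm>
#include <iostream>
#include <string>

// Add two non-negative integers given as decimal strings, digit by digit,
// exactly as the procedure was written -- no step is improvised.
std::string add_decimal(const std::string& a, const std::string& b) {
    std::string result;
    int carry = 0;
    int i = a.size() - 1, j = b.size() - 1;
    while (i >= 0 || j >= 0 || carry) {
        int sum = carry;
        if (i >= 0) sum += a[i--] - '0';
        if (j >= 0) sum += b[j--] - '0';
        result.push_back('0' + sum % 10);
        carry = sum / 10;
    }
    std::reverse(result.begin(), result.end());
    return result;
}

int main() {
    std::cout << add_decimal("6349586086", "325260346") << "\n";  // prints 6674846432
}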

Mostly, the whole idea occurred to me debating the nature of qualia. I know how to represent the notion of redness in mathematical terms
We don't represent the notion in mathematical terms.

I had to work backwards and explicitly construct a concept that behaved in the same way as qualia intuitively behaved
Qualia doesn't behave. It is entirely subjective and is your experience of color, not any generalized experience. That's the point. It is internal, subjective, and unique.

System.Type is a model, in the maths sense.

It is not a model of concepts. It isn't intended to be and will never be.
It behaves according to a set of properties
It doesn't behave. It's an abstract framework. Things done in this framework by a person enable a computer to behave. If you can't give an example of a computer showing it understands the notion "concept" using this framework, then what does it matter?


If I have a C++ program in front of me, do I have to understand it to be able to write an assembly program that will do the same thing?
Assembly and C++ are both human-readable programming languages. We don't interpret our own code; we see the results of interpreting it. Looking at computer code is as meaningless as looking at a description of a plant or a snail and interpreting that as understanding how the snail thinks. Look at what the system can do. Can it even mimic anything remotely close to language (conceptual processing)? No.
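As a rough illustration (standard toolchain behavior, not anything specific to this thread): a one-line C++ function and roughly the x86-64 assembly a compiler such as GCC might emit for it under the usual System V calling convention. The translation between the two is mechanical; neither text is the machine "understanding" addition.

// C++
int add(int a, int b) { return a + b; }

// Roughly what an optimizing x86-64 compiler might produce
// (first argument in edi, second in esi, result in eax):
//   add:
//       lea eax, [rdi+rsi]
//       ret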
 

Slapstick

Active Member
Computers are literally stupid. I read a post made earlier by Legion and he is right: computers were created to perform grunt work, yet be consistent when doing it without creating errors in computations. Before the modern-day electronic computer, people who did computations were called computers. Electronic computers help to provide consistency with computation, because one error in a log book could throw off the entire sequence of a logarithm table, and it would be near impossible to find the error after 100 or more pages of computations, which is why we have the concept of human error.

Human computer - Wikipedia, the free encyclopedia

I also made a post about why computers are stupid. They might be able to learn different things, but it's based on what we tell them to do. http://www.religiousforums.com/forum/3559012-post99.html
 

idav

Being
Premium Member
Objects exist only in abstraction. There is no "tree", no "red", no "idea", no "program"; there are only the abstractions of these. Computers don't abstract. We might try to write programs which use words like "tree" and have it inherit from "plant", but we could do all the same with meaningless symbols, because this is what the computer ends up processing. The words in programming languages are meaningful to us.
With the software, the computer becomes more than the ones and zeroes. The awareness requires a central area of processing, like a collection of what is known. Just like we are more than the collection of neurons: whatever the neurons say to each other is meaningless to us. It's the effects of the commands carried out that do anything, like releasing more dopamine.

For an actual concept of a tree or red, it doesn't have to experience it like us. If a robot has the concept of an object in front (height, length, and width), then for all intents and purposes that robot is aware. It matters little what style of perception is being used; it could just use echolocation and be as blind as a bat.
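A minimal sketch of what such an "object in front" representation might look like (hypothetical names, my own illustration): a few structured readings that could have come from a camera, an ultrasonic echo locator, or anything else.

#include <iostream>

// Hypothetical perceived object, reduced to a handful of numbers.
struct PerceivedObject {
    double distance_m;
    double height_m;
    double width_m;
    double length_m;
};

// The robot is "aware" of the object only in this thin, procedural sense.
bool obstacle_ahead(const PerceivedObject& obj) {
    return obj.distance_m < 1.0;
}

int main() {
    PerceivedObject o{0.8, 2.5, 1.1, 1.1};
    std::cout << (obstacle_ahead(o) ? "stop" : "go") << "\n";  // prints "stop"
}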
 

LegionOnomaMoi

Veteran Member
Premium Member
With the software, the computer becomes more than the ones and zeroes.
How so?

The awareness requires a central area of processing, like a collection of what is known.
We don't have a central area. We have many, and they are connected with other brain areas in ways that are incomparable to computers.

For an actual concept of a tree or red, it doesn't have to experience it like us.
Not exactly like us, no. But the experience a computer has isn't of concepts.
It matters little what style of perception is being used
It's essential.
 

idav

Being
Premium Member

A computer has to link various processes and form a larger picture.

I see why you're asking for the language, because it needs a means of relating to all the data. Being aware is beyond language; a person could be experiencing life for the first time with no means of communicating to himself or others.

Not exactly like us, no. But the experience a computer has isn't of concepts.

It's essential.

What is the experience of Watson having all that knowledge and being able to communicate it to us? There is at least the awareness of the knowledge base. In order to be aware of the experience, it needs to be able to remember and recall the passing seconds to experience a "now"-type moment.
 

LegionOnomaMoi

Veteran Member
Premium Member
A computer has to link various processes and form a larger picture.
It doesn't do this. CS is your field. You know this.

What is the experience of Watson having all that knowledge and being able to communicate it to us?
Zip. We know exactly how the program was able to imitate understanding. Watson understood nothing.
 

LegionOnomaMoi

Veteran Member
Premium Member
I'd be interested in what it would take to make software utilize that many parallel processes.
Parallel processing is nothing if you don't know how to program such machines. We can't and don't. We know how to make parallel processors mindlessly calculate.
 

idav

Being
Premium Member
Parallel processing is nothing if you don't know how to program such machines. We can't and don't. We know how to make parallel processors mindlessly calculate.

It can be done; it'd be a ton, but use the million cores to mindlessly calculate it for us. It's the experience that has me wondering: how to give it experience. It essentially needs to be able to retain a present moment; that's what I think awareness is.
 

LegionOnomaMoi

Veteran Member
Premium Member
It can be done
It can. But it doesn't matter if we can't make it do what we want. We can have multiple humans cranking levers, calculating math problems using old-school mechanics. Parallel processing simply means things are being done at the same time. If they aren't the things that count, it doesn't matter. You're basically talking about a lot of calculators working on adding numbers faster.
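A sketch of that "lots of calculators" picture (my own example, using standard C++ threads): several threads each sum a slice of an array at the same time. More cores make the same mindless addition finish sooner; nothing about the computation itself changes.

#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<long long> data(1000000, 1);
    const int workers = 4;
    std::vector<long long> partial(workers, 0);
    std::vector<std::thread> pool;

    // Each thread adds up its own slice -- the same arithmetic, just side by side.
    std::size_t chunk = data.size() / workers;
    for (int w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            std::size_t begin = w * chunk;
            std::size_t end = (w == workers - 1) ? data.size() : begin + chunk;
            partial[w] = std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
        });
    }
    for (auto& t : pool) t.join();

    std::cout << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";  // 1000000
}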
 

idav

Being
Premium Member
It can. But it doesn't matter if we can't make it do what we want. We can have multiple humans cranking levers, calculating math problems using old-school mechanics. Parallel processing simply means things are being done at the same time. If they aren't the things that count, it doesn't matter. You're basically talking about a lot of calculators working on adding numbers faster.

More IBM stuff.

IBM simulates 530 billion neurons, 100 trillion synapses on supercomputer | KurzweilAI
 

idav

Being
Premium Member
We've known for decades this isn't true. This study in the 90s already reports on previous studies showing 4-day-old infants recognizing their mother's faces. More recent studies (e.g., Bulf, H., & Turati, C. (2010). The role of rigid motion in newborns' face recognition. Visual Cognition, 18(4), 504-512.) have investigated how day-old infants recognize faces.
Their brains can't even handle the visuals at that age. They can see far enough in front to nurse. There isn't evidence that the newborn can recognize anything, maybe mom's voice if that.

The newborn is preprogrammed to be able to start analyzing and interpreting its world, but it is a blank slate for the most part. A machine with software that memorizes, analyzes, and categorizes in real time is equivalent to a conscious being with a blank slate. Awareness has to be programmed, and it doesn't have to be integrated with the hardware just because that's how we see it work in biology.
 

idav

Being
Premium Member
No. Any mathematical representation is an abstraction to US. To a computer, it's a procedure.
Anything can be a pattern and easily represented. Even just a simple search for pics is object representation. Awareness is the end result, like a photo, not all the stuff happening in the background. If you want to give a computer eyesight, it has to be programmed; if you want it integrated with the sight mechanism, it still has to be programmed.
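As a rough sketch of that "simple search for pics" idea (my own illustration, not any real image library): naive template matching just slides a small pattern over a grid of pixel values and reports where the numbers line up.

#include <iostream>
#include <vector>

using Grid = std::vector<std::vector<int>>;

// Return true if `pattern` occurs in `image` with its top-left corner at (r, c).
bool matches_at(const Grid& image, const Grid& pattern, std::size_t r, std::size_t c) {
    for (std::size_t i = 0; i < pattern.size(); ++i)
        for (std::size_t j = 0; j < pattern[0].size(); ++j)
            if (image[r + i][c + j] != pattern[i][j]) return false;
    return true;
}

int main() {
    Grid image   = {{0, 0, 1, 1},
                    {0, 0, 1, 1},
                    {0, 0, 0, 0}};
    Grid pattern = {{1, 1},
                    {1, 1}};
    // Slide the pattern over every position where it fits.
    for (std::size_t r = 0; r + pattern.size() <= image.size(); ++r)
        for (std::size_t c = 0; c + pattern[0].size() <= image[0].size(); ++c)
            if (matches_at(image, pattern, r, c))
                std::cout << "pattern found at (" << r << ", " << c << ")\n";  // (0, 2)
}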
 

idav

Being
Premium Member
You've heard of P vs. NP?
I believe P does not equal NP. It shouldn't, really, but that's obviously a problem when trying to do hugely parallel programming. Often we know the answer before we can work it out because we looked up the answer in the back of the book. Looking in the back of the book doesn't mean we understand the problem, but this is similar to P vs. NP.


why is it that we don't have programming languages where you can misspell a word or forget a semi-colon and the program will run perfectly anyway because the computer is able to "get" what you meant the way you or I can when we misspell words?
The device or machine is only as smart as the program. We have to teach the system ambiguity. That's what is needed; imagination is what makes us so clever, like our ability to read a whole sentence with no issues when every word is misspelled and has letters swapped.
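One way programmers approximate that kind of tolerance is fuzzy matching, e.g. edit distance (my own sketch, not something from the thread): the program doesn't "get" what you meant, it just picks whichever known word takes the fewest single-character edits to reach.

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Levenshtein edit distance: minimum insertions, deletions, and substitutions
// needed to turn one string into the other.
int edit_distance(const std::string& a, const std::string& b) {
    std::vector<std::vector<int>> d(a.size() + 1, std::vector<int>(b.size() + 1));
    for (std::size_t i = 0; i <= a.size(); ++i) d[i][0] = i;
    for (std::size_t j = 0; j <= b.size(); ++j) d[0][j] = j;
    for (std::size_t i = 1; i <= a.size(); ++i)
        for (std::size_t j = 1; j <= b.size(); ++j)
            d[i][j] = std::min({d[i-1][j] + 1, d[i][j-1] + 1,
                                d[i-1][j-1] + (a[i-1] != b[j-1])});
    return d[a.size()][b.size()];
}

int main() {
    std::vector<std::string> keywords = {"return", "while", "include"};
    std::string typo = "retrun";
    // Pick the keyword closest to the misspelling.
    auto best = *std::min_element(keywords.begin(), keywords.end(),
        [&](const std::string& x, const std::string& y) {
            return edit_distance(typo, x) < edit_distance(typo, y);
        });
    std::cout << "did you mean: " << best << "\n";  // did you mean: return
}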
 

PolyHedral

Superabacus Mystic
I believe P does not equal NP. It shouldn't, really, but that's obviously a problem when trying to do hugely parallel programming. Often we know the answer before we can work it out because we looked up the answer in the back of the book. Looking in the back of the book doesn't mean we understand the problem, but this is similar to P vs. NP.
P vs. NP has very little to do with capacity for parallel programming. Having more than one processor can only ever speed up your computation by a constant factor, whereas P vs. NP is a statement about the fundamental lowest possible speed at which it's possible to recognise vs. find answers to specific problems as those problems grow in size.
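A concrete illustration of that verify-vs-find gap (my own sketch, using subset sum, a standard NP-complete problem): checking a proposed answer only means adding it up, while the obvious way to find an answer tries every subset, which blows up exponentially as the problem grows, no matter how many processors share the work.

#include <iostream>
#include <vector>

// Verifying a proposed certificate is cheap: just add it up and compare.
bool verify(const std::vector<int>& subset, int target) {
    long long s = 0;
    for (int x : subset) s += x;
    return s == target;
}

// Finding a solution the obvious way tries all 2^n subsets.
bool find_subset(const std::vector<int>& nums, int target) {
    std::size_t n = nums.size();
    for (std::size_t mask = 0; mask < (std::size_t(1) << n); ++mask) {
        long long s = 0;
        for (std::size_t i = 0; i < n; ++i)
            if (mask & (std::size_t(1) << i)) s += nums[i];
        if (s == target) return true;
    }
    return false;
}

int main() {
    std::vector<int> nums = {3, 34, 4, 12, 5, 2};
    std::cout << verify({4, 5}, 9) << " " << find_subset(nums, 9) << "\n";  // prints "1 1"
}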
 