> Opcodes are the lowest level

The lowest level is procedures: the actual instantiation of code.
> This includes structures that represent (that is, have the properties of)

The structures have the properties; we do the understanding. A calculator can do complicated mathematics. Does it understand addition?
> I understand the very idea of understanding to be this: you understand an object when you have a comprehensive (not necessarily complete) knowledge of the object's properties.

Objects exist only in abstraction. There is no "tree", no "red", no "idea", no "program"; there are only the abstractions of these. Computers don't abstract. We might try to write programs that use words like "tree" and have it inherit from "plant", but we could achieve exactly the same with meaningless symbols, because symbols are all the computer ends up processing (see the sketch below). The words in programming languages are meaningful to us.
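A minimal sketch of this point in C++ (all names invented for illustration): the inheritance below reads as "a tree is a plant" only to us. Replace every identifier with a meaningless symbol and the compiler produces exactly the same behavior, because the names never carried any meaning for it.

```cpp
#include <iostream>

// "Meaningful" names -- meaningful to us, not to the compiler.
struct Plant {
    virtual ~Plant() = default;
    virtual void grow() { std::cout << "growing\n"; }
};
struct Tree : Plant {};

// The same structure under meaningless symbols; the compiler
// processes both hierarchies identically.
struct X7f2 {
    virtual ~X7f2() = default;
    virtual void q() { std::cout << "growing\n"; }
};
struct K9a1 : X7f2 {};

int main() {
    Tree t;
    t.grow();   // prints "growing"
    K9a1 k;
    k.q();      // identical behavior from identical structure
}
```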
> You understand arithmetic on integers when you know that, e.g. adding two integers always produces an integer

I know a lot more than that. I have a concept of what adding is and what numerosity is, and that concept is based on the extension of numerosity to groupings of just about every category represented in my brain. Brains are superb at categorizing. Computers are terrible at it because they cannot represent concepts at all; they can only implement procedures.
> This can be represented in the computer system, which means the computer understands arithmetic

Were that true, we wouldn't need to program every single step. Understanding arithmetic means seeing how numbers get added together and being able to add two numbers one has never seen, like 6,349,586,086 and 3.25260346, because one understands addition, not because one was programmed with an addition algorithm (see the sketch below).
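To make the contrast concrete, here is a minimal sketch of what "being programmed with an addition algorithm" amounts to: schoolbook digit-by-digit addition over decimal strings (non-negative integers only, for brevity; a fractional input like the one above would need an extension). The program handles numbers it has never "seen" only because every carry step was spelled out in advance.

```cpp
#include <algorithm>
#include <iostream>
#include <string>

// Schoolbook addition: every step -- digit lookup, carry, reduction --
// is programmed explicitly. No concept of numerosity is involved.
std::string add(const std::string& a, const std::string& b) {
    std::string sum;
    int carry = 0;
    int i = static_cast<int>(a.size()) - 1;
    int j = static_cast<int>(b.size()) - 1;
    while (i >= 0 || j >= 0 || carry) {
        int d = carry;
        if (i >= 0) d += a[i--] - '0';
        if (j >= 0) d += b[j--] - '0';
        carry = d / 10;
        sum.push_back(static_cast<char>('0' + d % 10));
    }
    std::reverse(sum.begin(), sum.end());
    return sum;
}

int main() {
    // An input pair the program was never written "for"; it follows
    // the procedure regardless.
    std::cout << add("6349586086", "325260346") << '\n';  // 6674846432
}
```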
> Mostly, the whole idea occurred to me debating the nature of qualia. I know how to represent the notion of redness in mathematical terms

We don't represent the notion in mathematical terms.
> I had to work backwards and explicitly construct a concept that behaved in the same way as qualia intuitively behaved

Qualia don't "behave". A quale is entirely subjective: it is your experience of color, not any generalized experience. That's the point: it is internal, subjective, and unique.
> System.Type is a model, in the maths sense.

It is not a model of concepts. It isn't intended to be and never will be.
> It behaves according to a set of properties

It doesn't behave; it's an abstract framework. Things done in this framework by a person enable a computer to behave. If you can't give an example of a computer showing it understands the notion "concept" using this framework, then what does it matter?
> If I have a C++ program in front of me, do I have to understand it to be able to write an assembly program that will do the same thing?

Assembly and C++ are both high-level in the sense that matters here: symbolic notations meaningful to humans, not to the machine. We don't interpret our own code; we see the results of interpreting it. Looking at computer code is as meaningless as looking at a description of a plant or a snail and interpreting that as understanding how the snail thinks. Look at what the system can do. Can it even mimic anything remotely close to language (conceptual processing)? No.
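For what the quoted question involves in practice, here is a trivial C++ function together with, in comments, the x86-64 assembly a typical compiler might emit for it (System V AMD64 calling convention assumed). The translation is mechanical, rule by rule; whether performing it requires understanding the program is exactly what this exchange disputes.

```cpp
// Translating this to assembly is a rule-driven process; a compiler
// does it with no grasp of what "add" means to us.
int add(int a, int b) {
    return a + b;
}
// A typical x86-64 compiler emits something like (a in edi, b in esi):
//   lea eax, [rdi + rsi]   ; eax = a + b
//   ret                    ; result returned in eax
```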