But isn't it limited by how much memory it can address? e.g. whether it is 16-bit, etc.?
@Banach-Tarski Paradox is correct; he appears to actually be a mathematician, and mathematics, alongside engineering, is the field behind computers.
Computers, as I used to say, are just a box of rocks; they have no more intelligence in them than that. Current computers do one thing: add and subtract in binary. Everything else is just what they do with the result (where they put it) and what the next instruction is.
Beyond that, it is all software. 8 bits is only the size of the register; there is no reason you can't use it again for the next 8 bits and emulate 16-bit arithmetic. The only limitations are how long you are willing to wait and how much time it takes to write the code to do it.
My suggestion is to leave this environment, as it is not actually a good place to learn about computers. My CS101 class 50 years ago, with punch cards and tractor-feed printers, gave me a better leg up on actually understanding what computers do than most here have, and so did learning network protocols from my son's textbook so I could explain them to him, and teaching myself Access and SQL so I could process sample data.
Read about Alan Turing, Turing completeness, Charles Babbage, and his friend Ada Lovelace, and then you can work yourself forward to the engineers in Palo Alto. Heck, you might even run into my next-door neighbor, who became rich off of the code he wrote for 32k modems. (Look them up in a history book.)
That said, this wasn't the post I intended; just as I started writing, my internet went out and left me on a useless page.
Good luck. My other son is writing software for a company that tracks dairy herd production and quality; he is using newer programs (apps, these days) but solving the same problems I did 25 years ago. It is just a box of rocks; it is knowing how to use it that makes the difference.