not nom
Well-Known Member
The term "step debugger"and the idea of running a complex ANN "step by step" have nothing to do with ANN's, nor is it possible to use a step debugger or to go through the program step by step and anyone with even a BASIC familiarity with neural network programming would know that.
blah blah blah. why not spend some time making an argument, instead of arguing from authority? and every time I say "nah", you simply repeat the exercise, with more text, and still no argument.
Instead of bothering to check up on what I was talking about, you posted your flippant response and then made it worse by expanding on it, continuing to apply concepts from programming in general which don't apply here.
oh, but they do.
ANNs do not use explicit algorithms which allow one to run the program step by step. Every time a complex ANN is run, the input neurons communicate with perhaps multiple hidden layers of neurons until the information reaches the output neurons and a final response. Each run results in an adjusting of weights according to the initial complex nonlinear algorithms. However, the code never specifies how the weights are adjusted after or during each run, nor is it possible to "stop" the program and "see" what connections led to this or that weight change or this or that final output.
"hidden" layers? you mean they get set up, then stimuli pass through them and they adjust their weights, according to what other nodes are up to. and that then comes up with things we cannot understand.
but that doesn't mean you cannot run it step by step, just that it doesn't really help. there are so many operations that you could spend your life "watching" the program and be none the wiser. but since you know how you set it up, and since you can know the deterministic behaviour of the individual operations, you can assume a deterministic whole follows from them. unless you wanna get esoteric about it, or need grants or something.
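to make it concrete, here's a toy sketch (made-up 3-4-2 sizes, plain numpy, obviously not anybody's actual research model): a forward pass through a "hidden" layer is just a few ordinary, deterministic array operations, and a step debugger will happily stop on any line of it.

[CODE]
# toy forward pass: a made-up 3-4-2 net in plain numpy (illustration only).
# every line is an ordinary, deterministic array operation; "hidden" just
# means these activations are neither the input nor the output.
import numpy as np

rng = np.random.default_rng(0)        # fixed seed, so reruns are identical
W1 = rng.standard_normal((3, 4))      # input -> hidden weights
W2 = rng.standard_normal((4, 2))      # hidden -> output weights

x = np.array([0.5, -1.0, 2.0])        # some input "stimulus"
h = np.tanh(x @ W1)                   # hidden layer activations
y = np.tanh(h @ W2)                   # output activations
print(h)                              # just numbers, the same numbers every run
print(y)
[/CODE]

you could set breakpoints all over that; it just wouldn't tell you anything interesting, which is exactly the point.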
make sure you do. Because you continue to apply traditional programming logic to a field designed specifically to avoid that "step-by-step" process.
bah, strawman. that you keep pounding on it is your thing; I don't "keep applying" that, I never did in the first place. I do know how neural nets work, basically, but I also know that they're made up of variables which hold one value at a time, and that unless you take entropy from input, it's deterministic.
and our inability to know the system trajectory in some cases, or to know how the result was derived, has nothing to do with programmers being unable to "step through" the code.
more importantly, it has nothing to do with whether it's deterministic or not.
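and it's trivial to check. here's a toy one-layer example (plain numpy, all the sizes, data and learning rate made up for illustration): seed it, and the whole training run, weight updates and all, reproduces bit for bit. change the seed and it doesn't, because the entropy came from us, not from some magic inside the net.

[CODE]
# fix the seed and the "mysterious" weight updates land on bit-identical
# values every run. toy one-layer regression in plain numpy; everything
# here is made up for illustration.
import numpy as np

def train(seed):
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((50, 3))           # fake inputs
    t = X @ np.array([1.0, -2.0, 0.5])         # fake targets
    w = rng.standard_normal(3)                 # initial weights
    for _ in range(200):                       # plain gradient descent
        grad = 2 * X.T @ (X @ w - t) / len(X)
        w -= 0.1 * grad
    return w

print(np.array_equal(train(42), train(42)))    # True: no entropy, no surprises
print(np.array_equal(train(42), train(7)))     # False: we supplied the entropy ourselves
[/CODE]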
And while your debugger approach is standard just about everywhere else in programming, if you knew what you were talking about, you'd know it doesn't work here, and you wouldn't have made reference to step debugging.
oh. my. *******. god.
you're like a dog with a bone or something.
still waiting for the argument, and how neural nets don't run on deterministic machines, which have a specific state at every given moment. though you kinda gave that up anyway, so I guess this'll just peter out.
Again, this only demonstrates that you are completely unfamiliar with artificial neural network programming, but rather than retract the rude comments you made, you'd prefer to just dig yourself a deeper grave.
*yawn*
First, an actual "Turing machine" requires an infinite length of tape, and one of the points (or results) was to strike another blow against Hilbert's dream (which Turing did using Cantor's diagonal argument). But more importantly, "Turing machines" outside of the theoretical concept, and even within Turing's paper, use formal and linear (which does not exclude loops) logic. ANNs are fundamentally different, and deliberately so.
yeah, but HOW so? for someone constantly dissing me for lack of knowledge, you kinda exhibit none of your own. explain to me how they're fundamentally different, instead of telling me "if I had a clue, I would know". I mean, what's your point in posting in the first place, then? does everybody have to take it at face value and that's that? why don't you get a blog, that would be better suited.
Unlike other programs, even extremely complex ones, ANNs are designed to "write" their own code in a sense.
lol. dude, I do know the basics of a neural net. and yes, "in a sense" is the keyword here haha.
They adapt to input in highly complex ways, making it impossible for the programmer to always know how or why certain changes in weights resulted or why the output was what it was, or to run through these changes "step-by-step" to find the answer.
that's just because it's too much information for a programmer to see -- it's basically just tables of numbers, after all, neuron weights etc. -- not because it's not a step by step process.
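to illustrate (made-up layer sizes again, nothing from any real model): the "hidden" state is literally a couple of arrays you can dump, or set a breakpoint on, whenever you like. access was never the problem; interpretation is.

[CODE]
# the "inscrutable" internals are just arrays you can print or break on at
# will; the hard part is interpretation, not access. made-up layer sizes.
import numpy as np

rng = np.random.default_rng(1)
weights = {"layer1": rng.standard_normal((3, 4)),
           "layer2": rng.standard_normal((4, 2))}

for name, W in weights.items():
    print(name, W.shape)
    print(W)                          # there it is: a table of floats

# and yes, a step debugger will stop right here if you ask it to:
# import pdb; pdb.set_trace()
[/CODE]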
you cannot take an algorithm running on a CPU that fetches data and instructions in a deterministic fashion and make it magically non-deterministic, and I am tired of your fluff by now. I never said it's determinable for us, I said the fact that it isn't doesn't mean it's not deterministic -- you weren't debating that
This isn't saying that they are indeterministic (although again, that has been suggested), but it does mean that your mocking comments about how it's just a matter of using a step-by-step debugging approach show that you don't understand how ANNs work.
you're really desperate for that, huh? no, it just means I correctly identified them as deterministic -- do you seriously think I was suggesting one should step-by-step debug a neural net to "understand why it does this or that" -- ?? :/
Only that isn't what I said:
If they say "under these circumstances it is deterministic, but under these we can't assert that it is" (which is exactly what they say) than yes, that does equal "we have found even just the slightest indication that it isn't." The whole reason to bring up determinism of this type of ANN was to note how under certain conditions it is deterministic, and under others that can't be said. If it can't be said, then they can't say that for a reason.
then gimme that reason, and cut the filler.
Given your mocking "step-by-step" solution to the problem faced by experts in mathematics, computer science, cognitive science, etc., you'd think that either they'd have figured out that all they had to do was use the debugging techniques everyone else has used for the past several decades, or it is relevant and you don't know what you are talking about.
again, you run for the strawman. at least you're not denying those advanced neural nets run on pretty much standard CPUs. so subtracting all the huffing and puffing, thanks for pretty much confirming all I said.