Machines can become "self-aware", but you have to give them ample programming for that to actually happen.
A human brain has "ample programming" in the form of genetic information; some of our behaviour is based on instinct. Using "programming" as a term that implies something is lesser than something else is not a valid argument; it's based on feeling. The fact that something is programmed doesn't lower its value unless you choose to see it that way. That's a very subjective view, and you're mistaking it for an objective fact. It's not.
A machine can't do anything you don't give it the hardware to do, which is why I find the idea of machines just becoming self-aware utter nonsense.
A human is also subject to its hardware. You're making another mistake here: you're assigning value to the word "hardware" and implying that it's somehow lesser than "a bunch of meat". Another subjective view misinterpreted as objective fact.
A machine is nothing without its hardware. The same goes for humans; just replace "hardware" with whatever applies to organics.
Machines wouldn't "just" become self-aware. Nothing "just" happens.
You can't just have a full range of emotions out of thin air.
Self-awareness is not an emotion, and emotions are not a prerequisite for self-awareness. Besides, you can't have anything out of thin air unless you have the means to change the composition of "thin air" and transform it into something else. So it's not a point with much value.
The idea that a washing machine can just suddenly feel joy is on the same level as a robot just suddenly feeling joy.
Why does it have to be "suddenly"? You're implying that it would happen accidentally, "out of nowhere", which makes very little sense. Humans didn't just suddenly start feeling joy either... Nothing happens "suddenly out of nowhere", even if from your perspective it might look that way. Usually it only looks that way when you don't fully understand the issue you're thinking about...
I've heard things like this and the singularity called "the atheist's fantasy."
I've heard people like you use that label to make yourselves feel better about your lack of knowledge: you are again trying to reduce something's value to the level of your subjective assessment. An OBJECTIVE assessment would be this: if you think it's fantasy, it's most likely because you don't fully understand it. And that does seem to be the case here.
I find it VERY unimaginative to declare something impossible "just because", especially on a forum where many believe in the supernatural... I mean, you call it fantasy, but you don't apply the same level of criticism to your own holy scriptures, whatever they might be.
/E: I'd like to add that a machine's self-awareness and sentience need not be comparable to a human's in order to count as... well... self-awareness or sentience. Anthropomorphizing is our mistake. You could build a functional "analogue" of something out of something totally different. Imagine that instead of neurons you have, for example, transistors. You'd need a much larger surface area than a human brain built from neurons, but it could still be done. It's our own fault for thinking that everything must be human-like.
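To make that "functional analogue" point concrete, here's a minimal sketch of a McCulloch-Pitts artificial neuron in Python. It plays the same integrate-and-fire role a biological neuron does, just in a different substrate; the weights and threshold below are made-up illustrative values, not a model of any real brain.

```python
# Minimal sketch of the "functional analogue" idea: a McCulloch-Pitts
# artificial neuron. Weights and threshold are illustrative values only.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold.
    Same input-integrate-fire role as a biological neuron, realized in a
    completely different substrate."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Example: a two-input neuron wired to behave like an AND gate.
print(neuron([1, 1], [0.6, 0.6], 1.0))  # fires: 1
print(neuron([1, 0], [0.6, 0.6], 1.0))  # stays silent: 0
```

Chain enough of these together and you get circuits that compute, with not a single organic molecule involved. The function is what matters, not the material.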