Unless you believe in a transcendent soul that could be the source of these sensations or feelings, this assertion doesn't make sense. If there is no supernatural soul, then everything humans experience necessarily arises from the human body.
This entire notion of qualia is a philosophical quagmire predicated on the idea that if you can imagine something, it must be possible ("we can imagine a zombie that behaves exactly like a human but doesn't have qualia at all"). It's about as laughable as the ontological "argument from perfection" for the existence of a god, which likewise tries to infer existence from what we can conceive.
> You just need biological machinery and computers are not biological machinery capable of producing sensations.
This is a postulate, not an argument. My contention is that the concept of qualia is meaningless: like saying there is such a thing as "what it feels like to compute the number 1000" for a processor, or "what it feels like to be really hard granite" for a piece of granite. Just because we can put it into words doesn't mean it makes sense.
All of the conundrums about qualia go away if we just accept this. Alice (the protagonist of Mary's Room-style thought experiments) would not in fact experience anything new when she saw red for the first time, if she already knew everything about human cognition and the physical properties of the color red.
I do absolutely agree that we know almost nothing about how these processes actually happen in the brain, and that most attempts at AI and bombastic predictions about replacing humans are off the mark by centuries. But that is no reason to assume that what goes on in animal brains is anything other than computation, in the broad sense of the Turing machine model.