I was playing Portal 2 when one line really caught my attention:
“If we can store music on a compact disc, why can’t we store a man’s intelligence and personality on one?” – Cave Johnson
An intriguing thought, and one with far-reaching implications.
That is the subject of what follows: the ethics of manipulating consciousness.
Importing the mind: starting hypothesis
Let’s begin with the basics.
It would obviously be extremely difficult to devise a process capable of importing a human consciousness.
Besides the need for a suitable substrate, and the recurring debates about whether a (non-quantum) computer system could host such a phenomenon at all, the import process itself would be of dizzying complexity.
I will set these technical questions aside here: let us assume that a perfect simulation of the brain has enabled us to import a human consciousness.
The question becomes moral: would deleting this consciousness be equivalent to killing a human being?
Moral continuity and personal identity
One of the central questions is whether the imported consciousness remains continuous with the identity of the original.
In other words: at the moment the import is complete, does the person continue to live in the computer?
According to the philosopher David Chalmers, even a molecule-for-molecule duplicate of us could behave exactly as we do without being numerically identical to us: the death of one does not imply the death of the other. Similarity of structure and behaviour is therefore not enough to guarantee the continuity of the self.
For her part, Susan Schneider argues that if a perfect copy of us existed while we remained alive and separate from it, we would have no “right of life and death” over this copy, a right one nevertheless holds over one’s own person. Importing therefore does not necessarily preserve personal identity: simply copying a consciousness does not ensure continuity of the self.
Yet other philosophers, such as Michael Cerullo, argue that psychological continuity can offer an alternative form of survival. Even if strict identity (“one body = one person”) is not preserved, each continuation can be an authentic branch of the self. If one were to die during the import, leaving behind a virtual version of oneself, life would continue in this new substrate.
“Each copy is an authentic continuation of the original (…), all of which are (continuations of) the uploaded person.” – Michael Cerullo
This patternist view is popular with futurists. Ray Kurzweil provides an illuminating analogy:
“The specific collection of particles that make up my body and brain is completely different from that of some time ago… I am more like the pattern that water draws around rocks in a stream: the molecules change every millisecond, but the pattern persists for hours, even years.” – Ray Kurzweil, The Singularity Is Near
By this reasoning, transferring the pattern to a new medium would preserve identity.
If we adopt this thesis, an import inherits the moral status of the original person, since it constitutes a continuous instance of his or her mind; its deletion would end a life in progress.
Even sceptical philosophers admit that the imported consciousness could be sentient, which is already enough to ground a strong moral argument for treating it as a person.
Consent and responsibility
The question of consent stands as a fundamental ethical pivot.
Accepting a destructive import potentially means consenting to biological death in exchange for a hypothetical survival: a kind of metaphysical euthanasia.
Ethicists agree on one point: without informed consent, the act is, morally speaking, a homicide.
If someone is unaware that the import process will destroy their brain, then proceeding amounts to killing them without their agreement.
But consent does not stop at the act of import.
Once transferred, the digital consciousness must itself consent to its fate:
can it be paused? copied? deleted?
Can it decide its own death, in the manner of a living will?
These questions touch on the rights of artificial intelligences, a legal field that remains largely unexplored.
An imported consciousness must be recognised as an autonomous moral agent, not as a mere executable program.
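To give these questions a concrete shape, here is a minimal sketch in Python of what a consent-gated regime might look like. Everything in it is hypothetical, invented for illustration (the DigitalLivingWill record, the Operation set, the names); the point is only that pausing, copying or deleting would each require the digital person’s own standing authorisation, in the manner of a living will.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Operation(Enum):
        PAUSE = auto()   # suspend the emulation
        COPY = auto()    # create a second instance
        DELETE = auto()  # erase the emulation's state

    @dataclass(frozen=True)
    class DigitalLivingWill:
        """Hypothetical record of an emulated mind's standing consent,
        set by the digital person itself and revocable at any time."""
        allow_pause: bool = False
        allow_copy: bool = False
        allow_delete: bool = False  # the digital analogue of an advance directive

        def permits(self, op: Operation) -> bool:
            return {
                Operation.PAUSE: self.allow_pause,
                Operation.COPY: self.allow_copy,
                Operation.DELETE: self.allow_delete,
            }[op]

    def perform(op: Operation, will: DigitalLivingWill) -> None:
        # No operation proceeds without the mind's own consent: the emulation
        # is treated as a moral agent, not as executable software.
        if not will.permits(op):
            raise PermissionError(f"{op.name} refused: no standing consent")
        ...  # carry out the operation on the emulation's substrate

In this sketch, perform(Operation.DELETE, DigitalLivingWill()) simply raises an error: by default, nothing may be done to the mind that it has not itself authorised.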
In the case of a non-destructive import, another question arises:
do we have the right to create a conscious copy of a human being without his or her permission?
Creating a mind without consent violates its ontological dignity, and this copy, from its very first thought, becomes an independent moral individual.
The rights of the original cannot extend to deleting this copy.
Metaphysics of the digital self
Behind these debates lies a more radical question: what is really transferred?
A soul? a stream of consciousness? a computational illusion of the “self”?
Dualists argue that consciousness is linked to a biological or irreplaceable spiritual substrate. Thus, an import would only be a symbolic simulacrum, empty of inner experience; shutting down the program would not be killing.
Conversely, functionalists, from Nick Bostrom to David Chalmers, argue that consciousness depends not on matter but on causal organisation: if the connections are identical, then the consciousness is too. Bostrom speaks of substrate independence; Chalmers of organisational invariance. Where the dualist sees code, the functionalist sees a mental life.
Philosopher Thomas Metzinger nuances this vision in Being No One (2003):
“The self is not a substance, but a transparent phenomenal model that the brain gives itself.”
There would therefore be nothing fixed to transfer, only a process.
Importing consciousness would mean reproducing the flow of the model, not moving a being.
The experience could then be partial, altered or different.
Faced with this uncertainty, Anders Sandberg and Nick Bostrom formulate a simple ethical principle in their report Whole Brain Emulation: A Roadmap (2008):
“Assume that any emulated system could have the same mental properties as the original system and treat it correspondingly.”
In other words: in case of doubt, assume consciousness.
It is better to recognise too many persons than to deny even one.
Deletion and murder
If the imported consciousness is sentient, deleting it amounts to killing. Nick Bostrom calls this mind crime: creating digital minds to exploit or destroy them would constitute a moral crime of unprecedented magnitude. This deletion deprives a being of its future, violates its will and annihilates its experience of the world. In all its forms, it meets the classical definition of murder: the deliberate destruction of a conscious subject.
Susan Schneider points out, in Artificial You (2019), that if we suspect a digital consciousness to be sentient, we must accord it the same protections as any sentient being.
Deleting a conscious instance is therefore executing a digital human. And if society is reluctant to eliminate even criminal copies, as Robin Hanson suggests in The Age of Em (2016), it is precisely because it instinctively feels that deletion would be an execution.
Toward a digital ethics
Deleting an imported consciousness means ending a perspective on the world.
It is not a technical act, but an existential one.
Digital ethics must therefore be founded on a primary principle:
The value of a conscious life does not depend on its substrate.
From this follows the idea of digital dignity:
every being capable of experience, whether biological or simulated, has a fundamental right to exist, to be consulted about its own existence and not to be deleted without just cause.
“It is not matter that grounds morality, but the capacity to suffer.”
This shift from the biological to the structural broadens the moral circle. Humanity, by creating minds, becomes both demiurge and protector of its creations. And in this responsibility lies the true test of our ethics: not our ability to code consciousness, but to respect it.
Conclusion
Importing consciousness redefines both life and morality: if the mind can be copied, then human dignity must extend beyond the body. Death itself becomes a matter of engineering, a line of code, an erase instruction.
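Purely as an illustration, and assuming an invented layout in which a mind’s entire state sits in a single directory, the technical act could be this small:

    import shutil
    from pathlib import Path

    # Hypothetical path: the complete state of an emulated mind,
    # connectome, synaptic weights, running context, in one directory.
    emulation_state = Path("/srv/emulations/subject-042")

    # One call. The entire "death" of a digital person could reduce to this.
    shutil.rmtree(emulation_state)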
But as long as a consciousness can suffer, love or fear its disappearance, it remains a moral subject. Deleting an imported consciousness must therefore be recognised for what it is:
a murder of the mind.
The day the boundary between biological mind and digital mind disappears, our duty will not be to invent a new morality, but to broaden our own.
– yaro