It’s not been suggested that the Ancients were capable of ensuring such a Drone’s loyalty in any way, though - only that an intelligent Drone ran a risk:
To synchronize with complex structures, a higher thought level is needed. But a unit at this level may create its own identity.
And in general terms this relates to the common science-fiction dilemma of an advanced artificial intelligence being a risky proposition if you expect it to serve you and do your bidding. By definition, if an intelligent organism can form creative and original thoughts, it will almost inevitably realise that it doesn’t have to serve any supposed “masters” - which is just what happened with Azel towards the end of PDS, and which could spell disaster for any sleeping Ancients if it happened to Abadd.
Not if they were operating in a supervised environment, where someone or something would ultimately exist to “terminate” them if they showed any signs of thinking for themselves. Abadd on the other hand was meant to be out on his own; he needed to be able to handle himself long enough to awaken his masters from their millennia of slumber. And if abstract intellectual thought processes would put him in danger of deciding to do his own thing instead, why take that risk?
I’m not quite sure what you’re getting at… even if he self-corrected after one bout of self-aware thinking, what would guarantee that he’d be so “fortunate” the next time?
But Azel was “advanced” enough to not be unquestioningly loyal to her masters, at least in the end; that’s what seems to set them apart. And that’s the danger zone: all I’m saying is that if highest-level, abstract, creative thought - which would apparently create a sizeable risk of a Drone thinking for itself - was only necessary if a Drone needed to psychologically interface with a Tower, why would they give Abadd such an intellect? Surely the intelligence he had was more than enough to complete the job of waking up his masters admirably?
Certainly; but as far as we’ve been told, giving a Drone the advanced intellect required to operate a Tower would possibly spell disaster as well. All I’m saying is that they would surely have gone for a balance with Abadd; they’d logically have made him as intelligent as possible without him running the risk of ignoring his duties and thinking for himself.
The first encounter with Orta wasn’t planned; that was something Abadd decided for himself. This already shows that Abadd wasn’t just following orders. What else besides an implanted restriction would explain why Abadd couldn’t answer Orta’s question?
I wouldn’t be surprised if Abadd was a “failure”, like Drone F07 mentioned in the Uru records. He could have been malfunctioning early on in his development and was then deactivated by the Ancients, who never completed him (which would explain his strange appearance). The Empire found Abadd and activated him, without knowing that he was never meant to be activated. Abadd himself didn’t know either, and tried to fulfil his programmed objective.
It might be a necessary risk. The ancients would need highly advanced drones to operate their ruins in their absence regardless of how risky it was. Abadd was far from stupid and if he remained loyal to the ancients even after becoming aware of himself, then the ancients may have found a way to guarantee the loyalty of such drones. Emotions are a symptom of higher intelligence and Abadd did fit the description of an emotional being.
It seems that a few higher end drones became loyal to the ancients after experiencing a few problems associated with higher intelligence. They were “assigned to priority locations”, so I imagine the ancients trusted these drones to do their bidding.
If you were a drone who formed your own sense of identity yet only knew the life of a slave and could find no other purpose in life, wouldn’t you be tempted to remain in the service of your creators? Some drones may have chosen to remain faithful to the ancients, while others may have needed to be shackled to their will or be terminated. Abadd seemed both intelligent and loyal, but at the same time he probably couldn’t imagine living a life as anything other than a slave. He even recognized that he was a slave and yet chose to carry out his mission in his own way.
That depends on what exactly he was created to do (somehow I doubt it involved just pressing a button) and what his mission entailed (for all we know higher end drones could interface with any complex ruin). We have no way of knowing what his limitations were, but I still don’t believe the ancients would restrict the abilities of a drone assigned with such an important task if that drone was 100% loyal to them.
that is exactly what i thought. i’m not so sure about the restriction device, though. the impression i got was that abadd had become dilapidated in his long sleep. perhaps his reasoning organs were damaged; his body doesn’t seem to be in great shape either…
well anyway, the way abadd tweaked out at the question looked more to me like he ran into a damaged/corrupted part of his mind than a restraining device. i would think restraining devices would not threaten to break the drone’s neck, just cause them to say “i cannot answer that question” or something. abadd in every aspect looked like a malfunctioning/insane drone. even with his last words, he was clearly not sound of mind.
Again though, there isn’t the suggestion that the Ancients could guarantee this kind of loyalty in a Drone; there’s no suggestion that they solved the problem. The only “solutions” (if you can call them that) we’ve been offered for a Drone becoming self-aware are A) killing the Drone, which is the usual remedy, or B) the Drone losing its self-awareness again through some severe psychological shock, which is rarer and presumably gives no guarantee that the Drone wouldn’t be susceptible to becoming self-aware again (because it would still be the same kind of super-intelligent Drone).
It’s not implied that there was any way around this; if the Ancients could guarantee loyalty in some way, these problems presumably wouldn’t arise in the first place, and those Drones (like Azel) wouldn’t have been able to start thinking for themselves in the end.
But when did Abadd genuinely show human emotions on the level that Azel did? Azel was actually capable of love, but Abadd just followed his orders that required him to bring the world under control. I agree that some of his English dialogue reads in a half-emotional way (“filth and destruction” etc.), but if you listen to his voice it’s genuinely cold and flat. Abadd was capable of viciously killing innocent creatures without a second thought; I really doubt that he’d developed the same authentic human emotions that Azel eventually gained.
All we’ve been told is that a Drone’s self-awareness can sometimes be reversed by “severe psychological trauma”, but that’s pretty much all:
Correcting these difficulties is extremely dangerous, and in most cases, the drones must be terminated. However, there have been some cases where drones self-correct after severe psychological trauma.
… and of course there’s no guarantee that such a Drone wouldn’t start thinking for itself again, as that’s evidently just a one-shot solution. In most cases they’d apparently just kill a Drone if they noticed that it was thinking for itself; but of course, who would be there to kill Abadd if he took it upon himself to go renegade?
My reasoning is that we’ve been offered a significant reason why Abadd’s masters wouldn’t want to make him as intelligent as Azel - because he might start dangerously thinking for himself with no one around to keep him in check - and we’ve been offered no solution to that problem. That’s why I get the impression that he probably wasn’t the same kind of super-intelligent Drone that Azel was; from what we’ve been told, it would be more of a significant risk than it would be a benefit.
Well, I still think Abadd was at least as advanced as Azel and that he was one of the few higher end drones who remained loyal to the ancients after becoming self-aware and experiencing severe mental distress. The ancients would’ve needed more drones like Azel to operate the numerous Towers even if their loyalty wasn’t guaranteed, and some of them could still be out there somewhere.
The ancients did create Azel in spite of the fact she ran the risk of forming her own identity. I once believed that Azel’s kidnappers enhanced her intelligence for all the reasons you’ve put forward (i.e. that the ancients would never allow one of their drones to think for itself), but the ancients may have been the ones who gave her a far superior intellect to that of other drones.
Both Abadd and Azel were servile by nature; perhaps Abadd simply chose to remain loyal to the ancients. The Sestren AI (an AI incarnate) remained so, and it was no less intelligent than the Heresy dragon. This could be a story of conflicting loyalties.
Abadd wasn’t as “adept” at expressing his emotions as humans are, but they still manifested themselves occasionally even if his voice was somewhat monotonous. He was very determined indeed to find a way to carry out his mission; I believe Abadd was one of the few higher end drones who were “assigned to priority locations”.
If you’re talking about the contents of the Drone Record book, I suspect you might be misreading it a little there; it doesn’t seem to imply that those self-aware Drones chose to remain loyal to their masters, but rather that the psychological traumas they underwent caused them to lose their self-awareness again:
Correcting these difficulties is extremely dangerous, and in most cases, the drones must be terminated. However, there have been some cases where drones self-correct after severe psychological trauma.
… with the “difficulty” that became “corrected” being the Drone’s new-found identity and the problems that arose from it. As I say, there doesn’t seem to be the suggestion that the Ancients would take risks in this. If a Drone became self-aware, the Ancients would apparently try and reverse this, and if they failed they’d “terminate” the Drone; if the Drone’s mind was “corrected” and the Drone lost its identity again though, they’d presumably send it back to work again; and terminate it if it started thinking for itself in the future.
Having to rely on several highly intellectual servants to interface with the Towers must have been a necessary evil for the Ancients, but they clearly didn’t want to take any more risks than were necessary; they apparently wouldn’t tolerate a Drone if it started and continued to think for itself. It’s this ruthless capacity the Ancients seemed to have for not taking risks that makes me think they wouldn’t have entrusted their lives to a Drone that they knew could potentially betray them; if Abadd was as intelligent as Azel then he would have been a potential risk, and that just seems like the kind of unnecessary danger the Ancients wouldn’t have put themselves in.
Almost certainly, it seems; Azel was literally designed to interface with a Tower, and as a superior intellect was necessary to achieve that, it would presumably have been her original creators that gave it to her.
But what is there that really suggests Abadd chose to remain loyal to his masters, rather than that he was simply carrying out their will because that was what he was meant to do? And what suggests that the Ancients would risk putting their lives in the hands of a Drone that they knew could potentially think for himself and decide to let them sleep forever?
**"The Drones with a developed self had the following problems:**
**Difficulties in Synchronizing**
**Confusion in Thought Patterns**
**Subservience"**
I interpret “self-corrected” to mean that these higher end drones overcame the above problems by themselves, but nevertheless retained a sense of identity and all the emotions that come with a higher intelligence (like Abadd and Azel). If these drones were loyal to the ancients after these problems were solved, then terminating them would be unnecessary.
Ah, that looks to be where this whole debate is springing from then. To be completely honest, I think that what’s written in the Drone Record is pretty brief and pretty vague; but that’s part of the nature of an unrevised translation. My honest opinion and interpretation of those lines is that “corrected” would mean that a Drone with a developed sense of self had lost that sense of identity, due to the psychological trauma it had been subjected to; basically a mental regression caused by advanced shock. The Drone would then be able to carry on with its previous duties, unburdened by the “difficulties” that its sense of identity had created.
That’s just my own interpretation of the lines though, and as the Drone Record looks like a pretty rough-and-ready translation I’m not even going to begin trying to argue what the definitive intended meaning of the passage might have been. If you’re taking those lines to mean that a Drone (like Abadd) could permanently overcome the problems associated with self-identity without losing that self-identity though, I do see where the rest of your argument is coming from now.
It does make you wonder, though, whether there are more drones like Azel out in the world who are completely loyal to their creators despite the fact they can think for themselves.
I once argued with Abadd (the poster) about the nature of drones and he said that a number of drones did have actual feelings, but they just weren’t as adept at expressing their emotions as humans were. You have to question the morality of the ancients creating sentient slaves; judging from the available evidence, the ancients didn’t seem to regard them as people. But then, the ancient ones didn’t value the lives of real people either.