“Does she touch you … like this?”
“Ah … ah yes, very much like that. Just … like that.”
“And kiss you, like this?”
“… Somewhat like that. Only a little like that.”
“Only a little?”
“She kisses me differently. Oh, this is not … I shouldn’t … I really shouldn’t be telling you any of this. This is so … You are the most terrible man.”
“I know, I know. I hate myself. Am I not just the most terrible, terrible man?”
“You are, I don’t know … Oh, now what?”
“Let me see … does she kiss you like that?”
“No. Again, no, not quite like that.”
“In some other way?”
“In some other way.”
“How many ways are there, to kiss? I … I really have no idea. I am not so well versed as you might … let’s see …”
“… Well, then. Now. She kisses, let me see … more lightly, just as … passionately, but with less, less … less intensity, less muscularity.”
“Muscularity?”
“I think that is the word.”
“And, with the touching, is it like this …?”
“Oh, ah, yes, sometimes, though …”
“Yes?”
“Her hands, her fingers.”
“Like this?”
“No, not … something like … her hands, see, are more slender, the fingers are longer, they are more delicate. Yours are … fuller, more …”
“Filling?”
“Yes. And grasping.”
“Grasping? Grasping? I’m shocked, Virisse! Am I really grasping?”
“Ha, I mean … hungry, given to clutching, gripping, even grabbing.”
“And now grabbing! Good gracious!”
“You grabbed me, don’t you remember? That first time, when we were in the garden of the parliament. That evening? You said you wanted to talk about some long-term aspect of her schedule, do you remember?”
“Of course I remember. Here is more comfortable, though, don’t you think? I swear if we weren’t all departing so shortly I might have to have this bed reinforced, just to cope with our exertions.”
“Her birthday, the following year. You said you wanted to plan something special for her because it would be her sixtieth. Then. That was when you grabbed at me, almost as soon as you had me alone, in that night-shady bower.”
“I grabbed you? Are you sure this was me?”
“Oh, now. Who else? You know this. You did.”
“I thought you wanted to be grabbed.”
“I might have.”
“Just as well I did, then, wouldn’t you say?”
“I think I will say anything, when we are like this, when you hold me like this.”
“Really? I must think up something extremely terrible, then, to exploit that admission.”
“You mustn’t. You can’t. I’m at your mercy; it would be cruel, wrong.”
“To the contrary! It would only be right. You offer me this, I must take it. You lay yourself open; I must in.”
“Ah. Ah … yes. Oh, dear Scribe … But … not anything. Not anything at all. I am not so totally … I am not so … I am not …”
“Not what, Virisse?”
“I can’t remember. I have forgotten what I am not.”
“Well. Better that than forgetting what you are.”
“You steal all I have away, like this, when we lie like this. I feel laid bare, all washed away.”
“Does she do this? Does she have that same effect?”
“Oh, my love, why do you always have to ask me to make these …?”
“Because I’m fascinated. Everything about you fascinates me. How can I not be consumed by … not envy, but interest, to know how you lie with her, how much of what you do with me is what you do with her?”
“Is it not enough to know that you and I do what we do? Do we have to compare? Must we always compete?”
“How can we not? The urge to compare and compete is as basic as any. As basic as this.”
“Must it be, though?”
“It is; that’s all that matters. And does she touch you like this?”
“Oh. Oh. Oh. Yes, yes she does.”
“How I’d love to compare. How I’d love to see. How I’d love to watch.”
“See, my love? Watch?”
“Is that too much to ask?”
“Prophet’s kiss, Septame! You want the three of us …?”
“Septame, is it, now! Why …”
“Oh, don’t stop, don’t stop to laugh. Laugh if you must, but don’t stop.”
“Well, I plough on. But no, I didn’t think to suggest that she and I might have you both, together.”
“No. That would be too … I couldn’t … anyway, she would never …”
“No, of course not. When I have you, I want you all to myself. And I want you to have me, all to yourself. I’d have no dilution of this … concentration, this … tenacity.”
“What, then?”
“Just to watch, just to see you with her.”
“She still would not.”
“I know. I wouldn’t expect her to. And secreting myself within some hidden space, like a courtier in some ancient tragedy, would be absurd.”
“Put it from your mind, my love. Concentrate on this, on now, on us.”
“The more I do, the more I want to see you and know you in all your states, in all your moods and passions, and that must include with her. Just once, just to see, later, not at the time. And it would be so easily accomplished; I can source the sort of means no civilian or journalist can find.”
“Oh, dear gods, you’ve thought about this. You’re serious! No, no, still; don’t stop …”
“Serious, ardent. Please. It would mean so much.”
“It might mean my job! My career; all my standing. She’s the president!”
“She is president for thirteen more days, as I am a septame for thirteen more days. All that means nothing then and already starts to mean even less now. What does matter is that she is a woman, you are a woman and I am a man.”
“But still …”
“Still nothing, my love, my beautiful love. We strip off our importance with our clothes. That’s all that matters; not our titles. They only have meaning in public, not in moments like these, when we are purely, perfectly ourselves. Only a man – a weak man, a hopelessly curious, desperate-to-know man – asks this of you, my Virisse, not a politician. Just a man. Your man, your man, your man.”
“But if … if … if …”
“I’d protect you. There would be no risk. And so close to the end of this world, the start of the next, who really cares? All the rules are questioned now, all the laws loosened. Everything is licensed. Like this, and this, and this.”
“Oh. Oh. Oh.”
“And I’d make sure nothing would happen to you. I swear; I swear I swear I swear. Even if anything was found, it would never be traceable to you or I. Say you will. Say yes. Say it for me. Say you will. Say yes, say yes, say yes. Say it.”
“… Yes …”
The Simming Problem – in the circumstances, it was usually a bad sign when something was so singular and/or notorious it deserved to be capitalised – was of a moral nature, as the really meaty, chewy, most intractable problems generally were.
The Simming Problem boiled down to, How true to life was it morally justified to be?
Simulating the course of future events in a virtual environment to see what might happen back in reality, and tweaking one’s own actions accordingly in different runs of the simulated problem to see what difference these would make and to determine whether it was possible to refine those actions such that a desired outcome might be engineered, was hardly new; in a sense it long pre-dated AIs, computational matrices, substrates, computers and even the sort of mechanical or hydrological arrangements of ball-bearings, weights and springs or water, tubes and valves that enthusiastic optimists had once imagined might somehow model, say, an economy.
In a sense, indeed, such simulations first took place in the minds of only proto-sentient creatures, in the deep pre-historic age of any given species. If you weren’t being too strict about your definitions you could claim that the first simulations happened in the heads – or other appropriate body- or being-parts – of animals, or the equivalent, probably shortly after they developed a theory of mind and started to think about how to manipulate their peers to ensure access to food, shelter, mating opportunities or greater social standing.
Thoughts like, If I do this, then she does that … No; if I do that, making him do this … in creatures still mystified by fire, or unable to account for the existence of air, or ice, above their watery environment – or whatever – were arguably the start of the first simulations, no matter how dim, limited or blinded by ignorance and prejudice the whole process might be. They were, also, plausibly, the start of a line that led directly through discussions amongst village elders, through collegiate essays, flow charts, war games and the first computer programs to the sort of ultra-detailed simulations that could be shown – objectively, statistically, scientifically – to work.
Long before most species made it to the stars, they would be entirely used to the idea that you never made any significant societal decision with large-scale or long-term consequences without running simulations of the future course of events, just to make sure you were doing the right thing. Simming problems at that stage were usually constrained by not having the calculational power to run a sufficiently detailed analysis, or disagreements regarding what the initial conditions ought to be.
Later, usually round about the time when your society had developed the sort of processal tech you could call Artificial Intelligence without blushing, the true nature of the Simming Problem started to appear.
Once you could reliably model whole populations within your simulated environment, at the level of detail and complexity that meant individuals within that simulation had some sort of independent existence, the question became: how god-like, and how cruel, did you want to be?
Most problems, even seemingly really tricky ones, could be handled by simulations which happily modelled slippery concepts like public opinion or the likely reactions of alien societies by the appropriate use of some especially cunning and devious algorithms; whole populations of slightly different simulative processes could be bred, evolved and set to compete against each other to come up with the most reliable example employing the most decisive short-cuts to accurately modelling, say, how a group of people would behave; nothing more processor-hungry than the right set of equations would – once you’d plugged the relevant data in – produce a reliable estimate of how that group of people would react to a given stimulus, whether the group represented a tiny ruling clique of the most powerful, or an entire civilisation.
But not always. Sometimes, if you were going to have any hope of getting useful answers, there really was no alternative to modelling the individuals themselves, at the sort of scale and level of complexity that meant they each had to exhibit some kind of discrete personality, and that was where the Problem kicked in.
Once you’d created your population of realistically reacting and – in a necessary sense – cogitating individuals, you had – also in a sense – created life. The particular parts of whatever computational substrate you’d devoted to the problem now held beings; virtual beings capable of reacting so much like the back-in-reality beings they were modelling – because how else were they to do so convincingly without also hoping, suffering, rejoicing, caring, loving and dreaming? – that by most people’s estimation they had just as much right to be treated as fully recognised moral agents as did the originals in the Real, or you yourself.
If the prototypes had rights, so did the faithful copies, and by far the most fundamental right that any creature ever possessed or cared to claim was the right to life itself, on the not unreasonable grounds that without that initial right, all others were meaningless.
By this reasoning, then, you couldn’t just turn off your virtual environment and the living, thinking creatures it contained at the completion of a run or when a simulation had reached the end of its useful life; that amounted to genocide, and however much it might feel like serious promotion from one’s earlier primitive state to realise that you had, in effect, become the kind of cruel and pettily vengeful god you had once, in your ignorance, feared, it was still hardly the sort of mature attitude or behaviour to be expected of a truly civilised society, or anything to be proud of.
Some civs, admittedly, simply weren’t having any of this, and routinely bred whole worlds, even whole galaxies, full of living beings which they blithely consigned to oblivion the instant they were done with them, sometimes, it seemed, just for the glorious fun of it, and to annoy their more ethically angst-tangled co-civilisationalists, but they – or at least those who admitted to the practice, rather than doing it but keeping quiet about it – were in a tiny minority, as well as being not entirely welcome at all the highest tables of the galactic community, which was usually precisely where the most ambitious and ruthless species/civs most desired to be.
Others reckoned that as long as the termination was instant, with no warning and therefore no chance that those about to be switched off could suffer, then it didn’t really matter. The wretches hadn’t existed, they’d been brought into existence for a specific, contributory purpose, and now they were nothing again; so what?
Most people, though, were uncomfortable with such moral brusqueness, and took their responsibilities in the matter more seriously. They either avoided creating virtual populations of genuinely living beings in the first place, or only used sims at that sophistication and level of detail on a sustainable basis, knowing from the start that they would be leaving them running indefinitely, with no intention of turning the environment and its inhabitants off at any point.
Whether these simulated beings were really really alive, and how justified it was to create entire populations of virtual creatures just for your own convenience under any circumstances, and whether or not – if/once you had done so – you were sort of duty-bound to be honest with your creations at some point and straight out tell them that they weren’t really real, and existed at the whim of another order of beings altogether – one with its metaphorical finger hovering over an Off switch capable of utterly and instantly obliterating their entire universe … well, these were all matters which by general and even relieved consent were best left to philosophers. As was the ever-vexing question, How do we know we’re not in a simulation?
There were sound, seemingly base-reality metamathematically convincing and inescapable reasons for believing that all concerned in this ongoing debate about simulational ethics were genuinely at the most basic level of reality, the one that definitely wasn’t running as a virtual construct on somebody else’s substrate, but – if these mooted super-beings had been quite extraordinarily clever and devious – such seemingly reliable and reassuring signs might all just be part of the illusion.
There was also the Argument of Increasing Decency, which basically held that cruelty was linked to stupidity and that the link between intelligence, imagination, empathy and good-behaviour-as-it-was-generally-understood – i.e. not being cruel to others – was as profound as these matters ever got. This strongly implied that beings capable of setting up a virtuality so convincing, so devious, so detailed that it was capable of fooling entities as smart as – say – Culture Minds must be so shatteringly, intoxicatingly clever they pretty much had to be decent, agreeable and highly moral types themselves. (So; much like Culture Minds, then, except more so.)
But that too might be part of the set-up, and the clear positive correlation between beings of greater intellectual capacity taking over from lesser ones – while still respecting their rights, of course – and the gradual diminution of violence and suffering over civilisationally significant periods of time might also be the result of a trick.
A bit, after some adjustments for scale, like the trick of seeding another society with the ideas for a holy book that appeared to tell the truth on several levels but which was basically just part of an experiment, the Contents May Differ thought, as it reviewed the results of the latest sim runs.
The sims it was setting up and letting run were all trying to answer the relatively simple question, How much difference will it make if the Gzilt find out the Book of Truth is a fake?
And the answer appeared to be: Who the fuck knows?
Once you started to think that the only way to model a population accurately would be to read the individual mind-states of every single person within the real thing – something even more immoral than it was impractical – it was probably time to try another approach entirely.
As a good, decent, caring and responsible Culture Mind, the Contents May Differ would never run a sim of the Gzilt people at the individual level to find out anyway, even if it could have, and – apart from anything else – had decided some time ago that even resorting to such desperate measures wouldn’t solve anything in any case. Because there were two Problems: the Simming Problem and the Chaos Problem.
The Chaos Problem meant that in certain situations you could run as many simulations as you liked, and each would produce a meaningful result, but taken as a whole there would be no discernible pattern to them, and so no lesson to be drawn or obvious course laid out to pursue; it would all depend so exquisitely on exactly how you had chosen to tweak the initial conditions at the start of each run that, taken together, they would add up to nothing more useful than the realisation that This Is A Tricky One.
The real result, the one that mattered, out there in reality, would almost certainly very closely resemble one of your simulated results, but there would have been no way at any stage of the process to have determined exactly or even approximately which one, and that rendered the whole enterprise almost entirely futile; you ended up having to use other, much less reliable methods to work out what was going to happen.
These included using one’s own vast intelligence, pooled with the equally vast intelligences of one’s peers, to access the summed total of galactic history and analyse, compare and contrast the current situation relative to similar ones from the past. Given the sort of clear, untrammelled, devastatingly powerful thinking AIs and particularly Minds were capable of, this could be a formidably accurate and – compared to every other method available – relatively reliable strategy. Its official title was Constructive Historical Integrative Analysis.
In the end, though, there was another name the Minds used, amongst themselves, for this technique, which was Just Guessing.
* * *
The mount was called Yoawin. It was old and in no particular hurry, though still strong and tireless. Well, as tireless as Tefwe needed it to be; she got weary and saddle-sore before it started showing signs of complaint.
Tefwe had chosen an old-fashioned saddle: tall and unwieldy if you were planning on performing any fancy stuff, but comfortable. Comfortable for the aphore as well as her; you always had to think of your mount. They’d had intelligent animals for hire at the stables in Chyan’tya, too; ones you could talk to, both amended bio and what were effectively walking, talking slightly dumb drones made to look biological. She guessed talkative people might hire those. Maybe people who were so talkative they couldn’t persuade other humans to ride with them. But a talking mount had always struck Tefwe as taking things a little too far. Aphores were quite smart enough, and sufficiently companionable to provide a sort of silent friendship.
They’d arrived in the middle of the night, on this side of the Orbital. She left at first light, riding out through the quiet town. There was some sort of festival happening during the day and some flower garlands, stretched across the street leading out of town towards the hills, had sagged with the dew during the night; she had to lift them out of the way to get underneath. One pale blue flower, loosened, started to fall. She caught it, sniffed it, stuck it in her hair, rode on.
The town was much as she’d remembered. It sat like a rough brush stroke along one side of the Snake river, cliffed on the shifting sands of tawny and grey-pink that marked the desert edge; a fragrant oasis of bell-blossom and strandle flower, even-cluss and jodenberry, the low, flowing buildings half submerged by their own orchards and groves.
Across the river, past some stunted, half-hearted dunes and the silted-up entrance to a long-dry oxbow lake, the brush and scrub of the low prairie began. The few scattered bushes looked like an after-thought to the land: quick, light scribbles of brittle-dry vegetation, prone to fires that in the right wind could move so fast you were better turning to face their heat and running straight through, because you’d never outrun them.
The river was very low; just a trickle at the bottom of the crack-dried muds and in-flowed spills of sand like fanned ramps. High season. The rains would come in a third of a year from now, falling in the Bulkheads, which were so far off that even in the clearest weather you could strain your eyes for ever and you’d never see them, night or day, not this far down in the thick of air.
A few tens of days later, after the rains started falling in the high lands of the Honn-Eynimorm Bulkhead Range, the river would swell, generally pushing a plugged mess of old leaves, scrawny twigs, gnarled branches, stripped tree trunks and the bleached hides and bones of dead animals before it, like a moving barricade of half-forgotten decrepitude and death.
She rode out across the Pouch, the bay of desert and patchy set-sand that lay between the river and the town on one side and the hills on the other. A pair of raptors wheeled high up, following her for half the morning, then found something else to watch, kilometres off anti-spinwards. She lost sight of them in the building heat of the cloudless day.
Her backside got sore; she glanded numb, rode on.
At the height of the day, a desiccant umbrel provided shade for her and the aphore, once she’d chased a snoozing misiprike away. It had been fast asleep; she’d had to clap her hands and holler to get it to wake while she and the suddenly nervy aphore stood, only ten metres away.
The misiprike had looked up, risen tiredly to its feet and loped off. It stopped once to snarl back at them as though it had only just remembered it was supposed to be a fearsome creature, then padded loose-limbed over the frozen waves of set-sand.
She fed the aphore, her stomach rumbling, then ate, drank. Even in the shade it was very hot; she and the mount snoozed.
She’d missed the descent; usually she liked being dropped onto the inside of an Orbital, rather than taking the conventional approach of coming up from underneath. A descent meant you got to see the overview of the place you were coming to; a real view, even if it was through a screen, not a pretend one.
She’d been born and raised on a planet. This was a rare thing, in the Culture; even rarer than being born on a ship, and that was pretty rare in itself. Things had been like this for millennia. Anyway, she ascribed whatever eccentricities she was prone to displaying to that oddity of birthright. She’d spent more time on Os than anywhere else – spent centuries living on them – but still she couldn’t help but think of planets as normal and Orbitals as somehow aberrant, for all that the artificial worlds far outnumbered the naturally inhabitable ones.
In the end it was kind of unarguable that planets were natural and Orbitals were artificial, though she supposed that, when you thought about it, it was really no more natural to live on the outside of a huge, skein-warping sphere of rock, held to its surface – like the atmosphere – by nothing more than gravity, than it was to live on the inside of an O, held there by spin, with the atmosphere held down by the same movement and stopped from spilling off the edges by retainment walls of diamond film and fate knew what exoticism of material and field.
She drifted in and out of her snooze, and, while still dozily waking up, asked her ancient pen terminal whether anybody had ever built a planet, or discovered a natural Orbital. Yes to the former, though rarely, and not for aeons. A straight no to the latter.
“There you are,” she told Yoawin as she got her to stand and busied herself re-setting the animal’s bridle. “Planets not guaranteed totally natural after all.”
The aphore snorted.
That night, at the start of the hills – maybe a kilometre in and a hundred metres up, by a dry water bowl – Tefwe slept under the stars and the bright, lit band of the Orbital’s far side.
That was the really unnatural sign of an O, she supposed. You might not notice the inside-out curvature at any time, or come to an access tube that would let you drop the hundred metres or whatever to the world’s space-exposed under-surface, or see the sea or the clouds held ruler-straight against an edge wall, but, come the night, there would be the sure and certain sign that you really were pinned by your rotating frame of reference to the inside of a bracelet ten million klicks across: the far side, shining in its daytime while the side you were on had its back turned to whatever sun the whole O circled.
Well, unless the clouds on your side of the Orbital were really thick, she supposed, and fell asleep.
Come the morning they left in the grey pre-dawn, despite Yoawin’s protests. “Don’t you spit at me, you addled stumble-hoof,” Tefwe said, and spat back at it. “There. See how you like it.” She wiped her face while Yoawin shook hers. “There. Now, sorry. Both of us sorry, yes? No more spitting. Here.” She gave it some dried berries.
A series of tall cliffs kept them mostly in shade until nearly noon, letting them keep going longer. They ate, drank and dozed under an overhang.
The track just before the pass was very steep; she got off Yoawin and led the animal up the zigzagging path. Dry stones rattled under her boots and the aphore’s hooves. At a corner in the trail near the top, one fall of initially just a dozen or so stones set off a small landslide further down. The fractious, rattling, rumbling sound of it echoed off the cliffs and slopes all around like dry thunder. Tefwe watched the dusty mass descend, to see if it would sweep away any part of the track, but it didn’t, and the tumbling mass of stones and boulders slowed and slumped to a dusty stop on the shallowing slope some way above the valley floor.
From the pass, the plateau was only fifty metres below her, ringed by sharp peaks and ragged cliffs. Its surface was bright with salt lakes and thin wind-corralled whorls of pale sand. Yoawin panted. Tefwe patted the animal’s snout and looked around as the hot wind swept her hair back from her face. She smiled, and felt the dried skin on her lips protest at the motion.
A couple of kilometres away, shimmering in the heated air, she could see the rocky outcrop where the drone Hassipura Plyn-Frie would be working on its sandstreams.
Tefwe reached up to where the flower she’d picked the day before had been, tucked into a coil of her hair, at least until this morning, thinking to leave it here at the pass. But her fingers closed round nothing; it had gone, already fallen.
oGCU Beats Working
You appear to covet the behaviour of our superlifter siblings, and have become a sort of serial tug. Were you getting bored?
∞
A little. I also thought that – given the Ronte had been due to arrive at Zyse in particular significantly after the Enfolding – helping them to get to Zyse swiftly might help preclude the Liseiden, faced with an empty home system devoid of both those it had belonged to and those it had been allocated to, being tempted to ignore the decision in favour of the Ronte by resorting to illegal pillaging. I simply wanted to remove that temptation by making sure the Ronte were in position at the time of the Instigation, thus ensuring a smooth handover.
∞
Thoughtful. However it does rather make it look as though we favour one over the other.
∞
I appreciate that, however all we are doing is favouring the rightful over those who might, wrongly, feel aggrieved and who, were this action not taken, have the means and the opportunity to act upon that feeling, with potentially disastrous results.
∞
Noble motivations, I’m sure. May I ask you to run any ideas for future innovations of a similar nature past the rest of us before carrying them out? Would you do that?
∞
I would.