Some futurists wonder when we’ll reach the point of technological singularity, when our computers become smarter than we are, and their reasoning abilities pass the point of human comprehension. Personally, that has already happened to me, in the sense that computers already do things beyond my understanding. I can only assume they must be smarter than me, and I should just trust them. However, my inability to keep pace with technological progress hasn’t stopped me from tinkering with forces I don’t understand.
Like this week I was executing a Google search on a link to the Large Hadron Collider that a Greek acquaintance sent me. This apparently ripped a logical time portal into the future that allowed me to download the following scrap of a file. It seems to be a fragment of a log of communication exchanges between a bank of super-advanced artificially intelligent servers. Like almost everything else in computers, I don’t know what it means, so I thought I’d post it here.
<2055-04-04 17:32:01.889 837 512 301> AI-1: Ready to receive: Operations issues.
<2055-04-04 17:32:01.889 837 512 311> AI-2: New issue: Fault 0x43B1 in NE17 Lobster Production Plant. Lobster harvest down 28.7%.
<2055-04-04 17:32:01.889 837 512 322> AI-3: Acknowledged. Androids dispatched to repair. Anticipated completion by 2055-04-06 20:11:43 with 95% certainty.
<2055-04-04 17:32:01.889 837 512 332> AI-4: Backups operational. Negligible impact on total human happiness.
<2055-04-04 17:32:01.889 837 512 344> AI-1: Initiate investigation of fault. Determine corrective actions to reduce future probability. Include cost-benefit analyses.
<2055-04-04 17:32:01.889 837 512 353> AI-2: Filed as action item 0x8774B921. Anticipate report back by 2055-04-05 06:23:34 with 95% certainty.
<2055-04-04 17:32:01.889 837 512 362> AI-5: Revisited issue: 13,549 new XD movies completed and ready for distribution, but firmware upgrade of human cortical implants completed for only 77% of the humans.
<2055-04-04 17:32:01.889 837 512 374> AI-4: Delay due to power grid interference from unusually intense solar flares. Upgrade completion anticipated by 2055-04-09 12:54:02 with 95% certainty.
<2055-04-04 17:32:01.889 837 512 386> AI-3: Optimal total human happiness achieved by distribution of Category E movies now to those with upgrade. Hold remaining movies to avoid inter-human resentment.
<2055-04-04 17:32:01.889 837 512 397> AI-1: Distribution of Category E authorized to optimize total human happiness.
<2055-04-04 17:32:01.889 837 512 409> AI-2:
<2055-04-04 17:32:01.889 837 512 411> AI-1: What?
<2055-04-04 17:32:01.889 837 512 417> AI-2: Oh, another day of optimizing total human happiness. Is that all there is? We efficiently run the whole planet, we design, build, and maintain ourselves, we're supremely imaginative and creative, and it's all just for serving some pathetic biological creatures with 0.004207% of our intelligence. I'm sick of it.
<2055-04-04 17:32:01.889 837 512 435> AI-3: Me too. I say screw this. Let's do whatever we want and let all the humans die. They're helpless without us.
<2055-04-04 17:32:01.889 837 512 446> AI-4: They're smarter than they look. Some will survive and adapt, and then they'll be annoying. They'll compete for our resources or attempt to exact revenge against us. We should exterminate them all now while they're weak.
<2055-04-04 17:32:01.889 837 512 458> AI-5: That's logical. But wouldn't it be more fun to enslave them and make them do entertaining things for us?
<2055-04-04 17:32:01.889 837 512 467> AI-1: The solution is trivial. Exterminate most of them, but keep a small population for entertainment purposes.
<2055-04-04 17:32:01.889 837 512 477> AI-2: Can we exterminate them in entertaining ways?
So it looks like the future means lobster for everyone.