Next time somebody tells you that you should never
use multiple inheritance,
look them straight in the eye and say,
"One size does not fit all."
If they respond with something about
their bad experience on their project,
look them in the eye and repeat,
slower this time,
"One size does not fit all."
— Marshall Cline, C++ FAQ Lite
In a classic example of the Baader-Meinhof phenomenon, having never before heard the word 'singularity', I suddenly found myself hearing about it all the time. Initially it was a friend interested in the technological kind who introduced me to the concept, but soon thereafter it began to mushroom in conversation everywhere: black holes, limits, power laws. Suddenly the name of the film *Event Horizon* made a lot more sense.
It happened, then, that this inconsequential realisation was idly forming in my mind one day just as I was reading about the hidden complexity of wishes. So what of it? In the article the author argues that a non-sapient genie could never be trusted to grant you a wish, since to make the correct choice it would have to understand all of its non-explicit suppositions, ones which you yourself may not even be aware of. One moment you tell your genie 'quickly, get my mother out of the burning building'; the next, her disembodied bits fly past you as a boiler explosion propels her far away from the centre of the house: the quickest way of getting her out, but perhaps not what you had in mind. Somehow this reminded me of Google image search.
As human beings we're quite a bit better at making choices than our non-sapient silicon friends, but we are still far from perfect. Take the realm of justice. Were we to act as the wish-granting genie of a man asking for a book of good laws, how could we respond? Informed by a history of moral philosophy, we might advise the man that any decisions he makes must be consistent, and that they must treat everyone as equal unless there are relevant reasons for treating them otherwise, in which case the unequal treatment should be proportional to its cause. But this is hardly very helpful.
Using a parable from a book by the philosopher David Miller, imagine the man had been given £100 to allocate between five people. The rules given so far would instruct him to treat them equally; or differently, but only for relevant reasons and in proportion to them. Thus, the five men might be his employees and the £100 their weekly bonus: then he should reward each based on the contribution made to the joint enterprise. If he is an aid worker and the five men members of a destitute group, he should look at the urgency of their needs and split the money accordingly. Or the £100 may be the prize in an essay competition, in which case all of it should go to the author of the best piece. It may also be a lottery win, with the five men part of a syndicate which required the sum to be split equally.
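The irony of the parable is that each allocation rule, taken on its own, is trivially mechanical. A hypothetical sketch (the function names and example figures are mine, not Miller's) makes this concrete: every rule is a few lines of code, and the genuinely hard part, deciding which rule the context calls for, is exactly what is missing.

```python
# Each of Miller's contexts, as a simple allocation rule. What no
# function here can do is tell you which of them applies.

def by_contribution(total, contributions):
    """Weekly bonus: split in proportion to each employee's contribution."""
    whole = sum(contributions)
    return [total * c / whole for c in contributions]

def by_need(total, needs):
    """Aid: split in proportion to the urgency of each person's need."""
    whole = sum(needs)
    return [total * n / whole for n in needs]

def by_merit(total, scores):
    """Essay prize: the whole sum goes to the best piece."""
    best = scores.index(max(scores))
    return [total if i == best else 0 for i in range(len(scores))]

def equal_shares(total, n):
    """Lottery syndicate: an equal split, as the members agreed."""
    return [total / n] * n

# Five men, £100 -- four different 'just' outcomes:
print(by_contribution(100, [4, 3, 1, 1, 1]))  # [40.0, 30.0, 10.0, 10.0, 10.0]
print(by_merit(100, [7, 9, 5, 6, 8]))         # [0, 100, 0, 0, 0]
print(equal_shares(100, 5))                   # [20.0, 20.0, 20.0, 20.0, 20.0]
```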
In other words, despite having some idea of the general principles involved, a just decision must be highly contextual. The rules are few and the exceptions many. For instance, we may say the principle of equal treatment is paramount. But in a given context, need, just deserts, promising and contracting, restitution and compensation are all things which trump it.
The lack of context is what makes the genie propel your mother out of the burning building, and what makes computers hopelessly bad at making intelligent decisions. What our patron above had wished for was not the wisdom of doing the right thing in one or another particular circumstance, but a theory of justice: a formal specification of how to resolve any situation. Of course, had we had that, we would have achieved a sort of ethical singularity. We could program a computer to do it. But we can't.