L. Neil Smith's
THE LIBERTARIAN ENTERPRISE
Number 644, November 13, 2011

"Who do you trust more, somebody who will tell you the
truth, no matter how unpleasant it may turn out to be,
or somebody willing to lie to you to keep you happy?
Better make up your mind—if you haven't already—
because we have a lot of unpleasant truths to deal
with, if we want to save America."



A Philosophical Update
by DataPacRat
[email protected]


Attribute to L. Neil Smith's The Libertarian Enterprise

I haven't sent any messages recently about the directions my thoughts have taken, so I've decided to compile some of the highlights.

My pet SF setting, 'New Attica', involves a collection of independent orbital habitats set against the oligarchies of Earth in a MAD-derived cold war. Only recently, I realized that there's at least one possible way the independents could end up living their lives in peace, without the constant threat of a missile launch from Earth destroying them all: deliberately initiating a Kessler cascade, seeding enough chaff in orbit to set off a chain reaction of satellites being ground into junk, until all of LEO is filled with a cloud of fragments whizzing past at several miles a second, forming an essentially impenetrable barrier to anyone trying to travel to or from Earth.
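For a rough sense of the speeds involved, here's a minimal back-of-the-envelope sketch; the 400 km altitude and the head-on geometry are my own illustrative assumptions, not details from the New Attica setting:

```python
import math

# Standard gravitational parameter of Earth (m^3/s^2) and mean radius (m).
GM_EARTH = 3.986004418e14
R_EARTH = 6.371e6

def circular_orbit_speed(altitude_m: float) -> float:
    """Speed of a circular orbit at the given altitude above mean radius."""
    return math.sqrt(GM_EARTH / (R_EARTH + altitude_m))

# Assumed LEO altitude of 400 km (illustrative; typical of low orbits).
v = circular_orbit_speed(400e3)
print(f"Orbital speed:          {v / 1000:.2f} km/s ({v / 1609.344:.2f} mi/s)")

# Worst case: two objects in opposing orbits meet head-on, so their
# closing speed is roughly twice the orbital speed.
print(f"Head-on closing speed:  {2 * v / 1000:.2f} km/s ({2 * v / 1609.344:.2f} mi/s)")
```

That works out to roughly 7.7 km/s (about 4.8 mi/s) of orbital speed, so typical crossing geometries close at several miles per second and the head-on worst case approaches ten.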

While such a plan could save the New Atticans, it would also come at the cost of thousands of lives: everyone in LEO who couldn't escape, up or down, in time... which led me to consider under what circumstances it would actually be moral to use that level of force. I was able to resolve the problem by resorting to my "I'm a selfish bastard, but I'm a smart selfish bastard interested in my long-term self-interest" approach and re-deriving the idea of proportionality in the use of force from scratch: I don't want too many people using excessive force near me, where I might end up as an accidental casualty, so it's in my own general interest to support a system in which people use the minimal level of force required to defend themselves. (That minimization is countered by the necessity of using whatever force is actually required, when force is necessary at all.)

A further conclusion is that when someone does use force to defend themselves, using it morally/ethically also requires taking responsibility for one's actions, in every sense of that term: specifically, informing one's peers of the act, and neither disclaiming responsibility for it nor, perhaps worse, hiding it. There are several reasons for this, but one that's sufficient in and of itself is that just about the only way to make sure you're not going insane without realizing it is to get external feedback from other people.

* * *

Skimming through a present-day criminal code, to see whether it could be pared down and simplified to a short list of actions even libertarians would agree are unethical, I ended up with the same result George Carlin got when he took on the Ten Commandments: "Don't be a dick."

* * *

I've spent some time working on my theory of oligarchical politics. (That is: there exist individuals and groups roughly equivalent to superpowers, in that they can do whatever they like to anyone not on their level and mainly compete among each other; such oligarchs use somewhat varying strategies to try to gain power over one another; and respecting ordinary people's rights more than other oligarchs do puts them at a disadvantage, so they tend to do so to the minimal degree possible.) My newsfeeds gave me a piece of evidence that, while not necessarily supporting this theory, is at least consonant with it: there is a very small group of 147 companies with interlocking ownership and disproportionate control over the global economy. (Googling '147 companies' will give a set of articles on this report, such as [this one].)
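For the curious, here's a toy sketch of the kind of calculation behind such network-control studies; the three-company ownership matrix is entirely invented for illustration, and this is not the report's actual data or methodology:

```python
import numpy as np

# Toy ownership matrix: W[i, j] is the fraction of company j's shares
# held by company i. All figures are invented for illustration.
W = np.array([
    [0.0, 0.6, 0.0],   # company 0 owns 60% of company 1
    [0.0, 0.0, 0.7],   # company 1 owns 70% of company 2
    [0.1, 0.0, 0.0],   # company 2 owns 10% of company 0
])

# Integrated ownership counts indirect stakes too: what 0 owns of 1
# gives it a claim on what 1 owns of 2, and so on. Summing the
# geometric series W + W^2 + W^3 + ... gives W (I - W)^{-1}.
I = np.eye(3)
integrated = W @ np.linalg.inv(I - W)

print(np.round(integrated, 3))
# Row 0 now shows a sizable indirect stake in company 2 (roughly
# 0.6 * 0.7), even though company 0 holds none of its shares directly.
```

The actual study applies far more elaborate machinery to millions of ownership links, but the core idea, that indirect stakes compound through the network, is the same.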

* * *

Having a great deal of power isn't necessarily evil in and of itself; compared to a person from a century or a millennium ago, I have immense and extraordinary powers. But from my "smart selfish bastard" viewpoint, plus the rule of thumb that in any given interaction I'm probably going to be on the worse end, it becomes somewhat unpleasant to consider that there is a collection of people who could have me killed without any significant consequence to themselves. I've gotten used to the idea that anyone who really wants me dead, and is willing to put the effort into it, can kill me; and that there are, in fact, people who really and truly would honestly prefer me dead to alive. (I've even exchanged messages with a few, and tweeted about it, at [this link].) Part of the reason none of these people have actually carried through on those desires is that they would have to deal with the consequences of doing so, such as murder charges; the same legal system offers them similar protection from people who might want to kill them, and since most people are more interested in avoiding being killed than in killing others, having the legal system impose such consequences is generally a positive thing all around... except for those people who have the wealth, power, and resources to avoid legal consequences for their harmful actions. Finding some way to neuter them seems to be the most important political goal of the day, if not the century.

The main part of my own efforts against such people uses the few tools I have that might get around their massive advantages: my mind and my keyboard. Specifically, I've been trying to work out what the right thing to do is, in general. However, I've started to come up against the limitations of this technique: even if I do manage to figure out the basic principles of a true objective morality, and their derivations, and live my life by them... that doesn't necessarily mean anyone else is going to. I need to figure out not only what the right thing to do is, but how to convince other people that it's the right thing to do... and, as far as I've been able to tell, it is a Very Difficult Problem to persuade anyone of any idea that's more than one inferential step away from their existing beliefs.

Some people say that if you have the truth on your side, that should be all you need to convince anyone. The fact that the word 'should' is in that sentence is probably enough to tell you how wrong it is. There are various techniques that can convince people of an idea regardless of its truth, the epistemological 'Dark Arts', and rationalists have learned to dislike them with good reason; but in a conflict where what is true matters, and where the other side doesn't care about truth, it may be necessary to start using those Dark Arts out of sheer self-defence.

I wasted a few hours considering what it would take for me to become adept enough at persuasion to apply such Dark Arts. (I'm introverted to the extent of being schizoid, which is an entirely different thing from being schizophrenic; in non-psychobabble, I'm happy being a hermit. Learning advanced social skills would be at least as difficult and uncomfortable for me as, say, a featherweight nun teaching herself to face a heavyweight boxer in the ring, sans teacher or coach.) This did help me realize that this aspect of my personality, closely tied to my self-identification as a member of the geek/nerd group, really is a weakness, with disadvantages that aren't necessarily outweighed by inclusion in that in-group; it really would be to my benefit to 'level up' in this area. However, I wasn't able to figure out any realistic way for me to even start developing it...

... and then I nearly face-palmed when I realized: I don't necessarily have to. I'm not the only traveller heading in at least this general direction, and if I can find a fellow traveller who has the persuasive skills I lack, then all I have to do is support their strength, instead of trying to turn my weakest area into one of my strongest.

My most recent thought in this area is the idea of certain 'gates' of ideas. That is, there are certain ideas that everyone who has learned enough about rationality shares, though holding those ideas isn't necessarily a guarantee of rationality. For example, someone who understands the odds of lottery tickets is unlikely to spend much (if any) money on them, so knowing that someone buys such tickets is fairly strong evidence that they haven't yet passed through the 'gate' of probability theory. The list at http://whatstheharm.net/ could make a fairly decent checklist: knowing what someone thinks about the topics listed there could give a fairly decent prediction of how they'll respond to a rational description of an objective truth that few people currently believe, told without the use of any of the Dark Arts.
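To make the lottery 'gate' concrete, here's a minimal expected-value sketch; the ticket price and jackpot are invented for illustration, though the 6-of-49 odds are the standard combinatorial figure:

```python
from math import comb

# Odds of matching 6 numbers out of 49: one in C(49, 6) combinations.
combinations = comb(49, 6)        # 13,983,816

# Illustrative figures; real lotteries vary, and the smaller prizes
# change the total only modestly.
ticket_price = 2.00
jackpot = 5_000_000.00

expected_value = jackpot / combinations
print(f"Odds of the jackpot: 1 in {combinations:,}")
print(f"Expected jackpot winnings per ${ticket_price:.2f} ticket: "
      f"${expected_value:.2f}")
# ~$0.36 back per $2.00 spent: someone who has internalized this
# arithmetic has passed the 'gate'; a habitual buyer probably hasn't.
```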

* * *

... and that's about as far as I've gotten as of now. If any of the above sounds like I'm going crazy, I'd appreciate it if you'd let me know.

Thank you for your time.


DataPacRat may be found online at datapacrat.com
