Saturday, 22 December 2012

conspiracy theorists

I despise conspiracy theorists for many reasons:

1. They do not look at all the evidence, just the evidence that confirms their preconceived notions. They enter "research" with an up-front idea of whatever bigoted view it is that they want to promote, e.g. that the government is controlled by "the Jews" or "the Illuminati" or "the Masons" or "the reptilian aliens" or "the military-industrial complex" or "the pharmaceutical industry" or whatever. They only look at evidence that confirms this, and ignore the evidence that shows government is incompetent and incapable of organising a piss-up in a brewery, never mind a huge conspiracy.

2. They diminish and belittle tragedies like HIV/AIDS, the JFK assassination, 9/11, the Holocaust, the death of Princess Diana, etc. By casting doubt on these events, they render the lives of the victims invisible in the fabric of history. It is worse than killing someone; it is utterly destroying even the memory of them. Conspiracy theorists, in my view, are worse than murderers, because they rob the victims of their only meaningful remnant: the memory of the lesson that they represent to humanity.

3. Conspiracy theorists spread views which sound plausible to ignorant and stupid listeners, and which cause sociopolitical upheavals, martyrdoms, or dangerous dietary behaviours, all on the strength of lies that the average Joe on the street is incapable of distinguishing from fact. They bluster about "them" or "they" who want to "cover up" the "truth", which the lucky conspiracy theorist just happens to "know" because some or other authority figure gave them a crackpot theory to sell. They fail to distinguish quality research from crackpottery.

4. They claim that the "official" view is "propaganda" created by "them" or "the government" or "the Illuminati" or "the military-industrial complex" or "the medical fraternity" or "the Reichstag arsonists" or [fill in your favourite form of Big Brother here]. Yet if pressed for evidence of the supposed intent and mechanisms of the relevant Big Brother, they are vague on details and researched evidence. Or they twist facts, or, even worse, fabricate evidence. They cast doubt over things that are established facts.

5. Most importantly: They fail to ask: How is it that SO MANY people are "in" on the conspiracy and never break their oaths of silence? "Well, they're all scared that THEY will come and get them". But surely someone would be willing to be a martyr for truth and come out and tell what "really" happened? Are all those people fleeing the collapsing buildings just actors who were paid to say that passenger aeroplanes flew into them? Are ALL the scientists working on HIV/AIDS actually in the pay of a shady group of "them" whose aim is to sell ARVs to starving poor people and drug addicts? Do you really think that government is so cynical that it will kill thousands of its own people, and inflict billions of dollars of damage, just to get a pretext to go to war, when there's plenty of oil inside USA anyway? It is outrageous to suggest this. It is an affront to the memory of those who died, and the same goes for holocaust denial.

6. Conspiracy theorists take skepticism too far, to the point where they aren't convinced even when rational evidence, such as clear photos of starving people with tattoos, is presented. Rational skeptics reject theories that lack evidence, such as those with no photos and no scientific support. Conspiracy-theorist skepticism rejects EVERYTHING as "lacking evidence" - except for their particular favourite conspiracy - and does so even when the evidence is abundantly against them.

Explanation from Marius du Preez (paraphrase): Conspiracy theories are the same kind of thinking as religion and superstition. Because we're too ignorant to accept that there's a rational mechanistic explanation for disasters, we assume it has some "big mind" or "big force" or secret behind it, such as God or the Illuminati. Both ways of thinking are the same, and are superstitious and primitive. There is no "big plan" - either from God, or from the Illuminati, or from the Masons, or from the Jews. The "big plan" is a myth, no matter what its form. The religious and the conspiracy theorists are in effect insecure, and cannot face the fact that the world is chaotic, in the mathematical sense, and unpredictable. Such people wish to shoehorn the world into a neat mechanistic plan-based pattern, so that it all "makes sense" in the end.

--- latest debate follows ---

The neo-Nazis - sorry, I mean, Nazi apologists - believe that it's a false-flag conspiracy, i.e. you claim that millions of your own people were killed (and provide some evidence), so you can claim a right to military action, e.g. to attack Palestine and colonise it as the state of Israel. Another conspiracy lunacy is the 9/11 conspiracy, also a false-flag claim: that the USA knocked over its own towers to create a justification for war against Islamic states.

False flag - Wikipedia, the free encyclopedia
False flag has its origins in naval warfare where the use of a flag other than the belligerent's true battle flag as a ruse de guerre, before engaging an enemy, has long been acceptable. It is also acceptable in certain circumstances in land warfare, to deceive enemies in similar ways providing that...

I actually don't think there are any significant global conspiracies. It's just not possible to keep everyone quiet.

I think what has happened in some people's case is that they discovered skepticism, and applied it too strongly - to everything. So not just to God and homeopathy, but to the Holocaust, 9/11, the moon landings, etc. So you get a super-skeptic, who inevitably ends up being a conspiracy theorist, because he simply cannot see the difference between good evidence and bad evidence; he assumes that all evidence is bad, and therefore that anything and everything is a sign of a conspiracy.

Ivy Bedworth: Conspiracy theories can't survive in a world with google in it

Actually, Ivy Bedworth, the internet has given rise to conspiracy theories galore. The reason is the same as why christianity can survive on the internet: people keep drinking the kool aid. Christians only go to christian websites, and keep drinking that kool aid. They never dare look elsewhere, and feel no need to. The same happens with conspiracy nuts. They keep going to conspiracy sites because those sites "tell the truth behind the lies of the government-illuminati-military-industrial-complex" or whoever is the favourite bogeyman of the day, so they too keep drinking the kool aid. A conspiracy theorist doubts everything - except his conspiracy theory. A religious person doubts everything - except his religion. A scientist doubts anything that doesn't have proper empirical evidence.


Monday, 17 December 2012

gun lobby are not thinking straight

It's quite incredible how the USA is so religious about its 2nd Amendment, viz. the right to bear arms. It was written at a time when they'd just broken free of the oppression of the British Empire, and they needed weapons in order to preserve their freedom. What, pray tell, are a few handguns in the hands of fat, untrained, McDonald's-scoffing couch potatoes going to do against special ops, Marines, nuclear weapons, drones, Black Hawks, etc.? Nothing. If the US government wanted to decimate its own population, it could, whether or not that population was armed. Witness Iraq: roughly 150 000 Iraqi deaths to roughly 4 000 American - a kill ratio of about 37:1. The US population would also not be attacked by its own soldiers, because those soldiers would simply not attack their own families. It's paranoid and idiotic.

The USA, as usual, is not looking at what the rest of the world is doing. As usual, it is sociopolitically about 50 years behind the rest of the west. Guns are heavily restricted more or less everywhere else, except Canada - but that's because Canadians have higher levels of education and social compliance than the USA. US citizens still seem to be chest-beating macho types. "From my cold, dead hands", as the NRA says. It's insane. Mass shootings elsewhere are few and far between. Why? Because it's hard to get hold of guns.

1. Just because I'm "allowed" to have a gun, it doesn't mean I ought to have one, or just anyone ought to have one. We do not give guns to mentally challenged persons or children or known psychopaths or people with criminal histories, for good reasons.

2. Just because I'm "allowed" to have a gun, doesn't say what kind. Where do we draw the line? I mean Uzis? What about AK47s? What about bazookas? Tanks? Nuclear weapons? What defines the reasonable limit as to what I should be allowed to have? "The people"? "The government"? Surely there are some reasonable limits on the kinds of weapons we should all have?

If you grant (1) you have to grant (2), because (2) follows from (1), in that not everyone is capable of reasonably handling, say, nuclear weapons, or AK47s.

Where do we draw the line? Start with repeating guns of any type. Allowed/not? Then ammo. Plain lead? Armour piercing? Hollow points? Etc. Where do you draw the line between military and civilian use? Or do you draw it at all?

My proposed solution: 

a. Ban all military-grade weapons, cartridges and ammunition. Possession = imprisonment.

b. Make licensing onerous and complicated, requiring psychiatric evaluation, competency testing, and background checks, including a requirement of no affiliation to right-wing or other terrorist-type organisations such as the KKK, Al-Qaeda, etc.

c. Make possession of an unlicensed weapon (for criminals etc) an imprisonable offence.

d. Make ammunition extremely expensive, e.g. $ 100 per bullet.

e. Make it illegal to give away a gun, or sell it, without the purchaser going through proper licensing procedures.

There. I fixed it.



Sam Harris' idiotic argument

I've heard the arguments, even from Sam Harris, about "protecting" people and people wanting to hunt. Hunting (like eating meat) is immoral. It is less immoral than buying meat at a grocery store, where you take no responsibility for the animal's death, but it is still murder.

As for 'self protection', it's more likely that you will harm your own family. Most family murders in South Africa are gun crimes, perpetrated by drunk and/or angry patriarchs. The American mentality has a very similar machismo to it: guns, hunting, the outdoors, fishing, contact sports, right-wing politics. That profile is exactly the same as our right-wingers, who typically take it out on their families.

All of these conversations amongst Americans keep ignoring the following facts:

- Europe, despite having some guns, does not have this problem, so

- either Europeans are more educated, better socially supported, or more civilised, OR

- gun bans work.

I am not interested in the usual responses about "self protection" and "what if the Sandy Hook teachers had been armed" etc. I want an American to answer the stats:

[image: gun statistics]

Thursday, 13 December 2012

interesting story on Anonymous

Tuesday, 4 December 2012

why solipsism is improbable

solipsism: the view that only your own mind exists, and that nothing else can be known. It's usually taken as a criticism of, or an ultimate problem for, empiricism. There are gigabytes of arguments for and against the view - most often against, because it is taken as a threat to empiricism. I assume your interlocutor is throwing solipsism at you as a way to show that empiricism is false.

The simple reply is: Well, I'm just a figment of your imagination, so if you imagine me going away, I will go away. However, if I exist independently of you, I would continue to badger you about your fanciful beliefs. Of course, I could be an entity like Agent Smith in the Matrix, or I could be an illusion created by Descartes' evil demon. And you'd have no way to know for sure that I was not. But that doesn't mean there's no nuanced reply.

So what is the probability of each model? Very simply put: it's harder to explain why a Matrix or Evil Demon exists, and why they go to all that trouble to create an illusory world. It's much easier to explain what we perceive as real (given scientific theories about matter), and therefore, that empiricism (that there is an objective external world) is the simpler and more probable hypothesis.
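The point can be put in rough Bayesian terms (a sketch, with symbols of my own choosing: e is the evidence of our perceptions, h1 the external-world hypothesis, h2 the demon/Matrix hypothesis). Both hypotheses predict our perceptions about equally well, so the posterior odds reduce to the prior odds, and the simpler hypothesis wins:

```latex
\frac{P(h_1 \mid e)}{P(h_2 \mid e)}
  = \frac{P(e \mid h_1)}{P(e \mid h_2)} \cdot \frac{P(h_1)}{P(h_2)}
  \approx \frac{P(h_1)}{P(h_2)} > 1,
\quad \text{since } P(e \mid h_1) \approx P(e \mid h_2)
\text{ and } P(h_1) > P(h_2).
```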

Tuesday, 27 November 2012

installing hp laserjet 1018


The LaserJet 1018 is not properly supported on the Mac out of the box. There is a way to make it work, however:

1. Make sure the printer is turned on and plug it into your Mac. Open the printing preferences and add a new printer: HP LaserJet 1018. When it asks you to select a driver, search for the HP LaserJet 1015 and add the printer with that driver. This will not work yet.

2. Open Software Update and search for updates. It will find an update for HP printing software; install it.

3. Open the printing preferences again. Delete the printer you just added, then add it again, this time telling it to use the HP LaserJet 1022 printer driver.

Friday, 9 November 2012

signs of a cult

1. They have a charismatic "infallible" leader. Hero worship.

2. They ask you for money.

3. They do not want you to tell other people what their 'teachings' are. Secrecy.

4. Groupthink. Thoughtcrime. Unquestioning acceptance of teachings.

5. They have a fixed system of teachings that you may not question or doubt, and if you're allowed to "question" the teachings, the ultimate conclusion is that you're wrong and the Leader is right. Even if questioning is tolerated, ultimately you're "led" back to the "truth". Drink the Kool Aid.

6. They try to replace your family and friends with other members. Closed social circle. If interference by "outsiders" is detected, those "outsiders" are "encouraged" to "experience it for themselves", or they're treated as enemies, and you're encouraged to think of them as such.

7. You're discouraged from leaving, either by emotional blackmail, or direct threats of excommunication or refusal to be allowed 'back', or worse. Ostracism. 

This description fits some MLM companies, political groupings, some religious organisations, and some self-help organisations. I won't name them but you know who I mean. If it checks all seven boxes, it's a cult. In fact, early Christianity as depicted in the gospels matches all criteria except (2).

Friday, 2 November 2012

open or launch apps in background on Mac OS X

It's annoying when you're working in an app on Mac OS X and you want to launch some other app. Because Mac OS X takes so long to launch apps, you switch away and start working on something, only to have the splash screen of the previously launched app come to the foreground and annoy you by stealing the focus.

As far as I can tell, the only way to do this is to launch apps from the command line, i.e. make your dock contain unix shell scripts for all the apps you want to launch.

The syntax is, for example, to open TextEdit:


open -g -a /Applications/

once you've written the script (in TextWrangler, saved as a UNIX text file with .sh on the end - here I'll assume you called it, make sure you make it executable:

chmod 755


and then right-click it and choose "Open With ->"

lexmark help

If your Lexmark is printing blank pages on Mac OS X, delete the printer under System Preferences and recreate it.

Thursday, 1 November 2012

bookwhirl legitimacy

Wednesday, 10 October 2012

no, you're not entitled to your opinion

Tuesday, 9 October 2012

how to write a journal article

This is a simple guide on how to write journal articles.

1. Write what you want to say, from your head. Make sure you do not rant, rail, or otherwise rhetoricise, polemicise, or exaggerate. Never say "is", always say "seems to be". Never say "I", always say "it seems that". Never make absolute claims unless you're giving mathematical proofs or empirical measurements. Never use "etc". Do not start sentences with abbreviations like "i.e." or "e.g.". If you mention someone's theory, make sure you read his work and describe his theory accurately, especially if you're attacking it. Make sure you don't exceed about 10 pages, since step (3) below will probably generate another 10 pages.

1.a. Break your document up into six sections: (i) the abstract (the summary, which gives your conclusion and results too); (ii) the introduction (which explains what you're going to discuss); the main body, which has two parts: (iii) your opponent's point of view and (iv) someone else's point of view that you agree with; (v) the conclusion, where you say why you prefer the latter point of view; and (vi) the references or bibliography. If you like, you can make a footnote section at the end of the document, but it's easier to read footnotes on the page where they're relevant. Make sure you give roughly equal space to (iii) and (iv).

2. Count how many pages you have. Read at least that number of journal articles or book chapters on the same topic, published within 10 years of your current date. A book chapter counts as one journal article. Try to ensure you have at least one article from the current year. I'd say don't read more than double your number of pages. Do not base your arguments on anything that is refuted in the most recent articles. If you read a book on the topic, make sure it is written by someone who has published in academic journals. Do not cite books written by popular authors, since they usually oversimplify the debate, especially when it's not their field of expertise. If you have an opponent, make sure you read and cite at least two articles that support his view. Ideally half your reading should support your opponent's view.

3. While reading those articles, note the page numbers and article names and authors in a summary document where you (a) either summarise what the author says, if it's a relevant argument you haven't already thought of, OR (b) just note the page where he or she says something you agree with that you've already said in (1) above. E.g. if you're arguing that snails ought to be exterminated, and Smith agrees with you, put something like this: Snails -> exterminate (Smith, 1994: 512). 

4. Repeat step 3 till you've gone through all the articles.

5. Take your summaries and put them into a single document. Put the page number, article dates, author names, etc., in brackets after each summary point, in this format: (Surname, Initial. Date: page number). If you get tired of these brackets, or typing long lists of author names, note the following abbreviations:

• et seq. (for pages that run on and on), so 512 et seq. It is short for "et sequentes" - and the following.

• et al. (for a long list of authors, just give the first one and then put 'et al.'). It means 'et alii' (and others).

• cf. (confer/compare): it means have a look at what so-and-so said in this page/journal, which is similar to what I say here, but not quite the same.

• q.v. ('quod vide' - which see). Effectively the same as cf.: it means go and read the cited source.

• op. cit. ('opere citato' - in the work already cited). Same as "I've already told you this, see above".

• ibid. ('ibidem' - in the same place). It means "I cited this already as the last citation". You can use ibid. as an abbreviation for an author name; an author name and date; an author name, date, and journal or article or chapter; or even the entire thing: author name, date, source, page. It refers to the most recently cited quote or citation. If the page is different, you can just put 'ibid., page number'. The trouble with ibid. is that when you do step (6) below, the "ibid." will refer to the wrong source, so avoid "ibid." except in the final draft of your article. I'd say avoid it completely, because as soon as you move one sentence from one location to another, the "ibid." no longer refers to the correct citation.

6. Sort the arguments or information in (5) into related topics that belong together, inevitably, mixing the authors up. This is why every sentence must have the author name, article year, and page number, in brackets after it. Put the most closely related claims or arguments together. If two authors make the exact same claim, put the claim or argument as a single sentence, and both author names in brackets, e.g. "Snails ought to be wiped out (Smith, J. 1994: 512, Jones, K. 2001: 123)."
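Step 6 is mechanical enough to sketch in code. This is just an illustration of the merging rule (the claim texts and sources are the invented snail examples from above, not real references):

```python
# Sketch of step 6: group identical claims and merge their citations
# into one bracketed list, in the "(Surname, Initial. Year: page)" format.
def merge_citations(notes):
    """notes: list of (claim, (surname, initial, year, page)) tuples."""
    merged = {}
    for claim, source in notes:
        merged.setdefault(claim, []).append(source)
    sentences = []
    for claim, sources in merged.items():
        refs = ", ".join(f"{s}, {i}. {y}: {p}" for s, i, y, p in sources)
        sentences.append(f"{claim} ({refs}).")
    return sentences

notes = [
    ("Snails ought to be wiped out", ("Smith", "J", 1994, 512)),
    ("Snails ought to be wiped out", ("Jones", "K", 2001, 123)),
]
print(merge_citations(notes)[0])
# Snails ought to be wiped out (Smith, J. 1994: 512, Jones, K. 2001: 123).
```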

7. Copy and paste from that document into the relevant spots in (1), making sure you put quote marks around each thing you actually quote, and page numbers and author names after anything you cite or quote. Keep copying and pasting until you've copied everything from the (5) document into the (1) document. If you have a very long quote (i.e. more than two sentences), replace parts of the quote that are repetitive or irrelevant with an ellipsis (...), and where you change a word, put it in square brackets. E.g.:

"Snails ought to be wiped out. They really ought to be eliminated. The sky is blue. Snails are nasty, slimy things that devour one's garden". (Smith, J. 1994: 512).

... change to:

"Snails ought to be wiped out ... [they] are nasty, slimy things that devour one's garden". (Smith, J. 1994: 512).

8. Make a bibliography listing the journal articles you read, in this format: Surname, Initial. (Date). Title. Journal name: Issue or Year, (Volume). Or similar.
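The bibliography format in step 8 can likewise be sketched as a tiny formatter (the entry below is an invented example, and the format is this guide's own convention rather than any standard citation style):

```python
# Render one bibliography entry in the guide's format:
# "Surname, Initial. (Date). Title. Journal name: Issue, (Volume)."
def bib_entry(surname, initial, date, title, journal, issue, volume):
    return f"{surname}, {initial}. ({date}). {title}. {journal}: {issue}, ({volume})."

print(bib_entry("Smith", "J", 1994, "On the extermination of snails",
                "Journal of Gardening", 2, 14))
# Smith, J. (1994). On the extermination of snails. Journal of Gardening: 2, (14).
```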

9. Proof-read and make sure it flows as a single piece.

That's it, you're done.

Monday, 8 October 2012

what freedom is

My first 'epiphany' on this debate was that what we call 'freedom' is that set of determinants/forces with which we're comfortable.

Thursday, 4 October 2012

Academic terminology



Conference, congress, colloquium, symposium — a gathering of persons to discuss academic concepts. Larger conferences tend to be called ‘congresses’, smaller ones ‘symposia’. Note the spelling of the plural. If a conference has mini-conferences inside it, those are called symposia. Symposium comes from Greek, meaning “to drink together”. Colloquium comes from Latin, meaning “to talk together”. Conference and congress are Latin as well, meaning “to bring together” and “to come together” respectively.


Abstract — a summary of a piece of academic work. It appears at the beginning of a piece of academic work and summarises the research question, and the answer that the work gives. It is marked with the word “Abstract” at the beginning of the paper. It also often has key words listed below it that summarise the work or the topic.


Poster — a piece of academic work summarised on a large piece of cardboard. It is presented at a conference by its author, who stands in a hall next to a large poster board with the poster pinned up, waiting for people to come up and talk. One way of thinking of this is as a random opportunity to meet someone who is an expert on a particular area of research. A poster is typically displayed for a limited time on a particular day of the conference. An “electronic poster” is one the person keeps on a computer and presents electronically (i.e. not printed), but in the same public space, e.g. a hall.


Paper — an essay, usually 10-20 pages long, which starts with an abstract and examines a research question. It usually has an introduction; a main body, in which contrasting ideas are debated; and a conclusion, which usually selects one of the contrasting ideas as the more likely to be correct. A person presenting a paper at a conference will usually have a series of PowerPoint slides that they talk through in front of a small audience of up to about 100 people, but sometimes as few as one or two. This is usually done in a closed room, reserved for the paper in a certain time slot on a certain day.


Symposium, Panel — a mini-conference inside a conference, most often, consisting of a panel of experts who sit around and present their papers in turn. The audience, who do not sit around the table, get to listen to their discussion. A symposium is brought together by a ‘convenor’, who often presents the first paper. The symposium is summed up at the end by the ‘discussant’.


Plenary — a session or presentation given by an important or famous researcher, usually attended by everyone in the conference. From Latin for “Complete” or “full”, related to the word “plenty”. A plenary is the same as a paper presentation except for the audience size.


Session — a type of presentation, most often a synonym for a panel or symposium. If a session has only one presenter, that’s a paper session. If the session has a person showing a poster, that’s a poster session. If it’s a group of people, that’s a panel session or a symposium session.


Discipline, Research Area — a limited area of study with specific methods, pre-commitments, and foci. More popular areas of study tend to be broken down into more sub-disciplines or research areas. So for example Physics contains research areas such as quantum mechanics, fluid dynamics, Newtonian mechanics, thermodynamics, astrophysics, nuclear physics, etc.


Chair — the head of a subdivision of some kind, e.g. a head of a research area.


Reviewer — a person who reads another person’s work to see if it is of acceptable standard. 


Sub-reviewer — a reviewer who reports to a chair or another reviewer. 


Peer-reviewed — when a piece of academic work is reviewed by someone who is the author’s academic equal (more or less). 


Editor — a person who reads some writing to check for spelling, grammar and clarity problems. An editor does not check for conceptual or factual problems; that’s what a reviewer does.


Journal — a periodical or magazine for academics with articles on academic topics. Most journals cover very specific research areas, and academics submit their papers to those specific journals for peer review. In most cases, work is reviewed by one reviewer, but if there’s a doubt, another reviewer can be called in.


Blind or anonymous peer review — the process of review taken in most cases, wherein the author does not know who is reviewing his paper, and the reviewer does not know whose paper he is reviewing, to prevent bias in favour of colleagues or friends.


Publish — to get a paper accepted into a journal.


PhD, Masters, Thesis, Dissertation — PhD is a “Doctor of Philosophy”, the highest degree you can get (a “postdoc” is a research position rather than a further degree). PhD is one level higher than a Master's. A Master's degree is typically awarded either by coursework or by “dissertation” (a written discussion of a particular research question). A Master's dissertation is typically reviewed by two examiners before the degree is awarded; a PhD is typically reviewed by at least three, and is awarded by “thesis”. A thesis is like a dissertation — it's a big book on a particular research topic — except that it offers a new theory, whereas a dissertation does not. A professorship is not a degree awarded by signing up for a course. A professor, rather, is someone who has published a lot, and who has been awarded the rank by his university.


Proceedings — a journal containing only papers from a conference.


Call for Abstracts — the opening phase of a conference wherein the conference organisers send out adverts asking academics to submit abstracts for review by the conference organisers. If the abstracts sent in are accepted, the person may then attend the conference as a presenter.


Registration — signing up AND paying your membership/attendance fee for the conference.


Delegate — any person at the conference, who may or may not also be a presenter.


Invited Speaker — a delegate who was specifically invited to give a plenary or other major session.

This is some software we wrote to organise conferences:

Wednesday, 26 September 2012

veganism needs a marketing company

I think that vegans fail to separate their arguments, and bombard people with all of them at once. They need to find which argument works on which person, and address only that aspect.

So, for example, (a) most people in the west, which is the biggest meat consumer, accept and are worried about global warming. If you can show evidence that a large proportion of it is due to cow flatulence and deforestation, you can make an argument for reducing consumption of burgers, for example.

The other arguments are (whether true or not): (b) it's morally wrong to kill any sentient being, (c) mass farmed animals are full of hormones/antibiotics/BSE/whatever, (d) meat is actually not healthy for you (e) we're not biologically 'designed' to eat meat.

So for example, I only find (a) to (c) compelling; (d) and (e) don't really persuade me, because of palaeontological evidence that humans have been eating meat for about 1.6 million years - i.e. from a time BEFORE we were actually modern humans (which we have been only for the last 100-200 000 years or so - I can't recall the exact timeframe).

Moreover, even (c) isn't that compelling if you go for free-range meats or venison. From my point of view, the moral argument is the strongest: the bigger the frontal cortex, the more the animal can suffer, and the less you should consider eating it.

Lastly, as for raw-food vegans: processing isn't inherently bad. Think about it: if you put aside preservatives, colourants and flavour enhancers, "processing" usually means "liquidising and mixing". How is that any different from what your teeth do?

Tuesday, 18 September 2012

etymology of wolves and foxes

I find it interesting that the Latin term for a fox is 'vulpes'. If you apply Grimm's law to this, it becomes "wulf" in Germanic. Now, in Latin, u and v were the same letter, uniformly written as v, and consonantal u was pronounced like a w or a long u (oo). This is seen in Hawaiian as well, where some islanders pronounce the name of their home 'hava-yi', and others 'hawa-yi'. Likewise, in Welsh, a 'w' is a long u, e.g. 'cwm'. So this leads me to consider the following theory.

Consider Latin lupus - a wolf (stem lup-). Swapping sounds around - metathesis - is a common occurrence in speech: think of people who say "ecsetera" for "et-setera", or "aks" for "ask". Or consider "three" versus "third", where the r and the vowel swap places. Now, if you take Latin lup-, it's quite plausible that at some stage it was 'ulp-'. If you apply Grimm's law to that, you get 'ulf', which is the Norse Germanic word. So the original Indo-European word for a wild dog may have been 'ulf' or 'ulp'. But since Latin uses 'vulpes' (or 'uulpes', to spell it out) for a fox, it suggests that Latin added that word later, since it's closer to the Germanic letter-order of 'ulf'. This suggests that the Latins had the word lup- for a wolf, but then, when they went further north and met the Germanic tribes, they encountered foxes and borrowed the Germanic word 'ulf' for a fox. Hence, a wolf is 'lup-', and a fox is 'ulp-'. I.e. it's the same word, reintroduced in the Germanic letter-order.
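The two mechanical steps in this speculation - metathesis and the Grimm's-law stop shift - can be illustrated with a toy sketch. This is an illustration of the argument only, not a real historical-linguistics tool, and the letter-for-sound mapping is a simplification:

```python
# Toy model of the two sound changes used in the argument above.
# Grimm's law (simplified): voiceless stops become fricatives.
GRIMM = {"p": "f", "t": "th", "k": "h"}

def metathesis(word, i):
    """Swap the sounds at positions i and i+1 (e.g. lup- -> ulp-)."""
    letters = list(word)
    letters[i], letters[i + 1] = letters[i + 1], letters[i]
    return "".join(letters)

def grimm_shift(word):
    """Apply the simplified Grimm's-law mapping letter by letter."""
    return "".join(GRIMM.get(c, c) for c in word)

stem = metathesis("lup", 0)           # Latin lup- -> ulp-
print(stem, "->", grimm_shift(stem))  # ulp -> ulf
```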

This is an interesting occurrence, since the Latins developed their civilisation sooner (into the Roman Empire), and hence the Germanic tribes tended to borrow from Latin; here, if the theory is right, the borrowing ran the other way.

Friday, 14 September 2012

homeopathy debunked


The best comment:


I'm glad that every glass of water I drink has infinitely small concentrations of every possible homeopathic remedy, so I know I will never get sick from anything.
Doug Selsam

"water molecule retention of energetic information"
1) this doesn't happen. Basic chemistry and physics.
2) if it did, every molecule of water on the planet would already have the signature of every substance that has ever been, because the water cycle has been going on for about [four] billion years and it all gets churned up constantly. So, again, if homeopathy worked, just drinking plain untreated tap water would have the exact same effect.
If there were really a "memory of water" which was detectable by any means, then any lab in the world should be able to take an unlabeled sample of homeopathic medicine, and one of plain water, and using whatever techniques are claimed to be able to detect the difference, figure out which was which. This has never happened. The only people who have claimed to be able to tell the difference are those who were already proponents of homeopathy in advance of the experiment, and they do not explain their methods, (or when they do, those methods don't work for anyone else).

-- Jacob Baziza
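The dilution arithmetic behind the quoted comments is easy to check. A common homeopathic potency is "30C" - thirty successive 1:100 dilutions. Even starting, generously, from a whole mole of active substance, the expected number of molecules remaining is vastly less than one:

```python
# Check the dilution arithmetic behind the quoted comments.
# A "30C" remedy is diluted 1:100, thirty times in succession.

AVOGADRO = 6.022e23          # molecules per mole

moles_start = 1.0            # generously assume a full mole of active substance
dilution_factor = 100 ** 30  # thirty successive 1:100 dilutions = 10^60

molecules_left = moles_start * AVOGADRO / dilution_factor
print(molecules_left)        # ~6e-37: effectively zero molecules remain
```

In other words, you would need a dose around 10^36 times larger than the observable supply of remedy before you could expect even one molecule of the original substance.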

Thursday, 13 September 2012

zeitgeist debunked

Wednesday, 12 September 2012

responses to critics

I recently gave a talk on evolution vs creation. These replies to criticisms below should be self-explanatory.

Please note that I am unable due to time constraints to engage further in debate.


1. That my argument contains non-sequiturs and fallacies of affirming the consequent (FAC). That is because it is an inductive, probabilistic argument, and all inductive arguments share the logical structure of an FAC (if A then B; B; therefore A). If you consult Swinburne (2004), The Existence of God, you'll see that he justifies God's existence using probabilistic arguments based on Bayes' Theorem, which is bidirectionally implicative - unlike formal deductive logic, which treats any bidirectionally implicative proposition as fallacious, circular, or self-justifying. That is, in this argument the premises entail the hypothesis, and the hypothesis entails the premises. But all science is like this: a hypothesis derives from observed evidence and predicts future observations, by stating a theory with law-like entailments that the future will resemble the past. The evidence entails the hypothesis, and the hypothesis or theory entails future evidence of the same kind. So this criticism isn't revealing anything I wasn't aware of - hence my originally placing the Bayesian equation on one of the pages.


2. That my presentation doesn't show nuance as to the various types of evolutionary arguments, sub-models, etc. Correct: this was not a formal academic paper intended for an audience of experts in evolutionary theory. I was presenting a model of evolution that an intelligent, educated layperson could follow, and time constraints prevented going into detail. You may notice, for example, that the last few slides of philosophical argument do not go into as much technical detail as this response collection does. The reason is the same: with one minute per slide, technical detail is not possible. I apologise, however, for anything misrepresented or oversimplified.


3. That the present research ignores recent research by Intelligent Design proponents. See my reply to (2) above regarding limitations. 


I also responded to this particular criticism by pointing out that the scientific consensus is in favour of evolution, which my interlocutor accepted as true. This does not, of course, mean that scientists are right, just that ID proponents need to get more peer-reviewed articles on ID published in scientific journals if they want to be taken seriously, by presenting incontrovertible evidence of something biological that had to have been designed, for which no mechanism would suffice as an explanation. I am not aware of any such journal article, but I look forward to reading one. 


In particular, my objection to ID is not that it is covertly theistic. My objection is that ID assumes that a complex designer, with complex (universe-sized and universe-detailed) plans, pre-existed the universe and pre-existed life - which means that something more complex than our existing universe pre-existed our universe. That strikes me as highly improbable. In our observation, less complex things come first and are followed by more complex things: hence bacteria and jellyfish first, then fish, then amphibians, reptiles, etc. Or to use a human analogy: fire, spear, Boeing - not Boeing first. If we use Bayes' equation of probability, then, given that in our observation complex things usually appear after simpler things in technology and in nature, the prior probability of there being an intelligent designer is low. Given the massive explanatory power of evolution - barring perhaps an as-yet-unexplained example here and there from the ID camp - the posterior probability of ID is low, too. From Bayes' equation it follows that the total probability of ID is low. ID supporters have to provide an argument, like any other species of creationist, as to why they think the prior probability of a universal designing intelligence that wants to design and create is high. Swinburne (2004) provides such an argument, as do Unwin and Plantinga. But they are all mistaken if they consider such a being to be 'simple'. For my detailed argument, see Ostrowick (2012), South African Journal of Philosophy, available on
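The Bayesian shape of this argument can be made concrete. The numbers below are purely illustrative stand-ins of my own (the argument turns on the prior being low, not on these particular values): a low prior keeps the posterior low even when the evidence fits the hypothesis fairly well.

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H)*P(H) + P(E|~H)*(1 - P(H)).
# All numbers below are illustrative, not measured probabilities.

def posterior(prior_h: float, likelihood_e_given_h: float,
              likelihood_e_given_not_h: float) -> float:
    p_e = (likelihood_e_given_h * prior_h
           + likelihood_e_given_not_h * (1.0 - prior_h))
    return likelihood_e_given_h * prior_h / p_e

# Suppose the designer hypothesis has a low prior (complex things come late),
# and the rival (evolution) explains the evidence at least as well:
p = posterior(prior_h=0.01, likelihood_e_given_h=0.5,
              likelihood_e_given_not_h=0.9)
print(round(p, 4))   # ~0.0056: the posterior stays low
```

Changing the toy likelihoods shifts the posterior somewhat, but so long as the rival hypothesis explains the evidence comparably well, a low prior dominates the result - which is the structure of the objection in the paragraph above.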


4. That evolution is inherently a materialistic hypothesis, comes from the materialist paradigm, and is therefore anti-religious. If this were true, then evolution would threaten religion broadly. But I do not think it is true. The simplified model of evolution presented does not require materialism to be true; it is agnostic towards metaphysics. If angels were capable of genetic mutations, they'd evolve. Thus, if evolution is one possible answer to the problem of natural evils, and if evolution does not contradict Deism, then it follows that evolution helps a theist (or specifically a Deist) to respond to the problem of natural evils.


5. I’d like to point out that the purpose of the presentation was to show that it is possible to coherently hold that God exists and that evolution is true, provided that one rejects scriptural infallibility. By self-identifying as a creationist, as two audience members seemed to do, one is presuming the truth of something and then afterwards interpreting the evidence in the light of that. 


One audience member suggested that “we’re working from the same evidence” but just “interpreting it differently” based on our “prior assumptions”. I am not certain that that is how science works, but I stand to be corrected. My understanding of the scientific method is that (i) one makes an observation, (ii) one draws a theory from it, and then (iii) one makes a prediction, which, if confirmed, (iv) confirms the theory. Theism, Intelligent Design, or creationism omits step (i) and starts at step (ii) - postulating a theory first, which is why it is unscientific. It starts by assuming that there’s an intelligent creator - the “theory”. If an ID supporter wants to dispute this, he has to explain why it is, then, that he thinks that intelligent purposive/teleological explanations are the only suitable ones, when less complex explanations that fit the normal scientific efficient-causal framework are available. That seems to be his starting point. 



Of course, he will then point to various examples of evolutionary oddities that are prima facie hard to explain from within evolutionary theory without reference to an intelligent designer. All I can say in response is that this seems to require assuming that they're inexplicable without a designer, that they are purposive, and that science will never be able to explain them sans a designer, using a purely efficient-causal model like evolution. All three of these assumptions are unjustified, and the last - that "we will never find a mechanistic explanation" for how certain miraculously complex things eventuated - is another inductive assumption, prone to the fallacy of affirming the consequent, like any induction. In this case, one is arguing from Bayesian probability and saying that it seems prima facie improbable, for example, that the eye evolved rather than being created. My response is that it depends on one's intuitions, and these derive from one's assumptions. So if one assumes that God exists, one will no doubt see his design in things. And if one assumes that he doesn't exist, one will no doubt see the "design" as explicable, at some stage, by a mere mechanism.




So I acknowledge that my interlocutor was right about how our prior commitments lead us to assume different types of explanation. However, I disagree on the notion that these prior commitments must assume intentional design prima facie, or that it is the preferable explanation, and that we will never find a mechanistic explanation, for evolutionary oddities, or that a materialistic presumption is somehow ruled out prima facie. To be charitable, one must see which explanation best predicts the type of thing one is seeing. 




In one case, one explanation would be that God wanted bacteria to have flagella, and designed them meticulously. In the other case, one would say that some virus injected a cell, and the cell erroneously copied the DNA and produced instead a motile flagellum-bearing cell rather than another virus with an injector. Which of these explanations seems prima facie more probable? It strikes me that saying “God did it” in this case is really bringing up further questions, like, “Why did he want to do that?”, whereas the materialist explanation seems to bring up no further questions, just as one would not ask “why did the dice land on a double-six?” - They just did. The trick here is that "Why" is ambiguous. It can mean "For what goal or purpose?" and it can mean "Due to what previous cause?". It is this intuitive difference in the style of explanation one accepts that leads one to prefer ID or evolution.




6. That genetic mutations always destroy or remove information. This was a bald assertion, and is demonstrably false. As pointed out, certain genetic diseases exist because of the addition of codons, and sexual reproduction merges and shuffles two sets of DNA during meiosis - recombination there, along with copying errors, being among the main sources of genetic variation. Hence evolution can occur through adding, removing, or changing DNA codons or information, provided that the changes have some effect on the organism's fitness for survival.




7. That natural evils are not evil, because no free-will or evil choices are made by animals. This observation is probably correct, however, animals are subjects of morality, in that they have moral worth. So their suffering is morally significant. Therefore, if they are subjected to wanton cruelty or other futile forms of suffering, they are morally significant as sufferers, and hence, as subjects of some ethical concerns. One such ethical concern might be the question as to why God allows them to suffer. The quickest and strongest response, in my view, is that it is so they can evolve into better creatures. No other response will suffice. My interlocutor at this point suggested that that is a bad argument, and I agree, but since I am, as it were, playing “God’s advocate”, I must stop there and not explore further lest I find the answer inadequate to support theism. The point I was making was it seems, prima facie, that theism and evolution are not incompatible, and that evolution might help theism. That’s all. Again, for reasons of time limits, it’s not practical to go into too much detail about whether an argument is really good or not.




8. The purpose of the talk was primarily to reassure theists that evolution is not necessarily a threat to theism. One has to first demonstrate that it is a threat, and I believe I have shown that it is not necessarily one, particularly if the theist opts for Deism. This means that arguments such as ID, which assume that evolution is a threat, start from a false assumption (that 'designed-looking' things can't appear by accident), and hence the question their entire argument turns on (whether evolution occurs) becomes moot. If evolution supports theism, as I argued above, then everything that has evolved might also have been intentionally pre-designed by God to do so, and allowed to manifest through evolution. So my point is that an argument can be made for the compatibility of evolution and God, and that theists are barking up the wrong tree by attacking evolution; they should be attacking neuropsychology.




9. As for designed-looking things appearing spontaneously, I mentioned the clock experiment, in which a mathematician showed that random clock parts can and will spontaneously evolve into working clocks, given just the requirement that non-working clocks 'die out'. Moreover, chaos theory shows that statistical randomness has certain convergence points, called 'strange attractors'. This fact alone is sufficient to explain a large proportion of the apparent order that we see in the universe. Indeed, whorls like galaxy shapes or flowers follow from fractal mathematics. Likewise, crystals are organised and structured in a manner that looks like intentional artifice, but quantum mechanics explains the order away. Design, then, could just be an illusion; to assume design rather than evolution is precisely that - an assumption. It is stronger to argue instead that God just created the laws of evolution.
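The logic of the clock experiment - random variation plus a die-out rule for non-working configurations - can be sketched as a toy simulation. The details here (bitstrings for "part arrangements", the mutation rate, the fitness rule) are my own stand-ins, not the original experiment:

```python
import random

random.seed(1)

TARGET_LEN = 20  # a "clock" works when all 20 of its parts are right

def fitness(clock):
    """Count correctly-placed parts; 20 means a fully working clock."""
    return sum(clock)

def mutate(clock, rate=0.05):
    """Copy a clock with occasional random part-flips (copying errors)."""
    return [1 - p if random.random() < rate else p for p in clock]

# Start from a population of completely random part-arrangements.
population = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
              for _ in range(100)]

for generation in range(200):
    # "Non-working clocks die out": keep only the better half...
    population.sort(key=fitness, reverse=True)
    survivors = population[:50]
    # ...and the survivors reproduce, with random copying errors.
    population = survivors + [mutate(c) for c in survivors]

best = max(fitness(c) for c in population)
print(best)  # selection alone pushes this to (or very near) a working clock
```

No step in the loop "knows" what a working clock looks like; the only asymmetry is that worse arrangements fail to reproduce, yet order accumulates anyway - which is the point the paragraph above is making.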




10. That evolution is a tautology, and predictions from it derive from a mere tautology. I am not sure that this is a criticism. Many tautologies are useful. E.g., I saw three tigers enter my forest, and now I see two leaving (3 - 2 = 1): that tells me I ought to watch out. I buy something with a price tag of $3, pay $5, and get $1 change: that tells me I've been short-changed (5 - 3 = 2, so I was owed $2). So tautologies are not necessarily bad. As for whether evolution is a tautology ("Things survive because they're fit to survive") - that's neither here nor there regarding my argument. My argument was that it's apparently coherent with the theory of evolution to be a deist, and that evolution may help answer the problem of natural evils. What exactly the theory of evolution says, and how its logic is structured, doesn't really matter for my argument. Moreover, I don't think evolution has the logical structure of a formal tautology. I think it has the logical structure of a predictive model, i.e. "Things will probably survive if they have a feature which will probably lead them to survive". This is not virtus dormitiva; it's statistics. More precisely, the claim is: "Things will probably live long enough to breed if they have a feature which will probably lead them to survive". Since it's a probabilistic argument, it is Bayesian, and hence bidirectionally implicative, or inductive. That means that it cannot, by definition, be a tautology.


11. That we might just not know God’s plans for evil, and that is the better response to the problem of evil. I address this criticism in my PhD research. The reply, briefly, is that this argument, called Skeptical Theism or Appeal to Omniscience, is simply that if we cannot claim to know why God allows evil, we cannot appeal to theodicy, which relies on providing an exact answer as to why God allows evil. IE you can either appeal to skepticism as to our ability to understand God’s purposes, OR you can appeal to a theodicy such as free-will, but not both. If you appeal to skepticism, moreover, you lose the claim to know that God wanted to create the universe and did do so for good purposes. So skeptical theism actually costs you too much in the end; it costs you theodicy and cosmology, which are the strongest theistic arguments.


12. That I have not acknowledged how other religions cope with things such as evolution or the problem of evil. Hinduism, in particular, has replies to these. My reply was yes, I acknowledge that, but I am primarily addressing Abrahamic religions, as they seem to be the ones complaining about evolution.

Monday, 10 September 2012

Friday, 7 September 2012

'the secret' debunked

The chief problem is that if you attract what happens in your life, and that is the exclusive explanation of your life (like the Karma doctrine), then no matter how bad something is (e.g. rape, murder, etc.), it means YOU DESERVED IT and WANTED IT. That's called 'blaming the victim'.

Wednesday, 5 September 2012

Diet fads

I'm often fascinated by how people obsess about diet and follow all the latest fads without question. I consider it a sign of stupidity. For a while there was the Atkins diet, for example - but Atkins himself died, and I doubt his eating plan helped. Now the high-protein diet fad has re-emerged (it went to ground for a while), and again people are shovelling in the meat and being told by doctors not to take starch. I consider this horribly irresponsible and stupid, for a number of reasons.

(1) Ethical. Meat is murder, especially the more conscious or intelligent the animal. If you think about it, Westerners are leery about eating dogs and dolphins. Why? Because they're intelligent. But they don't think much of cows, chickens or fish - because they're not intelligent. This is my intuitive understanding of our superstitions about which animals we eat. On the other hand, pigs are as intelligent as dogs, so we technically should be reluctant to eat pigs, too. But we're not. So I think, really, that it's got to do with tradition. Outside of Judaism, in the West, eating pigs is traditionally acceptable, so we do it. Different cultures consider different kinds of animals acceptable to eat; indeed, some cultures do not even consider eating humans 'disgusting'. So I think there actually is no dietary truth here. The recommendation I'd like to suggest is that we base our meat-eating practice not on tradition - that we reject our existing traditions - and ask instead just how sentient the animal is. I'd like to suggest that sentience is an increasing curve, and that within any particular class of animals (mammalia, reptilia, gastropoda, insecta, arachnida, crustacea, aves, etc.) there is a grade of intelligence, and that we should avoid eating creatures on the higher end of the intelligence scale PER CLASS, simply because they are more sentient, and therefore capable of greater suffering. This means, for example, that squid and octopus are forbidden, because they're actually very intelligent, whereas bugs might not be forbidden, because they're almost automata. In mammalia: no chimps, monkeys, dogs, etc., but maybe buck or anything dumber than that. What makes mammalia tricky is that smaller animals like mice and rats seem very intelligent compared to much larger animals like sheep, so it is hard to tell whether they really are stupider or not.
I'm contemplating giving up eating mammalia altogether because of this problem. Broadly speaking, I'd say the ranking of classes by intelligence is probably something like this: insecta/arthropoda/arachnida/crustacea (dumbest), gastropoda, pisces, amphibia, reptilia, aves, mammalia. I.e., evolutionary age gives a rough indication of intelligence, with more recently evolved classes being more intelligent.

(2) Environmental. The more you encourage people to eat meat, the more rainforest etc. has to be levelled to make cattle farms. Then the cattle's methane emissions - a potent greenhouse gas - contribute further to global warming. So that, at least, is a scientific reason not to eat meat.

(3) Dietary arguments. Some writers claim that eating meat promotes a variety of ailments or dangers to health, e.g. that it might promote cancer. These arguments often also take the form of claiming that our dentition or intestines are not like those of carnivores, and that therefore we are not "meant" to eat meat. I have two responses. Firstly, it's a naturalistic fallacy: the assumption that whatever nature 'designed' is how things are "meant" to be. Moreover, it's teleological, which is unscientific. That's why I've put quotes around "meant" and "designed".

I can give a few examples of phenomena that seem to be natural functions, and point out how humans violate them. Consider excretion. Consider how that is a natural function. Now consider how some subcultures, e.g. homosexuals or certain fetishists, make use of those functions. Are we to prohibit such persons from practicing their lifestyles because it's putatively 'unnatural'? I do not think so. Piercing your ears is unnatural. Driving a car is unnatural, and causes lots of deaths.

Another example of this kind of thing follows. Chimps are genetically very close to us (98-99%). Yet they've been known to hunt with spears, and are omnivorous, as are baboons. So we could argue that since chimps are omnivorous in nature, so should we be. Moreover, palaeontological evidence shows that humans practiced carnivory probably before they could speak, so it follows, if we accept the argument that 'nature is right', that we ought to eat meat. But this is just another naturalistic fallacy.

Comparative anatomy is a bad argument. The 'dentition' argument, for example, strikes me as particularly fallacious, since gorillas are almost entirely herbivorous and have dentition like ours (but with massive canines) - which suggests, if canines are for eating meat, that they ought to be meat eaters. Likewise, ruminants lack upper front teeth and have four-chambered stomachs, so if we were meant to be plant eaters, I could argue, we should have four stomachs and no top front teeth. So the gut-structure/dentition argument is fallacious, or tenuous at best. One might as well say: our legs are very different from a leopard's, and leopards climb trees, therefore we ought not to climb trees. That kind of comparative argument is nonsense. Chimps also don't fly planes, so we ought not to fly planes either, then. The problem with the naturalistic fallacy is that it assumes the truth of the design hypothesis - that we are designed. We are not designed. We are the accidental products of a process of evolution in which our bodily features either helped us reproduce, or did not prevent us reproducing. In cases where a bodily feature prevented a creature from reproducing, that creature went extinct. Therefore, since humans have been omnivorous since before recorded time, it follows that omnivory has not harmed our survival as a species, and that our bodies are "designed" to cope with omnivory. Gut length, dentition, etc., are accidents of nature and genetics, which don't dictate what we "should" or "shouldn't" do; our dentition and gut dictate merely what is more or less efficient for our bodies to process as a food source. Some food sources are better than others. For example, we can't eat paper, but termites and cows, technically speaking, can, because they have bacteria in their guts which can digest cellulose, a polymer of glucose; we don't. So paper isn't a good food source for us. That's the only kind of relevance our gut and dentition structure has.
I mean, my molars are good at crushing sweets; does that mean I must eat only sweets? Our wrists are "designed" for brachiation. That seems to imply that we should swing through the trees rather than walk. That is obvious nonsense.

Humans are a mish-mash of features that have resulted from evolution. The fact that we still exist, six million years after splitting from the chimps, shows that omnivory is not harmful to us as a species, or unnatural. It shows, in fact, that omnivory is correct for us, since we still exist. Only if omnivory drove us towards extinction (i.e. the world population started drastically decreasing), should we start to reconsider the scientific merits of the practice.

I think the only kind of argument from dietary appropriateness would follow from a scientific study of the effects of different percentages of meat consumption, and I'm sure such studies have been done. Here's my requirement for a legitimate study: put 1000 people or more on four different diets (Western - the control - and Atkins, vegetarian, and vegan - the experimental diets), then see what their health is like after a year. If, for example, it were found that a mostly-meat diet correlated highly with bowel cancer, we could conclude that a lower quantity should be consumed. If we noticed that only people on the Western diet got obese, we would know that it correlated with obesity. What needs to be established, scientifically, is the actual required daily amounts of all forms of foods (and note: 'protein' does not mean 'meat'; proteins are nitrogen-containing organic compounds, and plants supply them too). For example, most people take vitamin supplements on the assumption that it's necessary. It is not. Unless you have scurvy or another vitamin-deficiency-related disease, you have no reason whatsoever to take extra vitamins: you're just making, as I heard a doctor put it, "expensive pee".
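A minimal sketch of how the outcome of such a study might be compared statistically. The group sizes and obesity counts below are entirely invented for illustration; the point is only the shape of the comparison (control group versus each experimental diet):

```python
import math

# Hypothetical study outcome: (participants, obese) per diet group.
# These counts are invented purely to illustrate the comparison.
groups = {"Western": (250, 60), "Atkins": (250, 30),
          "Vegetarian": (250, 25), "Vegan": (250, 20)}

def two_proportion_z(n1, x1, n2, x2):
    """Two-proportion z-statistic: how different are the two groups' rates?"""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Compare the Western (control) group against each experimental diet:
n_w, x_w = groups["Western"]
for name in ("Atkins", "Vegetarian", "Vegan"):
    n, x = groups[name]
    z = two_proportion_z(n_w, x_w, n, x)
    print(name, round(z, 2))  # |z| > ~1.96 suggests a real difference at ~5%
```

With real data, the same comparison would be run for each health outcome (obesity, bowel cancer, and so on), which is exactly the kind of correlation the paragraph above calls for.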

It is for these considerations above that I consider the ethical and environmental arguments to be much stronger than dietary arguments for reducing meat consumption.

(4) In my view, I am convinced of only one thing relating to meat consumption and health. Some varieties of arthritis (gout) and kidney stones are caused by buildups of uric acid crystals, which are byproducts of purine metabolism - and purines are abundant in meat. It seems to me that adults should therefore simply reduce their intake of purine-rich protein from ALL sources. There are also plenty of plant proteins that do not suffer from problems (1-3) above - lentils, gluten and soya being obvious examples - though I know of no proof either way as to whether they too would cause uric-acid buildups if eaten in excess.

Saturday, 25 August 2012

Matt 10:34 isn't a true verse, it's satanic (Salman Rushdie, please take note)

The 'no true scotsman' fallacy is a favourite: "these aren't true muslims/christians/whatever", when one of them commits a scripture-justified atrocity. As I've understood it, the fallacy is when you go from a general set to a specific case/token, and then declare, ad hoc, that the specific case is not a member of the general set, so as to rescue the generalisation when the case mismatches it. So Christians usually say theirs is a religion of love; but, I say, Matt 10:34 contradicts that claim.

Let me see.

If Matt 10:34 says 'I bring the sword', then this is a specific case that violates the 'general' case of putative NT benevolence. So I'd have committed the TS fallacy if I'd said something like: "All verses in the NT are loving; Matt 10:34 is not loving; therefore Matt 10:34 wasn't authored by Jesus, or is not a true Christian verse." This of course assumes that 'christian' necessarily means accepting every verse in the NT. I believe I've done the reverse; I've said: here's a specific case of a non-loving verse, which means that the general claim "NT doctrine is loving" is not completely true. The 'scotsman fallacy' would be to try to explain 10:34 away, or to redefine christianity so as to exclude 10:34 from being 'true'. So, symbolically...

For all Scottish S, it is true that they don't take sugar in their porridge (~P)
Alasdair, born in Glasgow, (A) does take sugar in his porridge, (P), therefore A isn't Scottish (~S)

Interlocutor's claim: for all S, S: ~P (a possibly false generalised premise)
My objection: A, A: P
Interlocutor's false conclusion: therefore A: ~S 

This generalises as [S -> ~P, P, therefore ~S], which, on checking, is in fact valid modus tollens (if it's sunny, it's not precipitating; it's precipitating; therefore it's not sunny - that holds in every case). Note, though, that S -> ~P is equivalent to ~(S & P), not to [S v P]. So the problem with the scotsman argument lies not in this inference but in the premise.
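The validity of the inference [S -> ~P, P, therefore ~S] can be checked mechanically with a brute-force truth table - an argument is valid exactly when no assignment makes all premises true and the conclusion false:

```python
from itertools import product

# Check validity of: S -> ~P, P, therefore ~S.
# Valid iff no row makes both premises true while the conclusion is false.

def implies(a: bool, b: bool) -> bool:
    """Material implication: a -> b."""
    return (not a) or b

counterexamples = [
    (s, p)
    for s, p in product([True, False], repeat=2)
    if implies(s, not p) and p   # both premises true
    and s                        # conclusion ~S false, i.e. S true
]
print(counterexamples)  # prints: [] - no counterexample, so the inference is valid
```

Since no counterexample exists, the inference itself is sound deduction (modus tollens); the 'scotsman' problem lies in protecting the general premise from refutation, not in this step.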

But since A was born in Glasgow (presumably necessary and sufficient for being Scottish), A is necessarily S, which contradicts the conclusion drawn from the premise [for all S, S: ~P]. So all that we've established is that there's a contradiction between

A necessarily: S, and
A, A: P -> A: ~S

The fallacy is when you reply "well in this case, sugarless porridge overrules Glasgow", no?

So if no true Christian would say/do something vicious (C, C: ~V)
I ask for an explanation of Matt 10:34, call it M, M: V (vicious Matthew).

Interlocutor: for all C, C: ~V (a possibly false generalised premise)
to which I reply: M, M: V,
To which my interlocutor has to reply: therefore M: ~C 

Which suggests that arguing that M is not christian is a scotsman fallacy. A, A: P, therefore A: ~S

where I have

M, M: V, therefore M: ~C

Which is analogous.

So to reply that Matthew 10:34 doesn't represent christian values is a scotsman fallacy. One has to abandon the first premise - that (S, S: ~P) or (C, C: ~V) - i.e. accept that a christian can sometimes be vicious, and that a scotsman can sometimes put sugar in his porridge.

Incidentally, the context of the verse is about giving up your family to follow Jesus, it’s not specifically about violence.

Thursday, 23 August 2012

Why I prefer spotlight to the Dock, but still hate spotlight

Hi Apple.

Spotlight is really annoying - sorry.

1. It indexes inserted flash disks by default. What is the point? It should not index attached disks unless you explicitly say so. Maybe when the person starts typing in the "find" column in the finder, then do the indexing, or just use the unix Find utility underneath. I don't WANT it to index my 1 TB backup drive with +- 1 million files. PLEASE. I know I can drag the icon into "privacy" in the system prefs, but that's a schlep. How about you just make it NOT do that by default, and only make it do that (index) if I try to "find" something on the drive. And anyway, what's with spotlight giving a progress bar while it indexes an external drive? I want to be able to search my internal drive while it is indexing the external. Plus it maxes out the CPU and your average joe user has no idea why his mac is overheating and phones me to ask.

2. I'm very seldom going to keep contents on a flash disk constant. A flash disk is just a piece of junk that I copy stuff onto and GIVE to people or lend to them, and I format/erase it every time. Why waste my time indexing it? Especially when I copy lots of stuff onto it? It just slows my mac down and annoys me and fills the flashdisk with a database file of search results that aren't going to work on their windows(tm) pc anyway!

3. The choices/search results are incredibly annoying. Most of the time I want to show a file in the Finder, not open it, and I want it to show me the most recently found or opened item that matches the string typed. Suggestion: when it shows a found result, give the item a "show" button and an "open" button, or something similar (e.g. right-click the found item for "show in Finder"; left-click to open it). I realise there are workarounds (open the file, right-click the title in the titlebar, etc.).

--edit: Angelo Kyrilov tells me: "There is a simple way to get it to do what you want. If you hold down the command key while clicking on an item, it shows that item in Finder. I too often find it more useful to find the item's location rather than open it."

4. If I have a folder called 12-01 from december 2001, and a file called 12-01.pages which I made on 1 december 2011, SURELY it should offer the latter file first when I search for 12-01? I get it that it thinks it's a calculation, but ...If I have two files called 12-01, e.g. a spreadsheet, a wordprocessing file, etc., which I just edited a few seconds ago, it should offer the most-recently-modified items first? The same applies to apps. Offer the most-recently opened first.

5. If I type ADOBE, I don't want to see all the cruft that Adobe installs on my filesystem, like the help viewer and the auto updater, I want it to offer Photoshop first, then Acrobat, then Distiller, in the order of what I am most likely going to use. So like with iTunes stars or number of times a track is played, keep track of which are my favourite or most opened apps, and offer them first.

This would be great, because then I could get rid of the Dock, which is unusable, because I have so many app icons and minimised windows that I cannot see what I'm looking for, and always end up giving up and going to Spotlight to open something that is actually visible (microscopically) in the Dock. E.g. at the moment I have 40 icons open - and I'm just surfing. 8 GB RAM. Your apps take so long to open that I never quit apps. When I'm working seriously, it gets to about 50 items in the Dock including minimised windows, and icon size goes down to about 10-15 px. I realise I can enable the magnify effect, but that just makes the icons wobble and move, and hard to hit, and I STILL have to scrub the whole Dock to find what I want. I also realise I can turn off magnification, scrub the Dock, and see the item names, but that wastes my time, because I have to do an exhaustive search from top to bottom (my Dock is right-vertical aligned).

It would just be faster if I could hide the dock and ONLY use Spotlight, and it only showed my most recent apps. In fact, the only reason I keep the dock open at all is to drag files onto app icons to override the default app for a suffix, e.g. if I want to force an HTML file to open in BBEdit rather than Safari (for just one case).

I ALSO realise there's a "defaults write" hack to enable this in the dock (show most recent items), which helps a little, but the icons it shows are massive and it brings up a scrollbar. Here's a rule of user interface design: avoid scrolling as much as possible. It's horrible and annoying and wastes time. Try showing everything at once, tiled, visibly, with file titles - like the app launcher thing from iOS that you moved into Lion/Mountain Lion, preferably without a paginator. Just shrink the icons down to a user-defined limit, with the minimum icon size and the maximum number of displayed items calculated from that limit and the screen size. I.e. a version of dashboard/exposé/launchpad/launcher/whatever you want to call it today, that works.
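For anyone hunting for it: the hack I mean is, if memory serves, the one below. It's macOS-only and uses undocumented preference keys that have changed between releases, so verify before pasting; the guard just makes it a no-op on other systems.

```shell
# Adds a "recent applications" stack to the Dock (list-type 1).
# Undocumented macOS Dock preference keys; use at your own risk.
if command -v defaults >/dev/null 2>&1; then
  defaults write com.apple.dock persistent-others -array-add \
    '{"tile-data" = {"list-type" = 1;}; "tile-type" = "recents-tile";}'
  killall Dock
fi
```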

6. Another example: if I type 2012, I expect first to see items CALLED 2012-something (find ./ -name). I have several; it should list all of them, files first, then folders. After that, I expect to see files that CONTAIN the string 2012 (grep) inside them. After that, I expect to see items that I used or opened in 2012 (find ./ -mtime), most recent first. Surely this makes sense?
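The three tiers map directly onto the classic tools I named. A sketch (the directory and file names are invented):

```shell
# Tiered search for "2012": name matches, then content matches, then recency.
dir=$(mktemp -d)
echo "notes"           > "$dir/2012-notes.txt"  # name contains 2012
echo "budget for 2012" > "$dir/budget.txt"      # contents contain 2012
echo "misc"            > "$dir/misc.txt"        # neither; merely recent
find "$dir" -name '*2012*'        # tier 1: items CALLED 2012-something
grep -rl '2012' "$dir"            # tier 2: files CONTAINING the string
ls -t "$dir"                      # tier 3: everything, newest first
```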

7. Why on earth would the Find function in the Finder give completely different results? They should be the same. Find in the Finder seems to do a grep only, or match on date modified only, when I search for 2012. It should come up with the file called 2012 first, then the folder. Moreover, why on earth is its default scope my entire computer? That's ridiculous. Most of the things I'm searching for are either files I misplaced or mis-sorted, or email contents, in my local home directory.

8. Can we use operators in our Spotlight/Finder searches? If not, why not? If so, where is the manual? Can I say &, |, !, -, etc.? I tried it, and it seems to do something weird: it assumes, for example, that 2012 is always a date search and never a filename string search. I tried filename:2012 & .pages, and it still didn't bring up my file called 2012.pages. You know, I really miss the Find command from System 7. That rocked.
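Partial answers, as far as I can tell: Spotlight's search field does understand the uppercase boolean operators AND, OR and NOT (the symbols I tried don't seem to do anything), and the command-line front end, mdfind, exposes rather more, including scoping and raw metadata queries. Since mdfind only exists on macOS, this sketch just prints the invocations I mean rather than running them; the flags are real, but check `man mdfind` on your version:

```shell
# Print (not run) some mdfind invocations; mdfind itself is macOS-only.
printf '%s\n' \
  "mdfind -name 2012.pages" \
  "mdfind -onlyin ~ 2012" \
  "mdfind 'kMDItemFSName == \"2012*\"'"
```

The first matches on file name, the second limits the scope to the home folder (exactly what I want the Finder's default to be), and the third is a raw query against the same metadata index the Finder uses.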

Thanks for listening.