Friday, September 30, 2016

Euthanasia's Slippery Slope


Legalise euthanasia, and compassionate society dies too

If you love your parents, respect your children, care for your society and think compassionately about your world then it is time to open your heart and brain to what happens when a jurisdiction legalises killing or, as it is called, euthanasia.
The justification for euthanasia lies in human rights, individual autonomy and relieving pain — all worthy ideas, and that may prompt the question: why then is euthanasia still opposed by most nations, most medical professional bodies around the world and the Australian Medical Association?
The reason is not hard to find. It is because crossing the threshold to euthanasia is the ultimate step in medical, moral and social terms. A polity is never the same afterwards and a society is never the same. It changes forever the doctor-patient bond. It is because, in brutal but honest terms, more people will be put at risk by the legislation than will be granted relief as beneficiaries.
The argument against euthanasia has endured for many years: it leads, on balance, to a less compassionate society that creates a new series of moral and practical hazards for itself. It is a disproportionate response to the real problem of patient pain that needs more care and money. It is because a society that legalises killing has to change fundamentally in terms of the ethics of its doctors, its medical ethos, its family relationships and its principles of human life. Belgium, having legalised euthanasia in 2002, offers a tragic picture of what can happen to a country just a few short years later.
In this debate the principle of individual autonomy is vital. Adults, as much as possible, should be able to exercise choices over their medical treatment. That means declining treatment that can keep them alive. There is no real dispute about that.
Euthanasia is different: it is an act that terminates life. It is, therefore, by definition not a private affair; not just about a patient’s right. It is a public and society-wide issue because it involves the state legalising killing subject to certain conditions. That is a grave step and it concerns everyone.
AMA head Michael Gannon tells Inquirer: “The current policy of the AMA is that doctors should not involve themselves in any treatment that has as its aim the ending of a patient’s life. This is consistent with the policy position of most medical associations around the world and reflects 2000 years of medical ethics.”
There are three foundational points in this debate. First, in relative terms the proportion of people dying in acute pain is declining because palliative care methods have been enhanced. There is wide agreement among experts that most physical pain at life’s end can now be managed — this is a critical trend but cannot conceal the fact that painful deaths still occur, and they remain the main argument for legal change. But euthanasia should not be seen as a substitute for palliative care — that would be a medical and moral blunder.
Second, where euthanasia is legalised the record is clear — its availability generates rapid and ever expanding use and wider legal boundaries. Its rate and practice quickly exceed the small number of cases based on the original criteria of unacceptable pain — witness Belgium, The Netherlands, Switzerland and Oregon. In Belgium, figures for sanctioned killings and assisted suicide rose from 235 in 2003 to 2,012 by last year. In the Netherlands they rose from 2,331 in 2008 to 5,516 last year.
These figures come from Labor MLC Daniel Mulino’s minority report in the recent Victorian parliament committee report recommending euthanasia. His conclusion is that “the negative consequences arising from legislation far outweighs the benefits arising in that minority of cases”.
Experience in other jurisdictions leads to the unambiguous conclusion: the threshold event is the original legalising of euthanasia. After this there is only one debate — it is over when and how to expand the sanctioned killings. Claims made in Victoria that strict safeguards will be implemented and sustained are simply untenable and defy the lived overseas experience as well as political reality. There are many questions. If you sanction killing for end-of-life pain relief, how can you deny this right to people in pain who aren’t dying? If you give this right to adults, how can you deny this right to children? If you give this right to people in physical pain, how can you deny this right to people with mental illness? If you give this right to people with mental illness, how can you deny this right to people who are exhausted with life?
Third, culture and values will change to justify the death process. Consider the situation of one of Belgium’s most famous doctors, Wim Distelmans, applauded as a human rights champion. Having killed more than 100 patients, he is a celebrity, gives talks around the nation and is lauded as a man who “cannot stand injustice”. He told Der Spiegel that giving a lethal injection is an act of “unconditional love”.
In Belgium, because so many are killed, the act must be converted into the exemplar of moral and medical compassion.
“Who am I to convince patients that they have to suffer longer than they want?” Distelmans said in one of the most astonishing articles of our time (“The Death Treatment” by Rachel Aviv, The New Yorker, June 22, 2015).
It is the story of how an adult son, Tom Mortier, sought justice after Distelmans killed his mother without Mortier’s knowledge. Distelmans was appointed chairman of the Federal Control and Evaluation Commission, whose job is to assess whether doctors have complied with Belgian law. He told The New Yorker: “We at the commission are confronted more and more with patients who are tired of dealing with a sum of small ailments — they are what we call ‘tired of life’.”
Though their suffering derived from social as well as medical concerns, Distelmans said he regarded their pain as incurable. The article reported that 13 per cent of Belgians who were euthanised last year did not have a terminal condition. In Belgium euthanasia and suicide march together — it also has the second highest suicide rate (excluding euthanasia) in western Europe.
The most chilling aspect in a chilling story was Distelmans’s moral superiority in dealing with Mortier, prompting Mortier to write later: “I loved my mother for more than 30 years and I wanted her to live; Dr Distelmans loved her so much — ‘unconditionally’ — that after a few brief consultations over six months he gave her a lethal injection.”
Once you sanction euthanasia you open the door to euthanasia creep. The human heart will always respond to the incentives of the law. Cross the threshold and doctors will be encouraged to think it is their job to promote the end of life. Sick people, thinking of families, feel obliged to offer up their deaths. Less worthy people exploit the death process for gain. In Belgium children can now be euthanised. Would this have been acceptable when euthanasia was legalised in 2002? No way.
The article quoted a professor of psychiatry at the University of Leuven, Dirk De Wachter, calling euthanasia a humanist solution to a humanist dilemma. “What is life worth when there is no God?” he asked. “What is life worth when I am not successful?”
There are an infinite number of similar questions: what is life worth when you are lonely or depressed? De Wachter said he had recently euthanised a woman, not suffering from clinical depression but in a condition where “it was impossible for her to have a goal in life”.
Pro-euthanasia advocates in Australia, when dealing with Belgium and The Netherlands, are split between defending their practices and saying they are not relevant to our debate. The latter is false. These countries are highly relevant — as classic studies in how the euthanasia culture takes grip of a nation’s moral sense. It is sanctioned in terms of love, liberation and compassion — the ultimate service one human can render another.
The recent Victorian parliamentary report Inquiry into End of Life Choices recommended that people be assisted to die by being prescribed a lethal drug to be taken by themselves or administered by a doctor. It outlined a series of strict guidelines as eligibility criteria — approval by a primary doctor and a second doctor, and only for patients suffering at the end of life. The condition must be serious and incurable. The request must come from the patient and be free of coercion, be properly informed and be made three times: verbal, written, then verbal again.
There is significant support for euthanasia in the Victorian cabinet and on the opposition frontbench. A bill is certain in the life of the present parliament. Expectations are that it will be passed.
The AMA’s Gannon says the association is conducting a review of its euthanasia policy. He says this is “routine” and not prompted by “recent events”. He highlights the paradox of euthanasia. “It is only a rich country issue,” Gannon says. “There is no one in the developing world talking about terminating the lives of patients.” The AMA review will be completed in mid-November.
The pro-euthanasia group within the AMA hopes to shift its policy from opposition to neutral, mirroring the shift made in Canada — and that would be a significant step. In its evaluation the AMA must focus beyond the issue of patient autonomy to confront the question of doctor-patient relations and how they would change under euthanasia.
A critical feature of the Victorian report is the belief that a small number of people seeking euthanasia can be helped without any significant downside for society. It seeks to achieve this through robust eligibility criteria and the repudiation of any “slippery slope” problem with euthanasia in jurisdictions such as Oregon, The Netherlands and Switzerland.
Such optimism is heroic and typical of the euthanasia debate. It is echoed in nation after nation, year after year. It testifies to the deepest humanist conviction that mankind and wise governments can introduce euthanasia regimes with the necessary legal safeguards and the necessary regulatory protections to manage the promotion of death to ensure only net gains for the social order.
It is surely extraordinary that people sceptical of the ability of governments to get trains running on time fool themselves into thinking they can confidently manage a regime that sanctions the termination of human life.
The minority report from Mulino provides statistics showing there has been a sustained increase in deaths in all jurisdictions and no evidence that growth rates are plateauing, with compound annual growth rates ranging from 13 to 22 per cent, which Mulino says have to be regarded as “extremely high”. He says the total number of cases in Belgium has increased by 756 per cent over 12 years and in Oregon is 725 per cent higher across the 17 years since the initial legislation.
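As a quick cross-check, using only the figures quoted here and assuming the percentages describe cumulative increases over the stated periods, the two sets of numbers are consistent with each other:

\[(1+r)^{12} = 8.56 \;\Rightarrow\; r \approx 8.56^{1/12} - 1 \approx 19.6\% \quad \text{(Belgium, 756 per cent over 12 years)}\]
\[(1+r)^{17} = 8.25 \;\Rightarrow\; r \approx 8.25^{1/17} - 1 \approx 13.2\% \quad \text{(Oregon, 725 per cent over 17 years)}\]

Both rates fall within the 13 to 22 per cent range of compound annual growth cited from the minority report.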
What sort of society is evolving if these growth rates continue? Why cannot we rationally confront and answer these questions? What drives the rise in deaths?
Mulino says the evidence reveals euthanasia and assisted suicide regimes “come under immediate pressure as soon as these schemes are enacted”. First there is pressure to widen the law, and second there is pressure to interpret its implementation more generously. And we think Australia is exempt?
There are many examples. In Canada, there are advisory group recommendations to extend the law to children. In Belgium, extending euthanasia to dementia patients is under examination. The Netherlands is considering allowing patients to make pre-dementia declarations.
The trend and logic are unassailable: once legislated, the principle of euthanasia is settled and the practice of euthanasia is widened, if not by law then by administrative laxity and de facto regulatory sanction. Of course, many euthanasia cases are never declared.
A 2012 report by the European Institute of Bioethics said: “Initially legalised under very strict conditions, euthanasia has gradually become a very normal and even ordinary act to which patients are deemed to have a right.”
Many advocates in Australia use this language of rights. Once it takes hold, holding back the tide is near impossible. The upshot in The Netherlands is that the type of patient seeking euthanasia has changed, with a shift to those with psychiatric illness. Mobile clinics offering free lethal injections are now in operation.
Mulino refers to an Oregon Public Health Division report looking at 132 deaths and finding that 48 per cent of them listed being a burden on family, friends or caregivers as a concern. When the Belgian law was passed politicians insisted that patients with psychiatric disorders, dementia or depression would be excluded — yet the prospect now is for an escalation in these categories.
Vulnerable people are right to feel uneasy if Australia crosses the legal threshold. In truth, it is virtually impossible to ensure all acts of euthanasia are voluntary. The elderly, lonely, handicapped and indigenous need to think how such laws may affect them and their self-esteem.
In short, the foundational claims in the majority Victorian report of no “slippery slope” and effective “safeguards” do not pass the test of evidence, experience or careful analysis. This goes to the question of whether Australia will legislate on false and misleading assumptions that reflect ideological and political propositions.
On the pivotal and related issue of palliative care, Australia suffers a moral and humanitarian failure — and the Victorian report has responded with a strong set of recommendations.
Palliative Care Australia chief executive Liz Callaghan tells Inquirer: “The practice of palliative care does not include euthanasia or physician-assisted suicide, and palliative care does not intend to hasten or postpone death. PCA believes the Australian government needs to ­increase access to palliative care.
“Currently 70 per cent of Australians want to die at home but only 14 per cent do. We believe more needs to be done to ensure that this can happen. Access to integrated, comprehensive support and pain/symptom management is often inadequate, inequitable or may not meet patient needs.”
Callaghan says the evidence is that pain management improved from 2011 to last year, based on data collected from 115 specialist palliative care services looking after 20,000 patients needing pain management. She says PCA believes more needs to be done to ensure people are better educated about their end-of-life care choices and palliative care. The PCA believes any request for euthanasia requires “a respectful and compassionate response”, with Callaghan saying euthanasia is an issue for parliaments.
It is ironic that this week more evidence has emerged about the shocking impact of suicide in this country, particularly for Australians in the 15 to 44 age group. How, pray, does legalising euthanasia help the campaign against suicide? The most bizarre notion this week was the suggestion that legalising euthanasia may lower the suicide rate.
In many ways this entire debate is about how to interpret love and care in the context of death. Hug the person you love. But realise this is also about deciding the degree of discretion doctors have dealing with death. It may be good for a doctor to follow a patient’s wish for a lethal injection but that must be assessed against the total social impact of a regime that allows life to be terminated.
If we proceed then life will change, there will be a “slippery slope”, your relationship with your doctor will be different, the vulnerable will have reason to feel uneasy, the push to make euthanasia a right will be inevitable, the frail will feel obliged to volunteer and our values as a community will shift more quickly than you appreciate.

Friday, September 09, 2016

Climate Change, Q & A and Brian Cox



GRAHAM WOODS

Open Letter to an Alarmist Shill

That the ABC would stack its Q&A panel and audience comes as no surprise, as that is standard operating procedure at the national broadcaster. But even by those low standards, the appearance of alleged climate guru Brian Cox plumbed new depths in bias, bad manners and partisan indulgence from the compere.

Dear Brian,
I’d appreciate your response to this email, which deals with your recent appearance on the ABC’s Q&A program.
First, I want to make it clear that, where you’re concerned, I’m not a ‘vexatious invigilator’.  My wife and I (each with an earned PhD) have watched most of your TV programs, and have been struck by their intellectual clarity and your unassuming personal style (as well as by your BMI: we’re high-level wellness devotees).  With that said, we both have serious misgivings about your recent appearance on Q&A.
No pronouncement that enjoys an audience has zero social consequences, and the more prominent the pronouncer the more significant the consequences are likely to be.  Your recent Q&A appearance brings that out well.  You were treated like a science guru, both by the audience and by compere Tony Jones, and it’s inevitable that what you said will affect the opinions of hundreds, probably thousands, of people.
You might disagree, but I’d argue that your authority carries a responsibility: a responsibility to ensure that your audience (whether that’s one person or thousands) is not misled by your pronouncements.  It’s difficult to evade the conclusion that, on this recent occasion, you didn’t live up to that responsibility.
First, the program itself, including some of its history.  In 2007, Tony Jones brought climate change sceptic Martin Durkin onto his program.  My wife, Denise, and I, at that stage relatively uninformed and open-minded about the subject, expected Durkin to be given a decent opportunity to put his case.  Instead, we watched the attempted ‘credibility destruction’ of a person who had obviously been set up to be ambushed.  The attack was carried out most enthusiastically by Jones himself.  I was so appalled by Jones’ behaviour that I wrote to the ABC about it (so did others); Denise and I were so disgusted that we’ve never been able to bring ourselves to watch Jones since.
In the recent Q&A (which, as a matter of duty, I watched during its second airing, on Tuesday, August 16, 2016), Jones attacked nobody, but the ‘stage-management’ of that episode was undisguisedly tendentious.  On the panel there was no acknowledged climate or ‘climate-related’ scientist with known anti-AGW views (e.g. Bob Carter, William Kininmonth, Ian Plimer) – and, had there been, I suspect that you wouldn’t have been there.  In fact no panel member at all was a bona fide climate scientist: i.e. a scientist with specialised knowledge in one (or more) of the disciplines that are demonstrably related to global climate behaviour and who frequently applies that knowledge as a professional contributor to that field.
Instead, the panel comprised a ‘science superstar’ (an appellation used by commentators both before and after the show); a federal minister who would (inter alia) be interrogated about cutting spending on climate change; a federal opposition member with no obvious responsibility for any aspect of climate; a ‘mathematician’ (publicity blurb) who holds a bachelor degree built only partly on mathematics per se and who, as far as I know, is a person not connected professionally with any aspect of climate science research per se; and one lay climate sceptic who is – unfairly or not –  perceived by many Australians as an extremist (on many topics) and so was expected to shoot himself in the foot on the subject of climate change.
The outcome of the ‘debate’ was predictable: most media presented it as climate change scepticism being ‘debunked’ by a leading scientist with a worldwide reputation.
Whether or not you agree with your admirers about your status, as an experienced science presenter you know as well as I do that a national broadcaster in a self-proclaimed democracy has an ethical duty to present material in a balanced and fair manner; this was never more incumbent on a broadcaster than in the case of climate change.  Yet the Q&A program was stacked in a way that should have been expected to prejudice one side of a debate.  Why, then, did you agree to take part?  If your response is that there is no valid debate (a strange stance for a scientist anyway), then why were you there?  To lend your support to interaction that had a better than even chance of being reported as the modern equivalent of bear-baiting: a spectacle to thrill the masses?  (If you think I’m exaggerating here, just look at the coverage the show got later, including in your own British papers.)
A possible defence of the makeup of the panel is that the Q&A episode would deal with more than just the subject of climate change.  I would argue that to do justice to climate change under such circumstances was therefore next to impossible, and that it should not have been on the agenda.  But it was, and you came onto the show armed with graphs; given all the circumstances a balanced treatment of climate change never had a chance.
Another defence you might put is that you tried to be reasonable on the show and you can’t be held responsible for the way it was later misrepresented in the media and that you’re really quite surprised.  That defence might excuse a naive artisan scientist who works in a lab or out in the tundra, but you’re anything but.  You’re an experienced, media-savvy science presenter, well informed about embedded messages, the importance of context, the impact of the status of the messenger on the message, and much more.  (The ‘much more’ includes, or ought to include, the politics of science career protection and of access to scientific journals: each of these connected intimately with the other.)  The idea that a world-famous scientist, armed with graphs, would not have a profound influence on public opinion about climate change, is laughable; the idea that a contest between such a person and an already somewhat demonised antagonist would produce anything other than the result that it actually did, is just as risible.
I want to turn, now, to what you actually said (i.e. on climate and climate change) and what you didn’t say.
You presented some graphs (the fact that you had some ready suggests complicity in a stitch-up, but I’ll let you tell me how it was that you came onto the show so well prepared).  I’d appreciate copies, in due course, but what I took these graphs to show is that the mean global temperature has risen steadily during the last 100 years and so has the concentration of atmospheric CO₂.
The fact that the mean global temperature has risen during the last 100 years says nothing about what it was doing before then, and says nothing at all about its causes.  Even if the 100-year correlation with rising levels of atmospheric CO₂ were perfect (and there isn’t 100% agreement even on the purely statistical question of how good the correlation is), that proves nothing whatever about causation.  The fact that correlation says nothing about causation (a fact that guides all empirical inquiry, including science) was drawn to your attention by Malcolm Roberts, your sceptical fellow panel member, the fellow who, according to subsequent media assessments, you ‘schooled in the science of climate change’ and ‘exposed and destroyed’, and who is a ‘climate change denier’ (he isn’t) whose claims you refuted (you didn’t: you disputed them).
These facts – I call them part of the immediately relevant context – you grossly played down (I quote you: ‘The absolute – absolute – consensus is that human action is leading to an increase in average temperatures.  Absolute consensus.  I know you may try to argue with that but you can’t.’).  If you want to argue that to include this ‘relevant context’ would have opened up issues that couldn’t easily be addressed within such confines, doesn’t that apply a fortiori to your decision to display the graphs themselves?
The predictable result of your manoeuvre was well captured in a tweet I noticed during the program: ‘the graph speaks!’.  With the utmost respect to this benighted soul: in the context of the program, the graph speaks but it does not fairly inform; moreover a glove puppet speaks too, but who is its manipulator?
As I understand it you’re an astrophysicist and/or particle physicist – and, by my own reckoning, you’re an excellent science presenter.  On Q&A the other night you let yourself be manipulated (or deliberately chose the strategy yourself), such that it was your credibility as a ‘scientist’ that gave credibility to your comments on climate.  But you aren’t a ‘climate scientist’ (i.e. you don’t study climate and its perturbations as your primary professional job): in the area of climate and climate change you’re a layman – almost certainly a well-informed one but still a layman: like your chief opponent, Malcolm Roberts, and like me in fact.  Your moral duty, I believe, was to emphasise, for the benefit of your audience, that you’re NOT a climate scientist: in respect to climate change your most pertinent expertise is that you’re a presenter and explicator of science topics.  Along with any other panel member, you had a perfect right to nominate the dimensions of climate and climate change that you believe deserve to be put on the table, but as a non-specialist and a non-expert you had an obligation to confine those dimensions to those about which there can be very little doubt whatever: dimensions or facts that any intelligent non-specialist could, in principle, discover for herself.  Here are some of them, the first and second groups surely safe from dispute by any climate scientist:
  • Planet Earth is a dynamic planet in a dynamic solar system: thus climate change is, now and for millions of years to come, inevitable and unstoppable.  In the absence of climate change, life as it exists on our planet simply wouldn’t.
  • Our global climate system is almost incomprehensibly complex: across geological time and into the present affected interactively by the sun; the moon; possibly by some of the larger planets; by tectonic plate movement; volcanic activity; cyclical changes in the earth’s oceans; changes in the quantum and distribution of the earth’s biomass; changes in greenhouse gases that themselves are the result of changes in more underlying factors; by changes in the earth’s tilt and solar orbit; probably by changes in the earth’s magnetic field; and possibly by some other non-anthropogenic factors that at present scientists either don’t know about or whose impact they haven’t yet fully appreciated.
  • ‘Consensus’ means ‘majority view’; majority views can be egregiously wrong (witness the work of apostates Marshall and Warren in the case of Helicobacter pylori and stomach ulcers).
  • There is no published estimate of the degree of consensus on any aspect of climate or climate change that is so statistically robust that it can’t be contested; in any case, the size of the majority in favour of a scientific conclusion is logically disconnected from its validity: scientific hypotheses and conclusions are refined and proven by empirical data, not crowd appeal.
  • There are now countless thousands of studies drawn from at least twenty scientific disciplines that aim to – or purport to – shed light on how the earth’s climate ‘works’.  Many of their results and conclusions are, by their authors’ own reckoning, tentative; the results and conclusions of some studies contest the results and conclusions of others.  There would be few, if any, aspects of climate that could claim 100% agreement among the relevant researchers except some of the raw data – and even many of these are contested, because different (though prima facie equally defensible) methods have been adopted to collect them.
  • In 2016, the feedback loops and tipping points that are assumed to affect global climate systems are, in actual real-world settings, imperfectly understood, and tipping points in particular are largely speculative.  This is true regardless of the possibility (even the likelihood) that the current ‘very rapid pulse increase’ in CO₂ is geologically unprecedented or the possibility that it will have irreversible climatic consequences.
  • There is demonstrable scientific debate about the presumptive roles (yes, roles) of CO₂ in medium- and long-term climate change in the real world – and there is no conclusion about how CO₂ is related to these dimensions that is supported by incontestable empirical evidence.
  • The impact of anthropogenic CO₂ is therefore a scientific question, not a matter on which ‘the science is settled’ or ‘the debate is over’.
You might nit-pick one or other of my proposed ‘agreed facts or dimensions’, but as a non-specialist appearing on a national TV ‘debate’ entered into by non-experts you had no legitimate brief to ignore most of them and substitute (albeit via oblique insertion) what I assume are your personal convictions about climate and climate change, misusing the face-validity of your science credentials to have your views uncritically accepted by a wider audience.
Let us, for one phantasmagorical moment, pretend that all the data are in (this would be a first for any science ever, and would transform it from science to dogma), that ‘the (scientific) debate is over’, that CO₂ has been shown unequivocally to be the main driver of global warming during the past 40 years, and that the existence of countervailing global mechanisms is vanishingly unlikely: given the world-wide concern about ‘climate change’, and given your high profile as a scientist, you have further duties of care I believe.  Chief among them is to help people understand what sort of world they’ll inhabit if fossil-sourced substances are taken off the menu.
Nuclear-powered electricity generation could, theoretically, substitute for a very significant proportion of current fossil-fuel-powered generation.  Assuming uniformly supportive governments and negligible public opposition (an unlikely scenario), nuclear power could be up and running across the world in 5-10 years.  It follows that fossil-fuel-powered generation will be required for at least that long: in reality it’s likely to be much longer.
Assuming anything less than a massive increase in nuclear electricity generation, in the absence of fossil-sourced energy and fossil-sourced raw materials (for many of which there are currently no realistic alternatives) at least the next twenty years would be years with minimal heating and cooling; with compromised urban street lighting; with compromised sewerage and other waste disposal systems; without motorised transport, functional agricultural, mining and industrial machinery, newly manufactured computers and tablets, mobile phones, television sets, refrigerators, bicycles or any other conventional consumer goods, including clothes and shoes; and with inadequate food and/or water for most of the world’s people and their pets and livestock.  Modern medicine would collapse; so would most school systems; so, probably, would our financial systems – and possibly even our political systems.  In such a world, people like Brian Cox won’t be able to jet to Australia – and will struggle to conduct their professional lives even via video-conferencing – and Al Gore will have to significantly reduce the scale of his energy-dense lifestyle.  The world as we’ve come to expect it during the past century simply won’t exist, and many of its human inhabitants will perish: in particular the already impoverished, the very young, the otherwise frail, and the physically handicapped.  In a world so beleaguered civil unrest is certain, and food-looting, widespread violence and murder are virtually guaranteed.  This is the larger context in which the ‘climate change debate’ (now over … ) should be conducted.  It’s a context that implies balancing risks against benefits, and that balance will have to be struck even if the worst of the climate-change scenarios is realised.
The sciences that are contributing to our full understanding of climate and climate change are a long way from achieving their goal; the debate that characterises their work is a sign of a healthy scientific enterprise.  The rise of climate-related ‘think-tanks’, ‘idea clearing-houses’ and other lobby groups or individuals (on both sides of the contest) is predictable, but the main contribution of many of them, ill-informed and/or tendentious as they are, has been to massage prejudices and close people’s minds.  Your own contribution, on the recent Q & A, will almost certainly have that effect too: in fact the subsequent media coverage already comprises powerful evidence.  Your implicit invitation, that people do their own research, is disingenuous: you know, as well as I do, that most people won’t do their own research, and that many are simply not capable of it.  The vast majority of the world’s public look to respected spokespersons such as you to instruct them about what they should think and believe.  You have a profound duty of care to instruct them even-handedly and fairly; I believe you failed in that duty on the recent Q & A.
Denise and I will continue to watch your programs, simply because they’ve been so good and you seem so decent.  However, your recent appearance and performance on Q & A have severely disillusioned us.  Perhaps now is the time for you to consider a series of programs of your own on this, one of the most vexed of current topics.  Your flair for making complex subjects intelligible to the lay person without misrepresenting them (sadly, not on display on the recent Q & A) should make the series compelling.  But it’s precisely your recent appearance on Q & A that leads us to doubt that you would be able to conduct such a project fairly and with an open mind.
Undoubtedly you have a lot on your plate, so I’m willing to wait until 7 September for your response to this letter, without taking any further action.  If I’ve heard nothing by then, I’ll rework the document and distribute it as an open letter, available for scrutiny and comment by anybody.
With best wishes
Graham Woods
Australia

Sunday, September 04, 2016

Settlement or Invasion


Quadrant magazine

ROBERT MURRAY

1788 Deserves Better


Invasion won prime ministerial approval at the winter election and now seems set to win the decades-old battle with settlement as the key word in the larger argument about whether or not the origins of modern Australia were “shameful”.
Invasion is mainly applied to the arrival of the First Fleet in 1788—one never hears it applied to settlement in comparable contexts, for example settlement in the north—and there are arguments for and against. It leads, though, to a general but erroneous public impression of the past. It gives the impression that European settlement in Australia was only achieved by armed force and widespread killing, which is far from the truth.
The invasion/massacre/dispossession “black armband” version of history now routinely appears in the media and fiction, is taught in schools and universities and is a staple of private conversation, often in a deeply moral way. But it is invariably just a “given”—a vague and generalised presumption without detail or context. It originates in history books, themselves sometimes only moderately biased, and then seeps into the system, becoming exaggerated over time. The available facts, however, do not support it.
There is not much direct evidence either way about what the Aborigines thought at any time about whites arriving. I have read everything I could find over many years, and the indirect evidence suggests that the Aborigines usually—though not always—accepted our forebears, shrewdly and warily. There is no evidence that, except in a few parts of the tropical north, the indigenous people “resisted invasion” or were in any modern sense “freedom fighters”.
The “evil ancestors” version, blaming supposedly ruthless and racist governors, police, soldiers and squatters, also barks up the wrong tree.
There is much more evidence about convict transportation and the record there is clear: for all its rough edges it was massively successful in improving the lives of scores of thousands of people.
To be quite clear, the colonial period was a disaster for Aborigines. But they have genuine grievances enough without the need of exaggeration or invention. The disaster that befell them was essentially a consequence of the outside world colliding with a continent and its people after they had lived in isolation for 50,000 years. Even if it had been foreseen, it would not have been avoided for long had Britain decided to ship its convicts elsewhere. Regrettably, it was an inevitable part of the opening up of the Pacific.
The extreme “black armband” version of Australia’s pioneer days is a fantasy, a 1970s radical confection from assorted anti-colonial theory, American history, anti-Vietnam War ideology and a dash of South Africa grafted on to cherry-picked bits of Australian history.
It is simply not true that the 1970s brought, Watergate-style, a sudden rush of enlightenment, until then suppressed, especially from school books, about the “shame” of Australia’s past and the “need to come to terms” with it. As with all history, there have been a lot of new books about the white–black encounter since the 1960s, varying from good to patchy to bad, but they add detail, opinions and interpretation; they do not change the main story.
Many of these books display an anti-white bias, mainly through being selective, and have sought to enlarge the Aboriginal presence in mainstream history and build up Aboriginal pride and confidence. The ensuing publicity established fashionable views which have been repeated uncritically and built on by guesswork.
It is hard to see how in the 1780s “permission” from the Sydney Aborigines, or a “treaty”, could have been practical, given the immense gulf of culture, language and distance. North American treaties between Europeans and natives, which are often held up as examples of what should have been done here in 1788, usually broke under the pressure of white expansion; they were rarely worth the paper they were written on.
The British executive governors in Australia instead used diplomacy and conciliation with the Aborigines, as London instructed them to do, and it worked reasonably well over the seven decades of governor rule that followed 1788.
There is no evidence that the Sydney Aborigines opposed the First Fleeters or had a big problem with their presence. Black-armband writers point to the clearing of land as an example of probably unwelcome invasion of space, or to known and recorded conflict with the Botany Bay dissident Pemulwuy.
While there is no record of what the Aborigines thought, a vast amount has been recorded and written about early Sydney. On balance the impression is of a fairly successful, well governed and harmonious society, especially given the human raw material.
It is unlikely that any of the London officials who authorised and organised the early fleets had ever heard of the words “Terra Nullius”, which was then a legal technicality but is now widely supposed to be the doctrine that allowed the British and other European powers to “invade” countries such as Australia.
From the remote past until about a century ago, powerful countries considered it normal to plant small colonies in less developed lands. This was particularly so when the colonies were in territory with small populations, “wilderness” without towns and farming or visibly built dwellings or identifiably effective government. Group movement of people from one country to another was seen as normal.
A lot of the world was still like this in 1788, though it had declined greatly over the centuries. Immigration and port restrictions were also in a fledgling state worldwide. It was a quieter, less busy world. World population in 1788 was about one billion, compared with 7.5 billion today.
The original European colonies were usually intended for trade, such as those planted in seventeenth-century India, but it was common enough also to plant them for domestic purposes, such as exiling convicts or for major power strategy considerations—all factors present in early Australian colonies, though convict transportation was the main purpose here. Over time and for various reasons, such as wool growing in Australia, early niche colonies often grew far beyond their original purposes and were then incorporated into empires.
What happened in Australia also happened in comparable ways in India and the Americas. It had been the way of the world back at least as far as ancient Rome and could be viewed as a natural way of bringing stronger, more developed society to the weaker ones. One could endlessly ponder the strengths or weaknesses of particular cultures but it is evident that some survived and developed better than others under the pressures of the outside world.
The idea that 1788 was an invasion became popular in the lead-up to the 1988 Bicentennial. The main impetus was a perceived need to be more inclusive and build up Aboriginal pride, which seemed then to be sadly lacking. But “invasion” gives a wrong impression. People tend to imagine a process more like the D-Day invasion of occupied France in 1944 than the bland events at Port Jackson in 1788. There was a popular view around in 1988, which never even made it to print or screen, that the First Fleet came sailing in with war guns blazing at hapless indigenous defenders.
What did the Aborigines do? As far as we know, nothing much, except look on from the bush. One said later they thought the newcomers might be possums, climbing the ships’ masts as they did.
The main impression that emerges from fairly abundant records, and has been told in print over and again, is of the two Sydney-side races mingling fairly well, mutually curious and learning from each other. But there are serious qualifications to that happy picture that the black-armbanders seize on.
There were probably about a thousand Aborigines living around Sydney Harbour in 1788 in about a dozen loosely related communities, and 5000 of them in the wider Sydney region. The Australia-wide indigenous population was perhaps 750,000. The white population reached about 5000 in 1800 and only 190,000 even by 1840, when the Aboriginal population was about the same—extremely light use of the land by any standard. There were probably no more than 50,000 of each in the southern grazing lands.
White occupation of the land was slow—smidgins before 1800, and long after that only tiny coastal enclaves plus vast uncleared grazing runs. Only rarely did indigenous people directly oppose the process.
Nor was much land “grabbed”. An orderly system of government land grants prevailed in the Sydney basin and tiny areas around Hobart and Launceston in the early years. After about 1820 the grants went to wider areas within “limits of location”—about 350 kilometres round Sydney, and in the area between Hobart and Launceston. The grants were intended to grow food for the colony and help to pay its costs, to employ the convicts, teach them rural skills and encourage them to stay in Australia when their sentences expired. Immense expanses of unexploited bushland filled even these relatively “settled” districts.
Occupation by whites beyond the limits was illegal until 1836, when the wool boom had reached such a pitch that it was too big to stop. From 1836 these early “squatters” on the vast expanses beyond the “limits of location”—most of south-eastern Australia—had to pay for licences to graze their stock. Only after 1848 were the licences for fixed areas and periods, allowing more settled “stations” to develop.
Governments seemed at first to view the land as like common land in England, for informal sharing. Aboriginal rights were uncertain until 1848 but when some graziers drove local Aborigines from traditional lands, London ordered the grazing leases to be legally shared with the Aboriginal customary owners. Local colonial governments sometimes fudged the application, but the principle is the basis of Mabo.
As a rule, and with much qualification, the Aborigines accepted this rather light sharing of their lands with newcomers, who brought advantages such as more secure and diverse food and drink, metal goods, often a degree of security from hostile neighbouring tribes, and also the eye-opening wonders of the previously unknown outside world. It is, after all, normal for people to accept modernity.
Far more distressing for the indigenous people than the arrival in 1788 of a thousand interesting whites, surely, must have been the calamitous death rate at the same time. The greatest blow was the smallpox epidemic of 1789; First Fleet records estimate it to have killed about half the Aboriginal people of Sydney in a short time. The epidemic most likely originated in the Sulawesi area of Indonesia and came to the north coast of Australia with Sulawesi fishermen, who were frequent visitors at the time. Smallpox then spread throughout the continent by contact among the tribes, probably halving the total population.
A second smallpox calamity hit in the 1820s and possibly again killed half the indigenous population of Australia, as well as devastating many of the Pacific islands. Again Indonesian outlying islands are suspected to be the source. Black-armbanders ignore the epidemics or make little of them, and the less responsible ones guess that the First Fleet brought smallpox.
The origins of the epidemics have been debated, inconclusively, for two centuries but the weight of evidence is with the northern source. A First Fleet source cannot be ruled out but is the least likely of various suggestions. There is no record of smallpox on the First Fleet ships, and the time and distance of the voyage would not have allowed the germs to survive.
Once smallpox has entered an area the population gradually becomes immune, as in Britain or Java, but it devastates unexposed populations with no long-term history of it, such as the outer Indonesians, Australian Aborigines and Pacific islanders.
Another population calamity was venereal disease. Convicts and visiting seamen from 1788 onwards undoubtedly brought it but it might also have been spread by the Indonesian fishermen. Once into Australia, it spread through the unsuspecting indigenous people, devastating women’s ability to have children.
The age-old custom of lending wives—it was a polygamous society with child betrothal and two or often several wives—in exchange for presents had been a social bond over isolated aeons when visitors were rare, but it was calamitous when the outside world crashed in.
Aboriginal liking for alcohol, especially when visiting towns like Sydney, multiplied the disaster, since the Aboriginal people often traded encounters with their women for grog. White observers from time to time reported rates of 30 per cent and even up to 90 per cent of indigenous women suffering from venereal disease. Official reports on populations of various tribes show very small numbers of children, especially compared to the big-breeding colonial whites. Aboriginal infanticide—the killing of unwanted babies, including part-white ones—also reduced the population, although reliable information about its extent, which seemed to vary, is scarce. This again was age-old practice geared to life in the wild, especially in difficult times.
Other European diseases like influenza and measles do not seem to have been big killers in the early decades but they added to the tsunami of fatal disease, especially in Tasmania and other colder climates, as the white population increased, voyages became faster and more frequent, and colonisation spread. Tuberculosis was another major cause of death, among whites then too, but thought to have been devastating among blacks.
Aborigines, especially mixed-race people, gradually learned precautions against the new diseases, and immunity may also have improved. By the 1880s the plummeting population numbers of the mid-nineteenth century began to steady, but recorded indigenous numbers continued to fall; they reached a low of about 80,000 in 1930, but this figure excluded many part-European people.
Australian health control was fairly good, though, and the worst international killer diseases, such as cholera, were kept out. Little is known about Aboriginal health in the long aeons of isolation from the world.
Massacres of Aborigines by whites and vice versa have long haunted the Australian imagination. Old bush tales about them abound. People as various as South Africans and the far Left seize on them as the main reason for the decline in the Aboriginal population. Massacres had an especial lease of life in the wake of the anti-Vietnam War protests, when the spin was added that knowledge of them had been suppressed until that time.
Nobody denies that massacres happened. The challenge is sorting out the facts from substantial but inadequate evidence and an overdose of rumour. “Freedom fighting” and “defending country” rarely fit the evidence, except occasionally in the far north. I have seen very few references to verified poisonings and even fewer to poisoned waterholes.
As an extremely inadequate summary, from inadequate evidence, there may have been forty or so massacres around the country between 1800 and 1920. They were more often extended, lethal skirmishes in the bush than set-piece massacres.
Only about half a dozen that we know of were in the first fifty years of white settlement on the mainland. This period saw the great wool boom, when graziers spread over much of southern Australia, and Melbourne, Perth, Adelaide and Brisbane were founded. Most incidents could be attributed to white mistakes rather than outright resistance. Otherwise, as far as records show, relative racial harmony prevailed throughout Australia until 1838—a considerable achievement, as officialdom liked to point out.
It would not have been difficult for the Aborigines to have fought off the whites then, had they seriously wanted to. The graziers advanced with slow-moving mobs of sheep accompanied by bullock drays on barely existent bush tracks. Most of the accompanying workers, typically serving or former convicts, were on foot, usually unarmed. Such guns as there were were muskets or long-barrelled rifles, of limited use against nimble Aborigines armed with spears and knowing every inch of their country.
A period of serious conflict began in 1838 and lasted in southern Australia until the mid-1840s. The reasons are not known with certainty—the Aborigines kept no records—but they can be estimated. One would have been drought—1838 was a bad drought year. The Aborigines might also have become wary of the overbearing and permanent nature of the white advance, or perhaps they wanted rent. But this is speculation. It is important to remember that “the Aborigines” were not a single entity but a collection of loosely governed tribes (more accurately described as clans and language groups), historically antagonistic and sometimes at war.
By the late 1830s the squatters were moving into more remote country where the Aborigines may have seen them as allies of neighbouring tribes where squatters had already settled. Such indigenous “politics” seem to have been a frequent flashpoint on the frontier, but it is hard for historians to assess other than circumstantially.
Much of the lethal racial violence of 1838 to 1845 was in these new squatting districts, such as far south-western Victoria, home of the independent, technically somewhat advanced Gunditjmara people; Gippsland (south-eastern Victoria), home of the warlike Kurnai who were already bitterly at war with the Kulin of central Victoria; and in far north-western New South Wales around the Queensland border, home of the big, strong Kamilaroi.
The biggest single cause of fighting, though, was large-scale sheep stealing (at least that’s how debt-plagued squatters perceived it). To again summarise a poorly understood subject, I think the main cause was the many light-fingered former or escaped convicts then roaming the frontier and attempting to set up little pastoral operations of their own and asserting themselves over the local Aborigines.
But it was a real tangle. Various motives have been attributed to explain Aboriginal sheep stealing, from all-out resistance, to punishing particular squatters, to simply being naughty. But the main reason was that they loved barbecued mutton. It seemed more about property and women, including raids on huts and drays, than about land.
The north of Australia saw much more frontier violence, especially in Queensland, where deaths of whites at black hands numbered in the hundreds and Aborigines possibly in the thousands. By then, in the second half of the nineteenth century, guns were much more advanced, horses much more numerous and distances even more immense. Northern Aborigines had the reputation of being more aggressive and there were more of them. The convict era had long gone, so convicts cannot be blamed.
The biggest and most distinctive human killers of Aborigines there were the native police forces, comprising mounted, armed Aborigines under white officers. Especially in Queensland, they developed a dreadful reputation for killing and mistreating Aborigines from outside their own tribal group—who were the great majority of those they dealt with.
Records are poor. How much of the lethal turmoil, especially between 1860 and 1890 when graziers flooded over northern Australia, was due to defending squatters from attacks on their livestock and on themselves and their workers, and how much to friction among Aborigines, remains unknown. But it was a wild time.
The number of Aborigines killed in conflict with whites and native police forces is not known. Published estimates are extremely rough guesstimates, but probably several thousand lost their lives—a sad, if relatively small, contribution to Aboriginal depopulation.
Note: This article has been summarised from many books and records. Two good books are Jan Critchett’s A Distant Field of Murder: Western District Frontiers, 1834–1848 (1990) and Judy Campbell’s Invisible Invaders: Smallpox and Other Diseases in Aboriginal Australia, 1788–1880 (2002).
Robert Murray is the author of The Making of Australia: A Concise History (Rosenberg).