Saturday 28 March 2015

The Limitations and Elasticity of International Law: Lawfare, and the Strategic Proliferation of Violence


International law is currently the dominant framework for thinking about conflict and war. Despite its positive influences, many thinkers have argued that it has serious limitations and that, instead of restricting war, it performs many war-generative functions which are often overlooked.

Many rules and institutions exist which govern the purchase and sale of weapons, manage armed forces, recruit soldiers, make war profitable, and encourage military-related technological advancement. (See "Of War and Law" by David Kennedy for more detail.) The military also uses international law to legitimize killing, and for assurances that its (often morally ambiguous) actions are legally permissible. In this way, international law builds the institutional pathways through which war is made. War is now framed as primarily a legal institution, with international law key in its making; it has become the ethical vocabulary under which contemporary military bureaucracies operate.

"The provocative new book, Of War and Law . . . 
[is] a cautionary tale of what
can go wrong when military leaders and outside
 observers use legal language as a substitute
for independent ethical thinking."
Interpretations of international law have a significant impact on how players strategize to achieve their objectives. How the law is interpreted, and how this interpretation will be perceived by the international community, has considerable implications for the perceived legitimacy of aggressors. Legal experts, who are often employed to work alongside the military, can manipulate the law and find loopholes that test its elasticity. Maneuvering the law in this way requires the creative use of “legal pluralism”. Essentially, the question isn’t whose interpretation is right, but rather who has the power or influence to make it stick. This strategic use of international law has weaponized it to serve a primary function in war-making. Charles Dunlap coined the term “lawfare” to describe this phenomenon.

For instance, the US uses international law to excuse its targeted killing practices, which sets dangerous precedents for future conflicts. Lawfare deconstructs the battlefield: the legal and normative borders that once contained war are eliminated, liberating the use of violence in the process. (See Craig Jones for extra reading.)

In a similar vein, Eyal Weizman argues that, in the Israel-Gaza conflict, our reliance on international law led to the proliferation of violence rather than its containment.

International Humanitarian Law (IHL) experts were closely involved as advisors to Israeli military personnel in their attacks on Gaza from December 2008 to January 2009.


The conflict resulted in a disproportionate number of casualties for Palestine, and enormous damage to civilian property and infrastructure. As a result, an investigation was launched on the presumption that these attacks were intentional and premeditated. Israel claimed that its attacks were executed in “self-defense”, and tried to convince the international community that its military operations were within the scope of IHL. Furthermore, it blamed Hamas for using human shields and other indiscriminate tactics.

Israel attempted to excuse its actions with claims of “proportionality”. Furthermore, its embedding of the adjective “humanitarian” in its vocabulary stands out as a constant reminder of the apparent “legality” behind its attacks. Despite abuses such as killing women and children, refusing to help the wounded, and using air-burst white phosphorus in densely populated areas, the courts have taken no action.

The Dahiya doctrine, which refers to the intended collateral damage inflicted on urban areas by Israel in its past conflict with Lebanon, can be related to the attacks on Gaza. By its logic, the violence inflicted on civilians in urban areas was meant to exert political pressure on Hamas and influence the political process. The effect that this destruction would have was the very reason these targets were hit in the first place. They were not just collateral damage.

As a matter of fact, former Prime Minister Ehud Olmert was quoted as saying that ‘Our response will be disproportionate. We won’t go back to the rules that the terrorist organizations tried to dictate’. This kind of rhetoric is also a testament to the use of IHL's legal language, which remains central to the dialogue of war even when a state may be in violation of its rules.

Paradoxically, restraint, which is a promise of IHL, always presents the potential for escalation. In Gaza’s case, Israel claimed that it could have done much worse than it did and that it exercised restraint, thus justifying the destruction it had already caused.

There is also a correlation between issuing warnings and the proliferation of violence. Warnings were used to legitimize attacks on targets whose destruction would otherwise have been viewed as illegal. These warning systems also resulted in an increased frequency of attacks, since warned attacks were more readily authorized by Israeli commanders.

The change in Israel’s risk management policies also played a big part in the increase of civilian casualties in Gaza, as its military code of ethics permits, and indeed prefers, the killing of civilians over the loss of Israeli soldiers. The ratio of dead Israeli soldiers to Palestinians was one to 100 in the 2008-09 conflict, in contrast to one to six during the first intifada of 1987-91.

(See human rights lawyer Noura Erekat speak about the conflict in more detail):


By operating at the margins of the law, Israel’s military lawyers were expanding its borders. If enough players operate under similar principles within these gray areas, their conduct can set a precedent and become customary law, thus perpetuating violence through the rewriting of what actions are and are not permissible. In this case, Gaza was like a laboratory, where the limits of international law and ethics were tested and shifted as Israel’s regime of control strategically played with just how much it could get away with.

What is particularly interesting is how experts assessed Israel's adherence to the Laws of Armed Conflict (LOAC) in a 2014 report sponsored by the Jewish Institute for National Security Affairs.

Not surprisingly, the report casts Israel in a positive light while demonizing Hamas and Palestinian tactics. It claims that the IDF exercised an unusual degree of restraint in its attacks, and praises its innovative methods and tactics, such as advance warnings (discussed above), which apparently helped reduce civilian casualties and exceeded the requirements of LOAC.

The report also concluded that Hamas exploited both the IDF’s respect for LOAC and its own civilian population, doing so in order to manipulate the perception of the international community by presenting its own civilian casualties as the result of unlawful actions by the IDF.

As we can see, this account differs significantly from Weizman’s, and it goes to show that the question isn’t whose actions or whose interpretation of the law is right, but rather who has the power to make their interpretation of the law authoritative, and who can successfully manipulate the law to appear to be on their side.

This sort of manipulation raises serious ethical questions about the legitimacy of international law itself and whether it really serves to create a more just, impartial, and fair world.



Author: Marianna Stetsiv
_______________________________________________________________

Additional videos:

Israel Gaza Conflict: "Most of our shooting was random...we didn't think about civilian casualties"



Is International Law Effective? The Case of Russia and Ukraine.

Wednesday 25 March 2015

Exploring the Relationship Between Modern International Law and Warfare



By Alexandra Crow

It has been argued by scholars such as David Kennedy that modern warfare is a legal institution. Kennedy argues that "when we talk about war today... we are not talking about wars between great powers, relatively equal in size... war today tends to happen at the periphery of the system among extremely unequal players." He goes on to state that this modern warfare takes place with the involvement of non-state actors and new warfare technologies. He argues on this basis that the changing nature of the relationship between law and war means that law ends up legitimising the use of force rather than restricting it. Law legitimises certain actions in war by ruling on questions such as whether a particular strike was proportionate, and this takes the discussion out of the old moral justifications and into a legal realm. He argues that this ends up causing more conflict and violence.

In an attempt to understand the arguments of critics of international law such as Kennedy, it is useful to look at a few of the key international legal instruments governing the use of force and armed conflict. It’s difficult to know where to begin with public international law because the sources it draws from are so vast. However, a good understanding of the more commonly cited instruments, especially those regulating armed conflict and the use of force, is a good start.

The signing of the Geneva Convention 1864.
The first significant inter-state agreement aimed at restraining the undesirable effects of armed conflict was the Geneva Convention 1864. It was signed by 12 European states and gained its obligatory force from the consent of the states which accepted and applied it in the conduct of their military operations. The Convention was successful in effecting significant and rapid reforms to improve the treatment of wounded and sick armed forces in the field. The four treaties that make up the Geneva Conventions constitute a body of humanitarian law covering the treatment and protection of civilians and other non-combatants.


Further regulations concerning armed conflict, and also the use of force, are set out in the Hague Conventions 1899 and 1907. These were the first conventions to lay down formal principles concerning disarmament, war crimes and the restriction of certain methods of warfare, such as the launching of projectiles from balloons. They also set up the Permanent Court of Arbitration with the intention of settling disputes before states resorted to war. Although not all the great powers ratified or formally accepted these declarations, they are now generally accepted to comprise part of customary law.

This was effectively the codification, in one legally binding instrument, of international law as it related to armed conflict and the use of force. However, it was not exhaustive in the situations it covered, so it was established, in a principle called the Martens Clause, that where the law was silent, customary law would continue to govern. The Clause is contained in the preamble of the Hague Convention 1899. This principle has since been interpreted more widely in the application of modern international law, providing that an act which is not explicitly prohibited by a treaty is not automatically permitted, thus expanding the scope of international law in warfare.
The United Nations General Assembly.

The Charter of the United Nations 1945, which established the United Nations, is arguably the most important modern document of international law. The Charter is based around restricting the possibility of further war, with the first line stating that its purpose is “to save succeeding generations from the scourge of war, which twice in our lifetime has brought untold sorrow to mankind.” This has been described as a major shift towards multilateral institutions, whereby numerous states come together and work on various issues, prompted perhaps by the increasingly connected nature of our globalised world. David Kennedy makes a point in his book that due to these developments "states can no longer expect to be sovereign in the way that they previously were"; he then relates this to war, stating that "all governments have less focused power to decide for war and peace than they had a century ago."

The most significant provision in the UN Charter relating to modern warfare is the complete prohibition on the threat or use of force set out in Article 2(4). Chapter VII of the Charter states that force may only be used when authorised by the UN Security Council. This approach is intended to remove subjective judgments by powerful states about when it is acceptable to use force. There are various options that can be considered and tried before the UN Security Council will authorise the use of force. These include economic sanctions, arms embargoes, flight bans and the severance of diplomatic relations, among other things.


A Palestinian woman sits on the rubble of her home on July 26, 2014.
The notable exception to the prohibition on the use of force is the right to self-defence set out in Article 51 of the Charter. There are two requirements that have to be met in order to use force in self-defence: the action taken must be necessary and it must be proportionate to the threat. Customary law states that the requirement of necessity can only be met if the threat is imminent, but it is now also recognised that there is an anticipatory dimension, owing to the nature of modern weapons technology and the issue of modern terrorism. This is often the justification used by states to perform acts of violence, such as the US invasions of Afghanistan in 2001 and Iraq in 2003 and the 2008 attack on Gaza by Israel referred to in the readings.

Finally, we conclude with a discussion of the rules surrounding armed conflict, which is regulated by the body of law known as international humanitarian law. This is predominantly found in customary law and consists of a series of principles that are binding on states through Article 38(1) of the Statute of the International Court of Justice. There are several important principles that make up this body of law. One example is the principle of distinction, the idea that there must be a distinction between combatants and non-combatants. Article 48 of Protocol I Additional to the Geneva Conventions 1977 states that civilians must not be the object of attack. This is meant to provide an overarching prohibition on the targeting of civilians in warfare. Another example is the prohibition on the use of indiscriminate weapons. This follows from the first rule, requiring states never to use weapons that are incapable of distinguishing between civilian and military targets. This would perhaps rule out all use of nuclear weapons, except in an extreme circumstance of self-defence.

When we consider that the sources mentioned above are only a minute part of international law, we can perhaps understand why Kennedy describes international law as a “fragmented and unsystematic network of institutions… which are only loosely understood or coordinated by national governments”. The intention of international law, however, is to restrict the use of force and to constrain and regulate armed conflict. There are many reasons to be sceptical of the effectiveness of international law; however, there have been examples of superpowers being held to account for excessive force, such as the Nicaragua v United States of America case heard in the International Court of Justice. So it is arguable that, for all its faults, it is better to have a system of international law that attempts, and sometimes succeeds, in restricting states' use of force than not to have one at all. Hopefully this brief introduction to international law will help in assessing the value of the arguments of critics such as Kennedy.

To listen to a more in-depth discussion of David Kennedy's arguments, check out the video below. 


Monday 23 March 2015

The Postmodern Perspective



To understand Postmodernism, one must first understand its precursor, Modernism. Modernity as a theoretical framework sought objective knowledge from a position of neutrality. It firmly asserted the power of reason based on scientific knowledge, and sought universal truths. Morality, rather than being a natural trait, was viewed as something that needed to be imposed.

According to Zygmunt Bauman, modern ethical thought is characterised by two defining factors: universality and foundation. Universality refers to the bid to discover and implement rules that hold without exception, based on one set of laws. Laws are based on a reward/punishment model and work to discourage the individual from deviating from prescribed norms. Foundation refers to the coercive powers of the state that render obedience to these rules sensible and expected. For Modernism, adherence to laws is developed by coercively encouraging and shaping an ‘individual’s power of judgement’ so as to prompt them to willingly obey the order set out by the legislators. Relating this back to conflict, Postmodernists argue that Modernism created what Bauman refers to as an ‘aporetic situation’: an inevitable, perpetual cycle of violence arising from the anarchic tendency of people to rebel against rules experienced as oppression.


Postmodern critique of ethics

It was in reaction to Modernism’s assertion of the existence of a universal objective truth, and its alleged perpetual cycle of violence, that Postmodernism emerged. The prefix ‘post’ is not necessarily meant in a chronological sense, but rather refers to Postmodernism’s direct reaction to Modernism.

Bill Kynes provides a good overview of the key aspects of Postmodernism:


Ethics in itself is viewed by Postmodernists as a redundant, controlling and broken illusion derived from typically Modern constraints. Postmodernists deny the existence of a universal moral reality: objectivity is a myth, and there exists no place free of bias from which one can decide what is inherently right or wrong, good or bad. Postmodernism views human nature as consistently collectivist: individual identities are largely shaped by the social and linguistic groups to which they are exposed. Consequently, communal moral standards are decided by both coercion and consensus, and unchanging universal principles are non-existent.


Bauman identifies seven conditions that, from a Postmodern perspective, mark the moral condition. First, humans are morally ambivalent, neither essentially good nor essentially bad. Second, moral phenomena are inherently ‘non-rational’, negating the wisdom of creating a fixed rule guide. Third, as alluded to above, morality is incurably aporetic, meaning that it is essentially an oxymoron. Fourth, morality is not universalizable. Fifth, morality is bound to remain irrational. Sixth, morality is originally a product of the self, not of society. Finally, Postmodernism exposes the political nature of ethical codes.

Sending troops to defeat ISIL

The Postmodern perspective has been criticised by a number of scholars on both the left and the right of the political spectrum. The prominent scholar Noam Chomsky criticised the Postmodern perspective for being too ambiguous and complex, and for lacking empirical and analytical grounding.


In a bid to demystify Postmodernism, one needs to relate it back to a case study. The recent debates in New Zealand around sending troops to help train Iraqi forces to defeat ISIL provide a good example of how morality exists in plurality. Even John Key, who supports sending troops to Iraq, highlighted the complexity of his decision in his speech, which can be found here. There are strong moral arguments to be made on both sides. On the one hand, immense suffering is occurring under ISIL’s barbaric control, and it seems that morally we should be obliged to act quickly to prevent such atrocities from continuing. On the other hand, by sending in troops to train the Iraqi forces, we are not only putting our own troops in danger, we will also be responsible for training troops that will undoubtedly kill others, potentially resulting in civilian deaths. Furthermore, by engaging in the fight we increase the terror risk at home. Action and inaction are thus both moral and immoral. This is an example of morality as the aporetic oxymoron touched on earlier.

Derrida explains that ‘our responsibility to one other involves sacrificing our responsibility to all others’. According to Dan Bulley, this contradiction highlights the ‘irresolvable paradox of morality and responsibility’. Any ranking of moral responsibility is arbitrarily politicised. Humanitarian interventions and the like will always attract morally opposing views, and one must accept that decisions over courses of action are not made from some morally objective position; they are made via political compromise. As Maja Zehfuss states, “knowledge incurs our trust, and it nicely takes away the agonising: if we know what is right, we embark on this course of action in good conscience”. Truth, for Postmodernists, is often viewed as a Western-centric construct aimed at gaining and justifying power. Ethics is therefore inherently political. By embracing responsibility for actions, as opposed to deferring to guidelines based on a ‘moral code’ or legislation, there is the potential for less suffering, as taking responsibility can act as a deterrent. Furthermore, exposing the politics of the decision prevents ulterior motives from going unnoticed. As summed up by Chris Brown, such an awareness can cause us to “act modestly, to be aware of our limitations and to be suspicious of grand narratives of salvation which pretend that there are no tragic choices to be made”.



Postmodernism in general is often very sceptical of invasive foreign policy. Stephen R. C. Hicks, for example, questioned whether the United States has misused claims of ‘knowledge’ and ‘truth’ by using the power gained from these claims to impose its own group ideology and capitalist system onto other states. An audio recording of Hicks explaining some of these claims can be found here. This is echoed by Jean-François Lyotard, who criticises Western civilisation for using ‘reason, truth and reality’ to exert dominance and oppression over less powerful states. Western nations have the power to assert their moral legitimacy to conduct expressions of force. Through their monopoly on ‘knowledge’, Western nations can deny the moral legitimacy of the global south, which is often associated with irrational behaviour, backward institutions and an inability to conduct moral humanitarian interventions. Knowledge is power, and according to the Postmodern critique, this power is often used to control what is, and who is able to conduct, a ‘just’ war. Postmodernism aims to deconstruct concepts such as reason, truth and knowledge so as to gain a more in-depth and honest version of the situation. It is emancipatory in the sense that its primary aim is to expose the source of moral power concealed in modern political thought.


By Kamala Busch-Marsden

Thursday 19 March 2015

Judith Butler and the politics of grief as a path to non-violence


Judith Butler on living precariously

A feature common to all humans is our vulnerability to one another.  The self has boundaries, but these boundaries are permeable and allow us to form an understanding both of the world around us and of ourselves through the interaction of the two.  This means that we each have a fundamental dependency on others, making us susceptible to them both socially and physically.  Each of us is constituted in part by our vulnerability and social vulnerability is demonstrated in how we experience the loss of others as grief.  Our senses of self are manufactured by our interactions with others, and when we lose someone who once was a reference point for our own selves, so too do we lose a piece of our identity.  We mourn both.  

For Butler, when a life is lost and that loss is accompanied by grief, grieving functions as both an acknowledgement of death and a recognition of a life lived.  On the other hand, the absence of grief and mourning implies the absence of a wider value afforded to the life of that person – a disregard not only of their death but also a negation of their life.  For this reason, grief and mourning represent a resistance to loss and therefore an important challenge to violence.  The absence of grief, however, implies a permissiveness toward future such losses and ultimately dehumanizes the individuals we place in the frame of the ungrievable.

Grievability and ungrievability make us less and more vulnerable to violence respectively on a sliding scale of the precariousness of our lives.  Such precariousness is not distributed equally, but rather is applied along the lines of race, class, gender, faith, or geography and thus, according to Butler, the conditions of life are distributed unequally.  This differentiated precarity is perpetuated by the frames we use to define and understand others relative to ourselves and these frames mould what we can see.  Our experiences constitute these frames just as we are constituted by them.

The importance Butler places on the role of grief as the manifestation of a life vested with meaning raises questions about our ability to extend our frame of the grievable and the possible advantages that it could confer.  She asks:
  1. Is there something to be gained from grieving and not endeavouring to seek resolution through violence?
  2. Is there something to be gained in the political domain by maintaining grief as part of the framework within which we think about our international ties?
  3. Is it destructive to stay with a sense of loss or are we instead returned to a sense of human vulnerability and our collective responsibility for the physical lives of one another?
  4. From where might a principle emerge by which we vow to protect others from the kinds of violence we have suffered if not from a common human vulnerability?

9/11 and the War on Terror: How to Grieve, Who to Grieve

Butler offers her answer to the four questions above when she says, “To foreclose on vulnerability, to banish it, to make ourselves secure at the expense of every other human consideration is to eradicate one of the most important resources from which we must take our bearings and find our way.”  To this end, Butler criticises the Bush administration for rushing the processes of grief and mourning in the public sphere after 9/11, saying, “Mindfulness of this vulnerability can become the basis of claims for nonmilitary political solutions, just as denial of the vulnerability through a fantasy of mastery...can fuel the instruments of war.”

President Bush’s announcement on September 21, 2001, of the end of the period of mourning and the start of the time for action dismissed the possibility of becoming better acquainted with the inequitable geopolitical distribution of the sort of corporeal vulnerability that America had just experienced.  Neither the President nor his administration considered the greater corporeal vulnerability of the citizens of the nations they were about to wage war on.  When we fail to grieve, when we fear it, we can instead turn it to rage and use it only to perpetuate a cycle of grieving.


In the course of the War on Terror, many lives have been lost, but Butler points out that public mourning is the key to understanding the limits of empathy and therefore the limits of action.  To demonstrate this, she highlights two examples.  The first is Daniel Pearl, an American who worked for the Wall Street Journal and was murdered in 2002 by Pakistani extremists while on assignment. His murder was gruesome and rightly condemned in a public outpouring of grief by private American citizens and Pearl’s colleagues in the media.  In contrast, a Palestinian citizen of the US submitted two obituaries to the San Francisco Chronicle, both for Palestinian families killed by Israeli troops. He was asked to make revisions and resubmit, but upon resubmission the obituaries were rejected for fear of offending someone.

What makes Daniel Pearl more grievable than Palestinian families? What makes us strive to prevent more deaths like his, but fail to seek new protections for Palestinian families?  Butler says, “We have to consider how the norm governing who will be a grievable human is circumscribed and produced in these acts of permissible and celebrated grieving, how they sometimes operate in tandem with a prohibition on public grieving of others’ lives, and how this differential allocation of grief serves the de-realizing aims of military violence.”


Homeland Precarity

Such ready acceptance of violence toward a population can be seen not only in the intersection of the familiar with the unfamiliar, but also within a single nation. The series of protests and civil disorder in Ferguson throughout 2014, in response to the fatal shooting of an unarmed black teen, Michael Brown, challenged American law enforcement and its relationship with African-Americans.  Judith Butler described the mass protests as “war zones of the mind that play out on the street” and emphasized that each time an officer or vigilante is exonerated for the killing of an unarmed black person, it normalizes violence toward black Americans, making their lives precarious.

Given the link that Butler draws between grievability and a life given meaning, the vigils and public mourning in Ferguson and by way of the Black Lives Matter movement become acts of protest.  They demonstrate the grievability of those lives that law enforcement and the judicial system had failed to protect, and they challenge the norms that make black people far more vulnerable than white people in America.

 

Grieving and Non-Violence
 
Extending the bounds of our ability to mourn beyond the familiar or recommended can give us insight into the lives of others and our shared vulnerability.  Grief and grieving can mean reflecting on social narratives that make certain lives vulnerable while protecting others.  This is integral to the task of rewriting such narratives and most certainly a step toward a less violent future: it is much more difficult to harm someone you will mourn.  Through the radical equality of the grievable, we may begin to establish new concepts of self that are sufficiently unbounded so as to stay our hands when we may have otherwise done violence. Grief is therefore transformative, though the conclusion is unknowable at the outset.  That does not mean we shouldn’t follow its course.  As Butler says, “We’re undone by each other.  And if we’re not, we’re missing something”.


Emily

Friday 13 March 2015

The origins of Just War Theory


Ethical questions of war can be traced back to classical antiquity and across the histories of all the main civilisations. Just War Theory is a set of concepts and values that have been refined throughout its history. There have been some key figures throughout the period whose contributions to the theory have shaped what it has become today.

There are two clearly defined elements to the theory:
  1. Jus ad bellum – When is it just to go to war? And,
  2. Jus in bello – When we are in war, how and whom should we fight?

There is also an emerging third element of the theory: Jus post bellum – what is ethical conduct in the aftermath of war?

But for a very short history lesson, Just War Theory can be traced as far back as the Greek philosophers Plato and his student Aristotle (4th century BC).

Aristotle is credited with coining the term ‘Just war’ in contrast to the earlier idea of a ‘holy war’ as mandated by God.

Just War Theory is today identified as a secular concept, but its values are rooted in Greek and Roman thought as well as in Christian values.

And we will see that those Christian values played a significant role in developing the theory in the 5th Century AD when St Augustine was asking himself and his fellow Christians whether they could engage in war without sin. St Augustine is often credited with being the founder of the Theory, but in fact there was much that came before him.

It is from Aristotle that we first see reference to what is now considered by Just War theorists the most obvious just cause for war: self-defense. He thought it morally justified to go to war to prevent one’s community from being attacked and enslaved by another. But we may be uncomfortable today with his other justifications for war.

And we can see how over time different concepts for Just War have been considered, but rejected. Aristotle thought it acceptable to go to war for empire expansion provided:
  1. The empire would benefit everyone, including the conquered; and
  2. The empire would not be so large and rich that it would attract attacks and therefore create more wars.
He also thought it morally acceptable to go to war to collect slaves, as long as the slaves were naturally submissive.


A little later came the Romans, and amidst their shattering success on the battlefield as their empire expanded, Roman emperors and senators developed some of the deepest reflections on the ethics of war.

Roman lawyer Cicero (106-43 BCE) built further on Aristotle’s early development of the theory, but rejected the need for more slaves as a just cause for war. He also added rules that are familiar concepts of the theory today: proper authority, public declaration and war as a last resort.

It was St Augustine, 500 years later, wrestling with the dilemma of how to be both a good Christian and a public official of the warring Roman Empire, who contributed the thinking around the right intentions for going to war.

Shouldn’t he, as a Christian, always take the pacifist position that one must always show love and non-violence? But then, isn’t protecting his own people from aggression also a way of showing love for them?

Augustine reconciled this dilemma by incorporating ‘Right Intention’ into Just War Theory. He insisted the intention of war must be love for, and the desire to protect, the endangered innocents, without any joy in the bloodlust itself. On this view, good men can undertake wars in obedience to God as long as their intention is the love and protection of their own.

Interestingly, Augustine did not reject the idea that a holy war was just cause for war, and in doing so, according to Brian Orend, allowed for a blurring of the lines on this matter that would last for another 1,000 years, within which time the Pope-ordered Crusades would take place.

Within those 1,000 years, the Roman Empire collapsed in the West, and the Dark Ages took hold, with the Catholic Church the only institution left standing. This was a period of messy, private, feudal wars, with the Crusades coming towards the end of the period.

Despite the Theory having been in some form of existence for roughly 1,500 years at this point, this is considered a period of regression for Jus ad bellum – the just cause of war – due largely to the muddling of holy war with just war.

The Theory did still have some notable bright moments. In this time the Church led peace movements to protect innocent people, particularly women and children caught in the crossfire. Coined the ‘Peace of God’, this concept would later form what is widely considered the most important element of jus in bello – justice in war – namely ‘non-combatant immunity’.

What’s the difference between a Holy war and a just war?

The difference between the two, in short: a holy war is a war considered to be approved by God, or by a direct messenger of God such as the Pope; a just war, in contrast, is not sacred, but moral.

Italian theologian Thomas Aquinas, writing around the time of the last Crusade, questioned whether a holy war was just, saying that defensive wars protecting Christians from death or persecution at the hands of non-Christians might be permissible, but aggressive wars designed to coerce non-believers into Christianity were not.




It would still take another two centuries before clear, definitive proclamations against the use of holy war as a justification for war began to appear.

In the 16th Century, the theory was being used effectively not only to test the legitimacy of the decision to go to war, but also to criticize the powers that authorized the war, urging them to stop the fight. Two legal figures, one Spanish and one Dutch, were both using Just War Theory to criticise their own nations’ justification and conduct in the wars they waged.

Spanish jurist Vitoria, living in the time of the Spanish Conquest, heavily criticized the conduct of the Conquistadors, using the theory to conclude that wars of conquest were largely unjust. Spain was motivated neither by love – the right intention Augustine said was required – nor by any self-defense or protection, but was driven by greed and love of power.

Vitoria was the first to say clearly that non-Christian communities have the right not to be attacked and enslaved. He rejected the concept of holy wars and insisted on the secularism of the Just War Theory.

His work was responsible for extending what had until then been a set of guidelines for Christian princes on how to conduct war in Christian Europe into universal principles – natural law applicable to anyone, anywhere, at any time.

One of his most important contributions was his stress on the quality and objectivity of the evidence needed to support the decision to go to war.

We can fast-forward here 500 years and apply this concept to President George W. Bush’s 2003 State of the Union address, which ‘justified’ the impending invasion of Iraq in just ’16 words’:

"The British government has learned that Saddam Hussein recently sought significant quantities of uranium from Africa”

Evidence that, as we now know, was vehemently rejected:



Dutch theologian and lawyer Hugo Grotius was also deeply distressed by the behaviour of his fellow countrymen in their colonization, particularly through the Caribbean Indies, as well as by the Wars of Religion between Catholics and Protestants across Europe. He sought to strongly reinforce Vitoria’s rejection of holy wars, and the combination sealed the debate on the issue within the Just War Tradition.

Grotius’s work is also credited as one of the first formulations of the ‘laws of armed conflict’. Within 200 years of Grotius’s work, the moral principles of war were being translated into specific legal codes, and international treaties and laws regulating armed conflict culminated in the Hague Conventions of 1899 and 1907.

The question for today is whether the theory is still relevant.

Jeff McMahan argues that it is no longer relevant, due to the changing character of war: wars today are not always between two clear sides on the battlefield, and the state-centric framework of the traditional theory can no longer apply.

He is part of a new school of revisionist theorists challenging the traditional theory, asserting that the principles of jus ad bellum and jus in bello cannot be so clearly separated and that the principles of jus ad bellum apply to the individual soldier, not just the state.

Michael Walzer, however, is the 20th century’s most influential defender of the Theory.

Walzer on the origins of Just War Theory and the development of jus post bellum:


 Janna