Tuesday, January 26, 2010

What makes a country?

If you cast your mind back to the latter half of the 1990s, you might remember Britain being embroiled in a foreign war that was rather better received than any militaristic ventures this side of the year 2000. Then, the Kosovo Liberation Army (KLA) sought to create an autonomous province, free from Serbian control. Slobodan Milosevic and the Serbian military were of a different mind, and ultimately refused to leave the 'occupied' territory (Kosovo being within Serbia). Then NATO got involved (without a UNSC resolution, interestingly, but that's not what I'm here to discuss), and the Serbian military was driven from the province, finally leading to the arrest of Slobodan Milosevic on war crimes charges.

Pursuant to this, the UN passed Resolution 1244, placing Kosovo under UN administration. Importantly, Kosovo was granted autonomy within the Federal Republic of Yugoslavia (that is to say, Serbia, as Serbia is the successor state to the FRY). The matter, of course, was far from settled - many ethnic Serbs live in Kosovo, and you will not be too surprised to hear that the Kosovans, ethnic Albanians ruthlessly persecuted by the Serbs during the 1990s, have turned on them. Tensions have remained high.

In 2006 negotiations began within the UNSC to try and solve the problem. It was widely anticipated that Kosovan independence would be the ultimate aim, but Russia, holding a veto and permanent seat on the Security Council, held that it would not support any resolution that was not agreeable to both Belgrade and Pristina. Unsurprisingly, the Serbs do not want Kosovo to become independent. Russia's argument was that to allow such a move would undermine the principle of state sovereignty.

Transitional moves, giving the Kosovans an independent legislative assembly, led to the first elections in 2007. This culminated, ultimately, in Kosovo declaring independence in February 2008, and this was recognised by 65 states, including France, the UK and the US. Kosovo is a member of the World Bank and IMF, as the 'Republic of Kosovo'.

So, is Kosovo independent? Is it a state? It certainly satisfies two of the Montevideo criteria, namely that it has a defined territory and borders, and a defined population. It has a government, and can seemingly enter into relations with other states - but is this 'legal'? There is a Serbian government department with responsibility for Kosovo, and only sovereign states can formally enter into relations with other states - surely, legally, Serbia retains this power?

This very question, of whether the unilateral secession of a province from a sovereign state, becoming a new sovereign, self-ruling state, can be legal, was referred to the International Court of Justice by the UN, following Serbia's proposal that it should be (it was a narrow vote, 77 states voting to refer the question against 74 opposing). All UN states were invited to submit their positions to the court, and public hearings were held on the 1st of December 2009. Serbia argued that independence violated international law, citing Resolution 1244 as its key basis (that Kosovo was part of Serbia, as successor state to the FRY). It was keen to play on international fears of secession in other states, and declared that the move would set a dangerous precedent.

Kosovo argued that Serbia had forfeited its right to the province due to its long-standing abuse of Kosovar human rights, and pointed out that Serbia had never been serious about granting Kosovans independence - indeed, it had re-drafted its constitution specifically to include Kosovo as part of its sovereign territory after Resolution 1244 had been passed.

A decision on the matter is due at some point in 2010. It is unclear which way the court will go. There are arguments that such secession is not prohibited by international law, but the lack of an explicit rule makes this a rather soft argument. By contrast, Russia, which opposes the move, recognised Abkhazia and South Ossetia as independent of Georgia during the recent fighting there, citing Kosovo as a precedent (amazing, since it rejects Kosovo's independence as illegal, but never mind). The UK, pleasingly in my view, made the apt statement that 'Courts do not order estranged spouses to continue in a broken marriage', implying that it would be manifestly wrong to make Kosovo surrender its tentative freedom and re-engage with Serbia once more.

I feel that the testimony of significant states such as the US and China is most instructive - the US argues that Kosovo should be viewed as a special case, but that it should not provide a precedent for other states to exploit. China, like Russia, argues that the move violates Resolution 1244 and thus cannot be legal. I feel that the ICJ will ultimately decide along one of those two lines. It would be a sad and retrograde step for Kosovo to be denied its freedom, but then, if it is free, what can other nations do to stop their own breakups, should regions wish to secede? Might we begin to see fresh, bolder moves for Scottish and Welsh independence? An independent Kurdistan (and more bloodshed in Iraq)? It would certainly be of great interest to the Quebecois. The title posits the question of what a country is, and the simple answer is that we cannot say with certainty, and the outcome of this case will probably not really settle the question. The ICJ's decision, however, will be of great interest and may have a huge impact, especially if the court's Opinion is in Kosovo's favour.

Saturday, January 23, 2010

Now for Something Completely Different

Today's installment will be a little at odds with the rest so far. The following is probably not for the eyes of children, either, so if you have any, tell them to go to bed, or something, and we'll begin.

There's no point in skirting the issue now, I suppose. Today I'm going to talk to you about foot fetishes. Now, before you start, I'd like to state quite clearly that such a fetish is not one of my personal sexual peccadilloes (I'm not too likely to go into detail about those, so don't reach for the sick bag just yet). Further, this is largely apropos of nothing. It's just always struck me as a decidedly odd thing to be turned on by. Maybe that's simply because my own feet look like they have been attacked with a claw hammer, but I digress.

It's interesting to note that such a sexual taste is more common than you might think. In this study carried out at the University of Bologna (http://www.nature.com/ijir/journal/v19/n4/abs/3901547a.html) it was found (by searching through roughly 400 internet fetish groups - can you imagine?) that feet were, by a reasonably clear distance, the most commonly fetishised body part. When this was extended to items such as footwear, the margin increased.

Having established that it is perhaps more than a small minority with such an interest, we still haven't got to the heart of why people would sexualise feet, so let's examine some ideas as to why this might be so.

In researching this, I've read a few times that 'the foot's shape mimics the curve of a woman's body'. Firstly, I'm not sure it does, really. I mean, looking at the sole of someone's foot, maybe, but generally I don't buy it. In any event, it seems that straight women can be foot fetishists too, which rather throws a spanner in the works.

Some people also seem to suggest it has something to do with reflexology. Reflexology is a load of unscientific nonsense, so we'll scrap that too.

Neurology in the wider sense may well have a part to play, and I think it is in this that we find what I deem to be the most plausible explanation for what is at first so counter-intuitive (to me) a desire. The somatosensory cortex is the part of your brain which receives nervous inputs from all around your body, and lets you know, in very basic terms, when a part of you has been touched, or is hot, and so on. It is mapped out in a way that may at first seem a little strange (I'd click here http://www.alinenewton.com/images/homunculus.jpg and take a look at the map before reading further).

You will note, if you followed the link, that the areas of the brain receiving input from the genitalia and the toes sit right next to each other on the cortex. The theory would run that, due to this proximity, some of us may simply be wired (quite literally) differently, and that the neural inputs from the feet and genitalia become intermingled or cross into the 'wrong' areas. As such the sight of feet becomes arousing, as it gains a mental equivalence with genitalia, the sight of which is arousing (by and large!).

Of course, with something as complex as sexuality, this is not going to be the whole story. I'm not going to go into Freudian psychosexuality, as it's too nebulous and complex for me to really get into, but it's probably fair to say that childhood experiences are likely to shape future sexual tastes, and this developmental aspect should not be ignored. As ever, though, it's hard to pin down exactly what experiences during which period could trigger such desires later in life.

From googling around, it seems that such fetishes were first mentioned in literature roughly 800 years ago, and examining trends in the prevalence of such mentions throughout history can be instructive. It has been noted that interest in feet as sexual objects has increased during peak epidemics of sexual disease over the centuries, specifically the syphilis epidemics of the 16th and 19th centuries. I am extremely suspicious of this finding - whilst one could imagine that the rise in popularity of feet as sexual objects during the AIDS pandemic of modern times makes sense (feet being 'safe' from a disease standpoint), the same cannot be said of syphilis. A known syphilitic symptom is the presence of sores and rashes on the feet, so the argument that feet became 'safe' during these periods doesn't stack up - it's an interesting idea, however.

Even feminism has been implicated, the principle being that data seem to suggest that during periods of increased female emancipation interest in such fetishes rises. Perhaps it is possible that fetishising of less traditional parts of the female anatomy has increased when women have subverted and advanced beyond their traditional roles, but I can't really see any logical reason why this would be so.

Overall, it's hard to pin anything down definitively. I like the neurological argument because it appeals to the logical side of me. Of course, as stated previously, sexuality rarely makes that much sense. But we shouldn't demonise those who have tastes that we may see as a little beyond the pale - my overriding conclusion is that, due to a myriad of factors, to a greater or lesser extent, such urges and desires arise in a manner that is not under one's conscious control.

Thursday, January 21, 2010

A government of laws, not of men...

I'm sure you've heard of Barack Obama. I'm equally sure that you've probably read a lot about him, and thus may be about to click your back button to escape. I am going to talk about him a little, but just as a starting point, so don't be too discouraged just yet.

Things haven't been going so well for the President as his first year in office ticks away. His poll ratings are low, there's great opposition to his healthcare plans, and just this week the Democrats lost a Massachusetts Senate seat (Massachusetts having returned Democrats to the Senate for more than 50 years).

I'm not going to go into the rights and wrongs of Obama's presidency (though I'm still a bit miffed about the Nobel prize he picked up. I guess Henry Kissinger got a peace prize for advocating the carpet bombing of Cambodia, so it's not the worst nomination). It seems he's taking a lot of flak for not pushing through the great agenda of change he promised, and that was instrumental in his sweeping electoral victory. Part of the blame does lie at his door, but we cannot forget the Republicans' amazing ability to delay and frustrate any legislative efforts by the Democratic majority in both houses of Congress (consider their tabling of no fewer than 600 amendments to the healthcare bill, of which 160 were accepted as concessions by the Democrats, and still it remains unpassed).

However, the real obstacle to Obama's reform agenda is a simple and brilliant (the second amendment right to bear arms aside) bundle of paper - the US Constitution.

It's a very simple set-up. A President is elected by gaining a majority of electoral college votes, which is done by winning individual States in a popular election, each State carrying a set number of votes (note that some States carry dramatically more electoral college votes than others - California, for example, is a key state to win, whereas Alaska, far larger geographically, carries few votes). The President is the head of the executive, and so appoints deputies as a 'cabinet'. This executive is responsible for day-to-day governance.
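For the more numerically minded, here is a quick back-of-the-envelope illustration of that arithmetic in Python. The electoral vote counts are approximate figures for the 2008 election, for a handful of states only, and I've assumed a simple winner-takes-all rule in every state (Maine and Nebraska actually split theirs) - so treat this as a sketch of the principle rather than anything definitive.

# A toy tally of electoral college votes under a winner-takes-all assumption.
# Vote counts are approximate 2008 figures for a few illustrative states.

ELECTORAL_VOTES = {
    "California": 55,  # populous, so many votes
    "Texas": 34,
    "New York": 31,
    "Alaska": 3,       # geographically vast, but few votes
    "Wyoming": 3,
}
MAJORITY = 270  # out of 538 electors in total

def tally(states_won):
    """Sum the electoral votes of the states a candidate has carried."""
    return sum(ELECTORAL_VOTES[state] for state in states_won)

votes = tally(["California", "New York", "Alaska"])
print(f"{votes} electoral votes - majority reached: {votes >= MAJORITY}")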

The key to government and its impact on people really lies within the legislature and judiciary, and this is where the Constitution really comes alive. Local voting in individual States every two years elects Representatives to the lower house. Senators, who serve six-year terms, are elected to the upper house. This ensures a very democratic representation of the people within the legislature. In addition, States may elect Senators of one party, but often Representatives of another, ensuring a great mix and parity within the legislature. The two-house system, with legislation needing majorities in both houses before Presidential approval, is the crux of it all. No law can pass through this process without being tempered and altered, without a broad consensus and agreement. Further to this, all States have their own legislatures to which decision-making powers have been devolved, with elected Mayors and Governors responsible for local executive function.

This process was intended by the original signatories to the Constitution to ensure that radical legislation was hard to pass. It is a system designed with compromise in mind. Indeed, having struggled to escape the taxation and rule of the British monarchy, the framers can be seen as setting it up in direct opposition to that rule. It is a system built to ensure that no one man can ever wield too much power, and that those who do possess power do so because the people will it.

This quote sums it up pretty well, I think -

'As our president bears no resemblance to a king so we shall see the Senate has no similitude to nobles. First, not being hereditary, their collective knowledge, wisdom, and virtue are not precarious. For by these qualities alone are they to obtain their offices, and they will have none of the peculiar qualities and vices of those men who possess power merely because their father held it before them.'
- Tench Coxe, An American Citizen, No. 2, September 28, 1787

The judiciary is slightly more problematic. The Constitution establishes the Supreme Court as the highest court of the land, the ultimate arbiter of whether any acts are constitutional. As such, in a way it holds a power above the other two branches, the legislature and the executive.

However, again, this is neatly tempered by the provisions allowing Congress to establish other courts beneath it, and it is the President who appoints judges to the bench. Ultimately, once appointed, Supreme Court Justices have a great safety of tenure, and although their appointments are often political in nature, they may choose whatever path they wish (an excellent example was David Souter, appointed by the Republicans, who would go on to be about as liberal a Judge as any, much to the GOP's consternation).

Of course, several Justices sit on the Supreme Court Bench, and majorities make decisions. Again, we see that the amount of power available is spread between individuals. But vitally, for a functioning judiciary, there is a great deal of independence from the executive and legislature.

It is a wonderful, brilliantly thought-out arrangement. When one looks at the 'elective dictatorship' that exists in the UK, where the first-past-the-post system, an impotent House of Lords, and the whips give parties crushing majorities, with the executive dominating the legislature, it is hard not to be envious of the American system. Thankfully, the Constitutional Reform Act 2005 has moved the highest court of appeal to a new Supreme Court outside of the Lords, and judges are now appointed by an independent commission - thus the British can at least claim a properly independent judiciary!

Indeed, whilst there may be disaffection with Obama, I feel that Americans can still have great reason to be proud. Their Constitution, nigh unchanged for over two centuries, is still doing its job. To negotiate it, one must cajole, persuade, and bargain. No one man can ever dominate, in the 'government of laws, not of men'. In concluding, I say simply, do not be frustrated with the man. Be honoured to live under the system. It may not be perfect, but it comes closer than anything else before or since. It's doing its job, exactly as it was meant to.

Wednesday, January 20, 2010

Futurology

Those of you born in or before the 80s may just remember a cartoon called The Jetsons. I say remember - I couldn't really tell you anything about it, other than a slight twinge of pre-pubescent tumescence when Mrs. Jetson was on screen. It was, in essence, a lazy rehash of The Flintstones, a programme with no discernible plot progression or well-defined characters. The interest came from the situation in which the characters were placed. The 'charm' of The Jetsons, such as it was, was how it portrayed an imaginary future, replete with jetcars, pills for meals (if memory serves) and robots. It's amusing that Mrs. Jetson is in essence the archetypal 50s housewife, staying at home, doing the cooking and watching over the household, given the thousands of years that would have had to elapse for jetcars and the like to have arisen. But then who could've predicted women would start having careers? I think we can all agree it was ludicrous enough giving them the vote, but the notion that they might go to work! Bewildering...

Anyway, I bring this up as it's always hilarious to look back at the directions we once imagined the future would take. Lord Kelvin's prediction in 1897 that 'radio has no future' is a real favourite of mine. I suppose we should all be thankful he was wrong, given the technology's later development as radar, the key to victory in the Battle of Britain. Amusingly, he also predicted two years earlier that heavier-than-air flying machines were an impossibility, so had he been an accurate predictor we'd not even have needed radar. Sir Clive Sinclair's infatuation with the idea that his C5 personal electric vehicle was the transport of the future cost him much of his business, and arguably helped clear the way for IBM and Microsoft to dominate the computer industry. The greatest quote in this regard has to go to Charles H. Duell, Commissioner of the U.S. Office of Patents, who in 1899, perhaps hoping for an easy life, remarked that 'Everything that can be invented has been invented.'

It's not to say we don't get things wrong in the modern world too - even short-term predictions prove elusive. During my lifetime, the millennium bug must of course rate as the clearest example of how utterly wrong a species can be. Recent fears over bird flu (which was surely a nonsense from the start) and swine flu (which I caught, and which was nasty, in fairness) show that even when the best and brightest are at work, any prediction is fraught with difficulty.

We may, however, be getting better. Meteorologists, for example, are steadily gaining accuracy as time goes by. But what's really interesting is the growing encroachment of ideas taken from mathematics and physics into the field of social sciences. In essence, these theories postulate that there are often times when we, as societal groups, act en masse, as if we were a group of atoms within a substance. Of course, this is not too hard to believe when one considers that we are not as free-willed as we'd like to believe we are. There are so many written and unwritten rules that direct and dictate our conduct, and the acts of those around us impact massively on the choices we make. This is just as well, as without such limitations, faced with endless choices, life would be very difficult, nigh on impossible even.

Physicists often see similar things. In iron, for example, each atom is like a tiny compass that can orientate itself in a magnetic field. Individually, each atom is 'free' to choose its own orientation; but because of the magnetic forces between them, the atoms tend to align themselves in the same direction. Brazilian physicists have used a model like this to explain why the voting statistics of the 1998 Brazilian elections do not sit easily with a notion of rampant free will. It seems that the influence of individuals within the group, all trying to convert others to their political points of view, causes subgroups and communities to 'line up' and vote in blocs, like lumps of iron - further weight to the Churchillian view that democracy is the worst form of government (except all the others we've tried).
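If you fancy seeing the idea in action, here is a toy sketch in Python of that sort of alignment dynamic: a ring of 'voters' each holding opinion +1 or -1, with a randomly chosen voter adopting the majority view of its two neighbours at each step. To be clear, this is my own minimal caricature of the general idea, not the Brazilian group's actual election model.

# A minimal 'align with your neighbours' model: voters sit on a ring, each
# holding opinion +1 or -1. At every step a random voter adopts the majority
# view of its two neighbours, keeping its own opinion on a tie.

import random

def simulate(n_voters=100, n_steps=20000, seed=1):
    random.seed(seed)
    voters = [random.choice([-1, 1]) for _ in range(n_voters)]
    for _ in range(n_steps):
        i = random.randrange(n_voters)
        neighbours = voters[(i - 1) % n_voters] + voters[(i + 1) % n_voters]
        if neighbours != 0:  # a clear local majority exists
            voters[i] = 1 if neighbours > 0 else -1
    return voters

final = simulate()
print("share holding opinion +1:", final.count(1) / len(final))
# Starting from a roughly 50/50 mix, the population coarsens into large
# agreeing blocs, much like the voting blocs described above.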

Such models may also help explain how market traders and investors act, with waves of mass buying and selling, bull markets full of confidence suddenly flipping to bear markets of disaster as a herd mentality takes over through interactions between agents.

Of course, it is a long stretch to say this will ever be an exact science - indeed, if one could accurately predict the future, it would suggest that our traditional understandings of time are wildly erroneous (since time must, by nature, move from the known to the unknown). If these models do prove more accurate, they may give us an insight into how we can shape societies, hopefully for the better. As ever, though, I'm not sure we will ever arrive at a decision en masse as to how that society should work and what it should be.

In any event, I hope we won't end up looking as silly as Mr. Duell does now. I'd wager that the era of the bizarre prediction won't have ended just yet though. I, for one, am not too upset by that.

Tuesday, January 19, 2010

The Vulture Culture

We all live in an age of financial instability, and during recent times it has been all too easy for societies to look inwards and try to cope with and resolve concerns on a national level. Being happily insulated from such fears and insecurities as unemployment, I might find it easy to turn a blind eye to such matters. However, I am not untouched by the plight of those who have suffered from the vicissitudes and excesses of capitalism in recent times in Britain. I do feel, though, that to a greater extent than is perhaps normal, certain international issues are being brushed under the carpet somewhat.

In my current post, I'm involved in a case relating to a 'vulture fund' – naturally I cannot discuss any further specific detail, but I feel that it is certainly worth bringing the wider scope of such funds to greater attention, and especially to consider their impacts upon the developing world.

So, what exactly is a vulture fund? In simple terms, vulture funds are created by corporations (typically hedge funds or private equity houses) in order to purchase debts - generally from creditors who are unable or unwilling to pursue the debtor themselves. These debts are purchased at a low price. The next step is for the new holders of the debt (the vultures) to sue for repayment in court. Having obtained a court order, the fund can pursue the debtor for the debt, plus interest on top, often amounting to several times the original sum.
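To make the arithmetic concrete, here is a purely hypothetical illustration in Python - every figure below is invented and bears no relation to any particular case - of how a debt bought at a deep discount, plus years of compound interest, turns into a judgment worth several times the original sum.

# Entirely hypothetical figures, purely to illustrate the mechanism.
face_value = 15_000_000      # the original sovereign debt, in dollars
purchase_price = 3_000_000   # what the fund pays the original creditor
interest_rate = 0.08         # assumed contractual/penalty interest rate
years_unpaid = 10

claim = face_value * (1 + interest_rate) ** years_unpaid
print(f"Judgment sought:            ${claim:,.0f}")
print(f"Multiple of original debt:  {claim / face_value:.1f}x")
print(f"Multiple of purchase price: {claim / purchase_price:.1f}x")
# About $32m here - roughly double the original debt, and more than ten
# times what the fund paid for it.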

Now, some of you might have reverted into Gordon Gekko mode for a moment ('Greed is good. Greed clarifies.'), and indeed, on the face of it, why shouldn't people have to repay their debts?

The problem comes when these vulture funds gain judgement in courts over sovereign debt, that is to say, judgement against nations themselves. In Africa, numerous countries (principally the Democratic Republic of Congo, Congo Brazzaville and Liberia) have fallen prey to vulture funds and owe dizzying sums of money, far above the sums originally owed.

The real issue is the origin of these debts. Take, for example, the case of Zambia (http://news.bbc.co.uk/1/hi/6365433.stm), whose government was lent money by the Romanian state under Ceausescu in 1979. The money was lent to allow the Zambians to purchase tractors and other agricultural equipment from the Romanians, but the Zambians struggled to keep up with the repayments. Despite protracted negotiations, and due to burgeoning hardships in Romania, the Romanian government felt there was no option but to sell the debt to Donegal International, a firm heavily involved in the titular vulture culture.

We can see that the Romanian state took a massive loss from this, and following a court ruling the Zambians were held to owe roughly double the original debt to Donegal. Ultimately, both Zambia and Romania suffered financially from the arrangement, with Donegal International, a US-based firm, left to reap the rewards. A similar, more recent case involves the DRC (http://www.guardian.co.uk/world/2009/aug/09/congo), where a debt accrued by the vicious dictator Mobutu Sese Seko is being pursued by FG Hemisphere, another vulture fund. As ever, it is the ordinary citizenry of this war-torn land who are left to pay the price of a debt they never requested.

This debt hits the third world hard. Often the original debts arose from the dealings of corrupt dictators, with the borrowed funds disappearing or being embezzled. African nations generally don't have the best credit ratings, and thus have to pay high levels of interest in any event. I'm all for dropping the debt, but the key is to remove the underlying issues that will simply lead to further debt accumulation. These funds reap huge profits but leave those who have to pay up facing sums of money out of all proportion to what they originally owed.

I don't propose to have a solution to African corruption, nor do I have access to Standard & Poor's computer system to give countries like Zambia an AAA credit rating. But I feel that the issue of vulture funds is something we can act upon. There are moves on either side of the Atlantic, in both Westminster and Washington, to introduce legislation banning these funds. We must encourage such moves. These funds are immoral, and hold back developing nations, miring them in punishing debt they cannot escape, all in pursuit of profit for shareholders. Something has to give. I hope that before too long courts the world over will be prevented from giving judgement against the citizens of sovereign states, and that we can start to give a bit more hope to the many, rather than cynically enriching the few.

If you'd like to read more, have a look here http://www.jubileedebtcampaign.org.uk/

Monday, January 18, 2010

All that glisters...

I caught this article - http://www.guardian.co.uk/science/2009/nov/17/heart-disease-ancient-egyptians - in today's Guardian. To summarise, it is based on body scans of twenty mummified corpses, approximately 3,500 years old. Of these, sixteen had identifiable circulatory systems (hearts and arteries), and nine showed signs of atherosclerosis (that is to say, arteries whose walls have been damaged and whose lumina have been occluded by fatty deposits). From this, the article states that the researchers involved appear to have concluded that a great number of Egyptians of high socioeconomic status tended to suffer from cardiovascular disease.

All well and good, although nine of twenty mummies is hardly enough to draw any real conclusions, and the claim made by one of the research team that "The findings suggest that we may have to look beyond modern risk factors to fully understand the disease [atherosclerosis]" seems an incredible overstatement. It seems hardly surprising that wealthy Egyptians, able to eat a diet heavy in meat and needing to do no physical labour themselves, could well have grown fat and developed heart and arterial disease. I hardly think that were we to have a perfectly preserved Henry VIII to dissect and examine, we would conclude from his morbidly obese, gouty body that all Tudors had heart disease and that it can't just be modern diets that are problematic. The man was a glutton par excellence, as, I imagine, were many rich Egyptians.

The above is largely apropos of nothing, but it did point me to two other recent stories relating to pharaonic Egypt. Firstly, the revelation that new evidence suggests the pyramids were not built by slaves, which I was not too surprised to hear. Secondly, and more interestingly, the call by Dr. Zahi Hawass, the head of the Egyptian Supreme Council of Antiquities, for the return of the Rosetta Stone to Egypt from the British Museum.

The two stories together got me thinking - Egypt retains a place in the popular imagination, but generally people only really know of names like Tutankhamun, and stories pertaining to him, such as the fabled curse said to have befallen the team, led by Howard Carter, that discovered his tomb. Overall the ancient age is rather poorly represented in our national psyche. Frankly, Tutankhamun is a boring subject, and was an irrelevant king, and it is a great shame that he is perhaps the first to jump to our minds when we think of Egypt. We know his name because of gold, and gold alone. So, since the Rosetta Stone is in the news, I thought I'd tell you a little tale about it...

The stone was discovered in the late 1790s by the French under Napoleon. The arrival of British forces, who went on to defeat the French in Egypt, explains how it made its way to its current home in London. These are reasonably trivial details - the really interesting part is the information inscribed.

The stone features inscriptions in three scripts - classical Greek, and two Egyptian scripts, hieroglyphic and demotic (naturally, the Egyptians didn't just write in intricate hieroglyphics - there were two less formal scripts, hieratic and demotic, for everyday use). The stone was thus immensely valuable, providing the key to reading the hieroglyphs present throughout the tombs and temples of the land.

The first man to really get to grips with the stone was the Englishman Thomas Young, but before he really hit his stride he grew bored of it and abandoned his work. It was ultimately a Frenchman, Jean-Francois Champollion, who would provide the key. He correctly identified that the glyphs were based on phonetic sounds (i.e. if I were to represent the word 'belief' as an English hieroglyph, I would draw a bee, then a leaf), and that the sounds came from the spoken Coptic language of the Egyptians. With his consummate knowledge of Coptic, Champollion correctly translated the cartouche (the names of Egyptian kings were surrounded by an oval, or cartouche, in hieroglyphic writing) of the pharaoh Ramesses.

Upon discovering this, he shouted 'Je tiens l'affaire!' (Eureka!) and collapsed, not waking for five days. Fortunately, living with his brother who had supported him, he regained consciousness and made his discovery known, achieving a lifelong dream.

I don't know if that story made any impression - indeed, you may already have known of it, or heard it before. I thought of other episodes in the history of Egypt, but somehow this story is the one I feel ought to be best known (though, if you have time, I'd look into the stories of Hatshepsut and Akhenaten, fascinating pharaohs who did frankly staggering things) - and indeed, to turn a little Aesopian, there is a slight moral to this diatribe. It's easy to be bewitched by tales of rich boy kings (like Tutankhamun). I'd like to think that maybe, with Egypt, people would look a little deeper if they knew a little more. I've barely scratched the surface of the rich collection of personalities and tales from pharaonic times. So, if this piqued your interest, try and quell your inner magpie, and remember, all that glisters is not gold...

Sunday, January 17, 2010

Thoughts on how UK companies are run (I promise to write something more fun next time)

Corporate governance is the system by which a company is regulated and controlled, and gives ‘an architecture of accountability – the structures and processes to ensure companies are managed in the interests of their owners’. Within the UK, such structures are principally based upon the notion of shareholder primacy, which is to say that the company’s overarching aim is to provide profit to those who own the company. Section 33 of the Companies Act 2006 provides that there is a contractual relationship between shareholders and the company, and between shareholders themselves. As a result, the key basis for how a company’s affairs are handled under UK law lies in the fact that shareholders effectively own the company, but that they delegate responsibility for management to the directors. This separation of ownership and control means that effectively the board of directors is responsible for the entrepreneurial enterprise of the company, but must also ensure that checks and balances exist to minimise risks. From a corporate governance standpoint, it is the interaction between the board and shareholders which is pivotal. For companies to be governed effectively, it is paramount that boards provide accurate, detailed information and act with a good degree of transparency, so that shareholders, in their turn, can make an accurate appraisal of the company’s affairs and contribute to the decision-making processes of the company.

The other vital provisions relating to UK corporate governance come in the form of ‘The Combined Code on Corporate Governance’, most recently updated in 2003. It is important to note that the provisions of the code do not have the force of legislation, and that a company is free to deviate from the code provided an explanation as to why this has occurred is given (the ‘comply or explain’ principle). As such it is clear that companies in the UK are fundamentally self-regulating, as they are not formally bound by any extrinsic rules. In this essay, I will be examining and assessing whether these key principles of self-regulation and separation of ownership and control are responsible for the shortcomings of corporate governance in England and Wales, especially following the recent financial crisis.

The most fundamental issue concerning the above approach involves risk. To paraphrase Adam Smith, as directors are in effect managing other people’s money (the shareholders’), rather than their own, it is unrealistic to expect that they will watch over it with the same degree of vigilance. Equally, whilst shareholders have a purely financial interest in the company, directors may wish to pursue divergent aims beyond solely looking to maximise profits. Given that shareholders may well lack the time and resources to stay fully informed of the company’s activities, and the obvious disparity in the level of information available to directors and shareholders where the separation of ownership and control exists, it would seem that directors may be allowed to act in a relatively unchecked manner. It has been shown repeatedly, from the earliest corporate scandal of the South Sea Company through to the present financial crisis, that such freedom is often abused. Furthermore, these issues concerning the separation of ownership and control may well have impeded the British economy. The ACCA, in a recent report, identified poor corporate governance as one of the chief reasons underpinning the credit crunch, and noted the need for greater shareholder and wider stakeholder involvement in holding boards to account.

A recent LSE study notes that there is still a notable degree of failure by boards to explain non-compliance with the code. This highlights the fact that shareholders may often remain in the dark about corporate behaviour, another example of the potential issues with the current model of UK governance. It is clear, then, that the division of ownership and control as a model for corporate governance has significant shortcomings, in that boards typically possess too much control, and shareholders are unable to provide a satisfactory check upon them, through a combination of apathy or inability and a lack of information coming from the directors. The recent OECD report on governance and the financial crisis states that shareholders have failed to hold boards to account and have made too little effort to engage and meet with directors, and cites low turnout of shareholders at key votes as a significant issue. This so-called principal-agent problem clearly pre-dates the current financial crisis, and is manifestly a key issue. A report from 2004, looking specifically into governance issues within the banking system, states that ‘A cursory review of recent banking crises would suggest that many causes for concern relate to management decisions which reflect agency problems involving management.

Management may have different risk preferences from those of other stakeholders including the government, owners, creditors, etc., or limited competence in assessing the risks involved in its decisions, and yet have significant freedom of action because of the absence of adequate control systems able to resolve agency problems…’, and that ‘The principal-agent problem, outlined above, poses a systemic threat to financial systems when the incentives of management for banking or securities firms are not aligned with those of the owners of the firm.’ Even the Institute of Directors, which feels that the current UK model of corporate governance is ‘fundamentally sound’, is in accord on these issues. In its response to the Walker report on corporate governance, the IoD voiced its opinion that ‘The financial crisis has highlighted the fact that shareholders are not always sufficiently committed to the fulfilment of this role. Leading up to the crisis, they failed to ask the right questions and did not engage sufficiently with boards in respect of proposed business strategies and risk profiles. As Lord Myners has commented, many institutional shareholders continue to behave like "absentee landlords".’

In assessing the above, it seems clear to me that the separation of ownership and control poses significant problems for corporate governance within England and Wales. The inequality in information available to shareholders versus directors means that boards are not effectively monitored and kept in check, and abuses may well go undetected. The discrepancy in aims and attitudes to risk between directors and wider stakeholders also means that corporations are often governed and run in a manner that is inconsistent with the desires of the owners of the company. It is apparent that there is a definitive need for closer relationships between board members and shareholders.

The IoD’s suggestion of introducing a reciprocal combined code for investors, ensuring that there is a framework enforcing shareholder involvement and requiring a ‘comply or explain’ approach where shareholders do not adhere, would be a welcome step. So too would be a greater drive and emphasis on boards to ensure that quality, independently verified information is obtained and disseminated to shareholders, so that they are able to make informed judgements as regards the company and the manner in which it is run. This is especially so in light of the decline of UK institutional investors, those bodies who are perhaps best placed to hold boards to account. Hopefully, the recent crisis will focus minds on these issues and improvements in governance will come to fruition, as it is clear that the status quo is at best unsatisfactory.

In addition to the separation of ownership and control, the issue of self-regulation must be considered. Whilst there is some statutory regulation in place affecting corporations (specifically the Companies Acts of 1985 and 2006), which sets out directors’ duties, for example, the Combined Code provides the more significant framework setting out standards for good practice. It is important to note that adherence to this code is voluntary in the sense that there are no legal sanctions for non-adherence. Compliance is sought through the ‘comply or explain’ approach, which, according to a report conducted by the LSE, seems to be working well, and indeed improving year on year. It seems that in spite of the recent financial crisis, the Combined Code has retained strong support according to the Financial Reporting Council’s recent review and consultation thereon. It was noted in that review that the current ‘soft law’ approach is preferable to a more heavily legislative environment, in that it provides a greater degree of flexibility and the ability to adapt more rapidly to changes in the corporate climate. This echoes the view that the FRC put forward in 2003, where it accorded with Sir Derek Higgs’ view that legislation was not the way forward due to its general inflexibility, and the fact that it is best that shareholders and directors come together in order to consider what is in the company’s best interest, and cited the Cadbury Committee’s statement that ‘statutory measures would impose a minimum standard and there would be a greater risk of boards complying with the letter, rather than with the spirit, of the requirements’.

The IoD is particularly vociferous in its anti-regulatory stance, noting the issues stated above, but also stating that the UK benefits from the lack of regulation, citing the comparative impact of the Sarbanes-Oxley Act in the USA, which imposes onerous regulatory requirements, has cost corporations a cumulative estimated total of $1.4 trillion, and has driven companies away from registering in New York towards other financial centres. If we consider that much of the current global financial difficulty can be traced to the USA, this hardly suggests that firmer legislative regulation has a significant part to play, and certainly indicates that the UK’s current approach may well be preferable to a more regulatory climate.

It is perhaps understandable, given recent events within the global economy, to argue that greater regulation is necessary to ensure that such events are not repeated, and to be able to hold individuals to account for negligent corporate practices. However, in assessing the above I am wholly unconvinced of the need for further legislation. Codes are advantageous in providing standards. Legislation may well be complied with to the letter, but as has been seen in the USA, that is no guarantee of good corporate governance, and may well carry more negative facets than positive ones, especially if wealth stimulation and creation are stifled.

However, the ‘soft law’ approach is of course not perfect. Compliance is voluntary and no sanction may be brought for non-compliance with the code. As such, boards must be relied upon to engage with and adopt any code, something which naturally may be difficult to achieve. There may equally be some areas where corporate behaviour must be curtailed by legislation. I find it hard to imagine there is great desire amongst UK banks to adopt a model where commercial and investment banking are separated, but I would accord with Lord Lawson’s view that such a model (in line with the now repealed US Glass-Steagall Act) is highly desirable and definitively in the public interest. Increased regulation in other areas, especially regarding the environment, may also be necessary to ensure that companies adhere to principles of corporate social responsibility. Manifestly, compliance with codes on governance depends on corporate willingness to adhere. Such willingness may fluctuate, and this presents the key weakness of such an approach.

Sex, God and Excrement

Like many of the world's inhabitants, I stubbed my toe today. Whilst you may find this amusing (and hearing my Hugh Grant-esque shriek of 'bugger!' may well be somewhat funny), I, naturally, did not. Indeed, as I've already alluded to, my brain and tongue immediately lurched for a syllable or two of the Anglo-Saxon.

What struck me about that moment was that forcing out an expletive made me feel better, not merely because I love swearing (I bloody do). The pain in my toe ebbed, tangibly, and I wondered why. I imagine a sensible person would've cleaned up some of the blood first, but never mind.

It would appear that it is fairly settled science that swearing is useful as regards pain relief. A British study, published online on July 13th in NeuroReport (http://journals.lww.com/neuroreport/), found that when participants held their hand in freezing water for as long as possible, they coped better and endured longer if they swore (when compared to not swearing, or saying a neutral word such as 'table'). Interestingly, one participant, when asked to list five swear words they might exclaim after suffering pain, failed to come up with a single curse and was thus cut from the study.

However, it seems that this was the first and only study to look at the phenomenon, and it's worth noting that the sample size was small (67 participants), and that the neurology of pain is a complex matter (see here for example http://thalamus.wustl.edu/course/body.html). As such the findings may not be conclusive, but there is a satisfyingly inherent logic behind it all.

The proposed rationale behind the pain relief is that the act of swearing sets off a sort of internal feedback mechanism, activating the sympathetic nervous system (the 'fight or flight' response, as it is more colloquially known). The key outcome is a surge of adrenaline throughout the body. Adrenaline acts within the nervous pathways, effectively helping to dampen down the pain signals which reach your brain from the periphery. It is a compelling argument, as it is demonstrable both that swearing has this effect (your heart rate jumps a little when you swear, a classic sign of increased adrenaline) and that adrenaline acts as an analgesic.

This may, in itself, provide some insight into the linguistic origins of swearing. I've always been interested in how we invoke God ('deistic' swearing) and bodily functions ('visceral' swearing) when we swear. English has a curious linguistic make-up, but it is unmistakably the Germanic origins of our vernacular that inspire our swearing. There are numerous examples of this, from the German 'Scheisse' and our 'shit', to the Afrikaans 'fok/fokken' and our 'fuck' (it is worth pointing out that the suggestion that 'fuck' is an acronym of the phrase 'Fornicated Under Charles the King' is a fallacious folk etymology). But why sex, and why God?

Based on what I've already written, the visceral element of swearing is more readily explainable. Words invoking sex are highly evocative, and inducing thoughts of such will obviously have a physiological effect. It seems possible that what really upsets us about swearing is not the sound of it on the ear, but rather its physiological impact upon us. When someone incants a word pertaining to faeces, we feel a revulsion that goes beyond the sound and tone of the letters compiled together - it appeals to something more basic, an evolutionary part of us that acknowledges that excrement is brimming with potential disease. As such, sex and excrement can be seen as the positive and negative ends of the visceral swearing spectrum. It would of course be foolish and simplistic to ignore cultural and historical influences on swearing - the worst swear word by anyone's standards is undoubtedly 'cunt'. Considering the sexual repression that marked the Victorian era in British history, it is no stretch to say that physiology alone cannot explain how that word came to be so reviled. It is worth noting that the -unt ending is, dispassionately, more guttural and perfunctory sounding than -uck or -it, so we cannot ignore the purely phonetic implications either!

Deistic swearing is a far more challenging topic. Any notion of God must, by my logic, become part of language later than any vocabulary to do with bodily functions (it seems odd to suppose a child, for example, would discuss Hume with you before it told you that it needed to go to the toilet). I have no doubt that the word God, and notions thereof, can inspire physical feelings of real potency within people. But I doubt that the visceral swearing explanation fits our purpose here. It seems to me that a cultural explanation is more likely. Early societies which grappled with God were invariably polytheistic and prone to shamanism. I would surmise that it is this that provides our first insight into how we came to 'blaspheme' on stubbing our toes. There were undoubtedly certain words that simply could not be uttered within these cultures, save by those ordained by their communities with the right to utter them. Words acquiring a taboo status undoubtedly develop a power and a mystique that others simply do not. It may well be that the fear involved in speaking such words leads to a 'visceral swearing' type response, but this seemingly must have followed from the cultural constraints applied.

Obviously, this is a fairly reductionist view, and it is likely that there are a plethora of other factors I've neglected. It is at least pleasing to note, however, that even stubbing one's toe can be interesting - and if you disagree? Well, fuck off :)

Genesis

In the beginning was the word, and the word was blog.

A suitably rubbish start to what will be a hopefully interminable ramble comprising whatever thoughts grab my attention day on day...