General


Someone sent me a link to Michael Shermer’s recent column on people who believe in conspiracies – “Paranoia Strikes Deep”. It is an interesting article that reveals more about its author’s biases and limitations than about its intended subjects.

In short, Mr. Shermer thinks that people who believe there may indeed exist secret groups – especially within government – successfully conspiring to influence politics, society and the economy on a large scale are irrational idiots not thinking straight. According to Mr. Shermer such conspiracies are very improbable, because in a large conspiracy maintaining secrecy would be next to impossible. Doing so within government is especially hard, because bureaucrats are incompetent and stupid:

But as former Nixon aide G. Gordon Liddy once told me (and he should know!), the problem with government conspiracies is that bureaucrats are incompetent and people can’t keep their mouths shut. Complex conspiracies are difficult to pull off, and so many people want their quarter hour of fame that even the Men in Black couldn’t squelch the squealers from spilling the beans. So there’s a good chance that the more elaborate a conspiracy theory is, and the more people that would need to be involved, the less likely it is true.

(my emphasis)

Seems like a very good argument: if someone famous for failing at conspiring says it is hard, it must be true, right?

But, seriously: is that really the case? Is it really impossible to create a big, successful secret operation within government – and keep it secret for a long time – because people will talk? Let’s examine this claim carefully.

First, it is clear that such a rule could apply only within the context of modern Western democracies. In Soviet Russia, for example, everything was secret and classified by default, from genocide and the mass resettlement of whole nations, through the entire portfolio of weapons development projects, to civilian plane crashes. The same happened in Nazi Germany and routinely happens to this day in totalitarian states. Who knows, for example, what the Chinese government is up to? Certainly not the Chinese people – or shall I call them ‘subjects’?

But even within the Western world, large-scale secret operations have been successfully hidden from the public for decades. My favorite example is the Ultra/Enigma case.

Everyone knows the basics – during the Second World War the Allies routinely read much of Germany’s encrypted radio traffic and distributed the intelligence gathered to dozens of Allied commanding officers, giving them an incredible advantage over their German counterparts.

What’s interesting here is that after the war ended no one knew anything about it until 1973, when Bertrand’s book was published, giving the public its first glimpse of the truth. Until then, movies were made and books were written – including scholarly research in history and military tactics – analyzing Allied victories in numerous battles without that crucial knowledge. Amazing, but somehow for 28 years everyone involved, on both sides of the Atlantic, kept their mouths shut.

And we are not speaking here about a small group of people. Hundreds if not thousands were involved in the operation and knew the Ultra secret. This includes the Bletchley Park staff of several hundred cryptologists, analysts, technicians and clerks, then the hundreds of soldiers in the SLU units distributing the information to commanders (and taking every dispatch back!), the commanders themselves, and numerous politicians and intelligence officers in both the UK and the US. Not one of them spoke about it for 28 years after the conflict was over.

This makes it even more interesting. It is easy to understand why everyone involved kept quiet while the war was ongoing: all were in the military, penalties for loose talk were harsh, the press was censored anyway – and everyone involved understood their mission was important and didn’t want to compromise its security. But why stay silent after the war? Especially 10 or 15 years later? Germany was divided, and part of it was already considered an ally in the Cold War. Yet somehow no one said a word.

Amazing, isn’t it, Mr. Shermer, what Her Majesty’s government was able to do? Maybe they were not as inept as Mr. Liddy?

But even in the US some bureaucrats were much better than Mr. Nixon and his staff at hiding secrets. Let’s just take the case of the NSA. The NSA – arguably the biggest signals-intelligence organization in the world – was officially created by Harry S. Truman in 1952. Again, not only the general public but also large parts of the US government – including the US Congress – didn’t even know this organization existed, let alone that it spied on Americans, until the 1975 Church Committee hearings. And we are not speaking here about a small organization – the NSA employed thousands, and its operations spanned the globe, with listening posts in Australia, the UK, Turkey and other places.

I think those two examples show very well that Mr. Liddy is wrong, and so is Mr. Shermer. Conspiracies can be pulled off by government agencies without being compromised for quite a long time, even in modern Western societies.

Someone may say that both cases come from the secretive fields of cryptology and military intelligence. Can there be conspiracies of a different type – ones with political and social agendas on a huge scale?

History, again, serves up an excellent example – the Soviet Union, and the unprecedented social engineering that took place there, was precisely the product of such a successful conspiracy. Lenin and his pals were able to use the destabilization of Russia following the First World War to grab power, ruthlessly eliminate the opposition – and then implement the crazy social agenda they all firmly believed in. There is no room to explore the details here, but it is a very interesting story in itself.

And yes, it can happen again. One of the fallacies of the modern world is to think that the old problems of humanity – wars, dictatorships, cruelty and so on – are a thing of the past because we are modern, meaning mainly more technologically sophisticated. But technological advancement doesn’t change human nature, which has remained strikingly constant throughout recorded history (which is why Greek tragedies are still so understandable to us thousands of years after they were written) – it just makes the damage we can potentially do bigger.

Does this mean that 9/11 was an “inside job”? No. It just means that the arguments of those who say so should be examined and discussed, not the people sneered at merely for asking questions or having doubts about the official version. History shows that a conspiracy on such a scale is hard to pull off and thus improbable – but far from impossible. First, of course, one has to know history – without it, it is easy to fall for naive, simplistic arguments like the one made by Mr. Shermer. Ignorance, indeed, strikes deeper than paranoia…

On June 24th our daughter, Amelia Weronika, was born. She is the sweetest thing that has ever happened to me, and both Joanna and I are very happy that she is with us.

Amelia

Since her arrival I have less time to write but more motivation to use my time well, so this blog will still be alive. Stay tuned for more articles on various topics, as usual.

I have been a computer freak for longer than I would like to admit (I just realized that when I was writing programs on my first computer – the venerable C64 – most of my current team had not even been born yet). The shape of computing has changed a lot over that time. Computers now have vastly more memory and processing power than my first machine. But one thing hasn’t improved as much: overall software quality.

Every new programming paradigm or idea that has come along since computers crawled out of government labs has had “better quality” as part of its promise. Yet somehow we are still surrounded by unreliable, poorly designed software. Making it more graphical didn’t change that fundamentally. And sadly, web applications are no exception.

At first, at least the “hard to use” part was partially solved, because web interfaces were simple, which made them clean and easy to grasp. It also made them fast. Now, however, this is quickly becoming a thing of the past. Modern web apps are full of heavy graphics, JavaScript is pushed to the limits of its capabilities, and the result is heavy UIs that browsers can barely cope with. Opening a few of those modern web apps in Firefox makes it the most CPU-consuming process on my system, usually eating lots of memory (and frequently leaking). Sites frequently display errors, get overloaded or otherwise malfunction.

Why is that so? Let me offer two reasons, which don’t exclude each other, so both can be true.

First, I think we are all spoiled by computing power and memory getting cheaper and more abundant. In the past, machines changed very slowly. For example, the C64 I mentioned didn’t change at all over a period of ten years, yet the software for it improved immensely. Programmers studied the hardware and by the late eighties made it do things the machine’s original designers didn’t think it was capable of. That was truly pushing the limits.

Now, if an application is a resource hog, it is easy to just give it enough resources to make it run. That’s why each release of MS Office is slower and bigger than the previous one, with marginal functional improvements. The same has happened to web browsers and almost all desktop apps. Increases in speed and memory size mask the fact that today’s desktop apps are usually bloated and slow, but only to a certain degree. The subjective perception is that they work about as fast as the previous generation, and everyone accepts that – even though everything should run far faster considering how much the machines this software runs on have improved.

Clusters, and now cloud computing, have made the same easy fix available for server-based software too – and that’s what web apps are, apart from the UI. You can throw a whole bunch of servers at the problem and forget about optimization. And this works, also economically. Sites still make a profit on mediocre codebases, because computing power is cheap and because usually very little depends on web apps. If they break or run slow, no one dies and nothing of importance is lost – the impact of low quality is not easily seen. So there is no economic incentive to improve.

This, together with the general acceptance of software as buggy (an effect of customers being trained for decades to accept dismally poor software from Microsoft), causes buyers of software products to accept low quality as the norm. They have learned to live with it, and it takes some effort to bring quality into their perspective at all. In fact, I think even some of our clients don’t appreciate all the effort we put into optimization and testing, because they don’t see the benefits of having a piece of web software that is robust and scalable.

Second, I think many of today’s web applications are created by people with little or no education (formal or otherwise) in computer science as such. I think many of the “cool web 2.0 kids” have no idea how the computers they use really work inside. I don’t think many of them know, for example, what preemptive time-sharing is, let alone are able to calculate the computational cost of the algorithms they use (if they know this difficult word with Greek roots). Very few know and understand the operating systems they use, or have ever heard of highly available design and other such practices.
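To make the point about computational cost concrete – this is my own illustrative sketch, not code from anyone mentioned here – the two Python functions below solve the same problem, checking a list of IDs for duplicates, but one does quadratic work and the other roughly linear. On a tiny test dataset the difference is invisible; on production-sized data the first one quietly turns into exactly the kind of resource hog described above.

```python
import random
import time


def has_duplicates_quadratic(ids):
    # Compare every element with every other one: O(n^2) comparisons.
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if ids[i] == ids[j]:
                return True
    return False


def has_duplicates_linear(ids):
    # Remember what we have already seen in a set: O(n) on average.
    seen = set()
    for x in ids:
        if x in seen:
            return True
        seen.add(x)
    return False


if __name__ == "__main__":
    # 20,000 unique IDs - the worst case, since both functions must scan everything.
    ids = random.sample(range(10_000_000), 20_000)
    for fn in (has_duplicates_quadratic, has_duplicates_linear):
        start = time.perf_counter()
        fn(ids)
        print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
```

With 20,000 unique IDs the quadratic version performs roughly 200 million comparisons while the set-based one does 20,000 lookups – a gap that no amount of cheap hardware really papers over once the data keeps growing.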

Tim O’Reilly predicted in the late nineties that easy web technologies (which back then meant PHP and early CMSes) would allow people with little or no technical knowledge to express themselves on the web. And indeed they do – but much of this “creative expression” is, well, crappy software.

All this means that despite everyone talking a lot about quality, there is very little push to actually deliver it in web apps. In fact, quality gets squeezed out of the picture from both sides: clients don’t care about it when they order their web apps, and developers frequently don’t have the knowledge necessary to deliver it.

The upside is that this will change as more and more comes to depend on IT systems built on web technology. Crashes, data loss or hour-long “scheduled downtimes” (like Twitter’s) won’t be acceptable. Buyers will, in time, bear the cost of neglecting quality from the start and – learning from their mistakes – will insist on it in their next project. Which is great news for us, because we already have the knowledge and practices in place to deliver web apps that are also good software. In the meantime, we have to keep educating people that quality counts even in web apps.

I just devoted 1.2 hours of my time to watching the Google Wave video in its entirety, and I’m impressed. I wholeheartedly agree with the glowing comments by Tim O’Reilly and Matt Asay. I’m not sure it will retire e-mail or traditional IMs, but it has the potential to change internal communication in companies. It is the first thing that really has a chance to challenge MS Exchange’s reign over corporate communication. Here is why.

First, it will be open-sourced and based on an open protocol, unlike most Google products so far. Most corporations won’t use hosted web tools like Google Apps precisely because they value their data and want to retain control over it. They won’t use tools they can’t host on their own servers behind their own firewalls. So Wave is something that can get adopted where Google Apps (and all other hosted apps, for that matter) didn’t stand a chance.

But even more important is the ability to integrate almost everything with Wave through extensions and bots. Here lies the real strength of Wave as a corporate solution.

Typically, large businesses already have various specialized systems supporting their processes – CRMs, accounting systems, ERPs, logistics and order tracking, and so on. Much of the communication inside those companies revolves around the same processes and the data stored in those systems. Currently that communication happens in e-mails, with data pasted in or attached as spreadsheets. With Wave it is easy to imagine integrating all those systems and creating a customized, comprehensive corporate communication environment. People would be able to discuss invoices, orders, reports and other items right in their Wave inbox, seeing up-to-date information fetched into context from other corporate systems – a small sketch of the idea follows below.
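Purely as an illustration of that idea – not the actual Wave robots API, which I haven’t written against yet – here is a minimal Python sketch of what such a robot could do. The `INV-` numbering scheme, the `erp_lookup` helper and its canned response are all hypothetical placeholders for a real internal ERP integration; in a deployed robot the `annotate_message` function would be wired to a “new message in the wave” event.

```python
import re

# Hypothetical invoice numbering scheme used by the (imaginary) ERP.
INVOICE_PATTERN = re.compile(r"\bINV-\d{6}\b")


def erp_lookup(invoice_id):
    """Stand-in for a call to an internal ERP system.

    In a real deployment this would query the ERP's API; here it just
    returns canned data so the sketch is runnable on its own.
    """
    return {
        "status": "awaiting payment",
        "due_date": "2009-07-15",
        "amount": "12,400 EUR",
    }


def annotate_message(text):
    """Scan a conversation message for invoice numbers and return follow-up
    lines with current invoice data, so participants see live information in
    context instead of pasting stale spreadsheets."""
    lines = []
    for invoice_id in sorted(set(INVOICE_PATTERN.findall(text))):
        data = erp_lookup(invoice_id)
        lines.append(
            f"{invoice_id}: {data['status']}, due {data['due_date']}, "
            f"amount {data['amount']}"
        )
    return lines


if __name__ == "__main__":
    # In a Wave robot this would be triggered by an event on the wave itself.
    print(annotate_message("Can we talk about INV-004217 before Friday?"))
```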

Of course, it will take a while – Wave is still in beta, and the protocol for connecting different Wave servers is still being developed. There are issues to fix before Wave sees real use, and then it will take a push from both corporate IT and companies like ours to create all the integrations, build the extensions, robots and so on. But we are definitely seeing something very interesting here. And we are eager to deploy it.
