The Last Temptation of Mankind?

One of my professional interests is in Artificial Intelligence – AI.  I think I've had an interest in the simulation of human personality by software for as long as I've been interested in programming, and have also heard most of the jokes around the subject – particularly those to do with 'making friends'. 🙂  In fiction, most artificial intelligences that are portrayed have something of an attitude problem: HAL in 2001 – insane; the Terminator – designed to be homicidal; the Cylons in the new version of Battlestar Galactica and the 'prequel' series, Caprica – originally designed as mechanical soldiers and then evolving into something more human, with an initial contempt for their creators.  The moral of the story – and it goes all the way back to Frankenstein – is that there are indeed certain areas of computer science and technology where man is not meant to meddle.

Of course, we're a long way away from creating truly artificial intelligences; those capable of original thought that transcends their programming.  I recently joked that we might be on our way to having a true AI when the program tells us a joke that it has made up itself and that is genuinely funny!  I think the best we'll manage is to come up with a clever software conjuring trick; something that, by deft programming and a slight suspension of disbelief on the part of the people interacting with the software, will give the appearance of an intelligence.  This in itself will be quite something, and will probably serve many of the functions that we might want from an artificial intelligence – it's certainly something I find of interest in my involvement in the field.

But the problem with technology is that there is always the possibility of something coming at us unexpectedly that catches us out; it's often been said that the human race's technical ability to innovate outstrips our ability to come up with the moral and philosophical tools our culture needs to deal with those innovations by anywhere from a decade to fifty years.  In other words, we're constantly playing catch-up with the social, legal and ethical implications of our technological advances.

One area where I hope we can at least do a little forward thinking on the ethical front is in the field of AI; would a truly ‘intelligent’ artificial mind be granted the same rights and privileges as a human being or at the very least an animal?  How would we know when we have achieved such a system, when we can’t even agree on definitions of intelligence or whether animals themselves are intelligent? 

Some years ago I remember hearing a BT 'futurist' suggesting that it might not be more than a decade or so before it would be possible to transfer the memories of a human being into computer storage, and have those memories available for access.  This isn't the same as transferring the consciousness; as we have no idea what 'consciousness' is, it's hard to contemplate a tool that would do such a thing.   But I would accept that transferring memories into storage might be possible and might even have some advantages, even if there are ethical implications – and the ultimate in privacy implications – to deal with.  Well, it's certainly more than a decade ago that I heard this suggestion, and I don't believe we're much closer to developing such a technology, so maybe it's harder than was thought.

But what if….

In the TV series 'Caprica', the artificial intelligence that controls the Cylons is provided by an online personality created by a teenage girl for use as an avatar in cyberspace, which is downloaded into a robot body.  In Alexander Jablokov's short story 'Living Will', a computer scientist works with a computer to develop a 'personality' that is a mirror image of his own, but that won't suffer from the dementia that is starting to affect him.  In each case a sentient program emerges that in all visible respects is identical to the personality of the original creator.  The 'sentient' program thus created is a copy of the original.  In both Caprica and 'Living Will' the software outlives its creator.

But what if it were possible to transfer the consciousness of a living human mind over to such a sentient program?  Imagine the possibilities of creating and 'educating' such a piece of software to the point at which your consciousness could wear it like a glove.  From a situation where the original mind looks on his or her copy and appreciates the difference, will it ever be possible for that conscious mind to be moved into that copy, endowing the sentient software with the self-awareness of the original mind, so that the mind is aware of its existence as a human mind when it is in the software?

Such electronic immortality is (I hope) likely to be science fiction for a very long time.  The ethical, eschatological and moral questions of shifting consciousnesses around are legion.  Multiple copies of minds?  Would such a mind be aware of any loss between human brain and computer software? What happens to the soul?

It's an interesting view of a possible future for mankind: to live forever in an electronic computer, at the cost of becoming less than human.  And for those of us with spiritual beliefs, it might be the last temptation of mankind – to live forever and turn one's back on God and one's soul.

The PAYG Laptop?

You write one article about Appliance Computing and the following morning this BBC story pops up – 'Laptop launched to aid computer novices'.  The 'Alex', a Linux-based laptop, is aimed at people who're occasional computer users and comes with an Office suite, mail, browser, broadband connection and a monthly fee.  In other words, a PAYG laptop.  There's nothing new about this; a number of mobile phone companies offer mobile broadband access packages that include a Windows laptop, and in the recent past there have been a few occasions when companies have attempted to launch similar schemes, sometimes backed with advertising.

I say attempted, because they've tended not to work, and I'm not at all convinced that this one will be any more successful.  The company's website describes the package available here, and to be honest it does seem rather over-priced for what is a modified and stripped-down Ubuntu distro – and one that seems to only work when your broadband connection is running.  It's a good business model, provided that you can get people to buy into it.  There's a review of the package to be read here.

Now, first question – who is the market?  The broadband company who've developed this package claim that almost 25% of people in the UK with computers don't know how to use them.  Really?  That I find difficult to believe.  Most folks I know – across the board: non-techies, techies, old, young, whatever – are quite au fait with using their computer to do what they want to do.  There may be aspects of computing that they don't get, in the same way that I don't 'get' iTunes, for example, or the intricacies of computer or video gaming, but I know no-one who's bought a computer who doesn't make some use of it.  Perhaps that 25% didn't really want a computer, or have ended up with one totally unsuitable for them?

If the market sector is this 25%, then what proportion are willing to pay for a £400 computer and a £10 monthly access fee?  Apparently a 'software only' option that can be installed on older computers, and that will simply cost you the monthly fee, is out in the next couple of months, which might allow people with older computers to make use of them.  The package comes with 10GB of online storage; does this mean that local storage is not available?  If so, what happens to your data if you don't pay your monthly fee or cancel your subscription?  To be honest, that sounds like something of a lock-in akin to Google Docs.  According to this review, on stopping the subscription the PC effectively 'expires' – along with the access to your data.

I'm afraid that from what I can see I'm not impressed with either the environment or the limitations on offer.  One of the things that you learn after a while in putting together user interfaces is that people who come in knowing nothing soon gather skills, and in some cases start finding the 'simple interface' that originally attracted them to be a limitation.  With a standard PC, you just start using more advanced programs and facilities; with something like the Alex you're stuck with what you're given.  And whilst you could just buy a PC, ask someone to set it up 'simple' for you (to be honest, it isn't THAT difficult with a Windows PC, Mac or Linux machine if you ask about) and use a more 'mainstream' machine, once you've gone with the Alex you're stuck with your data locked into its environment.

The solution to this problem is perhaps to look at front ends that sit on existing platforms, rather than to further facilitate the move towards a computer appliance future split between a large number of manufacturers who lock us into proprietary data stores.

The Appliance Computer?

Well, the fuss over the launch of the iPad has died down somewhat – it wasn't the Second Coming or the Rapture, the world didn't suddenly turn rainbow coloured (not for me, anyway) and the Apple fans have gone quiet.  So, perhaps it's time to take a few minutes to think about what the iPad might mean in the future.   This is an interesting viewpoint – that the iPad could be the first step on the road to the computer as a true 'appliance'.

In some ways, this might not be a bad thing – after all, it's the way that all technology has tended to go over the years.  Take radio, for example – the first radio receivers required the operators to be reasonably knowledgeable about the equipment, and in some cases able to build and maintain their own sets.  Radios required large outside aerials, and I clearly remember a 'Home Maintenance' book of my mum's, dating from the 1920s, that had great amounts of information about how to service your wireless were it to go wrong.  By the 1930s they were more self-contained 'black boxes' – OK, self-contained walnut wood boxes – and by the time we hit the 1970s little radios were being given away as children's toys.  We're moving along that path with computers; when home computers first became available you were expected to want to write some of your own programs or even build the machine, then published software came along, and now we're at the point where very few people write their own software at all.

But the thing with contemporary computers is that you can still write your own software if you wish to; you can go out, buy a copy of VB.NET, download Python or PHP or Java and, with a little application, write your own software.  And if your computer doesn't support media you want to view or listen to, you can just install a piece of software that will do the trick.  And if you want it to do something totally new, you can again find an application somewhere, or write your own, or commission someone else to write it for you – all without fear or favour.

If computers follow the logical progression, then we could expect to see them move on to a stage of development where they are pretty much ‘closed units’ – the old joke of ‘no user serviceable parts’ will be very applicable.  Think of the computer of tomorrow as being a little like your smartphone or a digital TV with Satellite TV and a DVD recorder built in; there’s content for you to view, you can save it, there may be services to buy, but you’re not going to be able to add functionality to it by producing your own code or content to run on it.

In other words, surprisingly like an iPad.  And some analysts have noted that the apparent lack of expandability of the iPad might not be a design omission, but might actually be a deliberate design policy.

Producing computers that are simply glorified media players has a number of advantages for many parts of the hardware and content industries.  To start with, if you can totally control the hardware and software environment then you can reduce your support calls; many software houses that produce applications for Windows have to maintain sizeable support functions because, whilst their software runs on Windows, each PC running Windows is to a great degree unique, and therefore offers a near-unique environment on which the application runs.

A further point is that once you stop people from being able to put their own software on these machines, you also prevent a lot of the issues of illicit copying.  By controlling the platform you can control the way in which the platform handles content that might be protected by some sort of Digital Rights Management software.  Indeed, it's not too difficult to imagine a situation in which the functionality available on the unit can be remotely enabled and disabled based on the payment of licences or rental fees – similar to the way in which satellite TV receivers can be activated or de-activated remotely.

The Appliance Computer has a lot to offer manufacturers and content providers; it locks users in; it protects content; it makes the equipment more reliable.  But it also eats away at the very foundations of what has made so many software applications possible – the ability for anyone to write their own software.

Don’t let Appliance Computing remove the freedom to compute.

UK Government Data Release – much ado about nothing?

Back in January the UK Government opened up a web site that was described as “a one-stop shop for developers hoping to find inventive new ways of using government data”.   The site, http://data.gov.uk/, aims to pull together government-generated data sets in a form that application developers can use to create 'mashups' of data from different public and private sources, create map-based information from the data, and so on.  In other words, the idea is to open up public data for private use.

I was pretty excited; professionally I've used some public data in the past, and acquiring it is usually quite hard going.  Even if you know where to find the data, it's not easy to just grab and download, and then it comes in various formats that need pre-processing to make them useful.  So this project sounded very promising.  I wouldn't go so far as to say that my nipples were pinging with excitement, but there was definite anticipation.

So….my thoughts.  Bottom line for me at the moment is ‘Sorry chaps, sort of getting there but there’s a long trail a-winding before you reach your goal’.  Now, this may sound rather churlish of me, but allow me to explain….

Nature of data

First of all, a lot of the data on the site has been available in other places before now – however, it is at least now under one roof, so to speak.  The data comes in disparate formats – CSV files and the like – and is also pre-processed or sanitised, depending upon how you want to view it.  In some cases the data is in the form of spreadsheets that are great for humans but dire for automated processing into mashups.  The datasets are not always as up to date as one might expect; for example, on digging through to the Scottish Government data, I found nothing more recent than 2007.

Use of SPARQL and RDF

Although the SPARQL query language has been implemented to allow machine-based searching of the site, the data available via this interface seems to be pretty thin on the ground AND, to be honest, I'm not sure that the format is the best for the job.  SPARQL is a means of querying data represented in the RDF format, used to search what's called the 'Semantic Web' – a way of representing data on the Internet so that it is more easily made meaningful to search tools.  But for a lot of statistical data this isn't necessarily the best way to search, and the SPARQL language is not widely used or understood by developers.
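
To give a flavour of what's involved, here's the sort of query the interface expects.  This is only a sketch – the dataset class URI below is an illustrative stand-in, not the site's actual vocabulary:

# Sketch of a SPARQL query listing datasets and their titles.
# NOTE: the class URI below is an illustrative placeholder, not the
# actual data.gov.uk vocabulary.
PREFIX dct: <http://purl.org/dc/terms/>
SELECT ?dataset ?title
WHERE {
  ?dataset a <http://example.org/def/Dataset> ;
           dct:title ?title .
}
LIMIT 10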

No API

There's no API available – such as a Web Service – to get at the data.  The site acknowledges this and states:

“The W3C guidance on opening up government data suggests that data should be published in its original raw format so that it's available for re-use as soon as possible. Over time, we will convert datasets to use Linked Data standards, including access through a SPARQL end-point; this will provide an API for easy re-use.”

I think this is a rather facile argument.  Apart from the data not being that up to date, one can surely publish the content of the data raw – i.e. with no numerical alterations – whilst still making it available via a SOAP, JSON or other similar API that more developers might have experience of and access to.   As it stands, it just seems that some of the time spent on this project could have been spent getting the data into a consistent format that could be served up to a wider range of developers.
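
To illustrate the point, something along the following lines would be immediately usable by just about any web developer.  The URL, field names and figures are entirely made up – no such endpoint exists – and are here purely to show the shape of the thing:

(hypothetical request)
GET http://data.gov.uk/api/datasets/road-accidents?year=2008&format=json

(hypothetical response)
{
  "dataset": "road-accidents",
  "year": 2008,
  "results": [
    { "region": "Scotland", "casualties": 1234 },
    { "region": "Wales", "casualties": 567 }
  ]
}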

This current interface – wait for the heresy, people – may be wonderful for the Semantic Web geeks amongst us, BUT for people wishing to make widescale, real use of the data it's NOT the best format for getting the majority of non-bleeding-edge developers started with what's available.

Summary

This is an early-stage operation – it is labelled 'Beta' in the top right of the screen, and as such I guess we can wait for improvements.  But right now it just seems to be geared too much towards providing a sop for the 'Open Data' people rather than providing a widely usable and up-to-date resource.

Google to phase out IE6 support – first shots in their war for browser dominance?

I really dislike IE6.  I hate having to support it for some of my clients, and really wish they could work out how to convince their customers to upgrade.  But my clients are real-world guys; they deal with nuts and bolts, ironmongery, bank accounts, etc.  Their customers tend to be real-world people as well – and by real world I mean not software, not media, not technology companies.

I have a client whose website gets 30% of its hits from people running IE6.  That's right.  30%.  That's three times the average according to these statistics here – http://www.w3schools.com/browsers/browsers_stats.asp – where in December 2009 about 10% of browsers were still IE6.  From my own experience, these users tend to be at large corporates where machines are 'locked down', or at smaller non-technical companies who don't care what browsers their PCs run as long as they can access everything they need to.

Anyway…Google have finally announced that some features of Google Docs and other applications will soon stop working with IE6.  Actually, for once we have a technology company that has delivered ahead of the announcement.  Some Google products already fail big time with IE6…and IE7…and IE8.  Google Wave is a non-starter with IE full stop.  It isn't just 'some features' or a 'reduced user experience'.  In my experience it's a big fat 'no user experience' at all.

Here's what I expect Google to do over the next few months.  After IE6, the pressure will be placed on IE7 and IE8.  Google will probably suggest that people move to the Chrome plugin for using their sites in IE, and then I'd expect a mysterious problem to emerge with using the plugin in IE, so that more pressure is placed on IE users of Google sites to drop IE for Chrome (or, for the time being, another browser).  Of course, not all IE users will be bothered about not having access to Google applications; but Google's applications are rapidly becoming the main game in town for online apps – a very unhealthy situation.  Microsoft were hag-ridden for years by various regulatory authorities over their efforts to command the desktop by all means available to them.  Google appear to be starting to do exactly the same thing.

Of course, there are other browsers that are more standards-compliant than IE is, was or is ever likely to be.  And this is the core of Google's current argument – that IE's non-standard handling of certain elements of the HTML, CSS and JavaScript standards makes it impossible to support IE properly.  Google's products make extensive use of a set of techniques known as AJAX to provide a desktop-style user interface experience; it's strange that other companies producing AJAX-style interfaces are able to make them run happily with IE (albeit with a few tweaks occasionally required to layout).  My conclusion at this stage would be that either Google haven't got the brainiest guys on the block as far as coding is concerned, and/or they're using their market muscle to start dictating their way to a situation in which they own the web 'desktop'.

After IE, what next? Firefox, Opera and Safari aficionados should be reminded of John Donne’s famous quote at this point:

Each man’s death diminishes me,
For I am involved in mankind.
Therefore, send not to know
For whom the bell tolls,
It tolls for thee.

All Google need to do is start defining their own standards, or push implementation of emerging standards in their products so that only their own browser, Chrome, will be ready to cope.  Look at any areas of weakness in other browsers, and code your application to include code that deliberately breaks when used on that 'target' browser.  No browser is 100% compliant; Google need only force each browser manufacturer into a cycle of fail and fix, whilst Chrome, each time, is available from day one to work perfectly with Google's applications.

Microsoft have been bad lads in the past; there’s no reason for Google to start angling for the same accolades.  However, if they do, I’ll be interested to see whether the folks who’ve rightly been hard on MS will be equally hard on Google.  And if not, why not?

Innovative is not the same as useful

I recently found this on my Twitter feed: @jakebrewer: Yes! Note from newly devised Hippocratic oath for Gov 2.0 apps: “Don't confuse novelty with usefulness.”  It is so true – and that comes from someone who spent part of his MBA working on the management of creativity and innovation.  There is a science fiction story by Arthur C Clarke in which two planetary empires are fighting a war.  The story's called 'Superiority', for anyone who wants to read it.  In this tale, one side decides to win the war by making use of its technological know-how, which is in advance of the opposing side's.  Unfortunately, each innovation has some unforeseen side effect which eventually, cumulatively, ends up with the technologically advanced empire innovating itself into defeat.

First of all, a definition.  For the purposes of this post, innovation is not the small improvements we all do to streamline and ‘finesse’ a process or product.  That’s just maintenance and responding to feedback.  Innovation is the equivalent of trading in the bike for a car.  It’s a big shift.

Innovation is an important aspect of our personal and business lives; through it we have a vital tool for adaptation and survival, but it's important not to get hooked on the idea that innovation is always a Good Thing, or to fetishise it as an all-powerful tool for all problems.  In fact:

  1. Innovation is not always useful.
  2. Innovation is not always indicative of progress.
  3. Innovation does not always benefit all the stakeholders.
  4. Failure to innovate can be expensive and risky; innovating for no reason can also be expensive and risky.
  5. Innovating is not the same as being effective.
  6. Innovation can deliver false confidence.

 

Innovation is not always useful

This usually equates to ‘if it ain’t broke, don’t fix it’.  If you have part of your life or business process that is chugging along well and is meeting the targets you set for it, then don’t bother innovating it yet.  There is no purpose or use to massive change that meets no need.  Such innovation is useless.

Innovation is not always indicative of progress

‘Progress’ is one of those words that falls into the category of 'hard to define but we all know what it is'.   You may think that you have to innovate to stay cutting edge; but do you?  Sure, we have to be aware of where our market is going, and of risks to our future revenue streams.  But innovating to stay on the bleeding edge of technical and social change is likely to expose you to risk.  Progress for your business or life does not always reflect social or technological 'progress'.  Innovating purely to keep up with trends is 'running the Red Queen's Race' – you will never finish.

Innovation does not always benefit all stakeholders

Innovation may be great for you, but not great for people whose incomes are affected, whose role is removed and whose job in the organisation is no longer needed.  When you innovate, bear this in mind and don’t automatically expect everyone to be pleased they belong to an innovative organisation.

Failure to innovate can be expensive…as can innovating!

Innovation always costs time and perhaps money, especially if done properly.  There is no such thing as free innovation, even if the cost is only the time taken to make sure your innovation won't break what's already happening.  It's easier to keep existing customers than to create new ones; an innovative approach may scare existing customers away without attracting new ones to replace them.  Be prepared.

Innovating is not the same as being effective

I see a lot of people in software engineering spending inordinate amounts of time on new processes, new languages and new techniques who don't always seem to be hitting the market with product.  Don't mistake skilling up with the latest languages and software design techniques for being effective.  It's only effective if you put the techniques to use.  I have several clients who make a good living, thank you very much, maintaining and providing applications that are based on 10-year-old technology.

Innovation can deliver false confidence

The German Enigma code machine in World War 2 was a highly advanced and innovative piece of kit for the time.  If used correctly it would have been unbreakable.  However, the operators tended to use slightly dodgy procedures in operating it and that gave the British code-breakers at Bletchley Park an ‘in’ to the machine that they were able to exploit and hence read German secret messages.  Even when the Germans did suspect that someone had broken ‘Enigma’ they were so confident in their technologically advanced machine that they thought it impossible.

Enough said.

I’m not saying don’t innovate; that would be ridiculous.  Just think about your innovations and don’t automatically follow the ‘innovate or die’ mantra.  Take time out and read ‘Superiority’ and learn from it.

iPad – third way or solution looking for a problem?

Well, I guess that as someone with technical credentials I should comment on the unveiling of Apple's new tablet machine, the iPad.  The first thing I will say is that I'm not an Apple fanboi, and so am probably a hard audience to impress.  Anyway, here's what Apple have to say – I like that price tag, although I expect the usual dollar / pound sterling 'equivalence' will apply, giving a price range of £500 for the lowest-memory / WiFi option through to about £850 for the 3G / 64GB unit.    But, I have to say, at first glance it looks beautiful.  Take a look at this from the Engadget site (the start of the presentation is at the end of the page, and the images run in chronological order up the page).

At half an inch thick and about 9.5″ by 7.5″ it has a slightly odd page aspect ratio – it basically looks like an iPod Touch or an iPhone for giants. 🙂  It will run existing apps from the Apple App Store, and will also talk to iTunes to get media.  There is a 30-pin connector to charge through and connect to other devices – including PCs.  The unit comes with up to 64GB of memory, has a 1GHz bespoke processor from Apple, called the A4, WiFi as standard and 3G as an extra, a touch keyboard a-la-iPhone, GPS, an accelerometer for a motion-sensitive UI, etc.  Ah, what the heck – here are the technical specs.  No point in regurgitating what's elsewhere!!  Like I said, think of a wider, longer iPhone.

It looks good – the processor looks pretty capable, and if one were to appear in my birthday bag or Christmas stocking I wouldn’t say no. 

I have to admit that I'm old enough to remember Apple's first pass at pad computing donkey's years ago – the Apple Newton.  It was a concept ahead of its time.  This machine looks like it really hits the spot on so many levels, but I'm always a believer in 'never buy version 1.0 of anything', and I do have a few reservations in terms of both business and technology.

No SD slot – I appreciate that this seems a small thing when you're looking at something that can handle 64GB, but it seems to be a problem with Apple gear that they always ship it with less memory than you want.  I can see lots of applications where media could be distributed on an SD card for plugging into a gadget like this.

Battery life / replacement – not the lifetime of the battery in normal use – that 10 hours is pretty cool – but the problem of replacing the battery when it eventually fails.  Are we looking at a similar situation to that experienced with iPods, or have Apple learnt?

Software development – the Software Development Kit that is available is still, unsurprisingly, Mac-centric – based as it is on the iPhone / iPod SDK – and looks at first glance to be more of a conversion kit for existing iPhone / iPod apps than a new development environment.  It's not available for any platform other than the Mac, and Apple also charge for the privilege of belonging to the developer programme.  All in all, it seems a little short-sighted in terms of application development.  Whilst there are thousands upon thousands of available applications, the question is just how many are genuinely useful on a platform closer to a Netbook than a pocket phone.

Lack of 'open' connectivity – I would have liked to see a bog-standard micro-USB port rather than just the Apple docking port.

Having said all that – it's a nice piece of kit and one step closer to Star Trek.  I could see myself buying one and using it as a 'player / reader' for media, rather than as a portable work tool.  I could imagine it being given out at high-end conferences packed with stuff for delegates.  I could imagine it as a brilliant teaching tool.  I can see lots of uses, but whether it succeeds or not must surely depend upon bringing the price point down, opening it up a little and finding the killer application.

It has the potential to be a ‘third way’ between phone and Netbook, or a solution looking for a problem.  And I’m not yet 100% convinced which way it will go.  Ask me when we finally see the UK pricing.

Web 2.0 Jumps the Shark

There is a wonderful phrase in film and TV script writing – 'to jump the shark'.  It's that point in the history of a TV series where the scripts veer off into the surreal or the characters suddenly change their behaviour.  It's reputedly named after an episode of the popular 1970s sitcom 'Happy Days' in which the hero, 'The Fonz', ends up jumping over a shark on water skis.  Plausible, huh?

It struck me yesterday, after seeing a site that had been brought to my attention via Twitter, that Web 2.0 may very well be at the point of jumping the e-shark.

Now, Web 2.0 has revolutionised the way we put web applications together.  Before we go much further, Web 2.0 is like pornography; we know what it is when we see it but we’d be hard pressed to formally define it.  So, here’s what I mean by Web 2.0.  It’s a piece of jargon that is used to loosely define web sites and technologies that facilitate interactivity, inter-operability between web sites, sharing of user information and user driven content, whether text, image or multimedia content like video and animation.  Web 2.0 sites are typically those where the content displayed to you and other site users can be easily modified and configured by the user.  Facebook is a Web 2.0 ‘poster boy’; my Internet Banking site is good old fashioned ‘Web 1.0’.

A lot of the technology that has been developed to make Web 2.0 possible has found its way into all sorts of web sites – Google Apps, for instance, is a perfect example of the serious application of Web 2.0 technologies.

But for all the value, have we finally hit a point where many sites and applications being delivered as part of the Web 2.0 revolution are trivial, absurd and effectively worthless to the vast majority of web users – 'portfolios' for developers, or sites of interest only to the digerati, being passed off as the next 'big thing'?

Not that there's anything wrong with either of these directions, provided that we appreciate it and don't get ourselves so tied up in the joy of having a Web 2.0 site that we miss the point of what the site is supposed to be doing.

And so on to  http://omegle.com/ .  To save you the job of visiting, it's a chat site that allows you to talk to….total strangers, anonymously.  Yes, a technology that trumpets the fact that it facilitates communications between individuals the world over now allows stranger to speak unto stranger.  Maybe I'm being a bit hard on this site, but to me it encapsulates so much of what is wrong with some of the more over-hyped Web 2.0 applications.  It's no doubt regarded as 'cool' and 'clever' by some; it's essentially pointless and does little that can't be done elsewhere.  It's almost 'out of character' for the original aim of Web 2.0 – to facilitate communication and interactivity.  After all, anonymous communications are not that useful for most things.  And you have to admit that talking to randomly selected anonymous people is pretty surreal.  Assuming that the people on the other end are real people and not just 'bot' programs….

So…are we heading for Web 2.0 shark jumping in 2010?  And why is it important? 

Well, shark jumping almost always precedes the demise of the TV show.  And it would be a shame if the good stuff that the interactive web has brought us were to be drowned under a wave of over-hyped nonsense.

Iframes in phpBB

I am currently tinkering with a phpBB3 installation for a forum I ran until the summer of this year – Coffeehouse Chat.  I shut the site down then, but am now contemplating opening it up again.  However, I want to try a few new things out on the site, including some 'embedded content', where I include content generated elsewhere on my site in forum posts and pages.

The easiest way to do this seemed to me to be to use the HTML IFRAME tag, but I wanted to do it within the context of forum posts, and didn't want to get into having to create separate template pages for these special pages within forum threads.  I therefore decided to use BBCode tags to generate the IFRAME tags.

There are always warnings about implementing any form of BBCode that can in principle allow a user to put code from another site directly into your page – and quite rightly so.  However, I felt reasonably comfortable about the approach I was going to take: rather than make available a 'generic' BBCode version of an IFRAME tag, I was going to create a series of BBCodes that would only insert an IFRAME tag with a pre-specified URL and other attributes into the page.

The approach was as follows:

Install the code that I wanted to run in the IFRAME within a sub-directory on my web server.

Tweak that code so as to run within a window that would fit comfortably within the space available for a conventional phpBB forum post.

Within the phpBB administration screen, create a new BBCode to generate an IFRAME specific to the application in the sub-directory.  For example:

[Screenshot: defining the new BBCode in the phpBB administration screen]

Here I decided that to add my game of ‘Battleships’ to a page I would simply create a BBCode tag called [battleships].

Write the corresponding HTML code that will be inserted in the page when the BBCode is encountered.  In this case, it's as follows:

[Screenshot: the HTML replacement for the [battleships] BBCode]
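
In outline, the HTML replacement is something along these lines – the path and dimensions here are illustrative rather than the exact values I used:

<!-- BBCode usage: [battleships] -->
<!-- HTML replacement (the path and size below are illustrative examples): -->
<iframe src="/games/battleships/index.php" width="600" height="400" frameborder="0" scrolling="no"></iframe>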

Because the URL is pre-set to a location within my own site, there is no problem if users of the Forum choose to use the BBCode in their own posts within the Forum.

The BBCode command can thus be placed on any page and brings in content generated from the predefined URL. I've used this approach to embed some JavaScript applications in Forum posts, and it works very well as a means of delivering customised content within posts.

 

WPMU Installation to support sub-domain blogs

I'm currently renovating a site of mine – Coffeehouse Chat – with a possible view to re-opening the Forum side of it with new and improved features – including better integration with social media and user blog hosting on the site.  And there was the issue – I wanted to install WordPress-MU – the multi-user edition of WordPress – in such a way as to support user blogs in sub-domains of the main site domain – e.g. something like joesblog.blogs.coffeehousechat.co.uk

This is a two-stage process that is outlined in the documentation.  The first part is setting up wildcard DNS for the server, and the second part is installing a .htaccess file that actually handles the processing of the redirected incoming requests.

Installing the .htaccess file is nice and easy.  The file is below – it comes with WordPress-MU named as htaccess.dist – simply put it in the directory containing the WordPress software and rename it to .htaccess.

RewriteEngine On
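# NOTE: BASE on the next line is a placeholder - if the installer hasn't
# already filled it in, edit it so the line reads e.g. RewriteBase /blogs/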
RewriteBase BASE/

#uploaded files
RewriteRule ^(.*/)?files/$ index.php [L]
RewriteCond %{REQUEST_URI} !.*wp-content/plugins.*
RewriteRule ^(.*/)?files/(.*) wp-content/blogs.php?file=$2 [L]

# add a trailing slash to /wp-admin
RewriteCond %{REQUEST_URI} ^.*/wp-admin$
RewriteRule ^(.+)$ $1/ [R=301,L]

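# serve requests for files and directories that actually exist as-is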
RewriteCond %{REQUEST_FILENAME} -f [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule . - [L]
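# strip any leading blog name from wp-* and .php requests, and hand
# everything else to WordPress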
RewriteRule  ^([_0-9a-zA-Z-]+/)?(wp-.*) $2 [L]
RewriteRule  ^([_0-9a-zA-Z-]+/)?(.*\.php)$ $2 [L]
RewriteRule . index.php [L]

<IfModule mod_security.c>
<Files async-upload.php>
SecFilterEngine Off
SecFilterScanPOST Off
</Files>
</IfModule>

So in my case – WordPress-MU installed in a folder called blogs – this file goes into that folder.

Now, the second part – the wildcard DNS settings.  Some time ago, when I set up an installation of WordPress-MU, I had to get my hosting company to deal with this for me.  However, this time a little advice from Samuel at Prime Hosting showed me how to set it up from within cPanel, so I'm going to share that with you here.  If you're not using cPanel, there may be other ways in your own control panel to do this.

In my installation, WordPress-MU is installed in a folder called blogs off the root of my public_html directory.  I have set up a subdomain – blogs.coffeehousechat.co.uk – to point to it, so that when a user enters this domain they go to the blog create / sign-in page.  Now, after checking that this worked happily, I logged in to cPanel for the coffeehousechat.co.uk domain and selected the 'Subdomains' control from the Domains panel.

Now the cunning bit…note that this may not work for you in complicated web site set-ups where multiple redirects are involved – but it worked for me.

In the 'Create a Subdomain' box (below), enter '*' as the subdomain name – giving *.coffeehousechat.co.uk in my case – and enter the folder on the server where you want things to redirect to as the 'Document Root' – in my case public_html/blogs.

 

[Screenshot: the cPanel 'Create a Subdomain' box, with '*' as the subdomain and public_html/blogs as the Document Root]

Once this is entered, press the Create button.  The grid at the foot of the screen should be updated to reflect the changes just made:

[Screenshot: the cPanel subdomain list, updated to show the new *.coffeehousechat.co.uk wildcard entry]

And that is that! 

A user entering, say, www.test.blogs.coffeehousechat.co.uk will be directed to that blog if it exists, or be prompted to create it.
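
For reference, if your host doesn't offer cPanel, what the panel does behind the scenes amounts to roughly the following – a sketch only, with a placeholder IP address and paths, and your zone file and virtual host layout will almost certainly differ:

; wildcard A record in the DNS zone file (placeholder IP address)
*.coffeehousechat.co.uk.    IN    A    203.0.113.10

# ...and in the Apache virtual host for the site:
ServerAlias *.coffeehousechat.co.uk
DocumentRoot /home/username/public_html/blogs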