“There’s one thing we got to get, Heyes….

…and that’s out of this business!”

One of the highlights of the week for me in the early 1970s was the TV series ‘Alias Smith and Jones’, following the adventures of two outlaws on probation, Hannibal Heyes and Kid Curry, as they attempted to stay ahead of the law and out of trouble. At the start of every episode, we’d see the two being pursued on horseback, with Curry shouting the above lines to Heyes.

This week I finally decided that I need to get ‘out of the business’ of freelance web development.  I have a nice part-time day job, involvement with a startup, and currently enough freelance work to keep things ticking over.  But the freelance web work will never, ever, make me a good income again, and if I’m going to do anything with my freelancing time, I need to find something else.

What triggered this?  I quoted for a WordPress related job – install, configure, tweak the theme and apply a few small mods to the installation. Admittedly not one of the world’s great technical tasks, but a nice job.  I quoted at my ‘lowest rate’ – £20.00 / hr – this was a UK based customer, and I expected to take about 10 hours to do the job.  A mail arrived later in the day telling me that I’d not been successful, as another UK based freelancer had come in at a lower rate.  Of £5.00 per hour.

A fiver an hour.  Less than I’d get sweeping floors in McDonald’s. Rates like that are pretty common from suppliers of services based in the Far East, but from a UK based developer, it’s scary.  Because it means that the market for some types of development work has become commoditised, price driven and almost at the level of ‘will work for food’.

So…time to get out.  It’s no longer worth it.  Fortunately I have a few ‘specialist’ areas of software development I can fall back on, but I am wondering now whether it’s time to take a whole different approach.  With a flexible permanent job available to me, maybe it’s time to look at other things to do and leave software development work to the sweatshops of the Far East and the UK?

I’ve been thinking of things that are not ‘commodity’ – maybe get my old woodworking skills back?  Or try something new? Art? Something to do with my interest in vintage radio? Who knows.  Perhaps focusing on the permanent job and doing bits of freelance work or something new for ‘beer money’ is the way forward these days.

Very, very sad.  How long before other parts of our technology and ‘creative’ industries become sub-minimum wage sweatshops?

The Last of the Magicians

“He was the last of the magicians, the last of the Babylonians and Sumerians, the last great mind which looked out on the visible and intellectual world with the same eyes as those who began to build our intellectual inheritance rather less than 10,000 years ago.”

So wrote John Maynard Keynes about Sir Isaac Newton in a lecture he wrote, but never delivered, to commemorate the 300th anniversary of Newton’s birth on Christmas Day, 1642.

I was reminded of this observation the other evening when I was contemplating my ‘life and times’ as a developer cum writer cum ‘hacker’ (in the original, MIT, sense of the word) in the late 1970s and early 1980s.  My life and times with computers started as a spin-off from my interest in electronics as a boy – I built a simple computer in my teens, and then took the path of ZX81, Spectrum, BBC Model B, Amstrad 6128.  I was particularly interested in interfacing and robotics – and because of my prior interest in electronics I could handle both the hardware and software side of things.

I had an interest in, and an aptitude for, electronics, analogue and digital, and could understand my computers from the printed circuit board up, so to speak – on occasion doing the odd modification to circuit boards to fix things or improve matters.  Looking back, it was possible to trace those 8 bit home computers back through history to the special purpose computers created during and after World War 2 to help with code breaking and other similar applications.

And this was where I started from the other evening – in IT terms, hobbyists and hackers of the 70s/80s were able to trace their activities back into the ‘Sumerian Period’ of their interest; the world view was similar, just using chips rather than relays. Sometime in the 1980s it all changed; PCs, Macs, etc. came along with (eventually) the sort of day to day access to the Web, Apps and the Internet that we regard as normal. That ability to get involved with all aspects of the machine, from wires to Windows, disappeared.

It seems to me that there is a line back from where we are now, through the 2000s, into the 1990s and back to those original Macs and PCs – then a hiatus – then back from the home micros via mainframes to the days of Turing Machines, Colossus and ENIAC.  And those of us who remember hacking code and hardware on computers big enough to get your fingers inside are maybe not the first scientists of the modern IT age, but perhaps we are truly the last of the magicians.


‘Does Microsoft move the cheese too often?’

Every now and again I bother to read something about my profession.  I know that this sounds rather bad – continuous professional development and all that stuff – but usually I’m too busy doing stuff to read about what others are doing or what I might be doing in 2 years’ time.  And so I encountered this little piece:

http://www.techrepublic.com/blog/programming-and-development/poll-does-microsoft-move-the-cheese-too-often/5317?tag=nl.e055

I am a professional .NET developer (OK…I make money by writing code using .NET – my professionalism is up to my clients and employer to comment on!) and yes, it’s a rapidly changing world.  But that doesn’t mean that you have to adapt what you’re doing all the time to keep up with Microsoft.  Now, I can already hear the ceremonial disembowelling cutlasses being sharpened by more hardcore developers, but let’s continue…

I still write code using the .NET 2.0 and .NET 3.5 frameworks, as well as .NET 4.0.  Why? Three reasons:

  • The earlier frameworks often do everything that the application needs to do.
  • I understand how they work better than .NET 4.0.  So, I find it faster to create code and hence solve the customer’s problems.
  • The customer may not have (or want to have) the most up to date framework on their machines.  And who am I to say otherwise if the earlier stuff does the job?

Whilst there are some very sensible reasons for making use of the most current, stable version of any technology, it’s worth remembering that many people don’t care what you develop their software in as long as it works, is maintainable and doesn’t cost them the Earth.  The preoccupation with newer, shinier stuff comes mainly from us – the developers – who get hooked into the stuff that the tool makers – Microsoft et al – produce. If we said ‘No, bugger off’ more frequently things might settle down.

I also develop software in PHP and JavaScript, and maintain a lot of legacy stuff in Microsoft VB6.  I do this because, bluntly, people pay me to do it.  And therein lies the answer to the question above.  Microsoft change stuff reasonably frequently – that’s their privilege.  It’s also our privilege not to get roped into the constant change process.  Remember WHY we write code – it’s to solve problems – not to keep the developers at Redmond in gainful employment.

Remember our customers – they’re the people who should matter to us – not Microsoft’s behaviour.


And Facebook carries on downwards….

In my last post I questioned the value given to Facebook in their IPO.  It became clear at the close of first day trading that things hadn’t gone according to plan; the usual ‘Day One Spike’ associated with high profile technology IPOs just didn’t happen, and I have a feeling that the only reason that Facebook didn’t end the day lower than it did was because the underwriters of the IPO bought up stock to shore up the price.

Of course, that’s what the underwriters of IPOs are partially there for; they pick up the spare stock and keep the price up, but given that the slump has continued for two days’ trading now, I am beginning to wonder whether the initial price was artificially inflated for the egos / benefit of those involved.  If so, that is totally unacceptable because it means that the ‘civilian’ buyers of the stock – those not in on the game – paid over the value of the company from day one, and it is increasingly possible (and indeed likely) that everyone involved in the launch KNEW that.

There is an interesting article here, by Michael Wolff, that states what quite a few of us have wondered for a while.  Given that, beneath the hype, branding and bells and whistles, Facebook is an advert supported site, how on earth do they expect to make the sort of money that a 100 billion dollar price tag suggests?  And Wolff should know; he wrote the book ‘Burn Rate’, which documented the crash and burn of the first dot-com boom a decade ago.

The problem is that it’s not just Facebook at risk here; a bad IPO in a sector colours the views of those preparing other stock market launches.  Out there are lots of technology start-ups, all wondering about financing.  A solid Facebook IPO might have led to a market place more willing to put money into smaller companies that needed far less of it, and that might even have been producing goods and services of greater value than the ability to play Farmville or post statuses.  And by ‘solid’ I mean exactly that – a sound valuation, one that didn’t require urgent underwriter support, that showed healthy secondary trading in the days after launch, and that showed steady growth as people realised that Facebook had some value that could be exploited, given the availability of money.  If the fall continues, it may well be that a valuation of 40 to 50 billion dollars was MUCH more realistic.  After all, that’s what an IPO was ORIGINALLY supposed to do – give some money to the founders, but mainly give the company money to develop.

As it is, Facebook was clearly over-valued – whether by intention or accident we don’t know – and analysts are finally asking the questions that should have been asked months ago.  Facebook’s last minute purchase of Instagram to try and grab mobile traction looks increasingly like panic.  The company may settle down to a value of around $50 billion – still, IMO, overvalued, but the markets could probably live with it.

But back to those other companies in the wings.  Based on previous technology trading cycles, a bad high profile IPO:

  • Makes the companies queuing up to do an IPO pause in their tracks.
  • Reduces the value of such IPOs and dents market confidence.
  • Causes investors in any technology companies to remember that there might be a downside risk and so be more careful about investing – which isn’t always a bad thing, but sucks if you’re a ‘real value’ company.

Historically this tends to lead to a deflation of the tech market place – the last thing we need now.

As is said in the Pythian Scrolls in Battlestar Galactica : “All this has happened before, and all this will happen again.”

Happy (Belated) Birthday Speccy!

It always takes me a while to catch up with things, but it was recently the 30th Birthday of the Sinclair Spectrum. Like many of a certain age, the Speccy was one of the computers on which I cut my teeth.  I was lucky enough to have been exposed to most of the popular home computers of the late 1970s and early 1980s by virtue of my first job and the fact I wrote articles for the computing magazines of the time.

I’d already bought a ZX81, and became the Z80 machine code guru for my employer – I was also writing books for Melbourne House on the Z80 based MSX machines – and the two worlds overlapped when I was asked by my employer to develop a way of extending the BASIC language of the Sinclair Spectrum to allow new commands to be added to the language.  I managed to deliver the goods – oddly enough, around the same time a magazine article was published that detailed a similar approach to my own – and I added writing books on Spectrum machine code programming to my repertoire.

I also wrote a fair number of articles about programming and interfacing the Sinclair machines, designed interface cards for them for my employer and dabbled in a little light robotics, but rarely actually USED the machine for anything!  When I needed to write those articles and books I used my BBC Model B, which had a proper keyboard.  How I hated that rubber monstrosity on the Spectrum – the later Spectrum 2 had a better keyboard and made life easier, but one still had to deal with the multi-function behaviour of the keys.  I think that was the single biggest hitch with the Spectrum; had the ‘dead flesh’ keyboard just had ‘normal’ keyboard functionality, where you typed stuff in letter by letter, life would have been easier.

Still, I can’t grumble.  This was in the days when if you were good enough to write and have your material accepted by a publisher, you got paid for it.  This may seem something of a novelty these days when blogging and other forms of self-publishing seem to have ripped the heart out of traditional (OK, paid!) technical writing, but those magazine cheques of £40 or £50 went a long way!

I think that the Spectrum was one of two machines I bought (the other being an Amstrad 6128) that actually paid for themselves from my writing.  That immediately makes the Spectrum special to me.  I also learnt a HELL of a lot from it about low level programming, hardware interfacing, robotics and the Zen like patience needed to manage that keyboard and a tape recorder for saving and loading programs…..

Funnily enough, 30 years later, I spent several hours in my current day job looking at an interfacing problem involving a PIC Microcontroller.  And the solution I eventually suggested was one that I dragged up from my Spectrum interfacing days….

If it ain’t on your machine, it ain’t yours.

Yesterday I found out that Yahoo had pulled the plug on the Delicious application, amongst a few other APIs and services.  There will no doubt now be a spate of articles about how to move your content from these applications to somewhere else, and it may be that new services spring up out of the Internet eco-system to fill the gap.  But hopefully the users of these systems will have learnt a valuable lesson:

If it ain’t on your machine, you cannot rely on it being there.

This isn’t rocket science for those of us who cut our computing teeth in a pre-Cloud, pre-WWW world, but it was pointed out to me the other day that there are now large numbers of children and teenagers who have never lived in a world without the WWW.  Scary.

A couple of years ago a Forum that I was an occasional contributor to shut up shop in a sudden and pretty final manner – the owner simply closed the shutters with little warning.  For me this was vaguely annoying but no biggie, but for other users of the Forum, who’d contributed some pretty substantial articles and intellectually robust commentary over a period of time, it was almost the equivalent of Edmund Blackadder having his novel burnt by Baldrick.  Of course, the site owner was perfectly within his rights to do this – free forum and all that.  But the general feeling was that a form of social contract had been broken.  However, one could easily say that the authors had not taken backups of their content…

I mothballed a forum myself a year or so back – it’s still online, all content there, but posting has been disabled.  I have to say that in these times of almost limitless server space and cheap hosting it almost seemed churlish to pull the work of others. 

But there may well be a point at which I let the domain go or re-use it for something else.  It’s perfectly within my rights to do so, and that content will then exist only as a zipped up backup on a DVD somewhere, and anyone who posted anything there, who wants it back and didn’t take a copy will have to whistle.

And there is the issue; the ‘universal availability’ offered by browser based applications, the Web and the Cloud means that many people no longer own their own data, in anything but an intellectual property sense.  They don’t know where it is stored, they don’t know who gets to look at it, search it or mine it.  They don’t know how often it’s backed up, and have an assumption that ‘someone’ will be taking care of it.  The increasing focus of Operating Systems on hiving off document and data storage to servers ‘out there’ in the Cloud or on the Internet (like Google’s new Chrome OS) is regarded as a great positive by those involved in Internet service related businesses – after all, it could well be the next big thing in what you can be charged for – always something folks like. 🙂

There is something rather neat, in my opinion, about having your data on your hardware, under your control.  Yes, it’s your responsibility, but we need to start regarding personal or household data in the same way previous generations have looked after old letters and photographs.  If you need to work on stuff whilst away, then why not just put the files in question on USB sticks?

And finally, data ‘out there’ is subject to the legislation and jurisdiction of whatever country the servers lie in.  You might want to look at things like the US PATRIOT Act before saving your data anywhere that crosses US jurisdiction.  Whilst you might not think you’re a terrorist or a troublemaker, the definitions these days are flexible.

Ultimately, there is something rather reassuring about having your data at home, under your roof, where the only way it can be seized or searched is when the stormtroopers kick the door in.

The Death of Google Wave

Not for Google Wave the sudden death; more a slow, drawn out lingering farewell on the life support machine of ‘development has been stopped’. I guess it gives the boys at Mountain View the opportunity to change their minds if the pressure gets too much. The demise of Wave doesn’t actually surprise me; I’m surprised that it’s lived as long as it has done.

Here’s the story of my experiences with Google Wave.

When it was first announced, I wasn’t quite sure what to make of it – a sort of mash-up of email, instant messaging, social networking, blogging and online discussion forum. I received my invitation and got signed up. I have to say that I wasn’t an early adopter – to be honest I wasn’t sure what I was going to use it for and I’m past the stage in my life where I have to try out all new technology the day it comes out – life is way too short to be someone else’s Beta-Tester….

And there we hit problem number 1. I knew that Wave would not work with IE, so I signed in with Firefox, and had a few problems there as well. OK, Google, you want me to use Chrome so I will do – and I was sorely disappointed when I still couldn’t get the equivalent of a profile set up on my Wave account – the special form of Wave that stores such information just wasn’t playing with me. I contacted Google technical support, scoured discussion groups and found that others experienced the same problem. I was told by Google that it was something to do with my account, but not how to deal with it. Various other folks suggested that it was ‘just one of those things’ that might get fixed at some point, but for now it was a problem that bothered some users.

OK…I could live with it.

The second thing is that getting a Wave account is rather like buying the first telephone in your circle of friends – because of the social nature of Wave you need a few friends on it to make it worthwhile. You can use it without other folks in your network using it – but that rather misses the point. So, next, find your friends. And that was the next sticking point for most of the IE-using, Firefox-using non-techies that I knew – why should they bother trying to get on to a new social networking / communications / chat / mail / what-have-you system where most of their friends AREN’T?

However, I have a number of techy pals and people who’re interested in emerging technologies, so I got a few folks on board.

OK…I could live with it.

We then hit the issue of exactly what to do with Wave. For one project we did try using it to discuss design ideas and such, but we found that it was more convenient to use an existing issue / bug handling system already in place for the organisation. Another couple of people I knew attempted to kick off various waves but it just felt like we were using Wave for the sake of using Wave. I was reminded to some degree of a great piece of software (IMO) from the 1980s called Lotus Agenda – it did all sorts of clever stuff but conceptually was a mare to get your head around – but at least Lotus provided a few samples of what could be done.

And I think that this was, in the end, the thing that did for Wave as far as I was concerned – I couldn’t honestly think of an application within my circle of friends and professional contacts that couldn’t be done better with a different tool. There’s an approach to software utility development that I often adopt, one I was taught very early on in my career: build tools to do specific jobs very well – and if possible, make those tools so that they’ll talk to each other. Wave attempted to combine e-mail, social networking, instant messaging, file sharing and online discussion forums in a way that didn’t really give the advantages of the individual technologies, but required a change in working practice, in many cases a change of browsing software, and a cultural / behavioural change amongst participants to get them ‘on board’.

And that’s why I’m not terribly surprised that Wave hasn’t taken off; I am hopeful that if Google release the code in to the wild as an Open Source project we might see some new projects spring from it. But I’m still to be convinced that the ‘Wave’ concept of multi-mode online communication all in one place is going to be popular – especially if it requires you to sign up to yet another site and maybe even change browsers.

I write software…to solve problems

Well, it’s a while since I wrote a blog post so why not kick off with a slight bit of professional heresy.  I write software for a living; have done for over 30 years, starting with SCMP microprocessors in my teens (yes, I was THAT sort of teenager…) and working through everything in between until now when I spend my time split between .NET, JavaScript and PHP.

Now, why do I write code?  Well, occasionally I do it for fun, but mostly I do it for profit – my clients pay me to do it.  Actually, that’s not right.  My clients pay me to solve their problems for them using software. 

I’ve never been one of the great ‘geeks / hackers’ in life; I’m a radio amateur and electronics whizz, and the closest I ever came was in my teens and early twenties when I was fiddling with low level stuff like analogue to digital converters and the like; but pure software geekery has never been me.  I used to say to people that I was a reasonable programmer but an excellent developer; now I’m more likely to say I’m an excellent problem solver.

Don’t get me wrong; I have an active interest in my profession, from the perspective of how I can deliver better service to my clients in delivering what they want from me.  And I like to think that I write sound, efficient and effective code.  I create data structures, create objects to model those structures and business processes, create code to implement these abstracts and put something on my client’s desktop or web server that allows them, bottom line, to make more money or save more money.  I also write code that is easy to follow and maintain, that has sensible variable names, that I document and leave a pile of useful information with my client.  And I’m there for them when needed.  I love it when I get a call from a client who tells me ‘We needed to add a new feature, so we took a look at the code and documentation and we think we’ve done it right, but next time you’re in, could you give it a quick look?’ – the ultimate accolade for me – I’ve delivered code that others can pick up and run with.

I’m methodical, but don’t have what you could call a methodology; I was recently asked whether I was Agile, and I almost replied that I used to be, but since I tore my knee cartilage a few years back I’m not as nimble as I once was.  Do I practise Extreme Programming?  Not really; I’m more Church of England, middle of the road, myself….

I’ve started to notice that there are two broad categories of software developers: those who work for software houses or in large development teams, where words like Agile, Extreme, kaizen, dojos, user stories and sensei are the common parlance, and those who work very closely with business and organisational problems, where the usual words that define a day at the coalface are fix, solution, feature, document, debug, budget and timescale.

I like to talk to my clients in their language; I’m afraid I still work in a world where businesses have processes, not user stories, and where they don’t particularly care what technique I use behind the scenes as long as I deliver working, maintainable and efficient code, to budget and on time.  I’m sure that the software house methodologies work effectively, but do they provide yet another layer of obfuscation, bureaucracy and abstraction between what we do and what our clients and customers want us to do – solve their problems?

No matter how much we dress things up with Japanese words (and I speak with some knowledge and experience of Japanese culture and management) we must not lose track of what we do and why we do it; we solve problems by developing effective software systems delivered on time and to budget.  That is all our clients care about; we’re not ninjas or ronin; we’re professional programmers and problem solvers.

I guess what I’m saying to developers is don’t fetishise what you do to the point where the process becomes more important than the product.  It’s rare that I have much good to say about Steve Jobs and the slavering behemoth that is Apple, but he did once observe that real artists ship.  And that’s what it’s all about.

I was right to blame it on sunspots!

Early on in my consulting career – late 1980s, early 1990s – I did a lot of work for a public sector organisation.  I worked on a number of projects – this was in the days when IT consultants could still be generalists, applying their skills to whatever was needed – and tended to specialise on development of a few database applications that were centrally based and accessed over a (pre-Internet) wide area network, held together by leased lines, private cabling, etc.

All in all, a fantastic environment in which to hone your skills.  Actually, in many respects I was rather spoilt by this client – and by my first job out of university – they both gave me a rather distorted view of working life!  For a while we experienced some rather ‘odd’ problems on some of the applications running over the wide area network.  Despite our best efforts, we couldn’t run the problems to ground – we checked software, hardware, cabling, the works.  Eventually, and half jokingly, a colleague and I (both of us radio amateurs) decided that the problems were somehow being caused by sun spots….

Unsurprisingly, this caused gales of laughter in the office, but as far as we were concerned there was an element of logic in our proposal.  We knew that sun spots and solar activity in general had an effect on the earth’s ionosphere, and that in the past bad solar storms had knocked out telephone and communication systems.  Indeed, in the pre-Internet, pre-computer days of 1859 a major solar storm had caused incredible effects, even causing telegraph wires to carry electrical currents when all the batteries were disconnected!

This information did little to convince people around the office, so we simply did what any other self respecting techie would do; turn things off and on, replace a few network cards and bridges, tighten connections and tweak software.  And the odd errors stopped, and we stopped worrying about it.

But over the years I’ve thought about those gremlins on numerous occasions, and it now appears that we may have been right after all.  According to this article, solar storms can cause mystery glitches in communication and computer systems.

It may be that the next time we get a big solar storm or Coronal Mass Ejection – when a massive plume of plasma and charged particles is thrown from the sun out into space – the impact will be much more than a few gremlins in the works.  Some have suggested that a storm similar to that of 1859 might cause massive damage to the electrical and communications systems of the world; indeed, some real pessimists have suggested that a BIG solar event might put us back into the pre-electronics age for decades.

Let’s hope we don’t get it…

Configuring MOWES on a USB Stick

There’s an old saying that you can never be too thin or have too much money.  I’d like to add to that list – you can’t have too many web servers available on your PC.  For the non-geeks amongst you, a web server is a program that runs on a computer to ‘serve up’ web pages.  Because I write web software for part of my living, I run my own web server on my PC.  Actually, that’s not quite true…because there are two main web servers in use today – Microsoft’s IIS and Apache – I have two.  And today I decided that it would be really useful to have a web server and associated software on a USB stick that I could plug in to computers to demonstrate my web applications out on client sites.

I decided to use the MOWES installation – after all, it’s designed to run on USB sticks – and as well as the standard Apache, PHP and mySQL I chose to install Mediawiki and WordPress.  Besides using the stick for demonstrations, I wanted a portable Wiki for note taking / book research when I’m on my travels, and a demonstration instance of WordPress.

Installation

The simplest installation involves putting a package together on the MOWES website, downloading it to your PC and installing it.  To get started with this, Google for MOWES and select what you want to install.

NOTE – when this post was written I pointed to a particular site.  That site – chsoftware.net – now reports back as a source of malware, so I’ve removed the link.

For my purposes I chose the full versions of Apache, mySQL 5, PHP 5, ImageMagick, Mediawiki, WordPress and phpMyAdmin.  This selection process is done by ticking the displayed checkboxes – if you DON’T get a list of checkboxes for the ‘New Package’ option, try the site again later – I have had this happen occasionally and it will eventually give you the ‘ticklist’ screen.

Tick the desired components and download the generated package.

Plug in your USB stick, and unzip and install the MOWES package as per their instructions.  The first thing to note here is that you may need to keep an eye out for requests from the computer to allow components through the firewall.  The default settings will be Port 80 for the Apache web server and 3306 for mySQL.  If these aren’t open / available – especially the mySQL one – then the automatic install of the packages by the MOWES program will fail miserably.
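If the firewall prompt doesn’t appear and a port needs opening by hand on a Windows Vista / Windows 7 machine, a rule along the following lines, run from an administrator command prompt, should do the trick – the rule name is just an example of my own, and you’d repeat the command with localport=3306 for mySQL:

netsh advfirewall firewall add rule name="MOWES Apache" dir=in action=allow protocol=TCP localport=80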

Once you have the files installed on your memory stick, then you can configure them.

Configuration

If you never intend to run the installation on any PC that has a local web server or instance of mySQL, then you don’t need to do anything else in terms of configuration.  You might still like to take a look at the ‘Tidying Up’ section below.

If you ARE going to use the USB Stick on PCs that may have other web servers or mySQL instances running, then it’s time to come up with a couple of ports to use for your USB stick that other folks won’t normally use on their machines.  The precise values don’t matter too much – after all, the rest of the world won’t be trying to connect to your memory stick – but be sensible, and avoid ports used by other applications.

I eventually chose 87 for the Apache Web Server, and 4407 for mySQL – 87 fitted with my own laptop where I already have a web server at Port 80 and another one at Port 85, and I run mySQL at the standard port of 3306.  NOTE that if you run the installation using an account with restricted privileges, you may not be able to open the new ports you use.

In order to configure the MOWES installation you’ll need a text editor of some sort – Windows Notepad will do at a push.  You’ll be editing a couple of files on the USB stick, as follows:

apache2\conf\httpd.conf

Open this file up and look for a line starting with Listen.  Change the number following it to the number you’ve chosen for your Apache Port – e.g. 87.

Now look for ‘ServerName’ – change the line to include the Port number – e.g. localhost:87
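After both edits, the relevant lines in httpd.conf should end up looking something like this (assuming you’ve settled on port 87, as I did):

Listen 87
ServerName localhost:87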

php5\php.ini

Open this file and find the line starting mysql.default_port.  Change the port referenced in this to the Port you have chosen for your mySQL installation.  E.g. mysql.default_port=4407

mysql\my.ini

Open the file and look for two lines like port=3306 and change the port number to the one you have chosen – e.g. port=4407.  There will be two such lines in the file, one in the [client] section and one in the [server] section.
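Assuming a mySQL port of 4407, the relevant fragments of my.ini end up looking something like this:

[client]
port=4407

[server]
port=4407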

www\phpmyadmin\config.inc

This is the configuration file for the phpMyAdmin program that provides a graphical user interface onto the mySQL database.  Look for a line that starts with $cfg[‘Servers’][$i][‘port’] and replace the port number in the line with (in this example) 4407.
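The edited line should end up looking something like the following – the exact quoting may vary slightly between phpMyAdmin versions:

$cfg['Servers'][$i]['port'] = '4407';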

And that, as they say, is that for the configuration files.  You can now start up the MOWES server system by running the mowes.exe program.  If all is working, after a few seconds your web browser will be started and will load the ‘home page’ of the MOWES installation.  With the configuration carried out in this article, the browser will show the url http://localhost:87/start/ and the page displayed will show links to WordPress, Mediawiki and phpmyadmin.

WordPress Configuration

The final stage of configuration is to make a change to WordPress that allows WordPress to run on a non-standard Apache port.  This needs to be done via phpmyadmin, as it involves directly changing database entries.  Open phpmyadmin, and then open the wordpress database from the left hand menu.

Now browse the wp_options table.  Find the record where option_name is ‘siteurl’ and change the option_value field to (for our chosen port number of 87) http://localhost:87/wordpress.  Now find the record with option_name of ‘home’ and again change the option_value to http://localhost:87/wordpress.
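If you’d rather not edit the records by hand, the same change can be made from the SQL tab in phpmyadmin with a couple of statements along these lines (this assumes the standard wp_ table prefix used by the WordPress package):

UPDATE wp_options SET option_value = 'http://localhost:87/wordpress' WHERE option_name = 'siteurl';
UPDATE wp_options SET option_value = 'http://localhost:87/wordpress' WHERE option_name = 'home';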

Tidying Up

You may like to put an autorun.inf file in the root of your memory stick, so that when it is plugged in to a machine it will automatically start the MOWES system (if the machine is so configured).  The file can be created with a text editor and should contain the following:

[autorun]
open=mowes_portable\mowes.exe
label=Your Name for the Installation

And that’s that!

Enjoy!