
Transition to the Mac side complete

For a long time I never thought I would get to this point, but it’s now official: I don’t use Microsoft Windows any more. I have decided to stop using my PC at home in favour of my Mac Mini. My home computer, work computer and laptop are now all Macs. All the servers I am responsible for run Linux, save for just one virtual server which runs a couple of customer ASP sites under Windows 2003 Server. This means that, to all intents and purposes, Microsoft Windows is no longer part of my life after twelve years of use.

I thought long and hard about using the Mac Mini in preference to my PC. Its specification isn’t as good: it only has a 1.5GHz Intel Core Solo processor versus the PC’s 3GHz Pentium 4, and until yesterday it only had 512MB of RAM, which made it difficult to use. However, it now has 1.25GB and the upgrade has made it perfectly usable. I do lose my multiple monitors, though, since the Mac Mini only has one DVI port and its limited upgrade path means that there is no way to add another. This I had to consider very carefully, so I gave it a go with just one 20″ 1600×1200 display for a couple of days and, you know what, it’s worth the sacrifice.

So the PC’s going in the loft along with one of my screens. Yeah, it’s nowhere near as fast as my fantastic computer in the office, but for the odd day in the week working from home it’s absolutely fine. I’ll be replacing it with a Mac Pro of my own in due course anyway, at which point I’ll be back to two monitors.

The memory upgrade in the Mac Mini, incidentally, was a right pain in the arse. If you’ve ever attempted to get inside a Mac Mini you’ll know exactly what I’m talking about, what with the fucking putty knives and brute force required. Still, it’s done now, and at £30 for a 1GB memory module from Crucial it was well worth it.

So yeah, now I’m a full time Mac Snob. Added to that, Chris got a Macbook Pro this weekend (from his parents, officially for work), so between us we now have an example of each model in the current Apple range (Mac Mini, iMac, Mac Pro, Macbook and Macbook Pro). How sad!


Charge for 5 minutes, drive 500 miles

This CNN article discusses a company that’s working on a new type of energy storage technology designed for electric cars. If it works as it’s supposed to, it will charge up in five minutes and provide enough energy to drive 500 miles on about $9 worth of electricity.

Something about that doesn’t sit right with me, and after a bit of maths, I know why. Consider the following, and I’d appreciate a correction if I’ve somehow cocked up these equations.

  • Electricity costs around 5.5 pence per kilowatt-hour (kWh) in the UK.
  • $9, at today’s exchange rate, is £4.75.
  • £4.75 therefore buys us 86 kWh of electricity.
  • The technology claims that it can suck up this amount of electricity during a five-minute charge.
  • 86 kWh in 5 minutes equates to 1,036 kWh in an hour, meaning that this technology requires a 1,036 kilowatt power supply.
  • That’s just over 1 megawatt.
  • At 240 volts (where, broadly, amps = watts divided by volts), a 1 megawatt supply requires 4,166 amps.
  • 4,166 amps is roughly 70 times that provided to a normal domestic premises, assuming a 60 amp UK domestic supply.

I’m preparing to wield the big “BOLLOCKS” rubber stamp, but before I do, let’s run those figures again assuming that everything’s in the US, so that differences in cost and specification of electricity supply between the USA and the UK aren’t affecting the judgement:

  • Electricity costs around 4 cents per kilowatt-hour (kWh) in the USA.
  • $9 therefore buys us 225 kWh of electricity.
  • The technology claims that it can suck up this amount of electricity during a five-minute charge.
  • 225 kWh in 5 minutes equates to 2,700 kWh in an hour, meaning that this technology requires a 2,700 kilowatt power supply.
  • That’s 2.7 megawatts.
  • At 110 volts (where, broadly, amps = watts divided by volts), a 2.7 megawatt supply requires 24,545 amps.
  • 24,545 amps is roughly 204 times that provided to a normal domestic premises, assuming a 120 amp US domestic supply.
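
Before I bring the stamp down, here’s a quick back-of-envelope script that reproduces both sets of sums, in case anyone wants to check them or plug in their own figures. The prices, exchange rate and voltages are the assumptions from the bullets above:

```python
# Back-of-envelope check: what size of supply does a five-minute charge need?

def required_supply(cost, price_per_kwh, minutes, volts):
    """Return (energy in kWh, power in kW, current in amps)."""
    kwh = cost / price_per_kwh    # energy the money buys
    kw = kwh * (60 / minutes)     # power needed to deliver it in the time
    amps = kw * 1000 / volts      # I = P / V
    return kwh, kw, amps

# UK scenario: $9 = £4.75 at 5.5p/kWh, 240V supply
print(required_supply(4.75, 0.055, 5, 240))  # ~86 kWh, ~1,036 kW, ~4,318 A
# (the bullets round the power down to 1 MW, which gives the 4,166 A figure)

# US scenario: $9 at 4 cents/kWh, 110V supply
print(required_supply(9.00, 0.04, 5, 110))   # 225 kWh, 2,700 kW, ~24,545 A
```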

It’s REALLY bollocks then. *STAMP*


Wireless: Not the convenience utopia everyone thinks it is

You know what? I fucking hate wireless networks. It seems that every single day of my life I’m somehow supporting or fixing a wireless network, whether that’s at home, in the office or for a broadband customer. Wireless simply isn’t the convenience utopia that it’s made out to be. It’s unreliable, insecure and absolutely no substitute for a proper wired network, despite what some bedroom network consultant idiots I’ve overheard in the Institute of Directors claim.

At work I’m constantly being asked what the WEP key is, even though it’s the name of a reasonably popular children’s television programme, which is hardly difficult to remember. Then as soon as a website doesn’t load or whatever, it’s immediately assumed that the wireless network has gone down and I get a phone call or a shout across the office as if it’s somehow my fault.

At home I get friends coming round who decide that they want to check their e-mail or whatever and so ask if they can use my wireless. I give them the 26-character WEP key and of course this is a big old hassle for them to enter into their PCs (which of course they have to do twice on Windows machines) and there’s lots of huffing and sighing, as if it’s some fucking huge inconvenience for them to use my Internet connection for free.

Then there are the hotels which claim to offer a wireless network service but it actually transpires that all they’ve done is install a couple of crappy wireless access points here and there and haven’t actually checked that it’s usable in all parts of the establishment. The Crowne Pointe is a fine example of this. Luckily they also have wired connections in the rooms, but that of course meant that we had to drive all the way to Hyannis (some 50 miles away) yesterday to get a network cable, because the wired port is behind the dresser on the other side of the room.

Then you get the people who insist on having a wireless network, but who also want it to be 100% secure. It’s not going to happen. If you want a secure computer network, don’t connect a wireless access point to it. You must choose between the “convenience” of wireless and a secure network; you cannot have both, especially when you don’t want to invest in RADIUS servers, secure certificates, and all the other stuff that’s associated with WPA encryption, which is itself no guarantee of 100% security.

Don’t get me wrong, wireless does have its place, but by no means should it be considered an all-encompassing solution for network connectivity requirements. It’s limited, unreliable, insecure, and a lot of the time just not worth the hassle. It’s been improving over time, and will continue to do so, but it’s not mission-critical just yet. Until it is, use it at your own risk, and if it goes wrong, use a fucking cable.


The Cult of Mac

Apple Mac Pro

What’s going on then? Why have I suddenly turned into a Mac weirdo recently? How do I justify this after spending so many years slagging off Macs and their users?

It’s probably exactly what you think: a combination of being utterly tired and pissed off with Windows (with XP now being over 5 years old and not due to be replaced for another 9 months at the very least) and much improved offerings from Apple over recent years. I don’t think that either of these things on its own would have been enough to convince me, and I expect Apple probably knew that too.

Yes, there’s Linux, and that’s good for many things, but Ubuntu (the closest thing I’ve seen to a usable Linux desktop) simply didn’t make me happy enough for me to commit to it. I never felt 100% at home with it (a reminder that I ran it on my old laptop for a number of months), and despite its advances it still required a large amount of tweaking to get it working with all my laptop’s hardware. This isn’t the case with my Mac. It really does “just work”.

Problems I had with Macs previously included:

  • Expensive hardware with poor performance: You used to pay around twice the amount you would have for a modern PC, for a specification around half that of said PC. Even taking into account the assertion that Macs needed fewer resources to perform the same functions, it still didn’t add up, and I saw little reason to spend so much on so little.
  • Rubbish operating system: Mac OS 9, in my admittedly limited experience, was a proprietary, buggy piece of shit and it was long overdue to be replaced (rather than just another release). It did things in its own way that made sense only to itself and to “native” Mac users, but that were completely baffling and counter-intuitive to anyone else. This put up a huge barrier to use and wider adoption.
  • Requirement for commitment to migration: While Mac OS has always enjoyed reasonable software support, it was still relatively limited when compared to Windows, for better or for worse (meaning that a huge range of software availability for a platform isn’t always necessarily a good thing).

Since then, Apple appears to have sat up, listened, and implemented a successful strategy for getting people to defect, including (but not limited to):

  • Cheaper hardware: I expect that this has largely been brought about by the introduction of Intel CPUs into all Macs over the past 18 months or so, but Apple made efforts previously to introduce more low-end models to entice people who simply couldn’t justify large expenditure on something that, for them, was untried and untested. Things like the Mac Mini bridged this gap, allowing people to dip their toes in the water with relatively low risk. I am one of those people, and as a result here I am typing this on a brand new MacBook, a very capable laptop computer that, at £632 including NY state sales tax, is cheaper than the vast majority of Windows-based laptops from other manufacturers. £600 will buy you a Windows laptop, but it won’t be a very good one.
  • Much improved operating system: The BSD based Mac OS X was a gigantic leap forward for Apple. It immediately attracted people from a UNIX background at a time when UNIX desktops left a lot to be desired. Its UNIX roots also obviously made Mac OS X extremely stable compared to Windows and Mac OS 9. While it still requires a little bit of getting used to by non-native Mac users, it can be picked up very quickly; certainly this was true for me.
  • Introduction of Intel CPUs: This has brought all sorts of advantages, from cheaper components (leading to cheaper products), through generally faster machines, through to the ability to actually run Microsoft Windows on a Mac alongside Mac OS X. Apple are quite correct in stating that many people will now have no excuse not to switch to a Mac. Mac OS X enjoys splendid software support, but even if that doesn’t prove to be enough and you’ve got some obscure Windows software package that doesn’t have an equivalent, you can still run it.

Other things that I really like about Mac OS versus Windows in particular:

  • Fairer licensing: Microsoft want money for each and every installation of Windows without exception, no matter who uses it, what it’s used for, or how often it is used. At around £300 for each installation, this is unfair and expensive, and now they’ve got their blasted product activation system to ensure that they get their readies. You have to be a large company in order to enjoy any sort of significant discount. Mac OS is not only significantly cheaper at £89, but spend £40 on top of that and you get to install it on up to five computers in your household, legally. Microsoft would want over 10 times that amount for the same privilege (assuming Windows XP Pro).
  • I didn’t have to spend hours uninstalling legions of useless crap when I bought my computer. My last two laptops and my desktop PC at work all came laden down with so much rubbish that it took me hours to remove it. Only after I had done so did the computer start to perform as expected. They do this because they want to push the fact that you’ve not just bought a PC, you’ve bought a Vaio, or a Thinkpad, or a Portege or whatever, and so obviously they need to make Windows XP less generic by filling it up with all sorts of manufacturer-specific rubbish that wants to manage your photos and play your MP3s and present you with special trial software with preferential purchase options. All bullshit.

So yeah, I’m hooked. Chris and I have an iMac as our “home computer” and now I have a Mac laptop. At work I currently have a Windows PC that I do all my work on, and sitting on my desk next to it is a Mac Mini that I use to test stuff with Safari. Next year when my PC comes up for replacement, I’ll be getting a Mac Pro, and instead of having the Mac Mini just to test stuff in Safari, I’ll install Windows on it and use it just to test stuff in Internet Explorer.

The real irony there is that the Mac Mini cost £400 and is smaller than my external DVD writer. I doubt that the same money would buy a Windows box of the same specification and size. So even for the things I need that a Mac can’t run natively, I’ll still be using a Mac. I hereby take back everything bad I ever said about Apple Macs and I willingly pledge myself to the Cult of Mac.

And yes, I’ll fix this site so it doesn’t look wonky in Safari, Howie, I promise :)


Explorer Destroyer

Slashdot | Explorer Destroyer – this is an extraordinarily bad idea and will most likely lead to another inter-browser jihad similar to that fought between Netscape and Internet Explorer in the late 90s, and trust me, nobody wants to go through that shit again. The trouble is, the pro-Firefox and pro-Opera and pro-not-IE zealots fail time and time again to understand the practicalities of developing sites and applications for modern web browsers.

The central issue is “web standards”. The W3C sets and maintains web standards which govern the specification and use of the various markup languages and how browsers should interpret those languages. This is all well and good, but only in theory. And it’s a good theory, make no mistake, but unfortunately we have a little thing called the “real world” to contend with.

The trouble is that browsers are not standards compliant, which means that website coders who produce a 100% standards-compliant website are, by extension, wasting their time, because the likelihood is that it won’t work, at least not fully, in any browser. Militant (read: non-commercially minded) developers stick to their guns on this, proclaiming that their site is 100% compliant and to hell with the browsers, it isn’t their problem. Again, they do not live in the real world.

Some browsers are more standards compliant than others. Firefox and other Mozilla based browsers are more standards compliant than Internet Explorer. Safari and other KHTML browsers are more standards compliant than Firefox et al, Opera is more standards compliant than Safari et al, and so on and so forth. But none are 100% compliant. So, when coding up your website, you have to make allowances for this, which often means compromising design and/or functionality in some way. You either do this by changing your overall design, or you make browser-specific hacks. The former is the preferred method, but sometimes you’ve no choice but to do the latter, dirty as it may be.

Then there’s another issue: some of the official web standards aren’t actually that sensible, with some bordering on pretty awful. Internet Explorer’s interpretation of the W3C standards is very flexible in some areas, and while this is technically not correct and it’s wrong of Microsoft to insist on their own take on the standards (another relic from the Netscape/IE war; anyone remember “Netscape tags”?), some of the modifications they have made are actually pretty reasonable, and really should be in the official standards.

For example, the IE “box model” (how the browser calculates the dimensions of rectangular areas) makes far more sense than the flawed and illogical W3C version, standard or not. The web development community recognises this particular issue, albeit reluctantly, so Microsoft aren’t all bad, even if their unilateral execution of the modification seems abhorrent to some.
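
To make the difference concrete, here’s a small sketch of the box model arithmetic (my own illustration; the figures aren’t from the W3C or Microsoft):

```python
# Under the W3C (content-box) model, padding and border are added to the
# declared width, so the box grows outward. Under the old IE model, the
# declared width is the whole box, so the content area shrinks inward.

def w3c_rendered_width(css_width, padding, border):
    return css_width + 2 * padding + 2 * border

def ie_rendered_width(css_width, padding, border):
    return css_width

# A "300px wide" box with 10px of padding and a 2px border:
print(w3c_rendered_width(300, 10, 2))  # 324: wider on screen than declared
print(ie_rendered_width(300, 10, 2))   # 300: content area is only 276px
```

Whatever you think of the standards process, the IE version means the width you declare is the width you get on screen, which is exactly what most designers expect.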

I also tire of the shortsightedness of users of alternative browsers. When I ask them why they prefer Firefox or whatever over Internet Explorer, their answers are generally one or more of:

  • $my_browser is more secure!
  • $my_browser has tabbed browsing!
  • $my_browser supports transparent PNGs!
  • $my_browser isn’t Microsoft! F/OSS! (free open source software)

The first one in that list is understandable: IE does have some problems with its security. But that’s nothing to do with how it renders web pages, and that’s the crux of the “standards” arguments. The same applies to the second item. Tabbed browsing is an application feature offered by the browser and, again, nothing to do with page rendering. It is NOT REQUIRED by the W3C standards.

The third item is a cosmetic extravagance that frankly nobody needs and everybody has managed perfectly well without since the inception of the web. Granted, it would be a nice thing to have, but let’s face it, the web isn’t going to wither and die without it. On hearing the fourth item I stop speaking to the person, because it’s then clear to me that it’s not about browsers and standards, they just want to bash Microsoft. Yawn. Move on.

IE isn’t perfect, not by any means. It’s old, it’s insecure and its liberal take on published web standards can be infuriating. But it is a good, solid and above all popular browser, which 90% of the world uses. So while that’s still the case, developers working to a budget, who have to deliver a return on investment, will more often than not develop a site for IE and then see what they can do to get it to work in other browsers, and even then they’ll probably stop at Firefox, the second most popular browser. Beyond that it’s just not commercially viable to spend time satisfying the various levels of standards compliance demanded by every subsequent minority browser. Sorry, but we don’t live in web standards utopia yet, we live in the real world, and we’ve all got to try to earn our keep in it.

This is quite an insightful comment regarding the perception and reality of standards:

I think you’re mistaking a standard for a formal specification.

I agree with you that it would be great if the whole browser world followed W3C recommendations, given their popularity outside the IE world, and the fact that they are formally specified. However, the word “standard” implies a widespread acceptance, and the only player in that game today is defined by “what IE 6 does”. Calling most specifications about the web from the W3C “standards” is, unfortunately, rather misleading; you cannot have less than 1/5 of the market share and claim to be any sort of standard, and AFAICS the W3C themselves rarely use that term.

You don’t like it. I don’t like it. But it is the way things are, and for the foreseeable future it’s the way things will be […].

Returning to the subject of the referenced article: Firefox isn’t the holy grail of web browsers, so promoting it in such an invasive way is, in my opinion, no better than a site that won’t let you in unless you’re using IE. Firefox lacks features just as Internet Explorer does, and it certainly doesn’t have a halo for standards compliance or security. Want a list? Be my guest.


Ubuntu trial over

The Ubuntu trial is over, and I regret to say to all Ubuntu fans that I have returned to Windows. I have reasons, make no mistake, and my time with Ubuntu isn’t over. I really wanted Ubuntu to work for me full time, and I tried my hardest with it, but I was pushing it to the limit and it couldn’t cater for me in the end. Reasons, in brief:

  • Whilst Ubuntu could connect to various sorts of network drives (SMB and SFTP), accessing those drives was frustratingly slow and more often than not, access to them was not offered by applications when loading and saving data. This meant that I frequently had to copy a file from the network drive to the local filesystem, do whatever I needed to do to it, then copy it back. Most inconvenient.
  • Ricey pointed out Crossover Office to me, which allows certain Windows applications to run under Linux, including Internet Explorer, Microsoft Word, Microsoft Excel and Adobe Photoshop. They all installed and ran, but were very slow due to the emulation engine under which they ran (WINE, one assumes). They also suffered from the inability to access my network drives.
  • Crossover Office also did not support Adobe Illustrator or Quark Xpress, so I was still missing my vector graphics and DTP software.
  • The whole system seemed slower. Memory usage wasn’t a problem, so it wasn’t swapping that was slowing it down. Programs seemed to take a long time to load and the processor fan always seemed to be working hard, even when I wasn’t doing anything in particular. The kernel that shipped with Ubuntu didn’t recognise my hyperthreading processor (probably because it wasn’t an SMP kernel), although I don’t know if that had anything to do with it. People call Windows a processor hog, but it seems to give my CPU much less of a hard time in comparison.
  • The iTunes equivalent “RhythmBox” software really couldn’t get its act together. Once I’d convinced it to recognise my MP3 stash, it then went overboard and indexed it twice. It would also frequently lock up, sending the CPU fan spinning into oblivion.
  • The open source office suite OpenOffice.org shows promise, but did not properly display 80% of the office documents that I opened with it. This was particularly prevalent in the word processor; the spreadsheet software wasn’t so bad.
  • I had to go through a complex process just to get it to play MP3s. Apparently, because the MP3 codec isn’t “free”, it doesn’t come with Ubuntu by default, and you have to install it separately, but that means adding unsupported repositories and other such nonsense. It seemed an unnecessary bit of red tape just so that I could play my Massive Attack album. I know all the arguments about “free” codecs versus those encumbered by patents, but this is supposed to be an out of the box OS, and what’s one of the most popular things that people use their computers for these days? That’s right.

Like I say, I really wanted this to work out for me, because Linux on the desktop has come a very long way from the days when you needed to be a sorcerer to even have a hope of getting a half-decent graphical desktop setup on a Linux machine, but unfortunately, it’s still not come far enough, at least not for my day to day work requirements. I will however attempt to get it onto my laptop and use it on there. I only use my laptop for web browsing, e-mail and SSH access, and Ubuntu can do all that just fine.

Other good points that I really liked:

  • It seems to support my laptop’s wireless network adaptor out of the box, although I can’t get it to display all the available networks, including my own. I expect I’ll be able to do it via some command line tool, but I shouldn’t have to.
  • This isn’t down to Ubuntu, but I was impressed at the ease with which I downloaded and installed the manufacturer-supplied graphics card drivers, which allowed me to use my multiple monitors with no fuss.
  • 98% of the system management functions are available through the graphical user interface, which is good. There is, however, still the remaining 2%. I suspect that use of the command line will never be fully eliminated, since at the end of the day it’s a UNIX-like operating system, and that means commands.
  • I liked the range of “familiar” looking software that shipped with it. For example, evolution looks like Outlook, RhythmBox looks like iTunes and OpenOffice.org did its best to use the good parts of Microsoft Office’s interface. The developers have made a very good attempt at trying to cover all the bases and not scare newcomers by inflicting unfamiliar software on them.
  • The installation procedure is marvellous. It’s quick, doesn’t ask any complicated questions, and seems to have no trouble detecting and installing drivers for most if not all hardware that’s thrown at it. This is crucial if it wants to poach Windows users; newcomers won’t accept anything less.

Ubuntu is a very solid, if relatively limited, operating system distribution, and it’ll work a treat for the likes of my laptop and my Dad’s PC. The developers have done a fantastic job, especially as it’s been made available for free, and must keep up the good work.

Unfortunately, in my case, it can’t support my day to day work, and I don’t have endless time to hack it and tweak it, and even if I did I would still have to make compromises. I don’t expect it to look and act 100% like Windows; not only is that unrealistic, but it would completely defeat the object of offering an alternative operating system. Windows, for all its faults and reputation, is fast, responsive and very well supported in terms of software, and that’s what I need, at least at work.


Ubuntu Linux

I’m trialling Ubuntu Linux for a period. I didn’t plan it, but a series of particular events led me to begin such a trial.

It started when Dad’s installation of Microsoft Word broke. He can still use it, but every time he loads it a series of dialogue boxes comes up, along with Windows Installer. It’s just a question of cancelling each one, but it’s frustrating and confusing nonetheless. So I planned (and still intend) to go down south this weekend to generally update and fix his PC, since it’s running Windows Millennium (something that we’d all rather forget about). I could just install XP, but I don’t have any spare legal licences for it, and neither do I have the same for Microsoft Office. Added to this, I thought I’d take the opportunity to replace the OS with something else, since installing XP would just give Dad more of the same thing, which he doesn’t fully understand.

So at the weekend I posted a message on a techie mailing list to which I’m subscribed, asking for suggestions for a parent-friendly Linux distro that could easily offer basic computing tasks, such as word processing, web browsing, e-mail, picture downloading and viewing, and printing from all of the above. The overwhelming response was that, if I didn’t want to buy a Mac, Ubuntu Linux was a fair bet, so I downloaded the live CD.

I was well impressed with it. It worked out of the box on both my desktop PC and my laptop, even going so far as to kindly connect to my neighbour’s unsecured wireless network for me. So that’s going on Dad’s PC at the weekend. I can make it ultra-simple for him, and while obviously using any computer requires some thought, there will be less to confuse him. There’ll also be the added benefit of not being susceptible to all the viruses and spyware on the Internet that target Windows machines.

The subject of which brings me to yesterday. Somehow, and I don’t know why, my PC contracted a spyrus (malicious software that is both a virus and spyware). Don’t ask me how, because I don’t know. I am the most careful person in the world when it comes to running hooky software and my PC is well firewalled. It’s the first time I’ve caught anything like this in all my years of using Windows (12+).

Try as I might with an armada of anti-virus and anti-spyware tools, I couldn’t get rid of the damned thing. The cleaning software would detect it, delete it, and consider its job to be done, but then when I rebooted, it was back. I searched through the registry, the filesystem, everything. Then it started to download some of its virus and spyware mates, and before I knew it I had half a dozen different infections, popping up adverts on my screen, etc. One even installed a Sudoku game, which suddenly appeared in my start menu.

It’s possible to spend days and days trying to eradicate this nonsense, as a colleague discovered to his cost some weeks ago, so I decided to cut my losses and dump the whole Windows installation. All my data is saved on various servers, so it’s not a big deal to do that, assuming of course you can spare a day to reinstall. So I thought what the hell, let’s give this Ubuntu a go, since I’m going to be inflicting it on Dad.

It’s the latest beta version (Dapper Drake or something), but it seems pretty sorted. The setup process was quick and simple and asked no complicated questions. It downloaded TONS of updates, which is good, nothing wrong with that. I found manufacturer drivers for my graphics card and got dual monitors working, so that’s good. All my other hardware was detected and installed automatically, with the exception of the scanner, which I’ll sort out later (if I can). There are software equivalents to Outlook, MSN, mIRC, SecureCRT, Word, Excel and iTunes, which is all perfectly acceptable. It reads and writes CDs and DVDs and can read my flash drive. It has drivers for, and has successfully connected to, the office printers.

There are, however, a number of reasons why I still consider this a trial and not a done deal. Firstly, I need to get to grips with the Gimp, since I am now deprived of Photoshop. I’ve dabbled with this in the past and I frankly didn’t like it, so it’s going to be a steep learning curve. I also still need to test stuff in Internet Explorer, which means I’m going to need a permanent Terminal Services window open, which is a little inconvenient. I’m currently downloading the Linux version of Zend Studio, so the jury’s out on that one at the moment, although I don’t imagine there’ll be much of a problem with it since it’s written in Java and should therefore be the same everywhere.

There’s also then the issue of software that I run less often, but still run nonetheless. I use Adobe Illustrator, and I know of no Linux vector graphics package, much less one that has the capabilities of Illustrator and can read and write Illustrator files. This is a potential problem. Following on from that, I sometimes also use Quark Xpress, and of course, that ain’t never going near a Linux installation. So I am faced with having to reboot into Windows when I want to use such software, which will be a right royal pain, unless anyone’s got any other suggestions?

I’ll also have to reboot into Windows to play games, but I’m not unhappy about that. Overall, this has been an eye-opening experiment. The Ubuntu developers really have managed to create a Linux based operating system that works out of the box and that can be operated by normal humans. I’d never use it for a server of course, but then I’d never use Slackware as a workstation. Different Linux distributions are suited to different purposes, this is by no means news.

I’ll let you know how I get on :)


Web development truths

Anyone can be a web developer, right? Wrong. During the dot.com boom of the late 90s, any old jack-the-lad was claiming that he was a web developer, ranging from 14 year old nephews (known in the industry as Nephew Technology – used by company directors to produce their website in acts of blind faith in untrained schoolboys) to pensioners with way too much spare time on their hands. The dot.com crash of 2000 sorted the men from the boys as those who really didn’t have any genuine skills at all lost their jobs or customers, whilst those who did know what they were doing were more likely to retain theirs.

“Red Herring” websites that cost an incredible amount of money but at the end of the day proved to be of little use to anyone became a thing of the past. The Emperor had finally seen the true nature of his new clothes and was no longer willing to pay over the odds for poor results.

I survived the dot.com crash by not getting involved with any of the silly companies that sprang up at the time, instead choosing to make reasonable money and consistently get better at what I do. People can now easily see the value in my skillset and experience when they brief me for projects. But along the way I’ve learnt a few home truths, which I am not afraid to tell customers, both new and existing, when I need to:

“How much is a website?”

Never EVER ask a web developer this question. Imagine yourself walking into a car showroom and asking the dealer, “How much is a car?”. Ludicrous, isn’t it? The dealer can’t possibly tell you how much a car is because the price of cars ranges from £5,000 to £500,000 depending on what sort of specification you require and how much you’re willing to spend.

It’s exactly the same with a website. Before a web developer can even give you a ballpark price for developing a new website project, he/she needs to have a reasonably detailed description of what you want it to do.

If you don’t know what a website can be capable of, then they will be more than happy to present options to you, in much the same way as a car salesman will explain the meaning of the obscure acronyms you read on a car’s option list. You may not know what a car can be capable of, so ask the salesman, he’ll tell you. The same applies to web developers, although obviously not on the subject of cars.

You have to tell them what you want, only then can they tell you what it’s going to cost. Not all websites are alike. They don’t come in a set range of flavours. Almost every single website in the world is unique. You’re basically specifying a customised product, with a customised cost.

Fast, Good, Cheap – Pick Any Two

This is golden rule number 1 when constructing a brief for a new project or an extension to an existing project, and it’s by no means specific to website software. Think about it carefully: If you want a good quality product in a hurry, it’s not going to be cheap. Alternatively, if you’re still in a hurry but don’t want to spend a lot of money, the product isn’t going to be particularly good. Lastly, if quality and low costs really are paramount, then you’re not going to have it finished in a hurry. Personally I recommend option 1 followed by option 3 as I’m a perfectionist and actually find it quite difficult to produce something that’s not “good”.

Few companies, at least of the size at which most web development agencies operate, can offer all three, and those companies who claim to be able to do so often just try to sell you a pre-packaged website solution that probably won’t be directly suitable for your purposes, which then of course brings the “quality” factor back into question: are they really offering all three after all?

“I want to be able to update it myself”

This is, has always been, and always will be the biggest double-edged sword in the whole website arena. It sounds like a marvellous idea, doesn’t it: a website that the owner can control and update themselves, with no programming knowledge or dependence on the web developer required. Cynics may well claim that web developers don’t like to produce such products because it subtracts from maintenance contracts, and to be honest there is an element of that, but it is by no means as extensive as you might think.

Principally, the simpler something becomes, the less flexible it becomes. Again, this is not specific to web development projects; it applies to pretty much all software and hardware products that require some sort of human interaction, from Microsoft Word to your washing machine.

Let’s change the brief here to “I want it to just wash my clothes by pressing a button”, when you’re buying a washing machine. Imagine a washing machine with just one button: “Wash”. Sure, it would wash your clothes, at a fixed temperature and with a fixed programme, and for a lot of people this would be fine. It’s simple to use and virtually foolproof. But woe betide you if it ruins your Club Monaco wool-knit t-shirt because the programme was unsuitable, because then you would need to change how the machine operates when washing such garments. You need another button. Suddenly the machine has become twice as complicated as it was before.

The same applies to “update yourself” websites (the proper name for which is Content Management System, or CMS). I can provide you with a form with one single text box that allows you to change the content of a paragraph on your website. No problem, you just type the text and the paragraph is updated. But now you want to change another paragraph, and not only that, the paragraph is on another page, and furthermore you want to add an image and change the text colour and add a few links. But at the same time, you don’t want to have to know anything about HTML. Herein lies the problem.
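
To give you an idea of just how simple the one-box version really is, here’s a rough sketch (the file names and the {PARAGRAPH} placeholder are hypothetical, purely for illustration):

```python
# A one-box "CMS" in its entirety: take the text the owner typed into the
# form, escape it so they can't accidentally break the page, and drop it
# into a fixed template containing a single {PARAGRAPH} placeholder.
# (template.html and index.html are hypothetical file names.)
import html

def update_paragraph(submitted_text):
    with open("template.html") as f:
        page = f.read()
    page = page.replace("{PARAGRAPH}", html.escape(submitted_text))
    with open("index.html", "w") as f:
        f.write(page)

update_paragraph("Welcome to our newly updated site!")
```

Every extra requirement, a second paragraph, another page, images, colours, links, means another placeholder, another form field and more code behind it. That’s the washing machine growing buttons.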

Now you have two options. If you want your content management system to become more complex to satisfy your growing needs, you either need to start learning HTML (the markup language that’s used to define the layout and content of web pages), or you need to invest more money into the CMS so that you don’t have to. One route is obviously more expensive than the other, and each has its disadvantages.

With the first option, many people fall into a common trap known as Microsoft Word, though the trap also applies to other HTML-producing software. Microsoft Word, a popular item of software on most people’s computers, claims to be able to export normal Word documents as HTML files. This, for the most part, is untrue. It may well be able to produce HTML files, but the HTML it produces is the most god-awful excuse for markup code that’s ever been seen, and this is not just a personal opinion, this is one of those Internet-wide truths that everyone (bar perhaps Microsoft) accepts. Yet it’s all too tempting for website owners to simply cut and paste Word HTML into the CMS and expect it not to completely screw up their website.

The point here is that allowing people to include their own HTML on their website empowers them to do a wide range of very powerful things. It also allows them to do some very bad things. If you want to manage a complicated website yourself, then you’re going to have to learn how to do some complicated things, including learning at least basic HTML that’s sympathetic to the site’s design and style, rather than how Microsoft Word thinks it should look.

The second option is also not without disadvantage. There is no end to how complicated your CMS can get in order that you don’t have to learn any HTML, and therefore there is no end to how much money you can sink into it just because you don’t want to have to “know about programming and stuff”. This is good for the likes of me, but bad for you. In some cases people spend more money on the CMS, so that they can then spend their own time updating the site themselves, than they would have done paying their web developer to make the changes for them under their maintenance contract. There’s a point at which updating the site yourself simply ceases to be cost-effective.

It is necessary to strike a balance between letting the CMS automate things and providing your own creative input using HTML. The web, despite its apparent simplicity to the average user, is getting more and more complicated underneath by the month. If you want to be involved with controlling the back end then you too will need to become more technically literate. If you don’t have the skills for this, are not willing to learn them, or if it otherwise scares you, then leave it to someone else who does have the skills and isn’t scared to take advantage of them.

Rhydio customers should note that this quasi-rant is not aimed at anyone in particular – I just sometimes get this feeling of tremendous dread whenever I hear the immortal words “I want to be able to update it myself” :)
