Friday, 31 July 2020

Rogue (1980)

Introduced 1980

Before the microcomputer boom of the late 1970s most computers were large, expensive but powerful multiuser machines (“minicomputers”) such as the DEC PDP-11 and VAX. These expensive machines were meant for serious work, but even so a few games had been written such as Colossal Cave and Star Trek.

These were often complex games, but they were severely hampered by the rudimentary output capabilities of the computers involved. Although minicomputer terminals had evolved through the 1970s leading to designs such as the versatile VT100, it wasn’t always easy to leverage the new features into programs.

Cursor addressability was the key feature first seen in the era of the VT52 – the ability to move the cursor to anywhere on the screen and display text. It seems simple today, but the earliest terminals were basically printers-with-a-keyboard (teletypes) and it took a long while for glass teletypes to evolve into the video terminals that could support recognisably modern applications.

By 1980 the newly-developed curses library was making it much easier to use the advanced features of these terminals, and around that time a pair of students – Michael Toy and Glenn Wichman – were intrigued by the possibility of writing a game on the UCSC computer systems where they were studying.
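
To give a flavour of what curses offered – this is a minimal sketch of my own rather than code from Rogue – here is the sort of cursor addressing it made easy: draw a tiny ASCII room and drop an “@” at an arbitrary screen position. It assumes a Unix-like system with the ncurses library installed (build with something like cc sketch.c -lncurses).

#include <curses.h>

int main(void)
{
    initscr();                 /* take over the terminal in curses mode */
    noecho();                  /* don't echo keypresses */
    curs_set(0);               /* hide the hardware cursor */

    /* a tiny hard-coded "room" made of ordinary ASCII characters */
    const char *room[] = {
        "---------",
        "|.......|",
        "|.......|",
        "---------",
    };
    for (int row = 0; row < 4; row++)
        mvaddstr(2 + row, 4, room[row]);   /* move the cursor, then write */

    mvaddch(4, 8, '@');        /* the player, dropped anywhere on the screen */
    refresh();                 /* push the changes out to the terminal */
    getch();                   /* wait for a keypress */
    endwin();                  /* restore the terminal */
    return 0;
}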

Rogue emulated running on a VT220

Rogue broke out of the mould of earlier minicomputer games, which tended to be either quite simple or not worth playing again once you had beaten them. Rogue was a satisfyingly complex Dungeons & Dragons-style game, set on several levels of mostly randomly-generated maps. The player roamed these ASCII maps as a wandering “@” sign, with 26 different types of monster represented by letters of the alphabet - V = vampire, O = orc for example. The player could accumulate a variety of weapons, armour, scrolls, potions and rods (wands) to help them in their task.

Twenty-six levels down in the game you would find the ultimate prize – the Amulet of Yendor – which you could then take back to the surface. The turn-based gameplay did give the player plenty of time to consider their next move in tricky situations, but on the other hand death was permanent – if slain by a monster or your own stupidity you would have to start over.

Michael Toy then moved from UCSC to Berkeley and met Ken Arnold, who had developed the curses library. The game evolved further, and by the mid-1980s commercial interests were involved, porting the game to a variety of 1980s micros including the Macintosh, Atari ST, Commodore 64, Amiga, Amstrad CPC, ZX Spectrum and many more.

It was a hard game to master, and you could easily spend hours and hours trying to tackle deeper and deeper parts of the dungeon. The fact that it was rendered simply as ASCII characters didn’t really matter because of the rich gameplay – and most micro versions used some simple graphics to enhance the game further.

Nethack running on a modern laptop

Rogue was always a closed source game though, so many open source variants followed. Nethack is probably the most popular of these, with richer gameplay than Rogue but still mostly stubbornly sticking to ASCII.

Of course modern computer games can offer something of the rich gameplay of Rogue-like games along with impressive graphics and sound effects, but Rogue and its derivatives linger on and continue to be developed.

Image credits:
Artoftransformation via Wikimedia Commons - CC BY-SA 3.0
Mad Ball via Flickr - CC BY-SA 2.0







Tuesday, 21 July 2020

The Rise and Decline of Sharp Mobile (2002 to 2008)

Fifteen years ago this month, Sharp released the Sharp 903 – a high-end 3G phone that was the high watermark of Sharp’s efforts to break into the European market. Distinctly different from the Nokias and Motorolas that dominated the market, the 903 should have established Sharp as a serious contender. But it faded from sight instead.

In the early noughties Asian firms were having a hard time making an impact outside their home markets, with the notable exception of Sony… and even they had to join forces with Ericsson in 2001. One result of this was that some weird and wonderful ecosystems developed – especially in Japan.

Sharp were dipping their toe in the market, initially with some fairly standard devices but then starting to leverage their expertise in other technologies. In 2000 they made the world’s first camera phone – the J-SH04 – and devices soon started to appear that used some of Sharp’s world-leading display technology.

Sharp J-SH04
In Europe Sharp started cautiously with the O2-only GX1 which sold in limited quantities. Then came the almost identical Sharp GX10 and GX10i (the latter exclusive to Vodafone) in 2002 and 2003 which were attractive but pretty undistinguished clamshells.

The next handset to be launched (in late 2003) was a ground-breaker. Exclusive to Vodafone in most regions, the Sharp GX20 featured a high-resolution 240 x 320 pixel continuous grain silicon (CGS) display which easily beat everything else on the market at the time. Added to that was a competitive VGA resolution camera with a multi-coloured LED, along with a relatively large colour external screen – all in a package smaller and lighter than the more basic GX10. The GX20 created a real buzz around Sharp’s products and consumers were eager to see what would come next.


Sharp GX10i and GX20

The Sharp GX30 built on the superb display in the GX20 and added a megapixel camera – the first seen on a phone in Europe. The GX30 also had a full-sized SD slot, and added video recording, Bluetooth and an MP3 player. In early 2004 all of those things together were a big deal. Even if the software wasn’t as easy to use as a Nokia’s, the hardware was class-leading in almost every respect. Again this was a Vodafone exclusive in many regions – although some other carriers had the functionally identical GX32.

Sharp GX30

You might guess that the next phone from Sharp would be the GX40… but you would be wrong. The Sharp TM100 was exclusive to T-Mobile rather than Vodafone, but was basically a slider version of the GX20 with minimalist looks and the same CGS display that Sharp were becoming famous for.

Sharp TM100

Vodafone again had the exclusive for the next handset – the very popular Sharp GX25. Still a 2004 product, this had a similar specification to the older GX20, but it had a sleeker design and notably it tucked the antenna inside the case. Bluetooth was added into the mix but the external screen shrank considerably. The result was a smaller, lighter and more capable phone that was cheaper than the GX20 while retaining the excellent display. One highly sought-after version of the GX25 was the attractive Ferrari edition in bright red, but some markets had other eye-popping colours available too.

Sharp GX25
Sharp returned to their clamshell-with-antenna design for the Sharp TM200 in late 2004. This was exclusive to T-Mobile and was broadly similar to the GX30 except it had a smaller external display and crucially a two megapixel camera, making it the first such device in Europe. The oversized camera assembly on the TM200 was rather pointless, but it did draw attention to its class-leading camera capabilities.

Sharp TM200
Although most of these handsets had been designed with European and worldwide markets in mind, the next product releases had a more distinctly Japanese origin. One of the stars of Vodafone’s fledgling 3G network was the Sharp 902, which was essentially a straight import of the 902SH handset used by Vodafone Japan.

Sharp 902

The 902 was like (almost) nothing else on the market. A large 3G-capable swivelling clamshell phone, it featured a 2.4” QVGA TFT display, a 2 megapixel camera with 2X optical zoom and a flash, video calling, expandable memory on a full-size SD/MMC card, an MP3 player, web browser and email client. The 902 looked like a compact digital camera from one side, and you could swivel the display around to act as a huge viewfinder. The 902 had plenty of “wow factor” but flaws in the camera design meant that the pictures were disappointing, and Vodafone was having a hard job persuading customers that 3G was worth having. Launched alongside it was the cut-down Sharp 802 with a more conventional 1.3 megapixel camera, although this didn’t have the same market appeal. A special bright red Ferrari edition was the most desirable version, and that still commands a premium among collectors today.


Sharp 803
Most customers were sticking with their 2/2.5G devices, and the GX range was still popular despite 3G competition. Rumours of a Japanese-style GX40 clamshell with a 2 megapixel camera were doing the rounds, Sharp having impressed potential consumers with the radical design of the 902. But this crucial market seemed to be overlooked. It meant that customers with a GX30 who wanted an upgrade but didn’t want a bulky 3G phone would have to look elsewhere.

Sharp’s next launch was the Sharp 903 and Sharp 703 – another pair of 3G devices. The 903 was quite similar to the 902 in design, but sported a 3.2 megapixel camera with a 2X optical zoom that fixed the flaws of the 902. The full-sized SD card slot had gone, replaced by a miniSD slot, but strangely the phone was actually bigger than the 902 despite that. Better looking than the 902, it came in a variety of colours as well. Launched at the same time was the more conventional 703 with a swivel-less design and a 1.3 megapixel camera.

Sharp 903 and 703

We didn’t know it at the time, but the Sharp 903 was as good as it was ever going to get for Sharp fans in Europe. When the Sharp GX40 finally came out later in 2005 it was a huge disappointment. It sported good multimedia features but a very disappointing 1.3 megapixel camera and even the screen was a slight downgrade on previous versions.

Sharp GX40
Three elegant but fairly low-end phones followed in 2006 – the Sharp GX29, 550SH and 770SH. The 770SH was the most refined, with a QVGA display and expandable memory, but it was still only a 2G phone with a 1.3 megapixel camera. The 550SH was essentially a candy-bar version of the 770SH. The GX29 was a simpler phone with only a VGA camera and limited features. This time the most desirable of the bunch was the 770SH McLaren Mercedes edition, which certainly looked the part even if it didn’t deliver much.

Sharp GX29, 550SH and 770SH McLaren Mercedes Edition
After this Sharp pretty much faded out of markets outside of Japan, although years later they did return with some decent Aquos branded Android handsets which developed a following but have never really sold in large numbers.

Sharp certainly seemed to be poised on the verge of a breakthrough, so what went wrong? Sharp were clearly leading in display and camera technology. Very much at the leading edge, Sharp and Vodafone also bet strongly on 3G, coming up with the class-leading 902… the problem was that consumers really didn’t want 3G, and sales of it – along with the follow-up 903, the 802 and the 703 – were weak. Sharp were also very much stuck with carrier-exclusive deals, mostly with Vodafone but also to some extent T-Mobile. This was good news for the carriers, not such good news for Sharp. A failure to update their 2G line also left fans with nowhere to go - and when Vodafone left the Japanese market in 2006 the ties with Japanese manufacturers became much weaker. And of course the market was dominated by Nokia, whose handsets lagged behind in hardware terms but were usually the best-looking devices and very easy to use.


Sharp 902 and GX25 Ferrari Editions

Today the Ferrari editions are sought-after and a humble GX25 in Ferrari livery in very good condition can sell for hundreds of pounds. The 902 can cost around £150 in good condition, but most other Sharp phones are worth much less. However many of them - especially the GX30 and 902 - would make an ideal addition to a collection.


Image credits: Sharp, Vodafone, T-Mobile
Morio via Wikimedia Commons - CC BY-SA 3.0

Monday, 6 July 2020

Missile Command (1980)

Missile Command screenshot
Introduced July 1980

It’s the height of the Cold War, and the possibility of nuclear annihilation is always just around the corner. Everything you know and everyone you love could be swept away in an instant and there would be very little you could do about it.

So, for some escapism what about a game where everybody dies in a nuclear conflagration? Welcome to 1980 and Atari’s Missile Command.

The golden age of arcade machines featured many escapist games, usually of the shoot-‘em-up variety. As with microcomputers of the time, arcade machines were being propelled forward by improvements in microprocessors and other silicon chips, leading to rapidly improving hardware. Missile Command used a 1.25 MHz 6502 CPU with an Atari POKEY chip handling sound. Graphics were 256 x 231 pixels in 8 colours, and unlike Lunar Lander and Asteroids, Missile Command used a raster scan monitor.
Missile Command arcade machine

The gameplay was this: the player had to defend six cities at the bottom of the screen from waves of nuclear weapons (each represented by a line with a blob on the end). The player would launch their own missiles from three bases to destroy the incoming nukes, and those bases could themselves be destroyed. As the game progressed the player was attacked by missiles with multiple warheads, bombers and satellites. The game ended when all six cities were destroyed – and invariably they WERE destroyed.

Unusually, the primary control for the game was a large trackball which emulated the sort of thing that real military bases would use for controlling systems. Combined with the (then) advanced graphics and sound, it made Missile Command a distinctive and popular gaming experience.

Although the game was distributed by Atari in North America, Atari chose to partner with Sega to distribute it in Europe. This gave Sega a useful foothold in the arcade game market. In Asia-Pacific markets a smaller number of Taito cabinets were made. But as a classic video game, it was ported to many platforms from the 1980s onwards and there are still licensed versions and clones available today. Or if you still have Flash installed on your computer, you can play it for free here.

Image credits:
John Cooper via Flickr - CC BY-NC-ND 2.0
James Brooks via Flickr - CC BY 2.0


Tuesday, 23 June 2020

Nokia 6111 vs Samsung E530 (2005)

Nokia 6111
Introduced June 2005

A long time ago we weren’t so bothered about the technical specifications of our phones… how they were designed – the look and feel of the things – was often more important. One common design feature was to make phones small and curvy, and the rival Nokia 6111 and Samsung E530 phones were certainly that.

The Nokia 6111 was the best known of the pair, one of a small number of slider phones from Nokia. With a diminutive size and heavily curved corners there was no doubt that the 6111 was a looker. When closed it measured 84 x 47 x 23mm and at 92 grams it was lightweight as well. On the back was a competitive 1 megapixel camera with an LED flash and rudimentary video capabilities. The 1.8” 128 x 160 pixel screen wasn’t all that great, but as with all Nokias of this type it was incredibly easy to use. There was an FM radio built in plus a few games (more could be downloaded) and it supported Bluetooth too.

Although this was a GPRS-only phone, it did include a version of the capable Opera web browser.

The most popular silver and white version looked a bit “girlie” but there were darker colour combinations too which looked less so. Launched at the height of the slider phone craze in the mid-noughties, the Nokia 6111 was quite a success.

Although Samsung were the king of the slider market in 2005, for the Samsung SGH-E530 they returned to the rather more traditional clamshell form factor. The E530 was all about curves, from the gentle curve of the clamshell case to the rounded contours of the keypad inside.

Available in a much wider variety of colours than the 6111 – including pink, orange, white, purple, blue and silver – the E530 did lean more definitely to the “girlie” end of the market. The built-in apps reinforced this with a calorie counter, fragrance chooser, biorhythm calculator and shopping list.

Samsung E530
In hardware terms the E530 beat the 6111 in a lot of respects. Although it still had a 1.8” display, the Samsung’s was a much sharper 176 x 220 pixels, plus there was a smaller external display next to the camera which could be used for selfies. The 6111 had a crude LED flash where the E530 had none, but the E530 had better battery life.

In software terms, the Samsung wasn’t quite as polished as the Nokia but it was certainly very usable. You could use the Samsung as an MP3 player; like the Nokia it lacked expandable memory, but the E530’s internal 80MB was much more useful than the paltry 23MB in the 6111. Both came with four games included, with other Java games available for download.

If you quite fancied the technical specs of the E530 but wanted something a bit less feminine then the Samsung E720 offered almost identical features but in a different package. Offering many variations on fundamentally similar handsets is something that Samsung still do today (with more than 500 “Galaxy” devices launched to date).

The Samsung E530 had better technical specifications and a more detailed design, but it was the Nokia 6111 that sold. Today the 6111 is commonly available for about £10 to £30, with the rarer Cath Kidston versions coming in at rather more. The Samsung E530 is much rarer and tends to range in price from £50 to £100. Either phone is a refreshing change from today’s identical-looking slabby smartphones however, and whatever you might think of the gender stereotyping they are both good-looking devices.

Image credits: Nokia and Samsung

Tuesday, 16 June 2020

Range Rover (1970)

1973 Range Rover
Introduced June 1970

The development of the Range Rover is a very long and quite interesting story which you can read about if you want. But the very short version is that the Rover Company wanted something to follow-on from their successful Land Rover utility vehicle. This search started in the 1950s and surprisingly it took nearly two decades to come up with something that they thought was good enough to bear the “Land Rover” name… and more crucially something that might sell.

The original Land Rover was strictly an off-roader. On road it was awful to drive, slow and Spartan on the inside. Attempts to convert it into a Station Wagon over the years had not met with sales success. But in the US, cars like the International Harvester Scout and the Ford Bronco showed that there was indeed a market for utility vehicles that could be used as an everyday car.

Development of the Range Rover (codenamed “Velar”) started in the mid-1960s, and it combined lessons from the Land Rover’s impressive off-road manners with Rover’s ability to make a quite luxurious and usable car, but perhaps the key added ingredient was the legendary Rover V8 engine. The Rover V8 was a powerful and lightweight engine that had originally been developed and then abandoned by Buick. Buick’s loss was definitely Rover’s gain, and the V8 engine spent 46 years in production (ending only when Rover collapsed).

In the Range Rover, the V8 gave the car the power it needed to shift its substantial weight in a fairly speedy manner. Inside were things like (gasp) comfy seats and carpets. Peculiarly it was designed as a two-door car, although coachbuilders such as Monteverdi would sell you a four-door conversion. It took until 1981 for a factory-built four-door Range Rover to become available, and in 1992 a long-wheelbase version cemented the idea of it being a luxury car – it was probably the best-looking of all the original Range Rover models.

Late model four-door long wheelbase Range Rover LSE
It had idiosyncrasies. The split tailgate wasn’t to everyone’s taste and it had the turning circle of a bus. But it stayed in production (as the Range Rover Classic) until 1996 – two years after the launch of the P38 which replaced it, and an astonishing 26 years on the market during which it was continually developed as a product.

Today the Range Rover is very much a luxury car, but one that has lost none of its off-road capabilities. A 1970 Range Rover could cost you a shade under £2000 (about £32,000 today) whereas today a base model will cost around £81,000, with typical prices being £100,000 or more. An original Range Rover Classic in fair condition can cost between £15,000 and £25,000, with really good ones nudging the price of a new one. Even though the timeless design doesn’t really look 50 years old, buying and maintaining one of these might well be a labour of love.

As a car, the Range Rover really launched the idea of a luxury SUV in Europe and fifty years later the things are all over the place, love them or loathe them. Today more than a third of new car sales in Europe are SUVs like the Range Rover, and in the US the figure is about half the market. The rise of car leasing also means that people don’t have to spend so much at once to get one of these huge beasts. Far from being a niche, the SUV is now becoming the mainstream choice.

Image credits:
Vauxford via Wikimedia Commons – CC BY-SA 4.0
nakhon100 via Wikimedia Commons – CC BY 2.0






Sunday, 14 June 2020

Hollerith Tabulating Machine (1890)

1890 Hollerith Tabulating Machine
Introduced June 1890

The United States has a census every ten years, an essential act of counting all the population of the country and working out their demographics. But as the population grew in the 19th century, so it took longer and longer to take the census and process all the information. By 1880 the whole process took eight years, nearly long enough to clash with the 1890 census. Something had to be done.

The key part to processing the census more quickly (and cheaply) was the punched card. First introduced in 1804 (more than two centuries ago!) with the semi-automated Jacquard Loom, punched cards allowed binary data to be stored and read by simple mechanical and later electro-mechanical devices. It seemed to American inventor Herman Hollerith that this could be a key part of the solution to the census problem.

For the new census, data was still collected on paper but it was then transcribed to a punched card with 12 rows and 12 columns of binary data, marked by the presence or absence of a hole. The key element was Hollerith's Tabulating Machine. An electromechanical sensor combined with a simple counting dial could then add up the data in a variety of ways, which allowed for all sorts of data analysis.
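
To give a feel for what the machine was doing mechanically – this is a toy sketch of my own with made-up data, not anything from the actual census – the tabulating idea fits in a few lines of C: each card is a grid of holes, and every hole the sensor finds advances a counting dial for that position.

#include <stdio.h>

#define ROWS  12
#define COLS  12
#define CARDS 3

int main(void)
{
    /* three fabricated example "cards": 1 = hole punched, 0 = no hole */
    int deck[CARDS][ROWS][COLS] = {0};
    deck[0][2][5] = 1;
    deck[1][2][5] = 1;
    deck[2][7][0] = 1;

    long dial[ROWS][COLS] = {0};   /* one counting dial per hole position */

    for (int c = 0; c < CARDS; c++)         /* feed each card through */
        for (int r = 0; r < ROWS; r++)
            for (int k = 0; k < COLS; k++)
                if (deck[c][r][k])          /* the sensor finds a hole... */
                    dial[r][k]++;           /* ...and the dial clicks on */

    printf("cards punched at row 2, column 5: %ld\n", dial[2][5]);
    printf("cards punched at row 7, column 0: %ld\n", dial[7][0]);
    return 0;
}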

100 million cards were made, and each was processed just four times to come up with the variety of statistics that the census office wanted. It took two years off the time it took to process the data, but the lasting legacy of the tabulating machine was much deeper. For the first time it showed that automation could be used to process data on a large scale. And remember… this was 1890.

Hollerith’s business grew and in 1896 he created the Tabulating Machine Company, which then in 1911 merged with some other businesses to become the awkwardly named Computing-Tabulating-Recording Company. In 1924 this company renamed itself International Business Machines (IBM). And a century after the Tabulating Machine’s success in the 1890 Census IBM gave us... errr… the PS/1.


Punched card from a Hollerith Machine
As it happens, the Hollerith Machine could be used for evil as well as good. The Nazi regime used demographic data collected on punched cards extensively with horrifying consequences.

Hollerith’s tabulating machine popularised the punched card, something still seen today sometimes in voting machines but which generally fell out of use in the 1980s. Today punched cards can be quite collectable, especially for more obscure systems or ones with some historical interest. However, most people who actually did use them don’t miss them at all... especially if you’ve ever dropped a big pile on the floor and have then had to sort them back into order by hand.

Image credits:
Diane Maine via Flickr – CC BY-NC 2.0
Marcin Wichary via Flickr – CC BY 2.0

Sunday, 7 June 2020

IBM PS/1 (1990)

IBM PS/1 Model 2011
Introduced June 1990

By 1990 the PC-compatible marketplace had changed a lot since the launch of the original IBM PC (model 5150) in 1981. No longer just the choice of businesses, PCs had largely replaced an eclectic range of incompatible home microcomputers that had dominated the earlier 1980s. It was increasingly common to see PCs in the home, but they weren’t generally IBM PCs despite IBM inventing the platform.

IBM had tried to break into the home computing market in 1984 with the IBM PCjr, a short-lived crippled version of the PC that was a sales catastrophe. Apparently unperturbed by this, in 1990 IBM tried to break into the same market again… and they repeated many of the same mistakes they had made years earlier.

Worse still, IBM’s attempt to redefine the business PC market with the IBM PS/2 launched a few years earlier was floundering. Instead of moving the market from DOS and the old ISA hardware architecture to OS/2 and Micro Channel Architecture (MCA) it seemed that IBM just split the market between themselves and competitors such as Compaq who were improving the old platforms instead.

In 1990 IBM tried a shift in direction with the new IBM PS/1. Rather more based on traditional PC architecture than the PS/2, it was designed for home users who wanted to be able to unpack something from the box and get working in minutes. Models such as the 2011 made this really easy, and when assembled they booted into a friendly screen allowing easy access to DOS, Microsoft Works or online services if they had been included.

IBM PS/1 Model 2133
A Mac-like simplicity to the hardware had some drawbacks – it wasn’t really expandable, and the non-standard power arrangement where the computer was powered by the monitor (like the Amstrad PC1512) meant that you were stuck with using the IBM PS/1 monitor forever.

The hardware was excellent though, and it wasn’t stupidly expensive (competing with the likes of Compaq on price), but consumers were not that interested. It didn’t help that IBM had to create a completely new sales channel for the things, as traditional IBM dealers didn’t sell to consumers; in the US a large deal was struck with Sears, who bundled access to Prodigy with the computers. On early models DOS was included in ROM, which made the machines very quick to boot up.

Consumers were cool towards the PS/1 though, preferring other brands where they were available. IBM was still seen as a maker of business PCs, and the incompatibilities of the PS/2 range rubbed off on the PS/1 even though it was a different hardware platform. IBM stuck with the range though, making the machines more expandable and more standard in terms of hardware and software.

The range lasted until 1994 when IBM replaced the PS/1 with the architecturally similar but more appealing IBM Aptiva range, which continued until IBM’s exit from the home PC market in 2001. Today the PS/1 is an uncommon beast but it commands decent prices of about £500 to £700 or so depending on model.

Image credits:
Kungfoocow369 via Wikimedia Commons – Public Domain
Science Museum, London – CC BY-NC-SA 4.0



Wednesday, 3 June 2020

Digital VT05 and VT420 (1970 and 1990)


The video terminal is the unsung hero of the computing world. Often toiling away in warehouses, factories, colleges, shops and other places out of the public eye, the video terminal was a dependable workhorse for decades… well into the era of the PC.

Arguably the king of the terminal market was Digital Equipment Corporation (“DEC” or just “Digital”) who made terminals that were attached to minicomputers or mainframes, where they could run a wide variety of centralised applications that typically ran on Unix or VMS boxes.

They consisted of not much more than a display, keyboard and serial interface – and although they were not always cheap to buy, they were certainly cheap to run, with no moving parts and complete immunity to computer viruses and other malfeasance. You could plug one in and forget about it for years, and it would keep doing its job.

DEC VT05
The DEC VT05 was introduced in 1970 and was Digital’s first standalone raster video terminal. Sure, I could tell you that it was a bit of an upgrade from the glass teletype concept with a bit of cursor control thrown in, but probably the thing about the VT05 that most people will notice is how it looks. Digital’s radical space-age design made it look like the terminal was leaping out of the work surface. Inside, the system boards were slanted behind the CRT, unlike subsequent models which were more conventional. It looked fantastic, but the downside was that the VT05 was 30 inches (76 centimetres) deep, which meant that you’d likely have to re-engineer your working environment to put one in. At 55 pounds (25 kilograms) it was hardly a lightweight device, so you wouldn’t want to move it anyway.

It could only display uppercase characters, but the keyboard could enter both upper and lowercase. Quite how you were meant to tell what you were typing is a mystery. Maximum data transfer rate was 2400 baud. The VT05 had a video input so you could display other things on the monitor, and mix them together with the text. The VT05 was around for five years until the much more capable – and conventional – VT52 was launched.

DEC VT420
Fast forward twenty years and the direct descendant of the VT05 – the VT420 – is launched. Don’t expect two decades of development to count for all that much though; the VT420 was still conceptually the same thing. Unlike the VT05, the VT420 was a practical design with a separate keyboard and a monitor on a tilt-and-swivel stand that was supplied as standard (unlike previous versions). It weighed just 8kg so wasn’t a problem to move about a bit, and the ANSI escape sequences and character sets it supported allowed full cursor addressability and enough predefined graphics characters to make a nice user interface.
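
By way of illustration – this is my own sketch rather than anything from DEC – that “full cursor addressability” boils down to escape sequences like the ones below, which still work on any VT/ANSI-compatible terminal or emulator today.

#include <stdio.h>

int main(void)
{
    printf("\033[2J");        /* ESC [ 2 J - clear the whole screen */
    printf("\033[10;30H");    /* ESC [ 10 ; 30 H - move the cursor to row 10, column 30 */
    printf("Hello from row 10, column 30");
    printf("\033[1m bold \033[0m and back to normal\n");   /* text attributes on and off */
    return 0;
}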

The VT420 also supported dual sessions, typically by using the two serial ports on the back. Not only could you interact with two utterly different systems, but you could also copy-and-paste between them. That might not seem like a big deal now, but back in 1990 most people still couldn’t paste data between applications on their PCs so it was kind of a big deal.

The data transfer rate was a speedy 38,400 baud using the compact phone-like MMJ sockets, the screen could display up to 132 columns by 48 lines of text, and the latest revision of DEC’s legendary keyboard – the LK401 – was almost perfect in every way except for the annoying lack of an Escape key.

Where the VT05 marked a point near the beginning of DEC’s journey, the VT420 marked a point near the end. The days of centralised minicomputers were starting to fade and throughout the 1990s PCs and Macs became more capable professional computing environments. The VT420 was a success but it lasted just three years before being replaced by the VT520 which was almost identical. DEC sold the entire terminal division in 1995 and they themselves were taken over by Compaq in 1998, who were then taken over by HP in 2002.

The VT range soldiered on with Boundless Technologies until 2003, and other manufacturers – including Wyse and Qume – either closed down or shuttered production in the following years, until there were none left.

Even though the manufacturing of terminals dried up, the computer systems that relied on them still exist. VT terminals are still in use around the world, but newer installations will typically rely on a PC with some terminal emulation software – a more complex and less reliable solution.

Today a DEC VT420 in good condition second-hand can cost a couple of hundred pounds, and maybe budget a thousand or so if you want to acquire a VT05. Of course terminal emulators can be had for less: a client such as PuTTY is free, while Reflection is a more commercial offering.

Image credits:
Matthew Ratzloff via Flickr – CC BY-NC-ND 2.0
Adamantios via Wikimedia Commons – CC BY-SA 3.0


Tuesday, 26 May 2020

Nokia 770 vs HTC Universal (2005)

HTC Universal in vanilla form and an O2 XDA Exec
Fifteen years ago the golden age of mobile phones was in full swing – each generation added new features, but most phones concentrated on the business of making phone calls, playing games and taking a few grainy photos. But on the higher end of the market were devices that would lay a path followed by the smartphones of today. Two of these were the Nokia 770 and the HTC Universal.

In 2005 HTC exclusively made devices sold with the names of other companies on them – so the HTC Universal went under many names including the O2 XDA Exec, T-Mobile MDA IV, Qtek 9000 and i-mate JASJAR. Despite the marketing confusion, HTC’s reputation was growing… but it would still be another year before HTC would sell phones under its own name.

The Universal itself was ground-breaking. Here was a smartphone with absolutely everything – a large 3.6” VGA-resolution touchscreen display on a swivelling hinge, a physical QWERTY keyboard, front and rear cameras, expandable memory, Bluetooth, infra-red connectivity, WiFi and 3G. The only thing missing from the feature list was GPS, which was a rare option back then (although the rival Motorola A1000 did feature GPS).

It was a big device for the time, measuring 127 x 81 x 25mm and weighing 285 grams. But in modern terms, that’s about the same footprint as the iPhone 11 but about 50% heavier and three times thicker. The swivelling hinge meant that you could use the Universal like a mini-laptop or a PDA.

There were flaws – it could have used more RAM, the VGA-resolution screen was effectively just a scaled-up QVGA screen most of the time, and the new version of Windows it came with was still clunky and harder to use than a modern OS. The wallet-lightening price of nearly €1000 for SIM-free versions compared unfavourably with laptops of the same era, and while the Universal had awesome mobile working capabilities it just wasn’t as powerful as a real computer. And practically speaking, the sheer size and weight made it impractical for a customer in 2005 to use as an everyday mobile phone.

Nokia 770 Internet Tablet
Nokia on the other hand were trying something different. The Nokia 770 Internet Tablet wasn’t a phone at all but was instead a compact Linux-based computer with built-in WiFi and Bluetooth for connectivity. You could still use the 770 pretty much anywhere, but you’d need to spend a little time pairing it with your regular mobile phone and setting it up as a modem.

There was no camera or keyboard either, but there was a huge (for its day) 4.1” 800 x 480 pixel touchscreen display, expandable memory and pretty decent multimedia support. The operating system was Maemo, a Linux-based platform that was quite different from Nokia’s regular Symbian OS. It was wider but thinner than the Universal and at 230 grams a good deal lighter.

The whole Internet Tablet project was a bit left-field for Nokia, but it attracted a small but dedicated following. Many were attracted by the idea of using Linux on a small handheld computer, which was a fresh idea – indeed in retrospect it seems trailblazing, because Android is built on Linux and iOS shares a closely related Unix heritage.

The 770 suffered from a lack of software at first, although it didn’t take long for Linux applications to be ported across. The processor was a bit slow and there was a lack of RAM which hampered things, and some people just couldn’t get used to the idea of carrying two devices. It was successful enough though to spawn several sequels, which is a different story.

Even though the 770 was priced at just €370 (about a third of the price of the Universal) there was some criticism that it was expensive. In retrospect it looks positively cheap though, and although modern tablets are much bigger there are many echoes of modern smart devices here.

Neither the 770 nor the Universal were a huge success, perhaps partly because the technology of 2005 wasn’t quite good enough to deliver the results people wanted, and perhaps partly because consumers didn’t understand that they wanted all these features from a mobile device until handsets such as the original iPhone came along.

Today both devices are fairly collectable, with the HTC Universal coming in between £100 to £350 or so depending on condition and about £80 to £250 for the Nokia 770. Oh, and the winner between the two? The Nokia 770's software platform is pretty close to what we use today, with the "everything but the kitchen sink" hardware specs of the HTC Universal. Let's call this one a tie.


Image credits: HTC, O2, Nokia

Saturday, 23 May 2020

Windows 3.0 (1990)

Software... in a box!
Introduced May 1990

By 1990 Microsoft’s Windows platform had been around for five years and had made pretty much no market impact at all. Early versions of Windows were truly terrible and combined the very worst of clunky user interface design with the technological backwardness of late 1980s IBM-compatible PCs.

Most PC users stuck with plain old-fashioned DOS and were seemingly happy to run just one program at a time, each with a different user interface and incompatible file formats. WordPerfect, Lotus 1-2-3 and dBase ruled the market – all very different products from different vendors. But a high-end business PC could have an Intel 80386 running at 20 or 25MHz with up to 4MB of RAM, plus a VGA card and a 100MB hard disk, and a really high-end system might have a powerful 486 processor inside. By and large, though, a lot of these systems just ran DOS programs… but very, very quickly.

The limitations of DOS were pretty crushing. One application at a time and a maximum memory address space of 640KB. Various tools and memory managers existed, but many of these were incompatible with each other and they couldn’t make up for the clumsiness of DOS itself. Microsoft’s way out of this was to partner with IBM to make a new operating system called OS/2. This had been developed alongside earlier versions of Windows, but it was a much more modern operating system designed for 80286 processors and above, whereas Windows 1 and 2 could run (just about) on a first-generation 8086-based PC.

But Microsoft too had been pushing technological limits with Windows/286 and Windows/386 – special versions of Windows 2.1 which maintained the clunky look-and-feel of the old Windows but could actually take advantage of newer CPUs, including multi-tasking DOS applications. These were niche products, but when Windows 3.0 was introduced in May 1990 it included enhanced support for the 286 and 386 processors out of the box.

Not only was it better underneath, but Windows 3.0 had a complete overhaul of the user interface, featuring the application-orientated Program Manager rather than the brutally ugly and simplistic MS-DOS Executive in previous versions. Utilising attractive icons and taking advantage of what was then high-resolution VGA graphics, Windows 3.0 was approaching the usability of the Macintosh – although the Mac’s Finder was more about data files than programs.
That's a billion person-hours down the drain then

Windows 3.0’s design was a mix of polished-up elements from previous versions of Windows, with a rather flat feel to them, along with 3D elements largely borrowed from Presentation Manager in OS/2. Compared with modern minimalistic versions of Windows, Windows 3.0 had a lot of “chrome” around the edges which resulted in visual clutter and wasted space. But it certainly wasn’t bad looking for a 30-year-old design.

It had flaws – many flaws – in particular it wasn’t very stable, and the dreaded Unrecoverable Application Error (which was Windows 3.0’s Blue Screen of Death) was all too common. Driver support was fiddly: if you didn’t have a popular system then you’d need to acquire and install things like video drivers and sound drivers yourself. Most importantly, it wasn’t really an operating system in its own right; it was an operating environment perched on top of the ancient DOS. It took another three years and a major schism with IBM to create a completely modern version of Windows with Windows NT.

Windows 3.0 was a huge success (perhaps thanks in part to the maddeningly addictive card game of Solitaire it shipped with), and of course the quest for ease-of-use spread beyond the operating system itself. Microsoft Office 1.0 was launched in November 1990 with Word for Windows 1.1, Excel 2.0 and PowerPoint 2.0, typically priced at around half of what it would cost you to buy the programs separately. Windows 3.1 followed two years later with improvements all round, and because Microsoft would cut PC manufacturers a good deal to ship Windows with new PCs it eventually got everywhere and wiped out all the other PC-based opposition.

As an operating system it isn’t of much practical use today, but if you want to play with it there are a few places where you can get virtual machines to run under VirtualBox or VMware, and you can relive the frustrations of Solitaire if nothing else. Complete sets of installation disks, boxes and manuals are quite collectable too, with prices typically ranging from £60 to £100.



Image credit: David Orban via Flickr - CC BY 2.0

Tuesday, 12 May 2020

Pac-Man (1980)

Introduced in Japan, May 1980

The Golden Age of Arcade games is in full swing by May 1980, and it sees the launch of a gaming icon – Pac-Man. Developed in Japan by Namco, it followed on from their successful Galaxian shoot-‘em-up, but Pac-Man was a completely different type of game.

As with Galaxian, Pac-Man was powered by a Z80 processor; it had colour graphics with sprites, a dedicated graphics co-processor and a three-channel sound chip. Utilising an impressive array of technology from the era, the capable hardware platform allowed for a much richer gaming experience than earlier generations.

The game itself – first called Pakkuman and then Puck Man before becoming Pac-Man – is familiar to most, but in case it somehow passed you by, the player controls Pac-Man (essentially a circle with a chomping mouth) who is chased around a maze by four ghosts (Blinky, Pinky, Inky and Clyde). Pac-Man’s job is to eat all the dots in the maze; if caught by a ghost he loses a life… but various power pills around the maze allow Pac-Man to eat the ghosts instead. The ghosts have different personalities and between them try to chase and trap Pac-Man.

With appealing graphics and sound, Pac-Man was a fun game to play – and it was also something different from the alien invasion or war games that dominated arcades. A slow burn at first, when Pac-Man hit the US in late 1980 it became a smash hit and the most popular game around, shipping 100,000 cabinets by 1981.

After arcade success came conversions to home computers and consoles, official and unofficial and of varying quality, and then there were endless sequels (notably Midway’s Ms. Pac-Man), board games, TV shows, toys and other things you could spend your cash on. The franchise raked in billions of dollars and created an iconic character recognised throughout the world. Although there are many different ways to play Pac-Man today, if you want an original cabinet in good condition then you can expect to pay several thousand pounds.

Image credits:
Peter Handke via Flickr - CC0 1.0

Marcin Wichary via Flickr - CC BY 2.0

Friday, 8 May 2020

Penny Black (1840)

Rare block 4 of four unused Penny Blacks
Introduced May 1840

These days we mostly communicate with each other electronically – at the press of a button a message can be transmitted anywhere in the world in a fraction of a second. But of course that wasn’t always the case, and not so long ago you would often have to rely on writing a letter. On paper, in an envelope and typically written out by hand.

Postal systems had been around since at least 500 BC and possibly even earlier, but they tended to be haphazard, open to corruption and often only accessible to governments. Although this situation improved, by the 19th century most postal systems were still complex and unreliable – sometimes operating on a post-pay principle where the postage would be paid by the receiver, who might decline to pay (sometimes after reading the letter).

Reform was needed and in the spirit of Victorian innovation the government started to look for solutions. Eventually a proposal by Rowland Hill was accepted for an inexpensive, simple-to-use and universal postage system that would become the model for the rest of the world.

Before Hill’s reforms the price of postage depended on the distance the letter had to travel, making the system complicated to use. It was also expensive, so the volume of mail was quite low, and the Post Office was haemorrhaging money as a result. Instead a system was proposed where a letter of up to half an ounce (14 grams) could be sent prepaid for a uniform rate of just one penny, regardless of distance.

Penny Black Detail
To show that postage had been paid, a piece of gummed paper was cut out and stuck to the envelope – called a “stamp”. It was a peculiar thing to call a sticky bit of paper, but it referred to some older postal systems where the envelope was stamped with an actual stamp. This small piece of paper had a portrait of Queen Victoria on it and was black in colour, becoming known as the “Penny Black”.

Introduced in May 1840, the Penny Black and the postal system it represented became the model for every other country, and the stamp itself became an icon. But it didn’t last long. The black colour of the stamp made it very hard to see cancellation marks, meaning that the stamp could sometimes be carefully removed and re-used, so in February 1841 the stamp was redesigned in a red colour – and this version in one form or another continued in production until 1879.

The system was simple to use and inexpensive (one penny in 1840 terms is worth about 42 pence today). It was a huge success, and it allowed people from all classes to communicate with anyone else in the country, reliably and for a modest amount of money. Over the next few years the same model was adapted by most other countries, and an era of low-cost and reliable communication was born.

Today of course the Penny Black is very collectable, with prices typically starting at about £50 for a used one in decent condition and then heading upwards to thousands or tens of thousands of pounds. Given the short period of time that the Penny Black was available, combined with the fact that the hobby of stamp collecting didn’t really exist until stamps did, these stamps have a certain rarity value – but every semi-serious collector will either own one or aspire to do so. Alternatively you could buy the modern day equivalent – a British first-class stamp currently costs 76p, which is 182.4 times more expensive than the original.

Image Credits: [1] [2] 
Metropolitan Museum of Art via Wikimedia Commons - CC0 1.0

Saturday, 2 May 2020

DAI Personal Computer (1980)

Later model Indata DAI Personal Computer
Introduced 1980

The early microcomputer market is full of hits and misses. Sometimes products fail due to no fault of their own, sometimes they have flaws which are their undoing, and sometimes it is just bad luck. The DAI Personal Computer combines all of these and perhaps if the dice had fallen differently it could have been one of the successes of the early 1980s.

DAI stood for “Data Applications International”, and during the 1970s this Belgian company had developed computers and expansion cards based on their own standard called DCEbus, related to the then-popular Eurocard standard. The story goes that Texas Instruments (TI) in Europe approached DAI to make a computer that they could sell, because TI’s corporate bosses were only looking to sell the upcoming TI99/4 in the US, as it was going to be compatible with North American NTSC TVs rather than European PAL ones. DAI designed a system to show TI, with a physical design based on TI’s own Silent 700 data terminal.

TI changed their mind and decided to make a PAL version of the TI99/4 after all… so they didn’t need the system that they had paid DAI to design, and DAI were free to market it themselves. Despite this bumpy start, the DAI computer looked promising… but the whole history of the computer was going to be a series of bumps.

In the Netherlands, educational broadcaster Teleac was looking for a computer system to accompany an upcoming series about microcomputers (launched some years before the BBC did the same). They were interested in the DAI, and this looked like a promising tie-up that would give the upcoming product a boost. But production problems in the Belgian factory plus difficulties in sourcing the ROMs from the US meant that the DAI was never going to be ready in time, and Teleac chose to go with the Exidy Sorcerer instead.

Despite these setbacks, the DAI finally launched in 1980, and (if you noticed it at all) it was a quite different machine from most of the competition, in good and bad ways. Inside was an Intel 8080 processor rather than the more common Z80 or 6502 CPUs. The 8080 came out in 1974 and was pretty basic by the standards of 1980: it wasn’t particularly fast and it struggled with handling precise floating point numbers, although you could add a maths co-processor. There was 48KB of RAM, of which up to 32KB could be used by graphics. Some clever trickery with an adjustable on-screen palette meant that the DAI could display far more colours than you would expect.

Sound was built in (with three stereo channels plus a noise generator) and there were a whole bunch of I/O ports including DAI’s own DCE bus, which meant that you could expand the hardware with a number of peripherals, including floppy disks and hard disk drives. Perhaps partly to make up for the pedestrian processor, BASIC on the DAI was compiled at run time rather than interpreted, which made programs run very quickly.

There were problems – there wasn’t much software, documentation was poor, after-sales service was abysmal and the system was very expensive. Financial troubles plagued both the manufacturer and distributors, and in 1982 DAI (the company) went bankrupt, with production being taken over by another company called Indata. Production carried on until 1984 when the DAI fell victim to the great microcomputer market crash of the early 1980s.

Despite all this, the DAI had a strong hobbyist following which helped to develop software, write documentation and keep the systems running. Overall the DAI was ahead of its time in many aspects, and perhaps with a bit more luck it might have been a success. Today, the DAI Personal Computer is a rare find and in good condition could cost you €1000 or so.

Image credit: Thomas Conté via Wikimedia Commons - CC BY-SA 2.0

Sunday, 26 April 2020

Acorn Atom (1980)

Acorn Atom
Introduced 1980

The late 1970s saw the introduction of the first microcomputers that you could just take out of the box and use at home or at work, and eventually this would lead to an explosion in the number of home computers available in the first half of the 1980s.

Somewhere in between these two points was the Acorn Atom – not one of the original micros that changed the world, and not one of the later ones that sold (for a while) like hot cakes. However – like almost all Acorn computers – the Atom is part of a vitally important heritage that had profound effects on the way we interact with technology today.

Acorn’s first successful commercial product had been the Acorn System 1 launched the previous year. Little more than a board computer with a keypad and some interfaces, the System 1 was incredibly basic but had potential.

The System 1 developed into the Systems 2 to 5 – rack-mounted machines not dissimilar to the pioneering RM 380Z, which had achieved some success in education and scientific markets. These Eurocard systems evolved quickly into capable machines, but despite this the rackmount form factor was never going to work in the home.

Back: Atom, BBC Master Compact, Electron, BBC Micro
Front: BBC Master
Acorn took most of the key components of the Acorn System 3 and stuffed them into an all-in-one unit that was based on the Acorn System keyboard. Featuring a 1MHz 6502 processor, 2KB of RAM (upgradable to 12KB), a fast built-in BASIC plus a variety of colour graphics modes and sound, the Atom was certainly competitive with other systems on the market.

The standard storage was a cassette drive, but you could add floppy disks, printers or other peripherals. Unfortunately a fully-loaded Atom would tend to overheat, a challenge that could often be fixed by the not-at-all-dangerous act of rewiring the mains power supply.

You could buy the Atom for £170 assembled or for £120 as a kit, more expensive than the rival Sinclair ZX80 but much more sophisticated. It was a relative success, selling in tens of thousands of units and setting Acorn up well enough financially to make several follow-ups.

So why is it so important? Well, the Acorn Atom developed into the very successful BBC Micro which was followed up by the Acorn Archimedes and that gave us the ARM processor which is probably what powers your smartphone. It took just 8 years for Acorn to evolve from single board computers to an architecture that would change the world.

Probably partly because they didn’t sell in huge numbers – and those that did sold mostly in the UK – the Atom is an uncommon find today. Be prepared to pay several hundred pounds if you want one in good condition, or even more if it has accessories. Or alternatively you could just tinker with an emulator for free...

Image credits: Simon Inns via Flickr [1] [2] - CC BY 2.0

Monday, 20 April 2020

From the archives: Microsoft KIN, Nokia N70, N90, N91, N8, 5140i, 8800, BlackBerry Pearl 9100

Some product launches are merely bad, but the Microsoft KIN (2010) cratered so badly that you could see it from space. On the other hand, the Nokia N-Series represented Nokia at its best, starting with the Nokia N70, N90 and N91 (2005) and ending (at least for the Symbian N-Series line) with the N8 just five years later.


April 2005 was a great month for Nokia, with the somewhat rubbery but almost indestructible Nokia 5140i and the elegant Nokia 8800. Back to 2010, and the BlackBerry Pearl 9100 combined RIM's smartphone know-how with a traditional form factor that made it a success.

Image credits: Microsoft, Nokia, RIM

Saturday, 18 April 2020

Fanta (1940)

Introduced 1940

This year sees the 80th anniversary of the launch of Fanta – the fruit-flavoured fizzy beverage that is one of the most popular soft drink brands in the world. But the history of Fanta is complicated and it involves the Second World War and Nazi Germany.

Coca Cola opened its first bottling plant in Germany in 1929 – a few years before Hitler came to power. It became the most important market for Coca Cola outside the United States and by 1939 it had dozens of factories in Germany alone to keep up with demand. Of course to some extent, doing business in Germany in the late 1930s meant doing business with Nazis and although in retrospect this was probably a bad thing, many European and American businesses still do business with repressive regimes today.
"Fanta Klassik"
- a modern recreation of the original recipe

In 1939 of course the Second World War started in Europe, and by 1940 the German operations of Coca Cola were struggling because of a combination of blockades, embargoes and… well… all-out war. Although the United States (home of Coca Cola) was officially neutral until December 1941, supplies of key ingredients dried up effectively leaving Coca Cola in Germany with nothing to sell.

Coca Cola’s German boss – Max Keith – and others in the company then set about looking for a product they could actually make and sell. What was settled on was fairly unappealing from the ingredients list – milk whey, beet sugar and apple leftovers from the food and cider industry. It got the name “Fanta” from the German word “Fantasie”… and a brand was born; indeed it was only the second beverage that Coca Cola had ever made.

The product varied a bit over the war years, sometimes switching in artificial sweetener and sometimes using other fruits or fruit by-products. But Fanta sold in millions of cases and was a huge success throughout Nazi Germany. The occupied Netherlands also made its own version using elderberries.

At the end of the war, Coca Cola took back control of their German subsidiary – including all the profits made during the war – and then shut down Fanta production and switched back to Coca Cola. But in the 1950s increased and more varied competition from rival PepsiCo led Coca Cola to revive the brand. Variations of Fanta spread throughout Europe, and the classic orange Fanta that we most associate with the brand was invented in Italy in 1955, using locally-sourced oranges.

Fanta continued to grow, both in the countries it was sold in and in the bewildering variety of flavours. The flexibility in the brand allows it to be adapted to local tastes, and it also means that short-term special editions can be made to boost sales further. Today the brand has evolved massively since its origins in Nazi Germany… but that doesn’t stop some very weird conspiracy theories coming up.

Interestingly, this American company prospered under the Nazi regime – even when the United States and Germany were at war. This is ripe ground for conspiracy theorists, but several American companies ended up having their German subsidiaries orphaned in this way. Ford Germany made trucks such as the V3000 and engine parts; GM’s German branch, Opel, also made large numbers of trucks and military vehicles; IBM’s involvement was darker still. Many other companies found themselves in this position with a greater or lesser degree of collaboration with the Nazi authorities, but it seems that both Max Keith and Coca Cola Germany resisted the dark side and just stuck to making fizzy drinks.

A collection of Fanta cans from the 1960s

The other interesting thing is just how unappetising the original recipe for Fanta sounds. Apple pomace is the sort of thing that ends up in animal feed, and milk whey sounds very out of place in a fruit drink even though it is quite sweet-tasting and nutritious. It might not be an obvious way to make a soft drink, but considering some of the theories about what is in Coca Cola it does seem rather tame by comparison.

In 2015, Coca Cola recreated something approximating the original Fanta for its 75th anniversary, calling it “Fanta Klassik”. However, a misjudged tagline of “we’re bringing back the feeling of the good old days” could be misinterpreted as the “good old days” being Nazi Germany, when really it was talking about the 1960s. Nonetheless it was a popular (if temporary) retro concoction, but if you want to recreate the taste of 1950s Italy you could just have some orange Fanta instead.

Image credits:
Illustratedjc via Wikimedia Commons - CC BY-SA 4.0
José Roitberg via Flickr - CC BY-NC-ND 2.0