Monday 28 December 2020

2020: things that didn’t quite make the cut

This year we’ve covered gadgets and inventions from the 1800s onwards. But plenty of other things had anniversaries this year that we didn’t mention.

One of the most important inventions debuted in 1810 – the tin can. A key product of the industrial revolution, the tin can answered the millennia-old question of how to preserve foodstuffs. More reliable and palatable than salting, drying, pickling and a variety of other methods, the humble tin can meant food security for growing populations in the 19th, 20th and 21st centuries.

Also hailing from the nineteenth century was celluloid, a versatile class of materials pioneered in billiard balls in 1870 that then found its way into many other products, from toys to films. However, celluloid’s habit of bursting into flames means that it is rarely used as a material today.

Canned food in a store in Hong Kong, Easter brooch with celluloid flowers


Somewhat related to celluloid is cellophane, a lightweight and flexible material that lent its properties to Scotch Tape, introduced in 1930. Also from 1930 – and perhaps a little bit more edible – is the Hostess Twinkie cake bar. Allegedly, Twinkies last forever – but eat a decade-old box at your own risk.

Vintage scotch tape container, Twinkie cake bars

Fast forward to 1960, and the white heat of technology forges something even more high-tech than sticky tape and cake bars: the laser. Don’t ask me to explain how these things work, they just do. Pew pew.

Visible lasers being demonstrated


People who remember the home computers of the 1980s probably remember the Commodore VIC-20 – but it had an immediate predecessor in the shape of the VIC-1001, which was successfully sold only in Japan. The main difference between the two is that the VIC-1001 supports Japanese Katakana characters.

Commodore VIC-1001


By 1990, of course, things were getting more advanced still. The Sega Game Gear was an 8-bit handheld console that carved out a significant market for itself. Another 8-bit games machine, the Commodore 64 Games System (or simply the C64GS), was a more traditional console based closely on the legendary Commodore 64 home computer. The C64GS held great promise, with the potential of a huge games library, but it failed to deliver in spectacular fashion.

Sega Game Gear, Commodore 64 Games System

In the same year, the Macintosh Classic breathed a bit more life into a familiar format at a sub-$1000 price point, but the 68000 processor was getting a bit long in the tooth by then. More powerful, but three times the cost, was the 68030-based Macintosh IIsi which was much more forward-looking.

Apple Macintosh Classic, Macintosh IIsi

Handheld gadgets continued to develop, and in 2000 the Sharp J-SH04 launched in Japan as the world’s first recognisable camera phone, with a rear-facing 0.11 megapixel camera. It wasn’t great, but it set a pattern that other early camera phones improved on.

Sharp J-SH04


That’s it for 2020, a difficult year for many people. Let’s hope that 2021 will be better. A big shout out to all those key workers, healthcare professionals and everybody trying to be socially responsible (or even just managing to keep themselves sane) this year. 

Image credits:
Canned food in a store in Hong Kong: Bairgae Daishou 33826 via Wikimedia Commons - CC BY-SA 4.0
Easter brooch with celluloid flowers: Pinke via Flickr – CC BY-NC 2.0
Vintage scotch tape container: Improbcat via Wikimedia Commons – CC BY-SA 3.0
Twinkie cake bars: Photog Bill via Flickr - CC BY-NC-ND 2.0
Visible lasers being demonstrated: US Navy via Wikimedia Commons – Public domain
Commodore VIC-1001: Thomas Conté via Flickr - CC BY-SA 2.0
Sega Game Gear: James Case via Flickr - CC BY 2.0
Commodore 64 Games System: Thomas Conté via Wikimedia Commons - CC BY-SA 2.0
Macintosh Classic: Christian Brockmann via Wikimedia Commons – CC0
Macintosh IIsi: Benoît Prieur via Wikimedia Commons - CC BY-SA 4.0
Sharp J-SH04: Morio via Wikimedia Commons - CC BY-SA 3.0


Sunday 20 December 2020

Zork I (1980)

Introduced December 1980

Although early microcomputers came in all sorts of different and incompatible varieties, they had a few things in common. Specifically, they didn’t have much in the way of memory, and if they had graphics capabilities at all they were pretty rudimentary. Early computer games therefore required a bit of imagination, and one great example of this is the Zork series of text adventures from Infocom.

Zork running on an ADM31 terminal

Originally designed as a follow-on from the 1970s “Colossal Cave” adventure, Zork (in those days just called “Dungeon”) was developed for the DEC PDP-10 (a 36-bit timesharing system) and written in a version of LISP called MDL. Computers like these tended to have a relatively large amount of memory and decent hard disk storage, so the game itself grew quite large and complex. And just like Colossal Cave, Dungeon was very popular amongst people with access to the expensive computing equipment required to run it.

The next step by the authors – Tim Anderson, Marc Blank, Dave Lebling and Bruce Daniels – was to port the game (now called “Zork”) to the ever-expanding range of microcomputers on offer. Some compromises had to be made due to the small amount of memory these machines had, and the original minicomputer game was rewritten into three episodes.

The creators formed a company called Infocom to market the game, and set about the challenge of rewriting it. The approach was a novel one for the time – Zork itself was written in its own language called “ZIL”, which ran in a virtual environment called a “Z Machine”. This meant that the game (and others based on the same technology) could be easily ported to any platform that had a Z Machine coded for it. Infocom partnered with the distributors of VisiCalc to sell the game, and by 1980 it was ready for the wider world.
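
To see why the Z Machine idea mattered, here’s a minimal sketch of a bytecode interpreter in Python. The opcodes and the tiny “story file” are invented for illustration – the real Z Machine instruction set is far richer – but the point stands: the game lives entirely in the data, and only the little loop below needs rewriting for each new computer.

```python
# A toy bytecode interpreter illustrating the Z Machine idea: the game is
# just data, and only this small loop needs rewriting for each platform.
# The opcodes and "story file" below are invented for illustration --
# the real Z Machine instruction set is far richer.

PRINT, GOTO, HALT = 0, 1, 2

def run(program):
    pc = 0  # program counter
    while True:
        op = program[pc]
        if op == PRINT:          # PRINT <text>: show a string to the player
            print(program[pc + 1])
            pc += 2
        elif op == GOTO:         # GOTO <addr>: jump elsewhere in the story
            pc = program[pc + 1]
        elif op == HALT:         # HALT: end of session
            break

# The same "story file" runs unchanged on any machine with an interpreter:
story = [PRINT, "West of House",
         PRINT, "You are standing in an open field west of a white house.",
         HALT]
run(story)
```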

You could call the finished product either an adventure game or a piece of interactive fiction. Rich text descriptions, clever natural-language parsing and complex gameplay made Zork a compelling proposition. The player starts in a field next to a white house, which turns out to be the gateway to an underground lair full of treasure and monsters – including the infamous “grue”, which would kill adventurers who wandered around in the dark. By solving a series of puzzles, mazes and other challenges, the player could bring all the treasure back to a trophy case in the house and thus win the game.
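
Infocom’s parser famously understood far richer commands than the two-word verbs of earlier adventures, but even a toy version shows the basic idea of mapping typed text onto verbs and objects. This sketch uses a small hypothetical vocabulary – it’s nothing from Infocom’s actual code:

```python
# A toy adventure-game parser. Infocom's real parser handled articles,
# prepositions and multiple objects; this sketch uses a small hypothetical
# vocabulary purely for illustration.

VERBS = {"take", "drop", "open", "read"}
NOUNS = {"mailbox", "leaflet", "lamp", "sword"}

def parse(command):
    # Drop articles, then look for one known verb and one known noun.
    words = [w for w in command.lower().split() if w not in ("the", "a", "an")]
    verb = next((w for w in words if w in VERBS), None)
    noun = next((w for w in words if w in NOUNS), None)
    return (verb, noun)

print(parse("open the mailbox"))   # ('open', 'mailbox')
print(parse("take leaflet"))       # ('take', 'leaflet')
```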

Sharp-eyed people may notice that this Kaypro II is accompanied by the Amiga version of Zork I

Sales were good, and as word got around – and the software was ported to more devices – popularity grew throughout the 1980s. Zork II and Zork III followed in 1981 and 1982, and a wide variety of other games were released on the Z Machine platform, including 1984’s “The Hitchhiker’s Guide to the Galaxy”, designed in conjunction with Douglas Adams. As a novel twist, some of these games included “feelies” – physical objects in the box that formed part of the puzzles in the games.

Even as rivals tried to come up with graphical adventures, Infocom’s position remained very strong. But Infocom wanted to move beyond being just a games company, and in 1985 they launched a database product called Cornerstone. It wasn’t a success, and Infocom was taken over by Activision, which ensured the short-term survival of the company… but in the end Activision didn’t really understand the Infocom brand, and by the end of the 1980s development of text adventures had ceased – although there was a brief renaissance in the mid-1990s when several Infocom games compilations were released.

Because of the portable nature of the game, it’s possible to play Zork online for free. Be warned though – you may find that once you start, you won’t be able to stop playing until you’ve solved it all.

Image credits:
CyberHades via Flickr - CC BY-NC 2.0
Marcin Wichary via Flickr - CC BY 2.0

 

Saturday 28 November 2020

Atari Battlezone (1980)

Introduced November 1980

Displays on early computer systems – and arcade games – tended to be divided into two categories. Most used a series of dots created line-by-line, like a domestic TV set – these were raster scan devices. But you don’t have to create a picture like that with an old-style cathode ray tube (CRT), you can draw lines from point-to-point too – these are vector devices.

Atari’s Lunar Lander and Asteroids games used vector displays to create cool space-age graphics, modelled in part on the look and feel of Spacewar! on the PDP-1 nearly two decades earlier. The next stage for Atari was an impressive evolution of vector graphics: Battlezone.


Modern rendering of the periscope view of Battlezone

Launched in November 1980, Atari’s Battlezone was set firmly on the ground. Instead of a spaceship, the player controlled a tank using two levers that pushed backwards and forwards: push both forwards and the tank moved forwards, pull both back and it reversed, and to turn you pushed one forwards and pulled the other back. A button on one of the controllers fired a shell at the enemy tanks prowling around the landscape, or at the faster super tanks, UFOs and missiles.
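
The arithmetic behind that twin-lever scheme is classic differential steering, something like this little sketch (the lever range and scaling are invented for illustration, not Atari’s actual values):

```python
# Differential ("tank") steering: each lever ranges from -1.0 (pulled back)
# to +1.0 (pushed forward). The sum drives the tank, the difference turns it.
# Values and scaling are illustrative, not taken from the arcade hardware.

def tank_drive(left, right):
    speed = (left + right) / 2.0   # both forward -> full speed ahead
    turn = (right - left) / 2.0    # one forward, one back -> rotate on the spot
    return speed, turn

print(tank_drive(1.0, 1.0))    # (1.0, 0.0)  straight ahead
print(tank_drive(-1.0, -1.0))  # (-1.0, 0.0) straight back
print(tank_drive(-1.0, 1.0))   # (0.0, 1.0)  spin in place
```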

The gameplay was fairly straightforward, but the graphics set it apart. In the standard versions, the player looked through a periscope – complete with a lens and some painted-on artwork – to simulate a real tank. What they saw was a flat plane with the occasional enemy, distant mountains, a live volcano, a crescent moon and random geometric shapes scattered around, which could act either as cover or as obstacles. The wire-frame vector graphics gave a primitive but compelling 3D experience.
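
Behind those wire-frame visuals is one simple formula: divide each coordinate by its distance from the viewer, so far-away objects shrink towards the vanishing point. A rough sketch of that perspective projection (the focal length and screen size are arbitrary, and this is certainly not Atari’s code):

```python
# Perspective projection, the heart of any wire-frame 3D display: a point's
# screen position is its 3D position divided by its distance from the viewer.
# The focal length and screen dimensions are arbitrary illustrative values.

FOCAL = 200.0        # larger = narrower field of view
CX, CY = 160, 120    # centre of a hypothetical 320x240 screen

def project(x, y, z):
    """Map a 3D point (viewer at origin, z pointing into the screen) to 2D."""
    sx = CX + FOCAL * x / z
    sy = CY - FOCAL * y / z   # screen y runs downwards
    return (round(sx), round(sy))

# The same corner of an obstacle, moving away from the player:
for z in (100, 200, 400):
    print(z, project(10, 10, z))
```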

The enemy often wasn’t visible at all, except for being on radar. A frantic swing around to find the opposing tank would often be too late to avoid an incoming round. Unseen objects could block escape routes, or alternatively save your bacon and block an attack. All the time, the mountains beckoned in the distance. Could you ever reach them? All sorts of rumours and myths abounded, sadly untrue.

Battlezone machine with periscope

The game was a huge hit, but the periscope remained one of the most divisive parts. Although it added to the feel of the game, you had to be able to reach it and it wasn’t always the most hygienic thing to press your face into. Some later versions removed the periscope which made the game more accessible.

Battlezone also had official ports to most of the popular systems of the early 1980s, including the Apple II, Commodore 64, Sinclair ZX Spectrum, Atari 8-bit machines and the Atari ST. Unofficial versions and games inspired by Battlezone proliferated across every platform you can think of and are still popular today. Vintage machines seem rare, with prices in the US being around $5000, but if you want a more modern (and compact) VR experience you could try something like the PlayStation 4 version.

Image credits:
Gamerscore Blog via Flickr – CC BY-SA 2.0
Russell Bernice via Flickr – CC BY 2.0



Sunday 22 November 2020

Berzerk (1980)

 Launched November 1980

The golden age of arcade games features many classic games that are still fondly remembered and played today, but one that has been somewhat forgotten is Berzerk, launched by Stern Electronics in November 1980.

The basic gameplay of Berzerk was that the player was trapped in a series of mazes and had to shoot robots that were shooting back, while an indestructible enemy called Evil Otto would eventually turn up. Getting shot by a robot, walking into a robot, being too close to a robot when it was shot, bumping into the electrified walls or being caught by Evil Otto would all result in death. The player did have the option of escaping to another of the thousands of possible rooms to fight again.


Berzerk

Although the gameplay was novel, what really made Berzerk stand out was speech synthesis. The game itself was powered by the popular Zilog Z80 processor (clocked at 2.5MHz), but the speech was provided by a TSI S14001A chip that had originally been designed for a talking calculator. The game used speech to attract players with “coin detected in pocket” and then chattered and taunted the player throughout the game.

Berzerk was a hit – gaining some notoriety along the way – and was successfully ported to the Atari and Vectrex gaming consoles. Stern followed up with a successful partnership bringing Konami games to US markets, but by the mid-1980s the bottom dropped out of the market in a huge crash that took Stern with it.

Ports and conversions of Berzerk exist to this day and aren’t hard to find, but collecting arcade machines themselves is a bit of a niche hobby. We found an original cabinet in the US for $1200, which is not much compared to some better known games.

Perhaps the lasting legacy of Berzerk was that it helped to kick-start voice synthesis, although it took decades for this to become good enough for everyday use. Stern themselves were resurrected as a brand in 1999 when the assets of Sega’s pinball operations were spun off, and Stern Pinball make machines to this day.

Image credit:
The Pop Culture Geek Network via Flickr - CC BY-NC 2.0


Sunday 15 November 2020

Super Nintendo Entertainment System (1990)

 Introduced November 1990 (Japan)

The best-selling 16-bit gaming console, the Super Nintendo Entertainment System (or just “SNES”) ruled the roost in the early 1990s, despite an epic battle with the Sega Mega Drive. Relatively late to the market, the SNES owed much of its success to better games and good marketing.

Nintendo SNES (PAL Version)

Models varied throughout the world, with different cases in North America, Japan and Europe, incompatible cartridge slots and region locking in both hardware and software. In Japan the console was called the “Super Famicom”. At its heart was an unusual 16-bit development of the venerable 6502 processor, the Ricoh 5A22 – the previous-generation Nintendo Entertainment System had used another 6502 derivative, the Ricoh 2A03.

A wide range of colour graphics modes, an impressive audio subsystem called S-SMP (made by Sony) and ergonomically designed controllers made the SNES a capable hardware platform. But with games consoles, that’s just one of the ingredients you need for success.

What the SNES did have was games... lots of games. Super Mario, Mario Kart, Donkey Kong, Final Fantasy, Street Fighter, Mortal Kombat, SimCity and many more were available. Many games were exclusive to Nintendo platforms, while some (such as Mortal Kombat) were available on several. Nintendo did insist on elements of graphic violence being removed from games, which made them more family-friendly but ultimately probably lost them sales.

Depending on the market, the SNES was around for about a decade, selling an astonishing 49 million units compared to the Mega Drive’s 32 million or so. A few revisions were made to the hardware, along with quite a lot of hardware expansions that found only a limited audience, but ultimately the success of the SNES continued well into the 32-bit console era.

There’s a healthy retro gaming community around the SNES – used units are inexpensive, although game cartridges – especially rare ones – can be worth much more than the systems themselves. In 2017, Nintendo released the Super NES Classic Edition – a modern take on the classic console. There are also emulators and other reimagined versions out there – even after 30 years, the SNES still has the power to captivate gamers.

Image credit: JCD1981NL via Wikimedia Commons - CC BY 3.0

Saturday 7 November 2020

DEC PDP-1 (1960)

First unit shipped November 1960

In 1960, the Massachusetts firm Digital Equipment Corporation (typically known as “DEC” or just “Digital”) was just three years old. Founded by two ex-MIT engineers, DEC’s earliest products were small computing modules on cards that could be used in labs, the design of which was based on work originally done at MIT.

The company advanced quickly, and in 1959 it announced a minicomputer based on these modules, which it called the PDP-1. “PDP” stood for “Programmed Data Processor” – according to legend this was because DEC was afraid that IBM might sue if they called it a “computer”, but in reality it seems this approach was taken because computer companies were regarded as high-risk by investors, whereas lab equipment companies were less so.

DEC PDP-1

Although clever and flexible in approach, the modules still consisted of individual transistors and other components soldered to a board, which meant the overall system was quite large (weighing hundreds of kilograms). It was an 18-bit computer with the equivalent of just 9.2KB of memory; mass storage was provided by paper tape, and output could go to a printer or a weird circular CRT that had basically been transplanted from a radar system.

Prices started at $130,000 for a basic system (around a million dollars today) which made it a feasible purchase for well-funded labs, universities and tech-oriented companies. It was essentially a single-user system until the development of timesharing OSes such as the one from BBN. As a comparison, an IBM 7090 mainframe computer of the same era cost around $2.9 million ($20 million today), and although the PDP-1 was much less powerful it was much more flexible and it could be programmed to do things that were impossible on a tightly buttoned-down mainframe.

Things that you couldn’t do on a mainframe but could do on a PDP-1 included playing synthesised music and a key early video game – Spacewar!. A precursor of much later video games such as Asteroids, Spacewar! used the PDP-1’s Type 30 vector display to show two opposing spacecraft fighting each other around the gravitational pull of a central star. The Type 30’s long-persistence phosphor added to the space-age gameplay, and of course it became a major reason for people to book time on their department’s PDP-1 for (ahem) “research”.

Spacewar!

DEC were still a fledgling company, and although the PDP-1 was a success it shipped only 53 units. However, just five years later DEC was growing quickly: the 12-bit PDP-8, launched in 1965, shipped an astonishing 50,000 units over its lifetime, and five years after that came the 16-bit PDP-11, which eventually shipped 600,000 units. As an aside, DEC also made 36-bit and 32-bit computers in the 60s and 70s, to add to the confusion.

The architecture used in the PDP-1 turned out to be a dead end, finishing in 1970 with the PDP-15. However, it helped start a revolution in small-scale computing which – after nearly two decades – allowed everybody who wanted one to have a computer on their desk. Unfortunately for DEC, they never could successfully make the transition from minis to micros, but that is another story.

Image credits:
M.Hawksey via Flickr – CC BY 2.0
Kenneth Lu via Flickr – CC BY 2.0


Thursday 5 November 2020

Apple III (1980)

Introduced November 1980

The legendary Apple II series was one of the most successful computers ever, selling millions of units in a 16-year run from 1977 to 1993. But despite that success, Apple repeatedly tried to kill off the Apple II and replace it with something better. The Apple III was the first attempt… and one of the most disastrous.

Apple III with Profile Hard Disk

It was released to the market in November 1980, and on paper it looked like it should succeed – based on the 6502 processor like its predecessor, it had 128KB of RAM, supported lowercase characters (unlike the Apple II), offered built-in colour graphics and 80-column text, and ran a new operating system called Apple SOS. A floppy disk drive was included as standard, and the Apple III also had sound.

Despite the similarity in hardware, the Apple III wasn’t compatible with the Apple II in normal operation, but it did have a special emulation mode it could be booted into. This was where one of the Apple III’s many poor design decisions showed – it could only emulate an Apple II with 48KB of memory, when a lot of software demanded the full 64KB. There wasn’t any good technical reason for this either; it was purely a marketing decision. Access to other advanced parts of the Apple III hardware (such as the 80-column display and lowercase letters) was deliberately blocked too, and the memory limitation meant that the Apple III did not make a very good Apple II.

Unfortunately the problems ran much deeper. The physical design of the Apple III had been determined by the marketing team, not the engineering team, so when it came to squeezing all the parts in, the engineers couldn’t make everything fit. To get round this, the Apple III motherboard was designed with very narrow tracks and a large number of soldered-on components to save space. However, the quality of the manufacturing process was not up to such precise work, and a large proportion of Apple IIIs were faulty out of the box.

Overheating was a major problem too – in order to keep noise and size to a minimum, the III had no fan and instead was meant to dissipate heat into the aluminium casing. The heat would lead to system crashes and, according to legend, chips popping out of their sockets and melting disks. Other hardware problems included a troublesome real-time clock which would eventually fail and was difficult to replace.

A chronic lack of software was combined with a lack of documentation, making it hard for independent developers to create new software and hardware. By comparison, the Apple II had a huge software library, and the system was well-documented and easy to work with.

The launch was a complete disaster, and it soon became clear that Apple hadn’t done any real testing on the machine and that the product was unfit for purpose. Sales were quickly suspended and all the sold units were recalled for re-engineering. A redesigned circuit board (and the removal of the troublesome clock) fixed most of the issues, and the Apple III was re-introduced to market a year later, in November 1981.

Apple III Plus


Consumers were not keen on the revised model at all: the reputation of the early units was so bad that it tainted the newer and more reliable models. A lightly revised Apple III Plus, launched in December 1983, did not turn sales around either, and in April 1984 the entire product line was permanently cancelled.

In three and a half years of on-off sales, the Apple III shipped only about 120,000 units – and many of those were replacements for faulty ones. The entire project lost tens of millions of dollars, but fortunately for Apple the company survived thanks to buoyant sales of the Apple II. Apple’s next main product launch following the Apple III was the Apple Lisa, which was also a high-profile commercial failure.

It took more than a decade to kill off the Apple II – the Mac LC from 1990 helped – by which time the Apple III had been long forgotten. Today, Apple III systems are rare but usually cost only a few hundred dollars in working order.

Image credits:
Adam Jenkins via Flickr - CC BY 2.0
Tellegee via Wikimedia Commons - CC BY-SA 4.0


Tuesday 27 October 2020

Apple Macintosh LC (1990)

Introduced October 1990

The Macintosh LC helps to demonstrate the tricky situation that Apple found itself in at the beginning of the 1990s. On one hand, sales of the Macintosh were doing well, with a continually evolving product line including the powerful colour Macintosh II range. On the other hand, a large slice of their sales were still to educational markets, which very much favoured the ancient Apple II platform – development of which had continued into the late 1980s with the Apple IIc Plus, the IIe Platinum and even a 16-bit version, the Apple IIGS.

Not unreasonably, Apple wanted to move this market on from warmed-over products of the late 1970s. Schools in particular demanded colour, but the Macintosh II platform was very expensive and the cheaper Macintosh Classic platform was monochrome-only. The challenge was to create a colour Mac that didn’t cost the earth, and the LC was created in response to that challenge.


Apple Macintosh LC

Although half the price of the contemporary Macintosh IIx, the LC was crippled in performance terms by its out-of-date Motorola 68020 processor, a 16-bit internal bus and a maximum of 10MB of RAM. Graphics capabilities were more limited than the Macintosh II’s, leading to some compatibility problems, and internal expansion options were narrower too.

Still, it was a Mac, and if you wanted a colour Mac but didn’t have the substantial amount of cash needed for a Mac II then the LC was the way to go. It turned out that a lot of people wanted a colour Mac very badly, and they were prepared to put up with the performance hit that the LC came with. So despite everything, the LC was a sales success.

Although it was cheap compared to the more than $7000 demanded for the IIx, the base unit of the LC had a list price of $2400 – more than four times the price of an Apple II, but around the same price point and market segment that 80386SX PCs were selling into. Schools stubbornly stuck with the Apple II though, which soldiered on until 1993. After that point you would need an Apple IIe card (another $250) in your LC-series Mac to run Apple II programs.

It was always going to be a tricky transition – the LC certainly took sales away from the Mac II, yet it wasn’t the budget computer that could replace the Apple II. Performance was an issue, made all the more frustrating because the LC could easily have been made faster for a little more money.

But perhaps the biggest problem was fragmentation… by the end of 1990 there were four different models of Macintosh II on sale, plus the Mac Classic, the SE/30, the esoteric Portable and the LC. In a few years’ time Apple would have 20 or more competing products, which confused consumers and betrayed a lack of direction within the company that nearly led to its bankruptcy in the late 90s.

The LC itself didn’t last long, being replaced by the similar 68030-based LC II in 1992 and finally getting the performance it needed with the LC III in 1993. Surprisingly, prices for an original LC in decent condition can easily reach a few hundred pounds, whereas an equivalent 386SX PC of the same era is basically landfill. Old Apple devices are quite collectable – but really you want to find an Apple I in a cupboard rather than a humble Mac LC…

Image credit:
Jay Tong via Flickr - CC BY-ND 2.0



Saturday 24 October 2020

Epson MX-80 (1980)

Introduced October 1980

It was the noise you remember most of all: the slow, tortured screeching of metal pins hammering thousands of tiny black dots onto paper for what seemed like an eternity. When you were in a room full of them, the sound was cacophonous but distinctive. Not quite the rat-a-tat machine-gun fire of a daisy wheel or the howled whistling of a modem, the din of a dot matrix printer is thankfully something seldom heard today.

The Epson MX-80 is probably the key product of its type. Introduced in October 1980, this 9-pin dot matrix printer was capable, reliable, compact and lightweight. Most of all, it was successful – the MX-80 and the Epson printers that followed it defined the way most of us printed for more than a decade.

Elegantly but practically designed, the MX-80 used a simple 9 x 9 matrix to produce (typically) 80 columns of text. Nine pins meant that the printer could produce “true descenders” on letters such as p and g, unlike the 8 x 8 matrices found on most computers, which could struggle. Italics, bold characters and underlining were all supported, and the printer could also produce custom graphics with a little effort.
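
The descender point is easiest to see as a bitmap. Here’s a toy 9-row glyph for a lowercase “p” – the pattern is made up rather than taken from Epson’s character ROM – where the bottom rows are exactly the ones an 8-pin head doesn’t have:

```python
# A toy 9x5 dot-matrix glyph for lowercase "p". The bottom rows hold the
# descender -- exactly the rows an 8-pin printhead lacks, which is why
# 8-pin printers squashed p, g, q and y up into the main character cell.
# The bit pattern is invented for illustration, not Epson's character ROM.

GLYPH_P = [
    0b00000,
    0b00000,
    0b11110,   # bowl of the "p"
    0b10001,
    0b10001,
    0b11110,
    0b10000,   # rows 7-9: the descender, only possible with 9 pins
    0b10000,
    0b10000,
]

for row in GLYPH_P:
    print("".join("#" if row & (1 << (4 - col)) else "." for col in range(5)))
```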

Epson MX-80

The MX-80 was a bidirectional printer with logic seeking, which meant faster print times at up to 80 characters per second. On the standard model the paper was tractor-fed from a box of fanfold paper (also called continuous stationery) on the floor, typically allowing for up to 2000 sheets before the slightly awkward task of feeding in some more.
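
“Logic seeking” just means the printer plans the shortest head movement for each pass: blank lines get a line-feed only, and each printed line starts from whichever end is nearer the head. A rough sketch of the idea (purely illustrative, not Epson’s actual firmware logic):

```python
# A sketch of "logic seeking": rather than returning to the left margin for
# every line, the head starts each pass from whichever end of the printed
# text is nearer, and blank lines get a line-feed only. Purely illustrative,
# not Epson's actual firmware logic.

def plan_passes(lines):
    head = 0           # current head column
    passes = []
    for line in lines:
        if not line.strip():
            continue                           # blank line: no head movement
        left = len(line) - len(line.lstrip())  # first printed column
        right = len(line.rstrip()) - 1         # last printed column
        if abs(head - left) <= abs(head - right):
            passes.append(("left-to-right", left, right))
            head = right
        else:
            passes.append(("right-to-left", right, left))
            head = left
    return passes

page = ["Dear Sir,", "", "          Thank you for your letter.", "Yours,"]
for p in plan_passes(page):
    print(p)
```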

You would typically hook the MX-80 up to a small computer through a chunky parallel cable, which needed no real setup. Serial options were available, and the MX-80 F/T had a friction feed so it could use cut-sheet paper. A larger MX-100 had a wider carriage for bigger paper.

It was cheap to run – mostly you’d just need a new ribbon from time to time – although the printer itself was quite expensive to buy. By modern standards it was very slow, taking up to a couple of minutes for each page. And then there was the paper…

Fanfold paper is all connected together, making one continuous piece of paper in which the pages are perforated so that they can be pulled apart. Most fanfold paper also has perforations along the sprocket holes on either side. If you’d waited a couple of hours for a really big document to print out, you might then want to split the paper into A4 sheets – an agonising manual process that involved carefully pulling the pages apart. If you didn’t pay attention you could rip a page, and you might have to reprint it (if it was even possible to print a single page in whatever application you were using).

Epson MX-80 and Apple II

Despite all the drawbacks, it was a useful device and consumers loved it, not least because it was very cheap to run. IBM loved it too, and rebadged it as the IBM 5152 to go alongside the IBM PC. The MX-80 spawned a number of 9-pin and 24-pin successors, and despite most modern printers being lasers or inkjets, you can still buy Epson dot-matrix printers today. But where are they used?

Dot matrix printers can be found in industrial environments, warehouses and also in the aviation industry. A combination of ruggedness, reliability and low maintenance costs outweigh the slow speed, low quality and noise in those environments. Although you would be unlikely to find an MX-80 still in operation after all of these years, you can still find many examples of its descendants.

Image credits:
Cushing Memorial Library via Flickr – CC BY-NC-ND 2.0
Nakamura2828 via Wikimedia Commons – CC BY-SA 3.0


Tuesday 20 October 2020

3Com Audrey (2000)

Introduced October 2000

These days we are glued to our mobile devices, wanting to access the web and applications wherever we are. For many, the idea of having to sit at a computer to use the internet seems old-fashioned. Smartphones, tablets… and yes, maybe a laptop if it’s something serious.

Twenty years ago however, the internet was not a mobile experience. Cables, dial-up modems and old-fashioned PCs were the way to go. If you wanted to access the web while you were in – say – the kitchen, you’d be out of luck.

3Com had an idea. One of the stars of the dot com boom, they created a device that could let you access the internet from almost any room you wanted. Using cutting-edge technology, and undoubtedly with an eye to the future, they introduced the 3Com Ergo Audrey (or just the “3Com Audrey”). 3Com weren’t really in consumer computing, except through their Palm subsidiary, so this was a new market for them.

It looks like a dangerous environment but loads of us are working from home in worse

In some ways the Audrey was like a modern tablet – an 8” VGA-resolution touchscreen display was complemented by a compact design, a wireless keyboard and USB expansion ports. The beating heart of the Audrey was the QNX operating system, a Unix-like platform designed to run on small devices. The Audrey could access the web, email, personal information management tools and a variety of push content that could be selected by a rather retro knob. You could sync it with up to two Palm PDAs (back in the days when a Palm Pilot was still a big deal). For added quirkiness, the stylus plugged into the top of the unit and it lit up when you received an email. It sounded great, but there were some drawbacks.

Internet access was via a dial-up modem, although you could plug an Ethernet adapter into a USB port if you had Ethernet at home. You also had to power it from a wall socket, so it was tied to both the mains and a phone or network port. At $499 it was quite pricey, and it didn’t have the power of a contemporary laptop. And – probably most of all – people just didn’t need to use the internet all the time from anywhere.

3Com envisaged the Audrey (named after Audrey Hepburn) as being the lead device in their new “Ergo” sub-brand, including a “Mannix” device for the living room and presumably an “Andrex” device for the toilet. Presumably then you’d need a phone line in every room, which would be a pest. Wireless networking was technically a thing back in 2000, but it wasn’t widespread.

Is the kitchen the natural environment for the Audrey? Or the box it came in?

Still, this was pretty exciting stuff and of course 3Com hadn’t seen the iPad which was still a decade away, so it all sort of made sense. If you hadn’t seen the iPad. Which they hadn’t. Which was a problem really, because in hindsight it was the iPad that consumers really went for, followed by ridiculously oversized smartphones. Consumers wanted a cheese soufflé, 3Com offered them a soggy omelette instead.

It was a moderate success, but the dot com crash sent investors running away from tech shares and 3Com cancelled the Audrey and all the other Ergo devices in the summer of 2001. Some of the unsold inventory ended up in the hands of hackers who discovered how to jailbreak QNX and do their own things with it, turning these remaindered devices into something like a prototype Raspberry Pi.

Twenty years on, the Audrey is largely forgotten, along with 3Com, who eventually vanished without trace into HP (along with their one-time subsidiary Palm). QNX was bought by RIM and turned up in the catastrophic BlackBerry Z10 in 2013, but despite that it still successfully soldiers on in embedded systems. As for the Audrey, you can pick one up for a few tens of dollars from sellers in the US, but if you’re importing one you’ll need to watch the voltage on the power supply.

Image credits:
Andrew Turner via Flickr - CC BY 2.0
Alessandra Cimatti via Flickr - CC BY 2.0



Sunday 11 October 2020

Going nowhere: Windows Phone 7, Bada, Symbian, BlackBerry OS and webOS (2010)

Announced October 2010

It’s October 2020 and if you have a smartphone in your pocket it’s almost certainly going to be one of two things: an Apple iPhone or an Android device. It seems like it has been that way for ever, but ten years ago this month rival platforms were duking it out as if they had some sort of chance.

The big news ten years ago was Windows Phone 7. Microsoft had been haemorrhaging market share ever since the iPhone was launched. Earlier versions of Windows Mobile (as it was then called) had been capable enough, but the user interface was a frankly horrific relic from an earlier age. Stung by failure, Microsoft decided to redesign the product entirely and came up with something entirely different.

Windows Phone 7 was a huge critical success in interface design. Clean, responsive, informative and intuitive at the same time, it made Android and iOS look old-fashioned. iOS in particular was wedded to the skeuomorphic design that had been an Apple hallmark for decades, but by contrast Windows looked utterly modern.

Microsoft made the decision to base Windows Phone 7 on the same underlying Windows CE platform that had powered previous generations, rather than the more modern Windows NT platform that could have delivered the same power as Android and iOS. The next generation of Windows Phone – version 8 – would change the underlying platform while retaining the UI… but that is another story.

There was certainly some buzz about Windows Phone 7 though, and a lot of manufacturers had lined up behind Microsoft to push the new platform. HTC were the keenest, with several new devices – the HTC 7 Pro, HTC 7 Trophy, HTC 7 Mozart, HTC Surround and the high-end HTC HD7. Samsung resurrected the Omnia sub-brand for the Samsung Omnia 7, while rivals LG had the LG Optimus 7 and LG Optimus 7Q. Even Dell got in on the act with the Dell Venue Pro.


A trio of doomed Windows Phone 7 devices

The operating system was sleek, and the phones were pretty good and competitively priced. But of course Windows Phone failed, mostly because it lacked the apps that Android and iOS had. And perhaps partly because… who actually needed another mobile phone OS anyway?

But Windows Phone 7 wasn’t the only doomed platform being touted this month. Samsung had also developed the Unix-like Bada operating system for use in smartphones, and the Samsung Wave II was the company’s flagship Bada phone… again, a sleek operating system, competitively priced, with excellent hardware. Samsung had tried hard to get apps for the platform and had done reasonably well. But still… it wasn’t Android or iOS. But it did at least feature the somewhat infamous Rick Stivens.


Samsung Wave Goodbye might have been a better name

Bada didn’t last long, being folded into Tizen in 2012. Tizen itself was the effective successor to a medley of other Unix-like platforms: Maemo, Moblin, MeeGo and LiMo. Tizen found itself ported to a wide range of smartwatches and the Samsung Z range of smartphones up to 2017, when eventually they fizzled out. But Tizen didn’t die, instead becoming the most popular operating system in smart TVs.

Another relatively new kid on the block was Palm’s webOS platform, found in the Palm Pre 2. Stop me if you’ve heard this before… the phone had a combination of good hardware, a great OS, competitive pricing and a reasonable set of apps, which sadly couldn’t compete with the market leaders. The Pre 2 was the last smartphone to be launched under the Palm name (apart from the peculiar Palm Palm), but less than a year after the Pre 2 the entire webOS product line was cancelled by Palm’s owners, HP.
Palm Pray might also have been a better name

The excellent webOS operating system lingered on, with HP attempting to open-source it. Eventually it was picked up by LG, who applied it to smart TVs and smartwatches, in a direct parallel to Samsung and Tizen.

Three doomed platforms is surely enough? Not quite.

The Nokia C5-03 was a nice enough low-cost Symbian touchscreen smartphone. Unlike the others, this one had a really good library of apps, it was attractively priced and designed, and because Symbian had been around for donkey’s years there had been a process of continual improvement and an established base of fans. But Nokia were on the verge of a spectacular collapse, and Symbian would effectively be dead within a year.
Inexpensive but doomed smartphone fun with the Nokia C5-03.

Windows Phone 7, Bada, webOS and even the ageing Symbian were all modern platforms that could deliver the sort of experience customers might want. In comparison, the BlackBerry OS on the Bold 9780 was not. BlackBerry’s efforts at repeatedly warming over an OS that was nearly a decade old had created a device that was pretty good at email and absolutely appalling for web browsing or for running any of the meagre collection of apps that were available.
BlackBerry missed the memo about what a 2010 smartphone should be

It sold pretty well into corporations that had standardised on BlackBerry, but users hated it – instead choosing to use their own iOS and Android devices which they expected their companies to support, leading in turn to the idea of BYOD (“bring your own device”). BlackBerry did eventually come up with a vaguely competitive smartphone… in 2013, a full six years after the iPhone was announced.

Today, if you want a smartphone without Android or iOS then the pickings are fairly slim. But Huawei – currently the world’s number two smartphone manufacturer – is working on the Linux-based Harmony OS to replace Android. This move is mostly due to trade sanctions from the US, but Harmony is also available as open source, or alternatively Huawei will licence their closed-source version to other manufacturers. Who knows, perhaps this rival OS will be a success?

Image credits: HTC, LG, Samsung, BlackBerry, Dell, HP, Nokia.

Tuesday 29 September 2020

Nokia E7 (2010)

 Introduced September 2010

Looking back at Nokia, there was a point in its history where it slipped from market leader to market failure. The Nokia E7 sits on the cusp of that change.


Nokia E7

The Nokia E7 was the spiritual successor to Nokia’s long-running range of Communicator devices. Big and often bricklike, the Communicators had bigger screens than almost any other phone, combined with a large physical QWERTY keyboard. Oddly, the Communicators often lagged behind in terms of features – the Nokia 9210i lacked GPRS when rivals had it, for example, and the Nokia 9500 didn’t have 3G just when it was becoming common. It wasn’t really until the E90 in 2007 that the range caught up in communications terms… a bit odd given the “Communicator” name.

Anyway, more than three years had elapsed since the launch of the E90, and in that time Apple had released its fourth-generation iPhone and Android devices were eating into Nokia’s market share. Many Nokia fans who had bought the E90 had moved on to other smartphone platforms. Nokia needed something special, and it looked like the E7 could be it.

Sleekly designed, the most prominent feature of the E7 at first glance was its huge 4” AMOLED capacitive touchscreen display, bigger than almost anything else on the market at the time. Hidden underneath was a large keyboard, revealed by sliding the screen. A decent 8 megapixel camera was on the back, and the E7 supported 3.5G data and WiFi. GPS was built in, along with an FM radio and 16GB of internal storage.


Nokia E7

All of this was powered by Symbian^3, the latest version of Nokia’s S60 operating system. This supported all the usual applications plus document editors, comprehensive email support, built-in navigation and excellent multimedia capabilities. It was quite possibly the best Symbian phone that Nokia ever made. But since nobody uses Symbian today, something must have gone wrong…

Nokia had been both very early and very late to the touchscreen smartphone party. The Nokia 7710 had launched in 2004 – years before the iPhone – but the technology wasn’t quite there and consumers stayed away. Nokia’s next attempt at a mainstream touchscreen smartphone was the 5800 XpressMusic, which launched waaay after the iPhone. The 5800 proved popular, but it was pushing S60 almost as far as it could go.

Symbian was meant for an earlier era of handheld computing. First appearing in 1989 as Psion’s EPOC operating system (as seen on the Psion Series 5, for example), it was designed to run smoothly on minimal hardware. The Series 5 had a puny 18MHz processor and 4MB of RAM, but by the time the E7 was launched the hardware had grown to a 680MHz processor and 256MB of RAM… hardware which in theory could run something more sophisticated. Rival Apple and Android devices both ran operating systems descended from Unix, which allowed a much richer environment for software developers. Developing for Symbian was harder, but it was worthwhile because even in 2010 Nokia’s OS was the market leader – even if it was beginning to fade.


Nokia E7

It wasn’t as if Nokia lacked an alternative – the 770 Internet Tablet had launched in 2005 running Maemo, Nokia’s own take on a Unix-like OS. But it wasn’t a phone, and development of the platform was slow; eventually they came up with a practical, if somewhat rough-around-the-edges, smartphone in the Nokia N900. It looked likely that whatever succeeded the N900 would be a winner, but instead Nokia decided to merge Maemo with the Intel-led Moblin platform… a decision which completely derailed the strategy to replace the N900.

Stuck with the limitations of Symbian and with no next-gen product on the horizon, Nokia’s future was beginning to look uncertain. Even though sales were strong, it wasn’t clear how the company could compete in the long term. As it happened, just a few days before the announcement of the E7, Nokia also announced a new CEO – Stephen Elop.

Elop realised the predicament they were in and explained it to Nokia employees in the now-infamous “burning platform” memo, which was leaked to the press. Ultimately Elop wanted to move Nokia away from Symbian and Maemo/MeeGo towards Microsoft’s new Windows Phone 7 OS. This was a bold move, as Microsoft’s platform was very new… and Microsoft themselves had lost market share sharply to Apple. The plan was that Symbian would eventually be discontinued, with Nokia hoping for a gradual transition of customers from Symbian to Windows. But that’s not what happened.

Dubbed shortly afterwards the “Elop Effect”, the impact on Nokia’s sales was disastrous. Elop had effectively made Symbian a dead-end platform, and that killed off pretty much any market appeal it had left. Sales fell through the floor, and worse still, Nokia didn’t have a product to replace it (the first Lumia handset launched late in 2011). Far from being a smooth transition from one platform to another, it simply persuaded Symbian fans to jump ship… mostly to Android.

Less than six months after the announcement of the E7, Symbian was effectively dead. A trickle of new Symbian devices emerged from Nokia with the last mainstream handset launched in October 2011 and the final ever handset being launched in February 2012. None of them sold well. But then neither did the Windows phones that followed.

The E7 marks the point when Nokia’s seemingly invincible empire crumbled. The last high-end Symbian smartphone and the last of the Communicators, the E7 might have been a game changer if it had been launched three years earlier. Today the E7 is quite collectable, with prices for decent examples starting at £60 or so and rising into the low hundreds for ones in really good condition.

Image credits: Nokia

Wednesday 23 September 2020

TRS-80 Color Computer (1980)

Introduced September 1980

The original TRS-80 (launched in 1977) was one of the “holy trinity” of early consumer-friendly microcomputers, along with the Apple II and Commodore PET. Capable though the original was, it lacked the colour and sound that the next generation of home micros would provide, so in September 1980 Tandy Radio Shack launched the TRS-80 Color Computer.

It had almost nothing in common with the original TRS-80 Model I except the name. Crucially, the Color Computer (often called the “CoCo”) didn’t have the Z80 processor that gave the Model I the “80” in its name; instead it used a Motorola 6809. Indeed, the whole thing was more Motorola than Tandy – the basis of the CoCo was a Motorola-designed Videotex terminal which Tandy joined in manufacturing and marketing.

TRS-80 Color Computer 1
 

The 6809 was a bit more sophisticated than the Z80 and the rival 6502, while the more powerful Motorola 68000 was still an expensive and rather niche device. The CPU was combined with a Motorola MC6847 graphics chip, and there was an optional sound and speech cartridge.

Although the CoCo had pretty powerful graphics capabilities, it was complex to get the most out of them, and the machine had some odd quirks, such as being unable to display pure white and lacking lowercase character support. At launch the CoCo came with 4, 16 or 32KB of RAM, but later models shipped in 16 or 64KB configurations, and the last series of Color Computers could support up to 512KB… and, wonder of wonders, even lowercase text.

Over eleven years the hardware evolved somewhat, with three distinct series of computers being made with different case colours, detailing and keyboards. The third and last series had improved graphics, built-in software and better support for peripherals. The larger memory also allowed the machine to run OS-9, bringing a sophisticated, modern operating system to this fairly simple 8-bit machine.

TRS-80 Color Computer 2

Production ended in 1991, which wasn’t bad for an 8-bit machine. It was more popular in North America than in Europe, but the same Motorola reference platform emerged in the somewhat CoCo-compatible Dragon 32 and Dragon 64 a couple of years later.

For collectors, the CoCo isn’t an expensive buy and is commonly available in the US – though US machines run on a different voltage and use different video standards to European ones. Plenty of software emulators are available if you fancy tinkering on more modern hardware.

Image credits:
Adam Jenkins via Flickr - CC BY 2.0 - [1] [2]


Tuesday 15 September 2020

Amstrad GX4000 (1990)

Introduced September 1990

During the late 1980s Amstrad had been on a roll. The Amstrad CPC range had taken a respectable share of the home computing market, the cheap all-in-one PCW wordprocessor had been a remarkable success for small businesses and home users, the PC-compatible PC1512 and PC1640 had sold in huge quantities and Amstrad had bought out arch-rival Sinclair to produce their own take on the iconic ZX Spectrum micro.

Not everything had been a success. The deeply strange portable PCs – the PPC 512 and PPC 640 – proved to be a high-profile flop. Worse still, the next-generation PC2000 series which had been launched to great acclaim ended up as a disaster with a batch of faulty hard drives significantly damaging Amstrad’s reputation.

Amstrad’s success had been built on offering quality devices at bargain prices, typically by finding ways to drive down costs. The CPC computers were a good example: a home computer, monitor and storage device starting at £399, all inclusive. Amstrad leveraged their relationships with makers of TV tubes and cassette players to gain a price advantage, and the inclusion of the cheap-but-capable Z80 processor drove down costs further. Amstrad chose to use the CPC platform for their next venture.


Amstrad GX4000

The Amstrad GX4000 was essentially a games console version of the CPC. Stripped of the cassette drive, TV and keyboard, the GX4000 used cartridges and hooked up to a domestic TV. Still running a Z80 with 64KB of RAM, the console was modestly specified even by 1990 standards… but at just £99 it was really cheap.

It was an elegantly packaged device, with two slightly creaky games controllers attached and video output via RF, SCART or an Amstrad DIN connector for a CPC monitor. You could add a light gun or an analogue joystick too, but expansion options were pretty limited. Still, it was pretty capable for an 8-bit platform, and the related CPC had a huge variety of good quality games available for it. So it should have been a success? Not exactly.

By 1990 the 8-bit era that had dominated the 1980s was at an end. 32-bit home computers such as the Commodore Amiga had been established for some time, and the games console market itself was in the process of moving to 16-bit platforms such as the Sega Mega Drive. But technological obsolescence had never been a problem for Amstrad – a company that shipped CP/M computers well into the 1990s – because what interested them was value for money. And the GX4000 certainly seemed to have that.

But the GX4000 was a massive failure, and perhaps the key problem was games. CPC games on cassette cost a few pounds, where a GX4000 cartridge for the same game cost £25 (a quarter of the price of the console). Only a couple of games were available at launch, and a combination of manufacturing delays and high costs meant that just 27 games of varying quality were ever released. The 8-bit CPC platform that the GX4000 ran on wasn’t something gamers could get excited about either.

Perhaps if the GX4000 had been released a few years earlier, with more (and cheaper) games plus better-designed hardware, it might have been a success. As it was, the GX4000 was discontinued in 1991 having sold just 15,000 units. Of course, that makes this console quite collectable today, with examples in good condition going for up to £200 – a lot more than was paid for them in the first place…

Image credit:
Evan-Amos via Wikimedia Commons - Public Domain


Sunday 6 September 2020

Soft Toys (1880)

Introduced 1880

The soft toy in all its forms – from teddy bears to more exotic creatures altogether – is a favourite of children the world over, and of many adults too. Although toys have been found dating back to the earliest human civilizations, early examples tended to be made from wood or whatever materials were available – fun to play with, perhaps, but certainly not cuddly.

Generally speaking, the invention of the soft toy is attributed to Margarete Steiff in Germany. Steiff was a seamstress who noticed that children liked to play with the soft animal-themed pincushions she made. These were adapted into toys covered in felt and filled with lamb’s wool, first for the children of friends and relatives and then sold commercially from her shop in 1880. The first design was reportedly a toy elephant.

Elephant pin cushion in the Steiff Museum

These became a success and Steiff’s business grew as a result, but she always insisted on making the first prototypes herself to make sure that there were no problems. In 1897 her nephew Richard Steiff joined the company and designed a stuffed toy bear which initially was not successful – but then came the teddy bear craze of the early 1900s, which Steiff and other manufacturers were happy to supply.

Margarete Steiff became a successful businesswoman, and created the Steiff company which even today is probably the best-known creator of stuffed toys. And she achieved all this despite being crippled with polio from an early age which left her confined to a wheelchair and with limited mobility in one of her arms.

Today Steiff sits at the top end of the soft toy market, which has spawned many other well-known brands. The American Ty company – founded in 1986 – popularised Beanie Babies. Gund is another American company, specialising in mid-range bears and other soft toys that are characteristically understuffed to make them more cuddly. Yet another American company, Build-A-Bear Workshop, allows customers to custom-make their own bears in-store and accessorise them. Many other companies also make bears and soft toys, and there are small-scale operations producing high-end designer toys out of premium materials.


Ty Beanies
But a soft toy is more than just a bit of fabric with some stuffing in it. They are companions, faithful friends, a comfort in times of distress and so much more. Bears can be fearless too, recently taking over some of the streets of Paris in the COVID-19 lockdown. They are patient and non-judgmental. And even when they’ve been put away for years at a time, they are still pleased to be with you – no matter how long that might be. And they proliferate. Oh my, do they proliferate.


Les Nounours des Gobelins

Of course, soft toys can be highly collectable. At the high end, rare Steiff bears or Ty Beanies can sell for tens of thousands of pounds... but these soft toys will probably never be played with or even cuddled. More everyday soft toys can be found all over the place, and often there are interesting finds in charity shops looking for a new home. Just don’t let them completely take over your house.

Image credits:
Flominator via Wikimedia Commons - CC BY-SA 3.0
Caroline Léna Becker via Flickr – CC BY 2.0
frankieleon via Flickr – CC BY 2.0