Tuesday, 27 October 2020

Apple Macintosh LC (1990)

Introduced October 1990

The Macintosh LC helps to demonstrate the tricky situation that Apple found itself in at the beginning of the 1990s. On one hand, sales of the Macintosh were doing well with a continually evolving product line including the powerful colour Macintosh II range. On the other hand, a large slice of their sales were still to educational markets which very much favoured the ancient Apple II platform, development of which had continued into the late 1980s with the Apple IIc Plus, IIe Platinum and even a 16-bit version called the Apple IIGS.

Not unreasonably, Apple wanted to move this market on from warmed-over products of the late 1970s. Schools in particular demanded colour, but the Macintosh II platform was very expensive and the cheaper Macintosh Classic platform was monochrome-only. The challenge was to create a colour Mac that didn’t cost the earth, and the LC was created in response to that challenge.


Apple Macintosh LC

Although half the price of the contemporary Macintosh IIx, the LC was crippled in performance terms by the out-of-date Motorola 68020 processor, a 16-bit internal bus and a maximum of 10MB of RAM. Graphics capabilities fell short of the Macintosh II's, leading to some compatibility problems, and internal expansion was more limited too.

Still, it was a Mac, and if you wanted a colour Mac but didn’t have the substantial amount of cash needed for a Mac II then the LC was the way to go. And it turned out that a lot of people wanted a colour Mac very badly, and they were prepared to put up with the performance hit. So despite everything, the LC was a sales success.

Although it was cheap compared to the more than $7000 demanded for the IIx, the base unit of the LC by itself had a list price of $2400, more than four times the price of an Apple II. Still, it was around the same price as the 80386SX PCs selling into the same market segment. Schools stubbornly stuck with the Apple II though, which soldiered on until 1993. After that point you would need an Apple IIe card (another $250) in your LC-series Mac to run Apple II programs.

It was always going to be a tricky transition – the LC certainly took sales away from the Mac II, and it wasn’t quite the budget computer that could replace the Apple II either. Performance was an issue, a frustrating one given that the LC could easily have been made faster for a little more money.

But perhaps the biggest problem was fragmentation… by the end of 1990 there were four different models of Macintosh II on sale, plus the Mac Classic, the SE/30, the esoteric Portable and the LC. In a few years’ time Apple would have 20 or more competing products, which confused consumers and showed a lack of direction within the company – one that nearly led to its bankruptcy in the late 90s.

The LC itself didn’t last long, being replaced by the similar 68030-based LC II in 1992 and finally getting the performance it needed with the LC III in 1993. Surprisingly, prices for an original LC in decent condition can easily reach a few hundred pounds, whereas a 386SX PC of the same era is basically landfill. Old Apple devices are quite collectable, but really you want to find an Apple I in a cupboard rather than a humble Mac LC…

Image credit:
Jay Tong via Flickr - CC BY-ND 2.0



Saturday, 24 October 2020

Epson MX-80 (1980)

Introduced October 1980

It was the noise you remember most of all, the slow tortured screeching of metal pins hammering thousands of tiny black dots onto paper, for what seemed like an eternity. When you were in a room full of them, the sound was cacophonous but distinctive. Not quite like the rat-a-tat machine-gun fire of a daisy wheel or the howling whistle of a modem, the din of a dot matrix printer is thankfully something seldom heard today.

The Epson MX-80 is probably the key product of its type. Introduced in October 1980, this 9-pin dot matrix printer was capable, reliable, compact and lightweight. Most of all, it was successful – the MX-80 and the Epson printers that followed it defined the way most of us printed for more than a decade.

Elegantly but practically designed, the MX-80 used a simple 9 x 9 matrix to produce (typically) 80 columns of text. Nine pins meant that the printer could produce “true descenders” on letters such as p and g, unlike the 8 x 8 matrices found on most computers, which struggled with them. Italics, bold characters and underlining were all supported, and the printer could also produce custom graphics with a little effort.
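
These effects were selected with escape codes embedded in the text stream. As a rough illustration, here is a minimal Python sketch using standard ESC/P sequences from Epson's later documentation – the exact set an MX-80 understood depended on its ROM revision (italics, for instance, arrived with the Graftrax upgrades), and the /dev/lp0 device path is just an assumption for a Linux-style parallel port:

ESC = b"\x1b"

# Assumed device path for a parallel-port printer on Linux
with open("/dev/lp0", "wb") as printer:
    printer.write(ESC + b"E" + b"Emphasised (bold) text" + ESC + b"F\r\n")   # ESC E on, ESC F off
    printer.write(ESC + b"4" + b"Italic text" + ESC + b"5\r\n")              # ESC 4 on, ESC 5 off
    printer.write(ESC + b"-\x01" + b"Underlined text" + ESC + b"-\x00\r\n")  # ESC - 1 on, ESC - 0 off
    printer.write(b"\x0c")                                                   # form feed to eject the page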

Epson MX-80

The MX-80 was a bidirectional printer with logic seeking, which meant faster printing at up to 80 characters per second. On the standard model the paper was tractor-fed from a box of fanfold paper (also called continuous stationery) on the floor, typically allowing for up to 2000 sheets before the slightly awkward task of feeding in some more.

You would typically hook up the MX-80 to a small computer using a parallel port – a bulky cable, but one that needed no setting up. Serial options were available, and the MX-80 F/T had a friction feed so it could use cut-sheet paper. The larger MX-100 had a wider carriage for bigger paper.

It was cheap to run – mostly you’d just need a new ribbon from time to time – although the printer itself was quite expensive to buy. By modern standards it was very slow: at 80 characters per second, a dense 80-column, 66-line page is over 5,000 characters, so a page could easily take a couple of minutes once line feeds and head returns are counted. And then there was the paper…

Fanfold paper is one continuous piece of paper, perforated so that the pages can be pulled apart from each other. Most fanfold paper also has perforations along the sprocket holes on either side. If you’d waited a couple of hours for a really big document to print out, you might then want to split the paper into A4 sheets. This was an agonising manual process that involved carefully pulling the paper apart. If you didn’t pay attention you could rip a page, and you might have to reprint it (if it was even possible to print a single page in whatever application you were using).

Epson MX-80 and Apple II

Despite all the drawbacks, it was a useful device and consumers loved it, not least because of those low running costs. IBM loved it too, and rebadged it as the IBM 5152 to go alongside the IBM PC. The MX-80 spawned a number of 9-pin and 24-pin successors, and despite most modern printers being lasers or inkjets, you can still buy Epson dot matrix printers today. But where are they used?

Dot matrix printers can be found in industrial environments, warehouses and also in the aviation industry. A combination of ruggedness, reliability and low maintenance costs outweigh the slow speed, low quality and noise in those environments. Although you would be unlikely to find an MX-80 still in operation after all of these years, you can still find many examples of its descendants.

Image credits:
Cushing Memorial Library via Flickr – CC BY-NC-ND 2.0
Nakamura2828 via Wikimedia Commons – CC BY-SA 3.0


Tuesday, 20 October 2020

3Com Audrey (2000)

Introduced October 2000

These days we are glued to our mobile devices, wanting to access the web and applications wherever we are. For many, the idea of having to sit at a computer to use the internet seems old-fashioned. Smartphones, tablets… and yes, maybe a laptop if it’s something serious.

Twenty years ago however, the internet was not a mobile experience. Cables, dial-up modems and old-fashioned PCs were the way to go. If you wanted to access the web while you were in – say – the kitchen, you’d be out of luck.

3Com had an idea. One of the stars of the dot com boom, they created a device that could enable you to access the internet from almost any room you wanted. Using cutting-edge technology and undoubtedly with an eye to the future, they introduced the 3Com Ergo Audrey (or just the “3Com Audrey”). 3Com weren’t into consumer computing, except for their Palm subsidiary, so this was a new market for them.

It looks like a dangerous environment but loads of us are working from home in worse

In some ways the Audrey was like a modern tablet – an 8” VGA-resolution touchscreen display was complemented by a compact design, a wireless keyboard and USB expansion ports. The beating heart of the Audrey was the QNX operating system, a Unix-like platform designed to run on small devices. The Audrey could access the web, email, personal information management tools and a variety of push content that could be selected by a rather retro knob. You could sync it with up to two Palm PDAs (back in the days when a Palm Pilot was still a big deal). For added quirkiness, the stylus plugged into the top of the unit and it lit up when you received an email. It sounded great, but there were some drawbacks.

Internet access was via a dial-up modem, although you could plug an Ethernet adapter into a USB port if you had Ethernet at home. You also had to power it from a wall socket, so it was tied to both the mains and a phone or network port. At $499 it was quite pricey, and it didn’t have the power of a contemporary laptop. And – probably most of all – people just didn’t yet feel the need to be on the internet all the time from anywhere.

3Com envisaged the Audrey (named after Audrey Hepburn) as being the lead device in their new “Ergo” sub-brand, including a “Mannix” device for the living room and presumably an “Andrex” device for the toilet. Presumably then you’d need a phone line in every room, which would be a pest. Wireless networking was technically a thing back in 2000, but it wasn’t widespread.

Is the kitchen the natural environment for the Audrey? Or the box it came in?

Still, this was pretty exciting stuff and of course 3Com hadn’t seen the iPad which was still a decade away, so it all sort of made sense. If you hadn’t seen the iPad. Which they hadn’t. Which was a problem really, because in hindsight it was the iPad that consumers really went for, followed by ridiculously oversized smartphones. Consumers wanted a cheese soufflé, 3Com offered them a soggy omelette instead.

It was a moderate success, but the dot com crash sent investors running away from tech shares and 3Com cancelled the Audrey and all the other Ergo devices in the summer of 2001. Some of the unsold inventory ended up in the hands of hackers who discovered how to jailbreak QNX and do their own things with it, turning these remaindered devices into something like a prototype Raspberry Pi.

Twenty years on and the Audrey is largely forgotten, along with 3Com who eventually vanished without trace into HP (along with their one-time subsidiary Palm). QNX was bought by RIM and turned up in the catastrophic BlackBerry Z10 in 2013, but despite that it still soldiers on successfully in embedded systems. As for the Audrey, you can pick one up for a few tens of dollars from sellers in the US, but if exporting it you’ll need to watch the voltage on the power supply.

Image credits:
Andrew Turner via Flickr - CC BY 2.0
Alessandra Cimatti via Flickr - CC BY 2.0



Sunday, 11 October 2020

Going nowhere: Windows Phone 7, Bada, Symbian, BlackBerry OS and webOS (2010)

Announced October 2010

It’s October 2020 and if you have a smartphone in your pocket it’s almost certainly going to be one of two things: an Apple iPhone or an Android device. It seems like it has been that way for ever, but ten years ago this month rival platforms were duking it out as if they had some sort of chance.

The big news ten years ago was Windows Phone 7. Microsoft had been haemorrhaging market share ever since the iPhone was launched. Earlier versions of Windows Mobile (as it was then called) had been capable enough, but the user interface was a frankly horrific relic from an earlier age. Stung by failure, Microsoft decided to redesign the product from scratch and came up with something entirely different.

Windows Phone 7 was a huge critical success in interface design. Clean, responsive, informative and intuitive at the same time, it made Android and iOS look old-fashioned. iOS in particular was wedded to the skeuomorphic design that had been an Apple hallmark for decades, but by contrast Windows looked utterly modern.

Microsoft made the decision to base Windows Phone 7 on the same underlying Windows CE platform that had powered previous generations, rather than the more modern Windows NT platform that could have delivered the same power as Android and iOS. The next generation of Windows Phone – version 8 – would change the underlying platform while retaining the UI… but that is another story.

There was certainly some buzz about Windows Phone 7 though, and a lot of manufacturers had lined up behind Microsoft to push this new platform. HTC were the keenest with several new devices – the HTC 7 Pro, HTC 7 Trophy, HTC 7 Mozart, HTC Surround and the high-end HTC HD7. Samsung resurrected the Omnia sub-brand to come up with the Samsung Omnia 7, while rivals LG had the LG Optimus 7 and LG Optimus 7Q. Even Dell got in on the act with the Dell Venue Pro.


A trio of doomed Windows Phone 7 devices

The operating system was sleek, the phones were pretty good and competitively priced. But of course Windows Phone failed, mostly because it lacked the apps that Android and iOS had. And perhaps partly because… who actually needed another mobile phone OS anyway?

But Windows Phone 7 wasn’t the only doomed platform being touted this month. Samsung had also developed the Unix-like Bada operating system for use in smartphones. The Samsung Wave II was the company’s flagship Bada phone… again it was a sleek operating system, competitively priced with excellent hardware. Samsung had tried hard to get apps for the platform and had done reasonably well. But still… it wasn’t Android or iOS. But it did at least feature the somewhat infamous Rick Stivens.


Samsung Wave Goodbye might have been a better name

Bada didn’t last long, being folded into Tizen in 2012. Tizen itself was the effective successor to a medley of other Unix-like platforms: Maemo, Moblin, MeeGo and LiMo. Tizen found itself ported to a wide range of smartwatches and to the Samsung Z range of smartphones until 2017, when those fizzled out. But Tizen didn’t die, instead becoming the most popular operating system in smart TVs.

Another relatively new kid on the block was Palm’s webOS platform, found in the Palm Pre 2. Stop me if you’ve heard this before… but the phone had a combination of good hardware, a great OS, competitive pricing and a reasonable set of apps, which sadly couldn’t compete with the market leaders. The Pre 2 was the last smartphone to be launched under the Palm name (apart from the peculiar Palm Palm). But less than a year after the Pre 2, the entire webOS product line was cancelled by Palm’s owners, HP.
Palm Pray might also have been a better name

The excellent webOS operating system lingered on, with HP attempting to open-source it. Eventually it was picked up by LG, who applied it to smart TVs and smartwatches in a direct parallel to Samsung and Tizen.

Three doomed platforms is surely enough? Not quite.

The Nokia C5-03 was a nice enough, low-cost Symbian touchscreen smartphone. Unlike the others, this had a really good library of apps and it was attractively priced and designed; because Symbian had been around for donkey’s years there had been a process of continual improvement and an established base of fans. But Nokia were on the verge of a spectacular collapse, and Symbian would effectively be dead within a year.
Inexpensive but doomed smartphone fun with the Nokia C5-03.

Windows Phone 7, Bada, webOS and even the ageing Symbian were all modern platforms that could deliver the sort of experience that customers might want. In comparison, the BlackBerry OS on the Bold 9780 was not. BlackBerry’s efforts at repeatedly warming over an OS that was nearly a decade old had created a device that was pretty good at email but absolutely appalling for web browsing or for running any of the meagre collection of apps that were available.
BlackBerry missed the memo about what a 2010 smartphone should be

It sold pretty well into corporations that had standardised on BlackBerry, but users hated it – instead choosing to use their own iOS and Android devices which they expected their companies to support, leading in turn to the idea of BYOD (“bring your own device”). BlackBerry did eventually come up with a vaguely competitive smartphone… in 2013, a full six years after the iPhone was announced.

Today, if you want a smartphone without Android or iOS then the pickings are fairly slim. But Huawei – currently the world’s number two smartphone manufacturer – is working on the Linux-based Harmony OS to replace Android. This move is mostly due to trade sanctions from the US, but Harmony is also available as open source, or alternatively Huawei will licence their closed-source version to other manufacturers. Who knows, perhaps this rival OS will be a success?

Image credits: HTC, LG, Samsung, BlackBerry, Dell, HP, Nokia.

Tuesday, 29 September 2020

Nokia E7 (2010)

Introduced September 2010

Looking back at Nokia, there was a point in its history where it slipped from being the market leader to a market failure. The Nokia E7 sits on the cusp of that change.


Nokia E7

The Nokia E7 was the spiritual successor to Nokia’s long-running range of Communicator devices. Big and often bricklike, the Communicators had bigger screens than almost any other phone, combined with a large physical QWERTY keyboard. Oddly, the Communicators often lagged behind in terms of features – the Nokia 9210i lacked GPRS when rivals had it, for example, and the Nokia 9500 didn’t have 3G just when it was becoming common. It wasn’t really until the E90 in 2007 that the line caught up in communications terms… a bit odd given the “Communicator” name.

Anyway, more than three years had elapsed since the launch of the E90, and in that time Apple had released four generations of iPhone and Android devices were eating into Nokia’s market share. Many Nokia fans who had bought the E90 had moved on to other smartphone platforms. Nokia needed something special, and it looked like the E7 could be it.

The E7 was sleekly designed, and its most prominent feature at first glance was the huge 4” AMOLED capacitive touchscreen display, bigger than almost anything else on the market at the time. Hidden underneath was a large keyboard, revealed by sliding the screen. A decent 8 megapixel camera was on the back, and the E7 supported 3.5G data and WiFi. GPS was built in, along with an FM radio and 16GB of internal storage.


Nokia E7

All of this was powered by Symbian^3, Nokia’s latest version of their S60 operating system. This supported all the usual applications plus document editors, comprehensive email support, built-in navigation and excellent multimedia capabilities. It was quite possibly the best Symbian phone that Nokia ever made. But since nobody uses Symbian today, something must have gone wrong…

Nokia had been both very early to the touchscreen smartphone party and very late at the same time. The Nokia 7710 had launched in 2004 – years before the iPhone – but the technology wasn’t quite there and consumers stayed away. Nokia’s next attempt at a mainstream touchscreen smartphone was the 5800 XpressMusic, which launched waaay after the iPhone. The 5800 proved popular but it was pushing S60 almost as far as it could go.

Symbian was meant for an earlier era of handheld computing. First appearing in 1989 as Psion’s EPOC operating system (seen on the Psion Series 5, for example), it was designed to run smoothly on minimal hardware. The Series 5 had a puny 18MHz processor and 4MB of RAM; by the time the E7 was launched, the phone had a 680MHz processor and 256MB of RAM… hardware which in theory could run something more sophisticated. Rival Apple and Android devices both ran on operating systems descended from Unix, which allowed a much richer environment for software developers. Developing for Symbian was harder, but it was worthwhile because even in 2010 Nokia’s OS was the market leader – even if it was beginning to fade.


Nokia E7

It wasn’t as if Nokia lacked an alternative – the 770 Internet Tablet launched in 2005 running Nokia’s own take on a Unix-like OS, called Maemo. But it wasn’t a phone, and development of the platform was slow; eventually they came up with a practical if somewhat rough-around-the-edges smartphone in the Nokia N900. It looked likely that whatever succeeded the N900 would be a winner, but instead Nokia decided to merge Maemo with the Intel-led Moblin platform… a decision which completely derailed the strategy to replace the N900.

Stuck with the limitations of Symbian and with no next-gen product on the horizon, Nokia’s future was beginning to look uncertain. Even though sales were strong, it wasn’t clear how they could compete in the long term. As it happens, just a few days before the announcement of the E7, Nokia also announced a new CEO – Stephen Elop.

Elop realised the predicament that they were in and explained it to Nokia employees in the now-infamous “burning platform” memo that was leaked to the press. Ultimately Elop wanted to move Nokia away from Symbian and Maemo/MeeGo towards Microsoft’s new Windows Phone 7 OS. This was a bold move as Microsoft’s platform was very new… and Microsoft themselves had lost market share sharply to Apple. The plan was that Symbian would eventually be discontinued, but Nokia were hoping there would be a gradual transition of customers from Symbian to Windows. But that’s not what happened.

Dubbed shortly afterwards as the “Elop Effect”, the impact on Nokia’s sales was disastrous. Elop had effectively made Symbian a dead-end platform, and that killed off pretty much any market appeal to customers. Sales fell through the floor, and worse still Nokia didn’t have a product to replace it (the first Lumia handset launched late in 2011). Far from being a smooth transition from one platform to another, it simply persuaded Symbian fans to jump ship… mostly to Android.

Less than six months after the announcement of the E7, Symbian was effectively dead. A trickle of new Symbian devices emerged from Nokia, with the last mainstream handset launched in October 2011 and the final ever handset arriving in February 2012. None of them sold well. But then neither did the Windows phones that followed.

The E7 marks the point when Nokia’s seemingly invincible empire crumbled. The last high-end Symbian smartphone and the last of the Communicators, the E7 might have been a game changer if it had been launched three years earlier. Today the E7 is quite collectable, with prices for decent ones starting at £60 or so and reaching into the low hundreds for ones in really good condition.

Image credits: Nokia

Wednesday, 23 September 2020

TRS-80 Color Computer (1980)

Introduced September 1980

The original TRS-80 (launched in 1977) was one of the “holy trinity” of early consumer-friendly microcomputers along with the Apple II and Commodore PET. Capable though the original was, it lacked the colour and sound that the next generation of home micros would provide, so in September 1980 Tandy Radio Shack launched the TRS-80 Color Computer.

It had almost nothing in common with the original TRS-80 Model I except for the name. Crucially, the Color Computer (often called the “CoCo”) didn’t have the Z80 processor that gave the “80” to the Model I’s name; instead it included a Motorola 6809. Indeed, the whole thing was more Motorola than Tandy – the basis of the CoCo was a Motorola-designed Videotex terminal which Tandy had joined in to manufacture and market.

TRS-80 Color Computer 1
 

The 6809 was a bit more sophisticated than the Z80 and the rival 6502, and the more powerful Motorola 68000 was still an expensive and rather niche device. This was combined with a Motorola MC6847 graphics chip and there was an optional sound and speech cartridge.

Although the CoCo had pretty powerful graphics capabilities, it was complex to get the most out of them, and the machine had some odd quirks such as being unable to display pure white and lacking lowercase character support. At launch the CoCo had 4, 16 or 32KB of RAM, but later models shipped in 16 or 64KB configurations, and the last series of Color Computers could support up to 512KB… and, wonder of wonders, even lowercase text.

Over eleven years the hardware evolved somewhat, with three distinct series of computers being made with different case colours, detailing and keyboards. The third and last series had improved graphics, built-in software and better support for peripherals. The larger memory allowed the sophisticated OS-9 operating system to run, bringing modern multitasking to this fairly simple 8-bit machine.

TRS-80 Color Computer 2

Production ended in 1991, which wasn’t bad for an 8-bit machine. It was more popular in North America than in Europe, but the same Motorola reference platform emerged in the somewhat CoCo-compatible Dragon 32 and Dragon 64 a couple of years later.

For collectors, the CoCo isn’t an expensive buy and is commonly available in the US; however, US machines run on a different voltage and have a different video standard to European ones. Plenty of software emulators are available if you fancy tinkering on more modern hardware.

Image credits:
Adam Jenkins via Flickr - CC BY 2.0 - [1] [2]


Tuesday, 15 September 2020

Amstrad GX4000 (1990)

Introduced September 1990

During the late 1980s Amstrad had been on a roll. The Amstrad CPC range had taken a respectable share of the home computing market, the cheap all-in-one PCW word processor had been a remarkable success for small businesses and home users, the PC-compatible PC1512 and PC1640 had sold in huge quantities and Amstrad had bought out arch-rival Sinclair to produce their own take on the iconic ZX Spectrum micro.

Not everything had been a success. The deeply strange portable PCs – the PPC 512 and PPC 640 – proved to be a high-profile flop. Worse still, the next-generation PC2000 series which had been launched to great acclaim ended up as a disaster with a batch of faulty hard drives significantly damaging Amstrad’s reputation.

Amstrad’s success had been built on offering quality devices at bargain prices, typically by exploring ways to drive down costs. The CPC computers were a good example: a home computer, monitor and storage device starting at £399, all inclusive. Amstrad leveraged their relationships with makers of TV tubes and cassette players to give them a price advantage, and the inclusion of the cheap-but-capable Z80 processor drove down costs further. Amstrad chose to use the CPC platform for their next venture.


Amstrad GX4000

The Amstrad GX4000 was essentially a games console version of the CPC. Stripped of the cassette drive, monitor and keyboard, the GX4000 used cartridges and hooked up to a domestic TV. Still running a Z80 with 64KB of RAM, the console was modestly specified even by the standards of 1990… but at just £99 it was really cheap.

It was an elegantly packaged device, with two slightly creaky games controllers attached and video output via RF, SCART or an Amstrad DIN connector for a CPC monitor. You could add a light gun or an analogue joystick too, but expansion options were pretty limited. Still, it was pretty capable for an 8-bit platform, and the related CPC had a huge variety of good quality games available for it. So, it should have been a success? Not exactly.

By 1990 the 8-bit era that had dominated the 1980s was at an end. 16/32-bit home computers such as the Commodore Amiga had been established for some time, and the games console market itself was in the process of moving to 16-bit platforms such as the Sega Megadrive. But technological obsolescence had never been a problem for Amstrad – a company that shipped CP/M computers well into the 1990s – who were instead interested in value for money. And the GX4000 certainly seemed to have that.

But the GX4000 was a massive failure, and perhaps the key problem was games. CPC games on cassette cost a few pounds, where a GX4000 cartridge for the same game cost £25 (a quarter of the price of the console). Only a couple of games were available at launch, and a combination of manufacturing delays and high costs meant that just 27 games of varying quality were ever released. The 8-bit CPC platform that the GX4000 ran on wasn’t something that gamers could get excited about either.

Perhaps if the GX4000 had been released a few years earlier with more (and cheaper) games plus better designed hardware, it might have been a success. As it was, the GX4000 was discontinued in 1991 having sold just 15,000 units. Of course, that makes this console quite collectable today, with ones in good condition going for up to £200 – a lot more than was paid for them in the first place…

Image credit:
Evan-Amos via Wikimedia Commons - Public Domain


Sunday, 6 September 2020

Soft Toys (1880)

Introduced 1880

The soft toy in all its forms – from teddy bears to more exotic creatures altogether – is a favourite of children the world over, and many adults too. Toys have been found dating back to the earliest human civilizations, but early examples tended to be made from wood or whatever materials were available – and although these could be fun to play with, they certainly weren’t cuddly.

Generally speaking, the invention of the soft toy is attributed to Margarete Steiff in Germany. Steiff was a seamstress who noticed that children liked to play with the soft animal-themed pincushions that she made. These were adapted into toys covered in felt and filled with lamb’s wool, first for the children of friends and relatives and then sold commercially from her shop in 1880. The first design was reportedly a toy elephant.

Elephant pin cushion in the Steiff Museum

These became a success and Steiff’s business grew as a result, but she always insisted on making the first prototypes herself to make sure that there were no problems. In 1897 her nephew Richard Steiff joined the company and designed a stuffed toy bear which initially was not successful – but then came the teddy bear craze of the early 1900s, which Steiff and other manufacturers were happy to supply.

Margarete Steiff became a successful businesswoman, and created the Steiff company which even today is probably the best-known maker of stuffed toys. And she achieved all this despite contracting polio at an early age, which left her confined to a wheelchair and with limited mobility in one of her arms.

Today Steiff sits at the top end of the soft toy market, which has spawned many other well-known brands. The American Ty company – founded in 1986 – popularised Beanie Babies. Gund is another American company, specialising in mid-range bears and other soft toys that are characteristically understuffed to make them more cuddly. Yet another American company, Build-A-Bear Workshop, allows customers to custom-make their own bears in-store and to accessorise them. Many other companies also make bears and soft toys, and there are small-scale operations that craft high-end designer toys out of premium materials.


Ty Beanies
But a soft toy is more than just a bit of fabric with some stuffing in it. They are companions, faithful friends, a comfort in times of distress and so much more. Bears can be fearless too, recently taking over some of the streets of Paris in the COVID-19 lockdown. They are patient and non-judgmental. And even when they’ve been put away for years at a time, they are still pleased to be with you – no matter how long that might be. And they proliferate. Oh my, do they proliferate.


Les Nounours des Gobelins

Of course, soft toys can be highly collectable. At the high end, rare Steiff bears or Ty Beanies can sell for tens of thousands of pounds... but those soft toys will probably never be played with or even cuddled. More everyday soft toys can be found all over the place, and there are often interesting finds in charity shops looking for a new home. Just don’t let them completely take over your house.

Image credits:
Flominator via Wikimedia Commons - CC BY-SA 3.0
Caroline Léna Becker via Flickr – CC BY 2.0
frankieleon via Flickr – CC BY 2.0


Wednesday, 12 August 2020

PERQ Workstation (1980)

Introduced 1980

Sometimes the most influential bits of technology are ones you might never have heard of, and the PERQ Workstation is probably one of them. Never selling more than a few thousand units, the PERQ nonetheless offered a lucky few a glimpse of the future.

What made the PERQ special was that it was the first commercially-available system with a GUI. Although Xerox had pioneered the idea with the Alto in the 1970s, the Alto was never a commercial product. It went on to inspire the Lilith workstation, but that too wasn’t something you could buy. The PERQ, on the other hand, was something that you could actually buy… if you were the right type of customer.

Designed by the Three Rivers Computer Corporation (3RCC), the PERQ attracted the attention of British computer giant ICL, who built a version under licence. Even at first glance the likeness between the PERQ, Alto and Lilith was obvious, with the large portrait-orientation monitor and large under-desk computer system. All three computers used a device for moving the cursor: in the case of the Alto and Lilith it was a mouse, but the PERQ used a digitising tablet.

(L to R) ICL PERQ 2 (with mouse), Norsk Data ND-100, ICL PERQ 1 (with tablet)

From the point of view of the user, the screen and tablet were the most important elements. The display itself was the same size as an A4 piece of paper with an astonishing (for the time) 768 x 1024 pixel resolution. The tablet offered more precise control than a mouse, but it performed the same function. Inside was a complicated set of discrete components roughly equating to a 16-bit CPU with up to 256KB of memory. In addition to a floppy disk there was a 24MB hard disk, but crucial to the idea of the PERQ was networking – it supported both Ethernet and Cambridge Ring.

The PERQ was always meant to be more than a nice computer with fancy graphics (a perquisite, in fact); instead it was meant to form part of a much larger integrated network comprising shared storage devices and file servers, print servers, wide area networks and large-scale mainframes for batch processing. Essentially the PERQ would form a component of a recognisably modern network of devices… but this was 1980, and really nobody much was doing this sort of thing outside of labs.

ICL's vision of the future
 

The operating system was called POS (PERQ Operating System), which was a pretty simple platform. A Pascal compiler, text editor and some demonstration programs were included to show off the GUI, but in the early days you’d have to write or port software yourself. It wasn’t an expensive system for what it was – about $20,000 – considering that it had the power of a small minicomputer. Other operating systems were also developed, including Flex, Accent and the Unix-like PNX.

Cutaway of PERQ workstation

The original PERQ was a niche success. The PERQ 2 followed in 1983 with a lighter-coloured cabinet, more memory, better internal storage and a three-button mouse rather than the digitiser. All was set for the PERQ 3 – a very different design based around the Motorola 68020 and running ICL’s PNX as the default OS. The display was to be boosted to an impressive 4096 x 2130 pixel landscape screen, and it would have more memory and access to more peripherals.

The PERQ 3 was in full development in late 1985, which was about the same time that PERQ Corporation (the new name for Three Rivers) started to get into serious financial trouble in the face of competition from the likes of Sun and Apollo workstations. ICL was also beginning to fancy itself as a “services” company and had lost interest in the PERQ too. Crosfield Electronics took over the project and did develop some high-performance workstations in the late 1980s, but ultimately without the backing of a big player the PERQ was doomed.

Although the PERQ was ultimately a business failure, these innovative workstations went into exactly the right sort of environments to influence future computer designs – many of the ideas of the interconnected environment the PERQ was designed for would really only be adopted by businesses decades later. It wasn’t just the first computer with a GUI that you could actually buy, it was a glimpse into the entire future of computing.

The most likely place to find a PERQ today is a museum, and that’s probably the best place for them, as early models might electrocute you if you’re not careful, or catch fire. If you are technically minded you could try a PERQ emulator – or perhaps you might just consider that there’s a little bit of PERQ in every modern computer system.

Image credits:
Rain Rabbit via Flickr - CC BY-NC 2.0
ICL Technical Journal – November 1982


Friday, 7 August 2020

Mattel Intellivision (1980)

Introduced 1980

Games consoles have been through several waves of being “a thing” to being “not a thing” and back to being “a thing” again. The Mattel Intellivision – launched to the general public in 1980 – was released near the top of the wave… which unfortunately meant that it was all going to be downhill for this interesting console.

The Intellivision had been tested in Mattel’s home market of California at the end of 1979, and in 1980 it was ready for release across the United States. More sophisticated than the Atari VCS, the Intellivision was designed as more than a games console and Mattel hoped that… well, it could be an intelligent television (hence the name).

Original Mattel Intellivision

It was unusual in several ways. Firstly, the processor was quite unlike anything else that rivals had. The General Instrument (GI) CP1610 was a 16-bit processor with an instruction set closely based on the venerable PDP-11 which was hardly an obvious choice compared to the then-common MOS 6502 and Zilog Z80. A dedicated sound chip and graphics far superior to the Atari VCS certainly caused a splash when it was launched.

One notable hardware feature was the complicated controller, with a direction disc and 14 buttons. The controllers took some practice to use, and while they were useful for complicated games they were a pain for games where a simple joystick would have worked better. Indeed, unlike other consoles, many of the games actually required you to read the instructions before you started.

It was competitively priced at $299 (about $900 in today’s money), had a reasonable number of games, and offered better graphics and sound than the Atari. After a somewhat slow start, the Intellivision started to sell strongly – shifting more than three million units up to 1983.

The Intellivision was always meant to be an expandable system, starting with a “Keyboard Component” which included a 6502 processor, extra RAM and a cassette drive. This add-on was meant to turn the Intellivision into a videotext-capable microcomputer. However, the project was badly delayed and became a high-profile failure resulting in fines from the FTC. Only a few thousand were sold, and most of those were bought back by Mattel when the project was cancelled – making it an exceptionally rare component today. Instead a much cheaper and less ambitious add-on called the “Entertainment Computer System” was created. A voice synthesiser called “Intellivoice” gained only a handful of compatible games before it too was cancelled, and an online service called PlayCable was trialled but met the same fate.

Add-on woes aside, the Intellivision was selling well – and not just under the Mattel name. Bandai, Sears, Tandy, GTE and Sharp had their own versions. A cosmetically updated Intellivision II was launched in the US and Brazil in 1983 which was cheaper to make and less bulky. Mattel had plans for the Intellivision III and IV which would have been progressively better…

Intellivision II

…but then in 1983 the bottom fell out of the console market. There were too many different consoles on the market, margins were getting very thin, and competition from cheaper and more powerful microcomputers led to a disastrous market crash. Up until that point Mattel Electronics had been profitable and had continued to grow in terms of staff and investment – but suddenly it started posting enormous losses.

Mattel Electronics collapsed over a six-month period in late 1983, and Mattel sold on the remains for just $20 million in early 1984. But that wasn’t the end of the story. New owners INTV Corporation launched the INTV System III along with some unreleased Mattel games and a few they created themselves; the System III continued in production until 1990.

You might think that the story of this slightly weird games console would end there, but it didn’t. In 2014 the Intellivision Flashback was launched, packaging many popular games into a more modern hardware platform while retaining the same esoteric controller. Scheduled for release in October 2020 is the Intellivision Amico which reimagines the Intellivision concept on modern hardware.

Intellivision Amico

Today the Intellivision is quite collectable, with prices varying widely depending on the number of packaged games – prices in the US commonly start at less than $100 and go up to $500 or so. Alternatively the upcoming Amico is slated to be $249, although that will be a very different experience to the original. Either way this 40-year-old game system still seems to have its fans today…

Image credits:
Nicolas Nova via Flickr - CC BY-NC 2.0
Andy Simmons via Flickr - CC BY-ND 2.0
Intellivision Entertainment


Friday, 31 July 2020

Rogue (1980)

Introduced 1980

Before the microcomputer boom of the late 1970s most computers were large, expensive but powerful multiuser machines (“minicomputers”) such as the DEC PDP-11 and VAX. These expensive machines were meant for serious work, but even so a few games had been written such as Colossal Cave and Star Trek.

These were often complex games, but they were severely hampered by the rudimentary output capabilities of the computers involved. Although minicomputer terminals had evolved through the 1970s leading to designs such as the versatile VT100, it wasn’t always easy to leverage the new features into programs.

Cursor addressability was the key feature, first seen in the era of the VT52 – the ability to move the cursor anywhere on the screen and display text there. It seems simple today, but the earliest terminals were basically printers-with-a-keyboard (teletypes), and it took a long while for glass teletypes to evolve into video terminals that could support recognisably modern applications.
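
On later VT100-compatible terminals this became a standardised in-band escape sequence. As a tiny illustration (Python here purely for convenience; run it in any modern terminal emulator, which still honours these sequences), this clears the screen and drops a character at row 10, column 20:

import sys

sys.stdout.write("\x1b[2J")        # ANSI "erase display" - clear the screen
sys.stdout.write("\x1b[10;20H*")   # ANSI "cursor position" - row 10, column 20, then print '*'
sys.stdout.flush()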

By 1980 the newly developed curses library was making it much easier to use the advanced features of these terminals, and around that time a pair of students – Michael Toy and Glenn Wichman – were intrigued by the possibility of writing a game on the UCSC computer systems where they were studying.

Rogue running on an emulated VT220

Rogue broke out of the mould of earlier minicomputer games, which tended either to be quite simple or to lose their appeal once you had beaten them. Rogue was a satisfyingly complex Dungeons & Dragons-style game, set on several levels of mostly randomly-generated maps. The player roamed these ASCII maps as a wandering “@” sign, with 26 different types of monster represented by letters of the alphabet – V = vampire, O = orc, for example. The player could accumulate a variety of weapons, armour, scrolls, potions and rods (wands) to help them on their task.

Twenty-six levels down in the game you would find the ultimate prize – the Amulet of Yendor – which you could then take back to the surface. The turn-based gameplay gave the player plenty of time to consider their next move in tricky situations, but on the other hand death was permanent – if slain by a monster or your own stupidity you would have to start over.
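
The curses library hid the terminal differences behind a simple screen-drawing API, which is what made a game like this practical. Here is a minimal, purely illustrative sketch in Python (whose standard curses module wraps the same library) – a single randomly-sized room, a couple of monster letters and an “@” steered with the arrow keys; none of the names are from the real Rogue source:

import curses
import random

def main(stdscr):
    curses.curs_set(0)  # hide the terminal's own cursor
    stdscr.clear()

    # Carve out one randomly sized room, Rogue-style
    top, left = 2, 4
    height, width = random.randint(4, 8), random.randint(12, 30)
    for x in range(left, left + width):
        stdscr.addch(top, x, '-')               # top wall
        stdscr.addch(top + height, x, '-')      # bottom wall
    for y in range(top + 1, top + height):
        stdscr.addch(y, left, '|')              # left wall
        stdscr.addch(y, left + width - 1, '|')  # right wall

    # Monsters are just letters, as in Rogue: V = vampire, O = orc
    for letter in "VO":
        stdscr.addch(random.randint(top + 1, top + height - 1),
                     random.randint(left + 1, left + width - 2), letter)

    py, px = top + 1, left + 1  # the player's "@"
    while True:
        stdscr.addch(py, px, '@')
        stdscr.refresh()
        key = stdscr.getch()
        stdscr.addch(py, px, '.')  # leave explored floor behind
        if key == ord('q'):
            break
        elif key == curses.KEY_UP and py > top + 1:
            py -= 1
        elif key == curses.KEY_DOWN and py < top + height - 1:
            py += 1
        elif key == curses.KEY_LEFT and px > left + 1:
            px -= 1
        elif key == curses.KEY_RIGHT and px < left + width - 2:
            px += 1

curses.wrapper(main)  # wrapper sets up the terminal and restores it on exit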

Michael Toy then moved from UCSC to Berkeley and met Ken Arnold, who had developed the curses library. The game evolved further, and by the mid-1980s commercial interests were involved, porting the game to a variety of 1980s micros including the Macintosh, Atari ST, Commodore 64, Amiga, Amstrad CPC, ZX Spectrum and many more.

It was a hard game to master, and you could easily spend hours and hours trying to tackle deeper and deeper parts of the dungeon. The fact that it was simply drawn in ASCII characters didn’t really matter because of the rich gameplay – and most micro versions used some simple graphics to enhance the game further.

NetHack running on a modern laptop

Rogue was always a closed-source game though, so many open-source variants followed. NetHack is probably the most popular of these, with richer gameplay than Rogue but still mostly stubbornly sticking to ASCII.

Of course modern computer games can offer something of the rich gameplay of Rogue-like games along with impressive graphics and sound effects, but Rogue and its derivatives linger on and continue to be developed.

Image credits:
Artoftransformation via Wikimedia Commons - CC BY-SA 3.0
Mad Ball via Flickr - CC BY-SA 2.0


Tuesday, 21 July 2020

The Rise and Decline of Sharp Mobile (2002 to 2008)

Fifteen years ago this month, Sharp released the Sharp 903 – a high-end 3G phone that was the high watermark of Sharp’s efforts to break into the European market. Distinctly different from the Nokias and Motorolas that dominated the market, the 903 should have established Sharp as a contender. But it faded from sight instead.

In the early noughties Asian firms were having a hard time making an impact outside their home markets, with the notable exception of Sony… and even they had to join forces with Ericsson in 2001. One result of this was that some weird and wonderful ecosystems developed – especially in Japan.

Sharp were dipping their toe in the market, initially with some fairly standard devices but then starting to leverage their expertise in other technologies. In 2000 they made the world’s first camera phone – the J-SH04 – but in particular devices started to appear that used some of Sharp’s world-leading display technology.

Sharp J-SH04
In Europe, Sharp started cautiously with the O2-only GX1, which sold in limited quantities. Then came the almost identical Sharp GX10 and GX10i (the latter exclusive to Vodafone) in 2002 and 2003, which were attractive but pretty undistinguished clamshells.

The next handset to be launched (in late 2003) was a ground-breaker. Exclusive to Vodafone in most regions, the Sharp GX20 featured a high-resolution 240 x 320 pixel continuous grain silicon (CGS) display which easily beat everything else on the market at the time. Added to that was a competitive VGA resolution camera with a multi-coloured LED, along with a relatively large colour external screen – all in a package smaller and lighter than the more basic GX10. The GX20 created a real buzz around Sharp’s products and consumers were eager to see what would come next.


Sharp GX10i and GX20

The Sharp GX30 built on the superb display of the GX20 and added the world’s first megapixel camera. The GX30 also had a full-sized SD slot and added video recording, Bluetooth and an MP3 player – and in early 2004 all of those things together were a big deal. Even if the software wasn’t as easy to use as a Nokia’s, the hardware was class-leading in almost every respect. Again this was a Vodafone exclusive in many regions, although some other carriers had the functionally identical GX32.

Sharp GX30

You might guess that the next phone from Sharp would be the GX40… but you would be wrong. The Sharp TM100 was exclusive to T-Mobile rather than Vodafone, but was basically a slider version of the GX20, with minimalist looks and the same CGS display that Sharp were becoming famous for.

Sharp TM100

Vodafone again had the exclusive for the next handset – the very popular Sharp GX25. Still a 2004 product, this had a similar specification to the older GX20, but it had a sleeker design and, notably, tucked the antenna inside the case. Bluetooth was added into the mix but the external screen shrank considerably. The result was a smaller, lighter and more capable phone that was cheaper than the GX20 while retaining the excellent display. One highly sought-after version of the GX25 was the attractive Ferrari edition in bright red, but some markets had other eye-popping colours available too.

Sharp GX25
Sharp returned to their clamshell-with-antenna design for the Sharp TM200 in late 2004. This was exclusive to T-Mobile and was broadly similar to the GX30, except that it had a smaller external display and – crucially – a two megapixel camera, making it the first such device in Europe. The oversized camera assembly on the TM200 was rather pointless, but it did draw attention to the class-leading camera capabilities.

Sharp TM200
Although most of these handsets had been designed with European and worldwide markets in mind, the next product releases had a more distinctly Japanese origin. One of the stars of Vodafone’s fledgling 3G network was the Sharp 902, which was essentially a straight import of the 902SH handset used by Vodafone Japan.

Sharp 902

The 902 was like (almost) nothing else on the market. A large 3G-capable swivelling clamshell phone, it featured a 2.4” QVGA TFT display, a 2 megapixel camera with 2X optical zoom and a flash, video calling, expandable memory on a full-size SD/MMC card, an MP3 player, a web browser and an email client. The 902 looked like a compact digital camera from one side, and you could swivel the display around to act as a huge viewfinder. The 902 had plenty of “wow factor”, but flaws in the camera design meant that the pictures were disappointing, and Vodafone was having a hard job persuading customers that 3G was worth having. Launched alongside it was the cut-down Sharp 802 with a more conventional 1.3 megapixel camera, although this didn’t have the same market appeal. A special bright red Ferrari edition was the most desirable version, and that still commands a premium with collectors today.


Sharp 803
Most customers were sticking with their 2/2.5G devices, and the GX range was still popular despite 3G competition. Rumours of a Japanese-style GX40 clamshell with a 2 megapixel camera were doing the rounds, Sharp having impressed potential consumers with the radical design of the 902. But this crucial market seemed to be overlooked. It meant that customers with a GX30 who wanted an upgrade but didn’t want a bulky 3G phone would have to look elsewhere.

Sharp’s next launch was the Sharp 903 and Sharp 703 – another pair of 3G devices. The 903 was quite similar to the 902 in design, but sported a 3.2 megapixel camera with a 2X optical zoom that fixed the flaws of the 902. The full-sized SD card slot was replaced by a miniSD slot, but strangely the phone was actually bigger than the 902 despite that. Better looking than the 902, it came in a variety of colours as well. Launched at the same time was the more conventional 703, with a swivel-less design and a 1.3 megapixel camera.

Sharp 903 and 703

We didn’t know it at the time, but the Sharp 903 was as good as it was ever going to get for Sharp fans in Europe. When the Sharp GX40 finally came out later in 2005 it was a huge disappointment: it sported good multimedia features but a mediocre 1.3 megapixel camera, and even the screen was a slight downgrade on previous models.

Sharp GX40
Three elegant but fairly low-end phones followed in 2006 – the Sharp GX29, 550SH and 770SH. The 770SH was the most elegant, with a QVGA display and expandable memory, but it was still only a 2G phone with a 1.3 megapixel camera. The 550SH was essentially a candy-bar version of the 770SH, while the GX29 was a simpler phone with only a VGA camera and limited features. This time the most desirable of the bunch was the 770SH McLaren Mercedes edition, which certainly looked the part even if it didn’t deliver much.

Sharp GX29, 550SH and 770SH McLaren Mercedes Edition
After this Sharp pretty much faded out of markets outside Japan, although years later they did return with some decent Aquos-branded Android handsets which developed a following but have never really sold in large numbers.

Sharp certainly seemed to be poised on the verge of a breakthrough, but what went wrong? Sharp were undoubtedly ahead in display and camera technology, and together with Vodafone they bet strongly on 3G, coming up with the class-leading 902… the problem was that consumers really didn’t want 3G yet, and sales of that, the follow-up 903 and the 802 and 703 were weak. Sharp were also very much stuck with carrier-exclusive deals, mostly with Vodafone but also to some extent T-Mobile. This was good news for the carriers, not such good news for Sharp. A failure to update their 2G line also left fans with nowhere to go – and when Vodafone left the Japanese market in 2006, the ties with Japanese manufacturers became much weaker. And of course the market was dominated by Nokia, whose handsets – despite lagging behind in hardware terms – were usually the best-looking devices and very easy to use.


Sharp 902 and GX25 Ferrari Editions

Today the Ferrari editions are sought after, and a humble GX25 in Ferrari livery in very good condition can sell for hundreds of pounds. The 902 can cost around £150 in good condition, but most other Sharp phones are worth much less. However, many of them – especially the GX30 and 902 – would make an ideal addition to a collection.


Image credits: Sharp, Vodafone, T-Mobile
Morio via Wikimedia Commons - CC BY-SA 3.0

Monday, 6 July 2020

Missile Command (1980)

Missile Command screenshot
Introduced July 1980

It’s the height of the Cold War, and the possibility of nuclear annihilation is always just around the corner. Everything you know and everyone you love could be swept away in an instant and there would be very little you could do about it.

So, for some escapism what about a game where everybody dies in a nuclear conflagration? Welcome to 1980 and Atari’s Missile Command.

The golden age of arcade machines featured many escapist games, usually of the shoot-’em-up variety. As with the microcomputers of the time, arcade machines were being propelled forward by rapid improvements in microprocessors and other silicon chips. Missile Command used a 1.25 MHz 6502 CPU with an Atari POKEY chip handling sound. Graphics were 256 x 231 pixels in 8 colours, and unlike Lunar Lander and Asteroids, Missile Command used a raster scan monitor rather than a vector display.
Missile Command arcade machine

The gameplay was this: the player had to defend six cities at the bottom of the screen from waves of nuclear weapons (each represented by a line with a blob on the end). The player would launch their own missiles from three bases to destroy the incoming nukes, and those bases could themselves be destroyed. As the game progressed the player was attacked by missiles with multiple warheads, bombers and satellites. The game ended when all six cities were destroyed – and invariably they WERE destroyed.
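
The core interception mechanic is simple enough to sketch. Here is a toy Python version – an illustrative simplification with invented numbers, not Atari’s 6502 code – in which a counter-missile’s explosion expands around a targeted point and destroys any warhead that falls inside the blast radius:

import math

def inside_blast(warhead, centre, radius):
    """True if a falling warhead (x, y) is caught in an explosion."""
    return math.hypot(warhead[0] - centre[0], warhead[1] - centre[1]) <= radius

# One incoming warhead descending towards a city (all values invented)
warhead = (120.0, 0.0)           # starting position
velocity = (0.2, 1.0)            # drift right, fall down each frame
blast_centre = (124.0, 40.0)     # where the player's counter-missile detonates
blast_radius = 0.0

for frame in range(100):
    warhead = (warhead[0] + velocity[0], warhead[1] + velocity[1])
    blast_radius = min(blast_radius + 0.5, 12.0)   # the explosion grows, then caps
    if inside_blast(warhead, blast_centre, blast_radius):
        print(f"warhead intercepted at frame {frame}")
        break
else:
    print("a city is destroyed")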

Unusually, the primary control for the game was a large trackball which emulated the sort of thing that real military bases would use for controlling systems. Combined with the (then) advanced graphics and sound, it made Missile Command a distinctive and popular gaming experience.

Although the game was distributed by Atari in North America, Atari chose to partner with Sega to distribute it in Europe. This gave Sega a useful foothold in the arcade game market. In Asia-Pacific markets a smaller number of Taito cabinets were made. As a classic video game it was ported to many platforms from the 1980s onwards, and there are still licensed versions and clones available today. Or if you still have Flash installed on your computer, you can play it for free here.

Image credits:
John Cooper via Flickr - CC BY-NC-ND 2.0
James Brooks via Flickr - CC BY 2.0