Wednesday, 1 April 2020

Nintendo Game & Watch (1980)

Nintendo Game & Watch - original "Ball" game
Launched April 1980

Legend has it that one day Nintendo engineer Gunpei Yokoi was observing a Japanese worker playing with a pocket calculator when he suddenly had an idea to take the basic components of a calculator and turn them into a handheld game.

It wasn’t an entirely new idea. Mattel’s rudimentary Auto Race game of 1976 was arguably the first, and Coleco’s Electronic Quarterback machine of 1978 also proved popular. Both of those games were bulky and used simple LED displays, but technology had moved on: by the end of the 1970s inexpensive LCD displays were appearing all over the place – including in a range of low-cost and very portable pocket calculators. There were even pocket calculator game books which – with some effort – could make these little gadgets more entertaining.

Yokoi’s design took the LCD display, but instead of having it display numbers it could display a series of predefined graphics instead. Inside was a 4-bit Sharp calculator chip, and power was provided by button cells which kept the size and weight down. This was launched as the Nintendo Game & Watch (as it also included a clock).
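The display technique is easy to sketch: like a calculator’s digits, every image the game can ever show is pre-etched into the glass, and the game logic simply switches those fixed segments on and off. Here’s a minimal illustration in Python (the segment names and the `render` function are invented for the example, not taken from any real Game & Watch):

```python
# A Game & Watch style LCD can't draw arbitrary pixels: every possible
# picture is a fixed segment etched into the glass, and the game just
# decides which segments to switch on each frame.
SEGMENTS = {
    "ball_left", "ball_middle", "ball_right",   # the juggled ball's positions
    "hand_left", "hand_right",                  # the player's hands
    "miss",                                     # shown when a ball is dropped
}

def render(ball_position, hand):
    """Return the set of segments to energise for one frame."""
    lit = {"ball_" + ball_position, "hand_" + hand}
    if not lit <= SEGMENTS:
        raise ValueError("only pre-etched segments can ever be displayed")
    return lit

frame = render("left", "left")   # lights "ball_left" and "hand_left"
```

This is also why each game needed its own hardware: the artwork is literally baked into the screen, so a new game meant a new display.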

The first game was simply called “Ball” – a simple enough game where you had to throw and catch a ball – but many more followed. Each game required its own hardware, typically with “A” and “B” variants offering different gameplay. At first the games had a single screen and simple control buttons, but dual-screen versions followed, and in 1982 a handheld version of Donkey Kong introduced the now ubiquitous D-pad controller. Screens also got bigger, and later ones had coloured sections printed onto the display to give a bit more visual interest.
Rare Super Mario Bros Game & Watch

Dozens of different versions came to market over the next decade or so, with many titles featuring the Mario brothers and Donkey Kong, plus a whole lot of Disney characters and ports of arcade games. Nintendo placed a lot of emphasis on gameplay because of the strictly limited hardware, which had the effect of making the games both addictive and relatively inexpensive.

Millions of Game & Watch devices were sold, but in the end technology had moved on. Yokoi had also designed the Nintendo Game Boy, using lessons learned from the Game & Watch but in a much more flexible package where new games could be loaded on with a cartridge.

In fact Gunpei Yokoi had been a prolific designer of products for Nintendo, starting with successful toys such as the Ultra Hand in the 1960s. Not all the products were successful, but his design philosophy of (roughly translated) “lateral thinking with seasoned technology” is still an important part of the way Nintendo designs products – in other words, using existing and inexpensive technologies in innovative ways rather than pursuing the highest technology available. Yokoi left Nintendo in 1996 to form his own company, but he was killed in a car accident in October 1997. However, his influential legacy lives on.

Today Game & Watch devices are very collectable, with rare items such as the yellow Super Mario Bros (YM-901) selling for thousands of pounds, with more common or later ones such as Donkey Kong 3 selling for less than £100.

Image credits:
masatsu via Flickr - CC BY-SA 2.0
Sturmjäger via Wikimedia Commons - CC BY-SA 3.0

Saturday, 28 March 2020

Sony PlayStation 2 (2000)

Sony PlayStation 2
Launched March 2000

Sony had been something of a late entrant into the games console market, arriving in the early 1990s in what was essentially a grudge match with Nintendo. The two companies had worked together in 1988 on a CD-ROM add-on for the Nintendo Super NES console, a partnership which fell apart spectacularly.

Sony became keen to demonstrate its expertise in consumer electronics, and this led to the original PlayStation console launched in 1994. The PlayStation was a successful fifth-generation console that beat Nintendo’s offering by a full year and a half.

Things move on of course, and in 2000 Sony announced the successor to the original PlayStation, imaginatively called the PlayStation 2 (or “PS2”). Based around a heavily customised MIPS RISC processor called the “Emotion Engine” combined with powerful graphics and audio processors, the PlayStation 2 was a massive leap forward from previous generations.

Bundled with an upgrade of the legendary DualShock controller, the PlayStation 2 also supported a wide variety of other peripherals for gameplay. One crucial feature was an integrated DVD player, at a time when these were very uncommon and rather expensive.

Decent games took a little while to become available, but the PlayStation 2 maintained hardware compatibility with the original PlayStation (retconned as the “PS One”). After a few months the number of games started to increase, boosting the popularity of the PS2. The best-selling game on the platform was 2004’s Grand Theft Auto: San Andreas.

It had its flaws – in particular, online services on the PS2 were clunky and poorly thought-out. But overall it was a well-executed package of hardware with a strong suite of software to go with it. Selling over 150 million units plus a staggering 1.5 billion games, the PS2 is – just about – the best-selling video game console ever.

Later revisions of the PS2 were smaller and cheaper, but it continued to be popular even after the launch of the PlayStation 3 in 2006... in fact the last PS2 units shipped in January 2013 after being in production for more than 12 years.

The PS2 cemented Sony’s place in the console market, and it was largely responsible for the demise of Sega and a difficult period for Nintendo as well. Ultimately the only thing that really made a stand against Sony was Microsoft’s Xbox platform – between Sony and Microsoft eventually the entire old order of games consoles was swept aside.

Because of the large numbers of PS2s sold, there are quite a lot available for not very much money – usually including a collection of bundled games. There are plenty of buyers guides too if you fancy a bit of low-cost retro gaming.

Image credit: Evan-Amos via Wikimedia Commons

Tuesday, 17 March 2020

Squarial (1990)

A Squarial on display
Available March 1990

Direct-to-customer satellite television started in the late 1970s, but back in the early days it required a big and expensive dish to receive the broadcasts which had the effect of turning your back garden into something that looked like Jodrell Bank.

Although technology improved during the 1980s, by the end of the decade a typical satellite dish for Sky in the UK could still measure 90 centimetres across. Typically painted white, the dishes were regarded as an eyesore and when there were lots of them in an area – for example with blocks of flats and terraced houses – it made the place look like a KGB listening post.

Although Sky was popular in the UK, it gained a reputation for being rather lowbrow. A mix of imported US TV series and cheaply-made shows had a certain appeal, but neighbours might judge you harshly if you attached a massive dish to the side of your house to watch American wrestling.

It became obvious in the 1980s that the market could do with some competition, and through a complex mix of government legislation and business deals a company known as British Satellite Broadcasting (BSB) was born.

After some delays, BSB eventually started broadcasting in March 1990. It attempted to offer a more highbrow mix of shows including original content, high-quality imports and repeats of popular BBC shows... not to mention a bit of science fiction slash soap opera.

Low-level class warfare is a thing in the UK, and indeed the BBC TV show “Keeping Up Appearances” first broadcast in 1990 satirises just that. And for many middle-class people, the idea of sticking up a huge Sky satellite dish seemed frightful. But how could you differentiate that you were a middle-class BSB customer? BSB thought it had a secret weapon… the Squarial.

A Squarial in its natural habitat

The Squarial looked quite unlike other satellite receivers. Just 38 centimetres across, the Squarial was completely flat and inside was a phased array antenna made up of a large number of tiny individual antennae working together. This different technological approach was possible because BSB’s twin Marcopolo satellites had a higher power output than the Astra 1A satellite used by Sky.
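The phased-array principle can be illustrated with a little arithmetic: each tiny antenna’s signal is given a phase offset so that signals arriving from the satellite’s direction add up coherently, while signals from other directions partially cancel. A rough sketch in Python, using idealised isotropic elements rather than anything resembling BSB’s actual design:

```python
import cmath
import math

def array_response(n_elements, spacing_wl, steer_deg, arrival_deg):
    """Relative sensitivity of a uniform linear array of tiny antennae.

    spacing_wl is the element spacing in wavelengths. Each element is
    phase-shifted so that a wave arriving from steer_deg sums coherently;
    waves from other angles partially cancel.
    """
    total = 0j
    for k in range(n_elements):
        # geometric phase of a wave arriving from arrival_deg at element k
        arrival = 2 * math.pi * spacing_wl * k * math.sin(math.radians(arrival_deg))
        # phase offset applied to "point" the array at steer_deg
        steering = 2 * math.pi * spacing_wl * k * math.sin(math.radians(steer_deg))
        total += cmath.exp(1j * (arrival - steering))
    return abs(total) / n_elements   # 1.0 means perfectly coherent

# 16 elements at half-wavelength spacing, aimed 10 degrees off centre:
print(array_response(16, 0.5, 10, 10))   # 1.0 - full signal from the satellite
print(array_response(16, 0.5, 10, 40))   # small - a source 30 degrees away is rejected
```

In the Squarial the phase offsets were fixed by the feed network rather than steered electronically, but the principle is the same: many small elements summing in phase behave like one much larger antenna, which is how a flat 38-centimetre panel could do the job of a dish.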

It became something of a design icon, and created a significant buzz around BSB’s launch. Squarials started to appear on hundreds of thousands of homes. It looked like a success. But it wasn’t.

The problem was that both Sky and BSB were losing huge amounts of money – a staggering $1.5 billion between them (equivalent to about £2.5 billion today). Subscriber numbers were nowhere near the figures that either company needed to be sustainable. So in November 1990 Sky and BSB dropped a bombshell – the two bitter rivals would merge to become British Sky Broadcasting (BSkyB).

Although on paper it was a 50/50 merger, in reality Sky was the dominant partner. And although costs could be saved by rationalising the TV channels, in the end it was inevitable that BSkyB would want only one broadcasting system – and that meant using the Astra satellite and not Marcopolo.

By 1992 BSkyB shuttered the service on the Marcopolo satellites, leaving both the Squarial and attached satellite set top boxes obsolete as they were incompatible with the Astra system. Essentially, for all but the most die-hard tinkerers the Squarial and all the rather expensive equipment that came with it ended up as junk. For owners it was like being on the wrong side of the VHS-Betamax war.

But the Squarials lingered – because they were often mounted in difficult-to-access places, you’d have to pay somebody to take one down. So they lingered and lingered, a testament to making the wrong technological choice and a reminder of a high-profile failure. Even today it is still possible to find a forgotten Squarial on the side of a house. For a few tens of pounds you could even acquire one yourself, and see if it really is “smart to be square”.

Image credits: Alex Liivet via Flickr
CC BY 2.0
pauldriscoll via Flickr
CC BY-NC-ND 2.0 




Saturday, 14 March 2020

Kenwood Chef (1950)

Introduced March 1950

Recently we explored a whole bunch of domestic gadgets that are in common use today but actually trace their origins to the Nineteenth Century. One such gadget was the food mixer, tracing its roots back to the 1850s and with electric food mixers coming to market around the turn of the century.

The popularity of such devices grew and in 1950 the British public saw the launch of what is perhaps the most iconic mixer of its type – the Kenwood Chef. Designed (perhaps unsurprisingly) by Kenneth Wood, the original Chef (model A700) came with a variety of attachments that could mix bread, cakes, sausages and even drinks.

Kenwood Chef A700
It wasn’t cheap: the launch price was a little under £20 in 1950s money, which is nearly £700 today. Still, it was immensely popular with those who could afford it, and it was the sort of thing you could show off to friends and neighbours to make them envious.

Because it was stylish, well-built and versatile its popularity endured, and although the product has changed and improved over the decades, the modern Kenwood Chef is still fundamentally the same as the one from 70 years ago.

Perhaps these days with the rise of pre-prepared foods and takeaways, the food mixer isn’t the must-have appliance that it once was. However, vintage Kenwood Chefs from the 1950s and 1960s have a surprising number of fans, and working ones in good condition can be had for less than £200. If you want a contemporary model then these start at £250 going up to £700 or so, with an even wider range of gadgets you can plug in.

Image credit: Science Museum
CC BY-NC-SA 4.0

Monday, 9 March 2020

Samsung I9000 Galaxy S (2010)

Launched March 2010

Here’s a question: what is the most significant Android smartphone ever? Is it the T-Mobile G1, which was the first on the market, or one of the second-generation phones such as the Motorola DROID or Nexus One, which were easily as good as the iPhone? Or perhaps it is the Samsung I9000 Galaxy S – a phone with a feature set so rich that it redefined what a smartphone should be.

The Galaxy S had pretty much every feature dialled up to 11. Starting with the huge (for the time) 4” 480 x 800 pixel AMOLED display, there was 512MB of RAM, a 1GHz CPU, a dedicated GPU, 8 or 16GB of storage plus a microSD slot, a stereo FM radio and a 5 megapixel camera plus a secondary camera on the front… plus of course all the expected features such as 3.5G, WiFi, GPS and by then a huge variety of applications to do just about anything. In terms of features, the Galaxy S blew everything else out of the water.

Samsung I9000 Galaxy S
Sure… it was a pretty bland device in design terms, and Samsung’s somewhat iPhone-like TouchWiz skin led to years of lawsuits. It was quite expensive too… daringly so for an Android phone. Despite this, the original Samsung Galaxy S was a huge success – shipping more than 25 million units.

But there wasn’t just one model – when you took into account all the variants there were over two dozen. Some added 4G support, some replaced the FM radio with TV tuners, a couple even had physical QWERTY keyboards. Some variants replaced the 5 megapixel camera with an 8 megapixel one… or a 3 megapixel one. Screen sizes varied between 3.5 and 4.5 inches and LCD screens made a showing alongside AMOLED. Samsung was willing to customise and tweak the phone in any way the carriers wanted, which was something Apple would never do.

The rest is history – the Galaxy S is now in its 11th generation with the Samsung Galaxy S20 (confusingly, the previous version was the S10). Although the Galaxy S still comes in about half a dozen main variants, these days the more extreme variations come under different parts of the bewilderingly huge Samsung Galaxy range.

Perhaps the most important legacy is screen size – although tiny for a modern smartphone, the 4” display of the original Galaxy S was huge compared with smartphones of the time. Manufacturers proceeded somewhat cautiously, but every time they made the panel bigger it seemed that customers approved. The current S20 has a 6.2” bezel-less display, and the phone is one third bigger overall than the original.

Perhaps because of the bland styling, the Samsung Galaxy S doesn’t seem like a particularly collectable device despite its importance in the evolution of modern smartphones. There are plenty of decent examples for less than £40 and unofficial ports of the more modern Lineage and Replicant OSes exist if you want to tinker.

Image credit: Samsung Mobile



Monday, 2 March 2020

Sharp PC-1210 (1980)

Introduced 1980

These days most of us carry around a powerful little computer with us all the time, but where did it all start? To go back to the beginning of pocket computing we have to travel back 40 years to find the Sharp PC-1210.

Sharp had been making LCD calculators since the early 1970s and by the end of the decade they were competing successfully in the nascent microcomputer market. Back in those days one of the main uses of a micro was BASIC programming, and the Sharp PC-1210 was certainly capable of doing that. Well, to an extent.


Sharp PC-1210

Powered by twin 4-bit CPUs with a quite usable QWERTY and numeric keyboard, the PC-1210 also had a 24 character dot matrix display. You could also buy an optional interface for a cassette or printer, making this potentially a very versatile little thing.

Perhaps the biggest problem was memory – just 896 bytes of RAM in the original model, although it was carefully managed to make it usable. Later versions (the PC-1211 and PC-1212) increased this to 1920 bytes, which was a lot more useful.

The PC-1210 and its successors were a niche success, and over the years the range was improved and diversified into products aimed at engineers, mathematicians and scientists on one hand and people who wanted a personal organiser on the other. Somewhere in the 1990s these personal organisers fizzled out to be replaced by PDAs and then smartphones.

Today the Sharp PC-1210 series is a pretty uncommon find, and many seem to have faulty screens. However, later models are available too, and prices vary from a few tens of pounds to several hundred depending on condition and accessories. Alas, Sharp are not in the computer business anymore and have struggled to make a profit in recent years. In 2016 they were effectively taken over by Taiwanese giant Foxconn – and surprisingly both companies have recently branched out into coronavirus masks in addition to electronics.

Image credit: Armin.maas sb via Wikimedia Commons

Tuesday, 25 February 2020

Kodak Brownie Camera (1900)



Introduced February 1900

Photography these days is commonplace – we snap a photo with our smartphones and don’t give it a second thought. But early photographs were difficult, time-consuming and above all expensive to make.

By 1900 the science (or art) of photography had been around for half a century, and although it had become progressively easier and cheaper, things were about to take a big leap forward with the introduction of the Kodak Brownie camera.

Priced at just one dollar at launch (about $31 today), the Brownie was a leatherette-covered cardboard box with a simple lens and a roll of film in the back. Cheap, lightweight and very easy to use the “Box Brownie” camera became a huge success. The improved Brownie 2 followed the next year (priced at $2) and this too sold in huge numbers. The real value of the Brownie to Kodak was not the camera itself, but the huge profits to be made from selling film and developing the photographs.

Box Brownies went on to document everything from family gatherings to wars. Simple to use, robust and portable – what they lacked in sophistication they made up for by being there at the right moment.  The Brownie spawned a huge range of similarly low-cost Kodak cameras over the following decades, and if you’ve ever owned a camera then you’ve probably owned a Kodak at some point.

Kodak’s business model of selling cameras cheaply and making money from the film sustained it for a hundred years, but by the end of the 20th Century digital cameras were making significant inroads. Kodak tried hard to compete, grabbing a decent share of the digital market… but it lost money on almost every camera it sold. Various other strategies were tried and largely failed, and finally the rise of high-quality smartphone cameras proved a fatal blow. Kodak filed for Chapter 11 bankruptcy in 2012, two years after the launch of Instagram, which came up with a new paradigm for sharing photographs.

Despite its woes, Kodak still exists today although much slimmed down from the giant of the past. And Kodak Box Brownie cameras are still commonly available for not much money. Although nobody makes the film for the original Brownies any more, you can adapt current film for use in them if you are after a bit of retro photography.

Image credit: Federico Leva via Wikimedia Commons

Friday, 14 February 2020

HTC Desire (2010)

HTC Desire
Launched February 2010

Android devices had only been around for less than a year and a half by the time Mobile World Congress came around in 2010, but during that time the platform had evolved rapidly from somewhat ropey beginnings.

Riding the crest of this particular wave was the HTC Desire – an Android 2.1 smartphone with a 3.7” WVGA display, 1GHz CPU and a 5 megapixel camera and… wait… yes, it might well seem familiar, because the Desire was very closely related to the Google Nexus One launched the previous month.

The differences were minor – the Desire ditched the Nexus One’s trackball and had a much more usable optical trackpad, but conversely the Desire had physical function buttons instead of touch-sensitive ones. The Desire also had an FM radio (included in the Nexus hardware but disabled) and it used the HTC Sense UI on top of the underlying OS rather than the stock Android of the Nexus.

This whole combination of features was very appealing to potential customers, and because HTC already had an established relationship with mobile phone carriers it was simple enough to get your hands on a subsidised Desire on contract, whereas at launch the Nexus One was a rather expensive SIM-free affair.

The Desire was well-designed, the user experience was great and it was easy to get one. And although this combination doesn’t always guarantee success in this case it did, and the HTC Desire became the first Android phone for many people wanting to dip their toe in the smartphone world.

It had its problems – notably the original AMOLED display lacked sharpness (fixed by a switch to S-LCD), and over-the-air software updates dried up after just 18 months. Nonetheless it established HTC as the Android manufacturer to beat… however, rivals Samsung had something up their sleeve when it came to that.

The “Desire” name stuck around – even if (like a lot of other HTC handsets) – it sounds a bit like a brand of condom. The most recent phone to bear the name is the HTC Desire 19s, launched in late 2019. Original HTC Desires (model HTC A8181) are commonly available for not very much money should you want to own a little slice of Android history.

Image credit: Retromobe and Mobile Gazette

Monday, 3 February 2020

Lawn Mower (1830), Sewing Machine (1830), Dishwasher (1850), Vacuum Cleaner (1860) and Hair Dryer (1890)

In these days of home automation and snazzy gadgets, we don’t always realise that some of the things we use on a regular basis have their origins a surprisingly long time ago.

Let’s start with the lawn mower. The first one was invented in 1830 by Edwin Beard Budding in Gloucestershire, UK. Pushed along by hand, the original lawn mower combined rotating blades and a roller, which is pretty similar to the setup of a modern cylinder lawnmower. Before this, grass had to be cut with a scythe, which was back-breaking work and didn’t result in a very closely cut lawn. Although the lawnmower still involved a fair bit of manual labour, it could cut larger areas better and more quickly. Crucially, this enabled the creation of smooth sports pitches for games such as cricket and football too.

The same year saw the introduction of the first usable sewing machine, invented by Barthélemy Thimonnier in France, although other inventors had been working on their own versions for decades. Thimonnier’s machine used a barbed needle which enabled a simple chain stitch to be created. The new invention was put to work in a factory making uniforms for the French army; however, angry tailors destroyed the machines in 1831 because they were fearful for their jobs. Undeterred, other inventors continued to develop the idea, and within a few decades rival manufacturers fought it out in a thriving marketplace.

Although the lawn mower and sewing machine quickly developed into recognisable modern products, the first dishwasher – invented in 1850 by Joel Houghton in the US – took rather longer to develop into something practical. The first dishwasher was crank-operated and basically just splashed water on the dishes. It was not until the 1920s that dishwashers started to look anything like modern ones, and it was a century after Houghton’s invention that they started to become commonplace in wealthier markets such as the United States.

A trio of "Green" lawnmowers from 1890 - functionally similar to the Budding device, a copy of Thimonnier's original sewing machine, hand-powered dishwasher from 1860

Vacuum cleaners had a similarly slow start. Perhaps the first recognisable device was invented by Daniel Hess of the United States in 1860. Combining a set of brushes and a bellows, Hess’s creation worked best if you had four arms. Nonetheless the idea evolved and got more usable, and by the beginning of the 20th Century the first recognisable and practical vacuum cleaners emerged.

Finally, another product that took a long time to become an everyday item was the hair dryer. The first commercial version, invented in France by Alexander Godefroy, was basically a pipe connected to the chimney of a gas stove. The person having their hair dried had to sit next to the stove, and the whole thing sounds rather dangerous. By the 1920s hand-held electric hair dryers began to emerge, but they remained quite dangerous for a long time because electricity and water can be a lethal combination.

Model of Hess's vacuum cleaner, gas hair-dryer (date unknown)

Although labour-saving devices have certainly saved a lot of hard work in the home, some things remain resistant to automation. Ironing, for example. But it turns out that even this particular tedious task may be done by machine in the near future.

Image credits:
Green Lawnmower (1890) - Science Museum
Copy of Thimonnier's sewing machine (1930) – Science Museum 
Hand-powered dishwasher circa 1860 – Daderot via Wikimedia Commons
Model of Daniel Hess’s carpet sweeper (1860) - Underneaththesun via Wikimedia Commons
Gas Hairdryer – Alex Liivet via Flickr

Friday, 24 January 2020

Apple iPad (2010)

Apple iPad (2010)
Launched January 2010

At the start of 2010, the Apple iPhone had been on the market for two-and-a-half years, and after a bit of a slow start it was becoming rather popular, along with its rival Android platform.

But although modern smartphones were easy to use for some tasks, their small screens could be a pain. Go bigger and you might end up spending a lot of money on a notebook computer which would take ages to boot up.

There were other solutions, and as ever it was Nokia who were exploring them. The elegant Nokia Booklet 3G took an inexpensive small notebook design and added seamless 3G connectivity along with the advantages of a 10.1” display. And five years previously, Nokia had introduced something called an “Internet Tablet” with a relatively large 4.1” display running a version of Linux.

Nokia had all the technologies it needed to take the next step in personal computing, but it hadn’t put them together in the right way. However, in January 2010 Apple put together all the available technologies to come up with a product that people hadn’t realised that they actually wanted until that point – the Apple iPad.


Rumours had been flying around for a while about some tablet-based computer, but there were few solid details. In fact, the smart money seemed to be convinced that the device was going to be called the Apple iSlate (a trademark that Apple actually owned). But overall Apple's veil of secrecy about the new device held.

The iPad was as easy to use as the iPhone but had a 9.7” display, was just 13mm thick and weighed only around 700 grams. Priced from $499 for the basic 16GB WiFi-only version up to $829 for a 64GB one with 3G, it wasn’t exactly cheap, but it was still good value compared to a lot of laptops.

Of course, the iPad lacked a keyboard or mouse (but it didn’t take long for Bluetooth keyboards to come out), however the main flaw with the iPad was that it didn’t support multitasking, so you couldn’t run multiple apps at once. This was eventually fixed in iOS 4.2 launched in November 2010, but to begin with it was certainly a handicap.

Regardless of the flaws it may have had, the iPad launched in a blaze of publicity and it was a massive hit – selling 15 million units before the launch of the iPad 2. Unusually for Apple, it had only a short lifespan for software updates with the last OS upgrade coming out in May 2012.

Several generations later and the iPad is still a strong seller, however sales peaked in about 2013 and have been in decline ever since. Part of this apparent decline is probably due to smartphones becoming more capable – the iPhone 11 Pro Max has a 6.5” display for example – and also people replace tablets less often. Despite many competitors coming along (and mostly failing) Apple still has more than a third of the market it essentially created.


Image credits: Apple

Wednesday, 15 January 2020

Google Nexus One (2010)

Nexus One
Introduced January 2010

By the start of 2010 the Android platform had been around for fifteen months, but some cracks were beginning to appear in Google’s strategy to revolutionise the smartphone industry. Part of the problem was that manufacturers were customising the OS rather too much, which was leading to fragmentation; they were also very poor at providing software updates, and higher-end devices were quite expensive.

Microsoft had suffered a similar problem with their Windows phones – weaknesses in that operating system’s user interface had resulted in different manufacturers reskinning the OS to make it more appealing. This meant that your experience with a Windows phone from HTC would be very different to one made by Samsung or Motorola. This was a problem that rivals Apple and Nokia didn’t have, because they completely controlled both the OS and the hardware.

Google’s response to this was the Google Nexus One, a device made for them by HTC. Competing in part with the iPhone 3GS, the Nokia 5800 and a multitude of Android and Windows phones, the Nexus One beat most of them when it came to both hardware and software. The 3.7” 480 x 800 pixel AMOLED display beat almost everything else in its class, the 5 megapixel camera was pretty good and the whole package looked attractive even if the styling betrayed that it was an HTC underneath.

Initially the idea was that Google would sell the Nexus One directly to consumers at $530 or €370, which was good value for a high-end SIM-free smartphone at the time. However, back in 2010 customers were cool on the idea, preferring to get their phones subsidised with a contract.

Despite the attractions of the device, sales were slow. Google shifted away from direct sales in mid-2010 and tried to attract carriers to the device, with only a moderate amount of success. Customers were unhappy with the quality of the OLED screen to begin with, and the Nexus One was switched to a more traditional Super LCD display a few months in (although this was mostly down to manufacturing issues). There wasn’t much in the way of marketing either, so while mobile phone fans might have known about it... many others didn’t.

But still, the Nexus One was meant to set an example to other manufacturers of how to do it, and to some extent sales were not important. The other thing Google wanted to show was that software updates could be delivered quickly, rather than dragging on for months as with other manufacturers (especially for handsets tied to carriers). And Google were as good as their word: updates hit the Nexus One very quickly and everyone was happy… right up until October 2011, when Google announced that the Nexus One wouldn’t be getting an upgrade to Android Ice Cream Sandwich because the hardware was “too old”. This was a phone less than two years old, now effectively on the scrapheap – and just as a comparison, the contemporary Apple iPhone 3GS ended up with software updates for five years.

Despite all of these woes, Google stuck with the Nexus project with a variety of partners including Samsung, ASUS, Motorola, LG and HTC (again), with the final Nexus model being built by Huawei in 2015. After that, Google dropped the Nexus devices and instead brought out a more expensive range called the Google Pixel, to somewhat mixed reviews and moderate success.

Google’s involvement in Android got more complicated when it bought Motorola’s mobile phone business a year after the launch of the Nexus One, only to asset-strip it of patents and sell the desiccated husk to Lenovo in 2014. In 2018 Google bought part of HTC but as yet haven't turned this fading company around. Overall, Google’s foray into producing its own handsets was probably not the decisive influence that Google wanted it to be. Would it have made any real difference if they hadn't bothered?

Image credits: HTC and Google

Google Nexus One - Video 

 

Thursday, 9 January 2020

Digital (DEC) PDP-11

Well-appointed PDP-11 at TNMOC, Bletchley
Launched January 1970

The Digital PDP-11 is a computer you may not have heard of, but it was hugely influential in terms of hardware and even more so in the software that it helped to create.

Digital Equipment Corporation (typically known as DEC or Digital) was founded in 1957, first tinkering with electronics for laboratory environments and then producing a full computer system in 1959 with the PDP-1 minicomputer. Other models followed and DEC grew quickly through the 1960s with a wide range of new products including the very successful PDP-8.

The term “minicomputer” is rarely used these days, and by modern standards there was nothing mini about them. Often housed in racks and sometimes filling a small room, minicomputers were shrunk-down versions of the huge mainframe computers that tended to require their own building. Even as microcomputers became popular, minicomputers were much more powerful and made it easier for people to work collaboratively; these days their modern descendant would be the server. Typically you would access a minicomputer with a terminal such as a VT52.

Time and technology move on, and by the late 1960s the computer industry had started to settle on 8, 16 and 32-bit architectures (based on an 8-bit byte), whereas Digital was mostly producing 12, 18 and 36-bit machines (based on a 6-bit word size). In part this change happened because the computer industry was starting to standardise on the 7-bit ASCII character set, which fits neatly into an 8-bit byte but not a 6-bit one.
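To illustrate the point, here is a hedged Python sketch (not anything DEC shipped): two 7-bit ASCII characters fit comfortably into one 16-bit word as a pair of 8-bit bytes, whereas a 6-bit character code can only represent 64 symbols and so cannot hold the full 128-character ASCII set at all.

```python
# Illustrative sketch of why 7-bit ASCII favoured 8/16/32-bit machines.
# Two ASCII characters pack into one 16-bit word as two bytes; the
# low-byte-first ordering below matches the PDP-11's own little-endian
# convention.

def pack_ascii_16(a: str, b: str) -> int:
    """Pack two ASCII characters into one 16-bit word, low byte first."""
    assert ord(a) < 128 and ord(b) < 128, "ASCII is a 7-bit code"
    return ord(a) | (ord(b) << 8)

def unpack_ascii_16(word: int) -> str:
    """Recover the two characters from a packed 16-bit word."""
    return chr(word & 0xFF) + chr((word >> 8) & 0xFF)

word = pack_ascii_16('H', 'i')
print(hex(word))              # 0x6948
print(unpack_ascii_16(word))  # Hi

# A 6-bit code has only 2**6 = 64 code points - enough for uppercase
# letters, digits and punctuation, but not the 128 characters of ASCII.
print(2 ** 6)                 # 64
```

A 36-bit machine could squeeze in five 7-bit characters per word with a bit left over, but the arithmetic was awkward; byte-oriented word sizes made character handling far simpler.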

In January 1970, DEC launched its first 16-bit minicomputer – the PDP-11. Combining the company’s extensive experience from the previous decade (both good and bad), the PDP-11 was a highly usable and expandable system. A key feature of the PDP-11 was that it was relatively easy to program, especially when it came to using peripherals (initially on the Unibus and later the Q-Bus). And peripherals were available from DEC in abundance, including disk drives, tape drives, printers and terminals.

From the outset the PDP-11 was a huge success, starting with the original 11/20 and 11/15 models in 1970 and developing along with advancing technologies to become smaller and more powerful, ending with the 11/93 and 11/94 in 1990 (which were in production until 1997). PDP-11 systems also ended up being squeezed into other “smart” devices such as robot arms, and when combined with a terminal like the VT100 they could make a compact desktop machine (such as the VT103). DEC even tried to make a PDP-11 to compete with the IBM PC with the DEC Professional range.

Perhaps confusingly, DEC had many operating systems for the PDP-11, notably RT-11. However, the OS the PDP-11 is most famous for is Unix – a platform developed at Bell Labs in response to the complex Multics OS. In fact, Unix was tied to the PDP-11 platform until 1978, when it was finally ported to a fairly obscure system called the Interdata 8/32.

Unix became an enormous success – it took a while – and today descendants and variants of that OS power smartphones, servers and personal computers worldwide. But the PDP-11 hardware too was hugely influential, directly inspiring 1970s processors such as the Motorola 68000 and Intel 8086.

DEC sold hundreds of thousands of PDP-11s while it was in production, making it possibly the most popular minicomputer ever made. The 32-bit DEC VAX launched in 1977 was meant to be the next logical step, however both the PDP-11 and VAX ended up being sold in parallel.

In terms of both software and hardware the PDP-11 was a hugely significant device, even if most people may never have seen one. Surprisingly, it seems some are still in use, and there’s a brisk trade in parts and components on the second-hand market.

Image credit: Loz Pycock via Flickr

Wednesday, 1 January 2020

Sinclair ZX80 (1980)

Sinclair ZX80
Launched January 1980

By the start of the 1980s, the microcomputer revolution had been in full swing for a few years. However, price remained a problem – generally speaking even basic systems could run into thousands of dollars or pounds when you added in all the bits and pieces you needed.

This meant that many people couldn’t afford to get involved in this new field, especially in the UK where disposable income at the end of the 1970s was quite small compared to the US. So, Clive Sinclair tackled the issue of affordability head-on to come up with a uniquely British computer that went on to spawn some even more successful offspring.

Sinclair had been dabbling in all sorts of high-tech gadgets for some years, and in 1978 had come up with a low-cost board computer called the MK14. This very simple little computer showed that there was a market for this type of device, so Sinclair’s firm – Science of Cambridge – set to work on something more usable, the Sinclair ZX80.

When it was launched in January 1980, the ZX80 came in ready-built form for a shade under £100, or you could save £20 by buying it in kit form and soldering it together yourself. Then, all you would need was a TV, a cassette recorder and a couple of cables to make a complete system.

Measuring 22 x 18cm it was about two-thirds of the size of an A4 sheet of paper – a fact that surprised many customers who bought one by mail order and expected it to be somewhat bigger. Despite the diminutive form factor, the ZX80 was an exceptionally elegant design. A blue-on-black membrane keyboard sat in a futuristic white case with “SINCLAIR ZX80” boldly emblazoned across it; the exterior design was the work of the late Rick Dickinson, who went on to work on many other Sinclair projects.

Inside were just 21 chips, including a Zilog Z80-compatible processor and 1KB of RAM. RAM could be extended to 16KB with a “RAM Pack” that plugged into the edge connector on the back of the machine. There was no sound, and the monochrome output only supported uppercase characters and some simple predefined block graphics. BASIC was built into the computer along with some pretty good documentation, so it was possible to get started on the ZX80 straight away and do some coding. As with a calculator, each key had many different functions – for example, almost all the BASIC keywords were generated by a single key press.
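The single-key keyword scheme can be sketched in a few lines of Python. This is purely illustrative – the key-to-keyword mapping below is hypothetical and does not reflect the real ZX80 keyboard layout – but it shows the idea: in keyword mode, one press emits a whole tokenised BASIC keyword rather than a single letter.

```python
# Hypothetical sketch of Sinclair-style one-key keyword entry.
# The mapping is invented for illustration; the real ZX80 assigned
# keywords to keys according to its own keyboard layout.

KEYWORD_KEYS = {
    'P': 'PRINT',
    'G': 'GOTO',
    'F': 'FOR',
    'N': 'NEXT',
}

def expand_key(key: str, keyword_mode: bool) -> str:
    """In keyword mode a single press yields a full BASIC keyword;
    otherwise the key produces just its own letter."""
    if keyword_mode and key in KEYWORD_KEYS:
        return KEYWORD_KEYS[key]
    return key

print(expand_key('P', keyword_mode=True))   # PRINT
print(expand_key('P', keyword_mode=False))  # P
```

Storing each keyword as a single token rather than individual letters also saved precious bytes in the ZX80's 1KB of RAM, which is part of why the scheme suited such a small machine.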

It had its flaws: the display went blank whenever the computer was processing, and poor ventilation meant that the ZX80 was prone to overheating (the black slots that look like cooling vents are in fact merely cosmetic). However, Science of Cambridge sold around 100,000 units in a lifespan of just over a year, and the machine was so popular that there were significant waiting lists.

The ZX81 followed in 1981, which was almost identical in overall architecture but had a lower chip count, more features and crucially was cheaper. In 1982 Sinclair launched the ZX Spectrum, and between them these inexpensive compact computers sold in their millions.

Despite selling so many machines, the ZX80 is a rare beast these days. Typical prices for a complete working system are around £600 or so, quite a bit more than it would have cost in real terms. Alternatively you can buy a modern kit such as the Minstrel and assemble one yourself for much less.

Image credit: Rain Rabbit via Flickr
Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)