Why do old PCs become so useless?

Started by Technical Ben, Nov 08, 2010, 11:31:50


Technical Ben

We all know that, 12 months down the line, it's more than likely your old computer will be an antique by most standards. Internet browsing will be slow. Games or software will refuse to work. Things do not seem fair. Your car lasts up to 10 years and still manages to keep up with the rest of the traffic. Your PC? While it will still be going in that time, it will have trouble keeping up with the rest. Moore's Law suggests that computing power doubles roughly every two years. An example of this is the 3-core PC I bought over a year ago, which already looks quite meagre compared to the 6- and 8-core machines coming out.

 To visualise why this is a problem, think of a chessboard. We put 1 grain of rice on the first square, then double the amount on each square after that. So square 2 has 2 grains of rice, square 3 has 4, and square 4 has 8. Square 5 is 16, and squares 6 and 7 are 32 and 64 respectively. We start to notice a pattern that computers follow too: it is just like the way bits multiply in a computer, where each extra bit doubles the range of numbers a processor can count to. So, just as on our chessboard, we seem to be making only small steps. Or so it seems!
  If you continue this on, how big do you think your rice pile will be? You may be surprised, so I'll leave it as a spoiler...
















 18,446,744,073,709,551,615 grains, or about 461,168,602,000 metric tons, which would be a heap of rice larger than Mount Everest!  :eek4:
So if we apply this to computers, we can see why the first computer ever made and the most powerful* ever made are vastly different. Like a grain of rice compared to a mountain.
http://en.wikipedia.org/wiki/Wheat_and_Chessboard_Problem
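
For anyone who wants to check the arithmetic, here is a minimal sketch (Python, purely for illustration; the 25 mg per grain figure is my own rough assumption) that sums the grains on all 64 squares and converts the total to tonnes:

```python
# Wheat-and-chessboard arithmetic check.
# Assumes roughly 25 mg per grain of rice (an assumption, not a measured figure).

GRAIN_MASS_MG = 25

total_grains = sum(2 ** square for square in range(64))  # 1 + 2 + 4 + ... + 2^63
assert total_grains == 2 ** 64 - 1

total_tonnes = total_grains * GRAIN_MASS_MG / 1e9  # mg -> metric tons

print(f"{total_grains:,} grains")            # 18,446,744,073,709,551,615
print(f"~{total_tonnes:,.0f} metric tons")   # roughly 461 billion tonnes
```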

*This has 224,256 AMD processors, with 1,345,536 cores in total. I suppose that's the computing power of an entire ISP and its customers!?
I used to have a signature, then it all changed to chip and pin.

Rik

Food always gets into threads in this forum, doesn't it. ;D

Sobering thought, though, Ben.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Simon

Simon.
--
This post reflects my own views, opinions and experience, not those of IDNet.

Technical Ben

Quote from: Rik on Nov 08, 2010, 11:33:45
Food always gets into threads in this forum, doesn't it. ;D

Sobering thought, though, Ben.

Yes. I wish I could still use my old hardware, and to some extent I can, but only for the tasks it was used for back at the time. Trying to get a 1995 PC to run IE would be taxing, to say the least. I suppose the same troubles hit ISPs: when their customer base starts to explode, so does the complexity of their infrastructure.
I used to have a signature, then it all changed to chip and pin.

Rik

I wonder how pathetic my BBC B would 'feel' now?
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Gary

Flash Player needs a graphics card with 128MB of RAM these days. I remember not so long ago when 256MB was the biggest you could get.
Damned if you do, damned if you don't

Rik

Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Technical Ben

I used to have a signature, then it all changed to chip and pin.

Rik

 ;D  That would bump the power bill a bit.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

armadillo

I am now using the PC that I built in 2004. Six years on, it runs as fast as it did when I built it. Internet browsing is fast and efficient. I have hundreds of programs installed (130 icons on my desktop). They run perfectly. I use CPU-hungry and memory-hungry apps like Photoshop. I am not a gamer, but I do run Microsoft Flight Simulator and that runs beautifully too. The PC is based on a Pentium IV Prescott 3GHz hyperthreading chip and I have 1GB of RAM.

One reason it all runs so well is that I am using Windows XP. The PC is hugely faster than my 2008 dual core laptop and always was.

While it is true that modern machines get more and more powerful, software developers tend to create software that uses up the extra processing power. So we get bloatware OSs like Vista that eat up the extra memory and power just to keep themselves afloat.

I believe that much of the extra size and hunger of newer software actually gives a very small return for the extra resources it requires. For example, all I want in an OS is something that enables me to run programs that specialise in their own field. I do not want the OS to do image processing (Photoshop CS2 is fine for that), antivirus (NOD32), backups and restores (Acronis), web browsing (Firefox). Hey, I even have some mathematical stuff that I wrote in QBasic to run in a DOS window! (Not that I use it much).

The newer OSs try to do everything and you end up going to great efforts to stop them taking over all the things that are best done by dedicated software. It may well be true that the latest IE is hard to run on older hardware. I never use IE anyway so I am not all that worried.

Where you would probably have a problem is when buying new software, and especially games, developed specifically for the latest CPUs and OSs. It would not run well on the older hardware and OSs, if at all. I'm not interested in games, so that is not a problem for me.

Occasionally, I defrag the C partition and I do a registry clean when I make a significant change (such as getting rid of a software firewall after I changed from using a modem to using a router). Neither of those seems to have much effect on speed. Fast both before and after.

Having had a motherboard failure in 2008, I then managed to get a replacement used motherboard of the same make and type and rebuilt the PC and everything worked perfectly. I also got a second and third spare motherboard. And I have spares of everything - CPU, memory, graphics card etc. I aim to keep using this PC in Windows XP until there is something essential that it is unable to do. The most likely reason for that to happen will be that, one day, XP will decide it needs re-activating and Microsoft will refuse to do it.

Powers of two amaze a lot of people. The doubling you illustrate is also the reason for the success of biological viruses. The Norwalk virus, for example, is a very nasty gastroenteritis bug. You only need to get infected with about four or five single virus particles. It doubles every minute or two. So in a few hours, you have billions of those virus particles in your body. Then you are ready to infect anyone else by transmitting four or five of those airborne particles to them. The only reason it needs as many as four or five to begin is that not every virus particle is viable. Otherwise, one would do.
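
Just to put a rough number on that doubling (a back-of-the-envelope sketch only, not a biological model; the five-particle starting dose and the two-minute doubling time are simply taken from the paragraph above):

```python
# Back-of-the-envelope doubling, using the figures from the paragraph above:
# start with 5 viable particles and double every 2 minutes.

particles = 5
doubling_minutes = 2
minutes = 0

while particles < 1_000_000_000:  # one billion
    particles *= 2
    minutes += doubling_minutes

print(f"~{particles:,} particles after {minutes} minutes ({minutes / 60:.1f} hours)")
# With these assumptions it takes fewer than 30 doublings, i.e. about an hour,
# to pass the billion mark -- so 'a few hours' is plenty.
```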

A final amusing throwback to the old days is from when I was responsible for developing mainframe software. I ran one particular system that had what was described as a "very large database". When I first implemented it, I was asked if I could make that database a bit smaller as the disk storage requirements were such a huge overhead. How large was this "very large database"? It was 1GB and it needed five exchangeable discs each the size of a small fridge to store that amount! My first PC had an 800MB hard drive. And yes I do mean 800MB and not 800GB.


Simon

Simon.
--
This post reflects my own views, opinions and experience, not those of IDNet.

DarkStar

@ armadillo

So true. I have just lost my XP machine because the computer was simply worn out and fell apart after 7 years of trouble-free running. For me XP was stable, did all I wanted of it and was as fast as Win7 on my new computer - and it didn't keep telling me I was trying to put my photos in the wrong folder in the wrong order either  :mad:
Ian

Rik

I like an OS which keeps out of the way, and doesn't try to organise me. That's why I stick with XP.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Glenn

It wasn't always the case, but 9 years of development, post-release, has helped.
Glenn
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Rik

True.

Let's bring back coding in assembler; it made for neat and efficient code.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

davej99

A tendency towards nostalgia is proportional to one's age. Once I only looked forward to the next technological breakthrough. Now I mostly look back on them, as those I have witnessed far exceed those that I have yet to see. And I am intrigued that IDNetters often describe the first computer they remember. Perhaps we should give our age like an android model: I am BBC Micro; or I am XT; or, in my case, I am Ferranti Atlas. I can remember fondling that first bit of punch tape, my first program. It was never like that again. Was it useless? No.

I think I will challenge the assertion that old computers become useless. Strangely, they can still do what they were intended to do. You can still marvel when you place them in the context of their day and watch them at work, un-aged. You can say of great sporting heroes that they have become useless, but you can marvel at what they did. The difference, though, is that you cannot see that performance ever again. So in this sense the computer is the superior creation. It still does what it has always done, wanting only the odd replacement component. It can still deliver its design spec. Its speed and accuracy are undiminished. Moreover, as a species it can evolve rapidly, doubling its performance year on year. So in this sense too the computer is the superior creation. It knows what it can do and does it. It is man that is useless for expecting any more.

Technical Ben

I agree, Armadillo. That's why the riders in my replies said "if you don't put new software on it". Most of the stuff is bloatware. A few of the things are useful.

But that just shows why you can fit 4 passengers in a 30-year-old car as easily as in a brand new car, but cannot fit Adobe Photoshop CS5 on a 10-year-old PC. :/
I used to have a signature, then it all changed to chip and pin.

armadillo

Quote from: DarkStar on Nov 08, 2010, 14:47:45
- and it didn't keep telling me I was trying to put my photos in the wrong folder in the wrong order either  :mad:

:rofl:

armadillo

Quote from: Technical Ben on Nov 08, 2010, 15:59:46
But that just shows why you can fit 4 passengers in a 30-year-old car as easily as in a brand new car, but cannot fit Adobe Photoshop CS5 on a 10-year-old PC. :/

And you are so right. I am with CS2 and we are very happy together :)

esh

The fastest OS of all these days is Windows 95, because the entire OS kernel can fit in the CPU cache; it doesn't even have to go to RAM.

But I don't see people using it.

Sure, we like speed. But ignoring software just because it's newer is rather short-sighted. So you can fit the same number of people in a 30-year-old car. Do you think CS5 is the same number of bytes, or has the same number of features, as the very first version of Photoshop? It's a slightly dodgy analogy. Is a 30-year-old car as reliable? Maybe. Is it as efficient? Almost certainly not.

It's pretty interesting that software is pushing hardware these days, because back in the 80s (and, to a lesser extent, the early 90s) it was the other way around. Most of the reason newer software demands more and more hardware can be put down to a few things. Firstly, there is software having more features, as I mentioned above. Mostly, though, this adds to the size (in bytes) of the executable, and hence also the RAM footprint, but not strictly the execution speed (unless the features are directly involved). I can do realistic lens distortions in Photoshop now which I couldn't do 10 years ago, but when I'm not using that feature, there is no reason it should slow the whole code down. Look at the Linux kernel: it has grown from 3 million to more than 15 million lines of code in 10 years, but performance tests show it has pretty much stayed the same as far as speed goes (see the recent tests at phoronix.com).

No, the main reason can be put down to what I will simply term 'abstraction'. It can make things complicated, but it can undoubtedly make things better. How can I best describe this?

It's layers upon layers. It's in a company, that has an accountant to keep track of the funds, that has a bank to keep the funds. It's there when you go on holiday, for which you use a travel agent, that has an airline, that has planes and staff. One thing that relies on another, in layers. We could of course, count all our pennies ourselves and keep them in jars under the bed. I know some people do that. More unfeasibly, you could try and charter a plane yourself, and fly it yourself, using a hand-drawn map to get you there. At some level, as a society, as human beings, we voluntarily (and sometimes ignorantly) depend upon other things just being there and working for us.

This is what has been increasing in computers: the layers.
Let's look back to my first computer, a Commodore VIC-20. It had 4KB of RAM (you could use 3.5KB) and a fixed ROM, a fixed processor, a fixed sound chip, a fixed graphics chip, a fixed keyboard, and a tape drive. You could get a modem for it if you were that way inclined. But software wise, there was pretty much nothing between you and the hardware. Sure, most people coded in BASIC, but you could poke bits and alter the memory, and really do cool things (or disastrous things) on a whim.

Manufacturing costs shrunk. Computers grew - in importance, in size, in power.

Fast forward a bit and we're sitting in front of my first PC. It's also a Commodore, with a 286 12MHz processor (fixed), BIOS (fixed), onboard graphics (fixed, and it was an ATi by the way), and one of those godawful PC speakers that was actually worse than the VIC-20 sound (fixed). But now we are running DOS, and now people tend to run more than one program -- because it's *useful* -- meaning that the computer is no longer a single-purpose tool. It's multipurpose and a great business tool and oh cr*p, Lotus 1-2-3 has gone and over-run its array bounds and has now written over the OS memory space and hung the system. This can't go on, clearly. Multiple programs need to play well together. So what do we do? We'll add protected mode.
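
To see in miniature why that matters, here is a toy sketch (Python, purely conceptual; nothing here is how real mode or protected mode is actually implemented) of a flat, unprotected memory that any program can scribble over, next to a checked version that traps out-of-bounds writes:

```python
# Toy model only: two 'programs' sharing one flat block of memory.

memory = [0] * 32            # stand-in for physical memory
OS_REGION = range(0, 16)     # the 'OS' lives in cells 0-15
APP_REGION = range(16, 32)   # the application is granted cells 16-31

def unchecked_write(addr, value):
    # Roughly the situation without protection: any program can write anywhere.
    memory[addr % len(memory)] = value

def protected_write(addr, value, allowed):
    # Roughly the protected-mode idea: writes outside the granted region trap.
    if addr % len(memory) not in allowed:
        raise MemoryError(f"access violation at address {addr % len(memory)}")
    memory[addr % len(memory)] = value

# A buggy spreadsheet runs off the end of its 16-cell array...
for addr in range(16, 40):
    unchecked_write(addr, 0xFF)
print("OS cells after unchecked writes:", memory[:16])    # trampled!

memory = [0] * 32                                          # reset the toy machine
for addr in range(16, 40):
    try:
        protected_write(addr, 0xFF, APP_REGION)
    except MemoryError as err:
        print(err)                                         # the app is trapped
print("OS cells after protected writes:", memory[:16])     # all still zero
```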

I'm now on a 486 running in protected mode and Windows 3.1 is here, and it's *not entirely cr*p*. Hey, look at that, WordPerfect for Windows just crashed (remember the nice big rectangles with 'Ignore' buttons that did nothing?), but it hasn't hung the entire system. Windows keeps running. To get this far, the OS now depends upon protected mode functionality, and for my business tools such as WordPerfect and Excel, a common graphical interface is beneficial to the user, so we now add the Windows API to the list of layers too.

Of course, that wasn't the end of crashes and hangs. Oh, no. Windows 3.1 was always cr*p at games (the predecessor to DirectX, named WinG, was a bit of a joke really), but Windows 95 was going to change all that. Supposedly. So yes, games slowly started moving to Windows 95, but it was basically a long exercise in patience, interrupted by everyone's favourite Blue Screen of Death. Many games stuck to DOS, and right about now 3D graphics cards that plugged into your shiny new PCI slots were starting to hit the mainstream. How did games use them? With the Glide API, of course. Look! Abstraction!

My goodness, it's nearly the new millennium already, and here's Windows 2000. A lot of the technology in Win2k was actually present in Windows NT (New Technology), but that was a bit of a clunker (albeit a reliable one) and was used only in business. But with the advent of this operating system (which was wisely chosen over Windows ME to grow into Windows XP), suddenly blue screens are a whole lot rarer. Why is this? How did this happen? Windows NT abstracts the hardware layer. For almost anything now, apps can't start throwing stuff straight at your soundcard and graphics card, and hence if they go wobbly they won't bring your system down. It's the 'hardware abstraction layer'. Games and other 3D apps are now part of Windows, and DOS has been declining into obscurity for some time. This is mostly thanks to DirectX, which is a graphics API that - yes indeed - just happens to be another of those layers. But it's not the only one. Graphics cards have been diversifying for some years now, and it's been a long while since people have been happy with VGA resolutions. Graphics drivers are thrown onto the growing list of layers. Games, that use DirectX, that use the graphics driver, that use the hardware abstraction layer, that use protected mode. But damn, it was good.

It's Windows XP time, and Microsoft has suddenly realised that the Windows API, with legacy support back to Windows 3.0, has actually turned into an ungodly mess and is a total nightmare to code for. It may give us a nice standardised interface, and it may be just fine given enough time, but wouldn't developers love us so much more if we had a nice *higher level* API which made things so much easier? Especially with Java showing it can be done... Whoosh! Welcome to .NET, a new layer that helps people code Windows (and now, with Mono, non-Windows) apps in record time with less frustration.

It goes on. There are dozens -- hundreds -- of APIs there mostly to convenience developers and save everyone having to reinvent the wheel, and these layers, together with those that make the system more stable and more usable, are what we have arrived at today. Yes, it's true, back in the really old DOS days every developer had to hook their own interrupt handlers to catch keyboard strokes and write their own engine backend to push pixels to the screen. Those days are gone. They won't be here again, and if these layers of abstraction hadn't been so damn useful (and manageable, given the hardware power we have), they would not have happened. In short, it's human laziness. In short, it's human forgetfulness and proneness to error. If we could all write every program as a static blob that did every hardware operation itself and never made a mistake that destroyed all your other program memory, we'd have no need for all these layers and undoubtedly we'd have fast and efficient systems. But god knows how long it would take to write the programs, or how long it would take to debug them so they worked well together. At some level, there is a balance between development time and runtime, always being shifted by the evolving pace of hardware.
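
To make the 'layers' point concrete, here is a minimal sketch (Python, purely illustrative; all the names and the tiny 'framebuffer' are made up for the example) of an application drawing through a stack of layers instead of poking the hardware directly:

```python
# Purely illustrative layering: each function only talks to the layer below it.

framebuffer = [[0] * 8 for _ in range(8)]   # stand-in for video memory

def hal_write(x, y, value):
    """'Hardware abstraction layer': the only code that touches the framebuffer."""
    framebuffer[y][x] = value

def driver_set_pixel(x, y, colour):
    """'Graphics driver': validates the request, then calls the HAL."""
    if not (0 <= x < 8 and 0 <= y < 8):
        raise ValueError("pixel out of range")
    hal_write(x, y, colour)

def api_draw_line(x0, x1, y, colour):
    """'Graphics API': the convenience the application actually wants."""
    for x in range(x0, x1 + 1):
        driver_set_pixel(x, y, colour)

# The 'application' never sees the framebuffer at all.
api_draw_line(1, 6, 3, colour=7)
print(framebuffer[3])   # [0, 7, 7, 7, 7, 7, 7, 0]
```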

So that's why new software doesn't run on old systems. Going back to my hole in the ground now.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

Rik

Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

armadillo

Brilliant summary of the computing timeline, esh!

Long gone are the days when it was within the abilities of a single person to build a machine from components and program it to do something.

Technical Ben

Adding to that, on a side note: one of the reasons some set-top boxes and integrated devices are so slow is that they run a virtual machine on their CPU, a Java machine on that, and Java programs on top to run the software. Not because it's fast, but because it's easy. That's not including Android, of course, as Java can be fast if programmed correctly for the right hardware.
I used to have a signature, then it all changed to chip and pin.

esh

I actually find that virtual machines incur only a tiny performance hit as long as you are not I/O bound, on the order of 0.5%. Java for pure arithmetic is also pretty fast, but I think the whole GC mechanism can start to gum things up a little when you have tons of data. It's still less slow than leaky C code, of course.

But yes, these days it is cheaper to throw more hardware at a problem than hire and pay someone to fix broken code. I'm not sure what to think about that really.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

Technical Ben

Well, we could use Minix 3 at around 300MB...
DSL Linux at 50MB
Or KolibriOS at 5MB.
I'm tempted to try them.  ;D
I used to have a signature, then it all changed to chip and pin.