
Is Vista really that BAD?

Discussion in 'Windows - Software discussion' started by as31, Feb 9, 2007.

  1. as31

    as31 Regular member

    Vista only just came out here in Europe, but all I read here is that it's a total piece of $hit... Is it really that bad?

    I currently run XP with Service Pack 2; would it be worth the upgrade?
     
  2. rulisky

    rulisky Regular member

    There are numerous Vista threads posted here. Have you read them? They all seem to say it's bad.

    I am avoiding Vista and holding out for the next OS.

    You decide.
     
  3. as31

    as31 Regular member

    Yes, I have read them. Although the main 'story' I get is:

    - In a few years EVERYONE will have Vista, so just wait for now until some bugs are fixed, and that it's a bit like MS Media Center. But since I don't have that, I asked what other people thought.
     
  4. as31

    as31 Regular member

    And by the way, I'm talking about Vista Ultimate here...
     
  5. SadJoker

    SadJoker Regular member

    The hell I will have Vista in a few yrs. In a few yrs, Vista's gonna be outdated same as XP. MS will be shoving another bloated so-called 'innovative' OS down consumers' throats.

    In a few yrs, I will most likely either a. still be running XP Pro x64, b. have moved to a Mac, or c. have installed OS X on my machine. Vista will NEVER touch my hardware.
     
  6. PeaInAPod

    PeaInAPod Active member

    The problem I see with Vista is that it's anti-consumer, restricting what you can and can't do. And Vista Ultimate has a retail asking price of $400! The main feature of the Ultimate version is being able to connect to a Windows domain, which is fairly useless anyplace outside a corporate setting. Another feature is called BitLocker; it is essentially a program to encrypt your hard drive's contents so they can't be read if the drive is stolen or whatnot. It works, I believe, by reading a chip/the BIOS on the mobo to validate that your PC is the PC associated with that hard drive. If it is, then your drive becomes "unlocked". Now this is actually a handy feature, but what happens if your motherboard fails? You're screwed! In my opinion there is a better option. It's called TrueCrypt. It's free, open-source, and a much better program. Now TrueCrypt is a complex program, so I won't go into the specifics of how it works here.

    Basically, if you really want/need Vista I would recommend Home Premium. It has the basic Vista stuff, Aero Glass, and whatever other features are standard. However, if all you want is the look of Windows Vista, I would recommend trying out the Vista Transformation Pack. It replaces the icons that give Windows XP its look with Windows Vista icons; basically a Vista "skin" you use on XP. I use it and love it: the looks of Vista with the compatibility/freedom of XP.
     
  7. rubixcube

    rubixcube Regular member

    Vista is awesome. I have the Ultimate edition and have been using it for a while now, with no problems at all. It's far superior to XP. But it's just like going from Windows 2000 to XP all over again; you kinda have to learn to do a few things before you feel at home. My opinion: it's tops :)
     
  8. colw

    colw Active member

  9. rubixcube

    rubixcube Regular member

    Yeah, that's one thing. It worked before; don't know why it doesn't now :( :S
     
  10. borhan9

    borhan9 Active member

  11. jbelder

    jbelder Member

    You going to wait another half a decade?
     
  12. The_Fiend

    The_Fiend Guest

    If he's smart, he will.
    And since he seems smart enough, I'm guessing that answers that :)
     
  13. Ank_3

    Ank_3 Member

    My brother-in-law works for the BC Genome Sciences Centre, based out of Vancouver. They're a huge organization that employs at least 1,000 scientists across 5 departments and just received a $90 million grant for their work on the SARS virus. The comp-sci dudes he works with say they won't transfer to Vista for at least 5 years. That's my two cents.
     
  14. The_Fiend

    The_Fiend Guest

    That post proves that Canadians have more sense than most of my fellow Americans.
    'Nuff said.
     
  15. jbelder

    jbelder Member

    What do you think will be better than Vista? Have you even tried Vista? I have been using it for a couple of weeks, and from what I've seen it's way better than XP.
     
  16. Ank_3

    Ank_3 Member

    Agreed! =D
     
  17. Sandwich1

    Sandwich1 Guest

    I've got Vista Business (almost the same as Ultimate) and I have found a lot of annoying things about it. For example, I deleted my Pictures folder (fully) and then created a new one, but Vista still makes a replacement for the one I deleted every time I start the computer (if I delete the one it has created). I got rid of that problem, but it's still kind of stupid.
     
  18. The_Fiend

    The_Fiend Guest

    @jbelder: I work in the IT security business, and I was unlucky enough to have the displeasure of testing the final version in December to see whether or not we would use and/or sell that piece of sh*t.
    And I don't know what your definition of "better" is, but if this is better than XP, I think I'll stick with the inferior product.
    Because IMHO, better means improved security, improved ease of use, better compatibility, fewer restraints, and more control over the system.

    Vista offers big security holes, annoyances in its dramatically failing attempts at improving security, almost no compatibility (I had to try three different mobos before I found one that would work), enough DRM to make a soccer mom cry, restraints on what you can do with your system, and the loss of control over what you can and cannot install to Microsoft.

    If you dare call that improvement, I suggest you have yourself committed to a mental institution.
     
  19. jbelder

    jbelder Member

    Well, you know, that's what's so great about the good old USA: freedom of choice!
     
  20. ireland

    ireland Active member

    A DRM product from the movie studios and Microsoft...


    The problems with Vista laid bare

    Part One What we got and why


    By Liam Proven: Wednesday 21 March 2007, 08:43
    "The only problem with Microsoft is that they have no taste. They have absolutely no taste." - Steve Jobs to Robert X Cringely in Revenge of the Nerds

    REGARDLESS OF WHAT Bill Gates might have claimed in interviews, a lot of the goodies in Vista have – I'll be diplomatic – drawn inspiration from rival products, primarily Apple's Mac OS X. The trouble is that Microsoft has prioritised the wrong bits, taken the wrong inspiration. And the sad irony is that if it had made different choices, we'd have got a simpler, faster, safer Vista a lot sooner.

    So where has MS got its ideas from, and where should it have looked instead? You don't have to look far.

    System-wide instant search and a query-driven, location-independent view of the filesystem are very useful things to have. Microsoft spent years working on "Windows Future Storage", a complex system that would store all your data in a SQL database.

    Instead, Apple came up with Spotlight. A minor tweak to the filesystem code means that every file saved to disk is indexed as it's saved, making it simple to offer near-instant file searching. Once you've got that, it's relatively trivial to offer folders showing all image files, or all files containing "Dear Mrs Jones", or all files of over 38.45KB generated by user "alice" between 2:30 on 13 April 2004 and 5:15 on 26 September 2005. No need for a big heavy relational database or SQL or anything else.
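    The index-on-save trick described above can be sketched in a few lines of Python. This is purely a toy model - the record fields and function names are invented for illustration, not how Spotlight is actually implemented:

```python
import os

# Toy index: one metadata record per file, updated as files are "saved".
index = {}

def on_file_saved(path):
    """Hook run whenever a file hits the disk: record its metadata."""
    st = os.stat(path)
    index[path] = {
        "name": os.path.basename(path),
        "size": st.st_size,
        "mtime": st.st_mtime,
    }

def query(predicate):
    """A 'smart folder' is just a saved predicate over the index."""
    return sorted(p for p, meta in index.items() if predicate(meta))

# Index a couple of files, then ask for everything over 10 bytes.
for name, body in [("a.txt", b"hi"), ("b.txt", b"x" * 64)]:
    with open(name, "wb") as f:
        f.write(body)
    on_file_saved(name)

big_files = query(lambda m: m["size"] > 10)
print(big_files)  # ['b.txt']
```

    Because the index is maintained incrementally at save time, each query is a lookup over already-collected metadata rather than a crawl of the whole disk - which is what makes the "near-instant" searching possible.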

    And guess what happened? Microsoft quietly dropped WinFS and Vista gained a very Spotlight-like index-driven search system instead. On Linux, GNOME's Beagle search is much the same. If you use an older version of Windows, you can get much the same functionality from Google Desktop Search, MSN Search Toolbar or others; they don't have the benefits of integrating with the shell, or of being able to put indexing hooks right into the filesystem, but the end result is similar.

    The idea for Vista's pseudo-3D GUI surely came from OS X – as it did with open source versions such as Compiz and Beryl. The live previews of taskbar buttons strongly resemble the live icons in OS X's Dock. The visual inspiration for the 3D window flipper was probably Sun's 3D Java desktop Looking Glass, but the competitive pressure was surely from OS X's Expose.

    A bit of background
    Bear with me for a second. To understand what Vista does and why, you have to know the history - and the history comes from a different company.

    The whole concept of using a 3D graphics card to accelerate the windowing system comes from the Mac's Quartz Extreme. Apple's OS X is based upon NeXTStep, NeXT Computer's pioneering Unix-based OS from the 1980s. The NeXTStep GUI didn't have hardware acceleration, because it was drawn by Display PostScript, which was too complex for the simple 2D graphics cards of the time to provide any useful help. OS X's graphics system, Quartz, uses Adobe's royalty-free PDF imaging language instead – which is derived from PostScript anyway. In OS X 10.0 and 10.1, Quartz was unaccelerated.

    Modern 3D cards, though, are powerful processors in their own right - as is demonstrated by the fact that they're getting used for real computation now, such as the GPU version of Folding@Home. The innovation in Quartz Extreme, which appeared with OS X 10.2, is that Apple's engineers found a way to use a 3D card to speed up their GUI. In QE, each actual window is handled as a 3D object by the GPU - which means that the GPU does the grunt work of compositing (calculating which window appears on top of which other ones, which bits are hidden and so on), moving windows around, scaling them and so on.

    What that means is that suddenly all these special effects are computationally "cheap" - in other words, they don't take much CPU power. It's not the CPU doing the work, the graphics card is. This being so, the developers can use them far more widely. You get pretty but largely gratuitous effects like transparency, drop shadows and zooming translucent "ghost" icons when files are double-clicked - but it also enables useful new features, like Exposé. In case you're a deprived Windows user, I'll explain: Exposé is a feature of the OS X windowing system where hitting a hotkey shrinks all the windows onscreen - either showing a single application's windows side-by-side, or all applications side-by-side, or even zooming them all offscreen so that you can get at the desktop for a moment without minimising or otherwise disturbing the arrangement of your windows.
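    Geometrically, Exposé's shrink-and-tile behaviour is just scaling each window into a grid cell. Here's a toy sketch in Python - the function name and margin are invented, and the real thing hands all of this to the GPU:

```python
import math

def expose_layout(windows, screen_w, screen_h, margin=20):
    """Toy Exposé: shrink every window and tile them in a grid.
    `windows` is a list of (w, h) sizes; returns (x, y, w, h) per window."""
    n = len(windows)
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    cell_w = screen_w / cols
    cell_h = screen_h / rows
    placed = []
    for i, (w, h) in enumerate(windows):
        # Scale the window uniformly so it fits inside its grid cell.
        scale = min((cell_w - margin) / w, (cell_h - margin) / h, 1.0)
        sw, sh = w * scale, h * scale
        col, row = i % cols, i // cols
        # Centre the shrunken window within its cell.
        x = col * cell_w + (cell_w - sw) / 2
        y = row * cell_h + (cell_h - sh) / 2
        placed.append((x, y, sw, sh))
    return placed

layout = expose_layout([(1280, 960), (800, 600), (640, 480)], 1920, 1080)
```

    The point of the article's argument is that once windows are GPU textures, applying a transform like this costs the CPU almost nothing - the geometry above is trivial, and the compositing is free.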

    There are several important points to realise here.

    Firstly, all the shrinking, scaling and other "chrome" is done in hardware, so it puts no extra load on the CPU.

    Secondly, all the chrome is part of OS X's Quartz graphics API - these sorts of features are an inherent part of using a sophisticated language like Postscript or its descendant PDF to implement the display. This means that the developer doesn't have to make great extra efforts: they can just specify a window as 80% transparent or tell Quartz to rotate it 74° clockwise and shrink it 18% and it just happens. Such facilities are built into the OS, so it does the grunt work of implementation, not the application developer.

    On a machine with a poor graphics card, the effects are rendered in software, slowly - so even users of old Macs don't miss out on the fun. But on a modern machine with a powerful card, it all happens in hardware, automatically, and the programmer need neither know nor care. That's the third benefit - implementing things this way doesn't exclude users with older, slower hardware. It levels the playing field for developers and users alike.

    The Microsoft way
    Compare this with Microsoft's effort in Vista. All the chrome happens through extensions layered on top of the basic 2D Windows GDI (Graphical Device Interface). If the user has one of the Basic editions of Windows Vista, they don't get the 3D effects at all - those editions don't include the feature. If they have one of the more expensive editions but a low-spec graphics card, they don't get the snazzy effects either, because they only work on certain graphics chips.

    Which means that the developers have to take these things into account. And, of course, all this stuff is new in Vista. Microsoft has realised the implications of this and is releasing an update for XP to bring some of the new APIs to older versions - but if you're on Windows 2000 still, forget it. Naturally, along with the down-market Vista customers, XP users don't get the new 3D effects, either.

    So if a developer incorporates this 3D stuff into their app, it won’t display on all versions of Vista and won't work at all on older versions of Windows. Which means more work for the programmers, ensuring that things degrade gracefully.

    To be fair, these problems aren't unique to Vista. Now that this 3D whizzery has been done, naturally, everyone's doing it. In the open source world, there are two parallel efforts - AIGLX and Compiz. The technical details of how they work are irrelevant; suffice to say that they both use the 3D card to render and display the desktop, using two different underlying approaches. That's one of the joys of Free software - with no accounts department to have to justify things to, multiple different approaches can be tried. The one that works better or has some compelling advantage will succeed.

    The snag is that the same problems that face Microsoft, trying to bolt 3D onto a twenty-year-old 2D-based imaging model, also face the Linux world. On the Free side of the fence, though, users replace and update their software far more often. New releases of Ubuntu Linux come along twice a year, the same as its GNOME desktop - very roughly ten times as fast as the Windows rate of about twice a decade. And of course it usually costs little or no money to update to a new version of Linux, whereas every successive release of Windows gets more expensive than before.

    So why are we getting all this 3D glitz, anyway?
    Regardless of where they come from, these sorts of features are rapidly becoming not only accepted but expected in modern desktop OSs. So we can't really blame Microsoft for doing what everyone else is doing. It gives the "wow" factor, the instant appeal of the shiny, which is the main driver of the Vista marketing campaign. Although of course the real thing propelling Vista uptake isn't adverts or upgrades, it's that soon all new PCs will come with Vista on. If it could afford to be patient, Microsoft could save its massive promotional effort - in a year or two, Vista will all but replace XP anyway.

    The main reason it's going to all the trouble is twofold. For one thing, Microsoft makes a lot more money on an upgrade copy of Windows or Office than from an OEM-bundled one, so it is in the company's interest that people buy Vista. More significantly, though, it needs them to want Vista. Microsoft, along with the whole PC industry, is predicated on growth: they really need all their customers to dump XP and buy new software - and hardware - as soon as possible.

    Actually, in recent years, the rate of improvement in PC performance has dropped off. The boom years were the 1990s, when every eighteen months, PCs got roughly twice as fast. But this is not what the famed "Moore's Law" predicts. What Intel cofounder Gordon Moore actually said was that the number of transistors it was possible to put on a chip for a given cost would double every year and a half. For a while, more transistors meant more speed - but it doesn't any more. Processor designs have got to the stage of complexity where it's difficult to make them much faster while keeping the costs down. The rate of improvement in raw speed is slowing: now, they're spreading sideways instead of upwards: more cores, more memory, GPUs that can perform more complex and more elaborate 3D rendering in hardware, and of course, bigger disks and faster communications links.
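    Moore's observation as the article states it - the transistor count you get for a given cost doubling every eighteen months - is easy to put in numbers; a minimal sketch:

```python
def transistors(years, start=1.0, doubling_period=1.5):
    """Relative transistor count after `years`, assuming a doubling
    every `doubling_period` years (18 months, per Moore's observation)."""
    return start * 2 ** (years / doubling_period)

# Over a boom decade like the 1990s, that compounds to roughly
# a hundredfold increase in transistors per dollar.
growth = transistors(10)  # 2 ** (10 / 1.5), roughly 101.6
```

    The article's point is that the curve describes transistor count, not speed - and nowadays those extra transistors go into more cores, bigger caches and fatter GPUs rather than higher clock rates.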

    Acting against this, though, is another driver: more transistors take more power, even as they continue to get smaller and smaller. Computers with more CPU cores and more capable GPUs require more electricity to run and they generate more heat - so the machines are getting bigger and hotter and noisier, and designers need to go to greater lengths to counter these trends.

    Which makes the computers harder to sell. They don’t actually work much faster than older computers, they just handle lots more data and can present it to the user with more snazzy visual and sound effects. Even for gamers, one of the hardcore performance-driven markets, the main benefits are being able to fill larger and higher-resolution displays with crisp high-definition graphics while keeping the refresh rates up.

    In summary, we're not getting fancy 3D effects and other new features because we need them or asked for them, we're getting them because they're what modern PCs can do, and the vendors - Apple included - need to find ways to make these new features attractive and desirable because they need to keep selling more computers.

    The only difference between Apple's method and everyone else's is due to history. Back in the mid to late 1980s, when Steve Jobs was kicked out of Apple and started NeXT, he hired a selection of the best and brightest computer systems designers around, and starting with a clean sheet, they set out to build the best educational computer that money could buy - based around educational standards like Unix and commercial ones like Postscript, but intended for a market that wasn't as price-sensitive as the home and business ones, which would pay for and expected powerful machines with big screens and integral networking. The NeXT team made some very smart choices back then and Apple inherited those benefits when it bought NeXT - and thus got Jobs back on board - more than a decade later. All Apple's own efforts at designing the future - Pink, Copland, Taligent, OpenDoc, CyberDog and so forth - came to nothing and ended up in the bin.

    The vision and the foresight came from NeXT, which took the best of what was out there at the time (BSD Unix, Motorola's 68030 and 68040 processors and some Digital Signal Processors), added the most promising future-looking technologies (Mach, Display Postscript, optical storage), and aimed high, for functionality not price: integrated standards-based networking, big high-resolution displays, laser printers for output and so on. NeXT also made sure that they were best-of-breed in areas like development tools, cross-platform networking and support for upcoming standards like Internet email and multimedia.

    Much of this didn't deliver - there's little sign of the CPUs, DSPs or optical drives in modern Mac OS X machines. Overall, though, the bets paid off handsomely, and nearly two decades later, everyone else is still playing catch-up.

    In the second and concluding part of this article, I'll look at the areas that Microsoft didn't prioritise when it was developing Vista - and which it arguably should have. µ






    The problems with Vista laid bare

    Part Two What might have been

    By Liam Proven: Thursday 22 March 2007, 10:07
    SO, WHAT AM I complaining about? What should MS have copied that's more important than the 3D shininess?

    When someone upgrades a computer or its operating system, what they're really doing it for is not The Shiny, it's in the hope that it will work better. Sure, glitz is cool, there's no harm in looking good - even the Linux geeks know that. What you want, though, is for your computer to be more reliable, faster or to do stuff it couldn't do before.

    Apple's been delivering that for as long as OS X's been around. Each release not only sports new features, but so far at least, it's been faster than before as well. Well, Vista doesn't deliver that. It would have been pretty nifty, and certainly faster operating systems are possible - BeOS ran rings around Windows on the PC and around MacOS on the Mac, for example, not to mention around Linux on both. However, BeOS accomplished this by jettisoning all of the legacy baggage of Windows, MacOS and Unix - it was a clean-slate ground-up OS, probably the only one to get as far as the retail market in some decades. Windows Vista is based on NT, which although a new product in the early 1990s, also carries a huge legacy of backwards-compatibility with DOS and 16-bit Windows. Sixty-four bit versions of Windows finally lose this, but they've got a mountain of their own in 32-bit compatibility.

    So, if you want your new version of Windows to still run the apps from prior versions, you're not going to get something substantially smaller and faster as well. You can have that, but only at the price of starting with a clean slate - meaning no old apps and no old drivers. And as the failure of Be showed, people aren't prepared to do that. Shame, really.

    So what else could we have hoped to see in Vista? Smaller, nope. Faster, nope. New features, yup, got that. But what's the biggie, what's the one thing that all those smug Mac users and incomprehensible Unix weirdos go on about?

    Security. There's no way around it. Windows is a security nightmare. The reason we all get thousands of spams, the reason that we have to run virus and anti-spyware checkers that slow our high-power electricity-guzzling scalding-hot PCs down to the speed of the ones they replaced, the reason that the whole Internet is bogged down with sending all those spams, the reason that criminals hold websites to ransom for millions of dollars a year: it is all Windows' fault.

    It's because of the hundreds of millions of compromised PCs that form zombie armies, sending spams, participating in distributed-denial-of-service attacks and so on, all without their owners' knowledge. They still work, they're just a bit slower. Who notices? Next year, you just buy a faster one. (With Vista on it.)

    Some argue that whatever the dominant platform is, it would have these problems. Any OS used by 95% of the world would be a magnet for the bad men, for the hackers and crackers and blackmailers and extortionists. Well, that may be; it's an unanswerable question, doomed to be eternally hypothetical unless something takes over from Redmond's hegemony and becomes the new near-monopoly. And that doesn't seem to be too likely just yet.

    But if you know something about different OSs and the way they work, it's indisputable that Windows is less secure than most of its rivals. Sure, it's possible to lock it down; a properly-set-up Windows workstation on a well-built corporate LAN can be pretty safe. But outside of those corporate LANs, and even on the very many of them that are not well and properly designed and configured, Windows is a honeypot for malware.

    There are several reasons for this.

    For one, by default, the local user account or accounts on a Windows PC - be it NT, Windows 2000 or XP, doesn't matter - are administrators. They have full power and control over the PC: they can see anything, change anything, delete anything, install anything.

    What's more, they have to be. Lots of Windows software assumes that you have admin privileges, and either won't work properly or in some cases won't work at all if you don't.

    Another of the big holes in Windows is Internet Explorer. IE itself is not a bad browser, but the way it's been implemented and used is problematic. It's not only the web browser, it's also used to render a lot of the user interface. Secondly, Microsoft's model for dynamic content, ActiveX, is a security nightmare. The way ActiveX works is to download a normal Windows executable program - in geekspeak, a "native binary" - from the remote server, and then run it on the local machine with full local privileges. In other words, the stuff that you download from the Internet effectively becomes part of your installation of Windows, even if only temporarily. This sort of thing makes vandals' and security crackers' faces light up with delight and fills security consultants with the screaming heebie-jeebies.

    So what could be done about this? How could Microsoft have fixed this in Vista if it wanted?

    Firstly, the business of local accounts with full admin access. This is criminally stupid. No other serious multi-user OS works this way. There is a Right Way to do this, which everyone else does. (Well, all right, apart from a few heretics like Linspire and Puppy Linux, where you normally log in as root - but all the Linux pros regard those with creeping horror.)

    No, what you do is this: you take the keys to the admin account and you lock them away. It can have a password if you like, but it needn't - OS X and Ubuntu Linux, for example, set it to something random, so that you can't use it even if you know it's there. You don't show it to users, which XP gets right - but the trick that XP misses is that you must compel all ordinary users to have restricted, non-admin accounts. You make it impossible - or at least really hard - for normal accounts to have super-user powers. Ordinary users can see their own files, but not each other's or the protected ones of the OS itself, and they can't touch anything that might cause problems.

    But obviously, sometimes, if you're running your own system, you have to do admin-type tasks. You have to be able to install apps and drivers, perform updates and so on. So, when the user tries to do something like this, you ask them for a password. Not the admin password - you don't let them have that, it's too dangerous - no, you just ask them for their own password again. Then that program, and that program alone, temporarily gets its privileges escalated to admin rights. The critical point being that the user never, ever logs in as the admin under any circumstances, because then, everything runs with admin privileges and that's just too dangerous.
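    The scheme described above - everyone runs restricted, and a single operation is escalated only after the user re-enters their own password - can be sketched like this. The user table, names and return values are all invented for illustration:

```python
# Toy model of "restricted by default, escalate one action at a time".
USERS = {"alice": {"password": "s3cret", "is_admin_capable": True}}

def run_privileged(user, password, action):
    """Escalate ONE action after the user re-enters their own password.
    Nobody ever logs in as the admin account itself."""
    record = USERS.get(user)
    if record is None or record["password"] != password:
        raise PermissionError("bad credentials")
    if not record["is_admin_capable"]:
        raise PermissionError("user may not perform admin tasks")
    # Privileges apply to this single call only, then lapse.
    return action()

result = run_privileged("alice", "s3cret", lambda: "driver installed")
```

    The key design point is in the last comment: escalation is scoped to one call, so a compromised program running under the user's normal account never inherits standing admin rights.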

    So far, so good. This sounds a bit like Vista's User Account Control, as parodied by the big suited CIA type in Apple's latest "Get A Mac" ad - doesn't it? It's not the same thing, though. Vista does it backwards; instead of compelling everything to the lowest level and raising it when necessary, in Vista, the programmers restrict the privileges of risky programs - so that Internet Explorer runs with reduced privileges, for example. And when you want to do something administrative, like changing hardware settings, Vista warns you that you're doing so - but all you have to do is say yes, you want to continue.

    There are two mistakes here.

    One is the idea of forcing dangerous, risky programs to run with reduced privileges, because this is the "blacklist" model of security: it's all right unless it's something that we know is not OK. The problem being that you need to keep your blacklist up to date, and if it isn't, you have a problem, because something that used to be safe may no longer be, or it may be something new that isn't in the list at all.

    It's better to do it the other way round - the "whitelist" model: to assume that everything's dangerous and only allow the stuff that you know is safe to happen. Or if you don't know, not to have a whitelist at all - mistrust everything and always ask for permission.
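    The difference between the two models shows up in a few lines of code. The program names here are made up; what matters is how each policy treats something it has never seen before:

```python
# Two toy policy checks over the same request.
BLACKLIST = {"known_trojan.exe"}            # "allowed unless known bad"
WHITELIST = {"notepad.exe", "winword.exe"}  # "denied unless known good"

def blacklist_allows(program):
    """Blacklist model: anything not explicitly banned runs."""
    return program not in BLACKLIST

def whitelist_allows(program):
    """Whitelist model: nothing runs unless explicitly approved."""
    return program in WHITELIST

# A brand-new, never-seen piece of malware sails past the blacklist...
assert blacklist_allows("fresh_malware.exe") is True
# ...but the whitelist denies it by default.
assert whitelist_allows("fresh_malware.exe") is False
```

    Both policies fail when their list goes stale, but they fail in opposite directions: a stale blacklist lets bad things through, while a stale whitelist merely blocks something legitimate until it is approved.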

    Secondly, when the computer asks for permission, it doesn't do so with a simple "yes/no" prompt, because after a while, people get used to these prompts and stop reading them - they just click "OK" or whatever seems the easiest way to make the irritating message go away. Instead, you ask them to re-enter their password, and, it almost goes without saying, you compel them to have a password - not just a blank, and ideally, a good one, with a mixture of letters and numbers, even different cases. There are easy ways to remember these, such as using the initials of your address - like 53VMMS for 53 Veals Mead, Mitcham, Surrey, which no dictionary-based attack will ever find.
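    The address-initials trick is mechanical enough to write down. A small sketch (the function name is invented; the rule - keep house numbers whole, take the first letter of each word - is the one the article describes):

```python
def address_mnemonic(address):
    """Build a password from an address: keep numbers whole,
    take the first letter of each remaining word, uppercased."""
    parts = address.replace(",", "").split()
    return "".join(p if p.isdigit() else p[0].upper() for p in parts)

pw = address_mnemonic("53 Veals Mead, Mitcham, Surrey")
print(pw)  # 53VMMS
```

    The result mixes letters and digits and appears in no dictionary, yet is trivial for its owner to reconstruct - exactly the trade-off the article is recommending.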

    This isn't rocket science. It's what several successful desktop OSs already do. What's more, most of the necessary functionality is already built into XP, and has been since Windows 2000: the "Run As" command. This lets an ordinary, unprivileged user run a single command as a different user, so long as they know the username and password of the other account. It does the same thing as the Unix "sudo" command, which is the basis of Ubuntu's security model, for example. The only change that would have been needed would be for the user to give their own password - pretty simple stuff.

    It's simpler and safer than Vista's elaborate UAC system, which, if it winds you up, you can simply go into Control Panel and turn off - another fairly heinous security no-no. This in itself is a give-away, too – you can’t just “turn off” the reduced privileges of a user account, but you can turn off the warnings on an admin account with extra safeguards.

    So that's one difference that would have saved a lot of time and effort for better results.

    There would have been a price to pay: as already mentioned, lots of Windows applications expect and demand admin rights and go wrong if they don't get them. This sort of fix would break all those programs, and for all its sins, Microsoft does bend over backwards to keep as much backwards-compatibility as it possibly can.

    But the thing is, code that demands admin privileges is broken. It's not merely a bad idea, this is just plain wrong - and whereas Microsoft created the circumstances that led to this situation, to be fair, it didn't write all those broken apps. It really isn't Microsoft's fault if third-party vendors are supplying broken software.

    Vista is a major rewrite of Windows. The new display model means that old graphics drivers don't work any more; nor do lots of other drivers. This and other changes mean that lots of programs don't work properly on Vista any more. Such as all of Apple's applications, from iTunes on down. I'm sure that was a complete coincidence.

    Vista breaks lots of apps anyway. So did XP in its day, and 2000 before that, and NT before that. Indeed, the move from DOS-based Windows - 95, 98 and ME - to NT-based Windows - 2000 and XP - broke loads of old apps that relied on the DOS-based underpinnings of the old versions. It happens. It's the price you pay for improvements. Either the software vendors fix their apps, or you replace the apps - it's what anyone upgrading their OS to a new version faces anyway. But the change I propose would have been worth it. The stuff that would have been broken was already broken.

    You don't get something for nothing - at least, not from a commercial outfit like Microsoft. No pain, no gain.

    So much for the user account thing. The other big hole in Windows is Internet Explorer. IE7 is an improvement: tabs and RSS support make it more powerful and usable, and the built-in phishing protection is a step in the right direction.

    But the big problems with IE are still there. For one, ActiveX. It was a bad idea in 1995 and it's a worse one now. No other browser works this way. The Mozilla family is derived from Netscape, which has its own plugin system and uses relatively safe, "sandboxed" environments for running interactive content, such as Flash and Java. In the beginning of the Web, Netscape was the dominant browser, and though I bet few people remember it now, the up-and-coming Internet Explorer supported Netscape plug-ins so that users moving to IE didn't lose all their advanced features. That support was dropped around the time of IE 5.5, as by then IE had triumphed and was the new power on the Web.

    It used to do it. I'm sure it could do it again. ActiveX should never have happened, but it's not too late. It still could be banished. Sure, it would break quite a lot of websites, but thanks to the success of Firefox and Mac OS X, there are fewer and fewer sites around that are completely specific to IE running on Windows any more. For all the gains in security, it would be worth forcing some web designers to rework their sites.

    Microsoft has, belatedly, realised that the "safe sandbox" approach is the way to go - it's the basis of the "managed code" of .NET. All .NET programs run in a sort of protective sandbox. It's time to banish unmanaged code from the Web, too. Just slapping warning notices all over it - "Did you notice the Information Bar at the top of the window?" - is not enough.

    The other problem with IE is also fixable: its shell integration. Internet Explorer first appeared as an optional extra for Windows 95. (NT3 never got a 32-bit version of IE. If you downloaded IE for NT3, what you got was the 16-bit version for Windows 3, with the Windows 3 bits that didn't work on NT taken out.) It only got bundled with Windows 95 OSR2 and became integrated into the Windows shell with Windows 98.

    Funnily enough, this is around the same time that Netscape was suing Microsoft for anticompetitive practices - viz., unfairly bundling its free browser with Windows to displace the commercial Netscape offering.

    Microsoft's defence was that IE wasn't a standalone product: it was part of the OS, a component of Windows. (Never mind that Windows 95 didn't originally come with it - and neither did any older version of Windows - and that you could install Windows without it, or remove it once it was installed.)

    Right after this, Microsoft integrated IE deeply into Windows 98. The new and improved "Active Desktop" in Win98 was driven by IE: Explorer window contents were generated in HTML which was rendered by the core DLLs of IE, as were JPEG images. Remember that message when setting a JPEG as your wallpaper? "To display this image, Active Desktop must be enabled." If you use a Windows Bitmap (BMP) as the wallpaper, Explorer can display it itself, because BMP is a native format. JPG isn't, it's a Web format, so IE is used to display it.

    The thing is that it's a bad idea to use the same code to display remote content from the Internet as for the local user interface. Remote content, by nature, can't be trusted - you don't know who provided it or what it does. So programs displaying that content need to be really paranoid and careful, and you ought to run them in a self-contained, isolated process that can't affect anything else that's going on.

    Local content, on the other hand - like a list of the contents of your own hard disk - you have to trust. The user interface of the machine must be able to view and manipulate anything on the machine, or it can't do its job.

    These are two different roles. Using the same code for both is a bad plan, for several reasons.

    For one, any exploits or vulnerabilities in the web browser automatically become vulnerabilities of the whole machine when that browser is part of the OS and always running. If the baddies can somehow sneak a dodgy file onto your computer, then even if you're disconnected from the Internet, if you open that file, your box is owned; the vulnerability is always present. You can't really firewall a computer from itself.

    Secondly, the two roles demand different functions and different ways of working. The local interface should be fast and slim and sleek, because there's no delay in retrieving the information to be displayed: it's right there on the machine already, and up to a point, it can therefore be trusted. Faster PCs are making it harder to spot, but on Windows 98 machines, you could often catch a glimpse of Explorer showing a grid of blank icons before it fetched and rendered the actual images - just like watching a web page display slowly over dial-up. The fact that better performance makes this invisible doesn't excuse it being there in the first place when it shouldn't be.

    The remote interface needs to be careful, paranoid, isolated from local storage, and it needs to cope with delays in stuff appearing. It needs to be complex and capable, to handle elaborate, fancy web sites, whereas the local file browser is only going to be displaying simple, known quantities - lists of files, previews of images and so on.

    It may sound like having two sets of code to render and display images and so on is needless duplication, but that happens a lot on computers - it's a fact of life. Your kidneys and liver both perform functions of filtering your blood and removing nasties from it, but they're different enough versions of the superficially similar job that it's worth having two different types of organ to do it.

    The web browser ought to be a separate subsystem with no connection to the machine's own user interface, freeing it to be large and clever. The local file browser should be simple, fast and responsive. There's no need to turn the view of the local filesystem into HTML, then pass that HTML through the web browser. Yes, it makes it easier to have fancy, customisable views with task-specific bars down the side and so on, but this is an inefficient way of doing it.
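    The isolation argued for here can be sketched in miniature at the shell level. This is illustrative only - the "renderer" below is a stand-in process that simulates crashing on hostile input, not real browser code:

```shell
#!/bin/sh
# Sketch: handle untrusted content in a separate, expendable process,
# so a crash (or exploit) there cannot take the local UI down with it.

# The trusted "file manager" process is this shell itself.
echo "UI: starting isolated renderer for untrusted content..."

# The "renderer" runs as its own process; here it simulates crashing
# on a hostile page by killing itself with SIGSEGV.
sh -c 'echo "renderer: parsing untrusted page"; kill -s SEGV $$'
status=$?

# The UI survives the renderer's death and can report the failure,
# instead of dying along with it - the opposite of IE-in-Explorer.
echo "UI: renderer died (exit status $status), interface still running"
```

    When the renderer is the same code as the shell - as with IE inside Explorer - there is no such boundary, and the crash (or exploit) is the whole desktop's problem.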

    (And yes, I know that KDE does the same thing with local and remote browsing, but then, firstly, KDE runs on a proper, secure OS with restricted privilege levels, secondly, it has no worries about ActiveX to contend with, and thirdly, its developers are considering replacing Konqueror as the file manager in the next version. Personally, I shifted to GNOME years ago, anyway.)

    Windows 95 didn't include IE, so the Windows Explorer didn't do any of this HTML stuff. And the same Explorer powered NT4, too. Yet it was the same basic GUI as we have today - task bar, Start menu, folder windows and all. The Windows 98 Explorer brought in lots of handy extras - customisable toolbars and window views, thumbnail icons, JPEG wallpapers, drag and drop Start menu editing, all that sort of thing. Some of its features are really hard to live without, like multithreaded file operations - while it's copying or moving files, you can get on with something else. The old Win95/NT4 Explorer froze up until the operation was finished. The old progress bars didn't work, either - they showed the progress of each file, not the whole operation, so all you saw during multi-file operations was a blurred, flickering bar that was constantly being redrawn - telling you precisely nothing about how far the job had got.

    But fixing the old Explorer wouldn't be that hard. Adding in working progress bars, multithreading, thumbnails and customisation and so on does not have to mean using IE to display everything. Netscape is dead and gone and the Mozilla Foundation doesn't care; nobody is going to force Microsoft to remove IE from Windows now, or demonstrate that it's an integral part. It is, now, and has been for nearly a decade.

    A simpler, cleaned-up IE with no ActiveX, which played no part in the local GUI, would make Windows safer against attack. A simpler renderer for local JPEG and HTML content, so that HTML Help files and so on still work, would be an easy job. At the same time, the use of the IE renderer to display everything from Windows Messenger chats to emails in Outlook and the display in Media Player brings IE's vulnerabilities to all those programs as well. They ought to be using a simpler, safe renderer too, rather than the full-on IE browsing code.

    In an ideal world, I'd like to see IE completely replaced. There are enough HTML rendering engines out there - Mozilla's Gecko, KDE's KHTML (the basis of the WebKit engine used in Apple's Safari browser), Opera's code and so on. No need to use an open-source one - just buy Opera, or hire the developer of the Mac browser iCab, say. There were too many bad decisions made in the course of IE's development, some of them motivated by commercial concerns like the Netscape lawsuit rather than the technical considerations which should rule such a sensitive, critical piece of software.

    Yes, getting rid of IE would break some websites, but then, IE7 does that anyway, and IE8 will doubtless break more. It would be a price worth paying.

    These are the sorts of changes that could have made a really big difference to Vista. Not bolting on new code to hedge around the dangerous bits with warning messages and reduced execution privileges, but adopting the same models that everyone else uses - limited user accounts, hidden or inaccessible admin logins, and strict isolation of untrusted remote content from local files, handled by different rendering engines.

    These are big changes and they would have caused lots of problems, but that always happens anyway. It's unavoidable. They wouldn't have by any means fixed Windows altogether and made it a completely safe system, but they would be big steps in the right direction.

    It's never going to happen now - it's too late for Vista, and after this, there will probably never be such a big change in Windows again, until it's replaced with something new.

    But here's a fun thought. What if Microsoft were held legally responsible for all those vulnerable, insecure Windows installations out there? You may not have heard of it, but there's a special lightweight edition of XP for turning old PCs into thin clients, called Windows Fundamentals for Legacy PCs. It's the core of XP with almost all the features removed - it can't even "ping" an IP address - but it's still the same old XP. Only this version can run on pretty much any old box that will run Windows 98: 64MB of RAM and a few hundred megabytes of disk is enough.

    How about a special cut-down edition of Windows for all those people who won't or can't upgrade to Vista, given away as a free upgrade - just as Outlook 98 was given away on magazine cover disks to get people to replace the woefully buggy Outlook 97? An enhanced "Vista Fundamentals", with a fixed, safe Explorer, no default-privileged accounts and a sanitised IE, and none of Vista's whizzy new features, free to anybody with an existing version of Windows. That would go a long way toward persuading people to finally abandon all the old versions of Windows, and lots of them might then be tempted into buying the full Vista.

    No, I know, it'll never happen. But wouldn't it be nice if...? µ
    http://www.theinquirer.net/default.aspx?article=38419
     
