Five Features Operating Systems Should Have

Getting things to the next level...

Monday, February 28, 2005 by Frogboy | Discussion: OS Wars

My friend Eugenia over at OSNews.com was lamenting how boring operating systems have become. And I agree. How far we've fallen from the exciting times of 1991 when pre-emptive multitasking, protected applications, flat memory, and object oriented interfaces were about to be delivered to the masses.

Since then, improvements to operating systems have been incremental. Or in some cases, we've actually regressed (largely thanks to jerks taking advantage of open systems to create viruses and spyware).  IBM's OS/2 was well on its way to providing an OS in which users from around the world could seamlessly integrate new functionality into the operating system via SOM and OpenDoc.  Of course, had that occurred, it would have been the mother of all opportunities for spyware vendors and the creeps who make viruses. The 90s could be looked back upon as a time of naiveté and idealism. It was in that environment that ActiveX, VBScript, Internet Explorer, and Outlook Express were designed, technologies we now rue because malicious people have exploited them.

And so in the past few years the two major OS vendors, Microsoft and Apple, have largely taken on the role of tossing features into the OS that third parties had already provided or that the other had managed to come up with on its own. The Linux vendors then try to mimic that (there, I've offended all 3 camps!).

With MacOS X, Apple finally managed to put together a stable operating system with preemptive multitasking and memory protection. The first release was slow and buggy but subsequent versions got better and better. MacOS X Tiger looks to be a refinement on what has come before along with Apple's usual innovative twists on existing concepts (ex: Dashboard).  Apple's "innovation" with MacOS X has been very, very gradual --  a far cry from the heady days of "Pink", "Taligent", and "OpenDoc". This is particularly true when one considers that its ancestor, NeXTStep, was released in the late 80s.

Meanwhile, Microsoft has contented itself with lifting shareware programs and throwing them into the OS.  WinZip sure looks popular, let's put ZIP into the OS.  Hey, AOL is annoying us, let's tweak them by making our media player skinnable and tossing that in.  Hey, let's put in a basic movie editing program too.  Ooh, WindowBlinds sure looks nice, let's make our OS have its own skins too.  Meanwhile, in areas where there's little third-party innovation (or at least competition) things haven't progressed very much.  Outlook Express hasn't changed much since 1998. Internet Explorer is still roughly the same today as it was in 1998.  Now before any Windows zealots get on me, I'm not saying that users haven't benefited some from Microsoft tossing in home-grown (or contracted out) copies of third-party programs. I like being able to work with ZIP files "natively".  But bundling more content with the OS is not the same as innovation. To be fair, Microsoft's Longhorn project is very ambitious and Avalon promises to, at the very least, pave the way to really controlling how large things appear on our monitors without giving up resolution.

Yeah, yeah, talk is cheap. So what should the OS vendors be doing?  I can think of five things that operating system developers should look at making part of the OS.

#1 Seamless Distributed Computing

In an age of gigabit internal networks, it is striking that I can't simply tell my operating system that I have "access" to Machine A, B, and C on my network and to use their memory, their CPUs, and other resources to ensure that my local computer remains always responsive.  Instead, Windows XP users are quietly bumping up against the "User Handle" limit (which most people don't even know about; they just discover their programs starting to crash).  If I felt my machine was getting slow, I'd just add more machines on my LAN for it to use, or toss another machine under my desk, or even (gasp) throw some tasks at my home machine via the Internet (where the OS would be smart enough about which tasks it farmed out based on connection speed). This sort of thing should be built into the OS, today.
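
To make this concrete, here is a minimal sketch of the kind of farm-out policy I have in mind. It's a Python sketch of my own; the machine names, thresholds, and cost heuristic are all invented for illustration, and nothing like this exists in any shipping OS today.

    # Hypothetical sketch: the OS picks where to run a task among machines
    # I have told it I have "access" to.  All names and numbers are invented.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Machine:
        name: str
        link_mbps: float   # measured connection speed to this machine
        idle_cpu: float    # fraction of its CPU currently idle (0.0 - 1.0)

    # Machine A and B on the LAN, plus my home machine over the Internet.
    POOL = [
        Machine("machine-a.lan", link_mbps=1000.0, idle_cpu=0.8),
        Machine("machine-b.lan", link_mbps=1000.0, idle_cpu=0.3),
        Machine("home-pc.example.net", link_mbps=1.5, idle_cpu=0.9),
    ]

    def pick_host(task_bytes: int, local_load: float) -> Optional[Machine]:
        """Keep the task local while the local box is responsive; otherwise
        pick the machine where transfer plus compute looks cheapest."""
        if local_load < 0.7:
            return None  # local machine is fine, don't farm anything out
        def cost(m: Machine) -> float:
            transfer_s = task_bytes * 8 / (m.link_mbps * 1_000_000)
            return transfer_s + 1.0 / max(m.idle_cpu, 0.05)
        return min(POOL, key=cost)

    # A 50 MB job while my desktop is 90% busy: the idle LAN box wins and
    # the slow Internet link loses -- exactly the call the OS should make.
    host = pick_host(task_bytes=50_000_000, local_load=0.9)
    print("run locally" if host is None else "farm out to " + host.name)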

#2 Seamless Distributed File System Databases

To be fair to Microsoft, WinFS has the potential to become this, but it keeps being delayed. I should be able to create "cabinets" on my desktop (or anywhere else) in which I set a few parameters and, from then on, files show up there as if it were any other folder.  I should be able to set the scope of my cabinets -- local, network, worldwide.  The physical location of files should be irrelevant, and I should be able to organize things however I want without it affecting anything.  Instead, move something out of c:\program files\ and you're asking for trouble.  Why?  This should be part of the operating system, today.
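
A minimal sketch of what I mean by a cabinet, assuming a hypothetical query-folder API (the Cabinet class, scope values, and paths below are all made up): the cabinet is a saved query, not a physical directory, so where the files actually live stops mattering.

    # Hypothetical sketch: a "cabinet" is a saved query, not a directory.
    import fnmatch
    import os
    from dataclasses import dataclass, field

    @dataclass
    class Cabinet:
        name: str
        pattern: str                               # e.g. "*.xls"
        scope: str = "local"                       # "local" | "network" | "worldwide"
        roots: list = field(default_factory=list)  # places the OS indexes for us

        def files(self):
            """Yield every matching file, wherever it physically lives."""
            for root in self.roots:
                for dirpath, _dirs, names in os.walk(root):
                    for n in fnmatch.filter(names, self.pattern):
                        yield os.path.join(dirpath, n)

    # "Every spreadsheet I can reach" behaves like any other folder, and
    # moving the underlying files around would not break anything.
    spreadsheets = Cabinet("Spreadsheets", "*.xls", scope="network",
                           roots=[r"C:\Users\brad", r"\\machine-a\shared"])
    for path in spreadsheets.files():
        print(path)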

#3 Global User Management

When I activate Windows (or MacOS) I should be given a unique global account where my preferences and other key settings are stored, along with any data I choose to store there for an additional fee. There should be a Bwardell.Microsoft.net (or Bwardell.Mac.com).  I shouldn't be forced to use it, so that the privacy nuts are kept happy. But if I chose to use it, all my preferences, email, favorites, and other system-specific "stuff" could be kept there, synchronized, and accessible from any machine I'm on.  Moreover, it would also act as a redirector between machines. If Machine A has my spreadsheets and I'm on Machine B, then I would be able to get to those files through my Microsoft.net account redirecting my files from Machine A to Machine B. The system would need to be pluggable so that Microsoft could avoid any DOJ issues. So a Yahoo or Google or even Apple could provide alternative network coordinators (i.e. use Bwardell.Google.net instead of Microsoft.net). (DWL: I'm not talking about a domain controller here; we're talking about a global system that's seamlessly part of the OS.)
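
Here's a rough sketch of the pluggable part, with every class, method, and URL invented for illustration: the OS would only ever talk to an abstract provider interface, so Microsoft, Google, Yahoo, or Apple could each supply the back end.

    # Hypothetical sketch of a pluggable "global account" provider.
    from abc import ABC, abstractmethod

    class AccountProvider(ABC):
        @abstractmethod
        def fetch_settings(self, user_id: str) -> dict: ...

        @abstractmethod
        def store_settings(self, user_id: str, settings: dict) -> None: ...

        @abstractmethod
        def locate_file(self, user_id: str, filename: str) -> str:
            """Return a URL redirecting to whichever machine holds the file."""

    class ExampleNetProvider(AccountProvider):
        """Stand-in for a Microsoft.net / Google.net / Mac.com coordinator."""
        def __init__(self, domain: str = "example.net"):
            self.domain = domain
        def fetch_settings(self, user_id: str) -> dict:
            # In real life: an authenticated call to the coordinator.
            return {"wallpaper": "blue.jpg", "favorites": ["osnews.com"]}
        def store_settings(self, user_id: str, settings: dict) -> None:
            pass  # push the changed preferences back up to the account
        def locate_file(self, user_id: str, filename: str) -> str:
            return "https://" + user_id + "." + self.domain + "/redirect/" + filename

    # Machine B asks where my spreadsheet is; the account redirects to Machine A.
    provider = ExampleNetProvider()
    print(provider.locate_file("bwardell", "budget.xls"))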

#4 Universal Environments

Following #3, I should be able to have a universal user environment. If I go to a public machine and type in my UserID and password, it should be able to pull my environment from Bwardell.Microsoft.net so that my program settings, etc. are there.  If the local machine is missing a program, it would simply be grayed out. But if the local machine does have it, then I would be able to launch it from the normal place in the Start menu.  All my files would be accessible there through the redirection of Microsoft.net. What machine my files are physically located on should be immaterial as long as they are safe, secure, and backed up (all handled by the OS without me messing with it).
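
As a toy illustration of the "grayed out" behavior, here is a sketch assuming the program list below came down from the global account; the list and the menu structure are invented.

    # Hypothetical sketch: build my menu on a borrowed machine, graying out
    # anything my profile lists that isn't installed locally.
    import shutil

    MY_PROGRAMS = ["winword", "excel", "notepad", "galciv"]  # from my profile

    def build_menu(profile_programs):
        menu = []
        for prog in profile_programs:
            path = shutil.which(prog)          # is it installed on this machine?
            menu.append({"name": prog, "enabled": path is not None, "launch": path})
        return menu

    for entry in build_menu(MY_PROGRAMS):
        state = "OK    " if entry["enabled"] else "grayed"
        print("[" + state + "] " + entry["name"])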

#5 Component-Based Operating Systems

Part of the problem with Windows and MacOS is that the OS vendors make it hard for third parties to enhance the OS with their own code.  Rather than worrying about locking everyone out for security reasons, they should come up with a secure way for legitimate software developers to enhance or replace components of the OS.  For instance, in Windows I get 5 choices on how to view a folder (icon, tile, thumbnails, details, and list).  Third parties should be able to extend this (I can think of a dozen ways I might want to display data in a folder).  There should be APIs so that developers can replace major or minor components of the OS: the web browser engine, parts of the shell, the display rendering engine, the built-in search, and dozens of other things.  The OS should be broken down into parts; the OS vendor bundles its own versions of those parts, and third parties can try to innovate in those areas. That way, everyone wins, including the OS vendor (particularly closed-source vendors). Windows has "API hooking", but what we need is something more robust and secure, well documented, and accessible to developers.
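
To show the flavor of what a documented extension point could look like, here is a small sketch of a folder-view registry. It is entirely invented (this is not the Windows shell API); the point is only that the vendor's built-in views and a third party's view would register through the same documented interface.

    # Hypothetical sketch: folder views as pluggable components.
    VIEW_REGISTRY = {}

    def register_view(name):
        """Decorator the OS vendor would document so anyone can add a view."""
        def wrap(cls):
            VIEW_REGISTRY[name] = cls
            return cls
        return wrap

    @register_view("details")
    class DetailsView:                      # one of the vendor's built-ins
        def render(self, files):
            return "\n".join("%-20s %8d" % (f["name"], f["size"]) for f in files)

    @register_view("by-project")
    class ByProjectView:                    # shipped by a third party
        def render(self, files):
            groups = {}
            for f in files:
                groups.setdefault(f.get("project", "misc"), []).append(f["name"])
            return "\n".join(p + ": " + ", ".join(n) for p, n in groups.items())

    files = [{"name": "ship.png", "size": 40210, "project": "GalCiv"},
             {"name": "notes.txt", "size": 812}]
    print(VIEW_REGISTRY["by-project"]().render(files))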

The closed-source vendors should be looking at Linux. It has managed to keep up on a tiny fraction of the resources, and it does this through open source.  Anyone can update any part of the OS and, if it's a good thing, it can be included as part of a distribution. If the major closed-source OS vendors started providing documented ways for third parties to easily extend virtually any part of the OS without a lot of pain, they could harness their considerable third-party developer base.  OS/2 was working in this direction with SOM, where OS features could be "inherited" by app developers, extended, and then updated.  Object Desktop on OS/2 was coded entirely by a single individual because he could inherit the base features of the OS and build from there. Other developers could then build on that in turn. (DWL: I know COM can do some of this but it's not what I am talking about; if you've done any serious OS extension development, COM doesn't apply here much.)

...

Now you might be thinking, "Well, if you think these ideas are so great, why don't you do them?"  The answer is that only the OS vendor can, as a practical matter, do this.  If a third party builds these things and is successful, it's only a matter of time (probably one version) before the OS vendor puts in a version of its own, wiping out the "low hanging fruit" part of the market.  As soon as some third party, for instance, put out a really good distributed computing product that "did it right" and started doing good business targeting consumers (DWL: Armtech is not a consumer product and isn't what I'm talking about), you could be assured that the next version of the OS would have some basic implementation of this put in.  And the OS vendor's fans would chime in, "That should be part of the OS anyway!"  In short, there's no business case for a third party to invest the money to develop these things because the pay-off isn't there.

But if these features were part of the OS, you could imagine how they might lead to dramatic changes in the way we use and think about computers. And on top of that, imagine the kinds of additional innovations that would present themselves if these things were already taken as a given.

 

## About the Author ##
Brad Wardell (aka "Frogboy") is the Designer of Stardock's Object Desktop. Object Desktop is a suite of operating system extensions for Microsoft Windows. Stardock also makes programs such as Multiplicity, Galactic Civilizations, and much more. Its home page is www.stardock.com.

"DWL" stands for "Don't write letters".

paxx
Reply #21 Wednesday, March 2, 2005 9:21 AM
About what you call "Distributed File System", the only problem I find with that is that as long as the files are on your local machine, you can back them up, make CD copies, whatever, but if the files are stored on a third-party server somewhere, you don't know what will happen to that server 6 months or 2 years from now. I have learned that from experience... I used to have an online photo album... My stupid mistake is that I didn't even keep local copies of the photos, as I thought that since the pictures were on that server anyway, if I needed a local copy, I could always download them whenever. Problem is, the company went bankrupt, the server shut down, and now my pictures are gone...
Anyway, your idea is good, as long as the server is used as a secondary way to access your files. They would still need to be kept on your local machine, and backed up as they should be. Can't rely on a third party...
MasonM
Reply #22 Wednesday, March 2, 2005 2:04 PM
Very good article Brad. I agree with you that these things would make life a lot easier regardless of which OS ultimately decided to implement them. I personally wouldn't use such a system, as I wouldn't want my personal files/settings transmitted over such a wide and open network as you describe, but I can certainly see the advantages in such a system for work-related use.

While I am a Linux user and don't use Windows at all, I would never try to argue that one is inherently "better" than the other for all things. Each has its own strengths and weaknesses. There can be no argument that Windows is far more suited to the average home PC user due to its "plug and play" nature.

Linux developers are finally seeing the advantage of such a scheme and are rapidly approaching such a "plug and play" environment as well. Many recent distributions configure hardware automatically and do a good job of it, and software downloading and installation is done very automatically in some of the Debian-based distros. But in all honesty, while much more user-friendly these days, Linux is still not quite ready for the mainstream "mom and pop" end user. Getting close, but not quite.

I'd like to see the features you describe implemented not so much for the features themselves, but for the advances they would encourage in other areas of OS development, particularly better integration and sharing between the various operating systems out there.
sunwukong
Reply #23 Wednesday, March 2, 2005 2:42 PM
From a userland perspective the stuff Brad wants is very compelling and a lot of people are working towards them. But there are two rules that people seem to forget that will, for practical purposes, severely limit every wish except the last:

1. The network is unreliable (or, God hates data) -- the issue with distributed data (file systems, persistent objects, whatever) is that when the network is partitioned (via fault, outage, pulled cord, misconfiguration, etc.) then it is a hard problem deciding who has the authoritative version of what. It becomes undecidably hard with arbitrary network partitions. This points towards a central authority for arbitrating these decisions, which brings up the second rule ...

2. The network is not benign (or, the universe hates everyone) -- the problem of distributing authoritative data is unfortunately mapped to that of distributing trust. How much practical trust do you endow the central authority to hold/use your distribution keys? Do you necessarily trust the keys of others distributed by this authority? Is the model a web/chain of trust? Who's at the root? Are there supervisory powers over your keys/data that you don't have?

Both of these rules essentially limit the practicality of distributed computing for any single user to as far as that person can have a reliable, trustworthy network. Outside of external factors (network infrastructure issues and, say, political boundaries), most likely the size of this dream network will depend on the resources (monetary, political, etc.) one can personally spend to maintain it.
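
A toy sketch of rule 1, with per-replica version counters invented purely for illustration: the counters can prove that two copies diverged while the network was partitioned, but nothing in them says whose copy is authoritative -- that still takes an outside authority or a human.

    # Toy sketch: version vectors detect divergence, they don't resolve it.
    def merge(vv_a, vv_b):
        """Combine two version vectors (dict of replica -> edit counter)."""
        return {r: max(vv_a.get(r, 0), vv_b.get(r, 0)) for r in set(vv_a) | set(vv_b)}

    def dominates(vv_a, vv_b):
        """True if copy A has already seen everything copy B has."""
        return all(vv_a.get(r, 0) >= c for r, c in vv_b.items())

    # Machine A and Machine B both edited the same file while partitioned.
    copy_a = {"machine-a": 3, "machine-b": 1}
    copy_b = {"machine-a": 2, "machine-b": 2}

    if dominates(copy_a, copy_b) or dominates(copy_b, copy_a):
        print("one copy is strictly newer; safe to keep it")
    else:
        print("conflict: neither copy dominates", merge(copy_a, copy_b))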

Is this much different from the situation today? Not really. Consider the distribution and signed copies of Windows service packs or the Linux kernel. The wide availability of authoritative copies has been the direct result of the central authority spending both monetary (dominant on the Windows side) and political (dominant on the Linux side) resources to maintain the reliability of the network for distribution. The same type of considerations will be required for personal data -- who has a political requirement to trust Brad's identity on their network and will they spend the resources to reliably mirror his data? If there's a lack of political will, can Brad purchase the reliability? Does Brad have the resources to keep a reliable channel open between this network and the other networks mirroring his data?

Stay tuned.
MarkSnyder61
Reply #24 Wednesday, March 2, 2005 5:09 PM
#4 Universal environments

This option seems very similar to a "Roaming Profile". The only thing wrong with this is security. I think your files would get hacked a lot unless you can figure out a way to use IPSec or CAs (certificate authorities) to ensure privacy, but using these uses bandwidth too... Hmmm! Sounds good, but we are a long way from it!
Wiskyjon
Reply #25 Wednesday, March 2, 2005 10:20 PM
Review of “The five things the computer industry needs”.

I have always wondered who "Frogboy" is and now I do; nice to meet you. Firstly, I wanted to thank you so much for your invention, Object Desktop. I have been watching it for some time as it evolves. A few years ago I bought myself a new state-of-the-art system. As of this date, though, I find it is sliding down the scale of performance with the advent of newer machines. Nonetheless, it was in my plan to add this wonderful application to my desktop as time and resources permitted. Now I have it, and with all the upgrades coming in automatically all the time, I am so happy with the openness and the ease of operation for the most part. Those items that I am not too sure of are another discussion.

A long time ago I was a Software Engineer engaged by the FAA/DOT to maintain and keep aloft all the aircraft from the time they take off until they land at their destination and all points in between. During those days we did not have any of these modern conveniences on the mini-micros, as we called them in those days. These applications and network interfaces were all handled by centralized mainframe processors running various forms of proprietary applications and could be co-configured to run OS/HASP/MVT-MFT. Back in those days, we did not have all these specialized processes to accomplish our work; we just made the application or the node enhancement and attached it to the load object. There were none of these testing applications, Mercury and such, available back in those days. Sure, we had simulation runs to demonstrate how it worked under loads up to 500% of capacity. But the verification process was done with the compilation list, a core dump, and by manually pulling up key memory locations and monitoring their changes in binary! Sure, we had many tools to do this with, but again, these things were all done on the mainframes.

You developers of today have no idea how significant these current open advances in making XYZ application run anywhere on any type of machine really are. We developed a centralized system with a process called Resource Byback. What it did was keep all the memory and other system components in a sort of flux in case of a catastrophic event. When something did go wrong, the resources in use at that time and the redundant resources were forced through a filter-like process so that, in appearance, within one instruction cycle the system would be whole again locally. The same process was also used nationally; the drawback is it could take a couple of minutes to take place as all components of all centralized systems and all their redundancies went through this filter application by microwave distribution. The problem is that when this happened it was more than a flicker or two on the monitor; there were error messages on all the screens, including the air traffic controllers'. Now just try and imagine the panic if it were to happen to you while you are in control of 50 or so aircraft. To me this is the definition of a distributed centralized system: all the parts and the boxes complement the whole instantly with no action by the end user, where the best-use practices are the defaults that optimize the system from the gate with no tweaking required!

Let's see what would happen if we merged the past with the present to make the best possible result for our future. I will not speak on VPNs or LANs, just on the public domain. For the sake of clarity these premises will be included:

1. Ease of operation where the defaults will be the best practices, in other words optimized;
2. No GUI except for gaming until we can think our way to winning;
3. Centralized Cabinets with no restraints other than Names;
4. No purchases, all leased;
5. Security and access protocols.


I will be approaching this from the PC point of view and not from the mainframe; however, the thought process or application may be available only on the mainframe currently. If these premises and guidelines are acceptable, let's begin.

Number One

I'm sure each and every one of you has had the mishap of finding out that a special check box or radio button was not enabled or activated because the manufacturer set it that way by default. It is high time that when something is added in any form, be that an application, an API, or whatever you want to call it, it is made part and parcel with the best possible utilization and optimization the PC can adjust to, even if the user does not know about its processes or capabilities. This would be especially true when new hardware or other functionality is being added as well.

Number Two

The graphical user interface is inept in the best cases that I personally have seen. It is cumbersome and difficult to find ABC process or application in the way it is presented currently. Object Desktop has several skins that allow you to get rid of it to a certain point. The user even has the option to turn off the task bar for a presentation in another way. But that is not what NO GUI means. It means that once you have your system set up to your liking, there will be no interface displayed at all without your calling for it by voice command! As we all know, there are several of these applications out there, but they are resource-heavy and not 110% accurate in all cases. This would be required to relieve the end user of the mouse and these kinds of interfaces, except again during a learning phase when these run side by side, and while we are involved in gaming.

I'm of the opinion that several of you have watched some of those sci-fi movies where there was a display in which you would see a swirling effect of objects while the system was being directed by voice to perform a task or a series of tasks. In some of these movies you see that the person who is the end user had a head-mounted device (HUD) that maintained the connection with the system, which was responding purely to their thought processes. You may even remember that in some of these movies those who made this interface had evolved to a race of pure thought, making the interface one of direct input. There were no wires, keyboard, or display other than a view screen for the results.

It would be nice, I'm sure, to take the latter here, but as we are all aware, our technology is not at this point currently. That is not to say, with some standards being put into place, that we cannot have the former, that being the interface running in parallel during the learning phase. Once this has been completed, the display would be used as output or verification of input and be in stand-by mode (screen saver).

Number Three

My definition of Centralized Cabinets is a series of drawers that contain the makeup of the individual system. This would include the customization settings; as the system will always be available, there will be no need for the boot process. These settings would be called from the centralized unit and applied during login. Applied to my system, it would be something similar to:

The system initialization files would be read and the boot screen would come up, followed by the login screen that goes with the theme I am using at that time with Object Desktop. Once that has been entered, my settings would be applied, launching all the APIs that go with these and other applications: Windows XP Pro, MS Office Pro, Ghost, Spyware Blaster, EZ Armor, Pest Patrol, Object Desktop v4.5, Registry Mechanic v4.0, Diskkeeper Pro, BF1942 full version with the entire series of modifications for Desert Combat up to v.8, and UT2004 full version with all of its modifications included in CBP2 made for XP.

Let's look at this from the public domain now, that being the library or one of those smart terminals for a fee. When I log in, the CDC will find my cabinet and it will see from the start that, due to limitations, it will not be able to apply all of these applications to the public processor, and it will ask to connect to my personal system. Once it does that it will see that, due to further limitations, a list of applications will not be available because of remaining constraints, and it will give me a list and ask if I want to proceed without them. If the system determines that it can utilize any one of these but with limitations, these would be in a message as well. Given this, I could expect to see these messages first for the following applications with limitations: Windows XP Pro, MS Office Pro, Ghost, Spyware Blaster, EZ Armor, Pest Patrol, Object Desktop v4.5, Registry Mechanic v4.0, Diskkeeper Pro, leaving access via my system for the games.

You sit down and enter your login information and hit enter. It will look inside the Centralized Distribution Center (CDC, I like it) and find your cabinet and begin looking through the drawers. Once this is completed it will start adding the resources that are registered to you. While it is applying these to the system, if there are any that are not compatible it will display these and ask you if you would like to connect to your personal system to retrieve or apply this application or API. The biggest problem with this scenario is that we do not have the wired status that we need to make this a truly wonderful thing, like Singapore, which has 60% saturation.

Let's look at it from another point of view: gaming!

I should not have any difficulty attaching to BF1942 full version with the entire series of modifications for Desert Combat up to v.8 and UT2004 full version with all of its modifications included in CBP2 made for XP, as these show on my desktop. The really kewl part of these aspects is that if I had several friends and we wanted to connect to my system and play a spawned game, we should not have any difficulty. The bottom line is it is truly mobile!

Number Four

As we can see, what we are really talking about is not much more than E-Machines unless a special hardware apparatus is required or desired. Because of this we will not need to buy applications in the future; there will be a link to each application's drawer. The limitations placed on that drawer will depend on how you do this. As things change, the link will change from the original drawer to the object with the upgrade. With the system having the sense to know what your best performance settings should be, the need to tweak these would be slim, making the whole process invisible to us, the end users of the world. On the flip side, if you have an application installed on your system that does not have an identifier code assigned to you, you do not get it, period. This will stop you bad guys that like to destroy or diminish our internet, and it will get really boring having to install an application each time you want to use it!

Number Five

In the above captions I speak of these authorization codes that are attached in our settings and counter-attached at the CDC in the associated drawer. There is only one problem with this idea, and that is privacy! However, the upside would be that everyone would have an entity with a series of available user accounts for the applications that are being put into play. These processes could have limitations on the number of uses or none at all. For example, I'm sure that a lot of you have been reading on the blogs about email, viruses and stuff like that, right? Let's say that you have an email account and your ISP is Roadrunner for the sake of this argument. You have a standard account, during which time you learn that you have limits on email. These limits may or may not be on size but on repetition, and toward a count limit of 300, that being 10 each day (this could be optional by your ISP). With this ISP you get two really great cabinet drawers, the first being EZ Armor and the BCD application. I'm sure that it would not be too hard to have the spam blocker give you a notice of duplicate material being sent out and report NENt limitations toward the 300. Once this occurs it would block the spam message and report it to whomever is decided upon to control the Internet. Deduct these from your 300 and move on to the next process requested by the end user. Once the NENt has been exhausted, a message will appear stating this and advising the end user that the email can be sent at a dime each, purchased in $5.00 lots via credit card. Here is something else that would be possible: if the application is not signed it cannot be added; this will stop hackers from getting their much-needed programs. Once the controlling entity is decided upon, we will be able to take back our Internet and enjoy it in a way that we have never experienced before.

Closing

I am not writing this to get anyone upset or to get into any kind of argument. I am writing this in response to Frogboy’s article period. I would enjoy replies or questions and it was my pleasure to share this with the Internet Community. Sincerely yours, Wiskyjon


AzColt62
Reply #26 Saturday, March 5, 2005 3:27 PM
I like all of the comments concerning the base O/S and third-party add-ons that soon become part of the O/S. The key is to make the UI and the underlying O/S more user friendly, not more confusing.

MSFT et al should stick to refining the basic O/S and allow third-party developers the chance to improve the system or make it "friendly". Innovation seems to be lacking, even in the Linux community. Same ideas refined, not redefined or expanded upon.

Most end users have enough problems just finding what they need in the programs and O/S they use without having to worry about how refined the system is or will be.

A base O/S with add-on modules has been sounded out before; it is still a viable option and a need for the average user. Add to that the ideas Brad has suggested, which would give many average people greater ability and understanding of the computers they use at work and at home.

Most users don't have the luxury or knowledge of having used PC's from MSDOS 1.2, OS/2 2.1, Windoze 3.1 days, or even the MAC.

Integration and Simplicity.
delaron
Reply #27 Sunday, March 6, 2005 8:46 AM
I don't think I can go into much detail as to the specifics of the proposal that frogboy put up. But I can foresee some of the ramifications of such a system. While nice to have, and practical, you need a 6th component called Security to back up the use of the other 5. Imagine a virus that gets hold of your system; then it has access to all of the systems that you have access to, etc. But unfortunately, security, once created, is made to be broken. No matter what system you have, there will be someone breaking it. It's just a matter of how easy it is, which determines the number of people doing it.

Now I think there are a few arguments dancing around the Linux thing that people are transferring back and forth...

Point 1: Most vendors that put Linux on their systems pre-configure it just like Windows. The average user doesn't have to install Linux, just like the average Windows user. So, for the 90 percent of people out there using the Internet, writing email, or writing up a document to post about their lost kitties, a Windows or a Linux installation will do just fine. For the 10 percent who add hardware, 70-80 percent of it should be supported by Linux, though there might be some headaches; but Joe User will read the specs on the box right before buying? For the 10 percent who install a lot of extra software and/or play games, Linux is probably not the greatest option right out of the box unless they like to do some research, or buy software that says 'Linux ready' on the side of the box.

Point 2: Windows has a glut of software written for it, so the driver issue and general commercial software situation is in the Windows camp's favor. But Linux also has support, and all you have to do is look at the technical specs for 'Linux support', I'm sure; if not, call the manufacturer. All of these things the average user can do, and must do even for Windows. Normal people look at software specs when buying, which say 'must have PC, 128M RAM, 200 HD, etc.', and normal people probably don't understand half of 90 percent of the stuff they read regarding Windows specs either.

Point 3: If you want a program, you have to install it. Each commercial entity creates software with a GUI nowadays for Windows, simple enough. But there is shareware/freeware/whateverware out there for Windows, just like for Linux. There are commercial products that are for Linux and Windows out of the box. Read the box. There is also freeware/shareware/whateverware for Linux, which requires varying degrees of technical aptitude to install. That is not denied.

Point 4: Some people state that 'sure, it's easy to say whatever OS has this functionality, but you have to eat your own children just to install it.' Now, I think the point being made with the argument to this one is that if you want it, it's there, if you KNOW about it. You're probably astute enough to be able to search for it, read the directions, and install the software. If not, don't install it. But the fact is, it's there if you want it. Not that it's hard to do, and won't work on half the OSes out there. On the flip side, the argument is put forth that, sure, you can have one feature here, one feature there, but it's not all in one place. That may be true; I haven't done the research. But when talking Windows vs Linux, it surely ain't in Windows for MOST of the features you want anyway. So, I think people should be saying: let's come to a common ground. Windows has mindshare right now, for the common user who doesn't ask too many questions because he doesn't really know what his system is capable of. Linux has technical strengths so that, for the right person, it can do the things he/she/it knows they need.

What I am trying to say is that there are a few different arguments from the different camps always put forth. One argument states, "I hate Windows, but it works, and the average user can use it without having to go through a lot of hassles." While technically true, I know that in practice the average user can have just as many problems with Windows when things go wrong as with Linux.

Another argument says that "mom and pop can't use Linux because it's hard to install." Come on, no one installs Windows unless you're someone who does it on your own anyway and builds his/her/its own systems. Someone who can do that can install Linux, or Windows. The average user calls someone up and says "gimme computer zanga," and everything's shipped to them. Nothing says that Linux can't be on it as well without the average user getting involved (then go to point 1 above).

Another argument states that you can't install Linux programs because they are all half-done college projects that even Yoda would have a hard time installing. I have seen installations of Windows programs that have some holy havoc to play if you don't have just the right patches installed, or configuration detected. True, there are a lot more side projects for Linux than there are for Windows, but it's the nature of the beast. People can advance the same argument for the viruses that are going around... "well, the only reason Microsoft is getting viruses is cuz it's 90 percent of the market; no one wants to write viruses for Linux." Nature of the beast. But the average user won't be going online to search out his software for doing what he wants to do anyway; he will go into EB, or Best Buy, and say to the associate, "I need software to calculate the airspeed velocity of an African swallow," and the 7-buck-an-hour associate will go, "I know all about that, here's what I use."

So, one side is arguing 'but it's possible' while the other side is arguing 'but it's hard,' when the real point is that if you do the same things as the 'easy' side, it's all the same stuff. I.e., the 'AVERAGE' user is going to the store to buy his computer, software, hardware, etc., will ask people what it works with, and get a reply. So, if it works with Linux, great; if not, boo, or whatever... The other 5 percent of the equation comes in when you go to the Internet for various other software thingies for Linux, and that's what the 'it is hard' camp always likes to beat into the ground with a dead stick. Everyone get over their own side and see the software for what it is. Windows is what it is, and Linux is what it is; each has its strengths and weaknesses, and like someone posted, 'tools for the job'. Granted, going into a store and finding something 'Linux ready' is going to be a bit harder than finding Windows software, but it is there in some degree or another.

Disclaimer: I don't use Linux, but I play one on TV... errr, I have delusions of someday using it fully, but I'm a lazy gamer.
Enough rant...
DarylK
Reply #28 Sunday, March 6, 2005 1:55 PM
Having read this train from the beginning for the first time, one thing I found myself doing....
listing the work-arounds I am currently using to accomplish just a portion of the 5 things that Frogboy lists.
and I work with Mac (media), Win2k (secure work environment), XP-Pro (home) and Win2003(ce) (PocketPC).
I find my current life is defined by email, wifi, and flashdrives... work-arounds.
bad juju
Reply #29 Sunday, March 6, 2005 6:19 PM
Frogboy writes: I don't mean to offend Linux users, but I think some of them are more interested in winning an advocacy debate than actually providing a solution.

The last thing I wanted to do here was turn this into an "advocacy debate" of any sort. A number of OS Features were proposed as being things that "operating system developers should look at having part of the OS."

I merely pointed out that many of the features do exist, and in many cases are seamlessly integrated already. The OS that these features integrate with is largely coincidental. As iefan pointed out some of these features appear to exist in MacOS. The reason that GNU/Linux gets mentioned at all is because it seems to have the greatest number of the features you desire. The fact that it is GPL'd makes it possible for anybody with the time and energy to begin packaging these features in a single OS distribution. Look - if you really want to just sit around moaning about how proprietary OS vendors should package this functionality that's your call. Feel free to continue doing so. If somebody else would like to actually start making this wish-list a reality I welcome their enthusiasm.

Perhaps it's a cultural thing. As a long-time GNU/Linux user my initial thought on reading this article was "What good ideas. Maybe we can do something about it." Indeed, I wrote in support of Frogboy's article! I don't really care what OS is used to develop these ideas, it just seems to me that there is one that seems to stand out as a candidate. Who knows, maybe Solaris will soon be there as well.
dfcoffin
Reply #30 Thursday, March 10, 2005 1:11 AM
>> IBM's OS/2 was well on its way to providing an OS in which users from around the world could seamlessly integrate new functionality into the operating system via SOM and OpenDoc. Of course, had that occurred, it would have been the mother of all opportunities for spyware vendors and the creeps who make viruses

Do you really think that?? I always felt that the biggest vulnerability in Windows is the Registry. OS/2 did not have one, and I thought that might make it harder for trojans, viruses and spyware to hide out. In OS/2 you could move applications around without breaking them.

btw, glad to see Stardock still around. I had to migrate, kicking and screaming, away from Warp years ago to NT. Stardock was one of those killer apps that really showed off OS/2's power and sold me on it. There was a fax application too that was light-years ahead of anything out today.

Chris TH
Reply #31 Thursday, March 10, 2005 2:39 AM
I always felt that the biggest vulnerabilities in Windows is the Registry. OS/2 did not have one


It had .INI files (maybe they were named something else) that amounted to a registry, however. It wasn't until I realised the problems that lurked in there (and got an INI cleaner util) that I got OS/2 to run reasonably well.


Posted via WinCustomize Browser/Stardock Central
kona0197
Reply #32 Thursday, March 10, 2005 2:43 AM
I bet that on these newer systems an old copy of OS/2 would FLY, much like Windows 95!
rbutler
Reply #33 Thursday, March 10, 2005 6:28 AM
Ahh, some of these features will appear sooner or later in a project I'm working on, called Object (no relation to your product, though I did get part of the name from you guys... thanks for that) but anywho, I'm working on my desktop, a la 3.11-style (open-source, of course) P-Mode interface, for the FreeDOS OS... and I just wanted to say, the client/server editions (standalone OSes) shall be able to surf their local network seamlessly from their local network-aware "file manager" equivalent application. It will feature seamless user integration between client/server on a local node. I'm going to go as far as to say the server will be able to optimize your disk for you, and even log you off after a certain period of time... The purpose of the project, though, is to prove to the world, and to myself, that something on the scale of a Microsoft product is actually doable with time, effort, and hard work. By the way, the entire project is supposed to be built in C (DJGPP/DOS, with a little HLA (High-Level Assembly) elbow grease)... Anybody who would be interested in taking a look at the site, drop me a line... (I do hope I'm not breaking any laws or anything by calling my project what I do... if it's a problem, I can always change it...) - Robert Butler

PS. We plan to support our own library format, and are calling it DLM (dynamic library module).
Robert

One last note: the modular operating system idea seems interesting (keep in mind, I'm just waking up here before going to bed after an all-nighter researching info for my project; Thursday, March 10th @ 6:39am, and I haven't slept yet...), BUT they would have to be careful not to let it be exploited, as we've all seen with ActiveX and VBScript-type annoyances.
Robert
chekmarx
Reply #34 Friday, March 18, 2005 4:38 AM
> I bet you on these newer systems an old copy of OS/2 would FLY much like windows 95 !

I have an AMD Athlon XP based system with 512 megs of DDR333 RAM and OS/2 indeed absolutely ROCKS on it. I'm also running eComStation (the latest incarnation of OS/2 from Serenity Systems) and it too runs great.

That's one of the reasons it's so difficult to drop it and change over to Windows XP. Aside from losing the WPS, performance would drop considerably. Of course, there are major pluses as well, but not enough yet to lure me over to the dark side.

chekmarx
APB
Reply #35 Monday, March 28, 2005 6:47 AM
As an end user, I would like the import/export options of files to disappear and everything to be drag and drop.
