#51
![]() "Andrew Gideon" wrote in message online.com... C J Campbell wrote: You could not even go back to the days when only research facilities and the military had Internet access. Considering the economic damages caused by each major worm run, that might not be a Bad Thing. We'd lose a *lot*. But it might be better, in the long run. Better for whom? I submit that your protests are essentially the same as those of the priests when Gutenberg started printing Bibles. Your high priesthood is threatened now that the unanointed masses have access to computers and networks. The reaction was violent when personal computers were first introduced. The IT priesthood constantly warned of the dangers of personal computing. Most big corporations and government agencies adopted policies prohibiting employees from using anything but the company mainframe. Never mind that the IT priests could not deliver what people wanted: their own spreadsheets and word processors. Employees had to meet off site in secret to get real work done on their personal computers. After thirty plus years, nothing has changed. The ancient priesthood still tries to hold onto its power, railing against the dangers of Microsoft and Windows and, yes, personal computing. The days of the priests are numbered. I think that is a Good Thing. Still, there's a third alternative: safe computing. Again, safe for whom? Apparently the high priests are concerned only for their own safety, ie, jobs. People don't mind using mechanics or A&Ps because we're told it's necessary. As a matter of fact, many people do mind. |
#52
"Teacherjh" wrote in message
... People don't mind using mechanics or A&Ps because we're told it's necessary. The difference is, anybody with a nearby library can learn how to fix a car or an airplane. This is not true of Windows, or any closed source software. The innards of windows are a secret. You cannot be sure of what the operating system (or any commercial program) is =really= doing. There is no such thing as "closed source software", not as you describe it. The operating system is only mysterious to people who don't take the time to look at the software. 100% of the operating system is delivered to every person who installs it, and it is right there for anyone to look at. Deciphering what the operating system does is orders of magnitude harder than reverse engineering an airplane engine (for example), but it is not impossible by any means. How do you think independent third parties discover vulnerabilities in any operating system? You can be every bit as sure of what the operating system is really doing as you like. You need only invest the time and effort to do so. Open source is somewhat easier to review, but the truth is, if you aren't looking at the compiled machine code that is actually running on your computer, you don't really know what the software is doing. The high-level C/C++ (or whatever) code is very useful for giving you a good idea, but only the actual compiled result can be considered to be a 100% certain description of what the computer is doing. Pete |
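Duniho's closing point, that only the compiled machine code is a 100% certain description of what the computer does, also means that code is there to be read, given enough patience. A minimal sketch of the idea (nothing from the thread; it assumes the third-party Capstone disassembly library and uses a small, made-up byte sequence) shows how a handful of x86-64 machine-code bytes can be turned back into readable instructions. Scaling this up to an entire operating system is the "orders of magnitude harder" part.

    from capstone import Cs, CS_ARCH_X86, CS_MODE_64

    # A few bytes of x86-64 machine code (an arbitrary, made-up example):
    # push rbp; mov rbp, rsp; mov eax, 0; pop rbp; ret
    CODE = b"\x55\x48\x89\xe5\xb8\x00\x00\x00\x00\x5d\xc3"

    md = Cs(CS_ARCH_X86, CS_MODE_64)
    for insn in md.disasm(CODE, 0x1000):
        # Print each decoded instruction: address, mnemonic, operands.
        print(f"0x{insn.address:x}:  {insn.mnemonic}  {insn.op_str}")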
#53
On Fri, 25 Jun 2004 22:09:50 -0700, C J Campbell wrote:
>> Still, there's a third alternative: safe computing.
>
> Again, safe for whom? Apparently the high priests are concerned only for
> their own safety, ie, jobs.

Actually, it's the "high priests" who are trying to hold the line against
safe computing, because it's a very high, steep, and slippery slope. It's
the general public and CTO-Q-public who seem to be pushing the line for
that.

Greg
#54
On Fri, 25 Jun 2004 23:00:14 -0700, Peter Duniho wrote:
"Teacherjh" wrote in message ... People don't mind using mechanics or A&Ps because we're told it's necessary. The difference is, anybody with a nearby library can learn how to fix a car or an airplane. This is not true of Windows, or any closed source software. The innards of windows are a secret. You cannot be sure of what the operating system (or any commercial program) is =really= doing. There is no such thing as "closed source software", not as you describe it. I disagree. The operating system is only mysterious to people who don't take the time to look at the software. 100% of the operating system is delivered to every person who installs it, and it is right there for anyone to look at. Deciphering what the operating system does is orders of magnitude harder than reverse engineering an airplane engine (for example), but it is not impossible by any means. How do you think independent third parties discover vulnerabilities in any operating system? That's a different issue. You can be every bit as sure of what the operating system is really doing as you like. You need only invest the time and effort to do so. The difference is one of practicality and man hours. With even the source code, it would take man-years to review NT's code, for example. It would probably take several orders of magnitude more man-years if it all had to be done as the machine language level. Ack! Open source is somewhat easier to review, Somewhat? Try, practicle versus nonpractical. Now, consider the number of C and C++ programmers. Now, consider the number of even semi-decent assembler and/or machine language programmers around these days. Starting to see a resource problem? but the truth is, if you aren't looking at the compiled machine code that is actually running on your computer, you don't really know what the software is doing. This is, of course, a valid point. Just the same, a *mostly* moot point given the openess of today's open source compilers. GNU's compilers, for example, are self hosting. That means it compiles a compiler, which it then uses to compile a new compiler. Then, the new compiler is used to compile another compiler to compare with the previous compiler, to ensure that everything is built properly. So, about the only way you're going to get a surprise there is **if**, your current compiler is trojaned AND it's smart enough to pass the trojan on to the new compiler. That means it would have to know when it's compiling a new compiler. Aside from that, compiler bugs are really the only issue to worry about. Which means, for the vast, vast, vast majority of the cases, source code is king! The high-level C/C++ (or whatever) code is very useful for giving you a good idea, but only the actual compiled result can be considered to be a 100% certain description of what the computer is doing. Basically, short of a hidden compiler trojan, the source is still the preferred reference, by anyone's measure. The compiled code is only checked to ensure proper optimizations or to validate unexpected behavior which may seem to not coincide with the source (compiler bug, for example). Long story short, there is a world of difference between open source and closed source software. The biggest being, it's completley unpractical to even aspire to audit a closed source OS or application of any significant scope without the source. Chances are, it would be much easier to write your own software solution rather than attempt to audit a closed source solution at the machine language level. Greg |
#55
On Fri, 25 Jun 2004 14:54:19 -0500, Bill Denton wrote:
> I have seen quite a few Linux admins who had to use cheat sheets for even
> the most rudimentary Win server tasks. And during the dot.com bust I saw
> more than one Linux admin deliberately allow their Windows boxes to
> deteriorate so they could make the Linux boxes look better, and thus
> preserve their jobs.

During the .com boom, there were plenty of completely unqualified people
working in IT. People often needed warm bodies to support their 1, 2, 4
business plans. The scales lean heavily toward Win admins, IMO, but I have
no doubt that there have been and are unqualified Linux/Unix admins out
there too. In fact, I've met unqualified HP-UX admins too. A rotten egg
doesn't spoil the whole batch unless you attempt to cook with it.

> I have seen applications crash NT workstation and server four or five
> times, and I've crashed Win2K Professional twice; once with Flight
> Simulator. A couple of years ago I worked for a software company, and a
> test suite run of one of our applications brought down two Linux boxes
> and one Unix box. Everything can crash.

Granted, but if an application can crash the OS, that's an OS bug. There
are many ways to "bring down" a system, but not all of these are bugs. In
many cases, it's improper configuration or the wrong hardware for a task.

> And I saw a Win2K server mysteriously begin going BSOD, for no observable
> reason. It looked just like a software crash. I worked in the IBM
> building in downtown Chicago, where you would expect the power to be
> good, but it turned out we were getting power sags which were crashing
> the machine.

This is exceedingly common. The more machines you get on a circuit, the
dirtier the power is going to become. People often rush to purchase surge
protectors but fail to realize that brownouts are actually far, far worse
for their computer. Not to mention, very common. They often go completely
unnoticed but cause bit-flip errors or even physical gate damage to some
components. I could go on, but I think you get the point.

> It was plugged into the same outlet with a workstation which never had a
> problem.

Different CPUs, chipsets, power supplies, and yes, even computing trends,
can all make a significant difference. Not all computers are created
equal.

> I did some testing, and discovered that the sags were long enough to drop
> the server, but not long enough to affect the workstation. I put in a
> UPS; no problem. But as I said, it looked just like a software problem.
> How many other hardware problems get blamed on the OS?

I'm sure many.

> It's not a matter of which will do the job, it's a matter of which will
> do the job best. And there are things that Win will do better than Linux
> and vice-versa. And better is not just a matter of benchmarking: in some
> small towns you might find 10 Win administrators and zero Linux
> administrators. In that case, Linux is totally worthless. It's all a
> matter of matching the OS to the need.

Well, my statement assumed everything being equal. But you are right. I'm
a strong believer in the best tool for the job. If your room of a thousand
monkeys only knows typewriters, you don't give them word processors and
hope for the best. You have a valid point here.

> And I hate to tell you this, there were LANs long before the Internet
> became "prime time". I did my first Windows For Workgroups (NetBEUI)
> network

Done that too. Ohh... you remember ARCNET? Hehe. Those were the days.

> in 1992, the Internet did not begin to achieve any sort of mass
> penetration until 1996 or so.
>
> While the majority of the servers may run Unix/Linux, most of the
> outbound data quickly goes through a router onto a Windows network,

Well, before the big days of the net, most were military, DoD, and
university users, so it was still mostly Unix. Granted, as it grew,
Windows networks were certainly plugged in.

> I agree with you about the increased vulnerability of Windows, but a case
> could also be made that the associated protocols, which were designed by
> Unix guys, were poorly engineered. Had they been better designed the spam
> problems would not exist. I don't make that argument, but if you talk
> about Windows vulnerabilities, you also have to consider lackadaisical
> engineering.

Hehe. I don't think that's fair. IPv4 works rather well for what it was
designed to do. It's just that IPv4 has grown well beyond its original
design. This is why IPv6 exists. This is why everyone wants the Internet
to transition to IPv6. It addresses most every complaint about IPv4. So I
don't think it's fair to blame the designers if the market refuses to
adopt the correct technology.

Having said that, spam is actually an issue with the SMTP protocol and not
the IP protocol. Granted, better facilities in IP may have helped. But
let's face it, the current SMTP protocol (and associated RFCs) would be
hard pressed to become more spam friendly. If you insist on pointing a
finger, feel free to point a finger at the right group.

Cheers!

Greg
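Greg's point that spam is an SMTP-level problem rather than an IP-level one is easy to see in the protocol itself: classic SMTP accepts whatever envelope sender the client claims. A minimal sketch using Python's standard smtplib (the host and addresses below are placeholders, and it assumes a test server you are permitted to send through):

    import smtplib
    from email.message import EmailMessage

    # Placeholders -- point this at a test server you control. The base SMTP
    # protocol performs no verification of the claimed sender address.
    SMTP_HOST = "mail.example.com"
    CLAIMED_SENDER = "anyone@example.org"   # nothing checks this claim
    RECIPIENT = "someone@example.net"

    msg = EmailMessage()
    msg["From"] = CLAIMED_SENDER
    msg["To"] = RECIPIENT
    msg["Subject"] = "Hello"
    msg.set_content("The sender address above is whatever the client says it is.")

    with smtplib.SMTP(SMTP_HOST) as server:
        # send_message() takes the envelope addresses from the headers above;
        # the base protocol has no built-in way to verify them.
        server.send_message(msg)

Later additions such as SMTP AUTH and sender-verification schemes were bolted on precisely to patch this, which is Greg's point about blaming the right layer.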
#56
In article , C J Campbell wrote:
> Fine, if you have a huge corporation that can afford a bunch of well-paid
> admins. Your argument is beginning to sound an awful lot like you don't
> think most people should have computers and that you think that the
> general public is a menace.

No, I think Windows as it currently stands is unsuitable for the general
public. Windows as it stands is fine in an environment where a corporate
admin can look after the network. It's not the users' fault, it's the
fault of Microsoft, because the configuration is insecure by default.
Windows as it stands should have at least the software firewall on *by
default* and almost all services (most of which home users will never use)
*off* by default. Finally, MS has decided to listen and will have the
firewall on by default in Service Pack 2. Security researchers have been
saying this for *years*, and only now is it being done.

In this instance, Windows 98 is better than Windows XP. The real problems
didn't start happening until XP came out. Windows XP was a retrograde step
for home users on the internet - it just allowed them to be 0wn3d because
of all the additional potentially exploitable (and as it happens, actually
exploitable) services that were running.

>> It's not a problem with the users. It's entirely a problem with Windows.
>> The users are essentially deceived - it's a nice, easy to set up system,
>> but they've been tricked into having a system that claims to be easy to
>> use and maintain, but really requires an expert system administrator to
>> make secure.
>
> That isn't the fault of Windows.

But it IS the fault of Windows. Having a number of insecure services
turned on by default which the vast majority of home users will *never*
use on a network is purely the fault of Microsoft. The PC manufacturers
also have some responsibility to bear - they could have at least thought
about it and set up a reasonably secure disk image when they duplicated
the hard disk loads for their PCs. In any case, the Macintosh has been
easy to set up since the 1980s (including setting up a Mac LAN), so
ease-of-use is hardly a Microsoft innovation. It's just a pity that the
hardware platform wasn't open.

> might start asking yourself what would happen if you really got your way.
> Maybe you are a bigger threat than the public you despise.

Gosh, you're reading an awful lot into my post that I didn't write. I
don't think I've seen a non-sequitur like that since Lord Tebbit managed
to turn a radio interview on obesity into how the Government was
encouraging buggery!

--
Dylan Smith, Castletown, Isle of Man
Flying: http://www.dylansmith.net
Frontier Elite Universe: http://www.alioth.net
"Maintain thine airspeed, lest the ground come up and smite thee"
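Dylan's complaint is about default configuration: network services left listening out of the box. A minimal sketch of how one might check whether a few well-known service ports are reachable on the local machine (the port list is illustrative only - 135 is RPC/DCOM, 139/445 are SMB, 3389 is Remote Desktop - and this should only be run against machines you own):

    import socket

    # Illustrative ports: 135 (RPC/DCOM), 139/445 (SMB), 3389 (Remote Desktop).
    PORTS_TO_CHECK = [135, 139, 445, 3389]

    def is_listening(host, port, timeout=0.5):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port in PORTS_TO_CHECK:
        state = "listening" if is_listening("127.0.0.1", port) else "closed"
        print("port %d: %s" % (port, state))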
#57
In article , C J Campbell wrote:
>> Still, there's a third alternative: safe computing.
>
> Again, safe for whom? Apparently the high priests are concerned only for
> their own safety, ie, jobs.

Safe for the general Internet-using public. Is it too much to ask that
operating systems designed for personal use on personal computers aren't
set up by default to be running a huge bunch of exploitable server
processes?

--
Dylan Smith, Castletown, Isle of Man
Flying: http://www.dylansmith.net
Frontier Elite Universe: http://www.alioth.net
"Maintain thine airspeed, lest the ground come up and smite thee"
#60
I'm not a big fan of SCO anymore, and I haven't had to work with SCO UNIX
for about 10 years, but I used to be the SA for the SCO box that ran the
plant that produces all the chemicals used by a major chip maker. I had
that system so tweaked that when I quit they didn't bother to replace me,
and the system kept running fine until it was replaced about five years
later. They only called me in "on contract" once in that whole period, and
that was for a hardware problem.

Dave Reinhart

Greg Copeland wrote:
> On Thu, 24 Jun 2004 23:16:44 +0000, leslie wrote:
>> Kevin Darling wrote:
>>> However, Microsoft themselves point out that Windows is not a hard
>>> realtime OS, and should not be used in more demanding applications
>>> such as fly-by-wire.
>>
>> http://www.gcn.com/archives/gcn/1998/july13/cov2.htm
>> Software glitches leave Navy Smart Ship dead in the water | GCN,
>> July 13, 1998
>
> That's actually an application bug and not an OS bug. One interesting
> note is that they also highlight that Unix would be a much more reliable
> option, which would be true, excluding SCO, and including Linux.
>
> It's also worth noting that, traditionally, all MS OSes have somehow
> managed to sidestep the DoD qualification phases. Some cash and palms are
> usually suspected to be the reasons. In fact, while I don't have a link
> off hand, there is a fairly well known quote, by a DoD (IIRC) guy, which
> can be paraphrased as, "If Windows had been forced to go through the same
> channels as every other OS, it would have never qualified."
>
> Cheers!