Trends and the Future of
Computing and the Internet

Why '.Net' has become irrelevant



Foreword

This article has undergone a few transformations over time, but has been pretty accurate so far in predicting the directions things have gone. As such I have left it as intact as possible, with notes to indicate changes rather than complete re-writes. As of 6/7/2010 I have added some time-relevant information and corrected anything that turned out not to be true. Where applicable, any past inaccuracies have been noted as well.

The Past

In the past, before the internet 'craze', on-line services were used primarily for communication, either with peer groups (chat rooms, bulletin boards), or with individuals (electronic mail, private chat rooms). The internet itself was a somewhat anarchistic place mostly populated by college students and people who had extra cash to devote to a new technology. Web sites were places where people could stand on a soapbox, or be as weird or anti-social as they wanted to, and nobody really minded.

Once 'the public' got onto the internet, the 'marketing' soon followed, rightfully filling a vacuum with mail-order products that were not previously available and could now be easily sought out with a search engine. Commercial advertising came with it, and soon every popular site was filled with banner ads proclaiming various marketplaces and products with eye-catching animation, flashing, and whatever else could be thought up. Later, of course, this marketing became far more onerous, with pop-up ads that wouldn't close and cookies that tracked your behavior, and pretty soon the internet became nothing LESS than an attempt by a small number of marketers to extract revenue from the masses.

The internet was originally supposed to be a place for information exchange and personal communication on a worldwide network. Originally developed for the military, it graduated from the college campuses and entered the home. The commercial aspect did a lot of GOOD, in spite of everyone's hatred of "too many advertisements", and I know that I have benefited greatly from being able to order things on-line that I wouldn't have been able to get (or would have paid too much for) otherwise. But the saturation of the commercial side of the internet has reached its limit, as evidenced by the well-known "dot bomb" fiasco of yesteryear.

The Present

Some time ago Microsoft introduced their ".Net" run-time and development tools, originally aimed at easily providing various internet services at both the client and the server level. It came with plenty of hype, and a previously MOCKED concept of 'light client, heavy server via the internet'. According to an article at 'Linux World':

It's ironic that this is precisely the model that Sun, Oracle and IBM described in their "Network Computer" initiative, which was openly mocked by Bill Gates.
(formerly available at http://www.linuxworld.com/story/32652.htm , link now broken)

I, for one, agree with Bill Gates' original assessment. A 'light client, heavy server' solution simply relies too much upon an outdated concept, one that worked very well when the average client machine had the processing power of a modern hand-held device, and even before that, when big-iron timeshare systems were the state of the art. But today's computers often have multiple cores operating at speeds well in excess of 2GHz, gigabytes of RAM, terabytes of disk storage, and video resolution better than HD TV. That's enough hard disk space to store hundreds of full-length feature films and more hours of audio recordings than you could possibly listen to in a decade, with the usual '1000 times more powerful than the computer that sent man to the moon' argument thrown in for extra good measure. And let's not forget the widespread use of high speed internet connections, which really haven't improved in speed and performance nearly as fast as the computers used to access them. The entire concept of 'light client, heavy server' just doesn't make sense any more. Technology is moving too fast in a direction where the processing power of the average desktop far exceeds any possible advantage that could be derived from using a remote server (and the desktop may in fact be MORE powerful than the server it is 'querying the data' from).

'.Net' and leveraging the server market

From the outside, the server-side aspect of the original '.Net' initiative, with Passport authentication, SOAP, and other built-in technologies, appears to be a method by which Microsoft could have leveraged the server side of the Internet, forcing web site operators into using Microsoft servers in lieu of their primary competition: open source.
By all indications, a Microsoft server would have become NECESSARY to implement the kinds of client/server technologies that would eventually have required a closed-source '.Net'-based application on both client and server in order to run. And without open specifications or open source code for any of the required technologies, the only recourse for competitors would have been to reverse engineer the 'black box' functionality that '.Net' was intended to "provide for you". Fortunately, this scenario never happened.
Instead, '.Net' evolved into the 'Common Language Run-time', with an actual alternative for non-Microsoft operating systems, the MONO project.
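
To make the kind of client/server coupling described above concrete, here is a minimal sketch of what a server-side SOAP 'web service' looked like under the ASP.NET model of that era. The class name, namespace URL, and quote logic are all invented for illustration; only the [WebService]/[WebMethod] attribute pattern is the actual ASMX mechanism:

    // Hypothetical ASMX-style web service -- a sketch, not code from
    // any real product. ASP.NET exposes the method below over SOAP,
    // and a '.Net' client calls it through an auto-generated proxy
    // class, tying BOTH ends of the wire to the framework.
    using System.Web.Services;

    [WebService(Namespace = "http://example.com/quotes/")]
    public class QuoteService : WebService
    {
        [WebMethod]
        public decimal GetQuote(string symbol)
        {
            // Placeholder logic; a real service would query a database.
            return symbol == "MSFT" ? 25.50m : 0m;
        }
    }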

AND THIS IS PRECISELY WHY IT DID FAIL!

If the '.Net' run-time were shipped with source code, so that people could "compile in" only what they needed, there would be no need to ship around the 30+Mb 'shared component installer' along with the application.

update: Mono is an open source equivalent of '.Net', but lacks certain Microsoft-specific functionality.

And let us not forget the all-too-frequent OS-level updates and security patches, and (from a developer's viewpoint) what happens when an 'update' breaks your application and YOU get the midnight phone call! Service packs and security patches pose enough of a risk of 'breaking' your application and causing that midnight phone call; adding a large chunk of shared components (one you can expect to be frequently updated) potentially reduces your project's reliability even more, Microsoft's promises to the contrary notwithstanding. After all, they are 'fixing' the so-called "DLL HELL" problem, right?

In some ways Mono and shared libraries on open source systems like Linux aren't a whole lot better. But the one thing you CAN do with open source is to update the source code from a known-good repository, then re-build all of the libraries and applications from source, which high-reliability open source systems (like those using FreeBSD) typically do.

But quite transparently, the '.Net' initiative was simply Microsoft's way of trying to force everyone into purchasing a Microsoft server package for every web server that puts content out on the web. They wanted to take this part of the market, and '.Net' was their mechanism for doing it. You really can't blame them for trying. But the principles upon which they based this strategy are inherently flawed. And last I checked, UNIX and Linux platforms were grabbing a larger percentage of the server market, primarily due to perceived unreliability of the Microsoft server platforms as well as Microsoft's licensing policies.

It has been several years now since the 'Microsoft Passport' system was put in place. It would not take rocket science to make a list of all of the web sites that are actually USING it. After having made such a big push for '.Net', and seeing how FEW web sites are actually using its capabilities, how can Microsoft expect all of the money and effort poured into this concept to actually pay off? Granted, I use the passport so that I can use 'Windows Messenger', and I have to use it for my MSDN subscription (which recently got a bit more expensive, though not without some extra bonuses in the process, like the way the cable TV company raises your rates after adding more channels). Even with a Java-like language like C#, which actually has a promise of being useful for development in the future, you are still required to 'backpack' a monolithic set of shared components that is just too large to allow you to offer a C# application as a downloadable file without forcing the user into a 2-step "setup" process that's far too complex for the average computer novice.

'.Net' is too monolithic for desktop solutions

To sum it all up, the 'shared components' required to run the application are bigger than the application itself. And they carry with them a monolithic set of capabilities that are only valuable for a limited number of end users. Since these extra features are all 'web based' anyway, it also requires a 'live' connection to the internet to even make use of them. And not everyone wants to be 'on line' all of the time.
(this no longer applies, since web-based applications, the few that actually exist, typically use Java or pure HTML now)

And there is just no need for 'Passport Authentication' to watch an on-line music video, or to place an order at a commercial web site (such as Amazon.com, which has been using its own secure on-line authentication system for years without any apparent major incident). And do we really want the various on-line sites and stores knowing what we're doing on OTHER sites that use 'Passport'? The possibility of being tracked using this system may be one reason why it's being avoided! After all, the agreement you make when you get your 'Passport' DOES say that your personal information will be shared with the sites you visit. There is no reason why you could not be tracked using this information, and then targeted for various forms of marketing or ads on every site you visit as a direct result. People are a lot more aware these days with respect to privacy and security.

And so, as a software developer, you need to haul around all of these 'shared web components' that are probably not even necessary for your application, just to ship it: you end up including a 30+Mb self-extracting setup utility along with your own application. You may have chosen a language that requires '.Net' because you wanted to take advantage of the language features, like Visual BASIC or C#. Or you may have written a C or C++ application and included the 'managed extensions'. In either case, it's really just too much bloat without enough added functionality. But it IS more convenient for Microsoft, as in "One File Fits All".

It was once thought that shared libraries were the best way to go. In some cases, such as "optional features", this is absolutely the case. You don't need to include a driver for an Epson printer if you only own an HP printer, and so forth. Dynamic libraries make the most sense when they are loaded on-demand, such as when you print.
Modern computers have SO much memory and SO much storage space that, for reliability reasons, it makes MORE sense for software authors to STATICALLY LINK all of the standard libraries into their applications, rather than relying on the presence of a (monolithic or otherwise) set of shared components that everyone uses. This is even more important if an application is sold via file download. Not having to include '.Net' in the installer will most likely reduce the file footprint, which means a lower bandwidth requirement for download, and ultimately lower cost. And it's also very likely that end users will be LESS FRUSTRATED with the setup process that way.
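
As a sketch of what 'loaded on-demand' means in practice, consider the following hypothetical C# fragment. The driver file name, type name, and method name are all invented for illustration; the point is that an optional library is never even loaded for users who don't use the feature, while everything the application ALWAYS needs can be linked in directly:

    // A minimal 'load on demand' sketch using reflection. Nothing
    // here comes from a real product; EpsonDriver.dll and the
    // Driver.PrinterDriver type are hypothetical.
    using System;
    using System.Reflection;

    class PrintManager
    {
        public static void Print(string document, string driverDll)
        {
            // The driver library is not loaded until the user
            // actually prints -- non-printing users never pay for it.
            Assembly driver = Assembly.LoadFrom(driverDll);
            Type t = driver.GetType("Driver.PrinterDriver", true);
            object instance = Activator.CreateInstance(t);
            t.GetMethod("PrintDocument").Invoke(instance, new object[] { document });
        }
    }

    // usage: PrintManager.Print("report.txt", "EpsonDriver.dll");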

Faster Machines for Piggy Applications - A Possible Conspiracy?

Many have observed what is sometimes referred to as the "WinTel Cartel" of Windows operating systems running on Intel computers, and the marketing strategies of pre-installing Microsoft OS's on the vast majority of the Intel-based computers being sold today. Although I don't see an actual 'cartel', a possible 'conspiracy' may in fact be related to '.Net'. However, the likelihood of this actually being the case seems to be dwindling over time, with sufficient competition from Apple and AMD, as well as inexpensive Linux-based notebook computers, and the proliferation of hand-held and embedded devices, many running Linux or a similar open source operating system.

Back in 1995, when Microsoft released their Windows '95 operating system, everybody wanted it, for it offered MANY significant improvements, being a true 32-bit operating system, having support for multiple threads, a better user interface, and a lot of multi-media enhancements not present in earlier versions of Windows. Like Windows 3.0, it was a revolutionary change in operating systems, and gave people the "killer apps" that would motivate them to upgrade their computers.

Unfortunately, computers with only 4Mb of RAM, which ran fine using Windows 3.1, needed to be upgraded. Similarly, a 386-based computer might run too slowly to work with Windows '95. And as a result, hardware sales increased in proportion to the demand for the new operating system, as people realized that they wanted the operating system but their existing computers weren't powerful enough to run it properly. And then a RAM shortage, due to a fire in a memory manufacturing plant somewhere in Southeast Asia, caused RAM prices to artificially escalate to several times what they were only weeks before. The demand for precious RAM to run Windows '95 no doubt made many hardware vendors quite wealthy, and caused the only serious price INCREASE in computer hardware in the last several decades.

Now, Windows '95 wasn't what I would call 'piggy' or 'monolithic', but the operating system, and the applications written for it, simply needed more RAM to function. In part it was due to the 32-bit code itself. In part it was due to the widespread use of Microsoft's "foundation classes" (MFC), which increased the 'memory footprint' of windows applications significantly. In part it was due to the improved functionality and 16-bit application compatibility. The bottom line: you needed at least 16Mb to run Windows '95, and it wasn't anybody's fault. It just 'was'.

But there hasn't been any such 'revolutionary' improvement in Windows since then. Sure, Microsoft has completed their migration to 32-bit by introducing improved versions of Windows NT, then 2000, then XP. And they released Vista (a step back), which disappointed end users, and quickly followed up with Windows 7 (a step and a half forward), still with too many 'flavors' to properly keep track of. And of course there are the various 'server' versions, often bearing similar names, and more recently the release year. There has even been a migration to 64-bit (with 32-bit application compatibility), and Symmetric Multi-Processor (SMP) support for all of the new multi-core and hyper-threading processors. Yet the user perception of the operating system, whose underlying technology has changed significantly over time, is that it's more or less the same as it was before, with a different name and a slightly different appearance. There has been no revolutionary change. There has only been evolutionary change, and only out of the necessity of keeping up with the hardware.

And so, the sales of 'high end' CPUs, large amounts of RAM, and 'very large' disk drives are dropping, and along with them the prices, the economic situation notwithstanding. Technology has continued to move forward, but the demand just isn't there any more like it has been in the past. Sadly, it's probably due to operating systems and software that are unable to utilize this new hardware to its maximum potential. The technology seems to be outgrowing its practical usefulness.

And the fault may lie with Microsoft's '.Net' strategy, and the resources they wasted on it. For Microsoft could have focused on streamlining CPU-intensive processes like natural language speech recognition, for the 'HAL 9000' or 'Star Trek' computer interface. They could have focused on utilizing multi-core processors fully, by altering their development libraries to use multi-threaded solutions for time-consuming operations. They also could have focused on 3D graphics technologies that would have given you 3-dimensional virtual reality or game-like interfaces, perhaps with the ability to 'swim' through your files or the entire Internet like the 'sea of information' it is supposed to be. If you can visually recognize the things you want, instead of spending lots of time frustratingly entering different queries into a search engine (only to be led in circles by the results), the Internet becomes more like 'information' than 'disorganized data'.

Yes, Microsoft has definitely chosen a direction - their '.Net' initiative. It's a direction that will most certainly require 'more heavy duty' CPUs, more memory, and more disk space, because the '.Net' framework represents a monolithic, top-heavy run-time system that must be hauled around like a ball and chain by applications that make use of it. Yet simpler solutions like HTML and scripting already do the job. But they don't sell hardware! For if Microsoft ever succeeds in (quite literally) force-feeding '.Net' technologies down the throat of the unsuspecting end user, we will (once again) be forced to upgrade our hardware to 'take advantage' of this 'new technology'. Already this can be seen with Windows 2003 Server, which requires a minimum of 128Mb to run. Yet earlier versions like Windows 2000 Server ran easily on HALF that much. And the functionality in Windows 2003 Server is really no better than that of Windows 2000 Server, except of course for the monolithic '.Net' framework, which apparently needs that extra 64Mb of RAM for something...
In 2010 we are coming close to the limit of processing speed for a single thread of execution. CPUs are not getting FASTER, they're getting WIDER, with the introduction of 8 or more CPU cores per processor, all running at about the same clock speed as last year's models. As such, there is a desperate need for multi-threaded solutions for end-user applications; that is, symmetric processing (aka 'threaded') algorithms. And it is just as important to write hyper-efficient NATIVE code tailored for each specific processor's capabilities. With such techniques and effort, the 'apparent speed' of an application, as perceived by the end user, will increase. With solutions like '.Net', which get piggier and piggier, applications will actually appear SLOWER, and user perception is more important than whatever technology you're using under the hood. And a '.Net' application, especially one that does not use native binary code, WILL be directly affected by the '.Net' Common Language Run-time (CLR), regardless.
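
As a simplified sketch of the kind of 'threaded' algorithm being argued for here, the hypothetical C# fragment below splits a CPU-bound summation across one thread per core, using only the basic System.Threading API; the data and the work function are stand-ins, and the same pattern applies (with even greater payoff) in native code:

    // One worker thread per core, each summing a strided slice of
    // the array. No locks are needed because each thread writes only
    // its own element of 'partial'.
    using System;
    using System.Threading;

    class ParallelSum
    {
        static long[] data = new long[10000000];
        static long[] partial;

        static void Main()
        {
            for (int i = 0; i < data.Length; i++) data[i] = i % 7;

            int cores = Environment.ProcessorCount;
            partial = new long[cores];
            Thread[] threads = new Thread[cores];

            for (int t = 0; t < cores; t++)
            {
                int id = t; // private copy for the anonymous delegate
                threads[t] = new Thread(delegate() { SumSlice(id, cores); });
                threads[t].Start();
            }
            foreach (Thread th in threads) th.Join();

            long total = 0;
            foreach (long p in partial) total += p;
            Console.WriteLine("Sum = " + total);
        }

        static void SumSlice(int id, int cores)
        {
            long sum = 0;
            for (int i = id; i < data.Length; i += cores) sum += data[i];
            partial[id] = sum;
        }
    }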

The future of '.Net'

If Microsoft were to simply release the source files for their '.Net' shared components, in addition to the common run-time source (which they have thankfully released already), allowing developers to statically link in what they needed, or even develop their own versions (or compatible technologies) to go along with it, '.Net' might have a chance of survival in a somewhat unsure marketplace. But Microsoft has not done this, and instead has 'hidden' the functionality from public view, providing only 'black boxes'. Often there are reasons for doing this, and the primary reason is protection of intellectual property. As I have stated before, this appears to be Microsoft's leverage over server platforms.
As it turns out, Microsoft eventually DID contribute most (but not all) of their Common Language Run-time (CLR) code to the MONO project, and (as I understand it) is active in the development and support of the MONO project. Kudos for that.

But to put their one area of MAJOR success in jeopardy in the process of developing '.Net' is a major strategic blunder on the part of Microsoft. They have shifted a lot of resources into making '.Net' work, and it has taken its toll on 'desktop innovation'. There should have been a technology revolution by now, but it is late in coming. Other operating systems, including Apple's OS X, have a chance at taking Microsoft's place in the desktop world. If Apple were to introduce their G.U.I. in a form that would operate on any x86 system running UNIX or Linux, a LOT of people would adopt it, and if Apple had the applications, pricing, and marketing guts to make it happen, they could push Microsoft out of desktop dominance permanently. And the aftermath of Windows Vista only strengthens my point. Fortunately, Microsoft seems to have learned from this experience with the release of Windows 7.

That being said, it's a bit of a shaky world for developers right now, not knowing which way to go. If you listen to the Pied Piper, you'll follow the mice into '.Net' development, and like Microsoft's other 'failed' initiatives (ZAW, Net Meeting, and WinDNA being among those that completely vented the initial high-pressure steam they had when Microsoft first introduced them), you'll end up wasting man-hours 'learning something new' only to find out that it costs you a lot more than it's worth in the long run. Years after its introduction, C# has barely grown in popularity. Yet it was supposed to replace C++! If anything has replaced C++, it would have to be Java. Java is doing rather well these days, and so is the 'C' language. But C# has barely caught on. No revolution here.

UPDATE: August 2004 and June 2010
The following article on the TIOBE Programming Community Index for August, 2004, confirms most of what I've been saying about '.Net' all along. Looking at the relatively constant popularity of Visual Basic (the newest version of which now uses the '.Net' runtime because Microsoft made it happen that way), plus the VERY LOW popularity of C# (which definitely requires the '.Net' runtime, or an open source equivalent like 'Mono'), simply demonstrates that people aren't lining up in droves to use Microsoft's latest development technology. In fact, the languages gaining popularity the fastest are OPEN SOURCE languages. Perhaps Microsoft should take a lesson from their own playbook, and get on the 'open source' bandwagon the way they wanted everyone to get on the 'windows' bandwagon some years ago.
In June of 2010, the TIOBE index reported that C# had a popularity of around 5%, with Java and C at around 18% each and C++ at just under 11%, with Java still holding onto the #1 slot since its popularity dropped below that of C for a short time in 2004 and came back again in 2005. C#'s popularity has crawled from about 2% in 2003 to the level it is today, taking 7 years to do so. It just doesn't appear as if anyone is going to rapidly switch their existing Java or C or C++ applications over to C# any time soon.

The future of Computing and the internet

When I look into my technological crystal ball, I see a few things happening on the horizon that are far different from current conventional thinking (but that has always been the case, I think). But I was right about Windows, I was right about Java, I was right about the 'dot bombs', and I was right about several other things that I can't remember right at the moment. Since hindsight is 20/20, I can make any claim I want to about these things, so why not do something that boosts my credibility a bit? I'll go ahead and make a prediction and we can all see how it turns out (today it is October 3, 2003).

That being said, it is not the kind of news that a Microsoft Board Member really wants to hear. But I believe that if software developers and technology investors keep their minds open and watch the trends (instead of jumping on the latest bandwagon and betting the farm), they will see whether or not I am right about all of this. If I am wrong, they will know, and Microsoft's '.Net' will be the wave of the future and will dominate both the server and desktop platforms with a plethora of "dot bomb-like" functionality that the 'dot bombers' of 2002 were simply unable to implement. Or, Microsoft's '.Net' initiative will turn out to be a billion dollar boondoggle, complete with lower stock prices (this happened, but Vista was primarily responsible) and a lower share of the market (this is happening too, mostly because of hand-held and low-cost portable devices).

It's really too bad that Microsoft didn't sink their money into improving the desktop operating systems with 'the next technology' instead of trying to dominate the market. I would love to have a HAL-like computer interface that understands what I am saying, and can intelligently answer my questions, and even play a good game of chess. Or what about a true 3D user interface, not a 2D one like Windows, but true 3 dimensions, maybe one that reminds you of a video game or Radical Edward's computer display (from 'Cowboy Bebop'). These are the kinds of radical technological innovations that are likely to make people want to upgrade their systems. And Microsoft has already floated 'trial balloons' indicating that they want to shift to 'subscription licensing' so they can guarantee a revenue stream.

And THAT just makes Apple and Linux look even MORE attractive!

But, you say, what about the "dotGNU" project? Well, if you have any question as to why it was started, and what they are REALLY trying to accomplish, you should visit the following link and see what THEY have to say about it:
    http://www.gnu.org/projects/dotgnu/danger.html
Not to say that writing for "dotGNU" would necessarily be a BAD thing, if you are so inclined. My personal preference is to avoid shared run-time code whenever possible, because it often results in compatibility problems (even on *nix systems). But if it became integrated with a standard set of 'widgets' for gnome (as one example), it could then become a part of the operating system (or the desktop system, at any rate). And there is a significant difference between a 'bloatware slap-on' set of shared run-time libraries and an operating system.
The "dotGNU" project morphed into the MONO project, and has been moderately successful, though nobody really seems to be jumping on THAT bandwagon either. It's a 'fun toy' to write a simple C# application and then have it run on a Linux box, after sweating out which packages you need to install on the Linux box to make it happen. But you CAN do it. Really.

Microsoft has been there before, making decisions that adversely affected developers. In the early part of 1998 they shipped a version of MFC40.DLL that broke certain applications written prior to Windows '95 OSR2, Windows '98, and NT 4.0 SP3. These applications literally called the WRONG functions in the 'CRecordset' class implementation if they had been designed to use the MFC40.DLL run-time (instead of statically linking). Microsoft had been encouraging this, for the usual reasons that ultimately led up to DLL HELL. And it was MICROSOFT who BROKE IT! In some cases, this would have led to the midnight phone call, "Your application doesn't work any more", and tens of minutes later you find out the user installed another application that took the liberty of updating MFC40.DLL, thus breaking YOUR application. And of course it was YOUR fault it wasn't working!


Worthy of note is an article published on-line in 'The Register' regarding an analysis by a Microsoft-funded company, '@stake', which concluded that '.Net' is more secure than IBM's competing J2EE (Java 2 Enterprise Edition) framework. However, as the people at 'The Register' so cleverly pointed out in links on the web page in question:

    Security analysis of Microsoft .NET Framework and IBM WebSphere, by @stake (funded by MS) (link now broken)
      - vs -
    Passive smoking isn't really harmful, 'independent' study funded by tobacco industry

And so it appears that perhaps an un-biased evaluation of '.Net' is not really possible. This goes double when you read the license agreement for installing it: basically, you are NOT allowed to do any kind of benchmark testing unless you get permission from Microsoft first. I suppose 'benchmarking' would include comparing it to ANY other product, so in effect it's a "gag rule". In light of the lack of REAL innovation, plus the apparent attempt to leverage the market and shift/reduce real software development to "slapping together a bunch of third party or 'shared' component libraries and calling it a 'product'" (in line with the 'VB Model' of doing things, of course), it just shows you that the direction being paved by Billions of U.S. Dollars from Microsoft is one of those classic idiotic moves that only an attorney could think up. Then again, it could also be an attempt to inflate the APPEARANCE of a 'product' in order to artificially keep stock prices higher.



©2004,2010 by Stewart~Frazier Tools, Inc. - all rights reserved
last updated: 6/7/2010