2010 - The Deconstruction

Initial notes…

The following piece is a sociological analysis of the state of professional software development today. As one who is not just a senior software engineer but a military historian/analyst as well, I have brought together the skills of both areas of expertise to demonstrate to the technical community at large that what is happening to our profession has nothing to do with innovation or new opportunities in application development. It is instead an ongoing process of terrible, deleterious degradation that has professionals everywhere concentrating on abstract concepts, useless paradigms, and redundant new tool introductions without any analysis of the benefit these things bring to the development of new applications. In fact, little has been reported on the success (or failure) of application development itself, as this aspect of reporting has been pushed aside for the less relevant aspects of our profession. Is it not the final product that is important, and not what makes it?

The negative results of this process are staggering. They are not simply the result of a software profession overwhelmed by marketing hype and vendor misdirection, both supported by an avalanche of “me too” groups of young professionals trying to insert their own ideas; they are as much a symptom of what is happening to societies in general.

To understand that this deterioration is not isolated to our profession alone, one should take a look at the article at Russia Insider at http://russia-insider.com/en/military/epic-fail-heres-why-most-us-weapons-systems-are-worse-russias/ri11097. The article is written by a US military analyst and provides some in-depth detail on the state of the US Armed Forces and intelligence agencies. You will also find some of my own historical notes in the comments section below it.

To add to this, one may also want to see John Oliver’s piece on the state of the US nuclear arsenal, which is both hilarious and terrifying at the same time (https://www.youtube.com/watch?v=1Y1ya-yF35g). Much of what Oliver reports has already appeared in mainstream news outlets.

Cognizant functionality in daily life is quickly deteriorating, in all areas of industrialized societies, into nothing but a cacophony of gibberish and nonsense, not to mention the terrible social and political traumas being inflicted on citizens across the globe along with horrific conflicts and their effects. The software development profession has not been immune to these trends…

Symptoms of a new age; a deconstructed one

Recently, a technical news item that has probably appeared in many professional developers’ inboxes promoted a new tool called “Granular.js”, which is essentially Microsoft’s Windows Presentation Foundation, or WPF, re-implemented for JavaScript. A few days later, two other rather interesting articles appeared on the “jaxEnter.com” site…

The first article, “Why are websites getting fatter?”, explains the results of the movement away from server-side processing for web applications toward a concentration on the front-end. The second article describes an increasingly bleak future for organizations that avoid database due-diligence practices by allowing developer groups to implement their own favorite NoSQL database engines.

All three combined, along with many other such articles not mentioned here, are small indications that a sociological change has occurred in the software profession. For all the popularized hype about doing development in new ways for the 21st century, these methodologies, which have often avoided simple common sense, are now beginning to show the innate flaws in their underlying philosophies, as many senior and more cognizant younger professionals expected they would.

The announcement regarding the “Granular.js” project would lead most serious professionals who have been in the field for a while to wonder what motivations a group of professional developers could have for “reinventing the wheel” with such a project.

“Windows Presentation Foundation” is already finely developed by Microsoft and has been advanced with every new release of its flagship development product, “Visual Studio”. WPF is well designed to replace Windows Forms for desktop application development in the long term, and it has an option, the XBAP format, that allows WPF applications to be hosted within a browser.

So why would anyone waste their time reworking an already maturing product to use JavaScript, of all languages? For an answer, here is the introduction from the “Granular” site on GitHub…

“Granular is a re-implementation of WPF, compiled to JavaScript, allowing WPF applications to run in the browser (using Saltarelle C# to JavaScript compiler).

WPF can be thought of as a definition, not only as an implementation. It defines many advanced UI concepts (such as visual / logical trees, layouts, routed events, resources, bindings and many others).

It also naturally supports MVVM and other UI patterns, Granular allows to enjoy all of them in the web.”

The “Granular” project introduction admits to being simply a re-work of an existing desktop development framework, for no reason other than that one can now apply JavaScript to its implementation. Why not simply use WPF on the Internet as it was designed to be used? Why add additional complexity to this implementation?

The irony of a group re-working WPF so that it can be implemented with JavaScript in a web environment it already supports is symptomatic of our profession’s slow deconstruction, which appears to have begun in 2010; technology is seemingly now created for the sake of creating it, with no apparent understanding of whether it is even needed.
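For reference, the WPF concepts the Granular introduction lists, bindings in particular, are already simple to use in plain desktop WPF, which is part of what makes the re-implementation puzzling. A minimal, hypothetical sketch (the class and property names here are invented for illustration, not taken from the Granular project):

```csharp
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;

// A minimal WPF data binding built entirely in C#, no XAML required.
public class GreetingWindow : Window
{
    public GreetingWindow()
    {
        // Hypothetical source object; any public property can be bound.
        var model = new { Greeting = "Hello from plain WPF" };

        var text = new TextBlock();
        // Bind TextBlock.Text to the model's Greeting property.
        text.SetBinding(TextBlock.TextProperty,
                        new Binding("Greeting") { Source = model });

        Content = text;
    }
}
```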

What new business requirements for the 21st century?

The idea that such new technologies are for the “business requirements of the 21st century” is nowhere based in any factual analysis, with the exception of the hyper-marketed nonsense that promotes it as such.

Considering that businesses today are just as corrupt, bureaucratic, mean-spirited, and disorganized as they were in the 20th century, there is no reason to expect that with the passing of the millennium something magical has happened to a centuries-old entity that was originally created as a scam (see Joel Bakan’s “The Corporation”).

So what has supposedly changed in terms of their requirements? Nothing, except the push for software development teams to deal effectively with hyper-fast levels of product creation. How absurd: a basically creative process expected to “innovate” at ever-increasing speed.

Old language constructs and new ones aren’t creating anything new

Today, development technology, in the web space in particular, is going in so many directions in an attempt to satisfy such demands that it is basically unable to provide any lasting, concrete techniques that might actually foster simpler and more efficient ways to build Internet applications and the other types of applications used in business environments. Relying on JavaScript or similar language constructs (e.g., Dart, Go) as a foundation for so many of these avenues on the Internet is surely a fool’s errand, since the foundation itself has never been a strong suit of such development. Either the language has inherent issues, or it won’t last beyond the time it takes to grow stale after its introduction. So far, the newer languages just noted have not done much to ingratiate themselves with the bulk of the development community. As to JavaScript, even the older VBScript implementations were comparatively more advanced in their time than anything JavaScript has to offer today.

As vendor organizations go, Google has become famous for contributing to this erratic drive for innovation simply for the sake of innovation, under the guise of 21st-century business requirements. Many of its new development concepts fail outright or simply disappear due to a lack of interest (e.g., “Dart”, which is now being resurrected under the new “Flutter” platform). Even one of its finest creations, Google+, which appears to have been designed as a competitor to Facebook, is an excellent implementation of a social media interface when compared against its “nemesis”. Its ease of use is far and away superior to anything that Facebook or any of its other competitors has to offer. Yet Google, this massively wealthy organization, simply gave up on such a good concept with no concrete reason provided other than that it had lost interest.

Such generalized project failure regarding new innovation in the IT profession makes one want to cringe, especially when it is explained away as the price of innovation while the after-effects continue to seed confusion and litter the overall landscape. It appears that many in the profession are simply throwing ideas up against a wall to see what sticks, without considering why they are lobbing such things in the first place; except, of course, that they are new. Contrary to this, historical innovation was fostered by a necessity (e.g., medical advances) or a convenience that made life a little easier (e.g., the telephone). The sociological effects of the plethora of smart-device innovation now available have seemingly demonstrated that such devices have done society as a whole a lot more harm than good.

Real innovation is done at the foundation level

However, what type of innovation can be created within current business application development when there is little more to accomplish beyond that which supports standardized methodologies and techniques for building such applications? Most such techniques have already matured into stable foundations for development endeavors. Why change what isn’t broken?

By 2010, Information Technology tools and processes had actually reached a zenith of their own success and, with the onset of critical developments, began to unravel as the profession became a willing victim of that success. Much of this unraveling was the result not only of accumulated technical success but of sociological changes in our daily lives, and both were seriously affected by the changing nature of political and business leadership. Yet, technically, few to none of these outcomes have generated any real positive effects for people in general.

This situation in the technical community appears to have been the result of the growing popularity of the Apple iPhone and the subsequent Android smart-phone technologies, which spawned a senseless mania for all things “mobile”. This emphasis encouraged a mindless acceptance of such technologies as hand-maidens necessary to function competently in the 21st century.

It was no longer enough to interact with the Internet from the confines of more robust equipment, which confined such usage to a particular context in one’s life. Now people could, and eventually needed to, be attached to their mobilized devices 24/7. The question is, did anyone ever ask why? Hardware and software vendors saw it only as a new avenue for profit in an industry that had become relatively stable and mature in its technologies; younger professionals saw it as something entirely new to tinker with; while the non-technical person seemingly adapted to the new computing paradigm more from a need for an addiction than anything else.

Before the revolution

Before this so-called “revolution” in mobile computing occurred, people did things within a life construct in which all things had their place. If you watched television, you went into a room that had one. If you wanted to research something in particular, you went to the library or, increasingly, to where you kept your desktop or laptop. If you wanted to speak to someone, you called them on a phone or spoke with them face-to-face. Such things did not seriously overlap, leaving daily life somewhat organized compared to the increasingly unhealthy blur that it has now become. The complaints by many that they feel like they are living in some form of disconnected haze are a direct result.

Today, no one has to bother themselves any longer with such an aging concept as context, and as a result it is increasingly demonstrated how our world is breaking down along the crevices of a myriad of sociological fault lines.

Now, if you want to watch television, you just turn on your smart-phone or tablet, or do both; to research something, you turn on your smart-phone and rush to the latest online tidbit that seems feasible on a tiny screen. Want to talk to someone? Again, turn to your smart-phone, while the person you may want to communicate with is sitting right across from you. But why speak when you can text…

The symptom of it all; Microsoft’s ASP.NET MVC

Into this evolutionary quagmire, in 2010, Microsoft launched its now popular ASP.NET MVC development paradigm, which many of us much older professionals, along with the more cognizant younger ones at the time, reacted to with somewhat muted horror. This was hardly the defining cause of what was already in motion but more a symptom of it. Microsoft was merely following a continuity of events that, for whatever reasons, were fostering a perceived need for change in the software development communities. The introduction of ASP.NET MVC was nothing more than a single event within a larger universe that was deteriorating into senseless chaos.

The first question from many professionals in the .NET community at the time was why Microsoft would suddenly promote a development infrastructure that basically returned us to the “Classic ASP” environments with some additional bells and whistles attached. And why return us to an infrastructure that Microsoft itself created but spent years disingenuously degrading in favor of its new .NET technologies?

ASP.NET was commercially released in 2001 and was a boon to general web development. It made web development a far easier endeavor, even exceeding Java’s advances at the time, by removing the developer from the more difficult requirements of understanding and working with the low-level aspects of Internet development.

However, with ASP.NET MVC there seemed to be a need to turn this apparently more modern approach to web development on its tail. There is no doubt that there were things in ASP.NET that needed fine-tuning (e.g., better HTML generation, which was resolved), but with ASP.NET MVC it appears that Microsoft threw the baby out with the bathwater. Now everything that Microsoft had touted as inefficient and overly complex about “Classic ASP” was no longer true, since with MVC much of what “Classic ASP” had been was brought back into vogue with a number of technical twists. Many reasons were cited by the new MVC promoters in their attempts to demonstrate the efficiencies of this paradigm, but absolutely none of them mentioned anything of benefit to business organizations (e.g., cost reductions) or a way to develop applications faster. Nor was it ever explained how the Web-Forms architecture failed to offer a modularized approach to such development, other than that, like its earlier predecessor, it was prone to producing bloated, messy projects.

If the technical foundations you work with force a definition of the methodologies with which you work, then no matter what new development is devised, if it is as well designed as the one it hopes to replace or supersede, it will be more or less redundant; which is exactly what has happened. As regards ASP.NET and ASP.NET MVC, they both do exactly the same things, each with its advantages and disadvantages, making neither superior to the other and only offering yet another avenue for such development. This is just as true of the Granular project as of many others. WPF for the desktop, like development for the Internet, is a mature product in terms of how far the underlying technologies can be taken. For the foreseeable future, development on the desktop will remain relatively the same, whether one uses Windows Forms, WPF, or Java Swing.
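To make the equivalence claim concrete, consider the same trivial feature written both ways. This is a minimal, hypothetical sketch (the page, control, and controller names are invented), not production code:

```csharp
using System;
using System.Web.Mvc;
using System.Web.UI;
using System.Web.UI.WebControls;

// ASP.NET Web-Forms: a code-behind event handler. The two controls would
// normally be declared in the corresponding .aspx markup.
public partial class GreetingPage : Page
{
    protected TextBox NameTextBox;
    protected Label GreetingLabel;

    protected void SubmitButton_Click(object sender, EventArgs e)
    {
        GreetingLabel.Text = "Hello, " + NameTextBox.Text;
    }
}

// ASP.NET MVC: a controller action producing the same result, rendered by a
// Razor view (e.g., Views/Greeting/Index.cshtml).
public class GreetingController : Controller
{
    public ActionResult Index(string name)
    {
        ViewBag.Message = "Hello, " + name;
        return View();
    }
}
```

Either way, text goes in and a greeting comes out; the difference is the surrounding ceremony, not the capability.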

The problem surfaces with the general confusion that the increasing number of development technologies causes, which understandably has new and existing professionals questioning which environments and languages they should concentrate on to maintain security in their careers. The recommended lists are apparently getting longer, making it far more difficult for anyone to gain any substantial level of expertise. The result is also a decline in the inherent, non-codified knowledge bases within the technical community, which are vitally critical for successful project development.

Real innovation is taking place but not where everyone is looking

None of these contentions mean that true innovation in the technology business space is at an end. However, it is important to understand where true innovation can occur. Real innovation has been taking place under the hood of database engines, as one example. Look at any of the popular RDBMSs and their release notes over the past few years and you will find that giant strides are being made in the internals of these venerable workhorses; everything from performance to new specialized features is being affected by such updates, all without changing how one basically works with such data storage tools, leaving this technical area quite stable.

Some of these innovations are so dramatic that there is increasingly little reason to consider the popularized NoSQL engines, which in fact represent a retrograde form of database system. With NoSQL, someone took a B-Tree construct, patched JSON and/or XML onto it, and claimed that a document store was somehow a radical new entry in the database world. Older professionals saw all of this done in the 1990s and before, with B-Tree, B+Tree, and ISAM databases with hashed keys. Nor has any NoSQL database done anything radical that existing RDBMSs have not already done, as both text and binary documents can easily be stored in them. NoSQL engines may have a claim to greater efficiency, since in most RDBMSs SQL is merely a layer sitting atop B-Tree/B+Tree constructs; however, that alone is hardly a reason to use them.
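As a rough illustration of that last point, here is a minimal sketch of storing a JSON document in an ordinary SQL Server table via ADO.NET. The table, column, and connection details are hypothetical:

```csharp
using System.Data.SqlClient;

class DocumentStoreDemo
{
    // Stores a JSON document in a plain relational table, e.g.:
    // CREATE TABLE Documents (Id INT IDENTITY PRIMARY KEY, Body NVARCHAR(MAX))
    static void SaveDocument(string connectionString, string json)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "INSERT INTO Documents (Body) VALUES (@body)", connection))
        {
            command.Parameters.AddWithValue("@body", json);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    static void Main()
    {
        SaveDocument("Server=.;Database=Demo;Integrated Security=true",
                     "{ \"customer\": \"Acme\", \"total\": 42 }");
    }
}
```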

In 2010, there was really nothing new to introduce, since all the technologies being used were quite mature. Java was, and still is, being used in the same manner on its side of the fence, and .NET can provide no reason why Web-Forms should not still be the dominant web development platform other than perception. Yes, many can list all the technical reasons why MVC is superior to Web-Forms development. Yet, in the scheme of things, the promotion of any such paradigm is really meaningless.

Reversing sound server-side principles with ASP.NET MVC

ASP.NET MVC’s most significant effect on the .NET Internet development community was the reversal of years of proven technique regarding the reliance on server-side calls. Instead of following a robust form of client-server processing, much of what had been done server-side would now make its way to the front-end. This in turn promoted JavaScript and its variants (e.g., CoffeeScript, TypeScript) into mainstay languages from which many new tools are now being derived, such as the previously mentioned “Granular”. Incredibly, “jQuery”, the highly popular JavaScript framework, was designed with the express goal of making JavaScript “more usable”, since it was well known that it wasn’t. Is anyone getting the feeling that they have been duped?
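A rough sketch of the shift being described, in hypothetical .NET terms: where Web-Forms rendered the finished page on the server, the MVC style increasingly hands raw JSON to front-end JavaScript and lets it do the rendering. The controller and data below are invented for illustration:

```csharp
using System.Collections.Generic;
using System.Web.Mvc;

public class ProductsController : Controller
{
    // The old server-side habit, for comparison, was along the lines of:
    //   ProductsGrid.DataSource = LoadProducts();
    //   ProductsGrid.DataBind();   // server renders the finished HTML

    // The new habit: return raw data and let front-end JavaScript
    // take over the rendering work.
    public JsonResult List()
    {
        var products = new List<string> { "Widget", "Gadget" }; // stand-in data
        return Json(products, JsonRequestBehavior.AllowGet);
    }
}
```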

The result of this paradigm change from the server to the front-end has been less than stellar: web sites are becoming more ungainly, development environments more complex and less efficient, and the most important tenet of Agile (that of getting development done faster) is coming into direct conflict with the lower-level style of coding required by the MVC paradigm. No one will be able to promote the idea that an MVC application can actually be completed successfully faster than a similar, well-designed application using ASP.NET Web-Forms.

By this point, some readers are probably asking: if MVC was good enough for the Java community, what is the big deal in seeing ASP.NET MVC as some type of eruption in the sociological landscape? An excellent question… The answer lies in how Java has historically been used in the software development profession, which has been mostly for large, complex projects. Java is not used nearly as much in medium and small corporations and rarely holds a monopoly in large ones. Microsoft development products, by contrast, are ubiquitous across the entire business community in the United States and hold major shares of the development communities in South America, Europe, and India. Thus the ripple effect of this 2010 introduction was quite extensive.

Yet the emergence of ASP.NET MVC as the most popular .NET web development paradigm, similar to the one used by the Java community, has, sociologically speaking, made little to no sense in a profession increasingly dominated by the philosophy of getting things done ever more quickly. Not a day goes by without some vendor or training organization hawking its wares, promising a speed-like Nirvana for getting well-crafted applications out the door quickly. The entire concept is nonsensical on its face, but surprisingly, and increasingly, many developers have actually come to believe this clap-trap.

This specific event in 2010 demonstrated what the world in general was becoming: less thoughtful and reflective, due to the adoption of technologies that turned just about every aspect of life into a sound-bite. Increasingly, for the world at large, age-old concepts were being turned on their ears; “War is Peace”, “Wrong is Right”, and “Slavery is Freedom”, the foundations of the Orwellian nightmare, have very much become the stuff of modern-day nightmares. For business software development, “Complexity is Simplicity”, “Slow is Fast”, and “More is Less” have become similar refrains of Orwellian thought.

With this volatile mixture of technology and sociological change, the deconstruction of the software profession began in earnest in 2010 and has been on a dramatic roll ever since.

Other businesses are experiencing the same deconstructive effects as software development

If anyone would like to dispute this contention, then let’s step away from business technology for a moment and take a look at similar effects occurring in other parts of the US economy.

In 1995, the Boeing Corporation won the international award for technical excellence with its development of the 767/777 aircraft and the corresponding software used to run these magnificent machines. At that time, the US military also had such formidable technologies that no one on Earth could compete with its capabilities.

Like the Information Technology profession in 2010, this too changed, and dramatically, and for the same reasons: progress in technical innovation in general was being promoted for the sake of technology and profit instead of for any real creativity. (There are, however, individual exceptions throughout modern US history, such as the F-111.)

There are some exceptions, such as the Boeing 787 with its new skin-alloy designs. However, in an effort to pursue current trends in outsourcing, Boeing introduced into its very mature development process a more decentralized approach, whereby different aspects of development would be done by different companies and suppliers spread out across the globe. Upending the Boeing development and construction process had deleterious effects on the final 787 product, which became apparent with certain failures in the aircraft’s avionics and support systems. As advanced a concept as the 787 is, little is now heard about this beautiful aircraft. Whatever actual innovation there was in the design was overshadowed by the use of very self-destructive trends in US business processes.

Much of this failure was primarily due to the influence of the vile Neoconservative, or Neoliberal, mindset that has completely infested the US business and political classes. Technology companies have not been in any way immune to this, as the professional development community is widely known to have suffered terribly under the years of US outsourcing; a process pursued solely to reduce costs, which has universally obliterated quality while gutting the very essence of dominant US economic power, its manufacturing.

In short, the Neoconservative theory of economics reduces every aspect of daily life to a profit-based motivation. This has all been well documented and is the single reason for the ongoing economic as well as technological decline of the United States. And it has affected everything in US daily life, including, as just noted, the professional software development community.

Everyone remembers the debacle that occurred with the introduction of the Affordable Care Act (ACA), and its negative effects are still sending shockwaves throughout the United States. The ACA web-site was nothing more than a web-site for people in different states to apply for medical insurance. Such sites had been designed successfully many times before for other business needs, and yet the ACA site could not properly implement well-known techniques for the required level of concurrency and differentiated processing by state.

The cost of over $600,000,000, with over 50 different consulting firms on the US government’s payroll, literally boggles the mind; for such an expense, the developers clearly had no concept of the construct that would give the front-end efficient access to the various back-ends, among a host of other issues. For example, in New York State, the site’s processes could not correctly validate legitimate social security numbers, and the response of the customer service representatives was that such numbers were not correct. Scores of web-sites around the United States correctly validate social security numbers without any issue, but the ACA site in New York couldn’t, even though all of the New York State government sites could.

The question is why something so basic became a complexity of mammoth proportions when developers from so many different firms had to have known better.
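To underline how basic that particular check is, here is a minimal sketch of format-level SSN validation in C#. It encodes the commonly published structural rules (area not 000, 666, or 900–999; group not 00; serial not 0000); this is illustrative only, and a real system would confirm the rules against SSA documentation:

```csharp
using System.Text.RegularExpressions;

static class SsnValidator
{
    // Format-level validation only: AAA-GG-SSSS.
    public static bool IsValidFormat(string ssn)
    {
        var match = Regex.Match(ssn ?? "", @"^(\d{3})-(\d{2})-(\d{4})$");
        if (!match.Success) return false;

        int area   = int.Parse(match.Groups[1].Value);
        int group  = int.Parse(match.Groups[2].Value);
        int serial = int.Parse(match.Groups[3].Value);

        if (area == 0 || area == 666 || area >= 900) return false; // invalid area
        if (group == 0) return false;                              // invalid group
        if (serial == 0) return false;                             // invalid serial
        return true;
    }
}
```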

Similarly, for the United States Armed Forces, Lockheed-Martin amped up its ambitions to secure a monopoly in fighter design and gave us the F-35, which as a development disaster has so far not found an equal in the history of aviation.

In short, the F-35, with its ongoing hardware and software design failures, has single-handedly been the cause of the literal destruction of the fighter defense capabilities of the United States Air Force, the US Navy, and the Marine Corps. Simply put, the United States can no longer sustain air combat operations in the face of far superior Russian and Chinese weaponry. And yet this trillion-dollar travesty is being maintained by the US Congress, while scores of politicians continue to promote the efficacy of the US Armed Forces even as such development failures have been ongoing for years across all types of weapon systems.

The failures cited in the above examples are based on the same set of factors now plaguing the Information Technology field: a loss of the coherent planning and design capabilities that drive product delivery, the result of the comparable rise of a Neoconservative management mentality that prioritizes profit first while allowing little to support its generation. This abhorrent mad dash for money appears to have metastasized into a cancer that overrides the importance of every facet of life, including our technical capabilities, which are being marred by the continuous creation of tools that seem only to add to the general confusion of developers and seemingly have no real benefit other than to promote the ambitions of individuals and groups who, like anyone else, are attempting to make a mark in their chosen careers.

The dreaded “design pattern”; solutions looking for problems

To clarify this redundancy further, we can take a look at the modern paradigms for the creation of actual source code, more commonly known as “design patterns”, which began to emerge concretely in the 1980s from software engineering analysis of successful projects and from experimentation. Nonetheless, until quite recently most such constructs were shunned by most professionals, since they already used many patterns inherently in their work without any need for a formal understanding of them. In addition, software engineering analysts found that in many instances such patterns were actually just “solutions looking for problems”.

Today, design patterns and paradigms are promoted for anything a developer may want to create, with the idea that using a format that has been shown to be successful will produce further success; a false supposition to begin with. It hasn’t turned out that way. It has only produced increased complexity and ambiguity in software development. Do you really need to use a “repository” pattern when all you need to do is just access the bloody database!?
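To make the complaint concrete, compare the ceremony of a typical repository arrangement against simply querying the database. This is a hypothetical sketch (the interface, class, and table names are invented), deliberately minimal on both sides:

```csharp
using System.Data.SqlClient;

// The "repository" route: an interface, an implementation, and an injection
// point, all wrapped around a single query.
public interface ICustomerRepository
{
    string GetName(int id);
}

public class SqlCustomerRepository : ICustomerRepository
{
    private readonly string _connectionString;

    public SqlCustomerRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public string GetName(int id)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT Name FROM Customers WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", id);
            connection.Open();
            return (string)command.ExecuteScalar();
        }
    }
}

// The direct route: just access the bloody database.
public static class CustomerQueries
{
    public static string GetName(string connectionString, int id)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Name FROM Customers WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", id);
            connection.Open();
            return (string)command.ExecuteScalar();
        }
    }
}
```

There are certainly projects where the abstraction earns its keep (e.g., swapping data stores, isolating tests); the point is that it should be a decision, not a reflex.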

Much of this current popularization is quite similar to students having to learn the “new math” back in the 1970s. Instead of learning to do math succinctly in a way that would allow them to implement it effectively in their daily lives, they now first had to learn the theories of how math was done and then apply those theories to mathematical problems in theoretical fashion.

Trying to do long division in this manner became a convoluted process of sheer nonsensical proportions. This is what the current mantra of design patterns and development paradigms has wrought on the software development profession, turning our field into nothing more than an incoherent mess of processes with their own arcane terminologies, none of which have yet shown anything substantive in the delivery of quality products.

WPF jammed up with a convoluted design pattern from Microsoft

Development with WPF, or the Windows Presentation Foundation, is a classic example whereby a design pattern, specifically Microsoft’s MVVM pattern, is promoted as the proper way of developing such applications over the idea of simple but sound design.

If for whatever reason an application requires the additional complexity of MVVM, then of course it should be considered; but exactly what type of desktop application would require the complexity of a design-pattern framework is beyond comprehension, considering that the pattern does not provide any capabilities that WPF, all by itself, does not already contain.

Windows Forms developers have done similarly fine work without MVVM for years. Does using XAML instead of interface objects somehow make it necessary that the entire process be obscured to some level of ambiguity so that WPF developers can feel unique?
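For comparison, here is the difference in its smallest form: a plain code-behind handler next to the MVVM machinery needed to achieve the same update. A hypothetical sketch; the window, control, and property names are invented:

```csharp
using System.ComponentModel;
using System.Windows;
using System.Windows.Controls;

// Plain code-behind: one handler, one line of work.
public partial class PlainWindow : Window
{
    // Normally generated from x:Name="StatusText" in the XAML.
    internal TextBlock StatusText;

    private void SaveButton_Click(object sender, RoutedEventArgs e)
    {
        StatusText.Text = "Saved";
    }
}

// The MVVM route: a view model with change notification, which the XAML
// binds to (Text="{Binding Status}") instead of being set directly.
public class MainViewModel : INotifyPropertyChanged
{
    private string _status;

    public event PropertyChangedEventHandler PropertyChanged;

    public string Status
    {
        get { return _status; }
        set
        {
            _status = value;
            PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Status)));
        }
    }

    public void Save()
    {
        Status = "Saved"; // typically invoked via an ICommand bound to the button
    }
}
```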

Agile; the ideological “Magic Bullet” that shot everyone in the feet

The convergence of the tool and paradigm mania in 2010 also had a lot of help from within development environments, which were increasingly coming under the influence of a process “ideology” instead of proven software engineering techniques that promoted sound process “models”. This came in the guise of the Agile Development Framework, an incarnation of the seriously flawed “Extreme Programming” nonsense that was responsible for the collapse of the development of the Chrysler C3 payroll system in the 1990s.

Agile came into the technical community with its “Agile Manifesto”, which should have been subject to severe scrutiny then and there. Anything touted as having a “manifesto” is most certainly an ideology, something to be inculcated into a group for reasons other than altruism, and not something to be taken very seriously. Nonetheless, the development community bought into this hyped-up nonsense, which promised techniques that could produce applications increasingly faster under increasingly severe pressures. As we have seen, this worked for small projects, since such projects rarely had any need for formal development processes to begin with, and if the developers were of high quality a good product could almost always be expected. Agile simply put a veneer of professionalism over small-project processes by allowing teams to assume that they were following a prescribed format of development. In reality, and in many organizations, it was merely a cover for sloppy, disorganized development processes that often resulted in poorly constructed software products.

Software development process paradigms had been in place for many years under the umbrella of software engineering practices, both prior to Agile and since. They work! They have been categorically demonstrated to offer succinct standards for quality software development, and they offer a variety of techniques to accommodate the needs of any type of software development project, whether small, large, or huge. So why not use them? Because the people who created Agile thought they were on to something that could do away with such mature constructs as software engineering and provide lightning-speed development, as if Humans would magically become robots. They had found their “magic bullet”.

Nonetheless, they gave us a “Manifesto” in 2001 that only recently has begun undergoing a philosophical change, one that begrudgingly re-admits the original software engineering concepts that would allow Agile to be used successfully in large, complex projects. The fact that this “Manifesto” is coming under this type of scrutiny is not a very convincing indication that it previously did the software development profession much real good. Not surprisingly, software projects have failed at around the same rate with Agile over the past 10 years as they always have, and the reporting on this fact has been relatively consistent.

Conclusion

When you have a society so infused with a sense of complete narcissism through narrowly focused interactions on tiny devices, which are in turn an outgrowth of the depraved foundations of consumerism, compartmentalization, a trademark of quality software development, is hardly going to be a desired trait.

Quality software development cannot possibly be a result of such a sociological environment, in which the profession now finds itself and from which it cannot hope to extricate itself.

Even Microsoft’s flagship product, “Visual Studio”, has followed a similar trajectory with each new release; something the Java community has somehow avoided to some extent, since it has no monolithic vendor such as Microsoft promoting its wares in the same fashion (though it has plenty of peripheral organizations all hyping the mobile computing bandwagon).

“NetBeans”, the current flagship product for Java development, has more or less remained fairly stable in its capabilities over the years. However, this is as much a result of the nature of its community, which has always used more low-level capabilities, as compared to Microsoft, which offered tools with ease of use as a primary motive in their designs. There is nothing wrong with this, but it is rather odd for Microsoft to abandon the very ship it built in order to follow along the Java path when .NET development was a dominant force in its own right. No one in the .NET developer community was demanding such changes, for if they had been, those demands would have been upfront in many of the technical journals at the time.

Nonetheless, since “Visual Studio 2010”, every new release of this IDE has become increasingly crammed with tools, patterns, process paradigms, frameworks, and project types for all manner of software development efforts. The product has redefined the concept of “bloat”, all in an effort to accommodate this new age we are living in. “Visual Studio” has become everything for everybody and nothing for anyone in particular.

With the introduction of the modern mobile phone in the late 1990s and early 2000s, people were finally given a communication device that could serve them well in cases of emergency and important communications. No longer would people be stranded under severe conditions without being able to call for help, or be prevented from making critical and necessary interactions by the impediments of daily conditions such as transit. Along with the Internet, many were hopeful for a better world in terms of communication, social interaction, greater support for social justice through more collaborative online endeavors, and new opportunities in the software development profession.

But the world had Steve Jobs to save us from such a mundane usage of a singular device, when that is all it was ever needed for. In 2007, Pandora’s Box was opened with the iPhone, and the masses couldn’t resist its portent. The software profession has followed suit faultlessly to make itself relevant to a new generation of professionals in a world of schmaltz and glitter, while leaving the soundness of its original foundations behind.

If it wasn’t the iPhone and Steve Jobs, it would have been something else, promoted by some other megalomaniac who couldn’t see past his own heightened sense of importance. In other words, this convergence of technologies was unavoidable, and the sociological changes that came with it were unavoidable, since Humans are what they are.

In this case most simply took the path of least resistance and went with the flow…

[Image: Consumers and technical professionals wait in anticipation for the next new “thing”…]

A Publication of Black Falcon Software