
The “War Game” and Understanding Complex Application Development

7 Jul 2016 | CPOL | 52 min read
Getting an understanding of complex app development

[Figure: Gettysburg war game master screen]

Author’s Note…

Please note that the words game, war game, and simulation are used interchangeably throughout this paper.

Merriam-Webster Definition: “War Game”…

  • a military training activity that is done to prepare for fighting in a war
  • a simulated battle or campaign to test military concepts

Preface

This piece is a somewhat radical departure from those that I have written before, as it is both a sociological and a technical one at the same time. It attempts to demonstrate the use of a completely unrelated subject to assist in the development of one’s mind, allowing it to understand and encompass the large-scale complexities that are most often the underlying foundations of similarly large application development. This piece uses the somewhat forgotten genre of the historical simulation, or war game, to promote this concept.

To be sure, there are a variety of other pastimes and hobbies that can provide the same orientation. Writing, for example, is one of them. However, for the technically oriented mind and also from my own experiences, I have found the war game to be an excellent departure from studying technology to learning something completely new and possibly alien that will help younger development professionals grasp the complexities of their careers from a different point of view.

There are many types of war games that are available to the public as well as several commercial versions of military-grade training simulations that can show how actual military officers are trained in the matters of violent conflict.

From still-popular board games that require face-to-face interaction between players to computer-based simulations with increasingly powerful artificial intelligences that also offer Internet and Play-By-Email (PBEM) options, there is literally something for any period in history one may be interested in studying, recreating, or altering…

An Unrelated Study

There has been a theory among some in the education profession for quite some time that the study of unrelated areas of expertise can actually increase the development of skills in a professional’s actual field of interest. For example, an in-depth study of mathematics will almost always make such professionals as accountants, stock brokers, economists, statisticians, and financial advisors far more proficient in their chosen profession, considering that quite a bit in each of these professions is already based upon mathematics. However, the study of mathematics within each of these fields is more or less relegated to each field’s needs.

As another example, a study of history will almost always benefit a novelist who is interested in writing historical novels, making him or her much better at the art and allowing for greater depth in their works.

We have the same correlations in the software profession with the development of internal systems applications, whereby to be good at the creation of such programs, one must have had training in such areas as compiler theory, hardware\software interaction, Assembler, C++, and the like. Similarly, in areas of scientific application development, developers must have a background in the very scientific fields they are developing programs for.

However, in the area of business applications, a developer’s background and the work he or she does often do not correlate. In fact, many developers who enter this profession have little real-world training that can assist them with their efforts other than their training and education as business developers. It used to be that membership in a related business profession such as accounting would allow one to move into a development area on those merits alone, supplemented by the early training programs provided by companies. However, this is no longer the case with the advent of more modern computer science degrees.

Our side of the profession has always been made up of a somewhat motley crew of personnel; in earlier years, we were many times seen as social misfits, though more often than not that was a result of stereotyping and not reality. Yet today, our area of the profession is still attracting personnel who want to build applications that, for the most part, fall into the business application arena, but who really haven’t been trained for such an endeavor, since there is no real way to teach people what is required of what are called application generalists. In the business development arena, the requirements for development can be as varied as a general species chart.

It is true, though, that a person who has studied accounting and who then wants to become a professional developer or software engineer will do much better with such a background when developing applications for financial services than one who doesn’t have a similar background.

People trained in hospital administration will also have a much better time in developing such administrative applications than those who have only development specifications to go by without such training.

The same holds true for people trained in personnel administration (“Human Resources”; a terrible description that for the most part commodifies human beings into nothing more than objects), who may create such applications for the relevant departments in an organization.

Yet, by and large, these examples also are somewhat niche areas of programming that also fall outside the arena of the generalist. And it is also rare that companies will hire such trained personnel as employees to work on application development that only relates to their educational backgrounds. It happens but it is not the norm.

Most often, when working as a developer or software engineer in a business environment, you will come up against many different application development requirements for which no prior training in any specific area of expertise will be of any great value, with the possible exceptions of logic and an education in general application development.

For the generalist who must be prepared to work on any type of project requirement, beyond a knowledge of programming and design, there is little that can prepare such people to understand and analyze complex application development. In fact, no matter what area of application development one is working in, there is little to assist the professional in learning not only how to design such applications but also how to keep track of their complexities, from the detailed view up to the larger picture of the entire endeavor.

This is a type of capability one either has natural abilities for, or one requires exposure to many different types of application designs over a career to get a mental handle on such large-scale complexity.

In this case, to improve one’s capabilities in this area, one has to literally train his or her mind by some alternative method that provides similar complexity, but in a setting that is naturally enjoyable to the participant, considering that few people will want to spend their off-hours on something unenjoyable just to enhance their own technical aspirations. Developing projects on one’s own, as many do and even enjoy doing, can also lead to a sense of burnout over time, since this additional work does not offer any life balance.

To train one’s mind in this fashion, one has to be involved in activities where one of the benefits is the actual understanding of complexity for complexity’s sake. Years ago, both in high schools and universities, the written essay or research paper was one of the primary tools for learning how to plan out complex subject matter that was to be described in detail. When doing research for a paper or a thesis, one had to do much more than today to get at the necessary material to support what was being written about. Spending hours in public libraries, or what we used to call the “Stacks” at university, and extracting information into written note form was a hallmark of the mind being trained to dig into complex materials in depth. Today, such research is often confined to looking something up on the Internet, even if what is found is highly credible documentation. There is no longer the curiosity to delve deeply into subject matter except among those who are interested in it specifically.

The earlier emphasis on mathematics and the sciences also contributed greatly to training one’s mind to handle complexity. Combined in the manner they were in traditional forms of education, they produced in students the capacities for critical and complex thinking. Today, if one were to look at the many online postings of peoples’ comments or emails, a conclusion could be drawn that in many cases the individual in question could barely write a sentence correctly, let alone think about one.

Deterioration in Critical Thinking Skills

With the advent of the “new mathematics” in high schools in the United States in the 1970s, which led more to frustrating confusion than to anything closely resembling critical thinking skills, along with the increasing penchant for tinkering with educational curriculums and the sociological re-orientation towards an emphasis on business success, traditional approaches to education have suffered and deteriorated to such a point that the frequent reporting on the subject would be laughable if it weren’t such a complete tragedy. When I went to university in the late 1960s and early 1970s, people who majored in business and finance were considered less than worthy of being considered actual students. Unfortunately, and quite unfairly, the same held true for students majoring in education, even in specific areas of study such as mathematics. Though teaching is of itself a highly honorable profession, viewpoints towards the training of new teachers often trended towards derision.

Political manipulation of curriculums, of course, hasn’t happened in many other countries to the extent it has happened in the US, and the comparative results have shown a serious deterioration in US general intelligence to a point that students in the 1960s would have considered impossible to believe. In one study cited by respected nutritional health expert and political activist Gary Null, it was noted that if one were to take an average 55-year-old person from the 1950s and place them in current-day America, they would be regarded as having a genius-level intelligence.

Today the results show that young high-school students and university graduates around the world far exceed in general intelligence what is found in the United States, with the possible exception of the US Asian populations (Asian, Indian, Pakistani), in whose families education is still instilled in young children as a primary goal in their lives.

In many inner-city neighborhoods, poor children intelligent enough to study and work themselves out of their environments are often derided by their peers as not being manly or as violating their peer groups’ mores, fostering a reluctance among these intelligent youth to exhibit such capabilities.

Such sociological trends, combined with the current attachment to “smart technologies” (they should actually be called “dumbing technologies”, since that is their actual effect), have seriously and negatively affected many new professionals entering the software development profession. A recent but rather obscure report found that in the Information Technology profession, new professionals entering the field do not seem to have the same professional capabilities that their counterparts just a decade ago managed to accrue [1].

Poor Critical Thinking Skills and their Effect on Software Development

In terms of Information Technology and its development, this change has manifested itself in the way technology developments are presented to the technical community. Review any technical magazine today, online or off, and what you will find is a plethora of articles describing techniques and details for a variety of new and existing tools, with rarely any explanation as to how such technologies can fit into application development in general. This demonstrates a granulation in the use of software tools in development that matches the early, continuous promotion of abstraction in the Java community in the 1990s. This thrust is what has driven most of the modern paradigms such as MVC, MVVM, and the like, along with ORM frameworks and other similar constructions, such as language syntaxes that often appear to be some level of hieroglyphics, taking C/C++ syntax to new heights of confusion.

However, rarely will you find in-depth discussions of quality application development showing how such new tools were incorporated and demonstrating the results of their use within good system development, which was a mainstay of technical reporting years ago.

Corresponding documentation for many of these tools does not aid in easily understanding them, as so much of it makes too many assumptions about existing knowledge or simply appears as if it were provided as an afterthought. Many professionals who write responses to queries from other technicians never provide their answers in context, requiring further research into how even a correct answer can be implemented.

Microsoft is a star perpetrator of this issue. For example, Windows Presentation Foundation (WPF), as it rose to prominence on the desktop development side of things, was often presented along with the MVVM development paradigm, giving the impression that one could not be used without the other, which was entirely untrue. WPF itself is a highly capable desktop development environment, but it is poorly documented as to how best to use it efficiently.

In a recent online class by Pluralsight, in just under 20 minutes I found the easiest and most effective way of designing data entry forms: using a combination of layouts and panels through the definition of grid-row and grid-column constructs, which can replace the more difficult technique of using margin properties (see the sketch below). Yet such an easy-to-use technique should have been an upfront defining characteristic of WPF, which in fact was similarly based off of Java AWT, itself quite difficult to learn at the time. Instead, technicians have been left to make their own assumptions, often based upon earlier experience, as to what is easiest and best for their own capabilities in order to quickly come up to speed in such a technology, often as a result of working pressures.
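To make the idea concrete, here is a minimal C# sketch of that grid-row\grid-column technique, assuming a plain WPF window; the window, label, and field names are mine and purely illustrative, not taken from the course:

using System.Windows;
using System.Windows.Controls;

public class CustomerWindow : Window
{
    public CustomerWindow()
    {
        // Define the form as rows and columns rather than margin offsets.
        var grid = new Grid();
        grid.ColumnDefinitions.Add(new ColumnDefinition { Width = GridLength.Auto });
        grid.ColumnDefinitions.Add(new ColumnDefinition { Width = new GridLength(1, GridUnitType.Star) });
        grid.RowDefinitions.Add(new RowDefinition { Height = GridLength.Auto });

        // Each control is placed by grid-row and grid-column, not by margins.
        var label = new Label { Content = "Customer Name:" };
        Grid.SetRow(label, 0);
        Grid.SetColumn(label, 0);

        var input = new TextBox();
        Grid.SetRow(input, 0);
        Grid.SetColumn(input, 1);

        grid.Children.Add(label);
        grid.Children.Add(input);
        Content = grid;
    }
}

Adding another row to the form then becomes a matter of adding one RowDefinition and two row assignments, rather than re-balancing a set of margins.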

The end result is a profession that is suffering from what the engineering profession years ago used to call “detailitis”, a common malady among engineers who let the details of their work, and even their lives, override more rational thinking, forgetting that the end result is just as important as getting to it. As a result, in our profession we more often than not concentrate on the limbs of a tree without ever understanding that an entire forest surrounds it.

Back to the Future

The issue, then, is how current professionals who want to produce high quality results can get back to the basics of using their skills within the context of the larger designs they are working against. And this is an issue when development paradigms such as Agile and its variants are foisted upon developers, promoting the underlying idea that you never really know what the larger design is; the fear is that if you do, you will fall back into the dreaded “Waterfall Approach”, which is anathema to the proponents of these newer paradigms.

So how can current professionals better train their minds to see projects in their larger context while also managing the extraordinary detail they have to deal with?

There are many ways to do this. As previously mentioned, writing is one of them, as is the reading of complex subject matter such as history, anthropology, and the other social sciences, which can provide you with an appreciation for great events, cultural and sociological changes, and the factors that led up to them. However, for the level of detail that software developers have to deal with, what may be required is some form of enjoyable pastime that not only has a tremendous amount of detail but is goal-oriented as well, in which the use of such detail lends itself to the accomplishment of an end result: the strategic goal.

Enter the War Game…

Early History

War games became very popular in the 1960s and 1970s with the introduction of the Avalon Hill [2] series of games, which covered a wide variety of historical simulations. The Avalon Hill series of games also sparked a new interest in military history that resulted in the development of games by other companies [3] over the years, along with a number of publications that compared the games against the actual histories of the events.

However, war gaming did not suddenly sprout out of nowhere as the brainchild of an innovative company that enjoyed developing historical simulations. The first real war game [4] was actually developed in 1812 by Lieutenant Georg Leopold von Reiswitz and his son, Georg Heinrich Rudolf von Reiswitz, both of the Prussian Army. Their system was developed around a special table for their king, Friedrich Wilhelm III, as shown in the photo below…


The First War Game Board – 1812 [5]

The table allowed for the development of multiple scenarios through the use of underlying blocks of terrain that could be combined into any field map desired, all based upon a square grid system that would remain in popular use for many years until it gave way to hexagonal-style maps, which are still in use in many of today’s war games.

The original game also included counters, or physical blocks that represented the various military formations of the opposing armies, a combat resolution system, a complete command structure that also included “line of sight” command capability, as well as the capability to introduce the “fog of war”, the uncertainty in the planning phases as well as operations during an actual battle.

The original rules, unfortunately, were so cumbersome that many Prussian officers did not enjoy using the game for tactical training. Nonetheless, it eventually gained acceptance as a training tool under the command of Helmuth von Moltke, Chief of the Prussian General Staff under Otto von Bismarck from 1857 to 1887. Moltke invented the concept of “Blitzkrieg” that would first be used by the Wehrmacht in World War II. For those who are interested, the military concept of lightning, mobile warfare known as “Blitzkrieg” was originally designed in the early 1860s and was intended to be used against the Austrians in the battle of Königgrätz, the defining battle that would bring the nation of Germany into being. Thought to be overly cruel by the Prussian General Staff, the design was shelved until the attack on France in 1940.

This first war game was originally called and still remains, “Kriegsspiel”, which in German translates into “war play” and is thus idiomatically accepted as “war game”.

“Kriegsspiel”, which eventually had its original rules modified for easier play, contributed greatly to the training of Prussian officers, as they found themselves defeating their opponents on the battlefield more often than not with the tactical experience they gained by using this new training technique. News of this new training tool eventually spread, and other militaries began adapting it to their own training needs.

It was commercialized into the famous Avalon Hill game of the same name in 1970.

Today, one can still get the original rulesets for all the variations of this battlefield simulation, allowing people to play it in its original form [6].

War Games Today

As microcomputer technologies continued to evolve through the 1980s, so too did the platforms that war games were presented on. The original microcomputer-based war games became popular under the banners of “SSI” and “SSG”, though neither of these publishers actually provided a true war game experience, as there was too much interference from the computerized algorithms. In 1989, “Atomic Games” took up the banner and released a number of World War II battlefield simulations. This time, the company got the computer mapping correct, as well as the movement of formation icons. However, the AI was so poor that one never quite got the feeling of actually competing against some level of intelligence, which kept these games from being truly enjoyable. “Atomic Games” would quickly fade from computer history, though not before contributing standardized war gaming graphics for the PC.

Until Talonsoft [7] made its entrance into the war gaming market in 1995 with its famous Civil War series of games, a lot of war gaming still remained with the game board or, in the case of miniatures, on the tabletop. With Talonsoft, the first real PC-based war game was released in the very form that all later war games would come to emulate. Graphically appealing, their games were also highly playable and enjoyable against a mostly competent artificial intelligence, while eventually also allowing for play over the Internet with human opponents.

For some reason, though, Talonsoft quickly sold off its franchise in 2000, and though the new owners made an attempt to keep the original games in front of the public, the loss of Talonsoft to the gaming community was such a huge disappointment that these very well crafted games eventually disappeared from the market. Nonetheless, later versions of these games are still available from John Tiller Software [8] in his “Civil War Battles” series.

As technologies continued to advance in the areas of microprocessor speed and graphics, war gaming, one of the few really thought-provoking experiences that people could find on the PC other than the adventure game, began to languish with the emergence of the modern-day first-person shooters, which are more about killing than anything else.

Currently, there are only a few well-known development houses still producing such games, such as John Tiller Software [8]. However, there are many smaller companies that sell through distributors such as Matrix Games [9].

John Tiller Software directly offers the war game in its most advanced form while also providing such tools for the US military. Matrix Games also offers a highly complex and advanced set of simulations with Gary Grigsby’s “War in the West” and “War in the East”, two of probably the most advanced commercially available war games anywhere [10].

Why the War Game as an Unrelated Study?

With the convergence of smart technologies and people’s reliance on them, thought-provoking activities have increasingly been relegated to the past, with the idea that we are now entering some form of knowledge era. What that exactly means is questionable, considering that it is the smart devices that appear to have all the knowledge while most people have been filling their brains with useless junk. Score one for the machines!

Nonetheless, for those who are seeking to enhance their abilities in large application development and who have not had the advantage of earlier educational techniques while in school, war gaming can actually fill this void to a degree by training such minds to grasp large-scale concepts while having to deal with the details that make them up.

Some people would suggest Chess as an alternative, and they would be correct in doing so, since both Chess and war games have a lot in common. In fact, Chess was actually designed as a strategy game.

Both types of games have immense complexities. With Chess, they are rooted in the ability to foresee many moves ahead of the current board and the varying changes in strategy that such moves could portend. With war games, such complexity is rooted in the interpretation of events on the playing board (or screen) and what they could possibly lead to. Due to its mechanics, however, a war game is significantly different from Chess.

With Chess, the ultimate goal is to capture the King so that the opponent can no longer move it to a safe position. With the war game, the goal is to inhibit the opponent from making any gains on the playing field while also managing to capture a pre-determined set of objectives. Whether that is accomplished by superior movement and fortification, by the destruction of the opposing forces, or by a combination of the two, preventing the opponent from maneuvering for gain is similar in that regard to Chess.

Where Chess and war games starkly differ is in the way each is learned. With Chess, the mechanics are very easy to master while mastering the game itself is very difficult. With the war game, the mechanics are quite complex, while once they are mastered, mastering the game becomes somewhat easier, though not in all cases (Gary Grigsby’s “War in The West” and “War in The East” series are some of the most complex PC-based war games ever developed, and they are still being updated today [10]).

Both types of games have immense value in training the mind, and it is mostly up to the individual which type of game suits their fancy. As a military historian, I have always found war gaming to be of great interest.

Pattern Recognition in Development & War Gaming

Anything that involves a high level of detail used in pursuit of a goal can serve as a training tool for professional developers looking to acquire the ability to understand large application development. There are some complex endeavors, however, that do not lend themselves to such training. Hand-drawn art is one of them; not because it doesn’t have its own set of complexities to master, but because the type of mind that is good at creating such art would not necessarily be equally adept at developing something that requires pattern recognition, which is a primary requirement for our profession. On the other hand, the graphic arts can play such a role due to the manner in which that kind of art is created. However, one still has to have some sense of the artistic to be successful in this medium.

Pure art, though, is about creating abstractions, unless one goes towards something like the architectural arts, where not only artistic style is required but the ability for pattern recognition as well. Again, such people would usually not be found taking up a career in software development.

What exactly is pattern recognition? It is the ability to discern patterns within complexity.

In the early days of commercial computing, pattern recognition IQ\aptitude exams were often given to prospective developers looking to be trained in computer programming. This is also why many companies were looking for people with talents in Chess and\or Mathematics.

If interested in becoming a professional developer, some of the best courses one could take in both high school and university were those related to Algebra, a study in Mathematics that is highly pattern oriented. For example, factoring a complex polynomial uses a combination of patterns to discern the various steps, while a simple polynomial requires only a few such patterns to factor.
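As a small illustration (my own example, not taken from any particular curriculum), a simple polynomial yields to a single pattern, while a higher-degree one requires chaining patterns together:

x^2 + 5x + 6 = (x + 2)(x + 3)
(one pattern: find two numbers whose product is 6 and whose sum is 5)

x^4 - 16 = (x^2 + 4)(x^2 - 4) = (x^2 + 4)(x + 2)(x - 2)
(the difference-of-squares pattern, recognized and applied twice)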

The expectation behind such early screening was that the better one was found to be in these areas of talent, the better a developer he or she would most likely become. This hasn’t changed for today’s new entrants into our profession, though many of the names we assign to pattern recognition capability have most likely been redefined.

Without such a capability, a developer would never be able to design complex software tools, in which the makeup of the intrinsic components is all more or less based on patterns learned in previous endeavors of development. This is because without such a capability, a developer would literally have no understanding of where to start any form of software creation.

There is no doubt that professionals new to the field have only a cursory period of training in related studies to assist them in acquiring an understanding of how applications are developed. And as experience in the field grows, any serious-minded developer will find how pattern oriented the profession actually is. The patterns being discussed here are not “design patterns”, nor do they come under the heading of “best practices”; they are the natural underpinnings of how any software application is built.

This is why, until the last several years, many senior software developers eschewed formal “design patterns” as nothing but solutions looking for problems, as earlier software engineering analysts had discovered. The reason such analysts gave was that good developers follow inherent patterns in development naturally, making the formalized application of such paradigms completely unnecessary in most instances.

Today, many software developers are encouraged to use “design patterns” simply as a result of a herd mentality that promotes them as the best way to implement applications. In a sense, this encouragement is also a result of the sociological effects of smart devices on society, whereby people are increasingly off-loading their brains onto such equipment, thinking that by doing so their lives are being made much easier. In reality, all they are doing is decreasing their ability to think critically. The same is true of this rush to implement everything around a pre-ordained pattern. Developers simply have to pick which patterns they want or need to develop their code, without having to think such patterns out on their own, which they would normally have done without any outside pressure to institute standardized ones.

The other reason why “design patterns” are being foisted on so many situations is the ridiculous design concept of the “what if” scenario, whereby technical managers propose the need for extensibility in case “this or that” requirement comes in. Certain “design patterns” can provide for such extensibility, but in reality, most application development is never called upon to support such prognostications of the future.

As baseball legend Yogi Berra used to say…

“It’s tough to make predictions, especially about the future.”

The same holds true for development endeavors as the majority of them have to be done against “what is” and not “what if”…

Thus, the pattern recognition we are discussing here is what a developer requires as a non-codified understanding of his or her profession.

For example, if a software application requires an interface with a database, a developer has several decisions to make prior to proceeding to code such processes…

  • Which type of database is to be used? (relational or document store [NoSQL])
  • If relational, how is the database to be accessed?
    • ORM (Object Relational Mapping)
    • directly (e.g.: ADO.NET, JDBC)
  • If relational, which one is to be selected?
  • If document-store, which one is to be selected?

Once all such decisions have been made, a pattern of implementation will automatically be defined by the nature of those decisions. For our purposes, let’s say we have decided upon the use of a relational database, for which we will take advantage of the large-scale capabilities of PostgreSQL. Since we prefer the greater efficiency of direct access, and we will use the .NET development environment for this effort, we will obviously use ADO.NET.

The expected result of any basic design for such a support system could appear as follows…

A Standard N-Tiered Direct Database Implementation Design

In this example, we have a single public class (Database Dispatcher), which is used to send calls to the components that, in turn, access the database and return data from any of the modules in the library they comprise, with the exception of the actual database access module. That module would have its own independent library, since it would be used by multiple calling modules and, being rather generic, by other applications as well.

All other processing modules (Access Type) would be comprised of internal\friend (C#\VB.NET) methods that could be distributed in any fashion the application may require. I always prefer to build such modules based upon the tables they access. Hence, a Customers access-type module would primarily access a Customer table, though calls to other tables could be required if they are part of the database code primarily calling for data from the Customer table. Other developers prefer to organize their modules by process, such as an Order module.

I always return either a single structure or an array-list of structures to the original calling method. They are light-weight, can be easily extended when necessary in correspondence to any changes in the actual database code, and require a minimum of database-related definitions in the calling modules. Though not perfect, this decouples a lot of the actual ADO.NET code from the top-tiered calling modules, where theoretically you should have no database coding at all. Structures are also easy to work with in these tiers since they are always in an object format.
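To make the pattern concrete, the following is a minimal C# sketch of how the dispatcher and one access-type module might look against PostgreSQL, assuming the Npgsql ADO.NET provider; the class, table, and column names are purely illustrative and not taken from any actual project:

using System.Collections.Generic;
using Npgsql;

// The light-weight structure returned to the calling tiers.
public struct CustomerRecord
{
    public int Id;
    public string Name;
}

// Internal access-type module; primarily reads the Customer table.
internal static class CustomerAccess
{
    internal static List<CustomerRecord> GetCustomers(string connectionString)
    {
        var results = new List<CustomerRecord>();

        using (var connection = new NpgsqlConnection(connectionString))
        using (var command = new NpgsqlCommand(
            "SELECT customer_id, customer_name FROM customers", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(new CustomerRecord
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }

        return results;
    }
}

// The single public entry point; upper tiers never see ADO.NET types.
public static class DatabaseDispatcher
{
    public static List<CustomerRecord> GetCustomers(string connectionString)
    {
        return CustomerAccess.GetCustomers(connectionString);
    }
}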

Many professionals may have their own disagreements with such a pattern for database access, but it has been fairly standard for n-tiered development for many years and has worked very well, with practically no issues, over the course of my long career developing database applications.

As an aside, recently there have been discussions concerning Micro-Services as a departure from Service Oriented Architecture (SOA), which became popular in the mid-2000s, both inferring a break from standard n-tiered development. However, the results of SOA were not what was expected, as it simply became a new standard by which to develop external services. This is because technical personnel completely ignored the ingrained sociological habits of organizational structures. As SOA development became more prominent, it followed a trend similar to the one Object Oriented Programming (OOP) was also expected to avoid. With OOP, the promise was re-usable modules. With SOA, the hope was better cross-organization services that provided common functionality. Both endeavors were miserable failures. The reason for the failure of OOP was that organizations never accommodated the underlying foundations for making OOP modules re-usable, which required organization-wide repositories that developers had access to. SOA failed as a result of the still inherent desire of individual departments to maintain control over their own data, turning even successful SOA production deliverables into closed silos for those very departments. Micro-Services may offer a better granulation of SOA development, but it is quite possible they will suffer the same fate. Technology evangelists often create the hype surrounding their preferred technologies in a vacuum, where the larger aspirations of corporate personnel in general do not tend to fit too well.

Nonetheless, no matter what type of design you implement in any of your development efforts, if it is based upon direct access to a database as is shown here, it will follow to a similar degree the pattern shown in the above image.

Such implementation patterns can be found throughout any software development process. And one uses such patterns to understand large application development in order to facilitate the construction of such software. However, as such development becomes more complex, more experience in understanding how to build such designs is required.

If you notice in the graphic above, the implementation pattern is made up of individual modules or components, as any such implementation would be. If the coding is well done, then the end result will be a database library that works well for the retrieval, insertion, updating, and deletion of data for a particular application. In the end, such a library will be relatively simple to modify and extend as new requirements are made.

Learning to play a war game follows a similar path to accomplishing such a goal: you must work with the equivalent of an IDE and with individual components that make up a combined entity, and then use that combined entity with others to accomplish a specified mission. By looking at war gaming from this vantage point, one can understand how it could develop and\or refine one’s skills at pattern recognition while also teaching the ability to see not only the details of a situation but how such details work within an overall picture of tactical and strategic accomplishment.

Similarly, war games, like development, demonstrate re-usable patterns, since all battlefield engagements fall to some degree into a previous pattern of engagement fought before. Even in modern-day tactical engagements, ancient battlefield techniques can be found in use. For example, the “Battle of Cannae” between Rome and the Carthaginian state is still seen today by military historians and analysts as one of the finest examples of defensive warfare ever executed on the field of battle, involving the complete envelopment of the Roman Legions. Its lessons are still taught in military academies around the world.

Like application development, war games are comprised of a myriad of details that must be constituted into groups (which in development would be the aforementioned equivalent of modules), which in turn are used tactically to attain a strategic goal. Depending on the period and nature of the war game, tactical and strategic goals can vary greatly.

Though never touted as a war game, Sid Meier’s famed “Civilization” series [11] had a wide variety of details that had to be used to form groupings in order to become the most powerful civilization in the game. Such details included economics, political aspirations, population morale, invention and discovery, birth rate, acquisition of foreign cities and powers, military units, and much more. With every new release of the game, attaining the strategic goal of power over all your rivals became more complex and difficult.

For many years, this highly thought-provoking challenge stood at the top of the entire gamut of the gaming industry.

On the other end of the scale, there have been quite a number of war games developed at the squad or platoon level, which meant that the goals were never strategic in nature but only tactical, such as the capture of a building or a hill. Nonetheless, how the player used the available resources within his or her squad would determine success or failure.

In the end, all such simulation gaming follows a general, pre-defined pattern of development…

Unit Component > Combined Group(s) > Tactic(s) > Strategic Goal

Components of a War Game are Similar to Objects of Development

For purposes of demonstration, the original John Tiller game, “Gettysburg” [12], will be used, as it offers all of the standard artifacts for such a simulation.

The Unit Structure

In a war game, the most basic component is a unit. The unit can represent any type of military asset, and in many games units are reflective of those used by actual military forces. Units are often displayed as standardized NATO graphics [13] in commercial versions of simulations used by the armed forces of various nations. For most simulations, however, unit representation is most often designed by the publisher in some form of 2D and 3D graphics, the latter bringing some field realism to the situation. The unit is the only “pre-built” component in such games offered to the player that he or she can manipulate during game-play. Everything else has to be developed using the unit component.


NATO Infantry


Confederate Cavalry 2D


Confederate Cavalry 3D

Units can be any type of military configuration, from a single soldier, vehicle, or weapon type (e.g.: artillery with crew) to groups of troops from the squad or platoon level all the way up to a division. Such variations are dependent on the game.

The “Gettysburg” game used here as a basic example is designed around company-level units (60-250 troops).

Like code, which is used to create methods, properties, and eventually classes\objects, you use these units to form larger or smaller groups to attain a certain goal for each particular group. Such groups can be used for missions such as, for example, a tactical reserve, where a group of units would be placed on the battle-map/playing-board in such a way that they would be able to lend support to one or more other groups being used in an attack formation against an opposing player’s formations.

All units come with differing sets of “properties” and “methods”, again dependent on the type of unit being represented. This then makes any unit an “object” within the game environment, and no doubt publishing houses have used hierarchical inheritance to define units belonging to a particular side in the game.

To clarify this type of inherent structure to a unit, we can list example properties and methods of an infantry unit in the table below; a code sketch of how such a unit object might be modeled follows the table.

Properties:
  • Strength Factor (force in attack or defence)
  • Fatigue Factor (rate of exhaustion of troops)
  • Quality Factor (quality of unit combat capability)
  • Range Factor (distance for firing)
  • Movement Factor (number of positions unit can move)
  • Current Facing Factor (direction unit is pointed)

Methods:
  • Move
  • Fire
  • Face (facing of unit)
  • Attack (standard combat resolution due to fire)
  • Melee (hand-to-hand combat)
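As promised above, here is a minimal C# sketch of how such a unit “object” might be modeled using the properties and methods from the table; the names and structure are illustrative only, and no actual game engine is being quoted:

// A unit as an "object": the base class holds the tabulated properties,
// and each side's units inherit from it.
public abstract class Unit
{
    public int StrengthFactor { get; set; }   // force in attack or defence
    public int FatigueFactor { get; set; }    // rate of exhaustion of troops
    public int QualityFactor { get; set; }    // quality of unit combat capability
    public int RangeFactor { get; set; }      // distance for firing
    public int MovementFactor { get; set; }   // number of positions unit can move
    public int CurrentFacing { get; set; }    // direction unit is pointed

    public abstract void Move(int toColumn, int toRow);
    public abstract void Fire(Unit target);     // ranged fire
    public abstract void Attack(Unit target);   // combat resolution due to fire
    public abstract void Melee(Unit target);    // hand-to-hand combat

    public void Face(int direction)             // facing of unit
    {
        CurrentFacing = direction;
    }
}

// Hierarchical inheritance: a concrete infantry unit for one side.
public class ConfederateInfantry : Unit
{
    public override void Move(int toColumn, int toRow) { /* movement rules */ }
    public override void Fire(Unit target) { /* ranged combat resolution */ }
    public override void Attack(Unit target) { /* attack resolution */ }
    public override void Melee(Unit target) { /* hand-to-hand resolution */ }
}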

Depending on the game, the detail of a unit’s structure can be daunting. In commercially available versions of such simulations designed for actual military training, the detail for a single unit can be quite extensive, requiring the player to have a complete understanding of a unit’s capabilities in order to use it efficiently.

The Terrain Structure

The second major structure in such games is the “terrain”, which is most often displayed as a compendium of tiles, each one representing a positional space on the playing map. Many war games, probably still the majority of them, have their maps defined as arrays or lists of hexagonal structures, though quite a few games use the line-of-sight technique, which involves pixel-based movement. In this latter type of playing board, no defined movement areas are shown on the map; all units move in any direction desired, with the computer calculating how far any one unit can travel.

In our case, using what is probably still the most popular approach with serious war gamers, the “Gettysburg” simulation uses the hexagon map system shown below. In this image, the terrain structures shown comprise meadow-land, roads, and forest.


Hexagonal Playing Map

Like the unit structure, a terrain structure has “properties” and “methods”, all of which must be understood in order to combine the movement of a unit or group of units to properly place these simulated forces in an advantageous position against an opposing player. A sketch of how such tiles and the map that holds them might be represented follows the table below.

Properties:
  • Terrain Type (e.g.: field, forest)
  • Terrain Height (relative to sea level)
  • Terrain Mix (additional object; e.g.: wall, road)
  • Assigned Owner (terrain may be part of a nation)

Methods:
  • Get relative position in map (x, y coordinates)
  • Get ID (if one is assigned)
  • Get owner
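And here is a companion C# sketch of how terrain tiles and a hexagonal map might be represented, again purely illustrative; the neighbor calculation assumes an “odd-q” offset hex layout, one common convention for hex grids, and is not taken from any actual game:

using System.Collections.Generic;

// One tile of the map, carrying the tabulated terrain properties.
public struct TerrainTile
{
    public string TerrainType;   // e.g. field, forest
    public int Height;           // relative to sea level
    public string TerrainMix;    // additional object, e.g. wall, road
    public int OwnerId;          // terrain may be part of a nation
}

// The playing board: a two-dimensional array of hexes.
public class HexMap
{
    private readonly TerrainTile[,] tiles;

    public HexMap(int columns, int rows)
    {
        tiles = new TerrainTile[columns, rows];
    }

    public TerrainTile GetTile(int column, int row) => tiles[column, row];

    // The six adjacent hexes under an "odd-q" offset layout, where odd
    // columns are shifted half a hex relative to even ones.
    public IEnumerable<(int Column, int Row)> Neighbors(int column, int row)
    {
        int shift = (column % 2 == 0) ? -1 : 0;
        var candidates = new (int C, int R)[]
        {
            (column, row - 1), (column, row + 1),
            (column - 1, row + shift), (column - 1, row + shift + 1),
            (column + 1, row + shift), (column + 1, row + shift + 1)
        };

        foreach (var (c, r) in candidates)
        {
            if (c >= 0 && c < tiles.GetLength(0) && r >= 0 && r < tiles.GetLength(1))
            {
                yield return (c, r);
            }
        }
    }
}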

The Map (Playing Board)

The map is the playing board and, in the case of the “Gettysburg” simulation, can be composed of several hundred or even thousands of hexagons with a multitude of terrain types. The playing board is contained within the equivalent of an IDE, as can be seen in the image below, with which a player can control every aspect of the simulation, from screen size to how units are represented (2D, 3D).

In all such environments, clicking on a unit or a group of units will display all of the relevant information the player requires in order to use them. In the screen below, a group of units has been selected and their corresponding information is in a panel on the left.


Gettysburg Simulation Master Window

(Notice the large number of hexes in the scenario by observing where the scrollbars are positioned.)

How This All Relates to Understanding Complex Application Development

At this point, you must have begun to wonder how playing historical conflict simulations can help one enhance one’s understanding of large application development. It is a fair question. However, to answer it adequately, you also have to have a basic understanding of what a war game is.

To begin, and as was mentioned earlier, war gaming may appear to be nothing more than a niche hobby for enthusiasts, but it also represents the kind of thought-provoking challenge that is often no longer found among modern pastimes. “First Person Shooters” such as “Battlefield” are the dominant aspect of PC\console gaming today, and they are rightfully called video games since they hail back to the popularization of arcade games in the 1980s, though they are now far more sophisticated. Such games attempt to project a sense of realism in battlefield environments that in reality can never be equalled. If you have ever experienced real combat, then you understand the significant disparities between reality and controlling digital avatars on a computer or a game console. Such games are also designed to provide a sense of high-speed movement, which likewise cannot be realistically replicated on a computer screen; nor can a person’s view of the environment, which is often limited, since peripheral vision is hard to simulate on a single screen, though many gamers may have multiple screens.

At best, such games allow for a lot of tension and hyper-activity that, if indulged in for too long and too consistently, can have lasting effects on one’s personal health. Whatever thought-provoking challenges there are in these games, they are most often secondary to the primary requirement of simply killing your opponents. One does not develop source code in such a manner.

War games, even more so than their sister equivalent, the adventure game, provide challenges that often mirror similar challenges in the professional development environment: you are given a project that will require certain components, and you must use all of them to create a final deliverable. This is exactly the type of analysis that is also required in war gaming. You are given a historical situation along with a certain set of units, and you must use them all to develop a series of tactics that will accomplish a certain strategic goal.

In the case of the “Gettysburg” [14] simulation, you must not only capture vital objectives (e.g.: towns, hilltops) in the individual scenarios (smaller parts of the overall battle), but in the campaign, which reflects the entire three-day battle, you must also capture key areas on the map and hold them to the end of the entire engagement in order to defeat your opponent.

For example, one of the key areas in this particular simulation (as it was in the real battle) is the one known as “Cemetery Ridge”, against which General Robert E. Lee had Major General George Pickett [15] attempt a frontal assault on the last day of the battle. Attaining and holding this vital area in the Pennsylvania farmlands is one of the major keys to winning the engagement. However, this is a far more difficult task than one may expect. First and foremost, it is exceedingly difficult to assault entrenched positions such as those the Union forces held. Second, the Union Army had rifled muskets, providing them with murderous accuracy both individually and in volley fire, while the majority of the Confederate Army used smooth-bore muskets, which are known to have much lower accuracy over distance. The results were to be expected: during Pickett’s charge, scores of Confederate soldiers were literally mowed down in their attempts to storm the hilltop.

To give you an idea as to how difficult this particular campaign is, I played the full campaign scenario with an earlier version of this game that was produced by Talonsoft. In this particular version of the battle, it would take 165 turns to complete. My goal was to prove through the simulation, as historians have also demonstrated, that Lee should never have lost this battle in the way that he did. By turn number 80, I had secured, based on the internal point system, a “minor victory” for the Confederacy. In addition, I had developed a long-term strategy that I believe would have been good enough to maintain that victory status to the end of the game. Unfortunately, I never got to complete the simulation, since by the time I was halfway through I was also in the process of upgrading to a new machine, and I inadvertently lost the saved file for this experiment.

Just a few notes on General Lee before we continue… Lee lost this battle due to his overriding predilection for frontal assaults while ignoring common sense when it came to military science. Some historians have contended that due to this flaw in Lee’s thinking, he was a primary factor in the final loss of the war by the Confederacy. These historians substantiate their thesis with the statistics of losses for each major battle that Lee engaged in, showing that in such engagements Lee would lose on average 20% of his men, a staggeringly high casualty rate to sustain on such a consistent basis. The result was that Lee, over time, inflicted severe damage on the fighting ability of the Army of Northern Virginia, the major fighting force of the Confederacy and the finest force in the field for a good part of the war until Gettysburg. At Gettysburg, Lt. General James Longstreet, one of Lee’s highly competent senior officers, strongly recommended against such an assault and instead proposed a reconnaissance of the Union forces as well as syncing up with the Confederate cavalry, which had not arrived on the battlefield due to a delay in Jeb Stuart getting his forces to the field. Lee ignored such recommendations and went ahead with the battle, and the rest is history.

This should not take away from Lee’s strengths as a commander. He was adored by his men and was thus always able to instill a very high morale in them, which has always been critical in combat and is well modeled in many simulations. He was also an excellent strategist: the defences of the Confederacy that he built earlier in the war allowed the Confederacy to continue fighting, extending its war effort, some believe, by as long as 18 months. Nonetheless, Lee was no battlefield tactician, and had he allowed his better-suited officers to design the tactics for his battles, there is a chance that the Confederacy would have actually won the war, considering the amount of time it took the Union forces under competent generals to become cohesive fighting units.

Though you may be wondering how this additional information provides any insight into using war gaming as an enjoyable aid to understanding complexity, the reality is that such games are intrinsically tied to actual historical events and, with current technologies, are now able to yield the same results as those in history if the same tactics and strategies are used by the players. Some miniature (tabletop) war game enthusiasts do just that in order to learn and understand the actual events that occurred in engagements.

The detail that has to be understood and controlled to your own advantage in such games can be staggering, and it is very similar to what developers have to deal with when building large, complex applications.

Yet the more detail there is, the more work it requires to develop the tactics and strategies that yield a positive end result.

A good example of a war game that mirrors such large-scale detail along with highly realistic results is HPS Simulations’ “Tigers Unleashed” [16], one of the earlier commercial editions of a war game that simulates actual military officer training requirements when used by professional training facilities.

A more recent example is the updated Gary Grigsby game, “War in The West”, which, though not designed for the training of professional officers, is nonetheless designed with enough detail and sweep to provide experiences similar to those found in professional officer curriculums that use such tools. Gary Grigsby, as a designer, has an industry reputation for incorporating such a level of detail that his games are not for the faint of heart.

Logistics, Tactics & Strategy – The Bread & Butter of Warfare

As with application development, logistics, tactics, and strategy are the primary building blocks for engaging in any military conflict. They somewhat mirror the user-built or pre-built components that developers use to create their end products. The only drawback is that in conflict, you do not have the option of purchasing a 3rd-party tool; you have to develop everything yourself.

“Logistics”, like a database engine, could be viewed as the lowest common denominator of everything else. Without an understanding of logistics, no conflict in the past, the present, or the future would or will have any hope of being winnable. “Logistics” concerns the definition, supply, and movement of troops, vehicles, aircraft, and ships, the defining of headquarters command centers, and the entirety of the administrative needs of such military forces. If you do not supply your troops and equipment properly, you will lose an engagement. If you do not define your forces properly for an upcoming engagement, you will lose it. If you do not move your forces and equipment into advantageous positions, again, you will lose the engagement to a better-positioned opponent.

Logistics

“Logistics” is the literal database of knowledge that must be understood and refined prior to and during any engagement. No battle has ever been won without it, which is why military professionals quickly come to learn that it is “Logistics” that wins battles and nothing else. A good example in military history of the understanding of logistics comes from the 19th-century British\Zulu wars in Africa. On the field of battle, the Zulus, even with primitive weaponry, were the equal of the British troops and their disciplined rifle fire. Many engagements were questionable wins or draws between the two due to the fierceness of the Zulu warriors. However, the Zulus fought with a sense of honor that was no longer prevalent in such conflict. Zulu tactics mirrored those of the famed Greek Hoplite soldiers of antiquity, whereby once a battle had been won or lost, both opponents would leave the field to fight another day. Not the British, who understood the concept of pursuit, something the Zulus would never have considered, as it was dishonorable to them. The British, in turn, at the completion of a successful engagement, would go after the Zulu warriors looking for their supply bases while murdering any warriors they came across. When they found the supply bases, they burned them to the ground. In the end, this destruction of the Zulu supply areas decided the entire war, and the British were free to continue on with their imperialistic ambitions in these areas. Nonetheless, the average British soldier gained a tremendous amount of respect for the Zulu warriors, and so too did the Zulus for the British. At the Battle of Rorke’s Drift, the Zulus nearly overwhelmed the entrenched British but, admiring their bravery and persistence, stood down from a final attack…

Tactics

“Tactics” is the design of movement on a battlefield, whether offensive or defensive, that will incur the greatest damage on an opponent in an assault or provide the best protection for the troops during defense. Both are equally important. In fact, commanders who are well versed in the “arts of the defensive maneuver” are most often even more competent than those who have only mastered the offensive position, because of the greater difficulty of movement and the fewer options available to a commander in such situations.

Successful “Tactics” are all designed to work in unison with each other over time to ensure a strategic goal, making “Strategy” the topmost construct in all warfare.

Strategy

“Strategy” is the design for accomplishing an endpoint to any conflict through the attainment of intermediary goals, which are provided for by “Tactics”, which in turn are supported by “Logistics”. For example, a tactic would be the taking of a hilltop from the enemy in an engagement. However, it is the strategy that provides the reason for taking that hilltop. Perhaps such an accomplishment will give one’s forces a far superior view of the overall battlefield, allowing command headquarters to be placed in a far more beneficial position, and in turn allowing senior commanders to develop their plans in a better environment for concluding the current conflict. There was a reason why Napoleon always took the best vantage point possible during his engagements, with a pool of messengers at the ready: by doing so, he could command the sweep of his forces in a more direct manner.

Good “Strategy” can only be developed based upon the “Logistics” and “Tactics” that precede it.

“Strategy” in and of itself can never win a conflict, but without it, everything else becomes meaningless. This can be easily demonstrated in the Mid-East conflicts in which the United States has foolishly involved itself since 2001. None of them was based on any coherent strategy, merely on vague goals devised by oil barons and political neoconservatives whose only thought about anything is to privatize it or conquer it. The results have been horrific.

For example, in the 2003 US invasion of Iraq, US troops were sent into a conflict theater without proper defensive armor, which caused scores of casualties. When complaints began to flood in from the families of these soldiers regarding protective vests and the lack of armor plating on military vehicles used for patrol and reconnaissance, then Secretary of Defense Donald Rumsfeld famously replied that you go to war with the army you have, not the army you might want. The remark demonstrated the arrogance of the men and women who promoted and executed this conflict: they had no real strategic goals except the fantasies they had devised in their own minds, which they then set out to implement without even a sense of the proper logistics needed to attain them.

“Strategy” is the end result of any operation or set of operations designed to attain it. It is the same as the developer’s end result of a well-functioning application being implemented into production: how you build such an application is based upon your “Logistics” (resources) and the “Tactics” (designs) used to create it.
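To make the analogy concrete for developers, the minimal sketch below models this layering in Python. It is only an illustration of the analogy, not anything drawn from a real project; all class, method, and goal names are invented for the example:

    # Hypothetical sketch: "Strategy" rests on "Tactics", which rest on
    # "Logistics", just as an application rests on its designs, which in
    # turn rest on its resources.

    class Logistics:
        """The resources everything else draws upon (the 'database engine' layer)."""
        def supply(self) -> str:
            return "database engine, build servers, developer hours"

    class Tactic:
        """A single design decision that consumes resources toward one goal."""
        def __init__(self, logistics: Logistics, goal: str):
            self.logistics = logistics
            self.goal = goal

        def execute(self) -> str:
            return f"{self.goal} (drawing on: {self.logistics.supply()})"

    class Strategy:
        """The end result: meaningless without the tactics that support it."""
        def __init__(self, tactics: list[Tactic]):
            self.tactics = tactics

        def realize(self) -> list[str]:
            # A strategy is achieved only through the tactics beneath it.
            return [tactic.execute() for tactic in self.tactics]

    if __name__ == "__main__":
        resources = Logistics()
        designs = [
            Tactic(resources, "implement the data-access layer"),
            Tactic(resources, "build the reporting module"),
        ]
        application = Strategy(designs)
        for step in application.realize():
            print(step)

The point of the sketch is simply the direction of dependency: remove the bottom layer and everything above it collapses, in software just as in warfare.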

“Strategy” can often be found in the support of political goals as well. Regarding World War II, evidence has begun to come to light suggesting that the D-Day invasion was designed primarily to halt the Soviet Union from rolling into Western Europe, and not necessarily to save the threatened countries, though that happened as a by-product. The Allies had made a pact with the devil in “Uncle Joe”, and by 1944 they had begun to realize their folly. Similarly, in IT organizations, many development teams are under pressure to create applications that certain well-connected personnel in the company want, whether those applications are actually of any use or not, ushering in a political motivation for the technical effort.

As has been described, there are many distinct parallels between learning and understanding the mechanisms of conflict and the similar processes found in the Information Technology profession.

Conclusion

Serious war gaming, as opposed to the current fluff found on Internet multiplayer sites, is slowly making a resurgence among those who want to return to a more thoughtful era, something that today’s fast-paced lifestyle is increasingly leaving behind. For many years, starting in the 1960s, this genre of gaming was at the forefront of popular pastimes, equaling or surpassing family gaming. War gaming provided an intellectual outlet for those interested in understanding what happened on the battlefields of history and how such outcomes might have been changed by altering the tactics involved.

In a war game, you are the commander of the side you choose to play. You are responsible for everything that happens on the playing map, and the decisions you must make are as numerous and as complex as the simulation you have decided to play.

War gaming is also a realistic reflection of what happens in circumstances where death and destruction are the stakes, all under an atmosphere of complete uncertainty, since nothing is ever guaranteed on a field of battle no matter how well planned. You never know what the opponent may have prepared to counter your efforts. Similarly, in professional development, one never knows when some new requirement may be thrown into an effort and negatively affect ongoing planning and work.

Current popular video games do not train the mind in such an intellectual manner. As has been well documented, they are primarily designed to create cannon fodder for the US military by exciting youths and young adults still eligible for military service with fantasies of adventure seen through the lens of warfare, when nothing could be further from the truth.

Warfare should be the farthest thing from any sane person’s mind, and it can only be seen for what it is through an objective lens. War gaming can provide one such lens. The excitement comes not from the pull of a digital trigger but from the success of thoughtfully laid plans that culminate in the attainment of a goal. War gaming not only teaches the science behind attaining such goals but also provides an appreciation for the ongoing costs in lives and treasure required to attain them. It can and often does teach enthusiasts to see and understand how events and situations connect to bring about larger and more complex goals, allowing for a more reflective, big-picture view of any endeavor.

In short, war gaming is an avenue for training one’s mind to think rationally and logically when presented with difficult tasks, such as those faced daily in the development profession.

It is also a path that will allow professionals to relax and enjoy an activity that they can participate in at their own pace.

References

  1. Cultural Critic Chris Hedges on Deteriorating US Education
  2. Avalon Hill
  3. GMT Games (now one of the leaders in board-based war gaming)
  4. War Game History (short)
  5. Philipp von Hilgers, Eine Anleitung zur Anleitung. Das taktische Kriegsspiel 1812–1824 (“A Guide to Guidance: The Tactical War Game, 1812–1824”)
  6. Current “Kriegsspiel” Rules
  7. “Kriegsspiel” News (based in the UK)
  8. Talonsoft
  9. John Tiller Software
  10. Matrix Games
  11. Gary Grigsby’s Games at Matrix Games
  12. Sid Meier’s “Civilization”
  13. John Tiller Software’s “Gettysburg”
  14. NATO Military Symbols
  15. The Battle of Gettysburg
  16. Pickett’s Charge
  17. HPS Simulations, “Tigers Unleashed”

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

