As Enterprise Architecture reaches its 40th anniversary, Roger takes us through the history of the discipline from where it began, to where we are now.
As with many methodologies or disciplines, it is difficult to pinpoint a precise start date. But for me, something happened in 1975 that signalled a distinct shift in thinking, and which clearly marks the beginnings of Enterprise Architecture.
In 1975 Richard Saul Wurman became the first person to use the word architecture in a “more generic, holistic, systemic sense as a way of understanding and managing something that was intangible.[i]” Wurman is perhaps better known as the founder of the TED conference. In 1975 he co-wrote an article about the Architecture of Information, and in 1976 he chaired the American Institute of Architects National Convention, which had the “Architecture of Information” as its theme.
So, our discipline began as Information Architecture in 1975, became Information Systems Architecture in the 1980s, and finally Enterprise Architecture in the 1990s. In some ways the original name was more relevant, because we architect through, and with, information about the various components and their structure or organization. The term Information Architecture was appropriated for web-based systems, and now it is only a part of EA. Information Systems Architecture as a label had a narrow focus towards IT. Our current label, Enterprise Architecture, is better, provided we remember that it refers to “enterprise” as “any collection of organizations that has a common set of goals[ii]”; in other words, enterprise is not restricted to organizations or companies – it applies to any social group, which could include a community, a country, or society at large.
That explains why 1975 marks the start, and why 2015 is a 40th anniversary: it was in 1975 that Wurman gave us the label “architecture”. We’ve come a long way in these forty years, and sometimes we forget about the vast range of techniques that form the core discipline of EA. I thought it would be useful, therefore, to give a brief history of EA as a reminder of the origins of some of these techniques.
The 1970s was a fertile time for methodologies and approaches aimed at the development of applications and data. Information Engineering (IE) emerged in the late 1970s, based on the original work of Clive Finkelstein and James Martin. IE is a business-driven methodology that provides an architectural approach to planning, analyzing, designing, and implementing applications. Several techniques from IE have been adopted by EA. For example: entity analysis, originally used to identify the things an enterprise may want to hold data about, has become component analysis, used to identify anything that is relevant within an architecture; function analysis and process dependency have been adapted to examine the behavioral and functional aspects of an architecture; lifecycle analysis helps understand significant changes to components over time; matrix cross-checking and cluster analysis help cross-reference different types of component (in IE this was largely data to process) to verify that they are necessary and complete; and normalization, originally a formal way to confirm the correctness of an entity model, can be applied to many types of architectural component.
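To make the matrix cross-checking idea concrete, here is a minimal sketch in Python. It cross-references processes against data entities, IE-style, and flags entities that nothing creates or nothing uses; all the process and entity names are hypothetical, purely for illustration.

```python
# Toy IE-style matrix cross-check: which processes create (C), read (R),
# or update (U) which entities? Gaps in the matrix suggest missing or
# unnecessary components. All names are hypothetical.
crud = {
    "Take Order": {"Customer": "R", "Order": "C"},
    "Ship Goods": {"Order": "U", "Shipment": "C"},
    "Invoice":    {"Order": "R", "Invoice": "C"},
}

entities = {"Customer", "Order", "Shipment", "Invoice"}

# Collect everything that is created somewhere, and everything used somewhere.
created = {e for uses in crud.values() for e, op in uses.items() if op == "C"}
used    = {e for uses in crud.values() for e, op in uses.items() if op in ("R", "U")}

never_created = entities - created   # data appears from nowhere?
never_used    = entities - used      # data nobody consumes?

print("never created:", sorted(never_created))  # -> never created: ['Customer']
print("never used:", sorted(never_used))        # -> never used: ['Invoice', 'Shipment']
```

The same cross-check generalizes directly from IE's data-to-process matrix to any pair of EA component types, such as capabilities against applications.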
Information Engineering became very popular. It brought together techniques from data modeling with techniques that were emerging for process analysis and design. IE was well supported by software tools (known as Computer Assisted, or Aided, Software Engineering – or CASE – tools). Some of today’s EA tools have their ancestry in the CASE tools of the 1980s. Some of these modeling techniques are still common in EA, such as the use of entity relationship diagrams to document the data architecture. Node and link diagrams are a more general form of relationship diagram that can be used to record and present information about EA components and their associations. The notion of attributes is used to refer to characteristics or qualities of EA components, while cardinality is used to explore more complex relationships between building blocks.
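A node-and-link model with attributes and cardinality can be sketched in a few lines of Python. The component names, attribute keys, and relationship labels below are my own hypothetical examples, not taken from any standard.

```python
# Sketch of a node-and-link view of EA components: nodes carry attributes,
# links carry a relationship name and a cardinality. Names are hypothetical.
nodes = {
    "Customer": {"type": "data entity", "owner": "Sales"},
    "Order":    {"type": "data entity", "owner": "Sales"},
}

# Each link: (source, target, relationship, cardinality)
links = [
    ("Customer", "Order", "places", "1:N"),  # one customer, many orders
]

# Render the links in a simple textual form.
for src, dst, rel, card in links:
    print(f"{src} --{rel} ({card})--> {dst}")
# -> Customer --places (1:N)--> Order
```

An EA repository is, in essence, a much larger and more disciplined version of these two structures.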
Sometimes techniques or concepts from the past are used even though they are not strictly relevant to EA. For example, in the 1970s data modeling described three types of data model [formally defined by the American National Standards Institute (ANSI) in 1975] – conceptual, logical, and physical schemas; these terms are sometimes still found in EA descriptions, where they often cause confusion and need to be replaced with more precisely defined architectural layers.
Around the same time, improvements in software engineering (a term first used in 1968) introduced new techniques to improve the structure, performance, and reusability of computer programs. Structured Programming added the use of sequence, selection, and iteration as the three types of program control structure; subroutines or callable units; block structures that enable groups of things to be considered as a single unit; conditions; and for/while/until loops. As programming languages have evolved, and as different programming paradigms, such as object or service orientation, have emerged, additional techniques for designing and structuring programs have surfaced. For example, the single exit point required by structured programming has been supplemented by multiple or early exit breaks from a function or loop; modern approaches also provide more sophisticated exception-handling options.
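The three control structures, and the contrast between a single final exit and an early exit, fit in a few lines of Python. The function below is a hypothetical example of mine, not drawn from the structured-programming literature.

```python
# The three structured-programming control structures in one function:
# sequence (statements in order), selection (if), iteration (for),
# plus an early exit alongside the single final return.

def find_first_negative(values):
    """Return the index of the first negative value, or -1 if none."""
    for i, v in enumerate(values):   # iteration
        if v < 0:                    # selection
            return i                 # early exit (a "multiple exit" point)
    return -1                        # single final exit

# Sequence: these statements simply run one after another.
data = [3, 7, -2, 5]
idx = find_first_negative(data)
print(idx)  # -> 2
```

Strict structured programming would fold the early `return` into a flag checked by the loop condition; most modern style guides accept the early exit as clearer.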
Although the concepts behind object-oriented (OO) programming go back to the 1960s, they came to more general prominence in the 1980s. Techniques from OO that are relevant to EA include grouping objects into classes; instantiation and inheritance to create new components derived from existing classes; abstraction and generalization techniques; states and state machines; and encapsulation, the packaging of data and function into a single component.
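As a minimal sketch of those OO ideas, the following Python fragment shows a class, an inherited subclass, an instance, encapsulated state, and a tiny state machine. The class and attribute names are hypothetical, invented for illustration.

```python
# Class: a grouping of similar objects with shared structure and behavior.
class Component:
    def __init__(self, name):
        self.name = name
        self._state = "proposed"     # encapsulated: accessed via methods only

    def approve(self):
        # A two-state state machine: "proposed" -> "approved" is the
        # only legal transition.
        if self._state != "proposed":
            raise ValueError("can only approve a proposed component")
        self._state = "approved"

    @property
    def state(self):                 # controlled, read-only access
        return self._state

# Inheritance: Application is a specialization of Component.
class Application(Component):
    def __init__(self, name, vendor):
        super().__init__(name)
        self.vendor = vendor

# Instantiation: a concrete object derived from the class hierarchy.
app = Application("CRM", "Acme")
app.approve()
print(app.state)  # -> approved
```

The same moves, generalizing a class, deriving specializations, and tracking lifecycle state, are exactly what EA does with its building blocks.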
From an EA perspective, these programming techniques provided a rich source of ideas for analysis and design that were easily adapted so that they applied to EA components.
OO gave added impetus to the idea of software design patterns, which were inspired by the pattern language proposed in the late 1970s by the building architect Christopher Alexander. He is regarded as the father of the pattern language movement. A pattern language has a “vocabulary” that names and describes solutions to problems in a field of interest; each solution, described as a pattern, has a “syntax” showing where it fits in a larger, more comprehensive context or web of other patterns; and it has a “grammar” describing how the solution solves a problem or produces a benefit. Patterns have become a key concept in EA, and as EA matures they may prove to be one of its most important techniques.
Another important technique from the 1980s is the notion of an architecture framework. The concept was introduced by John Zachman, although ironically the framework that bears his name is more a schema, taxonomy, or ontology. The concept of a framework has been formally defined in a standard (ISO/IEC 42010). Early frameworks attempted to show a comprehensive EA schema in a single diagram. This has proved impossible, because of the difficulty of presenting more than two or three dimensions in one chart. During the 1990s architects realized that EA is multi-dimensional, so to make a framework a practical technique it needs to focus on no more than two or three dimensions at a time. This has been recognized in The Open Group Architecture Framework (TOGAF), for example, which is actually a framework of frameworks, as it contains separate frameworks for things like content and capability. In my own work I’ve identified eight core dimensions or factors that can be combined to create Multiple Integrated Architecture Frameworks (MIAF).
The rise of knowledge management in the 1990s, in parallel with research into business rules and insights from Knowledge Engineering, gave EA further techniques. EA is often considered to be a knowledge- or information-based discipline, so techniques for representing or reasoning about this knowledge are particularly useful. Consequently, the use of semantic nets, frames, rules, primitives, deconstruction (rather than decomposition), meta-representation or reflection, and ontologies have become part of the EA arsenal. Domain theory and domain modeling also had a big influence on EA in the late 1990s and early 2000s, and it is not surprising that around this time we find the first large-scale use of generic reference models. Some of the early reference models were relatively primitive, and we still have the problem of how to rationalize and integrate a proliferation of diverse models, but as we increasingly build EA on more common foundations, we are likely to find that domain models simplify and merge into a smaller and standard base set. It is also from domain theory that EA has adopted ideas such as separation of concerns and faceted classifications.
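A semantic net with a simple inference rule can be sketched as a set of (subject, relation, object) triples. The vocabulary below, component names and the transitive “is-a” relation, is a hypothetical example of mine, not from any published ontology.

```python
# Toy semantic net: facts as (subject, relation, object) triples,
# plus one inference rule -- "is-a" is transitive. Names are hypothetical.
triples = {
    ("CRM", "is-a", "Application"),
    ("Application", "is-a", "Component"),
    ("CRM", "supports", "Sales Process"),
}

def is_a(subject, target):
    """Follow is-a links transitively: a minimal reasoning step."""
    if (subject, "is-a", target) in triples:
        return True
    parents = [o for s, r, o in triples if s == subject and r == "is-a"]
    return any(is_a(p, target) for p in parents)

# CRM is-a Application, Application is-a Component, so by transitivity:
print(is_a("CRM", "Component"))  # -> True
```

Full ontology languages add typed relations, constraints, and richer inference, but the underlying idea is this same triple store plus rules.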
During the 1990s and 2000s EA became more business oriented, adapting techniques from the business world to create architectural landscapes that mirror business needs. For example, the business concept of Mass Customization, which has also been referred to as the Sense-and-Respond model, has informed architectural techniques for separation of concerns, parameterization, modularization, and component class hierarchies. The need to align EA investment with business imperatives has led to improvements in road-mapping techniques, such as the use of strategic themes, strategic vectors, and enterprise patterns.
It is impossible to cover everything in one short (OK – long!) blog. I hope I’ve given a flavor of the way that EA has adopted techniques of the time and adapted them to its needs. And I hope that it might give some inspiration for seeking out other emerging techniques that can be tailored to help us produce better enterprise architectures in the future.