Assessing Organizational Capabilities Reviewing and Guiding the Development of Maturity Grids

Title page: Assessing Organizational Capabilities: Reviewing and Guiding the Development of Maturity Grids

Abstract – Managing and improving organizational capabilities is a significant and complex issue for many companies. To support management and enable improvement, performance assessments are commonly used. One way of assessing organizational capabilities is by means of maturity grids. Whilst maturity grids may share a common structure, their content differs and very often they are developed anew. This paper presents both a reference point and guidance for developing maturity grids. This is achieved by reviewing existing maturity grids and by suggesting a roadmap for their development. The review of more than twenty maturity grids places particular emphasis on embedded assumptions about organizational change in the formulation of the maturity ratings. The suggested roadmap encompasses four phases: planning, development, evaluation and maintenance. Each phase discusses a number of decision points for development, such as the selection of process areas, maturity levels and the delivery mechanism. An example demonstrating the roadmap's implementation in industrial practice is provided. The roadmap can also be used to evaluate existing approaches. In concluding the paper, implications for management practice and research are presented.

Index Terms – Performance assessment, Quality management, Process improvement, Organizational capabilities, Change management, Project management, Maturity grid, Maturity matrix, Maturity model

I. INTRODUCTION

Quality management and process improvement initiatives and their influence on performance excellence have led to an explosion of standards, regulations, bodies of knowledge, statutes, models and grids that have been published. These 'improvement frameworks' often seek to enable the assessment of organizational capabilities. Customers may explicitly require compliance with some frameworks; the market may implicitly require compliance with others. Some might be imposed by regulation, others might simply be recognized as being useful in building or maintaining a competitive position [1] or in overcoming the paradoxical organizational struggle to maintain, yet replace or renew, capabilities [2]. Irrespective of the driver for adopting an improvement framework, dealing with the hundreds of requirements that the diverse standard-setting bodies impose leaves many companies confused and frustrated. Confusion has triggered calls for an overview [3] or a taxonomy of improvement frameworks [4, 5]. A taxon suggested by Paulk [5] pertains to management philosophies, such as Total Quality Management and, associated with it, maturity assessment grids. Maturity grids can be used both as assessment tools and as improvement tools. Crosby's Quality Management Maturity Grid (QMMG) features as the pioneering example, advocating a progression through five stages: uncertainty, awakening, enlightenment, wisdom and certainty (Figure 1).

– Insert Figure 1 about here –

In the case of a voluntary evaluation of performance levels, companies often look for assessments that do not take long and do not cost too much, which makes maturity grid assessments especially attractive. However, whilst managing organizational capabilities is a significant and complex issue for many companies and features prominently in the organization literature, we nevertheless observe that the contribution of assessments using maturity grids has so far been overlooked in the academic literature. It seems as if maturity grids have been in the shadow of the more widely known Capability Maturity Model (CMM) [6, 7] and its derivatives, including the CMMI [8] and the People-CMM [9] – all developed at Carnegie Mellon's Software Engineering Institute (SEI).

[Footnote 1: The authors wish to acknowledge the constructive comments provided by the three anonymous reviewers and the Associate Editor, Prof. Jeffrey Pinto. We also wish to extend our gratitude to the following people, whom we contacted during the writing of this paper: Dr. Bob Bater, InfoPlex Associates; Prof. Ian Cooper, Eclipse Consultants; Dr. Nathan Crilly, University of Cambridge; Prof. Kevin Grant, University of Texas at San Antonio; Dr. Manfred Langen, Siemens AG; and Dr. Sebastian Macmillan, University of Cambridge. We also thank the members of the Design Management Group, University of Cambridge, and the members of the Work, Technology and Organization Section, Technical University of Denmark, who supported us with helpful advice on earlier versions of the manuscript.]

A. The connection between maturity grids and models

Differentiating between capability maturity grids and capability maturity models is difficult. Whilst they are complementary improvement frameworks with a number of similarities, key distinctions can be made with respect to the work orientation, the mode of assessment and the intent.

Work orientation: Maturity grids differ from other process maturity frameworks, such as the SEI's Capability Maturity Model Integration (CMMI), which applies to specific processes like software development and acquisition. The CMMI model identifies the best practices for specific processes and evaluates the maturity of an organization in terms of how many of those practices it has implemented. By contrast, most maturity grids apply to companies in any industry and do not specify what a particular process should look like. They identify the characteristics that any process and every enterprise should have in order to design and deploy high-performance processes [10].

Mode of assessment: Typically, an assessment using the Capability Maturity Models consists, among other instruments, of Likert or binary yes/no-based questionnaires and checklists to enable assessment of performance. In contrast, an assessment via a maturity grid is typically structured around a matrix or a grid. Levels of maturity are allocated against key aspects of performance or key activities, thereby creating a series of cells. An important feature of a maturity grid approach is that in the cells it provides descriptive text for the characteristic traits of performance at each level, also known as a 'behaviourally anchored scale' [11].

Intent: Many capability maturity models follow a standard format and are internationally recognized. As a result, they can be used for certification of performance. Maturity grids tend to be somewhat less complex, serving as diagnostic and improvement tools without aspiring to provide certification. Accordingly, the intention of use by companies differs. Companies often use a number of approaches in parallel, and a maturity grid assessment may be used as a stand-alone assessment or as a sub-set of a broader improvement initiative.

In summary, in comparison with CMMs, CMGs have a different work orientation and are normally generic across industries. CMGs consist explicitly of behaviourally anchored scales. Finally, CMGs are concise and as a result are less effective in benchmarking or as a tool for certification. They are effective, though, in raising awareness of new managerial issues.

B. Lack of cross-domain reviews

There is a lack of concerted cross-domain academic effort in understanding the limitations and benefits of maturity grids. In specific knowledge domains, there have been some efforts to compare a variety of maturity assessments, mostly focusing on maturity models: Becker et al. [12] compared six maturity models for IT Management; Rohloff et al. [12] reviewed three maturity models for Business Process Management; Kohlegger et al. [13] conducted a qualitative content analysis of a set of 16 maturity models; de Bruin and Rosemann [14] presented a cross-domain review of two maturity models, for Knowledge Management and for Business Process Management respectively; Siponen [15] explored three evaluations based on maturity principles in the area of Information Management; and Pee and Kankanhalli [16] compared maturity assessments for knowledge management. Analyses concentrate mostly on a small sample of maturity models and describe strengths and/or weaknesses of existing approaches within the respective domains, i.e. IT Management, Knowledge Management, Business Process Management and Information Management.

Studies with a larger sample size that use maturity assessment methods to gauge the current level of a specific organizational capability in different industry sectors have been conducted, e.g. by Ibbs and Kwak [17], Pennypacker and Grant [11, 18] and Mullaly [19] in the domain of Project Management. These studies report on a substantial interest in the development of viable methods to assess and improve project management maturity and recognize the need for synthesis among maturity assessment methods. The aim of these studies was to synthesize domain-specific findings on the status of an organizational capability. Each of the previous research efforts cited succeeded in comparing maturity models in specific knowledge domains or used specific maturity assessment methods to arrive at an overall level of, for example, project management within a number of industry sectors. As such, these efforts represent valuable initial steps towards giving researchers and industry the needed overview in their quest for awareness and perhaps synthesis of assessment methods.

Given what has been said above, there is a need for a cross-domain review of maturity grids and a gap in current understanding relating to the underpinning theoretical concepts of maturity-grid-based assessments. The consequences of this are that unnecessary effort is often expended in developing new assessment tools that duplicate those that already exist, or new methods are developed from unsuitable foundations. Consequently, this paper aims to review existing maturity grids to provide a common reference point.

C. Lack of guidance for authors of maturity grids

Since the early 1990s, and in parallel to the absence of a cross-domain academic debate, we see rapid growth in the development and use of maturity grid assessments in management practice across a number of industry sectors. Such maturity grids are developed by researchers in academia, practitioners and consultants, and as a result are often proprietary or difficult to access. When examples reach the academic literature, understandably, they are published in specialized journals relating to the domain addressed. With few exceptions, work is presented without reference to that which precedes it, and new language is developed for concepts that have already been described elsewhere. Many such efforts have been criticized as ad hoc in their development [20]. This is perhaps not surprising: in the absence of guidance, authors do what they think is best, and that is often not good enough, especially considering the potential impacts of assessment results on a company's operations and employees' morale. Consequently, this paper aims to suggest a more rigorous approach to the development of maturity grids.

Recent studies intend to aid the development of maturity models: Becker et al. [12] compare six maturity models for information technology management and suggest a procedural model for their development; Mettler and Rohner [21] suggest decision parameters for the development of maturity models in information systems; and Kohlegger et al. [13] compare the content of 16 maturity models and suggest a set of questions for the subsequent development of maturity models. This paper complements these studies by focusing on the underpinning theoretical concepts of maturity-grid-based assessments and by guiding their development and evaluation.

D. Objectives of the paper

The objectives of this paper are to review existing maturity grids to provide a common reference point and to suggest the parameters for a more rigorous approach to the development of maturity grids. This paper is intended for practitioners in industry and academic researchers concerned with process improvement, intervention and change management in organizations. Both readerships might benefit from an overview of published maturity grids for assessing organizational capabilities, and potentially be introduced to grids that they had not previously seen. Furthermore, they might benefit from guidance for the systematic development of their own maturity grid.

E. Structure of the paper

The remainder of the paper is organized as follows: Section II describes the methods used to elicit the content of this paper. Section III introduces the notion of maturity and traces its history and evolution. In Section IV, existing maturity grids are reviewed. Section V introduces a process for creating maturity grids. Section VI presents an illustrative example of its application. Section VII concludes the paper with the main implications for management practice and research.

II. METHODS

This section explains the rationale for selecting the review sample of this paper and furthermore describes how the suggested roadmap for the development of new, and the evaluation of existing, maturity grids was built up and evaluated.

A. Selection of the review sample

Our sampling strategy for the review consisted of the following activities: Firstly, we keyword-searched academic databases and World Wide Web search engines. Secondly, we followed up references from publications found in the preceding activity. Finally, to check comprehensiveness and gather suggestions from experts in the field of organizational studies, we sent the list of about 60 grids and models to a number of academic scholars active in this field. Out of the list of potential models, we selected maturity grids for further analysis that fulfilled the following criteria:

• Grid-based approach: The grids need to be akin to the original Quality Management Maturity Grid (QMMG) by Crosby, as opposed to following a staged CMM-style approach. The representation in grid or matrix format, using text descriptions as qualifiers, is used as the underlying model and/or assessment instrument. Here, no differentiation is made between a grid, a matrix and a table [22].

• Self-assessment: A number of assessment methods use an external facilitator to assign and/or decide on scores based on participants' comments and/or require a certified auditor. Those approaches do not meet this criterion, as this paper focuses on models aimed at self-assessment.

• Publicly available: Many maturity grids are proprietary tools generated by consulting organizations. Approaches included in this review are all in the public domain and available without cost.

Tables summarize the 24 maturity grid assessments analyzed in this study, presented in chronological order of publication. Our sample consists of maturity grids developed by academic researchers, industry, and consulting firms, provided they meet the above-mentioned criteria. As a result, many models that make significant contributions in their own field were not included in this review. To mention a few examples: the Achieving Excellence Design Evaluation Toolkit [23] uses a Likert scale and is, as defined in this paper, not a grid-based approach. Kochikar's approach in knowledge management [24] is particularly influenced by the CMM in that it is a staged approach, requiring baseline performance in certain key research areas before advancing to the next level of maturity. Further, Ehms and Langen's comprehensive model [25] in knowledge management is a third-party assessment tool relying on an auditor to infer the scores from participants, and the Learning Organization Maturity Model by Chinowsky et al. [26], for example, violates the first criterion as it is not a grid-based tool. Also outside the scope of the paper is work on related terms: firstly, technology readiness [27, 28] and the uptake of its principles, e.g. the System Readiness Level [29]; secondly, life cycle theories, for example those describing the adoption of a new product, technology, or service in the market, often visualized using S-shaped curves [30, 31], the development of electronic data processing towards maturity using six stages of growth [e.g. 32], and the phases of growth that organizations in general pass through towards maturity in the market [33].

B. Elicitation of guidance

The overall research approach taken for the review and the suggestion of subsequent guidance is that of conceptual analysis [15]. Individual maturity grids are compared according to critical issues: the assessment aim, the selection and rationale of process areas, the conceptualization of leverage points for maturity, and the administration of the assessment. In order to show how we came to a particular conclusion, relevant citations from the original material are included wherever possible. The roadmap takes the form of a description of a sequence of four process steps [34] with a set of decision points for each phase. Development of the roadmap was pursued like a design exercise, in that it aimed to create an innovative artefact – a method – intended to solve an identified problem. The underlying theoretical perspective is thus design science [35, 36]. For the presentation of this article, we have furthermore been inspired by Eisenhardt's [37] roadmap to develop theory from case studies, which alerts the reader to steps and associated decision points in the development journey.

Guidance, in the form of a roadmap with four phases and a number of associated decision points, was developed in three steps: Firstly, more than twenty extant maturity grids for assessing organizational capabilities were reviewed. The sample contains contributions from the fields of management science, new product development, engineering design and healthcare. Secondly, guidance was elicited on the basis of the experience of the authors of this paper. Thirdly, two experts who have independently developed and applied a maturity grid for assessing organizational capabilities were interviewed. Interviews were conducted to obtain further insights and to validate our results from comparing extant approaches in the literature and findings from our own experience with developing and applying maturity grid assessments in small and medium-sized companies as well as large multinational corporations from a number of industry sectors. Both experts have undertaken consulting work and are now pursuing academic careers in engineering and construction. In summary, insights from the reviewed literature, the authors' own experience and the experts' feedback were used as the basis of this guide.

C. Evaluation of guidance

The roadmap suggested in this paper was evaluated in two ways. Firstly, application of the roadmap in industry demonstrates its workings. Outcomes of multiple applications of the maturity grid to assess communication, given as the case example in this paper, are taken as an indicator of both the roadmap's and the grid's reliability. In addition, feedback from the participants in the assessments is also taken as direct member validation [38] of the Communication Grid Method and as indirect evidence of the roadmap's utility. Secondly, in connecting to the design science perspective, a further evaluation of the development method presented here is provided by applying the guidelines for devising an artefact formulated by Hevner et al. [36] and reformulated and used specifically in the context of requirements for maturity models by Becker et al. [12] (see Section VI and Table 6).

III. MATURITY

When discussing the concept of maturity, it is important to provide definitions, as the language can be inconsistent and confusing. It cannot be assumed that even within one field of expertise the concept of maturity espoused is one and the same. This section introduces a dictionary definition of maturity and continues by specifying what in the literature has come to be termed 'process maturity', 'organizational maturity', 'process capability', 'project maturity' and 'maturity of organizational capabilities'.

A. The notion of maturity: a dictionary definition

Broadly speaking, there is reference to two different aspects of the concept of maturity.
Firstly, there is reference to something or someone as having reached the state of completeness in natural development or growth [39], in other words, the “state of being complete, perfect, or ready” [40] Secondly, there is reference to the process of bringing something to maturity, “to bring to maturity or full growth; to ripen” [40] It is the latter definition which stresses the process towards maturity that interests us in the paper How authors of individual maturity grid assessments conceptualize a progression to maturity? Searching for answers to this question, one realizes that the concept of maturity is best discussed in connection with the context within which it has been applied B Evolution of the notion of maturity in literature(s) The concept of maturity has seen widespread attention in a number of academic fields Whilst the concept of maturity grids has been familiar for some time, their popularization as a means of assessment has been more recent [19] Process maturity: The concept of process maturity stems from Total Quality Management (TQM) where the application of statistical process control techniques showed that improving maturity of any technical and business process ideally leads to a reduction of variability inherent in the process and thus an improvement in the mean performance of the process Whilst Crosby has inspired the notion of progression through stages towards maturity, his maturity concept as a way of measuring organizational capabilities was not formalized [5] Organizational maturity: Through the widely adopted Capability Maturity Model for improvement of software development process (CMM-SW), this concept of process maturity migrated to a measure of organizational maturity Integral to the CMM-SW is the concept that organizations advance through a series of five stages or levels of maturity: from an initial level, to a repeatable-, defined-, managed- and an optimizing level These levels describe an evolutionary path from ad hoc, 
chaotic processes to mature, disciplined software processes and define the degree to which a process is institutionalized and effective [6, 41] Process capability: Rather than measuring organizational capability with a single value, ISO/IEC 15504 measures process capability directly and organizational capability with a process capability profile CMMI integrated both organizational maturity and process capability Although ISO/IEC 15504 and CMMI both use capability levels to characterize process capability, their operational definitions are somewhat different The key taxonomic distinction is between a multi-level organizational versus process measure Paulk [5] suggests the term organizational capability to characterize a hybrid between organizational maturity and process capability This is different from the notion of organizational capabilities applied in this paper In addition, irrespective of the way of arriving at an overall score either on the level of processes (process capability) or aggregate level for an organization (organizational maturity), the notion of higher levels of maturity representing increased transparency is retained [5] Project maturity: Perhaps since software is developed through projects, it is natural that the concept of organizational maturity would migrate from software development processes to project management, and this has been reflected in an interest in applying the concept of ‘maturity’ to (software) project management [17, 42-47] Although the focus on assessing project management using the notion of maturity has started comparatively recent, a number of alternative models have been released [18, 19, 45, 48] Whilst inspired by CMM- or ISO/IEC 15504-like notions of maturity which focus on process transparency and increased (statistical) control, research into project management maturity shows variations in how maturity is conceptualized One way to determine a mature project is by looking at what organizations and people are doing 
operationally [17, 49] Skulmoski [50] goes further to include an organization’s receptivity to project management Andersen and Jessen [48] continue in this vein and determine maturity as a combination of behavior, attitude and competences Maturity is described as a composite term and characteristics and indicators are used to denote or measure maturity In response to the many competing models, the Project Management Institute (PMI®) launched the Organizational Project Management Maturity Model (OPM3) program [18] as a best-practice standard for assessing and developing capabilities in Portfolio Management, Program Management, and Project Management [51] Maturity of organizational capabilities: The concept of capability has been used in strategic literature, specifically in the resource-based view to explain differential firm performance [52-55] The capability perspective hinges on the inductively reasoned relationship between process capability and business performance This paper uses the terms organizational capabilities as the collective skills, abilities and expertise of an organization [56, 57] In this vein, organizational capabilities refer to, for example, design; innovation; project management; knowledge management, collaboration and leadership [56] Thus, organizations can be viewed as comprised of a set of capabilities [58] which are the skill sets that provide an organization with its competitive advantage Whilst it seems potentially misaligned to have first described process maturity, followed by organizational maturity, and finally by one example of an organizational capability, project management, before finally moving to the focus of this paper, maturity of organizational capabilities in general, two reasons justify this sequence One, the historic timeline of influence is maintained Two, this body of literature engages with and shows variations in conceptualizations of maturity that would be fruitful across disciplines Variations show that there is 
more than one leverage point to achieve maturity – the subject of the cross-domain review of existing maturity grids in the ensuing section -Insert Tables to about here - presentation and in some cases a written report Repeatedly, participants felt the results accurately represented the actual status of communication between the team interfaces chosen This ensured reliability The method itself was evaluated according to set criteria, such as functionality and usability, usefulness and learn effect, triggering reflection, correctness of results obtained [69, 70] Findings from verbal feedback were cross-referenced with a questionnaire using the same criteria and completed by the same participants For a comprehensive description refer to Maier [76] In addition to feedback from experts in industry participating in the studies, feedback was also sought from experts from two engineering consultancies and engineering design researchers from a variety of universities The different respondent groups were chosen because a single group of respondents could not properly replicate broader use in industry or judge scientific purpose As research progressed and predominantly during the three early application cases after which the design was frozen, subtle changes were made to the list of process areas, the number of interfaces assessed at any one time, and the administration mechanism Description of the development process reflects the final design of the maturity grid D) Phase IV – Maintenance The Communication Grid Method aims to raise awareness It may compare teams and departments within companies with each other, rather than positing a best practice benchmark Cell-text is descriptive and therefore, the maturity grid will remain up-to-date The development process was documented and communicated to scientific audiences through peer-reviewed publications Documentation of results from the application cases were made available to participating industry partners Following the 
roadmap in the development of the communication grid gave us a clear structure for development It also meant that both the authors and the users of the assessment method knew what we were expecting and could therefore define the rules of engagement and estimate time and effort In keeping with the design science perspective underlying this article (Section II), a further evaluation of the Communication Grid Method and the suggested roadmap for development of such maturity assessment methods was undertaken (Table 6) We followed the guidelines for devising an artifact formulated by Hevner et al [36] and reformulated and used specifically in the context of requirements for maturity models by Becker et al [12] -Insert Table about here VII SUMMARY AND CONCLUSIONS This paper has discussed various notions of maturity, i.e ‘process maturity’, ‘organizational maturity’, ‘process capability’, ‘project maturity’ and ‘maturity of organizational capabilities’ (Section III) It presented a comprehensive overview of extant maturity grids that build on the ideas of Crosby’s Quality Management Maturity Grid from the late 1970s (Section IV) In reviewing, particular emphasis was placed on analyzing embedded assumptions about organizational change in the maturity scales of the examples reviewed As direct comparison, leverage points for maturing collaboration as an organizational capability were shown to be one or a combination of: Existence and adherence to a structured process (e.g infrastructure, transparency), alteration of organizational structure (e.g job roles), emphasis on people (e.g skills, training, building relationships), and/or emphasis on learning (e.g awareness, mind-set) This shows that combining different perspectives and measures of ‘good’ for one and the same process or capability is difficult Assessing maturity will therefore perhaps always be more subjective than objective [48] Whilst the number of maturity grids and models is growing, there is little support 
available on how to develop these approaches to organizational capability assessment and development To address this issue, the paper also provided a four-phase roadmap for developing maturity grids (Section V) and showed the roadmap’s implementation in industry with an illustrative example (Section VI) This roadmap is a first attempt at identifying and synthesizing phases and decision points that may be useful to both authors of assessment grids and implementers who need to handle the multi-model environment It provides the parameters within which professional development of a maturity grid might occur Further, it provides the parameters within which professional judgements for evaluation of existing grids might be made However, it cannot, and does not aim to provide the answer to every dilemma an author of a grid may face Decision points and options provide instances for reflection to ensure appropriate courses of action when developing new or evaluating existing maturity grids Such instances occur, for example, when selecting process areas and facing the issue of academic rigor vs logistical feasibility and thereby practical utility For academic purposes, the list of process areas chosen needs to be comprehensive, complete, correct, consistent and, above all, theoretically justified For industrial applicability, however, certain flexibility for adaptation and tailoring needs to be designed into the method It is necessary to strike a balance between developing an exhaustive method and a usable one A) Implications for industrial practice Maturity grids are built upon conceptual models that in their own rights provide insights into the author’s perspective of the factors important in an organization Thus, the maturity grid-based assessment methods collectively offer a contemporary representation of different conceptualizations of organizational practices and capabilities that are viewed as important for success In addition, this review presents an overview of the 
different maturity grid approaches available to assess organizational capabilities and initiate change processes at a given point in time. This provides organizations with a better understanding of existing capabilities and enables benchmarking against a range of competitors. While maturity grids may share a common structure, their content differs and very often they are developed anew. This paper provides a common reference point and guidance for the evaluation of existing grids and the development of new grids. It alerts both the novice and the expert user to the potential decisions to be addressed at the start of the development of a maturity grid assessment tool.

B) Implications for research

The maturity grids selected all embrace the notion that successful organizational change can be triggered and/or achieved by an assessment of practices. Most maturity grids are based on the underpinning assumption that key organizational capabilities need to be rooted in codified business processes. Thus, there is an acceptance that business processes are beneficial and necessary. However, the underlying idea of 'cause and effect' might be fallacious because the processes under assessment are in many cases social processes that do not follow simple cause-and-effect patterns [88]. Careful analysis is necessary, however, to discern whether maturity grid methods fall into a naturalistic-mechanistic perspective that holds processes to be fully quantifiable and controllable. Given the variety of maturity grids available, the skeptical observer would wisely enquire as to the basis upon which each maturity grid is founded. A number of perspectives on organizational change – certain ideas of goodness based on certain paradigms of goodness – are embedded in the rating scales of maturity grids. The overview, analysis and suggestions provided here give a background for research into theory-building on the development of maturity grids and for research into management tools as interventions in
organizations.

FIGURES AND TABLES

Figure 1: Excerpt reproduced from the Quality Management Maturity Grid [89]
Quality Management:
• Stage I: Uncertainty – "We don't know why we have problems with quality."
• Stage II: Awakening – "Is it absolutely necessary to always have problems with quality?"
• Stage III: Enlightenment – "Through management commitment and quality improvement we are identifying and resolving our problems."
• Stage IV: Wisdom – "Defect prevention is a routine part of our operation."
• Stage V: Certainty – "We know why we do not have problems with quality."

Figure 2: Redrawn 'Teamwork' by Hammer [10]
• E-1: Teamwork is focused, occasional, and atypical.
• E-2: The enterprise commonly uses cross-functional project teams for improvement efforts.
• E-3: Teamwork is the norm among process performers and is commonplace among managers.
• E-4: Teamwork with customers and suppliers is commonplace.

Figure 3: Redrawn Activity – Coordinating R&D and Marketing from Szakonyi [60]
• Level A (Not recognized): R&D department does not think that it needs to work with marketing in developing new products (aerospace).
• Level B (Initial efforts): Technical people want better coordination with marketing, but lack the skills to analyze the business applications of a technical idea (petroleum equipment).
• Level C (Skills): Technical people know how to develop applications of a technology, but lack methods for working backward from a customer need to selecting technical projects (chemical).
• Level D (Methods): Work closely with marketing, but has difficulties in sorting out where responsibilities lie between technical concept and product concept (food processing).
• Level E (Responsibilities): Close coordination between R&D and marketing departments, but has not figured out how to develop new products effectively (chemical).
• Level F (Continuous improvement): Close coordination, with a former technical person in charge of marketing and taking the lead in technical marketing and new market development (industrial equipment).
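The grid excerpts above share one structure: process areas rated against an ordered set of maturity levels, with descriptive text in each cell. As a minimal illustrative sketch (not from the paper — all names, cell texts and the averaging rule below are hypothetical examples, modeled loosely on Crosby-style five-stage grids), such a grid can be represented as a small data structure:

```python
# Hypothetical sketch of a maturity grid: process areas (rows) crossed with
# ordered maturity levels (columns); each cell holds a descriptive statement
# of practice at that level. Not an implementation from the reviewed paper.
from dataclasses import dataclass, field


@dataclass
class MaturityGrid:
    levels: list[str]  # ordered maturity levels, e.g. Stage I..V
    cells: dict[str, list[str]] = field(default_factory=dict)  # area -> one cell text per level

    def add_area(self, area: str, descriptions: list[str]) -> None:
        # Every process area needs exactly one cell description per level.
        if len(descriptions) != len(self.levels):
            raise ValueError("one cell description required per maturity level")
        self.cells[area] = descriptions

    def assess(self, ratings: dict[str, int]) -> float:
        # A simple (assumed) aggregation: average the selected 1-based level
        # across all rated process areas.
        return sum(ratings.values()) / len(ratings)


# Example grid with one process area and Crosby-style stage labels:
grid = MaturityGrid(levels=["Uncertainty", "Awakening", "Enlightenment", "Wisdom", "Certainty"])
grid.add_area("Problem handling", [
    "Problems are fought as they occur",
    "Teams are set up to attack major problems",
    "Problems are resolved in an orderly way",
    "Problems are identified early",
    "Problems are prevented",
])
print(grid.assess({"Problem handling": 3}))  # → 3.0
```

Whether cell ratings should be averaged at all is itself one of the roadmap's decision points (formulating cell text and the administration mechanism); the numeric aggregation here is only a placeholder for whatever scoring rule an author chooses.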
Figure 4: Redrawn 'Collaboration and participation' by Constructing Excellence [61] (direction of rows changed)
• Level 1: Insularity, lack of trust or power struggles reduce participation and collaboration.
• Level 2: Team members prefer to work alone and give more priority to their own concerns than to those of the team.
• Level 3: The team seeks ideas, proposals and solutions from all its members.
• Level 4: All members are given opportunities to contribute and build on suggestions from others.
• Level 5: Familiarity, honesty, mutual trust and full participation harness the collective expertise of the team.

Figure 5: Redrawn 'Collaboration strategy' by Fraser et al. [74] (general discussion questions omitted)
Collaboration strategy: "Conscious choice between internal or external sources of design and development expertise"
• Level 1 – (Not) Invented Here!: Do most things in-house regardless of specific firm capability; no agreed collaboration strategy; industrial design used late to 'tart up' pre-determined mechanics; prone to NIH syndrome (not invented here).
• Level 2 – Occasional ad-hoc partnering: Inconsistent use of design house specialists; tasks not always done by specialists; internal skill shortages clearly recognized.
• Level 3 – Established partners: Some long-term partners, but not strategically managed; long-term relationships with specific design service providers; external design involvement is planned into the project early; external resource integrated into core team.
• Level 4 – Regular review of partnering competence: Investment in core technological competences; strategic partnering policy; capability is extended with most appropriate external resource.

Figure 6: Redrawn 'Collaboration' by Maier [90] (factors rated as 'Current' and 'Desired')
• A – No action: Everyone looks solely after his or her tasks.
• B – Change of action: Collaboration happens only if asked for in order to fulfill tasks.
• C – Change of action and attitude: Collaboration happens proactively in order to learn from others and improve own approaches.
• D – Continuous adaptation: Collaboration is constructive, happens regularly whenever necessary, and there is continuous effort to improve it.

Figure 7: Phases and decision points of the roadmap
• Phase I – Planning: 1) Specify audience; 2) Define aim; 3) Clarify scope; 4) Define success criteria.
• Phase II – Development: 1) Select process areas; 2) Select maturity levels; 3) Formulate cell text; 4) Define administration mechanism.
• Phase III – Evaluation: 1) Validate; 2) Verify.
• Phase IV – Maintenance: 1) Check benchmark; 2) Maintain results database; 3) Document and communicate development process and results.

Tables 1 to 4: Comparison of existing maturity grids

Maturity grid / Description
• Quality Management Maturity Grid (QMMG) [89]
• The Safety Management Maturity Grid [91]
• Process Grid [92]
• Energy Management Matrix [93]
• Measuring R&D effectiveness [60, 71]
• The Information Process Maturity Model (IPMM) [94]
• Product and Cycle-time Excellence (PACE) [95]
• Innovation Audit [70]

Description: The Quality Management Maturity Grid (QMMG) is an organizational maturity matrix conceived by Philip B. Crosby, first published in his book Quality is Free in 1979. The QMMG is used by a business or organization as a benchmark of how mature their processes are, and how well they are embedded in their culture, with respect to service or product quality management. Aim: Raise awareness and benchmark an organization's processes relative to each other. Scope: Generic for quality management. Administration: Individual completion of paper-based grid.
Description: The Safety Management Maturity Grid. Aim: Raise awareness ("situation identification") and comparison between individual "raters". Scope: Generic for safety management. Administration: Individual "rating" of paper-based grid (at least three people per company or division).
Description: The Process Grid aims to evaluate large-system programming development locations according to a set of process stages. It evaluates the
effectiveness of the work being performed to develop software and can be used for software evaluations of any project. Aim: Raise awareness and benchmark an organization's processes relative to each other. Scope: Generic for software evaluation. Administration: Assessment through interviews of product group members.
Description: Energy efficiency activity in an organization is often treated as a technical activity isolated from management processes. The Energy Management Matrix was developed to determine how energy efficiency is viewed at various organizational levels. Aim: Benchmark with best practice in industry sector. Scope: Discipline-specific for energy management in construction. Administration: Individual or group completion of matrix.
Description: Based on several decades of experience and work with a number of companies, Szakonyi developed an approach to measure Research and Development (R&D) effectiveness. Aim: Benchmarking with best practices in respective industry sector. Scope: Discipline-specific for R&D effectiveness. Administration: One-to-one interviews with R&D personnel.
Description: Companies frequently collect information about their customers, products, suppliers, inventory and finances. However, it can become increasingly difficult to accurately maintain that information in a usable, logical framework over time. Therefore, Hackos developed maturity grids to help companies improve information development practices. Aim: Improvement through raising awareness and comparison within an organization. Scope: Generic. Administration: Interviews and additional documents.
Description: Every company that improves the new product development process goes through evolutionary stages. McGrath's Product and Cycle-time Excellence (PACE) is one approach to assess and improve this progression. Aim: Benchmarking with industry best practice. Scope: Discipline-specific for product development. Administration: Not mentioned.
Description: The Innovation Audit addresses the
managerial processes and the organizational mechanisms through which

Process areas / Maturity levels
• [QMMG] Labels: Management categories. Items: Management understanding and attitude; Quality organization status; Problem handling; Cost of quality as % of sales; Quality improvement actions; Summation of company quality posture. Levels: Stage I: Uncertainty; Stage II: Awakening; Stage III: Enlightenment; Stage IV: Wisdom; Stage V: Certainty.
• [Safety Management Maturity Grid] Labels: Management categories. Items: Management understanding and attitude; Safety organization status; Problem handling; Safety & Health program improvement actions; Summation of company employee welfare posture. Levels: Stage I: Uncertainty; Stage II: Awakening; Stage III: Enlightenment; Stage IV: Wisdom; Stage V: Certainty.
• [Process Grid] Number: 132. Labels: Process stages and attributes. Items: 12 process stages (Requirements, Product Level Design, Component Level Design, Module Level Design, Code, Unit Test, Functional Verification Test, Product Verification Test, System Verification Test, Package and Release, Early Support Program, General Availability); 11 attributes for each process stage (methods, adherence to practices, tools, change control, data gathering, data communication and use, goal setting, quality focus, customer focus, technical awareness). Levels: Level 1 = traditional; Level 2 = awareness; Level 3 = knowledge; Level 4 = skill and wisdom; Level 5 = integrated management system.
• [Energy Management Matrix] Label: Organizational issues. Items: policy, organization, motivation, information systems, marketing, investment. Levels: five unlabeled levels.
• [Measuring R&D effectiveness] Number: 10. Label: Processes. Items: Selecting R&D; Planning and managing projects; Generating new product ideas; Maintaining the quality of the R&D process and methods; Motivating technical people; Establishing cross-disciplinary teams; Coordinating R&D and marketing; Transferring technology to marketing; Fostering collaboration between R&D and finance; Linking R&D to business planning. Levels: Level A = issue is not
recognized; Level B = initial efforts are made toward addressing issue; Level C = right skills are in place; Level D = appropriate methods are used; Level E = responsibilities are clarified.
• [IPMM] Label: Characteristics. Items: Organizational Structure; Quality Assurance; Planning, Estimating and Scheduling; Hiring and Training; Publications Design; Cost Control; Quality Management. Levels: Level 1 = Oblivious?; Level 2 = Ad-hoc; Level 3 = Rudimentary; Level 4 = Organized and Repeatable; Level 5 = Managed and Sustainable.
• [PACE] Number: 10. Label: not mentioned. Items: Product Development Process (Structure and Definition); Project Team Organization; Management Decision Process; Continuous improvement; Target setting/Metrics; Product Strategy Process; Technology Management Process; Pipeline Management; Time to Market Performance; Development Productivity. Stages: Informal; Functionally Focused Project Management; Cross-Functional Project Management; Enterprise-Wide Integration of Product Development.
• [Innovation Audit] Number: 23 (8 "process areas" with two to four sub-questions each). Label: Questions. Levels: four unlabeled levels.

Maturity grid (continued):
• Information Security program maturity grid [96]
• PM Solutions Project Management Maturity Model (PMMM) [11, 42, 59]
• Towards a Risk Maturity Model [97]
• Berkeley PM process maturity model [17, 49]
• Collaboration Audit [74, 75]

[Innovation Audit, continued] innovation is performed. The audit has two dimensions: the process audit assesses whether the processes necessary for innovation are in place and the degree to which best practice is used; and the performance audit focuses on the outcomes of each core and enabling process and of the overall process of technological innovation and its effect on competitiveness. Aim: Raise awareness and comparison with other companies. Scope: Discipline-specific for product development. Administration: The audit methodology uses a two-level approach: a rapid assessment based on innovation scorecards and an in-depth audit.
Description:
Outsourcing, offshoring, mergers and acquisitions, demands for increased productivity, and reductions in the workforce are often accompanied by data management problems. In addition, as enterprise data is frequently held in disparate applications across multiple departments and regions, questions of data security come into play. Stacey's Information Security program maturity grid addresses the above-mentioned issues. Aim: Not mentioned. Scope: Generic for information security. Administration: Not mentioned.
Description: Project management maturity is the progressive development of an enterprise-wide project management approach, methodology, strategy and decision-making process. The Project Management Maturity Model follows the Software Engineering Institute's (SEI) Capability Maturity Model's (CMM) five evolutionary maturity levels, and examines maturity development across nine knowledge areas in the Project Management Institute's (PMI) A Guide to the Project Management Body of Knowledge (PMBOK Guide). Aim: Benchmark across industry sectors. Scope: Generic for project management. Administration: Questionnaire.
Description: Good risk management practice, including assessment of health and safety, environmental and financial risks, is essential for businesses. In the UK, for example, the management of safety and environment is subject to regulatory controls which ask for appropriate practices and which, in turn, can have implications for an organization's capability to manage safety and risk. To address this, Hillson suggests a Risk Maturity Model. Aim: Improvement by benchmarking current practice within organization. Scope: Generic for risk management. Administration: Not mentioned.
Description: Adopting the classification of the Project Management Body of Knowledge (PMBOK), the Berkeley PM process maturity model assesses the effectiveness of project management in organizations. Aim: Benchmarking. Scope: Generic within project management. Administration: Individual completion of questionnaire.
Description: Product development is inherently a collaborative activity, involving both internal groups and external partners. Few firms have all the skills and resources to develop technologically complex products in-house. Although external collaboration is acknowledged to be difficult, the capacity to collaborate successfully has been considered to confer competitive advantage (p. 1499). To ensure good collaboration practice, Fraser et al. developed a collaboration maturity model.
• [Innovation Audit] Items: Product innovation (Generating New Product Concept, Product Innovation Planning, Inventiveness and Creativity); Product development process (the product development process, teamwork and organization, transfer to manufacturing, industrial design).
• [Information Security program maturity grid] Number: 5. Label: Measurement categories. Items: Management Understanding and Attitude; Security Organization Status; Incident Handling; Security Economics; Security Improvement Actions. Levels: Stage I: Uncertainty; Stage II: Awakening; Stage III: Enlightenment; Stage IV: Wisdom; Stage V: Benevolence.
• [PMMM] Number: 42 (9 PMI knowledge areas with 4-6 components for each knowledge area). Label: Components. Items: Project Integration Management (Project Plan Development, Project Plan Execution, Change Control, Project Information System, Project Office); Scope Management (Requirements Definition – Business, Requirements Definition – Technical, Deliverables Identification, Definition, WBS, Change Control); Time Management (Activity Definition, Activity Sequencing, Activity Development, Schedule Control, Schedule Integration); Cost Management (Resource Planning, Assurance, Control, Management Oversight); Quality Management (Planning, Assurance, Control, Management Oversight); Project Human Resource Management (organizational planning, staff acquisition, team development, professional development); Communication Management (planning, information distribution, performance reporting, issues tracking and management); Risk Management (identification,
quantification, response development, control, documentation); Procurement Management (planning, requisition, solicitation/source control, contract management/closure). Levels: Level 1 = Initial Process; Level 2 = Structured Process and Standards; Level 3 = Organizational Standards and Institutionalized Process; Level 4 = Managed Process; Level 5 = Optimizing Process.
• [Risk Maturity Model] Label: Attributes. Items: definition, culture, process, experience, application. Levels: Level 1: Naïve; Level 2: Novice; Level 3: Normalised; Level 4: Natural.
• [Berkeley PM] Number: >50. Label: PM phases (Initiate, Plan, Execute, Control and Close Out) and PM knowledge areas (Integration, Scope, Time, Cost, Quality, Human Resources, Communications, Risk, and Procurement) were used as a basis for the benchmarking tool. Each knowledge area is applied to each phase, and each knowledge area contains a number of questions – more than 50 questions in total. Items: Initiate (Scope Management, Time Management, Cost Management, Quality Management, Human Resources Management, Communication Management, Risk Management, Procurement Management); Define and Organize (ditto); Plan (ditto); Track and Manage (ditto); Closeout (ditto); Project-Driven Organization Environment (ditto). Levels: Level 1 to Level 5.
• [Collaboration Audit] Label: Elements. Items: collaboration strategy, structured development process, system design and task partitioning, partner selection, project initiation, partnership management, partnership development. Levels: Level 1: Ad-Hoc Stage; Level 2: Planned Stage; Level 3: Managed Stage; Level 4: Integrated Stage; Level 5: Sustained Stage.

Maturity grid (continued):
• Integrated Collaborative Design (ICD) [98]
• Design Atlas: A tool for auditing design capability [99]
• Knowledge Management Maturity (KMM) [20]
• The Business Process Maturity Model (BPMM) [100]
• Managing sustainable companies (MaSC Matrix) [101]
• Design Process Audit [68, 73]
• Effective Teamwork Matrix [61]

[Collaboration Audit, continued] Aim: Raising awareness. Scope: Discipline-specific.
Administration: Workshop.
Description: Austin et al. present three maturity grids addressing three different aspects of integrated collaborative design (ICD), namely one for 'maturity assessment of applying process management' (p. 41), 'maturity assessment of adopting supply chain management practices' (p. 42) and 'maturity assessment of establishing value frameworks' (p. 43). They all follow the same structure: six maturity levels are assigned to six key aspects of the area chosen to assess. Aim: Benchmarking. Scope: Discipline-specific. Administration: The assessment is administered using a workshop which should be attended by representatives from a cross-section of the provider's business in a supply chain (p. 135).
Description: Bruce and Bessant [99] assume that design is a business process as opposed to a peripheral or specialist activity and aim to establish 'Total Design Management', making design a part of everyone's concern. They offer a number of tools and techniques with which design can be managed, one of which uses a maturity approach termed the Design Atlas. Aim: Raising awareness; results can then be turned into recommendations for change, improving design capability. Scope: Discipline-specific. Administration: 15 areas of questioning, answered either by an individual or a team.
Description: Kulkarni and St Louis developed an instrument that organizations can use to self-assess their knowledge management maturity (KMM). Aim: Benchmarking. Scope: Generic within knowledge management. Administration: Survey.
Description: The Business Process Maturity Model was developed to identify opportunities for optimization. Aim: Raising awareness. Scope: Generic. Administration: Not mentioned.
Description: There is growing awareness that businesses that adopt sustainable policies can benefit from increased business opportunities and improvements to bottom-line performance. However, many organizations in construction find it difficult to know how to start introducing more sustainable practices, and which aspects to
concentrate on first. This is the motivation behind developing the Managing Sustainable Companies (MaSC) matrix. The matrix aims to help managers to introduce and develop more sustainable business practices in their organizations. Aim: Raise awareness and benchmark within industry sector. Scope: General. Administration: Individual or group completion of matrix.
Description: Good design is important to company success. Yet, especially in small and medium-sized enterprises, design skills are often marginalized. Emphasizing the design process as a component of the wider new product development (NPD) process, the design audit was developed. The tool aims to raise awareness of good design issues and to support managers in improving both products and the design process that delivers them. Aim: Raise awareness and compare within company. Scope: Discipline-specific design. Administration: Group completion of matrix.
Description: Reacting to a report on the UK construction sector, Eclipse Consultants developed a maturity grid to assess effective teamwork. The report concluded that the construction sector does not use successful collaborative working strategies and that this can lead to animosity between consultants over
• [ICD] Number: 7. Label: Practices or key aspects. Items: A: Awareness and understanding of SCM within organization; B: Commitment of senior management; C: Organizational cohesion; D: Provider relationships; E: Receiver relationship; F: Employee commitment; G: Supply chain information exchange. Levels: Don't know; Haven't thought about it; Thinking of doing something about it; Doing it as normal business; Full deployment and improvements.
• [Design Atlas] Number: 15 ("areas of questioning": design capabilities and 2-5 sub-categories). Label: "Areas of questioning". Items: Planning for design (general planning awareness, general planning communication, design planning awareness, design planning thinking, design planning horizons); Processes for design (general
process awareness, design process awareness, design process management, design process tools); Resources for design (general resource allocation and design resource allocation); People for design (design skills, design organization); Culture for design (design commitment and design attitudes). Levels: four unlabeled levels.
• [KMM] Number: 4 key maturity areas (second level not disclosed). Label: Key maturity areas. Items: Lessons learned, Expertise, Data, Structured Knowledge. Levels: Level 1: Possible; Level 2: Encouraged; Level 3: Enabled/Practiced; Level 4: Managed; Level 5: Continuously Improved.
• [BPMM] Label: Levers of change. Items: Strategy, Controls, Process, People, IT. States: Siloed State; Tactically Integrated State; Process Driven State; Optimized Enterprise; Intelligent Operating Network.
• [MaSC] Label: Key aspects. Items: Strategy, Responsibility, Planning, Communication, Implementation, Auditing. Levels: five unlabeled levels.
• [Design Process Audit] Number: 24. Label: Key design activities. Items: Requirements capture (market segmentation, competitive analysis, investigating user needs, ongoing user involvement, product specification); Concept design (concept generation, aesthetic design, ergonomic design, product architecture design, concept evaluation and selection); Implementation (design for manufacture and assembly, prototyping to reduce market risks, prototyping to reduce technical risks, evaluation); Project generation (idea generation and management, creative culture and environment, product strategy, project selection); Project management (product development process, risk management, design reviews, management of design targets and metrics, teamwork, specialist design involvement). Levels: four unlabeled levels.
• [Effective Teamwork Matrix] Label: Key aspects. Items: Team identity, shared vision, communication, collaboration and participation, issue negotiation and resolution, reflection and self-assessment. Levels: Level 1: None/ad-hoc; Level 2: Partial; Level 3: Formal; Level 4: Culturally embedded.
Maturity grid (continued):
• Communication Grid Method (CGM) [63, 72, 90]
• NPD Process Audit (Establishing an NPD Best Practices Framework) [102]
• Manchester Patient Safety Assessment Framework (MaPSaF) [103-105]
• The Process and Enterprise Maturity Model (PEMM) [10]

[Effective Teamwork Matrix, continued] territory or between contractors during the tender process. The Teamwork Matrix aims to improve teamwork and collaborative working between professions, in this case engineers, architects, surveyors, planners and any others involved in contributing to the design of a project. Aim: Raise awareness for teamwork within company. Scope: Generic. Administration: Individual or group completion of matrix.
Description: There is general consensus that effective communication within and between teams avoids problems. Many problems of project management, engineering accidents, and healthcare-associated harm have been attributed to poor communication. However, it is often difficult to ascertain whether communication as such is the problem or whether it is a manifestation of other influences, such as lack of common understanding of goals and objectives or use of differing terminology. Therefore, the Communication Grid Method was developed to increase reflection on factors influencing communication in engineering design and to use them as levers for improvement. Aim: Raising awareness and comparing teams within the organization. Scope: Generic. Administration: Face-to-face interview and/or group workshop.
Description: The search for best practices to manage new product development is ongoing, driven by managers' desire to identify and implement an optimal new product development (NPD) process. The NPD Process Audit proposes a best practices framework. Aim: Benchmarking. Scope: Generic for new product development. Administration: Not mentioned.
Description: Parker developed a maturity grid to help organizations reflect on their progress in developing a mature patient safety culture. This initiative was part of a broader drive towards cultural
change: the idea is that the UK National Health Service (NHS) moves away from a culture of blame to one that is open, fair and continually encouraging improvement. Aim: Raise awareness for safety culture. Scope: Discipline-specific for patient safety culture in health organizations. Administration: Individual or group completion of matrix.
Description: Building on many years of experience and expertise, Hammer suggests a maturity grid approach to assess process and enterprise maturity. A distinction is made between the maturity of a process and the maturity of an enterprise. In each version, four levels of maturity are assigned against thirteen 'enablers' in the process version and 'capabilities' in the enterprise version. Aim: Raise awareness and identify areas that need improvement. Scope: The framework is intended to be applicable to companies in any industry. Administration: Completion of matrix individually or in a team.
• [CGM] Number: 45 (5 areas with 2 categories each and four to six factors respectively). Label: Factors. Items: Product, with Expressing the Product (representation, notation, terminology, requirements) and Media of Communication (email volume, email satisfaction, email content, project team meetings, corridor meetings); Information, with Information Handling (knowledge about information needs, asked for each side of the interface (x2); knowledge about format of information (x2); knowledge about processing of information of the other party (x2)) and Availability of information (about competitors, our company, procedures, product specification, new technology); Team, with Awareness (Do you know what you need to know?,
Handover of tasks, Sequence of tasks in the design process, People involved in the project) and Personal development (education/training, autonomy of task execution, generation of innovative/alternative ideas, best use of individual capabilities); Organization, with Organization structure (hierarchies, activity at interface (with the other party), use of procedures, roles and responsibilities) and Organization culture (handling of technical conflict, handling of personal conflict, transparency of decision making, mutual trust, application of corporate vision and values). Levels: Level A: Status Quo/No action; Level B: Change of action; Level C: Change of action and attitude; Level D: Continuous adaptation.
• [NPD Process Audit] Number: 7. Label: NPD management decisions. Items: strategy, portfolio management, process, market research, people, metrics, performance evaluation. Levels: four unlabeled levels.
• [MaPSaF] Number: (the 2006 paper lists 11; marketing material lists 9). Label: Aspects of safety culture. Items: Benchmarking; Trends and Statistics; Audits and Reviews; Incident/accident reporting; Investigation and analysis; Hazard and unsafe act reports; Work planning including PTW; Journey Management; Contractor management. Levels: Level 1: Pathological; Level 2: Reactive; Level 3: Calculative or bureaucratic; Level 4: Proactive.
• [PEMM] Number: 26 in total (process enablers and enterprise capabilities, with 2-5 sub-categories for enablers and capabilities respectively). Label: Enablers and capabilities. Items: Process enablers: Design (Purpose, Context, Documentation); Performers (Knowledge, Skills, Behavior); Owner (Identity, Activities, Authority); Infrastructure (Information Systems, Human Resource Systems); Metrics (Definition, Uses). Enterprise capabilities: Leadership (Awareness, Alignment, Behavior, Style); Culture (Teamwork, Customer Focus, Responsibility, Attitude Toward Change); Expertise (People, Methodology); Governance (Process Model, Accountability, Integration). Levels: P1/E1; P2/E2;
P3/E3; P4/E4.
• [MaPSaF, continued] Level 5: Generative.

Table 5: Decision points and attributes according to the development phases of a maturity grid

Phase I – Planning:
1) Specify audience – Users (e.g. project member, project leader, change agent, or CEO); improvement entity (e.g. teams, organization, process, or product).
2) Define aim – Raise awareness or best-practice benchmark.
3) Clarify scope – Generic (e.g. energy management) or domain-specific (e.g. energy management in construction).
4) Define success criteria – High-level requirements (e.g. usability, usefulness); specific requirements.

Phase II – Development:
1) Select process areas (components and theoretical framework) – E.g. reference to established body of knowledge; literature survey; expert knowledge; defining goals.
2) Select maturity levels (underlying rationale) – E.g. existence of and adherence to a structured process; alteration of organizational structure; emphasis on people; emphasis on learning.
3) Formulate cell text – Type of formulation: prescriptive or descriptive. Information source: synthesizing viewpoints from future users or comparing practices of a number of organizations. Formulation mechanism: inductively generated from descriptions of practice or deduced from underlying rationale.
4) Define the administration mechanism – Focus on the process of assessment (e.g. face-to-face interviews, workshops) or focus on end results (e.g. survey).

Phase III – Evaluation:
1) Validation – Correspondence between author's intent and user's understanding; correctness of results.
2) Verification – Correspondence with requirements specified.

Phase IV – Maintenance:
1) Check benchmark and adjust description in cells – If applicable.
2) Maintain database of results – If applicable.
3) Document and communicate development process and results – Audience-specific.

Table 6: Evaluation of the roadmap to develop maturity grids and the Communication Grid Method

Guidelines and requirements for the design of an
artifact [12, 36] Evaluation Roadmap Evaluation Communication Grid Method R1 Comparison with existing maturity grids Section II Roadmap as procedural model for the development of new and evaluation of existing maturity grids as new contribution Section IV Maturity grid to assess communication in design as new contribution R2 Iterative Procedure R3 Evaluation Section II and V Development of phases based on literature review and field experience from authors Expert interviews for refinement of individual decision points Section VI Application to case study Application of design guidelines/requirements listed here R4 Multimethodological Procedure Section II and VI Literature research Expert interviews Developer’s experience Industrial case study R5 Identification of Problem Relevance Need for review of existing grids (who have been in the shadow of maturity models) and guidance for development and evaluation Section VI Identification of initial process areas (factors) empirical studies and literature research Iterative development through application in industrial case studies Section VI Iterative development, refinement and evaluation through application in industry, survey of participants, and interview with representatives from academia, industry and consultancies Section VI Literature research Expert interviews Empirical field studies (observation, interviews, survey) Workshops Need for assessment method Inadequate communication often leading to failure of design projects Adequate communication as success factor R6 Problem Definition Development of a roadmap to guide the development of maturity grids and to aid evaluation of existing grids Development of method for the assessment of communication at teaminterfaces in engineering design R7 Targeted Publication of Results Academia: Conference Industry: Presentation of earlier version of this manuscript to practitioners in industry Academia: Conference and peer-reviewed journal articles Industry: Presentations and 
written reports for participating firms.

REFERENCES

[1] M. C. Paulk, "Surviving the Quagmire of Process Models, Integrated Models, and Standards," presented at the ASQ Annual Quality Congress, Toronto, Canada, 2004.
[2] D. Leonard-Barton, "Core Capabilities and Core Rigidities: A Paradox in Managing New Product Development," Strategic Management Journal, vol. 13, pp. 111-125, 1992.
[3] J. W. Moore, "An Integrated Collection of Software Engineering Standards," IEEE Software, Nov./Dec., pp. 51-57, 1999.
[4] C. P. Halvorsen and R. Conradi, "A Taxonomy to Compare SPI Frameworks," presented at the 8th European Workshop on Software Process Technology, Witten, Germany, 2001.
[5] M. C. Paulk, "A Taxonomy for Improvement Frameworks," presented at the World Congress for Software Quality, Bethesda, MD, 2008.
[6] M. C. Paulk, B. Curtis, M. B. Chrissis, and C. V. Weber, "Capability Maturity Model for Software, Version 1.1," Carnegie Mellon University, CMU/SEI-93-TR-024, ESC-TR-93-177, Feb. 1993.
[7] M. C. Paulk, C. V. Weber, B. Curtis, and M. B. Chrissis, The Capability Maturity Model: Guidelines for Improving the Software Process. Boston: Addison-Wesley, 1995.
[8] M. B. Chrissis, M. Konrad, and S. Shrum, CMMI: Guidelines for Process Integration and Product Improvement. Boston: Addison-Wesley, 2003.
[9] B. Curtis, W. E. Hefley, and S. Miller, The People Capability Maturity Model: Guidelines for Improving the Workforce. Reading, MA: Addison-Wesley Longman, 2002.
[10] M. Hammer, "The Process Audit," Harvard Business Review, pp. 111-121, 2007.
[11] K. P. Grant and J. S. Pennypacker, "Project Management Maturity: An Assessment of Project Management Capabilities Among and Between Selected Industries," IEEE Transactions on Engineering Management, vol. 53, pp. 59-68, 2006.
[12] J. Becker, R. Knackstedt, and J. Pöppelbuß, "Developing Maturity Models for IT Management: A Procedure Model and its Application," Business and Information Systems Engineering, vol. 1, pp. 213-222, 2009.
[13] M. Kohlegger, R. Maier, and S. Thalmann, "Understanding Maturity Models: Results of a Structured Content Analysis," presented at I-KNOW '09 and I-SEMANTICS '09, Graz, Austria, 2009.
[14] T. de Bruin and M. Rosemann, "Understanding the Main Phases of Developing a Maturity Model," presented at the 16th Australasian Conference on Information Systems, Sydney, 2005.
[15] M. Siponen, "Towards maturity of information security maturity criteria: six lessons learned from software maturity criteria," Information Management and Computer Security, vol. 10, pp. 210-224, 2002.
[16] L. G. Pee and A. Kankanhalli, "A Model of Organisational Knowledge Management Maturity Based on People, Process, and Technology," Journal of Information and Knowledge Management, vol. 8, pp. 79-99, 2009.
[17] C. Ibbs and Y. Kwak, "Assessing project management maturity," Project Management Journal, vol. 31, pp. 32-43, 2000.
[18] J. S. Pennypacker and K. P. Grant, "Project Management Maturity: An Industry Benchmark," Project Management Journal, vol. 34, pp. 4-11, 2003.
[19] M. Mullaly, "Longitudinal Analysis of Project Management Maturity," Project Management Journal, vol. 36, pp. 62-73, 2006.
[20] U. Kulkarni and R. St. Louis, "Organisational self assessment of knowledge management maturity," presented at the 9th Americas Conference on Information Systems, 2003.
[21] T. Mettler and P. Rohner, "A Design Science Research Perspective on Maturity Models in Information Systems," University of St. Gallen, St. Gallen, BE IWI/HNE/03, May 2009.
[22] R. Phaal, C. J. P. Farrukh, and D. R. Probert, "Technology management tools: concept, development and application," Technovation, vol. 26, pp. 336-344, 2005.
[23] Department of Health, "Achieving Excellence Design Evaluation Toolkit (AEDET Evolution)," 2008.
[24] V. P. Kochikar, "The Knowledge Management Maturity Model: A Staged Framework for Leveraging Knowledge," 2000.
[25] K. Ehms and M. Langen, "Reifemodelle im KM-Consulting: Erfahrungen aus Jahren Beratungspraxis," presented at KnowTech 2004, München, 2004.
[26] P. Chinowsky, K. Molenaar, and A. Realph, "Learning Organizations in Construction," Journal of Management in Engineering, Jan. 2007, pp. 27-33.
[27] J. Mankins, "Technology readiness levels: A White Paper," NASA, Washington, DC, 1995.
[28] J. C. Mankins, "Technology readiness assessments: A retrospective," Acta Astronautica, vol. 65, pp. 1216-1223, 2009.
[29] J. E. Ramirez-Marquez and B. Sauser, "System Development Planning via System Maturity Optimization," IEEE Transactions on Engineering Management, vol. 56, pp. 533-548, 2009.
[30] R. Foster, Innovation: The Attacker's Advantage. New York: Summit Books, 1986.
[31] C. M. Christensen, "Exploring the limits of the technology S-Curve. Part I: Component technologies," Production and Operations Management, vol. 1, pp. 334-357, 1992.
[32] R. Nolan, "Managing the Computer Resource: A Stage Hypothesis," Communications of the ACM, vol. 16, pp. 399-405, 1973.
[33] L. E. Greiner, "Evolution and revolution as organizations grow," Harvard Business Review, pp. 37-46, 1972.
[34] N. Cross, Engineering Design Methods: Strategies for Product Design. London: John Wiley & Sons, 2008.
[35] N. Cross, "Designerly Ways of Knowing: Design discipline versus design science," Design Issues, vol. 17, pp. 49-55, 2001.
[36] A. R. Hevner, S. T. March, J. Park, and S. Ram, "Design Science in Information Systems Research," MIS Quarterly, vol. 28, pp. 75-105, 2004.
[37] K. M. Eisenhardt, "Building theories from case study research," Academy of Management Review, vol. 14, pp. 532-550, 1989.
[38] M. Bloor, "Techniques of Validation in Qualitative Research: a Critical Commentary," in Context and Method in Qualitative Research, G. Miller and R. Dingwall, Eds. London: Sage, 1997, pp. 37-50.
[39] Oxford English Dictionary Online, "Mature," 2009.
[40] Oxford English Dictionary Online, "Maturing," 2009.
[41] K. Dooley, A. Subra, and J. Anderson, "Maturity and its Impact on New Product Development Project Performance," Research in Engineering Design, vol. 13, pp. 23-39, 2001.
[42] J. K. Crawford, Project Management Maturity Model: Providing a Proven Path to Project Management Excellence. New York: Marcel Dekker, 2002.
[43] J. K. Crawford, Project Management Maturity Model, 2nd ed. Boca Raton: Auerbach Publications, Taylor and Francis Group, 2007.
[44] Project Management Institute, A Guide to the Project Management Body of Knowledge. Newtown Square, PA: Project Management Institute, 2000.
[45] T. J. Cooke-Davies and A. Arzymanow, "The maturity of project management in different industries: An investigation into variations between project management models," International Journal of Project Management, vol. 21, pp. 471-478, 2003.
[46] H. Kerzner, Using the Project Management Maturity Model: Strategic Planning for Project Management, 2nd ed. Hoboken: John Wiley and Sons, 2005.
[47] R. Gareis and M. Huemann, "Maturity models for the project-oriented company," in Gower Handbook of Project Management, J. R. Turner, Ed., 4th ed. Aldershot: Gower, 2008, pp. 183-208.
[48] E. S. Andersen and S. A. Jessen, "Project maturity in organisations," International Journal of Project Management, vol. 21, pp. 457-461, 2003.
[49] Y. H. Kwak and C. W. Ibbs, "Calculating Project Management's Return on Investment," Project Management Journal, vol. 31, pp. 38-47, 2000.
[50] G. Skulmoski, "Project maturity and competence interface," Cost Engineering, vol. 43, pp. 11-18, 2001.
[51] Project Management Institute, "Organizational Project Management Maturity Model (OPM3)," 2005.
[52] E. T. Penrose, The Theory of the Growth of the Firm. New York: Wiley, 1959.
[53] B. Wernerfelt, "A Resource-based View of the Firm," Strategic Management Journal, vol. 5, pp. 171-180, 1984.
[54] J. B. Barney, "Firm resources and sustained competitive advantage," Journal of Management, vol. 17, pp. 99-120, 1991.
[55] K. M. Eisenhardt and C. B. Schoonhoven, "Resource-based View of Strategic Alliance Formation: Strategic and Social Effects in Entrepreneurial Firms," Organization Science, vol. 7, pp. 136-150, 1996.
[56] D. Ulrich and N. Smallwood, Why the Bottom Line Isn't!: How to Build Value Through People and Organization. New York: John Wiley & Sons, 2003.
[57] D. Ulrich and N. Smallwood, "Capitalizing on Capabilities," Harvard Business Review, vol. 82, pp. 118-127, 2004.
[58] D. Teece, G. Pisano, and A. Shuen, "Dynamic Capabilities and Strategic Management," Strategic Management Journal, vol. 18, pp. 509-533, 1997.
[59] A. Fincher, "Project Management Maturity Model," presented at the 28th Annual Seminars and Symposium, Chicago, IL, 1997.
[60] R. Szakonyi, "Measuring R&D Effectiveness - II," Research-Technology Management, May-June, pp. 44-55, 1994b.
[61] Constructing Excellence, Effective Teamwork: A Best Practice Guide for the Construction Industry. Watford, 2004.
[62] J. E. Strutt, J. V. Sharp, E. Terry, and R. Miles, "Capability maturity models for offshore organisational management," Environment International, vol. 32, pp. 1094-1105, 2006.
[63] A. M. Maier, C. M. Eckert, and P. J. Clarkson, "Identifying requirements for communication support: A maturity grid-inspired approach," Expert Systems with Applications, vol. 31, pp. 663-672, 2006.
[64] C. Argyris and D. Schön, Organizational Learning: A Theory of Action Perspective. Reading, MA: Addison-Wesley, 1978.
[65] C. Argyris and D. A. Schön, Organizational Learning II: Theory, Method and Practice. Reading, MA: Addison-Wesley, 1978b.
[66] K. El Emam and D. R. Goldenson, "An Empirical Review of Software Process Assessments," National Research Council of Canada, Institute for Information Technology, Nov. 1999.
[67] M. C. Paulk, S. Guha, W. E. Hefley, E. B. Hyder, and M. Iqbal, "Comparing the eSCM-SP and CMMI: A comparison between the eSourcing Capability Model for Service Providers v2 and the Capability Maturity Model Integration v1.1," Carnegie Mellon University, IT Services Qualification Center, CMU-ITSQC-05-005, Dec. 2005.
[68] J. Moultrie, "Development of a Design Audit Tool for SMEs," Journal of Product Innovation Management, vol. 24, pp. 335-368, 2007.
[69] J. Wilson, "Communication artifacts: The design of objects and the object of design," in Design and the Social Sciences: Making Connections, J. Frascara, Ed. London: Taylor and Francis, 2002, pp. 24-32.
[70] V. Chiesa, P. Coughlan, and C. Voss, "Development of a technical innovation audit," Journal of Product Innovation Management, vol. 13, pp. 105-136, 1996.
[71] R. Szakonyi, "Measuring R&D Effectiveness - I," Research-Technology Management, March-April, pp. 27-32, 1994a.
[72] A. M. Maier, M. Kreimeyer, U. Lindemann, and P. J. Clarkson, "Reflecting Communication: A key factor for successful collaboration between embodiment design and simulation," Journal of Engineering Design, vol. 20, pp. 265-287, 2009.
[73] J. Moultrie, "Development of a design audit tool to assess product design capability," Institute for Manufacturing, University of Cambridge, 2004.
[74] P. Fraser, C. Farrukh, and M. Gregory, "Managing product development collaborations: a process maturity approach," Journal of Engineering Manufacture, Proc. Instn Mech. Engrs, Part B, vol. 217, pp. 1499-1519, 2003.
[75] P. Fraser, J. Moultrie, and M. Gregory, "The use of maturity models/grids as a tool in assessing product development capability," presented at the IEEE International Engineering Management Conference (IEMC), Cambridge, UK, 2002.
[76] A. M. Maier, "A grid-based assessment method of communication in engineering design," Engineering Department, University of Cambridge, 2007.
[77] B. Kitchenham and L. Pickard, "Case Studies for Method and Tool Evaluation," IEEE Software, July 1995, pp. 52-62.
[78] W. R. Roberts, "Rhetorica," in The Works of Aristotle, vol. XI, W. D. Ross, Ed. Oxford: Oxford University Press, 1936.
[79] S. W. Littlejohn, Theories of Human Communication. Belmont, CA: Wadsworth Publishing, 1999.
[80] R. T. Craig, "Communication Theory as a Field," Communication Theory, vol. 9, pp. 119-161, 1999.
[81] K. Miller, Communication Theories: Perspectives, Processes, and Contexts, 2nd ed. New York: McGraw-Hill, 2005.
[82] J. A. Anderson, Communication Theory: Epistemological Foundations. New York: The Guilford Press, 1996.
[83] L. L. Bucciarelli, "Reflective practice in engineering design," Design Studies, vol. 5, pp. 185-190, 1984.
[84] L. L. Bucciarelli, Designing Engineers. Cambridge, MA: MIT Press, 1994.
[85] N. Luhmann, Social Systems. Stanford: Stanford University Press, 1995.
[86] K. Krippendorff, "Communication and the genesis of structure," General Systems, vol. 16, pp. 171-185, 1971.
[87] E. von Glasersfeld, "Konstruktion der Wirklichkeit und der Begriff der Objektivität," in Einführung in den Konstruktivismus, H. Gumin and A. Mohler, Eds. München, 1985, pp. 1-26.
[88] S. H. Pfleeger, N. Fenton, and S. Page, "Evaluating software engineering standards," IEEE Computer, vol. 27, pp. 71-79, 1994.
[89] P. B. Crosby, Quality is Free: The Art of Making Quality Certain. New York: Penguin, 1979.
[90] A. M. Maier, M. Kreimeyer, C. Hepperle, C. M. Eckert, U. Lindemann, and P. J. Clarkson, "Exploration of Correlations between Factors Influencing Communication in Complex Product Development," Concurrent Engineering: Research and Applications, vol. 16, pp. 37-59, 2008.
[91] R. Hughes, "The Safety Management Maturity Grid," Professional Safety, vol. 30, pp. 15-18, 1985.
[92] R. A. Radice, J. T. Harding, P. E. Munnis, and R. W. Philips, "A programming process study," IBM Systems Journal, vol. 24, pp. 91-101, 1985.
[93] Building Research Energy Conservation Support Unit (BRECSU), "Reviewing Energy Management," Building Research Establishment, Watford, Jan. 1993.
[94] J. Hackos, "The Information Process Maturity Model: A 2004 Update," Best Practices, vol. 6, 2004.
[95] M. E. McGrath and C. L. Akiyama, "PACE: An Integrated Process for Product And Cycle-time Excellence," in Setting the PACE in Product Development: A Guide to Product And Cycle-Time Excellence, M. E. McGrath, Ed., rev. ed. Boston: Butterworth-Heinemann, 1996, pp. 17-30.
[96] T. R. Stacey, "The information security program maturity grid," Information Systems Security, vol. 5, 1996.
[97] D. A. Hillson, "Towards a Risk Maturity Model," The International Journal of Project and Business Risk Management, vol. 1, pp. 35-45, 1997.
[98] S. Austin, A. Baldwin, J. Hammond, M. Murray, D. Root, D. Thomson, and A. Thorpe, Design Chains: A Handbook for Integrated Collaborative Design. London: Thomas Telford, 2001.
[99] M. Bruce and J. Bessant, Design in Business: Strategic Innovation Through Design. London: Financial Times Press, 2002.
[100] D. M. Fisher, "The Business Process Maturity Model: A Practical Approach to Identifying Opportunities for Optimization," BPTrends, Sept. 2004.
[101] R. Woodall, I. Cooper, D. Crowhurst, M. Hadi, and S. Platt, "MaSC: managing sustainable companies," Engineering Sustainability, vol. 157, pp. 15-21, 2004.
[102] K. B. Kahn, G. Barczak, and R. Moss, "Perspective: Establishing an NPD Best Practices Framework," Journal of Product Innovation Management, vol. 23, pp. 106-116, 2006.
[103] D. M. Ashcroft, C. Morecroft, D. Parker, and P. R. Noyce, "Safety culture assessment in community pharmacy: development, face validity, and feasibility of the Manchester Patient Safety Assessment Framework," Quality and Safety in Health Care, vol. 14, pp. 417-421, 2005.
[104] D. Parker, M. Lawrie, and P. Hudson, "A framework for understanding the development of organisational safety culture," Safety Science, vol. 44, pp. 551-562, 2006.
[105] D. Parker, M. Lawrie, J. Carthey, and M. Coultous, "The Manchester Patient Safety Framework: sharing and learning," Clinical Risk, vol. 14, pp. 140-142, 2008.