Software Reliability A Complete Guide - 2020 Edition. Gerardus Blokdyk


Where can you gather more information?

      <--- Score

      105. Why are you doing Software reliability and what is the scope?

      <--- Score

      106. Are the required metrics defined, and if so, what are they?

      <--- Score

      107. What information do you gather?

      <--- Score

      108. How does the Software reliability manager ensure against scope creep?

      <--- Score

      109. What are the compelling stakeholder reasons for embarking on Software reliability?

      <--- Score

      110. Has a high-level ‘as is’ process map been completed, verified and validated?

      <--- Score

      111. What is the scope of sensitive information?

      <--- Score

      112. What is the scope of Software reliability?

      <--- Score

      113. What are the roles and responsibilities for each team member and the team’s leadership? Where is this documented?

      <--- Score

      114. Is Software reliability linked to key stakeholder goals and objectives?

      <--- Score

      115. What specifically is the problem? Where does it occur? When does it occur? What is its extent?

      <--- Score

      116. Are roles and responsibilities formally defined?

      <--- Score

      117. When are meeting minutes sent out? Who is on the distribution list?

      <--- Score

      118. What would be the goal or target for the Software reliability improvement team?

      <--- Score

      119. Do the problem and goal statements meet the SMART criteria (specific, measurable, attainable, relevant, and time-bound)?

      <--- Score

      120. When is the estimated completion date?

      <--- Score

      121. Has anyone else (internal or external to the group) attempted to solve this problem or a similar one before? If so, what knowledge can be leveraged from these previous efforts?

      <--- Score

      122. What is the scope of the Software reliability effort?

      <--- Score

      123. Has your scope been defined?

      <--- Score

      124. What is the definition of Software reliability excellence?

      <--- Score

      125. What is the context?

      <--- Score

      126. What is, or was, the Software reliability start date?

      <--- Score

      127. What are the boundaries of the scope? What is in bounds and what is not? What is the start point? What is the stop point?

      <--- Score

      128. How do you manage scope?

      <--- Score

      129. Who are the Software reliability improvement team members, including Management Leads and Coaches?

      <--- Score

      130. What knowledge or experience is required?

      <--- Score

      131. Are approval levels defined for contracts and supplements to contracts?

      <--- Score

      132. Has everyone on the team, including the team leaders, been properly trained?

      <--- Score

      133. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?

      <--- Score

      134. Is the Software reliability scope complete and appropriately sized?

      <--- Score

      135. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?

      <--- Score

      136. In what way can you redefine the criteria of choice clients have in your category in your favor?

      <--- Score

      137. Will a Software reliability production readiness review be required?

      <--- Score

      138. What was the context?

      <--- Score

      Add up total points for this section: _____ = Total points for this section

      Divided by: ______ (number of statements answered) = ______ Average score for this section

      Transfer your score to the Software reliability Index at the beginning of the Self-Assessment.

      CRITERION #3: MEASURE:

      INTENT: Gather the correct data. Measure the current performance and evolution of the situation.

      In my belief, the answer to this question is clearly defined:

      5 Strongly Agree

      4 Agree

      3 Neutral

      2 Disagree

      1 Strongly Disagree

      1. What could cause delays in the schedule?

      <--- Score

      2. How do you verify the authenticity of the data and information used?

      <--- Score

      3. At what cost?

      <--- Score
