Self Service A Complete Guide - 2020 Edition. Gerardus Blokdyk
46. What measurements are being captured?
<--- Score
47. How best to implement Analytic Applications for high user adoption and self-service?
<--- Score
48. What are the key input variables? What are the key process variables? What are the key output variables?
<--- Score
49. Is a focused external follow-up Self-service review required?
<--- Score
50. Is there a Performance Baseline?
<--- Score
51. How do you know that any Self-service analysis is complete and comprehensive?
<--- Score
52. How do you measure efficient delivery of Self-service services?
<--- Score
53. Is Process Variation Displayed/Communicated?
<--- Score
54. Do all call center inquiries have to be handled by telephone, or can self-service methods be used via the web or other resources to reduce headcount and cost?
<--- Score
55. Are you ready to learn more about how your organization can use self-service analytics to increase the effectiveness of analytics programs?
<--- Score
56. What is your path to a single source of truth and true self-service BI when your current landscape has multiple analytics solutions and a high TCO?
<--- Score
57. Are there measurements based on task performance?
<--- Score
58. How do you leverage empowered analytics (self-service) to change the way your organization works?
<--- Score
59. How does self-service BI influence the lifecycle of BI, in terms of analysis and design?
<--- Score
60. What is the impact on your self-service strategy?
<--- Score
61. When asked, what impact does self-service have on your customers?
<--- Score
62. How much will the self-service portal cost up front and annually?
<--- Score
63. How will success or failure be measured?
<--- Score
64. Do your organization's IT and BI teams and development experts provide guidance to users of self-service BI and analytics technologies?
<--- Score
65. How should the self-service technology features be prioritized?
<--- Score
66. Was a data collection plan established?
<--- Score
67. Is data collected on key measures that were identified?
<--- Score
68. Where can you go to verify the information?
<--- Score
69. What is an unallowable cost?
<--- Score
70. What are the agreed upon definitions of the high impact areas, defect(s), unit(s), and opportunities that will figure into the process capability metrics?
<--- Score
71. Have you deployed self-service/automated tools to optimize your IT operational cost?
<--- Score
72. Does management have the right priorities among projects?
<--- Score
73. Is key measure data collection planned and executed, process variation displayed and communicated, and performance baselined?
<--- Score
74. How will costs be allocated?
<--- Score
75. Which self-service technology are you focusing on?
<--- Score
76. How advanced and coherent is the data architecture in support of a self-service analytics initiative?
<--- Score
77. Do you aggressively reward and promote the people who have the biggest impact on creating excellent Self-service services/products?
<--- Score
78. Among the Self-service product and service costs to be estimated, which is considered the hardest to estimate?
<--- Score
79. Which identified key measures indicate the performance of the stakeholder process?
<--- Score
80. What impact has using self-service for customer service currently had on your organization's ability to address priorities?
<--- Score
81. Is it cost effective to use a self-service system because of the consequent reduction in staff costs?
<--- Score
82. Are key measures identified and agreed upon?
<--- Score
83. Who participated in the data collection for measurements?
<--- Score
84. Have you found any ‘ground fruit’ or ‘low-hanging fruit’ for immediate remedies to the gap in performance?
<--- Score
85. Which Self-service impacts are significant?
<--- Score
86. Are high impact defects defined and identified in the stakeholder process?
<--- Score
87. What does self-service really cost?
<--- Score