Dec 10, 2008

Architecture CoE evaluation - Part 1

Here are some of my thoughts on Architecture CoE's performance this year. I'll discuss what went well and what didn't go so well in separate posts.

Things that didn't go well

- We were not fearless. We failed to take bold steps to achieve great things; for example, we tolerated the pain from our heavyweight survey platform instead of rejecting it

- We were not thorough. We released 3-5 applications that needed hotfixes because we broke existing functionality. Yes, it's a small percentage (of 120+ total releases), but nothing short of perfection is expected from us in maintenance releases. We cannot shortchange regression testing (a case for automated testing, and for manual regression testing while our tests aren't fully automated)

- We were not focused. We spread our attention across many things (Presentation Service, Common Service, Reporting Service, real-life testing, improving our design, automated build/test/deploy/CI of Java apps, Ruby/LAMP) and could not make much progress in half of these areas

I've asked our external coach, Neal Ford, for an outside perspective on the CoE. He will share his SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis with us in January 2009.

Your thoughts/comments?