Listen to a couple of Wise guys, Mike and David, and Steve, discuss the golden age of architecture, the benefits of, and problems with, Agile methodologies, and the value of critical thinking.

Recorded: Friday 18th March, 2016
Hosts: Mike Wise, Steve Rogers
Running Time: 1:16:14
Special Guest: David Wise

About David

  • Works for the consultancy arm of a large multinational IT company.
  • 4 years as Manager of Architecture / Enterprise Architecture for 6 large QLD state government agencies.
  • 7 years as Manager of Development / Architecture for Melbourne IT under the WebCentral brand; in the USA it was Verisign.
  • Worked in banking for 20 years at what became Suncorp. Started as a cashier, then jumped across to the mainframe, creating SAS and Focus datasets using MVS and JCL.
  • While at Suncorp, built the first ‘real’ web-developed MIS systems in ColdFusion.
  • Qualifications in Business and IT.
  • Certified in TOGAF, PRINCE2, COBIT and ITIL.
  • Ran the Macromedia group for a couple of years.
  • Managed dev teams, the biggest spanning three continents and four cities, then jumped across to architecture.
  • Golden Age of architecture (c:


 Management vs Engineering 

“Estimation process is part of the root cause”

The Wicked Problem: difficult or impossible to solve because of incomplete, contradictory, and changing requirements that are often difficult to recognize.

Conflict caused by two opposing forces who are in violent agreement

Impact of Cognitive bias 

Why do projects fail (Darryl Carlton)

  • The Dunning-Kruger Effect (1999) postulated that those who have no competence in a domain or skill are not only incapable of understanding what is required, they are also not equipped to recognize competence in others.
  • The Nash Equilibrium. Basically and simplistically, what the Nash Equilibrium says is that all parties need to subvert self-interest for the common good if each party is to maximize its opportunity for a positive outcome.
  • “Obedience to Authority” – the social psychology research by Stanley Milgram demonstrated that 65% of participants would continue to follow instructions even when their own instincts told them it was the wrong thing to do.

Planning phase

On Agile

  • PERT: My Estimate = (Optimistic + 4 × Typical + Pessimistic) / 6.
  • Standard deviation of time: the variability of the time for accomplishing an activity.
  • Break down the estimates like a Fermi estimate.
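The PERT formula above can be sketched in a few lines of Python. This is a hypothetical helper, not something from the episode; the standard-deviation rule of thumb ((Pessimistic − Optimistic) / 6) is the one commonly paired with PERT.

```python
# Hypothetical PERT three-point estimation helpers.

def pert_estimate(optimistic, typical, pessimistic):
    """Weighted average: (O + 4M + P) / 6."""
    return (optimistic + 4 * typical + pessimistic) / 6

def pert_std_dev(optimistic, pessimistic):
    """Rough variability of the activity time: (P - O) / 6."""
    return (pessimistic - optimistic) / 6

# "If everything goes well = 3 weeks, if things go wrong = 13 weeks,
#  but 6 weeks seems reasonable."
estimate = pert_estimate(3, 6, 13)  # ~6.7 weeks
spread = pert_std_dev(3, 13)        # ~1.7 weeks
```

The weighting deliberately pulls the answer toward the "typical" case while still letting the pessimistic tail drag the estimate up, which is why it tends to land higher than a gut-feel single number.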


  • Try to do it in writing: Making decisions in face-to-face meetings may not work well for all participants (especially with non-native speakers). When you can write out the reasons why you think Task A will take X days and Task B will take Y days, you get the opportunity to think about things more thoroughly.
  • Always document your concerns: If your PM gives you an impossible deadline, always send a politely worded e-mail to the PM saying that you’ll do your best but you are not entirely sure that you can finish in the time allotted. This may seem like just protecting yourself (which it might do) but it really is part of the history of the project and it needs to be documented at some point. Lessons Learned / Project Reviews can actually help other people in the future.
  • Use a formal methodology: If you can explain your estimate with some sort of methodology, your recommendation might carry more weight. I find that "if everything goes well = X weeks, but if things go wrong = X+10 weeks, but X+3 weeks seems reasonable" is a great place to start. Many PERT books use that as the basis of a formula: (Optimistic + 4x Typical + Pessimistic)/6 = My Estimate. It wouldn’t hurt to try it a couple of times and see if it gets more traction.
  • Use the risk register.
  • Keep calm; 10 breaths do work.
  • At the beginning, establish how you are going to work.
  • Establish what the MVP (Minimum Viable Product) or Minimal Shippable Product looks like.
  • Build a prototype; it gives the sales people something.
  • Work out your persona and build to that person.
  • Start simple: arse-less, top-less apps.
  • Wireframe the workflow.
  • Establish who is doing what, and when.

 Gall’s Law

Uncertainty ensures you will never be able to anticipate all of these interdependencies and variables in advance, so a complex system built from scratch will continually fail in all sorts of unexpected ways.

Gall’s Law is where environmental Selection Tests meet systems design. If you want to build a system that works, the best approach is to build a simple system that meets the environment’s current selection tests first, then improve it over time. Over time, you’ll build a complex system that works.

Gall’s Law is why Prototyping and Iteration work so well as a value-creation methodology. Instead of building a complex system from scratch, building a prototype is much easier: it’s the simplest possible creation that will help you verify that your system meets critical selection tests.

Expanding that prototype into a Minimum Viable Offer allows you to validate your Critical Assumptions, resulting in the simplest possible system that can succeed with actual purchasers.

Iteration and Incremental Augmentation, over time, will produce extremely complex systems that actually work, even as the environment changes. If you want to build a system that works from scratch, violate Gall’s Law at your peril.

What Is ‘Uncertainty’? (Systems)

The difference between Uncertainty and Risk is that Risks are known unknowns: you know what might happen. Uncertainties are unknown unknowns: there’s no way to expect that they could happen.

You can’t know whether something unexpected will occur; all you can do is remain flexible, prepared and Resilient so you can react properly.

Accepting Uncertainty is accepting the real world, instead of coming up with a nicer one that doesn’t exist. Don’t rely on making predictions. Plan for flexibility instead.

Ambiguity Effect

The ambiguity effect is a cognitive bias where decision making is affected by a lack of information, or “ambiguity”. The effect implies that people tend to select options for which the probability of a favorable outcome is known over an option for which the probability of a favorable outcome is unknown. The effect was first described by Daniel Ellsberg in 1961.