Gordon Haff is a Red Hat technology evangelist and a frequent, highly acclaimed speaker at customer and industry events, focused on areas including Red Hat Research, open source adoption, and emerging technologies. He is the author of How Open Source Ate Software (Apress) and co-author of Pots and Vats to Computers and Apps: How Software Learned to Package Itself, in addition to numerous other publications. Prior to Red Hat, Gordon wrote hundreds of research notes, was frequently quoted on a wide range of IT topics in publications such as The New York Times, and advised clients on product and marketing strategies. Earlier in his career, at Data General, he was responsible for bringing a wide range of computer systems, from minicomputers to large UNIX servers, to market. Gordon has engineering degrees from MIT and Dartmouth and an MBA from Cornell’s Johnson School.
Gordon Haff (He/Him/His)
Authored Comments
I don't want to put words in Sarah's mouth, but I took her comment to mean that at least some people react negatively to what has often been the historical approach to de jure standards adoption: a long process, very political, and perhaps not very useful at the end of the day. There are other facets too: standardizing before a space has stabilized, using so-called "standards" to lock out competitors, and so on. Overall, I think the prevailing view today is that standards should often develop organically from the work of a community of contributors and users.
To add to Bryan's point, I don't say a lot about speed, but it is implicit in much of the rest. Failure is difficult both financially and culturally when it requires writing off investments that have consumed a lot of money and time.
It's definitely important to be able to tolerate and mitigate small-scale production failures. Hence canary deployments and the like (and Netflix's Chaos Monkey).