As far back as I can remember, I wanted to be an architect. There was no epiphany that led me to the decision. No one in my family was in construction, let alone architecture. In fact, I don’t think I even knew an architect. I often wonder what led me to make architecture my lifelong passion. It wasn’t the round glasses, the bowtie, or even the black turtleneck. So what was it then?
The image of the architect
Maybe it was growing up in the 80s and 90s and watching numerous movies and TV shows where the main protagonist was an architect. Think Mike Brady (Robert Reed) of The Brady Bunch, Martin Kelly (Robert Hughes) of Hey Dad, and Sam Baldwin (Tom Hanks) in Sleepless in Seattle.1 They made working from home cool, long before remote working became a thing. It never dawned on me at the time, but all of these characters also happened to be widowed. If there was any doubt about what an architect was, we could simply turn to the TV and rest assured that architects were safe, reliable, resilient, and somewhat tortured. Not to mention male, pale and stale, but it was the 80s, after all.
The other contributing factor which I believe led me to architecture is that I grew up in a 70s house long after it was fashionable. I’m talking green carpet and green floral wallpaper as far as the eye could see. Furnished with bucket chairs and my father’s reclining chair and footrest in green velvet. And no 70s house is complete without an entertaining area. So naturally, we had a green laminated bar with matching green vinyl bar stools to retreat to after swimming in the green-tinted, resort-style pool. It was epic.
Years later, my architect friends would come over and be in awe, like they had just walked into some sort of time machine. I knew the power spaces could have, and I wanted to make architecture that improved people’s lives. And if I could do that from the comfort of my home office and be seen by society as a respected professional, all the better. Just no green interiors. Not ever.
Fast forward several years, and I am sitting in a cold, empty classroom in central London. I’m at the University of Central London, and the classroom is filled with the usual clutter that accompanies past architecture studios – a sort of design model graveyard. After years of smashing into the glass ceiling that is ‘Part II architectural assistant’, I had decided it was time to become a registered architect. Sat across from me are two peers. Much like Gandalf, ready to proclaim, “You shall not pass!”, they were the gatekeepers to the architectural profession, and I was at their mercy. After seven years of architectural education, three professional degrees, 10,000 plus hours of professional experience, a 12,000-word case study, and an epic three-hour written exam, it all came down to this – a 30-minute oral interview to see if I was worthy of admission into the profession.
An industry struggling with BIM
I was ready. Despite reading the entire Harry Potter series in parallel to my study, I knew my JCT from my CDM and my section 106 from my torts. It seems ironic then that so much of the outcome of the interview would be determined by BIM. After a bit of back-and-forth banter, the conversation turned to BIM. My examiners asked me if I would charge a client for using BIM.
To put this in context, the year was 2011, and there was a lot of professional whinging going on. UK government BIM mandates, the argument went, would put small practices out of business because of the exorbitant costs of hardware upgrades and the necessary upskilling. The industry was being dragged into the 21st century, and they weren’t happy about it. I was an early adopter of BIM and had a post-professional degree in advanced digital design. If they wanted to quiz me on BIM, bring it on. I could talk all day on the topic, which meant less time answering the difficult questions.
Maybe someone had used the Imperius Curse and taken control of me, because I responded, “No, I would not charge a client for using BIM”. What I may as well have said is, “I have no respect for myself or the architectural profession.” In the eyes of the profession, it was heresy to say that you would agree to a deliverable without charging. Only two years had passed since the Royal Institute of British Architects (RIBA) abolished professional fee scales, and the topic was still highly charged.2 The industry was premised on a fee-for-service model, and without the collective safety net that comes with professional fee scales, providing a service without a fee was the beginning of the end of the profession. I needed to explain my position, and fast.
But not before I poked the bear. I asked my examiner if she charged her clients for using Microsoft Office. No? Why not? The client didn’t pay for it. Why not give them handwritten notes? What about AutoCAD, or MicroStation as was more common at the time in the UK, do you charge clients extra for using this technology? No? Why not?
My comment was absurd. But so, too, was the question. It highlighted the profession’s absurd biases as it applies to technology. I explained how in my experience, BIM improved documentation quality, reduced errors and ultimately, made me more productive. It was a win-win scenario. So why, then, would I choose not to adopt the technology? At what point does technological change become problematic simply because it can’t be charged to a client?
Then came round two. The examiner went on to explain how she believed the most effective way to review documentation was not via BIM but rather to print out all coordination drawings and go through them one by one marking up any clashes. What was my opinion on that? How could coordination and clash detection possibly occur in a 3D model? It was time for a truth bomb.
Clearly still under the Imperius Curse, I explained that technically savvy architects these days are more than capable of orbiting a 3D model and resolving clashes. Rather than debating the specifics of how something is done, isn’t it more important to understand if and why something is done? The old adage that it’s not what you say, but how you say it, held true. To the examiners’ credit, they realised my comments were not due to ignorance, but rather a deep appreciation of the topic and the desire to improve the industry. I was finally an architect…at least in the UK.
Six years later, I had the privilege of going through the whole process again, this time for the Architects Registration Board in Australia. This time I would be on my best behaviour. No poking the bear. No truth bombs.
And so the oral interview rolled around, and right off the bat, one of the examiners says to me: “A client rings you up about a potential commission. What do you do?” I told them that I would invite them into my office for an initial meeting to talk through the project and their requirements. Easy peasy, lemon squeezy. Or so I thought. My examiners, on the other hand, looked dumbfounded. My crime?
They wanted to know why I wouldn’t go to the client’s home for the meeting. After a bit of back and forth, it became evident that the confusion stemmed from different perspectives about how an architect should act in that situation. Coming from a background of large architectural practices, the office was often used as a tool to wow potential clients. Photographs of past projects were on the wall, and the client could get a behind-the-scenes look at where the magic happened. My examiners, on the other hand, were sole practitioners, and the majority of commissions were private houses. In their mind, visiting the site was an important first step in understanding the project (but interestingly didn’t constitute working for free). Again I had to explain my position. But again, they came around.
I learnt, and re-learnt, an important lesson in those interviews. Everyone has biases, and these biases carry into how we define a professional. And by biases, I am not referring to their race, religion, or whether they prefer ties over bowties. I’m talking about how an individual exercises due skill, care and diligence when undertaking architectural services. It’s a no-brainer that architects should provide competent professional services. But how we achieve that will change, and must change, with the times. If not, the architecture profession risks being left in the dust.
A changing professional landscape
So what exactly is changing? Put simply, larger practices are getting larger, smaller practices are getting smaller, and mid-sized practices are disappearing altogether through mergers or acquisitions. In Australia, for example, 34% of architects are sole traders, and an astonishing 98.4% of architectural practices employ fewer than 20 people.3 We find more or less the same in the USA, where 25% of architects are sole traders, and 75.2% of architectural practices employ fewer than 10 people.4 And this distribution is likely found in many other countries as well.
This polarisation is creating a two-tier system of professional architects. Sole practitioners and small practices will continue to focus on bespoke, mostly residential projects, and these architects will need the full spectrum of current core competencies. Large practices, on the other hand, have far deeper specialisation and are investing heavily in technology and the economies of scale it brings. The problem is that our image of the architect, and by association, the registration process, is very much geared to the sole practitioner. This bias means that our definition of architect has failed to evolve, even though the professional landscape has already shifted.
Jobs of the future
There are, of course, reasons for this, beyond personal biases. Organisations such as the Architects Accreditation Council of Australia (AACA) define the national standards of competencies for architects.5 The list of capabilities is very much black and white: “On graduation from an architecture program, a graduate will…”; “At the point of registration, a candidate will…”; “Post registration, an architect will…”. While these core competencies are the minimum standard required, the problem is that they become the default benchmark of what an architect is, and should be.
We already know that a substantial proportion of future jobs will be hard to predict, except for the fact that they will require a very different range of skills from those displayed by most graduates. The World Economic Forum, for example, estimates that 65% of children entering primary school in 2016 will ultimately end up working in completely new job types that don’t yet exist.6 Others, such as Wired editor Kevin Kelly, have suggested that before the end of this century, 70% of today’s occupations will be replaced by automation.7 If this sounds unbelievable, consider this.
My first architectural job was in 1999. In this particular practice, all development applications were drawn by hand with Rotring pens on film. If you made a mistake, you scratched it out with a razor. Curvilinear lines were drawn with a flexi-curve – if it didn’t bend tightly enough, you changed the design. Presentation drawings were hand coloured and needed to be colour copied at the local copier. We didn’t use Photoshop, and we didn’t use email. In fact, no one really knew what the internet was.
We communicated via fax, and every single one needed to be photocopied to prevent the rolled thermal paper from curling and fading over time. We had a postal register, and every letter coming in or out was manually recorded. We measured existing buildings with a tape measure, ensuring we used running dimensions to minimise cumulative errors. Archived projects were rolled up with a post-it note and rubber band and stuck in the archive room. And product reps armed with a suitcase would randomly turn up, and update their ring-binder product catalogues in our library.
A digital future
Today, less than 25 years later, existing buildings are laser scanned, and the point cloud imported into the BIM model. Genetic algorithms are deployed to search through all possible designs in a solution space. The project is documented in BIM and digitally coordinated using clash detection software. Repetitive tasks are automated using visual or textual programming. Designs are reviewed using 3D printed models or real-time gaming engines extracted directly from the design model. Custom digital tools are developed and deployed to analyse environmental performance and code compliance. Past projects are easily retrieved using a simple desktop search. And building parts are robotically fabricated using direct file-to-factory fabrication.
I still undertake professional architectural services, but how I do so has fundamentally changed. Not a single task I did in my first job is done the same way today. In fact, saying that this change took 25 years is generous, as the industry has been working this way for some time now; 15 years is probably more accurate. It’s clear from this example that even during this short period of time, technology has already irrevocably changed the nature of architectural practice. How, then, should we define architects, and educate them?
Educating future architects
The answer depends on what you believe is the purpose of higher education. For many, there is the expectation that an architectural graduate should be sufficiently educated so that they can perform the day-to-day tasks of an architect. Grimshaw’s Group Managing Partner, Mark Middleton, for example, argues:
Firms up and down the country pull their weight when it comes to supporting the existing system, and this includes bursary schemes, professional practice training, tutoring, critiquing and lecturing. This is defensible if there is reciprocity and the profession gets what it needs from the universities, but unfortunately, it doesn’t. There is a strong sense of a failing system, starving the profession of a future generation of properly trained, socially and ethnically diverse architects.8
If the only thing we know about the future is that it will be different, what constitutes ‘properly trained’? Technological change is often perceived as linear, but the reality is that it is exponential, and so change is happening at a much faster pace than people think. In my opinion, to simply educate students on today’s definition of what an architect is – a definition that is already out-of-date and biased – seems downright immoral.
In the United Kingdom, the average time taken from the beginning of architectural studies to qualifying as an architect is nine and a half years.9 And in Italy, it’s even longer at 11 years on average. And again, this trend is probably common in most other countries. What this means is that if a student is educated based on the requirements of the day, by the time they are registered, they probably have five years or less before much of their foundational knowledge is obsolete.
The purpose of universities
The other way to look at educating future architects is to see education not merely as a cog in the growing corporate machine, but rather as a mechanism to improve and redefine the profession. As organisational psychologist Adam Grant says:
The purpose of a university is not to train skills. It’s to promote the pursuit of knowledge. Higher education exists to stoke curiosity, fuel discovery, foster debate, encourage critical thinking, and develop the next generation into more sophisticated learners.10
So why doesn’t this happen more? One contributing factor is the transactional nature of universities and the student’s desire just to be told what they need to know. And this is indeed a big problem. The other, however, I would argue, is that we have become so habituated to what an architect is and should do, that it is no longer questioned.
Like many architects at this time of year, I’ve been undertaking Continuing Professional Development (CPD) seminars required as part of my registration renewal. In a recent CPD session titled Systemic risks in the Australian architectural sector, the Architects Registration Board (ARB) identified four systemic risks. The usual suspects of procurement models, compliance with the National Construction Code (NCC), and managing the client–architect relationship remain. The fourth, however, was “automation, digitisation and innovation”.
A decade after registering as an architect and being quizzed over BIM, the industry is still clearly struggling to come to grips with its relationship with technology. In the session, the ARB stated, “Architects who do not have skills and expertise to use BIM to service clients may face additional competitive pressure.” One would think, then, that BIM would be a core competency needed to become an architect (as stipulated by the AACA). But no. Not a single mention of BIM, digitisation, or innovation in the National Standard of Competency for Architects.
Rethinking the role of the architect
This year, the NSW ARB introduced three new mandatory CPD topics. The first was understanding and respecting (First Nations) Country. The second was a focus on life cycle assessment. And the third related to the National Construction Code. It’s clear that redefining core competencies is possible. But we must go further. We need to define core competencies around automation, digitisation and innovation.
I believe that technology has the power to change the profession for the better, but before it can, we must first free ourselves of the dogma of what we think an architect is. In their book, The myth of experience, Emre Soyer and Robin Hogarth explain how smart, knowledgeable experts can embrace flawed ideas where they feel licensed, even obligated, to propagate them to the masses. These experts then go on to mentor their successors, preserving and spreading the established school of thought.11 Our inability to imagine alternative views of the world is summed up best by the Nobel prize winner Daniel Kahneman:
We hold a single interpretation of the world around us at any one time, and we normally invest little effort in generating plausible alternatives to it. One interpretation is enough, and we experience it as true. We do not go through life imagining alternative ways of seeing what we see.12
My point is that our perception of what an architect is, and should be, must evolve.
Given the inevitable technological convergence underway, the architecture profession risks being left in the dust if we hold onto flawed biases. I don’t know the solution, but as a profession, we must ask the question. We must free ourselves of dogma and imagine plausible alternatives. The reality in today’s digital-first world is that we need to unlearn much of what we think an architect is, in order to remain relevant. I still believe architecture has the power to improve people’s lives. But how we achieve that has changed, and so too should how we define an architect.
1 Robert Hughes would later be convicted of child sex offences, so probably not the best role model.
2 Under Margaret Thatcher’s economic reforms of 1982, the Royal Institute of British Architects (RIBA) changed the mandatory fee scales to recommended. By 1992, these scales became indicative until the RIBA finally abolished them in 2009. With the Competition and Consumer Act 2010, Australia prohibited anti-competitive conduct, which includes professional fee scales.
3 AACA (Feb 2018). Industry profile: The profession of architecture in Australia.
4 American Institute of Architects (2020). Firm survey report 2020.
5 AACA (2021). National standards of competencies for architects.
6 World Economic Forum. (18 Jan 2016). The future of jobs: Employment, skills and workforce strategy for the Fourth Industrial Revolution.
7 Kelly, K. (2017). The inevitable: Understanding the 12 technological forces that will shape our future. Penguin Books, New York, p.50.
8 Middleton, M. (14 Dec 2016). Scrap architecture degrees and revitalise the profession. In BD Online.
9 UK architectural education review group. (Jan 2014). Pathways and gateways: The structure and regulation of architectural education, p.13.
10 Grant, A. (27 Jan 2021). Twitter post.
11 Soyer, E. & Hogarth, R. (2020). The myth of experience: Why we learn the wrong lessons, and ways to correct them. Hachette Book Group, New York, pp.3-4.
12 Kahneman, D. et al. (2021). Noise: A flaw in human judgment. William Collins, London, p.31.