Approximately 10,000 delegates attended this year’s Autodesk University (AU), held at the Venetian Hotel, Las Vegas from 15-17 November. The theme this year was ‘The future of making things’. For a sense of how far things have come, check out our wrap-up of the Autodesk University 2014 conference.
As usual, the conference opened with keynote lectures from CEO Carl Bass and CTO Jeff Kowalski. Kowalski opened the keynote by talking about Artificial Intelligence (AI), Machine Learning, Generative Design, Virtual Reality (VR), and Robotic systems. These themes are not new and have been a staple at AU conferences over the past few years.
Next up was Carl Bass, who spoke about technology disruption. Using the automotive industry as an example, Bass discussed the issues facing traditional automotive manufacturers: autonomous vehicles, ride-sharing, and electric powertrains. Many automotive companies accustomed to making vehicles to drive and own are now having to reinvent themselves to produce cars that are not purchased but instead offer a service focused on the passenger’s experience. The lesson is simple: regardless of the scale of your company, you still need to experiment and be proactive with new ideas and technologies.
Bass also spoke about using machine learning to write software, stating that in the past, the only way to get software to improve was to either re-write it or run it on a faster computer. Today, however, deterministic software writing is being replaced by machine learning, which enables software to learn and adapt. Bass concluded by noting that Autodesk has historically focused on building tools for individuals but, just like the automotive industry, it is adapting by developing a new generation of tools for teams.
Product Innovation keynote
Day two saw the Product Innovation keynote from Amar Hanspal, Senior Vice President, Product. This was an insightful talk, as never in the history of AU has there been a complete overview of all products in a single session. Autodesk has created three streams within their products:
- Architecture, Engineering & Construction (AEC) with Autodesk BIM 360,
- Product Development & Manufacturing with Autodesk Fusion 360, and
- Media & Entertainment with Shotgun.
Showcasing software from all three streams had the interesting side effect of clearly highlighting how far AEC lags behind the rest. Within the AEC stream, there were several significant updates:
Released in July 2016, this tool transforms a modelling environment into an interactive experience. The latest release now fully supports VR, which can be launched with two clicks direct from Revit.
Still in the alpha phase, Project Fractal allows users to explore the ‘design space’ of parametric models created in Dynamo Studio (note that it is not yet available in Dynamo for Revit). Project Fractal operates similarly to Grasshopper’s Galapagos by allowing multiple design options to be produced from a set of input parameters. However, Project Fractal is not yet developed enough to adopt genetic algorithms and instead relies on brute force, which is a significant limitation.
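To make that limitation concrete, here is a minimal Python sketch of what brute-force design-space exploration looks like: enumerate every combination of input parameters and score each one. The model, parameter names, and objective below are purely illustrative assumptions, not Fractal’s actual API.

```python
from itertools import product

# Illustrative stand-in for a Dynamo Studio parametric model (not
# Fractal's actual API): three input sliders with a few steps each.
inputs = {
    "bay_width":   [6, 7.5, 9],         # metres
    "floor_count": list(range(2, 11)),  # 2..10 storeys
    "glazing":     [0.3, 0.5, 0.7],     # glazing ratio
}

def score(bay_width, floor_count, glazing):
    """Toy objective: reward floor area, penalise glazing for cost."""
    area = bay_width ** 2 * floor_count
    return area * (1 - 0.4 * glazing)

# Brute force, as Fractal currently works: evaluate every combination.
options = list(product(*inputs.values()))
best = max(options, key=lambda opt: score(*opt))

print(len(options))  # 3 * 9 * 3 = 81 combinations
print(best)
```

The option count is the product of the step counts of every input, so it explodes as sliders and steps are added; a genetic algorithm would instead sample the space, breed and mutate the high scorers, which is why brute force is the limiting factor here.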
BIM 360 is a suite of applications: Team, Docs, Glue, Layout, Plan and Field. Think of it as the lovechild of Aconex, Newforma and Revizto, where models and documents are integrated, correlated and accessible in a common data environment. Autodesk is looking to add to this ecosystem with Project IQ. Much like intelligent assistants such as Siri, Project IQ aims to provide intelligent assistance by mining construction project data and applying machine learning and analytics.
Those wanting updates or fixes to existing software will be somewhat disappointed. Autodesk appears to be focused on migrating its products across to a platform known as Forge. First introduced last year, Forge represents a push by Autodesk to integrate products and services under a common set of APIs. Forge is thus the common platform for cloud-based micro-services.
Probably the biggest announcement of the whole conference was ‘Project Quantum’, touted as the next-generation BIM ecosystem built with Forge APIs. Project Quantum aims to evolve the way BIM works in the cloud through integrated platforms and experiences. Hanspal describes it as ‘not a single, monolithic, giant app, but really a family of web and mobile experiences or workplaces that are woven together into the common data environment and that are kept in sync through a common communication bus.’ Thus it is not a single platform; it is a single experience across multiple platforms. While Project Quantum is still in its infancy, here are some key points:
Exchange data, not files
Like Flux and their motto ‘exchange data, not files’, Autodesk notes that the current desktop and file-format paradigm has made interoperability difficult. This has resulted in a new job role, that of the ‘data wrangler’.
Since no one program ‘owns’ the data, there is no single universal, native model. This marks a shift in how team members interact with the data. Rather than trying to create a federated model (which so many BIM managers currently loathe doing), under Project Quantum buildings would be broken down into systems: façade, structure, internal partitions and so on. Since not everyone needs the entire building’s detail, only specific data would be shared, such as coordination points. So rather than compiling a super-heavy, native Revit model, multiple models would exist which are all coordinated and in sync.
Using cloud-based systems affords the possibility of ‘branching’ and ‘merging’ in the design process. These concepts are ubiquitous in software development and underpin platforms such as GitHub. Not only is this a more flexible and agile way of working than simply using Revit’s design options, but it also opens the door to automatic revisioning, or ‘diffs’.
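For a flavour of what a ‘diff’ means in this context, here is a small Python sketch using the standard `difflib` module to compare two versions of a toy element schedule. The element names are invented for illustration, and Project Quantum’s actual diffing mechanism has not been published; this only shows the general idea borrowed from software version control.

```python
import difflib

# Two versions of a (toy) element schedule, standing in for a base
# model and a design-option branch. Names are invented for this sketch.
base   = ["W1 facade panel 3000x1200", "C1 column 400x400", "S1 slab 200"]
branch = ["W1 facade panel 3000x1500", "C1 column 400x400", "S1 slab 250"]

# A unified diff flags exactly which elements changed between versions.
for line in difflib.unified_diff(base, branch, "main", "option-A", lineterm=""):
    print(line)
```

The output marks removed lines with `-` and added lines with `+`, so a reviewer sees only the elements that changed between the base model and the branch, exactly the kind of automatic revisioning the paragraph above describes.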
An ecosystem of specialised digital tools
Historically, interoperability has not been Autodesk’s strong suit, so the announcement of Project Quantum marks a turning point in Autodesk’s thinking. It is no longer about one particular piece of software, namely Revit, but an ecosystem of specialised digital tools where data is distributed yet stays in sync. It is important to note that this is still in the development phase and is not yet commercially available. However, if Autodesk does manage to pull it off as prophesied, it could be a massively positive change for the AEC industry. Given that Autodesk already offers services such as simulation and rendering via apps, and more in its other software streams, it is not a far leap to imagine all design-authoring tools transforming into cloud-based micro-services.
In addition to the keynotes, there was also a plethora of sessions on offer. Here are a few I visited and found useful:
Computational BIM workshop: Advanced
By Racel Williams, Matt Jezyk, Ritesh Chandawar, and Aparajit Pratap (Autodesk). The great thing about workshops is that it often doesn’t matter what you think you already know; little nuggets of information you never knew about will always surface. This was one such workshop. It covered topics such as lists, data types, dictionaries, replication, function passing, ‘List@Level’, the Mesh Toolkit, and T-Splines.
By Shawnee Finlayson (Arup, Sydney). Shawnee gave a lively and passionate presentation demonstrating how everyday workflows can be improved and automated with Dynamo. Not only were these examples well presented, they were backed with quantitative data demonstrating their value in terms of time and resources, and hence cost. It was estimated that these tools reduced documentation time by more than 500 hours (50%). What I found most compelling about the talk, however, was the refreshing way Shawnee implemented these tools in her office. Through a concoction of self-learning, perseverance, and, in her own words, charm, she was able to convince senior management of the tools’ validity and to investigate parametrics further as a methodology.
By Chiara Rizzarda and Claudio Vittori Antisari (Antonio Citterio Patricia Viel and Partners, Milan). This class was aimed at intermediate Revit and Dynamo users and demonstrated that hotel rooms are distinctly rule-based spaces. That is, a group of standardised key factors governs their exact configuration. Once you determine those rules, you can use them to create the framework for the rapid generative exploration of alternatives.
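The rules-then-generate idea can be sketched in a few lines of Python: encode the constraints once, then enumerate candidate configurations and keep only the compliant ones. The rule set and dimensions below are hypothetical assumptions for illustration, not the presenters’ actual framework.

```python
from itertools import product

# Hypothetical rule set for a hotel-room layout; values are invented
# for this sketch, not taken from the class.
MIN_WIDTH = 3.2        # metres, to fit bed plus circulation
BATHROOM_DEPTH = 2.2   # fixed bathroom module at the entry
MIN_SLEEPING = 3.0     # minimum depth of the sleeping zone

def generate_alternatives(widths, depths):
    """Enumerate room configurations and keep only rule-compliant ones."""
    for w, d in product(widths, depths):
        if w < MIN_WIDTH:
            continue  # rule: too narrow for bed plus circulation
        sleeping_depth = d - BATHROOM_DEPTH
        if sleeping_depth < MIN_SLEEPING:
            continue  # rule: sleeping zone too shallow
        yield {"width": w, "depth": d, "sleeping_depth": round(sleeping_depth, 1)}

options = list(generate_alternatives([3.0, 3.4, 3.8], [5.0, 5.6, 6.2]))
print(len(options))  # 4 compliant configurations out of 9 candidates
```

The point of the pattern is that once the rules live in one place, swapping in new candidate dimensions or extra constraints regenerates the whole set of valid alternatives instantly, which is what makes rapid generative exploration possible.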
The class also illustrated how other useful applications, including Revit’s Global Parameters, Flux, and Venngage (data-visualisation software), contributed to the workflow. But the main point I took away from the class was the notion of an instrument versus a tool. This idea is based on the work of the Russian-born science historian Alexandre Koyré, who theorised that tools merely extend our capability to do something in physical terms. Instruments, on the other hand, are the ‘embodiment of the mind’ and ‘materialisation of thought’. While a tool helps us do something better than we could before its invention, an instrument allows us to see things differently, and this is the power of computational design.
By Nate Miller and David Stasiuk. This course explored the use of the Revit API to enhance creative problem-solving in the design process. Nate and David supplied boilerplate code developed at their company, Proving Ground. While the course was aimed at advanced users interested in programming, it was great to see samples provided to help those interested in creating their own stand-alone parametric interfaces with the Revit API.
Design Computation Symposium
Featuring industry leaders including David Gilford, Valentin Heun, Rajaa Issa, Fred Martin, Martha Tsigkari, and Ryan Welch. This half-day symposium covered a broad array of topics investigating how technology is becoming a mainstream focus: with the rise of STE(A)M in education, computational literacy in schools is no longer the realm of computer nerds and scientists. Topics included trends in digital practice, the potential impact of computational-literacy education on the future of design, real-world applications of design-computation workflows, and how to identify opportunities for design computation in new and traditional workflows.
Of course, AU is not AU without all the extra-curricular activities. A big shout-out to Shaan Hurley for organising the morning runs. These were a huge success and gave people the opportunity not just to get some fresh air, but also to informally hang out with Autodesk staff.
Dynamo After Hours
Attendees at the ‘Dynamo After Hours’ were privileged to see Ian Keough, godfather of Dynamo, ride a mechanical bull. Well, actually it was more like a rooster…
The Flux party at Canonita saw CEO Jamie Roche outline his plans for Flux. It is great to see how far Flux has come and all the new and exciting developments they are making. I’m sure Autodesk is taking note of their success in developing cloud-based collaborative tools. It was also refreshing to see what appeared to be almost the whole Flux team at the party; clients could speak directly with the various developers rather than just a sales rep.
Overall, I was significantly more impressed with AU than in previous years. With the announcements around Project Quantum and the continued development of Dynamo, I am more optimistic about Autodesk’s future. Let’s hope Autodesk can deliver on the vision outlined in the various keynotes. I look forward to seeing its progress next year when AU returns to the Venetian from 14-16 November 2017.