Flux Site Extractor is a Flux extension app that allows you to import GIS information. It uses building outlines and road centrelines from the OpenStreetMap database and elevation data from NASA’s Shuttle Radar Topography Mission. No installation is required to activate the app, and hence no admin rights, which can prove useful on a work computer. You simply need to grant the Site Extractor app access to your Flux account.
The Bays Precinct, Sydney, Australia
Once activated, the map will be centred on your location. If your site is elsewhere, you can search for the specific location in the search box. Simply draw a rectangle over the area you want to export. Note that the rectangle will turn red if the area is too large. In this scenario, you may need to do multiple exports to several different Flux projects.
Select the features you want to extract from the menu. Also select the Flux project you want to send the data to. If no project is selected, a new Flux project will be created for you. Click on ‘Send to Flux’. A link to your project will appear under the project listing. In Flux, double-click on the layers you want to preview. You can also drag layers on top of each other to overlay the geometry.
Once the data is in Flux, you can import it into Grasshopper. In the example below, I have used Elefront to bake the geometry into Rhino so that it lands on the correct layers. This step is not essential and can be left out if preferred.
Note that although you may have selected certain data fields to be extracted, such as ‘Buildings (accurate height)’, the result is completely dependent on the data available in OpenStreetMap. This means that some of the Flux keys may be empty depending on your location. The discrepancy in OpenStreetMap detail can be clearly seen when comparing Sydney, Australia (above) and New York, USA (below).
New York, USA
While there are many programs and plug-ins that are capable of extracting the same data (Elk, @IT, Meerkat GIS, and CAD Mapper), the benefit of using Flux Site Extractor is that this data can be sent easily to both Grasshopper and Dynamo. This allows both Rhino and Revit users to access the same data thereby simplifying the workflow.
As part of the SEPP65 requirements in New South Wales, balconies for multi-residential buildings must meet minimum area and depth requirements. Section ‘4E Private open space and balconies’ in the Apartment Design Guide states that all apartments are required to have primary balconies as follows:
| Dwelling type | Minimum area | Minimum depth |
| --- | --- | --- |
| 1 bedroom apartments | 8m² | 2m |
| 2 bedroom apartments | 10m² | 2m |
| 3+ bedroom apartments | 12m² | 2.4m |
The ‘Room.SetBalconyCompliance’ node, part of the BVN Dynamo package, helps to automate the process of verifying compliance with the minimum area. Note that the minimum depth is not tested and needs to be verified separately.
Before using the node, firstly ensure that there are no redundant rooms in the project as the area of these rooms will be incorrect and skew the results.
The node first collects all rooms in the project using LunchBox’s ‘Room element collector’. The corresponding apartment numbers are then collected. It is assumed that each apartment and its corresponding balcony have been numbered the same. For this particular example, the project was set up using a shared parameter called ‘Apartment Number’, which is used as the default value for the ‘apartmentNumberParameter’ input. This is because if you use the OOTB parameter ‘Number’ to group apartments and balconies, you will receive the error ‘Elements have duplicate “Number” values.’
Next we need to determine the dwelling type, that is, 1-Bed, 2-Bed, etc. For this example, the ‘Occupancy’ parameter is used as the default input to ‘apartmentNameParameter’. Once the apartment numbers and apartment names are known, the rooms are grouped together to represent all the rooms within the apartment, both internal and external.
Once grouped, the rooms are filtered by the ‘apartmentName’ input, for example ‘1 BED’, ‘2 BED’, etc. The balconies for these dwelling types are then extracted, and their area is calculated and compared to the ‘minSize’ input. All balconies are first reset to non-complying. Then only the complying apartments are updated via the shared parameter defined in the ‘sharedParameterName’ input. By default, this input is set to ‘Balcony Compliance’.
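The core logic described above can be sketched in plain Python. This is an illustrative sketch only, not the BVN implementation — the dictionaries below stand in for Revit room elements, and all names are hypothetical:

```python
# SEPP65 minimum primary balcony areas (m2) per dwelling type,
# per the Apartment Design Guide table above
MIN_AREAS = {"1 BED": 8.0, "2 BED": 10.0, "3 BED": 12.0}

def check_balcony_compliance(apartments, apartment_name, min_size):
    """Return apartment numbers whose total balcony area meets min_size.

    apartments: dict mapping apartment number -> list of room dicts,
    each with 'name', 'is_balcony' and 'area' keys (hypothetical
    stand-ins for grouped Revit rooms).
    """
    complying = []
    for number, rooms in apartments.items():
        # Filter by dwelling type (e.g. '1 BED'), mirroring the
        # 'apartmentName' input of Room.SetBalconyCompliance
        if not any(r["name"] == apartment_name for r in rooms):
            continue
        balcony_area = sum(r["area"] for r in rooms if r["is_balcony"])
        if balcony_area >= min_size:
            complying.append(number)
    return complying

apartments = {
    "101": [{"name": "1 BED", "is_balcony": False, "area": 52.0},
            {"name": "BALCONY", "is_balcony": True, "area": 8.5}],
    "102": [{"name": "1 BED", "is_balcony": False, "area": 50.0},
            {"name": "BALCONY", "is_balcony": True, "area": 6.0}],
}
print(check_balcony_compliance(apartments, "1 BED", MIN_AREAS["1 BED"]))  # ['101']
```

In the actual node, the complying list would then be written back to the ‘Balcony Compliance’ shared parameter after resetting all balconies to non-complying.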
In this previous tutorial, we discovered how to export Revit levels to Grasshopper for the generation of a diagrid structure, before pushing the results back into Revit using Rhynamo and Dynamo. This tutorial will look at an alternative methodology whereby Rhynamo is eliminated and replaced with Flux.
Step 1: Setup Flux
- Set up and sign into your Flux account.
- Create a new blank project. In this example I have called it ‘RTC Exercise 03’.
- Select ‘open project’.
- Create ‘keys’. These are geometry/data that will be transferred to/from Flux. Simply hit the plus button in the data table on the left.
- Add the name and description as required. We need to create 6 keys in total:
- Levels;
- RLs;
- Floor mesh;
- Floors Universal mesh;
- Analytical model;
- Ring beams.
The purpose of all of these keys will be elaborated on later.
Step 2: Export Revit data/ geometry
- Set up a Revit view showing just the Revit geometry to transfer.
Revit view of floor plates
- Generate an elevation or section showing the levels to be exported.
Revit section showing levels
- In Dynamo, open the Part 1 Dynamo file. Select the levels and floors to be exported and run the script. The script collects the selected levels, sorts them by elevation and returns their names and elevations. It also converts the floors into meshes. This data is then exported to Flux via the ‘Levels’, ‘RLs’ and ‘Floor mesh’ keys.
Dynamo script to export levels and floors
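The level-sorting step of the script above can be sketched in plain Python. The name/elevation tuples below are hypothetical stand-ins for Revit level elements; this is a sketch of the logic, not the actual Dynamo graph:

```python
# Sort selected levels by elevation and return parallel lists of
# names and RLs, ready for the 'Levels' and 'RLs' Flux keys.

def sort_levels(levels):
    """levels: list of (name, elevation) tuples -> (names, elevations)."""
    ordered = sorted(levels, key=lambda lvl: lvl[1])
    names = [name for name, _ in ordered]
    elevations = [rl for _, rl in ordered]
    return names, elevations

names, rls = sort_levels([("Level 3", 10.8), ("Level 1", 3.6), ("Level 2", 7.2)])
print(names)  # ['Level 1', 'Level 2', 'Level 3']
print(rls)    # [3.6, 7.2, 10.8]
```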
Step 3: Flux refactoring
- We need to refactor the geometry so that Flux can use the mesh from Revit. With Flow, generate the following:
Flux Flow to generate a universal mesh
From this point on, we need to use the ‘Floors Universal mesh’ key rather than the ‘Floor mesh’ key.
Step 4: Generate Grasshopper geometry
- Open Grasshopper and open the Part 2 Grasshopper script. The first part of the script imports three keys from Flux – ‘Floors Universal mesh’, ‘Levels’ and ‘RLs’. The ‘Floors Universal mesh’ and ‘Levels’ keys are purely for reference. The ‘RLs’ key is used to generate a series of ovals at the correct level. Note that the elevations (RLs) exported from Revit will be absolute, that is, relative to the survey point. The script therefore makes an adjustment for this, because the Rhino file is set out based on the project base point. From these ovals, a diagrid is generated. Finally, the resultant geometry is pushed to Flux via the ‘Analytical model’ and ‘Ring beams’ keys.
Revit preview of resultant geometry
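The RL adjustment mentioned above amounts to subtracting the project base point’s absolute RL from each exported elevation. A minimal sketch, assuming a hypothetical base point RL of 25.0 (in practice this value would be read from the Revit model):

```python
# Convert absolute RLs (relative to the survey point) into elevations
# relative to the project base point. Illustrative sketch only.

def to_project_rls(absolute_rls, base_point_rl):
    """Subtract the base point's absolute RL from each level RL."""
    return [round(rl - base_point_rl, 3) for rl in absolute_rls]

# e.g. a project base point sitting at an assumed RL of 25.0
print(to_project_rls([25.0, 28.6, 32.2], 25.0))  # [0.0, 3.6, 7.2]
```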
Step 5: Import geometry into Revit
- Open Revit and ensure the families to be placed are pre-loaded. Note that while structural framing could have been placed, adaptive components were used to avoid the visual gap created when using structural framing. The other family to be loaded into the project is a structural frame that we’ll use for the ring beams.
- Open the Part 3 Dynamo file, define which families are to be placed within the Dynamo script, and run the script.
Dynamo script to re-create geometry from Flux
Structural tubes in Revit
The application of Flux in this example proved quite successful, largely because Flux provided a (semi-)live link for the levels and floors. Since the data being transferred wasn’t very large, the process was reasonably fast. Furthermore, Flux offered the possibility of having multiple people contributing simultaneously.
In this previous tutorial, it was demonstrated how to undertake a detailed SEPP65 solar access compliance check. The methodology involved using Rhynamo as a means to extract Revit rooms for analysis in Grasshopper and Ladybug. This example offers an alternative methodology, eliminating Rhynamo and replacing it with Flux:
Step 1: Setup Flux
- Set up/sign into your Flux account.
- Create a new blank project.
- Create ‘keys’. These are geometry/data that will be transferred to/from Flux. Simply hit the plus button in the data table on the left.
- Add the name and description as required. We need to create 10 keys in total:
- Meshed floors;
- Meshed walls;
- Meshed groups;
- Combined universal mesh;
- Room crvs;
- Room numbers; and
- SEPP65 compliance.
These keys can be created in each individual application but it is easier to plan it out first and do it all in one go. The purpose of all of these keys will be elaborated on later.
Step 2: Export Revit rooms and context using Dynamo and Flux
- Currently, Flux doesn’t support breps, so we need to convert the Revit geometry into a mesh. Unfortunately, Dynamo out-of-the-box does not have meshing tools, so you’ll need to install the ‘Dynamo Mesh Toolkit’ package by Autodesk.
- Simplify the context geometry for export. The easiest way to do this is to set up a 3D view within Revit. Using Visibility Graphics (VG) in conjunction with worksets and filters, turn off any unnecessary geometry. Essentially, all we require are party walls, floors and roofs. Extraneous geometry such as planting, furniture, doors and mullions can all be hidden. Windows are effectively transparent and can therefore be excluded from the analysis. The same applies to glass balustrades; however, since these are often modelled as walls, you will need to set up a filter to turn off only the glass balustrade wall types and not all walls. It is critical that the context model be as clean and minimal as possible to minimise computational time later in the workflow.
- From this view, duplicate and create 3 separate views isolating different elements – walls, floors and groups (internal walls).
Revit walls to be exported
Revit floors to be exported
Revit groups (internal walls) to be exported
- Open Part 1 of the Dynamo script. This script exports the context geometry. Due to the file size, the model will be exported in batches: walls, floors and roofs. Note that you will need to run the script several times if you have Dynamo set to ‘Manual’ in order to pick from the menu in the Flux nodes; otherwise you’ll need to set Dynamo to ‘Automatic’. Consequently, Flux’s Flow control node does not have much meaning when you are working in Dynamo’s manual mode. For additional control you can leave the flow control mode on ‘once’, and then after the manual run in Dynamo you can right-click the ‘to Flux’ or ‘from Flux’ components and click ‘push data’ or ‘pull data’.
The Python script shown below simply extracts all the geometry in the view, so there is no need to manually select elements. You will need to re-run it on the 3 views created, each time writing to a different Flux key.
Dynamo script to export context geometry using Flux
- Open Part 2 of the Dynamo script. This script exports the room geometry. The easiest way to do this is to use the LunchBox ‘Room element collector’ node.
Dynamo script to export room geometry and room numbers using Flux
Step 3: Flux refactoring
- In Flux, check that the data has been transferred. For some of the keys, particularly the context and room curves, you might get the following image. Simply click the ‘load value’ icon to preview the geometry.
- If you don’t already have access, go to Flux Labs and sign up for the ‘Dynamo Mesh converter’. Hopefully, in the future this will be much more accessible. The reason we need this is that Dynamo meshes have a unique format that will not be recognised by Grasshopper. Therefore, Flux has developed a converter block that turns a Dynamo mesh into a universal mesh that you can view in Flux and send to Grasshopper. Essentially, the difference is as follows: a Rhino mesh takes the form of an ordered list of vertices (points with X,Y,Z coordinates), plus a list of faces, each of which is an ordered list of vertex indices (read more here).
Grasshopper Mesh definition
Dynamo, on the other hand, creates a similar list of vertices, but rather than storing the faces as lists of vertex indices, it stores them again as lists of vertices with repeated X,Y,Z coordinates. It is unclear why the Dynamo team has chosen this data structure, but it obviously leads to a much larger file size for the same sized mesh, as the X,Y,Z coordinates of each vertex are repeated many times.
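The conversion the converter block performs can be approximated in a few lines of Python: collapse the repeated-vertex triangles into a unique vertex list plus faces of vertex indices. This is a simplified sketch of the idea, not Flux’s actual DynamoMeshEater code:

```python
# Turn a 'Dynamo-style' mesh (each triangle as three explicit vertices,
# with coordinates repeated) into an indexed mesh (unique vertex list
# plus faces of vertex indices), as Rhino/Grasshopper expects.

def to_indexed_mesh(triangles):
    """triangles: list of 3-tuples of (x, y, z) vertex tuples.
    Returns (vertices, faces) where faces index into vertices."""
    vertices = []
    index = {}   # vertex -> position in the vertices list
    faces = []
    for tri in triangles:
        face = []
        for v in tri:
            if v not in index:
                index[v] = len(vertices)
                vertices.append(v)
            face.append(index[v])
        faces.append(face)
    return vertices, faces

# Two triangles sharing an edge: 6 repeated vertices become 4 unique ones
tris = [((0, 0, 0), (1, 0, 0), (1, 1, 0)),
        ((0, 0, 0), (1, 1, 0), (0, 1, 0))]
verts, faces = to_indexed_mesh(tris)
print(len(verts))  # 4
print(faces)       # [[0, 1, 2], [0, 2, 3]]
```

The de-duplication is exactly why the indexed form is smaller: shared vertices are stored once and referenced by index.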
- Go to Flow, which is Flux’s visual scripting platform, and generate the following code. Hover in column A and pick the add-a-data node (shortcut ‘D’). Drag the ‘Context mesh’ key to the left-hand side of the data node. In stage A-B, add the Flatten and DynamoMeshEater nodes. (Note that once you have the mesh in Flux and have been invited to use the ‘Dynamo Mesh converter’ block, you will be able to double-click in Flux Flow and search for DynamoMeshEater.) In the menu icon on the top right of the DynamoMeshEater node, set the lacing to Longest. In column B, add a data node and drag the ‘Universal mesh’ key to the right-hand side of the node. Repeat this process for the other elements (walls, floors, roofs, etc.).
Returning to the data pane, you should now see the preview of the ‘Universal mesh’.
Flux universal mesh
Step 4: Solar Analysis with Ladybug
- Open a blank Rhino file and then open the Grasshopper Part 3 file. Ensure the Flux components are pointing to the correct Flux project and keys. Once Ladybug has finished running, the list of complying room numbers will be pushed to Flux. Note that if your rooms have internal islands or aren’t clean geometry, you may need to modify the polycurve and planar surface creation nodes to ensure the room surfaces and room number lists match up.
Step 5: Import results into Revit
- In Dynamo, open up the Part 4 file. The script will first ‘reset’ all the rooms to be non-complying, before updating only those rooms imported from Flux.
While in theory Flux appeared to simplify the workflow, the main limitation was the processing of the context geometry. Due to the large file size, the upload process to the Flux website was incredibly slow. Having said that, this tutorial was produced with Flux/Dynamo 0.9.2. According to Flux, the new Flux plug-in for Dynamo 1.0 performs significantly better.
The other issue that I faced was the refactoring process. For example, Flux doesn’t accept polycurves (yet) so the original Grasshopper script that used Rhynamo had to be modified slightly. Similarly having to use the Dynamo Mesh converter is somewhat annoying and in an ideal world, this would be eliminated from the workflow.
In conclusion, for this particular exercise, I don’t believe Flux added much value to the workflow. This was because it was only being used to export dumb geometry, and a simple *dwg export would have been much faster. In any case, this exercise probably wasn’t the most strategic use of Flux; its potential lies elsewhere.
Building Information Modelling (BIM) entails interdependencies between technological, process and organisational/cultural aspects. These mutual dependencies have created a BIM ecosystem in which BIM-related products form a complex network of interactions.¹ For a long time, interoperability between these products has been essentially non-existent, leaving users unwilling to interchange between different software platforms. Rather than using the best product for the job, users have preferred to remain in the software they are most familiar with, possibly to the detriment of the design. One such example of this is conceptual massing within Autodesk Revit. This post explores how to extend Revit’s modelling capabilities by combining it with McNeel’s Rhinoceros and Dynamo.
It is important to emphasise that BIM is an ecosystem and that there is no single software that can do everything. Within the graphic design industry, Adobe has recognised this and produced a suite of software including Photoshop, Illustrator and InDesign, amongst others. Each application has a very clear and well-defined scope: Photoshop is used for raster images, Illustrator for vector images, and InDesign to compile it all together. Each application is separate but linked together in the workflow. In general, architects are accepting of this ecosystem and are agile enough to move between each platform. However, this ethos of a software ecosystem needs to be applied to BIM software as well.
Software ecosystems: Adobe vs Autodesk
The term ‘BIM’ can mean different things to different people. Despite BIM bridging the entire project from conception through to facilities management, different people will have different biases depending on who you talk to. For example, as an architect, I am naturally more interested in geometry and the design authoring phase of BIM. These biases, which are endemic in the AEC industry, are elaborated on in Antony McPhee’s post, ‘Different BIMs for different purposes‘. With so many perspectives on what BIM is, one needs to question the role of a BIM Manager. Dominik Holzer, in his RTC Australasia presentation ‘You are a BIM Manager – Really?’, discusses how BIM managers need to go beyond tool knowledge and develop management acumen. But that can be a hard thing to accomplish. Even if we focus on a particular BIM bias, say design authoring, there is a plethora of tools out there that a BIM manager must master, or at the very least, have an appreciation of. Here are just a few of them.
Most of the software mentioned above is generally accepted as part of a BIM workflow. Why then are so many people resistant to creating a federated model and instead insist that everything must be created within Autodesk Revit? It was a pleasure to see that this year’s RTC Australasian conference was run in conjunction with Archicon, ArchiCAD’s equivalent conference. This was a small acknowledgment from the profession that BIM is bigger than just a single piece of software and that we need to seriously consider interoperability in our workflows.
Although Autodesk Revit is often chosen as the primary documentation tool in many architectural offices, it is widely acknowledged that Revit is extremely limited in dealing with complex conceptual modelling. Autodesk’s attempts at introducing a dedicated conceptual modelling environment in the form of FormIt and Inventor haven’t really taken hold as of yet. As a workaround, many offices have adopted McNeel’s Rhinoceros, or Rhino for short, for the conceptual design phase and Autodesk Revit for the development and documentation phases.
The integration between McNeel’s Rhino and Autodesk’s Revit in the past has proven to be a challenge for many designers looking to combine freeform geometry with BIM. These interoperability barriers are partly due to Revit’s API limitations and also arguably, a lack of development of the Industry Foundation Class (IFC) file format, which promised universal interoperability within the AEC industry.
IFC: A promise of universal interoperability
The other barrier is one of culture and perception. Rhino is sometimes regarded as useful for conceptual modelling only, whereas Revit is preferred for documentation. Over and over I see these two worlds failing to collaborate. The scenario generally looks something like this: young, recent graduates who are relatively computationally literate spend long hours at the office to win a competition. These people then jump onto other projects, and if the competition is awarded, a new team is put together to deliver the project. This team will most likely contain more experienced architects, more attuned to developing and delivering the project. The Rhino model is thrown out and completely rebuilt from scratch within Revit. Sound familiar?
The problem with this approach is that it more often than not involves losing much of the intelligence built into the original design. And since Revit is unable to accurately recreate complex geometry, the design needs to be dumbed down to comply with Revit’s limitations. This methodology couldn’t be further from what BIM sets out to achieve. So what is the solution?
For some, the solution is simply to produce everything natively within Revit from the beginning. However, as shown in this case study, Revit sometimes simply isn’t capable of creating the desired forms. One therefore needs to look to other software that can, such as Rhino, but this too can have limitations.
Those that have attempted to use Rhino geometry within Revit will be all too familiar with the difficulty in successfully achieving this. The most basic method is to simply export a *sat file from Rhino and then import it into Revit. Best practice is typically to do this via a conceptual mass family. However, the result can be called ‘dumb geometry’ in that it is unable to be edited once imported, which is far from ideal in a BIM environment.
The next progression in usability is to explode the import instance. Depending on the base geometry, Revit may make editing the mass accessible via grips (push/pull arrows), but this is not always the case. However, the best method to generate native Revit elements from Rhino is to use the wall-by-face, roof-by-face, curtain system and/or mass floors commands. These elements will be hosted on the Rhino geometry. In theory, these elements can be updated if the base geometry changes, but experience has shown that Revit is not always able to re-discover the host element, and new elements will need to be created. Therefore, it is prudent to test the Rhino-to-Revit workflow before investing too much time in embellishing the Revit elements.
The whole process of using Rhino geometry in Revit feels a bit like black magic. Often, the first attempt at importing the geometry will fail, and there will be no guidance from Revit as to why. This can prove very frustrating, and many users will simply give up. However, if these best practices are followed when generating the mass in Rhino, integration with Revit should be seamless, or at the very least, less painful.
The best practices mentioned above all rely on more or less the same techniques once in Revit: wall-by-face, roof-by-face, curtain system or mass floors. In other words, a base mass needs to be provided to host the element. But what if you want to create other elements which aren’t roofs, floors or walls? This is where third-party plug-ins come into play. Some of these (past and present) include:
- Chameleon – Uses Chameleon Adaptive Component Systems (CACS). The workflow is bi-directional whereby geometry can be created in Grasshopper, exported to Revit and then re-imported back into Grasshopper.
- Geometry Gym – Allows Grasshopper geometry to be translated into Revit using OpenBIM formats (primarily *ifc). The Industry Foundation Classes (IFC) is a neutral platform, open file format specification that is not controlled by a single vendor or group of vendors. It is an object-based file format with a data model developed by BuildingSMART. This workflow is probably the most complicated but potentially allows much greater control over the geometry and the element properties.
- Grevit – Enables you to assemble your BIM model in Grasshopper and send it to Revit or AutoCAD Architecture. Once the elements have been sent, their geometries and parameters can be updated by another commit.
- Hummingbird – Exports basic geometric properties and parameter data to Comma Separated Value (*csv) text files. In Revit, this data is easily imported using the WhiteFeet ModelBuilder tool, which is included in the download. In April 2015, Hummingbird was updated to support exporting Revit geometry into Rhino, making it bi-directional.
- Lyrebird – Similar to Chameleon in that it uses adaptive components but is only uni-directional.
- OpenNURBS – A plug-in that allows Revit to automate the process of importing Rhino geometry.
Rhino to Revit plug-in comparison
Regardless of the plug-in adopted, there were common limitations amongst them:
- Complex geometry – Apart from OpenNURBS, the plug-ins don’t actually export the Grasshopper geometry you have created but rather re-create that geometry in Revit through a series of input parameters. For example, if exporting a wall from Grasshopper, you don’t reference the wall itself; rather, you are required to define the wall’s centreline and height for it to be rebuilt within Revit. Therefore, if the wall is irregular in height, it will not be translated accurately.
- Longevity – As shown in the table below, many of the plug-ins have either been discontinued (as is the case with Chameleon and OpenNURBS) or made open-source (such as Grevit and Lyrebird) and have stopped being updated regularly.
Timeline of interoperability plug-ins as at June 2016.
- Unidirectional – With the exception of Chameleon and only recently, Hummingbird, most of the plug-ins are unidirectional which significantly limits interoperability.
- Classification – Only selected Revit elements were able to be created. If, for example, you wanted to create a ceiling or stair, you couldn’t, as there were no components for them.
However, with the introduction of Dynamo, a whole new world of possibilities has emerged to facilitate interoperability. Since the release of Dynamo 0.7.2 back in September 2014, three new interoperability tools have emerged:
- Flux – The new kid on the block and without a doubt the most promising. Flux provides cloud-based collaboration tools to exchange data and streamline complex design workflows. Flux plug-ins work with Rhino/Grasshopper, Excel and Revit/Dynamo to automate data transfer to and from Flux, and Flux has plans to expand support to AutoCAD, SketchUp and 3ds Max. A Flux project is the focal point for data exchange and collaboration. You can invite teammates into your project to share data. Each user and application controls when to synchronise data with the project, allowing users to work in isolation until they are ready to share their changes with the team. Since Flux was developed by Google[x], it will only work with Google Chrome.
- Mantis Shrimp – Allows you to read Rhino’s native *3dm file type as well as export geometry from Grasshopper. Mantis Shrimp works in a similar manner to Rhynamo.
- Rhynamo – An open-source node library for reading and writing Rhino *3dm files. Rhynamo uses McNeel’s OpenNURBS library to expose new nodes for translating Rhino geometry for use within Dynamo. In addition, several experimental nodes are provided that allow for ‘Live’ access to the Rhino command line.
To assist Rhino users in becoming acquainted with Dynamo, I have produced a ‘Dynamo for Grasshopper Users‘ primer. Since everything is a little different in Dynamo, the primer provides a list of ‘translations’ in order to find a comparable node/component.
To date, most of the tutorials presented on Parametric Monkey have focused on Rhynamo. Since Rhynamo doesn’t work out of Grasshopper (it needs baked Rhino geometry), we will need to use Elefront for our data management. Elefront, developed by Ramon van der Heijden, is a plug-in focusing on managing model data and interaction with Rhino objects. The plug-in allows users to bake geometry to the Rhino model with the option of specifying attributes, including an unlimited number of user-defined attributes by means of key-value pairs. This way it is possible to treat a 3D Rhino model as a database, where each object ‘knows’ what it is, what it belongs to, where it should go, what size it is, when it needs to be fabricated, etc. Instead of trying to store geometry in a database, Elefront stores data in a ‘Geometrybase’, thereby turning your Rhino model into a quasi-BIM model. Elefront is therefore ideal for combining with Rhynamo to export Grasshopper data into Dynamo/Revit.
The purpose of this post was to present the notion of a BIM ecosystem and encourage an open and agile workflow amongst the AEC profession. To illustrate this notion, the post explored in detail how to extend Revit’s modelling capabilities by combining it with Rhino and Dynamo to create an integrated BIM environment. It is hoped that, through a greater awareness of each software’s strengths and weaknesses, BIM professionals can use the right tool for the job, rather than being constrained to a single software.
1 Gu, N. et al. (2016). BIM Ecosystem: The co-evolution of products, process and people.
I am pleased to announce the latest release of the BVN Dynamo package, v1.0.0, which is compatible with Revit 2017 and Dynamo 1.0.0. There have been substantial changes to how Dynamo operates, and it has been a mission to keep up and ensure that the nodes keep working as intended. Hopefully, as we progress, we should see more robust performance.
Below is a list of nodes available in the current version, with hyperlinks to tutorials to demonstrate how they can be utilised. Since launching the BVN Dynamo package back in March 2015, the package has grown to 33 custom nodes with more still on the way. The package has five dependencies which will be installed when installing the BVN package:
- Clockwork for Dynamo 0.9.x (0.90.6)
- Ladybug (0.1.2)
- LunchBox for Dynamo (2016.5.3)
- Rhynamo (2016.5.3)
- Steam Nodes (0.8.43)
- AdaptiveComponent.FromExcel – Places an adaptive component family from an Excel spreadsheet with comma-separated points.
- Area.ImportInstance – Places 3D import instance masses based on areas.
- Family.FromExcel – Places a family by point and level from an Excel file. Families will only be placed on a level if the Z value matches an existing Revit level.
- Grid.FromExcel – Reads an Excel file to generate grids.
- Group.ByTypeAndLocation – Places a model or detail group based on group type and a location (point).
- Level.Plane – Creates a plane from level(s).
- Levels.FromExcel – Generates levels from an Excel spreadsheet.
- Point.FromExcel – Creates a point from an Excel comma separated string.
- Room.ExportToRhino – Collects all rooms in the project and exports polycurves to a Rhino file. Ensure Rhino file is closed in order to write to it. WARNING: WHEN EXPORTING, RHYNAMO WILL OVER-WRITE THE RHINO FILE. ENSURE YOU WRITE TO A BLANK DOCUMENT.
- Room.ImportInstance – Searches for rooms by name (string) and places import instance volumes. Each import instance is named the same as the room number it originated from. Useful if you want to export a room/program massing model to Rhino.
- Door.RenumberByRoom – Renumbers doors based on the ‘to’ room. If no ‘to’ room is present, it will use the ‘from’ room, e.g. for external doors.
- Room.AdjustUpperLimit – Extracts all rooms and adjusts their upper limit so that it is the level above, with zero limit offset.
- Room.CentreLocation – Moves the room location point to the centroid of the room. If the centroid is outside the room boundary, the room will remain in its current position. Based on the script by Modelical.
- Room.CreateUnplaced – Creates unplaced rooms from a list of room names.
- Room.RenameByArea – Renames apartment rooms (1Bed, 2Bed, etc.) based on current area.
- Room.RenameByModelGroup – Renames rooms based on the model group placed inside it. Useful for residential or healthcare projects where model groups are used and the room name needs to be in sync with the model group.
- Room.RenumberByModelCurve – Renumbers rooms based on a model line drawn through rooms. Only rooms that intersect with the model line will be renumbered, so that batch processing can be done.
- Room.SetSEPP65Parameter – Reads an Excel file containing a list of complying SEPP65 apartment room numbers and then sets the Revit SEPP65_2 Hrs min shared parameter to True, with the rest False. The script will remove the first line from the Excel spreadsheet, i.e. the heading.
- RoomTag.MoveToRoomLocation – Moves room tags to the room location point. Use the Room.CentreLocation node first before using this node. Based on the script by Modelical.
- FilledRegion.Perimeter – Returns a filled region’s perimeter, with the option to filter a line style out of the calculation.
- Filter.GetUnused – Extracts all the unused filters in your Revit project so that they can be purged.
- ModelGroup.TypeCollector – Returns all the types (not instances) from all model groups in the project.
- View.FromSun – Generates views from the sun’s position to visually verify the results of Ladybug’s SEPP65 ‘heat map’.
- View.SetUnderlayToNone – Sets a view’s underlay to none.
- View.SwitchTitleblock – Switches the titleblock family type on a sheet. The sheet number series is a string which is the prefix of the sheet names to be modified, e.g. ‘B’.
- SunSettings.GetTimesAndSunDirections – Needs to be combined with SunSettings.TimeZone. Unlike ‘SunSettings.SunDirections’ and ‘SunSettings.StartDateType’, this node returns all the dates and vectors for a solar study.
- SunSettings.TimeZone – Extracts the current time zone in hours relative to UTC. Since Dynamo converts all times and dates to UTC, this can be confusing. This node maintains the current time zone so that results can be verified more easily.
- View.Phases – Returns the current view’s phase.
- List.SortSublists – Sorts sublists.
- List.SortSynchronously – Sorts one list synchronously with another.
- IsInteger – Checks a list of inputs and returns a Boolean for each item indicating whether it is an integer. Useful when importing Excel values so that Dynamo can process them accordingly.
- IsNumber – Checks a list of inputs and returns a Boolean for each item indicating whether it is a number.
- IsString – Checks a list of inputs and returns a Boolean for each item indicating whether it is a string.
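As an illustration of what these three type-checking nodes do, here is a minimal plain-Python sketch. The function names and behaviour are my own stand-ins for illustration, not the package’s actual implementation:

```python
# Sketch of the Is* type-checking idea: map a type test over a list of
# values (e.g. cells imported from Excel) and return one Boolean per item.

def is_integer(values):
    # bool is a subclass of int in Python, so exclude it explicitly
    return [isinstance(v, int) and not isinstance(v, bool) for v in values]

def is_number(values):
    return [isinstance(v, (int, float)) and not isinstance(v, bool) for v in values]

def is_string(values):
    return [isinstance(v, str) for v in values]

cells = [1, 2.5, "Room 101", 3]
print(is_integer(cells))  # [True, False, False, True]
print(is_number(cells))   # [True, True, False, True]
print(is_string(cells))   # [False, False, True, False]
```

The Boolean lists can then drive a filter so that numeric and text cells are processed down separate branches.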
Recently I had a scenario where I needed to extract the perimeter of an atrium in order to calculate the smoke extraction requirements. This proved to be a trickier problem than it first appeared. In the AutoCAD world, one would simply draw a polyline and extract its length property. However, since Revit doesn’t have polylines, we need to draw multiple individual lines and then sum their lengths. Doing this manually is not an option because when selecting multiple objects in Revit, no length parameter is returned (unless all lines are exactly the same length, and even then it wouldn’t be the sum).
Trying to automate the process with Dynamo also proved to be tricky as the atrium wasn’t continuously bounded by a single element, for example, a balustrade. Instead, it was bounded by a curtain wall, columns, walls (balustrades) and stairs. To further complicate things, certain elements were excluded from the calculation process. It was therefore decided that the simplest method to visualise the atrium opening was to create a filled region and use line styles to control which lines to calculate and which to exclude. This was only possible with Dynamo as Revit is only able to return the area of a filled region, not its perimeter. (Note that if you need to use Dynamo to get the area of a filled region, you can use the ‘Filled Region Area’ node from the Archi-lab_Grimshaw package).
The ‘FilledRegion.Perimeter’ node in the BVN Dynamo package will return a filled region’s perimeter and will give you the option to filter out a line style from the calculations. The node automatically excludes the <Invisible lines> line style, so the ‘lineStyleToFilter’ input can be used for other styles as required.
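The underlying logic can be sketched in plain Python. In the sketch below the boundary segments are simple (length, line style) pairs rather than real Revit API objects, so it illustrates only the filtering idea, not the node’s implementation:

```python
# Sketch: sum the lengths of a filled region's boundary segments, skipping
# any segment whose line style is excluded. The data is hypothetical.

def filtered_perimeter(segments, line_style_to_filter=None):
    excluded = {"<Invisible lines>"}        # always excluded by the node
    if line_style_to_filter:
        excluded.add(line_style_to_filter)
    return sum(length for length, style in segments if style not in excluded)

# (length in metres, line style) pairs for an imaginary atrium boundary
boundary = [
    (12.0, "Medium Lines"),
    (8.0, "Medium Lines"),
    (3.5, "<Invisible lines>"),   # closing segment we don't want counted
    (8.0, "Stair Edge"),          # style the user chose to filter out
]

print(filtered_perimeter(boundary, "Stair Edge"))  # 20.0
```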
Flux was started in late 2010 at Google[x], Google’s research lab, with the mission to address two global challenges: climate change and affordable housing for the urbanising population. Flux provides cloud-based collaboration tools for architects, engineers, and contractors to exchange data and streamline complex design workflows. In contrast, most design software today relies on manual file transfer, data conversion, and data merging, which are tedious and error-prone tasks. Flux therefore frees you from the burden of exchanging and converting data so you can focus on what’s most important to your design.
Flux plugins currently work with Grasshopper, Excel, and Dynamo to automate data transfer to and from Flux.
Flux also has plans to expand the design software it can work with, including AutoCAD, SketchUp, Revit and 3ds Max.
Setting up a Flux project
A Flux project is the focal point for data exchange and collaboration. You can invite teammates into your project to share data. Each user and application controls when to synchronise data with the project, allowing users to work in isolation until they are ready to share their changes with the team. Since Flux was developed by Google[x], it will only work with Google Chrome. Here is how to set up your Flux project:
- First, you’ll need to set up and sign into your Flux account.
- Next, create a new blank project.
- Select ‘open project’.
- We then need to create ‘keys’. These are geometry/data that will be transferred to/from Flux. Simply hit the plus button in the data table on the left.
- Add the name and description as required. While these keys can be created in each individual application, it is easier to plan it out first and do it all in one go directly in Flux.
- Once your Flux project is setup, simply modify your Dynamo or Grasshopper definitions so that data is being pushed or pulled to the Flux keys.
- Flux only works with stable releases of Dynamo. If you have a later daily build installed, you may be stuck, as several users have reported not being able to uninstall Dynamo fully.
- Currently, Flux doesn’t support breps, so we need to convert the Revit geometry into a mesh. Unfortunately, Dynamo out-of-the-box does not have meshing tools, so you’ll need to install the ‘Dynamo Mesh Toolkit‘ package by Autodesk. Next you’ll need to go to Flux Labs and sign up for the ‘Dynamo Mesh converter’. Hopefully in the future this will be much more accessible. The reason we need this is that Dynamo meshes have a unique format that is not recognised by Grasshopper. Flux has therefore developed a converter block that turns a Dynamo mesh into a universal mesh that you can view in Flux and send to Grasshopper. Essentially the difference is as follows: a Rhino mesh takes the form of a list of vertices (points with X,Y,Z coordinates) stored in an ordered list, plus a list of faces, each of which is an ordered list of vertex indices (refer here for more).
Grasshopper Mesh definition
Dynamo, on the other hand, creates a similar list of vertices, but rather than storing the faces as lists of vertex indices, it stores them as lists of vertices with repeated X,Y,Z coordinates. It is unclear why the Dynamo team chose this data structure, but it obviously leads to a much larger file size for the same mesh, as the X,Y,Z coordinates of each vertex are repeated many times.
- Flux components require login authentication and thus will not work in clustered scripts.
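The difference between the two mesh representations described above, and the deduplication a converter has to perform, can be sketched in plain Python. This is an illustration of the data structures only, not Flux’s actual converter:

```python
# Convert a Dynamo-style mesh (vertices repeated per face) into a
# Rhino/Grasshopper-style mesh (unique vertex list + face index lists).

def to_indexed_mesh(face_vertex_lists):
    """face_vertex_lists: list of faces, each a list of (x, y, z) tuples."""
    vertices = []
    index_of = {}
    faces = []
    for face in face_vertex_lists:
        face_indices = []
        for pt in face:
            if pt not in index_of:          # deduplicate repeated vertices
                index_of[pt] = len(vertices)
                vertices.append(pt)
            face_indices.append(index_of[pt])
        faces.append(face_indices)
    return vertices, faces

# Two triangles sharing an edge, stored Dynamo-style (6 vertex records)
dynamo_mesh = [
    [(0, 0, 0), (1, 0, 0), (0, 1, 0)],
    [(1, 0, 0), (1, 1, 0), (0, 1, 0)],
]
vertices, faces = to_indexed_mesh(dynamo_mesh)
print(vertices)  # 4 unique vertices instead of 6
print(faces)     # [[0, 1, 2], [1, 3, 2]]
```

Note how the shared edge’s two vertices are stored once and referenced twice; on a large mesh this is where the file-size saving comes from.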
Throughout Parametric Monkey I have written about the interoperability tools available to Revit and Rhino users. Of all the plug-ins available, Flux appears to be the most promising. Overall, Flux is pretty unobtrusive and only minimal modifications are required to your existing Grasshopper and Dynamo scripts. Yet the benefits it offers – interoperability, worksharing and cloud computing – are quite powerful. Moreover, it would appear Flux has the resources to become something really special in the AEC industry. While most interoperability plug-ins were developed solo in a developer’s spare time, Flux currently has approximately 27 employees and has just secured US$29M in funding. The question then is: when will Flux be commercialised, and will it still have the same appeal if we have to pay for it?
The ROB|ARCH2016 conference brings together world-leading researchers at the forefront of new robotic technologies and applications. This year’s conference was held in the iconic, industrial Pier 2/3 at Walsh Bay in Sydney from 15th–19th March. Hosted by The Faculty of Architecture, Design and Planning from The University of Sydney, and in partnership with RMIT, Monash University, Bond University, UNSW, and UTS, the event comprised a 3-day workshop and a 2-day conference.
As a major sponsor of the event, BVN contributed by fabricating the conference bags. These were custom printed using a KUKA KR10 and a paint pen. Delegates could choose from two different designs and then control the robot to print their own bag. This proved a huge success as it allowed delegates to interact with the robot right from the get-go.
Over the 3-day workshop, delegates tested the boundaries of robotic fabrication in architecture. Both KUKA and ABB robots were used. This year there were 8 workshops offered:
- Robot UI: User interfaces for robotic live control (SCI-Arc & UNSW);
- Feature-based multi-robot assemblies (HAL Robotics & Bond University);
- Stigmergic accretion: Semi-autonomous polymer deposition (RMIT);
- Interactive 3D printing (IAAC, Harvard GSD & USYD);
- Robotic sewing of custom timber veneer laminates (ICD, Stuttgart);
- Superform: Robotic hot-blade cutting (Odico, Aarhus School of Architecture & USD);
- Spatially extruded structures (ETH Zurich, University of Michigan & UTS); and
- Dynamo-build! Dynamo-driven collaborative robotics for automated construction of spatial structures (Autodesk, Virginia Tech, Walter P Moore Engineering, MASS Design Group, Delcam).
The conference concluded with a 2-day presentation of academic papers and talks. One of the many highlights was Mark Burry and his presentation on the use of robotics in the construction of the Sagrada Família. As pioneers in the field of digital fabrication, Mark and Jane Burry have worked with a 2D robot since 1989 on the Nave Columns, 3D printing since 2000 on the Passion Façade Rose, and a 7-axis robot since 2001 on the Passion Façade Narthex columns. These techniques are combined with stonemasonry to achieve a high level of precision and material computation.
Overall, the conference was a huge success. Dr Dagmar Reinhardt and her team at the University of Sydney did a fantastic job organising the event. With such fascinating outputs from the event, I look forward to the next one, which is due to be hosted at ETH, Zurich in 2018.
Airports of Thailand commissioned a consortium that included HOK as lead designer to design the new Suvarnabhumi International Airport Midfield Satellite Concourse in Bangkok. The 216,000m², four-story concourse building will add 28 contact gates to Thailand’s main airport, which will serve more than 50 million passengers per year. The design references the existing architectural language of the main terminal. Using a similar barrel vaulted section and alternating bands of glass and solid material, the interior of the midfield blends harmoniously with the spatial quality of the main terminal. The roof at the centre of the concourse is elevated and separated from the rest of the concourse, a reference to the stacked and layered roof forms of traditional Thai structures.
Initial conceptual design studies were first undertaken using McNeel’s Rhinoceros 5.0. These studies culminated in a wireframe model which defined the structural setting-out of the entire concourse. Due to a lack of computational skills within the project team, this was produced manually as opposed to generatively through Grasshopper. Once the initial Rhino conceptual design studies were complete, the entire model was rebuilt from scratch using Autodesk Revit 2013. As will be shown, this proved to be very tedious and ultimately unsuccessful, as Revit was unable to replicate the Rhino setting-out geometry, which was absolutely critical to the project.
Detailed Rhino study model
Rhino structural wireframe model
Essentially, there were three main geometric components to the project:
- The structural trusses, of which there were two types: a ‘bow truss’ and a ‘flat truss’;
- The overall shell massing which would form the basis of the roof, glazing and skylights; and
- The ceiling battens.
The structural trusses were generally pretty straightforward to model due to the geometric rationalisation that had already been undertaken in Rhino. The setting-out geometry was based on a series of tangential arcs in section which taper in plan. Key points were defined along these tangential arcs which represented the ‘top of the top chord’ and the ‘bottom of the bottom chord’. This was the structural zone with the roof build-up and ceiling situated above and below the zone respectively. This resulted in each bow truss and flat truss being standardised with only the middle truss varied in order to allow the building to taper in plan. Using a Revit generic model family, a series of type parameters were defined which allowed the truss to flex and the key setting-out points to be scheduled. Special attention was paid to the graphic representation of the complex geometry in order to aid legibility of the design intent.
Overall shell massing
The concourse shell massing was initially rebuilt parametrically within a Revit conceptual mass family. Using the tangential arc profiles as defined above, these were lofted together to create various masses. There were several Revit conceptual mass families generated, including:
- Bottom chord massing
- Top chord massing, with a series of void extrusions to subtract away various zones including windows, gutters and transition zones.
- Assembly massing. This was simply a compilation of all the masses into one file.
The assembly mass family was then loaded into the Revit project. Since the family defined the structural zone, a roof thickness needed to be applied above this zone. The ‘wall by face’ command was adopted, and initially this proved successful in that Revit was able to read the segmented surfaces and produce individual walls. However, upon further investigation it was discovered that all the surfaces needed to be tangential, otherwise the result was inconsistent geometry, as shown below.
Due to an anomaly in our setting-out geometry, the two surfaces shown weren’t actually tangential. Without completely changing the master geometry, this proved to be a limitation of Revit that could not be overcome. Therefore, as an alternative to the ‘wall by face’ command, the project team explored building the entire roof massing as a single element within a conceptual mass family. All that was required was to take the top chord massing and offset the surface. Unfortunately, while this is a relatively straightforward procedure in Rhino, the Revit family editor wasn’t able to do it automatically. In what proved to be a painstakingly slow process, the project team offset all the reference lines and created new parameters before generating the new offset surface.
However, this only fixed the first problem. The next problem was that it was impossible to accurately subtract the window volumes. Using the Rhino ‘offset surface’ command resulted in the edge of the window being perpendicular to the normal of the top chord surface, as shown below. However, using void forms in Revit to perform a Boolean subtraction after the roof thickness had been generated would result in the edge of the window being parallel to the ground plane. This clearly wasn’t acceptable and, as a result, this method was abandoned.
Rhino massing showing window void intersecting the top of top chord surface
Eventually, after numerous unsuccessful tests in attempting to rebuild the Rhino shell geometry in Revit, it was decided to use the actual Rhino massing as the master geometry and import it directly into Revit using a *sat file via a conceptual mass family.
Imported *sat mass within Revit 2014
Of note is that when using Revit 2013, the *sat file was unable to be fully exploded; if this was attempted, the imported geometry would disappear completely. However, when testing with Revit 2014, the geometry was able to be fully exploded. Since the project team was contractually tied to Revit 2013, the option of upgrading to Revit 2014 was unavailable. While exploding a *sat file within the conceptual mass family is not a prerequisite for it to be functional, not doing so does limit its adaptability if changes are required.
Exploded conceptual mass within Revit 2014
The other major computational issue the team faced was documenting the ceiling. Since in plan the building was tapering, the ceiling varied from structural bay to structural bay. A method needed to be developed to express and represent these variations. To achieve this, Chameleon and Grasshopper were adopted as there were no out-of-the-box Revit commands that would be able to replicate the design intent accurately.
Revit view showing ceiling
The base surface was first created within Rhino. The surface was split at grid lines. This was done to compartmentalise troubleshooting as the script took approximately 10min to run per structural bay.
Rhino base surface for ceiling
Grasshopper was then used to divide the surface with UV divisions. Since the concourse tapered in plan, the divisions were calculated from the shortest length and based on critical dimensions (batten width, minimum spacing, etc.). The centre lines were then offset to accommodate the batten spacing.
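The division logic can be sketched as follows. The dimensions are illustrative placeholders, and the real definition of course worked on the live Grasshopper surface rather than bare numbers:

```python
# Sketch: derive the number of UV divisions for a tapering bay from its
# shortest edge, so that batten width plus minimum spacing fits everywhere.
# Works in whole millimetres to avoid floating-point surprises.

def division_count(shortest_length_mm, batten_width_mm, min_spacing_mm):
    # Each division module must fit one batten plus the minimum gap.
    module = batten_width_mm + min_spacing_mm
    return max(1, shortest_length_mm // module)

# e.g. an 8400mm shortest edge, 150mm battens, 450mm minimum spacing
print(division_count(8400, 150, 450))  # 14
```

Because the count is derived from the shortest edge, the spacing only grows as the bay widens, so the minimum-spacing constraint is never violated.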
Four-point surfaces were then created. However, these surfaces were not planar and needed to be rebuilt. The rebuilt surfaces were then extruded to give them a thickness, and a Boolean operation was used to cut out the arched window and skylight geometry.
A 2D pattern was then created based on a sine curve and controlled through a Grasshopper Graph Mapper component. This curve was then arrayed, offset and extruded, and a Boolean operation performed on the battens.
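A rough plain-Python sketch of that sine-plus-Graph-Mapper idea is below. The easing function is a stand-in for whatever curve was drawn in the Graph Mapper, and all values are illustrative:

```python
import math

# Sketch: sample a sine curve and remap its amplitude through an easing
# function, mimicking what the Graph Mapper component does in Grasshopper.

def graph_mapper(t):
    # Hypothetical stand-in for a Graph Mapper curve: smooth ease in/out
    return t * t * (3 - 2 * t)

def pattern_points(length, amplitude, waves, samples):
    pts = []
    for i in range(samples + 1):
        t = i / samples
        y = amplitude * graph_mapper(t) * math.sin(waves * 2 * math.pi * t)
        pts.append((length * t, y))
    return pts

pts = pattern_points(length=10.0, amplitude=0.5, waves=3, samples=30)
print(len(pts))  # 31 points along the cutting curve
```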
Due to the Boolean operations described above, some of the resultant battens became quite small while others had more or fewer vertices. The Grasshopper script then culled the battens that were determined to be too small, and filtered the remaining battens into 3-point, 4-point or 5-point battens (disregarding the batten thickness). Several adaptive components were then made within Revit to correspond with the number of vertices (3, 4 and 5 points) and loaded into the project. The thickness of the batten was controlled within the adaptive component, as this reduced the number of placement points and hence made the script run faster. The XYZ coordinates were then exported from Grasshopper via Chameleon. This needed to be done in batches so that the appropriate adaptive component could be applied (3 points, 4 points, etc.) to the XYZ points being exported. It was discovered that the list of coordinates needed to be ‘cleaned’, as any empty lines would crash the export.
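The culling, classification and ‘cleaning’ steps can be sketched in plain Python on hypothetical point lists (the real script operated on Grasshopper geometry and exported via Chameleon):

```python
# Sketch: cull battens below a minimum size, group the rest by vertex
# count (3-, 4- or 5-point) so the matching adaptive component can be
# applied, and drop empty rows that would crash the coordinate export.
# All data and thresholds below are hypothetical.

def classify_battens(battens, min_vertices=3, max_vertices=5, min_length=0.1):
    groups = {n: [] for n in range(min_vertices, max_vertices + 1)}
    for pts in battens:
        if not pts:                   # 'clean' empty rows before export
            continue
        if not (min_vertices <= len(pts) <= max_vertices):
            continue                  # outside the supported vertex counts
        xs = [p[0] for p in pts]
        if max(xs) - min(xs) < min_length:
            continue                  # cull battens that are too small
        groups[len(pts)].append(pts)
    return groups

battens = [
    [(0, 0, 0), (2, 0, 0), (2, 0.1, 0)],                # 3-point batten
    [(0, 0, 0), (2, 0, 0), (2, 0.1, 0), (0, 0.1, 0)],   # 4-point batten
    [(0, 0, 0), (0.01, 0, 0), (0.01, 0.1, 0)],          # too small: culled
    [],                                                  # empty row: culled
]
groups = classify_battens(battens)
print({n: len(g) for n, g in groups.items()})  # {3: 1, 4: 1, 5: 0}
```

Each group can then be exported as its own batch, matching the 3-, 4- and 5-point adaptive components.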
Final adaptive components ceiling in Revit
The above method was very successful, albeit a little time-consuming. In all, there were approximately 13,000 individual battens for half the concourse. Rather than creating the entire ceiling, only half was created; when linked into the main model, it was copied and mirrored to minimise the computational load.
This case study highlighted Revit’s inability to deal with complex geometry. While it is always preferable to have entirely native Revit elements, this isn’t always possible, as was shown here. Through the adoption of other software such as Rhino and Chameleon, it is possible to extend Revit’s capabilities and achieve the desired geometric outcome. With a clear workflow, geometric integrity and rigour can be maintained. Of course, for this to be successful, the project team needs to be sufficiently skilled in the proposed software and, if not, adequate training and support needs to be provided.