Sydney Computational Design Group


Join us on Wednesday 5 October at the Sydney Computational Design Group where we have Chris Pettit and Michael Ostwald presenting.



Chris Pettit is the inaugural Professor of Urban Science at UNSW. He is Associate Director of the City Futures Research Centre, where he leads the City Analytics Program. Chris currently co-chairs the Geo4All OpenCitySmart initiative and is the co-chair of the International Society of Photogrammetry and Remote Sensing (ISPRS) Geographical Visualisation and Virtual Reality Working Group. His expertise is in the convergence of the fields of spatial planning and GIS, and he has published more than 120 peer-reviewed papers in this area. Chris will present ‘Visualising a Changing City’, where he will introduce two data-driven visualisation platforms which have been developed to communicate the changes in our city. The first platform, known as CityViz, currently provides data on city housing and city movement indicators. A digital storytelling approach is taken, with interactive maps supported by narrative. The second platform, known as CityDash, aggregates a number of open data feeds from across Sydney and presents them as a real-time city dashboard, the digital pulse of Sydney.



Michael J. Ostwald is Professor and Dean of Architecture at the University of Newcastle (Australia) and a visiting Professor at RMIT University. He has previously been a Professorial Research Fellow at Victoria University of Wellington, an Australian Research Council (ARC) Future Fellow and a visiting fellow at MIT and UCLA. In 2016 Michael was awarded the Neville Quarry Medallion for services to architectural education, and over his career he has been awarded 13 major ARC grants. Michael has a PhD in architectural history and theory and a DSc in design mathematics and computing, and he completed postdoctoral research on baroque geometry at Harvard in 2000. Michael’s talk, titled ‘Computational Analysis of Architecture’, will present an overview of several recent research projects which have used computational and mathematical means to investigate claims about famous buildings and spaces. The projects include: (i) an isovist analysis of passage through Frank Lloyd Wright’s domestic architecture, (ii) parametric generation of plans of traditional Chinese private gardens to replicate social and cognitive properties, (iii) a reconstruction of Richard Neutra’s soundscape in the Clark House and (iv) fractal dimension analysis of Sinan’s mosques in Istanbul.


Sydney Computational Design Group


Join us tomorrow evening at the Sydney Computational Design Group where we have Dr Luke Hespanhol and Dr Yannis Zavoleas presenting.


Dr Luke Hespanhol is a media artist, researcher, lecturer, interaction designer and software developer based in Sydney, Australia. His practice investigates the potential of media art to create engaging experiences that lead to reflection on the relationship between individuals and the immediate environment around them. Luke will present ‘Two Worlds Colliding: When The Digital Turn in Architecture Meets The Architectural Turn in Digital Media’. In this talk, he will discuss the growing convergence of architecture and human-computer interaction through the presentation of his recent research on generative 3D media facades, emerging from a combination of operational settings and based on simulating the spatial arrangement of ‘light cells’. The strategy adopted is innovative in proposing that light cells be designed as building blocks not only for the hybrid media structure but also for its content, with one aspect continuously informing the other. Luke will present the initial design concept for the light cells and the software applications developed for modelling the 3D media facades, and discuss opportunities and challenges related to their utilisation in the fields of architecture and urban interaction design.



Dr Yannis Zavoleas is Senior Lecturer in Architecture at The University of Newcastle Australia and co-founder of Ctrl_Space Lab in Athens. He explores the idea of digital media as extended “tools for thought,” combining new technologies with core architectural discourse. Yannis will present “Dynamic Systems in Architecture”, in which he will trace the evolution of systemic ways of analysis leading to synthesis, as the design process is increasingly informed by agent-based simulation techniques testing the behaviour and shared influences of variable design inputs.




Elk is a set of tools to generate map and topographical surfaces using open source data from Open Street Map and Shuttle Radar Topography Mission (SRTM) data from USGS. Elk was developed by Timothy Logan and works in a similar way to @IT, Meerkat GIS and Flux’s Site Extractor. Elk has recently been rewritten from scratch and as such this tutorial will focus on the latest version, 2.2.2.


Before we can import any data into Grasshopper, we first need to export the data from Open Street Map. Once you have found your area of interest, select ‘Export’ at the top (1), then ‘Manually select a different area’ (2), then ‘Export’ (3). This will create a file called ‘map.osm’ in your downloads folder, which we can then reference into Grasshopper.




Within Grasshopper we can use ‘File Path’ and ‘Location’ components to read the OSM data. The Location component converts latitude and longitude locations into point data, with the origin at the lower-left corner. For example, in the image above, the Open Street Map data has a latitude domain of -33.8800 to -33.8647 and a longitude domain of 151.1727 to 151.2064. Therefore, the bottom-left corner of the window will correspond to -33.8800, 151.1727.
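As a rough illustration of what the Location component is doing, here is a minimal Python sketch (my own, not Elk’s source) that linearly remaps a latitude/longitude pair onto a model-space window with the origin at the lower-left corner:

```python
# Minimal sketch of the lower-left-origin remapping (illustrative only;
# Elk's actual implementation and scaling are not reproduced here).
LAT_MIN, LAT_MAX = -33.8800, -33.8647   # latitude domain from the OSM file
LON_MIN, LON_MAX = 151.1727, 151.2064   # longitude domain from the OSM file

def to_model_xy(lat, lon, width=1000.0, height=500.0):
    """Map lat/lon into a width x height window, origin at the lower-left."""
    x = (lon - LON_MIN) / (LON_MAX - LON_MIN) * width
    y = (lat - LAT_MIN) / (LAT_MAX - LAT_MIN) * height
    return x, y

print(to_model_xy(-33.8800, 151.1727))  # -> (0.0, 0.0), the lower-left corner
```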




Next we can use the OSM Data component to start organising and collecting the data from the OSM file. The component defaults to selecting building elements, but can be changed to select from any of the other map features in the OSM specification by right clicking on the component.



We can use the polyline component to draw closed polylines where possible. Note that although you may have selected certain categories to be extracted, such as ‘railway’, this is completely dependent on the data coming out of Open Street Map. This means that some of the OSM Data components may be empty depending on your location.



Once the data is in Rhino, you can add a point at 0,0,0 and use Rhino’s ‘EarthAnchorPoint’ command to assign the point to the desired latitude and longitude that we identified earlier, -33.8800, 151.1727. This will georeference your model.
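If you prefer to script this step, the anchor point can also be set through RhinoCommon. A minimal sketch, run from Rhino’s Python editor (the property names are from Rhino.DocObjects.EarthAnchorPoint):

```python
import Rhino

# Set the Earth Anchor Point so the model origin (0,0,0) corresponds to
# the lower-left latitude/longitude identified earlier.
doc = Rhino.RhinoDoc.ActiveDoc
eap = doc.EarthAnchorPoint
eap.EarthBasepointLatitude = -33.8800
eap.EarthBasepointLongitude = 151.1727
eap.ModelBasePoint = Rhino.Geometry.Point3d(0, 0, 0)
doc.EarthAnchorPoint = eap  # reassign to commit the change
```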


To generate a topography, Elk uses data that originates from the Shuttle Radar Topography Mission (SRTM) of 2000. The SRTM was flown aboard the space shuttle Endeavour from 11-22 February 2000. The National Aeronautics and Space Administration (NASA) and the National Geospatial-Intelligence Agency (NGA) participated in an international project to acquire radar data which were used to create the first near-global set of land elevations. Endeavour orbited Earth 16 times each day during the 11-day mission, completing 176 orbits. SRTM successfully collected radar data over 80% of the Earth’s land surface between 60° north and 56° south latitude with data points posted every 1 arc-second (approximately 30 metres). This data was then packaged into 1° x 1° tiles.


The Topography component can be used to generate points, curves, and a surface from various Digital Elevation Model (DEM) file formats. Elk currently accepts:


  • *IMG files at a resolution of 1/3 arc second. The higher resolution IMG files are available for the United States only and can be downloaded from the USGS National Map Viewer. When searching for the data choose Data > Elevation Products (3DEP), with a data extent of 1 x 1 degree and *IMG file format.



  • *GeoTIFF files at a resolution of 1 or 3 arc seconds. GeoTIFF files are available at 1 and 3 arc second resolutions for most of the earth and can be downloaded from the USGS Earth Explorer. This is the best option for data in Australia. When selecting your data sets, choose Digital Elevations > SRTM > SRTM Void Filled.



Once you have found your data set, go to results and download options and select GeoTIFF.




  • *HGT files at a resolution of 1 or 3 arc seconds. HGT files are the least reliable as they tend to contain holes in the data, but they are the most basic form of DEM and you can download them from here. The website is quite difficult to navigate; for example, to access Sydney select Version2_1 > SRTM3 > Australia. Ensure you unzip the file before trying to use it in Grasshopper. A minimal reader sketch follows.
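Since HGT is the most bare-bones of the three formats, it is easy to inspect in Python. The sketch below assumes a 3 arc-second SRTM3 tile, i.e. 1201 x 1201 big-endian signed 16-bit integers; the file name is hypothetical:

```python
import struct

ROWS = COLS = 1201  # SRTM3 (3 arc-second); use 3601 for 1 arc-second tiles

def read_hgt(path):
    """Return elevations as rows ordered north to south; -32768 marks voids."""
    with open(path, 'rb') as f:
        raw = f.read()
    values = struct.unpack('>%dh' % (ROWS * COLS), raw)
    return [values[r * COLS:(r + 1) * COLS] for r in range(ROWS)]

tile = read_hgt('S34E151.hgt')  # hypothetical Sydney tile
print(tile[0][0])               # elevation (m) at the north-west corner
```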



Once you have downloaded one of the datasets, it is just a matter of a simple Grasshopper script. We can use the latitude and longitude from the Location component. The Topography component will output points, curves and surfaces. In the example below I have flattened the points and used a Delaunay Mesh instead.



Once you have generated your topography, you can bake all the geometry and project all the curves onto your surface.


City of Sydney model


Recently I was charged with setting up a digital City of Sydney model acquired from AAM. This was a steep learning curve in GIS and large data sets, so I thought I would share some of my findings in case you ever find yourself in a similar position.




AAM is a geospatial services company specialising in the collection, analysis, presentation and delivery of geospatial information. AAM currently provides the City of Sydney with 3D GIS data, including the City Model that is supplied when preparing Development Applications. The model’s extent is the Sydney Local Government Area, which stretches from Moore Park to Glebe, and the CBD to Mascot. It therefore does not include the entire metropolitan area.


Depending on your licensing agreement, there are several models on offer, including:

  • Untextured building model LOD1 (2013);
  • Textured or Untextured building model LOD3 (2013); and
  • Textured terrain mesh (2008-2013).


The building models are supplied in both mesh (*3DS and *OBJ) and polygon (*DGN and *DWG) formats and were digitised using photogrammetric methods from aerial imagery captured on 25-28 February 2009 and updated from imagery captured on 7 March 2013. Keep in mind the date when the dataset was created: certain key buildings are not in the model, including One Central Park, Barangaroo and the Convention Centre.

LOD stands for ‘Level of Detail’. LOD3 is the highest resolution available. The LOD1 model takes the highest point of the building and simply creates an extrusion. It is therefore much lighter in file size and useful if limited detail is required. Both the LOD1 and LOD3 models extend down to RL0. That is, there are no topographic ‘pads’. Buildings intersect the topography instead of sitting on it.

Apart from the LOD differences, there are also some other differences depending on which model you use:



LOD3 building model


  • The LOD3 buildings are not ‘watertight’ in that they have no base. Generally this would cause issues when trying to 3D print the model. However, preliminary tests with Cura and an Ultimaker 2 Extended+ printer have been positive, indicating that no file clean-up is required.



LOD1 building model


  • The LOD1 model is property-aligned in the CBD, meaning the raw data has been manually adjusted to match cadastre information, whereas the rest of the city has not.
  • The LOD1 model is watertight and should 3D print without any issues.


Textured Terrain Mesh & LOD3 building model


AAM acquired Airborne Laser Scanning (ALS) data for the Sydney metropolitan area. This file contains a TIN from thinned, ground-classified LiDAR points in *OBJ and *3DS format. The data was acquired from February to June 2008. The terrain is textured and broken into 500x500m tiles. The file size of this model is very large and it is very difficult to use due to the sheer number of sample points.


Coordinate systems

AAM supplies models using both a GeoReferenced and Project Local coordinate system, as well as supplying them in millimetres and metres. The GeoReferenced coordinate system uses Map Grid of Australia Zone 56 (GDA94) [EPSG: 28356] whereas the Project Local has an origin point of 335,000 E, 6,250,000 N (MGA56).



The Geocentric Datum of Australia, 1994 (GDA94) is an Earth-centred datum, which has been adopted for use throughout Australia by the Inter-Governmental Committee on Survey and Mapping (ICSM) with all states and territories adopting it in 2000. The map projection associated with GDA94 is the Map Grid of Australia, 1994 (MGA94), a transverse Mercator projection, which conforms with the internationally accepted Universal Transverse Mercator Grid system.

When applied as a geographic coordinate system, GDA is known as GDA94; when applied as a projected coordinate system, it is referred to as the Map Grid of Australia 1994 (MGA94). For more detailed information on GDA94 and MGA94, refer here and here.

Note that if you plan on combining this CoS model with Open Street Map data imported from Elk or Flux’s Site Extractor, that information will come in using WGS84 and not MGA94. WGS84 is the World Geodetic System 1984. It provides the current standard for locational measurement worldwide, particularly in conjunction with the Global Positioning System (GPS) satellite network.


Project Local

The Project Local coordinate system has been repositioned from the GeoReferenced location and is located at the corner of Crown St & Kings Lane, Darlinghurst. The reason for this is the precision issues most software has with geometry located too far from the default origin (0,0,0). Since the CoS also uses the same Project Local coordinate system, I would recommend adopting this system too. This means that if you have adopted the Project Local coordinate system and receive a model or drawing at MGA94 coordinates, you will need to move it (in metres) as follows to get it to the Project Local coordinate system (a quick script for this follows the figure):

  • X = -335,000
  • Y = -6,250,000


Project Local origin point
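The shift itself is just a subtraction, which is worth scripting to avoid sign errors. A minimal Python sketch (the function name is mine):

```python
# Shift GeoReferenced MGA56 coordinates (in metres) to AAM's Project Local
# origin of 335,000 E, 6,250,000 N.
ORIGIN_E, ORIGIN_N = 335000.0, 6250000.0

def mga56_to_project_local(easting, northing):
    return easting - ORIGIN_E, northing - ORIGIN_N

# Example: the LEP sun-plane origin quoted below
print(mga56_to_project_local(334067.0, 6249731.0))  # -> (-933.0, -269.0)
```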


The exception to this is if you need to model the sun-angle planes as defined in Section 6.17 of the LEP. For whatever reason (it is not clearly documented), the CoS has decided to use truncated MGA coordinates which differ from the coordinate systems above. GeoReferenced coordinates are truncated as follows:

  • LEP: 34067 E, 49731 N
  • Actual GeoReferenced: 334067 E, 6249731 N


File structure

Depending on the software you intend to use to view the model, you’ll want to use different file formats of the raw data when setting up your projects. In general, meshes will produce considerably smaller file sizes but can be difficult to manipulate and may cause visualisation issues due to the mesh triangulation. If, for example, you use Rhino, the *DGN files actually import better than the *DWG files. This is because the *DGN files bring in surfaces whereas the *DWG brings in meshes and hatches. If you use the *DWG files, the hatches need to be exploded to be turned into surfaces and the meshes then deleted.
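If you do end up with the *DWG import, this clean-up can be scripted rather than done by hand. A hedged sketch using rhinoscriptsyntax (I am assuming ExplodeHatch and the object-type filters behave as documented; test on a copy of the file first):

```python
import rhinoscriptsyntax as rs

# Explode every hatch so its geometry survives on its own (filter 65536 = hatch)
for hatch in rs.ObjectsByType(65536) or []:
    rs.ExplodeHatch(hatch, delete=True)

# Delete the imported meshes (filter 32 = mesh)
meshes = rs.ObjectsByType(32)
if meshes:
    rs.DeleteObjects(meshes)
```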



The AAM city models are a great resource and add a whole new level of sophistication to the design process. They provide relatively accurate information to aid you in undertaking solar access analyses, view analyses and overshadowing analyses, to name but a few. Watch this space for more tutorials on integrating GIS and city data.

Flux Site Extractor


Flux Site Extractor is a Flux extension app which allows you to import GIS information. It uses the building outlines and road centrelines from the Open Street Map database and the elevation data from NASA’s Shuttle Radar Topography Mission. The app requires no installation, and hence no admin rights, which can prove useful on a work computer. You simply need to grant the Site Extractor app access to your Flux account.



The Bays Precinct, Sydney, Australia


Once activated, the map will be centred on your location. If your site is elsewhere, you can search for the specific location in the search box. Simply draw a rectangle over the area you want to export. Note that the rectangle will turn red if the area is too large. In this scenario, you may need to do multiple exports to several different Flux projects.

Select the features you want to extract from the menu. Also select the Flux project you want to send the data to. If no project is selected, a new Flux project will be created for you. Click on ‘Send to Flux’. A link to your project will appear under the project listing. In Flux, double-click on the layers you want to preview. You can also drag layers on top of each other to overlay the geometry.


The Bays Precinct, Sydney, Australia


Once the data is in Flux, you can import this into Grasshopper. In the example below I have used Elefront to bake the geometry into Rhino so that it lands on the correct layers. This is not essential and can be left out if required.



Note that although you may have selected certain data fields to be extracted such as ‘Buildings (accurate height)’, this is completely dependent on the data coming out of Open Street Map. This means that some of the Flux keys may be empty depending on your location. The discrepancy in Open Street Map detail can be clearly seen when comparing Sydney Australia (above) and New York, USA (below).



New York, USA


While there are many programs and plug-ins that are capable of extracting the same data (Elk, @IT, Meerkat GIS, and CAD Mapper), the benefit of using Flux Site Extractor is that this data can be sent easily to both Grasshopper and Dynamo. This allows both Rhino and Revit users to access the same data thereby simplifying the workflow.

Min balcony size


As part of the SEPP65 requirements in New South Wales, balconies for multi-residential buildings must meet minimum area and depth requirements. Section ‘4E Private open space and balconies’ in the Apartment Design Guide states that all apartments are required to have primary balconies as follows:


Dwelling type           Minimum area    Minimum depth
Studio apartments       4m²             n/a
1 bedroom apartments    8m²             2m
2 bedroom apartments    10m²            2m
3+ bedroom apartments   12m²            2.4m


The ‘Room.SetBalconyCompliance’ node, part of the BVN Dynamo package, helps automate the process of verifying compliance with the minimum area. Note that the minimum depth is not tested and needs to be checked separately.

Before using the node, first ensure that there are no redundant rooms in the project, as the area of these rooms will be incorrect and will skew the results.


The node first collects all rooms in the project using Lunchbox’s ‘Room element collector’. The corresponding apartment numbers are then collected. It is assumed that each apartment and its corresponding balcony have been numbered the same. For this particular example, the project was set up using a shared parameter called ‘Apartment Number’, which is used as the default value for the ‘apartmentNumberParameter’ input. This is because if you use the OOTB parameter ‘Number’ to group apartments and balconies you will receive the error, ‘Elements have duplicate “Number” values.’

Next we need to determine the dwelling type, that is, 1-Bed, 2-Bed, etc. For this example, the ‘Occupancy’ parameter is used as the default input to ‘apartmentNameParameter’. Once the apartment numbers and apartment names are known, the rooms are grouped together to represent all the rooms within the apartment, both internal and external.


Once grouped, the rooms are filtered by the ‘apartmentName’ input, for example ‘1 BED’, ‘2 BED’, etc. The balconies from these dwelling types are then extracted and their areas calculated and compared to the ‘minSize’ input. All balconies are first reset to non-complying; then only the complying apartments are updated via the shared parameter defined in the ‘sharedParameterName’ input, which defaults to ‘Balcony Compliance’. A sketch of this logic follows.
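To make the logic concrete, here is a minimal Python sketch of the grouping and area test, written independently of the Revit API (the dict fields stand in for parameter values extracted upstream; the node’s actual implementation is not reproduced):

```python
MIN_AREAS = {'STUDIO': 4.0, '1 BED': 8.0, '2 BED': 10.0, '3 BED': 12.0}  # m2

def check_balconies(rooms, apartment_name, min_size):
    """Group rooms by apartment number, then test total balcony area."""
    apartments = {}
    for room in rooms:
        apartments.setdefault(room['apartment'], []).append(room)

    compliance = {}
    for number, grouped in apartments.items():
        if not any(r['occupancy'] == apartment_name for r in grouped):
            continue  # not the dwelling type currently being tested
        balcony_area = sum(r['area'] for r in grouped
                           if 'BALCONY' in r['name'].upper())
        compliance[number] = balcony_area >= min_size
    return compliance

rooms = [
    {'apartment': '101', 'occupancy': '1 BED', 'name': 'LIVING', 'area': 30.0},
    {'apartment': '101', 'occupancy': '1 BED', 'name': 'BALCONY', 'area': 8.5},
]
print(check_balconies(rooms, '1 BED', MIN_AREAS['1 BED']))  # -> {'101': True}
```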

Diagrid via Flux


In this previous tutorial, we discovered how to export Revit levels to Grasshopper for the generation of a diagrid structure, before pushing the results back into Revit using Rhynamo and Dynamo. This tutorial looks at an alternative methodology whereby Rhynamo is eliminated and replaced with Flux.



Step 1: Setup Flux

  1. Set up and sign into your Flux account.


  2. Create a new blank project. In this example I have called it ‘RTC Exercise 03’.


  3. Select ‘open project’.
  4. Create ‘keys’. These are geometry/data that will be transferred to/from Flux. Simply hit the plus button in the data table on the left.


  5. Add the name and description as required. We need to create 6 keys in total:
    • Levels;
    • RLs;
    • Floor mesh;
    • Floor universal mesh;
    • Analytical model; and
    • Ring beams.

The purpose of all of these keys will be elaborated on later.


Step 2: Export Revit data/geometry

  1. Set up a Revit view which isolates just the Revit geometry to transfer.



Revit view of floor plates


  2. Generate an elevation or section showing the levels to be exported.


Revit section showing levels


  3. In Dynamo, open the Part 1 Dynamo file. Select the levels and floors to be exported and run the script. The script collects the selected levels, sorts them by elevation and returns their names and elevations (a sketch of this step follows). It also converts the floors into meshes. This data is then exported to Flux via the ‘Levels’, ‘RLs’ and ‘Floor mesh’ keys.
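Inside a Dynamo Python node, the sorting step might look like the following sketch (IN/OUT are Dynamo’s Python node conventions; Name and Elevation are standard Level properties):

```python
# Sort the selected levels by elevation and return their names and RLs.
levels = IN[0]
ordered = sorted(levels, key=lambda lvl: lvl.Elevation)
OUT = [lvl.Name for lvl in ordered], [lvl.Elevation for lvl in ordered]
```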



Dynamo script to export levels and floors



Step 3: Flux refactoring

  1. We need to refactor the geometry so that Flux can use the mesh from Revit. With Flow, generate the following:



Flux Flow to generate a universal mesh


From this point on, we need to use the ‘Floor universal mesh’ key rather than the ‘floor mesh’ key.


Step 4: Generate Grasshopper geometry

  1. Open Grasshopper and open the Part 2 Grasshopper script. The first part of the script imports three keys from Flux – ‘Floor universal mesh’, ‘Levels’ and ‘RLs’. The ‘Floor universal mesh’ and ‘Levels’ keys are purely for reference. The ‘RLs’ key is used to generate a series of ovals at the correct levels. Note that the elevations (RLs) exported from Revit will be absolute, that is, relative to the survey point. The script therefore makes an adjustment for this, because the Rhino file is set out based on the Project base point. From these ovals, a diagrid is generated (see the sketch below). Finally, the resultant geometry is pushed to Flux via the ‘analytical model’ and ‘ring beam’ keys.
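The diagrid logic itself is simple enough to sketch in plain Python: divide the oval at each level into n points and cross-connect alternate points between consecutive levels. The oval dimensions below are hypothetical:

```python
import math

def oval_points(rl, a=20.0, b=12.0, n=24):
    """Return n points around an a x b oval at elevation rl."""
    return [(a * math.cos(2 * math.pi * i / n),
             b * math.sin(2 * math.pi * i / n), rl) for i in range(n)]

rings = [oval_points(rl) for rl in [0.0, 4.0, 8.0, 12.0]]  # adjusted RLs

diagrid = []
for lower, upper in zip(rings, rings[1:]):
    n = len(lower)
    for i in range(n):
        diagrid.append((lower[i], upper[(i + 1) % n]))  # leaning one way
        diagrid.append((lower[i], upper[(i - 1) % n]))  # leaning the other
```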

Grasshopper script generating structural diagrid



Revit preview of resultant geometry



Step 5: Import geometry into Revit

  1. Open Revit and ensure the families to be placed are pre-loaded. Note that while structural framing could have been placed, adaptive components were used to avoid the visual gap created when using structural framing. The other family to be loaded into the project is a structural frame that we’ll use for the ring beams.
  2. Open the Part 3 Dynamo file, run the script and define which families are to be placed within the Dynamo script.



Dynamo script to re-create geometry from Flux


Structural tubes in Revit




The application of Flux in this example proved quite successful, as Flux provided a (semi-)live link for the levels and floors. Since the data being transferred wasn’t very large, the process was reasonably fast. Furthermore, Flux offered the possibility of having multiple people contributing simultaneously.

Structural tubes via Flux


Solar Analysis via Flux


In this previous tutorial, it was demonstrated how to undertake a detailed SEPP65 solar access compliance check. The methodology involved using Rhynamo as a means to extract Revit rooms for analysis in Grasshopper and Ladybug. This example offers an alternative methodology, eliminating Rhynamo and replacing it with Flux:




Step 1: Setup Flux

  1. Set up/sign into your Flux account.
2. Create a new blank project.

3. Select ‘open project’.

4. Create ‘keys’. These are geometry/data that will be transferred to/from Flux. Simply hit the plus button in the data table on the left.


5. Add the name and description as required. We need to create 10 keys in total:

    • Floors;
    • Walls;
    • Groups;
    • Meshed floors;
    • Meshed walls;
    • Meshed groups;
    • Combined universal mesh;
    • Room crvs;
    • Room numbers; and
    • SEPP65 compliance.


These keys can be created in each individual application but it is easier to plan it out first and do it all in one go. The purpose of all of these keys will be elaborated on later.




Step 2: Export Revit rooms and context using Dynamo and Flux

  1. Currently, Flux doesn’t support breps, so we need to convert the Revit geometry into a mesh. Unfortunately, Dynamo out-of-the-box does not have meshing tools, so you’ll need to install the ‘Dynamo Mesh Toolkit‘ package by Autodesk.


Mesh.ByGeometry custom node from the Mesh Toolkit Package


2. Simplify the context geometry for export. The easiest way to do this is to set up a 3D view within Revit. Using Visibility Graphics (VG) in conjunction with worksets and filters, turn off any unnecessary geometry. Essentially all we require are party walls, floors and roofs. Extraneous geometry such as planting, furniture, doors and mullions can all be hidden. Windows are effectively transparent and can therefore also be excluded from the analysis. The same applies to glass balustrades. However, since these are often modelled as a wall, you will need to set up a filter to turn off only the glass balustrade wall types and not all walls. It is critical that the context model be as clean and minimal as possible to minimise computational time later in the workflow.

3. From this view, duplicate and create 3 separate views isolating different elements – walls, floors and groups (internal walls).




Revit walls to be exported




Revit floors to be exported



Revit groups (internal walls) to be exported


4. Open Part 1 of the Dynamo script. This script exports the context geometry. Due to the file size, the model will be exported in batches: walls, floors and roofs. Note that you will need to run the script several times if you have Dynamo set to ‘Manual’ in order to pick from the menus in the Flux nodes; otherwise you’ll need to set Dynamo to ‘Automatic’. Flux’s flow control node therefore does not have much meaning when you are working in Dynamo’s manual mode. For additional control you can leave the flow control mode on ‘once’, and after the manual run in Dynamo you can right-click the ‘to Flux’ or ‘from Flux’ components and click ‘push data’ or ‘pull data’.

The Python script shown below simply extracts all the geometry in the view, so there is no need to manually select elements. You will need to re-run it on the 3 views created, each time writing to a different Flux key.
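That script isn’t reproduced here, but a view-based collector in a Dynamo Python node typically looks something like this sketch (standard Dynamo/Revit API boilerplate; the view is assumed to be wired into IN[0]):

```python
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import FilteredElementCollector
clr.AddReference('RevitServices')
from RevitServices.Persistence import DocumentManager
clr.AddReference('RevitNodes')
import Revit
clr.ImportExtensions(Revit.Elements)

doc = DocumentManager.Instance.CurrentDBDocument
view = UnwrapElement(IN[0])  # the 3D view isolating the context geometry

# Collect every model element visible in the view, skipping element types
collector = FilteredElementCollector(doc, view.Id).WhereElementIsNotElementType()
OUT = [e.ToDSType(True) for e in collector]
```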




Dynamo script to export context geometry using Flux


5. Open Part 2 of the Dynamo script. This script exports the room geometry. The easiest way to do this is to use the LunchBox Room element collector node.


Dynamo script to export rooms geometry and room numbers using Flux



Step 3: Flux refactoring

  1. In Flux, check that the data has been transferred. For some of the keys, particularly the context and room curves, you might get the following image. Simply click the ‘load value’ icon to preview the geometry.



2. If you don’t already have access, go to Flux Labs and sign up for the ‘Dynamo Mesh converter’. Hopefully in the future this will be much more accessible. The reason we need this is that Dynamo meshes have a unique format that will not be recognised by Grasshopper. Flux has therefore developed a converter block that turns a Dynamo mesh into a universal mesh that you can view in Flux and send to Grasshopper. Essentially the difference is as follows: a Rhino mesh takes the form of a list of vertices (points with X,Y,Z coordinates) stored in an ordered list, plus a list of faces, each of which is an ordered list of vertex indices (refer more here).



Grasshopper Mesh definition


Dynamo, on the other hand, creates a similar list of vertices, but rather than storing the faces as lists of vertex indices, it stores them again as lists of vertices with repeated X,Y,Z coordinates. It is unclear why the Dynamo team chose this data structure, but it obviously leads to a much larger file size for the same mesh, as the X,Y,Z coordinates of each vertex are repeated many times.
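To make the difference concrete, here is a toy comparison of the two structures for a single quad split into two triangles (illustrative only, not the actual wire formats):

```python
vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]

# Rhino/Grasshopper style: faces reference vertices by index.
indexed_faces = [(0, 1, 2), (0, 2, 3)]

# Dynamo style: each face repeats the full coordinates of its vertices.
repeated_faces = [
    [(0, 0, 0), (1, 0, 0), (1, 1, 0)],
    [(0, 0, 0), (1, 1, 0), (0, 1, 0)],
]

# The repeated form stores 6 points (18 floats) against 4 points plus
# 6 integers for the indexed form; the gap widens rapidly on large meshes.
```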


3. Go to Flow, which is Flux’s visual scripting platform, and generate the following code. Hover in column A and pick the ‘add a data node’ option (shortcut ‘D’). Drag the ‘Context mesh’ key to the left-hand side of the data node. In stage A-B, add the Flatten and dynamoMeshEater nodes. (Note that once you have the mesh in Flux and have been invited to use the ‘Dynamo Mesh converter’ block, you will be able to double-click in the Flux Flow and search for DynamoMeshEater.) In the menu icon on the top right of the dynamoMeshEater node, set the lacing to Longest. In column B, add a data node and drag the ‘Universal mesh’ key to the right-hand side of the node. Repeat this process for the other elements (walls, floors, roofs, etc.).



Flux Flow


By going back to the data pane, you should now see the preview of the ‘universal mesh’.



Flux universal mesh


Step 4: Solar Analysis with Ladybug

  1. Open a blank Rhino file and then open the Grasshopper Part 3 file. Ensure the Flux components are pointing to the correct Flux project and keys. Once Ladybug has finished running, the list of complying room numbers will be pushed to Flux. Note that if your rooms have internal islands or aren’t clean geometry, you may need to modify the polycurve and planar surface creation nodes to ensure the room surfaces and room number lists match up.


Grasshopper script



Step 5: Import results into Revit


  1. In Dynamo, open up the Part 4 file. The script will first ‘reset’ all the rooms to be non-complying, before updating only those rooms imported from Flux.
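The reset-then-update pattern inside that script can be sketched as follows (GetParameterValueByName and SetParameterByName are standard Dynamo element methods; ‘SEPP65_2 Hrs min’ is the shared parameter named later in this post):

```python
rooms = IN[0]           # all rooms in the project
complying = set(IN[1])  # complying room numbers pulled from Flux

for room in rooms:
    number = room.GetParameterValueByName('Number')
    # One pass: non-complying rooms get False, complying rooms get True
    room.SetParameterByName('SEPP65_2 Hrs min', number in complying)
OUT = rooms
```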






While in theory Flux appeared to simplify the workflow, the main limitation was the processing of the context geometry. Due to the large file size, the upload to the Flux website was incredibly slow. Having said that, this tutorial was produced with Flux/Dynamo 0.9.2; according to Flux, the new Flux plug-in for Dynamo 1.0 performs significantly better.

The other issue I faced was the refactoring process. For example, Flux doesn’t accept polycurves (yet), so the original Grasshopper script that used Rhynamo had to be modified slightly. Similarly, having to use the Dynamo Mesh converter is somewhat annoying and in an ideal world it would be eliminated from the workflow.

In conclusion, for this particular exercise, I don’t believe Flux added much value to the workflow, because all it was being used for was to export dumb geometry, and a simple *dwg export would have been much faster. In any case, this exercise probably wasn’t the most strategic use for Flux; its potential lies elsewhere.


SEPP65 Solar analysis pt2_Flux

BIM ecosystem


Building Information Modelling (BIM) entails interdependencies between technological, process and organisational/cultural aspects. These mutual dependencies have created a BIM ecosystem in which BIM related products form a complex network of interactions.¹ For a long time interoperability between these products has been essentially non-existent, resulting in users unwilling to interchange between different software platforms. Rather than using the best product for the job, users have preferred to remain in the software that they are most familiar with, possibly to the detriment of the design. One such example of this is conceptual massing within Autodesk Revit. This post explores how to extend Revit’s modelling capabilities by combining it with McNeel’s Rhinoceros and Dynamo.


It is important to emphasise that BIM is an ecosystem and that there is no single software that can do everything. Within the graphic design industry, Adobe has recognised this and produced a suite of software including Photoshop, Illustrator and InDesign, amongst others. Each has a very clear and well-defined scope: Photoshop is used for raster images, Illustrator for vector images and InDesign to compile it all together. Each is separate but linked together in the workflow. In general, architects are accepting of this ecosystem and are agile enough to move between each platform. However, this ethos of a software ecosystem needs to be applied to BIM software as well.


Software ecosystems: Adobe vs Autodesk


The term ‘BIM’ can mean different things to different people. Despite BIM bridging the entire project from conception through to facilities management, depending on who you talk to, different people will have different biases. For example, as an architect, I am naturally more interested in geometry and the design authoring phase of BIM. These biases, which are endemic in the AEC industry, are elaborated on in Antony McPhee’s post, ‘Different BIMs for different purposes‘. With so many perspectives on what BIM is, one needs to question the role of a BIM Manager. Dominik Holzer, in his RTC Australasia presentation ‘You are a BIM Manager – Really?’, discusses how BIM managers need to go beyond tool knowledge and develop management acumen. But that can be a hard thing to accomplish. Even if we focus on a particular BIM bias, say design authoring, there is a plethora of tools out there that a BIM manager must master, or at the very least, have an appreciation of. Here are just a few of them.



Most of the software mentioned above is generally accepted as part of a BIM workflow. Why then are so many people resistant to creating a federated model and instead insist that everything must be created within Autodesk Revit? It was a pleasure to see that this year’s RTC Australasian conference was run in conjunction with Archicon, ArchiCAD’s equivalent conference. This was a small acknowledgment from the profession that BIM is bigger than just a single piece of software and that we need to seriously consider interoperability in our workflows.



Design authoring

Although Autodesk Revit is often chosen as the primary documentation tool in many architectural offices, it is widely acknowledged that Revit is extremely limited in dealing with complex conceptual modelling. Autodesk’s attempts at introducing a dedicated conceptual modelling environment in the form of FormIt and Inventor haven’t really taken hold as of yet. As a workaround, many offices have adopted McNeel’s Rhinoceros, or Rhino for short, for the conceptual design phase and Autodesk Revit for the development and documentation phases.


The integration between McNeel’s Rhino and Autodesk’s Revit in the past has proven to be a challenge for many designers looking to combine freeform geometry with BIM. These interoperability barriers are partly due to Revit’s API limitations and also arguably, a lack of development of the Industry Foundation Class (IFC) file format, which promised universal interoperability within the AEC industry.



IFC: A promise of universal interoperability


The other barrier is one of culture and perception. Rhino is sometimes regarded as useful for conceptual modelling only, whereas Revit is preferred for documentation. Over and over I see these two worlds failing to collaborate. The scenario generally looks something like this: young, recent graduates who are relatively computationally literate spend long hours at the office to win a competition. These people then jump onto other projects and, if the competition is awarded, a new team is put together to deliver the project. This will most likely contain more experienced architects, more attuned to developing and delivering the project. The Rhino model is thrown out and completely rebuilt from scratch within Revit. Sound familiar?

The problem with this approach is that it more often than not involves losing much of the intelligence built into the original design. And since Revit is unable to accurately recreate complex geometry, the design needs to be dumbed down to comply with Revit’s limitations. This methodology couldn’t be further from what BIM sets out to achieve. So what is the solution?




For some, the solution is simply to produce everything natively within Revit from the beginning. However, as shown in this case study, Revit sometimes simply isn’t capable of creating the desired forms. One therefore needs to look to other software that can, such as Rhino, but this too can have limitations.

Those that have attempted to use Rhino geometry within Revit will be all too familiar with the difficulty in successfully achieving this. The most basic method is to simply export a *sat file from Rhino and then import it into Revit. Best practice is typically to do this via a conceptual mass family. However, the result can be called ‘dumb geometry’ in that it is unable to be edited once imported, which is far from ideal in a BIM environment.

The next progression in usability is to explode the import instance. Depending on the base geometry, Revit may make editing the mass accessible via grips (push/pull arrows), but this is not always the case. However, the best method to generate native Revit elements from Rhino is to use the wall-by-face, roof-by-face, curtain system and/or mass floors commands. These elements will be hosted on the Rhino geometry. In theory, these elements can be updated if the base geometry changes, but experience has shown that Revit is not always able to re-discover the host element, and new elements will need to be created. Therefore, it is prudent to test the Rhino-to-Revit workflow before investing too much time in embellishing the Revit elements.

The whole process of using Rhino geometry in Revit feels a bit like black magic. Often, the first attempt at importing the geometry will fail and there will be no guidance from Revit as to why. This can prove very frustrating and many users will simply give up. However, if these best practices are followed when generating the mass in Rhino, integration with Revit should be seamless, or at the very least, less painful.



Interoperability plug-ins

The best practices mentioned above all rely on more or less the same techniques once in Revit: wall-by-face, roof-by-face, curtain system or mass floors. In other words, a base mass needs to be provided to host the element. But what if you want to create other elements which aren’t roofs, floors or walls? This is where third-party plug-ins come into play. Some of these (past and present) include:

  • Chameleon – Uses Chameleon Adaptive Component Systems (CACS). The workflow is bi-directional whereby geometry can be created in Grasshopper, exported to Revit and then re-imported back into Grasshopper.
  • Geometry Gym – Allows Grasshopper geometry to be translated into Revit using OpenBIM formats (primarily *ifc). The Industry Foundation Classes (IFC) is a neutral platform, open file format specification that is not controlled by a single vendor or group of vendors. It is an object-based file format with a data model developed by BuildingSMART. This workflow is probably the most complicated but potentially allows much greater control over the geometry and the element properties.
  • Grevit – Enables you to assemble your BIM model in Grasshopper and send it to Revit or AutoCAD Architecture. Once the elements have been sent, their geometries and parameters can be updated by another commit.
  • Hummingbird – Exports basic geometric properties and parameter data to Comma Separated Value (*csv) text files. In Revit, this data is easily imported using the WhiteFeet ModelBuilder tool, which is included in the download. In April 2015, Hummingbird was updated to support exporting Revit geometry into Rhino, making it bi-directional.
  • Lyrebird – Similar to Chameleon in that it uses adaptive components but is only uni-directional.
  • OpenNURBS – A plug-in that allows Revit to automate the process of importing Rhino geometry.


Rhino to Revit plug-in comparison


Regardless of the plug-in adopted, there were common limitations amongst them:

  1. Complex geometry – Apart from OpenNURBS, the plug-ins don’t actually export the Grasshopper geometry you have created but rather re-create that geometry in Revit through a series of input parameters. For example, when exporting a wall from Grasshopper you don’t reference the wall; rather, you are required to define the wall’s centre line and height for it to be rebuilt within Revit. Therefore, if the wall is irregular in height, it will not translate accurately.
  2. Longevity – As shown in the timeline below, many of the plug-ins have either been discontinued (as is the case with Chameleon and OpenNURBS) or made open-source (such as Grevit and Lyrebird) and have stopped being updated regularly.


Timeline of interoperability plug-ins as at June 2016.


  3. Unidirectional – With the exception of Chameleon and, only recently, Hummingbird, most of the plug-ins are unidirectional, which significantly limits interoperability.
  4. Classification – Only selected Revit elements are able to be created. If, for example, you wanted to create a ceiling or stair, you couldn’t, as there are no components for it.


However, with the introduction of Dynamo, a whole new world of possibilities has emerged to facilitate interoperability. Since the release of Dynamo 0.7.2 back in September 2014, three new interoperability tools have emerged:

  • Flux – The new kid on the block and without a doubt the most promising. Flux provides cloud-based collaboration tools to exchange data and streamline complex design workflows. Flux plug-ins work with Rhino/Grasshopper, Excel and Revit/Dynamo to automate data transfer to and from Flux, and Flux has plans to expand this support to AutoCAD, SketchUp and 3ds Max. A Flux project is the focal point for data exchange and collaboration. You can invite teammates into your project to share data. Each user and application controls when to synchronise data with the project, allowing users to work in isolation until they are ready to share their changes with the team. Since Flux was developed by Google[x], it will only work with Google Chrome.


  • Mantis Shrimp – Allows you to read Rhino’s native *3dm file type as well as export geometry from Grasshopper. Mantis Shrimp works in a similar manner to Rhynamo.


  • Rhynamo – An open-source node library for reading and writing Rhino *3dm files. Rhynamo uses McNeel’s OpenNURBS library to expose new nodes for translating Rhino geometry for use within Dynamo. In addition, several experimental nodes are provided that allow for ‘Live’ access to the Rhino command line.


To assist Rhino users in becoming acquainted with Dynamo, I have produced a ‘Dynamo for Grasshopper Users‘ primer. Since everything is a little different in Dynamo, the primer provides a list of ‘translations’ in order to find a comparable node/component.

Dynamo for Grasshopper users primer


To date, most of the tutorials presented on Parametric Monkey have focused on Rhynamo. Since Rhynamo doesn’t work straight out of Grasshopper (it needs baked Rhino geometry), we will need to use Elefront for our data management. Elefront was developed by Ramon van der Heijden and is a plug-in focused on managing model data and interaction with Rhino objects. The plug-in allows users to bake geometry to the Rhino model with the option of specifying attributes, including an unlimited number of user-defined attributes by means of key-value pairs. This way it is possible to treat a 3D Rhino model as a database, where each object ‘knows’ what it is, what it belongs to, where it should go, what size it is, when it needs to be fabricated, etc. Instead of trying to store geometry in a database, Elefront stores data in a ‘Geometrybase’, thereby turning your Rhino model into a quasi-BIM model. Elefront is therefore ideal for combining with Rhynamo to export Grasshopper data into Dynamo/Revit.




The purpose of this post was to present the notion of a BIM ecosystem and encourage an open and agile workflow amongst the AEC profession. To illustrate this notion, the post explored in detail how to extend Revit’s modelling capabilities by combining it with Rhino and Dynamo to create an integrated BIM environment. It is hoped that through a greater awareness of each software’s strengths and weaknesses, BIM professionals can use the right tool for the job, rather than being constrained to a single software.


1 Gu, N. et al. (2016). BIM Ecosystem: The co-evolution of products, process and people.

BVN Dynamo package v1.0.0 released


I am pleased to announce the latest release of the BVN Dynamo package, v1.0.0, which is compatible with Revit 2017 and Dynamo 1.0.0. There have been substantial changes to how Dynamo operates and it has been a mission to keep up and ensure that the nodes keep working as intended. Hopefully, as we progress, we should see more robust performance.


Below is a list of the nodes available in the current version, with hyperlinks to tutorials demonstrating how they can be utilised. Since launching the BVN Dynamo package back in March 2015, the package has grown to 33 custom nodes, with more still on the way. The package has five dependencies which will be installed when installing the BVN package:


  • Clockwork for Dynamo 0.9.x (0.90.6)
  • Ladybug (0.1.2)
  • LunchBox for Dynamo (2016.5.3)
  • Rhynamo (2016.5.3)
  • Steam Nodes (0.8.43)







  • AdaptiveComponent.FromExcel – Places an adaptive component family from an Excel spreadsheet with comma-separated points.
  • Area.ImportInstance – Places 3D import instance masses based on areas.
  • Family.FromExcel – Places a family by point and level from an Excel file. Families will only be placed on a level if the Z value matches an existing Revit level.
  • Grid.FromExcel – Reads an Excel file to generate grids.
  • Group.ByTypeAndLocation – Places a model or detail group based on group type and a location (point).
  • Level.Plane – Creates a plane from level(s).
  • Levels.FromExcel – Generates levels from an Excel spreadsheet.
  • Point.FromExcel – Creates a point from an Excel comma-separated string.
  • Room.ExportToRhino – Collects all rooms in the project and exports polycurves to a Rhino file. Ensure the Rhino file is closed in order to write to it. WARNING: WHEN EXPORTING, RHYNAMO WILL OVER-WRITE THE RHINO FILE. ENSURE YOU WRITE TO A BLANK DOCUMENT.
  • Room.ImportInstance – Searches for rooms by a name (string) and places import instance volumes. The import instances are named the same as the room number they originated from. Useful if wanting to export a room/program massing model out to Rhino.



  • Door.RenumberByRoom – Renumbers doors based on the ‘to’ room. If no ‘to’ room is present, it will use the ‘from’ room, e.g. external doors.
  • Room.AdjustUpperLimit – Extracts all rooms and adjusts the upper limit so that it is the level above with zero limit offset.
  • Room.CentreLocation – Moves the room to its centroid. If the centroid is outside of the room boundary, the room will remain in its current position. Based on the script by Modelical.
  • Room.CreateUnplaced – Creates unplaced rooms from a list of room names.
  • Room.RenameByArea – Renames apartment rooms (1Bed, 2Bed, etc.) based on current area.
  • Room.RenameByModelGroup – Renames rooms based on the model group placed inside them. Useful for residential or healthcare projects where model groups are used and the room name needs to be in sync with the model group.
  • Room.RenumberByModelCurve – Renumbers rooms based on a model line drawn through rooms. Will only renumber rooms that intersect with the model line so that batch processing can be done.
  • Room.SetSEPP65Parameter – Reads an Excel file containing a list of complying SEPP65 apartment room numbers and then sets the Revit ‘SEPP65_2 Hrs min’ shared parameter to True, with the rest False. The script will remove the first line from the Excel spreadsheet, i.e. the heading.
  • RoomTag.MoveToRoomLocation – Moves room tags to the room location point. Use the Room.CentreLocation node first before using this node. Based on the script by Modelical.


  • FilledRegion.Perimeter – Returns a filled region’s perimeter, with the option to filter out line styles from the calculations.





  • Filter.GetUnused – Extracts all the unused filters in your Revit project so that they can be purged.
  • ModelGroup.TypeCollector – Returns all the types (not instances) from all model groups in the project.






  • View.FromSun – Generates views from the sun to visually verify the results of Ladybug’s SEPP65 ‘heat map’.
  • View.SetUnderlayToNone – Sets a view’s underlay to none.
  • View.SwitchTitleblock – Switches the titleblock family type located on a sheet. ‘Sheet number series’ is a string prefix of the sheet names to be modified, e.g. “B”.


  • SunSettings.GetTimesAndSunDirections – Needs to be combined with SunSettings.TimeZone. Unlike ‘SunSettings.SunDirections’ and ‘SunSettings.StartDateType’, this node will return all the dates and vectors for a solar study.
  • SunSettings.TimeZone – Extracts the current time zone in hours relative to UTC. Since Dynamo converts all times and dates to UTC, this can be confusing. This node maintains the current time zone so that results can be verified more easily.
  • View.Phases – Returns the current view’s phase.








  • IsInteger – Checks a list for input types and returns a Boolean value based on an integer. Useful when importing Excel values so that Dynamo can process them accordingly.
  • IsNumber – Checks a list for input types and returns a Boolean value based on a number.
  • IsString – Checks a list for input types and returns a Boolean value based on a string.