Computational Design Process: Reflections from the Now-Constructed UNO Baxter Arena
This past October, the University of Nebraska Omaha (UNO) Mavericks won their season-opening hockey game at the recently opened Baxter Arena. The arena, designed by HDR and located in Omaha, Nebraska’s Aksarben Village, seats 7,898 fans.
Almost four years ago to the day, HDR started the design process for the UNO Community Facility. In 2012, President Barack Obama had just been re-elected, the housing crisis was finally over, 3-D printers were showing up on desktops, and self-driving cars with automatic stopping and parallel parking were becoming a hot topic.
Four years from concept to construction is still relatively quick in the AEC industry, considering most projects go on hold once the concept is developed, waiting for funding or land acquisition. By comparison, the entire social media revolution was built and populated with billions of users in the four years from 2008 through 2012. To say the AEC industry is slow to adapt would be an understatement. We all know the industry is slow, but when a project takes four-plus years to design and construct, the tools we started with have either already changed or no longer exist.
Quickly adapting to new tools and custom workflows is pivotal to overcoming this gap between the time it takes to construct a building and the speed of technology. The days of standard workflows that apply to every project unchanged are all but gone. Today’s teams need to be prepared to “cherry-pick” from past and current work-in-progress workflows the ones that most effectively solve the problem at hand.
Back in 2012, Nate Miller, formerly of CASE and now with Proving Ground, helped HDR build custom tools and workflows for the UNO Community Facility project. In 2013, Nate and I produced a white paper about our process and presented it at the University of Southern California BIM Symposium. Now that the arena is fully constructed and open to the public, it is time to reopen that white paper and see what we can “cherry-pick.” The computational design tools applied to the project fall into three categories: concept design, daylight analysis, and façade workflow.
One of the successful ways computational design was implemented during the concept phase was through parametric ‘throw-away’ models. We developed an iterative process of creating concept sketches and then building on-the-fly parametric models. The approach is similar to building a CAD drawing from a sketch to test scale and viability, but the parametric model allowed the team to flex the model, test options, and go back to sketching. One of the biggest misconceptions is that computational design replaces the process of sketching; in practice it feeds the sketching loop. Some designs are repetitive or driven by certain parameters. The typology of an arena, for example, dictates the overall height and shape of the seating bowl. In the end, flexing the location of the community ice produced many different options, both in programming adjacencies and in how the facility reads as one approaches it.
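To make the ‘throw-away’ idea concrete, here is a minimal sketch of a parametric seating-bowl profile, the kind of quick model that gets flexed and discarded. Every number here (tread depth, riser heights, eye height, the row counts tested) is illustrative, not a value from the Baxter Arena project:

```python
# Minimal sketch of a parametric 'throw-away' model: a seating-bowl
# section driven by a few parameters. All dimensions are illustrative.

def bowl_profile(rows, tread=0.85, first_riser=0.40, riser_growth=0.02):
    """Return (x, y) spectator eye positions for each row of a simple bowl."""
    points, x, y = [], 0.0, 0.0
    riser = first_riser
    for _ in range(rows):
        points.append((x, y + 1.2))  # seated eye height ~1.2 m above tread
        x += tread
        y += riser
        riser += riser_growth        # growing risers steepen the bowl
    return points

def flex(rows_options):
    """'Flex' the model: test several row counts and report bowl height."""
    return {rows: round(bowl_profile(rows)[-1][1], 2) for rows in rows_options}

print(flex([20, 25, 30]))
```

Flexing a single parameter, such as row count, immediately reports how tall the bowl becomes; that kind of instant feedback is what keeps the sketch-then-model loop moving.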
The process of creating a parametric ‘throw-away’ model is something computational designers at HDR do frequently, whether testing a program on the site or dynamically flexing the design. Over this last year our Practice Innovation team has created a content library for computational design so that anyone at HDR can access what has previously been done. The library lends itself to the ‘throw-away’ model process: you can see what has already been built and either reuse it or use it as a starting point for your next project.
As the Baxter Arena design developed, one of the driving concepts was the community ice becoming a major feature, serving as an open front entry to what is typically a concealed box. Wrapping glass around an ice rink required extensive daylight and energy models throughout the design development phase. Back in 2012, running quick daylight and energy models as an iterative design tool was quite revolutionary. In test simulations, the roof was lowered on one side, the overhang was increased, and a wall was added to keep direct sunlight from hitting the ice. These options were developed iteratively and provided quick feedback to the team. The result was a design that maximized the amount of glass and minimized the amount of direct sunlight on the ice.
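The overhang studies reduce, at their simplest, to a geometric question. The sketch below is not the simulation workflow we used (that relied on full daylight and energy models); it only illustrates the kind of check each iteration answered, and the glass height, overhang depths, and solar altitude are all assumed values:

```python
import math

# Illustrative check: for a glass wall of a given height, how far does
# direct sun at a given solar altitude reach inside past the glazing,
# and how much overhang is needed to stop it? All numbers are assumed.

def sun_penetration(glass_height_m, overhang_m, solar_altitude_deg):
    """Horizontal distance (m) direct sun reaches inside past the glazing."""
    reach = glass_height_m / math.tan(math.radians(solar_altitude_deg))
    return max(0.0, reach - overhang_m)

for overhang in (0.5, 1.5, 2.5, 3.5):
    p = sun_penetration(glass_height_m=6.0, overhang_m=overhang,
                        solar_altitude_deg=60.0)
    print(f"overhang {overhang} m -> sun reaches {p:.2f} m inside")
```

Sweeping a parameter like overhang depth and reporting the result for each option mirrors the iterative feedback the design team was getting from the real simulations.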
Over the last four years the industry has made tremendous strides in developing daylight tools for design teams. HDR currently uses tools on projects, such as Sefaira, DIVA, and Insight 360, that were either unavailable or in their infancy in 2012. HDR has a number of initiatives focused on daylight and energy tools for design teams to use during concept and design development. Over the next four years I expect the AEC industry to see even greater adoption and integration of these tools.
As the design and program began to take shape, developing the façade became critical. Most projects hit a point where schematic design is completed, or close to it, and the production (Revit) model begins. During this phase the Revit model is populated with floors, walls, rooms, and doors. One of the more challenging things to build in Revit is a façade, especially one that is curved with custom panels. It’s easy to move walls and rooms around as the design and program change, but updating a custom façade in Revit with traditional workflows is far more difficult and time consuming. Nate Miller, while at CASE, developed a custom workflow for HDR that connected the design model in Rhino to our production model in Revit. This process allowed the design team to iterate on program and design options while keeping the concept and production models linked. In the end, we were able to maintain a consistent framework for developing the façade. The final solution didn’t require custom panels, but by that point the framework could adapt to any pattern and material quite easily.
Since 2012, interoperability tools have been an extremely hot topic. New tools like Flux and Rhynamo have created new and efficient ways to link the design and production models. For teams looking to increase speed and reduce rework, interoperability is the key to all future workflows. Some firms (Thornton Tomasetti, for example) have invested heavily in developing their own custom interoperability tools spanning concept, analysis, and production.
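The linking pattern behind these tools can be sketched in miniature. The schema and field names below are hypothetical (the actual CASE workflow was built on different machinery), but serializing design geometry into a neutral payload that a production-side importer rebuilds is the core move in most interoperability workflows:

```python
import json

# Hedged sketch of design-to-production interoperability: serialize
# facade panel corner points from a design model into a neutral JSON
# payload that a downstream importer (e.g. a Dynamo/Rhynamo graph on
# the Revit side) could rebuild as panels. The schema is made up.

def panels_to_payload(panels):
    """panels: list of 4-point tuples ((x, y, z), ...) per panel."""
    return json.dumps({
        "schema": "facade-panels/0.1",   # hypothetical schema name
        "units": "meters",
        "panels": [
            {"id": i, "corners": [list(pt) for pt in quad]}
            for i, quad in enumerate(panels)
        ],
    })

quad = ((0, 0, 0), (3, 0, 0), (3, 0, 4), (0, 0, 4))
payload = panels_to_payload([quad])
print(payload[:60])
```

Because the payload carries only geometry and IDs, either end can change tools without breaking the other, which is what makes the design model free to keep iterating after the production model has started.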
Four years from now will be 2020. In the last year alone, HDR has doubled the number of people using computational design tools, and I imagine that by 2020 these tools will represent a standard way of working. I’m curious whether the speed of construction and the speed of technology will continue to develop at different rates, or whether we will finally see new ways of building take hold.
Today’s race is shaping up to be going from design direct to manufacturing. Design tools that link up with cost models, financial models, and construction fabrication are possible with our current technology. This is the future, and whoever figures it out first is going to be extremely successful.