AR/VR for Rhino and Grasshopper UK UGM | October 2020
Join Simply Rhino, Heatherwick Studio and Epic Games for our live & online AR/VR-focused Rhino User Group meeting.
This was our first online AR/VR User Group meeting. Heatherwick Studio started the evening's presentations, followed by Epic Games (developers of Unreal Engine and Twinmotion), and the meeting finished with a Q&A session with our three panelists. The meeting took place on Thursday 8th October 2020, 18:30 – 20:30 (London/UK time).
For the video recording of the meeting, please go to the foot of this page.
Heatherwick Studio has been working with game engines as part of its design workflow for years and has developed custom design workflows and techniques that enable these processes.
Silvia Rueda will provide an insight into Heatherwick Studio's use of immersive media, with a focus on the role of landscape design and the use of Unreal within its design process and workflow.
Image: Courtesy of Heatherwick Studio
David Weir-McCall from the Epic Games Enterprise team will take a look at the many ways that people are utilising the power of the Unreal Engine in the AEC industry to go beyond visualisations, to help bridge the gap between ideas and reality.
Looking at use-cases in the industry we will explore the different integrated workflows with Rhino and Grasshopper and how they are being used to communicate ideas, design and build in real time, and link up to sensors to create fully functioning digital twins. This includes covering works by relevant partners including Mindesk, Speckle & BHoM.
Images: Left – Courtesy of AHMM; Right – Courtesy of SPP and Imerza.
Meeting Presenters:
Organised by Simply Rhino
Sponsored by BEAM
Thanks to both Heatherwick Studio and Epic Games for joining us at the meeting.
For details on the previous AR/VR for Rhino & Grasshopper meeting you can visit here.
AR/VR for Rhino and Grasshopper UK UGM with Heatherwick Studio and Epic Games – Video Recording Transcript
We have made a transcript of the meeting recording, if you’d like to follow that then here it is:
Paul: Right, welcome everybody. This is the first of our virtual versions of our AR VR User Group Meeting, held here in the UK. It's actually the sixth of this type of meeting, but the first one we've held virtually. We've met (for this format of meeting) before at AKT II, at Grimshaw, at Bryden Wood, at the Heatherwick Studio offices and at SOFTROOM.
I'm joined by some friends here, two from Heatherwick Studio: Pablo and Silvia, who will be presenting first.
Pablo is the Head of Geometry and Computational Design at Heatherwick Studio. Silvia is the Lead Designer of the Immersive Cluster at Heatherwick. So, they'll be presenting first for 30 minutes or so, and then we'll be hearing from David Weir-McCall from Epic Games, part of the Enterprise Team in the AEC area.
Just a couple of other things to mention here. There’s quite a big group joining us. There might be as many as 300 or so, so please with questions, if you could address them in the questions panel rather than the chat panel, that would be great. They’re going to be monitored by myself and Steph who is in the background helping out. So, yes, please add them in questions. As there is quite a lot of you, there could be potentially quite a lot of questions but we’ll do our best to get as many questions to the presenters as we can. There is also the chat dialogue opportunity. You can use that to talk between yourselves, if you want to communicate with anyone else that you know is also participating.
What else is there to say?
We're having this presentation first from Heatherwick. There's a couple of polls that we'll ask you to complete. Then we'll hear from David, then Q&As for both presenters, then a round-up. Then after all of this, there is an opportunity to join us on the Mozilla Hubs platform for a fun little meeting, because normally after these things we would have a nice social meet-up, some pizza, some beer. We can't do that this time of course, so we're going to invite you to come along to the space on Mozilla Hubs. Some details on that will follow after everything.
So, what I'm going to do now is hand over to the Heatherwick people. Do you want to just say something as an introduction, Pablo, first?
PABLO: Sure. I think we’ll jump on the presentation.
PAUL: Okay, I’ll jump out and see you all later.
PABLO: Okay. Well thank you Paul and Steph for having us here today. We've been part of this AR VR community for some time now and we always love to see what is happening in the rest of the industry, and obviously we're very happy to do something this time around.
So, we are from Heatherwick Studio, and we are a team of problem solvers and designers based in the heart of Kings Cross. However, during these times, I think we’re mostly working from different parts across the UK, from our own homes.
Today's presentation is going to focus on the studio, and specifically on our Unreal Engine workflow and how we use it for landscape design.
We are going to split the presentation into four main chapters. The first one will cover how we use these visualisations in the studio; then we're going to talk about landscape design and its relationship to how we visualise. Then we are going to jump into a case study of one of our projects, and we're briefly going to go through future developments.
So as Paul mentioned, my name is Pablo Zamorano. I am Head of the Geometry and Computational Design department in the studio. I work across all studio projects with a team of designers who are also passionate about engaging in complex design challenges and digging deeper into how things come together, from the earliest stages to the very latest ones. I also work with the great Silvia.
SILVIA: Hello, my name is Silvia. I am a designer and Immersive Media Specialist at Heatherwick Studio. I have a background in architecture and interaction design, and my focus is to develop and communicate the studio's design ideas using Unreal Engine.
PABLO: So, as I mentioned, we are based in London and we try to focus on projects across all different scales and types; we not only design buildings, but also objects and landscapes, as we will see in today's lecture.
As I mentioned before, we work across scales and typologies in every possible location. I think our main focus is to find projects that can potentially allow for a positive social impact wherever we are working. We have a special focus on materials and craftsmanship, and we are really focused on how things actually feel for people at human scale, at one-to-one scale. We like to design things that you can approach with your body, feel and understand as positive elements of the human scale.
These are three examples of recently finished buildings: one in Kings Cross called Coal Drops Yard, the middle one, A Thousand Trees in Shanghai, and the third one, The Vessel in New York.
We also have a focus on applied research, and particularly in my department, some of the things we cover try to find the relationship between not only the physical world but also the digital one. We try not to ever get too attached to any one tool. Rather, we try to always think very deeply about the ideas and how we can develop them further. So, we use any tool we have at hand, and if we don't have the tool at hand, we try to either look for it or find a way of building it. So, really, we try to widen our spectrum of interaction between the digital, devices, fabrication methods and craftsmanship.
We apply this through different tools we have put together over the years, which let us run simulations of the buildings but also quickly work through every single layer of any complexity and scale that we want to work at.
We are also partnering with different organisations, private and public, and some schools like the IAAC in Barcelona, where we've been focusing on advanced fabrication, in this case using wood as the material and working with simple elements, though trying to use them in a much more complex way, to bridge the gap between complex design and the fabrication materials we use.
So, you can see in this case robotic fabrication, where we tried to realise the designs that the students put together in this short workshop.
I really love how robots move, and the potential we can get from them is very interesting. We're also quite interested in the relationship between our bodies and the machines we work with, and in understanding what the limitations of a robot are and what the limitations of our bodies are. So, we try to bridge this gap with tools like Augmented Reality. For instance, in this case, we are using the HoloLens and mobile devices to put together assemblies that later on we can add to the bigger pieces manufactured by the robots. So, it's an overlaying of AR and the physical things, and here are some examples of the final pieces. A piece like this is, I would say, too complex for a robot to run through the whole process while avoiding clashes in some parts, and also too complex for a person to put together without any kind of traditional guidance. So, I think it's very interesting how these two worlds merging together allow us to explore geometry further.
This is another example of the use of Augmented Reality in the studio, using Fologram and the HoloLens in our workshop, where we're using it to wire-cut some foam blocks. You can see how the model in Rhino is moving, following how the physical object is moving. It's not unidirectional: information also goes from the real world back into the digital one.
Here is how we use it (Fologram), not only as a presentational tool, but rather to understand how digital mock-ups can interact with physical ones. So, we're testing a small object, a lift button, and then we have another digital option for the same object. If you look at it on the screen, it may look okay, or you can say whether you like the design or not; but when you look at it with the goggles, you assess it with your body, and it's obvious this object is far too big, and we can action these things in real time. So, the understanding of scale is key when we're using Augmented Reality.
We're also teaming up with some other people, in this case Thornton Tomasetti, the engineering team, looking into how we can customise some of these tools. We had this big ambition to have an AR tool where we can import not only models that we've created but also physical models we've made, or even sketches, and then turn them into something else: running any kind of modelling or analysis on them, such as sun exposure or wind, and then maybe editing the geometry from the mobile devices and placing it back.
Obviously for this, we did a quick sketch, actually over one day. So, we narrowed the scope to maybe three areas: having an import, being able to transform the geometry and place it back, and running some sun exposure analysis, with the key thing in common that the tool should work cross-platform. So, it's not an app that only works on Android or iOS devices; rather, it is in this case a web-based platform that you can access from either a mobile device or even your computer.
This is the result. One of the interesting points here is that you can place the geometry based on a tracker, so as you move the tracker, the geometry will follow. What you see on the upper right of the screen are the parameter controls we allow users to adjust: the scale, the rotation, but also the day of the year and the hour. So, you can see not only the object moving but also the environment reacting to it. Because it's basically following the tracker, if you want to place it in a different space, you can literally use your hands and move it along with you and it will follow. So, we're very excited to carry on this collaboration with Thornton Tomasetti.
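For readers curious about the kind of calculation that sits behind a day-of-year and hour sun slider like the one described above, the sketch below is a generic solar-position approximation in Python. It is not the Thornton Tomasetti tool itself, which isn't public; the function name, the use of Cooper's declination formula and the axis conventions are all our own assumptions.

```python
import math

def sun_direction(day_of_year, solar_hour, latitude_deg):
    """Approximate unit vector toward the sun as (east, north, up),
    using Cooper's declination formula and the standard altitude/azimuth
    relations. Good enough for a design-stage sun-exposure preview."""
    decl = math.radians(23.44) * math.sin(
        math.radians(360.0 / 365.0 * (284 + day_of_year)))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 15 deg per hour
    lat = math.radians(latitude_deg)
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    alt = math.asin(sin_alt)
    cos_az = (math.sin(decl) - math.sin(lat) * sin_alt) / (
        math.cos(lat) * math.cos(alt))
    az = math.acos(max(-1.0, min(1.0, cos_az)))  # azimuth measured from north
    if solar_hour > 12.0:
        az = 2.0 * math.pi - az  # afternoon sun sits west of the meridian
    return (math.sin(az) * math.cos(alt),
            math.cos(az) * math.cos(alt),
            sin_alt)

# Example: London (51.5 N), summer solstice (day ~172), 3 pm solar time
print(sun_direction(172, 15.0, 51.5))
```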
SILVIA: So, with that understood, I'm going to do a bit of a jump and we're going to talk about landscape and how it is integrated into this design vision. Basically, we can say that we have nature embedded in most of our projects, across different scales: large projects such as Toranomon-Azabudai in Tokyo on the left and 1000 Trees in Shanghai on the right; medium-sized buildings such as the residential tower on the left and Little Island in New York on the right; and small buildings as well, for example pavilions that we integrated with landscape, like the recently completed Maggie's Centre in Leeds.
So, how does Heatherwick Studio see visualisations? We think it's critical that a certain richness is applied to most of our visualisations. Traditionally, we used to have a series of stills combining V-Ray rendering with planting added in post-production, even some hand sketches, which limits our ability to present the design model in a 3D way. It means that landscape becomes something that goes on top of the image at the end and doesn't travel along with the design process of the geometry and architecture. This is very challenging when it comes to projects like Little Island, where the planting is almost as important as the architecture itself. It's very difficult to imagine how this would be without any planting; it looks very incomplete and empty.
So, for example, this is how landscape defines and gives contextual meaning to architecture and geometry. This is Al Fayah Park in Abu Dhabi, where if we look only at the architectural elements, it's very difficult to understand the scale and the sense of the space we are designing for. But if we lay all of the landscape over it, then we have the whole picture and the idea of what we want to see and what the main concept is. In this case, we used Twinmotion. This was five years ago, so it was our first 3D animation with planting and architecture all placed together. It was great because the client and the consultants, for the first time, really got to see a 360 of both of these things placed together in one animation. So, before moving forward to describe our visualisation workflow, we wanted to give an overview of how this sits in our wider design interoperability workflow. Basically, we develop all of our projects using different software and platforms at different stages. Rhino is the main design software for all of the stages, Revit is very important in the later stages, and Unreal is the main visualisation tool. In this diagram, we can see the highlights of each of the stages: for concept and schematic design we use Rhino and then visualise it in Unreal, and for the later stages we use Revit, and we can have a link from Revit to Unreal, which we will certainly use in those stages.
So, for landscape visualisation, we basically developed our own workflow. Because we are looking to produce more than just renders and videos, we are also researching 360 views, virtual reality and immersive media.
Within the studio, we evaluated the different ways of visualising landscape: the most traditional ones, Rhino with post-production in Photoshop or V-Ray for Rhino, compared against Rhino with Unreal.
So, in the first approach, we use Rhino to render out a simple view and then apply all the vegetation in post-production. This gives us a fully customised image of good quality at the end, but it can get very time consuming, especially if we want to produce more than one still. Making changes can get very tricky, as you enter a world of Photoshop layers that is very hard to handle, and we will never be able to create a real-time view using this method. When we use Rhino and V-Ray, we get amazing quality and very photorealistic results, but we realised that because the polygon count of the trees we want to render is massive, it takes tons… an astronomical amount of time to render more than a few views. And again, we do not get a real-time view alongside the project.
So, after comparing all of these methods, we realised that Unreal is the way to go. We can see the project from every possible angle. The landscape is already embedded in the real-time model. Everything is fully customisable, and most importantly, apart from customising the look and feel of the project, we can also customise all of the landscape assets that we want to implement and need.
So, this is the first project that we used Unreal on: 1000 Trees in Shanghai. As the name says, it's a lot of trees and a lot of landscape, so we wanted to be very accurate with it, and this is the first animation that we did. It took us around two months to get to grips with, which is quite a lot of time to build an Unreal model, but we learnt a lot from it, and now it takes us just a few days or even hours to develop an Unreal model. This was pretty exciting at the beginning as well: the client was seeing the project years in advance, with fully accurate planting and a 360 view, and you could see the whole scheme placed together on the contextual site.
So, it's quite nice to compare the reality on the left, which is almost complete, with the Unreal model we had years in advance. From this, as I was saying, we learnt a few things. The first was to create a Heatherwick Studio template, which starts with the materials we commonly use in our projects and with the Unreal atmosphere already adjusted. Going forward, we built a landscape asset library which, apart from containing different species of plants, more importantly contains different types of plants. So, this means we have conical trees, globe-shaped trees, quite a lot of types of trees. This can also be subdivided into categories depending on where the project is located geographically, meaning that if you start a project in California, you already have a folder full of the planting you will use in California, accurate to the surroundings.
The last one is that we improved our workflow with Datasmith: we use Rhino as the main design tool, and it's quite easy to update any geometry in Unreal.
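For anyone wanting to reproduce that update loop, a Datasmith import can also be scripted from the Unreal Editor's Python API. The snippet below is a minimal sketch based on the UE4-era Datasmith scripting interface; the file and content paths are hypothetical, and the same thing can of course be done through the Datasmith import button in the toolbar.

```python
import unreal

# Minimal sketch: re-import an updated Rhino export (.udatasmith) into the
# project's content folder. Paths are hypothetical placeholders.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    "D:/Projects/Landscape/masterplan.udatasmith")
if scene is not None:
    scene.import_scene("/Game/Datasmith/Masterplan")  # destination folder
    scene.destroy_scene()  # release the temporary scene handle when done
```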
So, as a conclusion: back in the day, we were trying to produce just one Unreal model, at developed design, which would then be referenced until construction. What we are trying to do now is to have different Unreal models from concept through to construction, evolving and changing all the time as we change the design. Now, for the landscape design process: basically, landscape design plays a key role in the visualisation of most architectural projects. Despite being so important, it's often overlooked and considered only as something that is placed in the later stages of the visualisation process.
So, basically, to sum up our landscape design scope, it can be divided into two main packages. Hard landscape is everything that is man-made structure: pavement, furniture, landscape structures. Soft landscape is everything that is a living component: trees, all planting, landform and water features. We can divide it into four layers, starting from the bottom: landform and water, then hardscape and furniture, then lower planting and tree planting. We can conclude that the true heart of landscape is the two top layers, the tree planting and the lower planting. So, we can see here the comparison: on the left are just the first two layers, and on the right we can see the third and fourth layers, which give landscape its very characteristic aspect. This means we should pay attention, get the whole picture and make it accurate, since this is the most seen and most important part of landscape.
So, this is normally what we deliver in our landscape package from Revit. You can see it is very dry; it is a very traditional method of communicating all of the design information. On the right, we have a technical drawing defining areas of planting, each of them colour coded with different hatches, and on the left, we schedule each planting mix, describing the specific species of plant. Apart from that, we combine this with reference images that illustrate what we are saying in the schedule. From there, there is a massive jump between the technical drawing, what the landscape architect asks for, and the deliverable images. Using Rhino, V-Ray and Photoshop, there is a lot of artistic interpretation between what we are showing and what was really specified, forcing a lot of manipulation in each image, and we lose a lot of fidelity and accuracy towards what we are describing.
So, to conclude: landscape design is perceived as very technical and often detached from the project. It's very hard to properly visualise landscape, as it is very time consuming, and normally the quality is not that good. We have a clear need to make planting design and development more inclusive and interactive, meaning we want to involve the design team and the clients more in landscape design decisions. For this we use Unreal, and we think it is the best way to visualise landscape. It's a dynamic tool that allows everybody to comment and see the outcome immediately. Planting is accurate, and you get a holistic view of landscape and architecture as an integral project.
So, we have this case study in Tokyo, Japan: a massive mixed-use development which includes residential, retail, office and education uses, and it has an extensive area of landscape. We are trying to create intimate, human-scale gardens as well as a large city-scale landscape. As you can see in the plan, it's full of landscape. What we tried to do was generate different narratives and characters within the landscape. So, at the entrances, we wanted to create gateway plazas with cherry blossoms to invite everyone in. In the middle, we wanted to create urban orchards so that people could interact with the landscape, as well as woodland grasses.
But we are just going to focus on the central garden. This is an enlarged plan of the planting scheme. Basically, we have different mixes of shrubs and grasses, and these are the descriptions of each mix. Each one has different species, described by colour, texture, height, whether they are evergreen and so on. For example, take shrub mix 01, which has warm colours, reds and oranges. We already know the heights of all of the planting, so what we want to do is recreate this bespoke in Unreal, as accurately as we want. By loading a lot of Unreal assets into our project, we can then edit the colours of any of the flowers. We can remove some leaves and begin to create seasonal changes if we want; as we're seeing at the top, we can take a cherry blossom tree and, just by changing the colours of the leaves, it becomes another type of tree. We can also combine two assets: if we have ivy and we combine it with some flowers, it can become a beautiful jasmine that we can hang anywhere we want. By doing this, we recreated all of the planting that the schedule was showing us. For the grasses, what we did was duplicate one of the simple shrubs that we had in Unreal and create a new static mesh, and the herbaceous ornamentals likewise are a combination of grasses and flowers.
So, once we have created all of the assets, what we need to do is create folders in Unreal containing all of these static meshes, all of the planting, and then we import just the surfaces, already colour coded to match the folders. This means that anybody can jump into Unreal without being a landscape expert and can already plant and make design decisions.
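On the Rhino side, that colour coding can be kept consistent with a small script rather than by hand. The sketch below is not the studio's actual setup; the layer names, colours and mix names are invented. It uses rhinoscriptsyntax to create one colour-coded layer per planting mix and move selected surfaces onto it before export.

```python
import rhinoscriptsyntax as rs

# Hypothetical planting mixes: layer name -> display colour (R, G, B)
PLANTING_MIXES = {
    "Planting_ShrubMix01": (200, 60, 30),   # warm reds and oranges
    "Planting_ShrubMix02": (60, 140, 60),   # greens
    "Planting_GrassMix01": (170, 190, 80),  # grasses
}

# Create any missing layers with their mix colour
for layer_name, colour in PLANTING_MIXES.items():
    if not rs.IsLayer(layer_name):
        rs.AddLayer(layer_name, colour)

# Move a user selection of surfaces onto the chosen mix layer
surfaces = rs.GetObjects("Select surfaces for ShrubMix01", rs.filter.surface)
if surfaces:
    for srf in surfaces:
        rs.ObjectLayer(srf, "Planting_ShrubMix01")
```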
So, this is explained here in a small video, where we have the colour-coded surfaces and the folders matching the surfaces. Then we choose the plantings: we drag and drop the folder into Unreal, we select which ones we want to use, and from there we set the density, which basically means how much planting you want to paint when you paint it in Unreal.
The density is a bit of trial and error, and then, with one click, you immediately have all of the landscape on the surface you chose to plant. Then you do exactly the same for the next one: you untick the ones you had, tick the new mix and paint it as well. That's how we began to create all of the lower-layer planting.
We do the same for the grasses. It's very difficult to get accuracy in grasses, but in Unreal we manage to do that. At the end, we just hide the surfaces that we imported, turn on the softscape, and we can wander inside the project and literally see if it looks good or not. It's super nice to walk through the design and decide whether to change some of the things you had already thought about; sometimes you just need to do that. You can walk around, have a real-time 360 view of every corner of the project and decide on all of the landscape that you need.
In summary: you have a blank model with the colour-coded areas, place the lower planting, then import the placeholder bubble trees and then swap them for real trees.
This means we have a fully coordinated model in Unreal, giving us the most accurate planting we can have. We have more than one final landscape version, so we encourage people to mix and test different versions to see which combination is best for the project. So, Unreal allows us to interrogate landscape design in a more holistic way, involve all of the teams and consultants in landscape design progress and decisions, and, most importantly, spread the knowledge of landscape design across architectural domains.
PABLO: Following on from what Silvia just showed us, we are looking into future developments, and one of the things we really care about is these workflows: how do we come up with the easiest ways of making the different software we use communicate?
So, in this diagram that Silvia showed before, there are the three main pieces of software we use for design, with Rhino as the key backbone of the process all the way through, from early on to the later stages. How we can connect these three together is something we're always asking ourselves and trying to improve.
So, about a year ago, we teamed up with Mindesk to answer this very question: how can we make the connection between the software as smooth as possible? We knew Mindesk already had some of this potential built in; they were using the Unreal Engine to turn the Rhino environment into a virtual reality one, and we thought, well, if these two pieces of software are already talking to each other, there must be a way to progress that into a link where we can have both environments at the same time.
Here you can see some of the results of this. This is now live, a tool which is part of what Mindesk offers as part of their software. If you modify the geometry in Rhino on one side, Unreal automatically shows you what changes are happening. This means there is a direct pipeline between the two worlds: any geometry that you move is going to move across, and any geometry that you import is going to get imported across live. If you modify it, it gets modified; if you hide it, it hides on the other side as well. So, it makes this process of turning those lollipop trees into realistic Unreal assets much easier. Here you can see how things move live.
Another thing that is very interesting is that most of the people who engage with Rhino in the studio don't necessarily understand how Unreal works. So, here there is a view link where you can actually control the Unreal viewport directly from Rhino. If you don't know how to navigate the Unreal world, you can still set your views from the Rhino viewport and Unreal will mimic that.
Very recently, the tool has been evolving and you can now automatically modify the assets that are going to be assigned. Unfortunately, we don't have a video showing this because it came out quite recently and we couldn't record it. But imagine these lollipop trees in Rhino automatically getting converted into more realistic versions based on our landscape template. Of course, this is not only for traditional Rhino geometry, but can also be driven by Grasshopper geometry. This obviously has huge potential for us, because you can quickly test different options, but also animate things in Grasshopper very quickly and see how things move or change in the Unreal world.
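Mindesk's own API isn't shown here, but the Grasshopper side of such an animation needs nothing special: any slider-driven definition will do, and a live link simply mirrors its output in Unreal. As a generic illustration only, a GhPython component for a slider-driven transform might look like this, assuming inputs named geo and angle and an output a wired up on the component:

```python
# Generic GhPython component: rotate the input geometry about the world
# Z axis by a slider-driven angle (degrees). With a live Rhino-Unreal link
# running, scrubbing the slider animates the geometry on the Unreal side.
import math
import Rhino.Geometry as rg

xform = rg.Transform.Rotation(math.radians(angle),
                              rg.Vector3d.ZAxis,
                              rg.Point3d.Origin)
out = geo.Duplicate()   # leave the input geometry untouched
out.Transform(xform)
a = out                 # component output
```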
Also, as Silvia mentioned, we engaged with Twinmotion something like five years ago. Twinmotion is the very reason we started investigating real-time rendering in the studio, and we are also excited about the new developments of this software: we are currently testing it and super excited to possibly hear more about what is coming in the next lecture by David. We really like the ease of use of Twinmotion, and we love the fact that we can very quickly control the season and make it change. Now, one thing that is quite key for us is to be able to modify the planting assets and customise them with our own planting, based on our own landscape design. This is something we're hoping Twinmotion could introduce in the future, so David, please take note of this.
I think with this, we can finish our presentation. So, thank you so much, and we will be answering your questions. Thanks.
Paul: Okay, great, thank you so much Pablo and Silvia. Okay, I have some questions for you. I’ll start with some workflow questions I think, first. We’ll just start with this question from James. Have you started to use Rhino Inside Revit? Has it been useful for you or have you found it not developed enough for you yet?
PABLO: I guess this is for me. The answer is yes: we've been using Rhino Inside for quite some time at the studio already. We already have internal templates that focus on the different customised workflows we always use. It's been an interesting journey, because Rhino Inside is obviously under development, which means one build may change from the previous one, and any attempt to standardise a process may need reviewing with the next Rhino Inside release. But we're still very excited and actively using it, on very large projects actually, that are now gearing up for construction. So, it is a very powerful tool, and it's not only for Revit. I'm also hoping to hear more from someone who is using it a bit more with Unreal.
The other very good thing about Rhino Inside is that it allows you to hack it. So, we do some coding: if the tool is not doing something for you, you can actually build a tool that will do it.
Paul: Okay, I’m just going to go down these questions in any order now. Are you looking at procedural generation of plants? I guess a question for Silvia?
SILVIA: We have tried it, literally creating the trees from scratch, but we have developed a better and faster workflow using the assets that Unreal already gives us. So, it is literally a combination of all the libraries that are out there that people are using, made our own.
Paul: Something else here from Kevin. At the point of workflow development that Silvia was talking about… I’m just reading this out, so when you say Unreal, are you building this library and templates for Unreal directly, or a library for Twinmotion?
SILVIA: It's for Unreal. We already have the foliage in Unreal, and it's just for Unreal.
Paul: Thank you. Question from Lynne. Can the assets library be shared across teams and projects so everyone has the same and updated assets all the time?
SILVIA: Correct. We start with a template, then we narrow down where the project is located, and then we provide the template with the library of plants we think will be used there already included. If needed, we add more plants to that template, depending on the project.
Paul: Okay, thank you Silvia. Right, do you use VR with clients and/or collaborate within Unreal? Question from Martin Johnson.
SILVIA: Not at the moment. We keep the geometry and the Unreal model ourselves. We don't share the Unreal files, but we do share animations from them.
PABLO: So, to the point of VR with clients, I think the answer is yes, sometimes. On some of the projects we've been working on, the clients actually turned out to be quite sophisticated, and they have asked directly for either walkthroughs or a 360 animation where they can use their own VR domes to walk through the project, and similar things. So, yes.
Paul: Another workflow question. Maybe David can answer this, or maybe you’d know as well Pablo. Does Unreal have a Revit plug-in and how does the Revit, Rhino, Unreal workflow work? Just before you answer that, it’s probably a good time to actually just mention that we do have a sponsor for this meeting, and that’s the developers of BEAM, which is a solution for interoperability between Rhino and Revit. Anyway, I know Pablo has used BEAM and may be an advocate for that solution as part of a toolset, but anyway, did you get that question Pablo?
PABLO: Yes. So, we haven't connected Unreal and Revit directly yet; that's the quick answer, but there are ways of doing it. I know there is some development using Rhino Inside to do this, and also I know that Mindesk… Gabriella may be somewhere in the audience tonight, I hope. So, hopefully after this, in Mozilla Hubs, you can try to find me and ask directly, but I can tell you that some very good news about this may be coming your way from Mindesk.
Paul: Very good.
DAVID: I’ll elaborate in the next presentation.
Paul: Quite a number of questions here. I don't think we're going to get through them all, but let's see. Right, so Pablo, have you tried Unity? How would you compare Unreal versus Unity, in terms of the AR/VR interface, the software interface, or within your workflow? That's a great one.
PABLO: We have tried many different platforms, and I think the answer to why we selected Unreal is the quality we can achieve with it. I know some others, like Unity for instance, are closer to the larger hacker community: people who want to dig deeper into these rabbit holes in software. But to be honest, what we really care about is the actual quality we can get from the software at the end of the day. For us, the best results so far have come from Unreal, and this is why we engaged with it. But yes, we always try any possible solution that is out there, in terms of ease of use and quality.
Paul: Great, thank you. Silvia, are you using Speedtree to create the vegetation or are you using a library of some sort?
SILVIA: So, as I said before, we are using just the Unreal libraries from Epic Games.
Paul: Fine, okay. Could you talk a little bit about how you onboard clients into using and viewing the work in Unreal, and generally in AR/VR, and how receptive they've been to this? It's a question from Pam Harris.
SILVIA: So, basically, it's a win/win situation. They love it. They are very engaged with it; they are always asking for Unreal. It's the best way for them to see and understand all of the architectural terms and everything we are talking about, and when we engage them with Unreal they get the whole picture and it's very clear; the next steps that we need to develop are super clear. Sometimes it's very tricky, because you can see everything, so we have a lot of things to finish, but I think it's very useful. I don't know if you can say anything else about it, Pablo?
PABLO: No, I think that says it all. We use it for every single presentation with the client.
Paul: Very good. I think one more question from Libney. Is it possible to upload the Unreal scene to the Cloud, so the client can check in a web browser?
PABLO: I think that's a question for David, but there are many ways of actually doing this. I know you can do this in Unreal and export to many different platforms, not only web-based but also augmented reality models and so on.
Paul: Okay, so maybe that's something we can come back to. Okay, I think maybe… I'm just going to ask you one last question, and then we're going to go to David's presentation. But if there is time, if we've missed some questions and you really want anything, if there is something really pressing, please do let us know that it's an urgent question and we'll make sure that we get to it at the end. I'm trying not to leave people out, but it's a bit of a challenge. Okay, let's see. What are you using for version control, mentioning Perforce, SVN or something else? That question doesn't mean much to me. Do you understand the question?
SILVIA: No, I didn’t, sorry.
PABLO: So, how are we dealing with different versions of Unreal?
Paul: Yes, I guess so. They are talking about asset version control.
DAVID: It's about multiple users engaging with the same Unreal scene. So, are you using it across the office, with multiple users on the same scene, or are you using it with individual users on individual scenes?
PABLO: We are using it with individual users on individual scenes.
SILVIA: Correct; but again, we have also begun to use levels, so everyone can jump into a level and change anything they need to change in Unreal, while somebody keeps the core Unreal model.
Paul: This is the last question before we go to David. Do you do all the lighting in Unreal or do you add additional surfaces to be used as light sources in Rhino for instance?
SILVIA: We do all the lighting in Unreal, yes.
Paul: Fantastic. Thank you Pablo, thank you Silvia. We’ll see you again at the end. We’re going to hand over now to David. Thank you very much.
Okay so you’ll be made presenter now David, and I’ll say see you later.
DAVID: Okay, you should be able to see my lovely background wallpaper. First of all, a huge thank you to the guys at Heatherwick Studio. They did a great job of showing you what I want to share with you as well, which is some of the great use cases coming out of the architecture, engineering and construction industry. So, I'll quickly start my presentation, but before we get started, I just wanted to share a quick introduction for those of you who maybe aren't aware. I work within the architecture and engineering side of Epic Games, which basically means that I focus on speaking to architecture, engineering and construction firms about their use of real-time rendering tools in their workflows and the ways these can innovate work processes and outcomes. What we do is go around and talk to a lot of people about the many uses of Unreal and the different uses of Twinmotion, and the big thing we usually do in these presentations is share the use cases behind it, and Heatherwick is just one of those great use cases, in landscape design and how their workflow has come together.
I want to share another couple of examples with you today, but first of all, just in case there is anyone on the call who is a little bit unsure of the Unreal Engine or Twinmotion, I just want to spend a couple of minutes just quickly running through that for everyone’s benefit.
First of all, what are we? Well, we're Epic Games, and we have this great platform called the Unreal Engine. This is what is used to create a number of the games that you may recognise: that big one up there, Fortnite, which I joined a week ago, and I now get my ass kicked by nine-year-olds on a daily basis. But it's also used as the background game engine for Infinity Play and Gears of War, and we also license it out to both the games sector and the non-games sector. So, other game studios are using Unreal Engine as their tool. The non-games side is where I sit, along with a number of other great guys: film and media, broadcasting, automotive and manufacturing. If anyone is a fan of Star Wars, The Mandalorian was filmed using Unreal technology, which is exciting.
But our big thing, which I think again Heatherwick did a great job of showing, and which is why we're in the AEC space, is this ability to bridge ideas and reality together. Within the architecture, engineering and construction community, our output isn't the same as games or film: we create reality, real buildings, and we construct those from ideas. So, we see these real-time rendering tools as the fastest possible way to share ideas, engage stakeholders and see the outputs of those ideas, some of which we're going to see a little bit later on.
But again, if you're unfamiliar, we have two lovely products: we have Twinmotion, and we have Unreal. They have both got their different use cases, and I just want to define exactly what they are so that you can understand, when we talk about things moving forward, where the use cases for each of them sit. The way I usually describe Twinmotion is with this idea of architectural visualisation in a few clicks. Essentially, Twinmotion is the Unreal Engine with a wrapper around it, customised and set up for ease of use and a very quick, simple learning curve, so that you can create great visuals within a few clicks; as I said, not one click, but a few clicks. It's the comparable tool to what we see people doing with Enscape and Lumion. It's there for the everyman: every architect and engineer can have this and work alongside their proprietary tools. Here is a quick video demoing it. What we really have in here is what Heatherwick alluded to: a number of key things, including how it speaks to Rhino and the way you can use the assets once they are in the Engine. We have what we like to call Smart Assets: trees that interact with the environment and people, animations and cars, and things like x-ray materials for engineers to be able to see their designs. The other great thing about Twinmotion is its asset library of about 2,500 assets.
What's coming very shortly, because Heatherwick were asking about roadmaps, is that these assets are about to be released on the Unreal Engine Marketplace. So, these assets aren't fixed to Twinmotion; they can then be used in the Unreal Engine as well. We are really excited about that.
The Unreal Engine, then, is very different; how we describe it is very different. We see it as an advanced real-time 3D creation platform. It's the place where you take your visualisations and advance them to the next level. There are different use cases we see people applying this to, but really it's about engagement: virtual reality, augmented reality, creating these outputs, these UI configurators that give you an extra element of control. So, it's not just for visualisations; it's for going beyond that. We see it being used across the industry in a variety of different ways, and I'm excited to share a number of those, which you're seeing on the screen in front of you today, and tell you a bit more about how people are using the Engine.
But in terms of both these pieces of software, I guess the things I really want to cover, which may pertain to you guys a lot because I saw a lot of the questions focusing on this, are why people use the engine and what it has to offer, both Twinmotion and Unreal. I usually sum it up in these four things, but today I'm only going to talk about one of them in more detail. We have this great ecosystem: we acquired Quixel, the amazing library of high-quality materials and assets, which now syncs seamlessly into Unreal and Twinmotion.
Data aggregation is working with your proprietary software tools like Rhino and Grasshopper, and we have plug-ins that are free, fresh out of the box and optimised for these platforms, ready to use from the second you open the platform. The same goes for collaboration: from day one of opening up Unreal, there's a template which allows you to create an experience that you can share with other people, and have multiple people exploring the same space at the same time, even across the world.
Then the last one is the assets you have: once you have these visual assets in 3D, we want them to be as open and flexible as possible, to do what you want with them. You could have it as a render today and as an animation tomorrow; but if in a week's time you want to turn it into a virtual reality experience, a web-based application or a desktop game, those options are all available. So, you can customise the experience around your use.
But the big thing I want to cover, because I just heard a lot of it in the Q&A, is this idea of data aggregation and how the Engine works with these external tools, with a special focus today on the Rhino and Grasshopper side. As I said earlier on, we have this built-in tool in Unreal that we like to call Datasmith, and Datasmith's job is to convert these external assets into Unreal assets in a very non-destructive way. It turns Rhino assets into Unreal assets; it turns Revit assets into Unreal assets. It's meant to be a very quick process that you can also optimise with automation, with a thing inside Datasmith called Visual Dataprep, where you essentially pre-customise a script which runs every time you import your model and does all the prep work ahead of time. If you have a material that you know you have a nicer version of in Unreal, you can get it to automatically replace it. With Heatherwick, they obviously had trees that they wanted to replace: you can get it to take all those trees from the Rhino model and replace them with lovely trees from the Unreal asset library. That's what Datasmith is there to do, and it's constantly advancing to be more and more real time.
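As a flavour of what such an automated swap looks like when scripted directly (a Visual Dataprep graph would normally do this without code), here is a minimal Editor Python sketch; the actor-label convention and the asset path are hypothetical, not from either presentation.

```python
import unreal

# Load the high-quality replacement mesh (hypothetical asset path)
tree_mesh = unreal.EditorAssetLibrary.load_asset(
    "/Game/Landscape/Trees/SM_CherryTree")

# Walk the level and swap the mesh on every placeholder exported from Rhino,
# assuming the placeholders follow a "Tree_" labelling convention
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label().startswith("Tree_"):
        component = actor.get_component_by_class(unreal.StaticMeshComponent)
        if component:
            component.set_static_mesh(tree_mesh)
```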
And it's not just us. We are huge supporters of making sure that our platforms speak broadly across the AEC. I think there is a common understanding, and everyone on the call will be aligned with it, that we have a big interoperability problem, or opportunity, within the AEC, in the ways that we work across a variety of different tools. We need them to speak to each other better, and some of the projects and tools we're seeing emerge from people within the AEC to answer that call are really exciting: tools like Speckle and the BHoM by Buro Happold, the work being done on Rhino Inside (from McNeel), and also Mindesk.
I just want to quickly touch on these for those of you who are unaware. First of all, big news from, I think, last week: we now have an official Rhino exporter into Datasmith. Before, it had always just been a built-in route through FBX, but now it's been optimised for the tool, so we're really excited.
But Speckle and the BHoM are the two big ones in which we see a lot of potential and development. If you don't know what Speckle Works is: Speckle is this amazing open-source data platform looking to answer the problem of these multi-programme workflows with a cloud-based system that shares data and geometry across platforms. You can have Revit assets that you then see in Rhino in real time, and whenever you make a change in Rhino, you can see it in Revit. It's this interconnected system that both Speckle and the BHoM are looking to address, and integration with Unreal has been explored with HOK and Mobius Node. We are supporters of people exploring this space of interoperability: we gave them a MegaGrant and they're working on developing this tool to create a more integrated system through Speckle, making Unreal part of that equation.
Similarly with the great work that is going on with Rhino Inside. Rhino Inside is in general a C# tool, which lends itself more naturally to Unity. But for those of you who are using Unreal, there is an external wrapper that has now been developed called USharp, which provides a C# interface into Unreal, allowing you to use Rhino Inside within the Unreal Engine.
So, there are different tiers of expertise you need for each of these, but it's really interesting to see how people are rising up and addressing this interoperability pipeline workflow.
The last one, which again got a lot of focus in the previous presentation, is Mindesk. It has to be the simplest of them all, creating a real-time, synced, bi-directional workflow between Rhino, Grasshopper and Unreal, with the added bonus of a great interface that allows you to model and work in a virtual environment and see that automatically changing in Rhino and Grasshopper.
So, in general, we’re really excited about the way the industry is approaching this problem of data aggregation and bringing… not just for the Unreal Engine, but for the entire industry, and we’re happy to be a part of that.
And the last thing, just before I jump on to showing some cool stuff, is this idea that we have this great tool called Twinmotion and this great tool called the Unreal Engine, and what we're really excited about, we just released the beta of it, is that you're going to be able to export models out of Twinmotion and import them into the Unreal Engine. This basically means you can create a very quick, beautiful architectural visualisation in Twinmotion, very quickly and simply, throw in some lights, some assets, and then export that entire thing into the Unreal Engine to add the next layer: beyond visualisations into something else. This is for the digital specialists, the visualisation experts or UI creators, so that they can add that layer on top. So, we see this as a streamlined workflow that can be used across the process, from architects to engineers, all the way through to the technology specialists, and we're really excited about its development.
So, I want to spend a little bit of time talking about the many different ways we're seeing people use the Unreal Engine. We see a lot of people using it for architectural visualisation and VR, but we also see it being used across a broad range of areas throughout the building information life cycle, from concept design all the way through to building and operations, and that's what I want to share with you today: people using it across a variety of different areas, from digital twins to training to visual communication and virtual collaboration, and some of the ways they're doing that. Again, it's the great data aggregation tools out there that allow Unreal to really come out and shine, and they're always part of these processes. So, I'll share a number of them with you today.
What's really funny about the way the tool has developed is that we find ourselves speaking less and less about the visual fidelity of Twinmotion and Unreal, mostly because we feel it's a given. The visual quality you can get out of these tools (this is Twinmotion) just speaks for itself. So, we spend less and less time talking about the visual quality and fidelity you can get out of Twinmotion, or alternatively out of the Unreal Engine, and find ourselves focusing more on the UI and the use cases that push beyond that. That said, for anyone who is really interested in the arch-viz side, I just wanted to throw these two examples out there to interest you.
We recently worked with The Mill on creating videos, released at Unreal Fest, about the different use cases and different industries using the Unreal Engine, and what's special about this collaboration with The Mill are the effects you're seeing: the construction of the buildings, the ripple effect, the Inception/Dr Strange-esque style of creation. We actually have a webinar which shows you how we created it. Similarly, this example here uses real-time ray tracing. We have that asset for free on the Unreal Marketplace; you can go in and see exactly how it was set up, the assets that were used, the light settings, to help you understand how these visual qualities were developed and achieved. Like I say, we rarely find ourselves focusing on these any more.
We find ourselves more focused on these amazingly fun areas of immersive design. People are using it in the early stages, or in the more immersive sense, like AF Consult, who are using VR as a way of designing with the client. So, this is a GIS map that has been brought in, and they're using it as a way to draw potential transportation, road and rail links through that 3D geometry, but in VR, in a multi-user session, so other people are able to be in there at the same time to share and understand it. All the way through to the work I was part of, for CallisonRTKL, where, since we have people in virtual spaces all the time, they've built a tracking tool that records where people go in these virtual spaces, what they look at and which objects they interact with.
So, they can better define what areas are important, where people are not going, what people find interesting to look at, and then plan and design around that. Is there an area people aren't interested in? Is there something we can do in that area to grab attention and focus? That's really a great use case of what is coming out of the engine. All the way through to, although we saw Speckle and the BHoM, a bunch of other data integration pipelines that focus more on parametric building design. This is a project by Cornell University that works with CityEngine, bringing in the parametric control that CityEngine has and exposing those functions within the Unreal Engine.
So, lots of really cool use cases on the immersive design side. But I would say the biggest and most beneficial use we see, which again Heatherwick touched on, is this ability to communicate ideas, information and vision, be it just in an immersive space or by creating custom asset tools. This is a building asset management tool created by Cityscape for leasing managers to speak to future tenants, so they can explore the building footprint and see it at one-to-one virtual scale. But they also linked that Unreal model to a financial model, so that they can draw out new floor plates and it will update, in real time, what the cost of that space might be. This kind of integration with real-world data, giving context to that information, is really where we see a powerful use of the tool, and then obviously exploring that new space in the Engine. All the way through to the work Arup are doing: they have this amazing driving simulation game where, because it was all created in Unreal from this 3D asset, they built a customisable driving simulator tool, a fly-through and a real-time walkthrough, which they actually hooked up to an entire driving simulation game.
So, it's just great to see the different ways people are communicating and engaging with people around these ideas, all the way through to the work being done with Accucities. They have a digital twin of London and of a variety of different cities throughout the UK, and the way their model links in with city data to help future city planners see their future buildings and city data in context is really exciting. Their newest city-planning tool looks at integrating your actual models, your geometry and your information. It might be a simple massing model that you can then run a bunch of simulation tools on within the application, like the visibility sightlines it's running here, showing where your building can be seen from; or, if you want to see it more clearly, you can import your own building design and run similar simulations on that within the application.
So, the communication side is a huge area, but I think the most important and most relevant to today’s climate is this idea of virtual collaboration, and this actually addresses one of the questions that Paul brought up at the end of the last session, which we’ll get to in just a second: now that we’re not able to be together, being able to virtually meet in spaces and still communicate and explore ideas. The Unreal Engine has the collaboration template, which allows multiple users to be in a space like this. Theia, the company that you see in front of you, built on top of it. They were like, this is great, we’ll add more functionality to it, allowing deeper control and integration of your ideas, so you can share them in a virtual way, sketch, annotate and communicate, both in VR and on desktop, which I think is really exciting. There’s a beta currently online called BigRoom, if you’re interested, and I can provide links to all of that afterwards.
But I guess the big thing that is really important and really cool in this day and age is that there’s no longer the need to have big, powerful gaming computers in order to view the content. The example I want to share is by a company called PureWeb, who use a new feature of the Engine called Pixel Streaming, or their own version of it, where you load your packaged Unreal executable onto a server and then share it as a web-based application. So, this is running in Google Chrome, where all the functionality of the experience, the sales configurator, the quality and the fidelity, is all running in a browser, through a web link that you can just share. You can do this fresh out of the box using Pixel Streaming; there are lovely tutorials online about it. You can then look to host it on your own servers for much larger audiences using AWS instances, or, as it gets a little more technical, that’s why things like PureWeb exist: you give them the visual content you want and they’ll handle the back end, the servers, the hosting and the graphics, so that you can go around and share this content. So, lots of different avenues to explore.
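As a rough sketch of what an out-of-the-box Pixel Streaming launch can look like, assuming a packaged UE 4.2x Windows build and the stock signalling web server shipped with the engine’s Pixel Streaming samples (all paths here are placeholders):

```python
# Minimal sketch of a local UE 4.2x Pixel Streaming setup, driven from Python.
import subprocess

# 1. Start the signalling web server (Cirrus) that brokers WebRTC sessions.
subprocess.Popen(
    ["node", "cirrus.js"],
    cwd=r"C:\UE_4.25\Samples\PixelStreaming\WebServers\SignallingWebServer",
)

# 2. Launch the packaged Unreal app with the Pixel Streaming flags.
subprocess.Popen([
    r"C:\Builds\MyProject\Windows\MyProject.exe",
    "-AudioMixer",                    # route audio through the streamable mixer
    "-PixelStreamingIP=localhost",    # address of the signalling server
    "-PixelStreamingPort=8888",       # port the app uses to reach it
])

# Viewers then open the signalling server's page in a browser (port 80 by
# default) and interact with the stream like a local application.
```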
The last one is actually fresh off the press; it happened about a week ago: people hosting virtual events and talks. This is again a screen grab, to show this was in a web browser, of the World Digital Built Environment 2020 event. It was created in Unreal, and it let users explore a virtual space created in the Engine and watch talks, with virtual production techniques like green screen built into the platform, so you can view it in the browser, navigate and walk around, and experience presentations on a much more one-to-one basis. We’re seeing a lot more of this emerge, which is really exciting.
Beyond that, other people are building these virtual collaboration tools too. So, we have a free one; Esri have created a free one for the Engine as well; Theia have worked on a much bigger one; and I think SpaceForm by Squint/Opera is again a much higher-level, professional one. So, there are lots of different options to explore and use in this space, whether you want to create it yourself or use one of these great presets in front of us.
So, that’s virtual collaboration. A lot of our conversations seem to be focused on that at the moment, for obvious reasons. And although this next one may not pertain much to the work that you guys do, I always keep it in because I love the use cases and the different ways that people are using the Engine. This we’ll commonly see being used in real estate and building operator use cases, like Aedas Homes who, like the tools we just saw, mixed virtual production with the Unreal Engine so that they’re able to bring in their 3D assets of a building; but the woman you’re seeing on the screen isn’t a recording, it’s a live feed to their studio, and you can ask her questions. Does this lovely couch come in a mahogany green? Yes it does. Then you can change it. So, it’s bridging this gap: when we can’t be together, what’s the best way to engage with people and talk about big ideas? All the way through to the work that Line Creative have recently done. We’ve all seen AR before, but this has to be one of the highest-quality examples I’ve seen, just in the level of quality they’ve managed to retain from the Engine in these AR applications. The AR tools within the Engine are constantly getting better, which basically means you can have these AR models out in the middle of a city and not need a marker like a sheet of paper in order to load the model up, which is super exciting.
Then you have Imerza. This is an awesome one, mainly because it’s a 3D printed model: they use 12 laser projectors to project imagery from the Unreal Engine onto it. So, they’re able to communicate with large audiences and large groups of people around a physical masterplan model. It’s probably the best masterplan presentation I’ve ever seen, and it links in live with the Engine. It’s real time, so the view you’re seeing is from the Engine; you can change it and move it around and it updates on the model itself as well, which is what we really like to see.
For those of you who are looking ahead to future trends, we’re having more and more conversations around this idea of digital twins and smart cities, which is really exciting. We’re so glad to be a part of it, because we see a lot of digital twin solutions going around where people are bringing in geometry and trying to represent sensor information so it’s understandable, and this is where I feel game engine technology comes into its own. The ability to put sensor information into context, in a controllable UI format, is really how you get understanding of that data, and how it becomes information.
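To make that data-to-information point concrete, here is a toy sketch, with invented thresholds, of the kind of mapping a digital twin UI performs: turning a raw sensor reading into a legible colour that can be applied to the twin’s geometry.

```python
# Illustrative only: normalise a sensor reading onto a green-to-red ramp.
# The 18-28 range is a made-up comfort band for a temperature sensor.
def reading_to_rgb(value, lo=18.0, hi=28.0):
    """Map a reading onto a green (ok) -> red (hot) colour ramp."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))  # clamp to [0, 1]
    return (int(255 * t), int(255 * (1.0 - t)), 0)    # (r, g, b)

print(reading_to_rgb(24.5))  # e.g. tint the relevant floor in the 3D twin
```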
This is a company called 51World who have built an entire business around working with clients and developers on bringing their physical assets into a digital UI. So, they can control them, visualise live sensors in the Engine, see how they’re performing and interrogate it, from a very small scale, like a floor-by-floor view of a building, all the way up to much larger sizes and scales. They’ve done it for bigger buildings, all the way through to, I think we released a video online a couple of weeks ago, an entire city. It’s massive. So, yes, they’re building digital twins for cities, and it’s really exciting to see what they’re going to do next.
I realise we’re running out of time, so I’ll jump to our last one, which is scale and size: the open-world ability of the Engine. Talking about data aggregation from Rhino and Grasshopper, there’s the ability to work with those tools alongside things like LIDAR scanning technologies, because we can now import LIDAR scans. This one is by a company called Virtual Wonders, and the model you see in front of you, first of all, is real time, and you can walk around it in a VR headset; but those are points you’re seeing. That’s not meshes, or points converted to meshes. The point density is so tight that it actually appears as a solid object. So, as scanning technology improves, what real-time render engines can handle is increasing too, and a lot of people are pushing the boundaries on this. All the way up to what Build Media have done, which is build the entire city of Wellington in New Zealand in Unreal. And it’s not just Wellington, it’s the entire country they’ve brought in, using a very streamlined process of GIS data, photo scans and BIM information in one refined workflow. This isn’t a video animation; this isn’t special effects and clever cutting. We’ve actually kept in the views from the Engine, which show it running at a 50 frames per second rate as you go through the model. And what’s great about one platform with many assets: they’re now looking at working with Wellington City Council and integrating IoT information to turn it into a digital twin. So, the future of where your models can go is amazing.
That’s pretty much everything I want to talk about, and I realise I’m coming up to my half-hour slot. The last thing is really the importance of this slide, which we spoke about at the beginning. If you’re not on the Unreal side of it and you’re working in design tools like Rhino and Grasshopper, Twinmotion is a great start for you. The Unreal Engine is a great tool to adopt too, but there’s a slightly steeper learning curve, and that’s where you rely on your digital specialist team if you have one, or on being very technically savvy yourself. It’s great to jump in, because the potential of what you can do is limitless. But the whole idea is that you can work with one and translate into the other, and that’s really where real-time rendering technology needs to go.
And the good news, the great news, is that Unreal Engine is completely free. It’s open for anyone to use; I love this fact. Unless you’re creating a game that you’re going to be releasing on the PS4 or Xbox, generally we don’t want to hear from you as far as licensing goes. If you’re releasing a PS4 title about the newest building then maybe we should have a conversation, but in general, use it, develop it for what you need, and we don’t ask for anything in return. It’s there to use. The full source is available, and you can add to it and build on it through the Marketplace, and that’s what I love about it.
On the Twinmotion side, it’s not free; it’s a paid product. It’s free to try, but I guess the good news, which was brought to my attention today actually, is that Twinmotion is free to Rhino users. We have a new tie-in which we’re doing for a limited time, so any Rhino 5 or 6 owner can get a perpetual license. It’s not a subscription; we don’t do the subscription thing with Twinmotion. You have the software and it’s yours forever, essentially, absolutely free. So, I’ll make sure the link is in the chat at the end of this. All the information is there on how to get it, and with that version you’ll get all the updates up until the end of 2021, which again is great news for all. So, that’s everything I want to talk about.
The last thing, which I spoke about very quickly earlier on, is this idea of MegaGrants. We’re very big on supporting innovation in the industry as a whole. You saw earlier on, we’re supporting HOK with Speckle, and a number of the people you may have seen in the presentations are also recipients. It’s a pot of $100 million in grants that we’ve allocated to developers, to essentially create amazing things using these tools. So, if you have a great idea, if you’ve got something that you really want to experiment with and try, we encourage you to create a proposal and send it in to our Epic MegaGrants team, and you could be lucky and receive a grant that allows you to go into those developments around integration and interoperability, like Speckle. We’re really excited about exploring that. We’ve seen some great stuff come out of it, and we encourage everyone to have a think about how you can push the bounds on solving these large-scale problems.
Lastly, I can’t actually answer anything on this, because it’s currently heavily under development, but I’m sure everyone saw Unreal Engine 5, which is coming out in 2021, and this is the future of where real-time engines are going. If you haven’t seen any of the footage, go and have a look. This is what we’re working towards, and Unreal Engine 4 is a great place to start, because the integration and switch-over will be seamless; everything will work and translate across, but with all the great benefits.
That’s me. I think I’m bang on my half hour. I’ve done this a lot, so I think I’ve timed it pretty well. So, I’ll stop my screen share, go back to Paul and open up for some questions.
Paul: Great, thank you David. It would be excellent if Silvia and Pablo could join us again as well, because there are some questions that have come in for them too. Brilliant, thank you Silvia, thanks David. Excellent, loved the announcement about free licenses of Twinmotion for users of Rhino 5 and Rhino 6.
DAVID: It’s awesome.
Paul: Let me put the link in the chat so everyone can have a look at it and explore it, and yes we can jump to questions.
I’m going to ask this question here because it was marked as urgent, and I think it’s a question for you, Silvia. How do you quickly replace the bubble trees from Rhino with the Unreal Engine tree asset? Would it replace all objects at once, or is it a more manual process?
SILVIA: So, as Pablo mentioned, we can use Mindesk for that: you give the tree the name of the Unreal asset and assign it in Mindesk, and it will literally change as we speak; it happens the moment you connect everything in Mindesk. But if you are not using Mindesk, you can just have little dots and then paint foliage over the pattern of dots to create the trees.
Paul: Thank you. I wanted to mention, as I said right at the start, that this event is being recorded, so we will be posting a link to everybody where you can watch this again. A question on coding experience: how much, and what type of, coding experience is useful for people who are interested in Unreal, and is any experience like that required for Twinmotion?
PABLO: I would say none. There is no coding required to jump into either of these tools. I think that’s the beauty of them. That doesn’t mean that you can’t go further if you want to, but…
DAVID: The thing I like about it is that you guys who use Grasshopper should be familiar with it: it’s a very visual, node-based scripting UI. You can of course go into the coding, and this is where your expertise comes in: you can do a lot more with C++, or use the USharp wrapper, which lets you work in C# instead. You can get into the coding, but by no means do you have to. In fact, I’d usually say that if you have to, then we at Epic Games need to be working harder to provide that solution for you without coding. So, that’s our stance. It’s Blueprint and visual scripting, which I think people here should be familiar with.
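For a flavour of what light scripting beyond Blueprints can look like, here is a minimal sketch using the editor’s built-in Python API (available with the editor scripting plugins enabled); the asset path is a placeholder, not a real project asset.

```python
# Sketch: load a static mesh asset and spawn it in the level from the
# Unreal Editor's Python console. "/Game/Trees/SM_Tree_01" is hypothetical.
import unreal

mesh = unreal.EditorAssetLibrary.load_asset("/Game/Trees/SM_Tree_01")
actor = unreal.EditorLevelLibrary.spawn_actor_from_object(
    mesh, unreal.Vector(0.0, 0.0, 0.0))  # place at the world origin
unreal.log("Spawned {}".format(actor.get_name()))
```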
Paul: Many will be. Okay, was there a mention of a timeframe for a Twinmotion to UE bridge?
DAVID: So, we’re saying early 2021 at the moment, I think. We’re working on the beta with a number of people, so we’re exploring it, but there’s no official announcement date yet; it’s just in the works. Pablo and Silvia, I can’t remember if you were part of that, but it’s in beta at the moment. No official announcement date, apologies.
Paul: Okay, there’s an interest in MegaGrants. Are there any requirements? Do you look for particular skill sets? Do you look for particular experience? Leonardo is very interested in your MegaGrants.
DAVID: Generally, there are some requirements that we look at. We care about a number of things. First of all, we care about innovation; that’s one of the big things. We want the idea to be innovative, and because we’re very open, we also strongly prefer proposals that are altruistic: we love them to benefit an industry or a wider group of people, by being made free or open source. As for skills, we only ask that you either have the technical skills to explore the idea or are willing to work with parties that do. We’ve seen MegaGrants that include and work with companies and developers who use Unreal Engine, and we’ve seen them from Unreal developers alone, so we’ve seen it across the board in a variety of ways. I would say the weight really comes down to the innovative and altruistic side of the proposal; the expertise matters too, but it’s a good question.
Paul: Thank you.
DAVID: I look forward to seeing his proposal.
Paul: Yes, I think one might be on its way. Is the TM / UE bridge available to the public?
DAVID: Not a public beta yet, no. We’re running it with a number of people who are obviously heavy Twinmotion and Unreal users. At some point that will change, but again, it’s early days at the moment. So, trust me, it will be announced and shared on all our channels when we have something. At Epic Games, for those of you who are more familiar with the Engine, we’re really open about including people in our previews and early versions of things. You can download the newest Unreal 4.26 preview, which is not the official release but a beta for people to start exploring. So, we’re really open about sharing these things, and once it becomes available, you guys will be among the first to know.
Paul: Excellent. Is it possible to say a little bit about hardware? Pablo, perhaps your experience of what hardware is required for good performance with Unreal? I guess that’s workstations rather than headsets?
PABLO: Silvia?
SILVIA: So, basically, we are using Alienware or Dell Precision machines, but we can run it on our studio computers as well. It will be a bit slower, but you can open it anywhere, basically.
PABLO: Maybe it’s beneficial for the audience, Silvia, if you can mention the size of a file, so the…
DAVID: I guess that’s the important thing. I mean, first, it depends on what you do with it. The instances where you see Build Media bringing in that level of fidelity across an entire city will obviously require a bit more than smaller projects. Generally, what Silvia said is a good all-round answer: you’d want a reasonably good graphics card, but Alienware and MSI laptops can run it quite happily and functionally, all the way up to professional-grade workstations. The one caveat I would add is that if you’re working on masterplanning projects and want to bring in 20,000 assets, higher-grade computers will obviously save you time on optimisation; but there are also a number of great optimisation tools within the Engine that make working with large files and assets easier. LODs, culling, all of these things will let you explore and work on the models without needing high-end graphics cards. But it varies, use case by use case.
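As an illustration of the kind of profiling and optimisation toggles David alludes to, here is a hedged sketch issued through the editor’s Python API; the console commands themselves (stat fps, stat unit, r.ForceLOD) are standard Unreal console commands.

```python
# Sketch: fire off a few standard profiling/optimisation console commands
# from the editor's Python console to see where a heavy scene is struggling.
import unreal

for cmd in [
    "stat fps",      # on-screen frame-rate counter
    "stat unit",     # game/draw/GPU thread timings: shows the bottleneck
    "r.ForceLOD 2",  # force a coarser level of detail on every mesh
]:
    unreal.SystemLibrary.execute_console_command(None, cmd)
```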
Paul: Okay, thank you. Is there still a place for tools like 3D Studio Max? It seems your visualisation workflow has changed a lot, so how do tools like that still fit into the workflows that we’ve seen tonight?
DAVID: I can jump in on this one. In general, as Unreal Engine, we rely on everyone’s toolsets. If a company is using 3D Studio Max, then great, we want to be able to support it. The thing with these packages is that they’re probably more tailored towards other industries or other specialities. As the profession matures, the computational side is becoming more dominant, as a much smarter and faster way of working, so we’re seeing more people using those tools now than 3D Studio Max. However, speak to our virtual production guys in the movie industry and they’ll say the complete opposite; that’s where 3D Max and Maya are maybe a little more dominant. So, we don’t see it ever going away, and we will continue to build adaptions for it. But in AEC, and I think Silvia and Pablo will agree about this computational designer role, to put it in context: in gaming we talk about next-gen consoles all the time, the next generation is coming. We very much see computational design tools like Rhino and Grasshopper as the next gen of architecture. So, these things are coming to fruition more and more.
PABLO: I agree.
Paul: Thank you. Is it necessary to have RTX cards? I guess that’s specific to NVIDIA boards… go on, David.
DAVID: It’s not, and it is. If you want to start to push the boundaries on real-time ray tracing, then yes, you are definitely going to need RTX hardware. It always comes down to what you’re looking to do. What is the output? The higher the graphical and visual quality, and the larger the model and assets, the more likely you are to need it. There are comparison charts on the Unreal website where you can see what works and which features need particular hardware, but I would say there is a use for it, and don’t rule it out if you want to do real-time ray tracing, which is awesome.
Paul: Is there a possibility to live-link Grasshopper and Unreal Engine without Mindesk? I think… Pablo, is this possible? I guess it’s not going to be as nice and slick a solution as using Mindesk?
PABLO: David mentioned a couple. So, BHoM is one way of having this direct link; we actually used it. We had a mini project going on with Buro Happold, trying to test and develop it with them, so that’s one way of doing it. You mentioned Speckle as well; we haven’t tested the connection with Speckle, but it seems like it does it as well. And there’s also Rhino.Inside.
DAVID: What I was trying to get at with these different solutions is that they require different levels of expertise and knowledge. Speckle and BHoM are great, but there’s a lot of understanding you need to build up around those tools and how you use them, whereas what you saw done with Mindesk is a very simple, quick, efficient, out-of-the-box way of doing it. And Rhino.Inside requires knowledge of Visual Studio and USharp. So, there are lots of different solutions for getting this real-time, bi-directional link between Grasshopper and Unreal, but which one is best really depends on your knowledge and your situation. I think they’re all great; it all just depends on the use case you’ve got and the experience and time you have to invest in them.
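For readers curious what the fully DIY end of that spectrum can look like, here is a purely illustrative GhPython sketch that streams a Grasshopper mesh as JSON over UDP to a hypothetical listener (which could be a small Unreal-side plugin or relay). The port, message shape and receiver are invented for the example; none of this is a published Mindesk, Speckle, BHoM or Rhino.Inside API.

```python
# GhPython sketch: push a mesh to a listener as JSON over UDP.
# Assumes a GhPython component with a mesh wired to its 'x' input.
import socket, json

def mesh_payload(mesh):
    # Flatten vertices and quad/tri faces into plain lists for transport.
    return {
        "verts": [[v.X, v.Y, v.Z] for v in mesh.Vertices],
        "faces": [[f.A, f.B, f.C, f.D] for f in mesh.Faces],
    }

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# 127.0.0.1:7777 is an arbitrary choice; the receiver must parse the same shape.
sock.sendto(json.dumps(mesh_payload(x)).encode("utf-8"), ("127.0.0.1", 7777))
```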
Paul: Questions are flying in at the moment. How is the lighting typically handled in the scenes today? Is it baked, or real time?
DAVID: How do you guys do it in Heatherwick?
SILVIA: If we are showing it in real time, we don’t bake it, but if we are doing a proper animation, yes.
DAVID: We’ve seen a lot of new tools coming out in this area. I think there’s a new GPU Lightmass rendering tool being released with 4.26, which allows a more real-time approach to baked lighting quality. But it always comes down to optimisation of the Unreal asset. If you’re trying to make your Unreal files more streamlined and hit those high frame rates, dynamic lighting is very tough to do, because it requires more graphical power. Baking your lighting in, although it’s fixed, gives a higher level of fidelity. Anyone who has seen the Unreal Engine 5 trailers knows that’s what Unreal Engine 5 looks to blow out of the water, by allowing high-quality global illumination control in real time rather than having to bake it in. So, that will soon be a thing of the past, but for now there’s room for both.
Paul: Question for Pablo. How is Heatherwick accounting for occlusion culling in AR scenes? And the question goes on: ARCore uses a depth API and ARKit is using LIDAR. Are you implementing these new APIs in your process?
PABLO: So, rather than going through the details of each part of that question, I think it’s better if I describe how we are using AR at the moment. We are using Fologram to connect Rhino with the HoloLens and with mobile devices, and for that we are controlling absolutely everything through Grasshopper. But we are also starting to use ARKit. Since the beginning of this year, we have been testing different platforms, and ARKit is one of the ones that we like the most, which is something the person asking the question also mentioned. And the bespoke tool that I presented, the web-based one, is again the product of a one-day hackathon, so it’s really at an early stage; it’s in sketch mode, and we will basically be addressing all these questions as we go ahead with the project.
Paul: And a linked question for David, is Unreal working on blueprints to account for these new occlusion culling technologies?
DAVID: Probably not as Blueprint things. Again, I think this is something that will come with the Unreal Engine 5 capabilities. I’m not aware of a direct focus on it, but when it does come, it will be a feature of the Engine rather than a particular Blueprint option. I would hang fire on that one.
Paul: Okay. Those questions were from Chance. Now, there was something else from Gabrielle for you David. Is there an expected timeframe for when Unreal 5 beta will be available for partners and developers?
DAVID: Again, unfortunately not. It’s always a fun conversation to have with people; everyone wants to get their hands on it first. Even within 2021, there’s not an exact timeframe we’ve set for when the beta comes out or when the early release lands. The only thing I can tell you is that whenever we know, you’ll know, because we make all these things very public. The reason it’s not public at the moment is because there’s nothing publicly out there on this yet. So, hang fire and keep an eye on the news is what I’d say.
Paul: Question from Reece, maybe for Silvia: can you mix and match assets coming in from Rhino with assets from Maya and 3D Studio Max?
SILVIA: Yes, it’s like… if you are importing geometry from different packages, that’s the question, right?
Paul: Yes, what assets, so I guess geometry?
SILVIA: Yes, obviously. It depends how you’re importing them, but you can import from every package and have them all together in Unreal. That’s the beauty of it.
Paul: Another workflow question. Is optimisation for the mesh distance fields that Unreal Engine uses handled in Rhino, in Datasmith, or is there manual control for that?
DAVID: I’m not quite sure what that part refers to.
Paul: Reece, if you want to word the question in a different way, I’m happy to ask again.
PABLO: Is it asking where the optimisations happen?
Paul: I’ll go on to something else and come back. We have a poll for everybody, and I wanted to mention a couple of things. Thank you to all of the presenters: thank you Pablo, thank you David, thank you Silvia, fantastic presentations, and thanks for joining us. We will be back, and we have something scheduled; although we don’t have a date yet, Grimshaw have agreed to present at the next version of this meeting, which will hopefully happen in January, maybe sooner, but you’ll be the first to know. Please either sign up for our newsletter or follow us on social media at Simply Rhino and you’ll hear about all of this. As you know, we do Rhino training and we supply Rhino software; just to mention a couple of classes coming up: Rhino Level 1, Grasshopper classes, and more advanced Rhino classes, all of which are being delivered live and online at the moment.
So, shall we go to Silvia, and Silvia’s little explanation of how Mozilla Hubs is going to work for us?
SILVIA: How do we send the links?
STEPH: I put the links in the handouts. So, there is a PDF in the handouts that has the links to the rooms in it. Hopefully people will be able to see that.
SILVIA: Right, so if you guys want to have a little talk with us, and between yourselves as well, just click on the links provided. We have four different rooms; the first one is The Vessel and the others are magic places that you can click on to enter each room. Just press enter the room, choose an avatar (I’m going to be Santa Claus, if you want to find me), enter the screen, allow your microphone, and then you can navigate inside each room with the same controls as in Unreal: A for left, D for right, W for forward and S for backward. You can pan with your left mouse click and you can jump to places with your right mouse click. If you are close to an avatar, you will hear their voice more loudly, and if you are far away, you will not be able to hear anything.
PABLO: Can you also copy and paste the links into the chat box, because I don’t think we have the handouts, actually.
STEPH: I can do that.
PABLO: Are we all going to be in the same room or are we splitting?
Paul: I’m going to go to the Spanish Port, because that sounds like Barcelona and I think I know who will be there.
SILVIA: I am there. So, I can see a lot of faces now. This is quite fun.
Paul: So, we’re going to close down this window, say goodbye from here and we’ll see you at Spanish Port, or the Vessel, or these other places very soon.