A new video diary in the "Around the Verse" series for Star Citizen.
Category: Video games

Transcript
00:16Hello and welcome to another episode of Around the Verse,
00:19our weekly look at Star Citizen's ongoing development.
00:22I'm Sandy Gardiner.
00:23And I'm Chris Roberts.
00:25On today's show, we take a look at the systems
00:27that render holograms and comms in real time.
00:31Yeah, it's pretty cool, so I can't wait to show you guys.
00:34But first, as many of you know,
00:35the team is very focused on completing our 3.0 update
00:38for the Persistent Universe.
00:39So 3.0 is a giant leap forward from what's currently available in-game,
00:43and thanks to the dev team's hard work,
00:45the majority of 3.0's new features are almost complete,
00:48and we've shifted into the final phase of the production process
00:51that focuses on feature and content integration,
00:54optimization, and bug fixing.
00:56Now, we're also expecting many new players,
00:59or people who have been busy playing something else,
01:02to come back in and log in and play 3.0.
01:04So we wanted to make sure the user experience is really good,
01:07so we've decided to spend more time polishing and optimizing
01:10than we have in recent releases.
01:13In addition, we're also aiming to introduce our new Delta patcher,
01:16so you will only need to download the files that have changed
01:18for each subsequent patch, which means no more 30-gigabyte downloads.
01:22But of course, this will require some fine-tuning
01:25and a lot of testing to make sure it works as intended.
01:28Now, we know that 3.0 is a big release,
01:31and you're all eager to play,
01:32and we're excited for you to play too,
01:35and we can't wait to get it done.
01:37But we want to make sure that it's ready.
01:38So if you've read the list of caveats we gave when we first started sharing
01:43our internal unpadded schedules,
01:44our very first point was quality would always trump schedule,
01:47and the second and third points about task estimates being unpredictable
01:52due to the nature of developing something that hasn't been done before,
01:55and the difficulty of estimating bug fixing and polish time
01:58are also important to remember as we go forward with our schedules on finishing 3.0.
02:03So that's why we've seen the constant changes to production schedule
02:06over the past few weeks.
02:08As new issues or advancements cross our paths,
02:11we've worked hard to communicate those to you
02:13no matter how good or bad the news may be.
02:16By its very nature, game development can be an exhilarating
02:19and frustrating and unpredictable process.
02:22So if our 3.0 schedule wasn't that,
02:24then you wouldn't be getting the true development experience.
02:27You wouldn't.
02:28For our new backers who may not know,
02:30with each of our major releases,
02:32we've done different things to help you track our progress.
02:35For the 0.8 patch that launched Arena Commander
02:38and the 2.0 patch which introduced the PU,
02:41we had our weekly development updates
02:43that listed current blockers and resolved bugs.
02:45And for the march to 3.0,
02:47we've been tracking the major tasks
02:49we're doing with our weekly production schedule reports.
02:52So now that we've reached this later stage in the process,
02:54we're planning to adjust the format of ATV
02:56to highlight exactly what we're working on to get 3.0 out the door.
02:59Now, as all our studios are working hard to get 3.0 out,
03:01we have decided to suspend the studio update portion of the show
03:04so as to not distract developers
03:06with providing footage of their work for the studio updates.
03:10And instead, starting next week,
03:11we'll be launching a new segment called Burndown.
03:14With this segment, you'll be able to be a fly on the wall
03:17for some of our production meetings
03:18and hear directly from the developers and QA testers
03:21about the week's biggest bugs, blockers,
03:24and challenges that we've been battling.
03:26It will be another great way for you to follow 3.0's progress.
03:29The weekly production report on the website
03:31will also be adjusting its focus to match.
03:35Alongside the new Burndown segment,
03:37ATV will bring you a weekly deep dive into a feature
03:40we're working on for the game.
03:41That way, you'll still be getting the same great detail
03:44about what we have planned,
03:46alongside the most current information
03:48on exactly where we are on the path to releasing 3.0.
03:51Yep, and then once 3.0 is out,
03:53we'll resume the normal ATV cadence
03:55with the weekly studio reports
03:57and all that lovely eye candy that you guys like to see every week.
04:00Now, let's shift gears to focus on two systems
04:03we've recently got working together,
04:05the secondary viewport and render-to-texture system.
04:08When combined, these systems can do a wide variety of things,
04:12from dynamically creating comm calls from other locations
04:14to rendering holograms in real time.
04:17Yeah, and I'm pretty excited about the potential of this technology
04:20as it's going to allow us to do some really cool things
04:21in Squadron 42 and Star Citizen,
04:25of which you'll maybe see a little hint in a moment.
04:28Let's take a look.
04:36We've been working with the graphics engineering team in the UK
04:39to develop and make use of their new secondary viewport tech,
04:43which in itself makes use of the new render-to-texture system.
04:45It allows us to do some really cool things
04:48for our in-universe narrative in both Squadron 42 and the PU.
04:51So far, we have used it for comms calls
04:53and holographic volume rendering,
04:55and we have been syncing very closely with the engineers
04:57that write all the new rendering code to make this happen,
04:59and we're slowly homing in on a final feature set.
05:02The first of these is secondary viewports,
05:04which allows us to get a second view onto the world
05:06or many different views onto the world.
05:08This is built on top of some new tech we have called the render-to-texture system.
05:13Prior to the render-to-texture system,
05:14if we wanted to render some user interfaces or screens or visors,
05:18we would have to render them directly into the game world,
05:21and this happened after all of the rest of the scene had been rendered.
05:25What that meant is that the UI would always sit on top of the game world.
05:28It would never truly fit in,
05:29and therefore, it would never correctly be obscured by things like glass or fog
05:34or bloom in the same way as everything else in the scene.
05:37And this has always bothered our UI artists.
05:38So the new system, the idea is we render all of this content into textures first,
05:44and then we use those textures in the actual main rendering pass of the scene
05:48and composite them in with whatever effects we need,
05:51like whether we need them to look holographic or like they're on glass
05:54or whatever it might be, and it lets them bed themselves into the game world much better
05:59and have much better lighting and sorting with the rest of the scene.
06:02We also get a few other benefits from this.
06:04We get better anti-aliasing, better sorting,
06:07we get better performance, actually,
06:09with the fact that we can reuse the same screen
06:12on many different displays in the game world just by rendering it once,
06:16and we can even use the same screen on the next frame of the game
06:19to be able to avoid rendering cost if, for example,
06:21you've got a screen which doesn't need to animate or doesn't animate very quickly.
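The reuse described above, rendering a screen once, serving it to many displays, and skipping re-renders when the content hasn't changed since last frame, can be sketched as a small cache. A minimal Python illustration, not CIG's actual code; all names are made up:

```python
class RenderTextureCache:
    """Reuse one rendered screen texture across displays and frames."""

    def __init__(self):
        self.cache = {}    # screen id -> (content version, texture)
        self.renders = 0   # how many actual renders were performed

    def get(self, screen_id, content_version, render_fn):
        """Return the texture for a screen, re-rendering only on change."""
        entry = self.cache.get(screen_id)
        if entry is not None and entry[0] == content_version:
            return entry[1]           # reuse the previously rendered texture
        texture = render_fn()         # render only when content changed
        self.renders += 1
        self.cache[screen_id] = (content_version, texture)
        return texture
```

Twelve displays showing the same screen then cost one render, and a static screen costs nothing on subsequent frames.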
06:26These new pieces of tech we've been using in many different systems,
06:28so we've got all of our UI screens and our visors,
06:31all of our holograms and video comms calls,
06:34and there'll be several other uses we're hoping to fit in further down the line,
06:37things like mirrors, which are typically really difficult to achieve in games.
06:40The render-to-texture system starts at the engine level when we're gathering all the objects.
06:45Really, at this point, all we really need to know is that the objects are going to be streamed,
06:50so the streaming system needs to be informed,
06:52and we also need the max and min screen space size.
06:55We use the max and min screen space size along with the UV texel density
06:58to be able to calculate how much screen resolution is required for that texture.
07:03The minimum screen space size is required because that texture may be used on multiple different objects.
07:09As it's used on multiple different objects, we then need to get the largest size
07:13and use mip mapping down to the smallest size.
07:16When you have a screen within a screen, we need to know the ordering of the RTTs,
07:20so as one RTT is rendered before another, it can then be used as a texture within the second one.
07:27We also need to handle an RTT within an RTT within the main pass:
07:32if the first RTT is half the res of its parent RTT, and that RTT is half the res of the main pass,
07:39the first RTT must be a quarter of the res of the main rendering size.
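The nested resolution rule above is just a product of relative scales down the RTT chain. A minimal Python sketch, purely illustrative (not engine code):

```python
def nested_rtt_resolution(main_res, scales):
    """Resolve the absolute resolution of a nested render-to-texture target.

    Each entry in `scales` is an RTT's resolution relative to its parent
    (e.g. 0.5 for half-res). Nested scales multiply: an RTT at half the res
    of a parent that is itself half the res of the main pass ends up at a
    quarter of the main rendering resolution.
    """
    w, h = main_res
    for s in scales:
        w, h = int(w * s), int(h * s)
    return (w, h)
```

So a half-res RTT inside a half-res RTT of a 1024x1024 main pass resolves to 256x256 per dimension, i.e. a quarter of the main resolution.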
07:42The rendering system has a fixed memory budget.
07:46To do this, we allocate one large texture ahead of time.
07:49This texture is called the texture pool, or in this case the render texture pool.
07:53It's very similar to a standard shadow pool system.
07:56We recently rewrote our shadow pool packing system to be a power of 2 quadtree allocator.
08:03We use the same power of 2 quadtree allocator for the render texture system.
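A power-of-2 quadtree allocator of the kind described can be sketched as follows: each node covers a square region of the pool texture and either holds one allocation or splits into four equal quadrants. This is a simplified Python illustration with made-up names (no freeing or repacking), not the engine's actual allocator:

```python
class QuadTreeAllocator:
    """Minimal power-of-two quadtree allocator over one square texture pool."""

    def __init__(self, size, x=0, y=0):
        self.size, self.x, self.y = size, x, y
        self.children = None   # four sub-quadrants once this node is split
        self.used = False      # True once this node holds an allocation

    def allocate(self, req):
        """Return (x, y) of a req x req region, or None if nothing fits.

        `req` is assumed to already be rounded up to a power of two.
        """
        if self.used or req > self.size:
            return None
        if self.children is None:
            if req == self.size:
                self.used = True          # exact fit: claim this node
                return (self.x, self.y)
            half = self.size // 2         # too big: split into quadrants
            self.children = [
                QuadTreeAllocator(half, self.x, self.y),
                QuadTreeAllocator(half, self.x + half, self.y),
                QuadTreeAllocator(half, self.x, self.y + half),
                QuadTreeAllocator(half, self.x + half, self.y + half),
            ]
        for child in self.children:       # recurse into the quadrants
            spot = child.allocate(req)
            if spot is not None:
                return spot
        return None
```

Each allocation lands in the smallest free node that fits it, which is what keeps the fixed-size pool tightly packed.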
08:06We render the textures at power-of-two sizes, i.e. 128, 256, 512, 1K.
08:11We use the smallest size that the texture can fit in,
08:17so if you need a render texture object at 800x800, we'd use a 1024x1024.
08:24And as you move closer and further away from the object, it will require a lower or higher resolution,
08:29and we progressively move up and down.
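The 800x800 to 1024x1024 example amounts to rounding each requested dimension up to the next power of two. An illustrative Python sketch (names are made up):

```python
def next_pow2(n):
    """Round a requested texture dimension up to the next power of two."""
    p = 1
    while p < n:
        p *= 2
    return p

def pool_slot_size(width, height):
    """Pick the smallest power-of-two pool slot that fits the request."""
    return (next_pow2(width), next_pow2(height))
```

As the object's on-screen size changes, the same rounding is simply re-run against the new required resolution, stepping the slot up or down through the power-of-two sizes.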
08:32One of the benefits of render texture is we can reuse those textures for multiple objects.
08:36So if you have a scene with many different billboards in them, let's say 12 billboards,
08:40we would render that texture for the billboard once and then reuse that texture over 12 different billboards.
08:47The original system, the UI system for instance, wouldn't do that.
08:50It would render the UI or that billboard 12 times.
08:53Because the output is now a texture, and not, you know, Flash just rendered into the world,
09:01it means we can render any curved screen, anything like that.
09:04The only downside was that we had to implement a new system to manage the mouse pointer interactivity we already had.
09:12We had to make some modifications to bring in a mouse pointer system which takes screen space size
09:18and remaps that into object UV coordinates and then we can then pass that object UV coordinates to the UI
09:25system.
09:25The UI system is able to work out where on the object the mouse pointer is,
09:32and from that, where on the Flash UI it is.
09:33And therefore you can then start selecting things.
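The remapping step described above, from a hit position on the screen object in UV space to pixel coordinates the UI understands, might look like the following. This is a hypothetical sketch; the real system works on the engine's own types, and the V-flip is an assumption about texture orientation:

```python
def screen_hit_to_ui_coords(hit_uv, ui_res):
    """Map an object-space UV hit (0..1 on each axis) to UI pixel coords.

    `hit_uv` would come from a ray cast against the screen mesh;
    `ui_res` is the UI render target resolution. V is flipped because
    texture V is assumed to grow downward while UV grows upward.
    """
    u, v = hit_uv
    w, h = ui_res
    return (u * w, (1.0 - v) * h)
```

The UI system can then treat the result like an ordinary cursor position and run its normal hit-testing and selection logic.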
09:35With the character animation, it normally goes through the camera system to decide whether it needs to be animated,
09:41whether any of the facial animations need to be run.
09:43And as we were running this through the render texture, it wasn't in view of the main camera.
09:47What would happen is it just wouldn't render when it was in the render texture system.
09:51We resolved this by having the usual camera culling system and the facial animation system, stuff like that,
09:57communicating with the render texture manager.
09:59That render texture manager will allow it to go through all the different cameras and work out exactly how big
10:05it is on screen
10:06or how big it is inside a render texture.
10:08And it will allow it to decide on the level of detail of the facial animations, the level of detail
10:13in the character animations.
10:14If you do pre-rendered comms, you can't really acknowledge characters changing costumes or ships or locations.
10:20So real-time rendering for us makes a big difference for immersion.
10:23Comms calls reflect what's going on in the universe and for the persistent universe,
10:26it opens up customized player avatars calling each other, all rendered live.
10:30Other possibilities are, for example, CCTV or other room-view style puzzles, or live recordings of views to be featured
10:37somewhere else in the universe.
10:39There are some remarkable consequences of these advances in the tech.
10:42Our capital ships feature big holographic volumes on bridges or in briefing rooms
10:46and that means the player can walk around them freely.
10:48For those, we wanted to not just render a second viewport using 2D display screens but actual 3D holograms
10:54and you can view them from all angles.
11:36Secondary viewport camera is updated dynamically to match the relative viewing angle
11:41from the main player's camera to the projection volume
11:45and then as you move around the secondary viewport camera moves
11:49and therefore you can essentially move around the holographic projection.
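The camera mirroring described above amounts to carrying the player's offset from the projection volume over into the source scene. A rough Python sketch with made-up names, purely to illustrate the idea:

```python
def hologram_camera_pos(player_pos, volume_center, source_center, scale=1.0):
    """Place the secondary viewport camera in the source scene.

    The offset from the projection volume to the player is transferred
    (optionally scaled) onto the source scene's center, so walking around
    the hologram walks the capture camera around the real objects.
    """
    offset = tuple(p - c for p, c in zip(player_pos, volume_center))
    return tuple(s + o * scale for s, o in zip(source_center, offset))
```

Each frame this position is recomputed from the player's current location, which is what makes the hologram read as a solid object you can circle.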
11:52Using the existing rendering pipeline means that we can render essentially any object into the holographic projection
11:59but there's no need for material duplicates or any duplicating of material setup
12:04it just basically works with these existing shaders.
12:08So as well as the existing shaders we also have developed dedicated shaders for various things
12:13for example like abstract user interface objects
12:17or if in a mission briefing you wanted to go to a waypoint
12:21the waypoint could be displayed as holographic
12:23and it will be using one of the dedicated shaders that we developed
12:27and the cool thing about this is that we can automatically fade the objects that are in the source volume
12:33as they get closer to the boundary of the volume
12:36we can automatically fade them out so it doesn't clip as it goes through the boundary
12:41we also exposed two new artistic features where we basically allow the objects to dissolve and tint
12:48independent of the material setup.
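The automatic fade near the volume boundary can be modeled as an opacity ramp over a band just inside the boundary. An illustrative sketch; the band width and the linear falloff are assumptions, not the shipped shader:

```python
def boundary_fade(distance_to_boundary, fade_band):
    """Opacity for an object approaching the source volume's boundary.

    Objects deeper inside than `fade_band` stay fully opaque; within the
    band, opacity falls off linearly to zero at the boundary, so nothing
    visibly clips as it crosses out of the volume.
    """
    if fade_band <= 0.0:
        return 1.0
    t = distance_to_boundary / fade_band
    return max(0.0, min(1.0, t))
```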
12:50Being able to light these 2D and HoloRTT presences in real time at a source location
12:54and then seeing the results in the corresponding 2D display screen or in the 3D holo target area
13:00immediately felt exciting to me.
13:02With this new tech we can have a character calling another ship or location
13:05and the call appears on either a 2D display screen or inside a 3D volume
13:09with the calling partner essentially being telepresent, then.
13:12What's really cool about the holocomms or telepresence is we can arbitrarily scale the source volume up or down
13:18and easily create larger-than-life representations of characters without having to resort to cheats like scaling up the scene.
13:25And this makes it possible to have something like a grand admiral appearing as a looming figure inside the Bengal carrier's holo globe
13:31versus him just being a small life-sized presence.
13:34We also added the ability to tint or dissolve any object in our scenes at will
13:38which helps with staging something like mission briefings, where waypoints need to flash green or enemy presence is marked in red.
13:44So this tech progressed really really fast and we got some really great results but there's more we want to
13:48do with it.
13:49Next thing for us is to optimize it further.
13:51We really want to make sure that there is no performance impact when you have these secondary renders or these
13:56holograms and such in the scene.
13:58We're doing a bunch of exciting things to try and combat the performance issues. For example, if you're on a
14:02video call to someone
14:03and you can only see a slight part of the background behind them.
14:06We'll use what we call the environment probe which is normally used for reflections of the scene.
14:11We're going to render that directly behind the player to avoid having to render the entire background scene.
14:15And in most situations you won't be able to actually tell the difference.
14:17So that's one of the examples of the optimizations we're going to be making but there'll be many more to
14:22make sure that we can really use this tech in as many situations as possible.
14:26So you get to see the fun gameplay that will result from it.
14:28So now that we have the basics of our holotech in place we want to stabilize it more and spend
14:32more time finalizing the look of these holograms.
14:35That means all the good post-effects goodies you can think of, like interlacing lines.
14:38I want to have flickering when there's poor signal quality or when the holographic display is damaged.
14:43Yeah, we can't wait to show you more of this when it comes to life later in the PU and
14:48of course in our Squadron 42 narrative.
14:50Thanks for watching.
14:52Pretty awesome, eh?
14:54And for the eagle-eyed amongst you, you will have spotted the first appearance of Ben Mendelsohn's character in Squadron 42,
14:59and Liam "Onion Knight" Cunningham, in the work-in-progress holo briefing test scene that we've been doing.
15:06Our graphics team have really created something I haven't seen in any other engine, and it allows us to actually do
15:10proper holographic telepresence.
15:13None of it's pre-rendered or faked. It's all live and the possibilities for longer term gameplay are pretty exciting.
15:19And that's all for today's episode. As always, thanks to all of our subscribers for making it possible for us
15:25to produce all of our video content.
15:27We've just announced that the August ship of the month is the Khartu-al.
15:32So that means subscribers can test out this Xi'an ship all month long.
15:36Just log into the game to take it out for a spin.
15:39Yeah, and thanks to all our backers who have supported the game over the years.
15:42Opening up the development process to you all has been both challenging and extremely rewarding.
15:47And I would say the emphasis would be mostly on rewarding.
15:50So I can't thank you enough for making it all possible.
15:52Finally, if you want to know what all of our offices did over the past month,
15:56then check out the July monthly report which goes live tomorrow.
15:59So until next week, we'll see you...
16:02Around the Verse!
16:03...