Cloud Imperium has published a new episode of the Star Citizen documentary series Around the Verse. The 39-minute video showcases the improvements the development team has made to the game, including ship interaction, graphical effects, and character animations.
Transcript
00:15Hello and welcome to Around the Verse, our weekly look at the development of Star Citizen.
00:20I'm Eric Kieron Davis.
00:22And I'm Kirk Tomei.
00:23Tomorrow the April monthly report will be shared with the community.
00:26As you probably know by now, the new monthly report is basically a collection of each studio's updates over the
00:31past month.
00:32Yeah, and we really look forward to sharing all of our progress with everyone and starting the cycle all over
00:37again here in Los Angeles next week.
00:39But first, let's go to Wilmslow and see what they've been up to.
00:44Hi, and welcome back again to the UK studios for our latest updates on our progress over the last four
00:48weeks.
00:49Everyone is focused and busily working through all the tasks and bugs for Squadron 42 and of course the anticipated
00:553.0 update for Star Citizen, which we're very excited to get out to you all as soon as possible.
01:00So let's kick off with the gameplay feature sprints which we've been working on.
01:03The player interaction system has moved along quickly over the last few weeks.
01:06Further improvement of the personal inner thought system will allow you to select functionality which is not directly tied to
01:12a particular object.
01:13Examples of this would be selecting an emote or exiting your seat, although there will, of course, still be
01:18quicker ways of doing this, you know, through default actions for experienced players.
01:22Next up is the Air Traffic Controller Sprint, which deals with managing the flow of traffic to a location for
01:27both takeoffs and landing.
01:29In particular, it is responsible for assigning out and reserving a landing pad when a player wants to land as
01:34well as freeing up that landing pad once they've landed and cleared the area.
01:38Similarly, it will deal with reserving a landing pad and spawning a ship when the player wants to take off.
01:42The initial stages of the implementation are now underway and we're working on the underlying structure of how the system
01:47works.
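The reservation flow described above (assign and reserve a pad on a landing request, free it once the ship has landed and cleared the area, and reserve a pad plus spawn a ship on takeoff) could be sketched roughly like this. This is an illustrative Python sketch only; the class and method names are invented and this is not actual Star Citizen code.

```python
# Hypothetical sketch of the air-traffic-controller flow described above.
# All names are invented for illustration.

class AirTrafficController:
    def __init__(self, pad_ids):
        self.free_pads = set(pad_ids)
        self.reserved = {}  # pad_id -> player_id

    def request_landing(self, player_id):
        """Assign and reserve a free pad, or None if the location is full."""
        if not self.free_pads:
            return None
        pad = self.free_pads.pop()
        self.reserved[pad] = player_id
        return pad

    def clear_pad(self, pad):
        """Free the pad once the ship has landed and cleared the area."""
        self.reserved.pop(pad, None)
        self.free_pads.add(pad)

    def request_takeoff(self, player_id, spawn_ship):
        """Reserve a pad and spawn the player's ship on it for takeoff."""
        pad = self.request_landing(player_id)  # same reservation logic
        if pad is not None:
            spawn_ship(player_id, pad)
        return pad
```

The same reservation path serves both directions of traffic, which matches the transcript's point that takeoff and landing share the underlying structure.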
01:48We've been finishing up functionality on the character status system, which includes bringing the procedural breathing and suit punctures to
01:54final implementation.
01:55Once this is done, we will focus on getting the system switched on by default in the game.
02:00We are also working on pick up and carry, which is a bit of a mashup between the player interaction
02:04system and our usable sprint.
02:07The usables were more concerned with getting the AI to interact with the objects in the environment, whereas the player
02:12interaction system is more for the player UI to interact with the environment.
02:17We're now bringing these two systems together to get the player to be able to pick up, carry and then
02:22place objects in our universe.
02:24The conversation tech has now completed the initial development of the subsumption tool to create conversations with NPCs
02:30much more easily.
02:31It's been handed over to designers to prove it out by setting up all the different conversations.
02:36They'll provide feedback to code on any needed further improvements.
02:39The audio team has been working on procedural planet audio processes, including R&D and planning for systems to map
02:44and modify audio automatically.
02:46Also, work is continuing on the audio propagation system, the breathing system audio for the character status system and also
02:52a dialogue tool that's been called Word Up.
02:54For weapon sound effects, the ship weapon toolkit is in progress, which includes reload sound effects for the Gallant, the
03:00weapon tail refactor and multi-positional code support for weapons which will handle summing up the audio for many of
03:05the same weapons mounted to a single ship.
03:08For ships, the Prospector audio is done, with work on the Greycat and Cutlass Black continuing.
04:07The music department continues to work on the dynamically looping cinematic ambient music system,
04:12cleaned up the dogfighting music logic, added tension system prototyping for the planetside procedural music, and has also added
04:19more music to the launcher.
04:24Meanwhile, the graphics team have been working on many separate pieces of tech this month.
04:28The first is the integration of real-time lit volumetric fog from Lumberyard, which is going to be a huge
04:33boost for the lighting and environment art.
04:36The render-to-texture feature is progressing quickly, and the initial version is in the UI team's hands to upgrade
04:41our 2D UIs and will also soon have the feature usable for 3D holographic projections to power our various holographic
04:47displays.
04:48The real-time environment probe tech is nearing completion and allows fully dynamic bounce light and reflections on a planet
04:53where traditional light baking techniques are not possible.
04:56The visual effects team have been working on several sprints.
05:00Atmospheric flight effects have completed the first sprint with a passive planetary entry VFX.
05:06The effect is controlled by speed and atmospheric density values.
05:10With the core functionality in place for this, as well as engine trails, we're now merging these two sprints as
05:16we further implement design and art feedback while optimising and bug fixing.
05:19We have also been working on lighting entity effect improvements.
05:23This is where we are attempting to create realistic lighting and other electrical type effects.
05:29In other areas, we have completed the first pass for the MISC Prospector, including thruster improvements and damage.
05:35For weapons, we have continued initial work on the APR Scourge Railgun, including the charging and charge effects.
05:43Since finishing off the Banu Defender, the concept team has been busily developing the Origin 600i, which is now in
05:48its final stages.
05:50The weapons team have completed the Preacher Distortion Scattergun and the Apocalypse Arm Scattershot, and made good headway on the
05:56Klaus & Werner LMG.
05:59The UK ship team has been hard at work since bringing you the Javelin for Ship Shape.
06:04The Reclaimer has made a lot of progress since our last update.
06:07On the exterior, we have completed work on the hull and the team was excited to see the huge claw
06:11come together.
06:12We have now moved into the damage phase of development, splitting the mesh up and getting it ready to use
06:16the damage tech.
06:18On the interior, we have fully fleshed out habitation and tech decks, as well as an enormous salvage processing room,
06:24and now the team is working on finishing the drone room, engineering deck and cockpit.
08:33A detail pass is also ongoing, adding all the finer details you've come to expect from our ships.
08:38The interior has gone through blockout phase and is now well into art production.
08:42By utilising assets from other MISC ships, we've been able to create spaces quickly and efficiently
08:47with the intention to use these across the whole series.
09:16The environment team are continuing to explore ways to create volumetric forms
09:19in space with the graphics team.
09:21We've been baking out simulations and doing some initial renders.
09:24The surface outposts are finishing their interior visual benchmarks for engineering, habitation, and hydroponics.
09:30These will then be distributed to the various outpost layouts and configurations.
09:34The team is continuing to set dress, light, and polish these interior spaces to build character,
09:38while also exploring options for navigation and branding based on the lore and fiction.
09:44The truck stop space stations have moved into the final art phase,
09:47so the team is busy building out the shader library and working up some example pieces to final quality.
10:06As it's a modular system, we are also continuing to refine the building set
10:09to explore potential build configurations, which will make sure the set is as flexible as possible.
10:14The animation team has been working on the cover AI work,
10:17with the aim to improve all animation assets beyond functional.
10:21Breathing state improvements are now in line with back-end code improvements.
10:25This involves getting curve data out of Maya and into DataForge,
10:28which will allow for more refined procedural breathing curves.
10:32In other areas, the team started implementing multi-directional takedowns
10:36for killing enemies that are within close proximity to the player characters.
10:40Also, there have been further improvements to weapon setup and reloads across the board,
10:45including the Devastator Shotgun, the Arrowhead Sniper Rifle,
10:48the Gallant Laser Rifle, and the PASC Ballistic SMG,
10:52as well as melee improvements for pistol and stock weapons.
10:55For mission givers, 500 facial animation files have been handed over
10:58that are now ready to be implemented into Squadron 42.
11:02The motion capture team has tracked and solved almost 1,000 new body animations
11:05for various characters within the Persistent Universe.
11:09The team has also been working on new facial animations for shooting guns.
11:12Steve Bender, our animation director, has been a great source of inspiration,
11:16so expect to see new, improved faces soon.
11:20Well, that's our update for this month.
11:22Once again, I hope you enjoyed seeing what we've been up to here in the UK,
11:25supported as always by all the other studios around the globe.
11:29Thanks again for all your support and encouragement
11:31and for joining us on our journey to make Star Citizen
11:33become this incredible reality.
11:36Everyone on the team really appreciates the trust our community places in us,
11:40allowing us to create this amazing universe.
11:43Without you all, Star Citizen would not be the reality it has become.
11:46Thank you, take care, and I look forward to seeing you in the verse.
12:19We'll see you in the next one.
12:27You know, it's really great to see the progress made on the procedural breathing.
12:31When stamina is introduced in 3.0,
12:32players are really going to be able to experience the consequences of their actions,
12:36like puncturing someone's suit or running for long periods of time.
12:40Yes, it's part of the detailed universe building that we're implementing.
12:44First with the rollout of 3.0,
12:46then testing the expanding universe from there.
12:48Yeah, and speaking of the 3.0 rollouts,
12:50up next we see how the new interaction system influences every aspect of gameplay.
12:55Take a look.
13:02The player interaction system touches everything.
13:07It's a unified interaction across first-person experience of shooting,
13:15of shopping, of looting.
13:18when you go up to screens and you interact with those terminals,
13:22and also when you get into your cockpit,
13:25you fly your ship around,
13:27and being able to point at things,
13:31you know, with reckless abandon,
13:33actually opens up a lot of opportunity for interactions.
13:37of, uh, I want to find out more about that and we can give back contextual clues of the things
13:47that you can do.
13:47So the, uh, player interaction system, uh, that we're starting to show here is the third version of the interaction
13:55system that we've added in the game.
13:57Um, the original, uh, interaction system is what was in the game in, uh, alpha 2.5 and previous,
14:03and it was this kind of mish-mash of different approaches to try to figure out what the player was
14:09looking at and what they were trying to interact with in the game.
14:12Uh, after that, um, we made an item 2.0 interaction system that tried to be more accurate about what
14:19you were interacting with,
14:21and so it would use raycasts and collision geometry to figure out the results.
14:26Um, that had some issues as far as usability,
14:28and so we're adding some new, uh, features onto the current item 2.0 interaction system
14:34that makes it more contextually aware about what you're interacting with.
14:39Our old system, uh, it was basically the dreaded use system,
14:44that basically, you know, we had the bounding box that you had to be inside,
14:47and then you can only really have one action tied to that,
14:50and, you know, it wasn't very descriptive, it was just use, right,
14:53and that's why it's kind of deemed as the use system.
14:56To get the player interaction system going, it really required a pretty fundamental rewrite of, uh,
15:04the interface we would use to interact with objects within the game.
15:09Um, for example, for, um, first couple of years on the project,
15:14we were so used to just having the big, horrible use prompt on everything,
15:20and, uh, really that sort of not only did that not look good,
15:24but it also sort of always felt clunky.
15:27It was very hard to actually get in the right spot to use stuff.
15:30So, we had to really sort of come up with a design that gave the kind of level of detail
15:38that we wanted to, uh, put into the game for the users to actually be able to interact with things,
15:47um,
15:47that, uh, present in the levels to the level of detail, uh,
15:53that would immerse them in the gaming experience.
15:56So, to really make all that happen, the easiest way was to just basically write a lot of this from
16:03the ground up,
16:04write it, write it as a fresh new system as opposed to, uh, just going with the old use system.
16:11I did a prototype of cargo, uh, some of the earliest, you know, jankiest bits of that that we've done.
16:18And, as part of that, I did a little section in, in a freelancer that would detect that you brought
16:25a thing in
16:26and then showed up on a cargo manifest screen, which was the first time that I made a thing with
16:31a cursor on it.
16:33Eventually, that turned into MFDs and how you interact with, with that stuff in your, the screens in your cockpit.
16:40Uh, and then eventually became, well, that's, should, we should unify all of these, you know, exponentially growing input systems
16:51to something that actually has a, uh, a core to it.
16:58This is a whole completely new system that's coming in.
17:01Because before you didn't have a cursor or anything, so it was very hard to tell, you know, what you're
17:07actually focusing on.
17:09Okay, so this text is popping up, um, on this door, but is it for this door that's to the
17:16left or to the right?
17:18So, by having the cursor and then being able to highlight the objects, which is another thing in the interaction
17:22system that we do,
17:23is we highlight the actual object that's going to be, uh, engaged.
17:28So, um, when you bring up the cursor and you have your cursor over the particular item that you're focusing
17:33on,
17:33you'll get a highlight on the object, so that's additional feedback, um, that is very useful for you to know
17:40as a player
17:40when, you know, you're using this interaction system.
17:43It's like, okay, I'm, here's the door that I'm focusing on, and here's the actions that are tied to that.
17:47I mean, the whole idea is to make all interactions that you can have in the game, uh, extremely consistent.
17:54So, uh, for instance, if you want to walk up and interact with the terminal screen, that's the interaction system.
18:01Uh, you can, you know, bring up a cursor, uh, click certain buttons, uh, that can be applied to, uh,
18:09something where it's like a kiosk,
18:10where it's very, it has a very in-depth interaction where you have multiple buttons and filters and all sorts
18:16of things.
18:17or it can be used for, like, an elevator panel where you're selecting between floor one and floor two and
18:21something simple like that.
18:23But also, not just, that's just screens, but we can also use that system for, uh, picking up objects in
18:29the world
18:30or interacting with a physical control panel.
18:33As we've, as we've gone through iterations of the interaction system, it hasn't changed too much philosophically.
18:41Uh, it's always been about having the objects in the world sort of dictate how you interact with them in
18:51tandem with your current state.
18:53Uh, we've demonstrated that a little bit with the battery, uh, demo that I think we showed in, uh, some
18:59of the previous footage
19:01where having the battery in hand, uh, gives you contextually the interaction to put it into the radar.
19:10where you could also manipulate the radar directly and open up the panel and all of those things.
19:17But because your current state interacts with the world, you get to have these, uh, more, more natural interactions.
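The battery-and-radar example can be illustrated with a toy sketch: the set of interactions an object offers depends on the player's current state, here reduced to just the held item. The object schema, function, and names below are all hypothetical, invented purely for illustration.

```python
# Toy sketch: the interactions an object offers depend on what the player
# holds (e.g. holding a battery adds an "insert" option on the radar).
# Object schema and names are invented for illustration.

def available_interactions(target, held_item=None):
    options = list(target.get("base_actions", []))
    # Contextual actions appear only when the held item fits a socket.
    for socket in target.get("sockets", []):
        if held_item == socket["accepts"]:
            options.append(f"insert {held_item} into {target['name']}")
    return options

radar = {
    "name": "radar",
    "base_actions": ["open panel", "inspect"],
    "sockets": [{"accepts": "battery"}],
}
```

With nothing in hand, the radar only offers its direct manipulations; with the battery held, the contextual insert action appears as well.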
19:28So the way the system would work on a regular basis is the user would, uh, press and hold F
19:34to go into the interaction mode.
19:37They'd then be presented with a cursor that could kind of lead the choice.
19:43Uh, choices made depend on the proximity of the player to the various objects and the way you've been facing
19:49and such.
19:50But really the cursor would kind of lead the action and then clicking the left mouse button could actually trigger
19:59the, trigger the action.
20:00So that would lead into sort of an animation of picking something up or, um, could be inspecting a particular
20:10object.
20:11And we've also added various different sub modes, such as like rather than a left click, if you do a
20:17right click,
20:18we can zoom in on an object, focus on it a bit more to get that extra detail so we
20:24can see what, see what we're doing with it.
20:27Um, so very much cursor led to give us the precision, but then the result that comes out would tend
20:35to be something that would be animated.
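The flow just described (hold F to enter interaction mode, a cursor that leads the choice, left-click to trigger the action, right-click as a zoom sub-mode) could be modelled as a small state machine. This is a purely illustrative sketch with invented names, not the game's implementation.

```python
# Minimal state sketch of the described flow. Names are illustrative only.

class InteractionMode:
    def __init__(self):
        self.active = False
        self.focused = None
        self.zoomed = False

    def hold_interact_key(self):
        # Press and hold F to enter interaction mode.
        self.active = True

    def release_interact_key(self):
        self.active, self.focused, self.zoomed = False, None, False

    def move_cursor(self, candidate):
        # The cursor leads the choice of which object is focused.
        if self.active:
            self.focused = candidate

    def left_click(self):
        # Trigger the focused object's action (e.g. pick up, inspect).
        if self.active and self.focused:
            return f"trigger:{self.focused}"

    def right_click(self):
        # Sub-mode: zoom in on the focused object for extra detail.
        if self.active and self.focused:
            self.zoomed = True
```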
20:37It was important to me that you'd be able to interact anywhere on the screen for a couple of reasons.
20:42One, I just, I really liked the way it felt and I, I sort of got sold on that really
20:48early and, uh, through iteration,
20:52there were versions of it that were more and less successful, but I'm really quite fond of it myself.
20:58But also, again, with the animation, your ability to look around the world is limited by your animation, by the
21:08physicality of your character.
21:10And so if there's something that you need to interact with, you need to be able to look at it.
21:16And traditionally, shooters manipulate everything through a dot in the center of the screen, which is you, really.
21:26All, all of your interaction capability is a dot in the center of the screen.
21:31And that means that if you're going to interact with something, you have to be able to get it to
21:35the center of the screen,
21:35which isn't so hard when you can fudge the animations a bit, but when you're being really faithful to what's
21:43actually happening,
21:45uh, it's important to be able to look with your eyes to the edges and, uh, you can reach anything
21:54that you can see.
21:55So if there's interactions on your body of where you would put items to stow or to, uh, access things
22:06about the seat that you're sitting in,
22:08uh, these things should always be available to you as well.
22:12The way it works is it uses, uh, what we call a proximity query.
22:16So it checks what's around the player in, um, the local area to see what is interactable,
22:21but then also uses some ray casts to figure out what the cursor is currently pointing at.
22:26And then using that, it can figure out what its best estimation is for what you're looking at right now,
22:32what you were previously looking at, and what is the best result for what you should interact with if you
22:38press the interact button immediately.
22:40So it's basically this cursor that's kind of, um, browsing the options that are available to you.
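The selection logic described here (a proximity query gathering nearby interactables, a cursor ray narrowing them down, and a best-estimate pick) could look something like the following sketch. The scoring weights and names are invented; a real raycast against collision geometry is stood in for by a simple direction-alignment test.

```python
# Sketch of "proximity query + cursor ray -> best interactable".
# Weights and names are invented for illustration.
import math

def best_interactable(player_pos, cursor_dir, interactables, max_dist=3.0):
    best, best_score = None, float("-inf")
    for obj in interactables:
        dx = [o - p for o, p in zip(obj["pos"], player_pos)]
        dist = math.sqrt(sum(d * d for d in dx))
        if dist > max_dist or dist == 0:
            continue  # proximity query: ignore objects out of reach
        to_obj = [d / dist for d in dx]
        # How closely the cursor ray points at the object
        # (a stand-in for a real raycast hit test).
        alignment = sum(a * b for a, b in zip(to_obj, cursor_dir))
        score = alignment - 0.1 * dist  # prefer aimed-at, then nearer, objects
        if score > best_score:
            best, best_score = obj, score
    return best
```

The same scored list could also supply "what you were previously looking at", by keeping the runner-up candidates around between frames.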
22:47You know, depending on what type of item or what type of action that you're going to, that you're going
22:51to invoke
22:52on the thing you're focusing on, um, you'll get different cursors, for example.
22:56So you'll get, uh, very specific feedback on, okay, this is a dial.
23:01So your cursor might be a, uh, sort of like a dial icon that indicates that you can rotate it.
23:06Uh, or, or it's this onscreen cursor, you know, that you're over a button or, you know, things like that.
23:13So you get, like, that sort of feedback that, you know, uh, makes it much more intuitive than if, let's
23:20say,
23:20if you just use one cursor for everything and there was no feedback.
23:24Building it was a challenge because it required some collaboration, for one, between the studios.
23:29We had design here in L.A. Um, the main engineering was spearheaded in the U.K. with support, support
23:35here in the L.A. studio.
23:37And one big part of it was how to get the sophistication of that behavior to work with just, you
23:45know, screen coordinates.
23:46You have this cursor, um, you know, the player's position.
23:50How do we figure out, given this information in 3D space, what is the best solution?
23:55And so you're going to have to do a little bit of math.
23:57You have to do a little bit of testing to test it out and prototype and iterate, um, to see
24:02what actually works.
24:04And the other issue is performance. You know, how do you do all these checks and how do you make
24:08it so smart and intelligent
24:09about what you want to do and what the right result is and still be performant?
24:14The kind of challenges you get with this is with it being a system that's going to be regularly used
24:21for all manner of different objects,
24:25you know, all manner of different scenarios. Um, you really need to get that level of polish, um, added to
24:32the system
24:32to make it feel good. So the player isn't there getting annoyed every time they want to use an object.
24:41And also we need to kind of cover all these various different scenarios and situations the player finds themselves in
24:50so that they can act, the player can actually do what they want to do with these objects, as opposed
24:56to being limited by a system that, um, being in CryEngine previously,
25:03was just very kind of flat in, uh, in what you could do with an object.
25:09It's really the kind of situations you'd find yourself in where, whether it was just getting in a ship.
25:16Often you were just presented with the option to use the thing as opposed to like, you know, open me
25:24the door, deploy me the ladder,
25:26um, choose to climb up the ladder, actually start the engine. You know, these are all the things that we
25:32wanted to add,
25:33but we just didn't really have the system there. So it was a real challenge to get that system in
25:39place that allowed us to do all these things that we want to do,
25:43but also sort of to a nice polished level where that wouldn't feel frustrating or, um, tricky to do.
25:51Part of what's been so difficult about getting this system together has been that as we built the various systems
25:58of the game,
25:58we had to make them in isolation so that they would function, you know, regardless of the fact that the
26:04rest of the game wasn't there yet.
26:06So when we, now that we have those things and we want to bring them all together, consolidate them into
26:12something that's a bit more robust and sensical.
26:16Building that is, is difficult because you have to take all these really specific behaviors that are tailored to all
26:25these systems
26:25and create a generic interaction object that needs to sync with a generic usable object, which needs to sync with
26:36a generic animation object.
26:38And all these things have to, they, they have to be that generic because they need to touch so many
26:45parts of the game.
26:46On top of that, there's the fact that we have, uh, we have these interactions that we want to do
26:53that need to coexist with a wide variety of gameplay.
26:59You could be in a seat, uh, chilling, looking around, uh, browsing mobiglass.
27:06You could be in the middle of a firefight. Uh, you could be frantically repairing something.
27:12You could be cautiously exploring a derelict. There's all these wildly different experiences that this needs to accommodate.
27:20It took a lot of, uh, experimentation to sort of feel out what allowed you that, that degree of expression
27:29and nuance
27:29without impinging too much on the other systems that were so far afield from what you were doing now.
27:35For example, the grabby hands system, as it's been so notoriously labeled, uh, was an exploration of how do we,
27:45uh, create a system that accommodates all these different types of carried objects.
27:54And what does it mean for something to be in your possession, in your inventory, when we don't really have
28:01an inventory,
28:02where we have physical places on your body. And how do you access those things now?
28:08How do you, uh, put them into the world? How can you get close enough to take them back?
28:14And so those, a lot of those questions, uh, have also extended into this system, you know, which has been
28:23a hub of all these things, like the, the terminals as well.
28:26There's some, um, real intelligence added to the system to, um, create that nice feel of, um, an intuitive kind
28:35of selection of the objects that, uh,
28:39that you're close by with, uh, what you're, what you're trying to interact with. Um, if you don't add in
28:45some sort of intelligence there,
28:47it can, uh, it can lead to kind of frustrations as to, like, clearly, like, clearly the player's trying to
28:55interact with a certain object
28:57that's sat in a particular position, but because, you know, without the intelligence, it might require a very specific, um,
29:09alignment of player to object. Um, there might be multiple objects in one scene or close by, so we have
29:16to sort of
29:16get, add some kind of intelligence to, uh, try and figure out which object the player wants to, uh, wants
29:26to actually interact with.
29:28Whenever you interact with something in the world, the way that that thing is designated as being interactable
29:36is using the Item 2.0 component system. So a designer would create some kind of record for something that
29:43they want to be interactable.
29:44It could be an elevator. It could be a door. And then they'll give it the interactable component. And on
29:51the interactable component,
29:52they define the sets of interactions that can be used with it. And then the interaction points that are on
29:59that entity
29:59where they want those interactions to be shown. Um, one thing that the new interaction system adds is the ability
30:06to kind of
30:07have interactions on particular bones and sub regions of an entity. Whereas previously the old system made it where you
30:14kind of had these large bounding boxes and you would kind of get lost whenever you're trying to find things.
30:20It wasn't clear. So the, the important thing about having these generic components is it gives us the building blocks
30:27to make things that are ultra bespoke. The problem with bespoke content is that it requires a lot of painstaking
30:37maintenance.
30:38Because as you move forward, things about your technology change, things about the environment in which you've placed this content
30:45change.
30:46And in order to keep it all working lockstep and sync, uh, you have to be really vigilant going through
30:53all those things.
30:54And when you make your behaviors separated and modular and generic, it makes it so that you can build things
31:04out more conceptually.
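The designer-facing setup described above (an entity record given an interactable component, a set of interactions, and interaction points attached to particular bones or sub-regions) can be pictured as a data record like the one below. The schema and every name in it are invented for illustration; they are not the actual Item 2.0 format.

```python
# Hypothetical, simplified shape of an "interactable" component record,
# with interaction points bound to bones/sub-regions of the entity.

door_record = {
    "entity": "airlock_door",
    "components": {
        "interactable": {
            "interactions": ["open", "close", "lock"],
            "interaction_points": [
                {"bone": "handle_bone", "offers": ["open", "close"]},
                {"bone": "keypad_bone", "offers": ["lock"]},
            ],
        }
    },
}

def interactions_at(record, bone):
    """Which interactions the cursor should surface at a given sub-region."""
    comp = record["components"]["interactable"]
    for point in comp["interaction_points"]:
        if point["bone"] == bone:
            return [i for i in point["offers"] if i in comp["interactions"]]
    return []
```

Binding options to sub-regions rather than one large bounding box is what lets the cursor show different actions on the handle versus the keypad of the same door.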
31:06This thing is heavy, but it can be picked up and it takes two hands.
31:14So that's going to affect what you can be holding at the time you try to interact with this thing.
31:20It's going to affect what's going to be the result of throwing this thing.
31:24What's going to be the result of it in zero G having collisions with other things.
31:28And those are fairly simple examples and already it starts to spiral out into all these possibilities of when you
31:37can get really specific in the content that you're making.
31:42It's usually because the rest of it has been nailed down.
31:47We'd use similar subsystems within the game, such as, uh, such as the zone system, zone system queries, for example,
31:57to, uh, figure out which objects you could interact with were in the proximity of the player.
32:05We'd also use sort of standard, uh, engine techniques, such as ray casting and, um, other such things to actually
32:14determine which of these objects were best to use.
32:17But really there was a whole another sort of layer of logic added for this system to give us the
32:25feel and the depth that we wanted to add for interactions.
32:28So now there is a through line of input that you can interact with the whole game and, uh, it
32:36remains consistent.
32:38It adds quite a lot of exciting possibilities.
32:41It gives you the opportunity to have essentially a point and click adventure game in your shooter.
32:45So all of those wild bespoke interactions that adventure games are built out of are suddenly available for something that
32:53is so systemic, such as Star Citizen.
32:56I think the biggest change in player experience is, uh, kind of a change in mindset when you're using it.
33:03When you use the old interaction system, um, you're kind of fishing around and you're not sure what you're looking
33:10for.
33:11When you're using the new interaction system, it's more about fluidly browsing what you can clearly see.
33:17And it's a really big difference, I think.
33:19Because you'll walk into a room, you enter into this interaction mode, and immediately you can tell, based on this
33:24highlighting that's happening,
33:26here's the options about what I can work with right now.
33:29What do I want to do given these options?
33:31And you can kind of float the cursor around, see what's close, see what the different available interactions for those
33:37different objects are.
33:38It might be that, oh, I can turn on the engines, or I can turn off the power.
33:42Do I want to do that?
33:43And so I think it's just a more enjoyable experience.
33:46You also don't have to move so much.
33:49So previously with the old interaction system, if you wanted to go interact with this thing over here, you'd have
33:54to go over there.
33:55You have to position yourself, you'd have to look.
33:58It just takes more work to look at the different options.
34:01Whereas this, you can kind of stay put a little bit more, and kind of more fluidly guide yourself through
34:07the different options.
34:07You don't have to do as much work to see what is all there.
34:11The focus up to now has been on consolidating all of our behaviors in tech to get to the point
34:18where, as we're building things,
34:20we can sort of stitch them together conceptually and be able to get really specific with things.
34:26And that frees us up to have all sorts of creative ideas, and this is just the start.
34:34So what we're also working on in conjunction with this interaction system is something that we're calling render to texture.
34:40And in the UI sense, that will allow the UI to render properly within the rendering pipeline.
34:47It'll pick up all the post effects, and it'll actually look like it's in the world.
34:51So that's another thing that's going to really make your interaction feel much more in-world, part of the game
34:59experience.
35:00The other thing it'll allow us to do is project onto curved surfaces, which, in terms
35:05of a sci-fi setting, is probably one thing we'll want to have.
35:11So wherever your cursor is, it'll allow us to easily determine what UI element you're actually over.
35:18So if you're over a button that's on a curved surface, the position of your cursor is
35:23much easier to map down onto that curved surface with this render-to-texture tech.
35:27So that's kind of an example of all these different pieces that are going to come together to
35:34make a fulfilling game experience.
35:36I think the one thing that I would add is that this new interaction system is starting to come online.
35:42We're starting to add new things, but there's still a lot that can be added.
35:46Like every day, we're thinking of new possibilities for how this interaction system can be used.
35:50Where have we used other solutions in the game that maybe can be replaced by this?
35:55One thing, for instance, is the item port system in the hangars.
35:59That's kind of its own thing right now, but it shouldn't be.
36:02It's almost identical to what we're doing with this interaction system.
36:06So it doesn't have the same benefits.
36:09It's an opportunity for us to unify the systems and gain the benefits from the new interaction system.
36:15But beyond that, there are even newer, wilder things that we could possibly do.
36:20We could try to bring more of the ship HUD, for instance, into the interaction system in some parts,
36:26to where maybe you could look at something, say, that's displaying the ships that are in the area.
36:32It might be possible to then select one of them and then find some information about it
36:36or communicate with that particular player.
36:38Using the new interaction system, we can generate new interaction points and new interactions at runtime.
36:44So it gives us a lot of flexibility with deciding, you know, what behavior do we want to open up
36:49to the player
36:50with the benefit that it's all using one system to do it.
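The runtime flexibility described here, one system through which any game feature can expose interactions, can be sketched with a simple registry. All names are illustrative assumptions; this shows only the general pattern of generating and removing interaction points while the game runs.

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// Illustrative sketch of runtime-generated interaction points: one
// registry through which every system (hangar item ports, HUD contacts,
// ship seats) can add or remove player-facing interactions at runtime.
class InteractionRegistry {
public:
    using Action = std::function<void()>;

    // An interaction point (e.g. a HUD contact) exposes a named verb.
    void Register(const std::string& point, const std::string& verb, Action a) {
        points_[point][verb] = std::move(a);
    }

    // Points can disappear at runtime, e.g. when a contact leaves the area.
    void Unregister(const std::string& point) { points_.erase(point); }

    // Run the behavior the owning system opened up, if it still exists.
    bool Trigger(const std::string& point, const std::string& verb) {
        auto p = points_.find(point);
        if (p == points_.end()) return false;
        auto v = p->second.find(verb);
        if (v == p->second.end()) return false;
        v->second();
        return true;
    }

private:
    std::map<std::string, std::map<std::string, Action>> points_;
};
```

With a single registry like this, unifying a system such as the hangar item ports means re-registering its actions here instead of maintaining a parallel mechanism.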
36:54This game has been very iterative, and I think that's one of the great things about it.
36:59But certainly the interaction system is a great example of that.
37:03We kind of started with the stuff we had in CryEngine, and it worked.
37:09It's not like it was broken, but it didn't have everything that we needed.
37:13So we started adding stuff to it.
37:15But it was buggy, it wasn't performant, it didn't do quite what we needed.
37:19So we kept having to iterate and change and invent and make new things.
37:24And now we're getting to this place where eventually, after a lot of work, we're bringing it all together.
37:29And I'm really excited for when we get it into players' hands and they can start to see all of
37:34these things coming together.
37:35You know, as the guys said, the new system is just the groundwork for more realistic experiences here in Star
37:41Citizen.
37:41It will continue to grow, just like our universe.
37:44Yes, and it will affect every aspect of the universe to create a more immersive player experience.
37:49Yeah, before we go, I just want to remind the subscribers that they can fly the Drake Buccaneer as part
37:54of our Ship of the Month.
37:56Subscribers will also get an Icarus 1 holo model as part of their flair this week.
38:00And if you're interested in learning about our subscriber program, check out the link in the description.
38:05That's all for this episode of ATV.
38:07We want to thank all of our backers for your continued support.
38:10You're the reason we're able to create the Best Damn Space Sim ever.
38:14Yeah, and we're also very grateful to all of our subscribers who make shows like this possible.
38:17Thank you.
38:19And thanks for watching.
38:20We'll see you around the verse.
38:50If you want to keep up with the latest and greatest in Star Citizen and Squadron 42's development, please follow
38:55us on our social media channels.
38:57See you soon.
38:58We'll see you soon.
39:00Bye.
39:01Bye.
39:07Bye.