089 - Placing Initial Entity Spawns

October 18, 2020

This week I continued working on the editor tool for placing initial entity spawns within the game world.

While there are still a few things left to tie together, I was able to make it to a point where there is visual progress to share.

Right now you cannot see the objects that you are placing until you switch to a different mode within the editor. This does not make sense, so at some point I will combine these modes.

You can click to place an initial entity spawn, which currently gets rendered as a red bush but in the future will be rendered as a semi-transparent version of the entity that will be spawned.

There are a few things remaining to implement that I should wrap up over the next couple of days.

Most notable of these is a text area where I can see the EntityToSpawnDescriptor that encodes what gets spawned.

I am currently implementing the data structures and methods that will power text areas.
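As a very rough illustration of the kind of structure involved (the names here are my own invention, not the engine's), a read-only text area can be backed by a simple line buffer plus a viewport query:

```rust
// Hypothetical sketch: a minimal line-based buffer that a read-only
// text area component could render from.
pub struct TextAreaBuffer {
    lines: Vec<String>,
}

impl TextAreaBuffer {
    pub fn from_text(text: &str) -> Self {
        TextAreaBuffer {
            lines: text.lines().map(str::to_string).collect(),
        }
    }

    /// The lines visible in a viewport that starts at `first_line`
    /// and shows at most `height` lines.
    pub fn visible_lines(&self, first_line: usize, height: usize) -> &[String] {
        let start = first_line.min(self.lines.len());
        let end = (start + height).min(self.lines.len());
        &self.lines[start..end]
    }
}

fn main() {
    let buf = TextAreaBuffer::from_text("kind: Bush\ncolor: red\nhp: 10");
    assert_eq!(buf.visible_lines(1, 2), ["color: red", "hp: 10"]);
    assert!(buf.visible_lines(10, 3).is_empty());
}
```

Keeping the buffer decoupled from rendering like this should make it easy to reuse the same component for future editable text areas.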

Editor Rendering Architecture High-Level Overview

When we implemented adding scenery in the editor in 082, all we needed to do in order to see the new scenery was to implement a way to tell the game that it needed to update its scenery HashMaps.

The game already knew how to load and render scenery, even if it was defined at run time.

Initial entity spawn locations were different though. The game client does not have any notion of where entities should be spawned. It just renders entities that the server tells it about.

We want to be able to edit the game while playing it in the editor, so we needed some way to render initial entity spawns within the running game pane even though the game knows nothing about them.

Here's a high-level view of the rendering architecture that solves this problem.

The Anambra¹ engine has a Renderer trait that is used to implement different rendering backends, such as the WebGlRenderer for the web and the MetalRenderer for macOS devices.

The editor uses a native Renderer such as the MetalRenderer on macOS. When it creates instances of the game, however, it gives them a fake Renderer implementation that simply stores the RenderJobs that the game instances create in an Arc<Mutex<Vec<RenderJob>>> that the editor has access to.

pub struct EditorRendererResource {
    renderer: Box<dyn Renderer>,
    pending_render_jobs: Arc<Mutex<Vec<RenderJob>>>,
}
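To make the capturing idea concrete, here is a minimal, self-contained sketch of such a fake Renderer. The trait shape and type contents are simplified guesses for illustration, not the engine's real definitions:

```rust
use std::sync::{Arc, Mutex};

// Hypothetical, simplified stand-ins for the engine's types.
pub struct RenderJob(pub String);

pub trait Renderer {
    fn render(&mut self, job: RenderJob);
}

// Fake renderer handed to game instances: instead of drawing anything,
// it queues each RenderJob where the editor can drain it later.
pub struct CapturingRenderer {
    pending: Arc<Mutex<Vec<RenderJob>>>,
}

impl Renderer for CapturingRenderer {
    fn render(&mut self, job: RenderJob) {
        self.pending.lock().unwrap().push(job);
    }
}

fn main() {
    let pending = Arc::new(Mutex::new(Vec::new()));
    let mut fake = CapturingRenderer { pending: Arc::clone(&pending) };

    // The "game" submits a job; nothing is actually drawn.
    fake.render(RenderJob("game frame".to_string()));

    // The editor drains the queued jobs before real rendering happens.
    let drained: Vec<RenderJob> = pending.lock().unwrap().drain(..).collect();
    assert_eq!(drained.len(), 1);
}
```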

The editor's RenderSystem combines all of the rendered game instances along with the editor's user interface into a single RenderJob, and then passes this merged RenderJob to the real Renderer implementation.

impl<'a> System<'a> for RenderSystem {
    type SystemData = RenderSystemData<'a>;

    fn run(&mut self, mut sys: Self::SystemData) {
        let render_job = create_editor_render_job(&mut sys);

        let mut jobs = vec![render_job];

        for job in sys.renderer.pending_render_jobs().lock().unwrap().drain(..) {
            jobs.push(job);
        }

        let merged = merge_render_jobs(&jobs);

        // ... pass `merged` to the real Renderer ...
    }
}
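The merging step itself can be pictured as little more than concatenating the framebuffer jobs of each input job. A simplified sketch, with RenderJob reduced to a toy struct (not the engine's real layout):

```rust
// Hypothetical, simplified stand-ins for the engine's types.
#[derive(Clone, Debug, PartialEq)]
pub struct FramebufferJob(pub u32);

pub struct RenderJob {
    pub framebuffer_jobs: Vec<FramebufferJob>,
}

// Merge several RenderJobs (the editor UI plus each game instance)
// into one job that the real Renderer executes in a single pass.
pub fn merge_render_jobs(jobs: &[RenderJob]) -> RenderJob {
    RenderJob {
        framebuffer_jobs: jobs
            .iter()
            .flat_map(|j| j.framebuffer_jobs.clone())
            .collect(),
    }
}

fn main() {
    let editor = RenderJob { framebuffer_jobs: vec![FramebufferJob(0)] };
    let game = RenderJob { framebuffer_jobs: vec![FramebufferJob(1)] };

    let merged = merge_render_jobs(&[editor, game]);
    assert_eq!(merged.framebuffer_jobs.len(), 2);
}
```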

Note that the editor has access to the RenderJob that each game created, before it gets rendered.

This allows the editor to modify the RenderJob for any game, enabling us to insert descriptors that render entity spawn locations into the job that describes the game's final presentation framebuffer.

Rendering to the same framebuffer that the game is being rendered to allows the inserted objects to be depth tested correctly against the game's own geometry.

Here is the code where we insert initial entities into a game's FramebufferRenderJob that describes how to render to the game's final framebuffer².

// FIXME: Use some sort of HashMap to look up the RenderJob and FBJob
//  instead of iterating over all jobs
'pane_jobs: for rjob in pending_render_jobs.iter_mut() {
    for fb_job in rjob.framebuffer_jobs_mut().iter_mut() {
        if fb_job.framebuffer_id() == game_pane.final_framebuffer_id() {
            let game_pane_fb_job = fb_job;

            match game_pane.mode() {
                GamePaneMode::Playing => {}
                GamePaneMode::PlaceObject(_) => {}
                GamePaneMode::EditTerrain(_) => {}
                GamePaneMode::Object(o) => {
                    // (function name hypothetical; the call was elided
                    //  in the original snippet)
                    insert_entity_spawn_render_descriptors(
                        &mut render_job,
                        game_pane_fb_job,
                        o,
                    );
                }
            }

            break 'pane_jobs;
        }
    }
}

This sits on top of work from 076 where I introduced the concept of prefix IDs for GPU resources, allowing multiple applications to share the same GPU device handle and resources without worrying about two different applications accidentally using the same ID for a GPU resource such as a texture or vertex buffer.
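The prefix-ID idea can be sketched as packing a per-application prefix into the high bits of every resource ID. The bit widths and names below are illustrative guesses, not the engine's actual scheme:

```rust
// Hypothetical sketch: pack an application prefix into the high bits
// of a resource ID so two apps sharing one GPU device handle can
// never collide, even if they generate the same local ID.
const PREFIX_BITS: u32 = 8;
const LOCAL_BITS: u32 = 24;

fn prefixed_id(app_prefix: u32, local_id: u32) -> u32 {
    debug_assert!(app_prefix < (1u32 << PREFIX_BITS));
    debug_assert!(local_id < (1u32 << LOCAL_BITS));
    (app_prefix << LOCAL_BITS) | local_id
}

fn main() {
    // Same local ID in two different applications...
    let editor_texture = prefixed_id(1, 42);
    let game_texture = prefixed_id(2, 42);

    // ...still produces two distinct GPU resource IDs.
    assert_ne!(editor_texture, game_texture);
}
```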

I always appreciate when old investments get re-used or built upon weeks, months or even years later.

Other Notes / Progress

  • Made the game server's build script take all of the initial entity spawns that are defined on disk in YAML format and generate a binary-encoded file that is included in the final game binary and used to spawn entities at the beginning of runtime.
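That pipeline boils down to an encode/decode pair over the spawn data. A real build script would parse the YAML with a crate such as serde_yaml and embed the result via include_bytes!; to stay dependency-free, this sketch hand-rolls a tiny little-endian encoding for a made-up EntitySpawn type:

```rust
use std::convert::TryInto;

// Hypothetical stand-in for the on-disk spawn definition.
#[derive(Debug, PartialEq)]
struct EntitySpawn {
    x: f32,
    y: f32,
}

// Build-time step: serialize spawns into a compact binary blob.
fn encode(spawns: &[EntitySpawn]) -> Vec<u8> {
    let mut out = Vec::new();
    out.extend((spawns.len() as u32).to_le_bytes());
    for s in spawns {
        out.extend(s.x.to_le_bytes());
        out.extend(s.y.to_le_bytes());
    }
    out
}

// Run-time step: the game binary decodes the embedded blob at startup.
fn decode(bytes: &[u8]) -> Vec<EntitySpawn> {
    let count = u32::from_le_bytes(bytes[0..4].try_into().unwrap()) as usize;
    (0..count)
        .map(|i| {
            let off = 4 + i * 8;
            EntitySpawn {
                x: f32::from_le_bytes(bytes[off..off + 4].try_into().unwrap()),
                y: f32::from_le_bytes(bytes[off + 4..off + 8].try_into().unwrap()),
            }
        })
        .collect()
}

fn main() {
    let spawns = vec![EntitySpawn { x: 1.0, y: 2.0 }];
    let bytes = encode(&spawns);
    assert_eq!(decode(&bytes), spawns);
}
```

Doing the conversion at build time means the server never parses YAML in production, and a malformed spawn file fails the build rather than the game.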

Next Week

I am in the middle of implementing a re-usable text area UI component. The current use case is to display the YAML that describes an initial entity spawn in the editor's active object properties pane, but in the future there are sure to be more use cases for editable text areas.

I won't make it editable this time around. Just displaying text is enough for now; I can edit the files by hand for some time.

After this I am starting a new stretch where I will be prototyping new gameplay and deploying my progress weekly.

Stay tuned!

Cya next time!



¹ The engine has a name now!


² Game instances that are running in the editor render their final framebuffer to a color texture. The editor then displays that color texture in the viewport.