060 - Adding Some Visuals

March 29, 2020

In the last journal entry I said that this week we'd be focused on the visuals: adding things that players can see and starting to iterate towards something that looks and feels like a game.

We made great progress on this - here's a before and after of the beginning of the week vs. today.

Beginning of week and end of week

Good progress this week! Still some work to do before things look good, but we're approaching a point where things are at least serviceable.

Prepping for an initial alpha release

In 054 I said that I'd be aiming for April 9th as the alpha release date. This is still on track.

There are lots of things that I want to add to the game and wish were there before "announcing" it to folks, but those can be added over time.

I'm planning to work on Akigi for years to come so there will always be more things to do. It's time to start letting people play the game.

Or rather - it's time to have a game for them to play after years of a nearly blank canvas.

UserInterfaceSystem

I started the week by adding backgrounds to the bottom and side panel.

To accomplish this I introduced the UserInterfaceSystem, which pushes UI elements to a Vec<UiElement>.

The RenderSystem later iterates over these UI elements in order to draw textured quads on the screen.
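Here's a minimal, ECS-free sketch of that hand-off. The UiElement fields, panel sizes, and free-function style are illustrative stand-ins - the real systems run inside the ECS and hold more state.

```rust
// A simplified sketch of the UserInterfaceSystem / RenderSystem hand-off.
// Fields and sizes here are illustrative, not the real implementation.

/// A textured quad to draw on the screen, in screen coordinates.
#[derive(Debug, PartialEq)]
struct UiElement {
    texture_id: u32,
    /// (x, y) of the quad's top left corner, in pixels.
    screen_position: (i16, i16),
    /// (width, height) in pixels.
    size: (u16, u16),
}

/// The UserInterfaceSystem pushes one UiElement per panel background.
fn user_interface_system(ui_elements: &mut Vec<UiElement>, viewport: (u16, u16)) {
    let (vw, vh) = viewport;

    // Bottom panel background spans the full width of the viewport.
    ui_elements.push(UiElement {
        texture_id: 0,
        screen_position: (0, (vh - 100) as i16),
        size: (vw, 100),
    });

    // Side panel background hugs the right edge.
    ui_elements.push(UiElement {
        texture_id: 1,
        screen_position: ((vw - 150) as i16, 0),
        size: (150, vh - 100),
    });
}

/// The RenderSystem later walks the list and draws one textured quad
/// per element. Returning the count here stands in for issuing draws.
fn render_system(ui_elements: &[UiElement]) -> usize {
    ui_elements.len()
}
```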

TransformationInterpolationSystem

The TransformationInterpolationSystem was introduced in order to interpolate entities' positions and rotations over time.

The rotation interpolation is a little choppy so I'll give that a few tweaks sometime soon.
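In sketch form the interpolation looks something like this - a plain lerp for positions and a normalized lerp for rotations. The free functions here are just for illustration; the real system lives inside the ECS.

```rust
// `t` is how far we are between the two known transform snapshots (0.0..=1.0).

/// Linearly interpolate a position.
fn lerp_position(prev: [f32; 3], next: [f32; 3], t: f32) -> [f32; 3] {
    [
        prev[0] + (next[0] - prev[0]) * t,
        prev[1] + (next[1] - prev[1]) * t,
        prev[2] + (next[2] - prev[2]) * t,
    ]
}

/// Normalized lerp between two rotation quaternions (x, y, z, w).
/// nlerp is cheap and fine for small angular deltas; a proper slerp gives
/// constant angular velocity, which is one way to smooth choppy rotation.
fn nlerp_rotation(prev: [f32; 4], next: [f32; 4], t: f32) -> [f32; 4] {
    // Take the shortest path: if the dot product is negative, negate one input.
    let dot: f32 = prev.iter().zip(next.iter()).map(|(a, b)| a * b).sum();
    let sign = if dot < 0.0 { -1.0 } else { 1.0 };

    let mut out = [0.0f32; 4];
    for i in 0..4 {
        out[i] = prev[i] + (sign * next[i] - prev[i]) * t;
    }

    // Re-normalize so the result is still a unit quaternion.
    let len = out.iter().map(|c| c * c).sum::<f32>().sqrt();
    for c in out.iter_mut() {
        *c /= len;
    }
    out
}
```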

SkeletalAnimationSystem

The SkeletalAnimationSystem was introduced to determine the dual quaternions that an entity should be using for its corresponding armature.

This is powered by blender-armature under the hood.
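For a rough idea of what gets done with those per-joint dual quaternions, here's the generic dual quaternion linear blending (DLB) step that skinning typically performs. The DualQuat type is an illustrative stand-in, not blender-armature's actual API.

```rust
/// A rigid transform encoded as a dual quaternion:
/// `real` holds the rotation, `dual` encodes the translation.
#[derive(Clone, Copy)]
struct DualQuat {
    real: [f32; 4],
    dual: [f32; 4],
}

/// Blend joint transforms by their skinning weights, then re-normalize
/// by the magnitude of the real part so the result stays a valid
/// rigid transform.
fn blend_dual_quats(joints: &[(DualQuat, f32)]) -> DualQuat {
    let mut real = [0.0f32; 4];
    let mut dual = [0.0f32; 4];

    // Weighted sum of all influencing joints.
    for (dq, weight) in joints {
        for i in 0..4 {
            real[i] += dq.real[i] * weight;
            dual[i] += dq.dual[i] * weight;
        }
    }

    let len = real.iter().map(|c| c * c).sum::<f32>().sqrt();
    for i in 0..4 {
        real[i] /= len;
        dual[i] /= len;
    }

    DualQuat { real, dual }
}
```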

Nearby Chat

A few systems were introduced on the frontend and backend in order to power chatting with other nearby players.

When a player sends a message, players that are close by will see it above the sender's head and also inside their message panel.
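The backend's "nearby" check boils down to a radius test around the sender. Sketched out below - the radius and the flat (x, z) distance are illustrative choices, not the real values:

```rust
// Only players within some radius of the sender receive the chat message.
const CHAT_RADIUS: f32 = 15.0;

/// Returns the indices of players close enough to the sender to see the
/// message. Comparing squared distances avoids a square root per player.
fn players_in_chat_range(sender: (f32, f32), players: &[(f32, f32)]) -> Vec<usize> {
    let radius_squared = CHAT_RADIUS * CHAT_RADIUS;

    players
        .iter()
        .enumerate()
        .filter(|(_, p)| {
            let dx = p.0 - sender.0;
            let dz = p.1 - sender.1;
            dx * dx + dz * dz <= radius_squared
        })
        .map(|(idx, _)| idx)
        .collect()
}
```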

I ran into a big scare while working on this when the frames per second plummeted.

I was worried, but at the same time confident that there was a fix, since everything had been running smoothly before the refactor.

The issue ended up being that I was accidentally pushing to a vector of text to render every frame without clearing it out.

So I was essentially rendering the same text on top of itself thousands of times.

I discovered this when I implemented deleting text - characters that I was trying to delete were still getting rendered because the previous frames' text was never cleared out.
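Boiled down, the bug and the fix look like this - TextToRender here is a simplified stand-in for the real resource:

```rust
// A text queue that accumulates across frames unless it's cleared at the
// start of each one. `sections` stands in for the text sections that get
// handed to the text renderer.

#[derive(Default)]
struct TextToRender {
    sections: Vec<String>,
}

impl TextToRender {
    /// The fix: empty the queue at the start of every frame so each frame
    /// only renders the text that was queued during that frame.
    fn begin_frame(&mut self) {
        self.sections.clear();
    }

    fn queue(&mut self, text: &str) {
        self.sections.push(text.to_string());
    }
}
```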

I'm using glyph-brush under the hood for the text rendering.

Here's a glance at how a typical UserInterfaceSystem unit test looks.

My general approach is to verify that the user interface element is near other user interface elements that it should be near.

// A look into how I write tests for the user interface. I mainly test the positioning.
// In the future I might add tests that render the element and confirm that it matches
// an image that was manually approved once.

/// In the bottom left corner of the bottom panel we render the player's username along with
/// their pending chat message that they're typing out.
#[test]
fn pushes_pending_chat_message() {
    let world = create_test_world();
    world.fetch_mut::<ChatResource>().push_char('a');

    create_entity_with_display_name(&world);

    UserInterfaceSystem.run_now(&world);

    let mut chat_text = text_sections_containing_text(&world, "My Username: a*");
    assert_eq!(chat_text.len(), 1);

    let chat_text = chat_text.remove(0);
    let viewport = world.fetch::<Viewport>();

    let close_to_screen_left = viewport
        .within_x_distance_to_the_right_of_left_side(chat_text.screen_position.0 as i16, 10);
    let close_to_screen_bottom =
        viewport.within_y_distance_above_bottom_side(chat_text.screen_position.1 as i16, 10);

    assert!(close_to_screen_left);
    assert!(close_to_screen_bottom);
}

Planning out the pace of progression

Towards the end of the week I started planning out the pacing of progression in the game.

This felt great, because it was one of the first times since I started working on the game four years ago that I've focused on game design.

It confirmed for me that the time has finally come to focus on making a game instead of just writing underlying tech and tooling for the engine.

WaterPlaneJob

I started adding a WaterPlaneJob to the RenderJob struct. I've written about the water technique that I'll be using before.

Water plane render job work in progress

Right now the water plane render job can only specify a blue plane. When we're done it will look more like water.

I'm going to put this on pause since I didn't finish it this week. I'll circle back to it at some point in April - but for now I want to start the week working on some gameplay.

I also realised that I'll likely want to wait a few months to a year before implementing another client (such as a native macOS client). The renderer-core, renderer-test, and renderer-webgl crates are still evolving and taking shape as I add more variety to the RenderJob, so we'll want that to stabilize before we start introducing another rendering target.

// This is a work in progress - by the time I'm done there will be a bunch more fields to
// describe how to render the water plane.

/// Describes a water plane that needs to be rendered.
#[derive(Debug)]
pub struct WaterPlaneJob {
    model_matrix: [f32; 16],
    should_refract: bool,
    should_reflect: bool,
}

Other Progress

  • I started working on a launch checklist for the alpha "release" on April 9th. By release I just mean having enough for people to play and then continuing to work to add new features quickly.

  • Made improvements to the renderer-core crate to normalize some concepts that were duplicated in the RenderJob struct that describes how to render a frame.

  • Ran into some trouble a few journal entries ago with getting WebGL in Chrome working in a Docker container. I've fixed this - so I can start running the renderer tests in CI.

  • Implemented the InteractionMenuResource to store what should be shown when you click on things, and made mousing out of the interaction menu close the menu.

Next Week

I liked having a theme last week - it helped all of my work feel connected.

Last week's theme was things that you can see.

This week's theme will be things that you can do. I'll be adding a couple of gameplay mechanics that should start to set the foundation for Akigi.


Cya next time!

- CFN

Play Akigi