#1, what languages?

Web Languages

HTML5

Modern Markup Standards
  • Two years of HTML lectures at degree level.
  • Multiple years of working within the standards.
  • Understanding of semantic elements for ease of design.
Web Languages

CSS3

Styling, Animations and Design
  • One year of CSS lectures at degree level.
  • Multiple years of working within the standards.
  • Experience with CSS on legacy systems.
Web Languages

JavaScript

Dynamic Client Pages
  • Two years of JS lectures at degree level.
  • Multiple published works.
  • Proficiency with both pre- and post-ES6 JavaScript.
  • Experience with both server and client JS.
Web Languages

PHP

Dynamic Server Pages
  • Management of legacy systems (PHP <= 5.6).
  • Multiple open source libraries.
  • Work ranging from simple PHP pages through class based development and complex libraries.
  • Development of embedded systems including REST APIs, Database Migration and CLI tools.
  • Two years of degree level education.
Data Description and Manipulation

SQL

A language for querying structured data.
  • Experience in multiple variants (SQLite, MySQL).
  • 6 months of degree level education.
  • Experience of database administration and management.
  • Data normalization & de-duplication.
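As one illustration of the de-duplication work, a minimal sketch of removing duplicate rows while keeping the earliest one (MySQL-flavoured; the table and column names are hypothetical, not from any real project):

```sql
-- Keep the lowest-id row for each email, delete the later duplicates.
-- Table and column names are illustrative only.
DELETE u1 FROM users u1
JOIN users u2
  ON u1.email = u2.email
 AND u1.id > u2.id;
```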
General Purpose Programming

Python

Science, Data, Prototyping and Rapid Development
  • Five years' experience.
  • Published package.
  • Deployed production scripts.
General Purpose Programming

Lua

Embedded Scripting Language
  • Many years' experience.
  • Used in both embedded and standalone environments.
  • Code written ranging from game modifications to standalone linters and code formatters.
General Purpose Programming

Shell

Command Lines
  • Four years' experience.
  • Skills in diverse environments: Unix shell, Bash, Batch, PowerShell.
General Purpose Programming

Visual Basic (.NET)

please i dont want to use it again
  • Development of business software.
  • no seriously please dont make me use vb again.
General Purpose Programming

C++

C Family Language
  • Introductory Knowledge.
  • Use of Managed APIs (SimConnect).

#2, what technologies?

Stack - Base

Linux

Ubuntu & Debian

Use of Ubuntu and Debian as a base OS for other layers of the stack, including hardening, maintenance, user management and other administration.

Stack - Connectivity

Apache 2

Serving files.

Use of Apache as a base for both general computing, and directing to various backends. This included setting up Virtual Hosts, reverse / SSL proxies, caching, endpoint security, rewrite rules and more to ensure a reliable service with high levels of uptime.
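A sketch of the virtual-host pattern described above, combining SSL termination with a reverse proxy to a backend (hostnames, paths and ports are placeholders, not real configuration):

```apache
# Hypothetical vhost: terminates SSL and proxies /api/ to a local backend.
<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/example.crt
    SSLCertificateKeyFile /etc/ssl/private/example.key
    ProxyPreserveHost On
    ProxyPass        "/api/" "http://127.0.0.1:8080/"
    ProxyPassReverse "/api/" "http://127.0.0.1:8080/"
</VirtualHost>
```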

Stack - Storage

MySQL

also MariaDB.

Ensuring that data was stored correctly and efficiently, in a way which was both secure and accessible. This involved tasks from general administration through to design and backups, with the end goal of increasing efficiency and reliability.

Stack - Dynamic Responses

PHP

For Apache, Nginx and the CLI.

Whilst generally this only involved writing code, it also included making configuration modifications to allow for better service, implementing developer extension requests and running PHP in both Apache and non-Apache environments.

Stack - Containers

Docker

Core, Compose and Swarm

Generally, Docker worked well by itself, with my interactions being limited to the creation and management of services and containers, ensuring they interacted well with the rest of the system.

I have, however, orchestrated systems through Compose to handle multiple containers per project (such as for isolated databases), and managed these services across multiple hosts with Swarm.
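The "isolated database per project" pattern can be sketched as a minimal Compose file (service names, images and credentials here are placeholders, not a real deployment):

```yaml
# docker-compose.yml: an app container with its own isolated database.
services:
  app:
    image: example/app:latest
    depends_on:
      - db
    environment:
      DB_HOST: db
  db:
    image: mariadb:10.11
    environment:
      MARIADB_ROOT_PASSWORD: example
    volumes:
      - db-data:/var/lib/mysql
volumes:
  db-data:
```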

Stack - Dynamic Responses

Node.JS

JavaScript on the Server.

My work with Node has ranged from single-purpose micro-services and command line applications through to large systems incorporating multiple datasets to provide information to end users. This also included setting up management (for those systems not run through Docker), and ensuring compatibility with pre-existing systems.

Job Role

DevOps

Combining Development, Deployment, Testing and Operations in a unified pipeline.

Working with the multiple technology stacks on the server, combined with DevOps software such as GitLab, my role as a developer shifted over to DevOps: writing code, ensuring it deployed, testing it, and making sure it works.

This has included use of both GitLab CI and GitHub Actions, deploying to self-contained web-hosts, to Docker, or to external services such as Steamworks.
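The shape of such a pipeline can be sketched as a GitHub Actions workflow (the job, commands and secret names are illustrative, not taken from any of the projects above):

```yaml
# .github/workflows/deploy.yml: build, then deploy on pushes to main.
name: deploy
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make build          # placeholder build step
      - run: ./deploy.sh         # placeholder deploy step
        env:
          DEPLOY_TOKEN: ${{ secrets.DEPLOY_TOKEN }}
```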

Technology

Version Control

git and svn, through GitHub, GitLab, BitBucket and self-hosted solutions.

VCS ensures that changes can be traced back, breaking changes can be reverted and everything is logged, whilst also improving productivity through branches, merges and other protections against overwriting code.

Whilst my work has mainly been in using VCS, I’ve also been called upon to manage VCS services and implement VCS specific code.

Technology

REST APIs

Remote Data over HTTP.

Throughout my development career, I've had to both ingest REST APIs to find data and make changes, and write REST APIs to expose data and ingest changes.

Technology

Webhooks

“tube schoomper” -Discord

Webhooks are just REST APIs that push data, really. Just as with regular REST APIs, I've had to write code that both pushes webhooks and ingests them, including webhooks from multiple sources such as GitHub, Ghost and GitLab.

Stack - Base

Windows Server

It’s like Windows, but for Servers.

Whilst I have only recently worked with Windows Server, it has been important in developing my skills further, especially considering its use in SMEs.

Stack - Email (SMTP)

Postfix

SMTP & MTA Daemon

From smaller setups for forwarding system mail, through to larger setups including the implementation of SASL auth, DKIM keys and integration with external user directories, Postfix has been fun to work with, if a pain at times.

Stack - Email (IMAP / POP)

Dovecot

IMAP / POP3 / SASL Auth Daemon

Similar to Postfix, I’ve used Dovecot in a variety of situations, from simple flat-file logins through to external directories with SASL auth.

#3, what work?

Photon Lighting Engine

Though only a recent addition to the Photon Core team, I've put in a large amount of effort,
from improving the use of VCS within the team to building notification services.

The Photon Lighting Engine GitHub/GitLab Notifiers.

This is combined with code provided to the Photon Lighting Engine core.


L² Plates

L² Plates is a system written from the ground up in Garry's Mod, using Lua as an embedded scripting language.
The plugin itself is sold on GModStore, with a more advanced build system built around it.

L² Plates Background Image

This system includes a set of Docker containers, running a database and API service for the development tools,
and both GitLab and GitHub pipelines for building and deploying the addon versions,
alongside both the in-game system and development tools themselves.


Limelight Gaming

Development Lead

Within Limelight, as the development lead, a major part of my role has been improving team efficiency.
Our team is composed of a small number of developers, most of whom work with Limelight as a secondary or tertiary job.

Developers I manage.

This has led to an improvement in my soft skills: people management and conflict resolution.
These have been of key importance here; when one person leaving drops staffing by 15%, every person is key.


Limelight Gaming

Workflow Optimisation

With our low staff levels, we've also had to develop custom tooling in order to improve our workflows.
For example, a system I developed was the suggestions management console,
a key system in improving our Suggestions Workflow.

Suggestions Review Manager

This has led to a 50% decrease in the time spent processing internal suggestion reviews.
In addition, many parts of this system are also fully automated, from polls to requesting review to automatic closing.


Limelight Gaming

Data Protection

Whilst for many a boring subject, data protection is of key interest to me, hence why I put myself forward for the data protection role.
Within the role, I worked on improving compliance with regulations (including the GDPR), implementing workflows for subject requests and
training other staff to improve data protection throughout the company.


Preface for the LL Data Report Green Paper.

Limelight Gaming

The Emergency Vehicle Update

One of the major updates I worked on for Limelight was the Emergency Vehicle Update.
This involved working with three other texture artists to create new liveries and configurations for
over 80 vehicles, producing over 70 liveries and hundreds of configurations.

A 2013 Taurus with Rockford Police Livery.

Despite some issues with one of the texture artists becoming unavailable, and some disagreements between the artists and other staff members,
the update released only a few weeks late (impressive considering the multi-week delays that happened during development).
This not only allowed me to work on other development skills, such as texture work, but also to develop my soft skills further.


The full set of State Trooper liveries.

Glorified Studios

Deployment

One of my key pieces of work at Glorified Studios has been the development of
integrated build and deployment solutions.

For a varied range of engines, from Unity to Love2D,
my role has been to ensure that developers have the least amount of friction possible
between writing code and having those built changes deployed to consumers.

Limelight Gaming

Secure Document Store

One of my most used pieces of software to date is the secure document store.
This Node.JS application was written to securely store documents,
providing selective access to users based on various criteria, such as group filtering.
Access to individual documents is controlled by an ACL,
with groups also having individual permission nodes for audit and management purposes.
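As an illustration of that access model, a minimal sketch of an ACL check in Node (the fields, group names and permission nodes here are hypothetical, not the store's actual schema):

```javascript
// Hypothetical ACL check: a document lists access-control entries
// naming a user or group, and groups can also carry permission nodes
// (e.g. "documents.audit") for audit and management actions.
function canAccess(user, doc, action) {
  const acl = doc.acl || [];
  // Direct per-document ACL entries
  const direct = acl.some(entry =>
    (entry.user === user.id || user.groups.includes(entry.group)) &&
    entry.actions.includes(action));
  if (direct) return true;
  // Fall back to group-level permission nodes
  return user.permissionNodes.includes("documents." + action);
}
```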

The document store's document model, designed to make use of current best practices.

These are all passed through to a backend secured with current best practices,
such as the use of non-incremental IDs and using external authentication sources
to remove the requirement for storing passwords locally.

Furthermore, access was also built into the application for automatic updating of documents from external sources,
such as our staff guidelines, which are automatically built from AsciiDoc source files, with the built files deployed to the store.


The GitHub actions deployment log for the built documents.
