
DEVELOP-FPS: a First Person Shooter Development Tool for Rule-based Scripts

Bruno Correia, Paulo Urbano and Luís Moniz
Computer Science Department, Universidade de Lisboa

Abstract: We present DEVELOP-FPS, a software tool specially designed for the development of First Person Shooter (FPS) players controlled by Rule Based Scripts. DEVELOP-FPS may be used by FPS developers to create, debug, maintain and compare rule base player behaviours, providing a set of useful functionalities: i) for an easy preparation of the right scenarios for game debugging and testing; ii) for controlling the game execution: users can stop and resume the game execution at any instant, monitoring and controlling every player in the game, observing the state of each player and their rule base activation, and issuing commands to control their behaviour; and iii) for automatically running a certain number of game executions and collecting data in order to evaluate and compare the players' performance along a sufficient number of similar experiments.

Keywords: Intelligent Game Characters, Behaviour Control and Monitoring, Rule Based Scripts, Development Software Tool.

I. INTRODUCTION

In recent years artificial intelligence has become a key feature for the success of a computer game. The hard-core gamer no longer accepts "space invaders" kinds of behaviour with easily identifiable patterns; he expects the game to deliver a convincing challenge that is always different and interesting. For the game publisher, increasing a game's lifespan is also a strategic decision: the player's ability to define new scenarios and adversaries allows him to create his own challenges and opponents, expanding the longevity of the game.

The development of game oriented platforms, consoles or specially tuned computers, provided new room to develop and apply new AI techniques in commercial games. Game development toolkits are starting to provide support for the design of non-player characters' (NPCs') behaviour, mainly through the use of proprietary languages (UnrealScript on the UnrealEngine [1]), open-source or free languages (Lua on World of Warcraft [2]) or libraries of behaviours (PandAI on the Panda3D engine [3]). Although some commercial games include game editors, these are usually centred on terrain or level construction, giving limited support to the artificial intelligence aspects. High-end game development tools support the design and deployment of intelligent NPCs only through limited and proprietary solutions. Most game companies have their own tools and development kits, which are not made available to the game community. The low-cost, open source and shareware alternatives put most of their effort into supporting the game engine and graphical design, solving problems like physical simulation, collision detection and character animation; the tools to assist the design and development of NPCs' behaviour are usually omitted.

The existence of a debugging tool to validate the behaviour of an NPC is still a dream in the designer's mind. As the behaviour complexity of NPCs increases, so grows the need for a tool that provides a set of functionalities like: breakpoints that can stop a behaviour script at any point; recreating situations to test snippets of code; monitoring variables, functions and NPC knowledge; forcing a behaviour or remotely controlling a character. Most of the scripting languages used in the development of AI components are interpreted (directly or in byte-code), and the common tool available to construct those scripts is a text editor with colour syntax (although some languages provide plugins for standard IDEs, these are only for writing the code). When some execution bug occurs, the common procedure is to stop the script; in some situations the interpreter will also crash. Better interpreters will provide an error message identifying the type of error and its location in the code. With no tools to deploy, test and monitor the components, it is up to the programmer to perform the debug and test cycle of his own code. For instance, the Unity game development tool [4,5] provides a debug mechanism based on log messages produced in the script. The existence of mature tools providing a professional environment to support the whole development process would dramatically reduce the time spent in this cycle, freeing the programmer to produce better code.

If we want a game to become a professional product, we have to provide tools that allow extensive and professional testing of the code, guaranteeing the quality of the final delivery. Scripting languages without tool support can rapidly degenerate into spaghetti code, with lots of tweaks and artifices that prevent any future changes or reuse of the program.

We propose a generic architecture to support the process of developing and testing autonomous characters' behaviour in a computer game environment. Based on this architecture we created a software tool (DEVELOP-FPS) which supports the development, debugging and execution of NPC behaviours in an FPS-like game. The tool is built on the Unreal Tournament 2004 engine and uses the Pogamut API library [6] to access the environment sensor information and control the avatar.
Our tool provides the developer with a set of functionalities that allow monitoring and controlling an individual character, defining and deploying specific scenario situations, gathering data and statistics from running experiments, and getting different perspectives of the scenario.

In the next section we detail our generic architecture from a global perspective. In section 3 we present our application and the options made. Finally, in section 4 we draw some conclusions and provide future development directions.

II. GENERIC ARCHITECTURE

Our generic architecture is composed of four main components: the NPC behaviour definition script; the individual control console; the global control console; and the game engine server. These components were substantiated using the Jess Rule Based Language [7] to define the characters' behaviour and the Unreal Engine as the game server. This architecture is outlined in figure 1.

Fig. 1. Generic architecture: the Unreal Tournament Server, The Global Console and the individual Non Player Character Agents with the Jess scripts.

We can split this architecture into two main component classes: individual character management and global management. The first group comprises the tools to access, monitor and control an individual character. Through those tools the developer can issue commands to the agents, using the individual console, which can cause a wide range of effects, from alterations in the character's internal representations to consequences in the game environment. In order to maintain a certain degree of independence from the specific game environment, all control of the NPC avatar in the environment is actuated through a middleware interface (Pogamut), which provides an intermediate abstraction over the game engine. The NPC behaviour script can be debugged and executed using the console; the developer can directly control the interpreter, issuing commands, stopping execution, testing alternatives, and monitoring execution.

The second group comprises the simulated environment where all the characters actuate, and a global management tool. The simulated environment provides a game world with a physical engine, graphical representations of the environment from different points of view, and functionalities to interact with the scenario – actions an NPC can perform and information it can perceive. As stated before, the actions and perceptions are made available through the middleware interface.

The global console offers a set of functionalities to manage the characters as a group, issuing commands that all of them must accomplish.

One of our objectives with this generic architecture was to provide a relative independence between the tools made available for the development, debugging and execution of characters' behaviour and the specifics of the game engine. This architecture is an evolution of earlier work presented originally in [8].
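To make the wiring between these components more concrete, the following minimal sketch shows how an NPC agent could couple a Jess engine with the game middleware. It is only an illustration under assumptions: GameBridge is a hypothetical stand-in for the Pogamut wrapper, the script path is arbitrary, and only the jess.Rete calls are actual Jess API.

    import java.io.IOException;
    import jess.JessException;
    import jess.Rete;

    // Hypothetical, minimal view of the middleware layer (a wrapper over the Pogamut API).
    interface GameBridge {
        double[] getPosition();      // current avatar coordinates in the game world
        boolean canSeeEnemy();       // example of a perception query
    }

    // Illustrative skeleton of one NPC agent in the proposed architecture: a Jess
    // engine holding the behaviour script plus a handle to the game middleware.
    public class NpcAgent {
        private final Rete engine = new Rete();
        private final GameBridge bridge;

        public NpcAgent(GameBridge bridge, String scriptPath) throws JessException, IOException {
            this.bridge = bridge;
            engine.batch(scriptPath);    // load the rule-based behaviour script (e.g. bot.clp)
            engine.reset();              // assert the script's (deffacts ...) initial facts
        }

        public Rete getEngine() { return engine; }       // used by the individual console
        public GameBridge getBridge() { return bridge; } // used by perception/action functions
    }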


III. THE SOFTWARE TOOL

DEVELOP-FPS is a software tool written in JAVA, specially designed for the development of First Person Shooter (FPS) players controlled by Rule Based Scripts in Jess. DEVELOP-FPS may be very powerful if used by FPS developers to create, debug, maintain and compare rule base player behaviours along a number of repeated experiments. It was designed for developing scripts for the game Unreal Tournament, but it can easily be adapted to other game platforms.

In figure 2 we may see an example of DEVELOP-FPS in action: in the centre two NPCs are fighting, on the right the Global Console is displayed, and on the top and left we can see the individual console of one of the players and the 2D map as seen from that player's perspective.

Fig. 2. A screenshot of the game control with four windows: a graphical view of the environment, two 2D maps representing the global situation and an individual position, and an individual control console.

We will now proceed to detail the tool architecture and its main components and respective functionalities.

A. Global Terminal

The Global Console's main functions are: 1) to offer a bird's eye view of the world, providing a 2D map of the game world and displaying the waypoints and character positions; 2) to launch an individual console for each character, giving the user the possibility to monitor and control each NPC; 3) to stop and resume the game execution; and 4) to automatically run a certain number of game executions and collect data in order to evaluate and compare the characters' performance along a sufficient number of similar experiments. In Figure 3, we see a snapshot of the global console in a game played by 2 NPCs with IDs 218 and 219.

As we said above, the global console 2D map represents an updated bird's eye view of the NPC positions (large circular icons), with a different colour for each NPC, and also the waypoints: the reference locations in the environment defined by the user for navigation purposes. The information is obtained from each NPC through sockets: each NPC sends its position to the Global Console every 0.5 seconds.
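As a rough illustration of this reporting channel, the sketch below shows a periodic position report over a plain TCP socket, reusing the NpcAgent and GameBridge types from the sketch in Section II. The port number (9000) and the "id;x;y;z" line format are assumptions; only the 0.5 second period comes from the text.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.net.Socket;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Sketch of the NPC-side position report sent to the Global Console.
    public class PositionReporter {
        private final ScheduledExecutorService timer =
                Executors.newSingleThreadScheduledExecutor();

        public void start(String npcId, NpcAgent agent) throws IOException {
            Socket socket = new Socket("localhost", 9000);
            PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
            timer.scheduleAtFixedRate(() -> {
                double[] p = agent.getBridge().getPosition();
                out.println(npcId + ";" + p[0] + ";" + p[1] + ";" + p[2]);
            }, 0, 500, TimeUnit.MILLISECONDS);       // report every 0.5 seconds
        }
    }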
At the top of the console we see the IDs of the connected clients (the individual identification of each game character), and the one selected will have its respective console displayed; the others will be hidden, since only one of the individual consoles can be displayed at any moment. At the bottom we may see two buttons that are used to stop the game execution of every character ("Stop All") and to resume their execution ("Resume All"). This is an important feature for developing behaviours for game characters, due to the frequent need to stop the game execution for debugging and testing behaviours.

There are three parameters for the repetition of a set of similar experiments: 1) the duration of each run; 2) the number of experiments; and 3) the number of agents. Note that each game can end either because there is only one player left or because the duration has reached the defined limit.

The Global Console is responsible for starting up the NPCs, running the game until it finishes, collecting the game reports and destroying the NPCs, repeating this procedure the right number of runs.
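This repetition can be pictured as a simple loop; the sketch below is a hypothetical outline of such a loop, with every private helper standing in for console internals that the paper does not detail.

    import java.util.ArrayList;
    import java.util.List;

    // Sketch of the Global Console experiment loop: run a given number of games,
    // each ending when only one player is left or when the duration limit is hit.
    public class ExperimentRunner {

        public List<String> runAll(int numRuns, int numAgents, long durationMillis) {
            List<String> reports = new ArrayList<>();
            for (int run = 1; run <= numRuns; run++) {
                startNpcs(numAgents);                            // spawn the agents on the server
                long start = System.currentTimeMillis();
                while (!onePlayerLeft()
                        && System.currentTimeMillis() - start < durationMillis) {
                    pause(100);                                  // poll the game state periodically
                }
                reports.add(collectReport(run));                 // e.g. winner, survivors, energies
                destroyNpcs();                                   // clean up before the next run
            }
            return reports;
        }

        // Hypothetical placeholders for the console internals.
        private void startNpcs(int n) { }
        private boolean onePlayerLeft() { return false; }
        private String collectReport(int run) { return run + ";..."; }
        private void destroyNpcs() { }

        private void pause(long ms) {
            try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    }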


Fig. 3. The display of the Global Console: the 2D world map where we see a set of waypoints and two characters. At the bottom of the console we see the Pause All and Resume All buttons and the three important parameters to repeat a set of experiments.

At the moment, we do not provide an interface for specifying which settings the user wants varied and what values he wants them to take, nor for specifying what data to collect from each run. It is up to the NPC developer to program all this information directly in the JAVA code. For example, he may want to vary the set of world maps to use, and he may want a report of the NPC winner, the number of survivors, and the energy of the NPCs at the end of the game. The repeated experiments report will be written to a file (in csv format).
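For illustration, the snippet below shows one way a per-run report line could be appended to such a csv file; the file name and the columns (run, winner, survivors, final energies) follow the examples mentioned above but are otherwise the developer's choice, not something prescribed by the tool.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    // Sketch: append one line per game run to the csv report file.
    public class CsvReport {

        public static void append(String fileName, int run, String winnerId,
                                  int survivors, double[] finalEnergies) throws IOException {
            try (PrintWriter out = new PrintWriter(new FileWriter(fileName, true))) {
                StringBuilder line = new StringBuilder();
                line.append(run).append(',').append(winnerId).append(',').append(survivors);
                for (double e : finalEnergies) {
                    line.append(',').append(e);    // final energy of each NPC
                }
                out.println(line);
            }
        }
    }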
B. NPC Terminals

Each Non Player Character (NPC) has its own private console (Fig. 4), which may be hidden or visible; when displayed it can be used for monitoring and controlling the game character. It displays the NPC position, orientation coordinates and sensory information, along with information regarding the rule base execution. There is the possibility to display a world map with an icon representing the terminal's player, which can be used to tell the NPC to go to a certain position on the map. Below there is a mini command center. The Jess code entered in this command center is executed only by this NPC and the output can be visualized in a window above the command center. This Jess code can be used for additional NPC behaviour monitoring and control. On the left, we find three manual buttons for controlling the NPC movements, and at the bottom a line of buttons useful for stopping and resuming execution, among other functionalities.

In the presence of the Global Terminal only one NPC console is allowed to be displayed, as we do not want to fill the screen with terminal windows. If we want to monitor or control different agents, we have to activate the display of one after another sequentially. A different NPC is chosen for display by filling a specific slot in the Global Terminal with the NPC ID.

In case the Global Console is switched off, something different happens: every time a created NPC does not detect the Global Terminal, it launches its individual terminal. Therefore, if there are 10 NPCs created on the same computer, there will be 10 individual terminals displayed on the computer monitor, visually overloading it.
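The detection of the Global Terminal can be as simple as a failed connection attempt; the sketch below assumes the same illustrative console port used in the position-report sketch, not the tool's actual discovery mechanism.

    import java.io.IOException;
    import java.net.Socket;

    // Sketch of the start-up check: if the Global Console cannot be reached, the
    // NPC falls back to opening its own individual terminal.
    public class ConsoleProbe {

        public static boolean globalConsoleAvailable() {
            try (Socket probe = new Socket("localhost", 9000)) {
                return true;      // console answered: keep the individual terminal hidden
            } catch (IOException e) {
                return false;     // no console: launch the individual terminal
            }
        }
    }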

Fig. 4. Example of a NPC terminal. On the top section, the Jess data, which can be totally or partially hidden. On the left, three manual movement and orientation button controls. On the right, the agent state may be displayed, and in the center we see the command window.

1) Control Buttons Line

At the bottom of the NPC terminal we see a line of control buttons (see Fig. 5).

Fig. 5. The NPC control interface. From left to right: Kill agent, Reload logic, Play/Pause agent, Show/Hide agent state, Show/Hide Jess state, Show/Hide map. In the figure, the agent state is hidden and the same happens with the Jess state.

We will describe each button function from left to right.
Kill agent button: The NPC is killed and disappears from the game.

Reload Logic: If we change the NPC script, by activating this button the agent behaviour will be controlled by the most recent script version. It will be updated in the agent without being forced to close the application and reinitialize the game (a possible reload routine is sketched after this list).

Play/Pause: The NPC execution is paused and can be resumed. This way we can stop a certain player in order to monitor its behaviour in more detail. We can resume the behaviour at any time.

Step: Behaviour is executed one step forward. Time is divided into steps and behaviour can be followed step by step.

Show/Hide Agent State: The agent state, which appears on the right section of the terminal window, may be hidden or displayed.

Show/Hide Jess State: The agent information regarding the Jess rule based script execution may be hidden or displayed.

Show/Hide Map: The NPC map can be hidden or displayed.
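A possible reload routine, using only standard jess.Rete calls, could look like the following; whether the actual tool preserves additional state across the reload is not stated in the paper, so this is only an approximation.

    import java.io.IOException;
    import jess.JessException;
    import jess.Rete;

    // Sketch of the "Reload Logic" action: replace the running rule base with the
    // newest version of the script without restarting the application.
    public class ScriptReloader {

        public static void reload(Rete engine, String scriptPath)
                throws JessException, IOException {
            engine.clear();              // drop rules, templates and facts
            engine.batch(scriptPath);    // re-read the (possibly edited) script file
            engine.reset();              // re-assert the script's initial deffacts
        }
    }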

2) NPC Sensory Information

In order to monitor the behaviour execution of an NPC, it is useful to have access to its most important internal data, like the energy level, the position and rotation, and also other relevant information such as whether it is moving and whether it is seeing or hearing anything. What about the enemies? Is it seeing any of them? Is it seeing any weapon, and how much ammunition is it currently carrying? All that information can be displayed on the individual terminal window, along with the NPC ID and name (see Fig. 6).

Fig. 6. The displayed sensory information in a NPC console.

At this point, we have considered the referred data as the most important to be displayed. As we will explain later, there are other ways to monitor other aspects of the agent, by using the powerful command window tool.
3) Manual Controls

On the left we may see three manual control buttons that allow us to manually control the movement of an NPC. By clicking the right or left arrow buttons, the NPC will make respectively a clockwise or anti-clockwise 45° rotation; by clicking the north arrow, it will advance forward a certain small distance, if possible. These buttons can be very useful if we want to manually position the NPC so that it ends up with a certain position and orientation.

4) Individual 2D Map

We can visualize a world map with the position of every NPC in the game, where the position of the currently monitored NPC is highlighted (see Fig. 7).

Fig. 7. 2D Map. It allows the visualization of the monitored agent in relation to the others and the world. On the top we see information regarding the colour legends.

The map may be used as an interface for controlling the position of the NPC. The user can click on any waypoint on the map, and if it is possible, the NPC goes directly to the chosen waypoint.

5) Jess Monitoring

In order to develop and maintain a rule based script it is very useful to be able to monitor the list of facts in the Jess working memory, the agenda of rule activations, the selected and fired rule, and also the available user defined Jess functions along with some useful built-in ones (see Fig. 8). All this information may be displayed in the individual terminal window.
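The listings themselves can be obtained with ordinary Jess commands; in the sketch below, their printed output is assumed to be redirected from the engine's output router into the terminal window (the redirection itself is omitted, as the paper does not describe it).

    import jess.JessException;
    import jess.Rete;

    // Sketch of how the individual console could refresh its Jess panels.
    public class JessMonitor {

        public static void refresh(Rete engine) throws JessException {
            engine.executeCommand("(facts)");        // working memory fact list
            engine.executeCommand("(agenda)");       // current rule activations
            engine.executeCommand("(watch rules)");  // from now on, echo each fired rule
        }
    }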


Fig. 8. Jess monitoring information: the working memory facts list, the rule activations and fired rule and also the user defined functions along with other
useful built-in functions.

After stopping an NPC, it is easy to test the script rules, monitoring their activation in a certain situation. We can follow the rule-based behaviour of an NPC using the step control button and observing the Jess information on the individual terminal window.

The user defined functions visualization was introduced with the goal of helping the user in case he wants to execute a particular function using the command window. It will certainly be useful for him to look up the right function name.

On the right of the terminal window, depicted in Fig. 5, we see three buttons that allow us to hide any of these three kinds of Jess information.

6) Command Window

For full agent monitoring and control, the individual terminal offers a command window: an interface where the game developer has the possibility to execute any Jess command, as well as behaviour or perception functions, and observe their output. This is an important tool for script exploration and debugging, besides being very useful for setting up test situations.

The user can fire rules step by step, tracing the NPC behaviour and following the evolution of the NPC state and fact list as well as the rule activations and selection. Or he can execute some specific Jess function that extends the NPC state beyond the standard information given on the right and referred to in III.B.2. The user can even create a function in real time and execute it, and as Jess is written in Java, he can have full access to the Java API.

As an example, consider that we want to test the script when the NPC is facing the enemy. We would run the game until our NPC sees its enemy, pause the game, pick the right user defined Jess function, (turn-to-enemy), and execute it in the command prompt. Afterwards we would inspect the ordered list of rule activations in the terminal window by executing the (agenda) command, so that we could check whether the script rules were behaving as expected.
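A minimal sketch of such a command loop is shown below: each line typed by the developer (for example "(turn-to-enemy)" or "(agenda)") is handed to this NPC's engine and the resulting value, or the error, is echoed back. The wiring to the actual window widget is simplified here to standard input and output.

    import java.util.Scanner;
    import jess.JessException;
    import jess.Rete;
    import jess.Value;

    // Sketch of the command-window loop of an individual NPC terminal.
    public class CommandWindow {

        public static void repl(Rete engine) {
            Scanner in = new Scanner(System.in);
            while (in.hasNextLine()) {
                String line = in.nextLine().trim();
                if (line.isEmpty()) {
                    continue;
                }
                try {
                    Value result = engine.executeCommand(line);
                    System.out.println(result);              // echo the returned value
                } catch (JessException e) {
                    System.out.println("Error: " + e.getMessage());
                }
            }
        }
    }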
C. The Execution Step: the interface between JAVA and JESS

The game execution is divided into steps, but the script developer is responsible for the definition of what a step is, although there are some restrictions. The JAVA NPC controller will always put two special Jess modules in the focus stack, PERCEPTION and ACTION, and will issue a (run) command for the execution of the PERCEPTION rules followed by the ACTION ones. Thus, it is convenient that the script developer separates the Jess rules into two modules: one specialized in gathering information, for example the nearest enemy location, and the other specialized in actions, like moving or shooting. In each module more than one rule can fire; a module is exited only when no more of its rules fire. Therefore, the script must carefully manage the return of control to JAVA so that Jess rules in either of the two modules do not fire forever.
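As a hedged illustration of this contract, the sketch below shows how a perception function such as (see-enemy-func) could be exposed to the script as a Jess Userfunction, and how the Java controller could drive one step. The BooleanSupplier stands in for a query on the middleware wrapper, and the exact way the real tool pushes the modules onto the focus stack is an assumption consistent with the description above (PERCEPTION rules run first, ACTION rules run after its (return)).

    import java.util.function.BooleanSupplier;
    import jess.Context;
    import jess.Funcall;
    import jess.JessException;
    import jess.Rete;
    import jess.Userfunction;
    import jess.Value;
    import jess.ValueVector;

    // Sketch of the Java side of a step: register a perception callback, then
    // focus the two modules and run the engine until both of them return.
    public class ExecutionStep {

        static class SeeEnemyFunc implements Userfunction {
            private final BooleanSupplier sensor;

            SeeEnemyFunc(BooleanSupplier sensor) { this.sensor = sensor; }

            public String getName() { return "see-enemy-func"; }

            public Value call(ValueVector vv, Context context) throws JessException {
                // Return the Jess symbols TRUE/FALSE expected by the (bot ...) template.
                return sensor.getAsBoolean() ? Funcall.TRUE : Funcall.FALSE;
            }
        }

        public static void install(Rete engine, BooleanSupplier seesEnemy) {
            engine.addUserfunction(new SeeEnemyFunc(seesEnemy));
        }

        public static void step(Rete engine) throws JessException {
            engine.executeCommand("(focus ACTION)");      // runs after PERCEPTION pops
            engine.executeCommand("(focus PERCEPTION)");  // top of the stack: runs first
            engine.run();                                 // fire rules until both modules (return)
        }
    }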
TABLE I
A JESS SCRIPT TO ILLUSTRATE A SIMPLE NPC BEHAVIOUR DEFINITION USING A PERCEPTION/ACTION CYCLE.

    ;An example of a deftemplate
    ;to store all the data about the agent
    (deftemplate bot
      (slot see-enemy)
      (slot hear-anything)
      (slot moving)
      (slot nav-target)
      (slot enemy-target))

    ;Setup
    (deffacts SETUP
      (perception)
      (action)
      (bot (see-enemy FALSE)
           (nav-target nil)))

    (defmodule PERCEPTION)

    ;Rule to collect info about the agent
    (defrule perception
      ?f <- (perception)
      ?x <- (bot (nav-target ?target))
      =>
      (retract ?f)
      (assert (perception))
      (modify ?x (see-enemy (see-enemy-func))
                 (enemy-target (get-enemy-location)))
      (return))

    (defmodule ACTION)

    ;Rule to pursue and fire at the enemy it sees
    (defrule fires-and-pursuit-enemy
      (declare (salience 100))
      ?a <- (action)
      ?bot <- (bot (see-enemy TRUE)
                   (enemy-target ?t&~nil))
      =>
      (retract ?a)
      (assert (action))
      (go-to-enemy ?t)
      (shoot ?t)
      (return))

We show in Table I an example of a toy Jess script, only for illustration. The (return) command assures that no more rules are executed inside the respective module: after a (return) in a PERCEPTION rule, control is given to the ACTION module, and after a (return) in an ACTION rule, control is given back to JAVA, putting an end to the step. We can see several perception and action functions: (see-enemy-func) returns a boolean and (get-enemy-location) returns the enemy position coordinates; (go-to-enemy) makes the NPC go towards a position near the enemy and (shoot) makes the NPC turn towards the enemy position and shoot. Note that in the Jess command window we can execute one rule after another, monitoring behaviour at a finer scale than a step. In the example given there is only one rule in each module, and so a step execution will fire 2 rules in case they are both activated.

In Table II we present another short example of Jess code to control the character's movement in a formation controlled by the group leader. As in the previous example, the behaviour is controlled by a perception/action cycle, activated by a message from the squad leader. This message indicates to the character its new position in the formation and the direction it should be facing. When a new message is received, the PERCEPTION module stores the information about the character's new objectives. This information is used to activate the ACTION module and execute the appropriate actions to achieve those goals.
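For context, the sketch below shows what the leader side of this exchange might look like. TeamChannel, the message id 9 and the payload layout are assumptions derived from the accessor functions used in Table II ((get-receiver-team-id-from-message), (get-location-from-message), (get-rotation-from-message)), not a documented protocol.

    // Sketch of the squad leader broadcasting a "move in formation" order.
    public class FormationLeader {

        public static final int MOVE_IN_FORMATION = 9;   // assumed message type id

        interface TeamChannel {                           // hypothetical message channel
            void broadcast(int typeId, double[] location, double[] rotation);
        }

        public static void orderFormation(TeamChannel channel,
                                          double[] leaderLocation,
                                          double[] leaderRotation) {
            // Each team member receives the leader's location and facing direction and
            // computes its own slot in the diamond formation on the Jess side.
            channel.broadcast(MOVE_IN_FORMATION, leaderLocation, leaderRotation);
        }
    }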

TABLE II
AN EXAMPLE OF A PIECE OF CODE THAT CONTROLS THE MOVEMENT OF A CHARACTER IN A FORMATION

    (defmodule PERCEPTION)

    (defrule perception
      ?f <- (perception)
      ?x <- (bot) ; representation of the bot's current attributes
      =>
      (retract ?f)
      ;If we received a message to move in formation (id 9)
      (if (eq (get-receiver-team-id-from-message) 9)
        then
          (bind ?var (select-place-on-diamond-formation
                       (get-location-from-message)
                       (get-rotation-from-message)))
          ;setup destination
          (modify ?x (nav-target ?var))
          ;setup bot rotation
          (modify ?x (rot-target
                       (select-rotation-on-diamond-formation ?var))))
      (assert (perception))
      (store "RuleFired" perception)
      (return))

    ...

    (defmodule ACTION)

    (defrule go-to-destination
      ?a <- (action)
      ;If there is a destination and a rotation
      (bot (nav-target ?target&~nil) (rot-target ?rot))
      =>
      (retract ?a)
      (assert (action))
      ;move the bot
      (go-to-target ?target ?rot)
      (store "RuleFired" go-to-destination)
      (return))

    ...

These rules and modules can be combined into more complex behaviours, taking advantage of the capability of the tool environment to extensively test each component. Although the integration of different pieces of code is not entirely error free, these characteristics provide us with a significant enhancement over the currently accessible tools.


IV. CONCLUSIONS AND FUTURE WORK

In this paper we presented a generic architecture to support the development of tools that assist the design, debugging and execution of artificially intelligent non-player characters in a simulated game environment. We built the application DEVELOP-FPS as a concrete example of the implementation of the architecture, and introduced some of its core functionalities and capabilities. This tool allows the management of the NPCs at different levels, individually monitoring and controlling their behavior or acting from a global perspective.

We have designed several experiments using this tool, from simple behaviours that only follow a fixed path to advanced cooperative team behavior including collision avoidance and split and regroup capabilities. Our tool was fundamental in the debugging and testing of the developed behaviours. The ability to force situations in which a specific behavior characteristic was triggered, and to follow the execution trace of the agent rules, was a clear improvement in the character creation process.

We believe that this kind of tool is fundamental in the process of constructing and deploying artificial intelligence components. Although commercial game companies have their own proprietary tools, these are not made available to the general public. The use of a text editor and a trial and error approach is hardly viable when a project grows beyond a certain dimension. The development of these tools is something that in the near future has to be taken into account when a new game project is initiated.

By now we are already extending the game developer tool in order to have different agent teams controlled by the Global Console. Another useful extension can be the addition of a command window to the Global Console so that we can broadcast Jess commands and functions to every Non Player Character or just to a specific team, which may help setting up test scenarios. The definition of teams and the definition of coordinated actions and group tactics is currently work in progress. We expect that our tool will improve and facilitate the designer's tasks.

REFERENCES

[1] UnrealEngine and UnrealScript official web page ([Link]
[2] Whitehead II, J., Roe, R.: World of Warcraft Programming: A Guide and Reference for Creating WoW Addons. Wiley (2010).
[3] Lang, Christoph: Panda3D 1.7 Game Developer's Cookbook. Packt Publishing (2011).
[4] Goldstone, Will: Unity Game Development Essentials. Packt Publishing (2009).
[5] Unity game development tool official web page ([Link]
[6] Pogamut official web page ([Link]
[7] Friedman-Hill, Ernest: Jess in Action: Java Rule-Based Systems. Manning Publications (2003).
[8] Moniz, L., Urbano, P., Coelho, H.: AGLIPS: An educational environment to construct behaviour based robots. In Proc. of the International Conference on Computational Intelligence for Modelling, Control and Automation – CIMCA (2003).
[9] Millington, Ian: Artificial Intelligence for Games. Morgan Kaufmann (2009).
something that in a close future had to taken into account when (2009)
