Dr Christopher Child
- +44 (0)20 7040 8586
Chris Child developed the International Cricket Captain game series for PC, PlayStation and iPhone and has worked in the games industry for over fifteen years. After several years at Empire Interactive and Logica, he now runs his own computer game company, Childish Things Ltd, where he is involved in every stage of computer game production, from game design and programming, through motion capture and manual writing, to advertising and marketing.
Chris has been a games technology lecturer since 2005, becoming course director of the Computer Games MSc at City in 2008 and of the Computer Games BSc in 2012. His aim has been to bridge the skills gap between talented graduate programmers and the requirements of game companies. Chris is also a researcher in the Department of Computer Science at City University London, developing cutting-edge game agent AI using techniques such as reinforcement learning, probabilistic planning, environment modelling and approximate dynamic programming.
Chris can currently be found developing the next generation of computer game designers at City University London, where he lectures on undergraduate and postgraduate courses in computer games technology.
PhD Approximate Dynamic Programming with Parallel Stochastic Planning Operators, City University London, 2011
PGDip Academic Practice, City University London, 2004
MSc Cognitive Science, University of Birmingham, 1994
BSc Computer Science and Software Engineering, University of Birmingham, 1993
02/2008 - to date City University London, Lecturer
04/2005 - to date Childish Things Ltd, Director
09/2003 - 05/2008 City University London, Visiting Lecturer
07/1996 - 10/2002 Empire Interactive, Designer, Manager & Programmer
01/1995 - 06/1996 Logica plc, Analyst Engineer
06/1993 - 08/1993 Microsoft Development Lab, Analyst Engineer
Membership of professional bodies
04/12 British Computer Society, Member
04/12 IEEE, Member
My research centres on the automated creation of intelligent agents for computer game environments, from non-player characters in RPG or "god" games to squad leaders and artificial opponents; the work is also applicable to a range of agent and robotics applications. My PhD and publications focus on agents that build a stochastic rule-based model from experience in an artificial environment that is either inherently random or appears random from the limited perspective of the agent's perception. This model is then used as the basis for a reinforcement learning algorithm (rule value reinforcement learning) that attaches a value to each rule, enabling the agent to pick an action in a given situation based on the values of the rules whose conditions match the current state. Both the stochastic rule learning and rule value reinforcement learning algorithms are novel contributions. My future research interests include integrating agent research into commercial games and software engineering techniques for games.
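The action-selection side of this idea can be illustrated with a minimal sketch. The following Python is a simplified, hypothetical illustration only, not the published RVRL algorithm: the `Rule` class, function names and the temporal-difference style update are all assumptions made for the example. It shows the core loop described above: collect the rules whose conditions match the current state, pick the action of the highest-valued matching rule, and adjust rule values from observed rewards.

```python
import random
from dataclasses import dataclass


@dataclass
class Rule:
    """A learned rule: if `condition` holds, `action` is a candidate.

    `value` is the agent's estimate of the rule's long-term worth.
    (Illustrative structure only; the real stochastic planning
    operators also model probabilistic effects.)
    """
    condition: frozenset  # state facts that must hold for the rule to match
    action: str
    value: float = 0.0


def matching_rules(rules, state):
    """Return the rules whose conditions are satisfied by `state`."""
    return [r for r in rules if r.condition <= state]


def choose_action(rules, state, epsilon=0.1):
    """Epsilon-greedy choice among the actions of matching rules."""
    candidates = matching_rules(rules, state)
    if not candidates:
        return None
    if random.random() < epsilon:
        return random.choice(candidates).action
    return max(candidates, key=lambda r: r.value).action


def update_rule_values(rules, state, action, reward, next_state,
                       alpha=0.5, gamma=0.9):
    """Temporal-difference style update (an assumed, simplified form):
    rules that matched the state and proposed the taken action move
    toward the reward plus the discounted value of the best rule
    matching the successor state."""
    next_best = max((r.value for r in matching_rules(rules, next_state)),
                    default=0.0)
    for r in matching_rules(rules, state):
        if r.action == action:
            r.value += alpha * (reward + gamma * next_best - r.value)
```

For example, given two rules matching the state `{"hungry"}`, one proposing `"eat"` and one `"wait"`, a rewarded `"eat"` step raises only the eating rule's value, so a greedy `choose_action` subsequently prefers it.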
- Basaru, R.R., Slabaugh, G.G., Child, C. and Alonso, E. (2016). HandyDepth: Example-based stereoscopic hand depth estimation using Eigen Leaf Node Features.
- Basaru, R.R., Child, C., Alonso, E. and Slabaugh, G. (2015). Quantized Census for Stereoscopic Image Matching.
- Trusler, B.P. and Child, C. (2014). Implementing racing AI using Q-learning and steering behaviours.
- Child, C.H.T. and Dey, R. (2013). QL-BT: Enhancing Behaviour Tree Design and Implementation with Q-Learning. CIG 2013 - IEEE Conference on Computational Intelligence and Games, 11-13 August, Niagara Falls, Canada.
- Hadjiminas, N. and Child, C. (2012). Be The Controller: A Kinect Tool Kit for Video Game Control - Recognition of Human Motion Using Skeletal Relational Angles. The 5th International Conference on Computer Games and Allied Technology Bali, Indonesia.
- Child, C., Parkar, S., Mohamedally, D., Haddad, M. and Doroana, R. (2010). Development of a Virtual Laparoscopic Trainer using Accelerometer Augmented Tools to Assess Performance in Surgical Training. 19th International Pediatric Endosurgery Group (IPEG), 8-12 June, Hawaii, USA.
- Child, C., Stathis, K. and Garcez, A.D. (2007). Learning to Act with RVRL Agents. 14th RCRA Workshop, Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion July, Rome.
- Child, C. and Stathis, K. (2006). Rule Value Reinforcement Learning for Cognitive Agents. Fifth International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS'06), 8-12 May, Hakodate, Hokkaido.
- Child, C. and Stathis, K. (2005). SMART (Stochastic Model Acquisition with ReinforcemenT) learning agents: A preliminary report.
- Child, C. and Stathis, K. (2004). The Apriori Stochastic Dependency Detection (ASDD) algorithm for learning Stochastic logic rules.
- Child, C.H.T. (2012). Approximate Dynamic Programming with Parallel Stochastic Planning Operators. PhD Thesis, City University London.
- Watching the rise and rise of eSports. (2015). City University London News https://www.city.ac.uk/news/2015/march/watching-the-rise-and-rise-of-esports
- Indie Cricket Developer Going All Out for Student Success. (2012). Edge Online http://www.edge-online.com/get-into-games/indie-cricket-developer-going-all-out-for-student-success/
An academic whose university development project became a long-running console cricket series is using his experience to inspire students and help them get a job in games. Dr Chris Child, who makes International Cricket Captain as owner of Childish Things, says the industry needs “hardcore programming specialists” rather than development all-rounders.
- Cricket Captain 2014 released on PC, Mac, iPhone & Android.