Homepage of Garnett Wilson

Ph.D., Computer Science

Research Interests

Machine Learning Applications in Financial Engineering

Since 2014 I have been working on the Analytics Team at the banking compliance software company Verafin, Inc. I also worked there in 2007-2008 during the industry component of my postdoctoral fellowship at Memorial University. I researched and produced analytics that use machine learning to detect debit fraud, perform anti-money laundering (AML), combat the financing of terrorism (CFT), and identify human trafficking. Machine learning techniques used in the development of this software include evolutionary computation, fuzzy logic, Bayesian networks, and natural language processing.

Following my postdoctoral work, I researched genetic programming techniques for analyzing trends in interday and intraday stock price data. This work led to the production of a proprietary stock analysis algorithm at Afinin Labs Inc. The system was used by proprietary traders at a large Canadian bank and is now being repurposed for use in a hedge fund. As an Adjunct Assistant Professor in the Faculty of Computer Science at Dalhousie University, I continue to collaborate with colleagues there in the areas of financial engineering and evolutionary computation.

Evolutionary Computation, Genetic Programming, and Genetic Algorithms

My Ph.D. thesis research (supervisor Dr. Malcolm Heywood) presents the Probabilistic Adaptive Mapping Developmental Genetic Programming (PAM DGP) algorithm, a new version of linear genetic programming (LGP) that uses coevolution. PAM DGP produces its solutions by evolving redundant mappings that emphasize appropriate members within relevant subsets of the problem's original function set.
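
The sketch below illustrates the redundant-mapping idea in miniature. It is my own minimal illustration, not code from the thesis: the function set, table size, and helper names are all assumptions made for the example. Each raw genome symbol indexes into a mapping table that is larger than the function set, so an evolved table can over-represent, and thereby emphasize, particular functions.

    import random

    FUNCTION_SET = ['add', 'sub', 'mul', 'div']   # the problem's original function set
    TABLE_SIZE = 16                               # redundant table: 16 slots for 4 functions

    def random_mapping():
        # A redundant mapping: several table slots can point at the same
        # function, so a mapping can emphasize useful functions by giving
        # them more slots.
        return [random.randrange(len(FUNCTION_SET)) for _ in range(TABLE_SIZE)]

    def decode(genome, mapping):
        # Translate raw genome symbols into functions via the mapping.
        return [FUNCTION_SET[mapping[g % TABLE_SIZE]] for g in genome]

    genome = [random.randrange(TABLE_SIZE) for _ in range(8)]
    print(decode(genome, random_mapping()))  # the same genome...
    print(decode(genome, random_mapping()))  # ...expresses a different program

In the full algorithm the genomes and the mappings are coevolved rather than drawn at random, so that successful mappings come to dominate.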

My research into genetic programming began during my Master's program, with Dr. Malcolm Heywood as supervisor. We examined linearly structured individuals (bit strings) and the use of page-based crossover to establish context within the individuals. Page-based crossover swaps code segments of a fixed size, and establishing code context means that reusable sub-sections of code exist within an individual. Additional studies on LGP examined the effect of biasing mutation and crossover to operate on particular sections of the individuals' genomes across other benchmark problems. Dr. Wolfgang Banzhaf and I have also investigated the theoretical and performance differences between the graphical form of linear genetic programming, Cartesian Genetic Programming (CGP), and another graph-based GP alternative. Research during my postdoctoral fellowship also included new algorithms to parallelize genetic programming using graphics processing units (GPUs).
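
As a rough illustration of the page-based mechanism, the sketch below swaps a single fixed-size page between two linear parents; the page size, genome length, and function names are assumptions made for the example, not parameters from the published work.

    import random

    PAGE_SIZE = 4  # instructions per page (assumed for illustration)

    def page_crossover(parent_a, parent_b):
        # Swap one randomly chosen page between two equal-length parents.
        # Because whole pages move as units, aligned sub-sections of code
        # keep their context in the children.
        assert len(parent_a) == len(parent_b) and len(parent_a) % PAGE_SIZE == 0
        page = random.randrange(len(parent_a) // PAGE_SIZE)
        lo, hi = page * PAGE_SIZE, (page + 1) * PAGE_SIZE
        child_a = parent_a[:lo] + parent_b[lo:hi] + parent_a[hi:]
        child_b = parent_b[:lo] + parent_a[lo:hi] + parent_b[hi:]
        return child_a, child_b

    a, b = [0] * 12, [1] * 12
    print(page_crossover(a, b))  # each child differs from its parent in exactly one page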

Planning Systems

During an NSERC Undergraduate Student Research Assistantship (Summer 2000), my colleague Atreya Basu and I began working on improvements to the planning system "Graphplan" under the supervision of Dr. Afzal Upal. The basic Graphplan algorithm consists of two phases: forward expansion and backward solution extraction. We explored the connections between constraint satisfaction and Graphplan's solution extraction phase and investigated various enhancements that can lead to improved planning performance. These enhancements included (1) extending the Graphplan algorithm to learn to prune future search nodes while backtracking, and (2) extending the existing learning algorithms to allow them to learn heuristics that improve the quality of the plans Graphplan produces.
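
To make the two phases concrete, here is a toy sketch of the expand/extract alternation on an invented micro-domain with add-effects only. It deliberately omits delete effects, mutex reasoning, and the no-good learning that the real algorithm and our enhancements rely on, so it is a structural illustration rather than a faithful Graphplan implementation.

    from collections import namedtuple

    Action = namedtuple('Action', 'name pre add')

    # An invented micro-domain for illustration.
    ACTIONS = [
        Action('get_wood',  frozenset(),                   frozenset({'wood'})),
        Action('get_nails', frozenset(),                   frozenset({'nails'})),
        Action('build_box', frozenset({'wood', 'nails'}),  frozenset({'box'})),
    ]

    def graphplan(init, goals, max_levels=10):
        levels = [frozenset(init)]                 # level 0: the initial facts
        for _ in range(max_levels):
            if goals <= levels[-1]:                # goals present: try extraction
                return extract(levels, goals, len(levels) - 1)
            facts = set(levels[-1])                # forward expansion phase
            for a in ACTIONS:
                if a.pre <= levels[-1]:
                    facts |= a.add
            if facts == set(levels[-1]):           # fixed point: goals unreachable
                return None
            levels.append(frozenset(facts))
        return None

    def extract(levels, goals, level):
        # Backward solution extraction: support each goal at each level,
        # either by persisting it from the previous level or by choosing
        # an action that adds it.
        if level == 0:
            return []
        step, subgoals = [], set()
        for g in goals:
            if g in levels[level - 1]:
                subgoals.add(g)                    # persist the fact (a "no-op")
            else:
                a = next(a for a in ACTIONS
                         if g in a.add and a.pre <= levels[level - 1])
                step.append(a.name)
                subgoals |= a.pre
        earlier = extract(levels, subgoals, level - 1)
        return earlier + ([step] if step else [])

    print(graphplan({'hammer'}, {'box'}))  # e.g. [['get_wood', 'get_nails'], ['build_box']]

Our first enhancement corresponds to memoizing goal sets for which extraction fails at a given level, so that backtracking never revisits them.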

Decision Theory

Dr. Stephen Maitzen of Acadia University and I worked on a controversial game theory/decision theory problem from the completion of my undergraduate degree until the work's publication in the journal Theory and Decision in 2003. Newcomb's problem supposedly involves a Chooser, who has the option of taking one or else two boxes in certain circumstances, and a Predictor, who makes a prediction of how many boxes the Chooser will take in those circumstances. We believe that the crucial concepts of "Chooser" and "Predictor" have received too little attention. Indeed, we argue, neither of those concepts can be adequately defined: each of them conceals a vicious regress, previously unnoticed, which shows that Newcomb's problem itself is insoluble because it is ill-formed.
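
For readers unfamiliar with the problem, the sketch below computes the expected utilities behind the standard debate, using the conventional payoffs ($1,000,000 in the opaque box if one-boxing was predicted, $1,000 always in the transparent box); the payoffs and accuracy figures are the textbook presentation, not anything specific to our paper.

    def expected_utility(accuracy):
        # Expected payoffs given the probability that the Predictor is right.
        one_box = accuracy * 1_000_000
        two_box = accuracy * 1_000 + (1 - accuracy) * 1_001_000
        return one_box, two_box

    for p in (0.5, 0.9, 0.99):
        one, two = expected_utility(p)
        print(f'accuracy {p:.2f}: one-box {one:,.0f}  two-box {two:,.0f}')

The tension between this evidential calculation, which favours one-boxing for any sufficiently accurate Predictor, and the dominance argument for taking both boxes is what makes the problem controversial; our claim is that the regresses hidden in the concepts of Chooser and Predictor dissolve the problem before that tension even arises.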