👋 Hi I'm Flawnson


I'm a Machine Learning developer and researcher focusing on graph-structured learning with deep learning and genetic algorithms. I often work with chemical and biological assay data in chem/bio-informatics.

I'm an occasional writer ✍ and a frequent reader 📖. I'm an avid piano player 🎹 and an abysmal frisbee tosser 🥏. I'm an enthused listener of any music genre 💿 and an obsessive collector of Hi-Res audio for a single genre 📀.

In my free time, I pander to and moderate the (small but growing) Geometric Deep Learning subreddit.

👨‍💻 What I'm working on

Most recently I've been building self-referential neural networks to develop new meta-learning heuristics with Kevin Shen.

For the latest updates on what I'm doing, my setup, or just to learn a bit more about me, visit my blog 📝.

📚 What I've worked on

Kebotix Inc. - Machine Learning Developer Intern

Sept 2020 - May 2021 - Boston, MA

Relation Therapeutics - Machine Learning Research Intern

Sept 2019 - May 2020 - London, UK

ML for Chem/Bio-informatics - Personal Projects

Oct 2018 - June 2019 - Toronto, Canada

Canadian Imperial Bank of Commerce (CIBC) - Front-End Developer

June 2018 - Sept 2018 - Toronto, Canada

📜 Publications

Sparse DDD

Sparse Dynamic Distribution Decomposition: Efficient Integration of Trajectory and Snapshot Time Series Data

Jake P. Taylor-King, Cristian Regep, Jyothish Soman, Flawnson Tong, Catalina Cangea, Charlie Roberts

DDD allows for the fitting of continuous-time Markov chains over these basis functions and, as a result, continuously maps between distributions. The number of parameters in DDD scales with the square of the number of basis functions; we reformulate the problem and restrict the method to compactly supported basis functions, which leads to the inference of sparse matrices only, thereby reducing the number of parameters.
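The sparsity argument can be illustrated with a toy sketch (not code from the paper): when basis functions have compact support, only functions whose supports overlap produce nonzero pairwise entries, so the Gram-style matrices involved become banded rather than dense. Here, hypothetical "hat" functions spaced one width apart leave only diagonal and nearest-neighbor entries nonzero, so the nonzero count grows linearly in the number of basis functions instead of quadratically.

```python
import numpy as np

def hat(x, center, width):
    # Compactly supported "hat" basis function: nonzero only within `width` of `center`.
    return np.maximum(0.0, 1.0 - np.abs(x - center) / width)

n_basis = 10
centers = np.linspace(0.0, 1.0, n_basis)
width = centers[1] - centers[0]          # neighbors overlap; non-neighbors do not
x = np.linspace(0.0, 1.0, 1001)

# Overlap (Gram) matrix: entry (i, j) is nonzero only where supports intersect.
Phi = np.stack([hat(x, c, width) for c in centers])   # (n_basis, n_points)
G = Phi @ Phi.T * (x[1] - x[0])                       # numerical inner products

nonzero = np.count_nonzero(G > 1e-12)
print(f"nonzero entries: {nonzero} of {n_basis ** 2}")
```

With globally supported basis functions (e.g. Gaussians with large bandwidth), every entry of the matrix would generically be nonzero, recovering the quadratic parameter count the paper avoids.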