👋 Hi, I'm Flawnson


I'm a former ML engineer turned full-stack developer. I used to work a lot with Graph Neural Networks and NLP models. As a web developer, I'm most competent in TS, React, Tailwind, Express, GraphQL, and Postgres.

I'm an avid piano player 🎹 and an abysmal frisbee tosser 🥏.

In my free time, I tend to and moderate the (small but growing) Geometric Deep Learning subreddit.




👨‍💻 What I'm working on

Most recently, I've been working with my friend and co-founder Albert Wang on Comend. We build tools for the rare disease community to help them become research-ready.

I recently started releasing original piano compositions on YouTube and Spotify.

For the latest updates on what I'm doing, my setup, or just to learn a bit more about me, visit my blog 📝.


📚 What I've worked on

Comend - Co-founder and CTO

Feb 2023 - Present - Toronto, Canada

Entrepreneur First (EF) - Founder in Residence

March 2022 - Feb 2023 - Toronto, Canada

Kebotix Inc. - Machine Learning Developer Intern

Sept 2020 - May 2021 - Boston, MA

Relation Therapeutics - Machine Learning Research Intern

Sept 2019 - May 2020 - London, UK

ML for Chem/Bio-informatics - Personal Projects

Oct 2018 - June 2019 - Toronto, Canada

Canadian Imperial Bank of Commerce (CIBC) - Front-end Developer

June 2018 - Sept 2018 - Toronto, Canada


📜 Publications

Sparse DDD

Sparse Dynamic Distribution Decomposition: Efficient Integration of Trajectory and Snapshot Time Series Data

Jake P. Taylor-King, Cristian Regep, Jyothish Soman, Flawnson Tong, Catalina Cangea, Charlie Roberts

DDD allows continuous-time Markov chains to be fitted over a set of basis functions and, as a result, continuously maps between distributions. The number of parameters in DDD scales with the square of the number of basis functions; we reformulate the problem and restrict the method to compact basis functions, which leads to the inference of sparse matrices only, hence reducing the number of parameters.
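
To make the parameter-scaling point concrete, here is a minimal sketch (not the paper's code) comparing a dense rate matrix with the sparse one you get from compact basis functions. The tridiagonal sparsity pattern and the value of `n_basis` are illustrative assumptions, standing in for the idea that locally supported basis functions only interact with their neighbours.

```python
# Illustrative sketch only: dense vs. sparse parameter counts for a rate matrix
# over n_basis basis functions. Assumes compact basis functions overlap only
# with their immediate neighbours, giving a banded (tridiagonal) matrix.
import numpy as np
from scipy import sparse

n_basis = 200  # number of basis functions (illustrative value)

# Dense formulation: one parameter per pair of basis functions -> O(n^2).
dense_params = n_basis ** 2

# Compact basis functions: only neighbouring functions interact, so the
# matrix is banded and the parameter count grows roughly linearly in n_basis.
banded = sparse.diags(
    [np.ones(n_basis - 1), np.ones(n_basis), np.ones(n_basis - 1)],
    offsets=[-1, 0, 1],
    format="csr",
)
sparse_params = banded.nnz

print(f"dense:  {dense_params} parameters")   # 40000
print(f"sparse: {sparse_params} parameters")  # 598
```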