I’m Bhanuka Gamage, a final-year PhD candidate in the Inclusive Technologies Lab at Monash University. My research sits at the intersection of Augmented Reality, Human–AI Interaction, and Accessibility, where I design context-aware AR tools to empower people with Cerebral Visual Impairment (CVI).
Alongside my academic work, I’ve worked as a Senior Machine Learning Engineer with over eight years of industry experience—building and scaling AI solutions, leading cross-functional teams, and delivering user-focused products. Most recently, I led the development of a smart feeding system for pets at ilume, combining trackers and smart bowls to help fight obesity in dogs. I’ve since wrapped up my work there to focus full-time on completing my PhD.
When I’m not in the lab, I’m usually deep in the Victorian High Country on my mountain bike—chasing trails and fresh air.
The best way to reach me is via LinkedIn.
Bhanuka Gamage, Leona Holloway, Nicola McDowell, Thanh-Toan Do, Nicholas Price, Arthur Lowery, Kim Marriott
ACM SIGACCESS Conference on Computers and Accessibility (ASSETS'24)
Bhanuka Gamage, Thanh-Toan Do, Nicholas Seow Chiang Price, Arthur Lowery, Kim Marriott
ACM SIGACCESS Conference on Computers and Accessibility (ASSETS'23)
I’ve just rolled out a revamped site with featured publications, CVI research updates, and news—check it out!
Presented my research on AR-powered Apple Vision Pro solutions designed to help individuals with Cerebral Visual Impairment read text and interact with their environment.
Successfully completed my pre-submission milestone—the final of Monash’s three major PhD checkpoints.
Most outstanding undergraduate academic performance in BCompSci (Honours).
Most outstanding graduate in BCompSci (Honours) for April 2021 graduation.
Won Gold Medal for “BaitRadar – a scalable browser extension for clickbait detection on YouTube using AI technology.”
Awarded to high-achieving students across the university.
Highest marks in MCD4700 across all Monash campuses.
Highest mark in MCD4290 across all Monash campuses.
Awarded to the top-ranked student of the batch in the Engineering – IT stream.
Selected for the ASSETS 2023 Doctoral Consortium in New York as one of only three international PhD researchers; awarded a travel grant.
Awarded scholarship for PhD in Computer Science.
Awarded a stipend scholarship for my PhD.
Awarded for excellence in college basketball.
Awarded for excellence in college basketball.
25% scholarship for college colours in basketball.
Finalist in the Tertiary Education category at the 20th APICTA competition.
A novel computer-implemented method for predicting whether a video link is clickbait using deep learning is described. The video link’s title, thumbnail, tags, audio transcript, comments, and statistics are used to train the model and are fed into a deep learning network with a separate sub-network for each attribute, so that each sub-network handles a different modality. The sub-networks for the title, tags, audio transcript, and comments each comprise an embedding layer followed by a long short-term memory layer, while the sub-network for the thumbnail comprises a convolutional neural network. The sub-network outputs are merged through an average operator, and the deep learning model outputs a weight indicating the likelihood that the video link is clickbait.
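To make the multi-modal design concrete, here is a minimal PyTorch sketch of a network with this shape: one embedding + LSTM branch per text attribute, a small CNN for the thumbnail, a dense branch for the numeric statistics, and an average over the per-branch scores. All names, layer sizes, and dimensions (e.g. `vocab_size`, `embed_dim`, the CNN layout) are illustrative assumptions, not the patented implementation.

```python
import torch
import torch.nn as nn

class TextBranch(nn.Module):
    """Embedding + LSTM sub-network for one text attribute
    (title, tags, audio transcript, or comments)."""
    def __init__(self, vocab_size=20000, embed_dim=128, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):              # (batch, seq_len)
        x = self.embed(token_ids)
        _, (h, _) = self.lstm(x)                # h: (1, batch, hidden_dim)
        return self.out(h.squeeze(0))           # (batch, 1)

class ThumbnailBranch(nn.Module):
    """Small CNN sub-network for the thumbnail image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.out = nn.Linear(32, 1)

    def forward(self, image):                   # (batch, 3, H, W)
        x = self.features(image).flatten(1)
        return self.out(x)                       # (batch, 1)

class StatsBranch(nn.Module):
    """Dense sub-network for numeric video statistics (views, likes, ...)."""
    def __init__(self, num_stats=5, hidden_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_stats, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )

    def forward(self, stats):                    # (batch, num_stats)
        return self.net(stats)                   # (batch, 1)

class ClickbaitNet(nn.Module):
    """One sub-network per modality; per-branch scores merged by averaging."""
    def __init__(self):
        super().__init__()
        self.title = TextBranch()
        self.tags = TextBranch()
        self.transcript = TextBranch()
        self.comments = TextBranch()
        self.thumbnail = ThumbnailBranch()
        self.stats = StatsBranch()

    def forward(self, title, tags, transcript, comments, thumbnail, stats):
        scores = torch.stack([
            self.title(title), self.tags(tags), self.transcript(transcript),
            self.comments(comments), self.thumbnail(thumbnail), self.stats(stats),
        ], dim=0)
        merged = scores.mean(dim=0)              # average operator across modalities
        return torch.sigmoid(merged)             # clickbait weight in [0, 1]
```

A model like this can be trained end to end with a binary cross-entropy loss on labelled clickbait/non-clickbait videos; in practice the text branches could also share an embedding or be replaced by pretrained encoders, a design choice the sketch leaves open.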