Remote Explorer (private github repository)
- An Android file explorer app for accessing files over the Samba (SMB) protocol, built with Kotlin and Jetpack Compose
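- A minimal sketch of the kind of SMB interaction involved, shown with the pysmb Python library purely for illustration; the app itself is written in Kotlin and does not use this code:

```python
# Illustrative only: list a share over SMB using pysmb (an assumed library, not used by the app).
from smb.SMBConnection import SMBConnection

conn = SMBConnection("user", "password", "client-name", "server-name",
                     use_ntlm_v2=True, is_direct_tcp=True)
if conn.connect("192.168.1.10", 445):  # hypothetical NAS address
    for shared_file in conn.listPath("shared", "/"):
        print(shared_file.filename, shared_file.file_size)
    conn.close()
```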
Doorbell Detector (https://github.com/xiankai/doorbell_detector)
- A Python script that uses machine-learning audio classification to detect the sound of a doorbell
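- A minimal sketch of the classification idea, assuming librosa for MFCC features and scikit-learn for the classifier; the actual features and model in the repository may differ:

```python
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def mfcc_features(path: str) -> np.ndarray:
    """Summarise an audio clip as its mean MFCC coefficients."""
    audio, sr = librosa.load(path, sr=16000)
    return librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13).mean(axis=1)

# Hypothetical labelled training clips.
doorbell_clips = ["samples/doorbell_01.wav", "samples/doorbell_02.wav"]
other_clips = ["samples/street_noise.wav", "samples/speech.wav"]

X = [mfcc_features(p) for p in doorbell_clips + other_clips]
y = [1] * len(doorbell_clips) + [0] * len(other_clips)
clf = RandomForestClassifier().fit(X, y)

# 1 means "doorbell heard" in this sketch.
print(clf.predict([mfcc_features("samples/new_recording.wav")]))
```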
FFXIVFurigana (https://github.com/xiankai/FFXIVFurigana)
- A C# plugin for Final Fantasy XIV that displays furigana over Japanese text in the chat log
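- A minimal sketch of attaching hiragana readings (furigana) to Japanese text, assuming the pykakasi Python library; the actual plugin is written in C# against the game's chat log:

```python
import pykakasi

kks = pykakasi.kakasi()
for token in kks.convert("洞窟の中で敵を発見した"):
    # Annotate only tokens whose reading differs from the surface form (i.e. contain kanji).
    if token["orig"] != token["hira"]:
        print(f'{token["orig"]} ({token["hira"]})')
    else:
        print(token["orig"])
```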
ChatHistory w/ Vector DB (https://github.com/xiankai/chat-history) (https://github.com/xiankai/vector-database)
- Importing and displaying chat history from old messaging programs, utilizing a vector DB for semantic search
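- A minimal sketch of the semantic-search idea using sentence embeddings and cosine similarity, assuming the sentence-transformers library; the repositories' actual vector database and embedding model may differ:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

messages = [  # hypothetical imported chat history
    "Dinner at 7pm tomorrow?",
    "The build is broken again on CI",
    "Happy birthday!",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
message_vecs = model.encode(messages, normalize_embeddings=True)

def search(query: str, top_k: int = 2) -> None:
    """Print the stored messages most semantically similar to the query."""
    query_vec = model.encode([query], normalize_embeddings=True)[0]
    scores = message_vecs @ query_vec  # cosine similarity, since the vectors are normalised
    for i in np.argsort(scores)[::-1][:top_k]:
        print(f"{scores[i]:.2f}  {messages[i]}")

search("when are we eating?")
```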
Triple Triad V2 (https://github.com/xiankai/triple-triad-ai)
- An experiment in Rust replicating the game logic of the Triple Triad card game
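- A minimal sketch of the basic Triple Triad capture rule (the higher facing value flips the neighbouring card), shown in Python for illustration; the project itself implements the full rule set in Rust:

```python
from dataclasses import dataclass

@dataclass
class Card:
    top: int
    right: int
    bottom: int
    left: int
    owner: str  # "blue" or "red"

# For each placement direction: the attacking side of the placed card
# and the defending side of the neighbouring card it touches.
FACING_SIDES = {
    "up": ("top", "bottom"),
    "right": ("right", "left"),
    "down": ("bottom", "top"),
    "left": ("left", "right"),
}

def resolve_capture(placed: Card, neighbour: Card, direction: str) -> None:
    """Flip the neighbouring card if the placed card's facing value is higher."""
    attack, defend = FACING_SIDES[direction]
    if neighbour.owner != placed.owner and getattr(placed, attack) > getattr(neighbour, defend):
        neighbour.owner = placed.owner

blue = Card(top=8, right=5, bottom=3, left=2, owner="blue")
red = Card(top=4, right=6, bottom=7, left=1, owner="red")
resolve_capture(blue, red, "up")  # blue's top 8 beats red's bottom 7
print(red.owner)  # -> "blue"
```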
XIVAnalysis (https://github.com/xiankai/xivanalysis)
- Collaborating with other contributors to analyse combat performance in Final Fantasy XIV
- Built with React, Semantic UI and CSS Modules
Pokémon Go Map (https://github.com/xiankai/sg-pokemongo-ex-raid-map)
- A map for Singapore to track and predict locations of popular EX Raids in Pokémon Go, serving 3k+ users
- Built with TypeScript, MobX, Leaflet and d3.js
- Designed to be replicated for other countries as well
- Automatically updated data from a Google Sheets spreadsheet
- Implemented mapping techniques such as S2 cells and point-in-polygon checks to visualize map data (see the sketch after this list)
- Used Mapbox for map layers before switching to OpenStreetMap (OSM)
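- A minimal sketch of the two techniques named above: a ray-casting point-in-polygon check in plain Python, plus an S2 cell lookup assuming the s2sphere library; the site itself does this client-side in TypeScript:

```python
import s2sphere

def point_in_polygon(lat: float, lng: float, polygon: list) -> bool:
    """Ray-casting test: toggle 'inside' each time a ray from the point crosses an edge."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lng_i = polygon[i]
        lat_j, lng_j = polygon[j]
        if (lng_i > lng) != (lng_j > lng) and \
                lat < (lat_j - lat_i) * (lng - lng_i) / (lng_j - lng_i) + lat_i:
            inside = not inside
        j = i
    return inside

# Hypothetical park polygon in Singapore and a gym location inside it.
park = [(1.301, 103.80), (1.301, 103.82), (1.315, 103.82), (1.315, 103.80)]
print(point_in_polygon(1.310, 103.81, park))  # -> True

# The S2 cell containing the same point (the level chosen here is arbitrary for illustration).
cell = s2sphere.CellId.from_lat_lng(s2sphere.LatLng.from_degrees(1.310, 103.81)).parent(12)
print(cell.to_token())
```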
Triple Triad (https://github.com/xiankai/triple-triad-solver)
- A multiplayer implementation of a mini-game from Final Fantasy XIV, using PeerJS to simplify WebRTC peer-to-peer connections
- Using redux-observable/RxJS for state management
Web Crawler (https://github.com/xiankai/Disney-Store)
- Crawled the Disney Store for product updates, notifying up to 15 subscribers of new stock
- Used cURL to crawl pages, Redis to store hourly data and Mailchimp to send notifications
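- A minimal sketch of the hourly crawl-and-compare loop, assuming the requests and redis Python libraries; the original used cURL for fetching and Mailchimp for the actual notifications:

```python
import requests
import redis

r = redis.Redis()

def check_for_new_stock(url: str) -> bool:
    """Fetch the page and compare it against the previously stored snapshot."""
    html = requests.get(url, timeout=30).text
    previous = r.get("snapshot:" + url)
    r.set("snapshot:" + url, html)
    return previous is not None and previous.decode() != html

if check_for_new_stock("https://www.shopdisney.com/some-product"):  # hypothetical URL
    print("New stock detected, notify subscribers")  # stand-in for the Mailchimp call
```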
Discord bot (https://github.com/xiankai/pusheen-the-fc-helper)
- Another web crawler, covering tens of thousands of pages with a job-queue mechanism (sketched below)
- Used PhantomJS to crawl pages, Redis to store data and Node.js to host the chat bot interface
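- A minimal sketch of the job-queue idea using Redis lists, assuming the redis and requests Python libraries; the original workers used PhantomJS and the bot itself ran on Node.js:

```python
import redis
import requests

r = redis.Redis()

def enqueue(urls) -> None:
    """Producer: push page URLs onto the queue."""
    for url in urls:
        r.lpush("crawl:queue", url)

def worker() -> None:
    """Consumer: pop URLs one at a time and fetch them until the queue stays empty."""
    while True:
        job = r.brpop("crawl:queue", timeout=5)
        if job is None:
            break
        _, url = job
        page = requests.get(url.decode(), timeout=30).text
        r.set("crawl:result:" + url.decode(), page)

enqueue(f"https://example.com/page/{i}" for i in range(10))  # hypothetical pages
worker()
```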
Server Administration
- Managed several VPS instances hosting various bots and websites for other people
- Migrated the server 6 times across Linode, AWS and DigitalOcean, using rsync, mysqldump and Puppet to quickly rebuild environments, preserve data, and maintain 100% uptime
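- An illustrative sketch of the data-preservation half of a migration, assuming it runs on the new host and wraps the standard rsync and mysqldump CLIs from Python; in practice the commands were run directly, with Puppet rebuilding the environment:

```python
import subprocess

OLD_HOST = "user@old-server"  # hypothetical host; real credentials and paths differ

def migrate_from_old_host() -> None:
    # Pull web roots from the old host, preserving permissions and timestamps.
    subprocess.run(["rsync", "-avz", f"{OLD_HOST}:/var/www/", "/var/www/"], check=True)
    # Stream a full database dump from the old host and restore it locally.
    dump = subprocess.run(["ssh", OLD_HOST, "mysqldump --all-databases"],
                          check=True, capture_output=True)
    subprocess.run(["mysql"], input=dump.stdout, check=True)

migrate_from_old_host()
```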