50 hours (~5 days)
Currently offline (read bottom of the page)
My first serious stab at Node.js. This project is all about analysing chess games in large quantities and in parallel. A full move-by-move analysis of a chess game is a very CPU-intensive task, so it makes sense to distribute the analysis across many nodes. I achieved this by having the server automatically hand out batches of fresh chess positions to anyone who asks for them.
Basically, the server has a collection of, say, 10 000 chess games. Whenever a user connects, the server hands them, say, 100 chess positions. The user then analyses those positions using the CPUs of their own computer (a little JS script takes care of this automatically) and sends the completed analyses back to the server. The server does some validation and then saves the analyses into a file. So the server is basically just for book-keeping and handling connections; all the real work is done by end-users.
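The server-side book-keeping described above could be sketched roughly like this. This is a minimal in-memory sketch, not the original code: names like `takeBatch` and `acceptResults` are hypothetical, the batch size of 100 comes from the text, and the validation is just a stand-in sanity check.

```javascript
// Hypothetical sketch of the server's book-keeping role:
// hand out batches of positions, take completed analyses back, validate, store.
const BATCH_SIZE = 100;

// Pending positions waiting to be handed out
// (in the real service these would be chess positions, e.g. FEN strings).
const pending = [];
for (let i = 0; i < 10000; i++) pending.push(`position-${i}`);

const handedOut = new Map();  // batchId -> positions currently out with a user
const completed = new Map();  // position -> analysis result
let nextBatchId = 0;

// Hand a fresh batch of positions to a connecting user.
function takeBatch() {
  const positions = pending.splice(0, BATCH_SIZE);
  const batchId = nextBatchId++;
  handedOut.set(batchId, positions);
  return { batchId, positions };
}

// Accept completed analyses back. A trivial completeness check stands in
// for the real validation step; reject the batch if anything is missing.
function acceptResults(batchId, results) {
  const positions = handedOut.get(batchId);
  if (!positions) return false;
  for (const pos of positions) {
    if (typeof results[pos] !== 'string') return false;
  }
  for (const pos of positions) completed.set(pos, results[pos]);
  handedOut.delete(batchId);
  return true;
}
```

The point of the design is visible here: the server never runs an engine itself, it only tracks which positions are out, which are done, and rejects incomplete submissions.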
The fun part is that anyone can participate with a regular web browser. Nothing to install. Another fun part is that the client script actually utilizes multicore CPUs automatically. So there are two levels of parallelism: multiple users analysing simultaneously, with each user analysing multiple positions simultaneously. Very cool setup, I think.
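The in-browser multicore trick could be sketched as below. This is an assumption about the mechanism, not the original client code: a typical way to do it is to size a Web Worker pool from `navigator.hardwareConcurrency` and split the batch across the workers. `splitIntoChunks` is a hypothetical helper, and the Worker spawning is shown only in a comment since it needs a browser.

```javascript
// Hypothetical client-side work splitting: divide one batch of positions
// round-robin into one chunk per available CPU core.
function splitIntoChunks(positions, workerCount) {
  const chunks = Array.from({ length: workerCount }, () => []);
  positions.forEach((pos, i) => chunks[i % workerCount].push(pos));
  return chunks;
}

// In the browser, each chunk would then go to its own Web Worker, e.g.:
//
//   const n = navigator.hardwareConcurrency || 2;
//   for (const chunk of splitIntoChunks(batch, n)) {
//     const w = new Worker('analyse.js'); // runs the engine on one core
//     w.postMessage(chunk);
//   }
```

This is where the second level of parallelism comes from: the server fans positions out across users, and each user's browser fans its batch out across cores.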
UPDATE 2nd May, 2016: I am no longer maintaining web services for the Finnish Chess Federation. Because of this, the above-mentioned "shakkianalysointi" service is currently offline.
UPDATE 2019: This project spawned a similar follow-up project for analysing positions using AWS Lambda. Video here: https://www.youtube.com/watch?v=A4-fZj7KCa0