Hi, I'm Chris Barnett.
I encountered HPMOR when I met Christopher Olah at Chez JJ, Mountain View in April 2012 during a networking expedition to Silicon Valley. I read it for approximately three days straight. HPMOR took the place of Ender's Game, which I'd read only a few weeks before, as my favourite fiction.
I joined the Melbourne LessWrong community in early 2013 and finished reading the sequences soon after. My favourite sequences are Epistemology, Quantum Physics and Words.
I started the first rationalist sharehouse in Melbourne with Brayden McLean, Thomas Eliot and Allison Rea in June 2013, completed the first Melbourne CFAR workshop in February 2014 and moved to Berkeley CA at 1pm on March 6th 2014 (via timezone teleportation :P).
I'm in the process of deciding where my time would best be spent to maximize the expected goodness of the future. I'm still quite confused about how to read the output of my utility function for far-future scenarios involving AI, brain uploading, mind copying and consciousness-containing simulations, but I have a few heuristics: less suffering is better, more exploration of possibility space is better, and retention of human values in general (such as freedom, love and curiosity) is better.
I'm strongly considering accepting a programming job with Rev, primarily for skill attainment and income, with the interestingness of their long-term vision being a significant motivational bonus. I'm also exploring working at Leverage as a possibility, and I plan to network with people in crypto-tech and social choice theory. I've spent hundreds of hours designing a distributed reputation system, which I plan to publish in the form of one or more white papers and a series of blog posts, the first of which is here: https://zuthan.wordpress.com/2014/03/08/reputation-is-powerful.
To respond to Jennifer's request: I self-identify as an aspiring rationalist. I see this as prescriptive rather than descriptive: I aspire to be rational. I too use a general heuristic of not applying labels to myself, because most of them come with arbitrary baggage attached. "Aspiring rationalist" seems well enough defined to be useful, though.