LESSWRONG · AI · Frontpage

Which LessWrong/Alignment topics would you like to be tutored in? [Poll]

by Ruby · 19th Sep 2024 · 1 min read · 43 karma
12 comments, sorted by top scoring
Elizabeth · 1y · 7 karma

What I want for rationality techniques is less a tutor and more an assertive rubber duck walking me through things when capacity is scarce.
Ruby · 1y · 2 karma

Poll for LW topics you'd like to be tutored in
(please use the agree react to indicate you'd personally like tutoring on a topic; I might reach out if/when I have a prototype)

Note: Hit cmd-F or ctrl-F (whatever normally opens search) to automatically expand all of the poll options below.
Ruby · 1y · 25 karma

CFAR-style Rationality Techniques

(18 agree reacts)
habryka · 1y · 15 karma

Writing well

(13 agree reacts)
Ruby · 1y · 11 karma

Decision Theory

(5 agree reacts)
Ruby · 1y · 9 karma

Infra-Bayesianism

(4 agree reacts)
Ruby · 1y · 7 karma

Applied Game Theory

(10 agree reacts)
Ruby · 1y · 7 karma

Natural Latents

(5 agree reacts)
Ruby · 1y · 5 karma

Agent Foundations

(3 agree reacts)
RHollerith · 1y · 3 karma

Applying decision theory to scenarios involving mutually untrusting agents.

(2 agree reacts)
Ruby · 1y · 2 karma

Anthropics

(2 agree reacts)
TheManxLoiner · 10mo · 1 karma

What is the status of this project? Are there any estimates of timelines?

Would you like to be tutored in applied game theory, natural latents, CFAR-style rationality techniques, "general AI x-risk", Agent Foundations, anthropics, or some other topic discussed on LessWrong?

I'm thinking about prototyping some topic-specific LLM tutor bots, and would like to prioritize topics that multiple people are interested in.

Topic-specific LLM tutors would be customized with things like pre-loaded relevant context, helpful system prompts, and more focused testing to ensure they work.
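As a rough sketch of what "pre-loaded relevant context plus a helpful system prompt" could look like in practice, the snippet below assembles a chat-style message list for a hypothetical topic tutor. The topic name, excerpt text, prompt wording, and function name are all illustrative assumptions, not the actual design:

```python
# Illustrative sketch of a topic-specific tutor bot's message setup.
# The prompt wording and excerpts here are hypothetical examples.

def build_tutor_messages(topic, excerpts, question):
    """Combine a tutoring system prompt, pre-loaded reference excerpts,
    and the learner's question into a chat-completion-style message list."""
    system_prompt = (
        f"You are a patient tutor for the topic '{topic}'. "
        "Ground your explanations in the reference excerpts provided, "
        "ask short check-in questions, and break complex ideas into small steps."
    )
    context_block = "\n\n".join(
        f"[Excerpt {i + 1}]\n{text}" for i, text in enumerate(excerpts)
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"Reference material:\n{context_block}"},
        {"role": "user", "content": question},
    ]

messages = build_tutor_messages(
    "Infra-Bayesianism",
    ["Infra-Bayesianism generalizes Bayesian inference to sets of distributions."],
    "Why isn't ordinary Bayesian inference enough here?",
)
```

The resulting list could be passed to any chat-completion API; the "more focused testing" mentioned above would then check the bot's answers against the pre-loaded excerpts.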

Note: I'm interested in topics that are written about on LessWrong, e.g. Infra-Bayesianism, and not, say, magnetohydrodynamics.


I'm going to use the same poll infrastructure that Ben Pace pioneered recently. There is a thread below where you add and vote on topics/domains/areas where you might like tutoring.

  1. Karma: upvote/downvote to express enthusiasm about there being tutoring for a topic.
  2. Reacts: click on the agree react to indicate you personally would like tutoring on a topic.
  3. New poll option: add a new topic for people to express interest in being tutored on.

For the sake of this poll, I'm more interested in whether you'd like tutoring on a topic or not, separate from the question of whether you think a tutoring bot would be any good. I'll worry about that part.

Background

I've been playing around with LLMs a lot in the past couple of months, and so far my favorite use case is tutoring. LLM assistance helps via multiple routes: providing background context with less effort than external search and reading, keeping me engaged via interactivity, generating examples, and breaking down complex sections into more digestible pieces.