Hello, player character, and welcome to the Mazes of Menace! Your goal is to get to the center and defeat the Big Bad. You know this is your goal because you received a message from a very authoritative source that said so. Alas, the maze is filled with guards and traps that make every step dangerous. You have reached an intersection, and there are two doors before you. Door A leads towards the center; it probably takes you to your destination. Door B leads away from the center; it could loop back, but it's probably a dead end. Which door do you choose?

The correct answer, and the answer which every habitual video game player will instinctively choose, is door B: the probable dead end. Because your goal is not to reach the end quickly, but to search as much of the maze's area as you can, and by RPG genre convention, dead ends come with treasure. Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible, because you're optimizing for fraction-of-content-seen, rather than probability-world-is-saved, which is 1.0 from the very beginning.

If you optimize for one thing while thinking that you're optimizing for something else, then you may generate incorrect subgoals and heuristics. Seen clearly, the doors represent a trade-off between time spent and area explored. But what happens if that trade-off is never acknowledged, and you can't see the situation for what it really is? Then you're loading garbage into your goal system. I'm writing this because someone reported what looks like a video game heuristic leaking into the real world. While this hasn't been studied, it could plausibly be a common problem. Here are some of the common memetic hazards I've found in video games.

For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating. Your goal is not to master the game, but to experience the process of mastering the game as laid out by the game's designers, without outside interference. In the real world, if there's a guide for a skill you want to learn, you read it.

Permanent choices can be made arbitrarily, on a whim, or based solely on what you think best matches your style, and you don't need to research which option is better. This is because in games, the classes, skills, races and alignments are meant to be balanced, so they're all close to equally good. Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)

Tasks are arranged in order of difficulty, from easiest to hardest. If you try something and it's too hard, then you must have taken a wrong turn into an area you're not supposed to be in yet. When playing a game, level ten is harder than level nine, and a shortcut from level one to level ten is a bad idea. Reality is the opposite: most of the difficulty comes up front, and things get easier as you learn. When writing a book, chapter ten is easier than chapter nine. Games teach us to expect an easy start and a tough finale; this makes the tough starts reality offers more discouraging.

You shouldn't save gold pieces, because they lose their value quickly to inflation as you level. Treating real-world currency that way would be irresponsible. You should collect junk, since even useless items can be sold to vendors for in-game money. In the real world, getting rid of junk costs money in effort and disposal fees instead.

These oddities are dangerous only when they are both confusing and unknown, and to illustrate the contrast, here is one more example. There are hordes of creatures that look just like humans, except that they attack on sight and have no moral significance. Objects which are not nailed down are unowned and may be claimed without legal repercussions, and homes which are not locked may be explored. But no one would ever confuse killing an NPC for real murder, nor clicking an item for larceny, nor exploring a level for burglary; these actions are so dissimilar that there is no possible confusion.

But remember that search is not like exploration, manuals are not cheats, careers are not balanced, difficulty is front-loaded, and dollars do not inflate like gold pieces. These distinctions are tricky, and failing to make them can have consequences.

Memetic Hazards in Videogames

Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?

Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?

At my high school the gifted program required a certain number of hours of internship at a company in the area, and indeed even those outside the gifted program were encouraged to meet with their counselors for advice on finding internships. 'Course, that program, along with AP classes, the arts, and half of science, was cut starting this school year. I think it's 'cuz Arizona realized that since they were already by far the worst state in the nation when it came to education they might as well heed the law of comparative advantage and allocate more resources to the harassment of cryonics institutions.

One possible solution is to have education financed by equity rather than loans: the third party who pays for your education does so in return for some share of your future income. Besides the obvious effect of funding profitable education, this has the totally awesome side-effect of giving an organization a great incentive to figure out exactly how much each person's income will be increased by each job - which includes predicting salary, probability of graduating, future macro trends, etc.

The third party wouldn't have much incentive to predict what jobs will be most fun (only whether you will hate it so much you quit), but at least a big chunk of the problem would be solved. Personally I think the solution would involve "higher education is rarely worth it", and direct people towards vocational training or just getting a damn job. But I could be wrong - the great thing about a mechanism is that I don't have to be right about the results to know that it would make things more efficient :).
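To make the funder's problem concrete, here is a toy sketch of the expected-value calculation such an investor would have to run. The function and every number in it are hypothetical, chosen only to illustrate the incentive:

```python
# Toy model of an income-share ("equity") education contract.
# All numbers are made up for illustration; this is not a real product.

def funder_expected_value(cost, p_payoff, salary_uplift, share, years, discount=0.05):
    """Expected present value, to the funder, of paying `cost` for a student's
    education in exchange for `share` of the induced extra income
    (`salary_uplift` per year for `years`), where the education only pays off
    with probability `p_payoff`."""
    payments = sum(share * salary_uplift / (1 + discount) ** t
                   for t in range(1, years + 1))
    return p_payoff * payments - cost

# The funder profits only by modelling the student well: salary uplift,
# odds of graduating, and macro trends (the discount rate) all enter the bet.
print(funder_expected_value(cost=40_000, p_payoff=0.7,
                            salary_uplift=15_000, share=0.10, years=20))
# -> about -26,900: with these made-up numbers the degree is a losing
#    investment, which is exactly the signal the mechanism would surface.
```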

6[anonymous]
This is called the income tax. So why doesn't the government do that?
1Jay
We actually do pretty much the opposite of that in the U.S.  Student loans have a Federal guarantee, so the incentive is to sign people up for as much education as possible.  If they succeed, great.  If they fail, they'll be paying off the loans until they die at which time Uncle Sam will pay the balance.  With compounding interest, the ones who fail are the most profitable.
[-]dclayh150

Indeed, some of us spend 9 more years in school to postpone this decision. (In case you were wondering, it doesn't help.)

Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so?

Do we actually do that all that much? The vast majority of high school students when I was in high school had no idea what they wanted to do, and that was considered OK. Heck, a large fraction of people even when they were well into their undergraduate educations didn't know what they wanted to do, and that was also considered OK. And as far as I can tell, the general trend in high school education has been less emphasis on job-specific classes as time has gone on.

2Relsqui
The trend in reporting about education certainly seems to be that kids are being asked to specialize earlier and earlier--taking AP classes to prepare for majors, etc. Whether that corresponds to the actual advisement trends I couldn't tell you. I only went through it once.

What we really need is a "brain plasticity in adulthood" pill. Because really the only reason we force these impossible choices on teens is that we're racing against their diminishing ability to learn.

This argument may hold for things like languages or thinking habits, or other skills that take root early, but having tackled an undergrad maths syllabus at both ages 18 and 28, I've found an adult work ethic beats the pants off youthful 'plasticity' any day of the week. Any skillset mandatory to a specialised vocation will probably mostly be learned well into adulthood anyway.

9Spurlock
Why can't we have both? A plasticity pill wouldn't inherently destroy your work ethic. Having this useful ability, without the crippling shortcomings of youth (mostly various forms of inexperience, not to mention developmental/hormonal distractions) would be one hell of a combination.
9sixes_and_sevens
Well, at the moment we can't have both because brain plasticity pills don't currently exist. If someone asked me tomorrow to optimise the education system, "educate people at the point in their lives when that education would be most useful to them" would come considerably higher up the list than "invent brain plasticity pill".
1JulianMorrison
The win from skilled use of childhood plasticity maxes out at around 15 well-filled years of highly plastic learning. The win from a pill maxes out at a lifetime thereof. So if a pill were close to technologically plausible, it would be a much better use of effort.
4sixes_and_sevens
Assuming it's possible to get the 'plasticity' gains without a significant trade-off. Childhood brains are so flexible because they're still developing; accordingly, they don't have a fully developed set of cognitive skills. By way of analogy, concrete is very flexible in its infancy and very rigid in its adulthood. The usefulness it possesses when rigid is based on how well its flexibility was utilised early on. If you come up with a method to fine-tune the superstructure of a building on the fly later in its lifetime, cool beans. If all you come up with is a way to revert the whole thing to unset concrete, I'd rather focus on getting the building right the first time.
4JulianMorrison
Hmm. I don't trust that. It sounds too much like a just-so story. What I know is that most species have a learning-filled childhood followed by an adulthood with little to learn. I also know that evolution hates waste - it will turn a feature off if it isn't used. So if anything the relatively high human ability to learn in adulthood looks to me like neoteny. Concrete is a poor analogy - rigidity is not an advantage to adult humans!
3b1shop
I think rigidity fits well into the Aristotelian framework. Too rigid and you hold fast to wrong ideas. Too plastic and you waste mental effort challenging truths that should have been established. Yes, we don't want to be too rigid in our beliefs, but there's a high opportunity cost to thought. I've run into too many hippies who are "open-minded" about whether or not 1=1. We have to internalize some beliefs as true to focus on other things. I worry some in this community are so used to getting others to reconsider false beliefs that they forget there's sometimes a good reason to have rigid beliefs. Reversed stupidity is not intelligence.
3A1987dM
By the way, if I recall correctly, in the proverb "a rolling stone gathers no moss", moss was originally intended to be a good thing, but most people now take it to be a bad thing.
-4sixes_and_sevens
In what way does it sound like a just-so story? Re: rigidity and humans, I suspect you would find it very difficult if you continued to adjust your speech patterns to accommodate every irregular use of the English language you'd heard since the day you were born. Your ability to rapidly learn language stopped for a reason. In that sense, rigidity is pretty advantageous.

I suspect you would find it very difficult if you continued to adjust your speech patterns to accommodate every irregular use of the English language you'd heard since the day you were born. Your ability to rapidly learn language stopped for a reason.

I'm tempted to call this a just-not-so story.

Not only do I disagree with the general point (about "rigidity" being advantageous), but my sense is that language is probably one of the worst examples you could have used to support this position.

It strikes me as wrong on at least 4 different levels, which I shall list in increasing order of importance:

(1) I don't think it would be particularly difficult at all. (I.e. I see no advantage in the loss of linguistic ability.)

(2) People probably do continue to adjust their speech patterns throughout their lives.

(3) Children do not "accommodate every irregular use [they have] heard since the day [they] were born". Instead, their language use develops according to systematic rules.

(4) There is a strong prior against the loss of an ability being an adaptation -- by default, a better explanation is that there was insufficient selection pressure for the ability to be maintained (since abilities are usually costly).

So, unless you're basing this on large amounts of data that I don't know about, I feel obliged to wag my finger here.

3wedrifid
I'm tempted to agree. When adults spend as much time focussed on learning to speak a language as children do, they learn faster. I don't quite agree with this, at least as a general rule. (Red King, etc.)
3Relsqui
I read a good, if not new, article about this recently. It's relevant to a couple posts in this thread, but I figured this was as good a place to insert it as any.
1sixes_and_sevens
I'm happy to concede the point on childhood learning, but maintain that educational reform is significantly more implementable than brain plasticity pills.
0wedrifid
Absolutely. Brain plasticity does very little without a quality learning environment in place. (And a quality learning environment is one of the most powerful ways of fostering brain plasticity!)
2Spiracular
TL;DR: As people get older, it's common for them to acquire responsibilities that make it hard to focus on school (ex: kids, elderly parents). Fairly high confidence that this is a big factor in community college grades.

As someone whose parent teaches basic math at community college, and who attended community college for 2 years myself (before transferring)... I have absolutely seen some people pick up these skills late. The work ethic & directedness of community college high-achievers is often notably better than that of people in their late teens. They also usually have healthier attitudes around failure (relative to the high-achieving teens), which sometimes makes them better at recovering from an early bad grade. Relatedly, the UCs say their CC transfers have much lower drop-out rates. One major "weakness" I can think of is that adults are probably going in fully cognizant that school feels like an "artificial environment." Some kids manage to not notice this until grad school.

From my mom's work, I know that the grading distribution in high-school-remedial math classes is basically bimodal: "A"s and "F"s, split almost 50-50. The #1 reason my mom cites for this split is probably a responsibilities and life-phase difference. A lot of working class adults are incredibly busy. Many are under more stress and strain than they can handle, at least some of the time. (The really unlucky ones are under more strain than they can really handle basically all of the time, but those are less likely to try to go to community college.) If someone is holding down a part-time job, doesn't have a lot in savings, is married, is taking care of a kid, and is caring for their elderly mother? That basically means a high load of ambient stress and triage, and also having 5 different avenues for random high-priority urgent crises (ex: health problems involving any of these) to bump school o
2Spiracular
Some of the other F-grade feed-ins, for completion's sake...

* A lot of people went to a bad high school. Some have learned helplessness, and don't know how to study. Saw the occasional blatant cheating habit, too.
  * Community colleges know this, and offer some courses that are basically "How to study"
* So much of many middle-class cultures is just hammering "academics matter" and "advice on how to study or network" into your brain. Most middle-class students still manage to miss the memo on 1-2 key study skills or resources, though. Maybe everyone should go to "how to study" class...
  * Personally? As a teen, I didn't know how to ask for help, and I couldn't stand sounding like an idiot. Might have saved myself some time, if I'd learned how to do that earlier.
* Nobody uses office-hours enough.
  * At worst, it's free tutoring. At best, it's socially motivating and now the teacher feels personally invested in your story and success.
  * "High-achievers who turned an early D into an A" are frequently office-hour junkies.
  * Someone with a big family crisis is probably still screwed even if they go to office hours. Past some threshold, people should just take a W.
* A few people just genuinely can't do math, in a "it doesn't fit in their brain" kind of way
  * My mom thinks this exists, but only accounts for <1%
1TheOtherDave
Come to think of it, beating the pants off youthful plasticity accounted for why I didn't do a lot of studying in college. More seriously: yeah, IME the idea that 18-year-olds are more able to learn than 30-year-olds is mostly a socially constructed self-fulfilling prophecy.
6Kingreaper
I often think that more of pre-adult education should be about teaching people how to put effort into things, and a good work ethic, rather than just facts.
-1[anonymous]
A thousand THIS. Learning the same or similar things at 30+ is far easier, as I not only have a better work ethic, but also the practical experience to actually understand theoretical things that looked like bullshit to me when I was 20. Practice and experience should definitely come before theory, not after. Work on something, follow rules, also experiment with not following rules and fuck up a bit, and then people get curious and actually listen when you tell them why exactly the rules work. It differs by country, but I think in most of them the worst thing about education, as a global average, is that the majority of it is simply classification. Our average music class was preparing for tests like "name 5 brass instruments". The whole idea is that you know such categories and classes, like how brass instruments are a subset of aerophones and consist of two subsets, valve brass and sliding brass, and for extra points you can also call them labrosones. This is more than just the teacher's password; it is a whole philosophy in which knowledge equals classification of words while you have no idea how a mellophone sounds. I think this is why I hated education; this is its worst part. However, I have heard that in English-speaking countries this kind of thing is less bad, and there is more hands-on experience going on.

how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives

One common answer to that is to become a dropout, try a career or two to find out where your talents really lie, and then go for that. You can usually go back to school for an education when you've figured out which one you need.

It doesn't even seem as if it would be very hard to build that right into the system. Doing it the artisanal way takes longer, generates more stress, loses more income.

Tentatively, thinking of my own experience, I'd point to the competitiveness of the system as the driving force. I had some smarts but school didn't suit me much. There were a bunch of things I was interested in - computers, AI, writing sci-fi, evolutionary biology - and I had no clear idea what I should do when I turned 18.

My parents' reasoning was "Most of your interests are scientific, so, the best way to keep your options open is to enrol in the top engineering schools, then you can have your pick of careers later". One problem with that is that these schools aren't a place for learning while you keep your options open. They are, basically, a sorting process, get... (read more)

9sixes_and_sevens
From this point forward, I'm describing the past ten years of my life as "having taken the artisanal route".
2Aurini
I just call myself an 'autodidact'.
4Relsqui
Oh, hi. Didn't see you there describing my life. :) Dropped out towards the end of high school, spent a lot of time unemployed or doing odd jobs, lived off other people, got sick of living off other people, and eventually woke up one morning and developed an idea about what I could do with my life that would fit my goals and suit what I'd learned about who I was (a picture which had changed a fair bit since high school). Long story short, I started college a few weeks ago. I'm trepidatious, because I haven't gotten along well with formal academics historically, but I've also never been there for me before. It's kind of a scary experiment, because I'm playing with real money (most of which isn't mine), but that's also an added incentive not to fail. (The education I turned out to need to do what I want--if I've planned this out well--turns out to be in communications/language/linguistics. If I'd gone to college right after high school, I would probably have ended up in English or computer science.) To her credit, the college counselor at my high school (in a mandatory appointment before I dropped out) recommended that I take some time off, travel, and work before deciding if I wanted to go to college. I guess it was pretty clear from my record that putting me right back into a classroom the following fall wasn't going to be very productive.
0Sniffnoy
Can you explain what you mean by this?
9Morendil
By "video-game" order I mean an order which makes it increasingly challenging, as opposed to increasingly easy because it is built on more solid foundations. For instance (as I dimly remember it), calculus was introduced as a collection of rules, of "things to memorize", rather than worked out from axiomatic principles. It was only later (and in an elective class) that I was introduced to non-standard analysis, which provides a rigorous treatment of infinitesimals. This may be a limitation of mine, but I can only approach math the way I approach coding - I have to know how each layer of abstraction is built atop the underlying one. I'm unable to accept things "on faith" and build upwards from something I don't understand deeply. I can't work with expositions that go "now here we need a crucial result that we cannot prove for now, you'll see the proof next year, but we're going to use this all through this year".
6DanielLC
Calculus is built on limits, not infinitesimals. At least, that's how it's normally defined. They both work, and neither was understood when calculus was discovered. I think most people are fine using the tools without understanding the rules, and find that easier than learning the rules. Schools are built to teach the way that the majority learns best, as it's better than teaching the way that the minority learns best.
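For concreteness, here are the two formalizations of the derivative side by side: the limit-based definition taught in most courses, and the non-standard-analysis version built on infinitesimals. (These are the standard textbook definitions, included only for reference.)

```latex
% Standard analysis: the derivative as a limit of difference quotients.
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

% Non-standard analysis: the standard part ("st") of the difference
% quotient taken over an actual infinitesimal \varepsilon.
f'(x) = \operatorname{st}\!\left( \frac{f(x+\varepsilon) - f(x)}{\varepsilon} \right)
```

Both recover the same values for ordinary functions; the difference is in which machinery (limits vs. hyperreal numbers) has to be developed first.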
6Morendil
Related (via HN):
4RobinHanson
I suspect one reason for this is that many people hope to steer their kids into good career deals they understand especially well. Official competent training in who should pursue which careers threatens to eliminate this advantage.
1sark
Why wouldn't parents trust the recommendations of official competent training, if it had a good track record?
3Baughn
Being a public good, it's quite likely to be biased towards what's good for society more than what's good for the individual kid. More so than the parents' advice would be, at any rate. On the other hand, I'd expect to see them happy to have other people's kids steered in this manner.
1Kingreaper
If you know that a career is underfilled and overpaid, you can get your kid a job there. If there's official competent training, then more people will be directed at the job, and the pay:effort disparity will disappear.
0sark
So parents can potentially do better. In the cases where the jobs they understand well are not underfilled or overpaid, shouldn't they trust official recommendations? Possibly parents could know enough relatives/friends to hear of at least one underfilled/overpaid job to make such cases rare.
4NihilCredo
My high school used to organise a Saturday every year when they would invite their old alumni to come and tell any interested students about their academic and/or job experience. Lots of people would come, since it was a good chance to catch up and have a free quality lunch with their old friends and teachers (many of whom were friends too - it was a small, quality school). The logistics and self-selection effect meant there was an overrepresentation of younger people who still lived in the area (usually working engineering, office, or teaching jobs), but it was still an extremely useful experience.
0[anonymous]
It is not at all clear that their goal is to match every student with their dream job. If, for example, society needs a lot of jobs done that are not fun, it is possible that they would decide it's better not to tell young people too loudly that those jobs are not fun, and instead tell them they are safe careers.
0[anonymous]
You have now!

When I get insignificant amounts of change, like a nickel, I leave it on the nearest outdoor object, thus teaching people that they can collect small amounts of money by searching random objects.

[-]ata290

I paint question marks on boxes and leave hallucinogenic mushrooms in them.

I just leave handgun ammunition everywhere.

I randomly assault people who wander outside of the city centre, but only if they look strong enough to kill me easily.

For my part, I force rabid dogs to swallow gold coins. Defeating an aggressive dog ought to earn you both experience and precious metals.

I do that too. Also, I have a sign by my front door that reads "no plot here" so that wandering adventurers aren't tempted to investigate too deeply and make a mess of my house.

I once decided to search my sofa and found a pair of nunchaku there.

[-]Zvi150

Whenever anyone comes up to me but doesn't say anything, I repeat the same phrase over and over again.

How many NPCs does it take to change a lightbulb? (The answer)

5ata
"Excuse me, sir..." "Hi! I like shorts! They're comfy and easy to wear!" "Uh, okay, anyway, would you happen to know where..." "Hi! I like shorts! They're comfy and easy to wear!" "..." "Hi! I like shorts! They're comfy and easy to wear!"
8Baughn
So, wait. This means the entire premise of this article is wrong? Our intuitions aren't being driven off track, we're simply turning reality into a videogame.
7Drahflow
Video game authors probably put a lot of effort into optimizing video games for human pleasure. Workplace design, user interfaces, etc. could all be improved if more ideas were copied from video games.

Games often fall into the trap of optimizing for addictiveness, which is not quite the same thing as pleasure. Jonathan Blow has talked about this, and I think there is a lot of merit in his arguments:

He clarified, "I’m not saying [rewards are] bad, I’m saying you can divide them into two categories – some are like foods that are naturally beneficial and can increase your life, but some are like drugs."

Continued Blow, "As game designers, we don’t know how to make food, so we resort to drugs all the time. It shows in the discontent at the state of games – Radosh wanted food, but Halo 3 was just giving him cheap drugs."

...

Blow believes that according to WoW, the game's rules are its meaning of life. "The meaning of life in WoW is you’re some schmo that doesn’t have anything better to do than sit around pressing a button and killing imaginary monsters," he explained. "It doesn’t matter if you’re smart or how adept you are, it’s just how much time you sink in. You don’t need to do anything exceptional, you just need to run the treadmill like everyone else."

I work in the games industry and I see this pattern at work a lot from many designers.

4luminosity
Interestingly, some of the best mathematical analysis I've ever seen happens in WoW, and to a limited extent in other MMOs. When you want to be in the top 25, 100 or even 1000 out of 13 million, you need to squeeze out every advantage you can. Often the people testing game mechanics have a better understanding of them than the game designers. Similarly, the first people to defeat new bosses do so because they have a group of people they can depend upon, but also because they have several people capable of analysing boss abilities and iterating through different strategies until they find one that works. It's unfortunate that there's so much sharing in the community; players who aren't striving to be the first to finish a fight can just obtain strategies from other people. People who don't care to analyse gameplay changes, or new items, can rely upon those who do to tell them what to wear, what abilities to choose, and what order to use them in. Back when I played, one of my biggest frustrations was that nearly everybody in the game outside the top few thousand simply lacked the ability to react and strategise on the fly. Throw an unexpected situation at them and maybe 1 in 10 will cope with it.

And that great mathematical analysis is being directed at solving meaningless made-up problems that generate no value for the world. It's pure consumption, zero production. Yet it's complicated, goal-oriented consumption, it feels like doing work, and hence scratches the productive itch for many people...without actually doing any good in the world.

It's a powerful opiate (a drug which makes the time pass pleasantly and which users wish to use all the time, as opposed to psychedelics, which are used occasionally and make the rest of your life better). Which, I believe, puts it on the side of evil, not of good.

3yrff_jebat
At least the first part could be said word-for-word of modern-day astrophysics, except that this is socially accepted and the guys and gals doing it are (in most cases) being paid for it (and even the people who see fundamental knowledge of the universe as a goal in itself will agree that there are far more important things to divert the workforce to).
1yrff_jebat
I also find it funny when mathematicians pejoratively speak of "recreational mathematics" (problem solving) as opposed to theory building: "If I build a Lego hut, that's just for fun, but if I build a Lego Empire State Building, that's serious business!"
1luminosity
I don't disagree much with your post (my only complaint is that fun is a reasonable goal in and of itself, and if someone chooses that, then so be it). However, my objection is to Blow's (amongst many others') characterisation of the game and the players. Contrary to his thesis, being smart and adept are actually massively rewarded in WoW by comparison to other games; nearly everybody who plays the game is aware of the best players. There is a lot of status up for grabs just by being the best on a server, let alone best in the world. Accepting his analysis at face value would lead you to conclude that there are no lessons you can take from WoW or other MMOs. In fact, to me WoW demonstrates ways in which people can be motivated to work upon hard, mathematical problems. It would be a shame if people were to dismiss it offhand, when it has the potential to demonstrate how to structure hard work to make it more palatable and attractive to tackle.
1katydee
I know several people who used to be the best players on a particular WoW server-- they said it was generally boring and not really as prestigious as one might expect, since the sheer number of servers out there means that being the best on one doesn't even necessarily mean you're all that good at the game as a whole.
1luminosity
I suppose it would depend on the makeup of your particular server. Though we were nowhere near world best, my guild had decent competition on our server and there was always a need to strive to be the first to win an encounter. Both groups were reasonably well known on the server, and I would reasonably often have people messaging me out of the blue. To try to generalise the post a bit better, I think the lesson from this is that to encourage rational analysis and quick thinking in important areas it's important to have good competition, an easily verified criterion for "winning", preferably milestones towards the ultimate goal, and a reward for winning, whether status or monetary. Off the top of my head, the people behind the X-Prizes seem to have used this model well to encourage innovation in select areas.
8msironen
Seconded. It seems to be a rather unfortunate video game meme in itself that MMOs (WoW particularly, since it somewhat defines the genre currently) massively reward time spent over skill. No amount of grinding low-level content will make you capable of taking down, say, the Lich King in heroic mode (both skill- and gear-wise), and to claim otherwise just shows that the extent of the knowledge of the person making the claim is limited to a single South Park episode. The most "celebrated" players are exactly the people who master the most difficult content first, without the benefit of shared tactics (and usually with the highest gear handicap), not the first guy who kills a billion sewer rats (even if there were such an Achievement). It has been said by the WoW developers responsible for generating new high-difficulty content that most of the challenge comes from the fact that the best players are much, MUCH better than the average player (even more so than the actual player community is aware of), so that making content which is not trivial to the top guilds but is also beatable for the average joe has become somewhat impossible without certain gimmicks. Certainly, you can become, say, the richest player on your server just by investing massive amounts of time (though actually manipulating the Auction House seems nowadays a better strategy than grinding), but that just means that you'll be known as the guy who spent the most time gaming the AH (we actually have just such a player on our realm). If anyone thinks that's the game rewarding you for time spent instead of skill, I seriously suggest they spend a little more time researching the subject before pontificating on it. Finally, I apologize for the slightly combative tone of my first post, but I hope it's an excusable reaction, especially on this site, to a nearly "accepted wisdom" that doesn't really even survive the slightest scrutiny.
0yrff_jebat
(Not meant as a rhetorical question): Does "mathematical analysis" really mean that someone with an IQ of 170 has (on average) a real advantage over someone with an IQ of 160 (if you don't count effects on information processing ability and reaction time) in solving really hard mathematical problems, or is it rather a combination of clicking fast, knowing how the monsters will react and calcing through what will happen if you do X?
0Baughn
Where do visual novels such as Ever17 fit on this scale? Do you count them as games at all?
1mattnewport
I'd never heard of Ever17. Based on the description on Wikipedia, I'd say it's borderline whether it qualifies as a game. I'm not sure it meets the minimum level of interactivity required. Non-game entertainment can fall into the same trap of addictiveness vs. pleasure, however; some TV, for example.
5DSimon
Life imitates Zelda.
4MichaelVassar
All this before we even mention Burning Man?
6prase
Thank FSM that you don't leave weapons there.
[-][anonymous]290

Here's one that's particularly sinister, and shows up in nearly every RPG and MMORPG:

Progress is tied to presence, not performance.

In these games, as long as you're there, in front of the screen, you're making progress. Skill is almost never involved in success - if you can't beat that boss you just need to go kill a few thousand boars and level up a little bit, and if you want that sweet equipment you just have to put in the hours to grind for gold, or faction, or honor, or whatever.

In the real world, getting better at something generally takes actual work, and only occurs under specific conditions of deliberate practice and proper feedback. But it's so easy to fall into the trap of "hey, I'm doing something tangentially related to goal x or skill z, I must be making progress at it".

Well, that's not entirely unrealistic. As Woody Allen said, half of life is just showing up. (Ask Eliezer what he thinks about school...)

2listic
There should be a reason skill is almost never involved in success. In my understanding, this reason is network latency. I think you need low latency to make an action game where achievement is dependent on skill. In World of Warcraft, you can have slow players on slow network connections, separated by a large distance from the server, make progress and have fun. In 3D shooters, you can't.

Blizzard's own Starcraft is competitive and very fast-paced, and yet it has continent-wide servers all the same.

A better reason for the perverse nature of MMOs is that the promise of guaranteed progress, especially combined with social obligations, is much more effective at keeping people paying their monthly fees than the hope of personal improvement.

5[anonymous]
You see it slowly being integrated with other, more skill-based genres in the form of Achievements, little badges you can display and a progress bar/counter that marks how many you've gotten. Many of these are skill based, but just as many are presence based (ie: complete 1000 multiplayer matches). Their widespread adoption into nearly every sort of game leads me to believe they're VERY effective for keeping people around.
1DanArmak
Skill isn't particularly related to success in most single-player games that allow leveling/improving equipment. The developers want to please their paying customers, so they will do their best to prevent a situation where someone isn't skilled enough to complete the game. Since there are usually only a few game endings, everyone gets to see the same result, and so their playing skills don't ultimately matter. Adjustable game difficulty serves the same end. Sure, some games are hard enough that not everyone can beat them, but these are the exceptions and they can even become famous for that quality. (Anecdotally, I remember reading claims that Japanese games are much more likely to be unbeatably difficult than are Western ones.)
2DSimon
Hm, was that judged over the number of games made or the number of game copies sold? Or to put it another way, did it show that Japanese developers like making hard games or that Japanese gamers like playing hard games?
0DanArmak
As I said, it's completely anecdotal - I don't remember the source, but it was someone commenting from his own (extensive) experience, not a controlled study. That said, I expect the comparison was between percentages of well-selling games.
0Relsqui
It doesn't have to be an action game to be dependent on skill. Consider Puzzle Pirates. Almost everything you can accomplish is skill-based, and most of it's even single-player (but cooperative by way of many people puzzling towards the same goal). Avoids most issues with latency (as do, I imagine, the relatively simple graphics), and ties advancement to skill.

It's interesting to see what happens when videogames behave more like real life. For instance, in Oblivion (and Fallout 3), you can't just take things unless you're in the middle of nowhere. If someone sees you, they cry out "stop, thief!". Equally, attacking people who didn't attack you first in civilised areas will draw the guard or vigilantes down on your head, and most of the stuff you find lying around is worthless trash that isn't worth the effort to haul away and sell.

I remember how jarring it was when I first tried to take something in Oblivion, only for a bystander to call for the guard. And then I realised that this is how NPCs should react to casual theft.

9SilasBarta
Is that how people normally react in real life? I would think people tend to be apathetic bystanders, or might think you were picking up something of your own. If someone creates a real life simulator where you can repeatedly practice your crimes and learn what the actual responses would be ... God help us all. (I mean God in the secular sense.)
6James_K
In Oblivion, the settlements you are in are village-sized. They would be close-knit communities in which you are a stranger. Also, we're not talking about picking up something off a street; picking flowers or herbs, for instance, was OK, because outdoor plants generally weren't flagged as owned. Things you might want to pick up were generally indoors and often within sight of the person who owned them.
6Alexei
An interesting article on stealing bicycles here.
7thomblake
Yes, this is following the tradition of the Ultima series, wherein Lord British originally introduced those sorts of mechanics specifically out of concern for the effects video games might have on the character and habits of the players.
3CronoDAS
The Kleptomaniac Hero is very common in video games. If the game lets you take it, you probably should - and you can take a lot of stuff from random people's houses and such, while the people who actually own it stand there doing nothing.
8derefr
This is an example of Conservation of Detail, which is just another way to say that the contrapositive of your statement is true: if you don't need to take something in a game, then the designer won't have bothered to make it take-able (or even to include it.) I always assume that there's all sorts of stuff lying around in an RPG house that you can't see, because your viewpoint character doesn't bother to take notice of it. It might just be because it's irrelevant, but it might also be for ethical reasons: your viewpoint character only "reports" things to you that his system of belief allows him to act upon.
2SilasBarta
I want to see a game where people react normally to this kind of thing ... maybe even have the police increase their watches for thieves as more burglaries happen.
6CronoDAS
In the original Baldur's Gate, you'd get in trouble if any NPC saw you stealing something. And, perhaps unfortunately, "any NPC" included cats and other animals.
2[anonymous]
This seems to depend on the kind of games one plays. NPCs noticing theft was the default for most of my gaming experience (the first RPG I played was Gothic 2); I am disappointed by anything less.

The most insidious of these misguiding heuristics have, apparently due to their transparency (like water to a fish), gone unmentioned so far in this thread.

Typical game play shares much in common with typical schooling. Children are inculcated with impressions of a world of levels that can (and should) be ascended through mastery of skills corresponding to challenges presented to them at each level, with right action leading to convergence on right answers, within an effectively fixed and ultimately knowable context.

Contrast this with the "real world", where challenges are not presented but encountered, where it's generally better to do the right thing than to do things right, within a diverging context of increasing uncertainty.

One thing this essay does not address is whether humans actually are likely to learn heuristics from playing videogames or whether a large enough fraction of the population plays videogames for this to be a real concern.

Let's briefly address that: There's a fair bit of evidence that much of "play" behavior over a wide variety of species exists specifically to learn behavior and rules for actual life events. Thus, for example, wolf cubs engage in mock fights which prepare them for more serious events. Some species of corvids (crows, ravens, jays, etc.) will actively play with the large predators in their area (pecking at their tails, for example, or dropping objects on their faces), an apparent attempt to learn about the general behavior of the predators, which is primarily important because these species of corvids get much of their food from scavenging. It is likely that humans engage in play behavior in part for similar reasons. If so, there's a real danger of people learning bad heuristics from videogames.

What percentage of the population plays videogames? A quick Google search turns up various numbers which disagree, but it seems that they vary from around a third to slightly over half. See for example here. Given that, this seems like a common enough issue to be worth discussing.

4RolfAndreassen
Is it obvious that a videogame is enough like the play a human child would do in the ancestral environment that it will activate the learning-by-play circuits? Our enjoyment does not imply that it is play in the sense our learning circuits recognise.
8datadataeverywhere
Play is about learning. Even games that we don't think of as teaching us anything are fundamentally tied into our learning circuits. Even games as mindless as solitaire (Klondike) activate our learning circuitry regardless of whether or not we actually develop any skills by playing them---like an artificial neural network continuing to train past the point of usefulness, fluctuating around its plateau with every new example it sees. One of the most difficult aspects of video game design is scheduling difficulty increases; ideally, a game gets harder at the same pace that the gamer gets better, because getting better feels good. Engaging those learning circuits is one of the primary reasons games are fun in the first place. The real question to ask is whether learning bad habits in video games translates even a little bit to real life. This is an age-old debate, most frequently brought up when somebody claims that playing violent first-person shooters turns innocent children into mass-murdering psychopaths.
5patrissimo
But we learn by working too - especially if work is somewhat playful. Yes, videogames teach us, but they teach us while producing absolutely nothing, whereas work teaches us too; it may be less fun, but it actually produces value (makes the world better, you get paid rather than paying, etc.) And what you learn at work (or any productive enterprise - maybe it's an art project for Burning Man, or a self-improvement project) is much more likely to be what you need to know for future work. Whereas what you learn in a game may happen to be useful later, but also may not. Playey Work >> Worky Play

I agree with everything you said. We should be especially cautious about playing so-called casual games, where the modal player very quickly reaches a plateau beyond which he or she will learn nothing, much less learn anything useful.

The difference of course is that the learning process in Real Life is slooooow. In game X, after 30 hours of play, the modal player may be one or two orders of magnitude better at a given skill (one that is at least somewhat unique to the game) than someone who has been playing for two hours. Some games (e.g., some first-person shooters) require no unique skills; I suspect the skill curve looks similar, but most players are hundreds of hours along it rather than just a few, so the curve appears flatter and the differences proportionally smaller.

Contrast that to life: in mine, the skills I am trying to cultivate are the same ones that I've been trying to cultivate for years, and my improvement is sometimes so glacial that I feel doubt as to whether I'm getting better at all. I could just be thousands of hours along similarly shaped curves, but I have certainly reached the point where I no longer see incremental improvement: all I see anymore are occasio... (read more)

As I was reading this, I realized that many of the points here apply heartily to single-player games, but the reverse is often true of MMOs.

A while back I spent a few years playing World of Warcraft, and ended up doing mid to high level raids.

When leveling, or completing a raid, you do know your purpose, and it is handed down from on high. This is unrealistic, but possibly one of the most relaxing aspects of escapism.

You DO NOT delay or take forever! While leveling or raiding, it is important to do things efficiently to meet your goals quickly. You want to hit max level ASAP, not see the whole low-level world; you want to see the whole high-level world.

When leveling or raiding, there is usually a specific build that is more powerful than the others. You have choices between various builds, but never more than 3 per character class, and usually the 3 are vastly different and you must choose one of them. For example, every rogue ever would take a talent that gives them +5 attack speed, but taking a bonus to speed while in stealth would get you kicked out of a hardcore guild.

In raiding, the difficulty isn't (strictly) progressive. Some fights are easier, some are harder. Some a... (read more)

I don't play a lot of video games, but I'm quite fond of strategy, and have recently become besotted with Starcraft 2. Something that struck me while looking through the online strategy community was how ruthlessly empirical that community was.

It shouldn't be too surprising. Players are provided with an environment governed by the immutable laws of the game engine, and the only objective is to win. You can accomplish this however you like: efficient micromanagement, complementary unit selection, economic superiority, stealth tactics, mind games, aggressive map dominance, off-the-wall strategy your opponent can't plan for...however you manage it, provided you're the one left standing at the end, you win.

As a result, players continually test the behaviour of the environment, see what works and throw away what doesn't. This often involves setting up contrived scenarios to explicitly test certain theories. The result is a massive body of knowledge on how to effectively win the game.

I would say that it's kind of heartening to find that when given proper incentive, even people with (presumably) no formal scientific training can apply systematic methods for testing the behaviour of their environment, but I don't know what kind of crossover exists between the SC and scientific communities.

8NihilCredo
Here's a full research paper on the subject. It very thoroughly data-mines a WoW strategy forum and observes how the discussion there about the hidden variables of the game world quite clearly mirrors the scientific study of the natural world. A: "I suggest that the Octopus Lord is vulnerable to fire." B: "Good idea! I have a fire sword and an identical ice sword, I'm off to try." C: "Wait! Maybe he's just resistant to ice! We need to design a better test." It's actually a lot more complex than that (there's a HUGE spreadsheet quoted in the paper), but you get the idea. Reverse engineering isn't quite the same as science (because you know from the start that all natural laws must be traceable back to a short piece of human-written code), but they are definitely kin.
3Zvi
Why is the goal handed down from on high? I don't think even this break from reality is true in an MMO. If we mean that the game is telling you what to do, what you have are various NPC questgivers (employers) who are hiring heroes (players) for various jobs and offering various rewards. Then each group of players (guild, group, etc) decides together which of these jobs they want to accept. Alternatively, there are places you can go with things to be accomplished. This isn't that different from freelance work. Even when there is a central overriding goal, you are still free to ignore it and set your own goals. If we mean that the guild is handing down the goal from on high, well, that's highly realistic: Your boss is telling his workers what to do. You don't like it, choose new leadership or quit.
2magfrump
I mean that your goals are extremely concrete, and their value is extremely concrete. "Kill this many boars and you will gain this many experience points." "Earn this many experience points and you will gain a level." My conception of the real world is that goals tend to be vague ("figure out and fulfill my own utility function") and subgoals tend to be unpredictable (will keeping a diary help? A food diary? research on the internet? Spending time with friends? What balance between "figure out" and "fulfill"?) It is true that the system is MORE liquid than in most single player RPGs, where it is not uncommon to encounter a narrator saying something like "monsters are everywhere! Our hero sets out to defeat them all!" Which is on a bit of a different level.

This thread wouldn't be complete without a link to this Ctrl+Alt+Del comic.

4scotherns
'Chore Wars' (http://www.chorewars.com/) is designed to motivate you to get chores done by providing XP / Gold / Treasure for completing chores, and tracking it to induce competition amongst your housemates. It works for me as a more interesting to-do list, and has caused my kids to argue about who gets to clean the toilet and level up.
4CronoDAS
Or this XKCD comic.
4thomblake
I'd never thought of grinding real-life skills - brilliant!
9CronoDAS
Learning to play musical instruments is basically grinding.

Getting good isn't-- see Talent Is Overrated for details about the 10,000 hours to mastery theory.

People tend to prefer grinding over developing relevant sub-skills by experimentation, but the latter is what works.

5CronoDAS
I think we need to decompose what we mean by "grinding". When I practice a segment of a song on the piano over and over until my hands move the proper way to hit the notes, that's grinding, right?
9jimrandomh
I think grinding would be if you kept on practicing the song even after you could consistently play it correctly. Otherwise, the positive connotations of "practice" versus the negative connotations of "grinding" wouldn't make sense.
3NancyLebovitz
Maybe "grinding" isn't the right word, but playing something over and over until it smooths out is the way most people practice. Thinking about what might be causing problems or what might lead to improvement, and then working on the piece to make specific changes is what you need to do to get really excellent.
3NancyLebovitz
Cooks Illustrated might be a real-world example-- they take recipes through a bunch of conscious variations to perfect them.
[-][anonymous]90

On the other hand, good videogames can be a cool tool for low-risk self-improvement.

I've historically had a lot of trouble focusing on one thing at a time - choosing a major, minimizing my areas of focus. I recently played KOTOR, and realized that I play videogames the exact same way. I can never commit to one class/alignment/weapon specialization at a time, and I suffer for it.

Recognizing the similarities, I decided to play KOTOR as a specialist in one alignment, one class, and one weapon type, ignoring tantalizing opportunities to generalize whenever possible. I ended up enjoying the game a lot more than I usually do.

Three weeks later I chose my major, and I honestly believe KOTOR helped.

I would like to commend you for taking the time to include the penultimate paragraph. I think it extremely worth pointing out that not everything that happens in games is likely to manifest in seemingly-analogous real world decisions.

The good news about most of these biases is that they are quite testable. I would love to see some research about the decision making processes of video game enthusiasts (particularly those who started at an early age) and a control group.

Thanks to Ralith, Lark, neptunepink and nshepperd for their feedback on the first draft of this article in the #lesswrong IRC channel. The IRC channel is a good way to get early feedback on posts, and discussing it there revealed several important flaws in the writing that have been fixed.

2Clippy
Which internet has the #lesswrong IRC channel?
2jimrandomh
Less Wrong IRC channel details are on the wiki.
0NihilCredo
This one.
1Larks
Yes - there normally seem to be a good number of people there too. ETA: here is the channel in question.
[-][anonymous]80

Hello, player character, and welcome to the Mazes of Menace!

I'm surprised that you didn't mention NetHack, and that nobody else has either, given that it contains the Mazes of Menace and provides counterexamples to many of your points.

Because your goal is not to reach the end quickly

In NetHack, the goal of beginners is to ascend, i.e. win, and it is very difficult. (I have not yet ascended; the furthest I've gotten is level 27, with 422,434 points.) The goal of intermediate players is to ascend quickly. And the goal of advanced players is to ascend u... (read more)

5[anonymous]
I started writing a reply about how different aspects of NetHack are repeated in other games, which lets those games avoid these hazards too... but then I realized that there was a common factor in everything I was writing: as games become harder and more complicated to play, playing them becomes more and more similar to how we act in reality. NetHack is a very complicated game, and it is also very hard (for one thing, due to permanent death). So we use our full "reality skills" when playing it.

Similarly, if you're on a quest to save the world, you do side-quests to put it off as long as possible

I've explicitly made note of this fact, that one should do quests in exactly reverse order of importance, in every cRPG I've ever played, because making progress on major quests will often change the game (lock you out of an area, say, or kill an NPC) such that you can no longer complete some minor quests if you haven't done them already.

Modern designers have finally started to take this into account. In Mass Effect 2, you do almost all of your side-questing while you wait for your employer to gather information about the main problem. Once the party does get started, the game makes it emphatically clear that waiting any longer than absolutely necessary will severely compromise your primary mission.

2dclayh
But does it actually punish you for waiting, or just threaten to? (I haven't gotten around to playing Mass Effect 2 yet.)
9NihilCredo
jim answered quite thoroughly. I'll add that I was hinting mainly at the fact that the BioWare developers knew that most players would, by habit, take their sweet time no matter how many universes were at stake, and planned accordingly. If your most trusted ally tells you "We must hurry, or we will fail!", a veteran gamer knows to ignore him and go rescue a kitten. If a pop-up window tells you to hurry up or you will fail, you do hurry up. Some messages can only be given on this side of the fourth wall.
2jimrandomh
Yes; if you're too slow, it kills off some minor characters who would otherwise survive. The ending to that game is quite well done. It also has you assign NPCs to tasks, and kills a character for each assignment you get wrong, including some non-obvious and unstated requirements; for instance, you can't put someone in charge of a squad if their backstory doesn't mention leadership experience. However, the early game still has the usual timing-incentive problem: side-quests fall into major and minor categories, and the clock doesn't start ticking until you've done all the major ones.
7thomblake
I have a friend, Rit, who refuses to play cRPGs this way. Towards the end of Final Fantasy 8 (don't worry, no spoilers ahead), you are supposed to do all your sidequests before rescuing a friend in trouble; by FF tradition, this should be obvious to the player, since you've just been given free rein of the world map. Rit said, "Screw that, she's in trouble, I'm going straight there!"
1[anonymous]
The original Fallout is an exception, since it had a time limit. The world changed as time went on regardless of whether you did anything, and if you were slow enough (500 in-game days, I think) you could lose the game.
3William
Star Control II did something very similar: as time went on, the world changed, and eventually one of the villains would start their omnicidal rampage.
5Zvi
And in both of these games I had to restart, because you can use a huge amount of time traveling the world map, and spending game time rather than playing time makes perfect sense (especially for the Luck 10 character I was playing) until you realize you've lost. Star Control 2 gives you fair warning, though I didn't realize it at the time, but Fallout doesn't, and I was pretty mad about it. Having a time limit without being deeply explicit about it is a crime against gaming.
1CronoDAS
Seconded.
1[anonymous]
However, getting a nasty surprise like that might just help shed light on a video game meme you didn't even know you'd internalized. Also, Fallout was explicit about the time limit: the Pip-Boy clock, as well as the manual.
1James_K
It's interesting that the designers shook up the formula for FF 13. You basically don't do any sidequests until you finish the game. After you defeat the final bosses, it puts you back at the last save point and lets you go do all those sidequests you walked past earlier in the game. The incentive to play this way comes from the fact that you can't finish levelling up until you finish the game.

I guess I might as well post about my own experiences, even though I'm probably not a typical game player:

I noticed myself developing the habit of seeking the dead ends first in video games, but I thought that it was just a bad habit that I developed, and that most other people don't play like that. My brother doesn't play like that. But I continue using this strategy even in games where there isn't a reward at the dead ends. I deliberately choose the path that's more likely to be a dead end first, just for my own peace of mind, to know that it's a dea... (read more)

2mattnewport
I used to have some of the same tendencies when playing games, but in an effort to improve my play (particularly in competitive multiplayer games) I learned that it's often a bad strategy. I feel learning this actually helped me overcome an unproductive real-life tendency towards hoarding and excessive caution; I have a much reduced tendency to do this now.

A related habit which I unlearned to some extent from games (particularly competitive RTS games) was the tendency to try to build up impenetrable defenses before engaging in any combat (excessive "turtling", in RTS speak). This is another example of a tendency which can be ineffective or counterproductive in real life, and I've found lessons from game strategy helpful in overcoming it. It's a similar problem, I think, to how you describe your tendency to "err on the side of spending too much effort researching which choice to make, rather than risking making the wrong choice by deciding arbitrarily on a whim".

Note that in certain circumstances both of these tendencies can be good, winning strategies. But if your personality inclines you to overuse this type of strategy even when it is not a good approach, it can be detrimental to your success. I personally found games helpful in appreciating this.
0PeerInfinity
Hmm, I just realized that this confession, that I deliberately use a strategy that inefficiently uses in-game currency, kinda conflicts with my previous claim that I always play to win. A random thought: am I playing to win, or am I playing to "not lose"? Also, sometimes it turns out that I actually did need to save up the in-game currency for an important item in the next town, so I shouldn't always just spend all the currency as soon as I get it, with the excuse that inflation makes frugality counterproductive.

I also have a tendency to turtle. If there's ever a choice between offense and defense, I choose defense. Or maybe higher speed, for better dodging. Or better yet, the ability to heal; I usually pick the class with the best healing ability. My overly defensive strategies kinda make me no fun to play against, but they generally result in me losing less often.

And yes, I have found games to be useful for showing me when my strategy is suboptimal, and I've been making some attempt to change the bad habits, though I don't seem to have made much progress. I have at least allowed myself to go on a big spending spree when I'm at the last town and the currency has stopped inflating. And I've allowed myself to use all those rare items in the battle with the final boss, since there's nothing left to save the items for... except maybe later in the battle with the final boss...

So I know that my strategy is suboptimal, and I'm trying to change it, but I'm failing to actually make any significant changes, due to... psychological inertia? But I still make sure to buy things that are actually necessary or obviously a good deal, and I actually do use items when doing so is obviously a good idea. And my strategy does work well enough for me to win often enough, so maybe I'm being too critical...
0NancyLebovitz
To what extent are you playing to fill time?
0PeerInfinity
When I start playing, it's because I don't feel like I have the energy to do something more useful, or because I just feel like I need a break. So my original purpose for starting the game is just to fill time, maybe have fun, or recharge energy. But once I start playing, I almost always end up taking the game way too seriously, burning energy I didn't think I had, and ending up more tired when I'm finished than when I started. Once I start a game, I have a really hard time stopping. That's bad. And yet playing games is still my default activity when I don't feel like doing anything else.
0mattnewport
This can be a manifestation of a lost purpose. Money and one-use items are useful to accumulate for the purpose of beating the game (or your opponent), but focusing on maximizing them is to lose sight of your goal: winning the game. It's not clear to me whether you are primarily talking about single-player games, but I have generally found competitive multiplayer much more effective than single-player at encouraging winning strategies and punishing losing ones. Good human opponents often devise creative strategies which are educational in themselves.
0thomblake
The reason I hate Final Fantasy Tactics. I've had the same problem. I basically came to an epiphany similar to Red Mage's. It applied to both my behavior in life and in RPGs.

Good post. One other thing that should be said has to do with the *why*. Why do we design many games like this? There are some obvious reasons: it's easier, it's fun, it plays on our natural reward mechanisms, and so on. A perhaps less obvious one: it reflects the world as many *wish it could be*. Straightforward; full of definite, predefined goals; having well-known, well-understood challenges; having predictable rewards that are trivial to compare to others'; having a very linear path for "progression" (via leveling up, attribute increases, etc.). A world with a whole lot fewer variables.

7luminosity
If you're not aware of Jane McGonigal, you might be interested in her work. Her basic position is that games are better than reality, mostly because they have a far superior feedback system. She tries to apply game design to the real world to stimulate people's problem solving.

Back around 1990, there was a school of game design that said that a game should be immersive, and to be immersive, it should stop reminding you that it's a game by making you throw away all real-life conventions. So this school of game design said things like:

  • You should not have to examine everything in the game. You should do just fine in the game by examining only objects that a reasonable person would examine.

  • You should not have to die in order to learn something needed for the game.

  • You should usually be punished for theft, breaking and enterin

... (read more)
6NancyLebovitz
Exactly; and I don't think it's just structural. A lifestyle of killing sentients and taking their stuff might or might not be a pleasure in the real world, but it seems to satisfy the imagination. Women's Work: The First 20,000 Years: Women, Cloth, and Society in Early Times is about what can be deduced about early tech whose products don't survive for millennia. Even then, people were looking to fill time as well as to use it. You'd think people would evolve towards maximum-reproduction utilitarianism, but I'm not seeing it happen.
5luminosity
Deus Ex is the last good example I can think of for a game immersive in this sense. Depending on how the prequel goes, the style might not be dead just yet. Edit: As pointed out downthread, there are of course Bethesda's RPGs too.

In the real world, getting rid of junk costs money in effort and disposal fees instead.

In the real world, you can sell your old stuff. People just don't. Perhaps games can teach them that it is a good idea, even if it's for a fraction of the price you bought it for.

5gwern
Well, sometimes you can sell them. I'm having trouble unloading my GeForce 8600 on Craigslist for $20, which I thought was a pretty low price. And nobody has been interested in my 24-inch TV, even at a nominal $15. EDIT: I managed to sell the graphics card, but got not a single expression of interest in the TV even after dropping it down to $4, at which point I gave up.
3Zvi
I think games teach a valuable lesson the moment you realize that everything you buy has lost three quarters of its value when you try to turn around and sell it. They also teach a valuable lesson when you realize you have a limited amount of inventory space, and that you're going to have to get rid of most of your junk.

Video games do teach us to sell our junk when we can rather than throw it away, however. I strongly feel that, in general, far too much time is spent trying to sell or even give away things we no longer have a use for rather than throwing them away, and often the underlying reason is that throwing them away is wasteful and therefore wrong. My parents taught me this explicitly, and it was a hard lesson to unlearn.
2CronoDAS
Then there are also those video games that reward you for holding onto "useless" junk because you'll end up needing it later for some optional reward, even if the game lets you sell it for much-needed cash before then. LostForever can be one of the more annoying game tropes.
2mattnewport
You can sometimes sell your old stuff, but for many people it's not worth it for most items: the return on the time invested is worse than just throwing it out. Even giving stuff away for free is generally too much effort to be worth it over throwing stuff out, though you might think people who have a use for free stuff would have an interest in making it easier to give it to them than to drop it in the garbage.
1scotherns
Freecycle exists specifically to assist in giving things away.
0[anonymous]
Sites like http://www.listia.com/ make this easier.
0mattnewport
Listia seems like a really terrible idea to me. From what I can tell, it's like a much smaller eBay where money is replaced with 'credits', which the company hopes to profit from by selling them to people. It's possible they might make a profitable business out of it, but I see no benefit to the users other than the misguided idea that they're getting something for free.

I'd say the worst habit of thought promoted by computer games is that if you do something disastrously foolish or clumsy, you can conveniently restart from a recently saved position. Clearly, that doesn't help one develop a good attitude towards the possibility of blunders in real life. (Though I should add that I haven't played any computer games in almost a decade, and I don't know if the basic concepts have changed since then.)

6Zvi
I find it makes me long for this ability rather than fool me into thinking I have it. In fact, being reminded that I could die or make the game unwinnable at any moment tends to have the opposite effect, making me more risk-averse than I should be.
5madair
I don't find that to be so for myself. War games give me a sense of my own mortality, and of the ease of finding death no matter how many health packs and energy shields are provided. I wonder whether others experience the same?

I'm rather amused to be reading this for the first time while wearing my 'Things You Learn From Video Games' shirt...

I bet video games make you a better driver by forcing you to develop situational awareness.

8Matt_Stein
In my experience, it was much easier to learn to drive thanks to my experience with videogames. After years of picking up new control systems, learning to drive an actual car posed little challenge. Same thing when I made the transition from automatic to manual transmission. It'd be interesting to see some research into how easily people pick up and learn new interfaces; I think it's also part of what separates "computer people" from "non-computer people". (Sorry, bit of a tangent there.)
8NihilCredo
I think it continuously ingrains a certain type of "testing the waters" process:

1) Find an operation you can perform.
2) Is it likely to cause permanent damage? If yes, go to 1.
3) Perform that operation a few times.
4) See how it combines with the other operations you have already mastered.
5) Repeat.

I don't think it's something inherent to video games as a medium; it's just the most common activity that requires you to learn a new interface every few weeks, if not more often. Professional tools of any kind will strive to retain a familiar feeling, and everyday tools like household appliances, cars, or cellphones don't get replaced nearly as fast. It's OK, this is rapidly turning into LW's General Videogame Thread anyway.
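That loop is concrete enough to write down. A toy sketch in Python (the operation names and the three-repetition count are invented; this models the habit itself, not any particular game's interface):

```python
# Toy model of the "testing the waters" loop for learning a new interface:
# probe each unknown operation, skip anything that looks irreversible,
# drill the safe ones, and try them in combination with what's mastered.
def explore_interface(operations, looks_irreversible, try_op):
    mastered = []
    for op in operations:                  # 1) find an operation
        if looks_irreversible(op):         # 2) permanent damage? back to 1
            continue
        for _ in range(3):                 # 3) perform it a few times
            try_op([op])
        for known in mastered:             # 4) combine with mastered ops
            try_op([known, op])
        mastered.append(op)                # 5) repeat, with a bigger toolkit
    return mastered

# Example: probing a hypothetical game controller's verbs.
safe = explore_interface(
    ["jump", "crouch", "attack", "delete_save"],
    looks_irreversible=lambda op: "delete" in op,
    try_op=lambda combo: print("trying:", "+".join(combo)),
)
print("mastered:", safe)  # delete_save was never touched
```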

Hmm, I've never really confused my life with in-game life before, but I wonder if maybe I do it at a subconscious level. An interesting note: when I tried playing Morrowind (for those who don't know, that game has a huge open world, with many huge areas that are there for no reason other than to add realism), I experienced a sort of paralysis, because I felt I had to explore every room and open every door, but that's simply impossible in that game.

It should be noted that some of these seem specific to games with a levelling/upgrade system, and in particular to ones that you don't already know and that are not really intended for replay.

For most games, there's a guide that explains exactly how to complete your objective perfectly, but to read it would be cheating. Your goal is not to master the game, but to experience the process of mastering the game as laid out by the game's designers, without outside interference. In the real world, if there's a guide for a skill you want to learn, you read it.

This doesn't sound like how people actually use them?

If it's a puzzle, then sure, figuring it out yourself can be fun. But if you get stuck and want to move on... then don't you pull out a guide?

(... (read more)

Applying this reasoning to the real world would mean choosing a career without bothering to find out what sort of salary and lifestyle it supports; but things in the real world are almost never balanced in this sense. (Many people, in fact, do not do this research, which is why colleges turn out so many English majors.)

It would have been worth noting that there are valid criteria for choice of university courses other than these. As it is, this section looks rather philistine.

If you think people believe that RPG classes are balanced, you obviously haven't spent much time reading game forums! "Imba"-ness (imbalance), real or perceived, is probably the #1 topic of discussion for most multiplayer games.