Learning to program in a given language requires a non-trivial amount of time. This seems to be agreed upon as a good use of LessWrongers' time.

Each language may be more useful than others for particular purposes. However, like the choice of which charity to donate to, we shouldn't pretend the trade-offs of focusing on one language over another don't exist.

Suppose I know nothing about programming, and I want to choose a language based on more than merely what sounds cool at the time. In short, I would want to spend my five minutes on the problem before jumping to a solution.

As an example of the dilemma, if I spend my time learning Scheme or Lisp, I will gain a particular kind of skill. It won't be a very directly marketable one, but it could (in theory) make me a better programmer. "Code as lists" is a powerful perspective -- and Eric S. Raymond recommends learning Lisp for this reason.
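To make "code as lists" concrete, here is a toy sketch in Python rather than Lisp: the program is literally a nested list, and evaluation is a short recursive walk over it. The operators and structure are simplified assumptions for illustration, not real Scheme.

```python
# Toy illustration of "code as lists": a Lisp-style expression is just a
# nested Python list, and an evaluator is a short recursive function.
def evaluate(expr):
    if isinstance(expr, (int, float)):    # numbers evaluate to themselves
        return expr
    op, *args = expr
    values = [evaluate(a) for a in args]  # evaluate sub-expressions first
    if op == "+":
        return sum(values)
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    raise ValueError(f"unknown operator: {op}")

# (+ 1 (* 2 3)) written as a list:
print(evaluate(["+", 1, ["*", 2, 3]]))  # 7
```

Because the program is ordinary list data, you could just as easily write code that builds or rewrites these expressions before evaluating them, which is the perspective shift Lisp is prized for.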

Forth (or any similar concatenative language) presents a different yet similarly powerful perspective, one which encourages extreme factorization and use of small well-considered definitions of words for frequently reused concepts.

Python encourages object-oriented thinking and explicit declaration. Ruby is object-oriented and complexity-hiding to the point of being almost magical.

C teaches you about functions and lets you work at varying abstraction levels. Javascript is more about the high-level abstractions.

If a newbie programmer focuses on any one of these, they will come out of it a different kind of programmer. If a competent programmer avoids one of them, they will avoid particular kinds of costs as well as particular kinds of benefits.

Is it better to focus on one path, avoiding contamination from others?

Is it better to explore several simultaneously, to make sure you don't miss the best parts?

Which one results in converting time to dollars the most quickly?

Which one most reliably converts you to a higher value programmer over a longer period of time?

What other caveats are there?


As far as I can tell, there is no such thing as a good programmer who knows only one programming language.

There just isn't. The nature of the field is that raw computation is quite alien to ordinary human thought; and that the best way to get anything even remotely resembling a reasonable grasp on it is to come at it from multiple angles.

You should know about functional abstraction, which you're not going to get very much of out of C or Perl, but which you might get out of Lisp or Haskell.
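For a taste of what functional abstraction means in practice, here is a small Python sketch: functions as first-class values that get composed into new functions. The names are made up for illustration.

```python
# Functional abstraction sketch: functions are values, so new behavior
# can be built by combining existing functions.
def compose(f, g):
    """Return a new function computing f(g(x))."""
    return lambda x: f(g(x))

def add_one(x):
    return x + 1

def double(x):
    return 2 * x

inc_then_double = compose(double, add_one)
print(inc_then_double(3))  # double(add_one(3)) = 8
```

Languages like Lisp and Haskell make this style the default rather than an option, which is why they are recommended for learning it.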

You should know something about how the machine operates, which you're not going to get out of Java, Python, or Haskell, but you might get out of C, Forth, or assembly -- and from reading about the actual architecture you're working on. (It's easier to do this on a dinky architecture. I recommend 8-bit AVR microcontrollers.)

You should be eager to reuse other people's code to solve your problems, which you'll learn from an open-source, big-library language like Python, Perl, or Ruby, but probably not from a small, standardized language like C or Scheme.

You should learn about data structures, which you'll only get deeply if you have to implement them yourself, which is really only something you do in a CS class ...
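As a flavor of what implementing a data structure yourself involves, here is a minimal hand-rolled singly linked list in Python -- a toy sketch of the kind of exercise meant here, not production code:

```python
# Minimal singly linked list, implemented by hand as a learning exercise.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class LinkedList:
    def __init__(self):
        self.head = None

    def push(self, value):
        """Prepend a value in O(1) by making it the new head."""
        self.head = Node(value, self.head)

    def to_list(self):
        """Walk the chain of nodes and collect the values."""
        out, node = [], self.head
        while node:
            out.append(node.value)
            node = node.next
        return out

lst = LinkedList()
for v in (1, 2, 3):
    lst.push(v)
print(lst.to_list())  # [3, 2, 1]
```

Doing this by hand is what makes the pointer-chasing in C code, or the cons cells in Scheme, feel concrete later.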

I keep meaning to try drinking some of the relational kool-aid -- any links you'd recommend?
If you are looking to learn the conventional mechanics of using relational DBs, the Learn X The Hard Way series is well-regarded, and there is an (in-progress) SQL edition (though I don't have specific experience with that book).

I eagerly await the most rational toothpaste thread.

I think it has been mentioned before, but it bears repeating, please don't use "most rational" in titles. Just ask for the best programming language and describe your needs.

Upvoted for being the most rational comment in this thread.

Snark isn't the same as rationality.

Let's break it down, shall we? The comment contains the following three things:

  1. A joke insinuating "rational programming" is the same as "rational toothpaste". No claim is made explicitly, so no rebuttal can be made. This is pure dark arts if you ask me.
  2. Negative instruction: Don't do this. No attempt to explain why.
  3. Positive instruction: Do that instead. Again, no attempt to explain why.

And you think this is more rational than the detailed, respectful, intelligent comments made by people who actually thought about the questions for five minutes and shared their expertise?

I'm appalled.

Maybe it's obvious to you that having "most rational X" in the title is stupid. And to be honest in hindsight it seems a bit silly to me as well, now that I have explicitly thought about the reasons for it. But it wasn't obvious when I wrote it, and it surely isn't self-evident to everybody.

I'm not against setting up norms and rules, and yes they are gonna change on people, and yes people need to be humiliated from time to time for breaking them flagrantly, but it's simply unfair to make humiliating jokes in retaliation for breaking ...

Rational is a frequently used but unfortunately more and more meaningless applause light on LessWrong.
Hmm, I guess people who follow most of the discussions here might be overestimating how much "community lore" others are aware of (or they may mistake "opinions shared by a few posters" for "community lore"). For prior discussions of this see here: or here: Most people seem to have liked the suggestion to use "optimal" instead.
Forgive my ignorance; so what is the most rational form of humor? Note: Both of these comments are jokes and were intended to be funny, not snarky punishments for norm-breaking.

It turns out to be fart jokes. I have an elegant proof of this, but it is too long to fit in a comment.

Louis C.K. was deconstructing why farts are funny on The Daily Show the other day:

  1. They come out of your ass.
  2. They make a trumpet noise.
  3. etc.
Argh, I can't believe that went completely over my head in both cases. Now that you've added the italics around "most rational" I can see it.
I share your surprise that the grandparent was so positively received. It was briefly at -1, since I was the first to encounter the comment and I thought it was inane, obnoxious and wrong, without being sufficiently wrong that it could get any points for clever irony or humor. Mind you I upvoted Konk's actual comment.
By popular demand.
I haven't seen this advice before. A link would be appreciated.

Edit: The post has been retitled to "what is the best programming language". My main reason for doing so is to avoid confusion as well as dilution of the meaning of the word "rational" -- which should probably be reserved for specific contexts (e.g. avoiding cognitive biases) rather than used as a catch-all for "optimal" and so forth.

My needs? Well, I am already moderately skilled at a dozen or so languages, including Python, SQL, and Forth. My first scripting language was Perl and my first GUI language was REALbasic, which was essentially Visual Basic for the Mac.

Why did I go into Forth? Well, I wanted some down-and-dirty understanding of what the heck is actually going on in a computer, and I couldn't stick with C long enough to get that for some reason. Now I've done things like creating my own string manipulation functions (by concatenating other functions and primitives). I'm not sure I could have got that from Python. On the other hand, now when I look at C code slinging pointers and char arrays around it makes perfect sense, and I can also visualize linked lists and other data structures. As a newbie, though, I remember it was all extremely confusing.
Behold, rational wart removal
Um, that seems off topic. I do see some vaguely on topic comments in the replies... maybe you meant to link to one of them?
I linked to the post itself because more than one of the comments were about using "rational" in the titles of posts, and I also thought the content of the post was relevant to understanding that discussion.
Labeling it off topic was an overreaction on my part. It was clear to me that you were talking about the comments.

Nonetheless, it seems kind of silly (in an insulting and childish way) for someone to portray the topic "most rational programming language" as essentially equal to "rational wart removal", which is the most parsimonious interpretation of your comment, and which I must therefore rebut since you did not bother to clarify.

There are multiple levels on which programming languages can be rational -- they can teach rationality skills, they can help you make money for rational causes, and so forth. Wart removal is far more specific and a much more clear-cut case of dilution of the term. I have substituted "best" in the title in the interest of preventing dilution, but this still seems to me a step beyond what linguistic politeness and the demands of clarity require -- programming and rationality really are related in ways beyond the superficial "rational = best" kind.
That's the kindest interpretation you could think of? I'm a bit bothered that I have to specify I wasn't trying to be a dick in this specific situation. No, I wasn't trying to be mean to you. It looked like you wanted to see situations similar to yours, so I showed you the first one that came to mind (which of course was the most extreme one), and I assumed you wouldn't think I was implying they were equal.
"Rational" is so frequently used as a contentless word that, if I were to have a comment keyword blacklist, it'd be number two on there, right after "status", perhaps followed by "downvote me". Unless you're talking meta (as in the parent comment), I strongly recommend trying to figure out what you actually mean, and use that word. "Rationality" ain't the goal.
Usually when I say status I actually mean status. It's a valid, coherent and useful abstraction. Reducing all uses of the term to component parts would require writing paragraphs or essays all over the place and in general be a stupid thing to do.
As evidenced by the presence of "downvote me" on the list, my problem with each term is not necessarily the same. Briefly, I expect people talking about status to be worrying about trivialities, providing facile explanations, or stopping at an overabstraction while thinking they understand a complex issue. I expect people requesting downvotes to be A) nutty, possibly conspiracy theorists, and B) over-concerned with the karma system.
It is certainly a word that I have seen used as a curiosity stopper.

I'm pretty sure the short answer is: Become really good at Python. Learn additional languages if you want to solve a problem Python isn't good for, you want to learn programming concepts Python doesn't facilitate, you want to work on a project that isn't in Python, etc.


  • I've seen lots of discussions online about what people think the best introductory programming language is. Python seems to be the clear favorite. (1, 2.)
  • UC Berkeley and MIT both use Python for their introductory CS classes. (Yes, both universities switched away from Scheme.) I don't know much about any other universities.
  • Recently on Hacker News there were polls on what programming languages people like and dislike. Hacker News is the unofficial homepage of programmers everywhere, and thousands participated in these polls. According to two different ways of analyzing the data (1, 2), Python was the most favored language. Note that the poll was for general purpose programming, not best introductory language.
  • In my experience, it's quite valuable to be highly fluent in your chosen language. For me, there seems to be a substantial productivity difference between a language I am fluent in and a lang ...
Also, one shouldn't ignore the practical advantages of a batteries-included language (like Python) for enabling you to do things. For example, if one wants to do some transformation of a spreadsheet which might be tricky or annoying to do in Excel/LibreOffice, one can just import csv and you've got a complete CSV file reader. Similarly, it's an import and 2 function calls to get a file off the internet.
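The csv part of this can be sketched in a few lines; the column names and data below are invented for illustration:

```python
import csv
import io

# Hypothetical CSV data standing in for an exported spreadsheet.
data = "name,hours\nalice,3\nbob,5\n"

# csv.DictReader gives you one dict per row, keyed by the header line.
reader = csv.DictReader(io.StringIO(data))
total = sum(int(row["hours"]) for row in reader)
print(total)  # 8
```

With a real file you would pass `open("sheet.csv")` instead of the StringIO wrapper; the rest is identical.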
I only count one function call:

    import requests
    f = requests.get(my_url).text

The built-in urllib and urllib2 modules can do this too, but they're a disaster that should be buried in the ground. The general consensus is that requests is the Right Thing. You have to install it, but that should be easy with the pip package manager:

    $ pip install requests

By the way, I agree with the recommendation of Python. It's a very easy language to get started with, and it's practical for all sorts of things. YouTube, for example, is mostly written in Python.
Thanks for requests!

Everyone says you should start with Python. Everyone's right. It's a beautiful language in which you can write good, concise, elegant code. Lots and lots of shops use it.

If you want to learn a second language, to give yourself some sense of the diversity available, I'd recommend Haskell. I think Haskell complements Python nicely because a) it's nicely designed and b) it's almost nothing at all like Python. It's fantastically strict where Python is fantastically permissive. It's functional where Python's object-oriented, etc.

I honestly don't know what the best Python tutorial is -- I learned from a handful. The best Haskell tutorial in the world is Learn You a Haskell for Great Good.

The "other" Haskell tutorial is also worth a mention: Real World Haskell. (That said, I prefer LYAH.)
I've been reading this book and enjoying it. At first I couldn't get into the groove because I got bored/distracted while reading the intro, but I was able to get started right away with the slick interactive web interface at Try Haskell, after which coming back to LYAH had more appeal. Some thoughts:

  • Haskell shares plenty of syntax with Python, Ruby, and Javascript that you don't see in Perl or C. For example, the way lists and tuples are represented, and the filter and map functions.
  • Getting into something right away with instant feedback reduces Delay, so the Procrastination Equation comes out more favorable. So to those struggling with akrasia, the web app is worth a try. Similar apps exist for Ruby and Python.
  • Intros in programming books are something you should probably skip (or skim); get into the examples, which you should start trying out asap. It is important to get positive feedback if you want to generate sustainable interest.
  • Once you've achieved a certain amount of interaction with the language, to the point that it is clicking with you, front-matter stuff and technical explanations will become much more interesting. (You may wonder why you thought they were boring.)
  • In this particular case, I had a harder time starting LYAH during the evening than I did during the morning. There could be some state-of-mind considerations there -- e.g. less patience for the jokes and digressions due to end-of-day fatigue, less "room in my brain" for the new concepts. Tentative hypothesis: it's optimal to do something that is more a matter of rote typing and simple response (like Try Haskell) later in the day, whereas something involving intellectual learning (like reading Learn You a Haskell) works better during the morning.
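The filter/map overlap mentioned above is easy to see from the Python side; a toy sketch (the same idiom reads almost identically in Haskell):

```python
# map and filter: the vocabulary shared by Python, Ruby, Javascript,
# and Haskell. Filter keeps the evens, map doubles them.
numbers = [1, 2, 3, 4, 5]
evens = list(filter(lambda n: n % 2 == 0, numbers))
doubled = list(map(lambda n: n * 2, evens))
print(doubled)  # [4, 8]
```

The Haskell spelling would be `map (*2) (filter even [1..5])`, so a Python programmer already knows the shape of the idea.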

Is it better to focus on one path, avoiding contamination from others?

Learning multiple programming languages will broaden your perspective and will make you a better and more flexible programmer over time.

Is it better to explore several simultaneously, to make sure you don't miss the best parts?

If you are new and learning on your own, you should focus on one language at a time. Pick a project to work on and then pick the language you are going to use. I like to code a Mandelbrot set image generator in each language I learn.
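For concreteness, here is one minimal way such a starter project might look in Python: an ASCII Mandelbrot renderer. The grid size and iteration cap are arbitrary choices, not a recommendation.

```python
# Minimal ASCII Mandelbrot renderer: one character per sample point.
def mandelbrot(c, max_iter=30):
    """Return True if z -> z*z + c appears bounded after max_iter steps."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:       # escaped: definitely not in the set
            return False
    return True

# Sample a grid over roughly [-2, 0.67] x [-1, 1] and print it.
for y in range(-10, 11):
    row = ""
    for x in range(-30, 11):
        row += "*" if mandelbrot(complex(x / 15, y / 10)) else " "
    print(row)
```

Porting this loop to each new language exercises arithmetic, functions, control flow, and output, which is why it works well as a recurring first project.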

Which one results in converting time to dollars the most quickly?

If you make your dollars only from the finished product, then pick the language with the highest productivity for your target platform and problem domain. This will probably be a garbage-collected language with a clean syntax, a good integrated development environment, and a large set of available libraries.

Right now this will probably be Python, Java or C#.

If you make your dollars by producing lines of code for a company, then you will want to learn a language that is heavily used. There is generally a large demand for C++, C#, Java, Python, and PHP programmers. Companies in ...

One way to do this is by writing small C programs and looking at the assembler a compiler generates, e.g. by calling gcc with -S. (You can also use this to get some understanding of the optimisations a compiler performs, by comparing the assembler generated without optimisations against the assembler generated with full optimisations.) As you do this, you should also start replacing bits of the C code with inline assembler that you have written yourself, since writing code is better than just reading code. (Also, the DCPU-16 from the yet-to-be-released game 0x10c might be a reasonable way to learn the basics of assembly languages: there are even numerous online emulators, e.g. 0x10co.de.)

Languages are for completing tasks, and each has varying strengths and weaknesses for different tasks. What specifically do you want to be able to do?

If you are a scientist or engineer who needs to quickly and accurately answer questions from quantitative data or perform statistical inference, R is the way to go. It also has a great interactive command line with powerful data visualization tools and plotting functions. The experience of "playing with" and manipulating data to quickly ask questions, and consider the data in different ways directly ...

I've been wanting to learn R. Do you have any recommendations for tutorials?
I recommend these: the Girke Lab R manuals. Full disclosure: I am biased because I co-authored several of those... but I really do think they're quite good. They're oriented primarily towards people who want to do biology/bioinformatics with R. Bayesian content is in the works...
That's sweet, thanks! R is severely lacking in free tutorials. (As is Bayesian stats.)
This might be an approach.
http://stackoverflow.com/questions/570029/learning-applied-statistics-with-a-focus-on-r seems like a good starting point.
  • Doing Bayesian Data Analysis: A Tutorial with R and BUGS
  • Introduction to Statistical Thinking (With R, Without Calculus)
  • R Videos
  • R-Bloggers.com, a central hub (i.e. a blog aggregator) of content collected from bloggers who write about R (in English).
  • RStudio, a free and open source integrated development environment (IDE) for R.
Do you recommend any of the 3 tutorials/books? The first one sounds good if it would let one kill two birds with one stone: both learn R and learn Bayesian statistics.
Wow, thanks.

Just today, there was a post at Coding Horror, which was itself a follow up to another excellent post, about whether or not learning a programming language is a good use of your time. I think you should read those before you get too invested in the idea of teaching yourself how to program.

That second post caused a bit of a furor, which was mostly disagreement. E.g. the Hacker News thread, and several (hundred?) other blog posts. And the first post is also causing disagreement (again, the relevant HN thread).

Learn assembly, C, and Scheme, in that order.

Start by learning assembly for some small, manageable microcontroller, like PIC16 (if they still sell those) or a low-end Atmel, preferably one that you could wire up yourself. Prove to yourself that you can make LEDs light up by writing bits to ports. This will give you a good understanding of what computers actually do behind the scenes.

Follow that up with C, for your desktop machine. Implement some common data structures like linked lists and binary trees (you could do it in assembly as well, but it'd be too ...

I recommend outright reversal of the above process. If you absolutely must learn assembly language, do it once you can already program. The same applies to other excessively low level languages with a lot of emphasis on memory management rather than semantic expression.

I actually started with Basic, then went to Perl, then Python (which I didn't grok all that much at the time), and finally Forth, which is probably lower level than C in some respects but was somehow easier for me to stick with. I tried picking up C and couldn't get past Hello World. With Forth (specifically the RetroForth project -- which was a bit less smooth at the time) I built my own linked lists, dictionaries, string splitters, and stuff like that, using concatenation that maps more or less directly to machine code. Now when I look back at these other languages I see real stuff going on instead of magic. Maybe this is the equivalent of practicing fencing with a weighted sword.
This still misses a few areas. You would not be completely ready to learn OCaml (or Haskell), with their sophisticated type inference systems. You would probably not be ready to learn Erlang (unless you used something like Termite, if your Scheme turned out to be Gambit). If you picked an R5RS Scheme, you would probably miss some of the power of macros. You would miss APL/J array processing. You would probably miss Smalltalk object orientation (unless you pay attention to using some Scheme object system), and also multiple-dispatch object-oriented programming (which is not Smalltalk, but, say, the Common Lisp Object System). As for calling C syntax clean... well... Anyway, for long-term learning you need to ask what concepts to learn -- you will most probably have a choice of languages (but no single language will include all of them).
All true, but you should be able to pick those up easily enough if you have internalized the other concepts. For example, type inference is much easier to understand when you realize that, underneath it all, there's no such thing as a "type" anyway, just pointers pointing to memory blocks of various sizes; and that, on the other hand, you could construct whatever notion of a "type" that you want, by using functional programming. I have to admit, though, that I was never a big fan of macros, in any language. I meant, as compared to raw assembly. That was kind of my point: instead of learning a specific set of concepts, learn just enough of the right ones, so that adding new concepts becomes easy.
Well, it looks like either you have some minimal experience with abstract algebra or you will need to learn some of it while working with complex type systems. Learning a new powerful abstraction, to the level of being able to exploit it for complex tasks, is a matter of a few days of full-time learning/thinking/tinkering per se, so learning new languages will still not be trivial. And you have to spend a few weeks collecting minimal best practices.

Given that only macro-assembler and Lisp-like languages even had complex enough macros to matter until recently... Well, from my experience I can say that implementing a Lisp-macros-like system for Pascal did help me simply cope with some project. Some say that macros are just for fixing deficiencies of the language; while that is partially true, macros are useful because you will never have a language precisely fit for your task at hand in a large project.

But at the same distance from hardware, there is also Pascal. I remember being able to read Pascal without knowing much of it, and it is still not verbose enough to be annoying to some people. While learning C, it is a nice idea to write down every way you come up with to write hard-to-read code buried in the syntax. After a while, it will make a useful checklist when cleaning up sloppy code.

The main problem is that unless you have learned some concept, you don't really know whether you need to try to apply it. You give a nice set of starter concepts, of course, but I found it useful to show that there are very many concepts not mentioned -- and it is a good idea to be aware of them.

A piece of advice not to take seriously is to look at http://www.falconpl.org/index.ftd?page_id=facts . Afterwards, one could find where all the mentioned concepts are implemented sanely and learn those languages...
Or both! It depends on what you mean by "trivial". Learning a programming language to the point where you can effectively employ it to solve a complex real-world problem will never be easy, for the same reason that learning a human language to the point where you can converse in it will never be easy. There are always a bunch of library APIs (or dictionary words), edge cases, and best practices (or colloquialisms) to memorize. But understanding the basics of the language does not have to be hard.

Technically, this is a language deficiency in and of itself. I rarely find myself wishing I had more macros in Python, or even C#. I do wish there were macros in Java, but that's because they still haven't implemented closures correctly.

I dislike Pascal because I find its pointer syntax needlessly obscure. That said, I haven't used Pascal for like 15 years, so my knowledge could be hopelessly outdated.

Sure, but you've got to draw the line somewhere; after all, there are as many concepts as there are language designers! Many of them can be rolled into more general concepts (e.g. all the different kinds of type systems can be rolled into the "type system" category). Others aren't even concepts at all, but simply useful tools, such as regular expressions or file I/O. You can't learn everything at once, so you might as well start with the big ideas and go from there.
All this is simple to look up -- programming is not fluent speech, it is writing. The problem is that similar words have radically different combinations of meanings. And also, sometimes there are totally new concepts in the language. You see it better after you try learning a language where the concepts do match your expectations.

Well, I have written a significant amount of code in Python and I did have to use workarounds that would be cleaner as macros... If you consider your language a good fit for your task at any time, you are likely just not asking for the best. It can be mitigated if your requirements are understandable.

It is still the same. But C syntax is plainly malicious even in assignments, so why care about pointers. Somehow, Google managed to create a clean C-derived syntax in Go by streamlining a lot of rules.

But it is also clear that you should always know that you are not learning some magical set of all basic concepts, just the concepts that are simplest to learn in the beginning.
Have you ever tried learning a foreign language? Maybe it was easy for you -- I know people who seem to have a natural aptitude for it -- but for me, it was basically a long painful slog through dictionary-land. Yes, from a strictly algorithmic standpoint, you could look up every word you intend to read or write; but this works very poorly for most humans.

I think your demands might be a bit too strict. I am perfectly ok with using a language that is a good, though not 100% perfect, fit for my task. Sometimes, I would even settle for an inferior language, if doing so grants me access to more powerful libraries that free me from extra work. Sure, I could "ask for the best", but I have other goals to accomplish.

How so? Perhaps you were thinking of C++, which is indeed malicious?

I agree with you that there's no magical silver-bullet set of concepts, but I also believe that some concepts are vastly more important than others, regardless of how easy they are to learn. For example, the basic concept you internalize when learning assembly is that (roughly speaking) the computer isn't a magical genie with arbitrary rules -- instead, it's a bag of circuits that moves electrons around. This idea seems trivial when written down, but internalizing it is key to becoming a successful programmer. It also leads naturally to understanding pointers, on which the vast majority of other languages -- yes, even Scheme -- are built. I doubt that you can properly understand things like type inference without first understanding bits and pointers.
English, French (I usually forget the latter and recover it when I have any proximate use for it). My native language is Russian. It is a big relief when learning French that most words have the same translations in many contexts. This multi-translation problem is way more annoying than simply looking up words.

This actually confirms my point. You will have to choose an inferior language from time to time, and its lack of tools for adapting the language to your task is either local incompetence of the language authors, or lack of resources for development of the language, or language-community arrogance.

"i += i++ + ++i;" can be reliably compiled but not predicted. There are many actual everyday examples like "if(a=b);". Of course, it is not even close to C++, which takes malicious semantics a few levels up.

Any command-line programming environment will make you internalize that the computer has some rules and that it does what you order -- literally. x86 assembly is quite arbitrary anyway. Maybe LLVM assembly (which is closer to a "pointer machine" than to a "random access machine") would be nicer. After all, high-level languages use specially wrapped pointers even in their implementations.

You cannot properly understand some performance implications, maybe. But the actual input-output correspondence can be grokked anyway. Of course, only higher-order functions have a strict proof that they can be understood without proper understanding of imperative semantics.
It's possible that you are much better at automatically memorizing words than I am.

Wait... what? Are you saying that, when I have some practical task to finish, the best solution is to pick the most elegant language, disregarding all other options -- and that not doing so makes me arrogant? I am pretty sure this isn't right. For example, my current project involves some Bluetooth communication and data visualization on Windows machines. There are libraries for Java and C# that fulfill all my Bluetooth and graphical needs; the Python library is close, but not as good. Are you saying that, instead of C#, I should just pick Scheme or Haskell or something, and implement my own Bluetooth stack and drawing APIs? I am pretty sure that's not what you meant...

Ok, that's a good point; I forgot about those pre-/post-increments, because I avoid them myself. They're pretty terrible. On the other hand, the regular assignment operator does make sense; the rules that let you say "if(a=b)" also let you say "a=b=c". The result of an assignment operator is the RHS. I don't see this as a bad thing, though it might've been better to use "eq" or some other token instead of the comparison operator "==".

True, and that's a good lesson too, but programming in assembly lets you get close (though not too uncomfortably so) to the actual hardware. This allows you to internalize the idea that at least some of these rules are not arbitrary. Instead, they stem from the fact that, ultimately, your computer is an electron-pushing device which is operating under real-world constraints. This is important, because arbitrary rules are something you have to memorize, whereas physical constraints are something you can understand. You are right about x86 assembly, though, which is why I mentioned "a small microcontroller" in my original post. Their assemblies tend to make more sense.

You are right, though this depends on which problem you're solving. If you approach the programming language compl...
Or simply annoyed by different things.

Sorry for the unclear phrase. I mean that a language's lack of tools is the language's arrogance.

"a=b=c;" vs "a=c; b=c;" is not much; the former syntax simplifies injection of vulnerabilities (intentionally or incidentally).

I have written in C for these microcontrollers -- physical constraints visibly leak into the language, so if you are learning C anyway, you could delay learning assembly.

If you learn just Scheme and OCaml you can still understand what a type system and type inference give you. You can appreciate a steam engine without knowing nuclear physics, after all.
I'm still not sure what you mean by that. Are you suggesting that all languages should make all possible tools available? For example, should every language, including C, Javascript, Java, C#, Ruby, Python, Dart, etc., provide a full suite of Bluetooth communication libraries? I agree that it would be really neat if this were the case, but IMO it's highly impractical. Languages are (so far) written by humans, and humans have a limited amount of time to spend on them.

What do you mean by "injection of vulnerabilities"? Also, "a=b=c;" is more correctly rendered as "b=c; a=b;". This is what makes it possible to use shorthand such as "if ( (answer = confirmRequest()) == CANCEL ) ...".

Sure, you could delay it, but it's best to learn it properly the first time. There are certain essential things that are easy to do in assembly but harder in C: for example, balancing your branches so that every iteration of the main loop takes the same number of cycles.

If you were a person who only knew Scheme, how would you explain "what type inference gives you", and why it's useful?
It was a clarification of a specific phrase in my previous comment. The original phrase answers both your questions: I specifically said that it can be a lack of resources or competence, not only arrogance. And this is specifically about tools that let you tailor the language to your specific task, so that there are no problems the language prohibits you from solving. Somebody can always write a Bluetooth library.

This is not essential for many applications, even on what are now called microcontrollers. Learning optimization at that level is something you can do once you already have a good grasp of other concepts.

Type inference allows you to write with strict typechecks and catch some kinds of errors without cluttering the code with type specifications for every variable.
That makes sense, and I do wish that more languages supported more capabilities, but I think it's unrealistic to expect all languages to support all, or even most, or even some large fraction of the real-world tasks out there. There are vastly more tasks than there are languages: graphics (raster, vector, and 3d, on various systems), sound, desktop user interfaces, Bluetooth, TCP/IP networking, bio-sequence alignment, finance, distributed computation, HTML parsing and rendering, SQL access... and that's just the stuff I've had to handle this month!

I think the opposite is true: performing this kind of optimization (even on a "toy" program) is exactly the kind of task that can help you internalize those concepts.

I agree with you there, but I'll play Devil's Advocate, in my attempt to adopt the perspective of someone who only knows Scheme. So, can you give me an example of some Scheme code where the strict typechecks you mentioned are truly helpful? To me (or, rather, my Schemer's Advocate persona) this sounds inelegant. In Scheme, most entities are pairs, or data structures built of pairs, anyway. Sure, there are a few primitives, but why should I worry about 5 being different from 5.0 or "5"? That sounds like a job for the interpreter.
You didn't understand my point correctly. The language per se should not directly support, say, Bluetooth -- because Bluetooth will change in an incompatible way. A language can live without a Bluetooth library -- why not, there is always FFI for dire cases. The question is about allowing a nice API to be defined if the need arises. More or less any metaprogramming tool that is not constrained in what it can create would do -- those who want to use it will wrap it in a layer that is nice to use, and you can then just incorporate their work. Common Lisp didn't have any object system in the first edition of the standard; CLOS was prototyped using macros, documented, and then this documentation was basically included in the standard. Of course, macro use could be somewhat more clumsy or more explicit for any reason (to make overuse easier to control, for example) -- this is not a problem. The problem is when you have zero ways to do something -- for example, to define a non-trivial iteration pattern.

Sorry? I was talking about things that help to catch errors. In any small snippet the errors are simple enough to find, which makes small examples unillustrative. It only helps you when you have some kind of wrong assignment in 1K+ LOC.

Time to dollars: Python. Ubiquitous, powerful enough, useful for everything, has a friendly learning curve but also a wide variety of concepts in the language.

Highest-value programmer: Probably C, but it's sort of a moot point because I don't think there's a way to become a programmer of above-average value without becoming pretty competent in two or three languages along the way.

Caveats: I'm pretty sure that it varies a great deal based on your inclination and logical ability going in.

If a programming language has nothing new to teach you, it is not worth learning. For this reason, it is probably a good idea to learn multiple ones that are "conceptually orthogonal" to each other. Examples:

lisp (code-as-data, syntax vs semantics, metaprogramming)

prolog (declarative programming)

a simple RISC assembly language (machine details, stack vs heap)

haskell (functional programming, type inference, lazy evaluation, monadic theory, type classes)

ruby (message passing, object oriented programming, regular expressions)

APL (concise syntax, vector programming, non-ascii programs)

Assuming you are a person not influenced by external incentives.
Learning ideas has a better ROI than learning tools. It's easy to pick up tools as needed for work, but recognizing ideas/patterns is both a more transportable kind of knowledge and harder to acquire. Also, the key ideas behind computation do not have a "half-life," whereas tool/trade-school knowledge does.
Exactly, it's all about the concepts underlying the tool and recognizing situations when a certain tool has a better ROI than some other one at solving a problem at hand. But, sometimes it can be hard to make a fair judgement on whether you really know something or just think that you know. So, it might definitely be useful to know a few other techniques/tools of doing the same thing in order to foolproof yourself.

A question regarding your title: are you looking for the programming language that best teaches rationalist thinking (if there is one in particular)? Or are you asking for a more general analysis of what the various languages are best at?

Regardless, as a novice programmer (I'm taking my first Java class right now), I would be interested in hearing what LW's opinions are. I chose Java because I wanted to develop Android apps, and because of the large number of jobs calling for Java programmers in my area.

I would like to ask the commenters: what do you think about learning JavaScript as a "first" programming language? I would like to learn modern programming technologies and best practices, but also something quickly usable in the real world and applicable to web programming.

I was going to learn JavaScript for a while (but haven't got around to it) because:

  • I heard it's kinda Scheme on the inside, and generally has some really good parts
  • To do web programming, I need to learn JavaScript for client side anyway; with Node.JS I can
... (read more)
JavaScript is fine as a first language. I consider it to be a better first language than the TRS-80 BASIC I started on.
JS has a powerful advantage as far as usefulness goes: it already comes with every browser, so you're going to have to learn it for the client side anyway if you are doing web apps. My suggestion to newbies trying to find a quick 10-minute intro is CoffeeScript.

I'm still leaning towards Python and Haskell as things I should be learning for various reasons. (Python seems useful and career-friendly, and I already know enough to be dangerous. Haskell seems to teach a different kind of math/thinking, which is attractive long term even if I never use the language.) However, Javascript is pretty friendly, especially with CoffeeScript and NodeJS. It might actually be a better language for the web-entrepreneur track, since the hottest apps will be optimized for the client side.

One thing I've noticed about the NodeJS community is they seem really good about removing trivial inconveniences. For example, with Meteor I was able to get an example set up in about 30 seconds.
Javascript shares a problem with C++: it is hard to find non-crap documentation and tutorials that won't lead new coders subtly (or not so subtly) into bad habits that are hard to break later. With C++ or Javascript, the first few Google results for any newbie question are likely to be pretty bad.

If you have access to a really good Javascript programmer who uses modern techniques and libraries (use of jQuery, Prototype, CoffeeScript and/or node.js are all good signs), and can get them to supply you with help or at least review the help you're getting from others, then that's cool. If not, then stay away from JS until you're a good programmer and you have a direct practical need for it.
Personally, I would recommend learning Python first and then learning JS. Udacity has great free courses in Python. Python has fewer caveats than JS. And there is very little in Python style that will steer you wrong when learning JS.
Python has very nice tracebacks that help a ton with debugging. JavaScript doesn't come close. But yes, JavaScript is not a terrible choice for a first language.

I'm fond of Perl as a first language, for a couple of reasons. Foremost among them is that Perl is fun and easy, so it serves as a gentle introduction to programming (and modules) that's easy to stick with long enough to catch the bug, and it's versatile in that it can be used for webapps or for automating system tasks or just for playing around. But I wouldn't recommend making it anybody's only language, because it IS a scripting language and consequently encourages a sort of sloppy wham-bam-thank-you-ma'am approach to coding. Start with it, learn the bas... (read more)

I remember Perl with fondness, but unfortunately it seems to be a dying language. The foretold Perl 6 (literally foretold: there were "Exegeses" and "Apocalypses" and everything) has been at a standstill for many years, and the once-amazing CPAN has now been utterly demolished by the likes of GitHub and RubyGems. There's a lot to be said for languages with active communities regularly supplying new and updated useful libraries.

If you miss Perl, try Ruby; it was actually meant at the beginning to be a fairly Perl-like language, and it has many (IMO somewhat underused) features that assist with quick get-crap-done scripts, like the ARGF I/O handle that automatically and conveniently reads through STDIN and/or files specified on the command line.
The problem with Perl as a first language (which maybe makes Python a better choice) is that it encourages sloppiness a bit too much. You can certainly resist; but in Python, Pascal, or Scheme you can take an arbitrary example program off the net and have a decent chance of reading and understanding it quickly. Reading code not written in your presence is an important skill, and developing it with Perl will take more time than with most other proposed first languages.

I have no interest in evaluating languages based on how quickly they lead to money; only how they affect your ability to program. Additionally, I am not particularly experienced. I've only been programming for three or four years. Take my words with a grain of salt.

I remember one semester in college where I was using three separate programming languages at the same time: VB, Java, and Assembly (16-bit x86). Sometimes it led to a small amount of confusion, but it was still a good experience. I would suggest beginning a second language soon after achieving ... (read more)

There's a dilemma here which is present in teaching a lot of skills: do you want your hypothetical students to be building useful things quickly, or do you want them to be internalising concepts that will last them a lifetime?

If it's the former, just give people a solvable problem and let them pick their own tools. If it's the latter, start them off with some verbose compiled unforgiving strongly-typed beast like C or Java. They're best learnt in a training environment rather than on the fly, so if you have a training environment, it makes sense to learn them there. It's easier to go from, say, Java to Python than it is to go in the other direction.

At times I've thought the same myself, but I've read accounts from at least two different CS professors who said that student outcomes improved dramatically when switching from statically typed languages to Python as a first language. I suspect Python works well because it makes it easy to introduce just a few concepts at a time. I started with Python and I didn't have any trouble learning statically typed languages.
Static typing isn't really the issue. C and Java are constrained and fussy and demand a level of forethought and attention to detail that is absent from something like Perl or Python. This is mostly because they're compiled, although static typing also plays a role. The point is they're unforgiving, and teach you a level of discipline and rigour that you otherwise may not need to learn. Personally, I hate working in C or Java, and avoid them precisely because prototyping all my functions annoys me and I don't care whether my objects are public or private. I'm still glad I learned them when I did, in a training environment where I was obliged to do so. If I hadn't, and I'd started on Python, I would have had absolutely no motive to learn them unless someone made me.
I see what you are saying. As a newbie I found it hard to stick with C or Java long enough to get past Hello World. If you're relying on internal motivation, you aren't likely to stick it out long enough to get the fundamentals from these. The problem is that you need rewards within a certain time limit for the brain to become addicted, and that only happens after a certain amount of coding has been done. On the other hand, something like Python (or Basic for that matter) is easy, but your inner lazy person is going to keep thinking certain things are magic, because they are automated and work without you thinking about them.

With Forth I like to think there's a bit of the best of both worlds. IME, you can get addicted to Forth without too much effort, but it is very hard to get anything serious done in it until you've been doing it for several years. Essentially you end up building your own language from first principles. For example, pretty much every language has a stack for passing values, but most hide this from the user. Likewise, every language represents memory addresses as numbers, but this also tends to be hidden from the user. In Forth, if you want to hide complexity you pretty much have to do the hiding of information yourself -- concatenate functions and primitives to generate complexity, then factor them into smaller functions to hide it. Factoring is necessary in every language, of course, but most don't punish you as hard for not factoring, and most ship with tons of complexity already pre-factored and ready for you to magically wave your hands at. I'm not saying that's bad, just that it is (or seems to me) a trade-off people may not be aware they are making.
Fair enough. I guess I could say the same thing, I'm glad I was made to learn C for school. C is the ideal language for a data structures class in my opinion, since you can implement all the data structures yourself and understand what's really going on under the hood.

My personal recommendation is Visual Basic, assuming you use Excel for anything beyond recording what you ate for breakfast. VB extends its functionality tenfold if you know a few basic things. It has the added bonus of being a very easy language to learn; the syntax is pretty much English. That being said, no company is ever going to use VB as a real programming language, but it sounds like employment is not your goal.

Edit: Also, it's important to note that no language (at least I don't think) is going to teach rationality any better than any other. It's not like programming changes very much; for most purposes, it's just different syntax.

Surprisingly, a lot of Wall Street uses VB for automating models. It's a dirty little secret but I've known people highly paid to do this.
Just because you can code in C++ the way you coded in C doesn't mean it's a good idea to do so. Programming changes faster than most programmers do. The "just different syntax" statement is true for scripting languages, but not for programming languages. You will see this practically the instant you start optimizing your code for performance scaling or modularity or whatever.

Relevant xkcd. It is important to realize that a good programming language doesn't help if the thoughts are confused. This often matters much more than the language choice.

OOP in general can help clarify thoughts, especially when dealing with complex systems. Readability of code is another huge factor, which depends on your ability to write good code (easier in some languages than in others) as well as on the language's syntax itself. A good language may not help if your thoughts are already confused, but it might help you not to confuse your thoughts.