Here is a place to talk about questions for the speakers at the Singularity Summit.

People are often afraid that their questions are stupid, and so fail to ask good questions that would have benefited many: by increasing their understanding, by showcasing the character of a speaker who admits to not knowing something, or by showing, through the lack of an adequate response, that the speaker's argument is flawed.

This fear is overblown and largely irrational: people's emotions lead them to expect severe consequences from asking a stupid question. Being embarrassed is unpleasant, but those emotions are needlessly warning them not to risk status before the tribe, lest they be cast out and die. That is unlikely to happen.

People's reluctance to ask questions is rational to the extent that many questions are, by any reasonable description, stupid.

So let's use this thread to gain confidence that our questions aren't of the latter type, or to get answers, or to get better questions.


There is a general type of question for anyone, particularly those speaking publicly about the future: "What do your financial arrangements implicitly anticipate?"

What would "bets" over various probability distributions look like?
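One concrete way such a bet could be settled is with a proper scoring rule, which rewards whichever party assigned more probability to what actually happened. This is only a minimal sketch of that idea; the logarithmic rule and the stake size are illustrative assumptions, not anything proposed in the thread:

```python
import math

def log_score_payout(p_a: float, p_b: float, outcome: bool,
                     stake: float = 100.0) -> float:
    """Settle a bet between forecasters A and B, who assigned
    probabilities p_a and p_b to an event, using the logarithmic
    scoring rule. Returns the amount A receives from B (negative
    means A pays B); `stake` just scales the payout."""
    # Probability each forecaster assigned to the actual outcome.
    q_a = p_a if outcome else 1.0 - p_a
    q_b = p_b if outcome else 1.0 - p_b
    # A's payout is proportional to the difference in log scores.
    return stake * (math.log(q_a) - math.log(q_b))

# A said the event was 80% likely, B said 40%; it occurred.
payout = log_score_payout(0.8, 0.4, outcome=True)
# payout = 100 * ln(0.8 / 0.4) ≈ 69.31, paid by B to A
```

A log-scoring bet is truth-seeking in the sense that each party maximizes expected payout by reporting their honest probability, which is what makes "what do your finances anticipate?" a revealing question.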

The exact wording of the question would differ depending on the person's economic understanding.

Question: Do you think financing government debt will unintentionally cannibalize the capital needed to fund innovation at small and medium enterprises (SMEs)? In other words, at any given time there is a finite pool of capital, financed through savings, that can be allocated across a variety of endeavors. Financing sovereign debt is one such market for capital, while other, less visible claimants are SMEs, which lack the political or institutional "pull" to receive loans and funds during a solvency crisis. Will the steps toward a "technological singularity" therefore be slowed by a lack of financial resources?
