DanielVarga

Here is something I'd like to see: you give the machine the formally specified ruleset of a game (Go, chess, etc.), wait while reinforcement learning does its job, and out comes a world-class computer player.

Here is one reason, but it's up for debate:

Deep learning courses rush through logistic regression and usually only mention SVMs in passing. Arguably, it's important for understanding deep learning to take the time to really, deeply understand how these linear models work, both theoretically and practically, both on synthetic data and on high-dimensional real-life data.
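As a concrete starting point, here is a minimal sketch of what "really understanding" the linear model might involve: logistic regression trained from scratch by gradient descent on synthetic data. All names, data, and hyperparameters below are illustrative, not from the original comment.

```python
import numpy as np

# Logistic regression by batch gradient descent on two Gaussian blobs.
rng = np.random.default_rng(0)
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, size=(n, 2)),
               rng.normal(+1.0, 1.0, size=(n, 2))])
y = np.concatenate([np.zeros(n), np.ones(n)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)            # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)   # gradient of mean cross-entropy
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")
```

Working through this on synthetic data first, then on a real high-dimensional dataset, is exactly the kind of exercise the rushed courses skip.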

More generally, there are a lot of machine learning concepts that deep learning courses don't have enough time to introduce properly, so they just mention them, and you might get a mistaken impression about their relative importance.

Another related point: right now machine learning competitions are dominated by gradient boosting; deep learning, not so much. This says nothing about **starting** with deep learning or not, but it's a good argument against **stopping** at deep learning.
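To make the gradient boosting idea concrete, here is a toy sketch: boosting depth-1 regression stumps under squared loss, where each round fits a stump to the current residuals. Competition entries use heavily tuned libraries (XGBoost, LightGBM); this only shows the mechanism, with invented helper names.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=200)
y = np.sin(6 * X) + 0.1 * rng.normal(size=200)

def fit_stump(x, r):
    """Best single-threshold regressor: (threshold, left mean, right mean)."""
    best = None
    for t in np.linspace(0.05, 0.95, 19):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = np.mean((r - pred) ** 2)
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    return best[1:]

def stump_predict(x, stump):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

lr = 0.3
F = np.zeros_like(y)            # current ensemble prediction
for _ in range(100):
    residual = y - F            # negative gradient of squared loss
    s = fit_stump(X, residual)
    F += lr * stump_predict(X, s)

mse = np.mean((y - F) ** 2)
print(f"training MSE after 100 stumps: {mse:.4f}")
```

Each weak learner only needs to fix what the ensemble so far gets wrong, which is why the method is so hard to beat on tabular data.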

Over the last two days I single-handedly wrote a prototype that can take a whiteboard photo and automatically turn it into a mindmap-like zoomable chart. Pieces of the chart can then be rearranged and altered individually:

https://prezi.com/igaywhvnam2y/whiteboard-prezi-2015-12-04-152935/

This was part of a company hackathon, and I had some infrastructure to help me with the visualization, but for the shape recognition/extraction it was just me and the nasty Python bindings for OpenCV.
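The core extraction step can be sketched without OpenCV at all. The prototype itself used OpenCV's bindings; the numpy-only version below (with invented helper names and a synthetic image) just illustrates the idea: binarize the image, flood-fill connected components, and return each shape's bounding box.

```python
import numpy as np

def extract_shapes(img, thresh=128):
    """img: 2D uint8 array. Returns a list of (top, left, bottom, right)."""
    mask = img < thresh                  # ink is darker than the whiteboard
    labels = np.zeros(img.shape, dtype=int)
    current = 0
    boxes = []
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                labels[i, j] = current
                pts = []
                while stack:             # iterative flood fill, 4-connectivity
                    r, c = stack.pop()
                    pts.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = current
                            stack.append((rr, cc))
                rs = [p[0] for p in pts]
                cs = [p[1] for p in pts]
                boxes.append((min(rs), min(cs), max(rs), max(cs)))
    return boxes

# Tiny synthetic "whiteboard": white background, two dark blobs.
img = np.full((20, 30), 255, dtype=np.uint8)
img[2:6, 3:8] = 0      # shape 1
img[10:15, 20:26] = 0  # shape 2
print(extract_shapes(img))
```

Each bounding box then becomes a draggable element in the zoomable chart.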

Oh my god, look at the 0-4-year-old assaults, both ED visits and deaths. (Assault is the leading TBI-related cause of death for 0-4-year-olds.) Some of those falling 4-year-olds were assaulted.

There are worse fates than not being able to top your own discovery of general relativity.

That's not a top-level comment, so my script excluded it from this version. I won't manually edit the output, sorry. There's another version where non-top-level comments are kept, too. Your quote is in there:

Top quote contributors by statistical significance level; each entry reads p-value (mean karma in quote count): username.

- 0.00000 (23.11 in 45): Alejandro1
- 0.00007 (17.98 in 63): James_Miller
- 0.00016 (19.02 in 43): Stabilizer
- 0.00016 (25.25 in 16): dspeyer
- 0.00020 (18.69 in 45): GabrielDuquette
- 0.00052 (26.91 in 11): Oscar_Cunningham
- 0.00142 (24.33 in 12): peter_hurford
- 0.00183 (50.50 in 2): Delta
- 0.00252 (68.00 in 1): Solvent
- 0.00290 (19.35 in 23): Yvain
- 0.00352 (66.00 in 1): westward
- 0.00360 (24.78 in 9): Mestroyer
- 0.00529 (29.20 in 5): michaelkeenan
- 0.00591 (41.00 in 2): nabeelqu
- 0.00591 (41.00 in 2): VincentYu
- 0.00604 (60.00 in 1): RomeoStevens
- 0.00719 (24.00 in 8): philh
- 0.00725 (19.28 in 18): Tesseract
- 0.00780 (57.00 in 1): Zando
- 0.00820 (39.00 in 2): sediment
- 0.00830 (23.62 in 8): Qiaochu_Yuan
- 0.00871 (23.50 in 8): Maniakes
- 0.00993 (32.00 in 3): benelliott
- 0.01012 (15.17 in 64): Jayson_Virissimo
- 0.01226 (26.00 in 5): Ezekiel
- 0.01359 (49.00 in 1): Liron
- 0.01627 (23.67 in 6): AspiringRationalist
- 0.01711 (45.00 in 1): Mycroft65536
- 0.01816 (34.00 in 2): summerstay
- 0.02114 (43.00 in 1): bentarm
- 0.02134 (16.58 in 26): Kaj_Sotala
- 0.02265 (42.00 in 1): Andy_McKenzie
- 0.02600 (22.17 in 6): ShardPhoenix
- 0.03044 (30.50 in 2): gRR
- 0.03200 (24.00 in 4): Particleman
- 0.03435 (18.25 in 12): MinibearRex
- 0.03523 (37.00 in 1): andreas
- 0.03875 (36.00 in 1): NoisyEmpire
- 0.03876 (16.23 in 22): Grognor
- 0.04292 (28.00 in 2): roystgnr

Top quote contributors by karma score collected in 2014:

- 369 James_Miller
- 277 dspeyer
- 239 Jayson_Virissimo
- 181 Stabilizer
- 165 Alejandro1
- 163 lukeprog
- 146 arundelo
- 129 Salemicus
- 124 johnlawrenceaspden
- 117 Kaj_Sotala
- 117 B_For_Bandana
- 116 NancyLebovitz
- 110 Pablo_Stafforini
- 107 Gunnar_Zarncke
- 100 Eugine_Nier
- 97 aarongertler
- 94 shminux
- 90 Azathoth123
- 88 EGarrett
- 84 elharo
- 81 Benito
- 79 Torello
- 74 MattG
- 74 AspiringRationalist
- 73 satt
- 73 JQuinton
- 73 27chaos
- 67 Tyrrell_McAllister
- 66 Vulture
- 65 Cyan
- 62 michaelkeenan
- 60 WalterL
- 60 Ixiel
- 58 jaime2000
- 58 [deleted]
- 57 Zubon
- 55 Jack_LaSota
- 55 CronoDAS
- 52 Vaniver
- 52 hairyfigment

Top quote contributors by total (2009-2014) karma score collected:

- 1394 RichardKennaway
- 1133 James_Miller
- 1040 Alejandro1
- 1037 [deleted]
- 978 gwern
- 971 Jayson_Virissimo
- 847 lukeprog
- 846 Eugine_Nier
- 841 GabrielDuquette
- 827 Eliezer_Yudkowsky
- 818 Stabilizer
- 775 Rain
- 750 MichaelGR
- 734 NancyLebovitz
- 628 Konkvistador
- 590 anonym
- 521 CronoDAS
- 479 arundelo
- 445 Yvain
- 434 RobinZ
- 431 Kaj_Sotala
- 404 dspeyer
- 372 Alicorn
- 357 Grognor
- 353 Vaniver
- 347 Tesseract
- 332 shminux
- 328 DSimon
- 296 Oscar_Cunningham
- 296 billswift
- 293 Pablo_Stafforini
- 292 peter_hurford
- 284 Nominull
- 277 jsbennett86
- 271 katydee
- 263 RolfAndreassen
- 262 Thomas
- 237 Kutta
- 229 roland
- 224 Cyan

I think I misunderstand your definition. Let feature a be represented by x_1 > 0.5, and feature b by x_2 > 0.5, where the x_i are i.i.d. uniform on [0, 1]. Isn't that a counterexample to (a and b) being linearly representable?
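A quick numeric check of the counterexample (the specific points are my own choice): a positive point sits exactly at the midpoint of two negative points, so no halfspace over the raw x_1, x_2 can separate the classes.

```python
import numpy as np

# Label: (x_1 > 0.5) AND (x_2 > 0.5).
def label(x):
    return bool(x[0] > 0.5 and x[1] > 0.5)

n1 = np.array([1.0, 0.4])   # a but not b -> negative
n2 = np.array([0.4, 1.0])   # b but not a -> negative
m = (n1 + n2) / 2           # midpoint (0.7, 0.7) -> positive

print(label(n1), label(n2), label(m))
# If some w, b had w.x + b > 0 exactly on the positives, then w.n1 + b < 0
# and w.n2 + b < 0 would average to w.m + b < 0, contradicting that m is
# positive. So no linear separator exists.
```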