Hacker News with inline top comments - 1 Sep 2012
1
BranchOut Falsifies Wall Street Journal Quote socialsplat.net
101 points by robbiet480  6 hours ago   34 comments top 15
1
therealarmen 5 hours ago 1 reply      
I always find it a little fishy when startups put up glowing quotes from news outlets without linking back to the original article. Linking takes minimal effort on their part, and it's generally better to read the quote in context anyway. This only reinforces my suspicions.
2
JoshTriplett 5 hours ago 1 reply      
"Falsifies" seems like an odd choice of words here; I went to the article expecting that the Wall Street Journal made some statement that BranchOut managed to disprove, not that BranchOut faked a quote from the Wall Street Journal.
3
staunch 5 hours ago 2 replies      
Sure seems like that's the case.

  No results found for site:wsj.com "create secure professional profiles".

Tried a few other substrings from the quote -- none matched.

4
notlisted 3 hours ago 0 replies      
You sound surprised? BranchOut is based on milking the LinkedIn social graph using FB as a viral spam host.

http://www.switched.com/2011/01/11/branchout-helps-you-land-...

They took a nice page out of the FB playbook, i.e. import as much data from your competitors as you can before they cut you off. I've never understood why LinkedIn gave them access to the API to begin with.

http://techcrunch.com/2011/07/01/linkedin-cuts-off-api-acces...

Now, is there anyone here who fell for their spammy messages and actually USES BranchOut?

5
twelvechairs 3 hours ago 0 replies      
It's (sadly) quite common for news organizations to copy parts of a press release verbatim into an article. Not to say BranchOut aren't being dishonest here, but I wouldn't be surprised if this 'quote' was actually printed in the Wall Street Journal in some form or other - probably in some filler article like 'top 10 web startups' or some such...
6
raintrees 5 hours ago 2 replies      
As a side note, thanks to the article, I now know I use Oxford/serial commas.
7
pmarca 4 hours ago 0 replies      
It's straight out of their own press release boilerplate!
8
cpeterso 4 hours ago 0 replies      
So where did the quote come from? Was the text from a BranchOut press release that happened to be syndicated in the Wall Street Journal? For example, the quote shows up in a press release on yahoo.com in a format that resembles a news article:

http://finance.yahoo.com/news/branchout-raises-25-million-su...

9
jmathai 4 hours ago 1 reply      
Waiting for them to come out with an announcement that it was an oversight on their part and it's been removed from their site.
10
nugget 3 hours ago 0 replies      
Anyone who works with these clowns knows they are full of crap. Facebook spam != real users.
11
jval 5 hours ago 1 reply      
#hustle2win - how many more of these are on startup websites?
12
DuskStar 4 hours ago 1 reply      
Site down for anyone else? Using Google's cached copy... Gotta love the Hacker News effect
13
mjcohenw 6 hours ago 0 replies      
It's only fair.
14
stephenhandley 2 hours ago 0 replies      
Gamification 4evr
15
bicknergseng 2 hours ago 1 reply      
Meanwhile... presidential candidates falsify just about everything else. Ironic double standard?
3
Tacocopter Basics danshapiro.com
13 points by danshapiro  2 hours ago   2 comments top
1
lutusp 1 hour ago 1 reply      
Okay HN developers -- when you get done reading the linked article and are finished laughing at the thought of a little helicopter delivering a taco or a bottle of beer, start thinking:

* Little helicopters can now lift a substantial weight.

* They aren't very expensive.

* They're easily controlled, more so than a full-sized helicopter (primarily because of computer-aided controls and GPS guidance). So you don't have to be Chuck Yeager to fly one.

* All you need to do is mate the helicopter with a decent camera that can simultaneously beam a picture to the ground for guidance and preview, and take high-resolution pictures on command by way of the radio link.

* Uses: real estate (an industry that desperately needs a way to take high-quality pictures of houses from above), surveillance, art, video productions, etc.

This is an opportunity waiting for someone willing to take it on.

4
LG demonstrates wireless Linux Web pad at CeBIT linuxfordevices.com
42 points by bane  5 hours ago   13 comments top 6
1
ktizo 1 hour ago 0 replies      
2
akandiah 4 hours ago 1 reply      
This article was published on 2001-03-23. However, it's interesting to see that they called it the "Digital iPAD". Wonder why it never took off? Why didn't they ever trademark the name?

I notice that Gizmodo's already published a story based on this article: http://www.gizmodo.com.au/2012/09/check-out-lgs-ipad-from-20...

3
nkassis 1 hour ago 0 replies      
That's very interesting. I wonder if Apple and LG ever reached an agreement on the name.

Was LG's tablet ever released to the public?

4
DiabloD3 4 hours ago 1 reply      
Wow, what a blast from the past. 64MB of memory, 206MHz Intel StrongARM, "Internet appliance" was still a term, and 802.11b was still new.
5
Nux 1 hour ago 2 replies      
So isn't this like "prior art" relevant to the recent Samsung vs Apple "patent" trials?
6
tvon 4 hours ago 1 reply      
It's not entirely clear to me why this matters at all.
5
Amazon S3 - Cross Origin Resource Sharing Support aws.typepad.com
113 points by jeffbarr  10 hours ago   32 comments top 11
1
theli0nheart 9 hours ago 6 replies      
Hah, about time.

About 6 months ago I rewrote the Let's Crate (https://letscrate.com) backend to work exclusively with Amazon S3 Direct POST uploads. Getting upload progress to work with that was a royal PITA, but in the end I got it working. If you're interested in how, perhaps that's a good subject for a far more lengthy post on how to write extremely convoluted Javascript. I gave myself a pat on the back (no flash, yay!) and vowed to never do anything like that again.

As requested: Basically, the gist is that you accept the upload via a local JS file that acts as a conduit. You then turn the dropped / selected file object into a blob object and transfer that blob to a JS file that lives on S3 (using postMessage and a hidden iframe). That JS file on S3 is what actually performs the upload and tracks the upload progress. On progress events, I send back postMessage payloads to the local JS file to show updates to the user.

Convoluted, but it works. :)
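For anyone trying to picture that flow, here is a minimal sketch of the conduit pattern described above. Everything in it is invented for illustration -- origins, file names, and message shapes -- and a real S3 POST would also need the signed key/policy/signature fields:

  // local.ts -- runs on the app's own origin and acts as the conduit.
  const S3_ORIGIN = "https://uploads.example-bucket.s3.amazonaws.com"; // hypothetical

  const frame = document.createElement("iframe");
  frame.style.display = "none";
  frame.src = S3_ORIGIN + "/uploader.html"; // the JS file that lives on S3
  document.body.appendChild(frame);

  // Hand a dropped/selected File (a Blob) to the S3-hosted frame.
  function sendFile(file: File): void {
    frame.contentWindow!.postMessage({ kind: "upload", blob: file }, S3_ORIGIN);
  }

  // Progress comes back as postMessage payloads from the frame.
  window.addEventListener("message", (e: MessageEvent) => {
    if (e.origin !== S3_ORIGIN) return; // only trust the uploader frame
    if (e.data && e.data.kind === "progress") {
      console.log(Math.round((e.data.loaded / e.data.total) * 100) + "%");
    }
  });

  // uploader.ts -- served from the S3 bucket itself, so its XHR to the
  // bucket is same-origin and upload progress events are visible.
  window.addEventListener("message", (e: MessageEvent) => {
    if (!e.data || e.data.kind !== "upload") return;
    const reply = e.source as Window;
    const form = new FormData(); // would also carry the signed policy fields
    form.append("file", e.data.blob);
    const xhr = new XMLHttpRequest();
    xhr.upload.onprogress = (p: ProgressEvent) =>
      reply.postMessage({ kind: "progress", loaded: p.loaded, total: p.total }, e.origin);
    xhr.open("POST", "/", true);
    xhr.send(form);
  });

The key design point is that the page served from the bucket is same-origin with the upload target, so only the postMessage hop ever crosses origins.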

2
chao- 1 hour ago 0 replies      
As excited as I am about this finally happening, I was so pissed about having to deal with this issue over and over (e.g. JS files describing WebGL models) that I was on the verge of starting a service to provide the layer of redirection with CORS support, a la what Heroku does for EC2. I was actually getting a bit psyched for it, because I was convinced Amazon didn't care about ever implementing this.

At least now I won't launch something only to have Amazon eat my lunch when they finally came around to providing this much-needed feature.

3
TazeTSchnitzel 36 minutes ago 0 replies      
Great.

Could somebody explain CORS to me? How is it secure to have the server you're contacting specify, in the response header, that it wants to receive requests? The request has already been made!

4
akoumjian 9 hours ago 0 replies      
It's been 3.5 long years since the first feature request, but thank you!
5
purephase 1 hour ago 0 replies      
This must have been difficult to work out. My thanks (or condolences?) to Amazon for this. It will make my life a bit easier!
6
RoboTeddy 8 hours ago 1 reply      
Finally, I won't have to proxy s3 requests through my own nginxes.

I've pled for this feature in the AWS forum, over their commercial support (which I bought just to bug them about this), and to Werner Vogels directly.

Thanks jeffbarr!

7
ceejayoz 8 hours ago 1 reply      
OK, now how about CloudFront?
8
nathancahill 9 hours ago 0 replies      
Yes! Fonts in Firefox will work now!
9
46Bit 8 hours ago 0 replies      
Awesome. Will this allow us to load images into canvas without security errors?
10
throw_away 7 hours ago 1 reply      
somewhat shocked they pushed this out on a Friday, right before the weekend.
11
logical42 10 hours ago 0 replies      
oh my god it is about friggin' time.
6
Ask HN: Review my web app? Trying to get out of slump. wysp.ws
26 points by fchollet  4 hours ago   19 comments top 8
1
nanijoe 41 minutes ago 0 replies      
It's not immediately apparent what your site does. Instead of the cryptic "unlike other websites, wysp is not about showcase etc", can you just write something as straightforward as "Learn how to draw" or "Let us help you improve your pencil skills", or whatever message directly tells your target audience what it is exactly you do?
2
thejerz 4 hours ago 2 replies      
Your practice engine idea is really smart, but it isn't at all immediately apparent upon visiting the site. At first, your website looks like any "portfolio site" (DeviantArt, Dribbble, etc.). To make the practice engine the focus of the website, I would consider a new homepage design that focuses on the "learn to draw" aka "practice engine" aspect of the website, as opposed to a portfolio showcase. There are lots of portfolio sites, but very few websites that teach you how to draw on a structured learning path. It is a great idea, stick with it!
3
kevinconroy 3 hours ago 1 reply      
Agree with other comments that at first use it's not clear what the site is for, but it shows a lot of potential. Have you done any A/B testing? If not, check out https://www.optimizely.com/

Also, are you sending email to users whenever they get comments? Emails to people who sign up but never post anything? Welcome emails to people who sign up telling them what to do? Create sets of automated email campaigns and it will dramatically improve your usage stats. Articles from Patrick trend on HN all the time, but here's a link in case you missed it (+1 from my professional experience): http://www.kalzumeus.com/2012/05/31/can-i-get-your-email/

Best of luck to you!

4
pemmigiwhoseit 1 hour ago 0 replies      
Positives: 1. Generally great idea. 2. Very nice CSS, looks beautiful. 3. Really like the blank page section - it really encourages people to get involved (make this more prominent).
Advice/Criticism: Just looked at it for a few minutes, but here are my first-impression thoughts:
1. You say on the home page it is for practice and personal progress, but all I see is other people's art that is way better than mine. 2. You feature the practice engine but it only has two courses and feels incomplete (plus it looks like you are asking for my money before you convince me I should use your site / before I even fully understand the purpose). Maybe try to feature your best feedback coupled with the piece instead of your most popular art. Add more classes and make a couple of them (more obviously) free. Encourage me to upload art from the start - right now it feels like I should focus on other people's art instead of my own.
5
etherealG 2 hours ago 0 replies      
focus on what sets you apart: the practice engine. on your homepage you have text explaining this just under the nav, but people are generally more visual, as I'm sure you know -- try using something more visual to explain how this practice engine works, maybe?

by way of an example, http://www.sublimetext.com/ has a set of features that really make it stand out as a text editor. they could be written out in text, but it's hard to understand what they are when described in words. instead the author has put up some animated pictures showing the actual features in use. this presentation instantly shows how you would use those features and most people understand right away where the benefit is.

i'm not sure animation would work for yours, but perhaps a swapping image showing a particular artist's progress over a few weeks/months.

just an idea, hope it helps. 1 other tiny thing that got to me: the top nav text isn't vertically centered in the space. it's a small gripe, but sometimes little details can make the difference, especially on a site for artists.

edit: another thing that comes to mind: you feature art on the homepage in a similar style to a portfolio, including what's new or popular. what if instead you featured the artist, and then changed between pictures of their progress up to that point? you could highlight the learning aspect by showing the progress of each artist you feature on the homepage, rather than just 1 piece of art by that artist.

6
georgeecollins 3 hours ago 1 reply      
I agree with other comments that this looks like a portfolio site while your USP is teaching.

I tried your tool and I didn't really understand how it taught me anything. It put up a picture, waited for me to draw in your limited drawing tool, then asked me to evaluate myself. I just scribbled and gave myself four stars and everything continued happily.

It seems like the practice engine lacks useful feedback, but maybe that is a feature I would get if I signed in?

7
wisty 3 hours ago 1 reply      
You have 2 courses. The first prominently displays $2.00 price, and then says "$2.00 (one session free!)". The second is free.

But when I scanned it, all I saw was "$2.00", not the freebies.

Maybe style the buttons, so they display what is free?

8
fchollet 4 hours ago 2 replies      
I have a pretty high churn rate, so I've recently been rolling out new features to try to stay afloat. User feedback is positive, but the usage stats aren't.

Any insight as to what I could be doing wrong?

8
Jiro Dreams of Sushi magpictures.com
45 points by hboon  6 hours ago   20 comments top 10
1
rwmj 3 minutes ago 0 replies      
There's an interesting review of sorts of the restaurant here:
http://www.cookingissues.com/2012/06/08/tokyo-tales-300-of-s...
2
siglesias 3 hours ago 1 reply      
For the uninitiated, the subtext of this movie is pondering whether Jiro's son Yoshikazu will be able to successfully take over the restaurant when Jiro retires. It's the age-old question of what it takes for an apprentice to finally surpass the master; who is a worthy successor to the master, etc.

This is particularly relevant because the valley's succession story du jour is Apple and whether Tim Cook et al can take the reins in the wake of Steve Jobs. The following quote struck me:

"It's not going to be easy for Yoshikazu to succeed his father at the same restaurant. Even if Yoshikazu makes the same level of sushi it will still be seen as inferior. If Yoshikazu makes sushi that's twice as good as Jiro's, only then will they be seen as equal." (32:06)

This is exactly what Apple has been going through in the last year, exacting a level of polish that is on par if not above what they released last year, but still leaving nagging doubts in the hearts of the faithful. The one thing that would silence critics and quell fears would be that something twice as revolutionary as the original iPhone be straight up imagined, developed, and hoisted by the post-Jobs Apple--just to claim par.

3
staunch 22 minutes ago 0 replies      
One could probably make 100,000 more documentaries approximately identical to this one at other family owned businesses. The only thing particularly unique about this place is that it came to the attention of Michelin and then a documentarian. I don't mean that as a knock against Jiro's place at all, I just think the filmmaker overplayed the story. That and the combination of Japan and sushi made the situation seem even more exotic and rare than it really is.

It was entertaining though. I especially liked the part when he said something like "Welp. I'm ready to go. Why am I even here [at his parents' shrine]? My parents treated me like crap."

4
jonny_eh 2 hours ago 0 replies      
It's currently streaming on Netflix (in the US at least): http://movies.netflix.com/WiMovie/70181716
5
marban 38 minutes ago 0 replies      
I would rank it alongside a typical Gary Hustwit movie, but it did become repetitive at times.
Obviously not as thrilling as Man on Wire but when it comes to documentaries about devoting every waking minute to your passion, it's a top pick.
6
astrojams 4 hours ago 2 replies      
One of my favorite documentaries of this year. I LOVED this movie.
7
hboon 3 hours ago 0 replies      
If you are really into your craft and/or love sushi, this is a great movie for you.
8
dleibovic 3 hours ago 1 reply      
I'll be the dissenter here -- I was not a fan of the movie. It was a boring piece about a man obsessed to the point of craziness with sushi. His kids said to their mother: "Mommy, who is the strange man in our house?"

The man was Jiro, their father. Is that the childhood you want your children to live? For me, the mastery of a craft is not worth this price.

9
breakyerself 3 hours ago 0 replies      
I don't know what it was, but I was in awe of this guy, his family, and the whole culture throughout the whole movie.
10
pheon 3 hours ago 2 replies      
Note that the full menu costs $300 and you will finish in 30 minutes. Also, reviews of his son's restaurant say they get angry if you don't eat everything...
10
Flynn's IQ bryanappleyard.com
165 points by tortilla  15 hours ago   124 comments top 15
1
tokenadult 15 hours ago 2 replies      
Links to information about the book under review, Are We Getting Smarter?: Rising IQ in the Twenty-First Century by James R. Flynn:

http://www.cambridge.org/gb/knowledge/isbn/item6835805/?site...

http://www.amazon.com/Are-We-Getting-Smarter-Twenty-First/dp...

(The three expert reviewers shown on Amazon are all very impressive researchers on human intelligence in their own right, so their joint endorsement of Flynn's book carries a lot of weight for people like me who follow the research.)

Here is what Arthur Jensen said about Flynn back in the 1980s: "Now and then I am asked . . . who, in my opinion, are the most respectable critics of my position on the race-IQ issue? The name James R. Flynn is by far the first that comes to mind." Modgil, Sohan & Modgil, Celia (Eds.) (1987) Arthur Jensen: Consensus and Controversy New York: Falmer.

AFTER EDIT: Replying to another top-level comment:

I don't understand how anyone could not have an emotional response being told 'your IQ is x'.

People have emotional responses to most statements about themselves that they think are overall evaluations. Some of those emotional responses are more warranted than others. Devote some reading time to the best literature on IQ testing (besides the book under review in this thread, that would include Mackintosh's second edition textbook IQ and Human Intelligence

http://www.amazon.com/IQ-Human-Intelligence-Nicholas-Mackint...

and the Sternberg-Kaufman Cambridge Handbook of Intelligence,

http://www.amazon.com/Cambridge-Handbook-Intelligence-Handbo...

both recently published). Any of these books will help readers understand that IQ tests are samples of learned behavior and are not exhaustive reports on an individual's profile of developed abilities.

AFTER ANOTHER EDIT:

Discussion of heritability of IQ, a reliable indicator of how much discussants read the current scientific literature on the subject, has ensued in some other subthreads here. Heritability of IQ has nothing whatever to do with malleability (or, if you prefer this terminology, controllability) of human intelligence. That point has been made by the leading researchers on human behavioral genetics in their recent articles that I frequently post in comments here on HN. It is a very common conceptual blunder, which should be corrected in any well-edited genetics textbook, to confuse broad heritability estimates with statements about how malleable human traits are. The two concepts actually have no relationship at all. Highly heritable traits can be very malleable, and the other way around.

Johnson, Wendy; Turkheimer, Eric; Gottesman, Irving I.; Bouchard Jr., Thomas (2009). Beyond Heritability: Twin Studies in Behavioral Research. Current Directions in Psychological Science, 18, 4, 217-220

http://people.virginia.edu/~ent3c/papers2/Articles%20for%20O...

is an interesting paper that includes the statement "Moreover, even highly heritable traits can be strongly manipulated by the environment, so heritability has little if anything to do with controllability. For example, height is on the order of 90% heritable, yet North and South Koreans, who come from the same genetic background, presently differ in average height by a full 6 inches (Pak, 2004; Schwekendiek, 2008)."

Another interesting paper,

Turkheimer, E. (2008, Spring). A better way to use twins for developmental research. LIFE Newsletter, 2, 1-5

http://people.virginia.edu/~ent3c/papers2/Articles%20for%20O...

admits the disappointment of behavioral genetics researchers.

"But back to the question: What does heritability mean? Almost everyone who has ever thought about heritability has reached a commonsense intuition about it: One way or another, heritability has to be some kind of index of how genetic a trait is. That intuition explains why so many thousands of heritability coefficients have been calculated over the years. Once the twin registries have been assembled, it's easy and fun, like having a genoscope you can point at one trait after another to take a reading of how genetic things are. Height? Very genetic. Intelligence? Pretty genetic. Schizophrenia? That looks pretty genetic too. Personality? Yep, that too. And over multiple studies and traits the heritabilities go up and down, providing the basis for nearly infinite Talmudic revisions of the grand theories of the heritability of things, perfect grist for the wheels of social science.

"Unfortunately, that fundamental intuition is wrong. Heritability isn't an index of how genetic a trait is. A great deal of time has been wasted in the effort of measuring the heritability of traits in the false expectation that somehow the genetic nature of psychological phenomena would be revealed. There are many reasons for making this strong statement, but the most important of them harkens back to the description of heritability as an effect size. An effect size of the R2 family is a standardized estimate of the proportion of the variance in one variable that is reduced when another variable is held constant statistically. In this case it is an estimate of how much the variance of a trait would be reduced if everyone were genetically identical. With a moment's thought you can see that the answer to the question of how much variance would be reduced if everyone was genetically identical depends crucially on how genetically different everyone was in the first place."

The review article "The neuroscience of human intelligence differences" by Deary and Johnson and Penke (2010) relates specifically to human intelligence:

http://www.larspenke.eu/pdfs/Deary_Penke_Johnson_2010_-_Neur...

"At this point, it seems unlikely that single genetic loci have major effects on normal-range intelligence. For example, a modestly sized genome-wide study of the general intelligence factor derived from ten separate test scores in the cAnTAB cognitive test battery did not find any important genome-wide single nucleotide polymorphisms or copy number variants, and did not replicate genetic variants that had previously been associated with cognitive ability[note 48]."

The review article Johnson, W. (2010). Understanding the Genetics of Intelligence: Can Height Help? Can Corn Oil?. Current Directions in Psychological Science, 19(3), 177-182

http://apsychoserver.psych.arizona.edu/JJBAReprints/PSYC621/...

looks at some famous genetic experiments to show how little is explained by gene frequencies even in thoroughly studied populations defined by artificial selection.

"Together, however, the developmental natures of GCA and height, the likely influences of gene"environment correlations and interactions on their developmental processes, and the potential for genetic background and environmental circumstances to release previously unexpressed genetic variation suggest that very different combinations of genes may produce identical IQs or heights or levels of any other psychological trait. And the same genes may produce very different IQs and heights against different genetic backgrounds and in different environmental circumstances."

2
Aloisius 11 hours ago 4 replies      
Wait, people take IQ tests seriously? Really?

I was tested twice when I was in second grade, the first time by a psychologist in a class setting and the second time one-on-one to verify the first. I tested quite high, but even then I knew I wasn't noticeably smarter than others; I just test particularly well.

Frankly, it cracks me up when someone refers to their IQ seriously or boasts about being in Mensa. Other than testing for mid-to-high level mental retardation, subjectively, IQ tests seem to be terrible at measuring actual brilliance.

3
ilaksh 14 hours ago 1 reply      
So the vast majority of people misunderstand intelligence, at least as measured by IQ tests, as being a limiting factor for performance and potential, while it is actually just a measurement of current abilities.

> the penultimate chapter is a list of 14 examples in which science has failed because of social blindness.

This carries through more broadly and generally to the application of many incorrect fundamental assumptions to the design of our institutions, which consistently fail because of the resulting flawed structures.

4
marquis 15 hours ago  replies      
Am I the only one who refuses to take an IQ test? I hate being tested - especially with a time limit - and I rarely play games (chess and logic puzzles being the exception), but above all I don't understand how anyone could not have an emotional response to being told 'your IQ is x'.
5
pmb 9 hours ago 0 replies      
Two wonderful companions to this article are "Thinking Intelligence is Innate Makes You Stupid" http://lemire.me/blog/archives/2007/12/03/thinking-intellige... and (especially) Cosma Shalizi's article on the malleability and heritability of IQ http://cscs.umich.edu/~crshalizi/weblog/520.html
6
Claudus 14 hours ago 2 replies      
The Flynn effect tends to be used to support the feeling that intelligence isn't something you're born with.

People who think that intelligence is innate will refer to the "g factor".
http://en.wikipedia.org/wiki/G_factor_%28psychometrics%29

The racist (technically speaking) and controversial J. Philippe Rushton believes that "gains in IQ over time (the Lynn-Flynn effect) are unrelated to g".

What's really going on with discussions like this is that humanity seems to have come to the conclusion that "smarter is better". Intelligence almost certainly exists, but is quite difficult to quantify exactly, and is further complicated by the fact that people resent others who are "better" than them. Given that there is a genetic component to intelligence to a certain degree, race becomes a factor and leads to a line of inductive reasoning that makes people uncomfortable:

  1) Smarter is better
2) Intelligence is genetic
3) Race determines genetics
4) One race is better than another
5) Hitler was right (or other outrageous conclusion)

As far as I can tell, when you see terms like "g factor", "Flynn effect", "multiple intelligences", "crystallized intelligence", "fluid intelligence", "culture-fair IQ tests", "The Bell Curve"... people are really secretly arguing about one race or sex being smarter than the other, or not.

7
kevinpet 14 hours ago 1 reply      
I have no idea what is the author of the article's opinion and what is from the book being reviewed. I'm not even sure what the title of the book being reviewed is.
8
mistercow 7 hours ago 0 replies      
> For this to happen, evolution would have had to have accelerated to light speed.

What? No... no. There are plenty of ways that intelligence could have dramatically increased since the '30s without any evolutionary effects.

9
pud 11 hours ago 0 replies      
Unrelated to the content, when I use Command-plus to increase the font size on this blog (in Chrome, at least), the font doesn't grow.

Which means I can't read it, because my eyes suck.

10
bearmf 14 hours ago 3 replies      
And still IQ is very good at predicting various life outcomes: probability of being arrested, income, number of children.

Group differences in IQ do exist, and Flynn effect does not make them go away. Flynn effect increases scores across the board, it does not equalize different groups.

Black people in US have lower average IQ than white people do, Asian people have higher average IQ. The reasons for this are many, but genetics certainly comes into play: IQ is heritable.

Trying to explain away group differences by "culture" is mostly bad science - trying to make the facts fit your desired conclusions.

11
Alex3917 10 hours ago 0 replies      
"contrary to widespread assumptions, no clear link between nutrition and IQ has been found."

Someone call up Paul Krugman and tell him that iodine deficiency doesn't actually cause mental retardation.

12
Dylan16807 13 hours ago 3 replies      
These people aren't any less intelligent than the researcher -- their minds just work differently. They focus on the practicalities they know rather than hypothetical possibilities.

That's going to need some explanation. First off, I can see how the question itself is abstract, about a place they have never been, but "no camels" sounds extremely concrete to me. Second, how is ignoring information anything other than a lapse in intelligent thought?

13
pixelcort 13 hours ago 1 reply      
OT: Just me, or does browser zoom not seem to increase font size for this page in Safari and Chrome?
15
aaron695 11 hours ago 0 replies      
To not know the answer to "there are no camels in Germany, so how many camels do they think there are in B, a specific German city?" is not a cultural difference; to think that other cultures are seriously that dumb is more a statement on one's self than anything else.

This is similar to statements engineers sometimes make that the Chinese are good copiers but can't innovate.

No, cultures are the same as everyone else. They have humour, art and like to tinker and have in jokes. If they don't publicly innovate it's more likely it's not economical in that environment yet.

The fact 'camels' was used has strong undertones to me; if this was actually a study I'd be interested to know.

11
Birds hold 'funerals' for dead bbc.co.uk
67 points by zoowar  7 hours ago   18 comments top 6
1
jerf 5 hours ago 5 replies      
There's some projection here. We see some birds gathering around a corpse, we know that if we did that we would be "mourning", but that does not mean that they are. Stopping eating for a day may sound like mourning, but it also sounds like a very sensible defense against poisoning.

They may be; it is true that many birds exhibit complex behavior. But at least based on the info given in this article, there are a lot of unjustified assumptions about the internal states of the birds' brains based on deep, deep subconscious assumptions about how humans would be feeling if we saw humans acting that way. I would consider it just as likely that to the extent they are "feeling" something, it would be something with no human analog.

2
Spooky23 6 hours ago 0 replies      
These sorts of stories are always fascinating to me. If you look back at human history, one of the big differentiators between human cultures is how we handle our dead.

Was the origin of that behavior a sort of opportunity for groups of humans to learn from the death of their friends? Or are our emotions a sort of outgrowth of the behaviors these birds display?

3
mark_l_watson 5 hours ago 0 replies      
This sounds reasonable. When we lived by the beach there was a blue jay who, after accepting food from me on our deck for about a year, started one day landing on my legs when I was on a lounge chair to get more food. This behavior lasted for a few years until a cat got him.

Now, years later, we have a domesticated Meyers Parrot and his behavior is very complex and interesting.

4
georgemcbay 4 hours ago 0 replies      
For anyone that finds this interesting, I highly recommend "A Murder of Crows", which was made a few years ago and is where I first heard of the idea of birds holding funerals:

http://www.pbs.org/wnet/nature/episodes/a-murder-of-crows/fu...

5
autophil 3 hours ago 1 reply      
Lots of findings coming out lately about animals having empathy and demonstrating qualities we humans think we're unique for.

And it makes me sad. We're not considerate towards animals. We hardly give them a second thought. We eat them, cage them and experiment on them.

Humans need to be better "caretakers" of the planet. We need advances in compassion towards animals and the environment, not more advances in technology.

6
michaelbuckbee 4 hours ago 0 replies      
I'm reminded of the National Geographic story (and iconic photo) of the chimpanzees grieving over the death of one of their own.

http://blogs.ngm.com/blog_central/2009/10/the-story-behind-o...

13
Now In 600+ Schools, Lore Gives Higher Ed A Next-Gen Social Network techcrunch.com
14 points by irunbackwards  3 hours ago   2 comments top
1
hypersoar 2 hours ago 1 reply      
"That's because context is key for social networks, Cohen says. Students don't want to 'friend' their professors on Facebook (or connect with them on LinkedIn in any significant capacity) and they don't want to broadcast photos from last night's party to the world of family members and grandmas on Facebook."

Hasn't Google+ essentially solved a generalization of this problem with Circles? Creating a whole new account just to keep your networks separate seems very clunky (and highly inelegant) by comparison.

14
Prevent cold boot attacks using TRESOR uni-erlangen.de
15 points by thomas-st  4 hours ago   3 comments top 2
1
rdl 3 hours ago 1 reply      
TreVisor and TRESOR are some of the most amazing things going on. Combine that with Qubes out of Invisible Things, and you could conceivably trust a computer built of untrustworthy hardware operated by your worst adversary, as long as you trust a tiny subset of it -- potentially just the Intel CPU.

The nice thing about trusting just a latest-edition Intel CPU is that they're so far ahead of everyone else in process that most attacks would be technically difficult for anyone except Intel, NSA, etc. Chris Tarnovsky isn't going to be able to extract keys out of Intel E5 CPUs in 6 hours even with a lab 10x bigger than his $1.5mm one, so as long as you deal with a machine which disappears faster than 6h (rotating keys, releasing the hounds, etc.), you should be safe.

One of the few things (along with the takeover of mobile OSes vs. legacy crappy desktop OSes) which makes me hopeful for security.

2
gingerlime 1 hour ago 0 replies      
Sounds interesting. Very clever thinking.

Just hope it doesn't get turned on its head for yet another layer of DRM that stops us from accessing our own content.

I guess (or hope) that seeding the key into the CPU in the first place is what's going to make it hard for content owners to use for DRM?

15
Valve Finds Value In Open-Source Drivers phoronix.com
120 points by macco  15 hours ago   47 comments top 9
1
geoffhill 14 hours ago 3 replies      
> Valve has granted these Intel Linux developers complete access to the game's source-code, including the Source Engine. This has allowed Intel's Linux developers to better investigate possible optimizations and tweaks to their driver in order to enhance Source-powered games. Valve has even given them commit access to push back changes to the game company.

That sure is something EA and Ubisoft wouldn't do in a million years.

2
nathanb 14 hours ago 1 reply      
> Valve makes games people actually want to run, rather than most of the games we work with now...

BURN!

And I know I've said this before, but there are open-source drivers for the other vendors' cards as well...and I think they'd benefit tremendously from some Valve love.

3
bryanh 14 hours ago 2 replies      
I'm guessing Valve's foray into Linux is more or less a hedge against Microsoft. If Windows 8 is a massive flop, the next best option is Mac or straight Linux. If Windows 8 is a success, well, I suppose it is business as usual albeit with some extra pressure from the Windows app store.
4
jlgreco 15 hours ago 0 replies      
I think an interesting takeaway from this article even for those who are not really interested in gaming on Linux is that the benefits of open source development can be seen even when source isn't made available to the world, but just within or between organizations.

That is probably something most HNers know or suspect already, but this seems like a particularly clean proof of that concept.

5
batgaijin 14 hours ago 2 replies      
Playing CS:GO on Fedora 17 with the latest nVidia drivers is an absolute nightmare. However, the framerate and audio lag are doing wonders for building my patience.
6
TazeTSchnitzel 13 hours ago 0 replies      
OK, I was thinking of getting a Mac, but now I'm reminded that in not too long I will hopefully be able to play TF2 without WINE...

I have renewed hope.

7
sirlancer 11 hours ago 1 reply      
It will be interesting to see what Valve _recommends_ as their preferred Linux distribution and even more so if they roll their own.
8
nelmaven 14 hours ago 1 reply      
This will surely increase Linux's popularity as a platform.
9
taskstrike 15 hours ago 3 replies      
too many ads on that page.
17
What powers Etsy etsy.com
140 points by johnzimmerman  18 hours ago   32 comments top 4
1
lordlarm 16 hours ago 4 replies      
The Supermicro servers are really, really great - one of the best investments I've made so far in 2012.

I bought mine (a Supermicro 2U dual-3GHz-Xeon, 4GB, 12-bay storage server) for $400 on eBay, with great service and terms. Recommended if you are ever in need of (or just want) a private server rack unit (1U, 2U or more).

2
druiid 16 hours ago 4 replies      
It's very cool to see these kinds of posts. I always learn something from them and it's always neat to see what others have built to deal with a specific problem.

I wonder in this case why (or if there's any specific reason) Etsy is not utilizing virtualization of one kind or another? Basically they're building dumb boxes, when it seems a much better fit to have a good number of slightly beefier VM hosts and just rebuild bad instances from images (this would be especially good since they're already utilizing Chef).

I know that as an e-commerce firm, moving toward this kind of structure has helped us substantially.

3
mbell 15 hours ago 3 replies      
I noticed something like this several times in the article: "Each machine has 2x 1gbit ethernet ports, in this case we're only using one."

It sounds like a single switch failure could take down a large amount of their infrastructure. The seeming lack of geographic (or even datacenter) redundancy also seems a little dangerous at this scale.

4
chrismealy 14 hours ago 2 replies      
Anybody know what they use hadoop for?
18
How Tracking Down My Stolen Computer Triggered a Drug Bust makezine.com
95 points by scottshea  14 hours ago   53 comments top 13
1
maqr 13 hours ago 1 reply      
I'd rather lose the laptop but have full disk encryption and keep my data secure.
2
msutherl 12 hours ago 1 reply      
I had a similar experience that took place over the course of a single day, in which I tracked down the thieves in real time and confronted them in a parking lot. Instead of calling the cops, and since they had taken money and not an actual laptop, I had them give me collateral and gave them 2 weeks to give the money back in exchange, which they did.

I actually did write up and submit a police report, but it was about 3 months before I heard anything from the police.

Lessons learned: (1) sometimes it's better to roll up your sleeves and do it yourself and (2) some (most?) people legitimately want to come clean.

3
EliRivers 13 hours ago 1 reply      
It is nice to hear a story involving the police in which they're helpful and effective rather than spraying protesters/journalists/bystanders with pepper-spray.
4
omgsean 9 hours ago 1 reply      
Too bad that some guy is going to spend years and years in prison over drug charges when he should really only be charged with theft, but I guess the lesson is: don't go stealing traceable devices when you're running a dope operation.
5
ginko 1 hour ago 0 replies      
It's surprising that apparently almost no laptop thief takes the time to wipe the laptops.

That's the first thing I would do if I were in a business like that.

6
ck2 13 hours ago 2 replies      
Great read.

I am going to make all my mobile devices hit a webpage on a few of my servers silently on bootup (if there is a web connection) so I would at least have that IP. I'm also embedding a hidden image into the browser's about:blank (startup) page.
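The server side of a beacon like that can be tiny. Here is a minimal sketch in TypeScript/Node -- the path, port, and log file name are all invented -- that records the caller's IP; replying with an image content type means a hidden <img src=".../ping?id=laptop"> on the about:blank page triggers the same logging:

  // beacon.ts -- hypothetical phone-home endpoint: log who pinged and when.
  import * as http from "http";
  import * as fs from "fs";

  const server = http.createServer((req, res) => {
    if (req.url && req.url.startsWith("/ping")) {
      const ip = req.socket.remoteAddress || "unknown";
      // A timestamp plus source IP is exactly the evidence you'd want later.
      fs.appendFileSync("beacon.log",
        new Date().toISOString() + " " + ip + " " + req.url + "\n");
    }
    // Answer as an image so the hit works silently from an <img> tag too.
    res.writeHead(200, { "Content-Type": "image/gif" });
    res.end();
  });

  server.listen(8080);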

7
agumonkey 8 hours ago 0 replies      
As someone said in the article comments, if you liked that story, go watch:

http://www.youtube.com/watch?v=U4oB28ksiIo

      DEFCON 18 - Pwned By the Owner:
      What Happens When You Steal a Hacker's Computer (Zoz)

8
JoeAltmaier 12 hours ago 0 replies      
Mostly agree with his conclusions, but for one: I didn't think Batman was all that great.
9
dirkdk 13 hours ago 0 replies      
Great story! A friend of mine got her iPhone stolen; Find My iPhone didn't turn up anything for days, so she gave up on it and got another one. This kind of technology should be standard on any new device.
10
terrapinbear 11 hours ago 1 reply      
Oh Make Magazine, please put your "close this slide show" button on the upper right instead of the upper left. I'm conditioned to look for X icons to close out of modal windows on the upper right not the upper left. Thaaaaaaanks.
11
jason_slack 13 hours ago 1 reply      
Great read, and it brings me to ask: what options are there for hard drive encryption on OS X?
12
alexchamberlain 13 hours ago 0 replies      
Brilliant story!
13
sneak 12 hours ago 0 replies      
Idiot. Who leaves valuables in a car in Detroit? Are you not aware that the windows are all that keeps someone from popping the trunk?
19
The challenges of Desktop Linux gnome.org
57 points by rbanffy  9 hours ago   53 comments top 19
1
twelvechairs 32 minutes ago 0 replies      
To me, 'Desktop' Linux has never worked because the lofty aim of projects like GNOME to secure 'everyday' users is completely at odds with many of the basic design decisions of the Linux/Unix platform that underpin it. Things like storing data in string-based files, chaining small command-line tools to do everything, a historically eccentric file structure, and pluralism in every way (e.g. custom-built binaries on a per-machine basis) are never going to work for the 'everyday' user. You'd need an even bigger project than GNOME to change these design decisions (like redesigning the file structure and standardizing the hardware/software platform on which it runs, as OS X did).

That said, many of the components of GNOME (GTK, GStreamer, etc.) are great and we shouldn't forget their usefulness to other projects whilst the Gnome Desktop is coming up against these existential questions...

2
jiggy2011 7 hours ago 2 replies      
The second point: "Companies tend to depend on a myriad of applications to run their business, and just a couple of them not running under Linux would be enough to derail a transition to Linux desktops" is the real killer.

This is really the only reason that I can't recommend trying Ubuntu to more people. People cite usability concerns etc., and while there are some issues there, I think it's mostly "good enough" now, and we're long past the days of having to compile a .tar.gz full of .c files and fettle with vi in order to get sound to work.

So the issue is how to get third party developers interested. I think the best way is by including a really sexy app store. Ubuntu Software Centre is a start but it's still nowhere close to what Apple has achieved in this area. Nasty looking icons, inconsistent screen shots (some showing gnome2, others Unity) and thousands of free apps with weird names don't make it the most attractive place to shop.

In many ways though, I would consider desktop Linux a success regardless of marketshare, for the simple reason that it is now possible to "use a computer in freedom". I think the software world would be a bleaker place if Torvalds, Stallman et al. hadn't spent the hours pushing code. Imagine a world where the cheapest HTTP server license cost thousands of dollars.

3
programminggeek 5 hours ago 1 reply      
The challenge of Desktop Linux is that it's not a product, it's a project.

Ubuntu is a very successful desktop Linux distro. It's pleasant to use and very modern. Nerds might hate it because Unity doesn't fall in line with Linux "the project" so much as it's there to make Ubuntu "the product" better.

Overall, desktop Linux as an overarching product failed, but so did mobile Linux pre-Android; Android isn't so much mobile Linux as it is Android.

Open source is a bit like herding cats and if you don't have a real product you are trying to ship, devs will scratch their own itch.

4
winter_blue 5 hours ago 1 reply      
Actually, now is a great time to switch to Linux.

The only thing keeping my parents and some of my friends from switching was that they needed Microsoft Office for their work/schoolwork. As of now, MS Office runs without a hitch on Linux -- with PlayOnLinux (http://playonlinux.com/). Honestly, it's quite impressive how smoothly it runs (and how easy it is to install).

MS Office isn't the only application that PlayOnLinux supports - it also handles a ton of games and other software (Photoshop, Blender, Dreamweaver, Flash, etc.). To top all that off, I feel like the desktop on Linux has gotten better and better lately. I use KDE 4.9, and I will say it is quite nice. The level of integration KDE offers and the high quality of many of the standard apps that come with it will make a Windows user never turn back. Ubuntu too has a rather simple and straightforward UI (although it doesn't personally appeal to my taste).

5
tsurantino 7 hours ago 0 replies      
I don't think there needed to be so many bullet points to explain why the adoption of Linux as a desktop operating system has been extraordinarily slow and generally unsuccessful.

In my opinion, it comes down to these two reasons:

1) The difference between a Linux desktop & a Windows/Mac desktop is negligible, or worse. GNOME/KDE don't really add any compelling features that make them better than Windows or Mac anymore. I remember a few years ago, I loved putting Ubuntu on my system because drivers would be downloaded automatically and I could easily access all Linux packages from one simple package manager. The folder browser was pretty familiar, and the GNOME 2 bar was a nice hybrid between Mac & Windows, but nothing too special.

However, the driver installation & central package directory have long since stopped being a competitive advantage, given the era of Windows 7, etc. Granted, these weren't "defining" features of Linux, but when I personally used it, these were things that struck me then but are no longer relevant now.

2) The ecosystem. I think it goes without saying that the Linux software ecosystem is much more fragmented, and is often found in the "underground".

It's not an issue of whether or not there are substitutes to things like Office, Adobe Creative Suite products, iPhoto, and other essential apps that normal people/working people use on a daily basis, etc. (although I do think that there aren't adequate substitutes for these and that the friction of trying to get these actual products to work through things like WINE, etc is too much).

However, because things are so much more fragmented on Linux, it being an open system, it's harder for there to be de facto software (unless you lurk in some Linux community, which again, normal people aren't generally interested in).

The user has to make so many choices, which are often arbitrary and needless, and in doing so becomes frustrated and confused. There is too much stuff to explain that isn't necessary to explain, and too much detail to go into that again, is not practical.

The trade-off with Linux is that you get an enormous amount of power and responsibility. The benefit of this is that you get an enormous amount of power & responsibility. The cost is...the same, some people just don't want to bother.

6
SlipperySlope 5 hours ago 5 replies      
Stop looking backwards.

The desktop is being left behind by mobile. Recognize that Android/GNU/Linux has won mobile. Despite an early lead, Microsoft has been crushed worldwide in mobile by a Free Software platform.

Android provides a platform where both Free libraries and closed source apps proliferate - and are very inexpensive. All the failures of desktop GNU/Linux have been solved, or are not relevant, in mobile GNU/Linux.

Regarding the tablet segment of mobile, one could argue that GNU/Linux will seize the low end and gradually gain market share at the expense of iPad, leaving no room for Microsoft.

The only reason I care about desktop OS at all is to develop for mobile or back-end server.

7
xentronium 1 hour ago 0 replies      
In my crusade of boycotting Apple products I switched from OS X to Ubuntu. And you know what, despite the general disdain from the HN audience, I've found Unity to be surprisingly good. For example, the Unity dock is still worse than the OS X dock, but it's good enough. The single menu for all apps is familiar from OS X, although I hit it when attempting to move windows more often than I'd like. The application switcher can be navigated with arrows by default, which is good, although I would also like it to be navigable via mouse. The 'Spotlight-like' menu named Dash Home takes way too much screen real estate, but again, it's at least usable. Hotkeys are terrible, though, and the first thing I did was disable the alt and super keys calling the dock/Dash Home.

Overall, I like Unity way more than current Gnome, KDE, XFCE and LXDE. Your mileage may vary.

I think the real Linux desktop problems appear when something goes wrong. Sometimes updates are unsafe. Sometimes you find a bug in a piece of software. I had 10 or 20 crashes and error-report windows in my first day. Commercial software is terrible too. Skype is buggy, crashes often and is just bad. Nvidia binary drivers suck and nouveau crashes on my card (560 Ti). TwinView can only VSync one screen; your other screen is doomed to lag on renders. Xinerama has a bug with the cursor randomly jumping over to another screen. Whenever something bad happens you resort to Google and waste 10+ minutes fixing it.

I also think that applications not being made for Linux is not a very big deal. The 80% use case includes a browser, a music player and an office package, all of which are included by default in most distributions.

8
yesimahuman 7 hours ago 1 reply      
I use desktop Linux quite a bit (though only through a VM these days). What's interesting is many of the applications that always seemed to be missing or worse than their Windows competitors have been absolutely destroyed by web-based software. Both Windows and desktop Linux have been losing that battle.

The funny thing is that desktop Linux apps have always been trying to match Windows apps feature-to-feature, but web developers haven't. Turns out I didn't need every feature from Excel, I needed something faster, more convenient, and easier to use.

9
damian2000 2 hours ago 0 replies      
Personally I don't see these points as being that important when looking at casual home users. I've recently setup a couple of older laptops with Ubuntu for some family members who just want to use them for watching movies and browsing the web ... and for that, it excels.
10
goombastic 5 hours ago 1 reply      
Gnome 3 completely borked my system of 8 years running on a ThinkPad R50e I was happy with. There really is no excuse for what happened. There was no cure either. I am done. Writing from a Mac now.

I read that blog post and all I can do is shake my head. They seem to have been fighting the same old monsters for the last 12 years. The list of reasons in the post sickens me.

11
rbanffy 4 hours ago 0 replies      
I see a much simpler explanation. Linux has, for a very long time, followed the path opened by (and thus replacing) technical Unix workstations. Those were specialized machines built for people who would put up with cryptic interfaces in order to run the powerful software they needed. As any specialized tool, it has appeal to the few professionals who have actual use for them.

Only recently (3 years?) have Linux distros started being friendly to "mere mortals". This change coincided with the acceleration of the demise of the PC. There may never be a year of the Linux desktop.

Yet, Linux is everywhere. As soon as you fire up your network connection, you are using Linux. Every time I look up the time on my phone, I'm using Linux. Most internet-connected TVs run Linux, as do most set-top boxes and e-readers. I amuse myself thinking of the convoluted things Steve Ballmer is compelled to do just to be able to claim he doesn't use it.

12
chris_wot 1 hour ago 0 replies      
The GNOME project has failed because they have hacked at things that don't matter. For example, what the world wanted and needed was a decent word processor. It got OpenOffice. It needed an easy to use flow chart creator. It got dia. It needed project management software. It got... Heck, what did it get?!?

Cheese and Gnome Shell are all very nice, but so what?

13
pm90 1 hour ago 0 replies      
A dream-wish: the people who developed the Harmattan UI (Nokia N9) design something new and refreshing for desktop Linux. I know it's unlikely, but if it did happen...
This was the first (and only) UI on top of Linux that I really found beautiful. Android does not have the same polish, unfortunately.
14
progrock 4 hours ago 0 replies      
The Linux desktop probably hasn't succeeded due to pirating! Imagine if the developing world jumped onto Linux in a bigger way.

I think MS realise this; there's a suggestion that they'll be offering Windows 8 for a reasonable figure for once. Then desktop Linux had better beware.

15
webwanderings 7 hours ago 0 replies      
One thing which I believe hindered Linux's popularity right from the early days was its not-so-impressive user interface compared to Win95/98. I think this is where/when Windows took off and nobody was able to catch up comparably for a very long time to come.

Another aspect was the lack of availability of compatible software. Software back in the day came in the form of CDs on the shelf, and I don't think anybody made any of them Linux compatible.

So the challenge of winning over the average desktop user was just not there from the beginning.

16
batgaijin 6 hours ago 2 replies      
Does anyone remember netbooks? How they used to have Linux? That was when Linux had a real chance.

http://www.informationweek.com/windows/microsoft-news/micros...

If you don't think Microsoft is directly responsible for this, you are an absolute idiot.

Every manufacturer has to pay for Windows Mobile for every Android phone they sell. No company has stood up to Microsoft. This is a real threat.

You want to talk about Gnome 3? Fuck you. Why would anyone invest a cent in a WM if you can't distribute Linux installed on a laptop?

People like talking about Microsoft and Apple as though they are different teams. Nope, they are on the same team: fuck people who think they can get by without them.

17
lovamova 6 hours ago 2 replies      
elementary OS is doing something right. Learn from them!
18
nuxli 7 hours ago 0 replies      
His 1st bullet point says it all. Ken Thompson said the same thing about Linux in an interview years ago. He said that is Linux's major problem.

I find preoccupation with some company's metaphor to be a sign of lack of creativity. And the people behind Linux distributions are obsessed with Microsoft and the "desktop".

The "desktop" is only one metaphor.

Does iOS have a "desktop"?

To speed up Vista when it was first released, I used to disable the desktop in Windows by changing the registry key that specifies "explorer.exe". I would just boot to msconfig or task manager.

The system ran much faster that way. Applications can still be minimised. It worked so well, I never went back to the aero nonsense.

Obsession with a "desktop", and trying to look like Microsoft's version of it, is one of Linux's major flaws.

19
radley 6 hours ago 1 reply      
Android PC
20
Open webOS Beta Officially Released openwebosproject.org
128 points by mikecane  16 hours ago   25 comments top 9
1
mitjak 16 hours ago 1 reply      
Will definitely keep my ears open for further announcements. As much as I avoid webOS due to the lack of quality apps, booting into it even for a short while makes me realize just how smooth its UX is, and how awkward and backwards the other mobile OSs are in a lot of ways.
2
macco 25 minutes ago 0 replies      
Great to have a genuinely free OS for mobiles. I am thrilled to see what people will do with it.
3
kevincennis 14 hours ago 2 replies      
WebOS would have been so awesome if Palm had put it on nicer hardware and found a way to get developers on-board.
4
wtracy 13 hours ago 1 reply      
The announcement mentions in passing a client-server model that allows the UI to handle things like touch interaction without blocking on app code.

Can anyone elaborate on how this actually works? I would expect the UI rendering and Javascript execution to be all happening in one WebKit process. I would be excited to learn if the webOS team has come up with something I don't know about here.

5
grantjgordon 16 hours ago 0 replies      
Way to go! Web OS always had such promise -- I hope that it can make a bigger impact now that it's been freed from the shackles of the dying Palm.
6
iscrewyou 15 hours ago 2 replies      
Question: Does this openness also include the things that WebOS was amazing at? Like swipe up to close, card stacks, etc.?

Is, say, Google allowed to use that without paying royalties? Or does HP still own those software patents?

7
mikecane 16 hours ago 0 replies      
Waiting for someone to port this to the Nexus 7...
8
neya 14 hours ago 0 replies      
Wow... this is seriously good news... can't wait to see open source software like this prosper... hopefully no one will sue them, though.
9
wgirish 16 hours ago 1 reply      
Will it run on the HP TouchPad?
21
Where Have All The Good Managers Gone? thomaslarock.com
27 points by kitty  8 hours ago   7 comments top 2
1
lmkg 4 hours ago 3 replies      
Everyone wants talented, experienced employees, but no one wants to invest the resources it takes to gain experience and develop talent. And why would they? After investing so much into their developer, a competitor would just come along and poach them with a cushier chair and a higher salary. Everyone would rather be the poacher than the poachee, and so those "rookies with potential" have nowhere to develop that potential into talent.

Of course, this is all a drastic oversimplification. Rookies can get jobs, and get some experience by exposure. But developing talent effectively & efficiently takes time & resource investment beyond just having them do work, and that's investment with an uncertain and non-immediate return. It's a tough nut to crack, but the place I would start is figuring out how to retain your good employees, especially when their market value grows faster than your company's standard career advancement path allows.

2
praptak 2 hours ago 0 replies      
I remember post-bubble-burst times when people were willing to learn huge yucky legacy workflow systems all by themselves just to get a crap job in a dreary cubicle-land. Perhaps this got some managers spoiled.
22
Codecademy Fellowship codecademy.com
37 points by sew  7 hours ago   5 comments top 4
1
larrys 6 hours ago 1 reply      
I'm guessing that the use of the word "fellowship" is intended not only to make it more enticing for mom and dad to agree to let you take a year off from school, but also to make it sound like you are doing something truly special.

Colleges of course do this already, and it's called by different names, such as Drexel's co-op program, which has been around since 1919:

http://en.wikipedia.org/wiki/Drexel_University

2
zanny 4 hours ago 0 replies      
Seems like they are looking for a straight-out-of-college graduate with a resume to match a 3-year industry vet's. I'm sure they will get one - I know I did Codecademy over the last year to learn basic JS while I was bored in my senior year of CS, and most of my peers at least know what it is.
3
lbcadden3 6 hours ago 0 replies      
Considering their target audience, they should be looking for those who have done n% of their site or some kind of contest.
4
southpolesteve 6 hours ago 0 replies      
"$80,000 and regular employee benefits"
24
The Woman Who Needed to Be Upside-Down discovermagazine.com
272 points by danso  23 hours ago   55 comments top 15
1
powertower 19 hours ago 1 reply      
This sounds a lot like the occasional support request that comes in to me...

At first it seems unbelievable and impossible, and you think the person is crazy, but after some "troubleshooting", something rational pops up.

I can't tell you how many times this has happened. But it really doesn't help having a product/service that manages (on Windows) an underlying system of Virtual Hosts, dozens of configuration files, Apache, PHP, and MySQL, and a bunch of other software and tools (http://www.devside.net/server/webdeveloper).

2
tokenizer 22 hours ago 0 replies      
This was a great story, and it really demonstrates the value of staying calm and assessing a situation. This could have easily turned into a worse situation had the doctors forced the man to put his wife down and then examined the situation.
3
raldi 19 hours ago 3 replies      
It's like the medical version of the 500-mile email.
4
Jun8 17 hours ago 6 replies      
IDEABOLT: Create a central repository of interesting cases and diagnoses, with a super-intuitive UI, and make it freely available to all MDs in the world to populate and consult -- sort of like the GitHub for doctors.

You can make money by being the intermediary to find subjects for experiments, e.g. "For a study we are looking for identical twins who cannot see from birth but now one has restored vision where the other does not".

Does something that looks remotely similar exist?

5
mynegation 20 hours ago 3 replies      
As a big fan of House MD and a software engineer, I have always wondered how often doctors turn to the web to look up strange cases, and whether there is a benefit in creating a better-structured, curated site specifically for that. I guess WebMD is that kind of site, but could someone knowledgeable in medical practice share their thoughts?

Granted, once I knew she was on a pacemaker, I figured that this had something to do with electrical connectivity. But then again, maybe this is a consequence of my poor soldering skills and watching too much House MD.

6
kevinconroy 22 hours ago 1 reply      
Good to remember the next time you get a bug report from a user. Perhaps your system really isn't as perfect as you think it is.
7
Dinoguy1000 6 hours ago 0 replies      
No comment on the overall story, but as soon as I heard the description of the man carrying her, and he said "I'm her husband", my thought was Izumi Curtis and her husband Sig. =D
8
vinutheraj 16 hours ago 1 reply      
Great story. But I wonder what would have happened if she were not five feet tall and he seven feet!
9
mbubb 16 hours ago 0 replies      
Reminds me of a children's joke - which I didn't even think was funny at the time - but enjoyed anyway:

"Dr, Dr! Every time I drink a cup of coffee, I get a stabbing pain in my right eye..."

(google it if you don't remember)

It exhibits the same kind of ability to see the whole situation and make a diagnosis.

10
duwease 17 hours ago 3 replies      
I just keep wondering why he was picking her up by the ankles in the first place...
11
akg 20 hours ago 1 reply      
How did they get her to the hospital if she needed to be upside-down the entire time?
12
gizmogwai 15 hours ago 0 replies      
Disclaimer: I worked as a software engineer for a CRM device company.

From what I learned during my medical training, this kind of issue is not so uncommon, but it is usually diagnosed very easily. Her pacemaker can be disabled using a simple magnet.
This is a common test in nearly all protocols to check how the heart is working without the help of the device. Doing this simple test while upside-down would have shown that the pacemaker was actually working in that position. That should have been enough to ring a bell for most cardiologists.

13
kang 20 hours ago 1 reply      
Sorry, but I couldn't resist mentioning House.
One of the episodes had a similar situation. http://en.wikipedia.org/wiki/Dont_Ever_Change_(House)

Since in it they say the patient has "Nephroptosis, also known as 'Floating Kidney'", which is a listed medical condition, conditions like the OP's should not be uncommon.

14
criswell 22 hours ago 1 reply      
I had a feeling she just had a screw loose.
15
alliemobley 20 hours ago 0 replies      
This is the sweetest story I have ever heard in my life!
25
Jonah Lehrer's Journalistic Misdeeds at Wired.com slate.com
34 points by tptacek  9 hours ago   18 comments top 9
1
tokenadult 8 hours ago 1 reply      
This one nails Lehrer to the wall:

"In a third post from mid-2011 titled "Basketball and Jazz," one of Lehrer's paragraphs closely paralleled one written by Newsweek science writer Sharon Begley some three years earlier.

"Lehrer:

"The rebounding experiment went like this: 10 basketball players, 10 coaches and 10 sportswriters, plus a group of complete basketball novices, watched video clips of a player attempting a free throw. (You can watch the videos here.) Not surprisingly, the professional athletes were far better at predicting whether or not the shot would go in. While they got it right more than two-thirds of the time, the non-playing experts (i.e., the coaches and writers) only got it right about 40 percent of the time.

"Newsweek:

"In the experiment, 10 basketball players, 10 coaches and 10 sportswriters (considered non-playing experts), and novices all watched a video clip of someone attempting a free throw. The players were better at predicting whether the shot would go in: they got it right in two-thirds of the shots they saw, compared to 40 percent right for novices and 44 percent for coaches and writers.

"Tellingly, Begley misstated the number of participants in the study. (There were only 5 coaches and 5 sportswriters, not 10 of each. In addition, there were also 10 people in the novice group who were neither coaches nor sportswriters.) Lehrer made the exact same mistake in precisely the same manner."

When Lehrer reproduces someone else's mistake, you know he isn't looking up or verifying the facts himself. The honorable thing to do in a blog would be simply to link to Begley's piece and say, "Sharon Begley wrote an interesting article a few years ago about a study on this issue."

P.S. I posted an article to HN earlier about the initial discovery of Lehrer making up quotations in articles in other publications.

http://news.ycombinator.com/item?id=4370417

I had also seen him "recycle" earlier writings of his in paid publications: once one of his articles was submitted here to HN, I thought, "Hey, I've read this before." Indeed I had -- in the previous publication where he had first written on the same subject a couple of years earlier.

2
dredmorbius 5 hours ago 0 replies      
So ... this is the result of close analysis of a single author.

As medical types will tell you, one of the problems of running imaging and diagnostics on ill / injured / diseased patients is that you'll find anomalies -- not because they're relevant to the illness in question, but because individuals differ.

What is the prevalence of the cited behaviors -- recycling, press-release plagiarism, plagiarism, quotation issues, and factual issues -- in an unbiased sample of other authors / reporters / columnists / essayists?

What, specifically, is wrong with some of the behaviors in question? I haven't followed the Lehrer situation particularly closely; I'm aware that he's admitted to fabricating quotes from Bob Dylan specifically (not good).

I'm a bit puzzled as to what he's being faulted for in "recycling" -- essentially reusing his own material.

The press-release plagiarism cited appears to involve taking quotes from press releases, rather than interviews (which Lehrer shaded to sound like they had been told to him directly). The looser view would be that, well, the press release "told Lehrer" ... and anyone else reading it. Not great, but a modestly pale shade of gray.

Direct quotation of the published, non-press-release works of others gets rather darker. Though I wouldn't mind knowing what specific rulebook(s) Seife is playing from when he states: "Journalistic rules about press releases are murky. Rules about taking credit for other journalists' prose are not." I mean, I really hope we're not making shit up as we go along (and frankly have no way of knowing if Seife is or isn't -- he's, erm, not citing sources, merely his own authority as a professor of journalism).

Seife admits as much later in his piece: "There isn't a canonical code of conduct for journalists; perfectly reasonable reporters and editors can have fundamental disagreements about what appear to be basic ethical questions, such as whether it's kosher to recycle one's own work." He also notes that recycling can be considered common and acceptable practice, though he feels it "may violate the reader's trust". My own experience, especially in persuasive writing that's repeated as an author attempts to argue for a position, is that there is considerable recycling of material, though often an author will refine and strengthen arguments over time. That's what I myself practice.

Handling quotations also allows for some leeway. It's not uncommon to tidy up tics of speech and grammar particularly from spoken conversational passages. It can, in fact, be a negative shading to quote someone with complete faithfulness and accuracy, including all "ers", "ums", "ahs", and syntactical tangents and fragments. That said, changing meaning in as fundamental a manner as to equate memorizing a few stanzas of an epic work with memorizing the whole thing, and failing to correct it, is pretty bad.

At different points in time, attitudes toward what would currently be considered plagiarism in news were radically different. It's very, very helpful to recognize that outside a relatively few fairly stable rules (murder, real property theft), much of ethics and morals is temporally, culturally, and situationally relative. Today we suffer witches to live. In Revolutionary America, plagiarism was common practice (http://www.huffingtonpost.com/todd-andrlik/how-plagiarism-ma...). My feeling is that too strict an insistence on slavishly faithful accuracy can be as much a liability as confabulation. We know now that war photographers since Brady have staged and arranged subjects in photographs to more effectively tell stories. That NASA image processing often involves significant Photoshop enhancement and visible-range representations of invisible spectra from radio, infra-red, ultra-violet, and X-ray ranges. That NPR extensively edits interview audio, and will even modify "live" host comments over the course of repeats of their anchor news programs Morning Edition and All Things Considered to correct for flubs. That Campbell's put marbles in its soup, that clothing catalog models wear heavily pinned garments, and that HN moderators will re-edit headlines and censor meta articles.

Who ya gonna shoot?

If we're going to hang Lehrer, let's hang him for what he's been doing deliberately and in clear exception to both norms and hard-written rules. Not based on either fast-and-loose definitions of correctness or normal deviations.

3
shock3naw 8 hours ago 0 replies      
Is the sunburst background behind Lehrer's head in the table necessary? I thought this was about being 'professional.'

That being said, I'm glad the journalism community is cracking down on people who are recycling, plagiarizing, and not fact checking.

4
Alex3917 8 hours ago 0 replies      
Meh. There were some of Lehrer's articles that I genuinely liked, but as often as not I got the feeling that he didn't really know what he was talking about. Although that's not unlike most other popular Internet science writers.
5
rd108 4 hours ago 0 replies      
I'm sorry, none of what I read seemed very egregious to me. In many cases, I couldn't even decipher whether something had really been plagiarized or not. You DO use lots of material from other people when writing a story -- some of these phrases, especially under tight deadlines and late nights, likely jumble into a mish-mash of words and phrases that might spill out while writing.

Even the case of copying someone else's mistake (in the "10 sportswriters" example) seems forgivable to me... and - forgive me if I'm too generous - just another mistake, albeit this time on Lehrer's part.

6
benologist 5 hours ago 0 replies      
Somebody should tell these guys about Engadget, The Verge, Gizmodo, Geek and all the other sites that do this crap for a living.
7
awwstn2 8 hours ago 1 reply      
Here's Lehrer's barely-used Twitter account. His most recent tweet calls Samsung plagiarists: https://twitter.com/jjonahlehrer
8
regnum 7 hours ago 2 replies      
It's not been a good year for This American Life.

First their story about abuse at Apple factories in China turned out to be a piece of fiction. Now all this with their contributor Jonah Lehrer.

9
nuxli 6 hours ago 0 replies      
Seems like generating this sort of plagiarised output is something we could train a machine to do. No need to put someone on the payroll to do it.
26
Building Atari With CreateJS atari.com
75 points by fatjonny  15 hours ago   5 comments top 2
1
Gamefoo 3 hours ago 2 replies      
IE only?
I don't know what kind of message they are trying to send, but going from retro to irrelevant is definitely not the right one.
2
Raphael 6 hours ago 1 reply      
Wow, Atari is taking JS games mainstream. It was bound to happen eventually.
27
Meteor 0.4.0: Introducing Spark meteor.com
58 points by NSMeta  13 hours ago   18 comments top 8
1
edtechdev 5 minutes ago 0 replies      
There's no link to examples or a direct link to the relevant documentation. You have to figure out the difference between 'constant' and 'preserve'.

And one of their principles states: "One Language. Write both the client and the server parts of your interface in JavaScript."

Yet you still have to learn a separate template language, deal with the DOM, browser inconsistencies, etc.

To be fair, if you did do everything in JavaScript, including the user interface, it ends up looking like Java (see Dojo, Google Closure, Dart, qooxdoo). JavaScript isn't well suited to being a user interface or template language, but there are dozens of alternatives, some of which are friendlier to declarative UIs: https://github.com/jashkenas/coffee-script/wiki/List-of-lang...

2
rbn 3 hours ago 1 reply      
You guys should focus on authentication, because this thing is practically useless without it. I'm aware of the authentication branch, but this update broke it.
3
Rickasaurus 10 hours ago 1 reply      
Why would you name this the same thing as a popular iterative map reduce implementation?
4
shykes 11 hours ago 1 reply      
Can Spark be used as a standalone library, without the rest of Meteor? That would be pretty cool and I suspect it would be very successful.
5
nmb 4 hours ago 0 replies      
I agree that the name should be changed. If someone said they built their product "using Spark", it would be fairly ambiguous as to which one they meant, especially if their product has a data mining component.

[http://www.spark-project.org/]

Meteor Spark looks pretty nice though; look forward to trying it out.

6
ricksta 11 hours ago 2 replies      
It would be cool if Meteor could be packaged as a gem and embedded in parts of a Rails app where client-side interaction is heavy.
7
sebastian 11 hours ago 1 reply      
How feasible would it be to use something like Spark or Derby.js's Racer in a Django site?

I like the realtime concept/behavior, but I want to keep developing in Django.

8
tiglionabbit 11 hours ago 0 replies      
Yay, more data binding libraries. This one needs some better examples before I'll be able to understand at a glance how it works. I currently do my data binding with AngularJS.

Btw, this feature sort of already exists in jQuery. http://api.jquery.com/link/

28
Ex-Google hiring committee member about job interviewing extroverteddeveloper.com
82 points by maayank  16 hours ago   51 comments top 7
1
JabavuAdams 14 hours ago 5 replies      
If a lot of talented engineers from Microsoft and Amazon are having trouble in Google interviews, doesn't that indicate that Google is testing the wrong things?
2
rckclmbr 13 hours ago 2 replies      
She's spot on. I consider myself a competent developer (6 years experience, "big fish in a small pond"), and I don't really interview often. I've interviewed at Google twice, the first time doing 2 phone interviews and a round of on-site interviews, and the second time not making it past the phone interview.

I have prepared well for the interviews every time. I understand the algorithms, the data structures, and can program them on my own time no problem. But the second I'm in an interview setting, I lock up and can't think. I stumble across stupid thoughts (How many bits are in a byte? Oh yeah, 8. But what about the 0th bit? What to do?!) and just work myself into a corner. All the while trying to seek approval from the interviewer. After about 2 minutes, I become a wreck and am hopeless.

I also don't consider myself non-social, and deal with coworkers very well. I still don't understand what it is about the technical interview setting that makes me act like this.

3
curiousDog 5 hours ago 2 replies      
Ironically, her book won't get you through the Google interview if you are a new college grad. It may have sufficient coverage for Microsoft/Amazon interviews, but not Google.

From my experience, the average Google interview requires you to be so deep into the algo/data-structure space that you should be able to code up the KMP algorithm off the top of your head. You have to be a TopCoder with at least a 1200 rating, or equivalent algo & coding skills. Coding speed also matters.
You should be able to scribble Floyd-Warshall.

Why this way? Well, that's where Google did most of their recruiting back in the day. Anyone who says they got hired without this is either lying or got lucky in the interview process. Same goes for the new wave of startups in the Bay Area... Facebook/Palantir/Quora etc.

I was very much into the OS and compiler space in school, and that was what I was interested in. Got an offer from MSFT and AMZN, but not Google. So kids, read up on CLRS & the Algorithm Design Manual, solve every problem there, and create your account on TopCoder today if you'd like that job at el goog. Any book that says otherwise is the equivalent of "Linux Programming for Dummies" or "Complete C++ in 21 Days".

4
mkhattab 9 hours ago 3 replies      
Perhaps there is too much emphasis on the interview process. Why not use the contract-to-hire method? It could be a short engagement spanning a few weeks.

There are of course multiple drawbacks, but at least this method could help alleviate the problem of false negatives.

5
papaver 11 hours ago 3 replies      
i think the best answer to any google interview question i didn't know is..... "i'd google it"!

but seriously, why do i need to remember information about languages or algorithms i don't use daily? my memory is limited and i prefer it to be filled with the most useful information at the time.

6
jamesmiller5 11 hours ago 0 replies      
I find the resume length suggestions interesting.

The Microsoft representatives told me to use a short, one-page resume.
The Google representatives instead told me the exact opposite and said to put down everything relevant to the position, regardless of length.

I now have two resumes: a short one for on-the-floor career fairs and a multi-page one that I submit online.

[Edit grammar]

7
hk_kh 13 hours ago 1 reply      
Buy a book to get the illusion that you'll work at Google. Sounds good!
29
Selling Yourself hbr.org
132 points by fifilc  20 hours ago   33 comments top 14
1
ten_fingers 19 hours ago 3 replies      
Let's see:

(1) See a market opportunity in yachts 55 feet long. Need to hire a yacht designer to get the engines, hull shape, hull construction, safety, and other engineering details right, and to supervise the construction, including selecting the people for the interior design and finishing the interior. Want (A) someone who has done such work with high success on two dozen yachts ranging from 30 to 150 feet, or (B) someone with the potential?

(2) Have a small but rapidly growing Web site and need to hire someone to get the server farm going for scaling the site. They need to design the hardware and software architecture, select the means of system real time instrumentation, monitoring, and management, work with the software team to make needed changes in the software, design the means of reliability, performance, and security, get the backup and recovery going, design the server farm bridge and the internal network operations center (NOC), write the job descriptions for the staff, select and train the staff, etc. Now, want someone who has recently "been there, done that, gotten the T-shirt" or someone with the 'potential' of doing that?

(3) Need heart bypass surgery. Now, want someone who has done an average of eight heart bypass operations a week for the past two years with no patient deaths or repeat operations or someone with that 'potential'?

(4) Similarly for putting a new roof on a house, fixing a bad problem with the plumbing, installing a new furnace and hot water heater, installing a high end HVAC system, etc.?

War Story: My wife and I were in graduate school getting our Ph.D. degrees and ran out of money. I took a part-time job in applied math and computing on some US DoD problems -- hush-hush stuff. We had two Fortran programmers using IBM's MVS TSO, and in the past 12 months they had spent $80 K. We wanted to save money and also do much more computing. We went shopping and bought a $120 K Prime (really, essentially a baby Multics).

Soon I inherited the system and ran it in addition to programming it, doing applied math, etc. When I got my Ph.D., I soon became a prof in a B-school. They had an MVS system with punched cards and a new MBA program, and they wanted better computing for the MBA program. I wanted TeX, or at least something to drive a daisy wheel printer. Bummer.

At a faculty meeting the college computing committee gave a sad report on options for better computing. I stood and said: "Why don't we get a machine such as can be had for about $5000 a month, put it in a room in the basement, and do it ourselves?". Soon the operational Dean wanted more info, and I led a one-person selection committee. I looked at a DG, as in "Soul of a New Machine", a DEC VAX-11/780, and a Prime.

The long-serving head of the central university computer center went to the Dean and said that my proposal would not work. I got a sudden call to come to the Dean's office and met the critic. I happened to bring a cubic foot or so of technical papers related to my computer shopping. I'd specified enough ordinary, inexpensive 'comfort' A/C to handle the heat, but the critic claimed that the hard disk drives needed tight temperature and humidity control or would fail. I said: "These disk drives are sold by Prime but they are actually manufactured by Control Data. I happen to have with me the official engineering specifications for these drives directly from Control Data.". So I read them the temperature and humidity specifications that we could easily meet. The critic still claimed the disks would fail. Then I explained that at my earlier site, we had no A/C at all. By summer the room got too warm for humans, so we put an electric fan in the doorway. Later we had an A/C evaporator hung off the ceiling. Worked fine for three years. The Dean sided with me.

In the end we got a Prime. What we got was a near-exact copy of what I had run in grad school, down to the terminals and the Belden general-purpose 5-conductor signal cable used to connect the terminals at 9600 bps. The system became the world site for TeX on Prime, lasted 15 years, and was a great success. The system was running one year after that faculty meeting. I was made Chair of the college computer committee.

That faculty meeting had been only two weeks after I had arrived on campus. There was one big, huge reason my planning was accepted: I'd been there, done that, and gotten the T-shirt. That is, in contradiction to the article, what mattered was actual, prior accomplishment, not 'potential'.

Why the industrial-psychology researchers came to their conclusions I don't know, but I don't believe them.

2
tmoertel 19 hours ago 3 replies      
I have doubts about the offered explanation.

From the abstract of the study behind the article [1]:

    When people seek to impress others, they often do so by
highlighting individual achievements. Despite the intuitive
appeal of this strategy, we demonstrate that people often
prefer potential rather than achievement when evaluating
others. Indeed, compared with references to achievement
(e.g., “this person has won an award for his work”),
references to potential (e.g., “this person could win an
award for his work”) appear to stimulate greater interest
and processing, which can translate into more favorable
reactions. This tendency creates a phenomenon whereby the
potential to be good at something can be preferred over
actually being good at that very same thing. We document
this preference for potential in laboratory and field
experiments, using targets ranging from athletes to
comedians to graduate school applicants and measures
ranging from salary allocations to online ad clicks to
admission decisions.

What causes this apparent preference for potential achievement over actual achievement? According to the study's abstract (the study itself is behind a pay wall), it's a mind trick: uncertainty "stimulates greater interest and processing, which can translate into more favorable reactions." More of the same from the HBR piece:

    When human brains come across uncertainty, they tend to pay
attention to information more because they want to figure
it out, which leads to longer and more in-depth
processing. High-potential candidates make us think harder
than proven ones do. So long as the information available
about the high-potential candidate is favorable, all this
extra processing can lead (unconsciously) to an overall
more positive view of the candidate (or company). (That
part about the information available being favorable is
important. In another study, when the candidate was
described as having great potential, but there was little
evidence to back that up, people liked him far less than
the proven achiever.)

That's a nice theory, but another explanation, which doesn't seem to have been considered, is that in many cases something that has the potential to reach a value of X is actually more valuable than something that we know, with certainty, has a value of X. That's because a thing that has the potential to reach X also has the potential to reach beyond X, but a thing that has already been measured to be X does not.

In other words, you are given two probability distributions: one wide, one tight. The wide one, you are told, has the potential to contain X. But the tight one, for certain, is centered on X. Now, which do you pick?

If you're in a situation were values of X are nice but values beyond X are gold, then you pick the wide distribution. That's because the tight distribution is centered on "nice" and, being tight, offers almost no hope of straying into "gold" territory.

So maybe it's not a mind trick, after all.
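
A toy numeric check of that framing (my own sketch, not from the study): let X = 1, model the "tight" option as always exactly 1, and the "wide" option as uniform on [0, 2]. Both are centered on X, but only the wide one ever lands in "gold" territory. In PostgreSQL:

    -- fraction of draws landing beyond X for each option
    SELECT avg((2 * random() > 1)::int) AS p_wide_beyond_x,   -- ~0.5
           avg((1.0 > 1)::int)          AS p_tight_beyond_x   -- exactly 0
    FROM generate_series(1, 100000);

If payoffs are convex in X (gold beyond X, merely nice at X), the wide candidate wins in expectation even though both options have the same center.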

[1] http://psycnet.apa.org/psycinfo/2012-18069-001/

EDIT: fix typos.

3
rscale 19 hours ago 0 replies      
Interesting research, but I'm not a big fan of the 'blog article that draws on the abstract of the research' type of article.

Unfortunately, I couldn't find a sharable, non-paywalled draft of the article anywhere.

4
brink 20 hours ago 3 replies      
Most of this article can be summed up as: "Focus on what you're capable of, not what you've done." In their test scenarios, they found that this brought more interest and pay.
5
ary 17 hours ago 0 replies      
Let me propose a better conclusion that has the potential to succeed over vague claims about the human subconscious.

One of the best ways to get hired is to, like all great American beers, have the great taste of social proof with the less filling property of appearing to be inexpensive. Business people very much want qualified, capable employees, but they want them at the best possible price. Giving off the appearance that you are capable but unproven makes a business person's leverage sense tingle like no tomorrow. This, at least, has been my experience. Your mileage may vary.

I'm not sure that, as far as the general population goes, craving the "next big thing" is a global phenomenon. Those more concerned with getting attention (citizens of wealthier countries) over practicality might be more inclined to care. That would certainly help explain the hipster population plaguing the United States.

6
cookingrobot 15 hours ago 0 replies      
The most interesting part to me is the effect of wording in the Facebook ad. I wish they'd said how strong the effect is:

"they compared two versions of Facebook ads for a real stand-up comedian. In the first version, critics said "he is the next big thing" and "everybody's talking about him." In the second version, critics said he "could be the next big thing," and that "in a year, everybody could be talking about him." The ad that focused on his potential got significantly more clicks and likes."

7
mikepmalai 17 hours ago 0 replies      
Recruiting is a very interesting process. Unless you happen to be pre-IPO Google/Facebook/Hot Start-up, you are typically constrained by the following:

1) The kind of person you want to hire is either too expensive or unattainable. Let's assume this person's output equals 100%.

2) The person you can afford to hire probably grades out at 50% to 75% output of the ideal hire.

3) The minimum output you need to justify paying another person is 40%.

I'm just throwing numbers out there, but this is directionally the situation you're dealing with (especially in a hot market). Some would say #1 is a '10x' player and the actual gap between #1 (what you want) and #2 (what you can afford) is much, much greater than what I lay out above.

If that's the case, you can begin to see why hiring managers aren't opposed to hiring #3's (or training someone up to #3 output) who have #1 potential over proven #2's with (perceived) limited upside. Of course, the process of identifying candidates with #1 potential is a separate matter.

Ideally you can hire #2's with #1 upside, but it's hard to get people like that, since more often than not their current employer makes a big counter-offer and/or promotion to keep them. Consequently, you end up in a situation where you can hire someone who is 1) unproven with upside or 2) proven with limited upside.

8
Evbn 19 hours ago 2 replies      
Please remove the puffery adjective and "secret" from the title. "Selling Yourself" or "Research on Techniques for Selling Yourself" would do fine.
9
001sky 18 hours ago 0 replies      
Heidi Grant Halvorson, Ph.D. is a motivational psychologist and author of the HBR Single Nine Things Successful People Do Differently and the book Succeed: How We Can Reach Our Goals

To the editor who OK'd the book title ("Succeed: How We Can Reach Our Goals"): did you not read "Nine Things Successful People Do Differently"? Or do you believe that a total lack of originality is the way to reach your goals?

10
drumdance 18 hours ago 0 replies      
I'm suspicious of a study that uses athletes with skills that can be quantified via statistics. Most jobs aren't nearly as quantifiable, and so the interviewer has to evaluate squishier things such as cultural fit. I'm thinking of PayPal rejecting that guy who liked to shoot hoops, mainly because he used the word "hoops."

Not that using squishier measures is wrong. It just is. Ergo, I don't see how this study means anything in the real world.

11
kjw 19 hours ago 1 reply      
Does anyone have any pointers to or comments on the implications for the hiring/manager side based on this research? Namely, when hiring, should one consciously avoid the bias toward high-potential candidates versus experienced candidates? The article doesn't comment on this aspect, and I couldn't get a sense of whether the players/candidates went on to perform equally well. Unfortunately the full text of the actual research paper referenced in the article is behind a paywall.
12
nwenzel 14 hours ago 0 replies      
In all the very good discussion here, there's not one mention of the menu at the top of the HN page. The first text link in the menu bar is... "new"

There's an entire section devoted to new, or future Page 1 articles.

The inclusion of the link must mean that there was either a demand for it or a belief that readers would want to know about the articles with future potential to make the first page. Anecdotal at best, but it seems to support the idea that there is some innate bias toward the next new thing.

13
TomMasz 19 hours ago 1 reply      
It's really interesting that the test subjects said they preferred the guy with the most achievement, yet actually picked the one with the most potential. It makes you wonder how many of those "how to get a job" articles are written with the achievement mindset even though it's not what employers are really looking for.
14
davmar 17 hours ago 0 replies      
...the harvard drivel review continues its churn
30
Probabilistic Many-to-Many Relationships (with Bloom Filters) zacharyvoase.com
163 points by zacharyvoase  23 hours ago   39 comments top 12
1
zacharyvoase 13 hours ago 1 reply      
OP here.

The lack of an index on the junction table definitely did have a major effect. By just doing the following:

    CREATE INDEX ON movie_person (movie_id);
CREATE INDEX ON movie_person (person_id);

The junction query speeds up to around 2ms -- it's comparable to or faster than the bloom filter query. But the trade-off is revealed when you see the total size of the movie_person table (including indexes):

    SELECT pg_size_pretty(pg_total_relation_size('movie_person'));
=> 45MB

Whereas, by my calculations, the total size added by the bloom filters and hashes on movie and person is just 2094kB in total.

I plan on adding an erratum to my article explaining my error, the time/memory trade-off, and ideas for further improvement or exploration, potentially including bloom-based GiST indexes and the opportunities for parallelization.
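
For anyone wanting to reproduce a figure like that 2094kB directly from the data, one option (a sketch -- `person_filter`/`hash` on movie come from the discussion below, and `movie_filter` on person is my guess at the mirror-image column name):

    -- per-table footprint of the bloom columns and their hashes
    SELECT pg_size_pretty(sum(pg_column_size(person_filter)
                            + pg_column_size(hash))::bigint) FROM movie;
    SELECT pg_size_pretty(sum(pg_column_size(movie_filter)
                            + pg_column_size(hash))::bigint) FROM person;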

2
tmoertel 16 hours ago 4 replies      
EDITED ONE LAST TIME -- I PROMISE, I REALLY DO, THIS IS THE LAST TIME -- TO ADD:

I think the reason that OP's join-based queries are slow is that there are no indexes over his junction table's foreign keys:

    CREATE TABLE movies_people (
movie_id INTEGER REFERENCES movie,
person_id INTEGER REFERENCES person
);

Thus when he wants the movies associated with person 160, the database must examine the entire junction:

    EXPLAIN ANALYZE SELECT * FROM movies_for_people_junction WHERE person_id = 160;

Hash Join (cost=282.37..10401.08 rows=97 width=33) (actual time=7.440..64.843 rows=9 loops=1)
Hash Cond: (movie_person.movie_id = movie.id)
-> Seq Scan on movie_person (cost=0.00..10117.01 rows=97 width=8) (actual time=2.540..59.933 rows=9 loops=1)
Filter: (person_id = 160)
-> Hash (cost=233.83..233.83 rows=3883 width=29) (actual time=4.884..4.884 rows=3883 loops=1)
Buckets: 1024 Batches: 1 Memory Usage: 233kB
-> Seq Scan on movie (cost=0.00..233.83 rows=3883 width=29) (actual time=0.010..2.610 rows=3883 loops=1)
Total runtime: 64.887 ms

Note the sequential scan on movie_person that accounts for 2.5 to 60 seconds. If there were an index on movie_person(person_id), this could be an index scan.

(EDITED TO ADD: I totally misread the timings in milliseconds for timings in seconds, so most of what I originally wrote is off by a factor of 1000. I'm leaving it here for entertainment value and because you might want to play with the data set in SQLite. But my point is still valid: a vanilla join is comparable in performance to the OP's bloom-filter method.)

I'm having a hard time believing that the straightforward join on a data set as small as the OP's sample is really going to take 65 seconds on PostgreSQL. Maybe that's what EXPLAIN predicts (with spotty stats, I'd wager), but EXPLAIN is not a reliable way to measure performance. For this data, I'd expect real queries to perform much better.

EDITED TO ADD: The OP's article shows the results for EXPLAIN ANALYZE, which ought to have performed the queries. So I'm not sure why the results are so slow.

Heck, even SQLite, when processing a superset of the data set on my 4-year-old computer, can do the OP's final query (and return additional ratings data) almost instantly:

    $ time sqlite3 ratings.db '
select *
from
users
natural join ratings
natural join movies
where user_id = 160
' > /dev/null

real 0m0.006s
user 0m0.002s
sys 0m0.004s

If you want to try some queries yourself, here you go:

    $ wget http://www.grouplens.org/system/files/ml-1m.zip
$ unzip ml-1m.zip
    $ cd ml-1m
$ sqlite3 ratings.db <<EOF

CREATE TABLE movies (
movie_id INTEGER PRIMARY KEY NOT NULL
, title TEXT NOT NULL
, genres TEXT NOT NULL
);

CREATE TABLE users (
user_id INTEGER PRIMARY KEY NOT NULL
, gender TEXT NOT NULL
, age TEXT NOT NULL
, occupation TEXT NOT NULL
, zipcode TEXT NOT NULL
);

CREATE TABLE ratings (
user_id INTEGER REFERENCES users(user_id)
, movie_id INTEGER REFERENCES movies(movie_id)
, rating INTEGER NOT NULL
, timestamp INTEGER NOT NULL
, PRIMARY KEY (user_id, movie_id)
);

.separator ::
.import movies.dat movies
.import users.dat users
.import ratings.dat ratings

EOF

Fully enumerating the joined data takes under a second:

    $ time sqlite3 ratings.db '
select count(*)
from
users
natural join ratings
natural join movies
'
1000209

real 0m0.953s
user 0m0.925s
sys 0m0.021s

And even going so far as to print the joined data takes under six seconds:

    $ time sqlite3 ratings.db '
select *
from
users
natural join ratings
natural join movies
' > /dev/null

real 0m5.586s
user 0m5.497s
sys 0m0.059s

Sometimes, the old stuff works better than we give it credit for.

3
fusiongyro 19 hours ago 1 reply      
This is a really cool technique and warrants some investigation, but I can't let this go unaddressed:

> and the upper bound on the time taken to join all three tables will be the square of that

These kinds of from-principle assertions about what Postgres's (or other DBs') performance will be like sound helpful but usually aren't. The kinds of queries you issue can change everything. Indexing can change everything. Postgres's configuration can change everything. Actual size of the table can change everything. For example, if the table is small, Postgres will keep it in memory and your plans will have scary looking but actually innocent sequential scans, which I think actually happened in his join table example.

Anyway, it's good to have a lot of tools in your toolbox, and this is an interesting tool with interesting uses. I just think it would be a grave error to take the performance ratios here as fixed.

4
slig 20 hours ago 1 reply      
Now that's the kind of content that I'd love to see more of here. It covers everything from basic tools like sed and awk to nice concepts I didn't know, like the BIT field in Postgres.

Any recommended book or set of articles for starting with Postgres?

5
DEinspanjer 21 hours ago 1 reply      
I like the exploration of this method, but I would have liked to see the actual comparison of any false positives. Bad data can be acceptable in statistical analysis, but if you were showing someone a list of their ratings or the actors who were in the latest Kevin Bacon movie, false positives have a much stronger impact.

Is there any chance that the bloom could be used as a short-circuit filter but still follow up with the m2m join to filter out the false positives? If the query optimizer can take advantage of that, then you could likely balance the size and cost of the bloom field.
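
Something along those lines is expressible directly in SQL. A hedged sketch, assuming the person row carries a `movie_filter` bloom column and each movie row a bit-string `hash` signature of matching width (column names guessed from this thread, not taken from the article):

    -- movies for person 160: cheap bloom screen, then exact recheck
    SELECT m.*
    FROM movie m
    JOIN person p ON (p.movie_filter & m.hash) = m.hash  -- bloom test, may admit false positives
    WHERE p.id = 160
      AND EXISTS (SELECT 1                               -- junction recheck drops them
                  FROM movie_person mp
                  WHERE mp.movie_id = m.id
                    AND mp.person_id = p.id);

With indexes on movie_person (as in the OP's follow-up above), each recheck is a cheap index probe, so the exact-join cost is paid only for rows that survive the bloom test.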

6
brown9-2 17 hours ago 1 reply      
Regarding the space consumption, it seems like this is a tradeoff of storing two integers (plus the row overhead) per rating versus storing 335 bytes per movie/user.

For the join table, that's 2 integers * 575,281 ratings * 4 bytes = 4,602,248 bytes used in the join table.

With the filter, in each movie row, you need to store 1632 bits for the person_filter and 1048 bits for the hash, so 3,883 movies * (1632 bits + 1048 bits) / 8 = 1,300,805 bytes.

In each user row you need to store the same number of bits for the filter and hash, so 6,040 users * (1632 bits + 1048 bits) / 8 = 2,023,400 bytes.

Is my math here wrong? With this approach you save about 1.22MB, or about 27% over the join table approach (ignoring how much overhead there is for each row of the table and each page to store the table in).

Depending on the dataset it doesn't seem like the space savings would be worth the sacrifice in accuracy.
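
The arithmetic checks out, and can be reproduced with a throwaway query (the 335 bytes per row being (1632 + 1048) / 8):

    SELECT 575281 * 2 * 4                    AS junction_bytes,  -- 4,602,248
           (3883 + 6040) * (1632 + 1048) / 8 AS filter_bytes;    -- 3,324,205

That is a saving of 1,278,043 bytes -- the ~1.22MB / ~27% figure above -- before row and page overhead on either side.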

7
ocharles 17 hours ago 1 reply      
These seem to produce entirely different sets of results:

For the bloom filter: (actual time=0.033..2.546 rows=430 loops=1)

And for the join: (actual time=7.440..64.843 rows=9 loops=1)

So the join returned 9 movies for person_id=160, while the bloom filter returned 430.

I understand it's a probabilistic model, but that's a pretty whopping difference in data. Have I missed something?
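
For a sense of scale: the expected false-positive rate of a Bloom filter with m bits, k hash functions, and n stored items is roughly (1 - e^(-kn/m))^k. The article's k isn't quoted in this thread, so purely as an illustration with guessed parameters (k = 3, m = 1632, n = 9):

    -- hypothetical parameters; per-candidate rate and expected false rows over 3,883 movies
    SELECT power(1 - exp(-3 * 9 / 1632.0), 3)        AS fp_rate_per_row,     -- ~4.4e-6
           3883 * power(1 - exp(-3 * 9 / 1632.0), 3) AS expected_false_rows; -- ~0.02

Under anything like those parameters the filter should not admit ~421 spurious movies out of 3,883, so if both row counts are being read correctly, the gap probably lies elsewhere (for example, in what each plan node is actually counting).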

8
mhale 18 hours ago 0 replies      
Using a custom index implementation that uses bloom filters internally is probably going to work out better in the long run. It should be way more efficient than storing the data in a bit field, using app-layer code to generate the bloom filter values, then doing bitwise comparisons on-the-fly at query time.

The Postgres query planner can also recheck constraints automatically to recover from bloom filter false positive matches at query time.

FYI -- bloom filters are already used internally within the PostgreSQL intarray contrib module and the full-text search functionality.

See:
http://postgresql.1045698.n5.nabble.com/bloom-filter-indexes...
http://code.google.com/p/postgres-learning/wiki/BloomFilter

EDIT: for clarity, typo correction

9
b0b0b0b 19 hours ago 2 replies      
Thanks for this interesting writeup; it's definitely thought provoking.

Because you can't index that bloom column, it seems you'd always be doing full table scans.

In fact, it doesn't appear any indexes were used throughout this whole exercise - is that right?

10
luney 19 hours ago 0 replies      
Can anyone recommend any good interactive charting examples for many-to-many relationships?
11
secure 21 hours ago 2 replies      
As the article mentions at the very bottom, this technique is not accurate.

This fact makes it unusable for many use-cases, but it's an interesting and good article nevertheless.

12
pr0filer_net 20 hours ago 1 reply      
Nice article!

I see the table `movies_people` uses (SIGNED) INTEGERS as datatypes, but they reference an UNSIGNED BIGINT (SERIAL).

       cached 1 September 2012 10:02:01 GMT