Hacker News with inline top comments, 9 Feb 2014
Google turns on "Download Gmail Archive" feature google.com
256 points by thejerz  9 hours ago   111 comments top 17
rzendacott 9 hours ago 1 reply      
I'm very impressed with the number of products and services supported so far. It has everything from bookmarks to location history. It even lets you choose the format for some products, and Drive in particular has some nice options. I'm glad to see Google opening this tool up.
grandpoobah 2 hours ago 2 replies      
I ran it and it gave me a zip containing an html file called errors.html with a bunch of errors and no actual emails.
magicalist 7 hours ago 1 reply      
antifuchs 9 hours ago 5 replies      
Easy prediction: they'll disable imap access within a year.
blackjack48 8 hours ago 0 replies      
If I'm already using gmvault, is there any advantage to downloading it directly?
PhasmaFelis 5 hours ago 1 reply      
Is this new? I'm pretty sure it's been around in some form for a couple of years at least.
emilyst 3 hours ago 0 replies      
I'm glad they've rolled this out widely, but unfortunately, when I attempted to use it, it simply sent me a 5MB error log and not a single bit of archived data.
qwerta 38 minutes ago 0 replies      
You should create a backup every month...
coldtea 4 hours ago 1 reply      
Perhaps preparing to turn off 'IMAP access' feature?
brohoolio 8 hours ago 4 replies      
Am I missing something new? Hasn't this been in place for a couple of months?
anilshanbhag 4 hours ago 1 reply      
I just requested an archive and I will be surprised if I get it any time soon. Gmail - 5GB, Drive - 3 GB, G+ Photos - 1.5 GB - this is going to be one huge zip file !!
jebblue 5 hours ago 2 replies      
I've had 4 "Failed - Network error" messages in a row.
LeicaLatte 2 hours ago 0 replies      
They took their time but this does look like the right way to do it.
shmerl 7 hours ago 0 replies      
Useful but you could do it with IMAP already.
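Right — the IMAP route shmerl mentions can be sketched with Python's standard imaplib. This is a hedged, minimal sketch, not gmvault or Google's tool: the mailbox name, address, and password below are placeholders, and Gmail's "All Mail" folder name varies by account locale.

```python
import imaplib
import os

def backup_mailbox(conn, mailbox="INBOX", out_dir="backup"):
    """Save every message in `mailbox` to `out_dir` as .eml files; return the count."""
    os.makedirs(out_dir, exist_ok=True)
    typ, _ = conn.select(mailbox, readonly=True)
    if typ != "OK":
        raise RuntimeError("cannot select %s" % mailbox)
    typ, data = conn.search(None, "ALL")
    msg_ids = data[0].split()
    for msg_id in msg_ids:
        typ, msg_data = conn.fetch(msg_id, "(RFC822)")
        raw = msg_data[0][1]  # fetch returns (envelope, body) pairs; body is the raw RFC822 bytes
        with open(os.path.join(out_dir, "%s.eml" % msg_id.decode()), "wb") as f:
            f.write(raw)
    return len(msg_ids)

def main():
    # Placeholders: use your own address and (with 2FA) an app-specific password.
    conn = imaplib.IMAP4_SSL("imap.gmail.com")
    conn.login("you@gmail.com", "app-password")
    print(backup_mailbox(conn, '"[Gmail]/All Mail"'), "messages saved")
    conn.logout()
```

The difference from Takeout is that this is incremental-unfriendly and slow over many messages, which is part of why a one-shot archive download is still welcome.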
amiramir 4 hours ago 1 reply      
I'll just ask the NSA for mine ;-)
intull 4 hours ago 0 replies      
Wasn't this there before?
notastartup 1 hour ago 0 replies      
anyone else getting just a zip of error.html which contains a whole bunch of "Service failed to retrieve this item." for all emails?
The Egg galactanet.com
291 points by mickgiles  12 hours ago   104 comments top 32
chroma 8 hours ago 6 replies      
I guess I'm in the minority that doesn't find this story uplifting. The memory reset is the biggest problem, since it doesn't allow the main character to learn from his mistakes. At the end of time, the main character will have his memories and personalities merged. Then he'll look back on countless lifetimes of the same mistakes and regrets. It would be like if someone slipped you some Ambien (to prevent memory formation), played the same prank on you 20 times, then showed you a video of it after you sobered up. "Ha-ha, you fell for it every time! Classic!" Except instead of 20 times it would be billions (possibly trillions) of lifetimes. And instead of one prank, it would be countless heartbreaks, regrets, failures, and insecurities.

And that's only looking at the character's own "choices." (Is it really a choice if you can't stop yourself from becoming John Wilkes Booth?) The cruelty inflicted by nature would be much greater. Disease, famine, famine, disease, famine, typhoon, famine, rattlesnake bite, famine, tsunami, etc.

Now I wonder if a sugar-coated Lovecraftian horror story was the author's intent. No other kind of god would set up a system where you're forced to repeat the same mistakes for billions of years.

crntaylor 10 hours ago 7 replies      
This would have been a significantly more enjoyable experience if the title of the submission didn't give away the ending.
rosser 10 hours ago 1 reply      
I've read this several times before, and this bit always gets me, as they say, "right in the feels":

"Don't worry," I said. "They'll be fine. Your kids will remember you as perfect in every way. They didn't have time to grow contempt for you. Your wife will cry on the outside, but will be secretly relieved. To be fair, your marriage was falling apart. If it's any consolation, she'll feel very guilty for feeling relieved."

It's just so human. It's almost confrontational in its degree of, "That's just how shit is sometimes," but it's delivered with utter compassion. That juxtaposition captures so much of how I feel about the human condition.

gkoberger 9 hours ago 2 replies      
Completely off topic, however I read this originally right around when Lost went off the air.

I always wished this is how Lost ended: with Jack being told by Jacob that he was actually everyone on the plane (which is why they all had a weird connection), and all these lives were him waiting to be "born" into running the island.

dhughes 9 hours ago 3 replies      
Another good short story similar to this is Asimov's "The Last Question." http://filer.case.edu/dts8/thelastq.htm
Xcelerate 7 hours ago 0 replies      
This would be horrifying if it was true. Think of how many billions of painful, terrible lives you would have to experience.

Edit: I suppose it's just as horrible regardless of whether or not you experience them...

whyme 8 hours ago 1 reply      
That model seems ineffective. While it operates in parallel, spawns a plethora of threads and does, I imagine, aim to end in eventual consistency, it also seems to lack any form of shared state. If I was me I would make sure I overlap myself on each new instance and not deal with that restart time. I'd also suggest picking a different tool rather than the current one... I think I might be using LISP (Lost In Self Protocol).
samatman 4 hours ago 1 reply      
This is within an iota of the precise cosmology of my religion of birth:


Except that, coming from that background, I expected the big reveal to be that the Egg is talking to himself, hatched.

As unprovable speculations about the nature of reality go, I rather like this one.

swatkat 28 minutes ago 0 replies      
Wow! That was a good read. Advaita philosophy, and Karma Siddantha in a nutshell.
julianpye 9 hours ago 0 replies      
On HN, we want to hack the way our systems work. We want to see through the obscurity and complexities advanced systems have brought along and find the most elegant and quickest way to challenge and control them. Why can we not look at hacking religion? Why can we not hack philosophy? I like this story, but more than that I like the fact that it has reached page 1 on HN. I think many people here are not looking for self-realisation in the form of a startup that brings them big bucks, but hey - self-realisation.
yc-kjh 7 hours ago 3 replies      
Incompatible with Free Will

The story is incompatible with free will. For the universe to be the way it is, with the one person living all those lives, yet always choosing such that the other people (him in other incarnations) also always choose as they (he) did, free will would have to not exist.

But this would also mean that the "god" in this story also didn't have free will, because the man was "of his (god's) kind".

But if God does not have free will, he isn't the greatest possible being. The universe thus described therefore fails Anselm's Ontological Argument for the Existence of God. The hypothetical God who is identical to the God in this story, with the exception that He DOES have free will, is obviously a greater being.

I conclude that this story cannot possibly describe Reality, as It actually Is.

otikik 9 hours ago 0 replies      
Well I'm glad I wrote it.
NhanH 10 hours ago 0 replies      
I'd strongly suggest "Sum: Forty Tales from the Afterlives" for anyone who enjoyed this story. It's a lovely collection of short stories with a very similar writing style and theme.
leobelle 5 hours ago 1 reply      
What about pre-human hominids? Was he also all the Homo ergasters? Homo habilis? Australopithecus afarensis? The great apes? All the mammals? All multi-celled life?

I don't get what's interesting about this story. It's pretty silly and not very enlightening.

ctoth 4 hours ago 0 replies      
While we're talking about dark cosmologies, Divided by Infinity [0] is fantastic. I'll post it here as when I try to submit it as its own post it is automatically marked dead.

[0] http://www.tor.com/stories/2010/08/divided-by-infinity

LukeShu 8 hours ago 0 replies      
I hadn't read this, but I recognized the author's name as the author of one of my favorite (no longer running) webcomics[0]. I guess that means I would probably enjoy his other writings[1].

[0]: http://www.galactanet.com/comic/view.php?strip=1
[1]: http://www.galactanet.com/writing.html

... hmm, now I'm a little disappointed in myself that I didn't recognize the domain name too.

thatthatis 8 hours ago 1 reply      
Of all the religious, non-religious, philosophical, etc. texts I've ever read, this is the one I most hope is true.
terranstyler 1 hour ago 0 replies      
This would mean that life is an episode in a Monte-Carlo simulation traversing the state space of all possible human lives.

The story also suggests that the simulation is heavily parallel and complete knowledge of all episodes (rather all paths) makes you god.

smky80 8 hours ago 1 reply      
This seemed cute the first time I read it, and it explains the "why am I me?" question that most religions just don't.

But I realized this would be an absolute disaster if true. True story: life more or less sucks if you aren't near a local maximum of a food chain.

WCityMike 9 hours ago 2 replies      
The author of this wrote THE MARTIAN. I recommend it:


shire 3 hours ago 1 reply      
First time reading this story. It would be nice if someone could explain the takeaway? It's a very deep story, but I'm trying to rationalize the meaning or concept behind the idea of it. Thanks :)
squirejons 9 hours ago 0 replies      
I think I will stick with cryonics. Thanks, anyway...
pirateking 10 hours ago 0 replies      
The theme and mood remind me somewhat of Childhood's End by Arthur C. Clarke - one of my favorite books.
afriend4lyfe 9 hours ago 0 replies      
The Gentle Seduction is a similar short story, in regards to the singularity, that The Egg reminded me of: http://www.skyhunter.com/marcs/GentleSeduction.html
spindritf 10 hours ago 5 replies      
How can you bootstrap such a world with only one soul? (I know, not the right question.)
sharemywin 12 hours ago 0 replies      
brings new meaning to "go f* yourself"
sillysaurus2 10 hours ago 0 replies      
I wonder when this was written?
joe_inferno 10 hours ago 0 replies      
This is a little reminiscent of Mormonism (minus the 'only you and me' bit)
auggierose 9 hours ago 1 reply      
That actually makes a lot of sense.
thenerdfiles 7 hours ago 0 replies      

    The rate of information increases.
This is true of humans, gods, and all things. Some initial configuration is irrelevant to this fact of existence that information describes it, so too, is part of the configuration that you play into.

Many gods, one god, whatever it is part of a being to know what is relevant at any point in time. If there are gods, your death is something that becomes information to them.

The implication here is that even in their case, their ends become information to something else.

The first to be surprised to introduce new information doesn't "win" or "sin". What's to be felt about the falsity of "The rate of information increases"?

phhh 9 hours ago 5 replies      
Old. Seriously who hasn't read this by now.
The terrifying surveillance case of Brandon Mayfield aljazeera.com
8 points by jpatokal  1 hour ago   1 comment top
Broken_Hippo 54 minutes ago 0 replies      
Horrifying story that will change absolutely nothing. The government isn't willing to back away from the power (even after it became known), and people aren't upset enough en masse to actually do things about it, things that would probably be done in vain for some time anyway. And I find that many of the people who hold the 'innocent people have nothing to hide' view labor under a general delusion, believing the government (or at least the law) to be mostly infallible.

It is not a new pattern, either. Look back to what we (Americans) did to each other in the '50s (are You a communist? Do you act like one?). Had we had the tech then to spy like we do now, we'd have done so. Which means that despite the sensational news stories, I find almost apathy among the normal people I meet; the paranoid have known for years ("don't talk on that cell phone, they can listen in on those easily"... does anyone remember this attitude?). Privacy intrusions simply exist. They are. You aren't escaping them. It might make one angry, but there is nothing people feel they can do about it. I personally disagree, but also feel the path to balance in this area is an uncomfortable one.

The Programming Steamroller Waits For No One (2013) thecodist.com
40 points by prostoalex  5 hours ago   14 comments top 7
rob-anderson 1 hour ago 4 replies      
There is no steamroller.

I know many programmers feel this way, but in my humble opinion it is a fallacy, and not a very healthy one.

No-one can deny that our industry is evolving at breakneck speed, and it is an exhilarating place to be. But just because there's a new technology every week on HN doesn't mean that we are losing old ones at a similar rate.

It is perfectly possible to have done nothing but C or Java for your entire career and yet remain extremely employable. And I wouldn't be at all surprised if there are highly paid COBOL jobs still out there, nursing some vast banking-industry mainframe which is too precious to risk replacing.

In fact I'm hard pressed to think of any programming language which I would dare declare 'dead' in a HN comment.

But even if you're a specialist in something which you feel is in decline, or for which there are newer, snazzier replacements, you've got every opportunity to learn something new, taking as long as you like to do so. There's extensive documentation for every technology under the sun available for free on the internet, and an army of friendly, helpful people willing to provide help and advice without expecting anything in return.

In fact, it's entirely possible you could even get paid to cross-train. In my own company we use RoR for which (in England at least) demand far outstrips supply. I've paid PHP developers to learn Rails, and I would consider anyone with an in-depth knowledge of any language as potentially employable.

Really, the only way an experienced developer is going to end up flipping burgers or flying a manager's desk is because they have lost the desire to learn - ie fallen out of love with programming in general. I believe few people work in this industry for money alone - you either love programming or you don't do it - and if you love it then you will pick up new technologies out of sheer intellectual curiosity.

Feverishly reading HN every day and feeling threatened by the emergence of every new 'next best thing' is not a good idea. I would advise anyone feeling this way to take a chill pill and remember why they took up programming in the first place.

dwaltrip 2 hours ago 2 replies      
I enjoyed this article. However, haven't many of the core fundamental concepts and abstractions found in computing technologies stayed relatively the same over the past 40 years? Minus the occasional paradigm shift, of course.
jamesaguilar 2 hours ago 0 replies      
I don't think this is exactly true. There are still many COBOL programmers hard at work.
Nekorosu 58 minutes ago 0 replies      
First I was going to write "That's not true". Then I reread the article. That steamroller is a really slow one. :)
Theodores 2 hours ago 1 reply      
I think it is funny how he sees managers: the ranks of managers include those who could not stay ahead and were flattened by the steamroller. Do managers see themselves that way, or in the cab driving the thing?
danieltillett 1 hour ago 0 replies      
This is not specific to coding. My background is in molecular biology and almost everything I have learnt has become worthless because of new techniques - about the only skill I have from my junior years that now has value is how to use a Gilson correctly.
chebert 1 hour ago 0 replies      
Man this is depressing.

I don't want to learn tech because I'm afraid of getting my bones crushed by a steam roller.

I would rather learn tech because of the new and interesting things I can do with it.

Debian committee members vote for systemd as init system debian.org
168 points by waffle_ss  11 hours ago   90 comments top 16
Spittie 11 hours ago 2 replies      
Those are the votes for systemd:





For anyone wondering, 4 votes are enough because Bdale's vote (the chairman of the Technical Committee) count as two votes in case of a draw.

EDIT: see this (https://news.ycombinator.com/item?id=7203479) comment below - if the other 4 members of the TC vote F, then systemd would not win.

jordigh 10 hours ago 7 replies      
For a fun retelling of the events (I think this only works on Firefox):


jtchang 10 hours ago 7 replies      
Can anyone give a quick rundown of the different init systems?

For the most part it can get confusing for a non-day-to-day system administrator when I am trying to get a program to "run on boot". Between rc.local, init.d, run levels, etc., sometimes it is just frustrating.
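For the "run on boot" case specifically, the systemd answer is a declarative unit file rather than a script. A minimal, hypothetical example (the service name and path are invented for illustration):

```ini
# /etc/systemd/system/myapp.service
[Unit]
Description=My app
After=network.target

[Service]
ExecStart=/usr/local/bin/myapp
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Then `systemctl enable myapp.service` wires it into the boot target, replacing the update-rc.d/chkconfig symlink dance of the sysvinit world.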

kashchei 2 hours ago 2 replies      
I have to remind everyone that the author of systemd is Lennart Poettering, the guy behind Pulseaudio. I think this is one "feature" that should outweigh all supposed benefits of this program.

For those unfamiliar with Pulseaudio development: it's a third-generation audio subsystem (after OSS, then ALSA with JACK, with ESD beginning the third generation) that is famous for an overcomplicated, barely human-readable, non-human-writable configuration procedure; for development marred by a huge number of bugs that ruined audio on Linux until recently; and for an immense number of internal interfaces, the system being presented as a huge monolithic piece of software that cannot be used in a modular manner, except as modules that only talk among themselves.

systemd seems to suffer from the same problems, plus it tries to "integrate" init, udev and syslog into a single "product", with arcane internal interfaces and formats -- just as non-human-writable as Pulseaudio.

_wmd 11 hours ago 1 reply      
I wonder how long upstart will survive in Ubuntu after the next Debian stable
kudu 4 hours ago 0 replies      
I'm very doubtful as to whether it was within the Technical Committee's role to rule on this. After all, there is nearly a consensus on the fact that systemd is the superior init system. The question of whether certain kernels should have more features by default at the expense of portability is a matter of policy, and not a technical decision.
dfc 7 hours ago 0 replies      
Can an HN mod change the title? The committee has not voted in favor of systemd (yet).
e12e 57 minutes ago 0 replies      
Did anyone see whether runit was ever considered? Perhaps sticking with sysv was considered a better alternative than changing to runit?
forgottenpass 10 hours ago 1 reply      
What happened to the last vote? I saw some criticism that Ian jumped the gun by calling for votes when he did and putting two issues on at once but haven't read the archive in the meantime.

Pretty unfortunate it took this long. Even as a casual observer it's looked like they'd eventually pick systemd a month ago. Hopefully this vote is the last one.

jiggy2011 10 hours ago 4 replies      
Not a surprise, was there a compelling reason for upstart at all?
hayksaakian 10 hours ago 2 replies      
Could somebody explain like I'm five?
undoware 1 hour ago 0 replies      
I usually just put everything in inittab. ;D
auvrw 8 hours ago 0 replies      
ran across this link, which might be helpful for people familiar with sysvinit but not systemd


tl;dc: systemd replaces the bash scripts you're used to grepping through with unix .conf files that fit on the screen.

herokusaki 8 hours ago 1 reply      
Did Debian at any point consider launchd?
qwoeiu 3 hours ago 0 replies      
fucking bureaucrats
pekk 10 hours ago 1 reply      
So when are they going to vote on whether to switch to RPM?
The Weight of Rain style.org
165 points by ams1  12 hours ago   16 comments top 7
mutagen 8 hours ago 1 reply      
"If the only thing you're doing is coming up with a single number, then you're doing arithmetic, not visualization. ... And I think that the goal of visualization is not finding elaborate ways to encode information. I try to encode as little as possible. ... But to me this feels like imposing a design on the data, and drawing attention to the design more than the data."

The whole thing is great, I'm glad I stuck with it past the first few images to see where he's going. These bits stuck out though and his work is so often things I wish I had thought of.

martindale 1 hour ago 0 replies      
Every so often, you follow a link that you'd thought not worth the click. Therein you discover just how wrong one can be.
wmeredith 5 hours ago 0 replies      
I wish I had more than one upvote to give to this talk. I was giddy half way through it and my jaw was hanging open by the end of it. What an inspiring look at the design process. I'm an interaction designer and this REALLY spoke to me.
K2h 3 hours ago 0 replies      
This is a great presentation on how to effectively communicate data; it's not just putting raw data on a slide. It takes time and major creativity to communicate. Read the presentation for another reason, too: it's a good example of a catchy, relevant intro tied to a closing... something that makes us all better communicators.
mdda 3 hours ago 3 replies      
One thing that surprised me was the statement about a sun that would soon engulf the nearby planet - and being so relatively large that the planet was 'more than half' in sunlight (~ the sun's disc shines 'around the corners').

But isn't that true on earth too (to a much smaller degree)? As long as the sun's radius is larger than the earth, then sunlight will fall simultaneously on more than a half of the earth's surface, no?

reneherse 3 hours ago 1 reply      
Process and presentation at the level of... poetry.
Gracana 4 hours ago 2 replies      
Fantastically interesting.
Show HN: Crushify.org crushify.org
8 points by MarkIceberg  1 hour ago   9 comments top 6
jey 22 minutes ago 0 replies      
This used to be extremely common as a naturally viral way to harvest email addresses, the prompt was "put in the email addresses of your crushes".

What's your twist that sets it apart from the usual variants? (Sure, building it on top of Facebook could potentially provide some additional benefits, but what are they?)

aegiso 1 hour ago 0 replies      
Solve two nearly insurmountable problems and you have a winner:

1) Make it impossible to game.

2) Chicken and egg.

I have no idea how to solve these problems, but it's a concept ripe for disruption if you do.

techaddict009 1 hour ago 0 replies      
Seems similar to bang with friends. Except the bang part.
ddorian43 1 hour ago 1 reply      
Why not limit it to, say, 3 friends? You can't have a crush on every friend (you might want to bang them, but not have a crush on them)?
Comkid 1 hour ago 1 reply      

There are more examples of apps just like this that came before (this is the only one that comes to mind at the moment), the only way I would be remotely interested is if they figured a way to solve the inherent problem of selecting all/some of your friends as crushes to see who had you as a crush, and then what happens when someone legitimately adds the person who was just checking everyone's crushes.
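The "select everyone to see who picked me" attack Comkid describes can at least be blunted with the cap ddorian43 suggests above. A toy sketch of the matching logic, under an invented data model (a dict of each user's selections), not the site's actual implementation:

```python
def mutual_crushes(crushes, limit=3):
    """Given {user: set of crush names}, return the set of mutual pairs.

    Anyone selecting more than `limit` people is ignored entirely, which
    makes the enumeration attack self-defeating: marking everyone as a
    crush disqualifies you from all matches.
    """
    honest = {user: picks for user, picks in crushes.items() if len(picks) <= limit}
    matches = set()
    for user, picks in honest.items():
        for other in picks:
            # A pair matches only when both selections survive the cap.
            if user in honest.get(other, set()):
                matches.add(frozenset((user, other)))
    return matches
```

It doesn't solve the second problem — someone you were merely probing later adding you legitimately — but it removes the cheap bulk version of the attack.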

f_salmon 1 hour ago 1 reply      
Props for your effort!

What I dislike (not specifically about your project, that is!) is the idea that, as a society, we're incentivized to increasingly hide behind our screens instead of growing some balls and actually living in "real life". Here, we're talking about dating. The other current topic: how we intend to fight democracy-destroying mass surveillance by (apparently) simply sitting behind our screens.

Total security in a PostgreSQL database ibm.com
150 points by amirmansour  14 hours ago   32 comments top 9
tptacek 12 hours ago 4 replies      
This is an interesting, detailed, and well-written article.

Let me caution you though: in most applications, if you concede to an attacker INSERT/UPDATE/SELECT (ie: if you have SQL Injection), even if you've locked down the rest of the database and minimized privileges, you're pretty much doomed.

Most teams we work with don't take the time to thoroughly lock down their databases, and we don't blame them; it's much more important to be sure you don't give an attacker any control of the database to begin with.
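tptacek's point is easy to demonstrate. A self-contained sketch using Python's bundled sqlite3 — the principle carries over to PostgreSQL drivers, which use `%s`-style placeholders instead of `?`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the query itself.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '%s'" % attacker_input
).fetchall()
print(leaked)  # [('hunter2',)] -- the OR '1'='1' makes every row match

# Safe: a bound parameter is treated as data, never as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "nobody' OR '1'='1"
```

Once the attacker controls SELECT like this, per-role privilege lockdown rarely saves you — which is exactly the comment's point about preventing injection in the first place.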

a1a 20 minutes ago 0 replies      
Are they seriously recommending the usage of unsalted md5?

Edit: Oh, the article is from 2009 (I'd say it was bad practice even back then though).
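For reference, the fix being alluded to is a per-user salt plus an iterated hash. A minimal sketch using only Python's standard library — a real deployment should prefer bcrypt/scrypt, or pgcrypto's crypt() inside the database, as mentioned elsewhere in the thread:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, rounds=100_000):
    """Return (salt, digest) using salted, iterated PBKDF2-SHA256."""
    if salt is None:
        salt = os.urandom(16)  # a fresh random salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt, digest

def verify_password(password, salt, expected, rounds=100_000):
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return hmac.compare_digest(digest, expected)  # constant-time comparison
```

Unsalted md5 fails on both counts this sketch addresses: identical passwords hash identically across users, and a single fast hash invites brute force.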

dizzystar 13 hours ago 4 replies      
Very nice article.

The section under "the ideal administrator" is quite eye-opening. I pretty much use PostgreSQL exclusively, and I've found that every time I learn something new, there is another mile of learning to go, and that feedback cycle never seems to end.

I have a few PostgreSQL-specific books on admin and server programming, but I wonder where I would be able to go to really learn this stuff. Are there any classes or places to go for this sort of SQL training?

How does one go about becoming a total master at this? I find that, out of all the programming that I do, I love working with SQL the most and I want to dive deeper into it.

rubiquity 13 hours ago 0 replies      
DeveloperWorks puts out some really great content from time to time. This article and their article on POSIX Asynchronous I/O in Linux[0] are two of my favorites.

0 - http://www.ibm.com/developerworks/library/l-async/

sehrope 8 hours ago 0 replies      
Pretty good article but had to laugh when I read this:

> Common practice dictates that passwords have at least six characters and are changed frequently.

nasalgoat 10 hours ago 0 replies      
An excellent article, but it brings up a question about authentication using the various load balancing tools out there, such as pgPool or pgBouncer. I've found the auth tools in them to be extremely poor, to the point that it's easier to just leave it off.

Has anyone gotten it to work transparently?

yeukhon 11 hours ago 0 replies      
Isn't default postgres user password authentication still MD5?
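Yes — the password method is set per connection type in pg_hba.conf, and in the PostgreSQL versions of this era `md5` (a challenge-response on the wire, not cleartext) was the standard choice. A hypothetical fragment, values illustrative rather than a recommended production config:

```
# pg_hba.conf (illustrative entries)
# TYPE  DATABASE  USER  ADDRESS        METHOD
local   all       all                  peer    # local socket: OS user must match DB user
host    all       all   127.0.0.1/32   md5     # md5 challenge-response, not cleartext
host    all       all   0.0.0.0/0      reject  # refuse everything else
```

The weakness a1a raises upthread is separate: `md5` here describes the wire protocol and stored-hash format, and the stored form is still an unsalted-style md5 of password plus username.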
kbar13 13 hours ago 0 replies      
Wow, this is very in-depth. Bookmarked for reading when I get home. Thanks for sharing!
angry_octet 12 hours ago 0 replies      
Great read. But I was disappointed that it didn't mention other password encryption schemes, i.e. Blowfish: www.postgresql.org/docs/8.4/static/pgcrypto.html
Turkey pushes through new raft of 'draconian' Internet restrictions theguardian.com
65 points by wolfgke  9 hours ago   28 comments top 13
eknkc 8 hours ago 1 reply      

They created a new office that has the authority to block any website url / domain / ip address without prior notice or court order. All ISPs are required to apply restrictions within 4 hours. Plus, all ISPs are required to log HTTP access + IP + Date pairs and store them for two years for government access (again, no court involvement)

aytekin 4 hours ago 0 replies      
"any bureaucrat can now decide to take down a certain website without having to apply for a court order, but you will need to take that decision to court in order to get it reversed."

This is going to be used/abused by the Islamic government to silence opposition or to cover up corruption news. That's the main goal of the law.

middleclick 8 hours ago 0 replies      
Something this article does not cover [1]:

"The law requires service providers to take down objectionable content within four hours and any page found in violation by the country's telecom authority or face fines up to $44,500."

[1] http://www.usatoday.com/story/news/2014/02/06/turket-interne...

Nux 8 hours ago 1 reply      
Well, if Britain can do it, why not Turkey? Sure, the details differ, but the story is the same essentially.
atmosx 5 hours ago 1 reply      
I wrote a couple of articles about Turkey in 2007. I never expected Recep Tayyip Erdoğan to deteriorate so much. Of course I'm not referring to this specific law, but to his reactions and his unwillingness to step down after pretty much the most successful decade a politician could hope for (2000-2010).

Regarding the law: with all these new technologies floating around, I think censorship will become more and more difficult for the government to enforce on geeks, but I firmly believe the people of Turkey should rely mostly on overturning the part where an ISP blocks whatever website without a court order, and not on tech to circumvent it. That could turn nasty, and jails in Turkey are not fancy.

Given recent events, I think the law is going to be used in order to fight anti-establishment (as in anti-current-government) websites mostly and not pedophilia (surprised no one mentioned pedophilia yet...), copyright, etc.

yawz 5 hours ago 0 replies      
Unfortunately this is just one of the current oppressions in Turkey. This pro-Islamic government has limited individual freedom with every step it has taken, ironically in the name of "advanced democracy". Scare tactics, such as putting people on trial for tweeting anti-government ideas or sharing something on Facebook, have become common practice.
jotm 8 hours ago 0 replies      
I guess they're quitting the Internet... Cold Turkey!
harunurhan 2 hours ago 1 reply      
Normally you'd spend a week of your time reporting insulting content; this regulation speeds that kind of process up. Also, they have the right to block for only 24 hours; if you do not take it to court, the content is unblocked again. In addition, they are already keeping a record of everything we do on the internet, with no court involvement; actually, as you know, it is much worse in other countries (USA, UK...). In short, I think the regulation is being exaggerated. Moreover, funny but pathetic: most Turks who complain about it don't even know what changed. They just complain to complain; they complain because they don't like the government.
bediger4000 8 hours ago 0 replies      
I'm surprised that this didn't come about in the guise of "protecting intellectual property". That seems to be the usual beard when imposing internet censorship in western countries.
adventured 9 hours ago 0 replies      
I wish this sort of behavior, by any government, would draw immediate, global, and severe economic sanctions. Repression is repression, whether online or off. I feel like most of the governments around the world are intentionally refraining from considering rights to fully or properly translate to the online sphere, so they can all first figure out in which ways they want to restrict or abuse their own domestic populations.
throwwit 3 hours ago 0 replies      
I wonder if news coverage of the restrictions is restricted in Turkey.

It'd be nice if SOME country started leading by example. 'Safeguards' like these are gambling on the premise that they will -never- be abused, which is always a lofty bet, if not flat-out disingenuous.

dombili 5 hours ago 0 replies      
alionfalcon 8 hours ago 1 reply      
A kind of legal NSA, more honest than the US way, isn't it? But it's nonsense anyway!
Your Progress As A Programmer Is All Up To You thecodist.com
128 points by ks  14 hours ago   43 comments top 15
stdbrouw 10 hours ago 1 reply      
I think it's possible to agree with both this post and with the other programmer he's railing against.

Yep, I know that if I don't take care of my own training, it's just not going to happen, most companies are not so altruistic that they'll hand me everything on a silver platter.

But at the same time, a company that never hires people unless they already have the exact skillset they're looking for, a company that fires people on a whim because priorities change, and a company that provides zero incentive for people to keep learning (e.g. with 20% time or a willingness to let employees experiment with new tech) well, those are not companies I want to work for.

JackMorgan 8 hours ago 4 replies      
I have been writing and rewriting a post for the better part of a year on this same thing. I think I've finally accepted that a majority of programmers actually are not interested in learning new things.

I got into software because there was so much to learn and explore, so this realization still baffles me. Why on earth would someone want to do this job and not want to learn new things? It's like a baseball player who hates being outside.

Not only that, but often I'm faced with situations like the author's, where people I've worked with actively prevent those around them from learning new things on the job. "No, don't write this standalone module in Python, our standard is PHP; it was good enough 5 years ago, it's good enough now!" (in a four-man shop).

As someone who loves constantly learning more, it's suffocating to be around people who are so paralyzed. I simply cannot fathom the fear that drives someone to respond to the offer to learn something new on the job with: "no thanks, I'm happy becoming obsolete, and you can't learn it either, because I might have to support it one day, and I'm not interested in learning anything new!"

ryanobjc 9 hours ago 1 reply      
Interesting article, sounds like the author has really taken charge of their career and managed to do well.

Now, what about an alternative world where he didn't "get OO"? Or a life where he had children and no time at work to learn? Or one of these newer, not-quite-as-successful software companies with no money and no extra time?

Keeping up with new tech requires time and money. Startups provide neither. Even bigger "startups" try to keep up the illusion of a smaller company, including mandatory overtime and no extras (e.g. tuition reimbursement, sabbaticals, more than 2 weeks of PTO a year, etc.).

The other thing is that computing as a career is quite a bit harder, more complex, and more highly competitive than when the author had his formative years.

The real rallying cry is how do you make an industry that respects career advancement?

mwfunk 11 hours ago 1 reply      
This is absolutely true, at least from my perspective. All of the programming jobs I've had in my career have been for medium- to freaking-huge companies, and most of my projects at those jobs have required me to do a ton of self-teaching to get up to speed on a bunch of technical (or scientific, or mathematical) stuff that I had no previous exposure to, in order to get my job done. On only one occasion did I ever get any training for anything, and that was just for two days.

I guess that's a combination "back in my day/get off my lawn" statement, plus a little whining, and maybe a humblebrag, but I don't think that's an unusual story at all for software developers.

I interned at IBM during grad school with a team of consultants that all did enterprise Java stuff for financial institutions- that was very different. IBM would frequently send those developers away for a week or more at a time, multiple times a year, to get training on specific technologies. I'm not sure how common that is anywhere other than IBM though, or if IBM even does that anymore. Maybe Google does it? I don't know.

Sometimes I deal with developers who either can't or won't teach themselves anything, and can't or won't learn by doing. They absolutely need someone to hold their hand and explain things to them every step of the way, and they will just throw their hands up in the air and fail before putting any time into trying to read up on whatever topic is giving them trouble. I don't know what to attribute this to, so I'm trying really hard to not jump to the conclusion that they suck or they don't care or whatever. I'm sure a lot of them do just suck at their jobs and/or just don't care, but maybe some of them have genuine problems with learning that aren't their fault. The only thing I can say for sure is that this is a trait that is a major impediment to their careers and getting their jobs done without sucking up too much of their cow-orkers' time (as we all know, orking cows requires long stretches of uninterrupted concentration).

TL;DR Spot on, and being able to develop your own technical skills to keep up to date and expand your horizons is absolutely critical to being a really successful developer. You are also the only person that you can count on to do this for you. You can't really count on any employer, even some mythical ideal company with bottomless resources that treats each employee as a magical snowflake, to do this. Even if your company does provide training, it's not necessarily going to be the training you want or need to receive.

zzzcpan 10 hours ago 1 reply      

  > Today keeping up is a ridiculous job sometimes. 
Even though he advises people to keep up, he actually kind of admits how ridiculous this is today. It's not possible to keep up anymore; there are just too many people creating too many things: languages, frameworks, technologies.

krob 3 hours ago 0 replies      
I see it this way: the better the company you work for, the more responsibility it will generally feel for keeping you educated, and the more likely it is to provide opportunities for you to stay current in the field you're paid to work in. The poorer the quality of the company, and generally the more you have to do on a regular basis as the primary individual doing your job, the less chance you will have to learn new technologies. Unless your company feels external pressure for you to pursue these new techs, you are on your own. Small shops with bad scheduling will make it impossible to carve out time for new technology stacks. You inevitably end up pigeonholed into what you always use.

I think ultimately, many people in the industry, they only get to learn new tech when they leave for their next job. The pressure is momentarily reduced while they learn at their new job.

Just my 2 cents.

arms 11 hours ago 1 reply      
Terrific post. This is exactly the type of individual I want to work with - someone who recognizes that they're in charge of their own advancement, and doesn't lay blame on any outside factors. As developers/builders/hackers we are ultimately responsible for our own success or failure.
fleshweasel 11 hours ago 0 replies      
"You might learn useless stuff. But learning is never itself useless."

Great article. Thanks for posting.
maerF0x0 8 hours ago 0 replies      
That's why you work for a company where your skills are the product. Everyone loves selling a better product, so you'll get upgraded... If you are a cost center, then you'll be nicked and cut and eventually hacked at until there is next to nothing left.
clmorg01 10 hours ago 1 reply      
This is basically the choice I finally had to make for myself over the last couple years.

It was just three years ago that my main responsibility was maintaining code on a black & yellow terminal for a VMS server. Another couple of years and I could easily have been one of those people pushed out of the industry with no easy way back in.

Although my company has provided an avenue for me to transition to doing things with the LAMP stack, it is still in some sense legacy. It's a large website base that started over a decade ago.

I have made the choice that I'm done with being legacy and am doing whatever I can to learn current tech. I will even be willing, sometime later this year, to take a new job at a junior level just so I can cut loose the legacy code crap I am tied to. At this point it feels mostly like a bunch of anchors holding me down. I want a new job where I can learn from the people around me and truly be focused on my direction.

alashley 9 hours ago 0 replies      
I've talked to other developers, and what I've understood is that the best way to learn new stuff is to get paid to do so. I'm just wondering how you keep up if your job demands so much of you that you can't keep up with anything besides your main stack.
ZeppelinDePlomo 11 hours ago 1 reply      
If it has nothing to do with increasing production, then the company has no business investing in you learning it. BUT, if the company can benefit from you learning those skills, then it could be a missed opportunity not to give you the resources to do it (learning it on the company's time).

Of course it's all a product of culture and supply-demand (systemic), if there are enough great programmers that are willing to learn everything on their own time, then of course it will become the norm that programmers should learn everything on their own time. And, of course, that's great for the employers.

higherpurpose 8 hours ago 0 replies      
If I were you at this point (actually 2-3 years ago), I would already have started going back to using Java for Android, too. iOS will be on a billion devices in 3 years, but Android will be on 3 billion, so the impact is much greater, and probably the revenues, too.
dinkumthinkum 4 hours ago 0 replies      
Yeah, I think this is fine. But is it also not irritating to think that following every web mvc framework fad is really "keeping up to date" with programming? This seems to be a very common view and I don't think it is any less irritating. :)
6d0debc071 10 hours ago 0 replies      
You may as well say that your job advancement is your responsibility and not your employer's. And it would be true, strictly speaking you're rarely owed promotion - even if you perform incredibly well, there are no guarantees. However, someone could still not wish to work in a dead-end job.

It feels to me that that's the sense in which the young man's comments were meant. It doesn't seem unreasonable in that light. So the compensation he'd like isn't entirely monetary in nature, that's hardly unique.

Bitcoin Ponzi scheme ponzi.io
283 points by runn1ng  15 hours ago   247 comments top 61
RyanZAG 14 hours ago 4 replies      
I'm trying to work out if this is some grand statement on the similarities of Bitcoin to a Ponzi scheme in the way it's deflationary, or just a quick way for someone to make a few BTC. Maybe it's designed as a way to teach people about Ponzi schemes?

Got to say, I pity the person who eventually deposits too much money at once, causing the payments to pause while they build up enough to cover his large deposit, in turn causing everyone else to think that the money has stopped paying out and causing no further money to be deposited. That seems like the likely end of this eventually...

bernardom 15 hours ago 1 reply      
This is a beautiful piece of art/social experimentation/I-don't-know-what.
mmaunder 14 hours ago 2 replies      

Ponzi Scheme Enforcement Actions

Curtailing Ponzi schemes and holding those responsible for these scams accountable is a vital component of the SEC's enforcement program.

Since fiscal year 2010, the SEC has brought more than 100 enforcement actions against nearly 200 individuals and 250 entities for carrying out Ponzi schemes. In these actions, more than 65 individuals have been barred from working in the securities industry. The SEC also has worked closely with the U.S. Department of Justice and other criminal authorities on parallel criminal and civil proceedings against Ponzi scheme operations.


Source: http://www.sec.gov/spotlight/enf-actions-ponzi.shtml

Keep in mind that BTC or virtual money is just another asset class.

FiloSottile 12 hours ago 8 replies      

    The experiment is over.
    We will pay back everyone we can. We are not making money from this.
And it's down.

runn1ng 14 hours ago 1 reply      
What I realized now: if you look at their BTC balance




their "debt" - meaning, what other people have to bring into the system - is their balance + 20%. Right now, their debt is about 42 bitcoin.

lucb1e 11 hours ago 3 replies      
Should have known! Ten minutes before I expected 1.2x to be gathered and my payout made, the site reports "The experiment is over." Come on guys, just let it run. I would have accepted it if it had 'died of natural causes' and I'd lost everything I put in (which is not much, 0.03); that was my gamble. But not that you'd pull the plug.

Somehow their pulling the plug bothers me much more than losing the money would have. Especially because, at the rate it was going, my payout was more or less assured (300 BTC * 1.2 = 360; they quit 16 coins short). Right now, two hours after they supposedly would pay everyone back, I still have nothing.

sillysaurus2 13 hours ago 1 reply      
Here's someone who recently sent 1 BTC to this ponzi scheme:



I'm going to watch and see what happens to them. If they don't lose their 1 BTC, then that's at least slightly interesting.

EDIT: It's been more than an hour; no repayment yet.

runn1ng 15 hours ago 1 reply      
I will add, since this hit the frontpage: nobody post any big money there. It's a Ponzi scheme. It says so in the title.
dwaltrip 14 hours ago 5 replies      
No one finds this repulsive? People will lose money to this. Yes, they are dumb, yes it is obvious. But those who participate are still morally complicit. Enabling others who suffer from serious issues (addiction, gambling) seems like a pretty shitty thing to do.
shalmanese 13 hours ago 0 replies      
If you think Ponzi schemes won't work if you tell people that they're a ponzi scheme, MMM-2011 was a Ponzi scheme in Russia that hooked people despite nakedly advertising that it was a scam: http://www.bloomberg.com/news/2012-06-06/is-global-finance-a...
udfalkso 13 hours ago 1 reply      
So how do the creators of this make money from it?

The best solution is one that doesn't infringe on the "correctness" of the game, and it's a simple one. Simply play the game yourself. Send money in, let the system send money back. Do it a lot. Many small transactions. You will never lose, because when the game ends you are the owner of the actual account and won't get screwed like everyone else.

Right? So, perhaps many of the transactions we're seeing go into this are suspect, and the total amounts aren't quite to be trusted?

tommorris 15 hours ago 2 replies      
I'm sure the Dogecoin Ponzi scheme will be better.
legojoey17 10 hours ago 0 replies      
I deposited the minimum just to give it a go (80c... can't hurt?) and the Ponzi sent me back an absurdly larger amount of bitcoins... Not quite sure what happened, but hey. When this hit the top of Hacker News the site wasn't loading any bitcoin stats or such, so I feel the load must have had something to do with it. (That, or I just got double-spent on, which would be a real sneaky trick: get a user to send back the 'perceived' amount, but coming out of their own pocket.)

This is personally why I wouldn't trust any program that handles currency so openly on the web. The average user's inability to stress-test applications or put them through proper testing can expose quite a fault. Having experienced the software cycles that banks put their systems through, there is only a tiny chance someone would have the resources to properly engineer something so fragile (relative to money).

Because of this I would assume the main reason the author actually shut the site down (or at least so suddenly) was because of scaling technical issues.

a3_nm 14 hours ago 1 reply      
This would be cooler with one of those other cryptocurrencies that feature a Turing-complete scripting language: the code of the scheme could be public and run in the blockchain so that everyone would know the system is fair.
ryanskidmore 12 hours ago 2 replies      
Something is not quite right.

Deposited 0.05859407 BTC ( https://blockchain.info/tx/c5411ae7ad41d6dab5dd879c158cb81f0... )

Received 0.0599 BTC back ( https://blockchain.info/tx/ef7f32df518dabb104812ea4a12719026... )

1.2 * 0.05859407 = 0.070312884 BTC

So, somebody owes me 0.010412884 BTC
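The arithmetic in the complaint above is easy to check; a quick sketch (the 1.2 multiplier is the site's advertised 120%, the amounts are taken from the transactions listed):

```javascript
// Advertised payout: 120% of whatever was deposited.
const MULTIPLIER = 1.2;

const deposited = 0.05859407; // BTC sent in
const received  = 0.0599;     // BTC actually paid back

const expected  = deposited * MULTIPLIER; // what 120% would be
const shortfall = expected - received;    // what is still owed

console.log(expected.toFixed(9));  // 0.070312884
console.log(shortfall.toFixed(9)); // 0.010412884
```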

CurtMonash 12 hours ago 2 replies      
To be technical, there are multiple inaccuracies in "Get 120% back when the next person sends".

First, as per the central problem of Ponzi schemes, it is missing an "... if ever" at the end.

But further, it is oversimplified, because people can submit different amounts of bitcoins. Covering up that uncertainty -- which while apparent is not as clearly disclosed as the "Ponzi" aspect -- makes it harder for people to assess the likelihood they will get the promised return.

Kiro 12 hours ago 0 replies      
"The experiment is over. We will pay back everyone we can. We are not making money from this."
makomk 15 hours ago 1 reply      
There have been a few Bitcoin sites like this, I think the original one was Bitcoin Gem. Most of them turned out to be scams (and I don't just mean that they were Ponzis).
pmikal 12 hours ago 3 replies      
Website shows they've shut it down.

"The experiment is over. We will pay back everyone we can. We are not making money from this."

ck2 13 hours ago 2 replies      
I'm impressed by the vanity address, must have taken a lot of cpu power to hash that one out.
kokey 14 hours ago 0 replies      
I've always been fascinated by ponzi schemes, or other pyramid and chain structures. Most of the legislation against it that I've looked at, attacks it from a deception angle. So, basically, if it's honest about what it is and your chances are to make or not make money from it, it may be within the law of many countries. I have never looked at it from a cross border legal perspective, that might make it even more interesting.
coherentpony 14 hours ago 0 replies      
Oh jesus fucking christ. People have sent over 200 BTC to that. :/
sashazykov 15 hours ago 2 replies      
http://bitcoinpyramid.com/r/230 is 3 years old (and still paying :))
tobz 14 hours ago 2 replies      
Why bother with a faceless ponzi scheme? Send your bitcoin directly to me, and I will invest it by hand, making sure to maximize your return: 1HbNxRhrv5Jocr7Q9ZqbbqCwuNNy4UsR77

This is totally legit. Seriously.

dror 13 hours ago 1 reply      
This is almost perfect except for not including a warning:

"Warning, if people stop depositing money, you won't get 120% back, and you could lose all your money."

Also, it seems like the person running the site is not taking a cut. If that's right, he's not making a profit, and he's less likely to get in trouble when things collapse.

refrigerator 15 hours ago 1 reply      
If I had any Bitcoin, I would definitely play this once or twice
n1c 1 hour ago 0 replies      
Funny, I also built something like this in December.


bbosh 12 hours ago 1 reply      
It is either a scam or very poorly coded. I've seen one transaction in the blockchain for 7BTC from yesterday that hasn't been paid. But, people sending 0.01BTC today are being paid. Any proper Ponzi scheme would pay back in time order, not by order of value.
breeezzz 11 hours ago 1 reply      
I threw some old satoshi-dice coins at it (0.25) and was saddened by the header announcing that "The experiment is over."

1.5 hours later 0.30BTC showed up in my wallet! Wow, did not expect that!

3rd3 15 hours ago 1 reply      
How does that work exactly? I pay in X BTC, and two other people pay in, say, 0.5 X and 0.7 X, and then I get those BTC back? How exactly will my X BTC be distributed among the other gamblers?
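One plausible model, and only a guess since the site's code isn't published: deposits join a FIFO queue, each owed 120% of what it put in, and incoming coins pay off the oldest unpaid deposit first. A minimal sketch:

```javascript
// Toy model of a "transparent" 120% Ponzi: deposits queue up, and the
// pool of incoming coins pays the oldest depositor in full before
// anyone else. This is an illustrative guess, not the site's code.
function createPonzi(multiplier = 1.2) {
  const queue = []; // unpaid depositors, oldest first
  let pool = 0;     // coins not yet paid out

  return {
    deposit(sender, amount) {
      queue.push({ sender, owed: amount * multiplier });
      pool += amount;
      const paid = [];
      // Pay from the front while the pool covers the full 120%.
      while (queue.length > 0 && pool >= queue[0].owed) {
        const next = queue.shift();
        pool -= next.owed;
        paid.push(next); // in reality: send next.owed BTC to next.sender
      }
      return paid;
    },
    unpaidCount: () => queue.length,
  };
}
```

With the numbers above (X = 1 BTC, then 0.5 X and 0.7 X), the first depositor is paid in full only once the pool reaches 1.2 BTC, and whoever deposited last is always waiting on coins that haven't arrived yet.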
Kiro 15 hours ago 2 replies      
A bit OT but how do you build automatic Bitcoin services like this?
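At a high level, services like this are just a loop that watches for incoming transactions and reacts. In the sketch below, `listNewDeposits` and `sendPayment` are hypothetical stand-ins for whatever wallet API you would actually use (for example bitcoind's JSON-RPC interface, or a hosted wallet's notification callbacks); nothing here names a real API:

```javascript
// Skeleton of an automated payout service. listNewDeposits() and
// sendPayment() are hypothetical placeholders for real wallet calls;
// in practice you would poll a node over JSON-RPC or register a
// payment-notification callback with a hosted wallet provider.
function processDeposits({ listNewDeposits, sendPayment }, multiplier = 1.2) {
  const results = [];
  for (const { sender, amount } of listNewDeposits()) {
    // Naive 120% payout; a real service must also track whether
    // the pool of received coins actually covers what it owes.
    results.push(sendPayment(sender, amount * multiplier));
  }
  return results;
}
```

The hard parts in practice are the ones this sketch skips: waiting for confirmations, handling transaction fees, and never paying out more than the pool holds.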
loucal 12 hours ago 1 reply      
I'm wondering though how they can afford to pay some people 400%. Kind of seems like a bug might be making this less sustainable than it normally would be.


epmatsw 13 hours ago 1 reply      
Huh. I got paid back. I'm actually somewhat surprised. Well, I made 0.0002.
randomflavor 12 hours ago 0 replies      
Just tried sending .2, and got back .119, so there's a bug or he's skimming. lol. Anyway, interesting to make this.


agent462 9 hours ago 1 reply      
First, I fully understood what I was getting into.

I first deposited .1 bitcoin to see if it was real and got 0.11978 back.

Ok, this will be fun.

I then deposited .8591 and got back 1.1999. Ha this is hilariously fun. More gambling!

I then sent 1.2001 and got back 0.8589. Wait what..

The game ended and he skimmed from me to hopefully pay someone else back and not pocket it.

thrush 15 hours ago 2 replies      
How much does the creator make?
RRRA 12 hours ago 2 replies      
Can anybody tell me how come I've sent close to the minimum and got almost 100 times more? (This is not a joke; algorithm fail?)
kolev 10 hours ago 0 replies      
So, ponzi.io is a Ponzi scheme squared?
Kiro 14 hours ago 0 replies      
I just tried and got paid within 30 minutes.
fnsa 12 hours ago 1 reply      
shameless plug: my ncurses thin SPV bitcoin client for linux/mac:


  * 100% C code,
  * support for linux and mac platforms,
  * console based: uses ncurses,
  * home grown async network i/o stack,
  * home grown poll loop,
  * home grown bitcoin engine,
  * supports encrypted wallet,
  * supports Tor/Socks5 proxy,
  * multi-threaded,
  * valgrind clean.
You'll need basic dev skills to install it: check out the code, install dependencies, then build. It's all in the README file. I'm interested in all kinds of feedback you may have: feature requests, bugs, etc. Thanks!

erikb 13 hours ago 0 replies      
I wouldn't be surprised if it was posted by the actual author, and if his advertising here is actually successful.
harrigan 14 hours ago 0 replies      
This was tried out about a year ago: https://bitcointalk.org/index.php?topic=138749.0. The rules were originally very similar to the website above; then the owner modified them so that the gem randomly "reset".
amenghra 10 hours ago 0 replies      
A few small tweaks could make this into a money laundering / money-trail mixing tool.
jasonvolpe 11 hours ago 0 replies      
http://coincurious.com/ is a more interesting experimental art piece as it's not a blatant scam.
yatakaka 13 hours ago 0 replies      
Careful. I just sent .3, complained in the chat and was subsequently blocked...
RobinL 15 hours ago 1 reply      
What a great illustration of how investment bubbles can rationally occur.
leugim 15 hours ago 0 replies      
Cool cool cool

Human stupidity all on one web page, from Carlo Ponzi himself to Bernard Madoff. The whole Ponzi scheme, portrayed on a single site.

I love it.

melarina 12 hours ago 0 replies      
> Transparent Bitcoin Ponzi scheme

needs more parentheses

xcyu 15 hours ago 0 replies      
Refresh the page to see some "live" action.
vezzy-fnord 15 hours ago 0 replies      
Charles would be proud.
sunshinerag 14 hours ago 0 replies      
I bet this ponzi scheme will be much short-lived compared to the one run by federalreserve.io.
dogewow 9 hours ago 0 replies      
Looks like the fun has moved to http://dogepound.l8.lv/

Who wants to make some money before the prices get too high ;)

smartistone 14 hours ago 0 replies      
Well, this is how folks get rich in the smartist era, by creating schemes like this. Or buying facebook & google stock. Or creating/investing in the next web 2.0 billion dollar company. A 9-5 chump would have to work for years to make this kind of money, assuming he paid off all his college loans.
97s 15 hours ago 1 reply      
This is amazing.
magic8ball 10 hours ago 0 replies      
Just received my 1BTC back. The "experiment is over" seems to be legitimate.
avodonosov 12 hours ago 0 replies      
very good idea
1Ponzi 12 hours ago 0 replies      
Don't trust any new address. A legit Ponzi address starts with 1Ponzi, like: 1PonziAwQpnfj15Br4YFJHygWtd5LQKgN1. This is the new address that will be up shortly. Early birds get 1.35X payout.
yatakaka 13 hours ago 0 replies      
tzakrajs 14 hours ago 0 replies      
A ponzi scheme on top of a ponzi scheme. Brilliant.
breaker05 14 hours ago 0 replies      
I'd love people to donate a few BTC that I can use for more noble causes.


powera 14 hours ago 2 replies      
This is quite possibly the worst post to ever hit #1 on Hacker News. Why are you people voting it up? Just because it admits that it's a scam (and "get your money plus 20% back for nothing ALWAYS ENDS UP A SCAM") doesn't mean that it should get posted here.
The Computational Complexity of Machine Learning utexas.edu
3 points by luu  50 minutes ago   discuss
Every line of code is always documented uniqpath.com
119 points by jessaustin  15 hours ago   81 comments top 22
RyanZAG 14 hours ago 5 replies      
You really shouldn't have to be relying on history for all of that context. It should definitely have been a function simply called 'triggerLayout()'. Then the exact and best method for triggering layout could be put in that function and used throughout the project where necessary, and easily updated if a better method of triggering layout comes along.

Code like this is extremely brittle with or without that git history. Don't rely on it like a crutch: it makes the code obscure, and updating one line may leave other similar lines inconsistent. If you wanted to change the triggerLayout behavior, you would have to go through the git commit log for every .clientLeft line to see whether each one was or wasn't used for triggering layout...
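The refactor being suggested might look something like this; the function name and comments are illustrative, not the actual library source:

```javascript
// Force a synchronous layout ("reflow") on an element. Reading
// clientLeft (or offsetHeight, etc.) is what makes the browser run
// layout; the value itself is irrelevant. Needed so CSS transitions
// on elements just added to the DOM actually animate instead of
// jumping straight to their final state.
function triggerLayout(element) {
  return element.clientLeft;
}
```

Call sites then read `triggerLayout(el)` instead of a bare `el.clientLeft`, so the intent is visible without any git archaeology, and there is a single place to change if a better reflow trigger comes along.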

crazygringo 12 hours ago 3 replies      
But why would you only comment this in the git history?

Why on earth wouldn't you place this explanation in a comment preceding the line of code itself?

One of the best things I ever learned in programming is that you shouldn't write code to be executed -- you should write code to be read and understood by other people.

It's going to take me forever to read your file and understand what's going on if I have to do a git blame on every other line.

Just use short purpose-based commit messages (fixes a bug where..., so now...), and then put the actual why behind the implementation in the source code comments itself!

jasonkester 1 hour ago 1 reply      
Here's me doing the HN thing and analyzing the code sample rather than the article itself. Sorry in advance.

But if you're going to do this trick and you use a code compiler of any sort, you'll need to assign the value of that clientLeft somewhere. Otherwise your compiler will notice it not doing anything and helpfully optimize it away. So your users in production will see your layout bug and you'll never be able to reproduce it in development.

JavaScript is awesome.

As to the article itself, I'd prefer to see a comment on a line as wacky as this one. It's one well-intentioned lop away from vanishing from that git blame entirely, and then six hours of debugging and research away from finding its way back into place.

Trying to sift through file history to understand what's going on is hard enough on code I wrote myself only a year ago. I wouldn't want to rely on it as the only way of digging into a large shared code base. Yikes.
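A sketch of the defensive version this comment argues for: capture the read so an aggressive optimizer can't decide the expression is dead. (Whether a given minifier actually strips bare property reads varies; treat this as belt-and-braces, not a claim about any specific toolchain.)

```javascript
// A bare `element.clientLeft;` statement has an unused result, which
// some optimizers may remove as dead code, silently losing the forced
// layout. Assigning and returning the value keeps the read "used".
function forceLayoutSafely(element) {
  const reflowTrigger = element.clientLeft; // side effect: layout runs
  return reflowTrigger;
}
```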

jakejake 12 hours ago 5 replies      
This is probably not a popular view, but I don't really understand why comments are viewed by some people as a bad thing. I agree that useless, redundant comments are not helpful. But that doesn't mean all comments are useless. I don't agree that well-written code never needs comments either. Reading the code tells you what it does. It doesn't always tell you why it's there.

Digging through version control comments seems to me a last-ditch effort to figure out what some code is doing. If the project had any significant history you could be digging through hundreds of commit messages. Why not just put a 1-liner comment above that line and save every subsequent developer the hassle of wondering what the heck that seemingly useless line does..?

zmmmmm 10 hours ago 1 reply      
This would only seem to work for relatively fine grained commits or projects in maintenance mode.

Apart from that, rigorously applying it would break the author's own advice or common-sense source control practice. Suppose I make 4 changes in separate files to fix a bug: do I check them in separately so that I can put my pseudo-code comments into the revision history? Now I have broken the atomicity of my revision history. If I check them all in together, do I type a whole essay into the revision history about why each change was made in each file? It'll quickly all fall apart.

It also relies on the reader recognizing that they need to be curious about the code here. What if it didn't look so curious? It could easily get cleaned up or modified without a comment to alert the reader.

For people and projects in specific contexts it can work but there are plenty of situations where this is a terrible idea.

shaggyfrog 15 hours ago 1 reply      
It sounds like "this.get(0).clientLeft" should have been a single-line function named something like triggerLayoutInMozillaAndFirefoxToFixAnimateForNewDomElement, or if you don't like massive function names, a comment that says that.

Spelunking through commit history shouldn't be necessary to learn the intentions behind those kinds of changes.

gtirloni 13 hours ago 0 replies      
Checking the history is much slower than a comment right there in the code. Repeat this a thousand times and you've lost precious time hunting for information.
markm208 9 hours ago 3 replies      
I agree with the author that historical information about how a codebase has evolved is important. I would also argue that code comments are not always the best place for this historical information (if you don't know about the deep past of a bit of code, then why would you want to see a code comment describing some change to it?).

I suggest we take a step back and ask if modern version control is the best way to store historical information. Modern version control systems (git, mercurial, etc.) were built within the last decade or so, but they were built with the same constraints as the original version control systems of the 1980s. They are optimized to be disk efficient (and don't get me started on their command-line interfaces). This is crazy!

We should store much more about the programming process than the data gathered if and when a developer chooses to commit. We should record it all: every keystroke. No human-generated source of data is ever going to fill up our hard drives or the cloud. Don't optimize for the disk!

This data can be used to replay programming sessions so that others can learn exactly how the code evolved. Developers could then comment on the evolution of their code. Think of this as a modern commit message. I am working on a project that attempts to do this:


zimbatm 14 hours ago 0 replies      
It's a good example of a place where a comment should have been added. The user got lucky because the line's last commit happens to contain the meaningful explanation. He could also have stumbled onto a formatting change, a variable name change, or a file rename. The spelunking becomes harder than just reading the comment that could have been on top of that line.
rymohr 13 hours ago 0 replies      
Great article! Mislav definitely knows his git.

What he didn't point out though is that he was actually the one that contributed that code in the first place.



As others have commented, long explanations like this have no place in commit messages. Comments should always be used to explain what you're doing and why you're doing it. Commit messages should simply summarize what you did.

A better commit message would have simply been:

fix animate() for elements just added to DOM

See included comments for explanation.

meistro 15 hours ago 2 replies      
> var one = "foo"
>   , two = "bar"
>   , three = "baz"
Agree with the author that this is easier to change, and in JS it will keep you from accidentally leaving a trailing comma. That being said, I find it very unreadable (and reading is where most of your time will be spent), and most text editors/IDEs make it a burden to work with.
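A minimal, runnable sketch of the two styles under discussion (variable names are illustrative):

```javascript
// Comma-first: each continuation line starts with the comma, so adding,
// removing, or reordering declarations touches exactly one line, and a
// stray trailing comma (a syntax error in a var statement) can't slip in
// when you delete the last line.
var one = "foo"
  , two = "bar"
  , three = "baz";

// Conventional trailing-comma style, for comparison.
var a = "foo",
    b = "bar",
    c = "baz";

console.log(one, two, three); // foo bar baz
```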

stretchwithme 12 hours ago 0 replies      
It would be useful to attach commit comments to code as you make changes, in a form that SCMs and IDEs could use: see the history of a method, change by change, complete with the relevant comments; or, from the SCM, click on a comment and select from a list of code changes.

And you avoid having to write your comments when you commit. You'd do it in the code when you are more focused on the change.

Even better would be detecting when you are changing code and prompting for the comment, or letting you select from recent comments.

Then on the SCM side when you commit, each comment could be handled as a separate commit.

Are any IDEs already doing some or all of this?

erichocean 6 hours ago 0 replies      
When I'm evaluating a project on GitHub, one of the first things I do is walk through the commit history. The info in there is invaluable IMO for getting up to speed on a codebase fast.
martincerdeira 12 hours ago 0 replies      
You live in a world of wonder and magic. If someone writes that code without comments, the same person probably won't give such a detailed commit message either. Good luck in wonderland.
etler 15 hours ago 2 replies      
This is exactly why I'm such a bastard when it comes to git history. A clean history isn't something that just gets filed and disappears forever. Unfortunately, lots of people think it's ok to use a message like "fixed stuff". Don't even try that in my codebase though.
Aga 9 hours ago 0 replies      
Extending code review to commit messages is a big help.

If I can't understand from the commit message what the change is trying to achieve, I won't even look at the code. Instead I'll ask them to clarify the commit message first.

forrestthewoods 11 hours ago 0 replies      
Perforce time lapse view is similarly amazing.
gizmogwai 15 hours ago 2 replies      
I wish my current coworkers wrote any comments in their commit messages...
ghubbard 14 hours ago 1 reply      
"is it safe to change or remove that call in the future?"

If you remove the line you should end up with a failing test.
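The point above can be sketched as a tiny regression test; `show()` and its display-restoring line are hypothetical stand-ins, not the article's actual jQuery code:

```javascript
// Pin the "mysterious" line with a regression test, so deleting it
// produces a failing test instead of silence.
function show(el) {
  el.hidden = false;
  // The "why is this here?" line: restore a usable display value for
  // elements that were just added to the DOM.
  el.display = el.display || "block";
}

// Regression test documenting why that line exists.
const el = { hidden: true, display: "" };
show(el);
console.assert(el.display === "block", "show() must restore a usable display value");
console.log("ok"); // prints "ok"
```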

ilzmastr 14 hours ago 0 replies      
Interesting concept. Coincidentally, I wrote a gem recently that serves the same function as the git-overwritten script at the end: https://github.com/ilyakava/stefon

gem install stefon

LukeB_UK 14 hours ago 0 replies      
Every line of code is always documented (if it's in a VCS and the commit message is good)
tshadwell 7 hours ago 0 replies      
Is this Javascript? Why is there no semicolon?
Rumor Monger (2005) mememotes.com
15 points by BIackSwan  5 hours ago   discuss
How I hacked Github again homakov.blogspot.com
819 points by zhuzhuor  1 day ago   187 comments top 33
jqueryin 1 day ago 3 replies      
If @homakov is finding security holes without access to Github repositories, imagine what he'd find if you had him code audit for a few days... He's clearly been going about this the proper white-hat way and ensuring holes are patched before open disclosure... what's there to lose?

On the flip side, you could go about doing what you're doing under the presumption nobody is maliciously targeting your user base. In this scenario, it's possible you have a couple bad actors that see a net benefit greater than your bug bounties and are silently stealing and selling supposedly secure code from your users. You could be supporting a hacker black market where they sell and trade codebases to popular online sites. Imagine how easy it would be for them to find vulnerabilities in these sites if given access to the source code.

That, my friends, would be a catastrophe.

ultimoo 1 day ago 2 replies      
@homakov finds 5 different bugs with github and manages to align them so that a bigger vulnerability is exposed in under 5 hours? That's amazing! I used to think I'm a fast delivery-focused developer but I'm probably just a fraction of how fast some people are.
enscr 1 day ago 5 replies      
GitHub uses Ruby on Rails, which is a pretty mature framework, perhaps covering most of the common security pitfalls. Additionally, I assume GitHub has excellent programmers because of the nature of their job.

Could someone explain in simple English how they overlooked known and well-documented bugs that got them hacked (e.g. Bug 3, about cross-domain injection)? I'm wondering: if someone of GitHub's caliber can be hacked so easily, what about the rest of the masses developing web apps? Especially all those new crypto-currency exchanges popping up left and right.

I've been toying with Django. Reading through the docs makes me feel that as long as I follow the safety guidelines, my app should be safe. It feels as if they've got you covered. But this post rattles my confidence.

sdegutis 1 day ago 5 replies      
> $4000 reward is OK.

$4000 !? Wow, I'd love to be able to make $4000 on the side just doing what I love.

> Interestingly, it would be even cheaper for them to buy like 4-5 hours of my consulting services at $400/hr = $1600.

This sounds like a pretty clever strategy for marketing yourself as an effective security consultant.

EDIT: $4000!? wow. so money. such big.

ChuckMcM 1 day ago 0 replies      
Grats Egor, once again a great explanation of how these things add up into vulnerabilities.
akerl_ 1 day ago 4 replies      
"P.S.2 Love donating? Help Egor on coinbase or paypal: homakov@gmail.com"

Maybe it's just me, but asking for donations after saying you bill clients at $400/hr seems weird to me. I wish I could bill at that rate.

thrush 1 day ago 0 replies      
"Btw it was the same bug I found in VK.com"

Is there an easy way to see what vulnerabilities other websites have had and fixed, and to check if your site has them as well?

throwaway3301 1 day ago 1 reply      
How can I start learning how to identify exploits like this? I know some basics about web application security and work as a software engineer on a day-to-day basis, but security has always been a passion of mine. I have always wanted to support myself through security work alone (by collecting rewards through bounty programs, self-employed security consulting, working at a security consulting firm like Matasano, or some combination thereof), but I don't know where to start. I want to learn the ins and outs of web application security instead of just understanding the OWASP top 10 and having a strong interest in certain topics (like HTTPS/SSL vulnerabilities). When I read disclosures from people like Egor, I grasp the steps taken to craft an exploit as they are explained, but I don't know how to identify these exploits on my own.

Can anyone recommend some reading material or some first steps I can take to work towards moving to a more security-focused career?


interstitial 1 day ago 1 reply      
Half the comments are about his pay scale, imagine the ruckus if he had been paid in unwithdrawable bitcoins at mtgox.
gabrtv 1 day ago 1 reply      
Impressive display of persistence, stringing together those vulnerabilities. I also see your English has gotten noticeably better :) Keep up the good work!
derengel 1 day ago 2 replies      
Am I the only one who thinks $4000 was very cheap on GitHub's part? A security hole like this in the wrong hands would have brought severe consequences to GitHub, consequences so big that they would probably pay $1,000,000 USD for them never to happen. So maybe something in the $50-100K range would sound more reasonable. Is Egor a great hacker with no business sense? On the other hand, the publicity his service gets from this is probably worth more than $50-100K.
nightpool 1 day ago 0 replies      
As soon as I saw the new bounty program, the first thought through my head was "Any GitHub hacking leaderboard without homakov at the top is an inaccurate one". Congrats on your newest discovery!
aroman 1 day ago 0 replies      
Wow, really clever stuff! Also of note is the $4,000 reward he received from GitHub's bounty program, their largest to date, according to the email.
nakovet 1 day ago 1 reply      
One thing that I didn't get from the post:

> Oh my, another OAuth anti-pattern! Clients should never reveal actual access_token to the user agent.

What I understood from reading the OAuth RFC is that front-end-intensive applications (a.k.a. public clients) should have short-lifespan access tokens (~2 hours), and the back-end takes care of reissuing a new access token when one expires.

Can someone clarify how to make those calls from a front-end application without revealing the access token?
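One common answer is a backend proxy, sketched below under assumed names (`sessions`, `buildApiRequest` are illustrative, not a real library API): the browser holds only an opaque session id, and the server attaches the access_token to outgoing API calls, so the token is never revealed to the user agent.

```javascript
// Hypothetical backend-proxy sketch: the user agent only ever sees an
// opaque session id; the real OAuth access_token stays server-side.
const sessions = new Map(); // sessionId -> { accessToken, expiresAt }
sessions.set("sess-42", {
  accessToken: "server-side-secret-token",
  expiresAt: Date.now() + 2 * 3600 * 1000, // ~2h lifespan
});

// Build the outbound API request the backend makes on the browser's behalf.
function buildApiRequest(sessionId, path) {
  const s = sessions.get(sessionId);
  if (!s || s.expiresAt <= Date.now()) {
    // A real backend would refresh the token here, never handing
    // the refresh flow to the front end.
    throw new Error("session expired");
  }
  return {
    url: "https://api.example.com" + path,
    headers: { Authorization: "Bearer " + s.accessToken },
  };
}

const req = buildApiRequest("sess-42", "/me");
console.log(req.url); // https://api.example.com/me
```

The front end then calls its own backend endpoint with the session cookie; the Authorization header exists only in server-to-server traffic.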

desireco42 1 day ago 0 replies      
One more comment. Security flaws seem obvious, but getting security right is hard. It requires a lot of testing and effort to get everything right. This kid Homakov has a talent for finding holes and seems to have his heart in the right place, i.e. he isn't abusing it.
ivanca 1 day ago 0 replies      
Really good work @homakov and I suggest you should start a web-security-school or something of the sort. I'm sure there is money in that field and you would be able to keep traveling around the world while doing it.
mtkd 1 day ago 1 reply      
Github should have hired him last time.
desireco42 1 day ago 2 replies      
Why is GitHub so hostile to this kid? Just give him a job already! He obviously has a deep understanding of how things work. I would feel better knowing he worked for them.
leandrocp 1 day ago 1 reply      
@homakov, have you thought about selling screencasts ?
Tobu 1 day ago 1 reply      
WTF is up with Firefox and Chrome not fixing their /// bug. They're prioritising neither user security nor standards compliance.
runn1ng 1 day ago 1 reply      
OK. I give up. No matter how much I try, I will never be as cool as @homakov.
yarou 19 hours ago 0 replies      
Very cool write-up of non-critical bugs that can be used together to inflict some serious damage. Great work @homakov!
Kiro 1 day ago 0 replies      
How do you find all this stuff? Where do you even start?
peterwwillis 1 day ago 0 replies      
This would be a great case study if expanded on and edited. Egor should write a book!
bashcoder 18 hours ago 0 replies      
Thanks for continuing to make Github safer for all, @homakov. Someday I might even host a private repo there again, but I haven't done that since your first mass assignment exploit. You continue to prove that my decision was a good one.
rip747 1 day ago 0 replies      
Every post this guy writes about the security holes he has found is impressive, to say the least.
Omnipresent 1 day ago 0 replies      
It would be great for educational purposes if a sample app was set up so this vulnerability could be tried against it. Most of these vulnerabilities are fixed by the time the white-hat blog posts come out, so there is no way to actually try them out.
livingparadox 1 day ago 4 replies      
Seeing stuff like this, I want to get into comp-sec. It always sounded interesting, and it looks like it pays well...
outside1234 1 day ago 0 replies      
why hasn't GitHub hired this guy?
afarra 1 day ago 1 reply      
Does anyone know of a website or central resource that documents all these vulnerabilities to look out for?
intortus 1 day ago 3 replies      
Shame on github for making these mistakes in the first place, but kudos to them for doing such a great job of engaging the white hats.
ng6tf7t87tyf 1 day ago 2 replies      
Ruby Brogrammer Security Fail yet again.

Friends don't let friends code in Fails frameworks.

pgs_pants 1 day ago 1 reply      
Firstly, well done. It is good to see a well-done security eval.

But github, seriously? Why do you guys fail so hard at security?

Too much Brogrammer rather than programmer methinks.

AOL chief reverses changes to 401(k) policy after a week of bad publicity washingtonpost.com
10 points by _pius  4 hours ago   3 comments top 3
twotwotwo 1 hour ago 0 replies      
The "distressed babies" line was incredibly lame. First, statistically, a company with thousands of workers will have some people with expensive health problems and get hit with some insurance surcharges. It sucks, but it's how today's system is. Second, even if it weren't so, it's hard for it not to sound like you're blaming babies. It's beyond sleazy at a human level, which is what got him, publicity-wise. And, not entirely unrelatedly, it's the sound of a CEO trying not to take responsibility for what happens at their company, which is always a bad sign.


ohashi 9 minutes ago 0 replies      
I hope all of the AOL employees are considering their job options elsewhere.
dmead 55 minutes ago 0 replies      
that policy is status quo at IBM
Ask HN: What should I learn to stay relevant in the next 5 10 years?
8 points by thewarrior  1 hour ago   13 comments top 12
dshankar 21 minutes ago 0 replies      
This is not a wise strategy.

Technologies go through cycles - PHP was hot in 2006, Ruby peaked in 2010, Javascript (and Node.js) is currently hot, and it's quite possible that Go will be hot in 2-3 years.

It's foolish to learn a language under the assumption that it will be relevant for 10 years. If you asked this question in the early 2000s, PHP would be the answer, not Javascript.

The best strategy is to continuously brush up on skills. Experiment and dabble with new languages and frameworks as often as your time allows!

jbert 8 minutes ago 0 replies      
Move up the value chain.

If you can turn a design into code, learn to turn a spec into a design.

If you can turn a spec into a design, learn how to understand a problem and produce a spec to solve it.

If you can understand a problem, learn to talk to people and discover the problems they have so you can solve them for them.

If you can do that, learn a million other things and run your own business.

[You can also skip any of these steps if you're happy managing people to fill in the downstream aspects rather than doing it yourself.]

wsc981 56 minutes ago 0 replies      
The .NET framework[0] is always useful and I think it's here to stay for a long while. You'd be able to write mobile apps using Xamarin or business apps for Windows. Also apps for the new Ubuntu mobile OS will be possible.

Personally I find Go[1] interesting and it's something I'm hoping to pick up in the coming year. It seems like a fun language, well suited for building web services that handle lots of traffic.

Lua[2] might also be nice to learn. It's used for scripting in a lot of games. For example: in World of Warcraft you can create your own Lua add-ons. Lua can be easily integrated into your own apps / games, since it's just a small C library. It might be a good language to learn if gaming interests you, since lots of games make use of Lua in some way.

And as someone else already mentioned in this thread: functional programming will become bigger in the future. You can use the functional programming style with .NET if you choose to learn the F# language.


[0]: http://en.wikipedia.org/wiki/.NET_Framework

[1]: http://en.wikipedia.org/wiki/Go_(programming_language)

[2]: http://en.wikipedia.org/wiki/Lua_(programming_language)

adrianhoward 7 minutes ago 0 replies      
My meta-comment would be to avoid trying to find the-next-big-thing.

If it was easy to guess the next big thing everybody would do it ;-) A 5-10 year horizon is a very long time in computing years. Look back ten years. How many people were accurately guessing the current environment? How many of the big-things now even existed ten years ago?

When I look back at my career I can't point to a single instance of seeing the next-big-thing.

I can point to lots of great things that have happened because I'm continually poking at new ideas, new processes and new bits of tech. So I'm ready to take advantage when one of those does become the next-big-thing.

So yeah. Take a look at JavaScript and the node.js world. Or robotics. Or architecture as code - or whatever. But for god's sake don't bet your career on it in five years time. Explore lots of things. Find stuff you're good at and enjoy. Be ready for when one of those starts turning into the next big thing.

roadster72 45 minutes ago 0 replies      
5-10 years is a very long period. To me, JavaScript looks exciting; current trends clearly show the rise of JS.

However, just a decade ago, a large majority of Internet used JS primarily for form validation, which was sad. A lot of web developers were not comfortable leaving their code open for the visitors to see.

I personally believe JS will continue to soar but I also believe that nobody can answer this question perfectly as nobody knows the future.

In any case, if you spend a lot of time learning any language very well, the time required to learning another language after that, decreases substantially.

jmnicolas 27 minutes ago 0 replies      
From the trends I see, I'd recommend becoming a JavaScript expert (not just a passing knowledge of jQuery).

Then Node.js and Angular. You should be set for the next 10 years.

Outside enterprise, .NET is sinking into irrelevance. I don't know for Xamarin though.

featalion 34 minutes ago 0 replies      
Learn the basics: the concepts of programming. 90% of developers use the imperative (procedural) and structured (OOP) paradigms. But the world of programming is not so narrow; there are lots of other interesting and applicable paradigms, including declarative ones (functional and logic programming), metaprogramming, semantic approaches, and many more.

People who learn only a programming language (PL) are limited to the scope of that language. If you learn paradigms (ideally in terms of one PL), you gain knowledge that is "portable" between the PLs of that paradigm. You get a boost when switching to another PL of the same paradigm: you learn it faster, looking into the PL's features rather than its basics.

Before you choose what to learn, I recommend checking out: Lisp dialects like Clojure, CL, etc.; Ruby; Go / Rust; Java.

mattm 16 minutes ago 0 replies      
Sales and marketing - those skills will never go out of style.
Goranek 22 minutes ago 0 replies      
Go, JavaScript (Angular)
jiax 1 hour ago 1 reply      
Functional programming is gonna get big with concurrency becoming more popular
techaddict009 1 hour ago 0 replies      
Even I am learning JavaScript. Plus I am a web developer, so I started learning Laravel, a PHP framework.
chatman 1 hour ago 0 replies      
Learn Java. Java 8 looks exciting.
Unsung Hero of the Nuclear Age slate.com
57 points by _pius  12 hours ago   19 comments top 7
ufmace 8 hours ago 3 replies      
It's a challenging issue, all right. The important part that I don't see many people discuss is the nature of international relations, most of which is based on how other national leaders and decision-makers will perceive your actions.

Consider: North Korea is run by madmen who have at least some primitive nuclear capability. They regularly make wild accusations and threats against the US. Say they do manage to mount a nuclear device on a long-range rocket. What's to hold them back from launching it at a major US city? Whenever people discuss North Korean (or Iranian or...) nuclear capability, the usual line is that we don't need to worry about it that much, since it would obviously be crazy for them to use it against the US or any other nuclear power. What is it that makes it crazy, when they've already done so many terrible things to their own people?

It's crazy because, according to MAD, any such attack, or even a specific threat to make such an attack, would result in a full-scale launch against their country. Millions of casualties, the total destruction of their culture and way of life. Everyone in the world, most especially leaders in North Korea, and China, fully believes that the US will carry out this threat if attacked with nuclear weapons.

Now, let's say Iran manages to detonate a primitive nuclear device in a coastal US city. The textbook MAD reply is the total destruction of every Iranian city. You can make the case that this is crazy on its face: there is no imminent threat to stop, and those millions of people who would die didn't do anything to deserve it. Say that what Hering and the article author seem to want happens, and that no weapons are launched and a more measured, conventional reply is used. What do you think the North Korean leaders will think then? Or China and Russia? That's the more important question to ask.

After that, might North Korea think that they can use a nuclear attack to try and extract some sort of diplomatic concession from us? They're a harder nut to crack with conventional weapons, and they have more firm backing from China. If they get the idea that our MAD policy is toothless, they might try something that could lead to a much greater war, even possibly a much bigger nuclear war.

We've been living in a world for a long time now where the Kim Jong-uns of the world have very good reason to be terrified of using nuclear weapons against the US. Are you willing to see what happens if that is no longer true?

There are terrible people in this world who are prepared to do terrible things to everything we hold dear. To keep the world safe and stable, those people must believe that we will do even more terrible things to them if the situation calls for it. Keeping that belief in place may sometimes require us to actually do some terrible things ourselves.

pdonis 6 hours ago 1 reply      
I'm a bit confused by one thing in this story: it never mentions the two man rule:


The President cannot order the use of nuclear weapons on his own; he can only issue the order jointly with the Secretary of Defense. The article mentions that Nixon's SecDef asked people to "check with him" before carrying out orders from Nixon, which may be a sort of garbled reference to the two man rule, but if so it's very garbled.

Whether this rule actually answers Maj. Hering's question is a separate issue. But I find it disappointing (though unfortunately not surprising--journalists often get things like this wrong) that the article repeatedly talks as though a single person can issue the order, when that's not the case.

gpcz 11 hours ago 2 replies      
Why did the military require human beings to turn the keys if they expected people (and weeded out people unwilling) to turn the key blindly? A wire would have been much more efficient...
magic_haze 9 hours ago 3 replies      
Does anyone know what TempleOSV2 is talking about? I can't reply to his comment in this thread because it's marked as dead, but it sounds very interesting.
throwwit 2 hours ago 0 replies      
Another unsung... Stanislav Petrov: The man who may have saved the world http://www.bbc.co.uk/news/world-europe-24280831
gojomo 10 hours ago 1 reply      
"When the President does it, that means it is not crazy."
peterpathname 5 hours ago 1 reply      
In my opinion, no order to deploy nuclear weapons can ever possibly come from a sane commander. Any such order should be refused.
Proposed Minecraft fan film canceled on Kickstarter by Notch polygon.com
8 points by frik  3 hours ago   3 comments top 2
TazeTSchnitzel 25 minutes ago 1 reply      
Well, it's Mojang's right.
frik 3 hours ago 0 replies      
Minecraft Movie 'The Birth of Man' Nixed by Notch:


Google Loses Appeal, Forced to Publish 150,000 Fine on Google.fr pcmag.com
45 points by prateekj  11 hours ago   24 comments top 9
nraynaud 5 hours ago 0 replies      
CAUTION, THIS IS NOT TRUE. Google is appealing the sanction, and in the meantime they asked for an emergency injunction ("référé") so as not to have to carry out the message part of the sanction while awaiting the appeal (on the grounds that if they win the appeal, their reputation would already have been damaged by the message). The appeal has absolutely not been ruled on.

The "emergency" judge simply declared that what they asked for was not following the specific emergency criterion (basically they didn't believe the irremediable damage part), and he simply let go the sanction for now. Another Court will actually judge the appeal itself. If they win the appeal, they get their money back, and some bragging rights.

here is the PR from the actual court: http://www.conseil-etat.fr/fr/communiques-de-presse/sanction...

sushirain 2 hours ago 0 replies      
Unfortunately, the article doesn't explain what the dispute between Google and France is about. A link from the article to another one answers it:

At issue is an update to Google's privacy policy that went into effect on March 1, 2012. The revamp consolidated 70 or so privacy policies across Google's products down to one. But with this change, Google also switched to one profile for users across all services rather than separate logins for offerings like YouTube, Search, and Blogger.


derekp7 6 hours ago 3 replies      
Would it be reasonable for, say, a car manufacturer to be forced to include a statement painted on the side of each car they sold for a period of time? Or a consumer electronics manufacturer to have an apology statement on, say, a TV, displayed whenever you change channels? Something like that would severely damage the product. So why is it reasonable to require a tech company to deface its product? It wouldn't be as bad for some companies, where the main domain points to an information page, but the main page for Google is an application page, not a "web" page. This just feels like a bad precedent: what happens when this is forced on another company whose application front end isn't conducive to having arbitrary text on it (I'm thinking of map programs, word processors, presentation apps, etc.)?
rikacomet 2 hours ago 0 replies      
Trust is a thin line. One must not trust anyone blindly, that is true. But watching and surveilling every move of your own allies breaks your own reputation. That is not trusting your allies at all!

But this is good news, small yet welcome. This does not mean that France is not in on mass surveillance itself (it might just be a diplomatic maneuver). But it does mean that at least a few have realized that the privacy of citizens is not something you can mess around with.

ZoFreX 9 hours ago 2 replies      
I feel that once again the legislature is several steps behind technology. Who even sees the Google front page these days? Anyone searching from their browser's search bar or address bar, or on their smartphone, won't see this message.
eropple 9 hours ago 3 replies      
I know PCMag is pretty hard up for...everything, these days, but a modal "sign up for our newsletter" box that doesn't have a close button is the worst thing I've seen in a while.

(You can get out by hitting Escape, but that doesn't make it okay.)

p4bl0 7 hours ago 2 replies      
Funny thing: the French search engine Qwant currently displays a message with the exact same presentation and almost the same text on their front page. They just changed the text so it says that they have never been condemned for anything by the CNIL, because they respect privacy.
spektom 2 hours ago 1 reply      
Where will the money go if Google pays the fine? Will I (as Google user) gain some benefits from this case?
yuhong 8 hours ago 0 replies      
I wonder if there is any chance that Vic Gundotra can be fired.
Personal observations on the reliability of the Shuttle (1986) nasa.gov
112 points by umanwizard  18 hours ago   76 comments top 9
kens 15 hours ago 3 replies      
Related: Gregg Easterbrook's article "Beam Me Out Of This Death Trap, Scotty" [1] is long but remarkably prescient, having been written a year before the first shuttle flight. It goes into the dangers of the tiles, how the costs would spiral, the danger of relying on a single launch vehicle, the benefits of disposable rockets, and other warnings that ended up being right.

The article talks about how unlikely the shuttle was to achieve the expected 500 flights, and would more likely only have 200 flights. (Real number: 135)

Some of the quotes from the article are scary in retrospect:

Quote: "Here's the plan. Suppose one of the solid-fueled boosters fails. The plan is, you die."

Another quote: "When Columbia's tiles started popping off in a stiff breeze, it occurred to engineers that ice chunks from the tank would crash into the tiles during the sonic chaos of launch: Goodbye, Columbia."

Remember, this article is from 1980, before the shuttle launched.

[1] http://www.washingtonmonthly.com/features/2001/8004.easterbr...

stiff 17 hours ago 2 replies      
The last sentence from this piece is just beautiful, it has become my personal motto:

For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.

It captures in a capsule form the reasons for a huge fraction of all the big engineering catastrophes, maybe even most of them. For everyone interested in similar case studies, and in reliability from a wide engineering perspective, I strongly recommend the book "Design Paradigms: Case Histories of Error and Judgment in Engineering" by Henry Petroski.

treblig 16 hours ago 3 replies      
There were 135[1] Space Shuttle missions with 2 resulting in human casualties (Challenger and Columbia disasters).

Thus, a failure with loss of vehicle and of human life of 1.48 in 100.

The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management.

The reality was even more dangerous than the engineers had predicted, and far more dangerous than management had.

[1] http://en.wikipedia.org/wiki/List_of_space_shuttle_missions
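The 1.48-in-100 figure is just the observed loss rate; a quick check of the arithmetic:

```javascript
// Observed Shuttle loss rate: 2 losses of crew and vehicle
// (Challenger, Columbia) in 135 missions.
const missions = 135;
const losses = 2;
const per100 = (losses / missions) * 100;
console.log(per100.toFixed(2)); // 1.48
```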

pjmorris 17 hours ago 2 replies      
This is an absolute classic of engineering literature. The last sentence, perhaps deservedly, gets most of the glory, but the whole piece should be under every engineer's and every manager's fingers.

I constantly see the dynamic observed in the first paragraph, and it would seem that the question "What is the cause of management's fantastic faith in the machinery?" is eternal.

marze 16 hours ago 1 reply      
Even with Feynman's carefully reasoned essay, the next shuttle disaster was a mirror of the first: chunks of foam falling off on each flight, careful monitoring but no serious action until the foam resulted in the loss of a vehicle.

The first loss came after careful monitoring of near burn-throughs of the SRB O-rings on many flights, but no decisive action.

InclinedPlane 16 hours ago 0 replies      
Interestingly, a thorough risk assessment of the Shuttle was done much later by NASA (near the end of the program) and it concluded that the risk of losing a Shuttle in the pre-Challenger era to be much higher than 1 in 100, closer to 1 in 10. Many people look at Challenger and Columbia as instances where the Shuttle program hit a patch of bad luck. In reality the Shuttle program has been extraordinarily lucky, there were many other close calls, some not well publicized, that came within a hair's breadth of causing loss of crew and vehicle (STS-1, STS-8, STS-9, STS-27 being examples of such). It was always a tricky bird to fly, and in the early days there were about half a dozen different things that could kill it outright with a shockingly high probability (not just the SRBs or foam/ice strikes on the TPS, also the APUs (which caught fire and exploded on one flight), the computer (which was completely inoperable just before landing on another flight), the SSMEs (which came close to causing loss of the orbiter once or twice), and other components). Over time some of the systems were improved to such a degree that they were no longer serious risks, but the whole system was so complicated and there were so many elements of risk that even at the end of the program many substantial risks still remained.
altero 17 hours ago 1 reply      
I think the root problem is that the shuttle launched and landed with people. They should have used it just for cargo; a second rocket with an Apollo-style capsule should have transported people to/from orbit.
alexhutcheson 10 hours ago 0 replies      
For anyone else interested in how the organizational incentives and institutional culture at NASA helped to set the stage for the Challenger disaster, I highly recommend The Challenger Launch Decision[1] by Diane Vaughan.

From the New York Times review[2]:

In "The Challenger Launch Decision" Diane Vaughan, a sociologist at Boston College, takes up where the Rogers Commission and Claus Jensen leave off. She finds the traditional explanation of the accident -- "amorally calculating managers intentionally violating rules" -- to be profoundly unsatisfactory. Why, she asks, would they knowingly indulge such a risk when the future of the space program, to say nothing of the lives of the astronauts, hung in the balance? "It defied my understanding," she says.

[1] https://www.goodreads.com/book/show/995029.The_Challenger_La...

[2] http://www.nytimes.com/books/97/04/13/nnp/19074.html

benmorris 12 hours ago 0 replies      
In case you missed it, Discovery Channel aired a pretty interesting Challenger docu(drama?) in November. The show portrayed Dr. Feynman's path to reaching these conclusions. Pretty interesting and disturbing at the same time.


Neither Microsoft, Nokia, nor anyone else should fork Android. It's unforkable. arstechnica.com
148 points by AndrewDucker  20 hours ago   190 comments top 29
stusmall 17 hours ago 9 replies      
The "Android isn't very open source" sentiment really bothers me, and I see it a lot here. It is a complete and fully functional mobile OS under extremely permissive licensing. Pull it! Change it! Build it! Fork it! Whatever! AOSP is open source.

Look at the closed-source services Google adds. As far as I know they are all related to Google services (someone please correct me if I'm wrong): their store, their maps, their email, their location services. None of that is needed in the open source distro, and it works great without them. There are also a lot of restrictions on the brand and how you use it when releasing an Android product, but that isn't unique to Android. See Firefox vs. Iceweasel.

Sure, sometimes it gets annoying that a lot of Android design decisions are made behind closed doors, and when working in the framework I sometimes have to play the game of "guess what the Google engineer was thinking" because documentation can be scarce. Also, from what I've heard, sending changes upstream isn't an easy process. Those things would be nice and would make it an easier product to work with... but they aren't required for it to be truly open source. The code is there in a series of open git repos under the Apache license. That is open source in my book.

cwyers 15 hours ago 2 replies      
I want to like the article, because the idea that Microsoft should fork Android is kinda silly. The idea that nobody should do it is even sillier, though.

The article, in the course of explaining why it can't be done, names two major examples of where it already has been done successfully -- Amazon's Kindle ecosystem and any number of Chinese OEMs. It doesn't mention other (admittedly less successful) forks, like B&N's Nook tablets or the Ouya. It also doesn't mention how far along the road Samsung was to having the ability to ship Android without Google Mobile Services, until Samsung and Google made a peace treaty that involved sending Motorola off to live with Lenovo.

Yes, if you fork Android, you lose Google's ecosystem. It's not impossible to duplicate, though -- Amazon's done it, Samsung just about did it. And Microsoft already owns all the things it'd need to do it -- that's how Windows Phone has an ecosystem. Losing Google's ecosystem isn't the downside of forking Android, it's the entire point.

Once you've done it, though, you need to convince people to use your fork instead of Google's. Microsoft's success at prying people towards Windows Phone and away from Android can basically boil down to:

1) The ability to run on lower-powered and thus cheaper hardware and still provide a polished experience, and

2) Nokia's build quality.

Switching OS cores to AOSP instead of the current Windows Phone OS wouldn't entirely solve Microsoft's app problem (look at the Amazon app store), and it would piss away the only competitive advantage their platform (as opposed to their OEM partner) has against Google's Android experience. Microsoft isn't Amazon -- they aren't a cloud company looking for an OS to give to consumers, they already have an OS. They just need to make their ecosystem more appealing, and giving up on Windows Phone now wouldn't do that.

SideburnsOfDoom 19 hours ago 3 replies      
Microsoft is just not going to abandon Windows Phone, for fundamentally these reasons:

"If it's a core business function do it yourself, no matter what." http://www.codinghorror.com/blog/2008/10/programming-is-hard...

MS certainly will view "a computer in every pocket, and all of them running MS Software" as core to their future. If they do abandon Windows phone, it will be because that has changed.

In the meantime, MS is a company with a track record of plugging away until version x of the product is good enough to succeed.

fidotron 19 hours ago 2 replies      
It's not a poor idea at all. The main example used to be Amazon, but now it's China. China is going to be the source of a lot of Google headaches: Google's inability to operate there means a majority of Android devices shipped there don't include its services at all, giving critical mass to a market for applications without Google's components.

Hugo Barra ended up at a Chinese company (Xiaomi) and Google just invested in Lenovo, but unless there's a big policy change on the way the Android/China beast will get further and further out of control.

MS absolutely should weigh in with a privacy-hardened Android with great Exchange and Active Directory support (and Nokia Maps). It would be huge, and it would force Apple and Google right onto the defensive.

erikb 1 hour ago 0 replies      
Hm. It sounds a little like Google is doing something really bad by seizing control over their infrastructure. But since they did, the whole Android world has improved, at least in my opinion. Think about the upgrade complaints we had a few years back. As far as I know that's gone now.(?) So I think what Google is doing in that regard is actually good.

From my own experience with developing FOSS, it's also the case that other people don't try to integrate their solutions into your system. Everybody heads off and makes their own stuff.

Stupid forking is no problem for a project like GCC. But Android is a brand name, and if other people fork it, head off doing their own stuff, and fail, then it is always Android that fails.

And really as a developer I also fight for the freedom of software, but as an end user I want to be able to go to a shop and buy an "Android phone" and it just works. Therefore, yes, please, Google take control! Good job!

wambotron 16 hours ago 2 replies      
I don't have much to say about the forking issue, but I want to say this: I own and LOVE my Windows Phone (Nokia Lumia 822). I started on Android (Droid X) after deciding the iPhone was too small, bought another one (Droid Razr Maxx), and finally ended up at Windows Phone after playing with it on some online simulator.

I wouldn't go back to android or ios. I don't think they're as usable or fit me as well.

Anecdotally, my wife also joined me on Windows Phone recently after dropping her android phone. She started on iPhone, lost it, was gifted my Razr Maxx, then broke it. She liked the UI of ios, but loved Swype on android and said she'd never go back to ios. Then while we waited a couple weeks for our phone upgrade, she played with my phone and ended up really liking it. She now says it is her favorite phone (Nokia Lumia 920). She likes the camera and the excellent apps from Nokia.

Obviously I don't want MS to switch to android. There may be more apps, but so many are of such poor quality that it is entirely irrelevant to me. Same goes for the apple store. It's almost overwhelming how many bad apps there are.

avenger123 18 hours ago 3 replies      
The title is a bit of linkbait for sure. I don't think anyone at this point can say that Windows Phone is a non-player. It's going to be around, and it can only get better.

Suggesting that Microsoft would fork Android is more wishful thinking than anything else.

Why not just say - "Android is unforkable" and leave it at that.

Oculus 19 hours ago 3 replies      
This has been an ongoing cycle for all of Android. First people complain about Android lagging behind iOS, so Google starts moving their services into GMS so they can avoid having to send updates through manufacturers. Then people complain Google isn't open enough, so they start baking things into AOSP, which takes forever to roll out. On the cycle goes..
rbanffy 19 hours ago 2 replies      
Microsoft could build a fully functional Mono-based Windows Phone experience on top of the AOSP core minus the Dalvik VM.

The only question would be "what for"? They'd gain nothing from it. They already have a more or less portable kernel upon which they can build phones.

stefantalpalaru 13 hours ago 2 replies      
What do they mean unforkable? Cyanogen Mod[1] has been doing it for years.

[1]: http://www.cyanogenmod.org/

DannyBee 4 hours ago 0 replies      
Oh, another Peter Bright ("Microsoft Editor at Ars Technica") clickbait article. Yay. The author of such fine pieces as:

"Firefox ships, but we shouldn't really pay attention"

"Android OEMs should hear Microsoft, Nokia out on Google-Motorola combo"

"Android tablets may provide sales, but profitability is another matter."


Sadly, judging by the old argument rehash going on here so far, it looks like it's working :(

ryen 5 hours ago 0 replies      
Have Microsoft, Nokia, etc. considered some kind of app API bridge, or an outright code-converter tool for Android/Java apps into their own native SDKs?

Seems that if a lack of apps is your main problem, then reducing the time to port an app to your ecosystem should be a high priority. Sure, you'd have to re-implement some proprietary API features, but they likely already have equivalents in the Windows Phone SDK (location services, in-app purchasing, etc.).

pmelendez 18 hours ago 0 replies      
Well, since that is a comment that pops up in here very frequently (that Nokia and now MS should fork Android), I am very happy to see that I am not the only one who thinks it would be a bad idea.
css771 4 hours ago 0 replies      
AOSP is open source, period. Just because the Google apps aren't doesn't make the entire thing closed. The binary blobs and baseband firmware would be there on any "open" phone as well. The issue here is not whether Android phones are open or not; it's whether phones in general can truly be "open" or not.
Zigurd 19 hours ago 2 replies      
Amazon has obviously forked Android successfully and has sold a lot of tablets. OPhone is an Android derived OS used in China, and other Chinese OEMs without access to the Google ecosystem have to do similar things.
blahbl4hblahtoo 16 hours ago 1 reply      
The guy that ran windows phone is now the head of the entire windows division. His name is Terry Myerson. I doubt very seriously that he is going to shitcan windows phone.
ZeroGravitas 11 hours ago 0 replies      
Dianne Hackborn (who works on Android) responds in the comments:


edderly 17 hours ago 0 replies      
Interesting analysis but I think he overcomplicates the situation.

From a third-party app point of view, you build your app against Android API level X, or against a corresponding Google API release associated with that API level. An app built against API level X will work seamlessly on the corresponding AOSP release.

So far, unless you're interested in integrating more tightly with Google services, you don't need to build your app against the Google APIs, but obviously Google is interested in app developers using their custom APIs.

As far as the increasing integration of core applications into GMS is concerned, it is rather overblown to call the AOSP versions broken or buggy. AOSP remains the base platform for the hardware ecosystem to develop their reference designs; AOSP has to work, and it does work well.

The hardware domain is a big problem for rolling your own OS from scratch; the associated software stack to support a given piece of hardware is non-trivial. Even generic Linux is now being supplanted by Android variants in the embedded space, especially if you're interested in graphics or multimedia.

However, also consider that the most successful player in the Android space, Samsung, has pursued a strategy of lightly forking Android with their own features and customizations, without deliberately breaking compatibility.

mwcampbell 14 hours ago 0 replies      
As the developer of a couple of small Android apps, I know that I would rather post them to a Microsoft or Nokia Android app store than port them to Windows Phone. Of course, it helps that I specifically avoided using the Google Play Services APIs, but even if I had used those APIs, it would be easier to make my apps Google-independent than to port them to a whole other platform with miniscule market share.
qwerta 19 hours ago 1 reply      
Many companies produce Android devices without Google Maps, Gmail, the Play Store... It solves a lot of privacy issues.

In 5 years smartphones will be a cheap commodity and there will also be a good open-source community fork. I can see Debian on Android.

briandh 18 hours ago 0 replies      
This piece seems to imply that only full source compatibility with Android (including Play Services) would be valuable to the Windows Phone platform. I don't think that's the case. I think getting 80% of the way there, combined with the Microsoft brand plus quality devices from Nokia, would be enough to entice plenty of developers to the platform.

I think a more interesting question is whether Microsoft should fork AOSP as a whole, or place an Android compatibility layer atop Windows Phone, a la BlackBerry 10.

SoapSeller 18 hours ago 2 replies      
Didn't Google win the "Oracle vs. Google" case on the grounds that APIs aren't copyrightable?

What prevents MS (or any other big player) from re-implementing GMS?

mkr-hn 7 hours ago 0 replies      
Not having Google Play hasn't limited my Kindle Fire experience so far. All the important apps are on the Amazon store, and most of the rest are low quality IAP bait.
mpettitt 19 hours ago 1 reply      
Interesting article. I'm intrigued as to how CyanogenMod handles the lack of Google Play APIs - it seems to be skirted around on their website.
fredgrott 19 hours ago 0 replies      
The article author does not know what the eff they are speaking about: "The AOSP counterparts are buggy, feature deprived, and by at least some accounts, barely maintained."

If that sentence were true, then stuff like the phone app would not work; obviously it does :)

The author does not realize that AOSP is a snapshot of the full Android OS.

higherpurpose 19 hours ago 1 reply      
The author mentions "you can have control or compatibility, but not both". That's basically another way of saying "you can fragment the platform all you want, or choose compatibility (i.e. let Google control the platform across OEMs)". Remember when people were yelling from the rooftops "Android is so fragmented, and that's why it's always behind iOS! Google, give us standardization already!!"?

Linux's biggest problem has been that Microsoft moved much faster to get Windows on as many PCs as possible through certain corporate deals, but in terms of gaining market share, the Linux ecosystem has also worked against itself by allowing everyone to fork it into hundreds of different distributions, all doing different stuff, and with barely even a weak app store shared across several distributions.

Linux is "everywhere" because everyone can fork it, and Android has certainly benefited from this strategy in the early years too, but that seems to be the antithesis of an "ecosystem". As we can observe, even though "Linux is everywhere" in all sorts of devices, there's no significant "ecosystem".

Google wants to keep and evolve the Android ecosystem because that makes things much easier for users, and also for developers building on top of a well-standardized ecosystem of devices and OS images. I guess for a proper ecosystem to thrive, it needs to be controlled and standardized as much as possible, with restrictions for OEMs and carriers.

The only alternative for the others, if they really want to start from the Android base, will be to form their own ecosystem, but that's very hard, unless we get to the point where only the web matters on mobile devices, too.

bane 17 hours ago 0 replies      
"typed on my Kindle Fire"
higherpurpose 19 hours ago 0 replies      
So Microsoft should fork Ubuntu Touch then. Problem solved ;)
ksherlock 16 hours ago 2 replies      
Android is already forked[1].

  Gingerbread: 20%
  Ice Cream Sandwich: 16%
  Jelly Bean: 60%
  KitKat: 2%
1: http://developer.android.com/about/dashboards/index.html

A Better Firefox Sync mozilla.org
64 points by lelf  14 hours ago   31 comments top 8
mikegioia 12 hours ago 3 replies      
Just today I reinstalled Ubuntu on a laptop and the sync completely borked.

What I don't understand is why it first asks you for a pair code from another device. My only other device was a desktop far away. After I logged in and clicked "reset sync key", it apparently lost all of the synced data!

I seriously hope this solves the currently heinous sync process. Just let me log in and authorize myself, for goodness' sake.

seanieb 6 hours ago 0 replies      
On password reuse: implementing a one-factor, password-based auth puts the account's security in the user's hands. There are lots of email/password lists from hacked web services (LinkedIn, Yahoo Voices, Gawker, etc.) in the wild, and users all too commonly reuse their weak passwords across multiple services.

If a user couldn't figure out how to set up Firefox Sync previously by following the instructions and taking a set of digits from one device and entering them into another, what hope have they of picking a strong and unique password?

magicalist 11 hours ago 0 replies      
The Sync protocol document (linked in the article) has a lot of interesting details about the new system: https://github.com/mozilla/fxa-auth-server/wiki/onepw-protoc...
ParkerK 5 hours ago 2 replies      
I wish they'd port some form of the sync to iOS. Syncing between my phone and my laptop is the only reason I'm still on Chrome
ubojan 12 hours ago 1 reply      
Good news. I hope the new sync feature will work seamlessly and efficiently - I disabled sync about a year ago because the browser became sluggish with a large number of bookmarks.
antonio0 12 hours ago 0 replies      
Finally! The tedious way Firefox Sync used to work is the main reason I use Google Chrome.
ubercow13 12 hours ago 1 reply      
Doesn't seem to be working very well on Nightly for me
Siecje 12 hours ago 2 replies      
Can you sync add-on settings?
Things Microsoft Still Does Well time.com
15 points by cl8ton  6 hours ago   18 comments top 6
fiatmoney 4 hours ago 1 reply      
Microsoft's enterprise software is really, really good. Active Directory, SQL Server, Azure, C# and Visual Studio are all amazing. They also make tons of money, and MS is really good at managing them as cash cows over time.

Not coincidentally, they just put the Cloud and Enterprise guy in charge of the company.

asadotzler 52 minutes ago 1 reply      
It's time to stop joking about IE like that. IE 11 is a fine Web browser these days. Yes, it took them a while to catch back up, but they have and they're a genuinely modern browser today. Firefox is still a better browser, for a good number of reasons, not the least of which is Mozilla's mission to put users and the health of the Internet first, but Microsoft engineers working on IE for the last few years deserve more credit than they're getting. Shouldn't we be cheering for them to build the best they can instead of insulting them?
Svip 2 hours ago 1 reply      
Yes, the Xbox 360 has been a success in the US (beating the PS3), but in every market other than the US, the PS3 has beaten the Xbox 360.[1] And the article in question is very US-centric in that regard. I am not saying the Xbox is necessarily bad, but is it better than the PlayStation? I think that's rather subjective.

Although I hear that neither Sony nor Microsoft has made a lot of money from their console adventures.

[1] http://en.wikipedia.org/wiki/History_of_video_game_consoles_...

nemothekid 4 hours ago 1 reply      
I guess this is consumer focused, as it misses some of the real MS gold - Azure, MSSQL, Visual Studio, C#.

However, on the consumer path this interested me:

>As the console gaming industry evolves (dies?), Nadella needs to convince America that the Xbox is truly a living room feature, not simply a gaming device. If he can sell that concept, Microsoft will leapfrog Sony and recapture the lead.

Despite the claims of the "living room console" being made for years, I really can't find any evidence for it. I can't see why a $499 console will beat out a $99 Apple TV for OTT entertainment in the larger consumer market. I still think this idea that the gaming console will become "the box" has its roots in the early 2000s, when people only had 1, maybe 2, televisions at home. If "John Jr." is playing video games for 4 hrs/day, I commonly see he does it in his room, not in the living room where he would impinge on "John Sr."'s decision to watch Netflix. Sure, they could buy 2 Xbox Ones, but John Sr. doesn't really need all that gaming power.

us0r 6 hours ago 0 replies      
5) Make money.
gschiller 4 hours ago 0 replies      

-> Enterprise

Show HN: JS library to make your website instant
349 points by dieulot  19 hours ago   156 comments top 38
VeejayRampay 19 hours ago 5 replies      
Another idea: take into account the movement of the mouse to define a directional cone in the general direction of the movement, which would enable you to preload your pages even before the hover state occurs.
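A minimal sketch of that cone idea (not part of InstantClick; every name here is made up). The geometry is kept in a pure function so it can be tested on its own, with browser wiring guarded separately:

```javascript
// True if `target` lies within `halfAngleDeg` of the movement vector
// going from `from` to `to`. All points are plain {x, y} objects.
function inAimCone(from, to, target, halfAngleDeg) {
  const dx = to.x - from.x, dy = to.y - from.y;     // movement vector
  const tx = target.x - to.x, ty = target.y - to.y; // cursor -> target
  const vLen = Math.hypot(dx, dy), tLen = Math.hypot(tx, ty);
  if (vLen === 0 || tLen === 0) return false;       // no movement, or already there
  const cos = (dx * tx + dy * ty) / (vLen * tLen);
  return cos >= Math.cos((halfAngleDeg * Math.PI) / 180);
}

// Browser-only wiring: preload links the cursor is heading toward.
if (typeof document !== 'undefined') {
  let last = null;
  const preloaded = new Set();
  document.addEventListener('mousemove', (e) => {
    const now = { x: e.clientX, y: e.clientY };
    if (last) {
      for (const a of document.querySelectorAll('a[href]')) {
        const r = a.getBoundingClientRect();
        const center = { x: r.left + r.width / 2, y: r.top + r.height / 2 };
        if (!preloaded.has(a.href) && inAimCone(last, now, center, 15)) {
          preloaded.add(a.href);
          fetch(a.href); // warm the cache; a real library would keep the body
        }
      }
    }
    last = now;
  });
}
```

The 15-degree half-angle is an arbitrary starting point; too wide a cone degenerates into preloading everything.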
wesley 19 hours ago 0 replies      
I wish there was an instantclick link to this website..

OK, here it is: http://instantclick.io

CoryG89 19 hours ago 1 reply      
After looking at the source, one thought I have is that since you are dealing with such small timescales you should use the high resolution window.performance.now function (or the Date.now function for higher compatibility) as a timer instead of using the Date object as you do.
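A hedged sketch of that suggestion, with a fallback for browsers lacking the High Resolution Time API (`timeHoverToClick` is an illustrative name, not InstantClick's actual API):

```javascript
// performance.now() is monotonic and sub-millisecond; `new Date` has
// ~1 ms resolution and can jump if the system clock changes.
const now = (typeof performance !== 'undefined' && performance.now)
  ? () => performance.now()
  : () => Date.now();

function timeHoverToClick() {
  let hoveredAt = 0;
  return {
    hover() { hoveredAt = now(); },
    click() { return now() - hoveredAt; }, // elapsed ms, fractional when supported
  };
}
```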
callesgg 19 hours ago 2 replies      
Really cool. The only real problem with this is if clicking has side effects, like http://example.com/?action=logout, as brought up on the page.

And probably a ton of other application bugs, since style and script stuff won't load like it normally would.

adwf 11 hours ago 1 reply      
Really awesome. I was working on something like this myself, but using jQuery ajax combined with history.pushState for partial page loads. This is much better!

There are a couple things that I had on my TODO list that could be handy though:

1) Caching - if you hover back and forth over two links, it will keep loading them every time. Dunno whether this can be alleviated or not.

2) Greater customisability. It'd be great if I could customise whether it was a hover or mousedown preload, on a per link basis. Some links benefit from hover, others it might be overkill.

3) Lastly, it would be cool if it could link up with custom actions other than just links. For example, jquery ajax loading a fragment of html to update a page. This is probably lower down on my priority list though, as the full page prefetch works remarkably fast.

Keep up the great work!
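Point 1 on the list above seems addressable with a small memoization layer. One possible sketch (not InstantClick's actual internals; `makePrefetcher` and the TTL are assumptions):

```javascript
// Memoize prefetches by URL with a short expiry, so hovering back and
// forth over the same two links doesn't refetch them every time, but
// stale copies don't linger forever either.
function makePrefetcher(fetchFn, ttlMs) {
  const cache = new Map(); // url -> { at, result }
  return function prefetch(url, nowMs) {
    const hit = cache.get(url);
    if (hit && nowMs - hit.at < ttlMs) return hit.result; // still fresh, reuse
    const result = fetchFn(url);
    cache.set(url, { at: nowMs, result });
    return result;
  };
}
```

Wiring it up would look like `const prefetch = makePrefetcher((u) => fetch(u), 30000);` called from the mouseover handler with `Date.now()`.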

primitivesuave 16 hours ago 1 reply      
One way to move this forward on websites at scale is to run a test where you find out the percentage of hovers that result in a click. Suppose it's 90% - that means that 10% of those hovers result in fruitless busy-work for your server. Multiply bandwidth + server cost by 10%, and compare that amount to the amount you'd be willing to pay for near-instant load times.
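That arithmetic can be sketched as a one-liner, with entirely made-up numbers for illustration:

```javascript
// With click-through rate c, a fraction (1 - c) of prefetches is wasted.
function wastedPrefetchCost(hoversPerDay, clickThroughRate, bytesPerPage, costPerGB) {
  const wastedBytes = hoversPerDay * (1 - clickThroughRate) * bytesPerPage;
  return (wastedBytes / 1e9) * costPerGB; // dollars per day
}
```

With 1M hovers/day, a 90% click-through rate, 50 KB pages, and $0.10/GB, the waste works out to roughly $0.50 a day.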

For many companies (Facebook, Twitter, etc) the desire for instant user gratification is paramount, so the push toward instant browsing experience is a very real possibility. One problem is that most people wouldn't really notice, because these websites load pretty quickly as it is.

One interesting direction is if there was some kind of AI in the background that knows what pages you're likely to visit and preloads them - Facebook stalking victims would become an instantclick away.

dmazin 17 hours ago 1 reply      
By the way, if you don't want to listen to mouseover, merely listening to mousedown takes 50-70ms off loads [1]. Not ignorable.

[1] https://github.com/rails/turbolinks/pull/253#issuecomment-21...
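A rough sketch of that mousedown strategy (an illustration of the general technique, not turbolinks' or InstantClick's actual code; the `isPreloadable` helper is an assumption):

```javascript
// Pure helper, kept separate for testability: only same-origin links
// without a fragment are worth preloading.
function isPreloadable(href, origin) {
  return href.startsWith(origin) && !href.includes('#');
}

// Browser-only wiring, guarded so the file also loads outside a DOM:
// start the request when the button goes down, swap the page on click.
if (typeof document !== 'undefined') {
  const pending = new Map(); // href -> Promise<html string>
  document.addEventListener('mousedown', (e) => {
    const a = e.target.closest('a[href]');
    if (a && isPreloadable(a.href, location.origin) && !pending.has(a.href)) {
      pending.set(a.href, fetch(a.href).then((r) => r.text()));
    }
  });
  document.addEventListener('click', (e) => {
    const a = e.target.closest('a[href]');
    if (a && pending.has(a.href)) {
      e.preventDefault(); // reuse the request already in flight
      pending.get(a.href).then((html) => {
        history.pushState(null, '', a.href);
        document.documentElement.innerHTML = html; // crude page swap
      });
    }
  });
}
```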

snitko 19 hours ago 3 replies      
While interesting, I think this kind of functionality should be implemented only by browser developers and should be turned off by default. Really, I can wait 1 second until the site loads. What I don't want is some library accessing sites without my permission. I usually place the mouse over links to see what URL they point to, and I sometimes do not wish to click.
w1ntermute 18 hours ago 1 reply      
Doesn't Chrome already do something like this?
lmartel 18 hours ago 1 reply      
This is very cool!

One interesting reaction I had: things loaded so fast that I didn't notice one of the page changes and thought it was stuck. For sites like this one where different pages look very similar, maybe it could be worth experimenting with some sort of brief flashing animation (to make it look like a real page load)?

maxucho 15 hours ago 1 reply      
Awesome work! I just installed this in my own new experimental (read: very low traffic) web app: http://www.penngems.com/

I set the preload to occur on mousedown rather than mouseover, as per the docs, but even with this I noticed near-instantaneous page loading.

soundoflight 18 hours ago 1 reply      
Prefetching really shouldn't be blindly applied to everything, as users may be bandwidth-limited. Even though your implementation is easier on users than browser prefetch, it does take the choice away from the user unless individual sites make it easy to opt out.
AshleysBrain 18 hours ago 1 reply      
Isn't this what link rel="prerender" does? https://developers.google.com/chrome/whitepapers/prerender
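The hint referred to here is a plain `<link>` element; a tiny helper to build it from script might look like this (the function name is made up):

```javascript
// Produces the markup for a prerender resource hint. Browsers that
// support it (the linked Chrome whitepaper covers when it is honored)
// may render the target page in the background.
function prerenderTag(url) {
  return `<link rel="prerender" href="${url}">`;
}

// In a page:
// document.head.insertAdjacentHTML('beforeend', prerenderTag('/next'));
```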
ishener 15 hours ago 4 replies      
I still don't understand why in 2014 it's not possible to have an entire website, with all its files, zipped and shipped as-is on the first request. How wasteful is it for a server to field 50 requests just for images and resources? Have your root domain serve a zip file of everything you need to view it, and then include some additional popular pages along with it. It can't get any faster than that.
pokstad 17 hours ago 2 replies      
I have an even better hack. Since most blog posts / articles are nothing more than a bunch of text, I simply download all articles in a single fetch when the initial page loads. I do this using a CouchDB view that returns all blog posts in chronological order. All successive link clicks don't hit my server (unless there's an image in the article that needs to be loaded). Check it out: http://pokstad.com
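A hedged sketch of that pattern (the real site uses a CouchDB view; the `/api/posts` endpoint and the `{id, title, body}` shape below are assumptions made for this example):

```javascript
// Fetch everything once, then serve link clicks from memory.
function makePostStore(posts) {
  const byId = new Map(posts.map((p) => [p.id, p]));
  return {
    get: (id) => byId.get(id) || null, // served from memory, no request
    count: () => byId.size,
  };
}

// Browser wiring (commented out so the sketch stays self-contained):
// fetch('/api/posts')
//   .then((r) => r.json())
//   .then((posts) => { const store = makePostStore(posts); /* render store.get(id) on navigation */ });
```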
auvrw 7 hours ago 0 replies      
> before clicking on a link, you'll hover over it.

unless you use vimperator or similar. the demo handles this though, giving a hover time of infinity.

lifeformed 19 hours ago 1 reply      
Is there a demo page? I want to see what it feels like.
thasmin 16 hours ago 1 reply      
Have you considered preloading all of the links while the person is reading the page?
sagargv 4 hours ago 0 replies      
Awesome ! I can't believe I hadn't thought of this before.
ruricolist 8 hours ago 1 reply      
The tricky thing with all of these (pjax &c.) is that by loading with JavaScript, you lose progressive rendering, so while reducing latency you may actually lose perceived speed.
robgibbons 5 hours ago 1 reply      
I would be hesitant to rely on mouse input, or even touch input. Think about things like screen readers and accessibility and you'll quickly learn there are many ways people browse the internet.
wiradikusuma 16 hours ago 1 reply      
Does it work with SPA, particularly using AngularJS? (Essentially what's needed is the "prefetch on hover")
oneeyedpigeon 18 hours ago 3 replies      
Wouldn't this ruin usage stats?
insertnickname 13 hours ago 0 replies      
>Click Hover =

>Click Mousedown = 2 ms

>Click Touchstart =

I win!

mbesto 17 hours ago 1 reply      
In theory you could write a chrome extension and use it for any site, right?
napolux 18 hours ago 2 replies      
I wonder what happens for websites with zillions of visitors per day. Could all this preloading impact the servers?
resu 12 hours ago 0 replies      
This is really cool! I'll try it out.

Thanks for sharing :)

udfalkso 19 hours ago 1 reply      
Very nice. It would be great if the jQuery Mobile folks would integrate this.
wololo_ 16 hours ago 1 reply      
Does it support /#!/ (hashbangs) or just pushState ?
mrfusion 19 hours ago 1 reply      
Love it! Will it only work on html5 sites?
lintiwen 17 hours ago 0 replies      
I have a problem understanding "instant website".

Can you provide some specific definitions? Thank you.

loteck 16 hours ago 1 reply      
Am I correct in assuming that touch interfaces can't benefit from this kind of architecture?
aabalkan 17 hours ago 1 reply      
Is there a demo?
matysanchez 18 hours ago 1 reply      
Any demo? I mean, a implementation in a real web, like a blog or something like that?
kjannis 18 hours ago 1 reply      
Does this require server components? Or does it also work with a static site?
math0ne 16 hours ago 1 reply      
umm like 40% of traffic is already touch, seems too late
augustohp 18 hours ago 0 replies      
Extra kudos for not using jQuery!
aehv 17 hours ago 2 replies      
Is it possible to make it a Chrome extension and use it on all sites?
How I Optimize Myself danrodriguez.me
67 points by operand  15 hours ago   24 comments top 7
rickdale 12 hours ago 1 reply      
>"Eat Well"

I can't think of anything more distracting than trying to fit in 5 meals a day. I lost a lot of weight doing the slow-carb diet, which is eating 4 times a day, and then gradually shifted to the warrior diet, which is essentially one large meal a day. It takes a few days to adjust, but once I did, my ability to focus has been unreal. Not every day, but some days I feel like I am on Adderall and it is awesome. Anyways, just want to throw it out there: eating 5 times a day can be a lot for some, and I have found other HN'ers out there who practice one meal per day. I really, really enjoy it.

lazydon 4 hours ago 0 replies      
I wonder how the author keeps such a long list in his mind while working. I'm not sure how, when his brain is actually not cooperating, he would say "brain, here's the remedy, so stick to it". Most points have the pattern "don't...", "resist...", "seize...". Who are we kidding? As if it's that easy to direct our quirky minds.

I'm reminded of Daniel Kahneman's closing notes in his book 'Thinking, Fast and Slow'. After listing all the biases and quirks (to which he has devoted his life), he writes:

What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: This number will be an anchor..., The decision could change if the problem is reframed... And I have made much more progress in recognizing the errors of others than my own.

danielharan 14 hours ago 2 replies      
Upvoting this, if only for the suggestion to "Don't think at your desk", which is a strategy I hadn't heard before.
linux_devil 12 hours ago 1 reply      
>"Don't think at your desk" Personally, it helps me a lot: I go for a small walk alone.
sizzle 8 hours ago 0 replies      
has the author never tried foam ear plugs before...?

"Eliminate distractions, and I mean eliminate. For a long time there's been a fad in our industry of having open workspaces. While being right next to someone and being able to just look over and ask a question is ideal for communication, it can be the opposite for concentration. Headphones with loud music don't solve the problem either. What I believe works best is quiet. Can you imagine taking a final exam in college with someone blasting music? You can't concentrate at your best when any sort of external stimuli is demanding some of your attention. It needs to be quiet, and free of any visual distraction as well. People walking by, a television, anything like this should be avoided for you to stay in the zone. If your office doesn't have a quiet, distraction-free area to work in, take it up with your manager. I'm personally lucky enough to get to choose when to work from home, and I often do so when I have a large piece of work cut out for me that I don't need to communicate much more on."

nazgulnarsil 12 hours ago 1 reply      
The idea that 5-6 meals a day helps burn fat is broscience.
finishingmove 13 hours ago 5 replies      
Mostly reasonable advice. An important step is missing, though: editor/IDE setup, anyone? And in that regard, I would like to point out one particular thing that might seem trivial: color scheme. I think most people don't realize how much a proper color scheme helps reduce eyestrain, and hence general irritability. Not to mention saving your eyes in the long run... High contrast between foreground and background color tires your eyes faster. That's why you should never use (0,0,0) or (255,255,255) as your background color.
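(The contrast point can be made concrete with the WCAG contrast-ratio formula, which runs from 1:1 up to 21:1 for pure black on pure white. A minimal Python sketch of the standard formula; the function names here are my own, not from any particular library:)

```python
def _linearize(channel):
    # Convert one 8-bit sRGB channel to linear light (the WCAG 2.x formula).
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    # Weighted sum of linearized R, G, B channels.
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    # (L_lighter + 0.05) / (L_darker + 0.05); ranges from 1.0 to 21.0.
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)

# Pure black on pure white hits the maximum 21:1 ratio; a softened
# near-black on near-white is noticeably gentler while staying readable.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))      # 21.0
print(round(contrast_ratio((40, 40, 40), (250, 250, 250)), 1))   # ~14.1
```

Swapping in an off-black foreground on an off-white background (or the reverse for dark themes) keeps the ratio comfortably high without the full 21:1 glare the comment warns about.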
Why You Should Open-Source Your Startup codecombat.com
61 points by dreeves  14 hours ago   25 comments top 9
jakejake 2 hours ago 0 replies      
We try to keep things modular and so we open source components and various things in our software that might be of general use. Likewise we try to send pull requests back on components that we use. Several of us have our own open source projects as well so we're very much into open sourcing code.

But, I honestly don't see a ton of value in open-sourcing the entire codebase as-is for my company. Not that I would worry about people stealing our ideas, because like most businesses ours is about relationships, support and various other things rather than the "secret sauce" in our code. I just don't see anybody bothering to contribute since our customers are not programmers (not even very technical in many cases). I have no idea why any programmer in his/her right mind would want to go through our product to fix bugs or add features.

I can see it being a different thing if your product is aimed at other developers and/or it has a broad appeal. If there are developers who would be interested for some reason to participate, then I can see it being a great thing. I just don't think it's necessarily helpful for every company.

biot 11 hours ago 3 replies      
I think better insight would be derived from analyzing when you should not open source your startup. Where would Microsoft be if Windows and Office were open source and companies could get them for free?
dreeves 14 hours ago 1 reply      
Love this so much, and it's absolutely persuasive. My startup -- http://beeminder.com -- has been on board conceptually for a while and the only thing holding us back, for now, is the initial work required (as Nick puts it in the article, "writing documentation, automating the developer setup, filing easy issues, paring down the repository, and preparing licenses").

Here are some other costs to consider, courtesy of Yehuda Katz:

1. Reviewing all of the code that you want to open source for secrets that could compromise security.

2. Improving parts of the code that are embarrassing or too coupled to infrastructure that isn't going to be made open source.

3. Additional communication overhead for communicating with the open source community so that contributors don't do work that you're already working on.

4. Time spent triaging and working with features that may not have been high internal priorities (or risk pissing off the open source ecosystem).

5. A general willingness to cede control over the precise direction and priorities to a larger group of open source people.

Aaron Parecki adds:

6. Support costs of helping people get their dev environments set up.

But Yehuda, obviously, is in favor of open-sourcing as long as you understand those costs, and lists these advantages, most of which the article also notes:

1. Gaining additional contributions from open sourcers that would have been expensive or technically impossible to do in-house.

2. A vibrant community of people that are interested in the product, its direction, and are knowledgeable in the implementation.

3. People willing to do cleanup work in order to become familiar with the project and become contributors.

4. Getting insight into product direction by people willing to put their money where their mouth is and dedicate time to implementation (this is the flip side of some of the negative above).

5. A recruitment pool that is already familiar with the product and its implementation.

Avshalom 12 hours ago 2 replies      
I like open source and I understand you have to make a business case for it but this just feels depressingly like "Open-source your startup so you don't have to pay for employees"
loomio 1 hour ago 0 replies      
We have been open source from the start, but we actually find it really hard to find the time to support contributors to do more than small features and changes. If you have any advice about how to better integrate open source contributors without making it prohibitively time-expensive in terms of coordinating with the core dev team, I would love to hear it!

Even if we got contributors banging down the door tomorrow, I am not sure we could spare the people-hours to properly make use of them - catch 22! Right now most of our community contributions have come in the form of help translating the app (more than a dozen languages now!) which is great, but only one part of the puzzle.

This is our github: https://github.com/loomio/loomio - as you can see we have pretty basic documentation. What would be the most important information we'd need to help people be able to contribute more easily?

marquis 5 hours ago 0 replies      
This assumes you are not solving hard algorithmic problems, but I certainly agree that open sourcing certain components can be very valuable - you can get more customers if others are extending your work, and of course you attract consulting work. Providing a good, usable SDK is a good meeting point.
KaoruAoiShiho 12 hours ago 2 replies      
If I were running a startup like youtube or airbnb should I open source? The problem of clones seems to be quite big (they all already have clones).
abeiz 12 hours ago 1 reply      
I open sourced my startup a few months ago (logicpull.com) and it ended up opening up a lot of doors for me. Definitely glad I went that route.
dreeves 13 hours ago 1 reply      
[Eep, I used a bookmarklet to submit this to Hacker News and didn't notice that it submitted it with "?action=upvote" in the query string (how I first reached the page because I clicked the upvote link when I got the blog post in my inbox).

So now it looks like the CodeCombat blog is doing some crazy spammy thing with popups when you hit the site. Could a moderator fix the URL?]

Why Mt. Gox, the Worlds First Bitcoin Exchange, is Dying coindesk.com
54 points by beniaminmincu  12 hours ago   33 comments top 8
sillysaurus2 10 hours ago 7 replies      
This is FUD. As incredible as it seems, MtGox's problems are, in fact, technical. And it was quite a relief to realize this.

Also, there is minimal danger of a bank run. MtGox makes a profit off of every transaction. Therefore they control more BTC than users are able to deposit. Hence, since Magical Tux (the owner) is motivated not to go to jail, he will not run off with people's BTC.

For anyone who wants the truth, rather than speculation, here it is: http://www.reddit.com/r/Bitcoin/comments/1x93tf/some_irc_cha...

dgreensp 8 hours ago 0 replies      
Dear god, I hope so.

Mtgox is the giant raging zit on the face of Bitcoin. The software and service are so terribly, inexcusably bad, and they have always been that way. The market price on Mtgox has been artificially high ever since I can remember, because no one can withdraw their money. Users wait for their withdrawals for weeks, and sometimes months. As a result, a dollar in your Mtgox account is worth less than a dollar. Now it is artificially low, because people can't transact Bitcoins either! The other exchanges, meanwhile, tend to agree pretty closely, because it is actually possible to move bitcoins and dollars between them.

Imagine if ETrade, say, was so bad at moving money that you had to pay an extra $100 for a share of Google stock. That's what it's like. Or if everyone used Google's DNS servers but they typically took 3 seconds to respond. And yet people use Mtgox, and talk about it, and it's listed on all the sites and apps that track the market price on different exchanges.

At first, everyone chalked up Mtgox's problems to the sheer difficulty of running a Bitcoin exchange, and then, to the difficulty of running the biggest. Well, now it's no longer the only or the biggest exchange, and no one else is having the same problems, at least at this service-destroying level. I can't comment on the difficulty of establishing relationships with banks to move millions of dollars -- I'm sure it's hard -- but Mtgox has also had a lot of plain old scaling issues, and the transaction load is not even that high by web scaling standards. Last time I checked it was 30 per second or so. Their security is nothing special; they've been hacked before. I have heard only negative things about their engineering competence.

I'm a casual fan of Bitcoin but a big hater of bad software, bad customer service, and companies that act in a sleazy manner. I hope Mtgox dies.

cromwellian 11 hours ago 2 replies      
Looks like a bank run? Bitcoin exchanges that engage in some kind of fractional scheme need insurance coverage. I don't know if Mt. Gox is doing that, but it kind of smells like they don't have enough BTC on hand to allow withdrawals.

On the upside, Bitcoin seems like a perfect laboratory to re-run the experiments in banking leading up to the 1930s when you know, the "communist" FDR imposed banking regulations which are still hated today by some. :)

gojomo 10 hours ago 1 reply      
There's plenty to warrant suspicion, but it turns out there is a credible explanation for the seize-up in BTC payouts being a technical problem that will take some time to resolve:


This analysis describes a Bitcoin transactional defect of which I wasn't previously aware. However, from a vaguely-similar much-smaller incident involving the default client, I know that having a local wallet confused about what prior transactions are truly spendable can take significant time and custom effort to unwind.

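(For context, the defect being described is transaction malleability: a Bitcoin txid is the double-SHA256 of the entire serialized transaction, signatures included, so a third party can re-encode a signature and change the txid without invalidating the payment, which confuses any wallet that tracks payouts by txid. A toy Python sketch of just the hashing step; the byte strings below are placeholders, not a real serialized transaction:)

```python
import hashlib

def txid(raw_tx: bytes) -> str:
    # A txid is SHA-256 applied twice to the serialized transaction bytes,
    # conventionally displayed in reversed byte order.
    digest = hashlib.sha256(hashlib.sha256(raw_tx).digest()).digest()
    return digest[::-1].hex()

# Placeholder bytes standing in for a serialized transaction. Because the
# signature bytes sit inside the hashed data, rewriting their encoding
# yields a different txid even though the inputs and outputs are untouched.
original = b"version|inputs|sig:3045aa...|outputs|locktime"
mutated = original.replace(b"sig:3045aa", b"sig:304499")

assert txid(original) != txid(mutated)
```

A wallet that marks a withdrawal "pending" under the original txid will never see it confirm if the mutated variant is the one that gets mined, which matches the payout-tracking confusion described in the linked analysis.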
Tenoke 1 hour ago 0 replies      
I can't really feel sorry for anyone who is having trouble with Mt. Gox. People have known for ages now that Mt. Gox is unreliable, and they kept using it to take advantage of the higher prices which come with the higher risk. Hell, if anything, they are the problem, since things would likely have been better right now if more people had moved away from Gox earlier, instead of waiting for the ship to start sinking.
captainchaos 10 hours ago 0 replies      
Reading this thread on Reddit actually puts in perspective how hard it must be for governments to design and implement effective economic regulations. It's a real balancing act.
Larrikin 10 hours ago 0 replies      
I never had any problems withdrawing money, but I live in Japan
gaius 11 hours ago 1 reply      
Because it's Magic the Gathering Online eXchange.
Cloudflare app for thedaywefightback.js cloudflare.com
33 points by sinak  12 hours ago   6 comments top 2
Goopplesoft 11 hours ago 1 reply      
Will there be a way to access metrics for this, to know how many people did it via your site?
thisisparker 7 hours ago 1 reply      
Is the code available for this version too?
Books to Read in a Lifetime amazon.com
45 points by acdanger  6 hours ago   49 comments top 15
RBerenguel 2 hours ago 4 replies      
Although I'd change some books (actually, quite a lot,) I'm positively surprised by some choices, with books you usually don't see in "best of" lists but which rank high in my personal preferences.

For science fiction I'd change Dune... Even though it's a great book and I like it (I even read Herbert Jr.'s sequels... spoiler: don't), I'd pick something "shinier." Heinlein's Stranger in a Strange Land, or 2001: A Space Odyssey... or go with the weird and pick Rendezvous with Rama. Or just Foundation. Sci-fi-wise you can't go wrong with Foundation.

Ninja-edit: How could I forget The Starmaker by Olaf Stapledon? Written in the late 40s, I didn't give much for it as Sci-fi goes. I read it in one sitting. Ended with a headache, dizzy and hungry. It was well worth it.

For Murakami, I'd pick instead a relatively unknown book by him: Hard-boiled Wonderland and The End of The World. An inception-esque plot-inside-plot book, set in an almost Neuromancer-like setting. I love it.

For Dystopian... Even though I have not read The Giver, just classification-wise I'd have to pick Shades of Grey (Jasper Fforde's book, not to be confused with another, numbered, similarly named book.) It was a thrilling read (I think it's the best novel I've read in the past 3 or 4 years, but well, I don't read that much fiction lately), sadly part of a trilogy waiting to be finished. Beware: once you are done with the book you'd want to go to Britain and tie Fforde to his desk until he is done with the next book.

The books that surprised me though are incredibly well-spotted. I like that Guns, Germs and Steel is there. It's been on my reading list for... 3 years already (I have it, but it's a heavy book so I'm always eager to pick an ebook or thinner material for a commute,) because the theme is so compelling. The Right Stuff is not the usual book you see in a best-of list, but for me, it should be in all these lists. Heck, writing from it is used as example of good writing in On Writing Well (which is surprisingly a very good read).

The Long Goodbye, by Chandler. Chandler is great, period. Having one of his books in this lists validates other books I'd never consider... Even though you can't have a Chandler and don't have a Hammett. You can't go wrong with a book by Hammett, I'd probably pick The Maltese Falcon. A classic.

Of course, there are some books that I'd personally treat to a Bradbury process... Catcher in the Rye and On the Road are two books I was looking forward to reading (not being English-based meant I didn't get to read them on high school) and found dull. I guess read in a different context would have made it different, but I couldn't see all the praise. Personal opinion, though.

MichaelApproved 3 hours ago 1 reply      
Disappointed to not see Siddhartha on the list http://www.amazon.com/Siddhartha-Hermann-Hesse/dp/0553208845
JacobAldridge 40 minutes ago 0 replies      
Surprised / pleased that I'm a quarter of the way there (well, 24/100).

Didn't expect quite that much young adult literature - Harry Potter, Lemony Snicket, The Golden Compass, The Giver (plus Lord of the Rings). All major film franchises (well, The Giver is in production).

Was amazed to see The Phantom Tollbooth. One of my favourite books of all time, though I haven't read it since I was a child and honestly have never met another person who'd heard of it!

mercurial 3 hours ago 3 replies      
"Mostly anglo-saxon" books to buy on Amazon in a lifetime.
rdl 3 hours ago 3 replies      
They did a bad job on sci-fi, I think -- while I liked Dune, it certainly isn't the one SF book I'd include.

The Handmaid's Tale is far better, but is listed as "Feminist Speculative Fiction"; if I'd read the category first, I would have skipped the book, but the book is great, along with her other writing.

In a list of 100, I'd probably include 3. The Handmaid's Tale is fine; Snow Crash or maybe a Heinlein or a "golden age of sci-fi" choice.

(Edited to fix incorrect title, thanks)

rabbitonrails 3 hours ago 3 replies      
"...which are modern enough to still be under copyright and thus most profitable to Amazon"?
callmeed 2 hours ago 0 replies      
I'm glad Cormac McCarthy is on there, but is The Road better than Blood Meridian?
girvo 1 hour ago 0 replies      
I'd definitely pick Foundation over Dune, in my personal opinion. I still hold the Foundation series up as the pinnacle of SciFi.
skybrian 4 hours ago 2 replies      
I know it's not meant seriously, but even so, the notion that everyone should read the same set of books in a lifetime is a strange one. If everyone actually did that, it would be a massive duplication of effort. Much better for people to read different books.
prawn 3 hours ago 1 reply      
Surprised that Perfume didn't make the list, or did I miss it browsing on my phone with that weird interface?
bowlofpetunias 24 minutes ago 0 replies      
Looks more like a popularity contest with an unsurprising correlation to recent screen adaptations.

I half expect Justin Bieber's biography to make the list.

Goodreads and Amazon have become as useless as IMDB when it comes to ratings. Both are prime examples of why the "wisdom of the crowds" is bull. Guess Nicholas Carr was right after all.

sidcool 1 hour ago 1 reply      
Harry Potter books are on there. They are good, but then, it's about 100 books to read in a lifetime. Just my opinion.
hhorsley 4 hours ago 2 replies      
What's the criteria? This is totally opaque as a list. I agree that some of these books are epic but don't know where the decisions are coming from.
ssully 3 hours ago 0 replies      
I get why they did this, but it seems like such a waste considering everywhere I've seen this posted people just bitch about a book or a section of books they like not being on the list. But it seems to be working as intended, because I've seen this posted a lot.