hacker news with inline top comments -- 26 Dec 2014 (Best)
1
Prosecute Torturers and Their Bosses
520 points by rpicard  3 days ago   323 comments top 31
1
g0v 3 days ago 10 replies      
I just feel sick when I think of how helpless I am as a citizen while I watch my government's officials get away with shit like this. Yeah, I can sign petitions, vote, contact my congressman, but I always have the feeling of someone that's just watching from the outside.

My hope is that those people that we don't see or hear about that work around injustices like these are doing what they can to keep some sort of balance. I try to convince myself sometimes that these people that love their country and want to do good are trying to use what power they have to make things right. I know for a fact that there are amazing people working in government and I hope the good stuff these people do just doesn't get much attention.

I love my country, I served, I consider myself a patriot, but I worry about my children and their children. I don't expect much, if anything, to come of this and the fact that that feeling is common for me as an American puts knots in my stomach.

2
tacon 3 days ago 2 replies      
I'm still searching for a society in history that engaged in torture against its opponents and didn't eventually turn to torturing its own citizens. Those citizens will, of course, be labeled terrorists when it happens, but that time is coming.

When that story came out of Chicago a few years ago about a two-decade history of torturing inmates to get them to confess to crimes, we asked how that could go on for so long. Well, apparently you can torture powerless people of color in Chicago without consequence. They were only torturing "bad" people, weren't they?

I get ill listening to the "arguments" for torture. "Well, it works." I want to drop my head into my hands. The efficiency expert's answer to moral questions. But what about a ticking bomb? Can't Jack Bauer put a bullet in a prisoner's knee to get him to talk? The Israelis have the process down: they torture supposedly because of a ticking bomb, a time limit, whatever, but then they take a break for the Sabbath. Gotta keep those priorities in mind.

If torture "worked", does that mean we have no limits? Just animals ripping the face off our prey?

I'm not very religious. What are Christians thinking when they allow this to happen? Has the iron law of Paul taken over from the Prince of Peace? If there is one piece of evidence that the US is a post-Christian country, this is it.

3
diafygi 3 days ago 0 replies      
If you want to see torture prosecutions, write a letter to your editor calling out your elected officials by name (source below). I did it, so can you!

How to get your senators' and representatives' attention on any issue without being a wealthy donor | Protip from a former Senate intern[1].

--------

An email to your senator or representative may result in a form letter response and a phone call to the office may amount to a tally mark on an administrative assistant's notepad. But, for any given policy concern, if you want to get their attention a letter to the editor in one of your state's 5-10 biggest newspapers that mentions them specifically BY NAME is the way to go. If your message is directed to your representative, pick a newspaper that is popular in your district.

That is the crucial thing to know--the rest of this post is an explanation of why I know this is true.

I know this because, when I interned in the D.C. office of a senator one summer, one of the duties I shared was preparing a document that was distributed internally both online and in paper format. This document was made every day and comprised world news articles, national news, state news, and any letters to the editor in the 5-10 largest newspapers within the state that mentioned the senator by name. I was often the person who put that document on his desk, and it was the first thing he read every morning after arriving at the office.

I began to suspect that this was standard operating procedure because several other senators' offices share the same printer in the basement of the Russell Senate Office building, and I saw other interns doing the exact same procedures that I was involved in.

Since the internship, I've conferred with other Senate and House employees past and present and determined that most--if not all--offices use essentially the same procedure.

--------

[1]: https://www.reddit.com/r/politics/comments/1os8rz/how_to_get...

4
dreamweapon 3 days ago 1 reply      
Prosecute Torturers and Their Bosses

And while we're at it, we should sanction those -- like the New York Times -- who enabled and shielded the torturers via a consistent editorial policy -- solidly in place for 12 years -- of never referring to the practice of torture by name, but instead employing that famous dystopian euphemism: "Harsh Interrogation Techniques."

5
suprgeek 3 days ago 3 replies      
"Whoever fights with monsters must be careful lest he become a monster. And when you look long into an abyss, the abyss also looks into you." - Nietzsche

Based on their actions (which ultimately may not have yielded anything useful either), the torturers and their masters are monsters of the worst kind: state-sanctioned.

So now why stop at just releasing the report? Take the next logical step & prosecute these sociopaths.

6
gorbachev 3 days ago 4 replies      
It would be an interesting bit of schadenfreude if countries imposed some sort of sanctions against the US over harboring war criminals.

The State Department would probably release statements of insane hypocrisy given how they're perfectly fine preaching about human rights when OTHER countries violate them.

7
justcommenting 3 days ago 0 replies      
"If only it were all so simple! If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?" - Solzhenitsyn, The Gulag Archipelago
8
darkarmani 3 days ago 0 replies      
For anyone doubting the claims you can go right to the Senate Committee's report: http://www.nytimes.com/interactive/2014/12/09/world/cia-tort...

Page 10 is the start of the findings and conclusions.

9
fargolime 3 days ago 1 reply      
They won't be prosecuted, of course. I ostracized those in my life who supported it. That may be the best anyone can do against it, given our 2-party oligopoly.
10
mkramlich 3 days ago 0 replies      
I still find it interesting what the major news websites and channels choose to make front page headlines with or not. In the world I want to live in, there'd be a lot more daily hammering with (apparently factual statements) like: "US Gov Violates Geneva Convention on Torture" ever since the Senate CIA report came out. The US is signatory to that, ratified it by Congress, and indeed was one of its architects in the post-WW2 landscape. They repeatedly posture themselves as one of the Good Guy nations, the ultimate White Hat lawman. They have tortured and violated international law, to which they are signatory, on basic human rights and ethical behavior.
11
rdxm 3 days ago 0 replies      
Does anyone honestly believe that the current administration, which has: 1) doubled down on the Bush admin's attack on the Fourth Amendment; 2) attacked the 1st Amendment directly (something even Bush/Ashcroft didn't do); and 3) run a program arguably similar in its moral and ethical ambiguity (drone strikes) to the interrogation techniques, would even think about going after the previous administration!?!? Give me a break, that's beyond laughable.

This piece by the Times is about what you'd expect from an organization that has lost its grip on reality. What's worse is that people actually pay attention to the Times at this point. Sadly, what we lack in this country is an honest journalistic broker that can objectively communicate about the political realities we have in front of us and our shortcomings as a nation and an electorate.

Here's the thing: all of us that vote (and those that don't) are responsible for the mess in D.C. The people that inhabit the beltway are simply taking advantage of our apathy and incompetence as an electorate... nothing more, nothing less. It's up to us, as the electorate, to purge our government of the latent corruption and decay that has come to permeate that town. Until we take up that charge in a meaningful way, we can't even have a discussion about topics such as this; it's just a waste of time.

12
guelo 3 days ago 2 replies      
Obama is a disgrace. The reason he was elected over Hillary was because he initially took a harder line against the war, the spying and the torture. At the end of the Bush administration the American people voted overwhelmingly to break from the recent past, and Obama inexplicably betrayed our democracy. And I have my doubts that it will ever fully recover.
13
will_brown 3 days ago 2 replies      
>Since the day President Obama took office, he has failed to bring to justice anyone responsible for the torture of terrorism suspects.

This is such an important topic that it is a shame it always has to be discussed in a political context (i.e. Obama failed to bring to justice anyone from the Bush administration).

#1. Obama's administration continued using the Bush era Enhanced Interrogation Techniques; therefore, politically he could not pursue anyone from Bush's administration without subjecting his own administration to the same...including himself.

#2. From a legal standpoint, anyone who actually engaged in these alleged acts of torture would have a legal defense: they relied on government officials and attorneys who authorized/ordered them to perform these interrogations. Whether the officials/attorneys were right or wrong in their judgment, this is a lawful defense recognized by US criminal courts. See US v. Barker, 546 F.2d 940 (D.C. Cir. 1976).

14
warfangle 3 days ago 1 reply      
This is the kind of self-prosecution that you can't trust the USG to pursue with the necessary zeal.

This is what the International Criminal Court at The Hague is for.

15
spacefight 3 days ago 0 replies      
Yes. Prosecute them, put them all in prison. If the US won't do that, the rest of the world has a moral and legal duty to prosecute anyone involved in the torture of humans. That will restrict their travels vastly, right Dick?
16
mtimjones 3 days ago 3 replies      
For all you liberals out there who want to go after Bush, don't forget that Obama has been killing people indiscriminately with drone strikes. Something tells me you'll be less likely to go after him, indicating the real source of your outrage.

The CIA report was written solely by Democrats with an ax to grind, and sought no information from an opposing side.

Go back and watch any 9/11 news video (as it happened) and put yourself in the shoes of those who struggled to find direction during that time. I personally don't think torture is a good decision, but those in power did it as a precaution, to identify possible future attacks.

When the next terrorist attack occurs, please think about the bullshit you wrote here.

17
WallWextra 3 days ago 3 replies      
I can't think of a worse idea than prosecuting Bush administration officials. Jailing the previous administration would set a terrible precedent, and would fatally undermine the stability of our government.

There are many people who are convinced, e.g., that abortion is murder and that politicians who enable it ought to be prosecuted. Or, more on topic, that Dianne Feinstein is a traitor for making these torture memos public. They believe this with as much moral certitude as anyone here believes torture is a crime. It is only a matter of time before another right-wing president is in the White House, and when there is I don't want there to be a precedent for jailing one's political rivals.

18
DanBC 3 days ago 0 replies      
A few low level torturers were prosecuted and put in prisons. Their bosses had the political language to endorse and encourage torture while using language vague enough to escape prosecution.
19
cyphunk 3 days ago 0 replies      
"looking forward" (as the article attributes to being Obama's argument for not prosecuting) is something one does to forgive others. self-punishment is something one does to prove trust to others. The looking forward policy of the US was important in the construction of the Marshal Plan. But now is the time they need to show they are trustworthy arbitrators of any moral ground. If they fail to prosecute anyone for torture they set a horrible precedent but more importantly the lack of correction turns any future moral arguments into platitudes.

This article deserves juxtaposition with this scene from The Act of Killing:

http://youtu.be/tQhIRBxbchU?t=2m9s

"We need our gangsters to get things done" @BarackObama

20
DanielBMarkham 3 days ago 11 replies      
I'm voting this up even though I think it's a horrendous idea. Here's why:

The problem isn't that there wasn't a crime: it certainly looks like there was. I am outraged by parts of what I've read. The problem is that the system sought to legally justify it, and that we changed the system so that a good portion of people believe there was no crime. If somebody is told by the system that there is no crime, we can't then go backwards in time and declare there was one. History shows us that applying the law backwards like that is always more destructive to society than the original incident.

An important concept to understand is "criminalizing politics". That's when politicians, who rotate through office and are expected to spend most of their lives as private citizens, make decisions that could be criminal but do not involve personal gain.

We elect people to make hard choices that involve results that could be construed in other contexts as criminal, especially with respect to foreign policy. We always have.

I do not like any of this, but it's very important to understand that the problems here are systemic. A different president and VP were just as likely to make the same choices. Want to go back and try people for Japanese internment? All the rendition done prior to 2000? Assassinations and coups overseas? Spying on MLK? Such an emotional attitude is understandable, but you just can't continue a government like that. If the system was acting as best it could, and it screws up? You fix it. You don't get the firing squads out. That's banana republic territory.

So let's fix the system so it doesn't happen again. If we want somebody to hang, start a nice show trial. But since folks were acting in good faith (which is more important than "just doing their jobs"), pardon them and let's move on. There is no justice to be had here. We need to learn. This is not the time to let emotional outrage lead us into hurting each other needlessly.

21
higherpurpose 3 days ago 4 replies      
The very least that I expect out of this is for the International Criminal Court to convict them as war criminals. Whether those convicted will actually do any prison time is another issue. However, the US will have to live for decades with the shame of harboring war criminals, and it may even impact its dealings with other countries in the future.

The US will be in the history books as a country that not only allowed torture but keeps condoning it by refusing to punish the torturers. Brennan even implied that he's not ruling out the use of torture in the future.

Maybe eventually, some new US president will decide that it's time to prosecute and imprison them so the US can have a "clean start" in its international relations.

22
happyscrappy 3 days ago 1 reply      
Shouldn't Europe be boycotting the US? They are not in a very good position to do so but maybe they should work on unraveling their dependencies.
24
us0r 3 days ago 0 replies      
Vice News interview with the "Architect":

https://www.youtube.com/watch?v=MmNUi0itl-8

25
at-fates-hands 3 days ago 0 replies      
I guess this means we should also throw under the bus all the Democrats who suddenly had amnesia about knowing anything about the techniques being used?

http://yidwithlid.blogspot.com/2014/12/feinsteins-duplicity-...

The report (embedded below) shows that the CIA briefed at least 68 members of Congress on the CIA interrogation program, including "enhanced interrogation techniques" (EITs). It details the dates of all congressional briefings and in most cases, the members of Congress in attendance and the specific subjects discussed. Keep in mind, though, that the topic for each one of these meetings was interrogation of prisoners.

For example, in April 2002 both the House (HPSCI) and Senate (SSCI) committees on intelligence were briefed on the "Ongoing Interrogations of Abu Zubaydah", who was mentioned in the Feinstein report. According to the report, at this time EITs were referenced but there is no evidence they were discussed in detail. However, later meetings not only discussed EITs but gave examples of them being used (attendees weren't mentioned). Finally, near the end of 2002, we see that the most senior members of the House and Senate committees had meetings totally devoted to EITs.

26
known 3 days ago 0 replies      
Prosecute and confiscate their properties/wealth.
27
fredgrott 3 days ago 0 replies      
Question: how exactly do you prosecute the top bosses in a democracy? The top bosses are the voters... So how will we prosecute the voters?
28
darasen 3 days ago 0 replies      
Sorry, I cannot take seriously any organization that complains of "torture" while fully supporting the act of siphoning out an unborn child's brain.
29
jrochkind1 3 days ago 5 replies      
> Bush pouring water on terrorists' faces and making them stand in place for long periods is bad.

Are you deliberately trying to phrase torture to sound benign? Or have you not read about what went on? This is not an accurate description of the torture practices that were used -- neither the ones you were trying to describe nor the ones you left out.

Yeah, torture is bad.

> Obama dropping drone missiles on picnics and weddings killing Americans and their whole families is OK.

I don't think so. It sounds like you don't either. I'm not sure if you are trying to accuse the NYT editorial board of hypocrisy, or the commenters in this thread, or someone else -- not entirely sure who you think has the opinions you are parodying. But for what it's worth, the NYT editorial board doesn't think it's okay either.

http://www.nytimes.com/2014/06/24/opinion/a-thin-rationale-f...

http://www.nytimes.com/2014/07/07/opinion/reining-in-the-dro...

30
kvl7 3 days ago 1 reply      
Just as Obama will never be impeached, imprisoned, or in any other way brought to justice for his atrocities committed against the Constitution, the people in the highest offices responsible for the torture will never see punishment. Rightly so I say, there is nothing wrong with interrogating terrorists.
31
srenihwon 3 days ago 4 replies      
If two people kidnapped your wife and children saying they would rape and slaughter them in 24 hours, and you somehow caught one of the two culprits, you would do whatever it took to save your family. Whatever it took. And if you say you wouldn't, then you're either a liar or a cruel and cowardly person. On the world scale, performing non-lethal EIT on one terrorist in order to save 500 or 5000 or 50000 lives is both reasonable and moral. To think otherwise is just shocking.
2
Merry Christmas to HN
487 points by lateguy  1 day ago   106 comments top 64
1
tfb 1 day ago 0 replies      
Merry Christmas, everyone. I tend to lurk a lot, but this is one of the few programming communities where I don't expect to be met with negativity and condescension every time I post something. And beyond that, just reading everyone's discussions has easily helped shape me into the person I am today. Thanks for the past few years and many more to come!
2
jobposter1234 1 day ago 1 reply      
And a jolly Festivus to the Restuvus!
3
aragot 1 day ago 0 replies      
I've been reading HN for 3 years. I created my startup 1.5 years ago. I think that says it all. Oh, and I had revenue from day 70, I'm currently cash-flow positive, and I've taught a lot of people what bootstrapping means. Thank you, community.
4
patio11 1 day ago 0 replies      
Merry Christmas guys. May you and yours be blessed with peace, prosperity, and happiness, today and always.
5
yzzxy 1 day ago 2 replies      
This seems as good a place as any to point this out: take a look at the numbers on the frontpage.
6
rameshkamaraju 5 hours ago 0 replies      
HN is very informative and boosts readers' confidence in their respective areas of work. I wish HN contributors would include articles of interest to professionals in all walks of life.
7
ddoolin 1 day ago 0 replies      
HN. Thanks for all the insights, day in and day out. Love ya.
8
xantronix 22 hours ago 0 replies      
Gleðileg jól!

To get a jumpstart on my New Year's resolution, time to air some grievances (with myself)!

1. Dammit Xan, when are you going to finish up those unit tests for tnzer? You're holdin up the actual 0.1 release!

2. I can't believe you haven't started implementing your bytecode VM yet! Are you waiting on a freaking miracle, or just piddling until you figure out whether you want to make classes and functions defined at object code load time, or have opcodes for registering classes and functions at runtime?

I can't go home until I wrestle myself in this year's Feats of Strength and get code for my VM, birchvm, up and running.

9
vanwilder77 1 day ago 0 replies      
Thank you! Merry Christmas to all of you :-)

And thank you for being a big part of my small world :-)

10
racktash 22 hours ago 0 replies      
Merry Christmas! I've thoroughly enjoyed lurking at HN for the last year. Looking forward to another year of interesting articles!
11
hilti 1 day ago 0 replies      
Merry Christmas and thanks for being with me every single day in 2014.
12
dataminer 1 day ago 0 replies      
Merry Christmas to you as well.
13
JayEs 1 day ago 0 replies      
Merry Christmas everyone!

http://xmas.flatout-technologies.com

14
jen729w 1 day ago 0 replies      
One of the few places left where comments are worth reading. Thank you, all, and Happy "whatever makes you happy". :-)
15
shrig94 19 hours ago 0 replies      
HN is the reason I'm a reasonably good programmer. Happy Holidays everyone!
16
boo1ean 1 day ago 0 replies      
Merry Christmas! I share with you my santa hat! http://santahat.me
17
s0l1dsnak3123 21 hours ago 0 replies      
"Nollaig chridheil agus bliadhna mhath r" from Scotland.
18
midhir 1 day ago 1 reply      
Nollaig shona daoibh go léir!
19
kruk 20 hours ago 0 replies      
Wesołych Świąt Bożego Narodzenia!

All the best to the community. It's the only place on the Internet where comments are often more insightful than articles :)

20
jarcane 1 day ago 5 replies      

  import Data.Char (chr)
  main = putStrLn (map (\x -> chr (x + 32)) [45,69,82,82,89,0,35,72,82,73,83,84,77,65,83,1])

21
syb 23 hours ago 0 replies      
Merry Xmas to all of you! It's been a pleasure to read and be influenced by great people and minds. Love to Computer Science!
22
noobermin 18 hours ago 0 replies      
Ungil Kurismas (Palauan)! Probably my favorite post on HN right now. Happy Holidays to all.
23
nilkn 19 hours ago 0 replies      
Merry Christmas to all from Houston!
24
nickthemagicman 19 hours ago 0 replies      
Merry Xmas ya glorious basterds!
25
alva 1 day ago 0 replies      
4d 65 72 72 79 20 43 68 72 69 73 74 6d 61 73 21
26
binoyxj 1 day ago 0 replies      
Merry Christmas to each one of you here on HN.
27
Rygu 1 day ago 0 replies      
Vrolijk Kerstfeest!
28
bvrry 1 day ago 0 replies      
Merry Christmas all!
29
Nib 1 day ago 0 replies      
Merry Christmas!

How about having a little New Year party this 31st?

30
kozlovsky 20 hours ago 0 replies      
! :)
31
joeyspn 1 day ago 1 reply      
Feliz Navidad!
32
vayarajesh 23 hours ago 0 replies      
Merry Christmas to you as well
33
Fizzadar 23 hours ago 0 replies      
Merry Christmas HNer's :)
34
asimpletune 23 hours ago 0 replies      
Merry Christmas to you too!
35
jdhendrickson 1 day ago 0 replies      
Merry Christmas!
36
Rasmase 1 day ago 0 replies      
Glædelig jul!
37
ajankovic 1 day ago 0 replies      
!
38
lui8906 19 hours ago 0 replies      
Merry Christmas HN!
39
arcticf0x 23 hours ago 0 replies      
Happy Holidays! Here's to many more successful years to come!
40
shared4you 21 hours ago 0 replies      
! With regards from India :)
42
kshitizrimal 1 day ago 0 replies      
Merry Christmas to you as well
43
mweibel 1 day ago 0 replies      
Schöne Weihnachten :)
44
DiabloD3 1 day ago 0 replies      
Merry Christmas, everyone!
45
yla92 1 day ago 0 replies      
HN ..
46
cpach 1 day ago 0 replies      
Happy Grav-Mass, folks!
47
bornabox 1 day ago 1 reply      
Fröhliche Weihnachten & feliz natal
48
masolino 22 hours ago 0 replies      
Buon Natale!
49
adventured 20 hours ago 0 replies      
Merry Christmas all, I hope your holidays are wonderful
50
jonsterling 19 hours ago 1 reply      
Lmao at learning computer science, psychology and economics from Hacker News...
51
ljegou 1 day ago 0 replies      
Joyeux noël :)
52
lllllll 23 hours ago 0 replies      
Bon Nadal!
53
jodooshi 1 day ago 0 replies      
54
tylerpachal 22 hours ago 0 replies      
Vrolijk kerstfeest!!
55
phireph0x 22 hours ago 0 replies      
Merry Christmas!
56
bliker 22 hours ago 0 replies      
Veselé Vianoce!
57
asmosoinio 21 hours ago 0 replies      
Hyvää joulua!
58
thameera 1 day ago 0 replies      
!
59
oron 23 hours ago 0 replies      
happy xmas and new year !
60
spikett 1 day ago 0 replies      
many thanks to hn
61
spydum 23 hours ago 0 replies      
Merry Christmas everyone! Ooh look, alternating article numbers in holiday colors! Isn't it amazing what technology can do?? If only it could <blink/> like in the olden days
62
spikett 1 day ago 0 replies      
many thanks
63
freshyill 1 day ago 5 replies      
All I want for Christmas is the three lines of CSS it would take to make HN responsive.

If that's not possible, I'll take world peace instead.

64
trendril 1 day ago 3 replies      
The thing I dislike most about the holidays is the erosion of intellectual discussion and infectious somatization even on its few remaining bastions like HN.

Oh, Huxley.

3
Why movies look weird at 48fps, and games are better at 60fps
419 points by jfuhrman  1 day ago   125 comments top 21
1
sray 1 day ago 6 replies      
I liked the article, but, as a game developer who does not specialize in graphics, I really liked one of the comments:

Joe Kilner - One extra issue with games is that you are outputting an image sampled from a single point in time, whereas a frame of film / TV footage is typically an integration of a set of images over some non-infinitesimal time.

This is something that, once stated, is blatantly obvious to me, but it's something I simply never thought deeply about. What it's saying is that when you render a frame in a game, say the frame at t=1.0 in a game running at 60 FPS, what you're doing is capturing and displaying the visual state of the world at a discrete point in time (i.e. t=1.0). Doing the analogous operation with an analogous physical video camera means you are capturing and compositing the "set of images" between t=1.0 and t=1.016667, because the physical camera doesn't capture a discrete point in time, but rather opens its shutter for 1/60th of a second (0.016667 seconds) and captures for that entire interval. This is why physical cameras have motion blur, but virtual cameras do not (without additional processing, anyway).

This is obvious to anyone with knowledge of 3D graphics or real-world cameras, but it was a cool little revelation for me. In fact, it's sparked my interest enough to start getting more familiar with the subject. I love it when that happens!
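
To make the point-sample vs. shutter-integration distinction concrete, here is a minimal sketch (TypeScript, with made-up numbers; not from any engine) of what one pixel sees from a moving object, first sampled at a single instant the way a game renders, then averaged over the 1/60 s a camera shutter stays open:

  const FRAME = 1 / 60; // exposure time at 60 fps, in seconds

  // 1 if a 2px-wide object moving at 300 px/s covers the pixel at x = 5px.
  function covers(t: number, x = 5, width = 2, speed = 300): number {
    return Math.abs(speed * t - x) < width / 2 ? 1 : 0;
  }

  // Game-style: the state of the world at one discrete instant.
  function gameSample(t: number): number {
    return covers(t);
  }

  // Camera-style: average many sub-samples across the open-shutter
  // interval [t, t + FRAME) -- this integration is the motion blur.
  function cameraSample(t: number, subSamples = 64): number {
    let sum = 0;
    for (let i = 0; i < subSamples; i++) {
      sum += covers(t + (i / subSamples) * FRAME);
    }
    return sum / subSamples;
  }

  // At t = 5ms the instantaneous frame misses the object entirely (0),
  // while the integrated frame reports ~0.4: a partial, smeared exposure.
  console.log(gameSample(0.005), cameraSample(0.005));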

2
bhauer 1 day ago 13 replies      
This article is detailed and scientific.

However, anecdotally speaking, the concern I have with evaluating high-frame rate in film is that we have very little context -- most of us have only ever seen Peter Jackson's Hobbit films in HFR. In other words, I have never seen how other directors' work would be affected by HFR.

Speaking exclusively about the Hobbit series in HFR, I too observed an uncanny valley that traditional films intrinsically avoid with their low frame rate. The Hobbit films felt more like a play than a film. A play with elaborate stage effects, but a play nonetheless.

In fact, my chief criticism of Jackson's directing with HFR is that the feeling of watching a play is amplified by how he mixes the sound and directs his extras. The extras routinely just mumble nonsense in the background, leaving only the character you're intended to be focused on speaking clearly. It's the same thing you see in a play when there is background dialog, and it's completely unnatural. You find yourself sometimes distracted by the characters in the background and realizing they're not actually doing anything meaningful or having real conversations. For example, in the most recent film, I found myself more distracted by the unnatural audio in early scenes (such as the beach scene) than the HFR video.

Combine that with the poor acting by the minor characters in the first 45 minutes of the recent film and I think HFR gets a bad rap in large part because the Hobbit films alone are our point of reference.

3
lchengify 1 day ago 1 reply      
> At 48Hz, you're going to pull out more details at 48Hz from the scene than at 24Hz, both in terms of motion and spatial detail. It's going to be more than 2x the information than you'd expect just from doubling the spatial frequency, because you're also going to get motion-information integrated into the signal alongside the spatial information.

I had a conversation with a friend at Pixar about exactly this topic.

The issue goes beyond just pulling more spatial information out of a shorter timeframe; it's also that all the current techniques for filmmaking assume 24fps.

Everything has a budget of time and money, and when you, say, make 1000 extra costumes for a shot, you cut corners in certain ways based on your training as a costume designer. Your training is based on trade techniques, which are based on the assumption that the director of photography (DOP) and director are viewing the work at 24fps with a certain amount of spatial detail. Doubling the frame rate means some of those techniques need to be more detailed, whereas others might be completely useless.

Given everything that goes into a shot (hair/makeup, set design, lighting, costume design, props, pyrotechnics, etc.), it's unlikely everyone working on a high-fps film is going to be aware of exactly which techniques do and do not work. As a result, you get lots of subtle flaws exposed that don't hold up with twice the detail. The sum of these flaws contributes heavily to making the shot look 'videoish'.

4
dperfect 1 day ago 2 replies      
The most interesting point made in the article (for me) is that the presence of noise/grain - which effectively reduces real detail in an individual image - can actually improve the perceived detail across time with high frame rates.

At first, I thought this extra "detail" could be explained as an illusion (since noise/grain can mask a lack of resolution), but then I read the abstract quoted near the end of the article:

"...visual cortical cell responses to moving stimuli with very small amplitudes can be enhanced by adding a small amount of noise to the motion pattern of the stimulus. This situation mimics the micro-movements of the eye during fixation and shows that these movements could enhance the performance of the cells"[1]

So if I understand right, since the biological systems are tuned to extract extra detail via supersampling across time, and a small amount of noise/grain can enhance that ability (mimicking natural movement of the eye), it actually helps our visual system extract more real detail.

It seems counterintuitive to add noise for more detail, but the explanation is fascinating.

[1] Stochastic resonance in visual cortical neurons: does the eye-tremor actually improve visual acuity? Hennig, Kerscher, Funke, Wörgötter; Neurocomputing Vol 44-46, June 2002, p. 115-120
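
A toy version of that stochastic-resonance effect is easy to simulate (a TypeScript sketch with my own illustrative parameters, not from the paper): a hard-threshold "neuron" never fires for a sub-threshold signal on its own, but with a little noise added its firing rate starts tracking the signal:

  const THRESHOLD = 1.0;
  const SIGNAL_AMP = 0.8; // sub-threshold: can never cross 1.0 alone

  // Probability that signal-plus-noise crosses the threshold at a given
  // phase of the signal, estimated over many trials.
  function firingRate(noiseAmp: number, phase: number, trials = 10000): number {
    let fires = 0;
    for (let i = 0; i < trials; i++) {
      const noise = noiseAmp * (Math.random() * 2 - 1); // uniform in [-a, a]
      if (SIGNAL_AMP * Math.sin(phase) + noise > THRESHOLD) fires++;
    }
    return fires / trials;
  }

  for (const noiseAmp of [0, 0.5]) {
    const peak = firingRate(noiseAmp, Math.PI / 2);    // signal at maximum
    const trough = firingRate(noiseAmp, -Math.PI / 2); // signal at minimum
    console.log(`noise=${noiseAmp}: peak=${peak}, trough=${trough}`);
  }
  // noise=0 never fires at all; noise=0.5 fires ~30% of the time at the
  // signal's peak and never at its trough, so the noisy detector now
  // carries information about the signal that the quiet one could not.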

5
baby 1 day ago 8 replies      
I personally love HFR and went out of my way to watch all three Hobbit films in HFR (I traveled to Paris, the only place in France showing them in HFR).

When people complain about 48fps being weird I just feel like they're not used to it. It does look weird, but after 20 minutes it looks amazing. I'm personally tired of not understanding anything in action movies that use 24fps. It is kind of a luxury for the eyes to have 48 fps, and I predict that in a few years we'll have the same debate we now have with consoles (60 fps is better than 30 fps).

We got used to 24 fps, and so we make up justifications for why it looks better when it clearly doesn't if you take a step back.

6
AndrewDucker 1 day ago 4 replies      
So, basically, at 24FPS things are blurry enough that you can't see the fine details, which means that special effects and costumes look realistic.

Increase the frequency to 48FPS and the blur goes away, meaning that we can see the fine detail, and suddenly sets look like sets, costumes look like costumes, and CGI looks like a computer game.

7
EpicEng 1 day ago 0 replies      
8
Jyaif 1 day ago 0 replies      
I disagree on both explanations:

1/ The "soap opera effect" explains the 48 fps issue.

2/ The lack of motion blur in games is the reason why higher fps are better (see https://www.shadertoy.com/view/XdXXz4 for a great visualisation).

9
UhUhUhUh 1 day ago 0 replies      
There's also a high-level processing aspect. The brain excels at extracting relevant information, which includes discordant information. Back in the day, a solo violin was tuned slightly off to allow the audience to hear it over the orchestra. Barthes also came up with the "punctum" idea, whereby an odd detail in a picture will generate an impression. What I'm saying is that higher-level processing is probably responsible for a number of "impressions" that might have little to do with fps.
10
Qiasfah 1 day ago 3 replies      
Most serious FPS gamers swear by screens that have a higher update rate than 60hz.

In the past this was achieved by setting your CRT to a low resolution and upping the refresh rate. More recently you can get TN LCD panels that offer 120 or 144hz update rates.

Moving the mouse in small quick circles on a 144hz screen compared to a 60hz screen is a very different experience. On a 60hz screen you can see distinct points in the circle where the cursor gets drawn. With 144hz you can still see the same effect if you go fast enough, but it is way smoother.

This makes a huge difference for being able to perceive fast paced movements in twitch style games and is the reason there has been a shift to these monitors across every competitive shooter.

My thoughts on this is that this behavior is similar to signal sampling theorems. Specifically the Nyquist theorem talks about how you have to sample at at least 2x the max frequency of a signal to accurately represent the frequency. For signal generation this means that you have to generate a signal at at least twice the rate of the max frequency you want to display. If you want to accurately reconstruct the shape of that signal you need 10x the max frequency (for example two samples in one period of a sine wave makes it look like a sawtooth wave, ten samples makes it look like a sine wave).

So, if you're moving your mouse cursor quickly on a screen or playing a game with fast paced model movement even if your eyes can only really sample at something like 50-100hz the ideal monitor frequency might be 1000hz. There's a lot of complexity throughout the system before we can get anything close to this (game engines being able to run at that high of a framerate, video interfaces with enough bandwidth to drive that high of a framerate, monitor technology being able to switch the crystals that fast, etc.).

Yes, 48fps movies typically look less cinematic, but I think this is a flaw in movie-making technology and not of the framerate. The fight scenes in the Hobbit sometimes look fake because you can start to tell how they aren't actually beating up the other person. This detail is lost at 24fps, which is why they have been able to use these techniques.
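
The sampling-rate intuition above can be made concrete in a few lines (a TypeScript sketch; the half-sample phase offset is just to dodge the degenerate case of hitting a sine exactly at its zero crossings):

  // Connect-the-dots reconstruction of one period of sin(x) from n samples.
  // 2 samples/period preserve the frequency (the Nyquist limit) but draw
  // a crude zigzag; ~10/period start to trace the actual sine shape.
  function samplePeriod(n: number): number[] {
    return Array.from({ length: n }, (_, i) =>
      Math.sin((2 * Math.PI * (i + 0.5)) / n));
  }

  console.log(samplePeriod(2).map(v => v.toFixed(2)));  // [ "1.00", "-1.00" ]
  console.log(samplePeriod(10).map(v => v.toFixed(2))); // a smooth sine trace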

11
Animats 20 hours ago 0 replies      
James Cameron (Titanic, Avatar, etc.) wants to get frame rates up to at least 48FPS. He considers that more important than resolution, pointing out that higher resolution only benefits the first three rows in a theater.

With the low 24FPS frame rate, pans over detailed backgrounds look awful. This is a serious constraint on filmmaking. Cameron's films tend to have beautifully detailed backgrounds, and he has to be careful with pan rates to avoid "judder". "The rule of thumb is to pan no faster than a full image width every seven seconds, otherwise judder will become too detrimental."(http://www.red.com/learn/red-101/camera-panning-speed)

There are some movies from the 1950s and 1960s where this is seriously annoying. That was when good color and wide screen came in, and films contained gorgeous outdoor shots of beautiful locations. With, of course, pans. Some of the better Westerns of the period have serious judder problems. Directors then discovered the seven-second rule. Or defocused the background slightly, if there was action in the foreground. Some TVs and DVD/BD players now have interpolation hardware to deal with this.

The author's analysis of the human visual system is irrelevant for pans. For pans, the viewer's eyes track the moving background, so the image is not moving with respect to the retina.
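
As a rough worked example of that seven-second rule (my numbers, not from the linked page): panning a full image width in seven seconds at 24 fps spreads the pan over 24 x 7 = 168 frames, so a 1920-pixel-wide image steps about 1920 / 168, roughly 11 pixels per frame. Pan twice as fast and each frame jumps about 23 pixels; because the viewer's eyes track the background smoothly while the image advances in discrete steps, those jumps land on different parts of the retina and read as judder. At 48 fps the same pan speed halves the per-frame step, which is exactly why a higher frame rate relaxes the constraint.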

12
dsugarman 1 day ago 1 reply      
I see the same arguments arise about HFR as I do with stereoscopy, and the rhetoric follows the same pattern as the switch from vinyl to digital music formats: it is no longer art. It feels like you lose the artistic effect when you add a multiple of information to your brain. The reality is artists need to learn how to be mindful of the new medium, and the old tricks they used to overcome older-medium defects need to be removed from the process (e.g. overuse of makeup). I am excited because we have a bright future with better media technology, and pioneers like James Cameron are leading the way.
13
dkbrk 1 day ago 0 replies      
> Add temporal antialiasing, jitter or noise/film grain to mask over things and allow for more detail extraction. As long as youre actually changing your sampling pattern per pixel, and simulating real noise not just applying white noise you should get better results.

This could be a viable alternative to supersampling for antialiasing. Rather than averaging multiple subsamples for each pixel fragment, this suggests that if a single subsample were taken stochastically, the results could be as good, or even better, so long as the frame rate stays high enough.

Antialiasing doesn't quite have the same impact on rendering performance in modern games that it used to, mainly due to new algorithms such as SMAA and the increased cost of postprocessing relative to rasterisation, but this could nonetheless lead to tangible performance improvements.
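
For what the single-stochastic-subsample idea might look like, here is a toy sketch (TypeScript, with an analytic stand-in scene and hypothetical names): one randomly jittered sample per pixel per frame, whose average over frames converges to the coverage a supersampler would compute up front:

  // Analytic stand-in "scene": a diagonal edge, 1 above the line, 0 below.
  const scene = (x: number, y: number): number => (y > 0.7 * x + 3 ? 1 : 0);

  // One jittered subsample somewhere inside the pixel's 1x1 footprint.
  function stochasticSample(px: number, py: number): number {
    return scene(px + Math.random(), py + Math.random());
  }

  // Accumulate one noisy sample per frame; over time this converges to
  // the pixel's true edge coverage (~0.65 for this pixel), the same value
  // 64x supersampling would have computed within a single frame.
  let acc = 0;
  const frames = 64;
  for (let f = 0; f < frames; f++) acc += stochasticSample(10, 10);
  console.log(acc / frames);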

14
jfuhrman 1 day ago 0 replies      
15
doomrobo 17 hours ago 0 replies      
I don't quite understand. If a video is playing at 41fps, then your eye can sample each frame twice, with a difference of one microtremor to increase resolution. But if a video is playing at 83fps, you only get one sample per frame with no added benefit from the microtremor. The article states the opposite: that the latter framerate allows for a higher perceived resolution. Can anyone explain?
16
suchow 1 day ago 0 replies      
Does anyone know of a good demo of different frame rates that I can view on a laptop? Is this even possible with LCDs?
17
anonymfus 1 day ago 0 replies      
Can it be that the description of 24 fps as "dreamy" is subjective? Because my dreams don't usually have such an effect. I like plays, and I liked the 48 fps Hobbit.

Maybe it's like in the days of monochrome media: black-and-white dreams were the norm, but today they are the exception.

18
abandonliberty 1 day ago 1 reply      
I dug into high-FPS film when I read that 24 fps was designed to be viewed in a dark theatre, where human eyes blur images due to switching between rods and cones.

Most of us no longer watch content in darkness. James Cameron is of the opinion that improving FPS is more significant than moving up from HD. I figured I should trust the professional who devotes his life to this.

To truly evaluate high FPS movies and video content, you have to watch it for a while.

The SmoothVideo Project (SVP) is pretty awesome. It needs some good hardware, is made by volunteers, and takes some work to get set up well.

It struggles in scenes with lots of detail, but panning scenes are incredibly beautiful.

Going back is a bit difficult.

20
nitrogen 1 day ago 0 replies      
Is anyone else redirected to a 403 error on a completely different site (broadbandtvnews) when following the link?
21
leonatan 1 day ago 0 replies      
But... Ubisoft said some games are better and more "cinematic" at 30fps. Derp!
4
Mozilla Research Projects
430 points by yuashizuki  2 days ago   91 comments top 13
1
bkeroack 1 day ago 0 replies      
It's really fantastic to see Mozilla become a sort of steward and incubator of the open web, considering that Mozilla originally came from a revolutionary (at the time) and somewhat desperate attempt by Netscape to counter the monopoly power of Microsoft/IE. If I'm remembering correctly, this was the first high-profile corporate OSS dump (years before Java, for example), and shortly thereafter it was largely considered a failure, since Netscape became irrelevant and the Mozilla project didn't stop the MS juggernaut. It was therefore a number of years before a company was willing to take a risk like that again.

Of course with hindsight we can see that the Mozilla folks played the long game. Quietly working in the background they produced a product (Firefox) that actually did largely kill IE dominance. You can argue the role that Chrome had in this, but my opinion is that Firefox created the market for non-IE browsers. Without this trailblazing Chrome would not exist.

Congrats to everyone responsible, from the beginning to the present day.

2
bjz_ 2 days ago 0 replies      
A neat thing about Shumway is just how much of it is written in Typescript[0]. It's great that Mozilla is getting behind the project.

- [0]: https://github.com/mozilla/shumway/tree/master/src

3
ChrisAntaki 2 days ago 0 replies      
Keep up the great work, Mozilla! I'm excited to see where a lot of these projects go, especially asm.js.
4
thomasfoster96 2 days ago 1 reply      
I hadn't heard of about half of these - it's great that there's a sort of directory for them.

Sweet.js is pretty awesome (I was going to say it's sweet, but that's stupid). Broadway.js and Shumway look pretty awesome, I'm going to check them out tonight.

Regarding Parallel JavaScript, does anyone know how this relates to Khronos' WebCL project? Hardware manufacturers seem really interested in WebCL, but software developers aren't.

5
ape4 2 days ago 1 reply      
I donated some money to Mozilla this year. Maybe you could too?
6
jamii 2 days ago 1 reply      
LLJS is listed here but the last commit is over a year ago. Having spent the last month hand-coding arrays of structs in js, I'm really feeling the need for better low-level constructs. Looks like I'm stuck waiting for rust + emscripten to be a valid option.
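
For readers who haven't had to do this, the "hand-coding arrays of structs" mentioned above usually looks something like the following sketch (TypeScript, illustrative only; not LLJS): fields are flattened into one typed array and addressed by stride, trading readability for a compact layout that avoids per-object allocation and pointer chasing:

  const FIELDS = 3;  // a "struct" of { x, y, mass }, flattened
  const COUNT = 1000;
  const particles = new Float64Array(COUNT * FIELDS);

  function setParticle(i: number, x: number, y: number, mass: number): void {
    const base = i * FIELDS;
    particles[base] = x;
    particles[base + 1] = y;
    particles[base + 2] = mass;
  }

  function massOf(i: number): number {
    return particles[i * FIELDS + 2];
  }

  setParticle(0, 1.5, 2.5, 10);
  console.log(massOf(0)); // 10, with no per-particle object allocation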
7
k__ 2 days ago 1 reply      
And they are using GitHub!

I worked on some Firefox issues, but stopped because the Bugzilla/Mercurial workflow was so bad.

I did some fixes for other OSS projects on GitHub later and it worked like a charm.

8
bobajeff 2 days ago 3 replies      
So I take it this means that Mozilla is reopening "Mozilla Labs".

It's interesting to see Shumway on there as I was under the impression the project was put on hold.

9
shmerl 2 days ago 1 reply      
I hope Shumway will arrive before major shift to Wayland on the desktop.

Daala is a very exciting project. The current mess of codecs support on the Web is just horrible.

10
tkubacki 2 days ago 1 reply      
What about Mozilla Brick and Web Components?
11
mindcrime 2 days ago 3 replies      
Hmmm... would it make sense to talk about a JVM written in Rust? Could that make it easier to write a safe JVM that would be less susceptible to exploits? It would be wonderful if we could get there and have a Mozilla browser with "out of the box" Java support without needing a separate plugin.
12
mp3geek 2 days ago 3 replies      
No pdf.js?
13
soapdog 2 days ago 2 replies      
Also, if you folks enjoy these projects, take the time (and money) to donate some bucks to Mozilla.

Mozilla is the only independent vendor pushing technology and principles focused on people over profit.

You can find the donation page at https://sendto.mozilla.org/page/contribute/givenow-seq#page-...

5
Kalzumeus Software Year in Review 2014
390 points by JayNeely  3 days ago   147 comments top 32
1
davidw 3 days ago 2 replies      
> I work mostly on what I want to work on, take a day off whenever I feel like it, and optimize the business for quality of life rather than for any particular growth or financial targets.

There was some thread here where patio11 kind of snickered when someone called his business a "huge success", probably because he knows a bunch of people who earn one or more orders of magnitude more money. But the above quote probably sums up "fantastic success" for me and, I think, a whole lot of the world.

2
toumhi 3 days ago 0 replies      
I completely understand your neglect of Appointment Reminder. I've also had my share of projects I'd start but lose interest in because, well, I was not consumed by the problem I was solving.

So over the years I worked sequentially (or sometimes in parallel) on a gift certificate template gallery, a travel insurance comparator, a body-mass-index calculator website, and a file-sharing solution for businesses targeted at the French market.

The reasoning behind all these projects was to make "passive income". And by running multiple websites I would make a nice income from them all combined.

After developing and marketing the last of these projects (file sharing one, post-mortem here: http://www.sparklewise.com/post-mortem-5-mistakes-i-made-wit... ) I realized that the most important thing is not to have a "good idea" but to work on a problem you want to solve and with people you can relate to or at least that you enjoy working with. That's why I now focus on serving SaaS businesses, because that's actually something I care about and will likely care about for years to come.

Thanks for all the transparency Patrick and for setting an example for the rest of the HN crowd. And good luck with the fatherhood :-)

3
porter 3 days ago 1 reply      
Hey Patrick, thanks so much for your honesty in all of this. It's never easy talking about things that embarrass us. Not to mention in public. You'll probably get some haters, but just know that I look forward to your annual updates and they have encouraged me to quit my job and start my own software business too. This has been one of the best decisions of my life. I'm sure there are many more here who can say the same thing. So, to you good sir, thank you.
4
mherrmann 3 days ago 2 replies      
I have been working on an AR clone in Austria since mid September (https://www.terminerinnerung.org). I focus on getting enterprise customers. My website is hardly visited (so far). I got my first two customers by simply walking into their offices and asking whether they'd be interested. I asked them to pay me for 12 months in advance, so I have earned €5,368 (US$6,565) since I started. I'm hoping that I will earn this again in 2016, when it comes time to renew the contracts for these first two clients.

My approach is more high-touch - I don't rely on people searching for "Terminerinnerung" ("Appointment reminder") and then coming to my website. I think most doctors don't do that. I go out and talk to them.

I also by default offer to develop integrations into the customer's existing appointment reminder system - because the majority (~66%) of my potential customers here already have some computer system. This means reverse engineering the customer's existing system to be able to continuously export its data. I did this successfully for one of my customers (it was a Java/MySQL application). The other customer I developed a web calendar for.

I have now completed the development for my first two customers (I hadn't completed development when I sold the service to them. I just pretended I had, to make the sale). At the beginning of next year I'll start to acquire more (enterprise) customers.

I'm happy to talk about this via email if anybody's interested. My address is [my first name]@[my last name].io (Michael Herrmann).

5
wallflower 3 days ago 1 reply      
> Ruriko and I were blessed by the birth of our daughter, Lillian.

Congratulations patio11!

6
jakobegger 3 days ago 0 replies      
Thanks for being so open with your feelings. I always find it hard to talk about my own feelings, and it's great to hear how you are struggling with and overcoming the pitfalls of self-employment.

Your open sharing of actual revenue numbers is invaluable. The tech press only loves to talk about all those billion dollar companies. But your blog posts put that into perspective, giving us a glimpse of how much money a small business can realistically make without shooting for the startup lottery.

7
saturdayplace 3 days ago 0 replies      
> The only time in recent memory I used it myself was when a Redditor asked for anti-Bitcoin bingo cards, a request which I am unquestionably the best qualified person in the world to answer.)

Apropos of nothing, it seems that if you're interested in piquing the interest of someone busy, finding the Venn diagram in which they're one of a small population in the intersection might be the way to go. Or, it might just be really creepy.

8
chrisan 3 days ago 3 replies      
Thank you for sharing your story

> I'm taking my own advice to charge more, and re-aligning those numbers with actual customer behavior rather than the numbers I guessed four years ago.

How do people normally handle this?

1) Take it or leave it price hike

2) Give a X month grace period before new price

3) Grandfathered in and price only changes if they need to upgrade

4) ??

9
manto 3 days ago 0 replies      
Patrick's writing serves as a great supplement to PG's essays: real-world analysis of "slower"-growth software businesses. For engineers interested in alternate models of creating a company, these annual write-ups help one develop an outline for the financial, business, and engineering lifestyle required to get something off the ground. Thanks Patrick, after working at VC-backed startups, these types of posts actually encouraged me to go out on my own!
10
ThomPete 3 days ago 1 reply      
You can burn or you can last, but you can't do both.

Patrick is a wonderful example of a person who takes the middle road and actually put quite a lot of effort into making sure he stays there rather than letting himself be sucked in by the grow like crazy game or the never launch anything game.

He is happy; he is not trying to be happy. That alone is something most people will never experience, and measured by that, he is a billionaire.

11
simonswords82 3 days ago 1 reply      
Hey @patio11, hopefully you'll lurk a bit later...

Congrats on the kid! Balancing a newborn and any amount of business is no joke (source: I have two startups, a software company and an 18 month old).

I wanted to ask about this:

> To build out that software and get the team spun up, I had to actually sit down and document our business processes

I'd love to hear a bit more about how you went about that. You've got a great approach to documenting your thoughts and I'm sure I could learn a thing or two. I'm scaling our app http://www.staffsquared.com in 2015 and working hard to share knowledge across our growing team.

Keep up the great work!

12
mooneater 3 days ago 1 reply      
I am a huge fan of yours patio11, you have gifted us so much useful knowledge. How do we ever repay you? =)
13
shostack 3 days ago 1 reply      
Re: your SEO comment:

"AR is virtually guaranteed to be a mortal lock on the query [appointment reminder] due to the combination of the exact match domain bonus and the fact that most links to it naturally cite the name of the company."

Wasn't that largely made irrelevant a while back? I'd be willing to bet the majority of your relevancy comes from the backlinks and content on their pages vs. your exact match domain. Hopefully you're not building a link profile focusing on that link text as there have been reports of people getting dinged for that.

14
gknoy 3 days ago 2 replies      

  [H]aving numbers publicly available would complicate   [taking investment money in the future].
I don't understand why having publicly available numbers would complicate getting investment in the future. Would someone be gracious enough to explain that to me as if I have no knowledge (true in this case)?

15
Permit 3 days ago 0 replies      
>Most of our customers are on the Professional plan, which annoys the heck out of me, but it's my fault. Since I was thinking personal services, where 100 appointments a month barely sustains a sole practitioner (it implies $3k to $8k gross revenue), I thought any sizable business would be forced to pay more meaningful amounts of money. It turns out that you can run a nice boutique law office with sales in the high seven figures or an architectural consultancy with millions in revenue on less than 100 appointments a month. Believe me, I know several examples.

If you could go back in time how would you change the pricing model here?

Would you remove the professional plan altogether? How could your pricing model differentiate between the small personal services with < 100 appointments/month and the law firms with < 100 appointments/month?

16
daxelrod 3 days ago 0 replies      
Bravo for candidly writing about your failures as well as your successes. I don't think a lot of us would have the guts to write publicly about times that we didn't measure up the way you have; but your doing so is wonderfully instructive.

Congratulations on fatherhood!

17
billsossoon 3 days ago 0 replies      
The comment about BCC being Hello World with a random number generator is amusing, but the fact that you were able to generate profit from a simple web app is really a testament to your marketing and business skill.
18
tome 3 days ago 1 reply      
Patrick, if you're reading I have a question about Appointment Reminder. The website says this:

> We do our level best to answer all emails within 24 hours. All questions are answered by our lead engineer. (Your business is too important to trust to a call center.)

This sounds like it won't scale well. How do you plan to cope with support when your number of customers grows?

19
unreal37 3 days ago 1 reply      
I bet it must be an amazing relief to be able to talk specifics of AR, after so long of having to be quiet about the details. Congrats on your success, Patrick.
20
UtahDave 3 days ago 0 replies      
My favorite quote from Patrick's post:

"The slip date shipped repeatedly."

21
JunkDNA 3 days ago 0 replies      
This might be one of my most favorite posts of Patrick's ever. It's often so much easier to start something new than it is to sustain something old. Hanging around HN too long makes you an addict of "new". Guess what: growing and sustaining a business for the long haul is hard, even for patio11.
22
bbcbasic 3 days ago 1 reply      
Patrick,

Have you thought of launching an affiliate program for BCC?

You can connect on forums like WarriorForum and find good affiliates. They will love that you have a proven product. Then give them 50-75%.

Some will advertise on their existing sites, to their lists and may even pay for advertising. You take no risk and may get a lot for sales.

Another option is Clickbank where affiliates may find you. There are alternatives like JVZoo, DigiResults etc.

You could hire someone part time to do the customer support and bugfixes.

In short you could keep BCC going nicely with very little effort on your part.

And then after a few months, you have a low maintenance business that you can sell, rather than something that will slowly die.

I know it may not seem worth your time, as you have bigger fish to fry.

However all of this could be done in a couple of days, and maybe an hour a month to maintain. You may be able to sell it for $100k or more once it is in good shape again.

23
xzlzx 3 days ago 0 replies      
"...a lot of folks have wanted me to roll the dice on a funded startup with big put-a-dent-in-the-universe ambitions. At times, I wanted to want that for myself, but for the moment I was content to keep running my business in the traditional fashion. I work mostly on what I want to work on, take a day off whenever I feel like it, and optimize the business for quality of life rather than for any particular growth or financial targets."

We share the same viewpoint. Well said.

24
applecore 3 days ago 2 replies      
Why is Appointment Reminder on a .org top-level domain?
25
fdsary 3 days ago 1 reply      
Congrats on the kid & move to
26
mattste 3 days ago 0 replies      
As a young software developer interested in what happens on the business side of things, this was a fantastic read. Thanks Patrick.
27
sogen 3 days ago 1 reply      
Is there a way to reach patio11? I emailed him while taking his course but never got a response.
28
krschultz 3 days ago 1 reply      
Glad to hear Patrick is using Bench, those guys are awesome.
29
curiously 3 days ago 1 reply      
Is it better to have a low churn rate per month with lower MRR or a higher churn rate with higher MRR ?

Can you counter churn rate by acquiring new customers?
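Not the OP, but the tradeoff is easy to model with a back-of-the-envelope - a minimal Ruby sketch with made-up numbers:

  # Net MRR movement per month = new business minus churned revenue.
  # All figures below are invented, purely for illustration.
  def net_mrr_change(mrr, monthly_churn_rate, new_mrr)
    new_mrr - mrr * monthly_churn_rate
  end

  net_mrr_change(10_000, 0.02, 500)   # =>  300.0 (low churn, lower MRR: growing)
  net_mrr_change(20_000, 0.05, 500)   # => -500.0 (high churn, higher MRR: shrinking)

So yes, new customers can offset churn, but only while the churned dollars stay below what you can acquire each month - churn scales with the size of the base, acquisition usually doesn't.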

30
staunch 3 days ago 0 replies      
Congrats on the baby!
31
sadpanda5 3 days ago 10 replies      
This may be an unpopular opinion, but this guy needs to cofound a company with someone else. He is one of the smartest dudes I know for optimization, but seems to have some of the most tame/boring ideas for 'startups' (if you can call them that). Bingo card creator? Appointment reminder? He needs a cofounder who complements his skills. Mainly, good ideas and good sales skills.

The bottom line is that you can only optimize marginal ideas so much via A/B testing and whatnot.

32
dennisgorelik 3 days ago 1 reply      
> Ruriko ... does not love Ogaki ... and wanted a change.

Did she consider US cities?

Tokyo is very expensive and an unusual choice to move to for a family with a child.

Any US city would put you in better touch with your business and would be less expensive than Tokyo.

6
JMAP: a better way to email
395 points by brongondwana  3 days ago   115 comments top 21
1
dannysu 3 days ago 1 reply      
Great job guys! Things like this together with CalDAV and CardDAV are reasons why I pay you guys for service.

I just tried the following experiments with my Gmail and FastMail account. There's a reason why FastMail just feels faster!

  Try this:
  1) While having the Gmail iOS app open, mark an email as read on the web
  2) See how long it takes for Gmail to reflect the change
  3) Now try with FastMail iOS app, see that it's faster

  Try this too:
  1) While having the Gmail iOS app open, delete an email on the web
  2) See that Gmail app never updates until you manually refresh
  3) Now try with FastMail iOS app, see that it happens instantly

  How about the other direction?
  1) Read an email on your phone
  2) See that Gmail on the web never updates until manual refresh
  3) Now try with FastMail... Instant update!

2
HarrietJones 3 days ago 0 replies      
This is good, and I love FastMail, but the one thing that really, really gladdens my heart with this is that they're pushing it as an open standard.

Thanks FastMail. Sincerely, thanks.

3
plg 3 days ago 1 reply      
Amazing.

Modern, innovative, service-oriented products and a business model based on customers giving the company money in return for services.

I hope that this is a reminder to startups out there that you don't have to trick customers into handing over their personal information all the while pretending to offer your services "for free", in order to gain market share and be profitable.

The "traditional" business model can be quite successful.

4
driverdan 3 days ago 4 replies      
Why would you create a new email protocol without an encryption requirement? I understand trying to fix the existing protocol problems but one of the biggest is plaintext.
5
ebabchick 3 days ago 2 replies      
For the well acquainted -- how does this relate to Inbox (https://www.inboxapp.com/)? I have not spent enough time with either to make a comparison
6
nona 3 days ago 1 reply      
I hate to be critical given the considerable work they've put in, but I don't buy their "JMAP is actually more REST-like than most RESTful APIs" statement. But maybe I'm missing a whole lot.

REST doesn't preclude doing multiple things in one round trip. Well it does seem JMAP is a bit more flexible when it comes to multiplexing very different actions. But the error handling doesn't seem to tell you which command has failed. You still can't get away from certain ordering issues. And to be honest, I don't see how it's going to be that easy to cache.
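For context, on the multiplexing point: a JMAP request body is an array of [method, arguments, client-id] triples, per the draft spec as I read it. A sketch in Ruby - the exact method and argument names here are from memory, so treat them as assumptions:

  require "json"

  # One HTTP POST carrying two method calls; the client-id in each triple
  # is echoed back in the response so results can be matched to calls.
  batch = [
    ["getMailboxes",   {},                                         "#0"],
    ["getMessageList", { "filter" => { "inMailbox" => "inbox" } }, "#1"]
  ]

  puts JSON.generate(batch)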

This really is RPC. Using it as a straight replacement for IMAP might work well, but I wouldn't want to use it as a platform to build something very different from an email client.

7
ratsmack 3 days ago 1 reply      
This looks interesting, as long as it is more efficient as a protocol and API. On another note, the entire email infrastructure needs to be redesigned, in my opinion.
8
thecoffman 3 days ago 0 replies      
I haven't had a chance to review the specifics of the protocol yet, but as a Fastmail user, I can vouch for the speed of event propagation across multiple clients. It's lightning fast; noticeably faster than, say, Gmail.
9
tinco 3 days ago 2 replies      
Wasn't there a startup recently that pivoted from being a mail client into building a generalized API for interfacing with mail services like gmail/yahoo/outlook? I forgot the name but I'm super interested in efforts like this.

I have the feeling the next big social network disruption is going to be leveraging e-mail in a big way. E-mail is the established quasi-p2p platform (it's a network of centralized services) on which everyone has an account and a very complete social network. I'm not saying it would be easy to launch a Facebook competitor from e-mail, but I believe it's still got the potential to launch one last social network and someone just has to do it.

I'm hoping the key lies somewhere in building the right sort of mail client, that's got the reliability and extent of e-mail, adding on top some hook feature and layering in perhaps a dash of privacy, cryptography and independence. There's got to be something viral in there.

10
0942v8653 3 days ago 2 replies      
Did something happen to the comments? I'm seeing 78 upvotes and no comments.
11
mortenjorck 3 days ago 0 replies      
This looks fantastic and is a huge, long-overdue breath of fresh air in the protocol space. But realistically, what will it take for major vendors like Apple and Google to start adding support for it?
12
mike-cardwell 3 days ago 3 replies      
I understand you guys use and contribute to Cyrus. Is there a JMAP interface planned for Cyrus? Or for any other mail servers, like Dovecot?
13
hedgehog 3 days ago 1 reply      
Neat. Two questions about the digest:

1. You recommend using a digest of the message to generate the message ID. Did you consider mandating that scheme or going farther and using a Merkle tree for the mailbox representation? It seems like this would allow for generating single requests that can fetch all new items.

2. Why SHA-1?

Edit: Also, thanks for keeping Fastmail running. As a customer of about ten years it's much appreciated.
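For what it's worth, the scheme in question 1 is trivial to sketch in Ruby - illustrative only, not necessarily the exact bytes FastMail hashes:

  require "digest"

  # Derive a stable ID from the raw message bytes: any two servers holding
  # the same message derive the same ID, which is the appeal for sync.
  raw_message = "From: a@example.com\r\nSubject: hi\r\n\r\nbody\r\n"
  message_id  = Digest::SHA1.hexdigest(raw_message)   # => 40-char hex string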

14
maxk42 3 days ago 3 replies      
It's missing the one killer feature that would fix the spam problem: approved senders. If you had a "friends" list like Facebook where only people you've approved may send messages to you, we'd cut down on 90% of spam immediately. Between that and the lack of mandatory encryption this is just another tedious protocol to implement.
15
elwell 3 days ago 1 reply      
> By using the platform push channels, JMAP avoids having to hold its own connection open.

Can anyone explain what they mean by "platform push channels"? I don't understand how you can have realtime updates without leaving a connection open or periodically polling. Do they mean that they use the same mode of transport as Push Notifications?

16
iwantagrinder 3 days ago 2 replies      
I'm interested in this from a security perspective. What does this new protocol offer in terms of better control around what makes it to the inbox? Would IMAP>JMAP translation before hitting the user give us better ability to filter out malicious items/spam?
17
msh 3 days ago 0 replies      
I guess this tries to solve the same issue as Microsoft ActiveSync, just in an open-specification way.
18
ummjackson 3 days ago 0 replies      
FastMail continue to impress me. Good work folks.
19
sjtrny 3 days ago 2 replies      
20
kijin 3 days ago 5 replies      
JMAP is a rather misleading name.

"J" is misleading because it's not JSON that distinguishes JMAP from IMAP, but rather the use of HTTP, along with recently added HTTP features such as push. JSON is just the message packing scheme. HTTP is what does all the heavy lifting.

"M" is misleading because this protocol is designed for a lot more than mail. It also handles contacts and calendars, and I wouldn't be surprised if Fastmail made their file storage service accessible through JMAP as well.

What's even worse is that once you come up with a protocol that covers everything that Fastmail does, everyone and their dog will try to add more services to the protocol. Instant messaging? Got it. Collaborative editing a la Google Drive? No problem. And before you know it, the protocol is bloated as hell and you've basically reinvented HTTP.

Please don't try to do everything. Do one thing, do it well, and put a strict limit to the scope of the project for the time being. A few more minutes of battery life on a phone is not worth polluting the world with yet another example of massive scope creep. IMAP+SMTP using a stateless protocol would be cool, but anything more and I'm not sure.

21
Animats 3 days ago 2 replies      
What does this do that IMAP and SMTP don't do already?

All you need is an IMAP server, accessed from all your devices. The one that comes with Android isn't bad. Thunderbird works fine on the desktop.

You don't have to use GMail with Android; when you first power up your Android device, click "Later" when it asks for a Google login. Then delete the Google One-Time Startup app. Google won't bother you again.

7
Attack Is Suspected as North Korean Internet Collapses
372 points by jcfrei  3 days ago   190 comments top 28
1
eyeareque 3 days ago 7 replies      
The public /22 (1024 IPs) that is used by North Korea is widely known now, so it is bad form to assume the US is behind this attack. Heck, a 14 year old with a few bots could take down their whole country.
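The 1024 figure is just host-bit arithmetic:

  2 ** (32 - 22)   # => 1024 addresses in a /22 (10 host bits)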

This outage won't hurt North Korea. At best it makes for a good headline to see the whole country offline. At worst this means that their elite citizens cannot access social networks or email outside of their country.

I really hope this isn't the doing of the US government. You'd hope they could do better than this...

2
Alupis 3 days ago 3 replies      
This is occurring after the hacker group who claimed the attack (Guardians of Peace) sent the FBI a letter thanking them for blaming North Korea, calling the FBI the best (sic), and linking to a YouTube video that called the FBI "an idiot".[1]

[1] http://www.cnn.com/2014/12/22/world/asia/north-korea-us-sony...

3
SEJeff 3 days ago 5 replies      
Do people seriously think the USG is behind a DDoS when Anonymous has already stated they are going to go after the DPRK?

http://www.inquisitr.com/1691688/anonymous-announces-vengean...

4
vlunkr 3 days ago 5 replies      
Is it weird to anyone else that all this "cyber warfare" is happening over the release of a movie? A comedy movie, not a documentary or propaganda film. I don't know if media has ever had such an inadvertent impact on politics before. I would say it's a strange age we live in, but I think this strangeness is all from North Korea.
5
jgwest 3 days ago 2 replies      
Maybe it's the doing of the U.S. gov't... maybe not...

But in any case, what's the point of keeping the U.S. government's action or non-action secret?

As the linked piece states:

"If the attack was American in origin something the United States would probably never acknowledge ..."

It's sort of like the Doomsday Machine in Dr. Strangelove: it just doesn't work as a deterrent if you keep it a secret.

Or is all this secret "cyberwarfare" capability that the U.S. government is secretly building only going to be used in secret?

6
yourad_io 3 days ago 5 replies      
Trying to inform oneself about a technical matter through a mainstream news source is an exercise in frustration.

Maybe my English needs work. Could someone with superior English skills to mine, please decipher the article and tell me:

Is there any actual evidence of an attack? Has traffic spiked through/from NK?

Or could this be them "pulling the plug"?

Because the first case is: "Someone attacked NK Internet and brought it down", while the second is: "NK Internet IPs were 'withdrawn' from the net".

7
uean 3 days ago 0 replies      
With such a small subnet, the idea that all the various sysadmins who read this article will immediately run a quick ping check to confirm NK is still down - and that this in itself could turn into sufficient traffic to DDoS the entire country - makes me giggle a bit.
8
keeran 3 days ago 0 replies      
This all stinks (TBP included) of a media blitz to prepare the greater masses for further restrictions on their Internet abilities.

"Sure a content filter makes sense, there's a war going on."

9
Rapzid 3 days ago 1 reply      
Obama already calibrated the government's stance on this ordeal when he said the Sony hack was vandalism and NOT terrorism. I don't believe the government being responsible for NK's internet problems is in line with that.

He also seemed to believe that the fault for any censorship as a result of the hack lies squarely within the US.

10
oneofthose 3 days ago 0 replies      
This article reads like an excerpt from a Vernor Vinge novel, in particular `Rainbows End`. Amazing.
11
jmnicolas 3 days ago 3 replies      
Of course, NK won't be pissed at all and they're not going to retaliate at all (yeah I know it's probably the goal of this attack).

This might be the first steps of the first cyber world war for all I know.

The only good thing is that only the elite will be affected by the collapse of NK Internet (no porn for a while). The average citizen probably can't even grasp what the net is, and none of her life is linked to it.

12
luftderfreiheit 3 days ago 5 replies      
What fascinating times we live in.

My interpretation of the general history of warfare is that countries agree on restraint once some situation has occurred that all sides agree should never happen again. Mustard gas in WWI, nuclear weapons in WWII...

Hopefully this doesn't spiral out of control. It's not clear where the boundaries are that we don't want to cross.

13
wahsd 3 days ago 1 reply      
So, has it been 100% confirmed that NK is behind all of this? I don't know, I realize that NK is like some hormone crazed pubescent boy, but shit just seems weird.

What if this all turns out to be some trolling by some third party, maybe even not government affiliated.

14
downandout 3 days ago 4 replies      
I suspect this to be the work of the US government, but out of curiosity I wonder if there would be any legal consequences were Sony or another private party to launch a DDOS attack on North Korea from the US. Obviously no one would be extradited to NK, but I'm curious if that would run afoul of US law.

If not, it might be fun to create some software or a mobile app that would keep this going indefinitely. I imagine a "CrashNK" app would get a lot of downloads.

15
seanemmer 3 days ago 0 replies      
More informative article from Huffington Post:

http://m.huffpost.com/us/entry/6367654

16
ElectricMonk79 3 days ago 0 replies      
Cutting off a major source of communication to a paranoid and armed nation seems like a really bad idea. Ask any horror film director - imagined enemies and actions are much worse than being able to see the monster.
17
leke 3 days ago 0 replies      
Sometimes I think the US's responses are so disproportionate, if someone was to actually attack their country, they would respond by attacking the entire world.
18
hardwaresofton 3 days ago 0 replies      
No offense (to any who might be vehement supporters of NK I guess, though I can't imagine there are many), but I can't imagine the NK internet is/was very big/strong/fault-tolerant.
19
Monotoko 3 days ago 0 replies      
Recently a scan of the IP space was put on /r/netsec - I don't think this is a coincidence.
20
ilamont 3 days ago 2 replies      
If this is the work of the U.S., it sets a very bad precedent.
21
graycat 3 days ago 0 replies      
Well, NK likely does not have the best electric grid. So, maybe the problem was just their electric grid! Or maybe the problem was someone clicking on the wrong icon or push button in some system management software, maybe written in NK?

But if the outage was from a DDoS from the USG, then I have to regard it as mostly a publicity stunt: That is, I have to believe that the NSA and CIA have much better control over, penetration of, NK computing than just a DDoS!

I mean, NK has, what, bootleg, never updated copies of Win 95, Win 2K, Win XP SP0, really old IE with lots of ActiveX pages, really old FF and Flash? The place has to be a computer version of a fire trap without a firewall! NSA and CIA rootkits have to be tripping over each other all over NK like rats in a garbage pile.

Oh, did someone compare NK with a garbage pile? Oh, how pejorative! I mean, how could one regard that pinnacle of fashion that gave the world the unique haircut of the Great Patriotic Leader, Jr.?

Besides, their girls nearly all look so young, that is, small and thin, possibly because nearly everyone there is thin. Maybe they get a lot of exercise, aren't very warm in the winters, and don't eat very much, or all of those.

22
kolev 3 days ago 0 replies      
How immature... if it was the US. So, North Korea (we still don't know for sure) caused hundreds of millions of dollars of loss to Sony Pictures, and the US caused how much damage to North Korea (which doesn't care much about the internet)?... Well, close to $0. How proportional is that?!
23
r109 3 days ago 1 reply      
ooh thought this would happen. driverdan called it, reference: https://news.ycombinator.com/item?id=8777811
24
classicsnoot 3 days ago 0 replies      
Could this be some sort of shot across Russia's bow?
25
curiously 3 days ago 1 reply      
This is as useful as announcing we put an embargo on Rolls-Royce Pinnacle Travel. There are only 15 of them, and not many people can afford one anyway.
26
yuashizuki 3 days ago 2 replies      
LOL what a pathetic response, after an attack on the First Amendment.
27
sauere 3 days ago 1 reply      
/edit: posting in wrong thread. sorry. (and stop it with the downvotes!)
28
lostgame 3 days ago 6 replies      
Okay, seriously - who else is making the weird, kinda unsettling connection between the recent seizure of the Pirate Bay and this whole 'The Interview' business with North Korea?

If the Pirate Bay was still online, would 'The Interview' have leaked already?

Is the seizure of The Pirate Bay linked to the intentional suppression of the release of this film?

Why would the government that raided TPB concede to do this for terrorists?

I mean, I hate to be one of those conspiracy nuts, but - it really seems like this is all a big distraction for the start of some new strange form of cyberterrorism.

8
Ruby 2.2.0 Released
315 points by kokonotu  1 day ago   68 comments top 13
1
jayroh 23 hours ago 1 reply      
To the people who maintain the tools and platforms around these large version updates - thank you. rvm, ruby-build, homebrew, heroku - you're all truly generous for having everything ready for us almost immediately (on Christmas morning no less).

You're all wonderful - merry Christmas! <3

2
mrmondo 1 day ago 1 reply      
Great work to the ruby team, the improvements in Ruby's performance since 2.x have been very impressive to say the least. Well done and Merry Christmas to everyone that worked hard on this release.
3
xfalcox 22 hours ago 1 reply      
Guys, I made a very simple pull request regarding auto proxy detection in the Ruby standard library, but it's still open after a little while. Does anyone here have some direction on how to contribute correctly to the project?
4
hit8run 1 day ago 2 replies      
Ruby seems to continuously move forward. Great to see that they manage to keep this project alive without a Python 2-vs-3 disaster.
5
cmelbye 16 hours ago 1 reply      
Not compatible with Rails 3.2.x, for those still on that version. (And possibly will never be, as 3.2.x only receives security updates now.)
6
blacktulip 1 day ago 4 replies      
Great work. Thanks for the Christmas gift.

btw: anyone else think Rails was mentioned too much in the release notes?

7
joshdotsmith 1 day ago 0 replies      
For rvm users, remember to run

  rvm get stable
before trying to

  rvm install ruby-2.2.0
You might end up getting preview1 unexpectedly.

8
rab_oof 1 day ago 1 reply      
Interesting: vfork support.

Vfork in most older systems is like fork except it doesn't deeply duplicate all process state immediately (file handles, memory, such), so it can be faster if all the app wants to do is fork/exec. (We had to implement both fork and vfork in minix 2.x in uni.^)
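The pattern in question, sketched in Ruby for concreteness - whether MRI's implementation actually takes a vfork path is a platform/build detail, so treat this purely as illustration:

  # Classic fork/exec: the child duplicates the parent only to immediately
  # throw the copy away with exec - exactly the case vfork(2) speeds up.
  pid = fork do
    exec("ls", "-l")   # replaces the child's process image
  end
  Process.wait(pid)

  # Process.spawn expresses fork+exec as one call, which leaves the
  # interpreter free to use vfork/posix_spawn underneath where available.
  pid = Process.spawn("ls", "-l")
  Process.wait(pid)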

But according to SO, most OSes implement fork lazily, so there's not much point in using vfork when fork will be just as fast without any change.

If there were a slowly-performing platform that has a specific performance issue identified by profiling as caused by fork, then mature optimization could follow. Otherwise, it seems like adding LoC without a clear goal.

The Symbol GC sounds good. Maybe this implies code associated with classes and modules can be GCed and required anew once all objects are freed (live-upgrading apps without restarts).

^ The first thing I did to the Minix codebase was set keyboard repeat rate to the fastest possible values. Everyone else seemed content to waste their life waiting for their editor and debugger to move at a snail's pace. Of course, no else had x86 asm / pc hw io experience.

9
calineczka 18 hours ago 1 reply      
I have a problem installing on Ubuntu 12.04: https://gist.github.com/paneq/fee5477fb7ab1ede0104 . Did any of you experience that as well?
10
current_call 1 day ago 2 replies      
It's a bit weird that symbols are garbage collected. I wouldn't think it could delete them without making the same symbols unequal some of the time.
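They stay interned, though, so equality survives - as I understand the 2.2 design, only dynamically created symbols are collectible, while symbols appearing in source code are immortal. A quick Ruby check:

  # Symbols are interned, so collecting an unreferenced dynamic symbol
  # can't break equality: re-interning the same name later simply yields
  # an equal (indeed identical) symbol again.
  a = "user_name".to_sym   # created dynamically at runtime (collectible)
  b = :user_name           # literal in source code (immortal, never collected)
  a.equal?(b)              # => true - same interned object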
11
andersonmvd 11 hours ago 0 replies      
The download URL should be HTTPS, but at least they provided the digests.
12
arikrak 1 day ago 1 reply      
Anyone know how Ruby 2.2 works with Rails on Heroku (with unicorn)? When I switched from 2.0 to 2.1, it caused all sorts of memory problems so I had to switch back.
13
claudiug 1 day ago 0 replies      
Nice! Hope that version 3 will add a JIT :)
9
We Invite Everyone at Etsy to Do an Engineering Rotation
293 points by kevingessner  4 days ago   70 comments top 20
1
karmacondon 3 days ago 0 replies      
I'm a huge fan of cross disciplinary rotations of all types. A startup company, or any organization, should act as a unified whole. "It's not my problem" is not an option, especially when the company is small and the stakes are high. Sales depends on engineering which depends on support and management, an interconnected web. Rotations build empathy, lead to innovative thinking from outside perspectives and give people greater context. I've proposed them at several of my past jobs only to be shot down each time. It says a lot about the management of Etsy that they encourage designers and product managers to do a rotation on the coding side, when I wasn't able to convince my team leaders to let php developers from one project rotate to work on another.

"Human resources" has come to mean paperwork and discipline, but the real value of the term is much closer to its literal meaning. Developing peoples' innate capability is very important. Any company can compete to hire the "best people", but the really smart companies put that effort into increasing the value of the people that they have. The capacity of the human mind is one of the broadest and most versatile things in the universe, but most of us quickly settle into limiting patterns of thought. Just a few days of seeing things from a new perspective can make all the difference in the world. Etsy's engineering rotations seem like fun, but I think they will pay off in a big way. It's hard to put a number on increasing teamwork and understanding. Programs like this are a great way to maximize that value.

2
robertwalsh0 3 days ago 0 replies      
I loved everything about this article. At my company, we've also found that providing spaces where people are able to work in a cross-disciplinary fashion gives the opportunity for innovative ideas. Every Thursday, a team member is paired with another from somewhere else in the company. While, say, a marketing person gets to learn tech, it's also very rewarding for an engineer to be able to work with a marketer or a sales person and see that side of the business. Exposing a marketer to engineering may help her have epiphanies like, "I might be able to track how effective my last campaign was by doing X", and an engineer might think about things that could be added to a feature to maximize user growth. We wrote a blog post about our intra-company pairing here: http://blog.scholasticahq.com/post/91759651948/pairing-thurs...
3
pvnick 3 days ago 0 replies      
That is just the coolest idea ever. For non-engineers, software can be a sort of black box filled with "code," whatever that means. This knowledge gap frequently leads to conflicts when engineers take longer to build a feature than non-engineers would like, or when things break that just seem so simple. Getting everybody involved in the deliberate, painstaking process of writing quality software is a fantastic way to ensure that everybody is on board with the way code is written, and it minimizes interdepartmental friction. Kudos to Etsy!
4
Wonnk13 3 days ago 2 replies      
Great idea. I'd love to see a writeup about a rotation in the other direction, ie give engineers a taste of the business side of the house. As a data scientist I speak a lot with Sales and Engineering and sometimes the two teams seem worlds apart...
5
frostmatthew 3 days ago 4 replies      
I like the rotation idea, but I can't say I see much logic in the desire, mentioned/linked in the opening, to have new engineers deploy to production on their first day. At VMware (or at least on my team) we try to have new engineers commit code their first week (this doesn't always work out, and when it does it's usually the 4th or 5th day) and I almost feel that's too soon... first day just seems nuts.

You don't see this in other professions, e.g. I doubt doctors are performing surgery or lawyers are going to court on their first day at a new hospital or firm. I'm just not seeing the value in having someone commit code before they're possibly familiar with the codebase and [unless it's a product they used before getting hired] may be equally unfamiliar with what the product even does.

6
zavulon 3 days ago 4 replies      
It's a great idea, but I'm having difficulty understanding the specific task non-coding employees are doing: adding their own photo to Staff page. Shouldn't there be a nice user-friendly back-end interface that would let them do that in about 10 seconds without any code knowledge?
7
drderidder 3 days ago 0 replies      
Kudos to Etsy for doing this. I think there's great value in learning basic programming skills even if not everyone has the inclination to become a software designer. Kind of like how taking music lessons has all kinds of tangential value even if the student doesn't turn out to be another Van Cliburn.
8
pnathan 3 days ago 1 reply      
I'm continually impressed by Etsy Engineering's descriptions of their practices and process.

Rotations are a wonderful idea and, IMO, should be done more regularly.

9
hw 3 days ago 0 replies      
As much as the rotation idea is interesting, and can be beneficial on the surface, I'm not sure if doing so on a recurring basis provides more value than interruption and the setup/teardown costs of context switching.

Sure, a non engineer could learn a thing or two about how code works, and an engineer as well on handling support, but I'd be cautious about these sessions leading to a false sense of understanding how things actually work, which might eventually lead to, for example, a support person making wrong assumptions about an issue a customer is having just because he/she paired on the relevant code base.

IMO cross disciplinary 'rotations' should happen naturally, instead of making it explicit on a certain day in the quarter. Engineers should have exposure on a day to day basis on what customers want as well as have exposure to the product and business side of things in the planning stage of a sprint, understanding why a story or task is prioritized the way they are, etc. Same goes for non engineers like product managers or support personnel who often deal with engineers on an ongoing basis, and the sharing of technical knowledge should come naturally with each discussion.

10
badmadrad 3 days ago 0 replies      
From a UI perspective, I don't love Etsy, but I think they really have a world class engineering team. This is not the first time I've heard good things about that outfit.
11
Havoc 3 days ago 1 reply      
Wish my employer had that. I'd kill for an engineering / IT dev rotation... since those were close 2nd and 3rd in my choice of career.
12
pkaye 3 days ago 0 replies      
I wonder how to do this with engineering that requires deep knowledge. At my work we have SoC designers, layout, analog designers, board layout and firmware among the engineering departments. I don't think we can even rotate within the engineering departments as everything is so specialized.
13
radicalbyte 3 days ago 1 reply      
It's not just tech companies doing this. At Volvo we did it as part of our continuous integration process. It was great fun, it really helped you to understand the business better.
14
sytelus 3 days ago 0 replies      
This should also be applied within engineering teams. Employees should be encouraged to move from team to team at certain intervals (such as 3-4 years). There is an argument that this doesn't allow people to specialize, but I feel 3-4 years is a long time, after which the returns on developing specialization are probably diminishing. This keeps life interesting and you get insights into how other teams work, their process and tools, etc.
15
kevinSuttle 1 day ago 0 replies      
So many parallels to other industries. The greatest chefs often describe how they'd held every job in a restaurant before becoming chefs.
16
deepGem 3 days ago 0 replies      
This is really cool. What would be awesome is a design rotation for engineers. The way some of the top designers work is a joy to experience. Even their scratch book looks so well organized.
17
gohrt 3 days ago 0 replies      
Thank you for not putting "Why" at the beginning of the article title.
18
frsandstone 3 days ago 0 replies      
This is awesome.
19
logicallee 3 days ago 0 replies      
This is like an opera company inviting everyone - altos, contraltos, baritone and bass male singers, the conductor, the symphony orchestra - to do a rotation as a soprano singer.
20
productcontrol 3 days ago 0 replies      
It is true, I used to clean the bathrooms there and I went on code rotation, or as we called it "stink patrol". I thought the stalls were bad, but man, that codebase was far worse!
10
UI Performance Decline OS X Tiger to Yosemite [video]
276 points by blkhp19  2 days ago   234 comments top 56
1
lucisferre 2 days ago 13 replies      
Yosemite has been a major regression in almost every respect, I actually regret upgrading. Aside from the obvious and major performance and stability issues the major features are just plain broken. For example, try answering your phone from your computer. It takes way too long to actually pick up the call from the phone and then when you try to answer it bugs up so that in the end you can never actually answer the call. This is completely unacceptable behavior and a feature like this should have never been released unless it was exceptionally reliable and seamless.

It seems Apple is more concerned with showing off the concepts behind these features than with making sure they actually deliver a solid experience when they're released. Another case in point is the changes to AirPlay for the AppleTV. The new approach to connecting a device should be a big improvement; instead it's a buggy mess that makes it nearly impossible for us to use it anymore.

One of the most refreshing things about moving from primarily using Windows/Linux to a Mac for me was the sensibility, stability and the fact that things just worked the way you expected them to. Now that seems to be completely lost.

2
lunixbochs 2 days ago 3 replies      
I feel like the entire compositor has a performance regression. This includes dragging windows, and seems to have nothing to do with transparency.

When running OS X virtualized inside VMWare Fusion (which does not have hardware accelerated video drivers for the guest), Mavericks was very usable. Yosemite dropped to 2fps (measured using Quartz Debug tools).

I had to use Quartz Debug (available in the Graphics Debug Tools here [1]) to disable beam sync to make it usable again. In VMWare, this takes me from 2fps to around 50fps for basic operations like dragging a window around and typing.

I've observed noticeable improvements disabling Beam Sync in a native environment as well. Unfortunately, I've found no way to do it without leaving Quartz Debug open. Quartz Debug also has an FPS meter in the menu bar under "Window -> FrameMeter". My framerate when dragging a large window in Yosemite with "Beam Sync: Automatic" is around 40fps. With "Beam Sync: Disable" it's almost a steady 60fps.

A stable test in System Preferences for me is the Dock settings.

[1] https://developer.apple.com/downloads/

3
mrmondo 2 days ago 3 replies      
I was heavily involved in the beta testing of 10.10. In my experience, Yosemite is by far not only the buggiest but also the slowest release of OS X.

I am saddened to report that only one of the many performance regressions I reported was actually fixed (In that case within Mail.app).

A huge portion of the performance problems seem to be stemming from the graphics subsystem. If you use Yosemite on dual Retina or even WQHD displays you'll know what I'm talking about.

Add graphics glitches, immature theming and wireless problems to the overall poor performance and we're talking about major issues in areas where OS X has been known as a leader.

My biggest issue is that Apple seems to be ignoring most of the core performance issues that are being reported - just look at the number of unresolved posts on the Apple forums.

I do hope that Tim Cook is becoming aware of the drop in reputation that OS X has suffered since Yosemite and is leading his teams to perform more in-depth performance soak testing before future releases.

4
fiatpandas 2 days ago 2 replies      
You would think with all the smart, detail oriented, outgoing people at Apple, SOMEONE would have pulled aside the person responsible for shifting to a yearly cycle and told them: "look, this accelerated release cycle, it's not resulting in quality software. It's hurting our reputation. It's not performing on 2 year old hardware. People are losing faith in our ability to release stable bug-free builds. Users shouldn't have to cross their fingers when upgrading. This is Apple."

But no, that's probably not happening.

5
kenferry 2 days ago 0 replies      
This particular issue is probably due to process isolation and sandboxing.

In Tiger, all of those pref panes were loaded into the System Preferences app, and each pane could access all the same data as any other. 3rd party pref panes are supported.

In Yosemite, each of those panes is its own process.

That doesn't excuse it, just saying something about what's going on.

6
mikhailt 2 days ago 2 replies      
I wish SL were still getting updates, it was the best OS release Apple ever made. It went downhill the moment they went to annual release cycles.

I'd rather wait 2-3 years for a new OS update that is stable, fast, and responsive than deal with all the weird glitches that are turning me off OS X for good. Windows 10 in alpha builds feels better than Lion-Yosemite.

7
mergy 2 days ago 3 replies      
The decline started with Lion. Snow Leopard was the pinnacle of the OS. They were able to offset or mask some of this with SSD drives. But it has become pretty sloppy for folks that really wanted to tune their systems for performance. Couple that with the questionable changes to the UI over the last couple of major releases and I didn't see anything I wanted anymore.
8
JohnBooty 2 days ago 1 reply      
I hadn't noticed any perceptible performance differences in Yosemite on my 2011 MBP.

After reading this article, I opened up System Preferences on an external WQHD monitor and... yes, indeed. That transition is not very smooth. Informally, it doesn't seem slower... it's just not as smooth as the old effect.

As another poster noted, in Yosemite those panels now use a cross-dissolve effect instead the simpler slide and fade in earlier versions of OSX.

Thing is, I'm not sure I would have noticed this in a million years.

How much time do you guys spend clicking around in System Preferences every day? This really affects you?

I don't know of any other apps with preference panes that use this effect. I tried two other apps (iTunes and Pages) and neither one uses the cross dissolve.

I understand that this small regression is merely emblematic of the issues some of you are experiencing; none of you are claiming that System Preferences itself has a huge impact on your daily lives. But this is such a ridiculous regression to select as the poster child for Yosemite's perceived woes.

9
comex 2 days ago 4 replies      
Interesting. The flashing at least seems to be an isolated System Preferences bug, as I don't remember seeing it in any other applications.

Would be better to compare a larger number of UI interactions - Finder, Safari, Exposé/Mission Control, iTunes, etc. - to more clearly establish a pattern rather than just claiming it anecdotally. Not saying the pattern isn't there, but proving it would be a breath of fresh air in an area typically so subjective and dominated by perceptual biases.

10
grandalf 2 days ago 1 reply      
Not sure what everyone is talking about, I loaded Yosemite on my 3 macbooks (2010 11" air, 2011 13" air, and 2014 13" pro) and it works perfectly on all with no glitching whatsoever.
11
mayhaffs 2 days ago 3 replies      
While loitering in the local Apple store, I got on a new 5k iMac and started launching a bunch of applications with multiple windows, alt-tabbing and clicking around really fast, etc. Normal dev workflow stuff..

Discovering the glitchy flash in the system preferences pane was really unsettling. Almost an uncanny-valley-type situation.

I'm still on Mavericks. I've been waiting for bug fixes from Apple and application support for non-Apple apps before upgrading to Yosemite. But after seeing the glitchy-ness on the 5k and reading these comments..

Both recent upgrades are pretty disappointing. When I first got an iPhone 4, I was astonished by how there were practically zero bugs, especially compared to my previous no-name OS, bug-infested phones.

Are there actually more bugs in Yosemite/iOS8? And if so, can we identify potential factors?

Technical challenges shifting to Swift? Regular product life cycle growing pains? Decreased focus on OSes from Apple? Increased product lines/technologies? Pivot from price-skimming to market-capture business strategy?

12
x0054 2 days ago 2 replies      
I recently updated from Mavericks to Yosemite on my 2012 rMBP. Even with transparency turned off, my battery life was down from 6+ hours to less than 3.5 hours! And everything is so annoyingly slow. After 3 days with Yosemite I had to use my wife's 2009 Air for something, it's running Mavericks, and wow, the speed! That day I restored from a backup and now I am back to running 10.9. My battery is back to 6+ hours, and my laptop is flying.

I also updated my iPad 3 from iOS 6 to iOS 8. Now it's unusable! So, so, so slow. And I can't even downgrade. If you are going to screw things up, Apple, can you at least give me an option to downgrade?

I think the main problem is the monolithic aspect of OS X and iOS. Why is the Mail or FaceTime app part of the OS?

13
hit8run 2 days ago 2 replies      
That's also what I experienced. Yosemite UI performance is super slow compared to Mavericks. Maybe by rewriting the UI code they also got rid of speed optimizations from the Bertrand Serlet era :/
14
mzaouar 2 days ago 0 replies      
The UI performance decline is more drastic in "System Preferences" than in the rest of the system. It's because some of the pref panes (General, Security & Privacy, etc..) are actually remote views and involve launching processes for security (address space separation) reasons.
15
alexggordon 2 days ago 0 replies      
It's sad really. I continually have incredibly high expectations for OS X and it's new releases. However, they keep proving me wrong time and time again.

I used to be incredibly "gung ho" about upgrading to Apple betas. Then, with iOS 7, that all changed. I had to do something I've never had to. I took the beta off my phone and put back on iOS 6. Same with iOS 8 -> iOS 7, despite the nagging feeling at the back of my head that I shouldn't upgrade.

I think the real issue is that ideals like Facebook's "move fast and break things"[0] (important to note they've stopped using that motto) have become the mantra of big companies fighting the decreasing return on investment each new developer brings. To compare, the impressive thing to me about Google is that they ship so fast, with so few issues.

Despite Tim Cook's solid leadership, I think Apple is really in a grey area right now. Yeah, their profits are at an all time high, but the reason the profits exist is because they have the best hardware and software combination. As soon as one side of that edge goes away, then it might be prime time for a dedicated Microsoft to step back in.

[0] https://www.facebook.com/notes/facebook-engineering/reflecti...

16
mproud 2 days ago 0 replies      
Yosemite uses a variation of a cross-dissolve transition when changing/showing content after a mouse click, whereas older versions do not. Give it a try and pay close attention to the window.
17
mrpippy 2 days ago 2 replies      
I've also had lots of problems with Yosemite, but I don't think this comparison is useful at all. The Hackintosh looks like a fresh install with nothing else running, whereas the rMBP has other apps running. When I don't have 100+ Safari tabs open, Yosemite runs a lot better.
18
blkhp19 2 days ago 0 replies      
For those who haven't read the video description:

Here's a very quick, unscientific comparison of one tiny aspect of UI performance in OS X Yosemite vs. OS X Tiger. Although this is just one example, I personally see these kinds of hiccups throughout OS X constantly.

The machine running OS X Tiger is actually an Intel Core 2 Duo PC (hackintosh) from 2007 with no hardware accelerated graphics support in OS X (it's an unsupported Intel integrated graphics chipset that was never used in a real mac, hence the unsupported graphics). Therefore, the Tiger demo is running without QE and CI.

The machine running OS X Yosemite is a late 2013 15 inch retina MBP. Automatic graphics switching is disabled for the demo, forcing the machine to use the much more powerful Nvidia graphics card. I also have the power adapter plugged in so that the system isn't in a low power state.

I've been using OS X for 8 years now. From 2007 to 2013, I used hackintoshes (custom built with compatible hardware). Those machines ran Tiger, Leopard, Snow Leopard, Lion, Mountain Lion, and Mavericks. I got my rMBP in 2013 and it's now my primary machine. It first ran Mavericks, then I upgraded it to Yosemite this year. It's been my experience that the performance of all of my machines has noticeably declined since the release of Lion. In terms of UI performance, my rMBP running Mavericks/Yosemite is nowhere near as responsive as any of my Snow Leopard machines were.

Fingers crossed that WWDC 2014 focusses on major speed and stability improvements for OS X and iOS. New features are great, but not at the cost of performance.

19
sesteel 2 days ago 1 reply      
I am currently working on an OpenGL based UI toolkit and recently purchased a 4K monitor and an EVGA GeForce GTX 750Ti to drive the display for testing purposes. As you might expect, it appears taxing to push 4X the content over the bus or have the card render content at a high resolution.
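The 4X is literal pixel math:

  # UHD "4K" vs. 1080p, pixels per frame:
  (3840 * 2160) / (1920 * 1080)   # => 4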

This may not be the effect seen here, but it is the first thing that came to mind. I am not sure how well system hardware has kept up with display technology.

20
tomvo 2 days ago 0 replies      
I noticed a lot of UI speed degradation as well, especially when using a dual monitor setup and changing focus from one screen to the other. This was most visible when OS X tries to refresh the menu bar icons. I did a fresh install (instead of upgrading to the beta) and things seem to be a little better, but I guess this just has to do with the fresh install clearing my mac of all the bloat it acquired in the past years.
21
Yoshino 2 days ago 0 replies      
I do agree that Yosemite has had some performance regressions, but people seem to remember the older operating systems with rose colored glasses. I used 10.3 to 10.5 on the 2005 PowerBook G4 (1.67 GHz) and it was incredibly slow. Spotlight was nearly useless. The fans would frequently spin up. And let's not talk about Finder locking up when a network mount disappeared. Right now I use 10.10 on the original 2012 Retina MBP, and even though Yosemite is a bit slower than Mavericks it is still leagues faster than (Mountain) Lion on the same machine (which is what it shipped with). The biggest performance regressions I've seen have been in Safari, but Chrome doesn't seem to have the same problems.

I think they are paying attention, since 10.10 performance has improved considerably since the first DP, but they need to do more. I wish they would stop fucking with it on a yearly cycle and just release a stable, fast operating system with no new features. The graphics system needs more optimization and they need to kill HFS+ (but that's a whole different can of worms).

(Edits: wording)

22
kenOfYugen 2 days ago 0 replies      
My first Apple Computer was a black MacBook 13" bought in 2007. The OS installed was Leopard and I was extremely happy with it.

The experience offered quality, responsiveness, and stability, as well as productivity enhancements. I didn't miss Windows and Linux at all.

The experience remained perfect as the OS gradually got updates up to 10.6.8, the latest Snow Leopard version, and the hard disk drive was changed to a solid-state drive.

At that time I bought a MacBook Pro 13" with an i5 processor, and was introduced to Lion.

Believe it or not, I used my older MacBook more. It "felt" much more solid and responsive. Well at some point I gave it away, and was stuck with the Pro model.

As the OS updates were coming along, things were expected to get only better, just like I had previously experienced. But no! I'm currently using the latest Yosemite, and hate it when I notice tiny buggy things happening... [for example, try disabling transparency in the accessibility features, and you get clunky black corner edges around the volume box as you adjust the volume up/down]

As I became a more experienced programmer, I realized that good software is perfect software. Thus anything I notice that's not right (and doesn't get fixed when it's a known issue) is a hint for a lot more bugs that I might never get to experience. So essentially I feel like I've lost the stability and security feelings that used to keep me close to the Mac OS.

I really miss Snow Leopard, and hate the path Apple is following. I could bitch about their phones too but that would be more derailing.

PS: I wish there were an image out there of Mac OS 10.6.8. I haven't got a spare Apple machine to boot 10.6.7, update it to 10.6.8, and create a bootable disk for my MacBook Pro.

Does anyone know if such an image exists, or can perhaps create one?

23
cmelbye 2 days ago 0 replies      
It's probably too much to hope for a stability and performance focus for iOS 9 and OS X 10.11, but damn if I don't try.
24
conradev 2 days ago 4 replies      
Even with "Reduce Transparency" enabled (blur disabled) I've had times when Mission Control operates at 2 frames per second on my 13" 2013 rMBP. This never happened using Mavericks.

If you try full screen zoom on Yosemite with an external display attached, it's similarly unusable.

25
aquanext 2 days ago 0 replies      
I'm hoping that 10.11 will perhaps be the release that refines things quite a bit, concentrating on stability and generally cleaning up messes like this.

Let's remember that they did a fairly major overhaul of everything with Yosemite. This is just how software works sometimes. But you're right, there was a certain amount of UI stability during the Tiger through Snow Leopard era that I wish that Apple would get back to.

Try switching between the tabs in the "About this Mac" dialog. The framerate is truly shameful. Astonishing that it got approved at all. I had to try it on a couple different Macs at the Apple Store to believe that it wasn't some random problem I was seeing.

26
ghshephard 2 days ago 2 replies      
This isn't the first time that things have gone wrong with a new version of OS X. The shift from 10.6.8 to 10.7 was an absolute catastrophe from both a performance and a stability standpoint. 10.8/ML was slightly better, and I've stayed on 10.8.5 waiting for 10.9/Mavericks to stabilize.

From colleagues who use it, and the various KEXTs that I also have, it looks like Mavericks, and KEXT support for it, actually hit a pretty good stability plateau around June/July of this year - so I'm looking forward to upgrading to it sometime early in the next year.

There is zero chance that I'm going to be considering 10.10/Yosemite as my production platform (if at all) any earlier than 2016. I'm always slightly bemused at how quickly people feel the need to rush into the next operating system. Was the older one really all that bad? About the only thing I'm missing right now is that Omnifocus doesn't support 10.8 (10.9 or newer). Other than that, I can live with the various warts on 10.8.5 (particularly now that FTDI drivers don't kernel panic my system every time I pull out the USB cable) - kernel panics are now a biweekly event (instead of every couple of days), and my only real need to reboot the system is when my USB devices aren't recognized (usually after a sleep) - but a 3 minute reboot and everything is fine again.

Still - I think it's good to call Apple on their crap - 10.6 was actually a pretty decent release, and I don't think they've managed to get one that good out since then.

27
htilonom 2 days ago 1 reply      
It's that fucking AppleGraphicsPowerManagement.kext!! I'm not even kidding with that kext name. If you delete the kext, repair caches and reboot you'll see 10x better and smoother animations.

More about the kext itself: it contains the list of supported graphics cards (a poor one) and power states for them. Some states are badly done even with supported graphics cards. I had a lot of fun with this kext back in the ole' days after the Intel switch.

28
kaffeinecoma 2 days ago 0 replies      
I read so many reports of Yosemite being faster that I was beginning to think I was the only one who was disappointed.

I have an older machine (2010 Mac Pro tower), but it's been upgraded to 16GB RAM and an SSD over the years. I experienced an initial degradation in UI performance with Mavericks, but after the first OS update it was as fast as ever. Yosemite comes out, and now it takes over a second to switch between Safari tabs on an otherwise idle machine. I had hoped that like Mavericks, this would be resolved in the first OS update, but no such luck. I enabled the "reduce transparency" option (this was recommended to speed up the UI) but it didn't help.

My mid-2012 rMBP doesn't seem to suffer from this. I wonder if Apple just didn't consider video cards in older hardware. Not an unreasonable thing to do, but I'd have preferred a "sorry, this machine can't run the latest OS" message instead.

29
jokoon 2 days ago 0 replies      
I have a mid 2009 13" MBP, and for me it has been very slow since Mavericks, probably because it only has 2GB of RAM and no SSD, since recent versions of Mac OS have been optimized for SSDs. The superdrive is busted, and there's no USB boot possible with this version. Linux or Windows via Boot Camp are not good options since they both seem to make the laptop ventilate like crazy.

I intend to buy an 80-euro 8GB RAM upgrade, but still, it's quite sad to see such solid hardware with such a crappy OS. No official way to downgrade. The processor has 3MB of L2 cache and honestly I'm sure a C++ IDE would run better on a Linux-running, 300-euro Celeron laptop.

I feel like I fell for the apple hype.

I really feel Apple is now worse than Microsoft regarding screwing users. Now you can run Windows 7 without a product key, and it won't really bother you too much.

30
lwh 2 days ago 0 replies      
Don't worry, in a few years when iOS gets a major 3D-look visual upgrade it will trickle down to the desktop and things will be back to normal. In the meantime I've switched my macbook to dualboot Linux and only boot up MacOS when I have to use one of the programs not available for Linux.
31
serve_yay 1 day ago 0 replies      
I don't know what it is exactly, these things are very subjective, but the UI does seem to have lost some feeling of smoothness. Transitions are starker than they used to be, strange blinks as you see in this video, and more.

I am not sure what to think of our current era of Apple products. The software seems to be more slapdash now, as if they are trying to do too many things at once.

On the other hand, this was the first time I upgraded the OS and all my development tools worked without my wasting time on silly problems. They do seem to be more developer-friendly now, but I mean that purely in a technical sense. In terms of developer relations they are perhaps worse than ever.

32
tolmasky 2 days ago 0 replies      
Animation in general is just plain broken in Yosemite (and iOS 8). I'm not up to date with the latest frameworks, but I believe there's been a fundamental API change, because it's basically possible to break any animation by just causing an event to happen before it's finished. This happens all the time in iOS; you can get it to break mid-rotation and end up in a half and half state: https://twitter.com/tolmasky/status/532578692804124672

On OS X, it's the same thing. Here's me easily repro-ing in Safari: http://tolmasky.com/letmeshowyou/Yosemite/Safari%20Animation...

33
rmetzler 2 days ago 2 replies      
Not really sure what causes this, but I learned that Apple changed the layout algorithm to Cassowary a few years ago [1], because languages have different word lengths (e.g. German words tend to be much longer than English words) and UIs should be readable in every language. Apple now uses constraints to lay out the UI; the algorithm tries to find a good solution through iteration and has to reiterate for every size change. Visual effects might also trigger this.

[1] http://en.wikipedia.org/wiki/Cassowary_%28software%29
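To make the constraint idea concrete, here is a minimal sketch of Cassowary-style layout solving in Python, assuming the third-party kiwisolver package (a port of the Cassowary algorithm); the variables, sizes, and strengths are purely illustrative, not Apple's actual layout code:

    # Requires: pip install kiwisolver (a Python port of Cassowary)
    from kiwisolver import Solver, Variable

    label_width = Variable("label_width")
    button_x = Variable("button_x")
    window_width = Variable("window_width")

    solver = Solver()
    solver.addConstraint(button_x == label_width + 20)    # button sits after the label
    solver.addConstraint(button_x + 100 <= window_width)  # button must fit in the window
    solver.addConstraint((label_width == 80) | "weak")    # preferred size, may be relaxed

    # The window size changes on every resize; the solver re-solves each time.
    solver.addEditVariable(window_width, "strong")
    solver.suggestValue(window_width, 300)
    solver.updateVariables()
    print(button_x.value())  # a position satisfying all of the constraints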

34
bluthru 2 days ago 0 replies      
Just checked on my 2007 MBP--I'm also getting flashes. (Translucency is turned off.)
35
mholt 2 days ago 0 replies      
Interesting that this was the UI aspect that was chosen for a video. In my experience, Yosemite performed very poorly when clicking on a folder in the dock (very low framerate). My problems were worse than bad UI, though: I was losing data. Yosemite frequently (every other day) crashed hard, and the only option was to power cycle. Though I save frequently, sometimes I just hadn't hit Cmd+S yet, and in some running processes it corrupted data. (On a Late 2013 MBP.)

Reverting to Mavericks made everything silky smooth again. Not a single crash or performance hiccup. I think I'll stay here for a while.

36
wodenokoto 2 days ago 0 replies      
Since we all have anecdotal evidence about the merits and perils of any OS update, I find this video of yet another anecdote extremely counterproductive to the discussion.

Two systems with different specs, different resolutions, and different apps running, yielding different UI performance, is a really useless comparison.

If anyone has a spare Mac and a few hours of nothing to do, please redo this experiment using fresh OS X installs on the same hardware; then we'll actually have SOMETHING to talk about.

37
naoru 2 days ago 1 reply      
Yeah. Just look at this menu bar when switching focus from the internal Retina display to an external non-Retina ATD: http://cl.ly/2W172H0l3s0z

This bugs the hell out of me.

Not so long ago I grabbed myself a G4 MDD with 768MB of RAM; it was much more responsive in terms of UI. 3-4 browser tabs slowed it down, though, while my current 13" rMBP from 2012 works OK with a lot more.

38
jorisw 1 day ago 0 replies      
Might help to disable all those animations/transitions they keep adding. I have entered these defaults settings on all my Macs and they have sped up my experience a lot:

http://apple.stackexchange.com/a/63477/70074

39
tdicola 2 days ago 0 replies      
Have been holding off on upgrading to Yosemite until the dust settles. I find it funny that I'm not even getting nagged at all to upgrade (unlike on iOS, where I was getting nagged constantly to upgrade to iOS 8). It's almost like Apple themselves aren't confident that Yosemite is usable right now.
40
Luyt 2 days ago 0 replies      
Ah, yes, the flicker and delayed drawing. I noticed that also a few years ago with Windows applications. They used to be written with the Win32 API or MFC or even VCL, but when WinForms became popular it introduced a lot of ugly flickering, especially when resizing windows with many controls on them.
41
joshmn 2 days ago 2 replies      
I don't know anything about desktop/native applications, only web. My question to the people on the other side of the aisle: why aren't all the view panes preloaded and pre-rendered? In this day and age, where 64MB of RAM is easy to come by, I don't see (as a web developer) why this approach isn't taken.
42
x0054 2 days ago 2 replies      
Also, this is part of the final build?

https://d262ilb51hltx0.cloudfront.net/max/800/1*EdAxsHqdZmZ7...

Really, this bug was reported in the original beta, 6 months ago!

43
oldspiceman 2 days ago 0 replies      
After Mavericks I swore I would wait to upgrade until the second patch of the next OS. So I'm sitting here on Mavericks waiting for 10.10.2 to drop.

If you slightly shift your upgrade schedule you're only 3 or 4 months behind. No big deal.

44
batuhanicoz 2 days ago 0 replies      
I did have many of the issues mentioned here, but recent beta releases fixed many of them for me. WiFi seems stable, I can answer phone calls if the phone is nearby, and it has certainly become faster over time.

I'm on a previous generation rMBP 13".

45
rado 2 days ago 0 replies      
My biggest OS X problem of the past 10 years, switching keyboard language, got worse in 10.10, because now it doesn't work at all in the "Share" dialog.
46
nuwin_tim 2 days ago 0 replies      
The only application holding me back from using Linux on all my machines is Xcode... if only there were a Wine for Mac apps that could run Xcode.
47
gsands 2 days ago 0 replies      
2012 retina MBP with 8GB RAM here. Major performance issues with Chrome -- when videos are playing in tabs, 10+ tabs open, etc. System Preferences is the least of the problems I've come across.
48
MrBuddyCasino 2 days ago 0 replies      
Also, compare this with the "About This Mac" dialog; the transitions are slow as hell. This would never, ever have shipped under Jobs.
49
srpoder 2 days ago 0 replies      
This is simply sad. I downgraded to Mavericks after a week of pain with Yosemite. I won't upgrade again; I don't trust it anymore.
50
htor 2 days ago 0 replies      
This video is pretty useless. Randomly clicking on settings icons on one computer with Tiger and then another with Yosemite is a poor method for comparing UI performance. Do the two computers even have the same hardware and configuration?

Many people don't like the look of the new Yosemite UI. That's fine. It's a radical change. But don't make ungrounded statements about the performance until you have done some proper testing.

51
general_failure 1 day ago 2 replies      
For me the installation bombed. It just locked up forever at "1 min remaining."
52
ps4fanboy 2 days ago 1 reply      
I wonder if this was done on the latest Apple hardware. Apple appears to regress performance on older hardware without fail.
53
mukundmr 2 days ago 0 replies      
Performance has degraded with Yosemite. Stability is not an issue. I use it on both older and newer machines.
54
msie 2 days ago 0 replies      
Wow, the day I finally upgrade to Yosemite this story and its comments arrive. Hours too late. :-(
55
hubot 2 days ago 0 replies      
The performance is OK for me (Air 2011). A bit slower, but stable so far.
56
gcb0 2 days ago 0 replies      
Jobs died in the middle of his plan of converging OS X and iOS and forever owning Apple consumers. Now Cook is executing it without the reality-distortion powers that Jobs had.
11
Cat Litter Boxes and DRM
282 points by DanBlake  3 days ago   122 comments top 21
1
forrestthewoods 3 days ago 10 replies      
This is a relatively interesting situation, IMO. And it's tragically not unique. Another example is the Keurig machines. The latest model has DRM such that you can only use officially licensed Keurig cups. Suffice it to say, people were displeased. A delightful video showing how to get around it has 670,000 views. https://www.youtube.com/watch?v=9e0yCq1AEeY

So, here's why I think it's interesting. The companies that sell these products likely make the bulk of their profit from the consumables. That makes it economically viable to sell the machine at break even or possibly a loss. Then the money comes from the consumables. This is a pretty attractive business model both to the manufacturer and possibly even the consumer.

The downside, of course, is that competitors can swoop in and make consumables as well. So now you're selling hardware for a loss and other people are selling consumables for razor thin margins and you're screwed.

I don't think there is an obvious answer here. Some of these markets might not be viable if the hardware has to be sold for a profit. So that kinda sucks. But the DRM also treats consumers like shit, so that really sucks.

Is perhaps the issue just that it isn't clearly stated up front? Amazon sells two flavors of Kindle readers: one with ads for less money and one without ads for $20 more. Once upon a time Apple sold DRM'd MP3s for 99 cents and DRM-free MP3s for $1.29. Would you pay an extra $50 for a DRM-free kitty litter box? Or an extra $100 (33%) on a fancy coffee maker?

Here's my takeaway. Some products are subsidized by consumables. DRM enables that subsidy. Without DRM that product may not be viable. DRM can be minimally negative (Steam) but can also be maximally hostile (Keurig). Finding the balance is tough and we should talk more about it.

2
donutz 3 days ago 3 replies      
"Every once in a while, when the scoop misses a giant cat poop the drying cycle cooks it. It gets dried out like a little raunchy piece of beef jerky. It ends up stinking the apartment up worse than one could imagine. Its rare, happening maybe once every week or two"

Thanks for striking this off my list of "things I think might be useful." I don't need the odor of fresh-baked cat poop wafting through my house. I'll stick with the freshly-pooped cat poop smell that confines itself fairly well to the room the litterbox is in.

3
pavel_lishin 3 days ago 2 replies      
I feel like the litterbox of the future baking a shit lasagna once every two weeks is a pretty damned high failure rate. Aside from the obvious gross-out factor (I also assume you have to clean it manually after this happens?) I'd be worried that today it's the cat's poop, but tomorrow it'll cook the cat.
4
CapitalistCartr 3 days ago 2 replies      
We've been hearing about computer tech coming to everything, even our coffee pots, for so many years now. Who knew that when it did, it'd be DRM. Keurig thought that was the way to better coffee, apparently. This has become idiotic. There appears to be someone at every corp who thinks this is a good idea and presses it. No downside, and huge potential upside. Until there is a negative, it will proliferate.
5
anigbrowl 3 days ago 4 replies      
I do my cat litter manually, but the author seems to have overlooked the Litter Robot: http://www.litter-robot.com/ My friend has one and it works very well (for years now). It just uses off-the-shelf cat litter. The mechanism is 'brute force' rather than optimized, but it works: after the machine senses, by the change in weight, that the cat has stepped out, it slowly rotates the entire cat chamber, causing any deposits to be covered by falling litter even if the cat hasn't buried them by choice, before hitting the mechanical filter that diverts any solids into a disposal drawer. It's purely mechanical, so the odor control/dust level is as good as your choice of litter product. Downsides are that it's a bit bulky and noisy, but there's no DRM, and it's so mechanically simple that there's not much that can go wrong with it - it certainly won't bake the cat poop like the machine described - yuck.
6
peatmoss 3 days ago 5 replies      
I trained my cat to do its business in the toilet. I cannot tell you how much better this is than any other solution. There are training kits that our friends and family have now used for their cats.

If you live near the ocean, you won't want to do this if your cat could be a toxoplasmosis carrier (i.e., is outdoors or otherwise could be eating rodents). Apparently toxoplasmosis makes it through water treatment and harms sea mammals.

Otherwise, this is a great way to go.

7
pclark 3 days ago 4 replies      
Tangentially related to the blog post but very relevant to cat shit: I recently bought a "top opening" cat litter box [1], and it's incredible that they are not the norm.

My biggest problem with cat litter is not the shit, but having cat litter tracked across the bathroom; this is almost entirely solved with a top-opening box. Additionally, my dog is unable to eat the shit.

[1]: https://www.clevercatinnovations.com/top_entry_litterbox_abo...

8
userbinator 3 days ago 1 reply      
Things like this are partly why I'm wary of the "ubiquitous security" (encrypt everything, tamperproof hardware, signed binaries, etc.) concept that a lot of people are pushing -- yes, they can benefit the user, but in the current environment of capitalism, chances are that any security measures are going to be used against you, to secure some company's profits, once they become cheap enough to implement.

As an aside, I think it's odd that there's alternate open-source firmware and cartridge resetters for a cat litter box, as well as some 3D printers, and there are completely-open-source 3D printers, but basically nothing at all of that sort for regular inkjet printers.

9
monochromatic 3 days ago 0 replies      
I used to have one of these, and I completely agree with the article. The DRM is shitty, the solution is expensive, and my house would periodically smell like baking cat feces. Eventually the unit started to do the turd bakery routine more and more often (a worse failure mode is hardly even imaginable). So I replaced it with a Litter Robot[1].

With the exception of having to buy regular cat litter, which isn't a big deal, it's better in every way. It runs in a couple of minutes instead of like 40 minutes. It's quieter. The litter doesn't get tracked around nearly as much as the plastic pebbles. It also has never turned my house into a shit oven.

[1] http://www.litter-robot.com/

10
DigitalSea 3 days ago 1 reply      
The best purchase I have ever made for my cat was the expensive crystal litter. It costs way more than the cheap cardboard-pellet kind you can buy, but each crystal contains some kind of scent, and really, I clean it once a day and it never smells that bad. The crystals make a big mess if you don't put a mat beneath the tray, though, and get tracked through the house like tiny granules of sugar.

I have considered a robot litter box, but it seems to me the cost quickly outweighs the benefit of not having to change the litter yourself. These things are pricey, and based on what I've read in this post and tons of reviews online, they're not particularly great.

11
dominotw 3 days ago 2 replies      
I bought a Litter Robot (www.litterrobot.com) a couple of years ago and have never been happier.

Please get it.

12
randunel 3 days ago 0 replies      
I guess you wouldn't mind losing your warranty on a cat litter machine, but you definitely would suffer losing your warranty on a €20,000 car. Although you currently don't have an alternative to that car's DRM, I'm sure someone will come up with something once these get popular enough.

The cost of leasing the battery for 36 months starts from €79/month (US$104/month): http://en.wikipedia.org/wiki/Renault_Zoe

Later edit: My statements were based on the previous rent-only strategy. Meanwhile, they also introduced the option to actually purchase the battery for ~€4,000. As if handing in your driving data wasn't enough, they can also disallow your charging at any time, and here is an excerpt from their TOS:

18.3 Battery Data: For management, administration, and accounting we will collect information about your use of the Battery and the Electric Vehicle. This is to allow us to manage battery stocks, maintain hire payments at a competitive level, monitor performance of your Battery and monitor mileage and fast charge use. This data will be transmitted to us by the telematic box installed in the Vehicle. If you would like more information about this technical data, please write to Renault ZE Customer Services, RCI Financial Services, P.O. Box 495, Watford, Hertfordshire, WD17 1GL. If you have opted to install a Connection Pack we will also receive data about your location. If you do not wish us to receive location data you may disconnect the telematic box. Instructions for disconnecting the telematic box will be in the Connection Pack.

13
444000 3 days ago 0 replies      
To be honest, all your problems with the smell, stuff getting everywhere, cleaning bags, etc. would go away with World's Best Cat Litter (http://www.amazon.com/Worlds-Best-Cat-Litter-Multiple/dp/B00...) - you just flush stuff down the toilet, and it doesn't smell.
14
Animats 3 days ago 1 reply      
This is becoming so common. There are 3D printers that only take their very own special cartridges of plastic filament. The Form 1, which started as a Kickstarter project, requires a proprietary resin fluid which costs $149/liter. (It's gone up; it was $130/l a few months ago.) However, it doesn't have a DRM system to enforce that.
15
shmerl 3 days ago 4 replies      
This is insane. What's next, DRM in cats?
16
schoen 3 days ago 0 replies      
I read some ways into this before I was confident that it wasn't a Cory Doctorow science fiction story!
17
zafka 3 days ago 0 replies      
We still have the manual version, but this fascinates me. It seems there is still an opening for a high-end solution. The race is on...
18
ddunkin 3 days ago 1 reply      
Calling the chip 'DRM' sounds like media sensationalism; it really is just 'dumb memory'. Maybe I think this because I knew exactly how the cartridges worked when I did my initial research on the unit (and knew about the CartridgeGenius as an option before purchase), so I had no surprise when it worked how it did.
19
mkramlich 3 days ago 2 replies      
simpler, cheaper, healthier alternate solution: have no cats
20
ifelsethen 3 days ago 0 replies      
tldr; protip: potty train.
21
lotsofmangos 3 days ago 0 replies      
Go home humans. You are drunk.
12
Jenga Pistol
272 points by antr  1 day ago   24 comments top 11
1
ohazi 1 day ago 1 reply      
If you haven't seen Mattias' website before, you should take some time to look at all of the other cool stuff he's built:

http://woodgears.ca/tools.html

This guy is awesome.

2
the_cat_kittles 1 day ago 0 replies      
Mattias' approach has really influenced how I think about software engineering, which is not entirely surprising considering he is also a software engineer. I think his YouTube channel is really worth checking out. He has even invented several new, amazing tools, like the "pantorouter" for cutting perfect mortise and tenons (among other things) with a router. https://www.youtube.com/watch?v=8wZ1v4PIsYI

The way he solves, and even identifies, the problems in the woodworking domain is, I think, very instructive for how to think about some aspects of software design.

3
jbrooksuk 1 day ago 0 replies      
My 15-year-old brother made one of these a couple of years ago. He saw a video that demoed it, with the plans available for $5, but he just watched the video and went and made it from memory.

It was so powerful, almost hilariously so!

He actually said on Facebook earlier: "I made one of these before it was cool."

4
patcon 1 day ago 1 reply      
I've been half-joking with friends about going out to the desert and playing "gun jenga". Good to know the physics are in our favor :)
5
olalonde 1 day ago 0 replies      
If you're interested in this kind of stuff, this German guy has a great channel on YouTube: https://www.youtube.com/user/JoergSprave. His slingshot cannon is especially impressive.
6
rtpg 1 day ago 1 reply      
It's kind of interesting how the dynamics of the game completely change with the pistol. Since you can get rid of entire rows now, you suddenly have rows of the same orientation stacked upon each other.
7
geuis 1 day ago 1 reply      
Mattias has a great YouTube channel. Highly recommend subscribing. https://www.youtube.com/channel/UCckETVOT59aYw80B36aP9vw
8
jedanbik 1 day ago 0 replies      
What a creative person!
9
hotgoldminer 1 day ago 1 reply      
Struck me that it's like a physics flash game (a la Blosics).
10
BorisMelnik 1 day ago 0 replies      
Have I told you about the Jenga block catcher I'm building?
11
pla3rhat3r 1 day ago 1 reply      
A Jenga shotgun to be used with those oversized sets would be far more impressive. lol
13
When security goes right
272 points by cperciva  16 hours ago   75 comments top 12
1
chris_wot 13 hours ago 2 replies      
The first comment was:

"I don't see why this was a security problem in the first place. No personally identifiable data was disclosed. What does it matter if you can view anonymous traffic graphs from other customers?"

I hate it when people say this sort of thing. It just indicates that they can't see a potential exploit, not that it couldn't be one aspect of an attack. Honestly, attackers - regardless of their morality - will tend to look at things from a viewpoint others haven't imagined. It's best to give them as few avenues as possible.

2
patio11 13 hours ago 1 reply      
As mentioned, a great five-minute project for when you get back to work after the holiday is adding a /security page to any website you control that handles user data. All it needs is a monitored inbox, a promise to get back to security researchers, and a PGP key.

If you want to make it a 15-minute project, write a bit of customer-facing "We take your security seriously. That's why we encrypt all data with bank-grade security..." copy above or adjacent to the researcher-focused payload.

Good examples (I picked their disclosure pages rather than the security marketing pages) include:

https://basecamp.com/security/response
https://www.twilio.com/docs/security/disclosure

3
jacquesm 14 hours ago 4 replies      
Merry Christmas Colin! Nice to see this worked out. I'm confused as to why you're surprised they had working whois contacts; most real businesses do. It's usually the scammers, the spammers, and people who are up to no good that use whois privacy; rarely do you see it used by someone who genuinely needs some protection.

If a company uses whois privacy I don't do business with them as a rule.

4
noonespecial 14 hours ago 2 replies      
Granted, this is Canada, so it's a bit different from where I live, but I personally would still, at this point, fear malicious prosecution just enough that I would not have taken action. That's a little sad all by itself.

Hats off to Colin for the (brave) good deed.

5
protomyth 13 hours ago 1 reply      
"First, they were using widely used open source code; if I hadn't been familiar with it, I wouldn't have noticed the problem. Bad guys will always dig deeper than unpaid good guys; so if you're going to benefit from having many eyeballs looking at what you're doing, it's much better if your bugs are shallow."

That is a really interesting reason to go with the popular open source solution. I guess I don't always follow that advice, but I wonder why I didn't think of it as a security decision.

6
wtbob 12 hours ago 2 replies      
Why not just store each user's information in a file whose name is based on HMAC(some secret, user's account number)? That should be secure against enumeration attacks.
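A minimal sketch of that idea in Python, using the standard hmac module; the secret and the account number are placeholders:

    import hashlib
    import hmac

    SERVER_SECRET = b"some-long-random-secret"  # placeholder; never expose this

    def graph_filename(account_number: str) -> str:
        """Derive an unguessable, stable filename from an account number."""
        digest = hmac.new(SERVER_SECRET, account_number.encode(), hashlib.sha256)
        return digest.hexdigest() + ".png"

    print(graph_filename("12345678"))
    # Without the secret, an attacker can't enumerate accounts by guessing names.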
7
fubarred 8 hours ago 0 replies      
Re: whois.

There's a need for extremely private registrations that still have a feedback channel. A guy I know has such a registrar, but I doubt there's any means of contacting any site's operator for technical, legal or other matters. (The whois postal address always lists somewhere in Europe, but there are absolutely no published details. So it's a registrar that's about as private as allowable.)

8
click170 14 hours ago 2 replies      
Novus is great; they offer some of the highest speeds available in the areas where you can get access.

My biggest gripe with Novus (besides limited coverage) is they have no unlimited bandwidth offering like my current provider has. Despite that, I'm still considering switching to them for those awesome upload speeds. In light of this, I'm probably going to sign up.

Last I heard, when you went over your bandwidth limit with Novus they cut you off to prevent overages. I really liked this, because you have the freedom to call them and confirm that you're fine with additional charges and they'll immediately reconnect you, but I'm curious whether they still do this. Any current customers able to clarify?

9
danieltillett 12 hours ago 2 replies      
My only concern here is how quickly support was making changes to production code. Should changes like this be made to a functioning system within 10 minutes of getting a verbal bug report?
10
chubot 9 hours ago 1 reply      
A short time later they started checking Referer headers; but as I pointed out to them, sending a fake Referer header is easy, so that only helps against the most careless attackers. I suggested several options for solving this, and they chose the simplest: Use a cron job to delete all the generated graphs every minute.

If I'm understanding correctly, I think this problem can be elegantly solved with Macaroons: http://research.google.com/pubs/pub41892.html

And it's a relatively common issue with auth on the web. I think Facebook has (or had) this problem too. You can generally right-click and "copy link" to get a .jpg URL, and send around people's pictures without any auth.

Basically the problem is when there are two web servers, a "dynamic" one with auth, and a static one that serves images. The static one is often a CDN.

Macaroons are basically a simple technique for decentralized auth, involving HMAC chaining. In this setting, the static server would first give the dynamic server a macaroon M authorizing ALL pictures.

At serving time, the dynamic server authorizes the user for a particular request. That request will have an <img src="acct123.jpg"> link. The dynamic server will take the macaroon M, and add a CAVEAT that the file must be "acct123.jpg", yielding Macaroon M2.

The client gets the restricted macaroon M2 with the HTML, and sends it back to the static server to retrieve the .jpg. The server can 1) verify that M2 is derived from the original M, and 2) read the caveat from the dynamic server, proving that the user was authorized for the image acct123.jpg (and only that image). The HMAC chain is constructed so that the client can't remove the caveat and get access to all pictures.

Basically what happened is that the static server DELEGATED auth logic for its resources to the dynamic server. In figure 2 of the paper, the static server would be TS (target service), and the dynamic server is IS (intermediate service).

The static server still needs extra code for Macaroons, which existing CDNs and static servers don't currently have. It would be cool to have an Nginx plugin that does this. But the key point is that it is preserving the original intention behind the static/dynamic split: performance.

In less performance sensitive context, you would have a web server perform custom auth logic, and then just read() the static file from disk and serve it over HTTP. This is likely to be many times slower than say Nginx. With the Macaroons, you can authorize ONCE in the dynamic server, and then PROVE to the static server that the auth decision was made. So all the .jpg requests can be fast and only hit the static server. The HMAC calculations are just hashing so they are cheap. It is symmetric crypto, with the shared HMAC secret.

The paper has some other use cases and is definitely worth a read. I'm thinking about using this technique for a project. I'm interested in opinions from crypto/security folks.
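A toy sketch of the HMAC-chaining idea in Python (not a full macaroon implementation; see the paper and existing libraries for the real construction):

    import hashlib
    import hmac

    def chain(key: bytes, message: bytes) -> bytes:
        return hmac.new(key, message, hashlib.sha256).digest()

    # The static server mints a root macaroon authorizing all pictures and
    # hands its signature to the dynamic server.
    root_key = b"static-server-secret"
    identifier = b"all-pictures"
    sig = chain(root_key, identifier)

    # The dynamic server attenuates it with a caveat naming one file. The new
    # signature is HMAC(previous signature, caveat): anyone can ADD caveats,
    # but removing one would require inverting HMAC.
    caveat = b"file = acct123.jpg"
    attenuated = chain(sig, caveat)

    # The static server, which alone knows root_key, verifies by replaying the
    # chain, then checks the caveat against the file actually requested.
    expected = chain(chain(root_key, identifier), caveat)
    assert hmac.compare_digest(attenuated, expected)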

11
fubarred 8 hours ago 1 reply      
Don't have a screen capture, but several insightful comments on the blog disappeared. Wtf?
12
xupijack 9 hours ago 0 replies      
Really interesting.
14
Principles of Distributed Computing
265 points by olalonde  3 days ago   22 comments top 5
1
olalonde 3 days ago 3 replies      
The whole book is available for download here: http://dcg.ethz.ch/lectures/podc_allstars/lecture/podc.pdf

Has anyone followed the course or read the book? I was actually reading "Distributed Systems - Concepts and Design (Third Edition)" and thought it felt a bit outdated and not focused enough for my taste (there are whole chapters on networking and operating systems, for example). Then I found this course/book, which seems a lot more in-depth and modern, but I haven't had time to read it yet and couldn't find any reviews.

2
krat0sprakhar 3 days ago 0 replies      
Shameless plug: If you're interested in more such university courses on systems that make their lectures and assignments available, check out https://github.com/prakhar1989/awesome-courses#systems
3
wstrange 3 days ago 1 reply      
I am curious why Jini gets no mention in a book on distributed computing.

Jini was not commercially successful, but some of the concepts were quite interesting: mobile code, downloadable client proxies, leasing, look-up by interface, etc.

"The end of protocols" is a short but good read:

http://www.cc.gatech.edu/~keith/classes/ubicomplexity/pdfs/i...

4
javajosh 3 days ago 6 replies      
I'm actually rather curious to know how HN people use resources like this. Do you set aside a few hours a week and do a self-course on the content? Do you skim it, smile, and nod? Do you bookmark it and never get around to reading it? Something else...?
5
YesThatTom2 3 days ago 1 reply      
Shameless plug...

"The Practice of Cloud Computing" is a more balanced approach.

"Unsatisfied with books that cover either design or operations in isolation, the authors created this authoritative reference centered around a comprehensive approach." (quote from the back cover)

Ladies and gentlemen, I present exhibit A: the ToC of "Principles of Distributed Computing". All design. No operations.

http://the-cloud-book.com

15
Writers who will be entering the public domain in 2015 in many countries
255 points by benbreen  2 days ago   150 comments top 12
1
audiodude 2 days ago 11 replies      
In case anyone is wondering, once again nothing will enter the public domain in the United States because of Congress' perpetual habit of retroactively extending copyright on works. The next time anything will enter the public domain in the US is January 1, 2019. source: https://web.law.duke.edu/cspd/publicdomainday
2
anigbrowl 2 days ago 1 reply      
Somewhat off-topic, but: Piet Mondrian - A Dutch painter whose distinctive grid-based creations (horizontal and vertical lines upon a white background, adorned with red, blue and yellow blocks) proved one of the most influential experiments with abstraction of the 20th century.

I always thought of Mondrian's work as abstract until the first time I flew to the Netherlands. Dutch people really, really like flowers, and have made them a major agricultural export. Also, a lot of Dutch land is reclaimed from the sea, and the terrain is pretty flat to begin with. Fly into Schiphol at the right time of year and Mondrian's inspiration becomes very obvious - black roads, snow-covered open fields, greenhouses with blocks of vivid color. In a flash, my concept of his work went from 'paintings' to 'pictures'.

3
MereInterest 2 days ago 5 replies      
In addition to this, I like considering things that should be entering the public domain. Using the initial copyright length of 14 years, renewable for an additional 14, anything published in 1987 should be entering the public domain in the upcoming year.

* Predator

* Robocop

* The Princess Bride

* Dirk Gently's Holistic Detective Agency, by Douglas Adams

* Watchmen, by Alan Moore

* Hatchet, by Gary Paulsen

These should all be open cultural works, ready for new writers to use as a basis. Ready to be used as the backdrop for new stories. Instead, they are locked universes, only containing a small number of stories.

4
benbreen 2 days ago 0 replies      
The French historian Marc Bloch is one of the ones who make their runners-up list, and he deserves wider attention. He was shot by the Gestapo in 1944 and, along with Fernand Braudel (who was himself a POW at the time), was a leader of the Annales School, arguably the most influential school of thought among professional historians.

http://en.wikipedia.org/wiki/Marc_Bloch

5
wazoox 2 days ago 1 reply      
About Saint-Exupery: his plane and chain bracelet were found in 2004 in La Ciotat bay, so there is no serious doubt anymore about the exact date and place of his death.
6
jdeisenberg 2 days ago 1 reply      
Well, there goes their incentive to produce any new works.
7
swsieber 2 days ago 0 replies      
The question is: why don't we organize a campaign and get a bill introduced to shorten copyright terms? Every year we lament how we should have more in the public domain without doing anything. I'm fairly sure somebody could come up with a catchy slogan or some cool perspective that would make voters hate their senator if they didn't pass that type of bill (copyright shortening). Conceivably (and most likely), it might not pass the first time. But basically all I hear here is whining, without any suggestions on how to fix it - possible laws yes, possible action plans no.

We need a couple of good opinion pieces on why copyright is bad, and maybe a couple of lightweight BuzzFeed-style articles that highlight what we're missing out on (so they can trend on Facebook). You wouldn't believe #4...

8
angelbob 2 days ago 0 replies      
I read the title and expected more tweaking of the US for how many years it's been since any writers' work has entered the public domain here.
9
herge 2 days ago 3 replies      
So James Bond is in the public domain in Canada?
10
mgraczyk 2 days ago 0 replies      
Interesting list, but sad that the US is once again missing.

Side note: the page is mostly unreadable in Chrome on my Nexus 5. The text runs past the edge of my screen and I can't zoom out.

11
juliendorra 2 days ago 0 replies      
Every December for the last 3 years we (SavoirsCom1, a collective advocating for cultural commons) have published a Public Domain Advent Calendar. It's a fun way to anticipate and discover the works of the authors joining the public domain:

http://www.aventdudomainepublic.org/

(In French! But with names, links and pictures it should be useful and interesting to all)

12
ww520 2 days ago 1 reply      
When is Mickey Mouse entering the public domain? It has been a long while.
16
Schwab password policies and two factor authentication
254 points by jeremyt  3 days ago   122 comments top 40
1
cddotdotslash 3 days ago 4 replies      
I just called Schwab about this, and hand to whatever deity you believe in, this is what he told me:

Representative: "One of the things we were trying to do with these passwords was make them different from other providers. So we know that they allow multiple character types, and are case-sensitive, so we decided to make them different. That way, you can't use the same password you've used elsewhere and it kind of forces you to come up with a new one."

Me: "...that is... I can't even explain how terrible that is."

Representative: "Well, Schwab does care about your security and as far as the 8-character limitation goes, the reason you can enter any arbitrary text afterwards is so that if someone is looking over you shoulder they can't tell that it only accepts 8."

Points for thinking on his feet?

2
kevinburke 3 days ago 1 reply      
I had pretty much the same experience with Virgin Mobile last year (passwords limited to 6 digits, no brute force protection). I finally told the guy I got escalated to that if they didn't do anything I'd call the NY Times, Consumerist, Gawker, CNET, Ars, etc and tell them about it.

They didn't do anything, so I sent around the article and pretty much every publication I sent it to ran with it. After that they took down the login page for about nine hours and brought it back up with brute force protection.

https://kev.inburke.com/kevin/open-season-on-virgin-mobile-c...

3
userbinator 3 days ago 0 replies      
To activate my newly received token, I was instructed to go to the homepage and append the six digit token code onto the end of my password during a login attempt.

This sounds like a symptom of the multilayered bureaucracy that often goes on in banks and similar institutions - a change to the UI to add something as simple as an extra field for the token code, and the changes required to hook it up to the backend, might have been accompanied by so much "enterprisey" management red-tape cruft (specification writing, approval documents, approval meetings, meetings for scheduling meetings - I wish I was joking, etc.) that it made the programmers find creative ways around the system.

At the least, if I were forced to concatenate fields, I'd use a separator that couldn't occur in either one, like a comma or something else that their password policy didn't allow... but then again, I wouldn't be surprised if something else in their system would reject that.

4
paulschreiber 3 days ago 1 reply      
After receiving unsatisfactory responses from my local Schwab rep here in New York and the customer service staff, I complained to Schwab's CISO, Bashar Abouseido <bashar.abouseido@schwab.com>, on September 1.

He never replied.

5
mariusz331 3 days ago 2 replies      
I've been using Schwab for almost 5 years and haven't noticed the password limitation until about 2 years ago. My password is pretty lengthy, so when I mistyped the last letter and pressed enter, I expected an error message. Instead, Schwab logged me in. I investigated a bit and ended up contacting Schwab about the "vulnerability". I remember someone quite high up responding saying they were aware of the length limit but that they lock you out after 3 failed password attempts. I didn't validate the claim, but I felt content and moved on.
6
ufmace 3 days ago 4 replies      
I've been coming to an opinion on these issues that may be unpopular with the tech crowd: The big banks have the right idea when it comes to security, and we are misguided at best with our obsession over the minutia of password handling.

Why? All of these big banks and investment houses have holdings in the neighborhood of billions of dollars. Like billions in actual cash. If they are so vulnerable and insecure, why aren't all of the hackers targeting them, with their potential upside of billions of dollars in cash, and instead target little web apps to steal some credit card numbers or user data, worth tens of thousands to maybe a few million on black markets? Think about how much effort we've seen put towards stealing cool Twitter handles and other such trivial things. Does anybody really believe that there aren't many more people working much harder to hack banks, with their billion dollar paydays?

They may not be the greatest on password handling, but the evidence suggests that they have a much more healthy security culture overall than your average internet startup. Apparently, they are worlds better at making their systems secure enough that nobody can steal these user databases in the first place. They most likely also have a pile of fraud detection and validation on account activity, especially anything involving moving significant amounts of money out of the accounts. They are probably in the right on this - what's the point in building a perfect lock for the front door if, once an attacker gets in, they can transfer the whole balance to a Russian bank and nobody will notice? Consider how, with some well-publicized recent hacks, you can apparently do anything at all once you get through that front door at most major tech companies.

I'll happily change my tune if any of these banks get hacked and lose big money. Until then, maybe we should ask these banks how they get it so right overall instead of worrying and hassling them about how long their passwords are and how they're storing them.

7
greggarious 3 days ago 0 replies      
I pointed this out to them over a year ago: http://norcie.com/2013/09/01/schwab-unsafe/

I went so far as to get in contact with senior staff members at Schwab to alert them to the issue, and got a pretty condescending response.

I mentioned it to a friend at a burrito truck outside the Mozilla office, and soon found out it was a top post on /r/personalfinance.

I got a call from Schwab shortly after that. But the rep I talked to just said they were "working on" allowing more characters in the password.

I must say though, this post does a great job detailing their 2F solution. I never set it up since it seemed like wearing a fishnet condom given the rest of their security, so I never got to see how bad it is.

8
modeless 3 days ago 1 reply      
I filed a support ticket about the password length. They told me it was due to "government standards" and they would reevaluate after a new standard came out. I didn't inquire further into this obvious BS. They provide a good service otherwise so it's strange that they have this blind spot.
9
tuzakey 3 days ago 0 replies      
It may be much worse than you think. Another large brokerage company I know of has similar password requirements. They also have a phone banking system; to use it, you have to touch-tone in your password. On a whim I tried entering the keypad version of my password on the website and surprise! It worked. Luckily for me there is zero customer liability for fraud on their retirement accounts.
10
mdaniel 3 days ago 1 reply      
> I've never, ever seen this "append stuff onto your password" approach being used.

Then he doesn't have an eBay or PayPal token, because they both do it. Or rather, it is an option to do it that way, in order to skip over the "submit, enter token, submit" workflow.

https://www.paypal.com/us/webapps/helpcenter/helphub/article...
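For context, the server-side half of the "append the token to your password" pattern presumably looks something like this Python sketch; check_password and verify_totp are hypothetical stand-ins for real password-hash and OTP checks, and the point of the article is that Schwab apparently skipped the token check entirely:

    import hmac

    def check_password(account: dict, password: str) -> bool:
        # Stand-in for a real salted-hash comparison.
        return hmac.compare_digest(account["password"], password)

    def verify_totp(account: dict, token: str) -> bool:
        # Stand-in for real TOTP / hardware-token validation.
        return hmac.compare_digest(account["current_otp"], token)

    def verify_login(account: dict, submitted: str) -> bool:
        """Split a combined 'password + 6-digit token' field and check BOTH parts."""
        if len(submitted) < 7 or not submitted[-6:].isdigit():
            return False
        password, token = submitted[:-6], submitted[-6:]
        return check_password(account, password) and verify_totp(account, token)

    acct = {"password": "hunter42", "current_otp": "123456"}
    assert verify_login(acct, "hunter42123456")
    assert not verify_login(acct, "hunter42000000")  # a wrong token must fail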

11
deet 3 days ago 1 reply      
Not that this is an excuse, but keep in mind that Schwab probably has had the mentality that a compromise of a user's online account, while bad, is not the end of the world.

They have been frustratingly slow in implementing features like linking external bank accounts using trial deposits instead of mailing them a voided check from the external account.

Their slowness to adopt these new features has meant that if you got access to the online account, there wasn't much you could do as a third party to move money out of the victim's already-linked accounts. You could cause headaches or buy/sell securities, but not access the money easily. And if you did link an account or add a biller, the victim would get an email.

Things have probably changed recently since I think you can link external accounts now, and there's probably a way to send yourself a check as a bill payment.

Totally not an excuse though.

Note:

I was fooled by the password length as well. Sometimes I would hit what I thought was the wrong last few letters on my phone keyboard yet the password would still work somehow. Turns out you can just type the first eight and be done.

12
einhverfr 3 days ago 0 replies      
Having worked on some major financial web sites (globally), including password code, I can say a few things that may be relevant.

The thing is, you never get a sense of how bad legacy code can be at restricting options in reforming sanity until you have worked on such sites.

It took me about 5 months to restore sanity to one codebase with a bunch of problems regarding encryption and passwords. Fortunately security was a priority, and not just security checkboxes in PCI requirements but real security. But it wasn't cheap and it wasn't easy, and we ran into a lot of unpleasant surprises along the way.

Looking at this, the chances are that they have tons of legacy code, and the pieces fit together in not very nice ways. People are afraid to change things because of PCI requirements, security scan results, etc. And the cost of fixing things may be very high. In these cases, I can imagine a "don't rock the boat" mentality developing and a large part of security-critical code becoming effectively untouchable.

13
elsewhen 3 days ago 0 replies      
Schwab offers security protection with all accounts:

http://www.schwab.com/public/schwab/nn/legal_compliance/schw...

When payouts on this security guarantee begin to become a meaningful burden, I am sure Schwab will improve their security practices.

14
polarix 3 days ago 0 replies      
Yeah, this flow is completely nuts. In general, though, after setting up 2FA the thing to do is probably to test that you can't log in without it.
15
codementum 3 days ago 1 reply      
Like many others, I just filed a support ticket as well. I'd like one of two outcomes: 1. A public response and plan from Schwab, or 2. An alternative bank/brokerage company that a) takes security seriously and b) is easy to move to.
16
tootie 3 days ago 0 replies      
We need some sort of internet security reformation. This is even more ridiculous than when the Chase mobile banking app for Android didn't check if SSL certs were authentic before sending credentials.
17
jdeibele 3 days ago 0 replies      
Thanks for calling attention to this. I've been frustrated by it. One thing that I did was let LastPass generate 32 random characters, but I used them to change the username rather than the password. I depend on LastPass to remember it.

It's not much but it was all I could come up with.

My wife wants to use the Schwab app to deposit checks on her phone but I don't trust their security. One lost phone could lead to our retirement funds being transferred to Belize (or wherever).

18
michaelfeathers 3 days ago 3 replies      
Banks aren't technology companies. Someday a technology company will become a bank.
19
elahd 3 days ago 2 replies      
I complained to Schwab about their password policies numerous times over the 3 years I was a bank/brokerage customer. A few months ago I finally moved my accounts to TD.

Schwab's standard response was 1) to assure me that they had "intelligent" fraud monitoring systems on their backend and 2) to offer me a hard token, which would have been a pain and may have caused issues with Mint.

20
yalogin 3 days ago 1 reply      
I have called and complained many times about their password policies.

They let you choose a random user id, as in, change the user id whenever you want. I bet you the security guys over at Schwab are using that as a reason to not improve password options. I can see the argument being - "The idiotic password limitations are not a big deal because of the random userids".

21
superuser2 3 days ago 0 replies      
Online banking is largely a read-only proposition. It's mostly for reading account activity. Some of the more forward-looking banks will even let you initiate ACH transfers, but generally sending money to a new recipient triggers a 2FA prompt (debit card number prompt, phone call, text) and several secondary notifications, with several days to say "that wasn't me" before the money is gone.

I wouldn't voluntarily post my bank account credentials on the internet, but at the end of the day, the security of an online banking account just doesn't matter very much.

The security of the transaction mechanisms does matter, sure, but that's got little to do with online banking passwords.

22
timeal 3 days ago 2 replies      
This story needs to be upvoted 1000 times. Why are financial institutions so _bad_ at password policies?
23
timdierks 3 days ago 0 replies      
I'm guessing that if they had a major breach because of this kind of idiocy, they'd find a way to fix it after the fact.

Which sort of implies to me that they should find a way to fix it before they have a big breach.

If nothing else, the fact that they've been warned repeatedly and done nothing could be pretty compelling if there was ever litigation over losses.

For example, I could imagine someone successfully disavowing a trade at Schwab because they don't enforce the password authentication they claim to, and thus can't convincingly claim that the trader was in fact the account owner.

24
spac 3 days ago 2 replies      
I did report this a while ago to Schwab, both over the phone and on Twitter, and I have been equally ignored. Thanks for writing a blog post about it.

Edit: forgot to mention that the passwords are case insensitive!!!!!!

25
overgard 3 days ago 0 replies      
Some of their competitors are just as bad. I remember I started to sign up for a TD Ameritrade account a few years ago, but when the password "requirements" came up (which were very similar), it was clear that they were probably storing passwords as plaintext, so I stopped.

Then I got phone calls from them asking why I hadn't finished, so I had to explain to a person who clearly wasn't technical (not his fault, of course) that his company had no idea what they were doing security-wise.

Maybe they finally fixed it though. I can only hope.

26
11thEarlOfMar 3 days ago 0 replies      
Offers little consolation...

Schwab has "...a system which locks you out if you guess the [username,password] combination incorrectly more than twice.."

Schwab can improve your security via Verisign and verbal passwords, but you have to ask for it: "... Schwab has several additional (optional) verification methods."

http://www.marottaonmoney.com/schwab-verisign-security-measu...

27
markcerqueira 3 days ago 4 replies      
Quite shameful. Fortunately, I only use Schwab because of their awesome checking account that covers ATM fees. Definitely won't put more of my assets in there until they get their act together.

I may be wrong, but I think user IDs can be longer than 8 characters too which makes this all even worse.

LinkedIn did something similar with having to append your auth token to the end of your password, but they actually checked the token AFAIK.

28
jkupferman 3 days ago 0 replies      
Fidelity had a similarly terrible password policy (6-12 characters, only letters and numbers). I complained to their tech support six months ago and got a stock "we're looking into it" answer. In the past month they've actually fixed the issue and now require 6 characters (upper case, lower case, number and symbol).
29
spacefight 3 days ago 0 replies      
"Like probably millions of people I have a Schwab brokerage account, and that account holds a good portion of my savings for retirement."

OpSec 101: replace that sentence with "Like probably millions of people I have a Schwab brokerage account, and that account holds just a few bucks of play money to try out trading strategies."

30
paulschreiber 3 days ago 1 reply      
Schwab does let you enter your token code on a separate screen. If you enter your username and password (without appended token code), you'll get this screen: https://www.flickr.com/photos/paul/16079572151/
31
bhartzer 3 days ago 2 replies      
Thanks for posting, I've passed this on to my contact at Schwab to see if it can get fixed properly ;)
32
brini 3 days ago 0 replies      
I've also bemoaned Schwab's password policy. I reacted by changing my login ID to something nonsensical which would be difficult to guess or to associate with anyone's identity, let alone mine.
33
jrochkind1 3 days ago 0 replies      
Recently, on a Schwab competitor's site, I couldn't recall my password -- which was required to have upper, lower, and punctuation.

Not being able to recall my password, I was able to reset it by supplying only my mother's maiden name and my date of birth.

Um.

34
acconrad 3 days ago 1 reply      
I, too, complained months ago about the poor password protection. I immediately had them send me a two-factor fob. While it isn't a great solution, it's one I would ask for ASAP.
35
et2o 3 days ago 0 replies      
I recently filled out a support ticket concerning password policies too, after opening an account. I received a rather absurd reply and decided not to transfer any of my money to them.
36
Flott 3 days ago 2 replies      
An 8-character password...

It sounds like DES crypt stored directly in the database (this is pure speculation, of course). This alone is a huge red flag. Add the fact that the 2-factor auth is broken, and it's not good news.
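For what it's worth, the 8-character limit is the classic signature of DES-based crypt(3), which silently truncates passwords at 8 characters. A quick sketch, assuming a Unix system (Python's crypt module is Unix-only and deprecated in recent versions):

    import crypt  # standard library, Unix-only

    salt = "ab"  # a two-character salt selects classic DES-based crypt(3)
    h1 = crypt.crypt("password", salt)
    h2 = crypt.crypt("password12345", salt)
    assert h1 == h2  # everything past the eighth character is ignored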

37
themckman 3 days ago 0 replies      
It's really a shame they're so bad at all of this. I've had nothing but a fantastic experience when dealing with them for my Investing and Checking accounts.
38
gesman 3 days ago 0 replies      
Enterprise support:

Q: "Your secure banking portal has critical vulnerability!"

A: "Did you try to reboot your computer?"

39
40
caycep 3 days ago 2 replies      
How do the other big consumer trading services compare? E.g., Vanguard, Fidelity, etc.?
17
Docker Image Insecurity
260 points by Titanous  2 days ago   100 comments top 11
1
shykes 2 days ago 6 replies      
I wish the author had not omitted this crucial paragraph in the announcement he quotes:

    Note that this feature is still work in progress:    for now, if an official image is corrupted or tampered with,    Docker will issue a warning but will not prevent it from    running. And non-official images are not verified either.    This will change in future versions as we harden the code    and iron out the inevitable usability quirks. Until then,    please dont rely on this feature for serious security, just yet.
So, we've made it pretty clear from the start that we're working on ways to make image distribution more secure, but are not claiming that it's more secure yet.

2
ef4 2 days ago 4 replies      
I keep hearing people ask "why won't they just get the core tech right instead of adding all these tangentially related features?".

If Docker was just an open source project, it could focus on getting the core tech right. But Docker is also a startup, and the startup can't stay differentiated unless they keep adding bells & whistles, all of which stay tightly integrated.

See also "Why there is no Rails Inc" (http://david.heinemeierhansson.com/posts/6-why-theres-no-rai...)

3
lclarkmichalek 2 days ago 2 replies      
I don't understand why the image distribution is so tightly tied into the main Docker codebase. This is why Rocket is a thing: because Docker is the systemd of the container world. Please stop trying to do everything.
4
ewindisch 2 days ago 5 replies      
Hello, I'm the lead security engineer at Docker, Inc.

There is nothing particularly new in Jonathan's post, and I thank him for facilitating a conversation. Image security is of the utmost importance to us. For these reasons, we've concentrated efforts here in both auditing and engineering. Engineers here at Docker, our auditors, and community contributors alike have been evaluating this code and coming to many of the same conclusions.

Last month, we released Docker 1.3.2, which included limited privilege separation, and extending this paradigm has been discussed. I have explicitly called out the need for containerization of the 'xz' process, and for running it in an unprivileged context. I thank Jonathan for reminding us of the need for this work and validating much of what is already in progress.

As the recently published CVEs describe, we are expending resources in discovering and fixing security issues in Docker. Yet, I agree the v1 registry has a flawed design and we're aware of it. In September, I requested to become a maintainer of the tarsum code and have also made proposals and pushed PRs toward improving the v1 registry integration. This is not to replace the v2 effort, but to offer improved security for the design we have today.

We have a draft for a v2 registry and image format. This and the supporting libtrust library are in the process of being audited by a 3rd-party. This is something we had previously promised the community and are making good on. What code exists today is a technical preview.

Unlike the v1 registry and image format, the libtrust and v2 image format code has been designed for a decentralized model. However, as the libtrust and v2 image work, and subsequently, registry protocols are still in draft and security review, it is difficult for us to recommend that users yet attempt deploying these. This is why the developers of that code have not published clear instructions for its use, nor made such recommendations. As this work comes out of review and a specification is finalized, we should expect to see a much better experience and more secure image transport, along with stronger support for on-premises and 3rd-party registries.

5
geku 2 days ago 1 reply      
I would really prefer that Docker, Inc. spend its time and effort securing its core product rather than extending it all the time by adding more and more features like Machine, Swarm, etc.
6
23david 2 days ago 0 replies      
The inevitable CVEs coming from this report will definitely get their attention. Hopefully the adults in the room will help make sure that the Docker team addresses what up until now has been a really lax approach towards security.

Who is the architect in charge of this, and do they have any security chops? If not, it's just a matter of $$$ to get a 3rd-party security review before every major release. I've done it before, and it's really not a big deal.

7
snoble 2 days ago 3 replies      
It confuses me why they wouldn't just verify the images, since they have the signature in the manifest. Is this because they don't want to wait for a complete image before they start streaming it through the pipeline? Is that actually a significant time saver?
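Verify-after-download is simple enough that the trade-off seems worth questioning. A generic sketch of content-addressed verification in Python (plain hashlib; not Docker's actual code or formats):

    import hashlib

    def verify_layer(blob: bytes, expected_digest: str) -> bytes:
        """Release the blob for unpacking only after its digest checks out."""
        algo, _, expected_hex = expected_digest.partition(":")
        actual_hex = hashlib.new(algo, blob).hexdigest()
        if actual_hex != expected_hex:
            raise ValueError(f"digest mismatch: got {algo}:{actual_hex}")
        return blob

    blob = b"pretend this is a fully downloaded image layer"
    digest = "sha256:" + hashlib.sha256(blob).hexdigest()
    verify_layer(blob, digest)  # passes
    # verify_layer(blob + b"tampered", digest) would raise ValueError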
8
wayoverthere 2 days ago 1 reply      
Particularly interesting given that some of these problems were pointed out to Docker folks ~4 months ago in the development of the feature. https://github.com/docker/docker/issues/8093#issuecomment-57...
9
rab_oof 2 days ago 0 replies      
Early on, I asked that images be signed similarly to Debian packages, but was met with skepticism and resistance. To me, none of the Docker core devs had a handle on the security implications of allowing anyone and everyone to share random bits without being able to prove end-to-end integrity and non-repudiation.

I hope this has changed; Docker is a great app. But if not, perhaps someone would like to teach them a security lesson? It seems to be the only way most people actually learn, sadly. :(

10
disjointrevelry 2 days ago 1 reply      
Reminds me of Debian and Ubuntu's requirement that apt-get is run as root. There are simple ways to get apt-get to run as non-root, but they require giving a non-root account permission to modify important package signature files. Still, they're not as bad as Docker. It's becoming the norm for these US/Silicon Valley companies to handle data integrity very badly.
11
oscargrouch 2 days ago 0 replies      
As a non-security-aware developer (not a security specialist), this was one of the most instructive and concise little gems about security flaws I've read. You can learn very useful tricks just by reading it. Thank you.
18
PHP 7's new hashtable implementation
248 points by xrstf  3 days ago   131 comments top 16
1
skrebbel 3 days ago 11 replies      
This is probably not a popular opinion, but I believe that PHP's associative array is one of the best-designed data structures in programming languages.

Its main distinguishing property, as mentioned in this article, is that values can be indexed by key, but are still iterated in the order they were set. This is "do what I want" in so many cases that it's just nuts.

Sure, just as often it's just needless overhead, but as a programmer who prefers to reason about domain and not performance, I often don't care about that. I hate that many other languages, including C#, Ruby and Python, force me to choose between either an unordered map or a list of (key, value) tuples. EDIT: clearly, I'm behind the times with that remark. Thanks, commenters :-)

I wish more languages had a native data type like this. It scares me that in practice JS objects have the same property, but officially the iteration order is not specified.

(That said, PHP's choice to mix regular arrays and associative arrays into a single type strikes me as a bit odd. I've also never seen a good use case for arrays with mixed string/int keys.)

2
adunn 3 days ago 4 replies      
This is great news. PHP doesn't have many structured data types, so arrays (aka maps) are basically used for everything. Any improvement to them will impact the entire application.

It would be nice to have separate types for arrays and maps though. I don't understand why they were combined to begin with. Simplicity? Seems like there are more edge cases and gotchas the way things are now.

3
allendoerfer 3 days ago 1 reply      
It is both nice and concerning that a ubiquitous element of a ubiquitous language had this much potential for performance optimization after 19 years of development. The optimizations were not even complicated hacks for edge cases, just a simpler implementation overall.

But then again, PHP itself, being stateless between requests, is quite fast already; nice to see even more performance getting squeezed out. Imagine the decrease in global energy consumption due to this change. :D

4
halflings 3 days ago 1 reply      
Make sure you don't miss this part:

> PHP uses hashtables for all arrays. However in the rather common case of continuous, integer-indexed arrays (i.e. real arrays) the whole hashing thing doesn't make much sense. This is why PHP 7 introduces the concept of packed hashtables.

> [...] We keep these useless values around so that buckets always have the same structure, independently of whether or not packing is used. This means that iteration can always use the same code. However we might switch to a fully packed structure in the future, where a pure zval array is used if possible.

It's nice that they're starting to consider the fact that "real" arrays are unnecessarily mixed with hashtables, which comes with a pretty significant overhead. Let's hope they'll soon add that separate type for arrays (or "fully packed hashtables" if they prefer :)).
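To make the packed idea concrete, here is a minimal C sketch. The names and layout are invented for illustration and are not PHP 7's actual zend_array internals:

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative layout only -- not PHP 7's real zend_array. */
    typedef struct { uint64_t key; void *val; } bucket;

    typedef struct {
        bucket   *buckets;   /* values, kept in insertion order        */
        int32_t  *hash_map;  /* hash -> bucket index; unused if packed */
        uint32_t  used;      /* number of buckets in use               */
        uint32_t  mask;      /* table size - 1, for hashing            */
        int       packed;    /* true when keys are exactly 0..used-1   */
    } table;

    /* Packed tables index directly; hashed tables probe the map.
       (Simplified: a real table also walks a collision chain.) */
    void *table_get(const table *t, uint64_t key) {
        if (t->packed)
            return key < t->used ? t->buckets[key].val : NULL;
        int32_t idx = t->hash_map[key & t->mask];
        if (idx < 0 || t->buckets[idx].key != key)
            return NULL;
        return t->buckets[idx].val;
    }

The win is that a packed lookup is a bounds check plus an array index, while the hashed path costs an extra indirection (and, in a real table, a collision chain walk).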

5
mmaunder 3 days ago 4 replies      
Awesome, but I still don't think it's enough. In benchmarks we did, the memory usage of PHP array() was horrific. Sorry I don't have actual numbers to post, but we ended up using pack() and unpack() to store stuff that should have been in an array, because it would grow to hundreds of megs using PHP's array(), while with a binary structure it stays under 10 megs. I just don't think a 2.5x improvement is going to come close to being as efficient as it could and should be.
6
gopalv 3 days ago 1 reply      
This is neat. Looking through it, looks like it makes regular numeric arrays faster as well via the flags.

I wonder if the ->pDataPtr vs ->pData confusion has been resolved.

I'm probably a few years behind, but a lot of my confusion working with hashes has been that pair of void* pointers.

7
bcheung 2 days ago 1 reply      
You mean they are not using strlen as the hash function anymore? http://news.php.net/php.internals/70691
8
Bahamut 3 days ago 7 replies      
What happened to PHP 6?
9
rbanffy 3 days ago 1 reply      
Would anyone like to comment how these are implemented in Hack?
10
aruggirello 3 days ago 1 reply      
It would have been nice to see performance comparisons too - though I understand the new codebase might not be optimized for performance yet.
11
bhouston 3 days ago 1 reply      
How does this compare to the optimized hashtable implementations in the various JavaScript runtimes? I imagine their requirements are similar?
12
krick 3 days ago 1 reply      
So, the only thing I'm actually interested in is: does the API stay the same? That is, it's the same old "all in one" data structure with the same behavior for all standard functions, with all the old gotchas left in place and no new ones added, right?
13
nly 3 days ago 1 reply      
> The hash returned from the hashing function (DJBX33A for string keys) is a 32-bit or 64-bit unsigned integer

I thought there was a big hooha about PHP and other dynamic languages using ill-suited hash functions and ultimately most runtimes moved to SipHash?

14
nly 2 days ago 1 reply      
An important variable determining memory consumption is going to be the maximum bucket load factor. Does anyone know what it is as currently implemented?
15
ape4 3 days ago 1 reply      
I assumed they used C++ std::map<>
16
debacle 3 days ago 2 replies      
Copied from /r/php (care of http://www.hhvm.rocks):

    dev@aerilon ~/dev $ php --version
    PHP 5.5.20-pl0-gentoo (cli) (built: Dec 22 2014 13:44:21)
    dev@aerilon ~/dev $ hhvm --version
    HipHop VM 3.5.0-dev (rel)
    dev@aerilon ~/dev $ php memusage.php
    13.97 MBs [14649088 bytes]
    dev@aerilon ~/dev $ hhvm memusage.php
    2 MBs [2097152 bytes]
So basically this implementation still uses 100% more RAM (hhvm is 64bit) by default compared to the current production version of HHVM.

Great job, PHP internals team...

19
Daala progress update
233 points by dbcooper  2 days ago   38 comments top 6
1
AceJohnny2 2 days ago 3 replies      
Interesting to see they're also applying Daala to still images. It's a very different set of constraints and optimizations than video.

My first instinct was to wonder how it compared to Bellard's recently featured BPG image format [1], which achieves impressive quality at low sizes. Turns out you can check this by selecting "HEVC" in Xiph's image comparison, since that's the core method employed by BPG.

[1] http://bellard.org/bpg/

2
IvyMike 2 days ago 0 replies      
Charles Bloom has been discussing PVQ and Daala over on his blog. I am not an expert nor have I fully understood these articles, but they definitely look like interesting related reading.

http://cbloomrants.blogspot.com/2014/12/12-16-14-daala-pvq-e...

http://cbloomrants.blogspot.com/2014_12_01_archive.html

3
aidenn0 2 days ago 2 replies      
I prefer the JPEG to the Daala in that one sample image; JPEG does have more artifacts, but Daala seems to preserve less detail.

For low-to-medium bitrate video that might be the correct tradeoff though.

[edit] H264 and H265 look like successive incremental improvements over JPEG in the direction I personally prefer; they reduce artifacts without losing detail.

4
0x0 2 days ago 2 replies      
Applying an easing animation on the split screen comparison mouse drag was probably not the most user friendly idea.
5
bla2 2 days ago 2 replies      
Subjectively, on that image VP9 looks nicer than Daala. (And x265 looks nicer than both.)
6
rikacomet 2 days ago 1 reply      
Just to add as feedback: I much prefer the JPEG render of 2 things in particular in that sample image. First, the brownish stone in the lower-right corner, as well as the car tracks. Regarding the car tracks, the rough look in the JPEG render looks more natural to me. The stone is only slightly better in the JPEG, though. For the rest of it, Daala wins! I especially like the thin, tall leafless tree on the left.
20
Sleep sort (2011)
262 points by olalonde  10 hours ago   39 comments top 12
1
antics 9 hours ago 3 replies      
Just to be 100% clear, in case anyone who is new to the field stumbles on this thread, this is NOT breaking math or anything. This is a fun, but totally normal result.

First of all, there are quite a few linear time sorting algorithms. Radix, pigeonhole, and counting sort are all linear time sorting algorithms. The popular result that any comparison-based sorting algorithm takes O(n log n) applies _specifically_ to comparison-based sorting algorithms, and not to those like the above. So, even if you ignore the underlying mechanics of the OS scheduler and assume sleep works "perfectly", the result would not be that unusual.

Second off, in the comments there appears to be considerable time spent worrying about the technicalities of the scheduler, the nondeterministic nature of sleep, and so on, and whether this really implies a linear time bound. IMHO these are not worth worrying about, because we already have linear time sorting algorithms. It's fine to assume the scheduler adds no significant asymptotic cost here, even if we know differently.

Third off, remember that all of these sorting bounds assume machines of the Von Neumann architecture. In particular, this model assumes constant time memory and constant time comparison operations. In cases where you're comparing really big numbers, these bounds get worse. This is easy to forget, but worth remembering just since we're on the subject anyway.
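For anyone who hasn't seen a non-comparison sort, here is a counting sort sketch in C, O(n + k) for n values known to lie in [0, k); this is the textbook algorithm, not anything specific to this thread:

    #include <stdio.h>
    #include <string.h>

    /* Counting sort: linear in n + k, never compares two elements. */
    static void counting_sort(int *a, int n, int k) {
        int count[k];                      /* C99 variable-length array */
        memset(count, 0, sizeof count);
        for (int i = 0; i < n; i++)
            count[a[i]]++;                 /* tally each value          */
        int out = 0;
        for (int v = 0; v < k; v++)        /* emit values in order      */
            while (count[v]-- > 0)
                a[out++] = v;
    }

    int main(void) {
        int a[] = {5, 3, 6, 3, 1, 4, 7};
        counting_sort(a, 7, 8);
        for (int i = 0; i < 7; i++)
            printf("%d ", a[i]);           /* prints: 1 3 3 4 5 6 7 */
        printf("\n");
        return 0;
    }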

2
danso 9 hours ago 5 replies      
Sadly, the somewhat lengthy Wikipedia entry for Sleep sort was deleted. Here's archive.org's version:

https://web.archive.org/web/20110622073615/http://en.wikiped...

Reading through Wikipedia's existing pages for other esoteric sorts makes me laugh as I did when reading Hitchhiker's Guide to the Galaxy's various entries for scientific theories (e.g. Bistromathics)... There's Bogosort, Stooge sort, American flag sort, and my favorite, "Gnome sort", named thusly because "that is 'how a gnome sorts a line of flower pots'": http://en.wikipedia.org/wiki/Gnome_sort

3
shultays 8 hours ago 0 replies      
Didn't see this post before:

I heartily disagree with all the attempts to downplay the brilliance of the sleep sort algorithm. Many of you have missed the important point that while traditional sorting algorithms can only utilize one core, sleep sort has the capacity to use the full power of a massively parallel execution environment.

Given that you need nearly no computing in each of the threads, you can implement them using low-power CPUs, so this is in fact a GREEN COMPUTING algorithm.

Oh, and did I mention that the algorithm can also run inside a cloud...?

Sure, you're a genius!

4
ambrop7 7 hours ago 0 replies      
One should know that it is the timer scheduler of the kernel which ends up doing the sorting. With Linux, this is probably a red-black tree[1], so the performance of sleep sort (when it works) is O(n log n).

[1] https://www.kernel.org/doc/Documentation/timers/hrtimers.txt
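A sleep sort sketch in C (thread-per-element; compile with -pthread) makes the point concrete: the program never compares two elements, and the output order is whatever the kernel's timer machinery produces:

    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Each thread sleeps for its value, then prints it; the kernel's
       timer queue ends up doing the actual ordering. */
    static void *worker(void *arg) {
        int n = *(int *)arg;
        sleep((unsigned)n);
        printf("%d\n", n);
        return NULL;
    }

    int main(int argc, char **argv) {
        int n = argc - 1;
        pthread_t *tids = malloc(n * sizeof *tids);
        int *vals = malloc(n * sizeof *vals);
        for (int i = 0; i < n; i++) {
            vals[i] = atoi(argv[i + 1]);
            pthread_create(&tids[i], NULL, worker, &vals[i]);
        }
        for (int i = 0; i < n; i++)
            pthread_join(tids[i], NULL);
        free(tids);
        free(vals);
        return 0;
    }

Run it as ./sleepsort 3 1 2, and only with small non-negative integers, since the runtime is bounded by the largest value.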

5
majke 7 hours ago 1 reply      
With fluxcapacitor it will run in a fraction of a second!

https://github.com/majek/fluxcapacitor#basic-examples

$ ./fluxcapacitor examples/sleep_sort.sh 1 4 20 3 55

6
veesahni 1 hour ago 0 replies      
(impure) ruby implementation:

ruby -e '[3,1,2].each{|n| system("sleep #{n} && echo #{n} &")}'

7
SwellJoe 9 hours ago 0 replies      
Previous discussion: https://news.ycombinator.com/item?id=2657277

Pretty sure it's been posted a couple other times, but I'm on a tablet at the moment.

8
perlgeek 7 hours ago 0 replies      
Recently I wrote about sleepsort for the Perl 6 advent calendar, mostly as a fun topic to talk about threads, promises etc.

http://perl6advent.wordpress.com/2014/12/23/webscale-sorting...

9
avodonosov 2 hours ago 0 replies      
javascript version (faster):

    var input = [5, 3, 6, 3, 6, 3, 1, 4, 7];
    var timeAxis = [];
    // Put every element at the corresponding time point.
    // A time point holds an array, as we can have
    // equal elements.
    for (var i = 0; i < input.length; i++) {
        var elem = input[i];
        timeAxis[elem] = [elem].concat(timeAxis[elem] || []);
    }
    // Skip all empty (undefined) time points
    // and concat all the arrays:
    var sorted = timeAxis.reduce(function (accum, cur) {
        return cur ? accum.concat(cur) : accum;
    }, []);
    console.log(sorted);

10
xianshou 4 hours ago 0 replies      
Radix sort, with time as the buckets...stylish.
11
codezero 9 hours ago 3 replies      
This is my new favorite after bogosort.

I'm sad nobody has implemented my idea, rock, paper, scissors sort.

12
logicallee 6 hours ago 1 reply      
if you'd like to trade space for time, here is another sort with no comparisons: http://codepad.org/0k2Ou2Db
21
Why is everyone so busy?
240 points by Futurebot  2 days ago   210 comments top 33
1
waylandsmithers 2 days ago 10 replies      
So this is probably a little Marxist, but isn't the reason we're not working 3 hours a day the fact that the owners of capital capture all the gains from technological advance? In other words, money is more powerful if you can buy a robot at a high up front cost with small ongoing costs.

On a micro level, I see this as a worker figuring out how to do his 8 hour task in 4. Boss man says great, now you can do this other task for 4 hours also instead of going home at noon.

2
falcor84 2 days ago 2 replies      
I would like to provide another explanation for the longer hours at work: reverse-telecommuting (doing personal stuff on company time).

Particularly with the advent of the internet, there isn't that much stuff that we NEED to be away from work to do. We can easily organize our personal finances, research and schedule various appointments, order groceries, etc. all the while coordinating with one's spouse.

The expectation to stay longer hours at work seems to have come with more goodwill towards running personal errands and general slacking around at work, such that though we work longer hours, the work itself is much less concentrated.

3
StillBored 2 days ago 9 replies      
I blame my own personal loss of time partially on long commutes. Previously, I lived about ~8 minutes away from where I worked. Now I live about ~40 minutes (which is only about 12 miles) from work. So, that works out to ~5.3 extra hours a week I lose sitting in my car. Or about 6.5 work weeks of time a year. With that time, I could be teaching the kids something, learning a new skill, exercising, working on my own projects, or just relaxing.

Think about how much work you can get done in 6 weeks at work. That is what is being lost because I live in a state/city that puts infrastructure projects near the bottom of the priority list.
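(A quick check of the arithmetic, assuming a 5-day commute: the extra 32 minutes each way is 64 minutes a day, about 5.3 hours over a 5-day week; at roughly 48 commuting weeks a year that is ~256 hours, or about 6.4 forty-hour work weeks, in line with the figures above.)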

4
tjradcliffe 2 days ago 1 reply      
"I don't have time for that" should be translated as "That's not a high enough priority for me to make time to do it."

While we live in a world where setting our priorities one way (more money, less time) is facilitated by everything from labour laws to social conventions, we do have a choice to set them differently.

There are two prongs to this: 1) setting our personal priorities differently, and consciously accepting the trade-offs that involves and 2) promoting social and corporate policies that enable our preferred priorities.

Setting personal priorities may involve simply not working through lunch every day and ramping up from there.

Promoting social and corporate policies may involve things that facilitate telecommuting (reflecting the discussions here on the time-cost of commuting) or a move toward more European-style labour policy, which doesn't seem to have harmed European productivity (http://ieconomics.com/productivity-euro-area-united-states). That's another way of saying the Anglosphere all-work-all-the-time ethic is really inefficient, and who wants to be inefficient?

For many people, though, leisure time is over-rated. People work long hours because they value their working hours more than their leisure hours, and set their priorities accordingly. There are a lot of reasons why people have those priorities, some of which may be related to how bad we are at judging what is likely to make us happy: http://www.huffingtonpost.com/kirsten-dirksen/happiness-rese...

5
adamzerner 1 day ago 2 replies      
1) People seek purpose in their jobs. The jobs that provide this usually require an investment of time.

2) People seek status. Status comes from relative wealth and intelligence, which requires an investment of time.

3) People usually don't have the choice to trade money for time. You usually can't say, "I'm going to work 5 hours less per week and in return take a smaller salary". Hourly workers can (sort of) do this, but they're the ones who usually can't afford to do this.

All that said, there still are plenty of opportunities to exchange money for more time and people (IMO irrationally) pass these opportunities up. Ex. paying for food instead of taking the time to cook, clean, shop. Ex. paying someone to clean your house. I'm not really sure what the explanation for this is.

6
oldspiceman 2 days ago 1 reply      
Necessary reading for anybody who's sick of hearing people talk about how busy they are: https://www.google.co.uk/search?q=the+busy+trap&gws_rd=ssl

"Busyness serves as a kind of existential reassurance, a hedge against emptiness; obviously your life cannot possibly be silly or trivial or meaningless if you are so busy, completely booked, in demand every hour of the day."

7
debacle 2 days ago 4 replies      
In the current economy, service providers are no longer incentivized to provide ease of use. Everything is commoditized. Grocery shopping, banking, plumbers, auto repair, travel. We want it as cheaply as possible because we're all strapped. This creates time bloat on both sides - it takes longer to get what we want, and it takes longer on the other side to actually provide services.

We're currently in an artificial starvation economy. When you're starving, it's important to conserve energy (or in this case capital) which means that things take longer than they should. You spend a few hours more to save a few dollars more, but, long term, the stress of being stretched causes impulse purchasing, which creates a positive feedback loop.

8
Sorgam 2 days ago 5 replies      
My opinion is that people who are motivated to do things or make money work as much as they can to work towards what they want. Sitting around would be frustrating. That doesn't mean they need to work so much to survive, they just want to.

Personally I've never done that. I often worked 3 day weeks. Now I do programming at home whenever I feel like and have a low-hours day job. I make enough money to be comfortable and it's quite nice.

I think the predictions have come true - for anyone who wants it.

9
ascendantlogic 2 days ago 1 reply      
As a personal corollary, this year I started doing nights and weekends consulting. After having spent years playing video games in my free time I now feel like any hour away from my 9-5 job that I'm not billing is an hour wasted. My bank account has never been healthier but I know I'm burning the candle at both ends and it will catch up with me. I can't shake the feeling of time not worked being time wasted, though.
10
chvid 2 days ago 1 reply      
Well. Where I live not everyone is busy.

Our population of working age (18-65) is about 3 million people, with about 1 million on some kind of welfare (unemployment benefits, government-paid sick leave, disability pension, etc.).

So here in Denmark, Mr. Keynes has turned out to be right.

What he failed to see was the rise of the welfare state and just how unevenly distributed work or "busyness" would end up being.

11
nickik 1 day ago 0 replies      
Another thing to consider is that the law is heavily focused on full employment, and there are lots of benefits tied to it. This is true in almost all countries; labor market regulations have a clear focus on this.

We have no idea what the (abstract) free market labor time would be. For better or worse, our economies have developed into a kind of paradigm that is not focused on flexibility. This developed during industrialisation and should be more relaxed now. In some places (most places programmers work), working hours are already much more flexible.

12
afoot 2 days ago 0 replies      
When I worked for a large established company a lot of the older, more senior staff who had been around for a long time would complain constantly about the workload being much higher than it was a decade ago.

When younger team members got promoted to similar roles with the same workload they didn't have the same issues. The big difference between the two was the use of technology. The workload expectation was based on efficient use of all of the systems and programmes we had available. Those that still used paper-based systems and ignored automated processes and other efficiencies really struggled, and their days were much longer as a result.

13
nickik 1 day ago 1 reply      
It's simple: because we want to consume. People work hard and then buy an expensive car. Well, you could have worked less and driven a shitty old car.

I don't understand what the mystery is. I myself could probably get by with working 4 hours per day, but I don't; I'd rather work 8 hours and buy myself cool computer shit and books.

14
pm90 2 days ago 0 replies      
The underlying problem is to figure out what you really want. Yes, I've felt that feeling of uneasiness in "wasting" time a lot, especially as I moved into jobs which significantly increased the monetary value of my time. But one thing I did know was that I did not value money as much as I valued the time I spent meeting people, being outdoors, reading books or just thinking about things.

Another lesson that self-help books give out a lot is that if you don't decide the course of your life, others will do so for you. I suspect that most people take money as a proxy for success and happiness and hence keep desiring more of it. Ironically, our consumerist environment makes us spend even more as we earn more...making us want to earn even more.

Anyways, this is simply a hypothesis. I can't say that I've figured things out, but I do think I am on the right path.

15
wallflower 2 days ago 0 replies      
The beauty of being human is that we aren't rational when it comes to managing our time. At least, for me, in terms of what has the biggest long-term payoff (e.g. making better friends vs. reading articles two levels removed from the original HN linkage).

> But being busy has become a refrain and rationale for the things we don't do, an acceptable and even glamorous excuse. My friend at lunch reminded me of what the Buddhist monk Sogyal Rinpoche calls "active laziness": the filling of our lives with unessential tasks so we feel full of responsibilities or, as he calls them, "irresponsibilities".

https://medium.com/thelist/the-cult-of-busy-bbb124caed51

16
teddyh 2 days ago 0 replies      
Commuting not only takes time, it is also expensive:

http://www.mrmoneymustache.com/2011/10/06/the-true-cost-of-c...

17
Thriptic 1 day ago 1 reply      
This piece resonated with me strongly. I am currently down in Florida with my parents on a two week break from work for the holidays. This is the first stretch of time off from work I have taken in 2-3 years which involved more than a day or two out of the lab in Boston (I have accrued 8 weeks of paid time off and in addition my employer has effectively told me I can take as much time off as I want). There is a nice pool where I am, a beach, unlimited booze, other fun amenities, and yet I found myself complaining to my best friend last night on gchat that I have been feeling anxious and somewhat bored / unhappy for most of the trip.

I feel like I should be spending this scarce free time better learning python and facilitating my career switch from bio E into data science; I don't want to eat out because that will harm my ability to cut weight for powerlifting (my main hobby); and when I am not trying to code or work on lab work, I spend many hours a day looking for gyms / in the gym. I effectively am attempting to do what I do every day (work from morning until I sleep punctuated with some time spent lifting) and am feeling miserable because I am not doing it as well as I would be able to do it at home. Last night I was checking how much it would cost for me to change my departure date so that I could go home early and use my vacation days more effectively studying.

Part of this angst is driven by the fact that while I love my family and enjoy spending time with them, I feel angry that this is how I am forced to use my special large break. I am never able to spend time with my friends. I see my best friend and close high school friends maybe once or twice a year for 2-3 days. They are scattered across the midwest and simply don't have any vacation days to spend hanging out with me. My college friends in the midwest invited me to spend New Years with them, but they admitted that they would be working the entire week and would only be able to hang out with me for about a day, so I declined. My friends from work are in the same situation as me and don't have time to go on a trip.

As this article correctly points out, I realize that I have no right to complain, as these are completely self-imposed behaviors and thought processes, but to be honest I'm not sure what else to do with myself or my time. I sometimes ask why I work the hours I do; is this really the best way to live my life? While I realize that the answer is probably no, I don't know what else to do, and so I keep doing it.

18
jmadsen 1 day ago 0 replies      
In kind of general agreement with many comments below. If I can just take a stab at what I see as at least one of the ways we got here:

During the previous recession after 9/11, many company managers jumped on the title of the book "More with Less" and started trying to stretch resources, people first and foremost. (I say jumped on the "title" because the book actually was a clever thesis that had nothing at all to do with making your workers do two jobs for the price of one).

That recession was long and deep enough, and quickly followed by an even worse one, that the new job definitions became the new standards. There hasn't been enough of a good economic period for labor to take back its decent conditions.

I think a lot of it is as simple as that.

19
lordnacho 2 days ago 0 replies      
Isn't the standard prescription to do something that doesn't feel like work for a living?

I quite enjoy programming, so it's not like I'm constantly watching the clock when I work. I'm sure there are loads of other jobs that people like doing that don't make them feel like they would rather be doing something else.

20
bshimmin 2 days ago 0 replies      
For anyone interested in the context of the Keynes quote in the first paragraph of this article, here it is in full: https://www.marxists.org/reference/subject/economics/keynes/...
21
fwn 2 days ago 1 reply      
> "Ever since a clock was first used to synchronise labour in the 18th century, time has been understood in relation to money. Once hours are financially quantified, people worry more about wasting, saving or using them profitably."

Right. Cavemen lived lazy through the day. If they did not find an animal to hunt one day they told their children not to starve but to go and get some chicken wings from KFC.

Surely they were happy to live before the industrial revolution, the time when time wasn't evaluated in economic terms. This is a great fairy tale in the social sciences.

22
3rd3 2 days ago 0 replies      
I think an obvious fact is being overlooked: while new technology accelerates existing tasks, it also introduces a lot more possibilities and new tasks. For example, Internet messaging allows us to spend a lot less time on each message, but it also allows us to send more messages within the same time. It's also cheap, so we are free to set our own limits (which we are pretty bad at).
23
collyw 1 day ago 0 replies      
"American men toil for pay nearly 12 hours less per week, on average, than they did 40 years agoa fall that includes all work-related activities, such as commuting and water-cooler breaks."

Only being 40 I can't really say for sure, but wasn't it the case that one man would provide for a household 40 years ago, while now we generally have both partners working?

24
tessierashpool 1 day ago 0 replies      
the article starts out with (paraphrase) "decades and/or centuries ago, we expected tech would save us incredible amounts of time, but it didn't."

if you had a computer in 1984, and somebody told you what processor speeds would be in 2014, you might have expected all software to run flawlessly by now.

as it is, everybody has more computing power in their pocket than the entire Apollo moon landing ever had, and we use this power to look at pictures of cats.

there's probably an economic principle explaining both of these failed prediction categories. in either case, what people do with the new abundance has a lot more to do with what they're willing to tolerate in their lives than what the technology is actually capable of.

great tech will only produce great results if you choose to do great things with it.

25
stealthfound3r 2 days ago 1 reply      
Douglas Rushkoff nailed it in a short TED talk http://vimeo.com/65904419
26
jotux 1 day ago 0 replies      
Similar but a different perspective: http://opinionator.blogs.nytimes.com/2012/06/30/the-busy-tra...

I personally strive to never, ever, say that I'm busy.

27
ateng 2 days ago 0 replies      
It is interesting to note that there are lots of factors in why people in large cities generally walk faster. Someone did research on this (forgot where the source is), and it concluded that on average younger people walk faster, and young people are concentrated in cities.
28
DanielBMarkham 2 days ago 1 reply      
Being a Star Trek fan, I was watching TOS (The Original Series) yesterday with my son. As was the norm for that show, in every episode Kirk somehow managed to find an alien planet with a beautiful woman on it that needed some sort of help. [insert long discussion about misogyny in 1960s TV shows]. I'm a bit of a movie buff, so I started using IMDB to look up some of these actresses to see what they're doing now.

Dang. They're either dead or in their 80s.

We are becoming the first generation to have a multi-media reminder of how short life is. In previous generations once grandpa died, you might have a painting and/or some family stories to share. Perhaps a tombstone to visit. Over time the memory faded away. In this generation and future ones, when grandpa dies? Hell, he might still be online, a bot posting his musings for the next 50 years. I don't see a reason grandpa can't send the grandkids a "Happy Birthday! Now you're 60!" message decades after he passes.

This is a good thing overall, in my mind, but this constant reminder of how short life is has the effect of making people really stingy about their time. That's probably not such a good thing, as many important connections happen when we're not looking for them.

As I get older, I find the need to orchestrate my time instead of conserving it: spending some time on high-energy, deeply-focused tasks and spending some time purposefully _not_ focusing and instead spending time socially with others I care about. Working in a good technology team is good for the former, exercise, outdoors, and family is good for the latter, at least for me.

Being mindful of time is fine. Being stingy with it or bitter about losing it? Not so much.

29
branchless 1 day ago 0 replies      
Because land prices have rocketed: the banks lend based on both incomes, so now both parents have to work to feed the rentiers.
30
ZoomZoomZoom 2 days ago 1 reply      
Quite an interesting read, but it's just stating the obvious repeatedly. Time equals money. And of course, with a slower economy coupled with the ever-increasing competition of the global market, money becomes more valuable. But why do we feel so? It seems there are some socio-cultural drivers for this need for resources.

People tend to want the same things their neighbours have, or more. Today your neighbour is not just someone living across the fence, but also someone you've just chatted with on Skype (across continents). Our sense of what Normal is is based on the cultural environment we're living in. And with technological progress, modern communications and mass media, this sense skews towards images propagated by those who are involved in producing more of this kind of information, i.e. the Western world.

The flip side of this process is that what might once have looked like a decent living now seems less so. When you live in an area with a life expectancy of about 60 years and suddenly realize your friends live somewhere where the average is 74, you become more stressed to maximize your efforts. So my idea is that most people who work extra hours do so not to get rich, but trying to avoid ending up worse than average.

So what is the way we can free our time so we could sit on the "park benches with pretty girls" more often?

The only obvious answer I see is to improve the overall life quality of the poorest. This sounds frighteningly lefty, but the notion is based on one of humanity's ultimate goals: providing the personal freedom for everybody to do what each person feels is preferable. The non-destructive way of achieving it is taking away the stress and fears of a less income-maximizing lifestyle. When you're certain you will still be well fed and able to afford medical assistance even without extra hours at work, you'll be more inclined to do what fulfils you as a person.

At this point it's natural to discuss the old issues of exploiting social care and parasitic lifestyles in a market economy, which might be possible in societies with high social guarantees. This is an axiological issue; answers depend on personal senses of equity and sympathy, and it's a hot topic on its own. However, scientific and technological progress is, I think, the only plausible way of improving overall life conditions and thus leaving more time for leisure.

Some context on happiness vs inequality:

Paul Alois. Income Inequality and Happiness: Is There a Relationship? www.lisdatacenter.org/wps/liswps/614.pdf

Shigehiro Oishi, Selin Kesebir and Ed Diener. Income Inequality and Happiness. http://www.factorhappiness.at/downloads/quellen/S13_Oishi.pd...

31
lifeisstillgood 2 days ago 0 replies      
Just slow down. I don't do good work when I am rushing, I don't take the time to simplify or improve, but if I feel I have time and the support for good work I can be more productive in days than I would by rushing for weeks.
32
yagibear 2 days ago 1 reply      
TLDR, anyone?
33
dodyg 2 days ago 0 replies      
tldr: first world problem.
22
Google Self-Driving Car Project's first vehicle prototype
229 points by justhw  3 days ago   192 comments top 28
1
amckenna 3 days ago 9 replies      
A lot of people are bashing the car's appearance, but I think people forget that putting the first driverless cars on the road is as much a PR challenge as it is a technological challenge. Truly autonomous driverless cars are a huge shift in the way we have operated for almost 100 years. There will be a lot of caution and resistance from political groups, concerned citizens, entrenched interests, etc. The car that they put forward first needs to be non-threatening, safe, and easy to adopt.

Given Google's stake in Uber the car will be part of a fleet that can be summoned by a mobile app, not some product you go out and buy. Because there will be no dealerships and individual owners, they don't care about attracting buyers for the vehicle - it doesn't need a cool factor. What it needs is to be non-threatening and safe so you will feel comfortable getting in one and going for a ride.

Additionally, the first car on the roads will just be making in-town trips and will be limited to 25 mph; no highways or major arterials. This means it makes more sense for the car to be compact, light, and similar to a Smart Car than a Camry or SUV.

2
nichodges 3 days ago 6 replies      
My initial reaction was dismay that Google seemingly didn't consult any decent auto designers on this. But then I wonder if that's actually fine.

My kids will likely be baffled by the idea that we attached so much of our own identity to our cars. The financial investment in cars to make a statement about ourselves (over and above getting us from A to B) is immensely irrational.

With self driving cars ownership will likely disappear, and be replaced with time sharing. At that point the connection between our view of ourselves, and the car we ride in disappears.

I'm not sure that completely excuses the lack of modern car aesthetic here, but it could go some way to explaining it.

3
LukeB_UK 3 days ago 1 reply      
Matthew Inman of The Oatmeal got to have a ride in one and shared his thoughts here: http://theoatmeal.com/blog/google_self_driving_car
4
bane 3 days ago 3 replies      
Lots of people are pointing at this being Uber's future "auto-car". Here's an alternative idea:

- You can buy this car. It costs $100,000. But that's okay.

- When you aren't actively using it, you tell it to go into "Uber mode" and pick up and drive people around as part of the "Uber Network of Cars"

- You split the fee with Uber/Lyft/whoever. They get 30%, you get 70%.

If the average ride pays you $7, over 5 years that's like 8 rides your car has to "sell" per day to be effectively "free" to you (except for financing, insurance, etc.).

- At the end of the workday, Google Now summons your car to pick you up in front of your office and whisk you home.

- After dropping you off at home, your car goes back onto Uber mode and does night-time service (if you opt-in).

You could probably pay your car off much earlier than 5 years with more rides/higher average ride fare, after which your car is making you money. Clever people will use this extra to finance more cars to run small fleets and effectively live without working.
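(A quick check of the arithmetic above, assuming the $7 is the owner's net share after the 70/30 split: 8 rides/day × $7 × 365 days × 5 years ≈ $102,000, which indeed covers the $100,000 purchase price.)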

5
jasonwilk 3 days ago 8 replies      
The way Google seems to be approaching self-driving cars is the right one in my opinion. Self-driving cars will be on-demand, booked through something like Uber and will not be owned by the end-user.

I feel that the other car companies working on self-driving car technology for consumers are wasting their time. The main reason I enjoy owning a nice car is that I like driving it. If I wasn't in control of driving my car, what would be the point? Vanity of course has to be considered, but in the future I see self-driving cars which we don't own as the status quo in cities, and owning and manually operating a car will be either a novelty or something for people outside of major city hubs.

6
kandalf 3 days ago 1 reply      
Given that as far as I know, roads must be extensively mapped in advance of a self-driving car going on them, there is a nice bonus of doing self-driving cars exclusively through Uber at first. Uber can know the exact route the passenger wants to take in advance, and only send cars to passengers whose routes are already mapped. Furthermore, they can choose to only send them out when the conditions are good (no snow, etc. assuming conditions are still a problem when these go into fuller production). A nice way to roll the cars out incrementally without some of the problems they might otherwise have...
7
click170 3 days ago 0 replies      
One of the aspects of this that I've been getting concerned about is the invasion of privacy these will pose, especially if it's one or a handful of companies owning and operating the autonomous vehicles.

It's true that if you carry a cell phone you already carry a personal tracking device and offer this information up freely to your cellular provider, but I'm interested in reducing instead of increasing the amount of information I'm leaking in that way.

What kind of information will these cars track? They'll have to track who rides in them for accountability purposes, which I already find troubling. Your average cabby isn't going to be compiling a profile about you based on where you catch rides to. Who has access to the information, such as who rides in which cars? Is this available via an open API? I'm already peeved at companies like FitBit which hold my data ransom; is this going to be another of those situations?

There are a lot of privacy questions that I feel aren't being adequately addressed, but I still look forward to the possibilities this will bring. The privacy questions are answerable, and any problems should be correctable.

8
vinkelhake 3 days ago 1 reply      
To those that get hung up on the design: remember that this car is limited to 25 mph for regulatory reasons. Having a design that is closer to a bumper car than a model S seems fitting with that in mind.
9
kin 3 days ago 0 replies      
A lot of people are bashing its appearance. I think it looks cute. So, to each his own in that regard. But seriously guys, this is happening a lot sooner than I thought, and I really could not be any more excited to have these on the road.
10
billsossoon 3 days ago 0 replies      
Reminds me of the Cozy Coupe [1] I had as a child. Perhaps that's not by accident. The fear is that these machines will be unsafe, either to their passengers or to other cars on the road. Making it cute may reduce the perceived threat level.

[1]: http://www.littletikes.com/content/ebiz/shop/invt/612060/612...

11
Animats 3 days ago 0 replies      
They've got to make more progress on the sensors. They still have that overpriced Velodyne HDL-64E scanner (about $100K) on top of the prototype. The new vehicle has a slightly smaller device on top, probably the HDL-32E. Google doesn't seem to be making progress on flash LIDAR or millimeter microwave radar, which are going to be needed for reasonable-cost production vehicles.

CMU/Cadillac have a self-driving car. They have a number of long videos taken with a back-seat camera.

https://www.youtube.com/watch?v=uXhvQeArwWM

It's good enough that it's been driven around downtown Washington. It doesn't seem to sense turn signals or infer much intent from other-driver behavior. The driver has his hand on the auto/manual switch at all times; clearly there's not much confidence in this thing yet.

12
jastanton 3 days ago 1 reply      
Google has had much more PR about its revolutionary tech, and that has a LOT going for it. When Tesla comes out with their self-driving car, even if it looks 100x better than this prototype, I still might pick Google. Better aesthetics with comparable functionality will win the majority of the time in my book (think Android devices vs Apple devices); however, when it comes time to put my life on the line, I will go with something I feel is safer 100% of the time, regardless of how it looks. And like always, Google will dominate with its superior functionality (backed by their PR over the last couple of years) over any Tesla any day.
14
davidw 3 days ago 0 replies      
I wonder if these things could be used in potentially 'easier' niches like long-haul trucking: you'd create a loading/unloading port near the freeway, and send the truck to another port across the country.

Naturally, I don't know anything about trucking, and you'd want to be really sure something so big and bulky is safe, but the idea would be, rather than "do everything a car does all at once" to do something relatively simple.

15
yRetsyM 3 days ago 0 replies      
I thought this was quite a good commentary on it: http://theoatmeal.com/blog/google_self_driving_car
16
3apo 3 days ago 2 replies      
From an economic standpoint, I would be interested to see how many off-the-shelf parts this system uses. That is, does it need a $1000 lidar, or would it get similar performance with a cheap $100 sensor? From a technical point of view, 25 mph is very limiting IMO. You probably do not need a very sophisticated controller to navigate at 25. If you reach speeds of 60-70 mph with varying road curvatures, the controller design gets trickier.
17
jvagner 3 days ago 0 replies      
Cute, adorable, non-threatening... could have been achieved and still looked... better, methinks.
18
jedunnigan 3 days ago 2 replies      
I don't have a firm enough understanding of how the LIDAR and lasers work in these vehicles, but it occurs to me that it might be possible for a malicious actor to confuse the cars and cause accidents. That's concerning.
19
dogeye 3 days ago 1 reply      
Nobody who has used Adwords believes Google will ever make a self driving car.
20
bbayer 3 days ago 0 replies      
What is Google's purpose here: to develop a platform that car manufacturers will want to integrate, or to produce driverless cars under the Google brand?
21
pinaceae 3 days ago 0 replies      
I guess it has a strong appeal to the hardcore Android crowd: people who get excited about utility but have no sense at all for style. Not a bad thing, mind you, but it is already obvious that it will take a company that gets style, like Tesla and yes, Apple, to make this appealing to people on the other side of the spectrum. People who cannot unsee ugliness, asymmetry and disproportion.

Personally, I just hope roads stay open for motorcycles in the future. Self-driven transportation can be massive fun.

22
danblick 3 days ago 0 replies      
What's with the side-view mirrors?
23
brc 3 days ago 1 reply      
What, exactly, are the mirrors for?
24
productcontrol 3 days ago 1 reply      
Edit: Google staff and fanbois have far too much time to downvote, but less time to articulate why, it seems!

I tried to use Google's driverless car, but every time I asked it to search for rival services, it kept driving me to their search and ad sales offices. It was a bit weird, like they told the car to prefer their own services first! My Eurotrash friends promised to investigate though, so that's nice.

25
soupcancooloff 3 days ago 0 replies      
Don't worry guys, it's only a matter of time until Uber adds another service called UberGrandma/pa and starts targeting senior citizens with this car. (Google Ventures has a stake in Uber.)
26
sparkzilla 3 days ago 2 replies      
This thing looks like a clown car. Perfect for those who think Glass is the height of fashion.
27
nodata 3 days ago 1 reply      
It looks crap.

Google, please don't release that.

(OR: Google, I will pay more not to ride in that.)

28
manticore_alpha 3 days ago 8 replies      
Unfortunately, these things just don't have a cool factor. Here's to hoping Tesla moves forward much more quickly (and regular car manufacturers as well).

It's something Google probably just doesn't "get" - but a lot of people's identities are tied to their cars. It's why we have colors, shapes, brands, options.

This car may be "perfect" algorithmically, but it doesn't mean it stirs the soul.

23
Flipping bits in memory without accessing them [pdf]
209 points by ColinWright  3 days ago   80 comments top 20
1
kabdib 3 days ago 3 replies      
What goes on at the chip level is terrifying. "You have to understand," a hardware engineer once said to me, when we shipped a consumer computer that was clocking its memory system 15% faster than the chips were supposed to go, "that DRAMs are essentially analog devices." He was pushing them to the limit, but he knew what it was, and we never had a problem with the memory system.

There was a great TR from IBM describing memory system design for one of their PowerPC chips. Summary: Do your board layout, follow all these rules [a big list], then plan to spend six months in a lab twiddling transmission line parameters and re-doing layout until you're sure it works . . .

2
hammer_test 3 days ago 1 reply      
Note: the regular memtest+ doesn't have this test. Use the researcher fork:

https://github.com/CMU-SAFARI/rowhammer

in Ubuntu 14.04, run this to bring all the dependencies for building: sudo apt-get build-dep memtest86+

Update: just finished running the test on my cheap Lenovo laptop. Not affected. phew! :)

3
rab_oof 3 hours ago 0 replies      
Am I wrong (if you happen to work on processor microcode) or could microcode patches per processor insert a minimum delay where needed based on RAM parameters and organization to prevent this?
4
zanethomas 2 days ago 1 reply      
The success of this approach to corrupting memory depends upon knowing the geometry of the memory chip. Naive calculations of which addresses correspond to an adjacent row may be incorrect.

It's interesting to see this issue addressed in 2015. In 1980 I worked at Alpha Microsystems and designed a memory chip test program which used translation tables based upon information we required chip manufacturers to give us in order for their chips to be used in the systems we sold.

That approach required us to only put one type of memory chip on a memory board. But back in the day, microsystems were expensive and customers expected them to be well-tested.

5
jhowe 2 days ago 1 reply      
I work in the chip industry. This was a good paper.

1. Note that chip-kill/Extended ECC/Advanced ECC/Chipspare, which are all similar server-vendor methods for 4-bit correction, will prevent this problem. These methods are enabled on the better reliability server systems.

2. This failure mode has been known by the DRAM industry for a couple years now and the newest DRAM parts being produced have this problem solved. The exact solution varies by DRAM vendor. I wish I could go into specifics but I am unaware of any vendor that has stated publicly their fix.

6
BetaCygni 3 days ago 1 reply      
Excellent article! The fact that they can reliably produce errors in most RAM chips is worrying. They also provide a solution (probabilistic refresh of neighboring rows).
7
mseaborn 1 day ago 0 replies      
Here is a program for testing for the DRAM rowhammer problem which runs as a normal userland process:https://github.com/mseaborn/rowhammer-test

Note that for the test to do row hammering effectively, it must pick two addresses that are in different rows but in the same bank. A good way of doing that is just to pick random pairs of addresses. If your machine has 16 banks of DRAM, for example (as various machines I've tested do), there should be a 1/16 chance that the two addresses are in the same bank. This is what the test above does. (Actually, it picks >2 addresses to hammer per iteration.)

Be careful about running the test, because on machines that are susceptible to rowhammer, it could cause bit flips that crash the machine (or worse, bit flips in data that gets written back to disc).
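The core access pattern looks roughly like this; a C sketch under the same assumptions as above, not the exact rowhammer-test code. Note that clflush is unprivileged on x86, which is what lets a normal userland process bypass the cache:

    #include <stdint.h>
    #include <emmintrin.h>   /* _mm_clflush (SSE2) */

    /* Read two addresses back to back, flushing their cache lines so
       every iteration re-activates the DRAM rows. As described above,
       the two addresses must fall in different rows of the same bank
       for hammering to occur. */
    static void hammer(volatile uint32_t *a, volatile uint32_t *b, int n) {
        for (int i = 0; i < n; i++) {
            (void)*a;                       /* forced DRAM read  */
            (void)*b;
            _mm_clflush((const void *)a);   /* evict both lines  */
            _mm_clflush((const void *)b);
        }
    }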

8
markbnj 2 days ago 0 replies      
This article actually contains one of the better-written fundamental explanations of DRAM operation that I've read. Thanks for the post.
9
bhouston 3 days ago 4 replies      
How long until someone uses this as the basis of an exploit? Maybe not root access, but if you can figure out an OS call that replicates the access pattern, you can corrupt machines just by interacting with them.
10
blinkingled 2 days ago 1 reply      
Hopefully this doesn't affect ECC DRAM? Also does the problem get worse with increased density - i.e. 16GB modules are more vulnerable than say the 8GB ones?
11
diydsp 3 days ago 0 replies      
From abstract:

High-speed DRAM reads influence nearby cells. Reproduced on 110 of 139 memory modules after 139k reads, on Intel and AMD. 1 in 1.7k cells affected.

12
phkahler 2 days ago 4 replies      
Problem: They propose a solution and calculate the reliability of the solution. Why not test it with their FPGA based memory controller and demonstrate an improvement?

Second: While the problem looks real enough, the tests to demonstrate it are not realistic. Hammering the same rows with consecutive reads does not happen in the real world due to caches, which they get around via flushes. I'd like to see more data on how bad the abuse needs to be to cause the problem. Will 2 reads in a row cause errors? 5? 10? 100? They never address how likely this is to be a real-world problem. I don't doubt that it is, but how often?

13
pera 3 days ago 1 reply      
So basically you just make a couple of memory reads a few hundred thousand times and this will alter some nearby cell? Why didn't manufacturers test this? It looks like a pretty obvious thing to test while working at these scales.
14
kazinator 2 days ago 1 reply      
Interestingly, the researchers used a Xilinx FPGA, not just an off-the-shelf AMD or Intel PC.

Why not?

If the attack can only be reproduced by custom hardware, why should anyone care?

Also, precise patterns of access to DRAM would require disabling the L1 and L2 caches. Doesn't that sort of thing require privileged instructions?

With caching in place, memory accesses are indirect. You have to be able to reproduce the attack using only patterns of cache line loads and spills.

15
jhallenworld 2 days ago 0 replies      
I think row hammer is basically a DRAM design defect and wish it was fixed in the DRAM instead of on the controller side. At the very least the DRAM vendors should document this access pattern limitation in their datasheets.
16
tsukikage 2 days ago 1 reply      
So, busywaiting on spinlocks considered dangerous?
17
edwintorok 3 days ago 1 reply      
Why doesn't it mention the manufacturers' names?
18
rebootthesystem 2 days ago 0 replies      
I've completed many multi-gigahertz product designs during my career. If you take the time to study and understand the physics involved and bother to do a bit of math none of it is particularly difficult. I reject the characterization of this as some kind of a black art. It's not magic. Yes, of course, experience helps, but it isn't magic. One problem is that some in the industry are still using people who do layout based on how things look rather than through a scientific process. Yes, it's analog electronics. When was it anything else?

Want to wrap yourself around another challenging aspect of high-speed design? Power distribution system design (PDS). You can design perfect boards based on solid transmission line and RF theory and have them fail to work due to issues such as frequency-dependent impedances and resonance in the PDS.

19
jadc 3 days ago 1 reply      
Discussion from a couple of weeks ago:

https://news.ycombinator.com/item?id=8713411

20
blazespin 3 days ago 0 replies      
Sounds like a great technique for a DDoS.
24
GPGPU Accelerates PostgreSQL
220 points by lelf  3 days ago   52 comments top 10
1
JonnieCache 3 days ago 5 replies      
So glad to see this coming along.

If any of the project team are reading, what I'd like to see most is GPU-accelerated point-in-polygon lookups in postGIS, ST_Contains and so forth.

2
majc2 3 days ago 0 replies      
As an aside, if this is of interest, you might be interested in the third run of the GPGPU course by Coursera/University of Illinois, which starts in January. See: https://www.coursera.org/course/hetero
3
joelthelion 3 days ago 1 reply      
It's great to see that they're using OpenCL. GPU computation desperately needs standardization, and this could help bring OpenCL drivers on par with CUDA.
4
maaaats 3 days ago 3 replies      
Most servers I've used don't even have a GPU. It will be interesting to see how this and other GPGPU applications for server software will shape server farms in the future.
5
adamtj 3 days ago 0 replies      
This is very interesting, but not as a CPU saving optimization. I'm sure it does that very well, but that's not why it's Important. Rather, this seems to me like the next step toward the inevitable future of PostgreSQL as arbitrarily scalable, and as _the_ general query engine that ties together whatever physical data stores you happen to use.

It seems obvious to me that pushing the Foreign Data Wrapper layer with work like this is how we eventually break through the RDBMS scalability barrier of the individual host. In the future, I'm sure you'll see similar work where _the_ GPU won't be the GPU and the PCI bus won't be the PCI bus. Rather, they'll be _a_ host and the network. A database service (database cluster in Postgres's nomenclature) will eventually run not just on a single machine, but on a single cluster of machines. Instead of a cluster of machines for redundancy, you'll have a cluster of clusters.

Postgres is really two things in one: a physical layer of bytes in pages in files, and a logical layer of queries on tables of records. The most important piece in the future will be the logical layer. The FDW layer will naturally be extended and generalized until it is fully as powerful as the current physical layer. At that point, it can be made THE api through which the logical layer accesses data. The current physical layer will then be nothing more than the default implementation of that general API.

At that point, we can move whole or partial tables to other hosts. Perhaps the autovacuum daemon will gain a sibling in the autosharding daemon. The query optimizer will need to care not just about disk IO, but network IO and will need to start considering the non-uniform performance characteristics of different tables. Some tables will be driven by Postgres's default physical storage engine. Others will be driven by other RDBMSs, or by NoSQL key/value or document stores, or other data stores. They may be on the same machine or a different one.

Postgres will transform into a query engine on top of whatever data stores best fit your workload. I expect the query engine will learn about columnar stores and be able to mix those in a single query with the row stores, key/value stores and document stores that it already understands. PostgreSQL will be a central point through which you can aggregate, analyze, and manipulate any and all of your data. It needn't be intrusive or disruptive: you can still use a normal redis client for your redis store, but you can also use Postgres to manipulate that data with SQL and to combine it with other tables, whole RDBMSs, other NoSQL stores, spreadsheets, web services, or anything else. Maybe it will even make things like Map/Reduce frameworks redundant.

I don't typically follow Postgres's internal discussions, so maybe this is already being discussed and planned. Or, maybe it's so obvious that nobody even needs to talk about it. Or, perhaps I'm just some wide-eyed idealist who doesn't understand the fundamental problems preventing such a thing from ever being practical.

6
nrzuk 3 days ago 3 replies      
I absolutely love the concept and really want to buy a graphics card just to play with this on my development box. I find it quite exciting how some applications are utilising graphics processing power.

But I can't help but wonder what the sysadmin's response is going to be when I start asking for additional graphics cards to be added to his perfectly built 2U database servers!

7
fitshipit 2 days ago 0 replies      
This reminds me a little of the Netezza data warehouse appliance's architecture: a query planner in front of lots of little nodes with one disk, one CPU, and an FPGA. Every query is a full table scan, each node flashes the WHERE clause to the FPGA, and slurps the whole disk through the FPGA.
8
jvickers 3 days ago 1 reply      
Does anyone know if or when the GPGPU acceleration will be available in the normal Postgres install?

Is / will this acceleration be switched on by default?

9
thomasfoster96 2 days ago 0 replies      
This + Amazon RDS would be pretty awesome for mapping.
10
sjtrny 3 days ago 1 reply      
In other breaking news, grass is green and water is wet. Obviously throwing more power at the problem ends up with faster execution.

There's a limit to GPGPU acceleration though: the tiny amount of RAM. We need to adopt a shared-memory architecture like those found in games consoles. A single massive pool of RAM would further unlock potential power.

25
High speed M&M sorting machine
218 points by nbsymr  4 days ago   43 comments top 12
1
Animats 3 days ago 10 replies      
That's low-speed sorting. This is high-speed sorting:

https://www.youtube.com/watch?v=DogZJmThRSE

That's a high-speed computer-vision optical pea sorter. Yes, each and every pea in that huge flow of peas is examined by a computer vision system. Tiny high-speed air jets are kicking out the rejects during the brief period the peas are in free flight.

Here's a blueberry sorting machine, throwing out anything that doesn't look like a round blue blueberry:

https://www.youtube.com/watch?v=8CyWvnh4YtE

That machine could sort M&Ms by color, easily.

The food industry has lots of machines like that. The technology was first applied to large fruit like tomatoes. It's now so cheap it's applied to rice and grains.

2
frisco 3 days ago 0 replies      
It's a little M&M flow sorter! They even kind of look like cells. If you really want your mind to be blown, go read about flow cytometers: http://en.wikipedia.org/wiki/Flow_cytometry#Fluorescence-act...

Fluorescence-activated cell sorting uses the same idea as the M&M sorter linked here, except with lasers instead of an iPhone camera, and an electron gun that deposits charge onto a droplet containing exactly one cell as it falls through an electric field, sorting it into bins. Madness!

3
51Cards 3 days ago 1 reply      
Very cool. I just picked up a couple Mindstorms kits and have been debating my first build challenge. This looks perfect (though it won't be as quick I'm sure)

Edit: Just playing with the math regarding the speed of this machine considering the M&Ms are in free fall. Nicely done!
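
As a rough sanity check (assuming the M&M drops about 10 cm between the camera and the sorting gate): free-fall time is t = sqrt(2h/g) = sqrt(2 x 0.1 / 9.8) ≈ 0.14 s, so capture, color classification, and actuation all have to finish within roughly a tenth of a second.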

4
Zikes 3 days ago 2 replies      
I'd like to see a variant of this capable of separating a mixture of M&Ms and Skittles.
5
mrestko 3 days ago 0 replies      
6
dale386 3 days ago 0 replies      
Is the code posted anywhere?
7
razzberryman 3 days ago 1 reply      
Very nice. If you want to reduce the shadows for sorting browns, try backlighting the chute so that shadows can't appear on it. Basically, turn the chute into a photography light box.
8
ChuckMcM 3 days ago 0 replies      
oh the glue, the glue! M&M sorters are great projects though. I am impressed that the bluetooth link has the frequency response to actually get to the blue ones before they have fallen what appears to be a few inches.
9
jorjordandan 3 days ago 1 reply      
Somebody buy this guy a 3d printer.
10
comrh 3 days ago 0 replies      
The sound is very rhythmic. One part of a sort machine band.
11
forrest_t 3 days ago 2 replies      
if only this existed back in Van Halen's heyday

http://en.wikipedia.org/wiki/Van_Halen#Contract_riders

12
seanemmer 3 days ago 0 replies      
but can it sort Skittles?
26
Show HN: Initial release of H2O, and why HTTPD performance will matter in 2015
217 points by kazuho  19 hours ago   72 comments top 20
1
moe 12 hours ago 1 reply      
Very nice work, competition is always good.

However, it seems worth mentioning that webservers haven't been a bottleneck for a long time now. Your bottleneck is always disk I/O, the network, or the slow application server that you're proxying to.

For reference: Wikipedia[1] serves roughly 8k pageviews/sec on average for a total of ~20 billion pageviews/month.

Assuming each pageview consists of ~10 webserver hits we're looking at ~80k requests/sec.

This is within the realm of a single instance of either nginx or h2o on a beefy machine [on a very beefy network].

So, unless you plan to serve Wikipedia or Facebook from a single server, you're probably fine picking your webserver software on the basis of features rather than benchmarks.
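
(As a quick check of that arithmetic, assuming a 30-day month: 20 x 10^9 pageviews / (30 x 86,400 s) ≈ 7.7k pageviews/sec, and at ~10 hits per pageview that's roughly 77k requests/sec, consistent with the ~80k figure above.)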

[1] http://reportcard.wmflabs.org/graphs/pageviews

2
scottlamb 7 hours ago 0 replies      
I'm skeptical of the performance numbers. First, like others here I don't believe nginx's performance will be a bottleneck for HTTP/2. Beyond that, I suspect there are cases in which this code is much worse than nginx.

Here's one. Look at the example request loop on <https://github.com/h2o/picohttpparser/>. It reads from a socket, appending to an initially-empty buffer. Then it tries to parse the buffer contents as an HTTP request. If the request is incomplete, the loop repeats. (h2o's lib/http1.c:handle_incoming_request appears to do the same thing.)

In particular, phr_parse_request doesn't retain any state between attempts. Each time, it goes through the whole buffer. In the degenerate case in which a client sends a large (n-byte) request one byte at a time, it uses O(n^2) CPU for parsing. That extreme should be rare when clients are not malicious, but the benchmark is probably testing the other extreme, where each request arrives in a single read. Typical conditions are probably somewhere in between.
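
To illustrate the loop shape being described, here is a hedged sketch; try_parse is a hypothetical, simplified stand-in, not picohttpparser's actual API:

  #include <string.h>
  #include <sys/types.h>
  #include <unistd.h>

  /* Hypothetical stand-in for a stateless parser such as
     phr_parse_request(): it re-scans buf[0..len) from the beginning on
     every call.  Returns 1 once the header terminator appears, -2 when
     more data is needed. */
  static int try_parse(const char *buf, size_t len)
  {
      for (size_t i = 0; i + 3 < len; i++)      /* O(len) work per call */
          if (memcmp(buf + i, "\r\n\r\n", 4) == 0)
              return 1;
      return -2;
  }

  static void read_loop(int sock)
  {
      char buf[65536];
      size_t buflen = 0;

      while (buflen < sizeof(buf)) {
          ssize_t n = read(sock, buf + buflen, sizeof(buf) - buflen);
          if (n <= 0)
              return;                           /* EOF or error */
          buflen += (size_t)n;

          /* No parser state survives between attempts, so each call
             re-scans the whole buffer: a client trickling an n-byte
             request one byte per read() forces 1 + 2 + ... + n = O(n^2)
             total parsing work. */
          if (try_parse(buf, buflen) == 1)
              return;                           /* complete request */
      }
  }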

3
rkrzr 18 hours ago 4 replies      
Congrats on shipping! This project looks very interesting already and will hopefully pick up more contributors.

Is there already support for configuration files? Because for me the performance isn't the most important issue, in fact the main reason I'm using nginx over Apache is that I don't want to deal with .htaccess any more.

I think if you would consider adding support for the nginx config file format to H2O, thus making it a drop-in replacement for it (if all the used features are actually supported), you could give the project a huge boost.

4
stephth 17 hours ago 6 replies      
Interesting article. And congratulations for the release!

Sorry, this is a bit off-topic (and doesn't apply to H2O, as it's been in the works for a while judging by the commits), but I wonder: with a language like Rust (1.0 is at the door [1]) that is as performant as its safe C equivalent but modern and safe by design (and with an escape hatch to C/C++ if needed), what would be the advantages of starting a long-term project of this type in C today?

[1] http://blog.rust-lang.org/2014/12/12/1.0-Timeline.html

Edit: why the downvotes?

5
halayli 16 hours ago 1 reply      
This doesn't look like a complete HTTP server; comparing it with nginx is not fair.

- It's missing content-encoding handling on the receiving side

- No HTTP continue support

- No regex routing support

- No header rewrites

to name a few.

6
robbles 17 hours ago 3 replies      
> Instead, switching back to sending small asset files for every required element consisting the webpage being request becomes an ideal approach

This doesn't solve the other side of the problem that spritesheets are meant to solve, namely that an individual image will not be loaded yet when the first UI element using it is displayed (e.g. in a CSS rollover, or new section of a SPA appears). I can't see a way that new protocols are going to solve this, unless I'm missing something in how HTTP2 is going to be handled by the browser?

I assume that once you're forced to preload everything you might need for the page, it's no longer more efficient to break it up into multiple tiny requests.

7
Shish2k 18 hours ago 3 replies      
Looking at the tangentially linked qrintf project that H2O uses ( https://github.com/h2o/qrintf ), which replaces generic sprintf calls with specialised versions for a 10x speed boost: that seems like a brilliant idea. I wonder why it took so long for somebody to think of it?
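
For what it's worth, the core idea can be sketched in a few lines. This assumes the format string is a compile-time constant; my_utoa is an illustrative helper, not qrintf's actual generated code:

  #include <stdio.h>
  #include <string.h>

  /* Illustrative integer formatter (not qrintf's generated code). */
  static size_t my_utoa(char *dst, unsigned v)
  {
      char tmp[10];
      size_t n = 0;
      do {
          tmp[n++] = (char)('0' + v % 10);
          v /= 10;
      } while (v != 0);
      for (size_t i = 0; i < n; i++)    /* digits were produced in   */
          dst[i] = tmp[n - 1 - i];      /* reverse order; flip them  */
      return n;
  }

  int main(void)
  {
      char a[32], b[32];

      /* Generic: libc parses "%u" out of the format string at run time. */
      sprintf(a, "HTTP/1.%u", 1u);

      /* Specialized: the constant prefix is copied verbatim and the
         integer is formatted directly; no format parsing at all. */
      size_t off = 0;
      memcpy(b + off, "HTTP/1.", 7); off += 7;
      off += my_utoa(b + off, 1u);
      b[off] = '\0';

      return strcmp(a, b);              /* 0: both yield "HTTP/1.1" */
  }

The project appears to apply this rewriting automatically to sprintf calls whose format strings are known at compile time.
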
8
zzzcpan 17 hours ago 1 reply      
The socket API is the bottleneck now, right? So, next step: roll your own HTTP-friendly TCP stack on top of netmap/dpdk and get a 10x performance increase over nginx.
9
jarnix 18 hours ago 1 reply      
Obviously it's great software. Does Kazuho work alone on this? If it's meant to replace nginx, it needs a lot of other options/functions/extensions/modules/...

Is it getting commercial support/funds?

10
huhtenberg 2 hours ago 0 replies      
That's very good code. Succinct and readable. You clearly know your C well :)
11
jvehent 11 hours ago 0 replies      
That's a cool project. Performance is a fascinating topic.

However, in the real world, the number of requests per second an HTTP daemon can serve is the last thing to worry about. If the web is slow, it's not because Apache used to be bloated with threads. It's because of bad architecture: centralization of services, latency in page build times, size of static components, data store bottlenecks, etc...

Nevertheless, a very cool project. One I'll follow closely.

12
dschiptsov 7 hours ago 0 replies      
So, it has better strings, pool allocators, zero-copy buffers and syscall support than nginx/core/*.ch? That would be a miracle.
13
okpatil 9 hours ago 0 replies      
It seems that everything mentioned in the library could be done easily with golang. I'd be interested to see how H2O benchmarks against pure golang binaries.
14
thresh 7 hours ago 0 replies      
Hello there, can you share the performance test details? The configurations of both servers, the client software, and the hardware/server setups.

Thanks!

15
haosdent 12 hours ago 0 replies      
I can't understand why it would be faster than nginx. Maybe the way nginx was benchmarked in this case is wrong?
16
bkeroack 14 hours ago 1 reply      
If you're relying on HTTP for all your microservices, you're doing it wrong.
17
ams6110 17 hours ago 1 reply      
Another one to keep an eye on might be the new httpd in OpenBSD. http://www.openbsd.org/cgi-bin/man.cgi/OpenBSD-current/man8/...

I'm not seeing that there is yet a portable version however.

18
xfalcox 16 hours ago 0 replies      
Any plans to get script support, like nginx access_by_lua?
19
caycep 16 hours ago 0 replies      
whoa, and here I was thinking nginx was the be-all and end-all of sweet sweet blistering speed...
20
PythonicAlpha 18 hours ago 1 reply      
Looks very promising!

I am not so much into web-servers (yet), but I found this in the feature list:

  reverse proxy    HTTP/1 only (no HTTPS)
Are there any plans to also add HTTPS support for the reverse proxy? As it stands, I have to include a secondary (Tornado) web server in my stack for dynamic pages.

It also puzzled me that HTTPS is not supported, yet in the benchmarks I found a part labeled "HTTPS/2 (reverse-proxy)". As I said, I am not so much into web servers and HTTPS/2, but that was a little confusing.

27
No more JavaScript frameworks
232 points by hit8run  2 days ago   157 comments top 55
1
pedalpete 2 days ago 4 replies      
I have a very different perspective than the author.

I've considered libraries to be the pieces of code that let us 'paper over browser inconsistencies': things like jQuery. Sure, they also gave us some useful tools and plugins to accomplish tasks, but the core purpose was to be able to write simple JavaScript code for common functions and have parity across all the major browsers.

Frameworks, I view as code organization and paradigm tools. They create a more consistent structure for a group of developers working on a single project. They aid in creating an MV* or other similar paradigm that a team can work within. Even outside of a team environment, frameworks give you a ... framework, for how your app should be built, rather than leaving you to invent your own structure as you go along.

This is the reason why there are so many frameworks vs. libraries. Once you've chosen a library to handle browser inconsistency (which, as the author mentioned, is barely an issue anymore), there are endless possibilities as to how your app should be structured. Some people like two-way data binding, some don't. Some people need the extreme DOM performance of leveraging the virtual DOM; others hate the idea of having their HTML closely bound to their JavaScript (and framework, as is often the case).

2
dasil003 2 days ago 3 replies      
Don't throw the baby out with the bathwater. Sure JS frameworks are often unnecessary, that doesn't mean they aren't solving real problems.

The scope of applications and interactivity found in today's browser-based apps dwarfs what we were doing even 10 years ago. The amount of experience and expertise being codified in (for example) Ember.js is fantastic and will let you write much richer web apps in far less code than trying to achieve the same functionality without it.

Saying just use JS directly is sort of like saying don't bother with Rails, CGI.pm has everything you could ever need.

3
shripadk 2 days ago 3 replies      
"React isn't a framework, it's a library!" There I said it!

Jokes aside, React provides a Virtual DOM which isn't available natively. I'm sure sometime in the future browser's DOM tree estimation and computation would be so efficient that you wouldn't need a Virtual DOM anymore. Until then, React is the best way forward. There are plenty of thin VDOM libraries (some even more efficient than React) out there but the joy of working with code written as composable Components outweighs efficiency (I'm sure those libraries would fare far worse in benchmarks if composability is put into the equation). Couple this with a library like Om/Reagent and you get FP (code as data)+immutability handed to you on a silver platter.

Without the "right" libraries you introduce a lot of incidental complexity into your application development lifecycle.

4
bstrom 2 days ago 0 replies      
None of the upgrades in ES6/ES7 obviate the need for frameworks. Maybe some libraries are made redundant, but frameworks address more interesting abstractions that a general language probably shouldn't cover anyway. Frameworks seem to adopt more opinions about implementation, and help unite a team over a set of ideas, and enable them to move faster and reason about similar problems in similar ways. Hopefully that team has picked a framework that is also well-suited towards the problem they're solving.
5
gsands 2 days ago 1 reply      
As a user of Angular, Backbone, and React (not all at the same time!), I feel like their largest benefit to me is in providing application structure -- good because as it has been said, if you aren't using a framework, you are writing your own. I don't really see how any of the 3 items under the article's "So what do we need now?" will provide that benefit.
6
frik 2 days ago 4 replies      
Modular small libraries are great; I tend to cherry-pick just a few. Specific frameworks like React are great too.

But all these old monolithic frameworks/libraries that try to solve everything and the kitchen sink are so 2009-ish.

They have their place in the enterprise world. But for performance optimisation you want a more modular approach.

7
Osiris 2 days ago 1 reply      
I've found that even if you start with the "basics", like the author describes, you'll end up finding that you repeat the same types of tasks. So you refactor and write functions that simplify common actions. After a while, I guarantee you that your code will look a lot like a framework.

Frameworks are created because they make solving certain types of problems easier and reduce code duplication, whether you use someone else's or your own.

8
davvolun 1 day ago 0 replies      
> The problem is that now you have two systems to learn, HTML+CSS+JS, and the framework

Besides that there are 4 things listed there, not to mention how each interacts with the other, this is not a problem. If you shy at learning an additional thing, you should get out now.

That's not to say you shouldn't strive to reduce complexity and dependencies, that's a laudable and useful goal. But it's my personal opinion that every programmer out there should know at least a handful of languages, concepts, paradigms or frameworks that they don't like or have no use for--it's the same basis of learning more from failure than success.

That being said, I sincerely disagree with the assertion of the article, if on nothing more than the abstraction argument. The abstractions of jQuery have saved me so much time over the years with systems that work great in the latest and greatest browser, and work well enough, possibly with minor tweaks, in older browsers. Rewriting browser detection junk and making sure each control mostly works independent of browser and engine would be a much larger waste of my time than learning three or four js frameworks.

9
al2o3cr 2 days ago 1 reply      
"The problem is that now you have two systems to learn, HTML+CSS+JS, and the framework."

As opposed to the non-framework case, where you have HTML+CSS+JS, N libraries, and all your custom glue code... Yup, that's WAY easier.

10
emsy 1 day ago 1 reply      
It's not only the frameworks. Working with JavaScript has become ridiculously complicated. I remember when I started developing JSF applications. You had to learn about application servers, servlets, Spring, MySQL, Maven, Hibernate, JSF, JSP and the various JSF extensions before you could even start working. JavaScript now has become almost the same with Bower, NPM, Angular, MongoDB, Grunt, Compass, Jade and whatnot. Nowadays, when I start a project I try to keep it as simple as possible because

a) Today's technology is deprecated tomorrow

b) When someone joins, I'm the one who'll have to teach them

11
pothibo 2 days ago 0 replies      
I agree with you wholeheartedly. I just recently started blogging exclusively about how to build dynamic sites without a JavaScript framework @ http://pothibo.com

Solutions exist outside the scope of JS framework. We only need to take a minute to analyze them. It's not a popular decision to avoid JS frameworks but I believe some criticism of the current solutions available is in order.

12
fiatjaf 2 days ago 1 reply      
A framework that is much like a non-framework: https://github.com/raynos/mercury virtual-dom, selective rerenders, faster than light, truly modular
13
debaserab2 2 days ago 3 replies      
It's not just about a framework featureset, frameworks also provide convention and structure. Convention and structure are a problem whether or not you choose to use a framework.

Frameworks give us a language to speak to each other about the nuts and bolts of our application. This means new developers can get ramped up quickly (provided they understand the framework) and there is a common body of best practices that can be referenced when faced with difficult problems. There's a lot of value in that.

14
duaneb 2 days ago 2 replies      
People also said this when HTML5 was on the rise: pure JS, no runtime but the browser! Of course, it turns out engineering DOM-manipulating applications in a responsive way is really hard.

These are the core benefits of frameworks:

1. Standard code style.

2. There's only one way to do it (ideally).

3. Modularization.

4. Unit testing.

5. A realistic standard library for async/concurrent computation.

Angular, for instance, provides dependency injection and $scope/$digest. It would be pretty ridiculous to attempt to replicate those benefits until AT THE VERY LEAST Object.observe has solidified in terms of support. And even then, you're on your own in terms of mocking, in terms of communication between different modules, in terms of libraries you can drop in without dragging in a framework itself.

I'm appreciative of the attitude, but it's quite simply not a reality for people who don't want to invest in establishing their own patterns; in effect, writing a framework. With a framework, people can sit down, use the engineering techniques they've learned from other areas of CS, and write an application without getting bogged down figuring out how to write a high-performance single-threaded web app in a language without modules, integers, futures (or other similar async abstractions), or calendar widgets.

Try going without JQuery for a day and see how much duplicate code you write.

Then, multiply yourself times a team of 10.... good luck.

15
kuni-toko-tachi 2 days ago 0 replies      
The combination of technologies the author suggests we settle on is absurd. Look at the Elm language or Om in ClojureScript for JS frameworks with solid computer science behind them.

The only JS frameworks that shouldn't exist are the ones cooked up by folks with little or no theoretical backing in firm comp science. And yeah, AngularJS, I'm talking about you.

16
encoderer 1 day ago 0 replies      
I think one of the biggest problems with the morass of client side code in many large code bases is that it's written, by and large, by engineers who are trained and focused on the shared-nothing world of http server side development. The art of managing state, and the goal of a stateless application is lost on you if you've never had to mentally run a program that runs longer than a few seconds. Yes, there are many fad frameworks. But I totally disagree that the answer is "fewer frameworks, more raw js." I think the answer is building tools to help better manage state (eg React), and over time strengthening the skills of your average "full stack" web engineers from 75% server side 25% client side to a more 50/50 split.
17
john1108 2 days ago 1 reply      
There is actually a tool[1] that will provide the right subset of polyfills to the specific browser. However, you do need to specify the list of polyfills you use in your code.

[1] https://github.com/Financial-Times/polyfill-service

18
rapind 2 days ago 0 replies      
This post is just pushing web components / Polymer with a controversial title, right? I don't think it's production ready yet though (slow polyfills), and there are some differing opinions on how web components should be built.

Here's an interesting and fairly detailed article about a different approach: http://jlongster.com/Removing-User-Interface-Complexity,-or-...

Best of both worlds? https://github.com/PixelsCommander/ReactiveElements

20
zak_mc_kracken 1 day ago 0 replies      
> So why are we still writing JS frameworks? I think a large part of it is inertia, it's habit.

I'd say it's need.

> Q: You cant do ____ in HTML5, for that you need a framework.> A: First, that's not a question. Second, thanks for pointing that out. Now let's work together to add the capabilities to HTML 5 that allows ____ to be done w/o a framework.

Great idea, but what do we do in the meantime? It's not like features get added to HTML 5 overnight.

21
drogus 2 days ago 2 replies      
The author of this article completely ignored all of the features that actually make people use Ember.js. Router, naming conventions, project structure, container to manage objects. It's possible that in the future Ember.js will use plain HTML components and JS features like Object.observe, but it won't mean that suddenly its code will shrink to 1kB, because it's not all there is to the framework.

My summary of that text: "I don't know how frameworks work, so let's not use them".

22
sdotty 1 day ago 0 replies      
My thoughts precisely! Down to web components and Polymer! Thank you, now I have something to reference when talking to my colleagues. Well, I hadn't considered HTML imports, x-tag, or Bosonic, but thank you for that information. The author could also have mentioned the last round of crazy JavaScript framework flowering ... about 2005-2006 ... Prototype, YUI, Sarissa (the most basic one), Dojo, MooTools, script.aculo.us, jQuery... jQuery seemed to have won that round, and for a few years most people seemed to be using jQuery. Until AngularJS, Backbone, React and Ember came along :)

As for tools to compile and crush the CSS and JS, consider https://developers.google.com/closure/

23
digitalzombie 2 days ago 1 reply      
It's a complete mess right now.

Eventually front end will get its act together... the closest thing for the bare stuff is something like the Yeoman stack: Bower, npm, Grunt/Gulp, etc. Require.js doesn't play nicely either. I wish JavaScript had come with a built-in module system, but it feels like you're just dressing up a pig now.

Maybe people do like coding in JavaScript, but so far the direction JavaScript is going feels like a mess and added complication. JavaScript now tries to do backend stuff and deviates from its originally intended domain. Which is fine, but it also makes a mess, because the language doesn't have the constructs and primitives for the backend in mind.

What constructs? Modules, for one, are not built into JavaScript. It's OK, we fixed it. Now we have several module systems. How about concurrency? What's wrong with passing callbacks and not blocking? It's ugly, and because it's ugly we're going to patch it up with promises in the next language iteration. It goes on. What's wrong with fixing the language? Nothing; you're just fixing it on a shaky foundation while adding more complexity. You're just a debbie downer. Perhaps, or perhaps I'm spoiled by the elegance of other languages...

People are going to chug away and use it as a hammer, IMO. But seriously, it's a very ugly language to write huge amounts of code in for anything more than small scripts.

Frameworks solve some problems, but I'm wary of them now, especially Angular and Ember. Smaller frameworks might be better, but I'm sick of front end and the constant search for the thing that will save us from this nightmare of SPAs. I'm leaving front end.

24
arenaninja 2 days ago 1 reply      
In some ways, browsers are as fragmented as ever. Everyone loves to forget IE, but the last I checked IE8 was #3 in sales in our shop. Throw in a little jQuery and we never have to worry about a lot of things. Polyfills are nice, except that browsers are still experimenting, and plenty of people aren't mindful of that fact and it's not uncommon to visit sites that don't work properly on a browser other than Chrome. A sad state of affairs, but a reality nonetheless
25
p3drosola 2 days ago 0 replies      
I agree with some of the points the author makes. Libraries > frameworks. Don't create silos, etc.

The problem with this article is that most of the innovations that browsers are now bundling came from the community, in the form of libraries or frameworks (document.querySelector, JS templating, promises, observables, server push, etc.).

Libraries & frameworks are the way new paradigms are explored and improved. They are not the future, but they contain the future.

26
mpoloton 2 days ago 1 reply      
"Now let's work together to add the capabilities to HTML 5 that allows ____ to be done w/o a framework"

It takes a long time and many iterations for standards to be finalized and adopted. It took more than a decade for HTML5 to be released. In addition, adding more and more features to the standard leads to bloat and even longer release cycles. I think this is the role of frameworks/modules: to provide higher abstraction and convenience.

27
codingdave 1 day ago 0 replies      
I do think that frameworks come out too fast, and too often. But I disagree that we should move past them. They do make coding easier and faster. The real efficiency comes not from using a framework for one project, though. It comes from using one framework for many years, until you are coding features almost as fast as you can think of them.

My concern with frameworks is that they become a crutch. People no longer try to code up their features from scratch. I see this the most in jQuery-heavy developers, who will go seek out a plugin when they could have coded up the same features themselves had they just stopped, taken a step back, and thought about it for a while.

At the end of the day, I support picking a framework, sticking with it long-term, while also recognizing its weaknesses and knowing when to just write native JS.

28
vhpoet 1 day ago 0 replies      
Unless you're coding a hello-world app, if you aren't using a framework, you are writing your own. If you're working on a 50k+ line frontend app, there's no way you're not gonna come up with a structure, standards, and conventions, which eventually become something like the frameworks we already have out there.

Frameworks are not built to deal with browser inconsistencies. At least that's not the main selling point, unless you're talking about jQuery only. You mentioned Angular and Ember, which provide a lot more.

29
barnaby 2 days ago 0 replies      
I remember when people used to make these arguments with Java and PHP. "You don't need a framework, just write your JSP templates to have database calls with SQL and tons of Java code." BLEH! I'm glad Struts, then Spring, then Play, came out and made life more sane.

I wrote JavaScript before there were frameworks, when it was just libraries like MooTools and Dojo and jQuery sitting on server-side templates. Today I write CORS apps with Backbone.js+Marionette.js+require.js+grunt+qunit, and it solves all kinds of problems and I love it! I'd never go back to JavaScript without frameworks. As soon as this project is over I am gonna try AngularJS to see what the hype is about.

30
davorb 2 days ago 0 replies      
I think what the author is describing could be looked at as PHP+MySQL versus something like Rails. The problem with being in one camp or the other is that one size will not fit everyone.

Both bare-metal JS and frameworks are here to stay.

31
hit8run 2 days ago 0 replies      
It is interesting to see so many different opinions on this topic. In my opinion it really depends on the kind of problem you want to solve. For very simple things it can feel stupid to load in a huge js framework. If one wanted to create something like a blog where a little ajax here and there is nice one might completely skip a framework. But does a developer really want to solve standard problems that have been solved countless times? Big web applications can profit from a framework that enforces some coding standards for common problems.
32
serve_yay 1 day ago 0 replies      
I certainly sympathize, but I'm not going to use Angular or some similar pile of goop because people don't want to think about JS frameworks anymore.

Something I've noticed: my coworkers who bitch and rant about this the most love Django. So I don't think the problem is the notion of framework as such. It may just be that things are changing really fast and people get stressed out trying to stay on top of it all.

33
theoutlander 1 day ago 0 replies      
I'm glad that I've blatantly ignored all the frameworks that have come and gone over the years. I still know what these frameworks are and what they're capable of.

Frameworks like Polymer are nice because they will eventually fade out. I think that was the idea with TypeScript as well, but I think MS deviated.

What we should encourage are unopinionated but comprehensive libraries that simplify tasks, just as the author mentioned.

34
arcosdev 2 days ago 0 replies      
JavaScript, among all the languages out there, is the most ripe for "abuse". It can be used in so many ways, I don't know how you don't use a framework.
35
dustingetz 2 days ago 0 replies      
I strongly disagree. Innovation is good, form some opinions about which ideas are good, become good enough to quickly identify good ideas and quickly dismiss bad or obsolete ideas, and all the problems go away.
36
glifchits 2 days ago 0 replies      
I think the speed of development gained now by leveraging all the good stuff that frameworks offer in the 10% tip of the iceberg is hugely beneficial in the short term. In the long term, thanks to initial productivity, teams will get more resources to invest into rewriting code for new frameworks or even rolling their own JS "paradigm" (be it a framework, lib, or nothing)
37
jenscow 2 days ago 1 reply      
Frameworks aren't specific to Javascript. Almost every mainstream language I know of has a framework of some sort - If you're using a language on its own then I guarantee you'll end up writing (and testing/maintaining) similar code to everyone else. After a few years, you'll have built up a "utilities library" that you copy from project to project. This will need maintaining, and new starters will also have to learn your library.
38
aikah 2 days ago 0 replies      
Well, with the release of ES6, the author should expect an EXPLOSION of frameworks, especially class-based, IoC-driven ones. ES6 brings loads of features to the language; some of them will lead to complex codebases (proxies, modules, loaders, realms, quasis...).

It won't just be about sticking a few JS files together. JS dev will get more complicated as time goes on.

AngularJS and co are just the beginning of a trend that will accelerate exponentially in the next 3 years. Be ready for endless variations of Angulars and Reacts.

39
madprops 1 day ago 0 replies      
Honestly, every time I see a job posting saying it needs Angular or some other similar framework, they lose me. I've built some fairly complex applications using nothing more than jQuery and Handlebars. That level of abstraction leaves room and flexibility to build pretty much anything.
40
FallDead 2 days ago 1 reply      
I feel the author does not have enough development experience or maturity to make such a statement. The reason for such frameworks is to clean up the mess and fill the void left by years of neglect of the web as a platform, and to let ideas blossom on how to improve the web; clearly not many people can agree on standards.
41
hathym 2 days ago 0 replies      
The last sentence "there are standalone libraries for that." seems to contradict the entire article
42
caetan 2 days ago 0 replies      
Research, development, ... evolution is an iterative (if not recursive) process. We need to experiment with Backbone to create Angular to create the next framework. Angular 2 is already prognosticated as NOT backward compatible, aye?
43
Mimu 2 days ago 0 replies      
Current frameworks are made more and more useless by the evolution of standards, browsers and stuff; however, what will most likely happen is that new frameworks will emerge and the circle will continue.
44
Siecje 2 days ago 0 replies      
Frameworks and libraries let you try different ways of doing things, if they work out they make it into the browser, like document.querySelector()
45
mixonic 1 day ago 0 replies      
This is so, so off base.

The implication in this post is that frameworks are standing still. That browsers have evolved and frameworks are built based on some past version of browsers that no longer exists.

In reality frameworks, browsers, and web standards are coupled together in an important triangle that creates the progress Joe calls out.

Frameworks iterate at an incredible pace compared to standards and browsers. The multitude of frameworks and libraries implementing the component pattern directly influence the spec. Members of W3C TAG and TC39 take the lessons learned from JavaScript development and fold it back into specs.

A great example is promises. This is a standard JS pattern, and now native feature, that began life as a library, became several libraries, then an independent spec, then finally a formal ES6 spec (domenic can correct me if I have the history wrong). When we talk about Move the Web Forward this is what we mean: http://movethewebforward.org/. Frameworks represent the shared and negotiated best practices of a development community- from this we can formalize solutions into specs.

To make a second point: Developing for the web means developing for a spectrum of platforms. A framework like Ember (which I work on) provides you with known support for a variety of platforms. I don't need to consider whether 5 different libraries have all fixed the ARM optimization errors on iOS 8. I can know that all the features in Ember have them fixed. In fact, I most likely don't even need to think about the error.

A third and final point: Of course frameworks don't just influence browser features. The conversation is two-way. Unlike a simple scratch-an-itch library, framework authors are constantly looking forward and thinking about how they can better align with upcoming features. Ember has iterated on its Set, Map, and promise APIs to make them match the specs. Sometimes as we align with a feature we discover unexpected architecture problems with the spec (Object.observe, web components) and push the feedback upstream. Sometimes a spec helps us solidify an un-spec'd solution, and we need to expend effort trying to move apps to that new pattern (ES6 classes).

Most developers who buy into SPA architecture and build complex JavaScript applications understand the value of a framework. They are not panaceas (and setting them up as one is making them a straw man), but they absolutely play a very important part in the JavaScript and web ecosystem.

Which is more than I can say about this post's FUD ("Remember all those MochiKit widgets you wrote? Yeah, how much good are they doing you now that you've migrated to Ember, or Angular?" they work just fine thanks) and call to limit your ambitions for a great client-side app.

46
awjr 2 days ago 1 reply      
Within a team, a JS framework is critical to making sure people are writing (and even thinking) in a cohesive way that is maintainable 2 years from now.
47
cnp 2 days ago 0 replies      
Frameworks very much define the way we think about problems. React is a good example, taking from many good ideas before it.
48
progx 2 days ago 0 replies      
Solution: Use the right tools to get the job done.

If it's frameworks, use frameworks.

If it's libraries, use libraries.

If it's plain JS, write plain JS.

49
ragecore 2 days ago 1 reply      
So the author wants to use Javascript more for hacky stuff rather than a proper set of conventions?
50
Rygu 2 days ago 1 reply      
* HTML Imports

* Object.observe

* Promises

* HTML Templates

Sorry but IE8 support. Nuff said.

51
forgottenacc56 2 days ago 0 replies      
The solution proposed here seems similar to react.js
52
Touche 2 days ago 1 reply      
This is completely wrong. The reason frameworks are still popular is Modules. Modules are not solved in the browser and until they are there will continue to be frameworks to ease that problem.
53
bitwize 2 days ago 0 replies      
How I wish this were a Scarlet Witch declaration like in the House of M storyline.
54
pooky666 1 day ago 0 replies      
jQuery is a library; not a framework.
55
unclebucknasty 1 day ago 0 replies      
More JS frameworks (databases, back-end frameworks, languages, caches, messaging systems, etc.) equals more specialization, "job security", higher wages, etc.

To illustrate, the (extreme) corollary is one stack and one pool of laborers with one skill-set.

Just another vantage point, whatever we think of the tech merits.

28
The world is not falling apart: The trend lines
201 points by crgt  2 days ago   106 comments top 20
1
diafygi 2 days ago 11 replies      
Great! We're better overall than we were! No reason to rest on our laurels, though. How can we be better than we are now?

Here's some areas that aren't doing too well:

1. Climate Change - We're in store for a lot of trouble over the next few decades[1]. How will we manage?

2. Wealth Inequality - The gap is widening[2]. How do we reverse the trend?

3. Gerrymandering/voter suppression - The ones in power are the ones who draw the district boundaries[3]. How do we stop the feedback loop?

[1]: http://ipcc-wg2.gov/AR5/

[2]: http://www.pewresearch.org/fact-tank/2014/12/12/racial-wealt...

[3]: http://www.theatlantic.com/politics/archive/2014/12/the-pern...

2
hentrep 2 days ago 2 replies      
This point of an increasingly peaceful world has come up repeatedly over the years, but I think there is an inherent problem in the way it is viewed. The rise of the internet and social media has given us glimpses into the terrible acts humans are capable of perpetrating against one another. Combine that with a biased media that thrives on shock and outrage, and it's no wonder we find this data difficult to digest. Most of the modern world influenced by this biased media resides in very sterile, largely safe environments. In effect, we've become ultra-sensitized to gore and violence, and as a result our impression of, and response to, any sort of mayhem is skewed accordingly.
3
xahrepap 2 days ago 1 reply      
I wish this kind of data was frequently mentioned throughout the year in mainstream media. Continue to show the news as it is, but keep this kind of data around to keep people "calibrated"; I think that would help make people's perspective on the world more positive.
4
Animats 2 days ago 1 reply      
Chairman of the Joint Chiefs Gen. Martin Dempsey informed the Senate Armed Services Committee (in 2013), "I will personally attest to the fact that [the world is] more dangerous than it has ever been." Now that's surprising from the head of the JCS, and from a former commander of an armored division in combat. He's a trade-school guy (West Point), so he knows his military history.

Things have been much, much worse for the US. Early in WWII, it didn't look good. When the USSR got ICBMs and H-bombs, it really didn't look good. Worldwide, nobody is having a really big war right now. The USSR lost 20 million people in WWII. Nothing that bad has happened since.

There are some big worries ahead, mainly regarding proliferation of nuclear weapons and troubles involving existing nuclear powers - Russia, China, Pakistan, and North Korea. Those are the things that can kill us.

Domestically, the biggest threat is the Mississippi River, with major floods both at New Orleans and further upstream.

5
kristiandupont 2 days ago 1 reply      
I am increasingly worried about the state of the world each year.

However, I was discussing this with a friend recently and we were talking about how much of it was simply the result of more reports about the trouble in the world. I realized something: if there were, say, ten reports of really bad, violent crime in Denmark (where I am from -- population: 5 million people) per year, that would be very little. But even so, if I heard about each of them, something really bad would be happening practically every month, which would lead me to feel that things were going down the drain.

In other words, an unchanging constant violence rate would seem like a deterioration. And furthermore, if the type of violence was different every time, I would start to feel that all the different types of violence were on the rise.

So maybe my fear is not completely justified. But I still don't feel completely convinced.

6
chasing 2 days ago 1 reply      
"The World is Falling Apart!" gets way more clicks than "The World is Not Falling Apart," though.
7
retrogradeorbit 2 days ago 3 replies      
The world can "fall apart" without people dying. For example, the ongoing currency wars.

You can also have violence without death. Guantanamo and Abu Ghraib are examples. Because these people are only tortured, and not murdered, is that considered "peaceful"?

And it also depends on how far back you draw the data from. Is the Stalin famine of the 30s modern, or ancient? If we cast the data back to the 1600s, it's new. If we cast the data back to the 1920s, it's old.

Some data sets in this article go back to the 60s, some go back to the 30s, some only back to the 90s. Maybe we need more data. Lets cast all the data sets back a few hundred years and look again.

Consider me unconvinced.

8
YesThatTom2 1 day ago 0 replies      
THIS IS TERRIBLE.

How can news agencies make any money if they can't scare people into hysterics that keep them glued to the TV screen?

How can gun companies and home security firms sell product if people aren't afraid of everything around them?

THIS KIND OF RESEARCH HARMS THE ECONOMY AND MUST BE STOPPED.

Sincerely,
Tom, being cynical

9
UVB-76 2 days ago 1 reply      
I think a lot of this has to do with perception.

The violence and suffering in the world, particularly overseas, feels more inescapable in the digital, high definition, always connected age.

This age has gifted us perception, but not perspective.

10
guard-of-terra 2 days ago 0 replies      
By the way, the homicide rates chart should account for age distribution. As the population ages, the 18-40 age group dwindles. And, statistically, that is the group that dominates homicide deaths.

If you account for that, your chart may switch polarity.

Same for rapes.

As for "Democracy and Autocracy" - this chart comes from people who still call Uzbekistan a "young democracy". Actually, many supposedly democratic countries actually aren't.

11
RodericDay 2 days ago 1 reply      
Steven Pinker has built a writing career out of telling people who are doing really well that everything is fine and that they should keep on enjoying.
12
stealthfound3r 2 days ago 0 replies      
It's the end of the world as we know it. https://www.youtube.com/watch?v=AzqiPvGrkTo
13
tim333 2 days ago 0 replies      
The world is not falling apart. Personally though things are wearing out. They should fix that!
14
Zigurd 1 day ago 0 replies      
A better title would have been "Progress isn't uniform, and measuring it is hard." Each war brings progress in trauma medicine, which contributes to making war (and driving) less deadly. That's good, right? Right?

Higher education requirements and increasing relative status as policing becomes a relatively more desirable job for people with lower (and capped) aptitude means policing gets better by many metrics. Nonetheless, militarization is a bad thing, solution rates are shockingly low, the Drug War is a distraction, and cop culture is rotten: http://www.salon.com/2014/12/23/deader_than_a_roadkill_dog_d...

Some things are clear, however: Terrorism is a negligible threat. Nuclear weapons are still the #1 threat to civilization. But there is a lack of intentionality in both these areas.

15
deciplex 2 days ago 0 replies      
If you assume that most governments have been doing that to various degrees since forever, and that the only things that have changed are the degree to which the information can get out and the ease of anonymous whistle-blowing (which still carries too much risk, but is arguably better than it's ever been) - seen in that light, you might suppose that while things seem to be getting worse, in fact they are getting better, precisely because we're more aware of these various atrocities than we ever have been. It's possible to be very optimistic for the future while also outraged at what these inhuman murdering savages are doing in our name.

For example, I'm sure now that the CIA tortured countless Vietnamese during the Vietnam war, Iraqis during the first Gulf War, and you can apply this to the Korean war, WW2, basically as far back as you want to go. We just didn't know about it then, because it was easier to silence journalists and neutralize (murder) whistle-blowers or suspected whistle-blowers without the general public knowing about it.

FWIW, I am in the pessimist camp as well, in spite of my reasoning here, but just barely. I think probably things are actually about the same as always, but perhaps with the broader awareness we're just starting to see, that is just beginning to change. It's comforting that at least I'll probably know one way or the other within my lifetime.

16
agentultra 2 days ago 0 replies      
This article is pure link bait. Both ends of the extreme are far removed from the truth. There is hardly anything in this article that isn't sensationalist fluff.
17
esaym 2 days ago 1 reply      
I've always heard that more people died in the 20th century than in any other... Not sure if that is true or not. Never did the research.
18
desireco42 2 days ago 1 reply      
This is proof that there are lies, damned lies, and statistics :).

The world is in terrible shape; the climate, new conflicts emerging, and the richest people's detachment from reality are some examples that come to mind.

19
aryehof 2 days ago 4 replies      
> England, Canada, and most other industrialized countries...

England isn't a country. Constantly referring to it as one, was a distraction from the contents for me.

20
seivan 2 days ago 1 reply      
Tell that to the 7000 Yezidian women in ISIS captivity marked with a price tag, paraded through Raqqa then gang-raped, tortured and starved. Forced to strangle themselves to death using scarves to get out.

Or the Female PKK/YPG soldiers defending their families being captured alive by Swedish Arabs and Somalians islamic rapists.

For them the world is falling apart.

I guess since we can't blame Israel for this it's not front-page news. But let's worry about the lack of feminine characters in a Donald Duck video from the 70s or the apparent sexism in games.

29
Eggs Not Always What They're Cracked Up to Be
179 points by x0054  2 days ago   148 comments top 31
1
archagon 2 days ago 12 replies      
The description in "The Omnivore's Dilemma" of egg yolks from idyllically pasture-raised chickens is forever stuck in my memory.

"Between stops, Art mentioned that Joels eggs usually gave him his foot in the door when trying to land a new account. We stopped in at one such prospect, a newly opened restaurant called the Filling Station. Art introduced himself and presented the chef with a brochure and a dozen eggs. The chef cracked one into a saucepan; instead of spreading out flabbily, the egg stood up nice and tall in the pan. Joel refers to this as muscle tone. When he first began selling eggs to chefs, hed crack one right into the palm of his hand, and then flip the yolk back and forth from one hand to another to demonstrate its integrity. The Filling Station chef called his staff over to admire the vibrant orange color of the yolk. Art explained that it was the grass diet that gave the eggs their color, indicating lots of beta-carotene. I dont think Id ever seen an egg yolk rivet so many people for so long. Art beamed; he was in."

I've yet to find an egg like that, though I've heard you can get them if you raise backyard chickens.

2
zedpm 2 days ago 0 replies      
One of many benefits of living in a semi-rural area (western South Dakota) is easy access to truly high-quality foods like actual farm eggs. I've walked around the two farms I buy eggs from and I've seen the chickens wandering around. They get ordinary feed, of course, but also scavenge insects and other things, resulting in gorgeous orange yolks and tasty eggs. I pay $2 a dozen (as long as I promise to return the egg cartons to be reused), or sometimes just trade some fresh produce for the eggs.

Almost all of my meat (beef, lamb, goat, and pork) is similarly purchased from small producers I personally know; often my family and I butcher the animals ourselves. This approach is a win in terms of quality, cost, humane treatment of animals, and satisfaction.

Having a connection to my food and the people who produce it is, to me, reason enough to prefer living in fly-over country vs. San Francisco. All the fresh air, open land, and lower stress environment doesn't hurt either.

3
debacle 2 days ago 4 replies      
The study is funded by the Coalition for Sustainable Egg Supply [1] which is "facilitated" by the Center for Food Integrity [2], which is run by a PR firm on behalf of ConAgra, Monsanto, Tyson, and others.

It makes me sad that NPR is slowly turning into just another mouthpiece.

[1] http://www2.sustainableeggcoalition.org/

[2] http://www.sourcewatch.org/index.php?title=Center_for_Food_I...

4
schmichael 2 days ago 2 replies      
Backyard chicken farmer here! If you have a bit of yard space and time they make wonderful pets. About as difficult to care for as cats and what they lack in cuddles they make up for in hilarious/idiotic behavior.

I doubt it saves me any money, but it's fun and provides lots of "organic pasture-raised" eggs and compost.

5
jack-r-abbit 2 days ago 1 reply      
My first real job (25 years ago... at age 15) was collecting eggs at a medium-sized commercial egg farm. Maybe I didn't care about "organic" or "cage-free" at age 15, or maybe nobody cared 25 years ago, but we were one of those "battery cage" farms. It was long buildings with 5 rows of double-stacked cages. The floor of each cage was slanted so that the eggs rolled out into a trough at the front. We'd push a narrow cart down the row and collect the eggs into trays. It was brutal for both humans and chickens. I didn't eat eggs or chicken for a while after that.
6
steven2012 2 days ago 3 replies      
I'm usually not one to care much about where my food came from, but the one thing I pay attention to, for some weird reason, is eggs. I think the idea of trapping chickens in cages for their entire lives and having them produce eggs for us is revolting, a real-life Matrix situation. I'll pay a few bucks extra to make sure that the chickens get at least a modicum of better life than being trapped in a cage for their entire lives.
7
shalmanese 2 days ago 1 reply      
Given how much HN loves to bring up the studies that show people fail at tasting wine, it's surprising that nobody has talked about how double blind studies consistently reveal that there's no perceptible taste difference between different types of eggs (once color is controlled for):

Washington Post:

> The egg industry has been conducting blind tastings for years. The only difference is that they don't use dish-towel blindfolds; they have special lights that mask the color of the yolks. "If people can see the difference in the eggs, they also find flavor differences," Curtis says. "But if they have no visual cues, they don't."

> Only one factor can markedly affect an egg's taste, and that is the presence of strong flavors in the feed. "Omega-3 eggs can sometimes have a fishy taste if the hens are fed marine oils,"

http://www.washingtonpost.com/wp-dyn/content/article/2010/06...

Serious Eats:

> It was pretty clear evidence that as far as eggs go, the mindset of the taster has far more bearing on the flavor of the egg than the egg itself.

http://www.seriouseats.com/2010/08/what-are-the-best-eggs-ca...

Journal of Product and Brand Management:

> Respondents indicated that darker yolk color results in better taste, whereas results of a blind taste test indicated that consumers were not able to distinguish any significant difference in the taste of different types of eggs and yolk colors (Fearne and Lavelle, 1996b). Similarly, respondents indicated that brown regular and specialty eggs are tastier than white regular eggs. Fearne and Lavelle (1996b) reported that in a blind test consumers could not make a distinction between the tastes of regular and specialty eggs.

http://ps.oxfordjournals.org/content/90/5/1088.full

It's an extremely potent example of confirmation bias: even people who are normally quite science-minded and skeptical somehow find loopholes that exempt their personal experience from the data when confronted with evidence that their fancy eggs taste no better.

8
dlau1 2 days ago 0 replies      
Shameless plug: the company I just started working at connects producers with consumers. We handle the logistics of getting the fantastic products from our producers to you.

If you are in the sf bay, nola, la, or nyc, check it out at http://goodeggs.com

Here is the SF egg section (we deliver to sf, east bay, and the peninsula)

https://www.goodeggs.com/sfbay/dairy/eggs

9
atourgates 2 days ago 0 replies      
The article somewhat pans Certified Humane, but apart from raising your own chickens or buying from a farmer you trust, it's far and away the best alternative that you can find at many grocery stores.

Their comparison chart gives a good overview of the program[1], but here are a few key facts:

* It's a completely independent organization that maintains that independence very intentionally.

* It has rigorous compliance and inspection requirements, and everything about the program is transparent and available online.

* The birds are required to be out of doors at least 6 hours per day, every day, year round, with 108 sq. ft. per bird.

Animal Welfare Approved is another good, independent and transparent program, but I've never actually seen them in a grocery store.

[1] http://certifiedhumane.org/wp-content/uploads/2014/05/Laying...

10
eggman47 2 days ago 0 replies      
Most of the time we use eggs, they're just needed as "glue" to make things stick together, or for leavening, moisture, etc. They can easily be replaced by tapioca flour and water, flax and water, or some other substitute. It's probably cheaper as well. Save the eggs for your quiches, omelettes, and the like, where they actually make a difference.

http://www.wikihow.com/Replace-Eggs-in-Your-Cooking

11
anatoly 2 days ago 2 replies      
I was shocked to read that eggs generally have pleasing deep-yellow or orange-red yolks because the egg industry spends a lot of money to put special additives into chicken feed just for that purpose. There's no other benefit from those additives.

http://shkrobius.livejournal.com/375927.html

I think about that now every time I see a pleasing, healthy-looking, tasty-looking deeply yellow yolk on my plate.

12
aabajian 2 days ago 2 replies      
The most informative thing for me was that "organic" actually has an official USDA definition:

http://www.usda.gov/wps/portal/usda/usdahome?contentidonly=t...

Up until now, the med student in me had the knee-jerk reaction that "organic" meant carbon-based (+ nightmares from ochem), which applies to almost everything we eat.

13
jere 2 days ago 2 replies      
I've known about these for a while and always look for pastured eggs, but I never find them outside of a farmers' market.

>No Hormones... It's like putting a label on a cereal box that says, "No toxic waste."

http://xkcd.com/641/

14
ndespres 2 days ago 1 reply      
Of course an article on our food system is in all of our interests (Soylent customers, perhaps, excepted), but I'm always happy to see an article like this at the top of HN. I feel that this industry is particularly ripe for innovation, and it's where I'm presently working. Though it turns out that feeding and watering all the animals every day takes up all the time I thought I'd have to design new systems to track egg consumption and greenhouse temperatures with my Arduino.

Turns out all that work outside is just as gratifying without the electronic hassles I imagined I'd bring to it.

I sell eggs from chickens which I believe have the best possible life, and I hesitate to use most of the terms listed here, even the most positive ones, because of their lackluster connotations. I know what "free range" means at the minimal end of the spectrum, so while my birds have acres to forage for food, someone else may not be so generous. I don't want us classified under the same umbrella. So I invite all my customers to come and see where their food is grown, in the environment where I believe it should all optimally come from.

And yes, the eggs are healthier and firmer, they last longer (even unrefrigerated), and they vary depending on what the chickens have foraged that day. Not to mention the beautiful rainbow of shell colors, which vary by breed.

http://homestead.sevenarrowseast.com/wp-content/uploads/2012...

15
kalleboo 2 days ago 2 replies      
In contrast, here are the regulations in Sweden (probably similar elsewhere in the EU):

* Battery cages: Banned

* "Caged hens": Cage size must be at least 750 cm^2. Must have sand, a nest, and a stick to sit on.

* "Free range indoors": Floor space must be at least 1,111 cm^2 per hen. 1/3 of the floor area must be sand or similar. Must have access to a nest, at least 15 cm of stick to sit on.

* "Free range outdoors": At least 1,111 cm^2 space per hen. Must have access to an outdoors area of at least 4m^2 per hen. Same sand/nest/sitting stick rules as indoors.

* "Organic": At least 1,664 cm^2 space per hen. Outdoors area of at least 4m^2 per hen. Outdoors area must have grass growing. 1/3 of floor area must have sand or similar, must have a nest, must have a sitting stick of at least 18 cm. Must be fed with organic feed.

* "KRAV Organic certified": Organic + Must have access to root vegetables to eat at. The farm can't be leaking stuff from the fertilizer into the surrounding nature.

Many of the large supermarket chains have stopped selling caged eggs completely, which now make up less than 10% of sold eggs. Around 12% are organic.

Swedish eggs are 100% salmonella-free.

16
xacaxulu 2 days ago 3 replies      
It's so wonderful to now live in a forthright European country that cares for its citizens and doesn't permit this kind of fraudulent advertising of unnatural and unhealthy modifications of foods. Firms like Monsanto are barred from the country, as are GMOs in general, and you'd be hard-pressed to find anything other than natural eggs from free-range chickens, untreated with any hormones or antibiotics. It seems the inverse is the case in the US.
17
pvaldes 2 days ago 0 replies      
We often forget that chickens do not have human brains.

The subliminal (and false) premise in all these articles is that chickens are a sort of small, weeping little people living very sad lives because they don't live the way humans do. That framing is simply terrible.

Living on a farm, even the best of farms, in Texas or New Jersey, could just as fairly be called a totally artificial and unnatural life from a chicken's point of view. They are rainforest birds "cruelly" placed in deserts, or in states with winters so harsh they live indoors for six months a year. There are many reasons a chicken might not want to go outside; the usual explanation, "because the farmer is a jerk," is often wrong.

They can cope with this because they are adaptable, and they just don't care as much as we do. But none of it is natural for the species. If we really wish chickens to live "happy chicken lives," maybe we should only raise them in countries like Venezuela or Brazil...

18
batbomb 2 days ago 1 reply      
I get pasture-raised eggs from a farm in Petaluma through a CSA. They are very good, come in a wide variety of shell colors, and the yolks are a deep yellowish orange, but I'm not sure they are worth the $8.50 a carton (technically, they are only grade A because of the variance). On the other hand, a dozen eggs from a decent producer will usually run $4. I'd recommend them for egg dishes, but maybe not so much for general culinary usage.
19
philwelch 2 days ago 0 replies      
I'm trying to figure out if this was a subtly intentional pun or a coincidence of phrasing:

> Pasture-raised birds spend most of their life outdoors, with a fair amount of space plus access to a barn. Many are able to eat a diet of worms, insects and grass, along with corn feed...

cf. http://en.m.wikipedia.org/wiki/Diet_of_Worms

20
felipesabino 2 days ago 0 replies      
Reading only the title, I thought it would be a discussion about fake eggs in China being a hoax [1].

But anyway, this is just more confirmation that these label jargons mean almost nothing useful to the general public. The article reminded me of those funny "natural effect" videos [2], and also of the whole GMO labelling debate that still makes people argue.

And at the end of the day, all I have seen so far is a lot of fear mongering and no debate over whether genuinely useful information should be added to labels, like "how far and for how long did this food travel", "how long was it kept in storage", and so on...

[1] http://sguforums.us/index.php/topic,43295.msg9275536.html#ms...

[2] https://www.youtube.com/watch?v=AftZshnP8fs

21
djokkataja 2 days ago 0 replies      
Handy link from the article if you want to be a bit pickier about the eggs you buy: http://www.cornucopia.org/organic-egg-scorecard/
22
deevus 2 days ago 0 replies      
I wonder how closely this applies to eggs in Australia. I will keep a lookout for "pasture raised", as I do like to buy better quality, more humanely sourced eggs.

It's articles like these that make me love coming to HN, regardless of the fact that there's no "hacking" per se.

23
woodchuck64 2 days ago 0 replies      
Good for them, "Pasture-Raised" just booted out "Free Range" in my personal egg-shopping check list.
24
phil248 2 days ago 0 replies      
For those that are interested, I've found that Certified Humane eggs are the easiest to find and that label carries weight. It's usually $4-$5 for a dozen, more expensive than factory eggs but still pretty cheap for food in general. AWA is stricter, but I've never seen AWA eggs.
25
3pt14159 2 days ago 0 replies      
Note that this only applies to the USA. In Canada, organic eggs are generally what you should be buying. They are about 50 cents an egg and carry the strictest standards for the humaneness of a chicken's life.
26
nartz 2 days ago 0 replies      
It's pretty bogus how the FDA gets lobbied on things like this; it ends up muddling the message to the American public. For instance, read this article on how "organic"/"antibiotic-free" eggs could still be treated with antibiotics as long as it happened early in the egg's life:

http://www.motherjones.com/tom-philpott/2014/01/organic-chic...

Even this Q & A is not the most straightforward on what the antibiotic process is, instead hiding behind 'meets FDA regulations'.http://www.uspoultry.org/faq/faq.cfm

27
aaronbrethorst 2 days ago 0 replies      
Most useful link in the article: http://www.cornucopia.org/organic-egg-scorecard/
28
bevan 2 days ago 0 replies      
The evidence on "omega-3 eggs" appears to be in. There are quite a few studies showing the benefits of eating them over conventional eggs:

http://paleoclaims.com/claims/omega-3-eggs-are-healthier-tha...

A quote from one of the studies on that page:

"Three n-3 PUFA-enriched eggs provide approximately the same amount of n-3 PUFA as one meal with fish.""

29
Dirlewanger 2 days ago 1 reply      
Can pretty much sum up the entire organic/"natural" food craze of the past decade with: advertising as usual.

Good article. Nothing new to see here, though. Hopefully it will wake up some of the people who exclusively go for the brand-name stuff at Whole Foods.

30
markuz 2 days ago 0 replies      
This article boils down to the comments.
31
gd1 2 days ago 2 replies      
"They usually live in aviaries: massive industrial barns"

Oooh, industrial barns...

"And often, Kastel says, industrial fans that suck..."

Oooh, industrial fans. As opposed to the ones that grow on trees.

"One of the most common causes of death was pecking by other chickens."

You mean we give them more freedom, and they kill each other? I thought 'natural' was meant to mean all rainbows and happiness and shit.

30
The Interview
189 points by pc  1 day ago   145 comments top 34
1
avargas 1 day ago 12 replies      
I hope Stripe developers didn't write this code. I got the movie for free, jesus ... http://imgur.com/a/hf8FZ - and I didn't get a job after my interview with Stripe earlier this year.
2
xur17 1 day ago 4 replies      
3
legohead 1 day ago 7 replies      
If it's not free, I hope everyone buys it instead of waiting for a torrent. We should support online releases.
4
doxcf434 1 day ago 1 reply      
That's a pretty old version of nginx:

% curl https://www.seetheinterview.com/ -D - -o /dev/null -s|grep Server

Server: nginx/1.0.12

5
jbrooksuk 1 day ago 1 reply      
Why does https://www.seetheinterview.com keep redirecting to https://www.kernel.com which doesn't even contain The Interview film?
6
bboyan 1 day ago 2 replies      
You can download this as a DRM-free file with a simple curl command. Most people probably wouldn't bother, however, because Stripe makes it so easy to just watch it directly in your browser, with no need to go through the hassle of downloading anything or finding the hard drive space.

This is a great example of content providers finally beating piracy by providing a simpler method of content distribution. I hope more movies come out like this.

7
akhatri_aus 1 day ago 1 reply      
Looking at the FAQs after running $("*").removeClass('hidden'); in the devtools console (a fuller sketch of the trick follows the list):

- only available in the US or from a US based ip address

- only works with US based credit cards

- available for 48h

- will be streamed to the browser
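
A minimal sketch of that console trick, assuming only that the page loads jQuery and hides its FAQ entries behind a CSS class named "hidden" (the class name comes from the snippet above; everything else here is an assumption):

  // Paste into the browser devtools console on the page.
  // jQuery version, as in the comment: strip the "hidden" class
  // from every element, exposing the CSS-hidden FAQ entries.
  $("*").removeClass("hidden");

  // The same idea without jQuery, in plain DOM calls:
  Array.prototype.forEach.call(
    document.querySelectorAll(".hidden"),
    function (el) { el.classList.remove("hidden"); }
  );

The plain-DOM version only touches elements that actually carry the class, which is slightly gentler than running removeClass over "*".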

8
Narkov 1 day ago 1 reply      
So after all the crying about "how dare they try and censor us!", what we get is: only available in the US.

Irony much?

9
cyphunk 1 day ago 0 replies      
lol! Am I the only one trying to understand why Stripe felt the need to give a moralist post? "We're proud to work with organizations defending digital freedoms such as the Electronic Frontier Foundation and the Freedom of the Press Foundation ... We don't always endorse what businesses sell through Stripe, but..." I mean, props, but are you doing this because you figured out that the vibe on the internet is that Sony is a soulless and shallow company, or because you don't want N.K.'s elite hackers to get the idea that you're gung-ho America on this release and target you?
10
bvanslyke 1 day ago 1 reply      
No one else is saying this for some reason, but: it's hilarious that this movie is going to be historic despite (probably) being so shitty.

(In true HN fashion this is where I would write "Full Disclosure: I work for the DPRK, but this does not color my opinions.")

12
Animats 1 day ago 2 replies      
They want a credit card, they want a CVC for a $6 purchase, and they only have a domain-only SSL cert. Sloppy security, people.
13
lukasm 1 day ago 0 replies      
It's US only. Seems torrent there is.
14
rctgamer3 1 day ago 0 replies      
15
UVB-76 1 day ago 0 replies      
Looking at the file names, it seems this film had the codename 'Elephant'?
16
ChrisAntaki 1 day ago 0 replies      
> Online freedom isn't automatic, and it's only through active effort that the internet will stay an open platform for creativity and innovation. We take our role seriously.

Well said, Stripe.

17
baby 1 day ago 1 reply      
https://www.seetheinterview.com/ : I just see a poster of the movie. Is that normal? (I'm in France)
18
rhgraysonii 1 day ago 1 reply      
It redirects to kernel.com on the website they link to, and there is no reference to The Interview. Where have I made a wrong turn, or is everyone else seeing what I am?
19
UhUhUhUh 1 day ago 0 replies      
Somehow, Seth Rogen fits so well in all this! They must be working on a movie (with him) as we speak.
20
thebiglebrewski 1 day ago 0 replies      
Looks like Stripe's Interview page won't even load. You have been banned from r/pyongyang
21
TD-Linux 1 day ago 0 replies      
How does the player work? I'd buy it but I'd rather not watch it in Flash Player.
22
fit2rule 1 day ago 0 replies      
This is the beginning of the new Internet - one where you can watch a mainstream movie from mainstream producers, and PAY FOR IT in a fashion that doesn't make you feel like a criminal.

Thanks, North Korea!

23
51Cards 1 day ago 1 reply      
Sigh, no love for Canada on any of the sources. Soon hopefully.
24
artur_makly 1 day ago 0 replies      
How do expats like us watch it? I tried https://www.proxfree.com/ and no dice. Thanks.
25
pegoty 1 day ago 0 replies      
Where can I download it? I'm in the UK...
26
jstalin 1 day ago 1 reply      
Not on Amazon?
27
pegoty 1 day ago 0 replies      
Where can I download it from?
28
pegoty 1 day ago 0 replies      
Is it downloadable yet?
29
edibleEnergy 1 day ago 0 replies      
feh, sucked
30
angersock 1 day ago 1 reply      
Flagged.

And in the comments here:

  "By jove, go out and buy it!"  "Jolly good, and what might that cost?"  "The low price of 5.99 pounds, my good man!"  "Huzzah!"
I'm not going to say that this is all astroturfing and "organic sales hacking", but sometimes, it looks like a pretty big fucking shill.

31
julie1 1 day ago 0 replies      
This North Korea hacking looks more and more like a marketing PR scam... or at least like very opportunistic whitewashing for a company involved in privacy leaks and censorship.

Yuck.

32
brianxq3 1 day ago 1 reply      
That's a publishable key. It's supposed to be there.

https://support.stripe.com/questions/difference-between-secr...
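
For anyone unfamiliar with the distinction, here is a rough sketch of the split, using the Stripe.js v2 calls of that era; the key value and card details below are made-up placeholders, not anything taken from the site:

  // Client side: the publishable (pk_) key is designed to appear in
  // page source. It can only tokenize card details, never create charges.
  Stripe.setPublishableKey("pk_test_placeholder123");

  Stripe.card.createToken({
    number: "4242424242424242",  // Stripe's standard test card number
    cvc: "123",
    exp_month: 12,
    exp_year: 2016
  }, function (status, response) {
    if (response.error) {
      console.error(response.error.message);
    } else {
      // response.id is a single-use token. It is sent to the server,
      // where the secret (sk_) key -- never exposed to the browser --
      // turns it into an actual charge.
      console.log("token:", response.id);
    }
  });

Finding a pk_ key in page source is expected; it would only be a problem if an sk_ key showed up there.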

33
Kiro 1 day ago 1 reply      
The NK hack was probably just a big PR stunt.
34
afowfow 1 day ago 1 reply      
If another country made a movie about Obama being assassinated, the US would be bombing them. No matter how crazy a current world leader is, I think it's an unwritten law not to depict their assassination.