hacker news with inline top comments    11 Jul 2015 News
I Am Sam Altman, Reddit Board Member and President of Y Combinator. AMA reddit.com
21 points by sandmansandine  3 hours ago   discuss
Revised and much faster, run your own high-end cloud gaming service on EC2 lg.io
433 points by SG-  9 hours ago   173 comments top 34
halotrope 7 hours ago 1 reply      
I have followed the original instructions and after a couple of days of tinkering it is now my go-to service for gaming. I can play AAA titles on my Mac without having them consume precious SSD space, nor does the computer get anywhere near as hot as when I was running them in Boot Camp. The cost is quite affordable when you make your own AMI with Steam and everything preinstalled. Since booting the machine and setting everything up takes around 10 minutes, I also don't get tempted to play when I would have to work. It is a much more conscious decision. I only had to get an ethernet cable because wifi was too flaky. But now it is very solid with a 50M DSL line and an average ping of 60ms to Ireland.
Wilya 7 hours ago 2 replies      
The guide advocates an EC2 security group that allows everything, plus disabling the Windows firewall. That's quite insecure, and unnecessary.

It's probably better, and not more work, to create a security group that only allows:

* UDP on port 1194 (OpenVPN server)
* TCP on port 3389 (Remote Desktop)
* ICMP (for ping)
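For anyone scripting this, the three rules above map directly onto the `IpPermissions` structure the EC2 API uses. A sketch of that ruleset as plain data (the security group ID below is a placeholder, and the boto3 call is shown commented out):

```python
# Minimal ingress rules for a cloud-gaming instance, expressed in the
# IpPermissions structure used by boto3 / the EC2 API. Everything not
# listed here stays blocked by the security group's default deny.
INGRESS_RULES = [
    {"IpProtocol": "udp",  "FromPort": 1194, "ToPort": 1194,  # OpenVPN
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    {"IpProtocol": "tcp",  "FromPort": 3389, "ToPort": 3389,  # Remote Desktop
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    {"IpProtocol": "icmp", "FromPort": -1,   "ToPort": -1,    # ping (-1 = all ICMP)
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
]

# With boto3 installed and credentials configured, you would apply them with:
# import boto3
# ec2 = boto3.client("ec2")
# ec2.authorize_security_group_ingress(GroupId="sg-xxxxxxxx",
#                                      IpPermissions=INGRESS_RULES)
```

If your home IP is static, tightening each `CidrIp` to that address instead of `0.0.0.0/0` locks things down further.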

z3t4 6 hours ago 4 replies      
We might just have seen the future of PC gaming DRM: you pay per hour instead of a one-off payment.

There's one problem though, and it's latency: even 50ms will feel very laggy. We need more decentralized data centers! With a data center in each city you could get latency down to less than a millisecond.

I think the next digital revolution will be low latency, and a flora of new services coming with it.
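A quick sanity check on the data-center-in-each-city point: light in fiber travels at roughly 200,000 km/s (the usual ~2/3-of-c approximation), so distance alone sets a hard floor on round-trip latency, before any routing or queuing delay:

```python
def min_rtt_ms(distance_km, fiber_speed_km_s=200_000):
    """Lower bound on round-trip time from fiber propagation alone
    (~2/3 the speed of light). Real routes add routing and queuing
    delay on top, so measured pings are always higher."""
    return 2 * distance_km / fiber_speed_km_s * 1000

# A data center 100 km away has a propagation floor of ~1 ms round trip,
# which is why sub-millisecond latency implies a data center in (or very
# near) your city. Dublin is ~1,000 km from much of western Europe, a
# ~10 ms floor, so the 60 ms ping reported above is mostly last-mile
# and processing delay rather than distance.
```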

TheGRS 8 hours ago 1 reply      
I really appreciate a guide that takes you through the process, giving me a chance to understand all of the steps, before sharing the pre-packaged solution at the bottom.

I was a little surprised by the cost as well. At the rate that I'm gaming these days it would be like $10-20 per month, that's pretty damn good (price of games not included obviously).

dsmithatx 46 minutes ago 1 reply      
I read the entire thing and I think it's cool that this is possible nowadays. However, I got to thinking: at $0.50+ per hour, if I play 6 hours a night, that's $3.00 a day on weeknights. Add $8 of weekend play and it comes to $23 per week, which equates to $1,196 per year (23*52). Basically I'd much rather invest in a gaming rig. My CPU and GPU haven't required upgrades for years now. At least if I invest in a gaming rig I actually have a gaming rig.

While I respect the technology and find it fascinating and cool, it feels like leasing a car I'll never own versus owning one and even buying a new one every few years at a much lower price. For those who game a few hours a week, however, I can see this being a cheap alternative to a gaming rig.
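The arithmetic above can be sketched as a tiny cost model (taking the $8 of weekend play at $0.50/hr to mean 16 hours):

```python
def weekly_cloud_cost(rate_per_hour, weeknight_hours, weekend_hours):
    """Weekly streaming bill: five weeknights of play plus weekend hours,
    all billed at the same hourly spot rate."""
    return rate_per_hour * (5 * weeknight_hours + weekend_hours)

weekly = weekly_cloud_cost(0.50, 6, 16)  # 6 h/night, 16 h over the weekend
yearly = weekly * 52
# weekly -> 23.0 and yearly -> 1196.0, matching the comment: roughly the
# price of a decent gaming rig every single year at that usage level.
```

The crossover is the real takeaway: a heavy gamer beats the cloud price quickly, while someone playing a few hours a week may never reach it.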

feld 8 hours ago 1 reply      
This is impressive, but you should probably not use his AMI unless you use your own uniquely generated OpenVPN certificates/keys.
rl3 4 hours ago 2 replies      
One of the more exciting possibilities afforded by DIY cloud game streaming is the ability to interactively share single-player gaming experiences with people, in games that otherwise do not support co-op. Games like FTL and XCOM: Enemy Unknown come to mind.

However, one thing I would be extremely wary of is running your Steam account from AWS or any other server environment. The last thing you want is to get flagged as a bot or VPN abuser and banned; Valve customer support isn't exactly known for being particularly understanding or responsive. Personally I would just load up a throwaway Steam account with a few games and use that.

lectrick 7 hours ago 0 replies      
Somewhere, someone at Valve is noticing this and pitching it around as a new service idea :O
Arelius 6 hours ago 1 reply      
Something I'm a bit worried about, from when I used to run performance-sensitive game servers on a Xen-based virtual machine: no matter how many resources I tried to dedicate to the VM, the Xen scheduler would give hitchy performance, sporadically introducing delays large enough to make playing the game a little painful.

Does anybody know much about the EC2 hypervisor's scheduling, or, in the case of large instances, whether it even runs under a full hypervisor?

glogla 6 hours ago 0 replies      
It seems that the reason this makes sense is spot instance pricing - it wouldn't be economical with a normal instance.

But don't they pull the instance out from under you if someone outbids you? Does anyone have experience with that?

And one more question: how is the performance? The OP shows a screenshot of a game running at 1280x800, but that might be because of the MacBook's resolution. Can it do full HD or 4K?
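On the resolution question, a bits-per-pixel heuristic gives a feel for the bandwidth each resolution would demand. The 0.1 bits/pixel figure is a common ballpark for H.264 at streaming quality, not a property of any particular encoder, so treat these as order-of-magnitude estimates:

```python
def h264_bitrate_mbps(width, height, fps, bits_per_pixel=0.1):
    """Rough bandwidth estimate for low-latency H.264 streaming:
    pixels per second times a quality factor. Real encoders vary
    widely with content and settings."""
    return width * height * fps * bits_per_pixel / 1e6

# 1280x800 @ 60 fps  -> ~6 Mbps (the OP's setup)
# 1920x1080 @ 60 fps -> ~12 Mbps (full HD roughly doubles it)
# 3840x2160 @ 60 fps -> ~50 Mbps (4K needs a serious connection)
```

By this estimate full HD is feasible on the commenter's 50M DSL line, while 4K would consume most of it.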

zachlatta 7 hours ago 0 replies      
Big fan of this approach.

I wrote a simple script (https://github.com/zachlatta/dotfiles/blob/master/local/bin/...) to really easily spin up and down the machine I set up for game streaming.
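A rough Python sketch of the same spin-up idea (not his script — the AMI ID, key name, and security group name here are placeholders). The spot request is assembled as plain data so the shape is visible without AWS credentials:

```python
def spot_request_params(ami_id, max_price, instance_type="g2.2xlarge",
                        key_name="gaming", security_group="gaming-sg"):
    """Build the argument dict for boto3's request_spot_instances.
    ami_id should be your prebuilt Steam AMI; max_price is the most
    you're willing to pay per hour before the instance is reclaimed."""
    return {
        "SpotPrice": str(max_price),      # the API expects a string
        "InstanceCount": 1,
        "LaunchSpecification": {
            "ImageId": ami_id,            # placeholder AMI ID
            "InstanceType": instance_type,  # GPU instance from the guide
            "KeyName": key_name,
            "SecurityGroups": [security_group],
        },
    }

params = spot_request_params("ami-xxxxxxxx", 0.25)
# With boto3 configured, submitting it would look like:
# boto3.client("ec2").request_spot_instances(**params)
```

Tearing down is the other half: a companion script would terminate the instance once you're done, since spot instances bill while running.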

kayoone 3 hours ago 1 reply      
We really need a focus on low latency instead of bandwidth, but I guess that's even worse in terms of marketing than upload bandwidth. It's frustrating to know that <10ms latencies are easily achievable with current technology but ISPs just don't care. Lower latency even improves web browsing a ton, plus voice/video calls and basically any kind of realtime interaction. Then again, with 4K gaming becoming more popular, even today's bandwidth will not be enough.
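One way to make the latency argument concrete is a glass-to-glass budget: the delay a player feels is the sum of every stage in the pipeline, and network round trip is only one term. The per-stage numbers below are illustrative guesses, not measurements:

```python
def glass_to_glass_ms(capture=8, encode=5, network_rtt=60,
                      decode=5, display=8):
    """Total input-to-photon delay for remote game streaming.
    Stage values are illustrative defaults: frame capture, hardware
    H.264 encode, network round trip, decode, and display refresh."""
    return capture + encode + network_rtt + decode + display

# With the 60 ms ping reported upthread, the player feels ~86 ms total.
# Cutting only the network term to 10 ms brings it to ~36 ms, in the
# range of a local console on a TV — which is why ISP latency, not
# bandwidth, is the binding constraint here.
```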
jordanlev 2 hours ago 2 replies      
So you can install MS Windows on an EC2 instance without having to pay for a license? How does that work?
pdeva1 3 hours ago 1 reply      
I tried using the prebuilt AMI. However, after installing and configuring Tunnelblick on my Mac, when I connect to the VPN I get: "This computer's apparent public IP address was not different after connecting to <<hostname>>". Now Steam cannot detect the Windows server. What am I doing wrong?
philtar 7 hours ago 11 replies      
Anyone wanna team up and build something like this?
dharma1 3 hours ago 1 reply      
I did this about a year ago to run 3ds Max/V-Ray on an EC2 GPU instance via RDP. It worked OK-ish, but I found it quite clunky to mess about with the AWS interface to start and stop an instance every time I wanted to use it.

Has anyone managed to script something where you just press a button/run a local script and it does all the work, including saving your image to EBS before you turn the thing off and stop paying for the instance?

TD-Linux 5 hours ago 0 replies      
I wonder how the hardware encoders and decoders compare to software implementations. They of course use less CPU, but also generally tend to compress worse and have higher latencies than software implementations. Is nVidia's hardware specially optimized for this use case?
annon 6 hours ago 0 replies      
This would work fantastically with the Steam Link they have coming out: http://store.steampowered.com/universe/link

It uses the same in-home streaming feature of Steam.

rroriz 8 hours ago 1 reply      
Amazing idea! If this could be set up for multiplayer games without much trouble (lag, cheating, licenses), it could be The Next Big Thing :)
Procrastes 6 hours ago 0 replies      
This can work very well for some applications. I have a startup doing something similar to this with the Second Life Viewer with good results. The most painful parts turn out to be in the plumbing around state and change management as you might expect.
ortuna 8 hours ago 2 replies      
I wonder why this works better than Steam In-Home Streaming. I could never get that to be close to 60fps, but the video suggests 60fps.
xhrpost 5 hours ago 1 reply      
This is crazy awesome. Since it uses h264, I wonder how well a Raspberry Pi would work as a client machine. Heck, you might be able to do a whole LAN party just with Pis.
nickpsecurity 6 hours ago 0 replies      
Cool experiment. I thought about trying this for streaming video or music to cheap devices in various places as well. For now, I just use my smartphone and WiFi as it's cheaper. :)
WA 8 hours ago 4 replies      
Anyone tried to play competitive multiplayer games like CS or Heroes of the Storm with such a setup? I can imagine that streaming adds a bit of latency, which isn't a problem in singleplayer games, but could add too much lag for fast-paced multiplayer games. Any experiences?
dogas 7 hours ago 5 replies      
What is the latency of a setup like this? Could I play an intensive FPS and be competitive?
programminggeek 8 hours ago 4 replies      
Am I ridiculous for wanting this to be a sort of on demand service with a small markup?
skellington 7 hours ago 2 replies      
Just curious how he got $0.11/hr for a "spot" instance of g2.2xlarge? Amazon's "on demand" pricing of that config w/ Windows on their website is $0.767/hr.
zubspace 7 hours ago 4 replies      
Does someone have experience hosting a dedicated server on EC2 24/7? How's the performance and is it cost effective? Or is it preferable to host on digital ocean/linode?
spydum 5 hours ago 0 replies      
I hate to admit it, but of all the places I thought cloud services could be leveraged, this just wasn't one of them (keep in mind, I say useful, not necessarily best fit).

This is such a cool idea, makes me realize what other creative solutions are just lurking, ready to slap me across the face.

bsharitt 7 hours ago 0 replies      
I'm going to set this up to see how it compares to my Windows Steam on Wine that sits next to my native Steam on Linux with its smaller library.
bhear 8 hours ago 6 replies      
Has anyone thought of selling preconfigured cloud gaming services?
mullen 7 hours ago 1 reply      
This is actually a cost savings. Windows games are much cheaper than their OSX versions and they are available much sooner on Windows than OSX.
thedaemon 7 hours ago 1 reply      
Has anyone tried this with Nvidia GeForce Experience and a Shield TV? I might try this instead of upgrading my aging desktop.
mo1ok 4 hours ago 1 reply      
This is really important as virtual reality begins to take center stage, but most people don't have the rigs to run it.
Relation Between Type Theory, Category Theory and Logic ncatlab.org
18 points by rndn  2 hours ago   discuss
Ellen Pao Is Stepping Down as Reddit's Chief nytimes.com
599 points by jonmarkgo  3 hours ago   489 comments top 53
nhf 3 hours ago 15 replies      
I think this was the right thing to do from a PR perspective. Having Steve back as the new CEO will definitely be good for the community.

I also applaud Reddit's announcement for calling the community out on their childish BS:

> As a closing note, it was sickening to see some of the things redditors wrote about Ellen. [1] The reduction in compassion that happens when we're all behind computer screens is not good for the world. People are still people even if there is Internet between you. If the reddit community cannot learn to balance authenticity and compassion, it may be a great website but it will never be a truly great community. Steve's great challenge as CEO [2] will be continuing the work Ellen started to drive this forward.

All in all, a good day I think.

MBCook 3 hours ago 5 replies      
I don't like this at all.

Even if she wasn't the right person (I don't know), all the worst elements of the site are going to see this as a victory for their awful behavior and it's going to get worse.

The people who attacked her with sexism and comments about her personal relationships. The people who supported FPH even though they were attacking people in real life and off Reddit, not just posting comments in their personal corner of 'discussion'.

She didn't do a good job of it, but at least she tried to stand up against some of the worst of Reddit.

I worry heavily that if the new person doesn't draw a clear line at the start things are going to get a lot worse in terms of hate/abuse/harassment.

EDIT: After posting this I saw Nilay Patel tweeted basically the same thing: https://twitter.com/reckless/status/619620964658245632

pkorzeniewski 3 hours ago 10 replies      
"Ellen has done a phenomenal job, especially in the last few months," he said.

What exactly has she done that's "phenomenal"? Reddit works pretty much the same as it did several years ago, but in the meantime she managed to piss off the majority of the community, which is the only reason Reddit exists.

minimaxir 3 hours ago 3 replies      
Ellen Pao gives the reason for leaving on /r/self: http://www.reddit.com/r/self/comments/3cudi0/resignation_tha...

> So why am I leaving? Ultimately, the board asked me to demonstrate higher user growth in the next six months than I believe I can deliver while maintaining reddit's core principles.

This is believable because there have been odd business decisions under her watch, not just policy decisions. RedditMade, one of the intended revenue-generating models for Reddit, failed while she was interim CEO. Alienating /r/IAMA probably did not help.

bane 3 hours ago 0 replies      
Right or wrong, fair or unfair, or whatever you think about Ellen, I think most people agree that she had become personally and professionally toxic to reddit as a brand and community and even if she did a great job from here on out, it was going to be an uphill battle to restore community confidence in her as a CEO.

I personally don't believe she had the right qualifications to lead a community-driven site like reddit as it is today, but would have the right qualifications if reddit was going to start making a serious pivot to a more lucrative money making direction via commercial partnerships, advertising, etc.

Reddit may still go that direction, but Huffman won't have the same baggage weighing him down.

(note: this will also likely feed the conspiracy that her turn in the head office was a convenience for her lawsuit, now that she lost, she has no reason to stay in that position)

I agree with other comments chastising the community for the racist/sexist/whatever nature of lots of the negative comments against her. It was childish and dangerous. She had enough issues worthy of reasonable criticism that it wasn't even necessary.

I think this is a good thing for reddit.

devindotcom 3 hours ago 8 replies      
Maybe I missed it, but was there ever any information on why Victoria was fired, or whether Pao actually had anything (or everything) to do with it?

From where I was sitting, it seemed like no one actually learned the full story, which might be confidential or take time to contextualize/safely explain, and everyone immediately threw it in Pao's lap and downvoted any holding maneuvers she and the rest of the staff tried. It was poorly handled, sure, but it seems like there was a lot of finger pointing before anyone knew what was actually happening. For that matter, do we even know now?

If I'm wrong, though, happy to correct my ideas here. (grammar edit)

notsony 3 hours ago 8 replies      
>Sam Altman, a member of Reddit's board, said he personally appreciated Ms. Pao's efforts during her two years working at the start-up. "Ellen has done a phenomenal job, especially in the last few months," he said.

This is clearly nonsense, otherwise there wouldn't have been a grassroots campaign to remove Ellen Pao from her role.

If Sam Altman honestly believes that Ellen did a "phenomenal" job, he should reconsider his own position at Y Combinator.

noir_lord 3 hours ago 8 replies      
This entire debacle and the 'community's' response (the small vocal part that acted horribly) pretty much hammered the last nail into the coffin for me when it comes to reddit.

With the exception of a few niche subreddits and the (few) incredibly well-moderated major subreddits, the whole place has become a negative pit, with horses beaten so badly to death that Findus put them in their lasagna.

Twitter often feels the same way as well (I'm pretty much at the unfollow as soon as someone acts like an idiot stage now).

Ironically the only social network I don't hate is Facebook and that's because I have about 20 people I consider true friends on there, all signal no noise.

bedhead 3 hours ago 2 replies      
During the KP trial I had always kept an open mind towards her arguments...until I later learned she was married to Buddy Fletcher, one of the biggest scoundrels and thieves in the investment world in recent years. The character and judgment of a person who would fall in love and wed someone like that says more than I can articulate. It's oddly reassuring to see my (and many, many others') skepticism about both her judgment and motives validated.
puranjay 2 hours ago 1 reply      
As someone who frequents only a couple of subs on Reddit (which were completely insulated from this fiasco), I have no idea why people were so pissed off.

So she made a bad decision. Big fcking deal.

"She's killing the community!" Well, if your idea of 'community' is making public rape threats (while you use a throwaway) and threatening to kill a person, then maybe your community deserves to die.

Reddit has a *lot* of good. I've been there long enough to see it. But it has a lot of absolute low-lifes clogging its sewers as well.

dvt 3 hours ago 2 replies      
Pretty much had to happen. To say that the Victoria situation was mishandled is a severe understatement. I wonder what will happen with communities like FPH and others (that have since moved to Voat). Will reddit lessen their censorship efforts?

Time will tell. IMO, the problem at hand is that reddit is still trying to make advertisers their bread and butter. And advertisers will never be overly attracted to censorship-free spaces.

Even though I may not agree with her aggressively politically-correct agenda (nor does most of reddit), I think it may have been a smart move from a business dev. perspective.

onewaystreet 3 hours ago 1 reply      
> "It became clear that the board and I had a different view on the ability of Reddit to grow this year," Ms. Pao said in an interview. "Because of that, it made sense to bring someone in that shared the same view."

Does this mean that the board thought Pao was being too aggressive in pushing growth or not aggressive enough? If it's the latter then the Reddit community is in for a shock.

rocky1138 15 minutes ago 0 replies      
We keep hearing over and over again about how it's a small minority of vocal people who spew vitriol in any community, but how about providing some real, hard data?

Reddit has enough data and skill to identify, at the very least, the approximate percentage of users who engaged in this type of behaviour.

I'd rather see the numbers myself than read a press release simply stating something and being asked to believe it.

ljk 3 hours ago 2 replies      
It's interesting how fast people go from hating[1] /u/kn0thing to loving[2] him again.

[1]: https://np.reddit.com/r/SubredditDrama/comments/3bwgjf/riama...

[2]: https://pay.reddit.com/r/announcements/comments/3cucye/an_ol...

luckydude 2 hours ago 0 replies      
I posted this over on reddit but it got lost in the noise:

Cool, I guess. But after having spent some time on voat.co I think reddit will get less and less of my attention (not that anyone gives a shit about me but I suspect I'm not alone).

Reddit's management has destroyed any sense of trust I had in Reddit (I'm looking at you /u/kn0thing, it's not just Ellen, my understanding is you fired Victoria, right? And then grabbed popcorn [I know, cheap shot, but it appears like you really fell out of touch]).

It appears that it is all about making money, which I think is going to be the end of Reddit for some of us. Reddit could have had a decent revenue stream on reasonable ads, but that wasn't enough; it had to be more. That is really troubling, because the next thing you might decide to "monetize" is what each of your users reads. That would make the NSA look like amateurs and would be a massive invasion of privacy. It would also be very easy to monetize. Given all that has been going on, it would appear to be just a matter of time before "user optimized marketing" appears.

Welcome back but the existing management has dug you a mighty big hole. I don't trust you any more.

iblaine 2 hours ago 0 replies      
Being the CEO of reddit is a political position. And Ellen Pao has too much drama in her life to be a good politician. Losing a sexual harassment case, marrying a crook who stole millions...those are events that don't happen by accident.
chaostheory 19 minutes ago 0 replies      
I'm not even going to debate whether or not she was an effective CEO. At the end of the day it's about the lawsuit, and I'm not going to argue the merits of that either. The only thing she should have realized from the start was that you can't have your cake and eat it too. You either choose reddit or the lawsuit. You can't divide your focus between both or you lose both.
sergiotapia 25 minutes ago 0 replies      
Ellen Pao was a scapegoat. She was the face of a lot of changes that didn't sit well with the community. Now the people clamor, they remove her, and the people are happy again.

Notice how they didn't mention anything about reverting the bad changes to the website. ;)

return0 3 hours ago 2 replies      
Isn't it already too late? How can a new captain save the sinking ship? The new CEO will be standing on a double-edged sword: if he reverses course immediately and declares reddit an absolute free-speech environment, the people who wanted a safe space will be disillusioned; if he doesn't, the rest of the users will keep looking for another platform.
ksenzee 3 hours ago 1 reply      
Reddit would do well to hire someone with experience in the association management field. Those folks specialize in managing fractious communities such that the volunteers not only stick around, they're happy to pay for the privilege.
mcintyre1994 39 minutes ago 0 replies      
> The attacks were worse on Ellen because she is a woman,

@sama, how do you explain this claim without ignoring the community's enormous support for Victoria Taylor?

kolbe 3 hours ago 1 reply      
After seeing what Ellen went through, I think Sam will need to raise some more funds to offer a significant pay bump to entice even mediocre talent to fill her void.
muglug 3 hours ago 2 replies      
Will the anti-corporate brigades in these large community-driven sites make turning a profit impossible in the long-run?
lisper 3 hours ago 0 replies      
So... who is replacing Steve at Hipmunk?
robot22 2 hours ago 0 replies      
The one takeaway I have from this situation is that we have an honesty problem. People criticize Reddit as a platform of hate and vitriol, but in reality that only partially describes it. They complain that people on the internet are too free to speak their minds, but perhaps this is a reflection on our society, a place where honesty and the free exchange of ideas is discouraged.

Response to material: http://www.buzzfeed.com/charliewarzel/reddit-is-a-shrine-to-...

Food for thought: https://www.facebook.com/psiljamaki/posts/10153334440110516?...

musesum 54 minutes ago 0 replies      
I wonder if a Law Degree runs counter to running a social network? Where authority bumps up against anarchy. Imagine Peter Thiel running reddit. Both Thiel and Pao have law degrees. Both have been lightning rods. I suspect a JD comes in handy for some ventures. Such as Thiel running Paypal or Pao sourcing funding for RPX. In both of those cases, it is about removing ambiguity. For social nets, the opposite holds true. Because, ambiguity is the main product.
tptacek 2 hours ago 0 replies      
Converting large-scale investor dollars into compelling returns using the world's most entitled and monomaniacal message board: not, in fact, an easy job. Pretty sure very few of us could do it either.
gesman 3 hours ago 0 replies      
Wow, I applaud this development!

Now, if Victoria comes back too - that would be a 200% right move for reddit!

slg 3 hours ago 0 replies      
I will be interested to see if anything changes regarding the management of Reddit, or at least the community's opinion of it. I wonder if the community will chalk this up as a win and suddenly forget all of the reasons they have been complaining, which really have nothing to do with Pao in the first place.
smitherfield 3 hours ago 1 reply      
I'm sorry to see it go down like this. Redditors' treatment of her got really ugly over the past few weeks (/r/all after the FPH banning was shocking), and it's disheartening to see people's bad behavior rewarded.
atarian 1 hour ago 0 replies      
Even on the internet, mob mentality wins.
goldfeld 3 hours ago 6 replies      
So can someone summarize the ordeal?
brock_r 2 hours ago 0 replies      
Reddit: The world's largest drunken mob.
golergka 2 hours ago 0 replies      
So, there are two stories people use to sum all the affair up: either "the witch is dead", or "pitchfork mob got what they wanted".

But neither of these stories really fit the information we have right now. Both of them fit some of it, and look realistic unless you look at the whole picture.

The best conclusion we can have here is that we don't actually know what's _really_ going on, just a bunch of facts and a couple of theories.

myrandomcomment 2 hours ago 0 replies      
As CEO she chose to take actions in a manner and method that allowed things to spiral out of control. It was her job to control the message and the blowback. She failed at her job, therefore she needed to go. It is really that simple.
15step 3 hours ago 0 replies      
Great to see Steve back in the fold
scobar 2 hours ago 0 replies      
In the 24th Upvoted by Reddit podcast, Steve and Alexis talked about all the great content and communities hiding within Reddit that go undiscovered. I'm excited to see how they'll try to solve that problem, and hope they find a great solution. Reddit is really great, and it's very cool to see both Steve and Alexis back to enjoy and advance it.
tacos 3 hours ago 0 replies      
Yup, she did awesome, Sam. Especially recently. (Makes me wonder how badly one of these people would have to screw up in order NOT to get the happy handwave as they're booted.)

I didn't even know who she was until "the last few months." Which have been a parade of increasingly-negative press and idiotic behavior. And that's from reading Reuters and the NY Times -- I don't even use Reddit.


Sam Altman, a member of Reddit's board... "Ellen has done a phenomenal job, especially in the last few months," he said.

jstakeman 3 hours ago 1 reply      
It's remarkable how fast and how organized they were.
justonepost 1 hour ago 0 replies      
Don't piss off Kleiner Perkins, that's all I can say...
trhway 2 hours ago 1 reply      
Seems like Reddit hired Ellen without checking her references from her previous job, i.e. Kleiner, and now they're harvesting the same results: insufficient performance and big scandals.

(note: there is nothing about her sex here - just read the case materials and you'll see that she behaved just like a jerk at Kleiner - for God's sake, she complained there that some assistant was using the company fax to send brain scans of his mother, who was dying of cancer)

osetinsky 3 hours ago 0 replies      
rough year for her
scotty79 2 hours ago 0 replies      
Corporations are like sea of cockroaches on the dark floor. They look vast. Roaches have their little fights and wars, but when they make some random noise and draw outside attention, it's funny to look how individual cockroaches run away from the spotlight.
neur0tek 1 hour ago 0 replies      
quelle surprise
ElComradio 3 hours ago 8 replies      
PR that comes out of corporations cannot be trusted. We cannot trust that Altman is being honest that he appreciated her efforts. We can't trust that she and her husband were in love. All of this that comes out of spokesmen is carefully crafted as part of a numbers game.

When do we hear "so and so CEO did a horrible job and was forced out by the board."? Never. So are we to believe there is no such thing as a terrible CEO? Will we hear Sam saying "We made a terrible decision putting her in charge."? Never. Even if it was the actual truth.

Pao does not get a pass on this dynamic for being a woman.

yegor256a 1 hour ago 1 reply      
Who is Ellen Pao?
pfisch 3 hours ago 0 replies      
The real question is will Ellen Pao sue reddit now?
arprocter 3 hours ago 0 replies      
I hope reddit has good legal representation...
tosseraccount 3 hours ago 0 replies      
There's a fine line between being polite and being so boring that it's stifling. Every commenting site has its herd mentality and punishment of thought crimes.

Reddit users just wore it on their sleeves, and trying to suppress them was silly.

It might turn into DIGG 2 pretty fast and might not recover.

The investors and the "community" are just too far apart on this.

post_break 3 hours ago 0 replies      
I think most people just let out a sigh of relief.
kaa2102 3 hours ago 0 replies      
That was quick but the corporate agenda being pushed flew in the face of the Reddit community. Power to the people!
istvan__ 3 hours ago 1 reply      
I am opening a bottle of champagne and at the same time answering my own question: it took 3 weeks for the community to get rid of a tyrant. Well done Reddit!

Friendly reminder that if you are using downvotes for disagreement then you are doing it wrong.

DeepDreamVideo github.com
41 points by albertzeyer  3 hours ago   10 comments top 5
andybak 6 minutes ago 0 replies      
I'm really curious to see how much the original training material affects these images.

Is everyone using the same source? There's a lot of doge in there, along with that bird that always pops up. Why are faces so prominent? Is that an artefact of the training data or inherent in the algorithm? I would guess the former.

mikkom 2 hours ago 2 replies      
This must be one of the most disturbing things I have ever seen..

(link to YouTube version: https://www.youtube.com/watch?v=oyxSerkkP4o)

iLoch 24 minutes ago 0 replies      
This is the stuff of nightmares, congratulations!
abstractbill 1 hour ago 1 reply      
There's a nice live version of this effect on Twitch that you can guide by typing things in chat: http://www.twitch.tv/317070
bla2 1 hour ago 0 replies      
https://www.reddit.com/r/deepdream/ has many static images with this effect (and a few videos too).
OPM director resigns bbc.co.uk
16 points by jackgavigan  1 hour ago   8 comments top 6
magicalist 7 minutes ago 0 replies      
For those that missed it, I suggest this look at how comically bad security was (is) at OPM, to the point there were security reports saying the loss of productivity by shutting down the database completely would be preferable to the ridiculously vulnerable state it was in. No one listened.


celticninja 51 minutes ago 1 reply      
If governments cannot keep data like that safe what are the chances of most other data owners being able to do the same? The way that we need to give it out all over the Web is a recipe for disaster as we are shown time and time again.
CydeWeys 35 minutes ago 0 replies      
Good. I last worked for the government in 2007 and all of my data was still included in the breach. Heads rolling is the very least of what needs to happen when a security error of this magnitude occurs.
deepnet 9 minutes ago 0 replies      
Plaintext in 2015! Inexcusable.
adzicg 47 minutes ago 1 reply      
> usernames and passwords that background investigation applicants used

If this is correct, I guess that means passwords were stored in cleartext or reversibly encrypted. That is either epic stupidity, or someone wanted to keep the passwords of potential employees so they could use them elsewhere.

appleflaxen 28 minutes ago 0 replies      
Yes; this is exactly what her psychological profile suggested she would do.


Psychologists Shielded U.S. Torture Program, Report Finds nytimes.com
103 points by mcgwiz  4 hours ago   36 comments top 7
littletimmy 1 hour ago 3 replies      
It is enlightening to see that the Joseph Mengeles and Shiro Ishiis of our times are not subject to any sanction.

Remember folks, this is only the torture that we know. There are dozens of CIA black-sites around the world about which we know next to nothing. It is not too hard to imagine a regime that can torture people via waterboarding can also torture people by, say, forced heroin withdrawal and then just kill them.

Just goes to show, the only morality is that of the victor.

c_prompt 3 hours ago 0 replies      
A few related articles of interest:

Why I Am Not a Member of the American Psychological Association - http://www.chalquist.com/apa.html

EMAILS REVEAL CLOSE RELATIONSHIP BETWEEN PSYCHOLOGY GROUP AND CIA - https://firstlook.org/theintercept/2015/04/30/emails-show-cl...

itistoday2 3 hours ago 1 reply      
I would like to take this moment to point out that John Kiriakou is doing an Indiegogo campaign:


It's because of people like him that we know about this at all.

He put his life on the line for us. Only he was imprisoned. The torturers [1,2] are still free.

Until there are consequences to torture it will continue to happen, and Kiriakou's sacrifice will be in vain.

Think about what the pizza delivery man has done for you, and compare that to what Kiriakou did for us all.

Consider sending him a monetary thank you for his service to you, to our country, and to humanity. I did.

[1] http://www.ibtimes.com/who-cia-torture-report-george-tenet-h...

[2] http://www.publicintegrity.org/2014/12/12/15221/whos-respons...

jimrandomh 3 hours ago 1 reply      
Go straight to page 12 of the report, the section titled "key players". Every person named in that section should be brought before a court of law, to determine the truth or falsehood of the accusations in this report. The US criminal justice system has taken a lot of blows to its credibility recently; this is a chance to demonstrate that it still does what it's supposed to... or that it doesn't.

> The APA official who led this behind-the-scenes coordination with the DoD officials was the Ethics Director, Stephen Behnke, and the key DoD official he partnered with was Morgan Banks, the chief of psychological operations for the U.S. Army Special Operations Command and the head of the Army SERE Training program at Ft. Bragg. During the task force's pre-meeting communications, during its three-day meetings, and in preparing the task force report, Behnke and Banks closely collaborated to emphasize points that followed then-existing DoD guidance (which used high-level concepts and did not prohibit techniques such as stress positions and sleep deprivation), to suppress contrary points, and to keep the task force's ethical statements at a very general level in order to avoid creating additional constraints on DoD. They were aided in that regard by the other DoD members of the task force (who, for the most part, also did not want ethical guidance that was more restrictive than existing DoD guidance), and by high-level APA officials who participated in the meeting.

> Other leading APA officials intimately involved in the coordinated effort to align APA actions with DoD preferences at the time of PENS were then-APA President Ron Levant, then-APA President-Elect Gerald Koocher, and then-APA Practice Directorate chief Russ Newman. Then-APA Board member Barry Anton participated in the selection of the task force members along with Levant, Koocher, and Behnke and in the task force meeting, but was involved substantially less than the others. Other members of the APA executive management group (namely CEO Norman Anderson, Deputy CEO Michael Honaker, General Counsel Nathalie Gilfoyle, and communications director Rhea Farberman) were involved in relevant communications, as described below.

> The other DoD official who was significantly involved in the confidential coordination effort was Debra Dunivin, the lead psychologist supporting interrogation operations at Guantanamo Bay at the time, who worked closely with Banks on the issue of psychologist involvement in interrogations. At times, they were coordinating their activities with the Army Surgeon General's Office. There is evidence that Banks was consulting with other military leaders, likely in the Army Special Operations Command and the Joint Task Force Guantanamo, although this was not the focus of our investigation, in part because of our limited ability to access DoD documents and personnel. Another important DoD official involved in some coordination with Behnke was PENS task force member Scott Shumate, a former CIA official who was head of behavioral sciences for a newly-created counterintelligence unit (CIFA) within DoD, which reported to the Under Secretary of Defense for Intelligence.

> For Banks, Dunivin, and others at DoD, the attention on the abusive treatment of detainees as a result of the media disclosures of Abu Ghraib, the torture memos, the DoD working group report, and other related events created uncertainty and worry about whether the involvement of psychologists in interrogations would be deemed unethical. Some in DoD, such as civilians Shumate and Kirk Kennedy at CIFA, were pushing APA to move forward with action that would show support for national security psychologists and help end the uncertainty by declaring that psychologists' participation in interrogations (with some then-undefined limits) was ethical. Others, like military officers Banks and Dunivin, reacted to APA's movement toward the creation of the task force with concern that APA could head in a negative direction if the task force was not properly set up and controlled, and with awareness that this was an opportunity for DoD.

Stephen Behnke. Morgan Banks. Ron Levant. Gerald Koocher. Russ Newman. Barry Anton. Norman Anderson. Michael Honaker. Nathalie Gilfoyle. Rhea Farberman. Debra Dunivin. Scott Shumate. Kirk Kennedy.

nickbauman 3 hours ago 8 replies      
Before anyone gets into the morality of torture: can anyone even point to open, independently verifiable evidence that shows that torture yields good human intelligence?
chrisbennet 3 hours ago 1 reply      
What ever happened to "do no harm"?
shadowmoses 3 hours ago 0 replies      
American style corruption.
OpenResty A fast web app server by extending Nginx openresty.org
99 points by elektromekatron  6 hours ago   30 comments top 12
dividuum 2 hours ago 0 replies      
Even if you don't intend to build a complete web application using OpenResty, the underlying HttpLuaModule for nginx is awesome on its own if you're using nginx already. It makes it easy (among other things) to build logic around requests before they are handed off to a backend service. I used it for two different problems that would be more difficult to implement otherwise:

* Classifying incoming requests based on various signals to detect DDoS patterns and moving those requests into different request-limit zones. It's used in production at a bigger hosting provider.

* Rewriting incoming requests to a certain path of an existing website to a different shared blog hosting provider and changing back HTML/CSS/JS so everything worked. It didn't end up going into production, but it was pretty easy to build in under 100 lines of Lua code.

So when you're bored and want to learn something useful, have a look at http://wiki.nginx.org/HttpLuaModule. It might help you someday.

pwm 3 hours ago 1 reply      
I have been using OpenResty for years now. The ability to run Lua scripts from within it is great.

Story time: Some time ago we developed a CMS for a large art/production company. One of the requirements was that everything must be access-controlled, including all assets (e.g. images, videos). The site went live and all was good until one day when, for no apparent reason, our monitoring alerted extreme load and then the site went down. Not so good for a company that sells its tickets online... I got on the phone with them, and it turns out they had started using the CMS to store images used for newsletters, which they sent out to some 200K people that morning. Now normally this wouldn't be such a big problem, but since all images were ACL'd, instead of serving static files all requests went to the backend and pretty much killed it. The solution we came up with was to move the initial access control check for assets directly to OpenResty using Lua and Drizzle. So whenever a request came in for an image, OpenResty checked whether it was public; only if it was not did the request hit the backend, otherwise it was served directly. Once we pushed this live the load disappeared and never came back.
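A rough sketch of that pattern in nginx/OpenResty config (the `acl` module, paths, and location names here are hypothetical stand-ins; the real check in the story went through Drizzle, and in production you'd cache answers in a shared dict so repeated hits skip the database):

```nginx
location /assets/ {
    access_by_lua_block {
        -- Hypothetical helper module: returns true when the asset
        -- is public. In the story this was a SQL lookup.
        local acl = require "acl"
        if not acl.is_public(ngx.var.uri) then
            -- Private asset: hand the request to the backend,
            -- which enforces the full ACL.
            return ngx.exec("/protected" .. ngx.var.uri)
        end
        -- Public asset: fall through and serve the static file.
    }
    root /var/www/static;
}
```

The point is that the cheap "is it public?" decision runs in the proxy layer, so the 200K newsletter hits never reach the application.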

Also, I wrote an init script for it (as we needed one for a DRBD setup). Maybe someone will find it useful; here it is: https://gist.github.com/pwm/d3260804b4ade0d81f29

erikcw 6 hours ago 4 replies      
I've had great results using OpenResty in combination with the Lapis[0] framework. Screaming fast and pretty easy to work with.

[0] http://leafo.net/lapis/

nickpsecurity 6 hours ago 0 replies      
Combining a fast, optimized web server with Lua for fast and robust web applications is a technique proven in the field by Barracuda Application Server [1]. It's good to see people trying it for server apps. Thing I like about such a combo is that simpler web stacks are easier to secure and have fewer ways to fail in general.

[1] https://realtimelogic.com/products/barracuda-application-ser...

curiousjorge 4 hours ago 4 replies      
I wonder if people insisting that they use node.js for scalability and high concurrency (two false but popular reasons) will now reconsider their decision because this blows node.js out of the water.


mtourne 2 hours ago 0 replies      
Rewriting the hodgepodge of C, Perl, and other custom bits of Nginx almost entirely in Lua, and hiring its maintainer Yichun in the process, were important steps in scaling up CloudFlare's CDN and making sure each node runs as hot as possible.

More on this from dotScale 2015: https://youtu.be/LA-gNoxSLCE?t=7m40s

MichaelGG 4 hours ago 1 reply      
One big downside was that OpenResty didn't support SPDY, due to an apparent issue when running SPDY and Lua together. SPDY was an easy double-digit percentage perf gain for us, so its omission was odd.

I've run normal nginx with all the plugins (using the nginx-extras package, I think, from the PPA). No problems with Lua and SPDY, at least in my use case (sending rendered HTML snapshots for anything with an HTML MIME type).

onyxraven 3 hours ago 0 replies      
a note: the image serving frontend and storage backend for the original twitter photos integration was written largely in openresty. I love it.
DaemonHN 5 hours ago 1 reply      
I think all the HN traffic might have crushed the website - I'm getting a blank page and when I refresh it seems to be stuck. Not very encouraging, assuming they're using it to serve their main page.
nodesocket 4 hours ago 0 replies      
I know about OpenResty, but I am getting a blank page? Can't even open Chrome developer tools on it. Very strange. Works fine in Safari and Firefox.
xfalcox 5 hours ago 0 replies      
My default web server! Last week I wrote a Discourse SSO implementation using just content_by_lua from OpenResty.
leftnode 5 hours ago 3 replies      
The navigation on this site is possibly the worst ever conceived. You can't link to anything!
How Go was Made sourcegraph.com
58 points by beliu  4 hours ago   3 comments top
seanmcdirmid 1 hour ago 0 replies      
Not very accurate. Rob Pike pursued a unique line of PL design research from a systems perspective that was never integrated into mainstream PL design research. There is nothing particularly wrong with this; it is what happens when communities are isolated (they develop technologies independently; see the Americas vs. Europe/Asia before Columbus). The fact that the world is much smaller now (via globalization and the internet) means that diversity necessarily decreases to converge on an identified local optimum. Go is interesting since it rejects this pressure and explores an alternative path where PL technology could go, which seems to have paid off (it has use cases and has attracted users).
You're in a space of twisty little mazes, all alike strangelyconsistent.org
114 points by kamaal  8 hours ago   13 comments top 6
mrcactu5 7 hours ago 1 reply      
this is known as a uniform spanning tree https://en.wikipedia.org/wiki/Spanning_tree

these can be sampled using Wilson's algorithm for Loop-Erased Random Walks https://en.wikipedia.org/wiki/Loop-erased_random_walk#The_un...

Here is a nice visualization of Wilson's algorithm using d3.js by Michael Bostock (NY Times) http://bl.ocks.org/mbostock/11357811
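Wilson's algorithm is short enough to sketch directly. The loop erasure falls out of recording only the *last* exit direction taken from each vertex during the random walk (a minimal Python sketch on a small grid graph, not tied to the article's maze encoding):

```python
import random

def wilson_ust(vertices, neighbors, seed=None):
    """Sample a uniform spanning tree via Wilson's algorithm:
    repeatedly run a loop-erased random walk from an unvisited
    vertex until it hits the growing tree."""
    rng = random.Random(seed)
    vertices = list(vertices)
    in_tree = {vertices[0]}       # root the tree at an arbitrary vertex
    parent = {}                   # edge set, stored as child -> parent
    for v in vertices:
        if v in in_tree:
            continue
        # Walk from v; overwriting path[u] each visit erases loops.
        u, path = v, {}
        while u not in in_tree:
            path[u] = rng.choice(list(neighbors(u)))
            u = path[u]
        # Retrace the loop-erased path and graft it onto the tree.
        u = v
        while u not in in_tree:
            in_tree.add(u)
            parent[u] = path[u]
            u = path[u]
    return parent                 # len(parent) == len(vertices) - 1

# Example: sample a maze on a 3x3 grid.
def grid_neighbors(cell, n=3):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < n and 0 <= y + dy < n:
            yield (x + dx, y + dy)

cells = [(x, y) for x in range(3) for y in range(3)]
tree = wilson_ust(cells, grid_neighbors, seed=1)
```

Each run returns a spanning tree of the grid, and every spanning tree (i.e. every perfect maze) is equally likely.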

jerf 6 hours ago 0 replies      
It seems like it would be way easier to think about it the opposite way... pick a space, start enumerating the possible connections it can have to its neighbors recursively, with a simple algorithm that won't pick already picked neighbors, and then read the bit string off the result. That's easily done by starting with all 1s, and then setting to 0 the bit corresponding to the choice you just made.

Either direction is, of course, the same in theory.

More excitingly than that, you may be able to contribute to OEIS now, if you can work yourself out a few more terms: https://oeis.org/search?q=1%2C1%2C28%2C12600&sort=&language=...

jdjdps 1 hour ago 0 replies      
I like the epilogue. I too wish I could go back to my younger self and describe this understanding. I would try to explain to myself that each pattern can be seen as an object in and of itself. That an algorithm is a way to move between these objects in a specific way that marries with a particular human goal. I would try to explain that the steps of an algorithmic process, taken together, can be seen independently of time as a pattern in and of itself, and as such is itself an object. A process is a noun, a thing just as much as a chair or a lightbulb. I would say that in order to find an algorithm to solve a goal, all one needs to do is imagine the shape formed by this goal and construct that shape from the shapes that are readily available to you. Be they finger movements on an abacus or bitwise operations in a computer memory. And then I would explain that the universe can be seen in this way. The entire thing as a single object out there in phase space. I would tell myself that I suspected that all possible shapes exist, that the shape of your life exists only as much as the shape of a thing that you imagine while dreaming. This was the understanding I have been searching for since my early teens. It is such a joy to have found it; I like to think my younger self would have cherished it as much. But I may have dismissed it as the ramblings of an old fool.
VieElm 7 hours ago 1 reply      
Now watch this question pop up on technical interviews everywhere for days because people read about it here.
comrh 3 hours ago 0 replies      
Interesting post. People often skip over the failures that got them to the solution, but that is usually the really interesting stuff!
Apple and IPv6 Happy Eyeballs ietf.org
96 points by jasonmp85  6 hours ago   24 comments top 5
nailer 2 hours ago 1 reply      
If you're wondering what Happy Eyeballs is: https://en.wikipedia.org/wiki/Happy_Eyeballs
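In rough terms, Happy Eyeballs starts a connection over the preferred address family (IPv6) and, if it hasn't succeeded within a short head start (the RFCs suggest a few hundred milliseconds), races an IPv4 attempt alongside it, keeping whichever connects first. A toy sketch of just the racing logic, with plain callables standing in for real socket connects (this is an illustration, not Apple's implementation):

```python
import concurrent.futures
import time

def happy_eyeballs(preferred, fallback, head_start=0.3):
    """Race two connection attempts, giving `preferred` a head start.
    `preferred`/`fallback` are zero-argument callables (stand-ins for
    IPv6 and IPv4 connects); returns the first successful result."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        first = pool.submit(preferred)
        try:
            return first.result(timeout=head_start)  # preferred won quickly
        except concurrent.futures.TimeoutError:
            pass   # too slow: start the fallback in parallel
        except Exception:
            pass   # preferred failed outright: fall back
        second = pool.submit(fallback)
        for f in concurrent.futures.as_completed([first, second]):
            if f.exception() is None:
                return f.result()  # first attempt to succeed wins
        raise ConnectionError("both address families failed")
```

The effect is that working IPv6 gets used, while broken or slow IPv6 costs users only the head-start delay instead of a full connection timeout.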
azernik 38 minutes ago 0 replies      
Apple has done amazing things for IPv6 on the client side; first defaulting to IPv6 for link-local traffic, and then being an early adopter of Happy Eyeballs and putting relatively good IPv6 support in their AirPorts.

Many brownie points!

jdorfman 5 hours ago 2 replies      
Am I reading way too far into David's "CoreOS Networking Engineer" title?
jmount 4 hours ago 0 replies      
It would be hard to be worse. If I turn on too much of IPv6 on OS X, DNS takes forever (yes, I did check the IPv6 config).
jasonmp85 6 hours ago 3 replies      
I don't think this update really changes anything about the status quo, then, other than the fact that IPv6 seems to have a much higher likelihood of being favored (based on the ratios provided by the poster).

If you had privacy concerns about using IPv6 before, they remain, and (presumably) you can still disable it at a systemwide level.

Personally, if you're sophisticated enough to be using a VPN for privacy purposes, you can probably figure out whether or not your given solution supports IPv6.

In addition, I disagree that IPv6 is some sort of abject reduction in privacy. Many households already have a single member, so IPv4 wasn't providing much cover there. And (speaking from some experience) IP addresses aren't the best resource when doing tracking: that's what evercookies and supercookies are for.

Impulsive Rich Kid, Impulsive Poor Kid priceonomics.com
10 points by ryan_j_naughton  4 hours ago   discuss
Japans New Satellite Captures an Image of Earth Every Ten Minutes nytimes.com
151 points by revorad  9 hours ago   23 comments top 8
Syrup-tan 1 hour ago 2 replies      
I wrote a simple shell script[0] to scrape and output the latest image.

It uses the tiles from their online satellite map[1], and can output images in increments of 1x1, 2x2, 4x4, 16x16 tiles (each tile being 550px by 550px). Here is an example with 2x2 [2]

If you have any suggestions or bugfixes, feel free to fork or comment.

EDIT: Also works for a single tile[3]; also edited for clarity.

[0] https://gist.github.com/Syrup-tan/1833ba1671c7017f0d59

[1] http://himawari8.nict.go.jp/

[2] https://denpa.moe/~syrup/himawari8.png

[3] https://denpa.moe/~syrup/himawari8-single.png

johansch 4 hours ago 1 reply      
The resolution of the "full disk" (i.e. whole earth) natural color images appears to be 11000x11000 pixels every 10 minutes. I can't find any realtime access to these images though - could anyone else?

They do have a cloud service for disseminating the imagery, but only for "official use":


"Until Himawari-8 becomes operational, NMHSs wishing to release Himawari-8 data and products to the public are requested to consult with JMA beforehand."

Edit: Here is at least a tile-zoomer with some sort of realtime access to high-res imagery: http://himawari8.nict.go.jp/

pavel_lishin 6 hours ago 2 replies      
Will there be a place where they can be downloaded? A live Planet Earth desktop wallpaper would be pretty great.
Animats 5 hours ago 0 replies      
Nice. Japan needs better weather data; too many hurricanes and too much coastal development. From geostationary orbit, the resolution has to be low, but it's always on.

The US has two geostationary weather satellites, which are usually parked roughly over Panama and Hawaii. Neither has good coverage of Japan. Korea's COMS satellite does, though. China has several, including one that's usually pointed roughly at Taiwan.[1] Right now, you can see the hurricane that's due east of Shanghai.

[1] http://www.hko.gov.hk/wxinfo/intersat/fy2e/satpic_s_vis.shtm...

sosuke 6 hours ago 0 replies      
The Earth is so beautiful.

I saw GOES-R http://www.goes-r.gov/ and the pronunciation I heard in my head made me think of Ghostbusters.

state 5 hours ago 1 reply      
I have always hoped that someday Google Earth would just be live.
bargl 6 hours ago 0 replies      
So my first thought was: will this replace the Doves by Planet Labs? That was an earlier story on HN. https://news.ycombinator.com/item?id=8158295

It won't, because these are geostationary satellites (if I read the post correctly). So you'd need at least 3 of these to get a good image, and that's not even considering some of the bigger issues with this. I also don't think the resolution is on par. But the images will be really cool to see.

Link to Planet Labs: https://www.planet.com/story/

ChuckMcM 2 hours ago 0 replies      
I find it amazing that we can do this sort of thing. On the animation, though, the fact that the terminator line changes angle is a bit unnerving.
Data on 7% of Americans Were Just Hacked, Now What? onename.com
67 points by shea256  4 hours ago   48 comments top 20
Litost 10 minutes ago 0 replies      
This might well be the dumbest thing I've ever said on the internet, but extrapolating from "data on 7% of Americans just got hacked" to the premise that nothing is actually secure:

a) What would happen if we embraced this and just made all information freely available?

b) Is one of the likely/possible end or transitional states of the human race, all information being freely available and presumably along with it, a more enlightened approach to dealing with it?

c) Are there any good sci-fi books where this is explored?

bargl 3 hours ago 0 replies      
Is it sad that, because I have worked on government systems in the past, this does not surprise me at all?

It makes me mad, but it is not at all surprising. The negligence on government software is crazy. That is on top of the regulations that basically don't allow developers to use new/open source technology.

While new technologies wouldn't have prevented this by themselves, they might have made it easier to encrypt data so the devs would have said, "oh yeah we can do that". Or they might have had defaults that prevent simple things like cross site scripting.

jessriedel 3 hours ago 3 replies      
I wish alternative strategies like "stop having the government collect and store information" would be considered in these situations.
eli 3 hours ago 3 replies      
> Worse, access to ALL of this information was given to certain foreign contractors, some of whom were in China.

Pretty sure this is unproven and, regardless, had nothing to do with the hack.

informatimago 2 hours ago 2 replies      
I don't see that as a problem. At all. The US government (NSA, CIA, etc) has files on most of the people on the planet (including close spying of most governments, politicians and important corporations worldwide). I don't see how somebody else having 20 million records on US people would change anything.

On the other hand, if personal and important information about the activities (behind the curtain) of all those politicians, banksters and big corporations, american or not, was accessible to the public, perhaps things would change.

Shivetya 3 hours ago 1 reply      
I am not sure which is actually the worse thing we learned here: that this many people were hacked, or that this percentage is/was employed by the US government.
TheMagicHorsey 3 hours ago 1 reply      
Why is everyone so shocked? Has anyone ever talked to a friend that works for the Federal govt.? They are well known to be completely incompetent when it comes to technology. Even the DoD, which gets billions of dollars for cyber defense, often doesn't do things right.

How can you expect the Fed. Govt. to handle things competently when some of the best paid private contractors F' things up too. Security is hard.

What IS a bit surprising is not the fact that they were hacked, but that they actually found out they were hacked. From what I understand, the Fed. Govt. has lost even more important data (like designs for weapon systems) and not even realized it until years later, when the technology shows up in foreign weapons.

sologoub 1 hour ago 0 replies      
Does anyone know if this affects immigration records, as I'm pretty sure they collect fingerprints and such?
mangeletti 3 hours ago 1 reply      
The article's title was just edited[1] to read, "Data on 7% of Americans Was Just Hacked, Now What?".

This is apparently a living document.

1. http://webcache.googleusercontent.com/search?q=cache:WKgL8jW...

jganetsk 3 hours ago 1 reply      
Does anyone know if the OPM's data included Global Entry?
carl7081 1 hour ago 0 replies      
But hey - they erase their disks 7 times and spike them before they throw them away - so we are safe now.
Qantourisc 1 hour ago 0 replies      
If it's such a big deal to lose the data or have it stolen, should you have been storing it in the first place? And if you really do need it, as with fingerprints, start by storing a hash. The other data you wish to keep is current data (not history): SSN, address, family (maybe you should be able to opt out of that, at the risk of them not getting contacted in certain situations). Medical records? Have a standard form that lists anything important: allergies, blood type. Well, that's my (maybe naive) view on it.
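For exact-match identifiers, the commenter's "store a hash" idea looks roughly like password storage. A sketch (the dummy SSN is illustrative; note two caveats: fingerprints are fuzzy matches, so real biometric systems cannot use a plain hash like this, and low-entropy identifiers such as SSNs can be brute-forced even when salted and hashed):

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # arbitrary work factor for this sketch

def protect(identifier, salt=None):
    """Return (salt, digest); only these would be stored, never the raw value."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", identifier.encode(), salt, ITERATIONS)
    return salt, digest

def matches(identifier, salt, digest):
    """Recompute the salted hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", identifier.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

# Example with a classic dummy SSN:
salt, digest = protect("078-05-1120")
```

A breach then leaks salted digests rather than the identifiers themselves, which raises the attacker's cost even if it doesn't eliminate it.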
RRRA 3 hours ago 0 replies      
... Because the government is keeping everyone insecure so they can hack other nations and themselves?
1971genocide 1 hour ago 2 replies      
I am so happy this is happening!

I always felt cryptography was treated as a back-room kind of operation. We are all so busy making iOS apps; real computer science has always taken a back seat.

Hopefully MORE such breaches occur and security receives the kind of investment and respect it deserves.

We are all so focused on this MBA growth bullshit. Time to do some real computer science!

tslug 3 hours ago 0 replies      
I'm always amused by these "here's how to protect data better" articles, because today's security is tomorrow's joke, and that's how we got here with the OPM hack.

The only way to get ahead of it is to make it so that all private data is public and thus devalued. Privacy creates liability. Visibility creates value.

The problem we have right now is the idea that one entity should have domain over any information. That's what we need to get over. It should be shared- all of it, from bank security cameras down to what you're doing in the shower. When all surveillance is shared, you find that people suddenly get a lot more tolerant, because throwing stones in glass houses isn't helpful.

The Earth is a closed system. We have finite, shared resources. Privacy creates the fiction that it's not a closed system. You think that's how the space station works? Is that how you want it to work? No, you want cameras on everything, because if someone decides to experiment with the CO2 scrubbers, it affects everyone.

The same is true here on Earth. We're now in an age where one person or company or government can single-handedly change the habitability of the entire planet, such as Exxon did in the 80s. That's dangerous.

And meanwhile, there's incredibly valuable, life-saving services and conveniences we can all enjoy if we are open with all our surveillance data. How many lives could be saved or improved if we all had a smartwatch measuring our vitals and our food intake and toilet waste were monitored? That one change could single-handedly resolve most of our healthcare issues in the US.

What we really need instead of privacy is complete visibility coupled with a code of conduct that emulates the benefits we expect from privacy. Just because we can see everything doesn't mean we have a right to bother people with what we know. That's the issue we need to address. By all means, check out whomever in the shower, but that doesn't give you a right to interfere with that person's life by commenting on their genitalia. That's the key ingredient we're missing from the privacy conversation. We like privacy because we equate it with civility and thus freedom.

If someone doesn't know something, then they can't make you miserable with it. But that doesn't really work anymore. Even if someone doesn't know something, big data techniques can interpolate what it is they're not supposed to know. What you're really signing up for with "privacy" is granting visibility to only a privileged few- the spy agencies, the multinational companies, the hackers, and anyone willing to pay for the information.

kanusterkund 3 hours ago 0 replies      
Hack me twice, can't get hacked again, right?
mangeletti 3 hours ago 1 reply      
7% of Americans were not "just hacked"[1]. Perhaps the HN title should be changed to avoid misleading users.

The title is very much click bait.

1. https://en.wikipedia.org/wiki/Hacker_%28computer_security%29

trhway 1 hour ago 0 replies      
93% later we'd be able to stop worrying about hacking and love the open Internet.
gmuslera 3 hours ago 0 replies      
99% were hacked the last decade, along with most of the rest of the world, by a US government agency. If people didn't care about that, why do you expect sympathy for this one?
jwildeboer 3 hours ago 1 reply      
Exactly why is $AUTHOR so sure it was a foreign power that hacked OPM? Which proof can $AUTHOR provide besides unfounded rumours? It's just too simple.
James Watt: A Twelve-Year Flash of Genius thonyc.wordpress.com
13 points by samclemens  3 hours ago   2 comments top
colomon 14 minutes ago 1 reply      
This criticism seems so odd to me. Does the author really believe that people reading that are going to assume that Watt knew nothing of steam engines, this idea hit him out of the blue, and he built a working model the next day? Of course he spent the previous years learning the problem domain. Of course it took him time to make his idea work.

But neither of those details changes the fact (I guess, the author does not try to dispute it) that on this stroll, Watt hit upon the key idea which eventually changed the world from muscle- and wind-powered to machine-powered.

New Letters Added to the Genetic Alphabet quantamagazine.org
120 points by treefire86  8 hours ago   48 comments top 15
veddox 6 hours ago 3 replies      
So these guys come along, casually expand a well-established code and test it under precisely one small condition. And when this one test gives them some nice data, they say: "Hey, our stuff is better than everything all of nature has ever done!"

A very intellectually stimulating endeavour no doubt, but I expect some more tests before I would call this good science. Claiming that "the new additions appear to improve the alphabet" is simply extrapolation to the nth degree. [1]

Oh and by the way, when the article claims that

> "the three-biopolymer system may have drawbacks, since information flows only one way, from DNA to RNA to proteins"

that is not correct either. For more information, read up on epigenetics.

[1] Note that this quote comes from the article, not the original paper. The original paper is not quite as cocky (at least not in the abstract, but I don't have full access).

dibujante 7 hours ago 1 reply      
There are actually 84 possible combinations with 4 base pairs if you accept sequences of length up to 3.

However, if you assume all sequences are length 3, you still get 64 combinations.

We only use 20 out of that space. And if you look at how base pairs encode amino acids, for half of them only the first two positions even matter: the third is redundant, so you can determine the amino acid from the first two and ignore it.

Given how underutilized this space is, I'm not convinced that increasing the domain to 216 will lead to much more than the ability to express our current amino acid space with only two base pairs.

MaxScheiber 6 hours ago 0 replies      
I'm not convinced that this is necessarily a good idea biologically, especially after talking to a couple of my friends that are researchers in this space. However, this seems quite interesting for non-biological applications. Take cold storage, for example--with a third base pairing, we can obviously develop an even denser data storage format than with regular DNA.
shiggerino 6 hours ago 1 reply      
It would have been nice if the author would have at least acknowledged that in reality they are nucleobases and not tiny, tiny letters curled up in our cell nuclei. Sure, 6-amino-5-nitro-2(1H)-pyridone and 2-amino-imidazo[1,2-a]-1,3,5-triazin-4(8H)one doesn't say much to us laymen, but just saying letters and not mentioning once what they stand for is really poor reporting.
pavel_lishin 8 hours ago 1 reply      
Nitpick: it wouldn't be a potential 216. Some three-"letter" sequences code for the same amino acids, so instead of 4^3 (64) possible amino acids, only 20 are generated. Adding new letters doesn't change what these old words create, so I think there would only be a possible maximum of 172.

(I think I did my math right, but maybe not.)

(edit: thanks duaneb, had my basic bio facts wrong - codons code for amino acids, not proteins.)
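The math above does check out — a quick illustrative verification:

```python
# 6 letters give 6^3 = 216 length-3 codons. The 64 old all-ACGT codons
# keep their existing meanings, encoding the usual 20 amino acids, so
# the theoretical ceiling on distinct amino acids is the brand-new
# codons plus the original 20.
new_codons = 6 ** 3 - 4 ** 3    # 152 codons containing at least one new letter
max_amino_acids = new_codons + 20
print(max_amino_acids)          # 172
```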

jey 7 hours ago 2 replies      
Neat, but extending amino acids would be even cooler. DNA is mostly "just" a string encoding for information, like binary or hexadecimal. Proteins on the other hand are the actual machines whose blueprints are written in DNA, and they're built out of amino acids. Extending the set of amino acids could extend the set of basic building blocks available to create biomolecular machines.

Of course, teaching ribosomes to handle them and so on will take a lot of additional work, but identifying promising new amino acids would be a nice and major first step.

apalmer 7 hours ago 1 reply      
Not sure I understand the benefit. It's denser, but on the other hand, from what I understand, DNA generally doesn't have much in the way of size constraints. If I remember correctly, large swathes of DNA are inactive and there isn't selective pressure to clean up this wasted space. Couple that with the fact that it is apparently more error prone, and it seems to show why evolution didn't go down this path.

Probably will be very useful for synthetic purposes where there isn't too much concern about fidelity after 10 million years of copying.

DDickson 8 hours ago 3 replies      
Sequel to GATTACA(PZ)?
trestletech 6 hours ago 0 replies      
Oh, good. Bioinformatics data wasn't big enough with two bits per nucleotide.
mjfl 7 hours ago 0 replies      
Very interesting concept. One thing I noticed after developing several genetic algorithms on my own is that they tend to give a good creative hint at what the solution to the problem should be, which the human mind can then interpret and produce what the genetic algorithm was "trying" to approach. I wonder if the same could be true with biological evolution, that there are better ways of storing genetic information than DNA and all that, but that DNA is a good guideline to what should be done.
mbq 5 hours ago 0 replies      
Even with PZ, DNA would have major and minor grooves rather than being the symmetrical double helix beloved by virtually all illustrators - sadly, also those of pop-sci articles...
logfromblammo 7 hours ago 0 replies      
This sounds a lot like a story from 20 years ago, that was probably in Discover Magazine or Scientific American. The new nucleotides at that time were labeled kappa and chi.

And as a point of fact, three-base segments of DNA do not have a one-to-one mapping to amino acids. I also believe that a non-standard use of one of the three stop codons can encode selenocysteine, with similar special cases for other proteins using rare amino acids.

Furthermore, 6^3=216, but that doesn't mean that adding a new base pair can code for that many amino acids. The original set of 4, with 64 possible codons, usually encodes for 20 amino acids (excepting special cases such as selenocysteine). mRNA also employs uracil and tRNA adds hypoxanthine. These lead to "wobble pairs" which in turn allow a single tRNA to match several different-but-synonymous codons.

As it stands now, every codon without a matching tRNA would be a different variety of stop codon.

Now, what would be interesting to me is if the P-Z pairs could match some tRNA anticodons that translate stereoisomers of the standard 20 amino acids (or actually just the 19 that are chiral). That way, the D-(KLAKLAK)2 apoptosis promoter sequence could be synthesized directly by the ordinary transcription-translation mechanics of a cell.

gherkin0 7 hours ago 1 reply      
IIRC, E.T. (from the movie) had DNA with six nucleotides.
NoMoreNicksLeft 5 hours ago 0 replies      
This article is ignorant.

>Why nature stuck with four letters is one of biology's fundamental questions. Computers, after all, use a binary system with just two "letters" - 0s and 1s. Yet two letters probably aren't enough to create the array of biological molecules that make up life. "If you have a two-letter code, you limit the number of combinations you get," said Ramanarayanan Krishnamurthy, a chemist at the Scripps Research Institute in La Jolla, Calif.

This simply isn't true. Even with regular DNA, the word size is 3 nucleotides long... giving you 64 instructions. If I remember my high school biology, only some of these are even used; the rest are duplicates or unused.

Binary would work too, assuming ribosomes and mRNA could expand the word size... you only need 6 bits to do the same as natural DNA.

Is there something I don't know that fixes word size at 3 nucleotides?
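The 6-bit point above can be made concrete — a small sketch (my own illustration, not from the comment) packing a codon into one 6-bit value at 2 bits per base:

```python
# Each of the 4 natural bases fits in 2 bits, so a 3-base codon packs
# into exactly one 6-bit value (0..63) -- binary "would work too".
BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def codon_to_int(codon):
    value = 0
    for base in codon:
        value = (value << 2) | BITS[base]  # shift in 2 bits per base
    return value

print(codon_to_int("ATG"))  # the start codon -> 0b001110 == 14
```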

jacob019 7 hours ago 3 replies      
The "enhanced" DNA escapes into the wild, where a new pathogen spreads over the Earth. All life is defenseless against the bizarre genetic alphabet...
Mandelbrot Set with SIMD Intrinsics nullprogram.com
22 points by kilimchoi  4 hours ago   2 comments top
melling 46 minutes ago 1 reply      
I noticed the plug for https://handmadehero.org

How's that going? Seems like progress has slowed.

The little ssh that sometimes couldn't (2012) naguib.ca
43 points by jasonmp85  6 hours ago   4 comments top 3
karlshea 4 hours ago 0 replies      
foxhill 2 hours ago 0 replies      
tl;dr: single bit flips in a hop to a remote server.

the moral of this story is that the number of layers and abstractions between our code (even our shell scripts - cron jobs in this case) and the network layer is so large that the most subtle of bugs in one of these layers is a massive pain to track down.

i am in awe of the tenacity of these bug hunters.

gpvos 2 hours ago 1 reply      
Is there some kind of TCP signal that the kernel could reasonably send back to the originator if it detected packet corruption?
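For context on the question: TCP segments already carry a 16-bit ones'-complement checksum (RFC 1071), which is how the kernel detects most corruption in the first place — there's just no standard signal back to the sender beyond silently dropping the segment and waiting for retransmission. A rough illustrative sketch of that checksum (not kernel code):

```python
# Sketch of the 16-bit ones'-complement checksum used by TCP/IP
# (RFC 1071). A single flipped bit always changes the sum, so the
# receiver can detect it and drop the segment.
def inet_checksum(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"                               # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]         # 16-bit big-endian words
        total = (total & 0xFFFF) + (total >> 16)      # fold the carry back in
    return ~total & 0xFFFF

packet = b"hello, world"
csum = inet_checksum(packet)
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]    # one bit flip
print(inet_checksum(corrupted) != csum)               # True: flip detected
```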
Earths aliens aeon.co
16 points by Hooke  4 hours ago   discuss
Bitsquatting: DNS Hijacking without exploitation (2011) dinaburg.org
15 points by jasonmp85  7 hours ago   discuss
The Myth of the Psychopath psmag.com
31 points by dpflan  6 hours ago   24 comments top 9
wmil 3 hours ago 3 replies      
I'm not buying this article. There's a very strong tabula rasa / pure nurture contingent that's been refusing to cede any ground despite decades of evidence.

The article is pure opinion... and I have trouble believing that a trio of SFU / Okanagan College professors are the top researchers.

kukx 3 hours ago 0 replies      
Here's an interesting talk from a psychopath: https://www.youtube.com/watch?v=fzqn6Z_Iss0 He previously made another confession, which was less funny and much darker. Unfortunately it's gone now. I guess he had second thoughts and asked to remove it. As far as I remember, he mentioned, for example, that if someone did something bad to him, he could not forget it and he would get even in some way, no matter if years had to pass. Also, another interesting talk related to the subject: https://www.youtube.com/watch?v=bysVPcKQfeY
norea-armozel 3 hours ago 0 replies      
In part, I agree with the author's skepticism that psychopathy is purely a mental disorder. I've known people who would easily be classified as psychopaths but they clearly knew what they were doing was wrong, hurtful, and downright evil (these people oddly happened to be bullies that fixated on me through most of my childhood too).

Sometimes people are just evil and trying to sugar coat the notion that someone can be evil doesn't make it easier to cope with it. All it does is delude people into thinking we should have respect or love for someone when they do wrong by us and betray our trust. I think it's better to treat a so-called psychopath as an adult (if they are one) rather than as some broken down piece of machinery. At least then, you won't find yourself with your guard down and your judgement clouded by false sympathies.

nickpsecurity 2 hours ago 2 replies      
Let's test the title against proposed alternative with one sentence in the article:

"even though that flow is unproven in the scientific literature on psychopathy"

So, a lot of speculation with no evidence to back it makes the existence of psychopaths a myth. Further, the article intentionally ignores research that supports the existence of people with their alleged traits, even neurological differences. Pop psychology and religious rhetoric are far from the only things that went into it. So, the article's claims are unbalanced, weak, and defeat themselves for now with above quote.

dpflan 2 hours ago 0 replies      
Here are two HN submissions from 2 and 4 years ago entitled 'Letter from a Psychopath'. The content of course is what you'd expect from the title.

1. 2 years ago - 340 comments - https://news.ycombinator.com/item?id=6941117

2. 4 years ago - 109 comments - https://news.ycombinator.com/item?id=3094824

dpflan 6 hours ago 0 replies      
This makes me curious about computational psychology/psychiatry/neuroscience and diagnosing individuals with DSM-IV disorders. Anyone have any experience in the field?
jugad 2 hours ago 0 replies      
We don't have to go too far to find psychopaths... a relatively larger percentage of teenagers are "psychopaths" - as in, they lack empathy.

All it takes to convert them into adult psychopaths is lack of guidance by parents and a ruthless outside environment.

I would state, without proof, that the number of people who would be labeled as psychopaths (by their behavior, violent nature and lack of empathy) would be larger in third world countries as compared to, say, US, Norway or Sweden... this is going by my personal experience.

dbbolton 2 hours ago 0 replies      
I'd like to see Robert Hare's response to this.
ExpiredLink 3 hours ago 2 replies      
You don't know what a psychopath is until you meet one. I decided to leave a project because the project lead turned out to be a real psychopath. It's difficult to describe a psychopath's behavior. They are not 'evil', sadistic or cruel. They are clearly self-centered, narcissistic and self-important, but in unusual ways - ways which can be very abusive to dependent persons in their sphere of influence.
Things to Know When Making a Web Application in 2015 venanti.us
168 points by venantius  7 hours ago   119 comments top 22
tspike 5 hours ago 5 replies      
First of all, thanks for the nice writeup. I hate that comments tend to home in on nitpicking, but so it goes. My apologies in advance.

> If you're just starting out with a new web application, it should probably be an SPA.

Your reasoning for this seems to be performance (reloading assets), but IMHO the only good reason for using a single-page app is when your application requires a high level of interactivity.

In nearly every case where an existing app I know and love transitions to a single-page app (seemingly just for the sake of transitioning to a single-page app), performance and usability have suffered. For example, I cannot comprehend why Reddit chose a single-page app for their new mobile site.

It's a lot harder to get a single-page app right than a traditional app which uses all the usability advantages baked in to the standard web.

balls187 6 hours ago 7 replies      
If you're new to web application development and security, don't blindly follow the advice of someone else who is also new to web application security.

You should instead have a security audit done by people who have experience in security, so they can help you identify where and why your system is vulnerable. If no one on your team or at your company has that experience, then hire a consultant.

Security is a hairy issue, and no single blog post/article is going to distill the nuances down in an easy to digest manner.

joepie91_ 1 hour ago 0 replies      
> If you can get away with it, outsource identity management to Facebook / GitHub / Twitter / etc. and just use an OAuth flow.

The thing that everybody seems to overlook here: this has serious legal consequences.

You are demanding of your users that they agree to a set of TOS from a third party, that does not have either their or your best interests at heart, and that could have rather disturbing things in their TOS - such as permission to track you using widgets on third-party sites.

Not to mention the inability to remove an account with a third-party service without breaking their authentication to your site as well.

Always, always offer an independent login method as well - whether it be username/password, a provider-independent key authentication solution, or anything else.

> When storing passwords, salt and hash them first, using an existing, widely used crypto library.

"Widely used" in and of itself is a poor metric. Use scrypt or bcrypt. The latter has a 72 character input limit, which is a problem for some passphrases, as anything after 72 characters is silently truncated.
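For illustration, the salt-and-hash pattern with the stdlib's `hashlib.scrypt` (Python 3.6+) — the cost parameters below are just an example, not a tuned recommendation, and bcrypt itself would need a third-party package:

```python
import hashlib
import hmac
import os

# Minimal salt-and-hash sketch using scrypt from the standard library.
# n/r/p values here are illustrative only.
def hash_password(password: str, salt: bytes = None):
    salt = salt or os.urandom(16)   # unique random salt per user
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong", salt, digest))                         # False
```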

quadrature 4 hours ago 1 reply      
This is a bit of a pet peeve of mine, but that banner image is 10 megabytes; it can be compressed down to 2 MB without any perceptible loss of quality. Heck, it could probably be shrunk further if you can accept a bit more loss, because most of the image is blurry and noisy anyway.

heres a compressed version: https://www.dropbox.com/s/bw606t7znouxpj1/photo-141847963101...

devNoise 5 hours ago 3 replies      
Question about JavaScript and CDN for mobile devices. Should I use a CDN for standard libraries or should I just concat and minify all my JavaScript?

The concat and minify seems better as that reduces the JavaScript libraries and code load to a single HTTP request.

A CDN seems nice in theory. Reality is: does the browser have the library cached? Is the library cached from the CDN that I'm using? The browser is making more HTTP requests, and sometimes the request takes more time than downloading the library would.

I agree that using CDNs is a good speed boost. I'm trying to figure out if hoping for a library cache hit outweighs the cost of a library cache miss.

romaniv 5 hours ago 0 replies      
> All assets - Use a CDN

> If you can get away with it, outsource identity management to Facebook / GitHub / Twitter / etc. and just use an OAuth flow.

Questionable advice. At the very least, neither of these two is some kind of automatic "best practice" everyone should just follow.

> it can be helpful to rename all those user.email vars to u.e to reduce your file size

Or maybe you should write less JavaScript, so the length of your variable names doesn't matter.

vbezhenar 5 hours ago 1 reply      
One thing to note is the login redirect. Please be sure that the redirect parameter is a local URI, and don't redirect the user to another site.

Maybe even append an HMAC signature to that parameter, covering the user's IP and a timestamp. Might be overkill, but still, be careful with craftable redirects - they might become a vulnerability one day.
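A minimal sketch of the local-URI check (my own illustration — real frameworks ship hardened versions of this):

```python
from urllib.parse import urlparse

# Accept a post-login redirect target only if it is a local path:
# no scheme, no netloc. This also rejects protocol-relative
# "//evil.com" URLs, which urlparse treats as having a netloc.
def is_safe_redirect(target: str) -> bool:
    parts = urlparse(target)
    return parts.scheme == "" and parts.netloc == "" and target.startswith("/")

print(is_safe_redirect("/account/settings"))   # True
print(is_safe_redirect("https://evil.com/"))   # False
print(is_safe_redirect("//evil.com/"))         # False
```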

balls187 7 hours ago 1 reply      
> If you can get away with it, outsource identity management to Facebook / GitHub / Twitter / etc. and just use an OAuth flow.

OAuth isn't identity management, it's for authorization.

Each of those platforms does provide its own identity management, but that isn't OAuth.

shiggerino 6 hours ago 4 replies      
>When storing passwords, encrypt them


This is terrible advice. Don't do this. Remember what happened when Adobe did this?

jameshart 6 hours ago 2 replies      
"You don't have to develop for mobile..."

... well, no. Technically you don't have to. But you almost certainly should.

toynbert 5 hours ago 1 reply      
As a web application developer in 2015+, I would argue that developing with mobile in mind should be required, or at least taken into consideration. At bare minimum, have a pre-deployment test: is my app unusable, or does it look terrible, on the most popular iPhone/Android?
patcheudor 6 hours ago 0 replies      
For mobile apps that use WebView and/or have the capability to execute JavaScript or any other language provided by any network-available resource, I'd like to add:

ALWAYS USE CRYPTOGRAPHY for communication! Simply doing HTTP to HTTPS redirects is not sufficient. The origin request must be via HTTPS. Also make sure the app is properly validating the HTTPS connection.

Sorry I had to shout, but I'm growing tired of downloading the latest cool app that is marketed as secure only to find that it doesn't use HTTPS and as a result I can hijack the application UI to ask users for things like their password, credit-card number, etc., all without them having any way to tell if they are being asked by some bad guy.

Domenic_S 5 hours ago 1 reply      
How to make a reasonably decent webapp in 2015 without having to worry about bcrypt and open redirects and such:

1. Use a widely-accepted framework.

2. Implement your application using that framework's methods.

Why a beginner would implement even 1/3 of this list manually is beyond me.

martin-adams 3 hours ago 2 replies      
>> When users sign up, you should e-mail them with a link that they need to follow to confirm their email

I'm curious, why is this good? Sure, send them an email so they can confirm it's the correct address - but what is the benefit of the verification step? Is it to prevent them from proceeding in case they got the wrong email? It would be nice if this were justified in the article.

I would also add, that changing a password should send an email to the account holder to notify them. Then when changing the email address, the old email address should be notified. This is so a hijacked account can be detected by the account owner.
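One common answer to the question above: verification proves the person signing up actually controls the address, so nobody can sign up with (or spam) someone else's email. For illustration, here is a minimal sketch of how the confirmation link's token might be built — the secret and addresses are hypothetical, and a real implementation would also bake in an expiry timestamp:

```python
import base64
import hashlib
import hmac

SECRET = b"server-side-secret"   # hypothetical key; never stored with user data

# Sign the email address so the confirmation link can't be forged.
def make_confirm_token(email: str) -> str:
    sig = hmac.new(SECRET, email.encode(), hashlib.sha256).digest()
    return base64.urlsafe_b64encode(sig).decode()

def confirm(email: str, token: str) -> bool:
    return hmac.compare_digest(make_confirm_token(email), token)

token = make_confirm_token("user@example.com")
print(confirm("user@example.com", token))       # True
print(confirm("attacker@example.com", token))   # False
```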

Kudos 3 hours ago 0 replies      
One big omission from this list: gzip. Before you ever think about uglify, make sure you're gzipping your textual assets.
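The win is easy to demonstrate, since textual assets are highly repetitive (an illustrative snippet, with made-up asset content):

```python
import gzip

# Text assets (JS, CSS, HTML) compress extremely well because they
# repeat the same tokens over and over.
asset = ("function handleClick(event) { console.log(event); }\n" * 200).encode()
compressed = gzip.compress(asset)
print(len(compressed) < len(asset) // 5)  # True -- a big win for free
```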
Yhippa 5 hours ago 0 replies      
I like this list.

> Forms: When submitting a form, the user should receive some feedback on the submission. If submitting doesn't send the user to a different page, there should be a popup or alert of some sort that lets them know if the submission succeeded or failed.

I signed up for an Oracle MOOC the other day and got an obscure "ORA-XXXXX" error and had no idea if I should do anything or if my form submission worked. My suggestion would be to chaos-monkey your forms, because it seems that whatever can go wrong, will. Make it so that even if there is an error, the user is informed of what is going on and whether there's something they can do about it.

Quanttek 4 hours ago 2 replies      
> The key advantage to an SPA is fewer full page loads - you only load resources as you need them, and you don't re-load the same resources over and over.

I don't know much about web development, but shouldn't those resources get cached? Isn't the disadvantage of SPAs that you are unable to link to / share a specific piece of content?

donmb 4 hours ago 0 replies      
Rails has most of this out of the box. Use Rails :)
sarciszewski 4 hours ago 0 replies      
>For all of its problems with certificates, there's still nothing better than SSL.

Yes there is. It's called Transport Layer Security (TLS).

andersonmvd 6 hours ago 0 replies      
When using SPA, validate CORS origin instead of allowing *.
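A sketch of what that looks like server-side — echo the `Origin` header back only if it's on an explicit allowlist (the origin names here are hypothetical):

```python
# Echo back the Origin header only when it is allowlisted, instead of
# sending Access-Control-Allow-Origin: * for everyone.
ALLOWED_ORIGINS = {"https://app.example.com", "https://admin.example.com"}

def cors_headers(request_origin: str) -> dict:
    if request_origin in ALLOWED_ORIGINS:
        return {
            "Access-Control-Allow-Origin": request_origin,
            "Vary": "Origin",   # keep caches from mixing responses across origins
        }
    return {}                   # no CORS grant for unknown origins

print(cors_headers("https://app.example.com"))
print(cors_headers("https://evil.com"))  # {}
```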
anton_gogolev 4 hours ago 0 replies      
> sent to a page where they can log in, and after that should be redirected to the page they were originally trying to access (assuming, of course, that they're authorized to do so).

Smells like an information disclosure highway. I usually 404 all requests that hit "unauthorized" content.

stevewilhelm 3 hours ago 0 replies      
Hacking Team and a case of BGP hijacking bofh.it
133 points by rolux  10 hours ago   20 comments top 6
acaloiar 7 hours ago 6 replies      
As someone who works in technology, but has only a cursory understanding of BGP, I find BGP's trust mechanism flabbergasting. Would anyone like to explain why it remains the preferred protocol and what improvements are in the works to mitigate the effect of these sort of hijacks?
diafygi 8 hours ago 0 replies      
For those who are curious, (the hijacked IPs) belong to balticservers.com, which is based out of Lithuania[1].


acd 4 hours ago 0 replies      
You can take over other providers' IP space by announcing their IPs via BGP from well-connected, high-ranked tier ISPs, but just because you can do a thing does not mean you should.

Internet was built on the premise that you can trust other organisations such as good-willed universities; it was not built for a landscape of internet crime and state-sponsored hackers.

BGP and central certificate authorities are flawed in principle and in this sense. It's very easy to create fake certificates for big organisations if you have the power of a state.

DigiNotar is such an epic fail of a CA; it shows exactly why you cannot trust centralized trust when there are state hackers at work.

So you hijack either BGP, DNS or a central certificate authority, and then you steal people's cookies. Since most people do not use two-factor authentication, that is enough to take ownership of their email accounts. Once the email account is compromised, all other accounts can be compromised through password resets.

rudolf0 7 hours ago 2 replies      
This is pretty crazy. I wonder how the route hijack didn't get noticed by anyone at the time, though? Or at least if someone did notice, they didn't make a fuss about it.
cft 3 hours ago 0 replies      
I do not understand this. We recently had to change our announcement to upstream ISPs from /23 to /22, and our ISPs verified with ARIN that the entire /22 belonged to us before changing their filters. Also, there's the RADb database.
gr0wln1n 7 hours ago 1 reply      
Can somebody explain how they got the police to help them?

"You remember the RAT we sold you? Yea... That's broken because ... Help us or people might notice." If that's it.. Wow. This whole story gets more fishy by the minute.

ClojureCL Parallel computations with OpenCL 2.0 in Clojure uncomplicate.org
24 points by dragandj  8 hours ago   2 comments top
silja1 8 hours ago 1 reply      
Does it work on NVIDIA GPUs?
Car Hacker's Handbook opengarages.org
111 points by MichaelAza  10 hours ago   20 comments top 7
alexggordon 7 hours ago 0 replies      
Looks like the website is having some trouble with all the HN traffic. Rehosted the ebook downloads on GDrive to save him some traffic. I'll remove them when the post leaves the HN front page.

Just FYI, this book literally teaches you how to identify security vulnerabilities in modern cars and exploit them.

You can purchase it from Amazon here[0], or download the book for free in EPUB[1] or PDF[2].

[0] http://www.amazon.com/2014-Hackers-Manual-Craig-Smith/dp/099...

[1] https://drive.google.com/file/d/0Bzxo-UKxFmN-bDlNSi1IT1JLdHM...

[2] https://drive.google.com/file/d/0Bzxo-UKxFmN-WFVjcEVVX3B5azg...

akallio9000 6 hours ago 1 reply      
This is especially heinous given that car manufacturers are trying to keep you from repairing your own car, claiming that the computer systems are copyrighted.


titomc 2 hours ago 1 reply      
I worked for one of those car manufacturers on the telematics unit, doing things like putting specific frames on the CAN bus to make the car perform remote operations (start/stop engine) and reading values from ECUs for DTC codes. We used to TeraTerm into the unit with a serial cable and a trivial password. The security measure we had at the time was that "we do not give cables to customers, so they can't TeraTerm into the telematics unit." It might have changed now with the recent CAN bus hacks.
AceJohnny2 7 hours ago 1 reply      
Some very interesting stuff in there... that's bound to make some manufacturers very unhappy. I remember a couple years ago when some Tesla forum geeks got access to the Linux system running the infotainment dashboard of the model ... and got a nice (seriously) message from Tesla engineers to the effect of "good job... but please stop there".

Many folks have mentioned how the Tesla Model S at least is more of a supercomputing cluster on wheels than a car with some ECUs. I don't know how armored their CAN bus(es) are, but I'm sure the "Attacking ECUs and other embedded systems" is giving some safety engineers white hair.

(of course, everything I've said about Tesla is just about equally applicable to other high-end vehicles. It's just that Tesla are a bit more connected to the traditional software world)

csours 4 hours ago 0 replies      
Looks awesome, hope this will be updated for V2X (Vehicle to Vehicle and Vehicle to Infrastructure) / DSRC / Wave

I would have bought the Kindle e-book for sure - Does Amazon allow pay-what-you-want?

TaylorGood 4 hours ago 0 replies      
On the end-user side, this is a big leap towards maintenance transparency: https://www.automatic.com + being tethered to YourMechanic is brilliant.
Show HN: Rent my private airport for your hardware startup box.com
51 points by SpacemanSpiff  4 hours ago   13 comments top 3
ChuckMcM 1 hour ago 3 replies      
Hmm, October through May, snow blower included? On the plus side it's probably great for focus; on the down side it's going to be a haul to resupply mid-winter. Although with another outbuilding for supplies and gigabit fiber pulled all the way out to the house, you could treat it like the Arctic stations and just work solid for 8 months.
tdicola 1 hour ago 3 replies      
Is this really in the spirit of a Show HN post? It's basically an advertisement to rent someone's space.
trhway 2 hours ago 2 replies      
Oh, man! It brings me to tears - compare all this majestic grand scale to my condo backyard patio in MV, where I'm trying to build up toward the ultimate goal of an octocopter that can carry 2 humans (and at full scale it actually would hardly fit into the patio :)
The Rise and Suspiciously Rapid Fall of Freedomland U.S.A. atlasobscura.com
5 points by Thevet  2 hours ago   discuss
All the steps needed to create deep dream animations on EC2 github.com
54 points by jmount  7 hours ago   19 comments top 6
jmount 7 hours ago 2 replies      
I really wanted to run the deep dream and deep dream animation scripts people have created. I am now re-sharing the instructions I found (with links to original guides). It isn't an automated install. You pretty much have to paste the lines one by one. But I did replace any "edit" steps with append or patch.

To do this on a fresh Ubuntu EC2 g instance there are a lot of steps, but I have tested them and put them all in one place (with links to the original sources and guides). I have CUDA up but not cuDNN, as I haven't found how to legitimately download cuDNN without registering on the NVIDIA website.

Again: credit to the actual creators and all the original guide authors.

shreyask 6 hours ago 1 reply      
These steps are cool and very detailed, also check out https://github.com/VISIONAI/clouddream which is containerized so you can try it on local machine as well as on EC2.
tacos 2 hours ago 1 reply      
At $2.60 an hour you'd think the Python code would be smart enough to expand "~" -- and call "avconv" instead of "avconf" -- or use a recent version of Ubuntu that avoids the ffmpeg vs avconv drama.

You'd also think people rendering movies would input and output PNG instead of JPEGs at 75% quality.

But you'd be wrong.

I appreciate the excitement around this but the people on the fringe hacking this shit together really should be ashamed of themselves. The core is so great! The script kiddie stuff wrapping it is SO BAD! This is perhaps the 15th attempt I've seen at it. Maybe the dumbest one yet.

We have Docker. We have AMI. We have scripts. And... we have people spinning up servers at $2.60/hr where you have to wait an indeterminate amount of time for some marketing intern to enable your NVIDIA developer application.

Google anything related to this technology and you'll immediately see a ton of people having the same three problems over and over. All due to sloppy packaging by people who basically Googlebombed keywords with their half-baked github experiments.

Slapping some half-baked shit onto Github isn't open source. It's littering.

And everyone knows "here's how to install!" walkthroughs never work for more than a week. STOP DOING IT.
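For what it's worth, the `~` complaint above has a one-line fix in Python — shells expand `~`, but file APIs and subprocesses don't, so scripts have to do it themselves (path below is just an example):

```python
import os.path

# os.path.expanduser replaces a leading "~" with the user's home
# directory -- something shells do automatically but Python does not.
path = os.path.expanduser("~/deepdream/frames")
print(path)
```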

fennecfoxen 6 hours ago 1 reply      
That's pretty insane. It makes me really appreciate that I get to use package managers and Puppet/Vagrant scripts for most of the stuff I program.
deepnet 7 hours ago 0 replies      
Thanks, the Caffe on the PYTHONPATH was the bit I needed.
azinman2 5 hours ago 1 reply      
Or... Just use deepdreamer.io
Our beautiful fake histories joeyh.name
25 points by JoshTriplett  8 hours ago   7 comments top 6
scintill76 28 minutes ago 0 replies      
I've been thinking of a "sub-commit" concept: a "beautiful, fake" commit that is atomic for most git operations but can be inspected more deeply, revealing bundled messy sub-commits that may be useful for understanding the history of that change. You get "every commit is a deliberately chosen, well-tested logical unit" as well as "present history as it actually happened." Maybe something like this can be done with certain branch/merge workflows, but I haven't seen it and don't understand git well enough to come up with it myself.
dasil003 55 minutes ago 0 replies      
I take issue with the title. Is there a VCS that commits every keystroke individually? No, because that wouldn't be useful. There is always a choice in when and what to commit. The only thing with git is that it makes it easy to change things after the fact, but you can achieve something pretty close in svn by simply not committing until you're done with a feature.

It's not about faking anything, it's a question of what information you want to leave for posterity. And frankly I come down pretty strongly against preserving warts for warts sake. As a rule of thumb, commits should be as fine-grained as possible without breaking the build. In my experience more detail than that gives diminishing returns in utility as the signal-to-noise ratio drops and you get overwhelmed with details which only represent a brain fart on the part of the developer, and never had any measurable impact on the project.

JoshTriplett 1 hour ago 0 replies      
While I personally like git's curated commit model (in particular the "keep trying until you get it right and then merge" approach rather than a series of "fix the fix" commits that don't all build), I do think Joey has hit on an interesting principle of many git-using communities, and how it has affected the tools used to maintain that principle.
ch 2 hours ago 1 reply      
I have to side with Joey on this one. Sure in a world like the Linux kernel, where things are still done via emailed patches, avoiding sending a patch-bomb to a maintainer warrants some editorial control on the history of a feature. However, like Joey, I tend to find beauty in the organic evolution of code. Plus I like others to see all the commits where I'm just cursing out build systems, or dev tools or just mumbling about a 'misspeeling'.
lsiebert 1 hour ago 0 replies      
Git is pretty easy to contribute to, fwiw. Though it does require you to create patches and send them to the mailing list, like the Linux kernel does.
larve 2 hours ago 0 replies      
I use the following to jump from merge to merge when bisecting (we use pull requests as our main "clean history" management).

 "!git checkout `git rev-list --bisect --first-parent`"
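Wired into `~/.gitconfig`, the snippet above becomes a reusable alias (the name `bisect-merges` is just my placeholder, and this assumes the commenter's `rev-list` invocation works as-is during a bisect):

```ini
[alias]
    # during a bisect: jump to the next first-parent (merge-level) candidate
    bisect-merges = "!git checkout `git rev-list --bisect --first-parent`"
```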

       cached 11 July 2015 01:02:03 GMT