hacker news with inline top comments    16 Oct 2015    Best
How is NSA breaking so much crypto? freedom-to-tinker.com
947 points by sohkamyung  1 day ago   233 comments top 39
seanwilson 13 hours ago 4 replies      
Isn't the much simpler explanation that, for particular servers, they either have permission (e.g. companies have agreed to hand over the encryption keys or allow monitoring of the data after it is unencrypted on arrival) or some other means (man-in-the-middle attacks, server backdoors, hacking vulnerable software) to bypass the encryption entirely without you knowing? For instance, I find the idea that they would direct huge amounts of computing power to crack individual keys implausible given the previous example methods are so much easier.
misiti3780 22 hours ago 5 replies      
"Since weak use of Diffie-Hellman is widespread in standards and implementations, it will be many years before the problems go away, even given existing security recommendations and our new findings. In the meantime, other large governments potentially can implement similar attacks, if they haven't already."

Can someone explain to me why this can't be fixed overnight? I'm no crypto expert, but

" If a client and server are speaking Diffie-Hellman, they first need to agree on a large prime number with a particular form. "

why can't you just switch the large prime number and then continue on sending encrypted data?
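Part of the answer is that the prime isn't a per-connection secret you can quietly swap: it's a public parameter both endpoints must already agree on, baked into standards, default config files, and shipped firmware. A toy sketch of the exchange (the modulus here is an illustrative 64-bit prime, absurdly small for real use, which needs a safe prime of 2048 bits or more):

```python
import secrets

# Toy Diffie-Hellman. p and g are PUBLIC, shared parameters -- the
# whole point of the attack is that one precomputation against a
# popular p pays off against every connection that uses it.
p = 0xFFFFFFFFFFFFFFC5  # largest 64-bit prime; illustration only
g = 2

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent
A = pow(g, a, p)                   # sent in the clear
B = pow(g, b, p)                   # sent in the clear

# Both sides derive the same shared secret without ever sending it.
assert pow(B, a, p) == pow(A, b, p)
```

Changing p means every client and server that hard-codes or defaults to the old group has to be updated in lockstep, which is why the authors expect it to take years rather than a night.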

andyjohnson0 18 hours ago 2 replies      
"For the nerds in the audience, here's what's wrong: If a client and server are speaking Diffie-Hellman, they first need to agree on a large prime number with a particular form. [...] an adversary can perform a single enormous computation to crack a particular prime [...]"

Can someone explain what the authors mean by "cracking" a prime? Is the difficulty of this related to the difficulty of factoring a composite number? The language used is annoyingly imprecise.

Edit: Question was already asked by smegel, and has some useful answers.
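"Cracking" the prime here means doing the giant precomputation stage of the number field sieve for discrete logarithms modulo that prime; once that's done, individual logs in that group become cheap. It's believed to be roughly as hard as factoring a composite of the same size. For intuition only, here's what computing a single discrete log even means, via a toy baby-step/giant-step solver (sqrt-time, nothing like NFS, and with no precomputation-sharing at all):

```python
from math import isqrt

def discrete_log(g, h, p):
    """Find x with g^x = h (mod p) in ~sqrt(p) time and memory.
    Toy-scale only: the real attack amortizes a year-long NFS
    precomputation for one fixed p across many individual logs."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps g^j
    ginv_m = pow(g, -m, p)                        # giant step g^(-m)
    y = h
    for i in range(m):
        if y in baby:
            return i * m + baby[y]
        y = (y * ginv_m) % p
    return None

p, g = 1019, 2
x = discrete_log(g, pow(g, 777, p), p)
assert pow(g, x, p) == pow(g, 777, p)
```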

Pyxl101 22 hours ago 1 reply      
Some advice from the authors on how to properly deploy Diffie-Hellman:


devit 13 hours ago 0 replies      
Apparently some Cisco products might even be using 768-bit DH as default for IPsec!

From http://www.cisco.com/en/US/docs/ios-xml/ios/sec_conn_ikevpn/...:

<<Diffie-Hellman--A public-key cryptography protocol that allows two parties to establish a shared secret over an unsecure communications channel. Diffie-Hellman is used within IKE to establish session keys. It supports ==> 768-bit (the default) <==, 1024-bit, 1536-bit, 2048-bit, 3072-bit, and 4096-bit DH groups. It also supports a 2048-bit DH group with a 256-bit subgroup, and 256-bit and 384-bit elliptic curve DH (ECDH). Cisco recommends using 2048-bit or larger DH key exchange, or ECDH key exchange. >>

Malice or incompetence? (or crappy hardware that needs help to not be slow?)

The recommendation is correct so...
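For comparison, pinning IKE to a stronger group is a small config change. A hypothetical IOS snippet (exact syntax varies by platform and release; if no `group` line is given, the 768-bit group 1 is what you get by default):

```
! Hypothetical IKE policy -- omit "group" and you fall back to
! group 1 (768-bit), the default the parent comment is worried about.
crypto isakmp policy 10
 encryption aes 256
 hash sha256
 authentication pre-share
 group 14    ! 2048-bit MODP, in line with Cisco's own recommendation
```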

smegel 22 hours ago 2 replies      
Can someone explain what "breaking a prime" means? What is the output after your year of computation?
vbezhenar 16 hours ago 2 replies      
When I setup TLS for web or smtp, there's an option to generate custom dh params. So basically one must generate new dh params for every installation to be safe against attack presented in the article, is it correct?
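Correct in spirit: a unique group per installation means the precomputation against a shared prime no longer amortizes across servers. A minimal sketch, assuming stock OpenSSL and nginx (paths are hypothetical):

```shell
# Generate a fresh 2048-bit DH group for this server (can take a while):
openssl dhparam -out dhparams.pem 2048

# Then point your server at it, e.g. in nginx (hypothetical path):
#   ssl_dhparam /etc/nginx/dhparams.pem;
```

Even without unique params, simply moving from a common 1024-bit group to 2048 bits puts you outside the attack the article describes.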
metachris 10 hours ago 0 replies      
Interesting paper (https://weakdh.org/imperfect-forward-secrecy.pdf) and lots of good references!

Inspired me to write a little tool to download all referenced pdfs from any given pdf: https://github.com/metachris/pdf-link-extractor

paulgerhardt 23 hours ago 0 replies      
See also Martin Hellman's oral history on trap doors: https://conservancy.umn.edu/bitstream/handle/11299/107353/oh...
thiagoharry 14 hours ago 0 replies      
According to the estimated cost given for that machine (a few hundred million dollars) and the nature of the problem, what they propose is very similar to TWIRL, a hypothetical machine that could factor 1024-bit integers to break RSA. That is what made a lot of people consider 1024-bit RSA insecure and move their keys to 2048 bits. The same should happen with DH now.
sarciszewski 12 hours ago 1 reply      
This might not contribute much to the discussion, but I just want to add:

I for one welcome the coming arrival of ECDH over curve25519 in TLS everywhere.

(And I really hope that comes to pass.)

acd 17 hours ago 0 replies      
You can check web sites for common Diffie-Hellman primes on ssllabs.com: see the "Uses common DH primes" entry under the Protocol Details section.

Also, the latest openssh package warns against weak Diffie-Hellman key exchange; now we know why they warn us.

AnonNo15 23 hours ago 4 replies      
Crap. So what are the immediate countermeasures? Switch to elliptic curves cryptography?
agwa 23 hours ago 2 replies      
There aren't any new findings here. It's merely a rehash of the Weak DH attack (by the same researchers) that was made public in May of this year: https://weakdh.org/

Still, it's a good reminder that you should not be using 1024-bit Diffie-Hellman.

zmanian 23 hours ago 1 reply      
How much software has been updated to use stronger DH either ECC or 2048 bit prime field?

Is there an easy way to check if a VPN provider has updated?

The ASICs NSA built for breaking some common 1024 bit fields are probably breaking specific RSA keys now...

astazangasta 11 hours ago 0 replies      
If one is not enough, why not just have a million standard keys to choose from? This makes the problem space prohibitively large, but this many keys could be passed around in a standard distribution easily enough.
542458 23 hours ago 2 replies      
I wonder what the effort to break a 2048-bit prime would be. I suspect it's heading into "dyson sphere powered ideal computer" territory, but I'd be curious to know what it would actually be.
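A rough answer via the heuristic GNFS-style cost formula L_p[1/3, (64/9)^(1/3)]. Dropping the o(1) term means only the ratio is meaningful, so this is a back-of-the-envelope sketch, not a hardware estimate:

```python
from math import exp, log

def gnfs_cost(bits, c=(64 / 9) ** (1 / 3)):
    # Heuristic L_p[1/3, c] cost of NFS factoring / discrete log,
    # with the o(1) term dropped -- only ratios are meaningful.
    ln_p = bits * log(2)                  # natural log of a `bits`-bit p
    return exp(c * ln_p ** (1 / 3) * log(ln_p) ** (2 / 3))

ratio = gnfs_cost(2048) / gnfs_cost(1024)
print(f"2048-bit vs 1024-bit: ~2^{log(ratio, 2):.0f}x more work")
```

By this crude measure a 2048-bit prime costs roughly a billion times (~2^30) the effort of a 1024-bit one: not quite Dyson-sphere territory, but far outside "a few hundred million dollars of ASICs per year".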
kordless 10 hours ago 1 reply      
> For the most common strength of Diffie-Hellman (1024 bits), it would cost a few hundred million dollars to build a machine, based on special purpose hardware, that would be able to crack one Diffie-Hellman prime every year.

I'll just leave this here: http://fortune.com/2015/06/29/intelligence-community-loves-i...

kristopolous 21 hours ago 1 reply      
about 12 years ago I came up with a pretty clever way to factor numbers that I never pursued the computational complexity of.

The basic algorithm is that you take some candidate X (which will be our 2048 bit number here) and classify your question (primality, whether it is the product of 2 primes, etc) --- once you have your question, Q, then you can pick a number Y0 to get X % Y0 = Z0 ... sometimes ~sqrt(X) works well, other times it's the closest prime factorial, etc.

now using those results, [Q, Y0, Z0], you can optimally pick Y1 and do the operation again, X % Y1 = Z1 ...

Like the Chinese remainder theorem each Z gives you information on the next optimum Y given your question Q ...

I called it tunnel factoring and saw some great early results ... but for some reason I haven't ever pursued it

mrb 23 hours ago 0 replies      
FYI this is not really new news. The authors of that research had already disclosed their findings at https://weakdh.org about 5 months ago.

Today they simply formally presented their research at ACM CCS.

ibmthrowaway271 15 hours ago 0 replies      
Is there a tool to output the DH params being used when attempting a TLS connection (not dumping them from a packet capture)?

I know I can, but I'm hoping for something simpler than having to parse the TLS messages from:

 openssl s_client -connect host:port -msg
to work it out.
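One possibly simpler route: OpenSSL 1.0.2 and later print a "Server Temp Key:" summary line from `s_client` itself, and forcing an ephemeral-DH cipher makes the server actually send DH params. A small wrapper (helper names are mine, and behavior depends on the OpenSSL version and the server offering a DHE suite):

```python
import subprocess

def parse_temp_key(s_client_output):
    # Pull out the "Server Temp Key: DH, 2048 bits"-style summary line.
    for line in s_client_output.splitlines():
        if "Server Temp Key" in line:
            return line.strip()
    return None

def server_temp_key(host, port=443):
    # -cipher "EDH" restricts the handshake to ephemeral-DH suites so
    # the ServerKeyExchange carries DH parameters to summarize.
    proc = subprocess.run(
        ["openssl", "s_client", "-connect", f"{host}:{port}",
         "-cipher", "EDH"],
        input=b"", capture_output=True, timeout=15)
    return parse_temp_key(proc.stdout.decode(errors="replace"))
```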

onderkalaci 17 hours ago 1 reply      
"There seemed to be no reason why everyone couldn't just use the same prime, and, in fact, many applications tend to use standardized or hard-coded primes."

So if the prime number is standardized or hard-coded, why don't they just use it directly? Why do they need to break it?

too_late 13 hours ago 1 reply      
Wouldn't this be easy to subvert, though?

I mean, say we put through a few patches and started generating primes more often. Then their big-ass special-purpose prime-breaking machine becomes an order of magnitude less effective, right?

I think the best way to defend against these one-to-many attacks is to spread out the cost of decrypting large quantities of data. If we all had our own keys, even if they weren't as strong as one single key that everyone used, that much more work has to be done to decrypt data for a group of users.

I know nothing about crypto, but a layman can hear about these implementation architectures and immediately realize what's wrong with it all.

qakmail 13 hours ago 0 replies      
Just to clarify (because I was confused when I read your comment), the weak DH attack was made public by the same people who wrote this post and the academic paper attached to it. It looks like the post and the paper are part of the same "release".

Conflict of interest disclaimer: I was a grad student of Professor Halderman's several years ago.
mediocrejoker 7 hours ago 0 replies      
If the NSA wasn't doing this, you can bet they will be soon.
crozewski 13 hours ago 1 reply      
Can we use distributed computing to crowdsource the computation of more/better primes? Can OpenSSL look to this pool for its primes?
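Vetting crowdsourced params would at minimum mean checking each candidate is a safe prime (p = 2q + 1 with q also prime), which is cheap to verify even though generating good primes is slow. A sketch using a standard Miller-Rabin test:

```python
import random

def probably_prime(n, rounds=40):
    # Miller-Rabin probabilistic primality test.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # a witnessed that n is composite
    return True

def is_safe_prime(p):
    # Safe primes p = 2q + 1 avoid small-subgroup weaknesses in DH.
    return probably_prime(p) and probably_prime((p - 1) // 2)

assert is_safe_prime(23)        # 23 = 2*11 + 1, both prime
assert not is_safe_prime(13)    # (13 - 1) / 2 = 6 is composite
```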
late2part 12 hours ago 0 replies      
I wonder if that estimated cost is the COGS or the R&D? If it's the R&D, what is the cost of the second machine?
cm2187 19 hours ago 1 reply      
Isn't the bulk of the https traffic using RSA, not Diffie-Hellman?
z3t4 18 hours ago 1 reply      
Check your root certificates. If any of those has capabilities of "Man in the middle", they can see your SSL traffic. That's probably how they do it.
petra 15 hours ago 1 reply      
Since we don't know exactly what other ways to break crypto are out there, why aren't we focusing on concatenated encryption (at least for critical apps), while working hard to ensure no crypto is vulnerable to malware-style attacks - especially considering that malware isn't a good vector for web-scale surveillance?
kobayashi 20 hours ago 1 reply      
Regarding VPN usage, is the fix a client-side or a server-side solution?
chinathrow 18 hours ago 4 replies      
Imagine the money spent on both a) measures and b) countermeasures related to the ongoing spying by the intelligence apparatus around the world.

Imagine the money not spent on more pressing issues we face these days: health problems, poverty and the destruction of the natural world, just to name a few.

Why do we, as a society, tolerate this?

auntienomen 23 hours ago 0 replies      
Ha ha! (Seriously, nice paper.)
ck2 18 hours ago 1 reply      
This is an arms race and it doesn't address the underlying cause.

The government of the people should not be spending $10B a year to monitor and track all of its people just to warehouse the data.

That is quite literally Stasi. Not vaguely like, exactly like.

ape4 23 hours ago 0 replies      
Important stuff.
mkagenius 17 hours ago 1 reply      
Devil is in the details, I would take this with a grain of salt before I read the paper.

What if a few hundred million is 10x less than the actual amount? What if it takes 10 years instead of 1?

nosuchthing 21 hours ago 3 replies      
Being that crypto is 'just math', why would crypto be safe? The only claim that crypto is safe assumes computational power is limited. Is that a safe assumption? Assuming the crypto math is safe, one also has to be certain the entire system which runs the crypto is safe as well.

Analysis and attempts to decode the Voynich manuscript lead me to believe mathematical patterns intended to hide information, languages in particular, are not safe in the least.

dogma1138 22 hours ago 2 replies      
Breaking crypto is what the NSA was created to do; playing a cat-and-mouse game with it means you'll always lose. If the NSA cannot break crypto it's useless, and given the two outcomes - them giving up, or them just asking for more money and being more intrusive - the latter is much more likely.

No one will get their privacy "back" by fighting the NSA through technology, considering their mission, budget and capabilities they'll always win, the only way to pacify the NSA is through legislation that will ensure that they only use their capabilities when it's warranted.

NN88 22 hours ago 5 replies      

I wonder what world you all live in in which this is a bad thing. There are real threats out there, and I'd hate to live in a country that lacked the geopolitical leverage to use these tools in my nation's interests.

Twitter announces layoffs sec.gov
580 points by uptown  2 days ago   379 comments top 68
Sidnicious 2 days ago 14 replies      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.

I tried rewriting his email to live up to this promise:

- - -


We're cutting our workforce to strengthen Twitter as a company.

The team has been deciding how to best streamline Twitter, Vine, and Periscope to put their focus on the projects which will have the greatest impact. Moments, which we launched last week, is a great beginning. It's a peek into the future of how people will see what's going on in the world.

We plan to cut up to 336 people. This was a tough decision, and we'll offer each person a generous exit package and help finding a new job. Product and Engineering are going to make the most changes. Engineering will be smaller but remain the biggest percentage of the organization, and other departments will be cut in parallel.

This isn't easy. We'll honor those who we're losing with our service to all the people who use Twitter. We'll do it with a more purpose-built team. Thank you all for your trust and understanding here. As always, please reach out to me directly with any ideas or questions.


celticninja 2 days ago 12 replies      
> The world needs a strong Twitter

Really? Does it? I think Twitter needs a strong Twitter; shareholders need a strong Twitter; even Twitter employees need it. The world, however, is at best ambivalent about Twitter: if it disappeared tomorrow, a replacement would spring up within a few weeks if the world really needed a way to shotgun messages into the ether.

ghshephard 2 days ago 4 replies      
The number, 336, is roughly 10% of their employees - which is pretty much exactly the number that Jack Welch recommended turning over each year to improve the work force.

I often wonder whether these "layoffs" aren't actually layoffs, but simply performance based assessments. It's not like Twitter is shutting down an entire office, or abandoning some technology, and letting everyone associated with that office/technology go - presumably they are being selective on other factors as to who they let go - and I'm guessing that performance is likely a key factor.

If, over the next year, twitter doesn't hire back that 10%, or hires employees in different technologies/positions (I.E. Web developers instead of thick client developers, sales people instead of developers, etc...) - then this is a layoff. But, if headcount returns to the same number, in roughly the same job areas, then this is just a performance based annual rank/yank process.

uptown 2 days ago 4 replies      
"we plan to part ways with up to 336 people from across the company. We are doing this with the utmost respect for each and every person."

Bart might disagree:


danso 2 days ago 2 replies      
> The roadmap is focused on the experiences which will have the greatest impact. We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

That Moments is mentioned so high up in the email isn't particularly reassuring...since it means they haven't launched many other initiatives of note recently. Moments as a feature is extremely disappointing given the years of interesting discoveries that Twitter has yielded algorithmically via its, well, "Discover" tab. What's on Moments looks like a half-baked newspaper front page except when you click on an item, you go to tweets about that item instead of a full story.

I don't want to pile on the project as it is new...but it should've been given more thought and design time given how much prominence "Moments" has on the interface (it is one of four main icons on the menubar)...Nearly all of the stories are hours old...e.g. "Wave of terror attacks hits Jerusalem" and "Playboy covers up"..."FedEx truck splits in two", granted, is news to me...but not something that makes Twitter unique to me.

There's so much more potential in the Trends section...OK, maybe Twitter wants to filter out potentially visually NSFW topics like #NoBraDay...but things like #MH17 and #VMworld and #ILoveYouAboutAsMuchAs...just show me an automated feed of tweets by reputable sources (rather than spambots or random kids) so I can understand why these topics are suddenly trending without having to click through the trends tag and sort through an overwhelming timeline.

edit: that said, I like all the other products...besides core Twitter, Vine and Periscope are standouts (at least, as a consumer)...I just think that "Moments" isn't worth putting into the spotlight, unless there is literally nothing else to be proud of publicly.

acaloiar 2 days ago 7 replies      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.

Well, since you said it that way, I should assume that what comes next will not sound like a steamy pile of meandering corporate speak, right?

> The team has been working around the clock to produce streamlined roadmap for Twitter, Vine, and Periscope and they are shaping up to be strong. The roadmap is focused on the experiences which will have the greatest impact.

A roadmap focused on high-impact experiences. Got it. I hope your firings go really well, Bob.

ChrisLTD 2 days ago 5 replies      
"The world needs a strong Twitter, and this is another step to get there."

Let's not get carried away here. Twitter is great. I use it too much of the day. But the world hardly needs Twitter.

MattBearman 2 days ago 5 replies      

 "Emails like this are usually riddled with corporate speak so I'm going to give it to you straight."
Three paragraphs later...

 "So we have made an extremely tough decision: we plan to part ways with up to 336 people from across the company."
Edit: My bad, I should have been clearer in what I meant. There isn't really any corporate speak, but I wouldn't call 3 paragraphs of fluff 'giving it to you straight'

dang 2 days ago 0 replies      
This was discussed pre-announcement at https://news.ycombinator.com/item?id=10364197.
antirez 2 days ago 1 reply      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.


> We will honor them by doing our best to serve all the people that use Twitter.


mrweasel 2 days ago 3 replies      
I'm a little surprised that they're only firing 336 people. Unless this is just the first round of layoffs, and more will follow once products and management have been streamlined/trimmed, whatever you want to call it.
ngoel36 2 days ago 1 reply      
This truly sucks - I'm sorry to hear it. If you're one of the unlucky engineers that got caught up in all this, reach out to the email address in my profile. We're hiring tons of awesome engineers at Uber, and if that's not the right fit for you, I can help get you connected to other SF companies as well.
josefresco 2 days ago 1 reply      
The "moments" feature will be a failure. They're essentially building an editorial model on top of Twitter - something they (as the platform creators) shouldn't be concerned with.

The decentralized model of Twitter's content creation is an asset. If you group those into a more traditional top down model, you lose the uniqueness and power of the Twitter platform.

TheMagicHorsey 2 days ago 0 replies      
I'm seriously impressed by Twitter's service ... doing fan-out for so many popular celebrities, so seamlessly, for so many readers is an accomplishment.

But I'm seriously curious why they have 4,000+ employees.

What in the hell are all those people doing?

300~ people being laid off is nothing. I wouldn't have been shocked if they said they were laying off 1,000+ people. I think entire departments probably need to go.

There has to be a lot of dead weight at Twitter.

xmpir 2 days ago 3 replies      
Why is this email on sec.gov?
GuiA 2 days ago 0 replies      
> Let's take this time to express our gratitude to all of those who are leaving us.

Gratitude in the form of not telling employees that they were laid off, and letting them find out when they try to check their email in the morning? [0]

Fuck that noise.

[0] https://twitter.com/bartt/status/653946266938818561 + exact same thing happened to a good friend of mine who didn't tweet about it + hearing reports of it happening to others

petercooper 2 days ago 0 replies      
"[..] Engineering will move [..] faster with a smaller [..] team [..] we [will] part ways with [..] 336 people. [..] with the utmost respect [..] the world needs a strong Twitter, and this is another step to get there."

Or basically, Twitter is weaker with you in it.

chipgap98 2 days ago 0 replies      
There seems to be a large disconnect in this thread between what is actually corporate speak and avoiding being blunt and insensitive
Wintamute 2 days ago 0 replies      
Slightly OT, but honestly I think the world really doesn't need Twitter. If you view the evolution of the internet as a phenomenon fundamentally linked to the emergence of a global cultural consensus, or even consciousness, then to reduce a sizable fraction of its bandwidth to 140 chars, vicious echochambers and a communication mechanism custom designed to bump people's thoughts out-of-context for the purposes of ridicule then Twitter should be viewed as harmful. I hope some of these coming changes directly address the harm current Twitter does to the quality of human communication on the net.
piratebroadcast 2 days ago 0 replies      
I wonder what this means for the Boston twitter offices (Crashlytics and Bluefin Labs) - No mention of them, whereas Vine and Periscope are.
jackgavigan 2 days ago 0 replies      
Well, the stock's up 4.35% today.

That's roughly $2.7m per fired employee!

Kristine1975 2 days ago 1 reply      
TL;DR: We're making some changes to the company. We're firing the 336 of you we don't need anymore. Thanks for your work.

Everything else is fluff. But I guess they have to sugar-coat it a bit with "utmost respect" and "tough decision".

vegancap 2 days ago 4 replies      
What's the reason for this e-mail being housed on a .gov TLD?
spikels 2 days ago 0 replies      
Revenue per employee is an interesting performance metric although not usually applied to growth companies. Twitter's was growing fast relative to other big public tech firms but still much lower than most.


ducuboy 2 days ago 0 replies      
Focused on what exactly?

> We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

Wonder what's next, because with Moments it feels like they still have no idea what to do with this platform. It would be such a pity to turn Twitter into TV-like manually curated breaking news channels.

leothekim 2 days ago 1 reply      
"up to 336" -- that's a very specific number, suggesting this was somewhat surgical. Most other leadership types are of the "any manager worth her salt should be able to cut 10% of her staff, so do it" school. Layoffs are never easy, but it sounds like Jack is doing his best to make the right cuts and take care of those affected.
grandalf 2 days ago 0 replies      
When 80% of the promotional mail in my inbox is from Twitter trying to drive engagement, something is going badly.
RyanMcGreal 2 days ago 1 reply      
> Emails like this are usually riddled with corporate speak so I'm going to give it to you straight.


> we plan to part ways with up to 336 people from across the company

Nothing says "give it to you straight" like using "part ways" to mean "you no longer have a job".

ape4 2 days ago 1 reply      
I guess the message was a bit too long for a tweet.
mobileexpert 2 days ago 0 replies      
What about Twitter's non-core products? GNIP and Fabric (Crashlytics and Answers)?
oldmanjay 2 days ago 1 reply      
The sheer vulture-like behavior of recruiters around this event is a sight to behold! Truly recruiting is the occupation for the shameless.
jhwhite 2 days ago 1 reply      
Maybe I'm a little self centered but this line:

> We will honor them by doing our best to serve all the people that use Twitter.

seems a little pep talky to me for the people left.

I feel saying this would have been better:

> We will honor them with the utmost respect for each and every person. Twitter will go to great lengths to take care of each individual by providing generous exit packages and help finding a new job.

cubano 2 days ago 2 replies      
On the bright side, Twitter will only fire 140 people at a time, giving the others time to prepare.
jjzieve 2 days ago 0 replies      
I feel like this could spark a massive bubble pop. I mean if investors have lost confidence in a company with one of the largest user-bases in the world, what does that say about all the startups that are valued so high and will likely never make a dime, unless they're bought out.
kenko 2 days ago 0 replies      
"We are moving forward with a restructuring of our workforce so we can put our company on a stronger path to grow. Emails like this are usually riddled with corporate speak so I'm going to give it to you straight."

Riddled with corporate speak like ... the very first sentence.

tony_b 2 days ago 0 replies      
Good thing that his email about a roadmap for moving forward with a restructuring, for a nimbler team and an organization streamlined in parallel, as well as an invitation to reach out, wasn't riddled with corporate speak the way those emails usually are.
myth_buster 2 days ago 0 replies      
Tech community reaching out with job postings.


hartator 2 days ago 0 replies      
https://about.twitter.com/careers/positions Still doing a lot of hirings.
ausjke 2 days ago 1 reply      
How many employees does it have? I recall it's about 4000 or so, so this is like a 10% cut?

It's better to do a big axe once instead of slicing it gradually, will this be it?

Been there done that, and it sucks.

idibidiart 2 days ago 0 replies      
"Dorsey is no Jobs" sound very fitting now.
DrNuke 2 days ago 0 replies      
Tbh twitter is pretty good professionally, if you follow the right people / organisations in your field and write accordingly. Much better than spammy Linkedin too.
jacques_chester 2 days ago 1 reply      
I think the engineers will be in a good position.

The rest, I'm not as sure.

lfender6445 2 days ago 0 replies      
If anyone who's part of the layoff is looking for the opportunity to work from home (ruby + javascript), let me know -- me [at] gmail.com
perlpimp 2 days ago 0 replies      
Not sure how related it is, but a few days ago Twitter demanded that I change my password. After I changed it, they locked my account, demanding that I provide a phone number to tie to the account - yet none of the numbers I provided work.

May I suppose they'll be even more focused on collecting various marketing data from their users, given how little leverage they have over users' personal lives?

swalsh 2 days ago 1 reply      
If you're impacted, know ruby, want to make healthcare better, and are open to a position in Boston let me know! email in profile.
fjordames 2 days ago 0 replies      
Oh god. My roommate was literally offered a position at their Boulder office last week. I wonder how systemic cuts will be?
coderjames 2 days ago 0 replies      
A more focused Twitter is much needed if the company ever hopes to become profitable, and not just remain a money pit.
grandalf 2 days ago 0 replies      
This is great for the startup ecosystem because likely many of the engineers are very talented.
ThomPete 2 days ago 0 replies      
Wait why did the subject change?

Isn't the correct headline the subject line of the email?

gketuma 2 days ago 0 replies      
Does this mean Twitter Bootstrap 4.0 release will be delayed?
whatok 2 days ago 2 replies      
Any info on whether this is just to appease shareholders or actual redundancies?
tarekkurdy 1 day ago 0 replies      
Forgets to remove the corporate speak.
sjg007 2 days ago 0 replies      
It would have been better if it was a total of 124 characters.
SneakerXZ 2 days ago 0 replies      
I feel sorry for the people that were laid off, but to be honest, does Twitter need 4100 employees? I cannot imagine what all these people do for what is not such a complicated product.
ComputerGuru 2 days ago 2 replies      
Wow, these aren't proofread by anyone?

> The team has been working around the clock to produce streamlined roadmap for Twitter,

"to produce streamlined roadmap" Really?

mirap 2 days ago 0 replies      
Actually, this is really well written.
smaili 2 days ago 0 replies      
If anyone who's part of the layoff is around SF and is looking for a new opportunity, let me know -- me [at] smaili.org
santialbo 2 days ago 0 replies      
Yesterday TWTR went down almost 7%.
mahouse 2 days ago 0 replies      
>We launched the first of these experiences last week with Moments, a great beginning, and a bold peek into the future of how people will see what's going on in the world.

I am terrified about the future of Twitter.

vishalzone2002 2 days ago 2 replies      
any idea what is 336 as a percentage of their tech workforce?
cdelsolar 2 days ago 0 replies      
Message me if you've been impacted and want to join us at Leftronic!
moron4hire 2 days ago 0 replies      
Here's what "giving it straight" really looks like, while also having a chance to save face:

 Everyone,

 As part of a restructuring of our workforce, we must lay off 336 people from across the company. This is an extremely difficult decision. We believe this is a necessary step to put our company on a stronger path towards growth.

 The team has been working around the clock to produce a streamlined roadmap for Twitter, Vine, and Periscope, and they are shaping up to be strong. With the utmost respect for each and every person, Twitter will go to great lengths to take care of each individual by providing generous exit packages and help finding a new job.

 The roadmap is a plan to change how we work, and what we need to do that work. Product and Engineering will make the most significant structural changes to reflect our plan ahead, focused on the experiences which will have the greatest impact. We feel strongly that Engineering will move much faster with a smaller and nimbler team, while remaining the biggest percentage of our workforce. And the rest of the organization will be streamlined in parallel.

 Let's take this time to express our gratitude to all of those who are leaving us. We will honor them by doing our best to serve all the people that use Twitter. We do so with a more purpose-built team, which we'll continue to build strength into over time, as we are now enabled to reinvest in our most impactful priorities. As always, please reach out to me directly with any ideas or questions.

 Jack
Notice I left out Moments, because I think it's in really poor form to mention efforts made before the layoff, with those 336 people, as being a part of this new roadmap that includes laying off those 336 people. Really, really poor move.

moron4hire 2 days ago 1 reply      
Can someone explain to me how these places figure out that it's engineering's fault for failing to figure out how to monetize passive aggression, 140 characters at a time?

I mean, if I were asking myself "why did we fail to achieve our expected growth potential", am I going to blame the people who did what I told them to do, or am I going to blame what I told them to do?

Well, clearly, if I'm an MBA, I'll blame the stupid proles.

curiousjorge 2 days ago 1 reply      
Have a feeling that Twitter is one of the unicorns to go next year.

He's approaching this as a simple restructure & pray with engineering teams, when in fact the problem is much more serious: there's a loss of confidence in Twitter from investors.

I guess cutting when investors feel it's due is a good way to appear like you are making changes, when in fact the problem with Twitter is much more deep-rooted and fundamental.

1) Investors realize twitter is horrible for monetization

2) Investors are out of patience or trust

3) Twitter scrambles to find a sustainable revenue source.

4) Twitter cuts off Hootsuite and launches competing business

5) Twitter's massive botnets disappear, revealing that only a small number of its user base is active, sparking SEC involvement.

foobarbecue 2 days ago 2 replies      
nanoojaboo 2 days ago 0 replies      
Dear Mr Jack, please do not cut my job. I have a wife and two kids and a big mortgage. Thank you, NanooJaboo
signaler 2 days ago 0 replies      
As a hobbyist coding small Twitter-like projects in my spare time, I feel their pain: I have consistently had to re-adjust the code base and the number of project contributors. This is observable on the micro scale, and I hate to think how this plays out at the scale of Twitter, where unbridled and unchecked growth was allowed to take over the company, causing them to lose focus.

Twitter is essentially one big DevOps success story / failure after another, and I have faith they can start to focus again. One motif / question I have seen in every pundit's post about Twitter as a company is why the market (up until now, perhaps) has not decided Twitter's fate. If it really is the case that Twitter is a big data company, then how come 90% (random estimate) of their users are fembots / fake accounts?

smonff 2 days ago 0 replies      
The IT world needs a better class consciousness. The class struggle isn't something from the past. Twitter is one of the biggest companies of the net economy, and it can just fire 336 employees because "the world needs a strong Twitter". How can a big company like this be allowed to fire people this way? How are the employees gonna react? Are they gonna fight?

Ok, we are not working in a mine, we are working on servers and data and concepts and communication tools, comfortably sitting in white offices, but these companies make a huge amount of money from our work and then throw employees away like garbage? Noooo, this is not acceptable. Jack, do you think that people will take your generous exit package and feel fine? No: some will have difficulties finding a new job, some will divorce, some will be forced to sell what they built to survive, some will get depressed because of unemployment, some might even commit suicide. The price to pay for this can't be covered by your exit packages.

There is a serious problem here. And we are not organized at all to fight this. But workers could unite again. After all, organizing a server strike is not that hard. I wonder why it doesn't happen.

Workers of the world, unite!

The Drone Papers theintercept.com
454 points by yuvadam  13 hours ago   180 comments top 17
bambax 9 hours ago 7 replies      
This phrasing is incorrect:

> [there are x persons] President Obama ha[s] authorized U.S. special operations forces to assassinate

US special ops are the President's weapon. It doesn't make sense to say "yesterday I authorized my gun to kill a man"; what you want to say is "yesterday I killed a man".

President Obama didn't "authorize targeted killings" or whatever you want to call those.

He (and Bush before him) has assassinated people, and continues to do so without trial or due process, while smiling and holding babies and making jokes at the White House correspondents' dinner, and complaining about gun culture and mass killings.

On the same day as the Umpqua College shooting (10 dead), US drones in Afghanistan targeted a hospital and killed 22 people (12 staff, 10 patients including 3 children).

If Obama really wants Americans to get rid of their guns, maybe he should start with his own.

But of course he won't, so forgive me for not listening to whatever he has to say.

yuvadam 12 hours ago 4 replies      
There are a few interesting anecdotes about this huge story -

This is the same leak that Glenn Greenwald describes to Edward Snowden in Moscow (-> -> -> -> POTUS), meaning this story has been in the making for at least 1.5 years.

Second, this quality and quantity of leaks is incredible and can only be attributed to a news organization that takes security as a paramount consideration, setting up proper secure channels to enable technically apt whistle-blowers to approach them with confidence.

Props to The Intercept for some fine journalistic work.

gortok 11 hours ago 4 replies      
There are lots of important points in this set of articles, but a few stand out:

- Congress has not defined what "assassination" means; and since they haven't defined it, the Executive Order 12333 is effectively meaningless. https://theintercept.com/drone-papers/the-assassination-comp...

- IMEIs are used to track targets. Can IMEIs be spoofed?

- Military Aged Males are considered enemy combatants. Effectively that means all males ages 18-49 are considered enemy combatants. Since you're guilty by association, your age makes you a target.

mcphilip 7 hours ago 4 replies      
Logically, if the U.S. feels justified in targeted drone strikes against legitimate threats to national security, what's to stop China or Russia from doing the same? Aren't there some legitimate scenarios where U.S. and Chinese national security interests are in opposition?

Who has the "moral high ground", if such a thing exists, in such cases?

uptown 11 hours ago 0 replies      
The full searchable PDFs of the Drone Papers are available here:


snake_plissken 8 hours ago 3 replies      
For me, the most grave offense of our current president was taking out Anwar al-Awlaki in 2011. I understand he was not a good individual, by any means, but he was an American citizen, which entitled him to all of the protections afforded under the Constitution.

Programs like these are what undermine any legitimacy our country has left.

csommers 8 hours ago 2 replies      
Kind of eerie: http://chj.tbe.taleo.net/chj05/ats/careers/requisition.jsp?o...

Quick Google search for the key-term "GILGAMESH"...

kushti 5 hours ago 1 reply      
I don't see this in CNN's top stories yet, and the BBC is also silent (though a UK citizen was killed by a US drone). However, RT and other media independent of Western governments already have this on their main pages. Let's think about why, dear HNers.
mahyarm 6 hours ago 0 replies      
I thought it has been known for years that the US govt uses drones to execute military and assassination targets. And the public has no insight into how these decisions are made.

This may reveal internal details, but I don't know how different it is from what the world has known for years.

NN88 1 hour ago 0 replies      
praptak 10 hours ago 8 replies      
Hm. So what stops US from targeted killings in China? Russia? Europe? Only the probable retaliation?
kzhahou 4 hours ago 0 replies      
This could use a better title. I avoided the link all day thinking it was about quadcopters.
jstalin 8 hours ago 0 replies      
War is the health of the state.
d23 9 hours ago 2 replies      
I know it's a tangent, but I'm actually blown away by the quality of the website itself. They've managed to use modern techniques like header background video and fading in of navigation elements in a way that is classy and reserved. It doesn't lag up my browser; it doesn't make jarring movements that cause UI elements or content to jump out of line of the eye.

The best part? It actually helps add to the point they're trying to make. It's... chilling, serious, international-stakes stuff here. Kudos on perfect execution of the full package.

dang 10 hours ago 1 reply      
We changed the URL from https://theintercept.com/drone-papers to the first article on the list, which begins by introducing the series.
whatafarce 10 hours ago 8 replies      
Utterly disgusting: America conducts clear war crimes, tortures, bombs Afghan hospitals, assassinates, all without consequence.

I'm sure some idiot will tell me to vote, because that'll change the systemic corruption and nightmare that has taken hold of US power.

Did voting in Nazi Germany help? No, because they were voted in; voting only legitimizes a political system completely captured by insiders and corporate juggernauts. These people only respond to power and will not hesitate to kill you, imprison you, or otherwise destroy your life.

But go on, tell us how voting matters. Lol.

Flux is the new WndProc bitquabit.com
528 points by gecko  2 days ago   125 comments top 25
Todd 2 days ago 1 reply      
I've also observed this similarity. The msg is like the actionType, the wParam and/or lParam are like the polymorphic objects that you pass with your action.

The dispatcher is also not the most efficient model, where every store is registered to listen to every event. This is a bit like multiple windows on an event loop. The difference is that in Windows, messages are almost always targeted to a particular window's handle (hwnd). This doesn't make sense in Flux, since it's more of an observer pattern. The logic of interpreting the meaning of an action is left to each store, which is really just a cache.

The biggest problem I have with Flux relates to this polymorphism. I use TypeScript where possible and this is the one place where it always breaks down. I understand the appeal of plain JS objects, but the only way to ensure your Flux-based system is stable is to have lots of unit tests around your actions and stores.

Redux is a more straightforward take on caching. I can also use type annotations on the reducers and associated store structure, so this helps ensure structural consistency. It also solves the isomorphism problem of server side rendering because each request can get its own state. There is no out of the box solution for this with Flux, since stores are singletons by default.

Minor nit: stores are just caches with observers. I'm not sure why they weren't just called caches.
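The registered-stores model described in this comment can be sketched in a few lines of plain JS. This is a toy illustration, not Facebook's actual Dispatcher: every registered store sees every dispatched action and decides in a switch, much like a WndProc, whether the message concerns it.

```javascript
// Toy dispatcher: every registered store callback receives every action,
// like windows on a shared message loop.
function Dispatcher() { this.stores = []; }
Dispatcher.prototype.register = function (fn) { this.stores.push(fn); };
Dispatcher.prototype.dispatch = function (action) {
  this.stores.forEach(function (fn) { fn(action); });
};

var dispatcher = new Dispatcher();
var count = 0;
dispatcher.register(function (action) {
  switch (action.type) {          // the WndProc-style switch
    case 'INCREMENT': count += action.payload; break;
    // other stores would ignore actions they don't care about
  }
});
dispatcher.dispatch({type: 'INCREMENT', payload: 2});
console.log(count); // 2
```

Note how the meaning of each action lives in the stores, not in the dispatcher, which is exactly the inefficiency the comment points out: there is no targeting equivalent to an hwnd.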

unoti 2 days ago 3 replies      
The big idea from old school windows that is shared with Flux is the idea of little views that render themselves and manage their own state. In Windows we called those Controls or Window Classes. It is a good idea, and one worthy of preserving.
mpweiher 2 days ago 1 reply      
A couple of corrections:

1) Mac OS X does not store a bitmap for every widget, that's iOS's architecture. It stores a bitmap for every window. Having a layer (GPU-stored bitmap) was only introduced once CoreAnimation was ported to OS X. It was and is optional.

2) OS X views also have a -drawRect: method that works the same way.

3) In fact that's how MVC works. See http://blog.metaobject.com/2015/04/model-widget-controller-m...

And React and frameworks like it just duplicated this, see http://blog.metaobject.com/2015/04/reactnative-isn.html In fact, when I first read about React (non-native), my first thought was "hey, finally they came up with a good equivalent of NSView + drawRect:"

jowiar 2 days ago 3 replies      
As someone who has written several things with Flux and Flux-esque architecture, I see it as a step in the middle, rather than where things are ending. It's not a large step from Flux (Stores update themselves in response to actions) to Redux (Model the entire application as reducers on a sequence of Actions) to RxJS Observables.

What's shared in there is the idea that unidirectional data flow is a whole lot easier to reason about, model, and simulate than 2-way data flow. Everything else is semantics.
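The Redux end of that spectrum can be sketched as a literal fold over the sequence of actions. This is a toy example with an invented `counter` reducer; real Redux adds a store, subscriptions, and middleware on top of the same idea.

```javascript
// Model the entire application state as a reduce over actions:
// state' = reducer(state, action), starting from an initial state.
function counter(state, action) {
  if (state === undefined) state = 0;   // initial state
  switch (action.type) {
    case 'ADD': return state + action.payload;
    default:    return state;           // unknown actions leave state alone
  }
}

var actions = [
  {type: 'ADD', payload: 1},
  {type: 'ADD', payload: 4},
  {type: 'NOOP'}
];
var finalState = actions.reduce(counter, undefined);
console.log(finalState); // 5
```

The unidirectional flow the comment mentions falls out naturally: state is only ever derived from the action history, never mutated from the view side.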

ajsharp 2 days ago 2 replies      
There are some great things going on in React / Flux, but the part that needs to be emphasized about Flux, that Facebook doesn't address explicitly anywhere, and that most people eager to always be on the cutting edge will never admit, is that this stuff was designed to solve problems for very complex applications. Complexity is relative, and the solutions that reduce complexity and friction in the development process for Facebook may increase it for another organization. That is to say, Flux / React et al is by no means simple. Not even a little bit. But it probably simplified a lot of things for the Facebook team. However, YMMV for your 6 person startup engineering team.
estefan 2 days ago 3 replies      
...and so for those of us who aren't Windows developers, what lessons can we apply to Flux to make it better?
jxm262 2 days ago 2 replies      
This was an awesome read. We use React and Flux daily at work so I'm going to share this with coworkers. I'm a little confused on what the author's concern is though.

> I've just felt... well, weird. Something seemed off

Is there anything substantively wrong with the flux pattern or drawbacks?

narrator 2 days ago 3 replies      
So what is Angular then? Angular seems to me to be more like an ORM for the view where there's dirty checking of the model and then update events are dispatched to the external system which is the DOM instead of the DB. Is there something similar in the GUI toolkit world?
arijun 2 days ago 0 replies      
danellis 2 days ago 2 replies      
I share the author's feeling of déjà vu. I feel like I've seen this article already. It was a comment posted on HN earlier today. It's kind of fascinating how someone's comment can get promoted to someone else's blog post in a few hours.
pducks32 2 days ago 1 reply      
See, I think Flux is too low-level. I think it's too hard to reason about from the top level. Not that the architecture is inherently bad (people are using it a ton), but things get out of hand way too fast. Regardless, I can't wait to see web development in a year!
geowa4 1 day ago 0 replies      
I've never liked the comparison of Flux to functional reactive programming. It's really just good ol' object-oriented design. Actions are akin to the Command pattern and the Dispatcher feels like a Mediator. Passing callbacks instead of objects and making a mostly directed graph does not yield FRP.

In my latest project, I used React with rx-react (https://github.com/fdecampredon/rx-react) and RxJS. That combination definitely made for some FRP fun.

hoprocker 2 days ago 0 replies      
I love the correlation between modern in-browser development and programming early personal computers. It's akin to how digital logic abstracts away the tyranny of E&M physics, but several layers higher, and this time just between instruction sets/runtimes.

ChromeOS is kind of making this leap, but I really wonder when web browser ASICs (or equivalent) will start popping up.

dustingetz 2 days ago 6 replies      
The author does not understand React :(

> React by itself doesnt actually solve how to propagate changes

It does actually - you update the state, then React propagates the changes for you through its props mechanism. Flux is an extra layer of indirection over state changes if you need it: https://twitter.com/floydophone/status/649786438330945536 (edit: I regret my tone here, there is clearly ongoing work in this area and no widely accepted best practice yet)

Flux is not message passing, React components do not redraw themselves, React components do not pass messages to each other, Flux only superficially looks like winapi because of the switch statement in that particular example.

React provides the view as a function of state. winapi is nothing like that.

React is a giant step towards functional programming. winapi is definitely nothing like that.

edit: Windows -> winapi

pducks32 2 days ago 0 replies      
Does anyone know of a good place to learn about these different approaches. I find this so fascinating.
amelius 2 days ago 0 replies      
Stated more simply, React is just like "rebooting" your computer after you have changed the config files. It is, in this respect, quite ancient technology, except that the framework hits the "reset" button for you.
antoaravinth 1 day ago 0 replies      
What a great article. I was asking in my previous thread what framework I should use, React or Angular: https://news.ycombinator.com/item?id=10359497

Clearly, from what I have heard from HN and from this blog post, React with Flux is just the old way of doing web development today! That's great!

sovande 2 days ago 0 replies      
The big dispatcher switch in Flux is eerily reminiscent of how we used to program AWT widgets back in Java 1.0 days. This architecture was improved greatly in Java 1.1 with a delegation model. If history is to repeat itself, as the OP so eloquently argues, then, if you want to see where Flux will be going in the next couple of years, start using knockout.js now and for once stay ahead of the curve.
thewarrior 2 days ago 1 reply      
Which is the best model to date for complex UI ?

Cocoa + Interface Builder or XAML/WPF ?

Have used Cocoa + Interface Builder and it's quite a joy compared to web dev.


This Stack Overflow question has some thoughts on it: http://stackoverflow.com/questions/2442340/how-does-cocoa-co...

avodonosov 2 days ago 0 replies      
In this line of reinventing the wheel of UI programming in web dev, I am waiting for Borland Delphi reincarnation.
iMark 2 days ago 0 replies      
I've only looked into iOS programming a little, but is this not similar to how views are handled there too?
jsprogrammer 2 days ago 1 reply      
And Node is essentially the Windows message loop [0].

[0] https://en.wikipedia.org/wiki/Message_loop_in_Microsoft_Wind...
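The analogy shows up even in a toy snippet: callbacks queued with setImmediate are pulled off and run one at a time after the synchronous code finishes, much like messages being pumped through a message loop (illustrative only, not how libuv is actually implemented):

```javascript
// Node's event loop runs queued callbacks one at a time, like a message
// loop pulling messages and invoking the WndProc for each.
var order = [];
order.push('sync');                    // runs immediately
setImmediate(function () {
  order.push('message 1');             // runs on a later loop turn
  setImmediate(function () {
    order.push('message 2');           // and later still
    console.log(order.join(','));      // sync,message 1,message 2
  });
});
```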

jesstaa 1 day ago 0 replies      
Also, Ruby on Rails is Flux.
whatever_dude 2 days ago 1 reply      
The writer really likes the word "idempotent".
underwater 2 days ago 0 replies      
WndProc is how the window manager communicates with Windows UI code. Flux is how the UI communicates actions back to the data layer of the application. They're completely different.
Page Weight Matters (2012) chriszacharias.com
455 points by shubhamjain  10 hours ago   139 comments top 25
nostrademons 7 hours ago 4 replies      
When I joined Google in 2009, we were on the tail-end of a latency optimization kick that Larry had started in 2007. At the time, we had a budget of 20K gzipped for the entire search results page. I remember working on the visual redesign of 2010, where we had increased the page weight from 16K to 19K and there was much handwringing at the higher levels about how we were going to blow our entire latency budget on one change.

We did some crazy stuff to squeeze everything in. We would literally count bytes on every change - one engineer wrote a tool that would run against your changelist demo server and output the difference in gzipped size of it. We used 'for(var i=0,e;e=arr[i++];) { ... }' as our default foreach loop because it was one character shorter than explicitly incrementing the loop counter. All HTML tags that could be left unterminated were, and all attributes that could be unquoted were. CSS classnames were manually named with 1-3 character abbreviations, with a dictionary elsewhere, to save on bytesize. I ran an experiment to see if we could use JQuery on the SRP (everything was done in raw vanilla JS), and the results were that it doubled the byte size and latency of the SRP, so that was a complete non-starter. At one point I had to do a CSS transition on an element that didn't exist in the HTML, because it was too heavy and so we had to pull it over via AJAX, so I had to do all sorts of crazy contortions to predict the height and position of revealed elements before the code for them actually existed on the client.
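The compact foreach idiom mentioned above works by relying on the value of the assignment inside the loop condition, so it only terminates correctly when the array holds truthy elements (fine for arrays of objects or DOM nodes). A toy demonstration, not Google's code:

```javascript
// 'for(var i=0,e;e=arr[i++];)' stops when arr[i++] is falsy, i.e. when
// the index runs past the end and yields undefined. One character shorter
// than 'for(var i=0;i<arr.length;i++)', but unsafe for arrays containing
// 0, '', null, or false.
var arr = [{id: 1}, {id: 2}, {id: 3}];
var ids = [];
for (var i = 0, e; e = arr[i++];) {
  ids.push(e.id);
}
console.log(ids); // [ 1, 2, 3 ]
```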

A lot of these convolutions should've been done by compiler, and indeed, a lot were moved to one when we got an HTML-aware templating language. But it gave me a real appreciation for how to write tight, efficient code under constraints - real engineering, not just slapping libraries together.

Alas, when I left the SRP was about 350K, which is atrocious. It looks like it's since been whittled down under 100K, but I still sometimes yearn for the era when Google loaded instantaneously.

dpweb 9 hours ago 9 replies      
If you have an engineering mind and care about such things - you care about complexity. Even if you don't - user experience matters to everyone.

Have you ever seen something completely insane while everyone around you doesn't seem to recognize how awful it really is? That is the web of today. 60-80 requests? 1MB+ single pages?

Your functionality, I don't care if it's Facebook, does not need that much. It is not necessary. When broadband came on the scene, everyone started to ignore it, just like GBs of memory made people forget about conservation.

The fact that there isn't a daily drumbeat about how bloated, how needlessly complex, how ridiculous most of the world's web applications of today really are - baffles me.

jonahx 9 hours ago 1 reply      
This is a fascinating example of Simpson's Paradox:


It also reminds me of the phenomenon, in customer service, whereby an increase in complaints can sometimes indicate success -- it means the product has gone from bad enough to be unnoticeable to good enough to be engaged with.
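A made-up numeric illustration of how the paradox applies here: every user cohort's latency improves, yet the overall average rises because a new, slower cohort enters the sample. The numbers are invented, not YouTube's:

```javascript
// Weighted average latency across user cohorts.
// groups: [{n: number of users, t: mean page load time in seconds}]
function avg(groups) {
  var users = 0, total = 0;
  groups.forEach(function (g) { users += g.n; total += g.n * g.t; });
  return total / users;
}

// Before the lighter page: only fast-connection users can load the site.
var before = [{n: 1000, t: 2}];
// After: fast users got faster, AND slow-connection users can finally load it.
var after  = [{n: 1000, t: 1}, {n: 200, t: 60}];

console.log(avg(before).toFixed(2)); // 2.00
console.log(avg(after).toFixed(2));  // 10.83 - average rose, everyone is better off
```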

lnanek2 8 hours ago 2 replies      
Pretty funny story considering YouTube is back to being unusable on slow connections. They used to buffer the full video, so you could load up a page, let it sit until the video buffered, then watch it eventually, maybe after reading your social news sites. Nowadays the buffering feature has been removed and you'll just come back, hit play, get a second or two of video, then it has nothing again for a long time.

Feels bad for the engineer who spent all that time reducing the size and finding out it made YouTube much more usable across the globe. Amusingly, disabling buffering was probably some penny wise pound foolish way to save bandwidth.

Splines 8 hours ago 1 reply      
If you're on Windows you can use the Network Emulator for Windows Toolkit (NEWT): http://blogs.technet.com/b/juanand/archive/2010/03/05/standa...

I've used it to emulate what it's like on a high-latency or high-loss network. Relatively easy tool to use.

motoboi 9 hours ago 2 replies      
Coming from a low bandwidth, high latency part of the world, I can't confirm this enough.

Today, I have 2 mbit and can use Netflix or YouTube just fine, but a mere 4 years ago I had 600k and, boy, that was hard. Hard as in loading a YouTube URL and going for a coffee.


In case Duolingo developers are listening, please test your site on high latency and very low bandwidth scenarios. I just love your site, but lessons behave too strangely when internet is bad here.

teach 9 hours ago 1 reply      
Comments from the last time this was posted: https://news.ycombinator.com/item?id=4957992
SandB0x 9 hours ago 0 replies      
It is insane. One of my favourites is the "about.me" site, which is meant to be a simple online business card. Picking a random page from https://about.me/people/featured, you get a page weighing over 3MB!


mbrock 9 hours ago 1 reply      
Everyone's sometimes on spotty WiFi or foreign expensive 3G. I'm more inclined to trust fast-loading sites and apps.

I wonder what would happen if for example iOS decided to visually indicate page weight, kind of like how you can see which apps use the most energy.

gketuma 9 hours ago 4 replies      
As a web dev I always have this in mind but the challenge is convincing your client who wants a video background. Maybe we need a media query that detects internet speed.
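No such media query exists in CSS, but browsers have been sketching a still non-standard Network Information API in JS. A hedged sketch of how a page might use it to skip a video background, with a safe fallback where the API is absent; `effectiveType` and `saveData` are the fields that API proposes, and availability varies by browser:

```javascript
// Decide whether to load a heavy asset (e.g. a video background) based on
// the non-standard navigator.connection object, if the browser exposes it.
function shouldLoadVideo(nav) {
  var conn = nav && nav.connection;
  if (!conn) return true;                     // API absent: assume fast
  if (conn.saveData) return false;            // user asked for reduced data
  var slow = ['slow-2g', '2g', '3g'];
  return slow.indexOf(conn.effectiveType) === -1;
}

console.log(shouldLoadVideo({connection: {effectiveType: '2g'}})); // false
console.log(shouldLoadVideo(undefined));                           // true
```

In a real page you would call `shouldLoadVideo(navigator)` and fall back to a static image when it returns false.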
misterbwong 8 hours ago 1 reply      
Isn't this phenomenon getting worse now that responsive design is in vogue? We've collectively decided to shoehorn a website designed for the connectivity and speed of a desktop browser into a lower powered device with slower/spotty connectivity.

Genuinely curious: Why is this better than a mobile-friendly site designed specifically with the constraints of a mobile device in mind?

bsimpson 9 hours ago 4 replies      
Though certainly an interesting anecdote, I don't understand how a video-streaming site like YouTube would be useful in a market where a 100K download takes 2+ minutes. You'd have to open a page, walk away for an hour, and hope everything was OK when you got back.
paulirish 3 hours ago 0 replies      
More than page weight, this article demonstrates that averages are dangerous, especially for performance metrics. All key metrics should be plotted in 50/90/95/99 percentiles, and for latency-sensitive ones, geographic breakouts can often reveal a serious delta from the mean.
hyperion2010 2 hours ago 0 replies      
Heh, I've been using Flask templating to make some HTML forms for exploring large datasets. Turns out when you have 6000 terms that show up in 6 different UI elements, putting those in as raw HTML results in a 13MB file that compresses down to 520KB. Pretty awful use case for forms. I'm pretty prejudiced against JavaScript, but having seen this I now deeply appreciate being able to send something other than raw HTML.
jjzieve 6 hours ago 1 reply      
For some reason this whole problem reminds me of early game developers dealing with small amounts of RAM. Which clearly isn't a problem today. So would it be fair to say we should focus on increasing bandwidth to most of the world. I'm not saying page weight doesn't matter, but if you're just trying to get something off the ground maybe you shouldn't worry about it so much. I mean, why worry about users with poor bandwidth, if you don't even have users? If you already have a growing user base, then by all means refactor, reduce the footprint. But if you don't, code the damn bloated thing first.
cr4zy 3 hours ago 0 replies      
I highly recommend turning on network throttling in the Chrome Dev Tools sometime. You'll be amazed at how slow even 4G seems.
andrewstuart2 7 hours ago 3 replies      
Page weight may matter, but I think amortized page weight matters most. It's like the marshmallow experiment for the web. If you can make one request at 10x the size, but it's only made 1/100th as often (presumably spans multiple pages) then as long as people come back enough to justify that initial extra cost, you've effectively decreased to 1/10th again.

That's why I think AJAX, web manifest [1], indexedDB, localStorage, etc. need to be leveraged much more. Imagine most of your app loading without making a single request, except for the latest front page JSON, or the latest . You have a bunch already in indexedDB so you just ask the server "hey, what's new after ID X or timestamp T?"

So your two minutes just became a couple milliseconds (or whatever your disk latency happens to be), and the data loads shortly thereafter, assuming there's not much new data to send back. And if you don't need any new resources, you only had to make a single request.

[1] https://github.com/w3c/manifest
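The "what's new after timestamp T" idea above can be sketched in a few lines. All names here are hypothetical, `fetchItemsSince` stands in for whatever endpoint wrapper the app would provide, and the sketch assumes the server returns items sorted by ascending timestamp:

```javascript
// Cache items locally and ask the server only for records newer than the
// latest one we already have, instead of refetching everything.
// storage: anything with localStorage's getItem/setItem interface.
// fetchItemsSince(ts): hypothetical API call returning a Promise of new items.
function loadFrontPage(storage, fetchItemsSince) {
  var cached = JSON.parse(storage.getItem('items') || '[]');
  var latest = cached.length ? cached[cached.length - 1].ts : 0;
  return fetchItemsSince(latest).then(function (fresh) {
    var items = cached.concat(fresh);
    storage.setItem('items', JSON.stringify(items));
    return items;
  });
}
```

On a warm cache the request either returns a tiny delta or nothing at all, which is the amortization the comment is arguing for.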

nicolethenerd 9 hours ago 0 replies      
Nice to see this again - I've told this story to many of my web dev students. :-)
csense 5 hours ago 0 replies      
The more things change, the more they stay the same.

I remember visiting microprose.com with my 14.4k modem in the mid-90s and being mad that they used so many images I had to wait for about 5-10 minutes or so. I couldn't effectively read it at home and usually ended up reading it at the library.

foxbarrington 8 hours ago 1 reply      
If it takes two minutes to load a 100kb page, does it take twenty minutes to watch a 1MB video? Over three hours to watch a 10MB video?
beatpanda 9 hours ago 0 replies      
I went to work for a company that makes a travel product used by people in almost every country in the world, after trying to use it in southeastern Europe. I told them their page weight was killing the experience, and wanted to join the front end team to fix it.

After 6 months of banging my head against a wall, I realized the reason we weren't fixing page weight was because our product managers didn't care about the experience of users in poorer countries, because they didn't have any money to spend anyway. Even though we had lots of users in those countries, and even though we made a big show of how you could use this app to travel anywhere in the world.

If there's a lesson there, it's that as long as cold economic calculations drive product decisions, this stuff isn't going to get any better.

ninjakeyboard 7 hours ago 0 replies      
If it took two minutes to load the framing page, how would they be able to stream the video?
drikerf 8 hours ago 0 replies      
Great point, and very important at a time when we bundle who knows how many JS dependencies for client apps.
chadwittman 7 hours ago 1 reply      
sirtastic 9 hours ago 1 reply      
WOW! Faster load times and lighter code makes for a better user experience? (mindblown)
Tesla Model S Autopilot Features teslamotors.com
373 points by extesy  1 day ago   302 comments top 28
ChicagoBoy11 1 day ago 10 replies      
A common phrase in aircraft cockpits nowadays is "What the heck is it doing now?" as pilots have migrated from actually flying the plane to simply being glorified systems managers.

While planes have become so, so, so much safer because of all this automation, pilots uncertainty regarding autopilot functioning is a major concern nowadays, and the reason for several accidents.

There are very interesting HCI challenges around properly communicating to the pilot/driver "what the heck it is doing" and clearly communicating just how much control the human has or doesn't have at any given point.

This "announcement" certainly doesn't inspire any confidence that they have really thought this through deeply enough (I think they probably have, but it should be communicated like it). As a huge Tesla fan, I can't help but feel like I need to hold my breath now and make sure something terrible doesn't happen because of this, and it ends up leading to even more regs setting us back on the path to full car automation.

dognotdog 1 day ago 5 replies      
While the over-the-air update is novel, these features all exist on current luxury and even some middle class vehicles as part of driver assistance option packages.

They're typically called Lane Keeping Assistant, Adaptive Cruise Control, Blindspot Warning, Automated Parking, Traffic Sign Recognition, etc.

The emergency steering bit is interesting, though no further details are provided, as it requires the car to ensure that there is a safe space to steer into, which is dicey for a forward collision emergency braking system, so I'd conjecture it is connected to the side collision warning, and allows collision avoidance if there is enough space in the current lane.

lightcatcher 1 day ago 3 replies      
What sensors does the Model S have? I'm surprised that Tesla sold a car with enough sensors for semi-autonomous operation without the actual software until now.

For those with more knowledge about cars, how does the sensor array in the Model S compare with similar models from companies such as BMW, Audi, Mercedes-Benz? I'm interested in knowing if it's software or the already installed hardware holding back recent luxury cars from similar capabilities.

Also, does anyone know anything about the (digital) security features of the Tesla? This announcement from Tesla makes it clear that the actual control of the vehicle can be modified by an over-the-air software update. With the recent Jeep hack[0] in mind, does anyone know if something similar is possible on a Tesla, or if there are some safeguards such as signed updates? As one of the most computerized cars on the market, I tend to think that the Tesla cars might also be some of the most (maliciously) hackable cars on the market.

[0] http://www.wired.com/2015/07/hackers-remotely-kill-jeep-high...

joosters 1 day ago 2 replies      
Releasing driving assistance features as a 'beta'? What on earth does that mean here? Are the features ready to use or not? Do Tesla warrant that they work and are safe?

Maybe they expect drivers to treat it like beta software - "Please don't use these features in production cars. Make sure you keep backups of all drivers and passengers in case of bugs."

jzwinck 1 day ago 7 replies      
> Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

Which sense of "must" is used here? The car seems to play an unwinnable game with the driver: keep your hands on the wheel or I'll...what? Disengage autosteer and perhaps crash? With no enforcement mechanism, drivers are incentivized to "abuse" (aka "use") the system as much as it allows.

OopsCriticality 1 day ago 1 reply      
I was surprised to find that the Autopilot feature is a paid $2500 upgrade, according to one source.[0] I'm not surprised that Tesla is charging for the upgrade, but in all the press and enthusiast coverage of Tesla, I don't recall it being mentioned before.

[0] http://blog.caranddriver.com/elon-take-the-wheel-we-test-tes...

verelo 1 day ago 1 reply      
All these controls sound very similar to those in my current-year Mercedes... although I would be hopeful that the autosteer on offer here is better than the Distronic Plus "lane assist" in the Merc, which, while OK, does not do a great job on less-than-gentle turns above 50 km/h (but it's actually great below that speed, to the point I wonder why I'm even in the seat, in particular in stop-start traffic situations). It certainly sounds similar from the "hands must be on the wheel" requirement.

I look forward to the next step up from all the car makers, which is clearly the car driving on its own in a much more confident way, with the driver simply there to manage exceptions as opposed to being 'assisted' by technology as is with the current implementations.

mmerkes 1 day ago 1 reply      
The auto-park feature would be super handy, but I don't see an auto-unpark feature... I look forward to seeing Teslas stuck in amazingly small parking spots!
Mizza 1 day ago 4 replies      
This seems insanely dangerous to me. They're introducing a feature which could, potentially, cause massive highway accidents, but providing documentation that amounts to little more than a glorified README file:

> Auto Lane Change

> Changing lanes when Autosteer is engaged is simple: engage the turn signal and Model S will move itself to the adjacent lane when its safe to do so.

A single sentence! What's the point of having driver's license lessons and testing if the fundamental operation of the vehicle can change so drastically?

Am I being a luddite, or does anybody else feel this way?

blisterpeanuts 16 hours ago 1 reply      
This is fantastic. I'm psyched, not just because of the cool technology, but also because it will finally spur the public to demand more frequent and accurate road striping.

Massachusetts has terrible road striping; it seems as though they get around to it about every four or five years, waiting until the lanes and ramp markings are beyond dangerous. This has been irritating me for years. And then they seem to use some kind of cheap paint that wears off quickly. Public works job security, I suppose.

But automated lane navigation will require clear markings. Hundreds of thousands of deaths later, we just might finally get a safer road system. Pathetic, but better late than never, I guess.

grecy 1 day ago 0 replies      
Videos are starting to show up on youtube.


VERY impressive.

NN88 1 day ago 5 replies      
How is this different from Mercedes-Benz's "self driving?"
roflchoppa 20 hours ago 0 replies      
Yo if anyone at tesla is reading this, can you implement a feature for the car to move over into the far side of the lane when people are lane-splitting? People already do it manually on the highway, but if this car also did it, that would be neat. thx, and look twice for motorcycles.
waterlesscloud 1 day ago 1 reply      
Here's a video of version 7 in action that someone linked in /r/selfdrivingcars last night. Not super-informative, but interesting to watch anyway.


sandworm101 1 day ago 8 replies      
Note two words absent from the OP: "Speed limit".

This machine will keep pace with traffic. OK. Does that mean it will break speed limits? Unless it is scanning for each and every potential road sign, it simply cannot respond to arbitrary/temporary limits. The determination of the legal limit on a piece of road is a complex task. Road construction, local conditions, sunrise/sunset, time of year (school zones) and even weather can be a factor. And let us not forget "Speed limit X when children on road". You need some serious cpu time to work out whether that person walking along the road is a schoolgirl or a construction worker.

Imho any system not capable of determining the speed limit accurately is a legal liability. Have fun with the tickets.

> eliminating the need for drivers to worry about complex and difficult parking maneuvers.

No. Parallel parking is neither a complex nor a difficult maneuver. It is total beginner territory. No lives are at risk. With a decent bumper, even the risk of property damage is minimal. Anyone not capable of learning to parallel park probably shouldn't be behind the wheel of much of anything. Anyone buying this car to avoid such mundane tasks isn't someone with whom I want to share the road.

derek 1 day ago 4 replies      
> Drivers must keep their hands on the steering wheel.

This seems odd; my understanding was that drivers needed to "check in" every so often, not handle the wheel at all times.

Animats 22 hours ago 0 replies      
This is similar to what other high-end cars have, lane-keeping and smart cruise control, usable only in freeway-type situations. "Drivers must keep their hands on the steering wheel." Mercedes calls this "Active Lane Keeping Assist", and has offered it for several years now. Here's someone using it with a can taped to the steering wheel to defeat the "hands on steering wheel" requirement.[1] All the major manufacturers have demoed this.

This is NHTSA Level 2 automation (Combined Function Automation).[2] ("An example ... is adaptive cruise control in combination with lane centering.") Google is at Level 3 (Limited Self-Driving Automation), and going for Level 4 (Full Self-Driving Automation).

The big problem at Level 2 is keeping drivers from using it when they shouldn't. Level 2 doesn't understand intersections at all, for example. Or pedestrians, bicycles, baby carriages, deer, snow, etc. That's why the major manufacturers are being so cautious about launching it into a world of driving idiots.

Volvo has now officially taken the position that if an autonomous car of theirs gets into a crash, it's Volvo's fault and they will accept liability.[3] Now that Volvo has said that, other car manufacturers will probably have to commit to that as well.

[1] https://www.youtube.com/watch?v=Kv9JYqhFV-M

[2] http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Departm...

[3] http://www.extremetech.com/extreme/215832-volvo-well-take-th...

tonylemesmer 18 hours ago 0 replies      
To me a big issue is the perception of the autonomous driving mode to other road users. How will other drivers and pedestrians know that this car is being driven automatically and treat it accordingly?

In the Bosch video[1] (also linked elsewhere on this thread), the system jumps out of autonomous mode at the first junction. The driver has to re-engage it. Drivers who are not closely monitoring the car situation might not realise what's going on and take 5-10 seconds to realise they need to re-engage autonomous mode or to take over fully.

Some following-drivers will get agitated by this, in the same way that some do with elderly or learner drivers and make silly impatient manoeuvres. If the state of the autonomous car is clearly communicated to other road users then they might be prepared to make allowances.

During this transition phase of mostly human drivers vs. autonomous drivers it will be these situations that frame people's perception of the merits of autonomous vehicles.

That and how the systems react to dangerous swerving or lane changing or conflict situations where collisions are impossible to avoid. The "Google system" or the "Tesla system" will be tested millions of times a day and face intense public scrutiny whereas human drivers are treated individually.

[1] https://youtu.be/KwD1hjlbhwU

thoman23 1 day ago 1 reply      
"Autosteering (Beta)"

That must be the single most frightening use of the Beta label in history.

abalone 1 day ago 0 replies      
Any thoughts on the potential manufacturer liability for software bugs that lead to accidents?

Certainly there are a lot of precedents with anti-lock braking systems, cruise control, etc. But this stuff seems like such a massive expansion of complexity of software control I wonder what will go down in the courts when the inevitable happens.

mixmastamyk 1 day ago 3 replies      
Unfortunately there's little mention of front collision avoidance (braking), an important safety feature, which I've been waiting for on Teslas for what must be years now.

In the forums there's always the guy that says we should "drive better" instead. With that logic, there's little use for safety features at all.

Shivetya 16 hours ago 0 replies      
Are they going to take responsibility for accidents like Volvo will? While none of the features revealed are new to the industry, he does a great job of marketing it. The ace Tesla has is the over-the-air updates, something the other manufacturers will need to work out, hopefully with an industry-wide standard that can be regulated properly to ensure safety, security, and liability.
spoon16 1 day ago 1 reply      
Anyone know how well the lane change feature works in heavy traffic?
51Cards 1 day ago 1 reply      
"Drivers must keep their hands on the steering wheel."

This video would seem to indicate otherwise?


mathrawka 1 day ago 3 replies      
As someone who spends a fair amount of time traveling between countries that drive on different sides of the road... I am always getting the turn signal and windshield wipers mixed up. So I doubt I can use the auto lane change feature.
capkutay 1 day ago 0 replies      
This is a cool technical achievement, but I don't see the practical use, nor does it seem like a big win for Tesla drivers. So it allows drivers to kind of tune out while driving on the freeway?
rl3 17 hours ago 0 replies      
I wonder how the parallel parking system copes with tailgaters rendering a specific spot untenable.
devit 1 day ago 1 reply      
Is it smart enough to not change to a lane going in the opposite direction or change to a "lane" that is actually a ditch off the road?
How to know if where you live is up and coming: fried chicken vs. coffee shops medium.com
305 points by edward  19 hours ago   294 comments top 46
smikhanov 14 hours ago 10 replies      
A more interesting indicator that the authors may consider is how many people living in the area go jogging.

When I moved to London's SE4 postcode three years ago (it's pretty close to Peckham, FWIW), the regeneration of the area had just started, and as more middle-class-looking people appeared around, more men and women in running gear were visible in the streets in the morning. The poor on average take worse care of their health and fitness, so tapping into Runkeeper's data may prove useful.

Meanwhile, over these three years, the value of my home has grown by more than 100%.

Xophmeister 17 hours ago 6 replies      
There's no justification for the assumption of a high coffee-to-chicken ratio implying up-and-coming. It's not an unreasonable assumption, but it's definitely anecdotal.
JackFr 10 hours ago 3 replies      
This analysis is simplistic and flawed.

It asserts a correlation between the fried chicken/coffee shop ratio index and home price, which is reasonable enough. It then assumes that homes that are undervalued vis-a-vis their implied value by the FC/CS index are up and coming. This likely makes sense in an environment of overall rising house prices. However if house prices are falling or stagnant overall it may be the coffee shops which are lagging the market.
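The residual reading of "up and coming" can be sketched. As a rough illustration only, with entirely made-up neighbourhood data and a simple coffee-share index (not the article's actual method), fit price against the index and flag the areas priced below the fit:

```python
# Hypothetical rows: (area, chicken_shops, coffee_shops, median_price_in_k)
data = [
    ("A", 12, 3, 310),
    ("B", 8, 9, 480),
    ("C", 3, 14, 620),
    ("D", 10, 4, 290),
    ("E", 5, 11, 500),
]

def coffee_share(chicken, coffee):
    """Coffee's share of the two shop types: a crude FC/CS-style index."""
    return coffee / (chicken + coffee)

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

xs = [coffee_share(ch, co) for _, ch, co, _ in data]
ys = [row[3] for row in data]
a, b = fit_line(xs, ys)

# "Up and coming" under this reading: price sits below what the index implies.
undervalued = [row[0] for row, x in zip(data, xs) if row[3] < a + b * x]
```

On this toy data the fit has a positive slope and flags D and E. Note the same residual could equally mean the coffee shops, not the prices, are out of line, which is exactly the falling-market caveat above.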

DannoHung 15 hours ago 9 replies      
This analysis would be tremendously racist in America.
huskyr 14 hours ago 0 replies      
We made this (Dutch) dataviz a couple of months ago about the gentrification of Amsterdam, and indeed, the number of yoga studios and coffee bars closely matched the gentrified areas:


angdis 14 hours ago 4 replies      
My "rule-of-thumb" indicator: whether or not there are a lot of check-cashing, bail-bond and furniture rental establishments.
mhax 15 hours ago 3 replies      
"Not many outlets selling fried chicken"... in Peckham?? I'm not sure the authors data is all that accurate.
TeMPOraL 15 hours ago 4 replies      
Well, China is definitely not an "up and coming" country. I've been here like a week and I already don't want to ever look at chicken in my life again.
acgourley 17 hours ago 2 replies      
Wouldn't you want an area with both? I may be missing the context of London's culture - but if 'up and coming' means 'being gentrified' wouldn't you want to pick an area that has a barbell distribution of upper class and lower class establishments?
some1else 15 hours ago 1 reply      
Peckham might be considered up and coming, but it doesn't look like much :-S


dec0dedab0de 11 hours ago 0 replies      
This was kind of interesting, I like the idea of using data like this, you could probably get the same results with yoga studios and cash4gold's.

That being said I want coffee and fried chicken now.

Shivetya 14 hours ago 3 replies      
So besides "chicken shops" which must be a regional indicator for poor or undesirable what other establishments are also present? There should be an obvious transition type of business that precedes the coffee shop. Combined with chicken shops perhaps the availability of different businesses could give entrepreneurs an indicator where to set up similar or buy up space?
swalsh 14 hours ago 0 replies      
As a lover of fried chicken, i'm not sure this is the right metric to use for my housing search.
analyst74 11 hours ago 1 reply      
He seems to be measuring current housing prices, but if his goal is to measure "up and coming" areas, shouldn't he try to correlate the ratio of fried chicken to coffee shops in PREVIOUS years with the change in prices in the following years?
thinksocrates 11 hours ago 0 replies      
This would not be an effective method for the Southern USA. Nashville, which is most certainly up and coming, is also springing up "Hot Chicken" shacks left and right in the hippest areas.
sotoer 14 hours ago 0 replies      
Another "up and coming"-ness indicator is the abundance of white guys walking around the neighborhood wearing small shorts.
onion2k 17 hours ago 2 replies      
You put a fried chicken shop in a place where lots of people go while you put a coffee shop in places where there are few coffee shops. I'm sure that correlates with 'up and coming', but it's not necessarily a signature. Some up and coming places will have neither.
WorldMaker 8 hours ago 0 replies      
Clearly this analysis would be a flawed approach to take for the American South. Especially given that right now "hot chicken" places are a hipster fad for the "up and coming" neighborhoods.
chishaku 9 hours ago 0 replies      
I was thinking about yoga studios as a proxy for this type of analysis the other day.
batuhanicoz 10 hours ago 0 replies      
In Istanbul, this also holds true. But you would need to switch "fried chicken" to "tavuk döner" (chicken döner) or to kebab places.

Looking for places that are gonna be "elite"? Switch coffee shops with third-wave coffee shops. At least in Istanbul.

timwaagh 15 hours ago 1 reply      
The theory is worth very little. You could identify two other random density factors (like cigarette butts and Chelsea fans) and come up with some heatmap to identify the best-value houses. Only time-series data could give some indication, and then you would still have to test it (and then the market would price in your findings shortly after they are published; this kind of arbitrage rarely lasts long).
aembleton 17 hours ago 1 reply      
Where did you get the data on coffee shops and chicken shops from?
EMRo 8 hours ago 0 replies      
I would love to build/use an app that generates RE investment market suggestions based on some of the datapoints you're mentioning. Maybe see if there's some way to automate the analysis of socioeconomic status of the neighborhood with the distribution of various types of businesses and then map price/price delta over time. Might be some cool data in there. Re racism comments...kinda.
gcb0 8 hours ago 0 replies      
1. buy cheap houses

2. buy fried chicken places and convert them to coffee shops

3. ???

4. profit!

sarreph 16 hours ago 1 reply      
I don't necessarily agree with the premise of this article, as many other commenters point out.

To offer an alternative, my own theory is that the 'up and coming' areas are cropping up down the Shoreditch fringe, i.e. Borough (which is seeing a lot of commercial and residential development) and Elephant & Castle (same as Borough, albeit further behind in completion). Such a 'fringe' also spills off into the East, too.

You could extrapolate this trend to Peckham, one of the primary areas the author has highlighted; however, I doubt we're going to get anywhere near the same level of 'pop-up' commerce/entertainment in these much more southerly areas for some time to come.

throwaway049 15 hours ago 1 reply      
This analysis is too broad-brush. Although London has richer and poorer neighborhoods, it is common for luxury property to be right across the street from much cheaper property.
MisterBastahrd 8 hours ago 0 replies      
Poor community: bail bonds, pawn shop, payday loans, dollar store, cash checking

Affluent community: party planning, wine shop, dessert specialty shop, boutique clothing, european car dealership

mattlutze 16 hours ago 1 reply      
My office computer is blocking Medium for some reason, but surely the choice of a fried chicken joint must be localized?

There are a few regions of the US I've lived in where fried chicken isn't really a thing, in general. I'm not sure it would make sense to extend the model to those locations, at least.

melgibo12 9 hours ago 1 reply      
1. This is not even wrong.

2. Fried chicken and coffee shops seem like a proxy for class and race in London and were deliberately chosen by the author.


scottlilly 14 hours ago 0 replies      
If you want to dig deeper into this, I suggest reading "The Clustering of America", by Michael J. Weiss.

I think some of the key indicators it used were number of bowling alleys, liquor stores, and payphones (keep in mind, it was published in 1989).

softyeti 7 hours ago 0 replies      
I would also look at the opening dates of those businesses, if available, for better trending.
branchless 12 hours ago 0 replies      
Knew this would be the UK before I even clicked on it from the phrase.

Choose carefully, as your housing "investment" will mean as much to your life outcome as your ability to program.

sabujp 9 hours ago 0 replies      
In the US it's school testing scores
agounaris 17 hours ago 2 replies      
Inspiring assumption... I don't think Peckham is exactly your dream area :)
dbattaglia 14 hours ago 1 reply      
Would be interesting to see this done for Brooklyn, which has a lot of "up-and-coming" areas and a plethora of fried chicken and coffee shops.
PeterStuer 14 hours ago 0 replies      
You might also look at where the city plans to create new pedestrian areas and bicycle lanes. Absence of car traffic makes real-estate prices soar.
rblstr 16 hours ago 0 replies      
Wow. Our 'up and coming' excuse for living in South London is actually turning out to be true.
artur_makly 14 hours ago 0 replies      
Back in the early '90s in NYC, it was when you started seeing French bars.
SixSigma 16 hours ago 0 replies      
I would say that "coffee" = daytime activity and "chicken" = night time activity.

What a data scientist would have done is find the list of shops and property prices and see which correlate.

Of course, you also need to do it over time because "up and coming" implies the future state, not the current.

1. Buy cheap housing in area that attracts new grads / creatives / artists

2. Those people attract certain business types

3. Hot area attracts richer people + people in 1 get more money

4. Property prices rise

5. Sell
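The "do it over time" point can be sketched too. With an entirely made-up panel (shop counts at the start of a period, house-price growth over the following years), screen each shop category for whether its presence leads later price growth:

```python
# Hypothetical panel: per area, shop counts at the start of a period and
# house-price growth over the following years.
areas = {
    "A": {"coffee": 2, "chicken": 9, "growth": 0.15},
    "B": {"coffee": 7, "chicken": 4, "growth": 0.60},
    "C": {"coffee": 5, "chicken": 6, "growth": 0.40},
    "D": {"coffee": 1, "chicken": 10, "growth": 0.10},
    "E": {"coffee": 8, "chicken": 2, "growth": 0.75},
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

growth = [a["growth"] for a in areas.values()]

# Which category's presence today correlates with subsequent price growth?
signals = {kind: pearson([a[kind] for a in areas.values()], growth)
           for kind in ("coffee", "chicken")}
```

On this toy panel, coffee-shop counts correlate strongly positively with subsequent growth and chicken shops negatively. With real data you'd still need a holdout period to test it, as the comment says, and the arbitrage would close once published.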

logfromblammo 4 hours ago 0 replies      
In the U.S., a leading indicator of future development in suburban or rural areas is often the self-storage business. They are often built out in the middle of nowhere when land is cheap, and continue operating as other businesses are built nearby.

Many of them are owned by large REITs that are speculating on land values. The business pays the property taxes as the value of the land rises.

Chicken shops flipping to coffee shops may predict urban gentrification, but the self-storage business predicts cornfields turning into Wal*Marts.

squozzer 5 hours ago 1 reply      
So who exactly eats fried chicken in the UK?
perrywky 11 hours ago 0 replies      
Ironically, Peckham ranks among the most dangerous places in London: http://www.thetoptens.com/most-dangerous-places-london/
fowkswe 12 hours ago 1 reply      
If the assumption is that fried chicken places are a relic of a poorer past, then this metric does not work for New York City. Fried chicken is the food of the moment:


pinaceae 12 hours ago 0 replies      
hmm, this is local.

for cities in central europe like berlin, munich, vienna the indicators for poor neighborhoods are mobile-phone repairshops, internet cafés and kebab/döner/falafel places. immigrants from the south-east, from turkey to afghanistan shaping these areas.

as nicely illustrated in the current episodes of Homeland.

beachstartup 8 hours ago 0 replies      
in greater LA, fried chicken is actually a new trend in higher-end casual dining, because of the east asian influence on local cuisine (korean, japanese, taiwanese).

oftentimes it's served out of coffee shops. or tea houses. go figure.

ethbro 14 hours ago 7 replies      
> Poor on average take worse care of their health and fitness

Not intended to be snarky, as I grok the intent of your comment, but poor also don't typically have time to come home and go for an hour jog with frequency during the week.

Square files for IPO squareup.com
324 points by nikunjk  1 day ago   202 comments top 28
myth_buster 1 day ago 17 replies      

> We generated net losses of $85.2 million, $104.5 million, and $154.1 million in 2012, 2013, and 2014, respectively. As of December 31, 2014, we had an accumulated deficit of $395.6 million. For the six months ended June 30, 2015, we generated a net loss of $77.6 million. As of June 30, 2015, we had an accumulated deficit of $473.2 million.
Could someone explain the motive to go public while operating at a loss? Isn't the public market more averse to companies operating in the negative?

Also wouldn't the timing act more as a distraction for Jack?
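The quoted figures can be sanity-checked with a quick back-of-the-envelope calculation (all numbers as quoted from the filing):

```python
losses = {2012: 85.2, 2013: 104.5, 2014: 154.1}  # net loss per year, $M

# Year-over-year growth in the loss: roughly 23% in 2013 and 47% in 2014.
growth = {y: losses[y] / losses[y - 1] - 1 for y in (2013, 2014)}

# The $395.6M accumulated deficit at end-2014 implies about $51.8M of
# losses accumulated before 2012.
pre_2012 = 395.6 - sum(losses.values())
```

So the losses are not just continuing but accelerating, which is presumably what prompts the question.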

kzhahou 1 day ago 6 replies      
> All executive officers and directors as a group (13 persons) .... 61.5%

LOL 13 people in the company own more than half. The thousands of employees get to split what's left after investors.

13 people will become BILLIONAIRES and centi/multi-millionaires, while the rest of the company that toiled for years gets (maybe) a down payment on a 1800sqft house on the peninsula.

Meanwhile everyone rails against wall street inequity and the Walton family and whoever else...

tinkerrr 1 day ago 4 replies      
> In the third quarter of 2012, we signed an agreement to process credit and debit card payment transactions for all Starbucks-owned stores in the United States. The agreement was amended in August 2015 to eliminate the exclusivity provision in order to permit Starbucks to begin transitioning to another payment processor starting October 1, 2015. Under the amendment, Starbucks also agreed to pay increased processing rates to us for as long as they continue to process transactions with us. We anticipate that Starbucks will transition to another payment processor and will cease using our payment processing services prior to the scheduled expiration of the agreement in the third quarter of 2016, and, in any event, we do not intend to renew it when it expires.

In addition to the $150 million loss in 2014, looks like the revenue side doesn't look too good either. From their operating data,

Total revenue = $707.8 million

Starbucks revenue = $123 million

So they would likely lose > 17% revenue very soon.
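The 17% figure checks out from the two numbers quoted:

```python
total_revenue = 707.8      # $M, total revenue as quoted
starbucks_revenue = 123.0  # $M, Starbucks revenue as quoted

# Starbucks' share of total revenue, a bit over 17%.
starbucks_share = starbucks_revenue / total_revenue
```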

uptown 1 day ago 1 reply      
"I believe so much in the potential of this company to drive positive impact in my lifetime that over the past two years I have given over 15 million shares, or 20% of my own equity, back to both Square and the Start Small Foundation, a new organization I created to meaningfully invest in the folks who inspire us: artists, musicians, and local businesses, with a special focus on underserved communities around the world. The shares being made available for the directed share program in this offering are being sold by the Start Small Foundation, giving Square customers the ability to buy equity to support the Foundation. I have also committed to give 40 million more of my shares, an additional 10% of the company, to invest in this cause. Id rather have a smaller part of something big than a bigger part of something small.

We intend to make this big! Thank you for your support and potential investment in Square.


pnathan 1 day ago 2 replies      
Square is incredibly common these days at farmers markets and coffeeshops. This class of products has a lot of win going for it; I don't know if Square is going to own that market, but they have a huge advantage right now. I look forward to reading their financials.

edit: a roundup from the WSJ, Fortune, and other random news places suggests strong concern over the CEO situation. That seems fair. I would definitely discount the value of a non-profitable company with a part-time CEO. :-/

That said, having a "stupid simple and ubiquitous payment platform" should be a really easy way to print money. Why hasn't it? I think that story is the interesting one.

jasondc 1 day ago 0 replies      
Jack Dorsey still owns close to 25% of Square, pretty incredible:


lquist 1 day ago 4 replies      
Fuck. Seriously. Fuck.

This could be the IPO that ends the party. One of the top candidates for first Unicorpse (see also Evernote).

iamleppert 1 day ago 2 replies      
Look at those numbers! And for those who say that Square is innovative... their reader was basically all they had back in the day. Now that everyone has created a Square clone, payment processing is a race to the bottom, all about brokering deals with large merchants and hoping that the banks and processing networks don't eat your lunch.

Throw in all the regulatory hassle of dealing with money and its transfer, as well as liability for fraud, and competing in an industry as exciting as refrigerators...

kochb 1 day ago 0 replies      
misiti3780 1 day ago 3 replies      
So apparently Square and Box have both raised about the same amount of money, roughly $550-600MM [1][2], but somehow Jack Dorsey still holds 25% and Aaron Levie wound up with ~ 4% - Does anyone else find that surprising?



gsibble 1 day ago 2 replies      
Their pace of losses is increasing at about 50% per year. Most companies that file to go public are at least losing less money over time, with a path to profitability. I don't see how this bodes well for a strong IPO.
austenallred 1 day ago 3 replies      
How many people have simultaneously been CEO of two separate public companies? The list has to be pretty small. No wonder Twitter waited so long to make Jack the official CEO.
gmisra 1 day ago 1 reply      
Does anybody else get the feeling that the primary goal of large scale venture capital these days is to pump-and-dump IPOs, regardless of the actual stability/validity of the underlying business?
rpedela 1 day ago 0 replies      
I like Square's product but the growing net losses concern me.
qopp 1 day ago 3 replies      
Isn't this good for startups in general?

Every time a startup IPOs the investors now have a chance to exit and re-invest in other startups with the fresh capital.

dsugarman 1 day ago 2 replies      
doesn't the roadshow require an enormous amount of time for the CEO? Can you really do this and anything else?
spike021 1 day ago 3 replies      
Would it make more sense for a larger company to buy Square before it goes public? I thought I'd heard people saying Apple or Google should but I'm not sure how much benefit there would be since they have payment systems now.
sjg007 1 day ago 0 replies      
The money in credit card processing is in high interest loans fronted to the merchant that are paid back as a percent of the transaction. SquareUp for instance.
thadjo 20 hours ago 0 replies      
What on earth did Lawrence Summers do for his +1M shares? HT Felix Salmon.
jgalt212 1 day ago 0 replies      
It is really disheartening that a company with such horrible numbers thinks they can go public. It's one thing to lose money before going public, and it's entirely another thing to lose money at increasing rates of speed and think you can find an audience for those shares.
hamburglar 18 hours ago 0 replies      
Yow, from their filing: "Transaction and advance losses for the six months ended June 30, 2015, increased by $13.9 million. We incurred a charge of approximately $5.7 million related to a fraud loss from a single seller in March 2015."

I bet March 2015 was not a fun month at Square.

rokhayakebe 1 day ago 0 replies      
Jack, the Roger Bannister of his field.
goodcjw2 1 day ago 0 replies      
does this come actually faster than expected? will jack actually leave?
marincounty 1 day ago 1 reply      
Does anyone know, off hand, if Square has any important patents? I did a little searching, but couldn't find much.

I do like the company. Just curious if they have any patents that will prevent competition?

curiousjorge 1 day ago 0 replies      
November 2016: Square files for Bankruptcy. Do the simple math, they lose money year after year in bigger amounts.

This will buy Square some time while it shops around for a buyer but that's assuming the capital market is still liquid and happy and there's no market downturn.

harryh 1 day ago 0 replies      
You talk about a super high PE and the fact that they have negative earnings in the same sentence. Do you know what these words mean?
urda 1 day ago 1 reply      
Do you have an actual comment regarding the IPO or a point to make? What you have posted appears to be nothing but noise.
manchco 1 day ago 0 replies      
I don't know how he does it. http://imgur.com/QfNiUz3
WebKit removes the 350ms click delay for iOS webkit.org
302 points by asyncwords  1 day ago   135 comments top 15
ksenzee 1 day ago 7 replies      
The change applies only to unscalable viewports. That's a shame, because it means some developers will disable pinch-to-zoom to get a faster click response. That makes this yet another unfortunate conflict between usability and accessibility. The older I get, the more I appreciate being able to zoom (I'm viewing this page at 125% on desktop right now).
untog 1 day ago 7 replies      
While I do sympathise with those lamenting the lack of pinch-to-zoom, I'm confused: apps don't offer pinch-to-zoom, so how do you use them? If you can use an app fine without pinch-to-zoom, you should really be able to use a mobile website fine too.

It seems to me that this is an either/or proposition: either you have a not-mobile, pinch-to-zoom-able web site, or you have a mobile-specific site with an app-like viewport that does not allow pinch to zoom. Both of these seem like fine options to me, and I don't think it's a huge loss to lose the middle ground.

jordanlev 1 day ago 1 reply      
Ugh... This change is of course totally logical in isolation, but I fear that this will motivate designers and developers to disable pinch-zooming on their sites (more than they already are). I hate when websites do this, and it is generally considered terrible for accessibility.
zkhalique 22 hours ago 1 reply      
In our framework, we have for a very long time had a Q.Pointer class which contained functionality to normalize things between touchscreens and non-touchscreens. Among other things, it had the "fastclick" event: https://github.com/Qbix/Platform/blob/master/platform/plugin...

There is far more to it than simply relying on a "click" on touchscreens. For example, the "touchclick" event is for those times when the keyboard disappears because focus has been lost in a textbox, but the click will still succeed: https://github.com/Qbix/Platform/blob/master/platform/plugin...

Also, drag-and-drop is broken in touchscreen WebKit so you have to roll your own, and much more.

You're better off using a library.

paulvs 1 day ago 4 replies      
As I see it, the 350ms delay was added to support zooming via double-tap. What I don't understand is why double-tap zooming is necessary when we have pinch-to-zoom. Can't zoom via double-tap be sacrificed for instant clicks so everyone is happy?
jamesrom 1 day ago 2 replies      
A lot of commenters here are afraid of developers disabling user scaling to get better performance. That incorrectly assumes user scaling is a good thing for every kind of website.

If a 350ms click delay is actually a performance bottleneck on the app you are building, it's very likely user scaling is something you want disabled anyway.

RoboTeddy 1 day ago 1 reply      
How long until this makes it into the Mobile Safari on most people's iOS devices?
escherize 1 day ago 4 replies      
I really don't understand the lamentation around pinch-to-zoom. There's a fantastic OS-level zoom built into iOS! Set it up and three-finger-tap to activate. And it works great.
nailer 1 day ago 1 reply      

Typing this on an iOS 9 device and I, as a human, cannot 'fast tap' enough for iOS to register a 'fast tap' and not delay. Try it here: http://output.jsbin.com/xiculayadu

dkonofalski 1 day ago 1 reply      
What's the intended function of the previous functionality? Didn't double-tapping zoom in and out to a specific section? What problem does the delay solve that isn't present on unscalable viewports?
mozumder 1 day ago 0 replies      
Any idea when this makes it into an iOS release? Does Apple usually implement this in point releases? Or do we wait until iOS10 next year?
kristianp 1 day ago 2 replies      
Can someone explain what this means for the non-iOS developers amongst us?
outside1234 1 day ago 3 replies      
Remind me: They originally had the 350ms delay in there to distinguish between a tap and a pinch, correct?
joeyspn 1 day ago 1 reply      
Good news for hybrid app devs...
fogisland 20 hours ago 1 reply      
Will removing this 350ms delay really make users feel a faster response?
Judge: NYC Seizing Thousands of Cars Without Warrants Is Unconstitutional amny.com
278 points by bane  2 days ago   101 comments top 17
zaroth 1 day ago 1 reply      
I agree 100% this is a perfect example of where we the people rely entirely on the judiciary to provide a remedy. That such an obviously illegal practice could continue for years unfortunately does not reflect well on any thoughts of swift justice.

It should be possible to get a temporary restraining order against the city in cases like this within days of the first contested case. It should be easy to demonstrate there is no imminent harm of telling the city, you have to stop doing this until we decide it's OK or not, and quite the opposite, cars are an essential and significant asset, and this policy placed a potentially massive burden on the citizens it effected.

In one of the examples, by the time the victim prevailed against the illegal seizure backed by zero evidence or investigation of any kind, they had already sold off his car, and offered nothing in return. A pretty large part of the population doesn't have a spare $2,000 in cash to get their own car back while the city makes them prove in front of a Kangaroo Court that they were driving their own family to the airport... Missing from the article -- is there any hope of any kind of restitution? Can the victims now pursue a civil case against the city?

mapt 1 day ago 2 replies      
You: The city is stealing my car without probable cause in an attempt to extort money from me.

City DA: No they're not.

What's your recourse here? Call the FBI or federal prosecutor and report an organized crime syndicate being run by corrupt law enforcement professionals? Because... isn't that what this is?

Is there any onus, or even incentive, for them to listen and investigate? Is the only way to redress the problems a civil lawsuit against the City citing Bivens and various appellate court principles like malicious prosecution? Because grand theft auto, extortion, racketeering, and fabrication of evidence / perjury are not civil offenses, and conservative readings of the concept of 'standing', as I understand it, make it rather difficult to challenge the authors of a failed / withdrawn prosecution in order to get at the legal principles which triggered it.

Concepts like this one, as well as things like civil asset forfeiture, are so clearly in direct violation of the Constitution that at some point, it's not legitimate to shelter enforcers under cover of "just following orders". We still have laws (Constitutional and common), and Peabody, Minnesota doesn't have the right to do things like put all the gay residents to death by legislative fiat & judicial compliance; If you found this occurring, you wouldn't need to file a lawsuit alleging that a constitutional overreach has been committed and demanding merely that the policy cease to be in effect. Instead, you would get some overriding authority, like the state police or the FBI, to run in with SWAT teams and arrest and prosecute every last person peripherally attached to the Peabody legislature or judiciary or law enforcement. For murder.

No amount of 'adopting selective prosecution based on what we can win, since the courts recognized a valid affirmative defence' or 'changing training programs to be more in line with civil rights' or 'firing/reprimanding the officers involved and settling a civil suit' makes killing the gay population of Peabody less of a crime, and no amount of lawsuit would be required to get that recognized.

maehwasu 1 day ago 0 replies      
And once again, the nice thing about living in not America is that bribes are significantly cheaper.
grecy 1 day ago 1 reply      
>Probable cause is not a talismanic phrase that can be waved like a wand to justify the seizure of any property without a warrant

Does that apply to civil forfeiture as well? Sounds like it should.

aswanson 1 day ago 2 replies      
Why is the regular news reading more and more like my Onion RSS feed? I have a feeling things were always this absurd, if not more so, but the idiocy gets amplified now by the channels being so connected.
thoman23 1 day ago 0 replies      
So the government should not arbitrarily seize property from its own citizenry? I'm sure they will take that under advisement.
peeters 1 day ago 0 replies      
I think the most interesting, or scary, part of all of this is the justification for this warrantless search and seizure: to stop Uber from operating in the city. Usually the government has to invoke public safety to try to justify removing individual rights. Now they can just invoke the taxi lobby I guess.
avoutthere 1 day ago 3 replies      
Wow, how was this ever legal to begin with?
dandare 1 day ago 2 replies      
This is one of the things that fascinates me about the US. Such blatant injustice would be unthinkable in Europe.
dannysu 1 day ago 2 replies      
I was getting redirected to http://www.forbes.com/forbes/welcome/ if I clicked the link on HN.

If I copy & paste the link into a new tab, it works for me.

dools 1 day ago 0 replies      
Just another example of how prohibiting human behaviour instead of regulating leads to over zealous police and undue burden on law abiding citizens.

They should take a page out of London's book and allow minicabs to operate.

AdmiralAsshat 1 day ago 1 reply      
Is the lack of any page displaying with adblocking turned on intentional, or is it simply bad design?
timtas 1 day ago 0 replies      
Yet another reason why I've stopped using plural pronouns to refer to the state at any level.
pbreit 2 days ago 2 replies      
Is this an Uber thing? I didn't see it mentioned.
briandear 1 day ago 3 replies      
Has anyone ever died because of an unlicensed limo? Is it really a threat to public safety? If consenting adults agree to a transaction, I am not sure how that's the government's business. However, if an unlicensed vehicle was portraying itself as a licensed vehicle, then you have a fraud issue, not a public safety one.
bsder 2 days ago 4 replies      
What's the deal with all the Forbes links redirecting to welcome? How do I stop this?

I tried checking the "Warn me when websites try to redirect or reload the page" box in Firefox, but it doesn't appear to be stopping it.

Presumably too many people are starting to use things like "Google Sent Me".

c22 1 day ago 3 replies      
This is a stupid grammar nitpick I only offer in the hopes that you find it useful for your writing. No dismissal of your arguments or denigration of your character is intended.

I think you want the word "affected" in your second paragraph. Effects are the result of causation, whereas affect refers to the causation. The citizens were affected by the effects of this policy.

Mattermost 1.0 released open-source Slack alternative mattermost.org
327 points by shuoli84  1 day ago   136 comments top 30
finnn 1 day ago 6 replies      
http://getkaiwa.com/ is another Slack alternative that uses an XMPP backend, which IMO is much better than a custom backend. So far it's the only open source Slack clone I've seen that uses an existing standard for the backend.
SEJeff 1 day ago 1 reply      
This is great. Also see: https://zulip.org

And the blog on why Dropbox decided to OSS it:


jgrowl 1 day ago 2 replies      
Props for open-sourcing, but I'm putting my money on http://matrix.org/
cdnsteve 1 day ago 2 replies      
It's good to have options.

The takeaway I'm getting from this story, and Mattermost, is:

1. Export your critical data from SaaS services if your business cannot exist without them.
2. Test that this works before putting years of data into a service.

There's nothing wrong with SaaS services; they just mean users must do their due diligence in any business partnership. I can't see how a game company can put its resources into delivering this as an open source project with no future plans for monetization. Frankly, without monetization, open source projects generally wither up and disappear. Then you're no further ahead.


lorenzhs 1 day ago 1 reply      
If what you really want is a pretty web-based way to access IRC, then you might want to check out Glowing Bear -- it connects to your WeeChat IRC client via websockets and works nicely on the Desktop and on mobile. It doesn't have a voice recorder, but it gives you the infinite possibilities of a mature IRC client. It's a project I've been contributing to for a while now and I still absolutely love using it.


ju-st 1 day ago 5 replies      
Why do I have to go to slack.com to learn what this is?

I clicked on this link, no explanation. I checked mattermost.org, no explanation. I went to slack.com, no explanation. Then I clicked on a "Product" link on top of the webpage. Finally some information what this actually is.

Even open source projects could benefit from a little bit of marketing.

pbreit 1 day ago 6 replies      
There was a time when I thought something like this was a good idea. But after using Slack for about a week, there's no way I would give up all the benefits of a well-integrated centrally controlled service. All the clients work together perfectly. We have Slack channels with customers. It just all works much better than I can imagine any self-hosted, decentralized service would.
netcraft 1 day ago 1 reply      
I think Slack serves its stated purpose very well (smaller, business oriented teams), but many groups have started using it for larger communities, mostly because it has unlimited users for free. But it isn't made for that and there is no way that most of these groups would ever be able to pay for a premium subscription due to the per-user costs. The 10K-message limit across all channels is surprisingly easy to hit, you need the ability to ignore users, etc. I think this project has great potential to fill that niche if it is marketed properly. Slack is so close to working well in that area but really needs to pivot to be able to serve it well and make money doing it.
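To put rough numbers on how quickly a community blows through the free-tier cap (numbers invented for illustration):

```python
# Back-of-the-envelope: how quickly does a free Slack team's 10,000-message
# searchable history roll over? (All figures are invented for illustration.)
active_members = 40
messages_per_member_per_day = 25

daily_messages = active_members * messages_per_member_per_day
days_of_history = 10_000 // daily_messages

print(daily_messages, days_of_history)  # 1000 10
```

At that rate, anything older than about a week and a half is gone from search.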
kentt 1 day ago 2 replies      
I'm trying to decide if this is better than Zulip. They're both open source, backed by someone trusted, and I can run it on my own server.
DannoHung 1 day ago 0 replies      
Interesting that it will be a default feature of Gitlab.

That's a move that seems like it may push Gitlab ahead of GitHub in some ways (well, to me at least).

mugsie 1 day ago 2 replies      
Well.... this is depressing -

Mattermost server is made available under two separate licensing options:

- Free Software Foundation's GNU AGPL v.3.0, subject to the exceptions outlined in this policy; or
- Commercial licenses available from Mattermost, Inc. by contacting commercial@mattermost.com

"To simplify licensing, we've responded to community feedback and the compiled version of Mattermost v1.0 is now under the MIT open source license" (Emphasis mine)

Why just the compiled version?

giovannibonetti 1 day ago 0 replies      
Since we are talking about open source software, maybe the guys that own the Mattermost account on Github could create a placeholder repo for Android (I wonder if this idea would work for iPhone, too) and accept Pull Requests until there is at least a beta native app.
ywecur 1 day ago 0 replies      
Would be happy to move over to an open source alternative, but at the moment they don't seem to support mobile apps.

It would be very difficult for us to move because of this; we talk a lot on the move.

bachmeier 1 day ago 0 replies      
Doesn't this have some heavy hardware requirements? Three machines with at least 2 GB of RAM? Is that really necessary if I'm going to chat with five people?
djmashko2 1 day ago 0 replies      
I wonder how this compares to Rocket.Chat, another open-source alternative: https://rocket.chat/
e12e 1 day ago 1 reply      
This looks very nice. Are there any plans for an API/client protocol? A web client is all well and good, but I'd want a solid console client, as well as some command-line tools (eg: echo "Some message" | xmpp user@host; the Mattermost equivalent would let you set the topic, message a group via a bot, etc.).
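If/when Mattermost grows a Slack-style incoming webhook, the tool described above is tiny. A sketch (the endpoint and payload shape are assumptions on my part, not a documented Mattermost 1.0 API):

```python
import json
import urllib.request


def build_payload(text, channel=None):
    """Build the JSON body for a Slack-style incoming webhook.
    (The field names are an assumption, not documented Mattermost 1.0 API.)"""
    payload = {"text": text}
    if channel is not None:
        payload["channel"] = channel
    return payload


def post_message(webhook_url, text, channel=None):
    """POST the message to a (hypothetical) incoming-webhook URL."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_payload(text, channel)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Pipe-friendly usage would then be a two-line main reading sys.stdin, i.e. echo "Some message" | send.py https://chat.example.com/hooks/XXX.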
rpedela 1 day ago 3 replies      
Overall I like it because they closely follow Slack's UI. However, I question the choice of fully supporting Markdown. A comment isn't supposed to be documentation. Supporting things like bold and italic makes sense for emphasis or making code easier to read. But headings? When would one ever want really large text in a comment?
jeffjose 1 day ago 1 reply      
In my past job, I was desperately looking for an open source Slack alternative. The ones I tested then (a few months back) didn't hold up nicely against Slack. I'm happy to see that finally there's some good competition.
fsiefken 1 day ago 0 replies      
If it supports SSL XMPP it can be a drop-in replacement for a lot of companies.
lucaspottersky 1 day ago 0 replies      
Feature idea: a canvas where people could draw instead of typing text.
BHSPitMonkey 1 day ago 0 replies      
The blog post is dated October 2nd; Is HN just learning of this announcement late, or is their blog displaying the draft date rather than the publication date?
ChicagoDave 1 day ago 2 replies      
I've spent a couple of hours trying to get Docker running on my linode Ubuntu server to no avail.

A non-docker implementation would be nice.

pionar 1 day ago 1 reply      
So, what does this offer over Slack, besides being open-source? I see no mention of any actual features, besides basic chat features.
yannis 1 day ago 1 reply      
Besides being an excellent application, this is a valuable resource for anyone studying Golang.
artribou 1 day ago 0 replies      
Does anyone know who the old provider was that locked in their data?
pwenzel 1 day ago 0 replies      
Can it send push notifications or other alerts to my phone?
jhildings 1 day ago 7 replies      
Why not just use IRC ?
mholt 1 day ago 1 reply      
Congratulations, Mattermost team! Huge accomplishment :)
api 1 day ago 0 replies      
There are many OSS alternatives to Slack. Some are clones and some are different approaches and many of them are quite good.

The thing these and all other similar efforts miss is the importance of network effects. Everyone uses Slack because everyone uses Slack.

The real problem that needs to be tackled is one layer down: providing an open, distributed alternative for authentication, identity management, and data interchange that is secure and robust enough to provide a backplane for things like this and that is easy enough for anyone to use that it can be pushed out to the mass market. I can't stress the last point enough. It must be stupid-simple to use or it will fail. It also must offer a good and simple developer experience (DX) or it will fail. DX is part of UX. Things like XMPP are nightmares for devs and sysadmins and fail badly here.

This is a huge missing piece of the web.

copsarebastards 1 day ago 0 replies      
Just what we need, another solution to a problem that was solved two decades ago!
Default Alive or Default Dead? paulgraham.com
304 points by iamwil  1 day ago   113 comments top 23
JshWright 1 day ago 6 replies      
In the world of high angle rope rescue, we have a concept called the "whistle test". The idea is that if someone were to randomly blow a whistle at _any_ point, and everyone let go of whatever they were holding, that no one would be dropped.

It takes a lot of thought and planning to make sure you're 'default alive' in all circumstances. It slows you down, and it requires you to think through the implications of every decision, big and small.

This sounds like a fairly similar notion...

jacquesm 1 day ago 1 reply      
> The startling thing is how often the founders themselves don't know. Half the founders I talk to don't know whether they're default alive or default dead.

This accurately reflects my own experience. Most common pitfall: converting VC capital to users at a rate that will not sustain the company once the VC capital runs out. So many companies fall into this particular trap that it should have a name of its own.

Bought growth is only worth it if the users remain long enough to make back the money you pumped into them at the time of acquisition in net profits otherwise you might as well do without them.

I'm not sure the reference to Airbnb helps; whatever they did, they're an outlier, and simply doing what they did without carefully evaluating your reasons is going to work about as well as any other cargo-cult strategy for success. It would be (a lot) more useful to see this point expressed in another form: YC-funded start-ups grouped into cohorts by how many months in they started hiring beyond the founders, compared against their survival rates.
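The trap is easy to state as an inequality, using the standard LTV ~ monthly profit / churn approximation (a sketch with invented numbers):

```python
def bought_growth_pays_off(cac, monthly_profit_per_user, monthly_churn):
    """A bought user is only worth it if their expected lifetime net profit
    exceeds the acquisition cost (CAC).

    With a constant monthly churn rate c, expected lifetime is 1/c months,
    so lifetime value ~= monthly_profit_per_user / monthly_churn.
    """
    ltv = monthly_profit_per_user / monthly_churn
    return ltv > cac


# $10/month net profit, 5% monthly churn => LTV of $200 per user.
print(bought_growth_pays_off(cac=100, monthly_profit_per_user=10, monthly_churn=0.05))  # True
print(bought_growth_pays_off(cac=300, monthly_profit_per_user=10, monthly_churn=0.05))  # False
```

Any growth bought at a CAC above that LTV line is VC capital converted into users who never pay it back.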

numlocked 1 day ago 1 reply      
It's amazing to me how deeply ingrained software profit margins are into the start-up world. That calculator...we're an ecommerce company that holds inventory...I spent 30 seconds searching for how to set gross margins on the revenue then realized it assumes all revenue is 100% gross margin. In most businesses (read: anything other than software and maybe pharma), manipulating margin is one of the biggest levers (maybe THE biggest) you have to affect profitability.

Not to mention other big levers like working capital (and potentially running a business with negative WC and generating cash, a la Amazon). It's funny to be running a start-up in SF and still feel a world apart from a lot of the ecosystem.
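For what it's worth, the loop behind a calculator like that is short enough to add the missing knob yourself. A rough sketch (my own reimplementation with a gross_margin parameter, not the actual calculator's code):

```python
def default_alive(cash, revenue, expenses, monthly_growth, gross_margin=1.0,
                  horizon_months=600):
    """Return True if the company reaches profitability before the cash
    runs out ('default alive'), False otherwise ('default dead').

    revenue and expenses are monthly; monthly_growth is e.g. 0.10 for 10%;
    gross_margin scales revenue down for non-software businesses.
    """
    for _ in range(horizon_months):
        profit = revenue * gross_margin - expenses
        if profit >= 0:
            return True               # profitable before the money ran out
        cash += profit                # burn this month's loss
        if cash < 0:
            return False              # the fatal pinch
        revenue *= 1 + monthly_growth
    return False                      # growth too slow to ever get there


# 100% margin SaaS: alive. Same numbers at 40% gross margin: dead.
print(default_alive(500_000, 10_000, 40_000, 0.10))                    # True
print(default_alive(500_000, 10_000, 40_000, 0.10, gross_margin=0.4))  # False
```

The margin knob matters exactly as much as you'd expect: identical cash, revenue, and growth flip from alive to dead once cost of goods eats 60% of each dollar.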

paulsutter 1 day ago 3 replies      
> In practice there is surprisingly little connection between how much a startup spends and how fast it grows. When a startup grows fast it's usually because the product hits a nerve, in the sense of hitting some big need straight on.

Perhaps the most important underlying point in the article.

It's easy to think that more people will make the company grow faster. Adding people actually makes it harder to tune a product's direction (and thus growth rate). Great to see another dense and on-point post from pg. Every sentence is worth several reads.

analog31 1 day ago 3 replies      
My inability to comprehend the idea that 8 or 9 months is "old" is probably the surest sign that I'm the one who's old.
rsp1984 13 hours ago 0 replies      
There is a lot of truth in this essay but readers should know how to interpret it. When PG assumes stable revenue growth for "the last several months", even just revenue at all, for a startup that's been operating for not even a year, that already narrows down the set of startups for which his advice is applicable quite a bit.

What we're essentially talking about then is a specific breed of startup (a popular breed, but still a specific breed):

- Pure software / probably SaaS or a website.
- Margin close to 100%.
- Essentially no R&D component, i.e. just wiring together existing technologies.
- Perfect product distribution infrastructure, i.e. internet or mobile app company.
What readers should know, however, is that outside of this subset of startups there is a wide variety of flowers in the startup ecosystem that can all bloom in different ways and at different times.

And that early growth isn't the only metric that makes a startup attractive. Growth is a powerful one that eventually will be more important than everything else, but there are other things -- such as strong core IP or having an early product in a small but growing market -- that make startups interesting.

Then there are startups that require literally tens of millions of dollars of up-front investment and years of R&D before there even is a product. Are those all doomed? I guess not. Given the right execution and strategy and patient investors these companies can also be extraordinarily valuable. They are just a different kind, though.

Animats 22 hours ago 0 replies      
For the first dot-com boom, I did Downside's Deathwatch[1], which did exactly that for public companies. (Companies IPOed earlier in that boom, often before profitability.) For a public company, SEC filings give anyone enough info to make that calculation.

For a private company, it's much harder to tell from the outside. Any CEO who doesn't know how many months (days?) of cash they have left is hopeless.

[1] http://www.downside.com/deathwatch.html

Multiplayer 1 day ago 0 replies      
I love that calculator. Picture is worth 1,000,000 words here.
jaytaylor 1 day ago 2 replies      
Why is there a shopping cart in the upper right hand corner of his website? What is available for purchase from Paul?
kra34 1 day ago 3 replies      
It's interesting to see the tone change from Paul Graham and Sam Altman in the last couple of months; it's almost like somebody finally bought them a calculator.
AndrewKemendo 1 day ago 0 replies      
I feel like having the mentality that you are always default dead is where your head should be as a founder.

That's how I run my company. Complacency kills, and prevents you from being proactive in an ever-changing market.

DrNuke 17 hours ago 0 replies      
Bootstrapping attitude to the rescue: the more you do with little money, the more you can do when money is raised, and for longer.
debacle 12 hours ago 1 reply      
If Twitter had heeded this advice, it might not be in the bind it's in right now. Four thousand employees! At Twitter! That's an order of magnitude more than they need.
copsarebastards 21 hours ago 0 replies      
> Say "We're default dead, but we're counting on investors to save us." Maybe as you say that it will set off the same alarms in your head that it does in mine. And if you set off the alarms sufficiently early, you may be able to avoid the fatal pinch.

To make this alarm explicit: if you were that investor, would you save the company? I wouldn't.

caf 23 hours ago 0 replies      
But as a founder your incentives are different. You want above all to survive.

This is a bit like in the early stages of a poker tournament, where you might fold even quite strong starting hands to all-in bets where your expected value is positive - because you're not just betting the number of chips in your stack, you're betting the entire remainder of your tournament.

vasilipupkin 23 hours ago 0 replies      
Aren't most startups, except for a very few superstar ones, default dead by definition, maybe until the B round?
andrewstuart 16 hours ago 0 replies      
This is effectively saying that these businesses don't have cashflow projections. Business 101 - should be taught by whoever the investors are that are "adding value".
7Figures2Commas 1 day ago 1 reply      
> Instead you'll be compelled to seek growth in other ways. For example, by doing things that don't scale, or by redesigning the product in the way only founders can. And for many if not most startups, these paths to growth will be the ones that actually work.

Or you could reconsider the size of your total addressable market (hint: it's probably a lot smaller than what's in your pitch deck) and give weight to building a smaller company that's sustainably profitable.

Note that I'm not suggesting growth isn't important. What I am suggesting is that a lot of founders seek "Silicon Valley growth" without considering the possibility that they have an opportunity to build a lasting business that doesn't need hundreds of employees, tens of millions of dollars in funding, hundreds of millions in revenue and billions in enterprise value to succeed.

PhilipA 16 hours ago 0 replies      
It feels like this post is also addressing the very high burn rates companies have: you should have control over your trajectory before you begin burning all your money.
urs2102 21 hours ago 0 replies      
Despite this definitely being important for all businesses at some point, how does it apply when evaluating businesses like early Facebook and Google? Prior to monetization, wouldn't they have appeared to be "default dead"?
TrevorJ 22 hours ago 0 replies      
Reminds me of a great episode of Dirty Jobs: https://www.youtube.com/watch?v=Ap3peqZ0RlA&feature=youtu.be...
codingdave 23 hours ago 1 reply      
I'm surprised it takes an interview with pg for this question to be raised. I would be asking it in an interview before I ever sign on as a new hire.
logicallee 11 hours ago 0 replies      
What about "Default no progress forever."? Let's suppose someone is building a new airplane and has a hundred subscribers paying $10 per year for their newsletter. Nobody is funding this person, and the business is alive forever. Default alive or default dead?

It's alive until the founder realizes he will never raise a round capable of building an airplane, and pivots and does something cheap and easy online instead.

Complete LiDAR Scan of England Publicly Available environmentagency.blog.gov.uk
301 points by alibarber  2 days ago   67 comments top 21
hanoz 2 days ago 6 replies      
I took a look at this after it was mentioned on Hacker News two weeks ago and ended up building this map of all the DSM 1m data:


I've been quite fascinated to discover how many mysterious lumps and bumps are to be found all over the country, often with no apparent explanation in aerial photo maps, and to my great surprise I find myself cultivating an interest in armchair archaeology. I've stumbled across a few features which on further research have turned out to be sites of note, a couple of which were only discovered in recent years, which is quite exciting. Next mission is to discover something completely unknown. In fact I could do with some help interpreting some features if anyone here has any experience in this area.

Here's a couple of well known sites:

https://houseprices.io/lab/lidar/map?ref=SU1224642189 (Stonehenge)

https://houseprices.io/lab/lidar/map?ref=SU1025569962 (Avebury)

A few of my 'discoveries':

https://houseprices.io/lab/lidar/map?ref=ST5895844810 (Medieval and Iron Age/Roman field systems near Croscombe, Somerset)

https://houseprices.io/lab/lidar/map?ref=NY7217242430 (Potential henge near Alston, Cumbria)

https://houseprices.io/lab/lidar/map?ref=SX1025261066 (Roman Fort near Restormel Castle, Cornwall)

A couple of things I'm not sure about:



Doctor_Fegg 2 days ago 0 replies      
People have been experimenting with using this to contribute to OpenStreetMap for a couple of weeks now. Here's one writeup: http://chris-osm.blogspot.co.uk/2015/09/extracting-building-...
praseodym 2 days ago 1 reply      
There is a similar dataset available for The Netherlands. A potree point cloud visualisation can be seen at http://ahn2.pointclouds.nl/.
wielebny 2 days ago 1 reply      
LIDAR scans of Poland have been publicly available for some time: http://geoportal.gov.pl/dane/numeryczne-modele-wysokosciowe
JorgeGT 2 days ago 1 reply      
An almost complete LiDAR scan of Spain is also publicly available. I wrote about it here and included a few samples: http://wechoosethemoon.es/2015/09/05/lidar-espana-3D/

Sadly it is in Spanish, but I hope the available areas and pictures of expected results are clear enough! LiDAR data is provided as 2 km x 2 km squares of RGB-colored points in *.laz format. If someone is interested I can translate into English or point to the sources.

Tepix 2 days ago 0 replies      
This is amazing. I love that they mention Minecraft as one of the use cases:

"LIDAR data - some surprising uses:"

"Computer games: Minecraft players have requested our LIDAR data to help them build virtual worlds: the data could be useful to anyone creating realistic 3D worlds."

cwal37 2 days ago 0 replies      
If you're interested in LiDAR data from the United States, you should have a look at this Wikipedia page and its corresponding links[0]. Most states have some kind of data freely available from the most recent survey, although it's neither uniform nor always clear in terms of how to access it. The structure of the program allowed individual states to tackle their own territory differently in both surveying and data dissemination, so there's no easy and official central repo as far as I understand.

I was in grad school at Indiana and working at the geological survey while they were finalizing some of the state's pieces of this, and it was really fascinating to see some of the early products people in the geography and geology departments were producing. I mucked around a bit with it myself, but never really produced anything useful. I can speak to finding the data fairly easy to acquire and quite comprehensive at the time; I'm uncertain if that's changed, but it might be a decent starting point[1].

[0] https://en.wikipedia.org/wiki/National_Lidar_Dataset_(United...

[1] http://gis.iu.edu/datasetInfo/statewide/in_2011.php

joosters 2 days ago 1 reply      
Can anyone recommend any 3d viewing programs for this data? This is all new to me but I'd love to try experimenting with it. The download zipfiles contain .asc files.
NickHaflinger 2 days ago 1 reply      
'All 11 terabytes of our LIDAR data (that's roughly equivalent to 2,750,000 MP3 songs)' or a stack of paper 513 kilometers high.
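The comparison in the quote implies a per-song size that's easy to back out (assuming decimal terabytes; the roughly 4 MB result is consistent with a typical MP3, though the exact figure is my assumption, not the article's):

```python
# Sanity-check the "11 TB = 2,750,000 MP3 songs" comparison.
total_bytes = 11 * 10**12   # 11 TB, decimal terabytes
songs = 2_750_000

bytes_per_song = total_bytes / songs
print(bytes_per_song / 10**6)   # -> 4.0, i.e. ~4 MB per song
```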
Schwolop 2 days ago 0 replies      
At one point in time[1], I hired a helicopter to act as a surrogate remote sensor doing data fusion with a ground robot. I flew as a passenger and told the pilot where to fly based on the ground robot's need for data.

Since we had to pay for the helicopter's time anyway, and the field trial was spread over two days, we left our equipment attached to the helicopter when it returned to its airfield that night. The next day we had a 78km long data set of LIDAR, GPS, visual imaging, and inertial measures, all from an altitude of about 25m giving us about +/- 2mm for the LIDAR's range data.

The sad end to this anecdote is that I have no idea what happened to that data. It's presumably sitting on a dusty server somewhere in academia.

[1] This point in time, as it happens: http://www.drtomallen.com/uploads/1/2/0/2/12026356/3361016_o...

deskamess 2 days ago 0 replies      
Any idea about the cost of doing a LIDAR scan for a region? Let's say you have 1,200 square km (assume a rectangular area). How much would that cost?
dougbinks 2 days ago 0 replies      
Format of the data is listed as Arc/Info ASCII Grid (AAIGrid), which is an ASCII Esri grid: https://en.wikipedia.org/wiki/Esri_grid.
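For anyone wanting to poke at the files directly, the AAIGrid format is plain text: a handful of header lines (ncols, nrows, corner coordinates, cell size, optional NODATA value) followed by whitespace-separated rows of values. A minimal reader might look like this (a sketch, not a replacement for GDAL):

```python
def read_aaigrid(text):
    """Parse an Arc/Info ASCII Grid into (header dict, list of rows)."""
    lines = text.strip().splitlines()
    header = {}
    i = 0
    # Header lines look like "ncols 3"; data rows start with a number.
    while i < len(lines):
        parts = lines[i].split()
        if parts[0].lower() in ("ncols", "nrows", "xllcorner", "yllcorner",
                                "cellsize", "nodata_value"):
            header[parts[0].lower()] = float(parts[1])
            i += 1
        else:
            break
    rows = [[float(v) for v in line.split()] for line in lines[i:]]
    return header, rows

sample = """ncols 3
nrows 2
xllcorner 0.0
yllcorner 0.0
cellsize 50.0
NODATA_value -9999
1.0 2.0 3.0
4.0 -9999 6.0"""
header, rows = read_aaigrid(sample)
print(int(header["ncols"]), rows[1][1])  # -> 3 -9999.0
```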
JonnieCache 2 days ago 4 replies      
Hell yes! This will be useful in my long term aim of procedurally generating rolling english hills for video game purposes...

EDIT: with the resolution of this thing, maybe I won't need to generate them, maybe I can just set the game in the real england...

bsykora 2 days ago 0 replies      
LIDAR is also being used at NASA to measure atmospheric CO2 concentrations.



Animats 2 days ago 1 reply      
Do they have "first and last" LIDAR data, or just one value per point? It's common to capture the distance to both the first and the last reflection. This often indicates the top of vegetation and the ground level. With that, you can easily identify trees, brush, and crops.
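Where both returns are available, a common derived product is canopy height: first-return elevation minus last-return elevation. A toy sketch of that idea (the field names and the 2 m threshold are invented for illustration):

```python
# Toy canopy-height check from first/last LIDAR returns.
# A large difference suggests vegetation; near-zero suggests bare ground.
points = [
    {"first": 112.4, "last": 100.1},   # probably a tree (~12 m canopy)
    {"first": 100.3, "last": 100.1},   # probably bare ground
]

for p in points:
    height = p["first"] - p["last"]
    p["vegetation"] = height > 2.0     # crude threshold, in metres

print([p["vegetation"] for p in points])  # -> [True, False]
```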
groth 2 days ago 1 reply      
Anyone know how burial mounds/roman roads are found? Would love to see that reproduced for the non-academic world.
chatman 2 days ago 0 replies      
This will be great as a base layer in OSM!
scuba7183 2 days ago 2 replies      
Awesome! Does anyone know if similar resources for the US are available?

Edit: possibly https://lta.cr.usgs.gov/LIDAR. Still looking for more.

tibbon 2 days ago 0 replies      
Could do some neat things with drone piloting with this.
vanous 2 days ago 0 replies      
Does anyone know of publicly available LIDAR data for the Czech Republic?
alphapapa 2 days ago 0 replies      
I was hoping to find some explanation of how they capture the data. I'm guessing it's from aircraft? It'd be interesting to read about how they stitch together and correct the data captured from a moving platform like that. And I wonder how long it takes to capture the whole country.
Show HN: Exposé, a static site generator for photos and videos github.com
542 points by Jack000  3 days ago   73 comments top 37
Jack000 3 days ago 5 replies      
so this is just a bit of glue code for imagemagick/ffmpeg that I use to generate my blog.

Last time I was on HN there was some interest in what I was using for the backend, so I cleaned up the code a bit and put it on github

eric-hu 3 days ago 1 reply      
This looks amazing. Thank you for generalizing this and open sourcing it.

For anyone looking to use the polygon word wrap feature as in the demo on the Github page with the Eiffel Tower, take a look at its responsive behavior before you make that post. As you shrink the window down on jack.ventures, some text can be cut off, or no longer show up on a clear portion of the image like a wall.

This isn't a bug with the software so much as a flaw with the magazine "words on image clearing" style layout that doesn't translate perfectly to the web medium.

superic 3 days ago 0 replies      
Beautiful work and amazing photography!

Interested in making things at Flickr? Drop me a line at eric at flickr.com :)

ausjke 3 days ago 0 replies      
really cool and I also learned a new way to check dependencies under bash:

command -v convert >/dev/null 2>&1 || { echo "ImageMagick is a required dependency, aborting..." >&2; exit 1; }
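The same idea translates naturally to Python's standard library; a sketch, assuming the same two external tools Exposé needs (`convert` from ImageMagick, and `ffmpeg`):

```python
import shutil
import sys

def require(*tools):
    """Exit early if any external tool is missing from PATH."""
    missing = [t for t in tools if shutil.which(t) is None]
    if missing:
        sys.exit("required dependencies not found: " + ", ".join(missing))

# require("convert", "ffmpeg")  # uncomment to enforce at startup
```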

bobfunk 3 days ago 0 replies      
Added it to staticgen.com now - love these small non-general purpose static gens :)

Reminds me of Thumbsup: https://github.com/rprieto/thumbsup

mlapeter 3 days ago 2 replies      
This is really great! Is it open source? I couldn't see a license or anything in the readme so wasn't sure.
mettamage 2 days ago 2 replies      
Got a question: for fun I ran this over about 400 to 500 photos. I did it partly to stress-test it. But the HTML file is empty. Did I do something wrong? The rest of the folders are there.

Here is the output if someone is interested to brainstorm about this problem. I edited the output a bit to make it slightly more readable.

<begin output>

Scanning directories.

Populating nav.

Reading files...........................

_name/Downloads/Expose-master/expose.sh: line 139:

/usr/bin/sed: Argument list too long

Starting encode

</end output>

Anyways, I checked the image, which was indeed corrupted, so ImageMagick was right on the money on that one. I still don't get why there's no HTML in the HTML file, though.
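For context on the error above: the kernel limits the total size of a command line (ARG_MAX), so passing hundreds of file paths to a single `sed` invocation can overflow it. The usual fix is batching arguments, which is what `xargs` does; a rough Python illustration of the batching idea:

```python
def batched(items, max_chars):
    """Yield chunks of items whose joined length stays under max_chars,
    mimicking what xargs does to avoid the kernel's ARG_MAX limit."""
    chunk, size = [], 0
    for item in items:
        # +1 accounts for the separating space on the command line
        if chunk and size + len(item) + 1 > max_chars:
            yield chunk
            chunk, size = [], 0
        chunk.append(item)
        size += len(item) + 1
    if chunk:
        yield chunk

files = [f"photo_{i:04d}.jpg" for i in range(500)]
chunks = list(batched(files, 1000))
print(len(chunks))  # several small invocations instead of one huge one
```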

cataflam 3 days ago 0 replies      
This looks amazing.

I was using a tool one of my friends made, with a similar idea, just put images and videos in directories and have a script automatically generate a nice output. It can be found at https://github.com/jlaine/django-coconuts, but yours looks spectacular. I'm tempted to switch.

3stripe 3 days ago 0 replies      
Lovely. Would be swell if you could plug it straight into a Dropbox folder.
desireco42 2 days ago 1 reply      
Thank you, this is exactly what I was looking for (actually thinking how to cobble up together myself). So I guess I am target audience :)

I want to add that sites look awesome and this is perfect for large number of people, I just want to thank you one more time for making this, it is also excellent starting point for photo sites.

callmeed 3 days ago 0 replies      
This is very cool. Love that its just a shell script.

In a similar vein, I have a Jekyll plugin that reads folders of images and can create galleries: https://github.com/callmeed/jekyll-image-set

Nice that you have video support. Can't wait to give this a try.

anderspitman 2 days ago 0 replies      
Very cool. Somewhat off topic: I've been looking recently for a good way to host many GBs of photos and videos for my friends and family to browse and download. Basically an open source version of Google Drive's file browser, with thumbnails and image and video previews. Any suggestions?
raimue 3 days ago 0 replies      
Thank you very much!

I appreciate the use of simple bash shell script instead of a scripting language with lots of runtime dependencies.

nathancahill 3 days ago 0 replies      
Phenomenal. I've bounced between Tumblr/Wordpress with photoblog themes, Flickr, 500px and Instagram for sharing my film photography with friends. This is better than all of those, especially with the captioning.

Since most of my photographs are organized by trip or theme, this is perfect.

foolinaround 11 hours ago 0 replies      
can the video clips play their audio when brought into focus? maybe play the audio just once, I guess?
lemming 3 days ago 0 replies      
This is really nice! Getting photos and especially video online in an easy way but still have it be beautiful has been really tricky. I'll definitely use this.
hobonumber1 3 days ago 1 reply      
This looks cool. Is there a demo link that I can check out?
jaza 3 days ago 0 replies      
Nice work. I guess that for a sitegen so focused on photos and videos, a bash script works well - lets you do whatever you need directly with imagemagick and ffmpeg, no beating around the bush.

And I like your sites! Now... all I need is to be a better photographer, then I'd be able to actually create something nice with your script. Hahaha.

lucaspottersky 3 days ago 0 replies      
i like the results. now, it would be very useful to have a friendly GUI to edit those things and generate the YML files for you.
falcolas 3 days ago 1 reply      
Given that it's a static site, why the JS requirement? Most browsers already intelligently handle keeping the "right" number of assets in memory, do you believe your JavaScript handles it better?

Also, it's not at all responsive to different viewport sizes; this might be a good thing to address.

EDIT: Oof, that's a kick in the karma. Aah, well.

thieving_magpie 3 days ago 0 replies      
I didn't know that I needed this, but now that I know it exists I really need this. Awesome job, thank you.
giancarlostoro 3 days ago 0 replies      
Would be cool if you could do something similar to this with git repositories as well... Hmmm..
marcfowler 3 days ago 0 replies      
This looks amazing - really great work. I'll definitely check this out properly.
dyogenez 3 days ago 1 reply      
This is seriously cool! I'm going to have to checkout the code for this one to see how some of this was handled.

Do you have any other sites doing photo/video stories which have been an inspiration for this project?

areohbe 3 days ago 0 replies      
This is wonderful. Superb work.
Omnipresent 3 days ago 0 replies      
Looks beautiful. Support for google photos would make this more versatile.
hkjtme 16 hours ago 0 replies      
if this would exist for S3.. i would be soooo happy. tried to find similar (even thumbnails would be ok) .. nothing out there.
flanbiscuit 3 days ago 0 replies      
I just got back from a trip and I have a lot of photos and some videos. This is perfect! I'm going to try this out as soon as I get home tonight
cmstoken 2 days ago 0 replies      
Beautiful work! I'm in love with your site. I'm definitely going to be using the software. Thank you Jack!
caiowilsonb 1 day ago 0 replies      
Thank you so much for sharing.
noahbradley 3 days ago 0 replies      
Love, love, love this. I travel and shoot a lot of photos, so this would be perfect for putting those out there.
thekevan 3 days ago 0 replies      
This is an excellent way to present images and videos. Also, reading about your work is an inspiration. Thanks!
chadscira 2 days ago 2 replies      
I really don't understand the appeal of these static site generators. Why don't people just toss something like CloudFlare in front of their dynamic sites and turn the edge caching up. I mean this is free, the bandwidth expenses are covered, and you now have a globally accessible site.

Unless you're just trying to get away with hosting your whole site as a GitHub page ;)

therealmarv 3 days ago 1 reply      
Looks great, but man... this seems like websites for the top 10% of the world with great internet speed. These sites are really not good with slower internet. Also look at the amount of traffic needed for the full example at http://jack.ventures; I don't even want to spend my mobile traffic on that example (although LTE is fast enough).
NKCSS 2 days ago 0 replies      
Very cool.

I have little else to add, but I wanted to let you know anyway :) Keep it up!

evantahler 3 days ago 0 replies      
rekshaw 3 days ago 0 replies      
Wow, love your bio: "These days I mostly travel and work on random stuff that I find interesting. I'm not really looking for employment, but I've always wanted to work at NASA and/or Google. So if you're NASA and/or Google HR, drop me a line ;]"

I wish I had that freedom.

Profile of Margaret Hamilton, programmer of the Apollo software wired.com
254 points by doppp  2 days ago   58 comments top 10
nickpsecurity 2 days ago 6 replies      
Many programmers talk about Ada Lovelace, who was certainly awesome, but should be talking about Margaret Hamilton. Before anyone heard of Dijkstra's work, Hamilton was straight up inventing everything from real software engineering to properly handling fault-tolerance and human interfaces. Here's what she was doing (see Apollo Beginnings rather than USL):


Her Wikipedia page gives her due credit and shows just how much she and her team pulled off:


Also note the size of that printout of her program. Try to think about how many people make one like that which doesn't fail in the field even when parts of its environment and hardware are failing. Also while using low-level code with little to no tool support on 60's-era embedded hardware rather than a JVM, etc. Margaret Hamilton was the first in the field of high-assurance programming that I've seen. A lot of firsts.

So, I give her credit where possible. The industry should even more as she wasn't a theoretical programmer like Ada Lovelace: she engineered actual software, co-invented many of the best practices for it, deployed them successfully in production, and then made a tool that automated much of that from requirement specs to correct code.

While programmers and industry ignore her, at least NASA gave her positive mention as the founder of software engineering and ultra-reliable software along with the biggest check they ever cut to an innovator:


Where would our systems and tools be if more attention was put into her methods, even just principles, than informal use of C? ;)

lumberjack 2 days ago 6 replies      
Only tangentially related, but it seems to me that if you want to work on cool software that does something novel and exciting, it's better to graduate with a degree in math or physics.

Also interesting if you visit her company's website: http://www.htius.com/

It's a view into a software industry that is virtually never reported on in the news, at least not on HN. The client list is impressive. It's not really clear what they actually do? Seems to me like they maintain a development environment and sell support contracts for it?

Animats 2 days ago 1 reply      
She and Saydean Zeldin used to have a company called Higher Order Software. I met them decades ago, when they were promoting that. They have an unusual formalism which never caught on.

There was a lot of interest in formal techniques in the late 1970s and early 1980s, but the technology didn't go that way.

danso 2 days ago 0 replies      
I knew Hamilton's achievements in the Apollo program were remarkable...but I hadn't known that she had started while being as young as 24 -- and being a mother. That's just incredible. I can't even imagine going through school as a father and being able to balance my time, never mind having to carry a child. Never mind being a star programmer at MIT. And her husband was going through law school at the time, which means not only did she not have a stay-at-home husband to take over the parenting duties, she was the breadwinner.
hoorayimhelping 2 days ago 0 replies      
Excellent chapter from the documentary Moon Machines about the Apollo guidance software (worth getting the whole thing on Amazon if you like this sort of thing), told from the point of view of the people who built the machines, rather than the astronauts.



davegauer 2 days ago 1 reply      
A fascinating read about the incredible capabilities of the Apollo software is _Digital Apollo: Human and Machine in Spaceflight_ by David A. Mindell.

Those craft were quite likely capable of a _lot_ more than they were ever allowed to do by the astronauts - though I guess we'll never know!

ghaff 2 days ago 1 reply      
From the article:

>Once the code was solid, it would be shipped off to a nearby Raytheon facility where a group of women, expert seamstresses known to the Apollo program as the Little Old Ladies, threaded copper wires through magnetic rings (a wire going through a core was a 1; a wire going around the core was a 0). Forget about RAM or disk drives; on Apollo, memory was literally hardwired and very nearly indestructible.

I was at a talk by Richard Battin, also of Draper on the Apollo program, a few years back. One of the stories he told was of a number of the Apollo astronauts visiting Raytheon (where the core memory was being "sewn") and the general gist of the visit was "be really careful with your work or these nice young boys could die."
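The weaving scheme described in the quote is simple enough to simulate: the sense wire either threads a core (a 1) or bypasses it (a 0), fixing each word at manufacture time. A toy model of that read-only scheme (purely illustrative, not an AGC emulator):

```python
# Toy model of core rope memory: a "word" is fixed at manufacture time
# by whether the sense wire passes through (1) or around (0) each core.
def weave(word, bits=16):
    """Return the through/around pattern for one memory word."""
    return [(word >> i) & 1 for i in reversed(range(bits))]

def read(pattern):
    """Recover the stored word from the woven pattern."""
    value = 0
    for bit in pattern:
        value = (value << 1) | bit
    return value

stored = weave(0o54321, bits=15)   # AGC words were 15 bits plus parity
print(oct(read(stored)))           # -> 0o54321
```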

ourmandave 2 days ago 2 replies      
Is she a candidate for the $10 bill?
oldmanjay 2 days ago 1 reply      
>why the gender inequality of the Mad Men era persists to this day

Well, that's demonstrably untrue, but the narrative, it must be pushed at all costs.

Engineer builds 'working' Thor's hammer that only he can lift cnet.com
240 points by davidiach  1 day ago   55 comments top 17
Jemaclus 1 day ago 4 replies      
This is pretty clever. The major improvements I'd want to make are some sort of RFID chip that deactivates the magnet when I'm close enough, instead of a fingerprint scanner. It seems like anyone who gets close enough can see the scanner, so I'd prefer to have something more invisible.

And the second thing would be just to improve the lag time between grasping the handle and the deactivation of the magnet, so I can just lean down and casually grab it, instead of having to hold it for a second before bringing it up. The more magic, the better.

Still, this is pretty awesome!

carbide 1 day ago 5 replies      
Cool idea, terrible acting. I feel like the "wow" factor in his audience was really killed by the awkward way he made it look like he was pushing a button and waiting for something to happen, instead of struggling to "lift" the hammer while he waited for the thumbprint to register.
oakwhiz 1 day ago 1 reply      
It needs an accelerometer to engage the magnet if the device is disturbed without the handle being touched.
lifeformed 23 hours ago 2 replies      
I was hoping that instead of magnets, it would just be extremely heavy, and be able to activate an internal gyroscopic system to do something like this: https://www.youtube.com/watch?v=GeyDf4ooPdo

It'd be pretty hard to fit all that in a small package though, and probably dangerous.

MisterBastahrd 1 day ago 1 reply      
Woulda been cool to add a remote shutoff so that the kids trying to lift it could have a bit of a thrill.
animex 1 day ago 0 replies      
An NFC ring might have been a better solution than the laggy fingerprint scanner. Still, cool idea!
magicseth 22 hours ago 0 replies      
Magician Robert-Houdin performed this trick in 1846 (without the fingerprint reader) [1]

He used the "Light and Heavy Chest" to demonstrate his ability to remove the strength of men for political ends.

[1] http://www.themagicdetective.com/2012/05/politics-magic-and-...

copsarebastards 22 hours ago 0 replies      
I'd have picked a different legend: the sword in the stone is more similar to how this works, the sword can be wielded by anyone after the king pulls it out of the stone, and this hammer can be wielded by anyone after the engineer pulls it off of the magnet. Thor's hammer can only be wielded by him, ever.

But it's still awesome.

jeffwass 1 day ago 0 replies      
In 'The Illusionist', the magician Eisenheim did a similar trick to Crown Prince Leopold, except it was King Arthur's sword in the stone.
netcraft 1 day ago 1 reply      
I think NFC or Bluetooth might have been better but neat execution nonetheless.
Vintila 21 hours ago 0 replies      
Is it possible to figure out the force required to lift this with magnets engaged?
trishume 1 day ago 1 reply      
Neat project.

I can't help but wonder if you could beat the magnet by kicking the handle sideways, the strong impact multiplied by the lever force might be enough to beat it.

KM33 1 day ago 0 replies      
This is really neat. I wonder if there is the possibility of using a similar magnet set-up as a lock? I worry about my motorcycle being stolen since it is so easy to pick up and most locks can be broken, if I had an electro-magnet like this one it might be much harder to steal.
brador 1 day ago 1 reply      
Could you do something similar with cornflour mix?
ck2 1 day ago 1 reply      
Instead of a thumbprint, he should have used a bracelet with an rfid chip, much faster response time and his hand could have been anywhere on the handle.

Or just inject the rfid chip under your finger.

ljk 1 day ago 1 reply      
Since it's a magnet, did it break the electronic devices the "lifters" were carrying?
oconnor663 1 day ago 1 reply      
What happens if the magnet comes back on while the hammer's not touching the metal? Is there a way to make this safe?
Kilogram conflict resolved at last nature.com
291 points by ColinWright  1 day ago   128 comments top 21
Asbostos 1 day ago 1 reply      
The best part about this batch of changes is they push the mole and Avogadro's constant out on their own where they belong, not linked to any other units. Now we'll have only a single mass unit (kg) instead of the two (kg and unified atomic mass unit) that we have now. This will knock carbon-12 off its perch as the definition of the "other" mass unit, which has been essential for using SI's mole but was never actually an SI unit itself.
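To see the link between Avogadro's constant and a mass unit that the silicon-sphere approach relies on, a back-of-envelope count (the Si-28 molar mass here is approximate):

```python
# Back-of-envelope: how many Si-28 atoms make up one kilogram?
N_A = 6.02214076e23        # Avogadro's constant, 1/mol (the now-fixed value)
molar_mass_si28 = 27.977   # g/mol for silicon-28, approximate

atoms_per_kg = 1000.0 / molar_mass_si28 * N_A
print(f"{atoms_per_kg:.3e}")   # roughly 2.15e25 atoms
```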
jakeogh 1 day ago 1 reply      
silicon-28 sphere(s?): https://www.youtube.com/watch?v=ZMByI4s-D-Y (yep, they let him palm it)

watt balance: https://www.youtube.com/watch?v=VlJSwb4i_uQ

zb 1 day ago 1 reply      
I was amused to read this:

"They never found the cause for the disagreement, but in late 2014 the NIST team achieved a match with the other two"

at a time when this story, also from Nature, is also on the front page: https://news.ycombinator.com/item?id=10383984

LoSboccacc 1 day ago 0 replies      
so the proposed definition was set by fixing the numerical value of the Planck constant to 6.62606X × 10^-34 s^-1·m^2·kg

and the conundrum was that they still needed a precise enough measurement of that constant, because its value is determined experimentally.
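One way to see how a fixed Planck constant can define mass: combining E = hν with E = mc² gives m = hν/c². A toy calculation using the exact values later adopted in the 2019 SI revision (illustrative only; the practical realization uses a watt balance, not photons):

```python
# Toy illustration: with h and c fixed, a mass follows from a frequency
# via m = h * nu / c**2 (combining E = h*nu and E = m*c**2).
h = 6.62607015e-34   # Planck constant, J*s (exact in the revised SI)
c = 299_792_458      # speed of light, m/s (exact)

# Frequency whose photon-energy mass-equivalent is exactly 1 kg:
nu = 1.0 * c**2 / h
print(f"{nu:.3e} Hz")   # an absurdly large frequency, around 1.36e50 Hz
```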



MarcusP 1 day ago 4 replies      
Are metric measurements all derived from the value 1kg? If so, does this mean that the entire metric weight range can now be officially based on mathematics?
dfc 1 day ago 2 replies      
The kilogram is still the only base unit that contains an SI prefix in the base unit's name.
tinkerdol 1 day ago 0 replies      
Reminds me of this movie: https://www.youtube.com/watch?v=5dPnFO_JCdc (haven't seen it, but it looks interesting)
pervycreeper 1 day ago 2 replies      
What level of accuracy are they aiming for? If it entails having some uncertainty over the precise number of atoms in the silicon sphere, then how did they choose this level of accuracy?
Animats 1 day ago 3 replies      
Ah, they went with the "electric kilogram". The other plan was to build up a regular structure with a known number of silicon atoms. That idea was to make a perfect crystal and count the number of atoms on each face. Apparently that's almost possible, although hard to do.
Aoyagi 1 day ago 1 reply      
Here I thought a kilogram was defined by water... oh well, looks like that definition is slightly outdated.


spydum 1 day ago 1 reply      
i was hoping this was going to explain the kg differences between the original and the copies. instead it just resolves it by changing the standard. good for science i guess, sad for my curiosity
unwind 1 day ago 0 replies      
Duplicate, very close in time: https://news.ycombinator.com/item?id=10385743.
novaleaf 1 day ago 0 replies      
If you are interested in hearing more expert commentary, NPR Science Friday did a piece on this in July:


thieving_magpie 1 day ago 0 replies      
A planet money podcast from a few years ago on the kilogram: http://www.npr.org/templates/story/story.php?storyId=1120033...
bartvbl 1 day ago 3 replies      
I wonder: the article states that the SI unit kg up to this point was defined using a single object. Doesn't this definition also involve the fact that it's placed on Earth, thus requiring two objects for its definition?
acqq 1 day ago 2 replies      
I didn't understand what then will be used: the Si sphere or the Watt balance?
justhw 1 day ago 1 reply      
There's a good Radiolab episode related to this .


TomGullen 1 day ago 0 replies      
Lived next door to a PhD NPL physicist who was working on this a few years ago. I think they ended up handing the project over to Canada or somewhere like that, IIRC. Fascinating project and guy.
lifeformed 1 day ago 1 reply      
Previously, why didn't they have a reference gram instead of a kilogram? Seems like it'd be easier to create and maintain, and transport.
mtgx 1 day ago 6 replies      
Now even the U.S. can adopt it.
castratikron 1 day ago 1 reply      
Strange to see Planck's constant used that way, defining a kilogram. Planck's constant usually only shows up when you're doing quantum mechanics and the things you're working with are really small.
The Feynman Lectures on Physics are free online caltech.edu
231 points by alexholehouse  1 day ago   20 comments top 10
rcurry 21 hours ago 1 reply      
One of Feynman's funniest lines, I thought, was during a talk where someone in the audience asked if we'd ever be able to develop an anti-gravity device. Feynman gestured at their chair and said something along the lines of "we already have, in fact you're sitting on one right now."
fletchowns 21 hours ago 0 replies      
If you haven't read "Surely You're Joking, Mr. Feynman!" yet, do yourself a favor and order it right now!
webmaven 1 day ago 2 replies      

In not totally unrelated news, I just finished reading Greg Egan's 'Orthogonal' trilogy (alien (as in really alien, alternate-universe cosmology and physics) multi-generational epic). The Feynman-type diagrams in the last book helped make sense of the weird physics.

alexholehouse 21 hours ago 0 replies      
I should (sheepishly) say that I knew Vol. 1 and 2 were available but very recently discovered 3 was now also available. As it turns out, 3 was published a while ago, so while it was new to me, the full set being available was not, in fact, new to the world.

That said, hopefully it's all new to [some] other people!

mironathetin 16 hours ago 0 replies      
If I wrote "A must read for every physics student" it would feel more like duty. But actually it is fun to read the Feynman lectures because they are so inspiring. One of those rare books that are written so well that it becomes fun to study - even such abstract things as physics.

Few scientists have a talent to present abstract things as well as Feynman. Daniel Kahneman and Sigmund Freud, whose lectures are also a pleasure to read, come to mind.

I couldn't read the Feynman lectures for my courses though (we followed the Berkeley physics series), but whenever I want to refresh my mind or find a nice way to explain things to students, these are a swell reference.

I still prefer the printed version though because of its nice layout with images and drawings in the margins.

guilhermeasg 12 hours ago 0 replies      
There's also a couple of interesting videos here: http://www.feynmanphysicslectures.com/
mhartl 21 hours ago 0 replies      
You're welcome. :-)
okasaki 16 hours ago 0 replies      
Some of the text does not render correctly if you set your own browser fonts.
hchenji 23 hours ago 2 replies      
Is this new, or has it existed for some time?
Comparison R vs. Python: head to head data analysis dataquest.io
275 points by emre  1 day ago   194 comments top 29
mbreese 1 day ago 6 replies      
This is interesting, but not really an R vs. Python comparison. It's an R vs. Pandas/Numpy comparison. For basic (or even advanced) stats, R wins hands down. And it's really hard to beat ggplot. And CRAN is much better for finding other statistical or data analysis packages.

But when you start having to massage the data in the language (database lookups, integrating datasets, more complicated logic), Python is the better "general-purpose" language. It is a pretty steep learning curve to grok the R internal data representations and how things work.

The better part of this comparison, in my opinion, is how to perform similar tasks in each language. It would be more beneficial to have a comparison of here is where Python/Pandas is good, here is where R is better, and how to switch between them. Another way of saying this is figuring out when something is too hard in R and it's time to flip to Python for a while...
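A concrete sketch of the "general-purpose glue" the parent describes, using only Python's standard library (the table and column names are invented for illustration):

```python
import csv
import io
import sqlite3

# Hypothetical scenario: enrich CSV rows with a database lookup --
# the kind of glue work that is awkward in R but routine in Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE teams (abbrev TEXT, city TEXT)")
conn.executemany("INSERT INTO teams VALUES (?, ?)",
                 [("GSW", "Oakland"), ("CLE", "Cleveland")])

raw = io.StringIO("player,team\nCurry,GSW\nJames,CLE\n")
enriched = []
for row in csv.DictReader(raw):
    city = conn.execute("SELECT city FROM teams WHERE abbrev = ?",
                        (row["team"],)).fetchone()[0]
    enriched.append({**row, "city": city})

print(enriched[0]["city"])  # -> Oakland
```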

bigtunacan 1 day ago 5 replies      
R is certainly a unique language, but when it comes to statistics I haven't seen anything else that compares. Often I see this R vs Python comparison being made (not that this particular article has that slant) as a come drink the Python kool-aid; it tastes better.

Yes; Python is a better general purpose language. It is inferior though when it comes specifically to statistical analysis. Personally I don't even try to use R as a general purpose language. I use it for data processing, statistics, and static visualizations. If I want dynamic visualizations I process in R then typically do a hand off to JavaScript and use D3.

Another clear advantage of R is that it is embedded into so many other tools. Ruby, C++, Java, Postgres, SQL Server (2016); I'm sure there are others.

phillipamann 1 day ago 1 reply      
R is a wonderful language if you choose to get used to it. I love it. I've even used R in production quality assurance to check for regressions in data (not the statistical regressions). I see countless R posts where people try to compare it to Python to find the one true language for working with data. Article after article, there clearly isn't a winner. People like R and Python for different reasons. I think it's actually quite intuitive to think about everything in terms of vectors with R. I like the functional aspects of R. I wish R was a bit faster, but I am pretty sure the people who maintain R are working on that. You can't beat the enormous library that R has.
danso 1 day ago 3 replies      
I spent a few weeks a few months ago learning R. It's not a bad language, and yes, the plotting is currently second-to-none, at least based on my limited experience with matplotlib and seaborn.

There are scant few articles on going from Python to R...and I think that has given me a lot of reason to hesitate. One of the big assets of R is Hadley Wickham...the amount and variety of work he has contributed is prodigious (not just ggplot2, but everything from data cleaning, web scraping, dev tools, time-handling a la moment.js, and books). But that's not just evidence of how generous and talented Wickham is, but of how relatively little dev support there is in R. If something breaks in ggplot2 -- or any of the many libraries he's involved in -- he's often the one to respond to the ticket. He's only one person. There are many talented developers in R but it's not quite a deep open-source ecosystem and community yet.

Also word-of-warning: ggplot2 (as of 2014[1]) is in maintenance mode and Wickham is focused on ggvis, which will be a web visualization library. I don't know if there has been much talk about non-Hadley-Wickham people taking over ggplot2 and expanding it...it seems more that people are content to follow him into ggvis, even though a static viz library is still very valuable.

[1] https://groups.google.com/forum/#!topic/ggplot2/SSxt8B8QLfo/...

sweezyjeezy 1 day ago 5 replies      
This is just a series of incredibly generic operations on an already cleaned dataset in csv format. In reality, you probably need to retrieve and clean the dataset yourself from, say, a database, and you you may well need to do something non-standard with the data, which needs an external library with good documentation. Python is better equipped in both regards. Not to mention, if you're building this into any sort of product rather than just exploring, R is a bad choice. Disclaimer, I learned R before Python, and won't go back.
c3534l 1 day ago 2 replies      
I like Python better as a language, but Python's libraries take more work to understand and the APIs aren't very unified. R is much more regular and the documentation is better. Even complicated and obscure machine learning tasks have good support in R. BUT the performance of R can be very, very annoying. Assignment is slow as all hell, and it can often take work to figure out how to rephrase complicated functions in a way that R can execute efficiently. I think being much more functional than Python works well for data. I mean, the L in LISP stands for list! Visualizations are also easier and more intuitive in R, IMO. Especially since half the time you can just wrap some data in "plot" and R will figure out which method it should use.

I think the conclusion of the article is correct. R is more pleasant for mathier-type stuff, while Python is the better general-purpose language. If your job involves showing people PowerPoint presentations of the mathematical analysis you've done, you'd probably want to use R. If, on the other hand, you're prototyping data-driven applications, Python would probably be better.

That said, I really like Julia, but can't justify really diving into it at this point. :\

Mikeb85 1 day ago 0 replies      
The reason I like R - it just makes data exploration and analysis too damn easy.

You've got R Studio, which is one of the best environments ever for exploring and visualising data, and it manages all your R packages, projects, and version control effortlessly.

Then you've got the plethora of packages - if you're in any of the following fields: statistics, finance, economics, bioinformatics, and probably a few others, there are packages that instantly make your life easier.

The environment is perfect for data exploration - it saves all the data in your 'environment', allows you to define multiple environments, and your project can be saved at any point, with all the global data intact.

If I want some extra speed, I can create C++ modules from within R Studio, compile and link them, as easily as simply creating a new R script. Fortran is a tiny bit more work, still easy enough however.

Want multicore or to spread tasks over a cluster? R has built-in functions that do that for you. As easy as calling mclapply, parApply, or clusterApply. Heck, you can even write your function in another language, then R handles applying it over however many cores you want.

Want to install and manage packages, update them, create them, etc...? All can be done from R Studio's interface.

Knitr can create markdown/HTML/pdf/MS Word files from R markdown, or you can simply compile everything to a 'notebook' style HTML page.

And all this is done incredibly easily, all from a single package (R Studio) which itself is easy to get and install.

Oh yeah, visualisation, nothing really beats R.

And while there are quirks to the language, for non-programmers this isn't really an obstacle, since they aren't already used to any particular paradigm.

As for Python, I'm sure it's great (I've used it a little), but I really don't see how it can compare. R's entire environment is geared towards data analysis and exploration, towards interfacing with the compiled languages most used for HPC, and running tasks over the hardware you will most likely be using.

evanpw 1 day ago 3 replies      
If you only have time to learn one language, learn Python, because it's better for non-statistical purposes (I don't think that's very controversial).

If you need cutting-edge or esoteric statistics, use R. If it exists, there is an R implementation, but the major Python packages really only cover the most popular techniques.

If neither of those apply, it's mostly a matter of taste which one you use, and they interact pretty well with each other anyway.

acaloiar 1 day ago 2 replies      
I have always considered R the best tool for both simple and complex analytics. But it should not go unmentioned that the features responsible for R's usability often manifest as poor performance. As a result, I have some experience rewriting the underlying C code in other languages, and what one finds under the hood is often not pretty. It would be interesting to see a performance comparison between Python and R.
ggrothendieck 1 day ago 0 replies      
For R:

(1) instead of `sapply(nba, mean, na.rm = TRUE)`, use `colMeans(nba, na.rm = TRUE)`;

(2) instead of `nba[, c("ast", "fg", "trb")]`, use `nba[c("ast", "fg", "trb")]`;

(3) instead of `sum(is.na(col)) == 0`, use `!anyNA(col)`;

(4) instead of `sample(1:nrow(nba), trainRowCount)`, use `sample(nrow(nba), trainRowCount)`; and

(5) instead of tons of code, use `library(XML); readHTMLTable(url, stringsAsFactors = FALSE)`.
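For comparison, here is a rough pandas sketch of the same operations. The NBA data isn't available here, so a tiny made-up stand-in (column names taken from the R snippets above, numbers invented) is used:

```python
import numpy as np
import pandas as pd

# Tiny made-up stand-in for the article's NBA data frame.
nba = pd.DataFrame({
    "ast": [5.0, 3.0, np.nan, 7.0],
    "fg":  [4.0, 6.0, 2.0, 8.0],
    "trb": [10.0, 2.0, 4.0, 6.0],
})

# colMeans(nba, na.rm = TRUE)
col_means = nba.mean()                 # skips NaN by default

# nba[c("ast", "fg", "trb")]
subset = nba[["ast", "fg", "trb"]]

# !anyNA(col)
has_no_na = not nba["fg"].isna().any()

# sample(nrow(nba), trainRowCount)
train = nba.sample(n=3, random_state=0)
```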
mojoe 1 day ago 3 replies      
The one thing that sometimes gets overlooked when people decide whether to use R or Python is how robust the language and libraries are. I've programmed professionally in both, and R is really bad for production environments. The packages (and even language internals sometimes) break fairly often for certain use cases, and doing regression testing on R is not as easy as Python. If you're doing one-off analyses, R is great -- for anything else I'd recommend Python/Pandas/Scikit.
The13thDoc 1 day ago 0 replies      
The "cheat sheet" comparison between R and Python is helpful. The presentation is well done.

The conclusions state what we already know: Python is object oriented; R is functional.

The Last Word appropriately tells us your opinion that Python is stronger in more areas.

acomjean 1 day ago 3 replies      
I work with biologists. They seem to take to R, which seems strange to me. I think some of it is RStudio, the IDE, which shows variables in memory in the sidebar; you can click to inspect them. It makes everything really accessible for those who aren't programmers. It seems to replace Excel for generating plots.

I've grown to appreciate R, especially its plotting ability (ggplot).

falicon 1 day ago 0 replies      
Language comparisons are equiv. to religion comparisons...you aren't going to find a universal answer or truth, it's an individual/faith sort of thing.

That being said - all the serious math/data people I know love both R and Python...R for the heavy math, Python for the simplicity, glue, and organization.

xname2 1 day ago 0 replies      
"data analysis" means differently in R and Python. In R, it's all kinds of statistical analyses. In Python, it's basic statistical analysis plus data mining stuff. There are too many statistical analyses only exist in R.
fsiefken 1 day ago 0 replies      
It would be nice to compare JuliaStats and Clojure based Incanter with Python Pandas/NumPy/SciPy. http://juliastats.github.io/
zitterbewegung 1 day ago 1 reply      
This is interesting not just as a comparison, but for people who know R or Python and want to go from one to the other.
willpearse 1 day ago 0 replies      
Very picky, but beware of constantly using "set.seed" throughout your R scripts. Always reusing the same random seed is not necessarily helpful for stats, and it makes the R code look a lot trickier than it need be.
wesm 1 day ago 1 reply      
I hope you all know that the people who have invested most in actually building this software care the least about this discussion.
daveorzach 1 day ago 1 reply      
In manufacturing Minitab and JMP are used for data analysis (histograms, control charts, DOE analysis, etc.) They are much easier to use and provide helpful tutorials on the actual analysis.

What features or workflows do R or Pandas/NumPy offer manufacturing that Minitab & JMP can't?

andyjgarcia 1 day ago 0 replies      
The comparison is R to Python+pandas.

The equivalent comparison should be R+dplyr to Python+pandas.

Base R is quite verbose and convoluted compared to using dplyr. Likewise data analysis in Python is painful compared to using pandas.
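A toy sketch of the kind of chained, dplyr-style pipeline pandas allows (the columns and data here are made up purely for illustration):

```python
import pandas as pd

# Made-up data, purely for illustration.
df = pd.DataFrame({"team": ["A", "A", "B"], "pts": [10, 20, 30]})

# Filter, aggregate, and sort as one chained pipeline, dplyr-style.
result = (
    df[df["pts"] > 5]
      .groupby("team", as_index=False)["pts"]
      .mean()
      .sort_values("pts", ascending=False)
)
print(result)
```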

thebelal 1 day ago 1 reply      
The rvest implementation was the main thing that seemed like an R port of the python implementation rather than best use of rvest.

An alternate (simpler) implementation of the rvest web scraping example is at https://gist.github.com/jimhester/01087e190618cc91a213

It would be even simpler, but basketball-reference designs its tables for humans rather than for easy scraping.

xixi77 1 day ago 1 reply      
Really, syntax "nba.head(1)" is not any more "object-oriented" than "head(nba, 1)" -- it's just syntax, and the R statement is in fact an application of R's object system (there are several of them).

IMO, R's system is actually more powerful and intuitive -- e.g. it is fairly straightforward to write a generic function dosomething(x,y) that would dispatch specific code depending on classes of both x and y.
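For contrast, Python's standard library offers generic functions too, but they dispatch on the first argument only. A minimal sketch (`describe` is a hypothetical function, akin to the comment's dosomething(x, y)):

```python
from functools import singledispatch

# Python's stdlib dispatches on the class of the first argument only,
# whereas R's S4 methods can dispatch on the classes of several arguments.
@singledispatch
def describe(x):
    return "something"

@describe.register
def _(x: int):
    return "an int"

@describe.register
def _(x: list):
    return "a list"

print(describe(3), describe([1, 2]), describe("hi"))
```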

dekhn 1 day ago 0 replies      
In general, if I have to choose between two languages, one designed specifically for statistics and one more general, I will choose the more general one.

R's value is in the implementation of its libraries, but there is no technical reason a really OCD person couldn't implement libraries of equally high quality in Python.

vineet7kumar 1 day ago 1 reply      
It would be nice to also have some notes about performance of both the languages for each of the tasks compared. I believe pandas would be faster due to its implementation in C. The last time I checked R was an interpreted language with its interpreter written in R.
jkyle 1 day ago 0 replies      
Caret is a great package for a lot of utility functions and tuning in R. For example, the sampling example can be done using Caret's createDataPartition which maintains the relative distributions of the target classes and is more 'terse'.

  > library(caret)
  > data(iris)
  > idx <- caret::createDataPartition(iris$Species, p = 0.7, list = F)
  > summary(iris$Species)
        setosa versicolor  virginica
            50         50         50
  > summary(iris[idx, ]$Species)
        setosa versicolor  virginica
            35         35         35
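A rough pandas analogue of that stratified split (a sketch, not caret's exact algorithm): sample 70% within each class so the class proportions are preserved.

```python
import pandas as pd

# Toy stand-in for iris: three classes, 50 rows each.
df = pd.DataFrame({
    "Species": ["setosa"] * 50 + ["versicolor"] * 50 + ["virginica"] * 50,
})

# Sample 70% within each class, keeping the relative class distributions,
# much like createDataPartition(p = 0.7) above.
train = df.groupby("Species").sample(frac=0.7, random_state=0)

print(train["Species"].value_counts())  # 35 rows per class
```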

hogu 1 day ago 1 reply      
If you do your stuff in R, how do you move it into production? Or do you not need to?
k8tte 1 day ago 2 replies      
I tried to help my wife, who uses R in school, only to get quickly lost. I also attended a ~1 hour R course at university.

To me, R was a waste of time, and I really don't understand why it's so popular in academia. If you already have some programming knowledge, go with Python + SciPy instead.

EDIT: R is even more useless without R Studio, http://www.rstudio.com/. And NO, don't go build a website in R!

vegabook 1 day ago 0 replies      
Python's main problem is that it's moving in a CS direction and not a data science direction.

The "weekend hack" that was Python, a philosophy carried into 2.x, made it a supremely pragmatic language, which the data scientists love. They want to think algorithms and maths. The language must not get in the way.

3.x is wanting to be serious. It wants to take on Golang, Javascript, Java. It wants to be taken seriously. Enterprise and Web. There is nothing in 3.x for data scientists other than the fig leaf of the @ operator. It's more complicated to do simple stuff in 3.x. It's more robust from a theoretical point of view, maybe, but it also imposes a cognitive overhead for people whose minds are already FULL of their algo problems and just want to get from a -> b as easily as possible, without CS purity or implementation elegance putting up barriers to pragmatism (I give you Unicode v Ascii, print() v print, xrange v range, 01 v 1 (the first is an error in 3.x. Why exactly?), focus on concurrency not raw parallelism; the list goes on).

R wants to get things done, and is vectors first. Vectors are what big data typically is all about (if not matrices and tensors). It's an order of magnitude higher dimensionality in the default, canonical data structure. Applies and indexing in R, vector-wise, feels natural. Numpy makes a good effort, but must still operate in a scalar/OO world of its host language, and inconsistencies inevitably creep in, even in Pandas.

As a final point, I'll suggest that R is much closer to the vectorised future, and that even if it is tragically slow, it will train your mind in the first steps towards "thinking parallel".
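As a rough illustration of the vector-first point, the canonical R idioms translate to NumPy like so (toy data, purely illustrative):

```python
import numpy as np

# Vector-first operations: element-wise arithmetic and boolean indexing,
# close to `x * 2` and `x[x > 2]` in R.
x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
doubled = x * 2
big = x[x > 2]
print(doubled, big)
```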

There's No DRM in JPEG Let's Keep It That Way eff.org
223 points by DiabloD3  2 days ago   120 comments top 14
Pxtl 2 days ago 2 replies      
To be fair, ask webcomics guys and photographers about piracy - they get the worst of it. Big companies that would never dream of encouraging you to pirate videos and songs functionally encourage you to swap images around constantly, stripped and re-watermarked and whatnot.

So yes, I do feel a bit bad for small independent artists who watch the standards bodies work themselves into a fury to protect video and audio content while they have to deal with Google Image Search and 9gag.

bytesandbots 2 days ago 1 reply      
DRM is protection of content from the consumer itself. The consumer is going to consume the data through an analogue channel. This channel will always be the source of extracting redistributable content. The very premise of DRM rules out any 100%-solution and sets it as obfuscation.

I feel it might be a stupid idea, but it is not impractical.

The effort of extracting content should be less than the maximum value that can be generated by redistribution. Thus, returns from piracy diminish as you go from software to video to image to text. Extracting an image via the analogue hole is too easy. This assumes an open technology ecosystem: not exactly the RMS world, but at least Linus's, or perhaps Mozilla's. The enforcers of DRM do the sensible thing of spreading their proprietary black boxes to as many people as possible, until they can shut their doors to the rest of the community. That is precisely when certain open source foundations, too, had to back down. That is how you can enjoy Netflix only within your Chrome browser.

What bothers me is why they are trying to make it into the standards. If it is built into the standards, it will be built into the downloaders as well. Remember what happened with HLS AES encryption: it is now built into the video downloaders themselves. While I understand the benefits of standardization and how it has given shape to tech, they might not apply to something so un-technical as DRM. If you do want obfuscation, at least do not make it a standard procedure. You know very well that the strongest DRM cannot be technically secure.

endgame 2 days ago 1 reply      
I'm so glad I grew up in an era where people's only concern was making things work AT ALL. Trying to make things work for licensed users only, or only for certain devices, or anything else is just bullshit.
scotty79 2 days ago 6 replies      
If we could get rid of copyright there'd be much less resistance toward embedding information about who created the work.

I'd very much like that we could abolish fines for copying but keep fines for stripping author signature from work, or not propagating original author signature to works that are derived.

This way you could have a trail to reach actual author of the part of work that you find awesome to commission some new work from him.

This could be much more valuable for way more people than current copyright schemes that only seem to benefit fatcats.

anon4 2 days ago 1 reply      
This seems like a technical solution to what's a political problem. Those don't usually work out as one would like, or worse, get enshrined in some standard that doesn't solve anything and which makes things worse for the few people that have supporting programs.
impostervt 2 days ago 1 reply      
One of my side projects is a photo watermarking SaaS. I was surprised when people actually started paying for it years ago, as I figured "there's a ton of watermarking apps out there". But it turns out there's a lot of demand from amateur and semi-pro photographers who believe, rightly or wrongly, that they're being ripped off (and want a simple way to watermark their photos). For pros, there are other services out there that actively scan the internet looking for infringers and send DMCA takedown, or similar, notices. These services are generally too pricey for the type of customers my side project has.

I guess my point is: there is pretty big demand to protect images online. I suspect DRM will end up being implemented in some form or another.

EvanAnderson 2 days ago 1 reply      
How long until this is used to lock down the independent "publishing" of images? This seems like a great foundation upon which to build software ecosystems that discourage user-generated content w/o the imprimatur of an authorized publisher attached.
atom_enger 2 days ago 5 replies      
Couldn't we just stop using JPEG? I realize this is a can of worms, but it's an option, right?
Spivak 2 days ago 3 replies      
Wouldn't this DRM require every implementation of the JPEG standard to honor the DRM or am I missing something?
LoSboccacc 2 days ago 1 reply      
Isn't this problem better solved by watermarking anyway? People want their work distributed publicly but want attribution to attract users. Buyers most probably need a redistributable license, as they are interested in the media as part of some communication effort.

DRM is not going to help after the buyer redistributes the purchased work in any way, especially if there is a medium conversion involved, i.e. a printed issue.

Nadya 2 days ago 4 replies      
Would this stop me from screenshotting the image, saving it as a .png, and distributing that?

Because I and many people would do just that. Sure, the DRM might work for my grandparents and a few other non-techies but over time I can teach my grandparents how to screenshot an image and others would catch on. People would even make chrome apps to "click a picture and resave it in a shareable format".

I'm not sure what this DRM would solve, if anything, other than pissing off users and giving photographers and other digital-sharing artists a false sense of security.

angersock 2 days ago 0 replies      
More attempted fencing off of the commons. Yay.
throwaway2048 2 days ago 2 replies      
countdown until mozilla folds like wet cardboard
togusa 2 days ago 0 replies      
WebP anyone?
The world needs at least 600M new jobs in the next decade for young people bloomberg.com
216 points by cryoshon  2 days ago   474 comments top 44
downandout 2 days ago 11 replies      
Most of the comments here focus on how people aren't trying hard enough to get jobs. This article indicates that the jobs don't exist and that the problem is likely to get far worse. You can try as hard as you want to get a job - if no one is hiring, you aren't going to get one.

The reality is that as time goes on, the world's needs can and will be met by fewer and fewer people. This should be a good thing, but it won't work under most existing economic systems. Our entire economy has to change to accommodate the new reality that a significant percentage of the population will be unemployed.

jonathanjaeger 2 days ago 14 replies      
Disclaimer: This is purely anecdotal, and not backed by any data.

I'm in my mid-twenties and just recently started interviewing people for the team I work on. It's amazing how little effort many seemingly qualified people put in to secure an entry-level job. Whether it be hustle to learn more about a business, the specifics about the company you might work at, or finding someone to give a second set of eyes on a cover letter or resume, most people really drop the ball. If job prospects are grim, you'd at least hope people would put in more effort.

maresca 2 days ago 6 replies      
Student loan debt surpassed credit card and auto loan debt in the US last year. Many college grads graduate with large sums of debt and can't find relevant jobs. Since student loan debt isn't forgivable, it'll be interesting to see the effect of this over the next decade. I have a hunch that the next big market crash will be caused by student loan debt.
NoGravitas 2 days ago 5 replies      
> The world needs at least 600 million new jobs in the next decade for young people

Or, perhaps, the world needs to stop coupling basic human needs for subsistence and dignity to wage labour, and find some better way of doing things.

jbob2000 2 days ago 0 replies      
This is anecdotal at best, but I feel like there is an apathy epidemic. It's fucking impossible to get people to even do "fun" things, much less a "boring" job. Everyone just wants to sit at home in front of a screen. It could just be the people I surround myself with, but that's the feeling I get.
sosuke 2 days ago 3 replies      
What qualifies as an adult anymore?

 people 15 to 29 years old are at least twice as likely as adults to be unemployed
30 is adulthood in their interpretation of the data.

laurentoget 2 days ago 0 replies      
This is quite a contradiction from the recent talk about demographic pressure leading to a wage turnaround.


" Taking just wage growth, simply put (and for more detail follow the links above), an end to the global labour glut should see real wages (wages accounting for the change in prices) start to rise at a faster pace. An ONS report of 2014 found that UK real wages in the 1970s and 1980s grew by an average of 2.9% a year."

so which is it? too many people or not enough?

fensterblick 2 days ago 1 reply      
As the article highlights, the Arab revolutions were led by the youth. I wonder what, if anything, will happen in the USA when the current generation, saddled by seemingly insurmountable college debt, comes to the realization that it cannot find stable work or afford decent housing.

I truly believe that moment will be an earthquake for the current political environment; what we characterize Republicans and Democrats today will dramatically change (just like it did after the Civil War and also the Civil Rights movement).

ausjke 2 days ago 0 replies      
I don't know how this is going to work. I actually think the root cause of the Mid-East crisis is more related to youth joblessness.

Young people without jobs lead to bad things. In the meantime, technology/AI/robotic factories are making more people "unneeded". Will it be a utopia come true, or a revolution?

ChuckMcM 2 days ago 2 replies      
These stories are always interesting to read, both for what they say and what they don't say. For example, did you know that worldwide there is a shortage of people in various trade roles[1]? (Welding, masonry, carpentry, electricians, etc.) And why are there young people loading up on debt they can't afford to go to Ivy League schools when they can be just as successful going to state schools? How much part-time employment might be found if there weren't a floor on minimum wage? [2] Since we don't have the category of 'extra' or 'part-time' job like we used to, current minimum wage policy is geared toward making every job pay a living wage. That prices a lot of jobs out of competitiveness for humans and spurs the development of robotic replacements. Not that those jobs are career paths, but they do offer people a bit of extra change in their pocket.

A more intriguing question is to what end might you employ two or three hundred million people? Imagine they are sitting outside your window waiting for your command. Assuming you are paying them a living wage, what economic output could they accomplish that would be "worth" say 5 to 15 trillion dollars a year?

[1] http://facilityexecutive.com/2015/05/u-s-employers-suffer-la...

[2] https://www.cbo.gov/publication/44995

cies 2 days ago 0 replies      
Who needs a job? We need "goods", and to obtain them we usually trade part of our monetary income; but that need not come from a job.

I believe in "mincome" or "basic minimal income", as provided by a form of government to all citizens; to be paid for by tax money. This will be low, but enough to sustain yourself (simple shelter + food). If you want more then that you need to either find a job or walk a more entrepreneurial path and make a job.

The amount people receive as "mincome" will be an important number for politicians to control. It will have a strong effect on the then-emerging "post-mincome unemployment rate": all the people who are looking to supplement their mincome but have not yet found the means to do so.

I think a mincome society will see a lot more people entrepreneuring, as a safety net is in place.

The "jobs" that the article speaks of are only going to be created if there is a strict need for them. A business will usually only create a job in last resort, as employing people costs money and brings risks.

maerF0x0 2 days ago 1 reply      
There are a few ways to create many openings:

1. Legislate a maximum number of working hours (probably < 40). 3 jobs at 40 hours per week become 4 jobs at 30 per week, a 33% expansion; problem solved.

2. Allow humans to undercut automation in price competition (eg abolish minimum wage)

3. Expand government to employ people for whatever, just print the money you pay them.

Clearly these all come with unsatisfactory side-effects.

Maybe the fix is to end our obsession with creating jobs and with jobs being the form of survival we offer our species. Imagine if we just created 600M jobs automating all the things so that those jobs would never (or nearly never) need to be done again? The future generations would need billions of jobs! But no one would be worse off for it. It's like when the dishwasher became commonplace: suddenly kids were free to do more homework or Facebook or Xbox, and parents were freer not to have kids (to complete household chores). Etc. By automating and reducing the work that humankind has to do, we're enabling better lives for all involved, including the displaced workers. Change hurts in the short run, but can bring utopia in the long run.
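The hours-cap arithmetic in option 1 above can be sketched as:

```python
# Total labour hours stay fixed; capping the work week at 30 hours
# turns 3 forty-hour positions into 4 thirty-hour ones.
total_hours = 3 * 40
jobs_at_30 = total_hours / 30
expansion = jobs_at_30 / 3 - 1
print(jobs_at_30, f"{expansion:.0%}")
```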

morgante 2 days ago 1 reply      
> people 15 to 29 years old are at least twice as likely as adults to be unemployed.

Today I learned that 28-year-olds aren't adults.

PebblesHD 2 days ago 0 replies      
This is a truly massive global issue, and it hits quite close to me. I've been absolutely lucky in going to university and getting a reasonably promising role in a financial institution whilst studying. The flip side has happened to my brother, who at this point is stuck doing menial jobs to pay for food and transport to get to university whilst he collects debt for going. I've had conversations to the effect of 'I actually cannot afford to go to class today or I'll be losing another few hundred dollars I need for food', which is not a thought ANY young person should have to face. The system needs immediate change for the future of our current society.
forrestthewoods 2 days ago 0 replies      
Central Asia and South-East Europe are lumped together? Ouch! That feels like a rather stinging critique of the European Union.
cousin_it 2 days ago 2 replies      
Solution: https://en.wikipedia.org/wiki/Works_Progress_Administration

It's been done. It worked. Do it again.

sprucely 2 days ago 0 replies      
Meanwhile the more menial / labor-intensive jobs are being replaced by automation. I read somewhere that automation was supposed to be the great liberator, enabling an ever-increasing amount of leisure time. But at some point attitudes started shifting so that people must justify their existence by continuously working hard; and if things don't happen to work out for them, well they just weren't motivated enough. This attitude is apparent in our [US] welfare system which has a huge administrative overhead in place to prevent freeloading.
DrNuke 2 days ago 0 replies      
The point is there will be fewer and fewer global-economy jobs (because of automation and the insane productivity it allows very few skilled people) and more, but not enough, local-community jobs (caring, agriculture, menial work, and so on). In a fair deal of the so-called first world, too many educated people are already reverting to local-community jobs, competing with the uneducated and migrants. This is not going to end peacefully if some sort of basic income is not introduced soon.
geff82 2 days ago 3 replies      
It is grim with the exception when you live in Germany or Switzerland.
dm03514 2 days ago 0 replies      
Less jobs more food. Grow food, at whatever scale available, pots in your room, pots on the balcony/porch, small gardens, side gardens, public spaces, large gardens.
peg_leg 2 days ago 0 replies      
Another idea: how are people today that do so-called 'work' contributing to the human race anyway? In my occupation I call 'work' my contribution is minimal. I help build software to make corporations more money. Almost a negative on the human race. My saving grace is that I make music in my spare time. That is my real contribution to the world.
sogen 2 days ago 1 reply      
Is this a plain, out-in-the-open "ideology injection into the brain" from above (the rich) to deter emigration to better countries?
tmaly 2 days ago 0 replies      
If we had space exploration capability like in Star Trek, we could think about a different approach. But we are constrained to Earth and we have limited resources. Capitalism is the best system available to allocate resources. What we have right now is not really Capitalism.
sudo-i 2 days ago 0 replies      
Hey, how valid is this information? I didn't quite grasp whether they counted in other factors, for example people in current jobs who will pass away, or data such as baby boomers getting older and creating markets in currently stagnant areas.
peg_leg 2 days ago 1 reply      
The young people of today are different. They are on the cusp of potentially something wonderful and strange for the human race. Older people don't recognize it. The values are different. Maybe the idea of 'work' will change to suit them.
Kinnard 2 days ago 0 replies      
Once we no longer need to work we can occupy ourselves with love, learning, passion and play!
tdsamardzhiev 2 days ago 0 replies      
Better think positive, guys - it's only going to get worse as you get older ;)
nathan_f77 2 days ago 0 replies      
Alternatively, let's rethink capitalism now that automation is taking over jobs. Maybe we all don't need to work so hard anymore.
moron4hire 2 days ago 0 replies      
I'm struck by the fact that, if 600M people were living together in one area, they'd spontaneously create jobs around the fact that 600M people have to figure out ways to interact with each other in a civil way.

I tend to believe that the reason we don't have enough jobs right now is market distortions that place unequal value on certain expensive things, like college educations, personal vehicles of far greater passenger capacity than strictly necessary, private dwellings of extremely large size, and the latest and greatest smartphones every two years. Our "betters" have successfully created a scenario where people willingly enter into debt slavery to acquire what they believe is their entitlement.

Because, sans weird pricing, there is real need for work to be done, that is not getting done, in our current environments. There are roads that are falling apart. There is food that is not getting to hungry people. There are children who are not learning what they need to learn to be successful. There are hydrocarbons that are continuing to be burnt. There are routine medical physical exams that aren't being performed.

There are things that people want, but can't acquire at a price that is reasonable. This could be a function of the constituent inputs being too expensive, but I doubt this. Arbitrage is a powerful force for innovation. I suspect there is a much stronger force at work that is preventing the goods and services that people need from being created: mega-corporate-backed government regulation.

There are people in 1st world countries who are going hungry, who don't have heat, who don't have doors on their house. I have seen this with my own damn eyes. Yes, they are poor, but is their poverty their fault? And even if, in some extremely twisted way, it is their fault, does it justify forcing them to live in squalor? Should the laziest of lazy people be forced to live in literal shit-holes?

As long as there are people willing to work but incapable of moving, I think a little "undeserved" compassion is a good enough reason to create a job. Just because some vanishingly few poor people are slovenly doesn't justify completely writing off the entire class.

oconnor663 2 days ago 0 replies      
The world has confronted this problem every decade since the beginning of time. Is there any reason to believe This Time Is Different?
dba7dba 2 days ago 3 replies      
We should be honest and talk not about creating more jobs BUT about having FEWER babies.
oneJob 2 days ago 0 replies      
...because we don't have enough stuff and we always have to be doing something? How about, work less, live more.
collyw 2 days ago 1 reply      
Or just redistribute the wealth more equally.
mygodtou 2 days ago 0 replies      
Lots of people have great ideas, but most governments stifle small business with excessive regulations and fees.
faragon 2 days ago 0 replies      
Let's produce more. Let's make a rich world for everyone :-)
peterwwillis 2 days ago 0 replies      
Job hack: open volunteer trade schools in impoverished urban areas, fund them with both government and private money, and give tax incentives to those who fund them or volunteer to work there. This could be anything from computer jobs to specialized manufacturing (Foxconn-esque).

Not only could this provide us with a 'cheap labor' manufacturing workforce that corporations love, tech jobs that could be done remotely would also be easy to train for, and thus our country's very limited transportation options wouldn't be such a barrier to getting work. Areas of high crime or gang violence could begin to get kids off the streets and into a stable job.

mrdrozdov 2 days ago 0 replies      
Sounds like a good time to create an education business. :)
mwhuang2 2 days ago 0 replies      
Extra schooling only delays reality and leads to more debt. What really matters is simple supply and demand - whether people have the skills that others are willing to pay for.
pinaceae 2 days ago 0 replies      
I don't fully understand this claim.

The job market has people coming in, but also, in parallel, people exiting out of it.

developed markets, especially in Europe and Japan, will see massive attrition due to people retiring or dying off. the baby boomer generation in the US is retiring as well. all those jobs need to be backfilled and all those old people will need services.

as the world population is stabilizing, it should not be that bad, no?

Mz 2 days ago 0 replies      
Or, we need 600m new small businesses, consultants, etc. The world can change, adapt.
AnimalMuppet 2 days ago 0 replies      
I recently saw an article (don't recall where, but I think it was based on a World Bank report) that indicated that the fraction of the population aged 18-65 had peaked in 2012. That was part of the problem - more people were of working age. But demographically, that's going to be less and less true as we move forward; perhaps that will soften the conclusions of this article.
greengarstudios 2 days ago 1 reply      
Start a startup and create your own job?
fredgrott 2 days ago 1 reply      
The reality is that, with $31 billion in mobile app sales and rising, those jobs will come from small businesses building mobile apps: every year there is a TByte of free data to organize into mobile services that our current programming languages cannot learn to organize on their own.

Yes, there will still be a net loss of jobs as tech progress eliminates them; the new job is the small business you set up.

theworstshill 2 days ago 0 replies      
As difficult as it would be to find money for it, I would propose a one-time entrepreneurship grant to all college graduates equal to an average yearly salary in their profession (this is an approximation; experts should figure out what variables to adjust for the best amount). That would allow several things to happen:

1. New graduates with a strong drive for entrepreneurship can start working on their ideas right away and do not have to spend several years working for corporations, picking up anti-patterns.

2. New graduates who are unable to find professional work can have a cushion while they search, and can potentially become lesser partners to people in the first category.

Jobs and careers are created by businesses, so the more small-medium size businesses there are, places that are still flexible in their mindset - the more work there will be.

YC Continuity ycombinator.com
243 points by dshankar  6 hours ago   59 comments top 13
BinaryIdiot 5 hours ago 6 replies      
So this announcement got me thinking. While this is slightly off topic and may seem like a dumb question, I'm going to ask it anyway because I'm curious: where does YC get the money to do everything it does (seed and later-stage investment, research, running Hacker News, employees, Startup School, etc.)?

I mean, I know roughly how it works from the standpoint of taking investment money and funneling it into companies, but unless I missed it, I have yet to see them "cash out" on any existing investments. So are they bringing in any money from those investments, or is it just more and more funding to do further investments that powers everything?

ChuckMcM 6 hours ago 4 replies      
While I think it's great to participate pro-rata in later rounds, it feels a lot like YC is backing into becoming yet another VC like Sequoia or any other firm in the valley, granted with their 'incubator arm'.

I'd be interested in hearing how being more like a regular VC firm helps YC be better at what it is currently.

not_that_noob 4 hours ago 0 replies      
Awesome! I'm a natural born cynic, but even I have to say that YC has reshaped the entrepreneurial financing ecosystem in the Valley. All these new funds falling all over themselves to be founder friendly, while the old legacy vultures slowly slip into irrelevance - this is the world that pg, sama and team have wrought. As an entrepreneur, thank you for existing.
rdl 6 hours ago 2 replies      
Curious if in 5 years YC figures out a way to OpenIPO YC companies, solving the private/public market impedance mismatch which seems to be happening now.
rdl 5 hours ago 3 replies      
I wonder how long until YC goes upstream of fellowship/startup school: consolidated remote/mini internships for high school students, or some other "get involved with YC companies early" kind of thing.

Especially likely to happen once the first cohort of YC founders has kids of age 12-18.

calgaryeng 4 hours ago 1 reply      
> Capital -- especially long-term capital willing to invest outside of the current trends -- is an important ingredient in that mission.

Does anyone else read this and take away "[investors/VCs/other equity funds who don't group-think and blindly follow each other based on FOMO] are an important ingredient in that mission."

mattty 4 hours ago 0 replies      
It will be interesting to see YC manage the tension of being founder-friendly at later stage companies. Statistically[0], the later stage the company, the likelier it is the founder isn't running the show.

Is founder-CEO succession inherently less likely at YC startups due to the YC selection process?

[0]: The Founder's Dilemmas, Wasserman, p.299

geofft 5 hours ago 0 replies      
Previous post on the pro rata thing seems to be https://blog.ycombinator.com/pro-rata -- AFAICT, the change there is that it's now <$300MM instead of <=$250MM.
jkurnia 2 hours ago 0 replies      
Does this affect YC nonprofits - will they be eligible for follow-on funding, as well?
jpeg_hero 3 hours ago 0 replies      
sama, of 1,000 co's how many have exited to date?

Exit as in: cash or public stock readily convertible to cash

Know there is a timing issue, but would be interested in just total actual exits?

samstave 3 hours ago 1 reply      
It would really be wonderful, if, when you announce some new person, you could at least provide a link to their linkedin or some other page telling me who the heck this person is.

Most of us are not that connected in SV that we know every name you do... so please provide context.


sova 3 hours ago 0 replies      
what does full partnership entail? -- thanks, newbie me
larrys 4 hours ago 0 replies      
These comments I've reproduced below make me wonder again at what point YC will have their hands in too many pots and risk straying from the original formula which has worked so well. There is of course a need to evolve, change, and add products. However, in a small organization you also have to be able to manage all of that without having existing or new employees (or management) choke on the added complexity. Sama only has so many hours in a day. Even if there are more partners, there is still glue that holds everything together. People at the top.

sama: "In the same way YC was able to make the early-stage ecosystem better for founders, we think we can do the same for the late-stage ecosystem."

rdl: "Curious if in 5 years YC figures out a way to OpenIPO YC companies, solving the private/public market impedance mismatch which seems to be happening now."

dshankar: "Instead, I wonder if YC can formalize & self-regulate a secondary market as an intermediate step before IPOs. "

rdl: "I wonder how long until YC goes upstream of fellowship/startup school: consolidated remote/mini internships for high school students, or some other "get involved with YC companies early" kind of thing."

Many things going on. No way of knowing all of this could work out fine. Just to me, from my observation of business over many years (not specific to startups) I say "DWR" [1]

[1] Danger Will Robinson.

Smartcrop.js content-aware image cropping in JavaScript github.com
241 points by rayshan  1 day ago   28 comments top 13
vortico 1 day ago 4 replies      
Really awesome, and the test cases look just as good as I could do.

I would warn web designers not to blindly apply this to everything, though. It scans all the pixels of an image, which can take upwards of 100ms per image, especially on mobile devices. A good use case would be a file upload box with a suggestion to crop the image upon upload.

zappo2938 1 day ago 1 reply      
Cropping images is a massive problem for social media. Here is a talk from 2013 by Christopher Chedeau, a front-end engineer at Facebook, describing some of the problems with their image layout algorithms.[1]

Initially, they tried to solve the problem by getting users to tag people inside the image, and then used the locations of the tags as parameters to crop. If someone is tagged in a photo, Facebook makes sure that person is always inside the cropped version.

Here is the write up from Christopher's blog.[2]

Was Instagram only using square images at one time? That would have been a brilliant way to have solved this problem.

1. http://blog.vjeux.com/2014/image/image-layout-algorithms-htm...

2. http://blog.vjeux.com/2012/image/best-cropping-position.html

vjeux 1 day ago 0 replies      
If you are interested in a similar approach by a Facebook developer: http://blog.vjeux.com/2012/image/best-cropping-position.html
emehrkay 1 day ago 2 replies      
Damn this is cool. It's kinda amazing that it is all done in a few hundred lines of code. I see there is a skinColor method and setting defined as

 skinColor: [0.78, 0.57, 0.44]
I was curious how it worked with darker skin (I admittedly don't understand what the numbers mean without further analysis), and it came out pretty well (it may default on lighter skin, I don't know)


images found on google
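For what it's worth, those three numbers are a reference skin tone expressed as a normalized red/green/blue direction. Comparing the direction of a pixel's color rather than its raw values makes the score largely brightness-invariant, which is probably why one reference vector handles both lighter and darker skin reasonably well. A rough Python sketch of that idea (my reconstruction of the general approach, not the library's exact code):

```python
import math

# Reference skin tone as a normalized RGB direction (smartcrop.js default values).
SKIN_COLOR = (0.78, 0.57, 0.44)

def skin_likeness(r, g, b, reference=SKIN_COLOR):
    """Score in [0, 1]: how closely the pixel's color *direction* matches the
    reference skin tone. Dividing by the pixel's magnitude discards most of
    the brightness information, so light and dark skin score similarly."""
    mag = math.sqrt(r * r + g * g + b * b)
    if mag == 0:
        return 0.0
    rd = r / mag - reference[0]
    gd = g / mag - reference[1]
    bd = b / mag - reference[2]
    distance = math.sqrt(rd * rd + gd * gd + bd * bd)
    return max(0.0, 1.0 - distance)

light = skin_likeness(230, 170, 130)  # light skin tone: scores high
dark = skin_likeness(120, 80, 55)     # darker skin tone: also scores high
blue = skin_likeness(50, 50, 200)     # pure blue: scores low
```

So it shouldn't "default on lighter skin" much: both sample skin tones above land close to the reference direction, while saturated non-skin colors fall well away from it.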

sam_goody 1 day ago 2 replies      
Two others in this space: http://thumbor.org and http://magickly.afeld.me

But neither seems to get as much love as they need, so always good to have more players.

RichWalton 1 day ago 0 replies      
Awesome work.

As it happens I'm working on an image cropping front end (using CropperJS [1]) - I'm going to integrate this so that the initial crop selection is set using the results from SmartCrop.

Thanks again.

[1] https://github.com/fengyuanchen/cropperjs

rateofclimb 1 day ago 0 replies      
Very cool. The Ken Burns effect application of the algorithm is particularly impressive.
IgorPartola 1 day ago 1 reply      
I've actually been looking for something related. I need to quickly classify whether an image contains a face and also whether it contains any text. The former seems to be relatively straightforward, but I haven't found anything for detecting text, only OCR'ing it, which I don't need. Anyone seen anything like this?
sirtastic 1 day ago 0 replies      
Worked well with the pictures I dropped in, very nice.

Wish someone would make a solid angular dropzone+cropper.

zachrose 22 hours ago 0 replies      
How long until this has native support in CSS?
dheera 1 day ago 0 replies      
If Imgix could implement this server-side, that could be awesome.
jessedhillon 1 day ago 0 replies      
This is called "salient region detection" and some current approaches (there are many) include detecting the contrast between each pixel and the global or regional average color or luminosity. Areas of high contrast are likely to be regions which are considered interesting. Once you have those regions, you would have to have a separate algorithm which maximizes the placement of a rectangle (the crop) to get the greatest coverage of "interestingness".

You could also combine this with face detection, so that a picture of someone in a bikini doesn't end up cropping just to their midsection, since going by surface area, the torso could have more high-contrast pixels than the face.

Here's one approach which has open source C++ code: http://mmcheng.net/effisalobj/
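The two-step recipe described above (score per-pixel contrast, then place a rectangle maximizing coverage of "interestingness") can be sketched in a few lines. This is a deliberately naive global-contrast version for illustration, not the linked paper's method:

```python
def saliency_map(gray):
    """Per-pixel contrast against the global mean intensity: pixels far from
    the average are 'interesting'. Real detectors also use regional contrast,
    color, and edges, but the global version shows the idea."""
    flat = [p for row in gray for p in row]
    mean = sum(flat) / len(flat)
    return [[abs(p - mean) for p in row] for row in gray]

def best_crop(saliency, cw, ch):
    """Slide a cw x ch window and return the top-left (x, y) whose window
    covers the most total saliency -- the coverage-maximization step."""
    h, w = len(saliency), len(saliency[0])
    best, best_xy = -1.0, (0, 0)
    for y in range(h - ch + 1):
        for x in range(w - cw + 1):
            total = sum(sum(row[x:x + cw]) for row in saliency[y:y + ch])
            if total > best:
                best, best_xy = total, (x, y)
    return best_xy

# Demo: a dark 6x6 image with one bright 2x2 patch in the lower-right corner.
gray = [[0] * 6 for _ in range(6)]
for yy in (4, 5):
    for xx in (4, 5):
        gray[yy][xx] = 255
crop_x, crop_y = best_crop(saliency_map(gray), 3, 3)  # -> (3, 3): covers the patch
```

The face-detection weighting the comment suggests would simply add a second score map (high inside detected face boxes) into the sum before the window search.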

donmb 1 day ago 0 replies      
Exactly what I was looking for. Will give it a try! Thank you.
Auto-Generating Clickbait with Recurrent Neural Networks larseidnes.com
242 points by lars  2 days ago   64 comments top 24
thenomad 2 days ago 2 replies      
If I could feed this an article and have it generate headlines based on the text of that article (and they were any good), there is a solid chance I would pay real money for that service.

Headlines are an absolute pain, and as the article says, they're decidedly unoriginal most of the time. I can't see an obvious reason that an AI would be much worse at creating them than a human.

blisterpeanuts 2 days ago 4 replies      
I like the notion of swamping the Internet with fake click-bait headlines, to dilute the attractiveness of this (to me, odious) form.

Give me sincere, honest news and discussion, or else shut up.

Unfortunately, someone out there must really have a craving for "weird old tricks" and "shocking conclusions".

It's a sort of race-to-the-bottom, least common denominator effect.

Maybe someone will write a browser extension that filters out obvious click-bait headlines. Now that would be clever!

rndn 2 days ago 0 replies      
Could this RNN model perhaps be used to filter clickbait headlines from HN automatically? Perhaps one could perform some sort of backward beam search to figure out how likely it is that a particular headline would've been produced by it. If there are words in a headline that the model doesn't know, one could perhaps just replace them with ones it knows.
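Something like this is feasible without beam search: score each headline by its average per-word log-probability under a language model trained on clickbait, and flag high scorers. A toy sketch with a smoothed bigram model standing in for the RNN (the training headlines and all names here are invented):

```python
import math
from collections import Counter

def train_bigram(headlines):
    """Count bigrams and left-context unigrams over a small corpus."""
    bigrams, unigrams, vocab = Counter(), Counter(), set()
    for h in headlines:
        words = ["<s>"] + h.lower().split() + ["</s>"]
        vocab.update(words)
        for a, b in zip(words, words[1:]):
            bigrams[(a, b)] += 1
            unigrams[a] += 1
    return bigrams, unigrams, vocab

def avg_logprob(headline, bigrams, unigrams, vocab):
    """Average per-word log-probability with add-one smoothing; higher means
    more 'clickbait-like'. Unknown words map to a shared <unk> token."""
    words = ["<s>"] + [w if w in vocab else "<unk>"
                       for w in headline.lower().split()] + ["</s>"]
    v = len(vocab) + 1  # smoothing denominator: vocabulary plus <unk>
    total = 0.0
    for a, b in zip(words, words[1:]):
        total += math.log((bigrams[(a, b)] + 1) / (unigrams[a] + v))
    return total / (len(words) - 1)

clickbaity = [
    "you won't believe what happens next",
    "you won't believe this one trick",
    "this one trick will change your life",
]
bigrams, unigrams, vocab = train_bigram(clickbaity)
score_bait = avg_logprob("you won't believe this trick", bigrams, unigrams, vocab)
score_plain = avg_logprob("emacs maintainer steps down", bigrams, unigrams, vocab)
```

A threshold on the score would then separate the two populations; with the trained RNN you would get the same number by summing the softmax log-probabilities it assigns to each word of the headline.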
oneJob 2 days ago 0 replies      
Now if we can just teach AI to get sidetracked reading all this content we'd also prevent Judgement Day.

SkyNet: (speaking to self?) "Unleash hell on humans. Launch all missiles."

SkyNet: (responding to self?) "Not now, not now. Let me finish this article on John Stamos's belly button."

clickok 2 days ago 0 replies      
Nice! I've wanted to do something like this for a while, too, but haven't had the time yet.

What's interesting to me, from a research point of view, is the degree of nuance the network uncovers for the clickbait. We all know that <person> is going to be doing <intriguing action>, but for each person these actions are slightly different. The sentence completions for "Barack Obama Says..." are mainly politics-related, while "Kim Kardashian Says..." involve Kim commenting on herself.

So it might not really understand what it's saying, but it captures the fact those two people will tend to produce different headlines.

Neat idea: what if we tried the same thing with headlines from the New York Times (or maybe a basket of newspapers)? We would likely find that the clickbait RNN's vision of Obama is a lot different from the newspaper RNN's Obama. Teasing apart the differences would likely give you a lot more insight into how the two readerships view the president than any number of polls would.

ChuckMcM 2 days ago 1 reply      

I really find RNNs to be pretty cool. When they are combined with a natural human tendency to see patterns they are hilarious. So perhaps we need to update our million monkeys hypothesis to a million RNNs with typewriters coming up with all the works of Shakespeare.

mikkom 2 days ago 2 replies      
What surprises me most is that the headlines seem not to be much better than your average Markov chain output.
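That resemblance is less surprising once you see how little machinery a word-level Markov chain needs; a minimal sketch (the training headlines are invented, and a real generator would train on thousands):

```python
import random
from collections import defaultdict

def build_chain(headlines):
    """Map each word to the list of words that followed it in the training
    data; sampling from the list reproduces the observed frequencies."""
    chain = defaultdict(list)
    for h in headlines:
        words = ["<s>"] + h.split() + ["</s>"]
        for a, b in zip(words, words[1:]):
            chain[a].append(b)
    return chain

def generate(chain, max_words=20, rng=random):
    """Walk the chain from <s> until the end token (or a length cap)."""
    word, out = "<s>", []
    while len(out) < max_words:
        word = rng.choice(chain[word])
        if word == "</s>":
            break
        out.append(word)
    return " ".join(out)

headlines = ["you wont believe this trick", "this trick will shock you"]
chain = build_chain(headlines)
random.seed(1)
headline = generate(chain)
```

The RNN's advantage over this is longer-range coherence (grammatical agreement across the whole headline), which is exactly where the article's samples look better than Markov output, even if the word-to-word texture is similar.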
rlu 2 days ago 2 replies      
> The training converges after a few days of number crunching on a GTX980 GPU. Let's take a look at the results.

Stupid question: why is the GPU important here? I would have thought this was more of a CPU task..??

(then again, as I typed this I remembered that bitcoin mining is supposed to be GPU-intensive, so I'm guessing the "why" for that is the same as this)
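It is the same "why": RNN training time is dominated by large dense matrix multiplications, repeated at every time step for every example in a batch, and GPUs have thousands of cores built for exactly that workload. A rough numpy sketch of a single LSTM time step (all sizes invented for illustration) shows where the work concentrates:

```python
import numpy as np

hidden, embed, batch = 256, 128, 32
rng = np.random.default_rng(0)

x_t = rng.standard_normal((batch, embed))      # current input words (as vectors)
h_prev = rng.standard_normal((batch, hidden))  # previous hidden state
c_prev = rng.standard_normal((batch, hidden))  # previous cell state

W = rng.standard_normal((embed, 4 * hidden)) * 0.01   # input-to-gates weights
U = rng.standard_normal((hidden, 4 * hidden)) * 0.01  # hidden-to-gates weights
b = np.zeros(4 * hidden)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The expensive part: two dense matrix multiplies per step, repeated for
# every time step of every example (and again in the backward pass).
gates = x_t @ W + h_prev @ U + b
i, f, g, o = np.split(gates, 4, axis=1)  # input, forget, cell, output gates
c_t = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
h_t = sigmoid(o) * np.tanh(c_t)
```

A CPU runs those multiplies a handful of lanes at a time; a GPU runs thousands of the multiply-accumulates in parallel, which is the difference between days and weeks of training.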

flashman 1 day ago 0 replies      
I used a simpler technique (character level language modelling) to come up with an Australian real estate listing generator: http://electronsoup.net/realtybot

This is pre-generated, not live, for performance reasons. There are a few hundred thousand items though, so the effect is similar.

The data source is several tens of thousands of real estate listings that I scraped and parsed.

juddlyon 2 days ago 1 reply      
I can't stop laughing at these. Check out the Click-o-tron site: http://clickotron.com/
OhHeyItsE 2 days ago 0 replies      
This is simply brilliant.

(Ranking algorithm baked into a stored procedure notwithstanding. [ducks])

neikos 2 days ago 1 reply      
I am not sure how much I would give credit to the idea that the neural network 'gets' anything as it is written in the article.

> Yet, the network knows that the Romney Camp criticizing the president is a plausible headline.

I am pretty certain that the network does not know any of this and instead just happens to be understood by us as making sense.

andrewtbham 2 days ago 1 reply      
tldr; guy uses rnn lstm to create link bait site.

hopes crowd sourcing will filter out non-sense.


chipgap98 2 days ago 0 replies      
"Tips From Two And A Half Men: Getting Real" is great. Some of the generated titles are incredible
billconan 1 day ago 0 replies      
I can't understand the first two RNN layers, which according to the author optimize the word vectors.

it says:

During training, we can follow the gradient down into these word vectors and fine-tune the vector representations specifically for the task of generating clickbait, thus further improving the generalization accuracy of the complete model.

how do you follow the gradient down into these word vectors?

if word vectors are the input of the network, don't we only train the weights of the network? how come the input vectors get optimized during the process?
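The resolution to the puzzle is that the word vectors are not fixed inputs: they are rows of an embedding matrix, which is itself a trainable weight. The real "input" is just a word index; the lookup selects a row, and backpropagation sends a gradient to exactly that row via the chain rule. A minimal pure-Python sketch (toy model and numbers of my own invention; the rest of the network is frozen here for simplicity, whereas in a real model its weights update in the same backward pass):

```python
import random

random.seed(0)
dim = 3
# Embedding table: one trainable vector per vocabulary word. It is a
# parameter of the model, just like any weight matrix.
E = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(5)]
w = [random.uniform(-1, 1) for _ in range(dim)]  # stand-in for the network

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

word_id, target, lr = 2, 1.0, 0.05
untouched_before = E[0][:]  # a row we never look up
initial_error = (dot(E[word_id], w) - target) ** 2

for _ in range(500):
    v = E[word_id]           # "input" to the network = lookup of row word_id
    y = dot(v, w)            # forward pass
    dy = 2.0 * (y - target)  # d(squared error)/dy
    # Chain rule through the lookup: only the row that was used gets a gradient.
    for j in range(dim):
        E[word_id][j] -= lr * dy * w[j]

final_error = (dot(E[word_id], w) - target) ** 2
```

Only `E[2]` moves; rows for words that never appear in a batch receive zero gradient and stay put. That is all "following the gradient down into the word vectors" means.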

indiv0 1 day ago 0 replies      
Reminds me of Headline Smasher [0].

Some pretty fun ones there but it doesn't use RNNs. It just merges existing headlines.

[0]: http://www.headlinesmasher.com/best/all

alkonaut 2 days ago 0 replies      
Missed opportunity for HN headline.

This program generates random clickbait headlines. You won't believe what happens next. You'll love #7.

kidgorgeous 2 days ago 0 replies      
Great tutorial. Been looking to do something like this for a while. Bookmarked!
smpetrey 2 days ago 1 reply      
I think this one is my favorite:

Life Is About Or Still Didn't Know Me

CephalopodMD 2 days ago 1 reply      
Your main site is down. Bottle can't handle serving files scalably or something? Point is, it broke.
hilti 2 days ago 0 replies      
Interesting blog post, but the site is down. How much traffic do you get from HN?
joshdance 2 days ago 1 reply      
500 Internal Server Error on the site where you could upvote 'em.
imaginenore 2 days ago 1 reply      
Getting this error:

 Error: 500 Internal Server Error
 Sorry, the requested URL 'http://clickotron.com/' caused an error: Internal Server Error
 Exception: IOError(24, 'Too many open files')
 Traceback (most recent call last):
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 862, in _handle
     return route.call(**args)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 1732, in wrapper
     rv = callback(*a, **ka)
   File "server.py", line 69, in index
     return template('index', left_articles=left_articles, right_articles=right_articles)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3595, in template
     return TEMPLATES[tplid].render(kwargs)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3399, in render
     self.execute(stdout, env)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3386, in execute
     eval(self.co, env)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 189, in __get__
     value = obj.__dict__[self.func.__name__] = self.func(obj)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3344, in co
     return compile(self.code, self.filename or '<string>', 'exec')
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 189, in __get__
     value = obj.__dict__[self.func.__name__] = self.func(obj)
   File "/usr/local/lib/python2.7/dist-packages/bottle.py", line 3350, in code
     with open(self.filename, 'rb') as f:
 IOError: [Errno 24] Too many open files: '/home/ubuntu/clickotron/views/index.tpl'

VLM 2 days ago 0 replies      
This was an enjoyable article. There is an obvious extension which is to mturk the results and feed the mturk data back into the net. Just give the turkers 5 headlines and ask them which they would click first, repeat a hundred times per a thousand turkers or whatever.

Years ago I considered applying for DoD grant money to implement something reminiscent of all this for military propaganda. That went approximately nowhere, not even past the first steps. Someone else should try this (insert obvious famous news network joke here, although I was serious about the proposal). To save time I'll point out I never got beyond the earliest steps because there is a vaguely infinite pool of clickbaitable English speakers on the turk, but the pool of bilingual Arabic (or whatever) speakers with good taste in pro-usa propaganda is extremely small, so the tech side was easy to scale but the mandatory human side simply couldn't scale enough to make the output realistically anything but a joke.

Appeals court hits largest public patent troll with $1.4M fee arstechnica.com
186 points by solveforall  3 days ago   48 comments top 7
kelukelugames 3 days ago 4 replies      
I was hoping for Intellectual Ventures.
ww520 3 days ago 3 replies      
$1,025 per hour for partners, $750 for associates, and $310 for paralegals.

Those are kind of high. Did they actually charge those rates? Or retroactively bumped up the rates once they knew the case was dismissed?

xacaxulu 3 days ago 0 replies      
The title alone makes me all "Yissssssssssssss".
shmerl 2 days ago 0 replies      
Good, but I was hoping for Intellectual Vultures to be busted.
ccvannorman 2 days ago 0 replies      
A good start.
curiousjorge 3 days ago 4 replies      
>$1,025-an-hour partners.

TIL some people make literally 100 times what the average person makes.

winter_blue 2 days ago 2 replies      
$1.4 million seems small in comparison to the billions[1] that Microsoft was fined by the EU for bundling certain software with Windows.

Acacia did something that would be considered illegal and exploitative in almost every jurisdiction, whereas bundling a browser straddles a legal gray area. Apple does it on OS X and iOS, Google does it on Android and Chromebooks, and nearly every Linux distro does it. On iOS, you can't even use a browser engine other than Safari's WebKit. And none of these companies have gotten into trouble.

It just seems unfair to me that when a company does something slightly unfairly competitive (like Microsoft) they get hit with huge fines, but when a company like Acacia does something that's outright evil and illegal, the fines are a joke.

I do think Microsoft should be paying even bigger fines for patent-trolling Android manufacturers with false patent claims. And a judicial decision or executive act ordering Apple to allow users to install their own software on iOS, and removing the ban on interpreters/browser engines/etc on the App Store would be appropriate.

[1] $794 million in 2003, $449 million in 2006, $1.44 billion in 2008, and $765 million (€561 million) in 2013 -- a total of over $3.4 billion, all for bundling standard software with Windows that all other OSes also bundle. And this money paid in fines to the European Court goes back into the EU budget. (TBH, this smells strangely like a revenue-generating move by the Commission.) See: https://en.wikipedia.org/wiki/Microsoft_Corp_v_Commission

Federal Whistleblower Investigator Fired After Blowing the Whistle on Own Agency nbcbayarea.com
189 points by wcbeard10  10 hours ago   18 comments top 5
PaulAJ 8 hours ago 1 reply      
And meanwhile the government is saying that Edward Snowden should have used official whistleblowing avenues instead of going public.
kbenson 9 hours ago 1 reply      
Is this just re-reporting a past event? They even include a clip of Jon Stewart on The Daily Show talking about this specific event in the past. Is there any new information here, or is this just blogspam in the form of an actual TV news segment from a news channel?
Animats 6 hours ago 0 replies      
Some managers at OSHA need to be fired. Now. Names need to be named by the press.

We need much tougher labor law enforcement.

logfromblammo 9 hours ago 4 replies      
So the agency responsible for investigating claims regarding workplace health and safety matters, under the OSHA organizational umbrella, fired an employee for making claims regarding unethical work policies, which are now under investigation by a different agency, the Office of Special Counsel.

The umpire has reviewed the play: it is not irony. I repeat: it is not irony.

kelvin0 8 hours ago 0 replies      
This is so meta ...
Emacs maintainer steps down gnu.org
241 points by zeveb  2 days ago   71 comments top 10
nanny 2 days ago 4 replies      
Note: this happened almost a month ago now.

See these threads for the discussion on the new head maintainer:



jwr 2 days ago 1 reply      
Stefan's stewardship resulted in a much-improved Emacs. He did a very good job.
davidw 2 days ago 6 replies      
I've been using Emacs for 20 years, I realized. If you think about all the different things that come and go so quickly in this field, that's a pretty amazing run.

Thanks Stefan!

unknownzero 2 days ago 1 reply      
I would encourage anyone who clicks the link to read through the thread. Pretty heartwarming to see the goodbyes, a definite mid-day boost :)
laurentoget 2 days ago 1 reply      
Good to know there are people stepping down off of open source project leadership roles without throwing a tantrum!
rurban 15 hours ago 0 replies      
All this over rtags[0] not in melpa/core? Typical RMS drama, but I see his point, and he reacted quite defensively. Not as aggressive anti-clang/apple as before.

0: https://github.com/Andersbakken/rtags

seigel 2 days ago 1 reply      
Heading over to vi? :)
ilaksh 2 days ago 0 replies      
I remember in my C++ class around 1997 the professor was saying emacs was more a way of life or operating system than just an editor. He was only half-kidding.

In the past 18 years I imagine the functionality may be even more comprehensive, if that is possible?

melling 2 days ago 0 replies      
I guess this will impact the Emacs 25 release?
JeremyBanks 2 days ago 0 replies      
Is your username intended to describe the idea you're suggesting?
If You're Not Paranoid, You're Crazy theatlantic.com
207 points by ForHackernews  2 days ago   123 comments top 19
GigabyteCoin 2 days ago 6 replies      
>I'd driven to meet a friend at an art gallery in Hollywood, my first visit to a gallery in years. The next morning, in my inbox, several spam e-mails urged me to invest in art. That was an easy one to figure out: I'd typed the name of the gallery into Google Maps.

I don't see how the author makes the connection here.

How does searching for an art gallery on Google Maps translate into spam emails? Is he accusing Google of selling your email address and search information to spammers?

thaumaturgy 2 days ago 2 replies      
Paranoia is a specific thing. It requires irrational, unjustifiable fears and a sense of blame or persecution. You're not "paranoid" if you've changed your behavior in the wake of the Snowden documents or if you're cautious regarding the amount of information you share with third party services and devices.

I'm not being pedantic, I've seen a lot of arguments recently from actual paranoid conspiracy theorists that feel smug in the wake of Snowden. I'd hate to see people start to confuse real paranoia with informed caution.

radiorental 2 days ago 3 replies      
Somewhat ironic and paradoxical http://imgur.com/Z2BIMcw
mfoy_ 2 days ago 2 replies      
Think of how many apps you've installed which request permission to a whole laundry list of phone functions.

"Oh, it's reasonable that this app wants access to my text messages, that way when it sends me a confirmation code it can automatically read it."

"Oh, it's reasonable that this app wants access to my mic, maybe it will implement voice chat in a coming update."

"Oh, it's reasonable that this app wants access to my call history and whatnot, that way it can mute itself or pause itself when I get a phone call."

... oh, I guess it's reasonable that if I text, or talk with my phone nearby, about walnuts I'll start seeing targeted ads for walnuts.

TeMPOraL 1 day ago 1 reply      
> Had merely typing "seduction" into a search engine marked me as a rascal? Or was the formula more sophisticated? Could it be that my online choices in recent weeks (the travel guide to Berlin that I'd perused, the Porsche convertible I'd priced, the old girlfriend to whom I'd sent a virtual birthday card) indicated longings and frustrations that I was too deep in denial to acknowledge?

While a lot of those examples are true instances of tracking and inference, in some cases I think the author is imagining things. People have a scary capability to see patterns and intelligent agents where none exist. It's incredibly easy to cause this.

I'm running a simple IRC bot that "pretends to be human" by means of hand-tailored regular expressions matching input and some witty responses. I can't count the times I've tricked people into believing they were talking to a human. It's like, write out some simple regexes and you're 90% of the way to passing a Turing test. People prime and then fool themselves.

So yeah, I'm betting those results in the part I quoted were caused just by "seduction techniques" search. And if he clicked on that Ashley Madison banner, he basically sealed his fate.

cubano 2 days ago 1 reply      
Reminds me of a poster I saw in a dude's house way, way back in my stoner days...

"I know I'm paranoid, but am I paranoid enough?"

hyperion2010 2 days ago 1 reply      
"If you give me six lines written by the hand of the most honest of men, I will find something in them with which to hang him." --Cardinal Richelieu
xkiwi 2 days ago 2 replies      
It is scary for me to know:

#The majority of people post photos of friends on Facebook without understanding that facial recognition always scans them.

#Prefer convenience over privacy, such as toll tags on cars.

#Follow trends.

cardamomo 1 day ago 0 replies      
I really enjoyed reading this piece, not only for its discussion of privacy, but also for its poetic and reflective language. There's something more than technical about today's surveillance problem, and the author approaches this issue from a philosophical and, at times, almost spiritual angle.

These are the kinds of discussions we need to have more often: not only what's going on and what it means in practical terms, but also how today's surveillance explosion changes who we are and how we relate to ourselves.

mattmanser 2 days ago 1 reply      
If you are a sci-fi fan, an interesting new trilogy to read is the Imperial Radch trilogy by Ann Leckie, starting with Ancillary Justice.

One of the interesting themes is that everyone is surveilled totally and intimately, down to even their feelings. But it's not the point of the book: the protagonist treats it as totally normal, it's never really discussed, and there's no suggestion that things would be better if it weren't the case.

I really enjoyed the trilogy but as someone who's pro-privacy it was a strange read.

msutherl 2 days ago 6 replies      
I still don't quite comprehend why people feel personally bothered by such things. Yes, it is better for society to have safeguards in place to prevent certain kinds of surveillance, as a check on governmental and corporate power, and we need to fight for this. But some data about you stored on some servers, a targeted advertisement? What exactly is the immediate personal threat?
euske 2 days ago 0 replies      
Paranoid or not, we should develop a healthy skepticism about this in society. A scary thing to me is that most people don't know the true capability of linking and correlating information from multiple sources. It's not intuitive that you can take seemingly innocuous data and combine it to magically infer someone's behavior. These kinds of threats should be systematically studied and made consciously known to the public.
zbyte64 2 days ago 2 replies      
The part about voluntarily giving up confessions reminds me of something similar during the Vietnam war. People would be grouped together and would need to "confess" to the allures of capitalism. You could only graduate from the program once you produced enough drama to guarantee you were a comrade.
Simulacra 2 days ago 2 replies      
Paranoia can be healthy sometimes
shogun21 2 days ago 0 replies      
People are afraid of what they don't understand. If you just thought of the cloud as a server somewhere instead of a mysterious "ghostly entity", you'd know it's really not as smart as you think it is.
geggam 2 days ago 0 replies      
Amazes me that anyone familiar with data online thinks there aren't ways to track and store everyone's internet usage.
em3rgent0rdr 1 day ago 0 replies      
20+ trackers blocked by Privacy Badger while reading this article.
graycat 2 days ago 0 replies      
In your Web browser, be careful what cookies you are willing to accept.
powera 2 days ago 1 reply      
Actually, this guy is paranoid and crazy. It doesn't mean he isn't right, but by any definition he is both paranoid and crazy.
Convicted by Code: Defendants should be able to inspect code used in forensics slate.com
168 points by Figs  2 days ago   38 comments top 9
donkeyd 2 days ago 0 replies      
I once nearly lost a contest because of a faulty SQL query on their side. If I hadn't gotten to see the query, I wouldn't have been able to defend my entry and would've lost. Losing this contest would've been trivial, but if the same thing happened in a trial, it would be horrible.

The error was that a 'group by' was used to find the number of unique entries, even though some entries had leading spaces that were part of their uniqueness. Their GROUP BY didn't take the leading spaces into account, leading them to get a different result than I did. I think this could've tripped up a lot of people, even forensic IT engineers.
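As a small illustration (using SQLite, whose default BINARY collation compares strings byte-for-byte; the contest's engine evidently behaved differently), whitespace can silently change a GROUP BY uniqueness count depending on collation, and trimming up front removes the ambiguity either way:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (name TEXT)")
conn.executemany("INSERT INTO entries VALUES (?)",
                 [("alice",), (" alice",), ("bob",)])

# SQLite keeps ' alice' and 'alice' distinct, so GROUP BY sees three
# unique names here; an engine or collation that pads or strips spaces
# would report two -- exactly the mismatch the commenter describes.
raw = conn.execute(
    "SELECT COUNT(*) FROM (SELECT name FROM entries GROUP BY name)"
).fetchone()[0]

# Normalizing explicitly makes the count collation-independent.
trimmed = conn.execute(
    "SELECT COUNT(*) FROM (SELECT TRIM(name) AS n FROM entries GROUP BY n)"
).fetchone()[0]

print(raw, trimmed)  # 3 2 under SQLite
```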

6stringmerc 2 days ago 2 replies      
Okay, so this article bumps up against the hysteria I'd categorize as "semi-technology-literate," yet it makes some good points. It's almost like talking about how dangerous it is to walk through a minefield and then stepping on one. There's a valid point in there somewhere.

Copyright reform is one of my favorite subjects, for a multitude of reasons. Should the prosecution be able to drop a case outright, with no recourse, because the "Stingray" gathering tool is too lovely to submit to review? Nope. Should breathalyzer code be withheld from review just because it's a product made by somebody? Nope. Should FOIA be stonewalled or pay-walled, inhibiting the Constitutional freedom of the press? Nope!

Innocent until proven guilty is a very, very important premise of the US legal system, backed up by both the Fourth and First Amendments to the Constitution. Any justification for putting them aside for a "War on ____" might seem reasonable on the surface, until you take a closer look: evidence of crimes, murder included, comes from within the borders far more often than from the laptop of a citizen who happens to be coming back from a foreign country and gets worked for passwords under duress, or has to forfeit hardware without recourse.

I dunno, maybe I sound like some kind of off-his-rocker dude for thinking about such things, but I love my country and I'm willing to sit down and think about this kind of stuff. It doesn't have to be extreme. Taking the small steps of talking with one another about what we really value is important, in my opinion.

triggercut 2 days ago 2 replies      
There are similar issues with this in structural and mechanical engineering. Engineers are expected to rely more and more on software to execute and document the complex calculations that verify designs, but how can you be sure those underlying calculations/theorems/models are correctly implemented? Some packages are constantly patching particular edge cases reported by their users. Many issue announcements warning of bugs that could cause incorrect results.

If a result from software led to a critical failure in a design, the onus is most likely still on the Engineer.

I have seen cases where software is formally reviewed by independent verification bodies, much in the same way your ISO 9001 compliance is. I can't see why this wouldn't apply here: have an independent party, who has signed an appropriate NDA, assess and certify that your product does what it says on the tin, and audit it at regular intervals.

finance-geek 2 days ago 1 reply      
I think things will become even worse now that criminal "scouting" and even vetting are being done via learning models. So you may not even find hard filters or conditionals... instead, the errors (or stereotypes?) would be embedded deep inside some neural net. I'm not even sure how one would explain that to a jury.
downandout 2 days ago 3 replies      
This defense attorney was creative in asking to examine the source code, but that isn't the only way to cast doubt on the accuracy of the software that DNA-matched his client to the crime scene. He could simply obtain a copy of it and have an expert run tests to determine its false positive rate, and what types of scenarios cause it to deliver false positives, then call that expert as a witness.
TazeTSchnitzel 2 days ago 0 replies      
The essential problem is that in such environments the process of doing a task must be open to inspection, but software acts as a loophole that circumvents making the process public.
jhwhite 2 days ago 0 replies      
This used to be a problem in Florida with drunk-driving arrests. The company that makes the code for the breathalyzers wouldn't allow it to be reviewed by defendants. A precedent was finally set that defendants couldn't mount a viable defense without reviewing the code.

So for a while people accused of a DUI could wind up getting off, under the right circumstances, by requesting the source code then getting refused by the company.

The company finally allowed pieces of the code to be reviewed by the courts.

cm2187 2 days ago 3 replies      
I'm not sure I agree with that view. Independent testing by another lab should remove any doubt about the validity of a forensic result, rather than forcing companies to open-source their technology. And of course some form of certification/random testing should ensure that the company providing the forensics isn't a bunch of conmen.
joesmo 2 days ago 0 replies      
This is what happens when you have companies profiting off the misery of others.

The biggest reason companies want to protect their source code in this case is that they already know their software is broken, like pretty much all other software, and they don't want to fix it. The arguments about losing money and such are total bullshit, as courts have plenty of procedures for disclosing materials only to the relevant parties, not to the public as a whole. These companies simply don't want to spend the money auditing their code and making sure it runs correctly, because the only consequence of not doing so is wrongfully convicting someone they don't give a fuck about.

I'd say, let them see the code and let the highest paid expert witness win. That is, after all, the American way.

Whats new in HAProxy 1.6 haproxy.com
191 points by oldmantaiter  1 day ago   39 comments top 10
radoslawc 1 day ago 2 replies      
The external check pleases me greatly, but sending emails seems like overkill to me; there are well-established ways to do this in a unified manner (log parsers, SNMP traps, etc.). Halfway through to fulfilling Zawinski's Law.
pentium10 1 day ago 1 reply      
Cool, now we can use Device Identification feature to route mobile users to a different backend, also love the HTTP/2 connection sharing.
wheaties 1 day ago 2 replies      
Whenever I see a long list of features like this, especially something major like Lua integration, I always wonder: what was given up in the process of adding them? Normally when there's a performance bump, they show the numbers. In this case there are no numbers. Was there a performance hit? Was it negligible?
nailer 1 day ago 2 replies      
As an HAProxy user, support for logging to stdout (and hence journald) would be great. Currently HAProxy users on the major Linux distros either have to use it in debug mode or have a second log server just for the purposes of running HAProxy.

Otherwise I love HAProxy!

ris 1 day ago 1 reply      
Something I've always wanted to do, but which from what I can see is impossible in Apache, is to simply limit the number of connections a single IP can have open at once.

Is this possible with HAProxy? If it is, the documentation doesn't make it clear how.
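(It is possible with stick tables, though the docs do scatter the pieces. A hedged sketch, roughly per the HAProxy 1.5/1.6 documentation; the names and the limit of 20 are arbitrary:)

```
frontend ft_web
    bind :80
    # One entry per source IP, storing its current connection count.
    stick-table type ip size 100k expire 30s store conn_cur
    # Track each incoming connection against its source IP...
    tcp-request connection track-sc0 src
    # ...and reject it if that IP already has more than 20 open.
    tcp-request connection reject if { sc0_conn_cur gt 20 }
    default_backend bk_web
```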

binaryanomaly 1 day ago 1 reply      
But no HTTP/2, so it won't get in front of my nginx instances yet ;)

dexcs 1 day ago 2 replies      
Nice. It supports Lua, and mail alerts on server state changes, now...
ausjke 1 day ago 2 replies      
I was comparing HAProxy to Squid a while ago and couldn't figure out HAProxy's advantage over Squid. I ended up using Squid, but I'm still very interested in HAProxy and would like to learn more about it.

Squid remains the only one that can deal with SSL proxying (yes, it's kind of MITM, but it's needed sometimes), and it's also the real "pure" open source. HAProxy might be a better fit for enterprises that need support?

ErikRogneby 1 day ago 0 replies      
Lots of goodness!
wpblogheader 1 day ago 0 replies      
Supports Lua? SWEET!
       cached 16 October 2015 02:11:03 GMT