Hacker News with inline top comments (Best, 19 Feb 2016)
A Message to Our Customers apple.com
5754 points by epaga  1 day ago   955 comments top 200
epaga 1 day ago 23 replies      
Huge props to Apple - here's hoping against hope that Google, Facebook, and Amazon get behind this.

One thing I was wondering is how Apple is even able to create a backdoor. It is explained toward the end:

"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by brute force, trying thousands or millions of combinations with the speed of a modern computer."

This is actually quite reassuring: it means that even Apple can't break into an iPhone protected by a secure passphrase (10+ characters) with Touch ID disabled (Touch ID itself can be defeated with a bit of effort by lifting your fingerprint).
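A rough keyspace comparison backs this up. This is my own back-of-the-envelope sketch, not anything from Apple's letter; the 95-character alphabet is just an assumption for printable ASCII:

```python
import math

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passcodes of the given length."""
    return alphabet_size ** length

pin_space = keyspace(10, 4)          # 4-digit numeric PIN
passphrase_space = keyspace(95, 10)  # 10 chars drawn from ~95 printable ASCII

print(f"4-digit PIN:        {pin_space:,} combinations")
print(f"10-char passphrase: {passphrase_space:.3e} combinations")
print(f"ratio: about 10^{round(math.log10(passphrase_space / pin_space))}")
```

The passphrase keyspace is roughly sixteen orders of magnitude larger, which is why a long passphrase with the fingerprint shortcut disabled is a qualitatively different target than a PIN.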

ghshephard 1 day ago 4 replies      
I'm surprised that nobody on this thread has commented on the real substance of this response. It has nothing to do with Apple brute-forcing iPhones for the police (which it has done for years, with a simple court order). Instead, Apple is making it abundantly clear that if they comply (or are forced to comply) with the All Writs Act of 1789 and create this particular backdoor, it opens the floodgates for all sorts of future requests to add backdoors or weaken security.

It's entirely possible that the FBI could then use this precedent to simply have Apple remove all security from an iPhone in pursuit of an active investigation, which can be done with a straightforward firmware update - something iOS users tend to install without much thought.

StillBored 1 day ago 2 replies      
Ok, so I completely fail to see how a random crazy guy with a gun who shoots up a bunch of unarmed people has "national security implications". This seems to be a "fact" that everyone wants to agree on, but it is frankly a load of BS when one considers that the government probably already has his entire call/texting history for the last couple of years.

I see this as just another "it's for the children" ploy, which I'm completely sick of.

Given that, I fully support Apple et al. for finally growing a backbone. If more people stood up, I wouldn't have to be naked-body-scanned at the airport, or endure the dozens of other privacy invasions the government performs daily simply to give itself something to do. Rather than admit it won't ever be able to predict, or protect the population in any meaningful way from, random people willing to give their lives to make a statement, it wastes our time and money coming up with ever more invasive ways to peek into everyone's most private possessions.

firloop 1 day ago 0 replies      
Even though the matters are slightly different, I couldn't help but think that Cook is giving off a Boards of Canada vibe in this post (in a good way).

"Now that the show is over, and we have jointly exercised our constitutional rights, we would like to leave you with one very important thought: Some time in the future, you may have the opportunity to serve as a juror in a censorship case or a so-called obscenity case. It would be wise to remember that the same people who would stop you from listening to Boards of Canada may be back next year to complain about a book, or even a TV program. If you can be told what you can see or read, then it follows that you can be told what to say or think. Defend your constitutionally protected rights - no one else will do it for you. Thank you."


Robin_Message 1 day ago 2 replies      
If the UK record on anti-terror scope creep is anything to go by, not creating this backdoor is a very good idea.

In the UK, laws originally intended for surveilling terrorists were/are routinely used by local councils (similar to districts I think) to monitor whether citizens are putting the correct rubbish/recycling into the correct bin. [1]

This is a Pandora's box, and the correct answer is not to debate whether we should open it just this once; it's to encase it in lead and throw it into the nearest volcano. Good on Apple for "wasting" shareholders' money and standing up for this.

[1] http://www.telegraph.co.uk/news/uknews/3333366/Half-of-counc... - and lest the source be questioned, this is one of the more reactionary newspapers in the UK.

dh997 1 day ago 1 reply      
Tim Cook: a really nice guy with blue-whale-sized cojones.

There can be no compromise, because China, Syria, and Turkey would also lean on Apple to break into the phones of dissidents, and pretty soon future whistleblowers here in the US too, in order to prevent leaks (iPhone 7 and iCar notwithstanding).

That's the tradeoff in not giving in to faint, vague "maybes" that there was "external coordination", when in all likelihood it was the ultraconservative Saudi half leading this duo into the cuckoo land of violent extremism.

The security services will just have to buy exploits, develop malware, cultivate human intelligence sources, and monitor things the old-fashioned way. Rather than relying on this trick, the government would have to obtain court orders and have forensics done under supervision.

This isn't a backdoor and doesn't affect consumers, and it sets a really high bar for the government to scale this up, because it requires Apple, as gatekeeper, to agree to each one-off hack.

The cynic in me thinks that this letter is more about brand image. Apple wants to claim they can't hack their own phones, even if the government asks, but clearly in the case of the iPhone 5C it IS possible for them to do it, and this creates a contradiction with their public marketing and privacy position. If they didn't release this open letter, then simply complying with the judge's order would make them look bad.

rdl 1 day ago 0 replies      
As far as I'm aware, the most plausible attacks here are, in order of cost:

0) Find some errata. Apple presumably knows as much as anyone except NSA. Have plausible deniability/parallel construction.

1) OS level issues, glitching, etc. if the device is powered on (likely not the case). Power stuff seems like a particularly profitable attack on these devices.

2) Get Apple, using their special Apple key, to run a special ramdisk to run "decrypt" without the "10 tries" limit. Still limited by the ~80ms compute time in hardware for each try.

(vs. an iPhone 5S/6/6S with the Secure Enclave:)

3) Using fairly standard hardware QA/test techniques (at least for chip-level shops; there are tens to hundreds in the world that can do this), extract the hardware key. Then run a massively parallel cluster to brute-force a bunch of passphrases against this hw key. I'd bet the jihadi is using a shortish weak passphrase, but we can do 8-10 character passphrases, too. They may have info about his other passphrases from other sources, which could be useful.

While I'm morally against the existence of #3, I'm enough of a horrible person, as well as interested in the technical challenge of #3, that I'd be willing to do it for $25mm, as long as I got to do it openly and retained ownership. In secret one-off, $100mm. I'd then spend most of the profits on building a system which I couldn't break in this way.
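Plugging numbers into the ~80 ms figure above shows why passphrase length dominates. The offline cluster rate below is my own assumption, not a measured figure:

```python
SECONDS_PER_GUESS_ON_DEVICE = 0.08   # ~80 ms per try in hardware, per the comment above
GUESSES_PER_SECOND_CLUSTER = 1e9     # assumed rate once the hw key is extracted (my guess)
SECONDS_PER_YEAR = 3600 * 24 * 365

def on_device_hours(keyspace: int) -> float:
    """Time to exhaust a keyspace when rate-limited by the device itself."""
    return keyspace * SECONDS_PER_GUESS_ON_DEVICE / 3600

def cluster_seconds(keyspace: int) -> float:
    """Time to exhaust a keyspace offline, after hardware-key extraction."""
    return keyspace / GUESSES_PER_SECOND_CLUSTER

print(f"4-digit PIN, on-device:     {on_device_hours(10**4):.2f} hours")
print(f"6-digit PIN, on-device:     {on_device_hours(10**6):.0f} hours")
print(f"8-char lowercase, offline:  {cluster_seconds(26**8):.0f} seconds")
print(f"10-char printable, offline: {cluster_seconds(95**10) / SECONDS_PER_YEAR:.0f} years")
```

Even under these generous assumptions, a long mixed-character passphrase survives attack #3; anything shorter does not.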

owenwil 1 day ago 0 replies      
I really hope they actually physically can't access the data on this phone. It's entirely possible this could be the case -- I've been trying to consider the vectors they could use:

- lightning cable delivered iOS patch (probably won't work because iOS won't negotiate over USB until you tap a dialog box)

- OTA update (not connected to internet)

- Cracking open the device and accessing the storage directly (encrypted until boot time)

The most likely vector I can think of:

- Lightning cable delivered iOS patch from a trusted computer (i.e. one that the terrorists actually owned)

It's quite impressive that Apple is taking a stand like this, though perhaps unfortunate timing WRT the larger encryption debate.

imroot 1 day ago 0 replies      
If you look at past cases where the All Writs Act has been invoked, the Courts have rejected this type of government conscription.

Effectively, the government is forcing Apple to take receipt of a device it does not own or possess, perform potentially destructive services on that device, and then perform services that could require Apple to testify at trial under the Confrontation Clause of the Sixth Amendment.

I really think that Apple's in the clear here, and the AUSAs in the case are pulling out all the stops to get Apple ordered to break the encryption.

drawkbox 1 day ago 0 replies      
Apple is doing the right thing, the American way (which has been forgotten): putting freedom over security. Thanks, Apple; don't give in. Their 1984 anti-Big Brother Super Bowl ad has finally come to fruition [1].

[1] https://www.youtube.com/watch?v=VtvjbmoDx-I

jasonlingx 1 day ago 4 replies      
If they can create this backdoor, doesn't that mean it effectively already exists?

What Apple needs to do then instead of writing this letter, is release an update that closes this backdoor.

jusben1369 1 day ago 1 reply      
I think, as a society, it boils down to this: "And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."

Can a private, for-profit company deny the will of an elected government working to solve a heinous crime, based not on what the government says it will do, but on the fact that it cannot give a 100% guarantee that this is the only time/way the tool will be used? Apple acknowledges that the government says it's limited to this case, but because there's no guarantee (100% certainty), they feel they can refuse?

If yes, what does that mean as a broader precedent? Are we comfortable with private companies denying an elected government based not on what it agrees to, but on the chance it'll be used in other ways?

However terribly flawed one might feel government is, very few would think it has less accountability than a private company.

whack 1 day ago 0 replies      
I've never been an Apple fan but this was a fantastic and bold move by them. Software security and hacking is already an enormous problem that every single person has to deal with. Even major companies like the NYTimes have been hacked by malicious users in the recent past. We need to take every reasonable action to combat this threat. Building deliberate vulnerabilities (yes, every backdoor is a vulnerability) into our software and devices is going to make all of us less safe, and all of us more vulnerable to unforeseeable attacks in the future.
illumin8 1 day ago 1 reply      
If you oppose this, please let President Obama know. The FBI is part of the executive branch of the government, and as such, directly reports to the president. In other words, if he tells them to stop, they must comply. Please register your complaint here:


Here is my letter to them:

Dear President Obama,

I've voted for you in both elections, and have been a firm supporter of all your causes (the Affordable Care Act, and more). However, your FBI has clearly overstepped its authority by demanding that Apple spend engineering resources building a software product that can break the encryption of a terrorist's iPhone.

Seriously, you need to stop this. You are the head of the executive branch of the government, and the FBI falls directly under your jurisdiction. Director James Comey is directly within your chain of command.

What the FBI is asking for is a master key to be created that can decrypt any iPhone. This makes all Americans with Apple devices insecure in the face of threats to our personal security and privacy. I hope you can understand that this is clearly unacceptable, and needs to be stopped.

I want to register my complete opposition to the FBI in this circumstance. Please stop this.


l3m0ndr0p 1 day ago 0 replies      
Apple's encryption appears to be done in such a way that government entities can safely use the devices as well as "consumers." But what may happen is that Apple will be forced to produce two kinds of iPhones: one for consumers, with strong encryption but a "backdoor" for warrant-"cough"-based access, and a second type for government use (strong encryption, no backdoors).

They may already have this in place, but what we are seeing now is a show: they are testing how people/consumers will react to this situation. Our government probably figures that nobody will care in the end.

In the USA, we have lost our liberty. It's time to wake up and see what is happening. It's getting worse & the people within our government are working hard to enslave us even more.

iamshs 23 hours ago 0 replies      
Couldn't be more glad to see Apple's stand on this. Let's not forget what happened with BlackBerry some 5 years ago: India, Saudi Arabia, and the UAE gained monitoring capabilities on its platform:


2) http://www.reuters.com/article/us-blackberry-saudi-idUSTRE67...

3) http://www.thestar.com/business/2010/08/16/threats_of_blackb...

agebronze 1 day ago 1 reply      
This is just huge hypocrisy, and full of lies. First of all, Apple CAN attempt to brute-force the password. Compiling whatever new firmware is needed and signing it with their keys will not introduce any new backdoor like they claimed to the public - the backdoor is already there, and it is their private keys. Just as that "backdoor" could somehow end up in some bad guy's hands, so could their private keys.

I would agree with Apple if they wanted the FBI to pre-submit all the passcodes to be tried, with Apple having sole responsibility for running them, so that obtaining said "backdoor" (which really is nothing more than a door handle) would be as hard as obtaining their private keys, and governments would never hold the backdoor themselves. I would also agree if Apple claimed they don't want to be able to crack devices at a judge's order at all (although that would be against the law, so they can't claim that).

But this is NOT what Apple said. This whole letter is just one big piece of PR bullshit. They CAN brute-force a passcode. They failed to enforce a significant delay after failed passcode attempts - even though this issue had been known for YEARS (I will give a citation if needed) when Apple designed the iPhone 5C in question - and they also failed to require a passcode to update the device. They already have their convenient backdoor in place in the form of their private keys.

plorg 1 day ago 1 reply      
If I understand correctly, any piece of software that would be used would need to be signed by Apple. Furthermore, the FBI's warrant(?) says specifically that it would only need to work for one device ID. Thus it would be relatively straightforward to create an update for the FBI that could pretty clearly only be used on the phone in question. Unless the FBI had Apple's signing key they could not reuse the software (assuming they couldn't break the bootloader chain of trust, which they apparently cannot if this is the route they are taking). Capability is not the grounds on which they are arguing.
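The device-ID binding plorg describes can be sketched with a toy signature scheme. This is purely illustrative: real iOS updates use asymmetric code signing, not HMAC, and the key and identifiers here are made up:

```python
import hashlib
import hmac

# Toy stand-in for Apple's code-signing key (hypothetical value for illustration).
SIGNING_KEY = b"apple-signing-key-stand-in"

def sign_update(image: bytes, target_device_id: str) -> bytes:
    """Sign the firmware image together with the one device ID it may run on."""
    return hmac.new(SIGNING_KEY, image + target_device_id.encode(), hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, own_device_id: str) -> bool:
    """A device only boots an image whose signature covers its OWN id."""
    expected = hmac.new(SIGNING_KEY, image + own_device_id.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(signature, expected)

image = b"limit-removed-ios-build"
sig = sign_update(image, "DEVICE-ID-TARGET")

print(device_accepts(image, sig, "DEVICE-ID-TARGET"))  # True: the one phone in question
print(device_accepts(image, sig, "DEVICE-ID-OTHER"))   # False: signature doesn't transfer
```

Without the signing key, the FBI could not re-target the image, which is exactly why the argument is political rather than technical.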

This is a very clearly political refusal. Apple is saying that about as explicitly as they can in this message. Whether or not they can do it, Apple doesn't want to be caught in the game of being a government surrogate or having to determine for themselves if government requests are legitimate (imagine, say, if the Chinese government asked for data from a dissident's phone - would Apple want to risk that market by denying a request that they have complied with in the US?). It's unfortunate for them that the FBI is making this request while people still own phones like the 5c for which they could theoretically disable security features, as opposed to the newer phones which it is possible they are completely unable to defeat.

dfar1 1 day ago 1 reply      
I know that because this is Hacker News, everyone is talking about whether it's possible to access the data and, if so, how easy or hard it would be. But the focus should be on whether there should be a clear line/understanding between security and privacy, or whether we keep everything black and white as it is now, looking only at extreme cases.

If they cannot co-exist, I'd rather have more security and less privacy. But ideally, I shouldn't have to choose between them.

thorntonbf 1 day ago 0 replies      
This is an interesting chapter in the "Tim will never be Steve" saga that so many people are infatuated with.

This particular hill that Tim Cook has decided to defend is as important as anything Steve Jobs ever did at Apple.

jwr 1 day ago 1 reply      
The important lesson here is that it is time to design the next phones so that it is impossible to install a software update without unlocking the device, and to implement the auto-erase functionality in hardware.

That way for future phones at least, the issue would become moot: there would be no way for Apple to build and/or install a custom software image that allows brute-force password cracking.

davidhariri 1 day ago 1 reply      
I'm no security expert, but how would Apple access previously encrypted data with a different version of iOS? Doesn't having that ability imply they already have a "backdoor"? Could someone explain what I'm missing here? Or is it more that this would be a one-off solution, while the FBI is asking for a global, remote, no-Apple-needed solution?
mrb 1 day ago 3 replies      
Link to the FBI order: https://assets.documentcloud.org/documents/2714001/SB-Shoote...

(Edit: deleted part where I was wrong. Thanks robbiet480 for correcting me. It's 2am here and I was tired.)

Also, prediction: if Apple refuses to build a brute forcer, someone else will do it and sell it to the FBI. Just wait and watch.

mnglkhn2 1 day ago 0 replies      
There needs to be a distinction between state security and "retail" security. State security agencies have the legal framework to compel Apple to do anything and not even talk about it. What I call "retail" security is any act by any law enforcement agency in the country. Their requests are bound to come in large numbers and for all kinds of things. On top of that, these requests apparently are not yet covered by a legal framework. Hence the need to lean on an old law to try to make Apple comply.

What's at stake for Apple is not only their principles but also one of their marketing pillars: "you, the user, can trust us with your data/privacy." By asking Apple to give that up, and quietly, you are actually asking them to undermine their business model. Shareholders would not appreciate it if they never had a chance to hear about it first. The Apple brand would lose value, and that would be reflected in the AAPL share price.

My point is that the whole thing needs legal backing. And Apple is asking for exactly that: give me a law to use, and not something from the 1700s.

rcthompson 1 day ago 1 reply      
I see a lot of discussion about "Secure Enclave" and other hardware security features and such, and I'm not sure I see the relevance. Assuming that the data has already been properly encrypted, stored on disk, and purged from memory (by shutting down the phone) by a version of iOS that did not already contain a backdoor when the data was encrypted, there's no magic combination of hardware and software that can decrypt that data without the password, right? This seems to be supported by Apple's claim that the best they could possibly do is provide a channel for the FBI to brute force the password.

So am I missing something that makes the iPhone's internal security architecture relevant here?
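A minimal sketch of what rcthompson is describing, using PBKDF2 from the standard library. Note the real device entangles the passcode with a hardware UID inside the crypto engine; the salt below is only a stand-in for that:

```python
import hashlib

# On a real phone this value is fused into silicon and never leaves the crypto engine.
DEVICE_UID = b"per-device-hardware-uid-stand-in"

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Derive the data-protection key from the passcode and the device UID.

    No software, Apple-signed or otherwise, can recompute this key
    without the passcode; that's the whole point.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

k_right = derive_key("correct horse battery staple")
k_wrong = derive_key("0000")

print(k_right == derive_key("correct horse battery staple"))  # True: deterministic
print(k_right == k_wrong)                                     # False: wrong passcode, wrong key
```

So rcthompson's reading is right as far as it goes: the hardware architecture matters only because it controls how fast an attacker can grind through passcode guesses, not because it holds a decryption shortcut.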

tommynicholas 1 day ago 0 replies      
No sympathy for terrorists, no sympathy for weakening encryption.

I can understand someone outside of tech not understanding how those are comparable statements, but if anything the latter is more important.

aidos 1 day ago 0 replies      
OT but this post is well on its way to becoming the most popular post since Steve Jobs died.


uberneo 1 day ago 0 replies      
I remember the old dead community of OpenSource Phone - http://wiki.openmoko.org/wiki/Main_Page
cognivore 1 day ago 5 replies      
If I were Cook, I'd draw a line in the sand: if we are forced to comply, we exit the phone business, because we won't make phones that compromise our customers' security.

But that would take more balls than anyone in this "land of the free and home of the brave" seems to have anymore.

NicoJuicy 1 day ago 0 replies      
Would this really be true or is this just a decoy, to let you believe there is no backdoor?

I do believe there is no backdoor for when a city court requests it, but I don't really believe that the FBI or CIA lack access.

Considering that the iPhone has existed for a long time, they must have some means of backdooring "iCloud"...

chillaxtian 1 day ago 0 replies      
if you are interested in the technical details of iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
mrmondo 1 day ago 0 replies      
Massive props to Apple, again I am impressed by their commitment to customer privacy.
SCdF 1 day ago 0 replies      
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software, which does not exist today, would have the potential to unlock any iPhone in someone's physical possession.

They really need to put that paragraph closer to this one:

> The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by brute force, trying thousands or millions of combinations with the speed of a modern computer.

The first paragraph without the second implies that iOS isn't actually secure at all.

Your_Creator 1 day ago 0 replies      
The All Writs Act is a United States federal statute, codified at 28 U.S.C. 1651, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."

well, as far as I can see, it is not agreeable to the usages and principles of law to force a company (or a person, since corporations are people) to spend money and waste resources compromising its own security systems, which happens to be something it morally objects to.

wicket 1 day ago 0 replies      
A phone without a backdoor would be illegal in the UK once the Snooper's Charter comes into full effect. I'm very interested to see how the UK government will react to Apple's stance.
greggman 1 day ago 0 replies      
I think it's outstanding that Apple is standing up for this.

Will they, or can they, do anything about data in iCloud as well? While you can turn iCloud off, I'd guess the majority of people are using it. Given that you can access much of it at iCloud.com, it would seem that whether or not anyone can unlock an iPhone, most customers' data is available directly from Apple: mail, notes, messages, photos, etc. No idea about other apps' data that gets backed up.

Again, I'm applauding Apple for standing up for encryption. If they could somehow offer the same on iCloud, I'd switch from Google. (Google is not encrypted either; my point is I'd switch to a service that offers it.)

FIX_IT 1 day ago 0 replies      
Just so you know: when I forgot the passcode to my iPhone, I remembered that I'd chosen 1 from the top row, 2 repeated numbers from the row below, 1 below that, and then a zero. So I tried combinations until it said "iPhone disabled, connect to iTunes", and that's when I found out you can reset the disabled timer by clicking the backup button on a computer. Therefore you could successfully create a program that tries 5 or 10 passcodes, then starts and aborts a backup of the iPhone. (There is no need for a backdoor or anything fancy.)
jonathankoren 1 day ago 0 replies      
Apple does deserve the respect they're getting for standing up to the government on this. They're absolutely right that this is an attempt to fatally undermine security for a whole host of devices, and it sets a disturbing precedent.

What I do find interesting is that Apple isn't the first manufacturer the government has ordered to crack a device. An "unnamed smartphone manufacturer" was ordered to crack a lock screen on October 31, 2014. [1] No one made a fuss then, so someone caved.

[1] https://en.wikipedia.org/wiki/All_Writs_Act

eva1984 1 day ago 1 reply      
It makes sense for them.

If they put a backdoor in iPhone for US government, they are effectively thrown out of Chinese market.

Interestingly enough, what will Apple do if the Chinese government demands that they decrypt or backdoor devices in exchange for staying in the market?

teacurran 1 day ago 0 replies      
Why is there very little talk about the First Amendment in this whole discussion? They are asking to write custom software.

The Supreme Court has ruled in separate cases:
1. that software is speech
2. that a person (corporations are people according to them) cannot be compelled to speak

It would seem to me that the FBI could perhaps subpoena technical documentation from Apple but it should be required to hire their own developers to write this software.

aiabgold 1 day ago 0 replies      
I'm curious: is it likely that Apple was under a gag order regarding the backdoor proposals/discussions?

I've always wondered why large tech companies/corporations abide by such orders instead of speaking out. Even if Apple was under a gag order, they've created a PR nightmare for the alphabet agencies; Apple could be pursued in court, but that pursuit would now likely be done in the face of negative public opinion.

xlayn 1 day ago 1 reply      

> "While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect." - Tim Cook
Kudos to this guy for standing up for an idea.

Now on practical notes, this is about security, providing a digitally secure platform to both users and providers, prevent tampering, keeping data secure.

Microsoft could take a cue.

datashovel 1 day ago 1 reply      
I'd love to know the names of the people within the FBI who are pushing this agenda. The only way this foolishness is going to stop is if those people are out of a job.
mcintyre1994 1 day ago 0 replies      
I don't quite understand - what is the actual purpose of being able to push a new version of iOS while locked? Apple don't seem to use this - people stick to whichever version they're comfortable with on old devices and accept whatever limitations.. so why does the functionality even exist?

Even with the restriction of being plugged in, outside of Apple who needs to push iOS versions at tethered devices and will be hindered too badly by having to unlock them first?

rubicon33 1 day ago 0 replies      
I hope apple employees + executives read this:

I am now officially an Apple fanboy. That's right, I'm gloating to family and friends about how Apple is standing up to the man, doing the right thing, and refusing to compromise their security.

Keep up the good fight.

ocean3 1 day ago 0 replies      
Even if Apple created a backdoor, how would they install it on a locked phone? Are locked phones able to update without internet access or the user's passcode?
bigolebutt666 12 hours ago 0 replies      
Wouldn't one assume that once the phone is powered up, there is some kind of code at startup (or on a schedule) that queries an Apple update server for updates, fixes, etc.? At that point, isn't it reasonable that a company such as Apple could force certain updates onto the phone whether the customer wanted them or not? All Apple would have to do is direct the phone to a phony update site (for this IMEI only) containing code that would dump RAM to an outside server. No other phones would be affected and the data would be retrieved. World saved!
ohazi 1 day ago 0 replies      
This is the only acceptable response.
HighPlainsDrftr 23 hours ago 0 replies      
This is ugly. If Apple can indeed break into the phone, they need to say: "We have to stop production now. All of our engineers will need to be behind this. It will cost us at least a billion dollars, if we can do it at all. We will miss deadlines for new products and software. Write us a check for $1 billion and we will start on it. We may need a few billion more. Write the check and we'll do what we can. And let's hope we don't accidentally destroy the evidence doing it."
mckoss 1 day ago 0 replies      
A possible compromise would be to add a backdoor to the security module that would unlock the phone in exchange for a proof of work.

It would be relatively easy for the chip to offer a challenge and accept, say, a $100,000 proof of work to unlock the phone. This way, we prevent bulk surveillance but still allow the government to access high value targets' devices.
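mckoss's proof-of-work idea can be sketched hashcash-style: the chip issues a challenge and only accepts an unlock accompanied by a nonce that took a predictable amount of computation to find. A toy version (the difficulty parameter here is tiny and illustrative; a "$100,000" setting would be vastly larger):

```python
import hashlib
import itertools

def solve(challenge: bytes, difficulty_bits: int) -> int:
    """Grind nonces until the hash has `difficulty_bits` leading zero bits.

    Expected work doubles with each extra bit, which is what lets the
    scheme price an unlock while keeping bulk surveillance uneconomical.
    """
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Verification is a single hash, regardless of how hard solving was."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

challenge = b"secure-enclave-unlock-challenge"
nonce = solve(challenge, 16)          # ~65k hashes expected at 16 bits
print(verify(challenge, nonce, 16))   # True
```

The asymmetry (expensive to solve, cheap to verify) is what would let a security chip enforce the price without trusting the party paying it.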

Merad 1 day ago 1 reply      
What's the potential fallout on this case? I assume Apple is appealing the ruling - what happens if the ruling is upheld and Apple refuses to comply (unlikely IMO, but what if)? Could the DOJ target individual Apple engineers and order them to do it or face contempt of court charges?
ezoe 1 day ago 0 replies      
It's almost certain that Apple has helped the American government violate customers' privacy. This looks to me like just a marketing stunt for the post-Snowden era.

If Apple cared so much about customers' privacy and security, how could they sell non-free software that is hard to audit, hardware with a baseband processor, and services that rely on a central server that is a single point of failure?

My understanding is that Apple customers don't much care about their own privacy and security, but are susceptible to marketing.

leecarraher 1 day ago 0 replies      
Backdoor is somewhat of a misconception. What they want are two front doors, i.e., we encrypt your message with the recipient's public key, and we make a copy encrypted with our (in this case Apple's) public key. Both messages are sent over the internet, and Apple or your ISP/cell service provider (we can also assume NSA PRISM has it too) stores the Apple-keyed message, or both. When the government wants access, it can issue a subpoena to the ISP/cell provider for the encrypted data (or just download it from Saratoga Springs), then issue a warrant to Apple to decrypt it with their private key. This is the only reasonable and responsible outcome I can see resulting from this debate. Or, pessimistically, it becomes political fodder and we leave it to politicians who have little to no understanding of the technology to devise some technologically inept solution.
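The "two front doors" scheme amounts to wrapping the same session key for two key holders. A deliberately toy sketch (the XOR "wrap" stands in for real public-key encryption such as RSA or ECIES, and the key names are made up):

```python
import hashlib
import os

def wrap(session_key: bytes, holder_secret: bytes) -> bytes:
    """Toy key wrap: XOR with a hash-derived pad.

    A real escrow system would encrypt the session key under each
    holder's public key; this stdlib-only stand-in just shows the shape.
    """
    pad = hashlib.sha256(holder_secret).digest()
    return bytes(a ^ b for a, b in zip(session_key, pad))

unwrap = wrap  # XOR is its own inverse

session_key = os.urandom(32)                           # encrypts the actual message
user_copy = wrap(session_key, b"recipient-key")        # front door #1
escrow_copy = wrap(session_key, b"apple-escrow-key")   # front door #2

# Either key holder independently recovers the same session key:
print(unwrap(user_copy, b"recipient-key") == session_key)       # True
print(unwrap(escrow_copy, b"apple-escrow-key") == session_key)  # True
```

The security argument against this is visible in the sketch: the escrow key decrypts every message ever stored, so it is a single catastrophic point of failure.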
rloc 1 day ago 1 reply      
I feel like Apple is intentionally oversimplifying this for the purposes of the letter, or maybe to push back on the FBI's ask more easily.

Apple could propose to secure access to the FBI using the same level of security that it uses to protect the access to the phone content for the owner of the phone himself. Tim Cook only talks about one solution of a "tool" that it could install.

If the same level (and method) of security is used then saying that there is a risk of the backdoor being hacked would be equivalent to saying that there is a similar risk of the user access being hacked.

philip1209 1 day ago 0 replies      
Their iMessage encryption is fascinating. It basically makes it impossible to retroactively decrypt iMessages. With a court order, they can start MITMing conversations, but unless they intentionally generate a MITM keypair they are cryptographically locked out of the conversation.

http://techcrunch.com/2014/02/27/apple-explains-exactly-how-... (Link to Apple's paper is in the article.)

(Yes, Apple could add this key for everybody at the beginning, but if their intention is security then it is a brilliant system.)
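A toy model of the fan-out design described in this comment. The keys below are random stand-ins for the real per-device public keys; the point is only that whoever controls the key directory can silently add an extra "device" and read everything going forward:

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# iMessage-style fan-out: the sender encrypts the per-message key once
# per device key fetched from the directory service.
device_keys = {
    "alices-iphone": secrets.token_bytes(32),
    "alices-ipad": secrets.token_bytes(32),
}

def send(msg_key: bytes, directory: dict) -> dict:
    return {dev: xor(msg_key, k) for dev, k in directory.items()}

msg_key = secrets.token_bytes(32)
envelopes = send(msg_key, device_keys)  # normal delivery: 2 envelopes

# A compromised directory could add one more entry the sender never sees:
ghost_key = secrets.token_bytes(32)
tampered_directory = dict(device_keys, **{"ghost-device": ghost_key})
envelopes2 = send(msg_key, tampered_directory)

# The interceptor holding ghost_key now recovers the message key:
assert xor(envelopes2["ghost-device"], ghost_key) == msg_key
```

This is why the parent says interception requires an intentionally generated MITM keypair: without one, Apple is cryptographically locked out.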

huntleydavis 1 day ago 0 replies      
Privacy is obviously the foremost issue at hand with the Government's request here, but there is also a huge potential impact on the future of the iPhone software. There is a huge difference between granting access to a user's data at the Government's request vs demanding a customized build of the iPhone's OS. Imagine the long-term implications of having a third-party tether its misaligned feature requests to every OS update that the iPhone makes. What would be the continued relationship with Apple and the agency behind this? Would this evolve into something analogous to HIPAA compliance?
larrymcp 1 day ago 1 reply      
To play devil's advocate:

Mr. Cook expressed concern that "the government could intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge".

As I read this I wondered, "what harm would actually happen if that occurred"? If the government did read my messages and get my health records & financial data and track my whereabouts, I can't think of anything bad that would actually happen as a result of that.

Is there anything specific that I should be worried about in that scenario?

mempko 1 day ago 0 replies      
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software which does not exist today would have the potential to unlock any iPhone in someone's physical possession.

The scary part here is that the iPhone's data is really not that secure. If Apple can overwrite the OS and get access to the data, this means the keys are stored on the phone somewhere, and are not password- or fingerprint-protected.

dang 1 day ago 1 reply      
This is now the most-upvoted story HN has ever had.
mirimir 1 day ago 0 replies      
It may well be that Cook's stand will soon become unworkable in the US. The US is always at war, after all, at least effectively. I wonder if Apple would just leave. It's already earning ~60% of revenue outside the US, after all. And hey, it's sitting on tons of offshore cash. Maybe it could build its own country on an unclaimed reef somewhere.
AYBABTME 1 day ago 0 replies      
While I think protecting user data is important, I don't understand what the fuss is about. Anyone could (given technical knowledge + tools) take apart a phone, pull the encrypted data out of storage, and then brute force the encryption on a large machine.

The FBI doesn't need the modified iOS code, and that Apple write/not-write it doesn't change anything in the end, since someone else could just as well write the software with some reverse engineering.

[edit: if you downvote because I'm wrong, please explain because I'd love to know why]

Synaesthesia 1 day ago 0 replies      
I always wondered why Apple took so much trouble with the secure enclave design, I thought it was really overkill, now I see it was really necessary for instances like this.
lisper 1 day ago 0 replies      
It is worth pointing out one salient fact: the phone in question did not belong to the shooter, it belonged to the shooter's employer, which in this case is the county government. That makes Apple's position much less tenable because the owner of the phone is (presumably) consenting to -- maybe even actively encouraging -- the recovery of the data.
krylon 1 day ago 0 replies      
Given the way a lot of people (and the media) tend to go completely bonkers when somebody says "terrorist", this is commendable.

It remains to be seen, though, what Apple will actually do, in legal terms. Will they flat-out refuse to cooperate, even if this means that they will be fined or Mr. Cook will be imprisoned for contempt or something like that? Will they actually send their lawyers to challenge the court decision? That would be very interesting to watch, and if they succeeded, it would create a precedent for a lot of other companies. But so would their failure.

Sealy 1 day ago 0 replies      
Huge respect to Tim Cook for standing up for the personal information security of Apple's users around the world. When a non-techie demands something as stupid as a backdoor, they do not acknowledge how much they weaken data security.
autoreleasepool 1 day ago 1 reply      
Wow, this made my day. I think my faith in Apple's privacy stance got a much-needed revitalization. Privacy and encryption are the number one reason I stick with iPhone, and Mac with FileVault. It was always hard to completely trust them after PRISM. However, that was arguably a different Apple.

This stance against the government reaffirms my faith in the genuineness of Apple's encryption efforts, and in Tim Cook specifically.

Synaesthesia 1 day ago 0 replies      
There is one way to brute force an iPhone, called IP Box. It's a hardware device which can brute force a 4-digit PIN in ~111 hours. http://blog.mdsec.co.uk/2015/03/bruteforcing-ios-screenlock....

But it only works on iOS 8.1 or earlier; it was patched in iOS 8.1.1.
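The ~111-hour figure checks out as a worst case, assuming roughly 40 seconds per attempt as the linked write-up reports (the box power-cycles the phone after each guess to dodge the failed-attempt counter):

```python
# Rough sanity check of the ~111-hour figure quoted above.
seconds_per_attempt = 40           # assumed from the IP Box write-up
pin_space = 10 ** 4                # 4-digit PIN: 0000 through 9999
worst_case_hours = pin_space * seconds_per_attempt / 3600
print(round(worst_case_hours, 1))  # → 111.1
```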

facetube 1 day ago 0 replies      
This was his employer's phone, right? As in, it was government-owned property being used in the course of terrorism. Were they using Apple's Mobile Device Management (MDM) framework or some other form of key escrow? If not, why should Apple bail out a government entity, at the expense of its own customers and security, that couldn't even be bothered to follow best practices?
danbmil99 1 day ago 0 replies      
Techie question: if Apple can compile a neutered version of iOS to bypass encryption, why can't a hacker (or US govt nerd) at least in theory reverse engineer iOS and patch it accordingly?

(guess answer: iOS needs to be signed. So what they are really asking of Apple is to sign a lobotomized iOS image...)
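The guessed answer is right in spirit. A minimal sketch, with an HMAC standing in for the real public-key signature the boot ROM actually verifies (the names and data here are illustrative only):

```python
import hashlib
import hmac
import secrets

# Stand-in for Apple's signing key. In reality the boot ROM holds the
# PUBLIC half in silicon and only Apple holds the private half, which is
# why a patched image can't be made bootable by reverse engineering alone.
APPLE_SIGNING_KEY = secrets.token_bytes(32)

def sign(image: bytes) -> bytes:
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def boot_rom_verify(image: bytes, signature: bytes) -> bool:
    # The device refuses to boot any image whose signature doesn't check out.
    return hmac.compare_digest(sign(image), signature)

official = b"genuine iOS image"
official_sig = sign(official)
assert boot_rom_verify(official, official_sig)

# A reverse-engineered, patched image fails verification, even when
# presented with the signature from the genuine image:
patched = b"lobotomized iOS image"
assert not boot_rom_verify(patched, official_sig)
```

So the FBI isn't asking for reverse-engineering help; it's asking Apple to exercise its signing key.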

rdl 1 day ago 1 reply      
I wonder how much of that was personally written by Tim Cook vs. various other people within Apple. (I'm sure legal, PR, product, etc. all had input, but this feels like something he wrote himself.)
roadnottaken 1 day ago 0 replies      
Can someone explain this to me? The FBI requests a new version of iOS to be installed on a single phone that was involved in the attack. What, exactly, does this mean? If the phone is locked, how will they install new software on it without unlocking? People are suggesting an update to iOS that will get pushed-out to all users, but contain a backdoor that is specific to that one particular device -- but how will the new iOS version be installed without unlocking first?
uberdingo 1 day ago 0 replies      
This is all brave talk until they publicly say the same thing to China; until then it's political bluster. http://qz.com/332059/apple-is-reportedly-giving-the-chinese-...
joshcrawford 1 day ago 0 replies      
"For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them."

Sounds just like gun control :)

tibbon 1 day ago 1 reply      
I heard this morning on the (semi-conservative FM radio) that this was a national security issue, and that Apple is helping terrorists in not bypassing this.

I don't get it- the shooters are dead. How is what is on their phone a matter of national security? We probably have 99% of the information we'll ever have on them. There is no larger plot. Not having what's on this device I cannot imagine puts anyone at risk.

lucio 1 day ago 0 replies      
Can't they dump all the data from that particular device and then send it to the FBI? Maybe the judge will order that? Obviously they're conceding they can break the encryption but will not do it, on principle. I don't see how they can win this fight. If it's the shooter's iPhone and they can decrypt it, they should do it. That is not the same as giving the FBI a tool to unlock any iPhone.
grecy 1 day ago 0 replies      
Does anyone know what the consequences for Apple will be if they keep refusing, but the courts say they must?

Massive fines? (we know they have the cash to cover it)

Jail time for execs (whoa!)


tdsamardzhiev 1 day ago 0 replies      
If they give the government what it wants now, next year the government will come back with an even more ridiculous request. Mr. Cook is right: it'd be great if we could avoid creating a precedent.

Oh wait, they already did, by providing their clients' data. Trying to stop the government now is like trying to stop a high-speed train. Still, good luck to them! Good to know they are not just pushed around without any resistance.

andy_ppp 1 day ago 0 replies      
Surprising the FBI doesn't have a division of highly paid individuals who can crack iPhones... There are plenty of people online with a vested interest in this topic who I'm sure you could hire to help.

My guess is that this is more about pushing back on the law and people's rights than it is about getting access to this device.

But then I'm highly cynical about what the government claims it can do with technology, for obvious reasons.

geocar 1 day ago 0 replies      
Can Apple upgrade iOS on a single device that is locked, from a new untrusted laptop without wiping it?

Can Apple OTA upgrade iOS when the device is locked?

ThinkBeat 1 day ago 0 replies      
If I were a betting man I would put good money on the bet that a bypass exists and is well known to the government.

What parts of the government is a different matter.

This is a perfect setup. Get all the bad guys to run out and buy iPhones (good for Apple) believing that they are safe from the US surveillance machine.

Then the appropriate agency can slurp up whatever it wants.

AshleysBrain 1 day ago 1 reply      
While basically being on Apple's side here, as I understand it, jailbroken devices are unofficial builds of iOS that have some security features removed (e.g. limits on which apps can be installed).

Is it not possible for law enforcement to get what they want from that, if all they want is a custom build of iOS that can be hacked around? And why is it even possible for that to work if the data is supposed to be kept secure?

SkidanovAlex 1 day ago 1 reply      
> People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes

I wonder if this is a grammar mistake, or if Apple actually considers those private conversations, notes, and photos to be theirs?

kabdib 1 day ago 0 replies      
I'm betting there are similar vulnerabilities in the current "Apple doesn't have the keys" versions of iOS and the hardware. For instance, do a similar mandated firmware update to the secure enclave, and now you get unlimited guesses at a PIN.


Ah, I've found a couple of sources claiming that the secure enclave wipes its keys if its firmware is updated. Makes sense.

notthegov 1 day ago 0 replies      
It's hard for me to have respect for an organization that was built by J. Edgar Hoover, a person who did not respect the law or Americans' rights.

The philosophy of corruption and oppression still echoes throughout the FBI. Even today, there are FBI agents that work for private interests. You can't reform a mafia, you must abolish it and start over.

cft 1 day ago 0 replies      
The numerical passcode is likely his ATM pin, or a code from his bank/PayPal or some such. I hope the government can simply subpoena his bank/PayPal etc and this will end at that.
maxnaut 1 day ago 0 replies      
Many may hate Apple; however, it's undeniable that they're committed to user security.
sandworm101 1 day ago 0 replies      
Google CEO Sundar Pichai has thrown in with Apple in a series of tweets explaining his position.


carsonreinke 1 day ago 2 replies      
Maybe I am missing something here, but the Washington Post says "Federal prosecutors stated in a memo accompanying the order that the software would affect only the seized phone". What is so wrong with that, if they use it only on this phone? Or is it that the weapon has been created and could be reused?
intrasight 1 day ago 0 replies      
Apple's OSes are closed and therefore inherently insecure. When Apple caves, and they undoubtedly will, it will have the beneficial consequence of being a boon to open source communications software.
vbezhenar 1 day ago 2 replies      
This is a very disappointing letter for me. It means that Apple can indeed build a backdoor into existing phones; they just don't want to (or so they say). I was under the impression that Apple employs security hardware which protects the keys and makes it impossible to penetrate that defense. If that's not the case, iOS security is not as good as it could be.
p01926 1 day ago 0 replies      
Wow. This is the first HN submission to exceed 5,000 points!

To honour Tim, and his advocacy for our industry, I'm going to spend the rest of my week developing privacy/security projects. I encourage everyone else to do likewise.

BWStearns 1 day ago 0 replies      
Good on them. I was hoping that they'd be able to manage a way to unlock this one without potentially breaking the whole model (by exploiting some bug in the presumably outdated version installed or something that wouldn't positively degrade the security model), but given that that's not the case then I think they're making the right choice.
Cthulhu_ 1 day ago 0 replies      
What I think Apple should (also?) do is appeal to both the law enforcement themselves, and the government - basically go "All secret communications from law enforcement and government figures - up to the President - would be at risk", or something to that effect.

I doubt the ones giving these orders would be comfortable with their own privacy being at risk.

thetruthseeker1 1 day ago 0 replies      
Is iPhone storage (and the files on it) and cloud content encrypted with a single private key that is stored in the Secure Enclave on the iPhone?
delinka 1 day ago 0 replies      
What's all this talk about pushing updates to locked phones? I have to get involved every time there's an OS update for any of my iDevices. That damn red dot on Settings.app just stares at me while I try to find a time I'd like to be without my device for half an hour.
mladenkovacevic 1 day ago 0 replies      
Way to go Apple.

And Edward Snowden just tweeted this a few minutes ago in response to another tweet proposing Google back up Tim Cook: "This is the most important tech case in a decade. Silence means @google picked a side, but it's not the public's."

muddi900 1 day ago 0 replies      
The court order was posted on HN hours before this letter, and either Tim Cook has not read the order or he's lying about the backdoor. What the court ordered was the removal of the auto-wipe.
vu3rdd 1 day ago 0 replies      
I don't see how this message is reassuring. Are they expecting customers to just take their word for it? Without Apple showing the world every bit of software they run on their phones, these statements are, at best, meant to mislead users into thinking Apple is doing something on their behalf.
hughw 1 day ago 0 replies      
Is it possible for a human to just try all 10,000 passcode combinations? Assuming the 10-failure erasure is switched off (a bad assumption, I know). Is there an additional slowdown after a lot of failed attempts?
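There is indeed a slowdown. iOS applies escalating delays after failed attempts; the figures below follow Apple's published iOS security documentation of that era, but treat them as approximate. Even ignoring typing time, the forced waiting alone makes a manual sweep impractical:

```python
# Back-of-envelope for the question above: cumulative forced delay
# for trying every 4-digit passcode by hand.
def delay_minutes(attempt: int) -> int:
    """Approximate lockout delay imposed BEFORE the given attempt number."""
    if attempt <= 4:
        return 0
    if attempt == 5:
        return 1
    if attempt == 6:
        return 5
    if attempt in (7, 8):
        return 15
    return 60  # attempt 9 and every attempt after it

total_min = sum(delay_minutes(n) for n in range(1, 10_001))
print(total_min)              # 599556 minutes of waiting
print(total_min / 60 / 24)    # ≈ 416 days, before any typing time
```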
WA 1 day ago 0 replies      
I wonder why no one has pointed out that privacy boils down to trust:

That letter might be the truth, or it could be some kind of decoy. Maybe the backdoor will come, Apple already knows it, and they are trying to limit the damage to their brand.

Like "we tried to resist having a backdoor installed, but we couldn't do it ultimately".

bumbledraven 1 day ago 1 reply      
Cook wrote that "this software ... would have the potential to unlock any iPhone in someone's physical possession." (emphasis mine)

Is that true? What if it's locked with a secure 128-bit (e.g. 10-word diceware) passphrase?
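For reference, the entropy math behind the question above, assuming the standard 7,776-word diceware list (5 dice rolls per word):

```python
import math

# Each diceware word is one of 6**5 = 7776 equally likely choices.
bits_per_word = math.log2(7776)
print(round(bits_per_word, 2))    # → 12.92 bits per word
print(round(10 * bits_per_word))  # → 129 bits for a 10-word passphrase

# Even at a billion guesses per second, sweeping a 128-bit space takes
# on the order of 1e22 years, so removing the retry limits in software
# would not make brute force feasible against such a passphrase.
years = 2 ** 128 / 1e9 / (3600 * 24 * 365)
print(f"{years:.1e}")
```

So the letter's claim holds for short PINs, which is what most people actually use, but not for a strong diceware passphrase.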

clarus 1 day ago 0 replies      
I think there are two orthogonal questions:

* Does Apple pretend the FBI cannot access its devices?

* Can the FBI access its devices?

The only thing we learn here is the answer to the first question. We know nothing more about the second.

jacquesm 1 day ago 0 replies      
The real security risk is the ability to update the phone's OS without user authorization at least as strong as the original protection the FBI is trying to break.

Right now it all hinges on Apple's private key, and that's a very thin wire to hang all this privacy on.

teekert 1 day ago 1 reply      
Am I wrong to think that this brute forcing could still be applied if the raw memory chip were taken off the iPhone? The wipe-all-data feature requires write access to the chip plus some intelligence and monitoring. Those capabilities should be physically removable from the actual memory chip, right?
hoodoof 1 day ago 0 replies      
A company with courage. Hard to believe, when virtually no institution, governmental or corporate, has it.
tdaltonc 1 day ago 0 replies      
What are the odds that Apple has been ordered to do this before, but every other time they were asked it was in a FISA court? That would mean that this is the first time they've been allowed to talk about it.
NinoScript 17 hours ago 0 replies      
Can't they dump the drive's data to protect it from being erased?
lucio 1 day ago 0 replies      
Being realistic, how many fewer iPhones will Apple sell if they remove the SE? How many people will not buy an iPhone if they are told that their info can be accessed with a judge's warrant? I'm guessing a 0.1% drop in sales.
DannoHung 1 day ago 0 replies      
Cook says any iOS device could be breached if this software were created. But other articles have led me to believe that any iOS model with Touch ID is immune, due to the Secure Enclave being in play even for non-Touch ID passcode access. Is this wrong?
Kenp77 1 day ago 0 replies      
I'm sorry, but "Smartphones, led by iPhone"? A bit presumptuous.
atmosx 1 day ago 0 replies      
I thought Apple already had backdoors. I feel relieved that my iPhone is not backdoored, and I'm also very happy for a company whose products I use daily.
cant_kant 1 day ago 0 replies      
cmsimike 1 day ago 0 replies      
Heads up on something I just recently discovered: if your iPhone has Touch ID enabled, you can go into the Touch ID settings and selectively disable Touch ID for phone unlocking while keeping it for the App Store.
stefek99 1 day ago 0 replies      
"In the wrong hands, this software which does not exist today would have the potential to unlock any iPhone in someones physical possession."

Someone who believes in conspiracy theories would make a statement that "now it is official" :)

rajacombinator 1 day ago 0 replies      
I can't recall any previous instance of a megacorporation opposing the tyrannical US government. I fully expect Apple to lose here, but it is a valiant and rare effort.
znpy 1 day ago 0 replies      
Kudos to Apple for standing up to the US government and stand by its users.
okasaki 1 day ago 1 reply      
Since Apple is part of PRISM[0], the FBI can just ask the NSA.

[0] https://en.wikipedia.org/wiki/File:PRISM_Collection_Details....

rodionos 1 day ago 1 reply      
Can they publish a copy of the FBI letter? Otherwise, Apple's description feels a bit circumstantial and opinionated. I feel like I could make a better judgement on this whole issue if the request were made public.
hackuser 1 day ago 0 replies      
My guess is that the FBI can likely access the data without Apple's help. Based on what we know, how do we distinguish between these two situations, and which seems more likely?

A) Apple has created unbreakable security. The FBI cannot access the data and needs Apple's help.

B) iPhone security, like all other security, is breakable. iPhones are a very high-value target (all data on all iPhones); therefore some national security organizations, probably many of them in many countries, have developed exploits. The FBI, following normal practice, does not want to reveal the exploits or capability and therefore must go through this charade.

joelbondurant 1 day ago 0 replies      
Tim Cook admits iOS is already back-doored in the most weaselly worded message I've ever seen.
thorn 1 day ago 0 replies      
I wonder what the response of other manufacturers making Android phones will be.
nateberkopec 1 day ago 1 reply      
Is there any doubt that when the FBI brings up a law from the 1700's to justify breaking digital encryption in 2016 that they are completely making it up as they go along?
neves 1 day ago 0 replies      
What happened that companies can now talk about these government requests? The most nefarious thing about these government orders concerning terrorism was that the companies were forbidden to discuss them publicly.
empressplay 1 day ago 0 replies      
"Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE."

People hyperventilating that the tool could be used to crack other phones can relax, given the last clause in the quoted text (from the actual order).

agebronze 1 day ago 0 replies      
Actually, someone other than Apple is already able to do what the court warrant requests (brute force the passcode on a locked iPhone): ih8sn0w has an iBoot exploit for the A5 chipset (same as the iPhone 5c), so he can probably boot an unsigned kernel and use public tools already published to crack the passcode. If some lone hacker can do it, don't be fooled for a minute that the NSA can't, or that the feds couldn't buy something similar from another hacker. This is Apple covering their ass with the press.
dawhizkid 1 day ago 0 replies      
So this was a work phone owned by his employer. Does that change things? Surprised they didn't have IT software installed already to monitor the device.
Overtonwindow 1 day ago 0 replies      
The way I read this, Tim Cook has not said it can't be done, only that it shouldn't be done. This leads me to suspect that Apple can decrypt your phone and knows precisely how to do it, but that doing so would disrupt their entire marketing campaign around safe and secure encryption.

I'm just a government relations guy, not a security person, so please forgive me, but I'm not sure where I fall on this. I want the FBI to be able to decrypt the San Bernardino attacker's phone. At the same time, I don't want the government to be able to decrypt my phone. This is one hell of a damned-if-you-do, damned-if-you-don't situation, and I'm really stuck.

LeicaLatte 1 day ago 1 reply      
Possible or not, the FBI seems to have formalized the issue using this opportunity. They are asking the questions they have been wanting to ask since the release of smartphones.
dovdov 1 day ago 2 replies      
"We have no sympathy for terrorists."

They felt the need to state that, huh?

partiallypro 1 day ago 1 reply      
In the future, once terrorists have TouchID iPhones, couldn't they just use the corpse's finger to unlock the phone?
intrasight 1 day ago 0 replies      
And what happens to the engineer tasked with writing this hack if he fails and ends up bricking the phone?
pmarreck 1 day ago 0 replies      
HN is the first place I came to for discussion on this and I just wanted to thank you all for keeping it civil, intelligent, and objective
jwiley 1 day ago 0 replies      
So only Apple has the ability to do this, not the US government. So we trust Apple but not the USG?
RalphJr45 1 day ago 0 replies      
Wouldn't many countries like Russia and China stop allowing the sale of iPhones or at least their use by government officials if the FBI succeeds?
castratikron 1 day ago 0 replies      
This is a great way to build public awareness for this issue. Hopefully this will allow more people to get involved in the fight.
droopybuns 1 day ago 0 replies      
Applying an update to break encryption would violate chain of custody and render the information obtained inadmissible in court.
tomelders 1 day ago 0 replies      
Remember, iPhones are available worldwide. If the US wants to play world police, then I want a vote in the US election.
mesozoic 1 day ago 0 replies      
And the government wonders why people from tech don't want to work for it.
phkamp 1 day ago 1 reply      
The fact that Apple indicates it would be able to produce such a software version is, in itself, a backdoor in the iPhone.
guylepage3 1 day ago 0 replies      
All of a sudden I'm starting to think my PiPhone is looking pretty good.
bmoresbest55 1 day ago 0 replies      
As much as I would love to believe in Apple (and any other large tech company), a part of me still thinks that maybe they are working with the government in this letter. The FBI knows that the average US citizen does not want to be hacked. What is to stop the FBI from allowing Apple to say these things and put on a show publicly while simultaneously giving over the 'master key' anyway?
kyle4211 1 day ago 0 replies      
It took me a bit, and I believe no one has summarized this very well yet.

FBI: "You've built a device that makes it nation-state-difficult to install custom software without DRM keys. We'd like you to assist us in deploying software signed with your keys."

Apple: "That feels way too much like asking a CA to sign a cert for you, so fuck off."

I'm honestly not sure which side I'm on here.

yummybear 1 day ago 1 reply      
I can't read it from the letter - are they going to refuse to cooperate? Can they do that?
maindrive 1 day ago 0 replies      
I think Apple is trying to prove that they don't give any user data to agencies: a PR stunt. But they got it fundamentally wrong, because this actually was a case of national security, given the attack that happened. So it's a huge PR stunt, but an own goal.
HoochTHX 1 day ago 0 replies      
This is the FBI going after a Parallel Construction path. They already have all the information from the NSA bag o'tricks, but none of those can be used in court. But an unlocked phone unlocks the legal obstacles.
Dolores12 1 day ago 0 replies      
You don't own Apple hardware, so you can't protect your device.
boredatnight12 1 day ago 0 replies      
Hmmm. If this pans out in Apple's favor, I may finally buy an iPhone.
supergirl 1 day ago 0 replies      
Sounds like the backdoor already exists, but only Apple knows how to use it; it's as if Apple knew a master password for this phone but refused to give it up. They are saying they don't want to give it because once the FBI has it, the FBI is free to use it anywhere. A pretty strange post from Apple.

Probably they will try to fight this request by arguing that the government is effectively asking them to remove security from all phones (of this model, at least); they would be happy to help break this one phone as long as it doesn't affect any other phone.

In that case, Apple should just break into the phone and give it back to the FBI after removing the backdoor.

a-b 1 day ago 0 replies      
Well, no one is protected from thermo-rectal cryptanalysis. The only difference is that the government guys want to keep it hidden from the target.
unixhero 1 day ago 0 replies      
Plot twist.

This is actually the result of a barter: the government gets some low-level TOP SECRET access in exchange for this easy-access code, and Apple gets to go public to keep the populace calm and pretend it is fighting this thing.

mrmondo 1 day ago 0 replies      
Mods: can you please update title to add some context?
fiatjaf 1 day ago 0 replies      
Cry, US Government!
thrillgore 1 day ago 0 replies      
Am I the only one buying a new iPad because of this announcement?
at-fates-hands 1 day ago 1 reply      
The easy solution is for the government to send Apple the phone. Apple breaks into it themselves, then hands back the phone with the passcode turned off and whatever software they needed to install removed, leaving no trace of how they actually did it.


No software backdoor is created, the FBI gets its data, and we all go on with our lives. Why are we spending so much time gnashing teeth over something that has a very simple solution?

wildmXranat 1 day ago 1 reply      
To all the in-love-with-Apple downvoters: please read Schneier's sound analysis of the same type of situation that RIM (BlackBerry) faced: https://www.schneier.com/blog/archives/2010/08/uae_to_ban_bl...

/quote:"RIM's carefully worded statements about BlackBerry security are designed to make their customers feel better, while giving the company ample room to screw them." /endquote

I have lost enough points on this thread to simply double down on this issue.

This is not a good sign at all. While Google can't compete with Apple on the principle of "not spying on their users", all Apple has to do is publicize that principle and then ask forgiveness from its users later.

7GZCSdtn 1 day ago 0 replies      
As a software developer I'm always looking for the real bug. Weapons kill, not iPhones.
satyajeet23 23 hours ago 0 replies      
Dear Tim Cook,

Thank you!

puppetmaster3 1 day ago 0 replies      
I'm a libertarian, but an Islamic terrorist's phone is just evidence; Apple must unlock it for the FBI.
ratfacemcgee 1 day ago 0 replies      
I have never been more proud to have worked for Apple. Tim isn't afraid to give the government the old double forks when it counts!
Shivetya 1 day ago 0 replies      
Good for them. Freedom comes with a price, sometimes that price of freedom is protecting the privacy of the worst of us to protect all of us.
joering2 1 day ago 0 replies      
> And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Good one, Tim! I mean, how long did law enforcement think they could abuse the constitution, put spy devices on people's cars without warrants, use stingrays, and do all sorts of other crazy stuff, including planning and executing false-flag attacks, without any consequences whatsoever? At some point, we the people, for good reason, will lose any trust we have in them! And that's what Tim is saying in this one sentence, which the US government would have a hard time arguing against given the overwhelming evidence.

jsprogrammer 1 day ago 1 reply      
If it is possible to build the requested OS, then it can be said that the iPhone already has a backdoor.

If the device were truly locked down, there would be no aftermarket way to unlock it.

My understanding is that Apple was asked to supply software that would prevent their software from destroying evidence on a particular device. They should comply with this order, especially given the device in question.

joeclark77 1 day ago 1 reply      
Can someone explain this to me: if the data is encrypted, how does switching the operating system out enable one to read the data? I'm a layman in this area but I can only surmise that the data is stored unencrypted and it's the operating system itself that's somehow locked. If a change of operating system can open up encrypted data, then what's the point of encrypting hard drives or data sent over a network?
blisterpeanuts 1 day ago 0 replies      
Everyone in the U.S., please write to your Congressional representatives and also to the Presidential candidates you support. They need to know they can't get away with this.
ageofwant 1 day ago 0 replies      
I am happy AAPL is taking this stance. But I can't help but believe that it has very little to do with liberty, and very much to do with the bottom line. Either way, I guess we should be grateful for small mercies.
Zigurd 1 day ago 0 replies      
This is why technology companies have to go farther than implementing proprietary security systems: They have to put the capability to circumvent security out of reach of themselves.

Real data security has to be a mix of services that are friendly to reliable key exchange and strong unbreakable encryption, and verifiably secure endpoint software, which in practice means open source software where the user can control installation, that implements encryption.

tosseraccount 1 day ago 0 replies      
Doesn't the phone belong to San Bernardino County?
berkeleynerd 1 day ago 9 replies      
A friend of mine at Apple reported multiple black vehicles (Lincoln Town Cars and Escalades), at least one with MD license plates, at the Apple Executive Briefing Center this morning between 11AM and noon. Occupants had earpieces and sunglasses and were accompanied by a CHP (California Highway Patrol) cruiser and three motorcycle escorts. I suppose it's possible this was a quick (less than 1 hour) VIP stop, but given Tim's message last night, as well as the reaction of folks on campus who were bandying about comments like "I don't want to work on this because I don't want to be deposed," the impression certainly was that it was not a friendly visit. Given Tim's very public push-back, I'd think delivery of an NSL with accompanying intimidation is at least possible. I submitted this to HN and updated in real-time. There's a bit more discussion here:


lifeisstillgood 1 day ago 1 reply      
Does anyone have a decent architectural overview of iPhone (6)? Security - these enclaves etc sound good but devil is in the details
caogecym 1 day ago 0 replies      
Under what kind of pressure would Tim write this public letter?
dang 1 day ago 0 replies      
Please don't post unsubstantive comments.

We detached this comment from https://news.ycombinator.com/item?id=11116803 and marked it off-topic.

zobzu 1 day ago 3 replies      
What I'm reading is that Apple can remotely install an update that disables encryption. They don't want to do it.

But that they have the capability is a bit scary.

ogezi 1 day ago 0 replies      
It's a slippery slope.
blazespin 1 day ago 0 replies      
Apple should be more clear that this is 5C and not the latest version.
jaboutboul 1 day ago 0 replies      
Just unlock the freaking phone for them...
rogersmith 1 day ago 0 replies      
Gotta give it to Apple, they sure know how to pull off a PR stunt.
z3t4 1 day ago 1 reply      
Instead of the FBI paying Apple engineers to hack a phone, why don't they ask their kids!? It would probably save millions of dollars.
amelius 1 day ago 2 replies      
Question: is it possible to design a cryptographic system that, whenever it is accessed by a third party (government), this is made publically visible in a log? Can blockchain technology help here?
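One building block for the kind of publicly visible access log the question asks about is a hash-chained, append-only log (the same idea behind Certificate Transparency). A minimal sketch in Python; the entry format and function names here are hypothetical illustrations, not any real system's API:

```python
import hashlib
import json

def append_entry(log, event):
    """Append an access event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    log.append({"event": event, "prev": prev_hash,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify_chain(log):
    """Tampering with or deleting any past entry breaks every later hash."""
    prev_hash = "0" * 64
    for entry in log:
        body = json.dumps({"event": entry["event"], "prev": prev_hash},
                          sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, "government access request #1")
append_entry(log, "device unlocked under warrant")
assert verify_chain(log)

log[0]["event"] = "nothing happened"   # silent edit of history...
assert not verify_chain(log)           # ...is detectable
```

The hard part, which this sketch does not solve, is forcing the accessing party to write to the log at all: publishing the head hash widely (for instance in a blockchain, as the question suggests) prevents retroactive rewriting, but not omission of an entry in the first place.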
droithomme 1 day ago 0 replies      
Well, they seem to be saying that the approach they describe - making a modified OS - would actually work to circumvent encryption on a preexisting device. That means they already know the device is not really secure.

They aren't talking about putting a back door into systems to be used in the future; they are saying it's feasible to place a backdoor on a device already out there and then use that backdoor to access the device. That means the device is not actually secure.

planetjones 1 day ago 9 replies      
With due legal process, the police can search property, safety deposit boxes, bank accounts, vehicles, etc. Why should a smartphone be any different just because Apple says it is?

As much as I value privacy, I really don't agree with Apple's stance here - if due legal process has been followed, why shouldn't they be able to read the contents of an iPhone?

And yes I get that third party encryption can be used, which isn't owned by Apple and that there's little the authorities could do about it - but that's not the case at hand here.

Twisell 1 day ago 0 replies      
[In walk the drones]

"Today we celebrate the first glorious anniversary of the Information Purification Directives.

[Apple's hammer-thrower enters, pursued by storm troopers.]

We have created for the first time in all history a garden of pure ideology, where each worker may bloom, secure from the pests of any contradictory true thoughts.

Our Unification of Thoughts is more powerful a weapon than any fleet or army on earth.

We are one people, with one will, one resolve, one cause.

Our enemies shall talk themselves to death and we will bury them with their own confusion.

[Hammer is thrown at the screen]

We shall prevail!


On January 24th Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'"


Apple Superbowl AD "1984"

Transcription courtesy of George Gollin, 1997

Edit: Removed the link to the video. My goal wasn't to draw traffic anywhere; it was just to point out that some of the Big Brother sentences in an ad aired 30 years ago still have strong resonance today.

"Our enemies shall talk themselves to death"Hum... just read yesterday that NSA is believed to use machine learning over cell big-data to determine drone target...

wildmXranat 1 day ago 0 replies      
Not getting an iPhone, even secured - Check!

I bet hardware vendors are just salivating at the concept of having to produce thousands of iPhone cracking docking stations.

lunasight 1 day ago 0 replies      
>The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government's efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that's in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we've offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software, which does not exist today, would have the potential to unlock any iPhone in someone's physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

This is just awful - they admit to helping the FBI. How can we trust them?

rms_returns 1 day ago 3 replies      
This is quite unlike Apple. Is this the same company that insists on keeping its source proprietary and is always against FOSS? The idea that you care for your users' privacy yet still keep control over them by not giving them the freedom to modify the source code is not something I buy.
Vulkan is Here khronos.org
900 points by ekianjo  2 days ago   201 comments top 34
Udo 2 days ago 2 replies      
This chart sums it up pretty well:


There is no reason why any given app developer would want to talk to Vulkan directly. It's basically only suitable for a subset of engine developers and low level API programmers.

I also heard some people who were under NDA might have some disappointing things to report about the Vulkan decision process now that it's out. But the question is whether it matters at all whether it's a good API or not, given my impression that it's apparently not intended for interfacing to humans but to higher layer APIs.

bd 2 days ago 3 replies      
For reality check read "Talos Principle" FAQ on their Vulkan port (probably the most complex non-toy Vulkan application you can try today):


TLDR: optimism for the future but right now performance is actually noticeably worse than their DirectX 11 rendering backend.


A: Ok, first, in GPU-bound scenarios (ultra settings, resolution higher than full HD), you'll see lower performance, 20 to 30% lower. This is work in progress, and we (both Croteam and IHVs) are analyzing and optimizing the performance. We'll get to the bottom of this!

Q: And CPU-bound scenarios?

A: Same or a bit faster. But for now, those scenarios really have to be super-CPU-bound. Like, rendering whole levels, without any help from visibility system, frustum, distance or occlusion culling.

shmerl 2 days ago 5 replies      
Congratulations! It's a major step forward.

Note this however:

> "Vulkan takes cross-platform performance and control to the next level," said Bill Hollings of The Brenwill Workshop. "We are excited to be working through Khronos, the forum for open industry standards, to bring Vulkan to iOS and OS X."[1]

Of course Apple had to remain the jerks they are and not support Vulkan natively, forcing developers to write translation layers[2]...

[1] https://www.khronos.org/news/press/khronos-releases-vulkan-1...

[2] https://moltengl.com/metalvk/

unsigner 2 days ago 9 replies      
NOT a successor. It's a very different beast. It's frequently described as a "low-level" API, but "explicit API" is more correct. It gives you control (and responsibility) for things that happen behind your back in OpenGL, e.g. semantics of sharing of resources between the CPU and the GPU, explicit separate access to many GPUs, explicit separation of command buffer building and submission etc.

It will live side-by-side with OpenGL for the foreseeable future. It's just targeting the same general area (graphics using GPUs) and is standardized by the same folks (Khronos).

pavlov 2 days ago 5 replies      
As usual, not available for Mac.

If this follows the usual trajectory, we'll have a sort-of-working implementation of Vulkan 1.0 in OS X 10.13 (although it will kernel panic if you look at it the wrong way).

But maybe Apple's stance on Vulkan is different now due to their own Metal API? Is Apple involved in Vulkan at all?

hatsunearu 2 days ago 4 replies      
The website says it is also a compute API. I haven't heard this before; I didn't know Vulkan came with a GPGPU stack. Does anyone have any experience looking at the compute aspects of Vulkan and how it compares to OpenCL?

I wanted to take a squiz at making GPU compute code for "fun" and I'm wondering if Vulkan compute is worth looking at.

vegabook 2 days ago 0 replies      
Thank you AMD. Vulkan is essentially a development of Mantle, AMD's next-gen graphics API, and without that forward-thinking leadership, we'd still be messing with (closed) D3D on Windows, (closed) proprietary per-game drivers from greedy Nvidia, and the outdated OpenGL.

Kudos all the more deserved for the fact that AMD doesn't have the financial firepower of its competitors.

ekianjo 2 days ago 2 replies      
And here's a chart on whether or not you should switch to Vulkan:


dman 2 days ago 1 reply      
Don't notice any AMD products at https://www.khronos.org/conformance/adopters/conformant-prod... even though AMD already has drivers out. What's up with that?
Benjamin_Dobell 2 days ago 1 reply      
> Google gives you everything you need to incorporate Vulkan into your Android games and other apps where graphics performance is key

Err, do they? Where?

I was under the impression this isn't implemented yet and is expected to be included in Android 7.

speps 2 days ago 1 reply      
bitmapbrother 2 days ago 1 reply      
A video showing the dramatic performance difference between Vulkan and OpenGL on an Android TV device:


maufl 2 days ago 1 reply      
Could OpenGL be implemented as a library on top of Vulkan? So that future drivers will only implement the Vulkan API and if you want OpenGL, you just use an OpenGL library?
coetry 2 days ago 1 reply      
This is finally my chance to delve into graphics hacking. I'm interested in bindings to Common Lisp with this >:)
znpy 2 days ago 5 replies      
Dumb question: what will change for end-users?
auvi 2 days ago 0 replies      
Great news! I am waiting for the book titled "Vulkan Programming Guide: The Official Guide to Learning Vulkan (OpenGL) 1st Edition" to come out. Looks like it will be out in August 2016.
diakritikal 2 days ago 0 replies      
I think what's most interesting is strong multi threading support in conjunction with an intermediate representation.

Looking forward to seeing SPIR-V compilers in other languages...

frik 2 days ago 1 reply      
I am looking forward to a possible "WebVulkan" analogous to WebGL.
gulpahum 2 days ago 0 replies      
The reason I have been waiting for Vulkan is to get rid of OpenGL driver bugs. I hope that being closer to the metal means fewer bugs.
nercury 2 days ago 0 replies      
It is rare to see a cross platform release with so much collaboration behind the scenes. Extremely exciting!
tgb 2 days ago 1 reply      
Does this do anything to address the difficulties of sharing the gpu resources among processes, particularly concerning untrusted code like a website? Even opengl es can crash my video driver easily under a heavy load and it's more careful than opengl.
advanderveer 1 day ago 0 replies      
Would it be worthwhile to write a Golang API on top of such a low level implementation or would that just negate the reason it exists in the first place?
bd 2 days ago 3 replies      
Quick Vulkan starter pack (if you just want to see if it runs on your system, without compiling anything).

Windows-centric, but many things also available for Linux and Android.

------- Drivers -------

Nvidia GPU driver: https://developer.nvidia.com/vulkan-driver (worth trying even if your GPU is not listed as supported; e.g., 900 series mobile Maxwell GPUs like the GTX 970M or GTX 980M work OK)

AMD GPU driver: http://support.amd.com/en-us/kb-articles/Pages/Radeon-Vulkan...

------- Demos -------

Nvidia Vulkan Choppers demo: https://nvidia.app.box.com/s/rj9nzxkpz21qu34h8zew301kray9urb...

Nvidia Vulkan Fish demo (threaded rendering): http://developer.download.nvidia.com/mobile/shield/assets/Th...

Vulkan examples binaries by Sascha Willems: http://vulkan.gpuinfo.org/examples.php

- requires assets from source distribution (create "bin" subfolder and place binaries there): https://github.com/SaschaWillems/Vulkan

- also may need to install Visual Studio 2015 redistributables: https://www.microsoft.com/en-us/download/details.aspx?id=481...

------- Games -------

Talos Principle: http://store.steampowered.com/app/257510/ (there is a free demo available)

See how to enable Vulkan support here: http://steamcommunity.com/app/257510/discussions/0/412447331...

------- Tools -------

Vulkan HW capabilities database: http://vulkan.gpuinfo.org/

Vulkan Hardware Capability Viewer: http://vulkan.gpuinfo.org/download.php

robohamburger 2 days ago 0 replies      
I haven't read the docs yet but I wonder if this means there could be an open source d3d or opengl implementation that uses vulkan as its backend.

Also: it is a shame the API docs appear to be behind a login page. Hopefully that will change! The nice thing about OpenGL was that (at least when I used it) the docs were easy to get at.

kbwt 2 days ago 0 replies      
Quick Reference: https://www.khronos.org/registry/vulkan/specs/1.0/refguide/V...

The Khronos site seems to be overloaded right now.

Bytes 2 days ago 0 replies      
It will be interesting to see the adoption rate for games and other applications now that it has been released.
Aardwolf 2 days ago 1 reply      
Too bad the sample code linked from multiple locations is empty... only a README.md: https://github.com/KhronosGroup/Vulkan-Samples
bane 2 days ago 0 replies      
abrodersen 2 days ago 1 reply      
Mac Support is coming: https://moltengl.com/metalvk/
wenderen 2 days ago 3 replies      
Apparently the GeForce 800 series doesn't support Vulkan :(


Why is this so, any idea?

Eduard 2 days ago 0 replies      
So many teasered demos, but no video to look at...
bovermyer 2 days ago 0 replies      
Vulkan remains, and will only ever be, the Primarch of the Salamanders for me. Sorry, I guess my gamer culture overrides my tech nerd culture.
meerita 2 days ago 0 replies      
Where is Apple on that page?
bijbij 2 days ago 0 replies      
I dream of seeing such a masterpiece enabled in web browsers through JavaScript.
Apple ordered to bypass auto-erase on San Bernardino shooter's iPhone techdirt.com
682 points by bgentry  2 days ago   344 comments top 40
philip1209 2 days ago 5 replies      
A thought experiment: Let's say the government makes hardware encryption standards in the style of FedRAMP that sets standards for preventing tampering by foreign governments. Then, imagine that a consumer electronics company voluntarily makes all devices comply with this standard. Could a court attempt to compel the company to defeat the standards which the government set as tamper-proof against governments?

A second: What happens if Apple states that it will take a 50-person team with an average annual labor cost of $200K/person approximately 5 weeks to fix the problem with a 50% chance of success. Can Apple bill the court a million dollars to try to fix the issue?

A third: Apple open-sources their encryption modules and firmware. They no longer have proprietary information for how to unlock the phone. Are they legally required to be the ones who defeat a system to which they hold no proprietary information?

A fourth: The small team that built the system no longer works for Apple. Perhaps their visa was revoked and they left the country, perhaps they were poached by a competitor, or perhaps they retired in the years since this module was published. Who is responsible for complying with the order?

A fifth: The data is actually corrupted. Apple presents this conclusion under penalty of perjury after a thousand hours spent on the project, which it requests are compensated.

A sixth: Apple requests that trading of its stock is frozen for one month while it expends considerable resources on complying with an unexpected court order relevant to national security.

tptacek 2 days ago 8 replies      
Remember, this is an iPhone 5C, which doesn't have Touch ID or the Secure Enclave; the security model for this phone is significantly different from that of more recent iPhones.

On phones with a Secure Enclave, the wipe-on-failures state is managed in the coprocessor (which runs L4), and is not straightforwardly backdoor-able.

If you're worried about the police brute-forcing your phone, enable Touch ID and set a passcode that is approximately as complex as the one on your computer.

matt_wulfeck 2 days ago 1 reply      
If you read the iOS security guide you'll know Apple built the phone in such a way as to wash its hands of these types of requests. They'll say it's impossible and they won't be lying. Nothing is ever impossible, but it will be very impractical. The hardware and software are built to ensure this.

I think the real game here is to compel Apple to build a backdoor into future models. I expect to see a lot of rhetoric around this fact, until something forces Apple's hand.

tzs 1 day ago 0 replies      
The article at Errata Security [1] is better. There is an HN submission for it [2], but it hasn't drawn any attention.

In particular, it addresses technical issues not covered in the Techdirt article that are relevant to many of the existing comments here on HN.

[1] http://blog.erratasec.com/2016/02/some-notes-on-apple-decryp...

[2] https://news.ycombinator.com/item?id=11115251

whatwhatwhat999 2 days ago 1 reply      
Unfortunately, there's no good outcome here.

If Apple can decrypt the phone, it will prove to everyone that backdoors exist. If they can't, and they tell the FBI as much, it will just give politicians more reason to sound off about how we have to have backdoors, because this shooter was a "terrorist" after all, and we just have to suck it up and do whatever is necessary to go after people like that.

Either way, we end up with backdoors.

headmelted 1 day ago 4 replies      
For me, the most interesting question I would have is absent from the article.

The court is basically ordering Apple to produce new firmware that doesn't block brute forcing. If Apple were to comply, who keeps this firmware after the fact?

There's no mention of this at all, but if the firmware image stays with the FBI then the implications are much more profound with regard to privacy.

ars 2 days ago 2 replies      
> Apple ... will probably have little time to debug or test it overall, meaning that this feature it is being ordered to build will almost certainly put more users at risk.

Eh? They are not being asked to ship it to the public at large, just to apply it to one phone.

Of all reasons to object, this reason makes little sense.

rburhum 2 days ago 2 replies      
So if I get this right, they want to (1) disable the wipe feature after x retries (thereby enabling unlimited retries) and (2) allow submitting attempts via a connector, Wi-Fi, or Bluetooth (thereby enabling a brute-force approach). What good is an encrypted filesystem in that scenario?
cant_kant 1 day ago 0 replies      
The key parts of the Federal order:

"Apple's reasonable technical assistance shall accomplish the following three important functions:

(1) it will bypass or disable the auto-erase function whether or not it has been enabled;

(2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and

(3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE.

The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.

The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way."

zekevermillion 1 day ago 0 replies      
If Apple is capable of compromising security on its devices (by using its root key to sign a custom version of iOS, or through some other method), then I see no way that they will avoid eventually being subject to a court order in some jurisdiction that compels this action. If that's true, then device security is already compromised and Apple knows this. Let's say the facts of the case were slightly different, that the FBI "knows" a terrorist attack is about to occur, and Jack Bauer-style demands that Apple assist in compromising a specific device that has the top seekrit plans on it. In that instance, do you think Apple would comply with a warrantless request for cooperation? Hm...

Reading Tim Cook's announcement in light of this thought experiment, methinks he doth protest too much! Apple does not have any objection to compromising user security at the root level, and in fact has already done so by creating a device that has some limited vulnerability to malicious action by the manufacturer signed with its root key. (By the way, no doubt every other manufacturer has done worse, so this is not to deprecate Apple vs. any other big company.)

I would speculate that Tim Cook's goals with this announcement are largely PR-based, and that the goal of Apple's legal strategy is not to avoid cooperation but rather to retain the ability to decide whether to cooperate, and/or to impose a higher perceived cost on the government for such requests. No doubt Apple is correct to say that once a precedent is established, then it will be widely used by law enforcement even in routine cases.

At the end of the day, I am not optimistic that we can avoid a world in which large device manufacturers are compelled (legally and practically) to build security flaws into their devices. Perhaps not the flaw of a back-doored crypto implementation, but other flaws such as those that have been identified in current iOS devices that allow the government (with commitment of sufficient resources) to chip away at some of the more superficial protections.

kirykl 1 day ago 0 replies      
The implications are quite important for future technologies. Neural implants, for example, are currently used for prosthetics and paralysis. A forced backdoor would kill all research to develop a co-processor directly linked to the brain. Who would want a government backdoor directly to the brain?
hardmath123 2 days ago 0 replies      
Robert Graham (Errata Security)'s notes on this: http://blog.erratasec.com/2016/02/some-notes-on-apple-decryp...
Myrmornis 2 days ago 3 replies      
Why the worry about auto-wiping? Is it not possible to make a copy of the encrypted data and then play around with it as much as you want?
blakecaldwell 2 days ago 2 replies      
Does Apple get to bill the FBI for the time that their engineers and legal department will be busy on this request?
elgenie 2 days ago 0 replies      
"""To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order."""

If what Apple's security guides claim is true, "unreasonably burdensome" should be an easy standard to meet on practical technical feasibility grounds. The issue is whether they'll want to challenge this on non-technical grounds.

SideburnsOfDoom 1 day ago 1 reply      
So, Apple says that "the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation."


If it's possible to make such a "backdoored" build of iOS, then there are state actors who will be throwing $Millions at doing it already, with or without any willing help from Apple.

kevinastone 2 days ago 3 replies      
Why wouldn't the FBI just clone the phone disk contents and crack the encryption on more dedicated systems?
nanocyber 1 day ago 4 replies      
I thought this was an excellent write-up regarding how the iOS security platform (recent iPhone models) works from someone obviously in the know, as posted in the forums of Apple Insider. (Source: http://forums.appleinsider.com/discussion/191851)

" Apple uses a dedicated chip to store and process the encryption. They call this the Secure Enclave. The secure enclave stores a full 256-bit AES encryption key.

Within the secure enclave itself, you have the device's Unique ID (UID). The only place this information is stored is within the secure enclave. It can't be queried or accessed from any other part of the device or OS. Within the phone's processor you also have the device's Group ID (GID). Both of these numbers combine to create 1/2 of the encryption key. These are numbers that are burned into the silicon, aren't accessible outside of the chips themselves, and aren't recorded anywhere once they are burned into the silicon. Apple doesn't keep records of these numbers. Since these two different pieces of hardware combine together to make 1/2 of the encryption key, you can't separate the secure enclave from its paired processor.

The second half of the encryption key is generated using a random number generator chip. It creates entropy using the various sensors on the iPhone itself during boot (microphone, accelerometer, camera, etc.) This part of the key is stored within the Secure Enclave as well, where it resides and doesn't leave. This storage is tamper resistant and can't be accessed outside of the encryption system. Even if the UID and GID components of the encryption key are compromised on Apple's end, it still wouldn't be possible to decrypt an iPhone since that's only 1/2 of the key.

The secure enclave is part of an overall hardware based encryption system that completely encrypts all of the user storage. It will only decrypt content if provided with the unlock code. The unlock code itself is entangled with the device's UDID so that all attempts to decrypt the storage must be done on the device itself. You must have all 3 pieces present: The specific secure enclave, the specific processor of the iphone, and the flash memory that you are trying to decrypt. Basically, you can't pull the device apart to attack an individual piece of the encryption or get around parts of the encryption storage process. You can't run the decryption or brute forcing of the unlock code in an emulator. It requires that the actual hardware components are present and can only be done on the specific device itself.

The secure enclave also has hardware enforced time-delays and key-destruction. You can set the phone to wipe the encryption key (and all the data contained on the phone) after 10 failed attempts. If you have the data-wipe turned on, then the secure enclave will nuke the key that it stores after 10 failed attempts, effectively erasing all the data on the device. Whether the device-wipe feature is turned on or not, the secure enclave still has a hardware-enforced delay between attempts at entering the code: Attempts 1-4 have no delay, Attempt 5 has a delay of 1 minute. Attempt 6 has a delay of 5 minutes. Attempts 7 and 8 have a delay of 15 minutes. And attempts 9 or more have a delay of 1 hour. This delay is enforced by the secure enclave and can not be bypassed, even if you completely replace the operating system of the phone itself. If you have a 6-digit pin code, it will take, on average, nearly 6 years to brute-force the code. 4-digit pin will take almost a year. if you have an alpha-numeric password the amount of time required could extend beyond the heat-death of the universe. Key destruction is turned on by default.

Even if you pull the flash storage out of the device, image it, and attempt to get around key destruction that way, you won't succeed: the key isn't stored in the flash at all, only within the Secure Enclave itself, whose internal storage can't be removed or imaged.

On each boot, the Secure Enclave creates its own temporary encryption key, based on its own UID and a properly seeded random number generator, which it uses to encrypt the full device encryption key while it sits in RAM. Since the device key is held in RAM only in this encrypted form, it can't simply be read out of system memory by tapping the RAM bus.
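The pattern of keeping the master key in RAM only under a per-boot ephemeral key can be sketched like this. This is a toy XOR-mask illustration, not Apple's design; a real implementation would use a proper authenticated cipher such as AES-GCM, which isn't in the Python standard library:

```python
import hashlib
import hmac
import os

def keystream(key, n):
    """Derive n pseudorandom bytes from key (toy HMAC-SHA256 counter mode)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hmac.new(key, counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:n]

def wrap(ephemeral_key, data):
    """XOR-mask data under a keystream; applying wrap twice unwraps it."""
    ks = keystream(ephemeral_key, len(data))
    return bytes(a ^ b for a, b in zip(ks, data))

ephemeral = os.urandom(32)            # regenerated on every boot, never persisted
device_key = os.urandom(32)           # the full device encryption key
in_ram = wrap(ephemeral, device_key)  # only this masked form ever sits in RAM
assert wrap(ephemeral, in_ram) == device_key
```

Reading the RAM bus yields only `in_ram`; without the ephemeral key, which lives solely inside the enclave, the masked bytes are useless.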

The only way I can see to potentially unlock the phone without the unlock code is to use an electron microscope to read the encryption key out of the Secure Enclave's own storage. That would take considerable time and expense (likely millions of dollars and several months), and it assumes the Secure Enclave chip isn't built to resist this kind of attack: the chip could be physically designed such that the very act of exposing the silicon to read it is itself destructive.

mchahn 2 days ago 2 replies      
Why can't they just pull the flash memory and work on it directly?
DyslexicAtheist 1 day ago 0 replies      
>>A magistrate judge, an Apple employee, and an FBI agent agree to meet at a local bar. Only the Apple employee makes it. Why? Because the bar didn't have a back door.
tomohawk 1 day ago 0 replies      
The phone in question was not owned by the shooter, but by his employer, who has consented to the search. That seems like a weak basis on which to object to the search on privacy grounds.
mariuolo 1 day ago 0 replies      
Do I sound conspiratorial if I suspect this is all PR and the NSA already has ways around the encryption?

It would entice $EVILDOERS to use a compromised platform.

fab13n 1 day ago 0 replies      
Fortunately for the future, this kind of attack can be thwarted through key stretching (making each attempt intrinsically long to perform, by making it computationally expensive).

I expect to see an optional, configurable key stretching setup in future phones, for those whose privacy is worth a couple of seconds' delay when unlocking their phones.
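Key stretching is exactly what standard password-based key derivation functions provide. A minimal sketch with Python's built-in PBKDF2, where the iteration count is the knob trading unlock latency against brute-force cost; the salt here is a made-up placeholder:

```python
import hashlib

def stretch(passcode, salt, iterations):
    """Derive a 32-byte key; each verification costs `iterations` HMAC rounds,
    so an attacker pays that cost again for every single guess."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = b"per-device-salt"  # hypothetical; real systems use a random per-device salt
fast = stretch("123456", salt, 1_000)        # cheap to verify, cheap to attack
slow = stretch("123456", salt, 1_000_000)    # ~1000x costlier per guess
```

Doubling the iteration count doubles the attacker's total work while adding only a fraction of a second to a legitimate unlock, which is the trade-off the parent comment describes.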

frogpelt 2 days ago 2 replies      
Who goes to jail if Apple flat out refuses?
briankwest 1 day ago 0 replies      
The headlines keep changing: is it crypto, is it auto-erase, is it a purple unicorn? Sheesh, let's muddy up the actual issue with incomplete or inconsistent data. It's the American way!
curryhowardiso 1 day ago 0 replies      
Literally the one time the west coast would have needed Scalia...
empressplay 1 day ago 3 replies      
It bothers me that Tim Cook lied: he stated in his open letter that if they provided the modified OS it could be used on other phones, but the court order specifically says Apple should make the software only work on the specific phone in question.
j1vms 1 day ago 1 reply      
It's at times like these that they're surely knocking on the door of every company whose R&D in quantum computing, information theory, and algorithms they've been funding for at least the past two decades or so. "So, is it ready yet?"
dplgk 2 days ago 5 replies      
The 5th amendment protects evidence inside the brain of the accused. As devices becomes more and more an extension of the brain, the more I think we'll need to adjust the rule of the 5th amendment to cover things outside of the brain.
EGreg 2 days ago 4 replies      
I always wondered why more people don't go around bricking iPhones by entering the wrong pin several times. Same goes for any other lockout. Why not do this to someone famous by constantly logging in as them from a botnet?
gcb0 1 day ago 1 reply      
This is purely a smoke screen.

They can already desolder the flash memory chips and brute-force the data, programmatically no less, all they want.

dschiptsov 1 day ago 0 replies      
This is wrong. Engineers who make encrypted devices should do their best to make them undecipherable. This is a universal standard: do your best.

If encryption cannot be broken it means it has been done right, and engineers should have the highest respect.

Govts, on the other hand, should use appropriate policies, not orders or force or backdoors.

tosseraccount 1 day ago 0 replies      
It's not his phone. He worked for the government; it belongs to the government.
gizi 1 day ago 0 replies      
Big problem. People will generally stop using phones of which they suspect that they are back-doored. At the same time, it would be a hopeless endeavour for law enforcement to get a swarm of (Chinese or other Asian) companies to help with unlocking their phones. They would literally not even answer the phone. Therefore, this may very well spell the end of highly centralized, Apple-style companies that can be effectively pressured and browbeaten into "compliance".
venomsnake 1 day ago 3 replies      
Why is no one attacking at the hardware level? Cut open the processor to get the GID and UID, dump the flash, pregenerate rainbow tables keyed on the PIN, power the flash chip externally, and feed in the codes...

Yes, it's expensive, but I wouldn't be surprised if labs exist that could provide such a service. Why does the FBI go through such pains?

samfisher83 2 days ago 3 replies      
Can't they use the dead guy's fingerprint?
exabrial 1 day ago 2 replies      
Ok someone murders a bunch of defenseless people... Why is Apple dragging their feet? This is tasteless. I'm NOT for backdoors, but this is ridiculous.
doggydogs94 2 days ago 0 replies      
If Apple can jailbreak the phone in its current state, Apple (or the NSA) may be able to help.
ghettoimp 2 days ago 1 reply      
This order says Apple must:

- bypass the auto-erase feature

- enable the FBI to "submit" passcodes

- not purposefully introduce additional delays

I don't see that this requires Apple to do anything in particular with whatever passcodes the FBI submits.

bool tryPasscode(string passcode) { return false; }

Reasonable cost of service: $5?

yarou 2 days ago 1 reply      
They're effectively asking for a backdoor, plain and simple. I'd be highly surprised if Apple complied with this court order.

Even if they removed said feature, decrypting the FS would be possible if and only if the owner had used a weak passcode.

Can somebody explain to me how this warrant is not a direct violation of this individual's 4th amendment rights?

This seems like yet another case where the rights guaranteed by the Constitution are selectively applied based on your skin color.

3D printed sundial whose precise holes cast a shadow displaying the current time mojoptix.com
836 points by edward  5 days ago   92 comments top 23
gulpahum 5 days ago 2 replies      
Interesting: in the Southern Hemisphere, the sundial needs to point at the South Pole, and the design needs to be altered:

"in the Openscad script you can set a flag that will simply rotate upside-down the whole swiss cheese inside the gnomon, and build a Southern-hemisphere version of the gnomon. This way, the roo can simply use this Southern-hemisphere sundial exactly the same way a cow or a penguin would use their Northern-hemisphere version of the sundial, with just one difference: the roo will have to point the tip of his sundial toward the South Pole." [1]

[1] http://www.mojoptix.com/2015/10/25/mojoptix-001-digital-sund...

ant6n 5 days ago 1 reply      
Skip the intro and see it in action: https://youtu.be/wrsje5It_UU?t=12m58s
edejong 5 days ago 0 replies      
Probably just as impressive is this (patent just expired) digital sundial: http://www.digitalsundial.com/
tyingq 5 days ago 2 replies      
Very, very, cool. This got me searching for other implementations of a digital display sundial.

This one is interesting...uses a mask of slits http://www.fransmaes.nl/genk/welcome-e.htm (iframes, ugh...click #8 in the left sidebar)

Edit: Another cool implementation http://www.voshart.com/SUN-CUBE-prototype

StavrosK 5 days ago 6 replies      
Can anyone tell me how OpenSCAD compares to something like SolidWorks for designing objects? I'd much prefer to learn something open and programmable, but if SolidWorks is much easier to use for common cases, then I'd go with that.

I've never used any 3D designing program other than SketchUp.

IvyMike 5 days ago 1 reply      
This older one is pretty neat, too. I've owned it for a while and still can't really figure it out: the shadow mask is incredibly thin, and as far as I can tell it works by magic. http://www.digitalsundial.com/product.html
deutronium 5 days ago 1 reply      
Very cool!

Also apparently a 'digital' sundial was patented at one point - https://en.wikipedia.org/wiki/Digital_sundial

http://www.hineslab.com/digital-sundial/ - The notebook drawings are cool

theoh 5 days ago 1 reply      
There's a high profile public digital sundial in Paris, installed in 1989:


bdg 5 days ago 1 reply      
I doubt the claim that this can only be produced with a 3D printer. I can easily imagine mass-producing the layers out of wood and sliding them through bandsaws.
DickingAround 5 days ago 1 reply      
This kind of creativity is really something special to the 21st century; not just that people can make things but that they can make things which are so complex and otherwise would have been cost prohibitive before CNC technologies like 3d printing.
rahul286 5 days ago 4 replies      

I am wondering how much extra time the 3D print would take as we add support for 10-minute intervals, then 1-minute intervals, then seconds (an extra set of digits, HH:MM:SS).

Also, would second-level granularity even be practically possible?

techlover14159 3 hours ago 0 replies      
Wow. This is amazing!
zeristor 4 days ago 0 replies      
I loved this. Then I thought: what if you could use words which appear in the shadow over the day?

As ever, someone had already had the same idea, written the code to produce the OpenSCAD files for different wordings, and put it on GitHub:


mitchtbaum 5 days ago 1 reply      
Cool! Perhaps someone could cast a piece like this with concrete or baked clay inside a larger, sturdy building to display an alphanumeric message to future generations, only during specific alignments. I can see one potential benefit compared to hieroglyphs, tablets, pillars[0], and other megalithic messages that some of our ancestors left us; it would take different skills to warp its message than simply a chisel or spray paint. One fictional story that comes to mind, Indiana Jones' Staff of Ra[1], makes this seem like even if only for a treasure hunt game, it would make for a fun story later on.

0: https://en.wikipedia.org/wiki/Ashoka_Pillars

1: http://indianajones.wikia.com/wiki/Headpiece_to_the_Staff_of...

jhallenworld 5 days ago 0 replies      
It should display the time with Roman Numerals... then ancient Romans could use it.

This is very cool, I'm going to have to play with OpenSCAD.

soheil 5 days ago 3 replies      
This is just incredible; I would love to see a version with millisecond support :) It'd also be nice if the numbers didn't fade in/out but had a sharp cut-off point where they'd quickly switch.
wetmore 5 days ago 0 replies      
Something like this came up on hn a while back: https://news.ycombinator.com/item?id=8042673
tobr 5 days ago 1 reply      
Very clever! Is there any reason why you couldn't just carve out tunnels shaped like the numbers, using a stencil typeface?
supahfly_remix 5 days ago 2 replies      
How does the error vary with how far you are from the edge of the time zone?
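The dominant error is geometric: clock time and solar time differ by about four minutes per degree of longitude between you and your zone's central meridian, since the Earth turns 15 degrees per hour. A quick sketch, ignoring the seasonal equation-of-time wobble of up to roughly 16 minutes:

```python
def solar_offset_minutes(longitude_deg, zone_meridian_deg):
    """Minutes a sundial runs ahead (+) or behind (-) zone clock time.
    Earth rotates 15 deg/hour, i.e. 4 minutes of solar time per degree."""
    return (longitude_deg - zone_meridian_deg) * 4.0

# 7.5 degrees east of the central meridian: the sun culminates 30 min early.
offset = solar_offset_minutes(7.5, 0.0)
```

So near a time zone edge (about 7.5 degrees from the meridian for an idealized zone) a fixed sundial reads roughly half an hour off, before the equation of time is even considered.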
dominotw 5 days ago 0 replies      
Finally, a killer app for 3D printing. The future is here.
MrBra 5 days ago 1 reply      
so this proves that time really exists ;)
JoeAltmaier 5 days ago 0 replies      
Seems cool. But a 15-minute video? Pictures or it didn't happen.
GitHub responds to Dear GitHub letter github.com
763 points by joshmanders  6 days ago   331 comments top 44
thejameskyle 6 days ago 5 replies      
Some of the responses here are kinda obnoxious. I helped kickstart the original letter and I'm very happy about this response.

GitHub more than anything has been a blackbox, and this was a very notable first step towards opening up. It should be encouraged, not shut down.

Privately (and some publicly), in the past few days, a lot of major open source projects were discussing a coordinated move away from GitHub. This likely puts that all on hold, but we'll see what changes GitHub makes and how people want to respond to it.

Again, I'm very happy about this response, as should everyone in this thread. We'll see in the coming weeks what it really means though.

rogerbinns 6 days ago 9 replies      
I complain every 6 months about their "releases" page - eg this one for my project https://github.com/rogerbinns/apsw/releases

They unhelpfully add a "Source Code" link, which would be great if that is what it was. Instead, it is actually just a zip file of the repository at that tag. But the repository is in a maintainer state, so the "Source Code" isn't useful to an end user: various tools need to be run first (e.g. building help, automake/autoconf, dependencies). The people misled by GitHub because of this "Source Code" link contact me, not GitHub.

Every six months for several years, I send an email explaining how this serves no one's interest, how it hurts, and that I'd be happy with any solution (e.g. don't auto-add the link, change its name to make it clear, let me label an existing file as the "Source Code", etc.). I get the usual response of understanding and sympathy, and a vague suggestion that it may be addressed. Three years and counting...

greggman 6 days ago 9 replies      
I don't have any giant open source projects, so I guess I don't get bothered by the +1s and cries for help. Chromium gets those on its non-GitHub issues. So does Firefox. I'm guessing WebKit does too.

My biggest issue, which I guess is a non-issue for most others, is that I hate how, when reviewing a PR, every single line comment generates a message/email.

Maybe it's because I'm used to Rietveld but my normal workflow is to start reading the patch and adding comments. I might have made 10 separate line comments before I get to a part of the patch that invalidates all those comments. Therefore I don't want them sent until I'm ready. I want to be able to make line comments across all files in the PR and only when I'm ready, send them as one coherent message.

As it is now, AFAICT, the only way to do this is to write line comments in some other file on my computer and then later manually find the lines again in github and add them one at a time. Rietveld also lets you respond to line comments inline and mark them as "done".

Is there anything for GitHub that does the same? Maybe one of their offline tools?

eibrahim 6 days ago 8 replies      
Wow, it took them 29 days to say "we hear you, stay tuned".
bhouston 6 days ago 4 replies      
Brutal delay in responding to their most important constituent? What the hell? This wasn't a minor user's feature request, it was an open letter signed by a ton of major open source leaders. If I was an investor in GitHub I would fire the CEO.
Siecje 6 days ago 3 replies      
I wonder if this has more to do with ESLint[1] than the letter.


mirashii 6 days ago 5 replies      
Now, if only we could get a movement behind a "merge" button that does fast-forward only commits.
tptacek 6 days ago 5 replies      
From reading the comments on this thread, one could get the impression that HN Github users would have been happier had Github not responded at all.
devonoel 6 days ago 0 replies      
Github is a large bloated organization that's slow to move, more news at 6.

I don't see how it should shock anyone that it's taken GitHub 29 days to respond. GitHub can't move at the speed of, say, an open source project with a handful of contributors. They're a company of hundreds of people, and the larger an organization is, the harder it gets to make a decision about anything, because more people have to agree. In this case, they have to agree on how to respond to open criticism while saving PR face, which is a difficult task.

I'm fairly confident that the stink this whole Dear Github thing has raised will result in positive changes of some kind, but let's be real, Github is still just a corporation that faces a lot of the same problems other corporations face. They're probably scrambling to get a whole bunch of people to agree on what it is they should do.

punnerud 6 days ago 2 replies      
"Issues haven't gotten much attention from GitHub these past few years and that was a mistake, but we've never stopped thinking about or caring about you and your communities."

A lot of thinking but no action. Github, what have you been up to these years??

rafael-rinaldi 6 days ago 3 replies      
I would love to know what approach the people saying bad things about GitHub would actually consider ideal. If GitHub responded to issues right away, people would complain too.

It's not easy to come up with a solid plan for new features and improvements, especially in a company this big (that has a lot going on, as we all know). What, you think they're going to take that list of complaints and start smashing out code? The letter isn't even a month old. I can kinda understand the frustration, but keep in mind that there's a lot else to handle; it's more complicated than that.

In the meantime, if you're really struggling, there are other great options out there like Bitbucket and GitLab.

chappi42 6 days ago 2 replies      
Quite nice answer. - Googled for some background and stumbled upon the following:

[Danilo Campos has been known to tweet similarly strong views about diversity. Campos joined in August]: "don't think we'll succeed teaching white, male middle managers empathy and compassion anytime soon so let's limit their scope of damage"

Such rubbish. I think GitHub has its priorities and focus wrong. Glad I'm using GitLab.

progx 6 days ago 1 reply      
Thank you, GitHub. Knowing that you need 29 days to reply makes my open-source GitHub project look not soooo bad. :-)
lhnz 6 days ago 2 replies      
If they take 29 days to respond to this, I wonder how long they might take to respond to the fairly shocking allegations within the recent Business Insider article [0]?

In fact I wonder if this response was ever going to happen on its own? Perhaps the cause of the response was their recent string of bad press - dear github, business insider, eslint, etc.

It strikes me as strange that a company that has so many people focused on community and social impact doesn't have time to talk to its users.

I hope they manage to sort themselves out as it's getting embarrassing.

[0] https://news.ycombinator.com/item?id=11049067

VeejayRampay 6 days ago 2 replies      
Reading the reactions to their addressing the complaints, I really don't envy the folks at Github. A lot of folks in the "Yeah that's nice but what about X/Y/Z" range (and same here on HN).

There's simply no winning.

tacos 6 days ago 1 reply      
I wish more companies would take more time to respond to these sorts of situations. Perhaps this was excessive (I see conflicting reports of where and when they responded). But...

Let things cool off, take a hard look at things internally, get the stakeholders committed to actual change, then provide an update. I'm far more likely to believe that than even a perfectly executed PR play. I've watched even the most pathological and idiotic tech people get good at those over the past 10 years. Beware people with good haircuts who say nothing and only appear during times of trouble.

miseg 6 days ago 0 replies      
That's a great question on the page by EGreg:

> I wonder if GitHub itself can use its own issues system for prioritizing feature requests and bugs :-)

elliotpage 6 days ago 3 replies      
I just hope they also add new features centered around deprecation of repos: the ability to add a "THIS IS NO LONGER BEING MAINTAINED" flag or an equivalent.

Finding a repo that appears to solve my problems only to find a mass of issues and an absent maintainer is a pretty big time sink, for both me and likely the maintainer too!

ageofwant 6 days ago 3 replies      
Great, but please resist the urge and the goading to build another JIRA in GitHub. The second-best thing about GitHub is its simplicity.
camhenlin 6 days ago 1 reply      
Has anyone else made the decision to switch due to github's internal racism and sexism rather than technical reasons? That's my biggest reason for looking at alternatives lately (although I haven't made the move yet, but plan to as soon as I have some spare time)
redsymbol 6 days ago 5 replies      
To any githubbers reading this:

A lot of people posting here are being unreasonable, demanding, and childish. Recognize that for what it is, don't let it affect you, and just continue moving forward based on the constructive part of the feedback.

enknamel 6 days ago 0 replies      
This is rather a non-response. It basically says GitHub read the letter. It makes no promises and proposes no solutions and no timeline. After working in gaming for years, we would post responses just like this all the time, then get something scheduled six months to a year out, knowing that just responding with general affirmations would placate customers without actually having to fix anything.
sotojuan 6 days ago 0 replies      
I'll be more optimistic and be happy they replied and very interested in what they have in store. They say "next week" and it's Friday so it won't take that long.
shadowmint 6 days ago 0 replies      
Can't wait to see what actual tangible outcome this will have.

I think that's far more interesting than that they took 29 days to respond; ok, they did. Deal with it. They have responded now.

What's important is what happens next.

A public issue tracker?

A rust style RFC process? (https://github.com/rust-lang/rfcs)

Just please, whatever you do, do it fast, and stop people from doing what babel did and migrating to phabricator. :P

SFjulie1 6 days ago 0 replies      
I complain more every day about how we organize our work and need the insanely complex git to collaborate, because we have insane ways of producing code.

An industry that requires ever-increasing levels of complexity for simple tasks in order to "progress" amazes me in its failure to see that the problem is not in the tools.

I can guarantee that with 10 weeks of work with "old" technologies (Python + whatever static HTML server) I could make a one-size-fits-all website for all restaurants that can be customized without security holes. I'd make it under a free license so that people could make money from my creation... on purpose. And we would all win: all the economy that comes from having something simple and well thought out.

But I can't do it. I am not an influential trend-setter, so no one will bet a kopek on my business.

And you know, the cognitive load these tools put on the brain is load that should be spent on business problems, where cost efficiency matters.

Where operational expenses are better kept low whatever the initial investment is.

The fact that we all use at least git (plus a ticket tracker), both heavyweight, should tell us that we are dangerously inflating the cost of building software, thereby contradicting our very reason to exist: to reduce operational expenses as much as we can.

And we still have not proven how much value this complexity-driven increase in expenses actually creates anywhere: in lower prices, fewer incidents, higher quality, better jobs, better economic growth, saved lives, true business-valued SLAs (not the ones measured by the editors/operators)...

So, well, github may not be the problem.

erikb 6 days ago 0 replies      
Businesses are ridiculously slow. Here is something about a business that is a black box to you which may surprise you: it's a black box to them as well. They just don't know how to manage so many people and ideas, and the sad part is that most managers don't, so I don't think it will change. The solution would be to educate the manager on how to find out what's not working and how to resolve such problems, or to replace the manager with one who is willing to constantly work on getting things to work instead of just keeping their head above water. It's a really hard problem though, since as an employee you don't gain much by resolving true systemic problems, and you may even lose standing with your coworkers by changing daily processes they are used to. It really has to come from the top, and if there is nobody there to do it, it won't happen.
avitzurel 6 days ago 0 replies      
I think a lot of the disappointed comments are understandable here. (obnoxious or not).

I know I personally expected one of two things to happen:

1. Github responds in 2-3 days with the EXACT response they made.

2. Github responds in 1-2 weeks with a concrete plans that addresses at least some of the concerns the original letter addressed.

They basically did the bad combination of both.

One of the things that frustrates me the most about companies responding to feedback is "We have it planned" and "we don't have an exact timeline".

More than anything I really dislike the "stay tuned". I don't want to stay tuned, you should either tell me when to expect it or I will tune out (and I think that's what most people think).

To me, this is a non-response.

I like GitHub; I use it every day. I don't have any plans to move away from GitHub. But if I had a project that I felt wasn't a fit for GitHub, this response would not change anything for me; I would still research alternatives.

quadrangle 5 days ago 0 replies      
Right, this is typical for proprietary software. Everyone has to beg and petition and then if a company isn't awful, they'll respond and do something. Meanwhile, you still can't adapt things to your needs or submit pull requests or otherwise do what you could with free/libre/open tools. And if GitHub does a good job addressing these requests, it will encourage everyone to stay with GitHub where we're still helpless serfs.

If we all demand software freedom, then we won't be so helpless in the future with whatever issues come up.

nv-vn 6 days ago 0 replies      
Ironic that most of the replies were '+1's
chris_wot 6 days ago 4 replies      
That took a long long time for them to respond. It's day 29 now. In fact, they could have communicated something after even a week - probably something similar to what they have actually written, only more 'We're looking at your list and we'll try to get back to you as soon as possible'.

Where is Jono Bacon in all of this?

edit: OK, I feel kind of bad for mentioning Jono because I've watched him do awesome things over the years. I've sent him an email out of the blue asking him to respond - hopefully he gets it.

jcoffland 6 days ago 0 replies      
Next we need to figure out how to light this same fire under Docker. In some ways they're even worse: they often close issues because they won't be addressed in the next release. I've tried to explain that this is exactly the case where you want to leave an issue open. They said they would think about it, but issues they don't want to work on right now remain closed.
lutorm 6 days ago 0 replies      
The fact that this discussion is being made by committing to a README.md file seems to point to a serious lack of tools for interaction...
dreamdu5t 6 days ago 0 replies      
Remember it took years just to get them to properly display code that uses tabs (and we still didn't get a simple ui setting).
alexandrerond 6 days ago 0 replies      
Oh wow, that's all they have to say after weeks? Funnily enough, GitLab responded in hours. Opportunistic or not, that is why people are taking it seriously.
trengrj 6 days ago 0 replies      
Not sure why they did this on a Friday. I know the standard PR playbook says to release bad news on Friday but I wouldn't call a response to this bad news.
aledalgrande 6 days ago 0 replies      
Well, even if quality is declining I wouldn't know where else to go for now. Gerrit doesn't cut it and neither does Bitbucket. Guess we will see.
Etheryte 6 days ago 0 replies      
What's the point of "we're going to reply next week" post on a Friday? Just make a statement next week.
ptio 6 days ago 1 reply      
GitHub should consider "open sourcing" GitHub itself so the community can submit PRs and RFPs.
barely_stubbell 6 days ago 0 replies      
What I want, more than anything, is to search on commit hashes.
shade23 6 days ago 0 replies      
This would be a better link: https://github.com/bkeepers/dear-github
smoyer 6 days ago 0 replies      
Since they only offered a "receipt" for the original letter, why did it take two months? They could have put this response out the same day the "Dear GitHub" letter was posted.
donretag 6 days ago 0 replies      
So tempted to add a +1 comment...

(This is a joke, please do not do it)

programminggeek 6 days ago 0 replies      
TLDR; We've been busy making piles of money in Enterprise land and forgot that we acquire customers via FOSS. Oops.
gerbilly 6 days ago 0 replies      
And the first response to their letter was a:


Sci-Hub: Removing barriers in the way of science sci-hub.io
758 points by kasbah  5 days ago   213 comments top 37
smanzer 5 days ago 1 reply      
Previous HN discussion (article has neat details of how sci-hub works): https://news.ycombinator.com/item?id=11074638
nokya 5 days ago 3 replies      
I am happy she is doing this. The prices paywalls charge for access to research papers are unacceptable: the money does not flow back to the researchers or to the institutions. Most of the paywalled research was sponsored by taxpayer money and hence should be publicly accessible, or available for a very low "maintenance fee".
travjones 5 days ago 4 replies      
I've used sci-hub a few times. It's a little buggy and not every article can be accessed, but it works well enough to try when I'm not on campus.

$30 to read a single article is ridiculous anyway and presents a barrier to scientists who don't have, can't afford, or don't want to pay for access. I hope sci-hub stays up and improves for some time.

ChuckMcM 5 days ago 1 reply      
It paints a bulls-eye on her back, and the use of the term "piracy" for "freeing knowledge" doesn't sit well with Western sentiments. While I get that the "booty" she is stealing is the fees the journals would like to charge you, what she is doing is more like a librarian letting people check out books without a library card because she has an infinite supply of said books.

I am hoping that the rent-seeking behavior of the science journals can be used as the canonical example of how copyright can harm the common good.

By endorsing and upholding this egregious use of copyright, our elected officials are clearly causing more harm than good. This perversion of the spirit of copyright (that an author is granted a temporary monopoly so that they might recover some of their investment) makes its use here look like indentured servitude at best and outright theft at worst.

So while I don't think anyone is really "harmed" because Disney won't release the original Cinderella or employs measures to keep it from being copied. It is very much the case that by creating this barrier to scientific research, a person or group who might change the world in a positive way if they had access, is perhaps even unaware that there is relevant work that they cannot get access to. That is definitely a harm in my opinion.

So I hope that the narrative here, which has been dominated by big media for so long, might get an interjection of a more nuanced understanding of why copyright exists, and of how to craft laws that embrace that spirit rather than the rent-seeking interests of people who live off the work of others.

userbinator 5 days ago 1 reply      
A long time ago (even before Aaron Swartz), when I was still familiar with the active and rapidly growing filesharing community of the time, I vaguely remember reading about an effort by some of the "ebookers" to plant proxies in various universities' networks that would perform much the same function. I wonder what eventually became of it besides the large paper torrents that appeared, but I wouldn't be surprised if SciHub was related to that in some way. Back then, systems were far more open (as opposed to secured), and something like that was easier than it is today.
korginator 5 days ago 0 replies      
Elsevier made more than 3.5 billion dollars in revenue last year. They are trying everything possible to destroy open research. They were behind three bills in the US Congress to prevent universities from providing access to pre-publication research. This is research that's been paid for by taxpayer dollars.

Companies with attitudes like Elsevier need to be buried.

smanzer 5 days ago 2 replies      
Good for them. For the last article that I published, the publisher "value added" consisted of highlighting all the all-caps names in my document and asking me to define them as acronyms. Literally the only thing they did, and it wasn't even right.
bpg_92 5 days ago 2 replies      
Welp, I used to get books, papers and software from non-legal sources when I was in undergrad, because I just couldn't afford them; now that I make some money I buy most of this stuff. The thing is, without all those resources in the past I couldn't have made it to where I am now. Just my 2 cents.
yason 5 days ago 3 replies      
Is there a tarball of the data somewhere that one could download, redistribute and host somewhere (on the darknet, presumably)?
Fizzadar 5 days ago 0 replies      
Great to see a massive middle finger to the journal system. It's a disgrace and has to stop. Unfortunately I fear sites like this might entice more stringent protections for future journal published articles. The war continues.
vixen99 5 days ago 1 reply      
http://www.freefullpdf.com/#gsc.tab=0 is another useful site if you're a lone researcher who doesn't have taxpayers' money funding your literature search and can't afford (in some cases) $30-$40 to look at a published paper.
slantaclaus 5 days ago 1 reply      
This is what Aaron Swartz was trying to do, right?
Eudyptula_minor 4 days ago 0 replies      
As a student who is studying to become a theoretical mathematician, I hope Sci-Hub stays open for many years ahead. In Finland we have pretty good access, but only if one is a student. Mathematics is so interconnected that removing paywalls and other obstacles could help uncover breakthroughs by combining ideas from fellow mathematicians. I hope the UN exercises Article 27 of the Universal Declaration of Human Rights and aligns itself on the right side of history, in order to better science and to encourage curiosity in today's minds and definitely tomorrow's! Pardon my English. Thank you.
yetanotheracc 5 days ago 1 reply      
Speaking of moral courage, how does one contribute institutional login credentials to sci-hub?
tim333 6 days ago 1 reply      
May it continue. Perhaps science can go the way music has, where in practice you can see most stuff for free.
peterhuston 5 days ago 0 replies      
In Norway there is free access to NEJM, JAMA, BMJ, Annals of Internal Medicine and the Lancet (2-month delay). UpToDate, BMJ Best Practice and McMaster Plus are also free. See http://www.helsebiblioteket.no/om-oss/english for information about all included resources. You need to access these resources from a Norwegian IP. From abroad, this can for example be done through Tor, if you configure Tor to exit only through Norwegian exit nodes.
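For reference, restricting Tor to Norwegian exits is a two-line change in the torrc configuration file (a sketch of the setup the comment describes; check the resources' terms of access before relying on it):

```
# torrc: only build circuits that exit through relays in Norway
ExitNodes {no}
# Fail rather than fall back to exits outside the restriction
StrictNodes 1
```

Note that pinning all traffic to a single small country's exit pool reduces anonymity and available bandwidth, so this is best done in a Tor instance dedicated to this purpose.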
drethemadrapper 5 days ago 0 replies      
There is another service in the pipeline. I came across it very recently; it is in a public-beta phase. It appears to focus on providing access to all digital libraries and specifically on serving third-world or developing countries, mostly in Africa. It has a different (business) model and uses some advanced technologies for provision of the articles. Given that they intend to subscribe to the publishers, there is no doubt that they will remain in business for as long as the publishers themselves exist.

The projects like sci-hub.io, library.no and libgen are highly commendable. It is no news that third-world countries are destabilized by war, economic sanctions, etc. perpetuated by the world powers, thereby making them re-prioritize (access to) their resources. And it is not surprising that webrtc/p2p-related services are oftentimes blocked in first-world institutions with access to articles from those digital libraries. Such technologies/protocols/tools are defined/shaped (at standardization meetings - IETF, W3C, etc.) by big corporations in order to preserve their own product offerings.

dineshp2 4 days ago 0 replies      
Sometimes we have no option but to break the law until what is considered illegal is made legal.

Storing research papers behind paywalls is absolutely ridiculous. The law literally prevents the development of science.

Having personally seen people benefit directly (for purposes of research) from this initiative solidifies my wholehearted support for sci-hub.

max_ 5 days ago 4 replies      
How is this different (better) than http://arxiv.org ?
srean 5 days ago 3 replies      
Hope widespread knowledge about its existence does not kill it.
spacefight 5 days ago 2 replies      
Looks like it is piping the queries over to scholar.google.com - getting only timeouts right now though.
derpadelt 5 days ago 0 replies      
So ordinary people may finally read my papers with reasonable effort? Sounds like an improvement. I am not in the academic content distribution industry though.
bhouston 5 days ago 2 replies      
So what is the strategy once Springer starts getting their domains taken down? With torrents this was never a big deal once there was the DHT - it didn't matter which search engines were taken down or which trackers; the torrents lived to see another day.

This website at this stage seems particularly easy to take down as it is a centralized weak link.

Centralized services work great when they are legitimate (Netflix, Spotify), but decentralized ones work best when they are not legal.

dbcooper 5 days ago 0 replies      
There is very little innovation from academic publishers. Most don't even offer a single download that includes the paper and supplementary materials. E-book files are non-existent.

Very expensive publications, like Nature Biotechnology, should at the very least provide a single download (preferably epub) of each issue.

justinclift 6 days ago 1 reply      
Wonder what the size of the data set is so far?

Likely TBs?

treenyc 4 days ago 0 replies      
Nice. Does anyone know if there is something equivalent to this, much like SSRN (Social Science Research Network)? http://www.ssrn.com/en/index.cfm/mjensen-20th/

Where you can perform full text search on all the papers?

amelius 5 days ago 0 replies      
Server seems down (?)

Also, isn't this better done over bittorrent?

thecourier 5 days ago 0 replies      
in memoriam: Aaron Swartz
iabacu 4 days ago 0 replies      
Can we download only single articles, or can we download the whole 40M+ collection?
pknerd 5 days ago 2 replies      
While searching it goes to Google Scholar. Am I missing something?
Rainymood 5 days ago 1 reply      
(1) Isn't this illegal?

(2) How do they get access to those papers?

cyphar 5 days ago 6 replies      
I didn't know that you could raid ships using a website(!).

In all seriousness, the act of "sharing the collective knowledge of mankind publicly" isn't morally equivalent to attacking ships and killing people. We should stop using terms that are clearly propaganda created by the film and music industry to try to muddy the waters.

Jeaye 5 days ago 3 replies      
Umm, "pirate website" that uses a secure.sci-hub.io, but not actually a secure connection. Really, Let's Encrypt has made this a no-brainer. Anyone making a site should be expected to be using SSL. Especially those making anything related to anything "pirate" or "secure."
dang 5 days ago 0 replies      
We changed the title from "Sci-Hub Pirate website providing public access to millions of research papers" to what the site itself says.

I don't think this was an egregious title rewrite, but the word "pirate" was becoming the subject of discussion, which a title shouldn't be (and that goes double for extraneous ones).

lambdaelite 5 days ago 9 replies      
Silicon Valley and YC don't exactly have a stellar reputation for ethical behavior. Having a "pirate website" at the top of the news page doesn't exactly change that perception.

I totally get that journals are evil, and charging money for research generated with public funds is questionable. It's very frustrating as a small entity needing to view articles, and being asked to cough up $25-50. That said, there are legitimate alternatives (like emailing the corresponding author, or professional society memberships, or alumni library access, or DeepDyve). The linked website is flagrantly violating copyright and that should be cause for concern; not breaking the law is part of every engineering (and professional) ethical code.

bobby_9x 5 days ago 0 replies      
It's all fun and games, until someone in the open source community wants the same copyright protections from a commercial entity using GNU code without releasing the source.
noelsusman 5 days ago 4 replies      
I think people haven't totally thought this open access thing through.

First, publishing costs money, even in the digital age. It costs money to comb through submissions and decide which ones are worth pursuing. It costs money to hassle scientists into reviewing the submissions. It costs money to convert every submission into the same format. It costs money to develop and host a website to disseminate the articles. All of these things cost money.

Now, who is going to pay for it? Traditionally these costs were put onto the research institutions in the form of library subscription fees. Open access shifts this burden onto the author, and ideally grants would include that into the budget.

Even if grants include that in their budget (and many don't yet), there's a finite amount of money available for research. Shifting the cost of publishing onto grants will make the funding available for actual research even smaller than it is now. In some fields publishing costs are entirely negligible compared to the cost of research, but in others they are not.

Also, open access would mean that you have to have funding in order to publish a paper. As it stands right now you don't actually need funding to do research in certain fields. A math professor at a university can devote some of his spare time to a project over several years and publish a paper on it with no costs at all. This happens all the time, not every paper has funding behind it.

I'm not necessarily arguing against open access, I just think people haven't fully explored the downsides of moving away from our current system.

I no longer understand my PhD dissertation medium.com
678 points by refrigerator  4 days ago   270 comments top 55
csense 4 days ago 17 replies      
TLDR: The author independently re-discovered what you may know as Old Code Syndrome.

I think that's because mathematical papers place too much value on terseness and abstraction over exposition and intuition.

This guy's basically in the position of a fairly new developer who's just been asked to do a non-trivial update to his own code for the first time. All those clever one-liners he put into his code made him feel smart and got the job done at the time. But he's now beginning to realize that if he keeps doing that, he's going to be cursed by his future self when he pulls up the code a few months later (never mind five years!) and has zero memory of how it actually works.

I'm not intending to disparage the author; I've been there, and if you've been a software developer for a while you've likely been there too.

Any decent programmer with enough experience will tell you the fix is to add some comments (more expository text than "it is obvious that..." or "the reader will quickly see..."), unit tests (concrete examples of abstract concepts), give variables and procedures descriptive names (The Wave Decomposition Lemma instead of Lemma 4.16), etc.

MicroBerto 4 days ago 3 replies      
My response is that for every 100 of these types of papers, one of them may prove to be pivotal or inspirational in something truly groundbreaking and functionally useful. For this reason, I am all for 100 different people spending their time doing things like this, because eventually one of them will make an impact that is greater than 100x the efforts of 100 normal men.

It's just a different kind of "brick in the wall" - only the diamonds in the rough can turn out to be hugely important for something else in the future.

closure 4 days ago 11 replies      
This does not surprise me in the least.

Math was always extremely easy for me growing up. Up through my first differential equations class I found almost everything trivial to learn (the one exception is that I always found proving things difficult).

I made the mistake of minoring in math and that slowly killed my enjoyment of it. Once I got to differential geometry and advanced matrix theory it all just became too abstract and I just wanted to get away from it.

For several years after college I would routinely pull my advanced calculus text out and do problems "for fun". After a while I stopped doing that. Within a few years of no longer being exposed to math, I found it all incredibly foreign and challenging, to the point where I would say I have a bit of an aversion/phobia to it.

I'm trying to reverse that now by tackling a topic I'm interested in but have previously avoided due to the math-heavy nature of it - type theory.

Hopefully I can find the joy in math again through this.

I think my point is that you can lose competence in math very very quickly through lack of constant exposure.

The same is probably true of programming but I hope to never end up in that position.

tokenadult 4 days ago 1 reply      
An interesting read. But I think the author should have explicitly written out the point he is really making: you can't be too careful about making your writing clear, even to yourself. I recall reading (I'd point to the book with a link if I could remember in what book I read this) that mathematicians who occasionally write expository articles on mathematics for the general public are often told by their professional colleagues, fellow research mathematicians, "Hey, I really liked your article [name of popular article] and I got a lot out of reading it." The book claimed that if mathematicians made a conscious effort to write understandably to members of the general public, their mathematics research would have more influence on other research mathematicians. That sounds like an important experiment to try for an early-career mathematician.

More generally, in the very excellent book The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century,[1] author and researcher Steven Pinker makes the point that the hardest thing for any writer to do is to avoid the "curse of knowledge," assuming that readers know what you know as they read your writing. It's HARD to write about something you know well without skipping lots of steps in reasoning and details of the topic that are unknown to most of your readers. This is one of the best reasons for any writer to submit manuscripts to an editor (or a set of friends, as Paul Graham does) before publishing.

And, yes, if you think what I wrote above is unclear, as I fear it is, please let me know what's confusing about what I wrote. I'd be glad to hear your suggestions of how to make my main point more clear. I'm trying to say that anyone who writes anything has to put extra effort into making his point clear.

[1] http://www.amazon.com/gp/product/B00INIYG74/

GreaterFool 3 days ago 3 replies      
I've been working with Haskell* for a couple of years, and quite often I work with code that I don't fully understand. I'll come across a terse bit of code, then carefully take it apart to see what it does (pulling out bits and pieces and giving them names instead of composing them point-free, and adding type annotations). Once I see the whole picture, I make my own change and then carefully re-assemble the original terse bit of code. One could ask: wasn't the verbose version better? I'm going to lean on the side of no. If I left this bit verbose, and other bits verbose, then it would be hard to see the whole picture.

I think doing maths would be better if it was done interactively with software. If equations were code then you could blow it up and look into fine details and then shrink it to a terse form while software keeps track of the transformations to make sure what you write is equivalent. Maybe it's time to add a laptop to that paper pad?

* not arguing anything language-specific here, except that Haskell makes use of a variety of notations that make the code shorter and more like maths. More so than most languages.
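As a tiny sketch of that decompose-and-recompose workflow (the function and names here are my own illustration, not from the comment): the same computation written terse and point-free, and taken apart with names and type annotations.

```haskell
-- Terse, point-free: sum composed with squaring each element.
sumSquaresTerse :: Num a => [a] -> a
sumSquaresTerse = sum . map (^ 2)

-- The same function pulled apart: each piece named and annotated,
-- which is easier to inspect when you've forgotten how it works.
square :: Num a => a -> a
square x = x * x

sumSquaresNamed :: Num a => [a] -> a
sumSquaresNamed xs = sum squares
  where
    squares = map square xs
```

Both definitions are interchangeable; the point is that expanding and re-collapsing between them is a mechanical, meaning-preserving transformation you can do by hand (or that tooling could track for you, per the laptop-next-to-the-paper-pad idea above).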

jholman 4 days ago 2 replies      
All of these arguments are arguments for replacing the mathematics curriculum with video gaming. Games require generalized problem solving (arguably better-generalized than math, and arguably better-transferrable to other domains). Games build character: grit and tenacity, cautious optimism, etc blah blah etc. And games are fun (for many more people than find math fun).

Guess math teachers should start learning to play League of Legends and Pokemon.

Alternatively, I guess we need better reasons than those to teach a subject.

pmarreck 4 days ago 1 reply      
Math seems to have a very ephemeral lifetime in the brain. I skipped a year of college once, and when I returned I realized I had to basically abandon any major with a math requirement, because I had seemingly forgotten everything.

I'm currently struggling with an online Machine Learning class (the Coursera one... at the tender age of 43), and I can only take it (so far, at least... just failed my first quiz... fortunately I can review and re-take) because I was rather obsessed with matrices, oh, about 28 years ago. "You mean I can rotate things in an n-dimensional space without using trig?"

shalmanese 4 days ago 1 reply      
I'm truly shocked by the multiple people in the thread who claim that Math knowledge can be completely erased through as little as a year of non-practice.

For me, Math has always resembled riding a bike more than anything else. Sure, the first few moments, the path is a bit overgrown and all the weeds need to be cleared off but it was always significantly easier revisiting a topic than understanding it for the first time.

For those who forget so quickly, I wonder if you felt like you truly understood it in the first place?

chipsy 4 days ago 1 reply      
It speaks to how finite we are around "knowledge." At the moment we reach understanding, we experience a sophomoric feeling of confidence. But as it fades farther and farther from our working memory, we become less fluent and more hesitant. The emerging pattern becomes one of "I can understand these interesting concepts, but it takes a lot of work and they don't last, so I have to choose to understand practically and situationally." And then in the end our bodies and personalities turn out to control our minds more than we might want to believe, as we turn away from one problem and towards a different one on some whim, never able to view the whole.

As I recognize this more in myself, I am more inclined to become a bit of a librarian and develop better methods of personal note-taking and information retrieval, so that I lose less each time my mind flutters. At the moment that's turned into a fascination with mind maps - every time I need to critically-think through a problem I start mapping it. In the future I might look into ways of searching through those maps.

ikeboy 4 days ago 1 reply      
> I have attempted to deliver [these lectures] in a spirit that should be recommended to all students embarking on the writing of their PhD theses: imagine that you are explaining your ideas to your former smart, but ignorant, self, at the beginning of your studies!

-Richard Feynman

nanis 4 days ago 2 replies      
First, I am not sure Functional Analysis is as obscure as some other areas. But, second, this just shows, once again, that one ought never to use "clearly," "obviously" etc in proofs.

It is the same principle as writing programs so they are easier for the next programmer to read. That person may be you.

jedberg 4 days ago 3 replies      
> Beyond scarcely stretching the boundaries of obscure mathematical knowledge, what tangible difference has a PhD made to my life?

The same thing a bachelors degree does for everyone else. You've proven that you can start, stick with, and complete a task that takes multiple years and a complicated set of steps.

admirethemeyer 4 days ago 0 replies      
I had several exceptional Math teachers throughout my education, but the piece of advice that stuck with me the most is:

"If you're not sure what you want to do with your life, study Math. Why? Because Math teaches you how to think."

The skills I learned studying Mathematics have been invaluable, the Math that I currently am able to recall is abysmal.

The author did a great job calling this out succinctly: Mathematics is an excellent proxy for problem-solving

jimbokun 4 days ago 0 replies      
When I was taking machine learning courses and reading machine learning textbooks a few years ago, I have fond recollections of the derivations from Tom Mitchell's textbook.


Where other textbooks tended to jump two or three steps ahead with a comment about the steps being "obvious" or "trivial", Mitchell would just include each little step.

Yes, you could argue it was my responsibility to remember all of my calculus and linear algebra. But it is kind to the reader to spell out the little steps, for those of us who maybe forgot some of our calculus tricks, or maybe don't even have all of the expected pre-requisites but are trying to press on anyway. Or actually know how to perform the steps but have to stop and puzzle through which particular combination of steps you are describing as "obvious" in this particular instance.

I just remember how nice it was to have those extra steps spelled out, and how much more pleasant it made reading Tom's book.

So thanks, Dr. Mitchell!

option_greek 4 days ago 2 replies      
Math is the shadow universe of physics. Most theorems may not look like they are useful for anything real-world till someone is able to peg all the variables to the real world. And then, as if by magic, we realize we already know how the real world behaves. Till someone does this pegging, the theorems sit idle, waiting for problems to solve. I believe this is actually a good thing. We are letting people find solutions before someone finds problems to use them for.
hnarayanan 4 days ago 1 reply      
As a PhD in applied math, I must say I concur wholeheartedly with the author. The true value of a PhD in a quantitative field is less about specific domain knowledge, and more about the set of general problem-solving skills you pick up.
ck2 4 days ago 0 replies      
Reminds me of this (well without the forgetting part, but I do that with old code all the time)


KKKKkkkk1 4 days ago 0 replies      
If you find yourself saying that you gained nothing from your education other than soft skills, maybe you should have passed over the functional analysis part and put the effort directly into learning said soft skills. I'm in the same boat, and I can see how it can be hard to admit this.
kazinator 3 days ago 0 replies      
If someone doesn't understand his own work five years later to this extent, that is a strong indication that the work is actually garbage, and the prior understanding five years ago was only a delusion brought on by the circumstances: the late nights, the pressure, and so on.

Perhaps it doesn't make sense today because it never did, and the self deception has long worn off, not because the author has gone daft.

Several weeks ago, on the last work day before going on vacation, I submitted fixes for nine issues I found in one USB host controller driver. The last time I looked at the code was more than a year ago. I had refactored it and really improved its quality. Looking at some of the code now, I couldn't understand it that well. But that's because it wasn't as good as I thought it was. I was still relying on the fictitious story of how I thought certain aspects of the code worked really well thanks to me, and it wasn't meshing with the reality emanating from freshly reading it with more critical eyes. And, of course, I was also confronted by a reproducible crash. As I'm reading the code, I'm forced to throw away the false delusions and replace them with reality. This is because I'm smarter and fresher today, not because I've forgotten things and gotten dumber! It's taking effort because something is actually being done.

Perhaps a similar problem is here: he's reading the paper with more critical eyes and seeing aspects that don't match the fake memory of how great that paper was, which was formed by clouded judgment at the time of writing. Maybe that obscure notation that he can't understand is actually incorrect garbage. His brain is reeling because it's actually digging into the material and trying to do proper work, perhaps for the first time.

If you can show that your five year old work is incorrect garbage, that suggests you're actually superior today to your former self from five years ago. So that could be the thing to do. Don't read the paper assuming that it's right, and you've gone daft. Catch where you went wrong.

By the way, I never have this problem with good code. I can go back a decade and everything is wonderful. Let's just say there is a suspicious smell if you can't decipher your old work.

Good work is clear, and based on a correct understanding which matches that work. There is a durable, robust relationship between the latent memory of that work and the actual work, making it easy to jog your memory.

analog31 4 days ago 0 replies      
My PhD is in physics, from 20+ years ago, and I would not be able to explain or defend it today without studying it for a while. I've even forgotten the language (Pascal) that I wrote my experimental control and analysis code in.

My experiment formed the basis of a fairly productive new research program for my thesis advisor, so at least it lived on in somebody's brain, but not in mine. ;-)

z3t4 3 days ago 1 reply      
I've also forgotten basically all high-level math from school, and have to re-learn it when the occasion comes to use some of it. But one thing that occurred to me is that in school I just learned how to do the calculations, so I never got a deep understanding of how things worked anyway. And that's fine.
sbardle 4 days ago 1 reply      
A PhD isn't so much a test of intelligence as it is of perseverance.
jeena 3 days ago 0 replies      
It would have been cool if the original was linked here http://fjmubeen.com/2016/02/14/202/ and not the medium repost. But still interesting.
danieltillett 3 days ago 1 reply      
This post inspired me to re-read my thesis (well, browse through it). Although it has been 16 years since I last looked at it, I didn't have any problem understanding it, and I didn't even really cringe reading it. I guess how bad this effect is depends on your field.
pervycreeper 4 days ago 0 replies      
The dissemination of knowledge is at least as important as its discovery. Accessibility (i.e. clarity of exposition, availability to the public, etc.) needs to become a cardinal virtue in research.
fiatjaf 4 days ago 2 replies      
The author almost realized the much more important conclusion of the experience he describes. He shouldn't conclude the article by asking "what is the purpose of studying maths?" and then giving three stupid answers.

He should have asked: is this actually the "knowledge" they say academia brings to society? Is the money researchers earn being well spent? Did I actually deserve to be remunerated for this piece of work that no one understands -- and that, in fact, no one has read except for maybe three people?

amelius 4 days ago 0 replies      
This is what happens to most programmers as well, when they try to read code that they wrote a while ago.
bambax 3 days ago 0 replies      
> what is the purpose of studying maths? Mathematics is an excellent proxy for problem-solving / Mathematics embeds character in students / Mathematics is fun

Those may be reasons to study maths (although, studying anything seriously probably yields comparable benefits) but doing a PhD and writing a thesis is not only about yourself: it's supposed to advance the field. It's something you do for the general community.

aldanor 4 days ago 1 reply      
As someone who left academia after a PhD in math (I've been working as a quant in HFT for the last few years, which mostly involves coding in one form or another), I can totally relate! Back then, all those stochastic integrals and measures made much more sense. However, it doesn't seem totally alien -- I'm pretty sure I could get back to hacking math if required, but it would take at least several months to get into the flow.
mirimir 3 days ago 0 replies      
> Mathematics is an excellent proxy for problem-solving

In my experience, earning a PhD in [redacted] was excellent training in problem solving. And in developing working expertise in new areas. I suspect that the choice of field is indeed irrelevant.

> Mathematics embeds character in students

I'd say that actually finishing a PhD does that.

> Mathematics is fun

Whatever you pick for your dissertation topic had better be fun ;)

wodenokoto 3 days ago 0 replies      
Let us give some credit to the professors who read the dissertation and understood what was going on.
smonff 4 days ago 0 replies      
I experience the same problem with "trivial regular expressions" in two-year-old Perl programs.
peterbraden 3 days ago 1 reply      
Could part of it just be that mathematical notation is so bad? It's more of a shorthand than an actual tool for conveying meaning. So much context goes into establishing what a notated equation means - and that context is now gone.
musesum 4 days ago 0 replies      
I had a similar problem with a crypto presentation. Basically, I was angling for a free ticket to an expensive conference. The trick was to propose something that is plausible, but too arcane for practical use. The consolation was a free ticket. Problem was that they accepted the talk. Damn!

So, I started to read crypto journals. Basically anything co-authored by Chaum, Micali, Goldreich, Wigderson. After a few weeks, I started to get the hang of it. Sort of like learning a new language. So, I gave the presentation and then forgot about it.

A few years later, I decided to show my powerpoint to someone and describe the process. WTF? How did this lead to that? Didn't understand half of it. Was really embarrassing.

brudgers 4 days ago 0 replies      
The Sheetrock was the last step. I myself would do the exterior and interior painting. I told Ted I wanted to do at least that much, or he would have done that, too. When he himself had finished, and he had taken all the scraps I didn't want for kindling to the dump, he had me stand next to him outside and look at my new ell from thirty feet away.

And then he asked it: "How the hell did I do that?" --Kurt Vonnegut, Timequake

I find the experience common when I look back on things I write or design or build. As Bill Gates said, "Most people overestimate what they can do in one year and underestimate what they can do in ten years."

znpy 3 days ago 1 reply      
It seems to me that most commenters are ignoring the fact that the author is a guy that basically left high level mathematics after completing his phd.

So basically he went off to do other stuff that wasn't functional-analysis-related, and his functional analysis got rusty.

It seems quite reasonable to me. Call it old code syndrome, call it "my math got rusty", it seems quite normal to me.

Also: according to http://fjmubeen.com/about/, the author got his phd in 2007. It's 2016.

Almost ten years. What... What are we talking about?

egonschiele 4 days ago 1 reply      
This happens to me all the time. I have a very popular illustrated post on Monads titled "Functors, Applicatives, and Monads in Pictures"[1]. When I wrote it I thought it was the best monad guide ever. Now, reading back, I can see that some parts are confusing. I still see a lot of people liking it, but three years later I wish some parts of it were better.

[1] http://adit.io/posts/2013-04-17-functors,_applicatives,_and_...

calibraxis 3 days ago 0 replies      
Reminds me to get around to Piper Harron's thesis. Which was made to be seriously readable.

Math seems to have a culture of systematically bad writing (as Bill Thurston discussed).

lkrubner 4 days ago 0 replies      
He writes:

"Mathematics is an excellent proxy for problem-solving... Mathematics, by its concise and logical nature, lends itself to problem-solving (it is not unique in this regard)."

But how can we be sure this is true if he is unable to read what he wrote?

Maybe I'm thinking about the way Clojure programmers tend to use the word "concise" -- concise is meaningful only if it contributes to readability. Otherwise the more accurate description is "terse". And terse does not lend itself to problem solving.

chx 4 days ago 0 replies      
> Mathematics is an excellent proxy for problem-solving

I went to a special, maths focused high school class and this rings true on that lower level too. I am a reasonably successful programmer/architect today and I have -- repeatedly -- attributed my successful attitude toward solving my problems to the 1200 or so problems we solved during those four years. Our maths education was literally nothing else but solving one problem after the other.

bbcbasic 4 days ago 0 replies      
It is not like riding a bike!

I am reading up on stats after 15 years away from the subject and I have forgotten even the very basic stuff. Although the 'muscle memory' is there, so perhaps it is a bit easier than when totally new.

What I also find is I am more interested in the application/intuition behind something now, rather than the mechanics of the formulas. Maybe that has to do with a different aim, i.e. usefulness vs. passing an exam.

thaw13579 3 days ago 0 replies      
To make an admittedly bad metaphor, it's likely a lot of that knowledge has been moved from main memory to cold storage, and it would take some time to bring it back. It certainly makes the case for why we write things down! Although the part about having to dig for the main result makes me think the abstract could be improved...
JulianMorrison 4 days ago 0 replies      
Math is twiddling with formal systems, and discovering how they behave. Some of it has uses, some doesn't, and some of what presently doesn't will in the fullness of time result in further islands of usefulness, as yet not even imagined. But ultimately, it needs no more justification than orchestral music.
fiatjaf 4 days ago 0 replies      
How many people have read this and understand it?

How much money was spent on the production of this dissertation?

juped 2 days ago 0 replies      
Reading mathematical papers is an acquired skill that needs practice. Writing them also is, but the skills exist somewhat independently.
lin0 3 days ago 0 replies      
This wrapper[1] helps me edit LaTeX fragments in Vim/Emacs...

[1]: https://github.com/linktohack/lyxit

late2part 4 days ago 0 replies      
That's okay. I never understood it, at least you did for a while.
riprowan 3 days ago 0 replies      
We are clearly approaching the point where unassisted human intelligence is becoming insufficient to continue to master even specific domain expertise.
ISL 4 days ago 0 replies      
It takes about three months before a paper we ship becomes the best reference we have on the subject, exceeding our own recollection.

Our brains can only hold so much, especially tiny details.

Houshalter 3 days ago 0 replies      
I took an online course on formal logic. I put all of the questions and exercises into Anki, a spaced repetition program. This ensures I will always remember it and get it into my head at an intuitive level.

Basically it's like flash cards that decay exponentially. The first review is in one day, the second in two days, then 4 days, and so on.
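The doubling schedule described above can be sketched in a few lines. This is a simplified stand-in for what Anki actually does (its SM-2 variant adjusts intervals with per-card ease factors); the function names here are illustrative, not Anki's API:

```python
def next_interval(previous_days: int) -> int:
    """Double the gap after every successful review (1, 2, 4, 8, ... days)."""
    return max(1, previous_days * 2)

def schedule(reviews: int) -> list:
    """Days after first learning a card on which each review falls."""
    day, gap, out = 0, 1, []
    for _ in range(reviews):
        day += gap          # next review lands one full gap later
        out.append(day)
        gap = next_interval(gap)
    return out

print(schedule(5))  # -> [1, 3, 7, 15, 31]
```

The point of the exponential back-off is that each successful recall pushes the next review just far enough out that you're tested right before you'd otherwise forget.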

CurtMonash 3 days ago 0 replies      
Much the same is true of me, and of my best friend in grad school.
omginternets 3 days ago 0 replies      
Me neither. My understanding is that this is the rule, not the exception.
bitL 4 days ago 0 replies      
So, you basically ended up being better coming up with new algorithms?
rackforms 3 days ago 0 replies      
One may be tempted to call this Hodge Theaters syndrome.
nowprovision 4 days ago 0 replies      
This reminds me of revisiting Bash after a period of absence
Choose GitLab for your next open source project agilob.net
640 points by rajathagasthya  6 days ago   272 comments top 45
alexandrerond 6 days ago 2 replies      
I'll just jump in to vouch for Gitlab CE, in a self-hosted setup.

@sytse: it is one of the best-documented and best-packaged pieces of software these days. Especially in the Ruby world. Thank you for that. Gitlab really makes a difference.

Where others just distribute Docker images or directly nothing (because omg it is too hard to do proper packaging), Gitlab has nice packages for CentOS, Debian and Ubuntu (the distros that usually run in production servers).

We had an old install from git using mysql. Migrating to Omnibus and converting the DB to postgresql was painless. Surprisingly, it was a perfectly documented process without any pitfalls, or any outdated info. Omnibus packages are just wonderful. apt-get upgrade will just do all the necessary work, run DB migrations, and restart all the components with almost zero downtime.

This proved to me that Gitlab actually gives the utmost care to the CE edition and that it is not just marketing.

So thank you Gitlab for being so awesome.

sytse 6 days ago 27 replies      
GitLab CEO here. Awesome to see this article! Ask me anything.
chrisblackwell 6 days ago 1 reply      
Competition is good! Forget all the hate going on against Github right now. We should support things like GitLab and Bitbucket more in order to keep the competition going strong. I'd like to see even more competition in this area.
dclowd9901 5 days ago 3 replies      
Our local instance is only a version back but we find gitlab to be terrifically slow, and regularly have to kick the server as we find it stops tracking updates, then simply stops responding at all. Also, the diffing on merge requests is simply awful. It regularly shits itself on spacing changes and the like, making most merge requests almost useless from a code review perspective. Searching just doesn't work. At all. Navigation is weird and cumbersome, with many normal features hidden away in different sections of the UI. This causes weird situations where, since search doesn't work, I have to just browse around for commits to a branch I want to view at a certain state, and click that commit to get to a "browse code" view. Why the fuck isn't there simply a repo->branch->commit history->browse code path?

I personally wouldn't recommend gitlab and hope someone else is out there working on something better.

DanielDent 6 days ago 1 reply      
The free hosted Gitlab is great. I really appreciate having it around. And if you haven't looked at it recently, it's faster than it used to be.

Free unlimited private repos w/ 10GB storage per repo (including LFS!) is fantastic. A recent project I worked on involved integrating ~15 upstream modules into a build process. Having the ability to set up a git repo for each module instead of thinking 'maybe I can come up with some hack to do this with less repos' is huge.

Having Gitlab CI w/ the ability to use your own workers just by running a single docker command is the icing on the cake.

And the fact that the repos which need to stay on our own infrastructure can also use GitLab - with an open source option to boot - is what makes choosing GitLab a very easy decision.

threatofrain 6 days ago 1 reply      
Github serves two major purposes: (1) one is to be a git repository, and (2) another purpose is to be the central meeting ground for discovery and community participation. GitLab can serve for the purpose of being a git repository.

The problem is that the more competition we have for (2), the less functional any of the competitors are in serving (2).

Given that when Github goes down, a lot of tools and development workflows also go down, I can understand there being a backup repository. But that's not the same as serving purpose (2), that's just redundancy for (1).

The same problem exists for Facebook. The more Facebooks there are, the less functional any Facebook is.

spudfkc 6 days ago 1 reply      
We use GitLab at work, and while it works great for hosting internal stuff, any open source projects I'm still going to put on GitHub.

GitHub has the community, and I'd personally not like to see that divided, not to mention the interface is just so much easier to use - GitLab's is mildly infuriating.

ramigb 6 days ago 3 replies      
I've been using Bitbucket for 3 years now and I absolutely love the fact that you can create private repos and have many tools available for free; highly recommended.
arianvanp 5 days ago 2 replies      
Our University recently moved from SVN to self-hosted Gitlab for internal projects. It integrates nicely with our LDAP server so that you can easily search for colleagues and students to add them to projects. I totally love it.

It's probably not a coincidence, as Gitlab used to be Utrecht-based if I recall correctly.

Just want to thank you guys for not having to use SVN for my graduate project.

tschellenbach 6 days ago 2 replies      
I for one am a very happy user of Github. Two mid size open source projects, dozens tiny ones and a few commercial private repos. Github has private repos you just need to pay for them and they are actually very affordable. I've seen developers who make >100 per hour choose bitbucket because they don't want to pay for private repos. How is that rational...? I like Github and I like paying them for a service I use every day.

Travis on the other hand... could use a bit of work :)

tootie 6 days ago 2 replies      
Recently had a need for something like this and Gitlab ended up losing to BitBucket Server which we bought along with JIRA and Confluence. Gitlab seems like a nice, light solution, but their issue tracker ain't JIRA.
joeevans1000 6 days ago 1 reply      
I think this is a nice article and I will take a peek at GitLab. I have to say the following though: I have been very annoyed with the difficulty in creating private repos in GitHub BUT I think it is a brilliant plan to encourage people to make their code available. As much as GitHub's pay-for-private-repos approach annoys me, I think if they hadn't done that, 30% or more of what's available on GitHub would be hidden. People made and make their code public because they have to pay not to. Brilliant, and the right approach for GitHub.
BinaryIdiot 6 days ago 1 reply      
Damn I knew GitLab was good but that article made it sound far more amazing than GitHub. I'm impressed.

One thing I've really wanted was a voting system -> merge. Is there a way to require X amount of votes (in general or from people in a specific group) which, once that is hit, allows the PR to be merged into master? Part of our typical code review workflow is requiring a specific amount of people to approve the PR before it goes in but it's basically the honor system right now and it would be really cool to turn that into some sort of rule.
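The rule being asked for here can be made mechanical with a tiny gate like the following. This is a hypothetical sketch, not a GitLab feature or API; the reviewer names and parameters are made up:

```python
from typing import Optional, Set

def can_merge(approvals: Set[str], required_count: int = 2,
              required_group: Optional[Set[str]] = None) -> bool:
    """True once enough distinct reviewers have approved.

    If required_group is given, only approvals from members of that
    group count (the "from a specific group" variant).
    """
    counted = approvals if required_group is None else approvals & required_group
    return len(counted) >= required_count

# The honor-system review flow, made mechanical:
print(can_merge({"alice", "bob"}))                                       # -> True
print(can_merge({"alice"}))                                              # -> False
print(can_merge({"alice", "mallory"}, required_group={"alice", "bob"}))  # -> False
```

Using a set of reviewer names (rather than a raw counter) means duplicate approvals from the same person can't satisfy the threshold.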

sytse 6 days ago 0 replies      
I would love to see something like this, for example federated merge requests http://feedback.gitlab.com/forums/176466-deprecated-feedback...
benatkin 6 days ago 1 reply      
GitLab EE's source code is also available, though not under an open source license. Most of the differences are things I'd like to eventually see in an open source project (maybe GitLab CE, maybe a plugin, maybe a fork).

From my perspective the best thing would be for them to eventually open source some features I'd like to have in an open source project hosting service, once they've made more reasons for people to pay them. The worst thing would be if someone added it in a plugin or fork of GitLab CE and the company GitLab were to claim that they must have looked at the GitLab EE source code. GitLab Pages and audit logs are the features I'd most like to see in the open source version.

This is much like Nginx Plus's improved load balancing with health checks, which has me considering Tengine.

j0hnM1st 5 days ago 1 reply      
Not sure what's the hype about when alternatives like Phabricator.org are much better. Meh.
dominotw 5 days ago 2 replies      
I don't know, Gitlab seems a little too enterprisey for me. For example their documentation is too 'serious': they link to Martin Fowler's wiki for "feature branch" and Wikipedia for "feature driven development". Their gitlab flow has waterfall-style development, like a production branch with things going to "production". I do oss for fun, not to go to another job after the first one.

They don't have the cool hipster vibe that github has, github earned their street cred. A copy cat will always be a copy cat.

All these features ppl are asking for are against the ethos of github. Can someone give github props for well-implemented git lfs?

sorry gitlab guy who is watching this thread if that was too harsh.

orn 6 days ago 1 reply      
Using Gitlab on site and LOVE it, it's FREE, Big Thanks
darklajid 5 days ago 2 replies      
If only there were a way to get ~painless~ sync with repositories hosted elsewhere.

I'm stuck with TFS here, which is a PITA and broken on so many levels, it's not even funny. Running a GitLab front would solve nearly all of my problems while still complying with the dreaded "everything in TFS" policy. :/

dineshp2 6 days ago 1 reply      
I was wondering what would be the implications of publishing gitlab enterprise edition with an MIT license and making it easy to fork without subscriptions (basically like the community edition).

From what I understand, the real value that you are selling with an enterprise edition is the support, dedicated staff, monitoring services and training.

pori 5 days ago 1 reply      
I've used GitLab before and had a very good experience with it. Meant mostly for private repos, however. I tried pushing it onto the teams I have worked with in the past but they resisted because they were unfamiliar with it and didn't want to deal with server maintenance. The latter was understandable. Still, this really is a great solution for those who want an alternative to GitHub, especially a decentralized one.

Albeit, I am little disappointed because I have finally started getting into open source and being active on GitHub, and now it seems the development community is vying for a way out. GitHub may not be open, it may be centralized, but it does offer the advantages of uniformity and easy discovery.

z1mm32m4n 6 days ago 0 replies      
I had no idea that GitLab had support for changing what type of commit is created on-merge. That's actually a really tempting feature; I've spent so much time trying to convince people (especially those starting out) why manually rebasing and merging is A Good Thing.
hardwaresofton 6 days ago 1 reply      
Despite setting up my own gitlab instance once, and trying to get the company I was working at onboard, I never knew that gitlab.com was a thing for hosting projects... is this recent?

Either way, just signed up. Excited to support Gitlab, any way I can.

r-s 6 days ago 1 reply      
I have been using GitLab for some time and I am transitioning my projects to them currently. I prefer it to Github significantly, the downside is that recruiters are still looking for github on resumes.
sethherr 6 days ago 7 replies      
See, here's the thing: I don't want free private repos to exist. What GitHub has done is incentivize making things open source, and that's immeasurably valuable.
akhilcacharya 6 days ago 1 reply      
I had used GitLab for Mailman but I hadn't considered using it. I'll probably migrate off of Bitbucket thanks to this article.
dham 5 days ago 0 replies      
> Do I have to tell you, its good for people with vision problems?

Actually light themes work better for me. I have astigmatism and light themes open the pupil allowing me to see the text easier. With a dark theme I basically just see line noise.

This might be a misconception of dark themes.

mrmondo 5 days ago 1 reply      
As I've said before, we moved from GitHub to Gitlab and could not be happier. Most of our stuff is internal and we mirror our open source code to public repos. God damn it's good software - is there still room for improvement? Yes. Are they improving it every.bloody.week? Hell yeah they are!
reactor 6 days ago 0 replies      
https://notabug.org is also an option
mamcx 6 days ago 1 reply      
Does something like this exist, but for Mercurial? (I use BitBucket. I mean something to download private...)
merpnderp 5 days ago 0 replies      
At work we'll probably be moving from TFS to gitlab. Phabricator looks nice but gitlab seems more polished and simpler to use. But once we move to git we will be easily portable and can switch easily if we regret our choice.
JohnTHaller 5 days ago 0 replies      
One error in that write-up: SourceForge is open source, not proprietary. It's an open source project called Allura under the auspices of Apache that has paid coders at SourceForge contributing to it.
Heliosmaster 5 days ago 0 replies      
As another guy that uses GitLab, it's really great. You can see GitLab improve every release, constantly. I don't even remember the last feature that GitHub added, to be honest.
LinuxBender 5 days ago 1 reply      
I tried to create an account on GitLab.com. It said I needed to fill in the Captcha. There was no Captcha. Tried on Safari. I will try again later on Firefox.
amelius 5 days ago 1 reply      
I'm wondering if github and gitlab implemented their own version of git under the hood, or if they just use the vanilla git everyone else uses on their server.
antman 5 days ago 0 replies      
For a git solution in python this is quite good: https://kallithea-scm.org/.
xufi 5 days ago 0 replies      
I'd like to maybe try this one day. Both hub and lab are great though and fill different niches of mine.
radarsat1 5 days ago 1 reply      
Is there a way to port publicly viewable github issues and PRs over to gitlab automatically?
ioab 6 days ago 1 reply      
They don't have the octocat.
jrcii 6 days ago 1 reply      
The only missing feature is mailing lists. What's a good solution for that?
coderKen 5 days ago 0 replies      
Have two repositories on Gitlab, no complaints so far.
fredgrott 5 days ago 0 replies      
github allows more than one private repo under paid plans... seems that the author is attempting to compare apples to oranges.
Alterlife 6 days ago 0 replies      
I don't get it, is this some kind of meme that I'm missing?

What makes gitlab unsuitable for linuxontheweb?

What makes an 'os-on-the-web' different from 'html4 bullshit' to a version control tool?

It's just code either way.

musha68k 6 days ago 1 reply      
Please don't... Github is an institution and the only thing replacing it should be something equally global but based on a new protocol of sorts enabling truly decentralized and distributed sharing and collaboration on a higher interaction level.

But meanwhile, again please let's not fragment needlessly - that's certainly not something worth the price to pay for an (admittedly great) "alternative" service be it self-hosted or otherwise...

Edit: I'm strictly talking open-source projects here - Gitlab is a fantastic solution for private repos!

sandGorgon 6 days ago 2 replies      
I'm a rails guy and have run a consulting shop based on rails. But I'm actually a bit surprised (and impressed) that you distribute a self-installed piece of software based on Rails (yes, I have seen Omnibus and all the cool things you do. As I said, very, very impressive!)

I have personally lost many an "enterprise" contract because of the Ruby thing. Not just that, but at the end of the day Java is very performant with the least deployment effort.

Which is why we became a JRuby shop first (and packaged JAR for distribution) and eventually to Spring Boot and Scala.

I understand the value of Rails (my current startup is also built on that) - but I'm wondering if it makes sense for you guys to bite the bullet and jump to the JVM.

We Need a Better PC dcpos.ch
647 points by dcposch  2 days ago   672 comments top 131
habosa 2 days ago 21 replies      
Not disagreeing with the premise, but just want to plug a machine I am very happy with.

I was a lifetime mac owner (since OS 9) and my 2010 Macbook Pro finally broke down this year. It made it that long because I doubled the RAM, replaced the battery, and replaced the HDD with a hybrid drive. Finally the battery puffed up and exploded. I brought it to the Apple Store and they were beyond useless. The new MBPs are not modifiable in any way, and Apple's customer service is not what it used to be. So I decided to get a non-Mac for the first time in my life.

I ended up with an Asus Zenbook UX305LA. It's a 13" 1080p screen, 8GB RAM, 256GB SSD, Core i5 processor. Cost me $749 and I dual-boot Windows 10 and Ubuntu (spending 95% of my time in Ubuntu). It costs about 50% what a similarly specced Macbook would cost and is similar size/weight to the Macbook air. Build quality is fantastic (open with one hand, good keyboard, glass trackpad, etc). The battery life is not quite as good as a 13" MBA but better than a Macbook Pro. Overall I am extremely happy with it.

Considering that my last MBP cost me $2200 up front plus ~$300 in repairs over time and lasted me ~4.5 years. That's about $500 a year. This machine cost $750. If it lasts over a year and a half, the experiment is a success. So far it's been nothing but great.

xlayn 2 days ago 10 replies      
Do you like the MacBook but don't like the OS? go ahead and change it, you can have triple boot.

Tiny fonts and pop-ups don't have anything to do with the machine you are going to buy (unless you also want the website).

HP makes the elitebook, magnesium construction a la thinkpad.

And last but not least I would like to give this entry the RAGE post award, nomination points for

  What I want is a computer with:

  - Decent build quality
  - Decent performance and battery life
  - A decent website. It doesn't have to be an icon of web design, like apple.com. It can be simple and utilitarian, like an Amazon page. It just has to be honest and up to date. It should contain pictures, text, and a Buy button.
  - A clean OS without crapware or malware factory installed

  Is that too much to ask? Make one and you can have my money!
Apple, Lenovo, Microsoft and HP make systems like the one you want, you can install your choice of OS and buy it on Amazon or eBay if you want.

Bonus points for calling apple's website an icon of web design when it weighs almost 100mb.

Rage score 9/10.

nerdy 2 days ago 2 replies      
I went through the same thought process in December, never owned an Apple computer but needed a new laptop. Despite trying to fend off the idea of getting a Macbook (due to some combination of price/not wanting to succumb to herd mentality/proprietary nature of Apple) I caved.

It felt silly to spend a lot of time researching and configuring a custom laptop. I've never purchased a pre-built desktop but customizing a laptop seemed too poor a risk/reward proposition for my liking. Even with a great deal of effort it would be difficult to create something comparable to what I could simply purchase.

Apple's hardware is easy to appreciate down to small details. You can open the lid without holding the other half of the laptop (my previous laptop would lift up otherwise) and the screen isn't too loose either (doesn't bounce when you type hard). The speakers sound orders of magnitude better than any of my previous laptops. Battery life is great. Retina is beautiful & responsive (easy to read text, even while scrolling). The trackpad is unbelievable for a variety of reasons (click anywhere, accuracy to the very edge of the pad, multitouch/gestures). I hadn't previously seen any trackpad worth using let alone nearly as good as a standard mouse, but haven't ever needed to connect a mouse to this one. It's just a great overall experience.

There have been a couple software glitches that required a restart to fix, but other than that and the temps (~90C) it reaches under load it has been a pleasure (@ 2 months of heavy use). Figuring out OSX took a week or two.

I don't envy anyone who tries to save money while achieving a similar experience.

chao- 1 day ago 1 reply      
I have the "old X1 Carbon with a low res screen", except that the article is a little inaccurate: it doesn't have to have a low res screen. You can select a 2560x1440 screen. Even better, it's a non-reflective (matte) screen and even has a non-touch option! I guess I'm not alone in disliking reflective screens.

What sold me more than anything, though, was that I could select the best parts without being forced into a touchscreen. In so many other company's product lineups, if I wanted the 1440p screen (or whatever is highest), they were convinced I absolutely wanted a touchscreen, yes sirree! Thank you, Lenovo, for understanding my needs.

I don't remember which manufacturer it was, but I remember one option page where if I wanted to select the Core i7 option, it required the touchscreen upgrade as well. I don't see any logic behind that at all except "Hey, that i7 means you must be a big spender, guess we'll milk you for all you've got!"

As for OS, I run Linux Mint (Cinnamon) currently and experience no particular hardware or driver issues. Battery life is 5-8 hours depending on load, screen brightness, etc.

rmm 1 day ago 4 replies      
Microsoft Surface Book.

Easily the best device i have used in a very long time. Expensive (especially the i7 dGPU option) but i am amazed at how good this machine is.

Huge battery life, powerful when it needs to be. Connect to a dock and you have a powerful desktop.

Currently I use it when I'm working in the office doing design work (plugged into a couple of Dell U2715Hs), then when I am out on mine sites I use it as a portable machine in the field.

The tablet mode and pen are just a cherry on top. Writing notes in the field, marking up drawings etc. is ridiculously easy.

I love it.

theyCallMeSwift 2 days ago 2 replies      
Check out the Dell Developer Edition XPS. It comes with Ubuntu on it and has 100% full driver support out of the box. It's a great machine, we use them for work. Screen is great too and it looks really beautiful. https://sputnik.github.io/
knights123 2 days ago 4 replies      
It seems like you should really just buy a MacBook Pro. Sure the software isn't perfect, but it's an order of magnitude closer to perfect than Windows has been lately. You buy it, boot it up, and it works wonderfully. Of course the one day out of the year that it doesn't you go rage comment on how it sucks and it scares Windows users away from Mac, but the rest of the time you'd never consider going back to a typical PC.
PaulHoule 2 days ago 5 replies      
Don't get a laptop. The PC industry is doomed with phone envy. Desktop machines are one place you can really do better.
jolux 2 days ago 5 replies      
Honestly if you get a Mac you can install rEFInd on it and boot Linux just fine, or run Linux in a VM. The hardware also almost never breaks and especially if you have AppleCare they frequently replace it for free. Yes, it may be difficult to upgrade yourself. That's entirely true and if it's a deal breaker there's not much else around.

You might do well to look at https://puri.sm if you're a Linux user, they seem to make pretty good machines with free software down to the BIOS. There's also always http://minifree.org but they don't really make modern machines.

canthonytucci 1 day ago 3 replies      
I feel compelled to bring up how cheap, modular and abundant bad-ass second-hand thinkpads are. Not to mention that even the new-generation keyboards are amazing, and put the shallow crap on my macbook pro to shame.

The money you'd spend on even a 'cheap' macbook pro will get you something solidly built with a nice screen that's plenty fast for development work AND money left over for an extra battery or 2 + a brand new fat SSD + 16 GB brand new RAM + a nice dinner (maybe even with a date).

Just be careful not to get one from the awkward phase recently where they didn't have individual clicky-buttons and had this strange ceramic trackpad.

viraptor 2 days ago 2 replies      
> They display low ratings for their own products on their own website. What.

How is this a complaint? I'd love other companies to post both good and bad reviews without filtering.

> Except that page is deceptive, because that's actually the old X1 Carbon

Don't know about other locations, but here the X1 got a really nice discount once the 2016 model got announced. (also visible in the screenshot) I got it and I'm really happy about it. It has a high res screen (WQHD), so the same as the 2016 model, so that note in the post is also a mistake.

ne01 1 day ago 0 replies      
I love my ThinkPad X220 and highly recommend ThinkPad X series.

I remember the day UPS delivered it. I had a flash drive with a Debian 6 image ready to be installed, didn't even boot up the windows to check if everything was OK.

~2.5 years after using it for about 10 hours per day, the USB ports started to randomly disconnect and reconnect - it was very annoying.

I had no idea that the ThinkPad X series has 3 years of warranty, and I still cannot believe that Lenovo sent someone to my house (3 days after I contacted them!) and changed the motherboard - it's basically a brand new laptop. FYI, bought it directly from Lenovo's outlet website for ~$700.

My ThinkPad is ~4 years old and it's still one of my most beloved objects in this world.

virtualwhys 2 days ago 1 reply      
OP doesn't mention Dell Precision line?

The recently released 5510[1] is looking absolutely awesome; I will be picking one up when I return to the States in April to replace my current Precision M4700.

2 X SSD + 32GB memory, high end CPU, decent GPU with 3840 X 2160 screen...will be sorted for next few years. If you NewEgg a couple of 480GB SSDs you're around $2,500 for a beastly machine that weighs under 4 lbs (M4700 is nice but hefty, nearly 7 lbs).

[1] http://configure.us.dell.com/dellstore/config.aspx?c=us&cs=0...

hans0l074 1 day ago 1 reply      
I purchased a System76 Galago UltraPro (this model does not seem to be available anymore) at the end of 2014. I had been observing System76 for a year or more before I caved in and purchased this. I live in Finland - and they shipped it across. I really love this laptop. I'm also a Mac user (Air, Pro etc.) and I expected the build quality to be lower, but it was surprisingly solid. Perhaps their build quality has improved? Also, one of the most common complaints was about their keyboard, and it seems they have fixed this - I have had no problems. I love having an Ubuntu portable for my day-to-day devops work over VPN, and with everything set up (all my tools, IDEs etc.), it's been a pleasure. Their new lineup looks impressive (I'm tempted to get another one). But yes, the battery life is really bad - 2-3 hours max, or even less if you have IntelliJ or a VM running. So I have placed a power adapter at all my usual work locations :)

Edit: spelling
jseliger 2 days ago 2 replies      
I'm surprised not to see a link to the Librem laptops: https://www.crowdsupply.com/purism/librem-13. They're an attempt to solve the problem being posited, AFAICT, though they ship with Linux, not Windows.

I gave $10 to their initial Kickstarter but have never used one.

dnautics 2 days ago 2 replies      
Avoid the Dell Developer Edition XPS 13. There's no End key, and the trackpad is unusable. Even worse, avoid getting the Windows version and installing Linux on that: the trackpad is even less usable (it resets to the lower corner intermittently on clicks), and in order to get the wi-fi working I had to recompile the kernel driver, telling it that it was FOSS.
euske 1 day ago 2 replies      
There are many independent laptop PC manufacturers in Japan. Panasonic's Let's Note (cf. http://panasonic.jp/pc/ ) is known to have excellent battery life while being decent otherwise. Laptops from Mouse Computer (cf. http://www.mouse-jp.co.jp/ ) have almost no pre-installed crap. Unfortunately, many of them don't sell outside the country. They're probably too small/thin for Western people. People here tend to consider a 17-inch laptop "giant".
awongh 1 day ago 3 replies      
I had a ThinkPad running Ubuntu for a couple of years, around 2010-2013, but I was forced to switch to an MBP for one simple reason:

About 1/10 of the time, when I was at a coffee shop or somewhere similar and tried to connect to the wi-fi router, Ubuntu simply wouldn't be able to connect. This ThinkPad had one of the officially supported wifi chips in it, and it still didn't work.

Lots of forum posts told me to downgrade driver versions or some such thing; I wasted a bunch of time trying many different solutions, and nothing ever fixed it.

I'm a web dev, and without internet I can't get anything done, so I eventually had to get rid of it. Too bad, because the build quality was nice and it's a much better deal than an MBP, but unreliable internet is a deal-breaker.

(If there had been some way for me to pay someone with driver knowledge to diagnose and patch the driver problem I would have done that!)

spo81rty 2 days ago 2 replies      
I really like my Razer laptop. They basically look like black MacBook Pros. Quality seems very high, and I have enjoyed using it for several months. A quad-core CPU and 16 GB of RAM provide plenty of power for a dev box.
will_hughes 2 days ago 2 replies      
I've got an aging Asus Zenbook UX31A ultrabook which is amazingly thin and light, and still pretty decently fast. There was also very little crapware on it (was originally a Win7 machine with Win8 and then Win10 free upgrades).

My one complaint is that the (small, light) power bricks are apparently made of unobtainium. They're near impossible to find, and the ones you can find are extraordinarily expensive.

Given my hate for proprietary power adapters - I'm holding out for a USB-C powered replacement.

Asus doesn't appear to be offering anything, but Razer's Blade Stealth[1] looks like it could be a great option.

[1] http://www.razerzone.com/store/razer-blade-stealth

jitl 2 days ago 4 replies      
Did you look at Dell's line of XPS ultrabooks? I don't own one myself, but the ultra narrow bezels and industrial design look quite nice.
laumars 1 day ago 0 replies      
A year or so ago I was having the same dilemma, but then stumbled across the Samsung Chronos. Previously I'd only seen the lower end of Samsung laptops, and they were predictably crap. But the higher-end Chronos models are something special: sleek design, solid build quality, and decent specifications. My only quibble is that Samsung leans a little more towards Apple's design than HP's, so accessing components is a little more work than I'd have liked. But the upside is that my Chronos has excellent compatibility with Linux.

I'm very pleased with my laptop and would recommend the Chronos range to others.

stepvhen 2 days ago 2 replies      
I've bought old ThinkPad T-series machines on eBay for ~$300, installed Arch, and moved on with my life. I keep most of my relevant data in git repos or on a home server. If one dies, which usually happens after a good few years, I just get a new one. Not ideal, and certainly not for everybody, but a reasonable option if it fits your needs.
soared 2 days ago 1 reply      
Why is a decent website relevant at all? Would it make buying a laptop easier? Sure. But 90% of consumers are going to do almost all of their research off-site and enter the Lenovo site looking for specific products or categories. Not everyone follows the Apple-Disney ethos of designing every consumer-facing thing to be flawless.

I haven't done the research, but I bet you'd see Lenovo pages ranking higher for long-tail keywords (specific products) than more general pages.

jmspring 2 days ago 3 replies      
One option the post didn't address: buy Apple hardware and run something other than OS X on it.

Seems simple. Sure, you are locked into Apple hardware issues/resolutions, but Apple today is what the ThinkPad was for years (pre-Lenovo): hardware that mostly just works.

Insert arguments about customization, configurability, etc. Most people don't care. Lenovo doesn't deliver as well as IBM did on the ThinkPad line. What's the alternative besides Apple?

orionblastar 2 days ago 0 replies      
I have an Acer laptop I won from a church basket raffle. It only has a 1.5 GHz AMD dual-core CPU, 3 GB of RAM, and a 250 GB hard drive, so it runs slow at first and takes a while to load everything.

It upgraded to Windows 10 Pro quite well.

It is one of those cheaper laptops and it has an AMD GPU as well. I don't know how user serviceable it is, but it hasn't needed any work yet.

I used to have Ubuntu on it, but my wife didn't like it, so I had to put Windows back on it, as we share the laptop. So I know it runs Ubuntu very well - in fact, it runs Ubuntu faster than it does Windows.

But PC quality has gone down since manufacturing moved to China. Motherboards, you are lucky if they last three years now. My son had a custom ATX system with an ASUS motherboard for the Intel 1150 socket; it went out after maybe two years and was replaced with an Intel-brand motherboard that we had to buy from eBay, because the socket is so old they don't make new motherboards for it anymore. We got a DataVac to clean the dust out of it and would replace the CPU thermal paste every six months or so.

jgeerts 15 hours ago 0 replies      
> "they sell things that are locked down, both physically and in software"

Well, I can understand the physical part, but fortunately you can solve that with money; you just have to pay for your hardware upfront. It is expensive, but I work on my MacBook Pro > 9 hrs a day. It's four years old now and doesn't need replacement yet; it's doing fine, and the battery is doing fine.

Most people only look at how much an item costs at the time of purchase, but a fairer comparison would include the span of time your device remains sufficient and doesn't need replacement.

I use my laptop for my job, and the amount of money I make with it outweighs its cost by far. Most professionals invest in good tools for their trade; why would it be different in IT?

As for the software part, the software that Apple provides is astonishing to me. For example, wiping a hard drive multiple times is just a setting; on Windows you have to go looking for freeware to do the same thing. It comes with so many possibilities out of the box that I can't think of anything I would want to do that I can't do now.
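For reference, the multi-pass wipe described above also has a command-line counterpart on OS X; a minimal sketch using the built-in diskutil tool (the identifier disk2 is a placeholder - verify the right disk with `diskutil list` before running anything destructive):

```shell
# Multi-pass secure erase via diskutil, which ships with OS X.
# WARNING: this destroys all data on the target disk.
# '/dev/disk2' is a placeholder; confirm the identifier first.
diskutil list                       # locate the disk to be wiped
diskutil secureErase 2 /dev/disk2   # level 2 = 7-pass wipe
```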

Just yesterday I sat next to a guy running Ubuntu, and I couldn't believe what I saw: when he tried to search his hard drive he typed in his search terms... and the first things that popped up were shopping items, faster than the items present on his own hard drive... Really?!?

A MacBook pro isn't perfect and it is costly but it's simply the best option for me now.

k_bx 1 day ago 2 replies      
One strong point that not many here discuss is how awful the websites of non-Apple manufacturers are. They are a strong indication of how badly things are organized and developed at those companies: if they can't make a nice website that answers all your questions in an attractive manner, how can they expect you to buy stuff from them? It has always struck me.
flatroze 1 day ago 0 replies      
Go for gaming PCs/laptops. It's the same with clothes today: you can get the "survival/military" type of outerwear, which is better in quality and tends to last longer than usual outerwear. You will look like a show-off, but it's really where the jeans companies used to be when they first appeared (they were originally made for gold miners and construction workers). I would get something like the Aorus X3 or Razer Blade Stealth; both feature really good screens and great hardware. They also seem to be well engineered, unlike those plastic toys from Dell, Lenovo, and all the others.
Iv 2 days ago 1 reply      
What do you think of the Novena? https://www.crowdsupply.com/sutajio-kosagi/novena

If you want to make a good laptop, there are people doing that, in an open hardware and open source way.

wangchow 2 days ago 1 reply      
Honestly I swear by my Surface Pro 3. Maybe learn to hack some device drivers and get Ubuntu running (better) on a device like this. Some people already have it working:


In my opinion it's the most well-rounded device out there.

dpc_pw 2 days ago 1 reply      
I'm very happy with my Asus UX305F. I have Linux installed, and everything except the brightness keys works perfectly. Great build quality, matte screen, and the battery lasts very long. And it was cheaper than most of what people mention here. For CPU-intensive tasks, I just use my desktop.
TwoBit 1 day ago 0 replies      
And he didn't even mention the terribly shitty trackpad that every single PC laptop has. And its shitty Synaptics software.
omphalos 1 day ago 0 replies      
I wish the author included something about the Intel Management Engine. It runs closed source software with privileged access to your entire machine including support for remote execution, has a history of critical vulnerabilities, and is present on every current Intel chip.
leecarraher 2 days ago 1 reply      
I have a 2014 X1 Carbon. Solid build, decent battery under Linux (5 hrs under average load). I'll probably get an X1 next round too. Don't get bogged down with specs, though: the newest is outdated in a year, and if you don't get a new laptop every year, arguing specs is pretty moot.

Drawbacks: soldered-on RAM, a premium price, and it's not super modular.

Positives: it's solid and nicely designed, with a great keyboard (and this is from an otherwise chiclet-style hater - really nice travel).
thedaemon 1 day ago 0 replies      
The headline is misleading. It should say a better laptop; a PC is not the same as a laptop. I was hoping for an in-depth article about modern computer design, but instead found a rant about bad websites and the lack of well-engineered laptops.
akhilcacharya 2 days ago 2 replies      
After I bought my MBPr 2014 I started looking around at similar machines.

In my mind, the only competition is with the Surface Book, but even that's debatable given the screen size.

vardump 2 days ago 6 replies      
My dream laptop:

1) No malware/spyware tainted brand.

2) At least 32 GB ECC RAM, 16 GB is so 2010. ECC, because memory errors do happen and cause instability. 64 GB option wouldn't hurt either.

3) HiDPI (retina) display (IPS or equivalent)

4) Fast PCI-e attached SSD.

5) Ability to run two 4k monitors @60 Hz.

6) Stable USB3 ports (My 2015 RMBP keeps resetting USB3 ports, making it nearly impossible to run VMs on USB3 drives)

7) ~10h+ on battery.

S_A_P 1 day ago 0 replies      
Some of the big PC vendors have started catching up. I just bought an HP Envy for my father-in-law, and the hardware is pretty solid, with some features I really like (256 GB SSD and 1 TB hybrid drive, 16 GB RAM stock, nice display). It is really snappy, and despite his being non-technical and wary of moving from Windows 7 to 10, he got on with it really quickly.

I think there is another issue with PC battery life here. Nothing heats up/drains my MacBook Pro's battery quicker than Windows 10. I have tried various power settings, but nothing ever quite "fixes" it. I have had to do the following "tweaks" to improve battery performance on Windows 10:

1) Disable Cortana (at least as much as you are able to)

2) Remove the "phone companion" app that seems to reinstall with every new update

3) Adjust power management, which noticeably slows performance and/or annoyingly shuts down the screen

4) Randomly go in and kill rogue battery-robbing processes
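For hunting down which rogue processes are actually draining the battery, Windows does ship a built-in diagnostic; a sketch using the stock powercfg tool, run from an elevated command prompt (output paths here are arbitrary examples):

```shell
:: Generate an HTML report of battery capacity and drain history.
powercfg /batteryreport /output "%USERPROFILE%\battery-report.html"

:: Trace energy misbehavior (rogue processes, bad drivers) for 60 seconds.
powercfg /energy /output "%USERPROFILE%\energy-report.html" /duration 60

:: List anything currently blocking sleep.
powercfg /requests
```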

Mac OS seems to seamlessly handle power management regardless of what I'm doing, including multitrack audio recording/processing. I never hear the fan while running OS X alone. But without fail my laptop heats up and the fan starts cranking while I have Windows running. I am getting closer to never needing to run Windows, but I still do some .NET work and work with SQL Server. For me, it's a huge turn-off to hear a laptop constantly fighting to stay cool. In fact, this was the major flaw I saw with the HP laptop I got my father-in-law: it would get hot and crank the fan. In my anecdotal opinion, Windows is a poor laptop OS, and I don't see how there could be a better PC/laptop/tablet until Microsoft figures out how to manage power. I don't have a use case for Linux, so things could be better there, but on laptops Linux is still relegated to niche users for the most part.

sreenadh 1 day ago 2 replies      
I am surprised that the author did not consider Dell. I'm a T410i user, but I will not be buying another ThinkPad based on the current lineup. Plus I have many issues with Linux drivers. Windows is no more reliable, and I am forced to move on. The Mac is great, but the inability to modify it bugs me. I have an MBP also.

Of course, nothing beats a desktop PC, but it's tough while travelling or when you just want to move around while working.

Michael Dell taking the company private will be good if he is planning to focus on making quality hardware like the Dell Developer Edition XPS + Project Sputnik, which is an interesting project. But it still needs to mature.

So if I have to buy a machine today, I am lost. There is a need for a quality machine for developers, as we spend a long time with it. It needs to be durable and light, with a matte screen (very tough to get nowadays), low heat, a good keyboard (mechanical like the old ThinkPads), good driver support for Linux, and a decent battery - PLUS a higher score on iFixit. I like the rating of the XPS 13 at https://www.ifixit.com/Teardown/Dell+XPS+13+Teardown/36157

melted 1 day ago 0 replies      
Just get a MacBook Pro and skip the aggravation entirely. You pay more for a reason. That reason is that no one else knows how to make a proper laptop anymore.
solipsism 2 days ago 0 replies      
Why isn't there a Linux distribution specifically for installing on MacBooks? Given how few MacBook models there are, you could include exactly (and only) the correct drivers and optimize everything to work perfectly, including the things that often require tweaking (WiFi, screen brightness, power settings, etc.).

If I knew it would not be an adventure installing Linux on my MBP, I'd pay good money for such a distro!

nchudleigh 1 day ago 0 replies      
It would be really cool if someone could do a clean developer book.

Software: Would love something that comes with a Debian distro installed. Something minimal and clean.

Body: Brushed metal exterior, hard rubber for the hand rests, and no sharp edges (my wrists hate the MBP's edges). Nice big touchpad. Elegant branding (love the lightbar on the Chromebook Pixels), but perhaps even more subtle (the new MacBooks do a good job of this, eliminating the obnoxious glowing Apple). Thin would be good.

Hardware: A 500 GB SSD would be more than enough. Don't need a GFX card. 8 GB RAM is more than enough. Good CPU, please. Fanless would be amazing. USB-C would be interesting, at least a couple of those. And then a pair of USB 3 ports and a headphone/mic jack. Maybe an HDMI. It would be good if the internals were simple enough to swap everything out with a small Phillips head.

MOST IMPORTANT THING: I can buy replacement parts on your site for everything (screen, keyboard, shell, all of the hardware, touchpad, etc.).

Price: I would pay anywhere from $1-2k US for this computer. Honestly, just to support some healthy competition in the space, I would even go to $2.5k.

Ifhax 2 days ago 2 replies      
You should get a Dell Latitude - Woot periodically has excellent deals on "last year's models", brand-new with a 3-year warranty. They are reliable, modifiable, and get excellent battery life. They come pre-installed with Windows Professional with minimal bloatware - but forget about it. Scrub off the Windoze crap, install Linux, and you will have an awesome little machine.
joonoro 1 day ago 1 reply      
The new ThinkPads are indeed awful; get the X220, aka the last good one. That should do you well until something worthwhile comes along to replace it (maybe HP?)

- Last old-style ThinkPad with great build quality

- Last one with the classic keyboard

- Small form factor

- Sandy Bridge i7, which still kicks butt because it's not a low-voltage part (check the benchmarks)

- up to 16 GB memory

- 10 hours of battery with a new 9-cell battery (+ tlp package on linux)

- Linux or Windows works great
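The tlp package mentioned in the battery bullet takes very little setup; a minimal sketch for Debian/Ubuntu-family distros (package names may differ on other distros, and ThinkPad battery-charge thresholds may need extra kernel modules such as tp-smapi or acpi-call):

```shell
# Install and start TLP, a Linux laptop power-management tool.
sudo apt-get install tlp
sudo tlp start       # apply power-saving settings immediately
sudo tlp-stat -b     # inspect battery state and charge thresholds
```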

Uptrenda 1 day ago 0 replies      
Has the quality of laptops really been decreasing or are we just remembering "the old days" with nostalgia? Seems like a perfect question that could be easily answered with proper research.

My guess: if you're prepared to pay several thousand, you can still buy quality modern laptops today (like the ThinkPad P70). Or do it the cheap way: buy old hardware that's known for its reliability, like used ThinkPads (I'm currently using a ThinkPad T520 myself). There's a good guide here for choosing a ThinkPad model if you're a fan of solid laptops that will last: https://wiki.installgentoo.com/images/8/8f/Tpg140901.png -- it's a bit memey but has solid advice.

mingabunga 2 days ago 2 replies      
I've always found ASUS laptops well built, with great screens, and very reliable. Their Zenbooks are nice.
glossyscr 1 day ago 2 replies      
Not disagreeing, but there is one notebook that is what the MacBook should have been:

- Same build quality as Apple

- Thin and light as the Macbook

- Pixel density higher than Retina

- Powerful CPU

- Just released


richardboegli 11 hours ago 0 replies      


Only downside is the 8 GB RAM cap; otherwise a great machine.

Once a 16/32 GB version is available, it'll be purchased pretty quickly ;)

agentgt 1 day ago 0 replies      
I too came to the same conclusion that the article did quite some time ago. Apple just owns on build quality.

I am a longtime Lenovo + Linux user who recently switched to Mac, and as much as I hate the OS X UI, Apple's build quality is worth it (that, and Skype sort of works better on Mac than on Linux).

The thing that shocks me with my new Mac is how incredibly unstable OS X is at times (El Capitan and previous versions). You would think that, given such hardware control, my Mac would be more stable than my parents' Windows 10 Lenovo Yoga, or even Linux on crappy hardware, but lately that has not been the case for me.

superobserver 2 days ago 4 replies      
Why not buy a Chromebook Pixel (2015)? Seriously.
niutech 1 day ago 0 replies      
No offense, but the author has a first-world problem. A notebook should be functional - powerful, lightweight, long-lasting - not necessarily pretty. It is not an exhibit; it's a tool. Why does he need a good web page to buy it?

That said, he can grab a Chromebook and install a full-blown Linux distro on it.
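One route to that full Linux setup on a Chromebook, at the time, was crouton, which runs a distro in a chroot alongside Chrome OS rather than replacing it; a hedged sketch (assumes developer mode is enabled and the crouton script has already been downloaded to ~/Downloads from its project page):

```shell
# Install an Ubuntu chroot with the Xfce desktop via crouton,
# run from the Chrome OS shell (crosh -> shell) in developer mode.
sudo sh ~/Downloads/crouton -r trusty -t xfce

# Enter the chroot and start the desktop later on:
sudo enter-chroot startxfce4
```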

cowardlydragon 1 day ago 0 replies      
Not only that, all desktop OSes are regressing. Well, Windows 10 is an improvement over the massive shitstorm of Windows 8, where Microsoft apparently tried to commit corporate suicide.

Ubuntu and desktop Linux continue to be extremely frustrating. I tried SUSE, Ubuntu, and Mint on the last box I built for Xmas, and Ubuntu was the only one that installed cleanly - a marked improvement from the previous couple of releases.

OS X is hardware-locked, and it has many, many problems and frustrations, especially if you are used to key bindings from the PC/Linux realm.

Desktop/laptop is steadily dying. We are all basically waiting on a hybrid of Android to take over desktop.

I continue to be mystified as to why Google does not pursue a desktop Linux operating system that's good (i.e., not ChromeOS). It's a gigantic market to take over, and Microsoft was begging someone to take it from them throughout the Windows 8 debacle. With an unnoticeable hit to their bottom line they could assemble a team that would clean up CrossOver Office, merge with Android, make tons of money from the Android app store, and get search/voice search/services integration at the OS level...

ChromeOS could/should be this, but either it's not properly funded to handle Windows interoperability more seamlessly (put WINE on steroids and fund it), or it doesn't have very good people on it.

Roritharr 1 day ago 1 reply      
My real issue is finding a decent 13-inch laptop that can drive 2x 4K monitors at 60 Hz via simple docking.

The Surface Book failed in that regard and I haven't gotten confirmation from Dell that the TB 15 Dock will support it.

In the 15-inch category there is only the HP ZBook, with its gorgeous 4 Thunderbolt ports, that allows it. But that's a desktop replacement, not a portable 13-incher.

Currently I drive two 2560x1440 monitors on my Surface Pro 2, one via Mini DisplayPort and one via USB 3, and I'm kinda happy with it; I just need more RAM... But upgrading and then hitting the limits again so soon would give me buyer's remorse quickly.

josteink 1 day ago 0 replies      
So he says there are some good computers out there (like the Lenovo X1 Carbon), but due to bad luck and timing on his part, it's a bad choice for another month.

If you just wipe Windows clean off those Lenovos and use them for Linux (Fedora runs great on them!), you won't have to worry about those Superfish issues and everything else plaguing the world of Windows users.

I can see why you might decide not to reward Lenovo with your money after those incidents, but theirs is still some of the best hardware you are going to get.

dfar1 1 day ago 0 replies      
This article covers one computer brand, mostly based on their poor website design... and that's it? That's why we need a better PC?
Matthias247 1 day ago 0 replies      
I agree with the content, but not necessarily with the headline.

If you need a desktop PC, the options are plentiful: you can pick from a large set of components and build pretty much whatever you want, from a decent form factor to big towers with powerful, good hardware.

For high-quality notebooks, however, I came to the same conclusion as the author. I wanted to buy a new personal notebook last year and actually did not want to get a MacBook, as I am not that into Apple and OS X. But I could not really find an alternative, even in the same (high) price region, so I got an MBP 13 and I'm happy with it. For the Windows machines, either the build quality is way lower, battery life is lower, screen resolution is lower, the input devices suck, or the price tag is even higher (X1 Carbon, XPS 13; the Surface Book too, but it wasn't available in Germany anyway).

Another thing that influenced my decision heavily was the touchpad. On the MBP it works beautifully, and in combination with the gesture features of the OS it makes working on such a small screen, without an external mouse, much more pleasant. On the Windows machines the touchpad response is mostly somewhere between bad and mediocre, and there's no swipe between workspaces and such.

beyondcompute 1 day ago 0 replies      
Thanks for the post. I think we need more of these. I too have been feeling that the companies in a sense "aren't even trying" to show some care and respect for the customer. We agree to pay money for "hardware"; they are accustomed to just "assembling together" some electronic components. There's no inspiration, no consideration, no ambition to provide an experience that lasts (even though hardware quickly becomes obsolete, the brand image and the impression you leave can stay relevant for a much longer time).

For me personally the only decent machine out there is the MacBook Air. It does not make false promises about performance (like other laptops do, only to fail miserably later).

My ideal machine would be lightweight plastic with a mechanical keyboard. Performance is not really critical. Something like a Surface Pro with a 14-inch display and a mechanical keyboard (a touch screen is not necessary; a good, MacBook-like touchpad is) would do.
TurboHaskal 1 day ago 1 reply      
The author thinks apple.com is the pinnacle of web design, values form over function (he prefers the X1 Carbon), gets overwhelmed when given too many choices, dislikes seeing user reviews on a product site, and probably lies on the ground in a fetal position at the thought of replacing a hard drive by himself.

He is exactly the kind of person Apple makes products for. Just get a Macbook already.

btilly 2 days ago 0 replies      
I just had a laptop from https://www.thinkpenguin.com/ arrive, and so far it seems good.

See http://www.linux.org/threads/how-great-is-the-korora.5955/ for a sample review.

lhnz 1 day ago 1 reply      
Sorry in advance for taking the headline and talking about something else.

What I want is a real personal computer that can fit in my pocket. A mobile device that is extremely open and very easy to hack on. I would like it to expose sensory readings in a UI and then provide a simple if-this-then-that UI that would allow me to teach daemons to respond to events that occur during the day and additionally support scripting for more complex automations. And once this device exists I'd like open protocols to exist to help other I/O devices to expose themselves to it.

Simply put I'd like devices to start offering extra senses to us beyond the five we were born with, and for these to be unencumbered from walled gardens.

Does such a thing exist?

guelo 2 days ago 1 reply      
I wish I could buy laptops that are not built in China. The Chinese army has zero scruples with regards to hacking. The hack where they got all US federal employees' info was an act of war as far as I'm concerned. I just don't see any reason why they would not be putting in backdoors into all the computers they manufacture.
tobyhinloopen 1 day ago 0 replies      
I agree with the article. I switched to Apple because the OS & hardware were superior, but they aren't anymore. Hardware vendors are catching up and Windows 10 is actually not that bad. OS X is actually getting worse.
manigandham 1 day ago 0 replies      
CES revealed a bunch of new laptops that seem promising: LG Gram 15 [1], Samsung Notebook 9 [2], Dell 7000 Series [3]

1. http://www.lg.com/us/laptops/lg-gram-15Z960-A.AA75U1-ultra-s...

2. http://news.samsung.com/global/samsung-introduces-new-notebo...

3. http://www.theverge.com/2016/1/6/10720212/dell-latitude-13-w...

partiallypro 1 day ago 0 replies      
The article ignores the Surface Book (though it does mention the Surface itself), the Razer Blade, and the Dell XPS lines, which are all pretty renowned for their quality. OEMs make some crappy PCs, no doubt... but it has gotten a lot better in the past 2 years. HP's Spectre is also a really nice device.
matthewwiese 2 days ago 0 replies      
I'm quite happy with my 11" MacBook Air from 2012; she's quite the lil devil still. I stay on Yosemite, and my machine is strong enough for me. If the author requires power, he/she could build a decent Xeon desktop, keep it at home, and just remote into it for intensive development. I keep a backup Core 2 Duo at my apartment for work and play, and it's still packing enough of a punch to do most of my work (the only bottleneck is latency).

To be honest, sometimes I even think my MBA is unnecessary and contemplate a cheap Asus Eee PC netbook, because I live in the command line (and consequently don't need to load up heavy IDEs). I understand if that approach is too much work for most people who just want a simple solution without all the fuss, but where's the fun in that?

intrasight 1 day ago 0 replies      
I've been buying Lenovos for about 6 years and am mostly happy. I always buy a bare-bones version, add my own memory and SSD, and do a clean OS install (after updating all firmware). A "new machine" is a 20-hour minimum time commitment, so I don't do it more frequently than necessary. I always overlap two machines so I have a backup.

I am due for an upgrade and was looking at the T460p. But I just read about the bad decisions Lenovo made regarding internal storage so now I am looking for something else.

thephilsproject 1 day ago 0 replies      
I had this same problem almost exactly a year ago. I now have a Surface Pro 3.

I wanted a decent CPU, a high-res screen, good build, long battery life, and a small chassis. Then I chose whichever was cheapest that met those criteria.

I've not been disappointed with my decision!

pmontra 1 day ago 0 replies      
I've been happy with the first ZBook. 1080p, SSD or spinning disks, you can replace the DVD drive with another disk, it takes up to 32 GB RAM, and apparently you can replace the CPU and the NVIDIA card too. The ZBook G2 specs are even better.

The only minus on your list would be the battery. It can reach maybe 4 hours (Linux) but not more. I knew that, and it's not a problem; there is always a power plug nearby where I work.

The only minus on my list was the keyboard, because it has a number pad, but the keys are excellent. It's also definitely not a 1 kg laptop, and the power brick is as heavy as an actual brick, but I don't care much.

aap_ 1 day ago 0 replies      
I wholeheartedly agree. I'm still using a T61 because I don't really like any of the models that succeeded it. The new keyboard, the reduced keyboard layout, the huge touchpad (which I disable anyway) with those reduced buttons, the small screen resolutions compared to my 1400x1050 (unless you pay for the ultra-expensive display panel)... it's just not attractive anymore. And as the author said, ThinkPads are still among the best laptops :/ I'd be willing to throw some money at Lenovo for a laptop I want to have, but it looks like they're no longer producing those :(
headmelted 1 day ago 1 reply      
Am I missing something? Can't you just boot Ubuntu on a Macbook if that's what you want to do?

(This is a genuine question - I've always found OSX sufferable enough to not need to, and I have a fanless Acer for Ubuntu that travels with me).

julochrobak 1 day ago 1 reply      
Does anyone have experience with the Tuxedo Computers - http://www.tuxedocomputers.com ? They have just recently released an interesting 13.3" InfinityBook.
8note 1 day ago 0 replies      
My Acer S7 does pretty well.

I've repaired it after severe rain damage, and it's generally easy to disassemble and replace parts in.

It's got a reasonably nice screen, nice build quality, and reasonably good performance, though you can't change out the wifi chip or the RAM. The website also said enough to make a good choice, and the Windows install it came with just had some Acer junk on it.

szukai 1 day ago 0 replies      
Lenovo only inherited the name from IBM; it's not the same as it used to be... I really wish the author of the post had looked beyond one or two laptops, given the title he used for his writeup.
vok5 1 day ago 0 replies      
I bought a Lafité here: http://www.pcspecialist.co.uk/notebooks/lafiteII/.

I believe it is a rebranded Clevo, which is custom-built for you. Lately I changed the WiFi card without any issue. You can also change the SSD if you want, same for the RAM. The best parts: build quality; price (you don't pay for an OS unless you want one, and you pay only for the parts you want); a nice keyboard; light weight; and a great screen.

hguant 1 day ago 2 replies      
I recommend System76. Good laptops, clear website, and you know exactly what you're getting with the OS.


fsloth 1 day ago 0 replies      
A good anecdote but I disagree with the notion that Surface is not a laptop but a tablet. I use my Surface Pro 4 i5 as a laptop and it's a pretty good one at that.
funkaster 1 day ago 0 replies      
For me, that better PC is the Chromebook Pixel (2015), of course, running linux instead of the crippled Chrome OS. It's my main dev machine and I use it more than the MBP that I got @ work.
unsignedint 2 days ago 1 reply      
I'm finding a $300 Acer laptop (or netbook, one of their Aspire line) loaded with Linux surprisingly usable. (It was preloaded with Linpus, but I've replaced it with my own.) At 1.10 GHz dual-core with 4GB RAM, it's a bit slow at times, but as long as I'm not running an intensive process on it, it's a very usable on-the-go machine. (Mainly TeX, translations, web stuff, browsing, and the occasional spreadsheet is what I do on the machine.) I do have a desktop machine for things that require more power.
bechampion 1 day ago 0 replies      
I was in this situation. I moved away from OS X a month ago and I couldn't be happier.

I've moved from a MacBook Pro Retina to a Lenovo x250/i5/8GB/HD screen, running Xubuntu 15.10... most bits work (dual screen / VPN / a nice RDP app (Remmina) / etc.)

I've had major issues with Webex (need to have a jre 32bit running on a 64 bit os) but other than that all good!

Good luck!

ripberge 2 days ago 1 reply      
I have a Lenovo X1 carbon and I re-installed Windows to remove all the Lenovo crapware after I bought it. Windows is easy to download and install now. It will save your license, but blow away everything else. It only takes about 30-45 minutes. But I do agree, this is totally ridiculous that you have to do this.

I did buy an MS Surface and it was by far the best PC experience I ever had, however they don't make a keyboard cover with a touch-stick, that's the only reason I went back to a ThinkPad.

mntmn 1 day ago 1 reply      
I recently got a Thinkpad T450s after using iBook & MacBooks for 10 years and I'm very pleasantly surprised about build quality and feel.
mt_caret 1 day ago 1 reply      
What about Vaios? The specs on it are impressive (WQHD, 16GB memory, 20h+ battery life) with a sleek build. Wondering if it goes along well with linux...
sergiotapia 2 days ago 2 replies      
True, there should be something to rival Apple in the laptop arena.

Unfortunately there isn't. Apple just makes hardware that's so sexy and intuitive to use. Example, Apple TV. I never used one before but I demo'd one at Best Buy the other day and in 10 seconds I knew how to operate it. Apple does this best, their earnings prove that.

Just buy a Macbook Air, sturdy as hell, long battery life, and great Unix-enough-y OS.

joefreeman 1 day ago 1 reply      
I went through a similar process recently after getting fed up with Apple's direction. I ended up getting a Surface Pro 4. The build quality seems good, battery is ok. The software/drivers are terrible - it rarely wakes up from sleep, and I don't get much delight from using Windows. Hoping Linux support improves in the near future.
narrator 2 days ago 1 reply      
I have a Toshiba p835-z370. It's a few years old but it still runs like a champ and all the hardware is supported on Ubuntu. Toshiba build quality is pretty good.

Here's the teardown: https://www.youtube.com/watch?v=hMH0r76zdt0 . It's only upgradeable to 6gb though. Would love it if they had a 16gb upgradeable version.

nqzero 2 days ago 1 reply      
for me, aspect ratio is huge - if you want something better than 16:9 in a pure laptop (not a tablet or convertible) your options are really limited


i don't understand how there can be so little differentiation in this market

Aardwolf 1 day ago 0 replies      
Personally I interpret the word PC as something like this:


so it's funny that it's about laptops :)

SippinLean 2 days ago 0 replies      
Is Samsung Series 9 still a thing? When I bought mine it was thinner than the MB Air at that point, similar rigid aluminum build quality.
joshAg 1 day ago 1 reply      
For those who don't know much about non-Apple PCs:

The only laptops worth considering are the enterprise lines from dell, hp, and lenovo. They are night and day differences from the consumer products, because they are usually entirely separate divisions. Within this, certain lines are better than others. For example, with lenovo I would only really consider a t-series, w-series, or x-series. The p-series looks promising, but that just came out, so maybe don't rush into that if you want to reduce risk.

If the laptop doesn't have the RAM or hard drive soldered on, it's probably cheaper to buy from newegg and install an upgrade yourself than it is to upgrade through the product configurator. This does not void the warranty. The support pages from the website have explicit instruction manuals for doing this yourself as well as full disassembly instructions.

The enterprise laptops have support pages with crapware free drivers. If you don't feel like surgically removing crapware from a new installation, just nuke it from space and install the OS fresh. Heck, most time you don't even need to install those drivers, because the base windows drivers are fine for most things (there's probably a performance bump to using the drivers, and some things, like the fingerprint reader, will need a driver), so you could just skip the driver install.

Since these are enterprise laptops, you can still get 7 and 8 preinstalled (thank god for corporate compliance policies, am i right?! ;). MS call it "downgrade rights". Much like Apple, the best MS OS is the one released in 2009 (windows 7), so splurge for the "downgrade" to it if it's offered for the laptop you want.

If you care inordinately about crapware and don't want to spend time nuking a fresh laptop, then buy a laptop (again only enterprise laptops from lenovo, hp, and dell) from microsoft directly, since those don't come with spyware: http://www.microsoftstore.com/store?SiteID=msusa&Locale=en_U...

You can also get warranties that last much longer than 2 years (some go up to 5), cover accidental damage, or cover the battery, but the specific policies offered depend on which company you go with. I have taken advantage of the lenovo accidental damage warranty a few times, and it was great. The default warranty requires you to ship the laptop off (lenovo overnights me a box with a prepaid overnight shipping label inside it so all I have to do is pack my laptop and drop it at the UPS store) and then they return it within a week, but you can also get an onsite repair warranty where someone will come to you to fix the laptop.

Trust me, it's worth dealing with a website that isn't as shiny as apple's website.

garyclarke27 1 day ago 1 reply      
The MacBook Pro is the best Windows PC - it runs perfectly via Boot Camp. So much better than the slower, flakier virtual options - I tried all of them - and with fast reboot it's easy to switch. I need Windows for 64-bit Excel 2016, which is much faster than the 32-bit-only Mac version. The 2 display ports are also great - it drives 2 32-inch high-res monitors perfectly.
justaaron 1 day ago 1 reply      
all these comments to debate what is actually still a standing point:

name a non-mac laptop that doesn't suck

(is it injection molded plastic with a screen that will eventually flop flop flop? then it sucks!)

then let's find a distro that doesn't suck! (does it include binary blob drivers? then it sucks! is it unity? then it sucks!)

dh997 1 day ago 0 replies      
I like and use Apple gear but I don't like untinkerable black boxes or opaque firmwares.

I think we need beautiful, functional servers, computers, handhelds and wearables with open-source desktop lithography and 3D material deposition. It would take about $20 million and the right people to get going, but it could be hw UNIX -> Linux.

trynumber9 1 day ago 0 replies      
Well, the VAIO Z returns soon after a brief hiatus. Maybe look into those?

[0]: http://us.vaio.com/vaio-z/

[1]: http://us.vaio.com/vaio-z-flip/

jasonszhao 2 days ago 0 replies      
Why not Asus? (Never got one, have a MacBook.)
mamcx 1 day ago 0 replies      
Currently my biggest problem with Apple machines is the low amount of storage and, in the case of the new iMacs and MacBooks, how costly and IMPOSSIBLE to replace it is.

I wish Apple would forget spinning hard drives, go full-on SSD, and ship with > 512 GB.

shaurz 1 day ago 0 replies      
This is why I don't buy laptop computers... they are still overpriced toys. I mean, on a theoretical level, the idea of a mobile computer is appealing, but in practice they all suck very badly.
bduerst 1 day ago 0 replies      
I used an X1 Carbon for years - it's a sweet machine. If you get one, make sure you don't order the generation with the touch-pad control bar, because it craps out.
agarwalrishi 1 day ago 1 reply      
Yup. I agree. We need a good PC with the form factor of Mac mini but without Mac OS. Only option I have is to buy a Mac mini and install Ubuntu on that. Or buy Intel mini PC (Intel NUC). Both options are quite expensive, at least in India.
kennycox 1 day ago 0 replies      
How much RAM do you need for gaming? I have 8GB of RAM, but games hardly take 4GB to function. One of my friends has 32GB of RAM, but I think he just paid more for it. Is more than 8GB of RAM needed for gaming?
chemmail 2 days ago 0 replies      
Dell XPS 13/15
AdmiralAsshat 1 day ago 0 replies      
It's a shame that Samsung apparently doesn't consider the laptop market worth their time anymore. Their Ativ Book 9 was a great little laptop.
gonader 1 day ago 0 replies      
Dell XPS 13 is an amazingly good laptop, very well supported in Arch Linux
vegabook 1 day ago 1 reply      
I've been very happy with my Dell Precision M3800 which has outstanding build quality, easily comparable to my previous MacBook, and runs Ubuntu perfectly. In fact I like it better hardware-wise than the macbooks because it has a much more pleasant keyboard-surround surface, which is not freezing cold like my MacBook, nor does it have that unpleasant vertical front leading edge which in my opinion is uncomfortable when touch typing.

On a separate note though, Lenovo has kinda promised to build a 90's-level robust "retro-Thinkpad", the feedback forums for which were wildly successful.


finishingmove 1 day ago 0 replies      
If I were getting a new laptop right now, I'd get the Dell XPS 13 (2015). I get the post's sentiment, but what really sucks IMO is not the PC/laptop market but mobile...
vatotemking 2 days ago 1 reply      
How about MS Surface Book? Fits all the criteria that OP is looking for.
crudbug 1 day ago 0 replies      
I have both a Thinkpad T540 & an MBP. The thing I miss the most is the dual mini-DisplayPort for supporting triple-monitor (2K) desk work.
hendry 1 day ago 0 replies      
The Lenovo X1C3 is not that bad. I recommend it & it runs Archlinux beautifully. https://natalian.org/2015/02/18/Archlinux_on_a_Lenovo_X1C3/

Yes, Lenovo sales and support are pretty hopeless. Just got to factor that fact in. Must say Apple are pretty hopeless unless you get Apple care.

Anyway, the real problem in my mind is that there is pretty much no competition to Intel nowadays.

IkmoIkmo 1 day ago 0 replies      
The Surface Book and the Dell XPS 13 were not mentioned, while they're easily both in the top 3.
Aoyagi 1 day ago 0 replies      
And while we're at it, why don't we have some VA or even OLED matte displays for laptops?
crudbug 1 day ago 0 replies      
The other thing lacking is Linux multi-monitor support; I see gnome crashing all the time.
VeejayRampay 1 day ago 0 replies      
Man, community managers for all the biggest PC manufacturers must be rubbing their hands on posts like this.
beatpanda 2 days ago 0 replies      
I have the first-generation X1 Carbon and a Dell Precision M3800, both running Ubuntu, and I am very satisfied with both.
wprapido 1 day ago 0 replies      
i'm using a lenovo W540 and am genuinely happy. 32GB RAM, SSD + HDD, old-style keyboard. the only downside is screen resolution of 1920x1080. that's perhaps the only reason why i'm going to replace it
dkarapetyan 2 days ago 1 reply      
Dell sputnik. It's great.
imsofuture 2 days ago 0 replies      
Lenovo doesn't have a great website, but their ThinkPads are nice.
sbuk 1 day ago 1 reply      
The declining software quality is nothing more than a meme. None of you seem to remember the horror that was 10.0. Or 10.1. Or 10.2. Or...
justinhj 1 day ago 0 replies      
I have a Lenovo Yoga Pro and it's great. Nice build quality, silent, touch screen. My kid found it plays the latest games at medium detail.
vacri 1 day ago 0 replies      
Meh, the author wants a better computer, but isn't willing to spend for it. The author runs linux, so the crapware from lenovo shouldn't matter, and the lenovo thinkpad line have excellent build quality (stay away from the ideapads). But the author complains that the cheap end of X1 town isn't retina. Sure, for the same price, a macbook air does retina, but that low-end X1 carbon has other features that the air does not have.

So by 'better PC', the author really means 'cheap PC filled with top-end gizmos and a top-end build quality'.

ryan-allen 1 day ago 0 replies      
TLDR: he didn't do much research.

It is true that there is a lot of crap out there for PCs but there are some decent laptops, it's just harder to find them.

Dell XPS' are good. Surface Book is totally equivalent to a Macbook Pro in terms of build quality.

ishbits 2 days ago 1 reply      
I'd like Mac OS X but on a Lenovo T or X. That would just be perfect (assuming it ran as well as it does on a MacBook).
izzydata 1 day ago 0 replies      
Easy, build a desktop.
pmarreck 2 days ago 0 replies      
Well, now you know why I use Macs.

They are closed (unfortunately) but they are not full of bullshit, at least.

MindTooth 1 day ago 0 replies      
Amen to that!
ebbv 1 day ago 0 replies      
This idea that laptops should be tinkerable is just so weird to me. No laptop is really tinkerable. The best you can do is replace the RAM and HDD. Motherboards, battery modules, etc. are all proprietary in these things, and usually anything beyond those is non-trivial to replace.

Call me crazy but I'd rather just have a machine that's specced right out of the box, and with a battery module that I never need to replace.

My 2012 15" Retina MacBook is still going strong over 3.5 years later. Not only that but the machine has been a pleasure to use every day.

It's true there should be alternatives to Apple products out there that are similar build quality, but I have yet to find them. Some of my coworkers are hard line anti-Apple and their laptops are all really poorly made, IMHO.

dba7dba 1 day ago 0 replies      
We need Elon to jump into PC business.
tempodox 1 day ago 0 replies      
+1. This article should be at the top of HN.
joesmo 1 day ago 0 replies      
I'd say upgradability should be added to that list to make a better computer, otherwise it'd at best be equal to one from Apple.
r-w 2 days ago 2 replies      
That was a pretty quick dismissal of Lenovo, and a warped portrayal at that. As far as we can tell, Superfish was not intentionally installed by Lenovo, and any money it made didn't go to them since it wasn't their software. There's nothing here to suggest that this is a sign of anything untrustworthy happening at Lenovo, any more than there is at any other major laptop seller. It seems like this was just a case of their software QA being not quite on par with their hardware QA. Also, all websites suck. Just sayin.

tl;dr: Picky, picky! *wags finger*

Too many people have peed in the pool stephenfry.com
693 points by ctz  3 days ago   510 comments top 60
bricemo 3 days ago 4 replies      
There have been numerous studies showing that outrage is one of the most viral emotions. For whatever reason, seeing something that makes you upset has a very high correlation with people sharing/liking/retweeting. I think this unfortunate fact is what is being borne out in social media.

Smithsonian had a good short article on it: http://www.smithsonianmag.com/science-nature/what-emotion-go...

Research shows that amazing good news, like a huge leap forward in cancer research, is even more viral. But unfortunately those happen much more rarely than someone saying something stupid and getting lambasted.

Ryan Holiday's book "Trust me I'm lying" also really opened my eyes to how a lot of this stuff works.

jonstokes 3 days ago 13 replies      
I have a theory about FB and Twitter -- or maybe more of an observation. Anyway, back in the stone age, people had bumper stickers with little slogans on them. And in the break rooms of various places of work, there was always a bulletin board, and on a corner of it there was always some faded-from-too-many-xeroxes bit of humor/racism/sports fandom/sexism/dirty joke or other us-vs.-them thing that people would look at and amuse themselves with.

Now thanks to Twitter and FB, these little types of tribal territorial markings are just about all that's left of public discourse.

I see things get passed around and think, 20 years ago this would have been mimeographed and hung on a break room bulletin board, where it might have acted as a crude conversation starter about some aspect of "who are we?". But now people consider sharing and liking these things as their /contribution/ to a conversation.

It's like the way that millennials communicate with emoji, except it's image macros and memes and slogans and "gotchas" and hot takes that we consume, copy, and share. The copying and sharing of all that stuff is what passes for discourse now.

Me, I think that this development is double-plus bad.

malsun 3 days ago 5 replies      
I left Twitter last year when it looked like the 'professionally offended' were becoming too influential. They didn't affect me as I keep a low profile, but it was like seeing an omen and noticing friends were buying into their hype.

The way Twitter is now pandering to this lot with the recently announced Orwellian-style group, and other odd behaviour suggests that they approve of the current momentum.

I'm just glad to no longer have its weight on my shoulders. Life is great without social media, but it must be difficult for public figures who stick with it for the free promotion.

carsongross 3 days ago 8 replies      
I'm a bit monomaniacal about this, but the current online culture wars (and the previous offline culture wars) make me wish that secession didn't have such a bad name. It is obvious that large groups of people have irreconcilable differences of opinion on matters of free speech, sexuality, gun ownership and so on.

It is a shame that rather than a firm handshake and friendly wave goodbye, we will end up with a totalitarianism of one ideology or another, violence, or both.

michaelwww 3 days ago 1 reply      
I've been accused of being sexist exactly once in my life, by an anonymous male friend of a woman on Twitter, because I dared to answer a question she asked. I didn't realize she was asking in jest. Her friend said I should have known that she was a very knowledgeable person (I didn't) and her question was a joke, and because I answered it I was implying that she didn't know the answer and therefore was mansplaining and generally being a sexist dick. This person wasn't satisfied and pursued me for several tweets until I apologized. I did apologize, but I've noticed that it kind of soured me on Twitter. I'm proud of my record of working with women in tech and raising an independent daughter. I use my real name on Twitter and I don't need some anonymous busybody attacking me in public for dubious reasons. Perhaps if Twitter had a real name policy it would level the playing field. I'm sure a famous person like Stephen Fry is under constant attack from anonymous trolls.
chatmasta 3 days ago 1 reply      
My Facebook newsfeed feels like a cocktail party where the host has surreptitiously arranged for the silent majority to observe a performance wherein the most "unsophisticated" guests squabble with each other over politics. The opinions on the feed, regularly produced by the same small group of people, are rife with dramatization and lacking any substantive argument. Yet hardly any of them come from friends who I know closely on an individual basis.

Facebook seems like a platform that amplifies the voice of a minority on the outskirts of the social graph, distributing its opinions to the silent majority. This seems good in theory, and would hopefully result in some meaningful improvements to civil discourse. However, it remains unclear whether expectation meets reality in that regard.

jccc 3 days ago 1 reply      
> 'to turn as swimmers into cleanness leaping.

I had to look that one up:

Peace, by Rupert Brooke, 1914:

"A paradoxical image, comparing going to war as an act that cleanses the participants, like a dip in a pool or river."


I'm assuming that connection was intentional, because Stephen Fry is smarter than me.

haberman 3 days ago 1 reply      
Sanctimonious indignation is toxic. But to be fair, people post really offensive stuff too, and often it's well-meaning people who don't even seem to realize what they are doing.

The best example I've seen lately is Richard Dawkins, who posted a video I won't link to where an Islamist and a Feminist are singing a song together. The video makes a real point that isn't intrinsically offensive. But the highly caricatured portrayal of these two people is, to me at least, a pretty inflammatory gesture. Also it came out that the feminist caricature is based on a real person who has received death threats for her work.

Humans aren't programmed to live together with people who are really different from them. We evolved to preserve our in groups and unify against out groups. We have huge blind spots about how other people will perceive things. We can absolutely learn, but it takes effort and we'll make mistakes as we go. Twitter is all of this happening in real time.

lordnacho 3 days ago 6 replies      
It's a wonder he lasted as long as he did.

I've joined a number of online forums through the years, and I've always ended up leaving them due to a spiral of "tone gets rough / the good people leave".

I'm still in this one, but I guess we'll see how long it lasts here. Seems different somehow, but they all do in the beginning.

What are some innovations in solving the asshole/quality spiral? There's lots of takes on moderation, but what are some interesting ones?

djaychela 3 days ago 1 reply      
I'm not surprised - I wouldn't be famous for all the money available (not an option anyway, particularly given my lack of talent!) - so many people now think they have a right to you, if they're a fan, or they've bought your work, or they don't like your work, or whatever. Twitter seems to bring those sorts of things much closer (as, of course, does much social media), and I'd think anyone with any opinions on just about anything will end up getting drawn into arguments over just about everything with people they've never met, whose opinions they probably wouldn't care about in any other sphere.

I seriously think there will be a huge backlash against all of this sort of thing in the next few years - people have binged on it ad nauseam, and given the way my two teenage step-kids have reacted (removing themselves almost entirely from most social media, and only using it to talk to people they actually know and like IRL), I'd think there's more to come, in terms of people just leaving such an arena behind.

JamesBaxter 3 days ago 0 replies      
I've been thinking a lot about online communities since I listened to "So you've been publicly shamed" by Jon Ronson [0].

I quite like twitter just now but I don't follow many people and nobody follows me. I often see tweets from indie game devs discussing yet another Twitter storm in a teacup and I'm frustrated by how much stupid stuff escalates.


dustingetz 3 days ago 1 reply      
Twitter is powered by an evil feedback loop where making people upset makes twitter more money. Their UX is designed to generate easily-amplified content, and amplify it, and this is the unanticipated consequence. I doubt they can fix the feedback loop without torpedoing engagement.
k-mcgrady 3 days ago 0 replies      
Twitter needs to be very careful. I use Facebook to follow my friends. I use Twitter to follow influencers/celebrities. If they start leaving and Twitter is just left with the musings of the average joe they're going to lose and lot of users.
bshimmin 3 days ago 2 replies      
Probably worth noting as a bit of context to this that Fry was presenting the BAFTA awards last night, said something offensive about someone who was a friend of his (but the public wouldn't necessarily have known that), and people on social media reacted badly.
danieka 3 days ago 2 replies      
I think this is a natural consequence of the twitter format. The short message motivates (some) users to win cheap points, offend or simply scream loudly in order to get attention. In my Luddite mind the political conversation is becoming Twitter-fied and now focuses more on short sound bites rather than a sound discourse.
kyledrake 3 days ago 0 replies      
My new years resolution was to stop using Twitter actively, and I've basically done that (spare a few offenses). I was intending to pin up a link to a page explaining my reasons, but haven't gotten to it yet.

Public venues like Twitter have devolved into pitchfork-wielding mob battles over extremely stupid, petty nonsense (on all sides of the political spectrum), and I expect this to just continue to happen over and over again as the influentials of the "Twitterati" get better at pushing manipulable people into whatever half-baked agenda they want to use them for.

The "influentials" are the people most rewarded by this system of relentless pseudo-controversy: petty, shrill narcissists that contribute little (or negatively) to anything of genuine value, merit or productivity.

Through a few personal experiences, I've been made a lot more cognizant of the fact that I won't be able to avoid the mobs by staying out of them. I've decided it's best to just not participate at all, and to work to break the systems that enable the mobs in the first place by coming up with better systems.

We need to go back to having sane, civil, intelligent discourses about real issues online, and figuring out the best way to facilitate them. Twitter is never going to work for that, no matter how many lunatics-running-the-asylum tribunals they come up with.

michh 3 days ago 1 reply      
I think this is a tricky problem to properly solve. Or rather: An opportunity for Twitter to really mess up attempting to fix this.

A big part of what makes Twitter interesting/worthwhile is the ability to interact with lots of different people you wouldn't have stumbled upon.

But also, especially when you're just starting out on Twitter, people you otherwise wouldn't have been able to get through to.

On Twitter I get to have a conversation with someone famous (in the industry, but even bona fide celebrities like the author). People who I could never send an email or a letter to, or call, and actually get a response. On Twitter, I have a really good chance that someone like Stephen Fry will actually read my tweet and maybe even respond. I'm guessing the odds I'd be able to get him on the phone or have him accept my Facebook friend request are rather low.

If Twitter builds a filter bubble around all famous/verified users, it loses a big chunk of its appeal to the nobodies. Which includes both trolls and decent people. And I think Twitter can't do without the civilized nobodies.

VonGuard 3 days ago 0 replies      
Sad to say, but humans have been bred over millennia to be clannish, violent, and prone to witch hunts. Quite literally, all of European history is about taking the peasants off to murder other people, and bringing back only the strongest ones who survived the battle to spread their seed to the next generation. The ones in the aristocracy interbred to the point of serious genetic flaws [ https://en.wikipedia.org/wiki/Royal_intermarriage ]. So, choose your side: the ones born from the strongest, heartiest fighters, or the ones produced by many generations of selective in-breeding... At least, this is the story for us white folks....

Humans are just irrational, flawed, and prone to freak out. And I, for one, welcome our new robotic overlords...

rblstr 3 days ago 2 replies      
I watched the BAFTAs with my girlfriend last night and I have to say, when Stephen made that comment we both said 'woah that's a bit mean' out loud. Of course, being the general public we're not aware of Mr Fry's relationship with the lady who won the award but from an outside perspective it came across as a bit bullying.

I only found out today about the twitter backlash and I don't think it should be lumped in with so-called SJW movements, or the anti-anti-offensive movement. It really sounded like a hurtful, unnecessary comment that he shouldn't have made without context or clarifying their relationship. It really seemed uncharacteristically mean for Mr Fry.

r721 3 days ago 2 replies      
I wonder: is it so hard to build an automatic "insult" detector, given all the recent progress in deep learning? Quick googling gave a couple of older papers:

"Offensive Language Detection Using Multi-level Classification"


"Automatic identification of personal insults on social news sites"


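For anyone curious about what the baseline (pre-deep-learning) version of those papers looks like, here is a minimal sketch: a bag-of-words Naive Bayes classifier over two invented labels. The training sentences, class names, and detector API are all made up for illustration; a real system would need a large labeled corpus and much richer features.

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase and keep only word-like tokens.
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayesInsultDetector:
    """Toy multinomial Naive Bayes over bag-of-words features.

    Illustrates the basic pipeline only: tokenize -> count ->
    score each class with Laplace-smoothed log-probabilities.
    """

    def __init__(self):
        self.word_counts = {"insult": Counter(), "clean": Counter()}
        self.doc_counts = {"insult": 0, "clean": 0}

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def _log_score(self, tokens, label):
        total = sum(self.word_counts[label].values())
        vocab = len(set(self.word_counts["insult"]) | set(self.word_counts["clean"]))
        # Log prior plus smoothed log likelihood of each token.
        score = math.log(self.doc_counts[label] / sum(self.doc_counts.values()))
        for tok in tokens:
            score += math.log((self.word_counts[label][tok] + 1) / (total + vocab))
        return score

    def classify(self, text):
        tokens = tokenize(text)
        return max(("insult", "clean"), key=lambda lbl: self._log_score(tokens, lbl))

detector = NaiveBayesInsultDetector()
detector.train("you are an idiot and a fool", "insult")
detector.train("what a stupid worthless idiot", "insult")
detector.train("thanks, that was a helpful answer", "clean")
detector.train("great post, I learned a lot", "clean")

print(detector.classify("you stupid fool"))    # prints "insult"
print(detector.classify("helpful and great"))  # prints "clean"
```

With four training sentences this is obviously a toy, but it shows why the problem is hard: the model only sees surface vocabulary, so sarcasm, context, and in-group banter (exactly what tripped people up in the Fry case) are invisible to it.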
Nickoladze 3 days ago 0 replies      
> to be offended on behalf of others they do not even know

Never thought about it like this before. Sounds really strange when you put it this way.

DenisM 3 days ago 2 replies      
And so, the Eternal September strikes again [1].

There's a ton of gold in figuring out how some communities succumb to it, and others don't.

[1] https://en.wikipedia.org/wiki/Eternal_September

746F7475 3 days ago 1 reply      
What's the big deal? If you don't like how a platform works, just leave it. I haven't used either Facebook or Twitter ever (I do have accounts, but I think you can count my posts and tweets combined on the fingers of one hand) and it hasn't affected my life at all.
gadders 3 days ago 0 replies      
It's ironic that social justice warriors have driven one of the most progressive celebrities off Twitter.
markatkinson 3 days ago 0 replies      
Up you go!

Best part of not having Twitter and the book of faces is the mysteriousness that you project when attempting to court the opposite sex - either that, or just a douchey superiority complex. Either way I want nothing to do with Facebook (Twitter is useful for finding out the status of services like Firebase etc). As far as I am concerned, in life, distraction is the enemy, and if distraction is the enemy, well then that makes Facebook a windmill to my Don Quixote.

nickysielicki 3 days ago 0 replies      
Prediction: A neo-luddite renaissance is on the horizon. The anti-social media sentiment that has only been seen among techhies is going to spread into otherwise-mainstream youth conservatives.

The only thing in the way is that current services have not censored, but they're beginning to, and decentralized alternatives haven't existed.

A maturation of bitcoin over the next year, especially with the fork, will result in bringing novel usages of the blockchain to the average consumer.

andrewbinstock 2 days ago 0 replies      
While I agree that much of social media is as he describes, I don't have this problem on Twitter. I follow just over 100 people whose thoughts interest or entertain me. I shut off the screamers, the haters, the high-emotion types. With the result that my experience of Twitter is pleasant. Is there some reason OP couldn't do the same?
bendykstra 3 days ago 0 replies      
It was through the angry backlash to a tweet that I discovered Stephen Fry. About eight years ago, Stephen couldn't get a Windows laptop to connect to wifi. His Twitter outburst and the reaction of Windows fans became grist for the tech blogs, which is how I noticed it. I wonder if Stephen is romanticising the early years of Twitter or if those angry reactions have just become more common. (Either way, I don't blame him for leaving.)
gthtjtkt 3 days ago 1 reply      
This is true of almost any internet forum today. Every comment you make is meticulously deconstructed in an effort to find some kind of "gotcha". Well, either that or it's completely ignored.

And god forbid you have an opinion on any subject in which you don't hold a doctorate degree. You can't even tell people "it's alright to eat eggs" without the conversation devolving into a back-and-forth war of citations.

6d6b73 3 days ago 7 replies      
Or maybe you just have to go to a pool in a better neighborhood? After all on Twitter you can choose from whom you're getting status updates, can't you?
dwrs 3 days ago 0 replies      
It says right here in the HN approach to comments that "HN is an experiment. As a rule, a community site that becomes popular will decline in quality. Our hypothesis is that this is not inevitable -- that by making a conscious effort to resist decline, we can keep it from happening." The question, then, is where do we go now, and to what degree can we avoid Twitter and other sites beyond their ostensible utility.
maolt 3 days ago 0 replies      
The way I see it:

1: Really low barriers to entry for each snowflake to express their views.

2: Social networks work as echo chambers, which allow anyone to find dozens that share their views ("If they think like me, it must mean that I am right!!").

3: Misunderstanding of the way cyberspace works, with people not really understanding that it is encroaching on the "physical" world, and professing extreme opinions they would never have dreamed of voicing if not in front of a screen, in the safety and "privacy" (hum) of their home.

4: The powers that be, that is to say the social network operators, have so much at stake that they are walking on eggshells, wary of taking any action that could jeopardise their cash cow (actual or potential).

The situation is a mix of all those factors, and the result is I think a bit depressing.

Can you imagine our societies without a police force? Today's social networks are not far from that imho.

mcv 2 days ago 0 replies      
I'm not active on Twitter, but from what I've seen, particularly in the aftermath of GamerGate, it's a putrid mess of hatred, threats, and bigotry. The problem doesn't so much seem to be that people are defending others, but that they are attacking them.

I never really saw the point, but in the past few years it has seemed even more pointless. You just can't have a good, nuanced conversation in 140 characters. You can only shout one-liners and threats. It's what the medium seems designed for.

mclovinit 3 days ago 0 replies      
I bounced back and forth, on and off FB, until I bit the bullet and deleted each one of my "friends" one by one. If I hadn't done that, I would have been plunged into an infinite loop of deactivation/reactivation hell.

It was interesting to see the reaction of people when I said I had deleted them in order to make it less enticing for me to use my account. They were actually offended for being deleted.

Could it be that the mere perpetual overuse (and/or misuse) of social media breeds hypersensitivity? Hmm... I say yes. Nothing wrong with the occasional indulgence, but there's just more to life. But that's just my worthless 2 cents.

jccalhoun 3 days ago 0 replies      
I think one of the problems about social media is that the experience for the common person is often very different than that of (even moderately) famous people. I hear tech journalists talking about their experiences on twitter and they are so different than mine and I have to think it is because they have thousands of followers and I have around 100. If they do something they get tons of feedback but if I tweet the same thing I would hardly get any. Certainly there are times when things erupt and someone becomes "famous" and is bombarded but for most of us that doesn't happen.
pervycreeper 3 days ago 0 replies      
In a medium such as Twitter, there is not enough incentive for users to post quality content. Morons can post about whatever irrelevant nonsense they want and gain approval from other morons. From inside a filter bubble, PC-policing a celebrity can seem incredibly courageous and clever, when in fact it is the opposite. The potential harms of this are real. Richard Dawkins suffered a stroke last week following such an incident, where a random interloper peppered him with a barrage of criticism, taking her gang of followers with her.
musesum 3 days ago 0 replies      
How about an "untweet". Basically a thumbs down for tweets that smell. Then use collaborative UN-filtering.

Or how about using Word2Vec to usher turds to a remote part of the tweet space. Use sliders to adjust your results. For example: "I need a twitter hug. What would happen if I turn down the 'Outrage' and turn up the 'Kittens' -- awwww, how cute! Ok, now what would happen if I turn up the 'Death Metal' -- kittens and death metal -- whoooah!"

sixQuarks 3 days ago 0 replies      
Everything is evolving all the time, and future iterations are based on, a reaction to, or rejection of what came before.

There are going to be cycles of products, some you love, some you hate, but what won't change is that there are still like-minded people out there, no matter what your tastes or opinions. We will learn from this generation of social media, and future products will solve some of these problems to a large degree, only to evolve into something we detest again - but each time it should get a little better.

bambax 3 days ago 0 replies      
The irony here is that the person acting most offended is Stephen Fry himself, who decides to quit in high fashion because, he alleges, some people pissed in the pool.

Twitter is not his pool; it's not even a pool: more like the sea.

Many people -- and all kinds of living things -- piss in the sea and do many other things in it; if you're unhappy with that, well yes it's better if you stay on the shore, but don't go around pretending you're so pure.

You're just afraid of the sea.

zarkov99 3 days ago 0 replies      
Assholes (the self-righteous, plain evil, or simply stupid kind) are a minority, but they have a tremendous impact in any community which does not punish or prevent bad behavior. This is why we have to have door locks, police, jails, passwords, etc. Any sufficiently large online community will end up including a bunch of douchebags who, unconstrained by these old-school deterrents, will eventually destroy it.
wcummings 3 days ago 1 reply      
I love Stephen Fry, but I'm conflicted. On one hand, he's upset that people are offended and without a hint of irony is obviously very offended by their sensibilities. That said, celebrities (intentional or otherwise) have a very different experience on social media than the lay person.
danharaj 3 days ago 2 replies      
The presence of justice and the absence of tension are not the same thing. I think a lot of people have lived a large part of their lives completely insulated from the tension that has existed in society over social justice issues for centuries. Now that you can go on Twitter and find out exactly how and why everyone is mad, it looks like comity and mutual respect have evaporated. But really what has happened is that the absence of tension engendered by the undemocratic dissemination of information in the pre-Internet age has been revealed to be a lie: people have been angry for a while and they're angry for reasons. Shocking.

Asking someone to shut up is much easier than assuming they're acting in good faith and have reasonable motivations that you can't perceive, because, you know, it's the Internet and nothing is more ridiculous than a person looking at the Internet and nodding condescendingly while explaining the exact nature of all these human beings they disagree with.

If from the get-go you assume people are acting in bad faith and you demand that they admit that this is the case before you even consider them as reasonable people, you are asking for full surrender instead of treating them as equals.

vectorpush 3 days ago 0 replies      
This is not a twitter problem. If you're a famous celebrity with a gigantic megaphone you have to accept the fact that the masses are going to react to what you put out there. I think it's funny when celebrities get all wrinkled over internet outrage but take for granted the fact that they are leveraging the masses (via twitter and other platforms) to boost their own careers. Woops, you said something that some people on the internet disagree with, suddenly you have a deluge of hate mail instead of doting adoration. Well, what did you expect? The masses are not autonomous eye-balls, they are people with disparate ideas and opinions and if you say things that piss some people off, this is the reaction you get.

So, in the face of internet controversy, what does a celebrity do? Retreat to a convenient forum where they explicitly control the discourse so they can publicly broadcast their disdain for the fickle judgment of the masses without having to actually listen to them. I suppose the so-called SJWs aren't the only ones who need a "safe space" to speak their minds without fear of criticism.

f_allwein 3 days ago 0 replies      
Wonder if Stephen Fry will blog more often now - I noticed this is his first blog post since May 2015.
pklausler 3 days ago 0 replies      
For me, what's important is not avoiding disagreement, conflict, or rage -- what I want to somehow exclude from my social media existence is just good old craziness. The Internet has been enormously empowering to the stark barking mad.
arjn 3 days ago 0 replies      
I agree with Stephen. A while ago I came to the conclusion that widespread use of social media such as FB, tumblr and Twitter (especially twitter), has caused a severe degradation in public discourse and conversation.
stretchwithme 3 days ago 0 replies      
There's tons of crap in real reservoirs. But we filter and chlorinate.
guelo 3 days ago 0 replies      
In certain forums these types of posts are called Goodbye, Cruel World posts and they are summarily ridiculed. HN takes itself too seriously to give this post the proper treatment.
typon 3 days ago 0 replies      
People are blowing this way out of proportion. Stephen Fry made a shitty joke about his friend, people objected, and he got angry at them for not finding it as funny as he did.

He has the right to his opinion as do others. This outrage culture is only slightly more annoying than the people who constantly bemoan it.

nullc 2 days ago 0 replies      
I always thought it was called _twit_ter for a reason.
melted 3 days ago 0 replies      
Should have gone full @Nero, and started tearing them new ones. They can't win against a well spoken gay person who chooses to not give a damn.
Zikes 3 days ago 0 replies      
Now that Twitter has created a committee by which those people are given actual power on the platform, I'd say the pool is now mostly urine with just a bit of chlorine.
woah 3 days ago 0 replies      
Might have been spending too much time on the internet.
marban 3 days ago 0 replies      
I miss Pownce.
mattbgates 3 days ago 0 replies      
Get the out and take a piss outside of the pool, assholes.
moron4hire 3 days ago 5 replies      
The problem isn't that this is a Twitter problem. It's that this is a cultural problem. People kinda suck. People kinda suck a lot.

I was having a conversation with a friend at lunch, when I started putting ketchup on my hotdog. He looked at me and incredulously questioned, "really? What are you, like, 8 years old?" I had never even heard of the idea that "only children put ketchup on a hotdog." Yet here was someone I had originally thought of as a rational human being elevating it to a serious issue.

A lot of this stuff gets defended as "just a joke", but I don't see anyone laughing. If you're so bad at making jokes, maybe you should just give up the practice and stick to being generally pleasant to be around.

I guess I'm lucky that I'm not famous enough to get any randos following me on Twitter. The people who follow me are interested in my work, they aren't hero-worshipping me. I get to have excellent conversations with people on Twitter. But that is only because--even though it's not 100% in my control--my lack of celebrity allows me to guide and design my Twitter sphere. For a celebrity, they just get inundated with folks they don't want to hear from.

Just read the replies to Notch's tweets sometime. I'm surprised he hasn't become a complete recluse. Almost every single one has some 13 year old complaining about "Minecraft sucks now that you're gone", even though he's been gone longer than they've been playing it.

dang 3 days ago 3 replies      
We banned this account for trolling. One bad thing about HN's open policy about accounts is that people can, and do, create account after account that they get banned with. Surprisingly much of the worst behavior on Hacker News is the work of a small set of these.

But the other side is that users who are inspired to create new accounts to comment often make some of the best contributions to a thread. That's true of this thread too. The benefit of being open exceeds the cost.

We detached this subthread from https://news.ycombinator.com/item?id=11104337 and marked it off-topic.

riebschlager 3 days ago 0 replies      
Came to HN and saw this comment with a reaction gif and I was all like [1]

[1] https://en.wikipedia.org/wiki/The_pot_calling_the_kettle_bla...

MCRed 3 days ago 0 replies      
"To leave that metaphor, let us grieve at what twitter has become. A stalking ground for the sanctimoniously self-righteous who love to second-guess, to leap to conclusions and be offended -- worse, to be offended on behalf of others they do not even know. It's as nasty and unwholesome a characteristic as can be imagined. It doesn't matter whether they think they're defending women, men, transgender people, Muslims, humanists -- the ghastliness is absolutely the same. It makes sensible people want to take an absolutely opposite point of view. I've heard people shriek their secularism in such a way as to make me want instantly to become an evangelical Christian."

This is a fair description of how Stephen Fry (as much as I love him) acted towards me on Twitter several years ago. (about the time I left twitter for good)

I don't think he's a bad guy. I think there is just an authoritarian streak running through our culture right now that is so strong that a guy who, at the time, had just played a persecuted gay man in V for Vendetta was advocating for the persecution of people he didn't like! (Though I'm sure he didn't realize it/recognize it/intend it.)

I think Stephen is a well intentioned guy who doesn't intend to piss in the pool, but I think propaganda has scared and manipulated him, as it has done so many others over history.

Go 1.6 is Released golang.org
549 points by geetarista  1 day ago   331 comments top 29
sriram_malhar 11 minutes ago 0 replies      
Have been using Go since its release, and like the deployment experience, the feeling of solidity of putting together a tight system. The toolchain is great. 1.6 is yet another Solid release in that direction. Thank you all.

However, the _language_ doesn't give me much programming pleasure, alas. Since there is plenty of time until Christmas, here's my syntax wish list :)

'?:': C's ternary if-then-else operator.

Block-syntax for closures ala Ruby. Unifying blocks and closures makes creating DSLs easy, but doesn't add to cognitive load (no more than using anon funcs)

Pattern matching like Scala, ML, Rust.

Sum types -- (Yeah, I lied. Not just syntax enhancements), or at least discriminated unions. I'd like to see an example (in the FAQ entry on the topic) of why support for it is troublesome.

For Christmas 2017:

Macros ala Nim.

Systemic support for Goroutines, including detection of conditions where a goroutine would never get scheduled. Erlang-like tools for built-in goroutine insight.


My ideal language would be an intersection of Nim+Go

sinatra 1 day ago 10 replies      
Go checks a lot of boxes for my ideal language for developing web services: static typing, C-derived syntax, garbage collection, a single binary, very good concurrency support, opinionated, small/simple, and a community that prefers to just use the standard lib for most work, etc. Yes, generics is an issue and so is debugging. But, overall, I can't think of many other options that check so many boxes.

EDIT: I must highlight the point about checking lot of boxes. In many discussions about features of programming languages, we get responses like, "language Y does that too. Why not choose that language?" Well, because we don't pick languages for one specific feature. We pick them for the combination of features.

rphlx 8 minutes ago 0 replies      
> Source trees that contain a directory named vendor that is not used in accordance with the new feature will require changes to avoid broken builds

That seems a little bit distasteful.

jonesb6 1 day ago 8 replies      
The reason I love Go is that every time I pull it out, I write a small amount of it and it runs beautifully. For example my company has a critical micro-service implemented in ~300 lines of Go, it's been running for six months now without a single hiccup, highly performant, very sexy.

The reason I will almost never use Go for web apps is because interaction with databases is limited (almost entirely) to raw queries. Maybe I'm spoiled by the likes of Active Record, Sequelize, Mini-mongo, Sql-alchemy, etc, but it's a huge drop in efficiency to spin my own SQL.

The point to take away here is that Go, more so than many other languages IMO, has its strengths and weaknesses. If you use Go in one of its weaker use-cases you're gonna have a bad time. If you use Go for one of its strengths you're gonna have a great time.

See you guys and gals in n weeks when we need to rehash the pros and cons of Golang again.

nathany 1 day ago 1 reply      
There is a Go AMA on reddit for the next 24 hours.


eddiezane 1 day ago 6 replies      
I've really enjoyed the time I've spent with Go but feel like the state of dependency management has kept me away.

Am I being stubborn in my longing for an npm, Ruby Gems, or pip? Is there a reason why one of these hasn't emerged/been adopted by the community? (I'm aware of the 1.5 experiment with vendoring.)

Semver and pinning versions has always just made sense to me. I can easily adopt new features and fixes automatically without worrying about things breaking.

How does the community feel this far along?
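For context, the Go 1.6 answer to this is the `vendor/` directory: packages found there are resolved before `$GOPATH`, so dependencies can be committed alongside the project. A hypothetical layout (no version pinning or semver is provided by the toolchain itself; paths are illustrative):

```
myproject/
    main.go                  // contains: import "github.com/example/dep"
    vendor/
        github.com/
            example/
                dep/
                    dep.go   // this copy is used instead of $GOPATH's
```

Lockfile-style pinning still has to come from third-party tools layered on top of this convention.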

Cyph0n 1 day ago 3 replies      
I can't wait to see what's new in 1.6! I really had a pleasure working with Go for my senior project last year. If I need to write either a server (HTTP or TCP/UDP), or a client application that must be easy to build and distribute, Go is my first choice.

What Go is lacking at this moment in my opinion is:

1) A comprehensive and mature web framework. Play w/ Scala is my go-to choice now, with Django a very close second.

2) A decent cross-platform GUI toolkit; heck, I'd settle with Qt and/or .NET bindings for Go. The power of Go is statically linked binaries, and I think the area of desktop applications will be easy to target if a good solution emerges.

dominotw 1 day ago 4 replies      
I've been writing some Go code recently, and a huge chunk of it is

if err != nil ...

I know you can do `if ...; err != nil`, but that's not much better, and you end up with deeply nested if blocks.

I have to mentally block out `err != nil` to read any Go code linearly. How is this acceptable? I don't get it.


> We recently scanned all the open source projects we could find and discovered that this snippet occurs only once per page or two

This seems false from my experience, def way more than 1 or 2 instances per page.

protomyth 1 day ago 4 replies      
Can someone give a decent explanation of the following:

1) Suppose I have a library that was written in C that receives a security update, and it is used in a Go program. Under what conditions do I need to get a recompiled version of the Go program?

2) Suppose I have a library that was written in Go that receives a security update, and it is used in a Go program. Under what conditions do I need to get a recompiled version of the Go program?

3) Is there a way to tell from the binary that the program was written in Go?

Trying to figure this out for my Sys Admin dealing with Vendors role.

jernfrost 13 hours ago 0 replies      
Read the debate with Go vs Java here with interest. I'd like to add a point I think is missed by the Java crowd in favor of Go.

Complexity isn't free. Java might have an abundance of tools, IDEs, language features, etc., but you can't claim that matching up every Go feature or tool with something superior found in the huge Java universe makes Java superior in every way.

I find that there is an unfair assumption being used by the Java advocates here, which is that every software developer has a deep knowledge of Java.

As one of those people who can certainly write Java code, but who is not familiar with the Java ecosystem and has not spent a lot of time with it, I must say that Go to me is a clear winner.

My exposure to professional Java development has been quite frustrating compared to writing Go code. Every Java project I have gotten has used a different build tool: Ant, Maven, or Gradle. They have also all seemed to use different IDEs. The complexity of each of these tools is staggering. Considerable time has to be spent learning them.

Go in comparison is laughably simple. You can get productive in less than a week without ever having used the damn thing. The tools and the libraries are very quick to get into. In fact, I find Go code so easy to read that although I am an iOS developer by trade, I frequently read Go code to understand how various algorithms and network stuff work.

An organization would easily be able to add people to a Go project without much previous exposure to the language. Adding people with limited Java knowledge to a Java project however would be far more expensive. Considerable time would be needed for training.

There is a lot of money to be saved from having a well thought out standard library combined with a simple language with simple well thought out tools.

As a Swift/Objective-C developer, my major gripes with my development process is actually the complexity of the tooling. Both Swift and Objective-C are fairly straightforward languages IMHO. In this regard I greatly envy Go developers although I do enjoy the strong typing and generics in Swift.

kampsy 13 hours ago 0 replies      
I fell in love with Python because it was clean and easy to work with. Like most developers, I used to use C when I needed a performance boost. Then I got fed up and decided to learn a new language that could give me the feel of Python and the performance of C. Two languages from a list of 10 passed the above criteria: Go and Rust. Java did not even make the list because I don't use languages that are owned by evil empires (Oracle).

I went with Go because it was easy to use and understand. I could read other people's code easily (even with a large code base, I have never found myself scratching my head trying to figure out what my own code does), could set up my workspace in less than a minute, and all the text editors I used (Sublime, Atom, Vim) supported it. I don't really care about fancy IDEs. Just syntax highlighting and code completion is good enough for me.

I started learning Go in September 2015, and I have managed to implement the Porter stemmer algorithm and an inverted index in it. I miss generics but LOVE interfaces. The fact that any concrete type that implements an interface's methods automatically satisfies that interface is awesome. You can easily reuse code from different packages without changing anything.

alblue 1 day ago 0 replies      
Release notes are here:


Notably new this time are transparent HTTP/2 support and tighter rules for integration with C.

dh997 1 day ago 1 reply      
Go's CSP is minimal and orthogonal; I just wish it did three things:

0. could LTO-optimize or link against a shared library to reduce the titanic size of compiled programs and cut down on duplication of instructions. There is no practical sense in wasting memory and storage on systems with dynamic linkers; including the world covers rare edge cases, but YAGNI in real production systems.

1. could output flat binaries and self-host runtime (panics) for practical kernel development in Go

2. Generics (both types and immutable constraints), I think C++1z has the right approach to this (and constexpr and constant arrays are nice and are able to provide more hints to the compiler).

I also wonder why Go wasn't developed as an IR compiler / LLVM frontend, because it would've leveraged an existing debug and portability ecosystem with much less work.

pori 15 hours ago 1 reply      
Seen a lot of Erlang mentions in this thread. Is that the native alternative to Go?

Personally, I prefer to write code in a functional manner. While I've always thought Go looked like an amazing platform for programming in general, I haven't been keen on moving to another imperative language.

It seems the landscape of functional alternatives is mainly Scala and Clojure, which are both based on the JVM and require a bit of time to learn the tooling. I am not a Java or JVM expert, so I haven't been too inspired by this either.

zenlikethat 1 day ago 0 replies      
Congratulations to the Go team! There are many excellent folks working on the Go language and it's been an absolute joy to work with in my experience.
ukd1 1 day ago 0 replies      
https://golang.org/doc/go1.6 - lists the changes
Exuma 1 day ago 3 replies      
I upgraded and it broke our app, something to do with the way it handles https has changed, not sure what
bmh_ca 1 day ago 4 replies      
Go has a lot going for it.

That said, there were a few points I noted, based on a recent go I gave it (pardon the pun), at least in relation to my style of development for this project:

1. It's hard to tinker, mostly because it's fussy about what variables are defined or used. This is a strength in the usual course, but when one is trying to posit what a poorly documented 3rd party API is doing it can be a serious pain.

By tinkering, I found that I often had to comment out or uncomment lines, or handle or ignore errors. There was a lot of flipping up to the beginning of the file. I would spend so much time fiddling with the lines that I would at times forget what I was even trying to do.

I might just have memory problems, I acknowledge. :)

However, what would make sense is a go "mode" where it runs in a non-strict way, with what would ordinarily be errors being warnings. A "tinker" or "whirl" mode, so to speak, that softened the requirements so one could get a better sense of what was happening before committing to a design.

An interpreter mode might also be quite valuable, to address this problem and the ones below.

2. Error propagation - I see the point of errors being returned and the lack of a "throw/catch" style, and its benefit, but I feel it's a lot of typing for marginal gain. I usually end up with an error propagating a set of strings that ultimately conclude as: "Database error: transaction error: processing error: http error: reason", which is to say: equivalent to, but with less information than, a stack trace. I see the mandatory error acknowledgement simultaneously as a strength and a waste of time, and I admit being on the fence about it.

3. The next point I am not on the fence about: Debugging. It is not apparent how to get a stack trace, and the best option looks like including a third party application that generated errors. For the obvious and reasons below, this is a problem.

4. Package management: This was fussy and could be time-consuming. It is not apparent to me why one needs a GOROOT and a GOPATH. I think Python's virtualenv gets it right, by comparison. A second but related problem is package versions. Maybe I'm missing something, but making sure you get the latest semantically equivalent version (in the semver sense) was not apparent.

5. Package debugging: If you include a 3rd party package, and it's broken in any way, it's a veritable quagmire to identify and fix the problem. My experience was that the best way to debug a third-party package was to copy all its bits wholesale and then debug it as local source in your own project. Obviously this is bad for any number of reasons, and I might be missing something, but no more apparent option appeared when I investigated how to tell what is even happening inside third-party packages.

6. Automated testing: I've not seen a test runner that reloads when source files change, particularly one that might be used with goapp from AppEngine, meaning go auto-testing can be quite a bit of patient thumb-twiddling as the binary reloads.

Which is all to say that there are some concerns about developing a larger project in this language, particularly if there is quite a bit of complexity that needs lots of testing or potential debugging and/or inclusion of many third party packages.

I've not reviewed the 1.6 notes, so perhaps these are addressed to some extent there.

In any case, none of the issues above is insurmountable, and overall I give the Go design a lot of credit for experimentation and interesting choices, but the issues I've seen above give me pause before committing a team to the language for the moment.

niccaluim 1 day ago 2 replies      
Not officially. "Go 1.6 is soon (but not yet)." - commit message from today.
helper 1 day ago 0 replies      
Rebuilding our integration docker image right now. If all our tests pass I expect to have go 1.6 binaries in production by this evening.
jay_kyburz 1 day ago 1 reply      
Can anybody tell me if you can run Go in Chrome using the NaCL stuff? I remember there was talking of it a few years ago but I don't know if anything ever came of it.

A Google search shows that you could build for NaCl in Go 1.3, but only run it in special builds, not Chrome itself.

CSDude 1 day ago 0 replies      
I would just love some IDE debugging love and better packaging. The more packages I use and the more I split up my files, the longer compilation takes. Maybe I just don't know, but is there some process to compile some parts beforehand and link only the changed resulting binary?
robbles 1 day ago 3 replies      
There was some discussion leading up to the release about whether to merge the "SSA" branch, which seems to be a refactor that allows for easier compile time optimisations but also slows compile times for the time being.

Does anyone know if that was included in this release?

golergka 1 day ago 1 reply      
I just recently started with Go, but I love how simple (apart from the horrible $GOPATH) and effective it is.

Still can't get over the moment I realized that in order to deploy my web server on an empty virtual box, all I had to do was build and upload. After all the languages and frameworks that required endless customization and setting up, it was a true eureka moment.

kiril-me 16 hours ago 0 replies      
Do you know of any framework using HTTP/2 in Go?
enneff 1 day ago 1 reply      
Binaries are up but not everything is fully updated yet. Announcement blog post coming shortly.

Edit: Blog post up: https://blog.golang.org/go1.6 maybe change the article link to that?

andreamichi 1 day ago 0 replies      
A draft for the 1.6 release notes: https://tip.golang.org/doc/go1.6
obelisk_ 1 day ago 1 reply      
Release notes: https://golang.org/doc/go1.6

Mods, maybe change OP link to this?

fuddle 1 day ago 1 reply      
Go is great, but I wish they would add ternary operator support.
Glibc getaddrinfo stack-based buffer overflow googleonlinesecurity.blogspot.com
499 points by 0x0  2 days ago   449 comments top 45
jimrandomh 2 days ago 12 replies      
> "The glibc DNS client side resolver is vulnerable to a stack-based buffer overflow when the getaddrinfo() library function is used. Software using this function may be exploited with attacker-controlled domain names, attacker-controlled DNS servers, or through a man-in-the-middle attack."

> "The vectors to trigger this buffer overflow are very common and can include ssh, sudo, and curl. We are confident that the exploitation vectors are diverse and widespread; we have not attempted to enumerate these vectors further."

> "Remote code execution is possible, but not straightforward. It requires bypassing the security mitigations present on the system, such as ASLR."

It is time for the bimonthly Internet security meltdown. Again. When they say that "exploitation vectors are diverse and widespread", they really mean it. Patch ASAP. This is a race; it is only a matter of time before criminals start automatically and systematically scanning every server on the internet for this, and you really really want to be patched before that happens.

Thinking a bit more long term, it's pretty clear at this point that we need to expunge all C language networking code from the world, replacing it with Rust or pretty much anything else. That's not sufficient by itself, but it is necessary, or else the periodic Internet security meltdowns won't ever stop.

ptrincr 2 days ago 4 replies      
Redhat (RHEL5 unaffected) - https://access.redhat.com/security/cve/cve-2015-7547 - https://access.redhat.com/articles/2161461

RHEL6 - https://rhn.redhat.com/errata/RHSA-2016-0175.html - update to glibc-2.12-1.166.el6_7.7.x86_64.rpm

RHEL7 - https://rhn.redhat.com/errata/RHSA-2016-0176.html - update to glibc-2.17-106.el7_2.4.x86_64.rpm

Debian - https://security-tracker.debian.org/tracker/CVE-2015-7547
Use "aptitude show libc6" - needs to be 2.19-18+deb8u3 (jessie), 2.21-8 (sid)

Ubuntu - http://people.canonical.com/~ubuntu-security/cve/2015/CVE-20...

SUSE - https://www.suse.com/security/cve/CVE-2015-7547.html

Interesting to note this tip:

While it is only necessary to ensure that all processes are not using the old glibc anymore, it is recommended to reboot the machines after applying the security upgrade.

From - https://lists.debian.org/debian-security-announce/2016/msg00...

Therefore at the very least you will need to restart anything which depends on glibc. This should give you a list of packages:

 lsof | grep libc | awk '{print $1}' | sort | uniq

verytrivial 2 days ago 0 replies      
An extra gold star for the valuable comments added by the patch! https://sourceware.org/ml/libc-alpha/2016-02/msg00416.html
Munksgaard 2 days ago 10 replies      
Can we agree that it's urgently necessary to rewrite most of the core Linux/OSS stack in memory safe languages? Exploits like this come up all the time, and we know how to completely eliminate them. I don't care if it's Rust or D or Go or Haskell or OCaml or anything else, as long as it's not C. The sooner we do this, the better.
cnvogel 2 days ago 3 replies      
As far as I see the bug primarily lies within this function here... resolv/res_send.c https://github.com/bminor/glibc/blob/master/resolv/res_send....

Lines 952 ... 1389 [~450 lines of code], with more than a dozen variables holding random state. Think about the complexity you get with all the conditionals and loops, often copying and pasting similar conditions with (xx1 && xx2) variants.

While discussions about the relative merits of Rust, C, OCaml, and Intercal are fun, with enough dedication you can write unauditable/unreviewable code in any language. Even if you avoid memory corruption, you still can't prove that this kind of code does anything correct.

alkonaut 2 days ago 1 reply      
Criticism of C aside, when you use a language that isn't very expressive and where it's easy to shoot yourself in the foot, you need to keep it very neat.

I mean, just look at this: https://sourceware.org/git/?p=glibc.git;a=blob;f=sysdeps/pos...

The size of the methods, the huge macros, the low ratio of comments to (non-obvious) statements, etc.

I know it's easy to criticize very old and tested code, but there is no (and never was any) excuse for code like that.

IgorPartola 2 days ago 1 reply      
This is why I enabled automatic security updates on all the machines I control. I'd rather get a monitoring alert that something is broken, than to find out much later that someone rooted my server.
mwcampbell 2 days ago 1 reply      
The Debian glibc package update that fixes this vulnerability is dated February 11. But the patch wasn't posted on the glibc mailing list until today. So did Debian get the patch even before it was made public upstream? If so, then why didn't Ubuntu get it early as well?
spyrosk 2 days ago 3 replies      
So can someone ELI5 how bad this is?

From what I'm reading this should only affect systems that use a compromised DNS server or in a MitM attack scenario. Which is serious but not so easily exploitable (I think).

jedisct1 7 hours ago 0 replies      
Don't panic, don't spread fear: https://00f.net/2016/02/17/cve-2015-7547/
tasqa 2 days ago 2 replies      
Looking for a quick mitigation technique before patches start rolling out... Would it be wise to limit responses to 512 bytes so the payload cannot be loaded?

Configuring BIND to use a specific buffer size (only for BIND 9.3.2 and newer):

Add the following line to the "options" section of your named.conf file:

edns-udp-size n;

Configuring Unbound to use a specific buffer size:

Add the following line to the "server" section of your unbound.conf file:

edns-buffer-size: n

source: https://labs.ripe.net/Members/anandb/content-testing-your-re...

fpoling 2 days ago 0 replies      
If alloca could be used to get an arbitrary-sized buffer, the bug would not exist. Another story, https://fosdem.org/2016/schedule/event/ada_memory/, points out that Ada does not have this limitation. There it is the job of the compiler to allocate big chunks outside of the stack so as not to cause stack overflows. C really needs such an API.
friendcomputer 2 days ago 0 replies      
If this was originally filed on an open bug tracker in July 2015, what were the glibc team doing in the mean time? The Google post indicates they were "working on it" when Google got in touch. How much work was going on, exactly? How did this languish for so long?
arielb1 2 days ago 2 replies      
That's what you get when you have thousands of lines of code with variable names like `thisanssizp`. glibc should die in a fire.
titzer 2 days ago 1 reply      
Buffer overrun on the stack...this makes me sad. It's 2016.
btrombley 2 days ago 1 reply      
Can someone please explain the fix in practice? Is it as simple as upgrading glibc (and eglibc?) on all servers? Or is there a network change I should immediately change?
jvehent 2 days ago 2 replies      
iptables -t filter -A INPUT -p udp --sport 53 -m connbytes --connbytes 512 --connbytes-dir reply --connbytes-mode bytes -j DROP

iptables -t filter -A INPUT -p tcp --sport 53 -m connbytes --connbytes 512 --connbytes-dir reply --connbytes-mode bytes -j DROP

AndyMcConachie 1 day ago 1 reply      
I have a question that hopefully someone can clear up for me.

If I understand the Google sec article correctly, this requires a single packet > 2048 bytes to be received by a host using glibc.

> The vulnerability relies on an oversized (2048+ bytes) UDP or TCP response, which is followed by another response that will overwrite the stack.

Is my understanding correct?

If it is, then it's worth pointing out that many links on the public Internet have an MTU of 1500 bytes. This is a historical legacy of original Ethernet from the '80s. Path MTU Discovery (PMTUD) doesn't really work on the Internet, so it's safest to assume that you only ever get 1500 bytes.

Given all that, this places a burden on anyone wanting to exploit this. Since they cannot assume a PMTU greater than 1500 bytes between endpoints, they're limited in how they can exploit the bug. Correct?

Please correct me if I'm wrong. I always feel these bug reports are missing that vital piece of information I need to operationalize the bug. And thanks.

maxima120 1 day ago 0 replies      
I think the new generation wants to take over Linux with their new ideas, but in reality they just want to prove themselves, and are prepared to completely ruin it in the process.. well. Cos they're just young and want new things. They know big words and think they know the world, and they hate their parents... so in short - they want to do to Linux what the previous generation did to Windows... karma.

P.s. whatever language you use is irrelevant. Bugs are in the heads. Code is just the reflection.

efuquen 2 days ago 3 replies      
Can someone give me technical reasons why this world isn't possible:

Parts of the Linux kernel or glibc or any other critical C code get replaced by Rust code a little at a time, which is also callable from C (https://doc.rust-lang.org/book/ffi.html)? That way these libraries could be made safer in a controlled and incremental manner.

And to reiterate I'm asking for technical limitations, not political or dogmatic.

nwah1 2 days ago 5 replies      
Why don't more distros use the lighter weight C runtimes?
t0mk 2 days ago 1 reply      
Is getaddrinfo usually statically linked or dynamically linked to stuff?

Which pkgs on Ubuntu will be necessary to upgrade once they roll the fix to the repos?

Gratsby 2 days ago 2 replies      
Has anyone put together a POC that doesn't require re-pointing the system nameserver and crashing other applications?
pilif 2 days ago 1 reply      
How is this related to CVE-2015-0235 (the GHOST vulnerability last year)?
fpoling 2 days ago 0 replies      
I wonder why alloca+malloc/free was used in the first place and not straightforward malloc/realloc/free. The overhead of the latter should be negligible given that this is a DNS resolver. The overhead in fact could be negative due to simpler code and better cache utilization.

Premature optimization is a root of all evil indeed.

JabavuAdams 2 days ago 0 replies      
I apologize for my bad behaviour on this thread. I'm in a bad place, mentally.
totony 2 days ago 0 replies      
If this causes you serious problems, you should really consider using a patched grsecurity kernel (even though their stable versions aren't free anymore)
LinuxBender 2 days ago 3 replies      
Has anyone actually tried the PoC on their systems? I will test on CentOS 6 and 7 after I have had my coffee. Anyone willing to volunteer to test on Ubuntu and Debian?

Here is CentOS 7

 [ 389.064412] do_general_protection: 159 callbacks suppressed
 [ 389.064416] traps: CVE-2015-7547-c[1161] general protection ip:7fa6b0d8fd67 sp:7ffdaf034a30 error:0 in libresolv-2.17.so[7fa6b0d87000+16000]

mariuolo 2 days ago 0 replies      
There's something I don't understand: if it's from 2015, how come it hadn't been fixed until now? At least for debian.
Mojah 2 days ago 0 replies      
A summary of the problem, the affected Linux versions and patching remediations have been posted here: https://ma.ttias.be/critical-glibc-buffer-overflow-vulnerabi...
ComputerGuru 2 days ago 0 replies      
Everyone's going crazy advising everyone else to update, but the glibc homepage is happily, statically sitting on version 2.22 from 2015-08-14.

Maybe we should start by releasing an update there asap, and go from there?

(I just switched our last Linux server over to FreeBSD, despite some software we use not being available; so I'm happy to sit this one out.)

leesalminen 2 days ago 2 replies      
I haven't seen anything come down through yum or apt-get yet. Does anyone know how this can be patched prior to that?
amelius 2 days ago 1 reply      
Is there something that can be done about this on the network level? I mean, it seems almost impossible to assure that every instance of getaddrinfo is patched.

I'm thinking about a background tool (iptables plugin?) that simply truncates long DNS replies, so that they can never cause a buffer overflow.

rms_returns 2 days ago 2 replies      
Can someone explain in layman's terms how this will affect me as a Linux user who works on Ubuntu?
Erwin 2 days ago 1 reply      
It seems tcp_wrappers-libs is using getaddrinfo, so if you have some rules setup there that may be an attack vector. I'm not sure if sshd will want to do a getaddrinfo if you don't have some tcp wrappers rule set up in /etc/hosts.{deny,allow}.
Ono-Sendai 2 days ago 1 reply      
Looking at that code, which is a tangle of goto statements and buffer allocations and accesses, it's a miracle any of it works in the first place. I bet there are tons more bugs in there.
cft 2 days ago 1 reply      
If I point the local DNS resolver to Google's DNS server as a temp fix (in /etc/resolv.conf), will that mitigate the threat before the patch?
mmosta 2 days ago 0 replies      
Newbie questions:

Are "upload file by URL" functions potential vectors? (payload in malcious dns response)

Is this contingent on a cooperating DNS server (not truncating the record?)

takeda 2 days ago 1 reply      
Fun fact: If you have programs written in Go, after patching this you probably will need to recompile all of them.
weinzierl 2 days ago 2 replies      
Can this be used to exploit DNS servers via other rogue DNS servers? What if I setup a rogue DNS and wait for it to be queried by When I own I can exploit every client that queries it. Browsers use getaddrinfo(), don't they?
alblue 2 days ago 1 reply      
Does this affect OSX?
el8c0d3r 2 days ago 0 replies      
Don't use SQL or PHP either! These are vulnerable to bugs!
rtpg 2 days ago 3 replies      
Are there any big efforts to rewrite glibc in something like rust? ... Is that a thing that is even possible? An in-place replacement library for dynamic or static linking.

I'm really worried that I still hear about buffer overflows in this day and age. Of all the libraries in the world, glibc should probably be written in some subset of Idris that compiles into 100% safe C. We have the technology to move to this now.

pcwalton 2 days ago 10 replies      
> Languages like Ruby have had their host of ridiculous security errors.

> Fuck Rust and you naive and self-serving evangelists. Come back to me in 20 years with what you've learned.

This might be a cogent criticism (although a pointlessly mean one) if we hadn't started the Rust project with an analysis of precisely what the contents of our security bugs consist of, and designed the project to target those.

Write code that is easy to delete, not easy to extend programmingisterrible.com
555 points by AndrewDucker  4 days ago   132 comments top 28
lostcolony 4 days ago 4 replies      
I think this highlights a common issue when development is (or feels) rushed. You either end up with developers having only done the first part of each of these pairs (repeating themselves, ball of mudding, etc), without time to clean up as part of each iteration, or, you find developers immediately shooting for the latter half of each pair (DRY, modular, etc) without having done the former, and so you get abstractions that make no sense, overly complex interactions in a shared function as they attempt to be DRY, etc.

This latter is also, I feel, what informs a lot of the monolithic frameworks used for 'enterprise' development, Spring and the like, where a predetermined architecture and structure of the app is imposed by the framework, and which leads to, once you get down in the weeds of dealing with odd edge cases and things, hackery on the part of the developer, or framework bloat if the framework attempts to address the most frequent of those cases.

netghost 4 days ago 3 replies      
The trick really boils down to: be messy, but clean up. If you don't do the second half of each thing (copy and paste / don't copy and paste), then you end up with that unmaintainable mess. If you do the second part too early, you end up with the wrong abstraction.

Sandi Metz also talks about this: http://www.sandimetz.com/blog/2016/1/20/the-wrong-abstractio...

chairleader 4 days ago 2 replies      
This is extremely validating to read. How many times have I battled with DRYists over which solution is "better."

I've happened upon the pattern of code growth described here after years of encountering and resolving pain points in code, often as the maintainer. DRYness for DRYness sake might feel satisfying when writing the code, but when changes need to be made, it often scatters constraints and requirements throughout the codebase, making any change an unestimatable mess of sneaky traps. Try writing your boilerplate initialization code straight some time... no metaprogramming, no helper functions, just get your configuration, create your objects and wire them together. It is oddly liberating to have your bootstrap code flat and unmagical.

V-2 4 days ago 0 replies      
Makes sense - you can't know the right abstraction upfront.

As Hejlsberg said:

"If you ask beginning programmers to write a calendar control, they often think to themselves, "Oh, I'm going to write the world's best calendar control! It's going to be polymorphic with respect to the kind of calendar. It will have displayers, and mungers, and this, that, and the other." They need to ship a calendar application in two months. They put all this infrastructure into place in the control, and then spend two days writing a crappy calendar application on top of it. They'll think, "In the next version of the application, I'm going to do so much more."

Once they start thinking about how they're actually going to implement all of these other concretizations of their abstract design, however, it turns out that their design is completely wrong. And now they've painted themself into a corner, and they have to throw the whole thing out. I have seen that over and over. I'm a strong believer in being minimalistic. Unless you actually are going to solve the general problem, don't try and put in place a framework for solving a specific one, because you don't know what that framework should look like."


So of course "write it dirty, clean it up afterwards" seems like a better idea, because it lets you feel what the right abstraction is. However, it takes discipline to always refactor something that already works fine.

ianamartin 4 days ago 1 reply      
I feel like this is a little like learning to cook. At the very beginning, I would just create a huge mess as I was going, use a different pan for every item, a different mixing bowl for each thing, and I didn't know how to prep effectively in advance.

This left a train wreck in the kitchen after each meal that I was forced to clean up before I could cook again.

I learned how to do a little up front prep, and that saved time and mess, and made the whole process smoother.

As I learned more, I started trying to clean up as I went and conserve the number of pots and pans, I made a lot of mistakes washing everything after I used it, which slowed the whole process down, and I didn't need to reuse some of it.

Now I'm in a place where I have enough experience that I know in advance what I can reuse for this meal and whether it needs to really be washed (maybe you just deglaze a pan and wipe it down with a paper towel instead of hauling it over to the sink and scrubbing it), when I'm actually done with an item, what the approximate cooking times are, etc.

Now I can cook complex multi-course meals to a good restaurant quality and have just a couple of things left to clean up by the time dinner is ready to serve.

It doesn't take a genius to figure these things out, just the acknowledgement that these things are important, and the more people you are working with, the more important they are.

Sure, I could have spent the rest of my life cleaning the huge mess after every meal at home, and it wouldn't really matter.

But if you are going to be a line cook at a large restaurant, you must get a hold of these concepts.

jonahx 4 days ago 3 replies      
To add to his point about copy and pasting...

Notice the contradiction implicit in these bits of accepted dogma:

1. Copy and pasting code is evil (DRY).

2. Tight coupling is bad, loose coupling is good.

A system written with no abstracted functions (ie, where copy/pasting was the rule) has no coupling. Of course, I'm not advocating such a style. But it's worth keeping in mind that practicing DRY through abstraction necessarily increases coupling, which means that it's always a tradeoff and that you have to get your abstraction right so that the coupling, on average, makes change easier rather than harder.

d0m 4 days ago 1 reply      
Totally agree. One thing that I still find really hard to maintain/delete is CSS code. More and more I feel like it should be included with components rather than in a plain .scss or .css files. It feels good to be able to delete a component knowing that there isn't css crap left behind..
userbinator 4 days ago 0 replies      
This article could be summarised as "abstract only when you see a clear need to", something that seems to be the exact opposite of what a lot of programming courses teach; especially those dealing with object-oriented design. I think abstraction should be viewed not as a technique to be applied generously and whenever possible, but a necessary evil, resorted to only when nothing else can simplify the code.

A bonus of this style is that it often also makes the resulting code more efficient for the machine to execute, reducing the need for optimisations later.

makecheck 4 days ago 0 replies      
The importance of DRY is proportional to what could go wrong if the repeated code has an issue. For instance, if the code is performing some vital calculation or is part of your security architecture, it had better not be repeated anywhere because somebody will need to patch it later and will need to guarantee that the update has been applied consistently.

Sometimes, it's just clearer to rewrite something yourself. For instance, just because you can express just about anything in terms of algorithms in the C++ standard library, you should instead write the simpler stuff by hand. (A great example is this silly idea of copying to an ostream iterator just to print out some data; I don't care if that combination of standard functions happens to produce the desired result, because the code is painful to look at!)

It's also helpful to using aliasing (e.g. C++ reference variables) to make similar code look as similar as possible; i.e. rather than have two similar blocks using entirely different variable names throughout, declare references at the top to give them the same names so that the similarities are obvious. This also makes it easier to later pull common parts into functions if desired.

kazinator 4 days ago 0 replies      
Easy to replace, not simply to delete.

Anything that is easy to delete (and just leave it deleted) is superfluous. Don't write superfluous cruft, obviously.

Easy to replace is important in all engineering. A product or structure with easily replaceable parts is better than one without easily replaceable parts, all else being equal.

We'd never say "design a brake caliper with brake pads that are easy to delete". :)

swalsh 4 days ago 1 reply      
From his about: "I am not a very good programmer. I forget to write tests, my documentation is sparse, and im pretty apologetic about it at any code review. "

Guy who doesn't write unit tests suggests workarounds for dealing with problems in code that has no unit tests...

xamuel 4 days ago 2 replies      
Corollary: Write code such that it's easy to use grep to find where things are used.

One of my pet peeves is when OO programmers take classes' separate namespaces as a license to use super-generic method names. Method names like "add", "update", etc. Makes it near impossible to figure out where those methods are used!

MaybiusStrip 3 days ago 0 replies      
Some of the good advice in this blog post is taken too far, like the advice about intentionally writing shitty code in order to learn from your mistakes. Coding is like any other skill. You learn from your mistakes only if you're trying your best not to make them. Otherwise there's no differentiating between your real mistakes and your carelessness.

> A lot of programming is exploratory, and it's quicker to get it wrong a few times and iterate than think to get it right first time.

It's really hard to get it wrong a few times if you have 10 other developers building on top of your mistakes. Then you're pretty much stuck with the code you thought you were going to get rid of. In accordance with Murphy's law, it seems like the code you push out knowing it sucks always ends up being the foundation for something really important. So yes, build the simplest thing possible. Yes, don't abstract prematurely. No, don't write shitty code on purpose.

The same goes for the "copy-paste 10 times" advice. While I agree that you should copy-paste 2-3 times, 10 times is way too many. By the time you've copy-pasted something 10 times, it's too expensive to refactor. Or if you do refactor it, 5 of those 10 instances have changed beyond recognition and will be missed. They continue to evolve, and now for the rest of the life of your software, you're fixing 6 times the amount of bugs.

DanielBMarkham 4 days ago 5 replies      
Interesting. Thanks.

This looks like it's from a very functional (FP) view of the world. I like a lot that's here, but I fear it will make my OOP friends' heads explode.

Also, as a nit, the essay feels a bit on the "thrashing" side. I know when I'm trying to express complex concepts, many times I will hit the same topic a few different times from multiple angles until I get something that's tight. This essay feels like one of those attempts -- nothing here to throw rocks at; it just doesn't feel "done" yet.

progx 4 days ago 0 replies      
Step 8: Listen to what other programmers say

Step 9: Don't listen to other programmers

andy_ppp 4 days ago 0 replies      
This all goes to show how hard code reuse is - is it worth having two very similar functions that do the same thing, instead of having one function that's DRY but more complex than both?

Maybe you should have a shared function that has the similar bits of both but then when you remove pieces you have this added complexity.

Everyone can give you arguments either way...

I remember reading this beautiful explanation about an idea for a functional language where you simply install collections of methods that do specific things and each method you install should be as simple as possible and do just one thing etc. - maybe it was somewhere on the Elixir mailing lists but I can't find it!

dclowd9901 4 days ago 0 replies      
This is a super frustrating article, so I'm going to post the quote I think that sums up the art:

"To write code thats easy to delete: repeat yourself to avoid creating dependencies, but dont repeat yourself to manage them."

This is such a hard thing to explain to someone, so I'm impressed the writer described it so accurately and succinctly.

kapitalx 4 days ago 0 replies      
The Rule of Three covers steps 2 and 3. They used to teach this in our 1st-year computer science program: https://en.wikipedia.org/wiki/Rule_of_three_(computer_progra...
yahyaheee 4 days ago 0 replies      
I love this article. I have been at war with some of the tech teams at my company that force reusability. Personally I find it a nuisance, and in most cases can write a new function in less time. You have articulated this ideology very well, thanks for the insight
z3t4 3 days ago 0 replies      
I might not have understood the article, but avoiding writing code is dangerous, and will lead to code that after a while will be impossible to understand, because there will be hacks upon hacks. Your boss loves this, though.

And about cleaning up the code after you got it working, ha, ha, like that would ever happen. It doesn't have to look nice, but make sure you get it right from the start, and that it covers all those twenty edge cases. It will cost more time ... So most software projects fail anyway!? Maybe it's because of your shitty code? (grin)

falsedan 4 days ago 0 replies      
Another behaviour which makes these steps hard to follow is: someone writes code and ships it, and thinks that, because it works and is being used, it is great and doesn't need to be changed. These feelings get even more in the way when the code took a while to write!

I keep telling my team to write bad code: write something that's correct (but not great) quickly, with an eye to replacing it with something better next sprint. Then, once you have something which works end-to-end, work out which bit is most terrible, and replace it. Repeat & launch when the quality is acceptable (and keep repeating after you launch).

akkartik 4 days ago 0 replies      
Over the past year I can't stop thinking about this article: http://250bpm.com/blog:51
vmorgulis 4 days ago 0 replies      
A neat response to the same problem posted yesterday:


Basically, layers with numbers and preprocessor directives to inject (tangle) code in the proper places without worrying too much about language abstractions.

jrochkind1 4 days ago 0 replies      
This is awesome. Easier said than done, but, yes. Coding is a craft, and always will be.
codazoda 4 days ago 0 replies      
I read this and thought it sounded a lot like my reality. Then, I expected to come to these comments and hear arguments against these thoughts. Glad to see a lot of us agree with much of it.
auvrw 3 days ago 0 replies      
really appreciated this b/c I've lately felt bad a/b not "getting it right the first time," and one point here seems to be that that never happens, exactly; that development always involves some amount of evolving a body of code, at one level or another
danbruc 4 days ago 2 replies      
"Jean-Paul Sartre's Programming in ANSI C"

I am pretty sure that is at least a misattribution.

cake42 4 days ago 3 replies      
"Every line of code is written without reason, maintained out of weakness, and deleted by chance" - Jean-Paul Sartre's Programming in ANSI C.

I just started the article and I already have problems with it; not a good sign. It may be obvious when you research the timelines of JPS (he died a few years before ANSI C was established in '89) and C, not to mention the miles of metaphorical distance between computer programming and JPS's work. I guess the author was trying to be cute? But that fabrication should be made clear as such; he's undermining his own inherent credibility as an author, however much the reader decides to put in. Serious problem in my book.

Issue and Pull Request templates github.com
506 points by joshmanders  1 day ago   133 comments top 26
jamesRaybould 1 day ago 3 replies      
There is already a way of doing this using the URL like: https://github.com/jamesRaybould/go-mssqldb/issues/new?body=...

You can then add it as a simple href to the readme.md.

It also means that you can have multiple templates depending on what a user wants to do, just by having multiple links and changing the content of the `body` parameter.

Simplest way to get going on this is to use http://urldecode.org to write the markdown you want and then hit the encode button, take the result and add it after `body=`

We also use it to auto-assign labels using `labels=` in the URL
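The urldecode.org step can also be scripted; a small sketch in Python (the template text and label here are made up for illustration):

```python
from urllib.parse import urlencode

# Markdown to pre-fill in the new-issue form (contents are made up).
template = """### Steps to reproduce

1. ...

### Expected behaviour

### Actual behaviour
"""

base = "https://github.com/jamesRaybould/go-mssqldb/issues/new"
# body= carries the template; labels= auto-assigns labels, as described above.
link = base + "?" + urlencode({"body": template, "labels": "bug"})
print(link)
```

Drop the resulting URL into an href in the readme and the new-issue form opens pre-filled.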

erikb 17 hours ago 0 replies      
Now I actually start to worry. Did anybody here ever have the problem of making people happy with a software project?

The usual complaint goes like this: "You need to do X because I want to be able to do Y." In the complainer's mind there is the untested idea that having X will enable him to do Y, which solves his unspoken problem Z that he isn't even aware of. The thing is, at this point you don't know Z. You don't know if Y really solves Z. And you don't know if X really solves Y. And neither does he. But if you want him to use your tools, he doesn't need to worry about that as much as you do.

What happens if you just go like "Okay, user wants X, here is X!" is that the users will continue to complain (maybe even more) because Z is still not solved, and because there was no testing and planning involved X is actually creating another problem Z2 that nobody had before. At least that's my experience with an open source project I managed for about 3 years.

What I found actually needs to happen is to discover Z and to discover a way to solve it in the context of the project (which other people may not be as aware of as you are), and with an at least minimized chance of creating more problems. Then this actual solution needs to be sold to the users, because they are not aware of Z, so they think they don't care that you solved Z. But only after doing all that people will stop complaining (not even remembering that there was a problem and how much pain you went through to solve it of course).

Hope that makes sense and explains why I start to worry now, when everybody starts cheering. What I hoped would happen is that you wouldn't hear much about the suggested changes, some other changes would happen a few weeks down the road, and then the complaints would stop without anybody noticing. A success would be that you don't read about GitHub anymore after 1-2 months. People cheering and GitHub saying "Hey, we did X" is a really bad thing.

jakozaur 1 day ago 3 replies      
Great job!

Next item, be able to star issues.

That would help a lot and would let us avoid +1 comments.

VeejayRampay 1 day ago 0 replies      
Well done Github. Simple and elegant solution that I hope will help people a lot.
swang 1 day ago 5 replies      
Kinda meh on adding it to the repo since it's yet another file I have to "manage" that isn't really part of the working code.
anonicode 1 day ago 1 reply      
> This is the first of many improvements to Issues and Pull Requests that we're working on based on feedback from the community

So there is more to come

minimaxir 1 day ago 1 reply      
The template is more for actual issues with the software than to-do lists/user grievances, the latter of which I see used more frequently in GitHub Issues. Maybe it's time to separate GitHub Issues into Issues and Discussion.

EDIT: Missed the fact that the feature is opt-in by the repo owner, which makes things more expected depending on the nature of the repo. Although now thinking about it, the separation is still not a bad idea.

_ikke_ 1 day ago 1 reply      
lr 1 day ago 1 reply      
I asked Craigslist to do this years ago for the "for sale" sections, so that people included the number of doors on a car, the color, etc., and the kind of heat in an apartment, and so on. Such a simple thing, and would make searching so much better, and the service in general better.
tobr 1 day ago 2 replies      
So, the issue template is just a default text that individual users can modify, delete, or otherwise disregard?
colinodell 1 day ago 1 reply      
pull_request_template.md also works.
fiatjaf 1 day ago 1 reply      
I want to know how this works: http://gitmagic.io/
arnarbi 1 day ago 1 reply      
Why isn't this in a separate branch akin to gh-pages, or a separate repository akin to the wiki data?
marcinkuzminski 1 day ago 1 reply      
How does that work with branches? Is a master branch required to have this file? What if the project doesn't have a master branch?

I think the concept of keeping such a file in source code is flawed for a DVCS, unless you can define a so-called "source" branch that is the default source of such information.

logn 1 day ago 1 reply      
The problem I have with this is that I don't want a template for the comment a contributor leaves on a PR; I want to display a message to them before they submit a PR. A template is not a standard way to display a message: it requires users to read editable text (with no clickable URLs) and then delete that text before they submit.
atrotors 23 hours ago 0 replies      
Well, it seems like the open letter is working!

I hope they address the other issues as fast as this one. Rating system is the next one on my list.

rurban 1 day ago 1 reply      
I'm excited, but the PULL_REQUEST_TEMPLATE.md name is too long for root. What about PULL_REQUESTS.md and REPORT_ISSUES.md?
steveklabnik 1 day ago 0 replies      
I have a PR open for Rust to use this. I and others are very skeptical in general, but there's some interesting discussion so far: https://github.com/rust-lang/rust/pull/31732
VeilEm 1 day ago 1 reply      
Doesn't seem to work on enterprise github yet. :(
pducks32 1 day ago 1 reply      
I'd like them to choose a folder name that isn't specific to a site. .github would look silly on GitLab, but I like the idea of having a separate folder.
kuschku 1 day ago 0 replies      
It'd be interesting if it'd provide a separate input box for each section of the template, maybe even a graphical editor for lists if the template specifies a list.
dang 1 day ago 0 replies      
Url changed from https://github.com/dear-github/dear-github/issues/125 to the announcement post.
EC1 1 day ago 1 reply      
There's a special place in hell for people that make jokes and post massive animated memes in issues.
shmerl 1 day ago 1 reply      
What about attachments to issues? Using gist for it is simply annoying.
thescribe 1 day ago 4 replies      
This sounds like more 'enterprise' bureaucracy. Coming soon, overly complicated paperwork.
gcb0 1 day ago 0 replies      
talk about moving slow.

two years and that's what we get? meanwhile my bigger diffs are still garbage. and we have to use other companies to have a simple agile board... and don't even get me started on decent branch management and rebases...

sigh. really hate that my employer buys that

Wikimedia removes the Diary of Anne Frank due to copyright law wikimedia.org
451 points by rosser  6 days ago   192 comments top 25
Houshalter 5 days ago 6 replies      
Please read this: https://web.law.duke.edu/cspd/publicdomainday/2016/pre-1976

And also look at the graphic on this page: http://www.theatlantic.com/technology/archive/2012/03/the-mi...

Copyright serves a legitimate purpose. I doubt there would be many big budget games or movies produced without some degree of protection. The purpose of copyright is to provide an incentive to produce creative works. But it also needs to balance the benefits of that with the costs of restricting works from the public.

I believe that copyright should last 10 years. That sounds really short, but hear me out. The vast majority of works are not economically relevant after 10 years. And the ones that are have usually earned 99% of their money within ten years. It's a power law: the first year earns vastly more money than the second year, the second more than the third, and so on. This is especially true for games and movies, which are fast-paced industries. It's also true for books and music, but the power law curve is a bit less steep.

There are exceptions to this, but they are just that - exceptions. The point is to provide an incentive for creators, not to give giant benefits to the outliers at the expense of the public.

Also, copyright law originally had some additional measures to keep it from being restrictive. You had to actually register the copyright rather than have it granted automatically, and you had to renew it halfway through, which cost a few thousand dollars.

I also am very much in favor of derivative works. Fanfiction shouldn't be illegal, making a t shirt with your favorite character on it shouldn't be illegal, etc.

With this system you would be able to use old music, photos, software, books, movies, etc. As long as they were made before 2006.

Also, trademark could handle some of the remaining issues. E.g. "Star Wars" could be a trademark of George Lucas. You could make Star Wars derivatives, but you couldn't brand them with Star Wars. Consumers would be able to tell apart ripoffs from works by the original creator. Mickey Mouse would still be a trademark of Disney, but you could watch Steamboat Willie cartoons on YouTube.

eggy 6 days ago 3 replies      
I am more distressed that, for a work by a German who was at the time stateless in the Netherlands, written that long ago, and given the global importance of such a work, the U.S. even holds the copyright. I understand the explanation in Wikimedia clearly, but I do not accept it.
noonespecial 6 days ago 1 reply      
Just like that silly explanation of the width of railroad tracks starting with the width of a Roman horse's ass, the question of "why can't I read the Diary of Anne Frank" starts with "so there was this cartoon for little kids in the '20s called Mickey Mouse..."

We reach this ridiculous situation by serious people taking a series of thoughtful actions a tiny bit at a time. But serious and thoughtful actions only add upon themselves with each step. Ridiculousness multiplies.

tome 5 days ago 4 replies      
Lots of rage here against the US government, little rage against the Anne Frank Fonds that actually asserts the copyright.

If you really feel so strongly, then go and rail against the foundation, which could transfer the book into the public domain with at least a million times (literally) less difficulty than it would take to get US copyright law changed to the same effect.


magicfractal 6 days ago 12 replies      
That's barbaric. That's why civil disobedience is the only adequate answer to the current overreach of copyright in the United States. It's up to us, technologists, to create and disseminate tools to help dismantle an absurd system that is obsolete in the digital age.
nness 6 days ago 1 reply      
I find it startling that, with the current terms of copyright, no music I'm listening to today or even in the last decade will be in the public domain before I die.
empressplay 5 days ago 1 reply      
Available on archive.org (and going nowhere):


blueflow 5 days ago 0 replies      
Then just read "Mein Kampf" instead, it's already free.

I could laugh if this wasn't for real.

Artoemius 6 days ago 0 replies      
That's depressing. As I'm becoming older, I'm beginning to suspect that the bright sci-fi future isn't really going to come true after all, and we'll have to live in a sad bleak dystopia.
anonymfus 5 days ago 0 replies      
Why, considering that US copyright laws are among the worst, is the Wikimedia Foundation still based here? They should move to another jurisdiction.
based2 5 days ago 0 replies      
Lawrence Lessig: Re-examining the remix


diziet 6 days ago 1 reply      
But who holds the copyright?
JohnIcare 5 days ago 2 replies      
Are there other stories like the Anne Frank one? Why not boycott the museum in Amsterdam and the books, and speak of other stories? Is that cynical? Not more than the Anne Frank copyright holders...
hartator 5 days ago 0 replies      
This foundation - http://www.annefrank.ch/ - owns the copyright. It's meant for charity, but I don't think they realize they are doing more damage by holding the copyright than whatever good the foundation is doing.
brooklyndude 5 days ago 1 reply      
Ok, I know, first reaction: utter STUPIDITY of the human race. But don't let it get you down, OK? There are some good, smart people out there who care about it all. They're out there, they really are. Just keep looking. :-)
XJOKOLAT 5 days ago 0 replies      
immediately downloads a digital copy as a matter of principle
acd 5 days ago 0 replies      
This relates to Disney and Mickey Mouse as follows: every time the copyright on Mickey Mouse was about to expire, copyright law has been extended.

"This law, also known as the Sonny Bono Copyright Term Extension Act, Sonny Bono Act, or (derisively) the Mickey Mouse Protection Act"


Mickey Mouse copyright vs. the work of a Jewish girl killed by the Nazis.

rajneeshgopalan 4 days ago 0 replies      
What a shame.. such stories should be spread widely and rapidly!
chris_wot 5 days ago 0 replies      
Wow, that's appalling!
killerpopiller 5 days ago 2 replies      
But if it entered the public domain in NL, who can uphold copyright in the US? How can US law overrule Dutch law if the work is of Dutch origin?
ctingom 5 days ago 0 replies      
So, who actually owns it?
rocky1138 6 days ago 1 reply      
Does anyone have a copy of it that we could seed on torrent?
setra 6 days ago 0 replies      
Nice one, a holocaust joke.
throw99182 5 days ago 0 replies      
Good to see the activism regarding copyright. On the other hand, when there is a real human who suffers like, say, Anne Frank, half of HN blames the victim.
Download.com and Others Bundle Superfish-Style HTTPS Breaking Adware howtogeek.com
450 points by jacquesm  6 days ago   223 comments top 23
pdkl95 6 days ago 3 replies      
Let's say I ran a well-known business that sold some sort of physical product.

It could be a toy, a common household tool or appliance, or anything else that is small and inexpensive. It more or less works as intended, but it also includes a small robot. The packaging and marketing would be designed so you weren't supposed to notice the robot, but the packaging included the necessary fine print and an explanation that this robot was just there to make sure you got the best experience possible, for anybody who spotted this "extra feature".

Unrelated to whatever it was that you bought, at night the robot would install a device on your phone that re-routed all your phone calls through my office (a MITM attack). The phone still works OK, except now calls to your favorite pizza delivery restaurant seem to be re-routed to the competitor across town. Some time later a neighbor complains that sometimes, when he checks his voicemail, he gets your phone conversations instead.

After finishing with the phone, the robot does the same thing to your cable TV.

Some days, the robot would go through your (physical) mail and place stickers with new advertisements into your magazines. Occasionally one of those stickers would end up on your electric bill, obscuring important information. The power company has a similar logo to one of the sticker-ads, so the robot probably confused the two logos. Even if the robot didn't have any stickers to place, it would still open your mail and leave it (opened) on the ground near your mailbox for anybody to see.

If I ran a business that did this - possibly as my main (or only) product - how long would I be able to run this scam before someone threw me in jail?


Intentionally breaking TLS with a MITM attack goes way beyond the usual scam/trojan. This isn't even the usual negligence that we see in the "security" of a lot of products. Creating a certificate that lets you MITM any domain is very obviously a willful act.

michaelmrose 6 days ago 4 replies      
If only Microsoft had run with the idea of package management and trusted repos that has existed in open source for decades, without restrictions that discourage people from using it. For example, to release a Debian repo and share your software with your users, you have the option of handling hosting/payment (if any) yourself and keeping 100% of the revenue, not 70%.

Their position in the market is such that had they started pushing this around the time apt-get started to be a thing they would have had near 100% adoption and users would be used to installing everything that way and would be naturally suspicious of manual installation.

Bad actors could end up on a blacklist that users would opt to enable. You would even have multiple black list sources from people like antivirus vendors etc.

mcv 6 days ago 5 replies      
How is this not extremely illegal? People have gone to prison for far less. How is it possible that a company like Lenovo would get involved with this?

I recently found myself wondering if I should consider CNET a reliable source of software. I guess this story answers that.

libeclipse 6 days ago 9 replies      
It's amazing watching a normal person install something. Most people I've seen just spam the next button until the window disappears, until the software and all its friends are happily installed. I think the first step is to educate users. Or implement a package manager.
userbinator 6 days ago 1 reply      
"The bottom line is that you can no longer trust that green lock icon in your browser's address bar. And that's a scary, scary thing."

What's even scarier? Not being able to inspect the traffic your own machine sends or receives, because the powers that be have decided that, thanks to Superfish and all this other unwanted MITM'ing software, certificate stores must be locked down so well "to improve security" that only the "trusted authorities" (i.e., they) can modify them.

As long as users (and by extension, the software they run) can modify the certificate store this "problem" will exist, but as this article shows, it's not hard to add and remove certificates, and thus effectively "choose who you trust". The alternative, to have no choice in who you trust, is far worse. I just hope that the security community realises this, but if things continue moving in the direction they currently are, I'm not so optimistic.

Incidentally, I also use a local MITM proxy, but to remove ads and other crap.

nikcub 6 days ago 2 replies      
The solution to this is Certificate Transparency[0], a distributed public log of certificate timestamps that are submitted by CA's and checked by browsers.

CT has been required for EV certificates in Chrome for a year now[1], and eventually will be required for all certificates; otherwise they will error out on connection.

A certificate signed by a root cert that is not the original CA will not validate.

[0] https://www.certificate-transparency.org/

[1] https://blog.digicert.com/certificate-transparency-required-...

AndrewUnmuted 6 days ago 1 reply      
I work for CBS Corporation, which owns CNET/Downloads.com.

I have sent an email to our security team with this story, and will report back if I hear anything from them.

svenfaw 6 days ago 1 reply      
It is not immediately clear, but the article was published in February 2015.
sparky_ 6 days ago 2 replies      
I honestly wonder if there should be some sort of signature or approval process on the OS vendor's part before any cert can serve as a root.

I'm not sure what that would look like and I do realize there are some 'walled garden' implications here. But honestly, I don't get how or why a userland application has any right to touch the OS' trusted CA roots. Perhaps some model similar to driver/kext signing would make sense - self signed and/or untrusted could be loaded when the system is booted with some development mode flag, but on a general user system, the only path to get your cert trusted as a root COULD be via update/push from the OS vendor.

nugget 6 days ago 4 replies      
Hate to say it but I'd rather focus on user education than concentrate more centralized compliance control with Microsoft. I am old enough to remember how Microsoft wielded that type of control in the past. Malwarebytes alone could/should fix 90% of this problem.
fdb 6 days ago 2 replies      
The Badfish page at https://filippo.io/Badfish/ seems to be down. Any other place where I can direct people to check for invalid security certificates?
forgotAgain 6 days ago 0 replies      
I'm really surprised that CBS hasn't been called out for the behavior of download.com. They're a major news corporation that needs to maintain their reputation. I've never seen any news story questioning senior management about the disreputable activities of CBS Interactive and its subsidiaries like CNET and download.com.
Semaphor 6 days ago 0 replies      
> Make sure [] your [] anti-virus stays updated

Or don't use them, considering how many reports there are of them actually making your system less safe.

slipstream- 6 days ago 0 replies      
I've been investigating PUPs for the last month or so.

These kinds of bundlers now drop not only adware (browser extensions, or those that drop MITM proxies that break TLS), but also winlockers and fakealert trojans of Indian origin ("CALL OUR [FAKE] TECH SUPPORT TO RESOLVE THE ISSUE").

CM30 6 days ago 0 replies      
Probably a silly question here (and I've asked it before), but why exactly do we only have dodgy download sites for Windows programs anyway?

I mean, other platforms have decent stores and download repositories. And games on Windows... well, you've got a lot of good sites and services there. Everything from Steam to Good Old Games to the average game mod or ROM hack download site is moderated and mostly kept free of adware and other crap.

Is there really no one interested in providing a site or service that explicitly disallows bundling and ad supported crap (or that outright removes it from anything submitted to them)? Does no one with any ethics exist in this space?

cm2187 6 days ago 3 replies      
I'd be willing to pay a modest fee to have a repository of common software, always up to date, free of adware, and that can be updated automatically. A sort of commercial chocolatey with more choice and more up to date packages.
joesmo 6 days ago 0 replies      
How the fuck is this not illegal? Oh wait, it is and someone should prosecute CNET. They're gaining elevated access to your computer without your permission. For once, can't the CFAA be used for good?
slartibardfast0 6 days ago 0 replies      
I wish Microsoft would crack down hard on this, perhaps by making non-trusted root certs an Enterprise/Pro feature guarded by a group policy entry.

I can't understand Google's reasoning in disabling cert pinning for non-enterprise users either! How do common-or-garden home users of Chrome benefit from that?

SimeVidas 6 days ago 0 replies      
Google Safe Browsing, please block the entire download.com domain. That would be hilarious!
voltagex_ 6 days ago 2 replies      
How are they adding a trusted cert without Windows popping up a warning?
chris_wot 6 days ago 1 reply      
Is there any way of resetting the list of root certificates?
titel 6 days ago 0 replies      
This article is one year old.
How to Safely Store Your Users' Passwords in 2016 paragonie.com
471 points by antitamper  1 day ago   268 comments top 33
jordonias 1 day ago 9 replies      
I called my bank the other day and they asked over the phone for my password. This isn't a bank I often use, I only currently have a loan through them so I've never used the login on the website. I said I don't remember setting a password. They gave me a hint about the characters in the password and I was able to remember the password based on their hint. I verbally said the password character by character and they confirmed it. This is an example of how to not handle passwords in 2016.
jtwebman 1 day ago 3 replies      
Bad idea to use bcrypt.hashSync in Node.js. I hate that so many tutorials use it instead of the asynchronous bcrypt.hash with a callback. In Node.js, for the ~200 ms you spend hashing that password, nothing else runs: no requests, everything stops. Here is the correct way to use bcrypt in Node.js:

  bcrypt.genSalt(10, function(err, salt) {
    if (err) return; // handle error
    bcrypt.hash(clearPassword, salt, function(err, hash) {
      if (err) return; // handle error
      // Store hash in your password DB.
    });
  });

VincentEvans 1 day ago 5 replies      
Serious question: What about using a Public/Private key encryption to store the password?

- Private key is stored in a secure place. Offline for all i care; printed on a piece of paper; memorized and swallowed.

- When user creates the account - password is padded with salt, then a public key is used to encrypt it. The resulting encrypted form is stored, along with the salt.

- When user attempts to authenticate - the password that is provided is padded with the stored salt, encrypted with the public key and compared to the stored password.

Private key is never used when comparing passwords. Never available to the system doing authentication, etc.

The only purpose of using reversible encryption is to be able to switch to a different authentication provider completely transparently to the user.

I've implemented this functionality, considering that we may need to switch over to Active Directory (or some other directory) in place of storing passwords in the database, but never used it, fearful that I would be committing some cardinal crime against proper security practices.


oliwarner 14 hours ago 1 reply      
Disappointed that the first solution isn't: Let somebody else do it.

I know this doesn't apply to banking, etc but 99% of the websites that "require" me to create an account and log in don't need to store primary credentials for me. Please pick a secure implementation of oAuth2 and let people store their credentials wherever the hell they want to.

I'm bored of getting hits from "Have I been pwned?"

Freaky 1 day ago 1 reply      
> base64_encode(hash('sha384', $password, true))

> ...

> The above construction may invite theoretical concerns about entropy reduction (i.e. 72 characters of raw binary without any NUL bytes comes out to about 573 bits of possible entropy, but SHA-384 hash outputs are clearly limited to 384 bits).

Given BCrypt hashes are a mere 184 bits, I don't see how this is a meaningful concern even in principle. If you're brute-forcing search spaces this big you're no longer looking to recover a password, but find a collision.
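For reference, the quoted construction (pre-hash with SHA-384, base64-encode, then feed the result to bcrypt, sidestepping bcrypt's 72-byte input limit and NUL-byte truncation) looks roughly like this in Python; a sketch, not the article's exact code:

```python
import base64
import hashlib

def prehash(password: bytes) -> bytes:
    """Compress an arbitrary-length password to 64 base64 bytes.

    SHA-384 output is 48 bytes; base64 expands that to exactly 64 ASCII
    bytes (48 is divisible by 3, so no padding), safely under bcrypt's
    72-byte limit and guaranteed free of NUL bytes.
    """
    return base64.b64encode(hashlib.sha384(password).digest())

digest = prehash(b"correct horse battery staple" * 10)
print(len(digest))  # 64, regardless of input length
```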

rcconf 1 day ago 7 replies      
If you're using node.js and you use these hashing methods, your entire server is going to pause for 0.5 seconds on a login because it runs on a single thread. Goodbye to all of your server performance.

You can create a worker system, or use a child process to solve this problem, but most of these articles never mention it

AdmiralAsshat 1 day ago 2 replies      
I had no idea that PBKDF2 had fallen so much in recent years. I still remember the 1Password team extolling its virtues five years ago:


hellofunk 1 day ago 2 replies      
It would be great if, in 2020, or sooner, but probably later, the answer is "don't use passwords any more. They are deprecated components of society."
josefdlange 1 day ago 3 replies      
Anyone have a good explanation of why, in the Python example, they recommend `hmac.compare_digest` instead of `==` for comparison?

Is there something obvious I'm missing here?
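For illustration: `==` on strings can return as soon as the first differing character is found, so the response time leaks how much of an attacker's guess matched, while `hmac.compare_digest` takes essentially the same time regardless of where the inputs differ. A minimal sketch (the hash value is made up):

```python
import hmac

stored = "9f86d081884c7d659a2feaa0c55ad015"  # stored hash (made up)

# `==` can short-circuit at the first mismatching character, leaking the
# length of the matching prefix via timing; compare_digest examines the
# inputs in a way that doesn't depend on where they differ.
match = hmac.compare_digest(stored, "9f86d081884c7d659a2feaa0c55ad015")
mismatch = hmac.compare_digest(stored, "9f86d081884c7d659a2feaa0c55ad016")
print(match, mismatch)  # True False
```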

damon_c 1 day ago 0 replies      
I have always been impressed by the way django does it.


TorKlingberg 1 day ago 1 reply      
For some reason the article completely fails to link to libsodium: https://download.libsodium.org/doc/
sinatra 1 day ago 5 replies      
From previous discussions about this topic, I had noted down the following best practices:

Passwords should be scrypt'ed on the client, and then the server should generate a SHA256 hash of the scrypt'ed value and store that in the DB.

- Running CPU- and memory-heavy scrypt hashing on the client side will allow us to use bigger hashing workloads.

- EDIT: Removing the MITM point, because as many said, that's the job of TLS anyway.

- External brute force attackers will have to take the burden of heavy hashing. No DOSing through scrypt.

- Storing SHA256 hash instead of scrypt hash on DB means even if DB is stolen, attackers can't use stolen scrypt hashes to authenticate any client.

I would love to get others' feedback on this. EDIT: Found the reference: https://news.ycombinator.com/item?id=9305504
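A rough server-side sketch of that scheme using only the standard library (Python 3.6+ ships scrypt in hashlib); the parameters here are illustrative, not a vetted recommendation:

```python
import hashlib
import os

def client_stretch(password: bytes, salt: bytes) -> bytes:
    # Memory-hard work done on the client; parameters are illustrative.
    return hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                          maxmem=64 * 1024 * 1024, dklen=32)

def server_store(stretched: bytes) -> bytes:
    # The server keeps only a fast hash of the client's scrypt output,
    # so a stolen database row can't be replayed as a login credential.
    return hashlib.sha256(stretched).digest()

salt = os.urandom(16)
record = server_store(client_stretch(b"hunter2", salt))
print(record.hex())
```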

pbreit 1 day ago 0 replies      
Assuming this is legit (seems like it), mega bonus points for walking through examples on the various platforms/languages.
Justsignedup 1 day ago 1 reply      
Wish there was a site where it lists algorithms and gives a table, and an ability to compare it to x years ago:

algorithm | fairly safe difficulty (all variables) | very safe difficulty without incurring too much performance cost

pbkdf + sha1 | completely unsafe | completely unsafe

pbkdf + sha2 | 100000 | ...

pbkdf + sha256
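In the absence of such a table, one way to calibrate is empirical: time PBKDF2 at increasing iteration counts and pick the largest count your login latency budget tolerates. A sketch (timings are machine-dependent):

```python
import hashlib
import time

password, salt = b"hunter2", b"0123456789abcdef"

# Time PBKDF2-HMAC-SHA256 at a few iteration counts; choose the highest
# count whose latency your login path can tolerate.
for iterations in (10_000, 100_000):
    start = time.perf_counter()
    dk = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
    elapsed = (time.perf_counter() - start) * 1000
    print(f"{iterations:>7} iterations: {elapsed:.1f} ms")
```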




latenightcoding 1 day ago 1 reply      
Perl programmers should try this module: Crypt::ScryptKDF I have been using it for a while now
cm2187 20 hours ago 2 replies      
What I find frustrating is the lack of availability of most of these algorithms for the most common platforms (.NET, PHP, Java). The author's recommendation seems to be driven by availability, not the algorithms' own merits.
astockwell 1 day ago 2 replies      
I believe the Ruby example is incorrect: When checking a password's validity, you must use a constant-time comparison or else you are exposing a vulnerability to side-channel timing attacks.

There is an open issue in Coda Hale's bcrypt repo about this: https://github.com/codahale/bcrypt-ruby/pull/119

My stance on posting "best practice" articles is: you must follow all best practices in them.

<Edited for clarity>

firelink 13 hours ago 0 replies      
I kind of don't like this article. I think we should be teaching people best practices for securing and storing a password, not simply giving them a library.
movedx 1 day ago 0 replies      
Hashicorp's Vault (vaultproject.io) is also an excellent way of controlling access to secrets and back ends, such as PostgreSQL.

It's worth spending the time to learn, implement, and integrate into the security best practices you should already be deploying.

carlesfe 1 day ago 0 replies      
Here's yesterday's discussion for the exact same post (maybe the duplicate search failed this time?)


dogweather 1 day ago 3 replies      
I believe that the safest way is not to save them. Instead, outsource this to a few select OAuth providers which you and your customers are willing to trust.
xjlin0 22 hours ago 0 replies      
Ruby's argon2 gem is pretty good!
jimktrains2 1 day ago 1 reply      
Or, we could move to not storing passwords at all, viz client certs and SRP.
balls187 1 day ago 1 reply      
"How to Safely Store a Password in 2016"

Don't. Unless it's 100% absolutely necessary.

If you must, continue reading on.

intrasight 1 day ago 0 replies      
Storing passwords is like storing credit card numbers - just don't do it
giancarlostoro 1 day ago 0 replies      
I'm curious if any D developers care to share their approach to this?
sarciszewski 1 day ago 1 reply      
I'm part of the Paragon Initiative Enterprises team and have access to edit the blog. If you have any questions (AMA) or would like to suggest any additions, please let me know.
pratnala 1 day ago 1 reply      
Does Argon2 have an official website and repo or is it just the PHC repo?
serge2k 1 day ago 1 reply      
> PBKDF2 (nearly everyone except FIPS agrees this is the worst of the acceptable options)

is that true?

privong 1 day ago 3 replies      
For anyone who reads the comments before clicking the article, the subject is storing your users' passwords, not managing your passwords for a variety of services. I had interpreted it as the latter. A better title might be "How to Safely Store Your Users' Passwords in 2016".

(Title is currently: "How to Safely Store a Password in 2016")

PixelB 1 day ago 1 reply      
I use an ultra secure method for my passwords, it is 100% unhackable.

Proprietary Analog Password Encryption Routines.

xs 1 day ago 1 reply      
Lame title. The story talks about storing password hashes, not passwords. I'm still looking for a solution for storing actual user passwords. Scenario: all laptops in the company have a unique local administrator password. How do I manage this effectively as a domain admin?
Apple Apologizes and Updates iOS to Restore iPhones Disabled by Error 53 techcrunch.com
347 points by aj_icracked  10 hours ago   165 comments top 16
baldfat 9 hours ago 6 replies      
> Apple Apologizes

I am an Apple hater BUT I have to say I'm very proud of the new Apple for actually admitting they made a mistake and apologizing. This and the fight for security are both things that, as a self-proclaimed Apple hater, I applaud Apple for doing. Good job!

illumin8 10 hours ago 4 replies      
This seems like the right thing to do - disable the unauthorized Touch ID sensor, but don't brick the phone. The secure enclave is still intact and secure, and if you want Touch ID back, you can get it repaired with authorized parts.
jasonjei 5 hours ago 1 reply      
Some part of me believes that the old Apple, with Steve Jobs in control, would have stood its ground on "Error 53." The Tim Cook Apple is a lot more compassionate about these sorts of things.
mbrd 9 hours ago 0 replies      
Anyone else get the impression that the head of PR at Apple was on vacation and returned this week to put out all the fires?

Seriously though, this seems like a consumer-friendly decision, as was the iOS backdoor/San Bernardino press release yesterday, and it's nice to see.

roddux 10 hours ago 1 reply      
Nice move, it's pretty cool of them to offer reimbursement to people who bought replacement phones in the meantime.
CountSessine 5 hours ago 2 replies      
Maybe this is a silly question, but does anyone know how secure the TouchID on the iPhone is compared to the 4 digit pin? I remember from a comment in the Android source that the android 5 face recognition is about equivalent to a 3 digit pin. Is TouchID more secure or less secure than the 4 digit pin?

Is it more tractable or less tractable for someone to brute-force the 4 digit pin than the TouchID? I.e. if someone wanted to get into my phone, and they removed the official TouchID sensor and now it falls back on a 4 digit pin, does that do them any good?

I wonder if I could get the old behaviour back - if someone was tampering with my phone by removing the sensor, is there any way of bricking the phone until I can get it to an apple store?
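[Editor's note: a rough sense of scale, with my numbers rather than the commenter's — Apple's published estimate is that a random fingerprint matches Touch ID about 1 in 50,000 times, versus 1 in 10,000 for a first-try guess at a 4-digit PIN. A back-of-the-envelope sketch:]

```python
# Back-of-the-envelope comparison; the 1-in-50,000 Touch ID figure is
# Apple's published estimate for a random fingerprint match.
pin_space = 10 ** 4            # 4-digit PIN: 10,000 combinations
touchid_odds = 50_000          # ~1 in 50,000 per enrolled finger

print(f"First-guess odds, 4-digit PIN: 1 in {pin_space}")
print(f"Random fingerprint match:      1 in {touchid_odds}")

# With iOS's escalating lockouts (minutes-long delays after a few
# failures, optional wipe after 10 attempts), exhausting even 10,000
# PINs by hand is impractical -- which is why the FBI dispute centers on
# removing those software limits rather than on the keyspace itself.
attempts_before_wipe = 10
coverage = attempts_before_wipe / pin_space
print(f"Fraction of PIN space coverable before a wipe: {coverage:.2%}")
```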

aj_icracked 10 hours ago 1 reply      
I am a little torn on this given there isn't a cited source and I don't know if Apple would give TC (or anyone) an exclusive on this. If it's true we're dancing in the streets though!
gradients 8 hours ago 1 reply      
Wow- this is great for me. I have been sitting on iOS8 for some time now because of this.

I broke my screen and home button and had them replaced before I went on vacation. Luckily I had read about the error 53 issue before attempting to upgrade my jailbroken device.

I'm very surprised Apple would respond so well to an issue typically caused by 3rd party repairs.

TazeTSchnitzel 6 hours ago 1 reply      
> This test was designed to check whether Touch ID works properly before the device leaves the factory.

Does that mean Error 53 stemmed from Apple having distrust in their supply chain? Interesting.

maerF0x0 9 hours ago 3 replies      
Anyone else find it suspicious that this comes the day after court order to backdoor their devices?
chris_wot 9 hours ago 0 replies      
Yes, well, it probably helps to know that the Australian Competition and Consumer Commission (ACCC) was investigating Apple for abuse of market power over this issue.

The last time this occurred, it was over illegally claiming iPhones and other Apple devices were out of warranty when they weren't, and misleading consumers that to get any form of warranty service after one year they would need to purchase an Apple extended warranty. They were not only fined millions, but were forced into printing a humiliating retraction on their website and in the press - one that basically was reported on worldwide.

I'm not at all surprised they backed down this quickly this time around. It's almost certain they would have been found to have committed the offence of third line forcing, for which there are very, very steep fines.

hackaflocka 9 hours ago 1 reply      
But wait, they earlier said it was an intended security feature. Now they're saying that it was a factory test not intended to go public?

New personal rule: never update the phone again... ever.

profeta 7 hours ago 0 replies      
So they pleased users and complied with that judge's order in one stroke?


jsudhams 9 hours ago 4 replies      
Does Apple make a lot of money or lose a lot of money on repairs? If they don't, they should allow third-party repairs. For product companies it is typically better to have an ecosystem of third-party repair/service, so that they don't have to provide support themselves for years, which costs a lot of money. Other than for life-safety devices, customers who want to repair their products should be allowed to.
wfunction 10 hours ago 2 replies      
Who actually buys the explanation that this was not intended to leave the factory?
dismal2 7 hours ago 0 replies      
If you have a smartphone and think you have any sort of privacy, you're delusional
FFmpeg 3.0 released ffmpeg.org
454 points by vivagn  3 days ago   87 comments top 13
nakodari 3 days ago 2 replies      
At Jumpshare, we use FFmpeg for screen recording. We noticed that the previous version of FFmpeg was not DPI aware. So we went ahead and fixed it. Now FFmpeg shows the correct mouse location on HiDPI screens. Unfortunately, it seems FFmpeg 3.0 does not ship with this fix. Nevertheless, we're happy to contribute to this open source project.

Here's the fix if anyone is interested: https://github.com/FFmpeg/FFmpeg/commit/00c73c475e3d2d7049ee...

Aissen 3 days ago 1 reply      
This screams for proper release notes. The official ones are pretty light (http://git.videolan.org/gitweb.cgi/ffmpeg.git/?p=ffmpeg.git;... ), and refer to the Changelog (http://git.videolan.org/gitweb.cgi/ffmpeg.git/?p=ffmpeg.git;... ) which is quite terse. Phoronix did some reformatting of the changelog, it's a bit easier to read:


But honestly this type of stuff should be done by the project before any release.

djm_ 3 days ago 6 replies      
Thanks to all the FFmpeg contributors! Fantastic piece of software.

On a project I was on recently, we started hitting the per-region concurrent transcode limits on Amazon's Elastic Transcoder. [1]

Instead of sharding over pipelines or accounts we set up a pipeline with FFMPEG + Lambda functions and it performed fantastically (within the free tier even).

It was incredibly simple to write the functions, and it has given that project a lot more freedom; with the caveat that any single task you undertake must complete within the timeout window (currently 5 minutes). Having said that, it's also straightforward to split the process into steps and have multiple Lambda jobs to make the flow more of a pipeline.

[1] http://docs.aws.amazon.com/elastictranscoder/latest/develope...
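[Editor's note: not the poster's actual code, but the splitting idea is easy to sketch — divide the video's duration into timeout-sized segments, transcode each in its own Lambda invocation (e.g. with ffmpeg's -ss/-t flags), then concatenate. A minimal sketch of the splitting step:]

```python
def segment_ranges(duration_s: float, chunk_s: float):
    """Split [0, duration) into (start, length) pairs, each at most chunk_s long.

    Each pair maps to one Lambda invocation running something like:
        ffmpeg -ss <start> -t <length> -i input.mp4 ... part<N>.mp4
    """
    ranges = []
    start = 0.0
    while start < duration_s:
        length = min(chunk_s, duration_s - start)
        ranges.append((start, length))
        start += length
    return ranges

# A 10-minute video in chunks sized to finish well under the 5-minute timeout:
print(segment_ranges(600.0, 120.0))
# [(0.0, 120.0), (120.0, 120.0), (240.0, 120.0), (360.0, 120.0), (480.0, 120.0)]
```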

danso 3 days ago 5 replies      
I know this has been a constant question (in the lines of "Should I go Python 2.x or 3.x?")...but I feel the need to ask it again on the event of a major point release for ffmpeg...but how are things, pragmatically-speaking, in terms of libav vs ffmpeg? I had thought that libav was the new way a few years ago and have more or less been using it on OS X...but now I see that Debian recently switched back to ffmpeg [1]...What are the use-cases for sticking with libav these days? I'm almost sure I started using libav because it was promoted as a concerted effort to create a better API. But by some accounts, ffmpeg has been incorporating libav's changes...and I honestly don't use libav or ffmpeg enough, directly, to really benefit from a better API. And installing both, I believe, has led to a few subtle errors when using libraries that wrap around either.

So, any reason for the casual graphics developer to install libav?

[1] https://lwn.net/Articles/650816/

edit: Oh I see that VLC at some point switched to libav. That was likely a deciding factor when I last did my nominal research into ffmpeg vs libav:


imaginenore 3 days ago 2 replies      
Among new things:

- Common Encryption (CENC) MP4 encoding and decoding support.

- New filters: extrastereo, OCR, alimiter, stereowiden, stereotools, rubberband, tremolo, agate, chromakey, maskedmerge, displace, selectivecolor, zscale, shuffleframes, vibrato, realtime, compensationdelay, acompressor, apulsator, sidechaingate, aemphasis, virtual binaural acoustics, showspectrumpic, afftfilt, convolution, swaprect, and others.

- New decoding: DXV, Screenpresso SPV1, ADPCM PSX, SDX2 DPCM, innoHeim/Rsupport Screen Capture Codec, ADPCM AICA, XMA1 & XMA2, and Cineform HD.

- New muxing: Chromaprint fingerprinting, WVE demuxer, Interplay ACM, and IVR demuxer.

- Dynamic volume control for ffplay.

- Native AAC encoder improvements.

- Zero-copy Intel QSV transcoding.

- Microsoft DXVA2-accelerated VP9 decoding on Windows.

- VA-API VP9 hardware acceleration.

- Automatic bitstream filtering.

fareesh 3 days ago 1 reply      
I use ffmpeg for housekeeping stuff like converting videos from one format to the other, and cutting clips - mostly from the command line. Can some advanced users share if there is anything to look forward to with this release? Better performance? Some convenience features? Thank you in advance
esaym 3 days ago 1 reply      
I love FFmpeg. I first used it to help with uploading 700 audio files to youtube years ago. Of course youtube is video only, so I used ffmpeg to reencode the audio with an image slideshow as video and then uploaded the "videos" using some web scraping with perl.

More recently I have been downloading programming framework tutorials (android development, django, angular, etc.) from youtube to my plex media server. I then go back with ffmpeg to re-encode the vids to play back 50% faster. So now I can blast through tutorials on my TV while I eat lunch (I work from home mostly)

Edit: The release mentioned hardware acceleration improvements. I never knew ffmpeg even supported any HW accel: https://trac.ffmpeg.org/wiki/HWAccelIntro
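[Editor's note: the 50% speedup the commenter describes is typically done with ffmpeg's setpts (video) and atempo (audio) filters. A sketch that builds the command — filenames are placeholders, and each atempo instance only accepts factors between 0.5 and 2.0:]

```python
import subprocess

def speedup_command(src: str, dst: str, factor: float = 1.5) -> list[str]:
    """Build an ffmpeg command that plays src back `factor` times faster.

    setpts rescales video timestamps; atempo speeds up audio without
    changing pitch (valid for 0.5 <= factor <= 2.0 per filter instance).
    """
    if not 0.5 <= factor <= 2.0:
        raise ValueError("chain multiple atempo filters for factors outside 0.5-2.0")
    filtergraph = f"[0:v]setpts=PTS/{factor}[v];[0:a]atempo={factor}[a]"
    return ["ffmpeg", "-i", src, "-filter_complex", filtergraph,
            "-map", "[v]", "-map", "[a]", dst]

cmd = speedup_command("lecture.mp4", "lecture_1.5x.mp4")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```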

pdknsk 3 days ago 1 reply      
If ffplay supported hardware decoding, it'd be the perfect player. You could not make a more minimal player. It doesn't, though, and it doesn't seem to be high on the priority list - last, perhaps.


hiphopyo 3 days ago 0 replies      
Awesome! Hoping for quick updates to the OpenBSD and FreeBSD ports.
_kyran 3 days ago 0 replies      
Thanks for all the work on FFmpeg!
cm3 3 days ago 3 replies      
Is the built-in aac encoder better or as good as fdk-aac? I've been using fdk-aac because it gives lower bitrates and better/same sound.
mkagenius 3 days ago 0 replies      
The last version was named Feynman. 3.0 was released on 15 Feb - the anniversary of Feynman's death.


obiefernandez 3 days ago 1 reply      
Why the heck do they make it so hard to figure out what's in the release?
Wikipedia starts work on $2.5M internet search engine project to rival Google [pdf] wikimediafoundation.org
435 points by e15ctr0n  4 days ago   182 comments top 40
jcrben 4 days ago 4 replies      
This was/is actually an extremely controversial project. The corporation (basically the Executive Director) pursued the grant and the idea without soliciting input or really disclosing it to the community of editors, and eventually one of the community-elected trustees was removed for questioning the lack of transparency. The community has a long list of software improvements that they'd like to see to the core platform.

A recent employee survey showed only 10% of WMF staff approved of the Executive Director, probably in large part due to things like this.

A critical take on the project as it has been handled: http://permalink.gmane.org/gmane.org.wikimedia.foundation/82...

https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2... is a pretty good (but dated) overview from Wikipedia's weekly newspaper, but there's a few others in the Signpost and a few blog posts across the web.

Rauchg 4 days ago 3 replies      
The biggest problem is the lack of data about what people are searching. It's a catch-22 that's very hard to break in the face of Google's search dominance and ubiquity.

By Google being the best, it only becomes better, and introduces a huge barrier to entry to competitors. It used to be possible to know what people were searching for to end up in a given Wikipedia article, but the process is now only asynchronous (and limited) through Webmaster tools[1]

In my mind, the most interesting aspect of the announcement should not be how much money they have to spend, but how they plan on solving this paradox.

[1] http://webmasters.stackexchange.com/a/60350

abalone 4 days ago 4 replies      
Summary of the approach (p10):

"1) Public curation mechanisms for quality;

2) Transparency, telling users exactly how the information originated;

3) Open data access to metadata, giving users the exact date source of the information;

4) Protected user privacy, with their searching protected by strict privacy controls;

5) No advertising, which assures the free flow of information and a complete separation from commercial interests;

6) Internalization, which emphasizes community building and the sharing of information instead of a top-down approach."

My first thought: How will transparency impact SEO? Will spammers be able to better game the algorithm when they know its internals?

However I am excited at the prospect of a wikipedia-like public curation system for the entire web. I admit I'm flabbergasted that the whole thing ever worked, but it does.

NKCSS 3 days ago 3 replies      
Weird; they have annual drives to raise money to keep the site running; wouldn't expect they'd have 2.5m lying around for pet projects like this...
cmarschner 4 days ago 7 replies      
This is a very, very, very small amount of money if you want to build a search engine, let alone one "to rival Google" (source?). Looks like the goals are realistic, though - look how wikipedia search could be extended beyond results from wikipedia.org, build some test sets. And get a better idea what it really is that is supposed to be built.
seven-dev 4 days ago 3 replies      
I support it 100%. I love Google but they have too much power and I'm sure they'll start taking advantage of that soon (like they did with Google+ and Youtube).
nitrix 4 days ago 3 replies      
The title is misleading. It's 250k, not 2.5m and the goal is a knowledge engine, not a search engine.
xojoc 4 days ago 1 reply      
This is exciting. Recently I started to work on an answer engine/search engine. It still sucks but it's a good project to work on when bored


In a few weeks I'll publish the source code and do a Show HN.

I wish a lot of luck to




too. DuckDuckGo also started to crawl the web with its own bot (right now they're using Yandex's api).

We need more competition from different countries. Just think about the censorship done by Baidu or how Google never plays by its own rules.

It's also interesting to think about a way to monetize a search engine. For kairos.xyz I was thinking about paid accounts (1 euro per month) providing more features, like the ability to search from the command line. For example you write "kairos Richard Stallman" and it prints basic information about Richard Stallman on your terminal.

timClicks 4 days ago 0 replies      
Link to Wikimedia's wiki page on the project includes a decent FAQ: https://meta.wikimedia.org/wiki/Knowledge_Engine
inaudible 3 days ago 0 replies      
Anyone want to hazard a guess at the technology they plan to implement to get this started?

Surely this is not designed to be written from scratch, so..

- Are they using known lexical & semantic scanners? - Is it focused on English language first? - What crawlers will scan content? - I'll assume it's an open platform, but license for contributors? - What database architecture will hold the graph? - How does it know the mark of authority, and is this primarily based on human input learning or machine learning?

I'm sure $2.5M wont touch the sides, but maybe if it's a well directed project, with healthy user contribution, based on interesting technologies they might develop a good backbone architecture. Ambitious for sure.

kristianp 4 days ago 0 replies      
naveen99 3 days ago 0 replies      
Computing is so cheap now, google isn't going to be so dominant on text search for long. Their money is needed for video and pictures and audio, but the text internet can be cached whole by small entities now.

Maybe Wikipedia should launch a video encyclopedia to try to provide a 5 minute video of every article, for people who like videos more than reading.

pavanlimo 4 days ago 0 replies      
Doesn't it say 250k in the letter?
udkl 4 days ago 1 reply      
The grant amount is $250,000
jayadeeptp 4 days ago 0 replies      
Misleading title. I don't think they want the result of the grant to rival Google
qaq 4 days ago 0 replies      
I am financing a $10 project to rival Tesla.
datashovel 4 days ago 1 reply      
I'm not kidding when I say that if they want to know where to spend the $2.5m I would start with cleaning up their core codebase. IMO Mediawiki open source code is a disaster.

EDIT: Not because it's written in PHP. Because it's architected poorly.

frik 3 days ago 1 reply      
Good luck. We definitely need more search engines. (That Google announced it would lower the PageRank(R)/site score for non-HTTPS sites is a clear indicator that they are about to cross the line (monopoly). And no, DDG and most others are "just" meta search engines that rely on Yahoo BOSS ($$$), whose future is uncertain and which relies on Bing.)

There was "Wikia Search" by Wikipedia founder Jimmy Wales:

"Wikia Search was a short-lived free and open-source Web search engine launched by Wikia, a for-profit wiki-hosting company founded in late 2004 by Jimmy Wales and Angela Beesley.

Wikia Search followed other experiments by Wikia into search engine technology and officially launched as a "public alpha" on January 7, 2008. The roll-out version of the search interface was widely criticized by reviewers in mainstream media. After failing to attract an audience, the site closed by 2009."


I used Wikia Search back then, it was good enough (like Bing in comparison to Google back then).

It was based on Apache Nutch and Solr/(Hadoop(?)/Lucene ...

Maybe you can rely on Lucene or SphinxSearch projects to kick-start.

doyoulikeworms 4 days ago 1 reply      
If nothing else, I hope it improves the currently abysmal search features for Wikipedia today.
gersh 3 days ago 0 replies      
As a former NLP engineer and former wikiHow engineer, I have some perspective on this. Google has included more and more information from Wikipedia. Furthermore, Google includes snippets of external websites in the knowledge box on more and more pages.

How long will it be until Google can algorithmically generate its own Wikipedia articles? Wikipedia relies on people coming to its site for contributions and donations. Without search, Wikipedia risks being subsumed by Google. They are in the difficult position of thinking about the future without pissing off Google.

Computers are getting more and more powerful. Wikipedia needs to do this to stay relevant. I think this is the right decision.

jonathankoren 3 days ago 1 reply      
I've read a lot of the inside Wikimedia links here, and I'm confused about all the talk of gnashing of teeth and rending of cloth. This is controversial because some want to pay down technical debt rather than have a small team do knowledge graph search?


wanderingstan 4 days ago 2 replies      
Given the terrible state of advertising, I would welcome a search engine that penalizes pages with popovers, animated ads, auto playing audio, and so on. Google would never build this, given its business model.

I hope Wikipedia brings some innovation to search, untethered from advertising revenue.

talles 4 days ago 1 reply      
I don't know how should I feel about all those donation campaigns they usually do after this.
melted 3 days ago 2 replies      
That's how much 5 qualified software engineers would cost to employ for a year (gross, including compensation, benefits, payroll taxes, office space, hardware, etc, and that's on the low end of the range). Good luck with that.
aaron695 3 days ago 0 replies      
Wikipedia's main asset seems to be user contributions and human interaction, not programming or hard algorithms; this seems like quite a leap into another field with not much money.

Bing cost MS $5.5 billion, and that was in their field of expertise


mabbo 4 days ago 0 replies      
Google's advantage isn't just that they were first, or that their algorithm is the best- it's the CPU resources they have available to keep their data updated faster.

Search for any news item and you'll have all articles published more than 2 minutes ago included in your results, all blog posts, everything. They consume it all, and offer the output in near-real-time.

Wikimedia don't have the resources to do that. And they especially won't without advertising to pay for it.

languagehacker 4 days ago 1 reply      
As a former Wikia employee, I am somewhat of a MediaWiki insider. I sped Wikia's search engine up by several orders of magnitude and then went on to pilot a number of NLP/machine learning initiatives in the company.

Jimmy Wales already tried to make a "Google Killer" ten years ago. It was tilting at windmills to say the least. Letting individuals help manage algorithmic search results was harder than you could imagine. Let's not even get into the difficulty of building an effective crawler.

One of Wikia's former CEOs, Gil Penchina, notoriously undervalued search as a result of this very public gaffe. By the time I came in, it took over five seconds to do a simple on-wiki search. Searching across wikis took so long they actually just sent the search to Google and had you abandon the site. I personally fixed a lot of these problems, and that part was pretty cool.

So now let's get to the subject at hand, which is a search feature based on an authoritative knowledge graph. Something like this should adequately surface factual information in an intuitive manner -- optimally based on natural language. Wikia already tried this, too. They brought on a very seasoned advisor who played a crucial role in the semantic web movement far back into the early oughts. I remember going to semantic web meetups in Austin when I was in grad school quite some time ago now to hear this guy talk.

This guy was essentially the SF-based manager or lead for a small team located in Poland whose job it was to take some of the "structured data" at Wikia and attempt to build some kind of knowledge graph on top of it. This project was unsuccessful.

So why did it fail? We'll start with a lack of product direction. Wikia had and probably still has a very junior product organization that is mostly interested in the site's UI and (recently) a focus on "fandom" (yuck). The team allocated to the project was based in Poland (Poznan, to be exact), and primarily kids coming out of a technical school on their first job. Your assumption about communication being a problem would be correct. However, the subject matter expert was so entrenched in his area of specialization, the problem was even more compounded on the native English-speaker side. There was too much getting in the weeds, and not enough focus on incremental progress.

To make things worse, they tried using a proprietary, not-ready-for-primetime data store because it most closely matched the SME's preconceptions on how the data should be structured. There was absolutely not an existing business use case for this data store, and problems getting it to work turned even building a simple demo into a death march.

Either way, what I'm saying is, $250,000 is not enough to solve this problem. We have attempted to solve this problem before in the MediaWiki world. It's not going to magically get better. To make something like this work, you need:

1) Best-in-class UX people who would know how a knowledge graph provides a significant improvement over existing solutions

2) Leadership that can bridge the gap between SMEs and implementers

3) Very skilled engineering resources with backgrounds in less conventional technologies

This is a massive investment that no one is willing to spend on what is essentially a media play.

About six months later, I had built a proof-of-concept that sucked data out of MediaWiki Infobox templates into Neo4j, a well supported graph database. I was able to answer questions like, "Which cartoon characters are rabbits", and "What movie won the most Oscars in 1968" using the Cypher query language.
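[Editor's note: Neo4j and Cypher aside, the shape of such a query is simple to illustrate with a toy in-memory triple store — this is an illustration, not the actual Wikia code, and the data and property names are made up:]

```python
# Toy triple store standing in for the Infobox-derived graph.
triples = [
    ("Bugs Bunny", "species", "rabbit"),
    ("Roger Rabbit", "species", "rabbit"),
    ("Daffy Duck", "species", "duck"),
    ("Bugs Bunny", "appears_in", "Looney Tunes"),
]

def query(predicate: str, obj: str) -> list[str]:
    """Return all subjects with the given (predicate, object) pair --
    roughly what a Cypher MATCH on a relationship type and node would do."""
    return [s for s, p, o in triples if p == predicate and o == obj]

print(query("species", "rabbit"))  # ['Bugs Bunny', 'Roger Rabbit']
```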

At that point in time, Wikia had decided they were tired of investing in structured data, and wanted to re-skin the site for a third time in as many years to make it look more like BuzzFeed.

Structured data is cool. In many cases, unsupervised learning may be what you're actually looking for. But in the end it has to satisfy a real user's needs.

Wikipedia has five million English articles. Wikia has over 20 million. As far as capitalizing on this wealth of knowledge, the devil is truly in the details. But it's a real shame that all of that information isn't put to better use than to encourage the socially maladjusted to take quizzes over which anime character they're more like.

lossolo 3 days ago 0 replies      
This is like competing with Intel in the server CPU market, where it has 98% market share.

So they are trying to compete, using 2.5 million dollars, with software backed by billions of dollars, hundreds of thousands of servers, tons of data, thousands of developers, ML integration, etc.?

Good luck with that. Many have tried, backed by many times the resources of this 2.5 mil; unfortunately all failed.

jccalhoun 4 days ago 1 reply      
I am guessing this has a different focus than their previous attempt at making a search engine, wikia search, which they abandoned fairly quickly https://en.wikipedia.org/wiki/Wikia_Search
JayHost 4 days ago 0 replies      
This is great. It feels like we live in an information overload era opposite of North Korea.

Search "Are cookies really bad for me" and find an answer that supports what you want to hear.

"Live a little" Sponsored by Nesthouse Cookies INC

Aissen 3 days ago 0 replies      
Comparison point: it's Bing's budget for about 5 hours.
exDM69 3 days ago 0 replies      
The link headline is highly editorialized, there is no mention of "google" or "rivalry" in the pdf in the link.
chrisra 3 days ago 0 replies      
So this is why they've been asking for donations? Made it seem like they're on the ropes.
grandalf 4 days ago 1 reply      
most of my google searches include a wikipedia result on the first page. I would estimate this could reduce Google's web search revenue by upwards of 40% worldwide.
veritas213 3 days ago 0 replies      
A JV might be a better idea... $2.5M isn't that much money and I doubt it will even come close to being useful relative to the other search engines
bato 4 days ago 0 replies      
Surprised no one has been mentioning qwant.com yet.
franky303 3 days ago 1 reply      
$250000 != $2.5M
blairanderson 4 days ago 0 replies      
I support it 100%, but still think they should use search advertising to cover costs and further development instead of asking for donations every year.

especially if they can make something that actually does rival google... other companies have spent billions and not gotten very close.

sparkzilla 4 days ago 0 replies      
Google already is the search engine for Wikipedia. And Wikipedia is the content provider for Google. Why mess up such a beautiful arrangement? http://newslines.org/blog/google-and-wikipedia-best-friends-...
Building a Startup in 45 Minutes per Day While Deployed in Iraq mattmazur.com
505 points by essayoh  3 days ago   119 comments top 22
tbrock 3 days ago 16 replies      
Somewhat related: I have this fantasy where I actually make a living working a non-technical job that is relatively simple to do but where I'm able to read and think about what I would build when I get back to the keyboard all day long.

I'd spend the days thinking about building and how you would do it for 90% of the time but only spend 5-10% of my time executing. The rest of the time is thinking, planning, and re-thinking things I've done, imagining the tweaks and enhancements I'd get to pump out when I have the chance.

The best code I write is when I have it all figured out ahead of time, the problem is paged-in, and my fingers are merely transcribing the stream of consciousness. I just need a couple of moments to get it out.

Dissimilarly, I would imagine that, being deployed in Iraq, this guy doesn't have much time to think about his startup when doing his non-technical job (as it IS a demanding one) but imagine being a sniper spending your time hiding in the brush thinking about what you would build when you could build it. I'm romanticizing war now though...

matt1 3 days ago 6 replies      
Hey all - author and longtime HackerNews user here. It's a nice surprise coming back from an evening away from the computer and finding this at the top of front page. Whatever success I've had with Lean Domain Search is due in large part to what I've learned from this community.

Here's the original HackerNews launch post from 4 years back for context: https://news.ycombinator.com/item?id=3470977.

I'm happy to answer any questions about Lean Domain Search, the deployment, etc.

idorosen 3 days ago 0 replies      
The title should probably read "Doing a Side Project Website in 45 Minutes per Day While Deployed in Iraq," or "How to test the market for viability of an idea in 45 minutes per day," etc. Maybe "startup" has become meaningless...

I don't mean to diminish the accomplishments of the author, Lean Domain Search is a nice tool he built, and may even be a good way to test a market. Doing so while distracted by military service / a full time job certainly shows impressive drive/willpower. However, to build a company that might (someday) involve more than a 1 man show requires more than 45 minutes per day of attention, especially if other people's livelihoods are(/will be) dependent on you.

ZanyProgrammer 3 days ago 1 reply      
Since he was an officer (where you are generally treated much, much better than an enlisted person) and in the Air Force, and only spent 5 months in Iraq, I can believe he did all of this. Though it's certainly not representative of the average soldier/Marine who was deployed to Iraq.
hudibras 3 days ago 1 reply      
I knew I recognized that name somewhere. Matt had a bunch of really interesting articles a while back about building poker bots to play online.


Didn't know that he was active duty in the military at the time, though...

suyash 3 days ago 1 reply      
That raises the question: can we define once and for all the difference between a startup and a side/pet project?

There was no mention of revenue or even users. I found the story very inspiring, but the terms need to be clarified or "startup" is going to be used very loosely. In which case we might as well all call ourselves founders/CEOs/entrepreneurs.

choxi 3 days ago 0 replies      
One of our students at Bloc was on active duty in Qatar while learning Ruby on Rails from us. It's interesting to see the new lifestyles and opportunities that are created when accessibility is unlocked by the Internet.
chatwinra 2 days ago 0 replies      
Fantastic post! Well done Matt.

I'm in a similar situation where I'm working full time whilst trying to finish off a game in my spare time.

I find what helps is to have a clear goal for each work session. Even if it's tiny like 'fix this bug' or 'update this text', you finish working and feel like you've made progress. Too many times in the past I've just started working, lost focus and ended up trying to look at several things on the game and finishing none of them.

rdl 3 days ago 0 replies      
I ran into a lot of NG/RC people (especially, but even some full time active duty people) in Iraq/Afghanistan 2004-2010 who were doing a pretty good job keeping businesses running back home. (I was selling Internet access, and while a lot of people were worried about $25/mo, if you ran a business you'd often be happy spending $500/mo to ensure you had good access -- this was before NIPR, base-wide wifi, etc.)

Great product, btw -- I've seen it on here before I think. And thank you for your service.

boothead 2 days ago 0 replies      
I learned to program on a warship heading to Iraq in 2003. From a book, with no internet. It's amazing what constraints will do for you - I'm not sure I'd achieve that feat now in the presence of constant distraction!
wdewind 3 days ago 0 replies      
This is also not the only project Matt was working on while deployed because I remember talking to him via email about some other pretty ambitious stuff. Kudos to Matt for really taking away the excuse of not being able to try something on the side even when you're fully employed.
austinhutch 2 days ago 0 replies      
Lean Domain Search is a fantastic site and is my go to for generating ideas for domain names. Thanks for your service and for building a great site!
itsthisjustin 2 days ago 0 replies      
This is an awesome post. I'm also a huge fan of lean domain search and use it all the time when starting a new project. So hats off man!
glossyscr 2 days ago 0 replies      

I used LeanDomainSearch many, many times and this proves that a single guy with so little time can accomplish great things.

gmays 2 days ago 0 replies      
I had the opportunity to meet Matt at MicroConf in 2013--hell of a guy. We hit it off since we were both active duty at the time and had similar stories. He's one of the most well-rounded guys I know in terms of dev and product chops.

The tool he built (Lean Domain Search) is great for finding hidden gems if you're looking for a new name for a project with the .com available.

aledalgrande 3 days ago 0 replies      
It's inspiring to be able to start something from anywhere and in any situation, but I would reconsider how easy it is to follow up on that and how much time you have to put in.

Even the author of The 4-Hour Workweek in the end had to work an incredible number of hours between marketing and networking. Getting to the point of breaking even is not a piece of cake.

sireat 2 days ago 0 replies      
A more objective title would have been how a side project led to an acquihire at Automattic.

It is still an inspirational story, somewhat similar to the path taken by patio11.

Are there any side projects which actually have turned into sustainable lifestyle businesses or better?

Chris2048 2 days ago 0 replies      
At first, I misread that as "building a startup in 45 minutes, per day"
jcampbell1 3 days ago 0 replies      
Holy flying fuck. I bought several domains using Lean Domain Search. It is the best. I bought them during hackathons, and earned a prize for both.

You were shooting bad guys, I was typing code for hackathons. Guess we both won?

teekert 2 days ago 0 replies      
This links to a blank page for me.
bau5 3 days ago 6 replies      
Wow. Referring to Nassim Taleb just by his last name? Has he really attained that level of status? He doesn't deserve it IMO, especially around here, with his fearmongering about GMOs and his debate style on twitter.
GFK_of_xmaspast 3 days ago 2 replies      
I hope that work wasn't done on government hardware.
Hard Drive Reliability Review for 2015 backblaze.com
408 points by chmars  2 days ago   111 comments top 22
crispyambulance 2 days ago 3 replies      
This is interesting...

"A relevant observation from our Operations team on the Seagate drives is that they generally signal their impending failure via their SMART stats. Since we monitor several SMART stats, we are often warned of trouble before a pending failure and can take appropriate action. Drive failures from the other manufacturers appear to be less predictable via SMART stats."

~10 years ago, I remember google research put out a highly cited paper wherein they found that SMART stats were not a particularly strong indicator of impending drive failure (50% of drives had no SMART indications of problem before failure). http://research.google.com/pubs/pub32774.html

Has this now changed (at least for Seagate)?

Reliability/longevity is nice but a signal of impending failure is far more valuable from an operations point of view.

roddux 2 days ago 3 replies      
HGST (or Hitachi Global Storage Technologies) are again topping the charts for drive reliability! They must be doing something right.

Also, the fact that backblaze are publishing most of their data online is very cool.

gradstudent 2 days ago 3 replies      
I don't really understand their methodology for computing failure rate. The page says they calculate the rate on a per annum basis as:

 ([# drives] × [# failures]) / [operating time across all drives]

Wat? The numerator and denominator seem unrelated. What is being measured here?

To me, it would make more sense to look at time to failure. Together with data on the age of the drive and the proportion of failures each year one could create an empirical distribution to characterise the likelihood of failure in each year of service. That would give a concrete basis from which to compare failure rates across different models.
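(For comparison, the conventional annualized-failure-rate reading is simply failures per drive-year, expressed as a percentage. A minimal sketch, with invented drive counts:)

```python
# Hedged sketch of an annualized failure rate (AFR): failures divided by
# total observed drive-years, times 100. The figures below are made up.

def annualized_failure_rate(failures, total_drive_days):
    """Failures per drive-year, as a percentage."""
    drive_years = total_drive_days / 365.0
    return 100.0 * failures / drive_years

# 1,000 drives observed for half a year each, 10 failures:
afr = annualized_failure_rate(10, 1000 * 182.5)
print(afr)  # 2.0 -> "at this rate, 2% of drives would fail per year"
```

Under this reading the numerator and denominator are related after all: the denominator is exposure (drive-time), not a count of drives.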

SixSigma 2 days ago 2 replies      
A useful additional metric is the age of the drive at failure.

This would determine if the failure rate was constant for the life of the drive (meaning random failure) or is it age related (infant mortality or old age).

25 drives that fail after 1 week plus 25 that fail after 50 weeks is different to 50 drives that fail one per week.

tshannon 2 days ago 3 replies      
"...give or take a Petabyte or two"

As one does.

akulbe 2 days ago 6 replies      
Color me skeptical. I bought into this, at first. After reading some other stuff, not so much.

Like this, for example: http://www.tweaktown.com/articles/6028/dispelling-backblaze-...

Ezhik 2 days ago 2 replies      
Are Backblaze the company that bought all the hard drives in the Bay Area during the 2011 crisis?
slowhands 2 days ago 1 reply      
Good data, but I wish they would have rendered these tables using HTML. Not fun typing these out myself to search.
cableshaft 2 days ago 1 reply      
I've had a lot of bad luck with Western Digital hard drives lately. Nice to see some data back that up. I didn't know HGST existed, though.
sandworm101 2 days ago 3 replies      
Is it possible for this data to ever be useful? Given the time necessary to acquire the data, and the rate at which improvements are made to drives, can't we assume that drives purchased today probably won't operate in exactly the same manner as drives purchased a year ago?

I don't mean to insult, just to ponder the relevance of such long-term studies on tech that changes so quickly.

pkaye 2 days ago 1 reply      
I wish there were similar statistics publicly available for SSDs. From these failure rates, hard drives don't look as reliable as one would imagine.
gist 2 days ago 2 replies      
What would be really helpful is if they could simply put some amazon links on this report to the drives with the best reliability according to their tests.
eps 2 days ago 0 replies      
It'd be interesting and quite helpful to see the failure rate vs. drive age, per manufacturer.

For example, for less reliable manufacturers there might be a "if you get past first N weeks, you are fine" pattern, or a failure cliff exactly 1 week past the warranty period, or something equally entertaining.

Quequau 1 day ago 1 reply      
Just in time for me.

I've got 5 Western Digital drives which have failed out of original purchase of 6. Now I'm wondering if it's really worth it trying to go through the RMA process (I need to figure out exactly how old they are and how long the warranty is) or if I should just give up on Western Digital and go with a different manufacturer... though I am not looking forward to spending that amount of money all at once.

jamesblonde 1 day ago 0 replies      
Great stuff. Does anybody have any stats for drives' Bit Error Rates (BER) / maximum unrecoverable read errors (URE) / non-recoverable read error rates ? By my understanding, manufacturer quoted BERs for commodity drives, often 10^14, tend to be 10^15 or higher in practice.
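(The back-of-the-envelope math behind why quoted BERs matter: an expected-error count for a full read of a drive. The 4 TB figure below is just an example:)

```python
# Sketch: expected number of unrecoverable read errors for a given read
# volume at a quoted bit error rate. Illustration only, not vendor data.

def expected_ure(bytes_read, bit_error_rate):
    """Expected unrecoverable read errors when reading bytes_read bytes."""
    bits = bytes_read * 8
    return bits * bit_error_rate

# Reading a 4 TB drive end to end at a quoted 1-in-10^14 bit error rate:
four_tb = 4e12
print(expected_ure(four_tb, 1e-14))  # about 0.32 expected errors per full read
```

At 10^-15 in practice, the same full read expects roughly 0.03 errors, which is why the quoted-vs-actual gap matters so much for RAID rebuilds.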
zanny 2 days ago 1 reply      
This information is super useful. I have an ST3000DM001 and only trust it because its SMART stats are still all in the green (and of course I have local and cloud backups of anything important).

I've had it for four years now and there are no warnings of any kind yet, so I guess I got one from a good batch.

ck2 2 days ago 1 reply      
Always look forward to this report, thanks for sharing the data.

Amazing that the 4TB Hitachi with twice the platters of the 2TB fails less.

(and I will never buy seagate again for home pc or servers, even before this report I could have told you they are unreliable)

dghughes 2 days ago 1 reply      
Total 213,355 Terabytes or 213 Petabytes, that's quite a bit.
mozumder 2 days ago 0 replies      
Would be more interesting to find out reliability figures for high-throughput data-center models of hard drives instead of backup drive models, with low access rates.
64bitbrain 2 days ago 0 replies      
Are there similar results or survey for SSDs?
pbreit 2 days ago 1 reply      
Sort of off-topic and apologies for the commercial nature, but can you really get a 2 TB thumb drive for $17? Do they work?


contingencies 2 days ago 0 replies      
I have 2x ST4000DM and 1x ST4000VX in my desktop, plus one 4TB Seagate 'surveillance' drive as a USB luggable, though OS X, to which it is currently connected, doesn't want to give me the specifics (neither right-click Info, nor DiskUtil).
Why I No Longer Use MVC Frameworks infoq.com
458 points by talles  3 days ago   192 comments top 41
Gratsby 3 days ago 11 replies      
People have a hard time with MVC because frameworks that use the phrase MVC are always more complex than necessary.

There's a simple set of separations that you need to pay attention to for your code to be what I like to call "not stupid".

1. Separate your code from your data. You shouldn't be kicking out HTML with your code, like this guy does in figure 1. When you combine HTML or any display information with code that means your designer is your coder and your coder is your designer. Designers suck at code. Coders suck at design. Keep that shit separate.

2. Separate your logic code from your code that changes your data. You shouldn't be running updates/saves/inserts from 30 different locations in your code. Define your interface, expect that interface to work. When you need to shift from Oracle to Redis to RDS, you shouldn't have to refactor 80% of your application. You refactor your data update code and leave everything else that works alone.

There you have it. Model, View, Controller. You have a data model that can be displayed in any number of ways, and you control CRUD on that Model via the controller. It's not a religion. Just make sure there's a logical separation within your code so you can delegate as your team and application grows.

Architect things logically and call it whatever you want.
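(The two separations above can be sketched in a few lines; Python here for brevity, and all names are invented:)

```python
# Minimal sketch of the separations described above. Illustration only.

class UserStore:
    """The only place that changes data; swap Oracle/Redis/RDS behind it."""
    def __init__(self):
        self._rows = {}
    def save(self, user_id, name):
        self._rows[user_id] = name
    def get(self, user_id):
        return self._rows[user_id]

def render_user(name, template="<li>{name}</li>"):
    """View: takes data, emits display markup, contains no logic."""
    return template.format(name=name)

def handle_signup(store, user_id, name):
    """Controller: logic only, delegating storage and display."""
    store.save(user_id, name)
    return render_user(store.get(user_id))

store = UserStore()
print(handle_signup(store, 1, "ada"))  # <li>ada</li>
```

The point is exactly the one made above: the designer can change the template and the DBA can change the store without either touching the controller.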

lhorie 3 days ago 4 replies      
I feel like this is one of those cases where someone rejects currently existing tools, but then goes on to throw out the baby with the bath water when trying to come up with an alternative.

Figure 7 (http://cdn.infoq.com/statics_s2_20160209-0057/resource/artic...) looks to me like the sort of MVC you would do with vanilla PHP or friends back in the day.

In figure 9 (http://cdn.infoq.com/statics_s2_20160209-0057/resource/artic...): he handwaves saying that you can compose things, but nothing in the article suggests that SAM provides better mechanisms to sync data between client and server. E.g. what if the view is a table, and the action is deleting an item? Does an update consist of re-downloading a whole new table worth of data? Would it also do that if I deleted that same item from some other view? Or does it require 2 http requests in series (a DELETE followed by a GET)? What would an undo action look like in terms of HTTP requests and responses? The graphs feel like they're largely meaningless. Having arrows in a diagram pointing from one word to another doesn't say anything about the complexities of the system.

johndubchak 3 days ago 4 replies      
I believe another major issue is that these new APIs are not designed using the historical knowledge that got us here.

It seems to me at least that every 10 years or so a new group of fresh grads enter the workforce and by sheer will (and all nighters) recreate different forms of the same solutions to familiar problems.

Meanwhile, the previous designers and implementors seem to fade (who knows where they end up?) and the normal evolution we should be seeing in our API's, frameworks, and collective knowledge simply inches (is that too generous?) along.

We are giving different names to original ideas. For example, I have read (sorry, I can't find the links) recently where others have published on interesting new design patterns only to find out they are essentially renamed versions of the original GoF patterns.

The same challenges exist and we are seeing someone else's take on the solution and it feels too familiar because they look at what's out there and provide only a marginal variation on a theme.

I believe this partly explains this "been there, done that" that we're seeing these days.

prewett 3 days ago 1 reply      
I think part of the problem is that MVC is pretty heavyweight. Most UI doesn't need that kind of flexibility, but when you want it, you want it. So you need a way to make it simple most of the time and still have access to the details.

In web development it is probably complicated by the fact that, in my opinion, declarative positioning and sizing of elements is a pipe dream. It looks simple until you try to actually implement it, and HTML/CSS has only a rudimentary implementation. (As far as I know, Motif and Apple's constraints are the only UI toolkits that have a solid implementation.) Given what we want to do with the web these days, I think we would be better off with programming the web page declaratively. Something like what Qt does. I've never found an easier way to write a UI than Qt.

carsongross 3 days ago 8 replies      
He nails the problem: API bloat due to chaotic front end needs, but there is a much, much simpler solution: move the UI/DOM building back to the server side.

This gives you a nice, secure environment to build your UI in with the full power of the query language your datastore provides, and it doesn't require any particularly complex architecture or discipline to maintain.

And, of course, I would be remiss if I didn't mention my baby, http://intercoolerjs.org/, as a tool to help you do it.

halayli 3 days ago 1 reply      
He starts with a negative tone when describing React & Redux, without backing up his negativity with proper & convincing arguments, just so that later he can throw in his alternative solution.

I don't know why we keep doing it but often times we cover our eyes on purpose and brush off existing solutions backed by thousands of engineers so that it's easier to make our points.

phantom_package 3 days ago 0 replies      
Isn't GraphQL/Relay (or similar tools) the solution here? Has the best of custom endpoints (can be conceptualized as a single, infinitely customizable custom endpoint) without the drawbacks (there's only one endpoint, and backend engineers just need to expose a schema).

It was kinda disappointing to see him fail to properly address tools that were designed to resolve his original concern. He basically decides that GraphQL forces you to "shape your models to your views" (which is false, GraphQL/Relay just collates and trims your models before delivering them to your views). In a sense, it allows him to continue to say "yes" to his frontend developers (which is good from a product standpoint) without adding a ton of custom endpoints.

scotty79 3 days ago 2 replies      
I'm not sure why author dismisses graphql. It greatly simplifies server-side work. You just need to implement one endpoint that responds accurately and quickly to a composed query in accordance to what rights the current user has. It's not as easy as "serve result of SELECT *" but it's well defined and flexible for consumers.
norswap 3 days ago 1 reply      
I always find it interesting how something seemingly as simple (conceptually speaking) as building a web app can in truth turn out pretty darn complicated.

I'm not saying all the complexity we see here or in other frameworks is warranted, but clearly there is something going on that makes this quite hard.

I'd be interested to see fundamental research (and vulgarization thereof) on this.

velox_io 3 days ago 1 reply      
Changing the API design because a frontend dev doesn't like the style is definitely an anti-pattern. Customising APIs for every imaginable use case will get painful fast!

Angular does have powerful services to transform objects. For example last week a library (ngTagInput) was expecting a list of objects with {id, name} however the server was returning a list of integer id's. It was trivial to replace the id with the {id, name} object in the service, without requiring changes to either the library or the back end - the API is still generic.
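(The transform described above looks roughly like this; sketched in Python rather than an Angular service, with invented names:)

```python
# Sketch: the server returns bare integer ids, the widget wants {id, name}
# objects, and a thin service layer joins them so neither side changes.

def expand_ids(ids, lookup):
    """Replace integer ids with {id, name} dicts via a name lookup."""
    return [{"id": i, "name": lookup[i]} for i in ids]

names = {1: "red", 2: "green"}
print(expand_ids([2, 1], names))
# [{'id': 2, 'name': 'green'}, {'id': 1, 'name': 'red'}]
```

The API stays generic; only the client-side adapter knows what shape the widget wants.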

romanovcode 3 days ago 0 replies      
The reason I choose to work with MVC frameworks is because more often than not other people work on my code with me.

If I would create my own abstractions it would be good for me but very, very bad for others.

mej10 3 days ago 2 replies      
If you are serious about front-end Javascript, you owe it to yourself to check out the ClojureScript libraries Reagent (and re-frame) and Om.Next.
wereHamster 3 days ago 0 replies      
WTF? Blindly concatenating strings... hello XSS..
pearjuice 3 days ago 0 replies      
It's a typical Hacker News cliche: "Why I No Longer Use [commonly accepted standard]" but I am honestly glad that the author put a lot of effort into explaining the theoretical AND practical concepts behind his stance. So in case you skipped the article because it fell through your "textbook HN hit" filter, please take my word for it that it's worth reading.
djfm 3 days ago 0 replies      
Can't really see how the proposed solution is different from React + Redux.
pluma 3 days ago 1 reply      
First he dismisses React, then he says his code is looking a lot like React without the ceremony, then he shows code samples that look a lot like React except they use string concatenation (hello XSS) to build the DOM (goodbye DOM diffing) and instead of following the modern React idea of small composable components they are huge and bloated.

Is this satire?

al2o3cr 3 days ago 1 reply      
From one of the screenshots:

 <div class="opacity-'+slider.maskOpacity
Even better once you notice the actual inline `style` attribute two lines up. headdesk
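(The fix for this class of bug is to escape user-influenced values before interpolating them into markup. A minimal sketch in Python; the render_mask helper is invented:)

```python
from html import escape

def render_mask(opacity_class):
    # Escaping the interpolated value closes the injection hole that
    # blind string concatenation leaves open.
    return '<div class="opacity-{}"></div>'.format(escape(opacity_class, quote=True))

print(render_mask('5" onmouseover="alert(1)'))
# <div class="opacity-5&quot; onmouseover=&quot;alert(1)"></div>
```

Templating/DOM libraries do this (and more) for you, which is one reason hand-rolled concatenation draws the "hello XSS" reaction above.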

jondubois 3 days ago 1 reply      
Not using an MVC framework is often fine if you're working alone on a project but you might just end up reinventing angular...

Then when new people come in to your project, the learning curve for them will be steeper than Angular - Also you won't be able to hire any 'batteries-included' programmers because nobody will know your framework when they join your company.

Worse, engineers might refuse to join your company if they see that you've built a custom solution from scratch... Chances are that your framework won't be as good as Angular or React.

Maybe you should try Google's Polymer framework? I've tried all the major ones and Polymer is the least 'frameworky' one.

ivan_ah 1 day ago 0 replies      
I'm not super convinced OP's SAM architecture is that revolutionary. It will take something waay better to displace the current champion: React+Redux.

And speaking of the champion, here's a good write up[1] about tfb's helper library 'react-redux-provide' that automatically matches names in this.props with Redux actions and reducer outputs. It's a simple thing, but tremendously reduces the wiring boilerplate for React+Redux apps.

[1] https://news.ycombinator.com/item?id=11098269

sbov 3 days ago 0 replies      
I feel like part of the "problem" of API churn is because we are writing APIs that live in the controller layer. Just because it runs on the back end doesn't mean it lies within the model layer.

Beyond that, it certainly makes for less code to make the model directly line up with the view, but you create coupling between the two. This seems like a major difference between MVC I've experienced in the web vs desktop apps back in the late 90's - in a desktop app my view didn't rely on the model's code, just on the model's data. But nowadays with rails and spring and django the view is coupled directly to model code.

EGreg 3 days ago 1 reply      
Reading this I am kind of happy we rolled our own framework and dogfooded it for the past 5 years.

It has only a few concepts:

 * Pages (webpages)
 * Tools (components)
 * Events
 * Users (accounts)
 * Streams (data)
Everything is accomplished around these basic concepts.


emergentcypher 2 days ago 0 replies      
I don't think MVC belongs on server-side web frameworks at all. It's not a continuously updating interactive display, it's a one-off thing. You don't have a model object sitting around sending messages to the view when it updates. This doesn't happen. There is no triangle of messaging. What really happens is a request comes in, and your "controller" is the one doing all the calling.

The HTTP server is by its very nature a pipelined or layered application. Requests are turned into responses through a series of transformations. They go down the layers as they work their way to your database, and back up the layers as they are built into responses. This is, incidentally, why functional programming is such a great fit for web servers.

jijji 3 days ago 0 replies      
The problem with MVC is that there is no clear definition of what it is, and so what you're left with is people who are putting code in the View, code in the Model and code in the Controller. Then, go try and debug that. You wind up having to log basename(__FILE__) to understand what file is responsible for what. It is a nightmare for debugging. Then you have to talk about the elephant in the room -- performance. Performance using MVC frameworks is orders of magnitude slower than using straight procedural or functional development. It should not take a rocket scientist to understand that calling a function directly is faster than instantiating an object and then abstracting an abstraction from an abstraction from an abstraction just to return a result. I don't know why people do it, but I find myself cleaning up the mess more often than not. You cannot convince me that a 10x decrease in performance and a 10x increase in code complexity is a good thing.
dh997 1 day ago 0 replies      
Controllers are important for being gatekeepers, which can be adjusted (think middlewares which add security, logging or other changes uniformly) and inspected outside of the MVC code, and separates outside from inside.

It might be worth the tradeoff in the short-term, but it's the lifecycle TCO of code counting support and modifications over many projects which will validate or invalidate a particular approach (there is also a cost in terms of hiring and learning curve for inventing an alternate convention).

I hope it works out.

hyperpape 3 days ago 1 reply      
I was intrigued enough to look at the source of his Star library (https://bitbucket.org/jdubray/star-javascript/src/5806219be6...), and got very little out of it. Aside from a few truly strange things (https://bitbucket.org/jdubray/star-javascript/src/5806219be6...), it's just generally not commented or documented, and I have a hard time figuring out how it serves the goals of that article.

I'll take another look later, but I'm curious if anyone else got much out of it?

niutech 1 day ago 0 replies      
MVC, in a strict sense, as in Gang of Four, almost does not exist in JavaScript frameworks. What we see is MVP (Backbone.js, Riot.js), or MVVM (KnockoutJS, AngularJS). ReactJS provides just a View.
xyzzy4 3 days ago 3 replies      
I prefer the "VC" style (MVC minus the M). It is way too much work to have to redefine your models on both the client side and server side, when you can simply have all the data in temporary dictionary structures. You define your models just once in the database itself. I don't know if anyone else does this, but it keeps the amount of code to a minimum.
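(A minimal sketch of this model-less style, using SQLite rows as plain dicts; the table and names are invented:)

```python
# Sketch of the "VC" approach: no model classes, just rows as dicts,
# with the schema living only in the database itself.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row            # rows behave like mappings
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('ada')")

row = conn.execute("SELECT * FROM users").fetchone()
print(dict(row))  # {'id': 1, 'name': 'ada'}
```

Adding a column is then a single ALTER TABLE, with no client- or server-side model definitions to keep in sync.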
drostie 3 days ago 0 replies      
Searching through the comments here, only one of them mentions SAM, which is the point of the article. Some nuanced thought about what's going on here is therefore warranted. So let's start with old-school Smalltalk's MVC: this was a message-passing system consisting of three types of concurrent processes:

 * models maintain a list of "subscribers" that they regularly send certain messages to, in response to the messages that they receive from the outside world. This can as always involve the internal state that each process maintains.
 * views subscribe to models and then present a graphical description of the messages they receive.
 * controllers listen to user-interface events from the view, and use it to send messages to models.
These have transformed mightily in the intervening years; a model these days often refers to some form of schema for data structures which are returned by services; controllers often query those services and provide them to the view; views may define their own service-queries. Often there is no clear notion of subscription except possibly subscribing to events from the GUI, which is the responsibility of the controller; the controller basically handles everything which isn't either the structuring of data or the organization of the display components.

SAM appears to be coming from the latter developments and is concerned with an associated explosion: Every view and interaction more or less gets its own service on the back-end, providing a sprawling codebase which resists refactors; they also get their own controllers which maintain them.

In the SAM idea, the model now reprises its earlier notion of managing subscribers and containing business logic: however it is now "dumbed down" from its potentially-very-general Smalltalk definition: the model is instead meant to hold and maintain a single data-structure. (Can there be multiple concurrent models?) The model apparently only receives requests for an all-at-once update, but it may decide to transform its state `new_state = f(old_state, proposed_state)`. Presumably then it again tells its subscribers about the new state if it is not identical to the old state. (Each view is expected to compute the diff itself, probably?)

A "state" in SAM appears to be identified with a "view": a "state-representation" is a function from a model to a DOM. Your GUI consists of a bunch of these, and hypothetically we can diff the DOM trees to better understand what needs to change on an update of the related model properties; the "view" is basically just a bunch of representations of the underlying state with some "actions." These "actions" are not actually parallel processes at all but do take the classical responsibility of the "controller", routing user-interface events to messages to send to the model. The apparent point here is that they should correspondingly be very "dumb" controllers: they just create a transformed version of the data that they received from the model and then send it back to the model as a "nomination" for a possible change.

Finally there appears to be a strange next-action callback which may be part of every "view update." (It's not clear where this lives -- in the "action"?) I am not entirely sure why this exists, but it may be that the algebra of actions without this callback is merely an applicative functor, not a full monad. The essential idea here is that the function apparently can compute a follow-up action if such needs to happen.
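(Putting those pieces together, the propose/accept/render loop being described can be sketched as follows. Names follow the article's SAM terminology; this is an illustration, not the author's implementation:)

```python
# Minimal sketch of a SAM-style loop: an action computes a proposal, the
# model accepts or rejects it, and a state representation renders the model.

model = {"count": 0}

def action_increment(current):
    """Action: compute a proposed new state; does not mutate anything."""
    return {"count": current["count"] + 1}

def present(proposal):
    """Model: accept or reject the proposal, then render."""
    if proposal["count"] == model["count"] + 1:
        model.update(proposal)
    return state_representation(model)

def state_representation(m):
    """State: a pure function from model to display; a next-action
    predicate (nap) would run here in a fuller version."""
    return "Counter: {}".format(m["count"])

print(present(action_increment(model)))  # Counter: 1
```

A stale or duplicate proposal simply fails the model's check and changes nothing, which is the "nomination" behaviour described above.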

If I'm understanding this correctly, then a simple app which seems relatively hard to structure this way would contain:

 * a counter stored in a database,
 * a button which should ideally increment that counter,
 * a display showing the current value of the counter,
 * a notification window telling you when your click has updated the counter.
I'm using a counter since it's got a nice algebra for dealing with coincident updates; if someone else updates the counter then your update commutes with theirs, saving the client-side complexity.

Without a framework, we would simply define two endpoints: GET /count (or so) gets the current count; POST /increment will increment the current counter and will say "Success! The current count is now: ____", telling you what you changed the count to.

Here it seems like you need three models. First we have the server-side model of what's going on:

 server model Counter:
   private count, Subscribers

   message increment(intended_count):
     if intended_count == count + 1:
       count += 1
       Subscribers.notifyAll()
       return ('success', count)
     else:
       return ('failure', count)

   message count():
     return count
The requirement that we only nominate new versions of the underlying data-structure means that we cannot just define the appropriate increment service which always succeeds, but must instead tell the client that the request has failed sometimes. Then there are two models on the client side: one holds a list of outstanding requests to increment (updated by actions, retrying any failures) and the other one holds the currently-cached value of the server-side data (because we need to update this by both the former client-model's update events as well as by some automatic process). You would probably condense these in practice into one model, however they are different concerns. The former model, however, is absolutely required, as it provides a way for the "notification window view" to appear and disappear when one of the requests has succeeded.

This seems unnecessarily complicated given the simple solution in terms of two API endpoints -- however it does indeed fulfill its desire for lower API bloat and some separation of concerns.

ciscoheat 2 days ago 0 replies      
It happens time and again: another pattern from the programmer's perspective, when MVC really is much more than a pattern, and it includes the user's perspective. It's not about separation primarily; that's the nerd's view.

I've made an attempt to clarify what MVC is really about. It was posted here before, I appreciate if you read it: https://news.ycombinator.com/item?id=10730047

falsedan 2 days ago 1 reply      
Please colleges, please make computer science/software engineering students take at least one course where they have to write essays to pass the course.
dclowd9901 3 days ago 0 replies      
First, his rocket example won't even work (in the actions.decrement function, "'model' has not been declared"), but what he's talking about isn't actually new in React land.


Not that every idea has to be novel, but I think the code in this repo provides a far better example of FRP (which SAM is) than that article.

soundoflight 3 days ago 0 replies      
I've worked on proprietary frameworks very similar to what the author describes. They are great for the web and in my opinion allow for tighter code. The issue is that I haven't found a good, open-source one for the web.

Also interesting is the fact that computer engineer "coding" is much more into the state concepts brought up in the article.

andraganescu 2 days ago 0 replies      
<< As a veteran MDE practitioner, I can assure you that you are infinitely better off writing code than metadata, be it as a template or a complex query language like GraphQL. >>

Reading it is worthwhile if only for this phrase.

franciscop 3 days ago 2 replies      
There's an excellent tool to avoid the default value mess shown there: defaults


It will fill your option object with the data if it's undefined
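The idea behind such a tool can be sketched in a few lines. This is an illustration of the pattern only, not the actual `defaults` package's implementation: options the caller left `undefined` get filled in from a defaults object.

```javascript
// Merge caller options over defaults, but only where the caller
// actually supplied a defined value (illustrative sketch).
function applyDefaults(options, defaults) {
  const provided = Object.fromEntries(
    Object.entries(options || {}).filter(([, v]) => v !== undefined)
  );
  return { ...defaults, ...provided };
}
```

So `applyDefaults({ retries: 5, timeout: undefined }, { retries: 3, timeout: 1000 })` keeps the caller's `retries` but falls back to the default `timeout`.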

colemannerd 3 days ago 1 reply      
Cool abstraction, but would love to see actual code implementation.
kensign 3 days ago 0 replies      
known 2 days ago 0 replies      
"Premature optimization is the root of all evil" -- Knuth
LAMike 2 days ago 0 replies      
I think the author might like the simplicity of something like Vue.js
mongosucks 3 days ago 1 reply      
pub/sub is the best, call it command bus, call it Flux, Redux,...
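The core mechanism this comment is naming, stripped of the Flux/Redux conventions layered on top, is just a publish/subscribe bus. A minimal sketch (names here are illustrative, not any library's API):

```javascript
// Bare-bones pub/sub: subscribers register handlers per message type,
// publishers dispatch a payload to every handler of that type.
function createBus() {
  const handlers = {};
  return {
    subscribe(type, fn) {
      (handlers[type] = handlers[type] || []).push(fn);
    },
    publish(type, payload) {
      (handlers[type] || []).forEach(fn => fn(payload));
    },
  };
}
```

A "command bus" or a Redux store can be seen as this shape plus rules about who may publish and how state is derived from the published messages.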
abledon 3 days ago 0 replies      
This was a sobering read.
fabrigm 3 days ago 0 replies      
Because you have time?
Uber's Design Meltdown elischiff.com
464 points by lumpypua  2 days ago   219 comments top 49
flyosity 2 days ago 4 replies      
I think one of the best points of this piece is referencing the iconic U sticker that nearly all Uber drivers have on their windshield or passenger window. When I've taken an Uber in an unfamiliar city (the bulk of my Uber trips basically) and I'm on a busy street corner, seeing the unmistakeable black-and-white sticker lets me know quickly and simply that I'm hopping into the right car, or where to start walking if it's 100' away.

What will the new Uber car sticker look like from afar? A circuit board? A community college parking lot pass? There's no way it will be as obvious and instantly recognizable as a big U sticker. In fact I can't even imagine Uber drivers will swap their decals out, ever, so it'll create a rift in the overall brand (drivers with U stickers, website/app with the new brand) and increase overall confusion with passengers.

secstate 2 days ago 4 replies      
Was it just me, or was the uBar advert at the bottom of the article really oddly placed? It was almost as if it were going to be part of the article - how the brand was becoming associated with crap software too.

Meanwhile that was totally just a horribly placed ad for the site ...

exelius 2 days ago 3 replies      
I'm never one to judge a rebrand immediately, as so many people were quick to do here. These things have a tendency to grow on you.

But after living with the rebrand for a few weeks? It sucks, and it made me realize how all the things I used to like about Uber are disappearing.

The simplicity was one of the best parts of Uber: you picked your service level, then ordered a pickup. That was it! These days, they've added a lot more friction to the process, and instituted carpooling rules. Never mind that surge pricing is almost always in effect -- I guess they're having a hard time retaining drivers?

Ultimately, all Uber has long-term is a brand: their business model will be commoditized the day autonomous cars are integrated into a fleet management system. And they fucked up that brand: they went from a recognizable brand identified with the letter "U" to an unrecognizable glyph that looks more like a video game than a ride hailing app.

m52go 2 days ago 4 replies      
For the record, it's happened more than once now that I look through my app menu and see the new Uber icon and have no idea what it is. Takes me a second to think about it and remember this 'rebranding'.

It's one of the worst things they could have done at a time when they need a strong brand the most.

salgernon 2 days ago 2 replies      
About two weeks ago, I found myself in Manhattan, needing to get to a specific airport hotel near jfk for a 7am flight. I'd never used Uber before, so I downloaded the app - I knew nothing about the rebranding (which had happened at the time) but I found myself with a somewhat bewildering array of options - uber, uberx, uber black, uber pool, something else. Multiplied by choosing the type of vehicle?

I had no idea what I should choose and I was in no mood to track down what the designers and implementors meant by the various choices in the matrix.

(Took the subway to something close, got out, got in a cab and got there for $20 total)

I wish there had been a more scripted experience: where are you going? What's your timeframe? Do you have luggage or need to impress someone? Do you mind acting as a carpool dummy? Do you want conversation or quiet?

Then pick from your array of checkboxes for me.

tetron 2 days ago 0 replies      
Basically, designers have to put up with the same management bullshit as developers, except with significantly less respect for the difficulty of their craft.
binarymax 2 days ago 8 replies      
All they needed to do was rotate the icon 90° clockwise and it would have satisfied everyone.

--EDIT--For clarity: http://imgur.com/sZN6g2i ...still looks like a 'U'

rl3 2 days ago 0 replies      
I really hope the writers of Silicon Valley somehow work this into the show.

Unfortunately Uber's timing is such that it'll probably have to be season 4.

alecsmart1 2 days ago 0 replies      
Just wanted to say that Uber is used by the masses; it's not really a product for techies. So a lot of my friends thought that Uber had been deleted from their phone (as they weren't able to find the icon) and then went ahead and ordered a cab from a competitor.
atria 2 days ago 1 reply      
Am I the only one that sees the icon and thinks that it looks like an abstract drawing of the sphincter or the underside of your anatomy? Must be tortured remembrances of goatsee videos from years gone by. Perhaps there is a tie in with all those grindr ads. Personally I find the combination of moms, puppies, and sphincters a bit disturbing.
carc 2 days ago 1 reply      
I've never, ever in my life seen a rebranding or redesign that wasn't widely complained about. How can we differentiate between this kneejerk reaction that all humans seem to have to changes in a UI/design they are already familiar with, and actually bad redesigns/rebrandings?
Jordrok 2 days ago 1 reply      
Ok, so supposedly the redesign is based on the idea of "bits and atoms". The atom at least seems to be based on the textbook image of a circular nucleus surrounded by lines of orbiting electrons. (Let's ignore the fact that the lines in the logo are pointing in totally nonsensical directions.) The "bit" though...why a square? A bit is a unit of measurement - it has no physical representation. Another circle would be just as valid. Maybe they're thinking of pixels? (Though those aren't always squares either...)

Even if you can get past how pretentious the whole concept of the design is, it all falls down anyway once you realize that they're just making shit up.

nxzero 2 days ago 0 replies      
Huge shifts like this are confusing to consumers. Companies like Starbucks understand that you can be fresh without looking like you're going through an identity crisis. Luckily for Uber, what people really care about is the service they're getting, not the logo for it.
klum 2 days ago 0 replies      
Maybe I'm just reading the wrong authors, but so much of the writing on design today lets plain bad design slide because it has the right look/people/brand/tech/image. Eli Schiff's writing is maybe a bit more vitriolic than necessary at times (IMHO), but it's a very refreshing counterpoint.
mangeletti 2 days ago 1 reply      
The first thing that came to my mind when I first saw their new icon was electroencephalogram stickers[1], or maybe those stickers that hold on heart monitors in the hospital.

1. see pic on the right here: http://www.epilepsygroup.com/notes6-35-63/new_patient.php

EDIT: I was told by my friend who is a nurse that the more accurate example would be "telemetry leads" - http://www.wireless-technology-advisor.com/images/cardiac-te...

bhouston 2 days ago 2 replies      
It is fun bashing Uber, but I am not sure this is quite as bad in all respects as this article makes out. In particular I like the black square icon; it is bold and unique. Their Android and iOS icons, though, are pretty bad. Too bad they couldn't have stayed with the bold black square for those as well, with some way to ensure it has good visibility.

Black monolith icon -- I like it.

mastazi 2 days ago 2 replies      
Do you remember the time Marissa Mayer said, referring to Adobe Illustrator, "I'm not a pro, but I know enough to be dangerous" and then "helped" design the new Yahoo logo? By that time, Mayer's (micro)management style had already resulted in the Lead Yahoo Mail Designer quitting the company[1].

This time it seems that, because of Kalanick's micromanagement of the Uber logo design, Uber's Head of Design has left the company[2].

A non-technical co-founder shouldn't write code, right? So I don't understand why a non-designer co-founder should design a logo.

[1] http://www.inc.com/cody-steve/yahoo-logo-redesign-marissa-ma...[2] http://www.fastcodesign.com/3056457/fast-feed/ubers-head-of-...

wodenokoto 2 days ago 2 replies      
For someone not part of the design process, the author sure seems to know how everybody on the team has been feeling during the redesign process.
droningparrot 2 days ago 0 replies      
"The Uber brand guidelines were sure to make clear that they don't want their logo to be urinated on or to be associated with condoms or sex. Because there was a real danger that might have happened."

I see ads for Uber on Grindr all the time. What's that about not associating the brand with sex?

brandonmenc 2 days ago 0 replies      
> It felt wrong for Uber's global and local brands to revolve around the color preferences of a rich, white guy in California - even if that rich, white guy in California is the CEO.

It worked for Steve Jobs at Apple.

Laaw 2 days ago 0 replies      
It took me like 30s to figure out the change, and I haven't had any problems with it since.

Not sure this is going to topple the company. I think this is a case of, "I do this for a living so it's super duper important". The branding redesign probably isn't as important as this author claims.

hackaflocka 2 days ago 0 replies      
Uber's rebranding is a management classic. It will be studied for decades as an example of what happens when committees create things.
arihant 2 days ago 0 replies      
I assumed the backwards 'C' was the incomplete progress bar. First I thought Uber is updating. Then when I realized it is the new icon, I thought it is meant to signify urgency of "your cab arriving."

In no universe did I imagine an atom or a bit there. Why does the bit have a tail that makes an atom a backwards C?

Pixel art on city landscapes would have conveyed their point better, and would have looked better.

wahsd 2 days ago 2 replies      
Not to detract, but it's a somewhat similar issue, albeit one that will probably never rise to the same level: LastPass also just went through a logo redesign, and it's equally puzzling.

The logo went from




Considering that the three dots simply look like any number of other menu icons - especially on Android, where you can sometimes have three of those ellipsis-type icons in one view - I'm not sure that was all that smart an idea. In case you didn't notice/realize it (I didn't), that's a cursor at the end of what are supposed to be three hidden characters. It's just an odd choice in my opinion.

pmarreck 2 days ago 0 replies      
The amount of hubris this article describes is staggering.

And I say this as a long-time Uber fan.

meric 2 days ago 0 replies      
I wonder if this destruction of brand equity will mean Uber has to write down the "intangible assets" section of its balance sheet.
soyiuz 2 days ago 1 reply      
I am confused. How is the app logo nowhere on the homepage? Did they go with UBER, the black square, and the weird backward C thing all at the same time? In addition, the font on Uber's front page does not match the Uber logo once you log in (where the U seems to have a little curve to it). Get it together, people!
ncke 2 days ago 0 replies      
Wow, I didn't think the day would come when I felt sorry for designers.
ank_the_elder 2 days ago 0 replies      
"All in all, it is remarkable that the Uber team produced what they did given the circumstances - truly a testimonial to the patience of Uber's Design Director Shalin Amin and former Head of Design, Andrew Crow, who not so inconspicuously departed the company immediately after the redesign."

Is there a difference between "head of design" and "design director"? They seem to have been at the company at the same time - correct? Just can't understand what the difference between these two positions would be...?

rcurry 2 days ago 0 replies      
Is it just me, or does anyone else think those two icons look like toilet seats?
terda12 2 days ago 3 replies      
The article reads like the author has a personal hatred against Uber, it makes it hard for me to accept his opinions.

Would have been better if he took a more objective look at Uber's redesign instead of taking the time to bash the CEO or the designers.

umutisik 2 days ago 0 replies      
Seeing the iOS logo always makes me think for a second that the app is being updated.
repomies691 2 days ago 1 reply      
How much impact does branding actually have on a product like this? Uber is pretty much the only option for ridesharing in my country, and I guess in most countries in the world. I guess you can harm the brand a little by doing some redesign or something, but I think in the end it doesn't matter that much. The most important thing is that the product works, for a good enough price. I personally don't care a bit what the Uber icon on my phone is, and I doubt that the majority of consumers do.
EvanPlaice 2 days ago 0 replies      
The new logo looks like Pacman after assimilation by the Borg.

Maybe that's indicative of the new 'data platform' development team they're actively recruiting for.

Jemmeh 2 days ago 0 replies      
Lately I've seen quite a few people reach for their phone, not find the Uber app right away and they just click on that pink mustache instead.
melted 2 days ago 1 reply      
Who gives a damn. At this point Uber could choose a turd as their logo, and they'd still be successful.
rajacombinator 1 day ago 0 replies      
Wow talk about a nuclear facepalm. I'd bet Uber doesn't make it through the crash if this is how their thinking process works at the top level.
nanocyber 2 days ago 0 replies      
I used to use Uber quite regularly.

In the past year, I have had several drivers bring up, of their own volition, their dissatisfaction with the fact that tipping Uber drivers is uncommon. It is quite clearly stated in Uber's app and website that there is "no need to tip". I'm not a cheapskate, but I don't carry cash most of the time, and their veiled reminders/requests made me uncomfortable.

It probably sounds strange, but the ridiculous logo change, on top of the changing driver culture, has made me remove Uber from my transportation-method choices.

Fickle creature I am.

mahyarm 2 days ago 0 replies      
I remember similar sentiment when AirBnB did their rebrand. An article comparing the rebrands of the two companies would be interesting.
kull 2 days ago 0 replies      
Wait, but finally what does this shape/icon mean?
CIPHERSTONE 1 day ago 0 replies      
Seems like an example of a CEO with an over abundance of hubris.
kawsper 2 days ago 0 replies      
Interesting read.

Funnily enough, I used Uber three days ago and didn't notice any change at all. But after I read the article I checked the Uber logo on my iPhone, and there it was, the new logo staring at me.

deciplex 2 days ago 3 replies      
Apropos of nothing:

> English is the lingua franca of the world

I love reading, and hearing, these words. Heh.

rwmj 2 days ago 1 reply      
He seems very bitter about something.
BillTheCat 2 days ago 1 reply      
Are we supposed to know who Amin is? The article starts referring to him with no introduction.
edem 2 days ago 6 replies      
What a clickbait title. This is not about a meltdown but about design.
callumlocke 2 days ago 3 replies      
This reads like an oddly smug attack on a fairly successful redesign.
tosseraccount 2 days ago 0 replies      
"Leading up to the Super Bowl, Uber's Twitter feed was all puppies."

Does the writer not get that the "Puppy Bowl" was on at the time? Animal Planet's special show is a very popular alternative during the "let's not compete with the Super Bowl" lull on TV.

at-fates-hands 2 days ago 0 replies      
Honestly? This looks like Nixon's logo, just tipped on the side and modified slightly:


Dear Startups: Heres How to Stay Alive heidiroizen.tumblr.com
427 points by whbk  2 days ago   187 comments top 27
dsugarman 2 days ago 6 replies      
>If you are in Silicon Valley and your customers are mostly well-paid consumers with no free time, or other venture-backed startups, well, I'd be worried.

That's the most beautiful articulation of this thought I've heard. I constantly hear people in SV publicly talk about how they're living years in the future because they get services from startups that haven't yet hit other markets. These people are very wealthy and very short on free time; they incorrectly assume the rest of the world is as well. The reason Uber became so successful was that it became cheaper than a cab in most major markets, with world-class service. You have to really dig deep to justify most other on-demand startups having the ability to make that jump, and it's because they don't have a plebeian offering.

eldavido 2 days ago 4 replies      
What goes unspoken is how tiny the overall effect of this will be.

Yes, it will bring some concentrated pain to investors, CEOs, and employees of lots of companies. But how many people will be genuinely, life-alteringly affected by this? 1000? Maybe a few thousand? 1-2% of SF's population? By way of comparison Google has what, 50,000 employees?

I keep having to remind myself that the big companies are the elephants in the room compensation-, real estate- and traffic-wise. They employ hundreds of thousands of people and pay billions of dollars annually in wages. As much as I'd like an affordable place to live, none of this will move the needle that much for the average Bay Area resident.

tarr11 2 days ago 8 replies      
The cynical side of me wonders if all this is "helpful advice" from VCs is just designed to bring valuations down to earth.
mindcrime 2 days ago 1 reply      
On a related note, now's probably a good time to remind people of pg's famous "How Not To Die" essay, which is at least tangentially related to the topic at hand.


roymurdock 2 days ago 8 replies      
You know what kind of companies generally survive? Companies that make more money than they spend. I know, duh, right? If you make more than you spend, you get to stay alive for a long time. If you don't, you have to get money from someone else to keep going. And, as I just said, that's going to be way harder now. I'm embarrassed writing this because it is so flipping simple, yet it is amazing to me how many entrepreneurs are still talking about their plans to the next round. What if there is no next round? Don't you still want to survive?

Yes, some companies are moon shots (DFJ has a fair number of those in our portfolio) where this is simply not possible. But for the vast majority of startups, this should be possible.

What is the point of calling them startups anymore? Remove the high-risk/high-reward aspect and new companies are simply small businesses that receive small-business loans from banks. A lot of the "wow" factor of the startup ecosystem was the mind-boggling user growth/high valuation/massive losses phenomenon that a few companies weathered through to IPO and monetization.

I think I saw someone advocating for better terminology on HN recently. I vote to call any close-to-profitable <2 yr old company a small business. Likewise, any portfolio that holds mostly safe small business loans and equity should simply be called a bank.

Leave the unicorn/VC/startup lingo in the past, or use it to describe actual risk profiles, and things will be a lot less confusing.

jorgecurio 2 days ago 3 replies      
> You know what kind of companies generally survive? Companies that make more money than they spend. I know, duh, right? If you make more than you spend, you get to stay alive for a long time. If you don't, you have to get money from someone else to keep going. And, as I just said, that's going to be way harder now. I'm embarrassed writing this because it is so flipping simple, yet it is amazing to me how many entrepreneurs are still talking about their plans to the next round. What if there is no next round? Don't you still want to survive?

Sort of reminds me of the conversation with a senior developer I had the first time I joined a startup and my first company lunch at my first job.

Me: "So, we just spend whatever money the company makes"

him: "Correct"

Me: "what if the company is burning all the money it makes to grow as fast as possible, and they can't raise money anymore?"

him: "that will never happen"

Me: (concerned) "so the company is constantly breaking even"

him: "sometimes"

Me: (shocked) "so the company loses money some years, yet raises more money year after year so it can lose more money the following year than the last"

him: (annoyed) "you studied economics, haven't you? you don't get it? everyone knows this is how you do startups, what did they teach you in that shithole?"

(everyone else laughs)

end scene.

That was 4 years ago. I checked the Glassdoor comments and boy, I didn't think a 2.1 rating was possible on Glassdoor, because that would pretty much scare off anyone in the job market... and yup, the company is going under for exactly the reasons I asked about 4 years ago, when I was ridiculed for my 'ignorance'.

Another one bites the dust for Vancouver's brain-drained tech scene. Thank god I won't have to work there again in the near future.

Ftuuky 2 days ago 5 replies      
What? One piece of advice is to get cash-flow positive with the money you already have. Isn't that basic knowledge? You can't spend more than you have, and you only ask for other people's money when you don't need it. Idk, maybe this is an American thing, with all the capital you have, but here (Portugal) you can't get Series A funding without being at least cash-flow positive, no way.
matchagaucho 2 days ago 1 reply      
DFJ has had some great exits over the years. But looking at their current active portfolio, they're a little exposed.... and certainly not investing in any early rounds.

It's not surprising to hear they plan to slow down investing. But that's not necessarily a reflection of the overall market.


arielm 2 days ago 2 replies      
I think the startup world has become somewhat of a fork of how real companies should be built. Over the last few years companies have been investing into "scaling" and getting traction with no real revenue to substantiate any of the growth. That to me is backwards, and why those startups are fearing for their lives now.

Companies should be built with revenue (and profit) in mind, and in most cases those are the ones that thrive and succeed.

carsongross 2 days ago 0 replies      
"Annual income twenty pounds, annual expenditure nineteen pounds nineteen and six, result: happiness.

Annual income twenty pounds, annual expenditure twenty pounds nought and six, result: misery."

--Wilkins Micawber

mmaunder 2 days ago 4 replies      
So the above is obviously written through a VC lens. Through an entrepreneur's lens - who also survived the dot-com bust (at etoys.com) and has since run several failed and now successful businesses - I'd add the following:

The most valuable advice in this post reminds me of Marc A's awesome blog entry. Quote:

"Companies that have a retention problem usually have a winning problem. Or rather, a "not winning" problem."


In my opinion winning is, ultimately, measured by how much cash you can generate. We stopped thinking about an exit a long time ago while in the deepest darkest part of the valley of the shadow of startup death. We were forced to do it because we ran out of money and no one cared about us. Then we started focusing completely on our customers and our income statement. As soon as we did that, amazing things started happening.

Cash, in this case and in this climate, is king. Or net income to be specific. If you're able to generate large amounts of cash and keep a lot of it, not a heck of a lot else matters. From my perspective the only problems that really remain is giving your team a great quality of life and serving your customers.

Cash takes away issues like the board bugging you, investors breathing down your neck or (worst case) wanting to play CEO, hiring problems, retention problems, funding, what business are we in problems, product problems (you're obviously killing it, so do more of that!), exec hires, issues with rebellious execs (you're killing it, so you're implicitly right) etc.

When you "go for growth" (numbers growth, not revenue) you give up all of the above and put yourself as a CEO or exec in a precarious position. Your arguments are no longer that defendable because growth means jack shit unless it generates cash or will very clearly ultimately generate cash.

Think about the CEO of Giphy who just raised something like $50M at something like a $300M valuation. It's like my wife and co-founder says: Doing that you turn a cash problem into a much bigger cash problem. I'd add that you also now have less equity and less influence. For the investors it's awesome - the biz will likely bulk up on talent and worst case will exit as a talent acquisition at $2M per engineer and the investors (who get paid first) will recover perhaps everything that way with little left over.

If I was early stage in this environment I'd do the following:

Stop dreaming about a Deus ex Machina that will reach down and save your sorry ass. Stop fantasizing about acquisitions. If you don't you're going to inadvertently turn acquirers into your target market instead of your real customers. And humans aren't good at focusing on two goals at once.

Then do absolutely everything you can to generate sustainable cash. Usually this means (if you're early stage) discovering who your customers are and what business you're in or (if you're later stage) serving the heck out of your customers and making sure that what you provide is worth more than each dollar they spend to acquire it. Then do more of that. If you're successful doing this, rather than raising money, you'll notice that the really big scary problems simply go away.

zan2434 1 day ago 0 replies      
Based on the rest of the comments here deriding the growth over revenue strategy I think it's very important to bring up that risk is proportional to reward, and by definition any business that can be cash flow positive early on is unlikely to be very risky - and thereby not really what VCs are in this business for.
rpgmaker 2 days ago 0 replies      
When a market like this turns, in order to survive, it is critical to redefine what success is going to look like for you and your employees, and your investors, and your other stakeholders. Holding on to old ideas about IPO dates, large exits and massive new up rounds can ultimately be demotivating to your team.



Stop worrying about morale: Yes, you heard me right. I can't tell you how many board meetings I've been in where the CEO is anguished over the impacts on morale that cost cutting or layoffs will bring about.

With these prospects, I wonder how these CEOs will keep all those underpaid and highly skilled young laborers working for them now?

ar7hur 1 day ago 0 replies      
This is very similar to the infamous "RIP Good Times" presentation Sequoia shared in 2008


jmspring 2 days ago 0 replies      
The whole bit about "don't worry about morale"... Some engineers are replaceable; not all. If things get bad and you lose early/key people, there is a non-negligible hit. But I think Ben and Marc at a16z outlined a strategy harkening back to the last big hit - build up the reserves in the bunker. If you think things will be bumpy for X months out and you aren't cash-flow positive, get the requisite amount in the bank ASAP.
selvan 1 day ago 0 replies      
During the late nineties, my startup provided technology consulting/development services to other dot-com startups. Demand for our consulting service was so high that our company resorted to an auction-like process to select customers. Then the dot-com bust happened, and 97% of our customers went out of business - quickly, very quickly. Obviously, our company's fortunes dwindled and never recovered.
lsiebert 2 days ago 0 replies      
I think people are ignoring the huge cash reserves that Google and Apple, among others, have. I fully expect more acquisitions if VC funding drops out.
aledalgrande 2 days ago 2 replies      
This is based on the "Techcrunch" concept that being successful and continuing business for a startup depends heavily on external funds.

That couldn't be farther from the truth, for a real startup with a real business. Maybe growth will not be as fast without VC funds, but I don't think real businesses will notice shrinking investments.

Correct me if I'm wrong.

arihant 2 days ago 0 replies      
>If you are in Silicon Valley and your customers are mostly well-paid consumers with no free time, or other venture-backed startups, well, I'd be worried.

This is the most shallow statement I have read this year. The needs of the rich today will be the needs of the less rich tomorrow. The author clearly missed the whole American-dream concept. I'm sure some people felt the same way about refrigerators and cars.

You'd almost never create a market segment starting with the bottom end. Almost every product you touch, including the very screen you're staring at, was once made for the 1%.

And almost always the version for the 1% is expensive, won't see a version 2, and is a one-time sale. It doesn't matter if your initial rich/busy customers are going out of business. If you found a need you're fulfilling, you will, with fairly high probability, continue to find customers through the generations.

Dot com bust did not kill Network Solutions/Verisign. Very, very important.

ohadron 2 days ago 2 replies      
Good read. Sounds like good advice in general, regardless of how hard it is to obtain capital.
honksillet 2 days ago 0 replies      
Dear VCs,
You made a lot of bad investments. Prepare to take a loss.
melted 2 days ago 0 replies      
A VC taking the valuations down and encouraging the companies to become profitable sooner, to get a fatter slice of a tastier pie when founders come begging for money. News at 11.
jedicoffee 2 days ago 1 reply      
Found the problem! "It is going to be hard (or impossible) for many of todays startups to raise funds." You don't need to take on millions in debt to start a company.
codingdave 2 days ago 0 replies      
Even shorter version of how to stay alive - model your business around some transaction that brings in more revenue than it costs to provide.
outworlder 2 days ago 0 replies      
> It is going to be hard (or impossible) for many of today's startups to raise funds.

So, just like it is in most of the world, then?

julianozen 2 days ago 0 replies      
What a time to be alive
gizi 1 day ago 0 replies      
"The sky is falling ..." No, it isn't. All there is, is that there seems to be less appetite for endlessly unprofitable ventures that get away with dismissing the idea that they should be bringing in more cash than they spend within a reasonable time frame. Furthermore, is the entire VC scene actually needed? Lots of startups do not make use of their services and are doing absolutely fine ...
Do you really need 10,000 steps a day? cardiogr.am
403 points by brandonb  6 days ago   171 comments top 35
newdaynewuser 6 days ago 10 replies      
I got into the 10,000 steps craze, thanks in part to my company providing Fitbits. I was not really a health nut, just 3 days a week at the gym, lifting and cardio. I had a bit of muscle definition. The gym was just something I did, like brushing my teeth. I didn't focus my life around it.

Due to incentives and competition among co-workers, I got into Fitbit seriously. While it was fun, after a few months not only did I lose some definition but I also gained weight. I expected to lose some muscle mass, but didn't know that 10,000 steps spread throughout the day were not enough activity.

Now I realize Fitbits are great for extremely unhealthy or lazy people. But those of us who have been working out without any devices should stay away from such devices, and not change our workout routines.

It is too late for me; now I cannot work out without a device and some nice charts. I ended up getting an HRM-based device. I like the charts it provides, but I was happy using the mirror to judge how much I needed to work out. And I didn't need any external motivation back then.

brandonb 6 days ago 6 replies      
Hey all, OP here.

I'm working on developing Cardiogram, the Apple Watch app from which this data is derived, and working with UCSF cardiology doing machine learning on heart rate data.

Feel free to ask any questions here!

HorizonXP 6 days ago 7 replies      
Great summary of your findings.

I've been wearing a Fitbit Charge HR since last June, and I definitely do not reach 10k steps daily. However, I've noticed that on days I hit the gym and do weight lifting, my caloric burn easily tops 3300, while on days where I'm lazy and just working on my laptop, I barely break 2100.

As a rule, I typically do not do any cardiovascular exercise. I think, though I do not know for sure, that the Fitbit is better suited for measuring cardiovascular activities, so I'm not sure how or why it's adjusting my caloric burn on my gym days. Maybe it's looking at my heart rate and extrapolating?

Anyway, I'd be very much interested in some kind of fitness tracker that can provide better estimates on caloric burn. That coupled with calorie counting/recording of meals is what will really help me shed this weight.

schwap 6 days ago 0 replies      
I imagine the popularity of '10,000 steps a day' was mainly driven by how cheap/easy it was to measure steps. Nowadays with smart watches or dedicated fitness trackers it's finally becoming possible to come up with better metrics that can be widely used.
arprocter 6 days ago 4 replies      
I walk quickly with long strides, so steps always seemed like a bit of a kludge - someone with shorter legs would presumably take more steps to travel the same distance as me, but would that necessarily make them 'healthier' than me?

In high school we had a cardio class and the requirement was 'get your heart rate above xxx BPM' - it would take less effort for out of shape folks to get above the threshold

antirez 6 days ago 5 replies      
My wife and I both have 50-53 bpm at rest; during certain periods mine was 48, and the doctor demanded an echocardiogram even after I explained that I do a lot of sport. Everything was fine. BTW, TL;DR: we got those bpms doing CrossFit 3-4 times a week (4-5 for my wife); we rarely walk at all. I switched to CrossFit recently (a few months ago); before that I did powerlifting + cardio for years, which provides very similar results from the POV of heart rate, AFAIK. BTW, I think my level of cardio fitness is improving with CrossFit compared to PL.
erikb 6 days ago 0 replies      
Hm, what about the other goals of walking 10k steps? I started walking to lose weight and found, thanks to all these smart devices, that when I walk more than 15k steps I actually drop in weight. Since I'm on the very unhealthy part of the scale it really helps a lot to improve my health.

What I found is that the motivational gain from having data is quite low though. For a month maybe it was good motivation, but if you don't have someone to specifically compete against then you just stop caring.

aivosha 6 days ago 1 reply      
I don't buy intense workouts as a better alternative to more "slow burn" workouts. I'm a runner and I used to do fast 5k runs 3-4 times a week. I would feel very tired at the end of each run, just depleted; I didn't wanna do anything else but relax. I had headaches too, most likely due to burning carbs at a fast rate (intense workout, high heart rate) and basically depleting my carb reserves, which I think are about 2,500 calories in a person. Then I stumbled upon a book, I think here on HN, called Slow Burn by Stu Mittleman [1]. It changed the way I look at cardio exercise dramatically! You wanna be healthy: burn fat (slow pace, long time, low HR). You wanna be fit: burn carbs (fast pace, short runs, high HR). These are very different goals, as you can imagine! So which one do you really want?

[1] http://www.amazon.ca/Slow-Burn-Stu-Mittleman/dp/0062736744

Edit: Just wanted to add, now I run 10k easily, feel great after each run and don't have any headaches, and I think (though I don't really care) I lose weight too. Just keep the HR under control during the run; that's all you need.

didibus 5 days ago 0 replies      
This simply looked at resting heart rate. Sure, you could do better than 10k steps a day to improve that, but this does not mean there are no other benefits to 10k steps. As the article points out, there's no study of it.

For me personally, I find walking is fun, and simple to integrate into my routine. It burns an extra 300 cal a day, and forces me to spread my activity throughout the day. That last one is important, because there is a growing body of evidence that inactivity is very bad for you, and that exercise is not enough to counter the effect. What you need is to decrease your periods of inactivity. Finally, walking is very safe and easy on your body.

That's not to say any other alternate type of activity wouldn't be just as good, or better. When it comes to exercise, though, anything is better than nothing, so if 10k works for you, keep at it.

the_economist 6 days ago 0 replies      
After I got my iPhone 6 and started monitoring my steps using its pedometer, I started walking 10,000 steps every single day. After a few months, I noticed that my body was quite a bit more toned and muscular. Great benefit to the steps!
AdmiralAsshat 6 days ago 4 replies      
Well that's good to know. I literally can't get to 10,000 steps during a normal day unless I make it a goal during my lunch break to run down to the gym and walk around 4 miles on the treadmill. Since my walking speed is 3.2/3.5 mph, this can't be done leisurely. My only hope for getting to 10,000 is that I'll do the 3.5 or so at the gym and then hopefully be required to run to the grocery store later in the evening so that I can be forced to walk more.

The fitness tracker is supposed to be a motivator, but instead when I see that I'm only at 7-8k per day even after an hour-long walk, it feels more like a fat shamer.

gthtjtkt 6 days ago 6 replies      
Everything about modern fitness is fascinating. We have access to more data about health and exercise than ever before, all kinds of personalized fitness trackers, calorie counters, meal planners, personal trainers, exercise routines and equipment. Yet somehow our society still seems to be getting fatter and more out of shape every year.

What is going on!?

ChemicalWarfare 6 days ago 0 replies      
It is more or less common knowledge that the higher the intensity of the exercise, the less time (or "steps", in this case) is needed to achieve the same goal.

There is a curveball, however, that needs to be taken into account: what exactly is your goal? For example, a sprinter and a long-distance runner might average the same BPM over the course of the week, both resting and during their workouts, yet the end result is quite different.

I myself have always been a proponent of high intensity training for everything - short and intense weight lifting workouts, sprinting etc. This changed when at one point in my athletic "career" I switched to boxing at a relatively young age (under 20 y.o.) and my shins/calves started getting sore to the point where it was difficult to walk.

The way to condition those for relatively low-intensity but long (2+ hour) workouts? Even longer and lower intensity workouts basically just walking around or light jogging for hours.

dfar1 6 days ago 2 replies      
We should not dismiss products such as Fitbits as ineffective just because 10k steps may not be a valid metric. A little competition and setting of goals is good for everyone.
jackreichert 6 days ago 1 reply      
The measuring stick in this article seems to solely concern resting heart rate. Yes, this is because that is the origin of the 10k/day measure; but surely there are other benefits regardless of the intensity of your workout.
neves 6 days ago 0 replies      
Now I'm impressed with the data generated from the Apple Watch. I thought it would be used just in complex studies, but we can get important insights for everyday life.

I really want to buy an activity tracker now.

bootload 6 days ago 0 replies      
Do you need to do 10K steps per day?

I don't think your average person is capable, let alone able to set aside the time. Tools that confirm the minimum physical PT requirement are, in my view, good. They will individualise the minimum requirement.

I do above and below 10km/day, 6 days per week, year in, year out. [0] This puts me in hard-case territory. Does everyone need this? Probably not. I move upwards of 1600km to 2000km per year. If you move at a minimum of 5km/hr, that is 2 hours per day. Time commitment required. I choose to move across broken ground and carry weight, a minimum of 1kg and an irregular max of 20kg, averaging around 8kg. The physical commitment to do this is high.

What do I find?

10km/day is hard. In the cold it is better; you have to take precautions in the heat. You don't have to count calories as much. You get sore feet and muscles. Do you need it? I doubt it. For me, I could probably meet the requirements with 5km/day plus some resistance gym work.

Interesting psychological observation: the most likely time for failure to get out the door is the short time before you start. If you say to yourself, "f*&$-it, just go", you are more likely to start and complete. I've tried this for a decade and the resistance to start is pretty constant, but once you get out the door it disappears.

Do the hardest thing first ~ http://seldomlogical.com/hard.html

[0] currently at 5km/day with weights and gym till I crack 200km.

baldgeek 6 days ago 0 replies      
Been using a Fitbit for over 2 years now (~7.5 million steps logged). I have the data pushed to other web apps (MyFitnessPal, FitStudio, Higi, Acheivemint, Walgreens balance rewards). Also trained up and ran 2 half-marathons in this time. Lost 60lbs so far; the first 20lbs was from increased walking and diligent choices via calorie logging in MyFitnessPal. Fitbit is not the be-all end-all, but it definitely keeps me aware.
z3t4 3 days ago 0 replies      
Humans walk very efficiently. If you want to burn fat or get your heart rate up, a 20-second sprint is equal to two hours of walking.

If you want to increase the capacity of your cardiovascular system, to get a lower resting pulse, the best method is to engage in an activity that gets you to around 70% of your max heart rate, for about 45 minutes or more, at least twice a week. Examples of such activities (for normal people) are gym training, cleaning the house, mowing the lawn, etc.

Max pulse is around 220 minus age. So if you are 35, you want to average (220-35)*0.7=130, which will feel easy.
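The arithmetic in the comment above can be sketched in a couple of lines. Note this is only a rough rule of thumb: both the 220-minus-age formula and the 70% intensity figure are the commenter's assumptions, not clinical guidance.

```python
def target_heart_rate(age, intensity=0.7):
    """Rough target heart rate via the common 220-minus-age rule of thumb."""
    max_hr = 220 - age            # crude population-level estimate of max pulse
    return max_hr * intensity

print(target_heart_rate(35))      # ≈ 129.5, i.e. the ~130 bpm in the comment
```

Plugging in other ages or intensities (e.g. `target_heart_rate(50, 0.6)`) gives the corresponding rough target.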

panglott 6 days ago 0 replies      
The article calls it the "manpo-kei", but the Japanese sign clearly says "manpo mt" (Manpo-Meter) with "meter" in katakana.
jim-greer 6 days ago 1 reply      
450M data points is cool, but without knowing how many people that represents, it's hard to say how meaningful this is.
thesz 6 days ago 0 replies      
10,000 steps is roughly equal to 7-7.5 km walked.

To stop the shrinking of important parts of the brain you need about 5 km/day on average over a week (you can do 35 km on Sunday, for example). BTW, 5 km/h is close to typical walking speed, neither fast nor slow.

So yes, you do not need 10K steps a day. Slightly above 7000 steps per day would suffice.
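The steps-to-distance figures above are just stride length times step count. A quick sketch, assuming an average stride of 0.75 m (an assumed value; the comment's 7-7.5 km range corresponds to strides of roughly 0.70-0.75 m):

```python
def steps_to_km(steps, stride_m=0.75):
    """Convert a step count to kilometres for a given stride length in metres."""
    return steps * stride_m / 1000

print(steps_to_km(10_000))   # → 7.5, the upper end of the comment's 7-7.5 km range
print(steps_to_km(7_000))    # → 5.25, near the ~5 km/day figure mentioned above
```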

anigbrowl 6 days ago 0 replies      
The only statistics I care about are the number of exercise reps I do each day and how many hours a week I spend dancing to loud music. Not only do I not want to get hung up on analytics, I see absolutely no reason to share that data with anyone else so they can work out better ways to advertise to me.
contravariant 6 days ago 0 replies      
Who thought it was a good idea to use bigger bubbles for larger groups of people? Those values should have the least spread; drawing them bigger serves no purpose and is just confusing. Simple error bars would have been a lot better.
petercooper 6 days ago 1 reply      
Is resting heart rate a good indicator either? Mine is typically 55-58bpm but I do no organized exercise and am sitting for much of the day (I do try to walk quick and run up stairs to make up for it, so while I'm not "fit", I don't feel unfit per se).

I am free from cardiac related symptoms, so does this mean I am "naturally" fit without really trying? (A bit like people who don't focus on their diet but are slim.) Or, as I suspect, is resting heart rate not a very useful indicator of fitness at all?

kazinator 6 days ago 0 replies      
Whaddya know. The bogus idea of chewing food 30 times (at least) also originated (or rather was popularized) in Japan.

This sort of falls into the same bucket.

This was originated by American diet reformer Horace Fletcher in the early 1900's.

See here, e.g.:


dsmithatx 6 days ago 1 reply      
Interesting data, but does lowering my resting heart rate mean I will live longer? Will it decrease the chance of having a heart attack?
rajneeshgopalan 4 days ago 0 replies      
I regularly work out, about 3-4 times a week. I do this to stay in shape and have a healthy lifestyle. Although devices and charts could aid me in getting even better, I do not use any. I simply listen carefully to my body and adapt.
auganov 6 days ago 1 reply      
Without adjusting for BMIs it doesn't tell us that much about causality. If you could get that data that would be huge.
melling 6 days ago 1 reply      
Do people at the Wall Street Journal simply publish the same story 3 weeks later?


Guildpact 6 days ago 0 replies      
The scale of a lot of this data runs from 74 to 77 bpm resting heart rate. I feel like there is only a small correlation, and it's only a small improvement from "couch potato" to "high intensity".
agumonkey 6 days ago 0 replies      
Having gone through heart failure, I can attest that walking is a godsend. Even at moderate effort, as long as it's more than 30 minutes. Small hills are bonuses.

-- sent from my dead-oriented design desktop wheelchair.

zobzu 5 days ago 0 replies      
If I bike every day, my resting HR is 42 bpm.

If I don't for a month, it's 47.

Walking is just one way; biking or swimming is better, though.

(And yes, I have a rather low HR genetically, regardless.)

kumarski 6 days ago 1 reply      
A calorie is not a calorie.

Leptin signal blocking counts for something...

amar-singh 5 days ago 0 replies      
Yes, we need 10,000 steps every day. It's very good for leading a healthy life.
U.S. Supreme Court Justice Antonin Scalia has died nytimes.com
327 points by cgtyoder  5 days ago   359 comments top 30
ianamartin 5 days ago 6 replies      
I usually disagreed with Scalia, but I have tons of respect for the man. He's always struck me as one of the best examples of the idea that two intelligent, educated people with reasonable minds may legitimately come to different conclusions about the same set of facts.

I also found when reading transcripts of oral arguments that Scalia was really terrifically funny pretty often.

Once, I was doing some computer repair work for an attorney who was interviewing Scalia extensively for a book about Supreme Court oral arguments. It was late in the day, and all the other staff were gone.

The phone rang, and the attorney asked me if I would answer it since I was close, and tell whoever it was that he was busy. I picked it up and the voice on the other end said, "let me speak to x." I said he wasn't in. The voice says, "I know he's in. Let me speak to him please." I tried to deflect again, and he finally says, "This is Justice Scalia. I guarantee you your boss wants to talk to me."

Without thinking I piped up and said, "Oh, Justice Scalia! You won't believe who just walked in the door! Just a second."

Apparently Scalia thought that was hilarious and told the guy to pay me extra for popping off to a Supreme Court justice like that.

Edit: as a quick sidenote, I'd encourage everyone to actually read the oral arguments and the full opinions for important cases as they come up. The media is absolutely terrible about over simplifying or just straight up not getting the issues correct.

You will find that there is a hell of a lot of thought that goes into opinions, and there is much less predetermined ideology than the way these things often get painted.

rrggrr 5 days ago 6 replies      
Scalia was foremost an advocate of judicial restraint, who believed that the further the Courts stretched interpreting law, the more vulnerable the Courts become to the animosities of the Executive and Legislative branches. The Judiciary, Scalia understood, was the least powerful and most vulnerable of the three branches of government, and its strength and influence rests in its artful and judicious use of its very, very limited authority. A few Scalia quotes that illustrate this:

>There is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.

>If you think aficionados of a living Constitution want to bring you flexibility, think again. You think the death penalty is a good idea? Persuade your fellow citizens to adopt it. You want a right to abortion? Persuade your fellow citizens and enact it. That's flexibility.

>This Court holds only the judicial power - the power to pronounce the law as Congress has enacted it. We lack the prerogative to repair laws that do not work out in practice, just as the people lack the ability to throw us out of office if they dislike the solutions we concoct.

>Perhaps sensing the dismal failure of its efforts to show that "established by the State" means "established by the State or the Federal Government", the Court tries to palm off the pertinent statutory phrase as inartful drafting. This Court, however, has no free-floating power to rescue Congress from its drafting errors.

danso 5 days ago 0 replies      
RIP Justice Scalia. Didn't agree with him on many things, but he seemed like a relatively principled judge, as far as they go.

(edit: Actually, can't think of too "many" things off the top of my head where I fully disagreed with him. I did enjoy reading his written rulings)

In the coming days, it'll be interesting to see retrospectives on how Justice Scalia ruled on such issues as tech privacy and censorship. For example, in Brown vs. Entertainment Merchants Association (2011), Justice Scalia wrote the majority opinion which said that "video games qualify for First Amendment protection"


edit: More context on the ruling if you don't feel like clicking through: The 7-2 opinion struck down a California law that banned the sales of video games to minors, which had been signed into law by (of all the ironies), Republican Governor Arnold Schwarzenegger.

Basically, the court saw video games as art:

> Like the protected books, plays, and movies that preceded them, video games communicate ideas - and even social messages - through many familiar literary devices (such as characters, dialogue, plot, and music) and through features distinctive to the medium (such as the player's interaction with the virtual world). That suffices to confer First Amendment protection.

As described by Wikipedia:

> Scalia's decision also stated that the current self-moderated industry standards like the ESRB are operated effectively to regulate the sale of more mature games to minors, and that "filling the remaining modest gap in concerned-parents' control can hardly be a compelling state interest" requiring a law to enforce.

The two dissents were Justice Breyer (considered a liberal justice) and Justice Thomas, who is seen just as much of the conservative base as Scalia is. According to Wikipedia:

> Justices Clarence Thomas and Stephen Breyer dissented, each authoring a separate dissent. Justice Thomas, in his dissent, considered that historically, the Founding Fathers "believed parents to have complete authority over their minor children and expected parents to direct the development of those children," and that the intent of the First Amendment "does not include a right to speak to minors (or a right of minors to access speech) without going through the minors' parents or guardians."

Justice Alito and Roberts, the 2 other conservative members of the court, concurred with Scalia's opinion, but had reservations about being too lax in regulating the content of video games:

> "There are reasons to suspect that the experience of playing violent video games just might be very different from reading a book, listening to the radio, or watching a movie or a television show," referencing the book Infinite Reality which highlights the psychological effects of virtual reality, and argued that the decision "would not squelch legislative efforts to deal with what is perceived by some to be a significant and developing social problem.

teawithcarl 5 days ago 0 replies      
In 1992, I saw Scalia in person preside over a three-judge panel (including two other federal district judges) at Harvard - at the Ames Moot Court, the famous moot (fake) court for the very best two teams of third-year law students.

He did an excellent job over the two day proceedings. It was fascinating to watch in person.

In a parallel event in the evening, I watched Scalia take questions in an open air pavilion lawn (with no security whatsoever, just an open event on campus). At one point an audience member asked him, "What's the most difficult area of law for you personally to judge?"

After seriously furrowing his brow and mind, he finally and sincerely proffered up ... "Indian (Native American) law".

My take on his answer - he found that area of law truly difficult as a strict constructionist.

liquidise 5 days ago 3 replies      
This will likely result in a major political shift of the court. Even if Obama appoints a staunch centrist, it will be a net swing to the left, as Scalia was perhaps the second most conservative justice after Thomas.

There have been a great deal of 5-4 decisions since Sotomayor's nomination that in the future could easily fall the other way.

udfalkso 5 days ago 11 replies      
A judge quoted in the article said, "my educated guess is nothing will happen before the next president is elected."

Presumably he's talking about selecting a replacement to the supreme court. Is this accurate? How long does this process typically take?

jakeogh 5 days ago 0 replies      
tomjakubowski 5 days ago 3 replies      
A dumb hypothetical because this will never happen, but: could Obama nominate himself to replace Scalia? And if the Senate somehow approved him, would he have to resign the presidency to take his spot on the bench? I don't see anything in the eligibility requirements to be President or a judge of the Supreme Court that would suggest an incompatibility. Is there relevant case law?
hwstar 5 days ago 1 reply      
Things are going to get very interesting with the pending court decisions coming at the end of June.
greenyoda 5 days ago 1 reply      
So far, the only story that has been posted that has any significant information about Scalia's career has been this one, from the Chicago Tribune:


hysan 5 days ago 1 reply      
Question for those familiar with how the Judicial system works: If Congress prevents any new appointments until after the election cycle, does that mean that the Supreme Court is effectively halted on making any rulings? What happens to the cases currently being taken on and those that are queued up? Would they all potentially be delayed an entire year+?
madaxe_again 5 days ago 1 reply      
Tangential, but if you're sat there going "Marfa, that rings a bell" (he died in Marfa) - this is probably why. Temperature inversion that reflects headlamps, apparently. https://en.m.wikipedia.org/wiki/Marfa_lights
tzs 5 days ago 2 replies      
There has been some discussion here about whether or not Obama has time to get a replacement nominated and confirmed, or whether the Republicans can slow things down enough to push it to the next President (who they hope will be one of them).

I wonder if it might be better for the Democrats to not even try to get a nominee through before the next election. From what I've seen, liberal/progressive-leaning younger voters are more likely than conservative-leaning younger voters to feel that Congress, and to a lesser extent the President, are totally beholden to corporate lobbyists and that voting doesn't make a difference, so why bother.

An open Supreme Court position at the time of the election could give Democrats something to focus their get out the vote message on (for both the Presidential election and Congressional elections) that might get through to some of those younger liberal/progressive leaning voters.

brandonmenc 5 days ago 0 replies      
Scalia's appointment was a major point of pride for us Italian-Americans. Even today, if you're wealthy with an Italian last name, people joke about mafia connections. Representation on the highest court has been a great counter to that.
throwaway43563 5 days ago 1 reply      
Why should the death of a Supreme Court Justice matter this much, let alone be a cause for celebration (as it clearly is for some in this thread)?

That unelected, life-tenured judges exercise this much authority over us, that they routinely subvert our collective democratic will as expressed through elections and referendums, that people even defend their votes for President on the basis of what kind of Supreme Court nominations they will likely make, does not suggest a healthy democracy.

nbadg 5 days ago 0 replies      
This has been confirmed by the governor of Texas.


teawithcarl 5 days ago 0 replies      
Lawrence Lessig (@Lessig) has yet to comment on Twitter - he clerked under Scalia.

Lessig also clerked under Richard Posner, considered by many to be the top legal theorist in America.

mikerichards 5 days ago 2 replies      
Considering the constant 5-4 decisions, there's 0 chance that a new justice will be confirmed until the next President takes office
lazyant 5 days ago 1 reply      
Isn't this the judge who said that torture is not "cruel and unusual punishment" because the Constitution applies to persons, and suspects conveniently labeled "enemy combatants" were not persons? Good riddance.
dredmorbius 5 days ago 0 replies      
dang 5 days ago 0 replies      
A reader emailed to suggest that we change the url from http://www.mysanantonio.com/news/us-world/article/Senior-Ass... to this more substantive obituary. That seems like a good idea. If anyone can suggest a better URL, we can change it again.
growlix 5 days ago 2 replies      
Let us honor Justice Scalia's memory by refraining from interpreting or contextualizing any prose for the remainder of the day.
orbitur 5 days ago 4 replies      
He didn't actually say that, and I'm not sure how it spread.

However, he's still a terrible person, and here's the quote yours spawned from:

There is no basis in text, tradition, or even in contemporary practice (if that were enough), for finding in the Constitution a right to demand judicial consideration of newly discovered evidence of innocence brought forward after conviction.

edit: This was originally in reply to a misquote that has now been deleted. Mods have now made it a parent comment.

dang 4 days ago 1 reply      
> I don't think you have any idea how American politics and elections actually work.

> dumb

> Have you ever met a voter other than yourself.

> naive, impractical, and uninformed

Personal abuse is not allowed in Hacker News comments. It breaks the guidelines by being uncivil and unsubstantive, and provokes reactions in kind (see below). So please edit it out of what you post here.

We detached this subthread from https://news.ycombinator.com/item?id=11096684 and marked it off-topic.

work-on-828 5 days ago 4 replies      
As someone who strongly supports marriage equality and read the full opinion when it came out, I was kinda dismayed to find the dissent a more convincing argument. Did anyone else experience this?
mcherm 5 days ago 0 replies      
I was trying to figure out how to respond to this. The best I could come up with was this: Mr. Rogers would be disappointed in you.
notthegov 5 days ago 1 reply      
I can't escape the thought of him driving to work and getting all his news only from conservative talk radio. And then declaring he literally believes in Satan.

I'm for religious freedom, but a Supreme Court judge who literally believes in evil and refuses to read news contrary to his political beliefs seems dangerous.

ck2 5 days ago 1 reply      
Now we get to see the first ever anonymous hold put on a nomination for the supreme court.

It's the only way they can try to run out the clock on Obama.

Or come up with some kind of fake moral outrage.

Can you imagine Trump picking the next three Supreme Court judges? It would set back America for decades.

work-on-828 5 days ago 2 replies      
Is it bad that my third thought was about how this will hurt Hillary's chances?

EDIT: It directly counters the argument that Bernie supporters are naive to risk a more leftist candidate who would have a lower chance of being able to appoint justices.

jdminhbg 5 days ago 1 reply      
> Off-Topic: Most stories about politics, or crime, or sports, unless they're evidence of some interesting new phenomenon. Videos of pratfalls or disasters, or cute animal pictures. If they'd cover it on TV news, it's probably off-topic.
Graphing your friends' sleep with hidden FB API defaultnamehere.tumblr.com
485 points by adamch  4 hours ago   73 comments top 23
ceocoder 1 hour ago 0 replies      
My favorite part -

> This friend recommended nvd3.js, presumably because youre not making real graphs in 2016 unless your graphing library is <something>.js and requires at LEAST one other <something else>.js as a dependency. Everyone looks at you like what, you DONT already use <something else>.js? Jeez say goodbye to your Hacker News karma. Just apt-get install npm && npm install bower && bower install- NO STOP IT THIS ISNT WHAT TIM BERNERS-LEE WANTED.

edit: as huckyaus mentioned in a different thread, author did http://swagify.net/ as well. In completely unrelated news, I'm changing my handle to [Tr1Ck$h0t][LEGIT][60x7]$$$C30C0DER$$$, that will make me really popular among the cool kids.

542458 4 hours ago 10 replies      
Some people might take issue with it, but the writing for this had me in stitches. I very much agree with the author on graphing libraries - there are a few good simple ones, but as soon as you want anything unusual you have to jump to these big, hard to configure monstrosities. More than once I've just given up and written my own server-side generator.
BinaryIdiot 1 hour ago 3 replies      
> If you reload the page youll see approximately fifty-bajillion network requests go off as Facebook desperately tries to load all the junk that it needs to display facebook.com.

I like this part. As a developer I've often looked at the network usage of large websites / web applications and it's always surprising to me just how...unoptimized it is as far as network connections go.

I mean, Facebook loads decently enough and all; I'm just surprised the first load isn't condensed into a small handful of network calls to save on latency.

drdiablo 7 minutes ago 0 replies      
Nice work! I really like the idea that the web allows anyone to programmatically dig into the UI and extract data to do things. A friend and I actually made a whole API to interact with FB chat. You should check it out: https://github.com/Schmavery/facebook-chat-api. I'd really love to see what you can come up with, with some of the stuff we support.
jonesb6 2 hours ago 1 reply      
"If you, I dunno, didn't have a lot of friends in high school, you might recognise that as a UNIX time stamp - the time in seconds since midnight, January 1, 1970."

Great article. And a further reminder why Facebook kinda sucks.
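
Since the values Facebook sends back are plain UNIX timestamps, turning one into something readable is a one-liner on the JVM. A small sketch (the timestamp below is an arbitrary example, not a value from the article):

```kotlin
import java.time.Instant
import java.time.ZoneOffset
import java.time.format.DateTimeFormatter

// Convert a UNIX timestamp (seconds since midnight UTC, January 1, 1970)
// into a human-readable UTC date-time string.
fun fromUnix(seconds: Long): String =
    Instant.ofEpochSecond(seconds)
        .atZone(ZoneOffset.UTC)
        .format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"))

fun main() {
    println(fromUnix(1455843723L))  // 2016-02-19 01:02:03
}
```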

spydum 2 hours ago 1 reply      
I like to do this sort of web spelunking all the time.. But the writing and humor really make this more enjoyable than it should be! Of course Facebook leaks info to you about your friends - that is the sole attraction for people to use it! Seems like you could turn this thing into a browser extension as well if you felt daring.. Like some sort of FB snooper.
xiphias 2 hours ago 0 replies      
So to make it useful, it just needs to find the pairs of people who "go to sleep" at the same time.
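
That pairing step is simple enough to sketch: given each friend's inferred sleep-onset time, compare all pairs and keep those within some tolerance. A hypothetical illustration (the names, times, and 15-minute tolerance are all made up):

```kotlin
import kotlin.math.abs

// Given each friend's inferred "went offline for the night" time (UNIX
// seconds), return the pairs whose sleep onsets fall within a tolerance.
fun sleepPairs(
    onsets: Map<String, Long>,
    toleranceSeconds: Long = 15 * 60,
): List<Pair<String, String>> {
    val names = onsets.keys.sorted()
    val pairs = mutableListOf<Pair<String, String>>()
    for (i in names.indices) {
        for (j in i + 1 until names.size) {
            if (abs(onsets.getValue(names[i]) - onsets.getValue(names[j])) <= toleranceSeconds) {
                pairs.add(names[i] to names[j])
            }
        }
    }
    return pairs
}

fun main() {
    val demo = mapOf("alice" to 1455843600L, "bob" to 1455843900L, "carol" to 1455850000L)
    println(sleepPairs(demo))  // only alice and bob fall within 15 minutes
}
```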
bijection 21 minutes ago 0 replies      
Antimatter15 has a pretty cool clock style visualization of this from 2012 [1]

[1] https://antimatter15.com/project/facebook-clock/

buremba 2 hours ago 0 replies      
It would be really creepy if someone did the same thing for WhatsApp; you could predict who's talking to whom even better than on Facebook. It's a bit harder to collect data from web.whatsapp.com because it uses WebSockets, but let me know if someone develops such a tool and publishes it on GitHub. :)
a_bonobo 3 hours ago 1 reply      
>Similarly, I'm not sure why there are these weird spikes every three minutes (+- ~1 minute) sometimes.

Could these just be keep-alive requests? For example, the mobile app checks whether it's still connected?

anaphor 3 hours ago 0 replies      
I did the same thing with the XMPP interface before they scrapped it, and it was obviously much easier...also I used the built-in graphing that's in Racket to visualize it. Also, I made a thing to do desktop notifications whenever someone came online, which is actually kinda useful.
Wingman4l7 1 hour ago 0 replies      
Reminds me of the old-school user tracker (whose name escapes me) that would give you a bar graph of your friend's online/offline presence when AOL Instant Messenger was the dominant chat client.
jacalata 1 hour ago 2 replies      
As someone who is not personally humiliated by my interest in computers/tech/programming, I wasn't really entertained by the constant "oh yea lol it's cause I'm a waste of oxygen that I know that, don't you hate me as much as I hate myself?" Maybe I know too many nerds with actual self esteem issues to find it funny.
teen 41 minutes ago 0 replies      
You're really funny! Great post. Highcharts.js is the easiest js library to make quick charts btw.
gengkev 2 hours ago 4 replies      
I don't have a Facebook account, but is there really no way to not share your available status to your friends? In Gmail you can simply sign out of Hangouts.

On a side note,

> If you're wondering why the response starts with for (;;);, it's to, among other things, encourage developers to use a quality JSON decoder, instead of like, y'know, eval().

This is wrong, as I commented on the linked StackOverflow post, perhaps a bit too strongly. But it's really frustrating to see that people have misconceptions because of incorrect answers on StackOverflow.

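Whatever the real rationale, any client that legitimately fetches one of these guarded responses has to strip the unparseable prefix before decoding. A minimal sketch (the guard string is from the article; the rest is illustrative):

```kotlin
// Responses guarded against cross-site script inclusion ("JSON hijacking")
// start with a prefix like "for (;;);" that makes them useless when pulled
// in via a <script> tag. A legitimate client, which sees the raw response
// body, just strips the guard before JSON-decoding the remainder.
fun stripJsonGuard(body: String, guard: String = "for (;;);"): String =
    if (body.startsWith(guard)) body.substring(guard.length) else body

fun main() {
    println(stripJsonGuard("""for (;;);{"ok":true}"""))  // {"ok":true}
}
```
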
davidwparker 2 hours ago 0 replies      
Nice little investigating!

Personally, I have chat off all the time on FB, and I don't have the Messenger (or FB) app on my phone either, so I guess I'm always sleeping :)

Buetol 3 hours ago 2 replies      
Small tip: If you don't want to be tracked, you can also turn off the chat.
Matiss 3 hours ago 0 replies      
This is awesome! Thank you for sharing the code for this. Overall I would say that this could be very entertaining to watch over multiple sites. Potentially gathering a good profile of your friends over time!
obelisk_ 3 hours ago 1 reply      
inb4 facebook resolves this issue by banning anyone who's connected 24/7. (that wouldn't solve the problem either way, btw -- a small group of people could conspire to pull this data at irregular intervals and then share the data with one-another to get a more complete picture while still staying reasonably undetectable if done right.)
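
The irregular-interval idea is easy to sketch: each collector waits a base interval plus random jitter instead of polling on a fixed, detectable schedule. A hypothetical illustration (the base interval and jitter values are made up):

```kotlin
import kotlin.random.Random

// Pick the next polling delay as a base interval plus random jitter, so the
// request pattern has no fixed period for an anomaly detector to latch onto.
fun nextDelaySeconds(
    baseSeconds: Int = 600,
    jitterSeconds: Int = 300,
    rng: Random = Random.Default,
): Int = baseSeconds + rng.nextInt(-jitterSeconds, jitterSeconds + 1)

fun main() {
    repeat(3) { println(nextDelaySeconds()) }  // three values between 300 and 900
}
```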
yojoma 1 hour ago 0 replies      
This was hilarious and really cool!
AznHisoka 4 hours ago 4 replies      
Does this work for anyone, even if you're not their friend?
gohrt 3 hours ago 1 reply      
What determines whether the app is online? What happens when the user is using the phone but FB is in the background? Does FB get some kind of update when the user is active on the device? Or do OP's friends live in the FB app all day long?
beatpanda 3 hours ago 1 reply      
seriously though how are you not already using D3
Paul Graham on Doing Things Right by Accident themacro.com
344 points by runesoerensen  1 day ago   79 comments top 18
peterclary 18 hours ago 3 replies      
Off-topic but: Hooray for transcripts! I can see the benefits of being able to hear the pauses, tone, inflections, etc. but even leaving aside the deaf and hearing-impaired, there are so many advantages to having the text (it's searchable, you can read it while listening to music, it's quicker to read it than to listen to it).
aresant 1 day ago 1 reply      
"Plus, then, I had given this talk about the Harvard Computer Society, and I said, 'If you want to raise money, raise money from people who made the money doing startups. And then, they can give you advice, too.' And I suddenly noticed, they were all looking at me. And I had this horrifying vision of them all e-mailing me their business plans. Which is funny, because that's what YC turned into."

That gave me chills.

The entire startup experience, the essence of being an entrepreneur for me, is in that moment when your brain subconsciously processes all the data around a problem and throws out something obvious and audacious in the same breath. And before you can consciously object, BAM, you have said it out loud and the adventure begins.

Terr_ 1 day ago 3 replies      
This reminds me of a bit from "The Dilbert Future" (1997):

> Most people won't admit how they got their current jobs unless you push them up against a built-in wall unit and punch them in the stomach until they spill their drink and start yelling, "I'LL NEVER INVITE YOU TO ONE OF MY PARTIES AGAIN, YOU DRUNKEN FOOL!"

> I think the reason these annoying people won't tell me how they got their jobs is because they are embarrassed to admit luck was involved.

> I can't blame them. Typically the pre-luck part of their careers involved doing something enormously pathetic. Take me, for example. I'm a successful cartoonist and author because I'm a complete failure at being an employee of the local phone company.

Outdoorsman 1 day ago 1 reply      
>>Paul : It is actually a trick for interviews. If someone asks you a boring question, just answer the interesting one they might have asked, and nobody complains.<<

That's classic...

An example of how to actually live "your" own life in this world...not paying a great deal of attention to uninteresting things that others bring up; rather molding those same things so that they become interesting, and illuminate parts of your life and the lives of others...

In my estimation, my life is what it is--one I'm very happy with--because of my having just that attitude...

And, yes, I totally agree:

>>Paul : When I was a kid at Christmas, the Sears Catalog was your reference work.<<

pkrumins 1 day ago 5 replies      
That reminds me of this amazing talk called Why Greatness Cannot be Planned:


TLDV: Innovation is not driven by narrowly focused heroic effort. We'd be wiser and the outcomes would be better if instead we whole-heartedly embraced serendipitous discovery and playful creativity. We can potentially achieve more by following a non-objective yet still principled path, after throwing off the shackles of objectives, metrics, and mandated outcomes.

This also matches my experience 100%. All my best discoveries are accidental.

notahacker 16 hours ago 0 replies      
Reading about how his original motivations were purely pecuniary, he was keen to sell his company as quickly as possible and his ambition was to go back to essay-writing and hobbyist programming, I can't help thinking young PG probably wouldn't get accepted into YC these days...
bootload 23 hours ago 1 reply      
"Early-stage startups are just fast-moving chaos. That is a constant. That was true in Henry Ford's day, it was true when we started YC."

Interesting quote. There must be some organisation to early startups, otherwise they wouldn't work. Is the chaos just a description of what cannot be observed and described?

Fantastic read. Liked the bit about straw-drawing to talk to customers.

S4M 14 hours ago 1 reply      
I find it interesting that PG says a startup either makes its founders rich or goes down, and basically discards the possibility that it ends up earning just enough to pay its founders a good salary. I think that can be a pretty nice outcome, but then again YC must select founders who are very ambitious.
ArkyBeagle 10 hours ago 0 replies      
It's amazing the lengths to which people will go to avoid Windows programming. I know I have :)
alanwatts 14 hours ago 0 replies      
"Superior work has the quality of an accident"

-Alan Watts, The Way of Zen

z3t4 20 hours ago 0 replies      
I love listening to (success) stories. You should make more of those.
EGreg 10 hours ago 0 replies      
"e-commerce was not my life's work. I didn't actually want to spend my life working on this. I did it to get money and make that money."

Aaron : Yeah. This is such an interesting thing because it's so opposite from what you tell people a lot of the time, what YC tells people, certainly, of "Don't do things just because there's a business there," right?

This is it, right here. If you want to know what most successful businesspeople have in common - not the unicorns - it's that they were prepared to sell their first venture and/or give away a lot of equity to the right people to make it work. Once they have the money, connections and track record, they can have much more control in their next company.

We went a completely different route. We've tried to change the world... :)

lsniddy 10 hours ago 0 replies      
luck = opportunity + preparedness
Kenji 16 hours ago 3 replies      
I can't believe one of the founders was doing this alongside grad school and PG was like yeah you know, you have a lot of spare time there.

And here I've got so much work from full-time studying that I can barely finish reading a single book on the side. How can people say university is enjoyable, fun, lots of spare time? For me it's just endless hard work and barely any breaks in between.

vidoc 21 hours ago 3 replies      
Full of himself!
JohnD19 21 hours ago 3 replies      
I stopped reading this "treatise" when in the first paragraph Aaron spoke highly of the war criminal Donald Rumsfeld.
darshanp 11 hours ago 0 replies      
One of the best things I've read in a long time. Paul Graham is God!
Kotlin 1.0 Released: Pragmatic Language for JVM and Android jetbrains.com
375 points by belovrv  3 days ago   110 comments top 16
norswap 3 days ago 3 replies      
I've been using Kotlin for work (I'm a PhD student, so work is to be understood in a peculiar fashion) and I'm really liking it.

It's a much needed upgrade to Java whose fundamental advantage is what I'd call "crispness": you can write terse code that is actually understandable. In particular, the notion of inlining closures allows for efficient functional programming and great DSLs.
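
To make the inlining point concrete, here's a small sketch of my own (not from the parent comment, purely illustrative): an `inline` higher-order function whose lambda is compiled directly into the call site, so the functional style costs no closure allocation.

```kotlin
// An inline higher-order function: the compiler copies both this body and
// the lambda into the call site, so no Function object is allocated.
inline fun <T> logged(label: String, block: () -> T): T {
    println("start: $label")
    val result = block()
    println("done: $label")
    return result
}

fun main() {
    val sum = logged("sum") { (1..10).sum() }
    println(sum)  // 55
}
```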

In comparison to Scala, Kotlin is much less advanced, but much, much simpler. As I said earlier, it's easily as terse as Scala. Scala, however, is much more expressive (what I'd give for typeclasses...) and that missing expressiveness sometimes hurts in Kotlin as it does in Java.

There's also Ceylon (a JVM language by Red Hat), which is seemingly in the same niche and which I very much want to experiment with. From what I've read so far, Ceylon seems conceptually more elegant and more powerful than Kotlin. The open question is whether it can match Kotlin's crispness.

It feels a bit early for a release, however. The compiler is still completely wonky. Inference sometimes fails on things that should be completely trivial. It is particularly bad at taking return types into account. It's fully usable, but expect to work your way around some linguistic unpleasantness (or outright bugs).

Also, for some reason, Kotlin seems big in Japan.

mahmoudimus 3 days ago 1 reply      
Congratulations to Andrey and the rest of the JetBrains team. This really has been a long time coming. I've been following and experimenting with Kotlin now for the past year or so and it definitely solves a lot of Java's pain points without having to reinvent the way one thinks about programming languages.

The interoperability and backward compatibility are among the most useful features, meaning I don't have to wait for an entire ecosystem to develop around it.

The tooling is also incredible and will only get better.

Concurrency support with Quasar and Comsat makes it relatively effortless without reinventing new libraries.

Overall, Kotlin, IMHO, is one of those items in the "Pros" section for any shop seriously evaluating a JVM stack. Especially with Java 9 on its way, it's only going to get better.

winterbe 3 days ago 1 reply      
Kotlin is currently my favorite alternative JVM language. Kudos to all collaborators for the great work. After 5 years of hard work, you finally made it!

Here's a little starter project for Kotlin webapps using Spring Boot and React.js, I made a while ago:


zmmmmm 3 days ago 4 replies      
Kotlin sort of feels like Groovy "done properly" (with no disrespect to the authors of Groovy). Groovy kind of evolved in an opportunistic, unplanned manner and ended up with millions of features, cool whiz-bang aspects that look awesome that don't