hacker news with inline top comments · 19 Jan 2016 · Best
Dear GitHub github.com
1669 points by msvan · 458 comments · top 85
1
jonobacon 4 days ago 15 replies      
Hi Adam, Addy, Andreas, Ariya, Forbes, James, Henry, John-David, Juriy, Ken, Nicholas, Pascal, Sam, Sindre,

My name is Jono and I started as Director of Community back in November at GitHub. Obviously I am pretty new at GitHub, but I thought I would weigh in.

Firstly, thanks for your feedback. I think it is essential that GitHub always has a good sense of not just what works well for our users, but also where the pain points are. Constructive criticism is an important part of doing great work. I appreciate how specific and detailed you were in your feedback. Getting a good sense of specific problems provides a more fruitful beginning to a conversation than "it suxx0rs", so I appreciate that.

I am still figuring out how GitHub fits together as an organization but I am happy to take a look into these issues and ensure they are considered in how future work is planned. We have a growing product team at GitHub that I know is passionate about solving the major pain points that rub up against our users. Obviously I can't make any firm commitments as I am not on the product team, but I can ensure the right eyeballs are on this. I also want to explore with my colleagues how we can be a little clearer about future feature and development plans to see if we can reduce some ambiguity.

As I say, I am pretty new, so I am still getting the lay of the land, but feel free to reach out to me personally if you have any further questions or concerns about this or any other issue. I am at jono@github.com.

2
deathanatos 4 days ago 0 replies      
This, I feel, is the most important bug, even though it precedes the list:

> We've gone through the only support channel that you have given us either to receive an empty response or even no response at all. We have no visibility into what has happened with our requests, or whether GitHub is working on them.

I'd like to call out that the GitHub user @isaacs maintains an unofficial repository[1] where the issues are "Issues for GitHub". It's not much more than a token of goodwill from a user to open a place like that to organize bugs (GitHub: you are lucky you have such a userbase!), but it's the best thing I know of for "has someone else thought of this?"[2]. Many of the issues that have been filed there are excellent ideas.

[1]: https://github.com/isaacs/github

[2]: though I'd say if you also think about it, you should also go through the official channel, even if just to spam them so they know people want that feature.

3
Osiris 4 days ago 4 replies      
The author mentions that if GitHub were open source, they would implement these features themselves.

GitLab[1] is an open source repository manager that supports local installs as well as public hosting at gitlab.com. If the author appreciates open source, perhaps they should put their efforts into improving an existing open source option rather than relying on a proprietary solution.

[1] https://gitlab.com/gitlab-org/gitlab-ce/tree/master

4
jballanc 4 days ago 5 replies      
It's 2016, and GitHub is stagnant.

GitHub used to bill itself as "Social Coding", but the "Network" graph has not seen ANY updates since its original introduction in April of 2008. Issues has seen very few updates. Even the OSS projects that GitHub uses internally have grown stagnant as GitHub runs on private, internal forks and maintainership passes to non-GitHub-employed individuals (e.g. https://github.com/resque/resque/issues/1372).

The word "Social" no longer appears on GitHub's landing page. They're chasing some other goal...whatever it is.

5
zzzeek 4 days ago 8 replies      
We need world-class, modern, distributed bug tracking now. If you google around for this technology, you'll find a lot of nice ideas, many using git itself as the transport, that were poking around until about 2009, when they started falling silent. Why? Because GitHub started up and everyone buzzed over to it like so many moths to a flame, having learned nothing from places like SourceForge about what happens when 90% of the open source world entrusts its issue trackers (really a huge part of a project's documentation) to a for-profit, closed source platform that does not provide very good interoperability.

If GitHub is kicking back and sitting on its huge valuation, then it's time to pick up this work again. If issue tracking and code review were based on a common, distributed system like git itself, then all these companies could compete evenly on features and UX on top of that system, without any one of them having the advantage of locking in users with extremely high migration costs.

6
monkmartinez 4 days ago 9 replies      
I do not operate a popular OSS project, but I have experienced the +1 spam and it sucks. The suggestions, in my opinion, seem rational.

Interesting side note: with the exception of Selenium, most of the signatories are maintainers of JS/HTML OSS projects. I wonder if we could objectively compare JS to <lang> projects in terms of the problems mentioned in the document. For example, perhaps there is a stronger correlation between +1'ers and JS repos than Python ones, or vice versa. Perhaps we could walk away with "JS devs are more chatty than C++ developers when discussing issues"... I don't know, just a thought.

7
Permit 4 days ago 9 replies      
This first request is the antithesis of GitHub's simple approach:

> Issues are often filed missing crucial information like reproduction steps or version tested. We'd like issues to gain custom fields, along with a mechanism (such as a mandatory issue template, perhaps powered by a newissue.md in root as a likely-simple solution) for ensuring they are filled out in every issue.

Every checkbox, text field and dropdown you add to a page adds cognitive overhead to the process, and GitHub has historically taken a pretty solid stance against this.

From "How GitHub uses GitHub to Build GitHub"[1]: http://i.imgur.com/1yJx8CG.png

There are tools like Jira and Bugzilla for people who prefer this style of issue management. I hope GitHub resists the temptation to add whatever people ask of them.

[1] http://zachholman.com/talk/how-github-uses-github-to-build-g...

8
jasode 4 days ago 1 reply      
The bullet points of complaints feel like a continuation of Linus Torvalds's refusal of GitHub pull requests in May 2012.[1]

Taken all together, it seems like GitHub is on a path of alienating its most valuable members. GitHub was unresponsive to Linus's feature requests, and it turns out that theme continues almost four years later.

If GitHub plans to evolve into a full-featured ALM[2] like MS Team Foundation or JIRA, instead of being relegated to being just a "dumb" disk backup node for repositories, they have to get these UI workflow issues fixed.

[1]https://github.com/torvalds/linux/pull/17#issuecomment-56546...

[2]https://en.wikipedia.org/wiki/Application_lifecycle_manageme...

9
bsder 4 days ago 1 reply      
Distributed revision control users whining about a centralized repository lacking features.

Ummm ... anybody getting the irony here?

And, from a GitHub business perspective, why do I hear Lily Tomlin: "We don't care. We don't have to."

Everybody anointed GitHub as "the chosen one" over strenuous objections from some of us that creating another monopoly for open source projects is a bad idea.

Pardon me for enjoying some Schadenfreude now that GitHub leveraged the open-source adoption into corporate contracts and now doesn't have to give two shits about open source folks.

Lily Tomlin's Phone Company sketch: https://www.youtube.com/watch?v=CHgUN_95UAw

10
asb 4 days ago 5 replies      
There's been no mention of Phabricator yet, so I thought I'd give it a shout-out. It's used by LLVM, FreeBSD, Blender, Wikimedia and others, and I love it. It's under very active development, and even if it doesn't solve every issue in this letter, by using an open source tool for development you of course have the option to customize it to the needs of your community.
11
marknutter 4 days ago 0 replies      
Is this a case of the squeakiest wheels getting the grease? What if these problems aren't representative of the overall user base? What if far more people prefer a simple, minimalistic interface to an ultra-customizable interface with myriad custom actions and events? I've always appreciated software that deliberately keeps things simple (Basecamp and Workflowy come to mind). It sounds like these people want a full-blown Jira/Stash installation.
12
AndyKelley 4 days ago 1 reply      
I don't like the general feel of these suggestions. It sounds like more bureaucratic features, the lack of which is a big part of why GitHub is so pleasant.

Making an issue or a pull request feels like having a casual chat with the project maintainers. Adding fields and other hoops to jump through puts distance between people.

13
carapace 4 days ago 1 reply      
Wow, what a bunch of whiners. If you hate github so much why don't you just fork it and fix-- Oh, right. It's not open source.

Well, there's your problem right there.

(I have sooooo much more in this vein but I'll spare you. ;-)

EDIT: No I won't. Fuck it. This is too ridiculous.

These guys (and they are all guys) chained themselves to github's metaphorical car and now they're complaining that the ride is too bumpy and the wind is a little much.

Don't whine about not getting to sit inside the car! Unchain yourself and go catch one of the cars where the doors are unlocked and open and the driver and other passengers are beckoning you to join them. (Apologies for the mangled metaphor.)

These folks come off to me like masochistic babies.

14
beshrkayali 4 days ago 3 replies      
I do like GitHub, and I understand how it makes the entire process of maintaining a code repo a lot easier, but what I'd genuinely like to know is why big projects don't just move to their own thing. I understand that there isn't a single solution that exactly matches what GitHub has, and that maintaining your own git server + git management/issues/etc. app is a pain, but I see it as the only real solution. Developing in the open can't be done on platforms where restrictions apply, and they do apply. I'm saying this with no intention of sounding like a jerk, but 18 project maintainers and/or developers need to write an open letter to get GitHub to give 'em a "me too" button? I understand the issue, but I still find it rather silly.

The only aspect where I could see GitHub having an edge is its community of developers, but does that really matter so much? Especially for established/big projects that probably don't care about fork/star counts, or about the random passers-by who look around.

15
notabot 4 days ago 1 reply      
My company pays me to work on a fairly old-school free software project and we run our own git service. Our workflow is email based so we won't ever consider switching to GitHub.

That said, we do sometimes consider setting up an official mirror on GitHub. Ideology aside (some team members might think we shouldn't promote a proprietary solution for a free software project), the main thing that puts us off is that there is no way to disable pull requests. Closing all pull requests by hand is not appealing; leaving all pull requests open is not desirable. We could probably write a bot to close pull requests, but that is just yet another administrative burden.
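For what it's worth, the bot described here is small against the GitHub v3 REST API. A minimal sketch in Python, using only the standard library; the repository name, upstream URL, and token variable are placeholders for illustration, not anything from this project:

```python
import json
import os
import urllib.request

API = "https://api.github.com"
REPO = "example-org/project-mirror"  # hypothetical read-only mirror

def close_message(upstream_url):
    """Comment body explaining why the pull request is being closed."""
    return ("Thanks for the patch! This repository is a read-only mirror; "
            "please send contributions to " + upstream_url + " instead.")

def _call(method, url, token, payload=None):
    """Minimal authenticated request against the GitHub v3 API."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        url, data=data, method=method,
        headers={"Authorization": "token " + token,
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def close_open_pulls(token, upstream_url):
    """Comment on, then close, every open pull request on the mirror."""
    for pr in _call("GET", f"{API}/repos/{REPO}/pulls?state=open", token):
        n = pr["number"]
        # Pull requests are issues in the v3 API, so comments go to /issues/.
        _call("POST", f"{API}/repos/{REPO}/issues/{n}/comments", token,
              {"body": close_message(upstream_url)})
        _call("PATCH", f"{API}/repos/{REPO}/pulls/{n}", token,
              {"state": "closed"})

if __name__ == "__main__":
    close_open_pulls(os.environ["GITHUB_TOKEN"],
                     "https://example.org/project/how-to-contribute")
```

Run from cron or a webhook handler, this keeps the mirror's PR queue empty while still pointing contributors at the real channel, though it remains the administrative burden the comment describes.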

Not sure if GitHub will ever consider allowing users to disable pull requests though. That seems to go against GitHub's core interest.

16
anarchy8 4 days ago 12 replies      
I feel like there is a great opportunity right now for anyone to make a Github replacement. Sounds like a lot of these features are sorely needed at the moment. Why has Github been complacent?
17
arasmussen 4 days ago 0 replies      
I work on a very relevant project called Product Pains.

React Native, the open source project, is using Product Pains instead of GitHub issues for bug reports and feature requests. This is because there were thousands of open issues and, just as this document mentions, it's impossible to organize them. The comments are all "+1" and it's really hard to tell what's important and what's just noise.

If you take a look at https://productpains.com/product/react-native?tab=top you'll see the power of being able to vote on these issues.

So why's Product Pains relevant?

1. It's a temporary alternative to GitHub issues. I'm guessing GitHub will get to adding votes eventually. If you want to use Product Pains for organizing issues for your open source project, go for it. I'll even give it away to you for free.

2. It's a community dedicated to improving products. This document is chock-full of great, constructive, actionable feedback. Product Pains is a community built for posting exactly this. You can post feedback publicly, about any product, people can vote on it, and posts with a lot of votes create a social responsibility for the company to respond.

3. It's a way for your voice to be heard. Posting on Hacker News lasts a day and will get your voice heard. If you post actionable, constructive feedback on Product Pains, and 150 people vote on it, it lingers waiting for GitHub to do something about it. Around 600 users on Product Pains are also React Native developers. They'd probably be ecstatic to vote on constructive feedback for GitHub.

For example, go make an account and vote here: https://productpains.com/post/github/implement-voting-for-is...

18
mplewis 4 days ago 4 replies      
Bitbucket kills GitHub issues with these two features:

- Multiple assignees for an issue
- An "Approve" button so that maintainers can stamp a PR with the seal of approval

19
neilgrey 4 days ago 2 replies      
Yup, I love GH, use it every day, but issue management is the pits.

It'd be really nice if I could custom sort the queue of issues so that I know what's next up in my queue of things to do; right now I've got 5 tags called NextUp:1 -> NextUp:5 on each repo; this takes way more manual updating than a simple drag/drop widget.

Like they mentioned, having a voting system would be super useful for knowing what matters -- I cringe every time I leave a +1, so I've gotten into the habit of at least adding a comment after it --- but the premise and the pain are the same.

20
rmchugh 4 days ago 0 replies      
A shame that GitHub aren't more responsive to the community that enables their success when they make such a big deal of their openness. It is also our own fault that we have allowed ourselves to become dependent on a single provider of a relatively simple service.

That said, I'm extremely grateful to the platform for enabling collaboration on open source and to the company for its work on Git, Resque etc.

GitHub's strategy is to open source everything except the business critical stuff, but it seems to me that their business is in enterprise support rather than in actual software. Perhaps they should just open source the whole platform and count on their service business being enough to carry the company?

21
jondubois 4 days ago 4 replies      
I like GitHub issues as they are. I wouldn't like to force people to adhere to a particular format when reporting problems.

I find it strange that some project maintainers get annoyed when people use the issues section to post questions. What's wrong with that? A question can reveal design failures in your software... Maybe if your software were better designed, people wouldn't be asking the question to begin with.

I do think there should be a +1/like button though.

22
athenot 4 days ago 1 reply      
I have mixed feelings about these requests. Yes, it would be nice to have these extra features in GitHub. Its issue handling has always been a bit light on the workflow side, but IMHO it has made up for it with a pleasant way to organize conversation around issues. The simple and smooth UX is part of what makes GitHub so great.

At the opposite end of the spectrum, there's the Bitbucket+Jira combo. It is customizable to a PM's heart's content, and in the process can become a mess of a tool.

23
aaron695 4 days ago 0 replies      
After the whole incident where they deleted forks of a project without notice or apology, based on their belief about which words are and are not appropriate to use in code, I think we really need to reassess GitHub in general.

Their 'control' of code and lack of respect for the people running projects is very disappointing, and they seem to not want to move forward on these issues.

I'm surprised the open community is allowing this de facto ownership of the world's code, and of how it's written, to take place; I'm not so sure they are a benevolent dictator.

https://www.techdirt.com/articles/20150802/20330431831/githu...

24
guessmyname 4 days ago 2 replies      
Interesting petition, and I agree with it; but I wonder why all the projects mentioned in the _Signed by_ section are based on JavaScript. I know there are other languages involved in some of those projects, like C++ and Java in Selenium and PhantomJS, but this specific thing in the document makes me believe that JavaScript developers _(at least the ones using GitHub)_ are more prone to complain than other types of developers.
25
pvorb 4 days ago 2 replies      
The problem is that GitHub has a monopoly and is considered _the_ current standard for open source. But I think that once some of the major projects move to alternatives like GitLab (which has many of the features described in that letter), GitHub will have to obey its user base. Unfortunately, no open source project with a large user base will dare to take the first step.
26
duncan_bayne 4 days ago 0 replies      
"It's the world's tiniest open source violin"

https://xkcd.com/743/

27
rlaferla 4 days ago 0 replies      
GitHub needs two major features: 1. Discussion groups for users vs. devs, since people currently use issues for that. 2. A searchable "license" attribute for all projects, with standard license templates for MIT/Apache/GPL/etc. When looking for source code, you need to consider the platform, language and license.
28
sqs 4 days ago 1 reply      
At Sourcegraph, we're trying to help solve these problems for developers everywhere (https://sourcegraph.com), both in open source and inside companies. GitHub's commercial success and contributions to the world of development are impressive (and I'm speaking as a GitHub user of 8 years), but they can't build everything developers need on their own.

We're really pumped about improving dev team collaboration in the GitHub ecosystem by (soon) letting anyone use Sourcegraph.com's code intelligence (semantic search/browsing), improved pull requests, flexible issue tracking with emoji reactions instead of +1s (example: https://src.sourcegraph.com/sourcegraph/.tracker/151), etc., all on their existing GitHub.com repositories.

All of Sourcegraph's source code is public and hackable at https://src.sourcegraph.com/sourcegraph, so it can grow over time to solve the changing needs of these projects. (It's licensed as Fair Source (https://fair.io), not closed source like GitHub, nor open source.)

Email me (sqs@sourcegraph.com) if you're interested in beta-testing this on your GitHub.com repositories.

29
fphilipe 4 days ago 1 reply      
My biggest gripe with GitHub has been the notification system. Personally I can't use the web UI for notifications because they bundle multiple notifications per issue. This leads to potentially missed notifications since it is up to me to scan the issue/PR for new comments.

My workaround has been to use email notifications exclusively. I have a Gmail filter that applies a label to all notifications and skips the inbox. Then in my mail client I have a smart mailbox that only shows me unread notifications with that label (or that folder, from an IMAP perspective). The smart mailbox shows me a counter of unread notifications. This way I don't overlook comments when multiple ones are made in a PR.

Problem 1: No context in these notifications. It would be nice if these emails could show the code in question for diff comments, or the entire comment thread.

Problem 2: Now what is really bad with these notification emails is that the link "view it on GitHub" sometimes no longer links to the comment I'm being notified of. This happens when the comment was made on a PR on a line of the diff that no longer exists, as sometimes is the case when new commits are pushed. I then have to go to the main PR page, expand all collapsed "foo commented on an outdated diff" comments and manually search for the comment in order to get the context and be able to reply.

By fixing problem 1, problem 2 would be automatically fixed with it and make my workflow much more productive. Is there anyone else annoyed by this?

30
felhr 4 days ago 2 replies      
I just created and maintain a little Android library (a very rewarding experience, by the way), so most of the complaints about GitHub don't really apply to me because of the size and reach of my project (I understand the point perfectly, though).

But I read some complaints about the users and the issues they tend to open, and I fully agree. Such users are a minority, but I can only imagine what people with bigger projects have to deal with. This is what I've found:

- People with little to zero experience in the language/framework who simply state that my project doesn't work, without providing more information, and who sometimes don't reply to my "give me more info" inquiries.

- Guys who just want to get their homework done and are basically trying to get it done by using me as an unpaid freelancer.

- And my favourite one: a junior dev at a company, who needs to get his work done under even more pressure than the previous case and becomes so anxious about his problems that I can feel it even via email. Eventually he got the thing done, but he noticed I had changed the build system to JitPack for better dependency handling, and he started to complain about man-in-the-middle attacks on his company and black-hat hackers replacing my lib with a malicious one (I guess it could happen, but come on).

But it is a very rewarding experience, these anecdotal cases aside.

31
aesthetics1 4 days ago 2 replies      
Each and every suggestion is a sane and much needed improvement.
32
vmarsy 4 days ago 1 reply      
A lot of these points are fair and interesting, but I fail to grasp some of them, especially this one:

> Ability to block users from an organization.

What does blocking users mean? Blocking them from commenting/making PRs/cloning?

Why block a whole organization from an open source project? What would prevent such users from using a personal account to do whatever their organization counterpart is blocked from, anyway?

33
runn1ng 4 days ago 2 replies      
All these problems seem to me like good problems to have.

They all seem to stem from the fact that GitHub is too successful: too many people are on it, and too many people are using it, often in the wrong ways.

Of course GitHub should solve them all. But it's still better to have problems with too many people and too much interest than the opposite problem: a dying platform that people are leaving (see: SourceForge and Google Code).

34
dragonsh 4 days ago 0 replies      
Look at Kallithea SCM at http://kallithea-scm.org/; we have used it, and in most cases it works well. It also supports both git and mercurial. Python should have learned a lesson before deciding to move their repositories to a closed source system like GitHub. But obviously, just as people use Facebook, developers use GitHub for the same reason: the network effect.
35
some-guy 4 days ago 0 replies      
I work at a large company with a central GitHub Enterprise instance, and we use GitHub as a code-reviewing and code-hosting platform. Everything else (including build-automation) is integrated through web-hooks to Atlassian tools for many of the reasons noted in this letter. It works for us, but I am hopeful that GitHub will listen and maybe someday we can have everything on there.
36
mpdehaan2 4 days ago 0 replies      
While I don't maintain Ansible anymore, +9 billion on this. GitHub is hard at scale.

GitHub is fantastic because everyone is on it, but the issue system has not improved since inception, and I feel the UI changes have actually been a step back.

We had to implement our own bot to comment on tickets that did not appear to follow a template, and I would have given a kingdom for a template system that let people filter their own tickets into bugs, feature requests, or doc items.
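A bot of the kind described here can be sketched in a few lines of Python against GitHub's "issues" webhook payload. The required section headings, token handling, and messaging below are illustrative assumptions, not Ansible's actual setup:

```python
import json
import os
import urllib.request

# Hypothetical headings a project's issue template asks filers to include.
REQUIRED = ["Summary", "Steps to Reproduce", "Version"]

def missing_sections(body):
    """Return the required template headings absent from an issue body."""
    return [h for h in REQUIRED if h.lower() not in (body or "").lower()]

def handle_issue_event(event, token):
    """Process one GitHub 'issues' webhook payload; nudge if the template was ignored."""
    if event.get("action") != "opened":
        return None
    missing = missing_sections(event["issue"]["body"])
    if not missing:
        return None
    comment = ("Thanks for filing! This issue appears to be missing: "
               + ", ".join(missing)
               + ". Please edit it to follow the issue template so we can triage it.")
    req = urllib.request.Request(
        event["issue"]["comments_url"],          # e.g. .../issues/42/comments
        data=json.dumps({"body": comment}).encode(),
        headers={"Authorization": "token " + os.environ.get("GITHUB_TOKEN", token),
                 "Content-Type": "application/json"})
    urllib.request.urlopen(req)                  # POST the nudge comment
    return comment
```

Wired up behind any small HTTP endpoint registered as a repository webhook, this does the filtering the comment wished for: tickets that skip the template get an immediate, polite, automated reply instead of consuming maintainer time.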

We also had a repo of common replies we copied and pasted manually (this because there was so much traffic that my replying too quickly would likely tick someone off; but this too could have been mostly eliminated with a good template system). Having this built in (maybe a web extension could have done it) would also have been helpful.

So many hours lost that could have gone to features or bugfixes; and by many, I mean weeks in total, if not cumulative months.

GitHub does the world a great service, and I love it, but this would help tons.

I always got a response when I filed a ticket - ALWAYS - but a lot of them were in the "we'll take that under consideration" type vein.

I feel opening GitHub RFEs up to votes is probably not the answer for serving the maintainer side of the equation, since users outnumber maintainers, but these things need to be done and would greatly improve OSS just by expediting velocity.

If you don't use the GitHub tracker you lose out on a lot of useful tickets. However, if you use it, you are pretty much using the most unsophisticated tracker out there.

It's good because there's a low barrier to entry, but just having a template system - a very very very basic one, would do wonders.

A final idea is that GitHub really should have a mailing list or discussion system. Google Groups sucks for moderation, and I THINK you could probably make something awesome. Think about how Trac and the Wiki were integrated, for instance, and how you could automatically hyperlink between threads and tickets. The reason I say this is often GitHub creates a "throw code at project" methodology, which is bound to upset both contributor and maintainer - when often a "how should I do this" discussion first saves work. Yet joining a Google Group is a lot of commitment for people, and they probably don't want the email. Something to think about, perhaps.

Also think about Stack Overflow. It's kind of a wasteland of questions, but if there were a users-helping-users type area, it would reduce tickets that were not really bugs but requests for help. These take time to triage, and "please instead ask over here and join this list" causes people pain.

I love all the work to keep up site reliability, maybe I'd appreciate more/better analytics, but I totally say this wearing a GitHub octocat shirt at the moment.

37
thockingoog 4 days ago 0 replies      
I could rant for hours about all the things GitHub doesn't do (or does wrong) for "real" software development.

+1 from the Kubernetes project

38
chippy 4 days ago 0 replies      
There are three groups of GitHub users, and this article is about the issues faced by the first: big open source projects (a small number of accounts).

The main bread and butter of GitHub comes from private or organizational projects, which do not have these issues.

The majority of accounts on GitHub belong to folks like the majority of HN readers (developers, coders, hackers), who do not have these issues.

So all these complaints are in a sense not applicable to the vast majority of GitHub's revenue-generating customers, nor to the vast majority of GitHub users.

39
bad_user 3 days ago 0 replies      
While I like to bitch and moan about stuff myself, I don't really agree with the first point.

What I like about GitHub's issue tracking is that (compared with alternatives such as Redmine or Jira) it is free-form. It doesn't force users to fill in information such as steps to reproduce, and I don't think it should, because the needs of every project are slightly different. Consider how different the "steps to reproduce" are for a web user interface versus the usage of some library. Yes, it can be painful for an issue to not provide all the information required, but on the other hand GitHub does a better job than the alternatives at fostering conversations and keeping people in the loop. I've even seen projects use GitHub issues as some sort of mailing list.

On the second point, I do agree that GitHub needs a voting system for issues. Given that GitHub has long turned into some sort of social network, adding a voting system for issues is a no-brainer. But then a voting system doesn't address the problem of people getting frustrated about issues taking too long to get fixed. +1's are annoying, but sometimes that's a feature and I've been on both sides of the barricade.

40
john2x 4 days ago 0 replies      
I wish GitHub would add a "Discussions" tab for repos, so projects don't need to create a separate Google Group (which requires a Google account!) for questions-that-are-not-quite-issues.
41
teen 4 days ago 0 replies      
I actually disagree with some of these suggestions; I find that the simplicity of GitHub issues is what makes it so great. I think this should be solved with 3rd party tools, such as waffle.io.
42
rahelzer 4 days ago 5 replies      
Do the undersigned send any money to github? It might be better to phrase your demand in the form of a question, "how much can we pay you to do this work for us?"
43
Karunamon 4 days ago 0 replies      
Most of this stuff seems pretty common sense and reasonable. I really only have a couple of objections:

* Issue templating.

It's one thing to prefill the entry box, it's quite another to add fields that everyone must fill out. I quite like that filling out something on Github is totally the opposite of filling out something on Jira.

* Issues and pull requests are often created without any adherence to the CONTRIBUTING.md contribution guidelines

This is a people problem that has plagued open source from day one. You cannot engineer your way around it in a manner that doesn't annoy your contributors.

There was a blurb in here about getting rid of the big green "new pull request" button, but that was when this link went to a Google doc. Good: if someone doesn't want to take PRs, then they have almost no reason to be on GitHub in the first place. Put another way, it's the mark of someone who wants a repo as a signpost of sorts without actually interacting with its community.

44
iwwr 4 days ago 3 replies      
Is there an issue tracking system out there that works on top of the Github issue system?
45
ssmoot 4 days ago 0 replies      
+1 to the notification spam. Being @sam on github sucks sometimes. And as far as I can figure out there's no way to set watching/following/notifications to opt-in only.

So every time someone who knows a "Sam" uses @sam incorrectly in an issue, I get notified and have to unsubscribe, ignore it, and leave a polite message to let them know they're doing it wrong.

It's really lame that they've never fixed this.

46
zAy0LfpBZLC8mAC 4 days ago 0 replies      
What I don't get is why people building free software even consider forcing their users, and in particular their contributors, to use proprietary development tools such as GitHub. (Or, for that matter, excluding from contributing to their projects people who only use free software.)

Next, we'll see public complaints to Microsoft because MS Word doesn't properly support the way they want to maintain their project's documentation?

I mean, sure, feel free to complain all you like, but how is this not exactly what was to be expected from the beginning, and why do you expect them to care in the future, given that you just seem to have realized that they didn't care in the past, for obvious reasons, and given that their incentives haven't changed, and there is no reason for them to change in the future?

47
its2complicated 4 days ago 0 replies      
I think if these people have that many issues with GitHub, they should find a replacement. That's what happened in the Node community and it led to a better Node. That's a big list of complaints and GitHub doesn't have much incentive to fix 'em except to silence a bunch of cry babies that are bitching about a free tool.
48
orf 4 days ago 0 replies      
I've felt the same way. The worst bit is notifications: say I get a notification that someone replied to an issue I opened. How do I get there? It's not on my notifications page; I have to go to the email and click the link from there. Things get missed.

GitHub needs to step it up. They got to the top first, but can they stay there?

49
transfire 4 days ago 0 replies      
Many times, I've asked GitHub to add icons for :test:, :doc:, :admin: and a couple others. I use them in commit messages as it helps categorize the type of commit. This has to be the easiest kind of improvement imaginable, but they have never bothered.
50
danielsamuels 4 days ago 1 reply      
I know they've only recently released new permissions for organisations, but they're still extremely lacking. As far as I can see, there's no way of setting permissions at a group level.

As an example of how this would be used, we have a Github team within our organisation which is used for non-technical people to post bugs. These people have no reason to be able to see or push code to the repository; they only need to be able to create issues. This applies to every repository in the organisation. As far as I can see, and without manually adding every single repository to the team, there's no way of setting global permissions for a team. This seems like a major oversight to me.

51
Jyaif 4 days ago 0 replies      
I suspect they are focusing on developing their enterprise offering.

Anyway, I found that http://feathub.com/ addressed my frustration about the absence of a voting system.

52
kkoch986 4 days ago 0 replies      
I would just love if they could add target="_blank" to all the links in comments and issues. I'm constantly navigating away from the issue to view links in question and then realizing the tab with the issue is gone.
53
kiloreux 4 days ago 2 replies      
My experience with Github support is terrible, if not one of the worst. I once had an issue and contacted their support, and it took them a month (literally) to respond. I was really surprised by that.
54
itomato 4 days ago 0 replies      
Why do you want GitHub to solve the (very) specific problems of issue and defect tracking?

They make a facility available as a nicety, but if your project has legitimate Global impact, you should be looking at (or bootstrapping) a counterpart.

Don't have the revenue for JIRA? Apply for the Free license.

Don't have the stomach for Bugzilla? Turn out a Node/Go alternative.

Don't have the business alignment with Clearquest or Rally? Lower your expectations to suit your Free (as in beer) SCM tool.

55
shmerl 4 days ago 0 replies      
It's also annoying that Github is sometimes missing basic features, like attachments on bug reports and comments, for instance. All mature bug trackers have such features.
56
justplay 4 days ago 0 replies      
I'm personally experiencing this issue. I wrote about this in 2014; you can check here: http://paritosh.passion8.co.in/post/96619506751/dear-github-... Although I'm addressing the problem in a different way, the issue is the same.
57
dabernathy89 3 days ago 0 replies      
People often ask why WordPress doesn't use Github for its primary development (they do have official read-only mirrors there), and it's not just because they already had an SVN-based system in place when Github came to be. It's because the tooling they already had was more sophisticated, especially regarding issues.
58
jarjoura 4 days ago 0 replies      
There are a lot of great feature requests for issues at the bottom of the document. I'm not sure why the document highlights only 3 things above the signatures.

Yet, I 100% agree with them. I do not understand why Github issues are so basic. The only feature I feel was added in all of 2015 was making the logging of every metadata change extremely verbose (read: maybe too noisy now?!).

"Person assigned to the issue"

"Person added label"

"Person removed label"

59
cbr 4 days ago 0 replies      

 We'd like issues to gain a first-class voting system, and for content-less comments like "+1" or ":+1:" or "me too" to trigger a warning and instructions on how to use the voting mechanism.
Why bother users with a warning? Turn it into a vote, and then highlight the vote icon so you can see what happened.

60
technion 4 days ago 1 reply      

 Don't make it so easy to submit bad PRs
I recurrently refer to this[0] PR, and the subsequent discussion, as the reason why, if any project of mine gets any bigger - it will not be accepting Github pull requests.

[0] https://github.com/technion/maia_mailguard/pull/42

61
ajsharp 4 days ago 2 replies      
I get that these are super frustrating issues for these people (cough guys) that maintain these repos, but there's something telling about it that it's all JS people. That last cute lil paragraph really sums it up for me:

> Hopefully none of these are a surprise to you as we've told you them before. We've waited years now for progress on any of them. If GitHub were open source itself, we would be implementing these things ourselves as a community - we're very good at that!

LOL. I can't tell if this is "go-fuck-yourself"-level passive aggression, or mindless hopefulness that there might actually be a universe in which Github (or a company like it, with hundreds of millions of dollars of venture funding) could be open source. If I worked at Github, my first thought after reading this would be "mmmmm yeeeeaaaaaaa y'can g'fuck yr'self", while the second thought would be "yea, you're not wrong". Generally, passive aggression gets you nowhere when you're asking for something from someone/something who owes you nothing (I know, I know, they "owe" their customers everything).

The Node/React/JS community is hilariously entitled, petulant and childish. The tone of this whole letter is so god damned millennial, it's mind-boggling, because they're not wrong about anything they're asking for. But it's how they ask for it that leaves a dry, acid-y taste in your mouth.

62
pekk 4 days ago 0 replies      
GitHub also does not allow deletion of bullshit issues.
63
billconan 4 days ago 0 replies      
One annoying issue I found with Github is that it doesn't provide a discussion board. A lot of times I have a question to ask; it doesn't mean I found a bug or anything that needs progress tracking, but I still have to go through "Github issues".
64
qaq 4 days ago 0 replies      
To what degree does a company have to not give a f$#% when maintainers of the largest projects on its platform can't get any feedback (compounded by the fact that some of those maintainers are very prominent employees of Github's largest paying customers)?
65
vjeux 4 days ago 0 replies      
A proposal for a better way to deal with github issues: a discussion tab. https://github.com/dear-github/dear-github/issues/44
66
danpalmer 4 days ago 1 reply      
I'd settle for just a fix to the (minor) data-loss bug that I reported nearly a year ago, and which still crops up once a month or so.

That, and something for code review. Pull Requests are terrible for code review, and it wouldn't take that much to make them so much better.

67
yingbo 2 days ago 0 replies      
Funny. It's like "Hi, you are rich. We like you. You should donate your money".

There are tradeoffs, so pick services you like.

68
kmfrk 4 days ago 1 reply      
Being a maintainer on a project with some minor community on GitHub is such a garbage experience.

It's pretty neat as a general user, but at least you get the impression with BitBucket that they prioritize productivity and project management. And the task system hasn't received any significant updates since its inception - which is a shame, because tasks are an awesome invention; they just happen to be implemented awfully as issues.

I also remember that we recently had to move the entire decision-making process to Slack instead, where I suggested we just use the emoji voting system to make our decisions.

What really gets to me is how adamantly GitHub has ignored all the people who've gone on about this forever. Last time they seemed to care marginally was when jacobian finally managed to twist their arm and get them to implement the Close Issue feature, because one repo issue was a radioactive pit of abuse and invective.

69
nikolay 4 days ago 1 reply      
GitHub does certain things very well, others - not so much. I really think the best way to get them to focus is to start contributing massively to GitLab.

Anyway, implementing just voting won't be such a good idea in the time of Emoji Reactions!

70
Zikes 4 days ago 1 reply      
Issue spam (in the literal sense) definitely needs to be addressed as well: https://pbs.twimg.com/media/CYI31g1UQAUXQbs.png
71
peterfpf 4 days ago 0 replies      
This resonated so much with me

PS, it was moved to https://github.com/dear-github/dear-github

72
jessaustin 4 days ago 1 reply      
73
alexchamberlain 4 days ago 0 replies      
Totally agree with what has been said. However, I find it interesting that most of the signatories displayed were for JS projects.
74
petke 4 days ago 0 replies      
I guess I don't know how to use GitHub. You can send bug reports on a project, but how do you send questions or ask for advice?
75
edem 4 days ago 0 replies      
gog.com has a great mechanic that might work here, called a community wishlist [1], where people can submit games they wish to see and vote on them, and eventually things get done when possible.

[1] http://www.gog.com/wishlist

76
bl4ckdu5t 3 days ago 0 replies      
I've never seen any issues spammed with +1s like the TravisCI request for Bitbucket support
77
ChuckMcM 4 days ago 0 replies      
I am interested in understanding how much recurring revenue Github is receiving for hosting these projects.
78
lifeisstillgood 4 days ago 0 replies      
Can anyone post a précis/examples - apparently I do not have rights to see any spreadsheets at all.
79
thewhitetulip 4 days ago 0 replies      
We maybe need a "hotness" feature for bugs - an "affects me too" button that would prioritize issues out of a bucket load of issues. Plus, on GitHub you first raise an issue and only then is it sorted into feature request or bug; that could be made better.
80
jp_sc 4 days ago 0 replies      
Classic JavaScripters reaction: throw more tooling at it
81
dang 4 days ago 0 replies      
82
johnlbevan2 4 days ago 1 reply      
83
spellboots 4 days ago 1 reply      
:+1:
84
wanstrocity 4 days ago 0 replies      
Chris Wanstrocity is an inept leader, social activists roam the halls in self glory about their contributions to the world while Kakul spends money on retreats and hires senior product people who have zero open source or dev ops experience. This company needs intensive care with new leadership asap or they will be doomed, Gitlab is salivating right now.
85
xpaulbettsx 4 days ago 6 replies      
While I applaud the initiative, it's also a pretty strong indictment of the JavaScript / node.js community that there is not even a single non-male OSS maintainer on this list of important JS projects.

What is being done in the JS community by those who lead it to make progress on this and who is leading that charge? If the answer is "Nobody", why is that true?

The resolution of the Bitcoin experiment medium.com
948 points by tptacek   ago   406 comments top 63
1
Nursie 4 days ago 4 replies      
I've enjoyed watching the Bitcoin experiment. It's been enlightening in a variety of ways, from the technical to the social.

I've been fascinated and appalled in equal measure at the fanboy community, at the intolerance of criticism that sprang up very quickly, and how strong feelings ran (likely because of financial investment in the tech).

It's also been interesting watching it go from simple CPU mining, to multiple GPU rigs in dorm rooms, all the way through FPGA and then to massive installations of custom ASIC miners.

But I've always hoped it wouldn't go mainstream for two reasons - limited supply with weighting in favour of early adopters, and the massive electricity costs of the 'mining' and transaction validation process. Scalable, competitive proof of work systems for a widespread currency are an ecological disaster in the making, and deflationary currency with a handful of early users controlling a huge proportion of the total currency supply... these aren't "features".

I'll be very interested in what happens next, and for the reasons given I hope it's not just a BTC clone with better governance.

2
Aqueous 4 days ago 9 replies      
Doesn't this all seem a bit dramatic? Throwing away the baby with the bathwater? Cryptocurrency is still in its infancy, and BitCoin is the first cryptocurrency. It's a little early to declare the project a "failure," throw up your hands and walk out of the room, just because a few people haven't acknowledged the urgency of a (very fixable) technical problem.

People disagree about the reality of global warming. Does that mean we throw out the entire system of laws of the United States and other world powers because it hasn't yet addressed this problem?

This is exactly why I never bought the concept of BitCoin as a 'libertarian' currency. There's always politics, there's always governance. It becomes political as soon as more than one person is involved. And as soon as it's political, institutions, processes, procedures, and laws become necessary - also known as "government."

I still believe in BitCoin, however. Ultimately, there's a way out of this tangle, and like with most political problems, it's a political solution. BitCoin will either adapt and scale up or stay the same and scale (way) down.

3
jhulla 4 days ago 6 replies      
Great summary of the current state of bitcoin.

In the conclusion he states: "Even if a new team was built to replace Bitcoin Core, the problem of mining power being concentrated behind the Great Firewall would remain."

Bitcoin's decentralized nature encourages power pool formation by promoting economies of scale. It is not surprising that like the production of electronics, clothing, toys, etc. the lowest cost center is in China.

4
namecast 4 days ago 2 replies      
Mike's resolution was apparently to join up with a consortium of banks back in November:

http://bravenewcoin.com/news/30-top-banks-and-mike-hearn-hav...

Read the article, he was clearly laying the groundwork for this move back in Thanksgiving.

"The current Bitcoin system, I mean the system we actually use today with the block chain, isn't going to change the world at all due to the 1mb limit. So if I have a choice between helping the existing financial system build something better than what they have today that resembles Bitcoin, or helping the Bitcoin community build something worse than what they have today that resembles banking, then I may as well go where the users are and work with the banks."

5
chippy 4 days ago 0 replies      
What I take from this (and I think I really should invest my own time in researching more on) is psychology.

People want to protect their investments. But because we are talking about money, don't confuse this for meaning that the investments are just about money.

Investments in code contributions, investments in all the articles read, investments in community, friends, social networking, investments in belief systems, investment in the justification for choosing one thing rather than another.

It's simply not consistent to say "oh you only have 20BTC, so you've nothing to lose" or "oh, you made no code contributions, so why are you complaining" as both ignore the potential for massive psychological and personal investments.

All these investments act as a barrier to change. It hurts, it hurts physically to lose big investments.

There is a cost-benefit analysis that humans perform internally: is the hurt of losing this investment now worse than the pain of keeping the investment later?

If we go back to the article, we see Mike repeatedly tell us that Bitcoin is an experiment. He is saying to us now "look, don't invest your time, effort and money into it" - and he is telling himself "I have made the change, I have accepted a loss by investing so much of my time and effort into this, and am moving on".

6
kiwidrew 4 days ago 1 reply      
I have always felt that Bitcoin was doomed to this sort of fate, for the simple fact that the "specification" for the Bitcoin protocol was "whatever Satoshi's client does". The community repeatedly failed to encourage a diversity of implementations, and as a result they effectively ceded control of Bitcoin to the maintainers of the one and only implementation.

By the time independent implementations did begin to develop, it was too late to introduce diversity into the ecosystem.

The result is what we are now seeing.

7
jegoodwin3 4 days ago 1 reply      
So this is a political problem -- here's my prediction of what will happen (given that we live in the real, political, world).

A political entity -- not necessarily a sovereign government, but perhaps a bank or financial institution -- will offer a currency swap to existing blockchain holders to adopt their crypto currency. The inducement will be a limited time window to put in your claim, with all unclaimed but mined numbers going to the financial entity to reward their followers or stakeholders.

In the real world, this is called escheat and it is a power of the crown. Bitcoin is essentially a system for recording deeds to digital land. They aren't making more numbers, so the problem is the political resolution of competing claims to the same resource. This sort of claim comes, in the end, to a network consensus of who is the sovereign.

8
bsder 4 days ago 1 reply      
The problem is incentivizing people to "mine". This effectively created a pyramid scheme where the "first in" benefit from the "later in" spending money.

Bitcoin will be more interesting to me once the mining pool is exhausted. At that point, we'll see how much of Bitcoin's value is in use instead of speculation.

9
nickbauman 4 days ago 11 replies      
The folly of BitCoin is to believe that technical problems are somehow orthogonal to social problems. They never are. And never has this been more true in the history of our species than with the technology of money. Money is an artifact of the state; always has been, always will be.
10
magicmu 4 days ago 1 reply      
I love this summary; it really touches on all the core issues. However, even though bitcoin has such severe flaws, I think that cryptocurrency in general is still really promising. I've been particularly excited about ethereum for the past two years or so, and progress on that project definitely seems to be happening (although it's not a currency per se). The concept of the block chain still has massive potential.
11
paulsutter 4 days ago 1 reply      
It's no surprise that mining power is concentrated [1]. Activities across populations tend to have a power-law distribution [2].

[1] https://www.blocktrail.com/BTC (scroll to "Pool Distribution"; today more than half the mining capacity is in two pools)

[2] https://en.wikipedia.org/wiki/Rank-size_distribution which says:

"The rank-size rule (or law), describes the remarkable regularity in many phenomena, including the distribution of city sizes, the sizes of businesses, the sizes of particles (such as sand), the lengths of rivers, the frequencies of word usage, and wealth among individuals. All are real-world observations that follow power laws"

12
sail 4 days ago 6 replies      
This may be naive, but why can't Bitcoin fork? Forks have historically produced better results.

Why would switching to a cryptocurrency that is better designed be a bad thing?

13
tptacek 4 days ago 0 replies      
This is Mike Hearn.

http://plan99.net/~mike/

14
brownbat 4 days ago 1 reply      
What if the pro-fork community wanted to buy their way to the raised thresholds?

So we're at what, 0.9 Exahash?[1]

Say you want to force the change. You'd need to add three times that, or 2.7 EHash/s.[1]

Let's say you buy a ton of AntMiners to cover that, at 3.3 GHash/s/$.[2]

So that's a paltry, what... $820 Million?

Less if you just buy the factory in Shenzhen.

Basically just one winning Powerball ticket though.

[1] http://bitcoin.sipa.be/

[2] https://en.bitcoin.it/wiki/Mining_hardware_comparison#cite_n...

Caveat emptor: my ability to eyeball math in the peta-exa-yotta range is spotty at best. These results may be off by a factor of... any factor.
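The arithmetic above can be sanity-checked in a few lines, taking the comment's own assumed figures (0.9 EH/s network hashrate, AntMiners at 3.3 GH/s per dollar) at face value:

```python
# Sanity check of the back-of-envelope math above; all figures are the
# comment's assumptions, not current data.
network_ehs = 0.9                     # assumed network hashrate, EH/s
added_ehs = 3 * network_ehs           # add 3x the network: 2.7 EH/s
share = added_ehs / (network_ehs + added_ehs)
print(f"share of new total: {share:.0%}")           # 75% of the combined hashrate

ghs_per_dollar = 3.3                  # assumed AntMiner efficiency, GH/s per $
cost = added_ehs * 1e9 / ghs_per_dollar             # 1 EH/s = 1e9 GH/s
print(f"hardware cost: ${cost / 1e6:.0f} million")  # roughly $818 million
```

Which lands within rounding of the comment's "$820 Million", so the eyeball math checks out.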

15
wmf 4 days ago 1 reply      
16
msvan 4 days ago 3 replies      
Look at the people who derive their influence from the Bitcoin ecosystem. Do you agree with their views? Do you find them generally friendly? Are you happy giving them more power by buying Bitcoin? This was what made me stay out of Bitcoin.
17
puranjay 4 days ago 3 replies      
Bitcoin was always going to fail because even as a technically competent guy, I could never understand, nor be bothered to understand how the damn thing worked.

There was no way in hell a normal, non-tech guy could ever understand it enough to use it everyday.

This was a case of tech folks missing the woods for the trees. Even this article will go way over the heads of 99% of people on the planet.

Why would you go through all that pain when cash is everywhere, easy to access, and easy to understand?

18
pbreit 4 days ago 6 replies      
If bitcoin is such a disaster why is BTC price holding firm or even increasing?
19
ComNik 4 days ago 0 replies      
Just when I was halfway through explaining byzantine fault tolerance to my dad.
20
kelvin0 4 days ago 0 replies      
If I were so inclined, I would infer that sabotage is taking place. Taken from the sabotage expert's manual (page 6):

"Simple sabotage is more than malicious mischief, and it should always consist of acts whose results will be detrimental to the materials and man-power of the enemy."

https://www.cia.gov/news-information/featured-story-archive/...

21
foota 4 days ago 1 reply      
Disclaimer: I know very little about the bitcoin community.

It seems like the quotation, "One of the great things about Bitcoin is its lack of democracy" is grossly out of context. In the original comment, by the person that @octskyward is talking about, it seems to be referring to the fact that it is not a majority rules democracy.

edit: grammar

22
Siecje 4 days ago 3 replies      
> Allowed buyers to take back payments they'd made after walking out of shops, by simply pressing a button (if you aren't aware of this feature that's because Bitcoin was only just changed to allow it)

What "feature" was recently added? This has always been a problem with BTC.

23
xg15 3 days ago 0 replies      
I have no experience with Bitcoin or its community - so apologies if the following is naive - but some parts of this article make no sense for me.

For events to have taken place like described in the article, several parties would have been required to work together with the common goal of keeping the blocksize restriction in place:

The Chinese miners who hold the majority of the hashing power, the developers of Bitcoin Core, the admins of bitcoin.org and the as of now unidentified operators of the DDOS attacks.

If you assume it's not a conspiracy, then each party must have reasons why such a decision would be desirable. But as the author describes it, there are no reasons. This goal wouldn't just push Bitcoin into a questionable direction, it would be downright suicidal: Over time, Bitcoin would become unusable for any kind of transaction. Not even the greedy miners could want a cryptocurrency that no one uses.

So I think there has to be some upside to the blocksize restriction. If anyone has more info on that, I'd be happy to know.

24
Lazare 4 days ago 0 replies      
Fascinating. I don't even like Bitcoin, and this managed to make me feel kind of sad.
25
oafitupa 4 days ago 0 replies      
> Still, all is not yet lost. Despite everything that has happened, in the past few weeks more members of the community have started picking things up from where I am putting them down. Where making an alternative to Core was once seen as renegade, there are now two more forks vying for attention (Bitcoin Classic and Bitcoin Unlimited). So far they've hit the same problems as XT but it's possible a fresh set of faces could find a way to make progress.
26
zekevermillion 4 days ago 0 replies      
The network is being used to capacity and fees are being determined by a market, which values capacity at an increasing premium! What a failure! (sarcasm intended)

Hearn's post may be technically accurate in terms of the data he's collected. But the conclusions he draws are not correct. Usually in any entrepreneurial project, the fact that the service is over-subscribed and increasingly valuable is a sign of success. If one views bitcoin as an open-source project which should have some ideal technical implementation, then yes Hearn has failed to convince everyone to run his preferred implementation of bitcoin, or to agree on exclusively running a different protocol that is not bitcoin while calling it bitcoin.

There is plenty of room for Hearn to run a bitcoinXT altcoin. The only failure here is one of logic, forced by the concept that there can be only one successful network based on the Nakamoto consensus protocol, and that that network must either be bitcoin or a replacement which supplants the original.

27
nodamage 3 days ago 0 replies      
I am curious if the DDoS attacks mentioned in the article are in any way correlated to Linode's recent downtime (https://news.ycombinator.com/item?id=10806686). The timing fits...
28
tim333 4 days ago 0 replies      
I think the doom and gloom is overdone because the "handful of guys sitting on a single stage" who control 95% of hashing power have a massive incentive not to destroy their main asset.
29
mirimir 4 days ago 1 reply      
OK, if Bitcoin has just about failed, what's the best alternative cryptocurrency?
30
andy_ppp 4 days ago 0 replies      
If true, then I hope the next XXXcoin proof-of-work function is more geared toward something inherently useful, like protein folding, or a more general-purpose problem like simulating cell interactions.
31
LukeHoersten 4 days ago 3 replies      
Cross post from the NYT article comment:

In open source communities, impactful contributions yield influence. Here are the top 100 bitcoin contributors: https://github.com/bitcoin/bitcoin/graphs/contributors

Guys like Wladimir, Pieter, Gavin, Cory Fields, Gregory Maxwell and Luke Jr have a voice because they've contributed many thousands of lines of code. (Lines of code are only a proxy for impact.)

You may have noticed Mike Hearn isn't in the top 100 contributors list. He is the primary author of the Java implementation of a bitcoin library: https://github.com/bitcoinj/bitcoinj

He started it in 2011, definitely early. But a substantial amount of bitcoin core work had already set the path. There are also similar implementations in many different languages but they are not the primary reference implementation for full nodes.

According to Hearn's blog post: "I've talked about Bitcoin on Sky TV and BBC News. I have been repeatedly cited by the Economist as a Bitcoin expert and prominent developer. I have explained Bitcoin to the SEC, to bankers and to ordinary people I met at cafes."

Being cited by journalists is not the same as being a primary contributor.

The disagreement between Hearn and the other developers isn't about whether to increase capacity, it's about how. Many of the primary full-node contributors believe a hard blockchain fork is a risky approach. Lots of work is being done to explore better options, like segregated witness (http://gavinandresen.ninja/segregated-witness-is-cool).

Mike Hearn tried to (very aggressively) push the idea of increasing the block size with a hard fork. In fact the patch allows node operators to vote, and 75% adoption is needed. When it looked like that wasn't going to pan out, he created Bitcoin XT, where "Decisions are made through agreement between Mike and Gavin, with Mike making the final call if a serious dispute were to arise."

So the claim that Hearn's inability to take over decision-making power for the bitcoin community is evidence that bitcoin has failed in fact shows something slightly different. It shows that the open source software methodology of forking and adoption lets the best implementation win and prevents hostile takeovers.

Mike Hearn is not as impactful to bitcoin development as he or recent news would indicate. Mike leaving the bitcoin community has little impact on the future success or failure of bitcoin.

32
cookiecaper 4 days ago 1 reply      
Pretty good article, and a great overview of the blocksize controversy.

Also a great exhibit of some stereotypical programmer social problems; we don't get a lot of middle ground, most programmers are either openly hostile and combative or so deathly afraid of confrontation and responsibility that they give away their authority so that they don't have to handle the pressure. Gavin should've kept control. Bitcoin is learning exactly why a strong central authority is so desirable in money exchange: it keeps the value of the currency stable by preventing panic and confusion over issues like this.

Many discussions of Bitcoin claim that this power is transferred to "the network" to make the final decision, which sounds very egalitarian and democratic, but Bitcoin failed to provide the controls that would prevent power hoarding and ensure that the people who depend on bitcoin were fairly represented. This is one reason why modern democracies are structured within the framework of a republic. This is probably one of bitcoin's hardest-to-solve problems, since the hardware to get respectable hashrates is unobtainable for quite literally everyone who doesn't have access to their own microfabrication facilities. Even if one of the specialty bitcoin hardware makers had a really good, cheap chip, why would they share it? They'd hoard all the hashpower for themselves. Litecoin attempts to address this by hashing with scrypt, under the belief that it's harder to hoard power with custom hardware if the algorithm uses a lot of processing power and memory instead of just a lot of processing power.

Mike failed to mention one incentive that exists to prevent increasing the block size: miners get the transaction fees attached to each block they mine. If the block size is large, there is little contention for space in the blocks, and ergo there is not much reason to incentivize miners to include your transaction in the next block. By keeping the artificial constraint on the block size, people who own a lot of hash power will be gaining a lot more bitcoin for themselves.
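That fee dynamic can be sketched with a toy model (made-up fee numbers, not real mempool mechanics): miners fill scarce block space highest-fee-first, so the fee a new transaction must offer is set by the marginal transaction that just fits.

```python
# Toy model of fee contention under a block size cap; the fee values are
# hypothetical and chosen purely for illustration.
pending_fees = [50, 40, 30, 20, 10, 5, 5, 1]  # fees offered by waiting transactions

def cutoff_fee(fees, capacity):
    """Lowest fee that still gets into the block; 0 if there is spare room."""
    if capacity >= len(fees):
        return 0  # no contention: inclusion is guaranteed, no bidding pressure
    return sorted(fees, reverse=True)[capacity - 1]

print(cutoff_fee(pending_fees, 4))  # constrained block: must offer ~20 to compete
print(cutoff_fee(pending_fees, 8))  # roomy block: cutoff drops to 0
```

In this sketch, doubling the capacity collapses the cutoff from 20 to 0, which is the miners' fee-revenue incentive the comment describes.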

I don't think this crisis is insurmountable. So much money has been sunk into bitcoin that I can't believe people are just going to let this cabal take it out. BitcoinXT will gain notoriety through the mainstream press and the resultant sell off among casual investors will freak the big players out and force them into running XT nodes.

33
Rauchg 4 days ago 1 reply      
"Why has the capacity limit not been raised? Because the block chain is controlled by Chinese miners"

The author is complaining that Bitcoin is working exactly as designed.

From Satoshi's paper: "Proof-of-work is essentially one-CPU-one-vote. The majority decision is represented by the longest chain, which has the greatest proof-of-work effort invested in it."

If you want to raise the block size, out-vote the Chinese miners.
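The quoted rule can be sketched in a few lines (a minimal illustration, not Bitcoin's actual implementation): nodes adopt whichever chain embodies the most cumulative proof-of-work, which is what "longest chain" loosely refers to.

```python
# Minimal sketch of Nakamoto's fork-choice rule: pick the chain with the
# greatest total proof-of-work effort invested in it.
def best_chain(chains):
    # each chain is a list of blocks; each block records the work behind it
    return max(chains, key=lambda chain: sum(block["work"] for block in chain))

honest = [{"work": 10}, {"work": 10}, {"work": 10}]   # total work: 30
minority = [{"work": 10}, {"work": 12}]               # fewer blocks, total 22
assert best_chain([honest, minority]) is honest
```

This is why "out-voting" the majority of hashpower is the only way to move the rules: whoever contributes the most work decides which chain the network converges on.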

34
LifeQuestioner 4 days ago 0 replies      
"I have not failed. I've just found 10,000 ways that won't work."

- Gotta say, even if Bitcoin "fails" I don't feel it will be a "failure". The community and people have learnt so much. Bitcoin became big, and it's the first cryptocurrency to reach this level.

To have made a perfect system would be unrealistic really.

35
jMyles 4 days ago 0 replies      
It's been a good run with a lot of interesting observations. Upcoming iterations will likely solve a lot of the (truly compelling) problems that bitcoin failed to solve.

This "people problem," as Mike calls it, is undoubtedly a result of the mechanics of the blockchain. Slightly different rules may lead to dramatically different (and less insurmountable) people problems.

36
Procrastes 4 days ago 0 replies      
Bitcoin was always going to fail one way or the other. I expected it to be power usage, but I should have anticipated bog-standard Open Source infighting. I'm hoping the idea of a decentralized electronic ledger and programmable currency doesn't die with it.
37
_superposition_ 4 days ago 0 replies      
It's all fun, games, and ideology. That is, until money is involved. What a shame.
38
pc2g4d 3 days ago 0 replies      
I'm inclined to blame the author's negativity on his obvious conflict of interest.

At least, my understanding is that he recently began working for a group that competes with Bitcoin in trying to connect traditional banks and (non-Bitcoin) blockchain technology. He has an incentive to scare people away from Bitcoin.

Not saying there are no scaling/social/whatever issues in Bitcoin. But the author seems to be conflicted.

Disclaimer: I own a few Bitcoins, and thus hold the opposite conflict of interest.

39
caf 4 days ago 1 reply      
With the benefit of hindsight, perhaps what should have happened is that the original 5 developers should have each maintained their own compatible fork of the codebase, cross-sharing patches they agreed on.
40
gesman 4 days ago 0 replies      
Linkbait-ish template of the article takes away from otherwise promising technical content.

Typical template for building up a hyped linkbait case: start with a couple of puffed-up credentials, pose as a well-wishing visionary, and then shed a few crocodile tears of sadness over "it failed" statements that are of course conveyed as absolute truth.

Nothing helps to share links better than controversy.

41
jokoon 3 days ago 0 replies      
I didn't really understand how the block size is an issue, or what could happen if the limit was hit.

Also, if there were two large networks separated from each other (by the Great Firewall), does that mean the ledger could split in two?

42
dschiptsov 4 days ago 0 replies      
It is a rather naive assumption that valuable assets (and bitcoin is a digital asset, not a digital currency) would not be seized by elites, because the very essence of elitism is the ability to seize and control the distribution of valuable assets, be it the Chinese government or too-big-to-fail US financial institutions.
43
iofj 4 days ago 0 replies      
Today we could redo Bitcoin as a centralized, perfectly trustworthy system:

https://en.wikipedia.org/wiki/Homomorphic_encryption#The_2nd...

Simply have 5 servers running that as a network service and you won't need proof of work, nor will you need a blockchain. Homomorphically encrypt a basic ledger with an encrypted backend, and throw away the key. Done and done.

One might say encrypting 52 integers in 36 hours is somewhat less than acceptable performance, but how does it really compare to Bitcoin in total effort? It is certainly good enough that anyone with a decent PC could run it. Hell, you might even reward them for running it, just like Bitcoin does. And it ought to be a lot cheaper to run than Bitcoin.

44
lumberjack 4 days ago 0 replies      
Of all the Bitcoin people, Mike Hearn is the guy I respect the most. He is the only one who could see the picture objectively and not get carried away by fanaticism. Even very early on, circa 2010, he was always a voice of reason amid insanity.
46
wedesoft 4 days ago 0 replies      
I wonder whether there could be a hierarchy of transactions (like credit cards, banks, and cross-bank transactions). I.e. not requiring the entirety of transactions to be present in the top blockchain.
47
nikolay 4 days ago 0 replies      
Bitcoin's potential got ruined by the Chinese speculators. In the West, on the contrary, people oversold expectations. I really want back the slow, but steady growth before the boom days!
48
LukeHoersten 4 days ago 1 reply      
In open source communities, impactful contributions yield influence. Here are the top 100 Bitcoin contributors: https://github.com/bitcoin/bitcoin/graphs/contributors

Guys like Wladimir, Pieter, Gavin, Cory Fields, Gregory Maxwell and Luke Jr have a voice because they've contributed many thousands of lines of code. (Lines of code are only a proxy for impact.)

You may have noticed Mike Hearn isn't in the top 100 contributors list. He is the primary author of the Java implementation of a Bitcoin library: https://github.com/bitcoinj/bitcoinj

He started it in 2011, definitely early. But a substantial amount of Bitcoin Core work had already set the path. There are also similar implementations in many different languages, but they are not the primary reference implementation for full nodes.

According to Hearn's blog post: "I've talked about Bitcoin on Sky TV and BBC News. I have been repeatedly cited by the Economist as a Bitcoin expert and prominent developer. I have explained Bitcoin to the SEC, to bankers and to ordinary people I met at cafes."

Being cited by journalists is not the same as being a primary contributor.

The disagreement between Hearn and the other developers isn't about whether to increase capacity, it's about how. Many of the primary full-node contributors believe a hard blockchain fork is a risky approach. Lots of work is being done to explore better options, like segregated witness (http://gavinandresen.ninja/segregated-witness-is-cool).

Mike Hearn tried to (very aggressively) push the idea of increasing the block size with a hard fork. In fact the patch allows node operators to vote, and 75% adoption is needed. When it looked like that wasn't going to pan out, he created Bitcoin XT, where "Decisions are made through agreement between Mike and Gavin, with Mike making the final call if a serious dispute were to arise."
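The 75% vote mentioned above worked roughly like this: the fork would only activate once a supermajority of recently mined blocks signalled support in their version field. This is a simplified sketch; the version values and window handling are placeholders, not the real BIP 101 encoding:

```python
# Hypothetical miner-signalling supermajority check: activate the fork
# only once at least 750 of the last 1,000 blocks advertise support in
# their version field (i.e. 75% adoption among miners).

NEW_VERSION = 7  # placeholder "I support bigger blocks" marker

def supermajority(block_versions, window=1000, threshold=750):
    recent = block_versions[-window:]
    votes = sum(1 for v in recent if v >= NEW_VERSION)
    return votes >= threshold

blocks = [4] * 240 + [NEW_VERSION] * 760  # 76% of the window signalling
print(supermajority(blocks))  # True
```

Because the vote is counted in mined blocks, it is effectively a vote weighted by hash power, which is exactly why the large mining pools held the deciding ballots.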

So the claim that Hearn's failure to take over decision-making power in the Bitcoin community is evidence that Bitcoin has failed seems to show something slightly different. It shows that the open source methodology of forking and adoption lets the best implementation win and prevents hostile takeovers.

Mike Hearn is not as impactful to bitcoin development as he or recent news would indicate. Mike leaving the bitcoin community has little impact on the future success or failure of bitcoin.

49
ss6754 4 days ago 1 reply      
What I don't understand is why hard fork is bad or why people oppose it? Isn't competition a good thing?
50
tempodox 4 days ago 0 replies      
I didn't trust BC as far as I could throw a truck. Good riddance.
51
afreak 4 days ago 6 replies      
This was posted in the duplicate thread:

================================================

There are three types of people who are into Bitcoin:

1. People who are in it out of sheer curiosity.

2. People who are in it to get rich quick.

3. People who have been scammed into it.

The people who are in it out of curiosity are the people I don't take issue with. At its beginnings, I found Bitcoin to be a curious thing because it was a novel and new idea. However, as things progressed and I learnt more about how it all worked, I saw it as a cumbersome idea that wouldn't effectively replace anything and as a result now I'd rather make jokes about it than take anything about it seriously. I've never spent more than $20 CAD on Bitcoin and I have gotten it all back for that matter too.

People who get scammed into Bitcoin typically get scammed in one of two ways: they're either coerced into using it because they've gotten something like malware on their machines (CryptoWall and its variants), or they see it as an investment alternative. The only times I've ever seen non-technical people experience Bitcoin is when I have to tell them that the malware on their computers will only release their unbacked-up data if they make a payment using the cryptocurrency. And that is really what a non-technical person's experience with Bitcoin is going to be: it's a way to pay thieves.

As for the get rich quick people, they tend to fall into the third category or they themselves are scammers.

Right now there are two forces dominating the Bitcoin community: the miners and those who are holding out on whatever magical unicorn rainbows makes the coins have value. The miners don't want to see changes to the software because it'll hurt their bottom line and the people holding and exchanging it want to see these changes so they can benefit. So as a result, Bitcoin has entered a war of attrition and is starting to show its problems. Mike Hearn's leaving is definitely a consequence of this problem.

Earlier yesterday [1], I made a quip about how it's insulting to suggest that those who are "unbanked" as a result of living at no fixed address (ie: "homeless") should eventually move on to Bitcoin as an alternative to mainstream financial institutions. It's really for the reasons that Hearn gave: would you want to wait a random period of time, ranging from maybe a few minutes to a few hours, for your transaction to go through? It's already insulting enough that they're living at the bottom of society, so why would we want them to use a bottom-tier financial system? Why not instead suggest making it easier for them to participate in mainstream banking schemes?

I anticipate based on my last remarks that the responses to this post will consist of feckless anecdotes and pointless accusations that I and others have a "problem" with Bitcoin. I guess to a certain extent the statement of me having a problem is true, but at the end of the day Hearn is right.

Bitcoin is a failure, and if you invested in it then you're getting what you deserve. If you think that it isn't a failure then you obviously didn't comprehend Hearn's writing.

[1] - https://news.ycombinator.com/item?id=10898408

52
dang 3 days ago 0 replies      
We detached this comment from https://news.ycombinator.com/item?id=10906561 and marked it off-topic.
53
dang 4 days ago 2 replies      
Personal attacks are not allowed on Hacker News. We ban accounts that do this, so please don't do this.
55
llSourcell 4 days ago 2 replies      
please stop saying 'BitCoin'. It's Bitcoin.
56
StavrosK 4 days ago 0 replies      
Would you care to actually address some of the arguments?
57
Teodolfo 4 days ago 0 replies      
I am shocked to hear that the techno-libertarian goldbuggery scheme didn't work out well.
58
ilaksh 4 days ago 0 replies      
Need something like npm. Open semantic modules, and you vote on modules with your feet. Use something like WebAssembly (which is actually an abstract syntax tree format) and you get support for many programming languages and architectures.
59
cenal 4 days ago 1 reply      
I am not surprised to see that this first attempt at radically changing how we think of currency is having trouble. I'm surprised that the Winklevoss twins bought so heavily into it. Hopefully the investors in this industry can take the reins away from China.

http://fortune.com/2015/10/05/gemini-winklevoss-bitcoin/

60
mahouse 4 days ago 2 replies      
Just another reason to isolate our Internet from China.
61
api 4 days ago 1 reply      
It seems like powerful people in the ecosystem (Chinese miners?) really do not want the block size increased.

Why would this be? Are they hoping for increasing transaction fees and therefore increasing mining profits?

In any case increasing the block size seems like a no brainer from a technical point of view, at least if your interest is in Bitcoin itself and its growth and future.

62
jacques_chester 4 days ago 0 replies      
> But Bitcoin Core is an open source project, not a company.

It has some of the shape of an unincorporated association, though. There's a lot of caselaw dealing with disputes arising in those, usually from reluctant courts that were dragged into particularly petty and poisonous disputes.

Given the amount of actual money involved, people might start asking courts to settle some of these questions before too long.

(But don't ask me to do it, I'm not a lawyer and this isn't legal advice).

63
JulianMorrison 4 days ago 0 replies      
As an outsider, the only thing I can take from this is a certain sad feeling that it's a demonstration of a pure wild west unregulated and uncontrolled economy devolving into utter tyranny. You've got your oligarchs, your civil war, your rent seeking, and maybe you'll end up in a year or two with an outright winner. At which point that winner might as well cancel bitcoin, declare themself king and issue a central coinage.

Well, that and "Bitcoin has become garbage, avoid at all costs".

Life is Short paulgraham.com
948 points by janvdberg   ago   397 comments top 102
1
dang 2 days ago 0 replies      
We've closed this thread to comments by new accounts because of trolls.

If you have a new account and want to comment here, you're welcome to email us at hn@ycombinator.com.

2
jacquesm 2 days ago 4 replies      
If you can't see past the links to the other essays here, please try to consider this one in isolation for a bit, especially if you're young and don't have kids yet. This essay is rich in realizations that come with age, and in some sense I would have gotten a lot of mileage out of it when I was 20 or so (though I would probably have lacked the background to fully comprehend it), and even more when I was a young first-time father.

One gem in here that has not been highlighted by other commenters in this thread, and that stands out for me because it's one I did figure out very early in life (and this served me very well), is this one:

"As I've written before, one byproduct of technical progress is that things we like tend to become more addictive. Which means we will increasingly have to make a conscious effort to avoid addictions: to stand outside ourselves and ask "is this how I want to be spending my time?""

Please do ask yourself that question often, and if the answer is 'no' or 'maybe' then simply don't and save yourself a lot of grief and regret in the long run.

3
numlocked 2 days ago 8 replies      
And for a completely different perspective, we have Kurt Vonnegut:

[Vonnegut tells his wife he's going out to buy an envelope] Oh, she says, well, you're not a poor man. You know, why don't you go online and buy a hundred envelopes and put them in the closet? And so I pretend not to hear her. And go out to get an envelope because I'm going to have a hell of a good time in the process of buying one envelope. I meet a lot of people. And, see some great looking babes. And a fire engine goes by. And I give them the thumbs up. And, and ask a woman what kind of dog that is. And, and I don't know. The moral of the story is, is we're here on Earth to fart around. And, of course, the computers will do us out of that. And, what the computer people don't realize, or they don't care, is we're dancing animals. You know, we love to move around. And, we're not supposed to dance at all anymore.

4
jfaucett 2 days ago 4 replies      
This article resonated a lot with me. It verbalizes what I've been trying to do over the past year or so with my own life. For me, a lot of the BS elimination on the first pass was just getting rid of distractions and interruptions, so I cut out phone/email/chat and the rest during my working hours except for certain times, like 10 minutes after lunch and just before the end of the day. The noise was driving me insane.

The next step was to remove things from my life that cause stress and are not worth the effort because of the BS they involve. Whether that was life situations or clients, it has been very refreshing.

The next step was to kill a lot of tv/movies, and most of my free time internet usage.

Finally, I started steadily filling the newly gained time with things I really care about, and my sense of well-being and accomplishment has improved drastically.

So I have no intent of doing anything other than continuing down this road. I've gotten in better physical shape and better health, enjoy life more, have learned a new language, and visited many new places, and my stress level has dropped by at least 200%. It's been a very positive journey so far.

5
japhyr 2 days ago 1 reply      
My father's death taught me to finish the things I start. He was a software engineer in the 70s and 80s, and he taught me how to program when I was a little kid. When he died in 2011 I went through his computer, looking at the projects he was still working on. It was profoundly sad to think that these projects were frozen, that no one would ever use them. The experience of looking through his unfinished projects led me to make the transition from hobbyist programmer to professional.

It was hard to stop playing with a bunch of different projects and make myself focus on one single project, but in the end it has been extremely satisfying to finish what I start. I wish my father was still around to see what I've done, but I might never have finished anything without the lesson of his passing.

6
wallflower 2 days ago 1 reply      
"You're still young and healthy. Maybe that's why you don't understand what I am saying. Let me give you an example. Once you pass a certain age, life becomes nothing more than a process of continual loss. Things that are important to your life begin to slip out of your grasp, one after another, like a comb losing teeth. And the only things that come to take their place are worthless imitations. Your physical strength, your hopes, your dreams, your ideals, your convictions, all meaning, or, then again, the people you love: one by one, they fade away. Some announce their departure before they leave, while others just disappear all of a sudden without warning one day. And once you lose them you can never get them back. Your search for replacements never goes well. It's all very painful, as painful as actually being cut with a knife. You will be turning thirty soon, Mr. Kawana, which means that, from now on, you will gradually enter that twilight portion of life; you will be getting older. You are probably beginning to grasp that painful sense that you are losing something, are you not?"

From 1Q84, Haruki Murakami

7
xenadu02 2 days ago 7 replies      
I'm a more productive engineer now that I have small kids than I ever was before. I don't have time for bullshit. I don't build my own PCs, I buy Macs. I don't waste time building some over-architected nonsense on a side project, I ship the MVP. When I do take time away from my kids I maximize it by learning three or four new technologies, patterns, or libraries at once.

When you realize how short time really is you ruthlessly cut bullshit.

8
talsraviv 2 days ago 0 replies      
> The "flow" that imaginative people love so much has a darker cousin that prevents you from pausing to savor life...

I'm glad he pointed out this seemingly small detail. This took me a very long time to understand.

EDIT: It reminds me of another great post by Paul Buccheit. It's so important to have the 'heroes' of startup culture explicitly spell out these values:

> I worry that perhaps I'm communicating the wrong priorities. Investing money, creating new products, and all the other things we do are wonderful games and can be a lot of fun, but it's important to remember that it's all just a game. What's most important is that we are good to each other, and ourselves. If we "win", but have failed to do that, then we have lost. Winning is nothing.

http://paulbuchheit.blogspot.fr/2012/03/eight-years-today.ht...

9
RKoutnik 2 days ago 2 replies      
It's said that growing up is watching your heroes become human. I'll admit pg was (and still is) one of my heroes and the prime reason I moved across the country to join a startup. While I never got into YC, my (short) life is much better for that move. Yes, the scales dropped from my eyes as I realized just how unglamorous startup life can be and the unfailable pg started to, well, fail.

On the other hand, there's something about the following sidenote that is profoundly human but works quite the opposite of the painful shock implied in my first sentence:

> I chose this example deliberately as a note to self. I get attacked a lot online. People tell the craziest lies about me. And I have so far done a pretty mediocre job of suppressing the natural human inclination to say "Hey, that's not true!"

This is almost universally true. It is incredibly reassuring to know that even the greats struggle with this and antagonists pursue us through all walks of life. I'll admit, I've held back from publishing articles that all of my reviewers liked because I worried about the inevitable negative backlash that comes with standing for anything on the internet. Maybe one day I will publish. If so, this essay helped a great deal in getting me there.

10
the_watcher 2 days ago 2 replies      
This is probably my favorite thing pg has ever written. He's right in so many ways, but my favorite is just his reminder of "don't wait."
11
jroseattle 2 days ago 0 replies      
Life seems really short as soon as kids enter the picture. The best line I ever heard about kids and life was this:

With children, the days are long but the years are short.

These days, I find myself trying to find the "work/life balance", which is really just me managing the ebb-and-flow of time between work and family. What I've learned in that process is that while work provides some satisfaction that meets an internal need, it will never ever hug me back.

Take time, hug your kids, leave your work every now and then. The years won't seem so short that way.

12
kyaghmour 1 day ago 0 replies      
The vast majority of humans on this planet do not get to choose where they spend their time; surely Paul realizes this. Instead, children making clay bricks and parents sending their children away to work as maids (or worse) do so because they need to survive first and foremost.

I have four children of my own and I'm sceptical of the proposed idea that somehow life is best spent by maximizing time with them. Don't get me wrong, the best moments in life are with my children. Still, one's contributions during our brief passage in the form we like best (walking and free thinking humans) surely should aim to contribute far more than the self-gratifying (and possibly narcissistic) time spent with one's children.

In short, if you do have the luxury of choosing where you actually spend your time, make sure you're giving far more back to the rest of this race than to yourself.

14
fizixer 2 days ago 1 reply      
Life is too short to not be involved in and/or contributing to anti-aging research in any way.

If the Wright brothers (and other flight enthusiasts around that time) had not taken the initiative, the academics, pundits, and "experts" had it settled that heavier-than-air flight was impossible.

It had to eventually happen because technology is inevitable, but we might have conquered flight in 1953 instead of 1903.

In the case of anti-aging, such a difference means you either die or barely make it past the last generation to die.

15
math 2 days ago 2 replies      
I lived in Sydney for 8 months and Brisbane for 4 years. When thinking back, both feel like a distinct part of my life to a surprisingly equal extent. Maximizing the number of distinct phases in your life seems to me to be important in making it seem longer.
16
pcote 2 days ago 1 reply      
The downside to "don't wait to do the things that matter" is what to do when you empty that bucket list early. I've covered the geeky bucket list stuff (making video games, blogging, open source contribution, IT jobs, BS in CS, etc.). I've covered the more stereotypical bucket list stuff too (skydiving, performing onstage, time with friends, writing the novel, marrying the right girl, etc.).

At this point in my life, there's nothing new that interests me that hasn't already been done. It's made life pretty boring at this juncture. I've lived out all my dreams and now it's all just like "Okay, now what?"

17
vdnkh 2 days ago 1 reply      
I'm (only) 23 and I've felt this for a while. I've been trying to leave my small no-name company for a higher calibre job in NYC for a few months now. I guess I'm lucky because I know what I want - but I need to wade through extreme amounts of bullshit to get there. I'm very efficient with my time but if I made every second worthwhile, I'd go nuts. Sometimes BS time (videogames, the pub, etc) are necessary.
18
petercooper 2 days ago 0 replies      
I think Seneca said something on the topic that meshes even better with HN and the startup way of life than this essay:

The state of all who are preoccupied is wretched, but the most wretched are those who are toiling not even at their own preoccupations but must regulate their sleep by another's, their walk by another's pace, and obey orders in those freest of all things, loving and hating. If such people want to know how short their lives are, let them reflect how small a portion is their own.

19
jondubois 2 days ago 0 replies      
I think the problem is definitely finiteness (not length).

I don't understand it when people talk about 'squeezing everything out of life', as though you could extract real lasting substance/meaning from it. I don't believe it's possible to "make the most" out of life; it all adds up to 0 in the end.

"Squeezing everything out of life" implies that you're literally taking the juice out of life and storing it somewhere safe/permanent - In reality, it is like squeezing an orange and then putting the juice back inside the orange.

20
ajeet_dhaliwal 2 days ago 0 replies      
I think that the death of a close loved one is a prerequisite to truly understanding how short life is. Having children helps too, but I don't think it is enough on its own. I do absolutely agree that children are the best at focusing us on what is important.

I have known how short life is for a long time, but I encounter people on a daily basis, most far older than me, who don't seem to realise it, or who act irrationally if they do. When I see them wasting their time on things that are clearly not important, it doesn't bother me too much, because it is their time to choose what to do with. But what does make me angry is when they try to involve me in the 'bullshit' too, to use the term from the essay. At work this can range from petty disagreements to the colleague who creates busywork. I wonder how many people who start startups recognise that life is too short, compared to those who do not; it would be interesting to find out.

21
azakai 2 days ago 0 replies      
> In middle school and high school, what the other kids think of you seems the most important thing in the world. But when you ask adults what they got wrong at that age, nearly all say they cared too much what other kids thought of them.

That doesn't mean they were wrong when they were kids.

On the one hand, we want to believe the adults because they have perspective and experience. They were those kids. But on the other hand, we should also believe the kids because they are actually living it.

22
keerthiko 1 day ago 0 replies      
> It is possible to slow time somewhat.

pg mentions this, but what he says after is not even the best advice in this same essay for it. The real insight is here:

> The "flow" that imaginative people love so much has a darker cousin that prevents you from pausing to savor life amid the daily slurry of errands and alarms.

The way to slow time down is to break all your routines, and never be in a flow. Have no typical days. Don't have a schedule. Don't have a favorite restaurant, a default outfit, or hang out with the same people more than once or twice before seeking new people. This is nearly impossible for most people to do, because doing these things SUCKS. And time is slowest when everything feels like it sucks.

I only know this because this was what my life was like for 2 years when I was on the road as a digital nomad. It sucked, but it was the most rewarding period of my life as well, because it truly was time slowed down. I learned and experienced a far larger spectrum of things in the same time frame than anyone I knew, including myself in any other time frame.

I don't endorse it as a long-term way to live life, but I highly recommend everyone spend at least a year of their (preferably younger, pre-family) adult life living thus to learn truly how much can be fit in a human life if you frame it right.

23
jzwinck 2 days ago 0 replies      
> One great thing about having small children is that they make you spend time on things that matter: them. They grab your sleeve as you're staring at your phone and say "will you play with me?" And odds are that is in fact the bullshit-minimizing option.

This also has a darker cousin for some parents: since spending time with one's children is always a viable and valuable option, spending time without them becomes difficult. People without children often notice that most of their parental friends disappear. This despite the prior protestations of many that "We'll still do things after we have kids."

Undoubtedly some parents work more efficiently than their childless selves (this is also motivated by a desire to earn money to support the kids). But can they socialize more efficiently too, in particular with people who don't have kids?

24
dennisgorelik 2 days ago 0 replies      
> When someone contradicts you, they're in a sense attacking you.

Not really.

Contradiction means pointing to possible holes in our assumptions. So online discussions - are a way to test our assumptions and learn.

Online discussions is a playground for training our decision-making skills.

Of course, we should maintain a healthy balance between learning in online discussions, other ways of learning and actual decision-making (work). But that healthy balance should probably include more than zero time in online discussions.

25
arbre 2 days ago 1 reply      
Quote from the dalai lama: Man surprised me most about humanity. Because he sacrifices his health in order to make money.Then he sacrifices money to recuperate his health. And then he is so anxious about the future that he does not enjoy the present; the result being that he does not live in the present or the future; he lives as if he is never going to die, and then dies having never really lived.
26
hellofunk 2 days ago 1 reply      
>And while it's impossible to say what is a lot or a little of a continuous quantity like time, 8 is not a lot of something. If you had a handful of 8 peanuts, or a shelf of 8 books to choose from, the quantity would definitely seem limited, no matter what your lifespan was.

But not if you had 8 private jets, or 8 cars, or 8 houses, or even 8 telephones. This is a rather arbitrary statement.

27
unclebucknasty 2 days ago 0 replies      
In a very real sense, though, isn't it all bullshit? I mean, it depends on whether you choose to accept or reject the current Matrix as normative human life, but there is an argument to be made.

For instance, it is kind of funny to say this or that company is more or less bullshitty, when the whole structure is such that it requires the masses to work for some company, or else essentially be deprived of the resources required for their subsistence. So most people have to earn their subsistence by participating in a scheme that allocates more to someone else's subsistence.

Maybe it's the best we can do, but on that scale, the bullshittiest company of all is only marginally more bullshitty than the least.

28
zallarak 2 days ago 1 reply      
Marriage definitely made me prune bullshit from my life. I can only imagine what children would do.

I would include Anger as a subcategory of bullshit. It promotes irrationality and the after effects hamper you. In the renowned book "Emotional Intelligence" the author says that the best thing to do when angry is to focus on controlling it. The more it grows, the harder it is to escape.

29
jrapdx3 2 days ago 0 replies      
Good timing on the topic, I mean "how short is it?" when it comes to life duration. I'm at a stage of slowing down, cutting back after working for decades "at the front line". Like everyone says, it all zoomed by so fast.

Or did it? I think it reflects the point of view: when we're involved in work, with all the details to take care of, we feel overwhelmed, busy, and time isn't rushing by at all. But once it's history, the past, all of that suddenly doesn't exist; it has no reality, and it is packaged up in memory as though it was just a brief moment. Kind of like closing a menu: what's there is hidden, except we're not reopening it, at least not the same way ever again.

Time is relative, as Einstein said, it goes quickly sitting next to a pretty girl, but a boring lecture drags on forever. The epochs across the lifespan come and go, and I think we judge the duration of experience by its currency because involvement with events in real time gives the sense of time. The meaning of a "long" or "short" time is anchored in such reality.

Anyway I've been thinking for a while that what's important is not how much time we have left to live. After all that's not something we can actually ever know. What matters is what we do with the time we have. I'd surely agree we can't afford to waste it on irrelevancies, pipe-dreams, or bitterness. Far better to do what we can, when we can do it.

30
resca79 2 days ago 0 replies      
Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma - which is living with the results of other people's thinking. Don't let the noise of other's opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

Steve Jobs

31
sanderjd 1 day ago 0 replies      
> There has always been a stream of people who opt out of the default grind and go live somewhere where opportunities are fewer in the conventional sense, but life feels more authentic. This could become more common.

That part really struck me. I've long thought that the most insidious effect of the student loan phenomenon is that debt indentures you to working a conventional job in a country at a similar level of economic development as the one where you took on the debt. You can't just "drop out" and go live somewhere cheap in Southeast Asia where it doesn't take much income to live, because your debt payments are not adjusted for standard-of-living imbalances.

This is probably obvious to everyone, but I think it's worth noting that it is something holding a lot of people back from doing a lot of what is suggested in the article.

32
xyzzy4 2 days ago 0 replies      
On a zoomed-out log scale of significance from 0 to infinity, bullshit activities would rank almost as important as non-bullshit activities. So just let life happen. Nothing you do is extremely consequential or important regardless.

If you're stuck in traffic, you could've been reading a book. If you were reading a book, you could've been cleaning your room. If you were cleaning your room, you could've been working on a side project. If you were doing that, you could've been working on a better side project to get rich. But that would be less important than curing cancer, which is less important than curing old age. However, even curing old age pales in significance to the fact that entropy will dissipate all energy in the universe. How are you going to prevent that? And what if there are multiverses that need to be fixed too somehow? You didn't fix the past either. Maybe you should've worked on a time machine instead of solving entropy problems. And what about all those people in poverty getting malaria because you were working on some b.s. problem?

It never ends. You could go crazy dwelling on this stuff too much.

33
vinayak147 2 days ago 1 reply      
An experience feels meaningful only with respect to others that don't. If it wasn't for bullshit we would never know what to cherish.

The bullshit and cherishable also seem to frequently reverse roles. Many things that seem like noise today, may return to foreground with profound meaning later.

I wonder if perhaps nothing is bullshit or meaningful after all. Experience simply plays this game of light and shadow to keep us entertained.

34
oli5679 2 days ago 0 replies      
240 comments so far (mostly debates amongst strangers) on an essay recommending never to debate online with strangers....
35
omarish 2 days ago 0 replies      
Loved this article. It really reminded me of some of Seneca's writing, specifically _On the Shortness of Life_:

> Life is long enough, and it has been given in sufficiently generous measure to allow the accomplishment of the very greatest things if the whole of it is well invested. But when it is squandered in luxury and carelessness, when it is devoted to no good end, forced at last by the ultimate necessity we perceive that it has passed away before we were aware that it was passing. So it is: the life we receive is not short, but we make it so; nor do we have any lack of it, but are wasteful of it. Just as great and princely wealth is scattered in a moment when it comes into the hands of a bad owner, while wealth however limited, if it is entrusted to a good guardian, increases by use, so our life is amply long for him who orders it properly.

36
EugeneOZ 2 days ago 0 replies      
Great post, thanks. Today I was also thinking about how it sucks to get old. I'm 32, but I see how I'm getting older, my friends are getting older, and youth looks much brighter :) Value your life and time, young people: enjoy your bodies, don't risk your health, make more love, travel more :
37
hownottowrite 2 days ago 0 replies      
The last two sentences contain all that's important here.

"Relentlessly prune bullshit, don't wait to do things that matter, and savor the time you have. That's what you do when life is short."

38
salmonet 2 days ago 1 reply      
I've noticed that recently pg's twitter has a lot of comments that you would see on a typical stay-at-home mom's Facebook wall. It's refreshing to see someone with the financial opportunity cost of pg opt to stay home and hang out with the kids.
39
datashovel 2 days ago 0 replies      
My take on this is the essay exposes a pretty significant flaw in the human condition. The more capable a person is to ignore the bullshit (ie. enough money to retire comfortably immediately) the less likely they should be doing so. And yet having the ability to "ignore the bullshit" is an integral component of what we've all come to recognize as "living the dream".

In way too many cases the "bullshit" exists because too many capable people are ignoring it.

That is unless "bullshit" is defined as all things that don't matter to anyone. In which case why would we assume anyone is focusing on those things anyways?

40
gpsx 2 days ago 0 replies      
Life is too short for us individually. But I think it is interesting that life is probably just the right amount of time for us as a species, that is if evolution has successfully selected our proper genetic lifespan. If individuals live longer then the species evolves slower and it can not adapt as well. Evolution is measured in generations, not absolute years.

Of course the optimal lifespan will change over time. Today we aren't really facing so many physical survival challenges, but if we extend life longer then we may slow down our speed of innovation.

41
reasonattlm 2 days ago 1 reply      
Life is short. So why not do something about that? Are we not meant to be the very essence of creation? Is this not an age of revolutionary progress in biotechnology?

Just this past week I helped out a young company whose founders are working toward clinical translation of a method of clearance of senescent cells, one of the very first actual honest-to-goodness narrow focus rejuvenation therapies to emerge from the labs. This is something that works to repair and reverse a form of tissue damage that contributes to near all age-related disease.

This is far from the only approach to human rejuvenation presently under development.

But, you know, life is short, so pay attention or not, up to you.

42
gyardley 2 days ago 0 replies      
Life is short when your life is good.

Make too many bad decisions (or have too much bad luck) and it turns out life is really, really long.

43
FatalBaboon 2 days ago 1 reply      
It is funny how one of the top comments is a very long debate about whether this article is good or not, completely missing the point.

The sooner you realize life is short, the more you will make smart use of your time.

The same goes for faith: the sooner you realize there is no life after death, the more you will make smart use of your time. Your brain runs out of electricity and fluids, and poof you go.

44
seansoutpost 2 days ago 1 reply      
Paul, you don't know me. But this could not have been more perfectly timed. Thank you, so much.
45
lifeisstillgood 2 days ago 0 replies      
> We had the best time a daddy and a 3 year old ever had.

A couple of years back I was working remotely in UK for US clients - so my day started later. Which meant I woke up with the kids, fed them, played with them, walked them to school.

My abiding memories are cuddling a child in an arm each watching early morning TV before starting the day.

We should all be so lucky, except it's not luck: it's consciously, as a society, designing work around community and family, not the other way round.

46
mikemajzoub 2 days ago 0 replies      
From one stranger to another, thank you for posting this essay. I'm at a bit of a crossroads in my life, and your essay came at just the right moment. (This is one of the reasons I love the internet.)

If anyone wants to use IoT data, personal search queries, etc., to build a recommendation engine that increases the probability of 'reading the right thing at just the right time in life', I'd sign up for it! For subtle/complex things, this seems like an overly intimidating task, but to get the project started, someone querying illness, loss, etc., might benefit profoundly from it. You'd essentially be creating a 'skewed Google' that returns what the user _needs_ rather than what the user _wants_ at the moment. (That said, don't pursue such a project at the expense of spending time with family... :) It's a tough balance to strike, isn't it?)

In peace, Mike

47
xCathedra 2 days ago 0 replies      
A lot of these points sound similar to the famous resolutions of the young Jonathan Edwards. I try to make a habit of reading them at least once a year for many of the points Paul mentions in this article.

http://edwards.yale.edu/archive?path=aHR0cDovL2Vkd2FyZHMueWF...

A few examples:

"6. Resolved, to live with all my might, while I do live."

"9. Resolved, to think much on all occasions of my own dying, and of the common circumstances which attend death."

"52. I frequently hear persons in old age say how they would live, if they were to live their lives over again: resolved, that I will live just so as I can think I shall wish I had done, supposing I live to old age"

48
codeshaman 2 days ago 0 replies      
One added word entirely changes the rationale of "life is short" for me.

"This", meaning "current".

Add the belief that there is another (form of) life after this one and suddenly the equation changes:

This life is short. But then there's another one coming.

The scientists in us agree - of course there is no evidence of life after this one, despite the messages transmitted to us from our ancestors in the form of stories, traditions, superstitions, beliefs, religions. Depending on who you ask, we either go to a place where we stay forever (heaven/hell/spirit world) or we come back to life as another being or life form.

But the body dies and rots away !

Technically the body has died many times during its lifetime - cells die and others are created - or rather, create themselves according to the instructions in the DNA.

The DNA is the one that moves forward through time; all the other pieces of our bodies rot away. But not the whole of it, just 50%. Half the DNA vanishes into the void.

But "I" will no longer "exist" !

That's a belief. And also quite vague, because - who/what is this "I" ? Is it my body, is it my brain, is it something which lives inside the body/brain, is it all imaginary ?

Well, think about anyone - someone who's not near you right now - who is he/she ?

Right now, he/she is a thought.

Isn't everyone, dead or alive, just that - a thought ? Isn't "I" a thought then ?

If so, what is life then ? A story ?

Your story.

"Remember that man lives only in the present, in this fleeting instant; all the rest of his life is either past and gone, or not yet revealed. Short, therefore, is man's life, and narrow is the corner of the earth wherein he dwells." - Marcus Aurelius

49
AYBABTME 2 days ago 0 replies      
I love this essay, and I love the mindset of being aware that your life is limited and your time should not be wasted. I find it is a very enabling realization. A personal favorite quote of mine:

 "Think of your many years of procrastination; how the gods have repeatedly granted you further periods of grace, of which you have taken no advantage. It is time now to realize the nature of the universe to which you belong, and of that controlling Power whose offspring you are; and to understand that your time has a limit set to it. Use it, then, to advance your enlightenment; or it will be gone, and never in your power again." Marcus Aurelius (Meditations 2:4)

50
pasbesoin 2 days ago 0 replies      
People telling me to stick it out. Wait for things to improve. Wait until grad school. That the onus was always on me to adapt to the system and its (their) practices.

Worst. Advice. Ever.

If I have one thing to contribute, one iota of value to extract and pass on from my life, this may be it.

P.S. Substitute "extortion" for advice, in many circumstances, for a sense of how it really worked.

51
sireat 1 day ago 0 replies      
"Relentlessly prune bullshit, don't wait to do things that matter, and savor the time you have. That's what you do when life is short."

As a 40+ parent, I savor the time with my children, parents and friends and I prune the bullshit in work down to minimum.

The big problem: How do you choose what matters if nothing matters on the grand scale?

Life is so short that at 40+ you realize you will never get to do even 0.1% of your bucket list. So which rock do you push uphill?

In other words at 20-30 you can be adventurous and make mistakes. At 40+ you have promises to keep and miles to go before you sleep.

52
tmsh 2 days ago 0 replies      
This is the best essay I've read since the Addiction one. Cheers.
53
wellpast 1 day ago 0 replies      
> The "flow" that imaginative people love so much has a darker cousin that prevents you from pausing to savor life amid the daily slurry of errands and alarms.

A similar thought scares the shit out of me. I can code for hours on end (in the "flow") toward even perhaps the most trivial ends established by my employer. On the one hand, you could say I'm doing what I love in life. On the other, darker hand, it seems like I'm squandering so many hours of my life playing this (effective) video game where I code for points (money).

54
p4wnc6 2 days ago 2 replies      
Some practical ways that readers can implement this advice:

1. Don't work for a start-up, since they don't impart salary-winning experience to you, they don't pay you or provide reasonable benefits, and they also don't allow you the freedom to work on big ideas that they usually promise. The lines used to sell naive engineers on working at start-ups are as tantamount to life's-too-short bullshit as anything can be.

2. Don't agree to work in Agile/Scrum-like one-size-fits-all software management environments. Almost every single aspect of these systems is bullshit and will waste your time and break down your morale while draining away your productivity in the best years of your life.

3. Don't work in open-plan offices or even offices that merely have cubicles. It's been settled for a long, long time that even in dense urban areas, providing private offices for individual knowledge workers is extremely cost effective for businesses, as productivity, workplace cognitive health, job satisfaction, morale, etc., all go up substantially. Generally the only reasons for open-plan offices are (1) bullshit trendiness in which an organization performs a shallow copy of some other organization; (2) a hyperbolic focus on short-term costs, which means you should be thinking that upper management doesn't know what they are doing and is bullshitting you -- it's similar to seeing a company stop providing free coffee as a money-saving tactic. It's bullshit -- coffee is so cheap and the productivity and good will it brings are so valuable that it's virtually never a reasonable plan to cut it; and (3) environments where upper management gets off on surveillance and cognitive manipulation, and so it becomes a company cultural value to cram everyone into big rooms where you function more like a piece of office furniture than as a worker.

Personally I would also add that life's too short for enterprise C++ and Java (the languages themselves are quite fine, but anyone telling you that some legacy system couldn't have been maintained and incrementally brought into a better state by 2016 is, once again, bullshitting you and sees you as nothing but a glorified code janitor).

I think if I could give any advice to young developers, it would be that if they want management types to respect them throughout a prosperous career, they have to avoid the bullshit of the items above. If you let a manager or executive bullshit you by duping you into working for a start-up, by getting you to agree you are a child whose own creative thinking about problem solving can't be trusted and so Agile/Scrum cookbook management is needed and you must play your part, or by getting you to agree that your natural inclinations for privacy, clarity of thought, protection of productivity and time, should all be sublimated so you can be a "team player" by wearing headphones that cost more than your employer's 401k matches for the year so you can just barely function 10 feet from a foosball table, you've already lost, and it will take years to undo the damage.

55
xlayn 2 days ago 0 replies      
I've been thinking about the entry... and every now and then you think about it and realize: it's about choice...

What will be the best use of your time?

When someone asks for the best flavor of Linux, or program, or car, then depending on the forum you may get the answer "that depends on you", and you may read a lot of different opinions on why people think their version is the best "for them".

With that comes a small problem: deciding the best use of your time, planning for the rest of your life, may be incredibly complex.

An alternative plan B could be planning around "What I don't want in my life":

- I don't want to be in the middle of traffic because it means less time with my family...

- I want to spend less time on the internet to go to the gym... / I want to stop being a gym rat to learn something on the internet

56
55acdda48ab5 2 days ago 0 replies      
> Life is short, as everyone knows

I've never got this sentiment. Life is the longest thing anyone has ever done. Life is long, very long. I think back ten years ago and it seems like an age ago. It was. I'm early 30s and I feel like I've lived a long life; seen a lot and done a lot and had my kicks. That I've maybe got another full 60 years if I play my cards right is amazing to me. It seems like eons.

The only funny thing about time I've noticed is that as you age (and if you read) the past gets closer and closer. When I was a kid finding out people were born in the 1940s was amazing. SO long ago! Now Napoleon's reign seems very relevant and modern to me.

57
dantheman 2 days ago 0 replies      
I really liked this essay. I've been spending a lot of my time dealing with bullshit, and it's exhausting. Essays like this remind me to step back and re-evaluate: is what I think true, has the situation changed, have I changed? What should I do next?

There's a lot written on how to live a great life, but more and more I think that, in the end, you live great stages of life. At any stage, you optimize for it with an eye to being prepared for future stages.

58
victor22 2 days ago 0 replies      
Great essay. I've always tried to get out of bullshit with some complex thinking, but at the end of the day, most of it could be avoided faster if preceded by a simple "Is this bullshit?" question. For example, I just realised I wasted a week of my time on meetings with an "investor" I knew was a bullshitter, because I didn't want to ask myself this question.
59
cmacole 2 days ago 0 replies      
This was a really powerful essay. I don't know if "8" is a good way to measure whether there is a lot of something: 8 light-years is pretty far and 8 tons is pretty heavy. But overall, great insights.

My favorite: "One heuristic for distinguishing stuff that matters is to ask yourself whether you'll care about it in the future. Fake stuff that matters usually has a sharp peak of seeming to matter. That's how it tricks you. The area under the curve is small, but its shape jabs into your consciousness like a pin."

60
jmcmahon443 2 days ago 0 replies      
Mr. Graham:

I believe you meant "ensure" not "insure" here:

"Indeed, the law of supply and demand insures that: the more rewarding some kind of work is, the cheaper people will do it."

Thank you.

61
FreedomToCreate 2 days ago 0 replies      
Wonder what he was thinking when he invested in OMGPOP, Reddit, and 9Gag. How many hours have those sites clocked off of people's lives? Don't get addicted :)
62
thebear 2 days ago 0 replies      
Somewhat unbelievably, I still have that 1960's jeans jacket with the patch on it that says, "Do it today, tomorrow it may be illegal." And, all hail the Internet, that thing can still be bought online:

http://www.holidays.net/mlk/store/Old-60-s-70-s-Protest-type...

63
GlennS 2 days ago 0 replies      
The original (or is it?) 'Ars Longa' is a nice, short read. Worth a pause:

https://en.wikipedia.org/wiki/Ars_longa,_vita_brevis#Transla...

64
simula67 2 days ago 0 replies      
What if bullshit leads to doing well in things that you care about? For example, I don't like posturing, building a portfolio on GitHub, etc., but if it leads to a better job at a place where I can solve complex problems, thereby becoming a better engineer, I would like that very much.
65
rdiddly 1 day ago 0 replies      
Know what's funny, life is short, and life is too short for a lot of the crap you'll spend your life doing, but you'll do it anyway because... that's life!
66
orthoganol 2 days ago 0 replies      
This could be a strong argument for and against doing a startup: For - act today on your dream, which uses a startup as its vehicle; Against - sacrifice years, maybe a decade+, of your short life, because you thought it would make you rich or is considered prestigious in your circles.
67
nthnclrk 2 days ago 0 replies      
Rather than engage in the mindless and genuinely inconsequential debate regarding the previous essay, I'd like to bring some attention back to the fact this essay is a genuinely fantastic piece of very clear, appreciable thought.
68
BatFastard 2 days ago 0 replies      
Life is too short to be in a hurry

While that seems contradictory, it is not. When we are in a hurry we make unneeded mistakes, we don't enjoy the process of what we are doing, and we don't do things that reflect our true selves.

69
beeboop 2 days ago 0 replies      
Absolutely my favorite post of his. Rings so true for me right now. Writing this comment is "bullshit" but it makes me feel better to express my satisfaction with reading this post. I need to connect with people more and spend fewer months and years of my life in isolation.
70
blinkingled 2 days ago 1 reply      
PG says Life is Short. There are still 101 people arguing^W discussing it - on an online forum no less ;)
71
SimeVidas 2 days ago 0 replies      
Don't know about life, but my attention span is too short for long posts. Thanks, Reddit.
72
transfire 2 days ago 1 reply      
I've been saying this for years... Seems to me we should all drop everything else we are doing and start working on age-extension technology. Once that's solved we will all have plenty of time to do anything else we want.
74
jasonwen 2 days ago 0 replies      
I never wanted to have kids; maybe I'm too young for that. Reading this definitely brought me a step closer to being open to having kids in the future. Thank you for that.
75
SixSigma 2 days ago 0 replies      
Not only is it short, it's not very wide either - Steven Wright
76
EGreg 2 days ago 0 replies      
I wrote something about this very topic years ago:

http://magarshak.com/blog/?p=49

77
xjay 2 days ago 0 replies      
Life is short if your ego is high. Life is long if your ego is low.
78
rajacombinator 2 days ago 0 replies      
Good essay, one of the few recent PG ones I've agreed with.
79
chrisweekly 2 days ago 0 replies      
Beautiful post.

As a father of young children, and a cancer survivor, these words resonated more with me than anything I've encountered on hn in a very long time -- maybe ever.

80
snarfy 1 day ago 0 replies      
This article assumes the purpose of life is accomplishment. Don't forget to stop and smell the roses.
81
ianamartin 2 days ago 0 replies      
Am I the only person on HN who doesn't have this problem?

Other people's bullshit has never bothered me.

I regularly just turn my phone off and pick up the pieces when I feel like it.

Maybe that comes from my musician background. I don't know.

Or maybe I'm just the most inefficient person in the world because I don't give a shit about anything. I just do what I think is necessary when I think it's needed.

I think I'm a fairly productive person. I get things done. But I don't worry about it much.

I spend most of my thoughts and energy on my family and my girlfriend, not work.

Okay, that's not fair: I spend quite a lot of time reading books.

Is this a real problem? Or is it a straw man?

82
huuu 2 days ago 0 replies      
In another related thread I posted: Cut away branches that suck energy but don't bear fruit.

For me this is my bullshit filter.

83
VieElm 2 days ago 0 replies      
This article contradicts itself, although it may not be so obvious. I imagine dwelling on regrets can be classified as a bullshit activity. It certainly is for me. I am sure you can infer the rest of my argument.

Just do the best you can with your time. If you become unhappy with how you spent it you can use that to inform you on future decisions but you can't change the past.

The pain of having missed significant time with someone you care about is severe, but it is also a thing you can't change.

I am not saying pg is wrong, I am pointing out a problem.

Life may be too short to worry about how you are spending your time.

84
brhsiao 2 days ago 3 replies      
Out of curiosity, what does the vb in vb.html stand for?
85
zhte415 2 days ago 0 replies      
20 years is 1% of the time since the Bible was written to today.

Life is short, so is history, and the impact we can make is enormous.

86
vasilipupkin 2 days ago 0 replies      
fantastic essay. I would just add that life is too short but it is also in a sense very very long. Lots of new chapters and new windows open even as old ones close or narrow. It seems like two entirely contradictory ideas, but I think they are both true at the same time.
87
ebbv 2 days ago 0 replies      
This is the first pg essay I have read and fully agreed with at the end without reservation. Good advice.
88
rbrogan 2 days ago 0 replies      
Life is short, art long. If you cannot escape the bullshit, you might at least make an art of bullshitting. ;)
89
elwell 2 days ago 0 replies      
"He has made everything beautiful in its time. He has also set eternity in the human heart."

Ecclesiastes 3:11

90
somberi 2 days ago 0 replies      
Blake's take on this:

To see a World in a Grain of Sand

And a Heaven in a Wild Flower,

Hold Infinity in the palm of your hand

And Eternity in an hour.

91
statictype 2 days ago 0 replies      
Great piece. This may be the first pg piece I share with my non technical friends and family
92
ilyaeck 2 days ago 0 replies      
Life is too short to read this article in full :)
93
falsedan 2 days ago 0 replies      
so he's not dying?
94
AbdulBahajaj 2 days ago 0 replies      
e
95
heraclez 2 days ago 0 replies      
Off HN I go?
96
alexandercrohde 2 days ago 1 reply      
Summary:

- Poses the hypothesis that "Life is short"

- Proposes an 'objective' basis for this feeling: some of his most meaningful life events happen 8 or fewer times

- Transitions to arguing that the shortness of life justifies avoiding "bullshit," while acknowledging that's a loaded term.

- Proposes examples of "bullshit," traffic jams, unnecessary meetings, bureaucracy, and arguing online.

- Suggests arguing online is an example of a habit that is addictive, yet bullshit.

- Defines bullshit as things that won't matter to you in the future upon reflection.

- Proposes ways to avoid bullshit

- Proposes a way to savor time

Analysis:

- I'm not sold on the metric of measuring something by how much we value it upon reflection.

- I don't think the premise "Life is short" needs to be established to justify "avoid bullshit."

- The argument is fairly loose in that 99.9% of our lives are bullshit by his definition. Is 99% of sex bullshit?

Interesting piece, smarter than your average bear.

97
nice1 2 days ago 0 replies      
I do not always agree with PG's essays, but this one is spot on. Identifying BS is the major task we all face, and having children is a big help - this has been supremely true in my own life. Granted, this does not solve specific problems, but it gives a sense of direction. Without it we cannot see the forest for the trees.
98
dang 2 days ago 3 replies      
We banned this account for trolling and detached this subthread from https://news.ycombinator.com/item?id=10918078 and marked it off-topic.
99
crimsonalucard 2 days ago 3 replies      
You as pg's wife have vested interest in pg's reputation as your well-being is intrinsically tied with his reputation and well-being through marriage. Your words are not totally empty though as it is still supporting evidence, it is just not solid evidence and it is not something I will totally accept, unlike the op.

The probability of this essay popping up right after pg started a fire with his economics essay is just too small for there to be no connection. It could be that this essay is itself bullshit. People lie to themselves to hide truths that are painful but self-evident. I think this essay could be such a lie.

100
dang 2 days ago 2 replies      
We banned this account for trolling and detached this subthread from https://news.ycombinator.com/item?id=10917656 and marked it off-topic.
101
dang 1 day ago 1 reply      
We detached this subthread from https://news.ycombinator.com/item?id=10917949 and marked it off-topic.
102
plasticchris 2 days ago 0 replies      
Life is very long - TS Eliot
Top Books on Amazon Based on Links in Hacker News Comments ramiro.org
948 points by gkst   ago   171 comments top 46
1
_lpa_ 23 hours ago 10 replies      
I did something pretty similar over Christmas, though I used named entity recognition to extract book titles rather than looking for Amazon links, and (so far) also limited it to specific "Ask HN" threads about books. You can find it here: http://www.hnreads.com/. It is interesting to see how little overlap there is between the two, though that may be due to my using far fewer (and also newer) threads!
2
SloopJon 1 day ago 2 replies      
Here's a discussion of the original upload of Hacker News data to Google BigQuery:

https://news.ycombinator.com/item?id=10440502

At 4 GB, I'd just as soon query this locally, but this looks like a fun exercise.

I notice that there were 10,729 distinct ASINs out of 15,583 Amazon links in 8,399,417 comments. Since I don't generally (ever?) post Amazon links, I'd be interested in expanding on this in two ways.

First, I'd reduce/eliminate the weight of repeated links to the same book by the same commenter.

Second, I'd search for references to the linked books that aren't Amazon links. Someone links to Code Complete? Add it to the list. In a second pass, increment its count every time you see "Code Complete," whether it's in a link or not.
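
The two-pass scheme described above could be sketched in Python roughly like this (an illustration only - the regex, function names, and data shapes are invented, not taken from the original analysis):

```python
import re
from collections import defaultdict

# Matches the 10-character ASIN in common Amazon product URL forms
ASIN_RE = re.compile(r"amazon\.com/(?:[^/\s]+/)?(?:dp|gp/product)/([A-Z0-9]{10})")

def count_mentions(comments, titles_by_asin):
    """comments: iterable of (commenter, text); titles_by_asin: ASIN -> book title."""
    counts = defaultdict(int)
    seen = set()
    # First pass: count linked ASINs, but each (commenter, ASIN) pair only once,
    # so repeated links by the same person don't inflate the score.
    for commenter, text in comments:
        for asin in set(ASIN_RE.findall(text)):
            if (commenter, asin) not in seen:
                seen.add((commenter, asin))
                counts[asin] += 1
    # Second pass: also count plain-text occurrences of the titles found in pass one.
    for _, text in comments:
        for asin, title in titles_by_asin.items():
            counts[asin] += text.count(title)
    return counts
```

Under this scheme, a commenter linking the same book twice adds one to its count, while a later bare mention of the title adds another.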

3
niuzeta 21 hours ago 1 reply      
The absence of SICP, I imagine, is because when people refer to the SICP, they usually just link to the open link to the book: https://mitpress.mit.edu/sicp/ .
4
meadori 23 hours ago 8 replies      
Having owned and read through "Introduction to Algorithms" for years I agree that it is a good book. However, recently I have been feeling like it is recommended way too often without thought.

It is not the best when it comes to explaining things in an intuitive manner. It is a great reference book with lots of algorithms and proofs.

In recent years I have been drawn more towards Levitin's "Introduction to the Design and Analysis of Algorithms".

Anyone else have similar feelings about "Introduction to Algorithms"?

5
dankohn1 19 hours ago 1 reply      
Here is Matt Yglesias's (author of the #1 book) tweet on the analysis:

https://twitter.com/mattyglesias/status/689169613779808257

"The only book ranking that matters"

6
dpflan 23 minutes ago 0 replies      
Cool - maybe this can become a monthly recap of posted books / (links)?
7
a_bonobo 17 hours ago 1 reply      
How come "Darwin's Theorem" appears so often? It's quite unknown, with one review on Goodreads and 4 reviews on Amazon

Is this a result of the author spamming his own work?

Edit: Looks like it, short skimming of "darwin's theorem site:news.ycombinator.com" shows that all links are from user tjradcliffe, who is the author. A case for manual curation of data.

8
mattip 17 hours ago 0 replies      
Out of 8 million data points, the top book got around 50 references. I wonder how much significance should be attached to that; it looks to me to be down at the noise level.
9
jacko0 22 hours ago 5 replies      
"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold. The best book I've ever read.
10
willyyr 23 hours ago 0 replies      
A similar site was posted recently but didn't make it to the front page. I think he is using the API though. https://news.ycombinator.com/item?id=10808014
11
greesil 17 hours ago 1 reply      
Check out the review distribution of "Rent Is Too Damn High"

http://www.amazon.com/The-Rent-Too-Damn-High-ebook/product-r...

It's the most polarized I've ever seen in my life.

12
nextos 23 hours ago 2 replies      
Is it possible that some books have been missed due to acronyms employed in comments?

E.g:

- SICP: Structure and Interpretation of Computer Programs

- CTM: Concepts, Techniques, and Models of Computer Programming

- TAOP: The Art of Prolog
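
A title counter could, in principle, expand such acronyms before matching - a hypothetical sketch (the table and function name are invented for illustration):

```python
import re

# Hypothetical acronym table; a real one would be hand-curated.
ACRONYMS = {
    "SICP": "Structure and Interpretation of Computer Programs",
    "CTM": "Concepts, Techniques, and Models of Computer Programming",
    "TAOP": "The Art of Prolog",
}

def expand_acronyms(text):
    """Replace known acronyms with full titles so a title matcher can see them."""
    for acro, title in ACRONYMS.items():
        # \b word boundaries keep e.g. "TAOPX" from matching "TAOP"
        text = re.sub(r"\b{}\b".format(acro), title, text)
    return text
```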

13
anc84 1 day ago 5 replies      
Please share how much the affiliate tag generates.
14
pentium10 8 hours ago 0 replies      
In 2015, at Crunch Practical Bigdata Conference, Budapest, I showcased what books some subreddit community talk about: startups, entrepreneur, productivity reads. Slides are available here: http://www.slideshare.net/martonkodok/complex-realtime-event...
15
msutherl 18 hours ago 1 reply      
I maintain a list of HN hacks here: https://www.are.na/morgan-sutherland/hacker-news. I've seen a couple other book projects over the years including: http://hn-books.com/ and http://hackershelf.com/browse/.
16
myth_buster 23 hours ago 3 replies      
I believe people would just write the name of the really popular books like TAOCP, Hackers, Founders at work etc rather than linking to them.

The list:

 "The Rent Is Too Damn High: What To Do About It, And Why It Matters More Than You Think" by Matthew Yglesias (Simon & Schuster)
 "The Four Steps to the Epiphany: Successful Strategies for Products that Win" by Steven Gary Blank (Cafepress.com)
 "Introduction to Algorithms, 3rd Edition" by Thomas H. Cormen (The MIT Press)
 "Influence: The Psychology of Persuasion, Revised Edition" by Robert B. Cialdini (Harper Business)
 "Peopleware: Productive Projects and Teams (Second Edition)" by Tom DeMarco (Dorset House Publishing Company, Incorporated)
 "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold (Microsoft Press)
 "Working Effectively with Legacy Code" by Michael Feathers (Prentice Hall)
 "Three Felonies A Day: How the Feds Target the Innocent" by Harvey Silverglate (Encounter Books)
 "JavaScript: The Good Parts" by Douglas Crockford (O'Reilly Media)
 "The Little Schemer - 4th Edition" by Daniel P. Friedman (The MIT Press)
 "The E-Myth Revisited: Why Most Small Businesses Don't Work and What to Do About It" by Michael E. Gerber (HarperCollins)
 "Feeling Good: The New Mood Therapy" by David D. Burns (Harper)
 "Programming Collective Intelligence: Building Smart Web 2.0 Applications" by Toby Segaran (O'Reilly Media)
 "The Non-Designer's Design Book (3rd Edition)" by Robin Williams (Peachpit Press)
 "The C Programming Language" by Brian W. Kernighan (Prentice Hall)
 "The Design of Everyday Things" by Donald A. Norman (Basic Books)
 "Cracking the Coding Interview: 150 Programming Questions and Solutions" by Gayle Laakmann McDowell (CareerCup)
 "What Intelligence Tests Miss: The Psychology of Rational Thought" by Keith E. Stanovich (Yale University Press)
 "On Writing Well, 30th Anniversary Edition: The Classic Guide to Writing Nonfiction" by William Zinsser (Harper Perennial)
 "Darwin's Theorem" by TJ Radcliffe (Siduri Press)
 "Knowing and Teaching Elementary Mathematics: Teachers' Understanding of Fundamental Mathematics in China and the United States (Studies in Mathematical Thinking and Learning Series)" by Liping Ma (Routledge)
 "Don't Make Me Think: A Common Sense Approach to Web Usability, 2nd Edition" by Steve Krug (New Riders)
 "Expert C Programming: Deep C Secrets" by Peter van der Linden (Prentice Hall)
 "Clean Code: A Handbook of Agile Software Craftsmanship" by Robert C. Martin (Prentice Hall)
 "The Elements of Computing Systems: Building a Modern Computer from First Principles" by Noam Nisan (The MIT Press)
 "Code Complete: A Practical Handbook of Software Construction, Second Edition" by Steve McConnell (Microsoft Press)
 "The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger" by Marc Levinson (Princeton University Press)
 "Software Estimation: Demystifying the Black Art (Developer Best Practices)" by Steve McConnell (Microsoft Press)
 "Refactoring: Improving the Design of Existing Code" by Martin Fowler (Addison-Wesley Professional)
 "Design for Hackers: Reverse Engineering Beauty" by David Kadavy (Wiley)

17
beefsack 16 hours ago 0 replies      
I wonder how many books would be on the list if it were somehow easy to extract mentions by name instead of by link. Mythical Man Month is mentioned regularly here and I don't think it's linked very often because of how well known it is.
18
nefitty 1 day ago 0 replies      
Hard to read on mobile. Couldn't get past the first few. It is annoying to have to click a tiny thumbnail to read a bad, extracted synopsis from Amazon.
19
corysama 21 hours ago 3 replies      
Interesting to see Influence so high, but Predictably Irrational not listed at all. I've heard Influence is a really great book, but from a quick skim it seems like Predictably Irrational covers the subject matter as least as well if not better. I'd be happy to hear the opinion of someone who has actually read both.
20
joshmaher 10 hours ago 0 replies      
Does that include comments on this article about books being read from links in the comments?

Here's one on understanding the mindset of your investors when raising startup capital - Startup Wealth - http://amzn.to/1Jej8El

21
noobie 23 hours ago 0 replies      
Sad I couldn't find any of the non-technical books on Audible. Any audiobook "readers" out there?
22
Ocerge 23 hours ago 1 reply      
Oh god, that algorithms book. Flashbacks to college memorizing red-black trees coming to me.
23
bodecker 20 hours ago 0 replies      
Suggestion: enable arrow keys to allow for easier scrolling through the books
24
Sealy 21 hours ago 1 reply      
Interesting list. I clicked on the top book and Amazon peer reviews gave it 2.5 stars out of 5 with 450+ reviews.

I admire the effort. Calling it Top Books is slightly misleading. Perhaps you could call it "most mentioned books".

25
timdorr 23 hours ago 1 reply      
It would be nice to link to the comments where the books were referenced.
26
abstractalgebra 23 hours ago 0 replies      
In most such lists there's a distinct lack of math books even though there are tons of great math books specifically written for programmers and compsci people.
27
mandeepj 12 hours ago 0 replies      
I also do this most of the time. Whenever I see a book recommendation here, I go to Amazon either to buy it or save it to my wish list.
28
smartial_arts 20 hours ago 2 replies      
Is this some sort of a promo trap? When clicking on book links I get taken to pages like this one http://www.freebie.guru/au/starwars/starwars625.html
29
tedmiston 17 hours ago 0 replies      
I wonder why Steve Blank would publish a book via CafePress.

 The Four Steps to the Epiphany: Successful Strategies for Products that Win
 Author: Steven Gary Blank
 Publisher: Cafepress.com
 Number of links: 45

30
pjdorrell 16 hours ago 0 replies      
Possible application of Law of Unintended Consequences: every time you write a program to extract data _out_ of HN, you increase motivation for someone else to insert data _into_ HN.
31
andy_ppp 20 hours ago 0 replies      
Here is a talk Matthew Yglesias gave about the contents of his book "The Rent Is Too Damn High":

https://www.youtube.com/watch?v=oHkti4sAUgQ

32
Aaronontheweb 19 hours ago 1 reply      
A little surprised to see that the Mythical Man Month isn't on that list: http://amzn.to/1ZHWUlF
33
joeax 23 hours ago 4 replies      
"The Rent Is Too Damn High: What To Do About It, And Why It Matters More Than You Think"

Not where I live. What to do about it? Move. Find an employer willing to let you work remotely, and find your own quiet cost-conscious piece of paradise.

34
carpdiem 19 hours ago 0 replies      
I notice that this list looks like it has a very long tail.

Can we get the top 100 books as well? (since many of those would have very similar mention numbers as the end of the top-30)

35
Havoc 19 hours ago 0 replies      
Must admit I was expecting fewer programming books. A lot of the topics on here aren't directly programming related.

Thanks for the list though. Bought the psychology one.

36
clarkmoody 22 hours ago 0 replies      
Easiest way to make it to the top of Hacker News: Hacker News meta posts.

Always interesting to read. But just as interesting is how quickly they pop to the top of the home page.

37
veritas3241 23 hours ago 1 reply      
This is really awesome! Thank you for putting this together.
38
meetbryce 16 hours ago 0 replies      
Your links don't open in a new tab, despite the icon and even if I use my middle mouse button.

Extremely annoying.

39
DanielBMarkham 23 hours ago 1 reply      
Related: There are a ton of sites set up like this. Hopefully somebody will post a list. Lotta work by HN folks on various ways of slicing and dicing the data.

I wrote this curated site from HN several years ago. Got tired of people continuously asking for book recommendations. http://www.hn-books.com/

Couple points of note. This is 1) an example of a static site, 2) terrible UI, 3) contains live searches to comments on each book from all the major hacking sites, and 4) able to record a list of books that you can then share as a link, like so (which was my reason for making the site)

"My favorite programming books? Here they are: http://www.hn-books.com#B0=138&B1=15&B2=118&B3=20&B4=16&B5=1... "

I started writing reviews each month on the books, but because they were all awesome books, I got tired of so many superlatives!

Thanks for the site.

40
deadowl 22 hours ago 0 replies      
I've read a grand total of two of those. Working Effectively with Legacy Code seems like it would make for a good read.
41
DyslexicAtheist 18 hours ago 0 replies      
Amazed that nobody talks about W. Richard Stevens anymore. I am getting old.

https://en.wikipedia.org/wiki/W._Richard_Stevens

42
enitihas 23 hours ago 1 reply      
I am surprised to see the absence of SICP or anything from Dale Carnegie.
43
rplittle 19 hours ago 1 reply      
Curious why the #1 is only 2.5 stars on Amazon
44
dschiptsov 6 hours ago 0 replies      
How come that SICP is not here?
45
ck2 23 hours ago 1 reply      
Now that is a website with a nice clean layout and easy to read.
46
a-dub 20 hours ago 0 replies      
There are a couple of goodies in there, but tbh that list is pretty depressing.
Dear open-source maintainers, a letter from GitLab gitlab.com
874 points by levlaz   ago   285 comments top 41
1
jbk 1 day ago 6 replies      
With VideoLAN, we're not on github, but I believe we fit the 'large open source project' description. We host VLC, FFmpeg, x264 and quite a few related libraries.

For VLC and all related VideoLAN projects, we're moving to our own instance of GitLab hosted on our infrastructure.

And to be honest, it's quite good, but a few things are ridiculously limited, to the point that some people in the community are resisting the change.

The first part is the groups and subgroups: it seems incredibly difficult to give sub-groups access to repos (like a team for iOS, one for Android, one for libVLC... but all are under the "videolan/" group). It seems there is a way with the EE, but not in the CE; and the current idea for the CE is to have sub-projects, which is not good, because it will make our URLs way more complex than needed.

The second part is the bugtracker/issues tracker. We use trac for VLC, and we want to leave it for something better; but gitlab issues is way too limited, even when using the templates. Especially, it seems to be impossible to add custom searchable fields (like "platforms", "priority" or "modules") which are very very useful to do queries. Also, there is no way to do custom queries and store them ("I want all the bugs for Windows, which are related to the interface modules").

If I remember correctly, this second part was also a complaint in the open letter to github.

Finally, it's not really related, since it's more a feature request, but we'd love to allow external people to fork our repos, but not create completely new ones (or have them validated) because we don't want to host any projects under the sun (there is github and gitlab for that). So far, you either allow both features or none of them.

PS: can we have custom landing pages and custom logo in the CE version? :D :D

2
nathan_f77 1 day ago 4 replies      
I still don't know how I feel about GitLab. My initial reaction was that they were an underhanded, cheap knockoff of GitHub. It felt kind of dirty, like they were stealing GitHub's thunder and giving it away for free. Then they started charging for enterprise features and turned it into a business, which felt even weirder. And then they raised a lot of money, which kind of made them seem more legitimate. And now this letter, which changed my perspective quite a lot. I didn't know they had features like voting on issues and issue templates.

So now it even feels like they're doing Git hosting the right way, making the core software open source, and charging for enterprise features.

On the other hand, I would have probably never paid for GitHub if they followed this model. So I don't think GitHub would have been as successful.

3
mdw 22 hours ago 3 replies      
I've recently made an iOS App that integrates with GitLab. The people at GitLab have been incredible, they respond to my issues, improve the API with every release, I didn't expect this level of awesomeness when I started the project.

What's great about GitLab, there's a release on the 22nd of each month, so you can depend on pretty much continual improvement. Even if you don't think GitLab is suitable for your Open Source project, talk to the team on their issue tracker, things get solved pretty quickly!

4
akerro 1 day ago 2 replies      
My company, with ~300 developers, is moving to GitLab in the next few months. Today our CTO/PM shared his opinion about GitLab and he was very happy we're doing it; even better, I recommended it to him :)

I've been a GitLab user for a few years now, and personally I like it much more than GitHub. One reason is that I fear GitHub contains too many projects and gains too much control over OSS; I also dislike their CoS.

Good luck Gitlab!

5
_yy 1 day ago 2 replies      
Meanwhile, Phabricator has implemented all of these (and even more).

Custom templates: https://secure.phabricator.com/book/phabricator/article/form...

Votes: https://www.mediawiki.org/wiki/Phabricator/Tokens

It's better in almost every aspect than GitLab and GitHub.

See https://en.wikipedia.org/wiki/Phabricator for an (incomplete) list of open source projects using it.

6
glandium 1 day ago 5 replies      
Just for this, I'm tempted:

One issue that was raised several times was the ability to not create merge commits. In GitLab you can, as an alternative to the merge commits, use fast-forward merges or have merge requests be automatically rebased.

The main thing keeping me from actually doing it is the network effect... and this:

Disadvantages

Right now GitLab.com is really slow and frequently down. This is because of fast growth in 2015.

7
thejameskyle 18 hours ago 1 reply      
I'm one of the co-authors of the Dear GitHub letter. This is the type of response I want so badly from GitHub (but wasn't expecting).

GitLab still has a ways to go in terms of performance/reliability and polishing their product, but GitHub ought to be very nervous about them.

8
merb 1 hour ago 0 replies      
What I would like to have in the free version is:

- Project importing from Stash to GitLab
- Mirror projects
- Display merge request status for builds on Jenkins CI
- Rebase merge requests before merge
- Git hooks (commit message must mention an issue, no tag deletion, etc.)

These are pretty essential.

9
imonabout 1 day ago 3 replies      
I don't think the fact that GitLab is free for basic users is very discoverable from the site. You've got the "features" tab, which leads to what seems to be the option of downloading a community edition, which makes it look like GitLab is only offering the code itself but not hosting from them, and the enterprise edition for some licensing fee (I suppose). Then you've got "sign in", but no "sign up", which leads me to ask "but then how do I sign up", which most naturally (for me) leads me to the "pricing" tab. Now this page only shows "trial" and the different priced tiers.

But if I press "sign in", I am able to sign up, with no notice about it being just a limited (45 day) trial. So so far I'm assuming that this is a perpetually free account, though I'm not completely sure yet...

10
rcarmo 1 day ago 4 replies      
I chose Gitlab for my former company over GitHub Enterprise because we wanted an on-prem solution, and it worked well enough for ~200 folk. We did have to tweak (and occasionally break) a few things, since quite a few people suffered from NIH syndrome and wanted things done "the right way".

In general, I liked it, but it always irked me that its Ruby underpinnings made it hard to upgrade/migrate stuff (we basically just swapped LXC containers at one point, not sure how it was handled during the last upgrade). If anyone ever manages to do a credible alternative that does _not_ use Ruby in any way but keeps the overall GitHub-like workflow, a lot of operations folks will switch _instantly_.

(Like https://try.gogs.io/explore, for instance)

Also, like some commenters already pointed out, the CE edition was ridiculously limited in some regards - we mostly skipped the bits we didn't like and did product-level ticketing outside it (using Trac), with Gitlab issues used only for "techie" stuff, tracking fixes, etc.

But today I'd probably just sign us all up for GitHub and be done with it, or fire up a VM image from some marketplace - there's hardly any point in maintaining our own infrastructure or doing a lot of customization.

11
gravypod 1 day ago 2 replies      
If GitLab plays their cards right, they can take the market from github. That, in my book, will be good because unlike github, we can all contribute to making GitLab better.

The only question left is if your servers are powerful enough to run gitlab. Maybe I'll sacrifice a goat for some new server hardware and 256GB of ram.

12
neumino 23 hours ago 1 reply      
I've been using GitLab instead of GitHub recently for all my new projects, and honestly nothing is worse than on GitHub, and a few things are better - like having protected branches (with master protected by default), private repositories, etc.

That being said, what both GitHub and GitLab are missing is actually becoming a "social network", or maybe more an active network. There are tons of interesting projects that pop up every day that I would be interested in knowing about and contributing to, but there's basically no way to learn about them.

Kudos to the GitLab team for all its work :)

13
alexbardas 1 day ago 1 reply      
Both products are really good, but Github needs to realise that they have to be more open and listen more to their users.

There is an opportunity for Gitlab here and I'm happy that they decided to make this announcement.

The community is the actual winner of this healthy competition.

14
cyphar 9 hours ago 0 replies      
While I've always admired GitLab being a free software alternative to GitHub, the whole EE licensing stuff really rubs me the wrong way. EE is completely proprietary software, even for people who have paid for it (it only allows the freedom to modify the code, but you cannot run it as you wish or distribute code freely). I understand charging money for your software, but you can also charge for feature requests and support. There's no need to lock away features from people using your free software just because you've deemed that feature "not useful for you" or "pay-to-use".

What I really don't get is the argument that "we won't liberate feature XYZ from EE because it's only useful for companies with 100+ developers". I think it's quite impressive that you can know what every user of your free software needs, and that you'll protect them from code only suited for enterprise.

I'll still use GitLab (the fact there's a free software version is great), and I'll be the first to fork (or back someone else's fork) CE as soon as you get acquired and your free software is no longer maintained by you (see: Oracle with Solaris, and every other acquahire ever).

15
lowpro 18 hours ago 0 replies      
Great PR move by gitlab, and I don't mean that negatively. This kind of direct response is exactly what a company looking to improve should do, hats off to gitlab!

Once their performance increases, maybe we'll see the momentum shift from Github.

16
filleokus 1 day ago 1 reply      
It seems the linked issue, for the "long list of suggestions", is being spammed or something?

https://gitlab.com/gitlab-org/gitlab-ce/issues/8938

I have never used GitLab myself, but some of the features mentioned in the article (like a true voting system) is something I've really longed for. Might have to reconsider trying out GitLab more.

17
bitshepherd 1 day ago 3 replies      
I remember GitLab. They interviewed me and a bunch of my colleagues to test the salary waters here in the Bay Area and hired nobody because we were all "overpriced", with several being underpaid for the area.

If you want the talent you need, especially in the Bay Area, you have to pay more than what the average developer makes in Amsterdam. I want to like GitLab, but I just can't get that bad taste out of my mouth.

18
sandGorgon 1 day ago 1 reply      
Please improve your mobile UI - phones have less horizontal space than vertical. You have put a cool-looking sidebar on the left side which takes up a lot of space, causing the text layout to be funny on my 5.2 inch LG G2.

Github OTOH has an extremely usable mobile UI.

19
onuras 1 day ago 2 replies      
Their issue tracker seems really busy atm: https://i.imgur.com/Iv46xnx.png
20
danielsamuels 1 day ago 1 reply      
Perhaps if they had the same uptime, page load speed and UX as Github, I would consider it. Unfortunately we tried it for a week at work and none of us could figure out our way around it (when it was working).
21
allan_s 18 hours ago 1 reply      
For me the big things missing (not only for big open source projects)

 1. More than one level of subdivision for groups/projects (see below).
 2. Groups of users (call them departments/teams): because of 1 we have a lot of groups (all the small libs are in separate projects, so each project is itself a group, and of course we have several projects), so every time somebody joins the company, we have to add them to every single project (for them to be able to read the code). We also have subcontractors, and we would like a nice way to keep them separate from the others.

I think these things are the most annoying day to day. Other than that, GitLab is pretty good: the CI system, the hook system, and it's pretty easy to make ticket references in the issue section redirect to Redmine, Taiga, Jira or whatever.

22
nevi-me 22 hours ago 0 replies      
At least with Gitlab I can open an issue somewhere and follow it, like with open source projects. It doesn't disappear into the abyss with no feedback on what's happening.

I've implemented two self-hosted Gitlab instances at work, for one of the instances on our private network, I'm still fighting with IT to allow us to use LDAP. Gitlab EE is still off our 'pockets' as management aren't too keen to pay for it, at least yet, but I hope that we'll get there.

Our self-hosted instance is also a bit slow, not as slow as Gitlab.com, and if it was written in a language that I'm familiar with, perhaps I and some of my team could contribute to 'making it faster'. Pity I don't have enough time left in the day to learn Ruby. I've read up a bit on the work going on around Unicorn and workers, but maybe some of these things could be better written in other more 'performant' languages?

For personal projects I still use Bitbucket + JIRA. I got to the point where I decided to stop looking for freebies and pay. JIRA has been awesome, totally worth the price.

23
spudfkc 21 hours ago 2 replies      
I get why GitLab wrote this, but it seems like just a stab at GitHub.

There seems to be a lot of hating on GitHub here, but I personally love GitHub (and we use GitLab at my current employer).

I think GitLab is doing a great thing, and I appreciate that their community edition is free and open source, but GitHub has been able to provide an invaluable service. They have a great community that facilitates open source projects and a vastly better UI than GitLab (though that isn't saying much with how awful GitLab's UI is).

I'm eager to see how GitHub evolves in the future with GitLab as a competitor, as GitLab has a lot of nice features (built-in CI, etc).

24
georgefrick 1 day ago 1 reply      
I've been using GitLab myself and suggesting it to any clients with the proper infrastructure. When I found GitLab I tried it out, and finally ended up moving my SVN to Git (private projects). It's a great piece of software and I love the web hooks.
25
infocollector 1 day ago 3 replies      
Any chance you can support Mercurial? There was a highly voted issue that I saw sometime back.
26
k__ 1 day ago 0 replies      
I read that Babel uses Phabricator.

How does GitLab compare to Phabricator?

27
grimborg 19 hours ago 1 reply      
When using GitLab for Go development, "go get" doesn't work. I googled around and couldn't find a solution. With GitHub, on the other hand, it just works.

I love gitlab (even made a git tool to easily create repositories from the commandline, gitgitlab) but these small things make a real difference. I'll end up paying for a github organization account just to get this annoyance out of the way.

28
nhumrich 10 hours ago 0 replies      
I love gitlab and its features, and will commit to actually using it once dockerhub has gitlab integration. I agree though that is more of a limitation on dockerhub.
29
pmilot 1 day ago 8 replies      
This attempt by the Gitlab folks to ride the Github dissatisfaction wave seems a little low-brow. Why respond to a letter that's not addressed to you? I would have preferred them to simply post an honest "Why you should migrate from Github to Gitlab" article. The tone just seems a little devious to me.

By the way, we're using self-hosted Gitlab at work and we love it. This isn't a knock against the actual product. In fact, I think Gitlab has improved tremendously in the last 18 months. I just wish they would be a little more up-front about their marketing efforts.

30
p1mrx 16 hours ago 1 reply      
GitLab could differentiate their service by offering IPv6 support, which GitHub has so far declined to do.
31
nik736 1 day ago 1 reply      
Off topic: https://jumpshare.com/v/MOEwe43eAHatINgDTi0z

What does this stand for? ;-)
32
MoSal 1 day ago 1 reply      
Somebody just spammed their Issues page[1]. The one they link to in the letter.

I think the spammer is trying to make a point! For starters, there seems to be no rate limit applied.

[1] https://gitlab.com/gitlab-org/gitlab-ce/issues

33
giancarlostoro 15 hours ago 1 reply      
Thank you GitLab for being yet another option, and for at least maintaining one Open Source version of your software at the very least.
34
simi_ 23 hours ago 0 replies      
Sourcegraph has pivoted into a git hosting appliance, it's pretty cool, even if nowhere near as full-featured. https://sourcegraph.com/
35
sqldba 1 day ago 2 replies      
Because the thread has turned into a "why I like/dislike GitLab over GitHub", I'll say one thing that keeps sending me back to GitHub is the neat-o desktop client.

I use a command line for everything else in life; but with Git I'm hopeless.

36
jhasse 23 hours ago 0 replies      
Has anyone any experience with Gogs? How does it compare to GitLab?

https://gogs.io/

37
jorisw 1 day ago 2 replies      
Before you think of switching to GitLab, look at their status Twitter account over the past few months. They've had a lot of downtime. They run on Microsoft Azure.
38
chris_wot 1 day ago 1 reply      
GitHub probably wants to, you know, actually pay attention to its developers now. GitLab might just eat their lunch.
39
beshrkayali 1 day ago 0 replies      
Nice ride, Gitlab!
40
bigbugbag 1 day ago 4 replies      
The "crippled community edition vs. paid enterprise edition" business model raises red flags. It's not that different from the free crippled demo, or from freeware that trails the paid version by 2+ major versions.

Who gets to decide which features are enterprise-only? How are these enterprise-only features: "Hosting static pages straight from GitLab", "git-annex", "git hooks", etc.?

Do I get a crippled version that doesn't meet reasonable expectations, or pay for an enterprise edition that comes with a big bag of features I have no use for, just to get the missing features I can have on GitHub for free (and I'm not a big fan of GitHub)?

As such, GitLab Community Edition is not very useful to me and doesn't seem to have a future, because its chosen business model works against its usefulness to people.

41
arihant 1 day ago 4 replies      
> "Right now GitLab.com is really slow and frequently down. This is because of fast growth in 2015. We are working to improve it in the first quarter in 2016. For now please consider downloading GitLab and using it on-premise for a fast experience."

Is this a joke? I mean, for people looking for free private Git hosting, there is Bitbucket. This statement is like saying "free, but not really, really." The fact is, if I want hosted Git from GitLab, I cannot reliably get it without paying at least $390 upfront for their EE plan. Too much smokescreen, too little actually on offer.

Wikipedia Turns 15 wikipedia.org
440 points by doppp   ago   109 comments top 19
1
klenwell 4 days ago 2 replies      
I was talking to a friend the other day and came up with a term to sorta quantify what I consider perhaps the most important revolution of my lifetime: TTK or Time To Knowledge.

The example I always use, the occasion when it first occurred to me, was a couple years ago when, for some reason, I decided I wanted to make a foam for a cocktail. Within 5 minutes, I had found a video on Youtube illustrating how, not to mention a dozen other sites documenting various techniques.

I imagined being back in the 1980s or 90s and confronting the same wild impulse. How would I have figured this out? Asked a couple of people, perhaps. Contemplated a trip to my local library. Maybe made a mental note to chat with a bartender next time I found myself at a cocktail bar. Probably just given up on the idea and gone back to watching the A-Team.

This is a rather trivial example. But then consider the ease and dramatically lowered TTK where programming knowledge (via StackOverflow) or general knowledge (Wikipedia) is concerned. The internet itself cut the lag. But it was first Google, then Wikipedia, that turned TT#$&!%&@ (Time To me cursing that I have access to all this potentially useful information that I can't quite seem to reach) to TTK, Time To (real meaningful well-organized) Knowledge.

2
deepnet 4 days ago 3 replies      
I donate monthly to Wikipedia. I use it tens of times a day; it is a fantastic, unparalleled resource - a real public good.

Wikipedia is proof of the utopian vision that infused the early web - that securing public rights wins public contribution.

Humanity's collective knowledge is better distributed because of Wikipedia, a true wonder of the modern world.

IMHO, we must treasure wikipedia as it is not clear it could happen again and it embiggens us all.

3
bsimpson 4 days ago 4 replies      
Whoa - Wikipedia has been around more than half my life already?

Good on them! It's one of the best things humanity has ever created. Hopefully they'll find a funding strategy that doesn't make them constantly feel like panhandlers. They provide uncountably huge value, yet I suspect with their current marketing, even very heavy readers rarely donate.

EDIT: Sounds like they are working on it: https://15.wikipedia.org/endowment.html

4
moonshinefe 4 days ago 0 replies      
Wikipedia isn't without its problems, but there's no doubt it's been a very positive force on the internet. I'm not sure there's been anything like it in human history, where most people can have such easy access to such a wide variety of encyclopedic topics for free. It's an equalizer for sure in education.

For casual readers like myself it's also a real pleasure to occasionally just dive into a section of history, follow the links around, and learn about the world. Same goes for various other topics but that's the one that came to mind.

Here's to hoping Wikipedia sticks around for a long time to come.

5
rodionos 4 days ago 5 replies      
Congratulations, Wikipedia! I found it tremendously useful over the years. I have one suggestion for them. I wouldn't mind making micro-donations for articles of particular quality and relevance to me. Say I find something useful and click on a '2c' donation button. The system doesn't trigger an immediate payment transaction because I would not have time to do it but instead waits until I accumulate $1+ in 'spend' and then displays another button 'Pay'. This way micropayments are easy for me as the reader and at the same time Wikipedia's transaction costs are minimized. Besides, I like to be able to pay later.
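The accumulate-then-pay flow described above can be sketched in a few lines. This is purely illustrative - the class, method names, and $1 threshold are my own invention, not anything Wikipedia offers:

```python
class MicroDonationTab:
    """Accumulates per-article micro-donations; only prompts for
    payment once the running total crosses a threshold."""

    def __init__(self, threshold_cents=100):
        self.threshold_cents = threshold_cents
        self.pending_cents = 0

    def donate(self, cents):
        # Clicking the '2c' button just records intent; no payment yet,
        # so the reader's flow isn't interrupted.
        self.pending_cents += cents

    def payment_due(self):
        # Show the 'Pay' button only after $1+ has accumulated,
        # batching many tiny pledges into one transaction to keep
        # per-transaction fees from eating the donation.
        return self.pending_cents >= self.threshold_cents


tab = MicroDonationTab()
for _ in range(49):
    tab.donate(2)          # 49 clicks at 2c = 98c, still below $1
assert not tab.payment_due()
tab.donate(2)              # 100c crosses the threshold
assert tab.payment_due()
```

The point of the batching is exactly the commenter's: the reader commits 2c in one click, but the payment processor only ever sees one aggregated charge.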
6
jonknee 3 days ago 0 replies      
In the age of Unicorns, Wikipedia is truly the rare beast. Nothing is perfect, but Wikipedia has managed to not screw up one of the most impressive collections of information the world has ever seen. They haven't sold out (like say, Mozilla) and manage to fund one of the world's most popular web sites with an annual pledge drive. Bravo!
7
emw 4 days ago 0 replies      
Wikipedians are hosting free events across the world for "Wikipedia Day" this weekend.

* San Francisco (Saturday): https://en.wikipedia.org/wiki/Wikipedia:Meetup/San_Francisco...

* New York City (Saturday): https://en.wikipedia.org/wiki/Wikipedia:Meetup/NYC/Wikipedia...

* Boston (Saturday): https://meta.wikimedia.org/wiki/Wikipedia_15/Events/Boston

* Bangalore (Sunday): https://meta.wikimedia.org/wiki/Wikipedia_15/Events/Bangalor...

* London (Sunday): https://meta.wikimedia.org/wiki/Meetup/London/101

* Portland, Seattle, Vancouver (Saturday, meet Ward Cunningham!): https://meta.wikimedia.org/wiki/Wikipedia_15/Events/West_Coa...

New York will feature a talk about Wikidata, how to query it with SPARQL, and how we are integrating it with Wikipedia and pushing forward the Semantic Web. Other NYC talks include things like "Git-flow approach to collaborative editing", "Copyright and plot summaries", and "Automated prevention of spam, vandalism and abuse". We will be linking up with San Francisco and likely some other cities for a global teleconference at 4:00 - 5:00 PM ET (21:00 UTC).

If you're interested, sign up and stop by!

8
Isamu 4 days ago 0 replies      
I remember the announcement of the beginning of the project, probably on Slashdot. There was the call for participants.

I felt certain they would fail to achieve critical mass in order to become the large scale success that they have.

Glad to be proven wrong! And congrats.

I have contributed too. Here's hoping they solve the latest set of challenges with the insider community.

9
midgetjones 4 days ago 1 reply      
The page is 2.5 MB and took 18 seconds to fully load. Is that why they've been badgering us for donations recently?

(I am being facetious; I bloody love Wikipedia and do donate, but you'd think they'd be more careful about this sort of thing.)

10
daniel_iversen 4 days ago 1 reply      
I've always wondered if Wikipedia could come up with alternate revenue streams, given the huge number of people they touch, the mammoth cross-section of content they have, and the kinds of traffic analysis they could do... Targeted ads? Analytics for companies (a medical, tech or other company could learn what their stakeholders are interested in), recommendation engines, content classification engines for media and news orgs (or even for SEO- or UX-conscious web devs), etc. They could do these things gracefully. What do you think?
11
rplnt 4 days ago 5 replies      
I was thinking just yesterday about whether Wikipedia will ever change its design, or if it is timeless (unless devices change radically). I don't remember any major changes; I guess there were gradual ones that I didn't notice?
12
markc 3 days ago 0 replies      
I first encountered Jimbo back in the 80's when he ran the Moderated Discussion of Objectivist Philosophy (MDOP), an Objectivist mailing list that featured brilliant philosophical discussions among the stars of the Objectivist community. Jimbo's contributions were consistently excellent as well. [Anyone know if there's an archive anywhere?]

It was quite a surprise when he turned up years later in an entirely different context as a founder of Wikipedia - though I'm not surprised he did something big. His charisma showed in his MDOP contributions and he always seemed destined for something big. Congrats Jimbo, and all the other people who have made Wikipedia, for this amazing asset to humanity.

13
arca_vorago 3 days ago 0 replies      
I love Wikipedia, and contribute when I can; it does a great service to us all. Where it fails, though, is on controversial subjects. It would take a thesis paper to get into the meat of the issue, but that's its weak point.

Which is why I love that they have shared their system as open source so others can use it.

The real issue boils down to a filter-bubble problem, and Google isn't helping avoid this. It's that people use Wikipedia as a panacea and forgo actually following sources far too often.

Shades of trusting trust, but instead of compilers it's editors and censorship.

14
paulannesley 3 days ago 0 replies      
Pretty much the most important site on the internet. I make a small automatic monthly donation; you can set it up in a few clicks via https://wikimediafoundation.org/wiki/Ways_to_Give
15
ausjke 4 days ago 0 replies      
Use it occasionally and donate to it annually, happy anniversary Wikipedia!
16
martiuk 4 days ago 0 replies      
I support Wikipedia's efforts and would gladly donate once they sort out the NPOV issues on the site. A lot of power users think NPOV is their POV and revert anything they don't agree with.
17
herbst 4 days ago 1 reply      
The grid is no grid on Chrome; the boxes stack on top of each other.
18
knughit 3 days ago 0 replies      
They celebrated by sending spam email to past donors.
19
joolze 4 days ago 0 replies      
"Wikipedia is no banners at the top of the page"
I ended up paying $150 for a single 60GB download from Amazon Glacier medium.com
620 points by markonen   ago   216 comments top 44
1
dirktheman 1 day ago 8 replies      
I'm the first one to admit that Glacier pricing is neither clear nor competitive regarding retrieval fees. I do think that a lot of people use it the wrong way: as a cheap backup. I use:

1. My Time Machine backup (primary backup)

2. BackBlaze (secondary, offsite backup)

3. Amazon Glacier (tertiary, Amazon Ireland region)

I only store stuff that I can't afford to lose on Glacier: photos, family videos and some important documents. Glacier isn't my backup; it's the backup of my backup of my backup: it's my end-of-the-world-scenario backup. When my physical hard drive fails AND my Backblaze account is compromised for some reason, only then will I need to retrieve files from Glacier. I chose the Ireland region so my most important files aren't even on the same physical continent.

When things get so dire that I need to retrieve stuff from Glacier, I'd be happy to pony up 150 dollars. For the rest of it, the 90 cents a month fee is just a cheap insurance.

2
res0nat0r 1 day ago 5 replies      
Glacier pricing has to be the most convoluted AWS pricing structure and can really screw you.

Google Nearline is a much better option IMO. Seconds of retrieval time and still the same low price, and much easier to calculate your costs when looking into large downloads.

https://cloud.google.com/storage/docs/nearline?hl=en

3
markonen 1 day ago 0 replies      
OP here. Some updates and clarifications are in order!

First of all, I just woke up (it's morning here in Helsinki) and found a nice email from Amazon letting me know that they had refunded the retrieval cost to my account. They also acknowledged the need to clarify the charges on their product pages.

This obviously makes me happy, but I would caution against taking this as a signal that Amazon will bail you out in case you mess up like I did. It continues to be up to us to fully understand the products and associated liabilities we sign up for.

I didn't request a refund because I frankly didn't think I had a case. The only angle I considered pursuing was the boto bug. Even though it didn't increase my bill, it stopped me from getting my files quickly. And getting them quickly was what I was paying the huge premium for.

That said, here are some comments on specific issues raised in this thread:

- Using Arq or S3's lifecycle policies would have made a huge difference in my retrieval experience. Unfortunately for me, those options didn't exist when I first uploaded the archives, and switching to them would have involved the same sort of retrieval process I described in the post.

- During my investigation and even my visits to the AWS console, I saw plenty of tools and options for limiting retrieval rates and costs. The problem was that since my mental model had the maximum cost at less than a dollar, I didn't pay attention. I imagined that the tools were there for people with terabytes or petabytes of archives, not for me with just 60GB.

- I continue to believe that starting at $0.011 per gigabyte is not an honest way of describing the data retrieval costs of Glacier, especially when the actual cost is detailed, of all things, as an answer to a FAQ question. I hammer on this point because I don't think other AWS products have this problem.

- I obviously don't think it's against the law here in Finland to migrate content off your legally bought CDs and then throw the CDs out. Selling the originals, or even giving them away to a friend, might have been a different story. But as pointed out in the thread, your mileage will vary.

- I am a very happy AWS customer, and my business will continue to spend tens of thousands a year on AWS services. That goes to something boulos said in the thread: "I think the reality is that most cloud customers are approximately consumers". You'd hope my due diligence is better on the business side of things, as a 185X mistake there would easily bankrupt the whole company. But the consumer me and the business owner me are, at the end, the same person.

4
astrostl 1 day ago 1 reply      
Arq has a fantastic Glacier restore mechanism. You select a transfer rate with a slider, and it informs you how much it will cost and how long it will take to retrieve. It optimizes this with an every-four-hours sequencing as well. See https://www.arqbackup.com/documentation/pages/restoring_from... for reference.
5
re 1 day ago 1 reply      
Glacier's pricing structure is complicated, but fortunately it's now fairly straightforward to set up a policy to cap your data retrieval rate and limit your costs. This was only introduced a year ago, so if like Marko you started using Glacier before that it could be easy to miss, but it's probably something that anyone using Glacier should do.

http://docs.aws.amazon.com/amazonglacier/latest/dev/data-ret...

https://aws.amazon.com/blogs/aws/data-retrieval-policies-aud...

6
KaiserPro 1 day ago 0 replies      
Glacier is not a cheap/viable backup

It's even less suited to disaster recovery (unless you have insurance).

Think about it. For a primary backup, you need speed and ease of retrieval. Local media is best suited to that, unless you have an internet pipe big enough for your dataset (at a very minimum 100 meg per terabyte).

A 4/8-hour time to recovery is pretty poor for a small company, so you'll need something quicker for primary backup.

Then we get into the realm of disaster recovery. However, getting your data out is neither fast nor cheap: at ~$2,000 per terabyte just for retrieval, plus the inherent lack of speed, it's really not compelling.

My previous $work had two tape robots: one was 2.5 PB, the other 7(ish). They cost about $200-400k each. Yes, they were reasonably slow at random access, but once you got the tapes you wanted (about 15 minutes for all 24 drives) you could stream data in or out at 2400 megabytes a second.

Yes, there is the cost of power and cooling, but it's fairly low unless you are running at full tilt.

We had a reciprocal arrangement where we hosted another company's robot in exchange for them hosting ours. We then had DWDM fibre giving a 40-gig link between the two server rooms.

7
Spooky23 1 day ago 2 replies      
The only use case I would be willing to commit to glacier would be legal-hold or similar compliance requirement.

The idea would be that the data would either never be restored or you could compel someone else to foot the bill or using cost sharing as a negotiation lever. (Oh, you want all of our email for the last 10 years? Sure, you pick up the $X retrieval and processing costs)

Few if any individuals have any business using the service. Nerds should use standard object storage or something like rsync.net. Normal people should use Backblaze/etc and be done with it.

8
Nexxxeh 1 day ago 5 replies      
The post is a useful cautionary tale, and he's not alone in getting burned by Glacier pricing. Unfortunately it was OP not reading the docs properly.

Yes, the docs are imperfect (and were likely worse back in the day). And it was compounded by the bug, apparently. But it's what everyone on HN has learned in one way or another... RTFM.

Was it mentioned in the article that retrieval pricing is spread over four hours, and that you can request partial chunks of a file? Heck, you can always retrieve all your data from Glacier for free if you're willing to wait long enough.

And if it's a LOT of data, you can even pay and they'll ship it on a hardware storage device (Amazon Snowball).

Anyone can screw up, I'm sure we all have done, goodness knows I have. But at the very least, pay attention to the pricing section, especially if it links to an FAQ.

9
sathackr 1 day ago 0 replies      
This sounds a lot like demand-billing [1] [2] that's common with electric utilities, particularly commercial, and increasingly, people with grid-tied solar installations. [citation needed]

You pay a lower per-kilowatt-hour rate, but your demand rate for the entire month is based on the highest 15-minute average in the entire month, then applied to the entire month.

You can easily double or triple your electric bill with only 15 minutes of full-power usage.

I once got a demand bill from the power company that indicated a load that was 3 times the capacity of my circuit (1800 amps on a 600 amp service). It took me several days to get through to a representative that understood why that was not possible.

[1] http://www.stem.com/resources/learning

[2] http://www.askoncor.com/EN/Pages/FAQs/Billing-and-Rates-8.as...
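A rough sketch of why one 15-minute spike is so punishing under demand billing (the rates here are made up for illustration, not any utility's actual tariff):

```python
# Hypothetical rates, for illustration only.
ENERGY_RATE = 0.08      # $ per kWh consumed
DEMAND_RATE = 10.00     # $ per kW of peak 15-minute average demand

def monthly_bill(load_kw_per_15min):
    """Bill = energy used + a demand charge based on the single
    highest 15-minute average load in the whole month."""
    hours_per_interval = 0.25  # each interval is 15 minutes
    energy_kwh = sum(kw * hours_per_interval for kw in load_kw_per_15min)
    peak_kw = max(load_kw_per_15min)
    return energy_kwh * ENERGY_RATE + peak_kw * DEMAND_RATE

# A flat 5 kW load for a 30-day month (2880 fifteen-minute intervals)...
flat = [5.0] * 2880
# ...versus the same load with a single 15-minute spike to 50 kW.
spiky = flat[:-1] + [50.0]

print(monthly_bill(flat))   # demand charge based on 5 kW peak
print(monthly_bill(spiky))  # one spike reprices the entire month's demand
```

With these made-up numbers, the single spike more than doubles the bill even though it adds only a few kWh of actual consumption - which is the commenter's point about 15 minutes of full-power usage.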

10
joosteto 1 day ago 3 replies      
If downloading more than 5% of stored data is so expensive, wouldn't it have been cheaper to upload a file 19 times the size of the stored data (filled from /dev/urandom)? After that, downloading just 5% of the total data would have been free.
11
captain_jamira 1 day ago 2 replies      
If one can download a percentage for free each month - 5% in this case, and the price of storage is dirt-cheap, then couldn't one just dump empty blocks in until the amount desired for retrieval falls under the 5% limit? In this case, if one wants to retrieve 63.3 GB, uploading 1202.7 GB more for a total of 1266 GB, 63.3 GB of which represents just under 5%. There's no cost for data transfer in and the monthly cost at $0.007/GB would be just $8.87. And that's just for the one month because everything wanted would be coming out the same month.

Has anyone tried this or know of a gotcha that would exclude this?

And I realize that for the OP's situation, it wouldn't have mattered since he thought he was going to get charged a fraction of this.
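The arithmetic above checks out under the rates quoted in this thread ($0.007/GB-month storage, 5% of stored data retrievable free each month - both figures from the 2016-era pricing discussed here, since superseded):

```python
# Padding trick: store enough filler that the data you actually want
# falls just under Glacier's 5% free monthly retrieval allowance.
# Rates as quoted in this thread; Glacier pricing has since changed.
STORAGE_RATE = 0.007           # $ per GB-month
FREE_RETRIEVAL_FRACTION = 0.05

wanted_gb = 63.3
total_gb = wanted_gb / FREE_RETRIEVAL_FRACTION   # 1266.0 GB stored in total
filler_gb = total_gb - wanted_gb                 # 1202.7 GB of padding
month_cost = total_gb * STORAGE_RATE             # ~$8.86 for that month

print(total_gb, filler_gb, month_cost)
```

That comes to about $8.86 for the month (the commenter's $8.87 is the same figure rounded up) - either way, a couple of orders of magnitude below the $150 retrieval bill in the article, assuming no gotcha in how "average monthly storage" is computed.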

12
lazyant 1 day ago 1 reply      
You don't have a backup until you test its restore.
13
profsnuggles 1 day ago 1 reply      
Even with the large data retrieval bill he still saves ~$100 vs the price of keeping that data in S3 over the same time period. Reading this honestly makes me think glacier could be great for a catastrophic failure backup.
14
kennu 1 day ago 1 reply      
Glacier is more comfortable to use through S3, where you upload and download files with the regular S3 console and just set their storage class to Glacier with a lifecycle rule. I've used the instructions here to do it: https://aws.amazon.com/blogs/aws/archive-s3-to-glacier/
15
slyall 1 day ago 1 reply      
I've had some big Glacier bills in the past, even the upload pricing has gotchas[1]

These days the infrequent access storage method is probably better for most people. It is about 50% more than Glacier (but still 40% of normal S3 cost) but is a lot closer in pricing structure to standard S3.

Only use glacier if you spend a lot of time working out your numbers and are really sure your use case won't change.

[1] - 5 cents per 1000 requests adds up with a lot of little files.

16
Zekio 1 day ago 1 reply      
Pricing should always be straightforward and easy to understand, and that pricing plan is dodgy as hell.
17
detaro 1 day ago 2 replies      
Seems like "precise prediction and execution of Amazon Glacier operations" might be a niche product people would pay for (and probably already exists for enterprise use cases?)

That's something that generally keeps me from using AWS and many other cloud services in many cases: the inability to enforce cost limits. For private/side project use I can live with losing performance/uptime due to a cost breaker kicking in. I can't live with accidentally generating massive bills without knowingly raising a limit.

18
stuaxo 1 day ago 0 replies      
"The problem turned out to be a bug in AWS official Python SDK, boto."

My only experience of using boto was not good. Between point versions they would move the API all over the place, and, Amazon being Amazon, some requests take ages to complete.

After that I worked with the Google APIs, which were better, but still not what I'd describe as fantastic (hopefully things have improved over the last 2 years).

19
tomp 1 day ago 1 reply      
Wouldn't it be better for the OP to simply upload 20 * 60GB (= 1.2TB) of random data, wait a month (paying less than $20), and then download the initial 60GB within his 5% monthly limit?
20
Pxtl 1 day ago 2 replies      
Considering the fact that bugs in the official APIs resulted in multiple retry attempts, he should demand some of his money back.
21
atarian 1 day ago 0 replies      
They should rename this service to Amazon Iceberg
22
Jedd 1 day ago 2 replies      
About a year ago NetApp bought Riverbed's old SteelStore (nee Whitewater) product -- it's an enterprise-grade front-end to Glacier (and other nearline storage systems). It provides a nice cached index via a web GUI that lets you queue up restores in a fairly painless way. It even has smarts in there to let you throttle your restores to stay under the magical 5% free retrieval quota. It's not a cheap product, and obviously overkill for a one-off throw of 60GB of non-critical data ... but the point is that there are some good interfaces to Glacier, and roll-your-own shell scripts probably aren't among them.

As noted by others here, if you treat glacier as a restore-of-absolute-last-resort, you'll have a happier time of it.

Perhaps I'm being churlish, but I railed at a few things in this article:

If you're concerned about music quality / longevity / (future) portability - why convert your audio collection to AAC?

Assuming ~650MB per CD, and the 150 CDs quoted, and ~50% reduction using FLAC, I get just shy of 50GB total storage requirements -- compared to the 63GB 'Apple Lossless' quoted. (Again, why the appeal of proprietary formats for long-term storage and future re-encoding?)

I know 2012 was an awfully long time ago, but were external mag disks really that onerous back then, in terms of price and management of redundant copies? How was the OP's other critical data being stored (presumably not on glacier). F.e. my photo collection has been larger than 60GB since way before 2012.

Why not just keep the box of CD's in the garage / under the bed / in the attic? SPOF, understood. But world+dog is ditching their physical CD's, so replacements are now easy and inexpensive to re-acquire.

If you can't tell the difference between high-quality audio and originals now - why would you think your hearing is going to improve over the next decade such that you can discern a difference?

And if you're going to buy a service, why forego exploring and understanding the costs of using same?

23
LukeHoersten 1 day ago 0 replies      
Does anyone have a success story for this type of backup and retrieval on another service?
24
elktea 1 day ago 0 replies      
I briefly used Glacier for daily backups as a failsafe in case our internal tape backups failed when we needed them. The 4-hour inventory retrieval when I went to test the strategy, and the bizarre pricing, quickly made me look at other options.
25
pmx 1 day ago 2 replies      
I have a strong feeling that he would get a refund if he contacted Amazon support, considering it was caused by a bug in the official SDK and he didn't ACTUALLY use the capacity he's being asked to pay for.
26
kozukumi 1 day ago 0 replies      
For unique data you want super robust storage options, both local and remote. But for something as generic as ripped CDs? Why bother? Just use an external drive or two if you are super worried about one dying. Even if you lose both drives the data on them isn't impossible to replace.
27
prohor 1 day ago 0 replies      
For cheap storage there is also Oracle Archive Storage with 0.1c/GB ($0.001/GB). They have horrible cloud management system though.

https://cloud.oracle.com/en_US/storage?tabID=1406491833493

28
natch 1 day ago 1 reply      
This is why I break my large files uploaded to Glacier into 100MB chunks before uploading. If I ever need them, I have the option of getting them in a slow trickle.
29
cm2187 1 day ago 3 replies      
Perhaps a naive question but why would glacier try to discourage bulk retrieval? Is it because the data is fragmented physically?
30
forgotpwtomain 1 day ago 2 replies      
> Id need more than one drive, preferably not using HFS+, and a maintenance regimen to keep them in working order.

I'm really doubting the need for a maintenance regimen for a drive that is almost entirely unused. You could have spent $50 on a magnetic disk drive and saved yourself hours' worth of trouble.

31
JimmaDaRustla 1 day ago 0 replies      
Wow, thanks for this!

I currently have 100 GB of photos on Glacier. I am going to find another hosting provider now.

32
z3t4 1 day ago 0 replies      
I was looking at Glacier for my backups, but it seemed too complicated ... glad I didn't use it.

I ended up using some cheap VPSes, two of them located in two different countries. And it's still cheaper than, say, Dropbox.

33
jaimebuelta 1 day ago 0 replies      
You will ALWAYS pay more than you expect when you use AWS (and probably other cloud services). This case is quite extreme, but the way costs are assigned is complicated enough that it's hard not to miss something at some point...
34
alkonaut 1 day ago 1 reply      
Curious: if you use a "general storage provider" (like glacier) for backup, rather than a "pure backup provider" (like Backblaze, CrashPlan) why is that?
35
random3 1 day ago 1 reply      
So depending on how the "average monthly storage" is computed you could get 20x more data in one month and then retrieve the 5% (previously 100%) that you care about for free, and then delete the additional data?
36
NicoJuicy 1 day ago 0 replies      
Does anyone have a backup script for backblaze or a similar windows app like SimpleGlacier Uploader?
37
sneak 1 day ago 1 reply      
This article claims that glacier uses custom low-RPM hard disks, kept offline, to store data.

Does s/he substantiate this claim in any way? AFAIK glacier's precise functioning is a trade secret and has never been publicly confirmed.

38
dalanmiller 1 day ago 1 reply      
So, what's the most cost effective way to download all your files from Glacier then?
39
jedisct1 1 day ago 1 reply      
I don't get Glacier. It's painfully slow, painful to use and insanely expensive. https://hubic.com/en is $5/month for 10 TB, with unmetered bandwidth. A far better option for backups.
40
pfarnsworth 1 day ago 1 reply      
If there's a bug in Amazon's libraries, can't you ask for a refund?
41
harryjo 1 day ago 1 reply      
Impressive that Amazon can choose to serve a request at 2x the bandwidth you need, with no advance notice, and charge you double the price for the privilege.
42
languagehacker 1 day ago 1 reply      
This is a simple case of spending more than you should have because you didn't understand the service you were using. It's impacted a little worse by how silly the whole endeavor is, given the preponderance of music streaming services.
43
otakucode 1 day ago 3 replies      
I'm surprised that the author had 150GB of Creative Commons audio CDs to begin with!
44
anonfunction 1 day ago 2 replies      
I don't like how the title and article read like a hit piece on Amazon Glacier. It's great at what it is intended for. In addition, it seems he still saved money, because over 3 years the $9-a-month savings added up to more than the $150 bill for retrieval.

I'm surprised that this aspect has not been mentioned here in the comments yet:

> I was initiating the same 150 retrievals, over and over again, in the same order.

This was the actual problem that resulted in the large cost.

At my old job we would get a lot of complaints about overage charges based on usage to our paid API. It wasn't as complicated of pricing as a lot of AWS services, just x req / month and $0.0x per req after that, but every billing cycle someone would complain that we overcharged them. We would then look through our logs to confirm they had indeed made the requests and provide the client with these logs.

Iceland sentences bankers to prison for their part in the 2008 collapse loansafe.org
289 points by puppetmaster3   ago   99 comments top 14
1
Beltiras 2 days ago 8 replies      
I'm a native of Iceland. There are mixed feelings with respect to the sentences. We are generally proud to have a system that does imprison bankers, but there are frustrations that the true architects of the financial crisis not only got off scot-free but also scurried away to Tortola with a substantial portion of the loot. There has been substantial revisionist history practiced by the guilty and their defenders, and the more time passes since the crash, the fuzzier memory becomes. After all, the electorate voted the parties that were largely responsible for the system that crashed back into power.

I'll answer any questions you might have about local sentiment and coverage.

EDIT: Having finally gotten to a copy of the article I want to point out several factual statements in the article that I find a little bit challenging to agree with.

1. Iceland let the banks fail.

Not really. It was a restructuring. In the restructure a holding company handed all the assets over to a new corporate entity. All debts and obligations were honored.

2. Iceland avoided austerity

Many social programs were cut. Great cutbacks in the healthcare system, so much so that it is on the verge of collapse now.

3. Geothermal is clean energy

Ask anyone from Reykjavík. Silver can no longer be kept without wrapping it in cloth; otherwise it goes dark in a day due to the hydrogen sulfide pollution. It's also not renewable. You can farm the area for about fifty years before it's too cold to extract, and then it takes a millennium to recover.

4. Value of ISK/EUR

That may be the artificial value while capital controls are in effect. True value is way below that. I might cover the "snowhenge" problem later, it's a doozy.

5. Quoting the President on the banks.

You would have to know how much of a fluffer the president was for the banks and the oligarchs before the crash to know how stinky that sounds to an Icelander.

2
JumpCrisscross 2 days ago 2 replies      
Let's make sure we're comparing apples to apples. Most, if not all, of these convictions were for plain-vanilla securities fraud. Executives using company funds to buy stock for their own accounts on insider information. Or hiding their earnings from tax authorities in Tortola. These actions are clearly illegal and regularly prosecuted in the United States. Comparing these cases to the lack of prosecution of, e.g., Lehman Brothers's Dick Fuld is disingenuous. No laws were clearly broken there.
3
bmh_ca 2 days ago 1 reply      
If this topic interests you, I recommend you read the book by Gudrun Johnsen called "Bringing Down the Banking System: Lessons from Iceland". It is a translation and summary of the Report of the Investigation Commission of Althing (the Icelandic Parliament).

https://en.wikipedia.org/wiki/The_Report_of_the_Investigatio...

The Report of Althing is, in my opinion, the most comprehensive since the Amulree Report of Newfoundland in 1933, and one of the very few reports to investigate causes of the crisis.

As a matter of interest, I seem to recall that 83 million euros was given to Icelandic bankers for their part in the crisis. The proceeds unaccounted for numbered in the billions. Much of what the banks did was modelled on Enron-style deception, a practice which remains quite commonplace now in the financial marketplace.

4
cbeach 2 days ago 2 replies      
Frustrated that the author neglected to elaborate on the crucial detail - the truth behind Iceland's post-2008 "revival" is that it imposed capital controls on its own citizens. It imprisoned their money to avoid "capital flight." This affected everyone. Even students on school trips had to fill out copious government paperwork before exchanging minimal amounts of -their own- money for foreign currency.

I know I'd want to take my earnings and get the hell out, if my government started bowing to leftwing populism and came for the bankers like a medieval pitchfork-waving mob.

Capital controls mean that the "recovery" is completely artificial, and until they're fully lifted we won't know just how deep Iceland's problems really go.

5
gizmo 2 days ago 1 reply      
I consider these prosecutions a good thing, even though the selection of who goes to prison and who gets to enjoy their spoils seems arbitrary. For instance, none of the politicians who benefited from and played a large role in the Icelandic financial crisis have been prosecuted. The financial oversight institutions failed too, but somehow their failure to do their jobs is acceptable?

From a game theory perspective, prosecuting bankers is correct. The incentive for people who are already wealthy to take massive risks is reduced when they know they may go to prison when it all hits the fan. It has little to do with real justice, though.

6
rodionos 2 days ago 3 replies      
It may be utopian or too liberal, but I'm against imprisonment for any kind of crime against wealth, be it shoplifting or securities/bank fraud on a planetary scale. I think prison should be reserved for crimes against humanity, e.g. homicide, domestic violence, child abuse, etc. Let the economic agents figure out how to protect their wealth themselves by increasing transparency and improving risk controls, monitoring, and security protection.

Say you don't want your car stolen, go and get some insurance, let insurance companies figure out how to prevent theft and find stolen cars. Let them figure out how to make stolen cars unusable. It's just an example, but I think it's doable.

7
ma2rten 2 days ago 0 replies      
Since the link seems to be inaccessible, this website appears to mirror the same content:

http://americannewsx.com/crooks-crime/iceland-sentences-26-c...

8
gregpilling 2 days ago 2 replies      
What would have happened if the US and European banks had gone down the same path? Does anyone have a well-thought-out insight?
9
MichaelBurge 2 days ago 8 replies      
I'm not sure I get the hate on bankers. I don't like the bailouts - those could've been handled better - but people seem to be angry about the fact that they offered loans that they shouldn't have. That is, non-bankers were given money that they shouldn't have been given, and the claim is that we should be more selective about offering capital and opportunities to non-bankers and the underprivileged poor.

If somebody offered me a billion dollars at a 3% interest rate or whatever it is right now, I would seriously consider accepting it knowing that I don't have the capacity to pay them back. There are a lot of things in the world that pay better than 3% interest and that are relatively safe.

Similarly, a bank that gave out loans whenever you asked (up to the 10-loan limit, maybe) would be a great opportunity even for people who couldn't pay them back out of income. You could buy an empty plot of land and put a small apartment complex on it. You could buy a 4-plex and apply any number of forced-appreciation techniques (put in a coin-operated laundry, redo the piping to reduce the amount spent on plumbers, plant some flowers outside to increase the attractiveness, etc.). Of course, just betting on general appreciation of single-family homes is bad money.

Basically, I was too young to really experience the effects of the 2008 crash, and I'm feeling envy rather than anger about these people's ability to cheaply obtain capital. So I don't empathize with these news reports about punishing bankers. Can someone explain why I should feel anger instead?

10
junto 2 days ago 0 replies      
I'm a big believer that for every up there is a down. I respect Iceland for doing this. Rather than giving in to a fear of the unknown, they faced their demons. It is without doubt a painful ride, but if any one of us were to negligently run a company into the ground and mismanage it, we would expect to pay the penalty. I find it sad that we have allowed these big banks to grow so big that they are "too big to fail". At what point do we have a scenario where other types of company are also too big to fail?
11
kafkaesq 2 days ago 0 replies      
Would be curious if anyone with knowledge of the various meltdowns that happened in Iceland can tell us if the Wikileaks disclosures from that period contributed significantly to the unravelling, as it happened.
12
joeclark77 2 days ago 0 replies      
But have they done anything about the RMT'ers in EVE-Online?
13
kazinator 2 days ago 0 replies      
I.e., each of 26 bankers, for a period of some 2 to 3 years, cannot get a passport.
14
SloopJon 2 days ago 1 reply      
I couldn't reach the linked article, but it's about 74 years combined since 2012, as summarized by this Reddit post:

https://www.reddit.com/r/news/comments/418h2r/iceland_senten...

Longest term was six years.

Six-Legged Giant Finds Secret Hideaway, Hides for 80 Years npr.org
406 points by sabya   ago   59 comments top 19
1
gulpahum 9 minutes ago 0 replies      
Are those edible? They look like a good replacement for shrimps. Nothing would guarantee their survival better than becoming a food source for humans!
2
abraae 12 hours ago 6 replies      
This story resonates with me.

Here in New Zealand, we have many native species of birds, insects, frogs, lizards and the like that thrived when our islands were cut off from the rest of the planet, but that have become extinct, or are in imminent danger of being so due to introduced predators such as rats, stoats, hedgehogs, ferrets, cats etc. etc.

It leads to the bizarre situation that conservation here is largely about killing things.

3
vblord 1 hour ago 0 replies      
I remember this being posted a year ago. I remember reading it and being grossed out that it was a giant bug and not some sort of six-legged bear. Why that guy would ever touch that thing is beyond me.

https://news.ycombinator.com/item?id=9179292

4
oska 12 hours ago 1 reply      
Wingsuit flyby of Ball's Pyramid:

https://www.youtube.com/watch?v=cqQrWWpcT0I

5
brianclements 9 hours ago 0 replies      
Reminds me of a Radiolab episode [1] about a similar effort to bring back a specific species of tortoise in the Galapagos islands. The offending species there were goats. What was really interesting was the method used for the eradication program. [2]

[1] http://www.radiolab.org/story/brink/

[2] https://en.wikipedia.org/wiki/Judas_goat

6
cmpb 3 hours ago 1 reply      
I'm a southern Louisiana (US "Deep South") native. Here, and in other areas of the south (and elsewhere in the world), we have similarly large "cicadas", which are basically giant crickets. They hatch once every 13 years (shorter than most other cicadas, which hatch every 17 years) [1]. Though the hatches produce huge numbers (sometimes causing areas of road to be literally covered and obscured), and can sometimes be a nuisance because of their sound, they play a very integral role in the ecological cycle and contribute to a very diverse system of plants and animals which many people around here take great pride in. I hope that the residents of Lord Howe Island can learn to live with some new friends.

[1] https://en.m.wikipedia.org/wiki/Cicada

7
travis_brooks 11 hours ago 0 replies      
Did a search to see what happened with the stick bugs and discovered the population is now large enough they're in zoos in San Diego, Toronto, and Bristol: http://www.9news.com.au/national/2016/01/13/16/52/revived-au...
8
gherkin0 10 hours ago 1 reply      
Are they ever going to go back to the island and collect more specimens for breeding (and perhaps release some captive-bred individuals to replace them)? Even though that population was probably extremely inbred, there's probably still some genetic diversity there that wasn't represented in the two wild specimens they managed to breed.
9
clarkmoody 12 hours ago 0 replies      
10
sohkamyung 10 hours ago 0 replies      
The story of the Lord Howe Island stick insect is nicely told in this award winning short animation, "Sticky" [1].

There is also a book out now on the insects [2].

[1] https://vimeo.com/76647062

[2] http://www.publish.csiro.au/pid/7226.htm

11
reustle 12 hours ago 1 reply      
This needs a (2012) in the title
12
eddiegroves 10 hours ago 2 replies      
All those eggs from one breeding pair, do insects not need gene diversity for the population to succeed?
13
trusche 6 hours ago 0 replies      
14
vortico 11 hours ago 0 replies      
It's interesting how this story has a true purpose, to convince the residents of Lord Howe to allow the insects to be released on their land by the pressure of the article's readers. Ball's Pyramid itself is perhaps even more fascinating to me. It looks like a good place for a wizard to live...
15
ed_blackburn 4 hours ago 1 reply      
How do they propose to rid Howe Island of rats?
16
michaelcampbell 12 hours ago 2 replies      
Those things scare the bejeezus out of me (irrationally, I'll grant), but these stories give me hope, if at least a little.
17
YeGoblynQueenne 7 hours ago 0 replies      
18
stuart78 10 hours ago 0 replies      
That video. I'm traumatized, yet I can't look away.
19
niels_olson 11 hours ago 1 reply      
Don't pitch the bugs. Pitch their shells. As buttons or jewelry.
A statement from f.lux about Apple's recent announcement justgetflux.com
363 points by mrzool   ago   190 comments top 39
1
rm999 4 days ago 10 replies      
A couple thoughts, as someone who has used f.lux for years and loves what it does:

1. It's a feature, not a product. And a simple one, conceptually. As much as I'd love to have competition in apps offering this functionality (like keyboards), "make my screen more red" isn't exactly rocket-science.

2. It's not well-designed. Their messaging mixes up two very different use-cases: matching the color of your room and aiding your sleep. That's ok - I use it for both - but there's no way to customize it. Even a super-simple option would let me communicate that I only want it on after 10:30 pm, a couple hours before I go to sleep, when I darken my room. Instead I need to deal with it automatically turning on every day at 4:30 pm, which makes something that should be simple very cumbersome (I have to manually turn it on and off every day in the winter).

2
k-mcgrady 4 days ago 1 reply      
Classy response. I was cringing before I even opened the link expecting whining but I am pleasantly surprised. I use f.lux on my Mac and find it has really improved my sleep. I hope Apple brings Night Shift to Mac as having the feature built-in will help more people. Personally I don't see them changing their policy towards f.lux on iOS until at least iOS 10 as the API's it requires to function are currently private. Maybe with the introduction of Night Shift they'll start to stabilise those and open it up.
3
skc 4 days ago 7 replies      
So bizarre how app developers are always so polite and gracious when it comes to Apple. It's like everyone is deathly afraid of saying anything that puts Apple in a negative light.

I can only imagine what this post would have looked like had it been say, Google in question.

4
nchrys 4 days ago 2 replies      
I use f.lux on my Mac and it has made a real difference in my sleep. On my iPad when I am reading late I use the inverted colors mode which can be triggered by pressing three times on the home button (if you activate the shortcut). I would say this is even more a relief for the eyes, as not only blue colors become warm but all the white that is the background color for most everything becomes just dark. Of course you can't watch movies in these conditions but I highly recommend this for readers.
5
AdmiralAsshat 4 days ago 4 replies      
If memory serves, f.lux was banned for the "private" APIs it used, which were necessary to control the screen warmth.

So now that Apple has released their own screen dimming app, is Apple's implementation any different than flux's? Or did Apple effectively just abuse their app policy so that they could proactively kill a competitor to their "new" feature?

6
yggydrasily 4 days ago 3 replies      
Even though I really appreciate the idea behind it, I was never all that comfortable running f.lux on my Macbook. I never understood why it was free yet closed-source. It made me suspicious in a way, if that makes sense.

After a while I just gave up and uninstalled f.lux. Instead, I created a 4500K white point copy of the default color profile and manually switched to it at night, which seemed to have the same effect. It also prevented those big flashes whenever switching to and from full-screen apps.

I'm grateful that f.lux has pushed this issue to the point of getting traction as a built-in feature from Apple (and I do hope Apple brings it to OS X at some point), however unless f.lux becomes open-source, I don't plan to reinstall it.
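For reference, mapping a color temperature like the 4500K white point mentioned above to an RGB tint is commonly done with Tanner Helland's blackbody curve fit. This is just an illustrative sketch of that approximation, not how f.lux or OS X color profiles actually compute it:

```python
import math

def _clamp(x: float) -> int:
    return int(max(0.0, min(255.0, x)))

def kelvin_to_rgb(kelvin: float) -> tuple:
    """Approximate sRGB color of a blackbody at the given temperature.

    Curve fit popularized by Tanner Helland; reasonable for ~1000K-40000K.
    """
    t = kelvin / 100.0
    # Red channel: saturated below ~6600K, then falls off.
    r = 255.0 if t <= 66 else 329.698727446 * ((t - 60) ** -0.1332047592)
    # Green channel: logarithmic rise below 6600K, power-law fall above.
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    # Blue channel: absent at very warm temperatures, saturated above ~6600K.
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    return _clamp(r), _clamp(g), _clamp(b)

print(kelvin_to_rgb(4500))  # warm white, roughly (255, 217, 187)
```

Scaling the display's channels toward such a warm white is the essence of what f.lux, Redshift, and a hand-made 4500K profile all do; they differ mainly in how and when the adjustment is applied.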

7
StavrosK 4 days ago 4 replies      
For context (and correct me if I have this wrong), f.lux was using a non-standard way to allow people to install the app on their phones (allowing them to install it using Xcode) and Apple banned that method. A few months later, Apple announced an iOS feature that does exactly what f.lux does.
8
lukeschlather 4 days ago 0 replies      
Personally, I use Redshift which is an open-source f.lux clone.

Ideally, I'd like to be able to watch TV/use a computer in a dark room and not have to worry about it being so bright it gives me a headache, but also have good color quality. Software color temperature apps handle the headaches pretty well, but they don't save power and they don't let me see true colors.

It would be really nice to see some more hardware effort put into low-intensity backlights, especially color reproduction at lower brightness settings.

9
markpapadakis 4 days ago 0 replies      
I hope f.lux gets their chance with Apple. Their application is great and helps so many people. Btw, Michael Herf is the guy who built Picasa. He is a brilliant programmer and for sure knows a lot about graphics and colors.
10
gcr 4 days ago 2 replies      
This statement is a great example of professional writing in action.

> Apple announced this week that theyve joined our fight to use technology to improve sleep.

Right from the opening sentence, this piece begins on a positive note. It isn't f.lux vs apple. It's flux and apple versus the overarching problem, and that's a much more effective statement than the bitter fight that all of us were probably expecting. I'm very impressed by the f.lux team's maturity.

If I were to teach a professional writing course, I would show this piece to my students as an outstanding example of how spin can affect the reader's perception.

11
binaryanomaly 4 days ago 2 replies      
It's not the first time Apple has acted in such an unfavorable way towards inventors/developers. It was exactly the same story regarding the use of the volume button for taking pictures: first banned by Apple and then later integrated by Apple. Not the best attitude, imho.
12
goldenkey 4 days ago 0 replies      
F.lux is great but the root of the problem is the type of backlight that monitors use. A hardware company ought to come out with a monitor armed with multiple backlight bulbs.

I mean, look at the difference in abrasiveness of spectrum between these bulb technologies:

http://housecraft.ca/wp-content/uploads/2012/09/spectral_res...

F.lux is great and helps but what we really need is an ergonomic monitor. That will truly give us healthier eyes and improved circadian rhythm.

Monitor tech is hard but this sounds like a great goal for a startup. :-)

13
joblessjunkie 4 days ago 4 replies      
Did the f.lux team just seriously imply that dimming your monitor at night can prevent cancer?
14
sandbags 4 days ago 2 replies      
Although Apple are perfectly within their rights to do what they've done I hope they'd reach out to the f.lux authors and do something for them. It's not like they're destroying a business - f.lux being free - but it would reward the authors for proving the value of the improvement to the platform. You want to encourage developers like that. And it's not like Apple are short of cash.
15
tehwebguy 4 days ago 2 replies      
I would be so much angrier than this post is. Dude thinks they might want to team up after Apple fucked them 2x in a row.
16
PhasmaFelis 4 days ago 0 replies      
Why is f.lux struggling so hard to get on iOS, a platform that clearly doesn't want them, when they still haven't released a version for Android, which is significantly more friendly and, IIRC, has a larger user base?

My understanding is that an Android version would have to require a rooted phone to really do everything properly, which is a significant limitation, but rooting your phone is completely Google-approved and there are plenty of apps in the Android app store that openly require it. If (not unreasonably) f.lux is really concerned about reaching users who aren't savvy or interested enough to root their phones, then a root-only version would be an ideal test case to encourage Google to open up the API.

17
zippergz 4 days ago 1 reply      
Maybe this is a sidetrack, but is there real peer-reviewed science behind the stuff f.lux claims? I tried it for a while and noticed no difference whatsoever (except that photos look like crap when the color balance is so skewed).
18
cronjobber 4 days ago 0 replies      
I recently had a need to stop and re-start f.lux (on Windows). I noticed that it was slow to start and suspected it was accessing the network. Since I had disabled automatic updates, that was strange.

Switched to Process Explorer and found f.lux connected to three different addresses out there on the wild internet.

Fishy, to say the least. You need permanent internet connections to do what you're ostensibly doing? Replaced it with open-source alternative "redshift".

19
2bitencryption 4 days ago 4 replies      
Question about private api's:

What makes a private api private? Is it merely undocumented, but still usable in the exact same way as a "public" api? I.e., in my code I invoke it like normal, but I just need to know the name?

Or do I have to fiddle with the compiled code of my app to get it to call the instruction location of the otherwise invisible function?

If they were meant to be private, why can't the app, which surely runs in some underprivileged mode, be blocked from calling the function, which itself is privileged?

20
AdamGibbins 4 days ago 4 replies      
What announcement? I missed it.
21
simple10 4 days ago 1 reply      
Here's the related MacRumors article with more info on Apple's Night Shift app. It's currently available to developers but apparently only runs on iPhone 5s and iPad Air and later.

http://www.macrumors.com/2016/01/11/apple-ios-9-3-night-shif...

22
coldtea 4 days ago 0 replies      
Maybe they should try fixing the video glitches one frequently gets whenever f.lux is enabled on OS X first?
23
qkhhly 4 days ago 1 reply      
While they are spending time fighting on Apple's platform, I wish they'd spend a bit more time on Linux (Ubuntu) and android platforms. There are large number of users on those platforms. Their Ubuntu version works OK (not sure about other Linux variants), but there is no android version.
24
erikpukinskis 4 days ago 0 replies      
Makes sense. So many people get up in arms when a competitor starts trying to do their job. That doesn't make sense to me... There is so much to do, so many challenges out there to address. If someone wants to do exactly what you're doing, why not let them? Most people struggle to find a worthy successor. It's a gift. Take the opportunity to move on to the next phase in your life, which has a good shot at being even better. You're smarter now, after all.

I know some people feel they only have one good idea in them, but I think that results from either a) aiming too high on subsequent rounds, or b) phoning it in. But if you love to work, just start small and you'll avoid both of those things.

25
willtim 4 days ago 0 replies      
What I find strange is why they developed an iOS version at all, given the restrictions. They could instead have provided the top-selling Android version. I had to give my money to some other clone; I would have much rather given it to the innovators.
26
nickpsecurity 4 days ago 0 replies      
Made a scene from Pirates of Silicon Valley jump into my mind:

https://www.youtube.com/watch?v=yG4DvM0wxdk

"90 hours a week and loving it. Like the T-Shirt? I'm going to give it to my people. Some of them work even more than 90 hours a week." (Steve Jobs depiction)

Woz says it's the only accurate one about Jobs and company. So, whether those were his exact words or not, I can only assume Jobs worked his people to death to achieve Apple's success. Other things are consistent with that. Then, I hear they're "joining" f.lux to help their mission of improving sleep or whatever. Haha...

27
tacos 4 days ago 0 replies      
Part of me was hoping they'd let it go. While awareness of this issue is greater now than it was six years ago, I'm not sure if they've been pushing this wave closer to shore or merely surfing it.

If this is truly the world health issue they think it is, now that iOS is taken care of seems like they should focus on Android, TVs and Kindles instead of begging Apple to let them compete with a built-in feature.

After their last PR push and petition campaign (which landed them on various media outlets and the HN homepage twice in three days) it took them 7 weeks to land 5,000 signatures.

I appreciate their passion but talking about cancer, weight gain and acne -- while providing affiliate links to salt lamps and Swarovski crystals -- just feels weird.

29
kaiserama 4 days ago 2 replies      
I would love to have f.lux on my un-jailbroken iPhone. Currently I use Twilight App on a Nexus by my nightstand during the night because of how harsh the iPhone screen is in the dark (even with brightness turned down all the way).
30
niels_olson 4 days ago 0 replies      
Nocturne in red mode is far better for sleep conservation. I started using it over flux because many moons ago I developed an obsession for preserving night vision while standing watch on the bridges of ships, and can attest that you can pass out cold while editing a document with Nocturne in red mode.
31
dreamling 4 days ago 0 replies      
That's an incredibly gracious post! Wow.
32
usermac 4 days ago 1 reply      
For years I've used the accessibility feature to make the home screen invert the colors when I triple click it. Problem solved.
33
hitlin37 4 days ago 0 replies      
Redshift is pretty nice too, if you are on Linux. Unfortunately, f.lux experience isn't so great on Linux.
34
eridius 4 days ago 2 replies      
> Today we call on Apple to allow us to release f.lux on iOS, to open up access to the features announced this week, and to support our goal of furthering research in sleep and chronobiology.

I'm confused, why do they want this? Night Shift does what f.lux does. Apple opening up the APIs to allow f.lux to run on iOS seems rather pointless, since the OS is already doing the same thing.

35
milkers 4 days ago 1 reply      
This is just another way to say "Apple, please acquire us".
36
cha-cho 4 days ago 1 reply      
What is the lesson here for the next flux type product creator?
37
beatpanda 4 days ago 1 reply      
This, right here, is why I abandoned the Apple ecosystem. When a company has the kind of control that Apple has with the app store, they will inevitably abuse it.
38
rocky1138 4 days ago 0 replies      
"Please, sir, can I have some more?"
39
s73v3r 4 days ago 0 replies      
Except what would f.lux add on top of what Apple has released? It seems like Apple already did all the work.
Before Growth samaltman.com
363 points by firloop   ago   206 comments top 32
1
beat 3 days ago 13 replies      
This reminds me of pg's comments about "playing house". The appearance of a startup is more important than the reality of a startup, for founders whose core drive is to "be a startup founder", not "create this product the world desperately needs so maybe it will stop haunting my dreams".

I've met founders like this, recently. They're building some sort of social app that does something no one needs and no one would pay for. Maybe I'm being dense and they're actually geniuses, but I don't think so. Most startups fail, and I think a lot of that is because a lot of startups are really stupid. They're "solving" worthless non-problems because the founders aren't doing the hard work of finding real, valuable problems to solve.

Building a great product is really hard. And it can't all be done completely lean - at some point, you need to envision something so profound that people don't realize how much they need it. You don't just write Hello World, charge a buck, growth hack, and brag about how you got two bucks for it the next week. As Henry Ford said, if he'd asked customers what they wanted, they'd have said faster horses. Don't build a faster horse. You won't be great that way.

2
Aqueous 3 days ago 9 replies      
I don't really understand why they need to grow quickly. You only need to grow quickly if you take millions of dollars and need to distribute a profit to your investors before too long.

Whatever happened to growing slowly, proving the revenue stream before you throw millions of dollars at something? Why is that such a bad thing?

I understand from a VC's point of view why it's a bad thing, and that's the point of view that YC and other incubators are coming from. But why is it a bad thing for the entrepreneurs?

Taking time to consider what you're building is important. Taking time to grow is also important.

3
hitekker 3 days ago 1 reply      
>I think the right initial metric is do any users love our product so much they spontaneously tell other people to use it? Until thats a yes, founders are generally better off focusing on this instead of a growth target.

I have a warning for future young developers: do NOT join a startup that hasn't found product/market fit and isn't trying, with all of its might, to find that product/market fit. And when I say with all of its might, I mean avoid companies where product/market fit isn't the startup's first and foremost goal.

This is really hard to determine from the outside.

For example, a friend of mine is working in a well-funded, post series A startup that just declared internally its biggest target was a billion dollars of products shipped by the end of the fiscal year. The team is smart, the founders are passionate, and they've been judicious with their funding.

Problem is that the CEO herself admitted that they don't yet have product/market fit. Which is to say their "guiding light" metric is a lagging indicator, i.e. it measures the tail end of their efforts, and certainly not how much customers love them (and are willing to pay). It's the equivalent of early Facebook using advertising revenue or page views as its growth target instead of monthly active users.

As Sam Altman points out, everything they do will be "hacking" or, as PG puts it, doing tricks that are not sustainable for a real, billion dollar company, in attempts to achieve that misleading number.

At least from what she told me, the most senior engineer said that this isn't bad: the engineers will focus on product/market fit and the sales/marketing teams will focus on growth. Of course, she didn't say that, in a conflict between the fake growth target and product/market fit, the fake growth target always wins. Even now, I'm trying to pull my friend out, before the mental gymnastics set in.

Don't be fooled. A startup needs a real, sometimes small, star in the sky to navigate churning waters. A ship that chases the moon is a ship that will soon be sinking.

4
adamb 3 days ago 1 reply      
Perhaps I'm the only one that sees this, but this essay appears to be a significant departure from the YC rhetoric I've observed in the past few years. Every time I've heard retention discussed (at YC) in the context of growth, it's cast as something you address when your growth has stalled. Not, as this essay would lead you to believe, a prerequisite of growth itself.

This essay, while it discusses a really important idea, comes across as saying: "Foolish, fashion-focused founders! Clearly retention comes first! We would never imply that you should focus on growth before your product had adequate retention!"

But that's exactly what I've seen YC partners do. Recommend that companies focus on growth, because that's what YC's definition of a startup is: a company that grows very quickly. Having seen both sides of the curtain, the essay leaves me with a greasy, queasy feeling.

Am I missing something here?

Sam, are you changing your standing advice from "focus on growth" to something else? If that's what this post signals, please take ownership of the change and spell it out.

It's poor form to imply that misguided founders (and their devotion to a fad) are driving the growth zealot craze when you've had a hand on the wheel for years.

5
magicmu 3 days ago 1 reply      
> I think the right initial metric is "do any users love our product so much they spontaneously tell other people to use it?" Until that's a yes, founders are generally better off focusing on this instead of a growth target.

It all comes back to this. I was listening to Aaron Harris's interview with DigitalOcean's Mitch Wainer on Startup School Radio the other day (phenomenal podcast). Wainer mentioned how people actually love DigitalOcean, and how rare it is for that word in particular to be paired with the name of a company. I think it's said so often that it can easily be glossed over, but that word is not chosen lightly. Having a product users actually love is extremely difficult, and extremely rewarding.

6
pquerna 3 days ago 0 replies      
> I think the right initial metric is "do any users love our product so much they spontaneously tell other people to use it?"

Many larger companies use "Net Promoter Score" as a way of measuring this:

https://en.wikipedia.org/wiki/Net_Promoter

Of course, like Agile, there are consultants, a methodology, and books, but there is some truth in asking your customers whether they would promote you to another person.
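The arithmetic behind NPS is tiny. A minimal sketch in C (the 0-6 / 7-8 / 9-10 buckets are the standard NPS cut-offs; the survey data in the example is made up):

```c
/* Net Promoter Score: respondents answer "how likely are you to
   recommend us?" on a 0-10 scale. 9-10 are promoters, 0-6 are
   detractors, and 7-8 are passives, which are ignored.
   NPS = %promoters - %detractors, so it ranges from -100 to +100. */
int nps(const int *scores, int n) {
    int promoters = 0, detractors = 0;
    for (int i = 0; i < n; i++) {
        if (scores[i] >= 9) promoters++;
        else if (scores[i] <= 6) detractors++;
    }
    return (100 * promoters) / n - (100 * detractors) / n;
}

/* Example: {10, 9, 8, 7, 3, 10, 6, 9} has 4 promoters and 2
   detractors out of 8 responses, so NPS = 50 - 25 = 25. */
```

Note the single number hides the mix: an NPS of 25 could be a mild crowd or a polarized one, which is part of why asking customers directly still matters.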

7
ivankirigin 3 days ago 1 reply      
It seems like there are a million different ways to say the same thing: early on focus on retention.

You can get this from your data at sufficient scale. You can get it from surveys like NPS. You can get it from talking to customers.

One problem I have with the word "growth" is that it implies top-level acquisition. A better definition would be hitting your goals. If you want to grow your retained user base, then top-level acquisition alone isn't actually hitting your goal.

8
applecore 3 days ago 1 reply      
This is closely related to Aristotle's concept of potentiality[1] and the active intellect. More succinctly, a startup must build up the potential energy of its product prior to transforming it into the kinetic energy of growth.

[1]: https://en.wikipedia.org/wiki/Potentiality_and_actuality

9
minimaxir 3 days ago 3 replies      
> A startup that prematurely targets a growth goal often ends up making a nebulous product that some users sort of like and papering over this with growth hacking. That sort of works (at least, it will fool investors for a while until they start digging into retention numbers), but eventually the music stops.

But by then, the startup has already won. They already got the funding thanks to their sleazy marketing, and the social validation/connections that comes with it.

The essay doesn't contradict the "it is easier to ask for forgiveness than permission" idiom, unfortunately.

10
gustaf 3 days ago 0 replies      
This is a great post. This is by far the most common misunderstanding I see in new startups. People are far too confident that their product has reached product/market fit before it actually has, often making this claim with very little data.

I've seen many VCs who are not data-literate enough to understand the difference between measuring product value and growth without substance.

11
andrewmcwatters 3 days ago 0 replies      
Isn't Y Combinator's whole thing "Make something people want"? That's a rather succinct and meaningful statement on its own, and a very powerful one that should be held above perhaps all others.

I'm sure that whatever your business practices are, after boiling everything away, you should be doing this above all things. An article like this suggests that a sizable number of organizations are losing sight of that.

12
danieltillett 3 days ago 0 replies      
The alternative to this is focus on getting your customers to pay you in advance for your product. Not only will this tell you if your customers really love your product, but it will provide you with all the cash you need to grow as fast as you want.

The downside is that very few products are so loved that customers will pay for 12 months in advance.

13
hammock 3 days ago 1 reply      
Sam is talking about the importance of watching leading indicators when there is a good bit of time between your present actions and future performance.

If you are relying on your users to drive growth, an easy-to-measure proxy of "do any users love our product so much they spontaneously tell other people to use it?" is Net Promoter Score[1], or the difference between the % of users who would recommend you, and those who would not.

There are other (perhaps non-SV-startup) businesses that don't rely on users to drive growth, for example if you rely heavily on distribution arrangements or retail placement. In that case you might want to look at other metrics that are leading indicators of growth.

[1]https://en.wikipedia.org/wiki/Net_Promoter

14
steve-benjamins 3 days ago 0 replies      
As Warren Buffett has said, the best strategies are simple but hard. This is YC advice at its best.

(I'm paraphrasing Buffett)

15
betadreamer 3 days ago 0 replies      
Great post, in that great product -> growth, but I think it's not stating the real problem. We need to ask why founders are starting to have this mentality. From my perspective, it's because YC tends to give out a lot of generic advice, and some of it sticks more than the rest. The big one that caused this is PG's "startup = growth" post; yes, he is right, and it specifically states that you really need to concentrate on growth.

If you have to take away one thing, it will be growth, but this generalization is very dangerous because, as Sam's post states, you need a product before growth. The better advice is not to concentrate on one thing but to bluntly say you need to have everything, and everything will come together. You need a great team, a great product, and great growth. If one is weak, you need to fix it.

16
sharemywin 3 days ago 1 reply      
I'm curious what happened to Homejoy. They talked about having good cohort analysis numbers (retention), and then they blew up?
17
colund 3 days ago 0 replies      
This growth belief reminds me of the trap of focusing on 'what' instead of 'why' as in https://www.ted.com/talks/simon_sinek_how_great_leaders_insp...

It seems stupid to do something just because you think you're expected to do it rather than think for yourself and do what you want to do and think would be a good idea.

18
mildbow 3 days ago 1 reply      
This is interesting in how un-interesting it is.

Find product-market fit before taking investment, because the only reason to take VC investment (for the type of startups we are talking about) is as a steroid injection for growth.

Why is this the only time it makes sense? Because that's the only way you and your investor's incentives are aligned.

Now, whether you need to/should grow fast to survive in the long term is another question.

19
EGreg 3 days ago 1 reply      
NAIL ENGAGEMENT FIRST. THEN GROWTH. Otherwise you are just burning through users. Your best hope is to attract them back later via transactional notifications.
20
frik 3 days ago 0 replies      
Two months ago: https://news.ycombinator.com/item?id=10495402 (Altman's previous blog article)

How much or little has changed?

21
avivo 2 days ago 0 replies      
I've been explaining this perspective to most clients for years, almost word for word. Specifically the question/focus: "do any users love our product so much they spontaneously tell other people to use it?"

I'm excited to now have an "official" source I can easily cite for it!

I think there are some businesses where this might be less true, but it's pretty rare.

22
thedaveoflife 3 days ago 1 reply      
So his advice is to build a great product?
23
muxme 3 days ago 7 replies      
I'm running into the growth problem with my website: http://muxme.com. Lots of people say they love the website, but it seems like no one is telling their friends. I was thinking of maybe adding user-created sweepstakes, because I think the only way people would spontaneously tell their friends is if they win something from the site. More winners = more people telling their friends, but I can only go so far on my budget. What do you guys think?
24
cdmcmahon 3 days ago 0 replies      
This definitely rings true to me. The company I work for has been around for ~6 years and while we have made money and grown in that time, it wasn't until the last year really that we hit on exactly what our clients love about our product and continued to nail that. It can take way, way longer than some entrepreneurial people are comfortable waiting for. But now that we've found it, and are continuing to expand our product around what we've found, growth is starting to come much more easily.
25
edanm 3 days ago 1 reply      
'I think the right initial metric is do any users love our product so much they spontaneously tell other people to use it?'

Question - is this a valid metric when talking about B2B companies? I would imagine there are plenty of B2B solutions which became huge, but which don't exactly have companies rushing out to tell everyone to use them.

I tend to think sama's question is only relevant for B2C, but I'd love to be proven wrong.

26
free2rhyme214 3 days ago 0 replies      
Didn't Alex Schultz, VP of Growth at Facebook, already say this...

https://www.youtube.com/watch?v=n_yHZ_vKjno&feature=youtu.be...

27
amelius 3 days ago 0 replies      
> The other thing that these companies have, and that also usually gets figured out early, is some sort of a monopoly.

I wonder how this would apply to, say, Uber and AirBnB. What kind of monopoly do they have?

28
rdlecler1 3 days ago 0 replies      
Sometimes it's best to grow more slowly at first: build your brand and your culture, figure out your process, don't burn through mountains of cash (if you're lucky enough to have that problem), and wait for the market to come around.
29
Mz 3 days ago 0 replies      
So, basically, get the seeds right, then plant, water, and tend. Don't get crappy seed ideas and try to overcome their deficiencies with some kind of fertilizer (fertilizer AKA "manure"/BS).
30
bsbechtel 3 days ago 0 replies      
Also: good unit economics, or a plausible path to good unit economics for your product (i.e., a technology pipeline that will dramatically reduce some key cost for your company).
31
throwaway41597 3 days ago 0 replies      
small typo: "and that also usually gets figure out early" => figured
32
compumike 3 days ago 0 replies      
Great essay, Sam! I want to expand slightly from a founder's perspective:

You always have two constituencies to think about: (A) your users / customers, and (B) your not-yet-users/customers.

"Before growth", you should really pay very very little attention to constituency B. If you're doing anything interesting, there will always be a torrent of skeptics, doubters, and haters. (That sucks, but it's very human, and you just have to ignore it.)

Focus entirely on constituency A. There are three "R" metrics to look for: (1) Retention. (2) Revenue. (3) Referrals. (These are the "RRR" in the classic "AARRR" model.)

Initially, constituency A might just be your friends or extended network, whoever you can convince to give what you've built a try. You may be able to get them to try your new product once or twice, but that's about it.

However, if they keep using it again and again voluntarily, that's a really good sign (Retention). If they voluntarily and happily pay for it, that's great (Revenue). And if they voluntarily tell other people they know about it (Referrals), that's amazing!

Until you have those nailed -- or at least retention and referrals if you're deferring monetization -- don't worry at all about the "B" group (i.e. don't worry about acquiring new leads at the top of your funnel), because you don't have product-market fit yet.

Once you do have a product that the constituency A people love, only then can you start thinking about constituency B and how to turn them into new happy users / customers. (While continuing to make your constituency A happy!)

I think Sam is going a step further to place extra emphasis specifically on Referrals out of all three of these metrics. This is a subtle but powerful insight. Whether your product is B2C or B2B, most purchasing decisions have a huge emotional component. If your product is so good that your users want to tell other people about it, that's a huge step above being merely satisfactory, and goes a huge way toward powering your growth.

I don't care if you're buying a burrito, a car, CDN bandwidth, or an analytics SaaS. You as a buyer want a reasonable degree of certainty that you're going to feel good about that purchase after you've made it. Hearing that referral from your peer, whether a friend who tried that food truck before, or another engineer who used that SaaS provider before, goes a huge way toward giving you that pre-purchase confidence.

If your users aren't excited enough about it to be talking about it with their peers, you may have to adjust your product or your segmentation of the market until you get there.
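The three "R" metrics described above all reduce to simple ratios over a cohort. A toy sketch, with entirely hypothetical names and numbers, of how one might track them:

```c
/* Toy cohort tracker for the "RRR" metrics. A "cohort" is one batch
   of users who first tried the product in the same window; each
   metric is just a percentage of that cohort. All fields and the
   example figures are made up for illustration. */
struct cohort {
    int signed_up;    /* users who tried the product */
    int still_active; /* came back voluntarily (Retention) */
    int paying;       /* voluntarily paid (Revenue) */
    int referred;     /* brought in someone else (Referrals) */
};

/* Percentage of a cohort, guarding against an empty cohort. */
double pct(int part, int whole) {
    return whole ? 100.0 * part / whole : 0.0;
}

/* Example: a cohort of 200 signups with 90 still active, 40 paying,
   and 25 who referred someone gives 45.0% retention, 20.0% paying,
   and a 12.5% referral rate. */
```

Comparing these percentages cohort-over-cohort, rather than looking at raw top-line user counts, is what separates real traction from growth-hacked acquisition.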

Raspberry Pi Bare Metal Programming with Rust thiago.me
339 points by thiagopnts   ago   90 comments top 10
1
jcoffland 2 days ago 6 replies      
As a long-time C programmer, I don't find this very convincing. A C program to do the same requires far less voodoo: all you need to do is take the address of the GPIO register, then toggle the bit. No name mangling. No error handlers to override. Don't get me wrong, I know Rust does have some compelling features. It does seem odd to me that so much of what would be compiler options in C is hard-coded.
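For reference, the C version really is "take the address, toggle the bit." A hedged sketch: the addresses below are the documented BCM2835 GPIO registers on the original Raspberry Pi (pin 16 drives the ACT/OK LED, active low, on the Pi 1 Model B); later boards map peripherals elsewhere, so treat this as illustrative rather than drop-in:

```c
#include <stdint.h>

/* Compute a new GPFSELn value that makes one pin an output: each pin
   in the bank gets a 3-bit function field, and 001 means "output". */
uint32_t fsel_output(uint32_t fsel, int pin_in_bank) {
    int shift = pin_in_bank * 3;
    return (fsel & ~(7u << shift)) | (1u << shift);
}

/* Assumed BCM2835 (Raspberry Pi 1) register addresses: */
#define GPIO_BASE 0x20200000u
#define GPFSEL1 (*(volatile uint32_t *)(GPIO_BASE + 0x04)) /* pins 10-19 */
#define GPSET0  (*(volatile uint32_t *)(GPIO_BASE + 0x1C))
#define GPCLR0  (*(volatile uint32_t *)(GPIO_BASE + 0x28))

/* Bare-metal entry point: blink GPIO 16 forever. */
void kernel_main(void) {
    GPFSEL1 = fsel_output(GPFSEL1, 6); /* pin 16 is slot 6 of bank 1 */
    for (;;) {
        GPSET0 = 1u << 16;                           /* drive pin high */
        for (volatile int i = 0; i < 500000; i++) {} /* crude delay */
        GPCLR0 = 1u << 16;                           /* drive pin low */
        for (volatile int i = 0; i < 500000; i++) {}
    }
}
```

No mangling, no lang items; the trade-off the thread is debating is whether Rust's safety machinery is worth the extra ceremony for code this small.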
2
Aeolos 2 days ago 5 replies      
I have the perfect project for this! Generate a signal using the GPIO ports to drive a mirror and camera in a synchronized fashion.

Does anyone know if the RPi GPIOs can be driven at around 80 kHz? I've seen reports that this is possible, but that the USB or video driver tends to lock the CPU for long stretches, messing with timings. Hopefully running on bare metal would take care of that.

3
akerro 2 days ago 7 replies      
Is Rust the future? As a C++ (hobby), Java (full time) developer, should I invest my time in Rust?
4
notabot 2 days ago 2 replies      
As a C programmer who fiddles with low-level stuff, I wonder why people get excited about such a trivial thing.

Don't get me wrong: Rust is an interesting language, and the thing described in this post is well within its capability. IMO there just isn't anything here worth bragging about. Such a trivial thing neither demonstrates the real potential of Rust nor answers important questions from a real-world engineering perspective.

I'm all for having better tools to write low-level stuff. I have dabbled with Rust and the experience was eye-opening. I think Rust still has a lot of catching up to do, though.

5
akhilcacharya 2 days ago 1 reply      
Outside of C, are there any other languages or platforms that can do this? I'd like something modern, but I haven't liked what I've seen with Rust personally.
6
kod 2 days ago 0 replies      
If you want to play around with the code in this post that requires nightly, multirust makes it much more pleasant to use nightly and / or stable rust.

https://github.com/brson/multirust

7
nickysielicki 2 days ago 1 reply      
I remember reading a couple of months ago that bare-metal Rust was in a bad state. If I recall correctly, the binaries were prohibitively large, among other issues.

Has that changed?

8
s986s 2 days ago 1 reply      
Very cool! Rust has been impressing me a lot these past few months.
9
Animats 2 days ago 0 replies      
Very nice. A few more months and that should be stable.
10
kalimatas 2 days ago 0 replies      
Thanks for sharing!
There are no secure smartphones devever.net
369 points by moehm   ago   114 comments top 25
1
gue5t 4 days ago 8 replies      
The folks at http://neo900.org/ are well aware of this and that phone is designed accordingly (details at http://neo900.org/faq#privacy). Hype-driven products like BlackPhone misrepresent their devices as being perfectly secure when this significant attack vector is completely unmitigated.

On the Neo900, the modem is connected via USB (the bus only; there is no physical connector), which means it doesn't have DMA. There is no feasible open-source baseband. OsmocomBB (http://bb.osmocom.org/trac/) is the closest thing to one, and it is relatively incomplete and works on a very limited range of mostly badly outdated hardware, none of which would really be reasonable to use in a phone manufactured today.

It's incredibly difficult to get people to care and help with the lack of software for tasks like GSM communication. Somehow even among people who describe themselves as "hackers", most just want to run Android or iOS and buy/run closed-source apps, and are more interested in Javascript and employment than reverse-engineering and doing things that have never been done before. The potential of reprogrammable computers that, at a low level, run the code you ask them to doesn't seem to get through to most of the HN crowd.

2
tptacek 4 days ago 7 replies      
It's good to draw attention on baseband processors, but there are technical assertions in this post that are probably not accurate (lack of auditing and the notion that you can assess the security of a whole phone system by whether or not there's an IOMMU).

The systems security of modern phones is surprisingly complex. Google and Apple both care very deeply about these problems, and both have extremely capable engineers working on them. Without getting too far into the weeds: they haven't ignored the baseband.

3
verusfossa 4 days ago 1 reply      
Semi-related: I feel like there should be an open-source dumbphone project. Smartphones have a lot of bells and whistles with a large attack surface, but many people just want to make calls. Sure, this wouldn't fix the baseband issue, but it seems the safest way to ensure isolation between personal information and potentially hostile cellular blobs is to never put the information on the device in the first place. A small, open, dialer-only yet potentially extensible cellular platform would be really welcome, I think. Maybe something like this [1], using the RPi Zero as a base with some 3D-printed cases, reverse-engineering what you can otherwise. Replicant is a noble project, but chasing Android versions, screen sizes, video codecs, proprietary hardware locks, etc. is an uphill battle and kind of a distraction IMHO.

[1] https://www.raspberrypi.org/blog/piphone-home-made-raspberry...

edit: typo

4
hardwaresofton 4 days ago 3 replies      
Would really love to see this upvoted more. This basic truth should be common knowledge for privacy-minded or security-minded technologists/developers.

There are lots of reasons GSM won't work, or is hard to make work. What are the options? As more and more carriers in the USA provide WiFi dongles connected to 3G, maybe it's better to just do that and move off making calls directly from your phone completely?

For example, it might make sense to buy some phone, connect it to a device (or flash it with some software) that makes it essentially a portal for phone calls of sorts, and give it sandboxed access to your network. It's significantly harder for GSM backdoors to be effective if the entire device is sandboxed right? Maybe this way, as you roam around, you can somewhat securely communicate over IP to your call-making device, and make/receive calls?

[EDIT] - Thinking about it, the suggestion is moot, since all someone would have to do is write some software to replay messages, or leak messages or some other nefarious thing, and stick it on the baseband of the device -- even if it can't damage your network it's still quite insecure.

Maybe we should just give GSM up altogether, and start trying to move ourselves (and the world) to only communicating over IP (which we have a shot at securing, assuming modern crypto isn't completely broken)? What is the situation like with completley open source wifi connectivity?

5
peteretep 4 days ago 2 replies      
The article asserts:

> It would, in my view, be abject insanity not to assume that half a dozen or more nation-states (or their associated contractors) have code execution exploits against popular basebands in stock.

To me this ignores the flip side of the argument. If US intelligence services really thought the Chinese and Russians could remotely and invisibly hack all/most smartphones, then no one with access to sensitive information would be allowed to do work on one, unless they're very confident that they've managed to secure their devices without leaving a stone unturned.

Soz Hilz[0].

[0] http://media4.s-nbcnews.com/i/newscms/2015_10/913276/150303-...

6
shmerl 4 days ago 0 replies      
> Modern smartphones have a CPU chip, and a baseband chip which handles radio network communications (GSM/UMTS/LTE/etc.) This chip is connected to the CPU via DMA. Thus, unless an IOMMU is used, the baseband has full access to main memory, and can compromise it arbitrarily.

Indeed. Such a design, coupled with very obscure and closed baseband firmware, is a security nightmare. One should ask who was pushing for such an approach.

7
matt_wulfeck 4 days ago 2 replies      
> For devices with cellular access, the baseband subsystem also utilizes its own similar process of secure booting with signed software and keys verified by the baseband processor.

According to the iOS security white paper, the baseband firmware is part of the secure boot chain, and has its own secure boot chain.

This allows me to assume it's very hard to inject or replace the firmware with malicious code. Whether or not the firmware itself has a backdoor or whatever, I don't know, but at least one major phone manufacturer knows this firmware is very important for security.

8
jrockway 4 days ago 0 replies      
Don't modern ARM chips encrypt memory (for DRM reasons)? If that's in use (modulo the area the baseband needs to read/write), it doesn't matter. An adversary can read the memory, but can't read the encryption key out of the CPU.

If it could, anyone with a logic probe could grab un-DRM'd video data out of the RAM, which would make many people very unhappy.

9
mindslight 4 days ago 0 replies      
The Samsung i9300/i9500/etc use separate Exynos application processors coupled with stand-alone communication chipsets from Intel (originally Infineon). Check out the Replicant wiki for more info. The Replicant devices are unfortunately quite old (i9300 being the latest), but newer models do continue the trend (SM-G920H for the S6 international Exynos, I believe). You'll just be stuck on their proprietary OS, or need to fund a bunch of CM/Replicant development.

Of course, the same branding (e.g. "Galaxy S6") covers many different models across the world, most using integrated Qualcomm chips. Honestly, looking at the list of variants and thinking about the conservatism of RF and telecom regulatory regimes, you'd have to be naive to think the whole ecosystem doesn't simply exist under the control of major intelligence agencies. Communications have always been regarded as dangerous.

10
95014_refugee 4 days ago 0 replies      
The article author is not well-informed. You can verify this yourself by physical inspection, via leaked schematics, by paying a teardown analyst or via examination of available documentation for the components used in modern smartphone designs.
11
brandmeyer 4 days ago 0 replies      
There is something in between nothing and a full IOMMU. I've been working with a TMS570 processor lately whose DMA engine supports an IOMPU. This hardware is equivalent to the Cortex-R/M Memory protection unit.

An MPU has all of the same protection domains that an MMU does, with a few major differences:

- The total number of protected regions is very small (12 or so), such that the hardware cost is somewhat smaller than a TLB cache.

- To offset the small number of regions, the size of each region can be almost any power of two.

- The MPU does not perform address translation, again reducing the hardware cost.

Thus, the kernel can configure the peripheral's DMA engine to only allow access to a page or few.
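As an illustration of the power-of-two constraint: on the Cortex-M MPU (which the parent says this DMA-side MPU resembles), the region size is encoded in the RASR SIZE field as log2(bytes) - 1, with a 32-byte architectural minimum. A sketch of that encoding, assuming the Cortex-M layout (the TMS570's DMA-side registers may differ):

```c
#include <stdint.h>

/* Encode an MPU region size for a Cortex-M-style RASR SIZE field:
   regions must be power-of-two sized, and the field holds
   log2(size_in_bytes) - 1. Returns -1 for invalid sizes (not a
   power of two, or below the 32-byte architectural minimum). */
int mpu_size_field(uint32_t bytes) {
    if (bytes < 32 || (bytes & (bytes - 1)) != 0)
        return -1;                /* not a valid power-of-two region */
    int log2 = 0;
    while ((bytes >>= 1) != 0)    /* count the set bit's position */
        log2++;
    return log2 - 1;              /* e.g. 32 B -> 4, 1 KiB -> 9 */
}
```

The kernel would OR this field into the region attribute register along with access permissions before enabling the region; the "almost any power of two" flexibility is what offsets having only a dozen or so regions.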

12
hackuser 4 days ago 1 reply      
Based on what I've read from more authoritative sources (or is the author an authority in this area?), this information is outdated:

> It can be safely assumed that this baseband is highly insecure. It is closed source and probably not audited at all. My understanding is that the genesis of modern baseband firmware is a development effort for GSM basebands dating back to the 1990s during which the importance of secure software development practices were not apparent. In other words, and my understanding is that this is borne out by research, this firmware tends to be extremely insecure and probably has numerous remote code execution vulnerabilities.

I've read in several places that basebands now widely use the OKL4 microvisor,[1] based on the formally verified (fwiw) seL4 microkernel, and are much more secure than before. Does anyone know more about this?

[1] https://gdmissionsystems.com/cyber/products/trusted-computin...

13
xlayn 4 days ago 1 reply      
I addressed, in a response to gue5t, how creating a chain of trust may not be a doable goal, as probably every part of the system is untrustworthy.

However, I'm thinking a different approach can be taken: suppose we abstract the different means of communication a device has, treat them as sockets or layers, and then create an algorithm that distributes the communication across several channels.

For example two cellphones

-one against another using the light on one screen against the camera of the other

-the vibrator motor captured by the microphone

-introducing certain pattern of noise in Bluetooth communication by the other radios

-communication through steganography

-sending huge amounts of information (the more information the more power needed to discriminate, understand)

-abuse how things are not supposed to work (instead of sending packets in the correct order, use ping as a way of sending data by crafting the requests).

-Custom network stack

-use of a customized version of encryption with extra large keys (think 20480 bits instead of 2048)

-and last but not least, use an algorithm that mutates the distribution logic over time (in the same fashion viruses mutate themselves)... using, for example, your voice as the key

edit: explanation

14
shyn3 4 days ago 0 replies      
Would this device be protected? http://www.gsmarena.com/blackberry_priv-7587.php
15
InclinedPlane 4 days ago 1 reply      
This is just a special case of the fact that there's no secure anything.

In 2016 that's just the price you pay for using computers. You just have to live with it. Mitigate it the best you can, rely on the ol' "mossad or not mossad" strategy now and then, hope for the best, etc. If you have a strong need for increased security, well, god help you (spoilers: you will receive no help), you're going to pump a lot of effort into building something that will still have tons of vulnerabilities.

16
scientes 4 days ago 4 replies      
Or you can just use WiFi and turn the baseband off like I do. The cell companies are all crooks anyway (in the US), and I don't want to do business with them.
17
delinka 4 days ago 0 replies      
"...no secure smartphones." There's always a weakness somewhere. Makes me wonder why governments want backdoors. Just in case we manage to make a perfectly secure device? Is it too onerous to ask a judge for permission to "hack" a phone through weaknesses rather than asking permission for an escrowed key?
18
elchief 4 days ago 1 reply      
How about those http://www.cryptophone.de/en/company/news/gsmk-introduces-ne... phones? Is a "baseband firewall" just a gimmick?
19
walterbell 4 days ago 0 replies      
Tor has a discussion (2014) of Android security, including baseband processors, https://blog.torproject.org/blog/mission-impossible-hardenin...
20
wbl 4 days ago 0 replies      
When I spoke to John Callas he said there was a serial link to the baseband on the Blackphone. I might be misremembering, but it certainly is an issue that designers are working on solving.
21
aninteger 3 days ago 0 replies      
So, what tools and knowledge would one need to tinker with hardware enough to understand and build security for smartphones?
22
monochromatic 4 days ago 3 replies      
Why is it that the baseband has full access to main memory?
23
nickpsecurity 4 days ago 0 replies      
That's right. Focus on the baseband is kind of the new fad in mainstream ITSEC. These problems are long known in high assurance, as certification requires all things that can compute, store, or do I/O to be assessed. The reason is that, historically, these were all places attacks came in. I'm pretty tired, but I can do at least a few points here.

1. Software. The phones run complex, low-assurance software in unsafe languages on inherently insecure architectures. A stream of attacks and leaks came out of these. The model for high assurance was either physical separation with a trusted chip mediating, or separation kernels + user-mode virtualization of Android, etc., so security-critical stuff ran outside that. There was strong mediation of inter-partition communications.

2. Firmware of any chip in the system, especially boot firmware. These are privileged, often thrown together even more hastily, and might survive a reinstall of other components.

3. Baseband standards. Security engineer Clive Robinson has detailed many times on Schneier's blog the long history between intelligence services (mainly British) and carriers, with the former wielding influence on standards. Some aspects of cellular stacks were straight-up designed to facilitate their activities. On top of that, the baseband has to be certified against such requirements, and this allowed extra leverage, given the lost sales if there's no certification.

4. Baseband software. This is the one you hear about most. They hack baseband software, then hack your phone with it.

5. Baseband hardware. One can disguise a flaw here as leftover debugging stuff or whatever. Additionally, the baseband has RF capabilities that we predicted could be used in TEMPEST-style attacks on other chips. Not sure if that has happened yet.

6. The main SoC is complex, without much security. It might be subverted or attacked; with subversion, it might just be a low-quality counterfeit. Additionally, the MMU or IOMMU might fail due to errata. An old MULTICS evaluation showed that sometimes one can just keep accessing stuff all day, waiting for a logic or timing-related failure to allow access. They got in. More complex stuff might have similar weaknesses. I know Intel does, and it fights efforts to get specifics.

7. Mixed-signal design ends up in a lot of modern stuff, including mobile SoCs. Another hardware guru who taught me ASIC issues said he'd split his security functions (or trade secrets) between digital and analog, so the analog effects were critical for operation. This slowed reverse engineering, because his digital customers didn't even see the analog circuits with digital tools, nor could they understand them. He regularly encountered malicious or at least deceptive behavior in third-party IP that similarly used mixed-signal tricks. I've speculated before about putting a backdoor in the analog circuits modulating the power in a way that enhances power-analysis attacks. Lots of potential for mixed-signal attacks that are little explored.

8. Peripheral hardware is subverted, counterfeit, or has similar problems as above. Look at a smartphone breakdown sometime to be amazed at how many chips are in it. Analog circuitry and RF schemes as well.

9. EMSEC. The phone itself is often an antenna from my understanding. There's passive and active EMSEC attacks that can extract keys, etc. Now, you might say "Might as well record audio if they're that close." Nah, they get the master secret and they have everything in many designs. EMSEC issues here were serious in the past: old STU-III's were considered compromised (master leaked) if certain cellphones got within like 20 ft of them because cell signals forced secrets to leak. Can't know how much of this problem has gotten better or worse with modern designs.

10. Remote update. If your stack supports it, then this is an obvious attack vector if carrier is malicious or compelled to be.

11. Apps themselves if store review, permission model, and/or architecture is weak. Debatable how so except for architecture: definitely weak. Again, better designs in niche markets used separation kernels with apps split between untrusted stuff (incl GUI) in OS and security part outside OS. Would require extra infrastructure and tooling for mainstream stuff, though, plus adoption by providers. I'm not really seeing either in mainstream providers. ;)

That's just off the top of my head from prior work trying to secure mobile or in hardware. My mobile solution, developed quite some time ago, fit in a suitcase due to the physical separation and interface requirements. My last attempt to put it in a phone still needed a trusted keyboard & enough chips that I designed (not implemented) it based on Nokia 9000 Communicator. Something w/ modern functions, form-factor, and deals with above? Good luck...

All smartphones are insecure. Even the secure ones. I've seen good ideas and proposals, but no secure[ish] design is implemented outside maybe Type 1 stuff like the Sectera Edge. Even that one cheats, as far as I can tell, with physical separation and robust firmware. It's also huge thanks to EMSEC & milspec. A secure phone will look more like that or the Nokia. You see a slim little Blackphone, iPhone, or whatever offered to you? Point at a random stranger and suggest they might be the sucker the sales rep was looking for.

Don't trust any of them. Ditch your mobile or make sure the battery is removable. Don't have anything mobile-enabled in your PC. Just avoid wireless in general unless it's infrared. Even then it needs to be off by default.

24
miguelrochefort 4 days ago 0 replies      
What does one need a secure smartphone for?
25
digi_owl 4 days ago 0 replies      
Between -sec, cloudcuckoolander designers, and architecture astronauts it feels like all the fun has vanished from tech.
Amazon Has Just Registered to Sell Ocean Freight flexport.com
313 points by prostoalex   ago   98 comments top 16
1
pcooney10 4 days ago 8 replies      
This deal is huge, but I hope Amazon starts doing a little better diligence on their sellers. I was initially drawn to Amazon for their amazing customer reviews - they were usually current and decently well-written - but most importantly they offered a reassurance that websites without product reviews didn't have.

Over the past year, I've noticed a HUGE uptick in the quantity of fake 5-star reviews. They are so blatant it's frightening, and they usually go unnoticed in Amazon's default "Most Helpful" sorting.

In particular, the Home Office Desk Chairs landscape is pretty insane: http://www.amazon.com/Home-Office-Desk-Chairs-Furniture/b?ie.... I was trying to find a chair back in September, and I was appalled by some of the reviews I was seeing. Top selling products, with several hundred reviews that averaged out to 4/4.5/5 stars.

This is a screenshot from back in September: http://imgur.com/qbCz0yE, and it only contains a small sample of the "Awesome, highly recommend" reviews spattered around. You'll notice this pattern on virtually every chair on Amazon, except the Amazon Basics chairs, which were launched sometime in late September / early October. Their reviews seem pretty good so far (i.e. real), but unfortunately for me I had purchased a chair from eBay before these launched.

These patterns are pretty frightening (especially considering a lot of people are actually buying these things), and I've experienced the same issues when shopping for other things.

Has anyone else had an experience like this? Or am I losing it?

2
coffeebite 4 days ago 2 replies      
Did Flexport just preemptively reveal Amazons entry into the ocean freight market and then convince every major newspaper in the world to label the service unfit for American companies?
3
bemmu 4 days ago 1 reply      
Up to now they were telling anyone who wanted to ship directly to FBA warehouse to use Samuel Shapiro and Co., Inc.

I still have a stack of paperwork from them back when I was considering shipping some items by container from Japan.

4
balls187 4 days ago 1 reply      
I'm working on software to facilitate international commerce.

Amazon getting into international freight logistics is big news.

5
shubhamjain 4 days ago 2 replies      
50+ years in the future - Amazon is still optimising some part of their ecommerce infrastructure / entering a new market. Jokes aside, it will be incredibly interesting to see how Amazon's strategy of discarding profits and just scaling everywhere will work out. Will they become insanely profitable, as everyone expects?
6
jhayward 4 days ago 1 reply      
If I read this correctly this is a vertical integration play.

Just from the title I had guessed that that they would own ships as a hedge against rises in freight costs, just as airlines buy oil stocks etc.

Is anyone aware of any analysis of Amazon's recent moves in aircraft, drones, and so on as hedge vs integration vs disruption?

7
gulfie 4 days ago 1 reply      
Due to shipping company economics, the ships are in port. http://www.zerohedge.com/news/2016-01-11/nothing-moving-balt...

Amazon needs to move product to make money. If others will not physically move that product to the US, then they'll have to do it themselves. More expensive is better than none at all.

8
hayksaakian 4 days ago 0 replies      
this is actually really big.

amazon was changing up its rules around shipping from china a few months ago, now I can see why they did it.

9
ijafri 4 days ago 1 reply      
>As the freight forwarder on a companys shipments, Amazon would see both the name of the supplier and the wholesale price paid by the importer. (from the Article)

It's unlikely that an importer or exporter is going to use Amazon for freight forwarding outside its network. I guess Amazon will only end up handling shipments of its own or of network sellers who significantly rely on Amazon to sell their products, likely through an LCL consol box, putting a lot of smaller shipments into a single 40HC container. That's going to save them a lot of cost and give them tighter control over shipment routing, transit time, etc.

Other regular importers & exporters, whether in China or the USA, are never going to use it, for the reason mentioned above.

An importer or exporter shares too much sensitive and critical information with his freight and customs agent. If I suddenly get involved in trading - in other words, become their competitor - it's obvious no one is going to share that information with me.

10
brightball 4 days ago 2 replies      
I understand the move and why they are doing it...this just feels like what a monopoly starts to look like.
11
oneJob 3 days ago 1 reply      
Interesting possible side effect: ocean freight vessels are disproportionately one of the greatest contributors to CO2 emissions - in fact, one of the greatest in absolute terms. Perhaps Amazon's involvement will shine a spotlight on an issue that has heretofore gone under the radar.
12
hamandcheese 4 days ago 0 replies      
I thought amazon was already in the container game https://aws.amazon.com/ecs/
13
thedogeye 4 days ago 0 replies      
Someday I hope that Amazon will provide a fully vertically-integrated ecommerce marketplace for the people of the Amazon River basin.
14
swehner 3 days ago 0 replies      
I take it as a sign of unhealthiness.

Amazon obviously have too much cash lying around.

So be good and stop using Amazon!

15
it_learnses 3 days ago 0 replies      
the govt should seriously start considering breaking up Amazon.
16
meeper16 4 days ago 0 replies      
I thought they would have had this in the works years ago. No surprise.
WebTorrent BitTorrent over WebRTC webtorrent.io
373 points by Doolwind   ago   94 comments top 33
1
mmcclure 1 day ago 2 replies      
I feel like I've seen this pop up a few times now, but this is really, really cool stuff. The only thing that concerns me about the growing popularity of using WebRTC is the security concerns around unknowingly joining a p2p network like this for potentially any site you visit. It's not hard to imagine what a bad actor could do to content before passing it along, or more simply, the fact that your true IP is exposed.

Curmudgeony security issues aside, this undeniably feels like The Future and a big deal to watch out for. It's also one of those cases where a creator / maintainer makes a huge difference for long term viability in my opinion. Feross is crazy smart and has been working with all the related tech for a while now (via PeerCDN, Instant.io, etc, etc), and is just an all around respectful, nice guy, which is important for the continued development / community aspect.

2
lambdacomplete 1 day ago 5 replies      
Amazing project, really! But please, for the sake of users (like me) who live in countries where ISPs set a "quota" on DSL connections: ask the users whether they want to start downloading Sintel before doing so :) Now I'm afraid of opening the website again.
3
imrehg 1 day ago 3 replies      
Does this site really start downloading a 124MB torrent right after opening the page (sintel.torrent)? If so, why would that be a good idea to do?
4
erikpukinskis 1 day ago 1 reply      
This makes me so happy. If we can get good support for WebRTC and getUserMedia the web will be able to keep going as a decent platform for apps.

http://caniuse.com/#feat=stream

http://caniuse.com/#feat=rtcpeerconnection

We're really at the mercy of open platform-minded engineers at Google, Apple and Microsoft though! I wonder what we can do to help support those folks.

5
currysausage 1 day ago 2 replies      
Very curious about the legal implications if every site that I visit can transfer files to unknown peers in the background. P2P is, AFAIK, a big source of costly cease-and-desist orders in Germany. With WebTorrent, I guess I could tell the right holder to bring the matter to court and plausibly state that some malicious ad iframe must have distributed that MKV without my knowledge.
6
TheAceOfHearts 1 day ago 0 replies      
You can try out WebTorrent at Instant.io[0]. It's probably one of the easiest ways to share files with someone, as long as both people have modern browsers.

Unfortunately, past a certain file size it'll just crash your browser. It'd be great if there were a way to work with large (2GB+) files.

[0] https://instant.io/

7
taylorhou 1 day ago 0 replies      
Very interesting. Figured this day would come, and the dev finally did it. Re-decentralizing the web is a great goal, and with simple demonstrations like yours, we'll get there! Cheers mate
8
johnchristopher 1 day ago 0 replies      
Question: I see some local network IP addresses in the graph. I suppose external IP addresses are hidden for privacy/security purposes, but how well are they hidden?

Another question: how do I open the file once it's downloaded? (I use uBlock; should the file be displayed in the rectangular area next to the graph?)

9
yAnonymous 1 day ago 1 reply      
That's great, but BitTorrent over JS is also dangerous, at least where I live.

C/D letters come with a 200-1000 fee depending on the content and now it's trivial to make someone download stuff illegally in the background.

10
kentbrew 1 day ago 1 reply      
The page wants just a tiny bit of explanation about what's going on. Firefox 43.0.4 doesn't play the movie; it just sits there with a black box.
11
magicmu 1 day ago 0 replies      
What a coincidence, I was just playing with this for the first time last weekend! They also have an npm package that can be used for both torrent streaming via node and the browser (https://www.npmjs.com/package/webtorrent). Awesome project.
12
liamzebedee 1 day ago 1 reply      
WebRTC requires the use of a centralised signalling server for the initial connection between two peers. I feel many miss this point when reading about WebRTC-enabled projects. Even if you do have Universal Plug and Play which port forwards automatically (and thus you can communicate directly between two peers), you still need this centralised signalling server.

Correct me if I'm wrong, but this poses a problem if you ever want to take WebRTC further (i.e. in a self-hosted mesh network).
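To make this concrete, here's a toy model in plain JavaScript (not real WebRTC - the class and method names are invented for illustration) of why the offer/answer exchange needs a rendezvous point: neither peer can learn the other's connection details except through the signalling server.

```javascript
// Toy model of WebRTC signalling. A shared rendezvous point relays
// the offer and answer between peers; only afterwards could a direct
// peer-to-peer connection begin. All names here are hypothetical.

class SignalingServer {
  constructor() { this.inboxes = new Map(); }
  register(peerId) { this.inboxes.set(peerId, []); }
  send(toPeerId, message) { this.inboxes.get(toPeerId).push(message); }
  receive(peerId) { return this.inboxes.get(peerId).shift(); }
}

class Peer {
  constructor(id, server) {
    this.id = id;
    this.server = server;
    server.register(id);
    this.remoteDescription = null;
  }
  // Stand-in for createOffer() + sending the SDP via signalling
  offerTo(remoteId) {
    this.server.send(remoteId, { from: this.id, sdp: `offer-from-${this.id}` });
  }
  // Stand-in for receiving the offer and answering it
  answer() {
    const offer = this.server.receive(this.id);
    this.remoteDescription = offer.sdp;
    this.server.send(offer.from, { from: this.id, sdp: `answer-from-${this.id}` });
  }
  acceptAnswer() {
    this.remoteDescription = this.server.receive(this.id).sdp;
  }
}

const server = new SignalingServer();
const alice = new Peer('alice', server);
const bob = new Peer('bob', server);

alice.offerTo('bob'); // the offer travels via the signalling server
bob.answer();         // so does the answer
alice.acceptAnswer(); // only now could a direct connection start

console.log(alice.remoteDescription); // -> answer-from-bob
console.log(bob.remoteDescription);   // -> offer-from-alice
```

In real WebRTC the offer and answer are SDP blobs produced by RTCPeerConnection, and the signalling transport (WebSocket, HTTP, even copy-paste) is deliberately left unspecified by the standard.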

13
rtkwe 1 day ago 0 replies      
Interesting, if the player never starts you never connect to additional peers. I'm running this in firefox 43 with flash disabled and the video never starts.
14
janpieterz 1 day ago 0 replies      
Interesting, I'd be curious to see some speed tests. I was seeding to around 22 peers for a while but never got over 5Mbps up, while my internet connection is capable of around 530Mbps. Wondering if this is an inherent WebTorrent problem or simply that not enough people were online with strong connections.
15
cel1ne 1 day ago 0 replies      
Like many, I've been thinking about this for a couple of years.

My idea was a browser-plugin for youtube, that would take the downloaded video and start seeding it. On the other side, if a video has been blocked by YT, it would automatically use the torrent version.

16
jaysoncena 1 day ago 0 replies      
How come the download was already completed but the video only buffered around 50%?
17
throwaway13337 1 day ago 0 replies      
Can't wait to see a popcorn time in the browser. :)
18
edpichler 1 day ago 0 replies      
Very cool, but what annoys me is that it starts the p2p download and upload without asking for authorization.
19
franciscop 1 day ago 0 replies      
I was toying with the idea of doing something like this a couple of days ago, but two things stopped me:

- No support even in modern browsers by default [1]

- Don't want to [maybe] get into legal troubles if it's wrongly used

[1] http://caniuse.com/#search=webrtc

PS, apparently the caniuse info was wrong, since now it appears in green

20
buzzdenver 1 day ago 2 replies      
What is the animation on the left full of RFC 1918 addresses ? I assume those are really NAT-ed at some point, aren't they ?
21
devilsbabe 1 day ago 0 replies      
Other companies like http://streamroot.io/ are also using WebRTC to help content hosting sites like YouTube and Netflix deliver VOD and live streams. Really exciting!
22
ferongr 1 day ago 0 replies      
>Error: No WebRTC support: Not a supported browser

Funny, Fx44 does support WebRTC

23
leoplct 1 day ago 0 replies      
Looking forward to see Popcorn time on WebRTC
24
nik736 1 day ago 0 replies      
If I use Safari on that site it's just downloading from your server, right? Since Safari doesn't support WebRTC.
25
vonklaus 1 day ago 1 reply      
it is so fucking obvious that this idea is exactly how browsers will work in the future. A browser is going to just be something like node-webkit/webkit/electron etc. so compatibility won't be an issue, then you just connect to a ton of different clients that are running narrow crawls of shit you are searching for. The browser will then not take you to the page, but just display the information directly without loading a shit ton of js.

You can tag or organize the data locally and cache it, or return it sorted to the nodes which serve it to others. People don't give a shit about webpages for search, they care about information. The web is a big rss feed, and our old feedreader "google" stopped doing that well, and also we pay a massive privacy tax for that now.

I see this happening in ~2 years for really techie people and being standard in 5.

edit: Elasticsearch, WebKit, real time, distributed file systems, Apache Spark, Google TensorFlow. These ingredients will be used to make the new browser, which browses information and returns that information, not the actual web pages.

26
lelandbatey 1 day ago 2 replies      
This seems very interesting already! I now have some more technical questions:

- Where is the downloaded data being stored? With a traditional bittorrent client the data is written to disk. Since JS doesn't make raw disk access available, I'm assuming it's being kept through some JS API that tells the browser to store the data. What API is it using?

- Even when I finish downloading the video, the player doesn't allow me to seek to random positions in the video. It displays a "this is how much is buffered"[0] bar that is way smaller than the green bar at the top of the page indicating download progress. Why is this the case?

- As you can see in the screenshot[0], there's lots of nodes that are labeled with ip addresses that are not visible to my computer at all. Is this because the displayed ip addresses are self reported?

[0] - http://nacr.us/media/pics/screenshots/screenshot--17-46-37-2...

27
ericfrederich 1 day ago 0 replies      
How does this project differ from ipfs?
28
rasz_pl 1 day ago 0 replies      
1 Pretends to work on a browser that doesn't support WebRTC. This got me thinking, so I went to webrtc.org and all the examples/samples also pretend to work and/or fail silently - is the WebRTC API really not able to even ascertain the level of support of the running browser? I looked under the hood and found https://webtorrent.io/bundle.js: throw new Error('No WebRTC support: Not a supported browser'), so it definitely can, but fails to catch those errors and do anything/inform the user.

2 Looked at the network traffic, and it seems to open separate TLS sessions per transferred data packet - not the most optimal thing to do, might be an artefact of being hosted on https. Probably a cpu bottleneck right there.

3 Doesn't store anything anywhere (local/session storage).
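On point 1, the fix is straightforward feature detection before use. A minimal sketch of the general pattern (this assumes nothing about WebTorrent's internals; the fallback behaviour is just an example):

```javascript
// Check for a usable RTCPeerConnection constructor before touching
// WebRTC, instead of letting an uncaught error fail silently.
function hasWebRTC() {
  const g = typeof window !== 'undefined' ? window : globalThis;
  const PC = g.RTCPeerConnection ||
             g.webkitRTCPeerConnection ||  // older Chrome
             g.mozRTCPeerConnection;       // older Firefox
  return typeof PC === 'function';
}

if (!hasWebRTC()) {
  // Degrade gracefully (e.g. plain HTTP download) and tell the user,
  // rather than throwing 'No WebRTC support' into the void.
  console.log('No WebRTC support: falling back to HTTP download');
}
```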

29
andreapaiola 1 day ago 0 replies      
Nice tech!
30
gionn 1 day ago 0 replies      
are you a wizard?
31
jsprogrammer 1 day ago 0 replies      
Nice demo.
32
knocte 1 day ago 1 reply      
WebRTC will anyway become obsolete with IPv6, right?
33
Nux 1 day ago 0 replies      
-1

Complains it cannot play the file for lack of Chrome with MediaSource. Why not serve an ogg or webm, for crying out loud?

Also, why auto-start the download?!

After the download is finished, where can I watch the video? There's no link for watching it anywhere.

If I refresh the page the download starts again.

I realise this is just an experiment and kudos for that, but the author could have made some better choices re above.

Etsy stock has lost 76% of its value in 9 months google.com
277 points by megafounder   ago   262 comments top 21
1
profinger 3 days ago 4 replies      
I'm not surprised at all. Etsy used to be user-based and handmade. Now it's just another marketplace. They still act like they try to enforce "non-mass-produced", but there are countless examples of the opposite being true - places where people have directly reported instances of this, and the problem is still present.

I think the problem is that they stopped caring about their user base and, therefore, became less ubiquitous and people saw that and were like "oh.. time to value them lower.."

2
analyst74 3 days ago 1 reply      
As they always say, this time is different.

My opinion is that the tech sector has greatly expanded since 2000, not just in amount of investment available, but also types of business tech companies are actually in.

So maybe there is a web/ad bubble and it might pop, but how much that affects individual company is more nuanced.

Unsophisticated investors might still lump Google/Twitter/Tesla/AMD into the same "tech sector", but they will find the signal they receive is very mixed and difficult to analyze.

3
chris_wot 3 days ago 4 replies      
The lockup agreement expired. Looks like employees and executives are selling up big time.

http://www.marketwatch.com/story/etsy-shares-fall-after-expi...

4
jonknee 3 days ago 1 reply      
I wouldn't be surprised if Etsy is acquired now that the share price is beaten down. Seems like a good fit for eBay (it would provide much needed relevance) or Amazon. It's a good brand name, but not much else.
5
jheriko 3 days ago 3 replies      
"over-hyped gimmicky thing trendy with hipsters out-of-touch with reality starts to fail over short period of time"

not surprised one bit. i echo the sentiment of other commenters who give other examples.

this is only surprising if you are also out of touch with reality

6
maxxxxx 3 days ago 2 replies      
Isn't that the strategy these days? Extract as much value from a company and only then go public?
7
adventured 3 days ago 0 replies      
And they're actually a better value today, thanks to the proper correction, than they were at the point of IPO, despite what the analysts and pumpers (or shorts) would tell you.

Now they're trading for 2.x times sales for fiscal 2015. They have 1/3 of their market cap in cash. And they're accumulating cash from operations rather than burning it. The value proposition on the stock is dramatically higher than it was when the hype was on the moon.

8
theseatoms 3 days ago 2 replies      
This doesn't bode well for other publicly traded B corporations, unfortunately.

https://en.wikipedia.org/wiki/Benefit_corporation

9
dtx 3 days ago 0 replies      
speciality ecommerce has always been a moody market, and a company like Etsy getting traded at this value is still just sentimental. The company has had negative margins for a while, along with negative returns on all its assets.

Operating cash flow is a mere 2% of the current market cap. This defines shambles unless they can really turn their fortunes around in '16-17.

10
marme 3 days ago 1 reply      
Etsy is a clear case of poor management; just taking a brief look at their financial statements, it is clear they are doing something wrong. Their revenue is growing, but their expenses are growing at the same rate, so they remain unprofitable: from 2014 to 2015, revenue grew from 29 million to 45 million per quarter while general expenses grew from 22 million to 31 million. Amazon is another company that does this, but at least they have the sense to keep the company slightly in the black and slowly build small cash reserves. Etsy is just burning through its capital for no reason; they earn plenty of revenue to be able to turn a profit.
11
sly_foxx 3 days ago 0 replies      
They can obviously make money by raising the transaction fee a little, from 3.5% to maybe 5-7%. eBay has raised its fee to 10% + PayPal fees, and they are making billions in net profits.
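Rough per-sale arithmetic on the fee comparison above (illustrative only - real marketplace fee schedules are more complicated, and the 2.9% + $0.30 PayPal rate is the commonly quoted figure, not from the comment):

```javascript
// Marketplace cut on a single sale: percentage fee plus optional flat fee.
function marketplaceCut(salePrice, pctFee, flatFee = 0) {
  return salePrice * pctFee + flatFee;
}

const sale = 100; // a hypothetical $100 sale

const etsyNow  = marketplaceCut(sale, 0.035);  // ~ $3.50
const etsyLow  = marketplaceCut(sale, 0.05);   // ~ $5.00
const etsyHigh = marketplaceCut(sale, 0.07);   // ~ $7.00
const ebayPlusPaypal =
  marketplaceCut(sale, 0.10) + marketplaceCut(sale, 0.029, 0.30); // ~ $13.20

console.log({ etsyNow, etsyLow, etsyHigh, ebayPlusPaypal });
```

Even at the high end of the proposed range, Etsy's cut would still be roughly half of the eBay + PayPal figure quoted in the comment.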
12
sulam 3 days ago 4 replies      
To be clear, it's lost XX% of its market cap. The market cap is a knowingly flawed assessment of a company's earnings potential. Its actual value is probably unchanged.
13
mw67 1 day ago 0 replies      
There is no market. People want an iPhone 6, not a wooden necklace.
14
perseusprime11 2 days ago 0 replies      
The stock should trade at $2 if you look at their financials. Growth stalled a while back. Etsy will never be able to compete with Pinterest. This is a $100 million dollar company in the guise of $1b. Going the same way as gilt group.
15
bigB 3 days ago 0 replies      
It all comes down to monetization, really. If you don't find a good way to make a profit, the stock price will fall. Investors will only wait a certain amount of time before losing faith and interest.
16
benhebert 3 days ago 0 replies      
What is etsy spending their money on? What's their main expense? Talent?
17
ratonofx 3 days ago 0 replies      
As a two-sided marketplace, Etsy got the balance between vendors and buyers wrong. The last thing I read had them talking about making a lot of money by offering vendors' services. They abandoned the crowd; the crowd abandoned them.
18
aszantu 3 days ago 0 replies      
so, where do I turn now if I want to sell my prints?
19
kelseydh 3 days ago 0 replies      
Good time to buy.
20
w1ntermute 3 days ago 16 replies      
Angie's List: from $28 in Jul 2013 to $9[0]

Box: from $24 in Jan 2015 to $10[1]

GoPro: from $87 in Oct 2014 to $11[2]

Groupon: from $26 in Nov 2011 to $2.60[3]

GrubHub: from $46 in Apr 2015 to $21[4]

Twitter: from $70 in Jan 2014 to $18[5]

Yelp: from $97 in Mar 2014 to $21[6]

Zillow: from $121 in Feb 2015 to $22[7]

Zynga: from $15 in Mar 2012 to $2[8]

0: http://www.google.com/finance?q=ANGI

1: http://www.google.com/finance?q=BOX

2: http://www.google.com/finance?q=GPRO

3: http://www.google.com/finance?q=GRPN

4: http://www.google.com/finance?q=GRUB

5: http://www.google.com/finance?q=TWTR

6: http://www.google.com/finance?q=YELP

7: http://www.google.com/finance?q=NASDAQ:ZG

8: http://www.google.com/finance?q=ZNGA
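The peaks and troughs above are from different dates, so the numbers aren't directly comparable, but for reference against Etsy's 76%, the listed figures work out to roughly:

```javascript
// Percentage declines implied by the peak/current prices quoted above.
const drops = [
  ["Angie's List", 28, 9],
  ["Box", 24, 10],
  ["GoPro", 87, 11],
  ["Groupon", 26, 2.6],
  ["GrubHub", 46, 21],
  ["Twitter", 70, 18],
  ["Yelp", 97, 21],
  ["Zillow", 121, 22],
  ["Zynga", 15, 2],
];

for (const [name, peak, now] of drops) {
  const dropPct = Math.round((1 - now / peak) * 100);
  console.log(`${name}: -${dropPct}%`); // e.g. "GoPro: -87%"
}
```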

21
adevine 3 days ago 0 replies      
It's not really correct to say "Wall St loses its shit."

I think it's more correct to say that the extremely high valuations to start with are predicated on the idea that these companies can expand well beyond their initial niche. They're all hoping to be like Facebook, who went from college kids to the whole world. When it starts to become apparent that this won't happen, valuations come down to reflect a niche product.

You may say that Twitter has been beaten with a bat by Wall St, but they are still worth 12 billion dollars. Someone hit me with that bat.

Why Big Companies Keep Failing: The Stack Fallacy techcrunch.com
337 points by walterclifford   ago   145 comments top 28
1
gregdoesit 16 hours ago 5 replies      
In my experience, the same (often random) thing that makes a company succeed then becomes its DNA, and can finally make it fail.

I saw this happen with Skype where I worked a couple of years. The company succeeded because of P2P: we grew with little infrastructure to reach 200M+ people. P2P became our DNA, rooted deep within (almost) every core component.

Then came the new wave of mobile messaging apps. We reacted... with a P2P messaging solution. It was obvious this wasn't working - you sent a message to someone from Skype for iPhone, and they got it... sometime.

We knew to have a chance against Whatsapp and other messaging apps we needed server based messaging, so we built it.

It took 3 years. Yes, it took this long to get rid of the P2P code from just the messaging components of the 20+ Skype products - we had 1,000+ engineers and 50+ internal teams by the end, which significantly slowed things down. When we were done and popped the champagne - no one really cared.

And yes, the source code is still full of P2P references and workarounds to this date.

2
adevine 16 hours ago 19 replies      
I don't buy the arguments in this article. For example, the whole part about why Google failed with Google+ is just wrong IMO. It wasn't that Google wasn't capable of building a good social network. If anything, I (and most people I know) preferred the design and interface of Google+. The problem was that Facebook already had a huge head start, and all your friends were already there. Facebook was "good enough", and there wasn't a big enough incentive to want to switch to Google+.

If anything, large companies often miss out on new trends and changes in business and technology, but it's not solely because building that one new layer "up the stack" is so technically hard or different.

3
anonymousguy 15 hours ago 2 replies      
The solution to stack fallacy is simple but really counter-intuitive. All of the mentioned examples, I mean every single one, indicate a business trying to force its way into the higher level through business channels. For example, when a business wants into a higher level they make it a business priority to create a new product and attempt to drive the priorities of this next level product through their business objectives. That is an epic fail.

It is important to instead concede that you don't know the needs of the consumers in the higher level, and if you think you do it is because you are guessing. The only way avoid the problem is to not attempt to move into the higher level, at least not intentionally and not through business priorities.

This is extremely counter-intuitive because there are generally fewer expenses and greater market frequency at each higher level, which means superior revenue potential. Businesses exist to make money, and to ignore moving up to the higher level means forgoing this potentially vast revenue source.

This doesn't mean you can't move into the higher level of the stack and be really good at it. It just means you cannot do so both intentionally and as a business objective.

The solution is to double down on where you already are with what you are already good at, and focus on the product quality of your existing products. Continue to improve where you are already good. Improvements and enhancements to existing products can gradually yield the next level, as the improvements progressively open new potential in an evolutionary fashion. While getting to the next level this way is much slower, it is also risk-reduced and continues to associate your brand with quality and consumer satisfaction.

This will only work, though, if the goal is improving the current product and not acquiring revenue in that desired higher level. Think evolution and not revolution. It has to be a gradual, almost accidental, increase of capability based on meeting current consumer needs.

4
hyperpape 13 hours ago 1 reply      
I don't want to be too dismissive because something about the article rang true to me, but I don't know that I buy the whole central conceit that the idea of a stack can apply as universally as this article needs it to.

Apple's networked services have often struggled. But are they really higher level than the things Apple succeeds at? Asking whether enormous distributed data stores are higher level than Mail.app just seems confused. It's different, and it brings new challenges, but are they part of the same stack? And is the data ingestion and sanitizing that Maps struggled with higher or lower level than the client that was basically ok? You can multiply these questions and I'm not sure you can get good answers.

5
vonklaus 10 hours ago 1 reply      
This article seems to have many correct pieces, but I don't think they coalesce to prove the point, or at least not entirely.

I don't think that manufacturing semiconductors is comparable to building maps. Apple should have done a better job with Maps, and even though they do complex manufacturing, you'd have expected them to do worse at chip manufacturing.

IIRC they brought in 3rd parties to help with the chip fab, and certainly spent more money building that core competency than they did on maps.

I believe the author is correct that the issue is companies not fully understanding, and consequently underestimating, what it takes to be successful in a different arena outside their core competency.

Google sees people as entries in a db. They don't understand people at all, they don't understand design as it relates to people, and they didn't understand that nobody needed another social network.

They probably underinvested (initially) in G+, and it was not a great product. It didn't achieve critical mass quickly, and thus had no chance of ever growing as a social platform.

However, Google is a lot more capable of creating something like this because they have all the core competencies down.

I guess my takeaway is that these companies can in fact take these arenas, but they underestimate the challenge. So, to use a drug-dealing analogy, they try to start off moving bricks and kilos, instead of working their way up the market pushing dimes and quarters.

They start too big, and when you fail big, you don't get the recovery of a smaller failure, which affords small relaunches and new features.

Tl;dr: big companies try to enter at the top, can't recover from huge public failures, and either exit or buy in.

6
cturner 8 hours ago 0 replies      
It's particularly funny when the stack fallacy is held by database providers, because databases are the wrong tool for most of the jobs they get used for at the moment.

Current usage treats the database as a loose, ad hoc, difficult-to-maintain, polling-based API between multiple applications.

The future perspective looks back on our time, shaking its head at the way people use databases for everything in the same way that we shake our heads at bloodletting.

Oracle's business model is (1) convincing people to use platforms they shouldn't be using and then (2) selling the victims ongoing hacks and services to work around the limitations of the model.

Amazon's software services won't be built on a database. They'll be built using a decentralised messaging platform.
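For illustration, the "database as a polling-based API" pattern the comment criticizes looks roughly like this (a toy sqlite sketch; the table, column, and function names are all invented for the example):

```python
import sqlite3

# App A and app B never talk to each other directly; they share a table,
# and app B polls it on a timer. This is the ad hoc "API" in question.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

def produce(payload):
    """App A: 'publish' by inserting a row."""
    db.execute("INSERT INTO events (payload) VALUES (?)", (payload,))
    db.commit()

last_seen = 0

def poll():
    """App B: ask 'anything new since the last id I saw?'"""
    global last_seen
    rows = db.execute(
        "SELECT id, payload FROM events WHERE id > ? ORDER BY id",
        (last_seen,),
    ).fetchall()
    if rows:
        last_seen = rows[-1][0]
    return [payload for _, payload in rows]

produce("order-created")
produce("order-paid")
assert poll() == ["order-created", "order-paid"]
assert poll() == []  # nothing new; in production this empty query repeats forever on a timer
```

A messaging platform inverts this: the producer pushes and consumers are notified, instead of every consumer hammering the shared store with "anything new yet?" queries.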

7
oautholaf 11 hours ago 0 replies      
A lot of the examples and counter-examples in the threads here are great, but Microsoft in the Windows era is a great counter-example here: from operating system to Office dominance. How did they crush Lotus and WordPerfect again?
8
libertymcateer 15 hours ago 2 replies      
Apple is not vertically integrated - Wikipedia entries to the contrary notwithstanding. It is a grossly inaccurate statement. Up until very recently, Apple didn't own a single factory - how can one possibly claim that they are vertically integrated if they don't own their own means of production?

Apple is a fantastically successful software and industrial design company. The vast majority of their production is outsourced. This is not vertical integration.

Additionally, actually, Apple has tremendous amounts of hugely successful and popular software.

Though I dig the underlying point of this article, that product management is hard, I think the examples are less than good.

9
marshray 15 hours ago 1 reply      
> Can we compete with Intel or SAP?

Well for one thing we know that Intel spends several $billion to open a new semiconductor plant and has a dozen of them already. https://en.wikipedia.org/wiki/List_of_Intel_manufacturing_si...

Whereas SAP is, well, a lot of software. Which is something, but Intel needs to make a lot of software too, and chip designs are in some ways a specialized form of software.

So I think in some sense Intel is strictly more challenging to replicate than SAP. (But this is probably just my misunderestimation talking. :-)

10
a-robinson 7 hours ago 1 reply      
The author claims that "Stack fallacy is the mistaken belief that it is trivial to build the layer above yours", but then says that IBM was wrong when they "happily allowed Microsoft to own the OS market".

Wasn't IBM a classic case of not trying to build the layer above them on the stack?

The Wikipedia page on IBM PC DOS even claims that their "radical break from company tradition of in-house development was one of the key decisions that made the IBM PC an industry standard".

11
kazinator 16 hours ago 1 reply      
This stack fallacy sounds very familiar. Oh, we will just have a few system calls like open, close, read and write, some TTY and credentials related stuff, a bit of signal handling, process control with fork, exec and wait ... writing a shell language on top will practically just be a footnote.
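The sarcasm lands because even the "footnote" is real machinery. A minimal sketch of that shell core, written here in Python since its os module wraps the fork/exec/wait syscalls directly (POSIX only):

```python
import os
import sys

def run(argv):
    """The heart of every Unix shell: fork a child, exec the program, wait."""
    pid = os.fork()
    if pid == 0:                       # child: replace this process image
        try:
            os.execvp(argv[0], argv)   # searches PATH; never returns on success
        except OSError:
            os._exit(127)              # "command not found", like sh
    _, status = os.waitpid(pid, 0)     # parent: block until the child exits
    return os.waitstatus_to_exitcode(status)

# Even this ignores pipes, redirection, quoting, job control, and signals --
# the parts that make a real shell anything but a footnote.
```

Everything a shell adds on top of this loop (parsing, pipelines, job control) is where the actual work hides, which is the commenter's point.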
12
fforflo 6 hours ago 0 replies      
As a comment on TC says:

"What the article is referring to as stack fallacy is the work of Physics Nobel Laureate Philip Anderson: https://web2.ph.utexas.edu/~wktse/Welcome_files/More_Is_Diff...

Let's give credit where it's due, please."

13
dpflan 14 hours ago 0 replies      
This article is funny - it twists together two ideas: that a company specializes in a product for a specific market, and that other companies can use that company's products and tools to develop their own unique products for their own markets. The "up-the-stack" company building something with products from "down-the-stack" has already entered a market, gained market share, and specialized in a market in which the "down-the-stack" company has no presence. Now the "down-the-stack" company sees an example of a successful product that uses the technology it knows so well, but the company is not specialized for this product, so it hubristically snatches at low-hanging fruit to try to enter the new market. "Big companies keep failing" because they are not being innovative, in terms of what's mentioned in this article; they see an easy out and enter a market that already has incumbents.
14
anjc 16 hours ago 1 reply      
I'd like some solid examples of what companies' actual perceptions of competitors in the same vertical were, versus the reality.

Because even though the author references competency-based views of competitive advantage, he for some reason ignores resource-based views, and ignores the fact that companies might be aware of their competences. That is to say, I'm sure that large companies tend to mostly be aware of what their competences are, based on the resources and knowledge that they have. If they don't have marketing departments that have analyzed the ERP market, sales teams with ERP training, tech departments with key HR, key knowledge, etc., then I'm certain they are very well aware of this.

Maybe some companies have had marketing missteps and made poor strategic and competitive decisions, but I really doubt that it's due to a lack of introspection or of the simple analysis described here.

Also, IBM didn't "think nothing much" of the software layer. They misunderstood the nature of power in the supply chain, and most importantly, didn't solidify their position within the supply chain while they were dominant.

15
annnnd 2 hours ago 0 replies      
> The bottleneck for success often is not knowledge of the tools, but lack of understanding of the customer needs.

THIS! +1000! I would even leave out "often", or at least replace it with "usually".

16
tokipin 9 hours ago 0 replies      
Nice observation. Another way to put it is that induction is harder than deduction.

A related factor is that larger companies tend to be more specialized (formalized processes, specialists, focused teams/departments, and so on), meaning they can be prematurely optimized with respect to new goals and poorly equipped to conduct the necessary roaming.

17
anshublog 13 hours ago 0 replies      
I am the author. Happy to answer any questions about my post.
18
cbsmith 13 hours ago 0 replies      
This seems more than a bit flawed. It presumes companies, or anyone else, think these launches into new markets are low risk. They are generally seen as anything but. There is hope that leveraging existing strengths will improve the odds, but only idiots think they are certain to win.
19
ogezi 16 hours ago 1 reply      
A company should always focus on its strengths; if not, it'll be both overstretched and unsuccessful. Great read.
20
mwnz 15 hours ago 1 reply      
Do big companies really keep failing? I'm failing to see the evidence of that assertion.
21
tuke 13 hours ago 0 replies      
True enough, but there are also companies that were designed from the beginning to have vertical integration and control much of their business from beans to buildings. And of course I am thinking of Starbucks. (People don't really understand the technological story of Starbucks, which has a lot to do with their introduction of the vacuum packs to get their coffee across the country, and overcoming the challenge of brewing coffee on passenger jets.) Mostly for the better, they decided long ago to own as much of the stack as they could.
22
shim2k 6 hours ago 0 replies      
Peter Lynch writes about it in his book "One Up on Wall Street". He calls it 'Diworsification'.
23
jackgavigan 11 hours ago 0 replies      
> Product management is the art of knowing what to build.

And in what order.

24
dkarapetyan 14 hours ago 0 replies      
Not so much stack as legacy. Have you seen legacy architectural decisions? They're impossible to get rid of. It's surprising how much the initial architecture can hinder change.
25
tn13 15 hours ago 2 replies      
There was a brilliant essay by an Indian politician a few years back, after his party lost the elections. In a later lecture he explained why political parties and large companies have so much in common when it comes to failing.

His basic logic was that:

- Success depends on processes.

- Processes, even though they might be thought of as abstract, are in reality a function of the people at the top.

- A company gets successful because some bright guy is the rebel; he questions the status quo, persists, and succeeds.

- As time goes by, the rebellious ideas actually become conservative ideas. The rebel is now on top. As his ideas fade, he struggles to stay on top.

- He recruits people who see the world through him; he builds processes that enforce that vision.

- This makes it difficult for the truth to be visible to the top management.

- By the time failure is visible, it is hard to turn the ship around.

- IN SHORT: Companies/nations fail because someone at the top did not know when to quit.

- In the end, that rebel-turned-conservative becomes bitter. He thinks the world owed him something for what he achieved.

He explained with examples from the USSR, like how a genetics scientist got promoted because his fake research reinforced something Stalin had said long before, and his peers were scared to point out the fraud because it might be perceived as anti-Stalin.

I observed BlackBerry very closely and it resonated with me so much. The founders at one point blamed people for using the iPhone instead of the BlackBerry.

The best companies in the world seem to be those whose top leaders quit at their peak to make way for their successors.

26
ljw1001 15 hours ago 1 reply      
Some insights, perhaps, but the claim that this is "why big companies keep failing" is way overblown.
27
ap22213 15 hours ago 1 reply      
An alternative (or maybe complementary) theory is in Clayton M. Christensen's "The Innovator's Dilemma". Big companies build enormous revenue bases on certain types of technologies. Then they struggle to innovate because, by transitioning, they eat away at their existing revenue streams.
28
shadowmint 15 hours ago 1 reply      
Why do people keep reading TC?

Here let me make an article... wait wait... ah... "Big Companies FAIL" that sounds like nice click bait. Now... hm, let's invent some stupid word to pad it out how about the 'Stack Fallacy'. Programmers will dig the 'stack' part. Yeah. Ship it!

Seriously, this article is content free.

People make products. Sometimes they work... sometimes they fail.

If you pretend you have some magical insight into why they fail or succeed, with gems of wisdom like:

> ...found it very difficult to succeed in what looks like a trivial-to-build app: social networks.
and:

> The stack fallacy provides insights into why companies keep failing at the obvious things, things so close to their reach that they can surely build them. The answer may be that the "what" is 100 times more important than the "how".
Then... wow. I don't even know what to say.

Really? What you build is important?

No kidding.

Why is this at the top of the list this morning?

Iran Complies with Nuclear Deal; Sanctions Are Lifted nytimes.com
263 points by jseliger   ago   230 comments top 16
1
tyre 2 days ago 17 replies      
This is terrific news!

As with the tearing down of the Berlin Wall, opening up countries to the world's economy and ideas is the first step towards democracy.

The implications of this are tremendous (not in order of importance):

1) Oil prices will continue to fall as Iran is able to supply the global markets. Many oil states rely on money from natural resources to preserve monarchies. Money for freedom only works so long as the money keeps flowing.

2) Our (US) reliance on Saudi Arabia will diminish as there are now two powers in the region to work with. Having strong relations with both Shiite and Sunni powers in the Middle East will likely reduce sectarian violence. We're light years from being out of the woods, but this is big step in the right direction.

3) The Iranian people will gain access to the world economy. From a human rights perspective, they are the biggest winners here. As with Sunni/Shiite relations, no doubt there's a long way to go (the Ayatollah is a tyrant), but you gotta celebrate the wins when you can.

4) De-escalation of our conflict with Iran. We saw it with Iraq, Vietnam, and Korea. Invasion + nation building is sexy, but highly ineffective. Having one less nuclear power that calls for our destruction is certainly a nice to have.

5) Shows Americans that diplomacy can work. Iranians don't hate Americans, they hate what America represents. To them, we represent a superpower that gives little to no thought of anyone else's sovereignty. We assassinated their democratically elected leader and backed the Shah, which got us into this mess. Diplomacy is far less sexy and easily criticized, but that's a huge part of getting this deal done.

Note: Many of these are over-simplified. Nonetheless, this is a pretty big deal and a cause for celebration.

2
bane 2 days ago 3 replies      
This an incredibly subtle deal.

- Iran gets access to global markets, and in time tourism (there's an incredible number of amazingly beautiful things in Iran for tourists to see, from ancient sites to modern ski resorts)

- Iranian oil will keep prices in the toilet; this is basically a way for the U.S. to punish Saudi Arabia for decades of support for various maleficent actors, except it doesn't involve an invasion, a takeover, or anything else beyond economic sabotage. The Saudis have also had decades to form a more diverse economy, and for various reasons haven't managed to do it... this has kept them vulnerable to this kind of action, and it helps free the major users of Saudi oil from "vendor lock-in"

- It demonstrates that cool, calm, collected diplomacy can actually work. However, many people will forget that the U.S. and Iran have been fighting a proxy war for decades. It hasn't been a hot war, but Stuxnet, various revolutionary movements and so on have been parts of that war. This isn't just Iran throwing in the towel because the sanctions finally worked; it's because all of the other major leverage points Iran could muster were defeated.

- While the sanctions by themselves failed to work, they helped create a political climate inside of Iran that favored this outcome instead of having another go at saber rattling.

- This helps provide a mildly more palatable "friend" in the region than Pakistan

3
salimmadjd 2 days ago 1 reply      
This is tremendous news, politically and economically.

Economically:

1 - The price of oil has been declining since the Iran deal was signed. In the US alone, the annual savings as a result of cheaper gas and cheaper food (food production costs are strongly tied to gas prices) are about $500B/year (from $4/gallon to $2/gallon), basically providing an additional $500B in spending money in the US.

2 - Globally, the lower price of food and gas can potentially provide additional spending money.

3 - Iran has crumbling infrastructure and needs numerous foreign contractors to rebuild (Europeans and Chinese have already signed up). Sadly, US companies will not be able to participate.

4 - Iran has potential for a large consumer market.

Politically:

1 - The US will finally have a second option (let's call it a second front) in the Middle East. We in the US have been turning a blind eye toward the Saudis, their indirect financing of ISIS, and all types of jihadist fighters in the region, from Libya, Syria, Africa, Afghanistan, etc.

2 - Iran will be tapped to help stabilize Afghanistan, Iraq and Syria (there are already talks of providing an exit for Assad)

3 - Iran's gas can provide a hedge (or at least the fear of an Iranian pipeline) against Russia. This probably won't happen, as Iran relies more on Russia than Europe and will probably maintain that role, but that's a possibility.

4 - An open Iran is forced to further integrate globally. This has always been the fear of the hardliners and it'll be resisted by some within; however, there seems to be an understanding that stopping progress is a futile task.

5 - Iran has one of the highest rates of women in higher education (close to 70% of college students are women); it may eventually play out as a model for other muslim countries in the region to emulate.

6 - It will force the Saudis to change. The Saudis are extremely worried about the Iran deal. But their biggest existential threat is not military, it is cultural: from having 70% women college students to a hybrid system of government, Iran's socioeconomic role will put internal pressure on the Saudi rulers and force them to make uncomfortable changes or face internal turmoil.

4
gotchange 2 days ago 4 replies      
> Iran will use the roughly $100 billion in frozen assets it will receive to support terrorism and other misadventures.

> Irans support for terrorism, its imprisonment of dissidents and even some Americans, its meddling in Iraq and Syria and its arms trade.

Funny that Saudi Arabia is guilty of all of the above, if not 10X worse, but not a single word from those republicans. This speaks volumes about the power of the Saudi lobby in the US political system and how their wealth can influence decisions and policies in the US.

5
msoad 2 days ago 1 reply      
One very welcome change is the reconnection of Iran to the SWIFT network. You can't believe how hard it is right now to send money to Iran. This hit me when I had to transfer money to my family in their extremely hard times, when my brother had an accident and needed cash for treatment.

I hope this trend continues and Iran comes back to the international scene. It's good for everyone. Iran is very similar to Israel: most people are normal, but there is a small percentage of extremists who have a lot of voice. Luckily for Israel, they have a better constitution and governance model.

6
1024core 2 days ago 1 reply      
I don't know why we're so ensconced in Saudi arms. Iran is much more liberal than Saudi Arabia, and Iran doesn't export the fundamentalist interpretation of Islam (Wahhabism) that is causing headaches all over the world.

Now I'll sit back and wait for the "but... but... Iran said they would wipe Israel off the map!!!1!" crowd.

7
Retric 2 days ago 8 replies      
The sad thing is that Iran used to be reasonably liberal in the 1960s, based on pictures from that time period, but politics got really ugly.

IMO, it's a classic case of nut-jobs on both sides of a border causing pain for a wide range of people.

8
agorabinary 2 days ago 2 replies      
A great lecture on Iran: https://www.youtube.com/watch?v=rtELk8S3dhU

Really gives some perspective on an oft-misunderstood place.

9
jernfrost 2 days ago 0 replies      
Very exciting news that made me happy for a change. I am confident we can make Iran into a normal country if we show them some good will. It is a country full of all political stripes, like anywhere else. The confrontational policies of the past have emboldened the Iranian hardliners.

It is time to embolden the moderates and reformers in Iran! When you read about the details of Iranian society, it is very clear that they have a huge amount of potential. Regular Iranians are the most positive toward the west in the region. Religion is in strong decline there. They have a lot of real industry; they are big car manufacturers, for example. They have more scientific output than the whole Arab world combined. Their strain of Islam is not as extremist as the one found among the gulf states like Qatar, Saudi Arabia, Yemen, etc.

We got to give the Iranians reasons to believe that playing well with the west will give them a lot more benefits than antagonizing Israel.

I support Israel's existence, but I really wish they had a more moderate and constructive leader than Bibi. He really comes across as a deranged conspiracy theorist. To make real progress we really need to get Iran and Israel to make peace.

10
jameslk 2 days ago 1 reply      
Now if only we could do away with the double standard and hold Israel responsible for its nuclear escapades[0] too.

0. https://en.m.wikipedia.org/wiki/Nuclear_weapons_and_Israel

11
bronz 2 days ago 2 replies      
So what did he mean when he said that Iran would be able to access its holdings abroad?

"A senior American official said Saturday that Iran will be able to access about $50 billion of a reported $100 billion in holdings abroad, although others have used higher estimates."

12
joezydeco 2 days ago 0 replies      
Crude oil is expected to fall another $2-$3 on Monday after this news. Curious what that will do to the rest of the market.
13
audessuscest 2 days ago 0 replies      
They always did. It's just that now we agree to recognize it...
14
rajacombinator 2 days ago 0 replies      
Nice (surprising) to see some let up in the US's brutal war of aggression against Iran!
15
TheGuyWhoCodes 2 days ago 3 replies      
This is the worst news this year. More money for terror, seems legit.
16
vonklaus 2 days ago 1 reply      
Makes a lot of sense. The scrawny kid with low self-esteem and a bad family life, who lashes out from time to time, is entirely ostracized by all the popular kids until he agrees to turn in his Red Ryder BB gun.
Signs of Secret Phone Surveillance Across London vice.com
255 points by walterbell   ago   98 comments top 14
1
scott77 4 days ago 8 replies      
Interestingly enough, during my last trip to London, my phone (a Nexus 5 running CyanogenMod 12.1) somehow managed to detect something fishy and warned me several times: "Network may be monitored by an unknown 3rd party". Among other places, I saw this warning two times out of two when passing the Cheapside/St Martin's corner - next to St Paul's Cathedral.

At the time I just dismissed this as some tinfoil-hat developer adding nonsensical warnings to the firmware, but in retrospect, after reading this article, it matches perfectly; chances are the phone was indeed detecting Stingrays. Still no idea how it managed to do it.

EDIT: I had no data/IP connection of any kind at and around the time of seeing this, so this is clearly unrelated to TLS interception.

2
darkr 4 days ago 0 replies      
> A VICE News investigation has found evidence that sophisticated surveillance equipment that spies on people's phones is being used across London

That the Met (and other police forces) regularly use IMSI catchers is not new information - here's a ~5 year old Guardian article on the subject:

http://www.theguardian.com/uk/2011/oct/30/metropolitan-polic...

3
madaxe_again 4 days ago 2 replies      
Watched the documentary last night.

Given that these things are cheap, would any fellow Brits be interested in clubbing together to acquire one and install it in or around Parliament? I'm sure there would be plenty of buyers for the call records of MPs.

4
Thlom 4 days ago 2 replies      
Same thing was found in Oslo about a year ago. The Police Security Service denied they existed, though. If I remember correctly, these were mostly located around parliament and other government institutions, as well as in the embassy area.

Report in English: http://www.aftenposten.no/nyheter/iriks/New-report-Clear-sig...

5
jritchie 4 days ago 3 replies      
Do IMSI catchers have any way of verifying the intercepted IMSIs are legitimate? If not, would it be possible to build a device to flood them with fake/spoofed IMSIs?
6
pmlnr 4 days ago 1 reply      
Recommended app for phones not doing this by default:

https://f-droid.org/repository/browse/?fdfilter=SnoopSnitch&...

7
simonvc 4 days ago 5 replies      
Richard from Privacy International pointed me at these: http://www.amazon.co.uk/dp/B00RRL4XLW

It's a faraday cage for your phone. Pop it in and you disappear instantly.
8
arca_vorago 4 days ago 2 replies      
I stayed in The City during the holidays and at one point walked close by the police HQ. After continuing on, I enabled GPS for a moment, and even though I was half a mile away, my GPS showed me smack dab in the center of the police HQ for about 15 minutes... I realized then what was happening and that I should have known better, but I had forgotten to ask a friend for his faraday cage laptop bag and phone bag... I knew better and am still kicking myself for it.

The UK's level of surveillance is extremely unsettling to me, and quite frankly I think a lot of Americans have forgotten all the reasons why the UK might not be as good an ally as everyone thinks, going back to the '47 USUK agreement. The point being that I really hope our politicians don't start adopting that level of surveillance just because they do it too.

It seems we have quietly been in a surveillance arms race, which isn't good for the population at all.

9
alex_duf 4 days ago 2 replies      
It doesn't really come as a surprise, but it's good to see the news coming out. Now we need a legal framework to ensure this form of power isn't abused.
10
yenda 4 days ago 0 replies      
When you combine this news with this one, it makes you think twice before bringing your phone to a protest: https://news.ycombinator.com/item?id=10905643
11
chippy 4 days ago 0 replies      
I seem to recall a recent discussion saying that these devices are very hard to distinguish from the many legitimate mobile phone amplification devices that large buildings in cities install for their employees.

Is this the same technology?

12
rnhmjoj 3 days ago 0 replies      
Are prices so high in order to prevent private use or are they really that expensive? What keeps anyone from building their own?
13
mikewilliams 3 days ago 0 replies      
I've read in the past that phone tapping produces an audible echo. Is this the case with IMSI catchers as well, or has the technology advanced far enough that it's no longer possible to detect being tapped?
14
jcromartie 4 days ago 0 replies      
If a technology exists, and can feasibly be leveraged by a party to its advantage, then you should assume that it is happening.
How Police Officers Seize Cash from Innocent Americans priceonomics.com
243 points by ryan_j_naughton   ago   205 comments top 19
1
throwaway15236 3 days ago 3 replies      
It happened to me. I was flying out of San Francisco once on an international flight. At SFO, some of the international terminals go through these long corridors that connect to the movable tunnel leading to the plane. I was fixing some bug on my laptop and decided to wait until the last minute to board (though I was sitting right in front of the check-in desk). When I went past the check-in desk and into the corridor, two police officers approached me and asked if they could ask a few questions. It felt suspicious right away that they were stopping me, as there were hardly any people around - there was no one behind me, and everyone else had already gone through the corridor. They took me aside into a small room and asked me where I was travelling to and for what purpose. They then repeated a number of times that it's illegal to carry more than $10,000 in cash, asked if I was carrying over that amount, and said that if I was it would have to be confiscated. I told them I wasn't; I think I only had less than $500 in my wallet. They went through my wallet, went through my backpack, even flipped through my Clojure book, and then they let me go. But I kept having the feeling that something wasn't right. Later I wished I had asked for the officers' names, or would that have gotten me into trouble with them?
2
balls187 3 days ago 12 replies      
John Oliver had a great piece on this: https://www.youtube.com/watch?v=3kEpZWGgJks

In the US, where the constitution expressly prohibits it: that your property is seized w/o due process is complete and utter garbage.

By no means am I a right-wing/vigilante militia supporter, but this type of behavior from the police makes me support having a heavily armed citizenry.

3
iamleppert 3 days ago 4 replies      
Between the police shooting innocent people and tasering and beating up our kids, does anyone even feel safe around them anymore?

Whenever I see police, I have the same fight-or-flight response as when I see someone in a dark alley. The police have become dangerous, and none of my friends trust them. They would be the last people I'd call if something was happening. Too much of a risk that they would beat you, kill you, or rob you.

Who does that sound like?

4
l33tbro 3 days ago 3 replies      
As a non-American, it's pretty scary for the rest of us to see how your society can tolerate these Eritrean standover tactics, while still retaining this "freedom loving" doublespeak in the national cinema you project into the world.
5
CptJamesCook 3 days ago 2 replies      
This happened to a coworker of mine, when he was flying from the midwest back to San Francisco. He had $15,000 cash on him and it was noticed at security. As he was boarding the plane, government officials grabbed him and took the money.

However, it did happen to be money he had made selling drugs several years prior. They had identified him as a convicted felon with a drug related offense and connected the money to it.

6
tracker1 3 days ago 3 replies      
I'm curious whether any of the police officers behind the seizures could be held personally liable when no charges are brought.
7
ilyaeck 3 days ago 1 reply      
How is this not a major sensation/scandal? "Free country" my foot!
8
SN76477 3 days ago 0 replies      
It is an embarrassment that our police cannot act like good citizens.
9
joshpadnick 3 days ago 2 replies      
I often wonder if there is a tech solution to these types of issues. For example, are records of every instance of a civil forfeiture publicly available? Would it be helpful (and ethical) to publish this list to shine a bright spotlight on the practice? Would that make a difference?

It also raises the age-old question of "who polices the police?" The (federal) DOJ can only do so much, it seems. But maybe ordinary citizens can demand reform if injustice stares them in the face?

From what I've read in the press (especially NyTimes), the USA justice system seems to fundamentally disadvantage poor people [1]. The saddest part about the civil forfeiture business is that it probably affects the poorest people, who then have the least resources to challenge it.

On a separate note, I know the press is more likely to publish instances of injustice vs run-of-the-mill "just" justice. I honestly have no concept of if we live in a society with a tiny bit of corruption, or a lot more than I ever realized.

[1]http://www.nytimes.com/2015/08/16/magazine/the-bail-trap.htm...

10
forgottenacc56 3 days ago 0 replies      
Police seizures is the foundation of a corrupt society.
11
wsha 2 days ago 0 replies      
There are legitimate concerns about police overreach, etc., but I can't get past the part where we start off talking about someone checking $11,000 in cash.

Was this guy just not familiar with air travel? Or is it less likely that money will be seized from a checked bag than from a carry-on? It's just absurd to me to put that much value into a checked bag, especially in the form of cash.

12
jostmey 3 days ago 0 replies      
The way civil forfeiture is being used in these examples is a violation of a citizen's constitutional right and the perpetrators, no matter their appearance, are criminals and must be dealt with as such.

I'm sure there are some lawyers out there who might say otherwise. They will tell you that "this" is how the constitution is interpreted. But the constitution, particularly the Bill of Rights, was written so that you could understand it, regardless of what politicians and lawyers say.

13
colinshark 3 days ago 0 replies      
"You know, stuff we needed but didn't necessarily have a budget for. So, when this money comes in, it's considered extra"

Except municipalities explicitly write expected seizures into their annual budgets. They literally must seize property to make their planned budget.

14
known 2 days ago 0 replies      
Corrupt police will always try to bully or rob you. Don't confront them; call a senior police official or call a lawyer.
15
graycat 3 days ago 0 replies      
Yes, for government to confiscate private property without due process is a bummer, an outrageous violation of our Constitutional rights. No doubt about that. In particular, that the police believe that someone is acting suspiciously is nothing like justification for such confiscation.

Yes, there have long been news articles on this civil forfeiture scam. No doubt it happens. But one has to suspect that it doesn't happen very often to innocent people. Why suspect that? Because there would be more screaming, political debates, SCOTUS cases, etc.

In a sense, that there can be such a scam is not too surprising: That is, as we know well, generally, "The price of liberty is eternal vigilance". So, we can expect attacks on our Constitutional rights, and, to get our rights back or just maintain them, we have to fight, continually. That is, there are plenty of people who will take our rights unless we do fight back. So, right along, there needs to be fighting back.

Where is the ACLU in all of this? What about other groups interested in keeping government under control?

We can fight back by bringing law suits and by voting.

But, in particular, and in practical terms, in a local community, likely it can be enough to be known and respected in the community, active in politics, well known to the local politicians, and to have a little chat with them. A respected local citizen will likely not get pushed around by the police. It's a little like high school -- it helps to fit in at least a little.

Broadly, an immediate, expedient, practical solution is: in public, don't carry a lot of cash. If you have a lot of cash in your house, then, in case your house gets searched, have that cash well hidden. If you have a business that gets paid in cash, say, some tens of thousands a month, maybe make a daily deposit to your business account. Then, get well known at your bank as a successful local business person who does get revenue in cash -- hopefully then your bank won't file papers saying that you are suspicious. E.g., generally in business, a banker wants good business customers, and a business person wants good respect from their banker.

Commonly in a small community, the police know a lot of the people. That can help, say, if you drive to the post office at 3 AM to drop off a letter -- the local police will just remember who you are and relax.

It might help to make a donation to some local police charity drive and, there, shake hands with the local chief of police. Maybe can get a window sticker for your car indicating that you are such a supporter.

Likely if there are enough legal cases where citizens bring suit against civil forfeiture, the practice will reach the SCOTUS and get struck down. Of course, legal cases are very expensive, but there are a lot of law school graduates without much to do; with enough forfeiture cases, some of those lawyers would take such cases and change the situation.

For long distance travel, there is an old saying, "A stranger in a strange land". So, it has long been recognized that being such a stranger is not the usual situation and carries some dangers. E.g., don't carry much cash or anything very valuable. E.g., if you want to carry $15,000 to buy a used car, just go to a bank and get a certified check for that amount and hide it somewhere: maybe in a book, folded up between two credit cards in your wallet, or just mail it to yourself at your destination. Some of the police might say that anyone who didn't use such a technique is suspicious.

16
steveeq1 3 days ago 9 replies      
From a statistical perspective, you are far more likely to be shot or beaten up by a black man than by a crooked cop. Would you go around telling people you have the same fears towards black people and say this is a fair statement?

I think what's going on is availability bias. This is where dramatic or sensationalized dangers get overplayed in one's mind instead of paying attention to the actual probabilities. It's basically like shark bites, airplane crashes, or terrorist attacks. The media tends to sensationalize these risks because there is money in it, and people tend to over-fear them despite their actual statistical probability. Don't get me wrong, there should be something done about crooked cops. But it's also important to distinguish real risk from the types of risks that sell newspapers.

17
joesmo 3 days ago 0 replies      
This is one of the reasons I cannot respect the police or the judicial system in this country and I seriously question those who do.
18
samstave 3 days ago 0 replies      
I fucking love the fact that HNers, while perhaps not knowing it -- or even outright denying it -- are part of /r/conspiracy and don't even realize it.

The whole system needs disruption; that's what we do.

19
baakss 3 days ago 4 replies      
Someone checked a bag with $11,000 cash in it that he saved up by waiting tables over 5 years? So many questions...
Why London Underground stopped people walking up the escalators theguardian.com
261 points by vanilla-almond   ago   257 comments top 50
1
tenfingers 2 days ago 9 replies      
... or, you could ask everybody to walk, thereby "quadrupling" the effective capacity.

I've been to the UK many times, and being able to effectively walk up all escalators, thanks to the diligence of the people, always impressed me. Coming from a country that doesn't have such respect for basic rules, this just feels wrong despite the gain in average efficiency.

2
gus_massa 2 days ago 3 replies      
In Argentina we have a more complex protocol:

* Outside peak hours: stand on the right part of the escalator and leave the left part for walking.

* At peak hours: stand in both sides of the escalator.

* At extreme peak hours: if most of the flow goes upwards, stop the downward escalator and walk up it (this is actually illegal)

3
JoshTriplett 2 days ago 2 replies      
One thing the article doesn't analyze at all: standing on both sides improves total throughput, but how does it affect the individual latency for people walking up the left side? If the answer is "it's faster for them too", then tell people that and they'll be more inclined to go along with it. And if the answer is "it's slower for them", then no wonder they don't want to go along with it.
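The ride-time half of that question can be estimated with back-of-envelope figures. This is a rough sketch: only the 24 m height comes from the article; the 30-degree incline, belt speed, and walking speed are assumed typical values, not TfL data.

```python
# Rough ride-time estimate for a long escalator like Holborn's.
# Assumptions (not from the article): 30-degree incline, 0.75 m/s belt
# speed, 0.70 m/s added by walking. Only the 24 m height is the article's.
import math

HEIGHT = 24.0                                  # m, vertical rise (from the article)
LENGTH = HEIGHT / math.sin(math.radians(30))   # ~48 m of travel along the incline

BELT = 0.75   # m/s, assumed belt speed
WALK = 0.70   # m/s, assumed extra speed from walking

ride_standing = LENGTH / BELT           # time if you stand the whole way
ride_walking = LENGTH / (BELT + WALK)   # time if you walk the whole way

print(f"standing: {ride_standing:.0f} s, walking: {ride_walking:.0f} s")
```

Under these assumptions a walker saves roughly half a minute on the ride itself; whether a both-stand rule is a net loss for them overall depends on how much queueing time it removes at the bottom, which this sketch does not model.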
4
mattmcknight 2 days ago 2 replies      
The math in the article is a little sketchy. "But a 2002 study of escalator capacity on the Underground found that on machines such as those at Holborn, with a vertical height of 24 metres, only 40% would even contemplate it. By encouraging their preference, TfL effectively halves the capacity of the escalator in question, and creates significantly more crowding below, slowing everyone down." It doesn't make sense that this would "halve the capacity" unless no one was walking. It seems that with a 40% stated preference for walking, it would be only a 10% loss of capacity due to preference differences. The description should really also include how fast walking is compared to standing. Escalators move at around 0.3 m/s; people walk at about 4x that pace, so even doubling the spacing requirements would shave a fair bit off of that.

I think this result is due to it only being tried on one of three up escalators. By the assumption that there are 6 lanes, devoting 1/3 to walkers and 2/3 to standers can lead to greater efficiency if that more closely matches the actual preference distribution. By maintaining choice, and matching the available options to those desired by passengers, one can optimize the results for both those who prefer speed and those who prefer not expending energy.
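The comment's point can be checked with a toy lane model. All figures below are illustrative assumptions (typical belt speed, step depth, walker speed and spacing), not TfL measurements; only the 40% walking preference is the article's figure.

```python
# Toy model of escalator throughput per lane, in people per second.
# Assumed figures: 0.75 m/s belt, 0.40 m step depth, walkers add 0.70 m/s
# but need a one-step gap. Only the 40% utilisation is from the article.

BELT = 0.75   # m/s, assumed belt speed
STEP = 0.40   # m, assumed step depth
WALK = 0.70   # m/s, assumed extra walking speed

def lane_throughput(walking: bool, spacing_steps: float) -> float:
    """People per second through one lane at the given spacing."""
    speed = BELT + (WALK if walking else 0.0)
    return speed / (spacing_steps * STEP)

stand = lane_throughput(walking=False, spacing_steps=1.0)  # one stander per step
walk = lane_throughput(walking=True, spacing_steps=2.0)    # walkers leave a gap

both_standing = 2 * stand          # the trial policy: stand on both sides
mixed_full = stand + walk          # walk lane fully used
mixed_40pct = stand + 0.4 * walk   # walk lane only 40% utilised

print(f"both standing:         {both_standing:.2f}")
print(f"mixed, walk lane full: {mixed_full:.2f}")
print(f"mixed, 40% walkers:    {mixed_40pct:.2f}")
```

Under these numbers a fully used walking lane nearly matches a standing lane, so "halves the capacity" only holds if the walk lane is almost empty; at 40% utilisation the loss comes out closer to 30%, though the exact figure depends entirely on the assumed spacing and speeds.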

5
thwarted 2 days ago 6 replies      
Considering that an escalator is one of the few real-life powerups available outside of a video game, not walking up it is tantamount to wasting it, both traveling up and down. Like getting Haste and then just standing there, or being able to Deal Double Damage and not hitting the trigger.

I usually camp at the bottom, waiting for the crowd to disperse, then grab the powerup and use it efficiently.

6
markdown 2 days ago 2 replies      
> had gone to Hong Kong on holiday. Lau noticed that passengers on that city's Mass Transit Railway (MTR) were standing calmly on both sides of the escalator and, it seemed, travelling more efficiently and safely as a result.

Is this new? When I visited Hong Kong in 2011, you had to stand on the right and walk on the left. In fact, this was one of the things that caused a great deal of anger towards "mainlanders" (tourists from mainland China) who ignored such social conventions.

EDIT: I just googled it and apparently the "no walking on the escalators" rule in Hong Kong is only a few months old:

http://www.scmp.com/news/hong-kong/education-community/artic...

7
vanilla-almond 2 days ago 1 reply      
Here is a photo of the escalators in Holborn station, as mentioned in the article

https://www.flickr.com/photos/topaas/16557435918/

8
pmalynin 2 days ago 0 replies      
Okay, in my city (Edmonton) we have the province's biggest university, which is mostly served by the LRT; the university station is underground and hence has escalators. During peak morning hours, every single person walks up the escalator stairs, and it is considered rude to stand in the morning. When the escalator is down everyone complains, because we need to do 2x the walking now. Actually, the escalator being broken (i.e. one turned off and the other closed for maintenance) has created a sort of inside joke, to the point where the escalator has its own website to say if it's working: http://uofaescalator.com/
9
lmm 2 days ago 3 replies      
Too much modern public transport is slowing people down for the sake of throughput. Using the escalator this way adds capacity, but it makes journey times longer, much like the endless passages that have replaced cross-platform interchange on newer lines, or the decision to not connect Crossrail to Oxford Circus and force people to walk up, along the street and down instead.

If the escalators are at capacity then the right thing is to build more. But that costs money, so instead we get "cheats" like this.

10
erostrate 2 days ago 1 reply      
"With the constant (and unsustainable) attention of staff, and three weeks of practice, they eventually became a little more docile [...] It's like child psychology [...] So if you can't tell them what to do every two minutes, how on earth do you get them to comply? [...] The handrail and tread of the escalator will be a different colour, and firmly planted pairs of feet will decorate the left of the steps."

I wonder if they have tried to simply explain to commuters why it makes sense to stand on both sides. Treating people like intelligent and responsible adults often works much better than treating them like intellectually disabled children.

11
cm2187 2 days ago 0 replies      
This is complete b/s. I (too) often take the tube at peak hour, the left line is usually full, even in long escalators like the connection between the DLR and the central line at Bank. The idea that you have an empty left line and a packed right line is just not true.
12
driverdan 2 days ago 0 replies      
What an unnecessarily long article. Scroll down to the second photo to see what it's about and skip the text.
13
dennisgorelik 2 days ago 3 replies      
I am surprised that in London the right side of the escalator was reserved for non-walking travellers.

The British drive on the left side of the road, so the more natural tradition would be to stand on the left side of the escalator and run up the right.

14
usrusr 2 days ago 0 replies      
An interesting example of how too much uniformity can weaken a system. Where I live, there is a low but "reliable" percentage of riders who are just too rude to follow the left/right protocol. They are rare enough that during low-traffic periods, chances are quite low that you will be blocked while taking the individual latency improvement of walking. However, in periods of high traffic there will be enough of them to reliably break the throughput-limiting lane pattern, and congestion propagation will make sure that it stays that way until the next break in throughput demand.

Never occurred to me that the occasional lane-blocker was an accidental optimization.

15
Terr_ 2 days ago 2 replies      
> In lieu of actual people, a hologram customer service operative will remind people to stand on both sides.

Is this some sort of British slang thing, or did somebody make enormous strides in hologram technology when I wasn't looking?

16
Houshalter 2 days ago 1 reply      
Escalators are very rare novelties here. I never really understood their purpose over regular stairs, but they are so cool on the occasions I see them.

As a kid I was taught that walking up or down an escalator was rude, could cause injury, and defeats the purpose of the escalator. I think I was punished for doing it. When I went to DC, people were asking me to move out of the way so they could run up the escalators. I thought they were just being rude, but then I noticed lots of people doing this.

I don't remember there being any signs or anything anywhere explaining this. It just emerged as part of their culture. Very interesting. I will definitely try running up an escalator the next time I see one.

17
oxplot 2 days ago 0 replies      
I think the main issue is the width of the escalators. Given that strangers don't like to stand next to one another (as per article's claim), two single width escalators would end up working more efficiently than one double width.
18
ramshanker 2 days ago 0 replies      
Come to Rajeev Chawk station of New Delhi Metro, and you will see 3 rows of people on escalator: Left, Right & Center too.

Must be almost touching the escalator's "Design Capacity"!

19
slash213 2 days ago 0 replies      
Moscow has one of the busiest subway systems in the world, ranked 4th by annual ridership. There's a "stand on right, walk on left" rule, but at congestion hours passengers are advised to stand on both sides to make transportation more efficient, and it's been like that for decades.

Reading a Guardian article on that feels, uh, really redundant? I guess we Russians have some experience in dealing with mobs.

20
ycmbntrthrwaway 2 days ago 1 reply      
In Moscow escalators are controlled manually. Each set of escalators has one operator. During peak hours operators manually instruct people to use both sides, constantly repeating instruction for new people that keep arriving. As soon as they stop repeating, people use only the right side as usual.
21
fiatjaf 2 days ago 2 replies      
"Escalator" is an interesting name. Here in Brazil escalators are called "rolling stairs" (escada rolante).
22
tfolbrecht 2 days ago 2 replies      
No Western person would be comfortable riding an escalator side by side with a stranger (aka weirdo in UK speak) for the equivalent of fifty to a hundred steps of stairs in a hot smelly subway so dirty your snot turns sooty. Standing on the right to allow others to walk up the left is not a technical optimization constructed by the impatient of the world. The real optimization problem is you're dealing with people and not frictionless spheres. X number of people walking up the left is faster than zero.
23
cauterized 2 days ago 0 replies      
The only time I ever see two people standing still on a single escalator stair is parents with children or romantic partners - people who are comfortable sharing physical space with one another. Strangers don't share an escalator stair.

So if you let people stand on the right and walk on the left, you're getting higher density and throughput, since you now have one person standing per stair PLUS people walking on the left. It's like turning a 1-lane into a 2-lane street.

24
kazinator 2 days ago 1 reply      
Here is an idea.

On long escalators, mark some of the stairs in a particular color at regularly spaced intervals. Those are the designated "rest stairs".

Someone walking up who changes their mind (temporarly or for the rest of the ride) just finds a green stair, and moves over to the standing spot to let those behind pass.

Rest stairs can be spaced reasonably sparsely so as not to cut into capacity too much or annoy people, and would only be featured on the long escalators where this is a problem.

Maybe some people give up on the idea of walking up the long escalators in rush hour because they don't want to hinder someone who is faster.

25
cornholio 2 days ago 3 replies      
Brain fart time: how about a transit system composed entirely of moving walkways? When you board, you take a slow treadmill that runs in parallel to a faster one, and that in turn to another, gaining 0.3-0.6 metres/second on each lateral skip. You do the reverse when approaching your destination.

A treadmill running at 10 m/s with pairs of people spaced at 0.5 m has a capacity of 40 people/second, or 2400 people per minute, or one large train every 30 seconds. The average speed is higher and you no longer need to wait for a train. The only very small problem is the prohibitive cost with existing tech.
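The capacity figures in that comment check out; a quick computation using the comment's own numbers:

```python
# Capacity check for the moving-walkway idea, using the comment's figures
# (10 m/s walkway, pairs of riders spaced 0.5 m apart).
speed = 10.0     # m/s, fastest walkway
spacing = 0.5    # m between successive pairs of riders
per_row = 2      # riders stand in pairs

people_per_second = per_row * speed / spacing
people_per_minute = people_per_second * 60

print(people_per_second, people_per_minute)  # 40.0 2400.0
```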

26
Yizahi 2 days ago 0 replies      
Walking on an escalator is usually not a very efficient thing: you cut throughput in half and will maybe save 30 to 60 seconds walking down, and only if you'll actually catch a train in that period. Walking upwards is even less efficient: saving 10-20 seconds by inconveniencing hundreds of people? Really?

All this is comparable to people crossing streets on a red light and afterwards walking slower than me crossing on green.

27
PhasmaFelis 2 days ago 0 replies      
TL;DR: Because many more travelers stand (on the right) than walk (on the left); catering to the latter potentially leaves nearly half the escalator's capacity unused.
28
Felix21 2 days ago 1 reply      
It's still faster for the person running up the escalator but for that luxury, we are sacrificing half the capacity of the escalator.

Having another row of people on the left means the overall capacity increases and everyone moves faster but I always walk up the stairs and this won't benefit me one bit. Everyone else wins.

Standing when I'm in a rush can never be faster than walking up the escalator.

29
ilzmastr 2 days ago 1 reply      
I've never heard "escalump" in DC, but I have heard and used "escalefter" many times. There was even an ad inside the metro about it one time:

http://klaprothlaw.com/wp-content/uploads/2012/07/Tourism-to...

30
jjp 2 days ago 0 replies      
I assume the biggest improvement is removing the merging problem when people queuing on the left merge at the last minute and people on the right aren't being completely passive to the queue jumping.
31
dreamfactory2 2 days ago 2 replies      
It seems to me that the throughput on the left side is faster - and therefore even if it were more spaced out (not the case in rush hour when it is almost as crowded as the right side), the fact that people are moving faster and spending less time on it would make up for any difference.
32
JupiterMoon 2 days ago 0 replies      
With many things, once people get used to a change, their behaviour adapts in ways that the changer did not hope for. I suspect this will end with each person standing across both sides of the escalator, cutting throughput.
33
faitswulff 2 days ago 3 replies      
Why not make the escalators smaller so they only fit one person at a time? Then they could have two lanes where they used to have one and standers would naturally block walkers.
34
droithomme 2 days ago 1 reply      
The article and its experts assume that "efficiency" is more important than culture and regimen without proving that is true.
35
codeulike 2 days ago 0 replies      
"From my point of view," says TfL's head of transport planning Geoff Hobbs, "the ideal train would look like a bread bin."

lol I can see that being taken out of context.

36
kazinator 2 days ago 1 reply      
The solution is to actually have a full escalator, with an uninterrupted row of people on the left who are walking. That's what I see in Vancouver, and cities in Japan. The escalator is fully occupied, and one side of it is used for walking.

It's surprising to hear that Londoners are just keeping one side clear, with few "takers" to climb up. Are they unfit or something? Respectful, though.

37
interfixus 2 days ago 0 replies      
I run up stairs. Hence I have little patience, but plenty of scorn, for people who can't be bothered with a bit of locomotion on the escalator. Come on, I'm 56, you're half my age, move it!

London Transport might have attempted a campaign along those lines. Lowest common denominator can be tiresome at times.

38
IshKebab 2 days ago 3 replies      
I wonder why they don't make the escalators three people wide. There is certainly space for it.
39
gaius 2 days ago 1 reply      
So in conclusion, present-day Londoners are lazy, and rather than encouraging people to move more, we should pander to them. Why not give everyone a Coke and a Big Mac with every Oyster card too? No wonder there is an obesity crisis.
40
legulere 2 days ago 1 reply      
Is there a reason why all escalators are 2 people wide?
41
bluejekyll 2 days ago 1 reply      
And there goes my little exercise I get in the morning.
42
donatj 2 days ago 1 reply      
I feel like I can't be alone in not wanting to stand next to someone I don't know on an escalator and finding the idea uncomfortable?
43
shabbaa 2 days ago 0 replies      
Tldr?
44
everyone 2 days ago 0 replies      
This sucks. Now people who want to walk up and get a little exercise can't, just because they are outnumbered by lazy sedentaries.. :(
45
nkrisc 2 days ago 0 replies      
It wouldn't be quite as large a problem if people who were physically able would just quit being lazy and walk up the escalator, even if it is a bit of a hike. It's not like walking to the top of a skyscraper.

You know what's faster than either walking up stairs or an escalator? Walking up an escalator.

46
delibes 2 days ago 2 replies      
I used to go through Holborn every morning at 8.50am and I could see them preparing the experiment a few weeks in advance. They placed people in all the corridors to count throughput.

However, as I was changing from the Central line to the Piccadilly line in the morning, like many others I walked through the much quieter 'No Exit' corridor (against the flow of traffic) to avoid/alleviate the congestion in the actual designated exit corridor. I'm not sure if I was counted in their stats as a +1 or -1?

Then they closed my local station anyway, so I switched to the overground. 5 mins more, but so much nicer!

47
stretchwithme 2 days ago 2 replies      
I don't think use of the Underground will increase by 60% by 2050. Mass transit will be a bad memory by then.

When our driverless cars head underground, they will all be going at the same high speed.

48
sandworm101 2 days ago 3 replies      
>>The stand-on-the-left controversy is no exception. Harrison, Stoneman and their colleagues believe it could make a noticeable impact on congestion at some of London's busiest stations, congestion that will only get worse as train design, frequency and reliability improve, as the trains get faster and the doors get bigger, and ever more passengers are dumped on the platform at a time.

THAT is a very British approach. Planners know that a problem is approaching, a problem created willingly by infrastructure improvements elsewhere. But rather than address that spillover issue with money/time/new bricks, yet another code of behavior is to be enforced. The people are to shoulder the burden yet again. Heaven help the tourist in a hurry who gets an ASBO for not maximizing the carrying capacity of tube escalators. I wait for the day the escalator stops and everyone stands motionless for fear of being ticketed.

See https://www.youtube.com/watch?v=DyL5mAqFJds where shoddy architecture is answered by suggesting that things will be ok so long as only lightweight people enter the building.

And I thought there was an obesity crisis? They've been telling us for years to keep moving, and now here is a government agency telling people to stand motionless? I say encourage people to burn calories by running the escalators in reverse!

49
frobozz 2 days ago 3 replies      
Capacity could be improved even more, without fuelling the obesity epidemic and inconveniencing those of us who prefer to keep moving. All they have to do is make walking compulsory on both sides.
50
sbuttgereit 2 days ago 1 reply      
Would this not be fairly straightforward to simulate? I see statements like: "and, it seemed, travelling more efficiently and safely as a result." and "His report prompted Harrison and her colleagues to wonder..." and think... they're guessing? They're going right to disrupting normal travel patterns of many people for a trial on nothing more than a hunch? Why not prove to a reasonable degree that this would actually work beyond the anecdotal feelings of a few employees before inflicting such a change on the public?

Seems like they are just shifting the cost for finding out from themselves to the disrupted travellers. Even if over longer periods of time this proves more efficient, disrupting normal patterns for regular commuters will cause a lot of stress and disorientation; such stress may be a soft cost, but most commutes already suck. I guess when you're a government agency, it's hard to fail your customers in a way that matters to you.
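A simulation of the sort this comment asks for really is straightforward; here is a minimal sketch. Every number in it is an illustrative assumption (arrival rate, per-lane capacities) except the 40% walking preference, which is the article's figure.

```python
# Minimal deterministic queue model: passengers arrive each second,
# split by walking preference, and each lane drains at a fixed capacity.
# All capacities and rates below are illustrative assumptions.

def avg_queue(arrival_rate, walk_pref, stand_cap, walk_cap, seconds=600):
    """Average total queue length at the escalator over the run."""
    q_walk = q_stand = total = 0.0
    for _ in range(seconds):
        q_walk += arrival_rate * walk_pref         # willing walkers arrive
        q_stand += arrival_rate * (1 - walk_pref)  # everyone else
        q_walk = max(0.0, q_walk - walk_cap)       # walk lane drains
        q_stand = max(0.0, q_stand - stand_cap)    # stand lane drains
        total += q_walk + q_stand
    return total / seconds

STAND_CAP, WALK_CAP = 1.9, 1.8   # people/s per lane, assumed

# Walk-left policy: only the 40% willing to walk use the left lane.
walk_left = avg_queue(3.5, walk_pref=0.4, stand_cap=STAND_CAP, walk_cap=WALK_CAP)
# Both-stand policy: everyone queues for two standing lanes.
both_stand = avg_queue(3.5, walk_pref=0.0, stand_cap=2 * STAND_CAP, walk_cap=0.0)

print(f"walk-left avg queue:  {walk_left:.0f}")
print(f"both-stand avg queue: {both_stand:.0f}")
```

With these (made-up) numbers the standing lane is overloaded under the walk-left policy while the walk lane idles, so the queue grows without bound; under both-stand the combined capacity covers arrivals. Change the arrival rate or capacities and the conclusion can flip, which is exactly why measuring the real preference distribution matters before running a trial.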

You Can't Destroy the Village to Save It: W3C vs. DRM, Round Two eff.org
216 points by cpeterso   ago   107 comments top 8
1
epistasis 3 days ago 11 replies      
I know my opinion is not a popular one, but I find these arguments somewhat less than convincing.

The goal seems to be to reduce the spread of DRM. I'm cool with that. However, I'm not sure that these actions will do anything at all to reduce the usage of DRM. My reasoning is that those who want to use DRM are not going to accept any alternative that is not DRM'd. So in order to stop the spread of DRM, keeping DRM non-standardized would have to prevent others from adopting DRM.

So the ultimate question for me is: who's going to start using DRM that's not already? I think this set is zero. Adding standardization of DRM won't close up any more content that wasn't closed before, IMHO.

However, by not standardizing, we lock out all sorts of non-mainstream clients from accessing content. Now that Flash is going to disappear entirely, that means no access to all sorts of content on Linux, unless it's standardized.

So I see something to gain, and nothing to lose by standardizing DRM. I'm making assumptions to arrive at that conclusion, but I believe that they're no worse than the ones that Cory Doctorow is making here. It's just weird to see myself diverging from the EFF on this, and on T-mobile, and other things.

2
pornel 3 days ago 0 replies      
This spec (W3C EME[1]) has been introduced and heavily lobbied by Google and Netflix.

Regardless of what W3C decides, Chrome won't drop Netflix support, and Netflix for now seems to be hell-bent on having total legal control over which devices are allowed to play their content.

[1] https://w3c.github.io/encrypted-media/

3
Silhouette 3 days ago 2 replies      
I'm not much of a DRM fan. That said, the W3C has already become borderline irrelevant to the future of the web, as it has been far too slow to standardise everyday technologies that have been widely deployed in browsers for a considerable time. Realistically, actively refusing to cooperate seems unlikely to stop browser developers from implementing new features. It's just going to mean that those features work differently from one browser to another, and possibly that some browsers or platforms won't offer the features at all. Personally, I'd rather the W3C work within the bounds of reality as a moderating influence than have it become a mere talking shop with no real influence at all.
4
guelo 3 days ago 1 reply      
Am I understanding this right that the idea is that if a member of W3C sues anybody for reverse engineering their DRM then they get kicked out of W3C?

I wonder how much of a deterrent that is. W3C needs Google/Microsoft/Apple more than they need W3C. I don't think the content producers are even members of W3C. I guess it would be companies that create the encryption plugins, like Adobe, that could theoretically sue people under the DMCA. I just don't see how the W3C could even function without the biggest players at the table.

5
dendory 3 days ago 0 replies      
I think the security aspect is the best angle. If a CEO can't access some proprietary web app and is told "You need to use browser X that supports DRM", they will ask "Why isn't that already installed on my system?" But if they are told "The web app uses hidden code that may be insecure, and cannot be audited", then you'll have executives, people in actual power positions, say "Get that stuff off our network".
6
bitwize 3 days ago 0 replies      
Either the Web will adopt DRM or the Web will be considered extremely suspect as a distribution medium.

Similarly, either you will accept backdoored encryption or you will be automatically considered a terrorist and singled out for LE scrutiny.

7
chris_wot 3 days ago 1 reply      
I don't expect Tim Berners-Lee to agree to this, given he was the one who championed DRM at W3C in the first place.

Also, I don't expect Mozilla to do anything useful, given they went along with it so easily. But then, I haven't expected anything much of Mozilla in a long while.

8
stcredzero 3 days ago 2 replies      
People need to realize that DRM is simply a tool. Absent pernicious laws like the DMCA, forms of DRM are like fences, locks, and cameras. Such tools can be used to oppress people. The same tools could also be used to protect individuals' privacy and act as a check on what organizations can do with people's data.

Those who control products and infrastructure shouldn't be allowed to set up such tools to grant themselves disproportionate power. However, individuals need to realize that such tools could help protect them as well. (Analogy: Camera phones and police misconduct.)

Read the TPP readthetpp.com
245 points by SimplyUseless   ago   106 comments top 10
1
tptacek 1 day ago 16 replies      
I opened this page up, skipped forward to the first highlighted section (18. Intellectual Property), and skimmed to the first annotation. The original text:

1. A Party may, in formulating or amending its laws and regulations, adopt measures necessary to protect public health and nutrition, and to promote the public interest in sectors of vital importance to their socio-economic and technological development, provided that such measures are consistent with the provisions of this Chapter.

The annotation:

In other words, the TPP overrides any domestic laws protecting public health and nutrition, or socio-economic development.

That's not at all how the TPP works. The treaty doesn't allow foreign governments to "override" local laws, but rather allows for damage claims against the governments themselves if they enact and enforce laws contrary to the agreements in the TPP itself.

I'd really like the TPP annotated by legal experts. Instead, it's annotated by the CTO of Fight For The Future. I'm not sure that's a win.

2
walterbell 1 day ago 0 replies      
Citizen's Trade organized 1,500 groups to sign a letter to the US Congress against the TPP: http://www.citizenstrade.org/ctc/blog/2016/01/07/1500-groups...

"... the TPP elevates investor rights over human rights and democracy, threatening an even broader array of public policy decisions than described above. This, unfortunately, is the all-too-predictable result of a secretive negotiating process in which hundreds of corporate advisors had privileged access to negotiating texts, while the public was barred from even reviewing what was being proposed in its name.

The TPP does not deserve your support. Had Fast Track not become law, Congress could work to remove the misguided and detrimental provisions of the TPP, strengthen weak ones and add new provisions designed to ensure that our most vulnerable families and communities do not bear the brunt of the TPPs many risks. Now that Fast Track authority is in place for it, Congress is left with no means of adequately amending the agreement without rejecting it entirely. We respectfully ask that you do just that."

3
johnmaguire2013 1 day ago 2 replies      
This seems like a perfect proof of concept for genius.com if they're serious about becoming a way to annotate anything (not just songs).

[1] http://genius.com/web-annotator

4
jariz 1 day ago 1 reply      
This is great and all, and is something that should absolutely be shared. However, if the intent behind this project is to share it with 'the average' person, it's completely useless. No one's going to read through that entire thing; I'd expect them to at least put up a summarized version.
5
shmerl 1 day ago 1 reply      
There really should be a stronger push to scrap this undemocratic monstrosity.
6
dpweb 1 day ago 1 reply      
Read the TPP.. Skipped right to the HN comments about the TPP.
7
lindx 1 day ago 1 reply      
This is what happens when you try to visit this site with Tor: https://anonm.gr/up/b386.png

Cloudflare's captchas are nearly impossible to solve, which means that Tor users are effectively blocked from seeing the site. Would you consider using something other than Cloudflare to host the site?

8
pluckytree 1 day ago 0 replies      
I think the positive benefit of this effort will likely be undermined (and it's already underway) by reactionary comments from uninformed people. They'll play well to people who already know the TPP sucks, but not to those on the fence or really wanting to learn about it.
9
krick 1 day ago 2 replies      
A brief question: should I know what this is if I'm not a US citizen?
10
jsprogrammer 1 day ago 2 replies      
It is ridiculous that after months of Obama telling people to "just read it", he dumped the agreement as ~268 separate PDFs.

Nobody has time for that. It's nice that they have pared this down to 31 different sections, but my guess is that they are not showing the full agreement here.

It would be much nicer if someone just dumped it all into a single PDF and HTML file.

Edit: Care to leave a comment rationalizing your downmods?

The Unreasonable Reputation of Neural Networks mit.edu
263 points by fforflo   ago   138 comments top 18
1
hacker_9 1 day ago 12 replies      
"Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks.

By the same token, the ability of neural networks to learn interpretable word embeddings, say, does not remotely suggest that they are the right kind of tool for a human-level understanding of the world. It is impressive and surprising that these general-purpose, statistical models can learn meaningful relations from text alone, without any richer perception of the world, but this may speak much more about the unexpected ease of the task itself than it does about the capacity of the models. Just as checkers can be won through tree-search, so too can many semantic relations be learned from text statistics. Both produce impressive intelligent-seeming behaviour, but neither necessarily pave the way towards true machine intelligence."

So true, and this is why I don't listen when Elon Musk or Stephen Hawking spreads fear about the impending AI disaster; they think that because a neural network can recognize an image like a human can, it's not a huge leap to say it will soon be able to think and act like a human, but in reality this is just not the case.

2
andreyk 1 day ago 1 reply      
I think this is a good analysis of what Deep Learning is particularly good for and its limitations, but was somewhat annoyed at the lack of any citations of people actually overhyping it. The most there was is this:

"This is all well justified, and I have no intention to belittle the current and future impact of deep learning; however, the optimism about just what these models can achieve in terms of intelligence has been worryingly reminiscent of the 1960s."

From what I've read and seen, the leading people in the field (Yann LeCun, Hinton, etc.) seem to be very aware that the current methods are particularly good for problems dealing with perception but not necessarily reasoning. Likewise, I have not seen many popular news sources such as NYT make any crazy claims about the potential of the technology. I hope, at least, that the people who work in AI are too aware of the hype cycles of the past to get caught up in one again, and so there will not be a repeat of the 60's.

3
boltzmannbrain 1 day ago 1 reply      
I think readers of this post will enjoy "What is Machine Intelligence vs. Deep Learning vs. Artificial Intelligence" by Numenta's Jeff Hawkins: http://numenta.com/blog/machine-intelligence-machine-learnin...
4
theideasmith 1 day ago 2 replies      
Someone once gave the analogy of climbing a tree to reach the moon: you can report steady progress right up until you get to the top of the tree/mountain. I think this is applicable here. We'll need a new paradigm, beyond statistical learning, to create AGI.
5
proc0 1 day ago 1 reply      
Another article basically saying something along the lines of "there is no current technology that comes close to producing AGI, therefore let's dismiss all these technologies". Of course we don't know what we don't know, until we do, and then it's not as mysterious.

It's not hard to see that the reason NNs are becoming the prime candidate for AGI is their architecture inspired by biological neurons. We are the only known AGI, so something similar to the brain will likely be what produces an AGI. NNs at least mimic the massively parallel property of biological neurons. And if we're optimistic, the fact that NNs mimic how vision works in our brain might mean that we are at some point in the continuum of the evolution of brains, and it's a matter of time until we discover the other ways brains evolved intelligence.

What keeps me optimistic is evolution. At some point brains were stupid, and then they definitely evolved AGI. The question is how did this happen and whether or not there is a shortcut, like inventing the wheel for transportation instead of arms and legs.

6
maciejgryka 1 day ago 0 replies      
Nice article - it's good to be realistic about what we can do with current tools.

I feel like the gist of what current neural nets can do is "pattern recognition". If that's fair, I also suspect that most people underestimate how many problems can be solved by them (e.g. planning and experiment design can be posed as pattern recognition - the difficulty is obtaining enough training data).

It's true that we're most likely a very long way away from general AI - but I'm willing to bet most of us will still be surprised within the next 2 years by just how well some deep-learning based solutions work.

7
otakucode 1 day ago 0 replies      
I expect that as we improve machine intelligence more and more, aside from the fact that we will simply keep moving the goalposts of what we consider "intelligent" like the irascible scamps we are, we're going to discover that embodiment is absolutely necessary. Not just any embodiment either, but we will need to place the neural networks in bodies very much like our own. Neuroscience continues to find surprising things that link our "general human intelligence" to our bodies. Paralyze a face and a person becomes less able to feel happiness or anger, eventually forgetting what feeling those things even meant, as one example.

We shouldn't forget that the mind/body split is a wholly artificial construct that has no basis in reality. The brain is not contained in the head. The nerves running down your spine and out to your toes and all over your body are neurons. Exactly the same neurons, and directly connected to the neurons, that make up what we think of as the separate organ 'the brain'. They're stretched out very long, from head to toe, sure, but they are single cells, with the exact same behavior and DNA, and there is no reason to presume that they must have some especially insignificant role in our overall intelligence.

Then there is the fact that it is probably reasonable to presume that a machine which has human-level intelligence will not appear overnight. It would almost necessarily go through long periods of development. During that development, when the machine begins to behave in ways the designers are not able to understand, what will be their reaction? Will they suppose that maybe the machine had intentions they were unaware of, and that it is acting of its own volition? Or will they think the system must be flawed, and seek to eliminate the behavior they didn't expect or understand?

I have a hard time imagining that an AI system will be trained on image classification and one day suddenly say "I am alive" to its authors or users. If it instead performs poorly on the image classification because it is pondering the beauty of a flower in one of the images, what are the chances that nascent quasi-consciousness would be protected and developed? I think none. We only have vague ideas about intelligence and consciousness and our ideas about partial intelligence are utterly theoretical. Has there ever been a person who was 1% intelligent? Is mastering checkers, or learning NLP to exclusion of even proprioception 1% of human intelligence? You optimize for what you measure... and we don't know how to measure the things we're looking for.

8
Houshalter 1 day ago 1 reply      
>Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks.

Here's the important difference about NNs: they are incredibly general. The same algorithms that can do object recognition can also do language tasks, learn to play chess or Go, control a robot, etc., with only slight modifications to the architecture and otherwise no domain information.

That's a hugely different thing from brute-force game-playing programs. Not only could they not learn the rules of the game without prior knowledge, they couldn't even play games with large search spaces like Go. They couldn't do anything other than play games with well-defined rules. They are not general at all.

Current neural networks have limits. But there is no reason to believe that those limits can't be broken as more progress is made.

For example, the author notes that neural networks overfit. They can't make predictions when they have little data. They need huge amounts of data to do well.

But this is a problem that has already been solved to some extent. There has been a great deal of work into bayesian neural networks that avoid overfitting entirely. Including some recent papers on new methods to do them efficiently. There's the invention of dropout, which is believed to approximate bayesian methods, and is very good at avoiding overfitting.
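Dropout itself is mechanically tiny; a hedged numpy sketch of inverted dropout at training time (toy shapes, not taken from any particular paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop=0.5, training=True):
    """Inverted dropout: zero each unit with probability p_drop and
    rescale the survivors so the expected activation is unchanged."""
    if not training or p_drop == 0.0:
        return activations
    keep = rng.random(activations.shape) >= p_drop
    return activations * keep / (1.0 - p_drop)

h = np.ones((4, 8))           # a toy hidden-layer activation
out = dropout(h, p_drop=0.5)  # each unit is now either 0.0 or 2.0
print(out.shape)              # prints (4, 8)
```

At test time the layer is left untouched (`training=False`), so averaging over many random training masks behaves like an ensemble of thinned networks — the property the Bayesian-approximation argument builds on.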

There are some tasks that neural networks can't do, like episodic memory and reasoning. And there has been recent work exploring these tasks. We are starting to see neural networks with external memory systems attached to them, or ways of learning to store memories. Neuroscientists have claimed to have made accurate models of the hippocampus. And DeepMind said that was their next step.

Reasoning is more complicated and no one knows exactly what is meant by it. But we are starting to see RNNs that can learn to do more complicated "thinking" tasks, like attention models, and neural turing machines, and RNNs that are taught to model programming languages and code.

9
tim333 1 day ago 0 replies      
>Extrapolating from the last few years progress, it is enticing to believe that Deep Artificial General Intelligence is just around the corner and just a few more architectural tricks, bigger data sets and faster computing power are required to take us there. I feel that there are a couple of solid reasons to be much more skeptical.

On the other hand there are reasons to be optimistic. Human brains are built from networks of neurons and the artificial neural networks are starting to have quite similar characteristics to components of the brain - things like image recognition (https://news.ycombinator.com/item?id=9584325) and Deep Mind playing Atari (http://www.wired.co.uk/news/archive/2015-02/25/google-deepmi...)

The next step may be to wire the things together in a similar structure to the human brain, which is kind of what DeepMind are working on - they are trying to do the hippocampus at the moment. (https://www.youtube.com/watch?v=0X-NdPtFKq0&feature=youtu.be...)

Also, we are approaching the point where reasonably priced hardware can match the brain, roughly in the 2020s (http://www.transhumanist.com/volume1/moravec.htm)

It'll be interesting to see how it goes.

10
MichaelMoser123 1 day ago 1 reply      
I think that this book is really interesting "Surfaces and essences: Analogy as the fuel and fire of thinking" by Hofstadter and Sander

Many people got disillusioned with classical AI because mathematical logic (inference engines) would not scale to 'strong' AI.

Hofstadter says that most concepts handled by humans do not fit one-to-one into clear-cut ontologies. Instead, higher-order concepts are created by finding analogies between objects or simpler concepts, and by grouping these similar concepts into more complex entities.

I have a summary of the book here http://mosermichael.github.io/cstuff/all/blogg/2013/10/15/po...

11
bsbechtel 1 day ago 3 replies      
We will never have human level AI until we can properly understand, define, and model human intelligence. While we are advancing at a very rapid pace on that front, we are still years away from the field being considered mature.
12
tianlins 19 hours ago 0 replies      
It is true that most recent successes of deep neural networks are in the regime where n and d are large. And we surely shouldn't fantasize that general AI will be solved this way. However, the very appealing aspect of deep neural networks is end-to-end training: for image recognition, we can map from raw pixels to output. This is very different from other ML techniques. In some sense, deep neural networks learn "algorithms", not just "models". This formulation can be richer, especially when given lots of data.
13
skybrian 1 day ago 1 reply      
There was a recent paper [1] about learning visual concepts from few examples. I don't know if it generalizes or not, but it seems too early to assume that researchers will hit a dead end.

[1] http://science.sciencemag.org/content/350/6266/1332.full

14
sevensor 1 day ago 0 replies      
At last, the thing that's unreasonable isn't effectiveness. I've been hoping for a while that someone close to the field would cut through the hype and put ANNs in context.
15
interdrift 1 day ago 2 replies      
We have something that can understand a pattern but we don't have something that can understand how different patterns relate to each other.
16
tacos 1 day ago 2 replies      
Current top post quotes the most negative observation of the paper. Here's the most positive, and perhaps the most useful to HN readers or investors who are exploring the space:

"Deep learning has produced amazing discriminative models, generative models and feature extractors, but common to all of these is the use of a very large training dataset. Its place in the world is as a powerful tool for general-purpose pattern recognition... Very possibly it is the best tool for working in this paradigm. This is a very good fit for one particular class of problems that the brain solves: finding good representations to describe the constant and enormous flood of sensory data it receives."

17
DrNuke 1 day ago 0 replies      
I think many are missing the point here: AI can be very stupid and still wipe everything out. It only takes some sort of irreversible minimisation function to let machines destroy everything in sight. Drones are the first step, then comes IoT - what else? We fully depend on machine learning right now. So no wonder many are scared even before machines become human-intelligent.
18
neom 1 day ago 0 replies      
How far are we from general purpose quantum computing?
An introduction to Machine Learning docs.google.com
295 points by antoineaugusti   ago   34 comments top 12
1
antoineaugusti 2 days ago 0 replies      
Please note that I'm not the author of the presentation. Made by Quentin de Laroussilhe http://underflow.fr

I had to make a copy to my Google account to keep the slides.

2
rafaquintanilha 2 days ago 2 replies      
Worth mentioning that a Stanford Statistical Learning course [1] just started, and according to the lecturers there is a lot of overlap between the two areas.

[1] https://lagunita.stanford.edu/courses/HumanitiesSciences/Sta...

3
compactmani 1 day ago 1 reply      
If you are just starting out with applied machine learning, I would focus heavily on understanding bias and variance, as it will really help you succeed. I think it's what (largely) separates the sklearn kiddies from the pros.
4
aabajian 2 days ago 1 reply      
This really is a fantastic presentation for newcomers to the field. When I was taking these classes I found it difficult to keep all of the available algorithms organized in my mind. Here's an outline of his presentation:

Overview (5 slides)

General Concepts (9 slides)

K nearest Neighbor (6 slides)

Decision trees (6 slides)

K means (4 slides)

Gradient descent (2 slides)

Linear regression (9 slides)

Perceptron (6 slides)

Principal component analysis (6 slides)

Support vector machine (6 slides)

Bias and variance (4 slides)

Neural networks (6 slides)

Deep learning (15 slides)

I especially like the nonlinear SVM example on slides 57 and 58. It provides a visual of projecting data into a higher dimensional space.
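The projection idea those slides visualize can be sketched in a few lines of numpy (toy data invented for illustration): 1-D points that no single threshold separates become linearly separable after the explicit feature map x ↦ (x, x²), which is the same move a polynomial or RBF kernel makes implicitly.

```python
import numpy as np

# Toy 1-D data: class 0 lives inside [-1, 1], class 1 outside it.
# No single threshold on x separates the two classes.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([1, 1, 0, 0, 0, 1, 1])

# Explicit feature map phi(x) = (x, x^2).
phi = np.column_stack([x, x ** 2])

# In the lifted 2-D space a single horizontal line (x^2 = 2) separates them.
pred = (phi[:, 1] > 2.0).astype(int)
print((pred == y).all())  # prints True
```

A kernel SVM never builds `phi` explicitly; it only evaluates inner products in the lifted space, but the geometric picture is the same.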

5
yelnatz 2 days ago 0 replies      
Pretty good summary of what you learn in your first machine learning class in college.
6
fnl 1 day ago 1 reply      
Is nobody concerned about plagiarism here? I am pretty sure I've seen a number of these slides and graphics elsewhere. Correct attributions, however, seem to be missing.
7
lectrick 2 days ago 3 replies      
Is there an online course for this I could take?
8
kendallpark 2 days ago 1 reply      
Yes, thank you. I'm hoping to build an ANN this summer and don't have the luxury of taking an actual class.

Does anyone have any other resources?

9
aerioux 2 days ago 0 replies      
that was a really good introduction :) sort of like an executive summary - all the "why we care" and some of the words you might want to look at to actually learn the details
10
max_ 2 days ago 0 replies      
Thanx for sharing this!!
11
Dowwie 2 days ago 1 reply      
is there a corresponding video where the slides are presented?
12
remriel 2 days ago 0 replies      
Thank you.
An open letter of gratitude to GitHub github.com
248 points by arthurnn   ago   105 comments top 28
1
mmcclure 3 days ago 4 replies      
I assume this is a response to the "Dear GitHub" letter. I'm fairly certain that everyone involved in that letter (including myself) is very appreciative of GitHub and its impact on OSS. That letter didn't feel ungrateful or malicious at all to me, but I sure hope it didn't come off that way to others.

What I do frequently see with GitHub is that they've managed to work their way into almost being beyond reproach. This letter feels like an example of that... almost like GitHub needs someone to stand up for it in light of some meanies picking on it.

It's a good product. We should give credit where credit is due; just don't forget it's a product. A (by all indications) very profitable product that wants to make money off you. That is its goal and purpose in life, and OSS furthers it. For the record, I think this is a good and healthy relationship, but we shouldn't pretend it's some FOSS group or non-profit struggling to provide us with Git hosting.

2
luso_brazilian 3 days ago 4 replies      
Before 2007, the way to participate in Open Source was fragmented. Each project had their own workflow, patches circulated in emails, issues were reported in a myriad ways, and if anyone wanted to contribute they had to figure out every project's rules.

Then, a handful of guys took the challenge to build an awesome platform and as a consequence of their hard work, their platform earned its hegemony.

Two things stand out in this "thank you Github" open letter:

1. While the situation improved tremendously in certain areas, the way to participate in Open Source is still very much fragmented. Most of the major open source projects (like Linux, Mozilla, Apache and nginx, to name a few) still have their own workflows, patches are still circulated in emails and issues are still being reported in a myriad of ways. Despite the big visibility GitHub has among new open source projects, we are very far from not being fragmented.

2. Before 2007 we had, for instance, SourceForge that back then had also earned its hegemony and, for a series of reasons (one of them being too late to answer to the community wants and needs) lost its way, its hegemony and its user base.

There is time for praise and time for hard work and, IMO, the "Dear GitHub" open letter is a constructive way to call attention to the perceived problems, while the 'Dear "Dear GitHub"' and this gratitude letter are dismissive of those concerns (the former) and mostly empty praise and adulation (the latter).

3
mpdehaan2 3 days ago 2 replies      
Posting this right after some good suggestions for the service was given feels like this is saying it is wrong to make suggestions for GitHub because they have done good things.

This to me, itself, is wrong.

The GitHub issue tracker does need to change. While it's great for OSS that projects can get a leg up SOONER, GitHub does introduce its own problems by having watered-down tooling in some areas.

I'm STILL at odds with how it has shifted the equation from "discuss" to "throw code at the problem", which generates extra code review and often angry contributors when their patches are not immediately merged, are unwanted, or have to be reworked.

GitHub has done some GREAT things because it has built up critical mass, but because it has gotten critical mass and has become a defacto standard, does have some obligation to keep up with demand.

This seems passive aggressive to me.

4
phaed 3 days ago 2 replies      
The motivation behind this letter is embarrassing.

It's as if they were talking to GitHub the thankless FOSS maintainer. Quit mirroring guys. It's a for-profit enterprise that would do well to listen to the concerns of its userbase.

5
minimaxir 3 days ago 1 reply      
The controversy isn't black-and-white and I'm not sure why this letter is painting it that way. GitHub can be a major boon to open source and have core issues which make it incredibly frustrating to work with the service.

The dilemma is about the sum of the parts.

6
throwaway1456 3 days ago 1 reply      
> Before 2007, the way to participate in Open Source was fragmented. Each project had their own workflow, patches circulated in emails, issues were reported in a myriad ways, and if anyone wanted to contribute they had to figure out every project's rules.

And it was much better IMO. Now we have a centralized website, in the hands of a single corporation, which requires nonfree JavaScript for much of the basic functionality[1]. Git was designed to work well with email and has commands built-in to format, send and apply patches. I think anyone who used email for patches seriously will agree that they are largely superior to GitHub's pull requests.
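For reference, git's built-in email workflow looks roughly like this (a throwaway-repo sketch; all names, paths, and identities below are made up for illustration):

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
cd "$work"

# "Contributor" side: make two commits, export the newest as a patch file.
git init -q upstream
cd upstream
git config user.email contributor@example.com
git config user.name Contributor
echo hello > README
git add README
git commit -qm "Add README"
echo more >> README
git commit -qam "Expand README"
git format-patch -1 -o ../patches HEAD   # writes an email-formatted .patch file

# "Maintainer" side: start from the old state and apply the patch.
cd ..
git clone -q upstream maintainer
cd maintainer
git config user.email maintainer@example.com
git config user.name Maintainer
git reset -q --hard HEAD~1               # pretend the patch never arrived
git am -q ../patches/0001-Expand-README.patch
git log --format=%s -1                   # prints "Expand README"
```

With `git send-email` in between, the `.patch` file travels as an actual mail, and `git am` on the receiving end preserves the original authorship.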

The free software movement being fragmented is a good thing. GitHub is the land of trends: web developers using Mac OS X who make apps with the latest trendy frameworks like React and Angular (if you think that's a misportrayal, look at the first three pages of the most starred repositories on GitHub[2]). These people don't care about the free software movement, they're just following the current trends, one of which is "Open Source". But if they really cared about free software, they would not be using Mac OS X or GitHub, which requires you to run nonfree JavaScript code in your browser to report issues, open pull requests, etc.

The serious projects that do care about free software don't use GitHub.

[1]: See Mike Gerwitz's GitHub Does Not Value Software Freedom: https://mikegerwitz.com/about/githubbub

[2]: https://github.com/search?q=stars:%3E1&s=stars&type=Reposito...

7
swillis16 3 days ago 2 replies      
Github makes millions of dollars per year and has a huge amount of users so the product is proven good. While I understand the need to be appreciative of Github, giving the organization a bit of user feedback is not going to hurt them very much. The response assumes that this is written because of the "Dear Github" letter.
8
gkya 3 days ago 2 replies      
I find the current situation with GitHub very unhealthy because, even though it's very unlikely, someone can literally pull the plug on open source. Not all of it, but a great part. If such a thing happened, be it through government intervention or some crazy attack on GitHub, it would make us lose a lot of time migrating to other solutions. It would be like a Great Barbaric Invasion of Open Source, where everyone migrated to Bitbucket or private solutions, sort of an OSS incastellation.
9
jballanc 3 days ago 2 replies      
> Before 2007, the way to participate in Open Source was fragmented.

Um... ever hear of SourceForge? Yeah, before 2007 there was another OSS hegemony. It failed to meet its users' needs. It was replaced.

So it goes.

10
skywhopper 3 days ago 1 reply      
A case study in passive aggressive behavior. Well done!
11
carlsborg 3 days ago 1 reply      
such a letter should include a note of thanks to torvalds for creating git in the first place.
12
mchahn 2 days ago 0 replies      
One time an Atom user posted a question on the Atom editor forum. He said he loved Atom and wanted to donate. It was a bit complex to explain that they would be contributing to a large corporation, GitHub. I thought this was symbolic of the confusing relationship GitHub has with open software.
13
notabot 3 days ago 1 reply      
I just want to point out accepting pull requests for signatures is a bad idea -- someone is going to lose the race and rebase over and over if unlucky, assuming many people are going to sign this. :-)
14
tzs 3 days ago 0 replies      
> Before 2007, the way to participate in Open Source was fragmented. Each project had their own workflow, patches circulated in emails, issues were reported in a myriad ways, and if anyone wanted to contribute they had to figure out every project's rules.

Don't you still have to figure out every project's rules? Being on Github does not impose coding guidelines, testing requirements, documentation requirements, contributor license agreement policies, project management and governance system, code review process, dispute resolution process, and so on.

> Nowadays doing Open Source is infinitely easier thanks to you, GitHub. You've provided the tools and the social conventions to make those days a thing of the past.

Nearly every time over the past 30+ years that I've wanted to fix a bug or add a feature to some open source thing I've been using, and been thwarted, it was never figuring out the workflow, or patch procedure, or issue reporting that did me in, or figuring out the project's rules.

The big problem has usually been one or both of (1) the project has a bazillion files and it is not at all clear from the meager documentation and haphazard directory organization which are for the thing itself and which are for ancillary tools, and (2) it gets build errors that I can't resolve.

15
santix 3 days ago 1 reply      
I'll repeat what I wrote on the "Dear GitHub" post.

Shouldn't we (the OSS community) have an open source, roll-your-own version of something like GitHub? Like, the repo-management equivalent to a phpBB or a Wiki or a Wordpress.

We do have the separate components, though maybe the hard part is to glue them together. But still, it is something that would be worth the time and effort, wouldn't it?

16
davexunit 3 days ago 0 replies      
Sorry, but as a free software advocate, this really bugs me.

>Before 2007, the way to participate in Open Source was fragmented. Each project had their own workflow, patches circulated in emails, issues were reported in a myriad ways, and if anyone wanted to contribute they had to figure out every project's rules.

And now we have a monoculture. Monoculture is bad, folks.

This letter paints pre-2007 as something bad because everyone used their own infrastructure for their projects, but this is actually a really great thing. It meant that more projects had autonomy over the infrastructure that they rely on. So, rather than needing to beg a for-profit corporation for features that they want, they could actually change the software they used to work for them. Monoculture is more convenient for the masses, but trading freedom for convenience is a bad deal in the long-term.

The web is becoming more centralized every day, to the detriment of all Internet users whether they know it or not, and when SaaS apologists thank GitHub for helping it makes me upset. A federated, free software source code hosting tool could solve the barrier to entry problem without relinquishing control to a company who ultimately does not care about you.

And how about GitHub's ToS? Has anyone read it? Probably not. I didn't when I signed up. Did you know that changes to the ToS can happen any time and without notice? Even if you did read the terms, by agreeing to them, you agree that they can completely change them. Who would reasonably agree to that if it were not buried in legalese? You also surrender your rights to a fair trial by defending and indemnifying GitHub. For further reading, see "Why I don't support or contribute to GitHub repositories" [0] or read the ToS for yourself.

Now, on a technical note: GitHub encourages bad development practices by hooking people on its web interface. The Pull Request interface is the biggest offender. It encourages unclean commit history because it's scary to rewrite the patch set of a pull request. If you rebase fixup commits, you have to force-push the changes. You cannot even take the safer route of deleting the remote branch and pushing the new branch, because GitHub will automatically close the pull request with no way to re-open it. So, most people just pile on fixup commits that never get squashed into decent patches. And that's not all! The Pull Request interface makes it difficult to comment on individual patches by encouraging reviewers to look only at the aggregate diff of all patches. This lowers patch quality: a bunch of terrible patches that only look okay when squashed together end up in the Git repository. When your patch history sucks, it reduces the utility of blaming and bisecting to find issues or otherwise learn about the code. Reviewing patch sets on a mailing list is, despite being "low tech", a much better experience for me. I'm not forced to use a web interface, I can just use the email client of my choosing, and Git already knows how to do an email-based workflow. There's a reason why a huge project like Linux still does patch review via email.
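The fixup-then-squash workflow described above is easy to demo; a throwaway-repo sketch (file names and identities made up for illustration) of folding a fixup commit back into its target with autosquash:

```shell
#!/bin/sh
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.email dev@example.com
git config user.name Dev

echo one > file.txt
git add file.txt
git commit -qm "feature: add file"

# A review fix lands as an autosquash-friendly "fixup!" commit...
echo two >> file.txt
git commit -qa --fixup HEAD

# ...and a non-interactive autosquash rebase folds it into its target,
# after which the branch must be force-pushed (the part GitHub punishes).
GIT_SEQUENCE_EDITOR=true git rebase -i --autosquash --root
git rev-list --count HEAD   # prints 1: a single clean commit remains
```

`GIT_SEQUENCE_EDITOR=true` accepts the generated todo list unchanged, which is what makes the interactive rebase scriptable here.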

In conclusion, GitHub is a company that receives almost nothing but praise. Most criticism is dismissed because they have a nice UX for a certain group of users (not me). I think GitHub has harmed the free and open source software community ethically, legally, and technically. I no longer use GitHub for hosting my personal projects. I write all of this in the hope that more people will recognize this and work on real replacements for GitHub.

[0] https://wubthecaptain.eu/articles/why-i-dont-support-github....

17
akash0x53 3 days ago 1 reply      
Is that user seriously going to merge every incoming PR? What a great person with a hell of a lot of time.
18
resca79 3 days ago 0 replies      
I don't totally agree with either letter, just for personal reasons, but this is not the point.

Overall, GitHub is a cultural place where anyone can improve their personal skills, especially in computer programming, thanks to the huge amount of code hosted on it. I have a romantic vision of GitHub. For instance, people from poor parts of the world can study great code with this site.

Yes, it is a company with investors, and it has probably made some wrong decisions, and if we want we can choose other services, but today, sorry for the repetition, GitHub represents an open and huge cultural hub.

19
akash0x53 3 days ago 1 reply      
I recommend you don't send any pull requests. One of the most useless projects on GitHub. How is he going to merge thousands of PRs, and PRs that just add a name? Seriously? This is just a waste of time and resources.
20
ybden 3 days ago 0 replies      
I'm a little surprised that GitLab (https://about.gitlab.com/) has only been mentioned once so far, considering the many times that people have brought up the topics of GitHub being closed-source and the ecosystem of tools being fractured.
21
wereHamster 3 days ago 0 replies      
When GitHub came along it was an improvement over what was available back then. But that doesn't mean it's perfect or that it can stop evolving. Yes, thank you GitHub for what you achieved in our community so far, but dear GitHub don't stop and keep moving forward.
22
manigandham 3 days ago 0 replies      
If you want to thank them, just be a customer. This is unnecessary and seems like it devalues any criticism, especially considering the other recent letter. They are not some sacred thing to be protected.
23
nikolay 3 days ago 0 replies      
I can't stand brown-nosing...
24
justinph 3 days ago 2 replies      
Did anyone else notice that the list of signers on this letter is entirely male?
25
JustSomeNobody 3 days ago 1 reply      
Can we PLEASE stahp already with the "open letters"!?
26
qaq 3 days ago 0 replies      
Hmm, so GitHub's response is to post a single generic "we will look into it" and then spend time and resources arranging this marvel of a letter.
27
gooddoob 3 days ago 0 replies      
What a nice slap to so-called community leaders.
28
lgleason 3 days ago 0 replies      
Since corporations are people too under US law, I guess someone wanted to make sure it feels good about itself.....;)
Ask HN: What sites do you use to find contract work?
332 points by the_wheel   ago   153 comments top 38
1
andreasklinger 23 hours ago 10 replies      
I used to do a lot of contract work. I can't tell you what you should do, but here is what I did, and it worked for me.

Two approaches:

1) Work for one large client and essentially become an employee (consider this: a lot of startups pay good money for remote employees)

2) Work for multiple clients

Focusing on #2 here

Core rule: You want to be paid premium for quality and service.

Avoid marketplaces - it's very hard to compete on quality here.

Niche down - the more focused you are on a (profitable) niche, the better you can charge a premium for domain competence

As thibaut_barrere mentioned, build a brand - I would even go further: create an agency-like brand. At the point I stopped saying "I" and started saying "we", I was able to charge more.

Don't charge by the hour but by the value - most developers charge for their time; you want to charge for the value you provide to the client. Read up on "willingness to pay"

Most important: deliver as promised and always try to over-deliver in service, quality, etc. E.g., try to understand why the client asks for features and not only what features he or she asks for - you might be able to come up with better solutions or anticipate future requests. Any successful project should usually lead to improved reputation and more projects and clients.

Good luck!

2
peterbsmith 23 hours ago 2 replies      
Personal networks.

I came into Syracuse knowing nobody and nothing.

I had never done any app making as of January 2015. I had done some wordpress stuff, but just the basics.

And I had (and have) no CS degree.

I now make a living on contract work. I did it by going to local meetups and introducing myself as a freelance web developer. Never mind that I hadn't done freelance web development ever. I kept going to meetups for months and still attend a monthly hacker meetup. I participated in hackathons without really knowing how to program.

But all along the way I met people more experienced than I am and picked up two clients. I think one thing that I do differently from most is that I charge a high rate (I always quote $150/hr). I am willing to negotiate lower than that, but it's a starting point. I have been paid that in the past for less complicated work like hiring developers and being a project manager.

What am I saying? Your question is what sites to use. Just one: meetup.com

3
kaizensoze 21 hours ago 1 reply      
HN - http://hnhiring.me/

Remote OK - https://remoteok.io/

Stack Overflow - https://careers.stackoverflow.com/jobs?allowsremote=True

LiquidTalent - http://www.liquidtalent.com/

Working Not Working - http://workingnotworking.com

Hired - https://hired.com/contract-jobs

Gigster - https://gigster.com/

Mirror - http://mirrorplacement.com/

Metova - http://metova.com/

Mokriya - http://mokriya.com/

HappyFunCorp - http://happyfuncorp.com

Savvy Apps - http://savvyapps.com/

Clevertech - http://www.clevertech.biz/

Workstate - http://www.workstate.com/

AngelList - https://angel.co/jobs

I know you're just asking for sites and not approaches to finding contract work, but getting in with a very promising early stage company through contract-to-hire [that allows remote] is probably the most sustainable way to go.

Doing one contract project after another at an hourly rate just doesn't scale well financially, and finding the next decent client can be like pulling teeth.

4
thibaut_barrere 1 day ago 1 reply      
I've been contracting, consulting & freelancing for the last 10 years (5 years completely remote). My advice is to avoid "searching contract work", but reverse the situation completely: make your new clients find you instead. I wrote about this in depth here: https://www.wisecashhq.com/blog/how-to-have-clients-find-you....

Sites /can/ work (I know people who make a good living off certain sites), but nothing will beat self-managed marketing in the long run.

Feel free to email me (see profile) if you have specific questions.

Good luck!

5
marknutter 22 hours ago 0 replies      
I posted this article on medium the other day that contains all the advice I've compiled after 8 years of freelancing as a software developer: https://medium.com/@marknutter/advice-for-the-freelance-deve...

In short, to answer your question, I never used any sites to find contract work. I got all my leads through face-to-face interaction with real humans in the real world, and a good deal of it came from word-of-mouth because of exceeding my clients' expectations.

Contracting sites marginalize developers, and the type of clients who trawl them are typically the kind who will try to squeeze as much work out of developers for as little money as they can. On top of that, developers are generally a pretty introverted crowd, so the number of introverted and talented developers who trawl those sites looking for work is far greater than the number of outgoing, personable developers in your local area. Which group do you want to compete against?

6
mathgeek 1 day ago 1 reply      
Welcome to HN! You'll find that this was asked previously:

https://news.ycombinator.com/item?id=8908279

7
graham1776 17 hours ago 1 reply      
The one thing I always tell anyone on the job hunt (which in your case is finding contract work), which few ever seem to take me up on: Informational Interviews.

These are informal "Can I take you out to coffee?" talks with people in your industry to see what they are working on, what is happening with them, and what is going on in the industry. Every job I have ever gotten has come through informal meetings with people I met through my network (whether it's your old job, your friends, parents, relatives, or others).

At the end of every one I ask: "Is there anyone else you think I should talk to?" and "Do you currently have any opportunities at your company for me?". Rinse, repeat. I guarantee that after investing in 30 informational interviews you will find work.

8
swimduck 22 hours ago 0 replies      
I have a different approach to finding contract work, particularly as I don't have much work experience. Upwork and similar websites have not worked well for me.

Instead, I browse job boards and when I find an interesting role I contact the company. If they are interested in my background and the fit is right, I sell them on setting up a contract relationship instead of full-time employee. Sometimes it works, other times it doesn't. The important part is being honest that you are looking to work as a contractor, not an employee.

Job boards to consider: AngelList, WeWorkRemotely, etc. If you're looking for a list of job boards, http://nodesk.co has lots, and so does this article by Teleport: http://teleport.org/2015/03/best-sites-for-remote-jobs/

9
dmitri1981 22 hours ago 2 replies      
For those who are London-based, I recently launched a mailing list for members of the London Hacker News Meetup, which sends out contracts based on your language preference. It's averaging about 10 jobs a month at the moment, but I am working on getting it to about 100 per month by the end of the year. The current sign-up page is at http://eepurl.com/byq7Af
10
pmorici 23 hours ago 2 replies      
I would avoid sites like Upwork (aka: odesk), elance, and anything similar like the plague unless working for less than minimum wage and dealing with morons is your idea of good contract work.

I suspect the secret to contract work success lies in having really good networking skills and a Rolodex of contacts from having worked in a given industry and having a reputation as someone who delivers. If you don't have that then you would probably have better luck finding reasonable work by going to meetups or similar industry events to build a network of professional contacts. The only way I know of to do this online is to become a notable contributor to prominent open source projects and then use that to leverage paid work.

11
ThePhysicist 23 hours ago 2 replies      
I recently made it through the Toptal (http://www.toptal.com) screening process but haven't taken on any work through their site yet. The hourly rate that you can ask there seems quite reasonable, though, compared to sites like upwork.com, where you will mostly compete with people who are willing to work for $10/hour (which for someone living in a developed country is just not possible).

For Germany, Gulp (www.gulp.de) is a very good site where you can actually find clients that are willing to pay a reasonable hourly rate (they even have a rate calculator on their site).

12
nbrempel 23 hours ago 1 reply      
I've never used a website. Reach out to everyone you know. Buy them a coffee, mention you are getting into contracting, ask who else you should talk to, thank them, repeat.
13
quackware 22 hours ago 2 replies      
I get my clients primarily through gigster (http://www.gigster.com), referrals, and my website.
14
Nursie 22 hours ago 1 reply      
What's the context?Which country are you in?What are your skills?

If you're in the UK...

I've been contracting about 3 years now and started the simple (and probably dumb) way - stick a resume up on jobsite.co.uk and wait for agents to call. Lots will. Be nice to them on the phone, but be firm about the rates and locations you're willing to work for. You'll get lots of useless ones who haven't even bothered to read it, but no matter, you'll learn to filter them out pretty quickly. Remember the good ones. Rinse, repeat.

I've had two contracts now through reputation, which is quite nice, but getting contracts from previous workmates isn't a panacea. One of them was the most boring thing I've ever done in my life (worse than shelf-stacking in a warehouse) and I quit after three weeks because I was literally unable to complete the work it was so dull. I told the client that I was poor value for money and a recent graduate would be a better choice. The other one was good though!

Also, make sure you're prepared for some time off between contracts, it's pretty much going to happen.

15
BorisMelnik 21 hours ago 0 replies      
I like to go to places like Upwork or Elance and seek out people in the US with low rep who haven't done a lot of jobs. A lot of times those people work for big companies that are stuck in a situation and need a quick hack put together. Do a good job and you get put on their 'list' for future use.
16
eswat 22 hours ago 0 replies      
I tackle this sideways by going to Meetup or Eventbrite. Specifically, I go to meetups and events that potential buyers go to and let them know what I do (I don't try to sell my services on first contact). It takes some pruning, but after a while my preferred clients are the ones I keep in contact with, and we start working. I get less work through this than just by referral, though.

Depending on your living situation and time available, I'd recommend trying to establish your own identity so you don't have to go through a marketplace for contract work. Instead, you'll have the contract work come to you, not filtered through a middleman who takes a cut out of your work. I would never recommend someone go through Fiverr, Upwork, or these other marketplaces unless they were just moonlighting.

17
odonnellryan 19 hours ago 1 reply      
A lot of people are saying job websites don't work. I don't agree with them.

I've been consulting over a year (US-based, near NYC) and I've found plenty of very good clients (small and large) through freelancing websites.

Few loose guidelines I've used to help me with applying to gigs:

1) Evaluate if you think the person understands the value of the work, and only reply if you can somewhat-confidently answer "yes."

2) Reply to gigs that say "$5" or some other crazy low number, as long as they seem competent at explaining their project.

3) ALWAYS follow up with your past clients! Ask them for new work regularly.

18
fasouto 23 hours ago 1 reply      
Some people at HN will tell you the opposite, but I found two of my best clients on Upwork.

I didn't bid on low-quality jobs, and once I finished a job I offered them a maintenance contract outside Upwork.

19
Mandatum 20 hours ago 0 replies      
Depends on how I'm feeling. If I'm not looking for very interesting work or I'm saving for travel, I have a few large clients (5000+ employees) that always have projects going. They are the bread and butter of my contract work and I'm known across pretty much all of the IT senior management at those companies.

If I'm looking for more cutting edge, interesting work I'll go out and find either a company, industry or project I'm interested in and try and insert myself into it somehow. Usually through meetups, over coffee or in one case just showing up (probably wouldn't recommend that, depends on the people - in my case it was 4.30PM on a Friday and I brought beer).

Usually I'll either do it gratis (if it's non-profit or public domain) or cut my rates if I'm learning on-the-job.

When I started pretty much all of my job offers and contracts came by word of mouth. I only had to kick down doors a few times before I had developed a reputation as a good worker. This involved cold-emailing, calling and meeting people at various industry events.

20
coderKen 18 hours ago 1 reply      
Anyone currently looking for a remote front-end developer? I am a full-stack developer (tending towards front-end nowadays), I live in Lagos, Nigeria, and I'm looking for remote work. I have a strong JavaScript (Node.js, AngularJS) background with over 3 years' experience.

Portfolio: http://goo.gl/OmEpz8

Git: https://goo.gl/oYbi8F

some side projects I have done:

http://goo.gl/TGRSWg

http://goo.gl/kHcn5M

http://goo.gl/eUPozF

http://goo.gl/6orP0y

Have done more complex stuff but requires user to login.

21
bcks 23 hours ago 3 replies      
I've had good success hiring developers for short-term project work through https://gun.io.
22
gist 23 hours ago 0 replies      
I'd like a way, similar to the first-of-the-month feature (where employment possibilities are posted on HN), where you could post requirements for a software project and get responses from the Hacker News community (or at least links to relevant profiles or reputable hackers as suggestions).

Edit: I mean on HN similar to the first of month feature not a site (I know these are out there obviously).

23
mirap 23 hours ago 0 replies      
Does anyone have a good place to look for contract work in the field of UX & product design? I'm a UX designer currently living in Prague, looking for remote work (and I'm open to relocating). My portfolio: http://podorsky.cz/
24
WoodenChair 19 hours ago 0 replies      
One thing that I think is valuable when looking for contracting work (what I call consulting) is to learn how people who have been highly successful in consulting built their businesses. Check out episodes 4 (Marcus Zarra) and 5 (Michael Fellows) of Consult: http://consultpodcast.com

or

https://itunes.apple.com/us/podcast/consult/id1018251429?mt=...

25
dustingetz 1 day ago 2 replies      
HN who's hiring threads, exclusively

update: I post my pitch in the freelancer thread and potential clients contact me, for example https://news.ycombinator.com/item?id=9998249

26
gk1 17 hours ago 0 replies      
Wrote about this recently: http://www.gkogan.co/blog/how-i-learned-to-get-consulting-le...

The gist of it is, as many here are saying: don't use marketplace sites. Instead, show off your knowledge in a way that gets the attention of potential customers; then they'll come to you.

27
peacemaker 18 hours ago 0 replies      
I've done this by reaching out to friends and old work colleagues to see what they're up to and offering to help. Because it's people you know it is much easier to make arrangements you will both be happy with. After 15 years working in software that turns out to be quite a lot of people, especially if you take the time to regularly reach out to people via LinkedIn etc.
28
telecuda 20 hours ago 0 replies      
Tip: Have an Indeed.com resume verbose with your areas of expertise. Built a project using Parse.com or the Twitter API? Put that in there. As an employer, one of my more successful methods is to search for specific skill sets that a project may require, then reach out to a small handful of developers who hit on those searches with a pitch for why the new project is exciting.
29
victorantos 19 hours ago 0 replies      
If you are looking for frontend contracts, in particular - angularjs,

I would recommend http://AngJobs.com

disclaimer: I run AngJobs, https://github.com/victorantos/AngJobs

30
122333444555666 18 hours ago 1 reply      
A bit of a tangent, but some advice needed. So I've been contracting out a bit on Upwork and used all the behavioral hacks in the book: using "we", etc. It's worked amazingly for getting clients. Not bad at sales. I've got one client now, a hedge fund, that's being very stingy. We agree on a fixed price for a particular scope/milestone, the release is shipped, but they come back and say "this is great, but we need this one additional feature or this whole release is worthless." Usually I, I mean "we", oblige. But it's getting ridiculous. What do we do? Play hardball and say no shipment until payment? Or just ditch the client? The day rate is plummeting, mind you, closing in on free. Total contract size in the low XXks.
31
nnd 20 hours ago 0 replies      
I'm fairly new to consulting (been doing it for almost a year now). I'm on my second gig right now, and both of them are through Toptal. For the first one, a recruiter reached out to me with a gig, the second one I got thanks to an article I wrote in their blog.
32
lazyant 19 hours ago 0 replies      
Does anybody know of (good) sites for remote server (especially Linux) contract work (sysadmin/devops/optimization/security/reliability)? If there are none, is anybody interested in one?
33
JoeAltmaier 22 hours ago 0 replies      
My way was working at several successful startups, and then going into contracting. So I had contacts at every level of Silicon Valley. Might not work for everybody.
34
jameslk 21 hours ago 0 replies      
I've been contracting/consulting for a couple of years now. Most of my contracts have come through referrals (from friends) and sometimes recruiters. However, I was able to start my contracting career thanks to a contract that came through Toptal. This allowed me to quit my job and do this full-time.

Here's my list of resources that I would be looking at if I needed to start looking for a contract immediately:

Boards:

- Authentic Jobs: http://www.authenticjobs.com/

- StackOverflow Careers: http://careers.stackoverflow.com/jobs?type=contract&allowsre...

- We Work Remotely: https://weworkremotely.com/jobs/search?term=contract

- Angelist: https://angel.co/jobs

- Github Jobs: https://jobs.github.com/

- Hired: https://hired.com/contract-jobs

Networks:

- Toptal: https://www.toptal.com/ (I'm a member of Toptal's network)

- Gigster: https://www.trygigster.com/ (haven't used it yet)

- Crew: https://crew.co/ (haven't used it yet)

Offline ideas:

- Approach companies at Meetups

- Meetups, meetups, meetups

- Pitch on forums

- Work with contract agencies

- Become a subcontractor

It also helps to work on branding yourself, blogging, and integrating into communities (like HN!). Generally, just becoming an authority on a topic and allowing people to get to know you before they work with you helps a lot. Kind of like patio11 has done for himself around here. Then people start coming to you instead of the other way around.

I would also highly recommend looking at DevChat TV's Freelance podcasts for ideas, they're really great: https://devchat.tv/freelancers

35
qp9384btv_2e 21 hours ago 1 reply      
For those who do contract work, what is your policy on code-reuse between clients?
36
chilicuil 22 hours ago 2 replies      
Fiverr (https://www.fiverr.com/). Tasks usually take less than an hour and give me enough revenue to pay for domains and hosting for my pet projects.
37
awjr 23 hours ago 0 replies      
Which country? ;)
38
juliend2 21 hours ago 0 replies      
LinkedIn.

I send a LinkedIn message to some of the contacts I'd like to work with, telling them it's been a while and that I'd like to get in touch, and offer to grab a cup of coffee with them this week.

During the meeting, tell them about your freelance status and that you're looking for work.

Good luck!

Video games are essential for inventing artificial intelligence togelius.blogspot.com
220 points by togelius   ago   112 comments top 16
1
m_mueller 12 hours ago 10 replies      
Recently becoming a father has made me think a lot about general intelligence. Seeing my son get excited about his 'world state changing' gave me an idea. What if the main thing that holds us back is the reliance on cost functions? Human and, to some extent, animal intelligence is the only intelligence we know about. If that's what we want to emulate, why don't we try modelling emotions as the basic building blocks that drive the AI forward? Until now, the way I understand neural nets, we have basically modelled the neurons and given them something to do. My hunch is that brain chemistry is what actually drives us forward, so what if we model that as well? Instead of serotonin, endorphins, etc., we could also look at it at a higher level, akin to Pixar's Inside Out: joy, fear, sadness, disgust, anger, and I would add boredom.

Let's stay with video games for a bit. What if we treat joy as 'seeing the world change', graded by the degree of indirection from our inputs (the longer it cascades, the more joy it generates)? Maybe let it have a preference for certain color tones and sounds, because that's also how games give us hints about whether what we do is good or not. Boredom is what sets us on a timer: too many repetitions of the same thing and the AI gets bored. Fear and disgust come out of evolutionary processes, so it might be best to add a GA in there that couples success with some fear-like emotion. Anger, well, maybe wait with that ;-).

Edit: Oh, and for the love of god, please airgap the thing at all times...
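A minimal, purely hypothetical sketch of the commenter's joy/boredom framing (the class, the weights, and both formulas are inventions to make the idea concrete, not an established technique): a reward signal that pays out for reaching a never-seen world state and decays with repetitions of the same action.

```python
from collections import Counter

class EmotionReward:
    """Toy reward model: 'joy' for reaching a never-seen world state,
    'boredom' penalty that grows with repetitions of the same action."""

    def __init__(self, joy_weight=1.0, boredom_weight=0.5):
        self.seen_states = set()
        self.action_counts = Counter()
        self.joy_weight = joy_weight
        self.boredom_weight = boredom_weight

    def reward(self, action, new_state):
        # Joy: the world visibly changed into something new.
        joy = self.joy_weight if new_state not in self.seen_states else 0.0
        self.seen_states.add(new_state)
        # Boredom: repeating the same action gets less and less rewarding.
        self.action_counts[action] += 1
        boredom = self.boredom_weight * (self.action_counts[action] - 1)
        return joy - boredom

agent = EmotionReward()
print(agent.reward("press_button", "light_on"))  # 1.0: novel state, fresh action
print(agent.reward("press_button", "light_on"))  # -0.5: nothing new, and repetitive
```

A real system would of course need far richer state novelty measures than set membership, but even this toy shows how emotion-like terms can replace a hand-written task cost function.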

2
fitzwatermellow 14 hours ago 1 reply      
Great summary of current state of the art with links to interesting projects: GVG-AI, VGDL...

Videos games are also essential for AI pedagogy. Creating Pac-Man agents in Stanford's AI class is a great example. Most players can barely get a "strawberry" but to see a trained agent mimicking human expert level play is eye-opening.

Quick reminder: Global Game Jam 2016 starts Jan. 29 and NYU is hosting its annual jam!

http://gamecenter.nyu.edu/event/global-game-jam-2016/

3
venachescu 12 hours ago 5 replies      
This is a cute argument, but I think it falls into a trap of following its own thinking.

Video games are explicitly designed to test and fit within our bounds of conscious control and processing; particularly the retro games, but essentially all games in general have a very limited input control space (a couple keys or joysticks) and usually very rigorously defined action values. Moreover, these were designed by humans with very explicit successes, losses and easily distinguishable outcomes.

None of these descriptions fit the kind of control that an 'intelligent' system needs to handle. Biological systems do not have predefined goal values; they have very incomplete sensory information and, most importantly, control spaces that are absolutely enormous compared to anything considered in a video game. At any point in time the human body has ~40 degrees of freedom it is actively controlling, compared to ~5 in a serious video game.

I do not doubt that pattern recognition and machine learning techniques can be improved through these kind of competitions. But the problem is in conflating better pattern recognition with general intelligence; implying or assuming any sort of cost, value or goal function in the controlling algorithm hides much of our ignorance about our 'intelligent' behavior.

4
pgodzin 15 hours ago 4 replies      
Really interesting to think about the skills necessary just to play a modern open-world game such as Skyrim successfully.

NLP to understand dialogue and actions that need to be taken based on what NPCs/quests/item descriptions say, strategies for several different enemies with different strengths and weaknesses, exploring the open world in a logical order.

When you think about the difficulties of such a loosely defined problem, it's hard to buy into the real-world fears of AI.

5
Paul_S 14 hours ago 2 replies      
AI researchers are trying to make AI smarter. Game AI can already be easily written to win 100% of games but that's not the point. Gamedevs are trying to make AI more human-like. I'm not sure the two overlap.
6
anentropic 4 hours ago 0 replies      
"The most important thing for humanity to do right now is to invent true artificial intelligence (AI)"

bollocks

7
Houshalter 9 hours ago 0 replies      
I don't disagree that video games are a very useful benchmark to evaluate intelligence. But I don't think AGI will evolve from video games. I think that language understanding is the path to AGI.

Language is quite complex and can't easily be beaten by hard coded algorithms or simple statistics. You can do some tasks with those things, but others they will fail entirely. The closer you get to passing a true turing test, the harder the problem becomes. It certainly requires human intelligence, and most of our intelligence is deeply rooted in language.

He mentioned games like Skyrim and Civilization as being end goals. But even a human that doesn't speak English wouldn't be able to play those games. Let alone an alien that knew nothing about our world, or even our universe.

8
hanniabu 8 hours ago 0 replies      
I always had a feeling that the path to an intelligent system should be similar to that of Google's autocomplete algorithm.

On boot, all surrounding data would be taken in; this step gives everything context. All new data coming in would be processed (referenced against the original data to determine what is happening and what actions to take), then clustered, and then merged into the original data set, dropping data determined to be irrelevant and updating the context to give a more relevant perspective on the new data coming in. (And loop.)

9
tlarkworthy 8 hours ago 0 replies      
I agree. No free lunch implies there is no general algorithm for solving random problems from the set of all problems. So what's the practical subset of problems that is useful in the real world? Fingers crossed, we already encoded the useful problems in the different game genres we developed. E.g., RTS pushes the planning vs. reaction dilemma, RPGs test verbal inference and morality, puzzles test logic, etc. We already digitized a large class of problems we care about for the real world in games!
10
sriku 8 hours ago 0 replies      
> In order to build a complete artificial intelligence we therefore need to build a system that takes actions in some kind of environment.

This.

"Made up minds:a constructivist approach to artificial intelligence" by Gary Drescher presents a small scale virtual world with a robot embedded in it that figures out the laws of its world by interacting with it, much like what a child does. Need more people thinking like this.

11
proc0 14 hours ago 1 reply      
Very interesting read, and I always knew about this being an avid gamer since as far as I can remember. It always intrigued me how a computer can play against a human, and as games got more sophisticated, interacting with AI's got more and more human-like.

Aside from using them as benchmarks, the way games are capable of simulating a world will probably be key in creating a true AGI. In the comment section of the article, we're already seeing some theories that involve video games not just as tests, but as a primary component of the intelligence architecture. Very exciting times!

12
jarboot 14 hours ago 0 replies      
The game "Yavalath" [1] in the article looks really neat: A simple little game with only two rules which never really ends in a draw, unlike tic tac toe.

[1] http://cameronius.com/games/yavalath/

13
dschiptsov 14 hours ago 0 replies      
I think it is emotions. Teach the car that cracks in the tarmac affect its fitness negatively, and it will drive better, avoiding them or passing them with caution in the long run.

At least motorcycle drivers who care are better drivers.

14
bitwize 15 hours ago 8 replies      
I don't think building an AI is the most important task on our plate. We still have those disease, hunger, poverty, and war problems to contend with. If building an AI helps us solve those, then sure, let's build the AI. But I don't think strong AI is necessary to gain traction on the problems that confront the sapient beings we already have around.
15
mankash666 13 hours ago 1 reply      
"The most important thing for humanity to do right now is to invent true artificial intelligence"

Maybe the article makes some valid scientific points, but I simply cannot get past this unscientific opening claim to a purportedly scientific article. Not just me - no peer-reviewed journal will accept such frivolity. Passing on the article and hoping for better scientific writing in the future!

16
purpled_haze 15 hours ago 4 replies      
> Video games are essential for inventing artificial intelligence

And here's why they aren't: First-person Shooters.

Why give an AI a goal that involves killing things that look like humans or animals for points? That's a recipe for disaster.

Breakout's not much better either. How often do you need to break a wall to smithereens with a ball? Never.

A 'Brief' History of Neural Nets and Deep Learning andreykurenkov.com
243 points by andreyk   ago   20 comments top 7
1
dahart 3 days ago 3 replies      
Excellent article, thanks for the contribution.

Perhaps some independent validation, but I was coincidentally having this conversation the other day with a relatively well known computer vision researcher, about why it seems like the idea of neural nets has floundered for decades and suddenly it's the hot topic, and we're seeing massively improved results.

His answers, summarized, are that:

1- Big data is making possible the kind of training we could never do before.

2- Having big data & big compute has made some training breakthroughs that allowed the depth to increase dramatically. The number of layers was implicitly limited until recently because anything deep couldn't be practically trained.

3- The activation function has very commonly in the past been an S-curve, and some of the newer better results are using a linear function that is clamped on the low end at zero, but not clamped on top.
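The contrast in point 3 - the classic sigmoid S-curve versus the newer clamped-linear unit (commonly called ReLU) - can be sketched in a few lines:

```python
import math

def sigmoid(x):
    """Classic S-curve activation: squashes any input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit: clamped at zero on the low end, linear on top."""
    return max(0.0, x)

# The sigmoid saturates at both ends, so its gradient vanishes for large |x|;
# ReLU keeps a constant slope of 1 for all positive inputs, which is one
# reason deep stacks of layers became practical to train.
for x in [-4.0, -1.0, 0.0, 1.0, 4.0]:
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.3f}  relu={relu(x):.1f}")
```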

All really interesting to me. This is making me want to implement and play with neural nets!

Of course, now the big question: if we have a neural net big enough, and it works, can we simulate a human brain? (Apparently, according to my AI researcher friend, we're not there yet with the foundational building blocks. He mentioned researchers have tried simulating a life-form known to have a small number of neurons, like a thousand, and they can't get it to work yet.)

2
mturmon 3 days ago 2 replies      
Upvoted -- what a nice and detailed history.

Thanks for linking to the old NYT article on Frank Rosenblatt's work. One can see how researchers of the time were irked by delirious press releases when the credit-assignment problem for multilayer nets had not been addressed.

(We managed to mostly address the credit-assignment problem for multilayer nets...but the delirious press release problem remains unsolved.)

Incidentally, it's "Seymour Papert", not "Paper" (appears twice).

3
FreedomToCreate 3 days ago 0 replies      
Great read. It's incredible what we, as a species, have been able to achieve in the last two centuries. I feel like we are where scientists were with computers in the 1950s: we are starting to see the big picture, but its wide application is still decades away.
4
ausjke 3 days ago 0 replies      
Great article. I took a neural network course during my graduate study a long time ago, and it might be time to resume that subject. NN training back then took too long to be practical for real use; now it should be much faster.
5
lovelearning 3 days ago 0 replies      
Author's done an excellent job of explaining what the problems were at every stage and how NNs evolved to solve them. Learnt a lot from this series.
6
BasDirks 3 days ago 0 replies      
This is actually a very important chapter in human development. And it will pass in the blink of an eye. Works like these are good, important.
7
signa11 3 days ago 0 replies      
just to re-iterate, the reference text #13, i.e. Parallel Distributed Processing (vols. 1 & 2), is also an excellent introduction to the field, from its infancy.

It contains a collection of papers by NN luminaries including Rumelhart, Hinton, etc. Very highly recommended.

Bitcoin Is Dead, Long Live Bitcoin avc.com
221 points by gist   ago   192 comments top 22
1
AndrewKemendo 3 days ago 12 replies      
Back in 2009 when I was getting into bitcoin, it was widely described as a "stateless currency" and not just an alternative to Visa or credit cards. Having a background in economics, bitcoin looked to me like a truly revolutionary technology and if it was adopted en masse would change politics forever.

The problem, however, is that a nation's currency is arguably the primary source of its sovereignty. Whoever controls the currency primarily used in a nation controls the nation. So it's safe to say that no nation is going to let bitcoin or any other currency take hold for widespread commercial use in lieu of its sovereign currency - that's a hallmark of a failed state.

In fact that's why it was such a big deal when regulators declared bitcoin a commodity. The US doesn't really care because we haven't had commodity money since the 1970s, but the idea is that if enough people decided they wanted commodity money, they would arguably have a medium with which to do it that wasn't tied to any state.

So as with every product, unless a massive number of people decide to stop using their home currency and start using bitcoin, then bitcoin will "fail."

Whether it pivots to some kind of credit system or not is largely immaterial because of the potential it had.

It's like if we created Artificial General Intelligence and it decided to just write movie reviews forever.

2
mabbo 3 days ago 7 replies      
The problem with bitcoin, as I see it, is that discussion of problems that may exist can lower the value of bitcoin- whether the problem is real or not. If you're heavily invested emotionally with bitcoins, you need to discuss these problems so that they can be solved. If you're heavily invested financially, you need to make sure no one talks about these problems, at least until you have a plan for how you're not going to lose your fortune.

Those invested financially have very good reasons to believe in solutions that will retain their investment, and dismiss solutions that put them at risk.

3
Animats 3 days ago 0 replies      
The article asks "is Bitcoin gold" or "is Bitcoin Visa".

Bitcoin's first killer app was drugs. Then Silk Road I and Silk Road II were taken down. That put a dent in the price.

Currently, Bitcoin is a way to get yuan out of China and convert it to dollars or euros, despite China's exchange controls. Most of the mining and the exchange volume are in China. Buying Bitcoin with yuan and selling it outside China is technically prohibited by the People's Bank of China, but they haven't cracked down hard on it. Yet. Mining is also a way to convert yuan to dollars. Miners in China can also qualify for the loans and subsidies the government of China gives businesses.

Bitcoin as a general currency for transactions just isn't happening. The real transaction costs are too high. There's volatility risk. There are exchange costs getting in, and exchange costs getting out. (At the retail level, those are high; Robocoin ATMs have a buy/sell spread of about 15%) There's also the substantial risk that the exchange will fail or steal your money. (This got better in 2015, but until then, more than half of Bitcoin exchanges went bust. It wasn't just Mt. Gox.) Paying 1% - 3% to Visa looks good compared to that, especially since buyers get protection against merchant fraud.

Bitcoin as a speculation looks good some years, and bad in others. It's like a penny stock, except that Bitcoin is zero-sum. There's no intrinsic value, and no fundamentals. It's all greater-fool speculation.

The impressive thing about Bitcoin is that the system is well behaved in the presence of a very high level of criminality. Few, if any, other distributed computer based systems can make that claim. It would be nice if DNS or BGP or SS7 worked that well.

4
Anonobread 3 days ago 2 replies      
Block size isn't a free parameter. The team behind btcd, a Bitcoin full node written in Go, made the following observations about block size [1]:

1. a 32 MB block, when filled with simple P2PKH transactions, can hold approximately 167,000 transactions, which, assuming a block is mined every 10 minutes, translates to approximately 270 tps

2. a single machine acting as a full node takes approximately 10 minutes to verify and process a 32 MB block, meaning that a 32 MB block size is near the maximum one could expect to handle with 1 machine acting as a full node

3. a CPU profile of the time spent processing a 32 MB block by a full node is dominated by ECDSA signature verification, meaning that with the current infrastructure and computer hardware, scaling above 300 tps would require a clustered full node where ECDSA signature checking is load balanced across multiple machines.

For context, a meager 300 tps is less than 10% of what VISA does - hence this solves no long-standing problems in Bitcoin, yet it condemns all the nodes to run in compute clusters in datacenters. Naturally, "small blockists" as we're called, point out that this isn't how Bitcoin works today at all. Forcing nodes into compute clusters in remote datacenters is a major, sweeping departure from nodes running on home networks, with consequences both foreseeable and unforeseeable.

[1]: https://blog.conformal.com/btcsim-simulating-the-rise-of-bit...
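The ~270 tps figure in point 1 can be reproduced with back-of-the-envelope arithmetic (the 200-byte P2PKH transaction size is an assumed round number for illustration, not a figure taken from the btcd post):

```python
# Throughput implied by a 32 MB block filled with simple P2PKH transactions.
BLOCK_SIZE_BYTES = 32 * 1024 * 1024   # 32 MB block
TX_SIZE_BYTES = 200                   # assumed average P2PKH transaction size
BLOCK_INTERVAL_SECONDS = 10 * 60      # one block every ~10 minutes

txs_per_block = BLOCK_SIZE_BYTES // TX_SIZE_BYTES
tps = txs_per_block / BLOCK_INTERVAL_SECONDS
print(f"{txs_per_block} transactions per block, ~{tps:.0f} tps")
```

This lands close to the quoted ~167,000 transactions and ~270 tps, and makes clear the number is just block capacity divided by the block interval.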

5
pcmaffey 3 days ago 6 replies      
1. I have no idea how to get Bitcoin. A quick search takes me to several exchanges that look like torrent sites for pirating shit.

2. Nor has anyone presented me a compelling reason to get bitcoin (other than the future). Off the top of my head, if a company offered me an account that would take care of my online transactions in a secure way using Bitcoin, I might be interested. Maybe I'd put a few hundred in it each month and use it for random expenditures, and hopefully get access to some cool micro-services.

But I have zero awareness anything like that exists.

3. If it's to succeed as capital, it will take a daring bank creating its own bitcoin equivalent... followed by one of the Nordic countries making its own national bitcoin. Then the floodgates will open.

6
kang 3 days ago 0 replies      
It is a traditional view that in order to increase throughput we need to increase the bandwidth. This is true but we also know that throughput can be increased by abstracting the existing bandwidth to hold more information.

Increasing the block size will increase throughput, but is it the only way? Definitely not. The very recent improvement by Dr. Peter Wuille to segregate the witness (the cryptography used to validate transactions) from the data (the transactions) increases capacity by around 66%, even if the block's max size is unchanged. This is already in the testing phase. There is a whole bitcoin scaling roadmap and a lot of work going on.[0]

Practically, the death scenario painted by XT hasn't materialised, and transactions are still going through normally.[1]

The bitcoin developer community comprises very capable hackers, many of whom are not only PhDs but have also invented new cryptographic techniques and write robust, security-conscious code.[2][3]

[0] https://bitcoin.org/en/bitcoin-core/capacity-increases-faq
[1] http://statoshi.info/dashboard/db/transactions?panelId=4&ful...
[2] https://people.xiph.org/~greg/confidential_values.txt
[3] https://blockstream.com/2015/08/24/treesignatures/

7
manyoso 4 days ago 7 replies      
This is interesting to me because it presents the block size dispute in a way that seems much more sympathetic to the "no, we shouldn't increase the block size" crowd than Mike Hearn's essay.

Is it really true that the people objecting to increasing the block size see Bitcoin turning into a reserve "currency" to hold wealth instead of a liquid currency used to make real-time transactions?

Is that really the debate? Because I didn't get that from Mike Hearn's essay at all...

8
phaet0n 3 days ago 0 replies      
Bitcoin is simply going through a discontinuity brought on by a fixed blocksize. Personally, I feel uncertainty is the real issue. The "community" should either accept this, or opt for a potentially unlimited ceiling (limited by physics and network size and topology).

If you choose to remain with the fixed blocksize, then you're betting the system will reach another equilibrium (which may be total collapse). This equilibrium will involve a natural evolution in the pricing of transactions.

Either way, at some other point in the future another source of discontinuity will be the circulation limit. Again, which may or may not kill the system.

Bitcoin is simply evolving, as it necessarily needs to.

It's definitely interesting to watch as an outsider.

9
brudgers 3 days ago 0 replies      
Even when Bitcoin was riding high, Wilson was investing in something he believed was likely to fail. It's what he does: invest in startups, I mean.

I don't see any reason to believe that his position on Bitcoin should be read in any other terms than his investment in Bitcoin companies is not appreciably worse or more risky than other companies in his fund. That his personal asset portfolio holds less Bitcoin than wine illustrates the distinction he's making.

His message is for people invested in Bitcoin companies. It is don't panic, these companies were always risky.

10
compcoin 3 days ago 0 replies      
Bitcoin will one day be known as 'the granddaddy of them all', just like the Rose Bowl. Bitcoin suffers from centralization, the very thing it was designed to circumvent. The Bitcoin cultists are emotionally wrapped up in the idea that it has to be everything for everyone. Money has a way of making us emotional. Bitcoin has to be a payment machine for payment processors, a store of wealth for investors, and a currency for libertarians. It is so caught up in trying to be everything that it does none of them well. Bitcoin developers, don't be afraid to go your own way. They should branch out into new areas and create new coins, applications and opportunities. Why does it have to be Bitcoin or bust? Proponents attack new ideas. They wake up at night in terror that something new might emerge that meets needs in better ways. What would the stock market be like if there were but one issue to trade? It is no wonder that Bitcoin companies struggle to make money. The market is being restrained from growing by those that seek to control the golden goose and make certain that there are no other options. Yes, new things are coming, and new financial markets will come into being whether Bitcoin changes its protocol or not. Centralization leads to waste, fraud and abuse, even with the best of intentions.
11
tomasien 3 days ago 0 replies      
Love the Bitcoin/VISA analogy. Wrote about it here: https://medium.com/shekel-magazine/odd-bedfellows-the-strang...

Can 100% understand why some people may hate that future for Bitcoin, but I think the parallels are serious and it's probably what will happen.

12
maaku 3 days ago 1 reply      
> At the core of the debate is whether the Bitcoin blockchain should be a settlement layer that supports a number of new blockchains that can be scaled to achieve various goals or whether the Bitcoin blockchain itself should evolve in a way that it can scale to achieve those various goals.

No, among the developers actually working on Bitcoin that is not what the debate is about at all.

Bitcoin is a decentralized ledger, and indeed it can be argued that this is the only property about Bitcoin which is interesting/useful. Why? Because all properties we care about (availability, uncensorability, unseizability, etc.) derive from decentralization[0]. And at the end of the day we can do everything Bitcoin does faster, better, and cheaper on some alternative consensus system (see: Stellar, Open-Transactions, Liquid) that does not have this decentralization property. Decentralization is expensive. It requires a dynamic membership, multi-party block signing algorithm, which at the moment means proof of work. And proof of work costs hundreds of millions of dollars per year to maintain, and throttles the available bandwidth due to the adversarial assumption and the existence of selfish mining[1].

The question is not whether Bitcoin should be a store of value or a medium of exchange. That implies we have some choice in the matter. The question is what level of on-chain utility does Bitcoin actually support under untrusted, adversarial conditions, without losing all properties derived from decentralization. This is an empirical question. The available bandwidth is something that can be determined from the performance of the code in the real world extrapolated to various adversarial simulations.

We had two Scaling Bitcoin workshops last year that gave us a data-driven answer: 3-4MB per block, tops. There are potentially ways that this number can be improved (see: weak blocks), and those are being worked on but are still some time from showing results. There are also some assumptions underlying this number, e.g. that we change the validation cost metric, which none of the existing proposals do in a smart way. But the scientific process is telling us right now that with the tools available to us we can increase the worst-case block size to 3-4MB with a better metric without the decentralization story becoming unacceptably worse off.

That is the plan of Bitcoin Core. The deployment of segregated witness will allow up to 2MB blocks under typical conditions, and 3-4MB under worst-case adversarial conditions. It will exhaust the available capacity for growth in the Bitcoin network at this time. Meanwhile, work progresses on IBLT, weak blocks, Bitcoin-NG, fraud proofs and probabilistic validation, and other related technologies that might provide an answer for the next increase a year or two later. I'm hopeful we may even be able to get an order of magnitude improvement from that one, but we'll see.

No one I'm aware of is pushing for smaller blocks because Bitcoin should be a store of value and a settlement layer. If I had magic pixie dust I'd want 1GB blocks and everything on-chain too. But we live in the real world and are stuck in a situation where Bitcoin loses all of its unique properties if we scale much further than where we are at now. And so we must ask the question: what will Bitcoin become, since it can't scale on-chain? How can we live with that outcome? The idea of a settlement layer and off-chain but trustless payment networks like Lightning naturally arise from that thinking. The Lightning Network[2] is a way that we can have our cake and eat it too: Bitcoin remains small and decentralized, but everyone still has access to bitcoin payments. Lightning can potentially scale to global usage with a small-block chain as the settlement layer.

[0]: http://bluematt.bitcoin.ninja/2015/01/14/decentralization/

[1]: http://hackingdistributed.com/2013/11/04/bitcoin-is-broken/

[2]: http://lightning.network/

13
magicmu 3 days ago 3 replies      
It seems like the potential fork in bitcoin is already happening to some degree with Elements and the sidechain (https://github.com/ElementsProject/elements) -- a super interesting project that I just found out about yesterday! I can't help but feel, to a certain degree, that forking bitcoin is re-arranging deck chairs on the Titanic, though. There already seem to be more viable alternative cryptocurrencies.
14
gtrubetskoy 3 days ago 1 reply      
Correct me if I'm wrong, but if this was a stock or some other security, and I somehow had influence over how it is managed thus what I said could affect its value, declaring it "dead" (or soon to be worthless) AND holding interest in it (long or short) - would be criminal in most countries.
15
nikolay 3 days ago 0 replies      
The big drawback to the consumer is the speculative price. Of course, this attracts the scum of the Earth, but that's another story. Bitcoin could be saved by applying the concept of a Currency Board [0]. All speculators would leave and only the serious players would stay, and then regular folk could jump in. Of course, you can't make a million dollars from a single bitcoin you bought at $400, but you won't make it anyway if things keep going the way they've been going! Well, I know you can't literally apply the concept to Bitcoin, but I just wanted to give some food for thought.

[0] https://en.wikipedia.org/wiki/Currency_board

16
markbnj 3 days ago 1 reply      
I have never understood the bitcoin protocol very well, and I did not have time to read the entire posted article and its linked supporting pieces. That said, I did pick up that one of the major issues is that a few miners in China control a huge portion of the hash power and have an incentive not to allow growth. Given that computing power is so affordable, is it really not possible for some competing group to decide to save the protocol by creating some large resource pools? In other words, can miners not be evicted once in control?
17
brighton36 3 days ago 1 reply      
The actual story here is that the attempt to fork Bitcoin (Bitcoin XT) is dead. Hearn tried to do so, failed, and left. Anyone who doesn't run XT should be celebrating their victory.
18
pbreit 3 days ago 0 replies      
I'm worried (about myself) that I easily shift between these 2 camps: this thing is outrageously ingenious and this thing is outrageously preposterous.
19
anon4this1 3 days ago 0 replies      
If Chinese miners are indeed trying to limit the block size in order to increase transaction fees, then the necessary chain of events is: 1) faith in bitcoin as a currency/payment system drops, causing a price crash; 2) miners realise that the marginal increase in money made via transaction fees is massively outweighed by what they have lost in block rewards by limiting the growth of the system; 3) the miners get their act together and everyone upgrades to a new whizzy high-capacity bitcoin.

This should continue to work until the block reward becomes negligible around 2035.
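The subsidy schedule behind that 2035 estimate can be sketched directly from the protocol constants (the 2009 start year and the ~10-minute block cadence are approximations, so the per-year heights are rough):

```python
# Bitcoin's block subsidy halves every 210,000 blocks, starting at 50 BTC.
HALVING_INTERVAL = 210_000   # blocks between halvings (protocol constant)
BLOCKS_PER_YEAR = 52_560     # ~10-minute blocks: 6 * 24 * 365
INITIAL_SUBSIDY = 50.0       # BTC per block at genesis

def subsidy_at_height(height: int) -> float:
    """Block subsidy in BTC at a given block height."""
    return INITIAL_SUBSIDY / (2 ** (height // HALVING_INTERVAL))

for year in (2016, 2025, 2035):
    height = (year - 2009) * BLOCKS_PER_YEAR  # approximate height that year
    print(f"{year}: ~{subsidy_at_height(height)} BTC per block")
```

By this rough schedule the subsidy is already below 1 BTC per block in the mid-2030s, which is the sense in which the reward becomes negligible relative to fees.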

20
nikolay 3 days ago 1 reply      
Bitcoin is long dead; the blockchain - not so much, but it won't survive in its present form as it's highly inefficient.
21
RealityVoid 3 days ago 0 replies      
Yeah, well, that's just like... your opinion, man.
22
glossyscr 3 days ago 0 replies      
> I read his entire post a couple times.

because some blockchain investments might be in danger

       cached 19 January 2016 16:11:02 GMT