hacker news with inline top comments    21 Jan 2016    Best
Top Books on Amazon Based on Links in Hacker News Comments ramiro.org
1039 points by gkst   ago   179 comments top 50
_lpa_ 2 days ago 11 replies      
I did something pretty similar over Christmas, though I used named entity recognition to extract book titles rather than looking for Amazon links, and (so far) I also limited it to specific "Ask HN" threads about books. You can find it here: http://www.hnreads.com/. It is interesting to see how little overlap there is between the two, though that may be due to my using far fewer (and also newer) threads!
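For contrast with the NER approach, here is a minimal sketch (not from the comment; the regex and sample data are illustrative) of the Amazon-link approach the article itself takes:

```python
import re
from collections import Counter

# Amazon product URLs embed a 10-character ASIN after /dp/ or /gp/product/.
ASIN_RE = re.compile(r'amazon\.com/(?:[^/\s"]+/)?(?:dp|gp/product)/([A-Z0-9]{10})')

def count_asins(comments):
    """Count how many comments link to each ASIN."""
    counts = Counter()
    for text in comments:
        for asin in set(ASIN_RE.findall(text)):  # dedupe within one comment
            counts[asin] += 1
    return counts

comments = [
    'Read http://www.amazon.com/dp/0735619670 (Code Complete)',
    'Also http://www.amazon.com/Code-Complete-Handbook/dp/0735619670',
]
print(count_asins(comments).most_common(1))  # [('0735619670', 2)]
```

The link-based method only sees books people bothered to link, which likely explains much of the non-overlap with an NER-based list.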
SloopJon 2 days ago 2 replies      
Here's a discussion of the original upload of Hacker News data to Google BigQuery:


At 4 GB, I'd just as soon query this locally, but this looks like a fun exercise.

I notice that there were 10,729 distinct ASINs out of 15,583 Amazon links in 8,399,417 comments. Since I don't generally (ever?) post Amazon links, I'd be interested in expanding on this in two ways.

First, I'd reduce/eliminate the weight of repeated links to the same book by the same commenter.

Second, I'd search for references to the linked books that aren't Amazon links. Someone links to Code Complete? Add it to the list. In a second pass, increment its count every time you see "Code Complete," whether it's in a link or not.
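The two refinements described above could be sketched in Python like this (the tuple shape and sample data are made up for illustration):

```python
from collections import Counter, defaultdict

def count_books(comments):
    """comments: iterable of (commenter, text, linked_titles) tuples.

    Pass 1: count each linked title at most once per commenter.
    Pass 2: add every plain-text mention of an already-linked title.
    """
    linked_by = defaultdict(set)  # title -> commenters who linked it
    for commenter, _text, titles in comments:
        for title in titles:
            linked_by[title].add(commenter)
    counts = Counter({title: len(who) for title, who in linked_by.items()})

    for _commenter, text, titles in comments:
        for title in linked_by:
            if title in text and title not in titles:
                counts[title] += 1
    return counts

data = [
    ("alice", "Loved Code Complete", ["Code Complete"]),
    ("alice", "Code Complete, again", ["Code Complete"]),  # repeat link, counted once
    ("bob", "Code Complete changed how I review code", []),  # bare mention
]
print(count_books(data))  # Counter({'Code Complete': 2})
```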

niuzeta 2 days ago 1 reply      
The absence of SICP, I imagine, is because when people refer to SICP, they usually just link to the freely available version of the book: https://mitpress.mit.edu/sicp/ .
meadori 2 days ago 8 replies      
Having owned and read through "Introduction to Algorithms" for years, I agree that it is a good book. However, recently I have been feeling like it is recommended way too often, without much thought.

It is not the best when it comes to explaining things in an intuitive manner. It is a great reference book with lots of algorithms and proofs.

In recent years I have been drawn more towards Levitin's "Introduction to the Design and Analysis of Algorithms".

Anyone else have similar feelings about "Introduction to Algorithms"?

dankohn1 2 days ago 1 reply      
Here is Matt Yglesias's (author of the #1 book) tweet on the analysis:

https://twitter.com/mattyglesias/status/689169613779808257

"The only book ranking that matters"

a_bonobo 2 days ago 1 reply      
How come "Darwin's Theorem" appears so often? It's quite unknown, with one review on Goodreads and 4 reviews on Amazon.

Is this a result of the author spamming his own work?

Edit: Looks like it; a quick search for "darwin's theorem site:news.ycombinator.com" shows that all links are from user tjradcliffe, who is the author. A case for manual curation of the data.

mattip 2 days ago 0 replies      
Out of 8 million data points, the top book got around 50 references. I wonder how much significance should be attached to that; it looks to me to be down at the noise level.
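As a back-of-envelope check (not from the comment): if mentions of a given book are roughly Poisson-distributed, a count near 50 carries a fairly wide relative uncertainty:

```python
import math

k = 50  # approximate mentions of the top-ranked book
half_width = 1.96 * math.sqrt(k)  # ~95% interval for a Poisson count
print(f"{k} +/- {half_width:.1f} mentions, about {100 * half_width / k:.0f}% relative")
```

With neighboring ranks only a handful of mentions apart, many positions on the list would indeed be within each other's noise.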
jacko0 2 days ago 5 replies      
"Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold. The best book I've ever read.
DanielBMarkham 2 days ago 1 reply      
Related: There are a ton of sites set up like this. Hopefully somebody will post a list. Lotta work by HN folks on various ways of slicing and dicing the data.

I wrote this curated site from HN several years ago. Got tired of people continuously asking for book recommendations. http://www.hn-books.com/

A couple points of note. The site 1) is an example of a static site, 2) has a terrible UI, 3) contains live searches to comments on each book from all the major hacking sites, and 4) can record a list of books that you can then share as a link, like so (which was my reason for making the site):

"My favorite programming books? Here they are: http://www.hn-books.com#B0=138&B1=15&B2=118&B3=20&B4=16&B5=1... "

I started writing reviews each month on the books, but because they were all awesome books, I got tired of so many superlatives!

Thanks for the site.

spinchange 1 day ago 0 replies      
Might as well add a link to a book I learned about on HN but never seem to run across since, UNIX and Linux System Administration Handbook, 4th Edition: http://www.amazon.com/Linux-System-Administration-Handbook-E...
willyyr 2 days ago 0 replies      
There is a similar site, posted recently, that didn't make it to the front page. I think he is using the API though. https://news.ycombinator.com/item?id=10808014
greesil 2 days ago 2 replies      
Check out the review distribution of "Rent Is Too Damn High"


It's the most polarized I've ever seen in my life.

nextos 2 days ago 2 replies      
Is it possible that some books have been missed due to acronyms employed in comments?


- SICP: Structure and Interpretation of Computer Programs

- CTM: Concepts, Techniques, and Models of Computer Programming

- TAOP: The Art of Prolog

anc84 2 days ago 5 replies      
Please share how much the affiliate tag generates.
fhoffa 1 day ago 0 replies      

On https://reddit.com/r/bigquery, /u/omicron_n2 left queries to repeat the experiment on HN and on reddit comments too:

- https://reddit.com/r/bigquery/comments/41py1v/top_30_books_o...

And a presentation by /u/Pentium10 on the same topic, using the books that redditors read:

- http://www.slideshare.net/martonkodok/complex-realtime-event...

msutherl 2 days ago 1 reply      
I maintain a list of HN hacks here: https://www.are.na/morgan-sutherland/hacker-news. I've seen a couple other book projects over the years including: http://hn-books.com/ and http://hackershelf.com/browse/.
myth_buster 2 days ago 3 replies      
I believe people would just write the name of really popular books like TAOCP, Hackers, Founders at Work, etc., rather than linking to them.

The list:

- "The Rent Is Too Damn High: What To Do About It, And Why It Matters More Than You Think" by Matthew Yglesias (Simon & Schuster)
- "The Four Steps to the Epiphany: Successful Strategies for Products that Win" by Steven Gary Blank (Cafepress.com)
- "Introduction to Algorithms, 3rd Edition" by Thomas H. Cormen (The MIT Press)
- "Influence: The Psychology of Persuasion, Revised Edition" by Robert B. Cialdini (Harper Business)
- "Peopleware: Productive Projects and Teams (Second Edition)" by Tom DeMarco (Dorset House Publishing Company, Incorporated)
- "Code: The Hidden Language of Computer Hardware and Software" by Charles Petzold (Microsoft Press)
- "Working Effectively with Legacy Code" by Michael Feathers (Prentice Hall)
- "Three Felonies A Day: How the Feds Target the Innocent" by Harvey Silverglate (Encounter Books)
- "JavaScript: The Good Parts" by Douglas Crockford (O'Reilly Media)
- "The Little Schemer - 4th Edition" by Daniel P. Friedman (The MIT Press)
- "The E-Myth Revisited: Why Most Small Businesses Don't Work and What to Do About It" by Michael E. Gerber (HarperCollins)
- "Feeling Good: The New Mood Therapy" by David D. Burns (Harper)
- "Programming Collective Intelligence: Building Smart Web 2.0 Applications" by Toby Segaran (O'Reilly Media)
- "The Non-Designer's Design Book (3rd Edition)" by Robin Williams (Peachpit Press)
- "The C Programming Language" by Brian W. Kernighan (Prentice Hall)
- "The Design of Everyday Things" by Donald A. Norman (Basic Books)
- "Cracking the Coding Interview: 150 Programming Questions and Solutions" by Gayle Laakmann McDowell (CareerCup)
- "What Intelligence Tests Miss: The Psychology of Rational Thought" by Keith E. Stanovich (Yale University Press)
- "On Writing Well, 30th Anniversary Edition: The Classic Guide to Writing Nonfiction" by William Zinsser (Harper Perennial)
- "Darwin's Theorem" by TJ Radcliffe (Siduri Press)
- "Knowing and Teaching Elementary Mathematics: Teachers' Understanding of Fundamental Mathematics in China and the United States (Studies in Mathematical Thinking and Learning Series)" by Liping Ma (Routledge)
- "Don't Make Me Think: A Common Sense Approach to Web Usability, 2nd Edition" by Steve Krug (New Riders)
- "Expert C Programming: Deep C Secrets" by Peter van der Linden (Prentice Hall)
- "Clean Code: A Handbook of Agile Software Craftsmanship" by Robert C. Martin (Prentice Hall)
- "The Elements of Computing Systems: Building a Modern Computer from First Principles" by Noam Nisan (The MIT Press)
- "Code Complete: A Practical Handbook of Software Construction, Second Edition" by Steve McConnell (Microsoft Press)
- "The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger" by Marc Levinson (Princeton University Press)
- "Software Estimation: Demystifying the Black Art (Developer Best Practices)" by Steve McConnell (Microsoft Press)
- "Refactoring: Improving the Design of Existing Code" by Martin Fowler (Addison-Wesley Professional)
- "Design for Hackers: Reverse Engineering Beauty" by David Kadavy (Wiley)

nefitty 2 days ago 0 replies      
Hard to read on mobile. Couldn't get past the first few. It is annoying to have to click a tiny thumbnail to read a bad, extracted synopsis from Amazon.
corysama 2 days ago 3 replies      
Interesting to see Influence so high, but Predictably Irrational not listed at all. I've heard Influence is a really great book, but from a quick skim it seems like Predictably Irrational covers the subject matter at least as well, if not better. I'd be happy to hear the opinion of someone who has actually read both.
q-base 1 day ago 0 replies      
Has anyone read the #1 book (Rent Is Too Damn High) who might want to add a few comments about it? What it describes, suggests, etc.? I'd never heard of it before.
beefsack 2 days ago 0 replies      
I wonder how many books would be on the list if it were somehow easy to extract mentions by name instead of by link. Mythical Man Month is mentioned regularly here and I don't think it's linked very often because of how well known it is.
noobie 2 days ago 0 replies      
Sad I couldn't find any of the non-technical books on Audible. Any audiobook "readers" out there?
Ocerge 2 days ago 1 reply      
Oh god, that algorithms book. Flashbacks of memorizing red-black trees in college are coming back to me.
pentium10 1 day ago 0 replies      
In 2015, at the Crunch Practical Bigdata Conference in Budapest, I showcased what books some subreddit communities talk about: startup, entrepreneur, and productivity reads. Slides are available here: http://www.slideshare.net/martonkodok/complex-realtime-event...
bodecker 2 days ago 0 replies      
Suggestion: enable arrow keys to allow for easier scrolling through the books
arnold_palmur 1 day ago 0 replies      
How is A Guide to the Good Life not on here? I feel like I see it mentioned at least once a day.
Sealy 2 days ago 1 reply      
Interesting list. I clicked on the top book and Amazon peer reviews gave it 2.5 stars out of 5 with 450+ reviews.

I admire the effort, but calling it Top Books is slightly misleading. Perhaps you could call it "most mentioned books".

timdorr 2 days ago 1 reply      
It would be nice to link to the comments where the books were referenced.
abstractalgebra 2 days ago 0 replies      
In most such lists there's a distinct lack of math books even though there are tons of great math books specifically written for programmers and compsci people.
clarkmoody 2 days ago 0 replies      
Easiest way to make it to the top of Hacker News: Hacker News meta posts.

Always interesting to read. But just as interesting is how quickly they pop to the top of the home page.

joshmaher 1 day ago 0 replies      
Does that include comments on this article about books being read from links in the comments?

Here's one on understanding the mindset of your investors when raising startup capital - Startup Wealth - http://amzn.to/1Jej8El

smartial_arts 2 days ago 2 replies      
Is this some sort of a promo trap? When clicking on book links I get taken to pages like this one http://www.freebie.guru/au/starwars/starwars625.html
andy_ppp 2 days ago 0 replies      
Here is a talk Matthew Yglesias gave about the contents of his book "The Rent Is Too Damn High":


tedmiston 2 days ago 0 replies      
I wonder why Steve Blank would publish a book via CafePress.

 "The Four Steps to the Epiphany: Successful Strategies for Products that Win"
 Author: Steven Gary Blank
 Publisher: Cafepress.com
 Number of links: 45

joeax 2 days ago 4 replies      
"The Rent Is Too Damn High: What To Do About It, And Why It Matters More Than You Think"

Not where I live. What to do about it? Move. Find an employer willing to let you work remotely, and find your own quiet cost-conscious piece of paradise.

pjdorrell 2 days ago 0 replies      
Possible application of Law of Unintended Consequences: every time you write a program to extract data _out_ of HN, you increase motivation for someone else to insert data _into_ HN.
Aaronontheweb 2 days ago 1 reply      
A little surprised to see that the Mythical Man Month isn't on that list: http://amzn.to/1ZHWUlF
mandeepj 1 day ago 0 replies      
I also do this most of the time. Whenever I see a book recommendation here, I go to Amazon either to buy it or save it to my wish list.
carpdiem 2 days ago 0 replies      
I notice that this list looks like it has a very long tail.

Can we get the top 100 books as well? (since many of those would have very similar mention numbers as the end of the top-30)

Havoc 2 days ago 0 replies      
Must admit I was expecting fewer programming books. A lot of the topics on here aren't directly programming related.

Thanks for the list though. Bought the psychology one.

ck2 2 days ago 1 reply      
Now that is a website with a nice clean layout, and it's easy to read.
veritas3241 2 days ago 1 reply      
This is really awesome! Thank you for putting this together.
dpflan 1 day ago 0 replies      
Cool - maybe this can become a monthly recap of posted books / (links)?
deadowl 2 days ago 0 replies      
I've read a grand total of two of those. Working Effectively with Legacy Code seems like it would make for a good read.
meetbryce 2 days ago 0 replies      
Your links don't open in a new tab, despite the icon and even if I use my middle mouse button.

Extremely annoying.

DyslexicAtheist 2 days ago 0 replies      
Amazed that nobody talks about W. Richard Stevens anymore. I am getting old.


enitihas 2 days ago 1 reply      
I am surprised to see the absence of SICP or anything from Dale Carnegie.
rplittle 2 days ago 1 reply      
Curious why the #1 is only 2.5 stars on Amazon
a-dub 2 days ago 0 replies      
There are a couple of goodies in there, but tbh that list is pretty depressing.
dschiptsov 1 day ago 0 replies      
How come that SICP is not here?
Dear open-source maintainers, a letter from GitLab gitlab.com
921 points by levlaz   ago   311 comments top 43
jbk 2 days ago 7 replies      
With VideoLAN, we're not on github, but I believe we fit the 'large open source project' description. We host VLC, FFmpeg, x264 and quite a few related libraries.

For VLC and all related VideoLAN projects, we're moving to our own instance of GitLab hosted on our infrastructure.

And to be honest, it's quite good, but a few things are ridiculously limited, to the point that some people in the community are resisting the change.

The first part is the groups and subgroups: it seems incredibly difficult to give sub-groups access to repos (like a team for iOS, one for Android, one for libVLC... but all are under the "videolan/" group). It seems there is a way with the EE, but not in the CE; and the current idea for the CE is to have sub-projects, which is not good, because it will make our URLs way more complex than needed.

The second part is the bug/issue tracker. We use Trac for VLC, and we want to leave it for something better; but GitLab issues are way too limited, even when using the templates. In particular, it seems impossible to add custom searchable fields (like "platform", "priority" or "module"), which are very useful for queries. Also, there is no way to build custom queries and store them ("I want all the bugs for Windows which are related to the interface modules").

If I remember correctly, this second part was also a complaint in the open letter to github.

Finally, it's not really related, since it's more a feature request, but we'd love to allow external people to fork our repos, but not create completely new ones (or have them validated) because we don't want to host any projects under the sun (there is github and gitlab for that). So far, you either allow both features or none of them.

PS: can we have custom landing pages and custom logo in the CE version? :D :D

nathan_f77 2 days ago 4 replies      
I still don't know how I feel about GitLab. My initial reaction was that they were an underhanded, cheap knockoff of GitHub. It felt kind of dirty, like they were stealing GitHub's thunder and giving it away for free. Then they started charging for enterprise features and turned it into a business, which felt even weirder. And then they raised a lot of money, which kind of made them seem more legitimate. And now this letter, which changed my perspective quite a lot. I didn't know they had features like voting on issues and issue templates.

So now it even feels like they're doing Git hosting the right way, making the core software open source, and charging for enterprise features.

On the other hand, I would have probably never paid for GitHub if they followed this model. So I don't think GitHub would have been as successful.

mdw 2 days ago 4 replies      
I've recently made an iOS app that integrates with GitLab. The people at GitLab have been incredible: they respond to my issues and improve the API with every release. I didn't expect this level of awesomeness when I started the project.

What's great about GitLab is that there's a release on the 22nd of each month, so you can count on pretty much continual improvement. Even if you don't think GitLab is suitable for your open source project, talk to the team on their issue tracker; things get solved pretty quickly!

_yy 2 days ago 2 replies      
Meanwhile, Phabricator has implemented all of these (and even more).

Custom templates: https://secure.phabricator.com/book/phabricator/article/form...

Votes: https://www.mediawiki.org/wiki/Phabricator/Tokens

It's better in almost every aspect than GitLab and GitHub.

See https://en.wikipedia.org/wiki/Phabricator for an (incomplete) list of open source projects using it.

akerro 2 days ago 2 replies      
My company, with ~300 developers, is moving to GitLab in the next few months. Today our CTO/PM shared his opinion about GitLab and he was very happy we're doing it; even better, I was the one who recommended it to him :)

I've been a GitLab user for a few years now; personally I like it much more than GitHub. One of the reasons is that I fear GitHub contains too many projects and gains too much control over OSS. I also dislike their CoS.

Good luck Gitlab!

glandium 2 days ago 5 replies      
Just for this, I'm tempted:

One issue that was raised several times was the ability to not create merge commits. In GitLab you can, as an alternative to the merge commits, use fast-forward merges or have merge requests be automatically rebased.

The main thing keeping me from actually doing it is the network effect... and this:


Right now GitLab.com is really slow and frequently down. This is because of fast growth in 2015.

thejameskyle 2 days ago 1 reply      
I'm one of the co-authors of the Dear GitHub letter. This is the type of response I want so badly from GitHub (but wasn't expecting).

GitLab still has a ways to go in terms of performance/reliability and polishing their product, but GitHub ought to be very nervous about them.

gravypod 2 days ago 2 replies      
If GitLab plays their cards right, they can take the market from github. That, in my book, will be good because unlike github, we can all contribute to making GitLab better.

The only question left is if your servers are powerful enough to run gitlab. Maybe I'll sacrifice a goat for some new server hardware and 256GB of ram.

rcarmo 2 days ago 4 replies      
I chose Gitlab for my former company over GitHub Enterprise because we wanted an on-prem solution, and it worked well enough for ~200 folk. We did have to tweak (and occasionally break) a few things, since quite a few people suffered from NIH syndrome and wanted things done "the right way".

In general, I liked it, but it always irked me that its Ruby underpinnings made it hard to upgrade/migrate stuff (we basically just swapped LXC containers at one point, not sure how it was handled during the last upgrade). If anyone ever manages to do a credible alternative that does _not_ use Ruby in any way but keeps the overall GitHub-like workflow, a lot of operations folks will switch _instantly_.

(Like https://try.gogs.io/explore, for instance)

Also, like some commenters already pointed out, the CE edition was ridiculously limited in some regards - we mostly skipped the bits we didn't like and did product-level ticketing outside it (using Trac), with Gitlab issues used only for "techie" stuff, tracking fixes, etc.

But today I'd probably just sign us all up for GitHub and be done with it, or fire up a VM image from some marketplace - there's hardly any point in maintaining our own infrastructure or doing a lot of customization.

neumino 2 days ago 1 reply      
I've been using GitLab instead of GitHub recently for all my new projects, and honestly nothing is worse than on GitHub, and a few things are better - like protected branches (with master protected by default), private repositories, etc.

That being said, what both GitHub and GitLab are missing is actually becoming a "social network", or maybe more an active network. There are tons of interesting projects that pop up every day that I would be interested in knowing about and contributing to, but there's basically no way to learn about them.

Kudos to the GitLab team for all its work :)

alexbardas 2 days ago 1 reply      
Both products are really good, but Github needs to realise that they have to be more open and listen more to their users.

There is an opportunity for Gitlab here and I'm happy that they decided to make this announcement.

The community is the actual winner of this healthy competition.

lowpro 2 days ago 0 replies      
Great PR move by gitlab, and I don't mean that negatively. This kind of direct response is exactly what a company looking to improve should do, hats off to gitlab!

Once their performance increases, maybe we'll see the momentum shift from Github.

filleokus 2 days ago 1 reply      
It seems like the linked issue, for the "long list of suggestions", is being spammed or something?


I have never used GitLab myself, but some of the features mentioned in the article (like a true voting system) is something I've really longed for. Might have to reconsider trying out GitLab more.

Ellahn 22 hours ago 1 reply      
Congratulations on GitLab, I personally use it and love it, and it's awesome to see amazing people like the VLC team considering GitLab.

A suggestion I have is to consider open-sourcing and re-branding EE on a GPL-Like license that also requires projects hosted with it to be open sourced, while specifying that contributions to this version can be re-licensed by GitLab to be used by paying customers (Or re-licensed to MIT and released on CE). This way open source people get it all, and if you want to use it on closed source you pay GitLab.

This also has the benefit of allowing open source developers to work with GPL, and the changing to MIT could even be decided before merging (Sending to either CE on MIT or EE on GPL+GitLabProprietary, according to developer decision).

There's a lot to fix on the OpenSource world, but there are also so many possible ways. Best of luck to the GitLab team, I hope we see more amazing stuff from you (Btw, GitLab CI got a lot better recently, I hope you keep improving it. :D)

bitshepherd 2 days ago 3 replies      
I remember GitLab. They interviewed me and a bunch of my colleagues to test the salary waters here in the Bay Area, and hired nobody because we were all "overpriced", even though several of us were underpaid for the area.

If you want the talent you need, especially in the Bay Area, you have to pay more than what the average developer makes in Amsterdam. I want to like GitLab, but I just can't get that bad taste out of my mouth.

sandGorgon 2 days ago 1 reply      
Please improve your mobile UI - phones have less horizontal space than vertical. You have put a cool-looking sidebar on the left side which takes up a lot of space, causing the text layout to be funny on my 5.2-inch LG G2.

Github OTOH has an extremely usable mobile UI.

onuras 2 days ago 2 replies      
Their issue tracker seems really busy atm: https://i.imgur.com/Iv46xnx.png
cyphar 1 day ago 1 reply      
While I've always admired GitLab being a free software alternative to GitHub, the whole EE licensing stuff really rubs me the wrong way. EE is completely proprietary software, even for people who have paid for it (it only allows the freedom to modify the code, but you cannot run it as you wish or distribute code freely). I understand charging money for your software, but you can also charge for feature requests and support. There's no need to lock away features from people using your free software just because you've deemed that feature "not useful for you" or "pay-to-use".

What I really don't get is the argument that "we won't liberate feature XYZ from EE because it's only useful for companies with 100+ developers". I think it's quite impressive that you can know what every user of your free software needs, and that you'll protect them from code only suited for enterprise.

I'll still use GitLab (the fact there's a free software version is great), and I'll be the first to fork (or back someone else's fork) CE as soon as you get acquired and your free software is no longer maintained by you (see: Oracle with Solaris, and every other acquahire ever).

danielsamuels 2 days ago 1 reply      
Perhaps if they had the same uptime, page load speed and UX as Github, I would consider it. Unfortunately we tried it for a week at work and none of us could figure out our way around it (when it was working).
chris_wot 2 days ago 1 reply      
GitHub probably want to, you know, actually pay attention to their developers now. Gitlab might just eat their lunch.
nevi-me 2 days ago 0 replies      
At least with Gitlab I can open an issue somewhere and follow it, like with open source projects. It doesn't disappear into the abyss with no feedback on what's happening.

I've implemented two self-hosted Gitlab instances at work, for one of the instances on our private network, I'm still fighting with IT to allow us to use LDAP. Gitlab EE is still off our 'pockets' as management aren't too keen to pay for it, at least yet, but I hope that we'll get there.

Our self-hosted instance is also a bit slow, not as slow as Gitlab.com, and if it was written in a language that I'm familiar with, perhaps I and some of my team could contribute to 'making it faster'. Pity I don't have enough time left in the day to learn Ruby. I've read up a bit on the work going on around Unicorn and workers, but maybe some of these things could be better written in other more 'performant' languages?

For personal projects I still use Bitbucket + JIRA. I got to the point where I decided to stop looking for freebies and pay. JIRA has been awesome, totally worth the price.

allan_s 2 days ago 1 reply      
For me, the big things missing (not only for big open source projects):

 1. More than one level of subdivision for groups/projects (see below).

 2. Groups of users (call it a department/team): because of 1 we have a lot of groups (all the small libs are in separate projects, so the project itself is a group, and of course we have several projects), so every time somebody joins the company we have to add them to every single project (for them to be able to read the code). We also have subcontractors, and we would like a nice way to keep them separate from the others.

I think these things are the most annoying day to day. Other than that GitLab is pretty good: the CI system, the hook system. It's pretty easy to make ticket references in the issue section redirect to Redmine, Taiga, JIRA or whatever.

spudfkc 2 days ago 2 replies      
I get why GitLab wrote this, but it seems like just a stab at GitHub.

There seems to be a lot of hating on GitHub here, but I personally love GitHub (and we use GitLab at my current employer).

I think GitLab is doing a great thing, and I appreciate that their community edition is free and open source, but GitHub has been able to provide an invaluable service. They have a great community that facilitates open source projects and a vastly better UI than GitLab (though that isn't saying much with how awful GitLab's UI is).

I'm eager to see how GitHub evolves in the future with GitLab as a competitor, as GitLab has a lot of nice features (built-in CI, etc).

georgefrick 2 days ago 1 reply      
I've been using GitLab myself and suggesting it to any clients with the proper infrastructure. When I found GitLab I tried it out, and finally ended up moving my SVN to Git (private projects). It's a great piece of software and I love the web hooks.
k__ 2 days ago 0 replies      
I read that Babel uses Phabricator.

How does GitLab compare to Phabricator?

infocollector 2 days ago 3 replies      
Any chance you can support Mercurial? There was a highly voted issue that I saw sometime back.
merb 1 day ago 1 reply      
What I would like to have in the free version is:

- Project importing from Stash to GitLab
- Mirror projects
- Display merge request status for builds on Jenkins CI
- Rebase merge requests before merge
- Git hooks (commit message must mention an issue, no tag deletion, etc.)

These are pretty essential.

pmilot 2 days ago 8 replies      
This attempt by the Gitlab folks to ride the Github dissatisfaction wave seems a little low-brow. Why respond to a letter that's not addressed to you? I would have preferred them to simply post an honest "Why you should migrate from Github to Gitlab" article. The tone just seems a little devious to me.

By the way, we're using self-hosted Gitlab at work and we love it. This isn't a knock against the actual product. In fact, I think Gitlab has improved tremendously in the last 18 months. I just wish they would be a little more up-front about their marketing efforts.

nik736 2 days ago 1 reply      
Off topic: https://jumpshare.com/v/MOEwe43eAHatINgDTi0z

What does this stand for? ;-)
grimborg 2 days ago 1 reply      
Using GitLab for Go development, "go get" doesn't work. I googled around and couldn't find a solution. With GitHub, on the other hand, it just works.

I love gitlab (even made a git tool to easily create repositories from the commandline, gitgitlab) but these small things make a real difference. I'll end up paying for a github organization account just to get this annoyance out of the way.
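For context (not from the comment): `go get` on a custom domain fetches the URL with `?go-get=1` appended and looks for a `go-import` meta tag naming the repo root, so a self-hosted GitLab would need to serve something like the following (host and paths are hypothetical):

```html
<!-- served for GET https://git.example.com/group/repo?go-get=1 -->
<meta name="go-import"
      content="git.example.com/group/repo git https://git.example.com/group/repo.git">
```

GitHub serves these tags automatically, which is presumably why "it just works" there.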

p1mrx 2 days ago 1 reply      
GitLab could differentiate their service by offering IPv6 support, which GitHub has so far declined to do.
MoSal 2 days ago 1 reply      
Somebody just spammed their Issues page[1]. The one they link to in the letter.

I think the spammer is trying to make a point! For starters, there seems to be no rate limiting applied.

[1] https://gitlab.com/gitlab-org/gitlab-ce/issues

nhumrich 1 day ago 0 replies      
I love GitLab and its features, and I will commit to actually using it once Docker Hub has GitLab integration. I agree, though, that this is more of a limitation on Docker Hub's side.
simi_ 2 days ago 0 replies      
Sourcegraph has pivoted into a git hosting appliance, it's pretty cool, even if nowhere near as full-featured. https://sourcegraph.com/
sqldba 2 days ago 2 replies      
Because the thread has turned into a "why I like/dislike GitLab over GitHub", I'll say one thing that keeps sending me back to GitHub is the neat-o desktop client.

I use a command line for everything else in life; but with Git I'm hopeless.

giancarlostoro 2 days ago 1 reply      
Thank you GitLab for being yet another option, and for at least maintaining one Open Source version of your software at the very least.
jorisw 2 days ago 2 replies      
Before you think of switching to GitLab, look at their status Twitter account over the past few months. They've had a lot of downtime. They run on Microsoft Azure.
jhasse 2 days ago 0 replies      
Has anyone any experience with Gogs? How does it compare to GitLab?


Entalpi 1 day ago 0 replies      
imonabout 2 days ago 3 replies      
I don't think the fact that GitLab is free for basic users is very discoverable from the site. The "features" tab leads to what seems to be the option of downloading a community edition, which makes it look like GitLab is only offering the code itself and not hosting it for you, plus an enterprise edition for some licensing fee (I suppose). Then you've got "sign in" but no "sign up", which leads me to ask "but then how do I sign up?", which most naturally (for me) leads me to the "pricing" tab. And that page only shows "trial" and the different priced tiers.

But if I press "sign in", I am able to sign up, with no notice about it being just a limited (45 day) trial. So so far I'm assuming that this is a perpetually free account, though I'm not completely sure yet...

beshrkayali 2 days ago 0 replies      
Nice ride, Gitlab!
bigbugbag 2 days ago 4 replies      
The "crippled community edition vs. paid enterprise edition" business model raises red flags. It's not that different from the free crippled demo, or from the freeware that trails the paid version by two or more major releases.

Who gets to decide which features are enterprise-only? And how are these enterprise-only features: "Hosting static pages straight from GitLab", "git-annex", "git hooks", etc.?

So the choice is: get a crippled version that doesn't meet reasonable expectations, or pay for an enterprise edition that comes with a big bag of features I have no use for, just to get the missing features that I can have on GitHub for free? (And I'm not a big fan of GitHub.)

As such, GitLab Community Edition is not very useful to me and does not seem to have a future, because its chosen business model works against its usefulness to people.

arihant 2 days ago 4 replies      
> "Right now GitLab.com is really slow and frequently down. This is because of fast growth in 2015. We are working to improve it in the first quarter in 2016. For now please consider downloading GitLab and using it on-premise for a fast experience."

Is this a joke? For people looking for free private Git hosting, there is Bitbucket. This statement is like saying "free, but not really." The fact is, if I want hosted Git from GitLab, I cannot reliably get it without paying at least $390 upfront for their EE plan. Too much smokescreen, too little actually on offer.

Caltech Researchers Find Evidence of a Planet Beyond Pluto caltech.edu
694 points by cyanbane   ago   124 comments top 38
hackuser 8 hours ago 2 replies      
The NY Times covers the story well:


EDIT: After reading the comments in this discussion, many of which are addressed in the NYT article, I'd say the NYT article is almost certainly worth reading.

xenophonf 5 hours ago 2 replies      
No one's mentioned Tyche yet?


In searching for Tyche, the WISE mission ruled out the possibility of anything larger than Saturn (95x the mass of Earth) out to about 10,000 AU, and anything larger than Jupiter (317x the mass of Earth) out to about 26,000 AU. WISE was able to detect objects the size of Neptune (17x the mass of Earth) out to about 700 AU, so it should be possible to find the object proposed by the Caltech astronomers here (10x the mass of Earth at around 600 AU). I don't know if WISE's current condition would allow it to perform such a search, as it's completely out of coolant.

dexwiz 9 hours ago 0 replies      
We have been looking for "missing" objects for a while. The Nemesis star theory says the sun has a brown dwarf companion. This object would interact with the Oort Cloud instead of the Kuiper belt. But that theory has been pretty much refuted.

This theory definitely looks more promising. Finding eccentric Kuiper belt objects, and aligning them with a missing object seems to be a good bet. Giving the object an orbit should make the search easier, and we will probably have a conclusion one way or another within a few years.


gregwtmtno 9 hours ago 0 replies      
Skepticism is warranted, but keep in mind that Mike "pluto killer" Brown, one of the authors, has an impressive track record. He discovered Eris and Sedna.
nkoren 9 hours ago 4 replies      
If it holds up, this is fantastic. Given how interesting the Pluto system has turned out to be, I can only fantasise about a potential super-earth system relatively nearby.

However, to be completely pedantic: would this actually be a planet? Or still a dwarf planet, despite its massive size? Keep in mind that the definition of planethood is not only that it's large enough to be rounded by its own gravity, but that it has also "cleared its orbit". I get the impression that this would cut through broad swathes of the still-cluttered Kuiper belt, and thus would only qualify as a "dwarf" despite its massive size.

I checked the original papers for references to whether it had cleared its orbit, and couldn't find any. Correct me if I'm wrong?

tdaltonc 9 hours ago 8 replies      
> Batygin and Brown inferred its presence from the peculiar clustering of six previously known objects that orbit beyond Neptune.

This raised a big red flag in my mind. This must produce a literally astronomical multiple comparisons problem. Yes they reported sigma = 3.8, but if they didn't do their multiple comparisons correction right (which I am in no position to determine), they're basically reading tea leaves.

If you're not familiar with multiple comparisons, it's kind of like [this](https://www.goodreads.com/quotes/649893-you-know-the-most-am...) or [this](https://xkcd.com/882/). If you look at enough extra-neptunian bodies, some of them are going to be in an odd looking cluster.
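The worry can be made concrete with a toy simulation — this illustrates the multiple-comparisons idea only, not the authors' actual statistics: if you examine enough random sets of six orbital angles, some will look "clustered" purely by chance.

```python
# Toy multiple-comparisons illustration: how often do 6 uniformly random
# orbital angles happen to look tightly aligned? (Thresholds are arbitrary.)
import math
import random

def circular_spread(angles_deg):
    """Mean resultant length: 1.0 = perfectly aligned, ~0 = spread out."""
    n = len(angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg) / n
    s = sum(math.sin(math.radians(a)) for a in angles_deg) / n
    return math.hypot(c, s)

random.seed(0)
trials = 10_000
tight = sum(
    1 for _ in range(trials)
    if circular_spread([random.uniform(0, 360) for _ in range(6)]) > 0.8
)
print(f"{tight}/{trials} random 6-object sets look tightly aligned")
```

A couple of percent of purely random draws clear even a fairly strict alignment threshold, which is why the correction the commenter mentions matters before reading anything into six clustered orbits.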

logingone 9 hours ago 4 replies      
I don't seem to have a grasp of our visibility into our own solar system. We can see a number of the planets with the naked eye, and a number of the moons with binoculars or a few-hundred-dollar telescope. Yet even with the great big radio telescopes, antenna arrays, Hubble, etc., we seem to be quite unaware of what's in our neighbourhood. Does anyone have figures on how much we've surveyed?
ilyagr 7 hours ago 0 replies      
There is an awesome summary by Emily Lakdawalla at the Planetary Society (as is often the case with such news).


japaget 9 hours ago 0 replies      
rbanffy 8 hours ago 0 replies      
The scales involved are astonishing. If the orbit is correct, less than one year has passed on it since the invention of writing here on Earth.
sandworm101 9 hours ago 2 replies      
Giant, weird orbit, debate ... there can be only one name. This is Planet X.

Also because X would be the 10th discovered planet — a nod to the fact that Pluto, while not a planet today, was indeed the ninth _discovered_ planet.

runewell 5 hours ago 1 reply      
The biggest casualty of Science this decade has been elementary school dioramas.
axxl 1 hour ago 0 replies      
At WWDC last year one of the lunchtime presentations was by Michael Brown, one of the researchers involved here. He was one of the people who proved that Pluto should not be considered a planet, and went over the history of the discovery and classification of planetary bodies. He also mentioned that even then they had some evidence of a planetary body beyond Pluto, due to its gravitational effect on some of the smaller bodies out there. Interesting to see this finally coming to something! He has the appropriate/amusing Twitter handle of http://twitter.com/plutokiller
onetwotree 8 hours ago 0 replies      
This is pretty solid science, even if, as others have pointed out, there's a bit of academic drama-rama and green jellybean stuff going on. In particular, their model made a prediction that they didn't set out looking for, which corresponded to existing observations. And of course, the whole hypothesis is easily testable. While they don't know where the hypothetical planet might be on its orbit, it sounds like there's a good shot that small telescopes should be able to spot it.

Exciting stuff.

ChuckMcM 9 hours ago 1 reply      
Ah, Planet X, which is now planet IX with Pluto's demotion :-). I am guessing that you can confirm it by looking for star occultations. Presumably the planet would blank out stars as it passed between Earth and those stars. So would it be possible to find it using existing plates?
empressplay 37 minutes ago 0 replies      
I just want to know if Planet X is covered by human-enslaving robots.
beamatronic 9 hours ago 4 replies      
Why wouldn't their calculations be able to suggest possible locations for the planet?
givan 8 hours ago 1 reply      

He believed this hypothetical planet of Nibiru to be in an elongated, elliptical orbit in the Earth's own Solar System, asserting that Sumerian mythology reflects this.

mast 7 hours ago 0 replies      
Very cool.

It reminds me about a book I read called "In Search of Planet Vulcan" (http://www.amazon.com/In-Search-Planet-Vulcan-Clockwork/dp/0...). Before Einstein, astronomers tried to explain the motion of Mercury by suggesting there might be another planet inside Mercury's orbit.

FreedomToCreate 9 hours ago 1 reply      
If Planet Nine exists, I wonder how the astrologers will tweak their models.
c3534l 6 hours ago 0 replies      
I've been seeing news stories about Planet X for far too long to take any of this seriously, no matter how seemingly trustworthy the news source is.
pavel_lishin 9 hours ago 3 replies      
> *the putative ninth planet, at 5,000 times the mass of Pluto, is sufficiently large that there should be no debate about whether it is a true planet*

That's 10x the mass of the Earth, right, or about 3x the size of Neptune?

dluan 9 hours ago 1 reply      
More from the folks behind the paper: http://www.findplanetnine.com/
theptip 8 hours ago 2 replies      
Did anyone else notice that the rendering of the orbits in the article looks strikingly similar to Kerbal Space Program's orbit UI?
johngossman 6 hours ago 0 replies      
This is very cool. Along with a visible supernova and a super bright comet, this is one of those things I dreamed about as an astronomy nerd kid.

Though part of me wants to say "Pictures or it didn't happen!"

JamesUtah07 8 hours ago 0 replies      
At first I thought this was another one of those planet x articles that talks about some hypothetical planet way out but then I saw that Mike Brown was involved and immediately got really excited. I hope they find something out there and we can send a probe to it in my lifetime.
photonwins 7 hours ago 1 reply      
Does it mean Voyager hasn't really escaped Sun's influence yet?
ck2 5 hours ago 0 replies      
If true, voyager hasn't left the solar system yet.
danieldrehmer 9 hours ago 2 replies      
I'm puzzled by how a planet this big could form in an orbit so distant.

The fact that the material in that region is so spread out, and that the orbital period of such an object is so long, must matter.

I would love to read some thoughts on that.

bpg_92 8 hours ago 0 replies      
For a moment there, I thought of Nibiru :D
neur0tek 1 hour ago 0 replies      
mgav 3 hours ago 0 replies      
Maybe it's a Death Star?
pavpanchekha 6 hours ago 0 replies      
They should call it Pluto.
richmarr 1 hour ago 0 replies      
Reading the article I didn't see a mention of a previously hypothesised fifth gas giant... so here's a link in case anyone's interested or knows something more.


luckystarr 8 hours ago 1 reply      
If this turns out true, that planet needs to be called "Tartarus". :D
mud_dauber 7 hours ago 0 replies      
LV-426. Just sayin'.
mturmon 9 hours ago 2 replies      
Provocative title for this Caltech press release.

Mike Brown, the co-author of the paper reported here, discovered Eris, a KBO like Pluto, in 2005. This discovery prompted the IAU in 2006 to demote Pluto out of the realm of "planet" into a "dwarf planet".

At the time, Alan Stern's New Horizons mission to Pluto had just been launched; it finally arrived last year. Stern was incensed that NH started out as a visitor to the 9th planet and was going to end up as a visitor to one of many KBOs — and not even the largest one (Eris is more massive).

The quotes given at the time (http://www.space.com/2791-pluto-demoted-longer-planet-highly...) are revealing:

"Pluto is dead." -- Mike Brown

"This definition stinks, for technical reasons...It's a farce." -- Alan Stern

For more: http://www.space.com/12709-pluto-dwarf-planet-decision-5-yea...

Stern is visiting Pasadena on a New Horizons victory lap next week. Should be interesting.

Desktop Neo rethinking the desktop interface for productivity desktopneo.com
785 points by ziburski   ago   301 comments top 85
dangrossman 1 day ago 9 replies      
It seems like a lot of people skimmed without reading the top or bottom of this site:

It's not a company. It's not a product. You're not being asked to buy it or buy into it, just to discuss the concept if you'd like.

This website is a portfolio piece for a 21-year-old university student hoping to find an internship. In my opinion, it's an impressive demonstration of his design and technical skills. It certainly says a lot more than the average 21-year-old's resume listing what courses they've taken so far.

tomc1985 1 day ago 12 replies      
The problem with the tablet analogy is nobody seems to know how to design complex, professional-level (like Photoshop, or Blender) applications in that paradigm. Tablet apps are overwhelmingly consumption-oriented, and those that aren't all seem to be oriented around popular (and populist) appeal.

Why is everyone out for the blood of the desktop/windows paradigm? Why can't the simplified-tablet market and the desktop-power-user markets coexist? Windows and taskbars and start menus are wonderful and my favorite way of interacting with computers, why must it be taken away?

king_magic 1 day ago 1 reply      
There are a few things I like in this (particularly the Finder bit), but so much of this is just a bunch of extraordinary statements with absolutely no universally-accepted justification, spoken as if they have some sort of universally accepted justification.

For example, "Panels use screen space more efficiently and are a more elegant way to multitask than normal windows."

Says who? You? It just drives me crazy when I see statements like these. It's not an effective way to get your point across. You want to make your case for something like that? Actually make your case. Present some evidence and your conclusions. Not everything has to be a Jony Ive marketing video.

edit: Last thing I'll say - I just re-watched the video, and caught the last line - "it rethinks desktop computing to help you get work done". My advice to the author - go work for various companies for 5-10 years, and then come back and see if that statement holds up. My ability to get work done would be crippled with this; in fact, my work would come to a grinding halt. "Work" just doesn't work in the kinds of idealistic ways these types of marketing-like videos always seem to show.

And don't get me wrong, I like the author's ambition. If I was in the internship-givin' business, hell, I'd probably consider him. I think this is a good way to get your name out there, even if it attracts criticism (like mine).

TrevorJ 1 day ago 5 replies      
The biggest thing I find lacking in desktop UI is the ability to save and switch between states. We need the idea of project 'sessions' where I can open up the software I need to use, pull up the files I need, set my favorite folders, etc and then hit 'save' and be able to restore back to that setup at the click of a button.

Right now, switching between projects means setting all that up every single time.
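The session idea is mostly a data-modelling problem before it's a UI problem. A purely illustrative sketch of what such a 'session' file could record and restore — the app names are placeholders, and actually relaunching apps would need a real OS API that this sketch stops short of:

```python
# Illustrative 'project session' snapshot: record open apps, files, and
# favorite folders as JSON; restore the same structure later.
import json
import os
import tempfile

def save_session(path, session):
    with open(path, "w") as f:
        json.dump(session, f, indent=2)

def load_session(path):
    with open(path) as f:
        return json.load(f)

session = {
    "apps": ["editor", "terminal", "browser"],
    "files": ["~/proj/notes.md", "~/proj/main.c"],
    "favorite_folders": ["~/proj", "~/refs"],
}

path = os.path.join(tempfile.gettempdir(), "project_a.json")
save_session(path, session)
restored = load_session(path)
```

The hard part the comment is really asking for — getting each application to re-open its own documents and window positions — is what no mainstream desktop exposes a uniform hook for.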

froh42 1 day ago 3 replies      
Hmm, isn't that "just" a tiling window manager at the bottom of a few other nice things?


While feeling the shortcomings of a window system, I never felt like I could productively work with a tiling manager, since I frequently overlap windows as a mechanism for easy access ... and there are lots of popup windows I don't want to cover the main application.

Oh yes, and I purposefully keep my Mac apps out of Full screen mode (and use a keyboard shortcut to maximize when needed, keeping out of full screen).

EdSharkey 1 day ago 1 reply      
I've sometimes thought the most egalitarian way to design a GUI would be to make it exquisitely accessible to the blind. This would force designers to organize their GUI with sensible, navigable hierarchies of data and controls.

To me, an exquisitely blind-accessible GUI's should function like a fancy keyboard navigable/editable graph data structure that echoes the hierarchies and relationships represented on the sighted displays. The biggest boon to such a GUI would be that the blind-accessible controls would function as an "expert" mode of navigating the GUI - one would never have to touch the mouse or trackpad to get stuff done.

Sighted users would benefit from learning these keyboard controls and we'd inject some hyper-productivity back into our apps to counter the fisher-price'ification that has been creeping into GUI's over the past 10 years.

tlow 1 day ago 0 replies      
This is still hacker news, right? So hypothetically we're talking silicon valley startups. How many of these kinds of places are heavily reliant upon outside companies like those mentioned by @GuiA to build prototypes of their designs?

I have seen plenty of consultants hired, but nearly every startup I've seen or been at/around (~20) prototyped their designs in house. As an example bu.mp hired an industrial design firm to help them make physical prototypes of a POS competitor to NFC technology (which ultimately failed), but had their own engineers and designers actually create and test the working prototypes.

Shortcomings notwithstanding, I think this is an example of excellent work and a creative way to find an internship. Given the opportunity, I'd most certainly offer this guy an internship if I could.

Nice job. I wish you the best of luck.

rocky1138 1 day ago 5 replies      
First paragraph in and I'm already not their customer:

"We now use smartphones and tablets most of the time, since they are much easier to use."

No. Just no. I don't want to use a touchscreen and closed ecosystem to develop software. That would be a nightmare.

The rest of the design seems to be taken straight from 10/GUI.

david-given 1 day ago 0 replies      
That's weird --- I looked at some of the sample animations, and I could swear I heard a tiny voice shouting 'Oberon! Plan-9! Rio!' in my ear...

I'd like to try something like this in action; but the problem, as always, is going to be bootstrapping. Look how badly Ubuntu manages something as simple as putting the application's menu bar in a non-standard place.

...I worked once with a desktop environment for the PC, GEOS. It had a feature where your application's UI was described in logical terms and this was then mapped to a physical UI when the app loaded. It allowed pluggable look-and-feels to drastically modify the look and behaviour of the application as they saw fit.

If we had something like that, this would be easy. Shame we don't, really.

vic-traill 2 hours ago 0 replies      
I think this is very interesting content, and this is represented in the volume of dialogue it has generated.

I found that Neo really resonated with my residential use, but not so much for my work.

From the Author's referenced blog post 'The Desktop is Outdated': "We interact with a lot of different content today, and a large part is outside of files". Not in my work environment where the majority of content is inside of files. But sitting at home - yeah, this is true for me.

It appears to me that the mobile interface cart is trying to drive the productivity desktop horse here. I don't know to what extent I buy it, but I like the way Neo challenges current desktop design.

I'm going to go back and read it again.

mattlutze 1 day ago 0 replies      
Tags are really tough. Unstructured categorization comes with its own basket of issues -- some distinct from structured taxonomies, but not necessarily less tricky.

A few thoughts there:

1. What happens when my mental schemas change in a year? The word I use to look something up changes, and I suddenly can no longer find it.

2. What happens when I get a little lazy in obediently tagging everything I create? Imagining the 2 or 3 words I'll want to use in the future to look something up (see the first thought) is really tough. Mentally taxing = a barrier to adoption.

3. Folders can get unnecessarily deep, stale, etc... but having the structure available to browse can be an extremely useful trigger in reestablishing the hallways of my desktop-stored "mind palace."

4. Having many ways to discover information I'm looking for > having a few ways. Search, browse, categorize, all have a purpose depending on the way a file or piece of information imprinted on my memory.

jb55 1 day ago 1 reply      
So basically xmonad + with a tagged file indexer. Although fzf + https://github.com/rupa/z works for me most of the time.

With respect to eye tracking, I had a similar idea the other day. Imagine holding a key, then moving your gaze to see an on-screen target following where you're looking at. You could use this as a really quick way to scroll or highlight/copy text without leaving your home row.

I really don't like leaving my home row.

I may be crazy.

clord 1 day ago 0 replies      
Computers knowing about focus is quite a compelling future direction. There are lots of really good possibilities. But also a few problems.

For instance, If you can track focus, you can make the monitor seem larger than it really is. Just scale everything that isn't being looked at down a bit. As the gaze shifts towards other objects, shift things slowly around and overlap the enlarged window over other background windows.

You can make focus-dependent shortcuts. Imagine vim with nouns and motions that can refer to and act on the focus point. `ytF`: yank-to-focus, where F is a motion from cursor to focus. Lots of rich possibilities there.

The only major problem I see is that you can't really share work easily (unless you open it up to multiple focus points somehow.) Also, it would be frustrating to have to look where you're typing. I'll often look at something else while typing just before switching tasks. I'm going to start paying attention to my focus more to see if there are other potential pitfalls with the computer knowing about it and changing modes in response.

In general I like the possibilities opened up by having focus. Heck even with a traditional mouse+keyboard, the extra data would help the computer understand us better. It might be more suited to a VR desktop, where the 'multiple-viewers' problem does not exist.

falcolas 1 day ago 1 reply      
Tiling window manager, with easy back-and-forth? Works for me.

I have to say, I really like tagging-as-filesystem concept (where you can also meta-tag something), and the gaze/touch interaction proposed would be awesome to have in my opinion.

The current interaction with tagging (basing off OS X) is still pretty clunky. It's a separate field, the new vs previous tag selection is aggravating with a keyboard, and there's no easy way to browse or select multiple tags to filter content.

Having to move to and from the mouse a lot is a bit of a pain, and even with the best touchpads on the market, the interaction with them can still be annoying when trying to do things like click on a HN upvote arrow. Being able to start the mouse from the point you're looking at, or even forgot the mouse entirely when starting typing into the field you're looking at - those would be fantastic additions.

I'm slightly more dubious about the voice interaction, though there are times where "Hey Siri, set a timer for 10 minutes" is a great way to interact with an otherwise over-complicated device.

isaiahg 1 day ago 2 replies      
A traditional desktop does everything I need. This just doesn't seem like enough of an improvement to make me jump ship. This whole "rethinking the desktop" buzz just seems like another fix looking for a problem.
mintplant 1 day ago 1 reply      
> The [menu] is easily scannable and you can search for options just by typing.

This is really important. It's 2016, search is easy, and I shouldn't have to hunt around because I don't know whether you stuck your options dialog under File, Edit, or Tools. I appreciate Ubuntu for implementing this OS-wide with Unity's HUD feature.

Raphmedia 1 day ago 2 replies      
"About this Concept

Neo was designed to inspire and provoke discussions about the future of productive computing. It is not going to be a real working operating system interface, it is just a concept. I am not saying that these ideas would definitely work and that this is the future of computing. However, there is large potential in rethinking the core interfaces of desktop computing for modern needs, and somebody has to try."

egypturnash 1 day ago 1 reply      
> Folders were a great metaphor when our files were a handful of office documents. But today, complex hierarchies make it hard to organize and find what we are looking for. The concept of a file being located in a single place seems outdated. Content is stored, synced, backed up and shared among many different devices. And our most important content lives in services or apps, not in folders and files. While mobile devices are trying to kill file management completely, it's more important today than ever. With the enormous amount and complexity of content, we need a new solution.

Okay, buddy. Tell me how you're going to navigate a complex project built with multiple apps with your cool new tag-only scheme.

How would you re-organize the myriad number of source code files, library sources, images, image source files, readmes, and other things? Hierarchical structures are good for that kind of thing. If you're going to advocate throwing them out then I think it is imperative for you to tell me how you will organize real projects rather than just a handwave about search and adding tags to tags. How would I reorganize the directory full of 193 Illustrator art files and a ton of subdirectories, many with multiple layers of their own subdirectories, that make up the graphic novel I finished last year? It's got a total of ~2.7k files in it but all of that complexity is hidden behind a bunch of subdirectories so I can quickly find what I need at any moment.

(And also: holy crap those sample images are so much WHITE, using this will be like staring into a spotlight. And so impersonal - the user-set desktop picture is a wonderful thing that makes the computer feel like it's theirs and I kinda feel like this proposal completely drops affordances like that in favor of a blown-up iPad UI.)

vineet7kumar 1 day ago 1 reply      
The gestures for fullscreen and minimize operations need 6 fingers. Using both hands to do anything on a touch screen seems very impractical (except for typing with both the thumbs on a reasonably sized screen or a split keyboard).
ksk 1 day ago 0 replies      
I feel like these sorts of presentations are not very honest. If your claim is improved productivity, why not do a side by side? A cursory glance shows that all of these features would completely destroy my productivity.

Taking a look at the gestures...

1) Scroll through panels - Alt+TAB is unquestionably faster (< 1sec)

2) Open App Control - Win key? (< 1sec)

3) Open Apps menu - Keyboard Shortcuts (< 1sec)

4) Open Finder - Win + E (< 1sec)

5) Close Panel - Alt F4 ((< 1sec)

6,7,8) Resize - Win + UP/DOWN/LEFT/RIGHT (< 1sec)

Now, sure I see some people claiming that "nobody" knows these shortcuts or that they are "not intuitive" (itself a loaded term), etc.


Proposal 1 (Teach people these shortcuts)

Proposal 2 (Invent completely new gestures - teach people these new gestures, which will then slow them down because a keyboard is just crazy fast compared to touch.)

Now, I'm going off of the whole Desktop phrasing. Maybe on a mobile, all of these might make more sense..

ricefield 1 day ago 1 reply      
For what it's worth, this feels very similar to the 10/GUI concept, which made some waves in 2009. (http://10gui.com/video/)
legulere 18 hours ago 0 replies      
I don't think tags are good enough to categorize data. Tags lack information about what is being tagged. When it's clear what's tagged, they work — on Twitter, for instance, it's the topic you're speaking about. What you really want is a property system like Wikidata's.

Think about a music program you want to write. You want to find all the artists whose music files are on this computer and, when the user clicks on an artist, find all the music that belongs to that artist.
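The distinction can be sketched in a few lines — bare tags ("Beatles") can't say *what* the tag means, while key-value properties ("artist = Beatles") support exactly the music-library queries described above. The filenames and values here are illustrative only:

```python
# Property (key-value) metadata instead of bare tags: each file carries
# typed properties, so we can ask "which artists exist?" and
# "which files belong to this artist?" directly.
properties = {
    "abbey_road.flac": {"artist": "The Beatles", "year": "1969"},
    "kind_of_blue.flac": {"artist": "Miles Davis", "year": "1959"},
    "help.flac": {"artist": "The Beatles", "year": "1965"},
}

def all_values(prop):
    """Every distinct value of a property across the library."""
    return sorted({props[prop] for props in properties.values() if prop in props})

def files_where(prop, value):
    """All files whose property matches the given value."""
    return sorted(f for f, props in properties.items() if props.get(prop) == value)

print(all_values("artist"))                  # list every artist
print(files_where("artist", "The Beatles"))  # all music by one artist
```

With flat tags, the first query ("list every artist") is impossible to express, because nothing marks which tags are artists.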

imonabout 1 day ago 2 replies      
I'm presuming you've researched tiling window managers as part of your work on this?
javipas 12 hours ago 0 replies      
Fun fact: as @ziburski has acknowledged, he got some inspiration from Clayton Miller's 10/GUI (2009), and if anyone watches the video


he will see how the organization in panels rather than windows was called "con10uum" (min 3:30). I wonder if some Microsoft designer/VP/exec saw this and got the idea to put "Continuum" as one of the most important features of Windows 10.

Besides from that, great work, congratulations @ziburski. Like many others I miss shortcuts on a productivity environment, but this really could work.

Animats 20 hours ago 0 replies      
The eye-tracking GUI concept is interesting. That needs to be tried again. It's been proposed before.[1][2] Dismissing notifications by looking at them is an interesting concept. It's not clear whether users would like it or hate it. That idea probably needs to come with an equally easy way to get it back. Maybe you want to do something about the notification.

One general insight about GUIs is that easily undone mistakes aren't so bad. That's the Amazon one-click purchase insight. The innovation is not that you can buy with one click. It's that you can cancel for the next 10 minutes or so. Most shopping systems had a "we have your money now, muahahaha!" attitude on cancellation until Amazon came along.

[1] http://hci.stanford.edu/research/GUIDe/
[2] http://www.cs.tufts.edu/~jacob/papers/barfield.pdf

terda12 1 day ago 0 replies      
Seems like too much unnecessary features to be honest.

Windows + Mouse with hotkeys is still the most comfortable set up for most people including me. I don't need an optimized experience.

I've tried Metro, I've tried i3, I've tried bspwm, I use Vim with split screen, and I still prefer the concept of draggable windows at the end of the day. It feels more free to be able to drag windows and take ownership of the layout than having a computer dictate what my layout should be.

At the end of the day, I feel like I'm more and more preferring simple UI's rather than these multitouch optimized experiences

JimRoepcke 1 day ago 1 reply      
At 1:00 of the video, he says "In my document I want to make a sentence bold, so I just select it, and then just click..."

But he doesn't explain HOW he selected the text. Since there's no mouse cursor, I have no idea how he just did that. A glance doesn't seem to be sufficient since it requires movement and intent.


lottin 1 day ago 0 replies      
It looks really nice. That said, I don't see how this interface is better for productivity -- unless your work consists entirely of reading email and surfing the web. Also, I don't get why UI designers hate "folders" and want to replace them with "tags". Personally I don't think you can beat a good old file system with a bunch of tags. Of course the file system requires some effort on the part of the user to keep things organised, but then again so do tags.
pestaa 1 day ago 0 replies      
I'd like to hear what Bret Victor would have to say about these ideas.
clapinton 1 day ago 0 replies      
This is a great example of how to get elements from mobile design (hamburger button, left sliding menu) and adapt it to desktop, which Apple has been slowly doing in the past versions.

My 2 cents:

- 6 fingers on the trackpad means two hands on the trackpad, which means moving one of them out of its working position. Just like you are avoiding moving the mouse around with the look-and-tap thing, you want to avoid users getting their hands off the keyboard, onto the trackpad, and back to the keyboard;

- the context menu is really nice.

It is a very impressive piece of work for someone who is looking to show off his skills while still in college. I made projects myself before graduating, but they were always amateur-y and only for myself; I never got around to actually sharing them with the world. He actually did something and put it out there for others to see and judge.

Kudos to the guy, especially if the whole design process he claims to have gone through (research, target groups, user flows) is as thorough as it sounds.

shib71 1 day ago 0 replies      
I found the repetitive use of "outdated" to be off-putting. It implies that much of the value of a new interface (including this one) lies in its novelty. Except for a minority, the precise opposite is true - most people aren't interested in learning how to use their computer all over again.
everyone 1 day ago 1 reply      
"We now use smartphones and tablets most of the time, since they are much easier to use."

I completely disagree with that. If you gave two people (one on a PC, the other on a tablet or phone) some task, e.g. move a file from here to here, and assuming they were both equally conversant in their chosen system, the PC user would be able to do it in a fraction of the time of the tablet or phone user.

andrewrothman 1 day ago 0 replies      
I think this is fantastic. There are only two problems as I see it.

The first is the consideration of running this on current hardware. Obviously Neo is meant to be run on hardware designed for these interactions (as shown with the voice key on the keyboard and the gaze tracking camera) but I'd like more detail on how this could run on systems that do not have this hardware.

The second is in professional applications. I did not see any screenshots that deal with Photoshop or Final Cut Pro or After Effects or Solidworks or really any of the crucially important applications for the desktop. These are exactly what keep the desktop around, and what keep people from accepting Metro.

It'd be great if we could solve these two issues, or at least discuss them. I think these ideas are really great and the presentation was amazing. I'm seriously impressed, but it's a little held back from widespread adoption until we can figure a few things out.

P.S. Also I'd like to be able to split my terminal windows horizontally please :)

teknologist 1 day ago 0 replies      
"Fullscreen - click and drag outwards on a panel with 6 fingers"

AKA the "goatse" gesture.

dingo_bat 1 day ago 0 replies      
>The desktop metaphor as the basis of computer interfaces is inefficient and outdated. Today, most of our data exists outside of files and folders.

What does this even mean? ALL my data is in files and folders. Is there a filesystem that doesn't use the concept of files and directories? If not, then isn't it best to model the system closely in the UI?

I also read the "Window management is outdated" section, and it completely failed to detail how windows are bad. It seems to me that windows are the most flexible UI paradigm, allowing you to decide exactly how you want to use your screen space. The challenge is on the devs to make apps with a responsive UI that adapts to the size of the window.

k_bx 18 hours ago 0 replies      
Things I definitely don't like:

- swiping to switch between apps (with transition animation) is a no-go; cmd-tab is much better as it is right now

- with two-finger swipe taken for switching apps, you need something to go back and forward in your browser. 3-finger swipe?

Not sure about tags instead of folders. Folders are simpler and easier to understand, so this needs a lot of detailed review and user research.

FatalBaboon 1 day ago 0 replies      
I keep hearing doom-saying like "laptops will kill desktops" and "tablets will kill laptops". What's next? Are we going to work on smartwatches?

And here I am, with my desktop computer, comfy keyboard & screens, fully hackable. I've had it for almost 12 years now, and I just replaced parts every now and then to keep it somewhat current. That will never go away. Then I got an old laptop for working away from home, but I hardly ever use it.

So when I hear touch screens will become ubiquitous and its necessary minimalistic UX will drive the consumer computer industry, I have my doubts.

Never gonna happen.

Something being useful does not mean it will necessarily eat everyone's lunch. Nor that is has to.

relkor 1 day ago 1 reply      
How did he create those demos? Did he actually write a window manager that wraps existing OS functions into his format? The amount of code that would take to provide all of that functionality would be stupendous. Also, how did he implement eye tracking with such robustness? Could he be kernel hacking and remapping the video memory buffer so that his shim code can take the output of working programs and rearrange it for the purposes of his demo? Could a designer shed some light for this programmer on how exactly that video was generated? What type of tools would he have used, so that I can google them on my own time?
andy_ppp 1 day ago 0 replies      
Okay this is fine... as far as it goes. I think I still want windows sometimes.

I would much prefer all of the things I do to be organised by the work I'm doing. Here is an example.

If I switch to the "personal project" I get a completely clean workspace (or whatever I left it as) with email filtered to be about only things related to my personal project. I want only the apps that I use for this project (browser, pycharm, photoshop, terminal) all on different screens all of these apps only showing the work and paths I have associated with a specific project.

Basically build GTD into all my apps and the desktop and allow me to filter said desktop by tags, projects, people etc.

And I think I'll take this opportunity to turn on no procrast again...

scrumper 1 day ago 0 replies      
I thoroughly enjoyed reading this contemplation of what desktop UIs might look like in 3-5 years. Excellent work.

There are some very good ideas in here, notably the pop-up circular 'swipe' menu of contextual actions (near the bottom of the page). I built one of these into a trading system once and it was well received. Gaze tracking combined with "Just Type" strikes me as being potentially very powerful too, if combined with some sort of highlight thing so you know where you're going to be typing.

There are some amusingly weird things too, like the polydactyl-only requirement to "click and drag inwards on a panel with 6 fingers."

threesixandnine 17 hours ago 1 reply      
What's up with all the criticism? Jealous much?

While browsing this I found it inspiring and beautiful. Way better than anything some companies that call themselves "design" shops put out for BIG $$$.

Screw all the negative energy here. Go man, go. You are the real deal among all the fake critics-designers.

TeMPOraL 1 day ago 0 replies      
I love it. It combines exactly the three things I want to see together:

- Tiling window manager - that I have (StumpWM), and it's awesome! (pun intended)

- Convenient tagging of everything - now that's been discussed for many years already, and yet for some reason nobody actually implemented it properly.

- Eye tracking - it's a feature I wanted to have since I started using multiple monitors and noticed the unnecessary input action I have to make to switch focus so that it follows what I'm looking at.

It also has one of my favourite UI patterns - wheel menu.

A great demo, IMO, and I wish someone would implement it. I'd happily try it out, and if good enough, I'd become a paying user.

Namrog84 1 day ago 1 reply      
The very first thing I noticed in the first image, with two windows side by side, is the example with Wikipedia on the right. Wikipedia has sidebars I don't care about or need to see. This is exactly the instance where I'd have the left window overlapping the right to cover them up, giving me more space on the left. And Chrome still allows me to scroll the window behind, without clicking or bringing focus to it. Many applications also have sidebars or other whitespace that I normally might cover - sometimes something as simple as a calculator app. I am still open to this idea, but until I see any tiling manager address this well, it's a non-starter for me.
ciaoben 19 hours ago 0 replies      
Great concept. I found the idea pretty interesting, though maybe not for desktop. I see a lot of good ideas that tablet companies could 'steal' to finally level up productivity on tablets. Maybe, even though this concept is 'Apple oriented', it could be a good game changer for Android tablets.
alistairSH 1 day ago 1 reply      
I like this direction. Now that my primary personal laptop is a MacBook Air 11" (and my wife's is a 13"), I find we both use full-screen apps more frequently than on larger monitors. But it doesn't feel like the OS has kept up, and the full-screen experience is less than ideal. Apps continue to open in windowed mode, and I have to resize or full-screen them. Swiping moves me from desktop to desktop, not app to app. Take some of the app-switching work that is now present in iOS and roll it into OSX (or Windows). I haven't put a lot of thought into this - I just know the OS "desktop" no longer feels like it works.
garblegarble 17 hours ago 0 replies      
This reminds me a lot of 10/gui[1], another interesting desktop concept that uses multitouch in a cool way to navigate based on the number of fingers you use

[1] http://10gui.com/video/

jadbox 1 day ago 0 replies      
I think no one told him about "tiling managers", while not pretty, I LOVE i3. https://i3wm.org/
tdriggs 1 day ago 0 replies      
This looks like something out of the design team for Windows 8 (in both good and bad ways). It is visually striking, a dramatic simplification of what currently exists, and it makes several assumptions about the real world of what app developers need and how they can plug into OS-provided UI surfaces. One of the most painful lessons for Microsoft with Windows 8 was that over-standardizing things like content sharing and tagging leads apps into situations where the affordances don't work.
anjc 1 day ago 0 replies      
Agreed OP, the advances that Windows 8.1 and Metro made were truly amazing.
snickmy 1 day ago 0 replies      
There are some good thoughtful points. Overall, a well-done presentation.

I wonder how natural touch on a touchpad is for a productivity flow in a desktop context. I personally never managed to integrate it into my workflow (neither with the Magic Mouse nor with trackpads). I also would be interested in understanding the impact of a mixed input experience - not sure our brain can switch context that fast. On the other hand, I do see the synergy in having eye tracking + voice control.

ommunist 14 hours ago 0 replies      
Someone -- hire this guy, give him a fellow OSX developer, and build the alternative UI for OSX. It's smoking hot! And it makes use of the Magic Trackpad.
WhoBeI 1 day ago 0 replies      
Hmm... not sure. It doesn't look like my kind of interface, really, but I would probably still give it a try. Just one thing: the word productivity is used a lot, but I see only webpages and file searches. Where are the CAD/CAM interfaces, the triple log-file-tailing terminals? How would you gesture or speak an image to Gimp?
mleonhard 1 day ago 0 replies      
I love it! I would want an angled keyboard with integrated touchpads for each hand. I use the Goldtouch V2 keyboard. It would be great if the keyboard were edged with 2-3 inch touchpad surfaces. Then I could keep my hands in an ergonomic position and avoid having to reach for the mouse with every interaction.
darrelld 1 day ago 0 replies      

I get that this is meant to be a thought provoking concept piece and I respect that. My issue here is with the font color choices on some of the paragraphs. It is set to have an alpha of 0.4! Why would you do that? Why make things less legible?

I'm no designer so I've made some poor contrast choices and I can forgive those, but why would you make the text more see through? Is this a design pattern?

saiprashanth93 19 hours ago 0 replies      
Is there a history of people putting concepts out and other people building on them? Yes, of course. The cynicism in this thread regarding this being self-promotion is very unfortunate.
PascLeRasc 1 day ago 0 replies      
I think this middle vertical text (http://imgur.com/n2KQZqO) should be flipped to read like book titles on a shelf, or omitted altogether. It was incredibly difficult to read.
homulilly 1 day ago 0 replies      
Cool resume. I thought the part where it loads super-compressed, low-detail images for every card in the app demonstration and then loads the full-detail image when you actually show that tab was a clever way to make the UI more responsive without adding a ton of page load time.
v-yadli 19 hours ago 0 replies      
It talks about a desktop interface, yet leverages a touchpad. When falling back to a mouse, this design language would not be effective.
tomc1985 1 day ago 0 replies      
On another note, it has become waaaay too easy to come up with a flashy website for a research product.
masmullin 1 day ago 1 reply      
Why something as intuitive as eye tracking, but something as clunky as touchpad gestures?

Surely if technology is good enough to track the eye, we don't need a touchpad to track hand gestures?

I really like this concept, but when I heard "three finger scroll" I died a little inside.

OOPMan 22 hours ago 0 replies      
Zzzzzzzzzzzzzzzz, another attempt to force touch-paradigms into the desktop environment.

Can't these Apple hipsters just be happy with the iDevices and leave my WM alone?

workitout 1 day ago 0 replies      
I prefer to overlap and manage my windows as I need to. I'd like to see a desktop from Google based on Android, but with overlapping windows, not the confining layout a phone is required to use.
rplnt 1 day ago 1 reply      
I would love to have that radial context menu on right-hold + move - in the browser at least. Gone would be mouse gestures, but I think this could capture quite a few of them and make them more accessible.

Anyone? :)

hellofunk 1 day ago 0 replies      
Some of the features demonstrated in their video are already in the most recent version of OSX, fyi. Coincidentally I noticed them today and was pleasantly surprised.
justhw 1 day ago 0 replies      
Guys, sometimes just enjoy the work. No need to nitpick on every detail. This is not a service and you are not a customer. It's a simple design concept.
mark_l_watson 1 day ago 0 replies      
I like the idea. It reminds me somewhat of sliding out a secondary app on a new model iPad, but with many vertical panels instead of just a maximum of two.
RachelF 1 day ago 0 replies      
This web page will be very useful prior art for the big companies that are going to try and patent these ideas!
lucian1900 1 day ago 0 replies      
I was surprised to realise that I already do something similar on OS X. All my windows are full height with different widths, and I use a touchpad.
FreedomToCreate 1 day ago 0 replies      
A lot of the design paradigms they have implemented already exist in MacOS without sacrificing the flexibility a desktop interface affords.
guhcampos 1 day ago 0 replies      
That's what Gnome 3 should have been.

But please don't fork/refactor Gnome again to become this. Not yet.

personjerry 1 day ago 0 replies      
One thing though, this should not be an OS as suggested, but merely a window manager.
dredmorbius 23 hours ago 0 replies      
I've seen a hell of a lot of annoying UI/UX design over my 40+ years of computer use. And very little that's not entirely derivative of Doug Engelbart's Mother of All Demos in December of 1968.

Lennart's selecting from among the best widespread ideas, adding in several that are currently fringe, and coming up with a few twists of his own. The result is several things I'd very much like to have right now.

Any company looking hard at the desktop would be idiotic not to make him an offer. Lennart would be best advised to pick very carefully from among those he's tendered, and to make clear his own expectations.

The losers will be chasing his taillights for the next 20 years. Possibly 40.

The vision here is strong, but practical. It dispenses with numerous tired elements of existing design, replacing them with what appear to be far more workable models. I see some weaknesses and holes in the presentation, but at this stage these are details, not, as is so often the case, baked-in architectural flaws.

The synthesis of tiling, desktop, and touch interfaces in particular is quite promising.

My biggest concern is that Lennart would accept an offer where his ideas would be absorbed into the black hole of an ossified company.

The obvious contenders for him are Apple, Google, and Microsoft. Ubuntu and Mozilla might be in the offing, and dark horses might emerge from Amazon or one of the existing hardware vendors -- Asus, Samsung, or LG, perhaps.

I think Apple, Microsoft, and Amazon would all be mistakes. The first two are far too wedded to their legacy platforms. Amazon is simply a horrible place to work, and the first rule is to not work for assholes.

Google, Ubuntu, or Mozilla offer the opportunity to develop this project and maintain an open-source offering. They'd probably be at the top of my list. I've suggested Google talk to Lennart and make him an offer.

One of the major hardware vendors might be of interest. I haven't kept track of where KDE/Qt are headed, but that's another option.

What I'm most thinking is, "damn, this would be a great time for a credible challenger for the desktop to appear". I'm not sure there is one.

But the world is Lennart's oyster.

I've commented further on the interface, and on what I see as weaknesses based on the presentation:


mpnordland 1 day ago 0 replies      
Sort of feels like what I already do, although in a prettier package. Mostly I just fullscreen everything in its own virtual desktop on my main screen, with a secondary for music/logs/email.
donpark 1 day ago 0 replies      
I like the concept and hope it will get built, hopefully for Linux.
profinger 1 day ago 0 replies      
I want Gaze recognition now. Where can I put my money? lol
dennismart 17 hours ago 0 replies      
Very nice - feels like you could be fast with it.
bobwaycott 1 day ago 0 replies      
First impression:

Conceptually, I'm really liking the approach. I could see it being a viable refreshing of desktop paradigms. There is an interesting mix of OS X, Windows, Linux WMs, and some other goodies from other apps here.

Visually, the biggest drawback is I could not tell while watching the video which panel has focus. I am assuming the idea is this would be handled by tracking eye focus. Perhaps I could get used to that, but it'd have to be instantaneous switching. As I'm typing this in a half-width browser window that takes up all vertical space, I have the Desktop Neo site in another half-width browser window taking up full vertical space by its side. I'm bouncing my eyes back and forth between the site and this textbox I'm typing in. I'm currently staring at the Neo site while I'm typing, without any looking back. Such a desktop paradigm would have to remain very intelligent about recognizing that I'm currently typing in a panel while looking at, and perhaps scrolling through another panel, without wanting my current action to lose focus or be interrupted in any way. I work this way all the time.

Arzh 1 day ago 0 replies      
So like, Windows 10?
samstave 1 day ago 0 replies      
I personally hate interacting with machines via gesture touch controls.
rasz_pl 1 day ago 1 reply      
Touch and voice better than a mouse? Sure. "Tag with voice because it's faster!" Then proceeds to show 8 seconds to add two words = 15 WPM.
ollybee 1 day ago 1 reply      
sergiotapia 1 day ago 0 replies      
Where do I download it to try it out?
coleifer 1 day ago 0 replies      
What is the pretentious fappery?! Use a tiling window manager and get over it.
obilgic 1 day ago 0 replies      
How do I get only the window/panel management features?

edit: this is nothing more than a design prototype.

jaked89 1 day ago 0 replies      
Boring. I almost fell asleep.
Brave: Brendan Eich's clean-ads browser startup brave.com
546 points by Seldaek   ago   466 comments top 68
jordigh 10 hours ago 39 replies      
I will repeat this one more time, because Eich seems to be missing the point.[1]

I don't adblock for privacy, security, or speed. Those are just nice side effects. I adblock because I do not want to be manipulated into buying things I do not need.

I wonder what would happen if, as a society, we said, "enough, no more ads". Would it really be the capitalist apocalypse that the ad industry is trying to make us believe it would be?


[1] https://news.ycombinator.com/item?id=10244964

hapless 11 hours ago 7 replies      
It sounds like their plan is to block all ads, then sell the new ad inventory created by all the blank space on pages.

Why on earth would users want this browser?

metafunctor 9 hours ago 2 replies      
Is anyone else concerned about the possibility that if ad blocking on the web becomes widespread enough, we will end up with more ads baked into the content itself? Native advertising, ads burned into images, ads burned in the middle of videos?

Thinking selfishly, I would much prefer the status quo, where I can block most ads, but the majority of consumers don't do it. Current ad blocking tech is fine, I'm afraid this could become an arms race.

grizzles 7 hours ago 0 replies      
I'm the founder of a competitor in this space called Paymail.net. We're looking forward to battling it out in the marketplace with Brave and the traditional ad networks. I posted an intro on Hacker News 2 days ago, but we didn't make it to the front page.

Our difference from Brave is that we give free ads to everyone; the advertiser only pays if the end user makes a purchase. Similarly, the display site gets nothing if there was no economic exchange. Capitalism is supposed to be a machine for getting you what you want. We want to help that process along. I have an uncompromising attitude that web/world ads should be for things that you really want to see, and then they become content.

That might be a utopian vision today, but I have strong belief in the power of people's self interest to drive positive change.

Edit: chrispm reposted the link here, https://news.ycombinator.com/item?id=10940684

vladikoff 11 hours ago 3 replies      
From a quick glance:

1) Desktop browser is an electron app with ad tracking injected into your app via http://cdn.brave.com/ via https://sonobi.com/welcome/index.php which promises "EFFECTIVELY PLAN AND SOURCE MARKETING OPPORTUNITIES WITH QUALITY AND VIEWABILITY FROM PREMIUM PUBLISHERS"

2) iOS browser is a fork of Firefox iOS - https://github.com/mozilla/firefox-ios

3) Android browser is a fork of https://play.google.com/store/apps/details?id=com.linkbubble...

jensen123 10 hours ago 1 reply      
It's great to see more people working to solve the problem of intrusive ads! I wonder what exactly they mean by intrusive ads, though? I hope it's more than just ads that don't respect your privacy. To me an ad is intrusive if it has any kind of movement or animation. Anything that moves automatically attracts my attention, so this is very annoying if I'm trying to read something.

I don't mind ads in print magazines so much (other than the fact that print magazines are unlikely to write negative stuff about companies that advertise with them). Ads in print magazines are ok with me, because there's no movement on them. So I can easily read one page, even though the next page has a full page ad.

They mention standard sized spaces and faster browsing. I actually wouldn't mind large ads - like something taking up my whole screen - that I can scroll through. Back in the 90s, it probably made sense to have small 468x60 pixel banner ads, but as fast Internet connections are becoming more and more common, I don't really see the point of restricting the size anymore. Large full page ads aren't really a problem in print magazines, and I don't think it would be on the web either, if we just got rid of the animations.

xjay 1 hour ago 0 replies      
Brave seems to acknowledge a smaller part of Ted Nelson's "Project Xanadu" [1], where you're paying to access content.

From the Project Xanadu Wikipedia article: "9. Every document can contain a royalty mechanism at any desired degree of granularity to ensure payment on any portion accessed, including virtual copies ("transclusions") of all or part of the document."

Ted's approach is (in my view) also a deduplication effort, as you're citing the original content, tracing it back to its origin by reference.

[1] https://en.wikipedia.org/wiki/Project_Xanadu

iza 11 hours ago 1 reply      
> With Brave, you can choose whether to see ads that respect your privacy or pay sites directly. Either way, you can feel good about helping fund content creators.

How do they plan on doing that? Not like it hasn't been tried before. The problem is you can't collect money on someone's behalf without them opting in, and if it is opt-in only you get the chicken and the egg problem for adoption.

webjprgm 9 hours ago 1 reply      
A few thoughts:

(1) If they block tracking, does it block Google Analytics? Because that would annoy me as a website owner.

(2) The reason I don't pay subscriptions to sites like the Wall Street Journal and NY Times is that I get my content from aggregators like Hacker News, so I only go to one of those paid sites if I follow an occasional link. Micropayments would fix that if I could pay one company a $5/mo subscription and have payments automatically doled out to a select list of good sites until my $5 was used up (then maybe ask me each time after that, or something).

(3) They talk about avoiding the ad-blocking war, but they are just contributing to it. I guess what they think is that by giving the website owner a way to get paid, they avoid some of the war; but many companies like to be in direct control of their money, so they might not like a middleman sitting on the highway charging everyone a tax to pass. And if Brave doesn't charge something for its services then it has no business model, so I'm assuming they are not passing 100% of revenue on to the site owner.
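The subscription-splitting idea in (2) is easy to sketch. Below is a toy allocation in Python, assuming a flat monthly budget divided across sites in proportion to visit counts; the site names and the $5 figure are illustrative, not anything Brave has announced:

```python
def split_budget(budget_cents, visits):
    """Split a flat monthly budget across sites, proportional to visit counts.

    visits maps site -> number of visits this month. Returns site -> payout
    in cents, rounded down; the rounding remainder stays with the user.
    """
    total = sum(visits.values())
    if total == 0:
        return {site: 0 for site in visits}
    return {site: budget_cents * n // total for site, n in visits.items()}

# $5.00 split across three (hypothetical) sites visited 3, 1, and 1 times
payouts = split_budget(500, {"news-site-a": 3, "news-site-b": 1, "blog-c": 1})
# -> {"news-site-a": 300, "news-site-b": 100, "blog-c": 100}
```

The real design questions are elsewhere, of course: who holds the money, how sites prove their identity, and how fraudulent "visits" are filtered out.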

willherschel 16 minutes ago 0 replies      
Brave wants to save me from intrusive 3rd-party trackers... but uses Chartbeat and Doubleclick on Brave.com?
wespad 6 hours ago 0 replies      
That's funny. I also use NoScript, just to be extra cautious, and when I go to Brave.com, I don't see ANYTHING. I would hope that somebody claiming to want to fix the web would be able to serve up a page that doesn't need permission to execute a script.
runjake 10 hours ago 2 replies      
The first question that pops into my head is "How is Brave going to monetize?".

They've received substantial investor money, so apparently they have something lucrative in mind. And it's probably not good for privacy-conscious end users.

Brakenshire 10 hours ago 0 replies      
I was just reading some web performance audits by Paul Irish:


One of the things he mentions on one of the sites is AdSense listening to every scroll event and doing tracking work that takes 25ms on a smartphone (his smartphone, likely to be high end). That means your scrolling performance is going to be inherently bad, probably below 30 fps once you take into account other work associated with the browser or the site. Having a browser that takes out this kind of code, but doesn't break the business model of the website owner, does seem like an interesting idea. It seems like a major part of the mobile web is half-broken for these kinds of reasons.

anotherhacker 11 hours ago 3 replies      
I'm doubtful of this taking off. People stick with the default apps and settings. The average person just really doesn't care that much about this kind of thing.

Maybe enterprises or businesses will like it - so their employees can avoid whitelisted sites that mistakenly have malicious code in the ads, e.g. Flash.

rockdoe 11 hours ago 3 replies      

"Firefox for iOS"

They forgot to remove the branding from their "new" browser.

juandazapata 10 hours ago 3 replies      
It's funny when AdBlock tells you that brave.com has 2 blocked Ads. Oh the irony.
threesixandnine 9 hours ago 0 replies      
Am I getting the message from their #about section correctly?

They want to block the ads that the person running a web site put on their own site, and replace them with their own (Brave Ads Infused TM Ltd. Inc. - let's make some money while pretending we are freeing the world).

abhv 11 hours ago 2 replies      
How will Brave prevent anti-ad-blocking mechanisms from interfering with the page?

For example, CBS/ABC/NBC seem to detect uBlock and then stop serving content.

Plough_Jogger 2 hours ago 0 replies      
The surfer in the main slider has had his fins completely adblocked.


zmanian 11 hours ago 0 replies      
The biggest challenge of micropayments is how to minimize the cognitive load of making many tiny payments.

How should the user agent decide when to alert the user?
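One plausible answer - purely a sketch of my own, not anything Brave has described - is to batch the tiny payments silently and prompt the user only when cumulative unconfirmed spend crosses a threshold:

```python
class PaymentAggregator:
    """Toy policy: accumulate micropayments silently; alert the user only
    when unconfirmed spend crosses a threshold. All numbers illustrative."""

    def __init__(self, alert_threshold_cents=100):
        self.alert_threshold = alert_threshold_cents
        self.pending = 0

    def record(self, cents):
        """Record one micropayment; return True if the user should be prompted."""
        self.pending += cents
        if self.pending >= self.alert_threshold:
            self.pending = 0  # counter resets once the prompt is shown
            return True
        return False

agg = PaymentAggregator(alert_threshold_cents=100)
prompts = sum(agg.record(5) for _ in range(40))  # 40 payments of 5 cents
# the user sees 2 prompts (at $1.00 and $2.00) instead of 40
```

This trades precision for calm: the user confirms dollars, not cents, which is roughly how prepaid transit cards and app-store balances already handle the same problem.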

desireco42 5 hours ago 0 replies      
I think this is a great thing, the right thing, and it is coming from a trustworthy person/company.

Now, time will tell how things play out, but I believe I can count on Brendan to make the right choices when it comes to features and compromises.

bootload 3 hours ago 0 replies      
"Brave is open source!"

Wow, a new browser that is open source; this is good news. I'm hoping the development focus is flexible. Remember Flock? [0]

[0] https://www.flickr.com/photos/tags/flock+browser

abercromby 10 hours ago 1 reply      
Man, that logo is similar to Workfront's: https://www.workfront.com
kybernetikos 8 hours ago 0 replies      
Confused about this. Does it display adverts on pages belonging to organizations not using its network? That would be pretty ethically dubious in my book.
Kristine1975 10 hours ago 0 replies      
What's with the requests to static.doubleclick.net on that page?
CaptSpify 5 hours ago 0 replies      
Why a new browser, instead of plugins for existing browsers?

I dislike ads, but there are already solutions for blocking them. Although I do like the premise of this, I'm not eager to switch browsers just to start supporting advertisers.

tomp 11 hours ago 0 replies      
I don't see any reason I would prefer this over the native iOS ad blocking (with an appropriate blacklist).
bpodgursky 8 hours ago 1 reply      
Anyone OK with this business model should also be ok with ISPs stripping out ads and replacing them with their own content (remember that?)

On the other hand, if this gets traction (unlikely, admittedly) this may finally force the issue to the courts and get content fiddling declared a copyright/TOS violation. Which I'm not sure you all want.

Animats 7 hours ago 1 reply      
So why do I have to "sign up" for this supposedly privacy-enhanced browser? They don't have a need to know who I am.

"Then we put clean ads back". This is open source, right? It's on Github. Can someone fork this and remove all the ads? Thank you.

adrianlmm 7 hours ago 0 replies      
Glad to see you back Brendan!
callmeed 7 hours ago 2 replies      
Maybe this is a stupid question, but ... if ad blocking is such an issue for publishers, why don't they do the ad-serving logic on the server and display locally saved creative assets (from the same host)?
headgasket 9 hours ago 0 replies      
This is a worthy endeavour. I want to download and help test and develop this today. Helping deflate this ad-bubble before it turns into the "The Zero Theorem" is worth the effort.

This however does not tackle the mindset shift that needs to occur for the masses to start protecting the private information they voluntarily give up on services they are signed in on the social net.

We are currently working on a project that will use this information to the marketer's advantage in a way that will make people sick once they realize the extent of the profiling going on, with the ultimate goal of reversing the trend before it's too late. Make people raise their guards, sell some tech on the way.

devy 10 hours ago 0 replies      
Due to Brave's ad-block technology, I guess Brendan Eich will be blocked[1] by IAB from their annual conferences. (every pun intended.)

[1]: https://news.ycombinator.com/item?id=10937704

vhold 4 hours ago 0 replies      
> We make sure you aren't being tracked while you shop online and browse your favorite sites.

That's not a realistic claim. Nothing is stopping publishers and advertisers from sharing back end data.

RobertDeNiro 11 hours ago 0 replies      
I wonder how much they paid for the domain name.
linksbro 11 hours ago 0 replies      
Please fix the links to the Terms of Use and Privacy Policy. These should always be the first documents you get working on your startup's website.

And can anyone find this 'roadmap' that Eich talks about in the post?

st3fan 11 hours ago 0 replies      
"Imitation is the sincerest form of flattery."

To be fair, Firefox for iOS is open source. Take it, remix it, improve it. It is all good. Mozilla Public License.

smanuel 10 hours ago 0 replies      
Someone should make a browser that blocks "Everybody can earn xxx$+ daily... You can earn from ..." spam comments. I think I've seen those mostly through the FB comments plugin. Obviously FB can't / doesn't want to fix that.
xyzzy4 10 hours ago 1 reply      
But I want to block all advertising, not just harmful advertising.
lossolo 9 hours ago 3 replies      
You all are forgetting one thing. Without ads, Google would not exist; without ads, Facebook couldn't exist, expand, or fund research. If everybody blocks ads, you will see a decline in free sites.
return0 3 hours ago 0 replies      
Sounds like a brave plan indeed. Good luck to them!
jsvaughan 10 hours ago 1 reply      
Surely we are soon going to see server side ad-injection and the end of ad blocking
_pmf_ 9 hours ago 0 replies      
AdSense was considered the pinnacle of unobtrusive ads in the past.
lumberjack 10 hours ago 1 reply      
Where is the incentive for the user to use this thing?
jagermo 9 hours ago 1 reply      
Playing the devil's advocate:

How do you want to finance development in the long run?

It is a nice solution and I'd hate to see it go because of financial problems.

pussinboots 5 hours ago 0 replies      
Patreon, but it allocates a percentage of however much you want to pay per month to the sites you visit, relative to how much time you spent on them.
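The allocation scheme this comment describes is simple arithmetic. A hypothetical sketch (function and site names are invented, not anything Brave or Patreon actually ships):

```python
# Split a fixed monthly budget across sites in proportion to time spent.
def allocate_budget(budget, time_spent):
    """time_spent maps site -> seconds browsed this month."""
    total = sum(time_spent.values())
    if total == 0:
        return {site: 0.0 for site in time_spent}
    return {site: budget * t / total for site, t in time_spent.items()}

# $5/month split across three (made-up) sites:
shares = allocate_budget(5.00, {"news.example": 3600,
                                "blog.example": 1800,
                                "docs.example": 600})
```

A real implementation would also need thresholds for sites visited only briefly, which is where the cognitive-load question raised elsewhere in the thread comes back in.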
chejazi 11 hours ago 0 replies      
Haven't looked at the source but worth noting there's a Bitcoin logo on the homepage near "Browse Better."
joesmo 9 hours ago 0 replies      
It says 'startup' but I don't see a business model or customers willing to pay for what they can already get free.
e15ctr0n 6 hours ago 0 replies      
Though the team working on Brave seems to have a few ex-Mozilla engineers, they have chosen to fork browsers other than Firefox. (Also, nice to see so many Canadians!)

'Brave browser promises faster Web by banishing intrusive ads' | Jan 20, 2016 http://www.cnet.com/news/ex-mozilla-ceo-try-braves-new-brows...

> Eich and his team built Brave out of Chromium, which is the foundation for Google's Chrome browser, which leaves most of the actual development and security support to Google. Why not use Firefox, into which Eich poured so much effort? Because Chrome is more widely used and therefore better tested by developers who want to make sure their websites work properly, he said. "Chromium is the safe bet for us," he said.

* The desktop browser is a cross-platform desktop application created with a fork of Github's Electron framework that is itself based on Node.js and Chromium. https://github.com/brave/electron https://github.com/brave/browser-laptop

* The iOS browser is a fork of Firefox for iOS, which is a Swift app developed from scratch by Mozilla. https://github.com/brave/browser-ios

* The Android browser is Link Bubble, which is a wrapper around the default Android browser. https://github.com/brave/browser-android Previous HN discussion here: https://news.ycombinator.com/item?id=7453897 Australian developer Chris Lacy announced its sale in Aug 2015: http://theblerg.net/post/2015/08/05/ive-sold-link-bubble-tap...

* The ad blocking technology is courtesy of a Node.js module that parses Adblock Plus filters, using a bloom filter and the Rabin-Karp algorithm for speed. https://github.com/bbondy/abp-filter-parser-cpp

* The database is MongoDB. https://github.com/brave/vault

Past news coverage:

Mystery startup from ex-Mozilla CEO aims to go where tech titans won't | Nov 17, 2015 http://www.cnet.com/news/mystery-startup-from-ex-mozilla-ceo...

Use Link Bubble to open links in the background on Android | Aug 26, 2015 http://www.cnet.com/how-to/use-link-bubble-to-open-links-in-...
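The ad-blocking bullet in the list above (bloom filter plus Rabin-Karp) describes a classic two-stage match: a cheap probabilistic prefilter rejects most URLs, and only candidates are checked against the real rules. A toy sketch of the idea, not Brave's actual implementation (the filter rules, sizes, and hash choices here are invented, and the rules are assumed to be at least N characters long):

```python
# Two-stage URL matching: a bloom filter over 8-byte substrings of the
# block rules gives a fast "definitely not blocked" answer; only bloom
# hits fall through to the exact (slow) rule scan.
class Bloom:
    def __init__(self, bits=1 << 16):
        self.bits, self.buf = bits, bytearray(bits // 8)

    def _positions(self, h):
        return [(h * seed) % self.bits for seed in (2654435761, 40503)]

    def add(self, h):
        for p in self._positions(h):
            self.buf[p // 8] |= 1 << (p % 8)

    def maybe_contains(self, h):
        return all(self.buf[p // 8] & (1 << (p % 8)) for p in self._positions(h))

N = 8  # window length; a real Rabin-Karp rolls this hash in O(1) per step
def window_hashes(s):
    return [hash(s[i:i + N]) & 0xFFFFFFFF for i in range(len(s) - N + 1)]

RULES = ["doubleclick.net", "/ad_banner/"]  # invented stand-ins for a filter list
bloom = Bloom()
for rule in RULES:
    for h in window_hashes(rule):
        bloom.add(h)

def probably_blocked(url):
    # bloom filters have no false negatives, so a miss means no rule matches
    if not any(bloom.maybe_contains(h) for h in window_hashes(url)):
        return False
    return any(rule in url for rule in RULES)  # exact check removes false positives
```

Brave's module parses full Adblock Plus syntax rather than plain substrings, but the prefilter structure is the same.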

ck2 9 hours ago 0 replies      
Does anyone else remember when adsense first launched and it was text-only ads?

People would actually stop to read the ads because they were interesting and relevant.

Then google caved to images and animation and 100+ objects on a page, each with their own tracking scripts to slow browsers to a crawl.

MisterBastahrd 9 hours ago 0 replies      
So his business model is to hijack advertising and funnel it through his own little scheme, forcing publishers to pay him in order to get ads seen by their customers. Hope he loves getting his browser's user agent treated like Ebola.
ifdefdebug 10 hours ago 0 replies      
My ublock add-on blocks exactly 5 requests from their home page... so why would I want to install their browser?
puppetmaster3 10 hours ago 0 replies      
This will be banned by app stores and such and quickly.
shmerl 10 hours ago 1 reply      
Is it using Servo?
dbg31415 7 hours ago 0 replies      
I really like just replacing my hosts file to block ads.


But I can't do that on my phone without jailbreaking it. Stupid phone.

ebbv 9 hours ago 0 replies      
This seems awful from a content provider perspective. I no longer get to have control over my own content, Brave gets to decide for me how much my content is worth and what ads appear on my site.

This isn't that far removed from coming into a bakery and saying "The Cupcakes are no longer $2, they're $1.50 'coz that's what we think people want to pay."

I realize the idea is that this is "better" for the content providers than Ad Block, but both are, IMHO stupid. If a site you visit has ads you don't like, complain to the people who run the site and stop going to it. All Ad Block software has never been a fix, merely a tool in an ever escalating war of ads where users and content creators both lose.

jscheel 10 hours ago 0 replies      
Can they block scroll-jacking too?
Touche 11 hours ago 7 replies      
The scrolling on this website is janky as hell. Looks like they are using some "smooth scrolling" plugin that makes scrolling not smooth at all. Not sure why native browser scrolling was insufficient for them; I'm scrolling here on HN all the time and it's perfectly smooth.
dang 2 hours ago 0 replies      
Comments like this are not welcome here, regardless of one's position on any issue. Please post civilly and substantively or not at all.
dang 3 hours ago 5 replies      
We detached this subthread from https://news.ycombinator.com/item?id=10938818 and marked it off-topic.
orliesaurus 10 hours ago 0 replies      
hahahahahahaha and this is why I use the meanest adblocking oss software i can find on my devices
melted 5 hours ago 0 replies      
It is very brave of Brendan to pursue such an obviously dead-end idea.
WhatIsDukkha 8 hours ago 0 replies      
If we took the name "Brendan Eich" and replaced it with "Comcast" more of us would find this entirely loathsome.

Indeed it is a very loathsome business model.

People have taken exception to it when AT&T and Comcast inject ads into your browsing experience, and when Adblock Plus removes ads and then reinjects them.

Why is this not hijacking the web, extorting publishers to buy into yet another ad network, and then trying to leverage this into a future payment network?

mirap 11 hours ago 2 replies      
Have you seen the Git repo of the desktop version? It seems it's completely written in JavaScript: https://github.com/brave/browser-laptop

Could such thing be secure?

indus 11 hours ago 1 reply      
This para from their home page could be akin to legalizing marijuana!

"The new Brave browser blocks all the greed and ugliness on the Web that slows you down and invades your privacy. Then we put clean ads back."

jorangreef 11 hours ago 4 replies      
I would much prefer a new browser that makes true native web apps possible with a one-click install to indicate trust.

A browser built with Electron that exposes Node.js and otherwise keeps away from the HTML5 kitchen sink, in order to push innovation away from the spec committees and back out to the community. Vital technology like TCP, UDP, DNS, and the filesystem is being locked up behind a facade of poorly implemented APIs.

A browser with a small, efficient core, optimized for rendering, and with a brilliant app install system, and brilliant native cross-platform integration. The time is ripe.

jacquesm 11 hours ago 0 replies      
Pet peeve: accounts with < 10 karma and 1 submission to their name that complain about what is news and is not news.
I ended up paying $150 for a single 60GB download from Amazon Glacier medium.com
637 points by markonen   ago   220 comments top 45
dirktheman 2 days ago 8 replies      
I'm the first one to admit that Glacier pricing is neither clear nor competitive regarding retrieval fees. I do think that a lot of people use it the wrong way: as a cheap backup. I use:

1. My Time Machine backup (primary backup)

2. BackBlaze (secondary, offsite backup)

3. Amazon Glacier (tertiary, Amazon Ireland region)

I only store stuff that I can't afford to lose on Glacier: photos, family videos and some important documents. Glacier isn't my backup, it's the backup of my backup of my backup: it's my end-of-the-world-scenario backup. When my physical harddrive fails AND my backblaze account is compromised for some reason, only then will I need to retrieve files from Glacier. I chose the Ireland region so my most important files aren't even on the same physical continent.

When things get so dire that I need to retrieve stuff from Glacier, I'd be happy to pony up 150 dollars. For the rest of it, the 90 cents a month fee is just a cheap insurance.

res0nat0r 3 days ago 5 replies      
Glacier pricing has to be the most convoluted AWS pricing structure and can really screw you.

Google Nearline is a much better option IMO. Seconds of retrieval time and still the same low price, and much easier to calculate your costs when looking into large downloads.


markonen 2 days ago 0 replies      
OP here. Some updates and clarifications are in order!

First of all, I just woke up (it's morning here in Helsinki) and found a nice email from Amazon letting me know that they had refunded the retrieval cost to my account. They also acknowledged the need to clarify the charges on their product pages.

This obviously makes me happy, but I would caution against taking this as a signal that Amazon will bail you out in case you mess up like I did. It continues to be up to us to fully understand the products and associated liabilities we sign up for.

I didn't request a refund because I frankly didn't think I had a case. The only angle I considered pursuing was the boto bug. Even though it didn't increase my bill, it stopped me from getting my files quickly. And getting them quickly was what I was paying the huge premium for.

That said, here are some comments on specific issues raised in this thread:

- Using Arq or S3's lifecycle policies would have made a huge difference in my retrieval experience. Unfortunately for me, those options didn't exist when I first uploaded the archives, and switching to them would have involved the same sort of retrieval process I described in the post.

- During my investigation and even my visits to the AWS console, I saw plenty of tools and options for limiting retrieval rates and costs. The problem was that since my mental model had the maximum cost at less than a dollar, I didn't pay attention. I imagined that the tools were there for people with terabytes or petabytes of archives, not for me with just 60GB.

- I continue to believe that starting at $0.011 per gigabyte is not an honest way of describing the data retrieval costs of Glacier, especially when the actual cost is detailed, of all things, as an answer to a FAQ question. I hammer on this point because I don't think other AWS products have this problem.

- I obviously don't think it's against the law here in Finland to migrate content off your legally bought CDs and then throw the CDs out. Selling the originals, or even giving them away to a friend, might have been a different story. But as pointed out in the thread, your mileage will vary.

- I am a very happy AWS customer, and my business will continue to spend tens of thousands a year on AWS services. That goes to something boulos said in the thread: "I think the reality is that most cloud customers are approximately consumers". You'd hope my due diligence is better on the business side of things, as a 185X mistake there would easily bankrupt the whole company. But the consumer me and the business owner me are, at the end, the same person.
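For anyone puzzled by how 60GB could come to $150: per the article and Amazon's old FAQ, the pre-2016 retrieval charge was driven by your peak hourly retrieval rate, billed across every hour of the month. A rough model of that structure (my reading of the coverage, not an official formula; the author's repeated retrievals pushed the real bill higher still):

```python
# Rough model of old-style Glacier retrieval billing: the per-GB rate is
# applied to the month's PEAK hourly retrieval rate, times 720 hours.
def glacier_retrieval_cost(gb_retrieved, hours_to_retrieve,
                           rate_per_gb=0.011, free_gb_per_hour=0.0,
                           hours_in_month=720):
    peak = gb_retrieved / hours_to_retrieve
    return max(peak - free_gb_per_hour, 0) * rate_per_gb * hours_in_month

naive = 60 * 0.011                       # ~$0.66: the mental model most people have
actual = glacier_retrieval_cost(60, 4)   # ~$118.80 if 60GB comes out in one 4-hour job
```

Spreading the same 60GB over the whole month drops the charge back to the naive ~$0.66, which is why slow-trickle retrieval tools exist.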

astrostl 3 days ago 1 reply      
Arq has a fantastic Glacier restore mechanism. You select a transfer rate with a slider, and it informs you how much it will cost and how long it will take to retrieve. It optimizes this with an every-four-hours sequencing as well. See https://www.arqbackup.com/documentation/pages/restoring_from... for reference.
re 3 days ago 1 reply      
Glacier's pricing structure is complicated, but fortunately it's now fairly straightforward to set up a policy to cap your data retrieval rate and limit your costs. This was only introduced a year ago, so if like Marko you started using Glacier before that it could be easy to miss, but it's probably something that anyone using Glacier should do.


KaiserPro 2 days ago 0 replies      
Glacier is not a cheap/viable backup

its even less suited to disaster recovery (unless you have insurance)

Think about it. For a primary backup, you need speed and ease of retrieval. Local media is best suited to that, unless you have an internet pipe big enough for your dataset (at a very minimum 100meg per terabyte).

A 4/8-hour time to recovery is pretty poor for a small company, so you'll need something quicker for primary backup.

Then we get into the realms of disaster recovery. However, getting your data out is neither fast nor cheap: at ~$2000 per terabyte for just retrieval, plus the inherent lack of speed, it's really not compelling.

Previous $work had two tape robots. One was 2.5 PB, the other 7(ish). They cost about $200-400k each. Yes, they were reasonably slow at random access, but once you got the tapes you wanted (about 15 minutes for all 24 drives) you could stream data in or out at 2400 megabytes a second.

Yes, there is the cost of power and cooling, but it's fairly low unless you are running at full tilt.

We had a reciprocal arrangement where we hosted another company's robot in exchange for hosting ours. we then had DWDM fibre to get a 40 gig link between the two server rooms

Spooky23 3 days ago 2 replies      
The only use case I would be willing to commit to glacier would be legal-hold or similar compliance requirement.

The idea would be that the data would either never be restored or you could compel someone else to foot the bill or using cost sharing as a negotiation lever. (Oh, you want all of our email for the last 10 years? Sure, you pick up the $X retrieval and processing costs)

Few if any individuals have any business using the service. Nerds should use standard object storage or something like rsync.net. Normal people should use Backblaze/etc and be done with it.

Nexxxeh 3 days ago 6 replies      
The post is a useful cautionary tale, and he's not alone in getting burned by Glacier pricing. Unfortunately it was OP not reading the docs properly.

Yes, the docs are imperfect (and were likely worse back in the day). And it was compounded by the bug, apparently. But it's what everyone on HN has learned in one way or another... RTFM.

Was it mentioned in the article that the retrieval pricing is spread over four hours, and that you can request partial chunks of a file? Heck, you can always retrieve all your data from Glacier for free if you're willing to wait long enough.

And if it's a LOT of data, you can even pay and they'll ship it on a hardware storage device (Amazon Snowball).

Anyone can screw up, I'm sure we all have done, goodness knows I have. But at the very least, pay attention to the pricing section, especially if it links to an FAQ.

sathackr 3 days ago 0 replies      
This sounds a lot like demand-billing [1] [2] that's common with electric utilities, particularly commercial, and increasingly, people with grid-tied solar installations. [citation needed]

You pay a lower per-kilowatt-hour rate, but your demand rate for the entire month is based on the highest 15-minute average in the entire month, then applied to the entire month.

You can easily double or triple your electric bill with only 15 minutes of full-power usage.

I once got a demand bill from the power company that indicated a load that was 3 times the capacity of my circuit (1800 amps on a 600 amp service). It took me several days to get through to a representative that understood why that was not possible.

[1] http://www.stem.com/resources/learning

[2] http://www.askoncor.com/EN/Pages/FAQs/Billing-and-Rates-8.as...
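A toy model of the demand-billing structure described above, with illustrative rates (not any particular utility's tariff):

```python
# Energy is billed per kWh; a separate demand charge multiplies the month's
# single highest 15-minute average draw (in kW) by a flat demand rate.
def monthly_bill(kwh_used, peak_15min_avg_kw, energy_rate=0.08, demand_rate=10.0):
    return kwh_used * energy_rate + peak_15min_avg_kw * demand_rate

steady = monthly_bill(1000, 2.0)   # ~1.4 kW around the clock
spiky = monthly_bill(1000, 15.0)   # same energy, plus one 15-minute 15 kW spike
# steady is roughly $100, spiky roughly $230: fifteen minutes of heavy
# draw more than doubles the bill, as the comment describes
```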

lazyant 3 days ago 1 reply      
You don't have a backup until you test its restore.
captain_jamira 2 days ago 2 replies      
If one can download a percentage for free each month - 5% in this case - and the price of storage is dirt-cheap, then couldn't one just dump empty blocks in until the amount desired for retrieval falls under the 5% limit? In this case, to retrieve 63.3 GB one would upload 1202.7 GB more, for a total of 1266 GB, of which 63.3 GB represents just under 5%. There's no cost for data transfer in, and the monthly cost at $0.007/GB would be just $8.87. And that's just for the one month, because everything wanted would be coming out the same month.

Has anyone tried this or know of a gotcha that would exclude this?

And I realize that for the OP's situation, it wouldn't have mattered since he thought he was going to get charged a fraction of this.
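The arithmetic in this comment, spelled out (this assumes the free allowance really is 5% of average monthly storage, and that none of the gotchas the commenter asks about apply):

```python
# How much junk data to upload so that `want_gb` falls inside the 5% free tier.
def padding_needed(want_gb, free_fraction=0.05):
    total = want_gb / free_fraction   # storage at which want_gb is exactly the free 5%
    return total - want_gb

want = 63.3
pad = padding_needed(want)            # ~1202.7 GB of filler
monthly = (want + pad) * 0.007        # ~$8.86/month at $0.007/GB
```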

profsnuggles 3 days ago 1 reply      
Even with the large data retrieval bill he still saves ~$100 vs the price of keeping that data in S3 over the same time period. Reading this honestly makes me think glacier could be great for a catastrophic failure backup.
joosteto 2 days ago 3 replies      
If downloading more than 5% of stored data is so expensive, wouldn't it have been cheaper to upload a file 19 times the size of the stored data (containing /dev/urandom)? After that, downloading just 5% of total data would have been free.
kennu 3 days ago 2 replies      
Glacier is more comfortable to use through S3, where you upload and download files with the regular S3 console, and just set their storage class to Glacier with a lifecycle rule. I've used the instructions in here to do it: https://aws.amazon.com/blogs/aws/archive-s3-to-glacier/
limeyx 2 hours ago 0 replies      
So ... why not just upload an additional 60GB / 0.05 and then download the entire 60GB, which is now 5% of the total storage, for free?
slyall 3 days ago 1 reply      
I've had some big Glacier bills in the past, even the upload pricing has gotchas[1]

These days the infrequent access storage method is probably better for most people. It is about 50% more than Glacier (but still 40% of normal S3 cost) but is a lot closer in pricing structure to standard S3.

Only use glacier if you spend a lot of time working out your numbers and are really sure your use case won't change.

[1] - 5 cents per 1000 requests adds up with a lot of little files.

Zekio 3 days ago 1 reply      
Pricing should always be made straightforward and easy to understand, and that pricing plan is dodgy as hell.
detaro 3 days ago 2 replies      
Seems like "precise prediction and execution of Amazon Glacier operations" might be a niche product people would pay for (and probably already exists for enterprise use cases?)

That's something that generally keeps me from using AWS and many other cloud services in many cases: the inability to enforce cost limits. For private/side project use I can live with losing performance/uptime due to a cost breaker kicking in. I can't live with accidentally generating massive bills without knowingly raising a limit.

stuaxo 3 days ago 0 replies      
"The problem turned out to be a bug in AWS official Python SDK, boto."

My only experience of using boto was not good. Between point versions they would move the API all over the place, and, being Amazon, some requests take ages to complete.

After that I worked with Google APIs, which were better, but still not what I'd describe as fantastic (hopefully things have gotten better over the last 2 years).

tomp 3 days ago 1 reply      
Wouldn't it be better for the OP to simply upload 20 * 60GB (= 1.8TB) of random data, wait a month (paying less than 20 USD), and then download the initial 60GB within his 5% monthly limit?
sneak 3 days ago 1 reply      
This article claims that glacier uses custom low-RPM hard disks, kept offline, to store data.

Does s/he substantiate this claim in any way? AFAIK glacier's precise functioning is a trade secret and has never been publicly confirmed.

Pxtl 3 days ago 2 replies      
Considering the fact that bugs in the official APIs resulted in multiple retry attempts, he should demand some of his money back.
atarian 2 days ago 0 replies      
They should rename this service to Amazon Iceberg
Jedd 3 days ago 2 replies      
About a year ago NetApp bought Riverbed's old SteelStore (nee Whitewater) product -- it's an enterprise-grade front-end to using Glacier (and other nearline storage systems). It provides a nice cached index via a web GUI that lets you queue up restores in a fairly painless way. It even has smarts in there to let you throttle your restores to stay under the magical 5% free retrieval quota. It's not a cheap product, and obviously overkill for a one-off throw of 60GB of non-critical data ... but point being there are some good interfaces to Glacier, and roll-your-own shell scripts probably aren't.

As noted by others here, if you treat glacier as a restore-of-absolute-last-resort, you'll have a happier time of it.

Perhaps I'm being churlish, but I railed at a few things in this article:

If you're concerned about music quality / longevity / (future) portability - why convert your audio collection to AAC?

Assuming ~650MB per CD, and the 150 CDs quoted, and ~50% reduction using FLAC, I get just shy of 50GB total storage requirements -- compared to the 63GB 'apple lossless' quoted. (Again, why the appeal of proprietary formats for long term storage and future re-encoding?)

I know 2012 was an awfully long time ago, but were external mag disks really that onerous back then, in terms of price and management of redundant copies? How was the OP's other critical data being stored (presumably not on glacier). F.e. my photo collection has been larger than 60GB since way before 2012.

Why not just keep the box of CDs in the garage / under the bed / in the attic? SPOF, understood. But world+dog is ditching their physical CDs, so replacements are now easy and inexpensive to re-acquire.

If you can't tell the difference between high-quality audio and originals now - why would you think your hearing is going to improve over the next decade such that you can discern a difference?

And if you're going to buy a service, why forego exploring and understanding the costs of using same?
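The comment's back-of-envelope storage estimate, as plain arithmetic (same assumptions as above: ~650MB per CD, 150 CDs, ~50% FLAC reduction):

```python
# Rough size of a ripped CD collection before and after FLAC compression.
cds, mb_per_cd, flac_ratio = 150, 650, 0.5
raw_gb = cds * mb_per_cd / 1024      # ~95.2 GB of uncompressed CD audio
flac_gb = raw_gb * flac_ratio        # ~47.6 GB: "just shy of 50GB"
```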

LukeHoersten 3 days ago 0 replies      
Does anyone have a success story for this type of backup and retrieval on another service?
elktea 3 days ago 0 replies      
I briefly used Glacier for daily backups as a failsafe in case our internal tape backups failed when we needed them. The 4-hour inventory retrieval when I went to test the strategy, and the bizarre pricing, quickly made me look at other options.
pmx 3 days ago 2 replies      
I have a strong feeling that he would get a refund if he contacted Amazon support, considering it was caused by a bug in the official SDK and he didn't ACTUALLY use the capacity he's being asked to pay for.
natch 3 days ago 1 reply      
This is why I break my large files uploaded to Glacier into 100MB chunks before uploading. If I ever need them, I have the option of getting them in a slow trickle.
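A minimal sketch of that chunk-before-upload strategy (file naming is illustrative, and this is only the local splitting step, not the Glacier upload itself):

```python
import os

def split_file(path, out_dir, chunk_size=100 * 1024 * 1024):
    """Write `path` out as numbered pieces of at most `chunk_size` bytes."""
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    with open(path, "rb") as src:
        for i, chunk in enumerate(iter(lambda: src.read(chunk_size), b"")):
            part = os.path.join(out_dir, f"{os.path.basename(path)}.{i:04d}")
            with open(part, "wb") as dst:
                dst.write(chunk)
            parts.append(part)
    return parts  # reassemble later with e.g. `cat archive.bin.* > archive.bin`
```

Each archive can then be retrieved a few chunks per day, staying under the free allowance.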
kozukumi 2 days ago 0 replies      
For unique data you want super robust storage options, both local and remote. But for something as generic as ripped CDs? Why bother? Just use an external drive or two if you are super worried about one dying. Even if you lose both drives the data on them isn't impossible to replace.
prohor 2 days ago 0 replies      
For cheap storage there is also Oracle Archive Storage with 0.1c/GB ($0.001/GB). They have horrible cloud management system though.


cm2187 3 days ago 3 replies      
Perhaps a naive question but why would glacier try to discourage bulk retrieval? Is it because the data is fragmented physically?
forgotpwtomain 2 days ago 2 replies      
> I'd need more than one drive, preferably not using HFS+, and a maintenance regimen to keep them in working order.

I'm really doubting the need for a maintenance regimen on a drive which is almost entirely unused. Could have spent $50 on a magnetic-disk-drive and saved yourself hours worth of trouble.

JimmaDaRustla 3 days ago 0 replies      
Wow, thanks for this!

I currently have 100GB of photos on Glacier. I am going to be finding another hosting provider now.

jaimebuelta 2 days ago 0 replies      
You will ALWAYS pay more than you expect when you use AWS (and probably other cloud services). This case is quite extreme, but the way costs are assigned is complicated enough that it's easy to miss something at some point...
z3t4 2 days ago 0 replies      
I was looking at Glacier for my backups, but it seemed too complicated ... glad I didn't use it.

I ended up using some cheap VPSes, two of them located in two different countries. And it's still cheaper than, say, Dropbox.

random3 3 days ago 1 reply      
So depending on how the "average monthly storage" is computed you could get 20x more data in one month and then retrieve the 5% (previously 100%) that you care about for free, and then delete the additional data?
alkonaut 2 days ago 1 reply      
Curious: if you use a "general storage provider" (like glacier) for backup, rather than a "pure backup provider" (like Backblaze, CrashPlan) why is that?
NicoJuicy 2 days ago 0 replies      
Does anyone have a backup script for backblaze or a similar windows app like SimpleGlacier Uploader?
dalanmiller 2 days ago 1 reply      
So, what's the most cost effective way to download all your files from Glacier then?
pfarnsworth 2 days ago 1 reply      
If there's a bug in Amazon's libraries, can't you ask for a refund?
jedisct1 2 days ago 1 reply      
I don't get Glacier. It's painfully slow, painful to use and insanely expensive. https://hubic.com/en is $5/month for 10 TB, with unmetered bandwidth. A far better option for backups.
harryjo 3 days ago 1 reply      
Impressive that Amazon can choose to serve a request at 2x the bandwidth you need, with no advance notice, and charge you double the price for the privilege.
languagehacker 3 days ago 1 reply      
This is a simple case of spending more than you should have because you didn't understand the service you were using. It's impacted a little worse by how silly the whole endeavor is, given the preponderance of music streaming services.
otakucode 3 days ago 3 replies      
I'm surprised that the author had 150GB of Creative Commons audio CDs to begin with!
anonfunction 3 days ago 2 replies      
I don't like how the title and article read like a hit piece on Amazon Glacier. It's great at what it is intended for. In addition, it seems he still saved money, because over 3 years the $9 a month savings added up to more than the $150 bill for retrieval.

I'm surprised that this aspect has not been mentioned here in the comments yet:

> I was initiating the same 150 retrievals, over and over again, in the same order.

This was the actual problem that resulted in the large cost.

At my old job we would get a lot of complaints about overage charges based on usage of our paid API. The pricing wasn't as complicated as a lot of AWS services (just x req / month and $0.0x per req after that), but every billing cycle someone would complain that we overcharged them. We would then look through our logs to confirm they had indeed made the requests, and provide the client with these logs.

Six-Legged Giant Finds Secret Hideaway, Hides for 80 Years npr.org
493 points by sabya   ago   81 comments top 23
abraae 1 day ago 6 replies      
This story resonates with me.

Here in New Zealand, we have many native species of birds, insects, frogs, lizards and the like that thrived when our islands were cut off from the rest of the planet, but that have become extinct, or are in imminent danger of being so due to introduced predators such as rats, stoats, hedgehogs, ferrets, cats etc. etc.

It leads to the bizarre situation that conservation here is largely about killing things.

travis_brooks 1 day ago 0 replies      
Did a search to see what happened with the stick bugs and discovered the population is now large enough they're in zoos in San Diego, Toronto, and Bristol: http://www.9news.com.au/national/2016/01/13/16/52/revived-au...
oska 1 day ago 0 replies      
Wingsuit flyby of Ball's Pyramid:


brianclements 1 day ago 0 replies      
Reminds me of a Radiolab episode[1] about a similar effort to bring back a specific species of tortoise in the Galapagos islands. The offending species there were goats. What was really interesting was the method used for the eradication program.[2]

[1] http://www.radiolab.org/story/brink/

[2] https://en.wikipedia.org/wiki/Judas_goat

theophrastus 1 day ago 0 replies      
The old accepted view was that terrestrial arthropods are limited in size by their system of respiration, which amounts to exoskeletal pores (spiracles) and internal airways. So terrestrial bugs can't get much bigger because of the body volume (cubic) versus passive respiratory surface (square). (We've got a forced air/blood circulation system, so we can get bigger.) Consequently it would be fascinating to learn how close Dryococelus australis is to that theoretical limit, or has it developed some sort of active respiratory system? (In which case, maybe it's time to start engineering insect saddles?)
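The square-cube argument above, in numbers: scale an insect's linear dimensions by a factor k, and oxygen demand (roughly proportional to volume) grows as k cubed while passive respiratory surface grows only as k squared.

```python
# Surface-to-volume ratio falls as 1/k, so passive respiration eventually
# cannot keep up with metabolic demand as a body plan is scaled up.
def surface_to_volume_ratio(k):
    return k**2 / k**3   # proportional to 1/k

ratios = [surface_to_volume_ratio(k) for k in (1, 2, 4)]  # [1.0, 0.5, 0.25]
```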
clarkmoody 1 day ago 0 replies      
sohkamyung 1 day ago 0 replies      
The story of the Lord Howe Island stick insect is nicely told in this award winning short animation, "Sticky" [1].

There is also a book out now on the insects [2].

[1] https://vimeo.com/76647062

[2] http://www.publish.csiro.au/pid/7226.htm

gherkin0 1 day ago 1 reply      
Are they ever going to go back to the island and collect more specimens for breeding (and perhaps release some captive-bred individuals to replace them)? Even though that population was probably extremely inbred, there's probably still some genetic diversity there that wasn't represented in the two wild specimens they managed to breed.
gulpahum 1 day ago 1 reply      
Are those edible? They look like a good replacement for shrimps. Nothing would guarantee their survival better than becoming a food source for humans!
cmpb 1 day ago 1 reply      
I'm a southern Louisiana (US "Deep South") native. Here, and in other areas of the south (and elsewhere in the world), we have similarly large "cicadas", which are basically giant crickets. They hatch once every 13 years (shorter than most other cicadas, which hatch every 17 years) [1]. Though the hatches produce huge numbers (sometimes causing areas of road to be literally covered and obscured), and can sometimes be a nuisance because of their sound, they play a very integral role in the ecological cycle and contribute to a very diverse system of plants and animals which many people around here take great pride in. I hope that the residents of Lord Howe Island can learn to live with some new friends.

[1] https://en.m.wikipedia.org/wiki/Cicada

reustle 1 day ago 2 replies      
This needs a (2012) in the title
eddiegroves 1 day ago 2 replies      
All those eggs from one breeding pair, do insects not need gene diversity for the population to succeed?
vblord 1 day ago 0 replies      
I remember this being posted a year ago. I remember reading it and being grossed out that it was a giant bug and not some sort of six-legged bear. Why that guy would ever touch that thing is beyond me.


vortico 1 day ago 0 replies      
It's interesting how this story has a true purpose: to convince the residents of Lord Howe, by the pressure of the article's readers, to allow the insects to be released on their land. Ball's Pyramid itself is perhaps even more fascinating to me. It looks like a good place for a wizard to live...
JoeAltmaier 1 day ago 0 replies      
Once humans spread and travel became commonplace, it's inevitable that 'invasive species' should become a worldwide problem. It only has to happen once; you can't ever be careful enough to avoid that. And the most invasive species of all, which erases whole ecosystems and changes everything including the water, soil and weather, is of course us.
trusche 1 day ago 0 replies      
thescriptkiddie 1 day ago 2 replies      
> It was 12 centimeters long and the heaviest flightless stick insect in the world.

So there are bigger insects around, and they can fly?

michaelcampbell 1 day ago 2 replies      
Those things scare the bejeezus out of me (irrationally, I'll grant), but these stories give me hope, if at least a little.
stuart78 1 day ago 0 replies      
That video. I'm traumatized, yet I can't look away.
ed_blackburn 1 day ago 1 reply      
How do they propose to rid Howe Island of rats?
YeGoblynQueenne 1 day ago 0 replies      
niels_olson 1 day ago 1 reply      
Don't pitch the bugs. Pitch their shells. As buttons or jewelry.
cconcepts 1 day ago 2 replies      
If only everything on HN had a wingsuit flyby
The NICAM Codec replacement project bbc.co.uk
506 points by dtf   ago   99 comments top 17
linker3000 12 hours ago 3 replies      
That's some decent BBC engineering from a team that works away quietly in the background and just makes things happen, or keeps them happening. I don't work in that field, but like many people in my profession (systems and networking), I also tend to be invisible until something breaks, and then I am the fall guy if it's not fixed within a heartbeat.

/I was half expecting the article to outline how a box full of 80s technology had been replaced with a Raspberry Pi!

king_magic 12 hours ago 2 replies      
This is one of the most interesting things I've read on HN in weeks. Awesome to see this kind of success story, and it's super fascinating to get insight into how radio works behind the scenes.

I'll be honest, I had no idea all of this goes into radio. I thought there were just, err, towers?, that broadcast... radio waves, and that was kind of it? I mean, I knew on some level it had to be a tad bit more complex than that, but you never see or hear about it.

pjc50 12 hours ago 2 replies      
I've always been impressed by the BBC tech people I met, doing a high-quality job in an extremely bureaucratic neophobic environment.

> As part of making the unit, we had to ensure it complied with all the relevant European directives so that it could be CE marked. We had the Codec tested for both safety and EMC.

Interesting, I thought this wasn't required for internal use; probably the BBC corporate structure means it no longer counts as "internal".

On "Radio Teleswitch":

> [Droitwich] transmitter uses a pair of obsolete metre-long valves which are no longer manufactured anywhere in the world. [1]

> In October 2011, the BBC admitted that the Droitwich transmitter, including Radio 4's longwave service and Radio Teleswitch, will cease to operate when one of the last two valves breaks, and that no effort would be made to manufacture more nor to install a replacement longwave transmitter.

This is like the Domesday Book (1980s interactive multimedia system on Laserdisc): high-tech and sui generis at the time it was built, but eventually uniquely obsolete.

The BBC itself is very much a valve-and-laserdisc organisation in a Google world. We huddle round the warm glow, concerned that eventually some vital element will give out due to lack of money and the whole enterprise will give up.

Periodically someone comes up with the idea of shutting down analog FM radio. This is politically inconceivable as half the country has ancient radios that were tuned to Radio 4 in 1967 and have never played any other station (an exaggeration, but only a small one).

darkr 13 hours ago 1 reply      
> In addition, the signalling information for switching Economy 7 electricity meters is carried on Radio 4 LW.


Fascinating stuff.

jhallenworld 12 hours ago 2 replies      
I can see from the photos that it's using this board: http://zedboard.org/product/zedboard

I'm surprised BBC is not only willing to make custom hardware, but custom RTL design as well.

I'm curious what the failure rate of the FPGA will be; FPGAs are more susceptible to soft errors than CPUs or ASICs. Maybe the BBC will fund making a custom ASIC :-) Well, I see two systems; maybe they are redundant that way.

Why not make the project open source? Put it up on github.

jacquesm 13 hours ago 5 replies      
Very nice work. What is the rationale behind doing a bespoke design rather than sharing the development cost with all the European broadcasters? Is NICAM unique to the UK? What do other broadcasters use for similar services?
rplnt 12 hours ago 0 replies      
Somewhat related, a few years ago I saw a video of a guy starting up a "live TV studio" from the 70s/80s, I think, at his cottage/cabin. It was quite interesting, unfortunately not in English. I'll try to find it anyway and update this post.

edit: Here's the video I had in mind https://www.youtube.com/watch?v=51F7zNWqlgM I guess it's boring without understanding the audio :)

There's more on his channel

daveguy 10 hours ago 0 replies      
I appreciate a good success story about major transitions / upgrades / process changes / etc. Those situations where there is usually SOME glitch and you have to engineer the hell out of it to get a perfect transition. The only reference to testing was "We had the Codec tested for both safety and EMC." Was that in house or external? Surely there was some major transition testing, low-power broadcast tests, etc. to make sure everything went smoothly. Or did they just get really lucky? Seems like there's more to this story.
CaseFlatline 11 hours ago 0 replies      
Truly an impressive feat:

"When you do things right, people won't be sure you've done anything at all." - Futurama

ssapkota 13 hours ago 0 replies      
That's awesome! People easily overlook these things. They hardly realise that it's the same as upgrading an aircraft in midair.
jnbiche 11 hours ago 1 reply      
I read the article with interest, waiting with anticipation for them to announce the resulting hardware and software had all been open sourced. And then...nothing.

I'm surprised, I thought in the UK these kinds of taxpayer-funded projects were required to be open source.

In any case, it's too bad. I was not familiar with the NICAM codec up to now, but I'm sure that software and hardware plans would have been very useful to some developing countries, many of whom apparently also use NICAM in their state-owned broadcast companies.

thrownaway2424 12 hours ago 1 reply      
Will be interested to read the followup article in the year 2048 regarding whether these new boxes lasted as long as their predecessors.
ihsw 12 hours ago 4 replies      
Honest question -- why are they using PATA ribbon cables in the new NICAM codec box? Surely SATA would be more reliable and easier to administer.
gpvos 10 hours ago 0 replies      
What I missed in the article is whether there was an audible glitch at 4:15 AM.
acheron 12 hours ago 9 replies      
Really interesting article, but terrible clickbaity title. I wouldn't have read it if there weren't so many positive comments on HN already.

It could work if there were a subtitle. "35 million people didn't notice a thing: BBC Radio's NICAM Codec replacement project". I dunno. But it is (or should be) very important that the title actually informs you as to what the article is about. "35M people didn't notice a thing" doesn't tell you anything at all.

Edit: Upon further consideration I may have been a little harsh. This is really a blog entry more than an "article" per se, and I suppose that's fine for a blog post title. The problem comes when we link to it from somewhere else, e.g. HN; we need the additional context.

javi2601 7 hours ago 0 replies      
good !!!!!!!!
TheAppGuy 11 hours ago 0 replies      
Top on HN, but I can see why. The title tempts you to click through. :)
The Rails Doctrine rubyonrails.org
491 points by robin_reala   ago   353 comments top 42
sauere 1 day ago 14 replies      
Funny, everything in this post is exactly why I prefer Python (+Django/Flask) over Ruby (+Rails). Too much magic happening everywhere, a gazillion built-in methods, weird shit happening all over the place, all combined with all the syntactic sugar Ruby offers.

Granted, it might look "beautiful" in the eyes of an experienced RoR developer, but personally I find it just makes code very hard to read. Just my 2 cents.

vinceguidry 1 day ago 7 replies      
I submit that anyone who takes the time to really learn Rails and how it works and the problems it solves will never want to use another web framework except for toy projects. It really is that good, and just keeps getting better every year.

It does one thing, web applications, and does it really, really well for the type of coder / team it was designed for.

People that belittle it by saying it has too much magic or whatever have never seriously tried to maintain a huge web application all by themselves. Web apps are ridiculously, mind-bogglingly complex, Rails conventions have been baking for well over a decade now.

If you're Twitter, maybe you have the resources to maintain a serious web presence without Rails. Everyone else is just handicapping themselves.

attilagyorffy 1 day ago 0 replies      
I've been a Rails developer for 10 years now and I have to say it is still my favourite tool of choice (yes even in 2016). There are other interesting tools and frameworks out there (I'd love to put my hands on Phoenix and Elixir if I had the time) but for now I have to be honest: When I start a new client project I have to consider a few things:

* Using a well-known framework is favourable over new shiny toys in a commercial system

* An ecosystem that has good 'defaults' is essential. A single web framework won't do everything for you. You need stuff around that for testing, deployments, etc.

For mainly the reasons above Rails is still my primary tool of choice. Yes, it has pinpoints but the reasons above far outweigh the new and shiny.

equalarrow 1 day ago 2 replies      
Couldn't agree with DHH more.

I found Rails during the 0.9 releases. Coming from the horrific world of Java web frameworks like Spring and Tapestry, it was a revelation to see how fun and productive web programming could be again.

When I saw things like 2.times, 1.day.ago, and ActiveRecord, I knew I was done with Java web development. This isn't a knock against Java per se, but I just spent so much time in the trenches having to do busywork and configuration that isn't necessary if authors of those frameworks put programmer productivity and enjoyment first.
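Expressions like 2.times and 1.day.ago read like English because Ruby's core classes are open. A minimal sketch of how such sugar can be defined; this mimics the idea only and is not ActiveSupport's actual implementation:

```ruby
# Integer#times is core Ruby; ActiveSupport adds the calendar sugar by
# reopening core classes. A toy version of that idea:
class Numeric
  def days
    self * 86_400        # a duration, in seconds
  end
  alias day days

  def ago
    Time.now - self      # a Time this many seconds in the past
  end
end

2.times { |i| puts "hello #{i}" }  # core Ruby
yesterday = 1.day.ago              # now works, via the patch above
```

The same open-class mechanism is what critics call "monkey patching"; whether it delights or horrifies you is much of the Ruby-vs-Java divide this comment describes.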

Thank you DHH!

edwinnathaniel 1 day ago 0 replies      
I recently wrestled with the idea of using NodeJS for any side projects. I touched Ruby/Rails a few times in the past (on and off between small projects) and never really felt excited or saw it as something "beautiful", because at the end of the day we developers still have to debug our own buggy code.

After a few strings of drama in the NodeJS land, a few chat with friends who used NodeJS in production, and seeing the pattern of BigCo who use NodeJS only for glue/front-end/gateway appserver, I'm pretty close to settle with Rails.

I admire DHH for his dedication to the framework for so long (since 2004). You don't see that type of dedication in the NodeJS land. Yes, NodeJS is still new, but a few BDFLs/leaders of important NodeJS projects have left the boat already. One of the premier NodeJS vendors, StrongLoop, was always mired in controversy (I happen to know a few people who worked for a company that was acquired by StrongLoop).

Building long-term side-project requires a stable platform because of the time limit (outside office hour + other responsibilities). Dealing with unstable but "sexy" technology is not a good choice.

jaredcwhite 1 day ago 0 replies      
I think DHH has realized that Rails has a bit of an "image" problem when it comes to new web app development, and this is a way to lay out why Rails remains relevant and even desirable. No doubt some will take exception to some or most of his points, but that's OK. For the rest of us who thrive in continuing to build with and evolve the Rails ecosystem, it feels good to see a renewed focus on communicating just what makes Rails (and Ruby) special and vital.
bcardarella 1 day ago 3 replies      
For those that are interested in a more performant and scalable Rails-like experience I encourage you to check out Phoenix: http://phoenixframework.org

We have been working with Elixir (the language Phoenix is built on) for over a year. It is an incredibly satisfying language and its toolset is surprisingly advanced for its age. Phoenix has been a very easy mental jump for our developers who were already trained on Rails.

criddell 1 day ago 11 replies      
Now that Rails isn't the cool new thing anymore, is it still a good choice for new projects?

For the new year, I'm thinking about the programming landscape and contemplating what I want to learn next. I've tinkered with Ruby on Rails in the past and think DHH is on to something with the doctrine he has laid out. I've only scratched the surface of Rails, but it still seems pretty neat to me. A deeper dive would be fun.

On the other end of the spectrum, Dart + Angular is appealing in part because I'm a fan of some of the Google developers working on Dart. While Rails feels like it is entering a comfortable middle age (or at least adulthood), Dart feels a little like a car being assembled as it cruises down the interstate. It's new and maybe a bit dangerous.

Both projects are super-appealing to me. Unfortunately time constraints prohibit me from chasing both and I'm interested in opinions especially of Rails people. Are you still excited about it?

rileymat2 1 day ago 2 replies      
About picking the "Best" tool for the job. One annoying thing about Stack Overflow is that they aggressively stop those types of questions with their mod system. Just understanding the Pros and Cons of the major options is very valuable, but not allowed.
elliotec 1 day ago 1 reply      
I was just singing praises to Rails last night for these reasons.

At my corporate job, we use Java with Spring, and a hellish mess of node packages/libraries/whatever we're calling them now. It's not good.

So when I do side projects or consulting gigs, I always run to Rails which is what got me into development. It's very refreshing.

Reminds me of the jQuery post the other day. Rails is just a little older than jQuery. Rails with some jQuery sprinkled in is as close to development perfection as I've ever seen.

Also re: "too much magic" - Why is magic a bad thing? Creating layers of abstraction to ease building is what programming is really about, right? Why would we want to go backwards to configure everything under the claim of "I need to know how EVERYTHING works!"?

batiste 1 day ago 2 replies      
There is great stuff to say about Ruby and RoR. But I have not yet experienced the "smile" that DHH is describing, especially when I have to read and understand code written by colleagues (or myself, mind you). To do my job I first had to become a master of grep, to find where stuff comes from, and to decipher half-assed documentation, mostly because exploring objects with hundreds of methods is not my cup of tea. I also enjoy understanding my tools, and RoR is, in my experience, not particularly easy to understand. This doesn't put a smile on my face.
cballard 1 day ago 3 replies      
> Where Java championed forcefully protecting programmers from themselves, Ruby included a hanging rope in the welcome kit.

I don't think this is something to be admired in a language. Java is a bad language to work in because it's not expressive, not because it protects programmers (which it doesn't).

Thankfully, we are seeing this shift the other way with the growing popularity of languages like Rust and Swift.

ronnier 1 day ago 2 replies      
>Optimize for programmer happiness

I've never been more unhappy programming than when I touch Rails. I'll take Python or Java any day of the week.

>Convention over Configuration

Discoverability is totally gone. Walking into a brand-new RoR app, it's next to impossible to figure out what's going on. Not having a good IDE doesn't help.

>Exalt beautiful code

Write simple easy to read code. Don't be clever.

ianamartin 1 day ago 1 reply      
My background is Python, C#, and SQL, and it shows because those languages have affected the way I think about programming.

I strongly disagree with DHH's points 2,3,4, and 6. But I think I'm going to learn the language anyway and use it for some projects because I think it's important to not get in a rut. Using different languages changes the way you think about code, and if you are not constantly updating the way you think about code, you are falling behind.

Besides, I really like the way DHH writes. His language is persuasive even when I quite certainly disagree with him.

Still going to learn a functional language first probably. But then Ruby/Rails.

tim333 1 hour ago 0 replies      
I like the philosophy of optimizing for programmer happiness even if it's imperfectly implemented. I think those kinds of ideas have more power than they are commonly credited with.
norswap 2 hours ago 0 replies      
I was hit hard by the cognitive dissonance at the start of section 2.

Just after waxing in section 1 about how Rails is designed for his own happiness (including adding quirky methods like 'fifth'), he says: "One of the early productivity mottos of Rails went: You're not a beautiful and unique snowflake"

Luyt 1 day ago 1 reply      
I transcribed an interview with DHH [0] in which he also explains his motivations for choosing Ruby (instead of PHP or Java, for example), and where he explains some design principles for Rails.

[0] http://www.transcribed-interview.com/dhh-rails-david-heineme...

"One example I always pull out is what you gonna call the primary key in your tables? When I was working with PHP and Java, every single shop, almost every single application, would have its own naming scheme. Some would say they have the Products table and then they'd have productid, others would have product_id, some people would have prod_id or p_id or P_id, and every time somebody made a new design decision it meant configuration. You now have to tell your models, your objects, how they're going to talk to this database table. Because it needs to know what the hell you called the freaking primary key column, and it just doesn't matter! Who cares what the primary key column is called? It just doesn't matter. It's going to have zero impact on the usefulness of your application."


"Dont repeat yourself is all about not having the same intentions spread out in multiple places. Don't have one configuration. If you're calling something, let's again take the example of the primary key. If you're calling that for id you shouldn't have to configure that in three different places that all have to work together and all have to be changed together. You should just pick one authoritative place to have that information stored. And then you can make changes from there. It also goes with the the whole Ruby idiom: we don't want those Java boilerplate ten line things: that's repeating yourself. If you have the same idiom, if you have the same intentions, that should really be exceedingly a short expression. And that goes up throughout the entire framework. Just keep one place to change those things, and keep the idioms very short."

jerguismi 1 day ago 0 replies      
It is funny how small things affect your future. I did a lot of stuff with rails back in 2007 or so. However I really got fed up with it because some things were so slow. So for the next project I decided to pick up django, and I've been using django since then.

Would it have made much difference if I had stayed with Ruby/Rails instead of Python/Django? I dunno. Many of my friends have nice careers with Rails. I have a nice career as well, as do many other friends with Django. In the end they are not that different, I guess.

qertoip 1 day ago 0 replies      
If you like the Rails Doctrine, check out Phoenix Framework and Elixir language by the former Rails Core Developer Jose Valim. You'll get hooked as you did with Rails 10 years ago.
resca79 1 day ago 0 replies      
I like ruby and rails, not just for the framework itself, but for what it was and it is for many developers.

I want to say thanks to DHH for keeping the framework simple, despite the many PRs that try to make Rails more complex.

After this document, I'd also like to read something about standards in big Rails apps.

jaequery 1 day ago 2 replies      
Sigh, it's hard to take any of these comments seriously without first understanding how much real-world experience they actually have in the industry.
TheMagicHorsey 1 day ago 0 replies      
Mature Ruby projects are very difficult for a new Ruby programmer to parse and understand. In contrast, Go projects are quite easy to understand.

DSLs are part of the reason for this.

Naturally there is a tradeoff. People that have been working with a particular DSL in Ruby for a long time can achieve amazing things in a short amount of time. Equivalent productivity is not really possible in Go (at least not yet, in my experience).

Having said that, I think it is easier to onboard new (and newbie) programmers onto a Go project than a mature Ruby project.

AKifer 1 day ago 0 replies      
Having been working with Rails for a long time, now I wonder what the next cool thing could look like. The only improvement I'd like to see is in performance, and Elixir and Crystal are good candidates to replace Ruby in this direction. But until now, I don't see any other framework equaling the elegance of Rails. Great job David and all the community; let's continue to keep Rails weird.
reaction 8 hours ago 1 reply      
Can someone describe their experience with using rails as a back-end JSON API for javascript apps (flux/react or with angular/ember)?
jesstaa 1 day ago 0 replies      
9. Monkey patching instead of good design
10. Don't worry about data integrity
euphemize 1 day ago 1 reply      

> This is easily described with a contrast to Python. [...] Ruby accepts both exit and quit to accommodate the programmer's obvious desire to quit its interactive console. Python, on the other hand, pedantically instructs the programmer how to properly do what's requested, even though it obviously knows what is meant (since it's displaying the error message).
When you're trying to convince me that your programming language is great because another one is so much shitter in comparison, you're on the defensive and basically telling me you don't trust your own stuff. Especially when this is at the top of your article.

By the way, Python handles your example very gracefully in comparison to other REPLs (for instance, Node spits out a cryptic, but expected, traceback on exit/quit).

windor 20 hours ago 0 replies      
I think this should be post here: https://dockyard.com/blog/2015/11/18/phoenix-is-not-rails

It's very interesting to see such different philosophies.

miseg 1 day ago 0 replies      
I know PHP-bashing is oldschool, and I love what I've been able to do with PHP.

After a decade in PHP, his love for Ruby is a lovely selling point for trying out that language.

bliti 1 day ago 1 reply      
Bit off-topic here:

How is the Rails market these days? It seems (personal perception of outsider) that it has been replaced by Python/JS work.

sadiqmmm 1 day ago 0 replies      
DHH is amazing.... Thank you for creating RAILS.
runjake 1 day ago 0 replies      
I think I can boil down all the debates in the comments:

Some people prefer the Rails way of doing things. Others prefer something else. Nobody is wrong.

tarikjn 1 day ago 0 replies      
My tool of choice today is Clojure, but I have to thank RoR for teaching me a lot of sensible conventions that I use in Clojure nowadays.
aikah 1 day ago 2 replies      
> 7. Progress over stability

No real ecosystem can be built on Rails since its APIs keep on breaking.

That's why, in my experience, most people give up on Rails.

Stop breaking APIs, introduce stability, and maybe the framework will be successful again. People don't give up on Rails because of Ruby, or because it's bloated or slow, but because it became an unmaintainable, non-upgradable mess as time passed.

keredson 1 day ago 2 replies      
One thing I hate about Rails is writing the database migrations. (I thought: "why not just define what the schema should be, and let a diff-like tool figure out the update statements?") So that's what I wrote:


been using it in production for ~1y now.
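The "declare the schema and diff it" idea can be sketched as: compare the desired column set with the current one and emit the DDL needed to converge. A toy illustration only; real tools (including, presumably, the commenter's) handle type changes, renames, indexes, and ordering:

```ruby
# Toy schema differ: given the current and desired columns of a table
# (name => SQL type), produce the ALTER statements to migrate between them.
def diff_schema(table, current, desired)
  stmts = []
  (desired.keys - current.keys).each do |col|
    stmts << "ALTER TABLE #{table} ADD COLUMN #{col} #{desired[col]}"
  end
  (current.keys - desired.keys).each do |col|
    stmts << "ALTER TABLE #{table} DROP COLUMN #{col}"
  end
  stmts
end

diff_schema("users", { "id" => "integer" },
                     { "id" => "integer", "name" => "text" })
# => ["ALTER TABLE users ADD COLUMN name text"]
```

The appeal over hand-written migrations is that the schema file stays the single source of truth, in the spirit of "don't repeat yourself".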

ryan-allen 1 day ago 0 replies      
I personally prefer ASP.NET MVC, it has all the core parts of early Rails (back when it was nice and simple), and it's statically typed to boot.

The most magic thing in .NET MVC is Entity Framework, and all its magic is either a bunch of strongly typed convention classes (of which you can create your own), or LINQ queries, which work pretty well.

If I'm building stuff for Adults with Money I'll choose static typing any day.

tomphoolery 1 day ago 1 reply      
whoa what the hell is ApplicationRecord?
of 1 day ago 0 replies      
s/so its only fitting/so it's only fitting/
Ologn 1 day ago 2 replies      
> Progress over stability

> We have to dare occasionally break and change how things are to evolve and grow.

> ...harder to swallow in practice. Especially when it's your application that breaks from a backwards-incompatible change in a major version of Rails. It's at those times we need to remember this value, that we cherish progress over stability...

I encountered this doctrine first hand. A division of a company I worked at had its website rewritten in RoR by some outside consultants. Even brand new, it was a mess of Ruby gems with conflicting dependencies, and it was undeployable. The build system was a train wreck as well. Some of the main pages of ecosystem web sites like rubygems.org were not available - the sites had "progressed", and when you clicked on a link, the page explaining anything had disappeared. I mean hell, go to rubyforge.org this minute (1:15 PM EST) - it is down, of course. The blog of the author of what was then the main web-serving gem did not inspire confidence.

As I would have to maintain it (as a sysadmin, not a programmer), I refused to deploy the system, since it was an unreproducible mess of conflicting gems, gem versions, and kludges. They decided to pay the RoR consultants additional money, despite them having never delivered a working system. By the time I left, it still had never been deployed. The division closed dozens of offices soon after, and then the division shut down - perhaps not completely related, but probably somewhat related to this project.

Meanwhile, the company's other divisions had decent programmers programming for Java application servers, and that code and those servers were much more solid. Builds rolled out easily (and could easily be rolled back if need be). It was really night and day. It makes sense this contempt for stability is explicitly part of their doctrine.

xyzzy4 1 day ago 4 replies      
The main problem with Rails is that other developers you work with have strong opinions about coding style, and they hammer you on code reviews if you don't code their way.
of 1 day ago 0 replies      
Push up a big tent? David Oscarmeyer Heineken is gross! :P
kaonashi 1 day ago 3 replies      
The worst thing about rails these days is ruby itself. If I'm going to use a non type-safe language, I may as well use the one that's on the client side as well (and has first class functions).
vudu 1 day ago 1 reply      
Everyone who has created their own open source framework which supports a multi-million-dollar business that now supports your hobby of driving million dollar race cars, please feel free to cast stones at what rails is and has become.
Thanks for Trumpet Winsock thanksfortrumpetwinsock.com
407 points by wooby   ago   116 comments top 34
peckrob 23 hours ago 9 replies      
Oh man, the memories.

For people who have grown up in a hyper-connected always-online world, it's hard to explain the pure joy of hearing the sound of your computer picking up the phone and sending those tones [0]. Because it meant going from isolated, disconnected and unitary to being part of a wider world.

Suddenly, everything was at your fingertips and it was intoxicating to me as a teenager. Fire up Trumpet Winsock and dial into the local mom and pop ISP. Suddenly you're surfing the early web using Netscape. Or open up WinVN and read some newsgroups. Or spend way, way too many hours playing MUDs (seriously, I think I spent almost every night MUDding during my teenage years).

Or learning cool HTML tricks by looking at the source of a page (back when pages were simple and you could tell things by looking at the source). Some of my earliest exposure to "programming" was because I wanted to make cool web things on my 1mb of ISP provided web space.

So yes, thank you Trumpet Winsock. Without you my formative years would have been very different and I likely wouldn't be in the career I'm in now.

[0] http://www.windytan.com/2012/11/the-sound-of-dialup-pictured...

kazinator 20 hours ago 3 replies      
> Do you remember connecting to the Internet in 1994 or 1995?

In 1993 I was already using Linux, with an actual TCP/IP stack, not some bolted-on thing. In 1994 I was doing contract work on Linux already. One of the jobs was for these guys, still chugging along:


They employed a group of full-time people who continuously gathered new information about mining prospecting going on around the world, stuffing it into a database. This was turned into periodically refreshed web pages, for which subscribers could "click to pay". I hacked the CERN httpd to lock the click-to-pay data, and whipped up a billing system for invoicing customers. (Spat out TeX -> dvi -> laserjet: most beautiful invoices anyone ever got for anything.) I made a nice visual control menu for the whole system using a C program and ncurses, and even Yacc was used on the project for something.

One of the genius programmers on the database side claimed that "OMG, Linux causes data loss", because when the hundreds of megs of generated HTML was copied over to the servers (Linux ext2 FS), the disk usage was way lower than on the FAT. Haha!

In 1995 I got an Asus motherboard with two Pentium 100 processors, and ran Linux 1.3.x with early SMP support (big kernel lock heavily used). make -j 3 was only 27% faster than make.

sengork 23 hours ago 0 replies      
I wonder how much of his programming know-how could be attributed to the high school curriculum.

In the 1970s Tasmania was the best equipped Australian state for computer based subjects. A lot of the schools had terminals to a central computer[1]. Buses, I/O devices and assembler topics were covered as early as year 9 levels.

[1] https://en.wikipedia.org/wiki/RSTS/E

jeff_marshall 23 hours ago 2 replies      
I have fond memories of the transition from local BBSes (my parents were annoyed enough by my constant phoning of the local BBS to download commander keen and the like), to IP connectivity via Trumpet. The reach of the internet (esp IRC!) was mind-blowing for someone living in a relatively isolated community in Alaska at the time.

I feel like I don't fully appreciate the gradual transition from dial-up and Trumpet to LTE and a supercomputer in my pocket :) I wonder what people born today will experience that has as great of an impact.

jacques_chester 23 hours ago 2 replies      
When I set that site up in 2011, it was really heart-warming how many people rallied around and chipped in.

It's doubly nostalgic to see it here again, 5 years later.

Edit: and there's still room on that donors page for any companies wanting to chip in something substantial.

Edit 2: 5 years, not 4.

matthucke 21 hours ago 2 replies      
In 1995 I was an expert in setting up Trumpet Winsock, paid to consult on its installation and configuration - even though I had never once installed it myself.

That is to say, I was a tech support lackey, answering the phones and talking to dozens of dialup ISP users daily.

It was a small company, and of the three techs there, none of us were Windows users - two Linux, one Mac. Someone had helpfully printed screenshots of Winsock's various dialog boxes and taped them up around our cubicle. It was enough.

SwellJoe 23 hours ago 1 reply      
I was using an Amiga with the Miami TCP stack back then, which I paid for. My first Windows machine had Windows 95, which had networking built-in. But, I'm happy that some folks have made good on their shareware obligations. Writing software was a lot more difficult back then...I sometimes can't believe anything ever got done before we had the Internet to research things (and I know I personally was a much less effective developer before the Internet).
acqq 17 hours ago 0 replies      
Here's the author's story, the making of:


Hint: from scratch, reading RFCs, in Turbo Pascal, as a part of his internet newsreader project!

Also, http://petertattam.com is down currently, but



cyanbane 23 hours ago 1 reply      
Color boxes, phreaking, ASCII art groups, zips broken up at 1.44MB, LORD and other door games. That was my youth. Donation sent.
dsr_ 15 hours ago 0 replies      
And also thanks to Russell Nelson, who maintained the best collection of ethernet card packet drivers for many years -- if you wanted to connect a DOS machine to an IP ethernet network, that was your best option. Probably still is.

Looks like he still has that up at http://www.crynwr.com/drivers/

gethoht 22 hours ago 0 replies      
Donated $50 now that I have the cash. I did not have the money back when I was 12 and first getting into computing. Cheers to winsock.
cannam 17 hours ago 1 reply      
In 94 I got a job (my first in London) at a small company that made software with and for SGI workstations, and despite all this computational power they still used a 386 with Trumpet on Windows for their only internet connection.

It would dial up a few times a day to exchange email using Demon's inbound SMTP (tenner-a-month account!), or one could laboriously route through it if one really needed something specific.

In summer 1995 they replaced it with an ISDN line.

dankohn1 16 hours ago 0 replies      
My startup [0] conducted the first secure commercial transaction on the web in 1994. I have strong memories of taking people on the phone through the many steps of downloading Trumpet Winsock via ftp from Australia so that they could then install the NCSA Mosaic web browser. Thanks, Peter, for your essential work.

Here's a short video [1] Shopify released last month about the transaction, where I reference how hard it was at the time to get online.

[0] http://www.nytimes.com/1994/08/12/business/attention-shopper...[1] https://www.youtube.com/watch?v=eGyhA-DIYvg

rmason 23 hours ago 0 replies      
I met Peter at BBSCon down in Tampa in 1995. A really humble guy and truly one of the net's pioneers.
marpstar 21 hours ago 1 reply      
My absolute earliest memory of going on the internet was my grade school librarian firing up Trumpet Winsock on some Windows 3.1 machine when I was in second grade (circa 1995). He navigated to nfl.com and then printed the website out.

I remember thinking "this is pointless" but went on to build my first web pages only a few years later (4th or 5th grade).

korginator 13 hours ago 0 replies      
My first experience with Trumpet Winsock was on a small project where the computer talked to some state-of-the-art (at the time) network-connected data acquisition devices. Coming from a Unix world at the time, the whole Windows ecosystem, and especially its networking, felt stone age, ridiculously buggy and error prone. It quickly drove us back to the old SunOS and Silicon Graphics Irix workstations.
jlgaddis 20 hours ago 5 replies      
I'm trying to remember "the" big FTP site back then. There was one in particular that was the "go to" site for, well, pretty much everything.

ftp.cdrom.com, metalab.unc.edu, sunsite.something, ...

tomcam 19 hours ago 0 replies      
I paid $25 around 1993 but I sure got my money's worth and then some--so tonight I kicked in another $38 gratefully.
petercooper 16 hours ago 0 replies      
I remember mine and my dad's first confusions at using the Internet. How could you do multiple things at the same time? Obviously, we were used to how BBSes worked and had no idea of TCP/IP at the time.. :-)
sangnoir 19 hours ago 0 replies      
Moderators: shouldn't the title have a (2011) at the end? That is when it was authored.
Sami_Lehtinen 15 hours ago 0 replies      
I really loved the Trumpet Winsock debug mode, which clearly showed packet types, SYNs, ACKs and other details. Ever since then I've been familiar with IP networking.
aheilbut 21 hours ago 0 replies      
I always thought Netmanage Chameleon worked better.
bks 22 hours ago 0 replies      
I ran an ISP that used trumpet, before Windows 95 and it was way better than my SLIP account. Donation already sent! Thank you.
smallreader 21 hours ago 0 replies      
This is great but should be tagged (2011).
tyingq 15 hours ago 0 replies      
Similarly, thanks KA9Q & WATTCP. Oh, and Kermit too.
danieltrembath 15 hours ago 0 replies      
As a mac user I mostly remember ButtTrumpet and giggling.
seattlesbest 18 hours ago 0 replies      
Never work for free, we're not paying royalties to the ancestors of the inventors of the wheel.
 
Thanks for the free work, here are some stock options!
 
You're the lowly programmerz, I'm the IDEA GUY!

justin_vanw 20 hours ago 0 replies      
Donated just now, wish there was a way to give 20 years of compound interest along with it :(
orionblastar 23 hours ago 0 replies      
I had a PC Shop in 1995-1997 we sold copies of a software product called Internet in a Box. https://en.wikipedia.org/wiki/Internet_in_a_Box

I think it competed with Trumpet Winsock. We had clients who used Trumpet Winsock but had problems configuring it, so we helped them out.

It was later on, when Windows 95 OSR2 bundled IE and its own Winsock dial-up networking stack, that Internet in a Box and Trumpet Winsock lost a lot of sales. I think they sold MSN subscriptions with it.

AOL and Compuserve competed with sending out free floppy disks and later on CD-ROMs. Then there was that $500 Internet rebate that made a PC basically free but had a $35/month dial up ISP bill to pay for it for five years.

But I remember people registering Trumpet Winsock for $25 and then choosing a mom and pop ISP. Trumpet Winsock was downloaded from a BBS, and was shareware and some ISPS gave out copies of it on a floppy disk when people signed up for service.

ck2 18 hours ago 1 reply      
Thanks to the original Netscape folks too.
emmanueloga_ 21 hours ago 2 replies      
On a related/tangential note, here [1] is the website for terminate, the world's most powerful communications software.

1: http://www.terminate.com/

kentf 22 hours ago 0 replies      
Instead of PayPal, let's use Tilt for this. It's free and a YC company. I am happy to set it up, unless someone else wants to.
sandworm101 20 hours ago 0 replies      
Sorry, off-topic, but in recent months I cannot read "trumpet" without thinking of the sort of thing I saw on TV today. Winsock comes across as 'wind-sock'... also apt.
Why Big Companies Keep Failing: The Stack Fallacy techcrunch.com
403 points by walterclifford   ago   168 comments top 31
gregdoesit 2 days ago 6 replies      
From my experience, the same (often random) reason that makes a company succeed then becomes their DNA, and finally can make them fail.

I saw this happen with Skype where I worked a couple of years. The company succeeded because of P2P: we grew with little infrastructure to reach 200M+ people. P2P became our DNA, rooted deep within (almost) every core component.

Then came the new wave of mobile messaging apps. We reacted... with a P2P messaging solution. It was obvious this wasn't working - you sent a message to someone from Skype for iPhone, and they got it... sometime.

We knew to have a chance against Whatsapp and other messaging apps we needed server based messaging, so we built it.

It took 3 years. Yes, it took this long to get rid of the P2P code from just the messaging components from the 20+ Skype products - we had 1,000+ engineers and 50+ internal teams by the end which significantly slowed things down. When we were done and popped the champagne - no one really cared.

And yes, the source code is still full of P2P references and workarounds to this date.

anonymousguy 2 days ago 2 replies      
The solution to stack fallacy is simple but really counter-intuitive. All of the mentioned examples, I mean every single one, indicate a business trying to force its way into the higher level through business channels. For example, when a business wants into a higher level they make it a business priority to create a new product and attempt to drive the priorities of this next level product through their business objectives. That is an epic fail.

It is important to instead concede that you don't know the needs of the consumers in the higher level, and if you think you do, it is because you are guessing. The only way to avoid the problem is to not attempt to move into the higher level, at least not intentionally and not through business priorities.

This is extremely counter-intuitive because there are generally fewer expenses and greater market frequency at each higher level, which means superior revenue potential. Businesses exist to make money and to ignore moving up to the higher level means denying this potential (vast) revenue source.

This doesn't mean you can't move into the higher level of the stack and be really good at it. It just means you cannot do so both intentionally and as a business objective.

The solution is to double down on where you already are, with what you are already good at, and focus on the product quality of your existing products. Continue to improve where you are already good. Improvements and enhancements to existing products can gradually yield the next level as they progressively open new potential in an evolutionary fashion. While getting to the next level this way is much slower, it is also lower risk and continues to associate your brand with quality and consumer satisfaction.

This will only work, though, if the goal is improving the current product and not acquiring revenue in that desired higher level. Think evolution and not revolution. It has to be a gradual, almost accidental, increase of capability based on meeting current consumer needs.

adevine 2 days ago 19 replies      
I don't buy the arguments in this article. For example, the whole part about why Google failed with Google+ is just wrong IMO. It wasn't that Google wasn't capable of building a good social network. If anything, I (and most people I know) preferred the design and interface of Google+. The problem was that Facebook already had a huge head start, and all your friends were already there. Facebook was "good enough", and there wasn't a big enough incentive to want to switch to Google+.

If anything, large companies often miss out on new trends and changes in business and technology, but it's not solely because building that one new layer "up the stack" is so technically hard or different.

hyperpape 2 days ago 1 reply      
I don't want to be too dismissive because something about the article rang true to me, but I don't know that I buy the whole central conceit that the idea of a stack can apply as universally as this article needs it to.

Apple's networked services have often struggled. But are they really higher level than the things Apple succeeds at? Asking whether enormous distributed data stores are higher level than Mail.app just seems confused. It's different, and it brings new challenges, but are they part of the same stack? And is the data ingestion and sanitizing that Maps struggled with higher or lower level than the client that was basically ok? You can multiply these questions and I'm not sure you can get good answers.

tn13 2 days ago 2 replies      
There was a brilliant essay by an Indian politician a few years back, after his party lost the elections. Later, in a lecture, he explained why political parties and large companies have so much in common when it comes to failing.

His basic logic was that:
- Success depends on processes.
- Processes, even though they might be thought of as abstract, are in reality a function of the people at the top.
- A company gets successful because some bright guy is the rebel; he questions the status quo, persists and succeeds.
- As time goes by, the rebellious ideas actually become conservative ideas. The rebel is now on top. As his ideas fade, he struggles to stay on top.
- He recruits people who see the world through him; he builds processes that enforce that vision.
- This makes it difficult for the truth to be visible to the top management.
- By the time failure is visible, it is hard to turn around the ship.
- IN SHORT: Companies/nations fail because someone at the top did not know when to quit.
- In the end, that rebel turned conservative becomes bitter. He thinks the world owed him something for what he achieved.

He explained with USSR examples: how a genetics scientist got promoted because his fake research reinforced something that Stalin had said long before, and his peers were scared to point out the fact because it might be perceived as anti-Stalin.

I observed Blackberry very closely and it resonated with me so much. The founders at one point blamed people for using the iPhone and not Blackberry.

The best companies in the world seem to be those where the top leaders quit at their peak to make way for their successors.

vonklaus 1 day ago 1 reply      
This article seems to have many correct pieces, but I don't think they coalesce to prove the point, or at least, not entirely.

I don't think that manufacturing semiconductors is comparable to building maps. Apple should have done a better job with maps, and even though they do complex manufacturing, you'd likely expect them to do worse at chip manufacturing.

Iirc they brought in 3rd parties to help with the chip fab, and certainly spent more money building that core competency than maps.

I believe the author is correct that the issue is companies not fully understanding, and consequently underestimating, what it takes to be successful in a different arena outside their core competency.

Google sees people as entries in a DB. They don't understand people at all, they don't understand design as it relates to people, and they didn't understand that nobody needed another social network.

They probably underinvested (initially) in G+ and it was not a great product. It didn't achieve critical mass quickly, and thus had no chance of ever growing as a social platform.

However, Google is a lot more capable of creating something like this because they have all the core competencies down.

I guess my takeaway is that these companies can in fact take these arenas, but they underestimate the challenge. So to use a drug-dealing analogy, they try to start out moving bricks and kilos, instead of working their way up and learning the market pushing dimes and quarters.

They start too big, and when you fail big, you don't get the recovery of a smaller failure, which affords small relaunches and features.

Tldr: big companies try to enter at the top, can't recover from huge public failures, and either exit or buy in.

libertymcateer 2 days ago 3 replies      
Apple is not vertically integrated - Wikipedia entries to the contrary notwithstanding. It is a grossly inaccurate statement. Up until very recently, Apple didn't own a single factory - how can one possibly claim that they are vertically integrated if they don't own their own means of production?

Apple is a fantastically successful software and industrial design company. The vast majority of their production is outsourced. This is not vertical integration.

Additionally, actually, Apple has tremendous amounts of hugely successful and popular software.

Though I dig the underlying point of this article, that product management is hard, I think the examples are less than good.

cturner 1 day ago 0 replies      
It's particularly funny when the stack fallacy is held by database providers, because databases are the wrong tool for most of the jobs they get used for at the moment.

Current usage of the database uses it as a loose, adhoc, difficult-to-maintain, polling-based API between multiple applications.

The future perspective looks back on our time, shaking its head at the way people use databases for everything in the same way that we shake our heads at bloodletting.

Oracle's business model is (1) convincing people to use platforms they shouldn't be using and then (2) selling the victims ongoing hacks and services to work around the limitations of the model.

Amazon's software services won't be build on a database. They'll be built using a decentralised messaging platform.

marshray 2 days ago 1 reply      
> Can we compete with Intel or SAP?

Well for one thing we know that Intel spends several $billion to open a new semiconductor plant and has a dozen of them already. https://en.wikipedia.org/wiki/List_of_Intel_manufacturing_si...

Whereas SAP is, well, a lot of software. Which is something, but Intel needs to make a lot of software too, and chip designs are in some ways a specialized form of software.

So I think in some sense Intel is strictly more challenging to replicate than SAP. (But this is probably just my misunderestimation talking. :-)

ap22213 2 days ago 1 reply      
An alternative (or maybe complimentary) theory is in Clayton M. Christensen's innovator's dilemma. Big companies build enormous revenue bases on certain types of technologies. Then they struggle to innovate because, by transitioning, they eat away at their existing revenue streams.
kazinator 2 days ago 1 reply      
This stack fallacy sounds very familiar. Oh, we will just have a few system calls like open, close, read and write, some TTY and credentials related stuff, a bit of signal handling, process control with fork, exec and wait ... writing a shell language on top will practically just be a footnote.
fforflo 1 day ago 0 replies      
As a comment on TC says:

"What the article is referring to as stack fallacy is the work of Physics Nobel Laureate Philip Anderson: https://web2.ph.utexas.edu/~wktse/Welcome_files/More_Is_Diff...

Let's give credit where it due please."

oautholaf 1 day ago 0 replies      
A lot of the examples and counter-examples in the threads here are great, but Microsoft in the Windows era is a great counter-example here: from operating system to Office dominance. How did they crush Lotus and WordPerfect again?
anjc 2 days ago 1 reply      
I'd like some solid examples of what companies' perceptions of competitors in the same vertical were, versus the reality.

Because even the author references competency-based views of competitive advantage, but for some reason ignores resource based views, and ignores the fact that companies might be aware of their competences. That is to say, I'm sure that large companies tend to mostly be aware of what their competences are based on the resources and knowledge that they have. If they don't have marketing departments that have analyzed the ERP market, sales teams with ERP training, tech departments with key HR, key knowledge etc etc, then I'm certain they are very well aware of this.

Maybe some companies have had marketing missteps and have made poor strategic and competitive decisions, however, but I really doubt that it's due to a lack of introspection or simple analysis as described.

Also, IBM didn't "think nothing much" of the software layer. They misunderstood the nature of power in the supply chain, and most importantly, didn't solidify their position within the supply chain while they were dominant.

tokipin 1 day ago 0 replies      
Nice observation. Another way to put it is that induction is harder than deduction.

A related factor is that larger companies tend to be more specialized (formalized processes, specialists, focused teams/departments, and so on), meaning they can be prematurely optimized with respect to new goals and poorly equipped to conduct the necessary roaming.

anshublog 2 days ago 1 reply      
I am the author. Happy to answer any questions about my post.
dpflan 2 days ago 0 replies      
This article is funny: it twists together the ideas that a company specializes in a product in a specific market, and that other companies can use the products/tools of other companies to develop their own unique products for a specific market. The "up-the-stack" company building something using products from "down-the-stack" has already entered a market, gained market share, and specialized in a market in which the "down-the-stack" company has no presence. Now the "down-the-stack" company sees an example of a successful product that uses the technology it knows so well, but the company is not specialized for this product, so it hubristically does some low-hanging-fruit-snatching to try to enter the new market. "Big companies keep failing" because, based upon what's mentioned in this article, they are not being innovative; they see an easy out and enter a market that already has incumbents.
a-robinson 1 day ago 1 reply      
The author claims that "Stack fallacy is the mistaken belief that it is trivial to build the layer above yours", but then says that IBM was wrong when they "happily allowed Microsoft to own the OS market".

Wasn't IBM a classic case of not trying to build the layer above them on the stack?

The Wikipedia page on IBM PC DOS even claims that their "radical break from company tradition of in-house development was one of the key decisions that made the IBM PC an industry standard".

shadowmint 2 days ago 1 reply      
Why do people keep reading TC?

Here, let me make an article... wait wait... ah... "Big Companies FAIL", that sounds like nice click bait. Now... hmm, let's invent some stupid word to pad it out. How about the 'Stack Fallacy'? Programmers will dig the 'stack' part. Yeah. Ship it!

Seriously, this article is content free.

People make products. Sometimes they work... sometimes they fail.

If you pretend you have some magical insight into why they fail or succeed with gems of wisdom like:

> ...found it very difficult to succeed in what looks like a trivial-to-build app: social networks.

> The stack fallacy provides insights into why companies keep failing at the obvious things: things so close to their reach that they can surely build. The answer may be that the "what" is 100 times more important than the "how".
Then... wow. I don't even know what to say.

Really? What you build is important?

No kidding.

Why is the top of the list this morning?

cbsmith 2 days ago 0 replies      
This seems more than a bit flawed. It presumes companies or anyone else think these launches into new markets are low risk. They are generally seen as anything but. There is hope that leveraging existing strengths will improve the odds, but only the idiots think they are certain to win.
annnnd 1 day ago 0 replies      
> The bottleneck for success often is not knowledge of the tools, but lack of understanding of the customer needs.

THIS! +1000! I would even leave out "often", or at least replace it with "usually".

ogezi 2 days ago 1 reply      
A company should always focus on its strengths; if not, it'll be both overstretched and unsuccessful. Great read.
mwnz 2 days ago 1 reply      
Do big companies really keep failing? I'm failing to see the evidence of that assertion.
tuke 2 days ago 0 replies      
True enough, but there are also companies that were designed from the beginning to have vertical integration and control much of their business from beans to buildings. And of course I am thinking of Starbucks. (People don't really understand the technological story of Starbucks, which has a lot to do with their introduction of the vacuum packs to get their coffee across the country, and overcoming the challenge of brewing coffee on passenger jets.) Mostly for the better, they decided long ago to own as much of the stack as they could.
peter303 1 day ago 0 replies      
The counter-example would be Apple when they introduced music players and music distribution. That was far away from their core business of PCs. The smartphone was less of a stretch, because it was a potential extension of the iTouch communications computer.
ljw1001 2 days ago 1 reply      
Some insights perhaps, but the claim that this is "Why big companies keep failing" is way overblown.
LCDninja 1 day ago 0 replies      
When reading the article I was reminded of Warren Buffett's message re identifying and staying within your "circle of competence."
a-dub 23 hours ago 0 replies      
That's why when I build a company, I'm going to use a hash table.
jackgavigan 1 day ago 0 replies      
> Product management is the art of knowing what to build.

And in what order.

dkarapetyan 2 days ago 0 replies      
Not so much stack as legacy. Have you seen legacy architectural decisions? They're impossible to get rid of. It's surprising how much the initial architecture can hinder change.
shim2k 1 day ago 0 replies      
Peter Lynch writes about it in his book "One Up on Wall Street". He calls it 'Diworsification'.
Subject: Urgent Warning haxx.se
401 points by robin_reala   ago   172 comments top 43
madaxe_again 1 day ago 5 replies      
Ah, this sort of thing is as old as the moon.

We run an eCommerce platform, have a variety of clients using it, they have a number of customers. At least once a week we get an accusation either from a client or an end-user of some outlandish nefarious behaviour, usually due to some complete lack of understanding of the nature of technology.

Way back when, we responded, tried to help, tried to explain, but it tends to be the case that if someone has made their mind up, they've made their mind up, and anything you say can and will be used against you - confirmation bias is a harsh mistress.

The best response is usually no response, I'm afraid to say. It's a drain on your time, they won't be any the wiser unless you're prepared to sink serious time into educating a stranger, and more often than not responding results in escalation, and people doing stupid things like involving lawyers and law enforcement.

Case in point: About six years ago, we had an older guy phone us up frothing about how we'd hacked his wife's computer and she'd accidentally bought something from one of our clients. We explained that it would be hard to accidentally enter your address and credit card details, and that if they didn't want the order they should contact the merchant, not the web developer (they clicked our "ecommerce by" link in the footer of the client site - we don't do that any more!). We thought that was that. A week later we got a stern phone call from an ombudsman who wanted to know why we were ignoring the distance selling rules and taking advantage of old people... and they didn't understand that we weren't a merchant, didn't place an order on their behalf, either - so months of time were wasted, and we narrowly avoided ending up in court over a non-issue.

Anyway. When you have a conversation with an idiot, nobody watching can tell which one of you is the idiot.

jerf 1 day ago 3 replies      
About nine years ago now, I worked for a small security startup that managed to make its way up to PCI certification for scanning websites before keeling over dead. It died the "just one more feature before we can release" death, so you've never heard of it. You've probably never even heard of the company that bought it/put it out of its misery. I'm pretty sure the customer base never exceeded the low single digits. Its most impressive moment was when it made the local TV for just... being a startup, basically. (I'm in the Midwest. Startups aren't and weren't that rare, but few of them even tried to get on local TV. At least at the time it wasn't hard to get into the human interest slot, presumably because it was a nice change of pace for the TV station.)

Yet the founders had a collection of letters from people, actual hand-written letters, asking for help with their hacked computers, asking how to hack, at least one probably-paranoid-schizophrenic one about... errr... hacking and the government and chips in brains and all that sort of thing and whether or not this company could help protect them against the hacker aliens (I don't recall the exact details but this is not an exaggeration of the flavor, alas).

There's an amazing amount of this sort of thing going on. At scale the only thing you can really do is ignore them; engagement doesn't go well for anybody, even the sender just ends up more frustrated and angry than when they started if you try so it's not even good for them. On an isolated basis you might get lucky, but don't count on it.

Edit: Kinda commenting on the thread above anchored on madaxe_again's comment, let me emphasize that I'm not saying ignore it because you can't be arsed, or because replying is beneath you, or because elitism... I'm saying that ignoring it works out best even for the sender, which is why you should do it. That it happens to be the easiest course of action for you as well is just one of those rare times when the easy action also happens to be right.

vxxzy 1 day ago 3 replies      
If you desire to respond, I would relate the topic to her field. She claims to be a photographer. Surely she has a vague idea of copyright law. I would explain that you had developed a piece of Intellectual Property and licensed said property in a specific form - akin to taking a photograph. The person who takes a photograph owns the copyright. It just so happens that Spotify and Instagram "enjoy your work" and have decided to make use of it under the license you have given.
malkia 1 day ago 3 replies      
This came today on a friend's facebook page (he's in IT/QA/etc.) - http://imgur.com/mnfQe3V

"My facebook suddely split in half and this screen popped up with all these random cyber space options and it was like watching and assessing things soooo weird? and talking about child... and children being forced WTF????? is this some sort of cyber police thing that my IP was accedently allowed to access so i could help stop child abuse on the net or am i going crazy???? has this happened to anyone else??? - :(( - feeling confused".

[what happened, was that this person most likely clicked F12 or Ctrl+Shift+I - and brought up the chrome/firefox/etc. developer console]

robterrell 1 day ago 3 replies      
My name and email address is the very last thing in the Waze about box, because they used some code I open sourced. Turns out it's the only email address anywhere in the app.

Thanks to this, I get about a dozen emails a week from people asking for Waze help. (Lots more when Waze changes something, like hardware support for a particular device!)

I've tried contacting Google (either to get these people help, or to get my email address removed...) with no luck.

I empathize with Daniel. It's an unexpected downside to open sourcing something and asking for credit.

js2 1 day ago 0 replies      
Attack of the Repo Men:


The only way to win is not to play.

elthran 1 day ago 3 replies      
Definitely agree with one of the comments on the blog - the domain (haxx.se) really doesn't help the author's case here when trying to explain to a layperson.
centizen 1 day ago 2 replies      
Honestly, I wouldn't even respond. I would imagine any response you give is going to be twisted as your original ones have been. You've tried to be reasonable, and you don't owe her anything - don't engage her any further.
mootothemax 1 day ago 1 reply      
I once received a request to hack a load of different companies' client databases due to one of my posts in the HN "Seeking freelancer" job threads, pretty much solely because "Hacker" is in the site's title.

Bless him, the guy mostly kept on signing off his emails as "John," having forgotten to change his name in the "From" field, except for the time he forgot and signed his "real" name again.

(I say "emails" - it was a bizarre few exchanges, starting with "I have a job for you," and myself replying to his opaque emails to find out quite what on earth the guy was on about)

lazyant 1 day ago 2 replies      
This reminds me of the case when some state or local agency got the default CentOS / Apache page and they contacted them as if they were hackers.

EDIT (thanks mariuolo): http://www.theregister.co.uk/2006/03/24/tuttle_centos/

vog 1 day ago 1 reply      
The author information is part of the user interface, and can be as confusing as every other UI aspect. It should be carefully designed such that it is clear what the author takes responsibility for and what they don't.

Some years ago there was a similar issue with the default Apache website on CentOS. Somebody had their webspace reset by their hoster, but rather than complaining to the hosting company, the user complained to the contact info shown on the default website, claiming its owners had hacked their website. (Sorry, couldn't find the link to that story anymore.)

mattlutze 1 day ago 3 replies      
Lest Stenberg get tangled up too much in the email-sender's unfortunate troubles, he should likely refer the emailer to Instagram and Spotify's customer support.

As an aside, hopefully someone can recommend to the emailer as well to use a different service to host high-quality versions of her photography so that potential clients can evaluate critical clarity in her technique. I'm not sure I'd want to rely on Instagram as the sole example of my work, but maybe she's targeting a different clientele than I'm imagining.

As an aside (#2), who hacks Spotify accounts?

vitd 1 day ago 1 reply      
It's not exactly the same, but it reminds me of something I heard a few years ago. I knew someone who worked at a small non-profit. Once a year they'd do a particular fund drive that involved calling their previous donors and either talking with them or leaving a message asking them to please donate again. And every year, they'd get a message back on their machine saying only, "Please take me off your list!" They had no idea who the caller was, and she never answered the phone when they called her, so they could never take her off the list!
joefarish 1 day ago 0 replies      
TIL Spotify is a major partner of Spotify.
icebraining 1 day ago 0 replies      
Seems to be down, here's a cached version: http://webcache.googleusercontent.com/search?q=cache:_tJLn3s...
johannes1234321 1 day ago 0 replies      
I can't count how many mails I got via the php.net webmaster and security addresses (yeah, I admit my involvement there ...) from people who had locked themselves out of some website with a "powered by php" button and asked me for a password reset. The picture of the producer of a screw in a machine often seemed to work.
hospes 1 day ago 0 replies      
Unfortunately, in these kinds of cases, the more technical you get and the more you try to explain things, the more they think that you have something to do with it. If you still want to try to convince her, ask her to check the same information on any of her friends' devices, or any other device, so she can confirm that your name can be seen on all of them. This is simple enough that she can do herself. If I were you, I would just forward the email to Instagram support.
tripzilch 3 hours ago 0 replies      
Dear Sir/Madam,

You have contacted the software design company that licenses these commerce systems to merchants. We do not deal with the merchants' customer service. Please directly contact the merchant instead.

This is an automated message and you cannot reply to it.

mrsirduke 1 day ago 0 replies      
Site is broken, here's a link/mirror in Google's web cache:


Walkman 1 day ago 1 reply      
I'm starting to believe to use my real name for my email address was not a very wise choice (kissgyorgy@me.com).
6stringmerc 1 day ago 0 replies      
Personally this brings up a very ridiculous and rather humorous observation on the nature of ToS agreements:

Intelligent people don't bother reading them closely, and the people who have read them often use the information contained in rather stupid ways.

Granted I have spent many hours over the years reading contest / entry rules and ToS type documents, so I'm pretty comfortable picking on myself a bit here and there. Often I've read a very clear ToS and then observed the responsible company basically disregard their own rules and stated processes. Two notable examples were for a Deadmau5 remix project (he 'lost his laptop' and they stretched the contest for a couple months, barely supplied any promised materials, etc) and a Local Motors contest (routinely lied about what they were looking for as judging criteria, then claimed to contact winners on day X to start authorization process, instead vetted winners in advance and then used day X to announce). I've used these experiences to temper my trust of any online engagement or contest, because it's nearly impossible to hold any provider accountable when they're dishonest or just inept.

Antagonizing bothersome people is a form of entertainment [1] from time to time. If there's nothing to be gained from actually being constructive, then being obtuse might be the most worthwhile course of action. YMMV.

[1] http://dontevenreply.com/view.php?post=111

harryf 1 day ago 0 replies      
> "Also Spotify is a major partner of Spotify"

It reads like text generated by a computer

TazeTSchnitzel 1 day ago 0 replies      
This sort of confusion could be partly resolved if apps with "License" or "Legal" screens would actually have a little explanatory paragraph at the top, explaining what it is they contain in layman's terms. For the average person, I imagine that scrolling through one of those would be quite bewildering, especially if WTFPL stuff was in there.
biot 1 day ago 0 replies      
"You know how when the credits roll on a movie, it shows a list of songs the movie has licensed? Same goes for software. I have as much ability to influence Instagram as Mozart does to influence the movies his music is in."
agentgt 1 day ago 0 replies      
Maybe advise her that she have a lawyer explain it to her or one of her friends look at it and explain it to her.... ie delegate.
msh 1 day ago 0 replies      
Reminds me of back in the late nineties when I developed a freeware email client. The client header in the sent emails said "???mail by www.example.com"

I thought it might give me more users... What it gave me was lots of angry emails once someone used my mail client to send out spam. That header was gone pretty quickly.

onetwotree 1 day ago 2 replies      
So there's a lot of "LOL LOOK AT THIS DUMB PERSON I AM SO MUCH SMARTER THAN HER!" going on here, especially in the reply that he seems to have sent her.

Here's what's going on. Her IG account may, in fact have been hacked. This happens. She's obviously afraid and angry. She is the kind of person who thinks she can solve all of her own problems, and found the licenses section of the app, which included something with a nonsensical name (libcurl) and a domain "haxx.se". Despite having known that haxx.se is for libcurl basically forever, I occasionally see it and associate it with gray or black hat stuff before I remember. So it's not at all surprising that a non-initiate saw this and thought it might have something to do with her IG account being hacked.

Daniel says his reply to her original email was "clear and rational". It should have been "understanding, compassionate, and patient". This is someone who is seriously freaked out, because her livelihood is at risk, and based on the fact that she went digging through the app, she is probably having what a shrink would call a "crisis of control". So here's what the author should have done:

1 ) Patiently explain what libcurl does (it lets programs request web pages, just like a browser). Explain that he's the author, but he's given it away for free. The license is in the app because he took pains to ensure that nobody can package it up with some slick marketing and sell what he's giving away free.

2 ) Acknowledge that haxx.se sounds kind of shady. Explain why he chose the domain. Self deprecating humor would be great here. Explain that despite this, all kinds of apps use libcurl for perfectly benevolent purposes.

3 ) Explain that he has nothing to do with instagram (commenters have suggested the car parts analogy, which seems like a good plan).

4 ) Finally, and most importantly, link her to their hacked accounts page! They have people paid to deal with this stuff, who are much, much better at dealing with panicking laypeople.

There is a lot of "reason good, feelings bad!" stuff in the tech community these days. It makes people see us as a bunch of borderline autistic[1], self centered, stuck up, evil nerds. Many of us, myself included, were terrible with social interactions and dealing with our feelings at some point in our lives, so the finer points of human interaction and emotional thinking left a bad taste in our mouths. But we've all grown up. We aren't social rejects and evil nerds anymore. We have lives, careers, friends, and family. We need to let go of the stuff we suffered in our youth, forgive those "stupid popular kids", and learn how to be nice.

[1] In the sense of the popular conception of borderline autism, not the clinical condition, which generally doesn't make you a jerk.

tehwebguy 1 day ago 0 replies      
The "hacking ring" accusations remind me of the ill person who used to harass a friend from their time at DailyBooth, incredibly weird situation.

It's pretty unnerving because the person clearly needs some assistance but also thinks you are the bad guy.

nkrisc 1 day ago 1 reply      
I think I might try to explain it this way:

It's like he makes paint. He made this really cool shade of red paint that everyone likes. One day, some really mean dude used this guy's paint to paint his car red. He then used that car to go on a crime spree. The victims of the crime then went to the guy who made the paint demanding their stolen money back.

xg15 1 day ago 0 replies      
Well, if she found that thieves had cracked open the lock of her bike with a steel cutter, she wouldn't call the manufacturer of the cutter and accuse them of thievery either, would she?
joantune 1 day ago 0 replies      
Urgent: use cache before hitting your DB all the time in your blog :P * website is down * (an easy way to do that is with cloudflare)
cek 1 day ago 0 replies      
This link now goes to 'download.gz'.
FussyZeus 1 day ago 2 replies      
> "I came across this information using my Spotify which has also been hacked into and would love your help hacking out of Spotify. Also, I have yet to figure out how to unhack the hackers from my Instagram"

You keep using that word, I don't think it means what you think it means.

In all seriousness though, she went to the ToS for help with the Instagram app? Why not write Instagram support directly?

cptskippy 1 day ago 1 reply      
I think the proper response is something along the lines of "did Michelin Tires hack your car?"
logicallee 1 day ago 0 replies      
Sorry, it's pretty ridiculous that the author chose to write all that and engage in correspondence without mentioning the elephant in the room either to the photographer, or to the reader (us) in this write-up. The elephant in the room is that haxx.se is a tongue-in-cheek name (or a coincidence.)

Would it have killed him to mention this?

"Dear Photographer Lady: I run a very well-regarded library, you may have had this reaction because I have the tongue-in-cheek name haxx.se [alternatively: because of the coincident name haxx], however I am a well-paid consultant similar to yourself and other than this choice of domain name there is nothing alarming. The library is famous and you should see a similar notice in all of your friend's phones (or anyone else's you check). It is in use by major corporations including Apple and Spotify. Sorry about the confusion."

that's literally all this is about. (obviously.)

In fact, it makes me seriously question the author's good faith that he ends with the call-to-action "I've tried to respond with calm and clear reasonable logic and technical details on why she's seeing my name there. That clearly failed. What do I try next?" without mentioning the elephant in the room.

peterwwillis 1 day ago 1 reply      
How terrible would it be to just create new accounts?
shade23 1 day ago 0 replies      
Webpage down? Any link to a cached copy?
coldcode 1 day ago 0 replies      
I think it best not to use your actual name in such an email/web reference so you can more easily ignore the lunatics.
grb423 1 day ago 1 reply      
I think you are misusing the phrase "nothing to do with." It seems to me that you in fact did have something, however indirect and misunderstood by the emailer, to do with these products. I, on the other hand and for example, had nothing.
Enable Node.js to Run with Microsoft's ChakraCore Engine github.com
340 points by bpierre   ago   121 comments top 21
domenicd 1 day ago 6 replies      
It's important to remember that ChakraCore only currently builds on Windows, so this is only for Windows Node.js builds.

The ChakraCore roadmap [1] shows that they plan on porting the interpreter to Ubuntu, but not to Mac OS, and they don't plan on porting the JIT. So even if it does go Windows + Linux, it will still be a low-performing toy on Linux.

This is vaguely interesting from a technical point of view, but doesn't seem like it'll majorly impact the future of Node.js, except maybe as something Microsoft can offer for Azure customers that choose Windows Server.

[1]: https://github.com/Microsoft/ChakraCore/wiki/Roadmap

tbrock 1 day ago 3 replies      
This is awesome, but before anyone uses this for something real keep in mind that the way this works is that Microsoft has created a shim on top of Chakra that exposes a V8 API.

If V8 changes its API drastically (which it frequently does) this little experiment is basically over. There is only so much work people can do to keep up with a moving target.

Barring NodeJS creating an abstract JS engine API that developers can create v8/chakra/spidermonkey adapters for, I can't see this being a success.

That being said, I hope that Node and V8 coupling becomes less tight and the interface between the two does become abstract so that we can all benefit from a choice of engine that implements the requirements of NodeJS.
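The engine-neutral API hoped for here can be sketched in plain JavaScript. This is only an illustration of the adapter idea, not the actual node-chakracore shim (which re-implements V8's C++ embedding API on top of Chakra); names like `makeHost` and `evaluate` are invented:

```javascript
// Toy sketch of an abstract engine API (illustrative only). A V8-shaped
// shim instead makes the second engine impersonate the first one's API.

// The narrow interface the host (think: Node core) codes against.
function makeHost(engine) {
  return {
    run(source) {
      return engine.evaluate(source); // host never touches engine internals
    },
  };
}

// Two stand-in "engine adapters" exposing the same contract.
const v8Like = {
  evaluate(source) {
    // placeholder for a real embedding call
    return Function('"use strict"; return (' + source + ')')();
  },
};

const chakraLike = {
  evaluate(source) {
    // same contract, different implementation underneath
    return Function('"use strict"; return (' + source + ')')();
  },
};

// The host works unchanged with either backend, which is the point of an
// abstract engine API and what a shim over a moving V8 API only approximates.
const hostA = makeHost(v8Like);
const hostB = makeHost(chakraLike);
console.log(hostA.run('1 + 2'), hostB.run('2 * 3')); // 3 6
```

Only the adapters would need updating when an engine changes; the host-facing interface stays fixed.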

morebetterer 1 day ago 2 replies      
Many posters in that Github Chakra pull request thread appear to be in the "Node is V8 and only V8" camp. This is disappointing. Since when is having a bigger developer community, supporting the latest ES6 standards, and having wider reach for NodeJS a bad thing? Look at what the friendly rivalry has done for the various browsers - the competition benefitted everyone and advanced technology immeasurably.
underbluewaters 1 day ago 5 replies      
I don't understand what the goal is here. Honest question, not a criticism. Can someone please explain. Is the intention to have node work on multiple Javascript engines like spidermonkey? I can see why Chakra developers could want this but as a developer using node I don't see how I would benefit.
kojoru 1 day ago 0 replies      
There's a Microsoft blog post which gives much more details: https://blogs.windows.com/msedgedev/2016/01/19/nodejs-chakra...
jorangreef 17 hours ago 0 replies      
I'm not sure if Google and V8 have always gone out of their way to embrace Node and optimize V8 for Node. Perhaps they have. I'm not sure.

But this is Microsoft going out of its way to provide a swap-in engine for Node, with a focus on optimizing the engine for Node through benchmarks, and cross-platform builds in the pipeline.

Hopefully, Microsoft will be able to achieve the following:

1. A smaller Node binary through a stripped-down engine.

2. A GC optimized for Node and server applications.

3. An engine optimized for Node, working closely with Node's core technical team.

Perhaps this might encourage Google to do the same.

morebetterer 1 day ago 0 replies      
I hope the V8 C++ API doesn't become the defacto Node engine API. Node really needs a proper engine-neutral API. This way the native node modules can truly be portable across engines and across node versions.

Supporting additional JS engines would ultimately lead to a healthier ecosystem and higher quality JS implementations.

tjfontaine 1 day ago 0 replies      
Congratulations Microsoft and the Node.js team! Providing alternate JavaScript run times is critical to the success of a platform that requires predictable performance on a variety of environments. I'm so proud of everyone making this come true!

Great work everyone.

taf2 1 day ago 4 replies      
I am sure I'm a minority on this thread - but does anyone else think that if the engine is being developed in the open as open source then competition is not a good thing. I believe it's similar to having io.js and node.js - together it is a stronger platform. To this end, I think it would have made a lot of sense for MS to collaborate with webkit/blink and vice versa... that's the true spirit of open source IMO... I don't believe the web is stronger with more rendering engines and more JS runtimes... I think it's weaker. What would make it stronger is more open source and more collaboration. We have to be very happy that MS is taking the first step in this direction. Open Sourcing ChakraCore is a huge first step and now integrating it with other systems as well . I just hope over the next 10 years we see more collaboration between webkit, blink, gecko, and trident... as well as the JS engines...
dsp1234 1 day ago 0 replies      
Some HN comments on this topic from when ChakraCore was announced last week.


bhouston 1 day ago 3 replies      
They should also have the option to use Firefox's engine as well.

Who has the fastest JS core these days?

nevi-me 1 day ago 3 replies      
Interesting development, a good one at that!

A few questions though. Node tracks V8 releases, so the first thing I'm wondering is whether ChakraCore will continue emulating V8 APIs into the future? What happens when ChakraCore adds breaking changes which V8 doesn't support?

I presume MS will be supporting most development on the engine, as they benefit from IoT Core and other applications relying on Node. If Mozilla were to do the same and add SpiderMonkey, this would likely see the 3 major vendors accelerate adoption of JS standards further. I'm more bullish on JS development into the future.

If I were to predict, I'd see Node switching to ChakraCore as its primary engine in a year. V8 has been 'we build you follow'. I know Domenic and other Google developers have helped with the relationship with Node contributors, but what will happen when MS offers a less-maintenance-prone engine? Embrace, extend, extinguish!

jondubois 1 day ago 1 reply      
This is great. I still feel that Google isn't particularly excited about Node.js. It's good to see that Microsoft puts faith in the platform.

I guess Google has a few conflicts of interest with Go and Dart.

shad0wc0dex 1 day ago 0 replies      
With Microsoft starting to open source more projects, things like this really start to become a reality. Awesome pull request, and I look forward to playing around with this once the request has been merged. (If it does get merged)
alkonaut 18 hours ago 0 replies      
What is a potential benefit of using Chakra over V8? performance? licensing?
dev6563 1 day ago 2 replies      
Why should a developer care about this? What does ChakraCore bring to the table that V8 doesn't?
minionslave 1 day ago 1 reply      
What does this mean for node.js?
jordan801 23 hours ago 0 replies      
Has anyone written up a Chakra vs V8 comparison?
inglor 1 day ago 2 replies      
This is incredibly cool, I have a request.

Please do not post comments on the GH issue unless you have something important to add. These issues gain a lot of attention and it makes it _incredibly_ hard for collaborators to communicate.

Locking the issue to collaborators means other people from the outside who have a significant contribution or want to help can't do that.

Comments like +1 -1 and such create a significant amount of noise.

Support open source, keep the discussion clean.

dang 23 hours ago 1 reply      
We detached this subthread from https://news.ycombinator.com/item?id=10932756 and marked it off-topic.
z3t4 1 day ago 0 replies      
As a long time JS developer I was blown away by the V8 performance, and it still amazes me. And it works very well on Windows.

What would be cool though, is if we could use vbScript in Node. Because vbScript is even easier to learn than JavaScript!

The State of Meteor Part 1: What Went Wrong discovermeteor.com
332 points by JanLaussmann   ago   175 comments top 42
magicmu 13 hours ago 4 replies      
The startup I'm with right now just finished a meteor/react project that I led. This article really touched on our major pain-point with the learning cliff that you hit after a certain point. We used FlowRouter since it has React support, but managing subscriptions correctly (let alone caching them) took way more time than we had anticipated. It wasn't until near the end of the project that we realized none of us actually had a total mastery of what was going on under the hood in meteor, which was a terrifying realization. All things considered, though, I think our biggest mistake was biting off more than we could chew in using React and Meteor, when we had never made an app using either before. On the other hand, blaze is pretty rough...
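The subscription-caching pain described above can be made concrete with a toy model (not Meteor or any real package; `SubCache` and all names here are invented). Each live subscription ties up server resources until stopped, so naive per-view subscribe/stop churns, and a reference-counted cache is the usual fix:

```javascript
// Toy model of reference-counted subscription caching (illustrative).
// startFn opens the underlying subscription and returns its teardown.

class SubCache {
  constructor() {
    this.live = new Map(); // key -> { refs, teardown }
  }
  subscribe(key, startFn) {
    let sub = this.live.get(key);
    if (!sub) {
      sub = { refs: 0, teardown: startFn(key) }; // first consumer: open it
      this.live.set(key, sub);
    }
    sub.refs += 1;
    return {
      stop: () => {
        sub.refs -= 1;
        if (sub.refs === 0) { // last consumer gone: actually tear down
          sub.teardown();
          this.live.delete(key);
        }
      },
    };
  }
}

// Usage: two views share one underlying subscription.
let started = 0;
let stopped = 0;
const cache = new SubCache();
const start = () => {
  started += 1;
  return () => { stopped += 1; };
};

const a = cache.subscribe('posts', start);
const b = cache.subscribe('posts', start); // reuses the live subscription
a.stop(); // still referenced by b, stays open
b.stop(); // now actually torn down
console.log(started, stopped); // 1 1
```

Getting this right per route and per parameter set, plus deciding when cached data is stale, is where the difficulty curve shows up.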
marknutter 14 hours ago 3 replies      
The big mistake was trying to provide anything beyond Minimongo on the client side. People are very picky about the frameworks they use on the front end, and typically people who have strong opinions about that are less opinionated about what they use on the back end. I'm firmly in that camp, for instance. I've been a huge fan of Firebase for a while because it frees me from the tedium of having to create another REST application just to talk to a database. If Meteor had, from the beginning, focused on simply being a kick-ass open-source alternative to Firebase, they would be killing it right now.

It was a monumental task to try to create something that would please both front-end and back-end web engineers. Another issue with Meteor is that it was envisioned, not extracted. [1] From Rails' creator, DHH: "First, Rails is not my job. I don't want it to be my job. The best frameworks are in my opinion extracted, not envisioned. And the best way to extract is first to actually do."

Meteor was the goal, not an actual, real-world application. Often when this is the case the software ends up solving a bunch of problems that seem logical to solve, but in practice are not actually practical (another framework like this that comes to mind is the notorious famo.us project). Compare this to Rails and React which were forged in the crucible of real, day-to-day development and problem solving.

[1] - http://david.heinemeierhansson.com/posts/6-why-theres-no-rai...

fefifofu 8 hours ago 3 replies      
This article does a disservice to Meteor. It feels drama-filled and says many things are broken when they are not.

"Blaze is threatened by React". You can use React or you can use Blaze. If React becomes so popular that Blaze is not longer used, that's OK... nothing to be threatened about. It's nice that Meteor can move with a trend.

"Tracker and Minimongo might eventually disappear as well". Tracker and Minimongo aren't giant stick bugs near Australia that need to be preserved. It's ok if they are replaced. They are internal tools for Meteor provide its "reactivity". I doubt reactivity is going away.

Other non-scary things: Routing is solved by community packages. Pagination, forms... really? Server-side rendering has the "spiderable" package, but the SEO / server-side rendering problem isn't unique to Meteor.

The database issue is valid. Meteor uses MongoDB. But you shouldn't go down the Meteor road, try to shoe-horn relational data into a non-relational DB, and then say WTF. You knew from the beginning that non-relational DBs have their own set of problems. My limited understanding is that MongoDB was picked because it was the easiest way to get the reactivity the MDG was looking for. Meteor's roadmap says SQL support is on its way.

I don't know where the OP is going with this. Maybe this is the part 1 of the late night TV commercial where they list all of our problems (think Slap Chop), then in Part 2 he'll solve all of our Meteor problems if we buy the next book he writes.

wsvincent 11 hours ago 1 reply      
I'm currently using Meteor to teach a college-level course on Web Development for beginners.


It's incredible how accessible Meteor is for this purpose:

* One line complete local setup

* One line deployment (to Meteor servers, but still)

* Javascript only (no need to learn Python, Ruby, etc in addition)

* Out of the box User Accounts

* Simple templating engine with Blaze

I can think of no other framework/language combo where true beginners can deploy a live, database-driven website in a couple of hours. I really wish Meteor could focus more on this aspect: becoming THE entry-level framework for learning web development.

However Meteor's business model is around hosting, so it's inevitable they move further and further towards the needs of professional, rather than entry-level, developers. And this takes them further into the areas outlined in this article where they are currently weak.

realusername 13 hours ago 6 replies      
The thing that prevented me from using Meteor for big projects is that it looks really monolithic from the outside.

What happens if suddenly I want to rewrite part of the back-end or part of the front-end with something else for various reasons? What happens if I want to switch from MongoDB to RethinkDB or Postgres for some reason? It's good to have default choices, but from the outside it looks like the default choices with Meteor are pretty fixed.

But maybe I'm wrong; that's just how it looks from the outside.

orthoganol 13 hours ago 5 replies      
> Meteor has yet to establish itself as a mainstream development technology on the same level as Rails or even vanilla Node.js... almost four years after Meteor first launched, I have to admit I thought the framework would be more widespread by now. So what happened?

A more general answer, I feel like the web programming world doesn't really need new frameworks, does it? Rails or Django got mainstream adoption because there was a need at the time, likewise with frontend JS frameworks (which seems to be consolidating around just 2 - React and Angular), and likewise with Node as filling a need for easy async. I'm not that knowledgeable on Meteor [1], however I think by default it's reasonable to expect no new frameworks to have mainstream adoption without a major change to the web.

[1] I don't know if Meteor's x-platform appeal is enough to convert users from other x-platform, native and/or hybrid solutions (Ionic, Titanium, RubyMotion, etc.).

tbrock 11 hours ago 3 replies      
> Once a new Meteor user starts to go beyond the basics and look into things like routing, pagination, subscription caching & management, server-side rendering, or database joins, they realize that the difficulty curve quickly ramps up.

Routing? Really? I am not an expert, but I think if routing, of all things, is hard to do in your web framework, you most likely have a problem. That's requirement zero!

canthonytucci 12 hours ago 2 replies      
Give me a box of boards and nails any day. All my ikea furniture is rickety and much of it is starting to look dated.

I just spent a good amount of time over the last few months building a prototype for an application in Meteor and it has been a joy.

Out of the box I got happy, grokable app/server communication, I got sane user account tools and I got a build process that works well enough that I haven't thought about it at all. I almost never need to look at documentation, I just build features. I've only needed a few community packages, and the ones I have used have been working pretty well for me.

I feel like I've been living the dream. So much ceremony and overhead just melted away.

I'd be curious to hear what kind of issues people have hit with blaze/meteor package management/etc that make them want to swap in react/npm/etc. (I spent the better part of 2015 with react/flux and it would take a lot to get me to switch back).

treenyc 14 hours ago 0 replies      
Personally, the one thing I don't like about Meteor is that it insists on using MongoDB. That alone makes me NOT want to use it, despite its other cool features.
joslin01 12 hours ago 0 replies      
I like Meteor and am using it for my current project, but I architected it in a way to stay away from their pub/sub model where you publish partial data sets to the client. I believe this only has utility where you have a feed of data that you want to filter or sort on a lot of different attributes.

Instead, I just utilize `Meteor.Methods` for all client/server interaction. I actually think it's pretty nice because you define the method on the client as a "stub" that gets called as it waits on the server response. I think the tutorials and guides focus too heavily on their fancy client/server mongo magic.
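The stub-then-server pattern described above can be modeled in a few lines of plain JavaScript. This is a toy sketch, not Meteor's real API: the method names, the task shape, and the simulated round-trip are all invented for illustration; only the shape of the pattern is the point.

```javascript
// Toy model of Meteor's method-stub pattern (latency compensation).
// The client stub runs immediately for an optimistic UI update; the
// authoritative server result then replaces it. All names here are
// hypothetical stand-ins, not real Meteor APIs.
const tasks = [];

function addTaskStub(text) {
  // client stub: runs instantly, marks the item as unconfirmed
  tasks.push({ text, pending: true });
}

async function addTaskOnServer(text) {
  // simulated server method: the authoritative version of the write
  return { text, pending: false };
}

async function callAddTask(text) {
  addTaskStub(text);                             // optimistic local result
  const confirmed = await addTaskOnServer(text); // simulated round-trip
  tasks[tasks.length - 1] = confirmed;           // reconcile with server answer
  return tasks;
}
```

In real Meteor the same method body is often shared between client and server and the framework does the reconciliation; here it is spelled out by hand to show what the "stub" buys you.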

Something I find a bit frustrating reading this is that I left React. I found it overly complicated for my product and preferred architecting everything in terms of templates. Blaze is threatened by React? How? I like React, but I can't exactly get behind writing HTML components in JavaScript, because whoa man, a diff engine and "look ma! I'm being functional!" I realize the benefits of unidirectional flow, but all it really did was add a lot more conceptual weight to a pretty simple interface.

So I said, this is pointless let's just simplify things with Meteor + Blaze. Now Meteor says, it's "threatened" by React?

juanmnl 11 hours ago 4 replies      
I loved Meteor. I was committed to it for six or seven months while still learning JS (and working on Ember and Rails projects with others). I knew it was a risky decision, but it seemed like a good choice for making it as a one-man dev agency. For me there's no stack-selection problem; the main problem is the co$t of launching a working demo app or prototype. The only thing I wanted was a simpler/cheaper way of deploying apps. I knew something was coming: from the beginning I had heard about Galaxy, an MDG-developed deployment solution, which I thought would be the answer to all my problems. But then came the Galaxy launch and its price ($495/mo). At that price I felt betrayed, misrepresented. (Still waiting for the individual price, several months later.)

The big problem I see for Meteor's future is that MDG has to (or will have to) answer to its shareholders rather than to its community.

ianamartin 12 hours ago 3 replies      
I was never able to get on board with Meteor because of the Mongo dependency. Not to get too far off topic, but Mongo is guaranteed to lose data under certain well-defined conditions.

I completely fail to understand why some developers seem to be allergic to learning about data structures and the means of querying them. It's not really that hard. If you can JS, you can SQL.

edwinnathaniel 8 hours ago 1 reply      
Hardly a day seems to go by without a NodeJS-based framework or library getting heat from the community, ever since December 2015.

It appears that 2016 is the year when the hype from NodeJS died down and vendors/OSS-community must now deal with the hard, un-glamorous, work to clean up, maintain, prepare roadmap, and so on. Which is good!

Or it could be the year people realized that the use case for NodeJS is fairly specific/niche (unless they love JS so much that they're willing to absorb the pain of using NodeJS + its ecosystem).

SailsJS vs TrailsJS, ExpressJS is dying, NodeJS vs io.js, the StrongLoop+IBM fiasco (and SL's reputation), and now MeteorJS! I had high hopes that MeteorJS could be one of the premier NodeJS frameworks (Rod Johnson, who created the Spring Framework, invested in the company; I hope they listen to him...).

I'd love to see competition in the NodeJS ecosystem, and hopefully sooner rather than later a few solid options will emerge, because right now nothing is a good option except barebones NodeJS and roll-your-own-framework.

warcode 13 hours ago 1 reply      
Meteor has been perfect for the single-page web apps I've been making as side projects. I don't know of any other framework in which I could have completed an encrypted chat application as quickly and painlessly as I did with Meteor.

All the things mentioned as "going beyond basics" seem like things that meteor was never designed for and that we have other tools to handle.

I really feel like the problem is with the people behind meteor trying to make it THE framework, instead of just being a framework that excels at a single purpose.

drdaeman 12 hours ago 0 replies      
Anything that implies lock-in unless everything is rewritten completely is not a viable choice for me.

I'm wary of anything (especially JavaScript as trends change there faster than one can read the articles) no matter how praised it is. Sometimes things just do break in the most bizarre manner when you already have everything set up. If components are modular enough, you can swap them, or at least patch a completely new piece over the old one, to handle the problematic cases.

When I once had to replace a very old legacy system (non-web, but the point still holds), I just slapped a dummy do-nothing proxy-like system in front and gradually did the work piece by piece. Then I threw out the old garbage when it wasn't doing anything anymore.

But from what little I understood about Meteor from the tutorials and examples, it's one single giant system that you either use or don't. This means that if you hit some bug or, worse, an architectural limitation, you're going to have a really tough time.

nik736 14 hours ago 0 replies      
As long as there is no direct PostgreSQL support, Meteor is no viable option anyways.
davidw 12 hours ago 1 reply      
When I saw MongoDB, Meteor got put on a "check back some day" list. Phoenix is way more interesting: it's built on some very solid foundations.
oldmanjay 12 hours ago 3 replies      
I can sympathize greatly. I played around a bit creating an isomorphic "framework for frameworks" to explore some techniques I didn't get to use in my day-to-day. There is a constant tension between leaky abstractions and expressing functionality meant to run in very different environments that I personally am taking years to fully grok [0]. Of course, unlike the meteor folks, I'm doing it for fun, so there is no pressure to make any money doing it on my end.

[0] for instance, I made it trivial to invoke client functions asynchronously from server code using normal call syntax. Now how to deal with timeouts? The syntax doesn't permit much without leaking, destroying the original point.

rgoomar 8 hours ago 0 replies      
This article is really great. Yes, Meteor is still one of my favorite platforms, but every library / framework / platform always comes with its pain points. I have had a great experience with it so far and I have to admit, I did get to a point recently where I was having trouble recommending it to other people and companies due to the various points stated in the post.

But, having said that, there is still a very bright future ahead for Meteor. The people at Meteor Development Group have identified these pain points and have plans to improve some of it. There are various tasks that they are working on now that will change Meteor a bit, but also make it a better platform (in my opinion). Things such as better NPM / module support, support for multiple different types of databases, faster build times, better testing support, etc.

The platform has been around for awhile and I still think that it can reach the goal of being the go-to JS platform for building cross-platform applications.

nichochar 11 hours ago 1 reply      
Honestly, I used meteor and tried pretty hard, I went to their hackathons, etc...

I just don't think it's as good as the other options technically (does not scale well as a server, which is a problem for most applications).

More importantly, as a developer, I believe VC money bringing opinions to the open-source community (which is what this is) is a bad idea. The whole point of open-source projects is to let natural selection happen. Meteor having VC money allows them to get it wrong but still market it, etc., which just seems like a poor idea.

DennisP 13 hours ago 0 replies      
I tried out Meteor this past spring. I started with a simple Trello clone, which I got working in a couple hours, including learning time. It was good enough at that point that my brother started using it at work. I was pretty impressed.

So I got to work on some more advanced features I'd been thinking about. And at some point, Meteor started throwing an error from somewhere in its innards, and for the life of me I couldn't figure it out. Some kind of problem mapping data to UI, I don't remember the exact message.

I decided I needed to know the guts of Meteor to be able to debug problems like this, and put the whole project aside to wait for the Meteor in Action book. But now I'm onto other things.

metabrew 13 hours ago 2 replies      
Seems like a good analysis of Meteor's shortcomings.

I did some research on related areas a few months ago. My ideal stack at the moment would involve:

* redux or cerebral with immutable model

* react

* css modules

* webpack

* a realtime-enabled version of falcor, which doesn't exist yet.

The last bit is still the missing piece for realtime, as far as I'm aware. Neither GraphQL nor Falcor seems to have been designed with realtime model updates (via websocket) in mind.

rafaquintanilha 10 hours ago 1 reply      
The major benefit of Meteor (at least for me) has always been its quick bootstrap. You want to start coding or work on a mock-up? Just meteor create myapp and start. No boilerplate. No script and style tags. Heck, add Bootstrap with a single command and you start with fine-grained classes! Sure, you might need a database... don't worry! No need for a schema, modeling, whatever; the server is already there, and so is the database.

Eventually you might want to scale and then, as in any library/framework/platform things will get a little more complicated. I don't see it as a problem, really.

Meteor has always been opinionated, and maybe some of its opinions were unpopular from the beginning (MongoDB, for example). But when you start adding multiplicity (React/Blaze/Angular for the frontend, Iron Router/Flow Router/React Router for routing, Meteor Package System/NPM...) you can lose track. I'm afraid this is happening now.

harel 12 hours ago 0 replies      
I did a little project with Meteor in its early (pre-0.5) days. At some point, internal errors I couldn't understand started cropping up, and nobody in the community at the time could help. In the end I wrote the app in Go and Angular. The idea of Meteor seemed nice at the time, but in practice it wasn't fun. The all-in approach means you also need to understand what goes on under the hood, because when the engine fails and nobody knows why, you're stuck.
YogeeKnows 9 hours ago 0 replies      
As a newcomer to JavaScript frameworks, I totally love Meteor JS. I come from Java backend programming, and my jobs/clients always used proven, not-so-recent frameworks like Spring. I always wanted to learn the JavaScript frameworks, but the vast number of options made it very difficult. Too many JS frameworks, too many choices. I started learning Angular, and then they decided to do a makeover that requires learning from scratch. Even React isn't something quick to master.

But Meteor just made my move into JavaScript web apps so much easier. I'm seeing results instead of the paralysis of "am I learning the right framework?". I'm also almost 80% done with my niche classified-ads site.

Meteor JS is the best thing I picked up in 2015.

yetanothercoder 12 hours ago 0 replies      
I built an app with Meteor right around when 1.0 was released. The absolute worst part for me was attempting to shoehorn what should have been a relational database design into the default NoSQL MongoDB (you pretty much had no choice).
ClashTheBunny 12 hours ago 0 replies      
'All happy families are alike; each unhappy family is unhappy in its own way.' -Tolstoy

It seems that many people are unhappy with Meteor in its own way. It takes 375 factors to make a 'Rails' and Meteor has 370 for each person, but each person is missing a different five.

Geee 11 hours ago 0 replies      
Meteor is easily the most productive webstack I've ever tried (couple of years ago) including learning, app setup, configuration etc vs some of other popular choices. I hope at least some of that carries over to what ever is coming out next.
pygy_ 12 hours ago 0 replies      
So, Meteor is going to turn into what SocketStream was, five years ago, before it was completely eclipsed by Meteor's marketing steamroller.
dvt 8 hours ago 0 replies      
So, full disclosure: I recently co-authored a book on Meteor, which comes out in a few months [1]. I don't think Meteor is a panacea or that it's the Best Framework Ever. I came to Meteor with a fair bit of experience (Play 1.0/2.0, Sails.js, Flask, etc.) and a fair bit of skepticism. However, I really don't think much went wrong with Meteor. In fact, it treads some of the same ground as its predecessors. As far as the OP's article is concerned, I have some very specific squabbles with it:

> This creates a bigger barrier to entry compared to front-end frameworks like React and Angular, or server languages like Go or Elixir.

Okay, Meteor has an arguably bigger barrier to entry than React or Angular (maybe), but definitely 100% not Go or Elixir. I think this is just disingenuous.

> I believe some of Meteor's early choices may have ended up handicapping it. For example, Meteor's focus on real-time applications makes them a lot easier to build compared to any other platform out there. But it does also come at the cost of hidden complexity, and potential performance problems down the road.

This is the #1 problem of every framework, ever. Mr. Greif is not saying much, if anything at all.

> Once a new Meteor user starts to go beyond the basics and look into things like routing, pagination, subscription caching & management, server-side rendering, or database joins, they realize that the difficulty curve quickly ramps up.

Here he's conflating things that are easy (routing and pagination) with things that are hard (subscription caching), so it's hard to see exactly what the criticism is. Not to mention that Iron Router is pretty mature; I haven't run into a routing issue yet that it couldn't solve. As for joins and caching, etc., these are definitely difficult things. I don't think any framework out there solves them completely (and in the general case) out of the box. Maybe someone could introduce me to one.

> The result of all this is that Meteor has ended up in an awkward place.

I think it just ended up where almost all other frameworks end up: useful, but not completely generalized. In fact, I think striving for a very high degree of generality might be a mistake, lest we end up with something like Hibernate.

[1] http://www.amazon.com/Introducing-Meteor-Josh-Robinson/dp/14...

eibrahim 13 hours ago 0 replies      
The reason I picked firebase over meteor was ember support. Ember and ember data on top of meteor would be a killer combo for me.
cpplinuxdude 14 hours ago 4 replies      
This is a little worrying; after much research I'm starting a new project for a startup using the meteor stack.
applepple 6 hours ago 0 replies      
I think the problem with Meteor is that it's not meaty enough...


dfischer 9 hours ago 0 replies      
Lots of fear mongering here, and as you'll notice, a lot of it comes from inexperience with Meteor itself.

We bet on Meteor and it's doing phenomenal. The development experience is great. I haven't been this happy and impressed since rails .9 ripping me a new case of programming love.

Look, Meteor has its downfalls, but so does any stack. There's nothing working against you; it's all just code. There's no magic. You don't understand subscriptions, merge box, and how to handle joins? Level up a little bit. It's not a hard thing to figure out. Open-source code makes it easy to dig in.

Yes, mongo is looked at in bad light, it's also incredibly powerful when used right. The trick is to not be oblivious to the tools you use and work around it when it matters.

The most important thing for a product, startup, and company is the ability to make the right calls and grow at the right time and then be able to handle and accommodate problematic areas when you need to.

Does meteor scale? Yes. Does it have a theoretical limit? Yes. The trick there is to be cognizant of the limit and plan around it when you are approaching.

It's all just JavaScript at the end of the day.

Oh you want to scale out your processing? Easy. Fan out some back end nodes in whatever language you want. Pull from mongo, do your crunching, and then feed it back in. Then meteor will sync it to all clients. This is pretty powerful. You can completely swap out and fan pieces out with meteor.

We have a massive app in meteor. It's also architected to be modular if we have to break it up.

The isomorphic nature of the code base has created pleasant APIs on both the front end and back end and has drastically simplified a real time app experience. Meteor sets a new standard for web applications and demands new perspectives. When you embrace it you get an ecosystem that is quite revolutionary to work with. Everything from the web-app, latency compensation, pub/sub, down to Cordova builds.

Overall, we are very happy. Yes things can be better, and we are looking forward to seeing meteor evolve.

Meteor shouldn't focus on supporting new view frameworks like Blaze vs React. Just focus on adopting the tools out there and making them better. That's why Meteor doesn't need Blaze. Meteor doesn't need a router. Meteor should pick the best open-source versions of those from the JS community and make them easy to work with, in line with the "vision". Meteor should focus on developer UX. This is its power: an opinionated framework that removes the stress of configuration and has tools that fit really well together, allowing an individual, team, or company to focus on building what matters: features.

The vision is grand and I believe it in and I'm also not worried about scale. People said the same shit about rails, and x, y, z. Scale is a good problem to have and when you have that problem you'll figure out what you have to do.

If anyone has any questions about meteor architecture feel free to ping me.


waitingkuo 13 hours ago 0 replies      
In my experience, the best part of Meteor is its tightly coupled stack, which lets me develop very smoothly. But the worst part is also that tightly coupled stack, once my web app grows. I really hope it becomes easier to decouple important packages like Tracker or minimongo.
ricardobeat 9 hours ago 1 reply      
> The recent Blaze/React debate

Can someone point to the backstory here?

premasagar 13 hours ago 0 replies      
Another place I'd love to see Meteor move towards the (future) standards would be an eventual replacement of Fibers with async/await throughout, which to me is more explicit and intuitive.
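The contrast premasagar describes can be sketched in a few lines. The Fiber-style version is shown only as a comment (it needs the fibers runtime); the async/await version runs anywhere, and the collection here is a hypothetical stand-in, not Meteor's real API:

```javascript
// Fiber-backed Meteor code reads synchronously even though it does I/O:
//
//   const user = Users.findOne(userId);  // blocks the fiber, not the event loop
//
// With async/await the suspension point is explicit in the code:
async function findUserName(collection, userId) {
  const user = await collection.findOne(userId); // hypothetical async API
  return user ? user.name : null;
}

// Minimal fake collection so the sketch is self-contained.
const fakeUsers = {
  findOne: async (id) => (id === 1 ? { name: 'ada' } : null),
};
```

The trade-off is exactly the one in the comment above: Fibers hide the asynchrony, async/await makes every yield point visible in the source.
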
elchief 12 hours ago 1 reply      
"Project Foo: The Greatest Thing Ever". What is that? I should look into that.

A year or two later. "Project Foo: Total Crap". Glad I didn't look into that.

yclept 13 hours ago 1 reply      
I was turned off to Meteor when one of the initial releases was touting being able to use the browser console as a Mongo shell. Never took it seriously since.
pm24601 9 hours ago 1 reply      
We used meteor at my last company. By far and away the biggest issue was being stuck with Mongo. NoSQL databases have their place. But they are not a universal solution to everything (or even most things).

Relational DBs exist for a reason.

Being stuck with Mongo is the major reason why I wouldn't use meteor again.

The other stuff was minor and had solutions. We also built an open source library that wrapped up collections well.


sneezeplease 13 hours ago 0 replies      
Meteor is great, and I'm optimistic about what 1.3 will bring with better NPM support.
popmystack 12 hours ago 3 replies      
I'm sorry, but how are you in a lead position but somehow didn't understand a timeless principle of software development?
WebTorrent BitTorrent over WebRTC webtorrent.io
379 points by Doolwind   ago   94 comments top 33
mmcclure 3 days ago 2 replies      
I feel like I've seen this pop up a few times now, but this is really, really cool stuff. The only thing that concerns me about the growing popularity of using WebRTC is the security concerns around unknowingly joining a p2p network like this for potentially any site you visit. It's not hard to imagine what a bad actor could do to content before passing it along, or more simply, the fact that your true IP is exposed.

Curmudgeony security issues aside, this undeniably feels like The Future and a big deal to watch out for. It's also one of those cases where a creator / maintainer makes a huge difference for long term viability in my opinion. Feross is crazy smart and has been working with all the related tech for a while now (via PeerCDN, Instant.io, etc, etc), and is just an all around respectful, nice guy, which is important for the continued development / community aspect.

lambdacomplete 2 days ago 5 replies      
Amazing project, really! But please, for the sake of users (like me) who live in countries where ISPs set a "quota" on DSL connections: ask the users whether they want to start downloading Sintel before doing so :) Now I'm afraid of opening the website again.
imrehg 3 days ago 3 replies      
Does this site really start downloading a 124MB torrent right after opening the page (sintel.torrent)? If so, why would that be a good idea to do?
erikpukinskis 3 days ago 1 reply      
This makes me so happy. If we can get good support for WebRTC and getUserMedia the web will be able to keep going as a decent platform for apps.



We're really at the mercy of open platform-minded engineers at Google, Apple and Microsoft though! I wonder what we can do to help support those folks.

currysausage 3 days ago 2 replies      
Very curious about the legal implications if every site I visit can transfer files to unknown peers in the background. P2P is, AFAIK, a big source of costly cease-and-desist orders in Germany. With WebTorrent, I guess I could tell the rights holder to bring the matter to court and plausibly state that some malicious ad iframe must have distributed that MKV without my knowledge.
TheAceOfHearts 3 days ago 0 replies      
You can try out WebTorrent at Instant.io [0]. It's probably one of the easiest ways to share files with someone, as long as both people have modern browsers.

Unfortunately, past a certain file size it'll just crash your browser. It'd be great if there were a way to work with large (2GB+) files.

[0] https://instant.io/

taylorhou 3 days ago 0 replies      
Very interesting. Figured the day would come but the dev finally did it. Re-decentralizing the web is a great goal and with simple demonstrations like yours, we'll get there! Cheers mate
lelandbatey 3 days ago 2 replies      
This seems very interesting already! I now have some more technical questions:

- Where is the downloaded data being stored? With a traditional bittorrent client I the data is written to disk. Since JS doesn't make raw disk access available, I'm assuming it's being kept track of in through some js api that tells the browser to store this data. What API is it using?

- Even when I finish downloading the video, the player doesn't allow me to seek to random positions in the video. It displays a "this is how much is buffered"[0] bar that is way smaller than the green bar at the top of the page indicating download progress. Why is this the case?

- As you can see in the screenshot[0], there's lots of nodes that are labeled with ip addresses that are not visible to my computer at all. Is this because the displayed ip addresses are self reported?

[0] - http://nacr.us/media/pics/screenshots/screenshot--17-46-37-2...

johnchristopher 2 days ago 0 replies      
Question: I see some local-network IP addresses in the graph. I suppose the external IP addresses are hidden for privacy/security purposes, but how well are they hidden?

Another question: how do I open the file once it's downloaded? (I use uBlock; should the file be displayed in the rectangular area next to the graph?)

yAnonymous 2 days ago 1 reply      
That's great, but BitTorrent over JS is also dangerous, at least where I live.

C&D letters come with a 200-1000 fee depending on the content, and now it's trivial to make someone download stuff illegally in the background.

kentbrew 2 days ago 1 reply      
The page could use just a tiny bit of explanation about what's going on. Firefox 43.0.4 doesn't play the movie; it just sits there with a black box.
magicmu 3 days ago 0 replies      
What a coincidence, I was just playing with this for the first time last weekend! They also have an npm package that can be used for both torrent streaming via node and the browser (https://www.npmjs.com/package/webtorrent). Awesome project.
liamzebedee 2 days ago 1 reply      
WebRTC requires the use of a centralised signalling server for the initial connection between two peers. I feel many miss this point when reading about WebRTC-enabled projects. Even if you do have Universal Plug and Play which port forwards automatically (and thus you can communicate directly between two peers), you still need this centralised signalling server.

Correct me if I'm wrong, but this poses a problem if you ever want to take WebRTC further (i.e. in a self-hosted mesh network).

rtkwe 3 days ago 0 replies      
Interesting, if the player never starts you never connect to additional peers. I'm running this in firefox 43 with flash disabled and the video never starts.
rasz_pl 2 days ago 0 replies      
1. It pretends to work in a browser that doesn't support WebRTC. This got me thinking, so I went to webrtc.org, and all the examples/samples there also pretend to work and/or fail silently. Is the WebRTC API really unable to even ascertain the level of support in the running browser? I looked under the hood and found this in https://webtorrent.io/bundle.js: throw new Error('No WebRTC support: Not a supported browser'). So it definitely can; it just fails to catch those errors and do anything with them or inform the user.

2. Looking at the network traffic, it seems to open a separate TLS session per transferred data packet; not the most optimal thing to do, and it might be an artefact of being hosted on https. Probably a CPU bottleneck right there.

3. It doesn't store anything anywhere (local/session storage).

janpieterz 2 days ago 0 replies      
Interesting; I'd be curious to see some speed tests. I was seeding to around 22 peers for a while but never got over 5 Mbps up, while my internet connection is capable of around 530 Mbps. I wonder whether this is an inherent WebTorrent limitation or simply that not enough people were online with strong connections.
cel1ne 2 days ago 0 replies      
Like many, I've been thinking about this for a couple of years.

My idea was a browser plugin for YouTube that would take the downloaded video and start seeding it. On the other side, if a video had been blocked by YT, it would automatically use the torrent version.

jaysoncena 3 days ago 0 replies      
How come the download was already completed but the video only buffered around 50%?
throwaway13337 2 days ago 0 replies      
Can't wait to see a popcorn time in the browser. :)
franciscop 3 days ago 0 replies      
I was toying with the idea of doing something like this a couple of days ago, but two things stopped me:

- No support even in modern browsers by default [1]

- Don't want to [maybe] get into legal troubles if it's wrongly used

[1] http://caniuse.com/#search=webrtc

PS: apparently the caniuse info was wrong, since it now appears in green.

buzzdenver 3 days ago 2 replies      
What is the animation on the left, full of RFC 1918 addresses? I assume those are really NATed at some point, aren't they?
edpichler 2 days ago 0 replies      
Very cool, but what annoys me is that it starts the P2P download and upload without asking for authorization.
devilsbabe 2 days ago 0 replies      
Other companies like http://streamroot.io/ are also using WebRTC to help content hosting sites like YouTube and Netflix deliver VOD and live streams. Really exciting!
ferongr 2 days ago 0 replies      
>Error: No WebRTC support: Not a supported browser

Funny, Fx44 does support WebRTC

nik736 2 days ago 0 replies      
If I use Safari on that site it's just downloading from your server, right? Since Safari doesn't support WebRTC.
leoplct 2 days ago 0 replies      
Looking forward to see Popcorn time on WebRTC
vonklaus 3 days ago 1 reply      
it is so fucking obvious that this idea is exactly how browsers will work in the future. A browser is going to just be something like node-webkit/webkit/electron etc., so compatibility won't be an issue; then you just connect to a ton of different clients that are running narrow crawls of shit you are searching for. The browser will then not take you to the page, but just display the information directly without loading a shit ton of js.

You can tag or organize the data locally and cache it, or return it sorted to the nodes which serve it to others. People don't give a shit about webpages for search, they care about information. The web is a big rss feed, and our old feedreader "google" stopped doing that well, and also we pay a massive privacy tax for that now.

I see this happening in ~2 years for really techie people and being standard in 5.

edit: Elasticsearch, WebKit, real time, distributed file systems, Apache Spark, Google TensorFlow. These ingredients will be used to make the new browser, which browses information and returns that information, not the actual web pages.

ericfrederich 2 days ago 0 replies      
How does this project differ from ipfs?
andreapaiola 2 days ago 0 replies      
Nice tech!
jsprogrammer 3 days ago 0 replies      
Nice demo.
gionn 2 days ago 0 replies      
are you a wizard?
knocte 2 days ago 1 reply      
WebRTC will anyway become obsolete with IPv6, right?
Nux 2 days ago 0 replies      

It complains it cannot play the file because I don't have Chrome with MediaSource. Why not serve an ogg or webm, for crying out loud?

Also, why auto-start the download?!

After the download is finished, where can I watch the video? There's no link for watching it anywhere.

If I refresh the page the download starts again.

I realise this is just an experiment and kudos for that, but the author could have made some better choices re above.

2^74207281-1 is Prime mersenne.org
302 points by ramshanker   ago   123 comments top 15
jlewallen 1 day ago 2 replies      
Glad I actually read the article because I never would have known this:

"The official discovery date is the day a human took note of the result. This is in keeping with tradition as M4253 is considered never to have been the largest known prime number because Hurwitz in 1961 read his computer printout backwards and saw M4423 was prime seconds before seeing that M4253 was also prime."

coldpie 1 day ago 8 replies      
Big numbers are fun. Here's some stupid stuff I thought of while reading about this.

The number is 2^74,207,281 - 1. Naively stored like a normal integer in memory, that's roughly 75 MB of RAM just to store the number. Compare that to your 64-bit time_t and imagine how long a duration of time could be represented by such a large number. Stored on 3.5" floppy disks for some reason, that's roughly 50 disks just to store the number.

Edit: Duh, bits, not bytes. I leave my mistake here for posterity.
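For the record, the corrected arithmetic checks out in a few lines: 2^p - 1 is a p-bit number, so naive storage costs p bits, and the decimal digit count is floor(p * log10(2)) + 1.

```javascript
// Size of M = 2^74207281 - 1, the prime announced in the article.
const p = 74207281;

const bytes = p / 8;             // a p-bit number needs p/8 bytes, naively stored
const megabytes = bytes / 1e6;   // ~9.3 MB in bytes (the "75 MB" above was megabits)
const digits = Math.floor(p * Math.log10(2)) + 1; // decimal digit count
```

That gives about 9.3 MB of storage and 22,338,618 digits, matching the figure quoted elsewhere in this thread.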

JoeAltmaier 1 day ago 1 reply      
If you were abducted by aliens, and had memorized that number, you could possibly trade it for wealth in their society. If they hadn't discovered it yet!
kylnew 1 day ago 8 replies      
I'm curious to understand the significance of finding the largest known prime numbers. It seems that some people get pretty excited but I'm not sure if it's just purely academic or there's some useful applications I'm unaware of. Can anyone enlighten me?
agentgt 1 day ago 0 replies      
I had to find out what a Mersenne number was again (vaguely remember but forgot) and I was shocked to find this problem in mathematics:

"It is not known whether or not there is an odd perfect number, but if there is one it is big! This is probably the oldest unsolved problem in all of mathematics." [1]

[1]: http://primes.utm.edu/mersenne/index.html

haberman 1 day ago 3 replies      
> The primality proof took 31 days of non-stop computing on a PC with an Intel I7-4790 CPU.

How does this work? You just try to divide by every possible number and ensure the result isn't an integer?
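For Mersenne numbers specifically there is a far faster route than trial division: the Lucas-Lehmer test, which needs only p-2 modular squarings. The GIMPS software does essentially this, with heavily optimized FFT-based multiplication; a minimal unoptimized sketch:

```python
def lucas_lehmer(p):
    """Return True iff M_p = 2^p - 1 is prime, for an odd prime p."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m  # one modular squaring per iteration
    return s == 0

# Known Mersenne prime exponents pass; composite cases (11, 23) fail.
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31) if lucas_lehmer(p)])
# [3, 5, 7, 13, 17, 19, 31]
```

For p = 74,207,281 each squaring involves a ~74-million-bit number, which is why the run took 31 days even on a fast CPU.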

gitpusher 1 day ago 0 replies      
Meanwhile, somewhere in Wisconsin, the number 274-207-2811 is being mercilessly prank-called by mathematicians.
pluteoid 1 day ago 0 replies      
Nice. Only the 49th Mersenne yet discovered. This also gives us a new perfect number, 2^(74207281-1)(2^74207281-1).
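The perfect-number connection is the Euclid-Euler theorem: every even perfect number has the form 2^(p-1)(2^p - 1) with 2^p - 1 prime. A brute-force sanity check on the small cases:

```python
def is_perfect(n):
    # A perfect number equals the sum of its proper divisors.
    return n == sum(d for d in range(1, n) if n % d == 0)

for p in (2, 3, 5, 7):              # 2^p - 1 is prime for each of these
    n = 2 ** (p - 1) * (2 ** p - 1)
    print(p, n, is_perfect(n))      # yields 6, 28, 496, 8128 -- all perfect
```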
ramshanker 1 day ago 2 replies      
22,338,618 digits long, one more step towards the EFF award for the first 100-million-digit prime number.
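The digit count can be verified without materializing the number, using the fact that 2^p has floor(p * log10(2)) + 1 decimal digits (and subtracting 1 never changes the digit count, since 2^p is not a power of 10):

```python
import math

p = 74207281
digits = math.floor(p * math.log10(2)) + 1
print(digits)  # 22338618, matching the announced length
```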
jacquesm 1 day ago 1 reply      
Wow, many years since I've contributed to the project with my little cluster, nice to see them still going strong.

Incidentally, the project source code contains one of the fastest Fourier transform implementations that I'm aware of.

Consultant32452 21 hours ago 2 replies      
I have a stupid question that will reveal how I know nothing about cryptography. I understand that large primes are somehow important for some types of cryptography. How are there not just databases of large primes that make breaking the types of cryptography that rely on large primes trivial?
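Not a stupid question; the short answer is density. By the prime number theorem there are far too many primes of cryptographic size to ever enumerate in a database. A rough count of 1024-bit primes, computed in log space to avoid float overflow:

```python
import math

# Prime number theorem: pi(x) ~ x / ln(x).
# log10 of the approximate count of primes below 2^bits.
bits = 1024
log10_count = bits * math.log10(2) - math.log10(bits * math.log(2))
print(round(log10_count))  # ~305, i.e. about 10^305 primes fit in 1024 bits
```

For comparison, there are only around 10^80 atoms in the observable universe, so no conceivable database could hold even a vanishing fraction of them.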
hinkley 1 day ago 1 reply      
Are there any double Mersenne primes that we know of? Where 2^(2^n - 1) - 1 is prime as well?
legohead 1 day ago 1 reply      
is there anything special you can do with mersenne primes? if there's no pattern to them, can't it just be seen as dumb luck?
banku_brougham 1 day ago 1 reply      
I love this. So M49 is now the largest known prime, but what does it take to find the next prime number? Or any larger prime?
technotaoist 1 day ago 2 replies      
I downloaded and checked the zip file. Last digit is a 6. Downloaded it multiple times to test. Can anyone else replicate this?
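One way to check such a claim without re-downloading anything: the last decimal digit of 2^p - 1 follows from 2^p mod 10, which cycles 2, 4, 8, 6 with period 4. Since 74207281 is 1 mod 4, the last digit should be 1 (and it must be odd in any case). A sketch:

```python
p = 74207281
last_digit = (pow(2, p, 10) - 1) % 10  # three-argument pow = fast modular exponentiation
print(last_digit)  # 1
```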
Why you should not develop apps for Windows 10 irrlicht3d.org
367 points by irrlichthn   ago   252 comments top 20
creshal 16 hours ago 15 replies      
It's crazy just how unfinished everything involving Windows 10 seems to be. Half-assed UI concepts implemented clumsily, supported by an unsupported store, updated haphazardly at a pace that makes enterprise scream bloody murder, and all Microsoft does is try to trick more users into accidentally updating.

What are they thinking? They can do better than this. Much, much better.

_nedR 16 hours ago 6 replies      
The worst thing in my experience about Windows 8/Windows 10 apps is that you cannot install apps from 3rd party sources. Sure, you can install on a temporary/testing basis by signing in with a dev license, but you have to keep doing this every 3 months until Microsoft decides to pull the plug.

This is a problem if you are writing an app for a client who doesn't want to publish on the app store. The only option is the very expensive and opaque volume-licensing program which is useless if your app won't have many activations anyway.

Ultimately Windows 8/10 is a closed platform similar to iOS. Are we as developers willing to pay 30% of our revenues to Microsoft when the day comes that UWP will replace .net/win32?

opayen 14 hours ago 1 reply      
From the article: "Unless you know the exact name of my app, you won't find it."

His "Website constructor" app is in 3rd position when I search for "Website". Not that bad for such a generic keyword.

Also, I guess one thing with these apps is that the price range (between $12 and $20) is quite high compared to the majority of the other apps on the store and (unfortunately?) a lot of people are not ready to buy apps >$2.

I guess that the store search algorithm depends on the conversion rate (people buying the app VS people trying it) and that conversion rate is probably pretty low, so maybe lowering the app price could lead to more sales and higher ranking.

TheRealDunkirk 14 hours ago 0 replies      
It's just like every other new Microsoft product. They produce enough material to get something started -- lots of demos and howtos and samples -- all by Microsoft employees or surrogates -- and then they leave it for something else, and hope the world figures out the rest. They try to suck you in up front, and hope that you get committed enough to not leave when you discover the mess that's waiting for you behind the curtain.

I've come back to full-time Windows development after over a decade of PHP and Rails, and you could lift my experience with trying Entity Framework and lay it on top of these complaints, and not notice a difference.

I've always said this about Microsoft products: they make it really easy to get to 70% of what you want, and then make it nearly impossible to finish. There's a massive bend in the effort/results curve. With open source, it's more difficult in the early stages, but it's a steady progression to 90%, and then you have the tools to work out how to get exactly what you want, if you want to make the effort.

lottin 15 hours ago 2 replies      
For a long time Microsoft was a monopoly and it shows.

Recently I contacted Microsoft technical support over a product that has some serious issues. Basically they tell me that they have no intention of paying for the repair even though the product is under warranty, in blatant violation of European law.

wanda 13 hours ago 1 reply      
I use Chrome OS as my daily driver/recreational computer, and I work from the command line on FreeBSD and Linux machines during the day, so maybe I'm just out of touch with Windows, however:

My partner has a laptop with a Core i3 processor and above-average RAM, and it is on the whole a beast compared to my Dell Chromebook 11. My partner upgraded to Windows 10 and I have since used it occasionally to check something here or there.

From a couple of admittedly brief sessions using it, I have noticed that Windows 10 seems extremely "laggy", for want of a better word, as though there is a 100ms delay between clicking something and that click registering. It's nimble enough at starting up, it's just interacting with the thing that seems to be slow.

Anybody else experience this, or am I just used to instant feedback from UI?

ChuckMcM 10 hours ago 1 reply      
While I sympathize with the author's rant (bad service is bad service), I don't know if not developing for Win10 is the conclusion. It sounded like the app store sucks, but so does the OSX App Store, and that doesn't mean you don't develop for OSX; it just means you don't use the App Store.

Sounds like side loading is a better choice on both OSes.

vmateixeira 15 hours ago 3 replies      
It's not only Windows 10 app store support that's like that. Support for every Microsoft product shares the same constraints and lack of solutions/answers.

After moving abroad I was unable to access my Microsoft account (locked) for more than a month, just because I couldn't provide them with the last IP address I had used to access my account, along with the last 5 received email subjects, 5 contacts in my contact list, among many other unreasonable questions the common user doesn't even know how to answer. All this because I didn't own my recovery email address any longer.

Sir_Substance 12 hours ago 0 replies      
This is a problem Microsoft has had since their Phone 7 store. I tried out their app submission process with a small and uninteresting but perfectly functional dice rolling app back when the Windows store was brand new with nothing on it, and saw the same thing. It doesn't really matter what tags you add; the search is crap and my app didn't show up on any of the tags I added. Only by searching its name letter-perfect could I find it. I wonder sometimes if there's a hidden seam of quality apps on the Windows store that no one knows about because the search is so bad...
SCHiM 15 hours ago 4 replies      
I like Windows 10; it's better than Windows 7 IMO. Things have a smoother feel, the console works better, and I feel more secure due to all sorts of new security improvements. I haven't noticed huge issues; the only thing that really bothers me is the fact that Explorer has a tendency to freeze for a second when accessing a large spinning disk for the first time since boot up.
eklavya 10 hours ago 0 replies      
I have an HP Split laptop. It's really anemic and I regret having thought that processors had come far enough that you don't have to think about them (you do). It has an Intel Pentium N3510. Anyway, it came with Windows 8 and it was a slow and buggy product from day one. Linux on it runs worse than Windows somehow. But since Windows 10 it's at least bearable now. Windows 10 is faster, at least on this machine.
ksk 7 hours ago 0 replies      
Yeah, hopefully more people listen and opt out. A future where Windows has a reduced market-share is ideal for the consumer. It doesn't even have to be that OSX or Linux take over. Maybe the void could be filled by some other vendor with a completely new design.
7952 15 hours ago 0 replies      
It never made much sense to focus the desktop app store on commercial products. Windows has a plethora of high quality open source software that most users never find and would have trouble downloading safely. Why wait for the commercial market to catch up when you can make good software available right now that benefits users? App stores work because of easy availability and without that they are mostly useless.
darkhorn 15 hours ago 0 replies      
And the apps work very slow.
hohenheim 14 hours ago 0 replies      
Couldn't they forward you to someone who can help you with the problem? The support seems to shut you down and close all the doors, which is infuriating at best.
jhildings 15 hours ago 4 replies      
Apps? What's wrong with saying "applications", as for the last 20 years in Windows environments?
at-fates-hands 12 hours ago 0 replies      
As a user who has not only Windows 8.1 but also Windows Phone (now Windows Mobile) 10, this blog is spot on and indicates a much larger problem on the horizon.

I can't tell you how many times I saw an article about an app and tried to find it in the store, only to come up empty handed. This has happened so many times on my Windows Mobile 10 phone, I lost count. Just this week I was looking for a better Twitter client app. I would type in "twitter" "tweet" "twitter client" "twitter app" and all I would get would be the main Twitter app and nothing else. Then I had to start doing Google searches for "best twitter client for windows mobile 10" which then gave me articles from 2013 and 2014 and for WP 8.1 - not exactly an up to date list of current Twitter clients.

Compare that to the Google Play store. You simply enter "Twitter" and get a dozen other apps, none of which have Twitter in their name. Plume, Echofon, Fenix, Talon, Periscope are just a few of the examples.

It was such a massive headache, I actually decided yesterday I'm scrapping Windows Mobile 10 and going back to Android. I've been a huge supporter since WP 7, and held out hope things would improve, and they haven't. Their app store is a massive failure, you can't find anything in the store you want, developers have no reason to build for the platform, and I'm not going to start with the myriad of UI problems I see already in the latest build (build 10586.63) that still have not been solved. Just basic shit like battery life is still a major problem. Not to mention they took away some very basic features from 8.1 that everybody loved, like the ability to show Bing weather on the lock screen.

I've finally reached my breaking point with their platform.

arca_vorago 12 hours ago 0 replies      
Now if only I could push LibreOffice/GDocs, or MS would release Office for Linux, I could deploy Linux. Also, way too many SCADA systems require Windows machines for the various proprietary IDEs... and it drives me nuts! Where's the FOSS SCADA?

Windows is dangerous and 10 just made it more so. It's not easy, but break free of the proprietary os chains now before they lock your brain in with iBrain and update it without your permission.

joesmo 9 hours ago 0 replies      
It sounds like he's talking to a bot when emailing Microsoft. I'm not surprised they'd use one to respond to emails.
mc_hammer 15 hours ago 5 replies      
from the windows 10 hello world:

To make your window resizable (!) and responsive, use the following XAML

 <VisualStateManager.VisualStateGroups>
   <VisualStateGroup>
     <VisualState x:Name="wideState">
       <VisualState.StateTriggers>
         <AdaptiveTrigger MinWindowWidth="641" />
       </VisualState.StateTriggers>
     </VisualState>
     <VisualState x:Name="narrowState">
       <VisualState.StateTriggers>
         <AdaptiveTrigger MinWindowWidth="0" />
       </VisualState.StateTriggers>
       <VisualState.Setters>
         <Setter Target="inputPanel.Orientation" Value="Vertical"/>
         <Setter Target="inputButton.Margin" Value="0,4,0,0"/>
       </VisualState.Setters>
     </VisualState>
   </VisualStateGroup>
 </VisualStateManager.VisualStateGroups>
i lolled.

theres also this gem:

 bmp Windows::WinForms::XAML::Imaging::Bitmap = ^new Windows::WinForms::XAML::Imaging::Bitmap(filestream)
Which can be avoided by putting `namespace Windows::WinForms::XAML::Imaging` in every CPP file.

Riemann A network monitoring system riemann.io
341 points by jonbaer   ago   103 comments top 26
falcolas 1 day ago 8 replies      
Evaluated it, and ultimately rejected it for a couple of reasons:

- You must pick up Clojure to understand and configure Riemann (we're not a Clojure shop, so this is a non-trivial requirement)

- Config file isn't a config file, it's an executed bit of Clojure code

- Riemann is not a replacement for an alerting mechanism, it's another signal for alerting mechanisms (though since it's Clojure and the configuration file is a Clojure script, you can absolutely hack it into becoming an alerting system)

- Riemann is not a replacement for a trend graphing mechanism.

- There are other solutions which can be piped together to get the 80% of the functionality we wanted from Riemann (Graphite + Skyline) in much less invested time

Skyline link: https://github.com/etsy/skyline

foolano 1 day ago 1 reply      
Riemann is great. We use it at work.

I like it so much that I did an experiment to implement it in C++


My implementation sucks, but I had a lot of fun working on it and I got to learn how Riemann works better.

jamtur01 1 day ago 0 replies      
Love Riemann. So much that I'm writing a book about monitoring with Riemann as the core routing engine:


There's a sample chapter available, which covers the initial Riemann implementation and a Clojure "getting started" guide which should help anyone - even if you're not interested in the rest of the book! :)


shizcakes 1 day ago 1 reply      
We've been using Riemann since 2013 and love it!

If you're coming from Nagios (or not), and you'd like something that will schedule Nagios event scripts (and others) and send them to Riemann, I have been using this in production since mid-2013: https://github.com/bmhatfield/riemann-sumd

It allows you to tap into the huge ecosystem that is Nagios monitors, without requiring any other Nagios component at all. It just translates the output into a Riemann event.

donaldguy 1 day ago 0 replies      
Since the theme of this thread seems to be non-clojure alternatives, I'll point in the direction of InfluxData (formerly InfluxDB)'s new-ish Kapacitor project.

While I'm unconvinced their custom JavaScripty DSL (TICKscript) is actually preferable to Clojure or even can be read without careful, quite LISP-like indentation, it is pretty similar in basic functionality to Riemann and is definitely not-Clojure*

see: https://influxdata.com/blog/announcing-kapacitor-an-open-sou...



*at worst, it's a mangled subset of clojure with extraneous dots and the parentheses in the wrong places :-)

moomin 1 day ago 1 reply      
Reminder that the author of Riemann is available for consulting and is a scary-smart human being based around SF.
dorfsmay 1 day ago 1 reply      
Note that this isn't a better Nagios... It's more like Apache Storm. Think of it as grep on steroids. I've heard of several thousand events per second on a single server. The other advantage is that it uses a well-known programming language rather than a DSL.
pyritschard 1 day ago 1 reply      
I'm very happy to see riemann featured here. I've been using it in production since early 2012 and contributed to it (intensively, at one point) as well.

It's been a breeze, rather worry free and its very good collectd support has enabled us to cover very interesting use cases at Exoscale.

FlorianOver 1 day ago 1 reply      
Used in production without a hassle for 2 years. For our setup and scenario it was a very fitting and good solution. Easy setup and easy usage. Recommended.
pkd 1 day ago 5 replies      
Clojure. I think I should stop thinking and really learn that language now. It seems very tempting as a first functional language.
lafay 1 day ago 2 replies      
"Network monitoring system" makes me think of routers, switches, Mbps / Gbps, etc. This seems more like a "server monitoring system."
elementai 1 day ago 1 reply      
Evaluated it and a couple of others. Chose Bosun, never looked back, it's probably the only system with flexible and concise DSL for evaluating alerts (somewhat similar to R in spirit).

Riemann rocks, just not as monitoring system.

Bosun link: http://bosun.org/

adam-_- 1 day ago 0 replies      
The presentation video on the homepage is amazingly engaging. It's well worth a watch.
hendry 1 day ago 4 replies      
Anyone use http://prometheus.io/ ? wdyt
zgohr 1 day ago 1 reply      
Monitoring request - dynamic configuration. Using Nagios requires configuration file changes and a service reload. Would prefer dynamic, on-the-fly configuration. Even better, configuration that can be adjusted externally, possibly from a web API endpoint. Better still, configuration that IS external, where the monitoring service queries an external service every X minutes to determine what to monitor.

Performant - in the realm of 6-OID monitoring of 50,000+ devices at 5-minute intervals

ultramancool 1 day ago 1 reply      
How does a system like this differ from say, the ElasticSearch+Logstash system?
MayMuncher 1 day ago 0 replies      
I've been using Riemann at work for the past 6 months or so. There is a learning curve if you don't know Clojure; that has been the biggest hurdle for me. But I love what it does, and it's fantastic along with the dashboard it comes with.
porker 1 day ago 0 replies      
Anyone able to give a quick overview of why they use Riemann vs SaaS offerings, and the distinguishing features (other than cost/control)?

I like the look of the config structure, being Clojure.

la6470 1 day ago 0 replies      
All new monitoring systems should be named YAMF-X (Yet Another Monitoring Framework)

Jokes aside... while the stream processing on events seems powerful, there was something similar in Graphite, but probably not as advanced or easy to use. However, the push approach brings its own limitations, especially on existing setups.

bluedonuts 18 hours ago 0 replies      
We use Riemann to track all requests on Apis amongst other things. Have become rather fond of it. It happily handles 40k events p/s on rather modest hardware. The dashboard is not the prettiest but it is functional. Overall it has proved invaluable for spotting anomalies in our metrics.
avodonosov 1 day ago 0 replies      
What is the purpose of such a system? Can anyone describe at least one solid use case?
jordigh 1 day ago 1 reply      
Hm, why the name? I don't get it. Riemann wasn't known for being a good monitor or sentry.
acd 1 day ago 0 replies      
Very interesting project in monitoring! Thanks for creating it! Seems like new fresh ideas :).
wuliwong 1 day ago 0 replies      
What is the connection between the name Riemann and the product?
peterwwillis 1 day ago 0 replies      
A Java version of Ganglia?
uberneo 1 day ago 2 replies      
https://nodequery.com/ -- this is the lightweight version , not as extensive as Riemann but good for normal monitoring and notifications.
Did European Court Just Outlaw Massive Monitoring of Communications in Europe? cdt.org
261 points by ghosh   ago   40 comments top 4
kbart 1 day ago 2 replies      
A great gesture, but sadly I'm sure all these national letter agencies will find a loophole in the laws or pass some bills to push their agenda anyway. You know how the usual story goes -- "but terrorists!", "think of the children!" etc. Enforcing an ECHR ruling was never a strong point; on the way to becoming national law (if ever), it often gets diluted or (purposely) misinterpreted.
bede 1 day ago 4 replies      
Given that our Home Secretary recently stated that 'the UK does not undertake mass surveillance' to the investigatory powers government committee, I can't see this having any semblance of an effect (over here, at least).
umanwizard 1 day ago 2 replies      
Forgive the nitpick, but it's an important distinction: this is the European Court of Human Rights, not the "Eu" [sic] court of anything. It is a separate institution from the EU and has non-EU adherents, like Russia for example.

Another point: the court has no enforcement mechanism. I suspect if all the major European powers disagree with one of its rulings, they will easily be able to flout it with impunity.

mpweiher 1 day ago 1 reply      
Except here the answer is YES.

And they've done it several times before, but governments keep ignoring the rulings. I wonder if entire governments can be jailed for contempt of court...

India's telecom regulator cracks down on Facebook for its Free Basics campaign [pdf] trai.gov.in
372 points by spothuga   ago   98 comments top 24
scorpion032 1 day ago 4 replies      
Lets get this straight.

1. Facebook made users email the regulator on a subject of "tangential relevance" - saying they support Free Basics, while the questions asked were on Differential Pricing

2. These emails were unsubscribed by TRAI, and 12 MM of those 14 MM emails weren't actually sent - probably because they went out to an empty mailing list.

3. The emails that were sent, were sent by misleading people into "supporting digital equality".

4. Facebook chose to represent and speak for all of the millions that had chosen to "support digital equality", which was questioned by the regulator.

5. Facebook didn't bother to inform the users that originally answered the "opinion poll" of "supporting digital equality" of the questions asked by TRAI even after having been asked to and extending the consultation deadline for the same.

6. Facebook chose to spend $44MM on this campaign in this process (and an obviously unknown but really large sum for lobbying!).

I'm no policy expert or a strategy consultant, but if there ever has been an epitome for "shooting oneself in the foot", this would be it.

firasd 1 day ago 1 reply      
Harshest sentence: "your urging has the flavor of reducing this meaningful consultative exercise designed to produce informed decisions in a transparent manner into a crudely majoritarian and orchestrated opinion poll."
aws_ls 19 hours ago 0 replies      
I am loving the net-neutrality activism in India, from a wide bunch of folks: startup folks (HNers and the likes of Mahesh Murthy); OSS folks (the likes of HasGeek/Kiran Jonalagadda); even the entertainment industry (AIB, Vishal Dadlani, etc.)... so for the third time now, the message has been driven across that we are not sheep. You can fool the over-eager politicians, in their willingness to be seen as tech-savvy (even tech visionaries!), by just being overly pleased to have photo-ops with FB founders and the likes. But not everyone.

Thankfully, there is enough critical base of skepticism, over here. These kind of days, make me really proud as an Indian Internet user.

PS: On another note, welcome the entry of Netflix. Already became a member and enjoying it. I hope they remain very-very careful of not disturbing the sacred net-neutrality waters.

gopalv 1 day ago 0 replies      
You can't astro-turf a groundswell - I think that's the whole reason TRAI is ticked off.

I applaud them for catching onto this, because the Indian "license raj" has its downsides as well as upsides. This is the first time I'm seeing the regulatory powers actively fighting the "the first hit is free" tactic.

The ideal approach would be to meter it as usage, with Free Basics getting a fixed bandwidth fraction until the pricing kicks in.

Also, it helped that the non-FB movement explained things better - the AIB ads to save the internet[1] were hilarious.

[1] - http://indianexpress.com/article/technology/tech-news-techno...

chatmasta 1 day ago 3 replies      
So Facebook complained that its emails were blocked?

There is some really, really delicious irony in that complaint... perhaps if Facebook paid for higher priority mail delivery, their emails would deliver. :)

I've always found these "email your congressmen!" campaigns to be largely ineffective and even counterproductive. The FCC opened net neutrality for public comment and received a similar response, i.e. hundreds of thousands of form emails sent to them. The emails bury any signal under gigabytes of noise. The only result is that regulators will ignore all emails, not just the form letters. The form letters are basically a regulatory DDOS campaign.

Add to that the fact that the campaigns are orchestrated by large players in tech, e.g. Facebook, and they lose all credibility. A megaphone and an agenda should not be sufficient tools to subvert public discourse.

goddamnsteve 1 day ago 2 replies      
The hardest part is that there are Indian minds at Facebook designing these schemes for India. Such a shame on them.
akshayB 1 day ago 2 replies      
Free Basics - Facebook claims that any company or service provider can join this, and if that is the case, why don't they give India's people the power to decide who can or cannot join? As of now Facebook has the final word in rejecting or accepting your application, which is just wrong. They can easily funnel in USA companies (or Facebook-invested startups) in India which compete against Indian companies. This is a nasty move by Facebook so they can dominate the world's biggest open market (17% of the world's population).
denzil_correa 1 day ago 1 reply      
Here is Facebook's letter to the Telecom Regulatory Authority of India (TRAI) that garnered this response.


Here are earlier communications between TRAI and Facebook


pflats 1 day ago 1 reply      
Every time I read another article on this, it makes me more skeptical about Zuckerberg's for-profit "charity".
malchow 1 day ago 1 reply      
The funny (or unfortunate, if you're FB) component is that if the FB form response in favor of Free Basics had been written more elegantly, TRAI might have considered it to be responsive. The TRAI invitation to comment was winsomely written. It included the following as the final question:

Are there alternative methods/technologies/business models, other than differentiated tariff plans, available to achieve the objective of providing free internet access to the consumers? If yes, please suggest/describe these methods/technologies/business models. Also, describe the potential benefits and disadvantages associated with such methods/technologies/business models?

A rather nice invitation for people who actually did want to write in in support of Free Basics.

middleclick 1 day ago 2 replies      
The address from where FB is operating is the Taj Mahal hotel in New Delhi where the lowest room rent is approximately 450 USD per night. Quite interesting!
option_greek 20 hours ago 0 replies      
Facebook has been using so many tricks used by politicians these days that they might as well register as a political party in India :)
shmerl 1 day ago 1 reply      
Meanwhile in US big ISPs erode net neutrality with caps and exemptions while FCC is doing nothing.
therelaxist 20 hours ago 0 replies      
The one thing that everyone overlooks is the fact that Facebook wants everyone to use their product (both rich and poor). I don't think Free Basics has much to do with net neutrality, since they lobbied against it in the US.

Some of the arguments their sales rep puts up when you go to them to place an ad:

1. We display ads even if the user does not have a need.

2. The standard "Everyone is on facebook", which is not quite true in India.

I eventually stuck with google because I didn't feel my product which is targeted at farmers would be useful enough on facebook. This whole bullsh!t campaign is to get all indians on facebook so they can convince advertisers to sell ads exclusively on FB.

PS: freebasics may be an ad free platform but facebook on freebasics will definitely not be ad free.

suprgeek 1 day ago 2 replies      
Facebook tried to game the "Invited comments" system in an amateurish way and got spanked. GOOD!

But this should not distract from the fact the there needs to be a substantive debate about the merits/otherwise of their Free Basics initiative.

In a country like India where millions of people have ZERO access to the full internet, any effort that provides ANY access - however limited and curated - should not be shouted down by a vocal subgroup.

If after a meaningful debate the conclusion is that it is better to hope for eventual full access with zero current access(!) rather than instant limited access then so be it, but Facebook is not helping its cause with these stupid shenanigans.

randyrand 1 day ago 0 replies      
FB PR needs work. But I support a free and optional pared down internet.
chris_wot 1 day ago 0 replies      
Earth to Facebook: astroturfing entirely sucks, it was a very 90s tactic and it's an absolutely shit way of convincing anyone of anything.

Frankly, you look like you are bullies, and you've pissed off a lot of people by absolutely misleading them into signing your petition, to the point where you actually tricked many folks directly.

There are many, many people who distrust Facebook. I used to be on the fence, but now I see just what Facebook is like, I have to work out if I'm going to continue with an account. Frankly, it would be great if Facebook would wither and die on the vine (IMO, of course!) and a more ethical social media platform takes over. I can but live in hope.

jrbapna 20 hours ago 1 reply      
As somebody who hasn't been following this at all, can someone please explain all of the hostility toward FB? I get that what FB is offering is far from ideal, but isn't it slightly better than the status quo of no internet for a large number of people? And as Facebook is a business, not a charity, shouldn't they be able to recoup money on their investment? I imagine giving free internet access, albeit limited, to the world's soon-to-be largest population doesn't come cheap.
crossdiver 1 day ago 0 replies      
So rewarding to see the epic paternalism from FB smacked down to size.
intended 19 hours ago 1 reply      
I am very worried about Facebook's behavior.

The history of the Indian NN battle has been-

1) old TRAI head releases an implicitly anti neutrality set of consultation questions, very close to the end of the consultation period. (The underlying plan was to have the new rules passed without scrutiny, and the new head of the regulator would assume office in a month or two. All decisions would be blamed on the old head - SOP)

2) Nikhil Pahwa among many other individuals, including people on Reddit india start being vocal about it, (including an MP who brings it up in parliament)

3) these individuals coalesce into a rough group and, using Twitter and in particular a YouTube video by AIB (an Indian comedy group), get the message out. Millions of emails specifically answering the questions get sent to the regulator

4) what was assumed to be a slam dunk for telcos, turns into an actual consultation process, especially with the arrival of the new TRAI head.

5) committee is formed and consultation paper answers/counter comments are being taken into consideration for policy


Now comes a new paper - months after the previous NN movement. The topic is on differential pricing.

This time Facebook learns from the NN movement and opens with a rebranded Internet.org: Freebasics.

Freebasics ostensibly is using the Facebook network to promote itself. Practically it's the same as Facebook using its network to promote a policy which it thinks is good for its users.

FBasics follows a huge online campaign with a marketing blitzkrieg. (Not even kidding. There were more ads for Facebook than there were for popular movies at the time. Multiple hoardings and newspaper ads.)

In essence, Facebook learnt from the NN movement, and tried to create the same basic groundswell of support for its plans. It included utterly unethical online surveys which essentially asked "do you agree with saving people: yes/maybe later".

In sharp contrast - while FB started strong, the Save the Internet coalition had to do a cold start - they never intended to be a permanent NGO or movement. Nor are any of its members activists or professional lobbyists.

So they didn't have things like opt-in mailing lists to reach out to people. Nor had they anticipated the need to ask people if they could be contacted for future updates or requests.

Still, people once again coordinated, got the work done and got the message out - but a much smaller number than before (more arcane discussion topic than NN) and far less than Facebook managed to pump out.


The ability of Freebasics to leverage Facebook is hugely worrisome.

If it were not for a technicality - that some marketing honcho misunderstood the actual message that had to be sent - all of those messages sent to TRAI would be considered valid, and TRAI would have taken it into account.

A TRAI functionary said it directly day before yesterday - TRAI regrets that Facebook handled the issue the way it did because it was a great opportunity for people to let TRAI know what they really wanted. There's a sense of regret and disappointment at the regulator.

Facebook learns. As will anyone who paid attention to this.

The next time Facebook or Reliance needs to fight over a consultation paper, and it moves into the theater of public opinion, they will act correctly.

They will answer the correct questions. They will message more people. They will improve.

In contrast, the volunteers who decided to take this issue up won't exist for other issues, or have the necessary ability and manpower to match the big players.

This isn't a win. It's a warning.


Note: details have been subsumed into larger points, so specific dates and sequences may be out of order (such as the conversion of Internet.org to FBasics).

dang 17 hours ago 2 replies      
Attacking an entire population like that breaks the HN guidelines, which call for civil discourse. Please don't do this.

We detached this subthread from https://news.ycombinator.com/item?id=10932362 and marked it off-topic.

mtgx 1 day ago 0 replies      
Meanwhile, Verizon and T-Mobile seem to be walking all over the "strong net neutrality" rules the FCC passed.


surds 1 day ago 0 replies      
I am loving this! Smacked hard and flat in the face!
bugger_guy 1 day ago 2 replies      
This is epic.. FB is turning evil.
How Elon Musk Stole My Car atlantic.net
392 points by Artemis2   ago   146 comments top 31
6stringmerc 1 day ago 5 replies      
>Tesla has a strange way of communicating with customers that I think is best described as customer service vaporware. That is, they spend more time trying to create the illusion of customer service, rather than actually providing it. There is no mechanism for them to get feedback, as I tried to provide, so it's difficult to see how they can improve if they don't know where they are going wrong.

Now we find out the unexpected corporate benefit of not having showrooms or physical locations by which to service clients - if they can't walk into your place of business and make a scene, just consider them a happy customer!

>In my experience, it's a hobby masquerading as a company, and it can probably run as a hobbyist organization for some time.

This is the gut feeling I get with every over-the-top announcement by Tesla. Frequently I get down-voted here for griping about the linguistic flourishes in Tesla announcements, but I have my reasons. Sure, creating a neat innovation or clever door opening apparatus is impressive and all, and great for show, but the boring part of pulling it off reliably XX,XXX times is a totally different animal.

Also, a corporation where everybody is so on eggshells that no one will point out that the boss lifted a customer's car, and everyone is too scared to engage either the CEO or the customer, is, for lack of a better concept, high-school-level drama lameness.

sithadmin 1 day ago 6 replies      
>"[The Tesla Owner Advisor] called me to explain he had a call in with the Office of the CEO at Tesla and was working with his team in Tesla to resolve a problem that had come up: their CEO, Elon Musk, had taken my car and was using it as his personal vehicle to test a new version of autopilot. Even worse, he said he could see all the calls I had made into the Orlando delivery center this past week, and no one was taking my calls because no one knew what to do."

The fact that a customer-facing resource is airing information about internal process screwups directly to a customer is indicative that something is very wrong with service management at Tesla; this is a bush-league customer service mistake. Not only is the customer being informed that there is apparently a massive issue with the pipeline for delivering product to customers, but they're also indicating that there's clearly nobody enforcing ownership or accountability for reported issues.

technofiend 1 day ago 6 replies      
Had this been "Elon took my car and told his company to give me a heavily discounted one in exchange" then presumably this article would have been filled with praise for their customer service. What a missed opportunity.

Instead of feeling like you are gambling on a potential upgrade my perception is now I'd be gambling on a potential loss / failure-to-deliver if I bought a used car from Tesla. It's only one datapoint, but it's the only data point I have.

zachware 1 day ago 0 replies      
I once ordered 100 Model S's at one time (Google it) then later took delivery of 12 Model S's at one time. Here's what I can tell you. Tesla's system for selling one new car to one person at one time is very, very good.

Tesla's system for doing anything other than one car to one person at one time is not good.

When we initially placed our 100 unit order we got 100 confirmation emails timed suspiciously as though some poor person was entering the details one at a time. Their owner's website couldn't handle 100 unique vehicles tied to one user. When we took delivery we had to go through a bunch of human processes 12 unique times. The people seemed incapable of batching tasks like signing title paperwork. We went through the routine for each car twelve times.

All of their systems are built to do one thing very well.

So this guy asked Tesla to do something it is not built to do: sell him a loaner car. And the systems broke. All of them. In Tesla systems (operational and technological) everything is built to do one thing.

Yes, it stinks that some people at Tesla acted dumbly in response to this. But overall, don't forget. People are components of a system.

There are no programs in the Tesla system to handle any of the variables this situation threw at it, starting with what the guy wanted Tesla to do.

When you deliberately ask a system to do something it isn't designed to do you shouldn't be surprised when it breaks.

knowaveragejoe 1 day ago 6 replies      
> Tesla is pioneering two things at once, (a) a luxury full-EV segment for passenger vehicles, and (b) bypassing the traditional dealer network and selling directly to consumers. Since I never got my car, I can't speak to (a). But, because (b) is so horribly broken, I don't think (a) can succeed.

How on earth does this person's experience point to the non-dealership model being broken? Has this person never dealt with a shitty dealership?

thecosas 1 day ago 0 replies      
The best customer service 1) identifies and resolves the client's problem and then 2) tries to identify what went wrong internally and escalate appropriately.

Aligning yourself with the customer, then failing to provide a solution is a rookie move.

None of us were on the line with this customer; it's very possible he was prying for details and the CSR was trying to be accommodating with information because they weren't empowered to deliver a good solution.

eridal 1 day ago 0 replies      
This needs to end with Elon Musk personally delivering a brand-new top-model car and saying: "Sorry I took your car, now you'll get mine."
et2o 1 day ago 1 reply      
Yikes. They should give him a better car at the agreed upon price.
wiremine 1 day ago 1 reply      
My father-in-law works for a Tier 2 auto supplier that works primarily with GM. Elements of this story sounded very similar to how GM operates: no callbacks, lack of empathy, passing the buck, etc.

I wonder how deep the similarities go, or if this story is just a really odd edge case.

zekevermillion 1 day ago 1 reply      
This is an interesting anecdote about an extremely unlikely scenario, and one that hopefully you could laugh off if you're in a position to spend $100k on a luxury car.
brandon272 1 day ago 0 replies      
Seems silly to chalk Tesla up as a "hobby masquerading as a company" based on a single interaction buying (or trying to buy) a vehicle that is not even sold as part of their typical sales and delivery channel. What would this person's experience have been if they weren't seeking a special discount deal and just ordered one normally?

The only concerning part in the article is the explanation by Kevin that the reason he couldn't get through to anyone wasn't because no one was available, but rather that they saw him calling and refused to answer because they didn't want to deal with him.

peter303 1 day ago 4 replies      
You probably had a case for a lawsuit, but not worth the effort. Your story probably lost more business for Tesla than they would have paid you in damages. (Maybe they'll sue you for libel.)
Arzh 1 day ago 0 replies      
That is a really weird experience. That being said, one weird experience doesn't mean a system is "horribly broken." You got into a weird hole, but how many of these problems can really come up?
joeevans1000 23 hours ago 0 replies      
This all sounds uncannily similar to my attempt to buy a used Falcon 9 from them. They kept promising delivery to my drone barge, and each delivery attempt turned into a whole runaround, each and every time. Every time the Falcon arrived with more use than what I was led to believe. The one correct delivery was hundreds of miles off on the mainland. The kicker is that I've heard Elon is using that one for himself. No one answering my calls... and no response. I did reach a technician who was very apologetic about the botched deliveries, but he didn't seem to want to escalate the matter up the management chain for some reason.
rdl 20 hours ago 0 replies      
Pretty amazing how organizational incompetence turned what could have been a minor inconvenience, or even a positive (having your car driven by Elon for 50-100 km is probably a plus, if it is already used), into a pissed-off customer.

Otoh, I'd far prefer a company screw over a rich guy who is the very definition of an equal party to contract/informed consumer. The "buy here pay here" used car dealers catering to poor, relatively uninformed, and powerless consumers do things far worse than this as routine business practice.

malchow 1 day ago 0 replies      
These problems suggest more to me about the immaturity of Tesla's Inventory Car channel than about its customer support in general. There may be one person spending 20% of his time on the loaner car sales channel at Tesla.
tlow 23 hours ago 0 replies      
I'm still waiting to hear if:
1. This story is verified legitimate, and if so,
2. Where's Tesla's response?
lectrick 1 day ago 0 replies      
You don't need the dual charger.

Source: Owner of a non-dual-charger who has literally never missed it. I charge at 30mi/hr off a 220v dryer connection in my garage, what's not to like?

agentgt 1 day ago 0 replies      
I wonder if Elon will respond with reasons, given what happened with the New York Times guy: https://www.teslamotors.com/blog/most-peculiar-test-drive

Of course I have doubts the reporter was accurate, but I do believe this guy's complaint sounds legit.

jdenning 1 day ago 1 reply      
I was thinking about buying a (edit: new) Tesla recently, and my main concern was that one might have a difficult experience receiving the car after paying a deposit well in advance.

Tesla - if you're reading this, this guy's experience has convinced me; I won't be considering buying a Tesla again until you can buy one and drive off with it that day.

RankingMember 1 day ago 1 reply      
Reading this made it feel like everyone at Tesla is afraid of Elon Musk, which hopefully is not the case. Not being able to talk honestly to a CEO (or anyone) is how things fester.
devy 1 day ago 0 replies      
Has the OP tried to reach out to Tesla / Elon Musk on Twitter at all? I don't know about other channels, but Elon Musk is on Twitter A LOT!
FussyZeus 1 day ago 0 replies      
This sounds like a completely innocent foulup (Musk taking the car, probably without checking if it was sold) that was then handled in just the worst way possible and snowballed into a ridiculous saga.

You cannot just NOT ANSWER a customer when you don't know what to do. You take it to your superior, who takes it to theirs, who takes it to theirs. Simply not answering the phone and hoping this guy would just be ok with losing 4 thousand dollars is certifiably insane, and whoever decided that should be the course of action should be fired. That is NOT how you handle a customer.

At the very least he should've been offered either the car as is with a discount, or a similar model for the same price. I'm sure he would've been happy with either option, but the Tesla customer service staff utterly failed him.

The existence of a sales channel is utterly irrelevant. He contracted with the sales rep to buy THAT car at THAT price, that's what was agreed and Tesla did not deliver. THEY need to make it work, not him.

ck2 1 day ago 1 reply      
TIL even millionaires look for discounts on Teslas.

What Tesla did wasn't right but very hard to have a pity-party for Marty Puranik.

Shivetya 1 day ago 0 replies      
While many bemoan the dealership model, it was a dealership I once used to get something done when all else seemed to fail. There are far more good ones than bad, and big companies can afford to effectively ignore a single consumer, even a vocal one. So while there are some benefits to dealing directly with a manufacturer, it can also be insane at times how tone-deaf they can be. Dealers suffer this at times too, but larger ones know the game and, better yet, know the people to call. I had a North American rep calling me directly about my issue and it was resolved.

Once Tesla moves into a large-volume car I don't see how they will keep up the image they portray. It's not that simple. What's worse here is that they had people who saw what was going on, and instead of running it up the flagpole fast, they tried to upsell the customer!!! Get real, guys.

jpeg_hero 1 day ago 1 reply      
Lol, guy angling for a free car.

Sorry your discount scheme didn't work out. Pay retail like the majority of the retail public.

powera 1 day ago 2 replies      
This is not the The Atlantic that you've heard of.

And I smell a rat. Has anyone else who has purchased a Tesla had even half as many issues? I think this guy just worked things until he got some sales schmuck to make him an offer too good to be true, and then it was. (Note the several weeks to find a car. If Tesla sales always involve months of back and forth, they have serious problems.)

RubberShoes 1 day ago 0 replies      
jacquesm 1 day ago 0 replies      
The only way that I can see this made right is that the guy gets a free Tesla and lifetime-of-the-car free support and repairs on it. Inexcusable.
Retric 1 day ago 3 replies      
I read this as: "I tried to save money and it backfired."

They have actual sales channels with real support; by sidestepping that, you're stuck in weird internal processes.

IMO, paying full price and just buying fewer things massively simplifies most processes.

PS: That's not to say Tesla did a good job. Just that edge cases are often fragile and it's a good idea to weigh your time vs. the actual savings.

solaris_2 1 day ago 1 reply      
I can understand being frustrated about being given the run-around when you have deposited $4k for a really expensive car, but I think this guy (Marty) is an asshole:

1. He posted this on his company blog. Not a personal blog, not Medium, but a company blog. I think he's hoping to get some business from the exposure. I'd be planning to leave a company quickly if my boss posted personal rants that are not business related on the company blog.

2. He outed the one rep that told him the truth. He gave the date and the name of the rep that told him Elon was driving the car. Why would you do that?? Perhaps the reason why the other reps kept mum was because they knew Marty was a difficult customer.

3. He goes on about having a new baby, about how his electrician was calling to install some power-ups in his garage. These things are not relevant to the story. Simply tell your electrician the car has not arrived yet.

He also mentions that "In 21 years as a founder/CEO of my own company, dealing with Tesla has been the most bizarre and strange experience I've had interacting with another organization"

That simply cannot be true.

Bottom line: Marty thinks the world revolves around him and is really upset Elon doesn't care about him.

How do bank payments actually work? getmondo.co.uk
262 points by alexbilbie   ago   123 comments top 21
pc86 12 hours ago 9 replies      
I was just complaining to my wife about this earlier this morning (we are in the US).

I made a credit card payment last Thursday (1/14) in the mid morning, maybe 9:45 AM. Ideally this should happen pretty quickly, but I know in the US the infrastructure just doesn't match my current expectations. Seemingly there was zero movement until my credit account was credited the payment Friday afternoon, but the money was still in my checking account (no holds or pending transactions).

The weekend goes by, no movement. Monday (a bank holiday) comes and goes, as does Tuesday. I get a text message early this morning (~6:45 AM) that there was a large withdrawal from my checking account - the payment showed up! Nearly three full business days before the money was actually moved, and as of right now it's still a pending transaction.

The cynic in me wants to think that it's so people who don't pay attention to their finances are more likely to overdraw/double-spend the money, but part of me thinks it's just because the US infrastructure around banking and payments is so old it just isn't capable of anything approaching real-time transactions. I would love to see a logistical/technical explanation of why it takes so long for this type of thing in the US.

jslampe 4 hours ago 1 reply      
Source: I am one of the elected tech reps on the Faster Payments Task Force, a Fed-chartered initiative to create a real-time improved bank transfer system, like the one in the UK. I also work at Dwolla, a payment platform that has exclusively focused on modernizing the backend ACH system for over 5 years, as well as created our own real-time system for banks (see: FiSync).

There's a lot of good, and bad, information here. It's impossible to tackle all of it, so I'm going to point to the good things the US has going for it. (Worth getting other Fed Faster Payments members together and doing an AMA? Let me know):

1) As many mentioned, Same-Day ACH is coming (slowly but surely). Although not real-time, it will be a helpful stopgap as more real-time systems come online at financial institutions (see: #3). Combined with new payment API platforms, like Dwolla, many of us are enabling meaningful access and adding flexibility to an otherwise outdated platform. This will position platforms to take advantage of these new timeframes when they start arriving in late 2016 and 2017.

2) The major ACH operators, the Federal Reserve and the bank-owned The Clearing House, are both making significant investments in their tech stacks to enable real-time capabilities (the Fed just inked a $17M deal with IBM to update their software and capabilities, and TCH signed a deal with the UK Faster Payments provider, VocaLink).

3) The Faster Payments Task Force is an unprecedented market-led initiative which, against all odds, is actually making meaningful progress on aligning the criteria and expectations for an interoperable real-time system. This is HUGE. Imagine cramming 300+ lifelong competitors, embattled legal adversaries, entrenched interests, and long-standing rivalries into one room to debate the future of a trillion-dollar landscape. Now imagine them agreeing to create a better system. And I'm not just talking about improvements in speed, but better security, flexibility, and capabilities that could enable the next wave of commerce. Keep your eyes on https://fedpaymentsimprovement.org/, big news is coming in the next few weeks.

jrcii 12 hours ago 1 reply      
Regarding international payments, I dug into this a little one day out of curiosity and ended up here: https://www.frbservices.org/eventseducation/education/fedwir...

The PDFs on that page appear to show you precisely how the XML messages that conduct the transfers are formatted.

There's some interesting stuff I couldn't get to here https://www2.swift.com/uhbonline/books/hub/httoc.htm

And some other interesting stuff covering these standards:

http://project.i20022.com/the-standard

http://www.c24.biz/c24-io-standards-financial-messaging-libr...

lucaspiller 8 hours ago 2 replies      
I never thought I'd say this, but banks in the UK are actually pretty good. Over the last few years I've lived in a few different countries and the banks just don't compare.

In the UK you can open a bank account for free, with no minimum balance and no stupid charges (i.e. ATM withdrawals, even at your own bank).

tinkerrr 12 hours ago 0 replies      
There was a discussion a couple of years ago around how ACH works. ACH is the dominant system of money transfer in the US and takes 2-5 business days on average for a transfer, compared to the real-time systems in many other countries.


cmurf 5 hours ago 0 replies      
I think the main reason it all seems pretty stupid is that the system, even within one country, is so f'n goddamn huge, with a lot of legacy stuff in it, that bad UX is just inevitable rather than a sign of incompetence.

But one exception to incompetence is the EMV decision in the U.S. to transition from swipe-and-sign to chip-and-sign, and only later (2018?) to chip-and-PIN. That really is just stupid. In the U.S. I never sign for my debit card, whether chip or swipe; I always use a PIN. In Europe, when that same card is swiped I'm asked for a PIN, but if I use the chip it wants a signature! WTF?

The delays of ACH seem less between friends at least if you use Venmo or Google Wallet since that's effectively private currency (within that system) and then only periodically use a "sweep" action if that balance gets higher than you want.

petke 9 hours ago 2 replies      
I'm disappointed there isn't a guy moving a small gold coin from one vault to another at the central bank every time I buy something with my debit card. "Looks like Bob is getting drunk on cheap beer at the local pub again. If he'd just order a bottle of whiskey instead I wouldn't have to run back and forth all damn night. Maybe I'd get to go home and see my kids for once."
m_eiman 12 hours ago 2 replies      
The Swedish equivalent to Paym, called Swish, has been really successful. Probably an interesting case to study for those trying to create the next Paypal.
stanmancan 10 hours ago 2 replies      
In Canada we have something called an Interac e-Transfer that the major banks all support. You add a recipient's email address and create a "secret question" they can answer. Then, you simply fill out a form that says "Send $X from Account Y to Recipient Z". Depending on the bank, the email is sent between 0-60 minutes later; the recipient clicks the link, logs in, and picks which account to deposit it to. The whole process is super easy to use.

Most banks charge about $1 per Interac e-Transfer, but some accounts include a certain number of free transfers.

With all that being said, I just received an email from TD Canada Trust (one of the major banks) stating that they are now imposing a $5 fee to cancel an e-Transfer (previously free). I can't remember the last time I cancelled an e-Transfer, but I'm still most likely going to close all my accounts with them over it. I find it absolutely outrageous that they would charge you $5 to cancel a completely automated process. The only explanation I can think of is to nickel-and-dime their customers.

Animats 7 hours ago 2 replies      
All the fast payment systems outside the US are Government-run. In the US, online payments are run by private enterprise. The US doesn't have enough socialism for this to work.

The US Fed has been pushing for faster payment processing, but the big banks don't want it.

kbart 12 hours ago 1 reply      
Good article. I'd like more technical details, especially security-related, as some pretty hard problems must have been solved to implement such a system, where every error or vulnerability might cost tons of money. Any recommendations?
abcampbell 1 hour ago 0 replies      
The US ACH system is actually what killed bitcoin as a currency fwiw.

It's like the system acting in self-defense

SwaroopH 11 hours ago 0 replies      
India uses something similar: IMPS. http://www.npci.org.in/aboutimps.aspx
capex 2 hours ago 1 reply      
Do we have something like FPS in Australia?
omh 12 hours ago 3 replies      
This type of system looks like somewhere that a blockchain could be genuinely useful.

Each of the trusted banks could be given access to the blockchain (rather than needing proof-of-work) and it would work as a distributed ledger without the need for VocaLink as a trusted intermediary.

Of course the question is how the costs of running VocaLink compare to those of running a blockchain type system.

tantalor 11 hours ago 1 reply      
> Net settlement is used because it's more efficient - it only requires a few hundred entries in the Bank of England's ledgers each day for potentially millions of payments.

Not a very compelling reason... storage is cheap!
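For what it's worth, the netting the quoted article describes can be sketched in a few lines (bank names and amounts here are made up for illustration): many gross payments between banks collapse into one net position per bank, so the central ledger needs one entry per participant rather than one per payment.

```python
# Toy sketch of net settlement (hypothetical banks and amounts).
# Each gross payment adjusts the sender's and receiver's running
# net position; at end of day only the net positions are settled.
from collections import defaultdict

payments = [  # (sender, receiver, amount) - gross payments during the day
    ("Barclays", "HSBC", 100),
    ("HSBC", "Barclays", 80),
    ("Barclays", "Lloyds", 50),
    ("Lloyds", "HSBC", 30),
]

net = defaultdict(int)
for sender, receiver, amount in payments:
    net[sender] -= amount
    net[receiver] += amount

# One settlement entry per bank, regardless of how many payments flowed.
for bank in sorted(net):
    print(bank, net[bank])
```

Four gross payments reduce to three ledger entries here; with millions of payments among a few hundred participants the saving is enormous, which is the efficiency argument - the storage cost of the entries is beside the point.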

pmorici 11 hours ago 3 replies      
Anyone who doubts why something like Bitcoin would be of use in banking should read the part about net settlement and counterparty risk.

This article describes the system for moving money within one country, but similar mechanisms exist for moving money internationally. Imagine the counterparty risk involved between banks governed by different central banks in different countries halfway around the world. Bitcoin removes the counterparty risk in transfers by making settlement instant per transaction instead of net daily settlement as in traditional banking.

The reason this is important is that these transfers revolve around credit and when there is a banking crisis and everyone suspects everyone else's credit worthiness then payment systems can break down.

jstanley 12 hours ago 2 replies      
> Sadly, uptake of Paym has been slow.

I don't think this is very sad. I don't use Paym. I think account number + sort code is easy enough to deal with and doesn't need to be changed.

TamDenholm 10 hours ago 0 replies      
By the way, Mondo are doing a hack weekend this weekend and I'll be attending; if anyone else on HN is, please shoot me a mail (see profile).
phkahler 11 hours ago 4 replies      
So everything seems to go back to money in accounts at the bank of England. Where does that "money" come from? How are there non-zero balances there? What stops someone at BoE from just changing a balance?

I guess changing an account balance without a corresponding transaction would make automated checks (accounting verification) fail. But then we get back to the question of how any money got in the system in the first place.

gambiting 7 hours ago 3 replies      
"The transaction limit has recently been raised to £250,000 per payment"

So what happens if you want to send more? Do you need to make a personal trip to the bank or is there some other system in place?

Read the TPP readthetpp.com
249 points by SimplyUseless   ago   106 comments top 10
tptacek 3 days ago 16 replies      
I opened this page up, skipped forward to the first highlighted section (18. Intellectual Property), and skimmed to the first annotation. The original text:

1. A Party may, in formulating or amending its laws and regulations, adopt measures necessary to protect public health and nutrition, and to promote the public interest in sectors of vital importance to their socio-economic and technological development, provided that such measures are consistent with the provisions of this Chapter.

The annotation:

In other words, the TPP overrides any domestic laws protecting public health and nutrition, or socio-economic development.

That's not at all how the TPP works. The treaty doesn't allow foreign governments to "override" local laws, but rather allows for damage claims against the governments themselves if they enact and enforce laws contrary to the agreements in the TPP itself.

I'd really like the TPP annotated by legal experts. Instead, it's annotated by the CTO of Fight For The Future. I'm not sure that's a win.

walterbell 3 days ago 0 replies      
Citizen's Trade organized 1,500 groups to sign a letter to the US Congress, against the TPP, http://www.citizenstrade.org/ctc/blog/2016/01/07/1500-groups...

"... the TPP elevates investor rights over human rights and democracy, threatening an even broader array of public policy decisions than described above. This, unfortunately, is the all-too-predictable result of a secretive negotiating process in which hundreds of corporate advisors had privileged access to negotiating texts, while the public was barred from even reviewing what was being proposed in its name.

The TPP does not deserve your support. Had Fast Track not become law, Congress could work to remove the misguided and detrimental provisions of the TPP, strengthen weak ones, and add new provisions designed to ensure that our most vulnerable families and communities do not bear the brunt of the TPP's many risks. Now that Fast Track authority is in place for it, Congress is left with no means of adequately amending the agreement without rejecting it entirely. We respectfully ask that you do just that."

johnmaguire2013 3 days ago 2 replies      
This seems like a perfect proof of concept for genius.com if they're serious about becoming a way to annotate anything (not just songs).

[1] http://genius.com/web-annotator

jariz 3 days ago 1 reply      
This is great and all, and it's something that should absolutely be shared. However, if the intent behind this project is to share it with the 'average' person, it's completely useless. No one's going to read through that entire thing; I'd expect them to at least put up a summarized version.
shmerl 3 days ago 1 reply      
There really should be a stronger push to scrap this undemocratic monstrosity.
dpweb 3 days ago 1 reply      
Read the TPP... skipped right to the HN comments about the TPP.
lindx 3 days ago 1 reply      
This is what happens when you try to visit this site with Tor: https://anonm.gr/up/b386.png

Cloudflare's captchas are nearly impossible to solve, which means that Tor users are effectively blocked from seeing the site. Would you consider using something other than Cloudflare to host the site?

pluckytree 3 days ago 0 replies      
I think the positive benefit of this effort will likely be undermined (and it's already underway) by reactionary comments from uninformed people. They'll play well to people who already know the TPP sucks, but not to those on the fence or really wanting to learn about it.
krick 3 days ago 2 replies      
A brief question: should I know what this is if I'm not a US citizen?
jsprogrammer 3 days ago 2 replies      
It is ridiculous that after months of Obama telling people to "just read it", he dumped the agreement as ~268 separate PDFs.

Nobody has time for that. It's nice that they have pared this down to 31 different sections, but my guess is that they are not showing the full agreement here.

It would be much nicer if someone just dumped it all into a single PDF and HTML file.

Edit: Care to leave a comment rationalizing your downmods?

Video games are essential for inventing artificial intelligence togelius.blogspot.com
244 points by togelius   ago   129 comments top 17
m_mueller 1 day ago 10 replies      
Recently having become a father has made me think a lot about general intelligence. Seeing my son getting excited about his 'world state changing' gave me an idea. What if the main thing that holds us back is the reliance on cost functions? Human, and to some extent animal, intelligence is the only intelligence we know about. If that's what we want to emulate, why don't we try modelling emotions as the basic building blocks that drive the AI forward? Until now, the way I understand neural nets, we have basically modelled the neurons and given them something to do. My hunch is that brain chemistry is what's actually driving us forward, so what if we model that as well? Instead of serotonin, endorphins etc. we could also look at it at a higher level, akin to Pixar's Inside Out - joy, fear, sadness, disgust, anger, and I would add boredom.

Let's stay with video games for a bit. What if we look at joy as 'seeing the world change', graded by the degree of indirection from our inputs (the longer the cascade, the more joy)? Maybe let it have a preference for certain color tones and sounds, because that's also how games give us hints about whether what we do is good or not. Boredom is what sets us on a timer - too many repetitions of the same thing and the AI gets bored. Fear and disgust are things that come out of evolutionary processes, so it might be best to add a GA in there that couples success with some fear-like emotion. Anger, well, maybe wait with that ;-).

Edit: Oh, and for the love of god, please airgap the thing at all times...
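The emotion-as-drive idea above can be sketched as an intrinsic reward. This is a toy construction of my own, not an established model: "joy" is a novelty bonus for unfamiliar world states, "boredom" a penalty for repeating the same action too many times in a row.

```python
from collections import Counter

class EmotionDrivenReward:
    """Toy intrinsic-reward sketch (illustrative only): 'joy' rewards
    novel world states, 'boredom' penalizes repeating one action."""

    def __init__(self, boredom_after=3):
        self.state_counts = Counter()
        self.recent_actions = []
        self.boredom_after = boredom_after

    def reward(self, state, action):
        # Joy: a novelty bonus that decays as a state becomes familiar.
        self.state_counts[state] += 1
        joy = 1.0 / self.state_counts[state]
        # Boredom: penalize once the same action has repeated too long.
        self.recent_actions.append(action)
        run = 0
        for a in reversed(self.recent_actions):
            if a != action:
                break
            run += 1
        boredom = 0.5 if run > self.boredom_after else 0.0
        return joy - boredom
```

An agent maximizing this signal would seek out new states (joy) while being pushed off repetitive behaviour (boredom) - no task-specific cost function required.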

fitzwatermellow 2 days ago 1 reply      
Great summary of current state of the art with links to interesting projects: GVG-AI, VGDL...

Videos games are also essential for AI pedagogy. Creating Pac-Man agents in Stanford's AI class is a great example. Most players can barely get a "strawberry" but to see a trained agent mimicking human expert level play is eye-opening.

Quick reminder: Global Game Jam 2016 starts Jan. 29 and NYU is hosting its annual jam!


venachescu 1 day ago 5 replies      
This is a cute argument, but I think it falls into a trap of following its own thinking.

Video games are explicitly designed to test and fit within our bounds of conscious control and processing; particularly the retro games, but essentially all games in general have a very limited input control space (a couple keys or joysticks) and usually very rigorously defined action values. Moreover, these were designed by humans with very explicit successes, losses and easily distinguishable outcomes.

None of these descriptions fit the kind of control that an 'intelligent' system needs to handle. Biological systems do not have predefined goal values, have very incomplete sensory information, and most importantly have control spaces that are absolutely enormous compared to anything considered in a video game. At any point in time the human body has ~40 degrees of freedom it is actively controlling - compared to ~5 in a serious video game.

I do not doubt that pattern recognition and machine learning techniques can be improved through these kind of competitions. But the problem is in conflating better pattern recognition with general intelligence; implying or assuming any sort of cost, value or goal function in the controlling algorithm hides much of our ignorance about our 'intelligent' behavior.

pgodzin 2 days ago 4 replies      
Really interesting to think about the skills necessary just to play a modern open-world game such as Skyrim successfully.

NLP to understand dialogue and the actions that need to be taken based on what NPCs/quests/item descriptions say, strategies for several different enemies with different strengths and weaknesses, exploring the open world in a logical order.

When you think about the difficulties of such a loosely defined problem, it's hard to buy into the real-world fears of AI.

Paul_S 2 days ago 2 replies      
AI researchers are trying to make AI smarter. Game AI can already be easily written to win 100% of games but that's not the point. Gamedevs are trying to make AI more human-like. I'm not sure the two overlap.
Houshalter 1 day ago 0 replies      
I don't disagree that video games are a very useful benchmark to evaluate intelligence. But I don't think AGI will evolve from video games. I think that language understanding is the path to AGI.

Language is quite complex and can't easily be beaten by hard-coded algorithms or simple statistics. You can do some tasks with those things, but at others they will fail entirely. The closer you get to passing a true Turing test, the harder the problem becomes. It certainly requires human intelligence, and most of our intelligence is deeply rooted in language.

He mentioned games like Skyrim and Civilization as being end goals. But even a human that doesn't speak English wouldn't be able to play those games. Let alone an alien that knew nothing about our world, or even our universe.

sriku 1 day ago 0 replies      
> In order to build a complete artificial intelligence we therefore need to build a system that takes actions in some kind of environment.


"Made-Up Minds: A Constructivist Approach to Artificial Intelligence" by Gary Drescher presents a small-scale virtual world with a robot embedded in it that figures out the laws of its world by interacting with it, much like what a child does. Need more people thinking like this.

anentropic 1 day ago 0 replies      
"The most important thing for humanity to do right now is to invent true artificial intelligence (AI)"


hanniabu 1 day ago 0 replies      
I always had a feeling that the path to an intelligent system should be similar to that of Google's autocomplete algorithm.

On boot, all surrounding data will be taken in, this step would give everything context. All new data coming in would be processed (referenced to original data to determine what is happening and actions to take), then clustered, and then updated to the original data set, dropping data from the original set determined to be irrelevant, and updating the context to give more relevant perspective of the new data coming in. (And loop)
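The loop described here can be sketched roughly as follows. The similarity measure, the "actions" stand-in, and the pruning rule are all placeholder toys of my own, not anyone's real system:

```python
def similarity(a, b):
    """Toy stand-in similarity: number of shared words."""
    return len(set(a.split()) & set(b.split()))

def context_loop(stream, max_context=4):
    """Rough sketch of the loop above: keep a context set, reference each
    new datum against it, then update and prune the set."""
    context = []
    for datum in stream:
        # Reference the new data against the original set for context.
        related = sorted(context, key=lambda c: similarity(c, datum),
                         reverse=True)
        yield (datum, related[:2])        # stand-in for "actions to take"
        context.append(datum)
        if len(context) > max_context:    # drop data deemed irrelevant
            context.pop(0)

results = list(context_loop(["red apple", "green apple", "red car", "blue sky"]))
```

Each new item is interpreted relative to what came before, and the context window is continually refreshed - the "(And loop)" part of the comment.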

tlarkworthy 1 day ago 0 replies      
I agree. No free lunch implies no general algorithm for solving random problems from the set of all problems. So what's the practical subset of problems that is useful in the real world? Fingers crossed, we already encoded the useful problems in the different game genres we developed. E.g. RTS pushes the planning vs. reaction dilemma, RPGs test verbal inference and morality, puzzles test logic, etc. We already digitized a large class of problems we care about in the real world in games!
proc0 2 days ago 1 reply      
Very interesting read, and I always knew about this, having been an avid gamer for as long as I can remember. It always intrigued me how a computer can play against a human, and as games got more sophisticated, interacting with AIs got more and more human-like.

Aside from using them as benchmarks, the way games are capable of simulating a world will probably be key in creating a true AGI. In the comment section of the article, we're already seeing some theories that involve video games not just as tests, but as a primary component of the intelligence architecture. Very exciting times!

ph0rque 1 day ago 0 replies      
So, black-box / integration testing for AI? Neat.

On a related note, I think an official driving test simulation for all the self-driving algorithms, perhaps sponsored by the government, would be really beneficial.

jarboot 2 days ago 0 replies      
The game "Yavalath" [1] in the article looks really neat: A simple little game with only two rules which never really ends in a draw, unlike tic tac toe.

[1] http://cameronius.com/games/yavalath/
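For context, Yavalath's two rules are: make four of your pieces in a row to win, but make three in a row (without four) and you lose - which is what keeps games from fizzling into draws. Checked along a single line of cells for simplicity:

```python
def longest_run(line, player):
    """Longest contiguous run of `player`'s pieces in a line of cells."""
    best = run = 0
    for cell in line:
        run = run + 1 if cell == player else 0
        best = max(best, run)
    return best

def yavalath_outcome(line, player):
    """Yavalath's two rules along one line: four in a row wins,
    three in a row (without four) loses."""
    n = longest_run(line, player)
    if n >= 4:
        return "win"
    if n == 3:
        return "loss"
    return "ongoing"
```

A full implementation would check every line through the last move on the hex board, but the tension of the game is entirely in these two conditions.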

dschiptsov 2 days ago 0 replies      
I think it is emotions. Teach the car that cracks in the tarmac affect its fitness negatively, and in the long run it will drive better, avoiding them or passing them with caution.

At least motorcycle drivers who care are better drivers.

bitwize 2 days ago 8 replies      
I don't think building an AI is the most important task on our plate. We still have those disease, hunger, poverty, and war problems to contend with. If building an AI helps us solve those, then sure, let's build the AI. But I don't think strong AI is necessary to gain traction on the problems that confront the sapient beings we already have around.
mankash666 2 days ago 1 reply      
"The most important thing for humanity to do right now is to invent true artificial intelligence"

Maybe the article makes some valid scientific points, but I simply cannot get past this unscientific opening claim to a purportedly scientific article. Not just me - no peer-reviewed journal would accept such frivolity. Passing on the article and hoping for better scientific writing in the future!

purpled_haze 2 days ago 4 replies      
> Video games are essential for inventing artificial intelligence

And here's why they aren't: First-person Shooters.

Why give AI something that's a goal that involves killing things that look like humans or animals for points? That's a recipe for disaster.

Breakout's not much better either. How often do you need to break a wall to smithereens with a ball? Never.

Ask HN: What sites do you use to find contract work?
352 points by the_wheel   ago   159 comments top 38
andreasklinger 2 days ago 10 replies      
I used to do a lot of contract work. I can't tell you what you should do - but here is what I did, and it worked for me.

Two approaches:

1) Work for one large client and essentially become an employee (consider this: a lot of startups pay good money for remote employees)

2) Work for multiple clients

Focusing on #2 here

Core rule: You want to be paid premium for quality and service.

Avoid marketplaces - it's very hard to compete on quality here.

Niche - the more focused you are on a (profitable) niche, the better you can charge a premium for domain competence

As thibaut_barrere mentioned - build a brand - I would even go further: create an agency-like brand. At the point I stopped saying "I" and started saying "we", I was able to charge more.

Don't charge by the hour but by the value - most developers charge for their time - you want to charge for the value you provide to the client. Read up on "willingness to pay"

Most important: Deliver as promised and always try to over-deliver in service, quality, etc. E.g. try to understand why the client asks for features and not only what features she/he asks for - you might be able to come up with better solutions or anticipate future requests. Any successful project should usually lead to improved reputation and more projects and clients.

Good luck!

peterbsmith 2 days ago 2 replies      
Personal networks.

I came into Syracuse knowing nobody and nothing.

I had never done any app making as of January 2015. I had done some wordpress stuff, but just the basics.

And I had (and have) no CS degree.

I now make a living on contract work. I did it by going to local meetups and introducing myself as a freelance web developer. Never mind that I hadn't done freelance web development ever. I kept going to meetups for months and still attend a monthly hacker meetup. I participated in hackathons without really knowing how to program.

But all along the way I met people more experienced than I am and picked up two clients. I think one thing that I do differently to most is that I charge a high rate (I always quote $150/hr). I am willing to negotiate lower than that, but it's a starting point. I have been paid that in the past for less complicated work, like hiring developers and being a project manager.

What am I saying? Your question was what sites to use. Just one: meetup.com

kaizensoze 2 days ago 1 reply      
HN - http://hnhiring.me/

Remote OK - https://remoteok.io/

Stack Overflow - https://careers.stackoverflow.com/jobs?allowsremote=True

LiquidTalent - http://www.liquidtalent.com/

Working Not Working - http://workingnotworking.com

Hired - https://hired.com/contract-jobs

Gigster - https://gigster.com/

Mirror - http://mirrorplacement.com/

Metova - http://metova.com/

Mokriya - http://mokriya.com/

HappyFunCorp - http://happyfuncorp.com

Savvy Apps - http://savvyapps.com/

Clevertech - http://www.clevertech.biz/

Workstate - http://www.workstate.com/

AngelList - https://angel.co/jobs

I know you're just asking for sites and not approaches to finding contract work, but getting in with a very promising early stage company through contract-to-hire [that allows remote] is probably the most sustainable way to go.

Doing one contract project after another at an hourly rate just doesn't scale well financially and finding a next decent client can be like pulling teeth.

thibaut_barrere 2 days ago 1 reply      
I've been contracting, consulting & freelancing for the last 10 years (5 years completely remote). My advice is to avoid "searching contract work", but reverse the situation completely: make your new clients find you instead. I wrote about this in depth here: https://www.wisecashhq.com/blog/how-to-have-clients-find-you....

Sites /can/ work (I know people who make a good living off certain sites), but nothing will beat self-managed marketing in the long run.

Feel free to email me (see profile) if you have specific questions.

Good luck!

marknutter 2 days ago 0 replies      
I posted this article on medium the other day that contains all the advice I've compiled after 8 years of freelancing as a software developer: https://medium.com/@marknutter/advice-for-the-freelance-deve...

In short, to answer your question, I never used any sites to find contract work. I got all my leads through face-to-face interaction with real humans in the real world, and a good deal of it came from word-of-mouth because of exceeding my clients' expectations.

Contracting sites marginalize developers, and the type of clients who trawl them are typically the kind who will try to squeeze as much work out of developers for as little money as they can. On top of that, developers are generally a pretty introverted crowd, so the number of introverted and talented developers who trawl those sites looking for work is far greater than the number of outgoing, personable developers in your local area. Which group do you want to compete against?

mathgeek 2 days ago 1 reply      
Welcome to HN! You'll find that this was asked previously:


swimduck 2 days ago 0 replies      
I have a different approach to finding contract work, particularly as I don't have much work experience. Upwork and similar websites have not worked well for me.

Instead, I browse job boards and when I find an interesting role I contact the company. If they are interested in my background and the fit is right, I sell them on setting up a contract relationship instead of full-time employment. Sometimes it works, other times it doesn't. The important part is being honest that you are looking to work as a contractor, not an employee.

Job boards to consider: AngelList, WeWorkRemotely, etc. If you're looking for a list of job boards, http://nodesk.co has lots, and so does this article by Teleport: http://teleport.org/2015/03/best-sites-for-remote-jobs/

graham1776 2 days ago 1 reply      
The one thing I always tell anyone on the job hunt (which in your case is finding contract work), which few ever seem to take me up on: Informational Interviews.

These are informal "Can I take you out to coffee?" talks with people in your industry to see what they are working on, what is happening with them, and what is going on in the industry. Every job I have ever gotten has been through informal meetings with people I met through my network (whether it's your old job, your friends, parents, relatives, or others).

At the end of every one I ask: "Is there anyone else you think I should talk to?" and "Do you currently have any opportunities at your company for me?". Rinse, repeat. I guarantee that after investing in 30 informational interviews you will find work.

ThePhysicist 2 days ago 2 replies      
I recently made it through the Toptal (http://www.toptal.com) screening process but haven't taken on any work through their site yet. The hourly rate that you can ask there seems quite reasonable, though, compared to sites like upwork.com, where you will mostly compete with people who are willing to work for $10/hour (which for someone living in a developed country is just not possible).

For Germany, Gulp (www.gulp.de) is a very good site where you can actually find clients that are willing to pay a reasonable hourly rate (they even have a rate calculator on their site).

Nursie 2 days ago 1 reply      
What's the context? Which country are you in? What are your skills?

If you're in the UK...

I've been contracting about 3 years now and started it the simple (and probably dumb) way - stick a resume up on jobsite.co.uk, wait for agents to call. Lots will. Be nice to them on the phone but be firm about what rates and locations you're willing to work. You'll get lots of useless ones who haven't even bothered to read it, but no matter, you'll learn to filter them out pretty quickly. Remember the good ones. Rinse, repeat.

I've had two contracts now through reputation, which is quite nice, but getting contracts from previous workmates isn't a panacea. One of them was the most boring thing I've ever done in my life (worse than shelf-stacking in a warehouse) and I quit after three weeks because I was literally unable to complete the work it was so dull. I told the client that I was poor value for money and a recent graduate would be a better choice. The other one was good though!

Also, make sure you're prepared for some time off between contracts, it's pretty much going to happen.

coderKen 2 days ago 1 reply      
Anyone currently looking for a remote front-end developer? I am a full-stack developer (tending towards front-end nowadays). I live in Lagos, Nigeria, and am looking for remote work. I have a strong JavaScript (Node.js, AngularJS) background with over 3 years of experience.

Portfolio: http://goo.gl/OmEpz8

Git: https://goo.gl/oYbi8F

some side projects I have done:





Have done more complex stuff but requires user to login.

dmitri1981 2 days ago 2 replies      
For those who are London based, I recently launched a mailing list for members of the London Hacker News Meetup, which sends out contracts based on your language preference. It's averaging about 10 jobs a month at the moment however I am working on getting it to about 100 pm by the end of the year. The current sign up page is at http://eepurl.com/byq7Af
pmorici 2 days ago 2 replies      
I would avoid sites like Upwork (aka: odesk), elance, and anything similar like the plague unless working for less than minimum wage and dealing with morons is your idea of good contract work.

I suspect the secret to contract work success lies in having really good networking skills and a Rolodex of contacts from having worked in a given industry and having a reputation as someone who delivers. If you don't have that then you would probably have better luck finding reasonable work by going to meetups or similar industry events to build a network of professional contacts. The only way I know of to do this online is to become a notable contributor to prominent open source projects and then use that to leverage paid work.

odonnellryan 2 days ago 1 reply      
A lot of people are saying job websites don't work. I don't agree with them.

I've been consulting over a year (US-based, near NYC) and I've found plenty of very good clients (small and large) through freelancing websites.

Few loose guidelines I've used to help me with applying to gigs:

1) Evaluate if you think the person understands the value of the work, and only reply if you can somewhat-confidently answer "yes."

2) Reply to gigs that say "$5" or some other crazy low number, as long as they seem competent at explaining their project.

3) ALWAYS follow up with your past clients! Ask them for new work regularly.

nbrempel 2 days ago 1 reply      
I've never used a website. Reach out to everyone you know. Buy them a coffee, mention you are getting into contracting, ask who else you should talk to, thank them, repeat.
quackware 2 days ago 2 replies      
I get my clients primarily through gigster (http://www.gigster.com), referrals, and my website.
Mandatum 2 days ago 0 replies      
Depends on how I'm feeling. If I'm not looking for very interesting work or I'm saving for travel, I have a few large clients (5000+ employees) that always have projects going. They are the bread and butter of my contract work and I'm known across pretty much all of the IT senior management at those companies.

If I'm looking for more cutting edge, interesting work I'll go out and find either a company, industry or project I'm interested in and try and insert myself into it somehow. Usually through meetups, over coffee or in one case just showing up (probably wouldn't recommend that, depends on the people - in my case it was 4.30PM on a Friday and I brought beer).

Usually I'll either do it gratis (if it's non-profit or public domain) or cut my rates if I'm learning on-the-job.

When I started pretty much all of my job offers and contracts came by word of mouth. I only had to kick down doors a few times before I had developed a reputation as a good worker. This involved cold-emailing, calling and meeting people at various industry events.

BorisMelnik 2 days ago 0 replies      
I like to go to places like Upwork or Elance and seek out people in the US with low rep that haven't done a lot of jobs. A lot of times those people work for big companies that are stuck in a situation and need a quick hack put together. Do a good job and you get put on their 'list' for future use.
fasouto 2 days ago 1 reply      
Some people at HN will tell you the opposite, but I found two of my best clients at Upwork.

I didn't bid on low-quality jobs, and once I finished a job I offered them a maintenance contract outside Upwork.

122333444555666 2 days ago 1 reply      
A bit of a tangent, but some advice needed. So I've been contracting out a bit on Upwork - used all the behavioral hacks in the book: using "we" etc... It's worked amazingly for getting clients. Not bad at sales. I've got one client now -- a hedge fund -- that's being very stingy. We agree on a fixed price for a particular scope/milestone, the release is shipped, but they come back and say "this is great, but we need this one additional feature or this whole release is worthless." Usually I, I mean "we", oblige. But it's getting ridiculous. What do we do? Play hardball and say no shipment until payment? Or just ditch the client? The day rate is plummeting, mind you, closing in on free. Total contract size in the low XXks.
mirap 2 days ago 0 replies      
Does anyone have good place to look for contract work in field of UX & Product design? I'm UX designer currently living in Prague, looking for remote work (and I'm open to relocate). My portfolio: http://podorsky.cz/
eswat 2 days ago 0 replies      
I tackle this sideways by going to Meetup or Eventbrite. Specifically I go to meetups and events that potential buyers go to and let them know what I do (I don't try to sell my services on first contact). It takes some pruning but after a while my preferred clients are the ones I keep in contact with and we start working. I get less work through this than just by referral though.

Depending on your living situation and time available I'd recommend trying to establish your own identity so you don't have to go through a marketplace for contract work. Instead you'll have the contract work come to you and not filtered through a middleman that would take a cut out of your work. I would never recommend someone go through Fiverr, Upwork or these other marketplaces unless they were just moonlighting.

dustingetz 2 days ago 2 replies      
HN who's hiring threads, exclusively

update: I post my pitch in the freelancer thread and potential clients contact me, for example https://news.ycombinator.com/item?id=9998249

WoodenChair 2 days ago 0 replies      
One thing that I think is valuable when looking for contracting work (what I call consulting) is to learn how people who have been highly successful in consulting built their businesses. Check out episodes 4 (Marcus Zarra) and 5 (Michael Fellows) of Consult: http://consultpodcast.com



bcks 2 days ago 3 replies      
I've had good success hiring developers for short-term project work through https://gun.io.
gist 2 days ago 0 replies      
I'd like a way, similar to the first-of-month feature (where employment possibilities are posted on HN), where you could post requirements for a software project and get responses from the Hacker News community (or at least links to either relevant profiles or reputable hackers as suggestions).

Edit: I mean on HN, similar to the first-of-month feature, not a separate site (I know these are out there, obviously).

gk1 2 days ago 0 replies      
Wrote about this recently: http://www.gkogan.co/blog/how-i-learned-to-get-consulting-le...

The gist of it is, as many here are saying: Don't use marketplace sites. Instead show off your knowledge in a way that gets attention of potential customers, then they'll come to you.

peacemaker 2 days ago 0 replies      
I've done this by reaching out to friends and old work colleagues to see what they're up to and offering to help. Because it's people you know it is much easier to make arrangements you will both be happy with. After 15 years working in software that turns out to be quite a lot of people, especially if you take the time to regularly reach out to people via LinkedIn etc.
telecuda 2 days ago 0 replies      
Tip: Have an Indeed.com resume verbose with your areas of expertise. Built a project using Parse.com or the Twitter API? Put that in there. As an employer, one of my more successful methods is to search for specific skill sets that a project may require, then reach out to a small handful of developers who hit on those searches with a pitch for why -new project- is exciting.
victorantos 2 days ago 0 replies      
If you are looking for frontend contracts, in particular - angularjs,

I would recommend http://AngJobs.com

disclaimer: I run AngJobs, https://github.com/victorantos/AngJobs

nnd 2 days ago 0 replies      
I'm fairly new to consulting (been doing it for almost a year now). I'm on my second gig right now, and both of them are through Toptal. For the first one, a recruiter reached out to me with a gig, the second one I got thanks to an article I wrote in their blog.
juliend2 2 days ago 0 replies      

I send a LinkedIn message to some of my contacts I'd like to work with, telling them it's been a while, that I'd like to get in touch, and offering to grab a cup of coffee with them that week.

During the meeting, tell them about your freelance status and that you're looking for work.

Good luck!

lazyant 2 days ago 0 replies      
Anybody know of (good) sites for remote server (Linux esp.) contract work (sysadmin/devops/optimization/security/reliability)? If there are none, anybody interested in one?
JoeAltmaier 2 days ago 0 replies      
My way was working at several successful startups, and then going into contracting. So I had contacts at every level of Silicon Valley. Might not work for everybody.
jameslk 2 days ago 0 replies      
I've been contracting/consulting for a couple of years now. Most my contracts have come through referrals (of friends) and sometimes recruiters. However, I was able to start my contracting career thanks to a contract that came through Toptal. This allowed me to quit my job and do this full-time.

Here's my list of resources that I would be looking at if I needed to start looking for a contract immediately:


- Authentic Jobs: http://www.authenticjobs.com/

- StackOverflow Careers: http://careers.stackoverflow.com/jobs?type=contract&allowsre...

- We Work Remotely: https://weworkremotely.com/jobs/search?term=contract

- Angelist: https://angel.co/jobs

- Github Jobs: https://jobs.github.com/

- Hired: https://hired.com/contract-jobs


- Toptal: https://www.toptal.com/ (I'm a member of Toptal's network)

- Gigster: https://www.trygigster.com/ (haven't used it yet)

- Crew: https://crew.co/ (haven't used it yet)

Offline ideas:

- Approach companies at Meetups

- Meetups, meetups, meetups

- Pitch on forums

- Work with contract agencies

- Become a subcontractor

It also helps to work on branding yourself, blogging, and integrating into communities (like HN!). Generally, just becoming an authority on a topic and allowing people to get to know you before they work with you helps a lot. Kind of like patio11 has done for himself around here. Then people start coming to you instead of the other way around.

I would also highly recommend looking at DevChat TV's Freelance podcasts for ideas, they're really great: https://devchat.tv/freelancers

qp9384btv_2e 2 days ago 1 reply      
For those who do contract work, what is your policy on code-reuse between clients?
chilicuil 2 days ago 2 replies      
Fiverr (https://www.fiverr.com/), tasks usually take less than an hour and give me enough revenue to pay domains and hosting for my pet projects.
awjr 2 days ago 0 replies      
Which country? ;
The Unreasonable Reputation of Neural Networks mit.edu
266 points by fforflo   ago   142 comments top 18
hacker_9 3 days ago 12 replies      
"Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks.

By the same token, the ability of neural networks to learn interpretable word embeddings, say, does not remotely suggest that they are the right kind of tool for a human-level understanding of the world. It is impressive and surprising that these general-purpose, statistical models can learn meaningful relations from text alone, without any richer perception of the world, but this may speak much more about the unexpected ease of the task itself than it does about the capacity of the models. Just as checkers can be won through tree-search, so too can many semantic relations be learned from text statistics. Both produce impressive intelligent-seeming behaviour, but neither necessarily pave the way towards true machine intelligence."

So true, and this is why I don't listen when Elon Musk or Stephen Hawking spread fear about the impending AI disaster; they think that because a neural network can recognize an image like a human can, it's not a huge leap to say it will soon be able to think and act like a human, but in reality this is just not the case.

andreyk 3 days ago 1 reply      
I think this is a good analysis of what Deep Learning is particularly good for and its limitations, but was somewhat annoyed at the lack of any citations of people actually overhyping it. The closest it comes is this:

"This is all well justified, and I have no intention to belittle the current and future impact of deep learning; however, the optimism about just what these models can achieve in terms of intelligence has been worryingly reminiscent of the 1960s."

From what I've read and seen, the leading people in the field (Yann LeCun, Hinton, etc.) seem to be very aware that the current methods are particularly good for problems dealing with perception but not necessarily reasoning. Likewise, I have not seen many popular news sources such as NYT make any crazy claims about the potential of the technology. I hope, at least, that the people who work in AI are too aware of the hype cycles of the past to get caught up in one again, and so there will not be a repeat of the 60's.

boltzmannbrain 3 days ago 1 reply      
I think readers of this post will enjoy "What is Machine Intelligence vs. Deep Learning vs. Artificial Intelligence" by Numenta's Jeff Hawkins: http://numenta.com/blog/machine-intelligence-machine-learnin...
tacos 3 days ago 2 replies      
Current top post quotes the most negative observation of the paper. Here's the most positive, and perhaps the most useful to HN readers or investors who are exploring the space:

"Deep learning has produced amazing discriminative models, generative models and feature extractors, but common to all of these is the use of a very large training dataset. Its place in the world is as a powerful tool for general-purpose pattern recognition... Very possibly it is the best tool for working in this paradigm. This is a very good fit for one particular class of problems that the brain solves: finding good representations to describe the constant and enormous flood of sensory data it receives."

theideasmith 3 days ago 2 replies      
Someone once gave the analogy of climbing to the moon: you can report steady progress until you get to the top of the tree/mountain. I think this is applicable here. We'll need a new paradigm, beyond statistical learning, to create AGI.
proc0 3 days ago 1 reply      
Another article basically saying something along the lines of "there is no current technology that comes close to producing AGI, therefore let's dismiss all these technologies". Of course we don't know what we don't know, until we do, and then it's not as mysterious.

It's not hard to see that the reason NNs are becoming the prime candidate for AGI is their architecture, inspired by biological neurons. We are the only known AGI, therefore something similar to the brain will likely be what produces an AGI. NNs at least mimic the massively parallel property of biological neurons. And if we're optimistic, the fact that NNs mimic how vision works in our brain might mean that we are at some point in the continuum of the evolution of brains, and it's a matter of time until we discover the other ways brains evolved intelligence.

What keeps me optimistic is evolution. At some point brains were stupid, and then they definitely evolved AGI. The question is how did this happen and whether or not there is a shortcut, like inventing the wheel for transportation instead of arms and legs.

maciejgryka 3 days ago 0 replies      
Nice article - it's good to be realistic about what we can do with current tools.

I feel like the gist of what current neural nets can do is "pattern recognition". If that's fair, I also suspect that most people underestimate how many problems can be solved by them (e.g. planning and experiment design can be posed as pattern recognition - the difficulty is obtaining enough training data).

It's true that we're most likely a very long way away from general AI - but I'm willing to bet most of us will still be surprised within the next 2 years by just how well some deep-learning based solutions work.

Houshalter 3 days ago 1 reply      
>Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks.

Here's the important difference about NNs. They are incredibly general. The same algorithms that can do object recognition can also do language tasks, learn to play chess or Go, control a robot, etc., with only slight modifications to the architecture and otherwise no domain information.

That's a hugely different thing than brute force game playing programs. Not only could they not learn the rules of the game from no knowledge, they couldn't even play games with large search spaces like Go. They couldn't do anything other than play games with well defined rules. They are not general at all.

Current neural networks have limits. But there is no reason to believe that those limits can't be broken as more progress is made.

For example, the author references that neural networks overfit. They can't make predictions when they have little data. They need huge amounts of data to do well.

But this is a problem that has already been solved to some extent. There has been a great deal of work into bayesian neural networks that avoid overfitting entirely. Including some recent papers on new methods to do them efficiently. There's the invention of dropout, which is believed to approximate bayesian methods, and is very good at avoiding overfitting.
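The inverted-dropout trick mentioned above can be sketched in a few lines of plain Python (a toy illustration with made-up numbers, not any particular framework's implementation): during training each unit is zeroed with probability p, and the survivors are scaled up so the expected activation matches test time.

```python
import random

def dropout(activations, p, rng):
    # Inverted dropout: drop each unit with probability p and scale
    # survivors by 1/(1-p), so the expected output equals the input.
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0
            for a in activations]

rng = random.Random(0)          # seeded so the sketch is reproducible
acts = [1.0, 2.0, 3.0, 4.0]
dropped = dropout(acts, p=0.5, rng=rng)
```

Averaging a network's predictions over many such random masks is what gives dropout its ensemble-like, regularizing flavor.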

There are some tasks that neural networks can't do, like episodic memory and reasoning. And there has been recent work exploring these tasks. We are starting to see neural networks with external memory systems attached to them, or ways of learning to store memories. Neuroscientists have claimed to have made accurate models of the hippocampus. And DeepMind said that was their next step.

Reasoning is more complicated and no one knows exactly what is meant by it. But we are starting to see RNNs that can learn to do more complicated "thinking" tasks, like attention models, and neural turing machines, and RNNs that are taught to model programming languages and code.

otakucode 3 days ago 0 replies      
I expect that as we improve machine intelligence more and more, aside from the fact that we will simply keep moving the goalposts of what we consider "intelligent" like the irascible scamps we are, we're going to discover that embodiment is absolutely necessary. Not just any embodiment either, but we will need to place the neural networks in bodies very much like our own. Neuroscience continues to find surprising things that link our "general human intelligence" to our bodies. Paralyze a face and a person becomes less able to feel happiness or anger, eventually forgetting what feeling those things even meant, as one example.

We shouldn't forget that the mind/body split is a wholly artificial construct that has no basis in reality. The brain is not contained in the head. The nerves running down your spine and out to your toes and all over your body are neurons. Exactly the same neurons, and directly connected to the neurons, that make up what we think of as the separate organ 'the brain'. They're stretched out very long, from head to toe, sure, but they are single cells, with the exact same behavior and DNA, and there is no reason to presume that they must have some especially insignificant role in our overall intelligence.

Then there is the fact that it is probably reasonable to presume that a machine which has human-level intelligence will not appear overnight. It would almost necessarily go through long periods of development. During that development, when the machine begins to behave in ways the designers are not able to understand, what will be their reaction? Will they suppose that maybe the machine had intentions they were unaware of, and that it is acting of its own volition? Or will they think the system must be flawed, and seek to eliminate the behavior they didn't expect or understand?

I have a hard time imagining that an AI system will be trained on image classification and one day suddenly say "I am alive" to its authors or users. If it instead performs poorly on the image classification because it is pondering the beauty of a flower in one of the images, what are the chances that nascent quasi-consciousness would be protected and developed? I think none. We only have vague ideas about intelligence and consciousness, and our ideas about partial intelligence are utterly theoretical. Has there ever been a person who was 1% intelligent? Is mastering checkers, or learning NLP to the exclusion of even proprioception, 1% of human intelligence? You optimize for what you measure... and we don't know how to measure the things we're looking for.

tim333 2 days ago 0 replies      
>Extrapolating from the last few years progress, it is enticing to believe that Deep Artificial General Intelligence is just around the corner and just a few more architectural tricks, bigger data sets and faster computing power are required to take us there. I feel that there are a couple of solid reasons to be much more skeptical.

On the other hand there are reasons to be optimistic. Human brains are built from networks of neurons and the artificial neural networks are starting to have quite similar characteristics to components of the brain - things like image recognition (https://news.ycombinator.com/item?id=9584325) and Deep Mind playing Atari (http://www.wired.co.uk/news/archive/2015-02/25/google-deepmi...)

The next step may be to wire these things together in a similar structure to the human brain, which is kind of what DeepMind are working on - they are trying to do the hippocampus at the moment. (https://www.youtube.com/watch?v=0X-NdPtFKq0&feature=youtu.be...)

Also we are approaching the point where reasonably priced hardware can match the brain, roughly the 2020s (http://www.transhumanist.com/volume1/moravec.htm)

It'll be interesting to see how it goes.

MichaelMoser123 2 days ago 1 reply      
I think this book is really interesting: "Surfaces and Essences: Analogy as the Fuel and Fire of Thinking" by Hofstadter and Sander.

Many people got disillusioned with classical AI because mathematical logic (inference engines) would not scale to 'strong' AI.

Hofstadter says that most concepts handled by humans do not fit one-to-one into clear-cut ontologies. Instead, higher-order concepts are created by finding analogies between objects or simpler concepts, and by grouping these similar concepts into more complex entities.

I have a summary of the book here http://mosermichael.github.io/cstuff/all/blogg/2013/10/15/po...

bsbechtel 3 days ago 3 replies      
We will never have human level AI until we can properly understand, define, and model human intelligence. While we are advancing at a very rapid pace on that front, we are still years away from the field being considered mature.
skybrian 3 days ago 1 reply      
There was a recent paper [1] about learning visual concepts from few examples. I don't know if it generalizes or not, but it seems too early to assume that researchers will hit a dead end.

[1] http://science.sciencemag.org/content/350/6266/1332.full

tianlins 2 days ago 0 replies      
It is true that most of the recent successes of deep neural networks are in the regime where n and d are large. And we surely shouldn't fantasize that general AI will be solved this way. However, the very appealing aspect of deep neural networks is end-to-end training: for image recognition, we can map from raw pixels to output. This is very different from other ML techniques. In some sense, deep neural networks learn "algorithms", not just "models". This formulation can be richer, especially when given lots of data.
sevensor 3 days ago 0 replies      
At last, the thing that's unreasonable isn't effectiveness. I've been hoping for a while that someone close to the field would cut through the hype and put ANNs in context.
interdrift 3 days ago 2 replies      
We have something that can understand a pattern but we don't have something that can understand how different patterns relate to each other.
DrNuke 3 days ago 0 replies      
I think many are missing the point here: AI can just be very stupid and still wipe everything out. It only takes some sort of irreversible minimisation function to let machines destroy everything in sight. Drones are the first step, then comes IoT; what else? We fully depend on machine learning just now. So no wonder many are scared even before machines become human-intelligent.
neom 3 days ago 0 replies      
How far are we from general purpose quantum computing?
Scrapy Tips from the Pros scrapinghub.com
254 points by ddebernardy   ago   67 comments top 21
kami8845 13 hours ago 6 replies      
I love ScrapingHub (and use them) but these tips go completely against my own experience.

Whenever I've tried to extract data like that inside Spiders I would invariably (and 50,000 URLs later) come to the realization that my .parse() code did not cover some weird edge case on the scraped resource and that all data extracted was now basically untrustworthy and worthless. How to re-run all that with more robust logic? Re-start from URL #1.

The only solution I've found is to completely de-couple scraping from parsing. parse() captures the url, the response body, request and response headers and then runs with the loot.

Once you've secured it though, these libraries look great.

PS: If you haven't used ScrapingHub you definitely should give it a try, they let you use their awesome & finely-tuned infrastructure completely for free. One of my first spiders ran for 180,000 pages and 50,000 items extracted for $0.
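The decoupling described above (capture first, parse later) can be sketched framework-agnostically; the `fetch` stub and the page markup below are made up for illustration, and in practice the fetch would be a real HTTP request via Scrapy or similar:

```python
import json
import os
import tempfile

def fetch(url):
    # Stand-in for a real HTTP fetch; captures everything verbatim.
    return {"url": url, "status": 200,
            "headers": {"Content-Type": "text/html"},
            "body": "<h1>Widget</h1><span class='price'>$9.99</span>"}

def crawl(urls, archive_path):
    # Step 1: archive raw responses; no parsing logic lives here.
    with open(archive_path, "w") as f:
        for url in urls:
            f.write(json.dumps(fetch(url)) + "\n")

def parse(archive_path):
    # Step 2: parse offline; improving this function never
    # requires re-crawling a single URL.
    items = []
    with open(archive_path) as f:
        for line in f:
            rec = json.loads(line)
            price = rec["body"].split("class='price'>")[1].split("<")[0]
            items.append({"url": rec["url"], "price": price})
    return items

fd, archive = tempfile.mkstemp(suffix=".jsonl")
os.close(fd)
crawl(["http://example.com/a"], archive)
items = parse(archive)
```

When an edge case breaks the parsing logic 50,000 URLs in, only the cheap second step has to be re-run.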

Cyph0n 15 hours ago 3 replies      
Extremely well designed framework. It can cover more than 90% of use cases in my opinion. I'm currently working on a project written in Scala that requires a lot of scraping, and I feel really guilty that I'm not using Scrapy :(
blisterpeanuts 10 hours ago 0 replies      
Yay! Another article on scrapy. I'm just getting started and my first goal is to scrape a tedious web-based management console that I can't get API access to, and automate some tasks.

Very glad to learn about this site Scraping Hub. Keep the war stories coming. It's technologies like these that brighten up our otherwise drab tech careers and help some of us make it through the day.

daturkel 15 hours ago 0 replies      
I've played with Scrapy before to make a proof of concept and I was pleased with how easy it was (haven't had to use it for anything else yet).

That being said, had no idea how sophisticated it could get. This is super impressive, especially the JavaScript rendering.

rahulrrixe 14 hours ago 0 replies      
I have been using this framework for more than three years and seen how it has evolved and made scraping so easy. The Portia project is also awesome. I have customised Scrapy for almost all the cases, like having single spiders for multiple sites and providing rules using JSON. I think it is highly scalable with a bit of tweaking, and Scrapy allows you to do this very easily.
KhalilK 13 hours ago 0 replies      
I tried using PHP to scrape 50,000 webpages for a couple of fields and got it done in 4 hours; with Scrapy it took 12 minutes. Been using it ever since.
stummjr 15 hours ago 2 replies      
Hey, author here! Feel free to ask any questions you have.
contingencies 13 hours ago 1 reply      
Man, I feel old. Does anyone remember learning web scraping from one section of Fravia's site? Ever try to move forward from that to write a fully fledged search engine? These memories are from 15 years ago... quite amusing how much hasn't changed. In hindsight it was probably easier back then due to the lack of JS-reliant pages, less awareness of automation and less scraper detection algorithms.
staticautomatic 6 hours ago 2 replies      
I like this article for its discussion of these libraries. On another note...

Am I the only one who dislikes Scrapy? I think it's basically the iOS of scraping tools: It's incredibly easy to setup and use, and then as soon as you need to do something even minutely non-standard it reveals itself to be frustratingly inflexible.

nathell 15 hours ago 1 reply      
Wow, never heard of Scrapy! Looks like I've reinvented it in Clojure: https://github.com/nathell/skyscraper/
hokkos 13 hours ago 2 replies      
What I need is an API scraper; Scrapy seems to be mostly for HTML. I know how to look at network requests in Chrome dev tools and JS functions to understand the shape of a REST API, so I need something to plan the exploration of the argument space. For example, if you want to scrape Airbnb, you look at their API and find there is a REST call with a lat/long box. I need something to automatically explore an area: if the API only gives the first 50 results and you hit this limit, it should schedule 4 calls with half-size boxes and so on. If the request has cursors, you should be able to indicate to the scraper how to follow them. I don't know what the best tool for that is.
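The box-splitting strategy described here amounts to a recursive quadtree refinement; a minimal sketch (the `api_search` callable is a hypothetical stand-in for a real capped endpoint):

```python
def subdivide(box):
    # Split (lat0, lon0, lat1, lon1) into four half-size quadrants.
    lat0, lon0, lat1, lon1 = box
    mlat, mlon = (lat0 + lat1) / 2, (lon0 + lon1) / 2
    return [(lat0, lon0, mlat, mlon), (lat0, mlon, mlat, lon1),
            (mlat, lon0, lat1, mlon), (mlat, mlon, lat1, lon1)]

def scrape_area(box, api_search, cap=50):
    # A full page (== cap results) means the API truncated the list,
    # so recurse into smaller boxes until pages come back partial.
    # (Assumes results are spread out; many identical coordinates
    # exceeding the cap would need a recursion-depth cutoff.)
    results = api_search(box)
    if len(results) < cap:
        return set(results)
    found = set()
    for sub in subdivide(box):
        found |= scrape_area(sub, api_search, cap)
    return found
```

Cursor-based APIs are the easy case by comparison: just loop, requesting the next cursor until the API stops returning one.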
escherize 8 hours ago 0 replies      
Here are slides for a talk I gave about an interesting approach to scraping in Clojure [1]. This framework works really well when you have hierarchical data that's a few pages deep. Another highlight is the decoupling of parsing and downloading pages.

[1] - http://slides.com/escherize/simple-structural-scraping-with-...

bbayer 7 hours ago 1 reply      
I love scraping the web and producing structured data from web pages. The only downside of XPath and similar extraction approaches is the necessity of constant maintenance. If I had enough knowledge about machine learning, I would like to write a framework that analyzes similar pages and finds the structure of the data without being told which parts of the page should be extracted.
elktea 4 hours ago 0 replies      
Shame Python 3 isn't there for Scrapy yet - although, checking the repo, it looks like someone is actively (last few hours) porting it. Good to see.
inovica 14 hours ago 1 reply      
We use Scrapy for a few projects and it is really really good. They have a commercial-side to them, which is fine, but for anyone doing crawling/scraping I'd strongly recommend it. Good article also
indymike 13 hours ago 1 reply      
Great to see Scrapy getting some love. It's really well done and it scales well (used it to scrape ~2m job posts from ATS & government job banks in 2-3 hours).
novaleaf 8 hours ago 0 replies      
Anybody interested in browser automation can try http://api.phantomjscloud.com

disclaimer, I wrote it. No crawler yet, though that's next after a new website.

inovica 12 hours ago 4 replies      
Quick question. Could you use Scrapy to scrape specific individual pages from thousands (or millions) of sites, or would you be better off using a search engine crawler like Nutch for this? I want to crawl the first page of a number of specific sites and was looking into the technologies for this.
steinsgate 13 hours ago 2 replies      
Confession : I am guilty of using regex superpowers to extract data from urls. Will check out w3lib soon!
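For the URL case specifically, the standard library already covers most of what those regexes attempt (w3lib's URL helpers wrap similar logic); a small sketch with a made-up URL:

```python
from urllib.parse import urlparse, parse_qs

def query_param(url, name, default=None):
    # Extract a single query parameter without a hand-rolled regex.
    values = parse_qs(urlparse(url).query).get(name)
    return values[0] if values else default

url = "http://example.com/item?id=42&ref=hn"
```

`parse_qs` handles percent-encoding and repeated keys, which are exactly the edge cases ad-hoc regexes tend to miss.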
pjc50 13 hours ago 1 reply      
That reminds me, I was going to write a scraper to extract my HN comments.
shostack 5 hours ago 1 reply      
Rubyist here...how does Scrapy compare to Nokogiri?
A free shipping mystery goyet.com
255 points by Immortalin   ago   125 comments top 28
scholia 6 hours ago 7 replies      
The postal service is a networked delivery service, just like the Internet. The difference is that the data packets are physical envelopes.

Now, the cost of running the postal service is the cost of supporting the whole network of delivery vans, sorting offices, collections and so on, including all the staff.

The cost of supporting the network is the same whether it carries a very large number of packets or zero packets (up to the point where you have to add infrastructure to cope with extra traffic. Yes, like the Internet).

This economic structure means you can carry traffic at a marginal cost if you know the cost of supporting the whole network is covered.

All of which was worked out before Sir Rowland Hill launched the Uniform Penny Post in the UK in 1840. This disrupted the whole messenger business (where you paid for distance traveled) and was widely copied everywhere else. After that, nations formed a Universal Postal Union on the basis that "we'll deliver your letters if you deliver ours" (like the Internet).

In the early days of the public Internet (early 1980s), I used to explain how it worked by comparing it to the penny post. It's nice to be able to do the reverse ;-)
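The fixed-versus-marginal-cost point above can be made concrete with toy numbers (entirely illustrative, not real postal figures):

```python
def average_cost(fixed_network_cost, marginal_cost, packets):
    # The fixed cost is spread over volume, so the average cost per
    # packet falls toward the marginal cost as the network fills up.
    return fixed_network_cost / packets + marginal_cost

# A network costing 1,000,000 a year to run, 0.01 per extra packet:
quiet = average_cost(1_000_000, 0.01, 100_000)      # 10.01 per packet
busy = average_cost(1_000_000, 0.01, 100_000_000)   # 0.02 per packet
```

Once the fixed cost is covered, carrying one more envelope really does cost almost nothing, which is the whole penny-post insight.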

franciscojgo 6 hours ago 2 replies      
Here's an article from 2014 that explains this "loophole" https://www.washingtonpost.com/news/storyline/wp/2014/09/12/...
Animats 6 hours ago 4 replies      
It's a combination of subsidies in China [2] and really low Universal Postal Union rates for terminal delivery from China to the US. [1] Terminal delivery in the US is about $1/kg. Market rate from Shenzhen to Long Beach to ship a container is about $1200 right now.

[1] http://fortune.com/2015/03/11/united-nations-subsidy-chinese...

[2] http://cep.lse.ac.uk/pubs/download/cp396.pdf

applecore 2 hours ago 1 reply      
If you discover a product that sells for ten times more on Amazon than AliExpress, your first instinct should be to arbitrage the opportunity by ordering the product and selling it on Amazon (and letting Amazon handle the fulfillment: http://services.amazon.com/fulfillment-by-amazon/benefits.ht...).
tyingq 6 hours ago 0 replies      
The secret is government subsidies. Not just from China to Chinese businesses either. Some are driven by the UPU program through the UN [1], as well as individual receiving countries' programs, like the "ePacket" program where US-bound shipments from China get artificially low rates from the USPS.


PanMan 3 hours ago 1 reply      
This also surprised me: I ordered stuff on AliExpress that I wouldn't ship back to China for 0,70, even if the item was free (which electronics aren't).

I asked around and it turns out that postal services in rich (EU) countries have a special, heavily subsidized rate for third-world nations. This wasn't a problem with the occasional letter from Africa, but the postal services didn't really see this coming: mass free shipping from China. Apparently the EU wanted to get out of this, but China refuses (and seems to be able to, for now).

Enjoy it while it lasts :)

throwaway2048 6 hours ago 0 replies      
The Chinese postal system offers extremely low government-subsidized rates. This isn't just Alibaba; almost everybody shipping out of China offers free shipping.


For reference, 6 RMB = 0.91 USD.

Larger parties are able to obtain even lower rates.

AcerbicZero 6 hours ago 1 reply      
Couldn't this just be a variation on the classic loss leader sales strategy? It also reminds me of what Amazon originally* did, get someone to sign up for an account, buy a widget at or around cost, ship for free, and then make the money back on the more expensive larger items as you become the "go to" for whatever someone needs.

*I say originally, as my experience with Amazon has been on a downward trend for at least the last two years.

vdnkh 6 hours ago 7 replies      
As an aside, I've been screwed several times by Prime's "guaranteed two-day" shipping. I had a package which was very late last December. When I went to the USPS tracking page to check the location, it turned out Amazon had requested the 8-day shipping option. After several angry calls I finally had the item cancelled and a new one on the way with Saturday shipping (the lower-tier rep said that it wasn't possible, so I escalated to the manager). Lo and behold, about a month later my original item showed up and I had to return it immediately. This was my worst experience, but not my first with late shipping. I've never had so much as a Prime extension offered for my trouble.
bkeroack 6 hours ago 0 replies      
China subsidizes shipping for domestic businesses either a) explicitly as a part of economic policy, or b) implicitly/de facto via corruption.

In other words, for the business selling you the item, shipping is free. It's the Chinese citizen/taxpayer that foots the bill. This shouldn't be surprising since China, eg, is notorious for devaluing their own currency as a means of boosting exports. This is effectively a tax on the greater populace for the benefit of their manufacturing sector.

tarikjn 3 hours ago 0 replies      
This is mostly do to the Universal Postal Union Convention, it was discussed in details on this thread: https://news.ycombinator.com/item?id=9795017
chippy 5 hours ago 0 replies      
I thought the mystery was that China Post created something called "ePacket" to subsidize post to international destinations for online only shipments.

Amazon does not like this at all and are currently lobbying the US Govt to stop it.

randrews 3 hours ago 0 replies      
Same reason chip manufacturers offer free samples.

Some portion of the people who buy one 3-cent button are going to come back in a couple weeks and order forty thousand 3-cent buttons. Free, fast shipping on the first button is to entice you to use that button for your design instead of someone else's.

nissehulth 1 hour ago 0 replies      
I'm in Sweden and I've ordered stuff like a single USB cable from China at a price less than USD 1. Still, free shipping, and $1 wouldn't even buy me domestic letter postage.

If the item is small enough, it usually arrive in a padded envelope in my mailbox. Bigger items I may have to pickup at the post office. Most often, it arrives within 2 weeks.

I suspect that postal services in different countries have some kind of peering agreement. That they simply assume that the amount of mail would be pretty much the same in both directions and because of that, they don't really charge each other.

Someone1234 6 hours ago 1 reply      
I agree with the article, the term "free" is widely misused. Which is strange given the available options in the English language: bundled, included, prepaid, incorporated, et al.

If something costs $99 it isn't free. Free is no cost. $99 is not no cost, therefore $99 is not free. Just as text messaging isn't a "free" part of your $30/month cellphone plan, it is an included part however.

dsugarman 6 hours ago 3 replies      
On a tangential note about free shipping, aside from this China mystery, there is no such thing as free shipping. Free shipping is a marketing ploy which makes you feel like you are getting some value when you actually lose it. When you add the shipping component of an online purchase into the item price, it removes the ability for you to save when you purchase multiple items. Each one is priced as if it were being shipped on its own.
mickelsen 2 hours ago 0 replies      
Many of these 30-cent button / iPhone cable sellers are usually users with less feedback as well, and they need a cheap way to gain more reputation on AliExpress or eBay. This is a way to do it, and as you get more completed transactions, your seller account gets upgraded and gets better rates. I won't touch the shipping part, which I think is already well covered in the comments.
krschultz 4 hours ago 1 reply      
AliExpress is losing money on such sales. But why bother, then?

It has to be this. I work at an E-commerce company. We often do the math on just exactly which items make a profit for us vs which don't. It's not always obvious until you really dig into the exact costs of handling the items. I believe Alibaba has a lot of 3rd party resellers, it's possible they haven't done the math themselves.

cha5m 2 hours ago 0 replies      
A possible explanation to this could be that AliExpress is simply trying to attract people to their platform by any means possible.

This means subsidizing purchases to draw users in.

I just considered buying that 3c button just because of how cheap it was, and this would have required me to register with them, and enter my cc number. This means that purchasing stuff from them in the future would be easier and I would be more likely to do it.

This could explain the free shipping on the China leg, and then as was mentioned there is a treaty that allows for nearly free shipping in the US by Chinese companies.

mesozoic 2 hours ago 0 replies      
Yeah honestly it's amazing. I've ordered some stuff from there and at that cost I can't believe it makes it to me. Though the post office does require a signature on a lot of these and then holds it hostage and it's not worth me making a trip to the post office to retrieve my $1 item.
voltagex_ 3 hours ago 0 replies      
>I received my package around 10 days later, in perfect condition.

That's amazingly fast - I don't understand why it's faster to send things from Hong Kong/China to the US than from the same to Australia. I've had things from AliExpress take upwards of 35 days to get to me (in a "metro" area of SE Australia)

kerv 3 hours ago 0 replies      
I once bought an iPhone charge cable from eBay, free shipping from China, for $0.01. I won the auction a couple hours later. 6 weeks later it arrived!

No idea how it is possible. The bubble wrap protecting it during shipping is worth more than what I paid.

fiatjaf 4 hours ago 1 reply      
Of course there are subsidies and complex discount + partnership relations involved, but in the end the question is simply: WHO is paying for this?
randomgyatwork 7 hours ago 0 replies      
I sent a letter from Japan to Canada for 70 cents. Sending the same letter from Canada to Japan would cost at least $2.50

China must simply have cheaper shipping rates, obviously not 3 cents, but probably a fraction of what you are used to spending.

gohrt 4 hours ago 0 replies      
This thread is full of good ideas and information about $1-$3 products and int'l shipping agreements, but the "$.03 button" mystery is almost certainly simply an edge case that no one cares about. It's a rare purchase that is subsidized by the massively larger volume of people buying 1000-button batches or what-have-you.
atirip 6 hours ago 0 replies      
They most likely buy shipments in bulk. Volume is predictable, so they ask postal office(s): approx y packages, approx x size, how about z dollars in total for a year?
shiftpgdn 6 hours ago 2 replies      
I am 90% certain that a majority of international dealing Chinese vendors simply use counterfeit postage.
boxidea 7 hours ago 1 reply      
I'd bet the short obvious answer is that it doesn't cost them anything to ship small, lightweight products like this.

The plane was already flying from China to wherever, filled with AliExpress merchandise. They may have paid for the whole plane instead of just by weight. So adding this product didn't add any tangible extra weight or cost.

Then the mail carrier or whoever is already going on that route. I doubt he gets paid per package, so adding another small package is no big deal and doesn't add extra time/cost.

Unless someone is ordering 1million buttons, then the weight starts to add up. But then so does the cost of the buttons, which at that point would cover the shipping.

I'm generalizing a bit since I don't know the specifics of the shipping industry. But this makes sense to me.

OpenFace: Free and open source face recognition with deep neural networks cmusatyalab.github.io
245 points by fitzwatermellow   ago   45 comments top 8
thedangler 1 day ago 1 reply      
I would use it to detect friends that come over to my place and load their taste in music on my media server.
teps 17 hours ago 1 reply      
It would be interesting to see if it's possible to recognize people in films. I'm not sure if it's much harder or not. In a way, a video is more complicated than an image, but you have way more data to recognize a face. Does anyone know if there is any work in that direction?

A plugin for vlc that can show you the name of any actor when you ask would be really fun!
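One simple way video's extra data helps (a hypothetical sketch, not anything OpenFace ships): run whatever per-frame face recognizer you have, then let the sampled frames vote, so a few blurry misfires get outvoted. The `identify_actor` helper and the score thresholds below are invented for illustration:

```python
from collections import Counter

def identify_actor(frame_labels, min_confidence=0.5):
    """Aggregate noisy per-frame predictions into one name by majority vote.

    frame_labels: iterable of (name, score) pairs from some per-frame
    face recognizer (hypothetical here). Returns the winning name, or
    None if no name gets at least min_confidence of the retained votes.
    """
    # Drop very low-scoring detections, then count votes per name.
    votes = Counter(name for name, score in frame_labels if score > 0.3)
    if not votes:
        return None
    name, count = votes.most_common(1)[0]
    return name if count / sum(votes.values()) >= min_confidence else None

# A single blurry frame might misfire, but many frames usually agree:
frames = [("Alice", 0.9), ("Bob", 0.4), ("Alice", 0.8), ("Alice", 0.7)]
print(identify_actor(frames))  # Alice
```

This is why video can be easier in practice despite being bigger: temporal redundancy lets you trade per-frame accuracy for aggregate confidence.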

godzillabrennus 1 day ago 7 replies      
Oh good, now the 7/11's will be able to detect who I am by my face through their security cameras and will do some targeted advertising to me as I walk the aisles.
mr_spothawk 1 day ago 1 reply      
can copwatch.org find a clever way to use this?
kefka 1 day ago 3 replies      
I built an Open Source (GPL3) facial recognition program as well, called uWho (github.com/jwcrawley/uWho). Mine doesn't need anything like CUDA or OpenCL, and runs on a 6-year-old laptop at 1280x720@15fps.

I ran it at our free, ticketless convention called Makevention (Bloomington, IN). Estimates were that 650-700 people showed up. My tracker counted 669 uniques, which I think is spot on.

I also wrote mine for privacy in mind. The database was a KNN on a perceptual hash of the face. The data that was stored was only a hash and could only verify a face: it could not generate the face from the hash. Considering the application (Maker/Hacker con) I wanted to be sure that this was the case. (The data only resided on that machine, and it's wiped now.)

I've halted work on the GUI version of it. Now I want to make it into a client/server setup, where the clients are RasPis (or other cheap compute with a camera) and the server is whatever good machine you have. Initially, I'll reimplement the same algo, though I know that brute-force KNN lookups get slower with every sample added.
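The hash-only storage idea can be sketched in a few lines. This is a toy illustration of the privacy property described above, not the actual uWho code; the hash values and Hamming-distance threshold are assumptions:

```python
def hamming(a: int, b: int) -> int:
    """Bit-level Hamming distance between two integer perceptual hashes."""
    return bin(a ^ b).count("1")

class HashMatcher:
    """Store only perceptual hashes of faces; a stored hash can verify
    that a similar face was seen, but the face cannot be reconstructed
    from it."""
    def __init__(self, threshold=10):
        self.hashes = []          # list of ints, nothing else is kept
        self.threshold = threshold

    def seen_before(self, h: int) -> bool:
        """Nearest-neighbour check: brute force, so cost grows linearly
        with the number of stored hashes."""
        match = any(hamming(h, old) <= self.threshold for old in self.hashes)
        if not match:
            self.hashes.append(h)  # count a new unique visitor
        return match

m = HashMatcher()
print(m.seen_before(0b1011_0001))  # False: first sighting, stored
print(m.seen_before(0b1011_0011))  # True: within the distance threshold
print(len(m.hashes))               # 1 unique visitor counted
```

The linear scan is the scaling pain point the comment alludes to; structures like BK-trees or multi-index hashing are the usual next step once the sample count grows.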

mike_rochannel 16 hours ago 0 replies      
Actually, it's nice to see functionality like this in the open. Attached to a wide-angle camera pointed at a local train station, it should be fairly easy to match faces to people heading to work, leaving their houses and flats unguarded.

There was a nice book, "Database Nation", that described a case of scanning licence plates of cars crossing a bridge to see who's at home and who left for work. Made burglaries a lot easier.

And now we do that based on faces ... nice .. /s

finnn 1 day ago 1 reply      
The efforts people will go to to avoid using HTTPS.... https://cmusatyalab.github.io/openface/demo-1-web/
exolymph 1 day ago 3 replies      
"We do not support the use of this project in applications that violate privacy and security. We are using this to help cognitively impaired users to sense and understand the world around them."

Oh okay. Surely this will stop any bad actors.

Tech faces hour of reckoning as fundraising drops, layoffs rise usatoday.com
222 points by apsec112   ago   154 comments top 25
brianmcconnell 2 days ago 3 replies      
21 year SF resident here. I was here for the dot com bubble, and the smaller "multimedia" boom in the early/mid 90s.

I think what is unfolding is a shift away from consumer businesses and back to boring business to business services. The latter tend to have more predictable cash flow (which ironically is a handicap when VC money is flowing to "viral" consumer businesses that can paper up their growth).

B2B services whose customers are not linked to the local tech economy should do just fine. It might be harder to raise money for a while, but that's good too; it'll weed out the weaker companies. OTOH, consumer companies that already have ten years of rapid growth priced into their valuations are pretty screwed. Hard to see how they avoid some really painful and damaging adjustments.

And a word of advice for workers. If you are at all unhappy with your job, or suspect that your company is vulnerable to a downturn, now is a good time to start looking around for something you'll be ok with for a couple years. Once companies start laying people off in larger numbers, it will get ugly (maybe not 2000 ugly, but it won't be fun).

StavrosK 2 days ago 6 replies      
Wow, this article looks like it's pushing some agenda hard. They took one quarter that was pretty much the same for NA and Asia and much better for Europe as the comparable quarter in 2014, and used it to report "fundraising drops"? Looks to me like someone is trying to actually bring about fundraising drops by using spurious data.

"Q4 2015 pretty much the same as Q4 2014, the sky is falling!"

puranjay 2 days ago 4 replies      
This comment from Tucker Max's article on why he stopped angel investing bears repeating here:

> "But when you have a lot of money chasing all these great ideas, and you combine it with the fact that entrepreneurship has gotten sexy in the last few years and become the in thing for a certain crowd, what you end up with is a huge number of people starting companies who have no business at all doing that."


I've said this to tons of people privately and publicly: entrepreneurship has become the new "I'm working on a novel". Lots of people have good ideas in their heads. That doesn't mean they should be building businesses on them.

Some people are just not cut out to be entrepreneurs. And once the money dries up, these will be the first ones to go.

The rest will continue building businesses because they can't really think of doing anything else.

VeilEm 2 days ago 4 replies      
Great, can we stop calling it a bubble now? Like if there were an actual bubble in the bay area, bad trends like this would have popped it? The thing is things aren't really see-sawing. The pendulum isn't going to swing as hard as it did in 2000 as it would have done so already.

That the technical companies in the bay area seems to be isolated from the hardships of the rest of the economy doesn't make it a bubble, it's simply a place with a different economic focus with different economic realities.

I swear people around here actually sometimes seem to wish bad things would happen, as some kind of schadenfreude.

We see these high valuation numbers being lobbed around as some kind of insult to good sense, but we're talking about just a few companies, and in the grand scheme of the economy of San Francisco, Palo Alto, Mountain View, and San Jose these numbers are a tiny fraction of the overall system.

The reality is that software is still eating the world, and more change and value is still to come: automated vehicles, robotics, smart appliances, VR, and a myriad of far less sexy technological innovations with sound economic value creation.

Ologn 2 days ago 2 replies      
20 years ago, if you wanted to ship software, or run a web server on the Internet, you needed some amount of capital. I worked at one of the cheaper local Internet Service Providers in 1996, and with a 3-year contract we gave you 1 gig of traffic a month on your colocated box at 10 MB port speeds for $540 a month - you provide your own box.

Now for $10 a month, Linode provides me with the VPS box, I get 2 TB transfer, 125MB port speed, and the plan is not yearly or monthly or even daily, but hourly.
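The cost-per-gigabyte arithmetic from those two data points (figures taken from this comment; the ratio, not the exact numbers, is the point):

```python
# Cost per GB of transfer: 1996 colo contract vs a ~$10/month VPS.
cost_1996 = 540 / 1    # $540/month for 1 GB of traffic
cost_now  = 10 / 2000  # $10/month for 2 TB (2000 GB)

print(f"1996: ${cost_1996:.2f}/GB")                     # $540.00/GB
print(f"now:  ${cost_now:.4f}/GB")                      # $0.0050/GB
print(f"roughly {cost_1996 / cost_now:,.0f}x cheaper")  # 108,000x cheaper
```

Five orders of magnitude in twenty years, which is the backbone of the argument that fundraising matters less than it used to.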

There are two app stores I can put apps out with, with a variety of monetization schemes. I know for Android that over 1 billion people use their Android device at least once a month. There is also the web.

There has been an explosion of open source software, and improvements on existing software - Linux, Apache, and MySQL. Or Nginx and MongoDB. Plus free Java application servers, or Python, PHP or Javascript web frameworks. There is Paypal and Stripe.

Companies have gone from renting offices to subleasing to co-working to virtual offices.

If you can program (or can install software with minimal programming skill, have some creative ideas, and will hustle), there has never been a time when fundraising has been less important, because you do not need money to get your product to market.

This being the case, worrying about getting a job and layoffs makes less sense. Nowadays you don't need an office, a shrink-wrapped software deal with computer stores, a handful of leased colo'd servers, and an IT team to support them. You can start with much less, get to where you're making a minimal living, and then go from there. After that you can take a job, or take seed/VC money if you want, but you don't have to.

I have seen this come to pass, and I have heard many luminaries in Silicon Valley say the same thing. Of course, in good times it is less work and easier to get a job making low six figures programming than it is to make $30k a year on your own bootstrapped business, never mind pushing that up to $100k. It is a lot of work, as many have said. But it is doable like never before.

FussyZeus 2 days ago 2 replies      
The companies that don't make it: CEO's fly away on golden parachutes sewn with the money they didn't earn, middle management goes on to other companies with an Impressive Name on Resume (+2 to hire-ability skill) added, and the engineers lose it all on the company stock they were promised could only go up and have to find a new source of income for the multi-thousand a month studio apartment they rented to get out there.

(Yes I realize this is overly simplified and bleak, that's the point.)

is_it_worth_it_ 2 days ago 2 replies      
Less VC money means less demand for engineers. Less demand for engineers means dropping wages. Higher levels of outsourcing and H1Bs also mean dropping wages. American engineers, though "scarce," will see a significant drop in wages; salaries in this field are going to collapse. They were being propped up by the government printing money, which pushed up tech companies' stock and VC money. With all that money no longer circulating, we will see wages go down.
jjoe 2 days ago 3 replies      
Most startups have already priced this fundraising drought into their previous rounds. Instead of raising exactly what they needed, they over-raised by double or triple that. It's selfish in some way, as it likely contributed to the negative sentiment and the high valuations.

I think this won't be acute for incumbents, but it will be for early-stage startups that badly need funds to extend a short runway.

xiphias 2 days ago 3 replies      
My favourite part: "We're in a bit of a different bubble this time; the exuberance now has a foundation," as measured by market size and sales potential, he says.

Sure, it's always different this time for a CEO who needs to keep investors' hopes up. Sales will crash with the deflation that is coming, and overvaluations are followed by undervaluations when interest rates increase.

jondubois 2 days ago 0 replies      
Sounds like things are going to get tough. I have mixed feelings about this.

On one hand, I feel sorry for all those software devs (myself included) who might be out of a job in the next few months or whose salaries will start shrinking.

On the other hand, it will be nice to watch some over-funded startups crumble to pieces and make room for more deserving (bootstrapped) newcomers.

at-fates-hands 2 days ago 1 reply      
Nobody should really be surprised by this. People all over tech have been arguing about the bubble for some time. I see this as an early stage harbinger of things to come though.

With the Fed raising rates, VC's already starting to hedge their bets, and friends of mine leaving startups to go back to stable corporate gigs, I see a shuffling of the deck, but nothing too major.

I feel like this is the normal eight year tech cycle that is just coming back around again:

- 1992 recession and crash

- 2000 recession and crash

- 2008 recession and crash

bovermyer 2 days ago 5 replies      
On the face of it, this just appears to be a possible end of the VC bubble. I highly doubt that this means much for startups that are built solely with founders' capital.
guylepage3 2 days ago 1 reply      
I've been hearing more and more of the companies feeling the Series A crunch. This will lead to many Seeds not making it into their next round. The times are changing for sure.
djcapelis 2 days ago 0 replies      
Huh. Is it finally starting?

Overdue, if true. If not, I guess we'll keep waiting for the next hangover to catch up to us.

joshmaher 1 day ago 0 replies      
If you're a founder facing a fundraising struggle at the seed, bridge, or Series A round, I highly recommend you learn more about what your potential investors are thinking, and what they need to know from entrepreneurs before they invest, by reading about the 20 different archetypes of investors. After a year of research, these archetypes are identified in my book Startup Wealth - http://amzn.to/1Jej8El
dangerpowpow 2 days ago 0 replies      
Data shown is based on 3 months.
brightball 2 days ago 0 replies      
"tech" is a bit broad. Programming startups are the most capable of having a very low operating budget with high growth, so their burn rates should be easier to minimize to get the most out of those investments.

The tech startups that involve actual physical products with manufacturing, distribution, shipping, retail placement, etc are another story.

It seems like we would be well served to have terms to differentiate for sake of headlines like this.

Kluny 2 days ago 0 replies      
Are we taking USA Today seriously now? Why, please?
debacle 2 days ago 0 replies      
People shouldn't see this as negative signaling for tech, but rather positive signaling for other parts of the economy.
sholanozie 2 days ago 1 reply      
American investors would be well advised to look north of the border...
nemo44x 2 days ago 0 replies      
""Companies will still raise funding, but at lower valuations," says Arianna Simpson, a Silicon Valley-based investor."

I wonder if this means companies have gotten wise to the bad deals they were signing just to get a high valuation and are agreeing to lower valuations but not giving up preferred stock that is so powerful. I know a few smaller companies that have done this and prefer to not make headlines with sky high valuations that everyone knows are meaningless.

This is a good thing.

sjg007 2 days ago 0 replies      
In the event of a downturn I think Facebook could absorb a large amount of the laid off employees.
code4tee 2 days ago 0 replies      
When the tide goes out you get to see who's been swimming naked. The tide is starting to go out...
peter303 2 days ago 0 replies      
Pretty accurate. Most of the so-called unicorns are overvalued by a factor of four or so through accounting tricks.

The article is correct that it is not as silly as the late 1990s. Many startups now actually have working products, revenues, and sometimes profits. Many 1990s companies lacked those.
