hacker news with inline top comments - 27 Mar 2016 - Best
1
NPM and Left-Pad: Have We Forgotten How to Program? haneycodes.net
1702 points by df07  3 days ago   851 comments top 177
1
runin2k1 3 days ago 25 replies      
Holy moly-- is-positive-integer/index.js:

  var passAll = require('101/pass-all')
  var isPositive = require('is-positive')
  var isInteger = require('is-integer')
  module.exports = passAll(isPositive, isInteger)
I retract my previous statements that JavaScript programmers are descending into the same enterprise-y mess that Java programmers did a decade ago.

They've already taken it to an entirely different level of insanity.

2
atjoslin 3 days ago 13 replies      
Counter-argument:

A good micro-module removes complexity. It has one simple purpose, is tested, and you can read the code yourself in less than 30 seconds to know what's happening.

Take left-pad, for example. Super simple function, 1 minute to write, right? Yes.

But check out this PR that fixes an edge case: https://github.com/azer/left-pad/pull/1
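(For context, the entire module at the time was roughly the following; this is reconstructed from memory, so details may differ:)

  module.exports = leftpad;

  function leftpad(str, len, ch) {
    // Coerce to string so numbers can be padded too
    str = String(str);
    var i = -1;
    // Default pad character is a space, but 0 is allowed as a pad char
    if (!ch && ch !== 0) ch = ' ';
    len = len - str.length;
    while (++i < len) {
      str = ch + str;
    }
    return str;
  }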

The fact of the matter is: every line of code I write myself is a commitment: more to keep in mind, more to test, more to worry about.

If I can read left-pad's code in 30 seconds, know it's more likely to handle edge cases, and not have to write it myself, I'm happy.

The fault in this left-pad drama is not "people using micro-modules". The fault is in npm itself: all of this drama happened only because npm is mutable. We should focus on fixing that.

3
nly 3 days ago 5 replies      
Nobody has forgotten. These people never knew to begin with.

NPM/JS has subsumed the class of programmer who would previously have felt at home in PHP's batteries-included ecosystem. Before that, a similar set of devs would have felt at home with Visual Basic. Seriously, go visit the comments section on archived copies of the PHP documentation. You'll find code of a similar nature. If PHP had had a module system 10+ years ago you would have seen this phenomenon then. Instead it was copy and paste.

This isn't elitism, it's just the way it is. The cost of a low barrier to entry into a software ecosystem is taking in those who don't yet have software engineering experience.

Nobody should be surprised that NPM, which I believe has more packages than any other platform, is 90% garbage. There are only so many problems to solve and so few who can solve them well, in any language. Put 100 programmers in a room, each with 10 years experience, and you'll be lucky to find 1 who has written a good library. Writing libraries is really hard.

4
Wintamute 3 days ago 2 replies      
Going down the "lots of tiny modules" route comes down to a few things:

a) No standard lib in JS

b) JS is delivered over the internet to web pages in a time-sensitive manner, so we don't want to bundle huge "do everything" libs. Sometimes it's convenient to just grab a tiny module that does one thing well. No other platform has the same restriction

c) Npm makes it really easy to publish/consume modules

d) And because of c) the community is going "all in" on the approach. It's a sort of experiment. I think that's cool ... if the benefits can be reaped while the pitfalls are understood and avoided, then JS development will be in an interesting and unique place. Problems like today's can help because they highlight the issues, and the community can optimise to avoid them.

Everyone likes to bash the JS community around, we know that. And this sort of snafu gives a good opportunity. But there are many JS developers working happily every day with their lots of tiny modules and being hugely productive. These are diverse people from varied technical backgrounds getting stuff done. We're investigating an approach and seeing how far we can take it.

We don't use tiny modules because we're lazy or can't program, we use them because we're interested in a grand experiment of distributing coding effort across the community.

I can't necessarily defend some of the micro modules being cited as ridiculous in this thread, but you can't judge an entire approach by the most extreme examples.

5
NathanKP 3 days ago 8 replies      
I don't see anything wrong with using a pre-made left pad function. Why waste time and lines of code implementing something so trivial when there is already a solution available?

However, I agree it is ridiculous to have a dedicated module for that one function. For most nontrivial projects I just include lodash, which contains tons and tons of handy utility functions that save time and provide efficient, fast implementations of solutions for common tasks.

Lodash includes `padStart` by the way (https://lodash.com/docs#padStart).
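For instance (lodash 4.x; these examples are straight from the docs):

  _.padStart('abc', 6);        // => '   abc'
  _.padStart('abc', 6, '_-');  // => '_-_abc'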

6
adambard 3 days ago 4 replies      
I think it speaks to just how lacking the baseline Javascript standard library is. The libraries that come with node help, but all of this stuff seems like it should be built-in, or at least available in some sort of prelude-like standard addon library. The lack of either leads to all these (apparently ephemeral) dependencies for really simple functions like these.

That said, I work with Java, Clojure and Python mostly so I may be more used to having a huge standard library to lean on than is typical.

7
mastazi 3 days ago 5 replies      
Usually, dependency hell doesn't bite you, until it does. Try to rebuild that thousand-dependency app three years from now and you'll see ;-)

I recently had to rebuild a large RoR app from circa 2011 and it took me longer to solve dependencies issues than to familiarise myself with the code base.

Excessive dependencies are a huge anti-pattern and, in our respective developer communities, we should try to circulate the idea that, while it's silly to reinvent the wheel, it's even worse to add unnecessary dependencies.

8
larkinrichards 3 days ago 0 replies      
I wanted to write this post after the left-pad debacle but I've been beaten to it.

I think we got to this state because everyone was optimizing js code for load time-- include only what you need, use closure compiler when it matters, etc. For front end development, this makes perfect sense.

Somewhere along the line, front-end developers forgot about Closure Compiler, decided lodash was too big, and decided to do manual tree shaking by breaking code into modules. The close contact between Node.js and front-end JavaScript let this silly idea travel out of front-end land and into back-end land.

Long-time developers easily recognize the stupidity of this, but since they don't typically work on Node.js projects, they weren't around to prevent it from happening.

New developers: listen to your elders. Don't get all defensive about how this promised land of function-as-a-module is hyper-efficient and the be-all end-all of programming efficiency. It's not. Oftentimes you already know you're handling a string, you don't need to vary the padding character, and you know how many characters to pad. Write a for loop; it's easy.
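A minimal sketch of that case (assuming the input is a string and padding with spaces):

  function pad(str, width) {
    var out = str;
    // Prepend spaces until the string reaches the desired width
    for (var i = str.length; i < width; i++) {
      out = ' ' + out;
    }
    return out;
  }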

Note that this is exactly the sort of question I ask in coding interviews: I expect a candidate to demonstrate their ability to solve a simple problem in a simple manner; I'm not going to ask for a binary search. Separately, I'll ask a candidate to break down a bigger problem into smaller problems. In my experience, a good programmer is someone who finds simple solutions to complex problems.

Note: the Rails community is similarly pushing back against developers who take on too many dependencies:

https://www.mikeperham.com/2016/02/09/kill-your-dependencies...

9
darawk 3 days ago 4 replies      
Everything in this article is categorically wrong and antithetical to every principle of good programming ever articulated. The only problem here, as others have already noted, is that NPM allows people to delete published packages.

Small modules are not evidence of a problem, and they certainly aren't evidence of an inability to implement these things on the part of the people depending on them. Why would I implement left-pad myself when there is already a well-tested implementation that I can install? Building up an ecosystem of tiny abstractions, bit by bit, iteratively and evolutionarily, is how we get robust, well-designed complex systems. We don't get there by everyone reinventing the left-pad function to sate some misplaced appetite for self-reliance.

The author seems to make some arbitrary distinction between things that are 'large enough' to be packaged and 'pure functions' which are 'too small' to be their own modules, and I just couldn't disagree more. Tiny, pure functions are ideal modules. They facilitate the greatest degree of re-use, most clearly articulate what they ought to be used for, and stateless things are, in general, more composable than stateful things. There is no better unit of re-use than a tiny, pure function.

10
panic 3 days ago 2 replies      
> Functions are too small to make into a package and dependency. Pure functions don't have cohesion; they are random snippets of code and nothing more. Who really wants a cosine dependency? We'd all really like a trigonometry dependency instead which encompasses many tricky functions that we don't want to have to write ourselves.

This is a pretty weak argument. What is "cohesion" and why do we care that modules have it? Joe Armstrong, one of the creators of Erlang, has argued the opposite (http://erlang.org/pipermail/erlang-questions/2011-May/058768): that lots of small, individual-function modules are better than a "misc" module that grows endlessly and may overlap with other people's "misc" modules.

Calling a function instead of writing the code yourself doesn't mean you've forgotten how to program! The real problem here is the cost and risks associated with dependencies in general (both of which are actually lower for single-function modules), and the broken package removal policies of npm.

11
haddr 3 days ago 1 reply      
While in general I agree with the article, I must admit that I also strongly DISAGREE with the overall message. Especially with this: "Finally, stringing APIs together and calling it programming doesn't make it programming."

Stringing APIs together is actually what programming is. This is building software: for instance, when I use the .toString() method I can easily forget how it is done, focus on other high-level things, and not care about dependencies, as long as everything works fine.

Let's admit that the main problem here is with broken npm, rather than with the packages themselves. If someone has written the "leftpad" function, it is so I don't have to write it again, and I can save probably 15-40 minutes of programming and checking corner cases.

Also please note that javascript can be really tricky down in the details. So if there's anything that can help, it's better that it exists, rather than not.

12
pjlegato 3 days ago 4 replies      
Yes, or more accurately a large new generation of coders is entering the workforce who know how to code only in a superficial sense and think this is a good thing.

Programming, and especially startup programming, is being taken over by people who are primarily technicians rather than engineers. They want to assemble prefab components in standardized ways rather than invent new things. They are plumbers who know how to install from a menu of standard components, rather than civil engineers designing purpose-built one-off aqueducts.

It is the inverse of the "not invented here syndrome." The technician-programmer is trained to minimize time spent thinking about or working on a problem, and to minimize the amount of in-house code that exists. The goal is to seek quick fix solutions in the form of copy/paste from StackOverflow, libraries, and external dependencies to the greatest extent possible.

In house coding should, they believe, ideally be limited to duct-taping together prebuilt 3rd party libraries and services. Those who want to reinvent the wheel are pompous showboating wankers (they believe); creating your own code when you don't absolutely have to is a self-indulgent waste of time for impractical people who just like to show off their hotshot skills and don't care about getting things done. Move fast and break things and all that.

This began with stuff like PHP but really got going with Rails, which preached convention over configuration as a religion, and supplied a standardized framework into which you could easily fit any generic CRUD app that shuttles data between HTML forms and a database (but is painful if you want to deviate from that template in any way.) Note that Rails doesn't use foreign keys and treats the relational database as little more than a glorified persistent hash table.

This set the stage for Node.js (why bother learning more than 1 programming language?) and NoSQL (why bother learning how database schemas work?)

13
peferron 3 days ago 0 replies      
The funniest thing about this entire debacle is the thousands of self-assured programmers coming out to show the JS/NPM world how it's done, only to have their short, simple, no-nonsense functions fail miserably on some edge case they didn't think about.

This discussion about the "isarray" package is probably my favorite: https://www.reddit.com/r/programming/comments/4bjss2/an_11_l...

14
mooreds 3 days ago 2 replies      
Relevant:

'"The Excel development team will never accept it," he said. "You know their motto? 'Find the dependencies -- and eliminate them.' They'll never go for something with so many dependencies."

In-ter-est-ing. I hadn't known that. I guess that explained why Excel had its own C compiler.'

http://www.joelonsoftware.com/articles/fog0000000007.html

15
pkrumins 3 days ago 0 replies      
Yes, we have.

The entire JavaScript ecosystem is a huge catastrophe. It could collapse any day now. It's complex, fragmented, and no one really likes it. There are a dozen different tools to get started with. No one even understands how to get started easily. There are no fundamental tools. Everything changes every week. You can't just build a product and then rebuild it even a month later. Nothing works anymore a month later - your dependencies have changed their APIs, your tools have different flags that do different things, there are new data models that you never needed and shouldn't even care about.

Developers are under high stress. DevOps engineers are under even more stress, because they get to see what developers don't.

It's a huge mess, and my advice to "prefer core-language solutions to small abstractions to small helper libraries to general libraries to frameworks" (http://bit.ly/1UlQzcH) has never been more relevant than it is today.

Software should be developed with the least amount of complexity, dependencies and effort, using fundamental tools that have been here and will be here for the next 20 years. Cut those dependencies; you don't need them. They're here today and won't be here tomorrow.

16
pfooti 3 days ago 0 replies      
So, I suppose you could do something like this instead.

  function leftPad(str, width, pad = ' ') {
    const actualWidth = Math.max(str.length, width);
    return `${pad[0].repeat(actualWidth - str.length)}${str}`;
  }
And that would do a leftPad pretty well, and be reasonably robust to stuff like the required width being less than the string width, the padding character being multiple characters long, and so forth. It doesn't do any type-checking of course.

It also doesn't work on older browsers - both String.repeat and template strings are new. You could fake it with string addition, but addition behaves oddly when your arguments are numbers, whereas template strings handle that. There's also a trick where you can say (new Array(desiredLength + 1)).join(' ') to make a string of the appropriate length, but you've got OBOEs to worry about if you're not paying attention (Array.join puts the character between the elements, so you need an n+1 array for an n-length string). Also, at least on some browsers, Array.join is pretty cruddy, and you really ought to construct the string with an old-fashioned for loop.
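A sketch of that pre-ES6 fallback, for illustration:

  function leftPadOld(str, width, ch) {
    ch = ch || ' ';
    var padLen = Math.max(width - str.length, 0);
    // The off-by-one trap mentioned above: joining an (n+1)-element
    // array with ch yields an n-character pad string.
    return new Array(padLen + 1).join(ch) + str;
  }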

Javascript has all kinds of weird corner cases and lots of browser compatibility problems. The fact that someone's written a decent implementation of something that should have been standard in the String object means I don't have to worry about it.

Of course, I do have to worry about stuff like losing access to left-pad when someone throws an npm tantrum, or dealing with future build issues if npm becomes untrustworthy. A cryptographically sound package manager seems like a reasonable want, especially after this week's issues.

But if your take-away from this whole problem is "meh, javascript devs are lazy", you're missing the point.

17
terryf 3 days ago 1 reply      
So, apparently some guys managed to build a system where it is very easy to re-use small parts of other people's code, and now the author is complaining that "too much code re-use is happening"?

I'm fairly old, so I remember the complaints people had a decade or two ago: "We can compose hardware from ICs and you don't have to know what's going on inside and it's all standard and it just works! Why can we not do that with software?!" (Of course, that ended up with things like CORBA and DCOM, which was all wrong.)

aaaand here we are in a situation where code re-use is actually happening on a wide scale, and now you're complaining about that?

28k lines in an empty project? Ha - how many lines of code does the preprocessor generate for #include <stdio.h>? I haven't actually measured, but I bet it isn't that far off from 28k lines.

18
tschellenbach 3 days ago 2 replies      
In a way this shows what a great job NPM did at making it easy to publish packages. It's so easy that people decide to package up extremely easy functions.

As a Python developer I would never publish such a small package, simply due to the overhead of setting up a pip package.

19
haberman 3 days ago 1 reply      
> In my opinion, if you cannot write a left-pad, is-positive-integer, or isArray function in 5 minutes flat (including the time you spend Googling), then you don't actually know how to code.

Spoken like someone who writes functions in 5 minutes that I find bugs in later.

Just because a problem is simple to describe informally doesn't mean it is simple to implement without bugs.

20
jgrahamc 3 days ago 0 replies      
> What concerns me here is that so many packages took on a dependency for a simple left padding string function, rather than taking 2 minutes to write such a basic function themselves.

It takes more than two minutes. That little module has a test suite; if you're including it, you have some assurance it does what it says it does. If you write it yourself, you've got to worry about whether it works or not.

21
qewrffewqwfqew 3 days ago 1 reply      
The elephant in the room here is JavaScript. No other scripting language has been so bloody awful in the past that a five-line module saving you from typing `Array.prototype.forEach.call`, or one that papers over the language's awful idea of equality comparison with three functions, counted as "useful" or as "more than five minutes of coding without references".

Granted, these modules don't do such useful things, but that's the environment their creators were immersed in.

22
spo81rty 3 days ago 0 replies      
As a .NET developer I avoid adding packages and references at all costs, for all of these same reasons. Usually, for something simple like this left-pad example, I just copy the code and put it into my project in some sort of class of helper functions or extension methods.

Seems like a lot of these basic JavaScript functions need to be built into JavaScript/Node itself, or consolidated down to a single package of common core functions that extend JavaScript - padding, is-array, etc. As others mentioned, these are fundamental things in other languages.

23
xupybd 3 days ago 0 replies      
Have we forgotten how to program? No, we have changed the way we program. We leverage the work of others to solve ever more complex problems. When we do this we get more than just the simple functionality we require. We get all the testing that comes from a package being used by thousands of projects. We get updates when people find a better faster way. We get to lower the number of lines of code we have to maintain.

Yes, these are simple tasks but do we gain anything doing these simple tasks ourselves? I find there is a finite amount of focus I have when programming, and I'd rather spend that solving the bigger problem.

This reminds me of a discussion I overheard between two professors when I was an undergrad.

Prof 1: "This new language makes it so much easier to do x, y and z."

Prof 2: "Yes, but what's the point? I can do all these things in Java."

Prof 1: "I could do all these things in NAND gates, but I'll get more work done if I have this tool."

24
xdissent 3 days ago 2 replies      
In the case of left-pad, 2,538,464 of its 2,550,569 downloads last month are attributed to dependents of the line-numbers package (https://www.npmjs.com/package/line-numbers). So it would appear that relatively few people directly rely on left-pad, which highlights the importance of vetting the dependencies of dependencies.
25
nick32m 3 days ago 1 reply      
What's the problem with writing a function with a few lines of code and exporting it as a module?

I think it's totally fine. Like other people said, it's the mindset we borrow from Unix: do one thing and do it well. The function would be well tested, and could be reused.

I don't understand why so many people just require lodash into their project at the start when they only use one function or a minimal set of them. I mean, lodash is a great library with clean and well-tested code, but it's also quite bulky, like a big utility lib; most of the time I only need one or two of the functions, so I would just go to npm and find a module that does exactly that thing.

26
buckbova 3 days ago 2 replies      
Nearly everyone has had the "Don't reinvent the wheel" nonsense drilled into them, for better or worse.

I use lodash in almost every javascript project I start, big or small because it makes my life easier.

I'd rather use the lodash isArray than roll my own.

https://lodash.com/docs#isArray
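For example, from the lodash docs:

  _.isArray([1, 2, 3]); // => true
  _.isArray('abc');     // => false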

27
gardano 3 days ago 0 replies      
When I was prevented from upgrading Xcode/Swift for a project last year because of Podfile dependencies, it cemented in my mind that every time I include a dependency, I'm putting myself at risk.

If I can spend half a day writing the code myself, I will -- because I know it will prevent headaches down the road. Yes, yes, I know any code I write adds to the probability of bugs down the road. But at least the roadblock would be of my own doing, and not dependent on if/when the package maintainer upgrades their stuff.

28
supermatt 3 days ago 0 replies      
There are lots of useful titbits often left out of a language's core lib.

Back in the day we would have built up various pieces of library code, which we would have utilised rather than continually guessing at best practices. For example, the isArray method cited may be trivial, but it is also non-obvious. We'd probably have something like that in our library of useful snippets.

Sometimes we would share the code on forums and the like, and people would copy and paste it, sometimes into their own libraries. We would locate these snippets by browsing known locations or inefficiently querying search engines.

Now we utilise a central resource and simple tools to have a global library of code. Instead of searching far and wide, we query a handful of tools that effectively do the copying and pasting for us.

How that can be considered a bad thing is beyond me. It's not a question of knowing how to code, it's a question of using your time effectively.

Granted, there is the problem of so-called modules being removed and dependencies breaking. This can be alleviated by vendoring your modules, a simple task with most dependency management tools.

Personally I think that published modules should persist indefinitely, based on the license the code utilises, although I'm not clear on the actual legalities of the recent npm issue (although if it's due to a trademark complaint, I don't see how it would ever be enforceable against completely unrelated code in any slightly sane country).

29
newobj 3 days ago 1 reply      
Have we forgotten how to program?

Maybe we've forgotten how to /just/ program. Everyone bangs the drum of "let GitHub be your resume" so hard, incentivizing putting every brain fart you ever had out into the universe instead of just keeping it to yourself.

Just a thought.

30
lmm 3 days ago 1 reply      
If your package manager is so cumbersome that it makes a 14-line package not worth it, get a better package manager.

We haven't forgotten how to program. We've got better at it.

31
joeandaverde 3 days ago 1 reply      
I completely agree with the author.

The loudest people in the Node community have been evangelizing this practice for as long as I can remember. This shouldn't come as a surprise.

The argument "If I didn't write it, I don't have to think about it" is ludicrous. I just have to point at the left-pad incident to disprove the premise of this argument.

The analogy of building things out of a bunch of npm Lego blocks is laughable. Those advocating that trivial functions be acquired as module dependencies are leading the masses astray.

"But, If I find that there's a bug in a module I can AUTOMATICALLY fix it everywhere!"

No.

You still need to assess how the change to that module impacts any code that depends on it. Simply updating a module to pick up a "minor" bug fix can introduce other bugs in code that RELIED on the behavior as it was originally written.

It's simple, write your own trivial functions. Test them. Maintain them.

P.S.

Another module that could easily be inlined into every code base you own (3 million downloads this week):

https://www.npmjs.com/package/escape-string-regexp
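For reference, that module's entire implementation is roughly this (paraphrased from memory, so details may differ):

  var matchOperatorsRe = /[|\\{}()[\]^$+*?.]/g;

  module.exports = function (str) {
    if (typeof str !== 'string') {
      throw new TypeError('Expected a string');
    }
    // Prefix every regex metacharacter with a backslash
    return str.replace(matchOperatorsRe, '\\$&');
  };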

32
adamwong246 3 days ago 0 replies      
"There are no small parts, only small actors". - Constantin Stanislavski

If there's a flaw to this debacle, it's that packages can be un-published. That is some grade A+ BS.

But no, there is no such thing as a package too small. Coding is hard. Collaboration should be default-on, not default-off.

33
nnq 3 days ago 1 reply      
...maybe it's time a committee of really smart people sat down, sifted through all the most-used modules below N lines of code or so, and just wrote an open-source JS stdlib, hopefully merging in the top 30% most-used methods of lodash too? Node/npm is a great example of why too much democracy and decentralization is bad. Just gather some experts and have them centrally plan a "standard library", then impose it as an industry standard with a recommended no-fragmentation policy ("no forking" and "use it all or not at all") - the hell with your web app's need for "performance"... even a few hundred Ks of code will not hurt anyone nowadays, ff sake...

I even consider PHP a "more sane" language because you at least have most of the useful utility functions in a global namespace and everyone uses them. Of course, the real ideal on this is Python's solution: a nice set of standard libraries that you know are baked in, but most of them you still import explicitly - hence it's pretty easy to write even large Python applications that have a small and comprehensible number of dependencies!

(And more generally: our striving for "more efficiency" in programming is stupid imho! I'd always take a less efficient solution, even a "less safe/tested" one, if it's more "understandable" and "explainable". Sometimes, paradoxically, making things a bit more monolithic and centrally planned makes them orders of magnitude easier to reason about for our tiny ape brains...)

34
sauere 3 days ago 1 reply      
> There's a package called isArray that has 880,000 downloads a day, and 18 million downloads in February of 2016. It has 72 dependent NPM packages. Here's its entire 1 line of code: return toString.call(arr) == '[object Array]';

How anyone can deal with JavaScript for more than 5 minutes is absolutely beyond me

35
asragab 3 days ago 0 replies      
This was probably grotesquely naive of me, but I literally had no idea how Jenga-tower-like the JavaScript ecosystem was. Eye-opener!
36
tomohawk 3 days ago 1 reply      
Rob Pike: "A little copying is better than a little dependency"
37
chukye 3 days ago 0 replies      
Man, I have to quote: `In my opinion, if you cannot write a left-pad, is-positive-integer, or isArray function in 5 minutes flat (including the time you spend Googling), then you don't actually know how to code.`

You would be surprised how many developers these days are afraid to write such functions, or how lazy they are. They find some module and just add it to a project, then push it to some Google list and the project gets a lot of followers, and the next day the project has changed about 90%. I've seen this happen over and over again in this ecosystem; this is insane, dude.

A lesson I learned is: you _need_ to read every module's source code before adding it to any project; the NPM ecosystem has so many "shits" out there. You cannot trust any npm module. Recently I tried to trust a module with more than 5k stars, but I found such an ugly bug in it that I felt my soul die, and I swear I heard the angels cry. That's not how open source is supposed to be.

These days, it seems that people don't care about being bug-free, as long as it halfway works.

38
noiv 3 days ago 1 reply      
The Python community has a proper response: https://pypi.python.org/pypi/left-pad/ /s
39
mr_justin 3 days ago 0 replies      
It's not that anyone has forgotten; it's that a lot of people never learned how in the first place. Every programmer community is riddled with these problems, but the NPM world seems to be the worst. The Ruby gem "american_date" annoys me to no end. It's just a highly specific implementation of Time#strptime. Gah.
40
Alex3917 3 days ago 0 replies      
So any developer could make this in five minutes, but for some reason they can't verify whether or not it works? That doesn't make sense.

In reality it could take an hour to get this working properly, but it does take only a couple minutes to verify that the solution here is correct. There are certainly good reasons for not adding extra dependencies to your project, but trading a known amount of time to check that an existing project does what you want for an unknown amount of time to redo it yourself is probably not a great bet.

41
imh 3 days ago 0 replies      
Unzipped, the source code for GNU coreutils is 30MB (zipped 4MB). This is a great example of a collection of single purpose functions you should never rewrite yourself. There's only one dependency if you want to use them because they're packaged together. With normal desktop code, 30MB doesn't really matter and you can link only what you need. Can you do that with the usual Javascript package managers/bundlers, or would you need to send the whole 30MB package to the client to use one function from it?
42
partycoder 3 days ago 0 replies      
Node can be a good idea. But people don't take JavaScript programming seriously. Most libraries objectively suck.

Google, the authors of V8 and the #1 subject-matter experts on it, have published a coding standard. Does anyone in the Node community use it? No. Everyone loves "standard", a lousy standard that lets everyone put a badge on their GitHub page while still having a code base full of vomit.

JSDoc is a great solution for documentation. You would expect major libraries to adopt it, or something similar. But again, no. Major libraries such as busboy don't use it. The documentation resembles a napkin.

Then everything else: input validation, error handling, consistency... etc. Take "request", for instance, one of the most widely used libraries. The state machine it implements is inconsistent: you can abort a request and get a timeout; you can abort a request without starting it and get an exception. Issues that will drive you insane while debugging.

Express, one of the most widely used web frameworks on Node.js. Do this in a route: setTimeout(function(){ throw new Error(); });. Great, now you have broken out of the error-handling context. Great job.
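A minimal sketch of that failure mode (assuming Express 4.x; the route path is made up):

  var express = require('express');
  var app = express();

  app.get('/boom', function (req, res) {
    // Express catches synchronous throws in a handler and forwards them
    // to error-handling middleware, but this throw happens on a later
    // tick, outside that try/catch, so it crashes the whole process.
    setTimeout(function () {
      throw new Error('escaped the error handler');
    }, 0);
  });

  app.listen(3000);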

Node libraries suck all across the board. It's the PHP of the 21st century. There are exceptions, like: lodash, bluebird, and others.

43
nikolay 3 days ago 1 reply      
We have. I spent some time optimizing [0] a String.repeat function over at Stackoverflow and I was surprised that many developers today don't know what they are doing, including core team members [1]. Specifically,

  function repeatString(str, len) {
    return Array.apply(null, { length: len + 1 }).join(str).slice(0, len)
  }
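To unpack the one-liner:

  // Joining an (n+1)-element array with `str` repeats it n times;
  // slice then trims the result to exactly `len` characters.
  repeatString('ab', 5); // => 'ababa'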
[0]: http://stackoverflow.com/questions/202605/repeat-string-java...

[1]: http://stackoverflow.com/questions/202605/repeat-string-java...

44
mtalantikite 3 days ago 0 replies      
One of the things I've enjoyed the most while programming in Go for the past couple years is the standard library and how much the Go community emphasizes keeping external dependencies to a minimum. For most projects I find myself using few if any external packages.

Now of course there are times you'll want to reach for a library, say for something like an HTTP router, and up until recently the dependency management side of Go has been lacking. But when a pattern or library arises that many find useful, the core team is open to pulling it into the standard library if a strong enough case is made, as with the context package (https://github.com/golang/go/issues/14660).

45
sebak 3 days ago 0 replies      
The main reason there is a fetish for these micropackages is a fetish for GitHub stars. The formula seems to be: overly generic + simple + JavaScript = lots of stars.

That being said, there is something to be said for using these micropackages. Left-padding a string is easy, but you might just have forgotten about that one edge case where in browser X and language Y you have to do things differently. It's not really the case here, but things that seem simple at first often turn out to be hard because of edge cases. One might hope those edge cases are already solved if you use a library.

46
88e282102ae2e5b 3 days ago 1 reply      
To me, this looks like a symptom not of bad programmers but of a terrible standard library.
47
namuol 3 days ago 0 replies      
Formula for a top HN article:

1. Make an observation about a popular thing.

2. Blindly extrapolate.

3. Make one or more broad, controversial statements.

4. (Optional) Nuance.

48
overgard 3 days ago 0 replies      
Here's the funny thing that gets forgotten: in a lot of commercial software, 3rd-party dependencies need to go through legal to get properly vetted and attributed and so on. This usually also requires an engineer (to answer things like whether it's dynamically linked, etc.).

As staid and corporate as it might sound initially, it's a very smart thing to do. One screw-up with licenses could be catastrophic. Are you all really checking that carefully?

I can't even imagine how any sort of proper legal checks could be done with a trillion micro libraries.

49
robodale 3 days ago 0 replies      
I'm content and happy not knowing what the fuck this is all about. Pad left? Are you kidding me? If your tower of Babel fell because of your reliance on some script kiddie's toy project, I am happy and content knowing you get what you deserve. Law of leaky abstractions, motherfucker.
50
fredbot 3 days ago 1 reply      
I call this kind of attitude the "Tea Party" of JavaScript development. The reason we currently have JavaScript tooling fatigue is exactly that Tea Party developers insist on writing everything themselves instead of trying to build a better abstraction. The lesson here isn't fewer dependencies: it's managing dependencies. NPM should not allow someone to arbitrarily remove modules that others may be depending on. It's like building a bridge and then deciding to remove it after a whole city has come to depend on it.
51
rycfan 3 days ago 2 replies      
If I engage in as much hyperbole as the author, where does "write it yourself" stop? If I'm working on a team of two, should we each write our own left-pad? How about a team of three? Four? Five? Fifty? At a certain point, it makes sense for it to be written once for the project. We spent 30 years in software engineering trying to figure out how to get code re-use, and now that it's common and widespread, we want to go back to NIH?
52
kerkeslager 3 days ago 0 replies      
A lot of the discussion here really isn't talking about the problem at hand.

From the perspective of Babelify users, a major bug was introduced into software they depended on. I don't know how much money in developer time was lost due to this but it would almost certainly be in the thousands of dollars.

And it could have been a lot worse. It could have been something more complicated than left-pad. The author could have introduced a vulnerability or outright malicious code, or been hacked and done the same, and millions of people would have downloaded it and run it.

Arguably, small modules are good if you control them. Maybe they are more composable, maybe they enable better testing, maybe they encourage code reuse. I am not going to argue for or against small modules.

But none of the positives of small modules matter if an unknown developer who you have no reason to trust can change or unpublish the module out from under you. It's irresponsible as developers to risk our employers' and clients' businesses in this way for a function we could write in five minutes.

53
spion 3 days ago 1 reply      
It's interesting how everyone used this as a chance to attack the small-modules approach. This approach definitely has downsides, but the problem caused by left-pad being unpublished wasn't one of them.

If jdalton declared a jihad on arrays tomorrow and decided to pull all array related functions from lodash, we would have the exact same problem.

If kriskowal decided that Q must mirror built in Promise and published a version that does this tomorrow, we would again have the exact same problem.

There is only one connection between this problem and the small-module approach. As the size of a module decreases, the number of dependencies increases, and so does the number of authors who produced your dependencies. With the number of authors increasing, the chance that some author decides to go rogue or stage a protest for some reason also significantly increases.

Therefore, it's irresponsible to use this approach with a package manager that allows an old, established module with many dependents to be unpublished so easily by the original author.

54
dschiptsov 3 days ago 0 replies      
I remember being downvoted to oblivion years ago for comparing the JavaScript madness with the Java EE "packer's" paradise.

The first essay of The Programmer's Stone is as relevant as it has ever been.

Actually, it is a common pattern: when some activity becomes popular due to a very low barrier to entry, it ends up in this kind of mess. It seems like nowadays everyone is either a programmer or a scientist and researcher.

This is the quality of their software and research.

There has been a reason why good schools taught principles (algorithms and data structures), not particulars (objects and classes). But since MIT and Berkeley dropped their Scheme-based courses in favor of "pragmatic" Python-based ones (thank god not JavaScript), we are heading for a disaster. The Java madness taught us nothing.

History is full of examples where an assault by mediocrity ruined whole branches of philosophy, arts and crafts. Instead we have fast food, mass media, social media and now this mass coding, which combines the worst of mass and social.

Just try to compare things like Smalltalk, Plan 9, R4RS Scheme or Symbolics' Zeta Lisp with this stuff.

55
meric 3 days ago 2 replies      
NPM modules can be used on browsers. On browsers, space is a premium.

Why would you want to install a 500kb dependency that has only one function you need, when you can install a 10kb dependency that has it?

Would you want each of your five 20kb dependencies to re-implement the same 5kb function, increasing the code you must send to the client by 20%, or would it be more optimal for each of those dependency to use the same 5kb function?

The author rants about the practices of developers from a different programming environment, without experience of it and without working out how things came to be. If he had made an effort to think from the perspective of Node.js developers, he'd have addressed the previous two points.

This is like going to a friend's house and complaining that everything is put in the wrong place. It would have been wise to immerse yourself in Node.js conventions and observe for a while before commenting.

EDIT: Reply to scrollaway:

I've also understated the problem.

Let's look at the problem in the current Node.js environment: it's not uncommon for a web app to have 20 dependencies, each of those to have 10, and each of those 10 to have 5. That's 20 times 10 times 5 = 1000 dependencies in total.

Let's say you were to remove a 10-line library function that's "standard library-like" and used by 15% of those dependencies, and have each dependency that uses it re-implement it.

15% times 1000 times 10 lines is 1500 lines of code.

So if you're going to troll a solid argument by nitpicking, do it properly and get the details right.

56
jeffdavis 3 days ago 0 replies      
Joe Armstrong had an interesting related comment here:

http://erlang.org/pipermail/erlang-questions/2011-May/058768...

Maybe the unit of modularity should be a single function and we can do away with modules?

57
omaranto 3 days ago 0 replies      
This culture of tiny one-function modules sounds like Joe Armstrong's proposal about the "key-value database of all functions".

http://erlang.org/pipermail/erlang-questions/2011-May/058768...

58
sebringj 3 days ago 2 replies      
This is a non-issue and it takes focus away from the real one: the security hole that NPM opens up when a namespace can be grabbed by anyone if the original developer pulls out.
59
rimunroe 3 days ago 0 replies      
I'm not really sure whether I should add my voice to the din, but I feel like this whole thing is more a problem with npm and what it allows vs. what it encourages (and the rather paltry standard libraries in Node & browsers) than a problem with developers feeling entitled not to have to write their own sorters and padding functions.

npm actively encourages structuring projects as many tiny individual modules and dealing with the resultant dependency trees and deduplication. Both of these things (along with the ease of publication) combine to encourage people to share their packages.

They make it incredibly easy to consume code from other people, but at the same time provide a similarly low-barrier mechanism to retroactively change published code. That combination seems like a far more deserving topic of criticism than the unending refrain of "developers these days are so lazy".

60
UK-AL 3 days ago 1 reply      
I find NPM packaging ridiculous. A while ago I used NPM on Windows, where the folder hierarchy became so deep it broke Windows file handling. I could not delete the modules folder; I had to install an npm package which allowed me to delete it. I think this is fixed in new versions by flattening the hierarchy, but still.
61
kevin_thibedeau 3 days ago 1 reply      
The insanity is that JavaScript doesn't have a standard string library. More a case of forgetting how to design a programming language than how to program.
62
cel1ne 3 days ago 0 replies      
I know the discussion revolves around the amount of dependencies, but I want to add a comment about semver, which has a part in this mess:

In my opinion it will not be done right in 80% of cases.

Every breaking change requires bumping the major version, but developers hesitate to go from 1.0.0 to 6.0.0 in a month.

The way out is staying in the 0.x range, therefore abandoning semver altogether.

A nice write-up about how packages are not following semver in Java-land:

http://avandeursen.com/2014/10/09/semantic-versioning-in-mav...

63
memracom 3 days ago 0 replies      
Agreed. Most development groups should be building a local collection of utilities that contains all of these snippets and, most importantly, some documentation of what they do and some unit tests to demonstrate that they are correct.
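For instance, one such snippet with its doc and tests might look like this (a sketch using Node's built-in assert):

  var assert = require('assert');

  /** True if `x` is an Array, even across frames (pre-ES5 fallback included). */
  function isArray(x) {
    return Array.isArray ? Array.isArray(x)
         : Object.prototype.toString.call(x) === '[object Array]';
  }

  assert.strictEqual(isArray([1, 2]), true);
  assert.strictEqual(isArray('nope'), false);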

No need to have global dependencies on small snippets that really should be in a core library anyway. C has libc, Java and C# have the creator's (Oracle or Microsoft) standard set of libraries, Python has the "batteries included" stuff in all the different distros. And so on. All of these snippets rightly belong elsewhere, not in packages.

And even if you did get them added to the right libraries, I guarantee you that you will not get rid of the need for a collection of small, and somewhat random, in-house functions, classes and libraries.

64
thrillgore 3 days ago 1 reply      
I started out really thinking Kik was wrong to sue this guy, but as with all things, the longer this goes on, the less sympathetic I grow.

Write your own goddamn utility classes, people. Or learn how to suspend package releases, include them in your projects, and smoke test your releases.

65
dc2 3 days ago 0 replies      
For a less sensational review of this situation, NPM published a post-mortem:

http://blog.npmjs.org/post/141577284765/kik-left-pad-and-npm

66
duncanawoods 3 days ago 0 replies      
The problem with shared hyper-modularisation is that it assumes the name of a function is unambiguous, with only one valid implementation. If that were true, it should be encouraged; but given it isn't, the practice will be crushed by ambiguity and unintended consequences.

My app might well have an is-positive-integer function, but it will embody a range of context-dependent choices about e.g. floating point, infinities, zero, null, "9", "9 ", "09", boxed numbers, string representations exceeding JS's max integer, etc.
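For instance, two equally defensible variants (both names and semantics are hypothetical):

  // Strict: only unboxed, finite, safe integers greater than zero.
  function isPositiveIntegerStrict(x) {
    return typeof x === 'number' && Number.isSafeInteger(x) && x > 0;
  }

  // Lenient: also accepts numeric strings such as "9" and boxed numbers.
  function isPositiveIntegerLenient(x) {
    var n = Number(x);
    return Number.isInteger(n) && n > 0;
  }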

67
joshstrange 3 days ago 0 replies      
I can't take this author seriously at all. One of his most egregious cases is the is-positive-integer library, which until today had around 30 downloads in the last month... No one was really using this. And furthermore, of course you can find bad/iffy code on NPM, for the same reason you can find bad/iffy code on GitHub: ANYONE can publish code. I could write a similar module for any other language's repository, publish it, then scream "LOOK! Python is stupid and Python devs are stupid."

I firmly believe that building on what has already been done allows for much safer code written at a quicker pace. Why should we constantly repeat ourselves? Also, by using npm modules we can abstract logic and prevent someone on the team from going in and modifying it for their own use. It is a documented, constant function that we can use knowing exactly what it does. Is it a better world where everyone just copies code out of others' repos and then has to include the licence/docs/tests along with it? It's much easier to just pull in the package, which contains everything and makes it trivial to see where a function came from.

People are blowing this whole thing way out of proportion just because it makes a good headline: "11 lines of code broke Node"... You can all try to shame people who build on what's come before and chant "Not invented here", but I'll opt to build on what is proven to work instead of rewriting everything. At the end of the day, that's what ships products.

68
j-diaz 3 days ago 1 reply      
Another explanation for the flourishing of these one-function modules may be that some people feel a kind of high from being able to say they have a published module out there. A sort of bragging rights, if you will.
69
HarrietJones 3 days ago 0 replies      
This isn't forgetting how to program; it's a deliberate choice about how library code should be organised. We can disagree about the choices made, but let's not assume other people aren't able to code as well as we think we can.

That being said, I think this is a perfect example of where a good concept (small, tightly scoped modules) is applied dogmatically at the cost of the codebase. It's the node.js equivalent of AbstractRequestFactoryFactoryFactory stuff you see in Java, and the Mock Messes you see in Ruby.

70
stevewilhelm 3 days ago 0 replies      
> What concerns me here is that so many packages took on a dependency for a simple left padding string function

Clearly to mitigate such a tightly coupled dependency, Left-Pad should be a micro-service. :-\

71
blainesch 3 days ago 0 replies      
I think we missed the point here. The fact that it's 11 lines is meaningless; this could have been Babel itself.
72
kf5jak 3 days ago 0 replies      
I wouldn't even think of looking for a package that does something as simple as the ones mentioned. If I need to pad a string, my first thought would be to create a new function, not look for a package...
73
plugnburn 2 days ago 0 replies      
I have to put a little clarity over my "so sad yet so true".

When the developers of such a serious library as React start to depend on a third-party one-function module made by some Dick-from-a-mountain (a Russian idiom for a random person who has done nothing significant but tries to show off by any means possible), the React developers are even more to blame than the Dick-from-a-mountain himself.

If you make any really popular piece of software, you absolutely must have a failover plan. No excuses for not having it.

But what's even sadder is that this issue has spawned a new wave of pseudo-elitist attacks on the entire JS dev community. Calm down, guys; things like this could have happened in any language that has a widely used centralized package system (Perl's CPAN, Python's pip, Ruby's gems etc.).

Let me repeat that again: languages don't make software bad, people do. Just don't let such Dicks-from-a-mountain rule over your own modules with elementary stuff like leftpad, and you'll be safe.

74
spotman 3 days ago 0 replies      
Every single dependency you import should make you nervous. You should look at it like hiring someone: you are letting go of some responsibility, and some control. The end result might be better than what you could do in-house, but it might not. Either way, you are betting that it's better, and giving up control.

Save dependencies for the things you cannot do in-house, like NaCl. Don't write that yourself.

But string padding, sorry. Any developer worth their salt would laugh at adding a dependency for this. It's irresponsible, and comes across as amateur hour.

This is a simple case of optimizing for responsibility. The inexperienced programmer does not know how to do this, because they have not spent enough time being responsible, and having to deal with the fallout of managing responsibility poorly.

An experienced programmer carefully manages responsibility. A simple function that is easy to understand, easy to test, and easy to reason about makes more sense to either pull all the way into your codebase or write yourself.

Years of doing this means that managing dependencies should never be "just slap it into your packaging system and start using the function". If you are on the line for making the wheels turn on a large-scale platform that is directly connected to monetary transactions of some nature, you will quickly find yourself preferring to remain responsible for everything you can possibly control. There is certainly enough that you can't control to keep you on your toes.

75
smokeyj 3 days ago 1 reply      
What's with these kids and their "calculators". Back in my day we used a slide rule and we liked it!

But seriously, this is stupid. Programming shouldn't be the goal. Just because you can write a function doesn't mean you should. Every line of code you write is overhead that must be tested and maintained. I guarantee that if the author hand-rolled code instead of using packages, he'd have a lot more bugs. But he wouldn't know that until he hit some mundane edge case in production.

77
gdulli 3 days ago 0 replies      
There's so much wrong here, it can't even all be seen at once. The part we can perceive is only a projection from a higher dimension into our three-dimensional space.
78
erikpukinskis 3 days ago 2 replies      
I think this is the fundamental thing people don't understand about NPM and JavaScript, and the web in general:

Nothing is included. And that's a feature. The web is not trying to be the kitchen sink. That's iOS. They provide high level APIs for everything. And as a result, the platform is architecturally only as vibrant as Apple can make it.

Now maybe you're happy with Apple, maybe you love the iOS APIs. But if you don't, you're stuck. There's not a rich bed of alternative view layers that you can draw from to build your own vision of how software should work.

Node and the web browser strive to be lowest common denominators. They provide just the very basics: a document format, a very simple programming language, and an http server. The rest is up to you.

That's pretty scary, and so the JavaScript world has dabbled in frameworks. In the end all-inclusive frameworks are antithetical to the spirit I'm talking about, so things trend towards small modules that do one thing pretty well. People go overboard sometimes. I would argue left-pad should just be a copy-pasted snippet rather than module. But that's not a sickness in the community, it's just developers feeling out how far to go.

If you like every application to look the same, and you don't mind being chained to enormous and restrictive standard libraries and frameworks, then you will hate JavaScript. If you like writing software from scratch, forming opinions about all of the different parts of the system, and allowing each application to be built out of different parts, out of the right parts, then you should give JavaScript a closer look.

79
tobltobs 3 days ago 0 replies      
Need more of this WTF stuff? Have a look at the Docker registry/repository/hub and those pearls of wisdom from the new DevOps guild.
80
gsmethells 3 days ago 0 replies      
The fact that anyone has to even think about code size is the real problem.

Yes, downloading a giant JS lib for one function is insane, hence the ton of tiny dependencies.

However, it is equally insane that the basic SDK functionality found in every other language has yet to be pre-implemented by the JS engine in the web browser itself. Some basic code ought to already be on the localhost the moment your app arrives.

81
markbnj 3 days ago 0 replies      
This is literally one of the funniest things I've heard about in months. Look for my new python library isDict, coming soon.
82
pklausler 2 days ago 0 replies      
Insight from this story: If it is important that a thing be done correctly, it should not be made so easy to do that it will end up being done by people who shouldn't be allowed to do it.

I may have just invented a rule for choosing programming language and systems there.

83
grillorafael 3 days ago 0 replies      
I agree that some packages might be too much but I don't think `left-pad` is one of them.

I wrote my own left-pad for a project I'm working on now, and I had to revisit it a few times for tiny problems and a lack of time to write tests. I would definitely have used the `left-pad` module if I had known it existed at the time.

84
AdamN 2 days ago 0 replies      
One should think of these requires as being like .h files, and the underlying code as something like a .c file: public definitions with potentially changing underlying code.

It's good to have small packages. Don't forget that the underlying ECMAScript is changing, so the implementations of these 'libraries' are (or will be) different over time from what they used to be. If somebody finds a faster way to implement a method, it will be adopted.

Finally, anybody who has used JS in the real world understands how many corner cases there are and how difficult it is to write durable methods (e.g. knowing whether an array is empty, which requires something like 4 different conditions).
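One reading of that check, as a sketch:

  // Not null/undefined, genuinely an array, and empty.
  function isEmptyArray(x) {
    return x != null &&
           Object.prototype.toString.call(x) === '[object Array]' &&
           x.length === 0;
  }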

85
fieryeagle 3 days ago 0 replies      
<rant>The problem here is that JS developers have baked in the notion of NPM as the alternative to Google + StackOverflow + their own thoughts. It's really a no-brainer (literally) to just slap in another package rather than bother thinking about what a piece of code does, its edge cases and pitfalls. Move fast and break things, right?

Sure, there was some argument about the Unix philosophy: a small module doing one thing and doing it very well. But did anyone bother considering the quality of most NPM packages? Quality is not reflected by a passing Travis CI build or extensive community testing and feedback. Not at all. Look at those packages on apt-get. They are modular and robust. They do what they were supposed to do.

Now take a long hard look at the state of NPM. What do we have? People clamoring for reusability and whatnot. Most of them don't even know what they're talking about, just reciting the latest hip statement from the Internet. Being mature about development means accountability for what you do, not pushing shit around that you don't even have knowledge of. As a self-proclaimed polyglot, I love JavaScript as a language but not the ecosystem. It's like watching a dog chasing its tail:

- Endless loops of discussion that help stroke the egos but not improve anything else.

- Crap for resume/repository padding, not for actual developers to use.

- Bandwagon mentality that just pushes the latest fad along, and the herd towards the cliff.

The notion that JS developers are kids playing grown-up has been reinforced by this NPM incident. If we want to slowly discard that notion, we need to be more mature developers. It's that simple. Here's what I think we could do:

- Have a clear idea of which dependencies you need. Browser, IDE, terminal etc. are dependencies. Basic type checking is not.

- Be better craftsmen. Roll and maintain your own toolboxes. Only share a working hammer, not a broken nail or a wood chip.

- Note that for each package you publish, thousands more hours will be spent on learning, adapting, using and reporting mistakes. Collectively, the current community wastes so much time finding the right things to use. Often we learn much more by playing with code, even posting on StackOverflow. That's hands-on; `npm i` is not.

- Own the code better. The idea that teams like the Babel and React devs, with all their brilliant developers, choose to put their eggs in a private corp's whims is just scary. You can't hope to build robust software while playing Jenga.

86
vu3rdd 3 days ago 0 replies      
I am posting the article "No Silver Bullet" again, in the wake of the npm fiasco. I think it is essential reading for every programmer, every year!

https://news.ycombinator.com/item?id=11350728

87
zalzal 3 days ago 0 replies      
Separate from the discussion of whether super small modules and hundreds or thousands of deps are a good idea, is the point of operational stability.

Putting on your devops hat, whatever your dependencies, from a reliability and reproducibility point of view, you should control your reliance on unexpected decisions of npm or third-party developers. A lot of the panic with npm issues comes from people blindly using the npm registry and then seeing breakages, with no safety net. I hate to say "I told you so" but this is an issue we worried about a lot when considering Node productionization last year: https://medium.com/@ojoshe/fast-reproducible-node-builds-c02...

88
grumblestumble 3 days ago 0 replies      
Couple of things:

* If you're in favor of the micro-module approach, you shouldn't be relying directly on NPM, and should have something like Sinopia in place. After all, external code isn't the only thing you're vendoring, right?

* Micro modules are fine - but your application code should depend on a privately published utils module whose entry point is a prebuilt distribution of all your external micro-modules exposed through a facade. Your utils module deps are all installed as dev dependencies to avoid the Fractal Nightmare.

* Yay, now you have your own 'standard library' which still manages to leverage the NPM philosophy of distributed code. And if some twit decides to throw a tantrum, it will only impact future builds of your custom std lib - and you'll know about it at build time.

89
dreta 3 days ago 0 replies      
After reading this, I don't think I'm capable of ever complaining about OOP again. Whenever you think you've seen it all, web developers manage to come up with something worse. The only thing more depressing than this is the comment section below the article.
90
nv-vn 3 days ago 1 reply      
Here's a proposal (that I'm sure others have come up with in the past): why not create one big, community-backed "batteries included"-type module that implements all the small, commonly used functions? This could combine all these ridiculously small libraries and greatly reduce the number of necessary dependencies for a package. Extending the standard library should be just that: standardized. If the entire community focused on one project like that, they could just as easily write the same code (but with smaller package.jsons, fewer require()s, and less time spent learning new libraries and searching for the right ones). In fact, it would be great if something like that could be packaged as a standard node module so you'd get the same sort of quality assurance as you get with official projects.
91
ZeWaren 2 days ago 0 replies      
Regardless of whether micro-modules are good or bad, I think that if you are the owner/manager of a project, you should be able, given its full flat list of dependencies, to explain why each one of them is useful for you.

Every project I've seen that uses npm always required 100s or 1000s of dependencies.

If building or running your project requires something and you can't explain why, I think there's a problem.

92
innocentoldguy 3 days ago 0 replies      
This problem speaks volumes about the myriad shortcomings of the JavaScript standard library, in my opinion.
93
lucb1e 3 days ago 0 replies      
Yeah, look at all these modern languages that do so much for you, like .empty() on an array -- have we forgotten how to do simple comparisons?! You could just take a second to consider the properties of an empty array, namely that it contains no items (.count() == 0).

My point being: if something is a completely reusable and basic feature, a dependency is totally justified. I remember a few years ago when I and all the devs I knew (which weren't many, I was maybe 17) had our own libraries to include in all the personal projects we made. They contained features we had to look up once and from then on just automated and imported, stuff like password hashing or input checking. This went out of style as single-programmer programs went out of style, but the little useful features are still there.

94
joantune 3 days ago 0 replies      
One can argue that modules are good, but depending blindly on newer versions like that was bad dependency management.

I say this because I strongly believe that reinventing the wheel is unnecessary and can bring more problems than it avoids.

There are many examples, and I could come up with a made up one, but here's a very real bug that I debugged from another programmer not so long ago:

So, he came up with a JNI binding for SPSS's C library, applied it correctly, and was haunted for months by an unsolvable bug. The problem? He had miswritten the final file copy, and sometimes some bytes were copied twice.

He tried to solve this problem for a long time (and eventually lived with it, because SPSS was still resilient to it).

Is this concrete example of a 'module' ridiculously short? Yes, but my logic still holds, IMO.

95
rsp1984 3 days ago 0 replies      
Dependencies are one side of the problem. Unavailability of binaries and dogmatic use of dynamic linkage are the other side.

When I installed a simple lines-of-code counting tool through Macports the other day I accidentally opened the door to dependency hell as gigabytes of not even remotely related stuff started to build [1].

Without a doubt something is going very wrong with Free Software and package managers. On the other hand, never look a gift horse in the mouth so I may not even be the right guy to complain here.

[1] http://pastebin.com/cAZgbaFN

96
gladimdim 3 days ago 0 replies      
On a DigitalOcean instance I cannot even use browserify + minify + babel6 because the npm process is killed by the host (it consumes > 512 MB of RAM). So I have to manually run browserify + babel, then minify. It still produces 500 KB of bundle.js :D
97
svs 3 days ago 0 replies      
The problem is not of small modules. The problem is lack of dependability. If the language patrons stand behind a set of modules and guarantee continuity and availability, it really doesn't matter what is in them and the world can continue regardless of how insane the module or the whims of any one author. This is not about the technical merits of having or not having a stdlib. The module in question could have been anything.

Making this about is-positive-integer misses the point that this is a social/political problem not a technical one. A language ecosystem must address concerns of business continuity as first class concerns.

98
morgante 3 days ago 0 replies      
This shows the beauty and success of npm in making it very easy and cheap to publish small modules.

It's not that we can't write a left pad function ourselves. It's that we might easily miss an edge case or make a mistake in doing so.

The author seems to be hung up on a preconceived idea of what a package "should" be without actually offering a compelling argument for why a single short function can't be a module.

Yes, every dependency you introduce is a liability. But so is every line of code you write. I'd much rather take the risk on a shared library which can be audited and battle tested by the entire community.

If a function is so easy to write, it's trivial to check out the module's code, verify it does the right thing, and then lock the dependency.

99
jasonbelmonti 1 day ago 0 replies      
Clearly this is insane - but what's the solution?

Part of the problem, in my opinion, is that packages are not subject to any review process. These dependency chains get out of hand because they are rarely top-of-mind.

100
raz32dust 2 days ago 0 replies      
The author brings up an excellent point, but I disagree with the solution. We should of course reuse existing, well-tested code if it is available, even for simple things like left-padding. The real issue here is that there is a module for left-pad alone. If it were something like a StringUtils module with a bunch of commonly used string functionality, it would have been great.

What is it about the node community that triggered this misunderstanding of package management and code reuse?

101
smitherfield 3 days ago 0 replies      
Without addressing the wisdom or lack thereof of including dependencies for small functions, perhaps the problem of disappearing/changing small dependencies could be solved with an option along the lines of

 npm install <small-dependency> <destination> --save-inline
Which would just copy the dependency verbatim to <destination>. Maybe have a "<dependency> is 100kb. Are you sure you wish to copy the entire source to <destination> instead of using a 'require' reference? y/n" prompt for the inevitable silly types who'd do it with Angular.
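(`--save-inline` is the commenter's proposed flag, not an existing npm option. A rough manual equivalent today, as a sketch: copy the file into your tree once and require it by relative path.)

  // done once, by hand or in a build script:
  //   cp node_modules/left-pad/index.js vendor/left-pad.js
  var leftPad = require('./vendor/left-pad'); // relative require: no registry involved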

102
Artoemius 3 days ago 0 replies      
It's not that we have forgotten how to program. It's that everybody and their dog is now a programmer.

Professional racers don't need an automatic transmission, but it's quite helpful for an unprofessional driver.

103
nostrademons 3 days ago 1 reply      
Somehow, someone always forgets that engineering is about trade-offs, and so every few years we get an indignant series of articles about how stupid and ignorant today's programmers are and how we should all go back to the same processes that they called stupid and ignorant 4-5 years ago.

Relying on reimplementation, copy-paste, npm shrinkwrap, or various other ways of importing third-party code into your repository results in the following advantages:

1. You know exactly what goes into your product, and can audit everything for security, performance, or coding standards.

2. You often end up importing less, as third-party modules may have functionality that you don't need but other clients do.

3. You can modify the resulting code to add showstopper functionality, even if upstream doesn't want to.

4. You aren't subject to the whims of someone removing your dependency from the Internet or replacing it with a version that does something you don't want.

Relying on lots of little libraries installed via package manager gives you the following advantages:

1. You can easily install & try out modules that other people have written, letting you test out new features on users more quickly.

2. You can share code with other modules that have the same dependencies, often reducing the overall size of your system. This is important when there's a cost (eg. download size) to your total bundle.

3. You have less code for your engineers to read & maintain.

4. You can easily track licensing & contact information for your dependencies.

5. You automatically get any new features released by your upstream dependencies.

6. You automatically get security updates and performance enhancements released by your upstream dependencies.

The last is nothing to scoff at: imagine if the headline, instead of 'left-pad breaks the Internet!', had been a security vulnerability in left-pad which literally broke the Internet. Imagine how hard that would be to fix if everyone had copy/pasted the code or re-implemented it. This is not an academic scenario either: remember "Nearly all binary searches and mergesorts are broken", published by the guy who wrote the broken binary search implementation in the Java standard libraries?

http://googleresearch.blogspot.com/2006/06/extra-extra-read-...

Always copying your dependencies into your source tree is not the answer to this, no more than always relying on npm modules was the answer to updating your dependencies. They both have pluses and minuses, and if you really want to be a good programmer, you need to weigh both of them. For my projects, I tend to use whatever libraries I need when building them out (via npm, if possible), and then periodically audit the dependencies to make sure I'm still using them and they wouldn't be better off incorporated directly into the project. I wish more products did this, but I don't control what other programmers do.

104
Animats 3 days ago 0 replies      
It beats the alternative - pulling in some huge package with lots of little functions, all of which end up in the output. At least you're not loading huge amounts of unreachable code into the running system.

In languages with linkers, the better linkers would discard all unreachable functions. Then came DLLs. Use one function in a DLL/.so, and the whole thing has to be loaded. Go and Rust are usually statically linked, reflecting the fact that pulling in big DLLs was usually a lose. You rarely have two different programs on the same machine using the same DLLs, except for some standard low-level ones such as the C library.

105
ocdtrekkie 3 days ago 2 replies      
In my current programming project, my goal has been to do as much in-app as possible. Does that mean I'm more likely to have bugs in my own code? Yes. But I've learned a ton doing it, and I know that my code doesn't have a giant monster of bloat hidden behind some random dependency somewhere. And yeah, that means when I wanted to handle email, I learned a heck of a lot about how programs handle email. Did it take more time? Yup. Education is time well spent.

I've got two dependencies besides the basic framework my project is built on: A scheduling engine, and a database interface. I eventually hope to factor out both.

106
thkim 2 days ago 0 replies      
The core issue here is that there is no standard package included with JavaScript. This happened because JavaScript did not have an authoritative implementation when it first began. The next ECMA standard should pack some batteries in to avoid this micro-module hell.
107
doctorstupid 3 days ago 0 replies      
Smart people created software that lowered the barriers of entry to making software. It was inevitable that not-so-smart people would eventually be writing software that others would build upon.
108
andrewingram 3 days ago 0 replies      
This is from a client-side web dev perspective:

I'm hoping that proper support for ES6 modules to enable tree-shaking bundle builds (in the short term), and HTTP2 with client support for modules (in the long term), will allow us to head towards a world where we converge around a handful of large utility libraries.

In theory, tiny dependencies were supposed to allow us to include only the code we actually need in our bundles. But the reality is that everyone uses different tiny dependencies for solving the same problem. So you end up with enormous bundles made up of different solutions to the same tiny problems.

109
smegel 3 days ago 1 reply      
> What concerns me here is that so many packages took on a dependency for a simple left padding string function, rather than taking 2 minutes to write such a basic function themselves.

Wait -- code reuse is bad now??

110
hughw 2 days ago 0 replies      
Of course I would never create a dependency on a small module like left-pad. I would simply copy the function wholesale into the body of my code!
111
ycmbntrthrwaway 3 days ago 3 replies      
What would happen to all those NPM projects if GitHub were destroyed? I don't think it will close anytime soon, but let's say a meteor shower hits a GitHub data center or something along those lines.
112
venomsnake 3 days ago 1 reply      

 module.exports = leftpad;

 function leftpad (str, len, ch) {
   str = String(str);
   var i = -1;
   if (!ch && ch !== 0) ch = ' ';
   len = len - str.length;
   while (++i < len) {
     str = ch + str;
   }
   return str;
 }
Isn't that the least efficient way to write that function? Prepending to a string has always been a very expensive operation.

Calculating the needed length, then using repeat and concatenating just two strings, would be faster.
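For illustration, a minimal sketch of that repeat-based variant (the name is illustrative; it assumes ES6's String.prototype.repeat and a single-character ch):

  function leftpadFast (str, len, ch) {
    str = String(str);
    if (!ch && ch !== 0) ch = ' ';
    var need = len - str.length;
    // one repeat() call plus one concatenation, instead of O(n) prepends
    return need > 0 ? String(ch).repeat(need) + str : str;
  }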

113
Shivetya 3 days ago 0 replies      
Hell, the Java programmers I work with seem to never use each other's simple functions and instead reinvent the wheel every single time.

As for the issue with such a short piece of code being reused by many: why score on number of lines? Whether it works and is a useful function matters more to me. I am not familiar with bundling in the usage the article covers, but we tend to bundle like functions together and the compiler drops the unused ones.

114
gitaarik 2 days ago 1 reply      
Why was it removed anyway? I agree that the ability to unpublish something is the real problem, but I wonder why the author actually unpublished it. I wonder if the author knew about all the projects that depend(ed) on it. Maybe he/she actually did it as an evil experiment, though a very interesting and eye-opening experiment. Does anyone know?
115
blablabla123 3 days ago 0 replies      
I guess this is the way one is supposed to use Node, preventing one from writing non-DRY code. (Golang takes the exact opposite approach, having its own drawbacks of course.) However, when using React, I kind of trust that the maintainers don't include packages that require 12-star projects, and if they do, that they fork this stuff themselves.

BTW, isn't that a Facebook project, so aren't they supposed to use a CI? ;P

116
StreamBright 3 days ago 0 replies      
So funny; just a few weeks back I had an argument with somebody about writing simple functions vs. importing libs when you need less than 5% of the functionality. I am more convinced than ever that it is better to have the least amount of external dependencies. Of course I would not want to rewrite a 2M+ LOC library with very complex code, but left-pad is not one of those use cases.
117
romualdr 3 days ago 0 replies      
The author missed the point about modularity in Javascript.

Small packages done right and well tested = maintainable, reusable, stable code.

The problem does NOT come from packages. The problem comes from un-publishing public packages and a centralized repository server.

I have a Java project with a lot of dependencies. Does that mean it's bad? No, but if the Maven repos close tomorrow, my project will not build either.

118
alistproducer2 3 days ago 0 replies      
On the one hand, I love how the JS community is constantly coming up with new [versions of old] things. Even though most of it is creative waste, it's still creative, and out of that churn we get some pretty awesome apps and tools.

On the other hand, there are a lot of bad practices disguised as "simple" and "efficient." Using NPM for one-line functions is a great example of this.

119
iamleppert 3 days ago 0 replies      
I couldn't agree more. I've been using the new ES6-style module syntax for a few days now because a co-worker forced me to, so that he would use my library.

I'm not convinced it's worth it compared to the simplicity of CommonJS and module.exports. You have to pull in Babel, which has over 40k files, to do all this.

Why are people destroying the beautiful simplicity that is javascript? Can those people please go back to java?

120
city41 3 days ago 0 replies      
I'd just like to point out that React does not have a dependency on left-pad. React has no dependencies at all[0]. devDependencies have no impact on third party consumers.

[0] https://github.com/facebook/react/blob/master/package.json

121
progrocks9 3 days ago 0 replies      
I just removed the tildes and carets from all my dependencies (in my package.json) and maybe that's the way to go. Seal a local version of your packages and don't update unless it's completely needed. But I'm still worried about the fragility of the package environment.
122
jmount 2 days ago 0 replies      
For more fun, my commentary on leftpad code: http://www.win-vector.com/blog/2016/03/more-on-npm-leftpad/
123
robodale 3 days ago 0 replies      
I'm content and happy not knowing what the fuck this is all about. Pad left? Are you kidding me? If your tower of Babel fell because of your reliance on some script kiddie's toy project, I am happy and content knowing you get what you deserve. Law of leaky abstractions, motherfucker. Spolsky... do you read it?
124
salehd 3 days ago 0 replies      
Well, it has always been like this. In the early 2000s most developers I knew would simply copy-paste code from all over the Internet.

Nowadays you use NPM search instead of Google search.

The fact is that lazy programmers are lazy. The methods change but the principle remains the same. In the '90s people typed in code from magazines and books.

125
serge2k 3 days ago 0 replies      
> if you cannot write a left-pad, is-positive-integer, or isArray function in 5 minutes flat (including the time you spend Googling), then you dont actually know how to code.

I'd probably get 2 of those wrong in some weird way, but I blame javascript. I mean without Google of course.

126
j-diaz 3 days ago 1 reply      
Maybe some people just want to claim they have a published module. Making them feel some sort of achievement or glory.
127
dustingetz 3 days ago 0 replies      
If the code is open source, what difference does it make if the code is in my module or someone else's?
128
plugnburn 2 days ago 0 replies      
By the way, here it is in ES6 syntax (works in modern Foxes and Chromes):

  leftpad = (str, len, pd = ' ') => Array(len > str.length ? 1 + len - str.length : 0).join(pd) + str

WTF are you talking about? Making this into a module?!

129
spullara 3 days ago 0 replies      
It seems to me that the JavaScript VMs should get together and start including a standard library. That would also have the benefit that those functions would be highly optimized. They could keep it small at first and focus on number and string manipulation.
130
jammycakes 3 days ago 1 reply      
Here's a quick rule of thumb. If it's the kind of function you would ask a candidate to write at the start of a job interview, you shouldn't be importing a separate module to do it.
131
return0 3 days ago 0 replies      
Packaging is the new programming.
132
dschiptsov 3 days ago 1 reply      
This is, finally, the "Fractal Of Bad Design" moment for JavaScript.
133
z3t4 3 days ago 0 replies      

  function foo(arr) {
    var str = "";
    var leftpad = require("leftpad");
    for (var i = 0; i < arr.length; i++)
      str += leftpad(arr[i]);
    return str;
  }

134
spajus 3 days ago 0 replies      
That's what you get when you let JavaScript into the server side.
135
ajuc 3 days ago 0 replies      
What's wrong with reusing small fragments of code?

The usual complaints about many dependencies are mostly void (it's not bloat if you only depend on single functions you actually use).

136
polynomial 3 days ago 0 replies      
What actually happens when you try to update a required package, but it is gone from upstream? Is there no way to keep the existing package you already have?
137
TickleSteve 3 days ago 0 replies      
"Small modules are easy to reason about"

No... "appropriately sized modules are easy to reason about"

In this case... "Appropriate" has gone out of the window!

138
digitalpacman 3 days ago 0 replies      
.... isn't this a problem with Node, and not developers? Wouldn't you say this is a symptom of a systemic problem with the framework: that it lacks common features that everyone needs?
139
IvanK_net 3 days ago 0 replies      
When I'm developing a large project over a long time, I sometimes find that I have reimplemented the same function in several places (with a few years between implementations).
140
plugnburn 3 days ago 0 replies      
So sad yet so true.

We haven't. React developers probably have.

141
sordina 3 days ago 0 replies      
Check out the work on Morte for a more reasoned approach to taking micro-modularization to its natural (or insane) conclusion.
142
po1nter 3 days ago 1 reply      
Everyone keeps mentioning the lack of a standard library for JavaScript as an excuse for this shit show. IMO this is just a futile attempt to mask incompetence.
143
benedictchen 3 days ago 0 replies      
I find it interesting that blaming code reuse is a valid thing, but blaming a lack of test coverage and CI build testing is not.

The problem is the lack of a test culture.

144
collinmanderson 2 days ago 0 replies      
I'm generally not a big fan of IPFS, but IPFS seems like the perfect solution to this problem.
145
niklabh 2 days ago 0 replies      
npm had the solution one year ago (namespacing): https://docs.npmjs.com/getting-started/scoped-packages

If only developers would embrace the "change".
146
losvedir 3 days ago 0 replies      
There's nuance to the discussion that both sides are missing. People argue forcefully whether these small modules are good or bad, but I'm not seeing much evidence that they understand the other side.

First: why small modules are bad. Lots of dependencies complicate your build, and you end up with the dreaded diamond dependency issue. Failures in a dependency become more likely to affect you. It gets you in the habit of using prebuilt modules even if they're not quite what you need and it would have been better to write the code yourself. With `npm` specifically, we've seen how its mutability can break the build, though that's about `npm` and not the idea necessarily.

I think most software developers' gut responses are that something is wrong and crazy in the npm ecosystem.

That said, there are benefits that this blog post and others aren't mentioning, related to the javascript situation specifically.

The first one is that JavaScript is a surprisingly difficult language to get right. Sure, the final solution is only a few lines, but which lines is the hard part. You have to navigate the minefield that is V8 in Node.js, V8 in Chrome, SpiderMonkey, Chakra, etc. I've had code work in Chrome before but blow up in IE, and it's really hard to track down and test.

The comments in the blog post are illustrative:

One line of code package:

 return toString.call(arr) == '[object Array]';
Crazy right? And my first stab probably wouldn't have been to implement it that way. Why not:

 (testvar.constructor === Array)
that a commenter suggested, which should be faster? Well another commenter said:

 The constructor comparison will fail if the array comes from a different context (window).
I've run into issues before with cross-browser compatibility stuff, and it's frustrating and hard to test. If there's some de facto standard package that implements it for you, hopefully the community can iron out edge cases.
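For reference, the cross-context-safe check sketched as a standalone function (a sketch only; since ES5 the built-in Array.isArray does this natively):

  function isArray (value) {
    // unlike value.constructor === Array, this works for arrays created in
    // another frame/context, because every realm stringifies arrays the same way
    return Object.prototype.toString.call(value) === '[object Array]';
  }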

The other thing that people don't bring up is that there's not much of a JS standard library, and in the browser context you have to send all your code to the front end.

So maybe you write these 11 lines yourself, and then another package writes these 11 lines, and another... it adds up. But if everyone uses the same package, the code only gets sent once and they all share it.

Lastly, people talk about how `sin` should be a part of a "trigonometry" package and not by itself. Well, again you're faced with sending a bunch of unnecessary code to the frontend. With webpack2 and tree shaking, or e.g. Google's Closure compiler, it can strip out dead code and so this issue will go away in the future, but we're not quite there yet. So package authors still bundle all these things separately.

So pros and cons.

147
qaq 3 days ago 0 replies      
It's up to you how to go about it; on the server side you can, for example, go with Express or Hapi (minimal external dependencies).
148
bliti 3 days ago 1 reply      
It suddenly feels like the 90s all over again.
149
Nr47 3 days ago 0 replies      
When you go out for food or order in, do you ask yourself "have I forgotten how to cook?"

Sure, in some ways NPM has packages that don't deserve the title of a package, but isn't the convenience of not having to reinvent every piece of code worth it?

150
lifeisstillgood 3 days ago 1 reply      
Surely there is a need to standardise on a set of well-maintained "batteries included" packages.
151
grav 3 days ago 0 replies      
Isn't the problem that there is a lack of a good standard library in Javascript?
152
lintiwen 3 days ago 2 replies      
I see two kinds of programmers here.

Good programmers understand really well the risks of making your system depend on something you don't control; they know that keeping system complexity low is like a good investment that makes your life easier later (low maintenance costs).

Bad programmers stack up technical debt, such as including unnecessary dependencies, until the system no longer works.

153
z3t4 3 days ago 1 reply      
This is why one guy can now compete with, say, Google or Microsoft: because that guy uses code written and managed by more engineers than both Google and Microsoft have combined. Instead of paying hundreds of dollars to said companies, you can just npm install "what you need".
154
sorpaas 3 days ago 0 replies      
IMHO, this is all due to lack of a standard library in the Javascript world.
155
MattHeard 3 days ago 0 replies      
> Even if correct, is it the most optimal solution possible?

"most optimal"?

156
forgotmypassw 3 days ago 0 replies      
JavaScript was a mistake.
157
memracom 3 days ago 0 replies      
Seems that we need a tool to crawl all the repos that a developer owns and report lines of code in each package that they wrote. If there are lots of small packages, then this is not the kind of person you want to hire, except maybe to do PR.

Real developers do not do stuff like this.

158
zongitsrinzler 3 days ago 0 replies      
It's not about programming, it's about fast progression.
159
justin_vanw 3 days ago 0 replies      
Forgotten? Most people who develop on Node.js never knew...
160
vdnkh 3 days ago 0 replies      
I don't have an issue with modules or the Unix philosophy, I have an issue with using NPM for all these tiny modules. Hint: you can make your own modules, store them within the project, and add them to your package.json to be required anywhere.
161
danielrhodes 3 days ago 0 replies      
It does seem pretty insane, but how many of these are polyfills?
162
democracy 3 days ago 0 replies      
Re-usability is a good concept but can be overused easily.
163
anonymousguy 3 days ago 0 replies      
This is acceptable because web developers expect a certain level of coddling. Many developers are quick to defend this insanity because they simply cannot see their own level of entitlement.
164
ankurdhama 3 days ago 0 replies      
What's next? A hello-world package, and a Node tutorial where the hello-world program includes this package as a dependency and calls the exported function?
165
shitgoose 2 days ago 0 replies      
here is one of the comments to original post:

"Immutability at the centralized authority level and more decentralization of package distribution is the solution, not 'write more functions yourself'."

What the fuck does that mean?? They just don't give up, do they... Fucking retards.

166
tonetheman 3 days ago 0 replies      
All the comments here have answered the question quite well. Yes we have forgotten how to program and we want to argue about it.
167
frozenport 3 days ago 1 reply      
I don't see anybody criticizing 'std::min()'. Perhaps what we really need is a 'std' for JS?
168
fiatjaf 3 days ago 0 replies      
No saner alternative presented.
169
facepalm 3 days ago 1 reply      
What I don't get about left-pad: shouldn't they have used Array.join for better performance?
170
dang 2 days ago 1 reply      
This comment as well as https://news.ycombinator.com/item?id=11354704 break the HN guidelines. Please don't post uncivil comments regardless of how wrong or ignorant someone seems.

We detached this subthread from https://news.ycombinator.com/item?id=11351657 and marked it off-topic.

171
CodeOtter 3 days ago 0 replies      
npm install goldmansachs
172
c3t0 3 days ago 0 replies      
Functions Are Not Packages

Cracked me up :D

173
irascible 3 days ago 0 replies      
Ahh job security :D
174
irascible 3 days ago 0 replies      
Ahhh job security :D
175
tn13 3 days ago 0 replies      
I don't think it is bad at all. There are a lot of projects where saving even a minute matters.
176
tiglionabbit 3 days ago 1 reply      
This is just more evidence that the unit of code is a function, not a module.

Eventually, given a pure enough language, every "package" could contain only a single function, and every function in a project could be published as an independently reusable unit.

177
jsprogrammer 3 days ago 0 replies      
This post is too dismissive and confuses some topics.

"Packages", "modules", "functions", and other words can mean different things within different contexts. Well-known and tested functions are useful. Putting them in a "package" is just a consequence of how node/npm work. There should certainly be a much, much better implementation of code sharing, but sharing and using well-known, singular functions should be exactly what we are going for.

2
I've Just Liberated My Modules medium.com
1563 points by chejazi  4 days ago   791 comments top 117
1
callmevlad 4 days ago 17 replies      
The fact that this is possible with NPM seems really dangerous. The author unpublished (erm, "liberated") over 250 NPM modules, making those global names (e.g. "map", "alert", "iframe", "subscription", etc) available for anyone to register and replace with any code they wish.

Since these libs are now baked into various package.json configuration files (some with tens of thousands of installs per month, "left-pad" with 2.5M/month), a malicious actor could publish a new patch version bump (for every major and minor version combination) of these libs and ship whatever they want to future npm builds. Because most package.json configs use the "^1.0.1" caret convention (and npm --save defaults to this mode), the vast majority of future installs could grab the malicious version.
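To make the range semantics concrete, a small sketch using the standalone `semver` package (the same range grammar npm applies; the version numbers are illustrative):

  var semver = require('semver');
  semver.satisfies('1.0.2', '^1.0.1'); // true  -- a freshly published patch is pulled in
  semver.satisfies('1.9.0', '^1.0.1'); // true  -- so is any new minor release
  semver.satisfies('2.0.0', '^1.0.1'); // false -- caret stops at the next major
  semver.satisfies('1.0.2', '1.0.1');  // false -- an exact pin ignores new publishes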

@seldo Is there a plan to address this? If I'm understanding this right, it seems pretty scary :|

[1] https://medium.com/@azerbike/i-ve-just-liberated-my-modules-...

2
nordsieck 4 days ago 6 replies      
One interesting thing to me is that it is pretty clear that the kik lawyers pretty dramatically over-enforced their trademark.

For those who don't know, the purpose of trademarks is to prevent customer confusion; essentially we don't want people to be able to sell cheap knock-offs of someone else's thing without the general public being able to easily distinguish between them. In practical terms, trademarks are "scoped" by their "goods and services" declarations.

For example, Apple the device manufacturer[1] and Apple the record label[2] could both be trademarked because they had non-overlapping goods and services declarations... until iTunes started selling music[3].

If you look at kik's trademark application[4], you can clearly see that the trademark is limited to chat/media consumer applications, making this a pretty obvious over-enforcement.

[1] http://apple.com

[2] http://applerecords.com

[3] https://en.wikipedia.org/wiki/Apple_Corps_v_Apple_Computer

[4] https://trademarks.justia.com/858/93/kik-85893307.html

3
larkinrichards 4 days ago 12 replies      
I applaud this action, and while I'd like to point the finger at NPM, there's no other real way to fix the historical package versions that depend on this.

It is worth pointing to the silly state of NPM packages: Who decided that an external dependency was necessary for a module that is 17 lines of code?

 module.exports = leftpad;

 function leftpad (str, len, ch) {
   str = String(str);
   var i = -1;
   if (!ch && ch !== 0) ch = ' ';
   len = len - str.length;
   while (++i < len) {
     str = ch + str;
   }
   return str;
 }
Developers: fewer dependencies are better, especially when they're this simple!

You know what's also awesome? The caret semver specifier[1]. You could install a new, broken version of a dependency doing that-- especially when other packages using peerDependencies rely on specific versions and you've used a caret semver specifier.

[1] https://github.com/lydell/line-numbers/pull/3/files

4
smsm42 4 days ago 4 replies      
Reading some of the comments reminds me of an old tale about a young man who every morning on his way to work passed by a beggar and gave him a coin (that was back when coins actually had some value). One morning, though, the beggar notices the coin is smaller than usual, and he asks:

- Why did you give me a different coin today?

and the young man says:

- I got married and now I'm starting a family. I need more money, so I cannot give you as much anymore.

And the beggar cries out:

- People, look at this putz, he got married, and now I have to feed his family?!

I think the fact that we get so many awesome things for free is unbelievably lucky. I mean, not only do we work in one of the most generously paid jobs, we also get a lot of the tools we need for free! How cool is that? But some people think that if they are given those awesome things for free, they must deserve it, and whoever gives them owes them forever. That's not the case. Yes, it is annoying to find that somebody who contributed before does not want to do it anymore. It is mildly inconvenient and it can be improved. But let's not lose perspective: the author does not owe us or npm continued support. It is sad he does not want to do it anymore, but that's what open source is about; people can take it over, and it happened within a single day. Such resilience is something to be proud of, not something to complain about.

5
camwest 4 days ago 4 replies      
FYI I'm the one who republished left-pad after it was unpublished.

I think of it as similar to letting a domain name expire. The original author removed the code, and I forked it and published a new version under the same package name.

The main issue was that there were so many hard-coded dependencies on 0.0.3, so I asked npm support if they could allow me to re-publish that version, and they complied since I was now the maintainer of the package.

6
praxulus 4 days ago 2 replies      
This is a surprisingly effective protest action. It got the attention of an incredible number of people very quickly, and the damage is mostly limited to wasting the time of a bunch of build cops.

I don't have much of an opinion on his actual reasons for protesting, but I do think it was a pretty cool protest.

7
felixrieseberg 4 days ago 6 replies      
Azer has contributed awesome modules to the community, but such a move _obviously_ messes with a bunch of people whose trust was previously placed not in npm, but in Azer. Npm works fine. There might be issues with it, but the reason builds are failing right now is that he decided to unpublish all of his modules - in a move that feels very kneejerky, despite him claiming it's the opposite.

If this had been actually in the interest of the community (because he thinks that npm isn't acting in our interest), he'd give people a fair warning. I could have lived with a "Hey, this was my experience, it sucked, I'll unpublish things in 30 days. Please update your dependencies." We know how to deprecate things gracefully.

8
jimjimjim 4 days ago 2 replies      
I am obviously an old, fossilized, ancient developer. This situation seems like insanity.

Not the unpublishing part; the part where the thing that you require to sell/publish/do your job isn't under your control or isn't stored within your organization.

Am I wrong in thinking that you should just have a local copy of all of your source code dependencies? Would it really take that much longer?

9
chvid 4 days ago 3 replies      
In case anyone is wondering what was in the now broken dependency - here is the source code in full:

 module.exports = leftpad;

 function leftpad (str, len, ch) {
   str = String(str);
   var i = -1;
   if (!ch && ch !== 0) ch = ' ';
   len = len - str.length;
   while (++i < len) {
     str = ch + str;
   }
   return str;
 }
https://github.com/azer/left-pad/blob/master/index.js

10
cammsaul 4 days ago 3 replies      
Update: NPM takes "unprecedented action [...] given the severity and widespread nature of the breakage" and un-un-publishes left-pad

https://twitter.com/seldo/status/712414400808755200

11
jerf 4 days ago 5 replies      
This is why you should vendor it. What is "it"? All of it, whatever it may be. You should be able to build your systems without an internet connection to the outside world.

I say this with no reference to particulars of your language or runtime or environment or anything else. This is merely a specific example of something that could happen to a lot of people, in a lot of languages. It's just a basic rule of professional software development.

12
drinchev 4 days ago 1 reply      
Sadly there is a user, @nj48, who has already published empty modules and taken the names [1].

Is this a joke or something coordinated with the community?

[1] https://www.npmjs.com/~nj48

EDIT : The hijacked modules look suspicious. http://www.drinchev.com/blog/alert-npm-modules-hijacked/

13
tdicola 4 days ago 1 reply      
I've never felt good any time I have to use Node modules and see this gigantic stream of dependencies come flying down. It's even more painful when you need to assemble license information for your software and crawl through _every single dependency and all of their dependencies_ to find their licenses, etc., to check they are OK to use in your software. Just look at the View License info in the Atom text editor some time for a truly insane wall of text (over 12,000 lines!!). IMHO the entire Node/NPM system is seriously flawed, with so many tiny dependencies for trivial stuff.
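For illustration, a minimal sketch of such a license crawl (a hypothetical helper, not the tool Atom uses), walking nested node_modules directories and reading each package.json:

  var fs = require('fs');
  var path = require('path');

  function collectLicenses (dir, seen) {
    seen = seen || {};
    var modules = path.join(dir, 'node_modules');
    if (!fs.existsSync(modules)) return seen;
    fs.readdirSync(modules).forEach(function (name) {
      var pkgFile = path.join(modules, name, 'package.json');
      if (!fs.existsSync(pkgFile)) return;
      var pkg = JSON.parse(fs.readFileSync(pkgFile, 'utf8'));
      seen[pkg.name + '@' + pkg.version] = pkg.license || 'UNKNOWN';
      collectLicenses(path.join(modules, name), seen); // recurse into nested deps
    });
    return seen;
  }

  console.log(collectLicenses(process.cwd()));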
14
jwiley 4 days ago 2 replies      
I think that unfortunately this was a foregone conclusion. Copyright law, like most other laws in our society, favors corporate interests.

I support his stand on principle, however. Azer is a talented developer with an impressive life story, and he has certainly contributed more to society than a social network well known for invading children's privacy.

https://medium.com/@azerbike/i-owe-my-career-to-an-iraqi-imm...

https://en.wikipedia.org/wiki/Kik_Messenger#Controversies

15
x0ner 4 days ago 3 replies      
Not sure I follow this completely...

You start a project with the same name as a company which owns the registered brand, and you are surprised when some 3rd party complies with legal suggestions to make an adjustment?

Seems kind of silly to expect that NPM would want to fight for your project name when you didn't seem to do your own due diligence when picking the name. Also, it's a bit backwards to go and remove all your modules as well, thereby breaking builds.

16
joeandaverde 4 days ago 1 reply      
Here's a highly downloaded 11 line module with lots of dependents.

https://www.npmjs.com/package/escape-string-regexp

I stopped searching at 1.

I've certainly benefitted from the vast ecosystem of npm. I greatly appreciate the work that goes into making this ecosystem what it is. However, I think we need to be a bit more critical when it comes to acquiring dependencies. Especially authors of very prominent packages.

Fun fact: one of my projects (a web api) depends on over 700 unique name/version modules.

Fellow programmers. This is embarrassing.

17
nchelluri 4 days ago 0 replies      
Wow, very interesting post for me. Earlier today at work we ran into an issue where `npm install` was failing because the `shuffle-array` module wasn't found. Investigation showed that the cause was that it was unpublished today. We found that it was a required dependency of the `match` module, and this was in our dependency list in `package.json`.

We investigated and found out that it had been erroneously committed; it's actually a memory game and has absolutely no place in our webservice project. :) (Mistakes happen; dependency audits can be worthwhile!)

Now, some hours later, I found your post on Hacker News and was really shocked to see: hey, this is exactly why it was unpublished. Quite a chain of events. I never thought I'd figure out why the modules were unpublished, but now I get it! Thanks for the explanation.

[crossposted from the medium article]

18
aioprisan 4 days ago 2 replies      
If NPM wants to stay relevant and a serious contender, they need clearer policies for IP issues. In this case, the companies weren't even in the same space. Republishing the package of someone who has chosen to unpublish and leave your platform is akin to Facebook resurrecting a Facebook profile because the account had a lot of friends and the ripple effects through social circles would hurt feed quality for other users, so they chose to reactivate the account AGAINST the author's wishes. WHAT?!? We need an open source NPM alternative, yesterday.
19
zwetan 4 days ago 1 reply      
funny thing, but assuming that kik is related to kik.com

if you look here http://dev.kik.com/build/, they promote their own server eg. "Our open source web server Zerver can help serve your cache manifest properly, as well as doing other speed boosting stuff like automatic style inlining."

this Zerver is on github and build with npm

https://github.com/jairajs89/zerver/blob/master/package.json

I did not run the build but I'm pretty sure that now their server is not building anymore as it depends on babel

call that irony ;) ?

20
overgard 4 days ago 6 replies      
I think it's amusing to see this from the perspective of the company. Some guy uses your trademark without your permission so you tell him to knock it off. He refuses, so you go around him, and so he protests... by fucking over all of his users. In a dispute that doesn't involve them. And people are celebrating this.
21
adamkittelson 4 days ago 0 replies      
About a year ago I tried to unpublish a version of a library I'd pushed to Elixir's hex.pm package manager but the API rejected it. Turns out they only allow you to revert publishing for an hour after you push.

It was a little inconvenient at the time but in light of this I can very clearly see the wisdom of that decision.

22
KajMagnus 3 days ago 0 replies      
Does this mean that I can no longer safely run `npm update`, or ask anyone to download my Node.js project and tell them to run `npm install`? Because the npm repo has in effect been compromised and is unsafe to use, until further notice?

That's what I'm assuming right now, anyway. I'm not going to upgrade any Node.js dependencies, run `npm update`, or tell anyone to run `npm install`.

If you look at the list of liberated libraries ( https://gist.githubusercontent.com/azer/db27417ee84b5f34a6ea... ) it's "impossible" for me to know which ones of all these libs I use indirectly via some other libraries, and ...

...Elsewhere in this discussion: (https://news.ycombinator.com/item?id=11343297)

> > Is there a plan to address this?

> Too late. Every package name on the list has already been claimed by a randomer with unknown intentions.

Sounds dangerous to me. ... And I wish there was some way to get notified, when this issue has been fixed somehow.

23
chejazi 4 days ago 0 replies      
This broke a number of builds that depended on the (previously) published modules, here's a GitHub issue showcasing that: https://github.com/azer/left-pad/issues/4
24
al2o3cr 4 days ago 1 reply      
"eventually create a truly free alternative for NPM."

Which will either comply with copyright laws, or get blasted off the 'netz and break everyone's build...

The rules are messed up, but dramatic gestures and abstract hopes that "free software will save us" aren't going to fix them.

25
dham 4 days ago 0 replies      
What if Kik uses Node and they inadvertently broke their own builds by enforcing their trademark? 0_0
26
dham 4 days ago 0 replies      
Small modules, they say. A small standard lib is OK, they say. Just going to point out that in a lot of other languages, string padding is just built into the standard lib.
27
mschuster91 4 days ago 3 replies      
Brouhaha! This is why you should not put node_modules into .gitignore (same for PHP's composer.lock and vendor/ folder).

To be honest, I have been waiting for something like this to happen, so that people finally wake up and realize how deeply and truly compromised the JS ecosystem really is. 11 SLOC not available any more, and all over the internet builds are breaking?!

And please, why isn't essential stuff like this in the JS standard string library?

28
vulpes 4 days ago 1 reply      
Here's [1] a list of all modules that were liberated. Some serious land-grab opportunities there

[1]: https://gist.github.com/azer/db27417ee84b5f34a6ea

29
jonathankoren 4 days ago 1 reply      
My god! It's full of attack vectors!

https://github.com/substack/provinces/issues/20
30
tobltobs 4 days ago 0 replies      
Even if those trademarks would cover a tool like kik, it is completely brain-dead to grant trademarks for three-letter words and enforce them on software package names.

What are we supposed to type for package names in 10 years? 'abshwjais_kik', or will it be hipster to use unicode like in ''?

31
kikcomms 3 days ago 1 reply      
Hi everyone, please read this explanation from Kik's head of messenger about how this played out: https://medium.com/@mproberts/a-discussion-about-the-breakin...

We're sorry for our part in creating the impression that this was anything more than a polite request to use the Kik package name for an upcoming open source project.

32
fiatjaf 4 days ago 3 replies      
Why isn't GitHub the source of all node packages? npm supports it very nicely.

I mean: why don't people write `npm install user/repo --save` instead of `npm install package --save` every time already?

33
cyphar 4 days ago 2 replies      
Seems odd that a patent lawyer is being involved in a trademark dispute. Also, given the fact that he didn't make any money off it, I severely doubt that it would ever go to court.
34
seldo 4 days ago 2 replies      
The package author decided to unpublish the package. A new author has now stepped in and re-published the package (yay open source!) and deps are fixed.
35
lerpa 4 days ago 0 replies      
Good for him, if that platform isn't working go somewhere else.

The major problem here is relying on a central authority like NPM in the first place.

36
lukegt 4 days ago 1 reply      
I like how npm even encourages you to create packages in place of the "liberated" ones when you try to visit their now missing pages:

https://www.npmjs.com/package/abril-fatface

 abril-fatface will be yours. Oh yes, abril-fatface will be yours.

  mkdir abril-fatface
  cd abril-fatface
  npm init
  # work your magic
  npm publish

37
larrik 3 days ago 0 replies      
I don't think Kik is the bad guy here. This npm module was rather new (<6 months?), while Kik Messenger has been around for years, and is VERY popular with the young crowd. They are both software. It would be like the author naming his module 'imessage' or 'spotify', except this is with a company that isn't as visible to the HN crowd.

I personally think him not knowing Kik existed was odd, and not googling the name at all even odder. Even still, I think Kik's response and npm's response were perfectly valid.

Looking at the voting of the comments here makes me sad for what has become of the HN community.

38
sklivvz1971 4 days ago 0 replies      
The problem here is that NPM is a private company in an institutional role.

You will always have some very common dependencies which, if brought down or altered, could compromise a lot of projects.

The problem is that npm has to act like an institution, not like a private company.

39
julie1 4 days ago 1 reply      
The number of coders complaining about an author exercising the most basic intellectual property rights is too high.

1) All coders should understand authors' rights, be the code free or closed;

2) There is no excuse for someone whose value is based on creativity to ignore how IP works (the good and the bad parts), because our comfortable incomes come from the protection these rights give to our work;

3) If your code is broken over 11 SLOC, maybe you depend too much on others' work and have no value yourselves.

Benevolent persons sharing their code in their free time owe you nothing.

Repay authors whose code you use.

Buy closed source software, and at least respect the free software authors. It costs you nothing already.

40
tobltobs 4 days ago 0 replies      
My congrats and respect for his decision. Through actions like this companies might understand that the current trademark (and patent) law is only benefiting lawyers.

And wouldn't it be wonderful if, as a result of this, the builds for Kik's Pedo API were broken?

41
nikolay 4 days ago 1 reply      
IED [0] + IPFS [1] + GPG looks like a dream come true.

Note: IED could be much faster than NPM installer due to parallel downloads, which would work great with the slower IPFS.

[0]: http://gugel.io/ied/

[1]: https://ipfs.io/

42
_it_me 4 days ago 0 replies      
Lol that satire flipped bit site even caught it http://www.theflippedbit.io/2016/03/23/developer-outraged-as...
43
jlarocco 4 days ago 1 reply      
Assuming it's kik.com that complained, the complaint to take down the kik NPM module seems legitimate. They've clearly been around a lot longer, are known by more people, and are in an overlapping market.

It seems like a lot of people would expect a kik module in NPM to be related to the company in some way, and it wasn't.

44
swang 4 days ago 3 replies      
Was that lawyer overreaching? I don't know. But for this guy to expect npm to use their resources to defend him (a fight they might even lose!) and to get mad at them is... a bit presumptuous? GitHub isn't open source either, so is he going to get mad when the lawyers send them an email about kik?
45
datashovel 4 days ago 1 reply      
Open source community needs to aggregate a list of lawyers who will consult on these sorts of things (related to the community at large) pro-bono. This way all parties on the open source side can feel a little less pushed around and bullied and a little more protected.

The best part would be to learn that the claim was not valid in the first place. At the very least, having representation would provide for some wiggle room where you can have days if not weeks to resolve the issue, instead of feeling you have to take immediate action.

46
hellbanner 4 days ago 1 reply      
Can we talk about how trademarks own namespaces? If I have a little "kik" soccer tournament that no one knows about, then it's fine. As soon as the namespace collides with the HUGE, vastly connected internet, it's a "problem".

We're going to run out of proper nouns, folks.

48
st3v3r 4 days ago 1 reply      
Wow, first they steal a package from the original author, then they do this. Why will anyone want to publish to NPM after this again?
49
fold_left 4 days ago 0 replies      
I've been warning of the potential for issues like this for quite a while and would be really grateful for people's feedback on this approach to insulating your projects from them: https://github.com/JamieMason/shrinkpack.

It's not completely there yet, but I think there's something worth exploring further in this idea.
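For anyone who wants to try it, the workflow is roughly this (a minimal sketch; the exact CLI is whatever shrinkpack's README says, and node_shrinkwrap/ is its output directory as far as I know):

 npm shrinkwrap                # lock the exact dependency tree into npm-shrinkwrap.json
 npm install -g shrinkpack
 shrinkpack                    # download each locked tarball into ./node_shrinkwrap
                               # and point npm-shrinkwrap.json at the local files
 git add npm-shrinkwrap.json node_shrinkwrap/

After that, npm install resolves from tarballs you've committed, so an upstream unpublish can't break your build.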

50
diffraction 4 days ago 0 replies      
Kik has lawyers all over the world... because it is the platform of choice for pedophiles and sexual predators. There are many billable hours spent responding to DOJ/state's attorney subpoenas. (http://www.trentonian.com/general-news/20140728/pedophile-on...)(http://woodtv.com/2015/02/02/sexual-predator-warns-parents-a...)
51
sjclemmy 4 days ago 0 replies      
I am a heavily invested user of JavaScript and the surrounding ecosystem, and the security aspects of the npm package system have been in the back of my mind for a while. As I don't consider myself an 'expert' in all things npm and package management, I've deferred to the general consensus, which didn't seem to mind too much about the security problems npm exhibits. (This reminds me of the sub-prime crisis.)

I think an event like this is a really positive thing, as it promotes discussion about something that is exceedingly important. All it takes to exploit this vulnerability is a bit of time and effort; it looks really easy to inject malicious code into any number of 'de-published' packages. I hope that some kind of namespacing and/or locking of npm packages results from this, and that the JavaScript ecosystem continues to mature and develop in the right direction. Npm Inc have an opportunity here to do the right thing. If they don't, then there's going to be a mutiny and a 'better' alternative will supersede npm. Bower anyone? ;)

52
zachrose 4 days ago 2 replies      
So what keeps Kik from going after Github?

https://github.com/starters/kik

53
cdubzzz 4 days ago 1 reply      
Does this series of tweets [0] seem rather odd to anyone? He seems to be calling people soulless and pondering his own "Power and Responsibility".

[0] https://twitter.com/izs/status/712510512974716931

54
sbuttgereit 4 days ago 2 replies      
I've been reading the comments regarding 1) the practical effect of breaking builds and 2) the security issues of how package names can be reused on npm once they are unpublished (versioning aside for a moment).

I wonder what other, similar package distribution platforms are vulnerable to this sort of thing. I'm not speaking from knowledge of the procedures of any of those I'm about to mention, but I have depended, and still depend, on some of them. Thinking about this issue and some of those other tools that pull long strings of dependent packages does give me pause. Especially the replacement of some dependency with less-than-friendly code... breakage can be managed, but silent invaders...

Do Perl & CPAN, Rust & crates.io, or Ruby & RubyGems.org suffer these same issues and it just hasn't been a problem yet? Do they have means of avoiding this? Again, I haven't studied the question... but I think I may :-)

55
wtbob 4 days ago 1 reply      
The problem here is that there is a single petname space (pet namespace? pet name space?) administered by one organisation but used by everyone.

With a different system, the author could have a key $foo, and call his package ($foo kik), and that wouldn't interfere with (us-trademark-office kik).

56
taumeson 4 days ago 1 reply      
Wow, this is an amazing outcome here.

Why is "unpublishing" something that can happen in npm? What's the point? I can see the downside, what's the upside?

57
octref 4 days ago 3 replies      
Why don't people just use lodash?

https://lodash.com/docs#padStart

It's well-tested, well-maintained, and performant, has good documentation, and offers custom builds to leave out functions you don't need.
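For comparison, replacing left-pad with lodash looks like this (the examples follow the padStart docs linked above):

 var _ = require('lodash');

 _.padStart('5', 3, '0');    // => '005'  (zero-padding, the common left-pad use case)
 _.padStart('abc', 6);       // => '   abc'
 _.padStart('abc', 6, '_-'); // => '_-_abc'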

58
erikb 4 days ago 0 replies      
I don't see the issue here. If the name is taken the lawful way (and Kik is a clothes store chain as well as a chat app, so it's even taken twice), why fight it or be angry about it? Just take another name.

That said, the decisions by NPM are also hard to follow. Why allow someone else to take over ownership of a package? Why allow anyone to take down published versions of an open source package? If you publish open source stuff on my site, I have every right to keep that stuff at that version and share it with others. That's pretty much what FOSS is about, right?

59
forrestthewoods 4 days ago 0 replies      
Why yes, depending on a third-party, external package manager is a huge risk. I have always believed that open source projects should be fully inclusive of any and all dependencies. This event has not changed that opinion.
60
cammsaul 4 days ago 1 reply      
There's a PR open to remove "unpublish" from NPM here:

https://github.com/npm/npm/pull/12017

61
jahewson 4 days ago 1 reply      
This is a really bad decision on npm's part. Kik's lawyer has pulled a fast one on them. Kik has no right to enforce the Kik trademark beyond the limited set of goods and services listed in the trademark application [1]. Kik is a registered word mark for mobile messaging software only. That's why the trademark database contains many entries for just the word Kik; other companies own the use of that word for other goods and services.

I'm really surprised that npm didn't push back against this. It's not like npm isn't full of trademarks:

https://www.npmjs.com/package/pepsi

https://www.npmjs.com/package/coke

https://www.npmjs.com/package/kfc

https://www.npmjs.com/package/virgin

https://www.npmjs.com/package/sprint

https://www.npmjs.com/package/nba

https://www.npmjs.com/package/nfl

https://www.npmjs.com/package/google

https://www.npmjs.com/package/yahoo

https://www.npmjs.com/package/skype

https://www.npmjs.com/package/word

https://www.npmjs.com/package/excel

https://www.npmjs.com/package/unix

https://www.npmjs.com/package/windows

https://www.npmjs.com/package/osx

[1] http://tmsearch.uspto.gov/bin/showfield?f=doc&state=4804:lir...

62
TimJRobinson 4 days ago 1 reply      
Quick script to test if your project is using any of the modules he unpublished:

 for module in $(curl -s https://gist.githubusercontent.com/azer/db27417ee84b5f34a6ea/raw/50ab7ef26dbde2d4ea52318a3590af78b2a21162/gistfile1.txt); do grep "\"$module\"" package.json; done
If any names appear, you should replace them or pin that exact version (remove the ~ or ^ before it). If nothing appears, you're probably good.

63
rzimmerman 4 days ago 0 replies      
npm really shouldn't let authors unpublish. It should definitely be impossible to overwrite a published package version (it is, but that's only been the case for the past year or so).

When you install express, you install 40 dependencies. Each of those has separate maintainer(s), and coordination is optional. If we're going to let this dependency mess grow organically, npm needs to be strict about what gets published, and we need to be really careful about depending on anything but a strongly pinned version.
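Concretely, the difference is one character in package.json: "1.0.2" is an exact pin, "~1.0.2" accepts any 1.0.x patch release, and "^1.0.2" accepts anything below 2.0.0. A made-up dependencies block for illustration:

 {
   "dependencies": {
     "left-pad": "1.0.2",
     "express": "~4.13.4",
     "lodash": "^4.6.1"
   }
 }

Even exact pins only protect you as long as the registry never lets a name@version be replaced, which is why the overwrite restriction matters so much.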

64
Top5a 3 days ago 0 replies      
All legality, copyright law, etc. aside, how did this even create a problem?

Even on small projects, basic build engineering dictates that you are cognizant of which package versions you are building against. Furthermore, all packages should be locally cache-isolated on your build server (or local box if you do not have a build server). Building against the most "up-to-date" versions of remote dependencies puts you completely at risk in situations such as this, let alone at the mercy of malicious updates to those remote dependencies.

What sane (pun intended) person would ever build against the most recent versions of all packages (including small ones such as this) fetched from a remote server? Also, for larger operations (i.e. more than several employees), how could QA possibly function when building from "the most recent version of all packages"?

All these entities that are suffering because of this should immediately fire all their build engineers, because they are not only a reliability concern, but, more critically, a vulnerability concern.

65
dc2 4 days ago 2 replies      
> This is not a knee-jerk action.

The only thing knee-jerk and honestly irresponsible is not warning anyone first, especially knowing how much his modules were depended upon.

Otherwise, there's nothing wrong with this.

66
pluma 4 days ago 0 replies      
By complying with kik's request, npm has set a precedent for library authors that basically means: if in doubt, you will lose your package name, even if you dispute the trademark.

This means npm apparently wants everyone to handle trademark disputes like Jade did: https://github.com/pugjs/pug/issues/2184

67
repn001 4 days ago 0 replies      
Not a Package Manager (NPM)
68
Trisell 3 days ago 0 replies      
This unfolding episode leads me to believe that if you are running a production app, you need to host your own internal npm registry and update it from the global one. That way, when something like this happens, you can carry on without the kinds of breaking builds being reported on GitHub.
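A minimal sketch of one way to set that up, assuming Sinopia (a self-hosted caching registry) and its default port; npm Enterprise or Artifactory would work similarly:

 npm install -g sinopia
 sinopia &                                 # start the caching proxy registry
 npm set registry http://localhost:4873    # point npm at your mirror
 npm install left-pad                      # fetched from the global npm once, then served from the local cache

Once a package is in the cache, an upstream unpublish doesn't affect your installs.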
69
albertfdp 4 days ago 0 replies      
To check whether I was affected by anyone taking ownership of one of the liberated modules and adding malicious code to it, I created a small script:

https://github.com/albertfdp/did-azer-break-my-stuff

70
miiiichael 3 days ago 0 replies      
I'm adding this bash script to the conversation. https://gist.github.com/mbranch/f77e62d91f46972dcc32

It reports on the inclusion of unpublished modules in all package.json files found in subdirectories.

71
ilaksh 4 days ago 0 replies      
This type of thing is one of the reasons I suggested before that a module registry could and should be a distributed peer-to-peer system.
72
zakame 4 days ago 0 replies      
Sounds like something that would not likely happen on other repos like CPAN/CRAN/CTAN.

Perhaps the JS community at large would be better off with a similar system? I remember that sometime long ago there was a JSAN effort: http://www.openjsan.org/

73
alongtheflow 4 days ago 0 replies      
Aftermath of the left-pad situation. Some say it started breaking major projects like react-native.

https://github.com/azer/left-pad/issues/4#issuecomment-20006...

74
mstade 4 days ago 0 replies      
The ability to "unpublish" a package is fundamentally strange, because it enables situations like this.

It's also strange that people put so much trust and faith into a private company to host and distribute packages largely for free, and then rail against them when they do stuff like this with infrastructure they own. NPM is not some free and open space; it's a private company with private interests. You should expect them to do whatever they need to protect those interests, which may or may not coincide with the public interest.

I hope this resolves in more people getting involved with projects like IPFS and Nix, that may ultimately provide some recourse to the issues of centralized package management.

76
cat-dev-null 4 days ago 0 replies      
NPM is a for-profit, so they're a single point of failure for lawyers and governments seeking to control others.

The other issue is the lack of distributed package/artifact replication, which makes it possible to take down an entire ecosystem by unplugging a few servers.

77
stblack 4 days ago 0 replies      
I read the whole damn thread and nobody, nobody links to the assholes that deserve to be kik-ed.
78
wangderland 3 days ago 0 replies      
If your project uses packages from the list and got broken by this, here's a solution: https://medium.com/@viktorw/how-to-fix-npm-issues-in-your-pr...
79
st3v3r 4 days ago 1 reply      
Hopefully NPM will remember this next time they consider something like this.
80
grapehut 4 days ago 0 replies      
My biggest issue with npm is the lack of verifiable builds. Even if I read the code on GitHub, I have absolutely no idea whether that's exactly what the person uploaded to npm. I could very well be running malicious code and not know it.
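You can at least spot-check a package by hand. A rough sketch, assuming the project tags its releases on GitHub (the version and URL are illustrative):

 npm pack left-pad@1.0.2        # fetch the exact tarball the registry serves
 tar -xzf left-pad-1.0.2.tgz    # unpacks into ./package
 git clone --branch 1.0.2 https://github.com/azer/left-pad.git
 diff -r package left-pad       # differences beyond .npmignore'd files are a red flag

Files excluded via .npmignore will show up as diffs, so this is a manual judgment call rather than a verifiable build.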
81
tytho 3 days ago 0 replies      
Perhaps someone has already suggested this, but what if npm had some sort of "unpublish block" whenever any modules depend on yours? Or maybe some sort of notification to the dependent package owners. This doesn't solve the issue of unpublishing packages with no dependents, nor does it stop someone taking over a name and adding malicious code, but it would encourage more responsible behavior when removing a highly-depended-upon package.
82
flurdy 3 days ago 0 replies      
Can NPM not add a "notice period" to their TOS and features? With a grace period for errors: e.g., to remove a package that was published more than one week ago, you have to give notice first, say 2 months, with a suspension before the actual removal.

There could be avenues for expedited removal/suspension, i.e. security and legal, which would have removed kik quicker but not left-pad.

Whether people would be aware of the notices or ignore them is another issue.

83
gedrap 4 days ago 0 replies      
Thanks to this, I hope people will reconsider the all-too-common deployment approach where you pull stuff from npm (or whatever external package manager/repository) at build time, and if that fails, the build fails.

This is fine for small projects. There are tons of applications where availability is less important than development speed.

However, not being aware of the risks and tradeoffs you're making is just plain insanity.

84
justaaron 4 days ago 1 reply      
oh geez.... welcome to trademark law

Google.

(why is this getting frontpage HN coverage?)

a trademark is a globally enforceable right (Madrid agreement) and one has an obligation to protect one's mark from "dilution" by others in the same category:

i.e. if you are selling "apple" garden shovels, you needn't worry about crossing into "apple" computer land, but I guarantee you that they have already registered that mark for "home electronics" etc.

Most countries require formal registration of the trademark (they are searchable in online databases) and most go on a "first filing" basis, but several, including the USA, go by a "first usage" basis and require you to prove your use of the mark in public...

it's a long shot, but you can always look up whether that company has, in fact, registered that mark, and in which country/territory they are claiming usage rights.

(for example, they can't be a local computer shop named "apple computers" that has only sold to locals since 1854 and suddenly sells computers on the global market, as there is already a global entity with that name registered)

85
superninja 4 days ago 0 replies      
"This situation made me realize that NPM is someones private land where corporate is more powerful than the people, and I do open source because, Power To The People."

This is true of all package distribution systems. There's always a small elite of admins who regard the system as their territory (and usually have no respect for the authors).

People contributing should be well aware of this.

87
kelvin0 3 days ago 0 replies      
The only Kik I knew was the cola: https://p2.liveauctioneers.com/1164/26545/9944497_1_l.jpg

So this lawyer, he's from what company? Because there seem to be quite a lot of Kiks around these days (Kik Messenger?)

88
ecthiender 4 days ago 0 replies      
Well done OP! I stand in solidarity with OP. I think this is a good way of showing our resistance to corporate power - by boycotting them.
89
kofejnik 4 days ago 2 replies      
this is why your dependencies should be checked into git
90
Coxa 3 days ago 0 replies      
Check your project for these liberated modules using (yay) this module https://www.npmjs.com/package/affected
91
antouank 3 days ago 0 replies      
> npm took the name away because they reasoned that more people would think that `kik` the pkg would refer to kik the app. full stop.

https://twitter.com/ag_dubs/status/712669386511949824

92
joepie91_ 4 days ago 0 replies      
For everybody discussing decentralization of NPM in the comment threads down below, please read the following thread: https://github.com/nodejs/NG/issues/29

Much of the thinking work has already been done.

93
dclowd9901 4 days ago 1 reply      
While I don't disagree with OP's angst, fuck them for choosing pride over working products. It's irresponsible and shows a complete lack of maturity. I'll make sure never to consume their modules in the future. God forbid they have a bad day and decide to insert malicious code into their modules.
94
Confusion 4 days ago 0 replies      
Meta: please don't upvote for agreement if facts are asserted that you cannot corroborate. And please carefully consider whether you only believe or actually know something is true. A lot of patent falsehoods are being asserted and upvoted in this thread.
95
spriggan3 4 days ago 0 replies      
It's high time people publishing packages on NPM audit their dependencies. I bet 80% of them are unnecessary.
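If you want a rough sense of how much you're actually trusting, both of these are standard npm commands:

 npm ls --depth=0 | wc -l      # roughly the number of direct dependencies you declared
 npm ls --parseable | wc -l    # every installed package, transitive deps included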
96
yeukhon 4 days ago 1 reply      
If I call my module pizza, are they going to send me an email about naming it pizza? Let's think about that. If a company owned kik as a trademark, I'd offer some money to buy the name before trying to act like a tough guy. At least be soft first if your goal is to get rid of the kik module.
97
hartator 4 days ago 1 reply      
Atom.io is not impacted; I think it's a good thing that apm runs on its own network.
98
trumbitta2 3 days ago 0 replies      
Trying to help with damage control: https://news.ycombinator.com/item?id=11346633
99
Wintamute 4 days ago 0 replies      
I'm confused: weren't scoped packages added to avoid exactly this sort of thing? Kik should just have used "@kik/kik", and the original package author should have been left alone.
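For anyone who hasn't used them: scoped names are namespaced per user or organization, so the two could have coexisted without npm reassigning anything. Illustrative commands (assuming the company actually published a scoped package):

 npm install kik          # azer's original package, left where it was
 npm install @kik/kik     # the company's package, under node_modules/@kik/kik

In code the scope is just part of the path: require('@kik/kik').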
100
jordanlev 4 days ago 0 replies      
How does serving his modules from a different corporate-controlled repository (github now instead of npm) serve his purpose of "liberating" the code from potential corporate meddling?
101
guhcampos 4 days ago 1 reply      
"This is not a knee-jerk action"

Yes, it is. The fact that you did not know about a company branded "Kik" does not make you exempt from the law. A law which, surprisingly enough, is being used in a reasonable situation here. Your package and their segment are closely enough related in context that people could assume they are actually related, giving you the power to essentially break their business if you do bad stuff.

In this case it's not really trolling coming from them. You don't just brand your new beer brew "Coca-Cola" - there's no reasonable argument to do that besides being a troll.

P.S.: holy crap npm is so broken I'm glad I'm on the backend side of life.

102
tobltobs 4 days ago 1 reply      
What would Stallman say?
103
howareroark 4 days ago 1 reply      
If I can buy a domain from ICANN for 10 bucks and then sell it to a company for a million... Why can't this guy reserve the right to sell this to that company for a million?
104
galistoca 4 days ago 1 reply      
Reading the article, I thought it was some massively popular framework. But when I visited the library's GitHub page, it seems to have only 8 stars. Am I missing something?
105
tehwalrus 4 days ago 0 replies      
Sounds like NPM should move to pulling in the code from specific GitHub tags or something? Although I suppose GitHub is also "private land"...
106
gambiting 4 days ago 1 reply      
I have read the post and I still have no idea what NPM is.
107
stevebmark 4 days ago 0 replies      
This seems like a fairly childish response. I'm not pro-copyright, especially in software, but "someone took my made up name" seems like a dumb reason to unpublish the rest of your work.

> "NPM is someones private land"

No shit npm is a privately owned company? That was true before you took these actions and it's still true after.

> "Power To The People"

This is what I don't get. All of the modules that were unpublished seem unpopular / not used so I don't know what impact this will have, but how does screwing over users of open source software equate to power to the people?

108
bbcbasic 4 days ago 2 replies      
> This is not a knee-jerk action

Seems like it. Why break everyone's builds? You could have just kept the modules there and declared that you will only keep them updated elsewhere.

109
EGreg 4 days ago 0 replies      
I'd like to liberate your modules

The new pickup line

110
Chyzwar 4 days ago 0 replies      
We should just troll them and create packages with kik in the name, like:

 kik-looser
 iKik
 only-kik
 true-kik
 true-kik2
 real-kik

111
devishard 4 days ago 0 replies      
Yet another example of the JavaScript ecosystem being pretty much garbage.

To be clear, I'm not attacking the author here. He released left-pad at version 0.0.3; no responsible developer should be using that in production code.

112
chvid 4 days ago 2 replies      
I really hope this guy just didn't know what he was doing and what effect it would have.

Otherwise it is totally irresponsible to mess up a big project like Babel just because you control a few lines of trivial code.

113
studentrunnr 3 days ago 0 replies      
npm will improve after this, and that is a net good.
114
jlg23 4 days ago 3 replies      
Seriously? "When I started coding Kik, didn't know there is a company with same name. And I didn't want to let a company force me to change the name of it. After I refused them, they reached NPM's support emphasizing their lawyer power in every single e-mail CCing me. I was hoping that NPM would protect me, because I always believed that NPM is a nice organization."

a) Ignorance is no excuse.

b) Expecting others to fight for one is lame. Either have the balls and fight or STFU.

"Summary; NPM is no longer a place that Ill share my open source work at, so, Ive just unpublished all my modules. This is not a knee-jerk action."

Wrong, that is the prototype of a knee-jerk action.

Last but not least, whining about it in public in the hope "something will happen" is pathetic.

What I'd suggest (though now it is too late): rename the module to comply with the legal claims, then put up a new module under the old name that throws errors describing the reason when called, so developers see it and the shame falls on the threatening company/lawyers.

115
turtlekiosk 3 days ago 0 replies      
can i kik it?

RIP Phife Dawg

116
dang 4 days ago 2 replies      
Name-calling and slurs, both of which you've done here, break the HN guidelines. If you have a substantive point, please make it without stooping to these; if you don't, please don't post at all.

We detached this subthread from https://news.ycombinator.com/item?id=11340768 and marked it off-topic.

117
zongitsrinzler 4 days ago 3 replies      
Extremely dick move on the part of the developer. Why would you remove modules that other people are using in production?

Did you think a small team like NPM would go head to head with a company that has full-time lawyers? And for what?

3
That awkward moment when Apple mocked good hardware and poor people techinasia.com
1070 points by nkurz  4 days ago   567 comments top 110
1
rockshassa 4 days ago 23 replies      
I just can't bring myself to feel the author's anger, in any capacity. He wants to position this as a jab against those who build their own PCs, but that is utterly irrelevant. What percentage of those 600 million five-year-old PCs do you really think are being thoughtfully maintained by modders? Does the author realize that most people do not want the responsibility of maintaining their own hardware? Or that they don't have the knowledge to do so?

Allow me to paint a more realistic picture: many of those PCs are junky, dusty boxes, running some outdated version of Windows, filled with bloatware and riddled with security issues. Inside them are a bunch of spinning platters just waiting to fail. And when they do eventually fail (due to wear, a virus, etc.), someone's grandmother is going to be shit out of luck, with no way to get at her email, saved photos, or anything else.

A properly configured iPad, leveraging iCloud for device backups, photo backups, email credentials etc, solves all of these problems. And they'll even configure the iPad for you in the store, so grandma doesn't need to know how to do any of it. Do YOU want to be the poor sap attempting data recovery on a failed disk, then realizing that even if you do recover grandma's data, you've still got to go buy a replacement drive, find a copy of windows that grandma knows how to recognize, and install everything exactly as it was before you got there? I've been that guy before, in both a personal and professional capacity. You will eventually fail, memories will be lost, tears will be shed.

We must not gloss over the fact that the iOS ecosystem does solve some very real pain points for real people.

2
vbezhenar 4 days ago 9 replies      
I can't imagine how I would replace my laptop with an iPad. Some tasks are definitely doable: web browsing, mail processing, music listening, Skype (though chatting on an iPad is terrible because you have to switch around all the time, losing your focus; maybe split apps would help, but I can't try them, because my iPad has RAM like a 15-year-old PC).

Generally speaking, for a power user every activity on the iPad is strictly worse. I can't easily download a ZIP, unzip it, open some text file, edit it, and send it via Mail. I can probably do it with the right apps, but it would require many more clicks or taps.

What I can't even imagine doing on an iPad: using IntelliJ IDEA, using Xcode, using Google Chrome to debug and develop web apps, using image editors like Sketch and Pixelmator (I know I can get some kind of image editing, but I don't think I could do what I'm doing on a PC).

Now, things I could theoretically do but probably can't, because of the walled garden: using a terminal to embrace full Unix power, downloading files with BitTorrent, using Bitcoin. Probably possible with jailbreaking, I'm not sure. Also I'm not sure whether I could download some huge 20GB file and watch it using another app without duplicating it (does iOS copy a file when I open it with another app, or just hardlink it?).

And, of course, a keyboard is necessary. A mouse would be useful too, but the iPad doesn't support mice, AFAIK.

So probably the only users who can easily migrate from PC to iPad are very casual users, who use their devices to browse the web, chat and play simple games. There could be some professionals who work with iPads; it's theoretically possible, but I can't imagine it.

3
Udo 4 days ago 5 replies      
I'm an Apple-only user at the moment, both mobile and desktop.

When Apple asserts that a desktop computer should be replaced by a locked-down handheld device with very limited capabilities, what seems odd to me is that they don't realize these devices do very different things and fulfill very different needs.

I don't worry about the demise of the desktop because I'm nostalgic; I worry about the loss of power and productivity incurred by users with desktop illiteracy. There are many applications for which a handheld device, especially one with the limitations of iOS, is just not suitable - in much the same way a full desktop/laptop computer is not suitable for the things mobile devices excel at.

That the hardware is locked down, outdated, and supremely expensive are additional criteria making the disconnect worse, but these are not the crux of the problem in my opinion. I see two outcomes from this, neither of them appealing: either Apple is misjudging the needs of its users, to the point where trendsetters like programmers switch away from the platform; or they succeed in their vision and breed several generations of technologically illiterate information workers fumbling their way through life with nothing but extremely limited mobile devices as their only productivity tool.

4
waspleg 4 days ago 18 replies      
I completely agree with the author of the article. Apple is the real-world incarnation of the economic premise of Huxley's "Brave New World". This was just the mask dropping for a second to pander to the faithful.

I'd like to add, as someone who works at a K12 public high school, that I've seen the reality of the article played out. My building is 100% free lunch, most students are extremely poor, and yet a sizable number have new iPhones. Why? Because they don't want to look poor or be thought of as such.

In American society poverty is associated with failure on many levels. We have our caste system as much as India, only ours is economic, and enforced ruthlessly with endless class warfare - largely in one direction.

5
gurkendoktor 4 days ago 1 reply      
That slide was really bad. It's not about being easily offended (I have no reason to be), but it pokes a hole in the first half of the event, where Apple tried to present itself as a green and caring company.

It's such an obvious mistake that I wonder why nobody at Apple pointed it out during the rehearsals. The thing about old PCs being designed 'before social media' was dumb, too. This is a "pro" tablet, right? How does it matter whether it has a Facebook app?

This is the same Phil Schiller that is now in charge of the App Store, and as an iOS developer, I find the carelessness a little worrying.

6
gopz 4 days ago 1 reply      
Apple says this shit every time it has a major press event. Saying it mocks poor people is digging around for something to be offended about. Is the author really offended every time they see an ad for a newer car model when current and prior models work just fine? Saying it mocks good hardware isn't out of touch. Seriously, what did they think Apple was going to say:

> Buy a new iPad pro.

Schiller turns around, goes back to the slides, stops midway, turns to the crowd and says "But I still use an old PC, because if it ain't broke, don't fix it!", winks, and continues the presentation.

Come on, this article is grasping at straws.

7
Spooky23 4 days ago 0 replies      
IMO, this is a typical Apple troll article.

"<vendor> mocks poor people" is the dumbest possible analysis of this data. I work for an organization with something like $1.5B in IT spend. Our 40th percentile PC is 8 years old. The 80th percentile PC is 4. Our desired refresh for a desktop PC is 40 months.

Why? The post 2008 recession killed discretionary spending. Then Microsoft failed utterly to deliver a compelling desktop strategy from 2008 to the present day. They finally got their shit together with Windows 10, but their fantasy world where the universe is transitioning from Windows 7->8->10 makes that more friction-prone than it need be.

Consumers are in the same boat. People skipped upgrades because of the friction involved in the transition, which is why Microsoft is dragging you to upgrade kicking and screaming.

Personally, I use my elderly in-laws as a proxy for non-technical consumers. They are technophobes -- a retired fireman and nurse respectively... not rich, not poor. When I met my wife in 2000, they were still leasing a telephone from AT&T. They made the PC->Mac transition in 2009 and were actually able to use their computer without worrying about the typical PC woes (AV, updates, etc). That Mac is aging, and it was starting to be time for it to go.

With the iPad Pro, my father-in-law ran out to the Apple Store by himself, got the stuff he needed and got everything going on his own. Long story short -- he loves it. It does everything they need to do, and it's a more convenient form factor than the laptop. He hasn't touched the computer since, other than syncing music from the Mac to the iPad.

8
ekianjo 4 days ago 3 replies      
The superior attitude of Apple execs is nothing new. Even right at the start of the company that's a culture Jobs started with, looking down on all its competitors.

What's rather sad to me is the laughing audience, who left their brains at the door and laugh and applaud when they are told to. Just like in 1984's iconic Apple ad, by the way. The loop is complete.

9
jMyles 4 days ago 3 replies      
In addition to the point the author makes about the benefits of modularity, this strikes me as environmentally tone-deaf as well. Do we just expect devices to enter the waste stream every 5 years now? Can't we do better?
10
mrbill 4 days ago 4 replies      
I just bought a "new to me" laptop.

Refurb ThinkPad T420s from 2011. I added 16GB RAM, two Intel SSDs, an Ultrabay battery, and an 802.11ac wifi card. Grand total: less than $325.

This will be my primary portable for at least 2-3 years, and it's already four years old.

Just because I can afford Apple doesn't mean I can justify the 2x price premium, or that "old" hardware isn't capable.

11
ysavir 4 days ago 3 replies      
Did you know there are over 600 million Honda Civics over 5 years old still on the roads? Hah! Clearly, their owners should replace them with Porsche 911s instead.
12
ams6110 4 days ago 0 replies      
My daily computer at work is a 6 year old Dell Optiplex. 4GB RAM and an SSD. Perfectly good for what I do.

My laptop is a 1st gen Macbook Air, inherited from original owner. Also perfectly good for what I do.

My car is from 2004. Perfectly reliable and meets all my needs.

There is a HUGE amount of retail activity that is carried out just because people want the newest and latest version of things. And I have no problem with that, but it's absolutely not necessary if you don't want to participate.

13
WhitneyLand 4 days ago 1 reply      
I'm usually pretty sensitive to this kind of thing, but this is an overblown reaction that reads too much into what was said.

There are enough people and situations to judge for disregarding the condition of poverty; we don't need to contrive any.

14
macspoofing 4 days ago 1 reply      
Wow. Is this how far the author had to dig to find something to complain about? Yes, companies will frequently give you some marketing spin to get you to buy a new version of a product. And yes, you should use your head to figure out if you a) need it, b) can afford it. And no, it's not an insult to those that decide not to buy the new product.
15
tylercubell 4 days ago 0 replies      
Yes, it was a misinformed statement on the part of Phil Schiller, but I find it annoying that the author cherry-picks one line from an hour long keynote to write an overly dramatic holier-than-thou diatribe. If the author wants to make the case that Apple is elitist or out-of-touch, then he'll need to gather more evidence rather than rely on a few personal anecdotes and pretentious quips.
16
Kristine1975 4 days ago 0 replies      
Since Apple sells hardware, of course they will find it sad if people don't buy new hardware all the time. But it's nice to see them being (unintentionally?) honest about it for once amidst all the marketing.
17
agumonkey 4 days ago 0 replies      
Didn't read Schiller's comment as a jab to users. More as the usual attack against the MS/clone industry.

My thoughts at that moment: "modern digital life" is a sad joke. 4K video won't change your life; even 1080p won't. If your hardware isn't absurd, pop Linux on it, and an SSD if needed, and enjoy the $80 bliss. All from a guy trying to sell time-of-day color shifting... come on.

18
donatj 4 days ago 0 replies      
This cuts particularly close to home for me. I work for an educational company with developers that actively mock schools with low resolution screens and poor JavaScript performance, not to mention Android and Fire tablets, while they sit on these brand new MacBook Pros with SSDs and Cinema Displays doing their testing. The schools don't have these low power machines because they're dumb or not tech savvy. The vast majority have them because they are poor. We are trying to help the poor and you are completely missing the point. If you can deliver the exact same content in a way that doesn't require a high performance machine that is the ideal. The more we can do server side the more we are giving instead of taking.
19
frogpelt 4 days ago 0 replies      
This is so silly. Apple is a company that makes money when you buy their products.

If there are 600 million people who are using something besides one of their products, they are going to say that those people should use their products instead.

Get off your high horse.

20
studentrob 4 days ago 1 reply      
Schiller is marketing his company's product. So what?

What he said was along the same lines of rhetoric as all the "I'm a Mac, I'm a PC" commercials.

There's nothing to take personally about what an executive says about the products you choose to use in your home. That's your choice.

21
post_break 4 days ago 0 replies      
Teases people with 5-year-old PCs, yet still ships Cinema Displays and MacBook Pros that are from 2011.
22
tombert 4 days ago 0 replies      
It kind of feels like the author was just looking for reasons to be offended. This comment didn't really seem that offensive at all.
23
zekevermillion 4 days ago 1 reply      
My home PC is over eight years old and still functions adequately. I enjoy using it and maintaining it much more than I would fumbling around with a new Apple iPad. It cost about $2800 new, and I've spent about $300 on upgrades since then, so the total cost of ownership is about $390/year. I don't think this cost is much better than buying new iThings every couple of years. But I get a much more powerful device, and it requires me to learn a few basic things about the computer to keep it up. So for me, as I suspect for a lot of people, the decision to maintain a PC is not really about cost as much as it is about fun.
24
mberning 4 days ago 2 replies      
I have no idea how the two points in the article are the only takeaways the author had. On point one, Apple has never been the cheapest at anything. Although they have been offering more low-end options of late, they are still not a "budget" manufacturer and likely never will be.

On point two, it is absolutely possible that a 5-year-old machine still runs fine. Depending on the machine! A 5-year-old MacBook Pro is still pretty capable. A 5-year-old Acer laptop that you got in a Black Friday sale... not so much. I think this is the level of user they would like to have switch, and it's not that far-fetched.

25
SEJeff 4 days ago 0 replies      
Actually a lot of homeless are turning to smartphones to help with their day to day lives.

http://www.business2community.com/tech-gadgets/the-us-homele...

http://www.nytimes.com/2015/04/15/upshot/fighting-homelessne...

26
mobiuscog 3 days ago 0 replies      
The only hardware in my house that needs to be replaced (which includes both PCs and Macs) of that age is...

... my original iPad. Because Apple left it to die: even the last update means it can't run Apple's own store or browser without crashing, and I can't get versions of software that previously worked, because they're not supported or offered any more after Apple forced developers to move up to all the new APIs.

Yes, I can see exactly why I should spend yet more money on something that will intentionally become obsolete in the future.

27
erokar 4 days ago 0 replies      
The comment appears even more heinous when Tim Cook goes on to brag about recycling and Apple's environmental responsibility. Please. Apple is one of the most bourgeois and reactionary companies today, primarily pushing products meant for consumption, not creation. At least be honest about it.
28
w8rbt 4 days ago 0 replies      
I use 7 year old PCs because they run Linux just fine.

Also, I think it's bad for the environment and humanity in general to buy new igadgets every year. So, I use a 4 year old refurbished dumb phone that I bought for $9.99. It works great (just like my 7 year old Linux PC).

29
Mikeb85 4 days ago 0 replies      
> "This is an amazing statistic," he says with a serious look before revealing that there are more than 600 million PCs in use that are over five years old. "This is really sad."

Maybe people are still using old PCs because they still work and are fully functional? I'm using a 4 year old ThinkPad and guess what? It's not only still fully functional, it's still quick, snappy, the screen is still bright and looks nice (I did splurge for the upgrade at the time), and it's been super low maintenance (been running Linux on it the whole time). It could use a new battery (capacity has gone down over time), but hey, it's a removable battery so I can do that.

I imagine I'll still get a few more years of use from it; there certainly are no signs that I should upgrade at the moment. I mean, if I were to be a little greedy, I'd buy a new laptop with a wicked video card and give this one to my wife (after she made me throw out a fully functional 6-year-old desktop we didn't use very often), but is that necessary? Not really.

It's pretty plain that Apple simply plans for its products to be replaced quicker than 5 years, to make more sales, and they're speaking to the faithful.

30
scandox 3 days ago 0 replies      
8 years ago I bought a top of the range Sony Vaio. I used it till last month. Replaced hard disk one time. Replaced memory one time. Glued charger together one time.

I bought a new mid-range Dell this month... and meh. Basically those eight years have made no difference to me running Arch and/or Debian.

Imagine trying to say that about the preceding 8 years (i.e. 2000 to 2008).

31
OSButler 3 days ago 0 replies      
In my case they mocked themselves, as I'm working on an old MacBook model. I'd love to upgrade, but everything is still working. The things I like about the new MacBooks barely affect my work, and if I spec one out to my preferred options, it ends up with an astronomical price tag. It just seems like they are advancing in some areas (screen, touchpad, ...) but are stuck in the past in several others (memory, HDD, ...).

I'll most likely get a new MacBook once my current one finally decides to join its ancestors. With that said, I'm actually impressed with its longevity, as none of my other PCs (desktops & laptops) ever lasted that long. So I actually see running on old hardware as an impressive feat (unless you're a gamer).

32
teekert 4 days ago 0 replies      
"Ending is better than mending"

For the uninitiated: http://www.enotes.com/homework-help/ending-better-than-mendi...

33
coco1989 4 days ago 1 reply      
My iPad mini keeps getting slower and slower. Soon it will be a Kindle.
34
rythie 3 days ago 0 replies      
I'm not convinced the new iPad Pro will even be usable for 5 years.

I gave up on the iPad 1 and iPad mini because they didn't have enough RAM to load many websites. The new iPad Pro has only 2GB of RAM, while even 5-year-old PCs had 4GB or 8GB.

35
ebbv 4 days ago 0 replies      
Yeah, it's a stupid comment; my PC uses a Core i5 2500K, which is more than 5 years old. There's a good reason: it's still plenty fast.

But saying this was mocking poor people is just ridiculous. It's trying to stir up outrage over nothing.

36
sklivvz1971 4 days ago 0 replies      
I felt the same as the author. The blog post, however, is a bit unfair to Apple and Schiller.

I think what he meant was: "It's sad that these people don't yet know the wonderful news that they can use this fantastic new iPad instead."

37
kdamken 4 days ago 0 replies      
Is the iPad Pro a feasible replacement for an actual computer? Not even close. Not until it runs a version of OS X.

Was apple mocking poor people? Of course not. It's 2016. Can we stop overreacting about every little thing yet?

38
abecedarius 3 days ago 0 replies      
> And I bought an easily moddable, upgradable piece of hardware that can adapt to new technologies in ways no Apple product could ever hope to.

Funny thing: the Apple ][ was the flag bearer in its time for that kind of upgradability. Wozniak insisted on this against Jobs, and it kept the ][ alive through the 3's flop and well into the Mac era -- the Mac did not succeed right away.

Is it so impossible for Apple to bring back some of that spirit of open design? The Jobs lockdown was and still is their greatest turnoff for me as a customer.

39
donkeyd 4 days ago 0 replies      
My 15" Macbook Pro will turn 4 this year. I don't see it being replaced soon, since it still performs quite well. The 13" MacBook Pro that my girlfriend uses is about to turn 6 years old. That one could be replaced, since it has some issues, but for her it does what it needs to do, so there's no real need. We might be able to replace it with an iPad Pro, but I doubt we will.

I wouldn't call myself poor, but I have no need to replace these devices every 3 years, since not much changes, except they get thinner.

40
rubyfan 4 days ago 0 replies      
Wow, this is reading way too much into the comment than it deserves. I guess if you have an agenda then you will figure out any way to make your point no matter how far a stretch it is.
41
ilamont 3 days ago 0 replies      
Reminds me of the time Schiller dumped Instagram after an Android version came out.

A reader noticed Schiller deleted his Instagram account (@schiller), and then reached out to Apple's most visible public speaker by Twitter for confirmation. Schiller told the reader that he quit the rising photo-based social network because the app "jumped the shark" when it launched on the Android platform.

http://9to5mac.com/2012/04/19/apple-marketing-svp-phil-schil...

At the time, I thought it was a slap in the face to people who couldn't afford iOS devices but wanted to join the Instagram community. Schiller portrayed it as a drop in quality, apparently:

Another 9to5Mac reader, Clayton Braasch, claims to have emailed Schiller directly, asking him to elaborate on the statement. In a post on his blog, Braasch writes that Schiller responded (9to5Mac says it has verified the email headers) and that while Schiller still considers Instagram a "great app and community," he enjoyed the fact that it was used by a small group of early adopters. Now that its reach has expanded, Schiller allegedly wrote, "the signal to noise ratio is different."

http://www.theverge.com/2012/4/19/2961612/apple-phil-schille...

42
psk23 4 days ago 0 replies      
I've got a PC that's one week old. I bought it straight after Apple failed to announce any Macs with a decent GPU. Most devs I know have switched back this year or last.
43
peterashford 2 days ago 0 replies      
I'm typing this on my home PC, which is over 5 years old. I've replaced the gfx card, CPU and memory, and it plays Battlefield 4 pretty well. I also use it for programming (I work in game dev). Why would I replace this very serviceable PC with an Apple toy that will be unsupported as soon as its manufacturer thinks they can get away with it, in an attempt to get everyone on their must-upgrade-to-the-latest-thing merry-go-round? I appreciate quality hardware, but fashion-in-tech a la Apple makes me puke.
44
joesmo 3 days ago 0 replies      
"Maybe Apple really does find the idea of hardware that can function for five years sad and funny these days."

Maybe? Is there even a question anymore? If there is, one only needs to look at the designs Apple has been pushing for the last seven years: un-upgradable, mass-consumer-grade products. If anything has been crystal clear from the disappearance of all upgradeable parts across their products over the last few years, and from their constant events (at least 3 a year), it is that they want you to buy new shiny stuff and they don't care about supporting the old stuff. Apple will support you for one year, or up to two or three (depending on the product) if you pay a few hundred extra for AppleCare. The fact that there is no AppleCare plan longer than two years for iPads and three years for laptops should be another crystal-clear indicator that their products aren't meant to last and be used that long.

That being said, at least with their laptops (haven't used the new Macbook) the hardware quality is generally very high (except for the trackpads). I hope it continues to stay this way.

45
dreta 4 days ago 0 replies      
Virtue signalling, the article.

Apple's based in SF. An iPad Pro is like a week's worth of rent, and to them "work" probably means scribbling a hipster poem while sipping coffee you can't pronounce without 5 years of language study. How is anything the guy said different from how these conferences always go? The fact that the author found it worthwhile to write about this like it's an actual problem is astounding.

46
joezydeco 4 days ago 0 replies      
How about the other moment where Apple showed their "40 years in 40 seconds" video and scratched out the Newton?

You really want to mock the massive effort of that group - a group you believed in at one point in time? At least honor the memory of Ko Isono.

I didn't see the Lisa or the Apple /// get the same treatment. Maybe because Saint Steve backed those projects?

47
twoodfin 4 days ago 0 replies      
Good evidence that if you try hard enough you can be offended by anything.
48
ivankirigin 4 days ago 0 replies      
This story is a pretty good measure of whether you get offended by something very small.

A 5 year old PC is low quality. A company that makes high quality products wants people to have better experiences with computers.

At the event, they launched their lowest price phone. I bet that phone in a year is going to be even cheaper.

You can find offense in everything, but you shouldn't.

49
kmano8 4 days ago 0 replies      
I was passively listening to the keynote, and this comment caught my ear. My first reaction was, "Well damn, I'm pretty proud that the 2010 Macbook Pro I've been using as a 24/7 server for the past 5 years is still chugging along."

Though I suspect Macs might not be included in that statistic, it seemed out of touch nonetheless.

50
droithomme 4 days ago 0 replies      
I disagree with this claim.

> There are really only two reasons why people might have a computer that's more than five years old:
> 1. They can't afford an upgrade.
> 2. They don't need an upgrade.

There are many other reasons. Among them, upgrading hardware will force an OS upgrade that will break significant software and hardware.

51
mwfunk 3 days ago 1 reply      
That's reading a lot into a random joke. When some public person makes a pseudorandom comment and you perceive that as a mask being dropped to reveal confirmation of all your darkest suspicions about that person or the people he speaks for, you might be projecting, just a smidgen.
52
rdl 3 days ago 0 replies      
I just realized my gaming PC is almost 5yo. i7-970, 6 core, x58, 560 Ti, 24 GB, a couple SSDs. Was high end when I built it and still basically ok now.

Computers really have plateaued in a lot of ways. Phones now last 2-4 years vs 1-2, laptops 3-6 vs 2-3, desktops 4-8 vs 3-4.

53
songzme 3 days ago 0 replies      
I think it's easy to point fingers, but I myself am guilty of "mocking poor people". One memory that stuck in my mind is from office hours in college. A classmate was struggling to configure her assignment on her 5-year-old PC; nobody really wanted to help her because her computer was so old and so irrelevant. Casually, I joked (with classmates around us): "You should burn your laptop and get a Mac." Some laughed, but she didn't.

"Why don't you buy me one"

I was a little offended; her remark and body language felt a little hostile. We never talked again.

This memory stuck with me and I wish I could apologize to her. My seemingly harmless remark poked fun at her economic handicap for my amusement.

54
dopamean 4 days ago 1 reply      
This reads like outrage for the sake of outrage. Like someone looking for offense everywhere.
55
circa 3 days ago 0 replies      
I agree with a lot of this in the personal world. Sure a lot of people don't have the money to upgrade.

At my old job, at an MSP, I used to upgrade a TON of people from XP to Win 7 or 8.1 in the small/medium business world. I could not believe how many companies absolutely had the money to upgrade their PCs but simply did not want to, because they were afraid of the learning curve. It basically came down to that. They had no problem paying us to "fix" their XP machines at $125+/hour. The same went for a lot of servers. Who knows how many of those are in that 600 million pool.

56
skc 3 days ago 0 replies      
It's not the comment that was absurd; it's the fact that Apple seemingly has no answer as to why over 600 million people are still using old PCs.

Instead of exploring why that might be the case, they decided to mock those people.

Just bizarre.

57
dates 4 days ago 2 replies      
"Maybe Apple really does find the idea of hardware that can function for five years 'sad' and funny these days."

I don't think this is true. Apple just replaced the motherboard on my 2011 MacBook Pro for free. I've also upgraded the RAM, swapped the HD for an SSD, and replaced the battery and fans. It's running soooo good, I love it.

The price of 5+ year old MacBook Pros, iMacs, and Mac Pros on eBay is proof that Apple does make hardware that lasts... Anyway, I'm interested in the relationship between Apple releasing new products and the aftermarket value of older versions. I think Apple releasing newer iPads probably makes older used iPads more affordable.

58
jonkiddy 4 days ago 0 replies      
"nothing about todays iPad Pro presentation made me rethink my position at all"

This is the only part of the article that resonated with me. Apple completely failed to provide any viable reason for 600m+ PC users to switch to an iPad Pro.

59
bechampion 3 days ago 0 replies      
I own a MacBook and I love it, but what I really need is:

Browser
Terminal
Python

That's it. To me it's all very relative... I can work with a 5-year-old laptop, no problem; most of my things run on servers anyway. iPad Pro? No way... give it to the kids.

60
chetanahuja 3 days ago 0 replies      
It's funny because I'm still using a giant MacPro from 2008 (yes... the tower) as a home machine. It's crammed full of high density storage and I've added aftermarket RAM to the box to max out the slots. It's the last mac model that allows you to upgrade easily.

I think they continued to sell it until 2012 or so but then fixed the "oversight" with an art deco piece of a "desktop" computer with no room for upgrades. I still buy Apple laptops because work involves dealing with Xcode and iOS stuff but for any personal use, no more Apple hardware for me.

61
s_q_b 4 days ago 0 replies      
The point of the throwaway remark was that there's pent-up demand for consumer goods due to the Recession.

Similarly, traders are speculating that global new vehicle sales will increase, as the average age of a car on the road is far older than the previous trend. It's north of a decade even in the United States.

The theory is that during tight times people deleverage and reduce spending, while during boom times the demand that accumulated during the tight times is released through extra consumption.

Getting from the Apple comment to the article's topic, much less conclusions, takes a spectacular amount of cognitive gymnastics.

62
gnicholas 4 days ago 0 replies      
>Even if you insist on a tablet, you could get Microsofts Surface 3, which boasts a slightly worse screen but offers double the storage capacity, for US$100 less.

It is well-documented that the Surface OS takes up far more space than iOS, which means that the available space is not nearly as disparate as it would seem (though the Surface still has a bit more): http://www.slashgear.com/surface-3-storage-space-still-limit...

63
d_theorist 4 days ago 1 reply      
I think this person has missed the point.

The assumption here is that the vast majority of >5-year-old PCs are crap, and that the reason for continuing with them is that a new full PC replacement is too expensive. The iPad is supposed to be disruptive because it's a lower-cost machine that can nonetheless do everything the user used to do on a full-fledged PC.

You might disagree that the iPad succeeds in this, but it is nonetheless the way Apple thinks about where the iPad sits in the market. This was the point of Schiller's remark. It has nothing to do with mocking the poor.

64
pmontra 4 days ago 0 replies      
I paid my bills from Nov 2006 to Feb 2013 with an HP nc8430. I let it go because it started to be annoyingly low on memory for my new usage patterns: more VMs of more memory-hungry OSes and more browser tabs, with only 4 GB. Furthermore, it ran out of support and I would have had to buy my own spare parts; no more next-business-day on-site assistance. I got a ZBook 15 and I feel like I can go on with that for another 6 years (i7, up to 32 GB). No need to be on a 2-year update cycle, sadly for HP (or Apple) but not for me.
65
kplanz 3 days ago 1 reply      
I do not agree with the author of the article. I think there is also a reason #3: they think they don't need an upgrade.

There are a lot of people who own an old PC and think it's the best possible setup. But in reality they wouldn't actually need a full PC, because all they do is browse the web and read/write emails. My opinion is that many people would be better off with an iPad than a PC. It's small, portable, intuitive, easy to maintain, has great customer support and so on.

66
randcraw 4 days ago 0 replies      
Maybe Schiller meant it was sad that PCs haven't improved substantially in 5 years, which is true to a degree that isn't just sad, it's seriously bad. As software continues its historical trend of slowing down, app runtimes inevitably will too.

Is an iPad Pro 2 the answer? Obviously not. The unsaid lesson from this event is that Apple's products aren't improving as quickly either. Nor are the new features as interesting. So it's not surprising that Schiller would look backward instead of forward.

67
phaser 3 days ago 0 replies      
I prefer a 5-year-old computer that is open and repairable, where I can run any OS I want, to the "Ultimate PC replacement" that can only run software signed (and sold) by Apple.
68
quietplatypus 4 days ago 0 replies      
I'm not even poor, and I still do my research on a workstation from 2009. It's fast enough for what I'm doing, and why waste new hardware on bloated software?
69
bogomipz 4 days ago 1 reply      
In the last year we've seen Apple produce a pencil and a watch; what's next, a paperclip? I found the comment condescending and yet not at all surprising. Seriously, F them.
70
XorNot 4 days ago 0 replies      
There's a study I want to see done: take a bunch of people, dress them in a whole bunch of different overall "looks". Put them in different contexts holding several different smartphones of varying prices.

Then, bring in the subject group: and ask a question - how much do you think the phone is worth? How wealthy do you think the person is?

My hypothesis: people have no idea what phones cost at an average viewing distance (say 3m+).

71
blackhaz 3 days ago 0 replies      
Reading this on a 2009 MacBook Pro with an SSD, working perfectly fast for my tasks, with a ThinkPad T400 running FreeBSD nearby. I think they're running out of ideas for why I would need to upgrade. Definitely not doing it for a new fancy UI animation. Just imagine: content that scrolls down and then smashes into the screen boundary, bounces back and then slowly reaches equilibrium. My ass...
72
bfrog 3 days ago 0 replies      
Is it really any surprise no one is bothering to keep following the upgrade-windmill money shakedown? My 5-year-old laptop works just as well with Gmail, YouTube, and Google Docs as my 8-year-old desktop, etc. There's very little reason to stay on the upgrade bandwagon right now. The innovation of software has more or less flatlined for the end-user machine... for now.
73
peferron 3 days ago 0 replies      
My wife's iPad 2 slowed down to a crawl after updating to iOS 8. She doesn't use it anymore because of that. Obviously Apple doesn't want iOS devices more than 5 years old to remain in use either.

In the meantime I've upgraded my parents' PC from Win XP to Win 10 and it runs as fast or even faster than before.

I agree with the Apple exec that it's really sad.

74
mchahn 3 days ago 0 replies      
This thread is so long that I may be repeating what has already been said, but ...

I converted to Linux from Windows about a year ago. An unexpected but great benefit is that I have been able to pull PCs 5 to 10 years old out of my garage and now they run like brand-new. I should be good for a while.

75
mirimir 3 days ago 0 replies      
> There are really only two reasons why people might have a computer that's more than five years old: 1) They can't afford an upgrade. 2) They don't need an upgrade.

There's a third: they wouldn't trust an upgrade :) Just where to draw the line isn't obvious. But it's probably closer to 10 years old than five.

76
Philipp__ 4 days ago 0 replies      
The iPad needs to change to the point where it wouldn't be called or recognized as an iPad, or just die. That device is in a technological gap right now, getting beaten by newer, more modern devices that evolved from the iPad form. It annoys the hell out of me what Apple is doing right now. For the first few years I thought "Thank God Cook is keeping it simple and well defined as it was under Jobs", but now I am afraid of history repeating. The wheel always turns, and I would be really sad to see the 90s Apple from before Jobs came back. I enjoy their computers a lot, and they will always be the main thing from Apple, at least to me. I bought a second iPhone 5s a month ago because it simply does the job for me... While I was an Apple fanboy, every consecutive year I get more and more disappointed.
77
georgeglue1 3 days ago 0 replies      
I'm preaching to the choir here, but the obvious third case the author omits is 'semi-technical folks who don't want to deal with the friction of a new environment', which is not so offensive.

And 600 million 5+ year old PCs seems like a low number...

78
ixtli 4 days ago 0 replies      
I understand why someone would feel this way, but the irreconcilable reality is this: http://windows.microsoft.com/en-us/windows/lifecycle

The majority of those machines Apple is "mocking" are running operating systems that have known exploits in the wild, and people are doing their personal banking on them. The shitty, stuck-up delivery doesn't change the fact that it actually would be better and more private for people to dump their unsupported Windows XP/Vista/7 boxes and use an iPad Pro. Then, at least, they'd have a semblance of security.

I'm not even willing to say that Apple didn't intend to mock the poor but the facts remain regardless of delivery or intent. I think we can agree that for the majority of the people we're talking about here, "They don't need an upgrade" is simply not the case.

79
beyondcompute 3 days ago 0 replies      
I am using a 2011 laptop. I am not poor. I understand Apple is sad (it's a business) that I don't buy new hardware (I feel there's no more return on the investment). I also agree that tablets are closer to the present (not the "future", ha-ha) of computing, but I am a software developer and I will gladly switch to an iPad Pro only when I have access to a terminal, all the usual Unix tools, Homebrew and the ability to compile/run the usual web-dev software: servers, database systems, programming languages (and to easily connect my mechanical keyboard). Apple is not mocking the poor. Calm down, let's get back to work.
80
typon 4 days ago 0 replies      
Is everyone here really acting incredulous at a giant corporation promoting consumerism?
81
Karunamon 3 days ago 0 replies      
This just reads like trying really hard to be offended by something. Literally any comment about legacy hardware (hell, even the word "legacy") could be interpreted this way if the author reaches hard enough.
82
Sarki 4 days ago 0 replies      
This Apple Keynote 2015 video extract speaks for itself: https://www.youtube.com/watch?v=b4UOmR_xSBM
83
drivingmenuts 3 days ago 0 replies      
It's marketing - you pit the have-nots against the haves to make the have-nots want to have and the haves to want more.

If that makes you angry, then you need to upgrade your thin skin.

84
jayfk 4 days ago 1 reply      
I bought a Mac Pro 4,1 from 2010, flashed it to the 5,1 firmware, upgraded the CPU, added a Graphics Card, 4x 3TB drives and a PCIe SSD blade.

Best thing I've ever done in terms of Apple hardware.

85
Joof 3 days ago 0 replies      
There's a lot that those 5 year old PCs can do that those shiny new iPads can't. In most other cases, they get the job done anyway.
86
edandersen 4 days ago 0 replies      
The correct solution would be to provide an x86 build of iOS (which they must have) with mouse support to install on these aging desktops. They'd still make a cut on selling apps.
87
sickbeard 4 days ago 6 replies      
Story time! My gf and I went to take pictures of some expensive cars at an Audi dealership. While there, one of the nicely dressed salespeople started talking to us about the Q7. He quoted about 70k CAD as a starting price and argued that, sure, it was an expensive car, but it lasts longer and you save money in the long run.

I think maybe that's what Apple was trying to get across: buy some quality hardware that is updated regularly instead of spending low on cheap things. It's a poor argument for sure, but I wouldn't call it out of touch, because it is the de facto argument salespeople use to get you to purchase more expensive things.

88
kitsune_ 3 days ago 0 replies      
I have an i7 930 that I bought in 2010. Like hell I could benefit from a USD 600 iPad Pro. His statement actually only shows me how preposterous it is to buy an iPad.
89
nkrisc 4 days ago 0 replies      
I think the author's second point brings up something important:

If I've replaced every component in my desktop computer, even the case at one point, is it still the same computer?

90
roboto584903 3 days ago 0 replies      
A 2011 Mac mini with a quad core processor has more processing power than any current model, while the price has stayed the same. Now that's sad.
91
Tycho 3 days ago 0 replies      
Hmm. I use a 2011 MacBook Air and the thing performs (and looks) like it's brand new. With the latest OS. I have no plans to upgrade it whatsoever.
92
scblock 3 days ago 0 replies      
The sanctimoniousness in here is especially ironic given the general attitude of HN and YC about money and technology. It's hard to take you seriously.
93
robmcm 3 days ago 0 replies      
The elephant in the room here is that Apple support their products for longer than anyone in the industry.

Support for legacy iOS devices and Macs is very impressive.

94
placeybordeaux 3 days ago 0 replies      
My well over 5-year-old computer just got a graphics card that will push it into Vive VR range.

Thanks for the advice of getting an iPad Pro, bro.

95
seivan 3 days ago 0 replies      
Clickbait. Pure and simple. I felt the same way when my parents used PCs until I got them iOS devices.
96
mattkrea 4 days ago 0 replies      
5 years is a little short of a time frame, but this wasn't mocking poor people... this is a bit much.
97
ArcticCelt 4 days ago 0 replies      
"Let them use iPad Pros"
98
sunasra 3 days ago 0 replies      
I think they forgot 'Innovations are based out of Tradition'
99
wodenokoto 4 days ago 0 replies      
I saw the screenshot of the slide and thought they meant old software and mulled quite a bit about the wording.

I have a mac that's over 5 years old and I'm quite proud of the fact that the hardware is still good. Apparently I am bringing shame to Apple.

100
msie 3 days ago 0 replies      
This is how much I'm going to spend on this stupid article.
101
grandalf 3 days ago 0 replies      
The article is reactionary and click bait.
102
eximius 4 days ago 0 replies      
Never attribute to malice that which might otherwise be explained by ignorance...

Though this might be a bit of both.

103
RUG3Y 4 days ago 0 replies      
I'm upset that Bugatti doesn't make cars that poor people can afford.
104
analognoise 4 days ago 0 replies      
Are we seriously that sensitive now? The guy is trying to sell computers.
105
Wonderdonkey 3 days ago 0 replies      
I'd been exclusively with Apple since 1989 when I bought my first Mac (an SE with dual HD floppy drives and a whopping 400 MB external hard drive). I stuck with Apple through the difficult years and then even became the editor of a multi-title Mac publishing operation. Apple loved me so much they gave me a loaded iPod for my birthday one year. I don't know how many people they've done that for, but not many, I'm guessing.

But things started to change with the success of the iPhone and then the iPad. We Mac fanatics used to say that any success for Apple was a win for the Mac platform. But in reality, it hasn't played out that way. The Mac is languishing, and it's languishing in ways that I can only attribute to intent. It's becoming more frustrating to use. Files that you see right in front of you don't come up in a search. Software updates bring rapid obsolescence. Simple things like "Save As" have been changed in Apple's apps so that now Shift-Command-S, for example, is the command to "Duplicate" a file, which you then have to Command-S save. Then when you close out, you have the additional step of dismissing a save dialog on the original document. The hardware, obviously, is not being built to last. (Apple's laptops were always frail things, dating back to my PowerBook Duo and PowerBook 520c, but their desktops and workstations were always bullet-proof; they are not anymore.)

It's a bunch of little things and big things combined to make a very frustrating experience.

This December, I decided to jump ship. I bought a Surface Pro 4. The hardware is awesome (Core i7 with 16 GB RAM). The software needed some manual intervention, but it's coming along. (Microsoft didn't include the WinTab driver, for example, so there was no pressure sensitivity for some apps. And there was no documentation available for it. And frankly Apple's keyboard shortcuts for special characters are better than Microsoft's, but I've been able to emulate those.)

I don't even think about what platform I'm on when I'm working now (except when I use Dreamweaver CS6 because Adobe is freaking horrible and can't deal with Microsoft's trackpad and wants to force me to rent CC, which I will never do).

I never considered an iPad or iPad Pro for a second. They are useless to me. When I get a unit in, it just gathers dust. There's nothing "pro" about it unless your profession is typing e-mails and visiting Web pages. Plus, I actually like computers. I'm old enough that they are still like science fiction to me. I still have dreams about them. And I like to be able to get into the software guts of my computers and mess around in there.

I also refuse to consider the iPhone. Not as long as there are Android phones that have expandable storage and a removable battery. Plus every time I have to deal with one of my kids' iPhones, especially when I have to deal with iTunes on top of that, I want to punch Apple so badly.

And most importantly, I don't want to be at the bottom of the food chain in Apple's iTunes ecosystem. And that's all Apple's customers have become.

106
746F7475 3 days ago 0 replies      
Being this mad over some random comment...

Either upgrade or not, no one cares.

107
DrNuke 4 days ago 0 replies      
You don't need to be the Pope to understand that money is the Devil's dung, and Apple is a big carrier of it.
108
golemotron 4 days ago 0 replies      
This brings hyper-sensitivity to a new level.

Perhaps to make sure no one's feelings are hurt, all ads for new products should be banned.

109
xutopia 4 days ago 0 replies      
Give me a break! That was not a jab at poor people. It was explaining how big a market there is.

If you want to see a jab at poor people, look at Gainsbourg burning a 500 franc note (about 100 USD at the time) on live television. That's a jab at poor people: https://www.youtube.com/watch?v=gMq3Zr9_ARE

110
FussyZeus 4 days ago 1 reply      
While I think the comments about laughing at the poor are on point, the other half of this seems a little "trying to be offended." Yes, it could've been worded better, but it's a joke around our office that the PCs need to be replaced after 1-2 years, whereas we have MacBooks that have been in service for more than 5 with little to no issues.

I think it's generally accepted common knowledge that Macs age far better than PCs, maybe not so much on desktops, but laptops? Definitely. I have a custom-built PC at home myself and I'd never trade it for a MacBook of my own, but after using a MacBook from the company for the last few years, I can't ever go back to a Windows laptop.

4
I switched to Android after 7 years of iOS joreteg.com
832 points by joeyespo  3 days ago   496 comments top 79
1
kev009 3 days ago 21 replies      
I did the opposite and can't believe how much better my life has gotten, because my iPhone is just a simple tool that I use for communications and don't think about as a project. With Android, I always wanted to tweak silly things and run CyanogenMod because the handset firmware was always so bad and vulnerable. On several occasions I'd bricked my phone, requiring hours of recovery, or had transient failures of cell service and communications issues. I guess if you have the right level of discipline or apathy, or use a Nexus device, that may be more Apples to Apples (harhar).
2
sjenson 3 days ago 6 replies      
Nearly all of the comments here are missing the point of this blog post. The author likes Progressive Web Apps; they are important to him. He's moving to Android because it supports the web better.

That's it.

This isn't iOS vs Android and it certainly isn't web vs native. Yes, the article is critical of native apps (and the app store) so I can see how you'd go there but it's a distraction. I see this article as an "I want to use the best mobile web platform possible" argument.

3
kalleboo 3 days ago 6 replies      
My main exposure to Chrome web apps is Hangouts on Chrome for Mac and half the time I shut it down and choose to use the native app on my phone instead due to the poor, non-native UI and the battery life impact of Chrome.

edit: the other shiny Google web app example, Google Docs, doesn't work either. In Safari it likes to drop keys, and the last time I used it in Chrome (last autumn), it would either crash the whole tab, or freeze it up long enough for it to tell me it gave up and that I should just copy the content and paste it into a new document

It seems we're re-living the nightmare of Java "cross-platform compatibility" but with an even worse programming language.

> In fact, I think Progressive Web Apps (PWAs) actually have a huge leg-up on native apps because you can start using them immediately

> There's just so much less friction for users to start using them

Every web app I've used has required a painful sign-up process, which is usually where I bail out of the process. Way more friction than an app store install.

4
BuckRogers 3 days ago 1 reply      
I did the opposite: 7 years of Android to iOS. I'll never go back unless Apple somehow changes the experience to be more like Android phones, and less like iOS is. But I don't really care about that. I just want my phone to work, to make calls and not fail or slow down. Not be another computer I have to maintain. iOS, in my experience, is a great choice if that's the goal.

He hit the nail on the head at the end. React Native and similar tools are simply going to help the app stores. I have no qualm with app stores, as I'm not a web-app diehard.

Just use what makes sense. I never think that is Javascript and take the exact opposite view of the author. I use JS only when I absolutely have to. I prefer to build native platform experiences, which if you're doing more than a CRUD app many times you have to do anyway. I'd work with C#, Swift, Rust, Python and their associated ecosystems before trying to JS All The Things. I find that concept very anti-democratic and regressive.

The JavaScript-diehard mentality will come to its final death throes once wasm hits v2 and allows every language the chance to work in the browser. Then the web will truly progress, as the author states. Developers will be freed to use whatever they want. Swift on the server, iOS and browser. Let programming platforms and tooling duel it out, not hand the crown to a PL that was created in 1 week. I choose Python, but everyone should be able to use whatever they want as well.

For me, that's the real "progressive web app".

5
mrcwinn 3 days ago 3 replies      
The argument seems to be that app developers aren't doing very well on the app store, and you're looking to the free and open web as the place where vast sums of money will be made? For the vast majority of these apps, I beg to differ. The web plays by the same rules as the app ecosystem: it's very expensive to monetize, unless of course you are creating value for someone who has money and minimal friction when paying.

"Unfortunately, the web platform itself wasnt quite ready for the spotlight yet. It was sort of possible to build web apps that looked and performed like native apps..."

Are you talking about 2007 or 2016? Native apps will always outperform non-native apps - and not because of any emotional or "political" reason - but for perfectly obvious technical reasons. Web apps have an extra layer between themselves and the hardware. Native apps do not (or, at least, the layer is much thinner). Even if web apps increase in speed another 100x, native apps will be right there too.

Look, at the end of the day, use Android or iOS. I don't care. I've used both. But don't switch for this reason.

6
itp 3 days ago 4 replies      
Wow. I'm a long time Android user and probably pay more attention than most, and I had no idea web apps had gotten quite this nice. Currently the only web app / web shortcut I have installed is the HackerWeb app[0], which is nice but clearly not taking advantage of all of the functionality it could.

I "installed" Flipkart Lite and the Voice Memos demo app to see the state of the world. Clearly it's possible to build some really nice web apps these days! I hope to see more of it moving forward.

[0] https://hackerwebapp.com/

7
S_A_P 3 days ago 4 replies      
I'm not sure what it is about articles like these that bother me so much. Is this guy some hacker hero that I should know? I don't care what the platform is, and this has nothing to do with iOS vs Android. I really cannot stand this "why I quit x" type of blog post. Is there a reason this guy's opinion matters more than anyone else's? I know I could just ignore articles like this, but it does happen to be staring me in the face at the top of the list. At the risk of irony, I would much rather see a case made for improving something than an "I chose this because it's better, and I know better than you" article.
8
mostafaberg 3 days ago 6 replies      
>I don't know about you, but the idea of having a fully capable web browser in my pocket was a huge part of the appeal.

A: Both iOS and Android have fully capable web browsers; I'm not sure what's missing here?

>I'm talking about stuff that QA should have caught, stuff that if anybody at Apple was actually building apps this way would have noticed before they released.

A: They do pass QA, that's why features are removed

>One quick example that bit me was how they broke the ability to link out to an external website from within an app running in standalone mode. target=_blank no longer worked.

A: Thank god Apple no longer allows that. How do you expect a tiny screen to handle popups and switching web browser views when you click links? This is very bad UX.

>We were running a chat product at the time, so anytime someone pasted a URL into chat it was essentially a trap.

A: I'm not here to judge your decisions or why you did it that way, but IMHO a chat product doesn't really belong in a "web browser"

>The message from Apple seemed clear: web apps are second-class citizens on iOS

A: Exactly, and it is that way for many good reasons.

I see you've mostly switched to Android just so you can continue developing web apps; that's okay for you, but it's not a really good reason at all. Don't be like the people who were bashing Apple when it decided to remove support for Flash Player, because that's one of the reasons the web has become the way it is today. I'm not an Apple fanboy; I also did the switch from iOS to Android after around 7 years.
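
For context on the target=_blank point above: the workaround people usually reached for when external links broke in home-screen (standalone) web apps was intercepting clicks in JS. A rough sketch, assuming Element.closest support; it is illustrative only, not a claim about what any given iOS version actually permits:

    // Route external links through window.open explicitly instead of
    // relying on target=_blank inside the standalone shell.
    document.addEventListener('click', function (e) {
      var link = e.target.closest && e.target.closest('a');
      if (link && link.host !== location.host) {
        e.preventDefault();
        window.open(link.href, '_blank');
      }
    });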

9
untog 3 days ago 4 replies      
I agree with pretty much everything in this article - I firmly believe that we're due a "post-app" world where progressively enhanced web sites provide 95% of the functionality we require. But we're not there yet - I'd love to see better WebView integration into native UI components (UINavigationController and the like), to provide things like swipe-to-go-back, which is monumentally hard to do on the web.

But hey. Maybe, just maybe, we'll end up back in a world where cross-platform development is viable. If Apple lets us.

10
criddell 3 days ago 1 reply      
And then somebody in management asks why the new app is missing so many features on his brand new iPhone. In fact, all the C-level folks and board members are primarily iPhone and iPad users and none of them are happy that so many goodies are missing.

If you aren't worried about providing a first-class experience to your iOS customers, then build for Chrome + Android. Although that sounds a little like "build for IE6 + Windows" 15 years ago.

11
rimantas 3 days ago 0 replies      
It would be nice if the guy stuck to a coherent argument. Meanwhile he talks about a "monarch enforcing a 30% tax", about iOS developers barely making any money. OK, so where are the numbers? How much money did he make with his "installable web apps" on Android?
12
mastazi 3 days ago 0 replies      
Just yesterday, I switched back to Android after 4 years of iOS and I am really really pleased. I especially like the interoperability between apps and the "draw over other apps" capability. In relation to the linked post:

1- I'm not 100% convinced that web-based apps are always the way to go on mobile platforms; there are many pros and cons.

2- While Chrome for Android supports a wider array of web standards[1], that difference doesn't (yet) seem very significant looking at various sources such as caniuse.com.

I just wish Apple was working more actively on Safari development, both on desktop and mobile: they started from a very good position (e.g. the circa-2010 Safari for iOS was vastly better than the circa-2010 Android browser) and they are now rapidly losing ground.

[1] http://caniuse.com/#compare=ios_saf+9.0-9.2,and_chr+49

13
hackuser 3 days ago 1 reply      
A bit OT: I'd like a mobile platform that provides confidentiality (from both government dragnets and commercial spying) and end-user control. These seem like fundamentals of any platform, at least as user options, but I haven't found it:

* iOS seems to have some confidentiality, though are users really protected from commercial spying? Of course end-user control is very limited.

* Android provides some end-user control if you root your phone, but it's complicated to utilize. Confidentiality is awful; there are a never-ending number of holes and leaks, AFAICT, many built into the OS. No fork (i.e., ROM) of Android seems to focus on confidentiality, though I'm curious whether BlackBerry's Priv locks down the OS in addition to the hardware.

* Basebands are neither confidential nor provide end-user control, in any phones outside of FOSS projects, AFAIK.

* Mobile service providers also are an omnipresent risk.

---

I suspect a decent solution to the baseband and mobile service problem is the following, but I haven't tried it and I know it has some weaknesses:

* a hosted VPN service that provides a firewall (the firewall is needed to filter outbound connections from your phone)

* a cellular router that's pre-paid, tethered to the phone to isolate the baseband from the rest of your handheld computer

* VOIP service for voice and SMS/MMS

14
jamisteven 3 days ago 4 replies      
I for one could never go back to Android. iOS is just such a better user experience, much more fluid. Android feels like Cisco Voice's product lineup, all pieced together: fragmented applications and processes that don't work side by side with each other. The other reason, which is huge to me, is the hardware. I am huge on how things feel in the hand, and in my opinion Apple's hardware is just far superior to anything offered for Android. The best things I have seen hardware-wise were the Samsung Galaxy Alpha and the OnePlus 2. I miss the old Nokia days, E61/E62; that build quality was top notch, although running Symbian made it a bit of a snail. I tried switching out of iOS and over to a Nexus 5 when it was released. I had pre-ordered it and was super excited for it to come in, but the hardware felt like total shit to me, and after a month I swapped back to iOS. I'm still rocking an original iPhone 5 that's jailbroken; it works better than that N5 any day of the week. Much like cars, it isn't about the size of the engine or the tech it comes with; it's about the whole package and how it all works together, as a unit.
15
vjeux 3 days ago 0 replies      
I want to clarify some points on React Native. Unlike what is commonly said, my goal with the project is to make the web better.

A fundamental problem with the web as it exists right now is that as a "user", you cannot go one level deeper when you want/have to in order to provide a good experience. There's a big list of things, like customizing image decoding/caching or extending the layout part of CSS, that is encoded in the browser and cannot be changed in userland.

The way to solve your problem is to convince a browser vendor to implement a solution, then all the other browsers to support it and wait for years such that your userbase can use it. This loop is extremely long and involves a lot of conflicting interests and having to support it forever.

The idea of React Native is to provide a subset of the web platform and hooks to drop down lower whenever you want to. For example, as a user you can use <Image> which behaves like a web <img> and be done with your day. But, if you want to use another image format, or manage your image cache differently then you can implement it and provide a <MyImage> component to the end user.

The advantage is that each app can start building and experimenting with its own low-level infrastructure and replace pieces that the default platform doesn't do adequately for the use case they are trying to solve.

Now, why is it good for the web? Since React Native primitives have been designed to work on the web with a small polyfill ( https://github.com/necolas/react-native-web ), there's now a concrete way to improve the web platform without being a browser vendor. You can prototype with your ideas on React Native and when you figure that one is actually good, now start the process to ship it to the entire web platform. Kind of the same way you can prototype your js transforms with babel and then push them to tc39 to make them official.

If React Native is as successful as I want it to be, the web platform is going to supports all the use cases that only React Native can provide today and we can just rm -rf the entire project and use the web.
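
To make that concrete, here is a minimal sketch of the kind of userland component described above, using React Native's public Image component and its Image.prefetch API; the MyImage name and the naive cache policy are made up for the example, not part of any real project:

    // Illustrative only: a userland <MyImage> that behaves like the
    // built-in <Image> but layers its own (very naive) prefetch cache
    // on top, the way a custom image pipeline might.
    var React = require('react');
    var ReactNative = require('react-native');
    var Image = ReactNative.Image;

    var prefetched = {}; // URIs we have already asked the OS to warm up

    function MyImage(props) {
      var uri = props.source && props.source.uri;
      if (uri && !prefetched[uri]) {
        prefetched[uri] = true;
        Image.prefetch(uri); // warms React Native's native image cache
      }
      return React.createElement(Image, props);
    }

    module.exports = MyImage;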

16
jarjoura 3 days ago 2 replies      
I think apps are in a lull right now because most were abandoned and left users feeling jittery about pouring their lives into them. Also, few apps spent the effort to work offline. If I'm in the subway, I'm basically unable to use anything except games. Although I couldn't play the last couple of games because they were trying to connect to an ad server that would fail, so the game wouldn't progress.
17
ryao 3 days ago 3 replies      
> WebBluetooth (yup, talking to bluetooth devices from JS on a webpage)

This sounds like a great new attack vector for the black hats of tomorrow.

There are just some things that a web browser should not do. Exposing things that previously required a sandbox escape is one of them.

18
rcarmo 3 days ago 0 replies      
I tried to do this a few years back and it was completely impossible to order anything from the Google store and have it delivered to Portugal.

Although I routinely rebuilt Android to reflash my Nook Color and even rebuilt Android x86 for the "Magalhães" school laptops on a lark, I could not beg, borrow or steal an Android device with "proper", vanilla Android for myself without resorting to shady imports and zero warranty.

So after a year of using an HTC One[1] and, later, a moderately vanilla LG 4, I quietly went back to the iPhone, got a Nexus 7 (2013) to scratch my occasional Android development itch, and haven't looked back. The ecosystem is _so_ much better, Safari on it (and my iPad) still knocks Chrome on Android out of the park from a user perspective, and I can tinker all I want on stuff like the Remix PC and the ODROID without having to put up with a lousy phone user experience.

Would I use Android? Yes, for sure - but I wouldn't _like_ it.

Would I develop for it? Sure, no problem. Did that for digital signage, even[2].

Would I develop for it _first_? Doubtful. The only serious money in it is in vertical (B2B) apps and suchlike.

Would I develop web apps for it _first_? Like... are you serious? With the market being what it is?

So although I "get" the article, I think it's not that realistic.

[1]: http://taoofmac.com/space/blog/2013/10/20/2230
[2]: https://github.com/rcarmo/android-signage-client

19
ar0 3 days ago 1 reply      
TLDR: Chrome on Android supports Service Workers and WebRTC while Safari on iOS does not. This means Android these days is better suited for fully-fledged web applications that do not require a native app (or at least a native app wrapper).
20
frobware 3 days ago 2 replies      
Sadly, the web is an accessibility nightmare. If that changes, then sure, I could move too. But there's a lot that modern versions of iOS get right regarding accessibility, stuff that I wish google/android would do too.
21
lucian1900 3 days ago 1 reply      
I like native apps. I'm still annoyed that there's no native desktop Hangouts app, and by how many things Atom gets wrong.
22
nostromo 3 days ago 0 replies      
Putting your development preferences ahead of your customer's preferences is a recipe for failure.
23
64bitbrain 3 days ago 0 replies      
If I ever get an Android phone, it will be from the Nexus series. I had an HTC and I waited ages for the Android 4 update because AT&T didn't have their "customized" version ready, with its bunch of useless apps on it. On the other hand, my friend was able to upgrade to the latest version of Android because he was using a Nexus. I switched to an iPhone and I loved it. Better battery life, and a clean installation.
24
milge 3 days ago 0 replies      
I also went from iOS to Android (iPhone 4s -> Nexus 5X). I had my 4s for 5 years and loved it. I'd still have it if it weren't for iOS 9 being too big to install and Verizon overcharging. I've developed apps on both.

Some apps have to be apps to use sensors and devices built into the phones. A lot of apps could probably get away with being mobile sites. Doubly so with some of the new html technologies being introduced by the W3C.

Because Android and iPhone are owned by companies, they can move fast. The web has to accommodate for many more devices. So web standards move slower. In the time apps have become huge, a lot has been added to web standards. But I'm guessing most people haven't noticed. My guess is people are used to using frameworks and have abstracted themselves away from the basics.

As a challenge to the reader, see what you can build in only JS/HTML/CSS with no server side. You'll be pleasantly surprised by what you can accomplish.
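
One illustrative take on that challenge, as a sketch: the core of a client-only notes "app" in plain JS, persisting through localStorage with no server side at all (the storage key and data shape are invented for the example):

    // Load whatever survived the last visit; no backend anywhere.
    var notes = JSON.parse(localStorage.getItem('notes') || '[]');

    function addNote(text) {
      notes.push({ text: text, at: Date.now() });
      localStorage.setItem('notes', JSON.stringify(notes));
    }

    addNote('survives reloads and flight mode');
    console.log(notes.map(function (n) { return n.text; }));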

25
greatlakes 3 days ago 0 replies      
I think the differentiating factor here is Chrome's push to support and utilize the Service Worker API (https://developer.mozilla.org/en-US/docs/Web/API/Service_Wor...). The opportunity for web apps to have an offline experience and utilize push notifications is not only exciting but game changing for the web platform as a whole.
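
Roughly, the pattern looks like this (a minimal sketch, assuming a worker served at /sw.js and a made-up asset list; cache versioning and error handling are omitted):

    // In the page: register the worker (feature-detected).
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker.register('/sw.js');
    }

    // In sw.js: pre-cache a couple of (hypothetical) assets on install,
    // then answer fetches cache-first so the app keeps working offline.
    self.addEventListener('install', function (event) {
      event.waitUntil(
        caches.open('app-v1').then(function (cache) {
          return cache.addAll(['/', '/app.js']);
        })
      );
    });

    self.addEventListener('fetch', function (event) {
      event.respondWith(
        caches.match(event.request).then(function (hit) {
          return hit || fetch(event.request);
        })
      );
    });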
26
Exuma 3 days ago 4 replies      
I hate any title that begins with "why". Seriously, no one cares. People have opinions. Even seeing "why" makes it certain I will not read your article.

Let me guess the following without even reading...

* several paragraphs of whining about things that are just personal preference.

* making broad/generalized wide sweeping statements and stating them as fact

* tons of rhetorical questions followed by over simplified answers in support of the other product

* dripping with misguided enthusiasm, using lots of words in CAPS and BOLD.

27
t3ra 3 days ago 1 reply      
I am always surprised when people say things like Chrome is bringing X API. Take a look at these HTML5 Web APIs: https://developer.mozilla.org/en-US/docs/Web/API

They have been here for some time now! And Mozilla built a whole operating system around them which has "Progressive Web Apps" at its core!

28
Synaesthesia 3 days ago 0 replies      
Apple have actually always been pretty excellent with Safari performance and features on iOS. They were impressive from the start and have kept pace with Google with regard to JavaScript performance; overall performance has usually been class-leading, rendering too. OK, they're missing WebRTC right now, and workers, but I'm pretty sure WebRTC will come soon, and workers too at some point.
29
viseztrance 3 days ago 1 reply      
Meanwhile, on the desktop, Google Music doesn't work without Flash installed, and there's no desktop client in sight. Great times.
30
krzrak 3 days ago 4 replies      
Sidenote: I checked Google's Project Fi - damn, it's expensive. For $20/month you get unlimited calls and texts, but you have to pay an extra $10 for every 1 GB of data.

Here in Poland for $10 I get the same unlimited calls and texts plus 4 GB of LTE included (and after that you're limited to 1 Mbps - but you can take the $13 plan and get unlimited GBs too).

31
Kjeldahl 3 days ago 2 replies      
Good post. There's one other challenge though, and that is access to native GUI widgets. Just having an app icon and appearing in the task switcher simply isn't enough. It's one of the problems React Native tries to solve, although I have to admit I'm not impressed with it so far. With the momentum JavaScript is having, I wouldn't be surprised if most vendors release "native" JavaScript bindings to their platforms anyway, which hopefully will remove the last missing piece for "native experiences" using JavaScript on iOS and Android (and, for that matter, Windows 10 and OS X).
32
dimillian 3 days ago 1 reply      
Show me one good mobile web app that:
- Is useful
- Works offline
- Is fast
- Doesn't have weird glitches
33
Splendor 3 days ago 0 replies      
If you want to argue that I shouldn't expect my user to have the newest iPhone, you shouldn't also list WebBluetooth as a pro. My user probably doesn't have a device that supports it either.
34
__m 3 days ago 0 replies      
After 7 years of people switching from Android to iOS or vice versa, I stopped reading blog posts about it.
35
ignoramous 3 days ago 2 replies      
I am not entirely sure Android is the best mobile platform out there. Apple continues to innovate at an incredible pace on its hardware and software. It is untouchable as far as HCI is concerned; they just seem to get most of the UX right. It's amazing to see them make computers that work and behave like a charm.

Pricing is unreasonable, TBH. And that's where the Android ecosystem has held the upper edge for too long now. Android as a platform, superior enough technology-wise, is terrible 'fragmentation'-wise. Apple's laser-sharp focus on UX across their entire line-up is commendable. To an extent, they think about their end-users at a level unparalleled at other tech companies: not supporting Flash, pushing aggressively ahead with ad-blocker support, adding a voice-enabled assistant, iCloud, etc. Apple's radical re-think of a smartphone is a miracle. Almost everyone before them got it wrong. They are operating on some other level altogether.

Google, I think except for Google Now and their notifications scheme on Android have mostly been playing a catch-up with iOS.

I think Google faces the same issue with their cloud offerings too. All the talk of the most advanced platform/tech in the world, and they still languish behind AWS and Azure.

36
zanny 3 days ago 0 replies      
I'm actually going to be on topic off topic, but I seriously hope that somehow we have celestial alignment and QML can somehow take off as the defacto networked app standard. HTML/CSS/JS is a document format, styling for said documents, and a language cooked up in a week to bake into a browser in the 90s. And the 90s language is the best part!

QML is ground up meant to write interfaces in, and provide all kinds of critical functionality you would want on everything from mobile to televisions to toasters to your desktop:

* Hardware acceleration everywhere.

* DPI scaling.

* Ability to write controls in native C++ or as composite elements in QML itself.

* Signals and slots throughout all aspects of the framework, instead of callback hell.

* Intuitive and first class animations support.

* Native look and feel on almost all platforms through the Controls API, with the ability to restyle them however you want.

* All aspects of the framework support network transparency. You can associate resources remotely or locally, and all the necessary properties to track loading and downloading are available, and the API handles component loading from web services much more intuitively than HTML script / css loading.

I love QML a lot, and there is even a project called qmlweb to run it in the browser, but I really want to see http://test.org/app.qml be a thing. Having written my share of web applications and QML ones, I have no idea why anyone thinks spreading the design disaster of the traditional web to encompass all user software is the best we can achieve.

37
hackergary 3 days ago 1 reply      
Sounds like someone trying to force web apps to do native apps' jobs, when something like React Native bridges web languages and full native benefits/performance.
38
jonlunsford 3 days ago 1 reply      
I also just made the exact same switch, after years of wanting more control over my hardware without jailbreaking. I just want to install f.lux, for crying out loud! As a web dev, I'm very excited to lose the chains of iOS :)
39
eranation 3 days ago 1 reply      
I'm a Java guy, open source advocate, I love to have "power user" features and I was an android guy since android came out. I recently made a move to iOS (iPhone 6), and I'm not looking back.

It has far fewer features, it's a walled garden and all, I have to learn a new language (or two) to be able to develop apps for it (and pay $100), but the reason I like it so much is that it simply works.

Not just on the software side; my Android devices always had more issues. My Galaxy S III was in for repair at Samsung three times for different reasons; so far with the iPhone I have had no software or hardware issues.

And when my wife had battery issues with her iPhone 5c, instead of taking it for fixing they just gave her a new one on the spot and apologized for the inconvenience.

Simple, do-one-thing-and-do-it-right devices that simply work.

This is a classic "do more with less", less features, nothing too exciting, but the little they have simply works.

40
nilkn 3 days ago 0 replies      
This was a pretty interesting article, and from the title alone I had absolutely no idea this was actually a discussion about the relative merits of web apps and native apps on phones, with the main claim being that we've nearly reached the point where web apps are viable and that Android happens to support this better at the moment. I suspect many others were caught off guard too (and perhaps did not even read it), given how many comments here are just addressing the generic issue of iOS vs. Android and all the drama that comes along with someone emphatically announcing that they've at last switched to the other side.

This is why I think that rhetoric phrased in terms of one camp vs. another is often greatly counter-productive.

41
kdamken 3 days ago 1 reply      
My only issue with iOS is that Safari doesn't play WebM. You have to download files and open them with the VLC app. I wish they would just accept that it's a solid format and adopt it, but I don't see that happening any time soon.
42
minionslave 3 days ago 1 reply      
I just realized that half of the applications I have on my phone can't be used without a data connection.

What happens when I lose signal? The cloud is nice, but I need some offline in my life too.

43
incepted 3 days ago 0 replies      
> Of course I don't know the full backstory, but it sure seemed like the original plan for 3rd party developers on iOS was to have us all just build apps using the web.

Correct, there was no SDK on the first-generation iPhone. It was a closed device, like all Apple devices. And that's how Jobs wanted it; he just thought that the idea of third-party applications running on this device was pure absurdity.

Then Android came out and Jobs had to adapt.

> Apple made what turned out to be a really smart business decision: they released an iOS SDK and an App Store and the rest is history.

Kind of. Apple made a really smart business decision: they realized that if they didn't match Android and provide an SDK as well, they would lose. So they followed suit.

> The end result, for those of us still trying to build installable web apps for iOS was that with nearly every new iOS release some key feature that we were depending on broke.

This makes it sound as if these features got accidentally broken. No, they were intentionally removed or crippled because they either threatened Apple's dominance or cut into their profits. You could call that another set of "really smart business decisions"

44
agentgt 3 days ago 1 reply      
For me "Continuity" [1] (ie phone call on computer if I can't find my iphone) is the killer feature for why I stick with Apple.

I know there is something sort of like it for Android, but someone showed it to me and it didn't really work.

[1]: https://support.apple.com/en-us/HT204681

45
alexkavon 3 days ago 0 replies      
I have to say that I'm all for web apps and web-based apps using things like Cordova or what not, but recently my company's app has been hitting some walls. There are a lot of great things those systems can do and they're great for starting out. However, in the long run you might as well consider developing native or using something like Xamarin (which will probably be even more free soon). Native development just provides a less kludgy way of developing. My company will be making the switch soon in this light.

EDIT: I'd also like to say that the reason it's tough to develop for the web is languages like JavaScript. Sure, it's getting better, very slowly, but it also doesn't really allow for other languages to run in the browser and probably won't in the future. Sure, you can compile, but why compile to JS and use a web view and work around conflicts while developing an app, when you can use a typed language and access APIs that work?

46
Negative1 3 days ago 0 replies      
Great writeup, thank you!

I actually did the opposite. Owned a Nokia "smartphone" when I got my first gen iPhone. Stayed on for 2 more generations then switched to a Samsung Galaxy (reason: wanted to see what this whole Android thing was about).

In every way it was a painful experience but I stuck with it for a few years. When I finally switched back to an iPhone I was like, wow, it just works. Forgot how that felt.

I'm still a fan of Android and believe it does some things so much better. Google Now is actually incredibly cool (too useful to be creepy). Music library management was much better (I miss you so so much N7 player). eBook reading was also better (Moon+ is amazing).

On the other hand, even as a power user (I program Android and iOS apps for a living) it frustrated me to no end. Android is here to stay (which is great) but from a usability (i.e. user friendliness) perspective it still has so much further to go.

47
nevir 3 days ago 0 replies      
Another iOS -> Android switcher here. I owned every iPhone up to and including an iPhone 6, and then switched to a Nexus 6.

From my perspective, the two platforms (and when I talk about Android, I mean Android-on-a-Nexus) are pretty much homogenous. They look and feel very similar, behave similarly, etc, etc.

48
Polarity 3 days ago 1 reply      
I switched to Linux (elementary) last year after years of OS X. Feels good and fast.
49
pawelkomarnicki 3 days ago 0 replies      
For me these two platforms are more-or-less the same from a user perspective. There are some cosmetic differences, like notifications being handled better on Android by bundling them instead of pinging for every single one. Apps are usually on both platforms, same with games. iOS users at my work seem interested and impressed by the Nexus 6P; some consider the possibility of switching someday. But it really doesn't matter as long as the device gets the stuff done, does it? :P
50
abpavel 3 days ago 0 replies      
"I want the ability to create app-like experiences on the OS with web technology.Very little seems to be happening in that regard as far as I can tell"

I'm not sure why this doesn't sound like a compelling reason for a switch. I find the maintenance aspect much more persuasive. Usually people are paid to do sysops: administering, maintaining and tinkering with an OS. It's a job. And handsomely paid at that. Doing the sysops job and having to pay for it, just to maintain your own phone, seems like a bad economical proposition.

51
brotoss 3 days ago 1 reply      
I want to switch back to Android soooooooo so badly. But iMessage is too damn convenient.
52
stcredzero 3 days ago 1 reply      
I don't know whether or not this type of app was actually intended to be the primary mechanism for 3rd party dev to build apps for iOS but regardless

This was basically said by Steve Jobs and Scott Forstall during a WWDC keynote.

53
mladenkovacevic 3 days ago 1 reply      
Been with Android since forever, but the battle for security that Apple has recently been fighting on behalf of their customers is enough to make me want to start considering iOS devices in the future.
54
alexchantavy 3 days ago 0 replies      
It'd be more accurate to title the article "Why Apple needs to treat Progressive Web Apps as first-class citizens".

It's less about iOS-vs-Android than it is about eliminating friction between the web and mobile apps. I enjoyed this analysis very much but if the article wasn't so highly upvoted already I might have skipped over it due to the title.

55
RogueIMP 3 days ago 0 replies      
My first smart phone was an iPhone 4. After switching to the Galaxy S3, I was sold!

Apple is good for users who want a device that is simple, set in its ways and easy to use. It's hard to break. Android makes a phone that has unlimited potential, but at your own risk. As an IT guy, I'm a tinkerer... so I prefer the latter. :) All about preferences.

56
userium 3 days ago 0 replies      
We just today published a UX checklist for iOS and Android apps (https://stayintech.com/info/mobile_ux_checklist). There are some good ideas on this thread that I can later add to that list! Hopefully it's useful for some of you.
57
merpnderp 3 days ago 0 replies      
The point about the full web app experience is a good one. Android's Nexus One left me high and dry on promises of continual updates and pushed me to my iPhone 4s, from which I've never come back. But if iOS doesn't get full support for service workers soon, I'll have to look again at Android.
58
r0m4n0 3 days ago 0 replies      
Interesting opinion. I am struggling to think of a single standalone web app I would benefit from... I use a few native apps that work flawlessly, plus the clean out-of-the-box functionality for phone calls and web browsing. I guess I'll stick with iOS (shrug)
59
ianai 1 day ago 0 replies      
Speaking solely as a user, I don't want to run that sort of application.
60
mayoff 3 days ago 0 replies      
I can see this reasoning being right on the money, if you can be happy building web apps.

Personally, I find that developing with web technologies is a miserable experience, and developing with iOS native technologies is a joy. YMMV.

61
cdevs 3 days ago 0 replies      
He basically lists all the reasons I've stayed away from Android. As a tech toy it's fun to play with, but as an everyday phone it scares the hell out of me security-wise, when every minute they are opening up the attack surface.

So I've been with the iPhone ever since; it opens emails, reads texts and webpages. I barely open the App Store anymore. But I can see your side, as I love writing code and tearing things apart to mod - I just decided my phone wouldn't be one of them.

62
komali2 3 days ago 0 replies      
My only fear is that ultimateGuitarTabs will use these developments to make visiting their site on mobile even more of a hellish experience.
63
jshelly 2 days ago 0 replies      
These statements and arguments are so pointless these days. Use whatever you prefer and be happy.
64
yegle 3 days ago 0 replies      
The author forgot a key point: a web app allows you to reuse the browser's cookies, so users don't need to log in again from your app.
65
SiVal 3 days ago 0 replies      
Apple is just taking a page from Microsoft's old playbook, sabotaging the Web platform in order to prop up the competitiveness of its native platform. When MS owned the dominant platform, they made sure that the Web browser they shipped pre-installed on every Windows machine was always just good enough to claim to be a usable Web browser (or it might have driven people away from Windows), yet always bad enough to make the Web itself look bad compared to Windows. With the biggest benefit of the Web being its reach, anything that could limit the reach of new powers could hold back its spread, and MS did hold back its spread for years.

At the same time Apple, having no leverage from their own native OS of the present, touted their hardware/OS/browser stack as the best way to use the platform of the future--the Web--to make themselves more relevant in a MS-dominated world and sell more hardware. They did a lot of good for the Web platform in the past.

Fast forward to today: the iPhone ignited the explosion of mobile computing and made Windows' dominance of the desktop into being merely the biggest frog in a smaller and smaller pond. MS no longer had a monopoly to defend, repented for its sins, and began to build first-rate, evergreen browsers to stay relevant in the new world. (Competition is a wonderful thing.)

And Apple took their place, not as the monopoly OS in the new, big pond, but as an OS that was a large enough part of it that it could make things "not work on the Web" by making them not work on iOS. They manage to frequently be behind in getting new things to work in Safari (cf: caniuse.com), while being careful not to be so far behind that it affects their reputation with the general public and weakens them in competition with Android, and they prohibit any superior browser from interfering with this delicate "hurt the Web without hurting yourself" strategy by banning all others from iOS.

The result is that anything iOS Safari can't do, Web developers can't use: iOS Safari's shortcomings appear to be the Web's shortcomings, which can be overcome by committing to Apple-proprietary alternatives.

They can't afford to fall too far behind, though, or conventional wisdom will gradually emerge that iOS isn't as good as Android at "Web stuff". And as "Web stuff" improves on other platforms, the Web matters more and more as does your reputation for supporting it.

If developers, blogs, pundits would talk and post about it every time iOS Safari fails, yet again, to support some new Web technology, and even release some features that work nicely on Android/Chrome but require a native app on iOS, "because, you know, the iPhone's Web support is not very good, as everyone knows...", it will increase the pressure on Apple to shift the balance of "good but not too good" farther forward.

66
rachkovsky 3 days ago 0 replies      
How about in-app purchases? Wouldn't it be harder to implement a low-friction flow?
67
Touche 3 days ago 1 reply      
I'm amazed when I go to webdev conferences and see 90% iPhones, including many of the most prominent "javascript celebrities". Then I tell myself that just because you work on something doesn't mean you are passionate about it. I'm passionate about the web and couldn't imagine using an OS where all major features get delivered 4 or 5 years after creation (like IndexedDB was).
68
Zigurd 3 days ago 0 replies      
TL;DR (applicable to all articles in both directions): Apple software quality has gone to crap. Android is an inconsistent mess and I hate $OEM or $CARRIER bloatware.

In fact, both iOS and Android are usable. If you had one or the other issued to you by an employer, it would be fine. The only shocking thing is that there isn't a third and fourth choice with a vibrant device and app ecosystem.

69
agumonkey 3 days ago 0 replies      
All this makes me wonder if we should change the whole idea of device, users, business.
70
exabrial 3 days ago 1 reply      
Galaxy S7 is far superior to any of the iPhones. Take it outside in the rain
71
listingboat 3 days ago 0 replies      
But Android users never update their OS and there are a lot of old Android OS versions to support, correct? Additionally, the device manufacturers control the OS distribution and what's included.
72
daxfohl 3 days ago 0 replies      
Web apps are okay but really there just needs to be a better way of 'using' native apps. A 'yes I want to use you now but no I don't want to install you' button.
73
asai 3 days ago 1 reply      
The web is a patchwork of different frameworks, languages and standards without any clear direction as to where it's heading. Why anyone would want to work with js is also beyond me.
74
Jonasen 3 days ago 0 replies      
Late bloomer, you say? :)
75
brodo 3 days ago 0 replies      
Yay for intellectual diversity!
77
RunawayGalaxy 3 days ago 0 replies      
Didn't need a whole blog post. The necessity of moving files would have been sufficient.
78
wnevets 3 days ago 0 replies      
The fact that Apple has to take so many features from Android speaks for itself.
79
Jerry2 3 days ago 6 replies      
>So, instead of opening my text editor I placed an order for a Nexus 6P

Nexus 6P is notorious for atrocious build quality. It bends more easily than a bar of chocolate. [0] Google should do a recall on these things. It bends a lot more easily than an old iPhone 6 Plus.

[0] https://www.youtube.com/watch?v=r3cWVdLqXCg

Edit: I see Google fanboys decided to downvote this comment instead of engaging in a debate. This is not in the spirit of Hacker News. I know HN has a lot of Google employees who are extremely touchy, but come on.. be objective once in a while

5
Left-pad as a service left-pad.io
934 points by manojlds  3 days ago   253 comments top 55
1
c4n4rd 3 days ago 5 replies      
This is really exciting!!! I was a bit disappointed that right-pad will be out only in 2017. I am looking forward to that release because there is high demand for it now.

What kind of load balancing is being used on the back-end? I called leftpad(str, ch, len) with the length I needed and noticed that it is not very scalable because it is blocking.

A better approach I would recommend to those using it is to call the API in a for loop. In my tests, it had performance very close to what I see in C or assembly.

I was a bit turned off that the free version can only handle strings up to 1024 in length. I know you need to make some money, but it is a big turn-off for a lot of my projects.

Edit: I finally signed up for it but still noticed that I am only allowed to use 1024. I called your customer support line and they said I was calling the API from multiple IP addresses and for that I need an enterprise license. Please help me with this issue; it is very crucial at this point, as my project is at a complete stop because of this.
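(For anyone who wants to play along at home, a minimal client sketch. The str/len/ch parameter names and the {"str": ...} response shape are taken from the curl examples elsewhere in this thread; the function name, defaults, and error handling are my own assumptions.)

    // Hypothetical left-pad.io client; runs in a modern browser console.
    function leftPadAsAService(str, len, ch) {
      var qs = 'str=' + encodeURIComponent(str) +
               '&len=' + encodeURIComponent(len) +
               '&ch=' + encodeURIComponent(ch || ' ');
      return fetch('https://api.left-pad.io/?' + qs)
        .then(function (res) {
          if (!res.ok) throw new Error('left-pad.io returned HTTP ' + res.status);
          return res.json();
        })
        .then(function (body) { return body.str; }); // {"str": "..."}
    }

    leftPadAsAService('foo', 7).then(console.log); // "    foo"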

2
pilif 3 days ago 1 reply      
As a very sarcastic person, I highly approve of this. This absolutely reflects my opinion of all this mess.

Thank you for making this site so that I don't have to write an opinion piece like everybody else seems to have to. Instead, if asked about the issue, I can just point them at this site.

Yes. This isn't constructive, but this mess has so many layers that I really can't point to a single thing and offer a simple fix as a solution.

As such I'm totally up for just having a laugh, especially when it really isn't being nasty against specific people but just laughing about the whole situation.

Thank you to whoever made this

3
faizshah 3 days ago 7 replies      
I don't understand why this community has to have a weekly cycle of bashing different programming communities. Every week there's a new drama thread bashing Java devs, Go devs, Javascript devs, etc. The thing that I come to this community for every week is to read about new developments in our industry; if you don't come here for that, then what are you coming here for?

And wasn't it just a few months ago people were praising the innovation of Urbit for having a 'global functional namespace'? But because it's popular to hate on javascript devs for applying -- sorry, I forgot this was javascript bashing week -- for reinventing concepts from other areas in computer science and software engineering, the HN community has to start hating on another programming community's work.

That said this is a pretty funny satirical page, apologies to the author for venting at the HN community.

4
mschulze 2 days ago 4 replies      
As a Java developer, I am a bit jealous. When people joke about us we usually only get a link to the Spring documentation of AbstractSingletonProxyFactoryBean (or maybe the enterprise hello world), but no one ever wrote that as a service. Maybe someone can do that? https://abstractsingletonproxyfactorybean.io seems to be available!
5
supjeff 3 days ago 3 replies      
6
nogridbag 2 days ago 2 replies      
I'm late to the left-pad discussion. I thought it was considered bad practice to depend on an external repo as part of your build process. At my company we use Artifactory to host our maven libs. Even if one were removed from Maven Central, our builds would continue to work fine (in theory).
7
andy_ppp 3 days ago 24 replies      
Hahaha - isn't it hysterical how everyone is using npm for small reusable code pieces! Aren't they morons! How stupid of people to trust their package manager to be consistent and correct and return the packages they were expecting.

How stupid of people to reuse small often used functions that only do one thing well.

How does everyone taking the piss intend to protect themselves from this in their OS package manager, or PPM or composer or pip?

It's not javascript devs' fault that the standard library is so piss poor that you need these short code snippets, and I've definitely included small 5-10 line packages via npm or other package managers rather than roll my own, because it's likely they have bug fixes I haven't considered. I can also use npm to share these snippets between the many projects I'm building.

* No I wasn't affected by this because I review the packages that I want to include, however the level of smugness here is absolutely ridiculous.

8
icefox 3 days ago 1 reply      
Nice, it is even bug-compatible

http://api.left-pad.io/?str=foo&len=7&ch=12

returns {"str":"12121212foo"} and not {"str":"1212foo"}
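(That quirk falls straight out of the original module's loop; the code below is paraphrased from memory, so treat it as a sketch rather than the canonical source. Each iteration prepends the whole of ch, so a two-character pad overshoots the requested length.)

    function leftPad(str, len, ch) {
      str = String(str);
      ch = ch || ' ';
      var pads = len - str.length;       // 7 - 3 = 4 iterations for "foo"
      while (pads-- > 0) str = ch + str; // prepends all of "12" each time
      return str;
    }

    leftPad('foo', 7, '12'); // "12121212foo" (length 11), matching the API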

9
huskyr 2 days ago 1 reply      
Reminds me of Fizzbuzz enterprise edition: https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...
10
jaxondu 3 days ago 0 replies      
It's 2016! Left-padding without any deep learning algorithm is so lame.
11
gumby 2 days ago 0 replies      
HELP!! The CEO heard about this new service and now my manager told me we need to upgrade all our packages to this new service ASAP! But there's nothing on stack overflow I can use to change our system! I need to get this pilot done STAT so we can plan the migration and send it out for bid!

HELP!!

12
beeboop 3 days ago 3 replies      
Tomorrow: left-pad.io announces $120 million investment at $1.2 billion valuation

Month from now: left-pad announces purchase of $170 million office building in SV to house their 1200 employees

13
jeffreylo 3 days ago 2 replies      
Doesn't work with unicode characters:

# ~ [8:47:18]
$ curl 'https://api.left-pad.io/?str=&len=5&ch=0'
{"str":""}%
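(The character in the parent's command was stripped by this page's encoding, but the failure mode is plausible regardless: assuming the service mirrors a naive JS implementation, astral-plane characters such as emoji count as two UTF-16 code units, so the length arithmetic is off before any padding happens.)

    var s = '\u{1F600}';  // one emoji, a single code point outside the BMP
    s.length;             // 2: .length counts UTF-16 code units
    [...s].length;        // 1: the string iterator walks code points
    // A naive padder computes len - s.length, counting the emoji as two
    // characters, so the visible result comes up short.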

14
andrepd 2 days ago 0 replies      
>`left-pad.io` is 100% REST-compliant as defined by some guy on Hacker News with maximal opinions and minimal evidence.

Wonderful

15
Flott 2 days ago 1 reply      
I'm desperately looking for some feedback from big users.

- Does it scale well?

- Is it pragmatic for long term use scenario?

- Is it Thread safe?

- Does it learn from previous calls made to the API?

- Does it have a modern access layer?

- Does it enforce 2nd factor authentication?

- Is it compatible with Docker containers?

- What about multi-region scenarios?

- Any benchmark available showing usage with AWS + cloudflare + Docker + a raspberry pi as LDAP server?

16
rfrey 2 days ago 2 replies      
I'm very disappointed in the creators' choice of font for their landing page. Practically unreadable, my eyes burned.
17
Schwolop 8 hours ago 0 replies      
This ain't got nothing on Fuck Off as a Service: https://www.foaas.com/
18
maremmano 3 days ago 1 reply      
What about splitting left and pad in two microservices?
19
gexla 2 days ago 0 replies      
Can someone explain to me why I might need this? I checked the site and the documentation is horrible. The site isn't professionally done and there are no videos.

Can I rely on this service being around in 5 years? It just seems like this company might be, you know, a feature rather than a company.

20
yvoschaap2 3 days ago 0 replies      
While a very useful SaaS, I always use the tweet package manager from http://require-from-twitter.github.io/
21
p4bl0 3 days ago 0 replies      
As a friend said on IRC, it's kind of sad that the website is not made with bootstrap.
22
stared 2 days ago 1 reply      
I am waiting for "integer addition as a service" (vide http://jacek.migdal.pl/2015/10/25/integer-addition.html).
23
sansjoe 2 days ago 0 replies      
A programmer is someone who writes code, not someone who installs packages. Do you really need someone else to pad strings for you? Come on.
24
a_imho 2 days ago 0 replies      
My gut feeling tells me serious software engineers who look down on javascript programmers are feeling justified now. Brogrammers are exposed, hence all the knee-jerking. Indeed, it is pretty funny, but dependency management still remains a hard problem.
25
Jordrok 2 days ago 0 replies      
Very nice! Any plans for integration with http://shoutcloud.io/ ? I would love to have my strings both left padded AND capitalized, but the APIs are incompatible. :(
26
venomsnake 3 days ago 0 replies      
I don't think this is enterprise ready. And I am not sure that they are able to scale their service. Left padding is serious business.
27
TickleSteve 3 days ago 1 reply      
Presumably this is using a docker instance on AWS or the like? </sarcasm>

BTW: Well done... nothing like rubbing it in. :o)

28
creshal 3 days ago 1 reply      
I need a SOAP binding for this, because reasons.
29
chiph 3 days ago 1 reply      
Needs more Enterprise. Where are the factory factory builders?
30
danexxtone 3 days ago 1 reply      
Where do I sign up for alpha- or beta-testing for right-pad.io?
31
facepalm 3 days ago 1 reply      
Cool, but it would be more useful if they had an npm module for accessing the service.
32
ChemicalWarfare 2 days ago 1 reply      
BUG! (I think) Using '#' as a 'ch' value pads the string with spaces:

$ curl 'https://api.left-pad.io/?str=wat&len=10&ch=#'

{"str":" wat"}

Please provide github link to fork/submit pr :)
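(Almost certainly not a server-side bug: an unencoded # starts the URL fragment, which curl never sends to the server, so ch arrives empty and the service falls back to its space default. Percent-encoding it should behave; the expected output below is my assumption.)

    encodeURIComponent('#'); // "%23"
    // curl 'https://api.left-pad.io/?str=wat&len=10&ch=%23'
    // expected: {"str":"#######wat"}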

33
bflesch 2 days ago 1 reply      
I get an error

 {"message": "Could not parse request body into json: Unexpected character (\'o\' (code 111)): was expecting comma to separate OBJECT entries\n at [Source: [B@6859f1ef; line: 2, column: 22]"}
when using double quotes. It seems some JSON parsing fails. Not sure if this can be exploited, so I wanted to let you know.

Demo link: https://api.left-pad.io/?str=%22;
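(The Java-flavored stack trace ("[B@6859f1ef", a Jackson-style parse error) hints that the backend splices the raw query string into a JSON document; that is a guess, not something confirmed in this thread. The general fix is to serialize instead of interpolating.)

    var str = '"';                           // the payload from the demo link
    var broken = '{"str": "' + str + '"}';   // '{"str": """}' - invalid JSON
    var safe = JSON.stringify({ str: str }); // '{"str":"\""}' - always parses
    JSON.parse(safe).str;                    // '"'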

34
ritonlajoie 3 days ago 0 replies      
I'm looking for a Left-pad-specialized linux distro. Anyone?
35
rahimnathwani 2 days ago 0 replies      
Do you have any client libraries available for different languages?

I don't want to create a direct dependency between my code and your API. I'd rather create a dependency between my code and your client library's code, as I'm sure you will always keep that up to date with any API changes.

36
cmancini 3 days ago 1 reply      
This is great, but I'll need an API wrapper package.
37
shitgoose 2 days ago 0 replies      
This is fantastic! What is your stack? Are you NoSQL or relational? Redis? What is your test coverage? I am sure you're hiring only trendy developers. I see huge potential in your service; do you accept private investments? I would like to get in now, before Google or YC snatches you! Again, keep up the good work and - can't wait for right-pad next year!
38
talideon 2 days ago 0 replies      
But the question is, is it enterprise-ready? :-)
39
pka 2 days ago 0 replies      
The real discussion is not about package managers, micromodules, or whatever.

It's about "real programmers can write left_pad by themselves" and everybody else just sucks. True scotsmen etc.

Now I don't know why asm people aren't feeling threatened and aren't attacking the C left_pad gurus yet...

40
Mopolo 2 days ago 0 replies      
It would be fun if a company named Left Pad asked to get this domain, like Kik did at the beginning of all this.
41
idiocratic 2 days ago 0 replies      
Are you hiring?
42
dkackman1 2 days ago 0 replies      
SECURITY NIGHTMARE!!!!!!!!!

Without any sort of nonce, this service is trivially susceptible to a replay attack

43
mrcwinn 2 days ago 0 replies      
They didn't even think to version their API. This is total crap.
44
jug 2 days ago 0 replies      
If left-pad.io goes down, will it take the rest of the WWW infrastructure with it? I'm missing a Q&A for important and apparently relevant questions like these.
45
MoD411 3 days ago 0 replies      
Boy, that escalated quickly.
46
jdeisenberg 3 days ago 0 replies      
Have we, or have we not, officially entered the silly season?
47
yyhhsj0521 3 days ago 0 replies      
I wonder whether the author uses leftpad on this site
48
nyfresh 2 days ago 0 replies      
100 times on the board: http://bit.ly/1RzOIK2
49
fallenshell 1 day ago 0 replies      
Will there be a premium plan?
50
sentilesdal 2 days ago 0 replies      
For the graphic designers out there who need left-pad, blue-steel-pad is now available.
51
markbnj 2 days ago 0 replies      
This is awesome. Props to you all.
52
Blackthorn 2 days ago 0 replies      
Wow, so funny. DAE lol javascript?

Ugh, how does this garbage even get upvoted so highly.

53
jschuur 3 days ago 1 reply      
Is it rate limited?
54
justaaron 2 days ago 0 replies      
this is hilarious and timely
55
d0m 2 days ago 1 reply      
I'm ready to get downvoted to hell with this comment but here we go..:

I feel like only non-javascript devs are bashing small modules and NPM. All the great javascript devs I know LOVE that mentality.

Let me offer some reasons why I (as a current Javascript dev having professionally coded in C/C++/Java/Python/PHP/Scheme) think this is great:

- Unlike most other languages, javascript doesn't come with a batteries-included standard library. So you're often left on your own to reinvent the wheel. I mean, come on, in Python you do "'hello'.ljust(10)" but AFAIK there isn't such a thing in javascript. Javascript is more like the wild west where you need to reinvent everything. So having well-tested libraries that do one thing extremely well is really beneficial.

- Javascript, unlike most other languages, has some pretty insane gotchas; e.g. "'0' == 0" is true in javascript (see the sketch after this list). Most devs have been burned so badly in so many ways in Javascript that it's comforting to use a battle-tested library, even for a small feature, rather than reinventing it.

- And anyway, where should we put that function? Most big projects I've worked on have some kind of "helper file" that has 1500 lines, and then at some point different projects start depending on it so no one likes to touch it, etc. So, yeah, creating a new module takes a bit more time, but remember that it's not about the writing time but more about the maintenance time. I'd much rather have lots of small modules with clear dependencies than a big "let's put everything in there" file.

- I feel arguing about whether something should be in a separate module is similar to arguing about whether something should be in a separate function. For me, it's like hearing "Hey, learn how to code, you don't need functions, just write it when you need it." And hey, I've worked on projects professionally where they had no functions and it was TERRIBLE. I was trying to refactor some code by adding functions, and people would copy my functions into their 1500-line file. Let me tell you, I left that company really fast.

- It's fair to say that UNIX has passed the test of time and that the idea of having lots of small programs is extremely beneficial. It forces common interfaces and great documentation. Similar to how writing tests forces you to create a better design, modularizing your code forces you to think about the bigger picture.

- As far as I'm concerned, I really don't care whether a module is very small or very big, as long as what it does is well defined and tested. For instance, how would you test if a variable is a function? I don't know about you but my first thought wasn't:

    function isFunction(functionToCheck) {
      var getType = {};
      return functionToCheck &&
        getType.toString.call(functionToCheck) === '[object Function]';
    }

Who cares if it's a 4-line module? I don't want to deal with that javascript bullshit. Yes, I could copy-paste that into my big helper file, but I'd much rather use one that the javascript community uses and tests.

- Finally, it seems like Node/javascript didn't start out that way. Not so long ago we had Yahoo's monolithic javascript libraries and jQuery. Even the first versions of the most popular node libraries (such as express) were written as monolithic frameworks. But they've been refactored into dozens of small modules with clear functions. And now, other libraries can just import what they need rather than the whole project.
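A minimal illustration of the coercion gotcha flagged in the second bullet above; every line here is standard javascript behavior:

    '0' == 0;  // true: the string is coerced to a number
    '' == 0;   // true: so is the empty string
    '0' == ''; // false: two strings, no coercion, so loose equality isn't transitive
    '0' === 0; // false: strict equality checks the type first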

OK, so I told you about the good things. What about the bad things?

- Adding dependencies to a project is REALLY HARD TO MAINTAIN. I've had so many bad experiences using node because of that. I.e. I work on a project, it's tested and works fine. 2 months later I clone and start the project and everything breaks. Oh, X and Y libraries decided to fuck everything up; that other library now depends on a new version of Node, but I can't upgrade Node because yet another library depends on a previous version of Node. It's complex. I won't go on explaining my solution to this problem, but suffice to say that it's a problem, and installing random amateur libraries in a professional project can lead to disaster.

- It takes longer to code. I touched on that earlier. It's a tradeoff between writing now and maintaining later. Take a look at segmentio's github repo: https://github.com/segmentio. I'd personally love to have that as an onboarding experience rather than some massive project with everything copy/pasted a few times. But yes, it took them more time to create those separate modules.

6
Docker for Mac and Windows Beta docker.com
893 points by ah3rz  3 days ago   235 comments top 58
1
falcolas 3 days ago 9 replies      
The last time I used xhyve, it kernel panicked my mac. Researching this on the xhyve github account [1] showed that it was due to a bug involving VirtualBox. That is, if you've started a virtual machine with VirtualBox since your last reboot, subsequent starts of xhyve panic.

So, buyer beware, especially if said buyer also uses tools like Vagrant.

[1] https://github.com/mist64/xhyve/issues/5

I've said before that I think the Docker devs have been iterating too fast, favoring features over stability. This development doesn't ease my mind on that point.

EDIT: I'd appreciate feedback on downvotes. Has the issue been addressed, but not reflected in the tickets? Has Docker made changes to xhyve to address the kernel panics?

2
tzaman 3 days ago 7 replies      
If I had a yearly quota on HN for upvotes, I'd use all of them on this.

> Volume mounting for your code and data: volume data access works correctly, including file change notifications (on Mac inotify now works seamlessly inside containers for volume mounted directories). This enables edit/test cycles for in container development.

This (filesystem notifications) was one of the major drawbacks of using Docker on Mac for development and a long-time prayer to the development gods before sleep. I managed to get it working with Dinghy (https://github.com/codekitchen/dinghy) but it still felt like a hack.
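(For context, this is the sort of watcher that silently broke on VirtualBox shared folders, since vboxsf doesn't forward inotify events. A minimal sketch using the chokidar package; the path and the rebuild hook are placeholders.)

    var chokidar = require('chokidar');

    // Watch the source tree and trigger a rebuild on every change. Inside a
    // boot2docker/VirtualBox volume mount, 'change' would simply never fire.
    chokidar
      .watch('./src', { ignoreInitial: true })
      .on('change', function (path) {
        console.log('rebuild triggered by ' + path);
      });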

3
wslh 3 days ago 3 replies      
Can someone explain in simple terms how Docker for Windows is different from Application Virtualization products like VMware ThinApp, Microsoft App-V, Spoon, Cameyo, etc.? Also, why does it require Hyper-V to be activated in Windows 10? I found this: https://docs.docker.com/machine/overview/ but I don't understand whether you need separate VMs for separate configurations or whether they have a containerization technology where you are able to run isolated applications on the same computer.
4
darren0 3 days ago 3 replies      
This is an amazing announcement, but... The beta requires a NDA. The source code is also not available. This gives the impression that this will be a closed commercial product and that really takes the wind out of my sails.
5
izik_e 2 days ago 0 replies      
We have been working with hypervisor.framework for more than 6 months now, since it came out, to develop our native virtualization for OS X (http://www.veertu.com). As a result, we are able to distribute Veertu through the App Store. It's the engine for fast virtualization on OS X. And we see now that docker is using it for containers. We wish that Apple would speed up the process of adding new APIs to hypervisor.framework to support things like bridged networking and USB, so everything can be done in a sandboxed fashion, without having to develop kernel drivers. I am sure the docker folks have built their kernel drivers on top of the xhyve framework.
6
_query 3 days ago 4 replies      
If you're using docker on mac, you're probably not using it there for easy scaling (which was the reason docker was created back then), but for the "it just works" feeling when using your development environment. But docker introduces far too much incidental complexity compared to simply using a good package manager. A good package manager can deliver the same "it just works" feeling of docker while being far more lightweight.

I wrote a blog post about this topic a few months ago; check it out if you're interested in a simpler way of building development environments: https://www.mpscholten.de/docker/2016/01/27/you-are-most-lik...

7
rogeryu 3 days ago 3 replies      
> Faster and more reliable: no more VirtualBox!

I'm a Docker n00b and still don't know what it can do exactly. Can Docker replace Virtualbox? I guess only for Linux apps, and I suppose it won't provide a GUI or run Windows to use Photoshop?!

8
rocky1138 3 days ago 1 reply      
"the simplest way to use Docker on your laptop"

I think they forgot about Linux :)

9
nzoschke 3 days ago 1 reply      
Very excited about this. Docker Machine and VirtualBox can be a rough experience.

> Many of the OS-level integration innovations will be open sourced to the Docker community when these products are made generally available later this year.

Does this mean it is closed right now?

10
mwcampbell 2 days ago 1 reply      
Interesting to see that at least one of the Mirage unikernel hackers (avsm) has been working on this.

https://news.ycombinator.com/item?id=11352594

I imagine a lot of this work will also be useful for developers wanting to test all sorts of unikernels on their Mac and Windows machines.

11
totallymike 3 days ago 1 reply      
I'm delighted to read that inotify will work with this. How's fs performance? Running elasticsearch or just about any compile process in a docker-machine-based container is fairly painful.
12
f4stjack 3 days ago 2 replies      
So, let's say I am developing a Java EE app under windows with eclipse and want to use a docker container for my app; how do I go about it?
13
philip1209 2 days ago 2 replies      
Does anybody have any guides on setting up dev environments for code within Docker? I recall a Dockercon talk last year from Lyft about spinning up microservices locally using Docker.

We're using Vagrant for development environments, and as the number of microservices grows - the feasibility of running the production stack locally decreases. I'd be interested in learning how to spin up five to ten docker services locally on OSX for service-oriented architecture.

This product from Docker has strong potential.

14
raesene4 3 days ago 1 reply      
This is v.cool, although for the Windows version it'd be great if it became possible to swap out the virtualization back-end so it's not tied to Hyper-V.

At the moment VMWare Workstation users will be a bit left out as Windows doesn't like having two hypervisors installed on the same system...

15
Lambent_Cactus 2 days ago 9 replies      
Tried to sign up, but the enroll form at https://beta.docker.com/form is blank for me - it just says "Great! We just need a little more info:" but has no forms.
16
mathewpeterson 2 days ago 0 replies      
I'm really excited to see this because I've spent the last few months experimenting with Docker to see if it's a viable alternative to Vagrant.

I work for a web agency and currently, our engineers use customized Vagrant boxes for each of the projects that they work on. But that workflow doesn't scale and it's difficult to maintain a base box and all of the per project derivatives. This is why Docker seems like a no-brainer for us.

However, it became very clear that we would have to implement our own tooling to make a similar environment. Things like resolving friendly domain names (project-foo.local or project-bar.local) and adding in a reverse proxy to have multiple projects use port 80.

Docker for Mac looks like it will solve at least the DNS issue.

Can't wait to try it out.

edit: words

17
evacchi 2 days ago 1 reply      
I wonder if (and hope that!) this fixes the issues[1] with (open)VPN. I can't use xhyve (or veertu) at work because of this.

[1] https://github.com/mist64/xhyve/issues/84

18
alexc05 2 days ago 0 replies      
I cannot wait to get home to play with this!

If I were a 12 year old girl I would be "squee-ing" right now. Ok, I'm lying - I'm a 40 year old man actively Squee-ing over this.

:)

It really plays nicely into my "weekend-project" plans to write a fully containerized architecture based in dotnet-core.

19
_mikz 3 days ago 2 replies      
20
nstart 3 days ago 1 reply      
My goodness. This is some of the best news from docker this year and we are still just getting started. Packaging various hot reloading JavaScript apps will finally be possible. Gosh. I can't begin to say just how excited I am for this.
21
sz4kerto 3 days ago 1 reply      
Can some Docker employee explain how file permissions are going to work on Windows? For me, that's the biggest pain (on Win).
22
numbsafari 3 days ago 0 replies      
I'm really hoping that this will be available via homebrew and not a way to force everyone to use Docker Toolbox or, god forbid, the Mac App Store.

Docker Toolbox just brings back too many nightmares from Adobe's awful Updater apps.

23
alfonsodev 3 days ago 2 replies      
The biggest problem with Boot2docker was volume mounting and file permissions; hope this happens soon.

> Volume mounting for your code and data: volume data access works correctly, including file change notifications (on Mac inotify now works seamlessly inside containers for volume mounted directories). This enables edit/test cycles for in container development
24
pokstad 2 days ago 0 replies      
Funny this appears today, I just discovered Veertu on the Mac App Store (http://veertu.com) 2 days ago and love it. It also uses OS X's new-ish hypervisor.framework feature to allow virtualization without kernel extensions or intrusive installs.
25
jtreminio 3 days ago 0 replies      
I run my stack(s) on Vagrant with Puppet for provisioning. I use OSX, but one of the major pain points of working with Linux VMs on a Windows host is file permission issues and case insensitivity.

I don't think Docker can do anything about case sensitivity, but with this new release will permission differences be handled better?

26
jnardiello 2 days ago 1 reply      
To be entirely honest, I'm quite concerned about the choice of Alpine as the base distro. Its choice of musl over glibc might be cool, but if you have to put old libs inside a container, it's hell (if not entirely incompatible).
27
AsyncAwait 2 days ago 1 reply      
Why does signing up for the beta require agreeing to a non-disclosure agreement?
28
ruipgil 3 days ago 0 replies      
Finally, I really hated the additional complexity and gotchas that boot2docker carried.
29
danbee 2 days ago 1 reply      
I couldn't sign up using Firefox on Windows. I'd enter a username, email and password then the form would just go blank on submission.
30
Grue3 2 days ago 0 replies      
I really want to try this, but I'm unable to register. At the page where it says "Create your free Docker ID to get started", after I click Sign Up the page just refreshes and my chosen ID becomes blank with no indication of what's wrong. I've tried several different IDs and none of them worked. Browser is Firefox 45.0.1 on Windows 7.
31
bradhe 2 days ago 0 replies      
This is amazingly cool. We've been using docker at Reflect (shameless: https://reflect.io) since we started it and even if we didn't have all the cgroups features, it'd be super helpful just to be able to run the stack on my laptop directly instead of having the Vagrant indirection.
32
geerlingguy 3 days ago 1 reply      
Private beta is behind a questionnaire, just FYI. You can't, unfortunately, download anything yet unless you get an invite.
33
slantedview 2 days ago 1 reply      
I've been running docker-machine with a VMWare Fusion VM with VT-x/EPT enabled and am using KVM inside my containers to dev/test cloud software. I'd be interested to know if I can still get the performance of Fusion and the support I need for nested virtualization out of Docker for Mac.
34
newman314 2 days ago 0 replies      
This is strange. I just created a Docker ID and was able to log into the regular hub, but when I try to log into the beta, it keeps saying error.

Is there a user/password length limit? (I used a 30-char user/password. 1password FTW).

35
nikolay 2 days ago 0 replies      
I've always wondered about invites for open-source projects... that don't even open-source...
36
d_sc 2 days ago 0 replies      
This is great news to hear, I've been using a brew recipe that includes: brew install xhyve docker docker-compose docker-machine docker-machine-driver-xhyve to get close to what they're doing in this beta. Really looking forward to trying this out. Signed up for the beta!
37
paukiatwee 3 days ago 1 reply      
If I read correctly, docker for Mac runs on top of another virtualization layer (xhyve, not VirtualBox) and docker for Windows runs on top of Hyper-V, which means that it is not for production workloads (at least on Windows).

So you can only use it for development. And it is closed source. hmmm...

38
mateuszf 2 days ago 2 replies      
When I log in and go to https://beta.docker.com/form there is an empty form and the js console says: Uncaught ReferenceError: MktoForms2 is not defined
39
silvamerica 2 days ago 1 reply      
Will there be an easy way to switch / upgrade from docker-machine with vbox without having to recreate all of my images and containers over again?

I know it's a small thing, but it's kind of a pain sometimes.

40
mrmondo 2 days ago 0 replies      
Thank god for no more VirtualBox; that thing was a pig, with endless networking and IO problems that led every developer using it to come to my team for help.

also, Oracle.

41
girkyturkey 2 days ago 0 replies      
Finally! I've spent the last month or so on Docker learning about it, as I am somewhat new to this environment. I'm just excited to try it out and have a broader range of tools.
42
eggie5 2 days ago 0 replies      
Using docker on a mac always seemed too hackish b/c you had to run a separate VM. This seems like a step in the right direction and I'm excited to visit docker again!
43
rikkus 2 days ago 0 replies      
So on Windows this runs Linux in their isolated environment? I just got excited thinking it meant Windows in Windows but it looks like that's not the case.
44
Titanous 3 days ago 1 reply      
Is the source code available? I don't see it at https://github.com/docker
45
mrfusion 2 days ago 0 replies      
Would this be a good way to deploy a program based on opencv to nontechnical users? So far I haven't found a good way to do that
46
awinter-py 2 days ago 0 replies      
Great news, but I'm not sure a young startup should be wasting money on what was obviously a professionally produced launch video
47
brightball 3 days ago 0 replies      
This is HUGE! Looking forward to trying it out.
48
tiernano 3 days ago 0 replies      
The link says it's Hyper-V on Windows, but then says Windows 10 only... Anyone know if Windows Server is also supported?
49
partiallypro 2 days ago 0 replies      
Kinda surprised they didn't just wait 7 days and announce this at Build with Microsoft.
50
ThinkBeat 2 days ago 1 reply      
I would like to see Windows docker images. Will this ever happen? Or can I do it already?
51
ndboost 3 days ago 0 replies      
shut up and take my money!
52
contingencies 2 days ago 0 replies      
Not-news (support for two new hypervisors implemented, an already dodgy package altered) voted up to 718 points. God, you people are sheep. I guess what we take from this is that docker is getting desperate for headlines.
53
eddd 3 days ago 1 reply      
I'll finally get rid of docker-machine, THANK YOU DOCKER.
54
TheAppGuy 2 days ago 0 replies      
Is this relevant to my app developer community on Slack?
55
Ivan_p 2 days ago 0 replies      
can somebody provide a link for this app? I can't wait anymore! :D
56
howfun 3 days ago 1 reply      
Why would Windows Pro be required?
57
serge2k 2 days ago 0 replies      
still just VMs?
58
pmoriarty 3 days ago 1 reply      
Unfortunately, despite the title, Docker still does not run natively on a Mac or on Windows. It runs only inside a Linux VM.

From the OP:

"The Docker engine is running in an Alpine Linux distribution on top of an xhyve Virtual Machine on Mac OS X or on a Hyper-V VM on Windows"

7
Citus Unforks from PostgreSQL, Goes Open Source citusdata.com
753 points by jamesheroku  3 days ago   152 comments top 28
1
no1youknowz 2 days ago 2 replies      
This is awesome. I have experience with running a CitusDB cluster and it pretty much solved a lot of the scaling problems I was having at the time. For it to go open source now is of huge benefit to the future projects I have.

> With the release of newly open sourced Citus v5.0, pg_shard's codebase has been merged into Citus...

This is fantastic, sounds like the setup process is much simpler.

I wonder if they have introduced the Active/Active Master solution they were working on? I know that before, there was 1 Master and multiple Worker nodes. The solution then was to have a passive backup of the Master.

If, say, they release Active/Active Master later this year, that's huge. I can pretty much think of my DB solution as done at this point.

2
devit 2 days ago 2 replies      
I've been unable to find any clear description of the capabilities of Citus and competing solutions (postgres-x2 seems the other leader).

Which of these are supported:

1. Full PostgreSQL SQL language

2. All isolation levels including Serializable (in the sense that they actually provide the same guarantees as normal PostgreSQL)

3. Never losing any committed data on sub-majority failures (i.e. synchronous replication)

4. Ability to automatically distribute the data (i.e. sharding)

5. Ability to replicate the data instead of or in addition to sharding

6. Transactionally-correct read scalability

7. Transactionally-correct write scalability where possible (i.e. multi-master replication)

8. Automatic configuration, only requiring you to specify some sort of "cluster identifier" the node belongs to

3
exhilaration 2 days ago 3 replies      
4
gtrubetskoy 2 days ago 4 replies      
If anyone from Citus is reading this: how does this affect your business model? I remember when I asked at Strata conf a couple of years ago why your stuff wasn't Open Source, the answer then was "because revenue". So what has changed since then?
5
TY 2 days ago 2 replies      
This is awesome! Tebrikler (congrats) on the release of 5.0 and going OS, definitely great news.

Can you publish competitive positioning of Citus vs Actian Matrix (nee ParAccel) and Vertica? I'd love to compare them side by side - even if it's just from your point of view :-)

6
erikb 2 days ago 0 replies      
Unforking is a very smart decision. Postgres has also gained a lot of favour since MySQL was bought by Oracle. Altogether Citus has earned a lot of kudos for this move, at least with me, for whatever that may count!
7
faizshah 2 days ago 2 replies      
So this sounds similar to Pivotal's Greenplum which is also open source, can anyone compare the two?
8
voctor 2 days ago 1 reply      
Citus can parallelize SQL queries across a cluster and across multiple CPU cores. How does it compare with the upcoming 9.6 version of PostgreSQL, which will support parallelizable sequential scans, parallel joins and parallel aggregates?
9
azinman2 2 days ago 3 replies      
I want it to be called citrus, which is what I always read it as....
10
rkrzr 2 days ago 2 replies      
This is fantastic news! Postgres does not have a terribly strong High Availability story so far, and of course it also does not scale out horizontally. I have looked at CitusDB in the past, but was always put off by its closed-source nature. Opening it up seems like a great move for them and for all Postgres users. I can imagine that a very active open-source community will develop around it.
11
lobster_johnson 2 days ago 0 replies      
This is great!

One thing I'm having trouble with is finding information about transactional semantics. If I make several updates (to differently sharded keys) in a single transaction, will the transaction boundaries be preserved (committed "locally" first, then replicated atomically to shards)? Or will they fan out to different shards with separate begin/commit statements? Or without transactional boundaries at all?

In fact, I can't really find any information on how CitusDB achieves its transparent sharding for queries and writes. Does it add triggers to distributed tables to rewrite inserts, updates and deletes? Or are tables renamed and replaced with foreign tables? I wish the documentation was a bit more extensive.

12
ccleve 2 days ago 1 reply      
I'd very much like to see what algorithm these systems are using to enable transactions in a distributed environment. Are they just using straight two-phase commit, and letting the whole transaction fail if a single server goes down? Or are they getting fancy and doing some kind of replication with consensus?
13
signalnine 2 days ago 0 replies      
Congrats from Agari! We've been looking forward to this and continue to get a lot of value from both the product and the top-notch support.
14
jjawssd 2 days ago 2 replies      
My guess is that Citus is making enough money from consulting that they don't need to keep this code closed source when they can profit from free community-driven growth while they are expanding their sales pipeline through consulting.
15
ahachete 2 days ago 0 replies      
Congratulations, Citus.

Since I heard last year at PgConfSV that you will be releasing CitusDB 5.0 as open source, I've been waiting for this moment to come.

It allows 9.5's awesome capabilities to be augmented with sharding and distributed queries. While this targets real-time analytics and OLAP scenarios, being an open source extension to 9.5 means that a whole lot of users will benefit from this, even under more OLTP-like scenarios.

Now that Citus is open source, ToroDB will add a new CitusDB backend soon, to scale out the Citus way, rather than in a Mongo way :)

Keep up with the good work!

16
BinaryIdiot 2 days ago 0 replies      
I don't have a ton of experience scaling out and using different flavors of PostgreSQL but I had run across Postgres-XL not long ago; does anyone know how this compares to that?
17
ismail 2 days ago 0 replies      
Any thoughts on using something like postgres+citus vs hadoop+hbase+ecosystem vs druid for olap/analytics with very large volumes of data?
18
X86BSD 2 days ago 2 replies      
AGPL? This is dead in the water :( It will never be integrated into PG. What a shame. It should have been a 2 clause BSDL. Sigh.
19
satygeek 2 days ago 3 replies      
Does CitusDB fit OLAP analytical workloads doing aggregations on hundreds of millions of records with varying order and size of dimensions (e.g. druid), with a max 3-second response time, using as few boxes as possible - or do other techniques have to be used along with CitusDB? Can you shed light on your experience with CloudFlare in terms of cluster size and query perf?
20
uberneo 2 days ago 0 replies      
Great product - it would be nice to have an admin interface like RethinkDB's, where you can clearly define your replication and sharding settings. Any documentation around how to do this from the command line?
21
albasha 2 days ago 0 replies      
I recently switched back to MariaDB because I didn't see a clear/easy path for Postgres scalability in case the project I am working on takes off. I am under the assumption there are at least two fairly simple approaches to scaling MySQL: master-master replication using Galera, and Aurora from AWS. What do you guys think? Am I right in thinking MySQL is easier to scale, given that I want to spend the least amount of time maintaining it?
22
Dowwie 2 days ago 0 replies      
Would a natural evolutionary path for startups be to emerge with postgresql and grow to requiring citusdb?
23
onRoadAgain23 2 days ago 5 replies      
Having been burned before, I will never use an OSS infrastructure project that has enterprise features you need to pay for. They always try to move you to paid, and make the OSS version unpleasant to use over time as soon as the bean counters take over to milk you.

"For customers with large production deployments, we also offer an enterprise edition that comes with additional functionality"

24
ioltas 2 days ago 0 replies      
Congrats to all for the release. That's a lot of work accomplished.
25
ksec 2 days ago 0 replies      
Does anyone know how Citus compares to Postgres-XL?
26
lambdafunc 2 days ago 0 replies      
Any benchmarks comparing CitusDB against Presto?
27
Someone 2 days ago 2 replies      
One must thank them for open sourcing this, and cannot blame them for using a different license, but using a different license makes me think calling this "unfork" is bending the truth a little bit.
28
Dowwie 2 days ago 0 replies      
is it correct to compare citusdb with pipelinedb?
8
Julie Rubicon facebook.com
779 points by ISL  4 days ago   130 comments top 51
1
jimrandomh 3 days ago 5 replies      
Post is fiction. Spoilers follow.

.

.

When I started reading this, I didn't realize it was fiction. When I got to the point where the protagonist left the end-date off a query and saw a spike, I thought it was going to be explained by falsified data and lead into a (real) accusation of providing fraudulent metrics to advertisers. It wouldn't be the first such accusation. But it turned out to be a science-fiction story with unexplained time travel in it. Oh well.

2
stygiansonic 3 days ago 2 replies      
Fiction or not, this is similar to an actual story of (ex)-fraud researchers at Capital One[1], who (ab)used their access to credit card transaction data in order to infer whether target companies' quarterly earnings would be below/above expectations, and then traded on that knowledge.

However, in the actual story, there was no "black box" to query, the ex-employees wrote all the complicated queries themselves.

EDIT: They turned ~$150K into ~$2.8 MM USD over about three years, before being caught, mostly through options trades it seems.

1. http://www.bloombergview.com/articles/2015-01-23/capital-one...

3
chatmasta 3 days ago 16 replies      
Usually I read comments on HN before the article, but there were no comments when I saw the link, so I went straight to reading it.

What a bizarre piece. How long did it take everyone else to realize it was fiction? For me, I did not realize until the very end -- and even then I still wasn't sure. It could just as well have been written by some delusional non-technical employee at Facebook.

4
state 3 days ago 4 replies      
Worth noting that Robin is also the author of Mr. Penumbra's 24-Hour Bookstore [1] that others around HN would probably find pretty enjoyable. I thought it was quite fun.

1 - https://en.wikipedia.org/wiki/Mr._Penumbra%27s_24-Hour_Books...

5
maxaf 3 days ago 1 reply      
I'm "technical" and "hands-on", yet I was gullible enough to believe the story, IM-ed it to my wife, and was surprised when she hadn't panicked like I did.

I read way too much science fiction. Back to work now!

Kind of relieved though.

6
asadlionpk 3 days ago 3 replies      
Nice piece. When I saw that first from-future graph, I thought Facebook was faking their data, as they are known for faking page likes.
7
Jacksonb 3 days ago 0 replies      
"Published on the day Julius Caesar was murdered 2060 years ago, after crossing the Rubicon. Julie Rubicon. Nice touch."
8
FlyingLawnmower 3 days ago 0 replies      
I think the HN audience will "see through" this story, but anyone who isn't familiar with the current state of neural nets, or who has read the many pieces foretelling the future of AI, might just find it plausible. I really liked the writing style.
9
kozikow 3 days ago 0 replies      
The first graph immediately looked too fake. A spike after an event would follow something like a log-normal distribution, rather than a sudden spike with everything back to normal the day after. What's more, the world does not use the internet uniformly. The uneventful graph should follow something like a sine wave, with the highest point 2x higher than the lowest.
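(A sketch of the shape being described: a diurnal sine baseline whose peak is about twice its trough, plus a spike that decays over days rather than vanishing overnight. All constants are invented, and exponential decay stands in for the log-normal tail.)

    var series = [];
    for (var h = 0; h < 14 * 24; h++) {                        // two weeks, hourly
      var diurnal = 1.5 + 0.5 * Math.sin(2 * Math.PI * h / 24); // 1.0 to 2.0
      var sinceSpike = h - 7 * 24;                              // event on day 7
      var spike = sinceSpike >= 0 ? 8 * Math.exp(-sinceSpike / 36) : 0;
      series.push(diurnal + spike);                             // ~1.5-day decay
    }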
10
ChuckMcM 3 days ago 1 reply      
It is a fun story and one that comes at the question of "how much data is too much data and too hard to resist?", something that anyone who has worked on a popular Internet-facing application has to come to terms with. I found 'Manna'[1] a more compelling emergent-AI story, but the data-as-predictor aspects of this story have their own particular flavor.

One of the things that immediately flashed it as fiction for me was that the graphs had all the same shape, which if you've ever looked at trend graphs you will see they might all have a similar outlier quality to them but they build and sustain in different ways.

At Blekko (a search engine company) we did an interesting study on query traffic to see if you could "predict" the "hallmark" holidays based on search queries. The idea was that holidays like Valentines Day come up, people start thinking about plans or gifts before that, could we advise an advertiser when the "peak" planning session was so that they could maximize the impact of their advertising spend by focusing it during the peak? And if so what sorts of queries were people making that indicated they were doing holiday planning? The results were mixed. For things like Valentines it was easy, flowers, chocolates, bed & breakfast reservations sort of rose out of the general query stream, St. Patrick's Day? Not so much. But the data peaks all had different shapes appropriate for different levels of impact (Christmas shopping really starts in August among the back to school traffic for the really prepared). So looking at (and for) "interest spikes" like the ones in the story had a bunch of different shapes, some with slow onset and rapid decline, some with rapid onset and rapid decline, and some which were like soft swells on a breezy afternoon at the beach.
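(Not Blekko's actual method, which isn't described here, but a minimal z-score sketch of what "looking for interest spikes" can mean in practice: flag any point sitting several deviations above its trailing window.)

    function findSpikes(series, window, threshold) {
      var hits = [];
      for (var i = window; i < series.length; i++) {
        var past = series.slice(i - window, i);
        var mean = past.reduce(function (a, b) { return a + b; }, 0) / window;
        var sd = Math.sqrt(past.reduce(function (a, b) {
          return a + (b - mean) * (b - mean);
        }, 0) / window);
        if (sd > 0 && (series[i] - mean) / sd > threshold) hits.push(i);
      }
      return hits;
    }

    // findSpikes(hourlyQueryCounts, 24, 3) would return the spike hours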

That said, the dataset made possible by Facebook's chat stream would be even better for those sorts of investigations.

[1] http://marshallbrain.com/manna1.htm

11
minimaxir 3 days ago 1 reply      
The funny thing is that if the implications noted in the story were true and Facebook could accurately forecast events into the far future, building a social network would be the least of their priorities.
12
wallflower 3 days ago 0 replies      
This is some fantastic writing.

And if you are looking for a longer, slightly better fictionalized account of Facebook, "The Circle" by Dave Eggers is a quick, engrossing read and quite hard to tear yourself away from once you begin.

https://en.wikipedia.org/wiki/The_Circle_(Eggers_novel)

13
cavisne 3 days ago 0 replies      
Great piece of fiction, it feels like the author works at Facebook or knows someone who does though.

That said, what's described could possibly be done with existing technology. Could Facebook accurately project forward statements like "next Monday is going to be massive for Volkswagen"? Over time you could weight private messages from people who work for regulators more highly. For more public events like an Apple launch they could predict a spike easily, just from all the media mentions of the date.

14
djsumdog 3 days ago 0 replies      
Haha..it's like a modern campfire ghost story. It's written in a believable manner too. It had me going. I like it. Good job!
15
dynofuz 3 days ago 1 reply      
reminds me of the movie primer: https://en.wikipedia.org/wiki/Primer_%28film%29
16
thomble 3 days ago 0 replies      
This piece scared the hell out of me (seriously, I felt panicked) until I discerned that it was fiction. Great read.
17
kevando 3 days ago 5 replies      
Very entertaining! How did no one realize this was fake by the first graph? Does xkcd design the graphs for fb premium partners?
18
tfgg 3 days ago 0 replies      
Lovecraft does data science?

(the variance on the graphs intuitively felt a bit low, but then again it is facebook)

19
mwcampbell 3 days ago 0 replies      
Reminds me of this interpretation of Minority Report, only with AI instead of human precogs.

http://mjyoung.net/time/minority.html

20
avipars 3 days ago 1 reply      
I think April Fools has come early this year
21
thomasahle 3 days ago 0 replies      
Was I the only one whose first impression was that facebook was somehow involved in insider trading? And that for some reason they were brave enough to also use this information for load balancing...
22
Kevin_S 3 days ago 0 replies      
As someone who is non-technical, I was pretty much mind-blown until the very end. Had a feeling it was fiction haha. Glad the comments here clarified. Entertaining read.
23
mat_jack1 3 days ago 0 replies      
It's funny that I received the notification for the HN500 while I was reading "Mr. Penumbra's 24h bookstore", and noticed it after having finished the book. Clicked the link and the author was Robin Sloan, as in the book :)

Apart from the funny combination I just chime in to recommend the book here, I think most of you here would enjoy that.

24
chrome_x 3 days ago 0 replies      
Even when you feel it's fiction, it just sounds like something I could be reading in tomorrow's newspapers. This was a great read!
25
gizmodo59 3 days ago 0 replies      
Seeing the title along with the domain facebook.com, I couldn't resist clicking. Perfect click bait. Though a good story.
26
Animats 3 days ago 0 replies      
Aw. It's in the tradition of "The Endochronic Properties of Resublimated Thiotimoline", by Isaac Asimov.
27
Mahn 3 days ago 1 reply      
Fiction, but that was entertaining. To the author, you should probably consider writing a book in this style :)
28
lucb1e 3 days ago 0 replies      
So I figured this was too epic to be real, but what part of it is real? Is there a team selling statistics from all posts to advertisers? Because that part I believed without a second thought, and honestly, it does make sense.
29
dekhn 3 days ago 0 replies      
Best version of this is still "Vaxen, my children..." http://www.hactrn.net/sra/vaxen.html read to the end, and check the meaning of the date
30
raverbashing 3 days ago 0 replies      
Nice piece of fiction
31
personjerry 3 days ago 1 reply      
It's disturbing to me how many comments on here assert that the piece is fiction with little to no evidence. It occurs to me that perhaps Facebook is trying to make it seem like it's fiction by posting on various forums!
32
steven2012 3 days ago 0 replies      
Identified as fiction because VW isn't traded on the US stock market.
33
chejazi 3 days ago 0 replies      
Facebook has the data to make a solid prediction market. That should be their new biz model - they'll still exploit our data but they won't be degrading our user experience with ads.
34
EGreg 3 days ago 0 replies      
Did John Titor write this?
35
api 3 days ago 0 replies      
I figured it was fiction by the end, but I couldn't really tell. I can't tell the difference between fiction/satire and reality anymore. The world is too strange.
36
jbscpa 3 days ago 0 replies      
Hari Seldon and Gaal Dornick as devlopers of Psychohistory would approve. (Isaac Asimov's "Foundation" universe)

"Psychohistory is the name of a fictional science in Isaac Asimov's Foundation universe, which combined history, psychology and mathematical statistics to create a (nearly) exact science of the behavior of very large populations of people" source: Wikia

37
Thane2600 3 days ago 0 replies      
I thought the following into existence yesterday and read the post today: that facebook uses users as a layer of abstraction above one brain. A brain of brains. Putting queries (thoughts) into the system (brain) and obtaining a result that is the average of many.
38
bpp 3 days ago 0 replies      
Very enjoyable, and almost plausible...
39
agumonkey 3 days ago 0 replies      
I enjoyed the oracle discovery very much and even more its use on themselves. Cute.
40
ktusznio 3 days ago 0 replies      
This piece was written by Robin Sloan, who is an author of fiction. :) Great story.
41
malkia 3 days ago 0 replies      
Oh, somehow I got a Pollyhop vibe from House of Cards
42
daveheq 3 days ago 0 replies      
Hmm, fact-fic; I wonder who will run with it and make it an urban legend.
43
thadd 3 days ago 0 replies      
It still makes me excited about the future of neural networks :)
44
KerryJones 3 days ago 0 replies      
This could easily be a Black Mirror episode
45
lawnchair_larry 3 days ago 0 replies      
Not a very useful submission title.
46
jkkorn 3 days ago 0 replies      
The big short, Facebook edition.

Entertaining read.

47
wallzz 3 days ago 0 replies      
I really got scared reading this
48
hatsunearu 3 days ago 0 replies      
I thought this was real, but man that's probably the best creepypasta I've read in a while.
49
PSeitz 3 days ago 0 replies      
Obvious fiction. Nice story, but I don't like fiction masked as real.
50
dluan 3 days ago 1 reply      
Enchilada?
51
fsiefken 3 days ago 0 replies      
Who could be the real author if it's fiction? This Robin Sloan might certainly have the means and the motive. Either it is fiction, with a plot which is not too far-fetched, as any company with these kinds of databases can exploit it in the prediction market. "Information is power and currency in the virtual world we inhabit," Billy Idol once said. It's also an important element in Asimov's Foundation series: extrapolating history through psychohistory.

Or it's not entirely fiction or perhaps even factious. I remember Rupert Sheldrake mention in one of his mindboggling talks that he wanted to investigate potential psi effects on a much larger scale and was in talks with Google. If there were 'results' pertaining to precognition (or AI enhanced precognition of the crowd) would the public get to know about it?

9
Require-from-Twitter github.com
698 points by uptown  3 days ago   127 comments top 40
1
michael_storm 3 days ago 3 replies      
This is the Internet of Things for code. This is wonderful.

This is also probably a snarky shot at npm [1], for those who lack context.

[1] https://news.ycombinator.com/item?id=11340510

2
spriggan3 3 days ago 10 replies      

 > "dependencies": { > "babel-preset-stage-0": "6.5.0", > "babel-preset-es2015": "6.6.0", > "babel-runtime": "6.6.1", > "babel-plugin-transform-runtime": "6.6.0", > "babel-cli": "6.6.5", > "babel-core": "6.7.4", > "twit": "2.2.3", > "entities": "1.1.1" > },
The problem is right here. Just to run a script you now need to import a whole third-party language runtime? What other language pulls this kind of stunt? Javascript is madness.

3
yAnonymous 3 days ago 2 replies      
I'm currently talking to investors to start a business around this. Please don't delete it.
4
martin-adams 3 days ago 3 replies      
Maybe to make this more reliable, you should retweet the module first, then require your clone.
5
rburhum 3 days ago 0 replies      
Hi. I don't know how to program (otherwise I would do this myself), but can you port this to Google+ please? My office blocks twitter. Thanks!
6
melvinmt 3 days ago 1 reply      
Why is this not a npm module yet? Name suggestion: kik2
7
cmpolis 3 days ago 0 replies      
Tangentially related (tweet-sized js) and an awesome project: https://t.d3fc.io/ is a collection of d3 visualizations from tweets. The code is cryptic on first inspection, but if you look at the sandbox setup, it starts to make sense, and 140 chars is a wonderful constraint. E.g.: https://t.d3fc.io/status/694991319052103680
8
philmander 3 days ago 1 reply      
Require from stack overflow?

require('how do I prepend spaces to a string')

9
logn 3 days ago 4 replies      
A developer gets upset at unilateral actions by NPM resulting in a project being renamed or taken down unnecessarily (potentially breaking builds). So this dev decides to take down all their projects, as a sort of protest. This breaks a lot of builds. The JavaScript community thinks a clever solution is utilizing Twitter as part of the build process? Because then everything would be dependent on Twitter not adding an "edit tweet" button...
10
aioprisan 3 days ago 3 replies      
Pretty comical. I bet folks would actually use this to some extent, without realizing that Tweets can also be deleted.
11
spotman 3 days ago 0 replies      
the next version should have a require from #hashtag, so that it can be fault tolerant, and would last longer when lawyers request a takedown!
12
ChikkaChiChi 1 day ago 0 replies      
Github could easily solve this problem:

If a repo meets certain criteria in licensing, create the ability to "Static Publish" a release. This feature communicates that the published version is the public's and cannot be removed by the individual or team. Github also enters the agreement that if there is ever a need to "move" the version due to copyright infringement, it provides either aliasing or some sort of notification and time bomb before it goes offline.

If this were done, folks could be in the know about which packages and dependencies are at risk, so they enter into the install with their eyes wide open.

13
m_mueller 3 days ago 0 replies      
> // ES6 leftPad

and he even had space for a comment in there....
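
For reference, a reconstruction of the kind of tweet-sized ES6 left-pad that was going around (not necessarily the exact tweet):

  // ES6 leftPad, comfortably under 140 characters
  const leftPad = (s, n, c = ' ') =>
    (s = String(s), s.length >= n ? s : c.repeat(n - s.length) + s)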

14
franciscop 3 days ago 1 reply      
For everyone who doesn't know it, there's a project called http://140byt.es/ compiling many code snippets that fit in a tweet (;

There was also a clever trick to compress/uncompress ascii text by using base[huge number] or something like that (full unicode) so it could be uploaded to twitter, but I don't remember the exact number

15
jjawssd 2 days ago 0 replies      
Next up: require from bitcoin blockchain
16
cphoover 3 days ago 3 replies      
This would actually be an interesting coding challenge and experiment: to see whether something worthwhile could be built from modules limited to no more than 140 characters.
17
homero 3 days ago 0 replies      
Shouldn't have voted against increasing character count, we could've had a free cdn
18
rcthompson 3 days ago 0 replies      
Make sure you scroll down and read the mandatory disclaimer before commenting on the merits of this approach.
19
0x7fffffff 3 days ago 0 replies      
Well there you go. Problem solved.
20
amptorn 1 day ago 0 replies      
One of the interesting side-effects here is that while it encourages you to write very small packages, it also strictly caps the maximum number of packages which you can add as dependencies of your own package. There's only so many `require` calls you can fit in a Tweet.
21
t1amat 3 days ago 0 replies      
Standardized modules like this are exactly what the node.js-stack bot herding community has needed!

On the plus side: if you saw this dependency in a module you were looking at you would know to think twice.

22
sorenjan 2 days ago 0 replies      
There's also a package manager for it: http://require-from-twitter.github.io/

> require-from-twitter is the core code for the tweet package manager. Our beta version has only one dependency: npm. But we're working hard on adding more dependencies as quick as possible.

23
fallenshell 13 hours ago 0 replies      
Let's host systemd on Twitter.
24
anotherevan 3 days ago 0 replies      
There's still another eight days until April first.
25
olegp 3 days ago 0 replies      
Along similar lines, I made it possible to use NPM packages in the browser without a build step or server: https://github.com/olegp/bpm

More info here: https://meetabit.com/talks/37

26
peterkelly 3 days ago 0 replies      
What we really need is require-from-stackoverflow
27
plugnburn 2 days ago 0 replies      
Why not just use anonymous gists in conjunction with RawGit CDN?

Unique IDs, no way to change or delete (since the gists are anonymous), served right out-of-the-box with a proper content type from cdn.rawgit.com.

28
escoundel 2 days ago 0 replies      
TDD - Twitter Driven Development
29
mooreds 2 days ago 0 replies      
Twitter finally has a business model! Who knew that source code hosting would be the killer app?
30
howeyc 3 days ago 0 replies      
I know this is supposed to be funny, BUT if you vendored and kept a local copy in your build environment, you wouldn't have to worry if the tweet gets deleted.

This is the lesson I see no one talking about.

Of course, using a tweet as a source for a library is silly.

31
amelius 3 days ago 0 replies      
Filesystem interface to Twitter:

http://softwaretechnique.jp/DownLoad/twfs_en.html

This is probably more generic than the project discussed here.

32
andremendes 3 days ago 0 replies      
Well, Twitter staff were saying they'd last at least another ten years; would NPM?
33
ikeboy 3 days ago 0 replies      
For deleted tweets:

On every fetch, submit the tweet to archive.org and archive.is if not already there. If tweet is deleted, fetch from there instead.
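
A sketch of that fallback flow; node-fetch and the Wayback endpoints are the assumptions here (web.archive.org/web/2/<url> redirects to the closest snapshot):

  var fetch = require('node-fetch')

  function fetchTweetWithFallback(tweetUrl) {
    return fetch(tweetUrl).then(function (res) {
      if (res.ok) {
        // Fire-and-forget: ask the Wayback Machine to snapshot it for next time.
        fetch('https://web.archive.org/save/' + tweetUrl).catch(function () {})
        return res.text()
      }
      // Tweet deleted: fall back to the most recent archived snapshot.
      return fetch('https://web.archive.org/web/2/' + tweetUrl).then(function (res2) {
        if (!res2.ok) throw new Error('gone from both Twitter and the archive')
        return res2.text()
      })
    })
  }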

34
bagnus 3 days ago 0 replies      
I'm impressed no one has posted their own version for a different language.
35
nivertech 3 days ago 1 reply      
twitter doesn't have an edit button, but it does have the delete button ;)
36
wallzz 3 days ago 1 reply      
Can someone explain what this is? I really have no idea.
37
progx 3 days ago 0 replies      
Lol YMMD
38
cphoover 3 days ago 0 replies      
funny :)
39
dbpokorny 3 days ago 0 replies      
If you can get a good toolkit for writing a GLR parser, then people will write their own tokenizers and BNF formal grammars and plug them into your parser. It would take a single person about two to six weeks to get something thoroughly polished in JavaScript along the lines of what is described. However, I think that without some form of centralization (perhaps a subreddit? idk) it will be difficult for the standardization and namespace organization process to take place. If it is just one person, there is no question of standardization; if it is multiple people, the question of who is in charge of the namespace becomes relevant. Who is in charge of the namespace in this particular experiment?
40
chris_wot 3 days ago 0 replies      
npm over twitter? A site populated by trolls and spammers. What could possibly go wrong?
10
Google opens access to its speech recognition API techcrunch.com
599 points by jstoiko  3 days ago   167 comments top 42
1
blennon 3 days ago 5 replies      
This is HUGE in my opinion. Prior to this, in order to get near state-of-the-art speech recognition in your system/application you either had to have/hire expertise to build your own or pay Nuance a significant amount of money to use theirs. Nuance has always been a "big bad" company in my mind. If I recall correctly, they've sued many of their smaller competitors out of existence and only do expensive enterprise deals. I'm glad their near monopoly is coming to an end.

I think Google's API will usher in a lot of new innovative applications.
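
For the curious, a sketch of what a synchronous recognition request looks like from Node, using the general shape of the REST API (the limited-preview endpoint at launch may differ; the key, file path, and encoding values are assumptions):

  var fs = require('fs')
  var fetch = require('node-fetch')

  function transcribe(wavPath, apiKey) {
    var body = {
      config: { encoding: 'LINEAR16', sampleRateHertz: 16000, languageCode: 'en-US' },
      audio: { content: fs.readFileSync(wavPath).toString('base64') }
    }
    return fetch('https://speech.googleapis.com/v1/speech:recognize?key=' + apiKey, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body)
    })
      .then(function (res) { return res.json() })
      // Best hypothesis of the first result segment
      .then(function (json) { return json.results[0].alternatives[0].transcript })
  }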

2
CaveTech 3 days ago 6 replies      
> To attract developers, the app will be free at launch with pricing to be introduced at a later date.

Doesn't this mean you could spend time developing and building on the platform without knowing if your application is economically feasible? Seems like a huge risk to take for anything other than a hobby project.

3
zkirill 3 days ago 5 replies      
I came across CMU Sphnix speech recognition library (http://cmusphinx.sourceforge.net) that has a BSD-style license and they just released a big update last month. It supports embedded and remote speech recognition. Could be a nice alternative for someone who may not need all of the bells and whistles and prefers to have more control rather than relying on an API which may not be free for long.

Side note: if anyone is interested in helping with an embedded voice recognition project please ping me.

4
hardik988 3 days ago 1 reply      
Tangentially related: Does anyone remember the name of this startup/service that was on HN (I believe), that enables you to infer actions from plaintext.

Eg: "Switch on the lights" becomes

{"action": "switch_on", "thing" : "lights"}

etc.. I'm trying really hard to remember the name but it escapes me.

Speech recognition and <above service> will go very well together.
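
A toy rule-based sketch of that plaintext-to-action shape (real services use trained NLU models rather than regexes):

  var rules = [
    { re: /(?:switch|turn) on the (\w+)/i, action: 'switch_on' },
    { re: /(?:switch|turn) off the (\w+)/i, action: 'switch_off' }
  ]

  function parse(text) {
    for (var i = 0; i < rules.length; i++) {
      var m = text.match(rules[i].re)
      if (m) return { action: rules[i].action, thing: m[1] }
    }
    return null
  }

  // parse('Switch on the lights') => { action: 'switch_on', thing: 'lights' }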

5
hardwaresofton 3 days ago 0 replies      
In case you're not interested in having google run your speech recognition:

CMU Sphinx:http://cmusphinx.sourceforge.net/

Julius:http://julius.osdn.jp/en_index.php

6
melvinmt 3 days ago 0 replies      
If you're having trouble (like me) to find your "Google Cloud Platform user account ID" to sign up for Limited Preview access, it's just the email address for your Google Cloud account. Took me only 40 minutes to figure that one out.
7
josephcooney 3 days ago 0 replies      
I wrote a client library for this in C# by reverse engineering what chrome did at the time (totally not legit/unsupported by google, possibly against their TOS). I have never used it for anything serious, and am glad now there is an endorsed way to do this.

https://bitbucket.org/josephcooney/cloudspeech

8
jaflo 3 days ago 0 replies      
Pretty impressive from the limited look the website (https://cloud.google.com/speech/) gives: the fact that Google will clean the audio of background noise for you and supports streamed input is particularly interesting.

I don't know I should feel about Google taking even more data from me (and other users). How would integrating this service work legally? Would you need to alert users that Google will keep their recordings on file (probably indefinitely and without being able to delete them)?

9
theseatoms 3 days ago 0 replies      
Key sentence:

> The Google Cloud Speech API, which will cover over 80 languages and will work with any application in real-time streaming or batch mode, will offer full set of APIs for applications to see, hear and translate, Google says.

10
robohamburger 3 days ago 0 replies      
Unless I have gone crazy, Google has had an STT API available to tinker with for a while. It is one of the options for Jasper [1]. Hopefully this means it will be easier to set up now.

Would be nice if they just open-sourced it, but I imagine that is at cross purposes with their business.

[1] https://jasperproject.github.io/documentation/configuration/

11
jonah 3 days ago 0 replies      
SoundHound released Houndify[1], their voice API last year which goes deeper than just speech recognition to include Speech-to-Meaning, Context and Follow-up, and Complex and Compound Queries. It will be cool to see what people will do with speech interfaces in the near future.

[1] https://www.houndify.com/

12
timbunce 3 days ago 1 reply      
FWIW I'd just finished a large blog post researching ways to automate podcast transcription and subsequent NLP.

It includes lots of links to relevant research, tools, and services. Also includes discussion of the pros and cons of various services (Google/MS/Nuance/IBM/Vocapia etc.) and the value of vocabulary uploads and speaker profiles.

http://blog.timbunce.org/2016/03/22/semi-automated-podcast-t...

13
mobiledev88 3 days ago 0 replies      
Houndify launched last year and provides both speech recognition and natural language understanding. They have a free plan that never expires and transparent pricing. It can handle very complex queries that Google can't.
14
amelius 3 days ago 2 replies      
Why isn't speech recognition just part of the OS? Like keyboard and mouse input.
15
vram22 3 days ago 0 replies      
For anyone who wants to try these areas a bit:

My trial of a Python speech library on Windows:

Speech recognition with the Python "speech" module:

http://jugad2.blogspot.in/2014/03/speech-recognition-with-py...

and also the opposite:

http://code.activestate.com/recipes/578839-python-text-to-sp...

16
danso 3 days ago 1 reply      
FWIW, Google followed the same strategy with Cloud Vision (iirc)..they released it in closed beta for a couple of months [0], then made it generally available with a pricing structure [1].

I've never used Nuance but I've played around with IBM Watson [2], which gives you 1000 free minutes a month, and then 2 cents a minute afterwards. Watson allows you to upload audio in 100MB chunks (or is it 10-minute chunks? I forget), whereas Google currently allows 2 minutes per request (edit: according to their signup page [5])...but both Watson and Google allow streaming so that's probably a non-issue for most developers.

From my non-scientific observation...Watson does pretty well, such that I would consider using it for quick, first-pass transcription...it even gets a surprising number of proper nouns correctly including "ProPublica" and "Ken Auletta" -- though fudges things in other cases...its vocab does not include "Theranos", which is variously transcribed as "their in house" and "their nose" [3]

It transcribed the "Trump Steaks" commercial nearly perfect...even getting the homophones in "when it comes to great steaks I just raise the stakes the sharper image is one of my favorite stores with fantastic products of all kinds that's why I'm thrilled they agree with me trump steaks are the world's greatest steaks and I mean that in every sense of the word and the sharper image is the only store where you can buy them"...though later on, it messed up "steak/stake" [4]

It didn't do as great a job on this Trump "Live Free or Die" commercial, possibly because of the booming theme music...I actually did a spot check with Google's API on this and while Watson didn't get "New Hampshire" at the beginning, Google did [4]. Judging by how well YouTube manages to caption videos of all sorts, I would say that Google probably has a strong lead in overall accuracy when it comes to audio in the wild, just based on the data it processes.

edit: fixed the Trump steaks transcription...Watson transcribed the first sentence correctly, but not the other "steaks"

[0] http://www.businessinsider.com/google-offers-computer-vision...

[1] http://9to5google.com/2016/02/18/cloud-vision-api-beta-prici...

[2] https://github.com/dannguyen/watson-word-watcher

[3] https://gist.github.com/dannguyen/71d49ff62e9f9eb51ac6

[4] https://www.youtube.com/watch?v=EYRzpWiluGw

[5] https://services.google.com/fb/forms/speech-api-alpha/

17
ocdtrekkie 3 days ago 1 reply      
"Google may choose to raise those prices over time, after it becomes the dominant player in the industry."

...Isn't that specifically what anticompetition laws were written to prevent?

18
j1vms 3 days ago 3 replies      
I would say that Google's main goal here is in expanding their training data set, as opposed to creating a new revenue stream. If it hurts competitors (e.g. Nuance) that might only be a side-effect of that main objective, and likely they will not aim to hurt the competition intentionally.

As others here have pointed out, the value now for GOOG is in building the best training data-set in the business, as opposed to just racing to find the best algorithm.

19
zkhalique 3 days ago 1 reply      
Has anyone tried adding OpenEars to their app, to prevent having to send things over the internet from e.g. a basement? Is it any good at recognizing basic speech?
20
z3t4 3 days ago 1 reply      
At least offer a self hosted version. Maybe it's just me, but I'm not comfortable sending every spoken word to Google.
21
amelius 3 days ago 3 replies      
Nice. But what I want is open-source speech recognition.
22
szimek 3 days ago 0 replies      
In the sign-up form they state that "Note that each audio request is limited to 2 minutes in length." Does anyone know what "audio request" is? Does it mean that it's limited to 2 minutes when doing real-time recognition, or just that longer periods will count as more "audio requests" and result in a higher bill?

Do they provide a way to send audio via WebRTC or WebSocket from a browser?

23
dominotw 3 days ago 0 replies      
Nice! Curious how it compares to amazon's avs that went public this week.

https://github.com/amzn/alexa-avs-raspberry-pi

24
yeukhon 3 days ago 0 replies      
I thought I read open source, then I realized open access. I believe in the past there was a similar API, or maybe it was based on Google Translate. But I swear at one point people wrote hackathon projects using some voice APIs.
25
Negative1 3 days ago 1 reply      
I would be hesitant to build an entire application that relied on this API only to have it removed in a few months or years when Google realizes it sucks up time and resources and makes them no money.
26
saurik 3 days ago 4 replies      
I think this more directly competes with the IBM Watson speech API, not Nuance?
27
hans 3 days ago 0 replies      
cool, next up is a way to tweak the speech API to recognize patterns in stocks and capex .. wasn't that what Renaissance Technologies did ?

really GooG should democratize quant stuff next .. diy hedge fund algos.

28
zelcon 3 days ago 1 reply      
Great, now when will Google let us use the OCR engine they crowdsourced from us over the last decade with ReCaptcha. tesseract is mediocre.
29
alfonsodev 3 days ago 2 replies      
I'm seeing many libraries mentioned here. I wonder what's the best open, multi-platform software for speech recognition to code with (vim, Atom, etc.). I've only seen a hybrid system working with Dragon + Python on Windows. I would like to train/customize my own system since I'm starting to have pain in my tendons and wrists. Do you think this Google API can make it? Not being local looks like a limiting factor for speed/lag.
30
willwill100 3 days ago 1 reply      
Will be interesting to compare with http://www.speechmatics.com
31
chair-law 3 days ago 1 reply      
What is the difference between a speech recognition API and [NLP libraries](https://opennlp.apache.org/)? This information was not easily found with a few Google searches, so I figured others might have the same question.
32
vincent_s 3 days ago 0 replies      
33
infocollector 3 days ago 1 reply      
What is the best speech recognition engine, assuming one has no internet?
34
flanbiscuit 3 days ago 0 replies      
I hope this opens up some new app possibilities for the Pebble Time. I believe right now they use Nuance and it's very limited to only responding to texts.
35
mysticmode 3 days ago 0 replies      
I'm not sure, what will happen to Google's webspeech API in the future. Whether it will be continued as a free service.
36
sandra_saltlake 3 days ago 0 replies      
Sounds like this is bad news for Nuance,
37
omarforgotpwd 3 days ago 0 replies      
Fuck. Yes. IBM has a similar API as well, as part of their Watson APIs, but I really wanted to use Google's.
38
mark_l_watson 3 days ago 0 replies      
I think they are pushing back against Amazon's Echo speech APIs, which I have experimented with.

I just applied for early access.

39
E4life 3 days ago 0 replies      
Finally, this is something that will be the main way for communication in the future.
40
jupp0r 3 days ago 1 reply      
Anybody got the api docs yet? I wonder if I can stream from chrome via webrtc.
41
braindead_in 3 days ago 0 replies      
How well does this work with conversational speech? Any benchmarks?
42
BinaryIdiot 3 days ago 2 replies      
So this was very, very exciting until I realized you have to be using Google Cloud Platform to sign up for the preview. Unfortunately all of my stuff is in AWS, and while I could move it over, I'm not going to (far too much hassle to preview an API I may not end up using, ultimately).

Regardless, this is still very exciting. I haven't found anything that's as good as Google's voice recognition. I only hope this ends up being cheap and accessible outside of their platform.

11
Privacy Forget Your Credit Card privacy.com
645 points by doomrobo  2 days ago   355 comments top 89
1
soneca 2 days ago 13 replies      
Is this something new? My bank (Itaú, in Brazil) has offered this option for some time now.

Here(in portuguese): https://www.itau.com.br/cartoes/cartao-virtual/

Or am I missing something?

Edit: They launched it in 2002: http://exame2.com.br/mobile/tecnologia/noticias/itau-agora-t...

Edit2: Sounds new in the US. This is not supposed to be a bragging/snarky comment. I'm just genuinely surprised, as innovation usually comes the other way around, from the US to Brazil. So congrats on the launch! Good job; it sounds tough to launch this without being a bank!

2
ac29 2 days ago 6 replies      
In case anyone didn't catch what this actually costs, the answer is: 1.5-2%, which is the rate you could get cash back (or airline miles/etc) with good credit.

Because this service draws directly from your bank account, and takes what would otherwise be your rewards from the credit card fees their banking partners charge, it provides a nice business model for them at the cost of you getting 0% rewards back. Not worth it, in my opinion.

3
boling11 2 days ago 25 replies      
Hey HN - Privacy.com co-founder here. I'm really excited to share what we've been working on for the past year and a half or so.

We've been neck-deep in payments stuff on the card issuing side (getting a BIN sponsor, ACH origination, etc), so happy to answer any questions on that front as well.

P.S. For new users, your first $5 donation to watsi.org is on us :)

4
URSpider94 2 days ago 4 replies      
I think people are over-thinking this offering a little too much. People who are asking if the company will resist a subpoena, or if all customer data will be irreversibly encrypted, are expecting too much.

The main purposes of this product are to be able to mask your marketing data (name, address, phone) to businesses, and to mitigate damage in the event of a data breach (any stolen card numbers are useless).

It's not going to prevent a government entity from subpoena'ing your records and finding out what you've bought. Also, if you're buying anything that needs to be, you know, shipped or emailed to you, you're kinda going to have to give a valid address. Under the default settings, they also include the merchant information in the feed back to your bank, so your bank still gets all of the info on where you're shopping and what you're buying.

Finally, I am very skeptical of their claim about walking away from subscriptions and trials. Sure, in theory, you make it much harder for vendors to track you down, but by law, you're agreeing to pay for the company's services when you accept their agreement, and if they do bother to subpoena your information and come after you, if they find out that you presented them with a fraudulent name, phone number and address, I don't expect that would go well for you in court.

5
tedmiston 2 days ago 2 replies      
My biggest question with Privacy, and of any one-time use credit card numbers service, is always:

Will it affect my rewards? Will businesses still show up unaffected with the same categories on my credit card statement? (I have a travel rewards only card, so breaking the rewards flow is a deal-breaker for using a higher level service.)

Edit: I misunderstood the service as being able to be layered on top of normal credit cards. It looks like the funding source is only bank accounts for now. Still my question remains if building on credit or debit cards is on the roadmap.

Edit 2: They are one-time use numbers, right? "Use at merchants" (plural) seems to possibly imply otherwise.

> What happens when I generate a new Privacy card?

> We'll give you a random 16-digit Visa card number that you can use at merchants that accept Visa debit cards...

Edit 3: It sounds like the business model results in keeping the money that would go to rewards on a normal card.

> How do you make money?

> Every time you spend using a Privacy card, the merchant or website pays a fee (called interchange) to Visa and the issuing bank. This fee is shared with us. We have some premium features planned, but rest assured, our core virtual card product will always be free and we will never sell your personal data.

6
drglitch 2 days ago 5 replies      
Both citi and bankofamerica (and I believe so, but didn't personally use, Wells Fargo) offered this service for free on their CC accounts in mid to late 2000s.

You could set limits per number, have it lock to just single merchant, etc. pretty nifty when paying some wacky merchant online.

All have since shuttered the service because pretty much every CC comes with purchase protection that you can invoke to charge the vendor back in case of something going wrong.

Virtual CCs provide very limited utility in my mind - because the place you're likely to have your CC swiped - a bar or a cab - are still going to use only the legacy plastic version.

7
mirimir 2 days ago 1 reply      
It's an interesting idea. However, I'm not comfortable with a third party having all that information. Some banks issue "corporate" cards, with numerous "employee" cards. I already trust the bank, after all. So what else does Privacy.com provide that's worth the risk? They're still subject to KYC, right? So there's no strong privacy. Or am I missing something?
8
zgubi 2 days ago 1 reply      
@boling11, why does privacy.com need access to my online banking on an ongoing basis, after the initial signup is finished?

I have changed my online banking password after signing up successfully, and I received an email complaining that "Our connection to your bank is broken".

I can understand the need for initially providing my banking credentials for AML/KYC reasons, but I feel uncomfortable with your company continuing to use those after the initial check.

Why can't you just use the routing/account numbers for ACH after the initial signup?

9
__d 2 days ago 7 replies      
I understand why you need it, and I want this service in a big way, but I'm just baulking at giving you my online banking username and password. Why should I trust you with that?
10
orf 2 days ago 4 replies      
> Privacy is PCI-DSS compliant. We are held to the same rigorous security standards as your bank.

I always giggle when I see that.

11
nommm-nommm 2 days ago 1 reply      
"Never forget the cancel one of those pesky 30 day free trials."

This is very misleading to say the least. Not paying for a service doesn't cancel a service. If they tried to bill your card and the card was rejected that doesn't mean the service is cancelled.

12
mtgx 2 days ago 3 replies      
> STEP TWO: When you check out on any website, the Privacy icon will appear in the card form. Click it to create a new card, and auto-fill the card form. Use any name and billing address you like.

> STEP THREEAfter the card is charged, we withdraw the money from your chosen funding account, similar to a debit card.

Not sure I get this. Do you have to fund an account on Privacy.com? So it's like a Paypal where you generate a new payer name every time you pay for some other service with it?

> Sensitive information is encrypted using a split-key encryption with partial keys held by separate employees, meaning no one can decrypt your data; not even us.

Umm. Pretty sure that giving your employees the ability to decrypt my data means that "you" can decrypt it.

13
nommm-nommm 2 days ago 1 reply      
So what happens when I have to return something and they put the money back on the card I used to purchase it?
14
drglitch 2 days ago 0 replies      
Quick question to the founder lurking here - if you're advertising yourself as a credit card and yet you do not extend credit (and use a bank account as the funding source), aren't you misadvertising? If it's just a virtual debit card, you are likely providing far less protection to the consumer than a credit card would.
15
gwintrob 2 days ago 0 replies      
Great company name. How'd you get the domain?
16
mkhalil 2 days ago 1 reply      
I'm in love. Seriously, been waiting for this for soooo long. And the fact that the website supports two factor auth + is SUPER easy to use makes this a double whammy!!! :)

I've been a customer for about 5 minutes, have used it twice, and am already going to recommend it.

edit: I'm quite aware that this has been possible, but both banks/credit cards that I have make me jump through tons of ugly UI and clicks to make it happen.

17
habosa 2 days ago 2 replies      
This is one of those things I have wanted to make so many times and I assumed it would either be technically impossible (card numbers not actually a huge number space) or it would just get marked as fraud.

Excited to see someone giving it a try.

18
jjallen 2 days ago 2 replies      
Wish they explained this better:

"Please ensure this information is accurate. We'rerequired to verify this information against publicrecords. But don't worry, we'll keep it private."

I suppose I'm legally opening a bank account, which has similar requested info as this, but are they checking my credit (probably not, I know, but it makes me uncomfortable)? Will wait a while.

19
electic 2 days ago 3 replies      
I signed up for this. Sadly, it is not what I thought it was and the website does not make it very clear. Basically, this is for online purchases only. To make matters a bit worse, it wants to connect to your real bank account.

What we need here is a physical credit card that I can use in the real world that has a new number on each swipe. Most of my historical fraud has happened because I probably swiped my card at a location that was compromised.

Just my two cents.

20
jcrawfordor 2 days ago 1 reply      
I accept that disabling JavaScript is generally a losing battle, but it specifically irks me when the website of a privacy-centric service is just completely blank if you don't have JavaScript enabled. Of all 30 people out there browsing without JavaScript, it seems like they have an elevated chance of all wanting to learn about this service, and I find myself moderately discouraged from trying it by this issue.
21
cemregr 2 days ago 1 reply      
The email you send to verify the bank comes off as SUPER shady. It reads exactly like a phishing email. It doesn't talk about which site / bank I'm using. Might be worth fixing.

From: Account Management Team <account.management@acctmanagement.com>

....

Thank you for being a valued customer.

Sincerely, Online Banking Team

22
film42 2 days ago 0 replies      
This is super close to the product that I really really really want. The only thing that's missing for me, is that this requires a checking or savings account. When I purchase something with my credit card (most things), it's because I want the rewards program points. With this, I don't get that. If I can't pay with my credit card, then I'm losing money (~$300/yr).

I really want a product that lets me proxy my credit card (and change it when I get a new card). I want a firewall for my credit card.

23
tome 2 days ago 1 reply      
How do they not run out of numbers? According to this random image I found on the internet, each bank has a space of one billion card numbers. If you have ten million customers, say, you're going to run out of those very quickly.

http://www.financetwitter.com/wp-content/uploads/2014/08/Cra...
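
The arithmetic behind that worry, spelled out; the cards-per-customer figure is the assumption that matters:

  // A 16-digit Visa number = 6-digit BIN prefix + 9 account digits + 1 Luhn check digit
  var numbersPerBin = Math.pow(10, 9)
  var customers = 10 * 1000 * 1000     // ten million, per the comment above
  var cardsPerCustomerPerYear = 50     // assumption: heavy burner-card usage
  console.log(numbersPerBin / (customers * cardsPerCustomerPerYear))
  // => 2, i.e. a single BIN runs dry in about two years at that rate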

24
dogma1138 2 days ago 1 reply      
Mastercard has this service in quite a few countries; the downside is that usually they do not offer the same insurance as for the normal credit card, and those cards will not pass an actual credit check. Other issuers, banks, and other organizations (the post office, for example) also offer similar services.

I never really liked these services: they don't really support recurring payments, some of them force you to purchase a card with a specific amount rather than it being valid for a specific transaction, sometimes they have issues with various 3rd-party checks (pre-paid card check, region lock/address verification, fraud, etc.), and more importantly it's not an elegant solution, as you end up with a lot of credit card numbers.

Overall, while this one might have a nice UX, it doesn't really solve a problem that hasn't already been solved either through Paypal or through your own credit card company. I can see all payments on my Amex and Visa cards in the UK, I can check which ones are recurring, I can initiate a chargeback, and for everything else, well, there's Paypal, which offers an even easier UX.

25
mindslight 2 days ago 1 reply      
I like this, especially the repudiating of the privacy-hostile billing name/address voodoo. But I'd worry about forgoing the traditional protection of credit card chargebacks, and having to rely on debit card terms and direct ACH.
26
cordite 2 days ago 2 replies      
The stop-subscriptions aspect really stood out to me. I had to spend 40 minutes on the phone with one darn company to get things canceled, even though I only used the service for one day, for an hour.
27
fuzzywalrus 2 days ago 0 replies      
I'm not sure if I'm ready to hand over personal details to Privacy; there's not much assurance other than "We'll never sell your data to anyone".

Does privacy.com see where I make all my purchases? Is there a collection of my metadata? What assurances do I have that you take personal privacy seriously?

28
r1ch 2 days ago 1 reply      
Any way this works without a browser extension? I'm assuming such an extension has full access to every single page in order to do its job, which is a huge security risk. You don't need to be reading my emails or passwords.
29
rgbrgb 2 days ago 2 replies      
Is this Final without a physical card?

https://getfinal.com/

30
iamleppert 2 days ago 0 replies      
It looks like funding is done via ACH. Does your business operate a credit operation as well, to handle the risk of money being spent when the ACH transaction can't be completed?

I've always wondered about the business side of that...where does the money come from, how is individual debt handled. Do you operate collections? How do you do this without requiring a credit check? etc..

31
Swizec 2 days ago 1 reply      
At first I was really really excited. This is something I've wanted for months if not years.

Then they asked for my bank username and password.

32
bluejekyll 2 days ago 1 reply      
A problem I experienced with temporary card numbers is when you need that credit card number again for a refund (out of stock, wrong thing, returns, etc).

I remember having a lot of trouble with the vendor because of this, so I stopped using them. Does this deal with that in some way?

33
speeder 2 days ago 0 replies      
I wish this was "country-agnostic"

I am from Brazil, and the government sometimes censors online stores, or is just an ass...

Also, many stores have some sort of licensing agreement that excludes Brazil, sometimes with no other way to get some stuff. For example, there is a series of books that I can't legally obtain copies of after Barnes & Noble closed Fictionwise; anyone in my country wanting one of those books must pirate them (they are digital-only, and the stores that sell them are mostly US-only, and a bunch even check your IP or insert DRM that checks your IP).

If this payment service could hide someone's country, I am very sure that in some countries piracy would drop a bit.

34
mfkp 2 days ago 2 replies      
Very useful - my citibank credit card used to have a feature like this many years ago (I believe called "virtual card numbers"), but they got rid of it for some reason.

Though I am more likely to give my personal details to citibank than some startup. Trust is a big issue with payment startups.

35
plugnburn 2 days ago 0 replies      
In Ukraine, Fidobank offers "Shtuka" (Штука, translated as "piece" or, in jargon, "a thousand") debit cards that are attached to a MoneXY account that is in turn attached to a mobile number only. And since prepaid cellular service is mostly anonymous here, you can actually have as many anonymous accounts as you want for about 60 UAH (a bit more than 2 USD) each. And still these are physical MasterCards you can put into your pocket, accepted at any supermarket and also suitable for online transactions.
36
makmanalp 2 days ago 1 reply      
So, my bank in Turkey (Garanti) offered this more than a decade ago - you could make "virtual" cards to use for online transactions, and load them up with a specific amount of money.

This way you didn't need to worry about card numbers being stolen because they were easy to cancel and also didn't have any money in them.

Other cool stuff they did back then: online banking actually had features, and had a 2 factor keyfob. And they had a way where you could SMS people money by sending them a password protected one time code that they could go to any garanti ATM and withdraw cash.

Why are banks in the US so far behind?

37
panabee 1 day ago 0 replies      
in the USA there are about 160M people with credit cards. for a preliminary model, let's assume 10% value privacy or have enough transactions where privacy trumps rewards/protections. assuming the startup captures 50% of this market, that yields 8M users. if the average user spends $1000 per year on private transactions -- this card won't replace all CC transactions, only the ones where privacy trumps rewards/protections -- and the company earns 2% per transaction, the company generates $160M in revenue under these assumptions. obviously the key variables are (a) 8M users and (b) $1000 annual spend.

to size the whole market, look at all 2015 credit + debit purchases and ask yourself what percentage of those would have been made private if some solution made things simple and easy enough. 1%? 5%? 10%?

the potential for private purchases seems promising, esp if they (or someone else) can expand the market by making private purchasing as easy as private browsing.
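
The same model as a few lines of code, so the sensitivity to (a) and (b) is easy to poke at (every input is an assumption from the comment above):

  var usCardholders = 160e6
  var privacyShare = 0.10          // (a) fraction who value privacy
  var captureRate = 0.50           // fraction of those the startup wins
  var annualPrivateSpend = 1000    // (b) USD of private transactions per user/year
  var interchangeTake = 0.02      // 2% per transaction
  var users = usCardholders * privacyShare * captureRate       // 8,000,000 users
  console.log(users * annualPrivateSpend * interchangeTake)    // => 160,000,000 USD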

38
lolobkk 2 days ago 0 replies      
Privacy.com: "This site uses a weak security configuration (SHA-1 signatures), so your connection may not be private."

They're not even using a secure signature for their SSL cert, and they want to be your trusted payment proxy?

39
dcosson 2 days ago 0 replies      
> Never forget to cancel one of those pesky "30 day free trials."

This seems like a bad idea, I'm surprised they're advertising it. I'm pretty sure not being able to charge your card doesn't let you out of a contract you've signed.

I looked into this because I was too lazy to cancel a gym membership once. There are a lot of stories online of a gym sending someone's account to collections because they thought they didn't have to actually cancel it since the credit card expired.

The product still seems useful for one-time purchases though.

40
jeena 2 days ago 0 replies      
My bank in Sweden offers this automatically when you use their website. Not with as nice a UX as this - it is a popup with a Flash app in it - but still good enough to be very usable.

https://translate.google.com/translate?hl=sv&sl=sv&tl=en&u=h...

41
serge2k 2 days ago 0 replies      
Finally, a card for my dial up needs!

Really though, isn't something like the Apple Pay system a better way? You don't risk getting flagged as a prepaid card and rejected, and you aren't giving out your data.

42
pavs 2 days ago 0 replies      
I use Neteller, which does something similar, called virtual cards. You can create multiple cards and assign funds to each virtual card. It's not as smoothly done as this one, but it's the same thing.
43
coryfklein 2 days ago 0 replies      
Shouldn't this service be marketed to credit card companies instead of credit card users? If I get a fraudulent charge on my credit card I can just dispute it and have it removed. What value do I get with privacy.com that I don't already have that is worth the extra fees I have to pay?
44
prohor 2 days ago 1 reply      
Does it work if I live outside US?
45
darksim905 2 days ago 0 replies      
Whoever works on this & put it together / posted this: thank you. I learned a while back that PayPal had something similar but discontinued it. Whatever you have to do to keep this service running & any help you need in spreading the word, I'm willing to help out. This is needed badly for those who are privacy conscious.

Thank you :-)

46
avar 2 days ago 1 reply      
I've been curious as to why the following strategy wouldn't work as a hack as well:

* Your credit card has a balance of $0 on it

* You have some app that allows $NAME to deduct $X from it

* You transfer $X to it earmarked for $NAME for some limited amount of time.

I.e. you could walk into Starbucks, have an app on your phone to say you're depositing $20 into an account earmarked for /starbucks/i for 30 minutes.
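
A sketch of that model as data, entirely hypothetical: a deposit is only spendable by merchants matching its pattern, until it expires.

  var earmarks = []

  function deposit(cents, merchantPattern, ttlMs) {
    earmarks.push({ cents: cents, pattern: merchantPattern, expires: Date.now() + ttlMs })
  }

  function authorize(merchantName, cents) {
    var now = Date.now()
    var match = earmarks.find(function (e) {
      return e.expires > now && e.cents >= cents && e.pattern.test(merchantName)
    })
    if (!match) return false   // decline: no live earmark covers this charge
    match.cents -= cents       // capture against the earmark
    return true
  }

  // deposit(2000, /starbucks/i, 30 * 60 * 1000)  // $20 for Starbucks, 30 minutes
  // authorize('STARBUCKS #1234', 525)            // => true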

47
robotcookies 1 day ago 0 replies      
Doesn't this just shift who gets your information from the credit card company to the company running this?

If it's only intended to prevent identity fraud or data theft, then it's really 'security' more than 'privacy'.

48
efader 2 days ago 0 replies      
Oh the irony: a bank that offers burner-like credit card numbers and pretends not to know the aggregate transactions, under the guise of privacy.

LOL

49
elchief 2 days ago 0 replies      
Looks cool.

Supports TOTP 2FA, HSTS, nosniff, CSP, x-frame-options, xss-protection

A+ ssllabs rating

A securityheaders rating

Some issues:

Some user enumeration issues. I emailed security@privacy.com but it doesn't exist...resent to questions@

I don't like how they ask for your bank's login username and password. I don't feel comfortable giving them that. There must be another way.

Should confirm email address before you can login

50
rilez7 2 days ago 0 replies      
It would be great if this + other fintech services catered to overseas markets. It's understandable why they don't, but as an expat/nomad, centralizing your banking is a huge pain point. This cohort is only going to grow.
51
llamataboot 2 days ago 0 replies      
Here are the list of banks currently supported, to save you a click or two:

Bank of America, Capital One 360, Charles Schwab, Chase, Citibank, Fidelity, Navy Federal Credit Union, PNC Bank, US Bank, USAA Bank, SunTrust, TD Bank, Wells Fargo

52
greenspot 2 days ago 0 replies      
Still, my email used for every transaction will connect the dots. So what's the point?

Awesome domain btw.

53
DanBlake 2 days ago 0 replies      
There are a few of these services and they all look awesome. The issue has always been for me that I value my points/miles more than I value the convenience of not worrying about my credit card # being stolen. If I could do this with my SPG card, I would be all over it.
54
jdc0589 2 days ago 0 replies      
Damn. I've been wanting a service like this for a very long time. Not just for privacy or security, but hopefully so that if my banking or real credit card information changes I could just go to one place to make all my updates.

Looking forward to seeing how it looks.

55
llamataboot 2 days ago 0 replies      
Wondering what the $2k a month spending limit is about? That seems too low to switch all spending to Privacy, but seems like a lot of mental overhead to figure out what I want to use Privacy for and what I don't...
56
Cartwright2 2 days ago 0 replies      
Is it possible to create and verify a PayPal account against one of these cards? This would allow users to have pseudonymous PayPal accounts. It always bothers me when I go to make a donation that I have to give my real name.
57
guico 2 days ago 0 replies      
This exists in Portugal for at least 10 years (in Portuguese): https://www.mbnet.pt/#compras
58
husamia 2 days ago 1 reply      
I like the fact that they have two-factor authentication.
59
nikolay 2 days ago 0 replies      
PayPal had this and killed it - stupid PayPal! Bank of America has this. Discover has this, too. CitiBank has it, too. I really hate not being able to get cash back with Privacy.com, so I probably won't use it.
60
eiopa 2 days ago 0 replies      
ACH only :(

I want to use this, but I don't want to give you full access to my bank account.

61
leemailll 2 days ago 0 replies      
Citi offers this feature, but not sure whether it is for all their credit cards
62
nodesocket 2 days ago 0 replies      
This is awesome, and something I've been thinking about a while. A few concerns though:

$2,000 a month spending limit is too low.

Concern about transactions being declined because they flagged as pre-paid.

63
DavideNL 2 days ago 0 replies      
So instead of giving my data to the companies I buy products from, I'm now giving my data to privacy.com, who then sells it to (unknown) companies?
64
hotpockets 2 days ago 0 replies      
Would there be any way for merchants to accept your cards only? And, hopefully have fees closer to ACH rates, since that seems to be what you are using?
65
agotterer 2 days ago 0 replies      
How does privacy.com ensure you have the funds to pay for the transaction? How do they deal with chargebacks and disputes?
66
leonaves 2 days ago 0 replies      
Love the idea, but I just wanted to shout out the logo. Best logo concept I've ever seen, and the whole branding looks great anyway. Brilliant work.
67
phantom_oracle 2 days ago 0 replies      
Which are the supported financial institutions? Your website has no information about this at all, even after digging through it.
68
secresearch 2 days ago 0 replies      
This is an interesting idea. Citi offers something similar, but this seems a lot more convenient.
69
dawhizkid 2 days ago 0 replies      
Tested on a few websites and immediately blocked.
70
tedmiston 2 days ago 0 replies      
Any plans to make a physical card? Basically the multiple virtual card service you have now but in one card I can use in person, like Coin.
71
o_____________o 2 days ago 1 reply      
"Sorry, no compatible accounts were found. Only checking/savings accounts are compatible."

Inaccurate error, FYI.

72
justplay 2 days ago 0 replies      
My bank also provides this type of virtual credit card, but it is useless. It doesn't work; I tried it on PayPal.
73
ginkgotree 2 days ago 1 reply      
Hey! Such a great idea! Any chance you guys will work with Amex soon? I use my Platinum and Delta cards for everything.
74
strange_quark 2 days ago 0 replies      
So I should give Privacy my bank account information in the name of "security"? No thanks.
75
hdjeieejdj 2 days ago 0 replies      
The issues I have with this are:

1) Only for online purchases, and a limited use case - how many times do I make a purchase online that's not on Amazon, or where I'm not using PayPal?

2) New chip cards already do this for in-store purchases

3) Loss of travel/reward points

76
AznHisoka 2 days ago 0 replies      
What payer name and address does the retailer see when the transaction goes through?
77
chris_wot 2 days ago 1 reply      
Is this for only U.S. customers?
78
juli3n 2 days ago 0 replies      
The is something named e-carte in France, and that is directly powered by banks :)
79
jopython 2 days ago 0 replies      
This feature is offered by BoA. I am still their customer because of this.
80
sandra_saltlake 2 days ago 0 replies      
All the virtual card providers seem to suck on this front.
81
pcarolan 2 days ago 0 replies      
Good idea. Good marketing, even if not new, this needs to happen.
82
StartAppAchill 2 days ago 0 replies      
logged in, authenticated with my bank, got the code, then nothing. Would not accept my code. Could not move forward.
83
AJAlabs 2 days ago 0 replies      
Some banks like Citibank do this as well.
84
subliminalpanda 2 days ago 1 reply      
Are extensions for other browsers planned?
85
homero 2 days ago 0 replies      
Not using it without ach verification
86
kozikow 2 days ago 0 replies      
Any plans to support UK cards?
87
kidsthesedays 2 days ago 0 replies      
why virtual card numbers aren't worth it: http://www.mybanktracker.com/news/why-virtual-credit-card-nu...
88
chris_va 2 days ago 0 replies      
How are disputes settled?
89
StartAppAchill 2 days ago 0 replies      
logged, asdfasf
12
Amazon Provides DIY Echo Plans for Raspberry Pi github.com
445 points by rpdillon  1 day ago   100 comments top 16
1
escobar 1 day ago 5 replies      
I have had reservations about the Echo line because of the whole "always listening" thing, regardless of what anyone's said about how it's not recording, how I can unplug it, etc. The whole "always listening" thing isn't what interests me about playing with Alexa.

As someone who's spent a fair amount of time with hardware, I think this is what will make me tinker with the Alexa service - I am interested to see what it can do and I like keeping up with Amazon's hardware projects. I've got all the parts lying around to throw this together without spending anything, so it's a neat way for them to grab some interest from a different user demographic. This should also be fairly easy to get running on a BeagleBone, which I tend to lean towards (more I/O, the PRU can be useful)

2
dperfect 1 day ago 3 replies      
This may bring me one step closer to my personal "holy grail" of home automation: every room in the house[1] working with seamless voice-activated home automation. This is what I'm ultimately after:

- A cheap device (DIY if possible) in the form factor of a small plug-in unit. Ideally the device itself should be practically "invisible" in each room, and won't require any special home wiring. This is definitely in the realm of possibility for a Raspberry Pi (or similar).

- A microphone for the device that works at least as well as the Echo's far-field mic. I have not been able to find any good options for this, apart from some obscure parts that are too expensive for me to test, let alone buy for every room.

- Software that allows for voice-activated operation. There's probably a suitable workaround for doing this with the Alexa Voice Service now, though it may require more CPU power than is available on the Raspberry Pi.

- Ideally, I could host the voice service myself and wouldn't have to worry about the privacy implications of going through someone like Amazon. I know there are several existing software packages that claim to do this, but none that I've found can match the quality of Echo/Alexa for everyday interaction.

- Audio feedback does not need to be high quality, but at least audible. A small speaker within the device is probably enough. For other areas of the house, it would be nice for the output to be connected to a bluetooth speaker in the room or a home audio system (if available).

The Echo Dot appears to be a pretty close match for this (though I haven't tried it) - at least in terms of functionality, but the form factor still seems a bit off. I'd rather have a self-contained plug-in unit than something that sits on a desk or table.

[1] Or most of the house anyway

3
eiopa 1 day ago 4 replies      
tl;dr:

It's literally a tutorial on configuring Alexa Voice Services + their sample code on Debian.

The way you interact with it is by clicking on a button in a Java app. No trigger phrase like Echo.

4
torbjorn 1 day ago 8 replies      
This is awesome. I am strongly considering setting this up as I just purchased a fresh Raspberry Pi.

The only limitation appears to be you have to click a "start listening" button to get it to start recording audio. You can't simply say "Alexa" to get the raspberry pi + alexa web service to listen for your query.

Anyone have any ideas for a workaround/solution to this?
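
One direction several projects have taken: run a local keyword spotter (CMU Sphinx and Snowboy are the usual suspects) and only open the mic to AVS once the hot word fires. A sketch with placeholder module names, since the official sample is Java and push-to-talk only:

  var record = require('node-record-lpcm16')       // assumption: a mic-capture helper
  var detector = require('some-hotword-detector')  // placeholder for e.g. Snowboy

  detector.on('hotword', function () {
    // The wake word was heard locally; only now does audio leave the device.
    var mic = record.start({ sampleRate: 16000 })
    sendToAlexaVoiceService(mic)  // hypothetical: stream the audio to AVS, play the reply
  })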

5
Implicated 1 day ago 1 reply      
Privacy concerns aside, this is pretty damn cool.

I've been looking for an excuse to tinker with a raspberry pi for a while - this seems like something I could have some fun with then give away to someone less paranoid/concerned with the privacy issues.

6
blacksmith_tb 1 day ago 0 replies      
Nice to see them walking through pretty much everything from getting your RPi running to making it work with AVS. That said, Sam Machin's Python CHIP / RPi client was there first, and has a smaller footprint: https://github.com/sammachin/AlexaCHIP
7
haack 1 day ago 6 replies      
Out of curiosity, does anyone know what Amazon's incentive is to do this?
8
dharma1 20 hours ago 0 replies      
Has anyone done hardware tinkering with the Echo? Does it run Linux? what does the mic array look like? Possible to use just the mic array and pipe the audio elsewhere?
9
danifel 1 day ago 0 replies      
What I think you guys are really looking for is something like this: http://www.microsemi.com/products/audio-processing/home-auto.... Ambarella uses those in their IP camera designs, so it should be straightforward to integrate...
10
daveloyall 1 day ago 0 replies      
Props to Amazon for putting this up. There are hundreds of steps and a lot of it is manual drudgery. 10/10 would hack again.
11
sp332 1 day ago 1 reply      
Does anyone know a way to use the Android Alexa app without buying an Echo device or Fire TV first?
12
regularfry 1 day ago 0 replies      
Has anyone found a decent solution to hooking more than one mic input into an RPi? Something that would allow doing some simple DSP across, say, a 4-input array?
13
brooklyndude 1 day ago 0 replies      
We have 100% totally pivoted on this one. Every proposal we put out now has Echo front and center. When we say "screens" now, it's: you mean like your father/mother used to use? How old school. A screen? Oh boy ... :-)

As Woz says, "bigger than the iPhone." That sounds like a hell of a prediction to me. Woz knows all. :-)

14
jarmitage 1 day ago 1 reply      
Would there be a way to dodge the privacy issues with this, by spoofing the service somehow?
15
Irishsteve 1 day ago 0 replies      
Anyone know how to buy one if based outside the US?
16
newman314 1 day ago 1 reply      
Self-signed cert?

Missed opportunity for Amazon to push Let's Encrypt...

13
An administrator accidentally deleted the production database gliffy.com
529 points by 3stripe  4 days ago   331 comments top 78
1
arethuza 4 days ago 18 replies      
My very first job - ~25 years ago.

Destroyed the production payroll database for a customer with a bug in a shell script.

No problem - they had 3 backup tapes.

First tape - read fails.

Second tape - read fails.

Third tape - worked.... (very nervous at this point).

I think most people have an equivalent educational experience at some point in their careers.

Edit: Had a project cancelled for one customer because they lost the database of test results..... 4 months work! Their COO (quite a large company) actually apologised to me in person!

Edit: Also had someone from Oracle break a financial consolidation system for a billion-dollar company - his last words were "you need to restore from tape" and then he disappeared. I was not happy, as it was his attempts at "improving" things that were the cause of the incident! I wouldn't have been angry if he had admitted he had made a mistake and worked with us to fix it - simply saying "restore from tape" and running away was not a good approach.

2
steven2012 4 days ago 2 replies      
This is what happens when you don't have a disaster recovery plan, or if you have one but never test it out. You need to test your disaster recovery plans to actually know if things work. Database backups are notoriously unreliable, especially ones that are as large as the one this post is talking about. Had they known it would take 2-3 days to recover from a disaster I'm sure they would have done something to mitigate this. This falls squarely on the shoulders of the VP of Engineering and frankly it's unacceptable.

I worked at a company that was like this. My first question when I joined was, "do we have a disaster recovery plan?" The VP of engineering did some hand waving, saying that it would take about 8 hrs to restore and transfer the data. But he also never tested it. Thankfully we never had a database problem but had we encountered one we would have lost huge customers and probably would have failed as a business.

I also worked at a company that specializes in disaster recovery, but our global email went down after a power outage. The entire company was down for 1 day. There were diesel generators but they never tested them and when the power outage occurred they didn't kick in.

Case in point: Test your damn disaster recovery plans!!!

3
Smerity 3 days ago 0 replies      
I was testing disaster recovery for the database cluster I was managing. Spun up new instances on AWS, pulled down production data, created various disasters, tested recovery.

Surprisingly it all seemed to work well. These disaster recovery steps weren't heavily tested before. Brilliant! I went to shut down the AWS instances. Kill DB group. Wait. Wait... The DB group? Wasn't it DB-test group...

I'd just killed all the production databases. And the streaming replicas. And... everything... All at the busiest time of day for our site.

Panic arose in my chest. Eyes glazed over. It's one thing to test disaster recovery when it doesn't matter, but when it suddenly does matter... I turned to the disaster recovery code I'd just been testing. I was reasonably sure it all worked... Reasonably...

Less than five minutes later, I'd spun up a brand new database cluster. The only loss was a minute or two of user transactions, which for our site wasn't too problematic.

My friends joked later that at least we now knew for sure that disaster recovery worked in production...

Lesson: When testing disaster recovery, ensure you're not actually creating a disaster in production.

(repeating my old story from https://news.ycombinator.com/item?id=7147108)

4
Rezo 4 days ago 5 replies      
Treating app servers as cattle, i.e. if there's a problem just shoot & replace it, is easy nowadays if you're running any kind of blue/green automated deployment best practices. But DBs remain problematic and pet-like in that you may find yourself nursing them back to health. Even if you're using a managed DB service, do you know exactly what to do and how long it will take to restore when there's corruption or data loss? Having managed RDS replication for example doesn't help a bit when it happily replicates your latest app version starting to delete a bunch of data in prod.

Some policies I've personally adopted, having worked with sensitive data at past jobs:

- If the dev team needs to investigate an issue in the prod data, do it on a staging DB instance that is restored from the latest backup. You gain several advantages: confidence your backups work (otherwise you only have what's called a Schrödinger's Backup in the biz), confidence you can quickly rebuild the basic server itself (try not to have pets, remember), and an incentive for the dev team to make restores go faster! Simply knowing how long it will take already puts you ahead of most teams, unfortunately.

- Have you considered the data security of your backup artifacts as well? If your data is valuable, consider storing it with something like https://www.tarsnap.com (highly recommended!)

- In the case of a total data loss, is your data retention policy sufficient? If you have some standard setup of 30 days' worth of daily backups, are you sure losing a day's worth of data isn't going to be catastrophic for your business? Personally I deploy a great little tool called Tarsnapper (can you tell I like Tarsnap?) that implements an automatic 1H-1D-30D-360D backup rotation policy for me. This way I have hourly backups for the most valuable last 24 hours, 30 days of daily backups, and monthly backups for a year to easily compare month-to-month data (see the sketch below).
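
For flavor, a minimal cron-based sketch of that tiered rotation using the plain tarsnap CLI (the archive names and paths are hypothetical, and Tarsnapper automates the expiry side that this sketch omits):

 # /etc/cron.d sketch: hourly and daily tarsnap archives ('%' must be escaped in crontab)
 0 * * * *  root  tarsnap -c -f db-hourly-$(date +\%Y\%m\%d\%H) /var/backups/db
 30 3 * * * root  tarsnap -c -f db-daily-$(date +\%Y\%m\%d) /var/backups/db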

Shamless plug: If you're looking to draw some AWS diagrams while Gliffy is down, check out https://cloudcraft.co a free diagram tool I made. Backed up hourly with Tarsnap ;)

5
SimplyUseless 4 days ago 1 reply      
Been there, done that :)

I was once on-call working for one of the leading organizations. I got a call in the middle of the night that some critical job had failed, and due to the significant data load it was imperative to restart the processing.

I log in to the system with a privileged account and restart the job with new parameters. Since I didn't want to see the ugly logs, I meant to redirect the output to /dev/null.

I run the following command:

 ./jobname 1>./db-file-name

and there is -THE DISASTER-

For some reason this kept popping in my head - "Bad things happen to Good people"

We recovered the data but there was some data loss still as the mirror backup had not run.

Of course, we have come a long way since then. Now there is constant sync between Prod/DR and a multitude of offline backups, and recovery is possible for the last 7 days, any month during the year, and the year before.
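
A small shell guard that would have caught that exact slip is noclobber (a sketch; it works in bash, ksh and zsh):

 # with noclobber set, '>' refuses to overwrite an existing file
 set -o noclobber
 ./jobname 1>./db-file-name     # now an error: cannot overwrite existing file
 ./jobname 1>|./db-file-name    # explicit override for when you really mean it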

6
ww520 4 days ago 3 replies      
We've all been there. Shit happens. That's what backup is for.

OT: It's probably bad form to publicly blame someone for it, even if he really was the one who did it. Suffice to say: we screwed up, but we're on our way to recovery. It's better to follow the practice of praising in public and discussing problems in private.

7
dools 4 days ago 7 replies      
This is how I learned about xargs ...

I once typed the following on a client's production mail and web server that basically ran the whole business for about 50 staff, as root, from the root directory:

chmod -R 644 /dirname/ *

I seem to recall the reason was that tab completion put a space at the end of the dirname, and I was expecting there to be multiple files with that name ... anyway the upshot was that everything broke and some guy had to spend ages making it right because they didn't have the non-data parts of the file system backed up.

I learned that whenever you do anything you should:

find . -name "*.whatever you want" | more

then make sure you're looking at expected output, then hit the up arrow and pipe it into xargs to do the actual operation.
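
A minimal sketch of that preview-then-execute workflow (the *.bak pattern here is hypothetical; -print0/-0 guard against spaces in filenames):

 # preview exactly what will be touched
 find . -name "*.bak" | more
 # then recall the command and hand the same list to xargs
 find . -name "*.bak" -print0 | xargs -0 rm --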

8
rlonstein 3 days ago 3 replies      
BTDT. Got the t-shirt. Early in my career...

* Multiple logins to the conserver; downed the wrong system.

* rm -rf in the wrong directory as root on a dev box, get that sick feeling when it's taking too long.

* Sitting at the console before replacing multiple failed drives in a Sun A5200 storage array under a production Oracle DB, a more senior colleague walks up, says "Just pull it, we've got hot spares" and before I can reply yanks a blinking drive. Except we have only two hot spares left and now we have three failed drives. Under a RAID5. Legato only took eight hours to restore it.

* Another SA hoses config on one side of a core router pair after hours doing who knows what and leaves telling me to fix it. We've got backups on CF cards, so restore to last good state. Nope, he's managed to trash the backups. Okay, pull config from other side's backup. Nope, he told me the wrong side and now I've copied the bad config. Restore? Nope, that backup was trashed by some other admin. Spent the night going through change logs to rebuild config.

There were a few others over the years, but all had in common not having/knowing/following procedure, lacking tooling, and good old human error.

9
innertracks 3 days ago 1 reply      
Not long ago I discovered backups don't do any good if you delete them. The incident went down while I was wiping out my hard drive to do a fresh install of Fedora. I believe what happened may have been due to sleep fatigue.

Everything is a bit hazy. At one point in my wandering on the command line I found the mount point for my external backup drive. "What's this doing here?" and decided to remove it.

At some point I woke up in a panic and yanked the usb drive off my laptop. Heart pounding. "Oh shit."

I actually felt like I was going to get sick. Tax records, client contact info, you name it, all gone. Except, basically, the pictures of my kids, mozilla profile, and my resume files.

While I reconstructed some of the missing files, there are a bunch that would be nice to have back. All of the business records, though, have had to be reconstructed by hand. By the next day I realized I really only cared about the pictures of my kids in the end. And those were somehow saved from my blunder.

Work flow change: backup drive is only connected to laptop while backups are being made or restored. Disconnected at all other times. A third backup drive for backups of backups is on the todo list.

10
_spoonman 4 days ago 2 replies      
If that administrator is reading this, chin up ... it happens to the best of us.
11
sqldba 3 days ago 1 reply      
The Enterprise I work for is currently implementing a new idea - where they hire a crack team of generalists - and give them complete and utter unfettered access to production (including databases).

This is despite our databases being controlled by my team and having the best uptime and least problems of anything in the entire business. Networks? Fucked. Infrastructure? Fucked. Storage? Fucked. But the databases roll on, get backed up, get their integrity checks, and get monitored while everyone else ignores their own alarms.

The reasoning for this is (wait for it...) because it will improve the quality of our work by forcing us to write our instructions/changes 3 MONTHS IN ADVANCE for generalists to carry out rather than doing it ourselves. 3 MONTHS. I AM NOT MAKING THIS UP. AND THIS IS PART OF AN EFFICIENCY STRATEGY TO STEM BILLIONS OF DOLLARS IN LOSSES.

Needless to say the idea is fucking stupid. But yeah, some fucking yahoo meddling with the shit I spent my entire career getting right is sure to drop a Production database by accident. I can guarantee it. Your data is never safe when you have idiots in management making decisions.

12
brainbrane 3 days ago 0 replies      
About 15 years ago, my school's electrical engineering lab had a fleet of HP-UX boxen that were configured by default to dump huge core files all over the NFS shares whenever programs crashed. Two weeks before the end of the semester a junior lab assistant noticed all the core files eating a huge chunk of shared disk space and decided to slap together a script to recursively delete files named "core" in all the students' directories.

After flinging together a recursive delete command that he thought would maybe work, he fired it off with sudo at 9:00pm just before heading out for the night. The next morning everyone discovered that all their work over the semester had been summarily blown away.

No problem, we could just restore from backups, right? Oh, well, there was just one minor problem. The backup system had been broken since before the start of the semester. And nobody prioritized fixing it.

Created quite the scenario for professors who were suddenly confronted with the entire class not having any code for their final projects.

They talked about firing the kid who wrote and ran the script. I was asking why the head of I.T. wasn't on the chopping block for failing to prioritize a working backup system.

13
Jedd 4 days ago 1 reply      
https://www.gliffy.com/examples/

First graphic on this page includes a bright red box asking: "Is your data safe online?"

Evidently not a rhetorical question.

14
DennisP 4 days ago 2 replies      
One time the DBA and I were looking at our production database, and one by one the tables started disappearing. Turned out one of the devs had tried out a Microsoft sample script illustrating how to iterate through all the tables in the database, without realizing that the script was written to delete each table.
15
blantonl 4 days ago 1 reply      
If the gentleman who did this loses his job, then those looking for a new sysadmin should definitely give this guy some serious consideration.

Because I guarantee you he'll never, ever, let this happen again.

16
bliti 4 days ago 2 replies      
The official rite of passage that turns anyone into a bona-fide sysadmin. The equivalent of running your production server on debug. D:
17
krzrak 4 days ago 1 reply      
Once I asked a server support guy to move a database from production to dev. He did - without any question or doubt - exactly that: copied the database to the dev environment and deleted it from production. (Note: in my language the word "move" is more ambiguous than in English - it may mean, depending on the context, "move" or "copy".)
18
alistproducer2 4 days ago 0 replies      
Last week I deleted a large portion of our pre-production ldap. I use the jXplorer ldap client, and for some reason the Ctrl-D (delete) confirm dialog defaults to "Ok" instead of cancel. I'm used to hitting Ctrl-F (search) and then Enter to repeat the last search, and when I hit D instead of F I deleted a bunch of stuff. The silver lining is I patched the problem in jXplorer and submitted it. It's my first legit contribution to a project.
19
amelius 4 days ago 2 replies      
In my opinion it is way too easy in Unix to accidentally delete stuff (even for experienced users). Having a filesystem with good (per-user) rollback support is, imho, more than just a luxury.
20
d0m 4 days ago 1 reply      
So.. story time. While at the university, there was that project where we had to create an elevator simulator in C as a way to learn threading and mutexes. All the tmp files were stored in ./tmp/.

In between build/run/debug cycles, I would "rm -fr ./tmp". But once, I did "rm -fr . /tmp". At that time I didn't know any better and had no version control.

I had to redo those 2 weeks in a night, which turned out to be easier than expected considering I had just written the code.

My lessons from that:

 A) Version control, pushed somewhere else.
 B) Use simple build scripts.

21
Yhippa 4 days ago 0 replies      
"Tell me about a time where something didn't go the way you planned it at work."
22
zimpenfish 4 days ago 1 reply      
I've done this - ran out of space on /home for the mSQL database (~1996 era), so I moved it to /tmp, which had plenty free. I suspect most people can now guess which OS this was on and what happened when the machine rebooted some weeks later...

(Hint: Solaris)

23
odinduty 4 days ago 9 replies      
Well, who hasn't done a DELETE without a WHERE clause? ;P
24
orbitingpluto 3 days ago 2 replies      
I was forced to train someone so cocky that they ended up doing a rm -rf / on our production server a month after I quit. He also accidentally euthanized a legacy server, deleted the accounting database when he was trying to do a hardware-based RAID rebuild, completely destroyed the Windows domain server, and mocked my daily tape backup regimen - opting instead to ship consumer-grade USB hard drives to off-site storage in an unpadded steel box... The list goes on. He literally destroyed everything he touched. The only reason he wasn't fired was because he was a pretty man.
25
linsomniac 3 days ago 0 replies      
A dark and snowy night a bunch of databases on a server just vanished. This was on a server that was still in development, but was part of the billing system for a huuuge company, and it was under a lot of scrutiny. The files are just gone. So I contact the DBA and the backup group. For whatever reason, they can't pull it off local backups, so tapes had to be pulled in from Iron Mountain.

As I said above, a dark and snowy night. Took Iron Mountain 4 hours to get the tapes across town. The DBA and I finally get the database up around 8am the next morning. I investigate, but can't find any system reason for the databases vanishing, the DBA can't either.

2 weeks later, the same thing happens.

I eventually track it down to a junior developer who has been logged in and has on several occasions run this: "cd /" followed by "rm -rf /home/username/projectname/ *". Note the space before the star. On further investigation, I find the database group installed all the Oracle data directories with mode 777.

26
fortpoint 4 days ago 0 replies      
Sounds like a terrible situation. I wish those guys luck.

One useful sysops practice is the creation and yearly validation of disaster recovery runbooks. We have a validated catalog of runbooks that describe the recovery process for each part of our infrastructure. The validation process involves provoking a failure (eliminating a master database), running the documented recovery steps, and then validating the result. Validation is a lot easier if you're in the cloud, since it's cheap and easy to set up a validation environment that mirrors your production env.

27
xnohat 4 days ago 3 replies      
Every sysadmin could have a bad day like this :) Some years ago I deleted an entire production server with the very simple command "rm -rf /" instead of "rm -rf ./", while logged in with the root account. No words can explain the feeling at that time. Thanks to backups; without them I would have been killed a thousand times by my customers.
28
3stripe 4 days ago 2 replies      
Posting as a reminder to myself that "in the cloud" != safe

There's always room for computer error and, more likely, human error.

Imagine if something like this happened to Dropbox? Ooooft.

29
cyberferret 3 days ago 0 replies      
I am actually gladdened to read the posts by others on here mentioning how they did the same thing. I've been kicking myself for decades over a similar thing I did when I was starting out as a programmer.

Not as big as some of those here, but back in the late 80's I was a self employed programmer writing DOS apps for local businesses to help them run more efficiently.

There was a local martial arts supply shop whose owner was sort of a friend of mine, and he engaged me to write a stock control and hire database for him, which I did. When it came time to implement, he told me that there was a LOT of data to enter, so he would hire a couple of young students of his to sit down for an entire week and key in the data, which was all good.

After they had finished, he called me back in to 'go live', and I sat down in front of his server PC and began to check that everything was OK. Normally, it is my habit to take a backup of the entire app directory before working on it, but I think I was going through a break up with my then girlfriend and was a little sleep deprived.

I noticed that some temporary indexes had been created during the data entry and I went to quickly delete them (thinking to rebuild all the indexes for best performance), but typed in 'DEL *.DAT' instead of 'DEL *.KEY'.

I still remember that sinking feeling as I sat there looking at the blinking 'C:\>' prompt, knowing I had wiped out all his work. Telling the owner was also one of the hardest things I have done, and I fully expected him to pull down one of the sharp oriental weapons from the wall and take me apart.

But he was really cool and understanding about it. He refused my offer to pay for the students to come back in and re-key the data again, which actually made me feel worse, because I knew he wasn't having the easiest time at that point making ends meet in his business.

End of the day, we got it all working and he used the system for many, many years. But to this day, I still make a copy of anything I am about to touch, before I work on it.

30
jon-wood 4 days ago 0 replies      
I'll join the chorus of people who've done something similar. In my case it was the database of a small e-commerce site, where I'd taken a backup and then formatted the database server to reinstall it.

What I hadn't realised was that the backup script was set to dump to the OS drive, so in the process I'd also just formatted the backup. Thankfully one of our developers had a recent copy of the database locally, but it definitely wasn't my finest hour.

31
Illniyar 3 days ago 0 replies      
I must say that's really the most transparent way to handle downtime I've ever seen.

I would be scared shitless to expose for all to see what really happened and what is happening, even more so when it makes them look like they don't know what they are doing.

I must applaud them for that. I hope that if I ever get into such a nasty situation, I'll be able to do what they did.

32
ghamrick 4 days ago 0 replies      
In prehistoric times on an OS named CTOS, a distributed client/server OS, I was charged with making tape backups of users' local workstations, IVOLing (formatting) the disk, and restoring from tape. The contract specced that 2 tape backups were to be made, but of course, in the interest of expediency, I only made one. And then I encountered the user's tape that wouldn't restore. I remember thinking that losing a user's data is the biggest crime a sysadmin can possibly commit, and it taught me a great lesson on the value of backups and their integrity. Fortunately, I swapped out tape drives like a madman until one managed to restore the tape.
33
jjuhl 3 days ago 1 reply      
This reminds me of something I did at a previous employer (an ISP), many, many years ago.

I needed to do an update in a SQL database to fix some customer issue - the statement should have updated just one row but seemed to take a looong time to run, which seemed strange. When it finished and printed something like "700000 rows updated" I noticed I had forgotten the WHERE clause, and I had also not started a transaction that I could roll back. Whoops!

That's when our support got really busy answering customer phone calls and I started asking who was in charge of our backups.

That was not a good day.

34
creullin 4 days ago 0 replies      
Sucks, but we've all been there. If the admin is reading this, it's all going to be ok! Just remember, life sucks, then you die...
35
shubb 4 days ago 2 replies      
Poor guys. Really interesting reading though.

I initially thought it was weird they had to run several "processes" in case 1 failed. But running out of space or something correctable is actually something likely to happen. Is this standard? It's quite smart.

Anyway, assuming they get the data back, I think they've done pretty well - 0 data loss and a day's downtime isn't bad given this is a true disaster.

It would be nice if they'd let us know in a blog post afterwards how the db got deleted, and what they suggest to mitigate it.

36
ZeWaren 4 days ago 0 replies      
That reminds me of the time I imported the nightly dump of a database TWICE into the same server.

Dropping an entire database brings problems; having duplicate content, and deleted content coming back, brings a whole new realm of other good times.

37
moviuro 4 days ago 2 replies      
My mentor told me: "get everything wrong, but get the backups right", as he was busy debugging the backup solution he had in place at my college (ZFS + NetApp + rsync + sh + perl + tape).

On my own, I'd put CoW wherever possible. It's so easy to delete something on UNIX that it should also be easy to restore, and CoW is without a doubt a no-brainer for this.

38
return0 4 days ago 0 replies      
> We are working hard to retrieve all of your data.

Given that in most cases where a backup exists the user data is not lost, it's a bit unsettling to say that (and also, in most cases admins are not working, they are mostly waiting). It's more reassuring to the user to say "we are verifying that all data is restored correctly" or something similar.

39
novaleaf 4 days ago 1 reply      
My first real job was as a DBA at Microsoft, on a very large marketing database (approx. 1.5TB in 2000).

That experience, and how much work is required for "real" production databases, left a bad taste in my mouth. I stay away from self-hosted DBs to this day. (For example, I use Google Cloud Datastore nowadays.)

40
kchoudhu 4 days ago 0 replies      
I did this to the trading database back in 2008 while supporting the mortgage desk of a major investment bank, a day before Lehman went down.

Thank god for backups and translog replays.

41
tobinharris 3 days ago 0 replies      
In 2002 I accidentally executed

DROP TABLE HOTELS;

whilst working on the Virgin Holidays website. We managed to get it back from backup, but it made me shart.

42
wazoox 3 days ago 0 replies      
Ah, that moment when we needed to copy 1 master disk drive to 80 PCs urgently using Ghost, and my boss said "I'll take care of it, I'm very familiar with Ghost". And with the first PC proceeded to copy the blank disk onto the master.

Problem was: creating the master drive was the job of someone else 1000 km away, with special pieces of tailor-made software... The guy ended up at the airport trying to get someone on a departing plane to courier the disk drive (fortunately for us, some lady accepted; this was still possible in 1998).

43
dkopi 4 days ago 0 replies      
"The good news is that we have copies of our database that are replicated daily, up until the exact point of time when the database was deleted. We are working hard to retrieve all of your data."

Better news would be if every user had local copies of their work too: both in local storage and on a cloud storage provider of their choice. Preferably in a non-proprietary format.

This isn't just about getting me to trust your site if you crash or have a tragic mistake. This is also about getting me to trust your site if you go out of business (as too many startups unfortunately do).

44
linsomniac 3 days ago 0 replies      
I once had a client we were running their office Linux server for. They needed more storage, so they asked me to come in and put in some larger drives on the RAID array. Somehow during this, the old drives freaked out and the data was just gone.

So, we go to the backup tapes. Turns out that something changed in the few years since we set up backups, and the incrementals were being written at the beginning of the tape instead of appending. These were DDS tapes, and there is a header that stores how much data is on the tape, so you can't just go to the end and keep reading.

Now, we had been recommending to them every month for a year or more that a backup audit should be done, but they didn't want to spend the money on it.

They contacted a data recovery company who could stream the data off the tape after the "end of media", and I wrote a letter to go with the tape: "Data on this tape is compressed on a per-file basis, please just stream the whole tape off to disk and I'll take it from there." We overnight it to them and a week later they e-mail back saying "The tape was compressed, so there is no usable data on it." I call them up and tell them "No, the compression re-starts at every file, so overwriting the beginning is fine, we can just pick up at the next file. Can you just stream it off to disc?" "Oh. Welllll, we sent the tape back to you, it should be there in a week." They shipped it ground. We shipped it back, they did the recovery, and we got basically all the data back.

45
hiperlink 3 days ago 0 replies      
~20 years ago I was working for a relatively small banking software company in Hungary (it was a really good job from the learning point of view, but really underpaid).

One Monday afternoon one of our clients called to say that the bank's officers suddenly couldn't log in, random strange errors were getting displayed for them, etc.

OK, our support team tried to check; we couldn't log in either, strange error.

"Did you do anything special, [name of the bank's main sysadmin]?"

"Well, nothing special, I just cleaned up the disks as usual."

"How did you do it?"

"As usual: 'mc', sort by file size in the INTERFACE/ folder, marked the files and F8".

That's normal.

OK, since we had the same user account (I knoooow), launch 'mc'. Looks normal. Except... In the left panel the APP/DB directory is opened... Check... Appears normal... At first... But... WAIT. Where is the <BANKNAME>.DB1 file?

"<ADMIN>, how long time did it take?"

"Dunno, I went for my coffee, etc."

Apparently he had deleted the production system's main DB file. It got resolved by restoring the backup from Saturday; every file and input transaction had to be re-entered based on the printed receipts, the officers stayed late into the night, etc. He is still the head of IT at the same bank. (Yeah, everyone makes mistakes; it wasn't his only one, but likely the biggest.)

46
aNoob7000 4 days ago 0 replies      
I would really love to get more detail about how they structured the full backups and transaction log backups for the database. Are the backups dumped to disk before being picked up on tape? Or are the backups streamed directly to the backup system?

I'd also love to know how large the deleted database is. Doing a point-in-time restore of a database that's a couple of hundred gigs should be relatively fast (depending on what hardware you are running on).

47
peterwwillis 3 days ago 2 replies      
Serious question: Do modern "all in the cloud" tech companies actually have DR plans?

All the presentations I've seen about people deploying in the cloud leave out any DR site, replication process, turnover time for the DR site taking production traffic, etc. It's like they believe redundant machines will save them from an admin accidentally hosing their prod site and having to take 3+ days to recover.

48
jestar_jokin 3 days ago 0 replies      
Earlier in my career, I worked in prod support for an insurance web application. It had a DB containing reference data. This reference data was maintained in an Excel spreadsheet; a macro would then spit out CSV files, which would be used by command line scripts to populate databases in different environments (test, staging, pre-production). The DB data was totally replaced each time. Pre-production data would be copied into production, every night or so.

One time, I ran the staging and pre-production scripts at the same time. This had the unusual effect of producing an empty CSV file for pre-production.

When I got in the next day, I discovered all of the production data had been wiped out overnight...

Thankfully, it was all reference data, so it was just a matter of re-running the export macros, and pleading with a DBA to run the data import job during business hours.

I ended up writing a replacement using generated SQL, so we could apply incremental updates (and integrate better with a custom ticketing system).

49
girkyturkey 4 days ago 0 replies      
My first internship used Google Drive for their database (small startup), and there were numerous times where I almost lost a substantial amount of work/information. This article brought back that feeling of anxiety. But that is a lesson to be learned, even if it was the hard way. Everyone goes through that at some point in their career.
50
okket 4 days ago 0 replies      
These days it should be possible to roll back a few steps (~15 min / 1 hour) with a copy-on-write filesystem like ZFS. A full-scale restore from backup should only be necessary if the storage hardware fails (IMHO).

You still need to apologize for some data loss, though. So make sure that everything you do has one or two safety nets before it hits the customer.
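
A minimal sketch of that kind of safety net, assuming a hypothetical ZFS dataset named tank/db:

 # take a named snapshot before risky maintenance
 zfs snapshot tank/db@pre-maintenance
 # ...disaster strikes...
 # roll the dataset back to the snapshot (discards changes made since)
 zfs rollback tank/db@pre-maintenance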

51
donatj 3 days ago 0 replies      
New devops guy at my work a few years ago somehow completely blows away the CDN. Of course we have all of the data locally but it took almost a full day to reupload. I believe this is our longest downtime to date.
52
alphacome 4 days ago 2 replies      
I am wondering why the OS doesn't introduce a policy to protect important files/directories. For example, we could mark something as important; then, if someone tries to delete it, the OS would ask the person to input some key (at least 20 characters), and if the key is incorrect, the operation would be canceled.
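
Something in this spirit already exists on Linux, if only at the single-file level; a sketch using the immutable flag on ext-family filesystems (the path is hypothetical):

 # mark a file immutable; even root cannot modify or delete it until the flag is cleared
 chattr +i /var/lib/db/critical.dat
 rm /var/lib/db/critical.dat    # fails: Operation not permitted
 chattr -i /var/lib/db/critical.dat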
53
mrlyc 2 days ago 0 replies      
I've found that it's important to do my own backups and not rely on IT to do them. I once returned from my holiday to find that the sysadmin had wiped my hard drive. He said he thought I had left the company. Fortunately, I had backups on computers in other states that he didn't know about.
54
lasermike026 4 days ago 0 replies      
Just reading this headline makes me queasy.
55
Joyfield 3 days ago 0 replies      
I once accidentally moved the cgi-bin (a long time ago) on one of Sweden's biggest websites. Moved it back pretty quickly, so it was "only" down for a couple of seconds.
56
BinaryIdiot 4 days ago 0 replies      
My very first commercial experience doing development was as an intern at Polk Audio. At the time their online solution was pretty immature (no version control and no development environments; everything was coded up in production).

I was working on a very important, high traffic form and...accidentally deleted it. Their backup consisted of paying another company to back up each file. Fortunately they came through but it took a full day to restore a single file.

57
forgottenacc56 3 days ago 0 replies      
Good management blames management for this. Bad management blames the sysadmin and publicly says that "the sysadmin did it".
58
gtrubetskoy 3 days ago 0 replies      
This is where delayed replicas come in very handy: https://dev.mysql.com/doc/refman/5.6/en/replication-delayed.... I don't know whether they're running on MySQL though...
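
For reference, a minimal sketch of enabling that on a MySQL 5.6+ replica (the one-hour delay is an arbitrary choice):

 # run on the replica: apply changes one hour behind the master, leaving
 # a window to stop replication before a destructive statement lands
 mysql -e "STOP SLAVE; CHANGE MASTER TO MASTER_DELAY = 3600; START SLAVE;"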
59
iamleppert 3 days ago 0 replies      
One time I restored, from a MySQL binary log, a table that held about 10,000 employee pay rates. Unfortunately, the log was shifted a few rows, and the mistake wasn't noticed until a few weeks later, when the CEO and some high-level directors noticed their high-6-figure pay had been traded in for an hourly rate.

What a mess!

60
unfunco 4 days ago 0 replies      
Have done this and similar. And now I have aliases in my zshrc:

 alias db="mysql --i-am-a-dummy"
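
For context, --i-am-a-dummy is an alias for the mysql client's --safe-updates option, which makes the client refuse UPDATE and DELETE statements that lack a WHERE or LIMIT clause. A quick illustration (hypothetical table name):

 mysql --i-am-a-dummy mydb -e "DELETE FROM users;"                # rejected by safe-updates
 mysql --i-am-a-dummy mydb -e "DELETE FROM users WHERE id = 42;"  # allowed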

61
w8rbt 4 days ago 0 replies      
People who do things make mistakes. It's the ones who don't make mistakes that should be of concern.
62
ausjke 4 days ago 0 replies      
Knew one sysadmin who was fired due to his "rm -rf /" fat finger without a working backup tape scheme.

Also, once we had to retrieve some code from tapes, which were just stacked in a messy black room; nobody could ever find it, but nobody was fired either.

63
pc86 4 days ago 0 replies      
Looks like the pricing page is 404 right now as well (but all other pages seem to be fine).
64
nwatson 3 days ago 0 replies      
Sorry for those who lost information; personally I'm glad it didn't involve Atlassian Confluence-hosted Gliffy illustrations ... I have a lot of those, and the tool is great for quick shareable embedded engineering sketches.
65
PaulHoule 4 days ago 0 replies      
Last time I did that the chief sysadmin had my back and we had it restored in 5 min.
66
noir-york 4 days ago 0 replies      
Admit it - who here has read this and not gone straight back to test their restores?
67
alienbaby 3 days ago 1 reply      
Very early career days: I wrote a script that had rm -rf in it. I knew this was dangerous, and so the script asked, 3 times, if you were sure you were in the right place.

That was the problem, asking 3 times... people just spammed Enter x3 at that point in the script.

Someone using it came over to me one day: 'hey, look what's going on with this system. I can't do ls?'

There was no system, pretty much. The script had rm -rf'd everything while he was root, running the script from the root directory.

The job of the script? Installing and configuring the backups for a system. So yeah, there were no backups for this system at this point in time!
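
A sketch of one pattern that avoids the Enter-spam failure mode: instead of repeated yes/no prompts, make the operator retype the target verbatim (the variable names are hypothetical):

 # refuse to run unless the operator retypes the exact directory to wipe
 read -r -p "Type the directory to be wiped, in full: " confirm
 [ "$confirm" = "$target_dir" ] || { echo "Mismatch, aborting." >&2; exit 1; }
 rm -rf -- "$target_dir"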

68
matchagaucho 3 days ago 0 replies      
I only store... IDK.... about 80% of my system architecture diagrams on Gliffy.

FML :-/

69
sirpogo 3 days ago 0 replies      
And Gliffy is back up.

https://www.gliffy.com/apology/

70
keitmo 4 days ago 0 replies      
The "Other" Moore's Law:

Backups always work.
Restores, not so much.

71
manishsharan 4 days ago 0 replies      
This is my biggest fear when I use my production Redis
72
daodedickinson 4 days ago 2 replies      
Are there any more sites like gliffy and draw.io?
73
Sujan 4 days ago 0 replies      
Poor guys...
74
hathym 4 days ago 0 replies      
don't laugh, this can happen to you
75
Raed667 4 days ago 0 replies      
Shit happens =)
76
yitchelle 4 days ago 0 replies      
Just going to add the obligatory http://thedailywtf.com/
77
xg15 3 days ago 0 replies      
I accidentally all the data...
78
owenwil 3 days ago 0 replies      
Anyone have a screenshot?
14
Dear Apple, there's nothing really sad about using a 5-year-old PC thenextweb.com
376 points by Ph4nt0m  4 days ago   236 comments top 62
1
mikehearn 4 days ago 14 replies      
I interpreted that comment as a jab at the PC industry. The implication being that in the last 5+ years the PC industry has failed to offer 600 million users a compelling reason to upgrade. Isn't that the correct interpretation, considering the whole point of bringing that up is because they're positioning a new device intended to replace the PC? I seriously doubt it's intended to mean (to quote the article) "LOL poor people".

Granted, this interpretation would make for a boring thinkpiece and would not get me to #1 on HN.

2
sudosushi 4 days ago 5 replies      
As someone sitting at a nearly 5-year-old MacBook Pro, I took the comment as an offhand throwaway. I understand not liking the comment, but this isn't news. A company thinks everyone should be using the latest of their products. Oh no.
3
NikolaeVarius 4 days ago 4 replies      
I swear, if a modern tech company went up and said "We think you should buy our product", someone would start yelling about how insulting it is that a company is endorsing capitalism and materialism.

We've gone from "Won't somebody think of the children" to "Won't somebody think of every single possible group that is possible to somehow offend in some way"

4
johansch 4 days ago 6 replies      
I built a desktop PC almost exactly five years ago.

- Intel Core i5 2500K, 3.3 GHz, quad-core (200 USD)

- 8B (2x4GB) DDR3 1600MHz C9 (100 USD)

- 120GB 2.5" SSD Intel X25-M G2 (200 USD)

- GeForce GTX 460 1GB (200 USD)

It's sad that a full five years later, CPU performance/USD has barely moved at all. RAM is half the cost now, SSDs a quarter the cost. Not sure about how GPUs have developed?

(Edit: The GTX 960 which today also costs 200 USD seems to be about twice as fast as the GTX 460.)

5
danielvf 4 days ago 0 replies      
This is classic trolling, plain and simple.

The head of marketing thinks that using a competitor's product is "sad".

In response, this article calls Apple: "Insensitive". Offensive. "Hypocritical". "Insulting". And worst of all, promoting inequality by building high quality, expensive products and forgetting the needs of the poor.

It's an article designed to produce a response.

6
BWStearns 3 days ago 0 replies      
Really this isn't terribly offensive. They're selling a product they think is superior, in their [marketing] minds the world would be a better place if you were pulled from the womb, slapped on the ass, and handed an iPad Pro, to be renewed every generation of gadget.

The Apple presentation stage is not the Basilica of St Peter. These are marketing pronouncements, not moral ones, and analyzing them as such is such intense navel gazing that it's actually bad for your neck. If Apple hates the poor it's for no reason other than that they're outside their customer base.

7
existencebox 3 days ago 1 reply      
Perhaps I have some wires crossed from too much time as a sysadmin, but I take a 5 year old (+) PC (or any machine, really) as a mark of pride, not any bit of shame whatsoever. It speaks to a high degree of reliability which often speaks well of the operator (even if just "choosing robust hardware" is a component of this)

Some stories to add some color to this: I ran a very primitive file sharing server for my university on a dual P(2 or 3, don't quite remember) machine that was probably around a decade old by the time they finally ended up retiring it. My home fileserver is a ~10 TB 4U monster, running on (conveniently) 5-year-old hardware and very boring FBSD. Outside of moving apartments, it has not had unplanned downtime once; I will continue using it as long as this is true and would be sad if I didn't get another good few years out of it.

I _WISH_ I could get the same lifetime out of desktop PCs but I tend to find assorted parts failing at an asymptotic rate around 3-5 years. The world in which we all use <5 year old hardware is a sad one, to be avoided, to my eyes. (To clarify, I don't mean this in any luddite sense, I don't believe tech should stop moving forward, but I long for more robust products with longer viable lifespans, such that one can make a choice to upgrade rather than waiting for the inevitable.)

8
oldmanjay 4 days ago 0 replies      
I'm not a fan of moralistic handwringing, particularly when it's brought about by uncharitable interpretations of what is clearly just marketing. This and the related articles are such poor quality that it makes me sad to see them get so much traction here. It's the sort of thing I'd expect at dailytech or slashdot.
9
colund 4 days ago 3 replies      
In times of climate change debate, I think Apple is doing the wrong thing here: encouraging a buy-and-throw-away mentality and increasing waste.
10
specialp 3 days ago 2 replies      
It is ironic that Apple is mentioning this, as I believe this is going to spell the end of their era of massive profits. Phones are now getting to the state where, much like PCs, the older phone is good enough and the new phone is not substantially better. There will always be people buying a phone for .2GHz more CPU or some slightly higher-res screen, but the days of rapid evolution of mobile devices are over, and Apple is going to have problems selling someone a $600-800 phone every year or 2.
11
tombert 4 days ago 1 reply      
Did nothing interesting happen in the tech world today? This is such a non-story, it's really weird that this is on the front-page twice.
12
agentgt 3 days ago 1 reply      
I interpreted it as "it's sad that those Window users having been using just Windows for 5 or more years and not Mac".

IMO He's catering to the audience of Apple enthusiasts (who are the ones that watch Apple events generally) and not making fun of poor people.

It's sort of analogous to when Jobs said "It's like giving a glass of ice water to somebody in hell" - about iTunes on Windows computers.

Oh so Jobs thinks Windows users are Evil since they are in hell right?

13
apatters 4 days ago 1 reply      
If there's something sad about that fact, it's that the industry has delivered so little value in the past 5 years that not many people feel compelled to upgrade!
14
johnhattan 4 days ago 2 replies      
Actually, my main development tower is about nine years old.

And in that time, I've upgraded the processor, doubled the memory, upgraded the hard drive to an SSD, switched the video card twice, upped the number of connected monitors from one to three, and upgraded the OS from 32-bit Vista to 64-bit Windows 10.

It was pretty leading edge when I built it, and it's still pretty leading edge today. What's sad is the expectation that I should throw my computer away every 18 months.

15
sergiotapia 3 days ago 0 replies      
Jesus christ, they are in the business of selling computers. This faux outrage over a salesman trying to sell his computers is gross - what's wrong with people?

Now watch as every blog tries to scramble to see who has the most outrage and who is the largest victim.

16
dcustodio 4 days ago 0 replies      
Some people need to feel offended just as I need my morning coffee. I'm not even counting how old my pc/laptop is, and that's the thing I like about PCs - there's hardly anything new that triggers my Gear Acquisition Syndrome.
17
c0achmcguirk 4 days ago 0 replies      
"I want to be offended!"

signed, people who take offense at an off-hand remark like this.

Grow some thicker skin and stop wasting my screen real estate with irrelevant non-stories like this.

18
imaffett 4 days ago 0 replies      
I still use my 2009 MBP. I upgraded to an SSD and 8 gigs of RAM. I don't game on it, but I can do almost all of my development on it. I'll admit it's slower than my 2013 MBP at work, but I see no reason to spend more money on a working computer.

My second computer is a Chromebook. My oldest daughter uses it for school work and we couldn't be happier. It's much better than our iPad (which we don't use anymore).

19
studentrob 3 days ago 0 replies      
Schiller is marketing his product. This is no different from the "I'm a Mac, I'm a PC" commercials.

There are now two articles on the HN front page about this utterly pedantic topic which boils down to marketing. Unbelievable.

20
fit2rule 4 days ago 0 replies      
You know what makes me really happy? Any computer being used for fun/interesting/productive things, not just 'the latest ones'.

As a die-hard retrocomputing enthusiast with far more old computers in my basement than new, I'm biased. But I sure think that the time has come for the compute industry to start highlighting the need for lesser computing power, but yet still more productive computing.

8-bit computers are awesome. 16-bit machines superb! Get yourself set up with these systems and you can entertain yourself for hours and hours. 8-bit is a great way to learn software development - 2 hours of 8-bit coding a week will keep you sharper than sharp when the time comes to go back to the hipster tools du jour. (I kid, I kid.)

Point is this, folks: old computers never die - their users do.

21
drzaiusapelord 4 days ago 1 reply      
If anything its a fairly strong statement on the ruggedness of the PC platform. I have a 2500k i5 in my old desktop. With a new-ish videocard I play all the newest AAA games at high quality. Its incredible how the x86 world really hasn't had any huge performance bumps and how a Q1 2011 CPU is still competitive.

Also, there's the larger narrative of people buying tablets and putting off PC upgrades, so the PC ages. Don't worry Apple, you're still getting their money. Its just people aren't ready to replace a general purpose computer that they control and can run pretty much everything with a walled garden mobile device designed to get ad impressions and consume media.

If anything, this is Apple's frustration. They have all this success but people and businesses keep buying PCs. They'll never crack this market. They're too invested in the Jobsian "closed" ecosystem philosophy to be as agile as the PC platform. Mocking those who don't drink their kool-aid just makes them look like sore winners.

edit: I'm aware I can buy a newer chip, but from a single-core vs single-core perspective it's not that much faster. Very little consumer software is properly multi-threaded, which is why my expensive work computer with the newest i7 doesn't feel any faster than my 5-year-old desktop at home. Most things are pegged to one core, and at the end of the day single-core performance is what's going to matter.

22
tfandango 3 days ago 1 reply      
I'm a little sour on Apple. They say their products are rugged and "built to last", but an iPad is mostly glass, which you need to encase in a giant rubber protective case if you don't want the screen shattered. Even then they are easy to break and very costly to fix, to the point now where it barely makes sense to fix one rather than replace it. Self-repair is less expensive, but so far I'm 50/50 on successes. I would say they are built to last... until the next one comes out.
23
flyinghamster 3 days ago 2 replies      
As someone who just picked up a reconditioned six-year-old, i7-equipped ThinkPad for a hell of a lot less money than what it would have cost brand-new, I have to laugh. It may not be the Latest and Greatest, but it's still speedy enough to handle anything I'm going to throw at it. The CPU performance curve has flattened out in the last few years, to the point that it's not worth spending lots of money for a modest performance gain.

I'll let someone else take the depreciation hit.

24
emp_ 4 days ago 0 replies      
The only things not lasting more than 5 years in my 20+ years of owning computers are Macs and other iDevices; PCs are much, much stabler. A MacBook Pro 2008 (died in 3 years), an iMac 27 2010 (died in 4.8 years), a MacBook Air 2013 (dying) and a Mac Mini 2012 (dying) are the reason why I always end up coming back to my 2010 gaming PC and 2010 HTPC, the ones that broke the 5+ year mark without a major hardware failure.
25
po1nter 4 days ago 0 replies      
I don't understand how the author went from the guy saying it's "really sad" to "Apple is insulting people". I mean even his first argument on why people don't upgrade IS actually sad since they can't afford to do that. I know it is because I'm one of those people who can't afford to upgrade my machine.

/rant typed on a 5 year old Asus N53SV.

27
mrbill 3 days ago 1 reply      
I said this in the other thread..

I just bought a "new to me" laptop.

Refurb Thinkpad T420s from 2011.

I added 16G RAM, two Intel SSDs, an Ultrabay battery, and an 802.11ac wifi card.

Grand total: less than $325.

This will be my primary portable for at least 2-3 years, and it's already four years old.

Just because I can afford Apple doesn't mean I can justify the 2x price premium, or that "old" hardware isn't capable.

28
islane 3 days ago 0 replies      
Echoing the comments from others, My "main" pc is about 7 years old running the x58 platform (socket 1366) - it easily outperforms my work-supplied development laptop.

For the uninitiated, the eBay workstations mentioned are typically these ancient x58's. Most support hex-core Xeons, 24GB RAM (or 48GB unofficially; more on server boards and some workstations), and a pile of PCI Express lanes. As such, you can easily add PCI Express M.2 SSDs, USB 3/3.1, and GPUs to your heart's content. The takeaway is that old PC tech can be had at a fraction of the cost of new hardware with comparable performance.

I understand the marketing nonsense from Apple, the "PC does what" consortium, and hardware vendors on the whole - but there is nothing sad about owning an old pc. The reality is that the best performance for the price lies in "obsolete" platforms.

29
atomical 4 days ago 0 replies      
Is there anything sad about using a 5-year-old Mac? I'm hoping my MacBook Pro will eventually last 5-10 years. With multi-core systems, SSDs, and 16 gigs of RAM in MBPs, do we really need to be upgrading so much? Also, clock speed advances have stalled.
30
sz4kerto 4 days ago 1 reply      
I think I have a rusty, almost 5 year old PC lying around. It has an Intel i7-2600K overclocked to 4.6 GHz, 24 GB RAM, 240G SSD and a Radeon 6950 GPU that can run most games well in fHD.

Most of the machines Apple sells are actually slower than this PC.

31
parenthesis 4 days ago 0 replies      
Until earlier this year I was still making heavy use of a Powerbook from 2004. I only stopped using it because the graphics hardware started to go funny.

I sold its battery, memory and power supply to someone still using a slightly older Powerbook.

32
draw_down 4 days ago 0 replies      
I agree that this was tone-deaf of them, I just find it fascinating that we picked this one instance of tech industry rich guy tone-deafness. Seems to be getting a lot of play for some reason. But it's everywhere if you look.
33
Overtonwindow 3 days ago 0 replies      
I agree with this piece. I have long been upset at Apple's forced obsolescence policy, and at creating the notion that devices are disposable. I still have a MacBook from 2009, and another from 2012, that I am doing everything in my power to upgrade and avoid the forced slowdown. Likewise with my iPad and phone. I resist the persistent upgrade requests to the OS. Not because I am blasé about security issues, or don't want bugs to be fixed, but because those fixes and upgrades come with a cost: premature, forced obsolescence.
34
scarface74 3 days ago 0 replies      
Thinking of all of the computers I've had since 2007, all of them are usable and still in use at least once a week.

2006-era Mac Mini Core Duo 1.66Ghz. Ran Windows 7 on it and gave it to my mom. She still uses it.

2009-era Sony Vaio - Core Duo 1.66Ghz, 2GB RAM. Windows 7. My son uses it for Office and MineCraft.

2009 Dell Pentium Dual Core 2Ghz, 4GB of RAM. It's still my only laptop. The display is 1600x900 and is still better than many cheap laptops. The battery is crap though.

2011 Core 2 Duo 2.66Ghz laptop. My wife's computer. It still feels fast.

My workhorse is a 3GHz i3 with 6GB of RAM. Bought in 2012.

35
rsync 3 days ago 0 replies      
You know what's really sad?

I am using a 2009 octo Mac Pro, which is now 7 years old, and since that date Apple has not released a single product compelling enough to upgrade that system.

36
balls187 3 days ago 0 replies      
I believe that an iPad (or comparable Android Tablet) is better for most computer users than any low/mid-tier 5-year old PC.

With PC's you can upgrade them, and make tweaks to squeeze out every bit of performance, but by and large for most people, when taking into account that mobile content consumption is on the rise, a tablet is a better upgrade than a new PC. Tasks like email, text, video, music, photography, facebook, are pretty much now done via mobile phone. For these people, PCs are anachronistic.

37
reacweb 3 days ago 0 replies      
I have an HP Pavilion Elite m9458fr bought in July 2009 for €416 (on eBay, hp_marketplace_fr). I only need a silent, reliable computer with a reasonably fast CPU for web development on Linux. I have no need for a beefy GPU; I just need to connect 2 displays (23" and 19"). I would like to replace it in order to have USB 3 connectors and a SATA III HD, but I do not find anything on the market for a reasonable price. Should I buy an ultra-HD laptop with a magnifying lens?
38
gd2 3 days ago 0 replies      
It seems a somewhat strained interpretation to view this as Apple being anti-poor. But it does point out that Apple is losing touch with what people do with personal computing power.

Phones are much better than five years ago; computers, not so much. I'd much rather spend my dollars where the major improvement in computing technology is than spend to upgrade a desktop that displays things almost the same as before.

39
ayb 3 days ago 0 replies      
I'm using a MacBook Pro from late 2011.

Most of the actual machine specs (i.e. processor and max RAM) have barely changed in 5 years. I put 16GB in my laptop 5 years ago and it's still the most you can squeeze into a 13" MacBook Pro.

It's sad (and somewhat telling) that Apple has not packed more power into this form factor over the past 5 years.

40
PaulHoule 4 days ago 0 replies      
The main problem I see is that Intel and Microsoft have given up on power users and it is all Apple envy and phone envy; no wonder people don't buy new PCs.

Back in the 00s I had a policy of never rehabilitating an old PC, because a new PC was better in every way.

The other day a friend brought me a MacBook from 2007 with a busted HDD; I put in an SSD I had laying around and we got Win10 running on it with no drama and no Apple malware (iTunes, Boot Camp, etc.). It feels faster than a Skylake machine with one of those useless hybrid hard drives, and after puffing some HCFC gas through the fan it is great.

Any pre-Core 2 machine would go to the trash; I would not even donate it to the poor. But frankly, Broadwell and Skylake are just an excuse to reduce the I/O slots, to the benefit of gfx card manufacturers.

They say customers get better battery life, but software screws that up if they really try it. The most you can get is to spend a lot of money on a thin-and-light machine that the doorman can slide under the hotel door, or get a 2-in-1 machine just because you need a trackpad on a touchscreen machine, have a fight with the stewardess over if and where you stow it, and have another reason to get arrested at your destination.

I mean, even IBM sells System/360-descended mainframe chips that clock over 5GHz and use water cooling. It is not that hard.

41
johndevor 4 days ago 0 replies      
Holy crap the "politically correct" army has entered the tech world. We're not safe anywhere...
42
craigmccaskill 3 days ago 1 reply      
I'm using a ~5-year-old PC I built myself, and it still outperforms the hardware in any available Mac product (desktop, tablet or laptop) that isn't a Mac Pro (starting price $2,999.00).

I don't have a compelling reason to upgrade until the launch of VR headsets.

43
holri 3 days ago 0 replies      
Maybe users of old computers are not poor, but:

* Do not suffer from avarice?

* They do not need the newest shiny toy for their ego and to win recognition?

* They don't touch a good running system?

* They want to save CO2 emissions and rare earths?

* They have a frugal life?

* They know what eco-sufficiency means?

* They know that consumerism does not make you happy?

44
yq 4 days ago 0 replies      
Semi-related: iCar release date rumours, features and images: Apple CEO Tim Cook comments on Apple Car rumours

http://www.macworld.co.uk/news/apple/will-apple-make-icar-pr...

Reading the related news is interesting. Imagine Apple applying its tactics to cars: you would probably need a new Apple-designed plug rather than the universal one, and special tools to change flat tires; they would update the exterior slightly each year and market the idea that driving a 5-year-old car is sad.

45
jitendrac 3 days ago 0 replies      
I am using a 7 year old PC with a single upgrade (motherboard); I have no reason to buy new! I have no reason to buy any expensive Apple laptop or iPad.
46
jaimex2 4 days ago 0 replies      
There is however something sad about buying products to not look poor.
47
jvagner 3 days ago 0 replies      
If your child's school only had 5 year old computers, you'd probably think, "Huh, it'd be better if this school had newer computers."
48
compactmani 3 days ago 0 replies      
600 million PC users discovered that running a lightweight unix distribution/DE and not allowing javascript made the life expectancy of their computers triple.

Or so I dream.

49
SeanDav 3 days ago 0 replies      
I miss my "Turbo" switch that used to be on older PC's. Just press "Turbo" and your PC is good for another couple of years.
50
facepalm 4 days ago 0 replies      
I can be "not wrong" and sad at the same time. A newer PC would likely be much faster, resulting in less stress and less wasted time.
51
chasing 3 days ago 0 replies      
I, too, am shocked that a computer company would communicate that their new computing devices are better than the old ones people already own.

Shocked.

52
rabboRubble 3 days ago 0 replies      
Hahahahahhahahahahah... my main machine is a Macbook Pro, Mid-2009. Going on 7 years old. Hahahahahahah.
53
agumonkey 4 days ago 0 replies      
-- sent from my almost perfect 9yo laptop
54
justinholmes 3 days ago 1 reply      
My 5 year old PC has a dual-socket Xeon setup; I think that beats any iPad rubbish.
55
ArenaSource 4 days ago 1 reply      
The last MacBook Pro generation is from 2012... this is really sad, it really is.
56
bliti 4 days ago 0 replies      
I use a 4 year old MBP. Does that make me middle class then?
57
shanselman 3 days ago 0 replies      
Yikes...My primary PC is 5 years old. Works great.
58
busterarm 4 days ago 0 replies      
Still running a desktop with a Q9650 and 8GB RAM.

Runs fine.

59
kps 4 days ago 0 replies      
My main home machine is an 8-year-old Mac Pro running Snow Leopard. Apple today sells nothing that could replace it, let alone improve on it.
60
balls187 3 days ago 0 replies      
Apple v PC flame war still lives on.
61
programminggeek 3 days ago 0 replies      
What Phil said worked.
62
snowwrestler 4 days ago 4 replies      
Yes there are things that are sad about it. A 5-year-old PC is probably not running a recent version of Windows, and has a higher likelihood of being compromised. And if a person wants a new computer but can't afford it, that is sad too.

That said, it was an obviously stupid stat for Schiller to cite. But now we're going to be subjected to a long series of Apple-bashing articles that overreach in the opposite direction. By the end of today we'll see multiple "actually, I'm proud to be running a 5-year-old PC" posts.

Why? Because when mining for pageviews, there are few veins as rich as bashing Apple.

Edit to add: If we want to talk about the tech industry and poor people, let's do so. How many new companies are variations of "let us bring things to your door for you for an extra fee" or "let's give you personalized service so you don't have to go shopping/ride a bus/interact with a human"?

15
Boom (YC W16) signs $2B letter of intent with Virgin, $5B total techcrunch.com
426 points by lacker  3 days ago   164 comments top 19
1
brenschluss 3 days ago 3 replies      
Wow, who's doing the marketing? Pretty savvy. If they had launched with this notice, we'd be thinking, "Oh cool, Virgin's making a supersonic airplane with some company!"

Instead, with this tiered announcement: Three days ago, nobody knew about Boom. Within the last two days, they have lots of new press, and thus lots of skepticism. Today, they 'announce' an effective endorsement with Virgin. Brilliant.

2
paulsutter 3 days ago 1 reply      
Title misleading. The relationship is opposite what's implied. Boom will be hiring Branson's "The Spaceship Company" to do engineering, and Virgin gets an option to buy the first ten planes.

Granting an option means Boom has given something to Virgin, not the other way around.

> a Virgin Group spokeswoman confirmed their plans to The Guardian: We can confirm that The Spaceship Company will provide engineering, design and manufacturing services, flight tests and operations and that we have an option on the first 10 airframes. It is still early days and just the start of what you'll hear about our shared ambitions and efforts.

EDIT: Let's hope that we hear Virgin is making a big investment in Boom soon, that will be a stronger indicator. If Virgin really had substantive interest in the planes, and Boom was actually experiencing demand, Virgin would have had to PAY MONEY for an option.

3
Certified 3 days ago 2 replies      
Maybe I am misinformed, but I was under the impression that part of the reason Concorde was retired is that at those speeds you tear up the ozone layer.

"From their particle measurements, the authors of the Science study calculate that a future possible fleet of 500 supersonic passenger aircraft will increase the surface area of particles in the atmosphere by an amount similar to that following small volcanic eruptions. In mid-latitude regions, such emissions have the possibility of increasing ozone loss above that expected for nitrogen oxide emissions alone. The increase in the number of particles may also affect the ozone-related processes occurring on wintertime polar stratospheric clouds (PSCs) in the polar regions."

-http://www.publicaffairs.noaa.gov/pr95/oct95/noaa95-65.html

4
abalone 3 days ago 1 reply      
I'm disheartened there's no mention of carbon footprint or sustainability. With all the great focus in our community on sustainable land transport (Tesla), data centers, solar and neo-nuclear power, etc., here we have a startup that markets this:

"imagine leaving New York in the morning, making afternoon meetings in London, and being home to tuck your kids into bed."

That is a terrible thing to enable from an environmental impact standpoint. While air travel can be somewhat more efficient per mile than driving alone in a gas car[1], the distances it enables you to travel are vastly greater. Further upping the convenience factor would no doubt encourage more "binge flying".

Perhaps Boom is more efficient than typical airplanes. Perhaps it's less efficient. Perhaps it could make planes that travel at the same speed but at half the carbon footprint. We wouldn't know from this. It's not part of the story.

Let's change that. Let's make sustainability as important a consideration with airplanes as it is with cars.

[1] http://www.yaleclimateconnections.org/2015/09/evolving-clima...

5
onion2k 3 days ago 1 reply      
Awesome as this is, there's something largely outside of Boom's control that could derail their plans. When Concorde was ready to fly the British and French governments (who paid for the plane's development) had to negotiate with New York for permission to land there. There were concerns about the noise - not sonic booms, just general plane noise, because Concorde was "loud". The problems were actually political ones because Concorde wasn't American. Boom will have to do the same, but London has some really bad NIMBY issues with airports and the expansion of Heathrow at the moment. Problems that will probably continue for decades. If the anti-expansion environmental lobby groups can block this, and they see it as politically useful to do so, there will be a lot of negotiating to wade through.
6
maxxxxx 3 days ago 4 replies      
I would be extremely impressed if they could pull this off. Developing a supersonic passenger plane must be one of the most expensive and complex things you can do. Probably harder than what SpaceX does.
7
vannevar 3 days ago 2 replies      
If building a supersonic passenger plane with an existing engine was viable, it seems like somebody would already be doing it. Which means it's probably not viable, in which case Boom is an aircraft engine company first and foremost; once they have the engine, wrapping a plane around it should be the easy part, relatively speaking. Developing a new engine costs on the order of $1B, so the development cost for the whole plane would probably be between $1.5-2B. So they'd pay for that with their first 10 planes to Virgin, with maybe some money left over to change their name.
8
samfisher83 3 days ago 7 replies      
Just for some context: the 787 cost $32 billion to develop.
9
kafkaesq 3 days ago 1 reply      
So what's the carbon impact (per passenger-mile) of the service Boom is proposing (compared to regular air travel) again?

I'm not sure this is something to cheer, just because it's new and shiny (and because it appeals to tech types who fancy they'll finally get to afford a ride on one of these contraptions, some day).

10
haberman 3 days ago 2 replies      
I don't know the first thing about how a business in the space operates. Can someone fill me in on why a startup like this would go through YC? How are $120k and Silicon Valley connections going to help a startup in this space in the slightest?
11
oniony 3 days ago 2 replies      
Letter of intent. Not a contract. The article is speaking like it's a done deal.
12
nerdy 3 days ago 1 reply      
They got $5bn in LOIs, a new YC record: https://twitter.com/sama/status/712705887853383680
13
frenchman_in_ny 3 days ago 0 replies      
I love what these guys are doing, but I find it odd that they're using an actively registered tail number (N22BT) in their mockup images.
14
rory096 3 days ago 2 replies      
This is awesome. Great counterpoint to all the naysayers and middlebrow dismissals on Monday. I'm pumped to see this thing fly.
15
rdl 3 days ago 1 reply      
Boom is probably my favorite new company in a long time. I wish there were infosec concerns :)
16
rgovind 3 days ago 1 reply      
What is the importance of an LOI? It's non-binding. So why does it matter, except as a PR exercise?
17
nbevans 3 days ago 2 replies      
The article describes Concorde as "ill-fated". Is that accurate?
18
mathattack 3 days ago 0 replies      
In a prior thread I asked about how they would get the funding they need straight out of YC. I guess this is the answer!
19
forrestthewoods 3 days ago 0 replies      
Yesterday: Hahaha what a stupid company name! What a bunch of idiots!

Today: Oh shit

16
What I Learned Selling a Software Business kalzumeus.com
434 points by gyardley  3 days ago   84 comments top 14
1
aresant 3 days ago 2 replies      
FEI still has the original listing on their site:

Yearly revenue - $31,000

Yearly net profit - $19,000

Asking price - $57,000 SOLD

It's fascinating what a small amount of money we're ultimately talking about vs. the influence of the "cult of Bingo Card Creator fans" on HN, of which I am a card-carrying member.

(1) http://feinternational.com/buy-a-website/3745-software-busin...

2
song 3 days ago 1 reply      
Just wanted to quote this:

"Im told, against my expectations, that BCC was impressively well-documented by the standards of other businesses its size. This implies that many people are running their small projects in even more of a cowboy fashion than I do, for example by not having dedicated books for the business. If this describes you, God help you. At a minimum, get your books for the last year done professionally whatever you spend on bookkeepers/accountants will be a pittance next to the time saved and additional valuation captured."

Even if you're not selling, getting this done will save a lot of headaches down the road... Dedicated books for the business are a MUST. I know a lot of small businesses where this is not done religiously and it always comes back to bite the owner in the ass...

EDIT: By the way, I was curious so I just took a look at the BCC site, the blog is timing out...

3
dennisgorelik 3 days ago 7 replies      
This time Patrick's summary of Bingo Card Creator does not look rosy at all.

All facts are still the same, but the overall impression of BCC now is that it is a small, declining and time-consuming business. Patrick himself actually struggles with money, like all of us.

Patrick definitely has (had?) the power of optimistic spin in his stories.

4
davidw 3 days ago 1 reply      
> Back in the day someone won a Nobel Prize for pointing out that, if a population of goods has unknown potentially costly problems, and there is no way to determine which particular instances of the goods have the problems, the market will penalize all goods in that population. The canonical example is used cars.

George Akerlof and "The Market for Lemons": https://en.wikipedia.org/wiki/George_Akerlof

5
sdrinf 3 days ago 2 replies      
| Selling BCC was going to pay for living expenses while we built Starfighter's first game (Stockfighter) and also pay for some development work to assist with the sale of my other SaaS business, Appointment Reminder.

^^ - you're selling AR as well? The last presentation showed it to be a profit machine? Would love to learn about the reasoning behind that decision!

6
benologist 3 days ago 2 replies      
I'm spending this year packaging up my current business to make it as attractive as possible to potential buyers.

This talks a lot about the process, but what are some things people like us can do to maximize their return on such a sale?

7
simonswords82 3 days ago 1 reply      
I got in touch with FEI about selling one of my web businesses and they couldn't help due to our UK focus. Sucks for us, I've heard good things about them.

Can anybody recommend a broker that assists UK-based and UK-focused web businesses?

8
voltagex_ 3 days ago 0 replies      
>migrate all of my email in Google Apps for Work (oh God, don't ever do this)

Yeah... I have a non-trivial number of purchased Android apps on my Google Apps for Work account ($6AUD/month) and there's no published way to move apps to a "normal" Google account.

I'd probably be paying Google forever if I had business dependencies hanging off that account (but I set it up when custom domains were "free").

9
voltagex_ 3 days ago 0 replies      
>Accordingly, I decided to retroactively cut her in for 5% of the business. Props to Pepper for accommodating this request, as it is somewhat non-standard. ("Can you invoice me a substantial amount of money and promise me that you will pay a particular employee of yours a bonus of the same amount, net only of taxes?" "We can do that.")

Businesses exist that are this cool? Where do I find them?

10
pitt1980 2 days ago 1 reply      
"People try to buy software businesses with no money down. (Will you loan me the entire purchase price of the business? Ill pay you back over the next 3 years. Promise!)"

----------------

While I see why you would have run away from that particular structure, I'm curious how flexible you might have been about moving away from a straight lump-sum structure.

If I were to buy a business like this, a deal where you accept $X as money down plus some percentage of revenue for Y months until you were paid amount Z, with some contingencies built in, would look pretty attractive.

A seller's willingness to agree to terms like that would send a pretty strong signal that they weren't selling a lemon.

As a buyer, I'd be willing to commit to a Z price significantly higher than what I'd be willing to commit to as a lump sum up front.

If the seller believed in the business (and, I guess, I were able to substantiate that I had enough ability not to drive the business into the ground), it seems like such a structure would net the seller more as well.

----------------------

I'd love to hear your thoughts about how receptive you might have been to an offer like that

11
BorisMelnik 3 days ago 0 replies      
I think one of the big things about BCC isn't how much (or how little) money he made but how well documented the process was. We've all seen plenty of projects do $10k months, but not many of them are sustainable or so well documented in a blog.
12
raymondhong 3 days ago 0 replies      
great
13
sbierwagen 3 days ago 1 reply      
(2015)
14
quellhorst 3 days ago 3 replies      
Such a long article but no mention of how much he sold it for.
17
Jenkins 2.0 Beta jenkins.io
407 points by sciurus  1 day ago   150 comments top 20
1
teraflop 1 day ago 3 replies      
It's cool that they're promoting the "pipeline" plugin to a built-in feature, but the devil is in the details.

Under the hood, it's implemented by taking a script written in a "DSL" (it's actually Groovy code), transforming it into continuation-passing style, and serializing its state. This is pretty cool from a theoretical CS perspective, but having played with it a little bit, the implementation seems very fragile. There are a number of long-standing unfixed bugs that cause the transformed code to misbehave in ways that you wouldn't predict just from looking at it, even if you're experienced with Groovy. I ran into a couple of them just during a brief period of experimentation. For instance:

https://issues.jenkins-ci.org/browse/JENKINS-27893 (varargs are mishandled)

https://issues.jenkins-ci.org/browse/JENKINS-28277 (currying doesn't work)

https://issues.jenkins-ci.org/browse/JENKINS-26481 (calling a closure in a loop doesn't work right)

https://issues.jenkins-ci.org/browse/JENKINS-28183 (killing a pipeline job doesn't clean up its state properly)

And there are inherent limitations to the approach; for instance, you can't store any non-serializable values in local variables, which means simple things like foreach loops don't work (because you can't serialize an iterator).
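
To make the iterator pitfall concrete, here is a hedged sketch of the kind of pipeline script that trips over it (the step contents are made up; this reflects the failure mode as commonly described in the JENKINS-26481 era, not an official example):

 node {
     def items = ['a', 'b', 'c']
     // A Groovy for-each loop holds a live Iterator in a local variable.
     // If the CPS engine checkpoints mid-loop (e.g. around the sh step),
     // it tries to serialize that iterator, and the build can die with a
     // NotSerializableException, since ArrayList's iterator isn't serializable.
     for (item in items) {
         sh "echo ${item}"
     }
     // The commonly suggested workaround: a C-style loop, which keeps only
     // a serializable int index and the list itself in scope.
     for (int i = 0; i < items.size(); i++) {
         sh "echo ${items[i]}"
     }
 }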

I really like the idea behind the Pipelines feature, in principle. But I think building it around a general-purpose programming language, and then failing to support all of that language's features, is a recipe for headaches. If you never try to do anything more complicated than what's shown in the examples then you should be fine, but the borders between what works and what doesn't are very ill-defined.

EDIT: Oh yeah, and there's a surprising amount of functionality that isn't documented anywhere except for blog posts and JIRA tickets.

2
jquast 1 day ago 8 replies      
Having used Jenkins several times since it was Hudson, as well as Bamboo, travis-ci, python buildbot, and TeamCity, I consider Jenkins a loser. Not worth becoming invested in.

It's simply too difficult for a layman developer or sysadmin to use when accommodating complex buildchains, which only grow more complex over time in most software businesses.

If you're going to program your CI with a web administrative interface, provide an easy-to-navigate interface for users, and expand with build metrics and external systems, I suggest TeamCity over any others.

Though Jenkins is free, my experience is that maintenance teams of 2 or more people with tribal knowledge develop around the tool, at a cost that far outweighs the license cost of TeamCity, which solves many of these homegrown engineering problems as part of the product cost.

If you have a strong developer team, I would suggest using something like python buildbot and programming your CI end-to-end in a single unified language, much better than shoe-horning plugins and groovy code inside your jenkins textarea fields!

That said, all CI systems are just software, and any requirement can be accommodated with a sufficient amount of additional software, forks, or changes. Any will do.

3
jacques_chester 1 day ago 7 replies      
It's great to see that Jenkins is following the path blazed by GoCD[1] and Concourse[2] to make the pipeline concept more central.

That said, this appears to be achieved by promoting the plugin into the default installation.

It also misses some of the additional advantages Concourse holds over Jenkins and GoCD: build configuration is purely declarative and can be checked in with the project. You know what version of your pipeline built the software at any point in its history. And you have a reasonable shot at recreating that build, because every task in every job is run in a fresh container.

These are killer features, in my view. Jenkins can be extended with plugins to try to sorta-kinda do either or both, but it's not part of the central organising concept of how it works. Windows can run some POSIX apps, but it's not a *nix.

Further out, Jenkins pipelines are tricky to do fan-out/fan-in with; in Concourse it's trivial. You have to lay out your pipeline in Jenkins; Concourse will lay it out automatically based on declarative information about each job. Rather than a very rich API for plugins, Concourse boils the unit of extension down to "resources", which with three actions (check, get, put) can model points in time, git commits, S3 files, version numbers, interacting with a complex product repository and so on.

I used to tolerate CI/CD, as a necessary and worthy PITA. Now I find myself actively looking for regular tasks, resources, sanity checks and so on I can put into Concourse, so that I don't have to remember them or write them up in a wiki.

Disclaimer: I work for Pivotal, which sponsors Concourse development. But I wouldn't rave about it if I wasn't such a convert.

[1] https://www.go.cd/

[2] http://concourse.ci/

4
hidingfornow 1 day ago 1 reply      
I've been tasked with implementing a large Jenkins deployment to support a ton of teams and I don't think I've hated a piece of software so much in such a long time. The past few years I've been using other CI systems like Circle and I totally forgot how much you have to fight Jenkins.

The UI is atrocious, job state is spread out among tons of crappy xml files, and the plugin system causes tons of headaches. If you're going to have a system that forces you to use the UI for the most part, rather than scripting up config files that I can load with some automation, at least make that UI nice to use.

Hopefully 2.0 is fixed up, but personally I'd never reach for Jenkins as a CI system if it wasn't part of a client's requirements.

5
zwischenzug 1 day ago 2 replies      
I wrote these posts recently about stateless Jenkins deployments:

https://zwischenzugs.wordpress.com/2016/01/24/ci-as-code-sta...

https://zwischenzugs.wordpress.com/2016/01/30/ci-as-code-par...

https://zwischenzugs.wordpress.com/2016/02/25/922/

Does anyone know if any effort has been put in to make Jenkins more 'programmable' in 2.0? I had a quick look at the bumph recently but couldn't see anything pertinent.

6
bb0wn 1 day ago 1 reply      
Very exciting -- a new major version of Jenkins has been a long time coming.

> "Jenkins 2.0 is a drop-in replacement of the Jenkins 1.x series of releases and fully backward compatible. There is practically no reason not to upgrade once 2.0 is released."

Skeptical, but optimistic about this claim.

I hope that the Jenkins configuration format is something that can be manipulated more easily in the new version.

7
smegel 1 day ago 2 replies      
If there was ever a webpage that needed a facelift, it would be Jenkins, so hopefully this is not just a backend rewrite.
8
Eduard 1 day ago 0 replies      
I hope Jenkins 2 will fix Jenkins 1's tendency for unclear and ambiguous user interfaces. Many configuration input fields have weird names, and their explanation fields can confuse me even more, and when using plugins, the whole configuration layout can quickly become a mess - Jenkins 1 often gives me the feeling of a mighty but hard-to-use Japanese pro tool lost in translation.
9
code_research 1 day ago 3 replies      
Every random piece of software expects some kind of file in the root directory of a project - this is not acceptable and leads to messy project layouts.

Could the developers of these tools please stop dictating where people have to put the "Blahfile" DSL config files and finally allow people to configure this kind of detail.

Also, I would like to propose a common standard directory, "projectroot/config", for these kinds of files, NOT the project root directory itself.

Thanks for your attention!

10
sytse 1 day ago 2 replies      
It is clear that build stages should be first class citizens. We based GitLab CI on the awesome work that GoCD and Concourse did in this respect. Our DSL allows you to assign jobs to stages https://gitlab.com/gitlab-org/gitlab-ce/blob/43e49f52e30199c... and jenkins seems to have picked a similar syntax.
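
For readers who haven't seen the format, here is a minimal sketch of such a stage assignment in a .gitlab-ci.yml (the job names and commands are invented for illustration):

 stages:
   - build
   - test

 build_job:
   stage: build
   script:
     - make build

 test_job:
   stage: test
   script:
     - make test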

What is interesting is that Jenkins has a DSL, but it is not part of the repository. This means that it is hard to extend when you introduce new functionality, and when you push an old branch it might not work anymore. I think the model that Travis CI pioneered clearly wins.

I'm torn on the plugins. On one hand it is great to be able to plug so much in. But the plugins have access deep into the Jenkins internals, preventing a rewrite of core functionality. Our idea is: "Because GitLab is open source the enhancements can become part of the codebase instead of being external. This ensures the automated tests for all functionality are continually run, ensuring that plugins always work. It also ensures GitLab can continue to evolve with its plugins instead of being bound to a plugin API that is hard to change and that resists refactoring. This ensures we have many years of quality code and great monthly releases ahead of us." https://about.gitlab.com/direction/#vision

11
xvilka 1 day ago 3 replies      
Jenkins by itself is eating too much memory on our buildservers, memory which could be used more wisely. Using a compiled language could help that - e.g. Drone.io CI written in Go: https://drone.io/

You can find its sources here: https://github.com/drone/drone

I just hope it would be able to work without Docker, like Jenkins.

13
mschuster91 1 day ago 1 reply      
As the website got HN'd, a question: does Jenkins now support pass-through of parameters in pre/post build actions to other jobs?

Background: I have "deploy frontend" and "deploy backend" job, both of which need to invalidate some layers of caching, and a job parameter that specifies the environment (it's the docroot).

For now I have a metric shitload of shared code between the jobs as there is no way to move the cache clearing code to its own job and pass the environment parameter through in pre/post build actions.

14
dkarapetyan 1 day ago 0 replies      
That's nice but even the current version of Jenkins is buggy as hell and it's been around for a while now. Every place I've been at takes the base Jenkins worker management and then layers routing and job orchestration on top of it through something else because Jenkins itself just can't deal with anything that is beyond triggering and canceling a job. Even canceling doesn't work half the time.

So, given that foundation, they're layering even more complicated stuff on top of it. Thanks, but I'll pass.

15
needusername 1 day ago 0 replies      
Jenkins always struck me as one of the few software projects that don't use OSGi but should be using it.
16
Alupis 1 day ago 1 reply      
I've been using Jenkins for years, and have been overwhelmingly happy. 2.0 is a big step for the project, congrats!

The flexibility Jenkins provides as build system is enormous. The plugin community is also a huge benefit -- chances are, if you have a need, there's a plugin for it.

17
donatj 1 day ago 0 replies      
I've become a big proponent of Drone CI. It's very configurable and Docker powered.
18
foolinaround 1 day ago 0 replies      
Is there a potential release date, so that we could plan for the adoption of the beta for future releases?
19
glasz 1 day ago 0 replies      
ugh. this is still alive? i was done with jenkins the first time i had to use it.

since everybody's at it, i'll recommend github.com/drone/drone. drone really gives docker a pretty use case.

20
t0mk 1 day ago 2 replies      
I came here to write sth like "still in Java... :/" but I can't even check it, the site seems to be hugged to death.
18
More Encryption, More Notifications, More Email Security googleblog.com
339 points by nailer  2 days ago   163 comments top 15
1
codelitt 2 days ago 11 replies      
This seems like much ado about nothing.

I certainly appreciate that the effort is better than nothing; however, how often are those notices served to US/European citizens? It's one thing to stand up to government overreach in foreign countries, but how about the country where you (and a large percentage of your users) reside? They specify attackers, but I'm assuming this notice to the end user does not apply to the US/EU governments requesting your data and them complying?

Another gripe I have is that TLS has probably been broken by the NSA^1. It's better than nothing to alert us about the other party not using it, but it really provides limited protection. PGP/GPG is really the only assurance you have, and the plugins for different desktop apps are nearly always buggy. I end up just manually encrypting/decrypting with GPG because a buggy encryption integration is not a comforting thought. If they really cared about keeping your privacy safe, they'd have an end-to-end encryption tool/integration.
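
For what it's worth, the manual workflow is only a couple of commands (the address is a placeholder and assumes the recipient's public key is already imported):

 # encrypt for a recipient, ASCII-armored so it pastes into any mail client
 gpg --encrypt --armor --recipient friend@example.com message.txt
 # produces message.txt.asc; the recipient reverses it with:
 gpg --decrypt message.txt.asc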

[1]: http://blog.cryptographyengineering.com/2013/12/how-does-nsa...

2
hartator 1 day ago 1 reply      
I think the red lock next to the recipient email is more confusing than anything.

It strongly suggests some kind of end-to-end encryption, like PGP, when there is still nothing. Google still has full access to the plain-text versions of these emails, as do the receiving email providers.

It's creating a false sense of security that can be more damaging than anything.

3
rubyfan 2 days ago 2 replies      
I agree with much of the others commenting here. The IETF strict transport security draft is ridiculous. If every carrier who passes the message can #1 read it and #2 potentially change the content and #3 promiscuously route messages to each other then why does it really matter if they pass it amongst themselves securely? Line security is easily defeated by other 'features' of SMTP.

End-to-end encryption is the only thing that will really matter in email security. And even with end-to-end encryption, email is a flawed medium, since it leaks metadata in the process of message delivery. That is kind of a barrier to secure messaging.

4
Tepix 1 day ago 4 replies      
Google has no interest in end-to-end encryption: It would put them out of the loop. It would be contrary to their mission to analyze the customer data to deliver better ads.

Gmail is a big privacy problem. Even if you don't use it yourself, nowadays a large percentage of your emails will end up there. And why? It's all about laziness and low friction.

Computer-literate persons (that's you, right?) should really consider getting off their butts and hosting their own email. It's not hard, it's not expensive, and it's not a lot of work to maintain either. It can even be fun and informative. By sticking to Gmail, you're no longer credible when complaining about the erosion of privacy on the internet.
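
As a rough illustration of how small a basic setup is, here is a minimal Postfix main.cf fragment; the domain and certificate paths are placeholders, and a real deployment also needs DNS records (MX, SPF, DKIM) and a delivery agent on top of this:

 # /etc/postfix/main.cf (sketch)
 myhostname = mail.example.org
 mydomain = example.org
 myorigin = $mydomain
 mydestination = $myhostname, $mydomain, localhost
 # opportunistic TLS in both directions
 smtpd_tls_cert_file = /etc/ssl/certs/mail.pem
 smtpd_tls_key_file = /etc/ssl/private/mail.key
 smtpd_tls_security_level = may
 smtp_tls_security_level = may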

5
theandrewbailey 2 days ago 2 replies      
> In the 44 days since we introduced it, the amount of inbound mail sent over an encrypted connection increased by 25%.

I'm surprised that it's not more than that. I can imagine executives everywhere asking their IT people "Why do all of our company's emails have this error on them? They are all red and scary!"

6
technion 1 day ago 0 replies      
My major concern here is that to me, end users are reading into this way too much.

I say this because I caught a developer a few days ago implementing an online payment gateway using a Wordpress "form to email" plugin. The ensuing argument came down to his firm belief that email to gmail is now "encrypted", and thus, this is perfectly safe.

We need to be careful about sending this sort of message.

8
ryporter 2 days ago 3 replies      
Why "state-sponsored" attacks specifically? If any group of attackers is targeting me, then I'd be just as concerned. Introducing that distinction seems like it will force Google to determine whether a group is backed by a government.
9
Animats 1 day ago 0 replies      
The mail content is still in the clear inside Google. As long as Google does mail that way, it's not secure. Only end-to-end encryption can provide any security.
10
lallysingh 1 day ago 0 replies      
Ignoring kuschku's telling, this is a big move for a big ship (email security). Congrats!
11
daviddahl 1 day ago 0 replies      
But, no protection from Google in any of this. Sigh.
12
dadrian 1 day ago 0 replies      
Congrats Jon!
13
kuschku 2 days ago 2 replies      
Well, if GMail were open source, and we could self-host it, we could get the same advantage.

Giving all your private data to a foreign company that serves the interests of investors and acts directly against your and your nation's interests is NOT acceptable, and should NOT be common.

14
mtgx 2 days ago 3 replies      
What's the progress on the End-to-End tool? What's the progress on making Hangouts end-to-end encrypted for that matter?

I feel that these improvements, while useful, are a sideshow to stop privacy enthusiasts from switching to better encrypted services or tools, while Google (and Microsoft, and Yahoo) continues to mine all of your private conversations for advertising purposes.

15
satbyy 1 day ago 1 reply      
By chance, I happened to type (note the www.)

 https://www.security.googleblog.com/
This immediately popped up a red warning in Chrome:

 Your connection is not private Attackers might be trying to steal your information from www.security.googleblog.com (for example, passwords, messages, or credit cards). NET::ERR_CERT_COMMON_NAME_INVALID
It seems that the SSL certificate is issued to *.googleusercontent.com. Given that we're talking about Google, I expected the URL to redirect to the non-www https site, but apparently not.
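
One way to check the mismatch from a shell, for anyone who wants to reproduce it (hostname copied from above; the subject printed will depend on what Google is serving at the time):

 echo | openssl s_client -connect www.security.googleblog.com:443 \
     -servername www.security.googleblog.com 2>/dev/null \
   | openssl x509 -noout -subject
 # expected to print the *.googleusercontent.com subject described above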

19
Microsoft demos 'holoportation' 3D presence tech with HoloLens microsoft.com
409 points by Impossible  1 day ago   106 comments top 40
1
hanniabu 1 day ago 3 replies      
I was waiting for this, knew it was only a matter of time before Microsoft came out with something like this since it's right in their field of office products.

Hopefully this is the catalyst needed to create better work/life balance and lower the barrier for remote work and make it just as productive as being present in the office.

2
neals 1 day ago 4 replies      
I never hear or see anybody mention this, but I feel it is important to note that the actual surface area of the HoloLens display is currently way too small to give anything resembling the presented experiences.

They always show the HoloLens used from a third-person perspective, filmed with a camera, not the actual HoloLens capture.

The HoloLens that I got to use (admittedly, almost a year ago now) was just nowhere near the experience of any of the demos. It's just a tiny, tiny augmented display, compared to the massiveness of the headset.

Let's say it's more toward Google Glass than towards Oculus rift.

I really believe in this (type of) product, and I have clients in certain fields where augmentation is just an obvious next step, but it is not going to be what they currently display.

And they know it, because there isn't a single image on the internet where you see what the HoloLens user is actually seeing. Though I'm sure I'll be proven wrong on that in the comments below.

[Edit] Yes, I see there is a small render of the HoloLens view in the bottom-left corner, but you really need to see it compared to your full field of vision to get how small it is.

3
LarryMade2 1 day ago 4 replies      
Nice "demo"...

Microsoft lately seems to be really good at being a "proof of concept developer." Which is cool and all that, but then people wait and forget why it was so cool.

As a comparison, Apple holds back their tech till it is polished and ready for market, so people are excited and start literally lining up to get whatever it is... And when they get it, for the most part they aren't disappointed, because it is pretty much exactly what was presented on stage at the show, and they can show it off while it's still a new shiny idea.

4
sigmar 1 day ago 3 replies      
Pretty cool, but my first thought is: how authentic will communication be when "holoporting" to each other if you both have those big HoloLenses mounted on your heads? It will be hard to pick up on each other's facial expressions and impossible to look into each other's eyes.
5
lipanski 1 day ago 1 reply      
Is it just me or HoloLens sounds a bit like a Hooli product?

All jokes aside, provided that the rendering quality improves, this could have interesting applications for scientists researching facial and body expressions in certain environments. It would enable them to replay scenes and emotions which current 2D/3D video fails to capture, especially because it allows you to change your viewpoint.

I doubt this will be a game changer for teleconferencing, mainly due to the oversized device you have to wear, but then again all these gadgets tend to look a bit creepy in their initial phases (e.g. the Google Glass prototype).

6
cb18 1 day ago 2 replies      
How do they define which aspects of the captured image to extract, model, and transmit? If they were just modeling the humans, I could see using motion detection, or infrared perhaps, but they seem to have also extracted and modeled the little girl's toys.

So I'm curious what method they use to extract just specific bits of the image captured in the holodeck. Perhaps this is answered in one of their papers.

If anyone knows and has a quick overview, or link to a relevant paper that'd be great.

7
Qworg 1 day ago 0 replies      
Here's the actual page on the research - http://research.microsoft.com/en-us/projects/holoportation/

Compare this to previous work at Microsoft, just last year: http://techxplore.com/news/2015-07-microsoft-hololens-video-...

8
noobie 1 day ago 1 reply      
Downvote me if you want but HOLY FUCK! I got goosebumps when he started replaying the interaction in reverse! This is really reaaaally cool!
9
mpnordland 1 day ago 1 reply      
I realize that I'll probably never get something like this, but dang, that was cool. There seemed to be a distinct quality difference between the realtime created models and the prerecorded models. The realtime ones had issues, whereas the prerecorded ones (like the girl with the dog) were smooth and nice. I definitely think that if this goes mainstream, higher quality models will be generated in most cases and realtime stuff will only be used if there isn't one available.
10
DanAndersen 1 day ago 0 replies      
This is pretty exciting because of its potential for telementoring. I've been working on a research project using tablets and augmented reality for surgical telementoring in austere environments ( https://engineering.purdue.edu/starproj/ ), but a setup like what MS has here would offer so many benefits. Imagine a trainee surgeon being able to feel as if a mentor surgeon was actually present and able to gesture and give instruction during the course of a live surgery, for example.
11
kazagistar 1 day ago 0 replies      
I see they learned from their previous campaigns, where they focused on the ability to "work from anywhere" and were criticized[1] for creating a strange, dystopian vision of all work and no leisure. Now, they are focusing much more on the personal aspects in their marketing, even though their primary market is fairly likely to be business.

[1] https://signalvnoise.com/posts/3683-microsofts-dystopian-pit...

12
blaze33 1 day ago 0 replies      
If you look at the tv that's in the room, it looks like it's actually showing the real live 3d reconstruction with what I'd guess is a 500-1000ms latency.

So the main video that is in sync (when the guy has his own hologram in superposition) was certainly edited afterwards.

13
ksec 1 day ago 0 replies      
Give them another 4 years to perfect their technology, that is 2020, and another 5 years to iterate and drive adoption. I can see that in 10 years this will have a massive impact on office work. Fewer business trips.

I don't think it will replace business travel, but there will be a lot less of it. It is also dependent on each country's broadband quality. (Maybe in the future the job requirements will say you need 100Mbps broadband plus low latency.)

14
veritas3241 1 day ago 1 reply      
He seemed extremely careful not to let his daughter walk through him. He always made room for her. I wonder if there have been studies done to assess how creepy it might be to have a family member walk right through you...
15
stuart78 1 day ago 0 replies      
Remarkably impressive, but I have to admit to being a bit creeped out by the save-and-relive stuff at the end. A bit too close to the dream machine in "Until the End of the World" for my taste.
16
eganist 1 day ago 1 reply      
So, an idea:

How much work would it take to segregate just the augmented reality displays as a pair of glasses that a user can wear while receiving the display data wirelessly? I say this because once you can do that, you can have (for instance) an executive conference room setup with a number of identical conference rooms scattered around the world, each with a server performing the hololens rendering logic and rendering across _n_ glasses per room. No bulky hololens computers-on-head and an eerily realistic recreation of one conference room with all participants in-room.

I don't know the exact patents Microsoft filed on the display technology they have for the visors, but I suspect it's not yet easy enough to compress them into anything close to Google Glass (you'd need larger glass in any case). One can hope, though.

17
rosme 1 day ago 0 replies      
Just this week I wrote a blog about how to build your own holographic studio by using multiple Kinects and RoomAlive Toolkit. Nowhere near the sophisticated capture that Microsoft Research demoed of course. http://smeenk.com/build-your-own-holographic-studio/
18
kodablah 1 day ago 0 replies      
Install this at sports stadiums to not only make my own viewing more "embedded" in the action, but also allow replays from any angle (granted, lower res than they are used to for really tight calls). I imagine that many sports organizations would be lining up outside MS research's door with cash in hand. Though the tech probably needs to advance quite a bit more to capture that many 3D objects in real time.
19
happi-live 2 hours ago 0 replies      
It's very cool!
20
Heliosmaster 1 day ago 0 replies      
This is not cool just for work. I can definitely see myself using it to feel closer to my family, a few thousands miles away.

Also, this is research. Too many of these comments are bashing it, mostly because it comes from MS I guess. We can still hate their products, but the research is really useful.

21
freekh 1 day ago 0 replies      
Pretty cool demo! Maybe they got inspired by what this guy did a while back with 3 Kinects? https://www.youtube.com/watch?v=Ghgbycqb92c
22
CM30 1 day ago 0 replies      
Hey, this looks pretty impressive. It also makes me want to imagine some of the video game uses of the technology, like say, how cool a Five Nights at Freddy's game would be based on this tech...

To some degree, it also reminds me a bit of a certain villain from Teen Titans...

23
fuddle 1 day ago 0 replies      
Thats very impressive, looking forward to seeing this advance in the next few years.
24
gajomi 1 day ago 0 replies      
It's expensive, which makes me want to think of who would buy these first. There might be a market in medicine (think Vilayanur Ramachandran style phantom limb therapy) and psychiatry.
25
breezest 1 day ago 0 replies      
The concept is great but the equipment must be very expensive.

It seems multi-national companies could make money by providing a private room to enjoy the 'holoportation' service.

26
iamleppert 20 hours ago 0 replies      
I can't even, the video "demo" has been so obviously post-produced.
27
imron 1 day ago 0 replies      
Shrinking it all and putting it on the coffee table was a nice trick.
28
azinman2 1 day ago 0 replies      
So amazing. Can't wait until all you need is a few tiny webcams and something like magic leap allows it to be in your glasses so you can actually see the other person unobstructed.
29
davkap92 1 day ago 0 replies      
Microsoft Office 3D here we come, walking past life size graphs and real life spreadsheets
30
foota 1 day ago 0 replies      
This team must have the best remote standups.
31
jonny1090231 1 day ago 0 replies      
Very impressed! I can't even begin to imagine the possibilities that this tech could bring to the table.
32
hiharryhere 1 day ago 0 replies      
very very cool. The guy in the video is an excellent presenter too.
33
murbard2 1 day ago 0 replies      
The case against supersonic plane travel..
34
maxpert 1 day ago 0 replies      
Microsoft just made PORN more convenient!
35
kordless 1 day ago 0 replies      
These things will be the end of all of us.
37
greenspot 1 day ago 0 replies      
1st thought: next Skype

2nd thought: 4 cameras and some space required

3rd thought: command-w

38
vic_nyc 1 day ago 0 replies      
The first and only promising technology from Microsoft I've ever seen :
39
hoodoof 1 day ago 1 reply      
Microsoft seems to be fiddling around in research while Sony runs away with the Virtual Reality cup.
40
deckar01 1 day ago 1 reply      
I don't understand why they need to pretend to support features they haven't developed. It makes something that is obviously advanced feel cheaper and fake in some way. The daughter is obviously a recording, but they call it "live", say she can't hear him, then he gives her audio cues. Who is editing this and what are they thinking?
20
Andy Groves Warning to Silicon Valley nytimes.com
340 points by hvo  22 hours ago   234 comments top 29
1
luso_brazilian 21 hours ago 16 replies      
> "There was room for improvement, he argued, for what he called job-centric economics and politics. In a job-centric system, job creation would be the nations No. 1 objective, with the government setting priorities and arraying the forces necessary to achieve the goal, and with businesses operating not only in their immediate profit interest but also in the interests of employees, and employees yet to be hired."

Although a valid concern, putting "job creation" as a goal for governments can (and, in a lot of occurrences in recent history, did) backfire spectacularly.

In the most reductive analogy, it creates an incentive for the government to create "hole digger" and "hole filler" type jobs that, in aggregate, generate very little useful work while fulfilling this basic goal of job creation.

I believe this "job creation as priority" approach is inferior to both the "laissez-faire" capitalist alternative and the "basic income" social democratic one.

As an example of its dangers, it suffices to look at the Kafkaesque process of fund allocation (and sourcing) for the publicly funded aerospace industry, both military procurement (fighter jets, bombers) and the civilian NASA side.

2
p_monk 20 hours ago 3 replies      
Adam Smith was making this same argument the only time he ever used the term "invisible hand." Smith assumed (incorrectly) that capitalists would always prefer their own domestic markets. For Smith, his conception of capitalism was good because it offered the best chance at achieving equality. However, globalization of capital has proven that his underlying assumptions were incorrect, so it may very well be that Smith today would have seen alternatives to capitalism as better suited to provide equality. I believe Smith's beliefs would have led him to be something like what we call a "market socialist" these days.

The original passage from Wealth of Nations:

"He generally, indeed, neither intends to promote the public interest, nor knows how much he is promoting it. By preferring the support of domestic to that of foreign industry, he intends only his own security; and by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention."

3
awakeasleep 21 hours ago 1 reply      
> "All of us in business," Mr. Grove wrote, "have a responsibility to maintain the industrial base on which we depend and the society whose adaptability and stability we may have taken for granted."

That's the money quote to me, and so much of it is overlooked when talking about industry and, to a large extent, social welfare programs. I wish the Overton window included the concept of the costs required to maintain a useful, stable society.

4
afarrell 21 hours ago 2 replies      
There is a conflict of values to consider here: do we place a higher value on preventing relative poverty in the United States or extreme poverty around the world?

"""Poverty rates started to collapse towards the end of the 20th century largely because developing-country growth accelerated, from an average annual rate of 4.3% in 1960-2000 to 6% in 2000-10... China is responsible for three-quarters of the achievement."""[1]

[1] http://www.economist.com/news/leaders/21578665-nearly-1-bill...

5
narrator 16 hours ago 4 replies      
The mystery at the center of all this is "Why is it so astronomically cheap to do X in Asia vs. the U.S.?". Not just cheap as in low wage, but cheap as in finished goods costing less than the cost of materials delivered to the American factory. It's not just manufacturing.

In Thailand you can live in a high-rise condo with a gym and maid service, eat out every day, and do some traveling for $2000/month. That lifestyle is 8x more expensive in San Francisco. In Argentina before the 2002 crash, the country was very expensive, but it wasn't some luxurious paradise, so it's not just that it's a developing country. After the 2002 crash the prices were absurdly cheap. Something just doesn't make sense, and there's an enormous amount of denial that anything's amiss. Prices are a mystical, sacred phenomenon, it would seem. American companies don't care why prices are cheap; they just build stuff where the prices are cheap and sell it where they are high. They couldn't care less about the mystery, and neither could the politicians. It's just the magical mysterious market.

Health care prices are also a mystical, sacred phenomenon. Why are they 10x in the U.S. what they are in other countries? People in those countries get sick and die too, and are willing to pay whatever money they have to get better. Prices don't necessarily scale with income per capita either. I am sick of pat answers to this question and "just-so" stories. Someone should take some cost accountants and figure out where all the money is actually going for these kinds of things and write some journal articles. It would probably reveal a lot of bizarre and interesting rough edges on the economy, and maybe a fair amount of business opportunity.

Some researcher should go look at the cost to build an Intel fab in Taiwan vs the U.S and start asking why. Start with why is it 1/5 the price to build the fab in Taiwan: 1. Because the construction company charges that much. Why? 2. Because the labor is 1/2 and the materials are 1/3. Why? Because the labor's rent is 1/5? Why? Because the land for the apartment building was 1/10 the price? Why? Because there are no mortgages and the land had to be bought with savings from an export business, etc, and follow it all the way down for everything. Depth first search! Piles of fascinating economic nuggets are just sitting there untouched.

6
paulpauper 19 hours ago 0 replies      
The lump of labor fallacy is a pervasive one

Despite all the hype over outsourcing, the number of Silicon Valley jobs is close to the 2000 highs

http://www.spur.org/sites/default/files/wysiwyg/u168/Screen%...

Silicon Valley job growth has pretty much tracked the rest of the nation: https://upload.wikimedia.org/wikipedia/en/8/8e/US_Labor_Forc...

Job creation and destruction is an inescapable part of a dynamic economy.

7
e0 21 hours ago 1 reply      
Here's a direct link to the Andy Grove's opinion piece referenced in the article: http://www.bloomberg.com/news/articles/2010-07-01/andy-grove...
8
cleandreams 20 hours ago 1 reply      
I think the rise of Trump reflects the reality of a declining middle class and the overall problem of political stability in the face of declining opportunity and economic security for most people. In business it is easier to focus on the short term win. Corporations have profited from being stateless, from offshoring and outsourcing. But it remains to be seen what the long term cost will be. I suspect the backlash will be strong because when people have their backs to the wall they lash out.
9
Alex3917 21 hours ago 2 replies      
If the U.S. really had a shortage of prosperity then this would make sense. The problem is that the U.S. already has more wealth than it knows what to do with, it's just that it's completely misallocated.

And given the fact that sending our designs to China to get manufactured is basically the only thing stopping them from going to war with us, I'm not convinced it really makes sense to give up this benefit in order to 'fix' domestic problems that don't really have anything to do with globalization.

Certainly there could be more domestic manufacturing than there is currently, but just because the current system is causing some problems doesn't mean we should discount the benefits.

10
jakelarkin 21 hours ago 1 reply      
Consumers will pick the best cheap gadget regardless of origin. Investors want maximal profits in a 1-3 year time frame. The system unfortunately does not afford much wiggle room for ethical and national duty when working within those constraints.

Grove's outlook was also colored by the difficulties of the semiconductor industry. In Asia, in addition to cheap labor, governments would prop up barely profitable fabrication companies as national champions, giving them subsidized loans and regulatory favors. American investors and management, perhaps rightly, balked at trying to compete on that front.

11
dmritard96 20 hours ago 2 replies      
There is supposedly some reshoring in the US but I don't really see it from hardware startups unless they have something bulky/specialized/low volume. The bigger question is not whether we should employ in one taxation region or another to concentrate wealth as we see fit, but rather how quickly generalized arms, machine vision and new manufacturing processes will impact our ability to rely on labor as a source of societal stability.
12
nxzero 4 hours ago 0 replies      
Often find it troubling that people find it so easy to cast an opinion on a topic they so poorly understand. Yes, this topic is important, no, I don't understand it. What is the best way to understand the current situation?
13
jmspring 12 hours ago 0 replies      
Maybe I've been around Silicon Valley too long (born here), but the key paragraph of the whole story really is this --

Mr. Grove acknowledged that it was cheaper and thus more profitable for companies to hire workers and build factories in Asia than in the United States. But in his view, those lower Asian costs masked the high price of offshoring as measured by lost jobs and lost expertise. Silicon Valley misjudged the severity of those losses, he wrote, because of a misplaced faith in the power of start-ups to create U.S. jobs.

MBAs and money people look at immediate costs, managers look at engineering costs, and lost in the off-shoring debacle is the fact that you actually aren't saving money in the long term.

There are specific exceptions in the case of leveraging specific expertise: for me, needing a WinMo-optimized two-way DH key handshake in assembler, sending the code in an email on Dec 23 and receiving the results on Dec 25 was great. But most outsourcing isn't targeting specifics; it's the generalities, and that is a big issue. Costs are masked.

14
redwood 21 hours ago 2 replies      
Germany sets a great example here with its local industrial production base and culture.
15
jboydyhacker 19 hours ago 0 replies      
Another way to view what Andy Grove was saying is that we have neither capitalism, socialism, nor central planning. Rather, we have a toxic mix in many industries like health care, defense, drug development, etc., i.e. the comment that what we have is inferior to "laissez-faire" or free markets.

China has used central buying/orders to drive up artificial demand and as a result has a massive overcapacity and debt problem that appears to be stickier and worse than ours.

I think the best lesson is that it's better to make a clear choice on the system used to allocate resources rather than hybridize it. A US example? Healthcare: either go completely public like those in Europe, which spend 4-6% of their GDP to cover health, or go 100% private and move government out of it. These hybrid systems have us spending 20% of our income on healthcare, so we get the worst of both worlds.

16
hbt 16 hours ago 0 replies      
Note he didn't provide a solution to scaling companies domestically other than taxing offshore products.

That type of protectionism may work domestically but it means that American companies cannot compete overseas if they are being undercut on price.

It looks like technology will fix the problem in time. American companies will start manufacturing in the US but there will be no jobs, just automation.

The focus should really be: how can we increase the intelligence of the average worker so they can compete in the new economy and not just "live life on basic income" or "get an overseas job waiting to be automated".

17
kushti 18 hours ago 0 replies      
"that the free market is the best of all economic systems, the freer the better" - the US is following this principle only until the "too big to fail" guys have problems.
18
Ericson2314 15 hours ago 0 replies      
Very simple: in the short term, as others have mentioned, the US's infra is absolutely shit and we need to pour money into that. In the long term, automation will break the Keynesian feedback loops that underlie such New Deal programs, and we should switch to basic income instead.
19
wahsd 47 minutes ago 0 replies      
It is the biggest flaw of the existing instantiation of capitalism in the USA that major non-financial factors are simply left out, ignored, or simply unrealized in the calculation of value. There is massive damage being done to the USA, not just by Silicon Valley (even though it plays a major role), that is very short-sighted and trades off massive profits, gains, and power in the future for short-term gains that are primarily financial. It's very much a kind of monopolistic system, and not many people realize the astronomical cost of the deadweight loss to the American people. To put it in terms that Silicon Valley types may understand: we have nothing even close to market fit and are operating at negative gross margins if you take all factors into account. It is easy to fool oneself that all is well when you cherry-pick your parameters and measures.
20
hyperpallium 19 hours ago 0 replies      
Well, most people need employment, so of course that should be a priority of a government for/by/of the people.

The article skipped what he meant by "priorities" and "forces". Presumably it's not forcing companies to make up jobs, but things like lowering employment transaction costs and making the people more valuable.

Education is the obvious one, but it needs to be education that actually does make one more valuable. Also make sure the people are healthy. Better infrastructure - pretty much all those "government" things.

21
yasky 19 hours ago 1 reply      
Dear Hacker News - after scrolling 3 pages I still haven't seen the 2nd comment. Can you please implement collapsible comments? Please!?
22
matchagaucho 20 hours ago 1 reply      
I love Andy, but it can be difficult to reconcile this article with the pro-globalization message in his book "High Output Management".

Had Intel outsourced the manufacturing of their memory chips during their "scale-up" phase, it's conceivable that they may have established more "partners" and fewer "competitors"... similar to what Apple has accomplished today with iPhone manufacturing.

23
Nutmog 14 hours ago 0 replies      
This makes no sense from a global perspective. Why was Andy keen to reduce job growth in Asia and also reduce technology growth in the whole world? Did he think Americans were more deserving than Asians? That stance would seem clearly unethical from a humanitarian point of view, considering the higher poverty levels that are still present in Asia.

If local=better then perhaps he should have focused on job growth in California and worried about companies that export their growth phase jobs and expertise to other parts of the US - especially underdeveloped parts which won't contribute much back except cheap labor. That's also costing Californians their jobs. What makes Iowa or Arkansas more important than a booming megacity in China?

24
Negative1 20 hours ago 2 replies      
Truly an inspiring man.

Jobs for the uneducated are disappearing, and saying "well, let's just educate everyone" is not enough. I'm racking my brain trying to come up with ways that tech can solve that problem, but it seems like an uphill battle (e.g. Uber makes anyone a cab driver, but in a few years it will fire all those people when automated cabs hit the road).

25
kelukelugames 22 hours ago 2 replies      
I majored in circuits because back then that's where the money and talent went in Silicon Valley. Many of my classmates have since switched to software. The few remaining IC and fab people make pennies compared to the programmers. They also have PhDs compared to our BS and Master's degrees.

Software seems more robust but I don't trust myself to make predictions.

26
sbardle 19 hours ago 3 replies      
Manufacturing isn't coming back. Do you think your average American would put up with the conditions in a Chinese iPhone factory?
27
nolepointer 16 hours ago 0 replies      
Make America great again.
28
ilaksh 17 hours ago 0 replies      
Technology can help us to make the fundamental economic structures more sophisticated in order to handle more complex goals.

The problem with running everything with simple money is that it is like running a body on doped water alone. I can't even say blood because money is one dimensional and blood has many useful components.

I think something like Ethereum is going in the right decentralized and high-tech direction. If we can have a common information platform then we can wire up our systems to be more decentralized and also have the capability to do holistic calculation and goals as well as evolve the system.

So I think having society integrate common information platform is key to turning this from a bunch of monopoly games into more like a wired game of Eve Online, or at least something where we can keep track of more than one stat ($) per entity.

29
x5n1 21 hours ago 0 replies      
No one cares. People with capital looking to make profit see employees as interchangeable machines and as a cost center. They don't want to pay for training, i.e. 2 years experience for a junior position. They don't even want to employ people, i.e. contract work forever. They don't care if there is a loss of skills in the local market, they only care about what they have to pay now to get where they are going in the next 6-12 months. After that is anyone's guess. Don't expect them to do anything but increase their own well-being at the cost of everyone else.
21
D'Oh My Zsh How I unexpectedly built a monster of an open source project medium.com
376 points by werrett  3 days ago   93 comments top 20
1
kbd 3 days ago 5 replies      
Could someone please explain to me the attraction in using Oh My Zsh (and similar)? It seems strange to me to use others' configs.

Over time I've customized my bash config and have all the information I want in my prompt. If I ever switched to zsh I'd just learn how to translate what I have in bash. Why would I want to start with someone's big framework for configuration?
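
For readers wondering what the hand-rolled alternative actually looks like, here is a minimal sketch of the kind of bash prompt customization the comment alludes to. It is illustrative only, not the commenter's config, and the git-prompt.sh path is an assumption that varies by distro:

 # ~/.bashrc -- minimal hand-rolled prompt (illustrative sketch only)
 # Source git's optional prompt helper if present; the path is distro-specific.
 if [ -f /usr/share/git/completion/git-prompt.sh ]; then
   . /usr/share/git/completion/git-prompt.sh   # provides __git_ps1
 fi
 # user@host cwd (branch)$ -- degrades gracefully if __git_ps1 is missing.
 PS1='\u@\h \w$(type __git_ps1 >/dev/null 2>&1 && __git_ps1 " (%s)")\$ '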

2
smitherfield 3 days ago 3 replies      
A personal pet peeve is when people refer to having lots of features as "bloat." It's generally very nice to have lots of features all in one place without having to do things manually, or find somebody's potentially-sketchy plugin/app.

My test for "bloat"

1. Has the project grown so large that it's starting to run into real-world performance issues? (I can't imagine this could be the case with even the largest of shell configurations except on very low-end embedded devices).

2. Has the project grown so large that bugs are popping up faster than the developers can do maintenance? Is there no or only one person who can read and understand the entire codebase? (AFAIK, no and no).

3. Are there many undocumented/poorly-documented features? Are there features that are both undocumented/poorly-documented and dangerous? Are there many deprecated or outdated features that have yet to be removed? (AFAIK, no, no, no).

4. Are there many features that both duplicate and do not improve on functionality found elsewhere? (Debatable and mostly subjective).

3
shpx 3 days ago 7 replies      
If you like oh-my-zsh you'll love https://fishshell.com/. The main differences I use are better command completion as you're typing, word-by-word completion with alt-f, and Ruby-like syntax.

for example (| is the cursor and everything after it is grey text)

>echo hello world

hello world

>echo |hello world # alt-f

>echo hello| world

Fish and oh-my-zsh both take about 5 seconds to init though. If you don't like that you should be using prezto (which is the fork he mentions in the article)
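
(If you want to check that init-time claim on your own machine, a quick sketch using standard flags; the numbers will obviously vary with hardware and config:)

 time zsh -i -c exit    # interactive zsh, so ~/.zshrc (and oh-my-zsh) is sourced
 time fish -c exit      # fish reads its config for each session by default
 time bash -i -c exit   # baseline for comparison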

4
cies 3 days ago 2 replies      
"I like Prezto[1] nowadays" -- an `Oh my zsh` refugee

1: https://github.com/sorin-ionescu/prezto

5
sdegutis 3 days ago 4 replies      
I heard of zsh back around 2012; lots of colleagues who I respected very highly were using it. But I just never could get behind the idea of using someone else's defaults. Even when I switched to Emacs, I hand-picked every line in my configs from looking at a bunch of people's configs and with a lot of random googling to solve things. Even though it took like 2 weeks of constant tweaking to get just right, I haven't touched it much in like the past 3 years or so, and I've been super productive ever since. So meh, seems to have worked out for me. But YMMV. Also, I use eshell (with some tweaks) almost exclusively now, as opposed to a "real" terminal with bash or fish or zsh (etc.).
6
scosman 3 days ago 6 replies      
"Id become dependent on these shortcuts."

The intro to this article is as much a caution against becoming dependent on non-standard tools as it is a pitch for omzsh. If you can't sit down at a normal bash window and get shit done, your shortcuts are hurting you.

7
wsha 3 days ago 0 replies      
I like learning from other people sharing their config files, but my attitude towards oh-my-zsh is similar to that of the author's co-workers in that I don't want to install a bunch of customizations that I don't understand. I couldn't find a summary of everything oh-my-zsh is supposed to do, and the source has grown too large for me to read it quickly. I guess I trust code I haven't read most of the time I am using a computer, but it feels wrong to me to allow my shell to auto-update customizations that I don't understand.
8
ignoramous 3 days ago 0 replies      
Reminds me of Nathan Marz's great piece about lessons learned from creating and maintaining Storm as OSS: http://nathanmarz.com/blog/history-of-apache-storm-and-lesso...
9
sethrin 3 days ago 3 replies      
910 contributors, 191 issues, 516 pull requests, and his response is that "reviewing and approving pull requests is a nice-to-happen versus a need-to-happen." While I'm glad I am not in the position of maintaining that (or anything else important), that doesn't really speak well to the long-term prospects of the project. Clearly this is something that I and many others find very useful; it would be a shame to let it stagnate. The glib part of me would suggest either 'stepping up or stepping down', but I can't really credibly offer solutions; I'm just trying to point out a problem.
10
voltagex_ 3 days ago 1 reply      
I've tried a number of times to switch to zsh, ostensibly for oh-my-zsh. My main issue is that bash is the default almost everywhere so it's more work to change it than it is to just be "happy enough" with Bash.

I used to work on a fairly underpowered ARM5 and I could feel the impact of most prompt customisations on the speed of the system, especially on initial login. That feeling is still there - mainly because I haven't found the right SD card for my Raspberry Pi.

To avoid this becoming a complete ramble - are there any advantages to switching to zsh as someone who's reasonably comfortable with bash? Hell, even OS X switched (and boy, t/csh was a shock when using FreeBSD).
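
(One concrete, partial answer to that question: a few things plain zsh gives you over stock bash with no framework at all. A hedged sketch; the option names are standard zsh, but check "man zshoptions" on your version:)

 setopt AUTO_CD            # type a bare directory name to cd into it
 setopt HIST_IGNORE_DUPS   # don't record consecutive duplicate commands
 ls **/*.log               # recursive globbing without reaching for find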

11
julie1 2 days ago 0 replies      
Humm ... just for fun, I looked to see whether anyone had spotted in the comments that there is a creepy feature the author likes.

Periodic automated arbitrary code execution from a remote source.

Here is a list of the stupid ideas that old coders warned from

 - arbitrary remote code execution [X] this, curl|bash
 - too many dependencies [X] npm
 - lack of specifications, staging [X] Agile
 - non-deterministic HW [X] Intel
 - non-deterministic software [X] llvm/gcc/AI
 - single point of failure [X] github/CA
 - attack by majority on P2P [X] blockchain, bitcoin
 - bigger sloc is more bugs [X] heavy frameworks
 - using immature technologies [X] haskell
 - bloatwares [X] angular
 - private corp standardizing [X] QUIC & al, browser wars
 - beware of information entropy [X] big data
 - moving parts [X] the Cloud
 - higher surface of vulnerability [X] IoT
 - monopolies [X] google
 - using private cie for infra [X] github is the new sourceforge
 - putting half-baked std in prod [X] IPv6
 - lack of consistency [X] most nosql tech
 - legal risk due to IP law [X] coding by copy/pasting
If I were an old coder still coding, I would say we are very close to a singularity: the total lack of trust that could result from all this is simply customers reverting to fax, teletypes, snail mail... or going to court to ask for financial compensation.

If you need an expert to help you on this, I can help.

12
spystath 2 days ago 0 replies      
A light but featureful alternative to omz is also the grml zsh configuration [0]. I've been using it since 2011 or so and I've probably touched my .zshrc once or twice. If you fancy some colors you can also add some syntax highlighting [1]. Or just use fish which is great for interactive use!

[0]: https://github.com/grml/grml-etc-core/tree/master/etc/zsh

[1]: https://github.com/zsh-users/zsh-syntax-highlighting

13
draw_down 2 days ago 0 replies      
I understand the author's viewpoint but I would probably get rid of it the first time it asked to auto update.

Customization is nice but I guess I mostly prefer to spend as little time thinking about shells as possible.
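
(If I remember the framework's knobs correctly, oh-my-zsh does ship a switch for exactly this; treat the flag name as an assumption to verify against the docs before relying on it:)

 # ~/.zshrc -- set before oh-my-zsh is sourced to silence the update prompt
 # ($ZSH is the install path the standard .zshrc template defines)
 DISABLE_AUTO_UPDATE="true"
 source $ZSH/oh-my-zsh.sh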

14
vidoc 3 days ago 0 replies      
Reminds me how lame those geek t-shirts are, and how vulgar it is to put stickers on laptops!
15
beefsack 3 days ago 0 replies      
I absolutely love bash + powerline. You might know powerline if you're a Vim user.

http://i.imgur.com/3FKaEIy.png

It's incredibly easy to set up, I have a script to do it[1] but doing it by hand is trivial.

[1]: https://github.com/beefsack/bash-powerline-installer/blob/ma...

16
onetimePete 2 days ago 1 reply      
The irony is that after all those years, we still haven't found an optimal way to figure out when it is a good time to annoy a user about update decisions. Do it during the system start-up phase? Do it before they go on a break? Do it upon return to the system, when work was already interrupted? Do it shortly before shutdown?

No, annoyia be praised, it must be when the user has focused for longer than 5 minutes on something.

17
fvargas 3 days ago 0 replies      
> It's March 22, 2016 and the top trending repository on Github is ?

Not oh-my-zsh. It was top trending for the Shell category, not for all of GitHub.

18
gjvc 3 days ago 3 replies      
"oh my zsh" is too slow to be life-enhancing
19
jarjoura 3 days ago 0 replies      
I adore Oh My Zsh, way better than bash and the completion plugins are extremely helpful!
20
OJFord 3 days ago 1 reply      

 > This wouldn't my first foray into open source software; > nor my last.
I know that I'm annoyed perhaps too easily by poor grammar - but the opening sentence, really?!

22
Red Hat becomes first open-source company to make $2B zdnet.com
262 points by simonebrunozzi  3 days ago   82 comments top 12
1
acomjean 3 days ago 0 replies      
I was at a business looking to switch from HPUX/Solaris to RHEL (Red Hat Enterprise Linux). We had some HPUX realtime(ish) extensions, used IPC heavily, and really used the scheduler/processor set functionality in HPUX. The scheduling was important, so transitioning wasn't going to be easy (endian issues aside).

They sent a bunch of us to a week-long Linux internals course at Red Hat. Really excellent class, knowledgeable instructor (turned us onto Fedora, Linux Weekly News (https://lwn.net) and CentOS before Red Hat partnered with them). When the class wasn't exactly what we expected, the instructor took the last day to go over some of the processor scheduling / real-time extension stuff we needed to know (my company's employees were the only ones taking the class). The OS transition was shelved for a bit and I ended up leaving that company, but the course really changed my mind about Red Hat.

Good for them.

2
rileymat2 3 days ago 2 replies      
Minor note. If I am reading the article correctly, it has revenue of 2B, not earnings.
3
jedberg 3 days ago 3 replies      
Are there any open source companies that make even 1B a year? The reason I ask is because anyone with a model of "open source our core product" makes me concerned for their survival. It seems like a tough business model and Redhat is the only (moderate) success I can think of.

When your biggest competitor is yourself at a $0 price point, how do you compete?

4
shmerl 3 days ago 0 replies      
Good. Some use Linux, and barely contribute back. Getting RHEL subscription is a good way to do it, because RedHat does a lot of work on improving Linux.
5
chris_wot 3 days ago 3 replies      
Man, what I'd do to work for that company! And I don't say that about many businesses.
6
eggy 3 days ago 2 replies      
Good for RH! AFAIR they abandoned the freely-downloadable Red Hat Linux, and started charging for the pre-compiled versions of Red Hat Enterprise Linux (RHEL), and this is when they started making money to be where they are today. Didn't they have a falling out with Linus?
7
kristianp 3 days ago 7 replies      
What's so good about RHEL/fedora vs debian or ubuntu server, say?
9
IamFermat 3 days ago 5 replies      
When you think about it, it's crazy it took them this long to reach $2B in revenue. It took them 23 years since they were founded in 1993; Facebook hit $2B in revenue in 6 years. Which is crazy when you look at the comparison, and RH is the most successful open-source company. Open-source is a failed model for commercial success.
10
kachnuv_ocasek 3 days ago 3 replies      
What does 'open-source company' mean? One could claim that Microsoft is one as well.
11
otterley 3 days ago 0 replies      
Mods, please edit the title: that's not what the article says. A $2B company is not identical to a company having $2B in earnings.
12
fred_is_fred 3 days ago 0 replies      
But isn't Apple an Open Source company? ;)
23
Why Anti-Authoritarians Are Diagnosed as Mentally Ill (2012) madinamerica.com
371 points by kushti  2 days ago   282 comments top 36
1
fallingfrog 2 days ago 17 replies      
This comment below the article triggered something deep inside me: "So a 7 year old who refuses to read or practice reading is healthy anti-authority individual?". Let me tell you that when I was a kid I loved to read; I read voraciously, everything I could find, and I did so because nobody made me do it. I can assure you that if a teacher had sat me down and forced me to read something boring at that young age, I would have hated the teacher, hated reading, hated school, and hated anyone who tried to get me to do it again, forever. Believe me, some kids just love freedom! Please, if there are any teachers out here reading this, do not kill the fragile thing in your kids that loves to learn by forcing them to read or do something boring with no explanation except "do it because you said so"!
2
humbleMouse 2 days ago 14 replies      
This is very interesting to me. I went to a psychiatrist last week and she told me I have "delusional thoughts about the government." She asked me what gives me anxiety, and I told her that it freaks me out that the government collects all of our data and wants to take away encryption. She told me I am nuts and that the government doesn't do that. C'est la vie!
3
jensen123 2 days ago 11 replies      
I think it's interesting to look at how psychiatric disorders are invented. For example, is homosexuality a mental disorder? It's basically just a bunch of psychiatrists getting together, suggesting stuff and voting on it. It is not science.

http://www.theatlantic.com/health/archive/2013/05/the-real-p...

4
meric 2 days ago 4 replies      
The definition of "Authority":

the power or right to give orders, make decisions, and enforce obedience.

We're all free-willed individuals. Accepting orders, accepting others' decisions, and obeying others are voluntary actions. As soon as we refuse to follow an order, refuse to obey, the "authority", by definition, returns its power back to ourselves. Authorities are only authorities because, by our grace, we allow them to be authorities.

The crazy ones are those who would voluntary give up their own initiative to the stupid, to the power obsessed, to the egocentric, to the selfish.

5
openasocket 1 day ago 2 replies      
OK, my father deals with people with ODD and conduct disorder on a regular basis, and I think this article is really mischaracterizing the condition. You don't have ODD if you have a reason for defying authority, even a bad reason. ODD is defined by problems with planning and inhibition. I was actually talking with my father just last night about ODD, and he described an experiment to me. Imagine you have a patient play a card game in which they have the chance to win money or lose money. Initially, the game is rigged in the patient's favor, but as the game goes on it gets harder and harder for them to win. Most people will figure out that the odds have changed and stop playing, but people with ODD or conduct disorder will continue playing far longer than anyone else because they have poor planning skills. Describing people with ODD as simply anti-authority really misses the crux of the condition.

EDIT: I shouldn't criticize the article, but rather the people who are doing the over-diagnosing. Just thought I'd give some context on the condition, so people understand what ODD really is.

6
Aqueous 1 day ago 0 replies      
"Anti-authoritarians question whether an authority is a legitimate one before taking that authority seriously."

But there are also some individuals who think any authority is illegitimate. I have friends and close relatives who I would characterize as having this personality trait. I also observe that it has a negative impact on their quality of life. They have trouble holding jobs, staying in school, keeping friends, and building personal romantic relationships because they are so resistant to being accountable to any other person - accountability which translates in their mind to obedience to authority.

Authority isn't always illegitimate. That's why there is a such a thing as a pathological degree of anti-authoritarian sentiment.

7
Joof 2 days ago 2 replies      
My (former) psychiatrist labeled me noncompliant for requesting modern studies on SSRIs not done by the selling company before taking them. I guess I fall in that category.
8
joeax 2 days ago 2 replies      
People that have an unrelenting, almost worship-like devotion to authority scare the bejesus out of me. Imagine what a great society we would be if we were all free thinkers.
9
Santosh83 2 days ago 1 reply      
Any deviant act or view is probably anti-authoritarian if by authoritarian you also include the "authority" with which mainstream views and regulations are imposed, sometimes by law, more often by majority pressure.

Yet everything mainstream isn't somehow bad. Humans survive only by cooperating and banding together. It's just that we need to strike a golden mean between listening to authority and being unique. It's a matter of intelligence, applied on a case-by-case basis, and it will work only if most people in a group practice it.

10
tristor 1 day ago 0 replies      
Having experienced some of these things in my own life, I can honestly say that I do believe that being an anti-authoritarian is often mistaken for a mental problem.

I believe that authority is the wielding of power by an institution or individual, and power is only useful if you voluntarily cede control (this doesn't always apply, but does in 99% of cases). Refusal to cede control returns that power back to you. People who /believe/ that they have power over you get irrationally angry when it's proven that they don't, by your inability or refusal to comply, and tend to think that it makes you a terrible person. I consider this a selfish response, if perhaps a natural one.

To truly live in a free society we must be okay with having power over us when the situation requires (such as at work), while also being okay with rejecting power when our morals and ethics demand it of us. The issue seems to be that some people have the ability to do the latter, and other do not, always appealing to authority to base their arguments or guide their path. The fact that many of these people who are obsessed with authority end up in mental health professions is not really surprising, since the mental health professions largely base the current accepted practices on cults of personality rather than on scientific inquiry.

The fact that I chose a profession where it's basically impossible to BS anyone who knows anything, because the rules of mathematics drive things far more than how eloquently you speak, is also not surprising. Anti-authoritarians love facts; facts are a good basis for deciding when to cede power and when to take it back, and they provide a guiding principle in your life that doesn't rely on another person being an authority over you, even if they are outside your control.

Anyway, that's kind of my reaction to reading the article. Definitely interesting to reflect back on some of my life experiences given this context and consider how things could have turned out differently, both for the worse or for the better.

11
TazeTSchnitzel 2 days ago 1 reply      
In the Soviet Union, and I imagine the other MarxistLeninist countries, people were sometimes diagnosed as mentally ill for not believing in socialism.
12
randcraw 1 day ago 1 reply      
'Anti-authority' is not a well-defined label. Independent thought and independent behavior are very different things. And unless your behavior is disruptive or destructive, generally nobody cares what you think.

Iconoclasm is a more specific label: independence of thought, or belief, or convention, without any mention of behavior. Presumably iconoclasts are what Levine is describing as positive examples of anti-authority, AKA mentally healthy folks who just think differently, but don't act on it, or who act on it in constructive ways.

But anyone who acts disruptively or destructively also qualifies as anti-authority. So I don't see where Levine draws the line between good and bad forms of anti-authority. Inactive belief is good? Disruptive or violent is bad? He doesn't say.

13
alanwatts 2 days ago 0 replies      
Reminds me of a Dr. King quote:

>But though I was initially disappointed at being categorized as an extremist, as I continued to think about the matter I gradually gained a measure of satisfaction from the label. Was not Jesus an extremist for love: "Love your enemies, bless them that curse you, do good to them that hate you, and pray for them which despitefully use you, and persecute you." Was not Amos an extremist for justice: "Let justice roll down like waters and righteousness like an ever flowing stream." Was not Paul an extremist for the Christian gospel: "I bear in my body the marks of the Lord Jesus." Was not Martin Luther an extremist: "Here I stand; I cannot do otherwise, so help me God." And John Bunyan: "I will stay in jail to the end of my days before I make a butchery of my conscience." And Abraham Lincoln: "This nation cannot survive half slave and half free." And Thomas Jefferson: "We hold these truths to be self evident, that all men are created equal . . ." So the question is not whether we will be extremists, but what kind of extremists we will be. Will we be extremists for hate or for love? Will we be extremists for the preservation of injustice or for the extension of justice?

"Crazy", "Nutjob", "Mentally Ill", "Extremist", "Heretic" - these are words used to write off ideas which are too difficult for one to even consider because they often threaten the most deeply held assumptions of a persons life. Social constructions can be very fragile.

14
bcheung 1 day ago 1 reply      
Anti-Authoritarian sounds a lot like critical thinking. That's a disease now?

"Anti-authoritarians question whether an authority is a legitimate one before taking that authority seriously. Evaluating the legitimacy of authorities includes assessing whether or not authorities actually know what they are talking about, are honest, and care about those people who are respecting their authority. And when anti-authoritarians assess an authority to be illegitimate, they challenge and resist that authoritysometimes aggressively and sometimes passive-aggressively, sometimes wisely and sometimes not."

15
thetruthseeker1 1 day ago 1 reply      
I am an immigrant in America. I have had problems with my (American) bosses when I challenged some of their decisions at work. When I challenged a decision on every quantifiable metric, they could not defend it in any way and relented, but then I heard from HR about not following orders in general.

I think in American culture there is a lot of emphasis on command hierarchy (maybe military-influenced?) and on who wears the pants, so to speak, compared to my home country in the East, and I think that may have something to do with it.

But I can attest to the repulsiveness I feel toward subtle authoritarianism where orders are sneaked in along with regular conversation.

16
tsunamifury 2 days ago 0 replies      
I think once you realize the vast majority of authorities preach contribution and submission while merely taking for themselves, you begin to see everyone around you who submits to them as a bit insane. This becomes really hard to rectify until you can figure out an equilibrium of physical submission to an unjust authority without psychological submission.
17
cat-dev-null 2 days ago 0 replies      
Police have been known to send people whom they have beaten up for mental examination in order to discredit them. This happened in Los Altos to a guy playing Ingress.
18
vanderZwan 2 days ago 0 replies      
Telling how the people in the comments who disagree with the article also tend to appeal to authority (see the comments by Herb and Jaroon).
19
nickbauman 2 days ago 2 replies      
I have a friend whose 14-year-old daughter has been diagnosed with ODD. She frequently cuts herself with anything she can get her hands on and when she doesn't get her way she will literally spin around and around shouting and growling angrily for a quarter to a half hour like she's possessed, sometimes physically lashing out at her parents. It's terrifying to watch and is definitely not your garden variety skepticism of authority.
20
Mikeb85 2 days ago 1 reply      
On one hand, devotion to authority is kind of what makes society possible. We all work, are cogs in the machine, and as a reward we all get a more or less decent existence.

Of course, the downside is that when society is ill, or when our leaders are objectively wrong, we don't see it because we're used to just following authority, and can't see beyond our couches and screens.

21
xlm1717 1 day ago 0 replies      
I thought it was particularly interesting when the author talked about differing reactions people had when taking psychiatric medication. It then seems that the effect of psychiatric medication has more to do with the person's frame of mind than any actual effect of the medication.

According to the author, if the patient rejects the doctor's authority, the patient can react even more violently when put on psychiatric medication. Even more interestingly, the author goes on to suggest that patients who do take the medication take it to placate authority, rather than for perceived effect. The author uses the example of someone in a highly stressful job taking Xanax instead of marijuana because while they might believe marijuana helps more, they don't take it due to employer drug tests.

It leads me to this question: when doctors think a medication is working, do they think so because of actual beneficial effects seen in the patient, or simply because the patient is complying with the treatment?

22
mbrutsch 2 days ago 1 reply      
When you have a "diagnosis" from a "scientific discipline" that cannot be rigorously tested or diagnosed, it seems obvious it would be open to abuse in this manner.

If there were a blood test for crazy, it would be a little more difficult. Sadly, diagnosing illnesses of the mind is just educated guesswork, and will likely never rise to the level of science that most people assume.

23
stegosaurus 1 day ago 1 reply      
This is the primary issue I have with our treatment of mental health.

Physical disabilities and illnesses come about through clear causes, and are generally recognized as universally bad.

These sorts of things either happen near-arbitrarily (birth abnormality, blood clot) or via an accident (falling down the stairs). After the fact, you're less able, in general.

By contrast, many mental illnesses are triggered by society, and some make no sense without a societal structure.

People go to work, become stressed at the lack of reward or meaning in their employment and get depressed. Now they have an 'illness', because they no longer want to turn up for work. Cyclical reasoning.

There are certainly many mental illnesses that aren't covered by this, but I think that when comments are made about 'declining mental health in society' - it's really just throwing diagnoses at the fact that we're becoming more cut-throat....

24
Yaa101 1 day ago 0 replies      
19th century schooling (which is still our current schooling system) is about preparing for conscription.
25
jkot 2 days ago 0 replies      
I read a bit of the author's pages. I think he paints the "anti-authoritarian" as a young hippie who is against everything just because. That explains the lack of any mention of psychiatry in the Soviet Union, the early 20th century, and so on.
26
dreamlayers 1 day ago 0 replies      
Sanity is defined relative to society and social norms. For example, homosexuality stopped being labelled as a disorder when society's attitudes started changing.
27
jokoon 1 day ago 0 replies      
I don't think authority is the problem; what bothers me more is how passive consensus reinforces some ideas more than others, which can make them difficult to challenge.

So in a sense, I think that there is a transparent "consented" authority, which is harder to challenge. Of course most people are happier in such a society. And it's true that you can't really tell whether a person is insane or just has an opinion about something.

28
Sideloader 1 day ago 0 replies      
I have a friend who I've known since first grade. He would never just "obey orders"...he would always try to find out why he was being asked to do something and if the request seemed illegitimate he would refuse, sometimes in creative ways, to honor it. He was diagnosed with ADHD and labeled a "problem child".

As we grew older and his ethics and moral compass developed he would supply arguments to back up his actions - he would not rebel for rebellion's sake but because he truly believed his actions were reasonable and justified. In high school these took the form of successfully defying the school's no-hats-allowed policy and under the auspices of a Frank Zappa loving art teacher forming The Anarchist's Club (the tongue-in-cheek logo was a medieval spiked club weapon emblazoned with a circle-A symbol).

The Anarchist's Club's biggest action was a boycott of a mandatory pep rally for the school's basketball team (we offered to study in the library instead), which earned its 13 or so members a suspension, with readmission after a written apology and a meeting with our parents. After three days only me, my friend, and one other student remained defiant. (Our parents were supportive of our "cause".) We were readmitted after my friend and I went to the town's newspaper and told them our story, and after a subsequent meeting with the school board which, to our surprise, agreed the offer to study in lieu of attending the rally was reasonable and overruled the principal's authority.

After high school we both moved to the west coast to attend college and drifted apart. In the "real world", despite much rhetorical talk of freedom and liberty from all corners, obedience and conformity to an often rigid status quo are expected if one wants to be accepted by, and benefit from, mainstream society. There are exceptions, like entrepreneurship to some extent, and limited room to negotiate but generally defying the authority of supervisors, managers, professors, the law of the land and any number of "superior" people and institutions is met with the crack of a whip rather than an offer of dialog and compromise.

Most of us suppress our anti-authoritarian instincts and fall in line with the established order. The more we have to lose, the more we opt for security and predictability over principles and conscience. My friend says he cannot do this. Talking a decade after high school, he says he was born without the "conformity gene" and that he can't will himself to do something he feels is unjust or that infringes unreasonably on his liberty. He can't, for example, obey a micro-managing boss if his method of working is more productive than the boss's way. If the boss is truly concerned about productivity and the company's bottom line, he or she will not force employees to do things one way and one way only. Every person has methods that work for them but may not work for others, and a boss who does not respect this is, according to my friend, an ego-tripping asshole or a mindless drone in thrall to "management theory".

My friend argues that his anti-authoritarian nature has cost him many jobs and opportunities and has made climbing the middle-class career ladder all but impossible. He makes money by buying and selling things on Craigslist and eBay, through freelance graphic design jobs, and in the marijuana reselling industry. He claims to have no interest in a stable career, home ownership, and a family. More than that, he says his disposition makes it impossible for him to submit to arbitrary authority and to shelve his reservations for the sake of a career. If I had just met him I might be tempted to call bullshit and think he's rationalizing his failure after the fact, but I've known this guy almost my entire life and his behavior has been consistent since first grade.

He has been diagnosed with adult ADHD, an anxiety disorder, depression and other mood disorders. (The DSM even provides a handy "unspecified disorder" category.) One anecdotal tale proves nothing and there is a possible chicken/egg problem here but based on my friend's and, to a lesser extent, my own and other people's experiences I am convinced there are good reasons for the APA or similar group to do an in-depth, independent study investigating the links between mental illness diagnoses and anti-authoritarian personality traits. Given what is at stake, I doubt that will happen any time soon.

29
useYourIllusion 1 day ago 0 replies      
I find it quite alarming that the DSM has been wielded as a tool for hospitalizations and forced medication despite providing sparse forensic evidence of the diseases it attempts to diagnose.
30
pmarreck 1 day ago 0 replies      
Site repeatedly crashes on mobile, thanks to Javascript ads.

And this is why I was made to disable my ad blocker by many sites. Thanks, f*ing ad industry. The least you could do is not crash my shit if I allow you in.

31
hyperpallium 1 day ago 0 replies      
The majority is always sane, Louis.
32
eevilspock 1 day ago 0 replies      
Prior discussions of this piece:

206 comments, 3 years ago: https://news.ycombinator.com/item?id=5674438

138 comments, 4 years ago: https://news.ycombinator.com/item?id=3642570

33
dschiptsov 1 day ago 0 replies      
Authority of a math textbook or a biology textbook is not the same as authority of religious dogmatism, tradition or a state ideology.

Rejecting the former is an illness; rejecting the latter is a right.

34
Zigurd 1 day ago 0 replies      
This overlooks an opportunity: We need a new disorder. Call it Control-seeking Disorder, or CSD. Cop gets in your face? Diagnose him with CSD and require treatment that lowers aggression. TSA bumming you out? Make sure they screen out CSD sufferers. Get them in treatment.
35
pigpaws 2 days ago 0 replies      
really? I thought they were called "Libertarians".
36
TheLogothete 2 days ago 3 replies      
Many people are so desperate to convince themselves that they are very special snowflakes that they will use every opportunity to renounce the status quo. It doesn't matter if the status quo is very sensible and sound; they need, crave, to be unique. So every crazy, poorly concocted conspiracy theory gets thousands, millions of devout followers, convinced beyond doubt of the most absurd things. The funniest thing is that they all think everybody else is delusional and that they have reached the very special, elevated place reserved only for the very enlightened.

It is very annoying, tbh.

24
Study: People Want Power Because They Want Autonomy theatlantic.com
313 points by Jerry2  4 days ago   131 comments top 18
1
codeonfire 4 days ago 4 replies      
This is obvious in the workplace. Here are some of the things that lack of power/autonomy will lead to:

- Being forced to do work assigned to someone else.

- Being forced to do work for which someone else will receive credit or compensation

- Being forced to appear as though under someone else's control (other than direct manager)

- Being forced to do the more dangerous, risky, difficult, or thankless work

- Being forced to do work far below one's qualifications

- Being forced to take over failed projects or projects that have already been rejected by upper management.

2
jhwhite 4 days ago 3 replies      
This is a really cool article that goes really well with Daniel Pink's book Drive: The Surprising Truth About What Motivates Us. Pink says that people need 3 things to truly feel motivated at work: mastery, autonomy, and purpose.

That seems to be based on the self-determination theory mentioned in the article, which holds that autonomy, relatedness, and competence are humans' basic psychological needs.

The link (http://selfdeterminationtheory.org/SDT/documents/2004_DeciVa...) about self-determination takes you to a paper that references a researcher, Mihaly Csikszentmihalyi. He has done a lot of research on optimal experience and has written a great book about it called Flow.

3
marcus_holmes 4 days ago 3 replies      
Now all we need is a couple of independent researchers to replicate these conclusions to verify them and we might have learned something.

I get really sceptical about these social sciences studies, especially after the recent round of replication attempts that failed.

4
cylinder 4 days ago 3 replies      
This is exactly what has been bothering me all week, right up until I clicked this. I cannot stand being micromanaged. It's driving me crazy now, and it makes me want to go back to entrepreneurship.
5
danharaj 4 days ago 2 replies      
As a shill for libertarian socialism, I am obviously glad to see controlled experiments that validate my prejudices about human nature. I switched jobs a few months ago from a steady job with good benefits to a less steady job with no benefits because it gave me far more autonomy and independence. I know many people who are very ambitious and want to be their own bosses; few of them want to rule others. In fact, of the bosses I know, all of them are frustrated by having to order and direct people.

There is one pattern of thought that i think is missed by this experiment, but another one could just as well measure it too and i hope someone decides to: it's easier to order people to do what you think is right than it is to convince them that you are right. I think under circumstances where a person feels frustrated that their point of view isn't being validated by others they will have a greater tendency towards authoritarian power as opposed to autonomous power.

Certainly, how many of us have gone through the phase of growing up where we think that if everyone just listened to us and did what we said, everyone would be better off? I think well-intentioned paternalism is a greater cause of authoritarian desires than narcissism and ego validation.

6
graycat 4 days ago 2 replies      
The classic explanation is from E. Fromm, The Art of Loving with, to paraphrase,

"For humans the fundamentalproblem of life is gettinga feeling of security in theface of the anxiety from our realization that alonewe are vulnerable to the hostileforces of nature and society."

rough quote from memory.

So, alone we feel vulnerable. So, we want security in the faceof that vulnerability.

Notably, Fromm does not say thatmoney or power will give thatfeeling of security and, instead,claims that the first recommendedsolution is a good romanticrelationship, that is, with"knowledge, caring, respect,and responsiveness" where the knowledge means the couple readily exchanges knowledgeof themselves.

Sure, one can try to useautonomy and self-sufficiency, and those via, say, money and/or power to get the feeling of security. But if only byomission, Fromm is saying thatbeing so alone won't work well.

7
pink_dinner 4 days ago 4 replies      
This is why I started my own company: So nobody could tell me what to do. It's not really even about the money.
8
erikb 4 days ago 1 reply      
Well, it's different things at work, right? Yes, on one side people want autonomy; this very much applies to many developers. But power is actually also exciting, for some people even sexual. Why do I think that? Well, computer games and porn, for instance. In both you already get to set your own agenda. But there are games and porn that explicitly give you power over another (virtual) person. And people still like that, even when they already have power over themselves.
9
kijin 4 days ago 1 reply      
I don't think the distinction between autonomy and influence/power is so clear-cut in practice.

The article defines autonomy as the absence of unwanted influence. But in a human society, unwanted influence is not something you can simply opt out of. Having autonomy without isolating yourself from the rest of the society means having at least some degree of influence over those who would like to influence you in unwanted ways.

As long as there are people out there (politicians, marketers, burglars, terrorists, etc.) who are trying their damnedest to influence you, the only way you can achieve autonomy is to be able to tell them to get the fuck off your lawn. Sometimes you need to push people physically off your lawn. Sometimes you need to kill them, because they would kill you if you don't. An autonomous person without effective power will quickly cease to be autonomous.

So autonomy is just another form of power. Some might even say that everything is power, and it's not just empty rhetoric.

10
Semiapies 4 days ago 0 replies      
"Mostly", because quite a few people still just want control over others. And someone striving for "autonomy" through power still just wants to make you a tool for achieving their autonomy.
11
tdeck 4 days ago 0 replies      
I remember seeing a talk a while back where the presenter was attempting to define the utility function that characterizes intelligent agents. His thesis was that an intelligent agent would work to maximize, up to some event horizon, its number of possible courses of action. This reminded me of that talk - money, power, etc... are all ways to increase one's freedom of choice and thus autonomy.
12
Jedd 4 days ago 0 replies      
Is this really the results of a new study?

I'm sure I was reading the results of studies more than 10 years ago about stress levels of people within a number of large organisations, which found that people towards the top of the hierarchy were less stressed than those at the bottom. It was possibly speculative, though I'm sure I recall they'd confirmed it somehow, that this was because people towards the bottom of an organisational structure had far less control over what their day or week looked like, and how they could plan out their tasks, than those at the top.

13
linhchi 4 days ago 0 replies      
It's like I'm motivated to work like crazy until the point I can afford to be nothing.
14
noam87 4 days ago 0 replies      
> To be free in an age like ours, one must be in a position of authority. That in itself would be enough to make me ambitious. (Ernest Renan)
15
Aloha 4 days ago 1 reply      
I don't mean to be glib - but they needed a study for this?

I think this is one of the primary reasons people have chased power for millennia - either to have control over themselves, change the world they live in, or lord over (seek retribution) those who've wronged (done arbitrary things to) them (perceived or otherwise).

16
daodedickinson 4 days ago 1 reply      
Man, I never got to BLARP as an undergrad. I feel so left out.
17
PaulHoule 4 days ago 0 replies      
"If I was black, and I lived hereI'd want to be a big man in the FBIOr the CIA.

But as I'm not,And as I'm free, white and 21I don't need more power than I've got...Except sometimes, when I'm broke"

18
sridca 4 days ago 1 reply      
My feeling is that it is impossible to achieve autonomy in the workplace, for no other reason than that the factors behind what you end up doing in your day-to-day job are, ultimately (if not immediately), dictated by the investors of the company. Essentially I am exchanging my skills, expertise, and time for money.
25
VNC Roulette vncroulette.com
395 points by rwmj  15 hours ago   197 comments top 49
1
milesf 11 hours ago 11 replies      
I think we need some sort of awareness day for the general public to understand what internet security _really_ is. Whenever I see news reports, it's always cast as "hackers broke in to..." such and such. Yet if some brick-and-mortar business is robbed because the owner left the front door unlocked, people would rightfully put the onus mostly on the store owner.

EDIT: Wow. I'm being modded into the basement. When did Hacker News become so PC? Victim-blaming? Seriously? The VNC connections illustrated on this site are that way because of incompetence and ignorance. The reason there are no unlocked brick-and-mortar businesses is that due diligence means protecting one's assets from not just criminals, but simple mischief.

2
bpicolo 0 minutes ago 0 replies      
The fourth one it shows me appears to be a medical records database. Wowzerz
3
eranation 12 hours ago 8 replies      
So... what does this mean? I mean, if there are so many hydropower plants et al. vulnerable over VNC, how come we haven't had some major catastrophe? Is it simply more common to have a "read only" VNC vulnerability (which is still a huge problem)? Is VNC by default not password protected for read-only viewing (and does it require a password for taking control)? Obviously nothing should be password-less by default, and nothing should have a "changeit" password (I'm looking at you, glassfish), but I really hope that even if VNC lets you be in "guest view only mode" without a password by just knowing an IP (who does that?!), then at least they still require a password to take control, right? Please tell me they do. (Otherwise I'll be surprised we are all still alive, to be honest.)

I mean there are some controls there that I'm sure if the wrong person pushes that red button, something will go kaboom.

And there is no shortage of people out there who would not think twice to blow things up.

So yes, this is scary, but it also makes me very surprised that statistically we probably shouldn't be alive by now if so many critical control systems have VNC exposed like that in a way that allows full control of the system and not just viewing.

Perhaps it's just selection bias: if the world had ended by now, I wouldn't be able to type this.

But still, seriously: with all these screenshots, I assume this is not something new, so how come we haven't yet heard of major real-world damage due to a VNC vulnerability?

Is this really most likely to be a read only privacy issue? (which is not to be taken lightly, but not the same as being able to press "shutdown" on some power plant controls)
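
(A rough way to see for yourself whether a given endpoint even asks for authentication, without a full client, is to read the start of the RFB handshake. A hedged sketch with standard tools: HOST is a placeholder for a machine you are authorized to probe, and the byte layout assumes an RFB 3.7+ server, which, after the 12-byte version exchange, sends a count byte followed by its security types, 1 = None, 2 = VNC auth:)

 # Server greets with "RFB 003.00x"; we echo a version back, then dump
 # the first bytes of its reply (the advertised security types).
 { printf 'RFB 003.008\n'; sleep 1; } | nc -w 3 HOST 5900 | xxd | head -4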

4
foota 13 hours ago 0 replies      
There seems to be an accompanying blog post: http://hahasecurity.blogspot.com/2016/03/hack-millions-of-de...
5
kbenson 11 hours ago 3 replies      
Today I learned that Chinese (Japanese?) character support in terminals looks way cooler than western fonts[1].

http://vncroulette.com/images/115.218.120.95.jpg

6
tapp 13 hours ago 2 replies      
Agree that it's simultaneously fascinating and alarming.

Does anyone know what exactly this is?: http://vncroulette.com/images/176.64.166.110.jpg

7
smilekzs 13 hours ago 1 reply      
http://vncroulette.com/index.php?picture=87

"Please secure your VNC!"

EDIT: Also: http://vncroulette.com/index.php?picture=270

"Upgrade your VNC Server license in order to benefit from premium security features ...""An anonymous user has connected. Number of connected users: 1"

8
cure 12 hours ago 1 reply      
http://vncroulette.com/index.php?picture=193

I'm glad this screen is sanitized regularly.

9
cenal 12 hours ago 0 replies      
What unfortunate timing for this poor guy, who is now forever captured having disappointed his client: http://vncroulette.com/images/14.97.72.37.jpg
10
hyperion2010 13 hours ago 2 replies      
For those who haven't seen Dan's talks before: https://www.youtube.com/watch?v=5cWck_xcH64
12
digi_owl 2 hours ago 0 replies      
I wonder how many installs date from before the facility was put on