hacker news with inline top comments (29 May 2016, Best)
1
Jury in Oracle v. Google finds in Google's favour twitter.com
1821 points by LukeB_UK  2 days ago   386 comments top 44
1
grellas 2 days ago 13 replies      
Law evolves and the law of copyright in particular is ripe for "disruption" - and I say this not as one who opposes the idea of copyright but, on the contrary, as one who strongly supports it.

It is right that the author of a creative work get protection for having conceived that work and reduced it to tangible form. Developers do this all the time with their code. So too do many, many others. Many today disagree, because they grew up in a digital age where copyright was seen as simply an unnecessary impediment to the otherwise limitless and basically cost-free capacity we all have to reproduce digital products, and hence an impediment to the social good that would come from widespread sharing of such products for free. Yet, as much as people believe that information ought to be free, it is a fact that simply letting any casual passer-by copy and distribute any creative work with impunity would rob those who may have spent countless hours developing such works of the commercial value of their efforts. I will grant that this is a social policy judgment on which the law could come down on either side. I stand with the idea of copyright protection.

Even granting the correctness of copyright as a body of law that protects certain property interests, there are still many abuses in the way it is implemented and enforced. Copyright terms have been extended to the point of absurdity, and certainly well beyond what is needed to give the original author an opportunity to gain the fruits of his or her labor. Enforcement statutes are heavy-handed and potentially abusive, especially as they apply to relatively minor acts of infringement by end-users. And the list goes on.

The point is that many people are fed up with copyright law as currently implemented and, when there is widespread discontent in society over the effects of a law, the time is ripe for a change.

I believe this is where copyright law is today.

The Bono law may have slipped through Congress with nary a dissent in its day but this will not happen again, whatever the lobbying power of Disney and others. And the same is true for the scope of copyright law as it applies to APIs.

Ours is a world of digital interoperability. People see and like its benefits. Society benefits hugely from it. Those who are creatively working to change the world - developers - loathe having artificial barriers that block those benefits and that may subject them to potential legal liabilities to boot. Therefore, the idea that an API is copyrightable is loathsome to them. And it is becoming increasingly so to society as a whole.

The copyright law around APIs had developed in fits and starts throughout the 1980s and 1990s, primarily in the Ninth Circuit where Silicon Valley is located. When Oracle sued Google in this case, that law was basically a mess. Yet Judge Alsup, the judge assigned to this case, did a brilliant synthesis in coming up with a coherent and logically defensible legal justification for why APIs in the abstract should not be protected by copyright. He did this by going back to the purpose of copyright, by examining in detail what it is that APIs do, and by applying the law in light of its original purpose. The result was simple and compelling (though the judicial skill it took to get there was pretty amazing).

Legal decisions are binding or not depending on the authority of the court making them and on whether a particular dispute is under the authority of one court or another when it is heard.

The decision by Judge Alsup is that of a trial judge and hence not legally binding as precedent on any other judge. It could be hugely persuasive or influential but no court is bound to follow it in a subsequent case.

The Federal Circuit decision that reversed Judge Alsup and held APIs to be copyrightable is not that of a trial judge and has much more precedential effect. Yet it too has limited authority. The Federal Circuit Court does not even have copyright as its area of jurisdiction. It is a specialty court set up to hear patent appeals. The only reason it heard this case was because the original set of claims brought by Oracle included patent claims, and this became a technical ground by which the Federal Circuit Court gained jurisdiction to hear the appeal. But there are many other federal circuit courts in the U.S., and the Federal Circuit Court's decision concerning copyrights is not binding on them. There is also the U.S. Supreme Court. It has the final authority and its decisions are binding on all lower federal courts as concerns copyright law.

The point is that the battle over this issue is not over. It is true that the Federal Circuit decision was a large setback for those who believe APIs should not be subject to copyright. Yet there remains that whole issue of social resistance and that is huge. It will undoubtedly take some time but the law can and does change in ways that tend to reflect what people actually think and want, at least in important areas. No one has a stake in seeing that Oracle be awarded $9 billion in damages just because it bought Sun Microsystems and found an opportunity through its lawyers to make a big money grab against Google. But a lot of people have a stake in keeping software interoperability open and free and many, many people in society benefit from this. Nor is this simply an issue of unsophisticated people fighting the shark lawyers and the big corporations. Many prominent organizations such as EFF are in the mix and are strongly advocating for the needed changes. Thus, this fight over APIs will continue and I believe the law will eventually change for the better.

In this immediate case, I believe the jury likely applied common sense in concluding unanimously that, notwithstanding Oracle's technical arguments, the use here was in fact benign given the ultimate purposes of copyright law. I leave the technical analysis to others but, to me, this seems to be a microcosm of the pattern I describe above: when something repels, and you have a legitimate chance to reject it, you do. Here, the idea of fair use gave the jury a big, fat opening and the jury took it.

2
rayiner 2 days ago 17 replies      
These are the statutory fair use factors the jury was required to consider (17 U.S.C. 107):

(1) the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;

(2) the nature of the copyrighted work;

(3) the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and

(4) the effect of the use upon the potential market for or value of the copyrighted work.

It's a somewhat surprising result, because two of the factors weigh heavily against Google (the use was commercial, and it was important to Android gaining developer market-share). Oracle's strategy going forward, both in post-trial motions and in any subsequent appeal, will be based on arguing that no rational jury could have applied these factors to the undisputed facts of the case and concluded that the fair use test was met.

It's also not a particularly satisfying result for anybody. If APIs are copyrightable, then I can't think of a better case for protecting them than this one, where Google created a commercial product for profit and there was no research or scientific motivation. It wasn't even really a case (like, say, Samba) where copying was necessary to interoperate with a closed, proprietary system. Dalvik isn't drop-in compatible with the JVM anyway.

That makes Oracle's win on the subject-matter issue basically a Pyrrhic victory for anyone looking to protect their APIs. They're protectable, but can't be protected in any realistic scenario.

And if you're in the camp that believes APIs should not be protected, this precedent--if it stands--means that you'll have to shoulder the expense of going to trial on the fair use issue before winning on the merits.

3
nabaraz 2 days ago 2 replies      
My favourite part of the trial was when the judge told Oracle that a high schooler could write rangeCheck[1].

[1] https://developers.slashdot.org/story/12/05/16/1612228/judge...
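For reference, the rangeCheck at issue is a nine-line argument-validation helper from Java's Arrays class. Here is a close paraphrase (reconstructed from trial coverage, not a verbatim copy of Sun's code) wrapped in a small demo class:

```java
// Sketch of the rangeCheck utility at issue in the trial: it validates
// that [fromIndex, toIndex) is a sane sub-range of an array of the given
// length, and throws the appropriate exception otherwise.
public class RangeCheckDemo {
    static void rangeCheck(int arrayLen, int fromIndex, int toIndex) {
        if (fromIndex > toIndex)
            throw new IllegalArgumentException(
                "fromIndex(" + fromIndex + ") > toIndex(" + toIndex + ")");
        if (fromIndex < 0)
            throw new ArrayIndexOutOfBoundsException(fromIndex);
        if (toIndex > arrayLen)
            throw new ArrayIndexOutOfBoundsException(toIndex);
    }

    public static void main(String[] args) {
        rangeCheck(10, 2, 5); // valid range: returns normally
        try {
            rangeCheck(10, 5, 2); // fromIndex > toIndex: should throw
            System.out.println("no exception (unexpected)");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

Its triviality is the point: it does nothing but compare two indices against an array length, which is why the judge's remark landed so hard.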

4
phasmantistes 2 days ago 2 replies      
I'd also just like to give huge props to Sarah Jeong for keeping up such a high-quality live stream of tweets over the course of the entire trial. That's reporting done right.
5
mythz 2 days ago 2 replies      
Whew, Oracle's lawyers and blind greed don't get to destroy interoperability for the entire tech industry.

But the fact that Oracle could get this close, spinning deceit to a non-technical jury deciding whether using API declarations from an OSS code-base could in some universe entitle them to a $9B payday, is frightening.

6
wbillingsley 1 day ago 0 replies      
Is it just me, or is this result the worst of both worlds (for the you-and-me's of the world, rather than billion dollar corporates)?

Had APIs been found not to be copyrightable, that would have been great and opened up the development ecosystem for us to use and adapt each other's APIs.

Had it been clear that this sort of thing wasn't allowed, then small developers would have had protection: they could publish their APIs without fear of a deep-pocketed competitor saying "thanks, we're going to muscle you out of the market, using your own design to do it, and ignoring any of that GPL nonsense you've licensed it under; we're just having your API as our own, thanks". Not an open world, but at least everyone would be on a level field.

But a "fair use" finding of fact sets no precedent for anything else, gives no protection for the ordinary developer, and essentially means "it turns out you can do this if you're big enough to afford to pay high-tier lawyers for six years". (ie, BigCorp can copy APIs with impunity, but you can't)

7
davis 2 days ago 1 reply      
If you found Sarah's coverage of the trial useful, she is accepting payments on PayPal, since she was doing it with her own money: https://twitter.com/sarahjeong/status/731687243916529665
8
koolba 2 days ago 1 reply      
Wow. I suddenly have a lot more faith in the courts and juries to land sane verdicts in technology trials. Still sad that it takes a billion-dollar company to be able to stand up to this (as anybody smaller would be crushed by the trial expense), but let's celebrate it nonetheless.

Any lawyers around? I wonder if Google can claim legal expenses back from Oracle.

9
tostitos1979 2 days ago 4 replies      
Despite the win, I think it would have been far better for the computer industry if Google had bought Sun. Unlike other companies with crap (IMHO ... Nokia, Motorola), Sun actually had stuff of value. This is a lesson that geeks get but I'm not sure MBAs do or will ever get.
10
Cyph0n 2 days ago 1 reply      
Great news! This is a win for us software devs :)

I'd like to note that Ars Technica's coverage of the trial has been excellent throughout.

11
shmerl 2 days ago 0 replies      
Congratulations! It's a pity that previous decision declared APIs copyrightable. This never should have happened. But at least fair use worked out.

I wonder though how universal that ruling would be. Is any reimplementation of APIs going to be fair use, and if not, what are the criteria?

12
grizzles 2 days ago 0 replies      
APIs are still copyrightable according to the Federal Circuit Court of Appeals. That's not great, and I hope Congress does something about it for the other languages (e.g. C#/.NET) that haven't yet been whitelisted as fair to use by the judicial system.
13
BinaryIdiot 2 days ago 0 replies      
Great news, to a degree. It still means APIs can be copyrighted, which is a bit unfortunate in my opinion. But they won on fair use, which is still a victory.

Anyways I wonder how long this is going to keep going on for as I'm assuming Oracle will appeal.

14
zerocrates 2 days ago 4 replies      
See, as ever, Florian Mueller for a... different perspective: http://www.fosspatents.com/2016/05/oracle-v-google-jury-find...
15
chatmasta 2 days ago 1 reply      
What impact does this have on reverse engineering private APIs and reimplementing them? And selling those reimplementations?

Can I reverse engineer the private API of a mobile app, then implement my own client to talk to its servers?

What if I create my own "bridge" API to talk to the private API? Can I then sell access to the bridge API, allowing developers to use the private API of the app through my service?

And how does this relate to, e.g. running private world of warcraft servers with modded code that allows purchasing in-game items? (See http://www.themarysue.com/blizzard-private-server-lawsuit/)

16
musesum 2 days ago 0 replies      
> "For me, declaring code is not code," Page said.

Unless, of course, the declaring code is declaring declaring code, as in Prolog and its ilk.
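To make Page's distinction concrete, here is a minimal Java sketch (all names hypothetical, not from the case record) separating "declaring code" — the method headers and organization at issue in the trial — from independently written implementing code:

```java
// "Declaring code": the API surface — names, parameter types, structure.
// In the trial, it was this layer of the Java APIs that Google reproduced.
interface SimpleMath {
    int max(int a, int b);
}

// "Implementing code": the method bodies behind the declarations,
// which Google wrote independently for Android.
class CleanRoomMath implements SimpleMath {
    public int max(int a, int b) {
        return (a >= b) ? a : b;
    }
}

public class DeclaringVsImplementing {
    public static void main(String[] args) {
        SimpleMath m = new CleanRoomMath();
        System.out.println(m.max(3, 7)); // prints 7
    }
}
```

Musesum's point is that in a language like Prolog the "declaration" can itself be the executable program, so the line Page drew is blurrier than it sounds.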

17
Analemma_ 2 days ago 1 reply      
Now we have to hope this doesn't get overturned on appeal like it was before. Still, this is excellent news.
18
jhh 2 days ago 3 replies      
What I don't understand about this: why didn't Google/Android use Java under the open-source license under which it has been provided? Wouldn't that have saved all the trouble?
19
bitmapbrother 2 days ago 0 replies      
Oracle will likely appeal, but they'll lose again. Overturning a unanimous jury verdict is very difficult.
20
blacktulip 2 days ago 2 replies      
Excuse me here but I have to ask. Is this final? Because I've read that Oracle won the case some time ago.
21
jhanschoo 1 day ago 0 replies      
While it is unpopular among most HN readers that APIs were found to fall under copyright, I don't see how it affects software interoperability in the long term.

It seems to me that all this necessitates is for software publishers to include an open license for their APIs, or to release them into the public domain.

In fact, it might even be beneficial, since companies can license different APIs to different customers, and have additional legal force prohibiting the use of APIs that are unofficial or unauthorized to them.

22
mark_l_watson 2 days ago 0 replies      
I remember that the afternoon desserts served in Google restaurants when I worked there were very tasty - I hope everyone is celebrating with a good snack :-)

Seriously, I think this is a good verdict. I think that Oracle has been doing a good job shepherding Java, but this lawsuit really seemed to me to be too much of a money grab.

23
pavpanchekha 2 days ago 7 replies      
This is possibly my best-case scenario. APIs are copyrightable (so says the Federal Circuit), and this seems reasonable, since some APIs really are very good and treating them like an artistic work has benefits. But implementing them is fair use, preserving the utility of APIs for compatibility. Great news!
24
brotherjerky 2 days ago 0 replies      
This is fantastic news!
25
cm3 2 days ago 1 reply      
So, APIs are still considered copyrightable, and that was a different trial, right?

Now Google was ruled okay to use that single, small function, or what was this about?

A little more info would be nice for those who aren't following this closely.

26
pjmlp 1 day ago 1 reply      
If Oracle decides to drop Java development, I wonder who will bother to pick it up.

Surely not Google, if they hadn't bothered the first time around.

27
jcdavis 2 days ago 2 replies      
What is the room for appeals here?

This is a massive ruling; here's hoping it stands.

28
spelunker 2 days ago 1 reply      
Is this legit? If so, thank goodness.
29
yeukhon 2 days ago 0 replies      
I for one would like to have a public digital recording of the actual trial available...
30
satysin 2 days ago 0 replies      
So is this really over for good now? Can Oracle appeal and drag this on for another decade?
31
EddieRingle 2 days ago 1 reply      
Hopefully soon we can stop focusing on legalities and get back to building cool stuff.
32
crispyambulance 2 days ago 0 replies      
Perhaps the jury was not as clueless as some here were assuming?
33
steffenfrost 2 days ago 0 replies      
Would swizzling methods violate copyright?
34
ShaneBonich 2 days ago 0 replies      
That was expected. Happy for Google.
35
JackPoach 1 day ago 0 replies      
Good for google
36
ilostmykeys 2 days ago 0 replies      
Oracle is evil.
37
known 1 day ago 0 replies      
"Patents and open source" is an oxymoron.
38
benmcnelly 1 day ago 0 replies      
IANAL
39
ChrisLomont 2 days ago 3 replies      
This is 100% inaccurate

That story leaves out significant details, and his description of the reasoning is wrong. He leaves out significant prior sources and reasons, to the point that your statement "It was not created to help artists or whoever" is simply wrong.

Wikipedia provides a far better history with extensive sources [1]. Note for example the section "Early Developments" where they list precursors to anything in your source as having significant components of individual and author rights.

Note also that in the US, copyright law was started by author guilds wanting author protection, and the first federal copyright act in 1790 was about protection for authors, not printers. This was an extension of author protections that many states had already passed into law.

[1] https://en.wikipedia.org/wiki/History_of_copyright_law

40
jrochkind1 2 days ago 0 replies      
thank god.
41
known 1 day ago 0 replies      
Say no to software patents
42
VeejayRampay 2 days ago 0 replies      
Nelson Muntz would rejoice at this verdict. Oracle's claim was laughable.
43
7ero 2 days ago 0 replies      
Google copies an API from Oracle, yet I still have to come up with my own solutions to get a job at Google? Fuckin' Google.
44
suyash 2 days ago 3 replies      
Today is a sad day for Silicon Valley. Our legal process has demonstrated how incompetent it is when it comes to Technology IP protection.
2
All European scientific publicly funded articles to be freely accessible by 2020 eu2016.nl
1089 points by whazor  1 day ago   124 comments top 31
1
WhoBeI 1 day ago 5 replies      
I think this is a really good move. About bloody time, too. I shouldn't have to pay twice to read research papers I helped fund, and I certainly shouldn't be paying a private company for it.

The EU is saying that when and if you choose to publish papers based on publicly funded research, you must ensure those papers will be publicly available. This means you must budget for any cost associated with it, but since your budget is public funds, it's basically going to be paid for by the EU in most cases.

The "when and if" means that you can choose not to publish or you can choose to publish after you have sought protection (patent). You can also, as a publisher, have a wait period before research papers become "open access" to, I don't know, be an asshole I guess.

Reference (pdf): http://ec.europa.eu/research/participants/data/ref/h2020/gra...

Edit: The little pdf warning.

2
intrasight 1 day ago 8 replies      
What does this mean in practice? Will the EU force the publishers to make them freely accessible? Will the EU only allow scientists to publish in open journals? Europe indirectly funds a lot of research. This will bleed into the workflow of scientists worldwide.
3
brodo 1 day ago 1 reply      
Under the Horizon 2020 programme, open access publishing (green or gold) is already mandatory today for all EU/EC funded projects. Grant proposals must incorporate open access fees in order to be accepted. It's not much money for most project consortiums, as the EU tends to fund big projects with budgets of ~10 million and more as the norm (there are smaller ones, but ~2 million is really as small as it gets, as far as I know). So a couple of 10K won't make that much of a difference. If it were to become mandatory for all publicly funded research in the EU, the problem would be quite different. There are really small grants (50K and smaller), and if you have 2 publications out of such a grant, you now pay 10-20% of your grant money in publishing fees.
4
alaskanloops 1 day ago 1 reply      
This is one of those things that's just so common sense, it's hard to imagine a solid argument against it.

Hopefully it will encourage the US to follow, although I'm sure publishers will dig their heels in and make up some bullsh*t reason why it would be a bad move.

5
dandelion_lover 1 day ago 0 replies      
I wish the same happened with the software written using public funding.
6
Fomite 1 day ago 2 replies      
"From 2020, all scientific publications on the results of publicly funded research must be freely available." is not "All European scientific articles".

I've got at least three papers that wouldn't fall under that heading.

7
kriro 1 day ago 1 reply      
Seems excellent, but I'm afraid the EU will just pay the publishers a sweet bundle of money for the privilege instead of making it a law. Either way, I hope it also extends to the member-state level eventually. The circus of avoiding conflict between EU funding and member-state funding is often quite amusing, so this will be interesting.

It feels almost ethically self-evident that any research funded by citizens in any way should be made available to said citizens (and that's more or less 100% of research in Europe). Let's step in that direction.

8
a_imho 1 day ago 0 replies      
>unless there are well-founded reasons for not doing so

Backdoored by default. Any reason why it is 2020 and not sooner, or later?

9
norswap 14 hours ago 0 replies      
To contextualize the discussion on "open access" (the practice whereby publishers ask for a sum of money in order to make the paper freely available): it costs a lot, about 1K for ACM venues. Currently I have to pay for this out of my own budget, and I don't have that kind of money (I'm a PhD student, and in addition to my scholarship (1.8K/month net of tax, in Belgium), I get a 5K budget for two years that must cover material and travel -- I can also get about one additional travel grant per year).

Publishers have about zero added value except fixing the odd LaTeX issue. Actual editorial work is done free of charge by professors. Other venues such as arXiv and CiteSeerX do a better job of distribution for free. Science publishers are leeches in the system and should be removed.

I think the EU measure will precipitate the current movement against such predatory publishing practices. Already, publishers are walking on eggshells. They move against Sci-Hub, but they'd never dare go against the common practice of just hosting everything one publishes publicly, license notwithstanding, something almost everyone does in the CS field -- maybe they say it's a "draft"... (meaning there isn't a copyright notice, mostly).

10
alex_hirner 1 day ago 1 reply      
To top it off, I would love for open access to be tied to open data and strong councils for standard-setting of such formats in a particular domain. The current swath of approaches to open research data is Babylonian.
11
return0 1 day ago 0 replies      
> must be freely accessible to everyone

When? Open access after 12 months has different cost than immediate open access, and, at the pace of today's research, is not open access at all. And ERC funded research already requires open access publishing anyway.

Why not demand publishing to open access/nonprofit journals instead? Why does EU have to pay elsevier $2000-5000 per article?

This does not seem significant to me.

12
sndean 1 day ago 1 reply      
I'm not as familiar with European journals, but if this happened in the US (more or less forced open access), it'd possibly mean that every publication would cost researchers ~$3000.

Hopefully this (and similar) legislation comes along with a statement regarding who's supposed to cover that cost. Because some of us avoid that price tag by publishing in non-open access journals.

13
flexie 1 day ago 0 replies      
Only articles based on EU funded research or also articles based on research funded by member states?
14
dadkins 13 hours ago 0 replies      
Interestingly, papers by US government employees are already in the public domain. Journals have no problem publishing those. I think one direction this could go is that public universities could insist that their researchers put their work in the public domain.

Journals don't need copyright to function. There's still value in editing, peer review, and distribution. They're just being greedy by insisting on it.

15
hackuser 1 day ago 0 replies      
Aren't they already required to publish on Sci-hub?
16
lomnakkus 1 day ago 1 reply      
If this actually pans out, it could be incredible. It always saddens me to get a link to a scientific paper and to hit my head on the paywall.

There's of course the "(partially) publicly funded" qualifier, but I don't remember too much privately funded research in the journals that I read as a CS undergrad, so hopefully it won't matter too much. (I guess other areas such as Chemistry, Biology might be more prone to falling under this qualifier?!?)

17
iangpmulvany 1 day ago 0 replies      
Slightly related, I just took part in a workshop of experts who are tasked with advising the commissioner on policy decisions for making open science happen in the EU, I've written up my notes: https://medium.com/p/7691802b543a
18
mablae 1 day ago 0 replies      
And Aaron Swartz is dead. RIP
19
graffitici 20 hours ago 0 replies      
Does anyone have any insights regarding the "foreign startup visas". Does that legislation have a name I can use to track progress?

It's quite annoying to have to apply for week-long business visas every time I have to travel to Europe. Especially in the summer, when it takes 2 months just to get an appointment with the consulate..

20
jwildeboer 1 day ago 2 replies      
Next step: since they have an exception for "IP" (intellectual property rights like patents), we must promote the logical consequence: all publicly funded research must mean all "IP" is made available to anyone for free.
21
88e282102ae2e5b 1 day ago 0 replies      
Has it been stated anywhere that they won't allow a temporary embargo after publishing? Like how some funding agencies allow papers to be behind a paywall for 12 months and only then require that it be open access?
22
Aelinsaar 1 day ago 1 reply      
Good. With whatever limitations, this is still good. It's not enough, but this is going to be a long LONG fight.
23
ShaneBonich 1 day ago 1 reply      
Once again Europe shows the way
24
desireco42 1 day ago 0 replies      
Why wait if you know it is inevitable? (as would say Richard Bandler)
25
known 18 hours ago 0 replies      
Too little; Too late;
26
ivan_gammel 1 day ago 0 replies      
Some public funding of copyrighted-artwork purchases would also be useful, if that's the only viable alternative to decreasing copyright terms to some reasonable 20-30 years.
27
daveheq 1 day ago 0 replies      
But wait, scientific articles are hoaxes made for tax money, so now the lies are freely available to anybody!!! We need to cut more taxes on billionaires and publicly fund more oil projects so we can get these inconvenient science articles out of our way. Where's my next iPhone?!
28
partycoder 1 day ago 3 replies      
I can see the evil people behind the TTIP getting in the way of this. Before this goes into effect, the US will execute order 66 and take all the articles down.
29
MrBra 1 day ago 0 replies      
My thoughts are with those who are not with us anymore who helped make this slowly happen... I say thank you, but if I could choose, I'd rather have them back and get to these results a bit later.

It's nothing new that in this field sometimes people have a tendency to put their ideals of progress before their own life.

No matter how strong is your love for your world changing ideas, YOU as a person come first.

Unless (but you could also argue about that) without your sacrifice the whole world population would be at immediate risk of extinction..

30
maerF0x0 1 day ago 1 reply      
If trump wanted to help win over the science crowd, he could make this an issue. Seems more likely to come from Bernie though.
31
iangpmulvany 1 day ago 0 replies      
Of related interest, I'm just back from a workshop on Open Science from a group that advises the commissioner; I was an invited expert at the workshop. This is part of the route through which announcements like the one linked here get made. I've written up my notes: https://medium.com/p/7691802b543a
3
Reddit launches image uploads, ditching alliance with Imgur techcrunch.com
498 points by coloneltcb  3 days ago   396 comments top 32
1
zitterbewegung 2 days ago 10 replies      
Having a third party host Reddit's images gave away a bunch of control, to the point that imgur has its own community based on the site. At the time, Reddit didn't have the resources to make a competitor, but this seems to be a good move now that imgur is doing dark patterns.
2
makecheck 2 days ago 10 replies      
Imgur forgot its roots as a dumb image host, rapidly crufting up its service with things that were not only annoying but interfering with its basic function. If you can't easily see an image as soon as you click on it, on mobile or otherwise, then the image service has completely failed.

The sad thing is, they could have added non-intrusive ads. DaringFireball does it; write a single line of text such as "This image brought to you by FooBar, Inc." and SHOW THE LINKED IMAGE. No pop-ups, no tricks, no obscurity; ad + image, done.

3
esolyt 2 days ago 3 replies      
> At this time, the Reddit community can still choose to use Imgur or other sites for image hosting.

The fact that they used the phrase "At this time" makes it sound like they are planning to disable external image links in the future.

4
Dramatize 3 days ago 1 reply      
It seems like Imgur had been getting ready for this for a while. The experience using it hasn't been that great since direct linking stopped working on mobile.
5
whalesalad 2 days ago 5 replies      
Looks like it's backed by S3

    x-amz-storage-class: REDUCED_REDUNDANCY
    Content-Type: image/jpeg
    Server: AmazonS3
Can't imagine that AWS bill...

6
tdiggity 2 days ago 2 replies      
Here's a thread that led to people's hate of imgur: https://www.reddit.com/r/iphone/comments/46cjo5/i_just_want_...

It was really annoying on mobile.

7
6stringmerc 2 days ago 2 replies      
Searched this entire thread and no mention of DMCA or Safe Harbors. Linking to a 3rd party is good to avoid worrying about that sort of stuff. Not that I'm an advocate for illicit rehosting, but just commenting as an observer.

It gets briefly touched on at the very end of the article, but self-hosting may carry with it a whole lot of extra work to avoid getting a big bullseye painted on the company. I'm sure more than a few content creators - probably in the adult business - have their lawyers on speed dial. Will be interesting to see how this plays out.

It's not just a PR thing, it's a legal thing.

8
ourcat 2 days ago 0 replies      
From 2014, when Imgur took $40M in funding: http://techcrunch.com/2014/04/03/after-five-years-of-bootstr...

"In addition, the Reddit investment will finally formalize the already friendly relationship between the two sites, making them more symbiotic and maybe even more integrated in some way, though Schaaf declined to go into details as to what's ahead for the two, only saying that there's no promise of Imgur being Reddit's official image host at this time."

9
chjohasbrouck 2 days ago 0 replies      
When you're growing as fast as Reddit was, you don't change the recipe, you just keep cooking it. They already had the ingredients for exponential growth, so change in general just introduces unnecessary risks. They had no need for a larger exponent.

The outcome of that is:

-Alien Blue beat Reddit to mobile optimization of Reddit's own platform

-Imgur beat Reddit to sharing images on Reddit

-Reddit and Imgur now exist as pseudo-sister-sites with features that overlap on a fundamental level, and Reddit is cannibalizing its own equity in Imgur

Now that they can't just hire site reliability engineers and watch their graph go up like a volcano, their only play is to improve their product, but now they're running uphill.

I think it's generally correct to play it safe and maintain organic exponential growth if you have it, but when it comes to things as obvious as mobile optimization and image sharing, maybe it's ok for a company to come out of their shell and iterate on obvious stuff (even though they don't have to).

10
siegecraft 2 days ago 4 replies      
The imgur hate in here is odd. It seems like people are just assuming the worst based on one critical comment thread that got turned into a TechCrunch article. Which is somewhat understandable: I don't understand how they can offer a free, unlimited image hosting site either (and they've offered it for so long that, well, you can see the backlash they got when direct linking was taken away). Still, their management seems savvy enough to know that they would have to carefully manage that change.

There are at least two distinct types of imgur users: those who only use imgur as an image host, and "imgurians." The huge usage numbers come from the former, while the actual value (IMHO) is in the unpaid labor of the latter group (like most social media). That means imgur can piss off their freeloaders as long as they don't upset their community, but this is really no help at all, because the community is far more critical than the freeloaders. Freeloaders don't care about imgur's community drama.

11
pfarnsworth 2 days ago 8 replies      
I don't know why reddit would want to take on the burden and cost of dealing with images. It might make it incrementally easier, but with something like RES the entire thing was seamless. Seems like a misplaced focus on a feature that doesn't bring that much value, and a lot more cost. I'll be curious to see how successful this ends up being. If they wanted to concentrate on making the site easier to use, they should have hired the people who made RES.
12
boyce 2 days ago 1 reply      
Spotted this was happening nearly two months ago - surprised it's taken the tech blogs a while to cotton on

https://news.ycombinator.com/item?id=11453224

13
clw8 2 days ago 1 reply      
I bet Reddit will finally be blocked in China because of this. Imgur has been blocked for years but Reddit's never been blocked.
14
Keyframe 2 days ago 0 replies      
It was inevitable. Not because of imgur going bad (did they? They did make some questionable moves regarding direct image links, but you can't blame them considering the volume). It's because of ye olde adage here on HN: if your business relies on someone else, be prepared for their moves (to not use harsher words). A large volume of popular reddit-submitted content is imgur-based. They have to get that (back) under control. The smarter move would be to outright buy or merge with imgur, considering their content and user base, but that has probably been considered.
15
wassago 2 days ago 0 replies      
I think it'll be the same as when StackOverflow introduced snippets and everyone thought JSFiddle would fade away because of it... In reality there was zero impact on the traffic from SO.

I'm pretty sure the same will be true for Imgur.

16
jasonm23 2 days ago 1 reply      
The community on Reddit can get pretty salty at times, but the open ugliness of Imgur comments gives "vintage YouTube" a run for its money.
17
visarga 2 days ago 2 replies      
Can reddit clone the pics previously uploaded to imgur, or does it have its eggs in someone else's basket?
18
alttab 3 days ago 1 reply      
Turns out an S3 bucket in itself isn't a website.
19
ben_jones 2 days ago 0 replies      
No matter where your personal beliefs put you on the subject, Reddit has been taking a lot more interest in the quality and nature of its content lately. This means subreddit bans, shadow bans, and similar activity. I wonder if this will increase once Reddit has complete control and monitoring over its image hosting. It also opens them up to DMCA and other complaints.

From an engineering perspective Reddit had frequent crashes and severe lag for years. They've improved a lot in the past two, and seeing them confident enough to launch a service like this says something about their progress in that regard. However I could see it be a complete flop as well. Time will tell.

20
kinkdr 3 days ago 4 replies      
Bad news for Imgur...
21
Guest98123 2 days ago 2 replies      
I'm surprised one of the big companies (Facebook, Google, Microsoft, Amazon), or one of the world's governments doesn't start up a simple image hosting site with direct linking, and foot the bill in exchange for tracking everyone, and having more of the world under their umbrella, visiting daily.

I'm assuming those companies also have the infrastructure to reduce bandwidth expenses, and they could accept the loss in the short term to have a well-established service and community for when image hosting becomes more profitable in the future.

A simple image hosting service is relatively quick to develop, and it's almost guaranteed to be one of the most used services on the internet. Why are these companies wasting hundreds of millions of dollars on all sorts of failed projects, when they could easily win this market?

22
tech-no-logical 2 days ago 1 reply      
and it uses cloudflare, so browsing for pics gives me another gazillion captchas because I'm on a vpn. well, thanks a lot...

at least imgur didn't use cloudflare.

23
jug 2 days ago 0 replies      
I think this is a good move. Imgur got used only because it was convenient and because it didn't have any stuff brewing around the service. That it was created by a redditor probably also helped. Now it is a service with a lot going on around it, and a competitor to Reddit in a sense, being its own community oriented around discussing trending topics or creating memes.

Since Reddit has their own service for this now, hopefully they can use their revenue to keep it clean, at least reasonably ad-free, and focused on its simple mission. If that happens, I think this is a step forward from what Imgur has become. Not anything bad per se if you want a different community, but not optimal if you already are in one and just want to view the pictures.

24
rezashirazian 2 days ago 1 reply      
Viewing images on reddit has always been broken. Without RES (Reddit Enhancement Suite) the UX on the site is horrendous. And even with RES, images not hosted on imgur wouldn't work most of the time.

It was so bad I took it upon myself to aggregate images from /r/funny and put them on a mobile-friendly and fast website. http://www.pixpit.com

Unfortunately, every time I tried to talk about it on reddit, they deleted it.

25
corndoge 2 days ago 0 replies      
https://docs.google.com/spreadsheets/d/1kh1TZdtyX7UlRd55OBxf...

There are and have always been plenty of simpler alternatives with a variety of features. I stopped using imgur when direct links stopped working.

Personal fav: https://uguu.se/

26
amelius 2 days ago 1 reply      
Of course Imgur could take revenge by breaking all images that are still linked to from Reddit.

Did Reddit make backup copies?

27
ljk 2 days ago 0 replies      
28
zkhalique 2 days ago 0 replies      
This is what you get for relying on someone else's centralized platform.
29
yc-kraln 2 days ago 1 reply      
I posted a version of this comment seven years ago in the original reddit launch thread for imgur: I have server resources and am happy to host any reasonable attempt to make a quality image host that has a (to me) valid business plan.
30
baldfat 2 days ago 4 replies      
> I saw that multiple times here on HN now that people misuse the term. It's a pity, the original idea is something to be aware of, and diluting what dark pattern means hinders that awareness.

Sorry, long post. TL;DR: Fighting people over word definitions is frustrating, and words change meaning when one aspect gets popular.

I have fought this war over and over again over the following words:

Troll - Everything is a troll now. It used to be just someone who tried to ruin other people's fun. Now anything that makes anyone laugh is a troll.

Hacker - This site is a case in point. A hacker was a person who would make inventions from pieces that didn't normally get taken apart, put back together into something different than their original intent. Then it came to mean criminals. Now it's startup coverage websites where, if you use the terms of hackers of long-gone days (e.g. M$), you get downvoted for being unprofessional. :)

Humanism - Means atheist? Humanism was founded by the leaders of the Renaissance, and Reformation movement leaders would self-identify as Humanists. I spent a 5-day Reddit AMA with the President of the American Humanist Society (Secular Humanist) arguing that they can't hijack the word and change its meaning.

Humanism and related terms are frequently applied to modern doctrines and techniques that are based on the centrality of human experience. In the 20th century, the pragmatic humanism of Ferdinand C.S. Schiller, the Christian humanism of Jacques Maritain, and the movement known as secular humanism, though differing from each other significantly in content, all show this anthropocentric emphasis. http://humanism.ws/featured/a-history-of-humanism-robert-gru...

31
albedoa 2 days ago 1 reply      
> One last try. I'll stop then.

This is the least believable thing you've said.

32
kordless 2 days ago 0 replies      
Maybe I'll turn out to be right after all.
4
Crying robinwe.is
647 points by sinak  1 day ago   236 comments top 47
1
donatj 1 day ago 24 replies      
I am a big burly guy with a beard - and I get overwhelmed and cry regularly. Not even from sadness, just from general emotion.

For example last night I watched the trailer for Overwatch[1] and got so excited I started tearing up. I don't know why, anything that triggers ANY sort of strong emotion in me brings it on. Always has.

My wife makes fun of me for it. I don't see anything wrong with it.

[1] https://www.youtube.com/watch?v=FqnKB22pOC0

2
Smaug123 1 day ago 3 replies      
For anyone else who was surprised, like me, that there was so much crying recorded in this article, you might find [a certain Reddit thread][1] interesting. I had been under the impression that basically no-one cried ever, but it turns out that some hormones just seem to make it happen.

[1]: https://www.reddit.com/r/AskReddit/comments/4g1pgu/serious_t...

3
aantix 1 day ago 2 replies      
I'm really glad someone had the courage to share this data. Growing up I was always under the impression that it was wrong to cry, and it probably exacerbated some of my depressive episodes.

Strangely, after becoming a father, I feel like I'm even more sensitive and can cry at the drop of a seemingly innocent comment. I bet I cry almost daily. But after it occurs, it's refreshing. I feel stronger.

4
Jaruzel 16 hours ago 0 replies      
I love this thread. I've been a long time HN lurker, but only fairly recently did I create an account and start contributing.

Seeing the honesty and emotion in these comments really makes me feel I'm hanging out with a good bunch of people. It's so refreshing compared to the toxic communities of the other large social sites.

I'm a crier. I cry a lot. I'm also clinically depressed with social anxiety, which makes social interaction difficult, so my emotions are mostly on the surface anyway. Crying for me is the best form of emotional release. I cry at the big endings in films, I cry at the end of amazing books (The Green Mile totally destroyed me; I was in business class on a plane at the time, blubbering my eyes out...).

On balance though, I also get very angry often. I guess you can't have one without the other...

5
SandersAK 1 day ago 0 replies      
This is awesome. I think so much study goes into anger management and trying to understand why we get angry, but it seems that other emotional outbursts are often shunned in discussion.

As someone who comes from a family of people (read: Chinese and Southern) who hold an intense stigma around crying, it's been a struggle for me to better empathize with and understand crying in other people.

I think the author nailed the use of personal logging here - it's not about extrapolating onto others, it's just another form of self-reflection.

6
tbabb 1 day ago 7 replies      
I had no idea that people cry this much. It's like pulling back a curtain.
7
ohitsdom 1 day ago 1 reply      
Incredibly interesting, and waaaay more data than I expected. The fancy d3 weighted tree/forking chart was fascinating.

> I tried to categorize cries as they were happening (because I wanted to create a real-time crying dashboard)

Sure, who hasn't wanted to create a real-time crying dashboard?

8
modoc 1 day ago 1 reply      
I almost never cry. Some of my friends joke that I cry once a decade, but honestly that is pretty close to the truth. I'm not ashamed to cry, and I don't try to not cry, or anything like that, I just almost never want to cry...
9
tjbiddle 22 hours ago 0 replies      
Everyone is an emotional being; whether they show it to the world regularly or not. I'm normally a very stoic, "serious", sometimes up-tight and irritable person. I know I've turned off many people from the way I interact. But get me in-front of someone I can truly connect with, and I'm a whole different soul - I laugh, smile, my heart flutters, I'll cuddle up to a partner, the works. It all depends how comfortable the other person makes me feel.

I'm very into personality typing and identify as INTJ on the Myers-Briggs (which is the smallest type, at ~1-2% of the population). I met a girl the other week who I found out was INTJ as well, and we connected instantly on a whole new level - it still blows my mind when I talk to her; and we were both able to share that very quickly. But we're both very stoic and keep our guards up in public until we click with someone and can relax.

10
chris_va 1 day ago 2 replies      
Long distance relationship, infidelity breakup, family illness, and bedbugs all in a year? Yikes, glad they already had a therapist to speak with.
11
howlingfantods 1 day ago 2 replies      
Fascinating dataset. I think ending a long-term, long-distance relationship (and then finding out he was married all along) probably skewed this dataset upwards.

Last time I cried was watching the most recent episode of Game of Thrones. And then watching YouTube reaction videos of that episode, a little less crying with each video. Hodor...

12
Kluny 1 day ago 0 replies      
The data collection, analysis, and emotional honesty of this post are extremely impressive.
13
greenspot 1 day ago 2 replies      
Disney movies.

Literally every single Disney movie gets me to cry. Usually towards the end, around the climax, when the soundtrack lets the harps and violins kick in.

Not sure if it's me, or if Disney has some special Crying Department which meticulously orchestrates a crying storyboard.

14
morgante 1 day ago 0 replies      
It's amazing how much variability there is in how much people cry. Also, the intensity obviously varies a lot: I've never cried for longer than maybe 5 minutes.

Personally, I'm tempted to start a similar log but for the exact opposite reason: I cry so little that I don't remember the last time I did, and I would like to.

15
transpy 1 day ago 1 reply      
I dunno... I find the notion of crying as a reaction to fictional (or even non-fictional) media foreign. (Man with just a goatee, almost 33.) I do consume fiction, but I just don't 'suspend disbelief', as it is called. Call me a fiction grinch, but it's just the way my brain works.

I recently changed jobs and moved to a new city. When, after a long hiring process, I was given the job, I cried a little (like 10 seconds) out of joy. Then, once I arrived in the city, it was hard to find a new home, but I faced it without self-pity: I had decided to move, so I had an obligation to face whatever hardships emerged along the way. But the day I finally found a home and everything started to settle, when I finally had the keys, I sat down in one of the rooms and, beer in hand, cried, again out of relief and joy.

When I was younger I used to cry easily, but as I get older I just don't cry often anymore. Somehow it has become a reaction reserved for existential highlights, so to speak.

16
jetcata 1 day ago 0 replies      
This is super interesting, I've found that as I've aged I cry a lot less than when I was younger. If gender makes any difference, I'm a woman. I found that when I was younger I cried a lot, at all sort of things. Now I find it difficult to cry, and I kind of miss that emotion, because I think it's an important part of what makes us human :(
17
projektfu 12 hours ago 1 reply      
Looking at the "heat map", it looks like the OP needs to start going to bed on time, like 10pm, on a regular basis. It seems she is staying up late and crying a lot during that time, perhaps because emotional things and overwork are getting to her then. Put the phone on Do Not Disturb and check out at 10pm.
18
dfar1 1 day ago 0 replies      
The crying post is good, but so is the rest of the website. An amazing amount of data.
19
SeanCrawford 1 day ago 0 replies      
I read somewhere the author of Starship Troopers, Robert Heinlein, wrote that when he realized he couldn't cry he took steps to learn to do so. I am a man still learning.

In my self-help group for recovery from abuse, even some of the women could not cry. We thought this was from bigger people saying, "If you don't stop crying I'll give you something to cry about," and also we thought they were bloody liars, unable to admit to us that our crying upset them and that they didn't want to feel (just as they took substances to avoid feeling).

20
Zelphyr 1 day ago 2 replies      
So many people here admitting they cry with an undertone that they feel various levels of bad for doing it. @donatj even said his wife makes fun of him when he cries.

We have got to make this a non-issue. It's so insane to me that in Western culture someone isn't allowed to express a perfectly normal aspect of their physiology. It's like shaming someone for puking. It may not be pleasant, but not doing it is way worse.

I sincerely believe we should push back, and HARD, when we're shamed for crying. "We" being anyone because, let's face it, women are shamed for it too.

21
auganov 1 day ago 0 replies      
Since everyone is talking about their crying habits - I have this very annoying tendency to get teary-eyed/cry when having intense discussions with some people, especially if there's any hint of negativity. Awkward for obvious reasons, so I just tend to avoid those discussions.
22
ghoul2 1 day ago 0 replies      
This is _so_ well done!

Could you talk about how you kept track of this when you were away from your computer? Any app, etc.? Did you make a record right then, or just a mental note and transcribe it later?

I apologize for focusing on the logistics part of it :-)

23
jaytaylor 1 day ago 2 replies      
Can anyone explain how the visualization with the header '? -> ?? -> ???' (a little past the middle on the right hand side) was generated?

I'd really like to know how spreadsheet data gets turned into that awesome D3 diagram!
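One common pipeline (sketched here purely as an assumption about how such diagrams get built, not as the author's actual method) is to nest flat spreadsheet rows into the `{"name": ..., "children": [...]}` JSON shape that d3.hierarchy consumes; the column layout and names below are hypothetical:

```python
# Sketch: turn flat spreadsheet rows (a path of categories plus a count)
# into the nested {"name": ..., "children": [...], "value": ...} JSON
# that d3.hierarchy / d3 tree layouts consume.
def nest(rows):
    root = {"name": "cries", "children": []}
    for path, value in rows:
        node = root
        for part in path:
            children = node.setdefault("children", [])
            # Reuse an existing child node for this category, or create one.
            child = next((c for c in children if c["name"] == part), None)
            if child is None:
                child = {"name": part}
                children.append(child)
            node = child
        # Accumulate counts on the leaf so repeated rows sum up.
        node["value"] = node.get("value", 0) + value
    return root

rows = [(("work", "deadline"), 3), (("work", "review"), 1), (("movies",), 2)]
tree = nest(rows)
```

Dumping `tree` with `json.dumps` gives a file a D3 tree or flame-style diagram can load directly.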

24
soneca 1 day ago 0 replies      
My crying is triggered by positive stories - most of the time, specifically, by demonstrations of solidarity.

Sports, natural disasters, simple day-to-day actions: whenever I feel there is real and genuine altruism.

I rarely cry for sad or melancholy reasons.

25
iandanforth 1 day ago 0 replies      
If this began happening to me I would consider it a serious medical issue. Crying on a daily basis? That seems debilitating.
26
louprado 1 day ago 0 replies      
During allergy season my eyes are always on the verge of tearing up. Then even mildly enthusiastic conversations will cause my eyes to fully tear up, and that makes me feel emotional. It is embarrassing and often leads me to abruptly end conversations.

There have been numerous studies showing facial expressions can affect emotion; it is a positive feedback loop [1]. The situation I described above isn't a facial expression per se, but does anyone else feel like eye strain or irritants (soldering?) make them more emotional?

[1] http://scienceblogs.com/neurophilosophy/2010/04/16/botox-may...

27
djhworld 17 hours ago 0 replies      
I haven't cried for years, and I don't wear that as a badge of honour either, it's just never been a thing for me.

I've had the 'lump in the throat' moments, sure, but not breached the wall to have the dam burst in quite some time. In fact I think the last time I properly cried was when my dog was euthanised and that was 12 or so years ago. That was the first time I'd seen my Dad completely broken too.

28
therealdrag0 1 day ago 0 replies      
I don't cry often. But a common type of cry for me is in narratives, when someone achieves something.

For example, I cried reading The Devil in the White City when the Ferris wheel started moving for the first time.

29
agumonkey 1 day ago 0 replies      
Weird anecdote: grief can lead to health-damaging anger when sometimes what you need is to cry. Pardon the metaphor, but it really feels like leaking the pain out instead of keeping it rotting inside your head.
30
burnguy123 1 day ago 1 reply      
Is anywhere near this amount of crying normal? I have never tracked it, but I would suspect mine would be more like once every year or two if we remove allergies.
31
nice_byte 21 hours ago 0 replies      
Holy wow, I am genuinely surprised. I would never have imagined that it's possible for grown-up people to cry more than one or two times per year, let alone cry enough times to gather stats about it!
32
NTDF9 1 day ago 0 replies      
What a great study! We engineers and rational-thinking oriented people tend to disregard how our emotions affect us on a daily basis.

Makes me wonder... Would I want an emotion tracker (app or hardware) that lets me study ME? Yes.

Do I want it in the cloud, or to have money made from it? Not at all. Opening up my emotions opens me up to a lot of abuse.

33
Jean-Philipe 1 day ago 0 replies      
For me, crying was mostly related to lack of eating and working out, lack of sleep, or too much work. The triggers were real: relationship problems, work-related issues, you name it. But as long as I eat well and work out, I can take on a lot of personal problems without crying.
34
novia 1 day ago 0 replies      
Hey, girls cry more (typically) than guys.

As noted in the article, different people have different sensitivity levels that their emotions must meet before the waterworks start. Just because you cry less than the author does not mean that her amount of crying is abnormal.

Personally, I cry just as much as her if not more.

35
_audakel 1 day ago 2 replies      
i read this and sent my wife a link. she wanted to know why i was analyzing her crying. i told her i just wanted to run a neural net on her crying data.....

http://i.imgur.com/NE380tW.png

36
partiallypro 1 day ago 0 replies      
I cry, maybe 3 times a year, and tear up maybe 10 times a year. Last year was really rough, so it was a bit more, but I have no idea how someone could cry -that- much and still have a livable life.
37
sharp11 1 day ago 0 replies      
This is such a wonderful blending of genres. We techies often can seem (or be?) detached from emotion. The rigor with which analytics were applied here ("real-time crying dashboard"!) is genius. So funny and so true.
38
pmiller2 23 hours ago 0 replies      
I've had a lot of 3's and 4's, and a couple of 5's in the past year. It's been rough. :(
39
kpwagner 1 day ago 0 replies      
Well executed analysis. It takes real commitment to follow through and track these results, then actually do something with the data. Better than a lot of TEDx talks I've seen lately.
40
_RPM 1 day ago 0 replies      
After staying up for 3 days straight coding with caffeine, I had a cry spell. It was like a religious experience.
41
gadders 13 hours ago 0 replies      
I think I have cried twice in ten years. I don't cry if possible, and certainly not in public.
42
geedzmo 22 hours ago 0 replies      
Go Cats
43
imaginenore 1 day ago 2 replies      
> I've always considered myself to be a bit of a crybaby

No, that's some ridiculous amount of crying. I wonder if it's severe depression or Pseudobulbar affect or some hormonal disorder.

https://en.wikipedia.org/wiki/Pseudobulbar_affect

44
Xcelerate 1 day ago 2 replies      
Edit: I have retracted this post; it came across in a way that I did not intend it to. Thank you to those of you who explained the response I received.
45
hwhatwhatwhat 1 day ago 1 reply      
If I understand you correctly, I can see what you mean about emotional openness being a more feminine trait, and understand your implication that homosexual males may lean more towards that behaviour too (stereotypically, at least) - but I really don't understand what race has to do with this?
46
dang 1 day ago 4 replies      
Please don't take HN threads on generic ideological tangents. Especially not a thread like this, which is full of people relating specific experiences.
47
MicroBerto 1 day ago 4 replies      
Notice how many cries are due to relationship-based issues.

You are putting yourself at serious emotional risk if you allow your happiness to be driven by another person, regardless of who it is.

IMHO, your emotional success should be based upon things you can control, and if it's not, then you need to make more rapid decisions as to who or what is allowed to take part in your life.

5
Apple, Microsoft, and Google hold 23% of all U.S. corporate cash geekwire.com
384 points by sethbannon  1 day ago   152 comments top 15
1
WildUtah 1 day ago 5 replies      
The US tax code heavily punishes companies that hold cash in a variety of ways. The motive is to subsidize banks by requiring companies to rely on short term debt for operations. It's the biggest and most lucrative of subsidies for the banking industry and laundering such subsidies is how many bankers get rich. For instance, Mitt Romney made his money that way.

(The main tax code subsidies here are the business interest loophole and the accumulated earnings tax)

Usually a big company holds almost no liquid assets and hands profits out to bondholders and shareholders or invests actively to grow, even in unrelated businesses. That's to avoid tax penalties.

Apple, Google, Microsoft, and the like are holding masses of cash outside the USA. Gridlock in Washington and a stagnant and irrational corporate tax code make it very hard to bring cash home to invest for American companies. Therefore you find tech companies with few overseas expenses exporting services and accumulating cash they can't bring home.

(The main tax issue here is the non-territorial tax system in the USA. All other developed countries have a territorial system.)

Other countries can see this is a problem that kills jobs and wages at home so they don't do it. Obama has tried to fix it and Trump's main tax proposal is aimed at it, but both parties in Congress block change. Republicans don't want to hand Obama a victory and would rather hand out goodies to donors with loopholes than simplify taxes and Democrats just want to punish profitable multinationals even if it kills jobs.

And Wall Street is very mercurial with credit for tech companies, especially growing ones, so they can't rely on the banking subsidies.

The result is that tech giants are the only companies with a reason to accumulate cash so they overwhelmingly are the only ones that do so.

In summary, this is a government regulatory policy that creates cash rich tech companies. It's a result of bad decisions on Capitol Hill, not on Wall Street or Sand Hill Road. It is not a stock valuation or corporate strategy issue.

2
boxy310 1 day ago 5 replies      
There was a very interesting article in the NY Times earlier this year about this subject:

"For other industries, though, a dollar of savings is worth a lot more than itself. For pharmaceutical companies, a dollar in savings is worth $1.50. For software firms, it's even higher: more than $2. This means that investors are behaving as if they trust the executives in these industries, like Larry Page of Alphabet, to be smarter about using that money than the investors themselves could be. ...

"Why? The answer, perhaps, is that both the executives and the investors in these industries believe that something big is coming, but (and this is crucial) they're not sure what it will be. Through the 20th century, as we shifted from a horse-and-sun-powered agrarian economy to an electricity-and-motor-powered industrial economy to a silicon-based information economy, it was clear that every company had to invest in the new thing that was coming. These were big, expensive investments in buildings and machinery and computer technology. Today, though, value is created far more through new ideas and new ways of interaction. Ideas appear and spread much more quickly, and their worth is much harder to estimate."

Source: http://www.nytimes.com/2016/01/24/magazine/why-are-corporati...

The implication is that big tech companies see some very disruptive trends coming down the pipeline, but they're not sure which specific idea or ideas should take most of the investment, and they're hedging their bets by hoarding cash for now. Maybe when the AI/automation revolution kicks into high gear, we'll finally see what they spend all this money on.

3
gozur88 1 day ago 0 replies      
I think Occam's razor applies here: It's not that management teams have big plans for the money, or that they "believe that something big is coming, but... theyre not sure what it will be". It's that they don't know what to do with the money and there isn't much pressure to disburse it.

Cash hoards are reflected in the share price, so investors can monetize the cash by selling a few shares. If you're a "buy and hold" investor you don't want a dividend, because that's a taxable event.

4
MrMullen 1 day ago 2 replies      
When a company like Apple has $200B in "cash" overseas, where do they put it? I mean, is there literally a giant vault of physical cash or gold somewhere or do they buy 6 month T-Bills or just put it all in a private bank?

BTW: $200 billion in $100 bills weighs 2,200 tons.
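A quick back-of-the-envelope check of that figure (assuming the commonly cited ~1 gram per US banknote):

```python
# Sanity-check: the weight of $200 billion in $100 bills,
# assuming each banknote weighs about 1 gram.
bills = 200_000_000_000 / 100       # number of $100 bills: 2 billion
grams = bills * 1.0                  # ~1 g per bill
metric_tonnes = grams / 1_000_000    # 1 tonne = 1,000,000 g
short_tons = grams / 907_185         # 1 US short ton = 907,185 g
print(round(metric_tonnes), round(short_tons))  # 2000 tonnes, ~2205 short tons
```

So the "2,200 tons" figure checks out, taken as US short tons.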

5
Analemma_ 1 day ago 0 replies      
Is it apropos on HN to cite one's own comment? I made a comment about this a month or so ago that I think is relevant: https://news.ycombinator.com/item?id=11592455. Essentially, I think this cash hoarding is motivated by fear: fear of being "disrupted" and becoming the next RIM or Palm or whatever, in a dominant position in year N and dead as a doornail in year N+3. And this is the toughest kind of fear, because it's perfectly reasonable but can still get pathological when taken too far.
6
ksec 23 hours ago 0 replies      
As someone who is not a US citizen: is it because the US is heading into the 2016 election that we have to be constantly reminded by the media that there are lots of US companies holding huge amounts of cash overseas?

The number of these articles popping up is just getting silly. And yet not a single one has proposed a (fair) solution to the problem / US tax code. Every time, I just end up skim-reading the article or going straight to the HN comments for some of the best thoughts around.

7
foota 1 day ago 3 replies      
Maybe a stupid question, but why doesn't Apple start a vc fund?
8
bcheung 1 day ago 1 reply      
What is meant by cash in this context? I'm assuming a lot of it is invested in financial vehicles with varying degrees of risk and liquidity? It's not just sitting in a savings account earning 0.01% interest, is it?
9
bahmboo 1 day ago 0 replies      
So those 3 could cover federal spending for almost 2 weeks.
10
beezle 1 day ago 0 replies      
Once again the boatloads-of-cash story rears its head. Except that the figures stated are inflated, most certainly in the case of Apple, which has already spent a good amount of that cash by issuing debt to pay for dividends and stock buybacks.
11
chirau 1 day ago 1 reply      
I am curious as to how links on HN work. I am certain i posted this very link two days ago. Does HN now allow multiple posts of the same link?
12
microcolonel 1 day ago 1 reply      
>Excluding the energy sector, tech comprised 40% of non-financial free cash flow in 2015, similar to the 42% average since 2007.

This writeup is a bit sensationalized. They try to argue that "tech" is the biggest by ignoring the actual two biggest. This is like arguing that men are the majority of the population by first excluding women and then counting 100% of the remainder.

13
eximius 1 day ago 0 replies      
I feel like I read a similar headline with the same figure, but it was 23% of technology corporations' cash...
14
known 18 hours ago 0 replies      
15
stellazhiu 1 day ago 6 replies      
When I hear the phrase "corporate profits", I think of the fact that they aren't really profits at all. According to a UN report, the amount of environmental damage these companies caused is on par in dollar terms with the profits they made. They just exchanged something for something else, except that the global population has to eat the losses in the form of pollution. Nothing is free in this world, unfortunately, not even corporate profits.

http://www.theguardian.com/environment/2010/feb/18/worlds-to...

6
Blocklist of all Facebook domains github.com
477 points by temp  18 hours ago   124 comments top 30
1
marios 17 hours ago 9 replies      
Not the way I'd do it, since you can easily miss some new domain that belongs to Facebook (or some server that does not look like it belongs to Facebook in the first place, but is sitting in their assigned subnets).

If you really want to block all traffic from/to Facebook, look up the IP prefixes associated with their AS number (AS32934), and set up your firewall to block those. If you are using PF, tables are your friend. With netfilter, consider using ipset.
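A rough sketch of the netfilter route on Linux (assumes whois, ipset and iptables are available and that the RADB registry carries the AS's announcements; run as root, and remember that IPv6 prefixes live in `route6:` objects):

```shell
# Pull "route:" lines out of RADB whois output for an origin AS.
extract_prefixes() {
  awk '/^route:/ {print $2}'
}

# Build an ipset from an AS's announced IPv4 prefixes and drop traffic to them.
# Usage (as root): block_as AS32934 facebook
block_as() {
  asn=$1; set_name=$2
  ipset create "$set_name" hash:net -exist
  whois -h whois.radb.net -- "-i origin $asn" | extract_prefixes |
    while read -r prefix; do ipset add "$set_name" "$prefix" -exist; done
  iptables -I OUTPUT -m set --match-set "$set_name" dst -j DROP
}
```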

2
kazinator 14 hours ago 1 reply      
It's inefficient to specify a large number of hosts in the facebook.com domain instead of blocking the whole domain.

For this, you can run dnsmasq and use the "--address" option or "address" command in dnsmasq.conf:

 $ man dnsmasq
 [...]
 -A, --address=/<domain>/[domain/]<ipaddr>
        Specify an IP address to return for any host in the given
        domains. Queries in the domains are never forwarded and always
        replied to with the specified IP address, which may be IPv4 or
        IPv6. To give both IPv4 and IPv6 addresses for a domain, use
        repeated -A flags. Note that /etc/hosts and DHCP leases
        override this for individual names. A common use of this is to
        redirect the entire doubleclick.net domain to some friendly
        local web server to avoid banner ads. The domain specification
        works in the same way as for --server, with the additional
        facility that /#/ matches any domain. Thus --address=/#/1.2.3.4
        will always return 1.2.3.4 for any query not answered from
        /etc/hosts or DHCP and not sent to an upstream nameserver by a
        more specific --server directive.
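Concretely, two dnsmasq.conf lines blackhole the whole domains instead of hundreds of hosts entries (a sketch; pick whatever sink address you prefer):

```
# dnsmasq.conf: answer every query under these domains with 0.0.0.0
address=/facebook.com/0.0.0.0
address=/fbcdn.net/0.0.0.0
```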

3
mp3geek 16 hours ago 1 reply      
Alternatively you can use Adblock to block it.

https://secure.fanboy.co.nz/fanboy-antifacebook.txt

Disclaimer: list author.

4
zhong 11 hours ago 1 reply      
No need to do this; come to China, where everything is already prepared for you like this.
5
avree 17 hours ago 1 reply      
If you're blocking Instagram, shouldn't you be blocking the Oculus Rift site (and any subdomains) too?
6
punnerud 17 hours ago 1 reply      
This does just the same:

 .facebook.com
 .fbcdn.com
 .fbcdn.net
 .facebook.com.edgekey.net
 .facebook.com.edgesuite.net
 .instagram.com
 .instagramstatic-a.akamaihd.net
 .instagramstatic-a.akamaihd.net.edgesuite.net
 .cdninstagram.com
 .tfbnw.net
 .whatsapp.com
 .fbsbx.com
 facebook-web-clients.appspot.com
 .fb.me
 fbcdn-profile-a.akamaihd.net
 h-ct-m-fbx.fbsbx.com.online-metrix.net
 ac-h-ct-m-fbx.fbsbx.com.online-metrix.net
7
maaaats 17 hours ago 1 reply      
Would wildcard support in hosts files be too heavy for the performance needed? Most of these are subdomains that *.facebook.com would have blocked.
8
cm2187 17 hours ago 2 replies      
I can understand the multiplication of sub domains, to be able to use multiple connections. But what's the rationale for the multiplication of domain names? Ad blocker avoidance?
9
mfo 17 hours ago 4 replies      
I just want to say "Privacy matters, thank you" (even more now that FB decided to leverage their like/share button for a global ad network) :-) Non-tech-savvy folks might love a simple .sh / .bat to automatically add those entries to the hosts file on Windows & Unix.
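The Unix side of such a script could be as small as this sketch (the blocklist filename and the marker comment are assumptions, not part of the repo):

```shell
# Append a hosts-format blocklist to a hosts file, once.
append_blocklist() {
  hosts_file=$1      # /etc/hosts on Unix
  list_file=$2       # one "0.0.0.0 domain" entry per line
  marker="# facebook-blocklist"

  # Idempotent: only append if our marker is not there yet.
  if ! grep -q "$marker" "$hosts_file"; then
    { echo "$marker"; cat "$list_file"; } >> "$hosts_file"
  fi
}

# Usage (as root): append_blocklist /etc/hosts facebook-hosts.txt
```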
10
joeblau 15 hours ago 2 replies      
Will there be a point where the government will step in or is all of this tracking within fair use of non-logged in Facebook users visiting a website?
11
justsaysmthng 16 hours ago 4 replies      
I'm sorry, I've been away for a couple of hours... What happened ?

Why should I (we) block all facebook domains ?

12
TazeTSchnitzel 15 hours ago 0 replies      
Should you not also include `::` IPv6 entries?
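For reference, a null entry for both address families in a hosts file would look like this (the host name is illustrative):

```
0.0.0.0  www.facebook.com
::       www.facebook.com
```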
13
elcapitan 17 hours ago 0 replies      
Oh wow, I didn't know there were that many; I had like 20 in my hosts file. Thanks!
14
rotoole 13 hours ago 1 reply      
Aside from being easier to automate, getting IPs via the ASN lookup is also better for blocking HTTPS requests when you are MITM, since the HTTPS request will only contain the IP and not the FQDN.

Also, many firewalls do a one-time DNS lookup of a given FQDN to resolve a single IP address when an FQDN-based rule is created. This doesn't work well if you have an FQDN that can resolve to many different IPs, which is typical for cloud services.

15
stardogg 17 hours ago 1 reply      
I'm wondering ... what's the best approach to automatically collect all domains of a company?
16
hallatore 17 hours ago 3 replies      
Isn't Ghostery a better solution for something like this? If we are talking about browsers that is.
17
mickrussom 11 hours ago 0 replies      
I'm going to start trying this out. Awesome. Also, as marios said, block AS32934's networks.
18
jpkeisala 16 hours ago 4 replies      
Slightly off topic: It would be nice to have some kind of extension for Chrome that blocks all Time-Wasting websites with one click. Has anyone seen something like that?
19
petrikapu 15 hours ago 2 replies      
Do you know how to apply this just for my user on OS X? My partner is a heavy FB user and I don't want to block her...
20
midgetjones 17 hours ago 1 reply      
It is faintly terrifying just how long the list is.
21
hackney 8 hours ago 0 replies      
So awesome. Facebook: Server not found
22
curiousgal 17 hours ago 1 reply      
Any idea of an easy/quick way to toggle these edits on/off on Ubuntu?
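One quick way on Ubuntu is to wrap the entries in marker comments and flip them with sed; a sketch (the FB-BEGIN/FB-END markers are an assumption, lines you add yourself):

```shell
# Toggle the marked Facebook block in a hosts file on and off.
toggle_block() {
  f=$1
  if sed -n '/# FB-BEGIN/,/# FB-END/p' "$f" | grep -q '^0\.0\.0\.0'; then
    # currently enabled: comment the entries out
    sed -i '/# FB-BEGIN/,/# FB-END/ s/^0\.0\.0\.0/#0.0.0.0/' "$f"
  else
    # currently disabled: uncomment them
    sed -i '/# FB-BEGIN/,/# FB-END/ s/^#0\.0\.0\.0/0.0.0.0/' "$f"
  fi
}

# Run as root: toggle_block /etc/hosts
```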
23
curiousgal 17 hours ago 1 reply      
>On Windows 7, the default AV security scan will try to remove the # facebook.com entry

Wat?

24
ausjke 13 hours ago 1 reply      
IMHO the only way to block things efficiently these days is via a proxy; IP/domain-based blocking is not reliable or efficient.
25
zxcvcxz 15 hours ago 3 replies      
Honest question because I seriously don't know: Is facebook really worse than google when it comes to privacy?

I kind of wonder who exactly are the people telling everyone to block facebook everywhere while everyone seems to collectively ignore google.

Google and facebook seem to both purposely ignore the known implications of their data collection programs. They likely have handed over data to the NSA, and we know they sell the data.

26
meeper16 11 hours ago 0 replies      
This is awesome. Facebook needs to go to the graveyard with friendster, myspace and AOL.
27
hellbanner 15 hours ago 0 replies      
Thank you! Now I can censor, too
28
supergirl 14 hours ago 0 replies      
dramatic and useless.
29
hartator 11 hours ago 0 replies      
I would actually do it for Google. But we all know we have become too dependent on them.
30
labithiotis 15 hours ago 2 replies      
Who really cares?
7
React Tutorial: Cloning Yelp fullstackreact.com
511 points by jashmenn  2 days ago   243 comments top 41
1
swanson 2 days ago 9 replies      
I'm no Javascript Boilerplate Fatigue apologist, but there are many comments in here that are treating this as a "Learn how to use React" tutorial. This is not what is advertised nor the stated reason this article was written.

From the first sentence: "we get a lot of questions about how to build large applications with React and how to integrate external APIs". Large application structure, external integrations... not "Hello World".

Of course there is no need for multiple environments, or a robust testing setup, or an icon font, or 6 different webpack loaders to learn the basics of React. This is not a tutorial on the basics of React -- this is written from the perspective of a large production level application. And that "overhead" and "insanity" are helpful for real projects that are staffed by more than one person and need to last more than a week.

It seems absurd that it might take you 2 hours to setup the environment before you get to hello world. But if you are planning to use this as the foundation of a 6 month long project, 2 hours to get everything setup in a sane way is not an issue.

There is absolutely churn in the JS world and some of it does seem unbelievable because things haven't contracted and settled into a steady-state. But don't just levy a blanket "lol fuck javascript tooling" every time you see a complicated setup process; you are doing yourself a disservice by not critically evaluating the choices the author of this post made and determining for yourself if they will benefit your project and/or learning efforts.

2
eterm 2 days ago 18 replies      
To me this epitomizes what I feel as I'm trying to explore options for different front-end frameworks. In this article I'm 30 screens down (literally 30 page down presses) and it's not even finished setting up the environment and dependencies.

Sure, this is something that you only do once, so if it then leads to much better development workflow it makes sense to have a solid investment upfront, but it makes it very hard to compare technologies and make informed decisions given how much up-front "glue" is needed to get a demo application up and running.

I'm not blaming react, everything else in the javascript space is just as bad right now. There's at least 4 different ways of loading modules (UMD, AMD, commonJS, es (babel) modules), 2 of which (require(), import .. from) are used within this example.

In fact, the whole process is so complex that part of it is "clone this package which has pre-configured parts of the process".

And all of this to glue together a couple of APIs.

3
pjs_ 2 days ago 4 replies      
This genuinely made me feel ill. The author has done a tremendous service to others - clearly and patiently listing the thousands of steps required to get a basic modern web app up and running. I agree with others that it is often difficult to find all the steps for a process like this in one place. At the same time, this is completely, totally fucking insane.
4
qudat 2 days ago 0 replies      
I've built a multitude of websites/applications in my lifetime, spanning many different programming languages, build systems, tools, etc.: PHP, Python, Go, Java, and JS. It seems painfully obvious to me that the people complaining about this setup/build system do not fully grasp the power and beauty of the JS ecosystem, more specifically React, Redux, time traveling, and hot module reloading. It is, without competition, the best development experience I have ever been a part of. There is no lag time between when I make a change in JS or CSS and when those changes get applied to the application. There's no compiling (even though there is a compile step), no refreshing the page, no stepping through all your previous steps to get to that part of the application; your changes are patched in real time and available to you automatically.

I guess the saying is true, that haters are going to hate, but there really is no competition in terms of development experience once you grok the ecosystem.

5
eagsalazar2 2 days ago 1 reply      
The criticism here is really baffling in how it blames the js ecosystem for the complexity in building large production apps..

I wouldn't find fault with a similar tutorial for "how to set up a production, at-scale Rails app with only server-side rendering and zero JavaScript". Between hosting, caching, deployment, worker queues, database provisioning, etc., LOL, that tut would be gigantic, and that makes sense!

If people are mad that making production client side applications isn't trivially easy, well that just isn't going to happen and that isn't because the js ecosystem is screwed up.

6
keyle 2 days ago 3 replies      
I've dealt with many technology stacks. If this is the future, we're F#$$%.

Seriously, how can people blast older technology like Flash and Flex (which was GREAT) out of the water for not using web standards, and then this Frankenstein of a "stack" goes so mainstream?

Sure, there is the VM problem, but the language was good and so was the framework. This looks horrendous and scary. Imagine maintaining this for the next 10 years, when "the standards" will have moved on to your "next gen" JavaScript "framework".

My only way to write web apps has been using micro frameworks, jquery, and small libs that do one thing and one thing only. I can handle serving pages with Go thanks, and I don't need a routing system that looks like a vintage joke. Sorry if I sound jaded, I've been doing this for 15 years.

7
Cyph0n 2 days ago 8 replies      
Great work on the tutorial. I'm sure it took a lot of time to setup, and it seems well written.

However I simply won't believe that setting up a simple React app requires so much overhead. Granted, I have no experience with React, and only marginal experience with frontend web dev.

As I read the tutorial, this is the list of questions I had:

1. Why do we need so many Babel presets? What do they do?

2. Why do we need Webpack exactly? Why not use a traditional build system like Gulp?

3. Why is Webpack so difficult to setup? Are there no pre-configured setups for React?

4. What the hell is postcss? Are Less and Sass out of fashion now?

5. And why all this added complexity to setup CSS? They are only stylesheets for God's sake!

6. Oh, so now we need to configure Webpack to support postcss? The definition of reinventing the wheel. Is there no plugin system for Webpack?

7. Why is it so complicated to setup multiple environments using Node and Webpack?

Phew, looks like we're done -- nope, we're not.

8. So many libraries just to setup a testing environment? I wouldn't be surprised if frontend apps aren't well tested...

9. Ah, we also need a "JSON loader", whatever the hell that is.

10. Great, another CLI tool for testing. And more configuration of course.

11. Webpack once more needs to be configured to support our new testing app.

12. We need a better spec reporter? Why? More configuration...

13. More Webpack configuration.. I'm already sick of it.

So many things to keep in mind, so many dependencies, so very many points of failure. If just one of these libraries is abandoned, or has a breaking change, your entire development environment is dead. Is this the current state of frontend web dev, or are these guys just overdoing it for the sake of the tutorial?

I find this all weird because I have the habit of thinking very carefully about every single dependency when I'm writing software. Do I really need it i.e. can the same task be achieved using the standard library? If not, how active is the development of the library (recent activity, issue response time, number of contributors)? How many libraries does it depend on - the fewer, the better? And even with all this, it's still not guaranteed that things will go smoothly!

8
rvanmil 2 days ago 1 reply      
I've been on the fence for quite a while, but a couple of weeks ago I finally bit the bullet and taught myself webpack, ES6, React, CSS modules, Redux, and everything else that comes along with these tools. It definitely felt like relearning my job (coming from working with Grunt, Backbone and jQuery) and it took a lot of time and effort to really get to understand everything (still learning many new things each day), but man was it worth it. I enjoy working with these tools very, very much and I am able to build apps significantly faster than I could before.
9
rvdm 1 day ago 0 replies      
First off: great tutorial! I wish more frameworks' native documentation came with more real-world examples. Redux's fantastic documentation is a step in the right direction, but still treats real-world solutions a bit too much as a side note.

Regarding Javascript fatigue, I want to share something that greatly helped me.

I have written enterprise applications for Fortune 500s using PHP, Rails, Backbone, Angular, React, Node, Express, Grunt, Gulp, Webpack, Yeoman, Bower, Redux, jQuery, CoffeeScript, Prototype.js (remember them!?), LESS, SASS... basically whatever was hot at the moment.

I've long enjoyed learning new things, but after making a solid investment in Angular only to find out none of its SEO solutions were really commercially viable, the fatigue hit me hard and I gave up on trying to learn new things for a while. I simply stopped caring.

Then I got approached by SpaceX for a JS full stack position. All they told me about the interview beforehand was that it would be very JS heavy, yet no details on what stack or framework they were working with.

To prep, I brushed up hard on my basic JS skills. Code School's JS Road Trip was very useful. So were "Eloquent JavaScript" and "JavaScript: The Good Parts".

After making that tough but very rewarding investment, learning React, Flux, Redux, Elm etc. all became a breeze. I no longer have any attachment to any framework. They're all just different ways of using JS to me. And no matter what the future brings, no matter how many frameworks and build tools get thrown our way, I don't think ( hope? ) my heavy investment in Javascript will soon disappoint.

So for those of you out there trying to figure out what to invest in next (React, Elm, RxJS...): my advice would be to get a deep understanding of pure JavaScript first. Ideally, try to build your very own framework using vanilla JS. Once you do that, you'll find each new framework is just a different opinion on how JS should be used.

Many frameworks have come and gone. But after more than a decade of investing in the Javascript language, it keeps rewarding.

10
showerst 2 days ago 0 replies      
This is a really great tutorial, it's rare to find pieces that step you through the whole process with dev/prod and tests without assuming you already understand the arcane setup.

It also shows what an arcane dependency-hell react is... how much boilerplate does it take to get up a map with a valid route? I hope this is something that becomes a bit more standardized/easier as the ecosystem evolves.

11
blueside 2 days ago 3 replies      
The setup required just to get a production-ready idiomatic `hello world` app in React is downright insane. Without the Facebook name behind it, I don't see how React could have ever made it this far.

Foolish of me to keep underestimating the pains that JS developers willingly tolerate.

12
hammeiam 2 days ago 1 reply      
I also got frustrated by the setup time for a simple react app, so I make a starter repo that I just clone for all my new projects. It's made to be as simple, light, and understandable as possible while still being useful. Check it out and let me know if you have any questions! https://github.com/hammeiam/react-webpack-starter
13
zeemonkee3 2 days ago 0 replies      
The anger in this thread does not bode well for the future of React.

I predict big, complex, arcane React stacks will be a punchline in a few years, much like J2EE/EJB is today.

And yes, I know React itself is a small library - Java servlets were a small, simple API that formed the foundation for a ton of over-engineered abstraction on top.

14
hathawsh 2 days ago 0 replies      
The complexity of this tutorial reflects the current state of web app development more than the complexity of React. React by itself is actually rather simple on the surface.

Even though I don't need this tutorial to be productive, I think I'm going to go through it anyway to fill in holes in my knowledge. It looks well written.

15
arenaninja 2 days ago 0 replies      
I haven't finished reading, but so far this is an excellent walkthrough. This goes far far FAR beyond the 'Hello World' and 'TodoApp' tutorials and demos the amount of tooling you have to dedicate to keep things as seamless as possible.

I recently wrote that in a side project it does not appear to be worth the effort, but that applies to my side project and nothing else. Your next project may well look a LOT like this.

16
ErikAugust 2 days ago 1 reply      
We (Stream) are releasing a React/Redux tutorial series in which you build a photo-sharing app (Instagram clone), for those who might be interested:

http://blog.getstream.io/react-redux-example-app-tutorials-p...

17
morgante 2 days ago 1 reply      
People should stop complaining about how "hard" it is to get started or how "complicated" React is.

React is simple. Its core abstraction is trivial (making views into pure functions).

If you want to, you can get started with React today without installing any software at all. Just include it from the CDN: https://cdnjs.com/libraries/react/

The rest is there because things like live-reloading are genuinely helpful. But you don't need to roll them yourself. There are dozens of great boilerplates you can base a new project off of.

Also, I've never had as much difficulty setting up a React environment as the constant struggle it is to get even a basic Java app to build from source.

18
kelvin0 1 day ago 0 replies      
Is it me, or does the JS tool ecosystem have the same problems encountered in some functional-language ecosystems (Lisp, for example)? I'm not comparing the languages here (I'm not a fan of JS), but how they lack the "batteries" and toolsets other languages have grown up with. Since you can do practically anything with JS (or Lisp), if something is lacking you simply write your own solution... repeated 1,000 times. This is why there seem to be a gazillion tools/solutions/frameworks, each doing something similar but different enough to cause massive confusion to newcomers. This is one of the reasons Dart appealed to me: the language came with a framework and tools that helped you get started right away, instead of getting bogged down trying to figure out whether you should use Webpack or one of its numerous alternatives.
19
bdcravens 2 days ago 0 replies      
I know there's a lot of dog-piling on this tutorial, but having gone through fullstack.io's Angular 2 book, if I wanted to learn React, I'd probably (and probably will) go with their title.
20
pacomerh 2 days ago 2 replies      
Cool tutorial, but first check whether React is what you really need for your next thing. Facebook created React to solve the problem of dealing with large applications with data that changes over time. React became popular to the point where word gets spread saying this is the newest and coolest thing to learn. At that point, the core idea and the key reasons React is cool get misunderstood, and we start assuming that React is the best choice for everything ("and Virtual DOM is so cool, done!"). So now apps that could be done in a third of the time with a simpler library or even vanilla JavaScript are being written with all these complex flows, dispatching actions and thunking async operations, when all you really needed was to display a list of things with Ajax. I'm not saying React isn't cool. I'm just saying: understand why React is relevant today and decide whether it's the right thing for your project. Of course these tutorials are going to use simple examples that are unrealistic; they're meant to help you understand how things connect with each other. But are you building a project that is worthy of all this setup?
21
drumttocs8 2 days ago 2 replies      
I think most everyone agrees that the amount of work to get all these pieces glued together before you can even start is ridiculous- a problem that Meteor set out to fix years ago. It faltered, of course, by being too opinionated. Now that Meteor fully supports React and npm, though, is there any reason not to use it? Sure does remove some pain points.
22
sergiotapia 2 days ago 1 reply      
Meanwhile here's how you work with React in a Meteor application.

https://github.com/nanote-io/nanote-web

No need to mess around with plumbing because honestly, who cares about that stuff.

23
freyr 2 days ago 0 replies      
All this effort to create a single page app that likely doesn't need to be a single page app in the first place.

If you're jumping through hoops to build an app that looks and feels like a traditional web site, you're doing it wrong.

24
mobiuscog 1 day ago 0 replies      
May I just thank you for:

a) providing an estimate of the time it will take

and

b) offering a simple way to receive it in PDF format for later consumption.

I wish every site did this.

25
jyriand 1 day ago 0 replies      
I wish somebody with clojurescript/reagent experience would rewrite this to see how it compares.
26
dandare 2 days ago 1 reply      
There is something wrong with me, but I can not (refuse to?) use the command line. It has no discoverability and no visible system-state information. Every time I try to use it I start to panic because I can not see my files, I can not see the state of my task, I can not see anything. Maybe I have some form of dyslexia. Anybody out there with similar symptoms? Anybody know of a remedy for my problem, please?
27
JoshGlazebrook 2 days ago 3 replies      
Am I the only person who just doesn't get the appeal of Webpack over using something like Gulp? It just seems to me like Gulp is so much easier to use and setup.
28
chrishn 1 day ago 1 reply      
Not React specific, but how do we tackle SEO for a site like Yelp, where SEO is very important? I want to turn an article-based site into a SPA, so it can consume a stateless API and horizontally scale. But I fear it'd lose a lot of pages in Google.
29
troncheadle 2 days ago 0 replies      
I have something VERY similar to this in production built with Angular, and let me tell you, I'm looking forward to the day that I get to refactor it in React.

I get that it's frustrating to do a lot of setup. But that's the nature of the game. We're all standing on the shoulders of giants.

React is a pleasure compared to other ways of conceptualizing and practicing building user interfaces on the front end.

30
missing_cipher 2 days ago 0 replies      
I don't understand what people are saying, the basic Hello World App is right here: https://www.fullstackreact.com/articles/react-tutorial-cloni...

Like 1/15th of the guide down.

31
mickael-kerjean 1 day ago 0 replies      
That's quite funny; I've spent the last few days achieving pretty much the same thing, but with Angular 2. If some people are interested in it, I might write a few things about it as well.
32
k__ 2 days ago 1 reply      
If you want to look into React, but aren't in the mood for all the setup:

https://github.com/kay-is/react-from-zero

33
tonetheman 2 days ago 0 replies      
Interesting... he needs to include his webpack config from each stage. He linked the full one, but that will not work when you are still in the starting parts.
34
gt384u 2 days ago 0 replies      
ITT: People who don't React well to critical feedback from new users
36
int_handler 2 days ago 0 replies      
The cover of this book is very a e s t h e t i c.
37
ahhsum 2 days ago 1 reply      
This looks awfully close to Python. Is that true, or am I imagining things?
38
nijiko 2 days ago 1 reply      
All of that, for that simple demo...
39
Scarbutt 2 days ago 1 reply      
yes, your problem seems to be learning by banging the keyboard like a monkey which you inherited from clicking like a monkey to get a mere sense of how to use something instead of reading some upfront documentation.
40
tschellenbach 2 days ago 1 reply      
More tutorials coming up soon: cabin.getstream.io
41
zxcvcxz 2 days ago 1 reply      
Lots of complaining but no one offers a better solution. Any article detailing how to build a yelp clone is going to be kind of long. It's really not that bad compared to doing something like this with LAMP. Much simpler than Angular too.

With Angular I feel like I have to re-learn web development and do everything the "Angular way", and who knows when Angular 3 is coming out and the "Angular way" completely changes.

And then what are the non-javascript alternatives? It really doesn't matter much because I'll have to interface with javascript anyway if I'm a professional webdev, so why would I add even more clutter to the already cluttered web dev world?

With React, I can learn a small framework that's highly extensible and basically pure javascript. I don't feel like I have to re-learn everything I know about web-dev when using react like I do with Angular.

8
Twilio S-1 sec.gov
373 points by kressaty  2 days ago   209 comments top 28
1
Animats 2 days ago 14 replies      
There's a lot there not to like:

 Revenue: $166,919,000 Net Loss: $38,896,000
So they're still not profitable. This is surprising, since they don't have any big capital investments. They're not doing anything that takes a lot of R&D. The thing runs on Amazon AWS. They've been operating for years and should be profitable by now. Yes, they're growing fast, but the costs don't rise in advance of the growth. You don't have to prepay Amazon for AWS.

"Each share of Class A common stock is entitled to one vote. Each share of Class B common stock is entitled to 10 votes and is convertible at any time into one share of Class A common stock."

So the public stockholders have no power. The insiders can't be fired. Google and Facebook did that, but they were big successes before the IPO. It's unusual to try to pull that off when you're unprofitable. The NYSE, on which they want to list, didn't allow multiple classes of stock until 1986.

WhatsApp is only 15% of their revenue, so that's not a big problem.

Twilio's big thing is telephony integration. They have a SS7 gateway and can integrate Internet and telephony. If Amazon or Google offered that, Twilio would have a big problem. Google has Google Voice and Google Hangouts, but doesn't offer telephony integration via a usable API. Yet.

This IPO is an exit for their VCs. They were all the way up to a series E round, and since they grew fast by losing money, the early investors had to pour in a lot of cash.

2
markolschesky 2 days ago 4 replies      
Had no idea that WhatsApp was even a Twilio customer let alone one of its largest.

>We currently generate significant revenue from WhatsApp and the loss of WhatsApp could harm our business, results of operations and financial condition.

>In 2013, 2014 and 2015 and the three months ended March 31, 2016, WhatsApp accounted for 11%, 13%, 17% and 15% of our revenue, respectively. WhatsApp uses our Programmable Voice products and Programmable Messaging products in its applications to verify new and existing users on its service. We have seen year-over-year growth in WhatsApp's use of our products since 2013 as its service has expanded and as it has increased the use of our products within its applications.

>Our Variable Customer Accounts, including WhatsApp, do not have long-term contracts with us and may reduce or fully terminate their usage of our products at any time without penalty or termination charges. In addition, the usage of our products by WhatsApp and other Variable Customer Accounts may change significantly between periods.

3
calcsam 2 days ago 2 replies      
Revenue is great; sales & marketing costs are fine; the competitive environment is great; the main concern here is the cost of ongoing service.

Revenue: Twilio made $166M in 2015. From Q1 2015 to Q1 2016, they grew 80% -- so we can project 2016 revenue of around $300M. At that pace, they'll hit ~$1B in 2018 or 2019.

Landscape: They have very few competitors, in contrast to other high-profile enterprise startups like Box.

Cost of revenue: Their cost of revenue -- servers, telecom bandwidth, customer support -- is ~45% of revenue. Typical SaaS startups run around 20-30%. I suppose this is the danger of being in the telecom space -- you do have high data costs.

Sales & marketing: Coming in at ~$50M, or ~30% of revenue is quite reasonable. Box raised concerns a couple of years back when S&M were 125% of revenue; they were able to get it down to 65% or so and then they IPO-ed. 30% is fine.
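The revenue projection above is simple compound growth; as a sanity check, a throwaway sketch (the $166M base and 80% rate come from the comment; holding the rate flat is my simplification):

```rust
// Compound-growth sanity check for the revenue projection.
// $166M 2015 base and 80% YoY growth are from the comment above;
// holding the growth rate flat is a simplifying assumption.
fn project(base_millions: f64, growth: f64, years: u32) -> f64 {
    base_millions * (1.0 + growth).powi(years as i32)
}

fn main() {
    for years in 1..5 {
        println!("{}: ~${:.0}M", 2015 + years, project(166.0, 0.80, years));
    }
}
```

At a flat 80%, the $1B mark falls between 2018 and 2019, matching the comment's estimate; any deceleration in growth pushes it later.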

4
breaker05 2 days ago 22 replies      
Good for them! I love their service and use it on GoatAttack.com
5
stanmancan 2 days ago 8 replies      
This is the first time I've ever looked through an S-1 before, but in the Risks section they say:

 We have a history of losses and we are uncertain about our future profitability
Is it normal to go public while uncertain whether you'll ever be profitable?

6
rottencupcakes 2 days ago 0 replies      
Looking at their escalating losses, I have to wonder if this IPO is a desperation play after failing to raise private money at an acceptable valuation in the current climate.
7
liquidise 2 days ago 6 replies      
I continue to be surprised by the trend of companies that have fallen short of profitability filing for IPOs. So often I find myself asking "is this the best for the company and its employees, or is it best for the investors who want a faster return, at the expense of the company and its employees?"
8
andyfleming 2 days ago 1 reply      
I'm surprised all of these comments are so negative. Yeah, maybe there are some things that don't quite add up yet, but they have great developer engagement, solid services, and are actually innovating in the space. Twilio is a winner. When or how much is another question.
9
karangoeluw 2 days ago 1 reply      
I love Twilio, but I'm a little surprised they lost $35M on $166M rev in 2015. I thought they were very close to profitability.
10
hamana 2 days ago 6 replies      

 > Jeff Lawson(1) --- 8,623,617 --- 11.9%
How pathetic is this? Around 90% of the company is taken by the vulture capitalists and you, as a founder, only get to keep 12%. Bill Gates at the time of Microsoft's IPO had around 50% of the company.
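Whatever one makes of the split, the mechanics are ordinary compounding dilution; a sketch with entirely hypothetical round terms (not Twilio's actual cap table):

```rust
// Compounding dilution: each round sells a fraction of the company,
// shrinking every existing holder proportionally. All figures hypothetical.
fn diluted(start: f64, sold_per_round: f64, rounds: u32) -> f64 {
    start * (1.0 - sold_per_round).powi(rounds as i32)
}

fn main() {
    // e.g. 50% after co-founders and an option pool, then ~25% of the
    // company sold in each of five rounds (Series A through E)
    let stake = diluted(0.50, 0.25, 5);
    println!("founder stake after five rounds: ~{:.1}%", stake * 100.0);
}
```

At those made-up terms, five rounds take a 50% stake to roughly 12%, so a low-teens founder stake after a Series E is what the compounding alone produces.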

11
nickbaum 2 days ago 1 reply      
It's hard to overstate how much easier the Twilio API has made it for developers to interact with SMS and phones.

When I first built StoryWorth, it only worked over email because I thought voice recording would be too complex (both to use and to implement).

However, users kept asking for it so I finally bit the bullet... and it was way easier than I expected. Using the Twilio API, I had voice recordings over the phone working within days.

The team is also super friendly. Less than a month after our launch, someone reached out to me and they wrote about our company on their blog:

https://www.twilio.com/blog/2013/05/old-stories-new-tech-sto...

Really glad to see their continued success!
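For readers who haven't used it: Twilio drives a call by fetching an XML document (TwiML) from your server, and recording a caller is a couple of verbs. A minimal sketch of such a response as a plain string (the prompt text and the `maxLength` value here are arbitrary; see Twilio's TwiML docs for the full attribute set):

```rust
// Build the TwiML document Twilio fetches when a call connects:
// speak a prompt, then record whatever the caller says.
fn record_twiml(prompt: &str) -> String {
    format!(
        "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Response>\n  <Say>{}</Say>\n  <Record maxLength=\"120\"/>\n</Response>",
        prompt
    )
}

fn main() {
    println!("{}", record_twiml("Please tell your story after the beep."));
}
```

Your web server returns this from the URL configured on the Twilio number; Twilio then POSTs you a link to the finished recording.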

12
anotherhacker 2 days ago 0 replies      
"Disrupt" appears 18 times in this filing. Is that code for "one day we'll be profitable, I promise"?
13
firebones 19 hours ago 0 replies      
Ctrl+F patio11. What? There's no patio11 comment here. Surprisingly "quiet".

https://en.wikipedia.org/wiki/Initial_public_offering#Quiet_... ???)

14
tqi 2 days ago 0 replies      
Does anyone have recommendations for resources/guides to S-1s (ie what to look for, which sections are usually boilerplate, what is normal/abnormal, etc)?
15
tschellenbach 2 days ago 0 replies      
We use Twilio for our phone support over at getstream.io; it was really easy to set up. Makes it fun to build this type of stuff :) Congrats to their team!
16
m1117 1 day ago 0 replies      
Ha! I was able to accidentally guess 3 weeks before the IPO https://twitter.com/MikeTweetFeed/status/725077010796539904
17
shaqbert 2 days ago 0 replies      
Interesting that they are "selling" base revenues. After all, total revenues in 2015 were $167m, and base revenues only $137m.

The definition of the $30m "missing" revenues seems to indicate that this piece of the business might churn at any moment, or is just a brief "burst" of revenues w/o the transactional nature of SaaS.

Depending on the lumpiness of these bursts, it is a smart decision to "ring-fence" them in the reporting of an otherwise sound recurring business. Guess this is a good CFO here...

18
joshhart 2 days ago 1 reply      
Not even cash flow positive. Stay away.
19
patrickg_zill 2 days ago 1 reply      
I am skeptical, a bit, of their long-term ability to fight off margin compression.

What do people use them for?

If minutes of calling, inbound or outbound, that is trivial in terms of switching costs.

If SMS, there are plenty of competitors, including teli.net (disclaimer: I know people who work there); I haven't directly compared pricing, however.

The real question to ask is how much lockin they have managed to generate. Without lockin they will eventually suffer margin compression.

20
josefdlange 2 days ago 2 replies      
As a naive nincompoop, is there any way for me to guess the initial price of a share when they become available?
21
cissou 2 days ago 2 replies      
If someone with better knowledge of S-1s could shed light on this: where can we see the option pool / option grants awarded to employees? Must be somewhere in there, no?
22
tyingq 1 day ago 1 reply      
I'm curious what protects Twilio from a race to the bottom in pricing from competitors. The barrier to entry here seems fairly low. There's Plivo, Nexmo, Tropo, Sinch, and probably others. Some of them appear to have reached near feature parity with relatively low spend.
23
joshuakarjala 1 day ago 0 replies      
Still can't get an inbound SMS number in Denmark. Damn you TC gods
24
cmcginty 2 days ago 1 reply      
Did they actually set the expected value of the stock to $.001/share?
25
JackPoach 1 day ago 0 replies      
$40 million is a lot of cash to bleed.
26
ramoq 1 day ago 1 reply      
Authy (YC) acquisition was a bust.
27
krschultz 2 days ago 0 replies      
Stop spamming for Paysa. 100% of your comments are blatant advertisements.
28
dang 2 days ago 3 replies      
We detached this subthread from https://news.ycombinator.com/item?id=11780119 and marked it off-topic.
9
Announcing Rust 1.9 rust-lang.org
345 points by steveklabnik  2 days ago   61 comments top 10
1
asp2insp 2 days ago 1 reply      
The time complexity of comparing variables for equivalence during type unification is reduced from O(n!) to O(n). As a result, some programming patterns compile much, much more quickly.

I love this. Not just my code compiling more quickly, but the underlying implementation is super interesting.

2
jeffdavis 2 days ago 2 replies      
Excited about unwinding becoming stable. I am hacking on postgres-extension.rs, which allows writing postgres extensions in rust. This will mean that postgres could call into rust code, then rust could call back into postgres code, and that postgres code could throw an error, and rust could safely unwind and re-throw. Cool!!
3
shmerl 2 days ago 0 replies      
Progress on specialization is good.

> Altogether, this relatively small extension to the trait system yields benefits for performance and code reuse, and it lays the groundwork for an "efficient inheritance" scheme that is largely based on the trait system

4
bryanray 2 days ago 0 replies      
Looks like a great release. Controlled unwinding looks very interesting. #GreatJobRustTeam
5
MichaelGG 2 days ago 1 reply      
I don't understand the announcement on panics. Hasn't it always been the case that thread boundaries (via spawn) could contain panics?

It also used to have the incorrect default that such panics were silently ignored. .NET made this same mistake: background threads could silently die. They reversed it and made a breaking change so any uncaught exception kills the process. I'd imagine Rust will do so by encouraging a different API if they haven't already. (I opened an RFC on this last year, but I didn't understand enough Rust and made a mess of it. Still, it was a great experience, speaking to the very kind and professional community; in particular, several people were patient, but firm, in explaining my misunderstandings.)

6
spion 2 days ago 4 replies      
I don't know how this unexpected vs expected errors philosophy gets propagated, but to me it always looked suspicious. Take array bounds for example: what if you have an API that lets users send a list of image transformations, and the user requests a face-detect crop, followed by a fixed crop with a given x, y, w, h.

Clearly your code can get out of (2D) array bounds with the fixed crop (if the image is such that the face-detect crop ends up small enough). Suddenly the thing that was "unexpected" at the array level becomes very much expected at the higher API level.

So the API provider can't decide whether an error is expected or not. Only the API consumer can do that. Applying this further, a mid-level consumer cannot (always) make the decision either. Which is why exceptions work the way they do: bubble until a consumer makes a decision, otherwise consider the whole error unexpected. Mid-level consumers should use finally (or even better, defer!) to clean up any potentially bad state.

I think Swift got this right. What you care about isn't what exactly was thrown (checked, typed exceptions); the critical thing is whether a call throws or not. This informs us whether to use finally/defer to clean up. The rest of error handling is easy: we handle errors we expect, we fix errors we forgot to expect, but either way we don't crash just to clean up, because finally/defer/destructors take care of that.
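The crop example above, sketched in Rust terms (hypothetical function names; the point is that the bounds condition surfaces as an ordinary `Result` and only the consumer decides whether it was "expected"):

```rust
// The fixed crop can exceed the image bounds if an earlier face-detect
// crop shrank the image: "expected" at this API level, so it is a
// Result rather than a panic.
fn fixed_crop(width: u32, x: u32, w: u32) -> Result<(u32, u32), String> {
    if x + w > width {
        Err(format!("crop {}+{} exceeds image width {}", x, w, width))
    } else {
        Ok((x, w))
    }
}

fn main() {
    // the consumer, not the array, decides how to treat the failure
    match fixed_crop(250, 100, 200) {
        Ok((x, w)) => println!("cropping at x={} width={}", x, w),
        Err(e) => println!("handled expected failure: {}", e),
    }
}
```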

7
JoshTriplett 2 days ago 1 reply      
I'm really looking forward to the next stable version after this, which will hopefully stabilize the new '?' syntax for 'try!'.
8
wyldfire 2 days ago 0 replies      
Unexpected problems are bugs: they arise due to a contract or assertion being violated.

Speaking of which, DBC [1] would be an awesome feature for consideration. It's one of relatively few areas where D is superior to Rust IMO.

[1] https://github.com/rust-lang/rfcs/issues/1077

9
mtgx 2 days ago 2 replies      
Wow, almost 2.0 already. Any major features reserved for 2.0 or will it be just another typical release like 1.8 or 1.9, more or less?
10
Animats 2 days ago 5 replies      
catch_unwind

Exceptions, at last! Not very good exceptions, though. About at the level of Go's "recover()". If this is done right, so that locks unlock, destructors run, and reference counts are updated, it's all the complexity of exceptions for some of the benefits.

I'd rather have real exceptions than sort-of exceptions.
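For reference, the API stabilized in 1.9 is roughly this shape (a minimal sketch; indexing out of bounds is just a convenient way to trigger a panic):

```rust
use std::panic;

// catch_unwind stops an unwinding panic at this boundary and hands back
// a Result; destructors for values owned by the closure still run.
fn try_index(data: Vec<i32>, i: usize) -> Result<i32, ()> {
    panic::catch_unwind(move || data[i]).map_err(|_| ())
}

fn main() {
    assert_eq!(try_index(vec![1, 2, 3], 1), Ok(2));
    assert!(try_index(vec![1, 2, 3], 10).is_err()); // panic caught, process lives
    println!("both calls returned");
}
```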

10
Housing in the Bay Area samaltman.com
403 points by dwaxe  3 days ago   388 comments top 40
1
akanet 3 days ago 2 replies      
People should actually call in! Over at the SFBARF, we have a thread of people reporting on how long it took to call their representatives: https://groups.google.com/forum/#!topic/sfbarentersfed/7G4Kr....

I managed to talk to someone at both of my representatives' offices within a minute, without any sort of delay. It's very easy, please do call in if you support this measure!

Also, when you do call in, post here afterwards with how long it took you. A lot of people don't bother because they think it'll be a hassle, but it's about 100 times easier than calling to book a restaurant reservation.

2
s3r3nity 3 days ago 5 replies      
This is one of the top factors preventing me from moving back to the Bay Area. It's good to know there's at least SOME serious attempt at a solution (even if folks around here are mostly pessimistic.)

It still surprises me that an industry that's all about connecting the world through the internet and allowing anyone to work with anyone else on anything says "yeah but you need to work in this small area of the country to do it."

The armchair economist in me says that high housing prices in the Bay Area will cause a market "correction" of sorts, with other cities exploding as tech hubs -- see NY, Boston, Chicago, Austin, Las Vegas -- as those cities can offer much better housing for cheap, even if the opportunities aren't as plentiful (yet).

<conspiracy theory> The cynic in me likes to think that VC's and other wealthy tech execs are trying to keep housing prices high by artificially creating this in-elasticity ("you can have my funding if you live in SF" or "we don't allow remote workers"), as they all probably own houses in the area; the incentives for them would be to have housing prices keep rising, NOT fall, as they would lose significant real-estate equity. </conspiracy theory>

3
jevinskie 3 days ago 16 replies      
There is so much pressure, career-wise, to move to either the Bay Area, the West Coast in general, or a major urban hub.

I honestly like living in a semi-rural part of the Midwest. I like seeing fields and forests while driving to work. If I want to get out to some wilderness, I can do so in 15-30 minutes. I travel in a car that, I feel, offers me freedom. To boot, I have a nice, reserved spot in the garage below my apartment. I live downtown and I still feel great freedom/"space". I can't imagine being able to do anything like this in a place like SF or even Chicago.

My cost of living is very cheap. My commute is 15 minutes. I live close to my family. Why should I want to leave?

4
abalone 3 days ago 9 replies      
This is a contentious issue and in case you are not from SF and wondering why Y Combinator is taking a stand on it, it has to do with the politics of the tech boom.

The plain truth is that the housing crisis is a combination of too much sudden demand fueled by the tech boom and too little supply. The tech community likes to deflect criticism of their sudden influx of relatively high-income workers and say "don't blame us, it's a supply problem."

This bill focuses mainly on juicing supply by removing restrictions on builders. But part of that is weakening the affordable housing restrictions on them.[1] The problem with that is, increasing market-rate housing alone may not make an appreciable difference in prices where it matters most from a macroeconomic perspective: for low-to-mid income residents.

Cities need teachers, service industry workers, artists, etc. Already the restaurant industry is facing a major worker shortage.[2] Let's not even get into the cultural effects of long-term residents getting displaced en masse. So if market-rate supply alone can't solve this, then you need affordable housing regulations to set aside spots.

Sam has offered a rather simplistic statement on the law of supply and demand on this. But that masks the real question: whether weakening affordable housing demands as this bill does is a smart move. Will juicing mostly market-rate housing be enough?

[1] http://48hills.org/2016/05/22/a-terrible-housing-bill-looms-...

[2] http://www.bizjournals.com/sanfrancisco/print-edition/2016/0...

5
sytelus 3 days ago 9 replies      
I highly doubt this bill would pass without severe modifications. The problem is that we have two groups in the population: one that owns houses and one that doesn't. The first group doesn't want new houses diluting the value of their property, and politicians know very well that making this group unhappy will end their careers. The second group is often in the minority outside large cities. This is why it's very hard to pass laws that would allow building new houses. It's great that SamA is taking an interest in this. If enough valley billionaires can pay for lobbying, then politicians can finally ignore their majority constituents and do the "right thing" :).
6
simonebrunozzi 2 days ago 1 reply      
This is one of the worst pieces of advice, IMHO. Here's why.

1) You can't add housing without adding proper infrastructure (transportation, sewage, power, etc). Adding infrastructure is expensive and takes a very long time. There will be no budget for proper infrastructure increase if the population keeps rising - remember that hundreds of thousands of people work for companies that offshore their taxes.

2) Most people need financial advice on what real estate investment means. If there's another bubble burst, a lot of average income people will have their life ruined. They would have bought an apartment for, say, $1M, with 200k down, only to see its value decrease by 30-35%, essentially wiping out their savings and forcing them into unsustainable debt.

3) I wish there was an easy way to fix the housing problem in SF. There isn't. Let's be realistic about it.

Please don't downvote me just because you disagree with me. Instead, tell me why.

7
stale2002 3 days ago 1 reply      
If you are passionate about the housing crisis in the Bay Area, you should check out the San Francisco Bay Area Renters' Federation.

They are one of the groups doing the most activism on this issue.

8
carapace 3 days ago 29 replies      
I'm in favor of how "building in the Bay Area is approved by discretion"; I call it having some power over what gets built. "Democracy," y'know?

I'm from SF. I grew up here. I'm also a computer nerd.

To me, the tech "industry" has been ruining the city I love since the Dot-Com boom. To me, they are an invading army culturally speaking. In a very real sense my home is being destroyed.

The saddest part is, this city kinda sucks. It's not even a good place to locate a business. The city gov is soft-corrupt. The weather NEVER gets better. We're on a damned peninsula. If I were starting a company I'd go to Davis CA! People aren't coming here because it makes sense. This is just where the game is being played. They are drawn here like aspiring actors are drawn to Hollywood, but the prospects are just as glamorous and illusory. A few will strike gold, the rest will toil and vanish.

This is a town for freaks and weirdos. If you don't believe me, visit Civic Center. ;-) It is said "The people who are too strange for the rest of the country move to California, and the people too weird for California move to San Francisco." (If you are still too weird you go to Berkeley.)

What concerns me the most is that the newcomers might lack the environmental commitment, and tip the scales from valuing conservation to valuing rampant development. San Francisco is the largest metropolitan area with the most wilderness/open space around it in the world. The last thing we need is more lux condos and freeways.

I'm going to be calling those numbers, you can bet, but it will be to urge "con" on this abrogation of the weirdo-freaks' ability to stonewall development. That ability is a good thing.

Y'all "young gods" will just have to figure out some other place to hatch the singularity. Move away. Build a floating "Seastead". Just please stop trying to cram a million more people into the Bay Area. It's not actually a good idea!

Also, I hear Portland is nice.

9
Tiktaalik 3 days ago 4 replies      
Even if this bill passes and substantially more multi-family housing starts getting built, do not be surprised if house prices remain extremely high.

Vancouver doesn't have the same anti-development issues that SF has, and has been building multi-unit condos and apartments all over the region for decades. Multi-unit housing starts are at about their highest level ever. The average price of a detached house, however, is C$1.3 million (~US$998k thanks to the weak Canadian dollar) and rising. Low interest rates, "fear of missing out," incredible amounts of real estate speculation and a dash of foreign investment can go a long way.

It's likely that the strongest benefits to increased supply in SF will manifest itself in lower rents and/or lower rate of rent increases.

10
brudgers 2 days ago 0 replies      
I suspect this might have an impact. The timeline though will be ten to twenty years, optimistically.

1. Eligible property will have to change hands and land in the right hands to make the projects happen. That's the nature of redevelopment, someone has to be willing to knock down a profitable building in hopes of building something more profitable.

2. The rate at which property will change hands will correlate to the legal certainty it can be redeveloped. Until new legislation successfully survives court challenges, the uncertainty will swirl around its "by right" provisions.

3. The trades base will affect the cost of construction and hence the viability of projects. Imagine there is some number /n/ of plumbers in the Bay area. At some threshold amount of construction, these plumbers are all in demand, and so the price of plumbing rises. This happens across several trades, and formerly viable pro formas pencil out poorly with the new cost structure.

4. People sitting on the land now are already in a wait and see mode. As land becomes more desirable to developers, prices go up. Rising prices attract speculators hoping to sell on. The rate at which developable land flows to competent developers remains moderate.

5. Five years from identification of a feasible redevelopment site for multifamily housing to full occupancy would be pretty quick even in a relatively development friendly culture. Research, options, due diligence, financing, entitlement, design, bidding, construction and leasing...any one [or several] can run out over a year.

Throw in the normal rate of economic and financial cycling, and the sorts of effects experience indicates these have on the construction and housing sectors, and even in the next ten years there will likely be a period where half of all projects in the pipeline die.

There's no silver bullet.

11
sakopov 3 days ago 0 replies      
Isn't housing just a painful side effect of an astronomical influx of people moving to one city? IMHO, the problem is the tech industry being too centralized.
12
jeffdavis 2 days ago 2 replies      
I've wondered for a while whether housing exclusivity can improve economies in some respects. Do the high housing costs create a strong selection bias to bring in only the most determined to take part in the SV culture?

Obviously, lower housing costs would cause a higher absolute density of people. But maybe the presence of people with other unrelated interests would somehow dampen the tech economy, even if the absolute density of tech people remained the same.

Let me be clear that I support more residential construction on the peninsula, so I am not saying this as part of a weird socio-political agenda. It's purely out of curiosity.

13
joe_the_user 3 days ago 0 replies      
"His bill would make it so multi-family buildings are automatically approved by right as long as they comply with local zoning, and have 5-20% affordable units--the percentage depending on location and subsidy offered."

The problem here is that this leaves height limits in place and could create a rush to impose even more stringent zoning restrictions.

What would be necessary is legislation that pre-empts both local zoning and other mechanisms for a relatively small area, allowing the construction of a few high rises without the destruction of existing housing stock.

The ideal of Nimbys seems to be no development and the ideal of developers seems to be building a limited number of ultra-high-priced units in the super desirable places like the Mission. The sane approach would be to allow a series of high rises in Potrero Hill and along the Bay.

This approach sounds like it would simply push-through development according to the developer ideal, which would serve the Nimbys right but also destroy whatever remains of the character of the mission.

14
JimboOmega 3 days ago 0 replies      
I tried to read the bill, but don't know how the old text read, and am far from a development lawyer.

Will this have the impact Mr. Altman suggests it will? Specifically, significantly more development of multi-unit dwellings?

15
pfarnsworth 2 days ago 1 reply      
Companies that extract advertising and other revenue from around the world, like Google, Facebook, LinkedIn, etc, should create job centers outside of the Bay Area.

The only way house prices will moderate is if these globally significant companies spread their distribution of money around the country/world. There's too much concentration of money flowing into the Bay Area in terms of revenues, and that is creating the hyper-inflation we see here.

My house value went up by 60% in 2 years. I can sell it for almost $2 million if I wanted to, which I don't. I keep thinking it's not sustainable, until I hear the offers that people get from these companies. Google offered one of my friends $160k + $400k in RSUs, and Facebook offered another $180k and $500k in RSUs. That's well over $1M over 4 years for both. Of course salaries like this will cause hyper-inflation in this area.

16
1945 2 days ago 1 reply      
Can we scale the schools, transportation, etc. to support this legislation? If so, I'm all for it. But the way things stand currently, it's already pushed to the limits.
17
epicureanideal 3 days ago 2 replies      
This is good, except that I oppose subsidies for "below market rate housing". If workers are needed in an area, businesses will eventually need to pay them more. Subsidizing their housing is a backdoor subsidy to the business owners who don't need to pay their workers enough to deal with living in the Bay Area.
18
rajacombinator 2 days ago 2 replies      
A benevolent dictator could bulldoze easily upwards of 50% of housing in areas like the sunset, entire peninsula, and Oakland, and replace it with high quality, high density condos that would increase quality of living by at least 10x, while reducing costs by 50%. But, alas, we got kleptocrats.
19
pjc50 2 days ago 0 replies      
The most likely "correction" of this is going to be the next big earthquake. :(
20
godzillabrennus 3 days ago 0 replies      
I'd be happy to extend a beta license for a brand new grassroots lobbying platform to the folks at YC for them to be more efficient and effective while managing this effort.

Heck, if anyone on HN is part of any lobbying effort in California and would like a beta experience of a new platform email me lane (at) fastAdvocate (dot com)

Warning you in advance, the website up today is horrible and not representative of the beta product. The homepage is about to be replaced. We focused on the product first.

21
honkhonkpants 3 days ago 4 replies      
When I look at this bill I have to wonder if it will really help. Outside of the city of San Francisco itself, almost nothing is being built, because the construction costs are too high. Not the permitting and planning costs, but the actual guys-with-hammers-and-saws costs. For example, in Oakland there are a number of properties that are fully entitled to build and have been sitting that way for years. So how does Brown's bill help?
22
spoonie 2 days ago 0 replies      
It's probably a marginal issue, but would decreased living costs reduce salaries? Presumably if living costs were lower, companies could pay less while still offering a higher net income than other areas. I guess what I'm asking is: how connected are the high salaries in the Bay Area with the cost of living there?
23
eru 2 days ago 0 replies      
The long-term solution was proposed more than a century ago, ironically enough, by an economist from San Francisco.

http://qz.com/169767/the-century-old-solution-to-end-san-fra...

24
youngButEager 2 days ago 3 replies      
Some landlords (such as our firm) are liquidating our rental assets due to the increasing risk of 1979-1995 properties being pulled into rent control regimes. Cities like San Jose, Richmond, and San Mateo just this year either debated increasing or actually increased control over rental properties. The population is rising; the supply of rental units is too low; cities want to stop landlords from raising prices; so they are just this year increasing rent control (San Jose just did this).

Anything built before 1995 (that's a California State statute) can be (will be) rent controlled (unless population drops dramatically in cities near the coast).

The State passed a law (the "Costa-Hawkins Rental Housing Act") that prevents California cities from pulling in any property built after 1995 into a rent control regime. But Costa-Hawkins does not protect rental properties built between 1979-1995.

Right now, properties built before 1979 are rent-controlled in California cities (San Francisco, San Jose, etc.)

But at the drop of a gavel, a municipality can pull all properties built between 1979 and 1995 into rent control -- Costa-Hawkins is the only thing stopping municipalities at the year 1995 (they'd like to be able to rent control every property necessary to accommodate their increasing population over time).

The citizens of a municipality in Southern California -- Santa Monica I believe -- narrowly defeated a rent control change designed to extend rent control to apartments built up to 1995.

That does not help supply. Jerry Brown trying to 'fast track' development does not remove the risk that newer properties will be rent controlled in the future.

A new developer sees partial confiscation of their asset as a possibility, because the California legislature could modify Costa-Hawkins and remove 1995 as the limit year for rent control.

Once a city/state has demonstrated a commitment to seizure of:

- a restaurant's right to increase prices
- a dentist's right to increase prices
- a doctor's right to increase prices
- a private school's right to increase prices
- a rental property's right to increase prices

Those restrictions lead to:
- fewer restaurants
- fewer dentists
- fewer doctors
- fewer private schools
- fewer apartments

Businesses must have freedom to operate or no one will start businesses (supply of businesses will drop).

Once a city/state has demonstrated a commitment to partial seizure of the personal assets (restaurant, dental practice, apartment property, etc.) of a person or a business, there is a very strong likelihood that the 'seizure/state control' tendency will never stop.

So the governor of California can wave his hands all he wants with this kind of "fast track" proposal for building real estate assets in California.

Until the state makes it clear that they will stop the partial or "legislative seizure" of rental properties, very few developers are stupid enough to risk a change/nullification of Costa-Hawkins.

25
WWKong 2 days ago 1 reply      
How does this benefit/harm the folks who already recently bought a house at an exorbitant price, burning all their savings, calculating their returns based on how Bay Area prices have behaved over the last couple of decades? Will they be screwed?
26
puppetmaster3 2 days ago 0 replies      
Does anyone see a problem with "legislation that would allow a lot more housing to be built"?

I stopped reading after that, as I was satisfied as to what the problem and solution is.

27
johncbogil 3 days ago 0 replies      
I made an app that uses your GPS location to show you who your elected representatives are, and lets you call, tweet, or email them.

TryVoices.com (iOS + Android)

28
blazespin 2 days ago 0 replies      
Great, I'm sure the 101 will just be a blast to drive now! Maybe they should solve the transportation gridlock first...
29
mickrussom 3 days ago 0 replies      
Too little too late. Middle class is gone.
30
tryitnow 2 days ago 0 replies      
I love this, but why does he list just those specific members?
31
foobarqux 3 days ago 3 replies      
Apparently not all economists agree that increasing housing supply would make rents cheaper.
32
tomjacobs 3 days ago 0 replies      
12,000 people moved to San Francisco in 2014. 4,000 homes were built. What's the problem?

https://medium.com/@TomPJacobs/what-housing-crisis-3c0568a5d...

33
powera 3 days ago 2 replies      
"The bill is currently being debated in California's Congress as part of the upcoming annual budget" - it's called the Legislature.

I mean, yes, I knew what he meant, but it doesn't give me confidence that he understands how to influence the political system.

34
sseal 2 days ago 1 reply      
And now HN has started to push a political agenda...
35
Pica_soO 2 days ago 0 replies      
This being California- when can we expect housing on the ocean floor of the bay?
36
weisser 3 days ago 4 replies      
Cannot believe I'm posting here but relevant:

I have a room in my house in the Sunset opening next month. $900. It's month-to-month rent. Looking to house a founder/hacker/designer/etc currently struggling with SF rent (or someone who wants to move to SF but didn't think they could afford to live in SF)

My email is in my bio.

37
aaronbrethorst 3 days ago 0 replies      
typo: "legislative aides," not "legislative aids"
38
dang 3 days ago 0 replies      
That's not nice and quite false.

You've been posting many unsubstantive comments to Hacker News. Please stop.

39
iamleppert 3 days ago 2 replies      
I'm currently renting out my upstairs walk-in closet. I live in Potrero, near 26th & Rhode Island. The closet is pretty big and can easily fit a double bed and dresser. $900 + utilities, but we might be able to work something out.

edit: this is a serious post

40
branchless 3 days ago 2 replies      
Won't work. Imagine supply is met, housing is cheap. People then want to live there more. Eventually the point of equilibrium is restored which is that banks keep on creating fiat money until land prices reach a point where you can just about eat and afford the interest repayments.

Land value tax is the answer for all those who don't simply want a quick in so they can get on the gravy train and actually want their kids to grow up in a society where wealth creation, not money creation, makes you richer.

11
FPGA Webserver github.com
303 points by luu  2 days ago   96 comments top 16
1
emmelaich 2 days ago 3 replies      
I think Clash (http://www.clash-lang.org/) would help a lot with this sort of effort.

"CaSH (pronounced clash) is a functional hardware description language that borrows both its syntax and semantics from the functional programming language Haskell. It provides a familiar structural design approach to both combinational and synchronous sequential circuits. The CaSH compiler transforms these high-level descriptions to low-level synthesizable VHDL, Verilog, or SystemVerilog."

2
Rubu 2 days ago 0 replies      
Cool stuff. There's an open-source TCP stack available in Verilog on OpenCores[0], but that is actually C code compiled to Verilog using Chips[1]. Had no idea this kind of stuff existed, but then again, it has been a while since I've done anything in VHDL.

[0]: http://opencores.org/project,tcp_socket [1]: https://github.com/dawsonjon/Chips-2.0

3
jdmoreira 2 days ago 2 replies      
Just implementing a TCP/IP stack in hardware is insane! Does this even exist?

I'm sure there are some ICs which give you TCP/IP over serial or something but they are not implemented at a gate level, they are probably just an MCU running code.

4
ohazi 2 days ago 1 reply      
Nice to see someone still using VHDL... I feel like I blinked and everyone switched to Verilog.
5
epynonymous 2 days ago 8 replies      
A little bit off topic, so I apologize in advance, but I think this is the right audience for my question: what is the smallest device that can run, say, a Golang app with net/http reasonably fast? I'm thinking of a fairly low-powered micro device, smaller than a Raspberry Pi (say, USD-quarter sized), but enough to run a REST API or serve some static pages. Typically I'd have to buy a desktop, server, or laptop to host such a thing, which seems like overkill and power hungry; that makes sense if you need to run a myriad of processes or containers (load balanced over nginx), but I just need a simple REST API in a single process. And I'm thinking of embedding these into everything (dishwashers, TVs, stereo receivers, LED lightbulbs). Perhaps what I'm describing is sort of like an FPGA, but higher level: something that can run an ELF binary. I don't want to have to write VHDL; the converters from 3GLs to VHDL/Verilog sound cool, but what are the costs for FPGAs? I also wouldn't need the device to have bash access, just something dumbed down enough to copy a binary over and run/stop it (like Docker). I guess what I'm seeking is something like an FPGA for embedding in devices, one that can run higher-level code, with WiFi.
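For scale, the kind of single-endpoint REST service this comment imagines fits in a few dozen lines in any high-level language. Here is a minimal sketch in Python (the commenter asks about Go, but the shape is identical); the `/status` route and its JSON payload are made up for the example:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


class StatusHandler(BaseHTTPRequestHandler):
    """Hypothetical single-endpoint REST API, the sort of thing the
    comment imagines embedding in a dishwasher or lightbulb."""

    def do_GET(self):
        if self.path == "/status":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        # Silence per-request logging for the demo.
        pass


def serve_once():
    # Bind to an ephemeral port so the demo is self-contained; a real
    # device would listen on a fixed port.
    server = ThreadingHTTPServer(("127.0.0.1", 0), StatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    # Exercise the endpoint once from the same process.
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/status") as resp:
        data = json.load(resp)
    server.shutdown()
    return data


if __name__ == "__main__":
    print(serve_once())  # -> {'status': 'ok'}
```

Anything that can run an ELF binary with a TCP/IP stack (even a sub-$10 WiFi microcontroller board) can serve this workload; the FPGA only becomes interesting for line-rate throughput, not for a single low-traffic endpoint.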
6
nickpsecurity 2 days ago 0 replies      
Should be straightforward. I know, famous last words for people starting on projects. Yet a web server is a straightforward piece of software if you're not trying to make a production system with widespread adoption. For a static web server, here's what it does:

1. Parse an HTTP request into a simple, internal form.

2. Convert the identifier into location in memory with the page.

3. Convert data at that location into outgoing packets.

4. Run those through I/O.

One clever embedded system I saw pre-encoded the HTML pages as TCP packets in memory so it could just send them directly. The HW will obviously need a TCP/IP stack; there are plenty of examples in the academic literature. The whole thing is a pipeline with parts manipulating memory, in/out HTTP processing, in/out TCP processing, in/out IP processing, in/out Ethernet, a memory subsystem for accessing RAM, and some cache thrown in there likely.

That's as far as I got when I thought about it. Looks quite doable given everything up to TCP in that stack has been done already. The rest seems straightforward. Could probably even implement it in a static way amenable to fixed pre-allocation of memory or on-chip resources.
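The four steps above can be sketched in software terms as a toy Python snippet (the in-memory page table and all names are invented for illustration; the actual design would be gate-level logic, with steps 3-4 split across the TCP/IP pipeline rather than collapsed into one response buffer):

```python
# Step 2's "location in memory with the page" modeled as a dict;
# steps 3-4 collapsed into building the raw outgoing HTTP bytes.

PAGES = {  # hypothetical pre-loaded static pages
    "/": b"<html><body>hello</body></html>",
}


def parse_request(raw: bytes):
    """Step 1: reduce an HTTP request to a simple internal form."""
    request_line = raw.split(b"\r\n", 1)[0].decode()
    method, path, _version = request_line.split(" ")
    return method, path


def build_response(path: str) -> bytes:
    """Steps 2-3: look the page up and encode it as outgoing bytes."""
    body = PAGES.get(path)
    if body is None:
        return b"HTTP/1.0 404 Not Found\r\n\r\n"
    header = (
        "HTTP/1.0 200 OK\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Content-Type: text/html\r\n\r\n"
    ).encode()
    return header + body


if __name__ == "__main__":
    method, path = parse_request(b"GET / HTTP/1.0\r\nHost: x\r\n\r\n")
    print(method, path)  # GET /
    print(build_response(path))
```

Because the page table is fixed at build time, every buffer size here is known in advance, which is exactly what makes the static pre-allocation mentioned above plausible in hardware.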

7
b1gtuna 1 day ago 0 replies      
Followed his profile and discovered he is also working on an introductory FPGA course - http://hamsterworks.co.nz/mediawiki/index.php/FPGA_course_v2.

I am a firmware engineer (just began 2.5 months ago) in a Xilinx shop. I only had half a lecture on FPGA during my undergrad, so his course will help.

9
pldrnt 2 days ago 0 replies      
Looking forward to where this goes... For work I had to implement ARP, ICMP, UDP, and our protocol on top of UDP, for 10G Ethernet, in an FPGA. It would have been fun to add DHCP and TCP, but the time and priority weren't there.
10
stiller 2 days ago 0 replies      
If he makes it up to the HTTP layer, he can implement a very fast load balancer.
11
madengr 2 days ago 1 reply      
Reminds me of a presentation I saw last year on a mathematically unhackable web server. It was essentially a giant lookup table (no RAM), and making the slightest change required re-synthesis. But it was unhackable.
12
exabrial 2 days ago 3 replies      
This is pretty BA! Definitely falls into the category of "Why? Because I can!", which are some of the coolest projects to watch.

On an unrelated note, has anyone ever tried to implement the JVM's stack machine and memory model in FPGA? It's pretty well specified, might make for an interesting project.

13
iamleppert 2 days ago 4 replies      
Can someone who knows more comment on if this actually makes sense and for what applications and/or constraints? Maybe it makes sense when performance/power/cost is figured in etc? Just curious as to the OP's motives.
14
poorman 2 days ago 2 replies      
If done correctly, you might end up with a very secure webserver in your hands. I imagine it would be tough to use conventional vulnerability penetration techniques on something like this.
15
yuhong 2 days ago 0 replies      
This reminds me of HTTP/0.9. Is there anything other than embedded that even uses it these days?
16
MasterScrat 2 days ago 0 replies      
I'd be very curious to see some benchmarks!
12
Visiting Chelsea Manning in prison mit.edu
339 points by rdl  2 days ago   169 comments top 5
1
hendersoon 2 days ago 3 replies      
I find it very sad that the bulk of comments on this matter are about Manning being trans rather than the injustice of jailing a kid for 30 years for blowing the whistle on war crimes.
2
chinathrow 2 days ago 1 reply      
> She hopes that the world hasn't forgotten about her.

No, we haven't. And we also haven't forgotten about the war crimes she published.

3
arca_vorago 2 days ago 4 replies      
For those people who are curious about the act of disseminating information like Manning did, allow me to summarize.

When you get a clearance, and often even when just working with confidential documents, the DoD has you sign an SF312, which is a classified information non-disclosure.

When you join the military, one of your first acts is to be sworn in, and you swear an oath, "I do solemnly swear (or affirm) that I will support and defend the Constitution of the United States against all enemies foreign and domestic; that I will bear true faith and allegiance to the same. That I will obey the orders of the President of the United States and the orders of the officers appointed over me, according to regulations and the Uniform Code of Military Justice. So help me God."

Now let me say this as clearly as possible:

The Constitutional oath outranks the NDA. Period. If in doubt, the oath wins, every single time.

That being said though, there is more room for nuance in the stories, primarily due to what kind of information is revealed, what its intended purpose was, and who it was distributed to.

In my opinion Snowden was more aware of this than Manning, because he did due diligence to review the documents before sharing them, and then making sure to limit damaging information that was actually vital to actual national security, and was very specific with the organizations he shared the information with about these requirements.

Manning did a huge document dump, which I think was well intentioned and had the right reasons behind it, but he didn't think it through in detail enough or take the time to redact the information.

So even if the constitutional oath outweighs the SF312 NDA, it's still breaking the law, but we need to draw very clear distinctions between breaking an NDA and the too oft-cited charges of "treason", which by definition is aiding and abetting the enemy during times of war.

I would also like to point out the hypocrisy of the establishment when it comes to classified leaking, because people in the White House and on the Hill leak all kinds of classified material whenever it happens to be expedient to them politically. A good example of this is the Cheney leak that outed Valerie Plame. Where are the cries about treason against Cheney? (whom I personally think is demonstratively more of a traitor than any of the aforementioned names...)

To me, one of the primary problems DoD faces these days is that there is little to no punishment for being openly unconstitutional, and people have not only forgotten their oaths and their importance, but have failed to call out others who have acted against theirs. To me, an oath still means something, but I want to know... if I could prove someone broke their oath with knowledge, what is the legal punishment offered?

Whatever it is, that is what we need to be doing against those in positions of power who spend more effort undermining the constitution than defending it. (And personally, I think the true enemies of the constitution wear business suits and ties, not thawbs... looking at you, Wall Street.)

4
thr32989 2 days ago 10 replies      
> I will be Chelsea's first visitor since her sister in November.

> I bring up her recent appeal to reduce her sentence from 35 years to 10 years, and she seems worried that it didn't receive enough coverage in the press. She hopes that the world hasn't forgotten about her.

I am afraid the world has already forgotten.

Google search for "site:news.ycombinator.com Chelsea Manning" for past month gives three results including this one. "Manning" gives more results, but only in reference to Snowden and Panama Papers, no prison.

And transgender rights activists are busier with the right gender pronouns than with him/her.

5
mynameishere 2 days ago 5 replies      
Whatever sympathies you want to arouse for Pvt Manning for political purposes, however wordily and clumsily expressed, try to keep in mind that throughout all of history, in all contexts, regardless of rank, Manning would have been summarily hanged as a spy and/or traitor rather than merely imprisoned. Manning got off nice and easy.

That said, it was a total failure of the system to allow someone so mentally unstable to have any kind of clearance. Manning's AFQT score was probably 99 at a time when the army had trouble getting anyone literate to join. And so there wasn't a discharge. Here's an interesting interview about Manning's (near) discharge:

http://www.theguardian.com/world/2011/may/28/bradley-manning...

13
Person carrying bacteria resistant to antibiotics of last resort found in U.S. washingtonpost.com
300 points by dak1  2 days ago   201 comments top 22
1
tjohns 2 days ago 8 replies      
Relevant story: As a kid, one of my friends would frequently get strep throat. So his mom would give him amoxicillin until he appeared better... and then save the rest of the bottle for the next time he'd (invariably) get strep throat.

And that's how antibiotic resistance happens.

2
af16090 2 days ago 2 replies      
The cover story for this past week's Economist was about antibiotic resistance: http://www.economist.com/news/briefing/21699115-evolution-pa...

And from that story, it talked about Colistin (the drug this patient's E. coli is resistant to): "Some of the antibiotics farmers use are those that doctors hold in reserve for the most difficult cases. Colistin is not much used in people because it can damage their kidneys, but it is a vital last line of defence against Acinetobacter, Pseudomonas aeruginosa, Klebsiella and Enterobacter, two of which are specifically mentioned on the CDC watch list. Last year bacteria with plasmids bearing colistin-resistant genes were discovered, to general horror, in hospital patients in China. Agricultural use of colistin is thought to be the culprit."

Considering the same article says that "In America 70% of [antibiotics] sold end up in beasts and fowl", it seems that an easy thing to do would be to stop giving antibiotics to animals.

3
ChrisArgyle 1 day ago 1 reply      
TL;DR

Modern medicine is over! Get to the zombie apocalypse shelter!

Not really though; everything scary in this article is either wrong, exaggerated or both. Ars Technica explains it expertly:

http://arstechnica.com/science/2016/05/everybody-be-cool-a-n...

4
adrusi 2 days ago 5 replies      
Look, this is scary and a big problem, but can we please stop talking about the "end of the road" for antibiotics?

The worry here isn't that antibiotics will suddenly become useless and whenever anyone gets a bacterial infection they'll have no hope. The worry is that there will be a number of prevalent bacterial illnesses which can't be treated with antibiotics.

Currently antibiotics work for an overwhelming majority of bacterial illnesses, that's not going to change overnight. What will change is the idea that bacterial illnesses are trifles because they can be cured every time by antibiotics. A few diseases will emerge, more and more over time, that have much worse consequences than we are used to thinking about right now, but the rest will be the same.

I don't mean to underplay the threat, but if we keep pushing this rhetoric, people will discredit the threat when it turns out that 50 years later we're still using antibiotics for most illnesses that people actually get (because antibiotic-resistant strains are effectively quarantined). People will compare it with the "we're going to run out of oil" scare.

5
slg 2 days ago 7 replies      
The article doesn't touch on it, but the obvious followup question from the laymen is why can't we develop new antibiotics? I was curious and according to Wikipedia we haven't developed a new class of antibiotics in 30+ years. Can someone with knowledge on the subject explain why we seemingly can't discover/develop new forms of antibiotics to combat these resistant bugs?
6
c3534l 2 days ago 0 replies      
> Health officials said the case in Pennsylvania, by itself, is not cause for panic. The strain found in the woman is treatable with some other antibiotics.

Thanks for completely ignoring that advice with a headline and three paragraphs of misleading information designed specifically to cause panic.

7
tokenadult 1 day ago 1 reply      
Another Hacker News user submitted a better story yesterday

http://arstechnica.com/science/2016/05/everybody-be-cool-a-n...

but I see that this story with the alarmist headline got more traction on the main page of HN. That's unfortunate for understanding the underlying issues.

8
searine 2 days ago 1 reply      
The are three solutions needed here :

1. Stricter regulation of antibiotics, particularly in farming.

2. Better government funding of antibiotic discovery.

3. Stricter regulation of antibiotic use. No solo-drugs, all antibiotics used in stacks of 3 or more. Better monitoring of complete antibiotic use cycles.

Biologic resistance can be managed, HIV is more than enough evidence of it working. We have to get serious about it, the age of reckless antibiotic use needs to end, now.

9
jrk 2 days ago 0 replies      
"Colistin is widely used in Chinese livestock" oh for fuck's sake
10
rdtsc 2 days ago 3 replies      
Wonder if we'll see a resurgence of phage therapy due to this.

Phage therapy is using viruses which will infect and attack the bacteria. Viruses can mutate and adapt just as well as bacteria (while say antibiotics are static in a way). So they can keep up with the mutations.

It is a pretty crazy but also ingenious approach.

https://en.wikipedia.org/wiki/Phage_therapy

11
Zelmor 2 days ago 0 replies      
This is what happens when you raise livestock on antibiotics as the de facto standard. You are what you eat.
12
ifdefdebug 1 day ago 0 replies      
Two lines in the article:

> Colistin is the antibiotic of last resort for particularly dangerous types of superbugs

and further down:

> Colistin is widely used in Chinese livestock ...

This is absurd. Preventive use of antibiotics on livestock works just like a giant training camp for hostile bacteria, and horizontal gene transfer will spread the resistances it inevitably creates to human microbes, rendering the drugs useless sooner or later.

13
rtpg 1 day ago 1 reply      
I don't get the deal here.

>Health officials said the case in Pennsylvania, by itself, is not cause for panic. The strain found in the woman is treatable with some other antibiotics.

So the last resort doesn't work, but other stuff works. It's totally reasonable to assume that if bacteria becomes resistant to more common antibiotics, that some other kind of antibiotic could do the trick.

Though I guess it would be nicer to have some sort of "proof" that the bacteria _does_ get weaker to stuff it's less exposed to.

Actually, side note but wouldn't mass feeding of antibiotics for certain kinds of bacteria let us completely wipe it out, a la smallpox?

14
rdl 1 day ago 0 replies      
I wonder if you could create a new antibiotic and restrict it to supervised inpatient use only, to preserve efficacy for as long as possible.
15
Practicality 2 days ago 3 replies      
It might be time to start editing our (DNA) code to fight the bacteria. It seems like the only thing that will be fast enough to keep up with the mutations.
16
tedd4u 2 days ago 0 replies      
There is an interesting RadioLab episode that covers antibiotic resistance and an unlikely source of new antibiotics.

Staph Retreat - Nov 2015: http://www.radiolab.org/story/best-medicine/

Or load it up in your favorite mobile podcast app.

17
jwatte 2 days ago 1 reply      
If we can't kill these infections after they happen: Can we develop vaccines against them to prevent occurrence?
19
dctoedt 2 days ago 2 replies      
Time for Congress to authorize a very big monetary prize for the company that comes up with a better solution, with that solution then being licensed for free to all U.S. manufacturers (or something like that to make it politically acceptable to the xenophobic elements in the GOP).
20
GigabyteCoin 2 days ago 1 reply      
I wonder if antibiotic resistance will be the reason that people move out of the cities back to rural areas.

You can't get sick if you're not near anyone else.

21
ams6110 2 days ago 1 reply      
Meta: something about washingtonpost.com locks my browser every time.
22
PythonicAlpha 1 day ago 0 replies      
And still, the (mis-)use of antibiotics in meat production carries on (at least in Europe) -- even using last-resort antibiotics.
14
An industry that helps Chinese cheat their way into and through US colleges reuters.com
278 points by okket  3 days ago   318 comments top 32
1
jchiu1106 2 days ago 10 replies      
As a former Chinese international student who graduated from one of the top Canadian universities with distinction, I have mixed feelings about this. On one hand, the toxic culture permeating the Chinese visa student community is definitely responsible for the high rate of cheating. Most Chinese visa students who came to study in the West did not come to pursue academic excellence. They (and their parents, most likely) see it as a way to beef up their profile with a foreign diploma. This was a culture that I tried so desperately to stay away from when I was in school.

On the other hand, I feel bad that the broad generalization tremendously and negatively impacts those Chinese visa students who did pursue the dream and passion which led them to a foreign university. I worked hard to graduate with distinction, and learned my stuff well enough to go on and have a successful career, but I always feel I have to go the extra mile just to prove myself; I have to go above and beyond just to gain equal footing. I cannot quantify the negative impact that those people cause, but it's incredibly unfair to face prejudice just because I may look like them.

Academic dishonesty is certainly not a Chinese-only problem. The media singling out a group such as the Chinese visa students is certainly a popular thing to do to gain clicks, but it's a little unfair.

</end-of-rant>

2
kirrent 3 days ago 4 replies      
Australia is also a popular destination for Chinese students and sees high rates of cheating for many of the same reasons as those given in the article. Poor language skills allowed by lax student visa requirements, isolation, and the high penalties for failing a course when your family is paying full price creates desperation.

4 corners did a pretty good story on it and the smh investigation into MyMaster was also pretty good. Even if you can't get the video on 4 corners, the transcript is available.

http://www.abc.net.au/4corners/stories/2015/04/20/4217741.ht...

http://www.smh.com.au/nsw/mymaster-essay-cheating-scandal-mo...

3
s_baby 3 days ago 5 replies      
The international students have unreasonably high standards for maintaining financial assistance and staying out of academic probation (3.5-3.8 at my university). By design, college courses give work just a little beyond what a student can reasonably accomplish (with a "good" grade) and curve accordingly. These kids aren't just cheating; they are marginalizing students who are honest or don't have this kind of social access. If you're not part of a fraternity/sorority that archives coursework/tests of previous students, or part of an international student in-group, then you can easily fall through the cracks from this grade-deflation effect.
4
mrep 3 days ago 4 replies      
I went to a public university with a large international population, and this doesn't surprise me in the slightest. A lot of the international students were extremely poor at English. One of the guys on my software project team was an international student who confirmed that it was very easy to pay someone to take the English proficiency exam.

Us Americans aren't so innocent though. The amount of adderall/vyvanse/ritalin... that goes through college campuses is ridiculous.

Honestly, I feel that it's just a symptom of extreme competition. With colleges and first time jobs taking such extreme care to filter on GPA and the likes, every little bit usually pays off.

5
hackaflocka 3 days ago 1 reply      
Can confirm. I'm a professor at a US university.

One class of 40 students, about 20 were Chinese foreign students.

Turns out, they were each doing 40+ credits that semester. About 20+ from the U.S. university, and about 20+ from a Chinese university (online).

How's that possible? They were sharing assignments and exams. Each one had the responsibility to do the same assignment and exam over and over for about half a dozen others.

They were on track to complete a 4-year program in 2 years. The actual degree was being awarded by the Chinese university, and they were transferring the credits from the U.S. university.

7
jboles 3 days ago 0 replies      
Sadly, not a new thing and not just US universities.

Ghostwriting scandal that hit a bunch of Australian universities about 18 months ago:

http://www.smh.com.au/national/education/students-enlist-mym...

"Australia's international student market is a $15 billion industry and the country's largest export after iron ore, coal and gold"... and money talks.

8
whack 3 days ago 6 replies      
Cheating is and always should be a pox that needs to be eradicated. But let's not turn a blind eye to what's happening here locally. Parents in the 1% spend an extraordinary amount of money and resources on college-prep companies, many of which tell the kids exactly what extracurriculars to take, what to do after school, and virtually write the entire outline of their admissions essays. It's little wonder that schools like Harvard are playgrounds for the ultra-elite.
9
cels 2 days ago 0 replies      
When I began graduate school, it was very surprising to me to see how the Chinese kids all cheated together as a matter of course and the Indian kids all cheated together as a matter of course. Some of the Americans would cheat somewhat, but furtively, rather than just being an ingrained part of culture.

(with the exception of one top Indian student who had no need for cheating whatsoever)

10
studentrob 2 days ago 0 replies      
High school students' parents in China give gifts to teachers to guarantee good grades for their kids.

This culture of cheating goes beyond cheating on the American tests.

In the business world, it's common to give gifts to your boss to get a promotion.

China will not surpass us economically any time soon. Their educational and promotional systems have a long way to go.

11
mkagenius 3 days ago 0 replies      
12
raincom 3 days ago 1 reply      
Instead of looking at the issue morally, one can see what it boils down to: learning vs credentialing

This shows that one wants a credential, but does not want to learn. There are tons of people out there who precisely want that.

13
raddad 2 days ago 0 replies      
I was taking a school bus license endorsement exam and saw them throw out a Chinese girl for cheating. The examiner said it was a fairly frequent occurrence. They couldn't read the questions or answers, but they could match the picture with the correct line of answers on the computerized testing machine after enough memorization.
14
studentrob 1 day ago 0 replies      
This story is HUGE

Entire copies of the SAT are getting released in China, and possibly the US. The College Board's ETS cancelled January 2016's test in China out of concern that the test that would be used was already available on the internet. Then they gave another test in March. Questions from that test started appearing online within hours after it finished [1]

The problem with this is that the College Board reuses tests. They do not feel it is feasible for them to issue unique questions for each test, according to [2]

This undermines the biggest criteria used to admit students to college

It means rich kids in China can buy their way into US colleges through these companies that compile actual SAT questions, and poorer kids who have done honest hard work will miss out on opportunities to study in the US.

So long as this goes unaddressed, we're importing rich kids who will arrive unprepared and return home without much further education. They're simply giving money to certain institutions in the US and not adding much to the development of US or Chinese innovations. The quality of schools could degrade, causing a weaker economy, etc. etc.

Hope the College Board can see how pervasive this is. The internet is a real game changer when it comes to maintaining academic integrity.

[1] https://soundcloud.com/reuters/howtogamethesat#t=26:21

[2] https://soundcloud.com/reuters/howtogamethesat#t=5:40

15
ausjke 2 days ago 0 replies      
In the past, only the brightest and best Chinese students could come to US, mostly by scholarship for their master/PhD, most of them stayed after the study.

These days, the majority students from China are those ordinary kids(or even worse) with a rich dad, most of them are the only child in the family which was likely spoiled, these combined produced a quality issue, so we're seeing them on the news, that they cheat, they committed crimes, they do drugs,etc.

In the meantime, many universities are in need of cash, which is another reason in the mix.

So all in all, it's all about money, one needs that, another one has that but not much more than that, thus all kinds of related issues.

16
mmkx 3 days ago 3 replies      
This is happening on a mass scale at University of California, Irvine. Nobody is investigating.
17
andrey_utkin 2 days ago 1 reply      
Have seen at least one Chinese guy's profile on Upwork having some test passed with maximum score, completed within ~5 minutes (while the time for the test is usually ~40 minutes).
18
IndianAstronaut 3 days ago 1 reply      
I am sure they could stop it but the money the students bring is too valuable.
19
ikeboy 3 days ago 0 replies      
Discussion of a previous post in the series: https://news.ycombinator.com/item?id=11380174
20
leroy_masochist 3 days ago 1 reply      
Perhaps a dumb question but why doesn't the Chinese government crack down on this? I feel like they have a pretty good handle on what's going on within their borders, and this issue is poisoning the reputation of Chinese-born young people whether as applicants to companies, applicants to grad school, or generally as trustworthy human beings.

Is it because the college students benefiting from these operations are the children of influential people? That's the only explanation I can think of that makes any sense.

21
leesalminen 3 days ago 2 replies      
I was enrolled in a Business Calculus course (100 level) my freshman year. The "instructor" of the class at the front of the room could not speak English intelligibly. I speak multiple languages and am decent at interpreting broken English, but it was not even close to understandable. I left, never went back (except for exams). I still passed just fine, but really was a terrible intro experience to college at a large state school.
22
TDL 2 days ago 1 reply      
Reading some of the comments here makes it clear that we (the global we) have way overvalued the piece of paper that we get upon graduation and have largely ignored the process, habits, & knowledge that said piece of paper once stood for. The problem with all these developing nations & cheating isn't one of moral failing, but the fact that a cred is valued more than what the cred is supposed to stand for. I would argue that this has more to do with the venality of Western universities (and our superior marketing) than it has to do with much else.

We (Westerners) tell the world, and ourselves, all you need is a piece of paper and all will be well. This piece of propaganda plays a larger role than any perceived moral inferiorities among the developing peoples of the world.

23
hoodoof 3 days ago 10 replies      
I know it's naive but I find it really hard to understand why people value the piece of paper more than the knowledge.

The good news is that you can't cheat as a software developer. Or maybe you can? Depressing thought.

24
auggierose 2 days ago 0 replies      
That doesn't only apply to the Chinese. A German friend of mine faked his transcripts to do an MBA in California. As long as he paid the 25K in tuition fees, nobody cared to take a closer look. He did well, got his MBA, and lived happily ever after.
25
known 2 days ago 0 replies      
You call it corruption; They call it innovation;
26
Kinnard 3 days ago 0 replies      
Damn, first generation college student black americans sure are getting ripped off . . .
27
fatman13gg 3 days ago 1 reply      
> U.S. universities offer an easier way to get ahead, with a quality education and better job prospects.

The better job prospects part might not be true. Those who came back to China for jobs would get their current position with or without a U.S. degree.

28
yeowMeng 3 days ago 2 replies      
Unrelated but: It's easy to believe that your peers are cheating. But I believe there are a lot of hard-working people who know the consequences of cheating your way through.
29
googletazer 2 days ago 1 reply      
The real question is how can you make money from this.

Cheating is a part of human nature, and it's not very surprising that it happens, especially when the pressure is high at universities. If it weren't good for survival, it wouldn't happen.

30
qaq 2 days ago 0 replies      
A friend worked as a PM for a similar service targeting a different demographic (and mostly written assignments). They were pulling in over $1 million a month in profit, so this is fairly widespread and definitely not limited to Chinese students.
31
paradite 3 days ago 2 replies      
One common mistake made by Westerners is treating a lack of language proficiency as low competency in work or study. Native English speakers take for granted the ability to speak English fluently but do not realize it takes years to master spoken English as a non-native speaker.

Although this is tangential to the issue being discussed here, you can see this mistake appearing in lots of comments here.

32
JoeAltmaier 2 days ago 1 reply      
"a vibrant East Asian industry "? There were 30 students out of a student population of 30,000 that used this service. That comes to, what, 0.1 percent? How many of the other 49,970 students hedged their bets in other shady ways (sharing papers, duplicating homework, covering one another for labs)? This is not a situation limited to those unscrupulous foreigners. The whole article smacks of nationalism.

I know, it's pretty cynical to have a business dedicated to systemized cheating. But how is that different from sororities and frats that make available (and sell) last year's notes/exams? That's often a thriving cottage industry.

Caveat: I went to Iowa. I was in the Engineering program, which was 50% foreign students (99% of the graduate students) back then. These were some of the hardest-working, smartest scholars I have ever met.

I don't think there's anything new to learn in this article.

15
Blue Ocean: a new user experience for Jenkins jenkins.io
336 points by Artemis2  20 hours ago   93 comments top 16
1
i386 15 hours ago 5 replies      
My team and I designed Blue Ocean. We are really excited to get your feedback and suggestions. Ask me anything :)
2
Roritharr 18 hours ago 2 replies      
Yes! This! A million times this!

I wished for a better Jenkins UI for so long that I had given up hope. GitLab CI looked enticing, but after putting so much work into building a workflow that doesn't pollute my repos with environment-specific data, I'm right back at Jenkins.

Thanks for this, this will make it so much clearer to our POs where we stand without spending forever trying to push that info into Jira.

3
jupp0r 16 hours ago 1 reply      
I just wish they would have used this change to get rid of the weather icons ...
4
serbrech 11 hours ago 1 reply      
I like the concept of pipelines, and I like that the definition can be checked in with the code, but:

* I don't like coupling the builds to a product

* Can I run the build locally? If something goes wrong, how do I know whether it's a bug in the pipeline script, the build server environment, or my code? I want reproducible builds.
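For context, the checked-in definition is just a `Jenkinsfile` at the repository root. A minimal scripted-pipeline sketch (the stage names and `make` targets here are illustrative assumptions, not from any particular project):

```groovy
// Jenkinsfile at the repository root, scripted pipeline syntax (Jenkins 2.x).
node {
    stage 'Checkout'
    checkout scm      // build the same revision that triggered the job

    stage 'Build'
    sh 'make'         // any shell step; substitute your build tool

    stage 'Test'
    sh 'make test'
}
```

Note this doesn't by itself solve the local-reproducibility complaint: the `sh` steps still run in whatever environment the build agent provides.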

5
franblas 16 hours ago 1 reply      
Very good news. I love Jenkins, but the interface is not very friendly, and even with some theme templates like this one https://github.com/jenkins-contrib-themes/jenkins-material-t... it's still hard to navigate and get the right information. Definitely yes to this integration!
6
truebosko 16 hours ago 1 reply      
I love Jenkins, and it's incredibly important for our teams. The redesign's much stronger focus on the pipeline is key in helping our (and perhaps many other) teams move past the "continuous build tool" phase and into the "continuous integration/deployment" tool we would love!
7
karterk 18 hours ago 2 replies      
I have worked with many open and proprietary CI tools, and find GoCD to be the best in this space right now. It supports both Continuous Integration and Continuous Delivery. They also made it open source last year:

https://www.go.cd/

8
luka-birsa 17 hours ago 1 reply      
We're running both GitlabCI and Jenkins (different teams) inhouse. Gitlab really upped their Game with the integrated Docker repo (we've adopted Docker heavily) but it is lacking the plugin ecosystem of Jenkins.

Jenkins on the other hand just feels dated. The new ui is very welcome.

Looks like we're going to keep the dual approach for a while until things crystallize further.

9
smartbit 16 hours ago 0 replies      
I wonder how fast CloudBees will upgrade their new exam [0], based on version 1.625.2 of the Jenkins core. The exams were only available starting last month and are already quite outdated since version 2.0 was introduced [1]. What alternatives to exams exist to assess someone's knowledge and experience in using, managing & maintaining an application?

[0] https://www.cloudbees.com/sites/default/files/cje_study_guid...

[1] https://news.ycombinator.com/item?id=11362058 & https://news.ycombinator.com/item?id=11574487

10
benbristow 16 hours ago 1 reply      
Looks much better!

I've never used Jenkins for my own projects but I've used it before when Minecraft's 'bukkit' framework used to use it and it's a horrible piece of web software to navigate.

11
rcarmo 16 hours ago 0 replies      
This is excellent news. Jenkins is often the hardest thing for people to get to grips with in a CI pipeline solely because of the UI/UX, and I, for one, welcome our redesigned build butlers... :)
12
stuaxo 11 hours ago 0 replies      
About time! This is great.
13
hjgilmore 12 hours ago 1 reply      
Looks really great! Much more modern and visually clean.
14
joneholland 15 hours ago 1 reply      
I've yet to find a build system that is better than TeamCity.
15
jacques_chester 12 hours ago 1 reply      
This is a fantastic upgrade to Jenkins, well done.

Sincerely,

A Concourse Fan.

16
zxcvcxz 15 hours ago 0 replies      
CLASSIC HN (2 Months ago)

https://news.ycombinator.com/item?id=11362058

>Jenkins 2.0 Beta

Almost every top level comment is very negative.

I just think the discussion now vs. the discussion then is worth noting. People seem much more positive about Jenkins just in the last few weeks.

Recently since it was integrated into Azure those negative comments have started to get downvoted, in favor of positive ones.

https://news.ycombinator.com/item?id=11737374

Very interesting. I figure this is just the more "corporate culture" here on HN.

16
Apps made for one OS shouldn't insist on aping the design elements of another macworld.com
291 points by okket  1 day ago   230 comments top 44
1
ericdykstra 1 day ago 13 replies      
User interfaces should be designed for the user, not for the operating system. I feel like more apps should embrace their uniqueness rather than aping whatever Apple/Google/whatever puts out as the "new standard" every time their design team decides to go in a new direction.

A lot of user interface design is managing user expectation (things people have gotten used to, like where a "create" button is or what a "share" icon looks like) versus optimizing for long-term usability of an app (something like Snapchat, with its seemingly-esoteric UI that is actually efficient once one is used to using it).

The only reason the author gives for taking this position is that "Participating on someone else's operating system means you're on their turf." The author even admits that "I'm not saying either design is superior." So app companies should forego optimizing for user experience to pay respect to the platform that they're on? This is a totally backwards position; apps should respect their users. If that means following all the OS guidelines for a sparingly used app, that's fine. But if that means creating a unique UI and app flow for long-term user happiness, then the app company owes no respect to the "local conventions" of the OS.

edit: clarity

2
ajoy39 1 day ago 7 replies      
Material Design isn't just about Android Apps though. They make libraries for websites too, and have guidelines for all types of devices built into the spec. Google isn't using MD in their iOS apps to bring Android design to iOS they're using MD in their iOS apps because they believe MD should be a universal design paradigm for applications and web apps. You can argue that it shouldn't be, you can not like it as a design framework in general, but it's not about Android vs iOS it's about Google's vision of application design regardless of platform.
3
simula67 1 day ago 6 replies      
Google is acting more like Microsoft.

They are building huge moats around their business ( Android, Chrome etc ) so it becomes harder to use some Google products without using others.

Google discontinued 'Google Sync' ( https://support.google.com/a/answer/2716936?hl=en ).

It favors Google Finance and Youtube over its rivals in search results page.

They 'have not figured out' how to allow extensions in Chrome mobile.

They had no-poaching agreements with rival companies.

I am almost a Google fanboy, but I suspect they will be no different to Microsoft in a few years. Letting products stagnate, pushing the tech industry backwards but making a pretty profit for investors.

4
kmiroslav 1 day ago 8 replies      
The author should launch iTunes on Windows and realize that Apple is pulling the same kind of crap on Windows.

Large companies have large egos.

5
simonh 1 day ago 0 replies      
While in general I'd agree with Jason, in this case I can't. There's a big difference between the case of Word on the Mac back in the 90s and Google apps on iOS.

I use Gmail, Google Maps and Google Docs, but I use them equally on iOS and on the web. What I want from these apps is a consistent UI on the web and in the iOS apps. I couldn't give two craps about what it looks like on Android. No disrespect to Android, but that's just not an issue for me.

Jason's position is reasonable. Making the UI on iOS more like iOS conventions would have advantages and for some people this would be a better option, but there's an extra issue here he's not acknowledging.

6
whack 1 day ago 1 reply      
It's worth pointing out one major difference between Word 93 and Google apps in 2016. Back in 1993, we were pretty much living in a 1-device-per-user world. People had one computer, and they used all their apps on that one computer. Hence, if consistency is the goal, the only thing to be consistent with is other apps on that computer.

However, in 2016, we're living in a multi-device world. A single person can easily own a smartphone, a tablet, a laptop as well as a desktop. Some of them may be iOS, others may be Windows, and others may be Android. And they are likely to use the same apps (eg, google Maps) on all of those different devices. Hence, I would argue that consistency across devices today, is more important than consistency across different apps. As someone who owns a mix of both Mac and Android products, I would like for Google docs to look and behave similarly no matter which device I'm on, even if it means that Google Maps ends up looking differently from Apple-Music.

7
erikb 1 day ago 0 replies      
The funny thing is: I understand his argument, but the Android-like Google apps were what finally made the iPhone a little usable for me. I've had it for nearly a year now and still every day wish myself back to the times when I had a Nexus 4. It is just way more efficient. I don't know what the goal of a normal iPhone user is. But my goal is to get things done as fast as possible. I need to work. But for that, iPhones are really painful to use. And I don't know, but if you have seen the Gmail app, can you ever go back to the Mail app on iOS? It's like a Porsche vs. a rickshaw. Why would anybody want to use the latter?
8
tluyben2 1 day ago 3 replies      
I must be a very weird person I feel lately (this theme is recurrent on HN); I don't actually notice the difference. I use iOS & Android and if the app works well I don't notice if it doesn't behave 'like it is supposed to do' on the respective OS. It is far more noticeable on desktops. However, if copy/cut/paste (Firefox on Linux notably which is unusable because I cannot copy to/from it), file dialog and drag & drop works with the rest of the applications and the application is solid otherwise as well I don't care about that either and probably won't notice it. I notice it with software that presents me a hard to use custom file open/save dialog only. And I tend to just not use that anymore.

But on mobile; if the app is good (I for one like Google docs) I don't believe many people actually notice this difference.

9
bikamonki 1 day ago 0 replies      
I agree with the point being made: I would be really pissed if a desktop app had the window management buttons on a different corner, even the same corner but different icons would feel bad. However, mobile apps are more like websites rather than desktop apps. The 'shell' has to be consistent with UI design and behaviour whereas the 'content' can have a UI relevant to function and maybe a different brand. For example, if you open gmail on Chromium and Safari, you'd expect it to behave and look the same on both.
10
Zelmor 1 day ago 1 reply      
"and the target of dislike and rage from many people who love Apple products."

He managed to lose me in the first paragraph. Not a record, but really close.

11
fenomas 1 day ago 0 replies      
This seems like a trivially silly article. In essence it argues: "OSes are special sauce and apps are commodities - Google should recognize this and stop trying to differentiate their app designs."

That's certainly a valid opinion, but its reverse is just as defensible - if one considers services to be special sauce and OSes to be commodities, then it directly follows that app UIs should first be consistent across platforms. And considering that Google's entire business model is based on the latter view, why on earth would they take the author's view?

12
Dylan16807 1 day ago 1 reply      
Just some icons and menu colors? If that's their biggest problem then they're doing well.
13
jetskindo 1 day ago 1 reply      
If Google Docs had a Mac-like UI on iOS, the author would be writing about how inconsistent Google Docs is. "They should unify their design"
14
throwaway13337 1 day ago 0 replies      
Google is also resembling Microsoft in the sortof-broken-but-usable cross-product features.

For example, when I browse to google maps, and try to look at the reviews list of a point of interest on the map, it takes me to a google search - completely out of the context I was in - and opens as modal dialog in that search area.

There are lots of examples of similar functionality. Individually, it's harmless, but as a whole, it feels sluggish and clunky.

This kind of behavior seems to be common in large software companies that have separate divisions with communication issues between them.

It's sad to see Google go down that road.

15
torgoguys 1 day ago 0 replies      
Reading the headline, I was half expecting the article to say that Google's just-announced Instant Apps initiative is the resurrection of ActiveX in IE, a comparison I'm just waiting for some journalist to make. (To this point, I've mostly read nothing but praise for the idea (instant apps). I see instant apps as another sad retreat from the promise of the open web.)
16
makecheck 1 day ago 0 replies      
While I prefer adherence to the OS conventions, I can handle variations as long as they solve some problem. For example, one could argue in favor of something like a non-standard icon if that makes gestures easier to discover than they are by default on iOS.

Gratuitous redesigns are silly though, and by nature consume engineering effort that otherwise would've been spent elsewhere. And when engineering effort has clearly gone into a new veneer while there are still broken app features, I become really angry. Too many apps on iOS alone have been updated with redesigns and mysteriously lost functionality in the process (or effectively lost functionality, by hiding previously-known features somewhere new).

17
pervycreeper 1 day ago 5 replies      
tl;dr--- Google is the equivalent of 90s-era Microsoft because it is angrily disliked by some Apple fans, and it incorporates Material Design in its iOS apps.
18
TazeTSchnitzel 1 day ago 0 replies      
Particularly irritatingly: Google disables shake to undo, which I've actually needed, instead making it shake to give feedback, which is never something I've needed.

Presumably just to upset iOS users?

19
JustSomeNobody 1 day ago 1 reply      
What a sad and click bait-y article.

So, there's not a million other examples of apps that also don't follow the iOS style?

This article was written just to beat the Android vs iOS war drums.

20
vmateixeira 1 day ago 0 replies      
They're just looking after their business. By having the same application layout/behaviour across platforms, it will be easier for users to switch to Android in the future.
21
werber 1 day ago 0 replies      
The only Google app I regularly use on my iPhone is their Photos app, and I like that it's cohesive across platforms. The subtle switch into Material Design land on iOS gives me reassurance that my pictures will be safe in Google's cloud. This feels worlds away from the design wars 20 years ago, when all you got was frustration and no real user benefit.
22
edko 1 day ago 0 replies      
Apple has strict rules for allowing apps into the iOS AppStore, and they are enforced for every developer, including Google. While I disagree with the contents of the article, the rant should not be against Google only. If material design apps are accepted then it is not only Google's "fault".
23
proyb2 1 day ago 0 replies      
That's why we all have different furniture.
24
jasonm23 1 day ago 0 replies      
The more you understand product design and usability, the more you value consistency with the platform.

The problem is, no one starts out with this knowledge or value system.

25
digi_owl 1 day ago 1 reply      
I for one miss the 3.x-4.x Holo UI.
26
Tyler-Durden 1 day ago 0 replies      
Material Design is not Android design. It is Google's design concept. Being surprised that the company uses it for their applications is like an American traveling to a far-off country called Great Britain and being surprised they speak English there.
27
Arnt 1 day ago 0 replies      
I'm going to go out on a limb and assume that most users of google apps also use google-as-a-website.

So what's google to do, provide consistency between the app and other apps or between the app and the corresponding web UI?

That isn't a simple question with a simple answer...

28
JackPoach 1 day ago 0 replies      
And Microsoft is 'unmaking' many of the mistakes it made in the past two decades
29
ysavir 1 day ago 0 replies      
I think the author misses the point. Using a native-looking interface will only encourage users to stay on their iTems, and Google (not so) secretly wants to convert those users to their own native platforms.
30
isidoreSeville 1 day ago 1 reply      
I've never really used iOS so can someone tell me if this is even a real problem?

The apps shown in the linked piece seem totally fine to me, so maybe there are better examples out there?

31
no_gravity 1 day ago 0 replies      
According to the author, they made that mistake in 1993.

In 1993, Microsoft's market cap was under $30 billion. Today it is over $400 billion.

32
PSeitz 1 day ago 0 replies      
An Apple fanboy dislikes Android, quelle surprise.
33
DeadReckoning 1 day ago 0 replies      
Totally disagree. I love the iOS material design apps like Inbox and Google Photos
34
ja27 1 day ago 1 reply      
"Boo hoo" - every Android user with an app drawer full of iOS rounded rect, flat, gradient-shaded icons.
35
samwestdev 1 day ago 0 replies      
Another crappy Jason Snell original
36
johnloeber 1 day ago 1 reply      
Making analogies between Microsoft in the 90s and X without mentioning the huge antitrust lawsuit[0] always seems dubious to me. The evolution of Microsoft as a business is not purely due to its product strategy.

[0] https://en.wikipedia.org/wiki/United_States_v._Microsoft_Cor....

37
grawlinson 1 day ago 1 reply      
Microsoft now isn't that great either, given the following examples:

* Constant starting/shuttering of their mobile OS offerings.

* Disregarding user privacy in Windows 10 (and extending that to Windows 7/8 via updates)

* Forcing users of Windows 7/8 to update to 10. It's basically malware at this stage.

* Removing admin abilities in Enterprise offerings of Windows 10.

38
tn13 1 day ago 1 reply      
Honestly as a user I think Apple and Google can fry frogs with their design guidelines. I want app makers to make apps that are fun to use and enjoyable and more importantly useful. I could not care less if they are following company X's design guidelines or company Y's design guidelines.

There is nothing earth shattering or innovative or useful about Apple or Material design guidelines. It is just an attempt to establish their brand.

39
rdiddly 11 hours ago 0 replies      
UI "conventions" (i.e. fashions) wouldn't be such a big deal if people adhered more closely to first principles like discoverability, etc.
40
d0m 1 day ago 0 replies      
TL;DR: The author doesn't like Material Design.
41
deprave 1 day ago 0 replies      
WebKit and Blink: Embrace, Extend, Extinguish.
42
Zigurd 1 day ago 0 replies      
The article has some good points. Material Design is a fine convention for Android and for Google's Web apps, and is well-supported for those if you use Google's libraries and frameworks. But I would not recommend that app developers take MD to iOS. iOS has its own conventions. Observe them. On top of all that, Web UI design lacks universal conventions. There isn't much to be gained by aping Google's style on the Web, unless you really like it. Every developer needs distinctive design elements in the areas that are not covered by platform conventions, and their Web UI is a good place to develop distinctive elements.

BUT that doesn't mean Google is making a mistake, except possibly on iOS. If Google's iOS apps are not close enough to iOS conventions, Google is, at least potentially, confusing their iOS users.

43
pan69 1 day ago 1 reply      
Three words for the author of this post: iTunes for Windows
44
vixen99 1 day ago 1 reply      
He says "Truth be told, just as I used Word 5.1 back in the day, I use many Google services today."

Just as? No, he paid good money for the Microsoft software.

17
Startups Cant Manufacture Like Apple Does (2014) bolt.io
330 points by bootload  1 day ago   79 comments top 16
1
bobjordan 1 day ago 3 replies      
This is a good list. Let me add one to it. I manufacture products in China and I consistently see new designs for injection molded parts where the designer has unrealistic expectations for the tolerances that can be achieved on molded parts. While the tolerance we can achieve will change proportionally with the size of the injection molded part, don't design parts that rely on precisions of 0.01mm. Hard to do this unless you are Apple.
2
tgb 1 day ago 5 replies      
I remember coming across this some time ago on HN and being really interested by the white plastic bit. It makes sense of Apple's ability to set itself apart visually during the white-earbuds era of iPod advertisements. PCs were black and gray and beige; Apple was a shining white.

And no one else could mimic their style, since it was just too difficult.

3
amluto 12 hours ago 1 reply      
One particular bit of this makes no sense:

> What happened when Apple wanted to CNC machine a million MacBook bodies a year? They bought 10k CNC machines to do it.

CNC milling scales linearly. If you want to make 1k things per year, you can probably do it with one CNC machine. I know a startup that's using CNC-milled enclosures and that's probably the single easiest part of their production.

Sure, startups won't buy 10k CNC machines, but they won't need them either.
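The linear scaling is easy to sanity-check with rough numbers (the cycle time and machine-hours below are made-up assumptions for illustration, not Apple's or anyone's actual figures):

```python
import math

def machines_needed(annual_volume, cycle_time_hours, hours_per_machine_year):
    # Linear scaling: total machine-hours of work divided by the hours
    # one machine can run per year, rounded up to whole machines.
    return math.ceil(annual_volume * cycle_time_hours / hours_per_machine_year)

# Assumed: 1 hour of machining per enclosure, 6,000 machine-hours/year.
print(machines_needed(1_000, 1.0, 6_000))      # startup volume -> 1
print(machines_needed(1_000_000, 1.0, 6_000))  # million-unit volume -> 167
```

Even at a million units this toy model needs far fewer than 10k machines, so Apple's number presumably reflects longer real cycle times and capacity headroom, which only reinforces the point that a startup's volume fits on a handful of machines.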

4
sbierwagen 1 day ago 0 replies      
5
dammitcoetzee 21 hours ago 0 replies      
I ranted a bit on this as well on Hackaday http://wp.me/pk3lN-PBv . Apple and Foxconn just have an unimaginable amount of capital. However, I don't think hardware designers should despair. There are many ways to design things, and constraint is the mother of innovation. As the author mentions, it's entirely possible to make products just as appealing with a fraction of the cost.
6
Nokinside 18 hours ago 1 reply      
It's not just Apple vs startups. Large customers like Apple or Samsung have big advantages over smaller phone manufacturers.

Apple can always call Foxconn and tell them that they want even this barely visible detail fixed. Foxconn comes up with a number for the changes to the manufacturing process ($10 million for better tooling, for example) and it's done. Small manufacturers with low margins can't justify similar attention to detail.

7
bambax 19 hours ago 3 replies      
> CNC machining is fantastic for prototypes [but] it is not for consumer devices. Figure out a way to cast your metal parts.

Why is that?

I'm trying to build a better quick release plate for DSLR cameras, compatible with Manfrotto tripods but that lets you do a couple of other things (like attach a hand strap to it directly).

I had prototypes made in China with CNC machining, and they are of a very good quality (superb, even, it seems to me).

For production, injection molding is reasonably priced, but the result would not be the same quality (plastic is a poor choice for this).

Die casting is too expensive for the volume, given it's really a niche product.

So I was thinking of doing short runs with CNC: what's wrong with that option?

8
fauria 16 hours ago 3 replies      
Does anyone know what a "CM" in this context is?:

"Unless youre a billionaire genius, your product will have noticeable ejector pin marks. A good CM knows how to hide these well. Nearly zero CMs hide them as well as Apple does."

9
sgnelson 1 day ago 0 replies      
Well, the truth is that very few companies can manufacture like Apple does, between the obsession with design and details and having the resources to get manufacturers to create entirely new production processes just for them. And the companies that do the above are usually not consumer-facing.
10
bane 22 hours ago 4 replies      
It's pretty interesting to think that quality can actually go up as volume increases. This seems to run counter to the general perception that small, hand-built batches can be of incredibly high quality and that once production scales up, quality goes down.

We've never really seen production at Apple/Samsung's scale before and I wonder if this quality curve is something that is all that well understood.

11
funkyy 10 hours ago 0 replies      
Posts like this are why I come to HN. Even though my field of expertise has absolutely nothing to do with manufacturing, it's nice to read about the problems and solutions of a huge industry explained in simple and clean language.
12
sgt 11 hours ago 0 replies      
I bought a Handground coffee grinder last year. It's a Kickstarter project and we should actually have received the grinder some time last year, but in the spirit of Kickstarter I'm patient.

I think the challenges they've had are similar to what is touched upon in this article.

They've had a ton of clearance and molding issues, and dozens of design prototypes and mismatching parts. To add to the difficulties, the coffee grinder has moving parts that take a lot of stress, as opposed to just a box with a circuit board inside.

* http://handground.com/

13
whyagaindavid 11 hours ago 0 replies      
A while ago Andrew 'bunnie' Huang (remember Chumby?) gave a talk on hardware manufacturing at linux.conf.au. I could not find a link, but it covered a lot, including why you can't do it like Apple or Samsung. His book is out now!
14
makenova 23 hours ago 1 reply      
What are high margin parts?...asking for a friend
15
ArtDev 1 day ago 2 replies      
Apple outsources its manufacturing.

A small startup can't just get Samsung and Foxconn to build their parts. This is true.

16
AceJohnny2 1 day ago 1 reply      
(2014), but still as relevant as ever.
18
Show HN: Automatic private time tracking for OS X qotoqot.com
421 points by ivm  2 days ago   228 comments top 68
1
ivm 2 days ago 19 replies      
I was not happy with features and UX of other productivity trackers. Most of the time tracking software is made for controlling employees or for billing clients and I just wanted an automated productivity measurement.

I tried RescueTime before but it was too expensive for its functionality ($72-108/year) and also collected all my tracked data on their servers. There is standalone ManicTime on Windows but OS X standalone trackers lack features and most of them are not automatic.

So I started to play with OS X accessibility and got promising results pretty fast. Then there were about 14 months of writing some code once every week or two, and 3 months of almost full-time polishing and gathering feedback.

Now it's marketing time. Qbserve did well on PH but almost no other sites picked it from there. This week I pitched about 70 journalists and bloggers who write about Mac or productivity apps but the results are not clear yet.

I'll be very grateful for advice on how to promote it better, and for overall feedback. Thank you!

2
albertzeyer 2 days ago 1 reply      
Fwiw, I developed a very puristic similar project: https://github.com/albertz/timecapture

So far, it's only tracking the time and recording which app is in the foreground and what file / url is currently opened in there. It doesn't have any GUI and it won't show you nice statistics like Qbserve. But it shouldn't be difficult to calculate any statistics you want from the data.

Python, Open Source, easy to add support for other platforms and apps (so far mostly MacOSX support). Patches are welcome. :)

3
deweerdt 2 days ago 1 reply      
I bought the app, and I'm really happy with it, thanks!

I know it's a long shot, but some sort of shell integration would be awesome. My typical day is > 60% iTerm2. iTerm2 has shell integration: https://iterm2.com/shell_integration.html, and maybe that would be one way for Qbserve to fetch info about what's going on in the terminal.

4
Joe8Bit 2 days ago 2 replies      
Some feedback:

* Make the price more readily apparent on the landing page

* Tracking the '6,400 sites, apps and games' is great, but it would be good if I could find out whether the ones I care about are in that list!

* Make the above the fold screenshot bigger, I tried squinting/zooming before I realised I could scroll down

* Can I determine which things are productive/neutral/distractive? As I wouldn't want to buy it if that was static

Looks good though!

5
Karunamon 2 days ago 1 reply      
Minor UI feedback:

The settings UI is extremely hard to read on my screen. The headings are light grey on white, and no amount of messing with my screen's contrast settings leaves something easy to read.

The checkboxes also immediately convey "disabled" due to their coloring. Your UI in general is spot on and sanely designed, but please consider taking a cue from the OSX HIG[1] and use the system colors and leave the light grey stuff for actual disablement, it will make your app look a lot more native.

[1] (about halfway down the page): https://developer.apple.com/library/mac/documentation/UserEx...

6
ryanmarsh 2 days ago 2 replies      
If you're a consultant or you work in a consulting firm I have some advice for you.

Get comfortable with fudging the numbers on your time reports. It's ok. Report what's reasonable given:

You aren't being paid for your minutes; you're being paid for the ability to solve a customer's problem in minutes.

I bill my clients 40 no matter what because sometimes I give them 100 hours worth of value in 1 hour. It all balances out. It took a while to realize this wasn't an integrity violation.

You aren't a machine resource. You're a human working in immense complexity. Your productivity is a roller coaster. It's ok. Don't sell minutes. In reality your customer couldn't handle the unpredictability in billable hours if you were exacting and billed what you're actually worth. Instead we smooth it into 40 (or whatever), and that's ok.

7
baby 2 days ago 1 reply      
Did you test it thoroughly for website tracking? I made a Firefox plugin[1] to track how long I spend on Facebook, but it never had really accurate results.

Do you track only the current tab? Do you still track it if it's not in the foreground? Even if Firefox has many windows?

How do you track tabs in the browser from the OS by the way?

[1]: https://github.com/mimoo/FirefoxTimeTracker

8
howlingfantods 2 days ago 1 reply      
Love the idea! My only suggestion would be to switch "Distractive" to "Unproductive" or "Distracting." I'm sure distractive is technically a word, but this is my first time hearing it. But that's just me. I may just have a limited vocabulary.
9
avivo 2 days ago 1 reply      
I use ulogme which also records when you are typing, and is more customizable (and open source and free).

Explanation and demo with screenshots: https://karpathy.github.io/2014/08/03/quantifying-productivi...

Github URL: https://github.com/karpathy/ulogme

There are definitely some good ideas in it for inspiration when making a similar product.

10
fuzzythinker 9 hours ago 1 reply      
Does it really require Yosemite+? I think there are still quite a few people holding on to Mavericks.
11
daemonk 2 days ago 2 replies      
I really like the UI. Is it possible to implement keyboard/mouse movement activity tracking? I don't mean keylogging or anything, but something like key presses per minute while an app is focused, or mouse movement in pixels per minute while an app is focused.
12
joshcrowder 2 days ago 1 reply      
Great! The fact that this is private is a huge +1 for me. Looking forward to trying it out! I saw one of your comments saying the data is available at ~/Library/Application\ Support/Qbserve/; it would be good if the schema were documented on the site, maybe in a developers section?
13
jrcii 2 days ago 2 replies      
This looks fantastic! Great work. I have some feedback too: right now it groups all of my CLI programs into iTerm2. I would be very interested in tracking the actual programs. vim time means I'm coding; irssi (IRC), newsbeuter (news), cmus (music), or sl probably means I'm goofing off.
14
zzzmarcus 2 days ago 0 replies      
I've been using Qbserve for a couple weeks and I'm really happy with it. For me the best feature is just having that little number in the menu bar that shows what percentage of my time has been focused. This, more than any other timer or tracker, has been a simple and effective motivator for me to keep creating.

There are a lot of features I can imagine that would let me slice and dice tracked data better, but for a V1, this is something special.

15
aantix 2 days ago 1 reply      
I love the alerts. I set up an alert for when I have been distracted for more than 30 minutes.

Could you disable those distracting sites after 30 minutes? I'm only half-way kidding..

Still a fantastic app.

16
peternicky 2 days ago 2 replies      
I have used RescueTime for years and, for the most part, am very satisfied with the service. It would be helpful if you added a simple comparison between RescueTime and your app.
17
danielparks 2 days ago 0 replies      
I've only been using it for 30 minutes, but so far it's great!

Being able to map a domain with all its subdomains to a category would be awesome. I access a whole bunch of hosts in an internal domain, and they're all productive.

18
mrmondo 2 days ago 1 reply      
Looks interesting! I have to ask: is it OS X native or is it some JavaScript thing?
19
lancefisher 2 days ago 0 replies      
This is a cool project. I thought about building something similar a few years ago when I was doing a lot of consulting. The most annoying part of the work was accurately billing clients when some days I'd switch between several projects.

Here's a few things that could make it super useful:

* Track time spent writing email by contact

* Track hangout/skype/etc by contact

* Track time spent on code per project

* Connect phone records to tie in the time on the phone with contacts

Good luck!

20
botreats 2 days ago 1 reply      
I like the idea of this a lot, but not working with Firefox is a dealbreaker. If only 100% of my time spent in Firefox was actually productive....
21
knowtheory 2 days ago 0 replies      
Just downloaded it and fired it up, and immediate first impressions is that there's a lot to like in the app so far.

I'll be curious to see if I can build gentle nudges back on task when I'm off in the woods, or how I can better categorize different types of app usage. Coupling it to my todo lists might be helpful.

22
pault 2 days ago 0 replies      
I've been using Rescuetime for the last 8 years or so, and I will be purchasing this in the next few weeks to see if it will work as a replacement. From what I can see, it shouldn't be a problem. Congratulations on shipping!
23
Karunamon 2 days ago 0 replies      
YES!

Finally an excuse to drop Rescuetime and their goofy UI. I've had this running for about an hour or so, and it seems to provide me exactly what they do, for cheaper, while respecting privacy.

Congrats on an awesome app, and I hope you do well selling this!

24
stinos 2 days ago 2 replies      
> Away from the keyboard or watching a movie? Idle time is detected intelligently.

Problem is, what really counts as idling depends heavily on the person. Ideally you'd be able to read their mind to see if there's any work-related activity :] I'm still using manual time tracking mainly because of this (even despite the obvious disadvantage of forgetting to turn it on or off): there are all kinds of solutions, from detecting mouse/keyboard idling to fancier ones like detecting whether your phone is near your PC, but at least for me none of these are as correct as just manually saying 'now I'm working, now I'm not': they can't detect things like me sitting outside with pen & paper.

25
mcoppola 2 days ago 1 reply      
Early impressions are excellent. I can easily see some billable/reporting functionality added as premium features. I'll be adding this to my daily routine and seeing how it works for the trial period - but you likely have a paid user in me already. Thank you!
26
pwelch 2 days ago 0 replies      
+1 for privacy and storing locally
27
asadhaider 2 days ago 2 replies      
This looks like a simple way to keep track of time I spend on projects.

It would be perfect if it could also log more details such as what filename/project is open in Sublime, that way I know what I'm working on.

28
Jonovono 2 days ago 3 replies      
Looks beautiful. Any plans to add a 'Focus' mode like RescueTime's, basically just the ability to block distracting websites? I'd probably switch over from RescueTime if that's added :)
29
jakobegger 1 day ago 1 reply      
This is pretty misleading. From their product page:

> All the tracked information is stored locally.

From their privacy policy:

> we are using third party service Firebase.com to collect it (...)

So privacy is a selling point, but they have a lot of analytics in their app. Some of it can be turned off, some of it can't.

If you claim your app stores all data locally, it's quite dishonest if it talks to a bunch of analytics services...

30
phelm 1 day ago 2 replies      
This looks very useful, I am preparing to be a little shocked by the results. I will probably be buying after the trial.

One thing: this app seems to interact strangely with Spectacle (an OS X window manager), whereby browser windows move very slowly across the screen rather than with the instant snap that I am used to.

31
daemonk 2 days ago 2 replies      
Just bought it. I really like it. One thing that would make this perfect for me is the ability to show stats for specific time ranges within a day. I use my laptop for both work and home. It would be nice to see a set of stats for just 9am-6pm everyday; or whatever ranges I want.

I guess the time tracking currently works by just tallying up seconds for each category, rather than recording time stamps? Recording time stamps might end up taking too much space?
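The 9am-6pm view asked for above becomes cheap if the app stores (start, end, category) intervals instead of per-category tallies. A rough sketch of the clipping, where the interval format is invented for illustration and no interval is assumed to cross midnight:

```python
from collections import Counter
from datetime import datetime

def seconds_in_window(intervals, start_hour=9, end_hour=18):
    """Sum tracked seconds per category inside a daily hour window.

    `intervals` are (start_ts, end_ts, category) tuples of unix timestamps;
    each interval is clipped to the [start_hour, end_hour) window of its day.
    """
    totals = Counter()
    for s, e, cat in intervals:
        day = datetime.fromtimestamp(s)
        lo = day.replace(hour=start_hour, minute=0, second=0, microsecond=0).timestamp()
        hi = day.replace(hour=end_hour, minute=0, second=0, microsecond=0).timestamp()
        # Overlap of [s, e] with the window; zero if the interval is outside it.
        totals[cat] += max(0.0, min(e, hi) - max(s, lo))
    return totals

# An 8:30-9:30 session: only the 9:00-9:30 half falls inside the work window.
start = datetime(2016, 5, 27, 8, 30).timestamp()
end = datetime(2016, 5, 27, 9, 30).timestamp()
print(seconds_in_window([(start, end, "productive")]))
```

Storage-wise, a few hundred such rows per day in SQLite is tiny, so the space worry above is probably unfounded.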

32
tharshan09 2 days ago 1 reply      
Just curious, what is the tech stack?
33
graeme 2 days ago 1 reply      
Does the license allow use on multiple computers? I have a computer for heavy work, and another where I do email and social media. I'd like to track both.
34
welder 2 days ago 2 replies      
This is the new RescueTime! Now you just need to market it to all of RescueTime's users:

https://twitter.com/rescuetime/followers

Small nitpick: How can you guarantee data is kept locally without open-sourcing the app?

This is similar to WakaTime, but only for OS X and with less granular data, since one is for programmers and the other for more general users.

35
vyoming 19 hours ago 1 reply      
Any reason why you chose FastSpring instead of Stripe for payment processing?
36
fintler 2 days ago 0 replies      
On a 13" MacBook, I need to scroll down to click "Download" or "Buy Now". You might want to move those buttons up a bit.
37
manish_gill 2 days ago 1 reply      
Hi. Trying it out and would happily buy after using it for the next few days.

One query I have: is there any way I can hide the app icon from the cmd+tab list? I want it to stay out of the list and work quietly behind the scenes, since I have too many applications running at the same time. Maybe a "hide icon" option or something? Thanks.

Looks like a fantastic product on first look. :)

38
gruffj 2 days ago 0 replies      
Great app, really enjoyed using it so far. I've found the percentage of productive time shown in the toolbar to be really useful.
39
spoinkaroo 2 days ago 0 replies      
This looks like exactly what I need, I'm going to try the free trial and then let you know what I think and probably buy it.
40
kasperset 2 days ago 0 replies      
I like this app as it is. New features would be welcome, but I prefer a lean and mean app. The simpler the better.
41
zombieprocess 2 days ago 1 reply      
In terms of distribution: could you create a dmg with the drag & drop to the Applications directory, as is standard for OS X?
42
ivan_k 2 days ago 1 reply      
Wonderful tool! I can see myself using it every single day.

Comment on usability: currently, different ports from the same domain name are recorded as different websites. I think it should be sufficient to group all the ports used with `localhost` as "productive".

Thanks for the great work!
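Grouping by hostname rather than the full host:port origin would fix this; a minimal sketch (the function name is mine, not Qbserve's):

```python
from urllib.parse import urlsplit

def site_key(url):
    """Bucket a URL by hostname only, ignoring port and path, so that
    localhost:3000 and localhost:8080 count as the same 'site'."""
    return urlsplit(url).hostname

print(site_key("http://localhost:3000/app"))   # localhost
print(site_key("http://localhost:8080/admin")) # localhost
```

The same idea extends to the subdomain request elsewhere in this thread: strip leading labels from the hostname before using it as the bucket key.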

43
zmarouf 2 days ago 2 replies      
Am I right in assuming that Qbserve only tracks active windows? To elaborate: If I have a monitor set to a fullscreen OSX Desktop with either Spotify or VLC while actually coding, the time spent listening/watching won't count unless the application is active, correct?
44
rememberlenny 2 days ago 1 reply      
This looks like a great tool. I'm testing it out now.

I regularly use multiple computers for personal/work. Could there be a way to sync data across systems using an external host? I'd like to use Dropbox or some similar solution to keep the data files up to date.

Would that be possible?

45
billions 2 days ago 0 replies      
Would be nice to compare productivity with others. Just purchased the full version.
46
Zirro 2 days ago 1 reply      
This looks like it would be very useful to me. A pet peeve of mine is when an application does not use a monochrome icon in the menu bar. I don't suppose you could offer a monochrome option for the percentage, turning it the same colour as the icon?
47
xufi 2 days ago 0 replies      
This is pretty cool. I'd love to use it to keep track of time, since I tend to get distracted by looking at other tabs, and I've been looking for a way to keep track of where I'm wasting most of my time.
48
weinzierl 2 days ago 0 replies      
I installed the trial version and it looks awesome. Unfortunately the lack of Firefox support is a deal breaker for me, as I spend a lot of time in Firefox. This would be my top feature request.
49
Yhippa 2 days ago 0 replies      
I really like this idea. Unfortunately I'm a multi-device user including things like using a Chromebook which doesn't have native apps. Would love to see this aggregate data across different types of devices.
50
imdsm 2 days ago 1 reply      
How long is it -25%? If I try for ten days, can I still get the -25%?
51
cdnsteve 2 days ago 1 reply      
So was this developed using Swift? Curious, since I see a SQLite backend.
52
DarthMader 21 hours ago 0 replies      
I assume this does not work with a VPN?
53
thuruv 2 days ago 1 reply      
Dead link. :(
54
elevenfist 2 days ago 0 replies      
I love the idea of apps like these, but can people really not remember what they do all day? That thought is almost inconceivable.
55
sd8f9iu 2 days ago 0 replies      
Looks great! The interface picture, halfway down, should be the top one; it's too hard to tell what the app does from the first one. Might give it a try.
56
muhammadusman 2 days ago 0 replies      
The UI is so much nicer than RescueTime, I love it!
57
r0m4n0 2 days ago 0 replies      
What my employer thinks I'm doing: 8 hours on stackoverflow = 8 hours of research

What I'm actually doing: earning reputation to improve my resume

58
jbverschoor 2 days ago 1 reply      
OK tried it, but it's not for me.

I need to be able to track activity per project.

Projects can be determined from the open window path or url.

Timings does this, but it's just one big mess

59
Nemant 2 days ago 0 replies      
Only 7MB! Good job! Product looks awesome. If it works well for the next 10 days I'm definitely buying it :)
60
ghostbrainalpha 2 days ago 0 replies      
I'm very happy with the icon in the dock!
61
pibefision 2 days ago 2 replies      
Why, with this kind of site, is there never a single person named behind the marketing page? What's the reason to stay hidden?
62
jbverschoor 2 days ago 0 replies      
I'm currently testing Timings, but its support for activities and projects isn't done properly.

Checking out this one

63
Tempest1981 1 day ago 1 reply      
Is it $30 per machine, or per user, or per household?
64
spark3k 2 days ago 0 replies      
Timecamp has been at this for a while now, with syncing to project management apps.
65
kentt 2 days ago 0 replies      
Congratulations on shipping!
66
r00fus 2 days ago 0 replies      
I like how qotoqot.com shows up as "productive"
67
maknz 2 days ago 0 replies      
Damn this is good. I'll be buying.
68
imron 2 days ago 1 reply      
Any possibility of Mavericks support?
19
W^X now mandatory in OpenBSD undeadly.org
227 points by fcambus  1 day ago   60 comments top 9
1
byuu 1 day ago 4 replies      
I've always been in favor of all OpenBSD security enhancements I've seen, but I have to say, and please hear me out, this is an objectively terrible idea.

Yes, most programs should disallow W|X by default. But trying to banish the entire practice with a mount flag, knowing full well that few people will go that far to run a W|X application, is bad practice. I'd rather see this as another specialty chmod flag à la SUID, SGID, etc., or something along those lines. One shouldn't have to enable filesystem-wide W|X just to run one application.

The thing is, when you actually do need W|X, there is no simple workaround. Many emulators and JITs need to dynamically recompile instructions to native machine code to achieve acceptable performance (emulating a 3GHz processor is just not going to happen with an interpreter). For a particularly busy dynamic recompiler, having to constantly call mprotect to toggle page flags between writable-but-not-executable and executable-but-not-writable will hurt performance too much, since each toggle is a syscall requiring a kernel-level transition.

We also have app stores banning the use of this technique. This is a very troubling trend; it is throwing the baby out with the bathwater.

EDIT: tj responded to me on Twitter: "the per-mountpoint idea is just an initial method; it'll be refined as time goes on. i think per-binary w^x is in the pipeline." -- that will not only resolve my concerns, but in fact would be my ideal design to balance security and performance.
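To make the W->X toggle concrete, here is a minimal sketch of what a W^X-respecting JIT has to do, in Python via ctypes for brevity: emit code into a writable page, then mprotect it to read+execute before calling into it. This is illustrative only (real recompilers batch pages and do this in C), it assumes x86-64 Linux or similar, and the PROT constants are the common Unix values:

```python
import ctypes
import mmap
import platform

def jit_const42():
    """Emit `mov eax, 42; ret`, flip the page from W to X, and call it."""
    code = b"\xb8\x2a\x00\x00\x00\xc3"  # x86-64: mov eax, 42; ret
    # Start with a writable (non-executable) anonymous page and emit into it.
    buf = mmap.mmap(-1, mmap.PAGESIZE, prot=mmap.PROT_READ | mmap.PROT_WRITE)
    buf.write(code)
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
    # The W^X toggle byuu is talking about: one syscall per transition.
    PROT_READ, PROT_EXEC = 1, 4
    libc = ctypes.CDLL(None)
    assert libc.mprotect(ctypes.c_void_p(addr), mmap.PAGESIZE, PROT_READ | PROT_EXEC) == 0
    return ctypes.CFUNCTYPE(ctypes.c_int)(addr)()

if platform.machine() in ("x86_64", "AMD64"):
    print(jit_const42())  # 42
```

A recompiler that regenerates code constantly pays that mprotect syscall over and over, which is the performance cost the comment objects to.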

2
cranium 1 day ago 2 replies      
For those heading into the comments to know what this is about: W^X is a protection policy on memory with the effect that every page in memory can either be written or executed but not both simultaneously (Write XOR eXecute). It can prevent, for example, some buffer overflow attacks.
3
sillysaurus3 1 day ago 5 replies      
This paper's thesis is that W^X does not work, and not because of any of the reasons presented in this thread: https://cseweb.ucsd.edu/~hovav/dist/geometry.pdf

The paper says that to bypass W^X protection, you can simply scan an executable for "the instruction you want to use, followed by a RET". The paper calls these "gadgets."

You can compose any computation you want out of these gadgets: an exploit chains their addresses on the stack, and each gadget executes its instruction, then its RET pops the address of the next gadget. This allows you to write arbitrary functions, since real-world programs are large enough that they contain a massive number of gadgets to choose from.

Can someone provide a counterargument?
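The gadget hunt the paper describes is mechanically simple; here is a toy sketch that just looks for byte sequences ending in the x86 RET opcode (0xc3), without a real disassembler:

```python
def find_gadgets(blob, max_len=4):
    """Return (offset, bytes) candidates: sequences of up to `max_len`
    bytes that end in RET (0xc3). A real ROP tool would disassemble each
    candidate to check it decodes to useful instructions."""
    gadgets = []
    for i, byte in enumerate(blob):
        if byte != 0xC3:
            continue
        for back in range(1, max_len + 1):
            if i - back >= 0:
                gadgets.append((i - back, bytes(blob[i - back : i + 1])))
    return gadgets

# A `pop rax; ret` (58 c3) hiding inside padding bytes.
blob = b"\x90\x90\x58\xc3\x90"
print(find_gadgets(blob, max_len=2))
```

Because x86 instructions are variable-length and need no alignment, even bytes the compiler never intended as a RET can serve as one, which is why large binaries yield so many gadgets. (The usual counterargument is that W^X was never meant to stop ROP by itself; it removes the easier attack of injecting new code, and is combined with ASLR and similar mitigations against gadget reuse.)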

4
jtchang 1 day ago 2 replies      
Does this mean that, to successfully exploit a program, I need to write to an area of memory that the program will later flip to "execute"?
5
nightcracker 1 day ago 2 replies      
What about JIT compilation and other forms of code generation?
6
bch 1 day ago 2 replies      
NetBSD is going through some similar security moves currently (extending PaX[0]), and IIUC there are special considerations required for Java/the JVM because of the bytecode JIT process. Does anybody know if my understanding is correct (that a page will have to be both writable and executable), and if so, what are OpenBSD's considerations for this?

[0] http://mail-index.netbsd.org/current-users/2016/05/15/msg029...

7
malkia 1 day ago 0 replies      
I dunno why, but this quote from Benjamin Franklin came to my mind: "Those who surrender freedom for security will not have, nor do they deserve, either one."

i'm just kiddin ;)

8
fithisux 21 hours ago 1 reply      
Can someone provide an introduction for dummies like me?
9
anfroid555 1 day ago 0 replies      
Anyone know if Erlang is good?
20
What Does It Mean to Be Poor in Germany? spiegel.de
262 points by nkurz  9 hours ago   154 comments top 11
1
zinssmeister 7 hours ago 9 replies      
I read the German version of this article a few days ago, and paired with my own experience (I was born and raised in Germany and lived there until the age of 25) I came to two conclusions: a.) being poor in Germany is much, much better than being poor in most other countries (incl. the United States, where I reside now), and b.) when you are poor in Germany you can drastically improve your cash flow by not living in one of the expensive cities. Living out in the country in a small village will let you stretch your welfare checks compared to living in Munich, where most things cost much more and most people around you have much more.

I also think Germany has a unique opportunity here to tackle the problem of social mobility and could improve the way it deploys the welfare budget. For example, supporting a young family with many children before it faces sliding into "poor status" because one parent stays home to take care of the family. The article did a great job highlighting three interesting situations and their challenges/opportunities.

2
claudiug 7 hours ago 2 replies      
we are living in a world where, if you are unlucky enough not to have money or a proper job, people will see you as a lazy, cheap, dirty human.

I guess maybe universal salaries, free school, and free medicine would let our societies engage with what is important in life: being happy, spending time with our families and friends.

When I was young, I remember clearly the days when I was starving, and my parents trying and fighting to provide some food. Now I have lived the first 30 years of my life and realize that nothing has changed. We are working for money and dreams that are fake.

In Berlin, the capital, you see more people who are searching for food and empty bottles and are on Hartz IV, and people look at them as if they were dead animals. In Berlin there are poor Eastern European people who sleep on the bridges and in parks. What do they do? Ask for money.

Being poor in Germany is, I guess, better than being poor in Bulgaria or Romania, but being poor in any country, and staying ignorant about it, makes us damn idiots.

3
theoneone 7 hours ago 1 reply      
Try being poor in Greece: no chance of a proper job (people with Ph.D.s and degrees work in cafes), no welfare, no proper healthcare (public hospitals are overcrowded and have no staff), etc. I think the more advanced EU countries are poor-friendly and really help people with bad luck get back on their feet.
4
woodpanel 2 hours ago 1 reply      
As has already been noted by others: those 3 examples are atypical for Germany. The authors could've asked the more typical poor about their backstory (the white trash, immigrants, street kids, drunks and junkies). I can only assume they wanted to portray poor people to whom "middle class" people could relate.

Which at least for me touches interesting points:

1) I think that Germans do not relate to their typical poor - at least not as much as they'd like to think they do. To me this is connected to the welfare system: it effectively does everything to keep middle-class people from a hard landing while doing next to nothing to help the poorest compete for better places on the social ladder.

This makes sense from a political game-theory standpoint: the vast majority of Germans aren't poor; at most they feel they are struggling. How can a democracy prevent its own sovereign (the majority of voters) from polluting the country's institutions for their own interests?

2) This "welfare focus" comes at the price of low social mobility: a welfare system aimed at the majority of people generates huge costs, which the poorest also have to compensate for (i.e. it becomes harder to find employment, and even harder to become self-employed).

3) This "focus" is also reflected in the political debates about welfare reform: they are usually dominated by concern for a soft landing rather than for upward mobility. The massive welfare reforms of Chancellor Schröder in the late 1990s were thus mostly criticized by groups fearing to become poor themselves, rather than by people advocating for the poor or by the poor themselves. In fact, for the majority of the long-term jobless, those reforms netted them more cash and a simpler way to get it.

PS:

4) Being poor within a society is always bad, no matter how high the average standard of living is. It's kind of sad to see the compassion those 3 examples deserve degraded just because they are "poor in Germany". (In fact, if I were young I'd rather be poor in a country with a lower standard of living if it had higher social mobility.)

5
tacon 6 hours ago 0 replies      
That was an oddly confusing segment about Huber. Only in the third-to-last paragraph of the long story are we (finally) told that "He's chronically ill", though I suppose one is supposed to decode his health issues from the earlier "By that point, he had already been unable to work for two years and was on welfare." What are the opportunities for independent IT contractors, as he apparently was, in Germany today? Isn't there strong demand for IT skills in Germany, at a salary well above welfare?
7
tptacek 6 hours ago 3 replies      
From the article:

But at the beginning of the 1990s, his life came crashing down around him. One of his main customers stopped paying. He fought for years to get the unpaid money and restructured his debt to get seed capital for his new business plan. He fought desperately -- and ultimately in vain. In 1997, he was forced to capitulate. By that point, he had already been unable to work for two years and was on welfare. He lost his family home in a foreclosure and his pension and retirement insurance plan was seized. His landlord evicted him. Manfred Huber was ruined.

How does bankruptcy law work in Germany? This kind of catastrophe is what BK law is supposed to prevent.

8
stesch 6 hours ago 0 replies      
From my workplace (in Germany) I can see the trash containers of a supermarket. And from time to time I see a senior citizen getting some food there.
9
dnautics 4 hours ago 0 replies      
All three interviewed were poor and German. What is it like to be poor and "not German" - say, poor and a Turkish immigrant? Or a child of Turkish immigrants?
10
x0x0 7 hours ago 9 replies      
This may be a very American perspective, but I utterly fail at any sympathy towards the parents who had two kids and were doing fine, then decided to have two more and are whining that someone else didn't step up to pay for them. I mean, it's not as if babies being up all night is a predictable part of having a baby or anything. The mother whines that kids shouldn't be like a Mercedes, where you decide beforehand if you can afford it. But that's exactly what kids are like. And if only there were some way (algebra) to know that per-capita income would shrink with each child! Hell, you may even think the country should pay for unlimited kids, but they couldn't be bothered to check whether Germany does before having the 3rd and 4th child. They are irresponsible parents.
11
thr12331 7 hours ago 4 replies      
It's funny that refugees get more money than German pensioners.
21
Peter Thiel, Tech Billionaire, Reveals Secret War with Gawker nytimes.com
256 points by uptown  3 days ago   319 comments top 32
1
pkinsky 3 days ago 6 replies      
>He added: Its not for me to decide what happens to Gawker. If America rallies around Gawker and decides we want more people to be outed and more sex tapes to be posted without consent, then they will find a way to save Gawker, and I cant stop it.

An important reminder of exactly what conduct, on Gawker's part, people are defending. They have no moral high ground, noble principles, or higher purpose. They're scum, and it's pure karma that they're being destroyed by someone they outed.

2
owenversteeg 3 days ago 9 replies      
I absolutely despise Gawker. The stories they run and the stuff they do would normally mean that I would love for them to be run into the ground.

At the same time, they are the only independent online media outlet left in the world. Every other company that once prided itself on being "independent" of the large media networks owned by ancient billionaires has by now received at least hundreds of millions of dollars in investment from them.

Say what you will about the publishing of the tape (and I for one think it was a despicable action) it certainly showed Gawker's true editorial strength. They could do whatever the hell they wanted, publishing for the editorial integrity rather than to get pageviews.

The idea of every single large online media outlet being at least partially owned by a small group of media companies owned by billionaires is horrifying. BuzzFeed has a great and fascinating new editorial unit, but they have 200 million reasons not to publish anything that goes against the mainstream media.

Gawker is a horrible company, but I'm going to miss them.

[edit] Wow, this is really being run into the ground with downvotes. If you disagree, please don't hesitate to let me know why, as I'm genuinely curious. My email's in my profile if you'd prefer to contact me there. Cheers!

3
nikcub 3 days ago 3 replies      
Thiel is also apparently financing Shiva Ayyadurai's lawsuit against Gawker. He is the guy who claims to have invented email as a 14 year old. Both Techdirt[1] and Gawker[2] wrote about his nonsense claims, but Shiva and Thiel are suing Gawker.

The sex tape is a little more black and white as a moral argument, but I'd love to hear justifications for defending a man who is so clearly full of shit, in a case that, with Thiel's support, he'll likely win.

[0] http://fortune.com/2016/05/12/gawker-lawsuit-shiva-ayyadurai...

[1] https://www.techdirt.com/blog/?tag=shiva+ayyadurai

[2] http://gizmodo.com/5888702/corruption-lies-and-death-threats...

4
internaut 2 days ago 2 replies      
The only thing that should be upsetting about this outcome is that it wouldn't have happened without clandestine backing by a billionaire. The execution of justice is not in a healthy state. Arguably, Gawker would not have necessitated an intervention had the justice system been less broken.

The honest reason why some people are bellyaching about this is that they don't see Thiel/Hogan/Trump as members of their own political tribe. The rationales they come up with are retroactive justifications, because they feel they've taken a hit: part of their political tribe - Gawker - lost out. Political tribe affiliation wins out over pragmatism and logic. We all know very well that Gawker was a nest of Social Justice political advocates.

They ought to ask themselves: if Thiel were not a Trump delegate and were not wealthy but managed to accomplish the same thing, would they still have a problem with this? I am certain the answer is no. The only change I would have made personally is that I would have stayed clandestine, but then again I'm not as familiar with SV's political context.

I am much much more disturbed by the funding activities of George Soros than Peter Thiel. Abuse of power by the rich can be a genuine problem, but this wasn't an example of that.

It is time to play The Warrior Song!

https://www.youtube.com/watch?v=2Xo3fwddONA

Kill with a heart like Arctic Ice!

5
hysan 2 days ago 3 replies      
I haven't made up my mind on who is right, if there is even a right side to this situation. But I am wondering one thing:

For those who believe that Thiel's financial backing is unethical because it makes it unfair, how so? Doesn't that line of logic presume that the court system is completely ruled by money? Is that how the legal system works in America? Money wins?

edit: Not sure why I got downvoted. I'm trying to be sincere and express my disbelief. I really want to hear a well thought out answer cause I really want to understand the situation and why people just keep repeating that Thiel is being unethical.

6
Overtonwindow 2 days ago 0 replies      
I support this 100%. Just because someone calls themselves journalists, or could ever be claimed as such, does not automatically give them a pass on ruining people's lives by creating sensationalist stories whose only purpose is drawing increased web traffic. Gawker should be taken to task over its behavior and I hope they are forced out of business.
7
agd 2 days ago 0 replies      
I think the thing to be clear about is that this is not some high and mighty ethical challenge to Gawker or a defence of privacy (cough cough Palantir). This is petty revenge writ large by a billionaire.

If it was about principle why was he hiding his actions? This interview is a pure PR exercise because his involvement was revealed.

The whole story is a sorry mess.

8
madeofpalk 3 days ago 10 replies      
Josh Marshall, from John Gruber:

 It all comes down to a simple point. You may not like Gawker. They've published stories I would have been ashamed to publish. But if the extremely wealthy, under a veil of secrecy, can destroy publications they want to silence, that's a far bigger threat to freedom of the press than most of the things we commonly worry about on that front. If this is the new weapon in the arsenal of the super rich, few publications will have the resources or the death wish to scrutinize them closely.
http://daringfireball.net/linked/2016/05/25/marshall-thiel

http://talkingpointsmemo.com/edblog/a-huge-huge-deal

9
tsycho 2 days ago 1 reply      
I am conflicted about this, and see merits on both sides of the argument on Thiel's actions. So I tried to draw an analogy:

Let's say that there is some patent troll that only extorts money from small startups, who don't have the resources to fight back. Some billionaire of today, who in his past career was harassed by this patent troll, decides (both for revenge and for the greater good in his opinion) that the patent troll should die and secretly starts backing startup lawsuits against the troll.

1/ Is this a fair analogy?

2/ Is the press "special" and hence we cannot use analogies from other industries?

3/ People who disagree with Thiel's actions, would you feel similarly against the billionaire above?

10
jondubois 2 days ago 0 replies      
It's quite bourgeois of Peter Thiel to concern himself with such petty things. He should focus his ego, his money and his attention on more important matters. Nobody really cares about him and his famous friends being offended by Gawker - in fact, quite a lot of people enjoy reading that crap. Since when is Mr Thiel's emotional comfort more important than the people's right to information (albeit gossip)?

If I could be as wealthy as Peter Thiel, I wouldn't give a damn about what the media wrote about me. Maybe I would just quietly cry myself to sleep in my gold-plated bed inside my luxurious NYC penthouse.

11
brianmcconnell 2 days ago 1 reply      
Two important points here.

#1 Gawker did not out Peter Thiel. Peter Thiel outed Peter Thiel. Back when Friendster was a thing he had a public profile which featured him shirtless on a boat which clearly advertised his interest in handsome men. So on the scale from Closet Queen to Totes Obvious, he was more on the side of totes obvi. It was also the worst kept secret in San Francisco, particularly if you had any latin friends. So Owen Thomas was right in concluding that Thiel was already out when he ran his "Peter Thiel is totally gay" piece because it wasn't news to anybody.

#2 if Hulk Hogan is claiming injury and embarrassment from a sex tape, why did he make a sex tape? Yes, Gawker is muck raking trash (I just read it for the comments!), but they trade in such material. Unless I am missing something, Gawker didn't trick him into making a sex tape. Common sense would tell you that if you don't want your sex tape on the Internets, don't make a sex tape in the first place.

12
627467 2 days ago 0 replies      
Seems like the work of a self-righteous, wealthy industrialist who is trying to shut down speech he dislikes. I mean, if you're pulling a gilded-age industrialist stunt, at least go out and build your own media outlet.

Keep speech plural.

No, I don't feel like I need to justify whether I read Gawker or not, or whether I agree with Gawker as a publication or not. If you don't like one article, go and take it down if you've got legal standing. Why go nuclear?

PS. I know little about P. Thiel and his political and social stances. The impression I got of him are from headlines like: "Billionaire investor Peter Thiel's plan to pay college students to drop out..." and "Libertarian Island: A billionaire's utopia". And now "Billionaire Peter Thiel funded Hulk Hogan lawsuit to take down Gawker"... Is there a decent biography out there worth reading?

13
johansch 3 days ago 1 reply      
This is commendable. Gawker is scum.
14
mrep 3 days ago 1 reply      
Bad policy. If you don't like what they are doing based on moral grounds, then use your money and influence to educate people on why what they are doing is bad. That has a better chance of stomping out this problem for good.

Pursuing a vendetta like this by trying to sue them out of business is just playing whack-a-mole. Another company will rise out of their ashes.

15
VonGuard 3 days ago 1 reply      
As someone who wrote for Denton, I'm shocked to learn how much $ Gawker was making. He paid writers as little as $5 a post.
16
whatok 2 days ago 3 replies      
Thought experiment. Let's suppose I was previously the victim of police brutality. Later on in life, I became a billionaire and decided I wanted to do something with my money. I decide to fund every lawsuit in the country against all police departments. Is this okay?
17
billhendricksjr 3 days ago 0 replies      
Is it wrong for me to be amused at how much I dislike all parties involved in this story?
18
tryitnow 2 days ago 0 replies      
Does anybody know why Gawker thought it was such a great idea to publish the tape? And not just to publish but to keep it up despite warnings from authorities?

Is this something that is normal among tabloid outfits? I have no idea because I really don't follow tabloids.

19
cwisecarver 3 days ago 5 replies      
This is what I sent to my wife about this story earlier today. I think it's true for the public too.

"I guess people can do whatever they want with their money. For me the bottom line is that Hogan is a public figure and the definition of newsworthy isn't up to anyone other than the media to decide on. Sure, it's tacky as fuck to report on him fucking his best friend's wife and it isn't any of anyone's business, but the market can decide if gawker should be in business or not. It shouldn't be able to decide what they are or are not allowed to report so long as they're true."

It's creepy that billionaires can fund lawsuits against the media because they wrote something that the billionaire didn't like, but that's a free market. I don't think anyone that reads Jezebel or gawker or valleywag is going to stop reading it because Nick Denton published something about an ex-pro wrestler fucking a dj's wife. That's their market.

The people decided this verdict was just. I think it's ridiculous but then again Trump is probably running against Clinton for the leader of the free world so you get what you get.

20
james1071 2 days ago 0 replies      
I really don't see what the problem is.

What does it matter who funded his legal case?

21
ninv 2 days ago 0 replies      
My takeaway from this whole story: we can't get justice without a lot of money, which is sad.

Hogan/Bollea should be able to fight his case without Mr. Thiel's financial support.

22
AndrewKemendo 3 days ago 0 replies      
In case anyone from the NY Times online is reading this, they have a bug in their super fancy navigation UI:

Left or right click drag opens a new story. This would be fine, except if you want to select text it de-facto opens a new story.

23
andywood 3 days ago 0 replies      
It's really no use criticizing some random individual for using their money to buy something that's for sale. If you don't want a thing to be bought, you have to find some way to make it not for sale.
24
jgalt212 2 days ago 0 replies      
Peter Thiel is a snake because he took these actions in secret.

If he had been open from the start, then I may have a different opinion of him.

25
TaylorGood 2 days ago 0 replies      
That's about as strong of a maneuver as it gets on Peter's part. To devise such a plot, bravo.
26
dvhh 3 days ago 0 replies      
I am very very conflicted about this, do we classify Gawker network as blogs or journalism ?
27
perseusprime11 1 day ago 0 replies      
The only safe way for Thiel to come out of this is for him to double down and set up a legal defense fund for all the innocent people who get harassed by the Gawkers of the world.
28
Aelinsaar 3 days ago 0 replies      
There's always a bigger fish.
29
android521 3 days ago 2 replies      
Why does Peter Thiel support Trump? It is unimaginable.
30
nefitty 3 days ago 4 replies      
Thiel claims to despise "massive privacy violations" on the part of groups like Gawker, yet he was one of the co-founders of Palantir. These people are just narcissists who want revenge for someone splashing mud on their boots. They're not fighting for our privacy, or for the common good. Sorry buddy, you are worth more than 99.999% of us. That makes you a significant figure and thus newsworthy. Your sexual orientation gives people some insight into whether you would support a certain political party, activist groups, etc. I'm just astonished at the tone deafness of this guy's comments.
31
perseusprime11 3 days ago 0 replies      
Nothing new here. People with power, money and influence can do shady things. Facebook with trending news, Peter Thiel with funding the suit against Gawker... they are all the same.
32
Jugurtha 3 days ago 0 replies      
Beyond good and evil and morals, I think Thiel is being really gentlemanly about this. How hard would it have been to get very good looking people to seduce the people responsible (or their spouses, offspring, parents), have some wild stuff going on, get that on video, and publish it online?

Would Gawker still feel as strongly about freedom of expression when their significant others' or their daughters'/sons'/parents' legs are spread all over the internet? If that video was sent to them, would they publish something like "Watch the sex tape of the people who published Hogan's sex tape"?

I can think of many, many ways Thiel could have harmed them. He's a very smart man and I'm sure he could think of many more, but he held back, going at it in a civilized manner by only legally backing a lawsuit.

22
The Path to Rust thesquareplanet.com
278 points by adamnemecek  3 days ago   205 comments top 13
1
pimeys 2 days ago 5 replies      
In my current job, I was given the task of writing a service which should eventually handle billions of events in the fastest possible way. The language choice was left for me to decide, and I was thinking that maybe I'd do it with C++14, Scala, Go or Rust. C++ has its quirks and I'm not really enjoying its build tool of choice, CMake. Scala I can write fast, but scaling the app in Mesos would consume lots of memory; every task would take a huge slice of RAM because of the additional JVM. Go I just don't like as a language (personal taste) and I think the GC adds a bit too much pausing for this app, so I gave Rust a shot.

The first week was madness. I'm a fairly experienced developer and the Rust compiler hit me on the fingers constantly. Usually the only way out was to write the whole part again with a different architecture. By the second week I was done with my tasks, very comfortable with the language and just enjoying the tooling. Also, I was relying on tests way less because of the compiler, even less than with Scala. If it compiles, it has a big chance of working. Cargo is awesome. Let me repeat: Cargo is awesome. I also like how I can write code with Vim again, and even though for some things I need to read the Rust source, it is pretty easy to get answers if you're in trouble. In the end I wrote some integration tests with Python and I'm quite happy with them.

Now I want to write more Rust.

2
loup-vaillant 2 days ago 5 replies      
> Rust is not the most beginner-friendly language out there: the compiler is not as lenient and forgiving as that of most other languages [...], and will regularly reject your code [...]. This creates a relatively high barrier to entry [...]. In particular, Rust's "catch bugs at compile time" mentality means that you often do not see partial progress: either your program doesn't compile, or it runs and does the right thing. [...] it can make it harder to learn by doing than in other, less strict languages.

I don't see how making the type system stricter makes the language harder to learn. Maybe that's because I know another relatively paranoid type system (OCaml), but still.

A type system that rejects your code is like a teacher looking at a proof you just wrote, and tells you "this doesn't even make sense, and here's why". It may be frustrating, but this kind of feedback loop is tighter than what you would get from a REPL.

And you do see partial progress: the type errors change and occur further in the source code as you correct your program. Each error is an opportunity to fix a typo or a misconception. The distinction between a broken prototype that doesn't even compile and a working program isn't binary: when you correct a type error, your program is less broken, even though it doesn't compile yet.

3
Munksgaard 2 days ago 1 reply      
> This latter point is particularly interesting; the Rust compiler will not compile a program that has a potential race condition in it.

I feel obliged to point out that this is false. Rust prevents _data races_, but not _race conditions_. You can read more in the Rustonomicon here: https://doc.rust-lang.org/nomicon/races.html
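To make the distinction concrete, here is a minimal sketch of my own (not from the Rustonomicon): shared mutable state has to go behind something like `Arc<Mutex<_>>`; handing the spawned threads a bare `&mut` to the counter would be rejected at compile time, which is the data-race guarantee. A logical race condition, e.g. a check in one `lock()` call and an action in a second one, remains perfectly expressible.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Spawn `n` threads that each increment a shared counter once.
// The Mutex makes the read-modify-write atomic; a raw `&mut usize`
// shared across threads here would not compile (data race prevented).
fn parallel_count(n: usize) -> usize {
    let counter = Arc::new(Mutex::new(0usize));
    let handles: Vec<_> = (0..n)
        .map(|_| {
            let counter = counter.clone();
            thread::spawn(move || {
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let total = *counter.lock().unwrap();
    total
}

fn main() {
    // No data race is possible, but a higher-level race condition
    // still is: two separate lock() calls can interleave arbitrarily.
    println!("{}", parallel_count(8));
}
```

This prints `8` deterministically, precisely because each increment holds the lock for the whole read-modify-write.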

4
Animats 2 days ago 10 replies      
Well, the functional crowd won. An example expression from the parent article:

  let idx = args
      // iterate over our arguments
      .iter()
      // open each file
      .map(|fname| (fname.as_str(), fs::File::open(fname.as_str())))
      // check for errors
      .map(|(fname, f)| {
          f.and_then(|f| Ok((fname, f)))
              .expect(&format!("input file {} could not be opened", fname))
      })
      // make a buffered reader
      .map(|(fname, f)| (fname, io::BufReader::new(f)))
      // for each file
      .flat_map(|(f, file)| {
          file
              // read the lines
              .lines()
              // split into words
              .flat_map(|line| {
                  line.unwrap().split_whitespace()
                      .map(|w| w.to_string()).collect::<Vec<_>>().into_iter()
              })
              // prune duplicates
              .collect::<HashSet<_>>()
              .into_iter()
              // and emit inverted index entry
              .map(move |word| (word, f))
      })
      .fold(HashMap::new(), |mut idx, (word, f)| {
          // absorb all entries into a vector of file names per word
          idx.entry(word)
              .or_insert(Vec::new())
              .push(f);
Is there editor support for indenting this stuff?

5
0xmohit 2 days ago 0 replies      
Good to see such articles that provide an insight into various aspects of a programming language.

A couple of other beginner-friendly resources would include:

- An alternative introduction to Rust [1]

- 24 days of Rust [2]

- CIS 198: Rust Programming [3]

[1] http://words.steveklabnik.com/a-new-introduction-to-rust

[2] http://zsiciarz.github.io/24daysofrust/

[3] http://cis198-2016s.github.io/

6
joobus 2 days ago 7 replies      
I'd like to know what the author considers "systems work"; I don't consider garbage-collected languages (Go, Python) "systems" languages.
7
Jonhoo 2 days ago 0 replies      
Author of the post here. Curious that this got posted again. Was originally posted as https://news.ycombinator.com/item?id=11773332. Can the posts be merged by a mod somehow?
8
jeffdavis 2 days ago 3 replies      
I really like my experience with rust so far also, but a few caveats:

* try!() is pretty annoying

* Working effectively with C in non-lexical ways seems to involve some unstable libraries and still requires nightly rust

* Macros are safer, but can't do some things that C macros can. For instance, they are hygienic, which means you can't conjure up new identifiers. For that, you need a syntax plugin, which is very powerful but the APIs aren't stable yet. This goes to the previous point.

* A few annoyances, like warning when you don't use a struct field as "dead code". If I'm interfacing with C I probably need that struct field whether the rust compiler sees it or not, but I don't want to disable all dead code warnings for that.
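On the last point, there is a scoped workaround worth knowing (a sketch with made-up names): `#[allow(dead_code)]` can be attached to a single field or item, so the "never read" warning is silenced only there, without a crate-wide `#![allow(dead_code)]`.

```rust
// Interop-style struct: `reserved` exists only to match a C layout
// and is never read from Rust. A field-level #[allow(dead_code)]
// suppresses the warning for this one field; the lint stays enabled
// for the rest of the crate.
#[repr(C)]
struct Header {
    tag: u32,
    #[allow(dead_code)]
    reserved: u32,
}

fn tag_of(h: &Header) -> u32 {
    h.tag
}

fn main() {
    let h = Header { tag: 7, reserved: 0 };
    println!("{}", tag_of(&h));
}
```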

9
zimbatm 2 days ago 4 replies      
Is Rust ever going to re-introduce the N:M threading model? For services which need to handle 1M connections, system threads are too expensive, and mio brings back callback hell.
10
georgewsinger 2 days ago 2 replies      
If I'm not hacking on something super low-level, like hardware or an OS, then should I still try Rust? Why not stay within super high-level/expressive programming languages like Haskell/clojure?

I ask because a lot of extremely smart people I know like Rust.

11
Mihies 2 days ago 4 replies      
One thing I am missing is dependency injection/IoC. How does one effectively unit test without it?
12
shmerl 2 days ago 0 replies      
This is also a pending issue: https://github.com/rust-lang/rfcs/issues/349
13
namelezz 2 days ago 3 replies      
> Once your code compiles, you'll find (at least I have) that it is much more likely to be correct (i.e., do the right thing) than if you tried to write similar C, C++, or Go code.

What issues in Go does this sentence refer to?

23
Experts say Olympics must be moved or postponed because of Zika washingtonpost.com
206 points by graeme  1 day ago   135 comments top 25
1
sago 1 day ago 6 replies      
Every day hundreds of thousands of people fly in and out of Brazil to and from all corners of the world. Rio Galeão alone handles 17m passengers per year, much more than the total number of Olympic tickets available (most of which are sold to Brazilians, and most of the attendees buy more than just one).

Nobody I can find is giving credible numbers that show the olympics will constitute a significant increase to affected areas over the year as a whole (edit: as adevine points out, below, the article puts the increase at c 0.0025), nor that travel is currently only from a handful of places worldwide.

If international travel to Rio is a public health problem, then focussing on the olympics is pure tokenism.

2
cmurf 1 day ago 16 replies      
Maybe there should be an Olympic nation. Some country just gives up the land and its claim of sovereignty over it, hands it over to the U.N. (certainly not the IOC), and we redistribute wealth without all of this stupid waste. Billions of dollars are spent with a moving Olympics and all the infrastructure ends up never being used for pretty much anything again. It's one of the most idiotic things on the planet. I like the Olympics, sorta, but how we make them work is immensely wasteful. The very possibility it would be canceled and the infrastructure never used for its intended purpose even once makes that all the more apparent.
3
MicroBerto 1 day ago 4 replies      
As a spectator and not a competitor, I'd be far more concerned about the crime in Rio.

It's tough to know what's really going on with their crime (given our sensationalist media and my not living there)... but you couldn't pay me enough money to go to the 2016 Olympics.

4
dredmorbius 1 day ago 1 reply      
I find it interesting that James Burke, in an interview some years after his Connections series, discussing how he'd continue the series, said of the jet airline that he'd explore its role as a vector of international disease transmission.

https://archive.org/details/JamesBurkeReConnections_0

5
persona 1 day ago 0 replies      
It's amazing that neither the article nor the 100+ comments takes into account that the Olympics will occur in August, which is winter in Brazil. True, Rio doesn't freeze, but historical numbers of dengue infections, for example, are at their lowest during that month, by a factor of 300 in some data. Of course one infection is one too many, but if the weather behaves as it usually does, August won't see many mosquitoes attending the Olympics.
6
habosa 1 day ago 0 replies      
It seems like there are a laundry list of reasons not to have the Olympics in Brazil this year:

 * Zika
 * President was impeached, rampant corruption farther down
 * Crime and social unrest arising from the massive amount of money spent on Olympics vs social programs
 * Reports of pollution making watersports unsafe
But the Olympics will definitely happen there, of course. Too much money already spent or planned to be spent.

7
nomercy400 1 day ago 4 replies      
113 of the 149 experts are from the US and Canada. 3 are from South America. Dunno, but it seems that a large portion of the rest of the world's experts doesn't care or doesn't see it this way.

Also, the Olympics are a billion dollar business. Billions of dollars VS faster spreading of Zika. Billions. They aren't going to postpone it.

8
dave2000 1 day ago 1 reply      
They say it's too hot in Qatar to hold the World Cup too, but you can't argue with money.
9
corybrown 1 day ago 1 reply      
Can anything really stop the worldwide spread at this point? Even without the olympics, it's already throughout Latin America, which sees plenty of travelers on aggregate
10
gyakovlev 1 day ago 0 replies      
Plague Inc. players know how Olympics helps to spread the disease.
11
belzebub 1 day ago 2 replies      
Won't someone think of the sponsors!?
12
SCAQTony 1 day ago 1 reply      
CDC: Zika Transmission Risks:

Through mosquito bites, from mother to child, through sexual contact, or through blood transfusion.

"... Anyone who lives in or travels to an area where Zika virus is found and has not already been infected with Zika virus can get it from mosquito bites. Once a person has been infected, he or she is likely to be protected from future infections. ..."

http://www.cdc.gov/zika/transmission/

13
jlg23 1 day ago 0 replies      
I'd like to know how reliable the data on Rio is. Are there more Zika infections than in the poor north eastern part of Brazil or are (suspected) cases just much more like to be reported due to easier access to qualified medical care? Is distribution in Rio uniform or does it mostly concern very poor areas which are very unlikely to be visited by tourists anyway?

NB: The dengue data they refer to[1] shows a sharp decline in infections from May on, at the end of the wet season (obviously inferring from 2015 data).

[1] http://www.rio.rj.gov.br/dlstatic/10112/5880996/4153672/deng...

14
codecamper 19 hours ago 0 replies      
Portugal has strong links to Brazil, as they both speak the same language.

Looks like this is leading to a slow rise in Zika in Portugal: http://www.reuters.com/article/health-zika-portugal-idUSKCN0...

Great.... I'm in Portugal! Dammit.

15
the_watcher 1 day ago 1 reply      
Too much money is at stake (advertisers, networks, etc) for the Olympics to be postponed, but if they move it, I wonder if there is anywhere in the world that could be ready to host it at that time. London deconstructed much of their infrastructure, so they're out. Beijing perhaps? Honestly, Los Angeles could probably handle it by housing the athletes at UCLA and USC and expanding the geographic range of events down to San Diego and up towards Santa Barbara.
16
bernardom 1 day ago 0 replies      
Interesting: the CDC chief says not to postpone: http://www.bbc.com/news/world-latin-america-36401150
17
abhi3 1 day ago 0 replies      
Imagine athletes from poorer states taking it back to their countries, which don't have good healthcare infrastructure, and it becoming an uncontrollable epidemic there. A very real possibility of half a million people taking the virus back to every city on the planet, and all they are saying is that they'll use mosquito repellents near the stadiums and hotels and everything will be fine.

I understand that there's a lot of money and sunk cost at stake for Brazil and IOC but their adamance over this is dangerous.

18
kaonashi 1 day ago 0 replies      
That and the coup.
19
OrthoMetaPara 18 hours ago 0 replies      
If pregnant, don't go to Brazil. If not pregnant, don't conceive child while in Brazil.

I don't really see the issue, here.

20
daodedickinson 1 day ago 0 replies      
I have no problem with the Olympics as long as we realize that it is a party, a festival, a celebration. I don't expect my parties to turn a profit. If people can agree to it while understanding the cost, fine; but I am tired of demagogues telling democracies that everything they propose will "bend the cost curve down" and result in lower taxes, higher revenues, and cheaper everything.
21
ck2 1 day ago 1 reply      
The problem with the "news" covering this is they are going to do the same thing they do with hurricanes.

They will hype it in such a way that a serious threat becomes a joke and ignored by those that see the topic being treated as clickbait.

22
tn13 1 day ago 0 replies      
Experts say a lot of things, but that does not mean we've got to listen to them. The American public education system is run by experts, remember?
23
perseusprime11 1 day ago 1 reply      
Without downvoting, please remind me again why we need Olympics in this modern day and age?
24
foota 1 day ago 0 replies      
I live in the Pacific Northwest, and for a brief moment I parsed this as the Olympic Mountains.
25
hrathi 1 day ago 3 replies      
wth, are pregnant women competing in Olympics?
24
Comparing Git Workflows atlassian.com
261 points by AJAlabs  2 days ago   99 comments top 12
1
drewg123 2 days ago 17 replies      
One of the things I hate about the traditional git workflows described there is that there is no squashing and the history is basically unusable. We have developers where I work that use our repo as a backup, and when things are merged to master, the history is littered with utter garbage commits like the following:

  "commit this before I get on the plane"
  "whoops, make this compile"
  "WTF?"

These add no benefit to history, and actually provide an impediment to bisecting (since a lot of these intermediate revisions will not even compile).

At my previous job, we used gerrit. The nice thing about gerrit from my perspective is that it kind of "hid" all of the intermediary stages in the gerrit review. So you could push all you wanted to your gerrit review fake-branch thing, and when you finally pushed to master, there would just be a nice, clean atomic change for your feature. If you needed more detailed history of the steps during review, it was there in gerrit. But master was clean, and bisectable.

Is there any other git tool or workflow which both allows people to back things up to the central git repo AND allows squashing changes down to meaningful bits, AND does not lose the history of review iterations?

2
useryMcUserface 2 days ago 2 replies      
This article has actually been around for a while. It explains things really well. But one piece of advice from me: try to choose only what is sufficient for your project and team. There's no benefit in being overequipped for a simple job.
3
zmmmmm 2 days ago 3 replies      
It amazes me how the entire software industry seems to be adapting its workflows around the necessity of making Git usable. While there are certainly other positive attributes about some of these workflows, the main reason people use them in my experience is because "if you don't use workflow X you get undesirable problem Y with Git". Most of these problems simply didn't exist or were not nearly as severe with previous revision control systems, so we never needed these elaborate workflows. Now suddenly Git is considered a defacto tool, and downstream effects of using it are transforming the entire software development process.
4
zamalek 2 days ago 1 reply      
You can also evolve, basically, to each model in the order that they appear in the article.

As an example: I've been working on a new spike for the past 2 weeks with one other developer. Maybe 10 times a day we'll need something that the other person has committed, so we work against one branch (master). The workflow suits this extremely rapid iteration.

One repo has now matured to the point where developer branches make sense. We created "develop" on it as well as our own branches off that. We're not close to a v0.1 yet - but we'll be evolving to git flow the minute we want to ship.

Eventually as more devs join, we'll need the full-blown PR workflow, that also naturally stems from its predecessor.

There's a "meta-workflow" here, which implies which workflow to use.

5
nwatson 1 day ago 0 replies      
The article completely mischaracterizes Subversion workflows, making the mistake of treating a Subversion repo just like developers typically use git repos ... one-repo-per-project.

Subversion instead is a tree with projects as nodes toward the leaves, each project with its own trunk, branch, tags. It's each of these projects that corresponds to a git repo. Teams I worked on always treated each project as its own "repo" ... so the central single Subversion tree became like our 'github' or 'bitbucket' ... and one could do all the lockless branching within each project, no problem. YOU COULD BE AS NON-LINEAR IN THIS APPROACH AS YOU NEED TO BE, with full support for branching, tagging, merging, etc.

Where Subversion was much better was in supporting consolidated views of multi-project build/release environments, or in mixing sub-project code into parent-project code. Using svn:externals it was easy to put "that subproject in this place in my own project". Using git submodules and other approaches is a pain. You end up having to check out a bunch of git repos and manage your own glue.

6
EnderMB 1 day ago 0 replies      
It'd be nice to see someone collate more git workflows, and what the advantages and disadvantages of these are.

Over time, my workflow has become simpler and simpler. I've worked with some weird and wacky workflows before, which have been born from a given requirement, such as quick deployment to a number of different environments, or two separate teams working on separate parts of one codebase while maintaining separate CI workflows. Some of these workflows have seemed absolutely mental, but I've seen them several times over in different places, so there must be some kind of logic to the madness.

Different dev teams have wildly different practices, so it'd be good to acknowledge the "typical" way of doing things, and embracing the workflows that work if you need to do something out of the ordinary.

7
crispyambulance 2 days ago 0 replies      
Kudos to atlassian for bringing some much needed clarity to a confusing topic. So many people that claim mastery of git only know particular workflows and, when attempting to mentor others, just mansplain whatever they know without consideration that there are alternative valid ways of doing things.

Without a firm grasp of one's intent (workflow), learning git commands is pointless and leads to people desperately flailing out commands.

8
axelfontaine 1 day ago 0 replies      
Or you could actually practice continuous integration and let everyone work on master.

Much simpler, opens the door for feature toggles, continuous delivery and more without any merge headaches.
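A feature toggle in its smallest form is just a runtime flag guarding the unfinished code path, so half-done work can ship on master without being exposed (a minimal sketch; the flag and function names here are made up):

```shell
# a feature toggle: unfinished code lives on master but stays dark
# until the flag is flipped at runtime
render_checkout() {
  if [ "${FEATURE_NEW_CHECKOUT:-0}" = "1" ]; then
    echo "new checkout flow"
  else
    echo "old checkout flow"
  fi
}

render_checkout          # prints: old checkout flow
FEATURE_NEW_CHECKOUT=1
render_checkout          # prints: new checkout flow
```

Real systems usually read the flag from config or a toggle service rather than an environment variable, but the shape is the same: the branch happens at runtime, not in version control.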

9
sytse 2 days ago 0 replies      
I think the ideal workflow depends on the complexity you need. I've tried to write about what kind of requirements cause what kind of workflow in http://docs.gitlab.com/ee/workflow/gitlab_flow.html

What do you think?

10
Bromskloss 2 days ago 0 replies      
What was the workflow in mind when Git was designed?
11
jupp0r 2 days ago 0 replies      
Those are general development models and not specific to git.
12
kevinSuttle 2 days ago 1 reply      
Was this updated recently? This has been up for awhile.
25
Coding without Google bfilipek.com
271 points by ingve  2 days ago   199 comments top 46
1
numair 2 days ago 9 replies      
Programming is so much more fun and awesome now. Everyone has a computer, and has their computer on them 24 hours a day. Whether it's a website or a mobile app, you're able to immediately get things in the hands of your "normal" friends and see what they think.

The development side of things is also better. All of these open-sourced, internet-networked tools/libraries/languages allow you to build really powerful stuff really quickly. Yes, there is a huge quality and maintenance problem, with lots of people writing libraries they instantly abandon, but I think this just forces you to read lots of source code and become a better programmer.

Build more powerful stuff, faster, and get it in the hands of more people, faster. What a time to be a programmer. The article does a good job of noting the problem of distraction, but I think that has always been an issue, and why we have always seen anti-social traits among lots of successful developers. Tuning out the allure of being young and having mindless fun in the 90s to focus on code is no different from tuning out the allure of being young and having mindless fun today. As for older people, well, it's always been a challenge to be super-focused and dedicate time to learning new things while dealing with household/adult responsibilities.

This is a very special moment as a developer. It is like being in the business of rock 'n roll in the era in which radio and television took off, which made it possible to create music that ended up broadcast around the world within weeks. If you can stay true to the roots of the code, and avoid all of the weird get-rich-quick types that have entered the scene, you can have a lot of fun right now.

The past wasn't any better, and the future can't be predicted. The time is now.

2
Morgawr 2 days ago 4 replies      
When I was 17 (about ~10 years ago) I was hospitalized for a whole month. All I had was my linux laptop without any games (well, I did have Battle for Wesnoth which was great) and no internet connection in the hospital.

I was so bored of being bed-ridden at the hospital that I eventually started browsing the source code of a C++ game engine (Irrlicht) and using it as documentation for a small game project. During that experience I learned to read the local documentation, read the source code, and actually figure out how stuff was implemented in the engine, which quirks it had, and even how to modify and recompile it.

Simply because I was bored and I had no way to get the answers I needed off the internet.

It was a very enlightening experience, I think every developer should get to a point, at least once in their life, where they're just sitting in front of the computer with no external distractions and no internet connection, just explore what you have and make the best out of it.

3
nostrademons 2 days ago 4 replies      
It can be kind of a fun experience to try coding something substantive without an Internet connection - say, while riding Caltrain or on a plane flight. It's a very different workflow. You need to download all of your docs ahead of time, and then get used to browsing through them as the doc generator intended. If you get stuck, better dig into the source code of that library you don't know how to use. You end up thinking through your code a lot more, and being more careful and more rigorous about the library calls you use.

I've found it's a pretty handy skillset to have, but for most everyday programming, I'm happy to reach for the search box.

4
Tistel 2 days ago 6 replies      
I have been a dev for 15 years. It seems less fun now. I did hardcore C++ games for 13 of those years (Nintendo DS+Wii, 360 and PS3 games). You always had a pretty good sense of what was going on. Now I am doing web/mobile. It's just a messy swamp. Everyone's first instinct is to slap another JS library on every problem. The build breaks and I ask, what does somebslib.js do? No one knows or cares. They just keep slapping more JS on the problem until it looks like it works. No desire to have a clear mental model. No desire for efficiency. Oh well. Working on personal projects is the only super fun coding I get to do.
5
iamleppert 2 days ago 3 replies      
Wow, this really took me back!

My first real experience with coding was the NeHe OpenGL Tutorials (which are still available online ironically). At the time I was writing a visualization plugin for an early version of Winamp, and I downloaded all NeHe's tutorials and saved them offline, as well as the OpenGL reference and Visual C++ CHM help file.

I would then go down to spend time with my grandparents in Southern Ohio, without any kind of Internet access, no cell phone back in those days, and a barely Pentium laptop. I later hauled my desktop down there, but most of the time was spent out on the deck with my shitty laptop.

It was amazing.

I would just spend the entire day coding and learning, while my grandmother read her book, out in the wilderness, looking out over a picturesque lake every now and then to clear my mind and think about a problem I was having, or to ponder the next steps in what I wanted to learn or do. Especially hard problems would require a nature walk while I thought about how to best represent things in the real world and grasped 3D graphics concepts for the first time.

In between breaks we would take turns getting up and making tea for one another, and every now and then I would make something cool, or learn something for the first time, and excitedly have to show my grandmother, who would immediately tell me I was "so smart" and shake her head in amazement of "modern technology".

Towards the end my grandfather put up speakers and we listened to old 50's and 60's big bands out on that deck during these trips. I think more than anything these were the shared experiences that got me into coding and really opened my mind up to the possibilities.

These days, this kind of situation and environment is hard to replicate. Things are just so unnecessarily complex and now all our tools expect an Internet connection to work. It makes me sad that today's generation will probably never know the kind of experience I had. There is something about not having the Internet at your fingertips that forces you to reason through things on your own and find your own powerful source of creativity, rather than the rote search-and-download of someone else's code that we all do now.

6
pjmlp 2 days ago 2 replies      
I started coding in the 80's, when all I had was access to the BASIC programming manual that came with the Timex 2068.

Thankfully I also got some programming books shortly thereafter, which introduced me to Z80 Assembly.

Back then these books for home micros were targeted to children so you had cartoons and funny drawings explaining programming concepts.

http://www.misteriojuvenil.info/detalhes.php?id=3422

Since those descriptions are in Portuguese, the Atari archives are a better example for an international audience,

http://www.atariarchives.org/

Other than that, we had to get listings in Crash, Your Sinclair, Micro-Hobby, Micromania, Input, Spooler. Most of them we had to fix before they could run, because there were always the usual typographic errors breaking the code.

Especially painful when typing in hexdumps of assembly for entry into monitor applications.

Then those of us lucky enough to live close by to a library with computing books, also got to hold some of them. Or get to meet others to share our ideas.

When I managed to get online to the local district BBS, I was already 18 years old, and one could hardly use it, because of how expensive it was, and it only allowed between 5 and 10 simultaneous connections.

We were forced to think out of the box and figure out the solution to a given problem on our own, which led to very creative ideas.

Especially in the demoscene community, which was a great experience back then.

Nowadays, we just copy-paste.....

7
aavotins 2 days ago 2 replies      
Maybe programming with Google by my side has taken away some of the romance I associated with programming, but it has certainly made me a better and more productive programmer.

I found a Zip drive (anybody remember those?) with lots and lots of my early source code, circa 2004, when I programmed day and night, because it was so much fun to learn. There was no Stack Overflow back then and even Googling yielded fewer results. Unfortunately, some of the creative solutions I came up with back then make me cringe right now.

Being able to consult the knowledge and experience of people much, much smarter and wiser than me has made me more productive. I don't waste time solving the same problems again and again; I can find tested and efficient solutions to problems quickly. I am not paid to implement quicksort and then test it extensively to match already existing solutions that are blazing fast, I am paid to do real, practical work. The less boilerplate I have to write, the better.

Just my 2 cents.

8
dvirsky 2 days ago 3 replies      
When I started coding 30 years ago, I didn't have internet, and being a kid living in a remote rural area where even the nearest bookstore is a long drive away - virtually unreachable for a 12 year old - I hardly had any resources available at all. It was both frustrating and exciting.

Every book or tool you could get your hands on was a treasure; and my friends and I, developing primitive adventure games, had to reinvent some wheels in pretty lame ways, like image compression for our backgrounds, or drawing of sprites onto a background.

I once got my hands on an 8086 assembly manual, and what a gem that was! But alas, we had no assembler or C compiler, so we resorted to creating blank files and using DOS's DEBUG.EXE to tweak the assembly from all zeros to whatever we wanted.

It was really exciting although not very "productive". Then one of the guys got a modem and the BBS world opened up to us, and it was never the same again.

9
tluyben2 2 days ago 1 reply      
My first computer (not a PC) I got at the beginning of the 80s, when I was 8, and there were only BBSs with, at least in my country, not much programming info (pirated software, images, stories, text games, porn). I had a BASIC book, an assembly book and a C book, all second hand, and I bought the occasional magazine and listed/disassembled existing software I typed over from mags or downloaded. That was all that was needed: the rest you had to make up by reading the manual or by experimentation.

For my current hobby projects I use things like C, Forth, Lua: languages and libraries I can keep in my brain, so I don't need the internet to write things that work. Compared to commercial work, it is especially the fast-changing things (often changing for no apparent reason), like JS projects and/or some web/app frameworks, that really require Google. Certainly the "best practice" of using a library for everything, even when it is unstable or rapidly changing, is a pain. It is often much faster to write things yourself without having to debug yet another library, but you keep thinking you are wasting time and fall for it. With no internet available you have to find a solution yourself. Using a lib is usually the better solution, but just writing things without Google feels better to me.

10
hoodoof 2 days ago 1 reply      
In current times you can build vastly more powerful software than you could in "the day", but you can't possibly remember how to drive it all.

Constant reference to Google and StackOverflow is unavoidable unless you are building something within a very constrained technology.

11
marcus_holmes 2 days ago 3 replies      
I find coding offline more constructive - my most productive coding sessions are on the train. Getting to focus on what I'm trying to achieve without getting distracted is important and useful.

But debugging is impossible without the internet, I find. Working out what an error means from just the error description and the source code of the library is bloody difficult. Googling the error message gets better results immediately. A friend and I were talking about this: maybe we should just use UUIDs for error messages and write the error description on Stack Overflow, since that's where they'll end up anyway.
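Half-joking or not, the scheme is easy to sketch: give each error site a fixed, greppable ID so the message itself becomes searchable (the ID and message below are invented):

```shell
# each error site carries a stable ID; the prose explaining the error
# can then live anywhere a search engine can find it
fail() { echo "ERROR [$1] $2" >&2; return 1; }

fail "e4f1a2c9" "could not parse config" || true
# stderr: ERROR [e4f1a2c9] could not parse config
```

The same idea shows up in real tools as stable error codes (compiler diagnostics like `E0308`, database SQLSTATEs), which survive rewording of the message text.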

12
gwbas1c 2 days ago 0 replies      
One of the advantages of Google is that it fills in where technical documentation omits critical details. How often do you look at something on MSDN, or Apple's docs, and just scratch your head? Then a quick Google search fills in the missing information!

Back in the 90s, I couldn't just go Google the ambiguous areas of Borland's documentation. As a result, I probably made lots of silly novice mistakes that a novice today won't make.

13
tonyle 2 days ago 0 replies      
Google is too useful not to use, but you shouldn't have to do a Google search because you don't remember a command or basic functionality of a language or library. That information should be readily available in the documentation. Google should be used for troubleshooting or researching a new concept.

http://devdocs.io/offline is a great resource if you want to code offline.
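devdocs aside, most toolchains already bundle exactly this kind of offline reference. Python's pydoc, for instance, renders documentation straight from the local installation with no network (assuming a local Python 3 install):

```shell
# module-level documentation, served entirely from disk
python3 -m pydoc tempfile | head -n 3

# or look up a single function directly
python3 -m pydoc tempfile.mkdtemp
```

Similar built-in lookups exist elsewhere: `man` pages for C and Unix tools, `ri` for Ruby, `go doc` for Go.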

14
Bahamut 2 days ago 0 replies      
I have a slightly different perspective as someone who learned how to code in recent times (about 4 years ago) with the rich amount of resources on the web.

I do use Google & the internet generally a fair amount, largely for referring to APIs, but for various people's approaches to solving certain problems on occasion.

The two dangers of relying heavily on Google, though, are losing knowledge and sometimes generating doubt in one's conclusions from a well-reasoned understanding of the problem, and encouraging inefficiency when a little harder thought at a problem could solve it much faster, given sufficient knowledge.

My opinion is that there are pros and cons to having the vast resources of the internet, and while it has mostly been a net positive, we should guard against the weaknesses that come from being too heavily reliant on it.

15
mobiuscog 2 days ago 0 replies      
I think part of the problem these days is expectations.

Many years back, drawing a simple vector drawing on the screen was amazing... now it's a single command, and people expect HDR 3D rendered images otherwise it's not worth doing.

Life is busier, more 'impressive' and the internet shows you many people who are so advanced you shouldn't even bother.

The days of home computing were awesome in so many ways.

Then again, the opportunities are just different these days - the astonishing capabilities given to new games designers through Unreal Engine / Unity, etc.

The ability to show your work to so many people and not need computer 'friends' that live nearby.

It's just different.

It is a shame that so many people don't read so much to gain knowledge but instead mentally cut/paste examples to build something quickly.

16
ino 2 days ago 0 replies      
I learned programming alone with qBasic when I was 14 without internet.

By luck my father had left an icon for qBasic on his Win 3.1 computer, and it had an interface similar to EDIT, which I knew very well; I opened a file and realised the code made it go.

qBasic had good and complete help with all the functions and examples. I loved it, and it helped me learn English and BASIC.

I remember playing football with friends and an idea came up to solve a problem I had at the time and I stopped playing to write the code in my school notebook. When I came home, it worked.

The internet has many dogmas and principles that are being repeated and learned by novices without understanding why by going through it themselves. Sometimes a singleton is the right thing to use, for example.

17
geff82 2 days ago 0 replies      
I often find myself contemplating to move my information consumption back to a purely paper based approach as long as this is still possible. After all, good newspapers and magazines do a good job at curating the most important information for me. The fact that this way, "news" arrive with a little delay at my desk, doesn't make me less smart on a higher level. The only useful information on the net (useful in the sense: makes me money and saves time) is purely job related stuff. But the net makes it really hard to only read those parts... I still hope, I can find a good balance on the electronic world one day.
18
joemi 2 days ago 0 replies      
My problem with trying to learn programming back in the mid-90's (my first attempt at programming) was that I couldn't find any books that I could really get into and truly learn from. I knew I should be able to figure it out, but things weren't clicking. I'm sure that had to do with the quality of bookstores in rural NH at the time, but still, it was horribly frustrating to drop a bunch of cash on a few books that seemed pretty good in the store only to get home and realize that after the first chapter or two I felt completely lost and hopeless.

Fast-forward to my second attempt to learn programming ten years later in the mid 00's, and between the sheer amount of tutorials/ebooks/other random things online, I was able to pick it up very quickly, the way I always figured I should be able to. Part of that was definitely the shift to higher abstraction levels for a lot of things, but I firmly believe a large part was that I was able to much more easily find instruction that worked for me, rather than having to choose from just a few limited options.

Because of that, I have no real fond memories of programming before google, and I owe all of my ability to the programming-with-google world. (That said, I have extremely fond memories of the communities on the BBS's I frequented back in the day!)

19
markbnj 2 days ago 0 replies      
I was one of those guys in 1990... heck, I was one of those guys in 1983. But I don't really romanticize it. I still have three shelves full of books, including some classics like the 3-vol. Knuth, Abrash's Zen books, the ARM of course, as well as the C++ essentials (Lippman, Meyers, Eckel, etc.), the Gang of Four patterns book, and quite a few others. I spent thousands on books, but I haven't bought a book on programming in something like ten years now. I once paid $140+ for an IBM manual on programming the PC VGA chip. I'm sure I still have it in a box downstairs. I also paid $700 for a 9600 baud modem, and regularly forked over $400+ a month to CompuServe, where I spent time with other geeks on the Computer Language and Dr. Dobb's Journal forums. Interesting times.

It's worth noting that one of the reasons search is in general so important these days is there are so many more things we have to know. If you weren't actually working in those days then you don't realize how comparatively simple it was. There was more complexity on a micro scale, because we were writing compiled code w/o the benefit of all the high level abstractions available now. But on a macro level things were much, much simpler.

20
pavlov 2 days ago 0 replies      
I feel that programming is fundamentally similar to other writing in that there are more and less research-oriented "subtypes" of writing and writers.

If a journalist is writing a 1-column newspaper story about how the European Central Bank's latest meeting affects banks, that's clearly a research-oriented piece. Probably she will spend more time on discovery of facts and stakeholder opinions rather than actual writing.

On the other hand, a writer working on a short story may do very little research. Fiction generally builds on life experience -- it's very hard to write a heartfelt story on a topic that you have to constantly google up.

Something similar happens in software, at least to me. There's some coding that requires constant online research and browsing... But then there's the other kind where I know my tools already, I know the project is possible based on previous experience, and I can just sit down and start writing. IDE autocompletion and one-click access to relevant API headers often ensures that I don't need online help for anything.

I like the latter kind much more. Research-oriented coding makes me feel like I'm some kind of junior API lawyer. The other kind is more like making a painting: you're slowly building up something that might be good but you also have to accept that it will look like shit at many stages. That solitary exploration is the reward to me in both painting and programming.

21
lanevorockz 2 days ago 0 replies      
A bit surprised by the view of code Googling; imho it boils down to deep vs. shallow knowledge. Even though results are quicker when googling, you will never find optimal solutions for your problem, or even necessarily accurate ones. It was a bit more painful before, but the diversity of solutions was immense and solutions were optimized for the problem at hand. It's a good thing to make programming more accessible, but it's not necessarily the way to go for all programmers.
22
zwischenzug 2 days ago 0 replies      
I find this interesting, as I recently completed a book (1).

My co-author (much younger and smarter than me) mentioned while we were in the middle of writing it that he'd 'never read a computer science book'.

I had to reflect on how much had changed in 15 years of software development. When I started I was using Dogpile to search for 'documents' on the 'net (which had an apostrophe then) while balancing a 'learn C++ in 21 days' book on my knee.

So 'coding without books' (and coding well) is already more than possible.

1) https://www.manning.com/books/docker-in-practice, since you ask.

23
rmtew 2 days ago 0 replies      
When I programmed in university pre-2000, I used gopher in place of google.

Applicable matches were limited to source code file search, which gave examples of function calling out in the wild. There were no books of use available, and the documentation was often vague and incomplete man pages.

24
joneil 2 days ago 0 replies      
This reminds me of Derek Sivers' post about memorizing a programming language[0]. If I start working in a new language or new framework full time, I find it really helpful to be deliberate about learning the things I'll use regularly, to reduce the dependence on Google for common API calls etc.

Like others have said I've found that programming on a flight or long train ride is super productive, especially for starting a new feature where I'm writing fresh code (as opposed to debugging existing code). For some reason I've rarely translated that into deliberately using "flight mode" when I need to focus.

[0]: https://sivers.org/srs

25
bechampion 2 days ago 0 replies      
Wouldn't it be a great idea to have a vlog of someone coding without internet at all, talking about the pros and cons?

Once I moved to a small town, and since I didn't have much money for proper broadband, I had 2 hours of dial-up every day.

I remember using that time much more efficiently and trying to download PDFs and such for the following hours; a lot of the things I read in books/PDFs then I still remember today, after 10 years (basic Unix IPC and others).

Some of the things I read online yesterday I don't remember today, ha!

26
asakurasol 2 days ago 0 replies      
I semi-experienced this last year when I was teaching coding in prison. For obvious reasons prisoners weren't allowed to have internet access, their learning entirely depended on outdated learning manuals and tutors.

It was hard, though it became clear very quickly which books were actually written for beginners even though all of them said "introduction to blah blah" or "foo bar for beginners".

27
rukuu001 2 days ago 0 replies      
God, I remember hunting up the various Inside Macintosh books in the mid-90s. I finally found some in the library of a tech college.

"No, we don't even let our students take those out of the library."

So I made hand-written notes, went home and tried to use them. It was slow work and I didn't get very far.

I much prefer Google. I just read whatever I find with a much more critical eye.

28
groovecoder 2 days ago 0 replies      
To fight distractions online and stay in a coding zone, I use RescueTime and set limits on time spent on distracting sites. Like hacker news. ;)
29
therealmarv 2 days ago 1 reply      
Just download Dash https://kapeli.com/dash and make all your coding docs available offline; with that program you can even download an offline version of Stack Overflow. It's really impressive for coding/working without an internet connection. UPDATE: Did not know about devdocs.io. Seems like an open-source version of Dash.
30
alexroan 2 days ago 0 replies      
I felt like I was doing a degree in googling at one stage, but I don't believe the speed and complexity of development nowadays could be matched in the pre-broadband days. Communities can be brought together on sites like Stack Overflow, and people who aren't experts in certain fields can just google it and implement something in minutes that, pre-Google, would have taken much, much longer to find and then learn.
31
Ensorceled 2 days ago 0 replies      
All the people romanticizing the good old days need to write CAD/CAM software in Fortran IV on an IBM 360 with nothing but the five thousand opaque IBM reference binders and a couple of college texts on computer graphics. To a deadline.

The real "joy" was posting a problem on the comp.lang.c and praying Dan Bernstein or Henry Spencer was online and would answer sometime in the next day or so ...

32
nurettin 2 days ago 0 replies      
>I could focus better on the ideas and on the code. Now, with so many distractions you need to be more resistant and self disciplined.

I think it is about the task at hand. When I try drawing 2D spaceships and adding keyboard controls, it is a flowing experience no matter what tools I use.

When I start solving bugs in compression libraries or debugging million line code bases, it is all pain, frustration and google.

33
xufi 2 days ago 0 replies      
This reminds me of the time when I first started getting more into computers around age 13/14 and learned HTML when I had limited-bandwidth dialup. I had to push myself to test things by going over to someone else's house, and then tried my best to get the concepts down.
34
tacone 2 days ago 0 replies      
> I believe that offline experience that I had in the past was a good thing. I could focus better on the ideas and on the code. Now, with so many distractions you need to be more resistant and self disciplined.

More disciplined === better programmer. The Internet is doing it right.

35
KennyCason 2 days ago 0 replies      
"I often downloaded pages with tutorials, so that I could read them offline later." - So true.
36
shams93 2 days ago 0 replies      
I got into Java pre-Google. My Mac didn't have enough disk space for CodeWarrior, but the JDK was tiny enough to fit on my Mac Performa with its 250 MB hard drive. I wrote my thesis project in Java 1.0. Java was horrifically slow back then, but the JDK was really small.
37
dmh2000 1 day ago 0 replies      
I remember back in the day when you had to order paper reference books and it took days to weeks to get them, or for the supposedly free ones you were denied because your need/qualifications were somehow not right. And you simply could not proceed without them.
38
cmrdporcupine 2 days ago 0 replies      
Coding with Google at your side is great, until you have to interview on a whiteboard without access to Google... while interviewing at Google.
40
hyperpallium 2 days ago 0 replies      
A huge difference between programming before Google and now is the explosion in libraries/frameworks/APIs, with arbitrary undocumented "features" requiring obscure setup and workarounds. Google gets you to actual programming faster.
41
tunnuz 2 days ago 0 replies      
Point made, I'm reading this instead of coding right now.
42
gkanai 2 days ago 0 replies      
Come to China. No Google here!

(You can get to Google with a VPN but...)

43
thallukrish 2 days ago 0 replies      
Actually there is a difference between gluing pieces of code and making it work and writing algorithms that solve hard computer science problems.
44
known 2 days ago 0 replies      
Good coders code, great - reuse. :)
45
pmontra 2 days ago 1 reply      
I learned programming on a Sinclair ZX81 with the manual of that computer. I think it was called Timex in the USA. There wasn't much that could be done on that computer so the manual was more than enough. Exception: machine code, which needed another book. With two reference books I could do basically everything.

Fast forward to today. I would need a reference book for every language I use, possibly more than one per language (core language, standard libraries, etc), plus one reference for every library (jars, gems, node modules, ...). This is both inconvenient and impossible.

Inconvenient because I would need a bookshelf of, how many? one hundred books? And good luck doing it on a train.

Impossible because how could the author of Random Library XY n.0 publish a reference book for at least every major version of it? Software development would slow to a crawl (but maybe there wouldn't be JavaScript fatigue and the like).

The solution would be something close to what I did out of necessity in the early 90s, pre-web: download the manual of the program or of the library and maybe print it. I remember that reading a manual from the first line to the end meant that I really knew what that library did. Now it's more like I google for it, load the page, CTRL-F for the concept I'm looking for, try it, it works -> done, it doesn't work -> google more or look for another library. Shallow knowledge of hundreds of libraries vs deep knowledge of a handful of them.

Some languages enforce or facilitate writing documentation inside the code and distribute it. Example: Ruby gems usually install the rdoc so you have local documentation for every gem you download and can use it offline. However, getting to the rdoc for one among the dozens of gems I could be using in a project is slower than getting to its README on GitHub, which maybe is the only documentation there is. Can Maven download the javadocs for every jar it gets? Apparently yes, but I googled for it :-) [1]

Finally, looking on the ruby-lang site for that method I seldom use in the Enum class is much faster than turning pages in a book.

And this is when everything goes well. When you have errors (not the easy ones) good luck without googling the answer. You could waste days understanding what's going on. Either you end up with intimate knowledge of every piece of software you're using or you give up and find a job in another industry. Again, software development would slow down to a crawl.

So... there is no good alternative to googling with the exception of very short sessions when you want to be totally focused, you know very well every piece of software you're working on (maybe you use other software of yours as a reference) and there are no surprises.

[1] http://stackoverflow.com/questions/5780758/maven-always-down...

46
ry_ry 2 days ago 3 replies      
Coding without searching basically boils down to coding without docs, and that seems pretty self-defeating.

Sure - If you just can't resist Stack Overflow's dubious charms, or find yourself updating your LinkedIn profile three times a day then cutting the hardline has some merit, but they aren't inherently programming issues.

26
Harvey OS A Fresh Take on Plan 9 harvey-os.org
220 points by antonkozlov  1 day ago   67 comments top 8
1
SwellJoe 1 day ago 6 replies      
So, a funny thing happened on the way to Linux dominance of the server: We started container-izing things and focusing on building one-service per container (or VM). Suddenly the OS matters a lot less; each container only needs the libs and kernel services that it needs to do its job. It doesn't need the whole OS and doesn't benefit from the whole OS being present.

I suspect there's an opportunity for a lot of alternative systems to make inroads in that space. But, then again, if the OS doesn't matter...it may be that we all just end up using the path of least resistance, which is almost always a small Linux distro with all the stuff we're used to. But, for someone that loves Plan 9 or Solaris or OpenBSD and wants to deliver service containers, they can probably get away with deploying containers using those systems without people balking at the idea.

2
stepvhen 1 day ago 1 reply      
In case anybody was wondering, the name is a reference to the Jimmy Stewart movie "Harvey"[1], in which Jimmy Stewart plays a grown man who has an imaginary best friend, a six-foot-tall rabbit.

[1]: http://www.imdb.com/title/tt0042546/

3
rcarmo 1 day ago 2 replies      
I like the fact that this is happening, with a live Git repo and all.

9front is alive, sure, but bootstrapping things atop a modern compiler and a (at least partially) Linux compatible ABI makes a lot of sense.

4
colindean 1 day ago 0 replies      
I can't think of Harvey without thinking "Crichton!"

http://vignette3.wikia.nocookie.net/farscape/images/b/bd/Har...

5
transfire 1 day ago 0 replies      
Hope this project makes real inroads!
6
TheMagicHorsey 1 day ago 1 reply      
Aren't we all supposed to dream about unikernels moving forward anyway?

If I'm just using a library operating system that is linked directly into a single unikernel that merges the application code and system code, then I don't really care if I'm running on a hypervisor, on bare metal, or a host operating system. I'm only using a few system capabilities, and I'm not really taking advantage of other services running on the OS.

So I don't really care if the host OS is Linux, or Plan 9, or BSD. It just has to be UNIX-y enough to host a VM for my unikernel.

7
hinkley 8 hours ago 0 replies      
What's a pooka?
8
gtirloni 1 day ago 1 reply      
> operating system that does away with Unix's wrinkles

What would those be?

27
Facebook will show ads to non-Facebook users on other websites wsj.com
208 points by ApplePolisher  1 day ago   134 comments top 26
1
weinzierl 1 day ago 3 replies      
> Users with a Facebook account can opt-out of the ad scheme by adjusting their settings, while non-Facebook members can opt-out through the Digital Advertising Alliance in the US, the Digital Advertising Alliance in Canada, and the European Interactive Digital Advertising Alliance in Europe.

...or by using a wide-spectrum blocker like uBlock Origin, and for the cookie-warning problem there is a solution too: "EU: Prebake - Filter Obtrusive Cookie Notices"[1]

[1] https://raw.githubusercontent.com/liamja/Prebake/master/obtr...

2
dzek69 1 day ago 2 replies      
It did years ago :) When I didn't have an account, they already knew my name, email, and telephone number, because some idiots gave them their email logins & passwords, allowing FB to read all their contacts.

I wasn't on Facebook, yet in the "hey, join FB" message they listed 6 "people I know that are already on Facebook".

So imagine how much data they got about me:

1) No account on FB
2) But FB tracked me with their cookies on almost every page, because almost every page has some FB widget (fanpage box, like buttons, etc.)
3) They know "some-random-guy-9348239849" likes to browse pages A, B, C, D, which means he likes AA, AB, BA, BB, BC, CA, DA activities
4) Then people started giving away "their" data about me. A lot of them did, so FB could connect who I know, and their connections with each other.
5) So FB got "my-name-PERSONAL_DATA", not associated with the data from point 3.
6) I got that message
7) If I'd clicked the welcoming link, I wouldn't even have needed to register - FB could then connect the "some-random-guy" data with the "my-name" data.

This probably goes even deeper, but we just don't know that.

Facebook with that message told me way too much about how much they know about unregistered users, I bet they don't do this anymore :)

3
eknkc 1 day ago 3 replies      
Surprised they weren't doing that already. Facebook has javascript injected on a lot of sites, always assumed that they'd collect a shit ton of data whether you are a user or not.

Also, ad blockers block Google Analytics-like tracking services, but social plugins are generally opt-in. I guess those should be blocked by default too.

4
vthallam 1 day ago 5 replies      
This is really a big thing, since it directly affects Google's cash cow (AdWords/DoubleClick). Facebook has all the data in the world about a user (location, personal details, current mood) that Google lacks, so they can target users on the rest of the internet in a better way.
5
userbinator 1 day ago 4 replies      
> while non-Facebook members can opt-out through the Digital Advertising Alliance in the US, the Digital Advertising Alliance in Canada, and the European Interactive Digital Advertising Alliance in Europe.

Or you can just block all requests to their domains. I have *.facebook.com and a few others blocked.
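A hosts-file sketch of the same idea (note: /etc/hosts has no wildcard support, so unlike the *.facebook.com pattern above, every subdomain has to be listed explicitly; the entries below are illustrative, not exhaustive):

```
# Illustrative /etc/hosts entries; hosts files match exact names only,
# so wildcards like *.facebook.com need an ad blocker or DNS-level tool
0.0.0.0 facebook.com
0.0.0.0 www.facebook.com
0.0.0.0 connect.facebook.net
0.0.0.0 graph.facebook.com
```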

6
Bedon292 1 day ago 1 reply      
Begins? I thought they were already tracking everyone who saw a Facebook badge anywhere on the internet. Or were they just trying to track the users who were not logged in at the time?
7
slazaro 1 day ago 2 replies      
All the more reason to block their widgets on other websites, using something like Ghostery or Disconnect.
8
andy_ppp 1 day ago 2 replies      
I'm seeing this now:

"To help personalize content, tailor and measure ads, and provide a safer experience, we use cookies. By clicking or navigating the site, you agree to allow our collection of information on and off Facebook through cookies. Learn more, including about available controls: Cookies Policy."

Am I signing a contract with Facebook by clicking on their website? Is that legal?

9
deadowl 1 day ago 0 replies      
There are entire companies built on tracking non-users around the internet.
10
DKnoll3 1 day ago 0 replies      
France already pumped the brakes on Facebook using their position of power to target non-users: http://thehackernews.com/2016/02/facebook-france.html

I wonder where this will land them in the future with France, Brazil, or other countries that aren't as "friendly" with FB as the United States.

11
DyslexicAtheist 1 day ago 0 replies      
12
iamgopal 1 day ago 0 replies      
Ads from Facebook, and many times from the Google display network, irritate me not because they demand my attention, but because they do it by tracking my behavior, and hence usually show me what I already know, have used a million times, and could write a ten-page review of. They take away the discoverability of the internet from me, and I hate it.
13
astazangasta 1 day ago 5 replies      
Again, I deplore the colossal waste of human effort and talent that is going into building a sophisticated panopticon for the noble purpose of better targeted advertising. For fuck's sake, what a waste of the best minds of our generation.
14
Jemm 1 day ago 3 replies      
127.0.0.1 facebook.com
15
MrBra 1 day ago 0 replies      
Install Disconnect or Ghostery right now and spread the word.
16
ComodoHacker 20 hours ago 0 replies      
So we must assume their shadow profiles are complete now.
17
cloudjacker 1 day ago 0 replies      
.... like every other company does?
18
cdnsteve 1 day ago 0 replies      
So Google Adwords/Adsense. That would have saved a whole article.
19
callmeed 1 day ago 0 replies      
I just added FB ads to an iOS app this morning. The app doesn't even use FB login, so I was surprised to see the sample ads were still very location-specific and targeted. They've also got web/HTML tools for placing ads on a domain.
20
joesmo 1 day ago 0 replies      
Will Mozilla finally make ad blocking the default or do we need yet another organization that can stand up to corporate pressure and money going forward to do so? That's my only question: who if anyone will bring a web browser to market that blocks ads and thereby malware by default?
21
quantum_nerd 1 day ago 0 replies      
Dear Facebook,

you can track me all over the net all you want. I am just not buying shit.

22
nxzero 1 day ago 0 replies      
Surprised Facebook didn't buy AddThis when it was up for sale; instead Oracle did.
23
theoapps 1 day ago 0 replies      
Zero ethics
24
lucb1e 1 day ago 2 replies      
They have been doing this since forever and if I remember correctly, the EU told them to stop. I don't remember if they really did stop (at least, I'm not sure if they say they did, nobody knows what they really did of course).

/me opens article

Oh, it's about showing ads to non-users, not about tracking. The article doesn't even claim they didn't track non-users before. Clickbait?

25
perseusprime11 1 day ago 0 replies      
Facebook, please stop!
26
dang 1 day ago 0 replies      
28
Growing Rift Between Valve and Oculus uploadvr.com
245 points by T-A  1 day ago   133 comments top 12
1
errantspark 1 day ago 6 replies      
Been working on VR side projects for 3 years now, I've built against the DK1, DK2, CV1 and the Vive and I have to say there's zero question in my mind that Oculus is FAR behind the curve. The Vive was love at first sight, the Oculus was quite the opposite.

The Vive is stunning, not only because room-scale VR is fantastic (I don't share the opinion that the future rests solely with room scale; there's plenty of stuff to do in VR sitting down), but also because Valve just seems to have it together more than Oculus. The Lighthouse system is brilliant; it's a much more elegant solution than what Oculus has. The Oculus platform is gross: I don't see any advantages to using it from the consumer side, and as far as I can tell it only exists to lock people into a particular ecosystem.

I understand why they did it, similarly to Origin or the Epic Games Launcher, but seriously? Steam won. I find it incredibly frustrating having to futz around with other platforms. Do they seriously think that I'm going to add all my Steam friends on the Oculus platform? That's ridiculous. I'm very doubtful that the platforms will get a userbase outside of the people who are FORCED to use them because they want a particular exclusive.

To top it all off it takes 3 USB ports to run the Oculus (4 if you want the controllers) vs the Vive's one.

2
cma 1 day ago 2 replies      
The worst part is Oculus sold customers on something more open than they delivered. Now if you buy from their store, your purchases are locked to their headset, as is anything else you ever buy in the future.

Oculus/Palmer said they didn't care if you modded their games to work on third-party headsets, just that they weren't going to provide support themselves; instead they went out and broke it intentionally.

>If customers buy a game from us, I don't care if they mod it to run on whatever they want. As I have said a million times (and counter to the current circlejerk), our goal is not to profit by locking people to only our hardware - if it was, why in the world would we be supporting GearVR and talking with other headset makers? The software we create through Oculus Studios (using a mix of internal and external developers) are exclusive to the Oculus platform, not the Rift itself. https://www.reddit.com/r/oculus/comments/3vl7qe/palmer_lucke...

>As I already said in my first reply, I don't care if people mod their games as long as they are buying them. https://www.reddit.com/r/oculus/comments/3vl7qe/palmer_lucke...

>Glad there are some sane people out there. [said to someone saying it was only an issue of support] https://www.reddit.com/r/Vive/comments/4etddh/this_is_a_hack...

3
istorical 1 day ago 4 replies      
The discussion around Vive vs Rift and Valve vs Oculus is getting more and more emotionally clouded and "good guys vs bad guys" with each passing week. People are letting their own frustration around Oculus' poorly managed launch and non-existent PR affect their perspective on the situation. The fact is that although Oculus is adopting somewhat of a walled garden approach, people entirely overlook that Valve maintains a virtual monopoly on PC games distribution. Sure they aren't as powerful as iOS app store or the Google Play store, but Valve wants the same thing as any other player - to be in a position where you can't avoid selling your content through their channel and to take a big cut of all the sales. Those who attack Oculus for trying to be the one who gets the cut are deluding themselves. Apple, Google, Valve, they all already do this. Further, Oculus doesn't make any money on hardware right now, what can a person expect them to do? Just operate without any intention of ever making a profit?

The PC gaming community online can be extremely toxic and idealistic, entirely ignoring business realities.

4
jc4p 1 day ago 4 replies      
I have an Oculus CV1, I had the DK2, and I've been making VR side-projects using their SDK for a couple years now. I'm really disillusioned about Oculus as a platform, though. I didn't even consider buying the HTC Vive because I've been riding the Oculus train for a while and had faith in them as the future of VR, but come on.

I really wish I could know how Carmack feels about this.

Sidenote: for what it's worth, I haven't used my Oculus since the first week I received it. It was supposed to work with glasses (the DK2 does) by shipping with different foam faceplates that can change how far off my face it is (my glasses fit in the Oculus, it's just that the lenses are so close to my eyes that my glasses scratch them) but they silently took that off the "What's in the box" months before shipping.

5
shmerl 1 day ago 1 reply      
What a shame. Rift started as a crowdfunded open project and ended up as a disgusting lock-in.

>Frequently secured through digital rights management (DRM) technology, this functionality is typically standard for digital download stores.

Not really. There are enough DRM-free ones. I don't care about games that are released through some exclusive DRM-infested stores, but what's more worrying is that hardware itself is probably tied to those stores. I.e. can you use Rift with games for example released through GOG?

In this sense Vive isn't better now too, since it requires SteamVR (because no one else implemented OpenVR so far).

6
ohitsdom 1 day ago 3 replies      
Really cool tech, but kind of a depressing start to this young industry.

I feel like desktop apps are making huge strides in being cross-platform, both in attention from developers and the lower development effort required thanks to a myriad of software platforms and tools. It's just sad to see games still clinging to "exclusives" as if that's a positive thing.

7
bitmapbrother 1 day ago 0 replies      
The winner will be decided by third party support. Oculus can continue with their DRM shenanigans, but in the end it's all going to be irrelevant. The third party developers will always exceed what Oculus or Vive put out in quality and quantity. And right now, according to Steam, Vive seems to have the majority of the third party developer support.
8
some-guy 1 day ago 5 replies      
Oculus / Vive / PSVR are all "consoles" in a way, and as competition becomes more and more fierce, profitability on hardware will go down over time. By staying a closed platform, Oculus can keep making money selling software on their platform, if they can deliver on their hardware.

What's different about consoles though is that, at least in Oculus and Vive's case, the software looks to be fairly compatible with one another (hack-blocking aside by Oculus). What I don't understand is, if it's trivial to run Oculus software on other hardware platforms with hacks, why wouldn't they want people to buy their software if it can run on other platforms as well? Is it brand protection? They can't make large margins on their headset forever.

9
bni 7 hours ago 0 replies      
Rift has two major advantages that made me choose it:

1. Doesn't require hanging stuff on walls; a single sensor beside the monitor is enough. For me, dedicating a room to VR is out of the question.

2. It has built in headphones. Yes they look "cheap" on pictures, but when you actually use a VR headset that goes on and off a lot you appreciate the convenience. Sound is also VERY good in them.

I don't get the blind Facebook hate and Steam fanboyism. Remember, Steam is a monopoly taking a 35% cut of PC gaming. If anything, that monopoly needs to be dealt with.

10
ben_jones 14 hours ago 0 replies      
As much as I'd like to see an "underdog" win the VR wars, I think the race to market is a very small part of the battle. Facebook will be able to market non-gaming apps much better than gaming companies, imo, and so doesn't necessarily have to be first, just eventually get close to parity (which is inevitable).
11
seanwilson 1 day ago 1 reply      
I'm not a fan of lock-in at all, but if Oculus doesn't make money selling the hardware and can't expect a high chance of sales from their store, then how are they supposed to make money? Is their whole business model based on purchases from the Oculus store?

Competition is good. I don't like how Oculus is trying to create their own locked in ecosystem but I don't like how some people only want to buy things from Steam either.

12
highCs 1 day ago 1 reply      
Got a bunch of real questions: do you know if VR is growing? Do people like the experience? How long are the user sessions?
29
How I turned Street Sharks into an online social experiment geek.com
233 points by danso  1 day ago   105 comments top 35
1
JackFr 1 day ago 4 replies      
I don't think it's the internet's fault. In 1941 a couple of friends made a few phone calls to newspaper sports desks and concocted a college football team:

http://www.nytimes.com/2016/01/16/sports/ncaafootball/the-41...

2
Dove 1 day ago 1 reply      
There was an article a couple days ago about why everyone believes Columbus proved the world was round[1], even though that isn't true[2]:

 The real myth of the medieval flat earth begins first in the eighteenth and nineteenth centuries and has two principal sources. Probably the most influential of these was the American author Washington Irving who in his fictional biography of Columbus claimed that Columbus had to fight against the Church's belief that the world was flat in order to get permission and backing for his voyage, a complete fabrication.
I see an interesting parallel between the two cases. When you're the most influential person talking about something (or the only person at all...), people who later want to learn about the subject treat you as the best available authority, even if you're a bad one. Information is copied from authorities and self-reinforcing over time, much like genes in a population. What that means is when there's a bottleneck in the number of people talking about a topic, you can see a founder effect[3].

[1]https://thonyc.wordpress.com/2016/05/25/repeat-after-me-they...

[2]That Columbus proved it, that is, not that the world is round.

[3]https://en.wikipedia.org/wiki/Founder_effect

3
projectramo 1 day ago 2 replies      
I know this is pitched as an experiment about the internet, but I think it is an even more telling experiment about the malleability of human memory.
4
ipsin 1 day ago 2 replies      
I honestly don't understand the animosity towards the author.

I consider this sort of mischief to be a healthy thing. If everything you are exposed to is true, you stop checking whether facts are true, in the same way that a TSA X-ray tech who sees ten thousand not-bombs might miss the one actual bomb.

I think of lies and disinformation as good for mental agility, and an adjunct of storytelling. Is there no space for the trickster in our lives?

5
vessenes 1 day ago 0 replies      
To those who are questioning how much truth the 'confession' has in it, you are definitely getting in the spirit of things.

The confession is sort of a buzzfeed-ish version of Umberto Eco's Foucault's Pendulum, and he digs in, at length musing on lies and truth and how they mush together in ways that are very hard to untangle.

At any rate, if you liked the essay, I recommend the book.

6
6stringmerc 1 day ago 2 replies      
In my opinion, this prank is very intricate and took a lot of effort while not actually proving the point that "history is meaningless on the internet" because the subject matter - the Street Sharks - are/were essentially forgotten by history itself. Sometimes things just naturally slide away and nobody cares. It's like, "What year did the black Power Ranger turn good?" or "What was the top selling Rock band in the years 1975-1983?" and then making up an answer because, well, it's not really pertinent stuff. There are much better cons online, I like to think...edit...even if this is some kind of meta-joke where it turns out all the fake parts are being faked...sigh...
7
conradfr 1 day ago 1 reply      
That's great. I love how myths can still be created and spread in this Internet age.

Another example : around 2005 someone made a playlist for himself of some Mars Volta b-sides and called it "A Missing Chromosome" [1]. Somehow it spread on P2P networks, was even listed on Wikipedia in the band's discography section with a back story etc.

I still have it in my collection under that name with the art etc and don't see the need to correct it.

A forum I'm in maintains a list of some fake things they put in Wikipedia. It's harder to do these days but some are still there and we joke that some parts may have found their way in some student school work.

1. http://forum.thecomatorium.com/forum/index.php?showtopic=105...

8
TeMPOraL 1 day ago 1 reply      
What surprised me the most was people willing to go along with a lie. Human memory is a pretty faulty and malleable thing, but this level of confusion as was cited at one point in the article? I think it requires someone to be either consciously lying or have utter disregard to the value of what they're saying. It's one thing to repeat a lie because you didn't know the information wasn't true (though I'd consider providing confidence estimates on information you repeat as a basic human decency); it's something else entirely to repackage the lie and sell it as something you vouch for personally. These people are bullshit amplifiers.
9
imron 1 day ago 0 replies      
Back in the early days of the IMDB, and before they got strict with verification, my brother uploaded information and credits for his high school movie assignments, which I had 'acted' in.

A Google search for my real name still returns my IMDB actors page as one of the top links.

10
drivingmenuts 1 day ago 0 replies      
It's going to be a much bigger problem in the future.

I looked at the descriptions for the three shows. Frankly, nothing seemed out-of-line because I didn't watch toy-oriented cartoons in the 90s and that subject matter falls into a genre I just don't care about. Truthfully, I suspect that those synopses pretty much describe all 90s cartoons.

For that matter, your entire article could be a social experiment, but the subject matter of Street Sharks is so far off the spectrum of things I care about that it's the first time I've ever even heard of them.

But, it does illustrate how easy it is to slip things in like that. Do it enough times with lots of small things and sooner or later, you have enough evidence for something larger and so on and so forth.

Still don't care about Street Sharks, though.

And please, for the love of <deity>, don't show this to Michael Bay.

11
mangeletti 1 day ago 7 replies      
I can't help but read this like:

 Let me show you how dangerous sidewalks are by telling you the story about the time I went for a walk while swinging a spiked bat at every passer by. It was great fun, but it demonstrates why sidewalks are dangerous.

12
andrewflnr 1 day ago 1 reply      
I think Rox->Roxie is a pretty plausible confusion even without "malicious" interference, so I'm not sure those parts prove anything.
13
coldcode 1 day ago 2 replies      
Truth is such a slippery concept. How easy it was to create by a school kid, and how organically it grew over time despite being entirely made up. Imagine how much better people with professional tools and desire can generate much more harmful information than fake Henry Winkler appearances.
14
smaili 1 day ago 0 replies      
IMHO, the show was so obscure and lacked enough popularity to actually motivate people to bother validating claims about it. If it were a much more popular show, say The Big Bang Theory, there would be enough knowledgeable people to catch the falsities.
15
MollyR 1 day ago 1 reply      
A little scary what this implies about groups of people with axes to grind on places like wikipedia.
16
nkrisc 1 day ago 0 replies      
What I think is most interesting is the people who all claim to "remember" these lies. Are they knowingly claiming to remember something they never saw to appear more knowledgeable or is their memory shaped by these lies they read? The latter is scarier.
17
tuna-piano 1 day ago 1 reply      
For those blaming the guy, and not the system / sites.

If Obama used the fake fact in a speech of his, would you blame his researchers or the guy who originally made up the lie?

I think the point is that authorities on information have a responsibility to ensure the information is accurate.

18
kolapuriya 1 day ago 0 replies      
The fact that people have blindly accepted them as false is subtly proving his point about history on the internet.

The fact that you do not understand that people simply do not care about Street Sharks does not make his premise intelligent or thought-provoking.

19
superJimmy64 1 day ago 0 replies      
"I'm living in an ontological nightmare of my own making. It's jawsome!"

Had me in tears. Also, I completely forgot about that show until coming across this, so thanks for bringing back some history!

20
cloudjacker 1 day ago 0 replies      
I was hoping this would reference the Berenstain Bears thing. I thought that was stupid because someone was obviously introducing a psychological impression when creating the theory.
21
koolba 1 day ago 0 replies      
This is a jawesome playbook of how "facts 2.0" get created.
22
justinlardinois 1 day ago 0 replies      
Reminds me of the tale of Slow Blind Driveway, which if I recall correctly was the longest running hoax ever on Wikipedia:

https://en.wikipedia.org/wiki/Wikipedia:List_of_hoaxes_on_Wi...

And of the fingerboxes meme on 4chan.

23
bitwize 1 day ago 0 replies      
This reminds me of the time some fangirl invented an entire season of Inspector Gadget with her fan character in a prominent role as a love interest for Gadget -- and tried to pass it off as if it really happened, replete with faked screenshots of Gadget and her invention. The IG fan community actually took the bait for a while before she was outed.
25
Namrog84 1 day ago 0 replies      
What if this article was the social experiment? Did anyone verify any of the claims that he had actually lied? Maybe he is fabricating a story about fabrication? I know I didn't verify. But don't fall into his sneaky trap again (or perhaps for the first time).
26
owly 1 day ago 0 replies      
No one has mentioned the 2016 Election and the fabrication of stories from the front runners of both parties. History and memory are not being erased, just overwritten with new "facts".
27
syphilis2 1 day ago 0 replies      
It's interesting that in this case the misinformation can readily be proven false. What method is there to identify incorrect information when no primary source exists? How much trusted information can never be verified?
28
fennecfoxen 1 day ago 1 reply      
This seems like an interesting article. However, the large advertisement on the screen covering the entire viewport with no 'close' button makes reading it a real challenge.
29
dsugarman 1 day ago 0 replies      
think about how much worse it was before the internet
30
FussyZeus 1 day ago 1 reply      
None of this is the author's fault. The initial lie was meant as harmless fun, the fact that so many (largely highly regarded) media properties went and spewed it out as fact later on is the real problem. I mean how many times a week do we hear about Old Big Media retweeting nonsense or publishing Onion articles?

News sources these days are so programmed to chase every story that there's no room for fact checking and they all look like incredible idiots. It's amazing to me that anyone takes Old Media seriously anymore.

31
EGreg 1 day ago 0 replies      
I have done something less malicious, but right around the same time.

When I was in NYU's grad math program, Wikipedia was just getting going. It had a lot of articles, but not on every topic. I was studying Analysis and decided to start the article on the Hessian Matrix.

Yes, I started that article and it's been fun to watch it grow over the years. Most sections I have added are still there, as well as some phrases such as "more can be said from the point of view of Morse Theory". It really set the direction of the subsequent topics and edits.

One section in particular was completely made up by me. It was ACTUALLY TRUE, but it was never (to my knowledge) stated anywhere. No one had really made a treatment of the matter. Namely:

Hessians of vector valued functions. I said they were tensors of rank 3.
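As a sketch of what that claim means (my notation here, not necessarily the wording that ended up in the article): for a smooth map from R^n to R^m, each component function has its own n-by-n Hessian, and stacking them gives a rank-3 array:

```latex
% Hessian of a vector-valued function f : R^n -> R^m,
% written as a rank-3 array: one n x n Hessian per component f_i.
H(f)_{i,jk} = \frac{\partial^2 f_i}{\partial x_j \, \partial x_k},
\qquad i = 1,\dots,m, \quad j,k = 1,\dots,n.
```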

There was a discussion on the talk page about it. Some people were confused and argued for a bit, but since it was true, the community kept it, thinking that being a true math concept it must have a source somewhere. Now it has been expanded, and an actual analysis of how it can be a tensor of rank 3 has been worked out. This may have led to citations that will lead to research on Hessians of vector-valued functions acting as tensors. All because I wrote it there.

It wasn't false, like Colbert's lie that "the elephant population of Africa has tripled in the last six months." But it was an experiment to see what an unsourced original statement would lead to in a fairly mainstream article.

32
fapjacks 1 day ago 0 replies      
I have done very similar stuff to Wikipedia for almost ten years now. It is astonishingly easy to make specious claims in backwater articles that no one cares about. It is pretty insane what happens to the articles afterwards. For example, on one article, someone else has made a couple of edits adding even more lies to my completely invented claims! Another "subtle vandal" (as I call myself)! I never expected that. And when someone else edits your false information for clarity, or whatever, it's as good as gold. One trick I use to encourage that is to make minor grammatical or spelling errors. If a human doesn't lumber by to edit (and then seal into fact) my bullshit information, usually an automated bot will eventually do the same thing. It's very important not to make your information sound too trite or wild. Most Wikipedia editors take everything with a big grain of salt the last few years.
33
Aelinsaar 1 day ago 0 replies      
More like, "I used lies about a cartoon to convince some people online that history is meaningless on the internet." Nothing was proved, and I wonder how long that deception lasted. Edit: Title has since been changed.
34
Endy 1 day ago 0 replies      
One word applies: JAWSOME!!!
35
sp332 1 day ago 2 replies      
So, you lied, but it's someone else's fault?

After all, what kind of person would intentionally sow lies about Street Sharks across the internet?

Well, not a good one.

30
FBI raids dental software researcher who discovered patient data on FTP server dailydot.com
252 points by corywright  1 day ago   157 comments top 23
1
ghoul2 1 day ago 5 replies      
As a separate issue: why the "shock and awe" response to what is (even allegedly) a non-violent crime? Why the assault rifles? Why could he not have been arrested by a couple of agents walking up to the door, knocking, serving the search warrant, and then maybe having the techs step in to conduct the search and seizure?

Why does US law enforcement so dramatically escalate every contact with a citizen? Every time they do this, they risk accidental injury to people, kids, and pets.

What in this particular situation necessitated a SWAT-level treatment?

Maybe the law should be fixed such that warrants have to specifically include firearm authorizations.

2
jneal 1 day ago 7 replies      
This reminds me of something that happened to me in high school back in 1999. I found an Excel doc on a public network drive that contained every single student's SSN, DOB, whether they had free/reduced lunch, address, phone, etc. I was admittedly snooping around, but this was all public stuff every student and teacher had full access to.

When I found it, I told one of the teachers that I trusted and she insisted that I must tell the principal. So I went down to the principal's office and told her. My primary goal was to get this removed or made private because even at that young age I knew this was very sensitive data and I wouldn't want just anyone having access to my information like that.

When I got home from school, I found my mother upset because we'd been called to return to school for an emergency meeting. I was questioned, and when I told them I only wanted this sensitive information properly secured, I was told by the county IT administrator, "Did you ever stop to think if maybe this information was public for a reason?" I took a second and literally wanted to say "There is no reason this information should ever be public," but I ended up keeping my mouth shut in hopes of not getting into further trouble.

I was nearly expelled for "hacking". They placed me on "academic probation" and threatened that if I did so much as forget my school ID at home one day, I would be immediately expelled without question. I was removed from my elective classes that involved computers and was disallowed from touching any computers at school.

Fun fact: Someone on the yearbook staff accidentally deleted the only copy of the yearbook files and our yearbook was in danger of basically not being made. I was called to the principal's office and asked to help. I was able to recover the deleted files and save the day. At some point they realized I never had malicious intent, but I still hold a small grudge for the way I was treated as a criminal for uncovering such a big security hole.

3
callesgg 1 day ago 7 replies      
About a month or so ago I found an open public Mongo database with about 12GB of records regarding the retirement funds of what I assume were hundreds of thousands of people: account numbers, how much money was in the accounts, when they had moved them to various funds, and so on.

Thought long and hard about what to do, but decided not to do anything; I don't feel like risking my entire life just to help someone. This is me assuming they did not intend to have it publicly open.

With that story out there, it would be nice to have a legit legal way to inform the police or a similar trustworthy government agency that could handle issues like this.

4
openasocket 1 day ago 2 replies      
It sounds like Patterson Dental deserves as much blame as the FBI, if not more, since they were the ones pressing charges and motivating prosecution in the first place. Also, why aren't they being charged with what is almost certainly a HIPAA violation?
5
qb45 1 day ago 1 reply      
Another lesson not to trust people/organizations ignorant enough to keep confidential data in plain text on anonymous FTP.

It seems that the 21st-century responsible disclosure procedure goes like this:

0. use tor for the research itself

1. report problems anonymously

2. if they don't care - report them to law enforcement for breach of confidentiality

3. if they don't care either, or don't accept anonymous tips - make noise in the media

Of course, this is for dealing with idiots who keep their data on public FTP. If the attack takes some clever hacking, go check whether they offer bug bounties. Funny times we are living in.
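
For context on why commenters call this trivial: "anonymous FTP" just means the server accepts the conventional anonymous login that any FTP client can send. A minimal sketch of such a check using Python's standard ftplib (the hostname, password, and timeout are illustrative; this is not a description of the researcher's actual method):

```python
from ftplib import FTP, error_perm

def has_anonymous_ftp(host: str, timeout: float = 5.0) -> bool:
    """Return True if `host` accepts the standard anonymous FTP login.

    No exploit or password guessing is involved: "anonymous" plus an
    email-style password is the decades-old convention for public FTP.
    """
    try:
        with FTP(host, timeout=timeout) as ftp:
            ftp.login("anonymous", "guest@example.com")
            return True
    except (error_perm, OSError):
        # Login refused, connection refused, or host unreachable
        return False

# A machine with no FTP server listening simply reports False
print(has_anonymous_ftp("127.0.0.1", timeout=2))
```

The point of the sketch is that the "access" at issue is one login call with well-known default credentials, which is why the anonymous-FTP detail matters so much to the legal argument.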

6
AdmiralAsshat 1 day ago 8 replies      
The FBI is going to have a hell of a time arguing that accessing a public FTP server with no password protection is a crime.
7
wyldfire 1 day ago 1 reply      
> Defense attorney Tor Ekeland, who represented Auernheimer in the federal court case in New Jersey, has offered to help Shafer ...

Based on his website it appears that "Tor" is actually his given name. What an odd coincidence.

8
merrywhether 1 day ago 3 replies      
Reading this, I had an idea for a new law that could counteract this stupid reaction to security research:

Particularly for protected patient information (but maybe for other classes of sensitive data as well), it would be interesting to somehow classify having this information breached as a crime by the holder of the information (I realize this might be hard to do given the reality of security these days, so there would need to be some nuance of course). The crux of my idea would be to automatically count any access that results in prosecution as a breach of said data, thus meaning that prosecuting a security researcher would automatically put the information holder under separate prosecution. I wonder if something like this could be feasible.

9
sathackr 1 day ago 2 replies      
Fun fact:

Many financial institutions use the last 4 of your SSN as identity verification.

If you're a business, it's the last 4 of your FEI/EIN.

I know at least in FL, this is publicly available at sunbiz.org

So with the account number printed at the bottom of your paycheck/stub and the FEI/EIN, you can often authenticate to a financial institution and obtain privileged information.

I know this not because I was on the "hacker" side, but because I was involved on the financial institution side of it and caught this as part of my engagement. The institution was issuing new logins for its internet banking site, and the password would have been based on the user's name, zip code, and SSN/FEI/EIN, all 3 of which are available (in FL) on that sunbiz.org site.

10
downandout 1 day ago 0 replies      
Unless there is more to the story, he won't be prosecuted for accessing an anonymous FTP server. However, they will scour the computers/drives they took (for months or possibly even years), looking for evidence of this or any other technically illegal misdeed. In the unlikely event they find nothing that they can take issue with (this being a security researcher's computer equipment, they'll find all kinds of hacking tools and possibly evidence of other research that could be construed as hacking attempts), in a year or so, he might get his stuff back. If they find anything, he'll face charges for that.

That's how law enforcement in the US works. A crack in the door, in the form of a ridiculous accusation, is all it takes for one's life to be destroyed.

11
rrggrr 1 day ago 1 reply      
Here's an investigative tool the CFAA & the FBI need: if a company like Patterson Dental spins up an investigative raid with a baseless complaint, the Bureau should be able to charge them with a crime. One almost hopes the FBI investigation yields enough evidence to charge Patterson with a criminal violation of HIPAA.
12
fiatmoney 1 day ago 1 reply      
It needs to be understood that if you react this way to responsible disclosure practices, your company & you personally will be subject to irresponsible disclosure practices.
13
pmontra 1 day ago 1 reply      
Do you have laws in the USA that mandate protection of health data?
14
mevile 1 day ago 0 replies      
I'm not addressing the FBI response, but hear me out. As a security researcher you have to stop at the first vulnerability. Don't use the vulnerability to get more information; it's the company's responsibility to ascertain the impact of the problem. This person should not have attempted to download anything from the FTP server. They should have spotted the FTP server, notified the company, and made it clear they never attempted to download anything from it.

There was a similar issue with S3 credentials and Facebook a few months ago. The security researcher went too far, and there was a large outcry by everyone about Facebook's response. I'm not addressing the response; I'm saying that as a security researcher you need to protect yourself by trying very hard to limit the impact of what you're doing, to remove the risk of legal liability. Only go as far as the first problem and no further.

15
phusion 1 day ago 1 reply      
This is so wrong, but it's not surprising. We've been reading stories for years of security researchers being charged with a crime or harassed for simply pointing out blatant security holes.

What kind of thinking is this? He was doing them a favor. Every time, it seems, they are embarrassed by the incident and lash out. WHY!?? We should be treating these researchers like heroes, not kicking in their doors and having the FBI charge them with criminal CFAA violations. Once the chilling effect comes down in full force, we'll have a much less secure Internet.

16
a3n 1 day ago 0 replies      
It's as if the CFAA was intended to protect behavior like Patterson's.
17
pipermerriam 1 day ago 0 replies      
The FBI seems to have lost its way (same with most of the other three-letter government agencies and other law enforcement). How do we change the system so that they are held accountable for these sorts of things?

This is getting ridiculous. I can't predict the general public's opinions on things like this but it seems so clearly "wrong".

I have hope for a peaceful fix but I am skeptical that we aren't well on our way to a much more traditional violent revolution.

Everything I've read on the subject suggests that the early signs of revolution include a disparity between the rich and the poor large enough that the poor can no longer provide for themselves. It seems like this is well underway and likely speeding up.

I'd love to see some statistics on situations like the 2014 Ferguson Missouri situation. I'm curious if there's a rise in situations where the government sufficiently crosses the line that the public backlash manifests violently. I expect that we're still in a stage where these situations are still largely centered around poor minorities [1] but situations like this suggest that incidents are starting to expand into demographics that might get the "middle class" [2] to finally pay attention.

I hope we can find a way to unite as a single voice to change things. I hope it doesn't end up being violent. The following things encourage me.

* Decreased relevance of the "mass media". This is a double edged sword. On one hand it allows for news that might be ignored by a major network to still be disseminated widely. On the other hand, the "public" has a really poor track record of consuming news that isn't also entertainment and many of these issues seem to fall entirely outside of people's interests.

* The ability to aggregate these sort of events to establish a clear pattern of behavior. It's getting harder to hide things.

Also these disclaimers:

1. I say poor minorities because, based on my knowledge, when law enforcement oversteps it's typically in situations involving people who are poor and black.

2. The "middle class" is used here to reference a predominantly "white" demographic that most mass media caters to. I've struggled to find the appropriate language here, fearing I'll be labeled racists somehow. Hoping that my message reads as intended.

18
cloudjacker 1 day ago 0 replies      
Use Tor through a Whonix gateway. The FBI's NIT doesn't have a way through that.
19
2close4comfort 1 day ago 0 replies      
The FBI putting the Cyber in Cyber. I know we all feel safer with them on the watch.
20
Steuard 1 day ago 5 replies      
I know this is only tangentially related to the HN content here, but does anyone have a sense of why the FBI would choose to respond to this sort of case with a dozen agents and weapons drawn? Rather than, say, two guys politely ringing the bell and asking him to come with them?

Unless there's a lot left out of this article, I wouldn't think most "unauthorized computer access" suspects tend to be heavily armed. (Particularly if the company actually reported the context of the "crime", including the fact that he had voluntarily notified them of the problem.)

21
eric_h 1 day ago 1 reply      
I could not get this site to fully load even after (or maybe because) my adblocker blocked 68 requests.

However, loads great in lynx!

22
joesmo 1 day ago 0 replies      
In the meantime, companies like Apple and Google are deleting users' files without their consent and infecting computers with malware through ads yet I don't see Tim Cook or Larry Page being woken up in the middle of the night by a SWAT team. What a fucking joke our legal system is.
23
vox_mollis 1 day ago 0 replies      
The FBI has always been our enemy, from John Edgar Hoover onward.
       cached 29 May 2016 04:11:01 GMT