Hacker News with inline top comments (Best, 16 Mar 2017)
Keep the Internet Open samaltman.com
992 points by firloop  1 day ago   381 comments top 31
confounded 1 day ago 9 replies      
Because of the massive amount of consolidation over the last few years, the death of Net Neutrality would be the best competitive moat imaginable for Google, Facebook, Amazon, etc.

While all their CEOs will make faint noises in favor of Net Neutrality, SV outspends Wall Street 2:1 on lobbying[1], the goal of which is to cement monopoly status, not to make the world a better place. Most of it goes to Republicans.

Things have changed fast; Google's lobbying spend has increased by over 50% since the SOPA blackout.

The technology industry serves the interests of capital/share-holders, not technologists.

[1]: https://www.bloomberg.com/news/articles/2016-10-18/outspendi...

EDIT: "Google hardly even had lobbyists back during the SOPA blackout." > "Google's lobbying spend has increased by over 50% since the SOPA blackout." h/t DannyBee below.

azundo 1 day ago 3 replies      
We see the consequences of a non-open Internet every day where we work in Uganda. It is common there for people to have "social bundles" that only work for Facebook, WhatsApp, Twitter, etc. Our users believe they have Internet access but don't understand why data from our app is not syncing when Facebook is working. It's a large barrier for us and adds a huge moat for the incumbents. We're considering more integration over WhatsApp or Facebook Messenger so we too can benefit from the cheaper data but that only locks us into those platforms and strengthens their position.
rdl 1 day ago 14 replies      
While I like the Internet being open, I don't like "net neutrality" extremism.

1) While I dislike monopoly carrier antics, I'm even more unhappy about the idea of government bureaucrats dictating network engineering standards to carriers and ISPs. If you build a network with caches/content servers close to users, and expensive backhaul back to your network core, you can offer essentially unlimited traffic to users hitting the cache, and still provide more limited access to other stuff. I'd prefer high bandwidth everywhere, but that isn't always an option. It should be a market decision, not a national government decision.

2) The real problems with lack of NN are due to lack of competition in the access provider market. Focus on fixing that. A lot of this is due to local governments requiring providers cover entire markets to cover any of a market -- if someone wants to build another WebPass or CondoInternet and only serve high-density developments, that's great! Trenching fiber to single family homes is marginal anyway, but if you have low take-up in a neighborhood it is even worse. If you are going to push for regulation, have it be regulation to empower actual competition in the network access market.

3) Zero-rating in emerging markets (e.g. FB in India) is really the only way for many lower income users to afford services at all -- particularly for video services.

blhack 1 day ago 7 replies      
This is a difficult question. Emotionally, I am a net neutrality absolutist, but I think I can talk myself out of it, which to me means that the conversation is complex.

Should the power company be able to offer you a discounted rate if you host equipment that offsets their cost (for instance: a powerwall)[1]?

I think that the answer to that question is yes.

Well okay, then should your ISP be able to offer you a discounted rate to use services that offset their cost (youtube, which has a cache near you, vs vimeo, which doesn't)? If the power company can do this, then why can't the ISP?

Again, I don't like this, but I can't think of a consistent explanation for why they shouldn't be allowed to dynamically charge you based on their cost.

[1]: I don't know enough about the electrical grid to know if a powerwall could actually be used in a way that offsets the cost to a power utility or not. I don't think that the correctness of this statement matters to the example. Substitute "equipment that saves the power company money" for "a powerwall".

gz5 1 day ago 3 replies      
The utility should be the fiber (or spectrum). Regulate that; leave the ISPs to then compete openly.

Today's software can enable multiple ISPs to share the fiber/spectrum. I could then have 10 ISPs today and 15 ISPs tomorrow. I might use ISP #1 for certain content, and ISP #2 for certain services.

ISPs essentially become a function of the services and content they provide, and how they provide it, in an ultra-competitive, granular market. The way it should be.

brothercolor 1 day ago 3 replies      
The internet is what makes Silicon Valley go round. Everyone, including Sam Altman, needs to be an activist. Asking for someone else to take point on something so fundamental is like asking Natives to take point on water prot-- wait.
Alex3917 1 day ago 3 replies      
> Doing this allows the government to ensure a level playing field

In theory this is a good argument. In practice, my experience is that this argument causes people to write off net neutrality as just being something that's about letting tech bros make lots of money. I've even heard this from folks in the tech industry, who really should know better.

An argument that may be more convincing is that the Internet is the only media channel where we don't get all of our information from the same three or four mega conglomerates. But if net neutrality is eliminated then ISPs are going to pull a Martin Shkreli, and overnight your cost of hosting a Wordpress blog is going to go from $5 a month to $25,000 per month or whatever.

When this happens the only way to have a blog will be to host your content on Facebook, who will be able to decide which points of view are allowed and which are banned. If we lose the Internet, the last free media channel, then there is no going back. Not just on this issue, but on every issue.

SAI_Peregrinus 1 day ago 1 reply      
I don't think that NN legislation is the answer. It's a social solution to a technical problem. We shouldn't be making it illegal to discriminate between traffic types, we should be making it impossible. Encrypt everything by default, encrypt and anonymize DNS, and generally get rid of the ability for ISPs to tell one data stream from another. Unfortunately this requires re-architecting quite a lot of the Internet, so it's even less likely to happen quickly than getting a bunch of politicians to stop listening to lobbyists or ISPs to actually compete with one another.
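
To make the parent's point concrete, here is a minimal sketch of why cleartext DNS is the low-hanging fruit: the helper names and transaction id below are invented for illustration, but the byte layout is the standard query format sent over plain UDP port 53, which any on-path ISP can read.

```python
import struct

def build_dns_query(hostname, txid=0x1234):
    """Build a minimal cleartext DNS A-record query (RFC 1035 wire format).

    Helper names and the transaction id are invented for illustration;
    the byte layout is the standard one sent over plain UDP port 53.
    """
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)  # 1 question, RD set
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN

def visible_hostname(packet):
    """What any on-path network (i.e. your ISP) can read back out: the full QNAME."""
    labels, i = [], 12  # question section starts after the 12-byte header
    while packet[i]:
        n = packet[i]
        labels.append(packet[i + 1:i + 1 + n].decode("ascii"))
        i += 1 + n
    return ".".join(labels)

print(visible_hostname(build_dns_query("example.org")))  # example.org
```

Encrypted resolvers (DoT/DoH) wrap this same question in TLS, so the ISP sees only a connection to the resolver; combine that with TLS everywhere and there is much less left to discriminate on.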
d--b 1 day ago 2 replies      
Like Altman, I am amazed that this is not inflaming the community again. In order to pass an unpopular law, you just need to try to pass it several times, until the public gets tired of protesting?
cb21 1 day ago 3 replies      
> But this idea is under attack, and I'm surprised the tech community isn't speaking out more forcefully. Although many leading tech companies are now the incumbents, I hope we'll all remember that openness helped them achieve their great success. It could be disastrous for future startups if this were to change--openness is what made the recent wave of innovation happen.

Is it surprising? What organization is supposed to speak up? Tech workers don't have a union so nobody is lobbying for us in DC. We have to hope that enough huge companies and their leaders will act in the way we want and I don't exactly expect Zuckerberg and Thiel to represent me and my interests in Washington.

finkin1 1 day ago 1 reply      
I think the tipping point last time (SOPA/PIPA protests) was when Reddit, Wikipedia, Google, etc. went dark. It's sad we don't have Aaron Swartz to help this time around. It's good to see Sam offering help, but who is actually going to step up?
simplehuman 1 day ago 1 reply      
I don't think activism is the answer to these problems. Instead, we need to show consumers the benefits of an open internet. Right now, people are totally loving the mono-culture and the internet silos that the mega corps are building (facebook, whatsapp, gmail, google, chrome, apple app store, github to name a few). It's going to be hard to convince them unless you can show them tangible benefits.

I think the big companies have totally nailed it. They have kept things free. And they have kept the population sufficiently distracted that there is no time for 'activism' or thinking of society. This means that the big corps can now push reforms unquestioned in their favor.

pgodzin 1 day ago 0 replies      
The tech community rallied pretty hard in favor of net neutrality last go-round, they were just lucky that Wheeler (head of the FCC) agreed with them.

I think there isn't anything specific to fight against now as there was with SOPA. Once a specific legislation/regulation appears, I think the tech world will strongly oppose again. I also suspect the large tech companies are trying to use their lobbying power behind the scenes preemptively again.

johnwheeler 1 day ago 2 replies      
To the folks on HN who don't vehemently, vehemently oppose a non-neutral system, your line of thinking puzzles the hell out of me. Is the idea that government regulation is bad in all cases but corporate regulation is OK in all cases? Why? Because you've bought into free market forces making all things better no matter what?

Those forces only make things better until they don't. To me, it's crazy that's not immediately clear.

In this case, the regulation is for keeping the system free unless you're a monopoly or part of an oligopoly. Your future chances of being in that camp are minuscule, and even smaller if you support killing off Net Neutrality. Are you fighting on behalf of a future self that will likely never exist? That's arrogant and delusional.

This is not what Ayn Rand meant when she espoused that the public good is best served when people are self-interested. Self-interest is not the same thing as being delusional.

saycheese 1 day ago 1 reply      
Interesting reviewing the list of organizations that signed the letter sent to the FCC by the ACLU & EFF on keeping the internet open: http://www.commoncause.org/policy-and-litigation/letters-to-...

Really makes you wonder what's going on, since Apple, Google, Microsoft, etc. are nowhere to be found on the list.

no_protocol 1 day ago 1 reply      
> There's an argument that Internet Service Providers should be able to charge a metered rate based on usage. I'm not sure whether I agree with this, but in principle it seems ok. That's how we pay for public utilities.

I can monitor and control the power usage of my electrical appliances.

I can control outgoing network requests from my networked devices.

I cannot control what is sent to my network from the outside. It doesn't make much sense to be charged for what someone else sends to me. Even if I shut off my network device, a particularly rude service might just continue blindly sending data my way, running up my costs with no way to opt out.

Typical postal mail delivery is paid for by the sender, not the recipient. It becomes complicated here.

beefman 1 day ago 0 replies      
I was convinced in favor of net neutrality by a 2007 study at the University of Florida.[1] When ISPs are allowed to charge content providers individually, there is less incentive to improve overall bandwidth. There's also an incentive to cripple the free tier, especially if it can be done subtly or by neglect over time. So net neutrality seems like good policy.

[1] http://web.archive.org/web/20130602210518/http://news.ufl.ed...

koolba 1 day ago 0 replies      
If the physical line itself was separated from the ISP providing you service on the line, this problem would solve itself through natural competition. We almost saw this happen with ADSL but it lost out to the fatter pipes of coax and fiber.

Most people have either one or zero options for coax (i.e. cable) or fiber. That leads to a monopoly where you either have to take whatever Comcast / Time Warner / Verizon offer or live with craptacular DSL. Get rid of that monopoly and have the maintenance of those pipes be run by the local municipality. Then you can have real competition.

michaeljbishop 16 hours ago 0 replies      
Am I the only person frustrated that this article contains a call to arms and then no actionable items? I'm on board, but what should I do about it? Put that in your article!
LeicaLatte 21 hours ago 0 replies      
"I really hope an activist or tech leader will step up and organize this fight."

Sums up America 2017. Outsource everything, even your fights.

trcollinson 12 hours ago 1 reply      
I love the political and ethical debate surrounding this, both in this thread and elsewhere. As a technologist and a software developer I think we should use every means available to stop these sorts of laws from taking effect and harming what should be a completely neutral internet.

With that being said, I would like to ask from a technical perspective: what can we do if this does go into effect? I remember a quote, I want to say it was by Sir Timothy Berners-Lee, but I can't find it. So I will just state it myself.

Any lack of neutrality within the internet network is not a feature or a regulation but is in fact a bug within the system. When a bug is found, software engineers find ways around those bugs.

So, if ISPs finally succeed in causing our internet to no longer be neutral, how will we be moving around that bug?

enknamel 1 day ago 0 replies      
>What's clearly not OK is taking it further--charging different services different rates based on their relationships with ISPs. You wouldn't accept your electric company charging you different rates depending on the manufacturer of each of your appliances.

This does happen, though. Electric companies, phone companies, etc. all charge different rates based on who is using the service. Through various programs and income-based subsidies, who is using the service determines the cost paid. This also affects what producers sell to the low end of the market. If electricity is substantially cheaper for lower-income consumers, they will care less about paying for better energy efficiency. So you are, in effect, paying a different rate based on what your appliances are.

ABCLAW 1 day ago 0 replies      
Those involved in technology are often confronted with brutally kafkaesque situations wherein the human and the machine are starkly opposed. Fighting against the machine is often demoralizing, depressing, and dehumanizing - but not always. Not today.

Thank you, Sam, for the mote of hope.

bo1024 1 day ago 0 replies      
I think the appliance analogy is a pretty good one. But now imagine that on top of being able to charge different amounts for different brands of appliances, the electric company happens to manufacture their own line of not-so-quality stoves, refrigerators, etc.
tehwebguy 1 day ago 1 reply      
Ajit Pai is public enemy number one.
vvanders 1 day ago 1 reply      
I love the electricity company/appliance analogy. That seems fairly accurate and something that's easily relatable to the layman who doesn't understand the details of the internet.
no_protocol 1 day ago 3 replies      
> You wouldn't accept your electric company charging you different rates for each of your appliances.

I can get cheaper electricity for my water heater if I choose to attach it to a source that can be interrupted at times of peak load. I'm not quite sure this line of reasoning will stand.

Edit: It appears the linked article has been updated since I originally posted this. The new text is:

> You wouldn't accept your electric company charging you different rates depending on the manufacturer of each of your appliances.

This makes the argument much clearer and my concerns no longer seem important.

tomcam 1 day ago 0 replies      
I think Sam needs to clarify some terms.

> The internet is a public good

Black's Law says a public good is "An item that taxation is used to finance, the consumption of which has been decided by the whole of society. It is not an item for consumption that an individual has decided upon."

Does Sam want to tax all of us for net access, then have the government decide how it should be allocated? Do ISPs disappear under this scenario? Does Sam want the same government that has gone insane collecting personal data to have further control over our net access? Does he trust them to do the right thing with it? If so, does he have any evidence it would do so?

> I believe access should be a basic right.

"Basic right" is not defined and doesn't have a legal tradition, so I'm not sure what it means. Is it meant to be a constitutionally delineated right like (in the USA) freedom of the press, or to worship, or to bear arms? Because none of these require that we have the government confiscate our money in order that others may exercise those rights.

ctack 17 hours ago 0 replies      
It's a terrific plea from Sam. He speaks for Y Combinator and, by proxy, independent hackers like myself. But there is just so much peace of mind in it for the big boys.
Not31337 1 day ago 2 replies      
Charging per volume of traffic would eliminate the net neutrality issue.

If I use Netflix on Comcast, I would pay Comcast for my use of their network. Comcast would pay Netflix's ISP for the use of that network (at a negotiated rate lower than my rate, because of volume). Netflix would pay their ISP for their usage as well (also priced by volume).

The two endpoint rates would have to be set so carriers make a profit and the responsibility for traffic is shared between the sender and the receiver. (People who get things for free tend to treat them as worthless.)

This model has to be worked out to completion to understand all the implications.

The key thing is if Comcast was getting reasonable revenue from moving Netflix traffic it would not need to throttle other people's traffic to bully consumers to buy on demand TV from Comcast.

As a side effect, this may reduce usage by people who download stuff just because they can, because it's free/already paid for. This is related to the free-things-are-treated-badly concept I mentioned earlier.
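
A toy model of the settlement sketched above (both rates are invented for illustration):

```python
def metered_bill(gb, consumer_rate=0.05, wholesale_rate=0.01):
    """Toy per-volume settlement between a viewer and two ISPs.

    The viewer pays their access ISP a retail rate per GB; the access
    ISP settles with the sender's ISP at a lower bulk rate and keeps
    the difference. Both rates here are made up for illustration.
    """
    consumer_pays = gb * consumer_rate            # what I pay Comcast
    interconnect = gb * wholesale_rate            # what Comcast pays Netflix's ISP
    access_margin = consumer_pays - interconnect  # Comcast's revenue for carrying it
    return consumer_pays, interconnect, access_margin

pays, settle, margin = metered_bill(100)  # a month of streaming, say
# The access ISP earns from carrying the traffic, so throttling it
# would cut into its own revenue, which is the key claim here.
assert margin > 0
```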

pizzetta 1 day ago 2 replies      
Not to put a fly in the soup, but don't utilities charge different rates for different customers? For example, PG&E might have one rate for residential, one for small biz, and one for large biz.

Not to say that net neutrality is different, but comparing it to utilities may not be appropriate given that different classes of customers are granted different kinds of rates and service.

Stories that Hacker News removes from the front page sangaline.com
1261 points by foob  2 days ago   312 comments top 55
dang 2 days ago 15 replies      
The story the OP is complaining about was flagged by users. Moderators never saw it (edit: wrong, we put 2010 on the title by mistake, see downthread [1]). Had we seen it, we would have turned off the flags. There's a long tradition of people looking at HN data and posting about it. Edit #2: since the 2010 thing was our mistake (an accident of sleep deprivation by the looks of it!) I've invited foob to repost the original article using the second-chance mechanism described at https://news.ycombinator.com/item?id=11662380 and the links there. I think he's planning to do that tomorrow.

The [flagged] annotation only shows up on stories that are heavily flagged, i.e. enough to kill the post. User flags have downweighting effects long before that.

Story rank on HN is determined by upvotes, flags, software, and moderators. Moderators downweight stories to the degree that they don't fit the site guidelines. This doesn't happen by upvotes alone, unfortunately; certain stories routinely get tons of upvotes regardless of how good they are for HN, e.g. anything sensational, indignant, or meta. If we didn't have a compensating factor, those stories would dominate the front page every day and HN would no longer follow its primary rule: "anything that gratifies one's intellectual curiosity". Of course that means HN is subject to our interpretation of what counts as "intellectual curiosity". HN has always worked that way; before we did it, pg did, and he trained us to do it so it would work as before. There's no way around the need for moderator intervention on a site like HN; the clue is in the word 'moderator' itself: left to its own devices the system runs to extremes and it needs a negative feedback loop to dampen it.

When YC is involved, we do this less than usual as a matter of principle. When HN itself is involved it's a little bit different, because the hypnotic power of all things meta causes HN upvoters to go into an upvoting trance. Meta is basically crack, so we routinely downweight such posts, but only so much, to compensate for the crack effect. That's what I've done here, which is why the post is now at #7 rather than #1. It should probably be lower, but I want to illustrate the point that we intervene less, not more, when judgments about ourselves are involved. As a further example, a moderator actually turned off software penalties and user flags on this post this morning, which is probably why it went to #1 in the first place. That's more than I would have done but it shows how seriously we take that principle.

None of this is new information, btw. I've posted about it plenty over the years and am always happy to answer questions.

1. https://news.ycombinator.com/item?id=13858850
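
For readers wondering what "upvotes, flags, software" looks like mechanically, here is a toy sketch. The exponents follow the often-cited gravity formula from pg's published Arc snippet; the multiplicative `penalty` factor is my stand-in for the flag and moderator downweights described above, and all numbers are illustrative rather than the production code.

```python
def rank_score(points, age_hours, penalty=1.0):
    """Toy HN-style rank: vote gravity times a multiplicative penalty.

    votes^0.8 / (age + 2)^1.8, per the published Arc snippet; `penalty`
    is an invented stand-in for flag/moderator downweights.
    """
    return ((points - 1) ** 0.8) / ((age_hours + 2) ** 1.8) * penalty

fresh = rank_score(points=100, age_hours=1)
downweighted = rank_score(points=100, age_hours=1, penalty=0.2)
aged = rank_score(points=100, age_hours=24)

# A downweight acts like extra age: the story sinks without being killed.
assert downweighted < fresh and aged < fresh
```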

falcolas 2 days ago 3 replies      
Obligatory pimping of a tool I use (not mine, though): http://hckrnews.com/

Shows every story that has hit the front page, in order of posting. If it's currently on the front page, the link is orange; if it's not, it's black. It's very interesting to watch how frequently highly upvoted and commented posts turn black while their temporal peers remain.

Anecdotally, there appears to be a trend of positive/neutral news about YC companies remaining on the front page the longest; the latest shiny technology sticks around a while longer than average, and pretty much any negative news disappears almost instantly.

For example, as of this instant in time, there is an article about Angular2 which remains on the front page while more highly upvoted and commented articles about laptop security, AT&T discrimination, and a Nintendo Switch CVE discussion are all gone from the front page.
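
Tools like this mostly boil down to diffing ordered snapshots of the front page. A minimal sketch (the story ids are invented; the official Hacker News Firebase endpoint /v0/topstories.json is a real source of such rank-ordered snapshots):

```python
def dropped_stories(earlier, later, cutoff=30):
    """Return ids that were inside the top `cutoff` in the earlier
    snapshot but are gone from the top `cutoff` in the later one.

    Snapshots would come from polling the official HN API, which
    returns story ids in rank order; the ids below are made up.
    """
    return sorted(set(earlier[:cutoff]) - set(later[:cutoff]))

snapshot_noon = [101, 102, 103, 104]
snapshot_later = [102, 104, 105, 101]
print(dropped_stories(snapshot_noon, snapshot_later, cutoff=3))  # [101, 103]
```

Repeatedly diffing snapshots and correlating drops with points and comment counts is essentially what the linked article did.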

laurent123456 2 days ago 6 replies      
As much as I like HN, I'm not a big fan of the secrecy around moderator interventions - what gets censored, what posts get re-titled, etc.

I can understand they might want to keep the ranking algorithm and anti-spam techniques secret, but stuff that is manually censored by a moderator should be indicated as such, maybe by some automatic message like "This post was removed due to [reason]".

Some websites manage to fight spam while remaining reasonably transparent (e.g. StackExchange, where pretty much everything is documented: flags, closing reasons, edits, etc.).

hardwaresofton 2 days ago 6 replies      
The site is being crushed by traffic right now -- but even without reading the article, I've also found that some stories I thought were important were scrubbed from HN's front page just about as soon as I saw them (when I doubled back to read the comments)...

While I realize I'm not entitled to explanations, some transparency would be appreciated. Maybe it could even be automatic: whenever a mod forcibly removed something from the front page, they could leave a comment and it'd show up on some page?

[EDIT] - After reading the article, if a mod did indeed take down the post because it discussed reverse engineering the rank algorithm, I think that's pretty naive. Security through obscurity isn't a thing, and the better response is just to make a better algorithm, not try and suppress knowledge about it.

I say this naively myself, as I've never had to maintain a ranking algorithm with this many users depending on it (or any at all, for that matter), but surely the problem isn't intractable?

ploggingdev 2 days ago 1 reply      
I can comment on the "Books that Aaron Swartz read, loved and hated" story. OP posted that link to HN and it made the front page. The story was soon flagged by the community (people viewed it as Amazon affiliate spam) and it fell off the rankings. When the moderator dang noticed that the story had been flagged, he decided to restore it (https://news.ycombinator.com/item?id=13840869). So yeah, that explains why this particular story fell off the front page quickly but does not have any [flagged] or [dupe] tags attached now.

I've learned that it's best not to jump to conclusions based on what you think is true (however sound your analysis might be). Always ask the other side(s) for an explanation. In this case, you could have sent an email to the mods asking for an explanation. If you find their response unsatisfactory, go ahead and write a post explaining why.

johnlbevan2 2 days ago 1 reply      
Other theories:

- This is a bug / happens randomly; you just noticed it because you were looking (i.e. as you analyse this data); all the posts it's happened to before and since went unnoticed. That's supported by the evidence of your analysis; most of the results don't look any different to other posts.

- It's not the link, but the related activity. Presumably if you're running analysis on HN data, there are a lot of HN requests coming from your machine. Maybe any posts made by your IP are therefore treated as suspect (i.e. the sort of protection you'd expect to avoid automated posting or upvoting... just without that extra sophistication). Perhaps the other posters had something similar... Would be good to see if any of those posts were by the same author; as that may add weight to this theory.

- Other variables... Maybe the algorithm has rules which cause this behaviour under some conditions; e.g. posts made the previous day (not 24 hours ago; but rather before midnight UTC / something like that) lose weight when midnight hits; so posts made moments before suddenly lose enough score to knock them off the top spot; whilst those which had more score before midnight, or were posted just after survive... Many other possibilities such as this may exist; and we'd only know by looking at those variables in the data... What else is common about the posts which are in your post's club vs those which aren't?

iplaw 2 days ago 4 replies      
I didn't understand why the guy's PayPal story was removed from the front page -- the one about PayPal seizing a $40k USD balance without warning, allegedly due to a 2% chargeback rate over several years of doing business and hundreds of thousands of dollars in successful transactions.

I thought that the point of HN was auto-moderation? Perhaps now that HN has seen great increases in popularity, the quality of content has to be more carefully controlled, lest the quality of posts on the HN front page slowly enter a death spiral towards that of reddit.

"Think of how stupid the average person is, and realize half of them are stupider than that." -- George Carlin

Sir_Cmpwn 2 days ago 1 reply      
A few months ago I wrote a tool that scrapes HN and detects moderator activity (and provides interesting stats in general). I posted it twice and it didn't receive much attention, and the database storage requirements started to get out of hand, so I killed it. It seems like an appropriate time to bring it up again. The source code is here:


Some screenshots:



If there's interest, I can put this back up and start pruning old data so it's more maintainable. The data I collected shows a lot of questionable moderator activity and a lot of abuse of flagging. I'm also unhappy with HN sending all comments on paywalled posts (which are against the rules) to /dev/null, when they're usually at least willing to talk about things.

minimaxir 2 days ago 1 reply      
Historically, the mods have not killed a post for discussing Hacker News meta, although on occasion they apply a penalty to meta submissions. (the original post only had 32 upvotes, which is enough to get swallowed)

Indeed, HN recently allowed a post that advocated gaming the system because it encouraged debate: https://news.ycombinator.com/item?id=13676362

A conspiracy theory, even backed by data, is not the best application of Occam's Razor.

laktak 2 days ago 4 replies      
Don't forget that HN tries to detect vote fraud.

So some articles might simply disappear because the OP asked too many friends for upvotes or because of false positives.

moomin 2 days ago 2 replies      
It's very interesting to note the number one pulled story: https://news.ycombinator.com/item?id=13741276.

If you take a look at the comments, it's theorized there that the story got pulled not because of moderator action, but because people abused the flagging mechanism. Given the content, and given the principal person under discussion, this seems pretty likely to me.

yAnonymous 2 days ago 1 reply      
The main problem I have with this is that, similar to Reddit, HN is advertised as an open-minded, free-talking place by the team itself.

When you then secretly censor stuff, because it doesn't fit your agenda, be it politically or financially, it makes you look even more like a hypocrite.

Branding yourself liberal while employing fascist methods (censoring and banning) seems to be a trend, not only on the internet.

probably_wrong 2 days ago 0 replies      
Apparently I'm going to go against the flow to say that I'm perfectly fine with the way HN is moderated. In fact, I think the moderation is probably the reason why I come here. If magazines, TV shows, and journals have editors, it seems completely natural to me that a quality website should have one too.

More often than not, keeping a community from turning into 4chan requires some heavy moderation (reddit's AskHistorians comes to mind, with entire threads nuked at once), and it's often a thankless job. I'm happy that HN managed to keep its overall spirit, and I thank the mod team for that.

ifdefdebug 2 days ago 0 replies      
The list of stories the author claims "have moderator fingerprints on them" does not seem to have any kind of common pattern at all which would support such a claim.

Edit: to be clear, by "common pattern" I mean the topics of the submissions (obviously they share one pattern, which is dropping off the front page quickly). They do not reveal some secret agenda moderators might follow or anything like that.

golemotron 2 days ago 0 replies      
I predict that one of the biggest issues in tech over the next few years will be 'silent moderation.' Tech companies like to present the illusion that it is all 'just an algorithm' but that is deceptive.

Silent curation and other practices like shadow-banning are unethical and symptomatic of a mentality that seeks to avoid confrontation. If things go well we'll see more transparency over time. A good start for a site like HN would be to create another page that shows just the titles of the rejected submissions (no links). People can google for those titles if they are interested.

kens 2 days ago 0 replies      
I wrote in detail in 2013 about how the Hacker News algorithm works and the penalties that can drop stories from the front page. My analysis was based on reverse engineering the algorithm from observed behavior and comparing with the published Arc code. This latest analysis seems to be kind of reinventing the wheel.


Interestingly, my 2013 article also suddenly dropped off the front page. Apparently it somehow triggered "voting ring detection" and was penalized. (I'm not part of a voting ring of course.)

Sean1708 2 days ago 1 reply      
I don't think I understand your figures correctly, because none of them (except the last) seem to show any significant drop. There's a drop of about 10 positions in one of them but I wouldn't call that particularly significant, and it even climbs back up after twenty minutes or so.
j_s 2 days ago 0 replies      
http://hnrankings.info is a nice UI to see this happening in real time.

Here is an example of three submissions, two flagged enough to get kicked off the front page (for poor use of sources on a contentious topic) but not get marked as 'flagged':


jonathanstrange 2 days ago 1 reply      
I'm pretty happy with the way HackerNews handles posts and can only recommend that they keep moderating aggressively in the future, too.

There are way too many toxic users, trolls, shills, astro-turfers, voting rings, paid advertisers, political organisations, disinformation campaigns, and other 'special interest' parties on the Net to be able to do without strong moderation.

eli 2 days ago 2 replies      
I find this sort of conspiratorial meta complaint boring and (IMHO) off topic. I didn't flag it, but I'd understand if others did.
PleaseHelpMe 2 days ago 0 replies      
> The stories that Hacker News removes from the front page (sangaline.com)

The link is: http://sangaline.com/post/reverse-engineering-the-hacker-new...

mirimir 2 days ago 3 replies      
Conversely, I've wondered how some stories (e.g., https://news.ycombinator.com/item?id=13857880) hit the top of the front page with only a few points.

Edit: Thanks all. I get it now :)

Paul-ish 2 days ago 2 replies      
Could it be that moderators see things we can't, eg vote manipulation and astro-turfing?
euphoria83 2 days ago 0 replies      
In the past, I found that a post had been deleted, and I was so enraged by that action that I abandoned HN for almost a year. It wasn't my post, but I found the action arbitrary. I do understand that it is important to keep the spirit of HN and actively discourage posts that might take HN the way of countless other internet forums. Since the guidelines cannot be interpreted unambiguously, there will always be some controversy about what should or should not have been removed. This case is different, though, because the post was down-voted out by the community.
chmaynard 1 day ago 0 replies      
I still browse HN daily, but I lost confidence in the moderators after I posted an article that trended rapidly with an interesting, useful discussion and then suddenly dropped off the front page.

I asked the moderators why this happened. Their explanation was that the article I posted was a duplicate, and therefore created a distraction for readers who wanted to comment on new material. This struck me as total bullshit, but I tried to be constructive and proposed a method of merging multiple threads on the same article. I never got a response.

Jerry2 2 days ago 1 reply      
Interesting. Another aspect of HN that I've noticed is stories that are not censored but actually promoted. Some time ago, I noticed that some stories were a few hours old and had 4 votes, yet they were in the top 15 positions on HN. I never understood how that's possible without some kind of manipulation.

I've collected some of these anomalies. Peruse them and analyze them in this album:


Maybe OP can find a pattern in these.

omouse 2 days ago 0 replies      
Mention "unions" or "professionalism" or anything that involves giving the actual value that a developer produces to the developer and the story will disappear very quickly. Maybe that's because of the mods or maybe it's because of the flagging by readers who have the pro-state-capitalist argument so ingrained in them. Whatever it is, it biases Hacker News in a bad way.
crablar 2 days ago 0 replies      
This is why we are working on Mineranker, an open source newsfeed and ranking platform: https://github.com/francisypl/mineranker

If you are interested in more open newsfeed ranking systems, check it out.

pooptasticpoop 2 days ago 0 replies      
I'm glad I saw this here. HN has some of the strangest, most draconian moderation of any website I've ever visited.

Though, I'm continually driven back here because of the insanely high quality of the comments.

paulpauper 2 days ago 0 replies      
It's probably a combination of flagging and other factors. Obviously the mods aren't going to want to make their algorithms transparent. Flagging removes a story completely, but perhaps mods have the power to bump a story off the page without flagging it.
aaronhoffman 2 days ago 0 replies      
@foob, thought you might be interested in this "visual front page" of HN https://www.sizzleanalytics.com/HackerNews
aaronchall 2 days ago 0 replies      
It seems to me that the stories (all headlines read as controversial, as do the texts that I dug into) are removed due to flagging by users.

It would seem to me that if you're looking to grind your political axe, this is not the best place to do so.

anotheryou 2 days ago 1 reply      
Does anyone know why some posts have a rel="nofollow" in the link? (not just these no-comment ycombinator promotional posts)

I asked this before and a mod said I should ask again via mail, but never got a response from hn@ycombinator.com.

leot 2 days ago 1 reply      
I've been wondering what happened to this story, which was doing better than most but didn't appear anywhere in the first 15 HN pages.



threepipeproblm 2 days ago 0 replies      
>> That's about 2.1% so it's not a particularly common occurrence but it is happening on a daily basis.

I would say that 1/50 front page stories being buried is particularly common.

grimmdude 2 days ago 0 replies      
Good writeup. Ironic that this story is #1 on the front page now.
UhUhUhUh 2 days ago 0 replies      
Interesting discussion, and therefore, logically, interesting post. Oddly, I begin to wonder whether a variation on a bet system wouldn't be useful. Flagging or up-voting don't come with any cost attached. The only current cost I can see is related to pissing off a group or sub-group, which is not conducive to productive exchange. Just a thought from someone who has never flagged anything.
DanBC 2 days ago 1 reply      
This blog post suggests that moderator action must have been necessary, but seems to say that flags are unimportant.

I disagree. Just a few flags can cause a story to drop off the front page.

pvaldes 2 days ago 0 replies      
Some of those links are closed for voting for some reason. It seems that you can upvote the rest (including this "Stories that HN removes..." thread), but the vote count will not change anyway. Maybe it's just a bug, maybe not. I don't know.

Updated: Probably just a bandwidth issue. After 20 minutes the vote count is changing again.

c3534l 2 days ago 0 replies      
There's a subreddit for pages that get deleted from their front page. There's often good discussion around those pages and it's usually for a good reason they were removed. But sometimes there does appear to be real bias. It'd be nice if I could see what sort of stuff is getting removed and be able to discuss why.
qznc 2 days ago 1 reply      
Afaik there is a mechanism to remove flame bait articles. If a link gets too many comments too quickly, it gets removed.
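If such a detector exists, it could look something like the sketch below. Only the general idea (comment count outpacing score flags a thread as likely flame bait) comes from the comment above; the threshold and the penalty shape are invented here for illustration.

```python
def flamewar_penalty(points, comments, min_comments=40):
    # Hypothetical detector: a story whose comment count races ahead of
    # its score looks like flame bait, so its rank gets multiplied by a
    # factor below 1.0. The threshold and curve are guesses, not HN's.
    if comments >= min_comments and comments > points:
        return points / comments  # < 1.0 shrinks the rank score
    return 1.0

print(flamewar_penalty(50, 200))  # heated thread: penalized
print(flamewar_penalty(200, 50))  # calm, well-voted thread: untouched
```

A multiplier like this would demote a hot thread gradually rather than removing it outright, which is consistent with stories sliding off the front page while remaining visible on later pages.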
timthelion 2 days ago 1 reply      
I've had one post removed from the top spot on the front page: https://news.ycombinator.com/item?id=5576041

I originally posted with the title "For a moment, I thought bing was down" or something (I don't remember the original title). The title was later changed to:

Title: "Bing doesn't support SSL"https://news.ycombinator.com/item?id=5576041

Later, the story was removed entirely after I wrote the following comment:

" Actually, it's been like this a really long time. I just noticed, that HN stories which have nondescript titles fare better, so I decided to conduct a little experiment. 1st spot on the front page seems to confirm my hypothesis."https://news.ycombinator.com/item?id=5576342

I certainly understand why the mods removed the "story", but at the same time, I felt that the discussion of the "non-descript title bias" would have been an interesting one to have.

kristopolous 2 days ago 0 replies      
I had something that was removed from the front page of reddit and hn nearly simultaneously in 2011 - and I hadn't posted it on either, it was just my content. I saw it in the logs; some giant cliff.

I've always wondered if there's cross collaboration since then.

DanBC 2 days ago 0 replies      
If you notice something weird happening with your submissions you should email the mods. They're happy to explain what's going on and to have a look to see if the flags are fair.
codr4life 2 days ago 3 replies      
Hacker News is nothing but a censored echo chamber, pretending otherwise at this point is pure ignorance. Anything that doesn't fit the narrative will be beaten down or removed.
intralizee 2 days ago 0 replies      
Transparency is nice but most social news sites don't care for it.

If there was a middle ground it probably would be a section where you can specifically view threads that were removed from view.

brudgers 2 days ago 0 replies      
A useful data point missing from the article would have been the moderator's response to an email inquiring as to the story's history.
turc1656 2 days ago 0 replies      
I would like to see a monthly version of this post, similar to the recurring "Who's Hiring" that happens every month.
logicallee 2 days ago 1 reply      
This is called "burying" and it is unremarkable. (This story is likely to be buried.)

The stories that are buried are not appropriate for the front page. The reason you come to Hacker News is because it has a better front page, with better comments under it, than other places. You experience the benefit of this editorial intervention each and every day.

I've had a story buried as it was gaining a lot of traction very quickly: this one. https://news.ycombinator.com/item?id=11920431

The quality of the comments was inordinately low and it didn't look like it would be improving, which is the reason it was buried.

No complaints from me around this. You can email the moderators if you want to know their reasoning. (I'm not one.)

People here need to understand and be thankful for the extraordinary and ongoing work that the moderators do every single day to keep this place an appropriate place for interesting, deep discussion along the editorial lines chosen. It is not a democracy (see: reddit) but I find the moderators generally extremely fair.

As far as I understand, the moderators bury tons of stories (often political, link-bait, etc.), which nevertheless get traction quickly until the moderators step in. It is easy to get traction through click-bait.

Generating serious discussion is harder. This title promises "the stories that Hacker News removes", but the post is not really about those stories: the author does not analyze the comments under them or examine why each discussion derails or fails to be a good contribution to HN.

It is more that the click-bait title is a bait-and-switch, designed to generate easy outrage.

There's nothing remarkable here despite the traction this story is getting. It is part of the hidden workings that keep HN great. Dan and Scott (the moderators) do an extremely good and thankless job keeping the principles of this place alive.

You have no idea how hard they work and I've seen them make difficult and intricate decisions. (Sometimes as simple as detaching a thread that was derailing an important discussion.) In my opinion this story does not belong on the front page.

65827 2 days ago 0 replies      
Was it critical of Google or one of the other mega corps? I've noticed that such info oftentimes disappears quickly on here.
wookoouk 2 days ago 0 replies      
My university's firewall blocked the link :(
jacquesm 2 days ago 0 replies      
Nice article. Not so nice to rip the HN comments and display them below the article.
midgetjones 2 days ago 0 replies      
I guess the acid test will be what happens to this post. If the first one was removed via manual intervention for whatever reason, then this one surely will be too.
fidla 2 days ago 2 replies      
I think it was removed because of the blatant linking to your site, considered spam by most of us
Teach Yourself Computer Science teachyourselfcs.com
1274 points by kruse-tim  2 days ago   236 comments top 44
scandox 2 days ago 8 replies      
This is a really good list. I love the simplicity. I also agree that it is both worthwhile and very interesting to learn the fundamentals of CS.

That said, I think it is a mistake to assume that lots of Type 2 developers wander around in a perpetual state of under-achievement. Most of these people are indeed a different class of developer (I think the word engineer is positively abused), but many of them really have almost no professional requirement to understand fundamentals. Any more than they need to understand particle physics.

These developers are a class of systems integrators and they produce a lot of usable systems, at a quality level that represents appropriate trade-offs to the business case they are employed to address.

Yes, many will say this is a less elevated pursuit. It has its own challenges and mindset. It lives at a particular level of abstraction and its very existence assumes stability of that layer of abstraction. The fact that this breaks down sometimes is besides the point.

The reality is most developers probably do Type 2 work, though very many may have or aspire to have a Type 1 level of knowledge and insight. However I think it's unfair to portray a contented Type 2 developer as lacking in some essential.

Impossible 2 days ago 3 replies      
This is a solid list, but its a shame that no computer graphics resources are even mentioned. Although the reason for the omission is mentioned in the FAQ, I'd argue that computer graphics basics (images, basic rasterization, color spaces, etc.) are as fundamental as networking or databases. A link to Computer Graphics Principles and Practice (https://www.amazon.com/Computer-Graphics-Principles-Practice...) would have been nice.

I understand that most graphics resources out there focus on real-time 3D rendering for games or writing raytracers, which I agree are currently industry specific topics. Your average developer isn't going to write a vector graphics library as part of their day job, but the browser abstracts computer graphics in the same way it abstracts networking or compilers, so if the goal is to understand the underlying principles of software platforms you'll be working on every day I think computer graphics is a strange, biased, omission.

ghufran_syed 2 days ago 5 replies      
For those who need the structure of a formal course, or who want a CS degree for career reasons, the University of London's International program is a great option - it's very flexible, so easy to combine with full-time work, and costs around $2500 per year for 3 years. I'm around 2/3 of the way through, and find it helps force me to learn things I know I need to know, but might not make the time for otherwise.


The creative computing programme has a slightly more art/graphics emphasis, but is still rigorous: http://www.londoninternational.ac.uk/courses/undergraduate/g...

rezashirazian 2 days ago 0 replies      
Great list that covers the basics. For anyone interested to expand I would suggest the following:

(the description is taken from the corresponding courses I took in college which I found super helpful)

Programming paradigms: Examination of the basic principles of the major programming language paradigms. Focus on declarative paradigms such as functional and logic programming. Data types, control expressions, loops, types of references, lazy evaluation, different interpretation principles, information hiding.

A textbook on Haskell and Prolog would be recommended.

Computability: An introduction to abstract models of sequential computation, including finite automata, regular expressions, context-free grammars, and Turing machines. Formal languages, including regular, context-free, and recursive languages, methods for classifying languages according to these types, and relationships among these classes.

Introduction to the Theory of Computation by Michael Sipser
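As a small taste of the abstract models this course description mentions, here is a minimal DFA in Python. The transition-table encoding is just one common convention, not anything specific to Sipser's book; this machine accepts exactly the binary strings containing an even number of 1s, a classic regular language.

```python
# A deterministic finite automaton (DFA) as a transition table.
# This one accepts binary strings containing an even number of 1s.
DFA = {
    "start": "even",
    "accept": {"even"},
    "delta": {
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    },
}

def accepts(dfa, s):
    # Run the machine: follow one transition per input symbol, then
    # check whether we halted in an accepting state.
    state = dfa["start"]
    for ch in s:
        state = dfa["delta"][(state, ch)]
    return state in dfa["accept"]

print(accepts(DFA, "1100"))  # True: two 1s
print(accepts(DFA, "111"))   # False: three 1s
```

The same table-plus-loop shape generalizes to any DFA, which is why regular-language questions reduce to bookkeeping over a finite set of states.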

Explorations in Information Security: A broad survey of topics in information security and privacy, with the purpose of cultivating an appropriate mindset for approaching security and privacy issues. Topics will be motivated by recreational puzzles. Legal and ethical considerations will be introduced as necessary.

Someone already mentioned computer graphics which I excluded. I personally had the most fun in college in my graphics courses. They were hard but super rewarding and a ton of fun!

gravypod 2 days ago 0 replies      
I wish there was a way to include "What could have been" or "What could be" into the "What is" of an "education" in Computer Science.

Distributed Systems, Databases, Networking, and Architecture all have pasts full of much better solutions that were never adopted because of patents, cost, or some such reason, and that grow fainter with every passing day.

If courses like these consisted of a parallel "History" track, I think I'd be a more well-rounded graduate.

mamcx 2 days ago 2 replies      
About the part on databases:

 "but we suggest just writing a simple relational database management system from scratch"
This part is the one I'm most interested in, but also the hardest to get.

As explained there, it is very hard to get information about databases (it's all hunting for material here and there). So, how does one do this? How do you build a "basic RDBMS"?

Probably looking at sqlite will be the default answer, but that is not ideal. It is hard to see what the thinking process was from a finished, realized piece of code.

calcsam 2 days ago 3 replies      
Ozan and Myles also teach this stuff in-person in SF: http://bradfieldcs.com/.

I just finished their databases course and it was excellent.

gerry_shaw 2 days ago 4 replies      
FYI... The video content for SICP seems to be going away in a couple of days: https://www.youtube.com/playlist?list=PL3E89002AA9B9879E
partycoder 2 days ago 1 reply      
Personally I think one of the problems with self-learning is gaps in knowledge.

As a part of a formal education you get to learn what you like, as well as what you do not like much.

My advice to self-learners is: never engage in "cargo-cult programming". This means: do not touch or reuse code that you do not understand. Force yourself to understand. If you lack the time, write it down and follow up later.

hackermailman 2 days ago 0 replies      
The best way I found was to go through the CMU BSc requirements (or another university's), look up the public course pages, and pair the lecture notes (and occasionally lecture vids) with the TAoCP series, looking up the same topics but getting a thorough drilling in each by trying the problems in the book. Before I started doing this I kept forgetting material a few weeks after covering it, like when I had to rewatch a lecture on floating point to remember what the bias was.

If you look up 15-213 and get the book CS:APP that accompanies the course, it will more than prepare you to understand the MMIX fascicle (or the original MIX if you want): http://www.cs.cmu.edu/afs/cs/academic/class/15213-f16/www/sc...
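On the floating-point bias mentioned above, here is a quick self-contained refresher (illustrative only, not from the course materials): IEEE 754 single precision stores the exponent with a bias of 127, which you can verify by unpacking the raw bits.

```python
import struct

def float32_fields(x):
    # Pack a Python float as IEEE 754 single precision and pull apart
    # the sign, exponent, and fraction fields.
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    sign = bits >> 31
    biased_exp = (bits >> 23) & 0xFF
    frac = bits & 0x7FFFFF
    # float32 uses an exponent bias of 127: true exponent = biased - 127
    return sign, biased_exp - 127, frac

print(float32_fields(1.0))  # (0, 0, 0): exponent stored as 127
print(float32_fields(2.0))  # (0, 1, 0): exponent stored as 128
```

Biasing the exponent means all-positive stored values, so floats of the same sign sort correctly when compared as raw integers.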

esfandia 2 days ago 2 replies      
I love this. It has a Back to Basics, no BS approach to CS that appeals to me. I agree with all the recommendations. A couple of tiny comments:

- I know that learning C is not strictly speaking part of Computer Science, but it is a nice counterpart to SICP, ties in with other topics (such as computer architecture and OS) and should definitely belong to this curriculum. The authors of this site themselves have defended C in another blog post. Like pg would say, all you need is Lisp and C.

- IMO a better option for learning databases is Jennifer Widom's MOOC: http://cs.stanford.edu/people/widom/DB-mooc.html

busterarm 2 days ago 1 reply      
The book that I recommend for Networking, and that has been recommended to me by every full-time NetEng I've ever asked, is Interconnections.


Yes, the material is a bit dated. Yes, it won't give you the ins and outs of what you need to know. What it will give you is the why and from there you can figure out everything else you need to know.

javajosh 2 days ago 1 reply      
I strongly feel that Erlang really, really needs more visibility in the world. It is an important language for distributed systems, but the language itself is startlingly spare, using recursion and pattern-matching in lieu of conditionals. There are two resources that I like, a 3 week Future Learn course [1] and Learn You Some Erlang [2].

It is my belief that the Erlang "process" is a true object, as opposed to Ruby/Java/C++ etc object which is, ultimately, a thin easily-torn veneer over global spaghetti.

WhatsApp's $19B acquisition, with a 57-person team that could run a large, world-wide messaging system on Erlang, should also be considered a resounding endorsement.

Last but not least, I personally have come to see the overall trend toward statelessness as a coping mechanism for dealing with bad technology.

(If I could change my name to ErlangJosh, and if it sounded good, I would.)

1. https://www.futurelearn.com/courses/functional-programming-e...

2. http://learnyousomeerlang.com/
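For readers unfamiliar with the model, here is a rough Python analogy of the process-with-mailbox idea described above. Real Erlang would use spawn and receive, and adds preemptive scheduling, per-process heaps, and location transparency; this threading sketch only illustrates the isolation-via-messages point.

```python
import queue
import threading

class Process:
    """Actor-style 'process': private state, reachable only via messages.

    A loose analogy for an Erlang process, not a faithful reimplementation.
    """
    def __init__(self, handler, state):
        self.mailbox = queue.Queue()
        self._handler = handler
        self.state = state
        self._thread = threading.Thread(target=self._loop, daemon=True)
        self._thread.start()

    def send(self, msg):
        # The only way in: drop a message in the mailbox.
        self.mailbox.put(msg)

    def stop(self):
        # Sentinel message drains the mailbox, then the loop exits.
        self.mailbox.put(None)
        self._thread.join()

    def _loop(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:
                return
            self.state = self._handler(self.state, msg)

counter = Process(lambda n, msg: n + msg, 0)
counter.send(1)
counter.send(41)
counter.stop()
print(counter.state)  # 42
```

Nothing outside the process can mutate its state directly, which is the sense in which an Erlang process is a "true object" compared with shared-memory objects.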

johnhenry 2 days ago 1 reply      
I kept two text books after graduating college -- one was Structure and Interpretation of Computer Programs. I also remember going back through Harvey's videos online whenever I missed lecture... are those being taken down too? https://news.ycombinator.com/item?id=13768856
psiclops 2 days ago 0 replies      
I know of the people behind this, who founded a CS school. I've heard good things, but I hadn't seen this before!
barking 2 days ago 3 replies      
Are all those CS61 lectures from Berkeley shortly going to disappear from youtube?
crench 2 days ago 1 reply      
This could do without the "Why learn computer science?" section entirely.
mettamage 2 days ago 0 replies      
Couple of things about the distributed systems course.

1. Maarten van Steen -- one of the authors -- recorded screencasts in 2012 (see https://www.distributed-systems.net/index.php/books/distribu... ).

2. Maarten van Steen released an updated version of the distributed systems book this year.

Full disclosure: I followed Maarten van Steen's lectures back in the day :)

rhizome31 1 day ago 0 replies      
They recommend putting 100-200 hours into each topic. That would require giving it 8-10 hours a week for three years. Sounds feasible even with a day job, actually.
mattfrommars 1 day ago 1 reply      
Which demographic does this compilation of computer science books target? Considering the number of votes this post got on HN, it must be of some importance. I already have plenty of resources gathered up over time and am right now working on the first one, but really, who is it aimed at?

From the threads [slashdot, HN] I read yesterday, which related to me quite well [a 26-year-old trying to enter the software development field], all seem to conclude that 'you're a dinosaur if you're coding past 30'. This drew a really grim picture of where my future is heading. To me, there is a vast amount of knowledge to learn, and I don't know if I can do it within a 4-year span. There are numerous books, theories, and fields to work in.

How can someone around 30 stop coding and move to a more management-type position when there is so much to learn in this space?

paradite 1 day ago 0 replies      
SICP is a great resource for learning the functional programming paradigm, but is it a suitable resource for CS beginners?

There are only a few universities that still use SICP or its variants in their CS introduction modules.

I think a book on the imperative or OOP paradigm might be better and more relevant in today's context.

hackermailman 1 day ago 1 reply      
This list is short on programming language theory; here's a rigorous book I enjoyed, if you're interested in the formal definitions of things such as Abstract Syntax Trees or Types.

Robert Harper's Practical Foundations of Programming Languages. (free draft copy, also take the notation guide) http://www.cs.cmu.edu/~rwh/pfpl.html

There's videos for this book (and for Category Theory) @ OPLSS https://www.cs.uoregon.edu/research/summerschool/summer16/cu...

BeetleB 2 days ago 3 replies      
You know, as a former academic, I was reading this list and I immediately knew: This is written by someone in academia (and sure enough, you find out at the bottom it is).

I don't have a problem with this list per se. For all I know, it may be a good list and the designation of Type 1 and Type 2 engineers may be accurate.

But I wish I read a post from a Type 1 engineer in industry that mirrored what academics often write. I hardly find one. Why the disconnect? If the academics are so right, why is it mostly academics who preach this? There are more Type 1 engineers than academics, I'm sure.

Take my story: Was pursuing a PhD in physics/engineering and dropped out. Heavy on mathematics. And programming was always a hobby/passion. Went into industry in my discipline (not programming). Then decided to change careers into software.

Going in, I had the impostor syndrome. I had read quite a bit of CLRS in grad school on my own, but remembered little. So I took a bunch of Coursera courses to review all the basic algorithms, graph theory, etc.

My goal was that this was the bare minimum to survive, and I would work for a while and figure out what to focus on next (architecture? networking, OS? databases?).

Well, I've been working a bunch of years now, and there is no "next thing". Even the algorithms courses I took, while a lot of fun and interesting, play little role beyond what most Type 2 engineers will know!

That's just the reality: Most software jobs do not require you to know much beyond the basic data structures (hashes, sets, lists, etc.) and the complexity of their storage/operations. I looked for ways to use all the extra stuff I had learned (in essentially introductory algorithms courses), and did not find opportunities. I'm facing the inverse problem: I'm someone who knows some of this (or wants to), and I'm having trouble finding a job where this knowledge actually leads to more robust systems.

And it's hard to find the jobs where these things matter, and it is rare that they are paid more. Difficulty and complexity does not equate to higher pay. Market rules do. Trust me, I know. I was doing more challenging work before I became a software engineer, but I get paid more now because there were few challenging jobs.

I know people say it often, but I'll say it too: Communication and negotiation skills are more valuable than the topics on the page. Why spend your nights on diminishing returns when you can get pretty far with just the basics of negotiation? Most engineers are overeducated in terms of what they need to know when it comes to technical skills. But other important skills? We're very undereducated. Why work hard to be even more overeducated, while ignoring the deficiencies?

intrasight 1 day ago 1 reply      
A good list of topics. But there are many others. A CS undergrad now has many more options than the four or five available courses that I had when I was at Carnegie Mellon.

Next item: "Print yourself a computer science diploma" ;)

autorun 2 days ago 0 replies      
That's a good list of books, but it's terrible to start with. I'm pro-reading, pro-book, but I mean, you can't be motivated on learning computer science knowing that you SHOULD read all those books.
grexe 1 day ago 0 replies      
What the best book or video is, is always hard to say, but it's always interesting to see such collections, having studied CS myself and worked in the field for 20 years. I'd appreciate proper citations, though; the books' authors are missing. Also, a short rationale for why each title was selected would be helpful. Thanks for putting this together!
epigramx 1 day ago 0 replies      
> Computer Architecture

I found it fascinating to learn about the basis of computing: memory that is manipulated by a processor. It takes only 10 minutes to grasp the basic concepts. Meanwhile, the majority of the popular technology press still treats processors as mystical machines disconnected from memory that mysteriously make your game run faster.
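The ten-minute version of that basic concept fits in a few lines: a toy von Neumann machine in which the "processor" is just a loop fetching instructions from the same memory it manipulates. This is purely illustrative, not any real instruction set.

```python
# A toy von Neumann machine: instructions and data share one memory,
# and the processor repeatedly fetches, decodes, and executes.
memory = [
    ("LOAD", 7),   # acc = memory[7]
    ("ADD", 8),    # acc += memory[8]
    ("STORE", 9),  # memory[9] = acc
    ("HALT", 0),
    0, 0, 0,       # unused padding
    2, 40,         # data at addresses 7 and 8
    0,             # result lands at address 9
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]  # fetch
    pc += 1
    if op == "LOAD":       # decode + execute
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[9])  # 42
```

Once you see that the processor and memory are locked in this fetch-execute loop, claims like "a faster CPU makes your game faster" stop being mystical: the question becomes how quickly instructions and data can move through that loop.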

eaguyhn 1 day ago 0 replies      
Don't forget the Scientific Method (it is Computer Science, after all)


ousmanedev 2 days ago 0 replies      
I loved this list so much that I created a public Trello board that summarizes it and helps you track your progress if you're following the curriculum: https://github.com/ousmanedev/teach-yourself-computer-scienc...
sudeepto 1 day ago 0 replies      
Getting this message: The webpage at https://teachyourselfcs.com/ might be temporarily down or it may have moved permanently to a new web address. ERR_TUNNEL_CONNECTION_FAILED
rerx 2 days ago 0 replies      
As a physicist breaking into software engineering, but lacking education in CS, this list is extremely welcome to me.
deepaksurti 1 day ago 0 replies      
Two subjects that I think are missing yet important, though in the practical rather than theoretical context:

10. Software Engineering

11. The Art of Shipping

Shipping is a crucial skill to learn, if you don't know how to ship, then learning everything else is really moot!

gigatexal 2 days ago 0 replies      
Having bombed the programming interview for a Google Cloud position, and like a kid seemingly suffering from Stockholm syndrome, I think arbitrarily abstract and difficult interviews are overall a good thing: getting through one is like getting through basic training; it shows you have the grit to persevere. And in the case of programming, you at least understand data structures well enough to answer the canned questions. I know the initial reasoning behind it all was to find people who could think analytically but also be competent in the role. The downside, I think, is that the process biases against people who might be valuable members of a team but don't care about implementing a linked list, even though they understand the pros and cons of one, having used such a data structure (or a similar one) in production applications.
willbw 2 days ago 0 replies      
Appreciate the short list and not hundreds of links to resources.
andychong1996 1 day ago 1 reply      
This list is awesome. As a student, I really doubt a computer science degree will be worth the money spent... Degrees are expensive in most places nowadays.
aqibgatoo 2 days ago 0 replies      
Wow, awesome list. First time I have seen someone explaining why they gave preference to particular things over others in such a lucid and simple way.
philbarr 1 day ago 0 replies      
Does anyone know of a different decent networking book that's available on Safari?
james_niro 2 days ago 0 replies      
iTunes U has great computer science classes offered by major universities.
sidcool 2 days ago 2 replies      
I am a self taught programmer with 10 years experience. Would this help me?
jaddood 1 day ago 0 replies      
Great website! Thanks for sharing!
leog7 1 day ago 0 replies      
Very strange that there's nothing on computer security.
mzzter 2 days ago 0 replies      
palavsen 1 day ago 0 replies      
Really good compilation! I agree with almost everything on the list.
suyash 2 days ago 1 reply      
There are almost a dozen blogs and websites promising to teach yourself CS. A great initiative, but most people are either saying the same things or missing a few key points here and there. Everyone seems to tell it as if they know it best.
The Uber Bombshell About to Drop danielcompton.net
1080 points by dantiberian  2 days ago   505 comments top 11
harryh 2 days ago 10 replies      
Uber passengers only pay 41% of the cost of trips, with investor capital making up the difference

FWIW this assertion (which isn't really core to the central thesis of the post, but still) is wrong. That number comes from


but the author of that story misread the data. Uber only counts their cut as revenue, not the full cost of the ride.

Despite this repetition (now corrected, thanks!) of the incorrect data, I find the overall thesis of the post compelling! As a disinterested bystander, I'll be interested to see how it all plays out.

EDIT: It turns out the original 41% statement comes from http://www.nakedcapitalism.com/2016/11/can-uber-ever-deliver... not from the Financial Times. It can be hard to trace these things back sometimes.
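The accounting point is easy to see with made-up numbers (illustrative only, not Uber's actual figures):

```python
# Illustrative, made-up numbers -- not Uber's actual financials.
gross_fare = 100.0                     # what the passenger pays for a ride
uber_cut = 0.25                        # the share Uber keeps
revenue = gross_fare * uber_cut        # what Uber books as revenue
driver_payout = gross_fare - revenue   # passed through to the driver

# Mistaking booked revenue for the full cost of the ride inflates any
# ratio computed against it: costs worth 60% of gross fares look like
# 240% of revenue, which is how "passengers only pay X%" claims can arise.
costs = gross_fare * 0.60
ratio_vs_revenue = costs / revenue     # 2.4
ratio_vs_gross = costs / gross_fare    # 0.6
```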

bedhead 2 days ago 5 replies      
Among the circumstantial evidence, the self-funding aspect is what jumped out at me the most. This is the equivalent of a hedge fund manager buying something based on inside information, but only for his personal account. Why cut other people in on the wink-and-nod-guaranteed payday? Besides, Levandowski couldn't realistically raise money through proper channels if his pitch amounted to, "I stole stuff from Waymo."
Analemma_ 2 days ago 5 replies      
Yeah, this is a big deal. Uber has gambled everything on self-driving cars, what with their absurd burn rate and all. If that gets shut down because of trade secret injunctions, I don't think they have time to pivot to another strategy, especially after their reputation has nosedived.
mmanfrin 2 days ago 10 replies      
Outside of the case, the most striking thing to me about this is the level of detail that Google has over the logs and actions of their laptops. They were able to tell that a memory card was plugged in for 8 hours a year ago?

I have trouble finding exactly how a customer encountered a 500 10 minutes ago.
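That kind of detail usually comes from routine endpoint logging rather than anything exotic; a sketch of the idea, using a hypothetical log format (not Google's actual tooling):

```python
from datetime import datetime

# Hypothetical endpoint log lines -- the format is invented for illustration.
log = [
    "2016-01-11 08:02:11 usb-storage CONNECT id=SD-Card-01",
    "2016-01-11 16:04:53 usb-storage DISCONNECT id=SD-Card-01",
]

def attach_duration_hours(lines, device_id):
    """Hours between CONNECT and DISCONNECT events for one device."""
    times = {}
    for line in lines:
        date, clock, _subsystem, event, ident = line.split()
        if ident == f"id={device_id}":
            times[event] = datetime.fromisoformat(f"{date} {clock}")
    delta = times["DISCONNECT"] - times["CONNECT"]
    return delta.total_seconds() / 3600
```

With device events retained for a year, answering "how long was that card plugged in" is a log query, not forensics.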

GCA10 2 days ago 3 replies      
For all the noise and excitement at the start of these clashes, don't they usually get settled with a licensing agreement that both sides can live with?

Sometimes we also see a medium-sized payment (not ruinous) to address the allegedly bad conduct. That usually gets paired with some lawyer-like phrases that amount to a blend of quasi-apology and face-saving evasions.

It's still an interesting suit. But after Apple/Samsung, Oracle/SAP and many others, it's hard to expect that the eventual resolution lives up to the pre-trial buildup.

wyldfire 2 days ago 1 reply      
This part looks to be the most smoking-gun:

> December 13, 2016 - A Waymo employee was accidentally copied on an email from one of its LiDAR-component vendors titled "OTTO FILES". The email contained a drawing of what appeared to be an Otto circuit board that resembled Waymo's LiDAR board and shared several unique characteristics with it. (Filing 59)

rsp1984 2 days ago 1 reply      
(Possible) Consequences: ... Without any way to raise more money or reduce their costs, Uber runs out of money and folds.

Highly unlikely. Why? Investors who stomach multi-billion dollar annual losses will probably just shrug off a mere lawsuit or a "bad media narrative". If the choice is to either write off $15 billion or to give another couple of billion to help the company through a rough patch (what a buying opportunity!), I think I know what investors are going to do.

They may demand Kalanick's head in the process (and I think they will -- not that it would really hurt him much personally though...), but seriously, a whole other level of crap would have to happen before investors get comfortable with the thought of letting go of those $15 billion.

booleanbetrayal 2 days ago 3 replies      
I can't imagine what Levandowski and Uber were thinking here. The timeline is just too constrained to even be taken seriously.
d--b 2 days ago 4 replies      
It's not a bigger bombshell than what was already in the news. We know what stealing means. The guy is an engineer at Waymo and plugs a USB stick in a laptop to download 9GB of code. And he hoped to get away with this? I'm pretty sure everyone at Google knew, they were just waiting to time it right.

"Hey, what about now? When Uber is taking fire from all directions?"

mikepurvis 2 days ago 2 replies      
No relationship between Uber's Ottomotto and the self-driving vehicles of OTTO Motors: https://www.ottomotors.com/company/newsroom/press-releases/o...

Disclosure: employee of Clearpath, the company behind OTTO Motors.

salimmadjd 2 days ago 3 replies      
I actually think this might be a blessing in disguise for Uber. I always argued self-driving cars would destroy Uber should Apple or Google want to enter this space.

Why? Because self-driving cars are basically a fleet service driven by software. Once you remove the driver (where Uber spent so much on acquisition), the only differentiator is the consumer-facing experience. Neither Uber nor Lyft will have as much power as Apple and Google, since those two ultimately own the mobile experience.

I ultimately envision this business as a kind of Kayak for rides on the phone, managed by Siri or Google/Apple Maps, which will call the nearest taxi or the cheapest rate, aggregating from multiple possible vendors: from larger fleets (Uber, Apple, Google) to smaller individually managed fleets. Car companies might decide to enter that market too, in collaboration with a financial underwriter.

So having a network of drivers ultimately gives Uber some leverage, and removing them from the equation would, I think, actually destroy Uber. This is why I think this could be a blessing in disguise.

How the Maker of TurboTax Fought Free, Simple Tax Filing (2013) propublica.org
780 points by apsec112  3 days ago   435 comments top 10
xupybd 3 days ago 11 replies      
I live in New Zealand. I don't have to file taxes: if my income is a standard wage, my tax payments are automatic. My employer deducts them from my wage and they are sent to the IRD (our government tax department). My student loan and super payments are also automatic. If there is anything wrong, then at the end of the tax year I can file to correct things.

I also get donation rebates; this is a one-page form that lists all my charitable donations. Very easy, very quick.

All my details are available to me online. All transactions are there and it's very transparent.

Why would anyone oppose a simple system like this?

harryh 3 days ago 1 reply      
Intuit certainly does this, but blaming them for our overly complex taxes is wrong. They might move the needle a little bit but they aren't the primary driver. The primary driver is all the constituencies of all the little things in our taxes that add up to make them complicated.

Like ObamaCare? That came with two additional forms. Live in a high-tax state and like deducting your state taxes? That makes things more complicated. Big fan of deductions for education or child care? Those come with complexities. I could go on and on...

Now maybe your answer to all of these questions was "no", but there are a lot of people that say "yes" to a lot of these questions. It's really hard to upset that apple cart. Lobbying doesn't have much to do with it.

schainks 3 days ago 1 reply      
So, just for a different perspective: Taiwan has a relatively simple tax filing experience, and the government invests significant resources to make it as simple as possible (which does result in many people paying their taxes properly, on time).

While their digital tools for filing taxes make the telegraph feel modern, the "in person" experience is full of helpful people and takes about 90 minutes including travel time.

My main criticisms of the filing process:

1. The tax bureau has a one-month timeframe when you can go, in person, to file "on time", but only during business hours. Any 9-5 worker must take time off to go file in person. It's a pretty nice customer experience - the volunteers in the bureau help you file your taxes with the highest deductions possible, it gets cross-checked by a government tax clerk, and you're done.

2. Make the software work better on a modern OS and give it modern usability. The UI is _really_ crap, and I only run it in a VM just in case, because the download site also looks shady.

3. Locals gaming the system can make your life harder as a working-from-home small business owner. Many landlords don't pay income taxes on their properties, which means tenants cannot register business addresses at their homes, and must "rent" an address for about $100 / month.

4. Withholdings on foreigners, by default, are artificially high as a "precaution".

5. Refunds process in August after filing in May, because they still process every return largely by hand.

6. Double taxation on people like US citizens. The tax clerk has asked friends of mine, while filing, to show their US tax return to make sure there are not more taxes owed. They can ask, but it's not enforceable. So why do it? Because the tax rate on that income earned elsewhere can be as high as 30-40%! The tax clerk gets to decide how bad of an offender you are. GLHF.

7. If your income goes down compared to the year before as a foreigner, you will probably pay a penalty for "making less money" because they suspect tax evasion. Pay the fine (less than $USD 100) and walk away, or they dig your records hard and you could wind up in a situation like #6 above.

tschwimmer 3 days ago 11 replies      
Serious question: Is it possible for society to criminalize rent seeking behaviors like this? It seems clear that there's no benefit to keeping the status quo tax filing system except for the benefit of tax preparers. What's stopping the US from creating a law that says if a company attempts to lobby for something in bad faith (like the tax example), they will face sanction?
tuna-piano 2 days ago 5 replies      
I haven't seen a legitimate argument against the IRS calculating taxes automatically, so here is one.

The more invisible taxes are to the individual person, the less they think about that money (and the higher taxes can go without them complaining too much).

Rent feels expensive because every month you write a check for rent. However, for many people, taxes are a much bigger expense than rent. But taxes don't feel as painful, because people don't write a check every month for taxes. Taxes are just invisibly withdrawn from your paycheck.

The easier and more invisible it is to pay taxes, the more you forget about how much money that really is. If you believe in constrained government, there's a good case to be made that we should make tax payments more visible, not less.

mb_72 2 days ago 3 replies      
I have experience as a taxpayer in Estonia (employee only) and Australia (employee and business owner). In Estonia I log in to my bank with my citizen's ID card, click through a few screens... and my tax return is done. All the financial information supplied by my employer to the government is immediately visible and checkable, not to mention donations made to tax-deductible organisations are also immediately visible. It is incredibly quick and convenient. In Australia it's just easier to leave everything to my accountant, as they already have my quarterly business returns information; the cost is still pretty reasonable (500 AUD or so for the yearly business returns and the personal returns for my wife and me). But... yeah... in Estonia it just rocks. The ID-card based functionality for banking, digital signatures, tax... just awesome.
hackuser 3 days ago 6 replies      
A tangential issue: It seems almost impossible to me that the systems of tax preparation services, whether cloud-based, local software, or offline, are secure.

The standard of security is, make the target more expensive to breach than it's worth to the attacker. How much would it be worth to have access to the tax returns of large swaths of the population?

I don't know the answer, but I'm guessing it's easily worth billions of dollars. Foreign intelligence services would very much like that information, as well as sophisticated criminals.

I am very doubtful that Intuit or H&R Block, for example, invest in security sufficient to protect themselves against that level of attack.

azernik 3 days ago 0 replies      
There's a fascinating article linked from this one: https://www.propublica.org/article/turbotax-maker-linked-to-...

See the final paragraphs, which I've copied below - it's essentially a version of the Citogenesis effect.


"Dennis Huang, executive director of the L.A.-based Asian Business Association, also told ProPublica he was solicited by a lobbyist to write about return-free filing. When the lobbyist sent him a suggested op-ed last summer and told him the proposal would hurt small businesses, Huang wrote an op-ed in the Asian Journal that claimed Asian-owned businesses would not only spend more time paying taxes, but they'd also get less of a refund each year.

Huang declined to disclose the lobbyist's name, but acknowledged he didn't really do his own research. "There's some homework needed," he said.

Oregon's Martin did some research on return-free filing and now supports it. She also co-published a post about the issue and the PR efforts related to it because, she says, she was alarmed that other nonprofits could easily agree to endorse a position they did not fully understand.

"You get one or two prominent nonprofits to use their name, and busy advocates will extend trust and say sure, us too," Martin said."

elberto34 3 days ago 1 reply      
The deep cynicism here is how people think TurboTax is doing a public service by making taxes easier and cheaper; that's how effective their marketing is. Convince the public there is a problem that only said company can solve.
Upvoter33 3 days ago 2 replies      
Well, of course. Imagine you are a business and the government could change a law and put you out of business. Wouldn't you put as much money as you could into preventing that change? This is why money in politics causes problems...
GitLab acquires Gitter, will open-source the code venturebeat.com
802 points by marcinkuzminski  11 hours ago   259 comments top 39
timanglade 11 hours ago 0 replies      
Tim, Marketing at GitLab here. Sorry for the false start, a draft leaked out via Twitter but the announcement was scheduled to run at 10 Pacific.

You can read Gitter's post: http://blog.gitter.im/2017/03/15/gitter-gitlab-acquisition/

And here is what VentureBeat had to say about the news: http://venturebeat.com/2017/03/15/gitlab-acquires-software-c...

UPDATE: and here is GitLab's post with a few more details: https://about.gitlab.com/2017/03/15/gitter-acquisition/

hamandcheese 9 hours ago 18 replies      
Somewhat off topic, but I was once very interested in working for GitLab. According to their compensation calculator, though, I'd be making less than half what I make now (also working remotely). They ding me for living in a comparatively low cost area.

I'm surprised they are able to attract talent to achieve what they have done so far. It makes me worry somewhat about my future prospects in an increasingly globalized talent pool.
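GitLab's calculator is public; its gist is a benchmark salary scaled by role, seniority, and a location factor. The numbers below are invented for illustration, not GitLab's actual formula:

```python
# Invented benchmark and factors -- illustrative only, not GitLab's numbers.
SF_BENCHMARK = {"backend_engineer": 160_000}
LOCATION_FACTOR = {"san_francisco": 1.00, "low_cost_us_town": 0.45}

def offer(role, level_multiplier, location):
    """Benchmark salary scaled by seniority and a cost-of-living factor."""
    return SF_BENCHMARK[role] * level_multiplier * LOCATION_FACTOR[location]

# The commenter's complaint in one line: same role, same level,
# less than half the pay purely because of where they live.
low_cost_offer = round(offer("backend_engineer", 1.0, "low_cost_us_town"))
```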

Philipp__ 11 hours ago 6 replies      
Every move GitLab makes raises them in my eyes, personally. A few weeks ago I started using it as my main "portfolio"; it's not much, but I really like their way of thinking, their openness, and the services they provide. Their whole ecosystem is looking really interesting right now, and I hope they will continue to advance and grow.

Edits: typos, wrote from phone.

nico 3 hours ago 4 replies      
Up until this week I was in love with Gitlab. We've been using it at work for over a year as our main repo, code review tool and CI (tried self-hosted first, then moved to gitlab.com).

However, the service (gitlab.com) is constantly having issues, most of them not reported on their status page or on their twitter status account. For the last week it's been practically unusable, to the point where our whole dev team combined has wasted almost a hundred hours just re-trying builds and deployment jobs. Yesterday we tried, unsuccessfully, moving to the new AWS tools (CodeCommit, CodeBuild and CodePipeline), and today we just moved back to Bitbucket + CircleCI (we use RoR if you are wondering).

Unfortunately today I couldn't seriously recommend gitlab.com to anyone needing a reliable hosted repo + CI solution (maybe self-hosted works better though, YMMV).

Regardless, I have a deep respect for what Gitlab as a company has done so far. After looking into repo + CI options I've realized that they've created probably the best all-in-one platform out there, at least their vision/concept. Wish them the best and hope to use their service again in the near future once they have their stuff together.
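For what it's worth, the build re-trying can at least be scripted: GitLab exposes a documented job-retry endpoint (POST /projects/:id/jobs/:job_id/retry). A sketch that just builds the request; the token and IDs are placeholders:

```python
import urllib.request

def retry_job_request(base_url, project_id, job_id, token):
    """Build a POST request for GitLab's job-retry API endpoint."""
    url = f"{base_url}/api/v4/projects/{project_id}/jobs/{job_id}/retry"
    return urllib.request.Request(
        url, method="POST", headers={"PRIVATE-TOKEN": token}
    )

# Placeholder IDs and token; urllib.request.urlopen(req) would send it.
req = retry_job_request("https://gitlab.com", 1234, 5678, "<token>")
```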

sandGorgon 9 hours ago 2 replies      
@sytse - this is huge. You can move the YC internal slack to gitter as well ;)

But seriously - people are willing to throw money at an enterprise Gitter - especially after https://medium.freecodecamp.com/so-yeah-we-tried-slack-and-w...

Look at the number of people begging Discord to take money from them - https://feedback.discordapp.com/forums/326712-discord-dream-...

gitter always had search working - discord just got it recently.

caio1982 11 hours ago 3 replies      
"Next piece of wow: we will be open sourcing all of the Gitter"

The Gitlab folks really know how to do it. It is of course the rational approach to it, but still, that's a bold move.

slg 11 hours ago 3 replies      
Does anyone have an insight into the differences between Slack, Mattermost, and Gitter? I have only ever used Slack before but it seems like Gitlab is already heavily invested with Mattermost. That makes me wonder what the future looks like for both Mattermost and Gitter. Are they different enough that they can both coexist without hurting either product or is one of them destined to be folded into the other in hopes of taking on Slack more directly?
erlend_sh 4 hours ago 0 replies      
This is pretty weird to me:

> What about Mattermost, how is this different?

> Gitter was built to be used in the open. We've always seen Gitter as a network, or a place where people can come to connect to one another. Team collaboration, whilst possible, has never been a core aspect of the Gitter experience.

> Mattermost is a powerful, integrated messaging product for team collaboration - we will continue to ship and recommend using Mattermost for internal team communication.

Surely GitLab would be better off investing fully in a single chat platform? The road to making Gitter good for internal team communication is not particularly long or winding.

kureikain 2 hours ago 0 replies      
I really like GitLab and this acquisition.

When I first started as a developer, I didn't know how people do things in real production. I didn't know how real companies build real software.

Nowadays, with all the open sourcing from real companies that get acquired or shut down, I can finally learn a lot from them.

Recently Stajoy open-sourced everything, and now Gitter is next. I'm sure I will learn a lot by consuming the real code that runs in production.

Thank you GitLab/Gitter.

systemfreund 9 hours ago 0 replies      
Once it's open sourced, we could finally implement a proper matrix.org bridge. Currently all messages are relayed through a special Gitter user called matrixbot.
simplehuman 10 hours ago 1 reply      
This appears to be more of an acqui-hire. Also, since Mattermost is bundled as part of GitLab, it makes me wonder why they didn't acquire Mattermost instead.

This is good though, because Mattermost is open source only on paper. They refuse to integrate important features even when people contribute code (to protect their commercial version).

Third, people see GitLab as a GitHub competitor. But really it competes with Atlassian.

securingsincity 11 hours ago 2 replies      
I hope this spurs more OSS communities to move to Gitter. As much as I enjoy Slack, limits on features like search and invites create friction that I think a service like Gitter can help solve.
itchyjunk 1 hour ago 0 replies      
Have used Gitter a few times. I've heard from channel owners a few times that it lacks moderator tools. OpenAI has their channel there too.
brandnewlow 11 hours ago 1 reply      
I'm a member of a Gitter for a smart mirror project. It's been a fun way to connect with other fans of the project and the maintainers. Feels like a smart move.
Dangeranger 11 hours ago 1 reply      
This seems like a good match.

I've been helping some early coders in the FreeCodeCamp Gitter and have been impressed with the quality of the app for a company with little capital. They have done a lot with limited resources.

It would be better in my opinion for small Slack communities to transition to Gitter since Slack has said they do not plan to support large scale free communities in the long term.

edit; typo

yani 9 hours ago 0 replies      
That is awesome news. I have been using GitLab for the last year. I switched my personal projects and 3 organizations I was advising over from GitHub. They won me over with their data centers in Asia and Europe. GitHub is extremely slow in Asia and they do not want to change.
omouse 6 hours ago 0 replies      
Hah, nice. I love that a whole business can run on free/open source plus selling hosting/support.
catshirt 10 hours ago 0 replies      
i was just hating on GitLab here a few days ago so i will take the opportunity to say this is really cool on a bunch of levels.
ohstopitu 10 hours ago 6 replies      
I love GitLab, but sadly quite a lot of services that offer free accounts for open source projects (TravisCI comes to mind) don't integrate with GitLab.

It's one of the few reasons why I am still using Github :( .

radarsat1 7 hours ago 1 reply      
What the heck is Gitter? A public chat service? Sounds like yet another way for people to bug you about things that don't work, but without the controlled discussion of opening a proper issue. As a developer or maintainer, what value does this bring to a project?
nxrabl 9 hours ago 0 replies      
Discussion at: https://gitter.im/gitterHQ/gitter

The goal seems to be to have the code open sourced by this June, with all history included.

Stack details here: https://stackshare.io/gitter/gitter

DeepAndDark 6 hours ago 0 replies      
I hope the search in Gitter channels gets better now that it's being open sourced; there are good tips and info hidden in there. Maybe tags or upvotes would solve this.
dorianm 9 hours ago 0 replies      
Think about your favorite GitHub repo, or even just a very specific one, and there are always a few people talking about that repo: https://gitter.im/rails/rails
dang 10 hours ago 1 reply      
realPubkey 10 hours ago 0 replies      
Maybe we can then fix all the UI bugs in Gitter that pissed me off so often.
ChuckMcM 6 hours ago 1 reply      
So the race to acquire the next team dev solution heats up. Slack + github + trello equivalent
orsenthil 10 hours ago 0 replies      
The blog post is now up: http://blog.gitter.im/
mkurz 11 hours ago 0 replies      
Would be interesting to know how much they paid
gregjw 5 hours ago 0 replies      
Solidly great business decision, well done.
marknadal 5 hours ago 0 replies      
Congrats to the Gitter team, you guys are awesome and have a far, far superior product to Slack.
dewiz 11 hours ago 2 replies      
next, I hope to see some "content" offer attached to gitlab's ecosystem, for instance a strong collaboration with stackoverflow/OSQA
alpb 11 hours ago 5 replies      
Can mods fix the link please, it's currently 404.
iplaw 11 hours ago 0 replies      
Looks like the Gitlab v. Github battle rages on.
dantrevino 7 hours ago 0 replies      
great news!
m0sa 8 hours ago 1 reply      
`git pull gitter`
romanovcode 11 hours ago 3 replies      
This leads to 404. Their official blog says nothing about it (http://blog.gitter.im/) and Twitter also says nothing about it (https://twitter.com/gitchat).
ausjke 10 hours ago 0 replies      
I thought Gitter was part of GitHub; what ignorance on my part.
Is Facebook a Structural Threat to Free Society? truthhawk.com
718 points by jonstokes  1 day ago   396 comments top 16
DanielBMarkham 1 day ago 31 replies      
When I was a kid in the 70s I remember reading a national magazine article about another kid my age who had his own computer. Amazing! This was something I wanted.

Reading on, it described how he had built his computer from electronics and operated it from his attic. He had quite a few programs for his computer. One he liked the most allowed him to simulate buying and selling of stocks.

If you've ever read any ads from that period, the implication is clear: computers are awesome because they are going to challenge us to become better people. They will teach us at a speed we can learn, they will reward us as we progress, and the obstacles and learning will get more and more advanced.

People who don't have computers are going to be missing out -- on self development.

Contrast that to my trip the other day by commercial air travel. Everywhere I went, people were on their phones. Were they learning foreign languages? Becoming experts at symbolic logic or global politics?

They were not.

Instead they were playing the stupidest games imaginable. Facebooking, taking quizzes where any moron with the ability to type would get 90% correct -- and then sharing the results with their friends.

Zuck and others figured it out. Computers don't have to be computers. They have to be video games. Who gives a shit whether the guy on the other end is learning to be a better person. Challenge them with idiotic trivial tasks, then reward them with blinky lights, sound effects, and the imagined praise of their peers. They'll do that shit all day long. All they need is more batteries.

Yes. It's a problem.

ruddct 1 day ago 15 replies      
Facebook is:

* One of the most addictive products the world has ever seen (Opioids, another such product, were used to overthrow countries)

* The single most important media company in the world

* Controlled by one person

Threat to free society? Jury's out. But at this point, it certainly seems worth regulating.

(Edit: The above points are not meant to paint FB/Zuck in a bad light. To their credit, they've built an incredible ecosystem and a mind-bogglingly good product. We all strive to create sticky/addictive products.

My point is: When your product is incredibly addictive to a large chunk of humanity, regulation should be considered)

gthtjtkt 1 day ago 2 replies      
It's not Facebook, it's the fact that we're all too happily Amusing Ourselves to Death: http://a.co/frMmE2s

We were keeping our eye on 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held. Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares.

But we had forgotten that alongside Orwell's dark vision, there was another - slightly older, slightly less well known, equally chilling: Aldous Huxley's Brave New World. Contrary to common belief even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will be overcome by an externally imposed oppression. But in Huxley's vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think.

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny "failed to take into account man's almost infinite appetite for distractions." In 1984, Orwell added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we fear will ruin us. Huxley feared that what we desire will ruin us.

This book is about the possibility that Huxley, not Orwell, was right.

alistproducer2 1 day ago 4 replies      
The events of the last couple of years caused me to evaluate the psychological toll that my self-induced exposure to media, of all forms, was taking. I came to the conclusion that with social media, mankind was participating in the largest social experiment of all time and just hoping that things turned out well. I decided the results, so far, have not been promising, and I no longer wanted to participate. I deleted my FB and canceled cable. Now I choose what I'm exposed to. I advise more people to do the same.
decasteve 1 day ago 3 replies      
The threat is that Facebook has become too big to fail. A major leak of private data and messages of its users would be devastating to society given the scale.

What happens if/when Facebook fails as a company? What happens to the data then? It gets sold off. That's a scary prospect.

Facebook is in the fickle game of Internet advertising. When the noise overcomes the signal in what Facebook shows, when the content of users' connections gets drowned out by advertising, people will leave in droves. When advertisers fail to see the return on their investment, the money will dry up.

darpa_escapee 1 day ago 1 reply      
Any kind of non-democratic for-profit organization is going to have incentives that don't align with "free society".

When one of those organizations runs the top centralized content and communications silo, and uses it to censor, stalk its users, and promote its own or its sponsors' interests, it becomes a threat.

owly 1 day ago 2 replies      
Deleted FB a few years ago; I haven't missed it for a second. I thought we reached peak FB about a year ago and was obviously wrong. I honestly don't understand why people can't communicate directly with each other, given the insane number of tools available and how cheap data, domains, and services have become. There is absolutely an "us vs. them" going on between those who are on FB and those who have opted out. I'd love to get more people to opt out, and this article is one of the better arguments out there. For those who want to try a true block of FB and all its works, try this... https://github.com/jmdugan/blocklists/blob/master/corporatio...
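The linked blocklist is just a hosts file; applying one amounts to pointing each domain at an unroutable address. A sketch with an abbreviated domain list (the full set lives in the linked repo):

```python
# Abbreviated domain list for illustration; the linked GitHub blocklist
# contains the full set of Facebook-owned domains.
DOMAINS = ["facebook.com", "www.facebook.com", "fbcdn.net"]

def hosts_lines(domains, sink="0.0.0.0"):
    """Format /etc/hosts entries that null-route the given domains."""
    return [f"{sink} {d}" for d in domains]

# On Linux/macOS these lines would be appended to /etc/hosts (as root);
# after that, lookups for the listed domains resolve to the sink address.
entries = hosts_lines(DOMAINS)
```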
notacoward 1 day ago 4 replies      
(Disclosure: I have accepted a job offer at Facebook, but not yet started)

Does Facebook really have a unique amount of data? Google also knows a lot about people through billions of searches a day. Apple can learn a lot through people's iDevices, and Google (again) or Samsung can do the same with Android. So can any cellular service provider, or anyone running coax or fiber into your home. The government can tap any or all of those. At least Facebook doesn't have an army, or paramilitaries like DHS or just about any sheriff's department I've ever encountered.

I'm not saying the author's concerns are invalid. I've had occasion to think about these exact issues a lot, and I'm sure many of my soon-to-be colleagues have too. The way I see it, Facebook and other social media occupy much the same position as phone companies used to, both in how they facilitate interaction and in the privileged financial/infrastructural position they hold. There's good in that (e.g. the ability to pursue the kinds of speculative projects that Bell Labs was famous for). There are also dangers, no question.

The thing is, if it wasn't Facebook it would be someone else. There's no shortage of others ready to step in if Facebook alone were targeted with laws and regulations. Instead of worrying about Facebook specifically, we need to think about what a modern "common carrier" law should look like in the social-media age. Perhaps some kinds of regulations on use of information do make sense, but that dialog isn't likely to be very constructive so long as most of the people on one side seem to be free-market fundamentalists betraying their own principles by singling out one company among many.

jimmytucson 1 day ago 2 replies      
I find it amusing that news media outlets are stoking fear and resentment over the government hacking technology to spy on us when the information it gathers is a mere shaving of what corporations like Alphabet and Facebook have access to on the back end.

How is it preferable that a handful of incredibly talented, well-funded, private companies know more about you than your mother or your best friend (or arguably yourself)? Why are people more frightened by a bureaucratic government agency led by Donald J. Trump than the world's leading researcher in artificial intelligence, who owns the index to the entire internet, and creates things like this: https://www.youtube.com/watch?v=-7xvqQeoA8c? Why is Wikileaks working so hard to protect these companies?

twsted 1 day ago 0 replies      
Yes and "tech savvy" people should educate their friends when possible.

Most of my friends, for instance, don't realize that FB knows most of their browsing history through the omnipresent Like button.

And we must fight the "I-have-nothing-to-hide" attitude.

bantunes 1 day ago 1 reply      
For me, the scariest bit is that I tell people about this (techies and non-techies alike), they become aware of the consequences of their every move being tracked, and they still don't care. At all.

We're either collectively retarded and it will take very little to rule us all, or we're venturing toward a society where privacy simply does not exist but it's... fine? Can society adapt to this, so that someone having a sex tape or their browser history leaked is no big deal to anyone?

I don't know which is scarier.

tunesmith 1 day ago 1 reply      
I deleted Facebook from my phone a while ago, and don't miss it. I've been surprised how much I don't miss it, honestly - I get fewer notifications and haven't been tempted to reinstall it once.

The website is another matter. I felt reluctant to cut the cord entirely. At least visiting the website feels more deliberate than reacting to all the phone notifications.

But shortly after the election, I did log out, and kept myself logged out for a while. My fingers would still take me to the page, but the login screen would remind me, and I'd close the tab.

I finally logged back in a couple of weeks ago.

I do feel like I've noticed a couple of subtle differences. I think the political discussions are a lot less useful than I used to think. I especially think my friends posting political awareness posts are less useful than I used to think. No sense preaching to the choir unless it's actual surprising information. I think the months off started to make me feel less like a Democrat and more independent (although far from Trumpy-Republican.) I'm more free-speech and less boycotty. I'm less tempted to unfriend people that voted differently than I did.

I generally believe that the most ridiculous opinions I see in comment threads on facebook are... well, I have no way of knowing whether those comments are by real people or by bots. So I feel less like engaging with them.

Facebook is more enjoyable when you use it to connect with your friends - sharing photos, good personal news, etc. Current events, not so much.

elorant 1 day ago 0 replies      
The biggest problem with Facebook is that it's a walled garden. Whatever you write there isn't accessible from the open web. It's not even searchable inside FB itself. I'm sure this is done on purpose; if people knew that the crap they write could be searchable, they'd be much more reluctant to express themselves.
rrggrr 1 day ago 0 replies      
Black box warnings exist in the drug industry because it's understood that people do not understand complex risks and don't read the fine print. Facebook IS a threat only because many of its users are sharing information ignorant of the risks.

HIPAA exists in healthcare because it's understood that data can be misused to the great detriment of patients. Facebook IS a threat only because its users have no recourse over the misuse of their information.

Products liability laws exist for manufactured goods because raising the cost of failing to protect your customers results in safer products, and because it decreases government's obligation to support injured consumers. Facebook IS a threat only because there is no cost associated with injuring users, and little awareness of how Facebook is injurious.

It's a systemic issue beyond Facebook. Legislative reform is required.

chiefofgxbxl 1 day ago 1 reply      
With the average American now spending 40 minutes on Facebook per day [0], it may be worth checking out this TEDx talk: Distracted? Let's make technology that helps us spend our time well | Tristan Harris [1] (he gives a good design example at 4:00 in). At that rate - 40 minutes × 30 days, about 20 hours a month - roughly an entire waking day is wasted each month essentially scrolling through a newsfeed.

[0] http://www.businessinsider.com/facebook-monthly-average-time...
[1] https://www.youtube.com/watch?v=jT5rRh9AZf4

hasenj 1 day ago 6 replies      
IMO this is a side product of the "software must be free" mentality.

Frankly, software should not be free. When you let people give you things for free, you basically let them control you and take away your freedom.

Rand Paul: NSA Routinely Monitors Americans Communications Without Warrants theintercept.com
538 points by remx  2 days ago   269 comments top 13
grandalf 2 days ago 8 replies      
While this article offers the clearest presentation of the issue I've seen to date, I think the key point (which the article points out halfway in) is that the truth of Trump's claim that he was wiretapped does not depend on the presence or absence of a FISA warrant.

The system is designed to be used without warrant, so those harping on the detail of whether or not a warrant existed that had Trump in scope are not focused on the core issue.

It would be nice if all surveillance could be traced back to FISA warrants, but Snowden's revelations make it clear this is absolutely not the case.

exabrial 2 days ago 7 replies      
My opinion on hn was unpopular when I spoke out against Apple during the San Bernardino affair, because the FBI seemed to have the proper DOJ signoff and I think the motivation was obvious.

However, this is unacceptable. We are a society of laws, and one of them is due process. The spying likely started during the Bush years, and Obama somehow escaped scrutiny for continuing the program (even with the Snowden leaks). Hopefully it finally gets shut down during the Trump administration, even if merely because the media seems far less tolerant of his transgressions.

mgamache 2 days ago 6 replies      
The problem with the press coverage of this topic is a lack of personalization. What does the government know about me, and why is that important? I think most US citizens would be shocked to learn that most of our personal information is accessible without a warrant (phone records / bank account / email / web history / phone location data / car location / purchase history / Facebook, etc.) - and by how much the government can infer from that information.
1001101 2 days ago 2 replies      
It's interesting that the article mentions then Sen. Obama's change of heart re: the FISA Amendments Act - saying he would filibuster it and then voting for it. I wonder what led to his change of heart. I haven't heard that explained.
emehrkay 2 days ago 2 replies      
Where exactly does Paul stand on issues of privacy like this? You'd think he is against it, but he is also okay with ISPs selling your browsing history.


Darthy 2 days ago 3 replies      
Of course the other 7.2 billion people in the world are also routinely monitored, but since this happens on foreign territory, they have no recourse against the practice.
linkregister 2 days ago 0 replies      
The author of this article did a poor job of refuting Susan Hennessey's statement that reverse targeting is unlawful and not practiced. Taking an excerpt of a Hayden speech and then highlighting his statement that "communications with one end in the U.S." are the most interesting doesn't really prove this.

The author would have a stronger argument by sticking to the facts. Searches of U.S. persons without a warrant are directly at odds with the language in the 4th amendment of the Constitution, full stop.

lordnacho 2 days ago 6 replies      
I have a question about this whole government intrusion thing. Perhaps a lawyer can explain:

Suppose the government gets a warrant to wiretap some guy. He happens to get a call from his lawyer, and the government overhears that he's committed some crime.

Now there's an attorney/client privilege preventing you from directly producing the tape (is there?) so you can't just do that. But the fact that you've heard this means as an investigator you'll probably pursue this guy much more aggressively, and perhaps gather other evidence rather than give up.

How does that work?

willvarfar 2 days ago 2 replies      
Why aren't politicians and their staff using end-to-end encrypted VOIP?
caycep 2 days ago 0 replies      
Dana Priest and William Arkin's project documents a lot of the issues w/ the intelligence bureaucracy: http://projects.washingtonpost.com/top-secret-america/
wallace_f 2 days ago 0 replies      
The world would be a much worse place without people like Greenwald.
youdontknowtho 2 days ago 6 replies      
strictnein 2 days ago 1 reply      
One feels that Paul doesn't quite have a full grasp of the techniques he's describing (and it sounds like he's conflating a lot of stuff that Snowden leaked), but that headline sure gets the clicks, so let's go with it. Especially since it's Greenwald, who uses Paul's jumbled mess of an explanation to burn everything down.

What he's proposing doesn't even make sense: instead of getting a warrant to record the American, the NSA targets the foreigner? But what if they call someone else overseas? Or call people in the US? Seems like a really suboptimal way of targeting someone. And a low-level employee could unmask the caller? Sure, and that could also lead to that employee getting fired and prosecuted. I can access lots of data at work, but I would be shown the door and possibly sued if I did so.

Study: Immigrants Founded 51% of U.S. Billion-Dollar Startups wsj.com
577 points by dankohn1  2 days ago   659 comments top
bayesian_horse 2 days ago 29 replies      
One of the basic premises of anti-immigrant policies is that you can somehow influence the ratio of "useful" vs "not-useful" immigrants.

Apart from the idea of sorting people into useful and useless being inhumane, it also seems to be counterproductive. It looks like every kind of screening of immigrants will deter the more desirable ones, as far as that determination is possible on their arrival at all.

Ask HN: What are some good technology blogs to follow?
864 points by buddies2705  3 days ago   181 comments top 85
jamesblonde 3 days ago 4 replies      
The morning paper (in Computer Science): https://blog.acolyer.org/
jjude 3 days ago 5 replies      
These are the three technology sites I visit (almost) daily:

1. https://dev.to/
2. http://highscalability.com/
3. https://www.oreilly.com/ideas

patgenzler 3 days ago 0 replies      
https://stratechery.com/ - the best tech blog on the Internet. Nothing related to coding, but a thorough and thoughtful take on the everyday happenings in the tech industry.
jsmeaton 3 days ago 1 reply      
Steve Yegge was one of the best bloggers I've read. Apart from a post last November, his blog has been dark for a few years. Still a good read, though.



forgotpwtomain 3 days ago 0 replies      
smcl 3 days ago 0 replies      
Raymond Chen's posts are excellent https://blogs.msdn.microsoft.com/oldnewthing/
yuribro 3 days ago 2 replies      
OpenBSD related - http://www.tedunangst.com/flak/

Weekly aggregations:

- http://chneukirchen.org/trivium/

- http://www.dragonflydigest.com/ (Look for the weekly "Lazy Readings" post)

toomanybeersies 3 days ago 0 replies      
Troy Hunt: https://www.troyhunt.com/

He writes great articles on security and is the man behind https://haveibeenpwned.com/

erlehmann_ 3 days ago 2 replies      
https://blog.fefe.de comes to mind, but it is in German.

For a weekly HN digest, I read this: http://n-gate.com/hackernews/

whichdan 7 hours ago 0 replies      
People of Color in Tech[0] is really great; lots of very insightful interviews.

[0] http://peopleofcolorintech.com/

neurocroc 3 days ago 1 reply      
I am keeping a mind map of all blogs that I want to read and follow (https://my.mindnode.com/Lr33AxQg1yTrPzYJrAbFD7E6Wr7cM6YyoUfX...)

It's part of a bigger mind map I am making (https://github.com/nikitavoloboev/knowledge-map)

relics443 3 days ago 4 replies      
Coding Horror [1], and Joel on Software [2] are my favorites.

[1] https://blog.codinghorror.com/
[2] https://www.joelonsoftware.com/

ddebernardy 3 days ago 4 replies      
John Gruber's blog, Daring Fireball, is pretty good if you don't mind the occasional (ok, near-systematic) pro-Apple bias.

Likewise for the Macalope's column.

dmit 3 days ago 0 replies      
Ted Unangst does a great job aggregating links to tech content over at http://www.tedunangst.com/inks/. His own blog is great as well.

Also, previously: https://news.ycombinator.com/item?id=11563516.

geerlingguy 3 days ago 0 replies      
http://hackaday.com has a lot of good content for IoT and hardware hacking. Lately there have been some spot-on articles summarizing various electronics and RF terminology for the layperson.
allenleein 3 days ago 0 replies      
My favorites:

1. Freecodecamp: https://medium.freecodecamp.com/
2. Hackernoon: https://hackernoon.com/
3. The morning paper: https://blog.acolyer.org/
4. Codinghorror: https://blog.codinghorror.com/
5. a16z: http://a16z.com/
6. Ben Thompson: https://stratechery.com/

msangi 3 days ago 0 replies      
http://joeduffyblog.com is great, though it's far from daily. It has long posts about operating system and programming language design.
jyriand 3 days ago 15 replies      
Somewhat related to following blogs: how do I follow blogs anyway? Are there any good Google Reader-like apps that are easy to use?
scottpiper 3 days ago 0 replies      
From https://summitroute.com/blog/2017/01/07/news_summaries/ , some have already been mentioned.

- Downclimb (my own), for weekly infosec news summaries: https://summitroute.com/blog/2017/03/12/downclimb/

- Bulletproof TLS, monthly, for crypto and TLS news: https://www.feistyduck.com/bulletproof-tls-newsletter/issue_...

- Mobile security news, monthly: http://www.mulliner.org/blog/blosxom.cgi/security/mobile_sec...

- This week in 4n6, weekly DFIR: https://thisweekin4n6.com/2017/03/12/week-10-2017/

heisenbit 3 days ago 0 replies      
http://semiengineering.com/ as I think we are at an inflection point of Moore's Law and it is worth understanding how that plays out at the lower layers of the stack.
JCDenton2052 3 days ago 0 replies      
Some of the blogs from my RSS feed, mainly but not exclusively .NET:

Scott Hanselman

Martin Fowler

Coding horror

Fabulous adventures in coding (Eric Lippert)

Zed Shaw (still on my list even though he seems to have largely abandoned tech)

Ayende Rahien

Steve Yegge

Schneier on security

The Light Cone (Brian Beckman)

The Shade Tree developer (Jeremy Miller)

rekwah 3 days ago 0 replies      
I would recommend https://hackernoon.com/
mappingbabeljc 3 days ago 0 replies      
I write a weekly AI newsletter called Import AI which is also cross-published to this WP blog. I try to cover a mixture of fundamental research papers and applied stuff. It also includes some OpenAI updates: https://jack-clark.net/
idahasen 2 days ago 0 replies      
Dev networks that are part of my daily dose of information

- https://hashnode.com
- http://coderwall.com
- http://reddit.com/r/webdev/
- https://hackernoon.com

fauria 3 days ago 0 replies      
I have a public list of engineering techblogs at Twitter: https://twitter.com/fauria/lists/techblogs/members
OJFord 3 days ago 1 reply      
This list highlights and confirms a mild annoyance I have every time I see (or get recommended) a blog I might want to follow: it's rarely easy to get an overview of historical posts.

Almost everyone seems to go for the 'no summaries, home page is the latest post in full, followed by the one before in full, ...' format.

Notable exceptions mentioned here: antirez (brief summaries) and danluu (list of titles). Both of these approaches are far better IMO.

watwut 3 days ago 1 reply      
https://dzone.com/ - actually technical articles, for people who prefer tech over pop culture.
Fannon 3 days ago 0 replies      
http://www.2ality.com/ for deeper insight in JavaScript and its current development.
urig 3 days ago 0 replies      
I pretty much scanned through the entire list of comments and I can't believe no one's mentioned www.hanselminutes.com. That is an excellent podcast and blog from Microsoft's Scott Hanselman, who's an excellent interviewer and student of technology, as well as a mensch. Highly recommended.
xylon 3 days ago 0 replies      
LWN.net - news for the Free Software community
innerzeal 1 day ago 0 replies      
There's also an awesome blog about distributed systems correctness by Kyle Kingsbury at https://aphyr.com/
benkarst 3 days ago 0 replies      
remx 2 days ago 0 replies      
Take your pick from this list here:


Mojah 3 days ago 1 reply      
Self promotion: https://ma.ttias.be

Not daily, but plenty of links to follow-up on.

Alternatively, a weekly summary of all things Linux & open source (RSS feed available): https://cronweekly.com

vgy7ujm 3 days ago 0 replies      
http://perltricks.com is very good.
sureshn 3 days ago 0 replies      
I would recommend Benedict Evans' weekly newsletter; it gives the best news and updates from the tech world. Unlike a blog, which can be monolithic, this newsletter covers the top tech happenings of the week and feels very complete to me.
jakubgarfield 3 days ago 0 replies      
I publish 4 weekly digests, each with only 5 links, every Monday (so you have one article a day).

Programming Digest - https://programmingdigest.net/

C# Digest - https://csharpdigest.net/

Elixir Digest - https://elixirdigest.net/

React Digest - https://reactdigest.net/

angadsg 3 days ago 0 replies      
Stack Overflow newsletters[1] are great as well. They send you the top questions of the week, both answered and unanswered. It's a great way to learn small things about the things you love, and the perfect application of "knowledge should be bite-sized".

I subscribe to RPi, Net Eng, CS, theoretical CS and Code Golf news letters. Any other suggestions?


edit: Added link

acemarke 3 days ago 0 replies      
I wrote a big list of React/Redux-related blogs in a Reddit comment about a month ago: https://www.reddit.com/r/reactjs/comments/5t8loz/what_are_yo... . Most of them aren't daily, but the content is excellent.
madetech 3 days ago 0 replies      
nvartolomei 3 days ago 0 replies      
Not exactly a blog, but worth checking https://www.infoq.com
franverona 3 days ago 0 replies      
I follow a blog/podcast called Scale Your Code (https://scaleyourcode.com/). The host interviews a lot of interesting people, like DHH or Jeff Atwood. He doesn't post every day, but the interviews are pure gold (the last one was with Nick Craver from Stack Overflow).
icefo 3 days ago 0 replies      
It's updated monthly, but really worth having in your RSS feed: http://spritesmods.com/

The guy hacks and creates stuff from time to time and it's very interesting to read. It's also more on the hardware side of things (I had to Google what a shift register is and how they work to understand one of the articles).

maurits 3 days ago 0 replies      

Specialized in compressive sensing, matrix factorization and machine learning.

Don't let the blue color put you off; the author reads and reviews an unbelievable amount of research every week and maintains a huge repository of papers, implementations, talks and videos.

Gammarays 3 days ago 0 replies      
I put together a votable list of most of the sites recommended by HN users so it's easier to see which blogs are the most popular/recommended (anyone can vote).


known 3 days ago 0 replies      
mpiedrav 3 days ago 0 replies      
Specifically on InfoSec, I would recommend:

Krebs on Security: https://krebsonsecurity.com

Daniel Miessler: https://danielmiessler.com/blog

skazka16 3 days ago 0 replies      
No one has mentioned https://kukuruku.co/. We translate popular and interesting tech articles to English. We are also working on letting users write and publish their own posts.
mike-- 3 days ago 0 replies      
davidiach 3 days ago 0 replies      
I subscribe to Benedict Evans newsletter. It's basically a collection of interesting tech related links with commentary.

It's not daily though.


petra 3 days ago 0 replies      
For deeper insight about technology in general, not specifically software: https://www.reddit.com/r/DeeperTech/
perseusprime11 3 days ago 6 replies      
A related question: what tool do you use to manage your feeds? Instapaper is good for one-time links, Overcast is good for podcast feeds, but I am still struggling to find a decent one after Google retired Reader.
thelgevold 3 days ago 0 replies      
Blog about JavaScript topics like frameworks and web performance: http://www.syntaxsuccess.com/
joshlemer 3 days ago 0 replies      
If you are interested at all in Scala, lihaoyi's blog (http://www.lihaoyi.com/) is phenominal.
eDameXxX 3 days ago 1 reply      
The title should be:

"How can I become a master procrastinator"


"Websites that can steal all my free time"

inka 3 days ago 0 replies      
https://mysteriouscode.io/blog/ - for stuff around AWS but also FreeBSD and general IT security.
ReviewDeeper 3 days ago 1 reply      
You can check https://reviewdeeper.com. It provides information about useful but unnoticed apps and other trending topics in the tech world.
jjuhl 3 days ago 0 replies      
I'd recommend "Embedded in Academia" - https://blog.regehr.org/
SodaDezign 3 days ago 1 reply      

A great way to follow interesting subjects (e.g. FPGAs, single-board computers...)

adamnemecek 3 days ago 1 reply      
Lind5 3 days ago 0 replies      
Semiconductor Engineering http://semiengineering.com/
aslammuet 3 days ago 0 replies      
May this be helpful: http://www.theserverside.com/
bitmedley 3 days ago 0 replies      
Liliputing is quite good for tech news: https://liliputing.com/
rrobukef 3 days ago 0 replies      

A blog on security, privacy and (foto) forensics.

shthed 3 days ago 0 replies      
http://alterslash.org, a readable Slashdot digest
sandworm101 3 days ago 0 replies      

It is a niche area but covers an intersection of law, technology, consumer protections and software development.

Amivit 3 days ago 1 reply      
How do you guys manage all the various blogs to keep up on new posts? RSS? Which tool(s)?
luckysideburn 3 days ago 1 reply      
This site (still a pilot project) collects trending words together inside dashboards: http://www.congruit.io/... I wrote it for fun, because I don't want to read tons of blogs :)
lobasaurusrex 3 days ago 0 replies      
I love digitaltrends.com. Good writing and a lot of interesting pieces on new technology.
BorisMelnik 1 day ago 0 replies      
not the normal CS type stuff but:


npguy 3 days ago 0 replies      
sciencesama 2 days ago 0 replies      
techmeme.com is a very good collection of all the conversation-catchers happening in the tech industry!
vondelphia 3 days ago 0 replies      
You may want to take all this advice, and create a news feed rss widget on http://start.me - that's what I just did.
sciencesama 2 days ago 0 replies      
Any such decent ones for networking? Networking as in computer networking.
yostrovs 3 days ago 0 replies      
Medgadget.com for medical tech
RayofLight 3 days ago 0 replies      
techmeme.com good to get the tech news of the day.
icemelt8 3 days ago 2 replies      
purpleidea 3 days ago 1 reply      
I'm a big fan of "The Technical Blog of James" https://ttboj.wordpress.com/ but I'm pretty biased. Check it out and LMK!
sametmax 3 days ago 1 reply      
If you can read French, http://sebsauvage.net/links/ is a nice generalist IT blog.

I'm the author of http://sametmax.com. And I like to brag, saying it's probably the highest-quality blog on Python. And I mean it. But it's in French and also talks about porn, so you've been warned.

DrNuke 3 days ago 3 replies      
I am in awe of many of the resources you are sharing here, but my question is how they are going to monetise the effort. Some of these are run on a volunteer basis, and while that is good for the community, I am not sure it is healthy and sustainable in the long term. Is any sort of funding provided?
Scala Native v0.1 scala-lang.org
626 points by zepolud  1 day ago   232 comments top 35
scotchmi_st 1 day ago 7 replies      
I would love for there to be a similarly thorough project for Clojure. It really bothers me that there's no good native compiler. Apart from anything else, it means that Clojure lives and dies by the languages it compiles to, and while Java is still used everywhere, it probably isn't the thing the kids are learning these days. Besides, without going into any further rational arguments for why using the JVM (or another VM) isn't always great, something about it feels a bit icky to me. On an aesthetic level.
mark242 1 day ago 3 replies      
This will be huge for getting Scala running on AWS Lambda. The cold-start times for JVM apps are just ridiculous and make Lambda/API Gateway essentially unusable for anything written on the JVM.
cwyers 1 day ago 6 replies      
It seems to me like Scala's biggest benefit and biggest downside are two sides of the same coin: easy interop with the JVM and Java code. Scala Native just seems like you're paying all the price of that for none of the benefit.
gbersac 1 day ago 4 replies      
That's great news! We are exclusively using Scala at work for the back end, and I wonder if it could be interesting to switch new projects to Scala Native.

Did you test Scala Native against well-known, massive open-source Scala projects? Did performance improve or regress? Did you write a brand new Scala compiler for native code?

lacampbell 1 day ago 2 replies      
Scala is a language I desperately wanted to like - a high-level, statically typed, pure OO language. But in practice I found it almost unusable. The type signatures were unreadable, and I distinctly recall writing a program of 100 lines or so where the type declarations crashed the compiler. And the tools themselves were huge memory hogs - sbt was a particularly bad offender (though otherwise quite pleasant).

I also did not get on well with the community, which seemed to have a lot of people with the attitude - "they won't let me use haskell at work so I'll make do with this shit". They didn't seem to understand or be interested in OO at all, and were very fanatical about driving application logic with types, purity, and the like.

Regardless, a native variant would be something well worth investigating if it ever reaches "production ready".

evdev 1 day ago 1 reply      
As a Scala guy on a Scala team, I'd think this would be most immediately useful on smaller fill-the-gaps sub-projects where we have to integrate with native code.
kentosi 1 day ago 3 replies      
This is extremely exciting. I can't wait to try this out.

On the other hand, I wonder why such an effort was never carried out with Java itself? Or maybe it was but just never took off?

pale-hands 1 day ago 2 replies      
Will macros work with Scala Native, as they do with Scala.js? (I believe that compile-time metaprogramming is the way forward, especially if the target doesn't support reflection or dynamic code loading.)
eli_gottlieb 1 day ago 0 replies      
This is amazing. Scala is one of my favorite languages to work with, and getting it native-code support will finally help make it fast enough to justify using it everywhere.
wst_ 1 day ago 1 reply      
Somewhat relevant: https://blog.plan99.net/kotlin-native-310ffac94af2#.ijzik0jx...

Title says Kotlin, but it is about JVM languages going native, in general. Or should they?

speg 1 day ago 2 replies      
I can't get hello_world to work; something about an unresolved dependency: org.scala-native#sbt-Scalia-native;0.1.0: not found

I'm not a regular Scala user.
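For what it's worth, the artifact name in that error ("sbt-Scalia-native") reads like an autocorrect slip, and a misspelled artifact name would by itself cause exactly this resolution failure. A sketch of what project/plugins.sbt would presumably contain, with the artifact name assumed from the organization shown in the error ("org.scala-native") and the release version - check the announcement for the exact coordinates:

```scala
// project/plugins.sbt - coordinates assumed, not taken from the announcement
addSbtPlugin("org.scala-native" % "sbt-scala-native" % "0.1.0")
```

After fixing the name, `sbt update` should resolve the plugin instead of failing.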

jcstauffer 1 day ago 1 reply      
Great! Any benchmarks on whether the Scala compiler runs faster when compiled to native?
mafribe 1 day ago 1 reply      
Nice work, I hope this will eventually be a serious alternative to the JVM route.

Quick question: does this compile down to DOT before going to LLVM? Or has DOT not yet arrived in Scala Native?

huula 1 day ago 2 replies      
Question: what kind of frameworks can be practically migrated to Scala Native?
gravypod 1 day ago 1 reply      
Does anyone have any example binaries compiled with this? What are the sizes that you could expect?

They say one of their targets is using this for command-line tools (I'm guessing for startup speed and needing to be small in memory footprint) but it's not of much value if an "echo" or "grep" implementation takes up 15 to 30MB on the drive.

c-smile 1 day ago 0 replies      
I once created a custom barebones JavaVM with a binary size of around 100k. It was built as an executable, jsmile.exe, that was capable of reading bytecodes cat'ed onto the executable itself: http://www.terrainformatica.com/org/j-smile/index.htm

The goal was to create a JVM suitable for standalone GUI applications. The project was abandoned when the Sun/MS Java wars started, in favor of Sciter (https://sciter.com).

As for Scala Native... I think that approach (a binary with a nano JVM + attached class files) may work better and with less effort. Scala needs JVM infrastructure, GC, etc., as far as I understand.

brangalinafoeva 1 day ago 1 reply      
How do the compilation times compare to targeting the JVM?
enjoiful 1 day ago 1 reply      
At first glance at this article's title, I thought it would be a terrible idea to use Scala to write native mobile applications. Imagine using Scala.js to write a NativeScript/React Native app. (shudders)
auggierose 1 day ago 0 replies      
It seems like that would open up a nice way of using Scala on iOS also.
rainhacker 1 day ago 0 replies      
> This opens the door for Scala to be used in environments where full-blown virtual machine is usually an overkill

Not sure I get this; don't Java VMs already support this use case (J2ME)?

Negative1 1 day ago 1 reply      
Important bit: "The project has reached a point of feature completeness in terms of the coverage of the Scala language. We support the whole language including the more advanced features such as method dispatch via structural types and even macros."

It must be frustrating to work on a project like this, see areas where the language can be improved, but only be able to do the work to make it purely compatible. Hopefully some good comes out in the form of some good SIPs.

lukax 1 day ago 2 replies      
The generated binary for a simple Hello World is 3.45 MB, which is quite a lot for printing one line of text, but it can be compressed to 326K using UPX.
twic 1 day ago 0 replies      
This is certainly an impressive piece of work. However, I think it's worth paying attention to the limitations, and the use cases they imply; overall, this looks less like "compile your existing Scala app to native code!" and more like "use Scala to interface with existing native libraries!".

On the other hand, it's also worth bearing in mind that this is version 0.1.0; over time, some of these limitations will lift. What I don't know is whether Scala Native will develop into a complete version of Scala which compiles to native code, or evolve into a variant of Scala more tightly adapted to a niche of talking to native libraries.

Anyway ...

(1) No threading [1]:

Scala Native doesn't yet provide libraries for parallel multi-threaded programming and assumes single-threaded execution by default. It's possible to use C libraries to get access to multi-threading and synchronization primitives but this is not officially supported at the moment.

So forget about using Akka for now.

(2) NullPointerExceptions are replaced with segfaults (hopefully) [1]:

A number of error conditions which are well-defined on JVM are undefined behavior: Dereferencing null. Division by zero. Stack overflows. Those typically crash application with a segfault on the supported architectures.

That's not so bad; where Java apps might let nulls flow around and rely on catching NullPointerExceptions to recover from them, Scala apps are much more likely to use Optional consistently.

(3) If you do want to talk to a native library, and you need to allocate memory to do it, you're on your own [2]:

Unlike standard Scala objects that are managed automatically by the underlying runtime system, one has to manage native pointers manually.

Scala Native provides a built-in way to perform stack allocations of unmanaged memory using native.stackalloc function: [...] When using stack allocated memory one has to be careful not to capture this memory beyond the lifetime of the method. Dereferencing stack allocated memory after the method's execution has completed is undefined behaviour.

Scala Native's library contains bindings for a subset of the standard libc functionality. This includes the trio of malloc, realloc and free functions

Java's traditional JNI is a verbose, slow, pain in the stdout, but it was designed pretty carefully to avoid problems like this.

(a) Intermission! Check out how they do type-level numbers [1]:

Natural numbers are types that are composed of base naturals Nat._0, ... Nat._9 and an additional Nat.Digit constructor.

That's a new one on me!

(4) Incomplete JDK libraries [3]:

Scala Native supports a subset of the JDK core libraries reimplemented in Scala. Here is the list of currently available classes: [...] This is an ongoing effort, some of the classes listed here might be partially implemented.

The list has most of the fundamental stuff - a good chunk of java.io and NIO, the collections, java.lang, atomics. But no java.text, java.net, concurrency, regexp, date and time, JDBC, reflection, XML, etc.

They don't mention how much of the Scala libraries they support. I would imagine that they can build anything that's in pure Scala and depends only on JDK classes in that list, so you'll get the core language stuff and the collections. Not sure.

[1] http://www.scala-native.org/en/latest/user/lang.html

[2] http://www.scala-native.org/en/latest/user/interop.html

[3] http://www.scala-native.org/en/latest/lib/javalib.html

squar1sm 1 day ago 0 replies      
I wonder if Akka will be a part of scala native? It's sort of considered stdlib? When our team spiked on Akka, we liked it and got our near-reality proof of concept to work. It'd be awesome to have such a high level library like Akka compile to a binary.
kelvich 1 day ago 0 replies      
Nice job, thanks! Seems that right now you have implemented some wrappers for libc and ported some Java classes to Scala. What plans do you have to further evolve the API? Will you focus on reimplementing java.* or create your own set of classes?
anta40 21 hours ago 0 replies      
Any Windows pre-built binary to try?

I imagine building this stuff on Windows will be challenging :/

LeanderK 1 day ago 2 replies      
Is Scala Native GCed?
tejasmanohar 1 day ago 2 replies      
Does this mean the future of Scala is off the JVM? I ask because the post calls the JVM impl. a "reference implementation".
amelius 1 day ago 3 replies      
Does this provide a garbage collector?
NickHoff 1 day ago 2 replies      
Will this mean that we can get rid of type erasure when running native code?
crudbug 1 day ago 1 reply      
Will language support destructors for manual memory management ?
0xFFC 1 day ago 1 reply      
So a dream comes true!

P.S. I think this is related to Rust, in the sense that before Rust there was no serious competitor to C/C++, but after seeing what Rust is doing to C/C++ I think there will be more native languages competing in the low-level area.

webserg 1 day ago 0 replies      
good idea!
mihaela 1 day ago 0 replies      
Most things called native are not.
rekado 1 day ago 2 replies      
Unfortunately, this requires an existing Scala compiler to build, so it won't be useful as a bootstrap compiler for Scala on the JVM. Does anyone here know of an alternative implementation of Scala that could be used to build the libraries and tools of the reference implementation from source?

It is a problem that many compilers cannot be bootstrapped from source without a trusted binary of a previous release.

A Formal Spec for GitHub Flavored Markdown githubengineering.com
465 points by samlambert  1 day ago   88 comments top 18
jorams 1 day ago 2 replies      
It's odd how neither this post, nor the spec, nor GitHub's "Mastering Markdown" help page[1], nor the more complete "Basic writing and formatting syntax" page[2], mentions the fact that GitHub treats every newline as a hard break.

CommonMark contains this little sentence to work around its specified behavior, which is left untouched in the GFM spec:

> A renderer may also provide an option to render soft line breaks as hard line breaks.

I'd say whether or not it does this is a rather important thing to mention. When I write Markdown documents for GitHub I have to change my editor settings, only because of this.

[1]: https://guides.github.com/features/mastering-markdown/

[2]: https://help.github.com/articles/basic-writing-and-formattin...
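The soft-vs-hard break distinction above can be sketched with a toy renderer. This is an illustration of the spec's optional behavior, not GitHub's actual rendering pipeline; `paragraph_html` is a hypothetical helper:

```python
def paragraph_html(text, hard_breaks=False):
    """Render one Markdown paragraph to HTML, treating each newline
    either as a soft break (kept as-is, per CommonMark's default) or
    as a hard break (<br />), which is how GitHub comment fields
    behave."""
    sep = "<br />\n" if hard_breaks else "\n"
    lines = [line.strip() for line in text.strip().split("\n")]
    return "<p>" + sep.join(lines) + "</p>"


src = "roses are red\nviolets are blue"
# CommonMark default: the newline stays a soft break
assert paragraph_html(src) == "<p>roses are red\nviolets are blue</p>"
# GitHub comment behavior: the newline becomes a hard break
assert paragraph_html(src, hard_breaks=True) == (
    "<p>roses are red<br />\nviolets are blue</p>"
)
```

Because the two outputs differ only in where `<br />` appears, a document written for one renderer can silently reflow in the other, which is why the behavior deserves a mention in the spec.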

erlend_sh 1 day ago 0 replies      
GitHub devs & Markdown enthusiasts at large, please consider contributing some brainpower to these last remaining issues that are blocking the v1.0 release of CommonMark:


simplehuman 1 day ago 2 replies      
This is great! A couple of years back, there was a failed attempt at standardizing this - http://www.vfmd.org/ and http://www.vfmd.org/vfmd-spec/specification/. GitHub, given its popularity, will surely succeed more.
legulere 1 day ago 2 replies      
It's a specification based on the CommonMark specification. Neither is a formal spec. They are more of an informal specification with some edge cases listed (in contrast to the original Markdown specification, which has known unspecified edge cases).
rcarmo 1 day ago 1 reply      
I have to wonder why this isn't done in the form of a context-free grammar, like Hitman[0] uses. Specs in English are still too vague for my liking.


legulere 1 day ago 3 replies      

Why this? This is not a working blacklist to prevent XSS (e.g. onload="...")
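The point above — that a tag-name blacklist does nothing about event-handler attributes — can be sketched in a few lines. The blacklist contents here are illustrative, not the actual GFM tagfilter list:

```python
import re

BLACKLIST = {"script", "iframe", "object", "embed"}  # tag names only


def naive_filter(html):
    """Strip blacklisted tags by name -- the style of filtering a
    tag-name blacklist implies."""
    pattern = r"</?({})\b[^>]*>".format("|".join(BLACKLIST))
    return re.sub(pattern, "", html, flags=re.IGNORECASE)


# The obvious payload is stripped...
assert "<script" not in naive_filter("<script>alert(1)</script>")
# ...but an event-handler attribute on an allowed tag sails through:
survivor = naive_filter('<img src=x onload="alert(1)">')
assert 'onload="alert(1)"' in survivor
```

Any tag that can carry `onload`, `onerror`, and friends defeats the filter, which is why attribute-level sanitization (or a whitelist) is needed for real XSS prevention.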

IgorPartola 1 day ago 2 replies      
At the risk of starting a mini-flame war, is RST a more cohesive format? If one was to pick one of the two formats to start using for personal documentation, which format should one choose?
forsaken 1 day ago 3 replies      
This is great -- lack of a standard that was actually used (unlike CommonMark) was one of my main issues with Markdown (http://ericholscher.com/blog/2016/mar/15/dont-use-markdown-f...) -- It's really great to see GitHub leading in this department, and it gives me hope that one day we might actually have Markdown that is portable between implementations.
patrec 1 day ago 1 reply      
Now if only org-mode could define a sane, parsable format.
charonn0 11 hours ago 0 replies      
So that's why my project wikis have suddenly stopped rendering markdown properly. I've been trying to figure out WTF was going on since yesterday!
jostmey 1 day ago 2 replies      
HTML used to serve as a simple way to format a document. Now HTML is too complex for that purpose. Introduce markdown. In ten years, markdown will be too complex for formatting documents.

I am a big fan of markdown, in case I didn't make that clear. I love it

dreamcompiler 1 day ago 1 reply      
This is great news! Does anybody have a recommendation for a Javascript parser for Formal GFM? (I know there are a million JS MD parsers; I'm looking for a good one that will let me serve GFM docs over HTTP and render them on the browser.)
tomcam 1 day ago 0 replies      
web2py handled all of these issues and made its markup language extensible with its markmin specification: http://www.web2py.com/init/static/markmin.html
Siecje 13 hours ago 0 replies      
What's wrong with http://www.vfmd.org/ ???
libeclipse 1 day ago 1 reply      
I'm really happy to see this. It's actually quite frustrating that although markdown is so nice, it barely has a consistent standard. It's almost impossible to use it cross-service.

Hopefully now that Github has standardised their own flavour of it (and quite a nice flavour too), more people will start to use it.

Of course there is the obligatory XKCD: https://xkcd.com/927/

strikedout 1 day ago 1 reply      
Why no ~~strike out~~ in spec?
flippyhead 1 day ago 0 replies      
Jeesh, about time!
harmonyinfotech 1 day ago 0 replies      
Is it ok if I promote my domain here? I read the guidelines but it doesn't mention anything regarding self promotion. Sorry if it ain't appropriate but anyone looking for a relevant domain (markdown.in) please get in touch or any suggestion if it is better to develop it.
New ways to foot the hefty bill for making old ships less polluting economist.com
379 points by ghosh  3 days ago   217 comments top 28
CalChris 3 days ago 4 replies      
Slow steaming is an improvement and it saves quite a bit of bunker. But while bunker is cheap, it's really noxious; the Cosco Busan spilled bunker when it hit the Delta tower of the Bay Bridge.

Boats are supposed to switch over to a cleaner fuel when they enter port. For example, Port of Oakland is upwind of residential housing in Oakland. So this is a public health issue. Even the terminal tractors (port trucks) idling are an issue. Hopefully they'll switch over to EVs:


Boats are designed for a critical hull speed. Emma Maersk cruises at 31 mi/h on the open ocean.


That bulbous nose on container ships sets up a counter bow wave to lower drag but only at a certain cruising speed. However, shippers weren't paying a premium for that higher speed and although it's more efficient for that hull it was still costly.

So new boats are tuned to a more efficient lower speed (slow steaming) with less powerful engines and even older boats are getting hauled into dry dock and re-nosed for a lower speed. Overall shipping speeds are down and shipping costs are also down.
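The economics behind slow steaming follow from a standard naval-architecture rule of thumb: propulsive power scales roughly with the cube of speed, so fuel burned per mile scales roughly with its square. A back-of-envelope sketch (the cube law is a first approximation; real hulls, and the bulbous-bow tuning described above, complicate it):

```python
def fuel_fraction(new_speed, design_speed):
    """Fuel burned per unit distance, relative to the design speed.

    Power ~ v**3, so fuel per hour ~ v**3 and fuel per mile ~ v**2.
    A first approximation only.
    """
    return (new_speed / design_speed) ** 2


# Slowing from a 25-knot design speed to 18 knots cuts fuel burned
# per mile by roughly 48% under this approximation.
saving = 1 - fuel_fraction(18, 25)
assert 0.45 < saving < 0.50
```

A saving of that magnitude is why re-nosing an older hull for a lower cruising speed can be worth a trip to dry dock.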


While the new Panama Canal extension could be a fiasco in its own right (100 years later and not nearly as well built; it leaks) new canals could improve things. The Thai Canal could make the Suez route more competitive than the Panama route for Asia to Europe.

Lastly, like airlines, it's really hard to make money in shipping. Witness the Hanjin bankruptcy:


The City of Oakland owns the Port of Oakland and we don't make much money off of it either. $16M/yr for both the airport and the port, last time I checked.

kogepathic 3 days ago 8 replies      
> That also imperils banks across the world, which have lent $400bn secured on smoke-spewing ships.

So, why should we care? Presumably the banks have paid analysts to determine that was a sound investment.

If governments are doing their jobs, banks should be able to eat this kind of loss without becoming insolvent. Otherwise why bother having regulations at all, if every minor hiccup means taxpayers have to bail out the banks?

Why do I care if shipping companies go out of business because of over capacity? Isn't that what market forces are all about?

So we should keep dangerously polluting ships running, because the banks that loan the shippers money will lose their shirts for several quarters if the shipping company goes bust?

userbinator 3 days ago 2 replies      
It is interesting to note that the types of engines used in these large ships are among the most efficient:


The pollution has more to do with the type of fuel used.

Broken_Hippo 3 days ago 4 replies      
I've hit my limit for the economist this month, so I went and looked up the article. It seems these articles have been coming out since at least 2009, and the gist is that these (older) ships burn heavy fuel, which isn't refined like gasoline.

And it seems the fix is to urge the companies to update their ships by not allowing them in ports, but considering how long these articles have been coming out, it looks like progress is slow on that front, if anything has changed at all. Shipping companies have been selling off some of their stock, and it would seem that at least a few of the older ships should have been included.

anonu 3 days ago 1 reply      
A bit of clarification on which oxides, from the article: By burning heavy fuel oil, just 15 of the biggest ships emit more oxides of nitrogen and sulphur (gases much worse for global warming than carbon dioxide) than all the world's cars put together
upofadown 3 days ago 0 replies      
>just 15 of the biggest ships emit more oxides of nitrogen and sulphur (gases much worse for global warming than carbon dioxide)...

Oxides of sulphur are not greenhouse gasses. Nitrous oxide is a greenhouse gas but it doesn't come from burning fuel.

These links go into the actual reasons these sorts of pollutants are bad:

* https://en.wikipedia.org/wiki/NOx#Environmental_effects

* https://en.wikipedia.org/wiki/Sulfur_dioxide#As_an_air_pollu...

Interestingly enough, there is some thought that nitrogen oxide emissions from ships actually cause global cooling.

goodcanadian 3 days ago 1 reply      
It seems to me that many comments here are missing the point. To be fair, the article also seems to get it wrong. The problem with sulphur and nitrogen oxides isn't just global warming (though apparently N2O is a real problem, there). To my mind, the real problem is good old fashioned pollution as we talked about in the 80s. Acid rain, anyone?
igravious 3 days ago 0 replies      
15 Biggest Ships Create More Pollution Than All Cars in the World (2013)


128 points by danboarder 457 days ago | 65 comments

deepGem 3 days ago 6 replies      
Isn't it feasible for container ships to go electric? They have such massive surface areas for batteries. I thought of solar, but then the container loading/unloading aspects would become quite cumbersome, unless you could somehow put solar panels on individual container roofs and load those on the top. A logistical nightmare nonetheless.

A crude search yields this about Emma Maersk, one of the largest container ships.

She is powered by a Wärtsilä-Sulzer 14RTFLEX96-C engine, the world's largest single diesel unit, weighing 2,300 tonnes and capable of 81 MW (109,000 hp) when burning 14,000 litres (3,600 US gal)[31] of heavy fuel oil per hour. At economical speed, fuel consumption is 0.260 lbs/hp·hour (1,660 gal/hour).[32] She has features to lower environmental damage, including exhaust heat recovery and cogeneration.[33] Some of the exhaust gases are returned to the engine to improve economy and lower emissions,[34] and some are passed through a steam generator which then powers a Peter Brotherhood steam turbine and electrical generators. This creates an electrical output of 8.5 MW,[35] equivalent to about 12% of the main engine power output. Some of this steam is used directly as shipboard heat.[36] Five diesel generators together produce 20.8 MW,[35] giving a total electric output of 29 MW.[26] Two 9 MW electric motors augment the power on the main propeller shaft

So you need about 285 Tesla Model S P100D motors to power a ship of this size. Doable I guess. Again, I'm no expert on shipping.
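The "about 285 motors" figure above can be checked with simple arithmetic. The per-motor rating used here is an assumption chosen to match the comment's math, not a Tesla specification:

```python
# Back-of-envelope check of the motor count.
ship_main_engine_kw = 81_000   # Emma Maersk main engine, ~81 MW
assumed_motor_kw = 285         # hypothetical per-drivetrain output

motors_needed = ship_main_engine_kw / assumed_motor_kw
assert round(motors_needed) == 284   # ~ the "about 285" in the comment
```

Note this only covers motive power; the batteries to feed 81 MW for a multi-week crossing are the much harder part of the electrification question.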

richdougherty 3 days ago 1 reply      
"The problem, he adds, is one of incentives. Ship owners, who would normally borrow for such upgrades, do not benefit from lower fuel bills. It is the firms chartering the vessels that enjoy the savings. But their contracts are not long enough to make it worthwhile to invest in green upgrades. The average retrofit has a payback time of three years, whereas 80% of ship charters are for two years or less."

"Hence the interest in new green-lending structures. ... The idea is to share the fuel savings between the shipowner and the charterer over a longer contract, giving both an incentive to make the upgrades. Such schemes used to be thwarted by the difficulty of measuring exact fuel consumption on ships. New technologies allow more accurate readings."

This is the exact same problem that arises in landlord/tenant relationships when it comes to things like insulating a property. Insulation might be relatively cheap and pay itself back in a few years. But the landlord doesn't have an incentive to insulate because the benefit goes to the tenant. The current tenant also won't insulate because they'll probably leave before they can realise all the benefit of their investment.

In theory, landlords or shipowners should have an incentive to invest, since it should improve their property and therefore allow them to increase their rents or charter fees, but for some reason this doesn't happen. Possibly consumers can't accurately assess the value of improvements so they are reluctant to pay more.

The measurement devices mentioned should allow both parties to have a more accurate way to share in the benefits.
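The split-incentive problem and the shared-savings fix described above can be sketched numerically. All figures are illustrative, not from the article, and `green_retrofit_npv` is a hypothetical helper:

```python
def green_retrofit_npv(cost, annual_saving, years, owner_share=0.5):
    """Net cash to each party under a shared-savings contract:
    the shipowner pays for the retrofit, and the fuel saving is
    split with the charterer over the contract length.
    (Ignores discounting, for simplicity.)"""
    owner_cash = annual_saving * owner_share * years - cost
    charterer_cash = annual_saving * (1 - owner_share) * years
    return owner_cash, charterer_cash


# A retrofit with a 3-year solo payback is a loss over a 2-year charter...
owner, _ = green_retrofit_npv(cost=3.0, annual_saving=1.0, years=2,
                              owner_share=1.0)
assert owner == -1.0
# ...but splitting savings 50/50 over a 7-year deal pays both parties.
owner, charterer = green_retrofit_npv(cost=3.0, annual_saving=1.0, years=7)
assert owner == 0.5 and charterer == 3.5
```

The contract only works if both sides trust the measured fuel saving, which is where the new metering technology comes in.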

It's a complicated dance of incentives and information...


djsumdog 3 days ago 4 replies      
> Such schemes used to be thwarted by the difficulty of measuring exact fuel consumption on ships. New technologies allow more accurate readings.

Why is it so difficult to measure fuel consumption on ships?

Bud 3 days ago 0 replies      
This comment thread illustrates why HN posters shouldn't presume to write their own headlines for an article unless they really know what they are doing.
rs999gti 3 days ago 1 reply      
Maybe it's time to re-examine using nuclear reactors on cargo ships?


seizethecheese 3 days ago 2 replies      
Warning: I'm totally ignorant on the subject.

It seems like oil gets refined with gasoline going to cars and heavier fuels going to ships. Can we really say that cars are so much cleaner? Their fuel is surely subsidized by a market for the heavier fuels.

Radle 3 days ago 1 reply      
"Carrying more than 90% of the world's trade, ocean-going vessels produce just 3% of its greenhouse-gas emissions." The article says it itself: shipping is super efficient.
wcoenen 3 days ago 0 replies      
The article seems to imply in its first paragraph that sulfur dioxide is a greenhouse gas. But doesn't SO2 have a cooling effect on the climate?


WalterBright 3 days ago 1 reply      
I read 30 years ago that some cargo ships were being equipped with computer operated sails, which would substantially reduce fuel use. I wonder what happened to that.
ianai 3 days ago 1 reply      
These ships could be nuclear powered - like subs/other vessels are already. That ought to make a huge impact on carbon footprint.
reacweb 2 days ago 1 reply      
Increase the price of petrol. Many conflicts in the world are related to petrol. A significant part of the military budget can be seen as a subsidy to petrol. There should be enough taxes to compensate all these hidden costs.
hendler 2 days ago 0 replies      
Checkout "Freightened" - https://vimeo.com/202104276
pfarnsworth 3 days ago 1 reply      
What would be the economic repercussions if these ships were immediately shut down?
tener 3 days ago 0 replies      
Why not just tax them for the excessive pollution?
ajarmst 3 days ago 5 replies      
The lede is false and misleading on its face. Excluding "carbon dioxide" from your list of "oxides" when discussing (and comparing with) automobile greenhouse gas emissions is absurd. The misleading claim is also clearly intentional, so none of the other claims can be accepted at face value. (No, shutting down 15 ships would not do more to address greenhouse oxide emissions than banning automobiles world-wide.) More than disappointing, and never should have been published.
marcusarmstrong 3 days ago 3 replies      
IMO, the title should note that "oxides" does not include "Carbon Dioxide".
hellbanner 3 days ago 0 replies      
Flagged, use the given title from the page.
holydude 3 days ago 4 replies      
Which makes the Danes some of the biggest hypocrites on planet Earth.
jerkstate 3 days ago 7 replies      
you know what's even more interesting, it seems like shipping fuel is heavily subsidized. The international price for bunker fuel is about $330 per ton. Oil is $50 per barrel, a barrel is about 300 lbs, so 7.5 barrels make a ton. That's $375. Why is the refined product cheaper than the raw product?

edit: many have responded calling residual fuel a "waste product" - it is useful and being used so calling a waste product strikes me as semantically incorrect. If it were being sold opportunistically, like a large proportion of it was going to waste but some was being sold, I would agree with that, but it seems like it's all being sold, right?
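The price puzzle in the comment can be reproduced with its own rough numbers (barrel weight and prices are the comment's figures, not market data):

```python
# Reproducing the comment's arithmetic.
lbs_per_tonne = 2204.6      # one metric tonne
lbs_per_barrel = 300        # the comment's rough barrel weight
crude_usd_per_barrel = 50
bunker_usd_per_tonne = 330

barrels_per_tonne = lbs_per_tonne / lbs_per_barrel     # ~7.3
crude_cost_per_tonne = barrels_per_tonne * crude_usd_per_barrel

# Crude priced by the tonne is indeed above the bunker price: bunker
# is the residual fraction of refining, so it sells for less than the
# whole barrel it came from, with the lighter fractions carrying the
# value.
assert crude_cost_per_tonne > bunker_usd_per_tonne
assert round(barrels_per_tonne, 1) == 7.3
```

So the gap is less a subsidy than a by-product discount: refiners recover the barrel's value mostly from gasoline, diesel, and jet fuel.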

malchow 3 days ago 2 replies      
Would anyone care to link to evidence that carbon dioxide causes an increase in temperatures? I'd be curious to read some of this literature.
LLVM 4.0.0 llvm.org
462 points by zmodem  2 days ago   141 comments top 12
bajsejohannes 2 days ago 2 replies      
> thanks to Zhendong Su and his team whose fuzz testing prevented many bugs going into the release.

http://web.cs.ucdavis.edu/~su/ claims 1228 bugs found (counting both LLVM and GCC). Impressive!

opt-viewer 2 days ago 0 replies      
Looks like it didn't make the release notes but one of the features new for this release is opt-viewer. It's useful for finding the rationale why some bit of code was/wasn't optimized. It's a WIP but usable today.

I made a demo [1] for this tool.

[1] https://github.com/androm3da/optviewer-demo

lossolo 2 days ago 4 replies      
LLVM Coroutines - This is the most exciting thing for me. Gor Nishanov in his videos explains how coroutines are implemented and how they are optimized by LLVM. Asynchronous IO code will be so easy to write and so efficient. Context switch at the cost of a function call, you can have billions of those coroutines, heap allocation elision (in certain cases). Can't wait for coroutines to land in Clang.

I am a big fan of Go goroutines, so the Networking TS and Coroutines TS made me very happy; connecting both and having it in the standard will be great. Just a shame that for Networking TS integration we will need to wait for C++20.
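The comment is about C++ coroutines in LLVM, but the core appeal — suspension at roughly function-call cost, so task counts far beyond what threads allow — can be illustrated with Python's asyncio. This is an analogy only, not Clang's implementation:

```python
import asyncio


async def worker(n):
    # Suspending a coroutine is a state-machine jump, not an OS
    # thread context switch, so spawning tens of thousands is cheap.
    await asyncio.sleep(0)
    return n * 2


async def main():
    # 10,000 concurrent coroutines; 10,000 OS threads would not be
    # nearly this cheap.
    results = await asyncio.gather(*(worker(i) for i in range(10_000)))
    return sum(results)


total = asyncio.run(main())
assert total == sum(i * 2 for i in range(10_000))
```

LLVM's coroutine lowering goes further than this sketch suggests: because the frame layout is visible to the optimizer, heap allocation of the frame can sometimes be elided entirely.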

pjmlp 2 days ago 1 reply      
Love the improvements to clang-tidy!


Congratulations on the work. Also nice to see that OCaml bindings are still being taken care of.

falcolas 2 days ago 1 reply      
> Stable updates to this release will be versioned 4.0.x

/nit Semantic versioning (or communication) failure. I would think that "stable updates" would represent minor releases (i.e. 4.x.0), not bugfix-style patches. Unless all new features will be present in major releases instead of "stable updates"?

javajosh 2 days ago 4 replies      
Ha, was just reading http://aosabook.org/en/llvm.html.

(Really like that LLVM IR. Does anyone code in it directly? Was also thinking it would be interesting to port Knuth's MMIX examples to it.)

grassfedcode 2 days ago 0 replies      
I'm trying to add support for lldb to a gdb frontend (https://github.com/cs01/gdbgui/), and need a gdb-mi compatible interface to do it.

lldb-mi exists, but its compatibility with gdb's mi2 api is incomplete. Does anyone know of a more compatible api to gdb-mi2 commands, or if there are plans to improve lldb-mi's?

self_awareness 1 day ago 1 reply      
Visual Studio is already at version 2017, and LLVM is only at 4, they need to catch up real quick! /ducks
mark-r 2 days ago 0 replies      
Too bad they didn't use more aggressive aggressive grammar checking.
amyjess 2 days ago 1 reply      
I wish they'd do what GCC does and just eliminate the middle number entirely.
futurix 2 days ago 1 reply      
Version number inflation strikes another software package, although at least it is not as bad as Chrome or Firefox.
aslammuet 2 days ago 3 replies      

Demo page is not working. Is there any other page that helps me understand what it really is and where it is helpful?

Vibrator Maker to Pay Millions Over Claims It Secretly Tracked Use npr.org
451 points by ceejayoz  1 day ago   249 comments top 39
tyre 1 day ago 24 replies      
I think this is a great example to the tech world of what people actually care about.

Your average American didn't understand or get worked up over Snowden and the prospect of a surveillance state; not for long anyway. We don't have much of a national conversation about it anymore, Obama isn't remembered for his actions around the NSA, bulk collection, etc.

Most people also don't seem to care too much about Facebook, Google, etc. collecting their browsing data and selling it to advertisers.

People very much care about the privacy of their sex life.

Did this company violate their own privacy policy?

It looks like the company settled rather than drag things out through court, but didn't actually do anything beyond collect standard usage data.

The company didn't even give it to third parties. So it isn't that they did something worse than the NSA or Facebook, but that people are more sensitive to the privacy of their sex lives than other things.

We wonder why Snapchat first rose to popularity for sexting while most people couldn't care less about GPGing their emails or using Signal day-to-day.

Either most people don't care about privacy or we, the tech community, do a poor job of connecting things like encryption to what people do genuinely care about.

qdot76367 1 day ago 1 reply      
If anyone is interested in accessing the WeVibe or other toys (Kiiroo, Lovense, etc) directly via bluetooth, versus going through their apps, I run a website for documenting and reverse engineering this stuff, at http://metafetish.com. All of our docs and code are on github at


dangrossman 1 day ago 5 replies      
Looks like my s/o and I could be making a claim as part of the settlement class. Didn't get much use out of the app, the bluetooth connection was super unreliable.

That said, I take it as a given that any app I install on my phone is probably tracking my usage of their app. Dropping in Mixpanel or Heap or some other analytics lib that tracks feature usage seems like such a standard part of developing a mobile app, I'd be surprised if a developer didn't do it.

altendo 1 day ago 2 replies      
The We Vibe was the topic of a Defcon 24 talk, Breaking the Internet of Vibrating Things[1]. Was an excellent talk, but I felt it needed more jokes woven in.

[1] https://www.youtube.com/watch?v=v1d0Xa2njVg

EDIT: grammar fail

qdot76367 1 day ago 0 replies      
The Internet of Dongs project, at http://internetofdon.gs (on twitter at http://twitter.com/internetofdongs) exists to combat issues with security and user privacy in sex toys. They're working with multiple toy producers to create systems to report bugs and increase security.
follower 1 day ago 0 replies      
[From an earlier submission: https://news.ycombinator.com/item?id=13862694]

Related DEF CON 24 presentation: "Breaking the Internet of Vibrating Things": https://www.youtube.com/watch?v=v1d0Xa2njVg (Includes more technical details)

Related TEDx presentation: https://www.youtube.com/watch?v=WxRSjC1rPmA (Aims to raise awareness of related IoT privacy issues for a non-technical audience via the concept of a personal "Device Intimacy Spectrum".)

Disclosure: I'm one of the presenters/security researchers referenced in the article.

rosser 1 day ago 2 replies      
I suppose it's to be expected, but the naïveté of thinking that an IoT sex toy wasn't phoning home still surprises me.

Not to excuse it, because spying on your users, particularly in an identifiable way (and doubly so given the sensitivity of this specific case), is a shitty thing to do, but it's not like this is unprecedented.

xiaoma 1 day ago 2 replies      
Over the longer term, privacy is dead. Sensors are proliferating at a rate web servers were 20 years ago and a state of continual recorded surveillance is where we are headed over the next 20 years.

The main question is, how equitable will that surveillance be? Governments and powerful multinationals will have access to the personal information of ordinary people. Will the converse also be true?

As unpleasant as the prospect of sub mosquito-sized recording devices everywhere is, it matters greatly whether law enforcement, moguls and politicians are subject to the same scrutiny as those without power.

mythrwy 1 day ago 5 replies      
I have a really hard time imagining people using this.

Maybe it's just my prudishness but how the hell is fighting with bluetooth pairing in any way foreplay?

On the information video there is a graphic showing it can be used by separated couples. One person is in Europe, one in the US. Just don't give up your phone at the border.

And don't lose your phone either. You may just wind up losing your partner also when they see how much more adept someone else is at working the controls.

Guess there are some things I'll just never understand.

mysterypie 1 day ago 1 reply      
I'm of two minds about the funny comments this article is getting. On the one hand, some are indeed funny and often really clever use of the English language. On the other hand, I think about hours and hours I'd spend reading really clever comments on Reddit and then in the end realizing that I didn't learn anything, nor did it change my mind or influence my opinion about anything. I'm glad that HN exists as an alternative.
callmeed 1 day ago 3 replies      
> An estimated 300,000 people bought Bluetooth-enabled WeVibes, according to court documents, and about 100,000 of them used the app.

I know it's not the primary issue, but it's a very interesting part of the story to me and raises a lot of questions. Only 1/3 of people who purchased the device used the connected app. This sounds a lot like my Anova sous-vide: it has an app but I never use it (a dial and button are fine by me). I wonder if this 1/3 number is the normal rate among "smart devices". Do 2/3 of people not use it because it's of no real value, or because the setup/UX sucks? Do companies make smart devices because "everyone else is doing it", or is there another reason (charge more & better margins)? Finally, will we start to see a decline in smart/connected devices if adoption stays low (in favor of products that simply innovate in other ways)?

Animats 1 day ago 3 replies      
Perhaps this will make it clearer that controlling things from your phone currently involves somebody in the middle, monitoring what you're doing. If we had better phone-to-phone data connections, this wouldn't be necessary. This is a phone pairing application between phones that could be brought near each other for pairing.
tomek_zemla 1 day ago 1 reply      
Is this one of the devices on CIA hacked systems list?
kitd 16 hours ago 0 replies      
Gentle warning if you're at work: the article has a large picture of the product at the top.
6d6b73 16 hours ago 0 replies      
But they did it to help people!

This is how they use the data:

Red lights flashing...

Tactical Officer: - Action Stations - User 5563 is close but needs additional stimulation.

Captain: Engineering can you give us additional 10%?

Engineering: We will need to adjust Warp Field but it should work for about 5 seconds.

Captain: That should be enough. Do it!

Engineering: Ready

Captain: Engage!

Applejinx 18 hours ago 0 replies      
I picture it being hooked to a GPS:

three inches east
three inches west
three inches east
FIVE inches west

And then they sell the data to Facebook, who can market it to more effectively target men who move like that.

which steps over the line from snark into relevant observations on abuse of privacy and who benefits, given deep enough data :)

marvin 19 hours ago 0 replies      
Haha, I own one of these and the thought has struck me multiple times that Lelo are probably collecting data on usage and also that there must be security holes to their backend so you could in principle take control over thousands of vibrators. Never worried too much about it, and not at this point either. But it's obviously not a good thing.
jonaldomo 1 day ago 2 replies      
So is the moral of the story for a developer to make sure you have an updated privacy policy? If they would have updated the privacy policy on their product would they have been legally protected?
bleair 1 day ago 0 replies      
You should assume any (phone or Windows Store or Mac App Store) app you install can and likely will be uploading all the personal data it can to its "mothership", in the name of keeping track of usage and "improving" future products. There are no laws preventing the selling of information to marketing agencies.
zelias 11 hours ago 0 replies      
This could bring new meaning to the concept of the "man in the middle attack"
MichaelMoser123 23 hours ago 0 replies      
Isn't it amazing how every piece of equipment is turned into a tracking device? It always reminds me of Stanislav Lem's 'The Washing Machine Tragedy' http://nemaloknig.info/read-192176/?page=10 where this appliance turned smarter and smarter until it took over...
gozur88 1 day ago 1 reply      
I'm trying to imagine what you would actually do with this kind of data.
brilliantcode 1 day ago 0 replies      
Seems like publicity stunt only works when you don't have to payout millions to people affected by it negatively.

They got the publicity but at a price that is too high.

fpgaminer 1 day ago 2 replies      
Besides the privacy concerns being raised here, connected sex toys themselves fascinate me. Like a lot of IoT markets, it intuitively feels as if adding connectivity and intelligence to the products will benefit them in some way. And yet, also like a lot of IoT markets, this doesn't seem to be panning out.

The toys themselves are too primitive to be useful in general. They're too sluggish in their responses, and not sensitive enough. There's also little to no feedback on the control side.

The data collection side of things (privacy issues aside) is also not useful. Is the frequency with which you use a vibrator really going to inform your life? Sleep patterns, diet, exercise, etc. Those are all useful metrics. Certainly the amount you have sex is also a useful metric. But to be useful, you need to actually know how much you have sex, and need to have the ability to analyze that data alongside everything else. A tracked vibrator does not accomplish that, and there's no central app for analyzing all this data together (that I know of). A smart watch, on the other hand, _could_ track sexual activity, and already has the facilities for analyzing that data along with the other important metrics.

But there's still a market here, I feel, for when the right combination of technology shows up. About a year or two ago Internet-controlled vibrators showed up on cam sites like Chaturbate. It started off as a novelty on a few cam shows, but today almost every show has them. It consists of a vibrator, either worn externally or internally, that vibrates with variable intensity based on tips given by customers of the show. So, you tip, it vibes. It's a means for customers to have more direct involvement in the show. It's an easy sell to tell someone "You know that hot girl? You can pay to give her pleasure." That's the sort of "right combination of technology" I'm talking about.

The next big innovation, I think, will come from an Internet connected, articulated Fleshlight-like product for men. Ya know, a Fleshlight that jacks you off. There's one product out there, but like most of these failed attempts, it sucks. It has the right "idea", but failed execution. It connects to your computer and you can then direct its movement either with a synced video or remote control by a cam show performer. That's a great idea! But the articulation needs to be better, with several nodes with at least two degrees of freedom (up-down, contract/relax). If you can make the device actually useful, it won't be hard to extract a hefty price on the device, and a hefty price on videos and camshows. And, of course, this is a far more useful device for long distance relationships. Not to belittle the needs of the woman, but I don't believe a remote controlled vibrator is in the same class of remote-intimacy as a remote controlled masturbator. The equivalent would be more like a remote controlled tongue or "fucking machine". But good executions of both are further away, I believe.

And yes, I _have_ thought "too much" about this stuff, even to the extent of sketching out a potential way to build the masturbator using electromagnetic actuators arranged in a ring to provide silent operation.

> Since the app was released in 2014, some observers have raised concerns that Internet-connected sex toys could be vulnerable to hacking.

Oddly enough, that might be some people's fetish.

k-mcgrady 1 day ago 0 replies      
Linking the data with email addresses was stupid and unnecessary. Regardless of whether it's right or wrong (if this wasn't 'embarrassing' data I don't think anyone would care) linking data to emails was just a totally stupid decision.
watertom 1 day ago 0 replies      
How is what they did different than what facebook does?
nojvek 1 day ago 0 replies      
How do I claim the lawsuit amount?
HiFlight 13 hours ago 0 replies      
Vibrator maker creates Clit Bit.
blacksqr 1 day ago 0 replies      
Puts a new spin on the phrase "give me a buzz", doesn't it?
smacktoward 1 day ago 3 replies      
> The We-Vibe product line includes a number of Bluetooth-enabled vibrators that, when linked to the "We-Connect" app, can be controlled from a smartphone. It allows a user to... give a partner, in the room or anywhere in the world, control of the device.

Wow. I'm just kind of incredulous that this was never hacked. The lawsuit is about the company's own data-collection practices, but just imagine the freakout if one fine day Vladimir Putin took control of all these devices at once.

Has anyone done a security review of the device and the associated app? If ever a service called for a thorough penetration test... (Bah-dum-bum! Thank you, I'll be here all week, tip your wait staff.)

I'm wondering if the lack of hacks came from actual good engineering on the company's part -- hope springs eternal! -- or if the device was just too niche to catch the interest of the black hats?

w1ntermute 1 day ago 1 reply      
The perils of teledildonics.
russx2 1 day ago 1 reply      
alva 1 day ago 1 reply      
bjourne 1 day ago 1 reply      
How stupid can a company be?
slezakattack 1 day ago 1 reply      
A sex toy being hacked by some hacker could make for an interesting porn plot..
hellofunk 1 day ago 0 replies      
This article really sent shivers through me.
peter_retief 1 day ago 0 replies      
No man this must be marketing :) :)
Dowwie 1 day ago 1 reply      
Yesterday it was farts and today it's IoT enabled vibrators.. Has Howard Stern taken over HN?
arcaster 1 day ago 0 replies      
Just waiting for some scumbag to repost as fake news... "Vault 7 leaks find NSA spying on vibrator use!"
Pi-hole A black hole for Internet advertisements pi-hole.net
539 points by goblin89  2 days ago   306 comments top 40
laumars 2 days ago 4 replies      
For those of us - myself included - who run a hosts file list (either using dnsmasq like Pi-hole, or directly), here are the sources that Pi-hole uses so you can add them to your own solution:


There's a few on there I don't use and will look to implement. There's also a few they seem to have missed (perhaps intentionally?) so below I have included the lists I use in case it's useful for anyone else:

 http://someonewhocares.org/hosts/hosts
 http://winhelp2002.mvps.org/hosts.txt
 http://adaway.org/hosts.txt
 http://pgl.yoyo.org/adservers/serverlist.php?hostformat=hosts&showintro=0&mimetype=plaintext&useip=
 https://raw.githubusercontent.com/StevenBlack/hosts/master/data/StevenBlack/hosts
 http://www.malwaredomainlist.com/hostslist/hosts.txt
 http://www.montanamenagerie.org/hostsfile/hosts.txt
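For anyone scripting this by hand rather than using Pi-hole's tooling, here is a minimal sketch of merging several hosts-format lists into one deduplicated file. Local stand-in files are used instead of real downloads so the example is self-contained; in practice you would `curl` each URL above.

```shell
# Stand-ins for two downloaded blocklists (real lists come from the URLs above).
printf '0.0.0.0 ads.example.com\n0.0.0.0 tracker.example.net\n' > list1.txt
printf '0.0.0.0 tracker.example.net\n0.0.0.0 beacon.example.org\n' > list2.txt

# Normalize every entry to "0.0.0.0 <host>", drop comment lines, dedupe.
cat list1.txt list2.txt \
  | grep -v '^#' \
  | awk 'NF >= 2 {print "0.0.0.0 " $2}' \
  | sort -u > merged-hosts.txt

wc -l < merged-hosts.txt   # -> 3 unique entries
```

The merged file can then be handed to dnsmasq with its `addn-hosts=` option.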

QuadrupleA 2 days ago 9 replies      
I definitely hate ads like anyone, but it should be acknowledged that ads serve a purpose too, of creating an economy for free content and giving people an avenue to get some income in exchange for their efforts, and possibly be able to devote their full attention to a free project and not have to support themselves with other income. Ads give creators some incentive to create stuff - although the income generally isn't great unless your audience is massive.

That said, the ad networks out there seem pretty awful in terms of privacy, slow-running javascript code mess, huge images or videos adding megabytes to the size of what should be a simple text page, etc. So if it's not doing it already perhaps this project can filter and put some pressure on the ad networks to clean up their mess a bit and not harm the user experience so much, and if an ad network is playing nicely, allow it through as a way to support free projects and their creators.

nkkollaw 2 days ago 14 replies      
With all the ad blocking technologies that are coming up, I wonder if Google is devising something to counteract these efforts.

For instance, since browser-based ad blockers work from what I know by blocking known domain names, couldn't Google create random subdomains and serve the code from a different subdomain every day or even every few hours, as well as change the way their JavaScript and HTML looks?

Even something very expensive to run would be justified with all the money that advertising brings in.

If Google can create software that can tell what's in a picture, or if a person in a picture is happy or not, why can't they find a way to fool ad blockers..?
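On the random-subdomain idea: DNS-level blockers don't have to match exact hostnames. dnsmasq's `address=/domain/` syntax (which Pi-hole builds on) blackholes every subdomain of a zone, so rotating the left-most label doesn't help by itself. A toy suffix-matcher illustrating the idea (the domain names are made up):

```shell
# Suffix match on the registered domain, the way address=/adservice.example/
# would behave: any subdomain, however random, still ends in the blocked zone.
blocked() {
  case "$1" in
    adservice.example|*.adservice.example) echo blocked ;;
    *) echo allowed ;;
  esac
}

blocked x7f3a9.adservice.example   # -> blocked, despite the random label
blocked www.example.org            # -> allowed
```

Changing what the JavaScript and HTML look like would matter to content blockers like uBlock, but not to a resolver that only ever sees hostnames; the harder counter is serving ads from the same domain as content.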

ckastner 2 days ago 7 replies      
The installation shortcut given is

 curl -sSL https://install.pi-hole.net | bash
and one is expected to execute this as root.

Yes, I know this is supposed to be a convenience thing, but I wish people wouldn't actively encourage this pattern.
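One less risky pattern: save the script, read it, record a checksum, and only then execute it. The Pi-hole URL is from the comment above; the fetch is commented out and replaced by a local stand-in so the sketch runs offline.

```shell
# Real fetch would be: curl -sSL -o install.sh https://install.pi-hole.net
printf 'echo "installing..."\n' > install.sh   # stand-in for the downloaded script

cat install.sh                            # actually review what is about to run (as root!)
sha256sum install.sh > install.sh.sha256  # pin a hash you can re-check later
sha256sum -c install.sh.sha256            # verify before executing

sh install.sh                             # -> installing...
```

The point is the pause between download and execution: you can diff against a previously pinned hash, so a server compromise between your review and your install doesn't silently change what runs.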

staunch 2 days ago 2 replies      
One thing is for certain, ad blocking is going to become more and more prevalent and never less.

The end game for ad blocking is to all but eliminate advertising. An ad blocking client could, ultimately, just block any domain that has aggressive anti-ad block features.

With enough users doing this, new sites that are ad free would quickly replace the old ad driven sites. Some of the ad driven sites would modernize.

Ads are a failed path. By eliminating ads we open the door to novel solutions. Only a cynical fool could believe technology isn't up to solving this minor problem. There are already a dozen potential solutions waiting for the incentives to change.

gwu78 2 days ago 3 replies      
Default settings use remote, shared DNS caches run by an advertising company.

Regardless, this is a step in the right direction. DNS is highly effective for filtering out advertising.

Personally I just run my own authoritative nameserver(s) with all the IP addresses I need. No recursive cache.

When I browse to websites where I have never been and may not return, I am never using a graphical browser that loads "resources" automatically from any random domain.

I am using a browser I compiled myself. I am only reading text.

Binary resources, e.g., video, can be downloaded non-interactively with an ftp/http client.

If it is an important website that I use repeatedly, then I have all the IP addresses for the resources the website's pages will need stored in zone files. Then it is "safe" to use a browser written by a company that makes money from ads. All DNS requests are answered by my server(s).

I can retrieve (refresh) the IP addresses for my zone files very quickly with custom software I wrote to do this. My lookups are faster than a cold recursive cache and send out fewer requests.

IMO, the way to think about "ad-blocking" is not to try to imagine how to block every possible ad server. Instead, just focus on what web content you want and figure out what addresses you need to get it.

At one point a certain browser written by an advertising company had its own DNS resolver. Imagine your /etc/resolv.conf being completely ignored. Food for thought.
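A rough dnsmasq sketch of the "answer everything from my own data" setup described above. The file path and blocked domain are illustrative, and gwu78 runs a full authoritative server, which this only approximates:

```shell
cat > dnsmasq-local.conf <<'EOF'
# Never forward queries to an upstream resolver.
no-resolv
# Curated hostname -> IP mappings, in hosts-file format.
addn-hosts=/etc/dnsmasq/zone-hosts
# Blackhole a domain and all of its subdomains.
address=/doubleclick.net/0.0.0.0
EOF

grep -c '^[a-z]' dnsmasq-local.conf   # -> 3 directives
```

With `no-resolv` set, anything not in your own data simply fails to resolve, which is the whole point of the approach.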

Animats 2 days ago 5 replies      
This is just a DNS filter. Why is it a big deal? DNS filters sort of work, but they're not new or magic. They trigger some ad-blocking detectors when the ads don't load.
j_s 2 days ago 1 reply      
You may have missed the precursor discussion this past weekend, a walkthrough of setting up pihole on VPS:

Set up a cheap cloud hosted adblocker in an hour for $2.50 a month


Of particular added value there was mention of Android apps that can be set up to self-host an ad-blocking VPN / hosts filtering without rooting: https://news.ycombinator.com/item?id=13853408


NetGuard is the first free and open source no-root firewall for Android.

nikon 2 days ago 0 replies      
Using this at home. Really interesting to see the blocked domains on the dashboard. Realised my two Samsung Smart TVs were constantly calling home, for example. You'll eventually whitelist some things that break, like Spotify/Sonos IIRC.

I may switch to an Odroid C2 if I go with a permanent VPN connection as the throughput of the RPi3 network port is not the best.

allendoerfer 2 days ago 1 reply      
I love how they recommend curl-bash-piping, but then put a disclaimer beneath it, even with a detailed post about it. As if people would not just copy paste it anyway. I think they were just trying to dodge the usual curl-pipe-bash-is-evil comment thread unsuccessfully, since I started it anyway.
kuon 2 days ago 1 reply      
I've been running DNS-based ad blocking for ages, but I realized recently that YouTube has been serving ads from the same domain as regular videos. I wonder if anybody has seen this.
24gttghh 2 days ago 0 replies      
I went so far as to set up Dnscrypt with a pi-hole setup recently and it was almost as painless as advertised. And it finally gave me something productive to do with my RPi3! https://github.com/pi-hole/pi-hole/wiki/DNSCrypt could use a little wordsmithing, but it wasn't too bad.
crorella 2 days ago 2 replies      
When I tried Pi-hole I often noticed some URLs were added 'automagically' to the whitelist; they would show up a few days after I removed them. All of them were weird domains.
a3n 2 days ago 3 replies      
This could be in response to any number of comments on this page:

I use uBO and a few other blockers. I almost never see an ad.

A few days ago I saw an ad, and I was surprised. It was for Cadillac cars. I hovered over the ad, and it seemed to go directly to cadillac.com. And I was sort of OK with that.

The page, and the ad, seemed to be designed like any other legitimate link to another page or site. I don't know how the image made its way on to the page and in to my browser, but it appeared much less intrusive than a totally ad network-served ad.

Certainly the 1st party site could collect data about my visit and send it somewhere, but at least they appear to be more in the loop than just opening their site to all comers.

And if I clicked through to cadillac.com, they could do the same.

Anyway, that's more along the lines of what I've been wishing for as a consumer in web ads.

JustSomeNobody 2 days ago 0 replies      
Wow, somebody here hates the author. Every one of their comments is downvoted to death.

What gives? Did they do something to make people mad? I'm really confused.

seedifferently 2 days ago 1 reply      
For anyone interested in a cross-platform single-binary alternative to Pi-hole, I've been hacking on this: https://github.com/seedifferently/nogo

(Disclaimer: I am the author of nogo)

gcb0 2 days ago 2 replies      
solution that doesn't require a dns server (or can be a dns server local cache)


I add this to my modem/wifi AP and then just let every device use it to resolve. If the device allows setting a hosts file, I also add a local copy for when I am not on my network.

ThenAsNow 2 days ago 1 reply      
I have to admit to doing as little as possible from web browsers on phones, but on the desktop I rely extensively on uMatrix + NoScript (don't know if adding PrivacyBadger on top would buy me anything). However, NoScript for Android seems to be moribund and I don't think there is a uMatrix for Android either. DNS-based ad-blocking seems very 90s (i.e., designed for an era that's less invasive than today), and there's a ton of javascript content that really needs to be filtered as well if you want to counter all the ads + tracking. Is there any equivalent to NoScript + uMatrix on Android?
geuis 2 days ago 1 reply      
I haven't had time to look at the code yet. I have some questions though.

As an experiment a while back I wrote a simple dns server that blocked ad-related domains. https://github.com/geuis/lead-dns. While it technically worked, it made using the web almost non functional. Nearly every site was broken in some way. So blocking purely by domain isn't going to work. I wonder how pi-hole is dealing with it.

no_wizard 2 days ago 1 reply      
Pi-hole is pretty cool, very 'plug n play', which I like. A sufficiently advanced average user can set it up without too much trouble just by following a guide; even a relatively tech-savvy 'lay' person can do this.

If you like a more technical solution I prefer something like running a Unbound + NSD server

Here's some great tutorials on that:

(Kudos to the people who write Calomel; I really liked these tutorials. They were a great way for me to get started and to look into these services more deeply once I understood what was going on.)



Pairing that with squid proxy can be the ultimate win:




and don't forget dnscrypt people!


I'm really big into having ones own DNS server on the network instead of completely using outside solutions. There is little overhead with a sufficiently modern implementation.

Also, these solutions run on FreeBSD/OpenBSD for those who prefer.

As a complete aside. Aren't most routers, esp. business class routers, running modified Unix/Linux anyway? Why on earth hasn't a reputable company made a guns ready router that lets you have access to the Linux/Unix underpinnings without flashing (albeit awesome) Open Source alternatives? I would think in the 'business/enterprise' class hardware side this would be more prevalent.

Maybe I just don't know of any solutions like that available stateside. I found one in Europe:


Can't get it stateside though :(

I instead custom built most of my networking hardware...but still.

toad_tyrrant 2 days ago 2 replies      
I just set this up at home yesterday (using an Odroid C2). A very pleasant experience so far.

I'm trying to find other services that are worth running in a similar fashion. Any ideas?

WhizzoButter 2 days ago 0 replies      
The Connectify Hotspot app added a similar feature recently. It'll block ads for all clients connected to the hotspot. It's Windows-only though: https://www.connectify.me/blog/block-annoying-ads-connectify...
vxNsr 2 days ago 1 reply      
How are people dealing with whitelisting when a website breaks because of an overzealous block? I find myself turning off uBlock for some websites at least once a day, not because there are ads on the page but because they're pulling in a dependency somewhere that has been blacklisted.
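With dnsmasq-style blocking there is a config-level escape hatch: domain directives use longest-suffix matching, so a more specific rule can punch a hole through a blackholed zone. A sketch, with made-up domains (`#` in dnsmasq's `server=` syntax means "use the normal upstream servers"):

```shell
cat > whitelist-example.conf <<'EOF'
# Blackhole the whole zone...
address=/example-cdn.net/0.0.0.0
# ...but let this one needed subdomain resolve normally upstream.
server=/assets.example-cdn.net/#
EOF

grep -c '^[as]' whitelist-example.conf   # -> 2 directives
```

That covers the DNS-level case; for element-level blockers like uBlock, per-site toggling remains the only practical answer.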
__oz 2 days ago 4 replies      
+1 for pfSense port.

Also, not sure how I feel about having this device as my primary DNS server for my entire internal network. What if the project gets compromised and injects a number of malicious DNS entries, now my entire network is toast?

ryandrake 2 days ago 0 replies      
I've been doing this at home for probably close to 10 years or so, set up manually using dnsmasq, and periodically fetching new blacklists. Nice to see it wrapped in a tidy package--great work.
vanekjar 2 days ago 1 reply      
Cool project! Great idea and nice UI. I tried it recently and it works fine with HTTP.

The problem is with ads served via HTTPS, and since most pages today use HTTPS, Pi-hole is kinda useless there.

For reference on this topic: https://discourse.pi-hole.net/t/websites-hanging-timing-out-...

KiDD 23 hours ago 0 replies      
I tried using Pi-hole but I had so much trouble with IPv6 DNS not working that I just gave up...
philplckthun 2 days ago 3 replies      
This is really neat! I'm wondering why the name is so Raspberry Pi-specific ;)

Is there a docker container for it already, by any chance?

verdverm 2 days ago 0 replies      

Is a golang ad-hole. I've found it to be more performant in both the DNS/server and UI

Walf 2 days ago 0 replies      
Network-level blocking means no easy opt-out on a per-site basis. You cannot choose to support certain sites, or view occasional sites that, understandably, detect ad-blocking.
zakk 2 days ago 2 replies      
Stupid question: how does it work when the majority of traffic is through SSL, making a request to an ad and a request to actual content indistinguishable?

I don't think all websites serve ads from a different host. Do they?
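It works because the hostname leaks before any encryption starts: the client does a plain DNS lookup (and repeats the name in the TLS SNI field) before the HTTPS request is made, so a resolver can answer with 0.0.0.0 for ad hosts it recognizes, regardless of whether the payload would have been encrypted. A toy resolver table showing that decision point (domains are made up; a real Pi-hole holds many thousands of entries):

```shell
# Hosts-style block table.
cat > blocklist.txt <<'EOF'
0.0.0.0 ads.example.com
EOF

# Resolve a name against the table; unknown names fall through ("NXDOMAIN"
# here stands in for forwarding to the upstream resolver).
lookup() {
  awk -v h="$1" '$2 == h {print $1; found=1} END {if (!found) print "NXDOMAIN"}' blocklist.txt
}

lookup ads.example.com   # -> 0.0.0.0 (blocked before TLS even starts)
lookup www.example.com   # -> NXDOMAIN (would be forwarded upstream)
```

And to the second question: this only works when ads come from a separate hostname. Ads served from the same host as the content are indeed invisible at this layer.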

shade23 1 day ago 0 replies      
Considering this happens at the DNS level, any idea how the websites which do not let you view content until you disable the ad blocker will react?
jerrac 2 days ago 1 reply      
I wonder if there's any way to integrate this with OPNSense firewall. I already have that set up as my LAN's dns server so I can name my internal computers.
SnaKeZ 2 days ago 0 replies      
Link to the Android Client (unofficial):


pix64 2 days ago 3 replies      
I feel like this is a bit dangerous because it doesn't allow the end user to disable it if it breaks a website, which is quite common.
Vanayad 9 hours ago 0 replies      
what about windows ? :(
libeclipse 2 days ago 2 replies      
I used this for a while but I found that it was unbearably slow. Perhaps it's too much to ask of a Raspberry Pi 3.
jedisct1 2 days ago 0 replies      
dnscrypt-proxy can do filtering as well: https://github.com/jedisct1/dnscrypt-proxy/wiki/Filtering
gravypod 2 days ago 2 replies      
What kind of minimum resources are required to run this? Can I run this on my 128MB VM?
Introducing Create React Native App facebook.github.io
530 points by dikaiosune  2 days ago   156 comments top 20
hasenj 2 days ago 2 replies      
I started a project in React Native a few months ago, then left it for a couple weeks, and when I came back to it, I did the usual chore of updating libraries and found that I could no longer compile/run the project. Purged and re-created my node_modules folder to no avail. Things kept getting worse and I simply have no clue what is going on. I started a fresh RN project and even that would not run.

I got the feeling that it's layers upon layers of leaky abstractions and the whole thing seems rather fragile.

dikaiosune 2 days ago 3 replies      
Hi! I wrote a bunch of the code for CRNA (although it obviously builds on a lot of great work from Facebook and Expo people). Happy to answer questions.

I'll be doing a brief demo during my lightning talk at ReactConf -- livestreamed at ~5:45pm pacific time today at http://conf.reactjs.org/livestream.

baron816 2 days ago 11 replies      
What are people's feelings towards React Native vs actual native mobile development these days? Are people starting to get the sense that this is the future and coding in Swift/Objective-C/Java for mobile is dying off?
RubenSandwich 2 days ago 2 replies      
I'm a bit unsure about this decision, because it ties React Native to a private entity, namely Expo, that has not open sourced its toolchain used here. Expo takes your code, builds it on their servers, and then serves it to you. That toolchain, unlike React Native, is not open source. This creates a walled garden around Expo, which could easily decide to charge for its server fees. Yes, it is a much easier onboarding experience than installing Xcode and Android Studio, but at least with Xcode and Android Studio you own the whole toolchain. Because of this I don't think it is comparable to Create React App, which is excellent.

Edit: Expo has open sourced their code, see below. And CRNA does not build code on their servers; I assumed it had the same workflow as their XDE tool. It does not. But I'm still unsure of this decision. It makes Expo a vital part of React Native, or at least puts them in a position to become one.

donjh 2 days ago 0 replies      
Create React App was a great addition to the React ecosystem and definitely made the library more approachable. Excited to check this out!
brilliantcode 2 days ago 3 replies      
Has the Android side improved with React Native? The last time I took a look at this, which was last year, the consensus was that the Android result wasn't optimal.
eeeze 2 days ago 1 reply      
Thanks for the great work on it! Will definitely check it out.

As someone who has been using Expo for a while: how is this different from exp start and the rest of the platform? It seems like another interface or entry point?

hypercluster 2 days ago 1 reply      
Always interesting to see how cross-platform is evolving (this, and also the Xamarin update recently); I wonder how Google and Apple regard this. Anyway, I just tried out the Expo app on iOS and noticed that the back-swipe animation does not seem to be native in the examples. Is there a reason for that? (That's something I always try if I want to check whether an app is native.)
sprite 2 days ago 2 replies      
A bit off topic, but how is React Native performance compared to native? Let's say you did an Instagram-type timeline; would you be able to get the same scrolling performance with RN as going native?
andrewfong 2 days ago 2 replies      
Are there any tutorials or documentation on integrating Expo with a different RN build process? For example, if we have an existing setup with TypeScript or want to use Mocha instead of Jest or whatever.
ourcat 2 days ago 4 replies      
Didn't Apple start rejecting apps recently which do 'hot code' pushes to update apps remotely? (As the video on the Expo.io site says)
fleshweasel 2 days ago 3 replies      
I don't see why there can't be React-like libraries written and used in languages that compile to native. I'm not expecting to have JSX but I should be able to write component classes and implement their render methods, returning view trees written with some kind of object/array literal syntax.

To get as good of a development experience as React, it would require some work by the compiler and runtime people to basically let you do something like hot loading--Android has something like this now, and maybe Apple will get it too, though I'm not holding my breath.

I think it's a no-brainer for web development these days to do React because 1. you can opt out of it for parts of the page it's not going to work with for whatever reason and 2. the performance is pretty damn good compared to lots of alternatives, including writing all the UI state management logic yourself. However, I've not been convinced that the buy-in is worth it for native mobile development. Can someone who knows more tell me: is it fairly easy to do something like say "I can't/don't want to use React Native for this view controller--I'm going to implement it in code and use it and everything will just work."

Traubenfuchs 2 days ago 1 reply      
Crashed my phone (an infinite number of "Loading js bundle" popups appeared, creating a solid black background from all the shadows) and then it killed my server (EDIS/waveride vserver, now unreachable from outside, probably some kind of overusage protection).


swah 2 days ago 0 replies      
Interesting, this probably kills Phonegap now :) :(
Kiro 2 days ago 1 reply      
Is there any reason to use this rather than "New Project" in the XDE if you're already an Expo user?
rtw1981 2 days ago 0 replies      
sAbakumoff 2 days ago 1 reply      
What is it with React Native? HN's front page today:

2. React Native Navigation Library (airbnb.io)

3. Introducing Create React Native App (facebook.github.io)

8. Wix Releases a React Native Physics Engine (github.com)

intrasight 1 day ago 2 replies      
Serious question: Isn't React Native just a short-term fix while we wait for WASM?
kikimora 1 day ago 0 replies      
First of all - I enjoy using React in the browser and consider it the best browser UI tech out there. At the same time, I've been working with desktop and mobile apps for the past 12 years, and there are things that make me very suspicious about React Native.

1. Native apps do not have anything like the browser DOM. Most complex controls recycle views based on internal control logic. I don't see how the Virtual DOM idea can be mixed with that. UITableView does not care about the Virtual DOM and there is no easy way to apply a Virtual DOM diff to a particular cell. To make the default iOS table Virtual DOM-aware you would have to rewrite it from scratch. Quote from https://medium.com/@talkol/recycling-rows-for-high-performan... :
>Recycling previously allocated rows that went off-screen is a very popular optimization technique for list views implemented natively in iOS and Android. The default ListView implementation of React Native avoids this specific optimization in favor of other cool benefits

2. Running event handlers in a separate thread is a very bad idea. Appcelerator Titanium does the same thing, and it gave me all sorts of trouble in the past. For example: you scroll a UIScrollView and it generates scroll events. These events are asynchronously delivered to the JS thread. The JS thread changes the view position. UIScrollView expects these changes during the scroll event, not after, and does not optimize for that. As a result you get gazillions of layout/scroll events, which kills layout performance. You also get tons of locking/unlocking as a bonus. In Titanium, the delay between the UI thread sending an event and the JS thread receiving it was visible to the naked eye. It was virtually impossible to create responsive UI due to these delays. React Native seems much better in this area, but I suspect that complex cases like the one I described above will break.

3. People have been trying to build cross-platform UI for decades. There are dozens of desktop technologies in this space. None of them got even close to the point of replacing native UI. Maybe React Native will be the first one, but statistics are not on its side. I think technologies which draw all UI themselves do better than those which provide cross-platform wrappers around native controls:
- Java Swing, WPF, maybe Qt: see some use, mostly in enterprise.
- Java SWT, wxWidgets (I can add AWT here): only SWT is still being used, in Eclipse; the others are pretty much dead.

So I'm really sceptical about React Native's future as a replacement for native UI. It might be useful in niche areas, but polished apps might be very problematic.

You might also find this interesting:
- Bloomberg rewrote its consumer app with React Native: https://www.techatbloomberg.com/blog/bloomberg-used-react-na...
- Customers rate it one star: https://itunes.apple.com/us/app/bloomberg/id281941097?mt=8

A couple of years ago I spent a few months digging into Appcelerator Titanium, which is also based on JS and in a few ways similar to React Native. After fixing a few bugs in the Titanium code base and even making myself familiar with WebKit internals, I can tell that Titanium is complete garbage without any hope for improvement. The major reason for this: they decided to run JS on a separate thread for performance reasons.

kikimora 1 day ago 0 replies      
First of all - I enjoy using React in the browser and consider it the best browser UI tech out there. At the same time, I've been working with desktop and mobile apps for the past 12 years, and there are things that make me very suspicious about React Native.

1. Native apps do not have anything like the browser DOM. Most complex controls recycle views based on internal control logic. I don't see how the Virtual DOM idea can be mixed with that. To make UITableView Virtual DOM aware you would have to rewrite it from scratch. Quote from here - https://medium.com/@talkol/recycling-rows-for-high-performan...

>Recycling previously allocated rows that went off-screen is a very popular optimization technique for list views implemented natively in iOS and Android. The default ListView implementation of React Native avoids this specific optimization in favor of other cool benefits
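The recycling technique described here can be sketched in a few lines (a language-agnostic illustration written in Python; the class and field names are hypothetical, not UIKit's):

```python
# Sketch of list-row recycling as done natively by UITableView/ListView:
# a small fixed pool of row views is rebound to whichever rows are on
# screen, so view count stays O(viewport) rather than O(dataset).

class RecyclingListView:
    def __init__(self, row_height, viewport_height, data):
        self.row_height = row_height
        self.viewport_height = viewport_height
        self.data = data
        # one spare view for a partially visible row at each edge
        pool_size = viewport_height // row_height + 2
        self.pool = [{"bound_index": None, "content": None}
                     for _ in range(pool_size)]

    def scroll_to(self, offset):
        first = offset // self.row_height
        last = min(len(self.data) - 1,
                   (offset + self.viewport_height) // self.row_height)
        visible = range(first, last + 1)
        # "recycle": rebind existing pool views instead of allocating new ones
        for view, index in zip(self.pool, visible):
            view["bound_index"] = index
            view["content"] = self.data[index]
        return [v for v in self.pool if v["bound_index"] in visible]
```

Scrolling anywhere in a 10,000-row list rebinds the same handful of view objects, whereas a Virtual DOM diff reasons over the full list of row nodes - which is why the two models are hard to mix.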

2. Running event handlers in a separate thread is a very bad idea. Appcelerator Titanium does the same thing, and it gave me all sorts of trouble in the past. For example - you scroll a UIScrollView and it generates scroll events. These events are asynchronously delivered to the JS thread. The JS thread changes the view position. UIScrollView expects these changes during the scroll event, not after, and does not optimise for that. As a result you get gazillions of layout/scroll events which kill layout performance. You also get tons of locking/unlocking as a bonus. In Titanium, the delay between the UI sending an event and the JS thread receiving it was visible to the naked eye. It was virtually impossible to create a responsive UI due to these delays. React Native seems much better in this area, but I suspect that complex cases like the one I described above will break.
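A toy model of that failure mode - scroll events queued asynchronously to a "JS" thread that reacts only after the UI has moved on - might look like this (an illustrative sketch in Python, not Titanium's or React Native's actual mechanism; the drain interval is a made-up stand-in for thread-scheduling latency):

```python
from collections import deque

def stale_update_count(scroll_offsets, js_drains_every=3):
    """Count how many scroll events the 'JS' side handles after the UI
    has already scrolled past the offset the event carried."""
    queue = deque()   # asynchronous hand-off between the two "threads"
    ui_offset = 0
    stale = 0
    for tick, offset in enumerate(scroll_offsets, start=1):
        ui_offset = offset            # UI thread keeps scrolling...
        queue.append(offset)          # ...and posts the event for later
        if tick % js_drains_every == 0:   # JS thread finally gets scheduled
            while queue:
                seen = queue.popleft()
                if seen != ui_offset:     # reacting to a stale position
                    stale += 1
    return stale
```

With synchronous delivery (js_drains_every=1) no update is stale; with even a small delay, most of them are - which is the layout churn the comment describes.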

3. People have been trying to build cross-platform UI for decades. There are dozens of desktop technologies in this space. None of them got even close to the point of replacing native UI. Maybe React Native will be the first, but the statistics are not on its side. Technologies which draw controls themselves do better than those which provide cross-platform wrappers around native controls. Java Swing, WPF, maybe Qt - see some use, mostly in enterprise. Java SWT, wxWidgets (I can add AWT here) - only SWT is still being used, in Eclipse; the others are pretty much dead.

So I'm really sceptical about React Native's future as a replacement for native UI. It might be useful in niche areas, but polished apps might be very problematic.

You might also find this interesting:
- Bloomberg rewrote its consumer app with React Native - https://www.techatbloomberg.com/blog/bloomberg-used-react-na...
- Customers rate it one star - https://itunes.apple.com/us/app/bloomberg/id281941097?mt=8

A couple of years ago I spent a few months digging into Appcelerator Titanium, which is also based on JS and in a few ways similar to React Native. After fixing a few bugs in the Titanium code base and even making myself familiar with WebKit internals, I can tell that Titanium is complete garbage without any hope for improvement. The major reason for this: they decided to run JS on a separate thread for performance reasons.

Facebook's code quality problem (2015) darkcoding.net
496 points by setra  2 days ago   226 comments top 45
combatentropy 2 days ago 2 replies      
I look at programming as all design, whether deliberate or not. Martin Fowler drove the point home for me (https://www.martinfowler.com/articles/newMethodology.html).

He said that some people drew inspiration from construction, where there are designers and builders. One or two highly paid architects draw up the plans. Then you can hire a bunch of cheap labor to build it, say a bridge. This belief leads to a dichotomy in software companies, where one person is the "architect" and others are just "regular" programmers --- often outsourced to the lowest bidder.

From Jack Reeves he cites the epiphany "that in fact the source code is a design document and that the construction phase is actually the use of the compiler and linker." There is little repetitive, mindless work in programming, because as most of you know, "anything that you can treat as construction can and should be automated." Therefore, "In software all the effort is design . . ."

charleslmunger 2 days ago 4 replies      
"Exhibit C: Our site works when the engineers go on holiday"

That is not so different from "Fewer patients die when heart surgeons are on vacation." Of course your site is going to be more reliable when nobody is changing it! You should be worried if reliability isn't the steady state - if it requires constant changes to stay up!

paganel 2 days ago 3 replies      
I read Richard Gabriel's "Worse Is Better" essay in my first month hired as a professional programmer, that was almost 12 years ago. Since that time the lessons of that essay have only increased in significance for me, with this FB story as additional proof.

Their back-end code might be bollocks, and I can certainly believe that judging by how sluggish their FB app feels on my phone, but the fact is that they've conquered the Internet (together with Google and a couple of other companies). It's a fact that I personally hate, but they're still winners in the end.

thedevil 2 days ago 7 replies      
It's hard to argue with the business sense of pushing out features quickly at the cost of code quality.

But I've learned that there are two kinds of development teams:

(A) the teams that are "moving fast and breaking things" while "creating business value"


(B) the teams that are "only maintaining legacy code", "not getting anything done", "costing a lot of money" and "complaining too much" about code quality that team A wrote.

As an engineer, I've learned that it's less work and more rewarding to be on the (A) teams than on the (B) teams.

zo7 2 days ago 3 replies      
Some of FB's ~18,000 Obj-C header files from a post linked in the article: [1]

* FBFeedAwesomizerProfileListCardViewControllerListenerAnnouncer.h

* FBBoostedComponentCreateInputDataCreativeObjectStorySpecLinkDataCallToActionValue.h

* FBEventUpdateNotificationSubscriptionLevelMutationOptimisticPayloadFactoryProtocol-Protocol.h

* FBGroupUpdateRequestToJoinSubscriptionLevelMutationOptimisticPayloadFactoryProtocol-Protocol.h

* FBMemReactionAcornSportsContentSettingsSetShouldNotPushNotificationsResponsePayloadBuilder.h

* FBProfileSetEventsCalendarSubscriptionStatusInputDataContextEventActionHistory.h

* FBReactionUnitUserSettingsDisableUnitTypeMutationOptimisticPayloadFactoryProtocol-Protocol.h

oh my god

[1]: http://quellish.tumblr.com/post/126712999812/how-on-earth-th...

userbinator 2 days ago 2 replies      
It is ironic that a codebase with so many classes is likely to occur precisely because of dogmatic adherence to those "best practices" which were originally intended to improve code quality --- aggressive refactoring/modularisation being a likely culprit. What I think they really need is not more design, not more architecture, not more of anything but KISS and YAGNI.
fishnchips 2 days ago 1 reply      
I spent a few months at Facebook and left because of that. Not necessarily because of poor code quality per se, but because of all the issues it was causing, and a culture of turning a blind eye to (or worse still, being proud of!) those issues. It's obviously true that the quality of their code does not affect their market position, but it sure affects who they hire and who they retain.
dawidloubser 2 days ago 1 reply      
I would like to argue that, in most of these cases, it's not "code quality" that is at fault, but "design quality" (before coding) - which is often absent entirely.

The problem with design, in software, is not that most people forget to do it. It's that they never learn to do it. It always comes back to bite you.

I don't want to start a discussion on design, and how most people mess it up because of lack of skill or experience therein. But hacker culture seems to be allergic to design, and hacker culture seems to be what everybody strives for these days.

whack 2 days ago 0 replies      
I do think that FB, and most large orgs, have code quality problems. But the article does a pretty bad job at making its case.

> "Thats 429 people working, in some way, on the Facebook iOS app. Rather than take the obvious lesson that there are too many people working on this application, the presentation goes on to blame everything from git to Xcode for those 18,000 classes."

How does the author know that 429 is too many? How does the author know that FB's goals and functionality can best be achieved with fewer people/classes? This just reads like a classic "Why is Google so big, I can do that in a weekend" comment (https://danluu.com/sounds-easy/)

> "Our site works when the engineers go on holiday"

This is pretty much universally true for any dynamic site where engineers are constantly adding new features. Change always comes with risks. Changing code is always more likely to break something in the very short term, compared to not changing code. I have a hobby site (shameless plug, http://thecaucus.net) which has been running on auto-pilot for the past year and almost never breaks, because there are virtually no moving pieces. The fact that the FB site breaks more often when engineers are making changes to it is just repeating a universal law of software development.

I do think that the organizational structures at most large companies are bloated, inefficient, non-transparent, and produce sub-par code. I had high hopes when I read the article's headline, but the arguments presented simply aren't very persuasive.

rumcajz 2 days ago 2 replies      
I believe "our site works when the engineers go on holiday" thing is not fair. Of course that application is less stable when it's actively modified. There's no way to make it more reliable on weekdays than on weekends except for stopping the development altogether or maybe deploying on weekends.
ptero 2 days ago 0 replies      
Code quality is important, but for an organization the size of FB, good systems engineering is much more important than good software engineering. If the overall system is organized sanely, failures of subcomponents (which happen due to software bugs, hardware, humans, etc.) can be isolated, firewalled and fixed in reasonable time without bringing the system down.
BinaryIdiot 2 days ago 2 replies      
As pointed out, this is from 2015, but I'd love to hear some updates. Has any Facebook employee created newer presentations that discuss, say, their 18,000 iOS classes and the fact that in a single week 429 people contributed to the same iOS app?

I'd also love to hear an update to the "Our site works when the engineers go on holiday" claim.

planetjones 2 days ago 0 replies      
Well, when there is a codebase of the scale of Facebook's, with so many active contributors, of course there will be a code quality problem. I see so much stuff at Google breaking or becoming inconsistent too (away from the core search features).

Even if they didn't move fast and break things, they would still have code quality issues. It's just the nature of a huge codebase like theirs. And let's face facts - the development process needs to be suitable to the type of problem being solved. This is not a life-and-death situation, and users value new features more than extreme reliability and consistency.

mybrid 2 days ago 0 replies      
It is worth noting that when something goes viral, we still don't know why. Facebook represents the ultimate in viral success. Given that we do not understand why things go viral, it stands to reason that the underlying software methodology is not the cause. This means that the fickle finger of fate could easily be visited upon Facebook, the same as with MySpace, and Facebook will have no control over it. This is independent of any software methodology they use.

Claims made without evidence can be dismissed without evidence.

aamederen 2 days ago 6 replies      
I believe that, from time to time, you just need to stop developing new features for a while and dedicate the team to improving overall quality: paying down technical debt, hunting bugs, removing legacy code and refactoring what remains.
amelius 2 days ago 0 replies      
It also seems that improvements in code quality caused by React happened not because they were driven by management, but because a few "heroic" engineers thought it was an interesting side-project. At least, that's my impression from the publicly available talks they gave.
bobbyi_settv 2 days ago 0 replies      
> The second exhibit is from Facebook Research... wouldnt you just write your disk files to a ramdisk? Surely they noticed this too

I'm not sure the author understands what a Research team is and what it does. Trying out a new solution to an existing problem is sort of their job. I'm also not sure how a research team publishing a paper discussing an alternate solution indicates anything about the company's "code quality".

watwut 2 days ago 0 replies      
The reality is that most companies, including most startups, attempting to create software of this size end up tangled in their own mess far sooner, and fail. Keeping it maintainable at such scale is not as easy as keeping it maintainable when there are four of you.
segmondy 2 days ago 18 replies      

I've become very cynical over the past year or so, because there are a lot of noisy articles on the net, all knowing what's best for you or what's wrong with what you are doing, yet without context or true authority to talk about it. My question these days when I read such an article: "What has the author done that is equivalent?" There is rarely anything to be found.

Facebook might have a "code quality problem." But until you have worked at an organization that is big like Facebook please hold forth your tongues or fingers in this case. Startup principles don't apply nor academia. Facebook's "code problem" is what really does happen in the real world with such humongous business enterprises. Lot's of moving pieces in code, in people, in ideas and the amalgamation of these results in what most programmers will see as "code quality problem" yet the market sees as billions of dollars.

chiefalchemist 2 days ago 0 replies      
But seriously (i.e., ref to my previous comment). All the tools. All the talent. All the leadership and management. All the money. And still even the mighty FB struggles.

Point being: This shit might not be rocket scientist hard, but it ain't easy either. And when you don't have the war chest of the likes of FB you're in for an ongoing and never ending (quality) battle.

sz4kerto 2 days ago 1 reply      
It's interesting to note though that they have code quality problems but that doesn't prevent them from being massively successful.
raygelogic 2 days ago 0 replies      
this strikes me the same way as the story of the guys who figured out where to add armor to bombers during WW2. the places on the plane that had bullet holes after a mission clearly were operational enough to return to base. the places that never had any damage were probably critical, so that's where armor was added.

if a company can make billions with a poor-quality codebase, clearly quality isn't a bottom-line concern.

what is a concern is shipping the damn product.
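The bomber story above is the classic illustration of survivorship bias, and a tiny simulation makes the effect concrete (a sketch; the hit and loss probabilities are made up for illustration):

```python
import random

def survivor_engine_fraction(n=10000, p_engine_hit=0.5,
                             loss_if_engine=0.8, loss_if_fuselage=0.05,
                             seed=1):
    """Fraction of *surviving* planes whose damage was to the engine.
    Engine hits are just as common, but mostly fatal, so they are
    under-represented in the damage the inspectors get to see."""
    rng = random.Random(seed)
    engine = fuselage = 0
    for _ in range(n):
        hit_engine = rng.random() < p_engine_hit
        p_loss = loss_if_engine if hit_engine else loss_if_fuselage
        if rng.random() >= p_loss:   # plane made it home to be inspected
            if hit_engine:
                engine += 1
            else:
                fuselage += 1
    return engine / (engine + fuselage)
```

Half of all hits are to the engine, yet among returning planes the observed engine-damage fraction is far below 50% - naive armoring based on visible holes would go exactly where it is least needed.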

p0nce 2 days ago 0 replies      
At this point it is clear that software size increases linearly with the size of your software team, whatever problem you solve.

This affects every organization and is something that should be actively fought against. Having accidental, unplanned, unaccounted-for costs should not be the default path.

strken 2 days ago 0 replies      
Can someone explain the example with the ramdisk? It's not clear to me how a ramdisk is relevant to the problems put forward in the paper, i.e. disk is too slow, they rely on lazy page allocation, and they have insufficient ram to hold two copies of everything simultaneously.
jheriko 2 days ago 0 replies      
one look at hhvm tells me they have poor quality development practices. :P

i get the impression that they use the struct keyword to avoid having to type public everywhere for instance...

dep_b 2 days ago 0 replies      
code quality can't handle our scale
_pmf_ 2 days ago 0 replies      
> The article moves on, without wondering whether releases regularly breaking your app are a normal part of the software engineering process.

To be fair, that's absolutely their call to make. Nothing of value is ever lost if your platform does not provide any non-ephemeral value.

ChrisRus 2 days ago 0 replies      
> From Jack Reeves he cites the epiphany "that in fact the source code is a design document and that the construction phase is actually the use of the compiler and linker."

Ha. That is laughable. If it were so, then we would be able to automatically graft features from one product to another. I contend this is exactly how SoCs are designed, and there's no reason why we shouldn't copy the methods of the digital design community (proven to scale, or you wouldn't be reading this).

tomkin 2 days ago 0 replies      
It's important to understand that Facebook is incredibly large, and the fact that any software scaled with them to the point they are presently at is an "on the shoulders of giants" situation that seems to actually be working well enough for them to become a billion-dollar enterprise.

Also, building infrastructure for scale that's not yet required is somewhat inefficient. But it does raise the question: if Facebook is limited by its infrastructure, what responsibility do they have to build software that continues to scale for future organizations?

norswap 2 days ago 1 reply      
I was doing iOS dev three years ago, and the Facebook lib for iOS was the worst piece of shit I ever had to deal with.

Their app also used to be unbearably sluggish despite not doing anything too fancy.

peletiah 2 days ago 0 replies      
Makes me wonder how their decision for an open-plan office is related with this...
sushobhan 2 days ago 0 replies      
I don't think all the things discussed are still relevant, as it's a fairly old post. Also, "best practices" are not always best, depending on the scale of the problem itself.
badpenny 2 days ago 0 replies      
I recommend reading They Write the Right Stuff[1] to anybody interested in NASA's approach to writing software.

[1] https://www.fastcompany.com/28121/they-write-right-stuff

baggachipz 2 days ago 0 replies      
> The Hack and Move fast and break things culture must make it very hard for developers to focus on quality.

Well, yeah, that's what the "break things" part means. The problem is when people/companies try to have it both ways. "Move fast and have high quality" isn't possible.

aqibgatoo 2 days ago 0 replies      
Bob Martin, a.k.a. Uncle Bob, always says that people consider patterns like MVVM, MVC, MVP, etc. as Architecture, when they are actually just for the UI layer. It happens commonly in mobile development.
Bahamut 2 days ago 0 replies      
Anecdotal evidence, but I have a friend who complained that some critical parts don't have tests, and a lot of developers are loath to write them. He got bitten by it when he made a change in an area with no test coverage and broke the spam filter.
alex- 2 days ago 0 replies      
> "when Facebook employees are not actively making changes . . . the site experiences higher levels of reliability."

This seems like a great thing to me. i.e. The system is stable and the error budget is being used to facilitate change.

pablo_fiesta 2 days ago 0 replies      
Rub lavender oil on the servers, say "web-scale" 3 times, click your heels and everything will be OK. Oh wait, but the Russians.
jyriand 2 days ago 0 replies      
Only Kent Beck can save them. :)
typetypetype 2 days ago 0 replies      
What's the point of an app? Generating high quality code or generating revenue? Until you hit intersections where revenue can be increased by changing code, the priority will always be on new or changed features.
chiefalchemist 2 days ago 0 replies      
Move fast...and just leave things broken. Users be damned.
dwils098 2 days ago 0 replies      
links to slides are all dead... in parent articles and reference articles
linkmotif 2 days ago 0 replies      
While the author wrote this article FB made millions of dollars.
neals 2 days ago 0 replies      
_pmf_ 2 days ago 2 replies      
If a company hails React as a sane way to develop software, you know where it stands.
Ask HN: Which Berkeley Courses Should I Archive?
494 points by berkeleyarchive  1 day ago   147 comments top 32
toomuchtodo 1 day ago 6 replies      
All of it has already been archived (EDIT: thanks to the hard work and quick response of ArchiveTeam and /r/DataHoarder).

[edit: link to Archive Team project-specific page removed to reduce excessive load; replaced with Archive.is link below]


SilasX 1 day ago 7 replies      
I deeply apologize if this is off topic, but this request highlights an issue with the original debate, where posters were tripping over each other to express indignation about Berkeley releasing these videos without disability accommodations, reiterating the same arguments for the ADA, and asserting that those same considerations apply here.

If you thought the judgment against Berkeley was justified - that they couldn't give away these videos for free without e.g. subtitles - are you equally against this effort? Because it accomplishes the exact same thing: the availability of some useful education videos that are unusable by (some) people with disabilities.

If you're not, how do you reconcile that? Your position seems to be something bizarre like:

A) "Yeah, Berkeley should release free, deaf-unusable videos, but gosh darnit, they better well do it through back channels, because we need to respect the disabled."

B) "It's great to release the videos, as long as someone other than Berkeley endorses and/or hosts them, because we need to respect the disabled."

I just don't see a way to reconcile them, and yet I get the impression that some posters here do hold both views (the judgment was justified, and this effort should not be halted/is good).

Edit: Here are the discussions I am referring to, courtesy of BenElgar:



ucb_throwaway 1 day ago 4 replies      
As a former UC Berkeley student, I just want to add that besides the public lecture videos, there are also many private, unlisted course videos on YouTube from the last couple years (after it became an issue). The Archive Team has missed these videos as they're only accessible via UC Berkeley's student portal if you're a student in the class. The Archive Team and current/former students need to work together here to retrieve the private YouTube playlists and download the videos.
huma 1 day ago 0 replies      
Not CS related, but I've found these courses to be of great value:

- Sociology 150A (Robb Willer): https://www.youtube.com/watch?v=edfKMAePWfE

- Geography C110 (Richard Walker): https://www.youtube.com/watch?v=oYR5PdPZ_w0&list=PL4rxxS6x1H...

- Physics 10: Physics for Future Presidents (Richard Muller): https://www.youtube.com/watch?v=6ysbZ_j2xi0

There's also an audio course on Buddhist Psychology by Eleanor Rosch, if you're interested. Now seems to be available only on iTunes.


nsrose7224 1 day ago 0 replies      
I'm not sure which of the following are actually available on Youtube, but here are the courses I enjoyed the most and I think are the most valuable as a CS major:

- CS161 Security (Wagner preferably)

- CS189 Machine Learning (Shewchuk)

- CS170 Efficient Algorithms and Intractable Problems

Not a comprehensive list, just my favorites.

osoba 1 day ago 0 replies      
Archive CS294-112 Deep Reinforcement Learning Sp17 https://www.youtube.com/playlist?list=PLkFD6_40KJIwTmSbCv9OV...

It's not on the main UC Berkeley YouTube account so I don't think it will be deleted but better safe than sorry

xyle 1 day ago 0 replies      
That is a great list to have so far! On top of that, I have to highly recommend CS168: Internet Architecture (preferably with Scott Shenker), CS 161: Computer Security (with either Wagner, or Weaver), and CS169: Software Engineering (with Armando Fox, also available on EdX). These are the 3 courses that were most influential on my Undergrad experience (on top of the 61 series and 162)
ruang 1 day ago 1 reply      
pavanky 1 day ago 1 reply      
Would it not have been better for Berkeley to reach out to the community and ask for help improving the captioning, if money was the issue? Wouldn't this be a legal way to solve the problem instead of taking the videos down for lack of funds?
CalChris 1 day ago 1 reply      
Dunno if CS 161, Computer Security, is available but it's a great course. Wagner or Weaver both have their points.

CS 70 should make the list but the lecture notes are enough.

Strongly prefer Kubi for OS or pretty much anything.

myth_buster 1 day ago 0 replies      
CS189 - I've been going through this course and I like the presentation and content. Especially all the plots that are brought in to explain things, and also the notes.


ilyaeck 1 day ago 0 replies      
How/where are you planning on making them publicly available? I suppose UC Berkeley might actually turn a blind eye if one were to simply repost them on YouTube - because UC meant for them to be public in the first place. Alternatively, maybe repost on Vimeo?
yourapostasy 1 day ago 2 replies      
The proximate cause of this removal is Gallaudet University [1] [2].

What possible motivation could have moved those employees to file on behalf of their university? Malicious intent (they didn't want the free material competing with their courses)? Lack of gratitude (the material is free, but that's not enough)? Zealotry (everything, even free content, must meet their ADA compliance standards)? Simple lack of thinking through potential consequences? Lack of Net citizenship/spirit? Anyone have any insight into the real story behind their protest? I don't want to excoriate them without knowing the whole story, it could have been just someone's doh! moment turned into a really bad outcome.

I know there are closed-captioning format standards. Perhaps someone can create a site that runs closed captioning underneath YouTube videos, with the closed captioning supplied by volunteers (kind of like how closed captioning was done by anime fans)? Hook it up to a GitHub backend so closed caption data can be refined by anyone, with appropriate sidebar discussions. Hook it up to Google Translate to generate Braille and foreign translations of the hand-curated closed captioning, and let users refine the auto-generated translations. Then prevail upon the DOJ and Gallaudet University to give this time to develop instead of hammering on UC Berkeley, and let Creative Commons-licensed closed captioning fill in the content everywhere for all access-challenged students, for all content? Google might be interested to use this as a corpus for Deepmind.

Gallaudet University could have pioneered a solution and become the world leader in automated accessible content generation working in partnership with Google Deepmind, for example. That would have brought in way, way more funding through licensing than this short-sighted approach they are taking now. If Gallaudet University established a CS focus upon this, it would draw in top global talent for a variety of specialties. HAL-like accurate automated lip-reading coming out of this, with even more accuracy when using mic arrays? Yes, please; you get something like that even 90% accurate and you just gave a mindgasm to every meeting-organizer in the world who wants meeting notes taken. And as much as you all hate meetings, if you had a near-irrefutable better-than-stenographer CYA from meeting notes just once, I guarantee you would love that mechanism, increased meetings frequency or no.

This is a darkening of the Net and education in general, and it should not stand. Unless the reporting on this development is simply not including their side of the story, that Gallaudet University is not front and center of this issue trying to get ahead of the outcome by seeking win-win solutions should be making them a pariah on the Net and in the education world. Gallaudet University pursuing a perfect-is-enemy-of-good tactic likely has not considered that they just pulled free education for hundreds of millions of young minds in the developing nations who cannot afford anything close to a world-class education, but have family and friends willing to translate for them. That's unnecessary.

[1] https://www.insidehighered.com/news/2017/03/06/u-california-...

[2] http://www.theblaze.com/news/2017/03/09/government-over-regu...

UPDATE: I glanced at the YouTube API, and it seems the IFrame Player API supports returning playback status and elapsed seconds [3], so closed captions running underneath an iframe'd video and dynamically responding to the state of the video appears feasible. Only a 5-minute glance, though, so I might have missed other bits for analyzing for feasibility.

[3] https://developers.google.com/youtube/iframe_api_reference#P...
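The client-side core of such an overlay - mapping the player's elapsed seconds (as reported by the IFrame API) to the active caption cue - is small. A language-agnostic sketch in Python, assuming cues are sorted, non-overlapping (start, end, text) triples parsed from a volunteer-supplied SRT/WebVTT file:

```python
import bisect

def active_cue(cues, elapsed):
    """Return the caption text covering `elapsed` seconds, or None.
    `cues` is a sorted list of (start, end, text) with start < end."""
    starts = [start for start, _end, _text in cues]
    i = bisect.bisect_right(starts, elapsed) - 1  # last cue starting <= elapsed
    if i >= 0:
        start, end, text = cues[i]
        if start <= elapsed < end:
            return text
    return None
```

Poll this once or twice a second from the player's playback callback and render the result under the iframe; the cue files themselves can live in the GitHub backend described above.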

AIMunchkin 11 hours ago 0 replies      
So what would happen if someone created a youtube account called "NotUCBerkeley" and reuploaded all these videos with their playlists intact? Asking for a friend of course!
saycheese 1 day ago 0 replies      
Anyone looking to create a personal archive for any YouTube content should look into these scripts: https://rg3.github.io/youtube-dl/

They're very easy to use and well documented.

beefield 1 day ago 1 reply      
I would be curious to read the background discussion at HN you refer to.
WillyOnWheels 1 day ago 1 reply      
It's super easy to download a video off of YouTube with JDownloader 2.0. Works on Linux and macOS (probably Windows too; I have no experience with it).

You select the youtube url, copy it, download, done.


theli0nheart 1 day ago 3 replies      
Could someone make a torrent with all of the videos? I would but I've never done it before and don't have the time to learn. :(
eddieh 1 day ago 0 replies      
Physics C10 (aka L&S C70V)
ner0x652 20 hours ago 0 replies      
Berkeley CS 61A
Berkeley CS 61C
Berkeley CS 162
Berkeley CS 186
partycoder 1 day ago 0 replies      
The archive files can be accessed here:


Should contain the full backup.

neurobot 1 day ago 0 replies      
As far as I know this course has been archived on archive.org; you can find it there with "berkeley" as a keyword.
tmccrmck 1 day ago 1 reply      
Could you get CS170 with Papadimitriou, EE 16A/B, and multiple semesters of 61A with Harvey? Thanks!
AnimalMuppet 1 day ago 0 replies      
> I'm in the process of archiving some of the most important Computer Science courses, mainly for my own benefit, but I intend to make them publically available. (This is a throwaway account b/c I don't want to run afoul of any legal issues.)

Um, hate to break it to you, but you're not going to run into legal issues for saying that you're going to grab archives and make them publicly available. You're going to run into trouble for actually making them publicly available (if the license doesn't permit that). The only thing you change by using a throwaway account is that the legal trouble isn't associated with your main HN account.

vpribish 1 day ago 5 replies      
Anyone want to chime in with a way to simply grab them all?
Kinnard 1 day ago 1 reply      
Could you write a script that captures all of them?
jjawssd 1 day ago 0 replies      
3.1 TB torrent





"And lastly I finished downloading all of the UC Berkeley. Videos, any transcriptions/captions and all other video info. I made a torrent as they are the most efficient at sharing. All 3.1TB of it, it's not hosted on the fastest server, but with a few seeds it should go quick enough. If you want to keep this great learning resource alive, feel free to seed or partial seed, I will seed it for as long as I can. [4] For video listings please look at this list [5]."

legohorizons 1 day ago 0 replies      
daseiner 1 day ago 0 replies      
Anything with Hubert Dreyfus
reachtarunhere 1 day ago 0 replies      
Funny I downloaded exactly the same courses.
master_yoda_1 1 day ago 0 replies      
None. If you were interested in a subject, you would have taken it already. And if you become interested in the future, there will be some course available then. So don't waste your hard disk space.
Introducing Keras 2 keras.io
431 points by mirceam  1 day ago   67 comments top 13
minimaxir 1 day ago 3 replies      
Copying my rare product endorsement from the previous submission:

Keras is so good that it is effectively cheating in machine learning: entire Tensorflow tutorials can be replaced with a single line of code, which is important for iteration (Keras layers are effectively Lego blocks). A simple read of the Keras examples (https://github.com/fchollet/keras/tree/master/examples) and documentation (https://keras.io/getting-started/functional-api-guide/) will let you reverse-engineer most of the revolutionary Deep Learning clickbait thought pieces.

It's good to see that backward compatibility is a priority in 2.0, since it sounds like a lot has changed.

juxtaposicion 1 day ago 2 replies      
Will Keras2 support PyTorch as backend, in the future?

Answer[0]: No, there are no plans to support PyTorch. There is nothing to be gained in supporting every novelty framework that crops up every quarter. Our goal is to make deep learning accessible and useful to as many people as possible, and that goal is completely opposite to building up deep learning hipster cred.

[0]: https://github.com/fchollet/keras/issues/5299

caoxuwen 1 day ago 1 reply      
Highly recommend this course, http://course.fast.ai/, which utilizes Keras as the main programming tool.
epberry 1 day ago 0 replies      
Keras is fantastic. Not the tightest analogy, and probably unoriginal, but I think of it as the Python to Tensorflow's C. It's easy to drop into Tensorflow when needed, but you can probably get away with Keras for a long time. Also, Francois helped us when we DM'd him on Twitter, which was incredible.

Thank you so much Francois! I'm incredibly excited about this release!

krick 1 day ago 5 replies      
I'm only starting with all that machine-learning, NN stuff and, like many others, I want to ask for some guidance/resources/learning material. What I feel is especially lacking is something very broad and generic, some overview of existing techniques (but not as naïve as Ng's ML course, I assume). There exist a lot of estimators and classifiers, there exist a lot of techniques and tricks to train models, there exist a lot of details on how to design a NN architecture. So how, for instance, do I even decide that Random Forest is not enough for this task and that I want to build some specific kind of neural net? Or maybe I don't actually need any of these fancy famous techniques, but rather there exists some very well defined statistical method to do what I want?

What should I read to start grokking this kind of things? I feel quite ready to go full "DIY math PhD" mode and consume some heavy reading if necessary, but where do I even start?

uptownfunk 1 day ago 2 replies      
The mathematician in me has kept me from jumping into deep learning before I understand the mathematical and statistical underpinnings of the algorithms involved. Looking forward to reading through the latest book out by mit press and giving things a whirl with Keras which I've heard so much about.
gidim 1 day ago 1 reply      
I love Keras, but I think this update broke more things than you realized. For example, it's no longer possible to get the validation set score (val_acc) during training, which renders early stopping impossible. This was a documented feature in your FAQ.

Is the old documentation still available? I'd like to wait before I upgrade.
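For anyone hit by this, the logic that Keras's EarlyStopping callback implements is simple enough to sketch in plain Python; this is a toy re-implementation for illustration, not the Keras API itself:

```python
def early_stopping_epochs(val_scores, patience=2):
    """Return the epoch at which training would stop, given one monitored
    validation score per epoch (higher is better, like val_acc): stop once
    the score fails to improve for `patience` consecutive epochs."""
    best = float("-inf")
    wait = 0
    for epoch, score in enumerate(val_scores, start=1):
        if score > best:
            best = score
            wait = 0          # improvement: reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch  # patience exhausted: stop here
    return len(val_scores)    # never triggered: run to the end
```

This is exactly why losing access to val_acc during training matters: without the per-epoch monitored score there is nothing to feed a loop like this.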


backpropaganda 23 hours ago 2 replies      
1. Still no support for multiple losses. Models like VAEs cannot be idiomatically implemented. The second loss has to be 'hacked' in. Notice how in the official example for VAE, the kl_loss is computed using variables which are NOT available via the loss function (https://github.com/fchollet/keras/blob/master/examples/varia...)

2. It's still an input->output paradigm, rather than a {input, output}->loss paradigm which gives more flexibility.

These two issues are the main reason why I stick to slightly lower level APIs, even though I _want_ to use Keras.
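For context, the kl_loss being 'hacked' in is just the closed-form KL divergence between the encoder's Gaussian and a standard normal; as a standalone numpy sketch (variable names assumed to mirror the Keras example):

```python
import numpy as np

def gaussian_kl_loss(z_mean, z_log_var):
    """KL(N(mu, sigma^2) || N(0, 1)) summed over latent dimensions:
    -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2).
    In the VAE example, z_mean and z_log_var are encoder outputs that a
    (y_true, y_pred) loss function never sees, hence the hack."""
    return -0.5 * np.sum(1 + z_log_var - np.square(z_mean) - np.exp(z_log_var))
```

The point of the complaint is structural: this term depends on intermediate tensors, not on (y_true, y_pred), so the input->output loss signature cannot express it idiomatically.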

Kiro 19 hours ago 3 replies      
Is it better to learn Keras instead of tflearn?

Copying a comment I made in another thread where one response recommended Keras:

I currently have a small pet project where I think some simple ML would be cool but I don't know where to start.

Basically my use case is that I have a bunch of 64x64 images (16 colors) which I manually label as "good", "neutral" or "bad". I want to input this dataset and train the network to categorize new 64x64 images of the same type.

The closest I've found is this: https://gist.github.com/sono-bfio/89a91da65a12175fb1169240cd...

But it's still too hard to understand exactly how I can create my own dataset and how to set it up efficiently (the example is using 32x32 but I also want to factor in that it's only 16 colors; will that give it some performance advantages?).
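A minimal sketch of the dataset setup described above, using numpy only (the label names and the palette handling are assumptions; a real pipeline would load the actual image files first):

```python
import numpy as np

LABELS = ["good", "neutral", "bad"]  # assumed category names

def to_one_hot(label):
    """Map a label string to a one-hot vector, e.g. "neutral" -> [0, 1, 0]."""
    vec = np.zeros(len(LABELS))
    vec[LABELS.index(label)] = 1.0
    return vec

def prepare(images, labels):
    """images: list of 64x64 arrays holding palette indices 0..15.
    Normalizing the 16 palette indices into [0, 1] is all the color
    handling a first version needs; 16 colors mainly means the input
    is a single channel instead of RGB, which keeps the network small."""
    x = np.stack(images).astype("float32") / 15.0
    y = np.stack([to_one_hot(label) for label in labels])
    return x, y
```

With x and y in this shape, any of the small image-classification examples (32x32 or otherwise) can be pointed at your own data by swapping the input dimensions.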

chestervonwinch 14 hours ago 0 replies      
Keras is a great wrapper library built upon two fantastic frameworks -- theano and tensorflow. I'm glad to see it is moving forward, and kudos to everyone involved in all these libraries!
diminish 19 hours ago 1 reply      
Slightly irrelevant but curious question about the Analytics numbers for 7-day (34K), 14-day and 30-day active users. I'm running a similar site, so could it be that a lot of users reading documentation are using ad/tracking blockers, so that the real active user count is higher than it appears in GA? Documentation users tend to read quite a lot of pages per session. If I'm right, then they should see fewer page views per user than expected.
mooneater 1 day ago 1 reply      
Awesome. Yet, "codebases written in Keras 2 next month should still run many years from now": given how new deep learning is, how can they be that confident that this API will remain relevant years down the line?
rfeather 1 day ago 1 reply      
I wonder why they decided to get rid of MaxoutDense. Is there something better or is it so trivial to implement they decided to drop it?
Hiring without whiteboards github.com
389 points by sugarpirate  22 hours ago   349 comments top 17
kelvin0 11 hours ago 10 replies      
Imagine this. You are interviewing for a job. You walk into a room with two people, who hand you a sheet with a few problems. They ask you to write the solutions on the whiteboard while they wait for you to complete them.

Not a word is said. They are clicking at their laptops and staring at the whiteboard, as if waiting for the genie to pop out of a bottle. All the while your mind is frozen and stuck in a bad loop.

This lasts an hour; you are barely able to complete parts of the problems, frozen. Of course this affects your usually creative and sharp mind.

The torture ends, time's up! You shake their hands, like a kiss of death, and head out. As you walk back, all the answers to all the problems they wanted you to whiteboard come rushing into your mind like a torrent. Too bad, another 'botched' technical interview.

This is my experience as a battle-tested developer who has shipped many products and has been programming for the love of computers since the age of 12 (professionally for more than 15 years). I am not going to be working at Google any time soon (not that a Google job really matters to me).

jonasvp 20 hours ago 11 replies      
I'm responsible for hiring developers at our company based in Berlin, Germany, and found it best to have a guided interview about the candidate's work experience and interesting problems that she/he solved. I never understood the whiteboard hazing/CS trivia that are so widely discussed on HN since it seems extremely disconnected from the actual work that's being done.

That said, I'm always surprised how many candidates cannot even point to one problem they worked on they found interesting or one solution that they're proud of.

We worked with an HR consultant to develop an interview guide in the form of certain questions that we make sure to hit during the interview, in order to be able to compare between candidates and make an informed decision.

However, we're small and not in the US. Anyone have experience with other companies in Germany/Europe? How does the typical interview work over here?

ninjakeyboard 16 hours ago 15 replies      
What's with all the whiteboard backlash? I ask simple questions and expect people to be able to write code unaided to express their idea. Not "implement a linked list" or "write quicksort" but basic "You have two arrays - find if a number exists in both arrays" sort of thing, primarily to reason about runtime complexity, and to make sure they can actually write code.

If you tell a candidate to prepare, they should be able to come in and do this. If they can't write some simple code unaided, then I don't want to work with them. I was a copy-paste programmer once too.
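For reference, the two-arrays question above has a short canonical answer, which is exactly the kind of code-plus-complexity discussion the interviewer is after:

```python
def common_element_exists(a, b):
    """Return True if any number appears in both arrays.
    Building a set of `a` is O(len(a)); each membership check is O(1),
    so the whole thing is O(len(a) + len(b)) time and O(len(a)) space.
    The naive nested-loop version would be O(len(a) * len(b)) time."""
    seen = set(a)
    return any(x in seen for x in b)
```

A candidate who can write this unaided and state the trade-off against the nested-loop version has cleared the bar the comment describes.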

bogomipz 20 hours ago 6 replies      
I would like to see a compilation of companies that aren't requiring a "coding project" as part of their interview process. I much prefer to write on a whiteboard as part of an onsite interview than to do a "coding project" for every single company that "might" be interested in me.

My experience lately has been an immediate request to complete a coding project right after speaking to a recruiter but before speaking to anyone technical. And, perhaps the height of absurdity, I have even had automated emails saying "thank you for your interest, we invite you to complete this coding challenge/project."

Are you really expected to do one of these now for every single company you send your CV to?

The whiteboards seems far more respectful of people's time at least.

vinhboy 21 hours ago 9 replies      
I think coderpad.io is the modern day equivalent of a whiteboard.

It's nice that you get REPL and a keyboard, but the unnecessary pressure of being watched, critiqued, and timed are all there.

My mind just goes completely blank whenever I am put in this scenario.

Are we programming or defusing a bomb?

hasenj 13 hours ago 2 replies      
Using the presence of whiteboard questions as a metric for interview quality seems rather pointless to me.

Whiteboard questions can be very good or very bad or anywhere in between, just like any interview question in general.

I for one prefer whiteboard questions to generic open ended personal questions.

Good whiteboard questions can quickly assess whether someone knows the basics of programming or not.

Open ended questions only judge one's ability to tell stories.

wollstonecraft 9 hours ago 2 replies      
The main purpose of whiteboard interviews is to decrease the liquidity of the job search for engineers, thus depressing wages. After a few years of doing design docs and copying protobuf fields, the median engineer at a top company would need a month of prep to be able to pass whiteboard interviews again. Much more effective than secret no-poaching agreements.
wolfgke 5 hours ago 0 replies      
Why the universal hate for whiteboard interviews? I studied computer science at a German university, and from the first semester on it was expected that in the exercise groups you could solve the exercises you had prepared on a whiteboard. If you did not solve one completely, you still had to whiteboard-code the parts of the solution that you were able to and improvise the rest.

For more complicated algorithms I still love to draw lots of pictures before implementing them.

I can understand that people complain that they have to solve tasks (on a whiteboard) that won't have to do much with what they will do at the job. But this has nothing to do with whiteboard interviews, but with badly chosen tasks (say: data structure brainteasers) for the job interview.

nostrademons 20 hours ago 2 replies      
It's funny, whiteboard interviews really would be a better discriminator for founders than employees. They test confidence rather than competence (or technically, competence + confidence). The former is incredibly important for founders, but the latter is probably what you care about more for employees. And yet founders almost never do a whiteboard interview, yet it's standard for technical employees...
DigitalSea 20 hours ago 7 replies      
The problem with whiteboards is that they're not reflective of real life. When was the last time you can honestly say you used a whiteboard to solve a programming problem, instead of Googling and ending up at a StackOverflow question with a great answer? I've never used a whiteboard for programming. I've used plenty of whiteboards for things one level back: planning, prioritising and visualising task-related things.

The reality of programming is that these days everyone Googles their problems. Sometimes you end up at StackOverflow, and sometimes you find yourself on a random blog or in official documentation. Occasionally you ask a coworker for help, but the ego a lot of developers have about being smart means most feel too intimidated to do so. And sometimes you just bang your head against your desk until things start working.

Whiteboard and highly technical interviews that expect you to know high-level computer science concepts are tilted in the favour of graduates who are freshly minted from whatever respective educational institution they graduated from.

ibejoeb 20 hours ago 2 replies      
It really is terrible. Companies need to accept this. You're sitting behind me. Or beside me. I can smell your breath. Or you're on a webcam. It's voyeuristic. You've already solved this problem. Perhaps you even designed it. Your talking is breaking my concentration. This is not how I work. It's probably not how you work either, and if it is, I think we're done here.

Now, if a company insists on an adversarial process, there are better ways to do it. The nature of the problem is given in advance. Both the subject and the evaluator go in blind, and they both do it.

tamalsaha001 13 hours ago 1 reply      
I have a question. When you hire complete freshers out of school, how do you interview them without asking for CS stuff? In my experience, fresh undergrads usually do not have any real world experience.

I have experimented with short take-home projects. This drastically reduces the available pool, since great freshers can get jobs at other places with whiteboard interviews. Also, there is a sense that the interview process is being used to get people to do free work.

Also, non-whiteboard interview processes are extremely time-consuming. Non-whiteboard interviews would be better, but as a matter of practical experience, I have found whiteboarding to be the only viable option for hiring fresh undergrads.

Kiro 20 hours ago 4 replies      
Am I the only one who thinks the pressure a whiteboard adds is actually something valuable? In my last job there were a lot of situations where you had to solve a real problem under extreme pressure. People who couldn't handle it simply weren't cut for the job.

I don't think all interviews should use it but for us it was good for filtering out candidates who would not perform in those situations.

alaaibrahim 21 hours ago 0 replies      
Have you ever tried to bring a laptop to an interview, and asked to use it instead of the whiteboard? Believe me very few would object.
listentojohan 20 hours ago 0 replies      
It was not really a whiteboard exercise, but I once had an interview in which we went through some of my previous work (code-ish) on a whiteboard. It was mostly for structure, and I think that's a good example. There was also a good, relaxed atmosphere, so it was more a point of discussion. Got an offer.
therealmarv 20 hours ago 1 reply      
Whiteboard interviews are important. Reason: http://i.imgur.com/oM9yCym.png
koja86 18 hours ago 0 replies      
OMG it's happening!

I was always puzzled by how unrelated the metric used for hiring is to the actual work at many companies. I don't mind algorithmic brainteasers; on the contrary, I would absolutely love to solve such problems at work, yet those seem to be scarce. I also don't mind other types of developer work, be it more research- or architecture-oriented, or "just" coding.

But I am not happy learning something different for interviews every few years and then forgetting it over the course of next period of day to day work. Feels like a waste of time and is confusing in regards to what one would actually do in certain job.

We've lost control of our personal data, including 33M NetProspex records troyhunt.com
377 points by user7878  23 hours ago   129 comments top 13
sathackr 15 hours ago 9 replies      
I've stopped giving my information to entities that don't need it.

I use fake account information where it is legally permissible and the system is requiring some input to proceed.

When I get asked for my phone number, zip code, email address, etc. at checkout in stores, I give a polite "no thank you", which usually results in a huff and/or an eye roll from the cashier, as if I'm expected to give this info for the privilege of shopping there.

If the information sources dry up or are of sufficiently low quality, the market value is significantly reduced, as would be the incentive to collect and store such information.

jacquesm 21 hours ago 4 replies      
I am aware of one EU entity that sits on a mountain of such data (and much worse, in fact) with very little in terms of security.

I'm simply waiting for the day when there will be a hack like this on the European continent, it's scary what sits in lightly protected databases, especially if you consider the probable sources of data like this and that - at least in Europe - it would be illegal to create such a DB without the consent of those whose information is stored in them.

We've lost control is the perfect way to describe things.

inthewoods 16 hours ago 1 reply      
Recognizing that my comment here is unrelated to the central question of whether companies should have this data: the irony for NetProspex (and the hundreds of other companies in the same business) is that this kind of data availability results in a lot of cold emails and phone calls that largely go ignored. Thus, as the data has become more widespread, it has become less valuable. Buying a list of people to contact, in my experience, is like throwing money in the trash.
marktangotango 14 hours ago 2 replies      
Awesome advertisement for netprospex, a lot of people would pay a lot of money for that data. I'm sure their phones are ringing off the hook this week. I'm sure this is unintended, but that's the reality as I see it.
bambax 16 hours ago 2 replies      
> CSV file containing JSON data
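For what it's worth, a "CSV file containing JSON data" is straightforward to handle in Python: the csv module reads the rows and json.loads decodes the embedded payloads (the column name here is an assumption for illustration):

```python
import csv
import io
import json

def parse_rows(text, json_column="data"):
    """Parse a CSV where one column holds a JSON document per row,
    decoding that column in place. CSV quoting rules (doubled quotes)
    keep the embedded JSON intact through the csv parser."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        row[json_column] = json.loads(row[json_column])
        rows.append(row)
    return rows
```

Awkward as the format is, it at least round-trips cleanly as long as the JSON column is properly quoted.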


Corrado 21 hours ago 0 replies      
I got a warning this morning from HaveIBeenPwned, and this article was in the breach email. Considering the current US political environment and the recent article by Tim Berners-Lee, this spam list is extra scary.
joshpadnick 13 hours ago 2 replies      
I believe our current paradigm for how data is stored is fundamentally broken. The author is right that when you choose to use a service you have no real control over how they use your data. Frankly, companies aren't even accountable to uphold their own privacy policies since no one actively monitors them (except perhaps in the context of HIPAA, PCI, etc.)

What I'd love to see is a marketplace of "personal data banks" that would work like this:

- The bank maintains an instance of every major database vendor's database, each isolated to a single consumer.

- The bank exposes API endpoints of every major database to companies like Facebook or new SaaS startups. Those companies now agree -- when requested -- to write your data not to their private database but to the bank's database that is private to you.

- You, the consumer, pay the bank a modest monthly fee to control who can access that data, and even optionally cut off access to the original "generator" of data.

I guess this still suffers from the need to trust that Facebook is abiding by your request that all data be written to the bank, and network latency becomes a real issue. So maybe it's not the right business model, but it's an important problem to solve.

brazzledazzle 12 hours ago 0 replies      
I think I've finally found the source of my constant influx of spam. It won't fix the downstream sources like resellers, but at least I can ask NetProspex to remove me. But since they aren't the ones directly spamming me, do they have to comply?
dwightgunning 18 hours ago 1 reply      
From a writing point of view, I found it an interesting choice to simply use "author" instead of "Tim Berners-Lee" when attributing the quote in the opening paragraph. Surely naming TBL would give more immediate credibility and encourage the reader to continue.
devy 10 hours ago 0 replies      
Pretty sure recruiters mining this kind of data as well.
jgalt212 17 hours ago 1 reply      
I fail to see the difference between PII available for expensive purchase (NetProspex) and PII available for cheap purchase (River City, and whoever else is using the stolen data for gain).
EGreg 18 hours ago 0 replies      
I wrote on this same topic a few years ago, might interest some:


shamaku 13 hours ago 1 reply      
At some point we will have to trust our fellow men. The sooner the better for all.
We didn't lose control of the Web, it was stolen ar.al
408 points by imartin2k  3 days ago   171 comments top 26
subjectsigma 3 days ago 2 replies      
I agree with the sentiment of this article, but only to a degree.

I recommend anyone interested in this to read Open Standards and the Digital Age, by Andrew Russell. The book is a partial refutation of the idea of an 'open web' using historical examples, the most shocking being the failure of open and democratic methods to build an open Internet standard versus the success of Cerf and Kahn inventing TCP/IP in a closed and corporate environment, funded by the military. The reality is that some systems critical to the operation of the Internet, such as DNS, are highly centralized, un-open, un-private and un-free, at least when compared with cyberlibertarian expectations of how the Internet should be. It also addresses other perversions of 'open', such as the irony that some of FLOSS's biggest customers are megacorps like Apple and Microsoft, who up until recently contributed way less back than what they took.

I don't think the web was 'stolen' from 'us', I think people just don't realize how controlling it was before. They're mistaking an epiphany for an actual loss of freedom that may or may not have been there in the first place. We need to fight for a free and open Internet, but let's not kid ourselves with inaccurate and misleading language.

notlisted 2 days ago 3 replies      
FB did steal the web and your social network. Remember how FB grew? If not, let me remind you: "See which of your friends are also on FB! Just give us your Google and Yahoo contacts." Yes, you gave FB permission to check your contacts, but FB omitted to tell us that it would be HOLDING ON to your contacts forever and ever. FB then sent those contacts, without your permission and not at your request, an email saying YOU were on FB, or an email saying that YOU had asked them to join. Initially there was a little quid pro quo: you could also export contacts from FB to Google and/or Yahoo. This was closed down soon after they gobbled up the contact lists of the world[1][2].

Of course, LinkedIn later did a similar thing, and grew in a similar fashion, but at least there you could, and can, REMOVE the contacts you've handed over without full disclosure.

I willingly GAVE my contacts to Google/Yahoo, and they provided a service for me, GMAIL/YMAIL. Facebook STOLE what I gave to google/yahoo, and they used dark patterns.

The cat is out of the bag, but let's not forget our history.

PS Articles have even claimed that shadow-profiles were created for those who had not signed up yet, which could be matched with actual sign-ups at a later date.

[1] https://techcrunch.com/2010/11/04/facebook-google-contacts/

[2] http://searchengineland.com/facebook-you-have-no-right-to-ex...

jstewartmobile 2 days ago 1 reply      
The worst part is that if you're employed, it's probably a job requirement to use at least one of these surveillance capitalism platforms.

Most people I know in tech have to be on Slack. Most people I know in advertising and PR have to be on Facebook, Instagram, and Twitter. For some jobs, like journalism, even the "troops on the ground" are required to post and tweet. Even for jobs without the social media taint, a lot of companies use corporate gmail, so now Google is way up in your business.

In 2007, you could probably just chalk it all up to poor personal choices. In 2017, I don't know if that's entirely true. We're in a situation that cries out for regulation, although that will probably not happen in the US until after a calamity, since regulation is seen as one of the heads of the beast in our money religion.

edit: Slightly off-topic, but both mainstream mobile platforms have chilling surveillance and control aspects that make most of the web seem benign in comparison.

jacquesm 3 days ago 2 replies      
It all went wrong at the firewall. As soon as peer-to-peer was over and NAT became a security layer as well as a technological construct to put more than one host behind one public IP it was essentially game-over.

If you want to reboot the web then you need to reboot the internet first, solve the insecurity of privately hosted servers first and convince ISPs that symmetric connectivity should be the rule.

After that you have a fighting chance.

bambax 2 days ago 12 replies      
While I agree with the gist of the article, I have a problem conflating Google and Facebook in the same evil bag.

Facebook is the evilest of evil, not only for the reasons stated, but also because information entered into Facebook never comes out. You have to have an account on FB to see anything on the platform. That is contrary to everything the Web ever stood for.

Since I don't have a Facebook account, most of my friends don't either, and I will prevent my kids from having one for as long as humanly possible, I'm not sure what kind of benefit it provides. (I'm aware that 25% of all of humanity is on Facebook, so they must find something in it. I just don't know what it is, and would rather not find out.)

But, to me, Google is quite different. Not only is Google Search unparalleled, it's also quite open. Yes, Google constantly tries to have you "login" and turn on search history, etc., but one can still use Google Search completely anonymously. That's not a detail; that is a very important feature of Google Search.

* * *

Now about fighting back, what about starving the beast with adblock? Everyone concerned by any of this should not only use adblock but heavily promote it to everyone. That's probably not the complete solution but it's obviously part of it.

vinceguidry 3 days ago 3 replies      
This is just unnecessarily divisive. Look, this is the way commerce works. Checks on corporate behavior come in two realms, legal and ethical. Companies will violate ethics if everybody else is doing it. Companies violate the law if they are the proverbial bad eggs.

If unethical practices become normal, the thing to do is to get a law passed. It's the way this has always worked. Laws change the entire landscape of commerce. They shake things up enough to where a new status quo is found. Law isn't perfect but it can shift the ethical regime more in the direction of the people.

The author's recommendation of a world without kings is a fantasy. If you eliminate hierarchy that means everybody must become an institution. Being an institution is not fun. It's fun to fantasize about building your own house but only the really motivated actually do it. Kings do us a favor by creating structure where there once was none. Silicon Valley is ultimately a force for good.

vtange 3 days ago 2 replies      
There seems to be a lot of antagonism directed at Google and Facebook in this piece, making it sound as if those companies knowingly stole or forced people to fork over data.

I know sometimes it's easy to play the evil mega-corp card, but we need to ask ourselves the question: what is the goal here, to take down Google and Facebook? 'Cause if you're worried about an internet with extra surveillance and restrictions, taking down Google and Facebook doesn't really solve things.

Plus, even in a world with Google and Facebook out of the picture, there will still be political trolls hired by other companies and nation-states. There are also alternate-Googles that can just swoop in and fill the void you create if say you do take down Google. They are not necessarily better than Google today.

zitterbewegung 3 days ago 3 replies      
The web wasn't stolen; we gave it to them. You didn't have to sign up for Facebook. You don't have to sign up for Google's services. Look at how people like RMS use the web.
empath75 3 days ago 9 replies      
I've been thinking of closing my Facebook account, but I'd lose contact with so many friends and family. They don't use email or any kind of instant messenger any more, and nobody makes phone calls any more. If you're not on Facebook you might as well not exist.
wyldfire 3 days ago 0 replies      
Just like the Newton and the Palm Pilot -- the solutions to this problem came too soon. Diaspora, GNU social, Ello (?) probably others. Perhaps in ten or twenty years a breaking point will come along.

That said, I think there's a real market for closed content like FB's. And even though I find a decentralized system more appealing, I can imagine a new, closed/centralized system taking FB's place in the future.

ams6110 2 days ago 1 reply      
The original, open, non-privacy-invading web is still there. There are many sites sharing and hyperlinking academic research and relevant practitioner information that do not have advertising and do not track you.

We are the ones who voted with our clicks to use the likes of LinkedIn, Facebook, and Google.

H4CK3RM4N 3 days ago 0 replies      
I really like the comment at the bottom which mentions the structural flaws in attempting to democratise the web while client-server architecture is still king.
simplehuman 3 days ago 1 reply      
Have any visionaries written about what an alternative web would look like? I.e., we still need the services of Google and Facebook, but done in a privacy-conscious way.
dorfsmay 3 days ago 1 reply      
Isn't the problem that you need money/time to make good, easy-to-use platforms?

How many people here still use Usenet vs reddit/HN?

Keverw 2 days ago 0 replies      
I don't think we really need more regulation in this area. There are already a bunch of privacy-related laws on the books.

Plus, using data to improve the user experience, recommendations (YouTube, Netflix), and targeted ads seems pretty neat (I prefer ads about tech products and my interests, compared to makeup or pads). The products get better and improve. Facebook doesn't sell your information; it lets advertisers use it to target you. It's probably more profitable to let companies use the data than to sell it.

Companies like LexisNexis and Acxiom are the ones I'd be really worried about. Some states' DMVs even sell their databases. I'd be more worried about them; Google and Facebook are way better corporate citizens than these mega data-broker companies. At least Google and Facebook you can opt out of. LexisNexis, good luck opting out of. Last time I checked, only law enforcement officers who fear they are in danger can opt out.

Regulations are what kill innovation. The fact that the government has mainly left the internet alone is probably why it's one of the most innovative industries. Imagine having to read a 300-page, two-column document and wait on a lengthy process before you are even allowed to put up a blog.

Then all of this talk lately about "fake news" just seems like censorship. I am worried that some day the internet will be so over-regulated and censored that it will be just like cable television.

barnacs 3 days ago 0 replies      
It wasn't stolen, we're giving it away. And not just the web, but our freedom.

If you're using services that support surveillance capitalism or you are working on such products, please stop. Thank you!

d--b 2 days ago 1 reply      
The argument is an angry version of "if it's free, you're the product," which at this stage is very uncontroversial, and only comments on the "we've lost control of our data" point made in the original post. The link between what the author calls "people farming" and "surveillance" is not compelling.

That said, I personally agree with the author in identifying the main problem of the web as people tracking. In my opinion, Tim Berners-Lee's points about misinformation and political advertising are not specific to the medium, but rather to the times we live in. People are pissed, people are scared, they need something to blame, they need some fantasy to believe in, they make up scary news, they vote for the guy that gives them a dream.

What's specific to the web, though (and is starting to spread beyond the web), is data tracking. Whether for advertising or for surveillance purposes, data tracking creates a power imbalance between people and systems that is unbearable.

That power imbalance is the weirdness you felt the first time you saw a gmail ad related to the email you were reading. It's the anger that heats up your cheeks when the sales guy asks for your email address when you just want to buy shoes. It's the 2-hour phone call to the customer service that ends in "I'm sorry there is nothing I can do for you". It's the "late fee" mails you automatically receive for a service that you cancelled. It's realizing that the app your employer installed on your phone can tell them your location at all times. It's the swatting that reminds you not to shop for pressure cookers online. It's the cameras. It's the cars. It's the lightbulbs.

We as people are weak. I don't think Silicon Valley intended it that way. I think they genuinely wanted to improve the world. And in order to keep it cheap, they found money where they could, and in the process, they undermined people's privacy in a way that is making the world a lot worse than it was.

I personally feel hopeful. Countries are made of people. And I think that people are starting to get it. We need rules to prevent this. Laws that force companies to automatically give you the option not to track you. The same laws that forced mailing list senders to have the unsubscribe button (thank god for the unsubscribe button!). For this to happen, we need lobbying, we need awareness. We need a "this website is not tracking you" label. We need privacy checks.

z3t4 2 days ago 0 replies      
At least Google cares for the web because of the revenue it gets from the Google search engine. Google wants everything to be a web app, while Facebook wants everything to be a Facebook app, and Microsoft wants everything to be a .NET app.
epigramx 2 days ago 0 replies      
When TV was the main mass-communication system, you needed a single antenna or a single satellite to broadcast to unlimited people in the covered area. While that was one-way and the internet is two-way, the internet introduced the requirement for the broadcaster to have the infrastructure to support millions of users individually. That means the main technological reason Google and Facebook took over is that you need money to take over the internet; you can't just do it for free. Those like Wikipedia that did it in a less profitable way had to be very proficient at collecting donations.
panglott 2 days ago 1 reply      
The older Web wasn't much harder to use, but it was overrun with spam and content farms and the threats of anonymous communication were far more visible.
xor1 2 days ago 0 replies      
Richard Stallman and Hideo Kojima were right.
sushobhan 2 days ago 0 replies      
The web is becoming much like the economy: there are big players who control it, and others who are shouting for equality. There is a third group, the largest of the three (almost 90% of the general population), who don't bother/understand and go with the wind. With big data coming onto the scene, things get out of hand as these "big guns" are on a spree to collect more and more of our personal details. IoT is another thing that will surely make the situation much worse: since these devices are meant to be our personal assistants, we are allowing them to learn our behavior and act accordingly. When we see an advertisement related to a search from three days earlier, it's not the ad that is being promoted; it is we who are being sold.
HugoDaniel 2 days ago 0 replies      
What about porn ? He forgot to mention the "free" porn industry.
tannhaeuser 2 days ago 0 replies      
Could it be that US antitrust regulation policies and authorities, or lack thereof, are part of the problem?

Edit: not saying it is, but what about the WA deal?

unlmtd 2 days ago 0 replies      
Google doesn't exist. There are only people who work under an umbrella termed "Google". Those people do not agree with each other on everything. So, yes, we can work with some of them. After all, wasn't Snowden 'the NSA'!?
German traffic light stays red for 28 years (2015) thelocal.de
368 points by curtis  1 day ago   285 comments top 38
rurban 19 hours ago 10 replies      
I know this light personally. I live there. It is no problem for the casual driver, as Dresden is one of those rare German towns with US-style "Rechtsabbieger" rules: you may turn right even on a red light. And German law clearly states that if one entry to a crossing has lights, all the others have to get lights too. This is the only case in Germany where this law makes no sense.

But the best is the last sentence: they also change all the green and yellow light bulbs in their regular maintenance schedule. The justification goes like this: bulbs mostly wear due to hours of usage, yes. Green and yellow are not used, so they should not wear, and thus should not need replacing. But hours cannot be the only cause of failures, and while we are there and there's a budget for replacement, we'll do it all together. Not really a technical justification, just a practical one.

We do the same in software all the time. There are two principles: 1. Make minimal changes to cause only minimal effect. 2. Harmonize your changes to avoid unnecessary asymmetries. They went with 2. You can fight the system, or you can live with it. In this case there is no reason to rock the boat.

[Update:] The lawful alternative would have been to close this crossing from the Ziegelstraße into the Güntzstraße. People would have protested over this silliness. And Dresden has enough budget surplus to do such things. In fact Dresden was very lucky in its real-estate deals, where they sold all of their old residential apartment complexes to a US company, Fortress, for billions: http://www.spiegel.de/wirtschaft/milliardendeal-dresden-verk... Normally such deals turn out bad and unprofitable, but not in this case. Both parties are happy.

curun1r 21 hours ago 2 replies      
I feel like there's an analogy to software in this story. Someone put in a hack that, while ugly, works. I'm sure many city planners over the years have looked at it and thought it should be fixed. But when the current solution is working, taking the time to do the necessary testing to ensure that a new, cleaner solution works would be painful. So, 30 years later, the hack is still in place.

I know I've come across the software equivalent of this many times in my life as a developer. "Hmm...that's a really odd way of doing that, I should refactor this. But I need to write a bunch of tests for this since the original developer didn't write any. And am I really thinking through all the possible use cases properly? Even if I am, this is now a 3-day estimate, which means it will probably take 5 days, and all to "fix" something that's technically working. That's a lot of risk/time commitment for very little payout. Maybe I should just leave it be and move on to a project that's better setup to succeed."

Yep...makes total sense that this light still exists. Still doesn't explain why they replace the green and amber bulbs though...

ams6110 23 hours ago 3 replies      
This is a light in Ft. Walton Beach Florida. I took a picture of it years ago because I thought it was funny. It's on 35mm film in a box somewhere.... (the linked image is not mine).


nevex 18 hours ago 0 replies      
This article is from 2015, and no one seems to have mentioned yet that the light was removed in 2016: http://www.faz.net/aktuell/gesellschaft/ewiges-rot-an-dresdn...
sharpercoder 19 hours ago 1 reply      
When I was about 15 years old, a friend tried to convince me that a traffic light in front of his house was always green. I never believed him. Fast forward 15 years: I moved to an apartment, cycling past my friend's house to go to work. After a few weeks, I started to notice a traffic light that was always green! It was the same light my friend told me about. I remembered.

It is here: https://goo.gl/maps/2cgcx1b92LL2. Streetview direct link: https://www.google.com/maps/place/52%C2%B000'00.9%22N+5%C2%B...

peterkelly 21 hours ago 4 replies      
I guess it's like the "no smoking" lights in airplanes. Even though they're always on (it's not like there's times during the flight when you can smoke and other times when you can't), they serve as an important reminder, and therefore have a purpose.
scarhill 14 hours ago 1 reply      
This traffic light in Harrison, NY has been green for decades: https://www.google.com/maps/@41.0142226,-73.7217408,3a,75y,1... I have been driving through there since the late 70's and I'm pretty sure that the intersection configuration was in place earlier than that. So it's likely that this light has been green for over fifty years.
thought_alarm 21 hours ago 7 replies      
I get why that road is signalled. It intersects right in the middle of another signalled intersection.

But in my neck of the woods we'd probably show a green right arrow when intersecting traffic from the left is stopped at a red light.

kyle-rb 23 hours ago 2 replies      
This sounds like it could be straight out of The Hitchhiker's Guide to the Galaxy.
peterburkimsher 22 hours ago 1 reply      
Is it in the red light district?

(When I was a child, I actually thought that the meaning of that phrase was related to traffic lights. I was so naïve.)

ryanjmo 22 hours ago 4 replies      
I came across a light like this once and noted it. After some thought, though, I realized that it actually makes perfect sense. The goal of the light is to never let people go straight; people can only turn on a green arrow. It is the only logical output for this goal as far as I can tell. If it were just a red arrow and a green arrow, without a full red, people might think they can go straight, because there is no light explicitly preventing it. It's pretty funny.
hermitdev 22 hours ago 3 replies      
Ok, why not just put up signs saying "do not enter" and "right turn only"? And, then, just have a signal for the assumed protected right-turn (Germans do drive on the right, correct?)

This "solution" seems very confusing. No US/Canadian jurisdiction I've visited has anything like this German solution. Just indicate with lights, signage and road furniture that the particular route is not allowed. Technically keeping the route open but prohibiting it by lights seems prone to confusion, especially for individuals not from the area. Also, how do mapping apps deal with this intersection (curious)? Do mapping apps properly route through such an intersection?

(edit: clarity)

chiph 15 hours ago 0 replies      
I had a German colleague come to visit me in the US one time. When we went out to lunch, and I made a "right on red after stop", he just about put his foot through the floorboard trying to push his invisible brake pedal. German driver training is very thorough. :)
lsaferite 15 hours ago 1 reply      
I'd just like to understand how on earth it costs €458 per month to maintain this single light?

Does anyone have some insight into that data point?

Edit: I know the €150k over 28 years number is a bit rounded, but even using that it's €446/month or €14.60/day to run the light?!?
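
The arithmetic behind these figures can be checked in a couple of lines of Ruby (a sketch; the €150,000 total and the 28-year span are the article's rounded numbers):

```ruby
# Back-of-envelope check of the traffic light's running cost.
total_cost_eur = 150_000.0  # article's rounded 28-year total
years          = 28

per_month = total_cost_eur / (years * 12)
per_day   = total_cost_eur / (years * 365)

puts format("per month: EUR %.2f", per_month)  # ≈ 446.43
puts format("per day:   EUR %.2f", per_day)    # ≈ 14.68
```

This lands in the same ballpark as the €458/month figure quoted above; the gap is just rounding in the €150k total.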

stickfigure 23 hours ago 3 replies      
Does it really cost ~€500 per month to run a single traffic light?
mannigfaltig 21 hours ago 1 reply      
I'm not quite understanding the reasoning behind this. It seems a stop sign plus a "right turn only" sign would imply the same rule.
JumpCrisscross 18 hours ago 0 replies      
Let's assume a North-South/East-West intersection. Eastbound traffic (on the Western road) sees this light.

If the Eastbound traffic can only go right, I presume the Westbound traffic can only go left. (Otherwise the Eastbound traffic would be permitted to go straight and/or left.) If that is true, then the Southbound traffic can only go straight; no turns. (It follows that there is no Northbound traffic.)

In essence, three roads converge to one. If that's so, why is this an intersection? Why not merge the roads or build a wall making incorrect crossings or turns obvious?

quasimodem 22 hours ago 2 replies      
How did it stay red during bulb replacement? Article is totally misleading!
mc32 23 hours ago 2 replies      
Do they use steady red lights for all "right turn onlys" which on the other end have opposing traffic (do not enter)? Or is this an exception to the right turn only + do not enter at the other end? Harrison and 10th[1] in SF for example.


blahedo 22 hours ago 1 reply      
I wonder if this article (from 2015) has now become obsolete? I notice that the OSM of that location[0] appears to indicate that Ziegelstraße now dead-ends just before the relevant intersection, and the Bing satellite maps, though difficult to interpret with all the tree cover, appear to match that.

[0] http://www.openstreetmap.org/#map=18/51.05376/13.75783

dustedrob 23 hours ago 0 replies      
"It's not a bug, it's a feature"
femto 23 hours ago 4 replies      
Why does the signal include an amber and green light?
ben_bai 19 hours ago 1 reply      
There is a denial-of-service attack possible.

Red light with green arrow means you may turn right if you feel like it or wait for green. Just park your car in front of the traffic light... it'll be fine.

losteverything 18 hours ago 1 reply      
Right turn on red is voluntary. I'm not one to have much road rage, but I enjoy waiting for the green if a jerk is behind me.

Also, when you're being paid by the hour to drive, red lights are your friend.

tempodox 22 hours ago 7 replies      
Would that light actually stop anyone? In France and Italy, traffic lights are just recommendations. We once got honked at for waiting at a red light when there was no cross traffic.
data_hope 18 hours ago 1 reply      
Funnily enough, there is precedent that a red light can be ignored if it doesn't change for 5 minutes. And I really doubt that there isn't a way to regulate the crossing without a permanent red light.

http://www.stvo.de/info/faq/165-ampel-bleibt-rot (German)

jhh 19 hours ago 0 replies      
I am German (and drive a car) and as per my understanding this traffic light could simply be removed with the only change necessary being a stop sign instead of the Yield sign.

The explanation given by the city (or its translation) doesn't make any sense to me.

EDIT: After re-reading the explanation I guess the idea is that since the crossing street shows a green light, the "opposing" street must have a red one.

TheCoreh 19 hours ago 1 reply      
Perhaps they could put a physical barrier in front of that, so that it's no longer possible to go forward, and then it's no longer part of the intersection?
husam212 14 hours ago 1 reply      
I'm curious, do self-driving cars handle a case like this without intervention from the driver?
mikejmoffitt 11 hours ago 0 replies      
"And their explanation might show another habitat that dies hard"

Another... habitat?

golergka 22 hours ago 1 reply      
When you have a steady, well-working system and only one special case that causes that system to waste resources, you don't put in special if-cases and switches to optimize for that case. Especially if it's one in a million.
am185 22 hours ago 0 replies      
Should put red plastic or colored paper there. Or just paint it red.
PhasmaFelis 14 hours ago 0 replies      
> While this may seem nonsensical and a waste of money to your average motorist, the Dresden authorities can explain in exact detail why the light never changes colour. And their explanation might show another habitat that dies hard - a love of convoluted, self-defeating regulation.

That seems unnecessarily critical. Clear and consistent communication is pretty important when you're talking about traffic, and the cost is surely a drop in the bucket compared to the city's overall budget.

gambiting 19 hours ago 1 reply      
Surely, a stop sign:https://www.stickergenius.com/wp-content/uploads/2013/02/sto...

Plus a "turn right only":http://l7.alamy.com/zooms/95a9489a7d8b4c418cb6cfd3b6753295/r...

Would achieve the same effect? People would have to stop at the intersection, and then they can only turn right. Same as red light+green right arrow.

edit: sorry, yes, I now understand - it has to be a traffic light because all other sides have traffic lights.

fiatjaf 23 hours ago 0 replies      
Looks like some traffic lights from my town.
soheil 22 hours ago 0 replies      
It's not red when they're replacing the bulbs for maintenance or when there is a power outage. So this story is false. Also this is a pretty random story to be topping HN. Sure there is an edge case where this could happen, so what.
simplehuman 23 hours ago 2 replies      
32 points and not a single comment. OK, I will bite.

> But anyone who waits to cross into Gerokstrae could be waiting a long time - almost three decades if they are patient enough.

Did anyone notice the awesome pun? If you wait that long, you will become a patient :-)

How I Made $70k Self-Publishing a Book about Ruby on Rails nateberkopec.com
413 points by nateberkopec  13 hours ago   123 comments top 22
austenallred 11 hours ago 13 replies      
I think people severely, severely underestimate the amount you can make self-publishing.

I blame it on traditional publishing companies and old habits. My mom was a book editor, and the very best authors they published would sell 10,000 books. Their cut was something like $2/book, they'd pull in $20k.

So when I decided to write a book on growth hacking (https://www.secretsaucenow.com), I decided to self-publish. This was more difficult, I had to find an editor and pay them, had to market it myself, etc. Luckily I'm pretty good at the marketing part (hence the book) and I found old contacts of my mom's to edit, but we launched on Kickstarter, so I figured if it flopped I would know in advance.

We sold $66,000 on Kickstarter, then went on to sell another $44,000 on Indiegogo while we finished it up.

Since then we've sold around $20,000 on gumroad. In total it's something like 2,000 copies (though we charge a lot more per copy than most book publishers do, and that keeps us out of Amazon, etc.)

If I had a publishing deal I would make about $5,000 from that. But because I self-published I took home $130,000 minus costs and fees, so we'll conservatively say $100,000.

Now I sit back and check gumroad a couple times a day; I average $800 in sales per day, and almost make more from the book than I do from my (well-paying) full-time job. If I can keep this rate up I will replace my salary (or, rather, double it).
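
Summing the per-channel figures in this comment (a sketch; all numbers are the commenter's rounded totals):

```ruby
# Rounded gross sales per channel, as stated in the comment.
sales = {
  kickstarter: 66_000,
  indiegogo:   44_000,
  gumroad:     20_000,
}

gross = sales.values.sum
puts "gross: $#{gross}"  # before costs and fees

# "around 2,000 copies" implies an average price far above
# typical book pricing, as the commenter notes.
puts format("avg per copy: $%.0f", gross / 2_000.0)
```

That matches the $130,000 gross and the roughly $100,000 take-home after costs mentioned above.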

jasode 10 hours ago 3 replies      
>, I launched a _course_ called The Complete Guide to Rails Performance. Since then, I have sold just over 500 copies, for gross revenue of $70,714.20 ($1350/week).

(The emphasis on "course" was mine.)

It looks like much of the revenue was the full course which means videos and Q&A webinars.

I'm not going to say it's clickbait but it's slightly misleading to put "Book" in the article's title since most tech authors including Douglas Crockford, John Resig, etc will not be able to pull $70k in 12 months whether they self-publish or let a traditional publisher like O'Reilly market the book.

That said, congratulations on producing content that people want to pay for. Your numbers seem to be better than the author royalties from Pluralsight/Lynda for a single course.

bphogan 8 hours ago 0 replies      
This article is full of great advice on how to write a book, whether you publish for yourself or for someone else.

I have written 10 books, 9 of them for the Pragmatic Bookshelf. I've made a nice side income on my books by doing many of the same things he lists. This isn't so much about self-publishing as it is about writing great technical content.

Full disclosure - in addition to writing, I am a development editor for the Pragmatic Bookshelf. But I publish my own books with them because they offer the best of both worlds. They split the profits with you 50/50, but you get an editor to work with throughout the process, and they take care of a lot of other things, like distribution, sales, billing, copy-edit, typesetting, etc so you don't have to.

But the important thing is to get your message out there to people in a quality way, like this article suggests. Do the research, find your voice, write about something people care about. If you go the self-published route, get technical reviewers, hire a development editor, and get a copy editor. Your end results will go far.

[edited to clarify value add for publisher]

ryandrake 9 hours ago 1 reply      
> My favorite example of this (from the Ruby community, of course) is Why's Poignant Guide to Ruby. If you're a Rubyist, you already know what this is, but if you're not, it's a legendary tome in the Ruby community.

Not sure that's a great example, in my view. I remember when I was learning Ruby, and everyone online was "Read that _why guide it's great!!" It was a mess. I couldn't learn anything from it at all. There was probably good content in there, but the interspersed poorly drawn comics and nonsense asides/tangents made it impossible to follow. Maybe it was great for some people but it did not fit my learning style at all. I expect a vanilla introduction, a clear progression through control structures, data types, keywords, variables, constants, operators, etc. and a concise review at the end of each chapter, structured reference at the end, etc. In other words, a plain vanilla "Pragmatic Programmer" or "O'Reilly" style guide.

> Having a unique voice is one of the most important things you can do to make your content stand out in a sea of "blah". It's what will make readers remember you and keep coming back.

Agreed that it's memorable though...

ilamont 10 hours ago 2 replies      
Indie publisher here. Nate did a lot of research to identify the hole that needed to be filled, and then executed on developing his voice, setting up product tiers, spreading the word, etc. I also liked his "posts as prototypes" approach; it's a great way to develop the ideas, hone creativity, and get feedback prior to releasing a book. Congratulations!

However, I would like to offer a word of caution for anyone considering self-publishing. For every selfpub sales success there are hundreds of books that fail to make an impact. U.S. ISBN registrations for self-published books have basically doubled every few years (1) and probably crossed the 1 million mark last year. However, the number of Americans buying books has slowly declined over the past several years, and is probably around 45% of the population (down from over 55% a decade ago). (2) The growth in demand is simply not there to support the explosion of supply, at least in the United States.

Niche technical books that are well-written and effectively marketed have greater potential than a poorly edited shape-shifting romance title from an unknown author. On the other hand, technical titles also have to compete with free or low-cost information on the Internet, YouTube, etc., not to mention existing books that dominate the field (something that Nate referenced in his post).

Of course, there are other rewards associated with writing books, including personal satisfaction, expanding one's personal brand, and improving writing and editing skills. But sales are never guaranteed.

1. http://leanmedia.org/number-book-readers-declines-even-self-... (Data from Bowker, http://media.bowker.com/documents/bowker-selfpublishing-repo...)

2. http://www.nielsen.com/us/en/insights/reports/2016/2015-us-b...

EternalData 3 hours ago 1 reply      
As a long-term play, I think as much as people think about investing capital, they should think about investing their time the same way. You put some blocks of time into a particular project that scales well, and it'll eventually provide you with enough passive income to escape being a service layer for hire. In that sense, a job would be a portfolio of investments you can make time-wise in building yourself and passive income assets.
dboreham 2 hours ago 0 replies      
A good read. One of the more interesting, informative and well written pieces to appear here in a while.

The one concern I have with this playbook is that fashions in niche technical subjects change (I think) very quickly. For this to beat simply selling coding services on a T+M basis the revenue stream would need to continue for several years, with a suite of similar books/courses added over time. Are people still going to be interested in paying money to learn about Ruby performance in 2 years from now? Can two, three, four equally profitable subject be identified and mined every year? Probably not, unfortunately, due to this industry's eternal desire to churn the tech (often to no obvious positive purpose).

siruva07 6 hours ago 0 replies      
You were once my (first!) intern, trying to hawk background checks before any p2p services needed identity verification as a service.

I couldn't be more proud. This may get downvoted to hell (or to invisible), but ef it ;-)

apo 11 hours ago 2 replies      
> I recently removed Google Analytics from my site when I realized it didn't really matter to me.

Interesting idea. If your analytics reports don't cause you to change course, there is no reason to collect them in the first place.

Still, it's hard to believe that analytics add zero value to an aspiring self-publisher. What about simple things, such as knowing which posts get the most traction, and so may be more likely to appeal to someone buying a book or course. That information can be quite counterintuitive.

taude 6 hours ago 0 replies      
This is an interesting read. I published a tech book for a major publisher and received a $7K advance which never earned out. Of course, the timing of my technical topic couldn't have been worse, as the industry shifted. But if I were to do it all over again, I'd definitely go the self-publishing route, especially since I understand a professional editor/writer workflow.

Edit: one other thing that's interesting is that he did this in 2016, about a decade after Ruby on Rails was introduced, and after the market was saturated with Ruby on Rails material.

Great work, and congrats on boosting your consulting business. When I published my book, I definitely got clients and work because of it, that was the real payoff.

Delmania 10 hours ago 2 replies      
Articles like this depress me. I know that if I want to achieve financial independence, I am going to need to start a business, which involves activities like this. (I should also get involved in real estate.) However, when I sit down to think about this, I realize how untechnical I am, get depressed, and then move on.
ziikutv 2 hours ago 1 reply      
Good read. I am wondering: why did you decide not to do a free HTML version of the book, aside from the obvious fact that it is tedious to do?

Did you consider Physical printing of books?

umen 8 hours ago 0 replies      
Great thread !

Can anyone suggest, from experience, the best way to copyright and translate a book into English if I want to self-publish as a non-native English writer? Thanks!

ejo0 9 hours ago 0 replies      
Congrats Nate! Really glad to see you doing well post-JudoJobs; I'm forwarding this along to a few people who are into self-publishing technical books. Accounting for taxes on self-publishing just seems to be something you need to be careful about if you are selling via a few different channels like gumroad, so you aren't audited.
marak830 4 hours ago 0 replies      
Thank you for this. As I have mentioned in previous comments, a fellow chef and I are looking into publishing our own chefs' guides with an application and in-depth write-ups.

Posts like this and all the comments are immensely helpful (and really help to keep the motivation up!)

Thank you :-)

z3t4 6 hours ago 0 replies      
Good advice on persistence. It will be easier if you love it, but there will also be times when you'll hate it. Sticking to the plan, though, is most crucial.
3legcat 3 hours ago 0 replies      
I think the fact that this book is a success shows just how many people are having problems with Ruby on Rails performance.

If they were willing to rethink their stack, they might have less need for such books.

dsschnau 7 hours ago 0 replies      
>I have sold just over 500 copies, for gross revenue of $70,714.20 ($1350/week).

Uhh that is a really expensive book then

hota_mazi 7 hours ago 2 replies      
> I have sold just over 500 copies, for gross revenue of $70,714.20

Er... what? Each copy of the book is $140?!?
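
The implied average does work out close to that. A quick sanity check (treating "just over 500" as exactly 500, so the true average is slightly lower):

```ruby
# Implied average sale price from the post's headline figures.
gross_revenue = 70_714.20
copies_sold   = 500  # "just over 500" per the post

puts format("average per sale: $%.2f", gross_revenue / copies_sold)  # ≈ $141.43
```

As a sibling comment points out, these are sales of a tiered course (videos, Q&A webinars), not a flat-priced book, which is what pulls the average up.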

desireco42 2 hours ago 0 replies      
Hmm... I intend to make an info product that I believe could be used by most developers. However, my idea is to sell it for $10, thinking that no developer could say he can't afford it. Mostly, that's because I want as many people as possible to actually read it and apply it.

From what I read here, I should instead sell this product, which has two parts and is largely done, for $49, with higher tiers above that.

Not sure what to think honestly. I am close to finishing it and I want to make money, but mostly want to share with other developers and open communication.

I like the idea about the Slack channel and community.

jgalt212 6 hours ago 1 reply      
It blows my mind why all these smart people write books for such little payoff, but maybe it has to do with man's desire to feel important.

This is why I love books. You can hear what someone smarter than you has to say for $15. There isn't a better deal out there.

iplaw 11 hours ago 3 replies      
Trolling the Entire Internet codeword.xyz
418 points by Rudism  15 hours ago   153 comments top 16
Wonderdonkey 12 hours ago 5 replies      
Easily 9.9 out of 10 people won't get sardonicism. My contribution was the Evil of Pippi Longstocking site, which aimed to prove that Pippi Longstocking is the devil. (Anybody remember the '90s?)

The Daily Show, which thought I was serious, invited me to do an interview with Mo Rocca. (They were disappointed to learn the site was a joke.)

I earned a headline in Sweden, and in the story Astrid Lindgren lamented being misunderstood (ha!).

The hate mail (and some fan mail) from Sweden was precious. I later added a section on the site called "Swedemail." You'd think that would tip people off, but nope.

Anyhoo, AOL took it down eventually. But it's still on archive.org, thankfully. Here's a later snapshot with the Swedemail section (complete with Barnes & Noble affiliate ads!) if anyone's interested. http://web.archive.org/web/20021017095408/http://members.aol...

OJFord 14 hours ago 9 replies      
Fun read. The article mentions:

> e/n sites (which was what blogs were sometimes called before the term blog had gained wider adoption)

Does 'e/n' stand for something? This 1999 post [0] calls it 'Everything and Nothing', but I can't find anything else on it; is that right? It says:

> This was probably influenced by the number of EBG-like sites popping up with the words everything or nothing in their name.


I remember unabbreviated 'weblog' that seems all but gone now, but I've never come across 'e/n site' before.

Also, spoiler alert: the domain in question now redirects to a new GitHub repo with the original site's source. [1]

[0] - http://www.hearye.org/1999/05/whats-an-en-site/

[1] - https://github.com/rudism/NetAuthority

jamiethompson 13 hours ago 1 reply      
Reminds me of the time when I was running a dating site (before dating sites were really a thing)

I released some untested code and wound up emailing every user with a huge email which was essentially a concatenation of every user's email content, personal information and all.

Much anger was unleashed.
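The original cause isn't described, but one plausible way a mass-mailer ends up concatenating every user's content is an accumulator variable initialized outside the per-user loop. A hypothetical reconstruction (names and structure invented for illustration):

```python
def send(to, body):
    """Stand-in for a real SMTP send; just records what would go out."""
    return (to, body)

users = [
    {"email": "a@example.com", "message": "private note A"},
    {"email": "b@example.com", "message": "private note B"},
    {"email": "c@example.com", "message": "private note C"},
]

# Buggy version: `body` is never reset, so each recipient gets a
# concatenation of everything sent so far.
body = ""
sent_buggy = []
for user in users:
    body += f"{user['email']}: {user['message']}\n"
    sent_buggy.append(send(user["email"], body))
assert "private note A" in sent_buggy[-1][1]  # last user received A's content

# Fixed version: build the body fresh for each user.
sent_fixed = [send(u["email"], f"{u['message']}\n") for u in users]
assert "private note A" not in sent_fixed[-1][1]
```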

CodeCube 13 hours ago 1 reply      
Wow, ton of memories being dredged up by this post and the comments ... good times :)

I often think back to that time period. He mentions that there was no concept of social networks back then, but IMO he's totally wrong. I mean, obviously there weren't any social networks as they exist today, but between the message board communities, and the massively interlinked personal e/n blogs ... there very much was a social network. And it was decentralized. For a short time, it was turning out to be a beautiful thing, especially once RSS started gaining popularity.

I understand why myspace and facebook took all of that marketshare; it wasn't easy enough for the average person to put up their own site, and by the time things like wordpress became popular, every instance was so generic looking (despite templates) that it was tough to get anyone reading your stuff. There was a glimmer of hope with things like Google's RSS reader and Google's social graph API (https://developers.google.com/social-graph/) ... the future could have been awesome, but alas

kordless 8 hours ago 0 replies      
> my massively inflated sense of self-importance from all of the blog posts, links, phone calls, and emails that continued to pour in clouded my judgement

This is brilliant insight. I appreciate that this site contributed to a better understanding of what a reality without trust would look like.

gourou 13 hours ago 1 reply      
The code snippet at the end had me cracking up
owyn 8 hours ago 0 replies      
It was so easy to register weird domain names back then. Our contribution was satan.com. The guy who owned the domain name eventually sold it and it goes nowhere now. For a few years in the late 90's the entire site was just a badly drawn MS Paint image:


Clicking on that was a mailto: link that sent mail to an internal mailing list that we had. Got some good laughs from it but we could never figure out what to do with it. The archives are still floating around somewhere...

gumby 7 hours ago 0 replies      
It was bittersweet to see Lester Haines's name on that Register link. Lester passed away last summer -- he was an inspired writer and a jewel of the 21st-century incarnation of The Register. The link must have been one of his early pieces and not up to his eventual standards.
jhobag 6 hours ago 0 replies      

"Should I be concerned about Dihydrogen Monoxide?"


oldsj 11 hours ago 0 replies      
I love that Joe Rogan took the time to email you about this but didn't catch on that it was a joke
tomcam 2 hours ago 0 replies      
I will respect HN when a site making similar fun of Islam hits the front page
marze 10 hours ago 0 replies      
The internet was nothing if not hyper free speech back then.

So ironic that this shows up now, during the internet's "let's scrub out the fake news" phase.

lkrubner 12 hours ago 7 replies      
My favorite bit of trolling, from 2004, was the "I don't usually link to blonde jokes, because they are sexist, but this one was really funny..." Such a very clever bit of trolling.

A few bloggers (most of a feminist mindset) agreed to launch it together, and they linked to each other. Then several dozen other prominent bloggers joined in, linking to each other with text, which varied but basically stayed with the same theme: "Blonde jokes are sexist, so I don't usually promote them, but this one was really funny..."

So a person reading the first blog would click the link, and go to a second link where the text was again "I know, I know, I should not promote blonde jokes, they are sexist, but this one really made me laugh..." and you click again and again you read "This is the funniest blonde joke that I have ever read" and you click again and again you read "I hate blonde jokes, but this one made me laugh out loud..." and you click again...

How many times did you click, before you realized the joke was on you? The joke was pretty much a test of your social intelligence.

That joke really only worked in the blogosphere of 2000-2006, the era when the blogosphere was at its peak. I am not sure how anyone could recreate that joke now.

djhworld 8 hours ago 0 replies      
This was a really fun read, thanks. I first made it onto the web in 1999, those were the days!
_audakel 13 hours ago 16 replies      
Wow..... Just went to the site he mentioned (rotten.com). I nearly threw up. I thought I had a decent stomach for stuff but that is fucked up. Sorry for the click bait sounding comment but I am disturbed after a few of the pics. I have no words.
Asooka 13 hours ago 0 replies      
Well that site DEFINITELY falls foul of the Net Authority guidelines. Suggesting that NA isn't real, smh.
I invented the web. Here are three things we need to change to save it theguardian.com
380 points by perseusprime11  4 days ago   241 comments top 29
ThePhysicist 3 days ago 11 replies      
Surprised that no one mentioned the new EU data protection directive, which goes a long way to fix the first issue mentioned in the article, the loss of control over our personal data (only for users in Europe though). I studied it in detail as I work in data analysis and consult companies on this, and I honestly think it is one of the best laws produced by the EU so far:

It gives users a multitude of rights such as being informed about exactly which kind of data a company has about them (and even get a digital copy of that data), how the company uses that data and for which purposes it is used. And if you're subjected to algorithmic decision making (e.g. an algorithm decides if the bank should award you a credit) you have the right to know which kind of algorithms were used in the process and to contest the decision. You also have the right to demand the deletion of your personal data and to revoke the right of a company to process it, as well as to demand correction of inaccurate data. The legislation also allows for severe fines and punishments for companies not respecting the regulation (up to 4 % of yearly turnover of the whole company group), so even companies the size of Google or Facebook should have strong incentives to follow the regulation.

Andrenid 4 days ago 15 replies      
I would truly love a new "web" which is effectively style-free. I think the existing one is fine to continue for the general public, online shopping, social networks, etc., but I'd want a parallel information-dense system that uses ultra-lightweight browsers that work on every device and platform, accessing machine-readable data with some standard stylesheet that concentrates on readability, keeping the "good" bits of the modern web without the fluff.

It would be great for technical blogs and news, project sites, wiki type data stores, discussion forums, etc.

Maybe everything in this "new" web is static, no stylesheets except browser-side for users to customise themselves.

I'm not sure what the actual answer is but I know the existing web is broken beyond repair.

K0nserv 3 days ago 5 replies      
> Today marks 28 years since I submitted my original proposal for the world wide web. I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities, and collaborate across geographic and cultural boundaries.

Seems to me that the above and the points raised in point 2 sit on opposite sides of the spectrum. Either you get a free and open internet where everyone can publish content as they like, or you police who and what can be published. The spread of misinformation seems to be a direct result of the democratic nature of the internet.

Oxitendwe 3 days ago 2 replies      
If Tim Berners-Lee really wanted to save the web, he wouldn't support DRM in our web standards.[1] It's absolutely disgusting as well that they would argue that "fake news" is somehow a threat to the internet, provide no evidence whatsoever to explain why, and then link to a panel run by a media company considered untrustworthy by about half of American voters.[2]

[1] https://www.w3.org/blog/2017/02/on-eme-in-html5/

[2] http://www.rasmussenreports.com/public_content/politics/gene...

pdimitar 3 days ago 2 replies      
As much as I'd love to help projects like IPFS (for example), the truth is that most people simply don't care and are entirely clueless about the impact that the continued centralization and surveillance of the Internet has on their lives. Sitting with random people at a table, they giggle and smirk, saying "I've got nothing to hide, you're too paranoid, bro, cheer up!" and I quickly give up. They have zero idea how much info is collected about them. If tomorrow somebody pulled out that info in a fabricated trial against them, they'd sing another tune, but it would be way too late. Nobody ever listens until it impacts them directly. Sad reality about Homo sapiens. Another one is the echo chamber effect -- people absolutely LOVE their social echo chambers, and they can legitimately punch you in the face if you point them at a source that disagrees with them.

As a second and last point to the above, I can't afford donating all my free time to help progress the decentralized internet anymore. I am 37 and I have a very happy personal life but need to work on my health a lot, I am very tired and burned out and I am finding myself unable (even if I want) to work for free without any reward in sight (not even talking about money; I am sure I wouldn't even be thanked). I imagine many others are in a similar position -- in terms of finances, in the health department, or in their general mental stance.

I very much like the idea of creating a "home internet box" which is a self-contained fanless machine connected to an UPS -- and it contains router, firewall, own website, own mailserver, own private Dropbox, a universal P2P node (BitTorrent / IPFS) etc., but as others have pointed out, our current stack of network technologies is way too bloated and full of incomplete standards -- which in turn are likely full of exploits and dark corners -- that right now the only seemingly appropriate course of action is to get rid of it all -- except the physical layer protocols -- and start over.

Try making an API app that works with anything else than HTTP and HTML/JSON. Tell me how that went for you. Try using ASN.1 as a data format, or a compressed secured IP layer protocol. Yes it's possible but it's much slower than it should be. Seems us humans always want to have one "universal truth".

It's extremely sad, and I am afraid we'll live to see very oppressive times pretty soon.

mpweiher 3 days ago 0 replies      
The letter:


Referenced by the W3C, but surprisingly without a direct hyperlink, only by title. A bit strange considering the organization:


fiatjaf 3 days ago 7 replies      
Does anyone else see as problem that web browsers are getting so feature-rich? That means that if anyone wants to write a new web browser he won't be able to.
tomohawk 3 days ago 3 replies      
It's interesting that one of the great things about the web is the promise of its distributed nature, and yet his prescribed 'fix' for misinformation sounds like the establishment of some sort of central authority to regulate content.
chippy 3 days ago 2 replies      
The current narrative of misinformation as a news item is a new thing, which arrived in our world during the recent US election process. The issue gets my suspicion-radar bleeping. The whole narrative smells funny to me.

It's not really a global issue; it's a current-affairs issue, and one particular to a specific geography. And it's not really an internet issue, I think, but a human one.

What I find interesting is that Trump is adopting the narrative that emerged to criticise him, to criticise media bias in general. That's interesting because political bias and misinformation can be separated - actual wrong reporting of facts vs bias of interpretation, but they can be argued to produce the same effect.

mborch 3 days ago 3 replies      
The new web needs to be distributed in a privacy-preserving sense. Today, you can't realistically browse the web without being identified and, generally speaking, geolocated.

What we need is a model where you pull information you request from distributed and diverse pools of public domain content.

return0 3 days ago 1 reply      
The web needs more anarchy, not less. Less spoon-feeding people with the truth, let them bear the brunt of their failures. All the problems he mentions are political, stemming from too much power in governments which makes political candidates ruthless.
kyledrake 3 days ago 0 replies      
I've been getting increasingly concerned about the future of the web as well https://arstechnica.co.uk/information-technology/2017/02/fut...
fiatjaf 3 days ago 0 replies      
> "It's too easy for misinformation to spread on the web"

It's too easy for misinformation to spread everywhere.

ajdlinux 3 days ago 1 reply      
As commerce has become more mechanised, we've lost the ability to bargain and haggle in consumer business relationships. Forget about sacrosanct privacy rights, I can't even choose to pay to opt out of a lot of data collection. We need better options than all-or-nothing.
hackuser 4 days ago 2 replies      
The marketplace won't sort out the security issues, any more than it sorted out unstable banks. Consumers lack the ability to obtain information, understand the issues, and make good decisions.

Computer systems should be regulated for safety, which includes confidentiality and integrity, like everything else.

bnolsen 3 days ago 0 replies      
Except for the private-data part, I found this not very constructive. Too political. IMHO the biggest problem with the web itself is snooping and tracking, all driving an insane amount of bloat which clogs the internet. Any extra bandwidth or horsepower is immediately sucked up, and then some, by advertisers and tracking. There's nothing lean and mean about it anymore.

The internet exists as an information resource that people need to be able to sift through themselves, not something that governments or other self selected groups decide to arbitrarily censor for whatever selfish reasons they have.

fiatjaf 3 days ago 2 replies      
> "Political advertising online needs transparency and understanding"

As if that wasn't a problem outside the web. Defenders of democracies like to dream about "transparency and understanding".

curiousgeorgio 3 days ago 0 replies      
It's refreshing to read Berners-Lee's proposed solutions as they point toward more technical and market-based approaches rather than the typical "we need more legislation to fix these issues" incantation.

I often hear many of the same people fighting "against government overreach in surveillance laws" (as Berners-Lee mentions) while at the same time advocating more legislation to govern information use/misuse on the web. I don't think it's realistic to expect government overreach to magically work where we want it and stop right where we don't.

Many of these problems aren't on the forefront of most people's minds (yet), but as the issues become more publicized and people begin to understand their importance, then we (as in "the people", not the government) will have a greater voice - and more importantly, power through informed choices - to make a difference.

simplehuman 3 days ago 3 replies      
For a start we need to get rid of this ad-driven model. But this is not going to happen because people are addicted to free. It's like a drug.
toadkicker 3 days ago 0 replies      
I just wanted to add some positive comments here, amid the complaints about the state of the web as we know it today, about the people who are working on solutions to the problems. There are well-defined paradigms for building distributed systems. While much of the web was built in the belief that these distributed systems would take root, lots of engineering went into client-server configurations. There are a ton of psychological reasons why those decisions were made. They don't have to keep being made, though. We can all embrace distributed applications (some call them serverless applications) and free the web once again. Here are a lot of great projects trying to do just that: http://github.com/toadkicker/awesome-ethereum
sauronlord 3 days ago 1 reply      
He is advocating for censorship.

Misinformation spreads everywhere not just the web. Who decides what is "misinformation"?

All speech and information is political, because man is a political creature. Who decides what is "political"?

His first point about losing control of our personal data is right on though.

Even so called "heroes of the web/freedom" are on the "fake news bandwagon".

What the hell have we come to when this is considered enlightening discourse.

We're all in deep shit and this is a taste of things to come this century.

nsxwolf 3 days ago 0 replies      
It sounds like he's just trying to put the genie back in the bottle now. First he creates a system that gives everyone total freedom and now he's like, whoah, that's way too much freedom.
liopleurodon 3 days ago 0 replies      
Tim Berners-Lee. Thank God, I was worried they had interviewed Al Gore
perseusprime11 3 days ago 1 reply      
Isn't Facebook the biggest culprit on the web? Their walled-garden approach and lack of social network portability is what I feel is killing the web more than anything.
kahrkunne 3 days ago 0 replies      
Publishing this in The Guardian, what beautiful irony
chengiz 3 days ago 0 replies      
Berners-Lee seems to want to stay relevant. I mean, he invented the web, but does that give credibility to his announcements and concerns and predictions now? We all know how the semantic web turned out.
inetknght 4 days ago 2 replies      
Irony: can't view the page with an adblocker.
Tylerosaurus 3 days ago 0 replies      
I thought Al Gore invented the internet
SomeStupidPoint 4 days ago 1 reply      
> Imagine that Big Brother scenario extended to the millions of smart devices such as digital thermostats and fire alarms feeding the Internet of Things ecosystem, and you have a problem that could eviscerate the privacy of billions of people, say security experts.

Is this anything but opportunistic scare-mongering?

"Spy agency own spy tools. Wouldn't it be scary if they used them on you?!?!?"

Tesla to raise 1B tesla.com
319 points by loourr  6 hours ago   208 comments top 23
Cookingboy 5 hours ago 17 replies      
As an investor in Tesla who bought into Elon's vision, I've been getting more and more nervous about how the company is actually run at this point.

A modern car company is as much a production and supply-chain company as it is a product and tech company. I get the feeling that because Tesla is cutting-edge in the tech department, all the supply-chain/production problems get treated as if they were speed bumps along the way, when in reality they are fundamental core competitive areas for an automotive company. Toyota is not the world's premier automaker because they have cutting-edge tech; they are what they are because they build an unbelievable number of reliable cars at an unbelievable scale.

The bleeding of cash after so many promises of "not going to raise money again" just shows that their vision is way ahead of their execution ability. Decisions like the terribly executed Falcon Wing doors on an SUV, which delayed production, increased cost, and hurt reliability, just gave further fuel to all the doubters.

I also have a strong suspicion that Tesla has not been honest about the progress of their Autopilot 2 and full self-driving tech. The current AP2 is an unsafe joke, and I have no idea how people are ok with paying $5-10k for a significant downgrade compared to AP1 on the strength of a non-binding promise that it "might" get upgraded before their lease term is out. Any other automaker would be laughed out of the room for suggesting this kind of sales tactic.

I still believe in Tesla/Elon's end game vision, and I think what they accomplished is nothing short of brilliant, but I think it's a very risky investment at this point and the road ahead is bumpier than many like to admit.

The upcoming Model 3 will be the acid test for Tesla. It will no doubt be a good car, but at this point people should focus less on what the car can do and on how the car will be rolled out, sold and supported afterwards. A wealthy Model S customer may be ok with having the car in the shop after a minor accident for 5 months due to lack of replacement parts, but for an average Joe paying $500/month leasing a Model 3 that would just be the birth of a new BMW customer.

kumarski 2 hours ago 3 replies      
Uber as a ponzi scheme of ambition

1/ Basic premise was they keep highlighting their entry into larger and larger markets and so investors keep ponying up $ for the ambition.

2/ First, it was taxis

3/ Then, it was taxis + logistics

4/ Then taxis, logistics, vehicle ownership

5/ Then taxis, logistics, vehicle ownership, autonomous

6/ Then taxis, logistics, vehicle ownership, autonomous, trucking

7 / Then taxis, logistics, vehicle ownership, autonomous, trucking, drones

8/ I might have the order wrong but each spins Uber into addressing an even larger Total Addressable Market

9/ While they never actually own even the first one (taxis) yet. But investors love the ambition and keep ponying up.

10/ Hence, the ponzi scheme of ambition


It's hard to say whether Musk is doing something similar or not. Satellites, boring company, mars mission, electrified grid, etc....Obviously, Kalanick is much different than Musk, but I can't help but feel pattern recognition creeping up on me.

salimmadjd 5 hours ago 3 replies      
Tesla is basically taking on a loan in the form of convertible notes. It's probably much smarter to distribute their lenders among multiple entities than to rely on one giant bank that would have leverage over them.

Also, it looks like Tesla will use a portion of the money to inflate their share price via a small buyback. The after-market trading so far makes this strategy look successful.

So at the end of the day: Tesla is raising capital without diluting their shareholders, reducing their share value, or giving up much control. Also, the maturity date of 5 years is rather interesting. I wonder if it correlates with their internal projections and their ability to pay that money back by then.

patrickg_zill 4 hours ago 1 reply      
If I were to make a computer analogy, I would probably say that Tesla in having to deliver the Model 3, is right at the point IBM was when they were finishing up the IBM 360 series: it will be a make or break product release for them.


quote from Thomas J Watson Jr.:

"The expense of the project was indeed staggering. We spent three quarters of a billion dollars just on engineering. Then we invested another $4.5 billion on factories, equipment and the rental machines themselves. It was the biggest privately financed commercial project ever undertaken."

(note: the $5 billion figure was in 1964 dollars; inflation-adjusted, that is many billions more)

csours 3 hours ago 1 reply      
Disclaimer up front: I work for a Tesla competitor.

If I were a Tesla investor I'd want to hear more about how Design for Automation is going. If you plan to automate vehicle assembly in a cost effective manner (as Musk has indicated), your components and your assembly operations must be designed for that automation.

Tesla has had notable problems getting suppliers to understand and implement their plans (BorgWarner? - original transmission for the Roadster; Mobileye - self-driving components; Falcon wing door component supplier).

Tesla regularly asks suppliers to do things that no other car company has asked them to do (see list above). Design for Automation is a new requirement for many components, so this may be a challenge for suppliers, and thus Tesla.

Tesla may win this gamble, or they may not. It will be interesting and educational either way.

thewhitetulip 15 minutes ago 1 reply      
I think the time is ripe for Tesla to launch a manufacturing plant in India. The Modi govt is keen to ratify the Paris accord and raise clean energy initiatives. On another note, it would help the Tesla brand if they first launch a high-end car in India and later market the mass-market car as a better luxury car; they'll earn an awfully large income on this.
grandalf 5 hours ago 1 reply      
This is the sort of thing that ought to terrify the makers of legacy automotive technology, both for how Tesla might scale production and for how Tesla might become a lot more aggressive with its intellectual property strategy.
bischofs 4 hours ago 2 replies      
Tesla looks like it will lose that cool-new-thing vibe soon; the novelty of an electric car will wear off, especially when Porsche or Benz comes out with an electric sedan or coupe. These companies will offer a much more compelling product, with the ownership value that people expect from a luxury car company.

I have shopped a Model S, and besides the electric drivetrain it is not really that nice of a car compared to an E-Class or a Porsche Panamera; the interior is weak as far as fit and finish. As soon as the Germans offer a similar product on the high end (and with GM already offering a full electric on the low end), Tesla is done.

tmsldd 2 hours ago 0 replies      
Nothing wrong with Tesla raising money; after all, they basically burn it to sustain the company. The curious part is that it seems like so little: $1B isn't really that much.
Animats 3 hours ago 0 replies      
The notes:

will rank senior in right of payment to any of our indebtedness that is expressly subordinated in right of payment to the notes, (in other words, they're not senior to anything not explicitly listed as junior, or to any existing debt)

will rank equally in right of payment with any of our unsecured indebtedness that is not so subordinated (including our Existing Convertible Notes),

will be effectively junior in right of payment to any of our secured indebtedness to the extent of the value of the assets securing such indebtedness and will be structurally subordinated to all indebtedness and other liabilities (including trade payables) of our subsidiaries...

That's a rather junior senior note.

None of this matters unless they go bust, in which case it really matters.
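The ranking language quoted above describes a payment priority: in a liquidation, claims are paid senior-first until the money runs out, which is why seniority "really matters" only in a bust. A toy waterfall sketch (the class names and amounts are invented; actual priority is fixed by the indentures and bankruptcy law):

```python
def waterfall(assets, claims):
    """Pay claim classes in priority order; return recovery per class."""
    recoveries = {}
    remaining = assets
    for name, owed in claims:  # claims listed senior-first
        paid = min(owed, remaining)
        recoveries[name] = paid
        remaining -= paid
    return recoveries

claims = [
    ("secured debt", 500),                    # senior up to collateral value
    ("unsecured notes incl. converts", 400),  # pari passu unsecured layer
    ("expressly subordinated debt", 200),
    ("equity", float("inf")),                 # residual claim
]
print(waterfall(300, claims))   # secured recovers 300; everyone junior gets 0
print(waterfall(1000, claims))  # secured 500, unsecured 400, subordinated 100
```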

marricks 5 hours ago 1 reply      
That went quickly from denying a capital raise to reality.
dilemma 3 hours ago 1 reply      
It's correct that if EVs take off, there is going to be a lot of new demand for batteries.

It is also correct that, as a car manufacturer, owning a battery-producing company and selling to the rest of the industry is a competitive advantage.

It is not correct that building both of these companies from scratch, in parallel, is the way to do it.

Focus on building cars and become the leader in that, then buy Panasonic or a flailing manufacturer.

Or become the leader in battery technology, and buy a car company that couldn't get on the EV train fast enough and turn it around.

Tesla is spreading itself too thin, losing focus and eventually competitiveness and then, control.

slagfart 5 hours ago 1 reply      
Does anyone know the % interest rate on these? It seems to be blanked out.

How can financial markets function efficiently when such basic information is suppressed?

JumpCrisscross 3 hours ago 0 replies      
Ah, convertible bonds. On the trading floor, a converts-trading colleague had a Post-It note on his monitor:

"Interest rates up: converts down

Interest rates down: converts down

Volatility up: converts down

Volatility down: converts down

Apple strudels up: converts down

et cetera"

Converts are complicated. For one, nobody is fighting for you post-issuance. You don't (yet) hold stock, so the Board doesn't think you're pretty. Yet your bonds will be subordinated, making them swim like equity. This leads to all kinds of fun [2] when markets fail to maintain monotonicity.

Monotonicity means if a line is going up it keeps going up; never down (and vice versa) [3]. A graph for the Empire State Building with the floor number on the X axis and the height of said floor on the Y axis would be monotonically increasing. If I bent the building into a U shape, that graph would not be.

If you graph pay-offs for people in your company as a function of the stock price, you want it to be monotonic. That means interest are aligned. If your CFO makes a million dollars when the stock tanks, he's going to want the stock to tank.

Let's consider HappyCo. HappyCo has issued 10,000 shares of Common Stock. They trade at $100 per share (a $1MM market capitalization [4]). HappyCo issues 1,000 convertible bonds that can be turned into one share of Common Stock in exchange for $120 per share. To keep this easy for now, let's say HappyCo issue these converts secretly, i.e. the market can't price them in before exercise.

If the stock price is way above 130 everyone wins. If the stock price is below 130, converts lose. (Stockholders also lose.)

But what if the price is exactly 130? The market, thinking there are 10,000 shares outstanding, weighs the company in at a $1.30MM market cap. But then--dun dun dun--everyone converts. Holy shit, 1,000 new shares! At what price does this new share count (11,000) yield a $1.30MM market cap? 118.18 per share. At 129 pre-conversion, the $1.29MM company trades at 129 per share after accounting for the converts (nobody converts). At 130 pre-conversion it trades at 118.18 post-conversion. It takes us until 141.90 pre-conversion to get back to 129 post-conversion. The price goes monotonically up, but the stockholders do okay, then worse, then better again. Monotonicity was broken. An evil shareholder learning of the converts might prefer a pre-conversion price of 129 over 140.
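The HappyCo numbers can be reproduced under the comment's own simplification: market cap stays fixed at the pre-conversion price times the old share count, and the cash paid on conversion is ignored. (A real model would add those proceeds; this sketch only mirrors the toy arithmetic above.)

```python
OLD_SHARES, NEW_SHARES = 10_000, 11_000

def post_conversion_price(pre_price, converts_triggered):
    """Per-share price after (possible) conversion, toy convention:
    cap = pre-conversion price x old share count, proceeds ignored."""
    cap = pre_price * OLD_SHARES
    shares = NEW_SHARES if converts_triggered else OLD_SHARES
    return round(cap / shares, 2)

print(post_conversion_price(129, False))    # 129.0  -- nobody converts
print(post_conversion_price(130, True))     # 118.18 -- dilution kicks in
print(post_conversion_price(141.90, True))  # 129.0  -- break-even again
```

The jump from 129.0 to 118.18 as the pre-conversion price ticks up by one dollar is exactly the broken monotonicity described above.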

This may seem silly. Nobody secretly issues stock [5]. The market would start pricing the converts in as the stock price approached the conversion price. How does it do that?

Options! A convert is a bond married to an option. (People once made a lot of money arbitraging converts against the issuer's stock, bond and options markets [6]). You get all the legal complexity of bonds [7] entangled with the mathematical complexity of options [8]. As I said, fun [2].

But bankers would just price things properly at the outset to ensure they maintain monotonicity, right? Well, they try to. But look at the variables in a common option valuation equation [9]. Rates, volatility, dividends, et cetera. Each of these changes the balance for converts holders. (For example, if dividends go up one might want to convert sooner, i.e. at a lower price. What if taxes go up at the same time? Who knows! Fun [2]!)

In a perfect world, each variable would update one at a time, like an Excel spreadsheet that isn't complaining about circularity. Investors would, one by one, plug new numbers into their models and get a clean answer. Unfortunately, we inhabit a reality where many of these variables can change simultaneously.

Upshot: converts routinely ram through their boundaries. When this happens, everyone converts. Actually, not everyone, since each investor has a different break-even conversion price. An insurance company, with lower borrowing costs, will convert differently than an individual investor on margin. Similarly, an investor trading out of a tax-deferred IRA will convert differently than one trading a straight taxable account. Fun [2]!

TL;DR: Converts are a lot of fun [2] for financial theoreticians, fun for market makers and hedge funds, a little less fun for bankers, and an affordable source of entertainment for all.

[2] http://dwarffortresswiki.org/index.php/DF2012:Losing

[3] https://en.wikipedia.org/wiki/Monotonic_function

[4] http://www.investopedia.com/terms/m/marketcapitalization.asp

[5] http://sharesleuth.com/investigations/2012/12/small-companie...

[6] http://www.institutionalinvestor.com/article/1027772/boy-won...

[7] http://www.treasurer.ca.gov/cdiac/debtpubs/handbook.pdf

[8] https://en.wikipedia.org/wiki/BlackScholes_model

[9] https://en.wikipedia.org/wiki/BlackScholes_model#Black.E2.8...

exabrial 4 hours ago 3 replies      
My big question is, why? It seems they are selling vehicles. Are they in trouble or is this just to expand business?
ww520 2 hours ago 0 replies      
Why is the after-hours market going up on this news?
antoniuschan99 5 hours ago 0 replies      
Why does Tesla need this? What is the money going to be used for?

Is this like Solar Bonds that SCTY used?

CptJamesCook 5 hours ago 3 replies      
I just picked up my Model X after a week in the shop. They weren't able to finish all of the work, because "we are busy this time of year." They told me to bring it back in April.

In about a year of ownership, this is the 3rd time I've lost the car for about a week. I probably should have brought it to the shop 3-4 other times, but I let the issues build up.

They've had to replace all sorts of parts on my car, including the entire driver's seat. Even basic stuff like my phone playing audio doesn't work half the time. The driver's side wing mirror quit opening. I've had to reboot the car 10+ times in order for the dashboard display to work. The gull wing door opened into a ceiling and left a scratch. The hood opened into a ceiling and left a scratch. My windows have quit going up three times. Most of the trim / sealing / etc. is coming off or misaligned. I could probably sit here and think of 10 more problems.

Anyway, I knew I was buying a v0.1 car and I'm generally happy with it, despite the problems. Their service has actually been amazing, but that's kind of the point of my post:

If the Model 3 has anywhere near the same level of problems as my X, at 10x the scale, Tesla is doomed. How could this possibly be even remotely cost effective on a lower-margin Model 3?

eternalban 4 hours ago 1 reply      
Fuel cells are the future, not batteries.
unstatusthequo 5 hours ago 1 reply      
$750M, not billion
4SomeReason 3 hours ago 0 replies      
Dude's user name is 'losers'. Lol
hackuser 4 hours ago 1 reply      
How much of Tesla's prospects are due not to free market competition, but Elon Musk lending his credibility to Trump? What happens to his competition, who do compete in the market and take a principled stand against Trump? Is Musk still supporting Trump?

I just spent some time doing a little research (below) and as far as I can tell, the answers to these questions seem unsettling, though I couldn't find many great, current sources in a short time. I'd be interested in serious discussion and information; let's skip the partisan hackery - there's so much out there that more won't provide any marginal value!




Musk's support for Trump:

* Billboard near Elon Musk's Tesla factory asks him to 'dump Trump'


* Elon Musk, other business leaders to talk infrastructure with Trump Wednesday


Musk, who is a member of both Trump's economic advisory council and manufacturing council

* Elon Musk: Rex Tillerson could be an 'excellent' secretary of state [from January]


Note that Tillerson was previously CEO of Exxon-Mobil, a captain of the carbon industry.




How his businesses benefit from the relationship with Trump:

Don't forget the strong appearance that Trump is selling access. What will happen to the space exploration goals of Jeff Bezos, owner of the Washington Post, for example, or of other Musk competitors? Note that Musk's actions put pressure on them, too, to support Trump or risk serious damage to their businesses. It also seriously disrupts the free market to have it depend on personal relationships with the President rather than the merit of the companies.

* Musk's Surprise Rapport With Trump Means 40% Rally for Tesla [from January]


* Elon Musk's Trump Alliance Indicates Great Things for Tesla Stock


* UBS analyst says he can't understand why Tesla shares are up so much this year, sell the stock


"We struggle to understand the run-up, particular as Q4 deliveries missed, though positive spin on the Musk-Trump relationship, reconfirmed Model 3 launch timing, and expectations of new reveals (including more autonomous features) are likely factors," analyst Colin Langan wrote in a note to clients Wednesday.

* UBS (Basically): Tesla's Path To Profitability Now Dependent On Musk-Trump Bromance [a one-sided editorial]


UBS notes, there has been discussion that Trump's infrastructure project may involve federal subsidies for a nationwide EV charging corridor, which could reduce the cost burden.

* Elon Musk's surprising secret weapon: Trump? [From January]

* Elon Musk Has Trump's Ear, and Wall Street Takes Note



[Both re: Morgan Stanley investor advice:]

"Elon Musk has an important line of communication to Donald Trump through his role as a strategic advisor to the President-elect," Adam Jonas, an analyst with Morgan Stanley, wrote in an investor note Thursday. / "We believe this level of coordination with the new administration could actually evolve into greater strategic value than with the prior administration," Jonas added.

Mr. Jonas said that the "strategic relationship between Tesla leadership and the new administration is an important development" in his decision to upgrade Tesla stock.




And some background:

* Elon Musk Is Betting Big on Donald Trump


Paypal Horror Story - 40k Frozen, No Answers
400 points by sabslaurent  3 days ago   242 comments top 33
JumpCrisscross 3 days ago 5 replies      
Under U.S. federal law, PayPal is not a bank [1]. This is important. From the government's perspective, you give PayPal money and then PayPal gives you money. Between those events, it's not your money. PayPal has enormous discretion around what they can do with those funds, how and under what circumstances they get to decide to give it to you and if they get to keep it forever.

What state do you live in? Do you do business through an entity, e.g. an LLC? If so, where is it registered?

PayPal is, varyingly, registered as some form of money transmitter in many states [2]. Your federal protections are probably limited to antifraud rules, and you're probably outside even those (PayPal has good lawyers; you agreed to surrender a lot of privileges when you opened an account), but there may be state regulations you can use to, if not force action, at least encourage it.

Going forward, I tend to consider any business that uses PayPal for mission-critical processes to be negligent about a serious risk. If you can't avoid using PayPal for certain lines of business, set up a nightly sweep from PayPal to a proper bank account.
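A sweep can be as simple as a scheduled job that moves everything above a small working float. A sketch of the logic, where `get_balance` and `transfer_to_bank` are hypothetical callables standing in for whatever API or manual process you actually use (no real PayPal API names are assumed here):

```python
def sweep_amount(balance_cents, float_cents=0):
    """How much to move out, keeping only an optional working float behind."""
    return max(balance_cents - float_cents, 0)

def nightly_sweep(get_balance, transfer_to_bank, float_cents=0):
    """Run once a night (e.g. from cron): drain the balance down to the float."""
    amount = sweep_amount(get_balance(), float_cents)
    if amount > 0:
        transfer_to_bank(amount)
    return amount
```

The point is less the code than the policy: the balance exposed to a freeze is never more than one day's takings plus the float you deliberately leave behind.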

[1] http://www.zdnet.com/article/fdic-decides-paypals-no-bank/

[2] http://law.bepress.com/cgi/viewcontent.cgi?article=1153&cont...

Disclaimer: I am not a lawyer. Only a lawyer can give you good legal advice. Don't take legal advice from Internet comments.

sabslaurent 3 days ago 1 reply      
An update for anyone who cares... I tweeted this link and tagged Paypal; they reached out saying they were sorry to hear it and submitted my case for a review. I replied via DM asking how they could review my case without requesting any new information or telling me what the issue is, and got no reply back. Then I just got an email saying "Appeal Denied": Paypal account closed for security reasons.

Which security reasons I have absolutely no idea. There's no contact info on the email, just says it's a do not reply address.

ziszis 3 days ago 2 replies      
Traditional customer service channels are increasingly broken, since many companies see them as a cost center to be reduced. In particular, if you are trying to cancel service or withdraw money it is even "worse", because not only do they have to pay the salary for the rep, they also lose money by helping you.

If after one or two calls you don't get what you want, it is not worth retrying. I gave up after being placed in a queue for an hour with Comcast; when I was finally about to speak to a rep, an automated message informed me that they were closed for the day and to call back tomorrow.

Here is what I find works:

1) Least likely - Traditional customer support channel. Try once or twice at most.

2) More likely - Contact publicly on social media like Twitter (Comcast is actually great about this).

3) Most likely - Getting upvoted and written about here and elsewhere.

I would wait to engage a lawyer, as there is a good chance that someone from Paypal will pop up in this thread. It is probably already surfacing in some internal emails at Paypal right now. Good luck.

oblib 3 days ago 3 replies      
I make invoicing software that I sell using PayPal and it lets my users add a "Pay with PayPal" button to their invoices and over the years several of my clients have called me and related stories similar to this and they all had some common traits.

They all processed quite a bit of money via PayPal, they all had issues with customers requesting refunds which they disputed or didn't issue in a timely manner, and they all sold something with the potential to be a bit shady. One of them sold aircraft parts to Iran, which may have had some legal restrictions attached. Another sold guns.

Since I sell access to web based software I don't have to ship anything and the product is "delivered" instantly. I also process refunds immediately and without any question.

PayPal most certainly doesn't like getting caught up in refund issues. In my case most of the customers who've requested a refund contacted me first and I issued it promptly with a "Thanks for trying my software" note attached.

The few that have contacted PayPal first resulted in PayPal sending me a notice about the request for a refund and, again, I issued it immediately, but their notice makes it clear that is what they expect and if I recall correctly they put a time limit on that.

Take from this what you want but what I've taken from it is when a customer requests a refund issue the refund as quickly as possible and try hard to make that as easy as possible for your customers and to minimize the potential for them asking for one.

mhoad 3 days ago 2 replies      
I very recently (as in, this was resolved about 72 hours ago) was in a similar situation where I had $20k withheld by PayPal.

I would switch all of my business billing to Stripe in a second if they had the ability to pay out to accounts in different currencies like I can with PayPal.

Despite the fact that I run an Australian business, I usually bill in USD, but because I travel so much I'd sometimes like to have it in EUR or GBP etc. for practical purposes. With Stripe, however, I am forced to convert it into AUD before I can do anything at all with it, meaning I usually have to eat currency conversion fees twice before I can use the money in a practical sense. Hence PayPal, sadly...

fermigier 3 days ago 3 replies      
IIRC this is not the first time this kind of bad behaviour has been reported about Paypal. I suppose you were aware of that. Did you consider Paypal's reputation when you chose to do business with them? Were there alternatives that you considered at the time and if so, why did you stick with Paypal?
rgbrenner 3 days ago 1 reply      
2% dispute rate?! That would get you shut down at every merchant account provider I've ever dealt with. In fact, it's usually 0.5% or 1% max. One major bank gave me 0.25% max. I've never seen an agreement that said 2% was ok.

That's a very serious fraud problem.

I've run an ecommerce store -- the chargeback rate was 0.1% (seriously, I calculated this from actual #s).

Nothing in this story (except maybe the customer service issues) would be out of the ordinary for any merchant account.

Edit: fixed chargeback rate

mastazi 3 days ago 0 replies      
I don't understand, this post has no link and no text body, just a recursive link to this discussion page. Where is the "horror story" mentioned in the title?

EDIT: Oh I see, the original post is buried down in the thread, because it has been posted as a comment. Perhaps the mods could fix this?

sabslaurent 3 days ago 14 replies      
I've successfully processed hundreds of thousands of dollars with my Paypal account over the past 4-5 years. My account has been under review a few different times because of volume spikes (around the holiday season; I do ecommerce). When that happens, I usually reach out to Paypal and we discuss like humans: I explain where I'm coming from, they make suggestions, and we get it settled.

Around November 2016 Paypal reached out and told me that, due to the disputes coming in, they needed a $5000 set reserve + a 10% rolling reserve to be released 90 days after each transaction. I accepted, and since then Paypal has called me on 3 different occasions to check up on me and my efforts to reduce the dispute rate. We discussed, and the calls seemed to go very well, without them having any requests or EVER telling me my account was at risk of being limited and shut down due to disputes.

Towards the end of January I realized I had tired of the disputes (there seemed to be a quality issue with the product, which got hundreds of 5-star reviews on my site, but disputes still came in at around 2%), so I slowly wound the business down, meaning I stopped advertising and the only sales were trickling in organically. My volume went down from $200k a month to about $10k a month.

On Wednesday I log in to my Paypal account and it says it's limited they need more information.

They asked for Photo ID, bank statement, proof of address, supplier invoice, supplier contact info and proof of delivery for the last 5 transactions.

I provided everything but the proof of delivery for the last 5 transactions. From the resolution center whenever I clicked proof of delivery it brought me to a page with no transactions so of course I could not provide proof of delivery for transactions that don't exist.

I contacted Paypal letting them know I submitted everything but the proof of delivery since there's a bug in their system; they said no worries, I'll get an email requesting the transactions they need tracking for and I could just reply back.

I never got that email, but I did wake up Thursday morning to an automated "Appeal Denied" email saying my account was closed and the money would be frozen for 180 days. That's $20k CAD in my reserve + $15k USD in my available balance. Keep in mind that in the past 30 days I processed less than $10k USD on Paypal in total.

I reached out to a supervisor at Paypal and told him that what is happening simply doesn't make sense: I provided everything they needed except for what their system was unable to request/receive, and if they had any issue with what I provided they should tell me what it is and help me resolve it instead of giving me the hammer for no reason. He said he couldn't help me but opened a ticket for both his supervisor and a supervisor from the limitation team to call me within 24 hours.

The limitation department supervisor never called me back, but the business support manager called me a few hours later, from an unknown number in the evening. He told me there had been a mistake: they added a second set of eyes to my account and agreed with me that the limitation was unnecessary and wrongfully applied. He said he just had a few questions and that I would either get a restored-access email in a couple of hours or a call asking for more information, but that there was only a small chance of the latter; realistically the account would just be restored within a couple of hours.

I never got an email or call again, so I called the following day. When I called the rep basically told me there's no evidence of a call and there are no notes on my account from that person/call and nothing was moved forward for a review.

I told him that is nonsense and to look harder. He eventually tells me there's evidence of a call but no notes, they tried to reach out to that supervisor and he wasn't available so there's nothing they could do for me, the decision is final.

I'm being treated like a fraud and a criminal when I'm a legitimate entrepreneur who's processed tens of thousands of transactions successfully. I've also paid them thousands of dollars in fees and never had a negative balance or anything of the sort that would put Paypal at risk.

Now whenever I call they are extremely rude telling me the account is closed they're holding the money and there's absolutely nothing I can do.

They have been rude, lying, inconsistent, unfair and have made 0 effort to resolve this amicably.

They have 0 logical reason to hold $40k of my money for 180 days; the only reason I can think of is that they do this on tens of thousands of accounts and gain big money off the interest.

When I log in to my account there's a notification saying they need more information from me. When I click on that notification it brings me to a page that says the account is limited because they need more information regarding my recent sales, they do not say what information or how to provide information. That is straight up illegal and a complete abuse of power.

I know there are thousands of Paypal horror stories but I genuinely feel abused. I have expenses, a family and so on and need that cash flow and no one at Paypal can be consistent for more than one phone call or help me resolve my issue, it's pathetic.

Just had to vent and hopefully this will give them some of the negative attention they deserve.

mstaoru 3 days ago 1 reply      
If you're outside of the US, it gets even more interesting.

First, you need to be VERY careful about using VPN or letting remote team members access Paypal. One misstep with, say, Hong Kong account being accessed from Ukrainian IPs, and you're blocked for a security review which drags for days and weeks.

Second, they completely neglect the unique constraints of special international shipping methods. Sometimes when you ship from China, the tracking only appears when the package reaches the destination country. Paypal considers this outright fraud, promptly returns the money to the client, and you're left with a loss.

On top of that, they will impose 3% commission for currency conversion. Did you ever hear of a bank taking 3% to convert between your multi-currency accounts? Well, "Paypal is not a bank".

Add to the mix their robotic support with that condescending tone.

No. I wouldn't touch Paypal with a ten-foot pole.

kelvin0 3 days ago 1 reply      
OK, so when I click this post, I go directly to HN's comments section. Where is the original story? Missing URL?
ahmetyas01 3 days ago 0 replies      
I worked with Paypal for almost 10 years, and I can tell you this: Paypal is an evil company.
dawhizkid 3 days ago 0 replies      
I had this happen to me. I filed a complaint with the CFPB and within a week had my account unfrozen (i.e. still shut down, but with the ability to transfer funds out).
teilo 2 days ago 2 replies      
If the OP were running an entirely legitimate business, they would have no issue revealing what product they are selling. They clearly do not want us to know what that product is. This is no doubt because we would have no sympathy for them if we knew.

That being the case, I'm calling bullshit.

jccooper 3 days ago 1 reply      
We use PayPal for a few scattered customers who have problems paying with a card and the occasional eBay sale, so I don't pay much attention to it. Whenever I see a PayPal horror story, I transfer all funds out of there to my bank. It's not the most effective sweep method, but it works depressingly well. I wonder what PayPal's cash balance would be if it had a reputation as being a safe place to keep a balance?
coupdejarnac 2 days ago 3 replies      
I'm looking at using Paypal for my next business, and it scares me that there is no recourse whenever an issue arises. I need to receive payments and send payments to workers, mostly in Europe. I'm doing a marketplace for jobs kind of like Upwork. I'm based in the USA, so Paypal makes it possible. Are there any alternatives to Paypal? I've been talking with Payoneer, but they have not inspired much confidence. I had my paypal account frozen about 10 years ago for a bullshit reason, and I'd like to avoid using them again.
TimMeade 3 days ago 1 reply      
We quit using paypal 8 years ago for exactly this kind of treatment. Seems it has not changed.
remx 3 days ago 0 replies      
I wouldn't move large volumes through Paypal. PP is useful for small donations and shuttling small amounts around, but not for the amounts being discussed here, because the larger the amount, the more it hurts you when things go awry.
funkyy 3 days ago 0 replies      
It might sound bad, but when I was doing volume and would travel to a different country or region, I would call PayPal central and let them know. While I am all about privacy, this always made them put a comment on my account that helped me avoid all the bad things. "We need more documents" issue? Solved in hours. $20K spike of revenue in a few days? Not a problem!
Markoff 2 days ago 0 replies      
Yeah, getting my limits lifted with further verification was an online ordeal for two or more weeks, since apparently my ID card with address and full name issued months ago was not good enough, and my printed bank record from internet banking was not enough; in the end I had to send them two other bank records from a different bank to get my account back to normal.

Also, don't get me started on how they steal 5% of my income, don't have a live chat service to resolve issues, and have an FAQ that refers to a website layout from years ago with most of the steps wrong.

Sadly, two of my vendors still don't offer me other payment solutions (well, one does wire transfer, but only for large amounts I can collect over months), so I still have to use this horrible service to avoid losing income.

grahamburger 3 days ago 2 replies      
One of my top-ten rules for staying sane on the Internet is 'Never leave any money in your PayPal account.' Served me well so far - I had my account frozen but there was only $2 in the account. PayPal is horrible.
joeclark77 2 days ago 0 replies      
Queen Victoria supposedly told her daughter on her wedding night, "Just lie back and think of England." Whenever I have to deal with PayPal, I think to myself "Just lie back and think of SpaceX." If Paypal helps Elon Musk take us to Mars, maybe it's worth putting up with at least occasionally. (You can think of Tesla if you prefer.)
uptown 3 days ago 1 reply      
Apparently no details either. Where's the story?
kaffee 3 days ago 1 reply      
As someone paying for products, I detest PayPal. I'm forced to do it rarely enough that I would pay 5-10% extra to avoid doing it. I realize I'm just one data point but perhaps there's a market here?

One case where it's especially frustrating is Etsy. There are many vendors on Etsy who refuse to accept any payment other than PayPal. (One can't use an Etsy gift card.)

Edit: add etsy note

eonw 3 days ago 0 replies      
This has been going on as long as Paypal has been around. I never allow anyone to send more than $1k to my Paypal; everything larger than that gets a check or wire. The first time I got a $5k wire, they locked me out and held my money for 90 days... it doesn't really help a growing business to have money locked. This was in 2002.
lquist 3 days ago 1 reply      
One of the most valuable lessons I've learned from HN was to autosweep my PayPal account. I am thankful that I was fortunate enough to learn this lesson before starting my company and running tens of millions through PayPal. I can't say for sure that I would still be in business if I hadn't.
Exuma 3 days ago 0 replies      
Yeah... pretty much the exact reason I don't leave any significant amount of money in PayPal at one time.
Buge 3 days ago 1 reply      
Where is the story? This post doesn't link to any article or anything.
adamio 3 days ago 2 replies      
You stopped advertising and still had passive income of $10k monthly?
ArtDev 3 days ago 0 replies      
I deleted my Paypal business account, never going back.
elastic_church 3 days ago 0 replies      
How surprising, I've never heard of this happening before

(this is sarcasm, for people who actually never heard of this happening before)

wayn3 3 days ago 0 replies      
This is a case of KYC. Paypal wants to get to know you. Just talk to them. This looks scary to people who don't deal with banks and bank-like entities, but Paypal is doing this because they need to protect themselves from aiding money laundering, which is a pretty big deal.

If you explain to them what youre doing and come up with some proof, they will release your money. They do not want to steal it. Almost all these cases revolve around someone not communicating with paypal and then acting surprised when they freeze funds.

Any bank would do that. If my bank is hit with a 100k transfer and I don't say a word about it and then appear at the local branch and demand to cash it all out without an explanation of what's going on, they will refuse that as well. And probably call the cops just to cover their asses. Seriously. Talk to them.

Reverse Engineering the Hacker News Ranking Algorithm sangaline.com
344 points by foob  1 day ago   49 comments top 4
nstj 1 day ago 2 replies      
Or you could just search for Paul Graham's posts[0] :)

> (= gravity* 1.8 timebase* 120 front-threshold* 1 nourl-factor* .4 lightweight-factor* .17 gag-factor* .1)

    (def frontpage-rank (s (o scorefn realscore) (o gravity gravity*))
      (* (/ (let base (- (scorefn s) 1)
              (if (> base 0) (expt base .8) base))
            (expt (/ (+ (item-age s) timebase*) 60) gravity))
         (if (no (in s!type 'story 'poll))
              .8
             (blank s!url)
              nourl-factor*
             (mem 'bury s!keys)
              .001
             (* (contro-factor s)
                (if (mem 'gag s!keys)
                     gag-factor*
                    (lightweight s)
                     lightweight-factor*
                    1)))))
[0]: https://news.ycombinator.com/item?id=1781417
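Transcribed into Python as a rough sketch (this drops the `contro-factor` and the `realscore` indirection from the Arc above, and the factors are my reading of that code, so treat it as illustrative rather than authoritative):

```python
GRAVITY = 1.8
TIMEBASE = 120.0  # minutes, timebase* above

def frontpage_rank(points, age_minutes, is_story=True, has_url=True,
                   buried=False, gag=False, lightweight=False):
    """Approximate HN front-page score: (points-1)^0.8 decayed by age, times one penalty."""
    base = points - 1
    base = base ** 0.8 if base > 0 else base
    rank = base / (((age_minutes + TIMEBASE) / 60.0) ** GRAVITY)
    # The Arc `if` picks exactly one multiplier, in this order:
    if not is_story:
        factor = 0.8
    elif not has_url:
        factor = 0.4      # nourl-factor*
    elif buried:
        factor = 0.001
    elif gag:
        factor = 0.1      # gag-factor* (contro-factor omitted here)
    elif lightweight:
        factor = 0.17     # lightweight-factor*
    else:
        factor = 1.0
    return rank * factor
```

With gravity at 1.8, an hour-old story needs roughly twice the points of a fresh one to hold the same rank, and a buried story is effectively removed.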

saycheese 1 day ago 0 replies      
Anyone interested in the topic of HN's ranking algorithm should look through the HN's submission archives:


For example:https://medium.com/hacking-and-gonzo/how-hacker-news-ranking...

jkchu 1 day ago 0 replies      
I wrote a piece a while back around implementing your own ranking algorithm. Thought this audience might find it relevant/interesting.


foob 1 day ago 4 replies      
This is the article that was discussed in yesterday's "The stories that Hacker News removes from the front page" [1]. After speaking with @dang, it sounds like what happened with the original submission was that a moderator accidentally put "(2010)" in the title and users flagged it because they incorrectly thought it was old. He invited me to resubmit the article today to allow for real discussion and to demonstrate that what happened to the first submission was accidental.

I know that this analysis will get less attention than the one from yesterday, but I personally find it far more interesting and hope that it can stand on its own merits. I'll be around to answer any questions that might come up.

[1] - https://news.ycombinator.com/item?id=13857086

       cached 16 March 2017 04:11:01 GMT