Hacker News with inline top comments (7 May 2017)
Uber faces criminal probe over software used to evade authorities reuters.com
853 points by techlover14159  2 days ago   545 comments top 40
gotothedoctor 2 days ago 11 replies      
Seems like a lot of people are confused/have questions about why Uber is being criminally investigated by the Department of Justice.*

Here's why:

1. Uber is subject to the laws in the jurisdictions in which it operates. Evading authorities is textbook obstruction of justice. Not only did Uber build software that they used to evade authorities & break local laws in multiple states and countries, but they profited from it (which has a variety of other RICO implications.)

2. Sure, corporations are people too, but, nonetheless, only people can engage in civil disobedience. Relatedly, in the eyes of the courts, a company that profits from violating local laws is not a protester or freedom fighter battling injustice; it is a criminal enterprise.

3. Charging Uber under RICO would by no means be unusual or a stretch; this is a quite run-of-the-mill application of these laws. (e.g. see Preet Bharara's RICO prosecutions: https://www.google.com/search?q=preet+bharara+rico+prosectio... )

4. This is unquestionably a federal matter, within the DOJ's jurisdiction. Uber operates across state lines--and used Greyball in multiple jurisdictions. That said, Uber could & likely will face criminal investigation in other jurisdictions.

5. Finally, this is definitely not Trump's revenge on Travis. Not only does it simply not work that way (USAs are independent, and it'd be beyond illegal), but this specific USA was appointed by Obama. (There were two Trump didn't fire; USA Stretch is one of them.)

(*And, yes, I am a lawyer. And, many years ago, I worked at the Department of Justice)

naskwo 1 day ago 7 replies      
Last weekend, I visited Hamburg with my wife. I was surprised when I was told that I couldn't catch an Uber. However, on each German taxi (you know, the beige ones) there was a sticker prompting me to download the "EUTaxi" (or similarly named) app, which I did.

Brilliant. I was able to summon a car within minutes, and this app also allows for paying via the app.

Uber's biggest threat is, IMO, the creation of a well-working "push to ride & pay" taxi app by other countries that are as well organised as Germany.

For me as a consumer, I couldn't care less whether I download Uber or EUTaxi, as long as I get my ride on time and with a professionally licensed driver.

samrap 2 days ago 10 replies      
At first, it seemed like legitimate software to assess spam until:

> For example, it mined credit card information to see if the owner was affiliated with a credit union used by police and checked social media profiles to assess the likelihood that the person was in law enforcement.

Yikes. I know people who have the mentality that this sort of thing is OK. Whether you're a startup that never makes it or one worth billions, at some point this kind of stuff surfaces. You can't run a successful company and get away with this stuff, especially as a startup, when everyone is all the more out to get you.

I'm still waiting for the big one that makes me quit using Uber though.

nostromo 2 days ago 9 replies      
I'm not a fan of Uber, but it seems like it's their right to decide who is allowed to use their service. (Barring protected classes, of course.)

If a regulator started creating fake user accounts in order to scrape Uber's data, I don't see why Uber can't put a stop to that.

And if a legislator is a vocal critic of Uber, I don't see why Uber should be forced to allow them to use their service.

pimmen89 1 day ago 0 replies      
I see the civil disobedience argument thrown around here now. Are you seriously saying that running a business without complying with regulations is a civil right?

Yes, the regulations can be argued to be wacky, but I don't see how they infringe on civil rights. Rosa Parks couldn't stop being African American, but Uber can switch to another business plan.

sillysaurus3 2 days ago 1 reply      
It's so strange how quickly all of this happened.

Actually I guess it's been awhile. https://twitter.com/dhh/status/504374011711594496 was in 2014, wow.

ianamartin 2 days ago 1 reply      
Kalanick must have known this was on its way when he backed out of the Code Conference a few days ago. Even with just the events of the last few months, he was already going to be on the hot seat facing Kara Swisher and Walt Mossberg.

And you know neither of them were going to pull any punches.

John Gruber quipped that cancelling that interview was probably the smartest thing Kalanick has done in a while. I see that point, but I slightly disagree. If Uber could come up with some kind of decent response to all of the recent shitstorms including this one, that would be a great platform to spread the message.

Cancelling reads to me like they are afraid to answer tough questions because they have no good answers.

On the other hand, I wouldn't want to tangle with Kara Swisher in particular. She's tenacious; she doesn't care who you are or how big your company is; she will tear your intestines out your butthole and feed them to you while the entire technology world watches. I love her. She and Walt are shining examples of a free press holding the feet of the powerful to the fire when needed.

The rest of the media could learn a thing or two from them.

But back to the point, yeah, I bet he's glad he doesn't have to respond to this as well.

linkregister 2 days ago 6 replies      
This is really big; if they were misusing credit card records like that, they risk losing their PCI accreditation. There is probably some criminal aspect to violating PCI as well. Does anyone here know more about how PCI works?
xixi77 2 days ago 5 replies      
What part of this, exactly, is criminal? I mean, isn't any business free to discriminate and refuse service to anyone they like (for example government employees, hackers, or Democrats), as long as those people are not in one of several protected categories like sex/race/national origin/etc.? Or does this violate consumer privacy laws, given how the CC info was used?
stepitup 2 days ago 6 replies      
What's really strange is that I bet a good percentage of the people reading this comment will know exactly what I mean when I refer to a: "Startup that employs mafia tactics and forces businesses to pay a protection fee, or actively fucks them up and attempts to hurt them."

Why aren't the authorities going after the ACTUAL, honest-to-god mafia startup? Sorry, I realize it's off topic here, but my first thought when I read "Justice Dept begins criminal probe" was: FINALLY!

We all know which company I'm referring to. And all I said was "employs mafia tactics and forces businesses to pay a protection fee, or actively fucks them up and attempts to hurt them". There's only one company like that in our "community". Why are they allowed to behave that way with impunity?

(Sorry to hijack this thread. Also, I have no disclaimer to make and am not associated with either company.)

rajathagasthya 2 days ago 1 reply      
Uber is having one hell of a year. Anyone have an idea whether it has significantly affected their recruitment of new people?
maverick_iceman 2 days ago 1 reply      
Intuitively, it seems that Uber committed fraud; but is there a specific law that they violated? It's not a rhetorical question, I'm genuinely curious to know.
hackuser 2 days ago 0 replies      
> If a ride request was deemed illegitimate, Uber's app showed bogus information and the requester would not be picked up, the employees told Reuters.

Ignoring the illegality of interfering with law enforcement, this is bad customer service. Not only would I resent the invasion of privacy, but that algorithm is going to have a meaningful number of false-positives. It's a crappy way to deal with customers - if you think there's a problem, tell them you are denying service.

It's hard to imagine another business doing this. Does a restaurant just not bring food? An online store just not send the clothes? I know you are thinking about shadowbanning, which I also think is crappy, but at least it's much less consequential.

terribleplan 2 days ago 2 replies      
OK, what's the over/under on the people who coded this being used as scapegoats? I'm at about 40/60 against.

Morality and legality in software is a massive frontier we hardly pause to think about.

raspasov 1 day ago 2 replies      
How is "software used to evade authorities" different from using a "radar used to detect highway patrol"?

bogositosius 2 days ago 7 replies      
Evading local regulators is a federal matter?
ausjke 1 day ago 1 reply      
Austin, where I live, rejected Uber, and it seems that was the right move.
wonderwonder 1 day ago 0 replies      
Uber needs new leadership yesterday.

The current team did the impossible and bootstrapped the company to where it is via hustle, working in the grey (and, apparently, the black), and pure drive. They should be commended, not for their methods, but just for the fact that they actually succeeded in the environment they worked in, one stacked against them.

Uber is a real company now and they need to hand it off to a proven leadership team that can guide it moving forward. Give the current leaders a golden parachute but the time has come to transition.

S3curityPlu5 1 day ago 0 replies      
Uber needs to go down; enough is enough already. How many times can you get away with criminal acts? Most likely they will just have to pay a fine again, though. It seems like corporations can get away with anything these days, and if they get caught they just pay a fine and go on operating.
snappyTertle 1 day ago 1 reply      
Sure, Uber may have broken the law (or evaded it); however, just because a law exists, doesn't make it correct. We should also question if the law should be there in the first place.
rdxm 2 days ago 0 replies      
lol.... i believe the proper phrase here is "chickens coming home to roost"
mirimir 2 days ago 0 replies      
If DPR had played a tighter game, maybe they could have managed a fully subversive ride service. But no matter how well executed, drivers would still be putting their vehicles on the line. Maybe look like hitchhiking?
jeffdavis 2 days ago 1 reply      
Doesn't seem very likely to be a serious problem for Uber. Basically bad PR.

If I were them, I would be more worried that they can't kill competitors well enough to ever be highly profitable.

dkarapetyan 2 days ago 2 replies      
So Uber is now Theranos?
bbcbasic 2 days ago 0 replies      
Another one! Can't wait to watch the Uber movie.
InclinedPlane 18 hours ago 0 replies      
Uber has nowhere to hide here. Let's say you're ordinary schlubs who are engaging in illegal activity at work, what's the dumbest possible thing you could do? Leave evidence, of course, or worse, create evidence. Such as talking about it over email. That would be super, super dumb.

Let's look at the stratospheric levels of dumbness that Uber got up to here. This is a handy checklist of things not to do if you're engaging in criminal activity:

Document their crimes by talking about them openly on official, archived communication channels such as email.

Make the criminal activity official corporate policy.

Write software to support their criminal activity, with no reasonably believable cover story.

Give their criminal conspiracy a project codename.

Even garden variety street gangs aren't this idiotic. Imagine the police pulling someone in for questioning and opening up their bag to find a notebook. Page 1 of the notebook begins with this heading: "Project Keys: Smuggling Heroin into the United States". Page 2 of the notebook is an extended description of the exact methods used to smuggle heroin past border security. Page 3 of the notebook is a list of dates, times, and individuals who have smuggled heroin into the US. And so on. No drug dealer is that stupid, because that would put you in jail for a very long time if it fell into the hands of law enforcement. And yet, here we are, Uber really is that stupid.

SCAQTony 1 day ago 1 reply      
Why is the Uber board allowing their CEO to continue to make sketchy ethical decisions? You would think they would ask Travis Kalanick to fall on his sword so the company can reboot and perhaps rebrand its image so it would seem less "sinister" towards regulators, its drivers, and the law.
Jabanga 2 days ago 2 replies      
The laws Uber is accused of facilitating the violation of are themselves tyrannical abridgements of personal liberty. The entire exercise, from the local laws to punish Uber drivers, to the DOJ's prosecution of Uber for helping its drivers evade economic persecution, are a disgusting exercise in majority supported tyranny.
beedogs 1 day ago 0 replies      
It's fun watching the world's worst unicorn startup being turned into dog food.
ransom1538 1 day ago 1 reply      
Meanwhile, 36% of US homicides are never solved.
nafizh 2 days ago 1 reply      
Might this be blowback for Kalanick resigning from Trump's advisory council?
amelius 1 day ago 1 reply      
It seems like their company motto is: be evil.
laughingman2 1 day ago 3 replies      
Lots of Anarcho-Kapitalists drunk on Ayn Rand Kool Aid.

Travis is a criminal who has marketed his breaking of labor laws and civil laws as "disruption".

Maybe this is what happens when you get billions of investment dollars without earning them the old-fashioned way, by making profits. Financialism has blinded America.

thr0waway1239 1 day ago 0 replies      
Mark Zuckerberg must be feeling ecstatic. You don't need to outrun the negative PR bear, you only need to outrun your idiot fratbro friend.
topitguys 1 day ago 1 reply      
What?? Looks like the sharing economy is really taking a lot of hits. I read somewhere that there is a conspiracy to defame companies like Uber and Airbnb. Both are doing so well and helping people big time in such an overpriced market.
Kinnard 2 days ago 4 replies      
>the Greyball technique was also used against suspected local officials who could have been looking to fine drivers, impound cars or otherwise prevent Uber from operating, the employees said

Doesn't Uber have a responsibility to protect itself and its drivers from fake riders looking to do harm even if they're government employees??

I think going after average or in many cases poor people trying to make a buck driving Uber is an "aggressive tactic"

grandalf 2 days ago 5 replies      
Edit: Please don't down-vote this. Up-vote it and argue articulately against it!

The only thing more embarrassing for authorities than having propped up a corrupt medallion taxi system for decades is this sort of probe.

In order to disrupt the corrupt medallion system it took billions of dollars and algorithms to evade the officials who had been tasked by the corrupt medallion industry to leverage small compliance technicalities to sabotage Uber in specific markets.

Every municipality that had a medallion system that was disrupted by Uber was effectively humiliated. Uber revealed just how inefficient and profligate those systems are.

The quality of car service everywhere Uber serves is supremely better than it had been before Uber. We can now get a car in minutes and see the ETA update as the driver approaches.

So many of us found it infuriating to call 333-TAXI (or equivalent) and be told "5 to 30 minutes" no matter how much demand was going on. Then when the cab failed to show up after 40 minutes, a follow-up call would yield "it should be another 5 to 30 minutes" after which the operator would simply hang up.

It took Uber's vision (and YC's vision in supporting it) to move the world forward into the future. We should all realize that the officials Uber had to fool using its algorithms were the foot soldiers of backwardness and corruption.

iloveluce 2 days ago 3 replies      
Everyone really is piling on Uber. I really hope this isn't some sort of Justice Department revenge against Travis for having left the Trump advisory council [0]

[0] https://www.nytimes.com/2017/02/02/technology/uber-ceo-travi...

Prepack helps make JavaScript code more efficient prepack.io
814 points by jimarcey  3 days ago   219 comments top 36
chmod775 3 days ago 4 replies      
I just ran this on a huge JS project that has a quite intensive "initialization" stage (modules being registered, importing each other, etc.), and prepack basically pre-computed 90% of that, saving some 3k LOC. I had to replace all references to "window" with a fake global object that only existed within the outer (function() {..})() though (and move some other early stuff with side effects to the end of the initialization), to get it to make any optimizations at all.

Very impressive overall.
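
A minimal sketch of the workaround described above (all names are hypothetical): the bundle closes over a stand-in for `window` while Prepack evaluates it, and the stand-in is only copied onto the real global at the very end.

```javascript
// Hypothetical sketch of the fake-global trick described above.
// Prepack can't model the browser's `window`, so initialization
// writes go to a plain object instead, and the real environment
// is only touched once, at the very end.
(function () {
  var fakeWindow = {}; // stand-in for `window` during prepacking

  // ...module registration / initialization (illustrative)...
  fakeWindow.appVersion = "1.0.0";
  fakeWindow.modules = { one: 1, two: 2 };

  // Side-effecting setup happens last, against the real environment.
  if (typeof window !== "undefined") {
    Object.assign(window, fakeWindow);
  } else {
    global.result = fakeWindow; // e.g. when run under Node for testing
  }
})();
```

Because all the writes target a plain object, Prepack can evaluate them at build time; only the final copy-out remains a runtime effect.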

yladiz 3 days ago 5 replies      
I hate to bring this up whenever I see a Facebook project, but it still warrants saying: the patents clause in this project, like in others including React, is too broad. I really wish they made a "version 3" that limited the scope of the revocation to patents pertaining to the software in question, e.g. Prepack or React, rather than a blanket statement that covers any patent assertion against Facebook. While I suppose the likelihood of this occurring is small, I can imagine a company that holds valid patents unrelated to Prepack (something related to VR, say) that Facebook infringes upon, and that also uses software Facebook produces, like Prepack; it could sue Facebook for infringement and then lose the right to use Prepack as a result. From my understanding these kinds of clauses are beneficial overall, but the specific one that Facebook uses is too broad.

Tangentially related: what would happen if you did sue Facebook for patent infringement, and continued to use this software?

dschnurr 3 days ago 2 replies      
This is cool. It's worth mentioning that you might be trading runtime performance for bundle size, though; here's a contrived example to demonstrate: http://i.imgur.com/38CR3Ws.jpg
chime 3 days ago 3 replies      
This has promise but still needs more work. I added one line to their 9 line demo ( https://prepack.io/repl.html ) and it ballooned to 1400+ lines of junk:

  (function() {
    function fib(x) {
      y = Date.now(); // the useless line I added
      return x <= 1 ? x : fib(x - 1) + fib(x - 2);
    }
    let x = Date.now();
    if (x * 2 > 42) x = fib(10);
    global.result = x;
  })();
I understand Date might not be acceptable for inner loops but a lot of my code that deals with scheduling would benefit significantly if I could precompute some of the core values/arrays using a tool like prepack.
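
For reference, on the original demo (without the added `Date.now()` call inside `fib`), Prepack can fold the recursion away entirely; its output is roughly equivalent to the following (variable names are illustrative):

```javascript
// Roughly what Prepack emits for the original 9-line demo: fib(10)
// is precomputed to 55, while the Date.now() branch has to stay,
// because the current time is unknowable at build time.
(function () {
  var _x = Date.now();
  global.result = _x * 2 > 42 ? 55 : _x;
})();
```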

NTillmann 3 days ago 17 replies      
Hi, I am Nikolai Tillmann, a developer on the Prepack project. I am happy to answer any questions!
jamescostian 3 days ago 1 reply      
The examples are very far from the JS I see and read, but this is definitely a very useful tool. It seems like gcc -Olevel. It would be interesting to incorporate some sort of tailoring for JS engines into this, like how a compiler might try to make x86-specific optimizations. For example, if you know your target audience mostly runs Chrome (or if the code is to be run by node), you might apply optimizations to change the code to be more performant on V8 (see https://github.com/petkaantonov/bluebird/wiki/Optimization-k... for example).

I love it and can't wait to use it on some projects!
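
One concrete item from that bluebird wiki (simplified here): in V8's Crankshaft era, functions that leaked the `arguments` object were refused optimization, so a rest parameter was the engine-friendly rewrite.

```javascript
// Leaking `arguments` to another function was a classic V8
// "optimization killer" (Crankshaft era). Both functions below
// compute the same thing, but the second was JIT-friendly.
function slowSum() {
  // `arguments` escapes into slice() here
  return Array.prototype.slice.call(arguments).reduce((a, b) => a + b, 0);
}

function fastSum(...nums) {
  return nums.reduce((a, b) => a + b, 0);
}

console.log(slowSum(1, 2, 3), fastSum(1, 2, 3)); // 6 6
```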

ianbicking 3 days ago 2 replies      
A long time ago there was a theory about using Guile (the GNU Scheme) as a general interpreter for languages using partial evaluation: you write an interpreter for a language in Scheme, use a given program as input, and run an optimizer over the program. This turns your interpreter into a compiler. I played around with the concept (making a Tcl interpreter), and it even kind of worked, often creating reasonably readable output.

Prepack looks like the same kind of optimizer; it could be a fun task to write an interpreter and see if this can turn it into a compiler/transpiler.

xg15 2 days ago 1 reply      

  function define() {...}
  function require() {...}

  define("one", function() { return 1; });
  define("two", function() { return require("one") + require("one"); });
  define("three", function() { return require("two") + require("one"); });

  three = require("three");

Prepack turns this into:

  three = 3;
There is a certain irony that now it's possible to do optimisations like that in javascript - a dynamically typed language with almost no compile time guarantees.

Meanwhile java used to have easy static analysis as a design goal (and I think a lot of boilerplate is due to that goal) but the community relies so much on reflection, unsafe access, dynamic bytecode generation, bytecode parsing etc that such an optimisation would be almost impossible to get right.

untog 3 days ago 1 reply      
This should have a big impact on the "cost of small modules", as outlined here:


Which is to say, one of its most effective use cases will be making up for deficiencies in Webpack, Browserify, and RequireJS. Which I'm a little ambivalent about - I wish we could have seen improvements to those tools (it's possible, as shown by Rollup and Closure Compiler) rather than adding another stage to filter our JavaScript through. But progress is progress.

tyingq 3 days ago 1 reply      
How "safe" is it? I'm thinking, for example, of Google's closure compiler and the advanced optimizations, which can break some things.

Or roughly, if it compiles without errors, is it safe to assume it won't introduce new bugs?
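
It's not fully safe in that sense. A classic illustration (my own, not from the thread) of how Closure's ADVANCED_OPTIMIZATIONS can break correct code: the compiler renames dot-accessed properties but never the string in a bracket access, so mixing the two styles on the same property can silently diverge after compilation.

```javascript
// Before compilation, both accessors agree. Under Closure's
// ADVANCED_OPTIMIZATIONS, `endpoint` may be renamed (e.g. to `a`)
// in the object literal and in dotAccess, but the "endpoint" string
// in bracketAccess is left alone -- so it would return undefined.
var config = { endpoint: "/api/v1" };

function dotAccess(c) { return c.endpoint; }        // subject to renaming
function bracketAccess(c) { return c["endpoint"]; } // string kept as-is

console.log(dotAccess(config), bracketAccess(config)); // "/api/v1" "/api/v1"
```

So "compiles without errors" is no guarantee; code has to follow the compiler's conventions (or declare externs) to survive advanced mode.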

bthornbury 3 days ago 2 replies      
Awesome project, the performance gains seem real, but why wouldn't these optimizations be happening at the javascript JIT level in the vm? (serious question)

React/JavaScript programming is the most complex environment I've ever dug into, and it's only getting more complex.

create-react-app is great for hiding that complexity until you need to do something it doesn't support and then it's like gasping for air in a giant sea of javascript compilers.

vikeri 3 days ago 5 replies      
I was under the impression that V8 and the like are so optimized that this would give marginal gains. Would love to be wrong though. Do you have any performance benchmarks?
Kamgunz 3 days ago 1 reply      
Very interesting. Nobody has mentioned how formal and quite technical this README is: it goes into real detail about what the tool does, and even lays out future plans in three sections across 30 bullet points. One bullet point in the really-far-future section says "instrument JavaScript code [in a non observable way]" (emphasis mine), and that phrase is repeated in several other bullet points. It seems to me every compiler/transpiler/babel-plugin changes JavaScript code in a non observable way, no? Just a theory, but that undertone sounds to me like the ability to change/inject into JavaScript code undetectably, on the fly, in some super easy way.

Just another day at Facebook's office...

arota 3 days ago 2 replies      
This is exciting, and has a lot of potential to significantly improve JS library initialization time.

I wonder if this is the same project[0] Sebastian McKenzie previewed at React Europe 2016?

[0] https://www.youtube.com/watch?v=xbZzahWakGs

dandare 3 days ago 3 replies      
What is the business model for a tool like this? Who has the resources to spend man-years of work while also creating such a fantastic, simple yet comprehensive landing page?
drumttocs8 3 days ago 3 replies      
Coming from a non-CS background, I've always wondered why you can't "convert" code from one framework or paradigm to another. For instance, converting a project from jQuery to React. If you can define the outputs, why can't you redefine the inputs? That's what it seems like this project does... I suppose converting frameworks would be a few orders of magnitude harder, though.
mstade 2 days ago 0 replies      
I'm happy to see there's an option to use this with an AST as input, more tools like this should follow suit. Hopefully it can then push us to a place where there's a standard JS AST such that we don't reinvent that wheel over and over. Babel seems to be winning here, but I don't think it matters so much which one wins so long as any one does.

This tool looks interesting, particularly its future direction, but I'm wary about efficiency claims without a single runtime metric posted. The claims may be true (initializing is costly), but so is parsing huge unrolled loops. For an optimization tool, I'd hope to see pretty strong metrics to go along with the strong claims, but maybe that is coming?

Interesting work, nonetheless!

aylmao 3 days ago 0 replies      
Facebook's javascript / PL game doesn't disappoint. This is awesome!
ericmcer 3 days ago 0 replies      
Pretty cool, it did not make much difference in my application size, as it has very little static data in it. It seems pretty rare to do something like:

and more common to do:


iamleppert 3 days ago 6 replies      
Not a comment about the tool, which looks cool and well done.

It's sad that there are developers and projects who write the type of code that causes these sorts of performance trade offs. I stopped writing this kind of fancy code a long time ago when I realized it wasn't worth it. You're just shooting yourself in the foot in the long run.

I think static analysis performance optimization tools are great but a certain part of me thinks it just raises the waterline for more shitty code and awful heavy frameworks that sacrifice the user experience for the developer experience.

"Just run it through the optimizer" so we don't actually have to think about what a good design looks like...

frik 3 days ago 2 replies      
How does it compare to Google's Closure Compiler? It is considered by many best in class. It understands the code (it uses the Java-based Rhino JavaScript engine), while most alternatives (UglifyJS & co) just monkey-patch things. You can trust the Closure Compiler's output.

Edit: @jagthebeetle: have you tried "advanced mode"? (One should read the documentation before using it; it's really a game changer, but it requires reading the docs first.)

Waterluvian 3 days ago 0 replies      
What percentage of typical code is packable like this? What I really need is a way to easily determine, "is it worth bothering with a tool like this?"
kasper93 3 days ago 0 replies      
I think that just-in-time compilers are better at doing their thing. Sure, it is a nice project that can interpret and print preprocessed JS, but I think it might in fact not bring speedups in most cases.

And in its current state it doesn't even know how to constant-fold this loop:

  function foo() {
    const bar = 42;
    for (let i = 0; i <= bar; i++) {
      if (i === bar) {
        return bar;
      }
    }
  }
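
Hand-folding it for illustration: the loop always terminates at i === bar, so the whole function reduces to a constant.

```javascript
// The loop above, made runnable, plus the constant it folds to.
function foo() {
  const bar = 42;
  for (let i = 0; i <= bar; i++) {
    if (i === bar) return bar;
  }
}

// What a constant folder would ideally reduce it to:
function fooFolded() {
  return 42;
}

console.log(foo() === fooFolded()); // true
```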

hdhzy 3 days ago 1 reply      
This looks very good indeed, but the lack of an initial data model severely limits the production usability of this tool. You can't use "document" and "window" ...

It's the same problem TypeScript has/had: for external libs you need definition files for it to work. Now if we had a TypeScript-to-assumeDataProperty generator, that would be VERY interesting!

kamranahmed_se 2 days ago 0 replies      
> helps make javascript code more efficient


Are you sure?

kccqzy 3 days ago 0 replies      
This reminds me of Morte, an experimental mid-level functional language created by Gabriel Gonzalez. They both seem to be super-optimizing, that is, partially executing the program in question. Of course it is a great deal easier to do in a functional language than in JavaScript.


KirinDave 3 days ago 0 replies      
I wonder what this would do to Purescript code?
jlebrech 2 days ago 0 replies      
I want something that can separate my code into what can be precompiled into wasm and what has to stay in JS. Maybe just insert comments so I can see what needs to be done.
Traubenfuchs 2 days ago 0 replies      
I can't get anything to work in it. Just for fun I put the non-minified vue.js source in, and I get:

null or undefinedTypeError at repl:537:23 at repl:5:16 at repl:2:2

reaction 3 days ago 0 replies      
Has anyone used this with webpack + reactjs ?
avodonosov 3 days ago 0 replies      
How does one measure the performance improvement for a web page gained from such tools?
k__ 2 days ago 0 replies      
Just throw your webpack bundles in and be amazed.
iMark 3 days ago 0 replies      
The destination page looks uncomfortably like Webpack's.

Not the best idea, imho.

Puerto Rico files for biggest ever U.S. local government bankruptcy reuters.com
616 points by chollida1  3 days ago   458 comments top 33
knob 3 days ago 19 replies      
I have lived all my life in Puerto Rico, and as you can imagine, this issue is quite controversial. We owe money to the creditors... and what you do with a debt is pay it off. Yet the amount is so staggering that I wonder if it's actually possible.

As is typical, decisions by politicians placed us in this situation. Decade after decade, 4-year term after 4-year term, the government has spent money it does not have. Be one political party or the other, it is the same thing.

There was a big demonstration this past Monday, where various unions, groups, and students held a "Paro Nacional" (national strike). Truthfully, I don't think they accomplished anything, other than various idiots vandalizing property they don't own. We are in deep shit, and it's going to get worse.

Lots of people are leaving the island, which just compounds the problem (less revenue).

I don't leave because: it's where I was born, where I have lived my entire life, and it is, honestly, paradise.

Obligatory John Oliver's Puerto Rico segment: https://www.youtube.com/watch?v=Tt-mpuR_QHQ

What will happen? I have no idea. Good to see this on HN.

6stringmerc 3 days ago 4 replies      
All that text about Puerto Rico's financial difficulties, and not one mention that the debt, spun up from a reasonable amount to the $XB that is likely to default, is the result of graft, corruption, and fraud enabled by many US and international actors.

Want to take a guess how many of those responsible for the transactions, debt issues, and back-office fund transfers are in jail and all their assets seized to repay the debt, a la Civil Forfeiture for drug crimes in the US?

I feel bad for the citizens of Puerto Rico, because here in Dallas the Fire and Police Pension fund is probably insolvent due to a combination of graft and incompetence as well, and those responsible in both circumstances seem to just shrug it off and go on with life, their pockets lined with more than 90% of people will ever see in 20 years of hard work.

Talking about the debt in Puerto Rico without firmly acknowledging the conditions that led to it is irresponsible, in my opinion. It's hard to separate the two. And if we're going to talk about accountability for the latter, then we should nail those who committed the former first. Likelihood of that? Heh, yeah, I'm not optimistic - but that doesn't mean I can't espouse what I believe to be a justifiable alternate course.

grandalf 3 days ago 3 replies      
When issues of local government financial problems come up, I think it's important to remember that local governments are typically not able to undertake deficit spending the way the Federal government is.

If the US Federal government were prevented from running a budget deficit, we'd likely see a lot more solvency issues and financial failures of many sub-organizations within government due to bad budgeting. It would also be a lot harder for our leaders to start wars or get away with sloppy cost estimates of their grandiose ideas.

My argument is not that government should be small, simply that its spending should reflect its income and the results of its spending should be easily correlated with their cost.

Since the Federal government takes the lion's share of income taxes, local governments are constrained in their ability to generate income. Yet for most people, local governments provide the vast majority of the useful government services they enjoy.

So while it may seem that the Federal government is comparatively stable and responsible, the reality is that it's simply far less accountable to anyone and is able to use that lack of accountability to launder its reputation. States (or local governments) do not have the same luxury.

EternalData 3 days ago 3 replies      
I feel like local governments are going to be the first to really be affected by significant pension overlays that are not properly accounted for -- a lot of pension funds assume rates of return that are historically farcical. I think the industry average used to be about 8%.

"During the 20th Century, the Dow advanced from 66 to 11,497. This gain, though it appears huge, shrinks to 5.3% when compounded annually."


It doesn't augur well for the future of stable financial markets. Weak localities will fail, and eventually so will the states that have to support them. Puerto Rico is the canary in the coal mine.

creaghpatr 3 days ago 6 replies      
A few months ago I discussed this with friends and suggested the government subsidize cheap flights to Puerto Rico, say $30 for a round trip flight from Atlanta, scaled up depending on domestic distance.

This would cost taxpayer money but would drive a ton of tourism to Puerto Rico (albeit temporarily) and pump a bunch of money into their economy, defibrillator style. I've never been there but given a cheap round trip flight I would easily go and spend money.

The alternative appears to be some kind of bailout and/or debt restructuring and I don't see that working out in the long term, which would presumably cost taxpayers even more money.

rburhum 3 days ago 1 reply      
For the Spanish speakers of HN, NPR's Radio Ambulante has an excellent episode (called "Deuda") that they did last year explaining the complexity of the situation.

The short version is that a lot of the debt is held by the people of PR themselves in form of bonds that were sold when the economy was good.

The reason it was artificially good is that for a long period it had great US tax benefits compared with other US territories/states. During that time several corporations (particularly pharma) had factories and jobs there, and there was a boom in bond sales; a lot of predatory practices reminiscent of the US housing bubble were done there, too. Once the tax loophole was closed, corporations left the island, jobs went to shit, the bonds could not be repaid, and, well, you get the picture. Except that in this case, "not paying debt" literally means not paying a huge chunk of all the life savings that people poured (arguably even patriotically) into their own bond system (see the parallels to the housing crisis?).

You could technically boost the economy there by reintroducing similar tax loopholes, but it would be a temporary fix, and there would be significant struggles in the Senate and Congress... Taxation without representation sucks big time...

Link to podcast: https://16683.mc.tritondigital.com/NPR_510315/media-session/...

djsumdog 3 days ago 6 replies      
Puerto Rico should really be a state by now. Would being a state have given it any distinct advantages in this type of situation (or are there any specific problems in light of it not being a state)?
bradleysmith 3 days ago 0 replies      
I spent several months in Puerto Rico working on operations with Google's Project Loon, launching balloons out of Ceiba.

Seeing the island during the September 2016 power outage was eye-opening. It was admittedly a pretty bad event that spurred it, but portions of the island were without power for several days. Infrastructure development is definitely necessary, particularly considering the possibility of storms hitting there.

It is a lovely island, I hope this manages to nudge along solutions from what seemed to be a stagnating problem.

oneguynick 3 days ago 0 replies      
As you hear about Puerto Rico the next few days, take some time to get smart on the history and dynamic. Highly recommend Congressional Dish Podcast that came out a few weeks ago. Very good - http://www.congressionaldish.com/cd147-controlling-puerto-ri...
sidmitra 3 days ago 0 replies      
There is an excellent episode on the podcast Radio Ambulante on exactly this. There are accounts of a few people living there, small business owners, etc., if I recall correctly.


English translation: http://radioambulante.org/en/audio-en/translation/translatio...

elihu 2 days ago 0 replies      
I'm not super knowledgeable, but this seems to me like a good thing. If investors lend money to a government that can't repay, they should lose some of that money. I don't think whole economies should remain in a state of economic slavery.

Is there some way in which this move might turn out bad for the people of Puerto Rico, aside from it probably being more difficult to raise debt in the future at a reasonable rate?

WillyOnWheels 3 days ago 0 replies      
The Marxist hidden in me says

* listen to Richard Wolff's views on Puerto Rico: http://www.rdwolff.com/tags/puerto_rico

* listen to David Graeber: https://www.amazon.com/Debt-First-5-000-Years/dp/1612191290

gwbas1c 3 days ago 1 reply      
A lot of people forget that government debt is often funny-money. Defaulting on debt is very different than not paying your friend back.
kqr2 3 days ago 0 replies      
For a good intro to the Puerto Rico financial crisis, see this Planet Money podcast:


kirse 3 days ago 0 replies      
Are there any investment opportunities amidst this news? I've always wondered how these massive bankruptcies play out.
ComodoHacker 2 days ago 0 replies      
TIL there are territories of the United States where the U.S. Constitution is not fully applicable.
ksec 2 days ago 0 replies      
OK, so China's economy is on the verge of collapse (as the media likes to paint it); it is not sustainable, but they are carefully trying to control it.

EU is, Brexit, Greece ( tiny compared to P.Rico ).....

And now U.S.

It seems we have more problems than we had after the money-printing machine was turned to full power in 2008.

This may sound naive, but did we actually have a recession in the past 20-30 years? We may have had contractions, but an actual recession? 2008 was like a come-and-go event in such a short time frame.

And an even more stupid question: is constant GDP growth sustainable? Or, if we include the lowered value of currency, have we been in the negative already?

justforFranz 3 days ago 0 replies      
Does anyone know how much the US federal govt is on the hook for?
danschumann 2 days ago 1 reply      
Their governor looks like Michael Scott. I've never understood why governments go into debt. It's just stupid.
nom 3 days ago 0 replies      

I'll just leave this here.

masterleep 3 days ago 3 replies      
Illinois badly needs to do this as well.
rrggrr 3 days ago 2 replies      
> As is typical, decisions by politicians placed us in this situation. Decade after decade, 4-year term after 4-year term, the government has spent money it does not have.

Every time every one of us votes for any politician who favors any deficit spending - and this is most of us - we give aid and comfort to this outcome. We all want fiscal responsibility until it's a cause that touches us emotionally.

faragon 2 days ago 0 replies      
Apple could buy that debt and rename the island to "Apple Island", etc.
cpr 3 days ago 0 replies      
As Maggie Thatcher said: eventually you run out of other people's money.
azinman2 3 days ago 0 replies      
On the bright side, perhaps it's now a great time to buy property in PR?
guelo 3 days ago 0 replies      
The PROMESA bill is disgusting. An unelected board filled with bankster-friendly technocrats will now overrule democracy in Puerto Rico. The island is in for a generation of recessions and low growth as the entire economy will be permanently directed towards paying off bondholders.
hindsightbias 3 days ago 0 replies      
Why not sell off the island, or part of it, for seasteading?
digitalneal 3 days ago 1 reply      
Wonder how bad it would be if they didn't subsidize rum production so heavily.
_delirium 3 days ago 2 replies      
golemotron 3 days ago 0 replies      
Debt: The Next 5000 Years
kyleblarson 3 days ago 0 replies      
In 30 years bankruptcies like this one will look like peanuts compared to the collapse of massively underfunded government employee pension funds that will begin happening soon.
Taylor_OD 3 days ago 0 replies      
I was in Puerto Rico a year or so ago. I'm by no means an expert, but it seemed like an odd place to live, mostly because it's not a US state yet a good chunk of the people seem to want it to be.
mirekrusin 3 days ago 1 reply      
Should I cash out my USD now, or should it be safe for a few more months?
Something is wrong when the telephone app on your phone becomes 3rd party martinruenz.de
581 points by guy-brush  2 days ago   329 comments top 53
Voloskaya 2 days ago 10 replies      
Or it just means that we still call the device in our pocket a "phone" for legacy reasons. If we are okay with having a third party handle our messages, VoIP, etc., why not the phone app?
mcherm 2 days ago 8 replies      
I completely disagree with the title. The fact that "telephony" can be an app on the phone is a WONDERFUL thing. It means that the author of this article has a choice, as opposed to NOT having a choice.
NeutronBoy 2 days ago 3 replies      
> This leaves me in an unpleasant spot as I, where I can, avoid using google services and now need to find an alternative dialling application. Isn't this sweet? I am searching for a dialling application for my smartphone. A DIALLING application

You bought a Google phone and don't want to use Google services, but complain that the dialler can be provided by a third party? Isn't that a good thing?

tomkarlo 2 days ago 0 replies      
It's always been a "telephone app" on smartphones. And in many cases it's already been somewhat "3rd party", because in reality it's from the ODM who made your phone or the SoC vendor, not the company that branded your phone. The author is only just becoming aware that this is "3P" because he happens to have a branded dialer added on his phone, versus a "white label" dialer that was already there.

I'm fairly certain Wileyfox didn't make the dialer that previously came with his phone, either. (They're a smaller OEM.) They probably just used the one provided by the ODM assembling the device for them.

Animats 2 days ago 2 replies      
It's a real problem. Are there any voice dialing programs for Android phones which do voice recognition locally and don't require Google services? That's the way it used to work until Google broke it so they could monitor all your dialing.
gruez 2 days ago 3 replies      


>Choose precisely the data you wish to share; protect apps with additional PINs; prevent spam with Truecaller Integrated Dialler.

So their idea of privacy is privacy from everyone except the manufacturer (and "trusted" third parties)

>Of course, you can always root the phone and install custom roms. But this process takes some time and the development and compatibility with these roms is less than satisfactory.

Maybe he shouldn't have bought a device with such a small userbase?

gumby 2 days ago 2 replies      
I don't agree. It's simply that some vendors are untrustworthy.

Frankly if the telephone app on my "phone" stopped working I wonder how long it would take me to notice.

leeoniya 2 days ago 0 replies      
i just bought a Volkswagen, which comes with Car-Net, which apparently relays telemetry to Verizon Telematics via a 3G connection. oh and there's a gps receiver and an in-car microphone.

their privacy policy is scary.


dealer refuses to disable or remove this carnet module. their "solution" is to tell me just not to sign up for the services. ummm..lol no. needless to say i'm taking matters into my own hands via vw message boards and:


may have to physically remove the module and/or neuter the antenna.

walrus01 2 days ago 2 replies      
I have never heard of a "Wileyfox Swift". If you need an awesome and capable Android 7.0 based dual SIM phone with zero carrier crapware/bloatware and a close to stock Android experience, the OnePlus 3T (64 or 128GB version) is a good choice.

Since Oneplus' falling out with Cyanogen Inc, and the financial failure of Cyanogen, Oneplus' own OxygenOS is essentially a re-implemented CyanogenMod that has all of the same features.

ranveeraggarwal 1 day ago 0 replies      
Which is why I switched to Google-made phones (Nexus). No bloat, frequent security updates, and default apps. Sure, Google knows what I'm having for dinner, but it's a single entity I trust. At least I don't have to go moving through apps, reading their TnCs.
bostand 2 days ago 1 reply      
This is why we need [edit] GDPR.

Next time someone tries to harvest my personal data using an "all-inclusive" EULA I'm going to sue his ass in EU-land.

thomseddon 2 days ago 0 replies      
Found myself in the exact same situation (have a WileyFox, updated and ended up with the awful true caller app as the default dialler + found I couldn't use the default google dialler), I've since been using: https://play.google.com/store/apps/details?id=com.contapps.a... but it does have adverts, I'd welcome any other suggestions (aside from re-flash).

Only today I was joking about how absurd it is that I was struggling with such a fundamental feature.

timsayshey 2 days ago 0 replies      
Too much traffic -- couldn't access the page.

Here's the cached version: http://archive.is/vs4a7

tdicola 2 days ago 2 replies      
Looking at my cell phone bill I use far more data than I do actual voice a month. It's kind of an anachronism to call it a phone anymore. It's a pocket computer.
nemoniac 2 days ago 0 replies      
Can someone recommend a good, safe, privacy-respecting 3rd party phone app for Android, preferably for Cyanogenmod or LineageOS?
bighi 2 days ago 0 replies      
You're not only using Android, but you're using an Android phone from a smaller company.

While this is bad, it's not really unexpected.

notsohuman 1 day ago 0 replies      
Something's wrong with the idea that there can be apps installed on your phone that do things you don't know about.
on_and_off 2 days ago 1 reply      
What exactly is wrong with the dialler being an app ?

It allows exactly what the author wants with his specific desire for a dialler that does not rely on play services or true caller.

bbulkow 1 day ago 0 replies      
One thought: a company like WileyFox must spend significant bucks on after-the-fact engineering to provide updates. It would be a classic error to underestimate the cost of ongoing engineering, offer a lower up-front price, and then be strapped for cash later. At that point, WileyFox management needs a revenue stream to fund the engineering to push 7.1.1. It would - again, I am guessing - typically turn to outside companies who pay for placement, like the dialer. Those companies make money by selling data.

Thus I am not surprised that this is happening, although I had not heard of WileyFox until today. It's also possibly WileyFox is greedy, but I'm going with incompetence; I will believe first that someone made a mistake in pricing and planning, and they're looking for the best outcome for customers, company, brand.

I am, perhaps uncomfortably, saying that "you got an unexpectedly low price" going with a company like WileyFox. They gave you a cheaper phone, and frequent updates, and now they need to pay for it. They could either start charging subscriptions for the 7.1.1 update ( which would be fair, no? ), or they could do what they are doing by making a deal with what sounds like a shady dialer company, or they could bail out and not provide the updates they promised ( like most handset providers do ).

I could get all high-horse about software consumers being trained to expect free software. That expectation being set by the billions of dollars in VC money that has been spent to capture markets - markets that later require "exploitation". But I won't go there - it's a complicated argument, there are other market forces, there are models that work, companies can be "up front" about data-driven business models.

My way of saying - you do have to pay for software, you took the 7.1.1 update, you should expect to pay..... right?

bennyp101 2 days ago 0 replies      
When I updated it at the weekend, I too noticed that the dialer had gone - which is annoying as I find the interface to Truecaller a bit of a pain - but as I rarely make calls it's something I can live with. The incoming call filtering and showing of names not in my phonebook is actually pretty useful.

When I bought my Wileyfox Swift 18 months ago it came with Truecaller as an app already installed, and it integrated with the standard dialer well - ie I could use the dialer app to make calls, and incoming calls would show info from Trucaller. Not sure why they couldn't keep it like that.

accountyaccount 2 days ago 1 reply      
Disagree entirely. The stock phone app is likely the least used app I keep on my phone.
skewart 1 day ago 1 reply      
One of the central complaints in the blog post is that it's too hard to get your hands on an Android phone that doesn't come with impossible-to-remove bloatware. I've been amazed and annoyed by this too.

I understand all the reasons why manufacturers add these apps to the phone. But still, I would gladly buy a phone that came with a stripped down version of Android, no bloatware, and received regular updates and security patches.

Sure, you can root your phone, and there are various ROMs out there that you can install. But as someone who's never done this it seems kind of complicated and annoying.

I want a simple experience where it just works right out of the box. Heck, I'd be happy if to buy a phone from someone who just resells devices after flashing a decent ROM onto them and makes updates easy somehow.

mayneack 2 days ago 0 replies      
Many have already posted about how the phone is less important than email/browser, which we're OK with being 3rd party.

I didn't really think there could be a difference between phone apps until I got project fi from Google. Their phone app comes with voicemail transcription and spam detection. Some of that is obviously from the service itself, but these features seem like things that I'd want to be able to acquire even if I had an AT&T Samsung (which is what I moved from and didn't have by default). Third party seems fine with me.

mmmBacon 1 day ago 1 reply      
I was out riding motorcycles with a friend. He went down and was pretty broken up. As the ambulance loaded him up, I tried to use his Android phone to call his wife to let her know he had been hurt and what hospital he was being taken to. It was a stressful situation but I could not find the phone function at first easily (no idea which distribution). The phone function is still really important!
10165 1 day ago 1 reply      
Is it true that carriers are increasingly switching to calling over WiFi? I understand the new iPhones have this feature, with switchover happening automatically.

Did the popularity of Facetime, WhatsApp, etc. have anything to do with this? I have seen claims from carriers that WiFi calling "improves coverage".

At some stage, will anyone question what is the point of the cellular network? Especially in urban areas.

What is the difference between

a. a "smartphone" and

b. a portable, pocket-sized computer with rechargeable battery where the user chooses what to install and can remove any pre-installed software, where user gets a choice of 1. using pre-installed software and default settings or 2. using her own bootloader, kernel and userland, and where the user can easily open the case and tinker.

Does a computer need to have any association with the company selling internet access? Today's "phones" are manufactured for carriers (who are often ISP's, too), not for users. The carriers in turn sell these customized "phones" to users.

ris 2 days ago 1 reply      
So what makes you think Apple or Google aren't using your call information for advertising purposes in the default diallers?
shipintbrief 1 day ago 0 replies      
Something is wrong when the OP writes about a dialer app that sells your private info being installed on the device without any warning, and everyone tells the OP that they don't really use the phone app. Makes me question humanity.
AngeloAnolin 2 days ago 1 reply      
Did I read that correctly?

Isn't this sweet? I am searching for a dialling application for my smartphone. A DIALLING application.

Does this mean he can't practically call someone outside his contacts list as there's no way to key in phone numbers? Or would he still be able to make regular voice calls?

Fej 2 days ago 1 reply      
> This leaves me in an unpleasant spot as I, where I can, avoid using google services and now need to find an alternative dialling application.

AOSP dialer. Done. Don't want the truedialer app? Run CM, like it says in the post.

loueed 1 day ago 0 replies      
Many companies are betting that phones will be phased out for augmented reality glasses. Magic Leap, Microsoft, Facebook, Google, and Apple all have teams working on it.

People expect these devices to accept sim cards, will this break the confusion? My Smartglasses are a personal computer that also has phone functionality.

dotBen 2 days ago 1 reply      
No, this is simply proof that consumers must pick carefully the hardware vendor they decide to buy as which software it comes with will vary greatly between devices. This is simply not an issue with Google Pixel/Nexus devices, Samsung or any of the other major vendors.

Running Cyanogen (which is now no longer developed) without 'GAPPS' (Google's standard package of apps that gives you access to the Play Store and the stock dialer) is a pretty fringe use of Android.

steveharman 1 day ago 0 replies      
Aren't there loads of free "phone" (dialer) apps in the Play store? Just use one of those?
TheStrangBird 2 days ago 0 replies      
Well, in the end, modern smartphones are just general-purpose computers that have some artificial restrictions on what you can do with them and happen to have a touchscreen/modem/speaker/microphone, etc.

So the dialer just being an app is a direct consequence of smartphones not being any kind of "special/magical" embedded device.

Though silently overriding the dialer with a program I would normally suspect to be malware, which sneaked onto my phone, is a horrible thing to do...

meroje 2 days ago 0 replies      
The Swift originally shipped with CyanogenOS, which is not the same thing as CyanogenMod. CyanogenOS already featured Truecaller on the dialler to show names on incoming calls. I was confused too when the dialler disappeared from my launcher, but I use it very rarely and essentially to receive calls so not a big deal for me. I understand OP's concerns though.
jumpkickhit 2 days ago 0 replies      
Funny, I'm reminded of my HP iPAQ back in 2005 or so. An 800 MHz smartphone with a stylus, wifi, running Windows CE.

It struck me as a portable computer, with "cellphone" functionality added as an almost afterthought.

Guess that's still the case if you think about it.

smnplk 2 days ago 0 replies      
I have an idea for a VOIP app called The Stallminator. The app would feature a nice backdrop of Richard Stallman's face: http://tinyurl.com/msbkwb3
alinspired 2 days ago 0 replies      
This issue echoes many privacy-related discussions, and the same approach for Android applies here:

 - unlock
 - install a minimal 3rd-party ROM (i.e. LineageOS)
 - choose what you want from Google via Open Gapps, or use F-Droid

thuruv 1 day ago 0 replies      
Whether that's a Google service or not, the OP is clearly making the point that the problem he's facing now will evolve and affect even us in the future.
johnhenry 2 days ago 0 replies      
Unfortunately, in this day and age, I don't think it's reasonable to expect privacy even from a first party application.
guy-brush 1 day ago 0 replies      
I just updated the article to include further emails from truecaller and wileyfox.
mdekkers 2 days ago 0 replies      
truecaller is something altogether evil
codewiz 2 days ago 0 replies      
Double-check which software you get with which device. Not being an Apple fanboy, I have to admit: at least you know what you get when buying an iPhone.

I don't understand why some people compare a $650 iPhone 7 with cheap phones loaded with some half-assed vendor fork of Android?

If you wanted stock Android, you should have bought a $650 Pixel phone.

jaimex2 1 day ago 0 replies      
And this is yet another reason why CyanogenMod no longer exists.
ianseyler 2 days ago 1 reply      
My iPhone is my pocket computer. It's on a tablet plan (data only - up to 1GB for $20 CAD) because voice and text is something I would hardly use. TextNow worked fine for me in past for phone calls and texting non-apple devices but I've since switched to Hushed as my go-to. I've never used the built in "Phone" app.
znpy 2 days ago 0 replies      
i am still wondering how come nobody has come up with a phone application for linux computers.

I have a 3G modem in my ThinkPad X220 and i am fairly sure it is technically capable of making and receiving phone calls.

dorianm 2 days ago 0 replies      
I thought it was about LinkedIn selling people's profiles :)
fulafel 2 days ago 1 reply      
How do I find a dialer app that's cloud-free?
agumonkey 2 days ago 0 replies      
I miss my motorola v3650.
InclinedPlane 1 day ago 0 replies      
It's fine. If you want a phone just for voice go buy a flip phone, they are basically free and plans are cheap. If you want a smartphone, then deal with the implications.
lujingfengjeff 1 day ago 0 replies      
pilot72 2 days ago 0 replies      
You purchased a Chinese mobile phone and it installed spyware in an update.

Anybody who's ever purchased a mobile from eBay or AliExpress has already seen that. They need to get their revenue from somewhere. Next time stick to a known, trusted brand.

mtkd 2 days ago 4 replies      
I fought having a phone through to 2008 - now I'm reconsidering wanting one again

if there was a pure messaging device right now with no voice - I'd bite their hand off

I don't need a mobile browser or even apps - all they do is stop me from disconnecting from work for a few minutes or hours - I don't think always-on is good for anyone over time

The Horror in the Standard Library zerotier.com
690 points by aw1621107  1 day ago   171 comments top 30
bluejekyll 23 hours ago 1 reply      
OMG, as I was reading this I thought, "man, this reminds me of a bug I ran into with std::string back in 2000." A few sentences later, and this is also about std::string and the STL.

Mine was different though. After tracking down a memory leak that was happening with the creation of just a new empty string, I discovered that the stdlib kept a shared pointer to the empty string, with a reference count of how many locations were using it (ironic that this was intended to save allocations). This was on Intel, and we had what was rare at the time: a multi-processor system. It turned out that the std::string empty-string reference count was incremented with just a vanilla ++; no locking, nothing, the variable not even marked volatile.

A few emails with a guy in Australia, a little inline assembly to call a new atomic increment on the counter, and the bug was fixed. That took two weeks to track down, mostly because it didn't even cross my mind that it wasn't in my code.

From that point on, I realized you can't trust libraries blindly, even one of the most used and broadly adopted ones out there.

faragon 16 hours ago 1 reply      
The problem is forgetting that dynamic memory usage is not "free" (as in "gratis" or "cheap"). In fact, using std::string in long-lived server processes doing intensive string processing (e.g. parsing, text processing, etc.) has been known forever to be suicidal, because of memory fragmentation.

For high-load backends processing data, you need at least a soft real-time approach: avoid dynamic memory usage at runtime (use dynamic memory just at process start-up or reconfig, and rely on stack allocation for small stuff, when possible).

I wrote a C library with exactly that purpose [1], in order to work with complex data (strings -UTF8, with many string functions for string processing-, vectors, maps, sets, bit sets) on heap or stack memory, with minimum memory fragmentation and suitable for soft/hard real-time requirements.

[1] https://github.com/faragon/libsrt

arunc 20 hours ago 2 replies      
I encountered exactly the same issue a few years ago at UIDAI in one of our large-scale biometric matchers, and the resolution was exactly the same. After a week of debugging I found that the libstdc++ allocator was the culprit. I found [1] and confirmed the same, which helped in fixing this issue.

The thing that was more interesting (or sad) was to know that the GCC developers didn't expect the multithreaded applications to be long running.

"Operating systems will reclaim allocated memory at program termination anyway. "

[1] https://gcc.gnu.org/onlinedocs/libstdc++/manual/mt_allocator...

alyandon 1 day ago 1 reply      
I myself ran across this same scenario many years ago, with a similar amount of hair pulling, eventually concluding that the GNU libstdc++ allocator wasn't reusing memory properly. Unfortunately, I was never able to pare down the application to the point where I had a reproducible test case to report upstream.

GLIBCPP_FORCE_NEW was the solution for the near term and since I was deploying on Solaris boxes I eventually switched to the Sun Forte C++ compiler.

It really bugs me that this problem still exists. :-/

bgd11 20 hours ago 0 replies      
All the technicalities aside, the writing style of the author is amazing. I would never have thought that someone could create such an intense narrative with 'malloc' as the main character.
firethief 1 day ago 1 reply      
> Nothing made any sense until we noticed the controller microservice's memory consumption. A service that should be using perhaps a few hundred megabytes at most was using gigabytes and growing... and growing... and growing... and growing...

Not identifying this until many hours after symptoms were impacting users sounds like a pretty big monitoring blind spot.

cyphar 1 day ago 4 replies      
Did you report the issue upstream with a patch? The solution to "the standard library is broken" is to fix the standard library, no? It's all free software after all.
consultSKI 12 hours ago 0 replies      
>> Most operators in C++, including its memory allocation and deletion operators, can be overloaded.

Have I mentioned lately how much I hate C++?

Great read.

stephen_g 23 hours ago 1 reply      
Things like this are why I was happy to see the LLVM project write its own C++ standard library. libstdc++ has always seemed a bit hacky and fragile to me. It's great to have an option with a more modern, clean codebase.

Have you tested to see if this works better with LLVM libc++?

spyder81 10 hours ago 0 replies      
"Then I remembered reading something long ago" is when experienced programmers are worth their weight in gold.
brynet 7 hours ago 0 replies      
Interestingly for recent versions of GCC (>=4.0) GLIBCXX_FORCE_NEW is defined for libstdc++, not GLIBCPP_FORCE_NEW.
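The variable above is just an environment setting read by libstdc++ at startup. A quick usage sketch (`./my_service` is a placeholder binary name, not from the article):

```shell
# Disable libstdc++'s pool allocator for one run, so every allocation goes
# straight through operator new/malloc and freed memory is returned promptly.
GLIBCXX_FORCE_NEW=1 ./my_service   # spelling for recent GCC
GLIBCPP_FORCE_NEW=1 ./my_service   # spelling on very old GCC 3.x toolchains

# Sanity check that the variable actually reaches the child process:
GLIBCXX_FORCE_NEW=1 sh -c 'echo "force_new=$GLIBCXX_FORCE_NEW"'
```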


tripzilch 9 hours ago 0 replies      
Upvoted for the Lovecraft and pulp horror lit references, and starting with "It was a dark and stormy night ..." :-)

Great writing, great read.

TimJYoung 13 hours ago 0 replies      
I'm not sure if the other debug tools mentioned offer this, but AQTime Pro has an allocation profiler that can be used to track down this sort of problem. You can take allocation snapshots while the application is running to see where the allocations are coming from (provided that you can run AQTime Pro against a binary with debug symbols/info).

I'm not affiliated with the company - just a happy customer that has used them for years with Delphi development.

charles-salvia 1 day ago 2 replies      
I'm a bit confused here.

>> Most operators in C++, including its memory allocation and deletion operators, can be overloaded. Indeed this one was.

Okay, well, firstly - the issue here seems to be a problem with the implementation of std::allocator, rather than anything to do with overloading global operator new or delete. Specifically, it sounds like the blog author is talking about one of the GNU libstdc++ extension allocators, like "mt_allocator", which uses thread-local power-of-2 memory pools.[1] These extension allocators are basically drop-in extension implementations of plain std::allocator, and should only really affect the allocation behavior for the STL containers that take Allocator template parameters.

Essentially, libstdc++ tries to provide some flexibility in terms of setting up an allocation strategy for use with STL containers.[2] Basically, in the actual implementation, std::allocator inherits from allocator_base, (a non-standard GNU base class), which can be configured during compilation of libstdc++ to alias one of the extension allocators (like the "mt_allocator" pool allocator, which does not explicitly release memory to the OS, but rather keeps it in a user-space pool until program exit).

However, according to the GNU docs, the default implementation of std::allocator used by libstdc++ is new_allocator [3] - a simple class that the GNU libstdc++ implementation uses to wrap raw calls to global operator new and delete (presumably with no memory pooling.) This allocator is of course often slower than a memory pool, but obviously more predictable in terms of releasing memory back to the OS.

Note also that "mt_allocator" will check if the environment variable GLIBCXX_FORCE_NEW (not GLIBCPP_FORCE_NEW as the author mentions) is set, and if it is, bypass the memory pool and directly use raw ::operator new.

So, it looks like the blog author somehow was getting mt_allocator (or some other multi-threaded pool allocator) as the implementation used by std::allocator, rather than plain old new_allocator. This could have happened if libstdc++ was compiled with the --enable-libstdcxx-allocator=mt flag.

However, apart from explicitly using the mt_allocator as the Allocator parameter with an STL container, or compiling libstdc++ to use it by default, I'm not sure how the blog author is getting a multi-threaded pool allocator implementation of std::allocator by default.

[1] https://gcc.gnu.org/onlinedocs/gcc-4.9.4/libstdc++/manual/ma...

[2] https://gcc.gnu.org/onlinedocs/gcc-4.9.4/libstdc++/manual/ma...

[3] https://gcc.gnu.org/onlinedocs/gcc-4.9.4/libstdc++/manual/ma...
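As a concrete illustration of the point above, libstdc++'s new_allocator is essentially a thin wrapper over global operator new/delete. A minimal, portable sketch of such an allocator (hypothetical name; not the actual libstdc++ source):

```cpp
#include <cstddef>
#include <new>
#include <vector>

// Forwards every request straight to global operator new/delete --
// no pooling, so memory is handed back to the runtime on every
// deallocate, mirroring what libstdc++'s new_allocator does.
template <class T>
struct plain_new_allocator {
    using value_type = T;

    plain_new_allocator() = default;
    template <class U>
    plain_new_allocator(const plain_new_allocator<U>&) {}

    T* allocate(std::size_t n) {
        return static_cast<T*>(::operator new(n * sizeof(T)));
    }
    void deallocate(T* p, std::size_t) { ::operator delete(p); }
};

template <class T, class U>
bool operator==(const plain_new_allocator<T>&,
                const plain_new_allocator<U>&) { return true; }
template <class T, class U>
bool operator!=(const plain_new_allocator<T>&,
                const plain_new_allocator<U>&) { return false; }
```

Passing it explicitly, e.g. std::vector<int, plain_new_allocator<int>>, sidesteps whatever pool the default std::allocator may have been configured with, much as GLIBCXX_FORCE_NEW does for mt_allocator.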

halayli 1 day ago 1 reply      
This conclusion might be wrong. The code in question, while it might not be allocating/freeing memory, might be stumbling over memory blocks and corrupting memory-management structures. Turning the flag on might fix the issue by mere luck, because memory allocations, locations, and structures would be different.
bogomipz 11 hours ago 1 reply      
This was a nice write-up; however, I didn't follow how memory fragmentation is related to a memory leak. Can someone explain? I understand that alternate memory allocators would help with the fragmentation issue, but how does the choice of allocator affect memory leakage?
squidlogic 1 day ago 2 replies      
Amazing write-up. Informative and gripping in its prose.
grandinj 21 hours ago 0 replies      
I'm guessing the cpp thing is a holdover from the days when the glibc maintainer was less than entirely helpful. There have been actual improvements in glibc in this area lately, so hopefully these kinds of hacks will slowly go away.
Ono-Sendai 16 hours ago 0 replies      
So where's the bug report with repro test code?
bboreham 20 hours ago 1 reply      
The post doesn't actually say what was broken, or indeed prove the location of the brokenness - just that it went away with a different compile option.

Exciting writing, but lacking a point.

rurban 22 hours ago 0 replies      
The "make malloc faster" part was done over a decade ago with the follow-up from ptmalloc2 (the official glibc malloc) to ptmalloc3. But it added one word of overhead per region, so the libc people never updated to v3 - a perf regression. They rather broke the workarounds they added. And now they are breaking emacs with their new malloc.
jcalvinowens 22 hours ago 0 replies      
I don't understand the point of this article... if you think there's a bug in the library, fix it. Don't write a melodramatic blog post lamenting how horrible it is in the hope somebody else will do it for you.

This isn't particle physics, it's code: we don't have to guess, we can look at it and see how it works.

selimthegrim 15 hours ago 0 replies      
James Mickens, move over. There's a new sheriff in town.
SFJulie 18 hours ago 1 reply      
Memory fragmentation due to dynamic, non-fixed-size data structures and multithreading is an old foe. It may not be fixable in C/C++.

Worker A allocates dynamic stuff; the algorithm takes a segment [0, ofA + n). Worker B allocates to build the same kind of data structure (a fragment of a JSON): [ofA, ofB]. Worker A resumes allocating, the boundary of [0, ofA] is exceeded, and there is no free contiguous space up or down, so [ofB, ofC] is allocated. Worker C enters and wants to allocate, but sizeof(string) makes the request bigger than [0, ofA], so [ofD, ofE] is asked for... and the more concurrent workers there are, the more interleaving of allocations goes on, with ever more fragmented memory.

Since mallocs are costly and the problem is well known, a complex allocator was created, with pools of slabs and the like - probably harboring one edge case, very hard to trigger, behind really complex, PhD-proven heuristics.

CPU power increases, more load, more workers, interleaving comes in, the edge case gets triggered.

And C/C++ makes fun of Fortran with its fixed-size data structures, while embracing any new arbitrary-size, arbitrary-depth data structure for the convenience of skipping a costly waterfall model before delivering a feature or a change in the data structure, and avoiding bikeshedding in committees.

Humans want to work in a way that is more agile than what computers are under the hood.

Always allocate fixed-size memory ranges for data handling, and make sure they will be enough. When doing REST, make sure you have an upper bound; use paging/cursors, which require FSMs; have all the FP programmers say mutables are bad, the sysadmins say that FSMs are a pain to handle when HA is required, the CFO say SLAs will not be reached and the business model is trashed, and the REST fans say that REST is dead when stateful.

Well, REST is a bad idea.
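The fixed-size allocation strategy advocated above can be sketched as a toy pool: because every block is the same size, freeing and reusing blocks can never fragment the arena (hypothetical code, illustration only):

```cpp
#include <cstddef>
#include <vector>

// Toy fixed-size block pool: all blocks share one size, so the free
// list can hand any slot to any request and fragmentation is
// impossible by construction. Capacity is fixed up front.
class FixedPool {
public:
    FixedPool(std::size_t block_size, std::size_t count)
        : arena_(block_size * count) {
        free_.reserve(count);
        for (std::size_t i = 0; i < count; ++i)
            free_.push_back(arena_.data() + i * block_size);
    }

    void* allocate() {
        if (free_.empty()) return nullptr;  // bounded by design
        void* p = free_.back();
        free_.pop_back();
        return p;
    }

    void deallocate(void* p) { free_.push_back(static_cast<char*>(p)); }

    std::size_t available() const { return free_.size(); }

private:
    std::vector<char> arena_;   // one contiguous slab
    std::vector<char*> free_;   // slots currently free
};
```

The price, as the comment notes, is committing to an upper bound in advance.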

Safety1stClyde 23 hours ago 0 replies      
It was only yesterday that I was reading another Hacker News discussion about problems with the GNU C library.


logicallee 1 day ago 2 replies      
People forget that C++ is just a tool, like a screwdriver or a hammer. A good carpenter knows when it's time to take a metallurgy class and resmelt his hammer, because its composition is not correct for being a hammer.
ris 16 hours ago 0 replies      
The correct response is to file a bug report, not write a clickbait-y article.
tbodt 1 day ago 1 reply      
Maybe it'll get fixed now that a post saying "libc++ is broken" got hackernewsed
mtanski 1 day ago 2 replies      
Yeah malloc() is pretty terrible in glibc by modern standards. For some workloads it just can't keep up and ends up fragmenting space in such a way that memory can't be returned to the OS (and thus be used for the page cache) and you end up in this performance spiral.

I always deploy C++ servers on jemalloc. I've been doing it for years, and while there have been occasional hiccups when updating, it has provided much more predictable performance.

nly 1 day ago 5 replies      
Actually, it is C's malloc and free that are "broken". malloc() takes a size parameter, but free() doesn't. This imbalance means it can never be maximally efficient. Whatever GNU libstdc++ is doing is probably, on balance, a net win for most programs.

It's not exactly roses in C++ either, of course. You can do better than the standard library facilities. Andrei Alexandrescu gave a great, entertaining, and technically elegant talk on memory allocation in C and C++ at CppCon 2015 that is well worth watching.
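For what it's worth, C++14's sized deallocation targets exactly this asymmetry: the runtime may pass the allocation size back to operator delete, information C's free() never gets. A minimal sketch (whether the sized overload is actually invoked depends on the compiler and flags):

```cpp
#include <cstddef>
#include <cstdlib>
#include <new>

// Track the size reported to the most recent sized delete, if any.
static std::size_t last_freed_size = 0;

void* operator new(std::size_t n) {
    if (void* p = std::malloc(n)) return p;
    throw std::bad_alloc();
}

void operator delete(void* p) noexcept { std::free(p); }

// C++14 sized overload: an allocator could use n to locate the
// right size-class pool without consulting any header metadata.
void operator delete(void* p, std::size_t n) noexcept {
    last_freed_size = n;
    std::free(p);
}
```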


SCOTUS Rejects "Guilty Until Proven Innocent", Can't Keep Money from the Innocent forbes.com
433 points by rrauenza  3 days ago   186 comments top 12
jdc0589 3 days ago 5 replies      
Does this have any implications for the ridiculous seizures of property police carry out every now and then when they pull someone over who has a large amount of cash on them?

The whole idea of a court case titled, e.g., "State of Texas vs $45,000" is asinine. I'd love to see the SCOTUS lay down the law on that bullshit.

pc86 3 days ago 5 replies      
Who dissented? I would be interested in reading the dissent and seeing if it was based off of some technicality of this case, or a belief that requiring someone to prove their innocence wasn't a violation of their Constitutional rights.
rayiner 3 days ago 4 replies      
The article does a piss-poor job of actually framing the issue that was decided: http://www.scotusblog.com/case-files/cases/nelson-v-colorado. In particular, the article's characterization of the question presented is so hand-wavy that it makes the dissent seem incomprehensible.

To understand Thomas's logic, start with Section I of the dissent: https://www.supremecourt.gov/opinions/16pdf/15-1256_5i36.pdf. Thomas is not saying that the state should be able to "keep money from the innocent." Thomas's reasoning is roughly the following:

1) The 14th amendment requires the government to give you adequate process before depriving you of a property right.

2) At the time petitioner was convicted, he paid over a sum of money to the state. He lost the property right in that money, and the state gained the property right in that money.

3) After the conviction was overturned on appeal, the state required petitioner to go through a process to get his money back. Petitioner alleges that the process was inadequate under the 14th amendment.

4) To make a 14th amendment deprivation claim, you have to show deprivation of a property right. At the time petitioner sought his money back, he had no property right. That transferred to the state under step (2).

5) Thomas asks: where is the property right that is the basis for the 14th amendment claim? It can't come from the 14th amendment itself, because that only comes into play once petitioner has a property right.

To get around (5), you can theorize that the initial payment automagically became null and void when the conviction was vacated. But that would be an unusual result, because legal judgments don't ordinarily have that kind of effect. Say you buy some property, then sue the seller and get a judgment saying you overpaid. You can assert that judgment against the seller to collect, but at most the judgment means the seller owes you money back; it does not automatically transfer the property right in some of the money you paid back to you.

I think the majority is right in that the overall effect of the Colorado law would be a due process violation. But the majority's reasoning kind of requires thinking of the whole process as a black box and not thinking too hard about what happens inside.

ars 3 days ago 2 replies      
I don't understand this:

> Ali had previously sold the Chevy but still held title to it and it was registered in his name ..... In order to regain his Chevy, Ali was asked to prove his innocence.

Why does he need to regain "his" Chevy? It said he sold it, so it's not his anymore. Just the title transfer was not registered yet (which in my state is pretty common, especially for junk cars, since the state charges a fee and sales tax to record the transfer).

rhizome 3 days ago 0 replies      
A little more of an educated take from a couple weeks ago:


Neliquat 3 days ago 5 replies      
This is a huge step forwards. I only hope asset forfeiture is next.
OliverJones 3 days ago 0 replies      
Sarah Stillman wrote up civil forfeiture in The New Yorker in the summer of 2013. John Oliver's TV piece on the topic must have drawn heavily from this article.


davrosthedalek 3 days ago 3 replies      
It was a 7:1 vote. Is there a resource to look up how individual judges voted?
inlined 2 days ago 0 replies      
> Ali had previously sold the Chevy but still held title to it and it was registered in his name. When the buyer was arrested for a DWI and drug possession, police seized the truck and filed a civil forfeiture action against it, even though Ali was not involved. In order to regain his Chevy, Ali was asked to prove his innocence.

If Ali sold the car but kept the title, what the heck did he "sell"? This sounds like a poor defense against civil forfeiture. It seems like he'd have to claim he cheated the person he sold the car to in order to assert ownership. I wouldn't want to touch that case either.

felipelemos 3 days ago 1 reply      
What surprises me is that, even with just one vote against, it was not a unanimous decision.
wu-ikkyu 3 days ago 1 reply      
Does this mean police can't legally steal your money anymore?
justinclift 3 days ago 1 reply      
Wonder if Kim Dotcom's legal team would be able to make use of this in their ongoing US legal saga?
How to Survive as a Solo Dev for Like a Decade or So sizefivegames.com
419 points by lj3  2 days ago   161 comments top 16
raamdev 2 days ago 16 replies      
> I find it astonishing that startup indie devs pay out for an office, with all the extra bills that entails. Work from home, keep your overheads as close to zero as possible.

I've been a solo dev for nearly a decade (8 years) and I found that sometimes it makes complete sense to pay for an office. Before my daughter was born, I was able to work from almost anywhere: noisy cafes, at home, the library, etc. But after she was born it was like my brain needed a space away from home where I could close a door and have a room all to myself, a place where I couldn't hear family sounds or be within walking distance of anyone I knew. I tried cafes and libraries but, oddly, they no longer worked for me; the noise and the people walking by suddenly became a huge distraction. I couldn't focus. I decided to rent a tiny artist studio for $350/mo that I found on Craigslist, and it was the best $350 that I spent each month in terms of direct impact on my ability to get work done.

karmajunkie 2 days ago 3 replies      
This would be more accurately titled, "I survived 10 years as a solo developer and here are the choices I made". Here's my version:

1) Craftsmen pay good money for their tools. Invest in your office space, whether it's a dedicated space in your home or a coworking membership. But be honest about what tools you need. You probably don't need a $1000 Aeron chair. But you do need one that's comfortable. You probably don't need that super-cool triple-panel 17" laptop Razer pumped at CES for web development. You probably do need something with enough RAM to run a few VMs sometimes.

2) Outsource everything that doesn't make you money or that you aren't good at. Taxes and bookkeeping, for example. But know enough about it so you can tell if you've hired good help there.

3) Invest in yourself. Learn some new technology at least once a year, whether that's a framework, a language, or a skill like design. Go to at least one regional conference, and at least one national conference if you can afford it.

4) Work the shit out of your network. Set a limit on how many unpaid lunch meetings you'll take to hear about other people's problems, and always try to find a way to help them even if you don't wind up taking the job. And then try to hit your limit most of the time. Farm favors like they're a cash crop.

5) Find a way to keep yourself accountable, whether that's a mentor, a coach, or an accountability partner. We all need someone to keep us honest about our motivations and rationalizations from time to time.

6) Try to exercise some self control over how many self-indulgent HN comments you make in a given period of time. :)

enraged_camel 2 days ago 4 replies      
>>> Don't spend any money: Do as much as you can yourself. If you can't afford it, don't pay someone to make assets that you could do yourself. What's more, do you really need to hire a full-time coder? Or can you just hire a freelancer for a month? If you don't have money, make the sound effects yourself.

Nah. As a solo dev you need to spend your time efficiently.

You can go to Fiverr and pay literally $5 for stuff like that. Sure, what you get won't be amazing, but it will be passable and it is pretty much guaranteed to be better than what you can create as a pure beginner.

That is far more preferable to spending hours or days (or maybe even weeks) learning to do it yourself.

throw9966 2 days ago 7 replies      
The best advice that I got was: don't quit your full-time job. My desktop shareware doesn't sell that often nowadays, but it's enough to pay my rent and monthly expenses. The salary that I get from my full-time job goes straight to my bank, untouched.

> "Working from a cafe"

100% agree. For me, no work gets done from a cafe. I wonder what work people do sitting at Starbucks. I can't write one line of code if I am constantly being distracted. Does anyone feel differently?

hartator 2 days ago 4 replies      
> Sit down and do some fucking work. Don't go for a coffee, that's not work and you know that's not work, no you're not working from the cafe, stop lying to yourself. Get up, get on and do some fucking work.

I can relate so much to the coffee trap. I just got back from grabbing coffee - just another way to delay working.

erikb 2 days ago 6 replies      
Now that I've experienced the start-up world twice I have to say: What's the big deal about working full time in big corp? Not all of them have cubicles and ask their devs to wear suits. T-Shirt, free coffee, huge desk, free hardware, other smart people who are just like me, reliable income, some level of attractiveness to the other gender due to stable life.

Honestly, I don't know why I didn't do that from the start and work on my projects in my spare time.

slaunchwise 2 days ago 0 replies      
I was a solo for 10+ years. I rented an office when my kids were too little to simultaneously grasp the ideas that 1) I liked them and wanted them near me and 2) I could not actually have them near me right now. When they were old enough to understand I moved back home. It didn't make much of a difference in terms of productivity for me.

One thing that did help was a sense that my workspace was both mine and a place for work. I needed to know that and no one had the right to interrupt or try and shoo me away. Public spaces never worked for me because other people had a right to them, too, and they could bring their kids or ask me questions about the nearest chair or whatever. I could concentrate better knowing that was true. It was worth money to me.

redwyvern 2 days ago 0 replies      
Not really well written, but some of the points on not wasting money and time with BS activities and expenditures were important.
jacquesm 2 days ago 0 replies      
If you want to make it for much longer than just a decade, plan for those once-every-10-years dry spells and accidents, and SAVE. Sock money away as if your life and career depend on it - one day they will, and if you don't have savings you will end up in trouble. Save 20% of your gross at a minimum; then, once you reach 100K or so of rolling reserve, you can relax and start spending a bit, but really try to maintain that reserve.
vpresident 2 days ago 1 reply      
Being an indie developer for 10 years feels like too much to me.

I mean, once you do 2-3 games and you have a little success, I think you should use that notoriety to gather more talent and assemble a team. Forget starting a startup, studio, being an entrepreneur. I say you should continue doing what you do, but instead of doing all the development yourself and outsourcing the graphics or the sound effects, just bring them in.

A team of 2-3 developers, 1-2 artists, and a composer should level up faster. Watch Fullbright[1]: they had amazing success together, while on their own they were... just OK.

Are there any former solo devs here who could share their story? What was the next step for them?

[1] https://en.wikipedia.org/wiki/Fullbright_(company)

jmspring 1 day ago 0 replies      
I've worked as part of a larger local remote team for a number of years. I'm dealing with customers, group meetings, or hackfests on about a 3-5 day a month basis. Outside of that, I'm fully remote.

I used to have a garage office, but the last several months the coffee table is the remote office. Soma.fm queued up when I start working. I take breaks for bike rides/lunch/exercise, if I feel the need to be "social", the local brewery (more like park + brewery) has outside seating and friends are there.

Solo - just gauge how much interaction you need and when you need it.

gcb0 2 days ago 1 reply      
Was the author abused by a 3D space shooting game? Why pick on that genre as the holy grail of a game that won't be finished? I am almost sure I'm missing some internal game dev joke.
kristianp 2 days ago 0 replies      
> Say Game1 brings in 40,000: with rent and beer and socks that's probably two years' salary,

Be prepared to not make a huge amount of money, but you get to write games for a living.

fapjacks 2 days ago 2 replies      
"Sit down and do some fucking work."

The other stuff is just optional.

ninjakeyboard 2 days ago 1 reply      
Nice article, and I almost bought your game, but I don't play games on my PC :( If I did, I would - it looks mint. Nice trade of experience for advertising.
return0 2 days ago 0 replies      
I find that the frequency of posts about solo entrepreneurs has increased. Are there any startups working for this niche?
Thieves drain 2FA-protected bank accounts by abusing SS7 routing protocol arstechnica.com
365 points by Dowwie  2 days ago   206 comments top 24
kevin_b_er 2 days ago 9 replies      
SMS is not a secure second factor. It is subject not only to technical attacks such as the one in the article, but also to a wide variety of social engineering attacks. Getting cell phone reps to compromise a cell phone account is apparently not hard, and has been used many times to take over online accounts.
Latty 2 days ago 6 replies      
Banks here in the UK use your chip & pin based card as a second factor (or rather, as the two factors - the chip you have, the pin you know) - they give you a little card reader that can use the card and pin to provide a 2FA token for logging in or sign requests to send money.

It's a much better system. Of course, some banks don't use it to its full potential - many use it only for signing money transfers - but it's still pretty good. The readers are also cheap and standardised, so you can use any one of them for any account, which is useful.

danjoc 2 days ago 1 reply      
Last July, NIST called out SMS 2FA as insecure


Second comment: SMS should have been removed long time ago considering the SS7 problems. Better to use a secure token.

Is the bank taking responsibility and covering the loss for their customers?

ismail 2 days ago 1 reply      
The problem with SS7 is that trust is assumed. Mobile carriers that have roaming agreements will have either a direct link or one via a hub. So what happened here was that the network of the foreign roaming partner was used to redirect the SMS traffic on the victims' carrier. I would not be surprised if it was an inside job.

With ss7 you can do fun things like query the last location update/logged in base station for a mobile phone, due to roaming carrier x can query for customers on carrier y in another country. If you link up to one of the roaming hubs you can pretty much get the location of anyone with a mobile phone. Feature phones included.

ryanmarsh 2 days ago 2 replies      
Phreaking, the cutting edge way to commit computer fraud in 2017.

Who would have guessed?

kwhitefoot 2 days ago 1 reply      
The headline makes it sound as if abusing SS7 was all they needed to do but in fact they had to have the other factor as well so it really is not quite as scary as it at first appears. It also seems from the article that the thieves were able to log in to the accounts with just a password and only needed the SMS to sign transactions.

It's different here in Norway; the banks require two factor authentication to log in as well as signing transactions.

I don't claim it's perfect but at least no one can log in unless they control both factors.

matt_wulfeck 2 days ago 0 replies      
I'm still a little irked that Google constantly reminds me to add a phone number as a backup for my email account. I already have google push login, OTP, as well as backup codes.

This proves that the phone can be more a liability in the face of much better technology.

codewithcheese 2 days ago 1 reply      
Namecheap only supports SMS 2FA. They have been suggesting they will support Authenticator for years now: https://blog.namecheap.com/two-factor-authentication/

Pretty unacceptable considering how important domain control is.

hinkley 2 days ago 2 replies      
Isn't this the old "SMS is not 2FA, stop calling it that" argument?
idlewords 2 days ago 4 replies      
Here's a guide for how to set up SMS-free two-factor authentication on your Gmail account. It will cost you $18; if that's a hardship, contact me.


zyx321 2 days ago 1 reply      
The Sueddeutsche article claims that German customers were affected too. Most German banks I know of support TAN-generators[1] which are completely unhackable by any known methods. Insert your card, scan the barcode on your screen, confirm the target IBAN and amount, and you get a unique TAN that is calculated from your transaction parameters.

[1] https://www.amazon.de/ReinerSCT-Tanjack-chipTAN-SmartTAN-Tan...

mdekkers 2 days ago 2 replies      
My bank's 2FA literally comes on a piece of paper. A set of numbered codes, and the banking app/site tells me which code to use for any given transfer.
finnn 2 days ago 2 replies      
When I asked (via Twitter) if my credit union would provide a secure 2FA option, they told me:

> We're always on the lookout of how we can keep our members' accounts secure. Right now, the Mobile Texts are FFIEC compliant.

stcredzero 2 days ago 0 replies      
> In August, Lieu called on the FCC to fix the SS7 flaws that make such attacks possible. It could take years to fully secure the system given the size of the global network and the number of telecoms that use it.

One of the newly discovered great sins of the early 21st century is to disseminate insecure code. Before the public became widely aware of chemical pollution, I'm sure many polluters thought themselves innocent and saw environmentalists as pernicious busybodies.

cyberferret 2 days ago 1 reply      
Another feather in the cap for a dedicated 2FA solution such as Google Authenticator etc. that doesn't use SMS?

Though, having replaced two phones since I started using that solution, it can be a pain to have to re-set it up with each provider every time. I can see that if someone is prone to losing their phone, it will become a major issue.

I think the problem is that all the companies whom I use 2FA for have totally different methodologies for re-setting it up on a new device. Whilst some have an automated way of verifying my identity and resetting the new device almost instantly, I have had a couple that needed talking to a human support rep (inconvenient, but understandable) and one company that needed another employee in the company to do a full 2FA verification themselves, and then talk to a company support rep on my behalf to verify my request to reset my 2FA settings! (WTF).

Thus, each time I replace my phone, I find myself actually culling the number of services where I use 2FA purely because it was too much of a pain to go through the reset process, and it was actually easier to drop 2FA with them altogether (or in one case actually drop the service altogether).

riobard 2 days ago 0 replies      
So SS7 is like BGP where you can just announce your number/IP block?
DBNO 2 days ago 3 replies      
Edit: I had an idea for an improved SMS 2FA, but comments gave persuasive reasons why Google Authenticator is better. Thanks for the comments!

The idea is basically a 3FA system where the bank sends you a one-time 6-digit number. You then have to translate that number using a user-seeded cryptographic hash function. This secret function is your third factor; it translates the received SMS code into the value you'll input at login.

Analysis: security would increase, but ease of use would decrease, especially with regard to how a user would recover if they lose both their password and the program that calculates the cryptographic hash.
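A toy sketch of that translation step, assuming the "secret function" is a keyed hash over a memorized seed (FNV-1a here purely for illustration; a real design would use HMAC, and all names are hypothetical):

```cpp
#include <cstdint>
#include <string>

// Mix the memorized seed with the 6-digit code from the SMS and
// reduce the result back to 6 digits -- the value the user types in.
// NOT cryptographically sound; FNV-1a stands in for a real MAC.
std::uint32_t translate_code(const std::string& secret_seed,
                             std::uint32_t sms_code) {
    std::uint64_t h = 14695981039346656037ull;  // FNV-1a offset basis
    auto mix = [&h](unsigned char b) {
        h ^= b;
        h *= 1099511628211ull;                  // FNV-1a prime
    };
    for (unsigned char c : secret_seed) mix(c);
    for (int i = 0; i < 4; ++i)
        mix(static_cast<unsigned char>((sms_code >> (8 * i)) & 0xffu));
    return static_cast<std::uint32_t>(h % 1000000);
}
```

The usability cost shows immediately: lose the seed and there is nothing to type, which is the recovery problem noted in the analysis.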

partycoder 2 days ago 0 replies      
Phreaking in 2017 - interesting. The golden age of phreaking ended with SS7. SS5 was very insecure; people could just emit tones at certain frequencies and pull off tricks like calling for free. Maybe this is the beginning of a new era.

I think major websites should stop using SMS and ask for just an authenticator app or secure keys. SMS should be regarded as a bad security practice.

pm90 2 days ago 1 reply      
This is really scary... can banks please start using something like Google Authenticator? I was assuming that 2FA over SMS was the most secure thing ever...apparently that's not the case.
jdmichal 2 days ago 0 replies      
This sounds a lot like attacks on the CAN buses within car systems. We can no longer afford to have zero-authentication, zero-authorization networks anywhere.
danellis 2 days ago 1 reply      
Where and how do these people get access to the PSTN?
EGreg 2 days ago 0 replies      
These days what is a good way to authenticate people AND prevent them from making millions of accounts?
Techbrunch 2 days ago 1 reply      
It would be nice to have a WhatsApp API that could be used for 2FA; banks probably already have your number.
finnn 2 days ago 4 replies      
Is there a good technical explanation of how SS7 works, technical docs, etc?
How Stripe teaches employees to code stripe.com
512 points by p4lindromica  2 days ago   113 comments top 17
rectang 2 days ago 2 replies      
When I worked for Eventful, I organized several study groups for beginners similar to this.

Like the Stripe team, participants in these study groups also found that cross-departmental collaboration improved. After learning the fundamentals of coding, people understood better how to work with engineering teams.

We also found a few people out on the edge of the bell curve who had strong engineering aptitude, including one fellow who eventually became a stellar engineer for us.

The main difficulty is that you need a lead who enjoys teaching. Personally, I find the challenge of explaining concepts at varying audience-appropriate levels fascinating and stimulating. We had one other fellow lead a group, and he had a good experience too. But as you can see from this discussion, not everyone wants to take this on.

barrkel 2 days ago 5 replies      
Flipping this the other way: for a company that works in a particular area - whether it's law, finance, retail, advertising, anything - it's important to educate engineers about the specifics of that industry. The alternative is to forgo bottom-up innovation in the company (with engineers who don't have enough business context) and risk embedding a command-and-control approach to feature development (coming in from sales, through product management, into engineering design, all the way down to the interchangeable coding monkeys).
bballer 2 days ago 0 replies      
This is great! I think everyone should have some fundamental knowledge of how coding works, especially those working at SaaS companies who aren't directly involved in code. It would definitely be a confidence booster to those who are in roles where the primary function is supporting software and the processes (and bugs of course) around the primary product.

The problem we have is that we are strapped and everyone is pedal to the metal every hour we are at work. We just don't have the time to sit down and layout these kind of courses for all the employees and then follow through on properly teaching them. I sure wish we did. I can see how at large stable companies this can be a huge win, but the reality is for the smaller fish it's tough to pull this off.

jjcm 2 days ago 2 replies      
I'm currently running a very similar thing over at Atlassian. I was brought in to run their prototyping, and one of the things I immediately noticed is many of the designers didn't have the tools they needed to make proper prototypes. Rather than just teach them on invision/atomic/principal, I figured it'd be better to just teach them how to do front end dev. We're now holding lessons for a 2 hour block every week in 10 week cycles, and that seems to be the best for people schedule wise. Originally we did all day courses but too many people had conflicts. Course notes are here if anyone is interested: http://prototype.guide - also includes a small electron server to help dynamically compile stylus/pug.
korzun 2 days ago 7 replies      
A part of me does not understand how start-ups find so many different ways to burn investors' money. Another part of me wonders how these people have so little workload that they can afford to sit in a 2.5-hour class every week and take projects home / do them at work?

I worked with companies that tried to do something similar, and besides a nice PR piece for the blog, these types of things are nothing more than a nice 2.5-hour break for people who can't schedule enough meetings to make their day go by faster.

Wait until individuals who came there to work get fed up with carrying the rest of their team and start looking elsewhere.

partycoder 2 days ago 3 replies      
There is substantial domain knowledge you require to code safely.

Explain to someone who hasn't been exposed to how numbers are represented in memory that doing financial arithmetic with IEEE 754 floating point numbers (the default in JavaScript and others) can lead to precision errors, with possible financial consequences. Very hard to do without going all the way back to the basics.

Or maintainability, security, configuration, construction for verification, performance, scalability, concurrency, thread-safety... and all non-functional requirements.

You can save money on less skilled people. I have fixed bugs implemented by self-taught guys. I remember profiling a service experiencing degradation (with terrible financial consequences) and finding an O(2^n) function that could have been O(1). That's the kind of risk you expose yourself to.
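The floating-point pitfall is easy to demonstrate. A minimal Python sketch (JavaScript's Number type behaves the same way, since both use IEEE 754 doubles):

```python
from decimal import Decimal

# Binary floats cannot represent 0.1 exactly, so repeated addition
# accumulates rounding error -- a real risk when summing currency.
total = sum(0.1 for _ in range(10))
print(total)   # 0.9999999999999999, not 1.0

# Base-10 Decimal arithmetic keeps cents exact.
exact = sum(Decimal("0.1") for _ in range(10))
print(exact)   # 1.0
```

This is why financial code typically uses decimal types or integer cents rather than floats.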

jorblumesea 2 days ago 2 replies      
Take recruiting for example, would that really allow you to have a more technical conversation with a prospective employee? If they are any sort of engineer they'll run circles around you in every sense, so what's the benefit here? I'd love to see the data breakdown of how/why this benefits certain jobs or teams.
jaboutboul 2 days ago 1 reply      
How about opening the course and sharing on github so other can benefit/help improve as well?
timlod 1 day ago 2 replies      
Every time there's a post about Stripe, it is a positive post. Programs they initiate, people they hire, how they act in their business. To me they seem like an example of a 'good' company with great culture, as opposed to what you read about Uber [edit: removed caps] and co. these days. Very refreshing!
barking 1 day ago 1 reply      
I would humbly suggest that they cease and desist from calling their workers stripes.
CodeSheikh 1 day ago 1 reply      
It is nice to see how more and more tech companies are taking this initiative to hold on-campus intro to programming classes (mostly it is javascript or python). But before most of these companies decide to teach coding to every single employee, I would first focus on those god awful "tech" project managers who get hired because of some fancy MBA degree and most of the time they have no clue about the complexities of a software project in general.
throwaway2016a 2 days ago 6 replies      
> For example, == is used at Stripe to mean you're in agreement

Is this normal? I have never heard this before.

imron 2 days ago 1 reply      
Next can they teach their web designers about contrast?
alexpetralia 15 hours ago 0 replies      
I'm surprised by how controversial this article has been. I think it reflects the wide gamut of workplaces that continue to exist in tech, despite the external homogeneous appearance the industry maintains.
sidchilling 1 day ago 0 replies      
Perhaps a follow-up blog post on which departments the people who attended the program were from, and how the program helped them in their day-to-day work?

Once we have success stories, more companies will be interested in implementing such programs.

Kudos to the initiative!

diek 2 days ago 5 replies      
> In seating, we mix engineering teams with non-engineering teams

In my experience this is a huge productivity killer for engineering teams. No, sitting us down next to the recruiting guy doesn't make the recruiter better at technology or the engineers better at understanding recruiting. It mostly just makes us hate the person who talks on the phone all day while we're trying to work.

hasenj 1 day ago 1 reply      
It's all nice and everything, but I have a few "questions":

Is asking people how they "felt" about the course really a good method of evaluating the course's effectiveness? It sounds like people would always make up a reason to say they felt good about the course, either as lip service, or because they enjoyed it even though it might have zero effect on their work.

Maybe instead they should try to come up with specific things to measure before and after the course?

Americans' Access to Strong Encryption Is at Risk, an Open Letter to Congress rietta.com
358 points by rietta  3 days ago   134 comments top 22
Nomentatus 3 days ago 9 replies      
The irony here is that simple one-time-pad solutions (OTP) will continue to be available to securely encrypt the sort of messaging that's of use to terrorists (relatively short infrequent messages), instead it's the general communications (including for banking) that the rest of us perform online that will be made vulnerable.

You don't even have to program or use a computer to create these OTP solutions; for limited messages you could just flip a coin to create the OTP if necessary (although there are lots of more automated solutions available as well).

Airgapped computers at both ends provide another way 'round restrictions for more sophisticated actors. Their backdoors won't be accessible (remotely.)

So taking away secure encryption from the rest of us is just security theatre; a destructive, narcissistic legislative exercise designed to make it look like the pompous powerful are doing something, when they're doing nothing of any real use while creating terrible risks.

This is why, I think, legislators have consistently ignored logic and math from professionals such as the OP - they don't care. They know perfectly well they're pissing into the wind, doing nothing useful; that it's all theatre; they just think the fallout is going to land on someone else's pants after they're out of office. But tech works (and fails) faster than that.

[Counterargument: if everything else is breakable, securely encrypted messages really stand out. One answer: But very short messages (in an unknown format) aren't generally breakable, anyway, and that's the likely case.]
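For the curious, a one-time pad really is this simple. A minimal Python sketch (the pad must be truly random, as long as the message, and never reused):

```python
import secrets

def otp_encrypt(message: bytes) -> tuple:
    # Generate a fresh, truly random pad the same length as the message.
    pad = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    return pad, ciphertext

def otp_decrypt(pad: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ p for c, p in zip(ciphertext, pad))

pad, ct = otp_encrypt(b"attack at dawn")
print(otp_decrypt(pad, ct))  # b'attack at dawn'
```

Without the pad, every plaintext of the same length is equally likely, which is why no backdoor mandate can touch this scheme; the hard part is distributing and never reusing the pads.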

xupybd 3 days ago 4 replies      
Just think of the outrage if the government required master keys to everyone's homes. I know there is a difference, but it's not a huge leap to compare the two. We don't want the government to have such easy access to our homes because we can't trust every government employee not to abuse it. I think the same goes here. No matter what safeguards you put in place, it's a scary thought that you simply can't keep the government out of your affairs. Sure, now you think you have nothing to hide. But what if your political views become criminal, or your religious views become hate speech? We're not there yet, but times can change quickly.
1001101 3 days ago 2 replies      
Does anyone here remember the clipper chip? If you don't, I'd recommend boning up on this chapter of the crypto wars.

The 'because terrorism' excuse falls a bit flat with me.

Thought experiment: how hard would it be for a terrorist organization with access to 100's of millions of dollars (eg. ISIS) to come up with a secure communications scheme? One time pad. A reasonable cipher that hasn't had any 'help' during development. Even run an encrypted channel over a backdoored product. I'm sure many of us could come up with something in a day (with decryption over an airgap). How about a hostile government with multi-billion dollar budgets (and who have been using OTP already for decades).

Is this about terrorists, or is this about citizens? My bet is on the latter.


rietta 3 days ago 0 replies      
I'm going to have to go back and listen to the entirety of the Senate hearing at some point. With so much talk about Russian hacking and influence, they then flip the switch and want backdoors into encryption, even though any mandated tool the government demands for so-called lawful intercept can be hacked by, or ordered open by, judges in Russia! There is a strange disconnect, and I think it hurts us that the public discourse is security vs privacy rather than being about the personal security of all citizens.
libeclipse 2 days ago 1 reply      
I did my Extended Project Qualification (EPQ) [1] on this issue, and it actually surprised me how many people think that the governments are right in this debate.

When presenting the work, I had a chance to ask ordinary people, and they all pretty much agreed that the government should be able to "break" encryption with a warrant.

This is a scary prospect, and I feel that educating citizens as well as the government is important.

[1] https://github.com/libeclipse/EPQ/blob/master/paper.pdf

sandworm101 3 days ago 0 replies      
Access is under no risk whatsoever. Encryption is math. It is open source. It will always be there. What is at risk is the legal right to use it, the government's permission for the public to use that math. My point: people with good reason to fear the government will still access and use encryption. This therefore isn't about terrorists. It is about watching the everyday people who want to abide by the law.
_jal 3 days ago 0 replies      
The Four Horseman of the Infocalypse[1] ride again!

[1] https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...

nom 3 days ago 0 replies      
The greatest problem right now is our hardware, not our software. We can always devise secure encryption schemes without backdoors. Nobody can do anything against it.

Our hardware on the other hand... is probably backdoored already.

WalterBright 2 days ago 1 reply      
It isn't just our privacy at issue. With more and more critical infrastructure on the internet, having unbreakable encryption is a major national economic and national security requirement.

It's unrealistic to think that if there is a means for access by the government, that foreign enemies and criminal organizations won't be able to access it, too, and cause havoc.

pinaceae 3 days ago 0 replies      
As if they'd give a shit.

Right now they want to un-insure 24mil people, re-introduce the whole pre-existing condition scam.

you really think a ruling class that has no qualms being "pro-life" while denying young mothers healthcare will care about your nerd bullshit?

notliketherest 3 days ago 3 replies      
This is not a battle they can win. Most Americans DGAF if their shit is encrypted, until the PSA campaign fighting against laws like these tells them the government is taking away their rights and is able to snoop on their lives. Just like SOPA and others, this will be defeated.
natch 3 days ago 0 replies      
For congressional consumption, I suspect arguments like this need to be dumbed way, way, down.

Tim Cook's "software equivalent of cancer" is an example of an effective dumbed down take on it, but it need not be the last one. The more ways the point can be re-worded concisely so that lay people will understand it, the better.

shmerl 3 days ago 0 replies      
Some just never learn. How many times will they bring up this "let's make a backdoor but we don't really want a backdoor" stupidity?
paulddraper 3 days ago 2 replies      
Encryption will never be intentionally backdoored on a large scale.

I think one of RSA argued this, basically "Do you really think the government will want to review and approve everything on the app store?"

Forcing big players to divulge data, making accused people decrypt their devices -- those are things the government could do. Encryption per se isn't in any danger.

threepipeproblm 2 days ago 0 replies      
I read that Sen. Dianne Feinstein is supporting an anti-encryption bill. It's never been completely clear to me if she, and those like her, fall more on the stupid side, or more on the evil side.

But the arguments against this aren't that difficult... so I have to guess it's the evil. Power corrupts.

spilk 2 days ago 1 reply      
The US Department of Defense arguably runs the most extensive key escrow system in the world. Every DoD employee and many contractors have Common Access Cards (CAC) that contain email encryption keys that are escrowed with DISA.
nickpsecurity 2 days ago 0 replies      
A better example of work that Congress might be interested in would be Schneier and Kerr's writeup on encryption workarounds showing government tools they have available with legal considerations of current or expanded ones. That's the kind of practical stuff that can influence powerful people's opinion as they're always looking at grey areas to balance many conflicting interests.


feld 3 days ago 0 replies      
Bernstein v. United States
deepnet 3 days ago 1 reply      
> ... "protected being being stolen."

repetition error.

I_am_neo 3 days ago 0 replies      
As a sovereign I demand my privacy!
microcolonel 3 days ago 1 reply      
Good sentiment, and better cause...

but please, for the love of god, proofread your writing!

azinman2 3 days ago 4 replies      
Wait does he have a Masters in Information Security from the College of Computing at the Georgia Institute of Technology???!

Joking aside, unfortunately it takes deep problems to motivate people/the US to change. It'll swing this way, and there will be dramatic consequences. Only then will things swing back the other way.

It's too bad there isn't any balance here -- it does make sense in many situations that the police/courts should be able to gain access to information. But encryption doesn't care about the situation. Encryption doesn't care who you are. Encryption has no contextual morals of its own.

If data had physical weight, where things that were important were really hard to steal, then it'd function like the real world. But data does not, and it's too easy to download gigs of data one should never have access to. It's very difficult to reach the middle ground suggested by Pelosi. I don't know if she understands that.

Build yourself a Linux github.com
477 points by AlexeyBrin  2 days ago   85 comments top 25
Sir_Cmpwn 2 days ago 2 replies      
Building a Linux distro from scratch has been one of the most taxing projects I've attempted. There are two phases: the frustration phase, and the tedious phase. Bootstrapping it is an incredibly frustrating process - I restarted from scratch 5 times before any attempt ever got to the tedious phase. The tedious phase never ends. You have to make hundreds of packages to get to a usable desktop system. I've made 407 packages and I still don't have a desktop to show for it (I'm hoping to get sway working tomorrow, I think I have about 30 packages left).

Still, I've finally gotten to a somewhat practical self-hosting system. My laptop runs it on the metal and all of the infrastructure (website, bugzilla, git hosting, mirrors) is on a server running my distro too. It's taken almost a year of frustration, but it's very rewarding.

erikb 2 days ago 1 reply      
I really wondered what's so special about this project, which aims to achieve what probably more projects have aimed to achieve than there are lines of code in the kernel.

It's not the goal, it's not the OS. But the documentation is a sight to behold! Very clear, detailed, interesting writing style, and it puts together quite a few frustrating topics in a simple, structured matter. Wow and kudos! Keep on writing docs, please!

digi_owl 2 days ago 2 replies      
Both this and LFS reminds me that Linux makes sense until you get the DEs involved. At that point shit just sprawls all over the place as there are no longer any notion of layers.
thom_nic 2 days ago 0 replies      
A similar process is building a custom kernel and rootfs for an ARM device such as the beaglebone. Olimex actually has a good tutorial for their device: https://www.olimex.com/wiki/AM335x

This is only slightly more complicated due to the need to cross-build but I found it fairly easy with qemu-static-arm and prebuilt cross toolchain packages for Ubuntu/ Debian.

The benefit is that you can develop for a target device that is not your PC, so no worry about messing up the bootloader and leaving your PC in a state where you need a recovery CD to fix it and boot. Just get a USB-serial cable :)

You can also try buildroot or yocto, although I had no interest in building every package manually versus relying on Debian's repos.

asciimo 1 day ago 2 replies      
This is almost as complicated as building a Javascript web application.
fizixer 2 days ago 5 replies      
Have you even looked at the LFS project[1]? And what does your guide provide that LFS doesn't?

[1] http://www.linuxfromscratch.org/

lanna 2 days ago 1 reply      
If you are interested in building your own Linux, the LFS project has a lot of detailed information: http://linuxfromscratch.org
jonathanstrange 1 day ago 4 replies      
I have a question related to this article, though not directly. If building completely from scratch turns out to be too cumbersome and time-consuming, what would be the easiest way of building a minimal, fast-starting distro with a graphical user interface and networking, whose only purpose is to run one application on x86 hardware in kiosk mode?

Is there a distro builder for dummies?

throw2016 2 days ago 0 replies      
Following this guide will get you something very close to a base Alpine Linux with busybox. Alpine is fairly minimal out of the box and even eschews grub for syslinux.

The upside with Alpine is that if you need features and packages, they are an install away. But if the purpose is to learn about compiling the kernel and how the system initializes, this is a decent start.

marenkay 18 hours ago 0 replies      
Huge fan of Linux From Scratch myself; after reading this I wonder if someone has tried the same with FreeBSD! Or the Darwin sources released for OSX (not talking about the dormant PureDarwin project)
thu 2 days ago 0 replies      
This sounds like Aboriginal Linux: http://landley.net/aboriginal/about.html
rijoja 1 day ago 0 replies      
I am following this guide to build a kernel, but it seems that instead of getting the headers from the kernel source, they are using a github repository which only contains the headers, to save downloading time. All fine and dandy, if the latest commit to this repo weren't from 3 years ago!!
blanket_the_cat 2 days ago 0 replies      
This is awesome. I've been building almost the exact same project, along almost the same timeline (based on the commit history). Mostly an excuse to learn more advanced Bash, and Linux Internals/Features I've never had a good excuse to explore. Gonna release next week. Hope I get as warm a reception. Kudos on an awesome project!
agumonkey 1 day ago 0 replies      
Let's branch to add : sysvinit/BSD, init, OpenRC, upstart, systemd, SMF, launchd, Epoch, finit ..


peterwwillis 2 days ago 0 replies      
I remember when HOWTOs were actually maintained over time so their instructions were up to date. Blogs killed HOWTOs.
felixsanz 1 day ago 0 replies      
Awesome! Good job. This also helps understand what the distro installer does.
Siecje 1 day ago 0 replies      
Has anyone used Tinycore Linux? http://tinycorelinux.net/
colemickens 2 days ago 0 replies      
Seem neat to learn, but for something maintainable, LinuxKit seems interesting.
ausjke 1 day ago 0 replies      
This is an awesome write-up. did not know losetup can do what kpartx does now with the option -P, I did similar things in the past but this is a good update for me.
Jaruzel 1 day ago 0 replies      
I've crashed and burned a couple of times trying to complete LinuxFromScratch, so I may give this a go - It seems a bit clearer on the core steps.
apeacox 2 days ago 1 reply      
Well, I'd use http://linuxfromscratch.org/ for that matter...
akavel 2 days ago 0 replies      
Would be cool to port this to Nix package manager machinery (i.e. make this kinda an alternative to NixOS).
Ericson2314 1 day ago 0 replies      
I feel like just reading Nixpkgs is probably just as edifying, tbh.
faragon 2 days ago 0 replies      
airswimmer 2 days ago 0 replies      
I don't think this is very useful. You should check out linuxfromscratch.org instead.
Google accuses Uber of creating a fake company in order to steal its tech businessinsider.com
354 points by golfer  3 days ago   170 comments top 20
golfer 3 days ago 1 reply      
Some quality journalists are live tweeting the proceedings from inside the courtroom. Getting some great updates in real time:





TazeTSchnitzel 3 days ago 2 replies      
This made me wonder what HN thought of the acquisition at the time.

Well: https://news.ycombinator.com/item?id=12315205

Top comment noted how the company looked like a quick flip.

crench 3 days ago 3 replies      
"Here's the thing," [Judge William Alsup] said. "You didn't sue him. You sued Uber. So what if it turns out that Uber is totally innocent?"

This is going to be a very interesting case.

anigbrowl 3 days ago 3 replies      
What does it take to get a business license revoked these days? If an individual carried on the way Uber does, s/he'd be looking at a long stretch in prison. While I championed Uber's disruption of the taxi monopoly when it got started, and did a lot of free advocacy here on HN against taxi industry shills, the firm has turned out to be as corrupt as or worse than the market it set out to disrupt.

Should the various allegations made against the firm prove true, and it seems like there's a good chance of that, a good number of people need to face criminal charges, the company needs to be shut down and its assets auctioned off, and the investors need to end up with nothing because they abrogated their corporate governance responsibilities.

askvictor 3 days ago 3 replies      
Uber seems to be the logical extreme of 'easier to beg forgiveness' mentality; there are no rules (explicit or implicit) or ethical boundaries that are not subject to be broken in pursuit of their goals.
marcell 3 days ago 2 replies      
Based on this live tweets (https://twitter.com/CSaid) it doesn't sound like Waymo/Google is making much headway. They want to pin this on Uber, but haven't presented evidence of wrongdoing by Uber:

 Judge to Waymo: U have no proof that shows a chain of Levandowski saying to anybody heres the trade secrets.
Unless Waymo presents something like this, I don't see how this trial benefits them. Sure, they can make a big fuss and get Levandowski kicked off self-driving cars / lidars, but that won't stop Uber from moving forward with their program.

Animats 3 days ago 10 replies      
Levandowski is probably going to come out of this really well. Uber, not so much. Google's LIDAR technology is obsolete spinning-scanner gear, mostly from Velodyne. It's something you'd use on an experimental vehicle, not a production one. The production LIDAR systems are coming, they're all solid state, and they come from big auto parts makers like Delphi and Continental. So by the time the Uber case gets to trial, it will be moot.

Levandowski already got his money. Waymo could sue him, but what are the damages? Google isn't selling anything, so they can't show impact on their sales volume. (Neither is Uber. Uber's venture into self-driving is probably more to pump up the valuation than to provide a real service, anyway.)

I'm beginning to think that self-driving will be a feature that comes from auto parts companies. You need sensors and actuators, which come from auto parts companies. You need dashboard units, which come from auto parts companies. You need a compute unit, which is just a ruggedized computer packaged for automotive conditions, something that comes from auto parts companies. You need software, which may come from a number of sources. This may not be all that disruptive a technology.

dmitrygr 3 days ago 1 reply      
Judge Alsup, to Waymo Lawyer: "You have one of the strongest records I've seen of somebody doing something bad. Good for you!"
kristianc 3 days ago 0 replies      
Original HN discussion from the time - several on here, including myself thought acquisition looked like quick flip from the outset:


Namrog84 3 days ago 1 reply      
If this turns out to be true, it does not bode well for a bright Uber future. I feel like I only hear bad things about them lately.

Anyone have any speculation as to what might happen to Uber if this turns out to be true?

wand3r 3 days ago 1 reply      
This is Uber's fault for leaving themselves open, but this will play out like a hostile takeover, with Google upping their 6 percent stake to controlling or wholly owning Uber.
Touche 3 days ago 1 reply      
The entire point of Uber is to evade laws on technicalities, so the fact that they are doing this here should be a surprise to no one.

That is their core competency, in fact.

aaron695 3 days ago 0 replies      
"Google just accused Uber of creating a fake, shell company with its former engineer to steal its tech"

They don't seem to be backing that clickbait title do they?

I think the backdated form makes sense, more sense they doing something obvious like creating a fake company the 'day' after someone leaves Google.

fujipadam 3 days ago 0 replies      
Considering Uber's past unethical behavior, I am sure they stole tech. The problem is that it is very difficult to prove it.

If it is proved, there should be actual consequences for the executives, including jail.

PascLeRasc 3 days ago 1 reply      
Is this Otto the same as www.ottomotors.com? That site seems slightly fake, like the kind of site Hooli in Silicon Valley would have.
tim333 2 days ago 0 replies      
So if Uber stole Google's designs but are not using them in their present self driving stuff can Google do much?
huangc10 3 days ago 1 reply      
If this is true, it's kind of a brilliant scheme, you know, in an evil "I'm going to take over the world" sorta way.
asafira 3 days ago 0 replies      
Since parts of this are public information, when is the next bit of information expected to come out?
bunderbunder 2 days ago 0 replies      
This page loads a 2400x1800 *.jpg into a (on my screen) 372x279 image element.

I believe no further comment is necessary.

valuearb 3 days ago 2 replies      
The Rust Libs Blitz rust-lang.org
411 points by aturon  1 day ago   118 comments top 16
dmix 1 day ago 3 replies      
This is something Haskell could really benefit from. Largely just through writing documentation for common libraries.

A post was recently on the frontpage of HN about using Haskell in production [1] that divided the common documentation experience between "hard" and "soft" docs. Far too often with Haskell you only get the 'hard' docs where you get descriptions of functionality and functions but it lacks why (and cohesively how) you would want to use the various functionality.

This makes the strong assumption that you are already deeply familiar with the use case and implementation concept.

This may apply to Rust as well. Rust will likely attract experienced developers, much like Haskell, where in most cases a decent level of code quality would be anticipated. But one of the hardest things to get right as an OSS developer is documentation. You're often so busy with the burden of maintenance that the explanatory side gets sidelined. Especially as a library and the underlying language evolves. So I hope this is a priority focus during their reviews.

[1] https://news.ycombinator.com/item?id=14266462

z1mm32m4n 1 day ago 1 reply      
I really love seeing articles like this come out about Rust.

It's language design the way it should be: incorporating the cutting edge ideas from academia while still striving to cater to beginners; drawing on the strengths of other languages communities to build out good library and solutions to package management; designing everything in the open, and constantly seeking feedback from their users.

It's a great blend of theoretical CS, HCI, computer systems, and application development, and it's always fun to hear about what they're up to.

kibwen 1 day ago 3 replies      
I'm going to use this opportunity to second the suggestion that people consider taking this year's Rust community survey: https://blog.rust-lang.org/2017/05/03/survey.html . I know it's mentioned in the post, but I figure the number of people reading the comments is much larger than the number who actually click through the link. :P And even if you don't or have never used Rust, we still value your feedback!
mintplant 1 day ago 1 reply      
May I suggest the bytes crate [0]? It's one of those small libraries providing a key building block (mutable and immutable byte buffers), and is a dependency of tokio-io and any other crate which implements tokio-io's Encoder/Decoder traits.

[0] https://crates.io/crates/bytes

eriknstr 19 hours ago 2 replies      
The article also links the state of rust survey. I visited the survey with intent to answer it but the first question "Do you use Rust?" only has the following three alternatives for an answer:

- "Yes"

- "No, I stopped using Rust"

- "No, I've never used Rust"

When making a survey the alternatives for the answers are very important. I feel that none of these alternatives apply to me. That's bad. Unfortunate because I would have liked to participate in the survey.

I've done a little bit of beginner programming in Rust in order to try and learn the language. However I haven't yet used it to implement anything actually useful so I wouldn't say "yes, I use Rust". It's been a while since last I did something in Rust but I wouldn't say "no, I stopped using Rust" because to me that implies that I have decided that Rust is not for me, which is not something that I feel, I want to use it, I just keep pushing it down on the list of things to do because other more immediate desires and problems keep popping up.

ssdfe 1 day ago 1 reply      
I do wish cargo packages were namespaced a la Github. Squatting on usernames is one thing, but package and project names are often the only way you hear about something. cargo react-svg might be a terrible project or a good quality one maintained by facebook, but you wouldn't know from the name. Because of the name, it'll be at least somewhat downloaded if that's a common need. It makes grouping by org difficult too.
stcredzero 1 day ago 3 replies      
> There's a countervailing mindset which, in its harshest terms, says the standard library is where code goes to die

This can be addressed in a language with sufficient annotation and good parser tools. In some future language, there should be a unification between the version control, the de-facto codesharing site, language/library versions, and syntax-driven tools to automatically rewrite code.

It should be possible to "publish" a language and its libraries such that any breaking changes will automatically be updated when you switch library versions. (This should also be applicable to Entity-Relation diagrams and Object-Relational mappings -- those can be treated as a versioned library.)

wyldfire 1 day ago 1 reply      
> The product of this process will be a mature core of libraries together with a set of API guidelines that Rust authors can follow to gain insight into the design of Rust and level up crates of their own interest.

Is there a plan to make these things statically checkable by rustc/rustfmt/rust-tidy or something of that sort?

dasmoth 18 hours ago 3 replies      
I realise this is being done with the best of intentions and will probably be a big net positive in practice, but something about the way it has been presented here rubs me up the wrong way.

In short, the blog post makes very little mention of the role of the primary author(s) of the libraries in question, beyond "Every two weeks, hold a library team meeting [...] with the author in attendance." While I imagine the reality will be quite different, this sounds an awful lot like "oi, you, code review in my office now!"

One of the attractions of working on open source is that it offers more scope for autonomy and individual recognition than the typical commercial software job. It's slightly alarming that this doesn't seem to be recognised here.

As I say, I'm sure the reality will be fine (and I remain very keen to give Rust a serious try when time permits), but the rather collectivist presentation here is a tiny bit off-putting.

bpicolo 1 day ago 1 reply      
One thing I don't see here: it's difficult to integrate libs into e.g. parallel code when they don't derive all the usual traits (Copy). Will one goal be to set deriving standards for the libs being reviewed?
bhickey 1 day ago 1 reply      
bstrie is coming by my place tomorrow, we're going to take a stab at overhauling `rand`. We must've been discussing this for two years.
Animats 1 day ago 9 replies      
Does the "Rust standard of quality" for these crucial crates include "no unsafe code"?

"Vec" currently needs unsafe code, because Rust doesn't have the expressive power to talk about a partially initialized array. Everything else with unsafe code is an optimization. Often a premature one. Maps should be built on "Vec", for example.

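To make that concrete, here's a minimal sketch (my own, not from the thread) of the kind of unsafe code a Vec-like buffer needs: safe Rust has no type meaning "capacity for 4 elements, only the first 3 initialized", so the length bookkeeping is a promise made in unsafe code.

```rust
fn main() {
    // Reserve room for 4 elements; none are initialized yet.
    let mut v: Vec<u32> = Vec::with_capacity(4);
    unsafe {
        // Write the first 3 slots directly into the uninitialized buffer.
        for i in 0..3 {
            std::ptr::write(v.as_mut_ptr().offset(i as isize), (i as u32) * 10);
        }
        // Promise to the compiler that slots 0..3 are now initialized.
        v.set_len(3);
    }
    assert_eq!(v, vec![0, 10, 20]);
}
```

The type system can't check that promise; get `set_len` wrong and reading the vector is undefined behavior, which is exactly the expressive-power gap being described.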
jhasse 1 day ago 1 reply      
My biggest gripe with the crate situation is that some of them require nightly -- e.g. everything coroutine-related, AFAIK.
microcolonel 1 day ago 3 replies      
Rust sorta has a de-facto code style. It'd be interesting to add tooling to cargo to make it obvious how to comply with the evolved standard style for Rust.
modeless 1 day ago 3 replies      
Unfortunate name. I thought this was about libz aka zlib.
luck_fenovo 1 day ago 3 replies      
It is an unfortunate name. I don't know why, given how inclusive the Rust community usually is and how eager they were to remove master/slave terminology, they would choose to announce this effort under the banner of Nazi war tactics.
Why does Google prepend while(1); to their JSON responses? stackoverflow.com
455 points by vikas0380  12 hours ago   87 comments top 12
winteriscoming 11 hours ago 5 replies      
Every time I read about such constructs, it makes me realize, as a regular developer, how complex web application security is and how difficult it is to think through and protect your application against each and every such potential problem.
c0achmcguirk 10 hours ago 0 replies      
I believe this hack (JSON Hijacking) was discovered by Jeremiah Grossman in 2005[1].

It's fascinating to read how he discovered it and how quickly Google responded.

[1] - http://blog.jeremiahgrossman.com/2006/01/advanced-web-attack...

samfisher83 12 hours ago 9 replies      
Why don't browsers strip cookies when they are doing cross domain javascript fetches?
westoque 11 hours ago 4 replies      
I wondered the same thing years ago. I always assumed browsers would have implemented other security measures so that websites wouldn't have to do this.

Around 90 something percent of websites I visit don't implement that `for(;;)` or `while(1)` solution.

So are we saying that they're vulnerable sites?

xg15 11 hours ago 2 replies      
I had a hunch that this is to prevent people from including the resource in a script tag - but I always wondered how they'd access the data, since a JSON expression on its own should technically be a no-op when interpreted as JS (or so I thought).

The overridden array constructor was the missing link.

Though couldn't you have it easier by making sure your top-level JSON structure is always an object?

As far as I know, while a standalone array expression like [1, 2]; is a valid JS statement, a standalone object literal like {"a": 1}; is not: the leading brace is parsed as the start of a block, so it produces a syntax error.
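The asymmetry is easy to check in a JS engine (a quick sketch of my own; the exact error details vary by engine): a bare array literal parses as an expression statement, while a script that starts with `{` is parsed as a block.

```javascript
// A bare array literal is a valid script: it parses as an expression statement.
eval('["a", "b"];');

// A bare object literal is not: the leading "{" opens a block, and
// `"a": 1` is not a valid statement inside it.
let threw = false;
try {
  eval('{"a": 1};');
} catch (e) {
  threw = e instanceof SyntaxError;
}
console.log(threw); // true
```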

zoren 11 hours ago 0 replies      
That is one weird array in Google's reply. Looks like it could have been an object instead, in which case JSON hijacking wouldn't be a problem.
CaliforniaKarl 12 hours ago 1 reply      
I haven't worked with JSON like that before. Do JSON parsers properly ignore the stuff Google puts in, or do you have to strip it out before parsing?
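For what it's worth, standard parsers don't ignore the guard -- `JSON.parse` throws on it -- so clients that expect the prefix strip it before parsing. A minimal sketch of my own (the guard strings here are illustrative; match whatever your endpoint actually sends):

```javascript
function parseGuardedJson(text) {
  // Common anti-hijacking prefixes; this list is illustrative, not exhaustive.
  const guards = ["while(1);", "for(;;);", ")]}'"];
  for (const g of guards) {
    if (text.startsWith(g)) {
      // Drop the guard and parse the remainder as ordinary JSON.
      text = text.slice(g.length);
      break;
    }
  }
  return JSON.parse(text);
}

console.log(parseGuardedJson('while(1);["still", "parses"]'));
```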
the_mitsuhiko 8 hours ago 0 replies      
Pretty sure browsers no longer permit overriding the constructors used for literals.
frik 11 hours ago 2 replies      
FB prepends "for(;;);", which is one char shorter than "while(1);"; that's been the case since 2012/13.

Firebug v2 and Chrome DevTools know how to parse such JSON and ignore that first part. (IE11 and Firefox's newer DevTools can't handle it and just show a plain text string.)

NewEntryHN 10 hours ago 1 reply      
Google uses cookies to authenticate API requests?
tossaway322 10 hours ago 1 reply      
Jeez, why not live w/o JavaScript?

We keep trying to accommodate a defunct language with insoluble problems. Isn't that an error in our thinking processes?


Animats 10 hours ago 1 reply      
Why not "while(0)"? Then an eval wouldn't do anything.
I tried Haskell for 5 years metarabbit.wordpress.com
358 points by sndean  2 days ago   252 comments top 19
arnon 2 days ago 5 replies      
We have a code base of roughly 200,000 lines of Haskell, dealing with high-performance SQL query parsing, compilation, and optimization.

I only remember one situation over the past 5 years that we had a performance issue with Haskell, that was solved by using the profiling capabilities of GHC.

I disagree that performance is hard to figure out. It could be better, yes - but it's not that different from what you'd get with other programming languages.

choxi 2 days ago 8 replies      
If anyone's curious to try out functional programming, I would highly recommend Elm. I haven't been so excited about a language since I went from C to Ruby ten years ago, and Pragmatic Studios has a great course on it (I have no affiliation): https://pragmaticstudio.com/courses/elm
gjkood 2 days ago 15 replies      
Question to the Haskell experts here.

Is Haskell more academic in nature or used heavily in Production environments?

Is there an application sweet spot/domain (say Artificial Intelligence/Machine Learning, etc) where it shines over using other languages (I am not talking about software/language architectural issues like type systems or such)?

I have no experience with Haskell but do use functional languages such as Erlang/Elixir on and off.

hzhou321 2 days ago 7 replies      
> 1. There is a learning curve.

Time and experience can cover up anything. So this does not say much about Haskell other than it is all negative without time and experience.

> 2. Haskell has some very nice libraries

So does NodeJS and (on an abstract level) Microsoft Word. Libraries are infrastructures and investments that (like time and experience) can cover up any shortcomings.

> 3. Haskell libraries are sometimes hard to figure out

That is simply negative, right?

> 4. Haskell sometimes feels like C++

That is also negative, right?

> 5. Performance is hard to figure out

That is also negative, right?

> 6. The easy is hard, the hard is easy

That is a general description of specialty -- unless he means all hard are easy.

> 7. Stack changed the game

Another infrastructure investment.

> 8. Summary: Haskell is a great programming language.

... I am a bit lost ... But if I read it as an attitude, it explains a lot about the existence of infrastructure and investment. Will overcomes anything.

throwaway110034 2 days ago 5 replies      
I will never, ever use Haskell in production because of its default evaluation strategy, the wrongness of which was tacitly conceded not long ago with the addition of the strictness pragma (which only works per-module) to GHC.

I think it's especially telling that its community skews so heavily towards this blogger/monad tutorial writer dilettante demographic rather than the D. Richard Hipp/Walter Bright 'actually gets real work done' demographic. I know which of the two I'd rather be in. Haskellers are even worse than Lispers in this regard. For the amount of noise about Haskell, you'd expect to see high-quality operating system kernels, IDEs, or RDBMSs written in it by now. Instead its killer apps are a tiling window manager, a document converter, and a DVCS so slow and corruption-prone even they avoid it in favor of Git.

m-j-fox 2 days ago 8 replies      
> very hard to understand why a function could be useful in the first place

So true.


> mfix :: (a -> m a) -> m a

> The fixed point of a monadic computation. mfix f executes the action f only once, with the eventual output fed back as the input. Hence f should not be strict, for then mfix f would diverge.

But why tho?

jes5199 2 days ago 3 replies      
Haskell: where difficult problems are trivial, and where trivial problems are the subject of ongoing academic research
matt_wulfeck 2 days ago 5 replies      
> The same is not true of Haskell. If you have never looked at Haskell code, you may have difficulty following even simple functions.

Why is it that people talk about this almost as if it's a virtue of the language? As if the fact that it's so inscrutable proves that it's valuable, different, and on a higher plane of computing.

fizixer 2 days ago 0 replies      
Wow, really good writeup. And on point regarding '5 days' vs '5 years' approach.

And it really confirms my biases against the language.

dmix 2 days ago 0 replies      
> The easy is hard, the hard is easy

Probably the best single line description of Haskell.

The learning curve helps with the former (easy) part, the latter (hard) part contains some really brilliant ideas where you start to wonder why so many people still use other languages for everything, then you remember how hard the easy stuff can be...

nicolashahn 2 days ago 0 replies      
I don't think it takes 5 years to learn this stuff. Nothing stood out to me and I only used Haskell for 10 weeks for a single college course 2 years ago. It's all true though.
devrandomguy 2 days ago 1 reply      
> Types / Type-driven development Rating: Best in class

> Haskell definitely does not have the most advanced type system (not even close if you count research languages) but out of all languages that are actually used in production Haskell is probably at the top.

What are these other research languages, that have such incredible type systems? Do they usually have implemented compilers, or would they only be described in an abstract form? Can I explore them for fun and curiosity?

KirinDave 2 days ago 0 replies      
I strongly agree with most of this. I've been so pro-PureScript because I think we need a break from Haskell into something new but related. Haskell is great, but it has so much baggage that it's tough to enter into.
Maro 2 days ago 1 reply      
I have tried to use Haskell a number of times in the past ~5 years for small-scale projects and witnessed others try to use it, and most of these projects (actually I think all) have resulted in failure / rewrites in more simple/plain-vanilla languages like Python/Go. I keep thinking Haskell is interesting, but at some point I had to force myself to stop investing time in it because I kept concluding it's not a good investment of time for a move-fast/practical person like me. I again had to remind myself when I read this post.

Some reasons I remember for the various failures, in no particular order:

- steep learning curve = experienced (in other languages) programmers having a tough time not being productive for weeks/months in the new language, with no clear payoff for the kind of projects they're working on

- sometimes/often side-effects/states/global vars/hackyness is what I want, because I'm experimenting with something in the code; and if I'm not sure if this code will be around in 3 months, I want to leave the mess in and not refactor it

- in general, I think all-the-way pure, no side effects is too much; I think J. Carmack said something along the lines of: Haskell has good ideas which should be imported into more generally useful languages like C++ etc., e.g. the gamestate in an FPS game should be write-once, as it makes the engine/architecture easier to understand (but in general the language should support side effects)

- I found the type system to be cumbersome: I kept not being able to model things the way I wanted to and running into annoyances; I find classes/objects/templates etc from the C++/Java/Python/whatever world to be more useful for modeling applications

- when the spec of the system keeps changing (=the norm in lean/cont.delivery environments), it's cumbersome/not practical to keep updating the types and deal with the cascading effects

- weird "bugs" due to how the VM evaluates the program (usually around laziness/lists) leading to memory leaks; when I was chasing these issues I always felt like I was wasting my time trying to convince the GHC runtime to do X, which would be trivial in an imperative language where I just write the program to do X and I'm done

- cryptic ghc compile errors regarding types (granted, this is similar in C++ with templates and STL..)

- if it compiles it's good => fallacy we kept running into

- type system seemed not a good fit for certain common use-cases, like parsing dynamic/messy things like json

Working at Facebook for the last year and seeing the PHP/Hack codebase which powers this incredibly successful product/company has further eroded my interest in Haskell: Facebook's slow transition from PHP to Hack (=win) shows that some level of strictness/typing/etc is important, but it's pointless to overdo the purity. Just pick sth which is good enough, make sure it has outstanding all-around tooling, have good code-review, and then focus on the product you're building, not the language.

I'm not saying Haskell is shit, I just don't care about it anymore. I'm happy if people get good use out of it, clearly there are problem spaces that are compact, well-defined and correctness is super-important (like parsing).

nabla9 2 days ago 0 replies      
> Performance is hard to figure out

1000x changes in performance is not a problem if:

1. Performance of one module is not overly dependent on the code that uses it.

2. Performance never degrades by an order of magnitude with new compiler releases.

dlwdlw 2 days ago 1 reply      
The thing with enlightenment level ideas isn't that they aren't enlightening but that they attract hype. Individuals pretending to be enlightened if you will.

A scientific mindset as well as liberalism are also ideas where new proponents often want to draw a line in the sand to stratify people into superior and inferior. The original proponents were chasing a higher level of quality for all, but the need for social stratification weaponizes and gates ideas.

Safety1stClyde 2 days ago 2 replies      
> If you read an article from 10 years ago about the best way to do something in the language, that article is probably outdated by two generations.

Thank you, that is all I need to know about Haskell. I won't be learning Haskell then, in the same way that I won't have anything to do with C++. I don't have enough time to use these fashion-dominated and fad-obsessed programming languages.

davidrm 2 days ago 2 replies      
This has been the most disappointing blog post I have read in quite some time.
anorphirith 2 days ago 4 replies      
we should forbid click bait titles on HN, it's an insult to the audience's intelligence

Clickbait titles are so common we think they're normal titles. Here's how he did it: you create a craving for an answer, then you offer a solution for that craving. "Here's how it was" ==> that's the trick. Also, "here's how" should never be used in a title; we all know that the title's subject IS what you're going to talk about. A pre-clickbait-era title would have sounded like: "Learnings after using Haskell for 5 years".

Soylent Closes $50M Series B Round Led by GV soylent.com
329 points by thejacenxpress  2 days ago   630 comments top 67
johnfn 2 days ago 19 replies      
Soylent gets a lot of hate (I'm looking forward to skipping out on this thread before the inevitable negative comments from people who assume that Soylent consumers will eat nothing but Soylent for the rest of their lives), but to me, it's solved a large problem in my life: what to eat when I'm hungry but I don't have enough time to prepare a full meal. It can happen every now and then when I'm rushing around, and soylent blows away whatever I'd eat before (nothing, some Mexican I bought in the Mission, clif bars, etc).

I've actually switched over to a product called Ample which is similar to Soylent but a bit more health conscious with ingredient choice. Still, I've got nothing against Soylent.

rubatuga 2 days ago 23 replies      
Although I was initially brought onto the hype train by great marketing and the promises of health and complete nutrition, I realize that my initial reliance on Soylent was actually part of a deeper problem, brought on by problems such as depression and inappropriate time management. When in reality I should have had enough time to eat out or possibly even cook a meal, I found myself relying on Soylent. I didn't leave my room, and had trouble doing anything. I soon began to lose my appetite and had to force myself to gulp it down, attempting to make sure I wouldn't starve myself. Drinking Soylent was ruining my health.

I think many people approach Soylent as a way to solve some of their problems, but people should realize it won't be and can't be. Another person I knew had bought a few boxes of Soylent attempting to lose weight. In reality, she did not change her weight appreciably, as all she did was consume the same number of calories she would have otherwise.

Lastly I would like to add that there is some debate over the actual nutritional efficacy of the composition of Soylent. If you look at the bioavailability of their calcium supplement, calcium carbonate, it is significantly less than the one found in milk, calcium phosphate. However, Soylent will still claim that it is possible to have 100% D.V. of calcium with 5 bottles.

Overall, Soylent is probably not a healthy solution to your problems.

dreammakr 2 days ago 2 replies      
Long-time lurker, first time commenter. I've seen several people mention that there have not been meal replacement products like Soylent ever. The nutritional profile reminds me of EAS Myoplex + additional vitamins. Myoplex has been around since the mid 90s and was always marketed as a meal replacement. Learned about it from a bodybuilder website when trying to gain weight and easily "eat" extra meals. They have never marketed in the way Soylent has but it serves the same purpose. I took it to replace at least one meal a day for about two years. Blood work during that time period was normal and it worked well imo. Nothing against Soylent but the product is not a new concept.
tommynicholas 2 days ago 3 replies      
The new cacao Soylent is so much tastier than I expected it would be. Despite all the mocking Soylent gets, the product is excellent. Maybe it's groundbreaking and maybe it's not, but it's definitely helpful to my life.
Afforess 2 days ago 4 replies      
The crowds here saying that Soylent is "just" rebranded Ensure/Slimfast/etc sound like the same voices that said Dropbox was rebranded rsync/scp/ftp. The arguments are cookie-cutter and wrong for the same reasons they were with Dropbox.
stevenwu 2 days ago 1 reply      
On the topic of raising/burning $:

I remember seeing that they had a position open for a software engineering role. If I remember the job description correctly, they've built out their own online store?

I recently saw that you can buy their products through Amazon - I wonder if it was money well spent to roll out their own web store versus using Amazon/Shopify. As a past customer I don't remember seeing any particularly unique feature that made hiring in-house staff for this aspect necessary.

tonydiv 2 days ago 5 replies      
It amazes me that VCs find this type of business interesting because it's not defensible in many of the ways a software company is. I am also not sure why the company wants to raise this much money -- they need to continue growing like crazy (outside of tech regions where engineers who are unwilling to cook live) or go bust.

Nonetheless, the drink is ok. I tried it for a few months. Instead of avoiding cooking, I have embraced it, and now cook incredible meals for $3-$4 using my Joule sous vide. Eating real food has changed my mood significantly.

If anyone in SF wants to buy a whole box of Soylent Original (white bottles), I will sell an extra I have for 40% off. Must pick up, located at Chavez/Bryant.

Karunamon 2 days ago 0 replies      
To hopefully short-circuit a lot of pointless debate, the product you're about to mention is not equivalent to Soylent unless all of the following conditions are true:

1. It contains enough calories for an average healthy adult to live on (2000/cal/day, give or take)

2. The sugars used are high glycemic index

3. You will incur no major nutritional deficiencies/toxicities by long term use

4. It costs no more than $14/day (or $8/day for the powder) while satisfying all above conditions

Most diet drinks fail on condition 1, most meal replacements you're aware of like Ensure fail on conditions 2 and 3, and the leftovers usually fail on condition 4.

zorbadgreek 2 days ago 5 replies      
I'm trying to understand what Soylent's business moat is. My inclination is economies of scale to drive cost of goods down, like many food companies.

To that end, I don't see their product as all that unique or difficult to replicate, and I also foresee headwinds for them if/when they try to market to a broader base of consumers who already have powders, shakes, bars, and hundreds of other meal substitutes to choose from.

NathanCH 2 days ago 1 reply      
Soylent as a company has been run terribly. They've had numerous recalls and the recent price increase for Canadian customers is highway robbery.

That said, Soylent has improved my diet significantly. My entire life I have struggled to consume enough calories. Adding one serving of Soylent per day has allowed me to do two things:

1. Increase the number of calories I consume so I am in a daily surplus (I've gained a healthy 13 pounds since 2015 thanks to Soylent).

2. More importantly Soylent has accustomed me to eating larger meals. I can actually go to a restaurant and eat a full meal. I'm sure you can understand how much that has improved my social life.

Prior to Soylent I was consuming Ensure daily for more than five years but it's not enough calories to make a difference plus it's way more expensive.

mks40 2 days ago 12 replies      
A question to Soylent customers:

Do you use it as the occasional convenient meal replacement, or do you go as far as replacing all real food?

I suspect there is both, but what I am getting at is how much Soylent's long-term success depends on people seeing eating as a nuisance that should be optimized away versus something that should be savoured and enjoyed.

To me this is in the context of the larger question of personal utility maximisation. In the grand scheme of things, we have just started being able to really monitor and improve all aspects of our lives (in terms of time spent, convenience), and there is the question of how far we (most people/potential customers) ultimately want to go. It has become clear that there is the potential to optimise away friction/time spent in almost all human habits, but it is not yet clear if we really want to keep going down that route.

Will we keep optimizing things like meals just because we can until there are (conceivably) nutrient implants that make eating unnecessary, or will we sort of revert and see that maximising utility of every interaction does not lead to overall greater satisfaction?

In one world, Soylent could eventually dominate; in the other, it will remain a niche product because eating and food are too important to most people, culturally speaking as well.

tdees40 2 days ago 3 replies      
It's a food company, and its product is fully baked. Why aren't they bootstrapping? Are they not profitable? If so, how do they have a business? If I sold ketchup at a loss, would I be able to close a $50M financing round?
victorhooi 1 day ago 1 reply      
I just completed the Everest Base Camp trek with my wife - I used Soylent (version 1.4) for the hike. So I basically subsisted on Soylent for 2 weeks of hiking/climbing up to 5,500 metres above sea level.

I did 1 packet per day (2000 calories), supplemented with some snack food on the trail (dried nuts and fruit, Stinger-brand honey waffles, muesli bars etc.).

Motivation was firstly as an experiment (to see how I would cope), secondly because I wanted to accurately control/measure my caloric intake, and thirdly, because I was somewhat paranoid about getting food poisoning on the trek. (I used a MSR Guardian to provide clean filtered water for mixing up Soylent).

I didn't really notice any odd effects - and it went better than expected. Didn't get food poisoning (wife got diarrhoea, but she ate local food) - was a bit hungry on some days (in hindsight, 2,000 calories was a bit low - I upped it to 2,500 calories on the day I climbed Kala Pattar).

All I can say was, Soylent was great for this use-case, and I'm a firm believer now. At home, I only use Soylent for when I have no time to cook, and need a reasonably healthy/complete meal - if your alternative is going out for a late-night kebab, or 24-hour fast-food, it's not a hard choice for me =).

ckastner 2 days ago 0 replies      
I find it odd how popular Soylent has become. I always considered it to be the product that it was in the movie: the absolute minimum of nourishment given a lack of resources. A dystopian nightmare.

A popular argument seems to be "I don't have time eat something proper". To me, that just replaces the lack of one resource (the ones in the movie) with another (time).

mastarubio 2 days ago 2 replies      
I have been using Soylent for breakfast for roughly 1.5 years and it has been great. I've lost weight and I feel healthier. I haven't got sick during this time either from a flu or serious cold. It's a great time saver when used as a breakfast food and will help save you precious time in the morning before you go to work. Time that you could spend sleeping rather than stuck in a drive-through or making something. Works for my busy lifestyle. And I like the fact that I am getting all those nutrients. While I realize it's not perfect now, it's nice to know that they are continually trying to improve it with the different versions. If it's at least half as nutritious as Soylent claims it is, that's a win in my book. And to think it might actually get to that holy grail where it's genuinely great for you then that will be something special. Realistically, this is probably a slow march, but one that I am proud to be a part of. Mixes great with fruit, peanut butter, honey and other food items too which I know is healthy for me.
matthewrudy 2 days ago 0 replies      
I've never tried Soylent, but I do have to drink Ensure many times a day.

Ensure has a massive market, $billions in annual sales, and medically proven results.

The medical market is not where Soylent is going right now, But it is massive and proven.

If they could market to younger, hipper folks who've been prescribed Ensure, but would spend their own money for something nicer... That'd be me.

(BTW: I finally ordered some huel just now... Really sick of Ensure, will give that a try)

VikingCoder 2 days ago 3 replies      
Soylent was fine. I maintained weight, and was eating better, and the food was easy.

But I'm in a medical weight loss program now, and I'm loving it. It's doctor supervised, I started out at 400 pounds, and I'm down to 360 after five weeks. It's medically supervised because I'm in ketosis, which can be very dangerous. So best to have blood drawn regularly, etc.

But the food is great. You can also get them as your own supplements / replacements.


Right now I'm eating 1,000 calories a day. I'll be going down to 800 calories per day. And I'm burning about 2,000 calories of my own fat every day (that's a pound). Again, this is a dangerous diet, but it's medically supervised.

I feel great, I don't feel "hungry" all the time. It's awesome.

md2be 2 days ago 0 replies      
This is a play on the large soft drink beverage companies, who are literally thirsty for companies to replace their soda portfolios. Bai was recently acquired by Dr Pepper for almost $2 billion, so a 10x return on this round is certainly possible.
shas3 2 days ago 0 replies      
I am one of those people who got horrible nausea eating a couple of batches of the Soylent bar. I really liked the taste and the nutrition profile. But I simply cannot tolerate any Soylent products after the episodes of nausea. There is only one other food (an obscure Indian snack) that ever created a similar aversion in me. I don't know when, if ever I'll get over the association of Soylent with nausea in my head. I wish it had never happened so that I could have continued using their products.
joshjkim 2 days ago 0 replies      
What people gloss over about Soylent on their way to the oft-repeated "it's just like SlimFast wtf" point is that it's not the specific product that's interesting/innovative/valuable (it is kinda like SlimFast...), it's taking that product to an entirely new and arguably larger market. I have no idea how many 20-to-30-somethings working in tech were out there buying SlimFast 10 years ago, but I'm guessing (and I could be wrong!) it's fewer than the number of folks who buy Soylent today. Assuming it can maintain its appeal to "hard-working tech folks", it's arguable that they can also successfully appeal to other professionals in other markets who, again, I would guess were not buying SlimFast or other meal/diet drinks. It's all about marketing and "telling a new story", and I don't mean that in a bad way at all - sure, any company can quickly copy the ingredients, but they will also have to sell the same story, which is totally possible but arguably harder to do as well. That all being said, marketing IMO is only so good a moat, and there are only a few companies who really dominate relying mostly on it (Coke, Hermes, Nike to name a few), so I'm not sure the new market and their story are worth this large of an investment - but who knows, it very well could be!
Rudism 2 days ago 0 replies      
I've been trying out alternate-day fasting (alternating between 500-600 calorie "fasting" days and normal eating days). Primarily to help me lose a few pounds, but also in response to all of the recent publicized studies about how fasting and calorie restriction may have other overall health benefits.

Usually on my 500 calorie days I'll just drink a couple Boosts. I'd like to consider Soylent as an option for the fasting days as well, but from a cost perspective it can't really compete with the other alternatives out there (Boost, Ensure, and various no-name store brands available from places like Sam's Club and Costco). Liquid Soylent runs between $2.69-$3.09/400kcal, whereas high protein Boost (which is not even the cheapest option if you're willing to go with no-name brands) runs at $1.93/400kcal. Going with Soylent Powder can bring your costs down more in line at $1.54/400kcal, but the added hassle of having to mix it yourself doesn't make this a very attractive option in comparison.

I guess what I'm ultimately getting at is, assuming you are just using Soylent to supplement an otherwise normal diet, I don't understand its appeal over other less expensive nutritional meal supplements like Boost and Ensure.

wakkaflokka 2 days ago 0 replies      
I've been drinking Soylent every day for breakfast for the past year (the bottles). I find it really convenient, and unlike SlimFast and other shakes, it doesn't seem overly sweet to me. It gets rid of my hunger with no fuss, no mess, and no fanfare. Ideally, I could eat a healthy breakfast, but in reality I know that I'd just be pounding down some cereal bars or something, which would arguably be worse nutrition-wise than Soylent.
datashovel 2 days ago 0 replies      
Because of all the negativity in Soylent threads I always feel obliged to add my opinion. Soylent has been an extremely positive addition to my life.
kingkawn 2 days ago 1 reply      
The idea that nutrition can be boiled down in this way feels like it will inevitably be missing something severely important that the modeling doesn't account for. The modelers afterwards will excuse the oversight by saying "look, this thing was so small, who would've thought to measure in such and such a way?" Nature.
Animats 2 days ago 1 reply      
Does this mean they're going to make their own product? Currently, Soylent, the company, is just a marketing operation. Everything else is outsourced.

That's not an unusual strategy for hype-based products. Skyy Vodka and WD-40 were completely outsourced. Skyy Vodka was originally made by Frank-Lin Distillers Products in San Jose, the company that makes most of the low-end booze on the West Coast. Frank-Lin buys bulk ethanol by the tank car load (they have their own railroad sidings), does a little post-processing on the ethanol, takes in tap water and runs it through a deionizing plant, mixes them, adds flavoring, and bottles. They have a really fancy automated bottling line which can handle about a thousand different bottles and can change bottle types automatically. This is called product differentiation.

JohnnyConatus 2 days ago 1 reply      
Did anyone else have this problem: you enjoyed the product at first but after X number of bottles (like case 2 for me) your body started to react poorly to it and the taste became repulsive?

Not trying to badmouth soylent, I had a similar experience with a brand of granola bars.

bcaulfield 1 day ago 0 replies      
Coffiest easily exceeded my modest expectations. Quick, easy to digest when I'm nervous, and the combination of caffeine and theanine is... nice. My diet is trash, and it's vastly better than not eating or stopping for an Egg McMuffin at the drive through.

Soylent has gotten me off fast food entirely. I keep a few bottles at home and at work, so that I can get a meal during my commute that doesn't come from a drive through.

Balgair 2 days ago 0 replies      
Oh boy, another Soylent thread.... Here is how the comments are going to go: Half the commenters are going to say Soylent is not supposed to be for every meal, it's just for when you are too busy to eat. The other half cannot fathom ever using Soylent because if they are too busy to eat they will quit their job first, no exceptions. Neither side can bridge the gap, as food culture is very unique to each person. Let the market sort it all out; props to Soylent for letting that happen and not wringing their hands over all of it.

(For the record, I would quit my job before skipping meals; they are actually that important to me staying sane)

rexreed 2 days ago 0 replies      
There's no saying more true than you are what you eat (or drink). Soylent drinkers are definitely Soylent people. I prefer a world of flavor, spice, and variety. I also wonder if it's a coincidence that the funders of the Juicero (solving a problem no one knew they had for people who clearly have too much disposable income) are the same as the funders of Soylent. I know there's a high overlap of HN readers and Soylent drinkers, but I think there's a big disconnect with the rest of the non-Silicon Valley populace.
hvmonk 2 days ago 1 reply      
Is there any long-term medical study on the effects of this product?

I think our body has organs which release various gastric juices to digest the food we eat. It is not only about how much calorie/protein one is consuming; there are also some useful by-products which help in the overall functioning of the body as well. A very simple analogy is only drinking fruit juice instead of eating fruit raw. We are not taking in the fibers which help in digestion, slow decomposition and good bowel movements.

I am very skeptical about approaches like this where we measure our food just in terms of calories, vitamins, proteins and then consume them directly in that format.

costcopizza 2 days ago 0 replies      
Ensure wrapped up in a nice minimalist package.
gavanwoolery 2 days ago 0 replies      
As an anecdotal/tiny success story, I drank Soylent exclusively for a month and lost 10 pounds - of course, this was by dramatically reducing my calories and exercising, not by virtue of Soylent alone. That said, I did find Soylent to be great on two accounts - it is not something you will consume for fun/pleasure/killing boredom, and it is easy to measure your calorie intake if you are consuming it solely. Side note: I am not a doctor or nutritionist, do not take my story as scientifically-grounded advice.
muratmutlu 2 days ago 0 replies      
Anyone know the difference between Soylent and a good quality weight gainer like Reflex Instant Mass Pro?

Nutrition Facts

Reflex Instant Mass Pro: https://www.reflexnutrition.com/instant-mass-pro/


akvadrako 2 days ago 0 replies      
I can't stand reading these posts about Soylent. There is so much irrational hate backed up by nothing more than incomplete and irrelevant arguments. People recommend virtually any other product even if it misses half the qualities of Soylent; things like nuts + fruits, whey protein, ensure, clif bars or even cooking.

I'm fairly certain it's due to an astroturfing campaign, but I don't know who would pay for such a thing.

Kattywumpus 2 days ago 1 reply      
I wonder how long it will be until some VC forces the inevitable rebranding of the Soylent name.

"We need to reach out to a larger demographic with a name that communicates the value proposition of the product. Liquid Lunch focus-groups well in the demographic of females 18-30, which is where we see our growth trending in future..."

I've always liked the cheekiness of the Soylent name and it's really the only thing that's made me pay the slightest bit of attention to the product.

mikro 2 days ago 0 replies      
I drink a lot of coffee, and for me the Coffiest product is cheaper and healthier than your average Starbucks drink. I also really like the taste of the Cacao drink, which satisfies my sweet tooth whenever I feel like reaching for a bar of chocolate. I don't necessarily view it as a meal replacement, but rather an upgrade to unhealthy things I already eat regularly.
aomix 2 days ago 0 replies      
I still don't understand the strong reactions Soylent gets on either side. There's plenty to criticize and to like but reactions to it cluster around it being the end of the world or solution to all your problems.

I'm a fan of the breakfast Soylent (Coffiest). To me it's the best form of the Soylent idea.

ebbv 2 days ago 0 replies      
If Soylent were a good idea it wouldn't need a $50M funding round at this point. People know about it. Most people I know who've tried it after a while didn't like it any more and abandoned it.

And now they've brought out flavors, basically turning it into an expensive meal replacement. It's ridiculous.

epmaybe 2 days ago 0 replies      
One thing I'm curious about: do liquid diets like Soylent change the brush border in our intestines in an appreciable way? If I went completely soylent for a few months, and then tried to eat something more...raw, would there be any changes in digestion?
awl130 2 days ago 0 replies      
I'm reserving judgement until I know exactly how Soylent affects our microbiome. This study seems to just have gotten started: https://mycrobes.berkeley.edu/the-study/
eddieone 2 days ago 0 replies      
As a person who has researched Soylent, I would not describe it as cheap or healthy. Most of the people with opposing opinions seem to think it's a magic weight loss supplement. In reality, the properties that cause weight loss seem to be the low quality ingredients.
venture_lol 2 days ago 0 replies      
Live a restrictive life, careful, watching, planning, and get a life expectancy of 85 yrs? A bad turn of luck could mean the end is right around the corner.

Live a wild, debauched, taste-everything, free-for-all life with no care whatsoever and get a life expectancy of 80 yrs? A somewhat lucky draw could see you beyond 90.

Hard choices :)

sebringj 2 days ago 0 replies      
"Everything the body needs..." - the Matrix. Next we get implantable nutrient packs that last for a year. Hey, YCombinator idea? Go fund that. (This might classify as a troll post as it's not particularly relevant, but I just had to, sorry.)
zenkat 1 day ago 0 replies      
Does anyone know the valuation of the company? And more importantly, what justifies that valuation? I have trouble believing that the meal replacement market is all that lucrative.
sachinag 2 days ago 0 replies      
If Forerunner didn't participate in this, then Soylent doesn't have a chance. I'd trust Forerunner over GV (or anyone) on a new consumer brand - and that's what Soylent is, going up against everything from Hint water to Ensure.
arzt 2 days ago 4 replies      
I'm curious about the naming choice as the end of the movie with Heston reveals that eating "Soylent Green" is a form of cannibalism. Does the term "Soylent" signify something outside of the movie?
intrasight 2 days ago 0 replies      
How is Soylent better than buying a quality blender and fresh veggies and protein powder and coconut oil? Or is it for folks who live in a food desert?
ceejay 2 days ago 0 replies      
I used to get annoyed by all the negativity Soylent received from people. Now I just get amused by it. Soylent has been nothing but a positive addition to my life.
skdotdan 2 days ago 0 replies      
The future is food that tastes and feels like real food but with the nutrients that your genetics and physical conditions tell that you should eat. One day...
theprop 2 days ago 0 replies      
The best food advice I've read: don't eat or drink anything that humans haven't been eating or drinking for at least 500 years.
grandalf 2 days ago 1 reply      
I've been wanting to try Soylent but have not done so b/c of the price. Is there an HN promo code or something? I'd try it for a full month.
vthallam 2 days ago 1 reply      
Off topic, but has anyone tried 'Soylent Coffiest'? How do you like it as a breakfast + coffee replacement for a few days a week?
b1gtuna 2 days ago 0 replies      

I have been drinking 12 bottles of Soylent a month. This alone has freed me up from thinking about what to eat for lunch.

mtw 2 days ago 0 replies      
Does anyone know of any clinical trials showing health benefits of Soylent? or adverse health consequences?
rubyfan 2 days ago 0 replies      
Have they figured out the vile angry flatulence problem?
vernie 2 days ago 0 replies      
Maybe... you're not as busy as you believe you are?
zzzzzzzza 2 days ago 0 replies      
personally I love soylent. Curious what the hard numbers are on sales growth/units shipped per month.
aanet 2 days ago 0 replies      
Soylent and Juicero

Rather surprised that nobody, as yet, has made a connection between Soylent and Juicero.

* 1st: Juicing & Nutrition

- There's very little evidence that liquid food / juicing has any benefits for most adults. Most nutritionists worth their salt will advise against juicing. Juicero (and other juice makers) take perfectly good, healthy, nutrient-rich fruits and vegetables and make them less healthy. Ditto for Soylent. Crushing natural foods (vegetables, fruits, any others in their natural form) together to seek out their nutrients, and reconstituting them in powder/liquid form is, by any other fancy name, a juice. The skin on an apple, the seeds in raspberries and the membranes that hold orange segments together: they are all good for you. That is where most of the fiber, as well as many of the antioxidants, phytonutrients, vitamins and minerals are hiding. Fiber is good for your gut; it fills you up and slows the absorption of the sugars you eat, resulting in smaller spikes in insulin. When your body can no longer keep up with your need for insulin, Type 2 diabetes can develop. [1]

- I wonder if people who see the benefits of Soylent/juicing have read Michael Pollan or Marion Nestle. See [1], [2], [3]

* 2nd: Silicon Valley and investments

Both Soylent and Juicero are funded by marquee investors. Here's a brief list for Juicero (Total $118M raised) [4]: GV (nee Google Ventures), KPCB, Abstract, Campbell Soup, Thrive Capital

Here's for Soylent (Total $70M raised) [5]: GV (nee Google Ventures), A16Z, Tao Capital, Index Ventures, YC, Lerner Hippeau, Initialized Capital

What do these have in common? Apart from being in the food business? It is the Food-as-a-Service business model. That is the essential ingredient (no pun) of the business, not the nutrients per se.

In effect, both Soylent and Juicero are products targeted towards high-disposable-income, busy professionals who want convenience, and perhaps the glow of "saving the world from hunger" (whatever that means). Any health benefits are inconsequential at best in the grand scheme of things.

If you value your nutrition and health, you are far better off relying on the tried-and-tested advice from Michael Pollan: "Eat food. Not too much. Mostly plants."

All other rationalization of the time/effort/nutritional benefit of Soylent/Juicero as "saving the world from hunger" is, well, just plain old rationalization by any other name.

[1] [People think juice is good for them. They're wrong. - The Washington Post](https://www.washingtonpost.com/posteverything/wp/2017/04/26/...)

[2] [Books | Michael Pollan](http://michaelpollan.com/books/)

[3] [Marion Nestle - Wikipedia](https://en.wikipedia.org/wiki/Marion_Nestle)

[4] https://www.crunchbase.com/organization/juicero/investors

[5] https://www.crunchbase.com/organization/soylent-corporation#...

catenthusiast 2 days ago 0 replies      
Nerds will buy anything if trendy Silicon Valley thought leaders endorse it.
accountyaccount 2 days ago 2 replies      
Aside from brand, how does soylent differ from ensure?
Questron 2 days ago 6 replies      
Weird. You need food during an eight hour shift?
CPLX 2 days ago 3 replies      
That's quite a chunk of change for rebranding SlimFast for the urban millennial set. I wonder how many other eight figure venture-fundable concepts could be exploited by lurking in the grocery store aisles with a label maker.
moat 2 days ago 6 replies      
I feel like I'm the only one who has ever read the nutrition facts. It's a garbage product full of ingredients I wouldn't aim to put in my body. Love the idea of it all, just not the science behind it.
pinaceae 2 days ago 0 replies      
Soylent - We've put SlimFast on the Internet.

And this after JuiceBro. Amazing.

maverick_iceman 2 days ago 0 replies      
Don't know why GV would invest in Soylent after so many fiascos. Seems like they earn money so easily that they are content to throw it away.
metaphorm 2 days ago 0 replies      
I feel like more people should read this book

In Defense of Food by Michael Pollan


joering2 2 days ago 6 replies      
Serious question -- do they have an FDA approval?? Can they produce and sell it without it?

Recently learnt about some Amish selling home-made honey without FDA approval; now awaiting trial and facing a 20-year jail sentence.

Sorting Two Tons of Lego, the Software Side jacquesmattheij.com
383 points by jacquesm  12 hours ago   66 comments top 17
jph00 11 hours ago 6 replies      
I've really enjoyed talking to Jacques about his lego project over the last few days, and I hope that it will lead to some additional learning materials on course.fast.ai (which he kindly links to in his article) as a result. It's great to see such a thorough writeup of the whole process, which I think is much more interesting and useful than just showing the final result.

The key insight here, to me, is that deep learning saves a lot of time, as well as being more accurate. I hear very frequently people say "I'll just start with something simple - I'm not sure I even need deep learning"... then months later I see that they've built a complex and fragile feature engineering system and are having to maintain thousands of lines of code.

Every time I've heard from someone who has switched from a manual feature engineering approach to deep learning I've heard the same results as Jacques found in his lego sorter: dramatic improvements in accuracy, generally within a few days of work (sometimes even a few hours of work), with far less code to write and maintain. (This is in a fairly biased sample, since I've spent a lot of time with people in medical imaging over the past few years - but I've seen this in time series analysis, NLP, and other areas too.)

I know it's been trendy to hate on deep learning over the last year or so on HN, and I understand the reaction - we all react negatively to heavily hyped tech. And there's been a strong reaction from those who are heavily invested in SVMs/kernel methods, bayesian methods, etc to claim that deep learning isn't theoretically well grounded (which is not really that true any more, but is also beside the point for those that just want to get the best results for their project.)

I'd urge people that haven't really tried to build something with deep learning to have a go, and get your own experience before you come to conclusions.

marze 8 hours ago 1 reply      
Suggestion: why not use three cameras simultaneously, each from a different angle, then classify the three images? Those cameras must be nearly free in cost.

Also, to get more training data, what about setting up a puffer to blow the part back on the belt and tumble it? If you could configure the loader belt to load parts slowly and stop after one is seen, you could automatically re-image the first part an arbitrary number of times by blowing it backwards before letting it move along and restarting the first belt to get another.

And question: do you normalize out color at any stage? As in, classify a black and white image, with a separate classifier for the color?

11thEarlOfMar 10 hours ago 0 replies      
This is true hacking. I mean, at its essential core. The purpose, the methods, the tools, the rationale. If there is an archetype for Hacker, it's jacquesm.


ma2rten 11 hours ago 1 reply      
It sounds like you went through a similar process as the computer vision community over the last couple of decades.

First people used to write classifiers by hand, but they found it's too tedious and unreliable, and has to be redone for each object you want to classify. Then they tried to detect objects using local feature detectors and train a machine learning model to classify objects based on those. This worked much better, but still made some mistakes. Convolutional Neural Networks were already used to classify small images of digits, but people were skeptical they would scale to larger images.

This was the case until AlexNet came along in 2012. Since then the performance of convolutional networks has improved each year. Now they can classify images with performance similar to humans.

ziikutv 11 hours ago 1 reply      
For anyone wondering, that is a USB Microscope camera which can be ordered via Amazon[1].

[1]: https://www.amazon.com/XCSOURCE-Microscope-Endoscope-Magnifi...

justforlego 8 hours ago 1 reply      
Would it be possible to use existing 3D descriptions of the bricks to train the model? The LDraw library contains nearly every LEGO brick [1].

[1]: http://www.ldraw.org/

modeless 11 hours ago 2 replies      
Awesome project!

> I simply don't think it is responsible to splurge on a 4 GPU machine in order to make this project go faster.

2 things: 1. You can rent 8-GPU machines on AWS, Azure or GCE. 2. The incredibly wide applicability of machine learning means that an investment in hardware might not be wasted. Even if you only use the machine for this one project, if it helps you learn more about the field it will probably still be a good investment career wise.

unityByFreedom 2 hours ago 0 replies      
Thank you for posting this follow-up!

I look forward to seeing if you can push it further by leveraging faster hardware in the cloud.

I suspect the training time could cause you to lose interest in iterating improvements. But, how cool would it be to make the project even better =)

RoboTeddy 5 hours ago 0 replies      
> Right now training speed is the bottle-neck, and even though my Nvidia GPU is fast it is not nearly as fast as I would like it to be. It takes a few days to generate a new net from scratch but I simply don't think it is responsible to splurge on a 4 GPU machine in order to make this project go faster.

Easy cloud training: https://www.floydhub.com/

dpkonofa 11 hours ago 2 replies      
I love projects like this because, while it doesn't necessarily have a direct application right away, it solves a piece of a problem that could go a long way to help something else. Reminds me of the skittles/M&M sorting machine that someone built a little while ago. As more projects like this develop, we're teaching computers more and more about visual recognition.

Can't wait for Skynet to go live! :-P

iDemonix 7 hours ago 1 reply      
> Right now training speed is the bottle-neck, and even though my Nvidia GPU is fast it is not nearly as fast as I would like it to be. It takes a few days to generate a new net from scratch but I simply don't think it is responsible to splurge on a 4 GPU machine in order to make this project go faster.

You should stick up a donate button, if you keep writing interesting articles about how it all works, I'd happily throw a few dollars towards the process.

datenwolf 11 hours ago 1 reply      
Cool project!

One question: Wouldn't it have been easier to use a line scan camera and tether line acquisition to the belt's movement by attaching a rotary encoder whose output would trigger individual line scans? That's the standard solution in the industry.

Saus 8 hours ago 2 replies      
Nice work, I enjoyed the write-ups. You wrote that you wanted to sell off complete sets.

Would you be able to first make an inventory of all your available pieces, and then load a DB with (all?) complete sets and let the machine sort each set into one bucket (starting with the most expensive set first)? Or how are you going to get your sets together?

wolfgang42 11 hours ago 1 reply      
> [The stitcher determines] how much the belt with the parts on it has moved since the previous frame (that's the function of that wavy line in the videos in part 1, that wavy line helps to keep track of the belt position even when there are no parts on the belt)

I'm curious about this wavy line--does it need to be specially encoded in any way or did you just squiggle the belt with a marker and let the software figure out how it lines up?

tezza 6 hours ago 0 replies      
excellent writeup.

has anyone applied this sort of thing to voice recognition? i see a lot of computer vision applications, but haven't found any audio classifiers amongst the CV articles

Nexxxeh 11 hours ago 2 replies      
Are there instances where multiple parts look the same from different angles?
wwarner 4 hours ago 0 replies      
great write up, thank you for sharing it.
Build Yourself a Redux zapier.com
372 points by jdeal  1 day ago   152 comments top 18
sergiotapia 1 day ago 16 replies      
If you're looking for something easier to use to help you manage state in your React apps look no further than Mobx. It's pretty incredible how stupid easy it is to use, it kind of feels like cheating.


I've tried to use Redux a couple of times but I just spent way too much time in plumbing code. Code I really don't care about. To be frank this code looks terrible (no fault of the author):

  const handlers = {
    [CREATE_NOTE]: (state, action) => { ... },
    // ... a thousand more of this
  }
Not to mention, not once did I feel happy working with Redux. I'm all about developer UX and working with tools that feel nice to use.

With Mobx you just declare a variable as Observable, then mark your components as Observers, and voila: You have crispy, no-plumbing reactivity.

In a way it kind of feels like Meteor where you save data on the database and it replicates everywhere it's being used.
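The core of that reactivity can be sketched in plain JS. Here's a toy stand-in I put together for illustration only (this is not MobX's actual implementation or API):

```javascript
// Toy observable: assigning to .value re-runs every registered reaction,
// which is roughly the idea behind MobX-style reactivity.
function observable(initial) {
  const reactions = [];
  let current = initial;
  return {
    get value() { return current; },
    set value(next) {
      current = next;
      reactions.forEach(fn => fn(current)); // notify every reaction
    },
    observe(fn) { reactions.push(fn); fn(current); }, // run once immediately
  };
}

const count = observable(0);
let rendered;
count.observe(v => { rendered = `count is ${v}`; }); // plays the observer role
count.value = 3;
// rendered is now 'count is 3'
```

Real MobX does dependency tracking automatically so you don't register reactions by hand, but the "write to state, dependents re-run" loop is the same idea.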

rasmi 1 day ago 3 replies      
If anyone is interested in learning this content through an in-depth video tutorial, I highly recommend Dan Abramov's two-hour "Getting Started with Redux" [1] and the excellent follow-up "Building React Applications with Idiomatic Redux" [2]. This is a great article, but learning Redux more thoroughly directly from the creator himself may be of interest to some!

[1] https://egghead.io/courses/getting-started-with-redux

[2] https://egghead.io/courses/building-react-applications-with-...

dgregd 19 hours ago 3 replies      
I've seen and understood Redux TODO examples.

However, I develop an enterprise CRM app. In the DB there are 200k client records and 500k sales call records. It is implemented as a standard Ruby on Rails / Postgresql web app. It works quite well. It is also pretty straightforward to implement such an app in a Java/PHP MVC framework.

Let's say I would like to implement the UI using React/Redux. How should I start? For example, the app has a calendar month view, and for each day there are 20 sales calls. So the month view has 400 sales calls and client data displayed (date, time, client name, target group).

Do I have to put 400 sales calls and 400 clients' data into a Redux store to display the calendar month view? What about client data search results and pagination? In just a few clicks a user can display hundreds of client records (thousands in the case of the results map view). Do they belong in a Redux store? If a user modifies one sales call record, how is it persisted to the central DB? What about edge cases where some uniqueness conditions have to be checked at the central DB level?

Rails covers all things needed to implement my medium CRM app. When I read Redux TODO tutorials I have a feeling that they cover just 10% of what is needed to implement a full CRM app. Could you please direct me to Redux examples / tutorials on how to implement a full enterprise database app (SugarCRM scale).

PS. to down voters, please write a few words what is wrong with my questions so I can learn what is appropriate to post on HN

joshwcomeau 1 day ago 2 replies      
TFW you see a thread about Redux, and you just _know_ that the comments are going to consist of nitpick complaints.

Contrarian opinion (apparently): Redux is a lifesaver when it comes to complex applications. There's a little more ceremony, but a lot more organization, a lot fewer bugs.

tarr11 1 day ago 1 reply      
I like this article. It's probably a good idea to build your own simple todo app using Redux from scratch first, and then follow this guide. It would make a lot more sense.

Using this as a place to put some thoughts on Redux after having picked it up over the past few weeks.

I have been spending the last few weeks re-writing an "offline-first" mobx React app into Redux, after it started spinning out of control and becoming unmanageable. Mobx provided a lot less guidance on how to structure a larger app (read: more than a few pages with non-trivial remote interactions)

Like React itself, it took me a few weeks to grok the philosophy and architecture, and internalize the key components so that I wasn't getting lost every few lines of code.

I had evaluated Elm earlier in the year but passed on it, as there were some interop issues, and the component ecosystem wasn't as mature as react.

Redux has had the effect of organizing my code and helping me reason about the structure, as well as providing time travel for free.

I found Typescript to be very helpful when building with Redux, specifically when I did something wrong and had to refactor.

I've also been pleasantly surprised at the middleware ecosystem, and how useful and easy to configure it has been.

twfarland 1 day ago 1 reply      
You can replace redux with any FRP library. Your state is a signal/stream/whatever that folds over an initial state with a signal/stream/whatever of actions/messages. Your top level view component should listen and render based on that. Example: https://github.com/twfarland/sprezzatura-acto-mario
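A toy sketch of that fold, in case it helps: the stream and its `scan` method below are made up for illustration and don't match any particular FRP library's API.

```javascript
// State as a fold (scan) over a stream of actions: each incoming action
// is combined with the accumulated state by a reducer function.
function createStream() {
  const listeners = [];
  return {
    push(value) { listeners.forEach(fn => fn(value)); },
    scan(reducer, seed) {
      const out = createStream();
      let acc = seed;
      listeners.push(value => {
        acc = reducer(acc, value); // fold step
        out.push(acc);             // emit the new state downstream
      });
      out.value = () => acc;       // expose current accumulated state
      return out;
    },
  };
}

const actions = createStream();
const state = actions.scan(
  (count, action) => (action === 'INCREMENT' ? count + 1 : count),
  0
);

actions.push('INCREMENT');
actions.push('INCREMENT');
// state.value() is now 2
```

The top-level view would subscribe to `state` and re-render on each emission, which is exactly the createStore/dispatch/subscribe triple in Redux terms.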
msoad 1 day ago 1 reply      
I have to deal with Redux at work and I absolutely hate how much code I have to write just to flip a boolean in my React component!

I used MobX on side projects and I absolutely love it! I might be biased but I think MobX is so much better for any size project. Redux is just too good at marketing and their "Hello world" looks very very interesting and reasonable but it doesn't scale. When you have multiple people working on the same codebase it becomes a hot mess!

If you're starting a project, give MobX a shot and see how it goes.

arbesfeld 1 day ago 0 replies      
One advantage of Redux that people tend to miss is the serializable state object which is incredibly helpful for local logging and remote debugging. It's the reason we built LogRocket (though now we have a bunch of other features for general web apps).
antjanus 1 day ago 0 replies      
I wrote a similar article last month on the same topic but it's much more simplified, with CodePens along the way:


It covers only Redux and not React which I think is a little more useful. It DOES cover Enhancers.

Anyways, I've seen this article circulate and I'm glad people are interested in the inner workings of Redux!

emehrkay 1 day ago 3 replies      
Am I missing something new with object literals or is this an error:

 window.state.notes[id] = { id, content: '' };
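Edit: after checking, this looks like ES2015 shorthand property notation rather than an error. A quick sanity check:

```javascript
// ES2015 shorthand: { id } expands to { id: id }.
const id = 42;
const note = { id, content: '' };
// note.id === 42 and note.content === ''
```

So `{ id, content: '' }` just copies the `id` variable into a property of the same name.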

acemarke 1 day ago 3 replies      
This is a great article! As I commented on the post itself when it was published, I keep a big list of links to high-quality tutorials and articles on React, Redux, and related topics, at [0]. That includes a section of "Redux implementation walkthroughs" at [1]. This is probably the best article of that type that I've seen. It not only covers the core of Redux, but also builds miniature versions of Redux middleware and the React-Redux `connect` function. I already added it to my list, and definitely recommend it.

Readers may also be interested in my Redux addons catalog at [2], which includes links to hundreds of Redux middleware, utilities, and other useful libraries. That includes multiple ways to batch dispatching of actions.

[0] https://github.com/markerikson/react-redux-links/blob/master...

[1] https://github.com/markerikson/react-redux-links

[2] https://github.com/markerikson/redux-ecosystem-links

dclowd9901 1 day ago 0 replies      
Wrote a similar piece last year if you like this kind of thing. I love learning stuff by implementing it myself:


neebz 1 day ago 0 replies      
Shameless Plug: I gave a talk last year explaining similar concepts https://github.com/neebz/react-redux-presentation
floatboth 1 day ago 10 replies      
"Redux is a simple library" woah woah stop right there. How is this:

  const handlers = {
    [CREATE_NOTE]: (state, action) => { ... },
    // ... a thousand more of these
  }
simple? This looks horrible. Every time you want to work on code that modifies data, you have to switch to the one file where you keep all the data-modifying functions? Seriously?!

Freezer https://github.com/arqex/freezer is a much, much better experience. You just shove immutable objects down the component tree, and they just come with methods that modify "them" (by actually changing the data tree). It has an event system as well for when you actually want to centralize actions. Works great with Polymer, by the way!

hippich 1 day ago 2 replies      
I don't fully get Redux yet, but from my understanding it's sort of like an app-wide message bus with message handlers. Is my understanding correct?
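Roughly, yes — with one twist worth sketching: a classic message bus fans each message out to many side-effecting subscribers, while Redux routes every message (action) through a single pure handler (the root reducer) that returns the next state. A toy comparison (all names below are made up):

```javascript
// A classic message bus: many subscribers, each free to cause side effects.
const bus = {
  handlers: {},
  on(type, fn) {
    (this.handlers[type] = this.handlers[type] || []).push(fn);
  },
  emit(msg) {
    (this.handlers[msg.type] || []).forEach((fn) => fn(msg));
  },
};

// Redux-style: every message ("action") goes through one pure handler
// (the root reducer), which returns a brand-new state object.
let state = { count: 0 };
function rootReducer(prev, msg) {
  return msg.type === 'INCREMENT' ? { count: prev.count + 1 } : prev;
}
function dispatch(msg) {
  state = rootReducer(state, msg);
}

const busLog = [];
bus.on('INCREMENT', (msg) => busLog.push(msg.type)); // subscriber with a side effect
bus.emit({ type: 'INCREMENT' });
dispatch({ type: 'INCREMENT' });
console.log(busLog.length, state.count); // 1 1
```

The single-reducer design is what makes the whole state serializable and replayable, which several comments here point to as Redux's real payoff.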
antouank 1 day ago 1 reply      
Or do yourself a favour and use Elm.
mal34 1 day ago 1 reply      
Making simple things complex!!
Long-dormant bacteria and viruses in ice are reviving as climate warms bbc.com
283 points by raulk  1 day ago   143 comments top 30
zbjornson 1 day ago 2 replies      
Soil-borne anthrax is very common; there's in fact an "anthrax season" when (usually small) outbreaks happen among wild and farmed animals, from North America to southern Africa, Russia to China and India. (Search for anthrax on https://www.promedmail.org/, there are 37 reports in 2017 so far.) That a thawed carcass was infected is an interesting anecdote as far as the mode of transmission, but it isn't surprising. That is, it's not a disease that we've eradicated that is coming back to haunt us.
secfirstmd 1 day ago 1 reply      
The comments on this article are fascinating, and they're why I love reading Hacker News: point-by-point debates about interesting scientific theory, but in a way that an average person like me can understand. I've read about this kind of conversation happening in the late 19th century in the bars of royal scientific institutions in Europe - it feels a little bit like that. :)
mickrussom 1 day ago 4 replies      
My wife always bugs me about the cold. I tell her operating rooms are cold. Heat = entropy, increased disease vectors. Any thawing of permafrost will start to revive dormant diseases, viruses and flora. We might as well complete the trifecta and start looking for ancient DNA to revive long-gone species for the win. She always tells me cold and drafts = sick, but if you look at where the percentage of currently diseased people is highest - it's never in the north, always in tropical places where diseases, worms, and parasites have a field day. There will be a day when she'll be begging for the cold :)
enknamel 1 day ago 4 replies      
There are quite a few sci-fi novels, and one show I saw, that feature nightmare scenarios based on thawing ice. I seriously doubt we will see something catastrophic, though. It's been a while since I took bio, but bacteria/viruses from thousands if not millions of years ago will most likely not be able to bind to our cells.
arctangent 1 day ago 1 reply      
I'm surprised that Fortitude [1] hasn't been mentioned yet.

It's a fairly good TV show on this topic.

[1] https://en.wikipedia.org/wiki/Fortitude_(TV_series)

fhood 1 day ago 3 replies      
I'll just add this to my list of things that I probably should give some thought to, but won't, because the top of the list includes the refugee crisis, income inequality, and all the less Crichtonesque consequences of climate change.
Houshalter 1 day ago 0 replies      
I'm not saying this isn't a threat. But it doesn't seem as scary as the title or comments are making it out to be. The article admits that most bacteria can't survive this long frozen - only certain types that have adapted to surviving the cold long term by forming spores. It mentions only one such bacterium that harms humans, botulinum, which isn't contagious and is only a problem with improperly canned food. And anthrax, which is deadly but fortunately not very contagious.

Viruses are more of a concern, but the article doesn't make a great case there either. It mentions that scientists found a smallpox victim but were unable to recover a complete smallpox virus - just fragments of its DNA. The scariest thing recovered was Spanish Flu, which fortunately many people have already been vaccinated against: http://www.reuters.com/article/us-flu-vaccine-idUSTRE65E65S2...

btilly 1 day ago 0 replies      
Diseases from early humans are an interesting point of worry. What tends to make a deadly disease deadly is that it is able to infect us, but is poorly adapted to us. A disease that is adapted to a close relative of ours is likely to both infect us and not be well-adapted to modern humans.

Which could be really, really bad.

jondubois 20 hours ago 2 replies      
I'm not a microbiologist, but accounting for evolution, you'd think that a microbe which was locked away in ice for millions of years would be maladapted to modern animals - particularly in terms of transmission between hosts.

I would be more afraid of pathogens that were frozen more recently.

whoisstan 17 hours ago 0 replies      
Read 'The Drowned World' by J.G. Ballard - humans start having ancient dreams.

'Just as psychoanalysis reconstructs the original traumatic situation in order to release the repressed material, so we are now being plunged back into the archaeopsychic past, uncovering the ancient taboos and drives that have been dormant for epochs ... Each one of us is as old as the entire biological kingdom, and our bloodstreams are tributaries of the great sea of its total memory.'

The Drowned World, J.G. Ballard, Millennium 1999, p. 41.

chiefalchemist 1 day ago 1 reply      
A legit fear, but it could be a positive.

When you consider, for example, the Zika virus and its effect on the human brain, perhaps - on the other hand - we have a virus to thank for making homo sapiens more intelligent than our then "competition"?

Bacteria could be a positive as well.

Of course it's a roll of the dice either way. C'est la evolution.

jsz0 18 hours ago 0 replies      
This sounds like a very manageable threat. We already have systems in place to identify and control the spread of diseases, and they're already equipped to deal with new or rare ones. This will be an added burden, but probably no more difficult than dealing with something like Ebola - likely easier, due to the geography and population density involved.
raulk 1 day ago 3 replies      
Revel with me in the thought that we humans think we are the center of the world.

That Earth is made for us and we have the power to shape it in whichever way we wish. That we own the planet.

But, in reality, we don't. We are here only temporarily. There are powerful organisms hiding out there who are perennial.

And they act like guards. If we push it too far, we set off the right conditions for them to spring to life and restore balance on Earth by annihilating the threat, i.e., us.

What a time to be alive!

tomcam 19 hours ago 0 replies      
This has been happening for at least a couple of hundred years. Mammoth bodies have been exposed in Siberia since at least the 18th century and probably much further back than that.
DanBC 1 day ago 4 replies      
> From the bubonic plague to smallpox, we have evolved to resist them

We have antibiotics for plague, and vaccination / eradication for smallpox. That doesn't feel like we evolved any resistance. A couple of thousand cases of plague are reported to the WHO each year.

stuffedBelly 1 day ago 0 replies      
Reminds me of this horror flick I watched a couple of years ago

The Thaw: http://www.imdb.com/title/tt1235448/

franzwong 1 day ago 1 reply      
Whether climate change is due to humans or nature, the fact is that the temperature is getting higher, and that is the problem.
dangayle 1 day ago 1 reply      
Drilling in the frozen north and releasing an ancient monster is the subject of the first episode of the revamped Mystery Science Theater on Netflix.
zoom6628 1 day ago 0 replies      
Nature has given us a whole CDC vault for free. Seems like a golden moment for science, akin to the corpse in the Alps.
yourthrowaway2 1 day ago 0 replies      
We're all gonna die!
koolba 1 day ago 0 replies      
Reminds me of the anthrax outbreak in Russia:


ccvannorman 1 day ago 0 replies      
I bet the CIA is sweating about that Winter Soldier that froze in the 60s.
rdxm 1 day ago 1 reply      
we've already been de-selected. now it's just a matter of time....
rglover 1 day ago 0 replies      
What a time to be alive.
cmdrfred 1 day ago 0 replies      
Conversely modern bacteria and viruses are going to sleep in the Antarctic.


muninn_ 1 day ago 0 replies      
I guess I prefer that they stay there... but can't help but to say that it seems fascinating that there are these dormant antique lifeforms just waiting to be discovered. Hope they don't kill us.
minikites 1 day ago 5 replies      
Over the past 5-10 years I've gone from being mostly optimistic about our collective future to quite pessimistic. It's looking increasingly likely that we're unable to solve problems like climate change that require mass cooperation and that too many people are too selfish and short-sighted to allow for collective action. And in the (hopefully unlikely) event that industrial society collapses (from a pandemic, mass political instability, etc) any surviving humans won't be able to restart it because we've already used all of the "easy" fossil fuels. This is pretty much our only shot at making civilization work.
marcgcombi 1 day ago 0 replies      
Who cares.
graycat 22 hours ago 2 replies      
> "as climate warms"

How much warming, in degrees F, C, or K, since when, measured how, by whom, published where, compared with what other measurements?

Why ask these questions? For one, AFAIK "climate warms" essentially has not been happening to any significant extent for about 20 years and, really, since the coldest of the Little Ice Age -- apparently there was some cooling from 1940 to 1970, so some warming since then. During the Little Ice Age, there was ice skating on the Thames in London.

Reference for temperature over the past 2000 years? Okay:

Committee on Surface Temperature Reconstructions for the Last 2,000 Years, National Research Council, Surface Temperature Reconstructions for the Last 2,000 Years, ISBN 0-309-66264-8, 196 pages, National Academies Press, 2006, available at


In the Medieval Warm Period, did all the ice and permafrost melt and make everyone sick? Well, if it all melted, then what's in the ice and permafrost now is not so old and maybe safe. But I didn't hear that diseases released by the melting ice and permafrost made lots of people sick during the Medieval Warm Period.

My guess: The BBC is pushing made-up, cooked-up, stirred-up, gang-up, pile-on, continually reinforced fake, nonsense scare stories to get continuing eyeballs, ad revenue, and British government subsidies.

Not reading it.

For some simple evidence: The Little Ice Age really was significantly cooler, but there is no evidence that it was preceded closely by lower concentration of CO2 -- the lower temperatures had some cause other than lower CO2.

The Medieval Warm Period really was warmer, but there is no evidence that it was preceded closely by higher concentration of CO2 -- there must have been some cause other than higher CO2.

It appears from ice core samples and more that the temperature of the earth has varied significantly over at least the last 800,000 years. Maybe CO2 has had something to do with warming since the Little Ice Age, and otherwise it looks like the causes of warming/cooling had little or nothing to do with CO2.

It appears that people who talk about warming are blaming CO2, in particular from human activities, and from what I've seen in the temperature records for the past 800,000 years, the only time when CO2 might have caused significant warming was since the Little Ice Age -- even if we accept this, there's the problem of the cause of the cooling from 1940 -- 1970. Otherwise the temperature changes had other causes -- so, my guess is that the temperature change since the Little Ice Age also has some cause(s) other than CO2.

Is CO2 a greenhouse gas, that is, absorbs Planck radiation from the surface of the earth? Yup, absorbs in three bands in the infrared; since we can't see CO2, it does not absorb visible light. So, is there a warming effect from that CO2 absorption? Well, maybe, but water is also a greenhouse gas so that maybe the radiation would be absorbed by water instead of CO2. But even if CO2 is the only way that infrared radiation can get absorbed, it's still not clear how much warming, net, all things considered, it would cause. E.g., lighting a match will also warm the earth.

Is there more CO2 in the atmosphere now? Apparently the concentration in some places is 400 ppm (parts per million) -- IIRC that would be in Hawaii, right, near a volcano, and volcanoes are supposed to be one of the major sources of CO2. Also there's CO2 in the ocean, and warm water absorbs less CO2 than cold water, so maybe recently some of the ocean around Hawaii is warmer and is the source of the Hawaii CO2.

I've seen no good presentations of CO2 levels over time with explanations of the causes.

I've seen no good data on CO2 sources, sinks, or flows.

E.g., first cut, how much CO2 is in the atmosphere now? Then, how much CO2 enters the atmosphere from human activity each year now? If the ocean warms a little, say, from an el Nino, how much CO2 is released into the atmosphere? At what rate do green plants take CO2 from the atmosphere? My guess is that CO2 from human activities is comparatively tiny, that the basic data would show this, and this is why we don't get the basic data.

I see lots of articles on CO2 and warming, but I don't see articles with even this basic, first cut data.

So, to me, the articles don't really have a case since if they did they would make their case. In the articles I see efforts to grab people emotionally but darned little data to convince people rationally.

BBC: "as climate warms" is where you lost me.

Or with this logic, we could write even more shocking articles:

As the next galactic gamma ray burst hits the earth, all the atmosphere will be blown off the earth, and everyone will die. Moreover, since the gamma rays will come at the speed of light, we will never be able to see them coming. Now, get scared. Get afraid. Be very afraid. Watch the BBC for hourly updates 24 x 7 for the rest of your life to keep up on just what will happen as the next gamma ray burst hits the earth. Same for marauding neutron stars, highly magnetic neutron stars, and black holes. Read the BBC tomorrow for the results as the next black hole hits the earth. For more, the expansion of the universe is slowing down, and we may be in a big crunch and all compressed to a point -- see the BBC next week for the details when this happens. Back home, see what will happen when Yellowstone blows again -- last time it put ash 10 feet deep (it's rock and enough to crush nearly any roof) 1000 miles downwind or some such. Remember, those bacteria are down there, fighting every second among themselves, evolving, just to come out and kill everything else, including YOU!!!

bcaulfield 1 day ago 1 reply      
Oh. Goody.
Coinbase adds support for Litecoin techcrunch.com
338 points by tmlee  3 days ago   240 comments top 18
lend000 2 days ago 4 replies      
It will take significant time and effort to overtake Bitcoin's name recognition and first mover advantage. However, there's also an advantage to being a second mover that can adapt quickly to a changing environment... I doubt Bitcoin will be the supreme crypto-currency in a few years. It's just too implausible that Bitcoin is perfect enough as is, and/or the community will be able to implement any needed changes before its "competitors" take market share.

Regardless, it's a good time to be a cryptocurrency investor. We're still in 1996 in Dot Com Bubble time.

pyabo 2 days ago 1 reply      
Litecoin is back now because of SegWit; it's Bitcoin's hope for pushing final acceptance of SegWit on Bitcoin. But it doesn't have real value, in my opinion. About cryptocurrencies in general, I can say that I use them to receive payments and to pay employees, and I find them very useful: the user experience is much better than normal banking (fast international transfers and complete tracking), with no need for KYC.
jerguismi 2 days ago 1 reply      
Segregated Witness should activate on Litecoin in about six days.


Uptrenda 2 days ago 5 replies      
I never really got the appeal of Litecoin. It was basically just a copy-and-paste of Bitcoin with only minor alterations to the source code (like changing the PoW to scrypt, which never really defeated ASICs, and a few other small changes). It made no major innovations worth naming, and even to this day it lacks several of the major bug fixes that made it into Bitcoin (there are still patches missing from ~2014 that make the software even harder for devs to use than Bitcoin).

This is not coming from a Bitcoin maximalist, by the way (I'm not presently a fan of Bitcoin either). Just thought I'd point out how bizarre it is that Litecoin even has a price at all when the software is functionally similar to Dogecoin. Its only real claim to fame is that it copy-pasted the code earlier than most other cryptocurrencies, and hence now survives as a zombie currency backed by the souls of all the bag-holders foolish enough to invest in it (much like how Yahoo continues to survive to this day).

Can we all just agree that Litecoin is a way over-hyped and silly excuse for a cryptocurrency? Silver to Bitcoin's gold? I can't think of a single problem that Litecoin actually solves compared to ... well, anything. At best, you could say it's a speculative instrument tied to the cancerous block size debate. But other than that - is good marketing really worth the price of $21 USD a coin? I can see a lot of people getting screwed when the currency inevitably crashes again ...

NamTaf 2 days ago 3 replies      
It seems to me that this is still speculation based on the idea that increased exposure will increase investment, which will drive the price higher, so punters buy up LTC anticipating that it'll rise, and thus it self-fulfils. I don't really see what fundamentals have changed to spur such a climb.
sxp 2 days ago 0 replies      
I wonder if the sudden increase in price over April is related to Coinbase buying up LTC to fill their reserves: http://coinmarketcap.com/currencies/litecoin/
hultner 2 days ago 1 reply      
Is Litecoin still gaining momentum at a significant pace?

I'm not quite up to current developments in the crypto coin communities, however my perception from the outside is that Ethereum has replaced Litecoin as the leading altcoin. Is this incorrect?

How well does Litecoin fare against the congestion problems we've heard about from the Bitcoin communities?

eelkefolmer 2 days ago 1 reply      
Litecoin spiked from $16 to $36 this morning. Trading on GDAX (also part of Coinbase) has far lower fees (0.25%) than Coinbase (1.5%).
tuxracer 2 days ago 7 replies      
Is there a reason a lot of these cryptocurrencies have suddenly started to skyrocket all at once?
nnfy 3 days ago 5 replies      
Would a vendor like coinbase be subject to any legal repercussions if its employees purchased litecoin before the option to purchase went live, and the price spiked?
cableshaft 2 days ago 1 reply      
About time. I've been waiting for an easy way to get Litecoin for so long. Every time I checked in the past, it seemed to require working with some Russian bank to get things set up, and something would always error out or "not be supported" or some crap and I'd eventually give up, like in the early days of bitcoin (if I had figured out how to successfully buy it I could have bought btc at $7 a coin once, and probably would have over 100 of them now).

Would have loved to have grabbed some LTC when it was dirt cheap, but I'll settle for getting in early on Coinbase. Did that shortly after Coinbase added Ethereum at $11, and it's now trading at $89.

joshuaswaney 2 days ago 0 replies      
Cryptocurrency implementations have to make tradeoffs like any other type of software, and in this case we're looking at the tradeoff between transaction speed and security. Other cryptocurrencies favor anonymity over convenience. Is there a CAP theorem equivalent for these problems? There seems to be a healthy tension between scalability and security at the very least.
danielleheong 2 days ago 1 reply      
So little activity until 1st of April. What gives... https://www.coingecko.com/en/price_charts/litecoin/btc/90_da...
kzisme 2 days ago 7 replies      
I don't really get the draw of cryptocurrencies. Aside from mining them to bring more into the pool of available currency, is there a point to purchasing them or using them to make purchases?

Is there a reason I should start using these currencies? Aside from trading currencies to make a few bucks?

edpichler 2 days ago 2 replies      
Why should I use Litecoin to transfer money when there is Bitcoin, with more liquidity? I really don't get this yet. Can anyone explain?

PS: I am a Bitcoin enthusiast, and I am not criticizing Litecoin, I just want to understand the possible advantages it could have.

ptenk 2 days ago 2 replies      
This rise is driven by Coinbase alone. The premium there was like 30-40% to other exchanges at one point.
quotha 2 days ago 3 replies      
This is all gonna end bad
dvdhsu 2 days ago 20 replies      
Is there interest in a cryptocurrency index fund? The idea is you could just buy an index fund composed of, say, 50% BTC, 20% ETH, 15% LTC and 15% ZEC. I'm fairly certain that one of these cryptocurrencies will 10 or 100x in value over the next 5 years, but buying each one individually is just such a pain. Would you invest?

In reality, we'd probably buy the top 10 coins weighted according to some measure, and rebalance once a week. We would send out investor updates and let you know what the weights are, along with performance over the past week.

We're a YC company whose previous idea didn't work out, and this is something we're considering pivoting to. If we get enough interest we'll start one!
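To make the weighting mechanics concrete, here's a toy cap-weighted allocation. Every number below is invented; a real fund would pull live market caps and handle the weekly rebalancing described above:

```javascript
// Cap-weighted allocation across a fixed basket (all figures invented).
const basket = [
  { symbol: 'BTC', marketCap: 25000 },
  { symbol: 'ETH', marketCap: 8000 },
  { symbol: 'LTC', marketCap: 1500 },
  { symbol: 'ZEC', marketCap: 500 },
];

// Each coin's weight is its share of the basket's total market cap.
const total = basket.reduce((sum, c) => sum + c.marketCap, 0);
const weights = basket.map((c) => ({
  symbol: c.symbol,
  weight: c.marketCap / total,
}));

// Dollars to put into each coin for a $1,000 buy-in:
const allocation = weights.map((w) => ({
  symbol: w.symbol,
  usd: 1000 * w.weight,
}));
console.log(allocation);
```

Rebalancing is then just recomputing `weights` from fresh market caps and trading back to the target allocation.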

Ngrok: Secure tunnels to localhost ngrok.com
406 points by pvsukale3  23 hours ago   182 comments top 35
Lazare 18 hours ago 4 replies      
A lot of people seem to be a bit confused about the point of ngrok, why it's useful, how much it costs, etc. Let me try and help out. :)

For me, the killer feature of ngrok is testing/developing webhooks. You install ngrok in your dev environment, start it up, then point the Stripe/Slack/whatever webhook you're working on at the generated URL.

ngrok will 1) proxy that request through to your dev environment, 2) log the request, 3) log the response, and 4) let you replay previous requests. It could not be more helpful for developing webhook handlers, and has literally saved me hours of work in the last couple of months alone.

Finally, the free tier is all you need for that; it gives you a unique ngrok subdomain which changes every time you start the tunnel and some (generous) usage caps, both of which are fine for this usage.

People pointing out the potential security issues are correct, but that's an argument to be careful and think about what you're doing. Besides, what's your proposed alternative? Because most of the obvious ones have equally troubling issues.
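For anyone who hasn't tried it, the webhook workflow is a couple of commands. This is a sketch of ngrok 2.x usage - the forwarding subdomain is random and will differ for you:

```shell
# Expose the local dev server listening on port 3000
ngrok http 3000

# ngrok prints a forwarding line like:
#   Forwarding  https://abc123.ngrok.io -> localhost:3000
# Point the Stripe/Slack webhook at that https URL.

# Captured requests and responses can be inspected and replayed
# in the local web UI at http://127.0.0.1:4040
```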

tmp98112 15 hours ago 0 replies      
It makes me sad to see all the negativity towards this service, which clearly works and serves a need some people have.

Yes, there are alternatives, but I hate when people jump to dismiss a service like this without fully considering what issues the proposed alternatives have. Obviously it is OK to mention the alternative options, but that can be done in a constructive way.

Let's celebrate the fact that somebody has built and released something and even seems to have a business model to support it. Instead of complaining about 5-20 bucks per month, try to figure out how you could channel some of your corporate multimillion IT budget to this fellow hacker. Wouldn't it be great if building and running small solutions like this were actually a viable way of making a living?

packetized 20 hours ago 2 replies      
Literally the most terrifying service for any security-minded operations-focused person. Wonderful tool, interesting and useful in a dizzying array of aspects - but dear lord, I've had some real horrific moments when users told me that they installed it to allow access to their (private) repos for testing.
inconshreveable 22 hours ago 14 replies      
Hiya there folks - I'm the creator of ngrok, happy to answer any questions
yeldarb 22 hours ago 3 replies      
Happy paying ngrok user here.

Love it for developing anything using webhooks and also hybrid mobile apps (I have my app pull the JS from the Dev box I'm working on via ngrok without having to rebuild the app or deploy the code anywhere).

It significantly speeds up my workflow!

jeremejevs 20 hours ago 5 replies      
Nice tool, but without committing to annual billing (which I don't intend to do, not for the first year of usage) it's $10 a month. My internet connection, my mobile plan, my Photoshop & Lightroom subscription, a huge collection of music (Spotify), 3K~5K movies and TV shows (Netflix), etc., all cost approximately the same. I mean, sure, $120 a year is pocket change for somebody using ngrok professionally, but that's still super disproportionate compared to, say, a monster of a piece of software like Photoshop. I'd probably subscribe for $2, but otherwise, IMO, frp [0] on a $3 VPS [1] is better value, with the extra benefit of being FOSS and having zero limits.

[0] https://github.com/fatedier/frp

[1] https://www.scaleway.com/pricing

kyboren 19 hours ago 1 reply      
PSA: if you want to provide remote access to a local service, but don't want the potentially-terrifying security implications, use Tor Authenticated Onion Services (AKA "HiddenServiceAuthorizeClient" [0]).

On a machine on the LAN, install Tor and set up an authenticated onion service, and point it to the desired endpoint. In order to access the service, clients need a manually-loaded encryption key (and Tor, of course). Without this key, nobody will be able even to discover your endpoint, let alone actually connect to it.

[0]: https://gitweb.torproject.org/torspec.git/tree/rend-spec.txt
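For reference, the server-side torrc for that setup is only a few lines. This is a sketch using the v2-era options from the linked spec; the directory path, forwarded port, and client name are made-up examples:

```
# torrc on the machine next to the service
HiddenServiceDir /var/lib/tor/my_service/
HiddenServicePort 22 127.0.0.1:22
HiddenServiceAuthorizeClient stealth alice

# Tor writes the .onion address and alice's auth cookie into
# /var/lib/tor/my_service/hostname; the client then adds:
#   HidServAuth <onion-address> <auth-cookie>
# to its own torrc before connecting.
```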

YPCrumble 13 hours ago 2 replies      
A fantastic open-source javascript alternative is localtunnel (https://github.com/localtunnel/localtunnel). I've used this more often than ngrok after ngrok became a paid service.
bespoke_engnr 10 hours ago 0 replies      
I see some people arguing that "you should use a dev/staging environment with a public IP" instead of having ngrok tunneling traffic directly to your local dev box.

When you're editing HTML/CSS, you don't have to run a deploy script before checking how your markup renders. Ngrok gives people writing web services the same convenience when dealing with requests from a 3rd party on the Net.

It is the equivalent of saving your HTML/CSS source files and instantly seeing the changes when you reload your browser.

I just wrote a little proof-of-concept Alexa app that crawls HumbleBundle ('Bundled Goods', very much beta quality at the moment) and ngrok was invaluable for developing it quickly.

yasn77 18 hours ago 2 replies      
I prefer the implementation of http://localhost.run

To me it seems a lot cleaner: you simply use SSH rather than downloading an app.

saintfiends 16 hours ago 0 replies      
Their 1.x is open source: https://github.com/inconshreveable/ngrok

This is a similar open source alternative: https://github.com/fatedier/frp

Both written in Go.

DAddYE 9 hours ago 0 replies      
A lot of negativity but I found this tool super useful when developing for Alexa and test out my scripts. Keep it going guys!
49531 13 hours ago 0 replies      
I've used ngrok for a while now, and I love it. I used it just last night to test out some webrtc stuff I was doing. Was able to get friends from around the world on video chat served from localhost within seconds.

It's also super handy when building webhooks, you can use the unique URL to test out apis without having to deploy anything. I can't rave about it enough.

yannis 12 hours ago 0 replies      
As a part-time-programmer mechanical engineer, I find it so gratifying to use ngrok. I first came across it a couple of years back. Great project, great development, open source, and well written in Go. https://github.com/inconshreveable/ngrok/tree/master/src/ngr...

In the evening, it takes two seconds to deploy an application from my kitchen table, check it on my mobile, and the next day access it from work and show it to co-workers.

Call it "usability" for Engineers!

andkon 4 hours ago 0 replies      
I love this product page so much! Great illustrations, a wonderful balance of levity and clarity about ngrok's purpose.
xg15 15 hours ago 0 replies      
The title of this submission makes it sound like a selling-fridges-to-eskimos scam product - you need to read quite a bit to find out what's it actually about and that it solves (or simplifies) an actual use-case.

I think a better comparison is with DynDNS services: it sets up a public host connected to your own machine - but unlike DynDNS, the host doesn't point to your machine's IP directly. Instead, requests are routed through a proxy/tunnel, so your machine can be kept behind a firewall and is only available through the public host.

(I figure, the proxy allows for some more neat tricks, such as restricting ports/urls/etc or holding requests open while your machine changes IPs.)

pfista 10 hours ago 0 replies      
Ngrok is the coolest tool I use on a pretty consistent basis. Developing webhooks locally is usually what I use it for, and the web interface replay capability is amazing. The creator gave a great talk on why he built it and how it progressed over the years: https://www.youtube.com/watch?v=F_xNOVY96Ng
nbrempel 12 hours ago 1 reply      
Ngrok is one of the most valuable development tools in my toolbox.
joantune 8 hours ago 0 replies      
I have been using this for quite a while and it's really useful. Even though I have VPSs available, where I could make a SSH tunnel, this is simply way more convenient, so I end up using ngrok a lot for development
ausjke 14 hours ago 1 reply      
Never used it. What's the difference between ngrok and using a DMZ with port forwarding? Are they the same thing? What's the technical advantage, other than being easy to use? I can set up port forwarding easily on my router to expose whatever port to the public, so why do I need ngrok?

With DDNS + port forwarding, you can easily have what ngrok provides - or am I missing something?

samcheng 22 hours ago 1 reply      
It's possible to roll your own ngrok clone via SSH tunnels, a publicly-available server somewhere, and autossh. This is basically ssh-tunnel-as-a-service.
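The core of that clone is a single remote port forward. A sketch - the hostname and ports are placeholders, and the VPS's sshd needs `GatewayPorts` enabled to listen on its public interface:

```shell
# Keep a tunnel open that forwards port 8080 on the VPS
# to port 3000 on this machine; autossh restarts it if it drops.
autossh -M 0 -N \
  -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
  -R 8080:localhost:3000 user@vps.example.com
```

What you give up versus ngrok is the request inspection/replay UI; what you gain is full control and no third party in the path.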
adamson 12 hours ago 0 replies      
I've been using this for web server testing since 2012. It's probably the paid tool that's given me the most bang for my buck in terms of hours saved
leesalminen 13 hours ago 0 replies      
I've been on the free tier for a while and have been meaning to upgrade to show support for such a great service. Seeing this post this morning reminded me. Upgraded for the year!
stevemk14ebr 22 hours ago 2 replies      
3 important questions:

1) My university blocks LogMeIn Hamachi, which is the main tool I've tried for hosting behind a NAT. Will this likely be blocked too, or is it not possible to tell without trying?

2) Are there any costs associated with this? Do I ever need to pay?

3) Does the person connecting to my server also require a special client, or does this appear to them as any standard connection would?

vhost- 21 hours ago 1 reply      
I really like ngrok. I use it a lot. I just really dislike how the TCP tunnels work. With HTTP you get a unique subdomain which makes it harder for people to just scan and connect. With TCP, it's always 0.tcp.ngrok.io, so you can just scan that domain and connect to anything that's open.
sandGorgon 17 hours ago 0 replies      
I'm a happy paid user. If I had one request, it would be for different pricing and better management for groups of users.

People like me would like to buy 5-10 licenses and manage them centrally:

Define shared endpoints and individual endpoints, etc.

roylez 22 hours ago 1 reply      
I used to use Ngrok, then I discovered ultrahook which gives me a persistent endpoint, for free.
stefanhuber 19 hours ago 0 replies      
Use a hidden service! I manage many intranet servers over Tor. You have no problems with NAT or firewalls, and it's free!

OK, it's slower, but for many things like SSH it's great...

jonthepirate 18 hours ago 0 replies      
docker run -ti -e 'PROXY_TO_ADDR=http://www.example.com/' jonthepirate/ngrok:latest

^ That command will expose a local Docker-powered web server, assuming www.example.com is the local DNS name on your Docker network. Enjoy.

prodicus 18 hours ago 0 replies      
Awesome! OT, but can anyone suggest some free tools I can use to make similar diagrams? They look pretty cool :)
PacketPaul 20 hours ago 2 replies      
How much is the paid service? You could roll your own for around $15/month using an Amazon EC2 T2 micro instance.
nejdetckenobi 19 hours ago 0 replies      
Long-time ngrok user here. I've used it several times to show my prototypes to others. It's simpler than deploying on Heroku (or anywhere, actually), and of course there are no restrictions, because you use your own hardware.
stummjr 16 hours ago 0 replies      
Ngrok is just awesome! A huge shout out to the developers!
zAy0LfpBZLC8mAC 15 hours ago 1 reply      
Yet another needless cost of not switching to IPv6 already ...
andreiw 19 hours ago 0 replies      
Cool. Where's the ARM64 Linux build? For ARM servers, that's more important than the legacy 32-bit ARM builds.
FBI director Comey backs new Feinstein push for decrypt bill techcrunch.com
291 points by pearlsteinj  3 days ago   208 comments top 30
grandalf 3 days ago 13 replies      
From his perspective as the head of the FBI, whose job it is to achieve outcomes within the law, of course Comey advocates encryption backdoors. He would likely also advocate allowing the FBI to suspend the Bill of Rights for any suspect for the duration of an investigation, and he'd quite likely prefer that the FBI be legally allowed to torture suspects if extreme techniques were viewed as likely to yield useful information. To law enforcement, the rights of a suspect are a barrier to many convictions.

How did we get to this point? Nobody would reasonably argue that extreme surveillance measures, the Patriot Act, etc., are necessary to stop the vast majority of crimes from occurring, so why is it so easy for seemingly serious/intelligent people to think this nonsense is reasonable?

Members of our government are so indoctrinated about stopping "terrorism" that they have lost all sense of perspective. Terrorism is a political word to describe political enemies of the state, yet the Patriot Act and the surveillance machinery have been used to enforce many other kinds of (less serious) crime.

I am surprised anyone can still use the word "terrorism" with a straight face after it has become so clear that there is no large existential threat (merely the occasional zealot who acts out due to his/her own mental health issues). And in spite of a historically unprecedented global surveillance system, no attacks have been thwarted.

Comey is a symptom of the kind of cowardly, authority-respecting society we've become. I look forward to the day when our FBI director is not someone whose gaffes and judgment calls we read about in the newspaper on a regular basis.

dhfhduk 3 days ago 4 replies      
I'm confused about this. I'm hurried at the moment, but this seems to be a bill that orders tech companies to provide a solution to encryption without having a backdoor?

Isn't this like legislating a violation of mathematics or something?

FullMtlAlcoholc 3 days ago 2 replies      
So, the NSA and the CIA were recently hacked, yet these numbskulls think we can create a system that will only be accessed by "the good guys". How many hacks, leaks, etc. will it take for them to understand that if this passes, that will be the end of online security?

New Rule: If you want to propose cybersecurity legislation, you need to pass the fizz buzz test.

peterwwillis 3 days ago 0 replies      
> "What nobody wants to have happen is something terrible happen in the United States and it be connected to our inability to access information with lawful authority."

But they're not asking for that. They're asking for the ability to force companies to grant them access to information without something terrible happening.

The only way you could prevent something terrible happening, and have that prevention be "connected to [their] ability to access information with lawful authority", is to have the ability to inspect private data. And the only reasonable way they would do that is to do it surreptitiously.

They could try just asking the user to unlock their iPhone, or demand it with a court order (where I assume they can plead the 5th), but either would tip the suspect off. So they have to do it without the user's knowledge. And the only way to do that is if the company has a backdoor, or makes it so incredibly insecure as to no longer guarantee privacy at all.

The only logical way to give the FBI what it wants is to compromise user privacy.

> During the session, Comey also made repeat plays for expanding the scope of national security letters (NSL) arguing that these administrative subpoenas were always intended to be able to acquire information from internet companies, not just from telcos.

The FBI claims that they would always get permission from a judge for invading user privacy. In the next breath, they want to expand NSLs, which is invading user privacy without requiring a judge's approval.

Both Lavabit and Silent Circle had to close down their businesses after the government (in a gag-ordered search warrant) unreasonably demanded that Lavabit give up its private TLS keys, exposing all its users' privacy. But no law enforcement agency gives a shit about privacy; only secrecy.

mgleason_3 3 days ago 2 replies      
Unbelievable. Just happened to see a clip today (https://goo.gl/F9XeQU) where Feinstein was "grilling" Comey about announcing the investigation into Clinton right before the election.

When Feinstein totally let him off the hook I was floored?!? He interfered worse than the Russians - how does he still have a job?

Ahh, she wants his support for the decrypt bill. I'll never understand why the Democrats have zero interest in protecting personal privacy.

feld 3 days ago 1 reply      
> I don't think Congress intended that distinction, but what it does do to us is, in our most important investigations, it requires us, if we want to find out the subscriber info for a particular email, to go and get an order from a federal judge in Washington as part of the FISA court. An incredibly long and difficult process. And I'm worried about that slowing us down, and I'm also worried about it being a disincentive for our investigators to do it at all.

Hurdles to protect privacy are important. If it's not an arduous process we have a problem.

DarkKomunalec 2 days ago 0 replies      
Would it be okay to mandate spy microphones in all cars, spy cameras in all rooms, and make it illegal to remove or disable them, as long as only the 'good guys', with a warrant, could access the info?

What if doing this would save N people/year from terrorist attacks?

What other rights should we sacrifice for a 'safer' society? Surely we shouldn't let terrorists recruit people, so there goes free speech. We also shouldn't let them gather together to plot their wicked plots, so there goes freedom of association. And if we could bar people at risk of committing terrorist acts from vulnerable locations, such as subways, airports, and parks with a lot of people in them, well, I'm sure that would save a few lives too.

utternerd 3 days ago 1 reply      
> saying such legislation would be better from a public safety perspective

According to whom, we the people or a bunch of authoritarians who'd like to be able to access every nook and cranny of our personal lives?

adrr 3 days ago 2 replies      
Putting in backdoors is a sure-fire way to kill US-based mobile phone producers. Criminals will just use foreign-made phones, and the only way to counteract that is to outlaw those phones. Can't wait till they criminalize having certain firmware on your phone.
pgodzin 3 days ago 1 reply      
> We all love privacy, we all care about public safety, and none of us want backdoors; we don't want access to devices built in in some way. What we want to work with the manufacturers on is to figure out how we can accommodate both interests in a sensible way

How is this possibly reconcilable?

thegayngler 2 days ago 0 replies      
I don't know why California Democrats elected Dianne in the first place. Were there not any real liberals in California to choose from, preferably with some expertise in California's most valuable export?
ardit33 3 days ago 7 replies      
Dianne Feinstein is old and needs to retire. She is completely out of touch with the needs of her constituency, and comes off more like an old-guard Republican than the Democrat she is supposed to be.
rdxm 3 days ago 2 replies      
Geeeez, how long is Cali going to foist Feinstein on the rest of the country? The level of idiocy is just beyond painful...

Edit to add: of course the same could be said about the remaining 49 states and their reps/sens as well...

rietta 3 days ago 0 replies      
I was watching the hearing during lunch, had to attend to work meetings, and then saw this article which is what spurred me to post my open letter to Congress tonight and share it here on HN at https://news.ycombinator.com/item?id=14261423. We have to get this information out there in a format that Congress and our non-techie friends and family understand.
RichardHeart 3 days ago 0 replies      
Law enforcement is tasked with putting people in jail, not so much preventing future abuses of bad laws by governments. This is why checks and balances must be maintained, for when all you have is a hammer everything looks like a nail.
bdamm 3 days ago 1 reply      
"The high profile court battle ultimately ended after the FBI paid a third party company to gain access to the device via an exploit in the security system."

Why isn't this an acceptable solution?

AJ007 3 days ago 1 reply      
Can someone call out these alleged encryption back doors for what they are? Junk science.

If Apple and Google aren't legally able to build devices & infrastructure as secure as possible, the DOJ, FBI, NSA, and CIA sure as hell won't be secure. Merry Christmas to Assange.

benevol 3 days ago 0 replies      
If you want to lose all of your tech monopolies, then go ahead with your backdoors (the ones whose existence will be publicly known, that is).
microcolonel 3 days ago 0 replies      
> We have to figure out a way to optimize those two things: privacy and public safety.

Given how safe the public is, you'd think this would mean "we need to focus on privacy". That is the public's priority. The FBI, whose mandate is obviously not to protect the privacy of citizens, is obviously going to advocate for public safety, or more specifically for his organization's degree of visible success in ensuring it.

Obviously the director of the FBI is not who you should be asking for a balanced recommendation regarding safety and privacy.

JustSomeNobody 3 days ago 0 replies      
What are the tech companies he has been having a "growing consensus" with? I want to boycott them.
jacquesm 3 days ago 1 reply      
Nice bill. Maybe they should finally get around to declaring Pi to be 3 too, two birds with one bill.
Mendenhall 3 days ago 1 reply      
Is there any good information on what has been accomplished through such access, etc.?

What have they stopped using such methods? I think if they want to get anything like this moving forward, they need to show results. Not too many people trust the government these days.

I do not like the idea of "backdoors", but I can see a realistic need for such things. I think many are against such things "until" some massive WMD-type attack; then the tune will change.

scardine 3 days ago 0 replies      
There is another big problem with mandatory decryption laws.

If someone wants to incriminate you, they don't need to plant a file with child porn anymore: they just need to plant a file composed of random bytes and accuse you of having encrypted child porn there.

Now good luck providing the court an encryption key that does not exist.
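The point above has a simple statistical basis: well-encrypted data and genuinely random bytes both sit near the 8-bits-per-byte entropy maximum, while ordinary files sit far below it, so no byte-level test can distinguish "random noise" from "ciphertext you refuse to decrypt". A rough sketch (the sample text and sizes are arbitrary):

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte (max 8.0)."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_blob = os.urandom(100_000)  # stands in for "encrypted" data
english = b"the quick brown fox jumps over the lazy dog " * 2000

h_random = byte_entropy(random_blob)  # very close to the 8.0 maximum
h_text = byte_entropy(english)        # well below 5: clearly structured

print(f"random: {h_random:.3f} bits/byte, text: {h_text:.3f} bits/byte")
```

The random blob is statistically featureless, exactly like the output of a good cipher, which is why "prove this file isn't encrypted contraband" is an impossible demand.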

cprayingmantis 2 days ago 0 replies      
If you're wondering how it got to this point I'd like to remind you that you (If you live in the US) don't own this country. The people in charge don't care about you. They care about money, power, and stability of their system. It's hopeless to resist because they own your home, your bank account, and all your money. The only way we'll ever change it is getting scientists, nerds, and engineers into congress. I don't know how we'll do it but we have to do it to ensure freedom for everyone in the USA.
unityByFreedom 3 days ago 0 replies      
Ridiculous. When will these numbskulls understand that you can't regulate people's use of encodings? It's right there in human language. You can't force everyone to use the same one.
jjawssd 3 days ago 2 replies      
Why do California Democrats vote this person in year after year?
cmdrfred 2 days ago 0 replies      
Why is someone who is 83 years old and likely has to call her grandson for help paying a bill online writing law about encryption?
phkahler 3 days ago 2 replies      
I still don't understand. They want to be able to have a court order a device maker to decrypt data, but today they can already get a court to order the device owner to decrypt it. The device owner actually has the password or key. The truth is that they want to do this without the device owner knowing it's being done.
bsder 3 days ago 0 replies      
Right after the Intel security disclosures.


Esau 3 days ago 0 replies      
Color me surprised.
Ask HN: Google Doc email virus?
479 points by eof  3 days ago   207 comments top 58
ademarre 3 days ago 4 replies      
I reported this attack vector to Google back in 2012. They awarded a modest bounty, and then a few months later I heard this:

> "We're deploying some abuse detection and reactive measures to deal with impostors that might try to abuse this sort of attack. Given this, we do not intend to perform validation that the URL matches the branding information."

That last part was in reference to one of my proposed mitigations, which they chose not to implement.

Here's the discussion on the IETF OAuth WG mailing list from that same time period: https://www.ietf.org/mail-archive/web/oauth/current/msg07625...

mailinatorguy 3 days ago 5 replies      
Mailinator here:

Yes, we sent the inbox to a blackhole but keep in mind, Mailinator does not and can not actually "Send" any email.

It's a receive-only service. As always, any email "from" @mailinator.com has had its reply-to forged (which is pretty trivial).

Also - even before we blackholed the email, it's unlikely any email in that inbox (i.e. hhhh..) was read. Each box has a 50 email limit (FIFO) which was immediately overwhelmed. You couldn't click fast enough between seeing the inbox list and clicking an email.

Mailinator is simply a "receiver" in all of this but we have no indication our servers were otherwise involved.

jakob223 3 days ago 4 replies      
EDIT: According to a Google representative on the reddit thread, this application is now blocked. If your account was affected, you no longer need to do anything.

If you fell for this, changing your password is not the right solution - you want to log into your google account and remove permissions from the application.

https://myaccount.google.com/permissions?pli=1 should show a list of apps connected to your account.

Also, if you fell for this, you sent a bunch of emails to people like the one you received, so maybe tell them not to click.

hemancuso 3 days ago 4 replies      
It's a pretty nasty one, since it uses their standard OAuth flow with an app "Google Docs" to have users grant full access to their email and contacts.

1. I can't believe Google doesn't have basic filters to disallow developers from registering an app named "Google Docs"

2. Perhaps there should be some more validation/limits associated with allowing apps on the platform that can gain full access to email. A secure email account is the One True Source of authentication in the digital world. Google should make it way harder for people to get tricked into granting full access to their inbox.

btym 3 days ago 1 reply      
I love how simple this worm is. They haven't exploited any security holes (other that looking like Docs), it literally just asks for full access to your email address.
aub3bhat 3 days ago 0 replies      
It's a malicious OAuth client (multiple clients?) that calls itself "Google Docs" and fooled users into giving access to read emails, while pretending the access was needed by GDocs itself to open a document; this enabled, among other things, launching password resets on other websites.

The root problem seems to be that the identity of OAuth clients is not authenticated/clearly shown, i.e. a malicious app can claim that its name is Google Docs even though it is not endorsed by Google.

IMPORTANT NOTE: If you are running any website that has a "Reset my password" flow, it might be abused by the attacker: even though the attacker does not have the victim's password, they had access to the email inbox, so the email password-reset flow will allow the attacker to compromise other websites that rely on the Gmail account for password resets.



philip1209 3 days ago 0 replies      
Wow, Hired.com appears to have emailed all of their users about this. Must be spreading quickly. Note that they advise compromised users to change their password, which other comments indicate does not solve the issue.

Below is the Hired notification.


Important: Email Phishing Alert

Hi <first name>,

It has come to our attention that some of our users may have been hit with a Google Docs phishing scam. It appears that this scam has been spreading throughout the internet today, and is not isolated to Hired or our customers and candidates. If you want more information, you can read about it here[1] or here[2].

If you receive a Hired email that says that someone from Hired has shared a Google Doc with you, please validate with the sender before clicking the link or doing anything else.

If you think your account may have been compromised, be sure to change your password immediately.

We apologize for this interruption to your day. Please let us know if you have any questions.

Thanks, The Hired team

[1] https://www.theverge.com/2017/5/3/15534768/google-docs-phish...

[2] https://gizmodo.com/a-huge-and-dangerously-convincing-google...

yurisagalov 3 days ago 1 reply      
Looks like this is fairly widespread.

This is what the attack actually looks like: https://twitter.com/zachlatta/status/859843151757955072

sudom82 3 days ago 4 replies      
Source code of the worm: https://pastebin.com/raw/EKdKamFq

Edit: How I got this:

Someone on reddit went to their site when it wasn't down, and downloaded the files linked in the page's HTML. I just posted it here.

This isn't the full source code. There was another PHP file visible on their website that unfortunately isn't visible anymore.

coleca 3 days ago 1 reply      
Considering how easy it would be to filter this out, why has Google allowed it to continue spreading within their own email network? Obviously they have no control over what goes on outside of Gmail/G Suite, but inside their own network they should be able to set up a basic filter to stop anything TO: hhhhhhh@mailinator or whatever it is. I received this email (but did not click the link) in my Gmail account from another Gmail user, so it never left the Google network. From the reports here it looks like it is still spreading even though Google disabled the app.

With all of Google's machine learning expertise, how is it that this got past all of their SPAM detectors? It took me 2 seconds to hover over the link and see it was a crazy link that ended up at a domain called google.pro. Really? One of the world's largest and most advanced email systems couldn't figure that out?
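The kind of filter the comment above has in mind is not hard to sketch. Assuming, as reported elsewhere in this thread, that the worm addressed its mail To: a long-h mailinator.com inbox (with the real victims in BCC), even a single-indicator recipient check would have flagged this wave. A toy version; the sample messages and the exact address pattern are illustrative:

```python
import re
from email import message_from_string
from email.utils import getaddresses

# Indicator from this campaign: a long run of "h"s at mailinator.com.
SUSPICIOUS = re.compile(r"^h{5,}@mailinator\.com$")

def is_phishing_candidate(raw: str) -> bool:
    """Flag a raw RFC 822 message whose To/Cc matches the known indicator."""
    msg = message_from_string(raw)
    headers = (msg.get_all("To") or []) + (msg.get_all("Cc") or [])
    return any(SUSPICIOUS.match(addr.lower())
               for _, addr in getaddresses(headers))

worm = ("From: friend@example.com\n"
        "To: hhhhhhhhhhhhhhhh@mailinator.com\n"
        "Subject: shared a document with you\n\n"
        "Open in Docs")
legit = "From: a@example.com\nTo: b@example.com\nSubject: hi\n\nhello"

print(is_phishing_candidate(worm), is_phishing_candidate(legit))  # True False
```

A real provider would of course combine many signals (sender reputation, link targets, OAuth app identity); this single rule is only meant to show that a trivial recipient check would have matched this particular wave inside Gmail.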

alexlongterm 3 days ago 0 replies      
We wrote a guide for google suite admins on how to lock down their domain. Oauth and phishing are major threats and google could do much more here https://medium.com/@longtermsec/more-tips-for-securing-your-...
jmcdiesel 3 days ago 0 replies      
I work for a Fortune 500 (won't disclose), but we just shut off email for our entire organization due to this...
rst 3 days ago 0 replies      
slrz 3 days ago 2 replies      
Hi, I'm Google Docs. Would you please grant me access to your Google account so that I can read, send, delete and manage your mail, as well as manage your contacts?
aaronmiler 3 days ago 1 reply      
Our support team is getting spammed a lot from our customers. We're in the education space, and it's spreading pretty quick.

On initial inspection the URL looks harmless, but it's got some malicious params in there, mainly

It appears to request read/send access to your email, and then spam all your contacts

gigabo 3 days ago 0 replies      
Reported as a service disruption on the status dashboard:

> We're investigating reports of an issue with Google Drive. We will provide more information shortly.


wjke2i9 3 days ago 0 replies      
Things like this are bound to happen when you have centralized systems controlling everything with full control of the information (no zero-knowledge storage like email/document/communication encryption). You're essentially trusting one third party provider with everything in your life/business/organization.
seanp2k2 3 days ago 0 replies      
sergiotapia 3 days ago 0 replies      
Just received one as well. Source is Hired.com - according to them:



Hi Sergio,

It has come to our attention that some of our users may have been hit with a Google Docs phishing scam. It appears that this scam has been spreading throughout the internet today, and is not isolated to Hired or our customers and candidates. If you want more information, you can read about it here or here.

If you receive a Hired email that says that someone from Hired has shared a Google Doc with you, please validate with the sender before clicking the link or doing anything else.

If you think your account may have been compromised, be sure to change your password immediately.

We apologize for this interruption to your day. Please let us know if you have any questions.

Thanks, The Hired team

M1233mjm 3 days ago 0 replies      
When can we expect a public statement regarding the phishing scam and the fallout? We all know it used our accounts to forward itself to everyone in our contact lists, but what about our emails? Have those also been forwarded/harvested? We need to know this to know how to react.
mrpound 3 days ago 1 reply      
Same here. Several emails so far from different seemingly random companies and individuals with clearly malicious Google Docs requests w/ a suspicious param in the oauth request in the link:


wmblaettler 2 days ago 0 replies      
To see the list of apps connected to your Google Account: https://myaccount.google.com/permissions
yeboi 3 days ago 1 reply      
Here's an interesting case that I encountered (~1:20pm maybe):

1) I clicked on the link on my phone's email app. It looked super believable since it was coming from a person I was expecting a Google Doc invite from. I allowed access to "Google Docs" and then the page hit a 502 gateway error.

2) I tried it again on my computer by logging in, and this time, while the page was loading (after I allowed access), I saw the website was not legitimate (based on the URL), so I immediately closed the tab.

Here's the interesting part: None of my contacts got a "Google Docs" invite from me - meaning I didn't "send" any mail. Any idea how I can see if the person behind this has my emails too via API requests?

packetized 3 days ago 0 replies      
Eagerly awaiting the response from Cloudflare detailing their response, since all of the domains associated with this so far appear to have been hosted with them, or at least fronted by their service.
choxi 3 days ago 0 replies      
I got one from "DocuSign": https://twitter.com/choxi/status/844949531896655872

The link went to a page that looked like Google Docs and asked for my Google login, but I noticed the domain was wrong so I didn't sign in. I tried the link again today and it looks like Chrome does flag it as a phishing site now.

os400 1 day ago 0 replies      
G Suite customers have been asking for the ability to whitelist OAuth clients/scopes for their domains for years, for this exact reason. So far, Google hasn't really given a shit.

I guess that might finally change now.

AdmiralAsshat 3 days ago 0 replies      
My brother called me about 15 minutes ago to tell me this hit his student e-mail as well.

I'd be curious at the postmortem how quickly this thing spread.

codedokode 2 days ago 0 replies      
As we saw, one should not let users decide who can get access to their email account. Users are easily fooled. Google should manually review all applications that want such access.

Though this is unrelated to the topic, I think it would be good if Google reviewed app permissions in Google Play too, because users are bad at this.

EdwardMSmith 3 days ago 1 reply      
Feels like "I love you" all over again.
discreditable 3 days ago 0 replies      
G Suite admins: you can check for compromise by going to Reports > Token in the admin panel. A compromise looks like this: https://i.imgur.com/Dm0NNTn.png
_pergosausage 3 days ago 1 reply      
The very same thing happened at my university. The sender is hhhhhhhhhhhhhhhh@mailinator.com
TimButterfield 3 days ago 0 replies      
There is also a Docusign phish email going around. I received a couple of them yesterday from mail2world, though signed by [company name].onmicrosoft.com for that user's business email address. They purported to be from people I knew.
ethn 3 days ago 0 replies      
I just received one of these as well. They seem to get their targets by compromising a single user and then monitoring the people who had viewed the same Google Docs as the infected victim in the past.
garyfirestorm 3 days ago 2 replies      
This happened to me. An unknown person from my organization shared a Google doc. I didn't open it, and replied by saying 'what is this about?'. He said he didn't send any gdocs :|
Clubber 3 days ago 1 reply      
The bad thing about a centralized internet is that it makes some mail servers much juicier targets than the decentralized mail servers of old.

I decided Gmail wasn't for me when I read they harvested your emails for ads. 1GB in 2004 sounded so enticing, too!

If you are technically savvy and have access to a static IP, I highly recommend setting up Postfix/Dovecot and registering a domain. It's fairly straightforward for technical people. You can have it set up, soup to nuts, in an hour or two. There are online docs everywhere.

It's probably not going to be as secure as a gmail, but it's a much smaller target. Most internet providers will give you a static for an extra $5 or so.

jaimehrubiks 3 days ago 0 replies      
The only tricky thing is users not noticing these weird permissions. Google may block naming an app "Google Docs", but someone could always get around it with "Google Docs." or whatever.
spydum 3 days ago 0 replies      
Next up, prepare for the inbox onslaught of every CASB provider hawking their wares and telling you all about the googpocalypse and how they are uniquely prepared to solve it!
aaronmiler 3 days ago 0 replies      
Just checked the malicious link again.

It looks like Google removed (at least one of) their access tokens

Checked the URL containing:


cloudaphant 3 days ago 1 reply      
Any clues what this was trying to do? I suppose we have to wait for Google to publicise what went on once OAuth had been granted.
Markoff 2 days ago 0 replies      
So what should I tell my mom to avoid her Gmail being hacked the same way in the future? (It wasn't hacked this time, since they targeted only an English-language audience.)

Don't click on unknown links that take you to a Google login page, and never approve access to your data in any dialog?

cassie942 1 day ago 0 replies      
there was a warning may 4 about a massive google doc phising scam check on.digg.com/2py2k5g
sleepychu 2 days ago 0 replies      
Is there mitigation against deploying exactly this attack another way?
killa_kyle 3 days ago 0 replies      
This is burning through our office right now. emailing all clients! diablo!
d2kx 3 days ago 0 replies      
Yeah @SwiftOnSecurity warned about this, lots of people/orgs affected
mathattack 3 days ago 0 replies      
I got a few, then it died. Perhaps Google now recognizes this as spam.
caydo00n 3 days ago 2 replies      
Anyone know how widespread this is? It just hit our school emails.
pmcpinto 3 days ago 0 replies      
I received it too
sudom82 3 days ago 1 reply      
edit: accidentally double posted

double edit: 1. Replied in the comment above. 2. Dunno, first time using HN; I accidentally submitted twice while I was on comment-posting cooldown, I guess.

MediaSquirrel 3 days ago 0 replies      
Same here
patmcguire 3 days ago 1 reply      
Yes, it's all over.
petervandijck 3 days ago 0 replies      
Yes, same here.
ownc 2 days ago 0 replies      
My teacher said not to open this email.
pinaceae 3 days ago 0 replies      
Amazing how large this is; our company just got a massive wave of those, all from "internal" addresses.
ben_jones 3 days ago 0 replies      
We have an entire generation that's been trained by big tech companies to instantly click agree, share, like, etc., buttons. This is only going to get worse.
Apple plans to spend $1B to support advanced manufacturing jobs in the U.S washingtonpost.com
238 points by happy-go-lucky  2 days ago   269 comments top 22
ryanmarsh 2 days ago 24 replies      
> Cook told "60 Minutes" in 2015: I mean, you can take every tool and die maker in the United States and probably put them in a room that we're currently sitting in. In China, you would have to have multiple football fields.

In that interview he also said that the jobs weren't coming back.

This is one of the most powerful statements uttered on American labor and employment trends in recent memory.

I have shared this quote with every middle/lower-middle class person I know who voted for Trump, and their responses have all been the same: horror.

Financial interests convinced liberals and conservatives alike that globalism was a good thing (for altruistic or selfish reasons respectively). Instead it just gave Capital access to cheap labor and totally fucked the American worker.

ardit33 2 days ago 3 replies      
There are also geopolitical reasons for this. There is a high chance that East Asia will be embroiled in some war soon, which would disrupt either manufacturing (whether in China or Taiwan) or the transport of goods.

Even if the North Korea problem gets solved, the China problem is still there, and it won't be solved anytime soon.

If you have some streamlined manufacturing here, it is easier to fork it in case the main area of supplies goes offline due to regional wars/embargoes/troubles.

Right now, relying only on East Asian manufacturing is a single point of failure for Apple.

jonknee 2 days ago 2 replies      
> Apple says that it intends to bolster the U.S. manufacturing sector by creating a $1 billion advanced manufacturing fund with some of that initial money going toward a company the tech giant is prepared to partner with, chief executive Tim Cook said.

Amazing what passes for news these days. This sounds no different from what they normally do with their supply chain. Anyone remember GT Advanced Technologies? Apple wanted their sapphire glass for the iPhone 6 and fronted $439m for a factory in Arizona. The deal went south and Apple ended up owning everything and 700+ people were laid off. Apple turned it into a data center.

brudgers 2 days ago 0 replies      
I read the headline and thought:

+ Advanced manufacturing jobs are done by robots.

+ $1 billion builds one moderately sized manufacturing plant (Tesla's Gigafactory is $5 billion [1])

+ There's a scene in Austin Powers: International Man of Mystery I wish the press would review before reporting so breathlessly.

[1]: https://en.wikipedia.org/wiki/Gigafactory_1

unchocked 2 days ago 0 replies      
Setting up a fund to create jobs is a pretty telling indicator that those jobs don't exist, and aren't expected to.
PatrickAuld 2 days ago 1 reply      
I think "Advanced Manufacturing Jobs" are going to be robotics-, machine learning-, and mechanical engineering-related. Foxconn is moving in this direction and I'm betting Apple thinks they can do it as well as them. Plus possible tax breaks, based on Trump's comments.
euphoria83 2 days ago 0 replies      
This is the way to bring jobs back to the US. Start with investing "small" sums of money to create an enterprise that will require manufacturing expertise in some area. The ecosystem will then expand. Other companies can then take advantage of the same expertise. Also, the radius of expertise will grow as supporting jobs crop up.
CrankyBear 2 days ago 1 reply      
Or, to put this into perspective, that's 1/250th of their cash on hand. http://money.cnn.com/2017/05/02/investing/apple-cash-quarter...
foepys 2 days ago 0 replies      
I guess it's part of a plan to pay lower taxes on their overseas cash when they bring it into the US. Create jobs, get tax breaks. Apple has stashed about $250 billion overseas, and paying a billion to reduce the tax burden is most likely the cheaper option.
danm07 2 days ago 1 reply      
Anyone else find it odd that all these headlines state only how much these companies are going to spend on manufacturing, but none of them say what they're actually going to be making?
mtgx 2 days ago 0 replies      
This isn't just out of the goodness of Apple's heart:


easilyBored 2 days ago 0 replies      
$1 billion. I wonder what % of their marketing/PR budget a billion is. Because that's all this is: PR to buy a few good mentions and to enable Apple to say, "Yeah, we're doing something..."
tmsldd 2 days ago 2 replies      
So.. 0.3% of their cash pile?
ericfrederich 2 days ago 0 replies      
So what'll that be?... 5 new jobs created? Look at the Tesla Gigafactory: they brag about how dense it is and how automated it is... that means fewer jobs.

I'm not saying it's a good thing or a bad thing, I'm just saying bringing manufacturing back to the US doesn't mean jobs: it couldn't and still be profitable. Any new factory in the US will have to be so heavily automated it couldn't possibly provide many jobs.

acd 2 days ago 0 replies      
Robots will assemble iphones in the US.
yummybear 2 days ago 0 replies      
So, engineers for the robots?
tengbretson 2 days ago 2 replies      
mediumdeviation 2 days ago 2 replies      
paulrpotts 2 days ago 0 replies      
Or, less than one-half of one percent of their cash reserves. That's generous and I'm sure it will be effective and not at all just a marketing gesture.
jlebrech 2 days ago 1 reply      
Trump spoke to Tim Cook last night
devy 2 days ago 0 replies      
Finally! This is long-overdue indeed.

On a related note, I was pretty shocked when I first learned that scientific glass blowing is a very specialized skill set, and a dying one.[1]

[1]: http://www.latimes.com/local/education/la-me-caltech-glassbl...

malandrew 2 days ago 4 replies      
It's so incredibly obvious that the best course of action is to let all these companies bring home all those foreign revenues, so long as a significant portion is reinvested in the US within a reasonable period of time, excluding investments in financial instruments (stocks and bonds) and non-productive assets like land.

Any money that is stuck abroad will be reinvested abroad, making foreign economies more competitive relative to the US economy. Anyone arguing for taxing profits from abroad heavily I can only assume has a very poor understanding of macroeconomics.

The Patek Philippe Caliber 89 and Horology's Easter Problem hodinkee.com
324 points by gpresot  1 day ago   182 comments top 32
bkeroack 1 day ago 4 replies      
Only tangentially related, but if you ever get the chance to check out the Patek Philippe Museum in Geneva you should do so. I'm not really a watch geek so before going there I was kind of "meh", but that place is incredible.

They have thousands of timepieces, some dating back to 1530. To see what people have been able to accomplish with nothing but springs and tiny gears is nothing short of mind-blowing. As a technologist it's pretty humbling, considering all the advantages we have today and yet ancient watchmakers could do this work literally with nothing but hand tools.

te_platt 1 day ago 5 replies      
I'm always curious when Gauss turns up in the solution to a math problem. Wikipedia has a section on the algorithm here: https://en.wikipedia.org/wiki/Computus#Gauss_algorithm

The same article also explains why it is a hard problem:

"Easter calculations are based on agreements and conventions, not on the actual celestial movements nor on indisputable facts of history."

Sounds a lot like many user requirements I have had to code for.
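The closed-form computus the Wikipedia section describes fits in a dozen lines of code. Here is a sketch of the "anonymous Gregorian" variant (often attributed to Meeus/Jones/Butcher), a close relative of Gauss's algorithm; the function name is my own:

```python
def gregorian_easter(year):
    """Month and day of Western (Gregorian) Easter Sunday for a given year."""
    a = year % 19                       # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3                # century-based lunar corrections
    h = (19 * a + b - d - g + 15) % 30  # days from March 21 toward the paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451    # correction for late full moons
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1


print(gregorian_easter(2017))  # (4, 16) - Easter Sunday fell on April 16, 2017
```

Every intermediate value here is a small residue, which is exactly why the "agreements and conventions" quote rings true: the formula encodes committee decisions, not astronomy.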

encoderer 1 day ago 3 replies      
On a tangent, my favorite example of how php is a giant bag of rando functions is easter_date(). This ships with the language core.
lunaru 1 day ago 7 replies      
I'm big into mechanical watches and I find it surprising that more software developer types are not into watches. They're the mechanical parallel that completes my otherwise purely digital world.

Also, everything about them (when done well) sounds like a top developer's wet dream, from the design, development, and down to the extremely rigorous QA processes around building these mechanical marvels.

In particular, this "inside rolex" piece from the Hodinkee website is a must-read, and I believe was also posted to HN a few months ago: https://www.hodinkee.com/articles/inside-rolex

jameshart 1 day ago 1 reply      
"As you can probably imagine, a program disk for the full cycle of Easter dates would be a wildly impractical thing as well; it would have to have 5,700,000 steps in order to encode the full cycle of Easter dates."

That assumes that you want to encode the program as a single disc encoding the repeating cycle. But surely you could use a series of program discs to perform successive lookups and offsets and reduce it down to several more manageable discs? e.g. you have one disc that encodes the cycle of offsets of the spring equinox by year, then use that to offset the rotation of the wheel that encodes the lunar cycle...
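The decomposition idea above works in principle: if the smaller program discs have pairwise coprime step counts, the Chinese remainder theorem guarantees their combined readings uniquely identify the position in the full cycle, so a single monstrous disc is never strictly necessary. A toy sketch (disc sizes chosen for illustration, not the actual horological cycles):

```python
from math import lcm

# Illustrative pairwise-coprime "disc" sizes -- not the real cycles, which
# involve the 19-year Metonic cycle, the 400-year Gregorian leap cycle,
# the 7-day week, and several century corrections.
moduli = [19, 7, 25, 32]
full_cycle = lcm(*moduli)  # 106,400 steps for a single monolithic disc

def disc_readings(year):
    """What each small disc shows for a given year.

    By the Chinese remainder theorem this tuple uniquely determines
    year % full_cycle, so four discs totalling 19+7+25+32 = 83 notches
    carry as much information as one 106,400-step disc.
    """
    return tuple(year % m for m in moduli)
```

The catch is the combining step: in software it is a cheap function, but in a watch it is gearing, which is presumably where the real complexity would land.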

ce4 1 day ago 1 reply      
I think it's one of the lamer complications.

The date-of-Easter complication consists of a notched program wheel, practically a lookup table. Due to the limited LUT size of 28 (1989-2017), this program wheel needs replacement.

noonespecial 23 hours ago 0 replies      
Mechanical computing is awesome. Miniaturized mechanical computing doubly so. I'm glad it's alive and well in the world of watches.

I especially love this sort of thing because there's not really a practical reason for this to exist. It's pure art. Technical art of amazing effort and skill. The best part is that people appreciate it and pay big money for it (lots of people). People pay for good art! That's all kinds of awesome.

iaw 1 day ago 1 reply      
> the year indication goes to 9,999 and Schwilgué is supposed to have helpfully suggested that in 10,000, someone might paint in a "1" to the left of the year window

I hope that's a true story, it's such a practical solution.

gavinpc 1 day ago 0 replies      
> a true date-of-Easter complication is probably the single most difficult complication in horology

Funny story. I was making the "final" commit before shipping a desktop application, and I wanted to make an Easter egg. But what should it be? It should be Easter-related, I supposed. I made a pastel color theme for the main screen, that should appear only on Easter. This product is used in DoD, and I doubted that anyone would ever be using it on a Sunday.

The product had a design flaw, a kind of time bomb. I was a greenhorn working on it when it first shipped in 1994, and I thought nothing of the fact that its rate table was arranged horizontally, like so:

  id  1991  1992  1993  1994  1995  1996  1997  1998  1999  2000
  1   4.49  4.59  4.79  4.99  5.16  5.26  5.68  5.78  5.99  6.44
  2   ...   etc
Despite being a Navy budget forecaster by trade, the Captain who created the program lacked the hubris to worry about needing rates for the next millennium. This set of dates was hardcoded all over the codebase. The Captain, a poor typist, kept resetting the bomb by adding more years.

Fast forward to 2012. The product has been acquired, and I was brought in to port it for 64-bit machines. And the only task remaining (revision #1400) was to compute the date of Easter.

It took me about 2 minutes to conclude that this wasn't going to happen (although I didn't know it was that hard). So although it lacks the elegance that you'd like in your Easter egg, this one hardcodes the dates of Easter Sunday... through 2020. I mean, it's going to be on the web by then, right?

Yeah, it's still shipping.


garethsprice 1 day ago 1 reply      
Beautifully written article. Important to remember as developers that we'll only ever be able to capture the chaos and complexity of the real world as a rough model, no matter how intricate the engineering.
pjungwir 1 day ago 1 reply      
Once upon a time I wrote a function for remind(1) to compute the date of Pascha (Easter in the eastern churches), which uses different rules. It uses the Julian calendar (mostly). Here it is on github:


It lets you choose between 3 algorithms, one of which is Gauss's.

OliverJones 1 day ago 2 replies      
Heh heh. Churches have been squabbling about the date of Easter since there was Easter. It seems this particular resolution of the problem relates to the western church's choices for Easter. Orthodox people still, if I understand it right, use the Julian calendar.

This stuff is hard to get right, even if you're a bishop with legions of scholars and theologians at your disposal.

And, just for grins, look up how the modern state of Israel decided when daylight saving time begins and ends up until 2012. (They rationalized it in 2012.)

tzs 1 day ago 0 replies      
"Easter is a feast, not a planet." --Johannes Kepler

A good reference for anyone who has to write calendar related code is the book "Calendrical Calculations" by Dershowitz and Reingold [1].

[1] https://www.amazon.com/Calendrical-Calculations-Nachum-Dersh...

Animats 1 day ago 0 replies      
What a kludge. They encoded a table in a variable-depth wheel, like the snail used in striking clocks, but not so regular. The table only has 29 entries. For a status-symbol complication watch, that's tacky. What does it do if not serviced? Display "needs service", or wrap around and show wrong values?
criddell 1 day ago 9 replies      
The idea of this watch is confounding.

Presumably, a person that is interested in having their watch tell them what day Easter falls on would be a follower of and believer in Jesus. From what I know about Jesus, he probably wouldn't be impressed by somebody spending millions of dollars on this.

What watch would Jesus wear?

Or maybe I'm looking at it wrong. Maybe it's just a great piece of art in the form of a clock.

cestith 1 day ago 1 reply      
I suddenly have The Buggles in my head singing "Software Killed the Hardware Star". The article's subtitle mentions the watch "needs a service". Well, that service could be CalDav if it was a software timepiece on a digital platform, and no skilled tradesman with tweezers would be involved.

Yes, a Patek Phillippe is a thing of physical beauty. However, for functionality like this software wins the day.

danielam 1 day ago 1 reply      
Interesting article. One note...

"[...] the whole structure of astronomical mechanical complications, whether in the Strasbourg cathedral clock or in watches like Caliber 89, is a manifestation of a world view."

The idea of a "clockwork universe" is a mechanistic, modern, Enlightenment-era idea. Astronomical clocks predate that worldview by quite some time.

coldpie 1 day ago 0 replies      
Tangential to this, if you're interested in how the calculation of Easter led to using cathedrals as astronomical instruments, check out The Sun in the Church by J.L. Heilbron. Frankly, it's a slog to read at times, but there's some very fascinating stuff in there, including some impressive engineering feats.
peterkshultz 1 day ago 0 replies      
Astronomical Algorithms by Jean Meeus is a great read for anyone interested in these types of problems.
BobMackay 1 day ago 0 replies      
Just for completeness, the 1990 paper referred to was by my father, Professor Alan Mackay, and was published in Modern Physics Letters B, Vol 4, No 15. I have put up a roughly scanned copy at http://bobmackay.com/Alan/AlanCV120.pdf in case anyone is interested.
pcurve 1 day ago 0 replies      
An in-depth, partially subtitled video on Caliber 89


ryanmarsh 1 day ago 0 replies      
Sincerely thank you for posting this. I had no idea of these timepieces and I am in awe of their beauty and sophistication. I think I have a new obsession.
cafebabbe 1 day ago 5 replies      
Ah, love for needlessly complicated things.

If it's code, it's an engineering flaw, but if it's mechanical, it's a marvel.

noir-york 1 day ago 0 replies      
A great read! Horology, Ecclesiastical history and lattice structures in one article. Thanks for posting.
sillypuddy 1 day ago 0 replies      
TLDR: Dates and times are hard
gcb0 1 day ago 0 replies      
There are dozens of places where one can link to patents. And even if not, they have convenient index numbers. Why not link to or mention the number if you are going to mention the patent in the article some 20 times?
gertef 1 day ago 0 replies      
Can you use a neural network to discover a formula for Easter date?
Teknoman117 1 day ago 0 replies      
Does anyone know of a video of the Easter mechanism of the Strasbourg astronomical clock running on the new year? I couldn't seem to track one down...
raverbashing 1 day ago 1 reply      
It seems the website is already gone

Edit: it's back up

f_allwein 1 day ago 3 replies      
"the basic rule for Easter is that it falls on the first Sunday after the first full moon of Spring (that is, the first full moon after the Spring Equinox)"

Given that the watch knows all these parameters (i.e. day of the week, moon phase, and date (Mar 21 = Spring equinox)), would it be possible to construct a mechanical complication that calculates the Easter date?
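In software, at least, the final step of the rule quoted above is trivial once the paschal full moon is known: Easter is the first Sunday strictly after it. A sketch of just that step (the April 11 input is, if I have it right, the 2017 ecclesiastical paschal full moon; computing that date is the genuinely hard part the article is about):

```python
import datetime

def sunday_after(full_moon: datetime.date) -> datetime.date:
    """First Sunday strictly after the given (paschal) full moon date."""
    days_ahead = (6 - full_moon.weekday()) % 7  # Monday=0 ... Sunday=6
    # If the full moon itself falls on a Sunday, Easter is the NEXT Sunday.
    return full_moon + datetime.timedelta(days=days_ahead or 7)

print(sunday_after(datetime.date(2017, 4, 11)))  # 2017-04-16
```

The "strictly after" detail matters: it is why Easter can never coincide with the (ecclesiastical) full moon itself.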

Aspirin May Prevent Cancer from Spreading, New Research Shows scientificamerican.com
272 points by azuajef  1 day ago   111 comments top 11
noam87 1 day ago 4 replies      
(on my phone right now)

Hasn't this been known for a while now? (with COX-2 NSAIDs specifically, like Advil). Even the standard book "The Biology Of Cancer" (2014) mentions it in passing in one of the later chapters.

I believe there's even been trials where NSAID is applied topically during biopsy, reducing chance of mets.

If I recall correctly, suspected mechanism is some sort of correlation between COX-2 inhibition and decreased expression of CTLA4/PD-1.

It's kind of sad, but it seems that too often vital lines of research sit on university shelves for years before anything practical is done about it :/

(disclaimer: not an expert; just guy with cancer.)

nonbel 1 day ago 1 reply      
There are some interesting theories about aspirin. For example that it was Rasputin's secret healing trick to tell people to stop taking it:

"Gilliard,[32] the French historian Hélène Carrère d'Encausse[33] and Diarmuid Jeffreys, a journalist, speculated Rasputin's healing practice included halting the administration of aspirin, a pain-relieving analgesic available since 1899.[34] Aspirin is an antiaggregant and has blood-thinning properties; it prevents clotting, and promotes bleeding which could have caused the hemarthrosis. The "wonder drug" would have worsened Alexei's joints' swelling and pain.[35][36]" https://en.wikipedia.org/wiki/Alexei_Nikolaevich,_Tsarevich_...

It is also claimed to be the real cause of the "Spanish flu":

"The high case-fatality rate, especially among young adults, during the 1918-1919 influenza pandemic is incompletely understood. Although late deaths showed bacterial pneumonia, early deaths exhibited extremely wet, sometimes hemorrhagic lungs. The hypothesis presented herein is that aspirin contributed to the incidence and severity of viral pathology, bacterial infection, and death, because physicians of the day were unaware that the regimens (8.0-31.2 g per day) produce levels associated with hyperventilation and pulmonary edema in 33% and 3% of recipients, respectively. Recently, pulmonary edema was found at autopsy in 46% of 26 salicylate-intoxicated adults. Experimentally, salicylates increase lung fluid and protein levels and impair mucociliary clearance. In 1918, the US Surgeon General, the US Navy, and the Journal of the American Medical Association recommended use of aspirin just before the October death spike. If these recommendations were followed, and if pulmonary edema occurred in 3% of persons, a significant proportion of the deaths may be attributable to aspirin." https://academic.oup.com/cid/article/49/9/1405/301441/Salicy...

aantix 1 day ago 2 replies      
As for the internal bleeding risks, I was wondering what the actual numbers were. This is based on one low dose aspirin a day.


"These results translate into an absolute rate increase with aspirin above placebo (the incidence of cases of major GI bleeding attributable to low-dose aspirin) of 0.12% per year (95% CI: 0.07-0.19% per year).[20] Based on this value, 833 patients (95% CI: 526-1429 patients) would need to be treated with low-dose aspirin instead of placebo to cause one major GI bleeding episode during a 1-year period (i.e. the NNH is 833)"
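The quoted figures check out: the number needed to harm (NNH) is just the reciprocal of the absolute rate increase. A quick sketch (function name mine) reproduces the paper's numbers, including both ends of the confidence interval:

```python
def nnh(absolute_rate_increase: float) -> int:
    """Number needed to harm = 1 / absolute risk increase, rounded."""
    return round(1 / absolute_rate_increase)

print(nnh(0.0012))               # 833 patient-years per extra major GI bleed
print(nnh(0.0019), nnh(0.0007))  # 526 1429 - the quoted 95% CI
```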

erikpukinskis 1 day ago 4 replies      
Because it's anti-inflammatory?

Could be confirmation bias, but I keep getting the sense that cancer/immune failure in general is connected to our modern 24/7 eating cycle, without any down time for your body to not be inflamed and work on clearing out bad stuff. Thus life extension from intermittent fasting.

agumonkey 1 day ago 1 reply      
Talking about simply accessible products, anybody has knowledge about cannabinoids (specifically cannabidiol, not THC) ?

I've dug as deep as someone outside a lab can and found tons of evidence, but the issue is that doctors each have a different point of view on it. So far I've got:

  - it's innocuous but helps opioids for pain
  - it's potentially antitumoral but human studies are lacking [0]
  - it's been studied and discarded because useless [1]
  - cannabis derivatives are toxic [2]
  - what's cannabidiol? oh, the USA uses that, Europe lags behind, I'll see
[0] Mouse models with human cancer lines showed regression; look for Pierre-Yves Desprez or Sean McAllister on PubMed. YouTube has a talk where some patients reported MRI-visible regression of stage IV cancer.

[1] A renowned oncologist's opinion; yet so much research on cannabidiol is still going on? Odd.

[2] so far I've yet to find more than one paper listing toxicity (except one talking about accidental overdose on that one hypersensitive person that used her daughter's provider)

I'll take any info just in case.

joshgel 1 day ago 0 replies      
Probably more interesting is that the USPSTF recently added a recommendation for aspirin use to prevent colorectal cancer in certain populations.[1] There are increasing levels of evidence about this, though it is by no means certain.[2]

[1] https://www.uspreventiveservicestaskforce.org/Page/Document/...
[2] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3354696/

jszymborski 1 day ago 1 reply      
Anyone have a link to the paper? I'm usually the first to criticise popsci treatments of cancer developments, but this actually seems to be an interesting study/development.
swah 1 day ago 4 replies      
Surely this would have been discovered by patients by accident, if it was true?
omginternets 1 day ago 0 replies      
That effect size, though... :/
vermooten 1 day ago 0 replies      
jtwebman 1 day ago 0 replies      
So does eating a whole-food plant-based vegan diet. https://www.forksoverknives.com/science-says-about-diet-and-...
Leaked document reveals UK plans for wider internet surveillance zdnet.com
282 points by cmsefton  1 day ago   145 comments top 19
vjvj 1 day ago 4 replies      
This will only get worse if Theresa May actually wins the vote / public approval to be PM. As Home Secretary she came up with fantastic ideas such as:

* Let us monitor every single call, email, text and website to catch terrorists and paedophiles (1)

* Let us ban apps like Whatsapp and iMessage because terrorists might use them (2)

* Let us use taxpayer money for a fleet of vans to drive around areas with a high % of non-Brits telling them to "go home". Which, to be fair, did result in 11 people leaving the country. (3)

(1) http://www.huffingtonpost.co.uk/2012/04/03/theresa-may-inter...
(2) http://www.independent.co.uk/life-style/gadgets-and-tech/new...
(3) https://www.theguardian.com/uk-news/2013/oct/31/go-home-vans...

teh_klev 1 day ago 1 reply      
And of course it's a Statutory Instrument, which means these amendments to the Investigatory Powers Act 2016 will be implemented with little or no opposition. Parliament doesn't get to debate SIs, and in the past 20 years they've been abused by both Labour and the Conservatives to pass legislation with little or no oversight.

Further reading:



da_n 1 day ago 3 replies      
I am a British citizen. There will not be any significant public opposition to this, a few articles in The Guardian and a small mention on the news between Trump stories. I have only experienced apathy when I discuss mass surveillance with people in this country. "I have nothing to hide, why should I care?". There is a fundamental disconnect between real world privacy and online privacy to most people in this country. At this point it is too late anyway, May has gone full frontal assault on the internet and privacy advocates and security experts don't have a hope. I just want her to go full-on now and implement maximum surveillance, maximum snooping and maximum data retention and profiling. All of this can only blow up embarrassingly in her face. It is just a question of time. Sadly I have come to the conclusion this is the only way that such powers will ever be opposed by the public.
jsnathan 1 day ago 1 reply      
It is not enough to develop technical solutions to circumvent the privacy erosion, as these will soon be branded as aiding crime or the enemies of the state, and eventually regulations like these will be passed almost everywhere.

Companies standing up for the privacy of their users should be held to a higher standard than simply imposing limits on their own data collection.

They must band together to actively lobby to counter these kinds of policies, so as not to run the charge of moral hypocrisy.

Another problem is that the party leadership on all sides is often more or less in favor of these policies.

If there is to be any effective political opposition to this, it must be organized from the bottom up.

SimonPStevens 1 day ago 0 replies      
I will literally vote for anyone to get the Conservatives out.

I have several usually fairly liberal friends saying they are going to vote Tory because we need strong government for the Brexit negotiations. This is complete and utter nonsense. We are screwed when it comes to Brexit whatever happens at this general election. The EU have zero incentive to give us anything (we are a small and insignificant minor annoyance to them), the government we have will make no difference to what they offer, and we have no cards to play to make them budge.

What will make a huge difference is the government we have in place after Brexit completes, and how that government takes us forward from there. May and the Conservatives completely terrify me. Do you know the very first EU bill they announced would be repealed from UK law is the human rights bill that protects citizens from their own government? Just stop for a minute and consider that this is the number one most important thing the Conservatives decided needed to change when we leave the EU... our protection from them. It is literally the very definition of abuse of power. We need a more middle-ground government to bring some sense back to the UK for after Brexit completes, not 5 more years of Tory profiteering.

I urge everyone being swayed by the Conservatives' waffle to really think about what a long-term Tory government will mean. Look into the strongest competitive party in your local constituency and vote for them.

abdias 1 day ago 0 replies      
There is absolutely no reason for any government to have this capacity unless it is to target potential dissidents and critics of their political agenda (as well as training "pre-crime" tech).

Intel regarding terrorist attacks and crime comes almost exclusively from HUMINT, not SIGINT.

This type of installation (which is basically already in place) reverses, in effect, the principle of a person being innocent until proven guilty (by due process), by deeming anyone a potential criminal by default. This is highly unacceptable in a society claiming to be free.

This reminds me of East Germany's Stasi police, on steroids.

iovrthoughtthis 1 day ago 1 reply      
I know it keeps being said that we can't produce technical solutions to the problem of state surveillance but the political changes we're advocating for that would fix this are akin to the "re-write the product" arguments.

The people you want to change are such a vast system of people, views, beliefs and incentives that changing them will take an inordinate amount of time (this is an assumption).

Though the effort required to build an environment frustrating enough to make technical state surveillance infeasible is perhaps huge it would seem that the impact any individual can have in that area is vastly larger than in the former (also an assumption). This would suggest to me that more could be achieved to frustrate the process of automating state surveillance in a shorter time span through technology than can through government reform.

noja 1 day ago 1 reply      
Is this a "leak" or a "let's test the reaction and amend" leak?
mirimir 1 day ago 1 reply      
I wonder about the full story. Are any UK ISPs already cooperating? I mean, the NSA has many^N intercepts. So maybe this isn't about interception, but rather about how intercepts get managed and queried.
Crosseye_Jack 1 day ago 2 replies      
The highlighted part in the doc on the zdnet site. (italic emphasis my own)

"To provide and maintain the capability to disclose, where practicable, the content of communications or secondary data in an intelligible form and to remove electronic protection applied by or on behalf of the telecommunications operator to the communications or data, or to permit the person to whom the warrant is addressed to remove such electronic protection."

Sounds to me like it applies to what the telcos apply. Now, does the term "telecommunications operator" apply to services running on top of the internet (iMessage, WhatsApp, etc.)? Because to me, "telecommunications operators" are the people running the networks that connect people to the internet (Sky, BT, TalkTalk, Virgin, etc.), not the companies that provide services on the internet.

I'll have to have a read through it in full later to see if I can get a better understanding of this document.

coldcode 1 day ago 0 replies      
The UK doesn't have the pesky Constitution we have in the US; however, it's only a slight obstacle if they can keep it out of the real justice system. I think the banning of encryption is the real kicker; if they can accomplish this (not sure how that would work), it will only wake up the people once they realize their bank accounts and other important personal connections are being stolen by people who have hacked or stolen the backdoor keys. Once people lose their money it will dawn on folks that this is wrong. Without such a personal connection it's just "I have nothing to hide".
type0 19 hours ago 0 replies      
> which critics called the "most extreme surveillance law ever passed in a democracy".

When the government of a country takes such steps, it no longer should be called a democracy. It doesn't matter that they were democratically elected; what matters is what they do to their citizens. The NSDAP was also democratically elected, and I wouldn't call the Third Reich a democracy.

easilyBored 1 day ago 2 replies      
Apple stops iPhone sales to UK due to encryption backdoor requirement.

What would happen? Would the government back down? No doubt if Google and Microsoft joined, it would be easier. China wouldn't care, but the UK is different politically and much too small to have its own replacements.

velox_io 1 day ago 0 replies      
Politicians meddling with technology rarely ends well...

If they force ISPs to keep detailed logs and ban encrypted messaging Apps, more people will just start using VPNs.

RE: Brexit, there were so many lies during the referendum that if they ran it again the result would be very different. This whole thing feels like a death march.

The EU cannot let Britain exit cleanly as it will destabilise the union.

Don't get me started on May or Corbyn.

dmix 1 day ago 0 replies      
If I'm reading this right, they basically want to set up an ISP surveillance system similar to Russia's? Installing some black boxes for a direct backdoor into the raw data.

Beats having to tap cables/backbone, plus they don't need to look for metadata identifiers in the data to ID each user.

dijit 1 day ago 4 replies      
They already do a lot more than the US do, Edward Snowden even said GCHQ do metadata and data collection wholesale.

Not to mention the random SSL downgrades that happen when you're going via UK transit links. (Which I have experienced myself!)

The UK should be considered dangerous. And the current government is only going to make it more dangerous.

There is also little to no hope of the current government being ousted during the next election.

I'm quite upset that my home country has to be so anti-freedom and anti-privacy.

cm2187 1 day ago 0 replies      
But what can they do exactly? Most of the https websites I connect to from the UK are based outside of the UK. My ISP won't be able to MITM this traffic.
vixen99 1 day ago 1 reply      
What about a cost analysis? To what extent would such surveillance actually enable the forces of law and order to reduce or stop terrorist outrages, and thus death and destruction in society? We would at least have a balance to consider. My guess: zero. It's not that difficult to imagine that those intent on mayhem can plan and execute an attack without using electronic media at all.
turblety 1 day ago 5 replies      
I think the title's wrong here. This is North Korea implementing the policy, right? Not the UK?

All sarcasm aside, this is horrendous. But let's not pretend this is to curb "terrorism", whatever that means. It's simply a way to slowly erode the people's democratic rights. Very sad times to be in.

With latency as low as 25ms, SpaceX to launch broadband satellites in 2019 arstechnica.com
265 points by jseliger  3 days ago   169 comments top 23
modeless 3 days ago 2 replies      
Shorter summary of SpaceX's Senate testimony https://www.commerce.senate.gov/public/_cache/files/6c08b6c2...:

Initially, the SpaceX system will consist of 4,425 satellites operating in 83 orbital planes (at altitudes ranging from 1,110 km to 1,325 km). SpaceX has proposed an additional constellation of 7,500 satellites operating even closer to Earth.

For the end consumer, SpaceX user terminals (essentially a relatively small flat panel, roughly the size of a laptop) will use phased array technologies to allow for highly directive, steered antenna beams that track the system's low-Earth orbit satellites. In space, the satellites will communicate with each other using optical inter-satellite links, in effect creating a mesh network flying overhead that will enable seamless network management and continuity of service.

Later this year, SpaceX will begin the process of testing the satellites themselves, launching one prototype before the end of the year and another during the early months of 2018. Following successful demonstration of the technology, SpaceX intends to begin the operational satellite launch campaign in 2019. The remaining satellites in the constellation will be launched in phases through 2024, when the system will reach full capacity.

mabbo 3 days ago 5 replies      
It's been said before, but what people need to understand is that the goal of this service isn't to make money; it's to balance demand.

SpaceX can plan for 30% more launches than their actual customers need. All the extra rockets are used for internet satellites. If there's an accident (boom) or a sudden need for more rockets, the internet satellites get delayed and the paying customer gets a rocket.

If customers are finicky about using re-flown rockets, that's fine, the internal customer doesn't care. If a customer backs out for any reason, the internal customer takes their place.

Rocket production gets a smooth flow going, which improves production efficiency.

bmcusick 3 days ago 3 replies      
This is good news for the USA, but it's GREAT news for countries where the government regularly cuts off and controls the Internet for political purposes. If receivers can be as small as a personal laptop, expect a thriving black market in them in places like Iran and Egypt.

Such receivers can also act as backhaul for mesh networks, providing access to un-censored information in times of turmoil and peace alike.

Blinks- 3 days ago 1 reply      
"SpaceX has also proposed an additional 7,500 satellites operating even closer to the ground, saying that this will boost capacity and reduce latency in heavily populated areas."

This should be interesting. Depending on how this is regulated on the legal side, it could result in the major ISPs updating our infrastructure; of course, nothing has stopped them from agreeing to update infrastructure and then running off with our tax money in the past. Not to mention they will probably use legal means to hamstring this if possible. Hopefully this spurs competition in areas with stagnating ISPs and increases the availability of low-latency connections in sparsely populated areas.

akgerber 3 days ago 2 replies      
My aunt's town in rural Western Massachusetts still only has broadband in the library, plus scanty LTE service and terrible satellite. How and whether to wire the town with broadband is driving the first contested election in decades.

This could make a major difference for that type of rural town.

candiodari 2 days ago 3 replies      
Interesting. What would this be able to do? Current wireless speeds, even with state-of-the-art technology, are going to be something like 10 Gbit. That'll have to be shared by an entire city at least (correction: about 2 satellites per state on average; fewer for New York), and of course there won't be any CDNs or HTTP caches "upstairs," so to speak. So 1 bit in = 1 bit back out.

Even at 4,425 satellites that means, let's see: the Earth is about 500 million km2, and they don't need the same density everywhere, so some optimizations may be possible; let's say 5,000 satellites. That means 100,000 km2 per satellite, i.e., one satellite per decent-sized country, and only about 100 for the entire US. This is not geostationary orbit, so they'll have to aim for evenly spaced satellites.

Let's say they push their satellites to 10x what is currently possible, so 100 Gbit/satellite. Current internet traffic into and out of the Netherlands is 50 times that (granted, the Netherlands has more and denser cities than Portugal, but it would have to share one SpaceX satellite with Belgium and parts of the UK).

They would only be able to provide about 1/10,000,000th of a satellite to a cellphone in the US, and at 100 Gbit/satellite that's only on the order of 10 kbit on average. Even 10x or 100x that capacity wouldn't change much. Doubling or tripling the number of satellites won't fix this; we'd need between hundreds of thousands and millions of satellites.

This will need to be rather expensive (certainly as much as your cellphone plan, probably more, as this seems to indicate the system can support 100x fewer users than the US mobile network, and even fewer in smaller, dense countries or cities like the Netherlands or New York). Even with state-of-the-art technology, bandwidth limits are going to be tight. Like any satellite service, I can't see this working at city densities, or for mobiles.
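The back-of-envelope above can be run directly. A minimal sketch, using this comment's own rounded figures (500 million km2 Earth area, 5,000 satellites, 100 Gbit/satellite); the 10-million-user divisor is purely illustrative, not a SpaceX number:

```python
# Rough per-satellite coverage and per-user bandwidth share,
# using the comment's assumed figures (all approximate).
EARTH_AREA_KM2 = 500e6   # rounded total surface area of the Earth
SATELLITES = 5000        # rounded-up constellation size
PER_SAT_GBIT = 100       # assumed 10x today's wireless throughput

area_per_sat = EARTH_AREA_KM2 / SATELLITES   # coverage area per satellite
us_sats = 9.8e6 / area_per_sat               # satellites over the ~9.8M km2 US

# Share one satellite's capacity across an assumed user population:
users_per_sat = 10_000_000                   # illustrative only
kbit_per_user = PER_SAT_GBIT * 1e6 / users_per_sat   # Gbit -> kbit

print(area_per_sat)    # 100000.0 km2 per satellite
print(round(us_sats))  # 98
print(kbit_per_user)   # 10.0 kbit average per user
```

Varying `users_per_sat` shows the core point: shared satellite capacity per user collapses quickly at mobile-network user densities.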

IndexicalDemon 3 days ago 1 reply      
It's a cool system. There's a slight disadvantage that the satellite-to-ground link is at 10-15 GHz for users, so it will go down in a heavy rainstorm, and some fraction of the time due to other atmospheric effects. Should be reasonably reliable but far from fiber, and it will be interesting to see what uptime they guarantee. Assuming this all ever actually happens.
oceanswave 3 days ago 4 replies      
Seven years. US broadband providers have seven years to rake us over the coals, practice full-bore capitalism with our data, and generally run oligopolistic practices before their time even begins to run out.
jseliger 3 days ago 2 replies      
This may help explain why Verizon is cutting FiOS prices and increasing speeds: https://arstechnica.com/information-technology/2017/04/veriz...
devrandomguy 3 days ago 0 replies      
It would be super cool if amateur cubesats could interface with this network. I've always wanted to run a server on orbit, where energy is free and regulations are limited to RF spectrum issues.
fpgaminer 3 days ago 3 replies      
I was curious about the financials of this project. Here's my back of the napkin:

According to (https://www.quora.com/How-much-does-it-cost-to-set-up-and-la...) it costs ~$80-$100 million per satellite to build, launch, and run. So that's ~$354-$443 billion for all the satellites. Let's call it $443.

Now, I pay ~$60/m for internet. That's $720/yr. If they want to recoup costs after 10 years, it'll take ~61.5 million subscribers.

For the U.S. alone, that'll probably never happen. But worldwide? Maybe. Though worldwide the price per subscriber will be less, oftentimes substantially less.

On the other hand, SpaceX will have lots of operational discounts because it's their own rockets, and they already have all the supply channels for building space tech. And maybe they aren't looking for payback in 10 years on this initial run; maybe they only want to break even, so payback in 15 is good enough.

Just some numbers for thought.
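The napkin math above, as a runnable sketch (all inputs are this comment's own estimates, taking the high end of the $80-$100M per-satellite cost):

```python
# Back-of-the-napkin: total constellation cost vs. subscription revenue.
SATELLITES = 4425
COST_PER_SAT = 100e6    # $/satellite to build, launch, and run (high-end estimate)
MONTHLY_PRICE = 60      # $/month, US-style broadband pricing
PAYBACK_YEARS = 10

total_cost = SATELLITES * COST_PER_SAT          # total constellation cost
annual_revenue_per_sub = MONTHLY_PRICE * 12     # $720/yr per subscriber
subscribers_needed = total_cost / (annual_revenue_per_sub * PAYBACK_YEARS)

print(total_cost / 1e9)                    # 442.5 ($ billion)
print(round(subscribers_needed / 1e6, 1))  # 61.5 (million subscribers)
```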

P.S. Businesses unlucky enough to be in "rural" areas, aka most commercial business parks outside of tech cities, are in the unfortunate position of paying the local telephone company/AT&T/TimeWarner thousands per month for roughly dial-up speeds. SpaceX could charge the same but offer broadband speeds and get a nice chunk of business, methinks.

dboreham 2 days ago 1 reply      
Ugh. Are they really talking about Latency, or are the numbers they cite actually RTT? And are the numbers the loop latency, or end-to-end latency to some destination on the public network such as

For comparison my CableCo connection in the left-middle of America achieves a loop RTT of about 10ms (5ms Latency), and an RTT to of 25ms (12.5ms Latency).
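The RTT/latency distinction is easy to make concrete. A quick sketch, assuming a bent-pipe path straight overhead through the 1,110 km orbit mentioned in SpaceX's testimony and signals at vacuum light speed (the real floor is higher once you add slant angles and processing):

```python
# RTT vs. one-way latency, plus the physical floor for a 1,110 km orbit.
C_KM_S = 299_792.458    # speed of light in vacuum, km/s

def one_way_from_rtt(rtt_ms):
    # RTT counts the path twice, so one-way latency is half of it.
    return rtt_ms / 2

# Bent-pipe minimum: user -> satellite -> ground station, straight overhead.
altitude_km = 1110
one_way_ms = 2 * altitude_km / C_KM_S * 1000   # up + down, in ms
rtt_floor_ms = 2 * one_way_ms                  # there and back

print(one_way_from_rtt(25))    # 12.5 ms if the quoted 25 ms is RTT
print(round(rtt_floor_ms, 1))  # 14.8 ms propagation floor
```

So a quoted "25 ms" is plausible as RTT but leaves only ~10 ms over the propagation floor for everything else.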

mattcoles 3 days ago 4 replies      
Do we have space in orbit for this? This is a lot of satellites, and I also don't get how the latency can be so much lower than that of current satellite ISPs.
ChuckMcM 3 days ago 1 reply      
Neat, Teledesic[1] v2.0 (or is it 3.0?) :). That effort stalled when single-stage-to-orbit rockets became impractical due to insufficient launch demand after Reagan's Star Wars program was cancelled.

With a fast-turnaround reusable F9 booster (so its relaunch costs are significantly less than launching a new one), Musk would have control over his own destiny in this case. When Iridium was being built, the whole ground station side (up/down links into the switched telephone network) was nearly a decade-long slog of permissions, licenses, and regulations.

That said, if he is successful he will suck a lot of money out of wallets that belong to some pretty big fat-cat companies. And if he made every ground station a Tor exit node, it would really mess with the powers that be.

[1] https://en.wikipedia.org/wiki/Teledesic

stubish 2 days ago 0 replies      
Get it up there, and you will have Australia as a client. We have made a mess of our national broadband network rollout, and this might be going live before the NBN rollout is complete. Probably faster and cheaper, with reliability still to be seen.
protomyth 3 days ago 1 reply      
I thought the limiting factor for a satellite ISP was power on the satellite side as much as latency?
artursapek 3 days ago 4 replies      
Oh how I wish I could buy SpaceX stock...
slinger 3 days ago 1 reply      
Does this mean that it'll be possible to play with folks from another continent with good latency?
faragon 3 days ago 0 replies      
So Musk wants to be the Internet, too. I, for one, welcome that.
api 3 days ago 0 replies      
Please call it Skynet. :)
tehwhynot 3 days ago 1 reply      
Do the same thing on Mars. Then get self-driving space internet machines that transfer internet from here to Mars. They always position themselves to be between here and Mars for best RTT. Thank me later, humanity.

And yeah, I know latency would be between 6 and ~45 minutes.
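Those numbers check out as round-trip light time. A quick sketch using the commonly cited approximate closest and farthest Earth-Mars distances:

```python
# Round-trip light time to Mars at closest and farthest approach.
C_KM_S = 299_792.458    # speed of light in vacuum, km/s
DIST_MIN_KM = 54.6e6    # approximate closest approach
DIST_MAX_KM = 401e6     # approximate farthest separation

def rtt_minutes(distance_km):
    # There and back, converted from seconds to minutes.
    return 2 * distance_km / C_KM_S / 60

print(round(rtt_minutes(DIST_MIN_KM)))  # 6
print(round(rtt_minutes(DIST_MAX_KM)))  # 45
```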

soheil 3 days ago 1 reply      
Compared to my Webpass connection with sub-1ms latencies, this seems like ages. But of course the same can't be said about the majority of ISPs, so this is a big win.
ct0 3 days ago 0 replies      
I was able to pay $20/mo for each month remaining on my contract to end service and start a new contract as a new customer when moving from 50/50 to 150/150.
How to Get into VR ycombinator.com
303 points by vincentschen  3 days ago   206 comments top 29
minimaxir 3 days ago 17 replies      
> The development of VR has been surprisingly tied to science fiction. Authors in the field have envisioned the futures that engineers set out to build

Incidentally, the high romanticism of VR is one of the reasons I am highly skeptical of the industry. The article argues "VR will also enable immersive concerts, reinvented museums, and live, court-side sporting events", but what is it doing now outside of games, which have been hit-or-miss? (The Samsung Gear VR commercials make VR look ridiculous, IMO.)

AI is a similarly romantic industry, but the difference is that there are many practical, non-gimmicky applications of AI now and already implemented on your phones/PCs.

Clubber 3 days ago 2 replies      
"Silicon Valley" touched on this last week:

"It's a VR play," Bachman says. "That's the frothiest space in the Valley right now. Nobody understands it but everyone wants in. Any idiot could walk into a fucking room, utter the letters 'v' and 'r', and VCs would hurl bricks of cash at them.

"By the time they find out it's vaporware, it's too late. I've got to get into this."

Animats 3 days ago 5 replies      
Somebody at YC likes "Ready Player One", which is a stupid book. (Essential skill for taking over a big company: the ability to play a perfect game of Pac-Man.) "Snow Crash" was ahead of its time, yet the plot of Snow Crash would play out the same if everybody simply had modern phones.

Where's the killer app for VR? VR headsets have been around for years. The current generation of technology works adequately. Yet other than first person shooters, there's not much to do in there. You can plug into Second Life or High Fidelity with a VR headset, but few people do. Using a VR headset to simulate a screen so you can watch a movie is more trouble than it's worth.

Tossrock 3 days ago 1 reply      
It's exciting to me that there are still this many naysayers, even here on HN. That suggests there's still time to enter the field before the inflection point hits.
gfodor 3 days ago 1 reply      
One thing that I think can get lost is that to "Get into VR" if you are already a great software engineer you may not need to learn anything new, just find a place to work where you can immediately apply the skills you already have, towards a VR-oriented product. From there, look for opportunities to develop VR-specific skills like the ones mentioned in this article.

For example, at our company (AltspaceVR) we have a huge need for engineers of all stripes, not just the types of folks you'd normally think of for what is traditionally thought of as a "game" (this term, too, is probably about to become very stale.) It's a huge challenge for us to get great engineers who are not in the games industry to realize that working in this space, depending on the application, can be much like hopping, say, from a web app to a mobile app. Are there new skills to learn? Yes, tons. But that doesn't mean you are starting at ground zero nor that your current skills aren't going to highly valued and critical to an organization's success.

We need people to help build our backend systems (Rails, Kafka, etc.), manage ops (Ansible, Datadog, AWS), create deployment tools, build UI (React, but rendered in VR!), work on our mobile app (React Native), build analytics pipelines; the list goes on. In fact, I'd argue that if you are looking at an organization that doesn't understand the need for such a wide spectrum of engineering talent to deliver a great application, you should be careful!

Basically for VR companies who are working on applications that will leverage VR, the "VR-specific" slice of the engineering work is probably a lot smaller than you'd expect, just like for companies developing mobile applications, the "mobile specific" part of engineering work is only a part of the effort.

In other words, don't think you can't get into this field just because you aren't a graphics geek.

(Oh yeah, we're hiring :) https://altvr.com/careers)

deepnet 3 days ago 2 replies      
At least in my case, VR simulator sickness was nascent proprioception.

Fast paced and hectic movements performed in VR would make someone sick if performed in real life.

Back in the day, playing FPSs with shutter glasses, I realised the motion sensations were tied to the motion. Then the sensations became a feature not a bug.

Once I understood the feelings, they stopped making me nauseous. VR proprioception improved my FPS performance. Feeling orientation and velocity really helps.

And immersion goes off the chart.

It's hard to know which way one is pointing in space; sometimes there are scant motion cues.

But in VR with proprioception, when your stomach jumps into your mouth as you pull insane high-G maneuvers while bolts of laser fire crackle past your ship, you sense it, becoming, as Tudyk said, "a leaf on the wind". Fully embodied. No mind, no controls; you just are.

My vision for VR is a web browser, informed by Gibson's original vision of cyberspace, Xerox PARC's 3D information sorting, and VRML97 with Unreal-style portals as hyperlinks.

IMHO the VR in Snow Crash was mundane. Predating the net, Gibson saw further, to a paradigm shift coming from a new knowledge tool.

Gibson's later 'locative art' idea is a nice twist on augmented reality: you have to go somewhere to see the virtual art overlaid on that place; it is site-specific.

aVReality 3 days ago 2 replies      
Perhaps counterintuitively, I see/hear more interest in VR for business/productivity than for gaming, right now. I think it comes down to stimulation: we're used to gaming and other entertainment being high-octane, exciting experiences. But work/productivity is boring. VR, by virtue of its immersion and fantastical interactions, can bring more excitement to productivity uses (which are typically boring), while adding legitimate value (e.g., building better relationships with remote employees through business collaboration in VR).
moron4hire 3 days ago 1 reply      
I've only seen the talk in person and have not been able to find a resource online that enumerates exactly what he means, but Dr. Ken Perlin gives an excellent talk (with a live demo) about how AR could potentially change human communication completely.

The gist of it is: assuming ubiquitous access to an AR system that provides a completely shared experience, people will naturally start using that shared experience as a communication tool. We will draw what we mean when words become difficult. We will riff on those ideas in a real-time simulation. And with the mind-set or combination of services, it may as well be, or could actually become, a physical reality.

If you've seen any of his more recent, HTML5-based demos (he has completely switched off of Java) involving visually-oriented, live-programmable environments with physics baked-in, it's almost completely targeted at this dream of communication. He has said he's not in it for graphics or VR/AR anymore, it's the communication aspect he's after.

adamnemecek 3 days ago 1 reply      
To learn shader programming, check out ShaderToy https://www.shadertoy.com/

It's insane how short some of these implementations are.

soared 3 days ago 1 reply      
While this was super interesting to read, it was fairly disappointing. I'm hoping the entire series focuses on VR and this isn't the only post. They lead off with:

> We talked to college students interested in engineering, business, and technology to figure out what resources would be most helpful to them. Then, we reached out to experts from academia, industry, or some combination of the two.

But all we ended up with was a blog post with some links to online courses and books. Is YC doing more? I was expecting something like an actual online course/intro with low requirements to get in, e.g. let me use my laptop camera and some HTML/CSS/JS to see what building in AR is like.

anderspitman 3 days ago 1 reply      
Just wanted to add my voice to the author's plug for Rainbows End (the book). Although I didn't find the story especially engaging, the technological ideas and perspective on VR were inspiring. Bonus if you're into genetics or bioinformatics at all (similar to Jurassic Park in that respect).
benjoho 3 days ago 0 replies      
Another good resource for those starting out in computer graphics/VR. It's the textbook for our VR class at UIUC.


centrinoblue 3 days ago 1 reply      
Does anyone have a good resource covering Headset Hardware reviews and specs?

I researched this recently looking for an HDMI compatible AR headset to try command line coding with but it was tough to find concise feature/spec/price comparisons.

Sign of the times I guess.

euske 2 days ago 0 replies      
Everyone is talking about the hardware cost, but people seem to have forgotten the software cost, which will have a much bigger impact on the industry overall.

Yes, there will be a few great VR apps and games that actually add to the experience, but what is the average VR app going to be? Making something in VR doesn't necessarily make it better. You have to make a good game/app in the first place, and then there's an additional cost for VR. Now you have one more element to screw up.

Look, even a regular standard PC could be 10x more useful if all its software were better built and ran flawlessly. Instead we ended up with a half-baked, glitchy mess. Making a good VR app won't be easier than this. The break-even point will be higher because people have higher expectations. It seems the industry is ill-fated.

intrasight 2 days ago 1 reply      
The elephant NOT in the room here is Apple. I think one can safely assume that Apple is very actively doing R&D to solve the very hard problems with VR - and they will probably succeed. It'll probably be a walled garden like iOS, but it will still move the market.
sevensor 3 days ago 3 replies      
> VR is a new medium

This is just blatantly not true, at least not on the implied timescale. VR has been around for literally decades, including immersive VR with head and hand tracking. I know because I used it in 1998. The current VR hype cycle is based on this all having gotten a lot cheaper and faster.

To top it off, listing Snow Crash and Neuromancer as suggested reading is just silly. Those are the exact same books we were reading during the last VR boom, and while they're two of my favorites, they're both laughably wrong about VR. Specifically, VR is bad for the exact use cases presented in these novels! VR is, was, and will continue to be a product with niche appeal. We'll see the market expand because it's cheaper, but it's not an effective user interface for most applications.

Keyframe 3 days ago 1 reply      
There seems to be a certain momentum behind VR development, but I'm afraid it will fizzle out in the mass market. I can't quite put my finger on it, but the talk about VR reminds me a lot of the talk about VR in the '90s, just on a larger scale. I hope I'm wrong; I just don't see it now for some reason. Maybe if it dies out a bit and then another wave comes around with improved tech, perhaps riding on the haptics of the future... who knows. As for AR, I don't see it at all; I just don't see it living outside of commercials with happy people clicking on their phones. What I'm sure of is that VR, dead or not in the mass market, will continue to live on in niche varieties (as it did after the '90s fizzle), like job training (space, military, medicine...) and such.
artur_makly 2 days ago 0 replies      
VR porn, I believe, will be the biggest of all social consumption. Last year I had a chance to try it (out of sheer curiosity and nostalgia). It was way, way too real. The end of society.
gavanwoolery 3 days ago 0 replies      
For those interested in how NOT to get into VR, I present: https://www.youtube.com/watch?v=7OHlaVNOKGM
BatFastard 2 days ago 1 reply      
Everyone in VR thinks they are creating things for the first time. From 2000 to 2009 I founded and ran the second largest VR company in the world. The number of firsts that my team had was amazing.

What really amazes me, now that VR is growing, is that zero people have talked to me about what my team learned. It's like the past never existed...

rezashirazian 3 days ago 0 replies      
I've always been sceptical of VR. As much as I was amazed by the Oculus the first time I tried it, I never thought it was going to catch on, for the sole reason that humans do not like to have things on their faces.

Any technology that needs to mount on the user's face will have to propose a huge value proposition.

mixedbit 2 days ago 0 replies      
A very interesting development in this space is WebVR. You can use WebGL to create widely available (95% of browsers) web-based 3D experiences, and thanks to WebVR the experience can upgrade to VR for visitors who have VR hardware.
JokerDan 2 days ago 0 replies      
I would love to get my hands on a VR kit to look into AR and UX design, but sadly I do not have the money to invest in an HTC Vive or similar.
damaru 2 days ago 0 replies      
Why are VR posts devoid of any talk about porn technology development? I think a huge market for VR will be the porn industry, and it seems quite untapped at this point.
devmunchies 3 days ago 0 replies      
I would love to see a "Paths" post about getting into 3D printing. Or if anyone has any good resources for a total beginner I'd be grateful.
return0 3 days ago 2 replies      
First they should ask whether you want to get into VR. I mean, there is a certain crisis, as the hype did not live up to expectations and obvious problems (sickness) remain obvious.
it_learnses 3 days ago 6 replies      
do I need a PC to be able to get into it?
greggman 2 days ago 1 reply      
I still haven't drunk the VR kool-aid.

I want to believe. I've used all the software. Played many games. The problems for me are two-fold:

1. I don't have the space. I don't personally know anyone who does. Maybe if you live in the Midwest or the suburbs, but there are plenty of cities where people are unlikely to have space for room-scale VR, and arguably non-room-scale VR is just not that compelling. Visiting a "VR space" is never going to be mainstream, so that seems like an issue for mass VR adoption.

2. When I play a good non-VR game, I play for hours and days at a time. As an example, last month I played Zelda: Breath of the Wild for 70+ hours, 10-14 hours a day. I've yet to play anything in VR that I could stay in for 10-14 hours. In fact, many VR games get extremely tiring very quickly; I've played a few where after 10 minutes I'm exhausted. That doesn't seem likely to become a main form of entertainment.

I've also played at a few VR arcades. Again, the games have been fun and immersive, but they also felt like thrill rides. In other words, 5-10 minutes and I'm done. Neat experience, but just like I could not ride a rollercoaster for 10 hours, I can't do VR for 10 hours. It's not about the form, it's about the activity. Searching a room, opening drawers, pressing various floating buttons, holding virtual weapons: it's tiring, not relaxing like non-VR games.

Those issues don't seem solvable. They aren't tech issues; they're inherent to the whole concept. Fixing the space problem would require direct brain implants, so you don't actually move around, you just think you are. I don't know if there is a solution to the second. It's one thing to push buttons to see Nathan Drake climb mountains. It's another to actually do the climbing.

As for the article itself, I was a little sad to see VR defined as only:

> VR is only tracking, rendering, and display. Tracking is the process of recording the user's location and orientation in 3D space. Rendering is the process of constructing the appropriate image for a user. Display refers to the fidelity with which the hardware can produce the rendered image.

Even with the problems mentioned above there seems to still be low-hanging fruit.

If you want to be able to talk to people in VR you need to be able to emote. That means you need low-res cameras looking at your eyes and mouth so your avatar can show your expressions.

Similarly, two hands are not enough. I need sensors on my feet and maybe my knees, waist, and elbows. In many VR experiences I've played, things like bumping up against something with my waist, kicking something away with my feet, or using my knee to close a drawer seemed like natural actions that were thwarted by lack of input.

mfrager 3 days ago 2 replies      
Using VR makes me sick...
Iceland drills 4.7 km down into volcano to tap clean energy phys.org
240 points by dnetesn  1 day ago   159 comments top 16
chris_va 1 day ago 10 replies      
I'd be careful about getting too optimistic.

Conventional geothermal is about 4-5 cents/kWh, on par with natural gas in the US. The dominant capital cost is drilling a large well (you need volume), and so geothermal plants are generally only built in areas that require shallow (1km) wells.

Well costs are ~quadratic in depth. Given how much money has already been spent optimizing drilling for the oil&gas industry, along with how cutthroat that market is, I don't see the cost coming down significantly. As a result, deep geothermal will likely be limited to niche regions like Iceland. And you need deep geothermal to scale it past the existing locations.
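As a sketch of why depth is so punishing under that rule of thumb (the quadratic scaling is the commenter's approximation, and the unit cost is an arbitrary normalizing constant, not an industry figure):

```python
# If well cost scales ~quadratically with depth (the rule of thumb above),
# Iceland's 4.7 km well costs far more than a typical ~1 km geothermal well.
def well_cost(depth_km, cost_1km=1.0):
    # cost_1km normalizes the cost of a 1 km well to 1 unit (illustrative only)
    return cost_1km * depth_km ** 2

shallow = well_cost(1.0)   # typical shallow geothermal well
deep = well_cost(4.7)      # the Iceland borehole depth from the headline

print(round(deep / shallow, 2))  # 22.09 -> ~22x the cost for ~4.7x the depth
```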

I would love to be wrong, since geothermal checks all the boxes for renewables and is also suitable for base load power, but I don't see an obvious path forward short of a drilling tech miracle.

(source: Climate and Energy R&D group)

Robotbeat 1 day ago 6 replies      
And geothermal energy is a too-often-overlooked technology. It's not intermittent like wind and solar. It's more like nuclear but without the emotional baggage. As we try deep decarbonization, we're going to need more things like geothermal (or nuclear) or we'll end up spending like 2 or 3x as much money over-building solar to provide enough power even on cloudy winter days, building many more wind turbines, etc.

The hard part isn't getting to 60-80% clean energy, it's getting that last 20%. Geothermal helps a LOT with that. (As does nuclear, which is why we should be protecting existing nuclear assets until fossil fuels are eliminated... The existing ~20% of our electricity in the US produced by nuclear will make deep decarbonization multiple times cheaper than relying solely on wind and solar alone, plus accelerate deep decarbonization by a decade.)

An interesting idea is to build geothermal and solar at the same site. Consider areas where these two maps overlap in high potential for both. Geothermal: http://www.smu.edu/~/media/Site/Dedman/Academics/Programs/Ge... and solar: http://www.nrel.gov/gis/images/map_pv_us_june_dec2008.jpg

Places like New Mexico and southern Colorado are good fits for this.

A problem with geothermal is that if you draw heat from the ground too fast, the output will decline over the years. If you stop drawing heat, the ground will warm back up, and when you start again, output will be higher than it was when you stopped. So there is some sense in conserving geothermal power when demand for it is low.

So the idea is, you draw from solar when the Sun is shining and draw from geothermal when it's not. By being co-located, you can use the same grid infrastructure and get higher utilization out of your powerlines. You've essentially converted some of your solar energy into a baseload power source. Or you can think about it as enhancing the output and lifetime of your geothermal power source.

(And it's possible that if you have a LOT of extra solar power, you could run a resistive load underground, using the ground as a makeshift thermal battery.)

tonystubblebine 1 day ago 0 replies      
When I was in sixth grade I entered a science contest to invent a new form of clean energy. Mostly I entered because I was getting a B in my science class and needed extra credit to get up to an A.

And the thing I "invented" was literally what's in this article, geothermal power. Pump water down near magma, have it turned to steam, have that steam come rushing back up to power turbines.

The problem with my invention was that the company sponsoring the contest was a geothermal energy company.

I'm still proud of myself though, because I thought of the idea independently and it is a pretty damn cool idea for how to get energy.

rconti 1 day ago 1 reply      
Iceland has discussed building a ~1 GW (I think?) cable to transmit power to the UK.

At least in Reykjavik, hot water is piped directly into homes and used for heating (radiators) as well as for hot water (with the attendant sulfur smell). My host there told me "my wife doesn't like the smell, so we use heated cold water instead". I had to think for a few seconds to parse the phrase "heated cold water". Oh, right, that's what I call "hot water" :)

It's so abundant, they don't mind the waste. Just leave the windows above the radiator open: the radiator keeps the room warm, you get fresh outside air inside, and the convective flow keeps the air moving. The cooled water is sent back to the geothermal plant and pumped into the ground to replace the water taken out. IIRC there have definitely been geological issues (earthquakes) as a result of this whole process.

cmbuck 1 day ago 5 replies      
Yes, this seems much better than coal/oil, but isn't there a finite amount of heat under Earth's crust? Have we studied what would happen if we cooled Earth's interior by extracting heat this way?

The magnetosphere, which protects us from radiation, is generated by the molten material under the crust[1]. Eventually, if we interfere with those currents too much, don't we run the risk of damaging our magnetosphere?

[1] https://en.wikipedia.org/wiki/Earth%27s_magnetic_field#Physi...

devrandomguy 1 day ago 1 reply      
That's pretty dwarfy! But, can they pump magma to the surface, to defend their rocky fortress?

I just hope that they are being careful not to drill through any adamantine formations. http://dwarffortresswiki.org/index.php/DF2014:Raw_adamantine

Reason077 1 day ago 0 replies      
"The Institute of Economic Studies at the University of Iceland said in a February report that the country will not be able to abide by the COP21 climate change agreement signed in Paris in 2015.

Greenhouse gas emissions are rising in all sectors of the economy, except in fisheries and agriculture, it said."

This is unfortunate. Given Iceland's cheap & abundant renewable energy (2X Norway's electricity production per capita!), they really ought to be following the example set by Norway and prioritising Electric Vehicles through tax policies, etc.

It would be easy to build excellent charging infrastructure for EVs in this island nation - instead you have hordes of tourists driving around the ring road in smelly diesels.

I do see a few Nissan Leafs around Reykjavik and Akureyri, but there is barely any public charging infrastructure for driving between cities and tourist attractions.

NicoJuicy 18 hours ago 0 replies      
Does anyone know where phys.org gets its articles from? I suppose it has something to do with Elsevier, since a LOT of their articles have to do with students and universities.

Overall, they publish a great number of "copy-paste" articles, either from other news sites and/or student papers. I am 100% sure their reporters don't write original articles, they just rewrite from other sources. It looks to me like Elsevier has found something new to do with the information they are tapping (students/universities).

PS. The source of their article now is : https://techxplore.com/news/2016-10-geothermal-power-potenti...

PS2. Elsevier seemed like my best guess, since a lot of their articles discuss research from students. Other articles are rewrites.

Teknoman117 1 day ago 2 replies      
I'm at a bit of a loss: are they directly tapping underground sources of hot pressurized liquid without using some form of heat exchanger? How do they deal with dissolved minerals gumming up their turbines, or heavy elements escaping into the environment? IIRC there was one geothermal plant (in one of the Nordic countries, I don't recall which) sitting by hot springs that had to replace its piping every few months due to mineral deposits...

Random thought: could geothermal power be considered nuclear power, given that roughly half of the Earth's internal energy comes from decaying radioactive isotopes?

zoom6628 1 day ago 0 replies      
Interesting. NZ has had a geothermal plant for decades, situated in a volcanic area (right next to the tourist hotspots of Rotorua and Taupo). They force cold water down and use the returning steam to drive turbines. No idea of the efficiency, but the fact it's been operating for decades suggests it's good enough to be viable.
avar 1 day ago 0 replies      
There's an English language summary page from the company itself at https://www.resourcepark.is
greatNespresso 1 day ago 0 replies      
This looks like the beginning of Shinra (from Final Fantasy VII) to me.
thatwebdude 1 day ago 1 reply      
Makes sense, some neighbors in the Midwest have been heating their homes, driveways, etc. with geothermal heat pumps. Same principle, I suppose.
idlewords 1 day ago 0 replies      
Do you want to get firemonsters? Because this is how you get firemonsters.
awqrre 1 day ago 1 reply      
It would be cool to 3d print rock structures using magma from below
mmwako 1 day ago 7 replies      
Someone more knowledgeable than me, please correct me... if they are drilling a hole in the ground and making steam come up from said hole, doesn't that heat up the Earth's crust and atmosphere more than before the hole existed, therefore contributing to higher temperatures (and climate change), making it not that "clean" after all? Maybe cleaner than coal/petrol, but not ideal in the current context.
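
One way to put rough numbers on this question (all values below are assumed orders of magnitude, for illustration only): compare the direct waste heat of all human energy use, spread over the planet's surface, against the extra heat trapped by accumulated CO2.

```python
# Rough comparison: direct waste heat vs. CO2 greenhouse forcing.
# All values are assumed, commonly cited orders of magnitude.

EARTH_SURFACE_M2 = 5.1e14     # Earth's surface area
WORLD_ENERGY_USE_W = 18e12    # assumed total human energy use (~18 TW)
CO2_FORCING_W_PER_M2 = 2.0    # assumed radiative forcing from CO2 since pre-industrial

# If every joule humans use ended up as heat (an overestimate for geothermal,
# which only moves heat that was already flowing outward anyway):
waste_heat_forcing = WORLD_ENERGY_USE_W / EARTH_SURFACE_M2  # W/m^2

ratio = CO2_FORCING_W_PER_M2 / waste_heat_forcing
print(f"waste heat: {waste_heat_forcing:.3f} W/m^2")
print(f"CO2 forcing is ~{ratio:.0f}x larger")
```

Under these assumptions the direct heating from all human energy use is tens of times smaller than the greenhouse forcing, which is why displacing a fossil plant with geothermal is still a large net win even counting the released heat.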
       cached 7 May 2017 04:11:02 GMT